Sphinx uses old module code

I use Sphinx to document my code, which is a collection of Python modules. The autogenerated documentation produced by scanning my source code is fine, but when I click the "code" link, which goes directly to the HTML pages of my package's sources that Sphinx generates, an older version of my code is shown.
I've tried deleting my Sphinx generated documentation, uninstalling the package from my own site-packages folder, and deleting everything in my build folder. I can find absolutely no files that match Sphinx's output - it's old, and I'm not sure where it's coming from. Does anybody know how to get Sphinx to put my new code in the documentation?
As I said, the autodocumentation works fine, so it's obviously parsing my code on some level. So why is the pure text different from the autodocumentation?

Sphinx caches (pickles) parsed source files. The cache is usually located in a directory called .doctrees under the build directory. To ensure that your source files are reparsed, delete this directory.
See http://www.sphinx-doc.org/en/stable/man/sphinx-build.html#cmdoption-sphinx-build-d.
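If you prefer to script the cleanup, here is a minimal sketch (the paths are assumptions for a doc/build layout; the -d option linked above controls where the doctrees actually live):
import shutil
# Remove the pickled doctree cache so every source file is reparsed on the next build.
# Depending on how sphinx-build is invoked, the cache directory may be named ".doctrees" or "doctrees".
for cache_dir in ("doc/build/.doctrees", "doc/build/doctrees"):
    shutil.rmtree(cache_dir, ignore_errors=True)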

Pass the poorly documented -a option to sphinx-build.
Doing so effectively disables all Sphinx caching by forcing Sphinx to rebuild all output documentation files (e.g., HTML in the docs/build/ subdirectory), regardless of whether the underlying source input files have been directly modified or not. This is my preferred mode of operation when locally authoring Sphinx documentation, because I frankly do not trust Sphinx to safely cache what I expect it to. It never does.
For safety, I also:
Prefer directly calling sphinx-build to indirectly invoking Sphinx through makefiles (e.g., make html).
Pass -W, converting non-fatal warnings into fatal errors.
Pass --keep-going, collecting all warnings before failing (rather than immediately failing on the first warning).
Pass -n, enabling "nit-picky mode," which generates one warning for each broken reference (e.g., an interdocument or intrasection link).
Pass -j auto, parallelizing the build process across all available CPU cores.
I have trust issues. I'm still shell-shocked from configuring Sphinx for ReadTheDocs three all-nighters ago. Now, I always locally run Sphinx via this command (wrapped in a Bash shell script or alias for convenience):
$ sphinx-build -M html doc/source/ doc/build/ -W -a -j auto -n --keep-going
That's just how we roll, Sphinx. It's Bash or bust.


How to disable specific PEP8 warnings in PyCharm at the repository level?

Looking to find a way, if possible, to disable specific PEP8 warnings for a Python project loaded in PyCharm at the repository level (i.e. saving a repository-committed configuration file which can apply PEP8 configuration hints to any user loading a project in PyCharm).
This is for a situation where other developers may contribute to a project using PyCharm. I do not use PyCharm myself (just a text editor), and building/linting is performed through various tox environments. There are select PEP8 rules that I am particularly not fond of, such as the one requiring two blank lines in specific areas of the code (e.g. E305). While linters can be configured to ignore specific PEP8 rules and developers can invoke a command (like tox) with the same linter configuration, developers using PyCharm will still see these warnings in their environment.
The problem I experience is developers will make (undesired) adjustments to the implementation and submit them for changes in pull requests. While developers can dismiss warnings themselves, I do not want developers to have to assume/interpret which PEP8 rules the project follows (aside from what may be mentioned in a CONTRIBUTING document). In addition, while source files can be modified with a # noqa comment to hint to the IDE to ignore an issue on that line, I am looking for an alternative way to ignore specific PEP8 rules without peppering various # noqa hints throughout the implementation.
For example, looking for a way to disable all E305 warnings in a theoretical .pycharm file such as follows:
[pycharm]
ignore = E305
After some additional investigation, there does not appear to be support (as of 2020-09-23; v2020.2) for repository-specific PEP8 warning customization.
The IDE uses pycodestyle for PEP8 processing. Testing options from pycodestyle's configuration has shown that user-specific configurations will disable various PEP8 hints in the IDE; however, the goal is to avoid having the user configure anything in this case. While pycodestyle indicates that it will also look at setup.cfg and tox.ini, the IDE does not invoke pycodestyle in a way that uses them. Examining how the IDE invokes pycodestyle shows that it feeds a source file's contents through stdin. pycodestyle only implicitly loads supported project-specific configurations (i.e. setup.cfg or tox.ini) based on a common prefix of the provided input files (if any) relative to the working directory pycodestyle is invoked from. Unfortunately, since PyCharm does not change the working directory to the project's root when invoking pycodestyle, project-specific configurations cannot be implicitly loaded.
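For comparison, here is a minimal sketch of driving pycodestyle from Python with an explicit ignore list (the file name and code are illustrative); supplying the codes directly like this mirrors what happens when they are passed on the command line via --ignore instead of being discovered from setup.cfg or tox.ini:
import pycodestyle
# Check a file while explicitly ignoring E305; roughly equivalent to
# "pycodestyle --ignore=E305 example.py" ("example.py" is a placeholder path).
style = pycodestyle.StyleGuide(ignore=['E305'])
result = style.check_files(['example.py'])
print(result.total_errors)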
An issue has been created on their issue tracker:
PY-44684 | support pycodestyle project-specific configuration settings (pep8)
https://youtrack.jetbrains.com/issue/PY-44684
There is a way to do it, but at the moment it requires the corresponding PyCharm project settings to be kept in VCS. For every reported pycodestyle.py error there is a dedicated quick fix, "Ignore errors like this", that adds the respective error code to the settings of the "PEP 8 coding style violation" inspection (these codes are then passed directly to pycodestyle.py via its --ignore option).
You can then find and edit these codes in the inspection settings.
If a per-project inspection profile was configured, these settings are saved in .idea/inspectionProfiles/Your_Profile_Name.xml and this way can be shared with the team or external contributors.
Obviously, it won't work if you don't want to expose IDE-specific config files in the repository, especially when there is a better, tool-independent place for such preferences. The reported PY-44684 is a legitimate issue, I agree.

Is there an ldd-like command for Python

This is the problem: You try to run a python script that you didn't write yourself, and it is missing a module. Then you solve that problem and try again - now another module is missing. And so on.
Is there anything, a command or something, that can go through the python sources and check that all the necessary modules are available - perhaps even going as far as looking up the dependencies of missing modules online (although that may be rather ambitious)? I think of it as something like 'ldd', but of course this is much more like yum or apt-get in its scope.
Please note, BTW, I'm not talking about the package dependencies like in pip (I think it is called, never used it), but about the logical dependencies in the source code.
There are several packages that analyze code dependencies:
https://docs.python.org/2/library/modulefinder.html
Modulefinder seems like what you want: it reports which modules can't be loaded. From the example it looks like it works transitively, but I am not sure (see the sketch after this list).
https://pypi.org/project/findimports/
This also analyzes transitive imports; I am not sure, however, what the output is if a module is missing.
... And some more you can find with your favorite search engine
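For the modulefinder option, here is a minimal sketch (the script name is a placeholder): finder.badmodules holds the modules that could not be found, which is roughly the missing-dependency report you are after. Note that it will also flag optional or platform-specific imports, so expect some false positives.
from modulefinder import ModuleFinder
finder = ModuleFinder()
finder.run_script("myscript.py")  # placeholder: the script you are trying to run
print("Modules that could not be found:")
for name in sorted(finder.badmodules):
    print("  ", name)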
To answer the original question more directly, I think...
lddcollect is available via pip and looks very good.
Emphases mine:
Typical use case: you have a locally compiled application or library with a large number of dependencies, and you want to share this binary. This tool will list all shared libraries needed to run it. You can then create a minimal rootfs with just the needed libraries. Alternatively you might want to know what packages need to be installed to run this application (Debian based systems only for now).
There are two modes of operation.
List all shared library files needed to execute supplied inputs
List all packages you need to apt-get install to execute supplied inputs as well as any shared libraries that are needed but are not under package management.
In the first mode it is similar to ldd, except referenced symbolic links to libraries are also listed. In the second mode shared library dependencies that are under package management are not listed, instead the name of the package providing the dependency is listed.
lddcollect --help
Usage: lddcollect [OPTIONS] [LIBS_OR_DIR]...
Find all other libraries and optionally Debian dependencies listed
applications/libraries require to run.
Two ways to run:
1. Supply single directory on input
- Will locate all dynamic libs under that path
- Will print external libs only (will not print any input libs that were found)
2. Supply paths to individual ELF files on a command line
- Will print input libs and any external libs referenced
Prints libraries (including symlinks) that are referenced by input files,
one file per line.
When --dpkg option is supplied, print:
1. Non-dpkg managed files, one per line
2. Separator line: ...
3. Package names, one per line
Options:
--dpkg / --no-dpkg Lookup dpkg libs or not, default: no
--json Output in json format
--verbose Print some info to stderr
--ignore-pkg TEXT Packages to ignore (list package files instead)
--help Show this message and exit.
I can't test it against my current use case for ldd right now, but I quickly ran it against a binary I've built, and it seems to report the same kind of info, in fact almost twice as many lines!

How can I make sphinx-quickstart auto-include docstrings from my Python modules

I am trying out Sphinx for generating documentation for a Python project. Just to make sure I can actually make it work, I have made a test project to try it out on: https://github.com/ThomasA/sphinxtest.
I have run sphinx-quickstart in the root of this repository. In the following questions, I specified 'doc' as the documentation root, named the project 'sphinxtest', entered 'Thomas Arildsen' as author, answered 'y' to the 'autodoc' option, and selected the default setting for everything else.
I expected the 'autodoc' option to cause the generation of a file 'amodule.rst' in the 'doc' folder. However, this does not get generated. I am puzzled by this. I thought this was what the 'autodoc' option was supposed to do and what I have seen examples of others apparently achieve with it. Sphinx completes without any error messages, so it seems to be doing what it thinks it should do. So, what could I be doing wrong?
I am using Sphinx v. 1.5.6 and Python 3.5.3, all installed with Anaconda.
autodoc does not generate the .rst source files.
Instead, first use sphinx-apidoc to generate the source files, then run Sphinx to build your documentation.
An alternative to Percy's answer is autoapi, written by Carlos Jenkins. He provides a Sphinx extension that generates the ReST files each time the documentation is generated. You can configure the output ReST by modifying a template file.
https://github.com/carlos-jenkins/autoapi

Why is it recommended to add extra files to a Python source package?

There are Python tools, like check-manifest, that verify that all the files under your VCS are also included in your MANIFEST.in, and release helpers like zest.releaser recommend using them.
I think files in the tests or docs directories are never used directly from the Python package. Usually services like Read the Docs or Travis CI are going to access those files, and they get them from the VCS, not from the package. I have also seen packages including .travis.yml files, which makes even less sense to me.
What is the advantage of including all the files in the python package?

How to generate Python documentation using Sphinx with zero configuration?

We don't want to be maintaining documentation as well as the source code, which is evolving rapidly at the moment, yet Sphinx seems to require a frustrating amount of setup and configuration. (We just need some basic API docs.) Is there not a single command you can run inside a python project that will just iterate over all of the packages, modules, classes and functions generating documentation as HTML?
sphinx-apidoc splats stuff into a directory, and after modifying conf.py to put our packages on sys.path we can run "make html", but it only lists packages and modules without documenting any classes or functions.
Thanks!
The sphinx-apidoc tool will autogenerate stubs for your modules, which might be what you want.
Instructions
Make sure the autodoc module was enabled during Sphinx configuration.
extensions = ['sphinx.ext.autodoc']
within Sphinx's conf.py should do the trick.
Make sure conf.py adjusts sys.path accordingly (see the comments at lines 16-19 in the file).
sys.path.insert(0, os.path.abspath('/my/source/lives/here'))
Run sphinx-apidoc to generate skeletons.
sphinx-apidoc -o /my/docs/live/here /my/source/lives/here
Rebuild the docs. If all goes well, you shouldn't get the following sort of warning:
mymodule.rst:4: WARNING: autodoc can't import/find module 'mymodule'
Your module RSTs should now be populated.
