I have a project where I manage the version through git tags.
Then I use setuptools_scm to get this information in my setup.py; it also generates a file (_version.py) that gets included when building the wheel for pip.
This file is not tracked by git, since:
- it contains the same information that can be gathered from git;
- tracking it would create a circular situation where building the wheel modifies the version, which changes the sources, so a new version would be generated.
Now, when I build the documentation, it feels natural to fetch this version from _version.py, and this all works well locally.
However, when I try to do this within ReadTheDocs, the building of the documentation fails because _version.py is not tracked by git, so ReadTheDocs does not find it when fetching the sources from the repository.
EDIT: I have tried the method proposed in the duplicate, which is the same as what the setuptools_scm documentation indicates, i.e. using in docs/conf.py:
from pkg_resources import get_distribution
__version__ = get_distribution('numeral').version
... # I use __version__ to define Sphinx variables
but I get:
pkg_resources.DistributionNotFound: The 'numeral' distribution was not found and is required by the application
(Again, building the documentation locally works correctly.)
How could I solve this issue without resorting to maintaining the version number in two places?
Eventually the issue turned out to be that ReadTheDocs does not install the package by default, while I was expecting this to happen.
All I had to do was to enable "Install Project" in the Advanced Settings / Default Settings.
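For reference, the same behaviour can also be requested from a .readthedocs.yaml file instead of the web UI. A minimal sketch (the keys follow the Read the Docs v2 config schema; adjust paths to your project):

```yaml
# Sketch: tell Read the Docs to pip-install the project itself,
# so setuptools_scm generates _version.py during the docs build
version: 2

python:
  install:
    - method: pip
      path: .
```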
Related
Recently I started using the sphinx_autodoc_typehints and sphinx_autodoc_defaultargs extensions for Sphinx via my project's conf.py. It seems they are not part of the default Sphinx installation on readthedocs (where Sphinx is at v1.8.5), because my build fails with the following extension error:
Could not import extension sphinx_autodoc_typehints (exception: No module named
'sphinx_autodoc_typehints')
I understand I somehow have to tell readthedocs to get sphinx_autodoc_typehints (and later sphinx_autodoc_defaultargs as well) from PyPI. Or is there a way I can install packages on readthedocs myself?
Since I use pbr for package management I use a requirements.txt that readthedocs knows of. I don't want to specify the sphinx extensions there because every user of my package would have to install them. Is there no other way of telling readthedocs which extensions to use?
Following the comment of Steve Piercy, I found a way to have two requirements.txt files. readthedocs' advanced settings (on the website) allow for only one requirements.txt.
readthedocs' preferred way is a .readthedocs.yaml file, which needs to live in the root folder. Following https://docs.readthedocs.io/en/stable/config-file/v2.html, this is how the file now looks:
version: 2

sphinx:
  configuration: docs/conf.py

python:
  version: 3.7
  install:
    - requirements: requirements.txt
    - requirements: docs/requirements.txt
and the docs/requirements.txt looks like this:
sphinx==3.4.3
sphinx_autodoc_typehints==1.12.0
sphinx_autodoc_defaultargs==0.1.2
In the advanced settings page I had to set the location of Sphinx's conf.py manually, even though it is in a standard location. Without this setting, my build would still fail.
I work on a project where two of the dependencies require conflicting dependencies. In particular the project requires eli5==0.8 which requires tabulate>=0.7.7 and invocations==1.4.0 which requires tabulate==0.7.5.
I can still install the project, import the module and run the code, however when I try to create an entry point via setup.py and run it I encounter the following failure:
Traceback (most recent call last):
  File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 574, in _build_master
    ws.require(__requires__)
  File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 892, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 783, in resolve
    raise VersionConflict(dist, req).with_context(dependent_req)
pkg_resources.ContextualVersionConflict: (tabulate 0.8.2 (/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages), Requirement.parse('tabulate==0.7.5'), {'invocations'})
Even if I try to pin the version of tabulate directly in my setup.py I get the same failure.
How are situations like this resolved?
As extra information: I'm using Python 3.6.6, and the following minimal Python module and setup.py can be used to reproduce the problem.
a_script.py:
def cli():
    print('Hello world')

if __name__ == '__main__':
    cli()
setup.py:
from setuptools import setup

setup(
    name='repro',
    version='0.1',
    py_modules=['a_script'],
    install_requires=[
        'eli5==0.8',
        'invocations==1.4.0',
        # 'tabulate==0.8.2'
    ],
    entry_points='''
        [console_scripts]
        repro=a_script:cli
    ''',
)
Welcome to the world of dependency hell!
I know of no clean way to solve this, but here are some hints for simple workarounds:
- Can you find a later (maybe labelled dev or unstable) version of the older dependency that meets the requirement of the newer one? If yes, check whether it passes the integration tests of your own project (do all your nominal use cases pass with it?).
- Can you find an older version of the newer dependency that meets your own requirements? If yes, you should test that it works in all your nominal use cases.
- If none of the above works, you will have to build a custom version of one of the conflicting projects (assuming at least one is open source). Ideally, clone the older one (here invocations), set its version to a local version identifier (for example 1.4.0+tab0-7) and change its requirement to accept tabulate>=0.7.7. Then use that special version, and again thoroughly test that all your use cases pass with it.
If all the tests of the modified project still pass, you could propose that its maintainer change the version requirement for a future release, for example via a patch / pull request based on the current development tree.
I recently resolved a situation like this by patching the package's METADATA file, using a unified diff made with git diff and GNU patch.
This is an effective solution if you are deploying an application; but if you are writing a library, then the only effective solution is to ask the maintainers to relax their constraints, or to eliminate your reliance on their work.
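The METADATA patch itself can be as small as one requirement line. A minimal sketch of the idea in plain Python (the file path and package names are illustrative; the real file lives under site-packages/<package>-<version>.dist-info/METADATA):

```python
import tempfile
from pathlib import Path

# Stand-in for site-packages/invocations-1.4.0.dist-info/METADATA
meta = Path(tempfile.mkdtemp()) / "METADATA"
meta.write_text("Requires-Dist: tabulate (==0.7.5)\n")  # simulate the installed file

# Relax the pin, which is exactly what the unified diff + GNU patch did
meta.write_text(meta.read_text().replace("(==0.7.5)", "(>=0.7.7)"))
print(meta.read_text().strip())
```

Keep in mind this edits installed metadata in place, so a reinstall of the package silently reverts it.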
So I've been using the guide found here to install gr-gsm for GNU Radio with pybombs on Arch Linux. However, when I get to the line for installing gr-gsm, I get the following error:
[josh#localhost ~]$ pybombs install gr-gsm
PyBombs.DepManager - ERROR - Package does not exist: ssl (declared as dependency for package libevent)
According to the guide, it should install dependencies by itself. I've gone through the documentation for pybombs to see if I'm doing something incorrectly and couldn't find anything. I double checked the configuration as well.
If I go into Python and import ssl, it's there.
EDIT:
Checking the recipe list, ssl is in there:
[josh#localhost build]$ pybombs recipes list
...
ssl ~/.pybombs/recipes/gr-recipes/ssl.lwr
and yet, I get the same error:
[josh#localhost build]$ pybombs install gr-gsm
PyBombs.DepManager - ERROR - Package does not exist: python (declared as dependency for package mako)
It cannot find the package. Actually it's the same on Void Linux; it's a pitfall of all these wrappers, because the package managers of Arch (fixed by now, if I read the Pybombs sources correctly) and Void are not covered in the code.
You can edit ~/.pybombs/recipes/gr-recipes/ssl.lwr and add a check yourself. If you add a new key, the code also needs changing in case that key is not considered in the sources.
But first check whether you have SSL installed locally, e.g. using pkg-config openssl --version.
You can also flag the SSL check as optional during the dependency check, and then make sure it really exists and will be found by the compiler (which is invoked later).
For Arch, though, this should be fixed by now.
[A nicer way on Arch, instead of wrappers like pybombs, is PKGBUILDs; the same is true for Void, which uses almost the same template format as Arch. A cross-platform system like 0install might also help.]
Edit: GNURadio exists as Arch package already, see https://wiki.archlinux.org/index.php/GNU_Radio
When I use pip to install a package from source, it generates a version number for the package which I can see using pip show <package>. But I can't find out how that version number is generated, and I can't find the version string in the source code. Can someone tell me how the version is generated?
The version number that pip uses comes from the setup.py (if you pip install a file, directory, repo, etc.) and/or the information in the PyPI index (if you pip install a package name). (Since these two must be identical, it doesn't really matter which.)
It's recommended that packages expose at runtime, as a __version__ attribute on their top-level module/package(s), the same string they put in their setup; but that isn't required, and not every package does it.
And if the package doesn't expose its version, there's really no way for you to get it. (Well, unless you want to grub through pip's metadata trying to figure out which package owns a module, and then get that package's version.)
Here's an example:
In the source code for bs4 (BeautifulSoup4), the setup.py file has this line:
version = "4.3.2",
That's the version that's used, directly or indirectly, by pip.
Then, inside bs4/__init__.py, there's this line:
__version__ = "4.3.2"
That means that Leonard Richardson is a nice guy who follows the recommendations, so I can import bs4; print(bs4.__version__) and get back the same version string that pip show beautifulsoup4 gives me.
But, as you can see, they're two completely different strings in completely different files. If he wasn't nice, they could be totally different, or the second one could be missing, or named something different.
The OpenStack people came up with a nifty library named PBR that helps you manage version numbers. You can read the linked doc page for the full details, but the basic idea is that it either generates the whole version number for you from git, or verifies your specified version number (in the metadata section of setup.cfg) and appends the dev build number from git. (This relies on you using Semantic Versioning in your git repo.)
Instead of specifying the version number in code, tools such as setuptools-scm derive it from version control tags. Sometimes the magic is not directly visible: PyScaffold, for example, uses it, yet in the project root's __init__.py one may see just:
import pkg_resources

try:
    __version__ = pkg_resources.get_distribution(__name__).version
except pkg_resources.DistributionNotFound:
    __version__ = "unknown"
If, for example, the highest version tag in Git is 6.10.0, then pip install -e . will generate a local version number such as 6.10.0.post0.dev23+ngc376c3c (c376c3c being the short hash of the last commit) or 6.10.0.post0.dev23+ngc376c3c.dirty (if it has uncommitted changes).
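Such a string follows PEP 440: everything after the + is the "local version identifier" and is ignored by PyPI. Splitting it off needs no extra libraries:

```python
# Split a setuptools-scm style version into its public and local parts (PEP 440)
version = "6.10.0.post0.dev23+ngc376c3c.dirty"
public, _, local = version.partition("+")

print(public)  # 6.10.0.post0.dev23
print(local)   # ngc376c3c.dirty
```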
More complicated strings such as 4.0.0rc1 are usually hand-edited into the PKG-INFO file, for example:
# cat ./<package-name>.egg-info/PKG-INFO
...
Version: 4.0.0rc1
...
This makes it infeasible to obtain the version from within Python code.
I would like to use distutils (setup.py) to install a Python package (from a local repository) that requires another package from a different local repository. Since I am lacking decent documentation of the setup command (I only found some examples here and here, and was confused by the setup terms extras_require, install_requires and dependency_links found here and here), does anyone have a complete setup.py file that shows how this can be handled, i.e. so that distutils handles the installation of a package found in some SVN repository when the main package I am installing requires it?
More detailed explanation: I have two local svn (or git) repositories, basicmodule and extendedmodule. Now I check out extendedmodule and run python setup.py install. This setup.py file knows that extendedmodule requires basicmodule, and automatically downloads it from the repository and installs it (in case it is not installed yet). How can I achieve this with setup.py? Or maybe there is another, better way to do this?
EDIT: Followup question
Based on the answer by Tom I have tried to use a setup.py as follows:
from setuptools import setup

setup(
    name="extralibs",
    version="0.0.2",
    description="Some extra libs.",
    packages=['extralib'],
    install_requires=["basiclib==1.9dev-r1234"],
    dependency_links=["https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk#20479#egg=basiclib-1.9dev-r1234"],
)
When trying to install this as a normal user I get the following error:
error: Can't download https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk#20479: 401 Authorization Required
But when I do a normal svn checkout with the exact same link it works:
svn co https://source.company.xy/svn/MainDir/SVNDir/basiclib/trunk#20479
Any suggestion how to solve this without changing ANY configuration of the svn repository?
I think the problem is that your svn client is authenticated (caching the realm somewhere in your ~/.subversion directory), which your distutils HTTP client doesn't know how to do.
Distutils supports the svn+http link type in dependency links. So you may try adding "svn+" before your dependency link, providing username and password:
dependency_links = [
    "svn+https://user:password@source.company.xy/svn/MainDir/SVNDir/basiclib/trunk#20479#egg=basiclib-1.9dev-r1234"
]
For security reasons you should not put your username and password in your setup.py file. One way to avoid that is fetching the authentication information from an environment variable, or even trying to fetch it from your Subversion configuration directory (~/.subversion).
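A sketch of the environment-variable approach (SVN_USER and SVN_PASS are assumed variable names, not anything distutils defines):

```python
import os

# Hypothetical: read credentials from the environment at build time,
# falling back to placeholders when they are unset
user = os.environ.get("SVN_USER", "anonymous")
password = os.environ.get("SVN_PASS", "")
link = ("svn+https://%s:%s@source.company.xy/svn/MainDir/SVNDir/"
        "basiclib/trunk#20479#egg=basiclib-1.9dev-r1234" % (user, password))
# pass [link] as dependency_links to setup()
```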
Hope that helps.
Check out the answers to these two questions. They both give specific examples on how install_requires and dependency_links work together to achieve what you want.
Can Pip install dependencies not specified in setup.py at install time?
Can a Python package depend on a specific version control revision of another Python package?