Python entry point fails for dependency conflict

I work on a project where two of the dependencies require conflicting versions of a third package. Specifically, the project requires eli5==0.8, which requires tabulate>=0.7.7, and invocations==1.4.0, which requires tabulate==0.7.5.
I can still install the project, import the module and run the code; however, when I try to create an entry point via setup.py and run it, I encounter the following failure:
Traceback (most recent call last):
File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 574, in _build_master
ws.require(__requires__)
File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 892, in require
needed = self.resolve(parse_requirements(requirements))
File "/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages/pkg_resources/__init__.py", line 783, in resolve
raise VersionConflict(dist, req).with_context(dependent_req)
pkg_resources.ContextualVersionConflict: (tabulate 0.8.2 (/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6/site-packages), Requirement.parse('tabulate==0.7.5'), {'invocations'})
Even if I try to pin the version of tabulate directly in my setup.py, I get the same failure.
How are situations like this resolved?
For extra context: I'm using Python 3.6.6, and the following minimal Python module and setup.py can be used to reproduce the problem.
a_script.py:
def cli():
    print('Hello world')

if __name__ == '__main__':
    cli()
setup.py:
from setuptools import setup

setup(
    name='repro',
    version='0.1',
    py_modules=['a_script'],
    install_requires=[
        'eli5==0.8',
        'invocations==1.4.0',
        # 'tabulate==0.8.2'
    ],
    entry_points='''
        [console_scripts]
        repro=a_script:cli
    ''',
)
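For what it's worth, the reason importing works while the entry point fails is visible in the traceback: the setuptools-generated console-script wrapper imports pkg_resources, which validates the installed distribution's entire requirement set before calling cli(). A rough sketch of such a wrapper (the exact contents vary by setuptools version; this is illustrative, not the literal generated file):

#!/Users/user/.pyenv/versions/3.6.6/envs/repro/bin/python
# Illustrative sketch of a setuptools-generated console-script wrapper.
# Importing pkg_resources here walks the full dependency graph of
# 'repro==0.1', which is where ContextualVersionConflict gets raised.
__requires__ = 'repro==0.1'
import sys
from pkg_resources import load_entry_point

if __name__ == '__main__':
    sys.exit(
        load_entry_point('repro==0.1', 'console_scripts', 'repro')()
    )

A plain import a_script never triggers that resolution step, which is why running the module directly works.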

Welcome to the world of dependency hell!
I know of no clean way to solve this. Some hints for a simple workaround:
Can you find a later (possibly labelled dev or unstable) version of the older dependency that meets the requirement of the newer one? If so, check whether it passes your own project's integration tests (do all your nominal use cases pass with it?).
Can you find an older version of the newer dependency that meets your own requirements? If so, test that it works in all your nominal use cases.
If none of the above works, you will have to build a custom version of one of the conflicting projects (assuming at least one is open source). Ideally you would fork the older one (here invocations), set its version to a local version identifier (for example 1.4.0+tab0.7), and change its own requirement to accept tabulate>=0.7.7; see the sketch at the end of this answer. Then use that special version and, again, thoroughly test that all your use cases pass with it.
If all the tests of the modified project still pass, you could propose to its maintainers that they change their version requirement for a future release, for example by submitting a patch / pull request based on the current development tree.
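A minimal sketch of what the fork's setup.py might look like (illustrative only; the real invocations setup script is more involved):

# setup.py of the forked invocations (illustrative sketch): bump to a
# PEP 440 local version and relax the tabulate pin so it no longer
# conflicts with eli5's tabulate>=0.7.7.
from setuptools import setup, find_packages

setup(
    name='invocations',
    version='1.4.0+tab0.7',  # local version identifier marks the fork
    packages=find_packages(),
    install_requires=[
        'tabulate>=0.7.7',   # was tabulate==0.7.5 upstream
    ],
)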

I recently resolved a situation like this by patching the package's METADATA file, using a unified diff made with git diff and applied with GNU patch.
This is an effective solution if you are deploying an application, but if you are writing a library, then the only effective solution is to ask the maintainers to relax their constraints, or to eliminate your reliance on their work.
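To illustrate the idea in Python rather than with a diff (the dist-info path and the exact Requires-Dist line below are assumptions; inspect your own site-packages first):

# Illustrative sketch: relax invocations' tabulate pin in its installed
# METADATA. The dist-info path and the exact Requires-Dist line are
# assumptions about this particular environment; verify before running.
from pathlib import Path

meta = Path('/Users/user/.pyenv/versions/3.6.6/envs/repro/lib/python3.6'
            '/site-packages/invocations-1.4.0.dist-info/METADATA')
text = meta.read_text()
meta.write_text(text.replace('Requires-Dist: tabulate (==0.7.5)',
                             'Requires-Dist: tabulate (>=0.7.5)'))

Keeping the same edit as a unified diff in your repository and applying it with GNU patch during deployment makes the workaround reproducible, which is what the approach above amounts to.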

Related

How do I track where a pip dependency came from with Bazel?

I'm using bazel to manage a Python project, and there is currently an issue with the typing package:
File "<excluded>/pypi__typing/typing.py", line 1007, in __new__
self._abc_registry = extra._abc_registry
AttributeError: type object 'Callable' has no attribute '_abc_registry'
After some investigation on Google, this appears to have come from the old typing package that used to be installed via pip, and is now part of the Python standard library. This package no longer works, and so the recommended solution is to remove it and use the standard library one.
I can't do the recommended solution because I'm not using this package directly, and in a bazel environment I can't just run pip uninstall typing. I need to find out how the package is getting included, and remove it from the source.
How can I track down where the package is being imported?
Turns out you can just use bazel query to discover this.
Suppose your WORKSPACE file has this to import the dependencies:
pip_install(
    name = "my_project_deps",
    python_interpreter = "/usr/bin/python3",
    requirements = "//my_project:requirements.txt",
)
You would then run the query:
bazel query 'allpaths(//my_project, @my_project_deps//pypi__typing)'
This gives me a result like this:
//my_project:target
//some/other:library
@my_project_deps//pypi__expiringdict:pypi__expiringdict
@my_project_deps//pypi__typing:pypi__typing
Seems like it's being pulled in by the expiringdict package.
How you actually remove the dependency is another question; if they don't have a newer version then you may have to fork the package and fix it yourself. In this case since typing is now part of the standard library, it is probably just a matter of removing it from that package's list of dependencies.

ReadTheDocs + Sphinx + setuptools_scm: how to?

I have a project where I manage the version through git tags.
Then I use setuptools_scm to get this information in my setup.py; it also generates a file (_version.py) that gets included when building the wheel for pip.
This file is not tracked by git since:
it has the same information that can be gathered from git
it would create a circular situation where building the wheel modifies the version, which changes the sources, so a new version would be generated
Now, when I build the documentation, it becomes natural to fetch this version from _version.py and this all works well locally.
However, when I try to do this within ReadTheDocs, the building of the documentation fails because _version.py is not tracked by git, so ReadTheDocs does not find it when fetching the sources from the repository.
EDIT: I have tried the method proposed in the duplicate, which is the same as what the setuptools_scm documentation indicates, i.e. using in docs/conf.py:
from pkg_resources import get_distribution
__version__ = get_distribution('numeral').version
... # I use __version__ to define Sphinx variables
but I get:
pkg_resources.DistributionNotFound: The 'numeral' distribution was not found and is required by the application
(Again, building the documentation locally works correctly.)
How could I solve this issue without resorting to maintaining the version number in two places?
Eventually the issue was that ReadTheDocs did not have the option to install my package enabled by default, while I was expecting that to happen.
All I had to do was enable "Install Project" in the Advanced Settings / Default Settings.
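For reference, a minimal sketch of the docs/conf.py side (assuming, as in the question, that the distribution is named numeral and is installed in the docs build environment):

# docs/conf.py (sketch): derive Sphinx's version fields from the
# installed distribution, so the number lives only in git tags and
# setuptools_scm. Assumes the 'numeral' package is installed.
from pkg_resources import get_distribution

release = get_distribution('numeral').version  # full version string
version = '.'.join(release.split('.')[:2])     # short X.Y for Sphinx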

Requiring only one of two dependencies in a requirements file

Some Python packages require one of two packages as a dependency. For example, Ghost.py requires either PySide or PyQt4.
Is it possible to include such a dependency in a requirements.txt file? Is there any 'or' operator that works with these files?
If not, what can I do to add these requirements to the file so only one of them will be installed?
Currently neither pip's requirements.txt nor setuptools directly allows such a construction. Both require you to specify a list of requirements. You can restrict the version of a requirement, but that's all.
Inside Python, you can handle this situation as follows:
try:
    import dependency1

    def do_it(x):
        return dependency1.some_function(x)

except ImportError:
    try:
        import dependency2

        def do_it(x):
            return dependency2.another_function(x)

    except ImportError:
        raise ImportError('You must install either dependency1 or '
                          'dependency2!')
Now do_it uses either dependency1.some_function or dependency2.another_function, depending on which is available.
That will still leave you with the problem of how to specify your requirements. I see two options:
Don't formally specify the requirement in requirements.txt or setup.py but document that the user needs to install one of the dependencies. This approach might be OK if the setup of your software requires additional manual steps anyway (i.e. more than just pip install my_tool).
Hard-code your preferred requirement in requirements.txt or setup.py.
In the end, you have to ask yourself why people might want to use one dependency over the other: I usually couldn't care less about the dependencies of the libraries that I use, because (disk) space is cheap and (due to virtualenv) there is little risk of incompatibilities. I'd therefore even suggest you think about not supporting two different dependencies for the same functionality.
I would use a small Python script to accomplish this
#!/usr/bin/env python
packages = 'p1 p2 p3'.split()
try:
    import optional1
except ImportError:  # optional1 not installed
    try:
        import optional2
    except ImportError:  # optional2 not installed either
        packages.append('optional2')
print(' '.join(packages))
Make this script executable with:
chmod +x requirements.py
And finally run pip with it like this:
pip install $(./requirements.py)
The $(./requirements.py) part executes the requirements.py script and substitutes its output (in this case, a list of packages) into the pip install command.
For setuptools, you can change the setup code to look similar to this:
https://github.com/albumentations-team/albumentations/blob/master/setup.py#L11
There, dependency1 is installed if neither dependency1 nor dependency2 is installed yet, but nothing new is installed if either of them is already present on the system; a sketch of the pattern follows below.
The caveat is that this doesn't work with wheels; you need to install with --no-binary to make it work: https://albumentations.ai/docs/getting_started/installation/#note-on-opencv-dependencies
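A minimal sketch of that pattern (the names dependency1, dependency2, and choose_requirement are illustrative, not the linked project's actual code):

# setup.py (sketch): require the preferred dependency only if neither
# alternative is already present. All names here are illustrative.
from setuptools import setup
import pkg_resources

def choose_requirement(primary, secondary):
    # Keep an already-installed secondary; otherwise require the primary.
    try:
        pkg_resources.get_distribution(secondary)
        return secondary
    except pkg_resources.DistributionNotFound:
        return primary

setup(
    name='my_tool',
    version='0.1',
    install_requires=[choose_requirement('dependency1', 'dependency2')],
)

Because this logic runs when pip executes setup.py, it cannot run when installing from a pre-built wheel, which is exactly why the --no-binary caveat above exists.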

How pip determines a Python package version

When I use pip to install a package from source, it generates a version number for the package, which I can see using pip show <package>. But I can't find out how that version number is generated, and I can't find the version string in the source code. Can someone tell me how the version is generated?
The version number that pip uses comes from the setup.py (if you pip install a file, directory, repo, etc.) and/or the information in the PyPI index (if you pip install a package name). (Since these two must be identical, it doesn't really matter which.)
It's recommended that packages expose the same string they put in their setup as a __version__ attribute on their top-level module/package(s) at runtime, but that isn't required, and not every package does it.
And if the package doesn't expose its version, there's really no way for you to get it. (Well, unless you want to grub through the pip data trying to figure out which package owns a module and then get its version.)
Here's an example:
In the source code for bs4 (BeautifulSoup4), the setup.py file has this line:
version = "4.3.2",
That's the version that's used, directly or indirectly, by pip.
Then, inside bs4/__init__.py, there's this line:
__version__ = "4.3.2"
That means that Leonard Richardson is a nice guy who follows the recommendations, so I can import bs4; print(bs4.__version__) and get back the same version string that pip show beautifulsoup4 gives me.
But, as you can see, they're two completely different strings in completely different files. If he wasn't nice, they could be totally different, or the second one could be missing, or named something different.
The OpenStack people came up with a nifty library named PBR that helps you manage version numbers. You can read the linked doc page for the full details, but the basic idea is that it either generates the whole version number for you out of git, or verifies your specified version number (in the metadata section of setup.cfg) and appends the dev build number out of git. (This relies on you using Semantic Versioning in your git repo.)
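Typical PBR usage is a near-empty setup.py, with the actual metadata living in setup.cfg and git; roughly (a sketch of the documented pattern):

# setup.py when using PBR (sketch of the documented pattern): the
# version and other metadata come from setup.cfg and git tags.
from setuptools import setup

setup(
    setup_requires=['pbr'],
    pbr=True,
)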
Instead of specifying the version number in code, tools such as setuptools-scm may derive it from version-control tags. Sometimes the magic is not directly visible. For example, PyScaffold uses it, but in the project's root package's __init__.py one may see just:
import pkg_resources

try:
    __version__ = pkg_resources.get_distribution(__name__).version
except pkg_resources.DistributionNotFound:
    __version__ = "unknown"
If, for example, the highest version tag in Git is 6.10.0, then pip install -e . will generate a local version number such as 6.10.0.post0.dev23+ngc376c3c (c376c3c being the short hash of the last commit) or 6.10.0.post0.dev23+ngc376c3c.dirty (if it has uncommitted changes).
For more complicated strings, such as 4.0.0rc1, they are usually hand-edited in the PKG-INFO file. For example:
# cat ./<package-name>.egg-info/PKG-INFO
...
Version: 4.0.0rc1
...
This makes it infeasible to obtain the version from within any Python code.

ImportError: Permission Denied while using LXML

I've been having a ton of trouble using lxml after installing it from https://pypi.python.org/pypi/lxml/3.2.1 using easy_install-2.7. I installed it on Windows using Cygwin, and at first the package seemed to be okay. However, upon further testing I ran into problems.
When I run code with:
import lxml
it works completely fine. But as soon as I try:
import lxml.etree
I get this error:
Traceback (most recent call last):
File "D:\Nick_Code\NewsScraper\testdummy.py", line 7, in <module>
import lxml.etree
File "/usr/lib/python2.7/site-packages/lxml-3.2.0-py2.7-cygwin-1.7.20-i686.egg/lxml/etree.py", line 7, in <module>
__bootstrap__()
File "/usr/lib/python2.7/site-packages/lxml-3.2.0-py2.7-cygwin-1.7.20-i686.egg/lxml/etree.py", line 6, in __bootstrap__
imp.load_dynamic(__name__,__file__)
ImportError: Permission denied
I've been trying to find information and workarounds for quite a while, but with no success. Please let me know if you have any insight or need more information.
Thanks!
Michael
This is not a solid answer, but I will highlight several of the problems involved in obtaining a solution. Most likely the problem above is like a cancer caused by several factors acting catastrophically together.
I have the exact same problem as the OP when attempting to use the native Cygwin-supplied Python packages on my Windows Vista machine. Being new to Python, I have spent several days trying to get this to work and to understand why it is not working, but all my Google-fu returned nothing but countless dead ends. So here's my take on this.
There are many reasons why Python could have trouble under Cygwin, some of which you can do something about and some of which are beyond most people's control. What it boils down to are the following key issues:
Windows is a complete mess when it comes to file permissions, and Cygwin cannot handle Windows file permissions very well. So what you see in Cygwin is far from the whole story.
Windows is shamefully case-insensitive, which causes loads of trouble, especially when you need to (cross-)compile anything that was originally developed under a *nix-based system (i.e. everything). In fact, if you attempt to extract any archive that contains files whose names differ only in capitalization (i.e. "makefile" vs "Makefile", etc.) under Windows or Cygwin, you lose all but one of the files. You need to enable case-sensitivity to do anything more than "hello world" *nix compilations.
Windows handles symlinks completely differently than Cygwin. If your ZIP, TAR, etc. archives contain any symlinks, they will be broken after extraction to the Windows environment.
Sloppy coding practices, where developers have not properly tested their creations in various environments, or have not carefully set proper file permissions in their *.tar.gz collections, including correct dependency specifications or mentioning whether or not binaries have been statically linked, etc.
For the full gory details and further (Win-Cygwin) issues, look HERE.
At first I tried to use Cygwin's own Python without any additional packages, installing lxml using pip and easy_install. Then I tried to use Cygwin's own libxml2, libxslt and Python XML packages, and I had the same problems.
Then, after installing the static Windows binaries (as suggested elsewhere), I got this error:
File "/usr/lib/python2.7/site-packages/lxml-3.2.4-py2.7-cygwin-1.7.24-i686.egg/lxml/etree.py", line 6, in __bootstrap__
imp.load_dynamic(__name__,__file__)
ImportError: Permission denied
Aborted (core dumped)
Then I investigated the file permissions and changed those with: chmod -R 755 /usr/lib/python2.7/
That got me one step further, isolating the problem to an apparently missing file.
Enabling verbose and diagnostic modes didn't help much either.
File "/usr/lib/python2.7/site-packages/lxml-3.2.4-py2.7-cygwin-1.7.24-i686.egg/lxml/etree.py", line 6, in __bootstrap__
imp.load_dynamic(__name__,__file__)
ImportError: No such file or directory
Aborted (core dumped)
HERE is the exact specification of imp.load_dynamic:
Load and initialize a module implemented as a dynamically loadable shared
library and return its module object. If the module was already initialized, it
will be initialized again. Re-initialization involves copying the __dict__
attribute of the cached instance of the module over the value used in the module
cached in sys.modules. The pathname argument must point to the shared library.
The name argument is used to construct the name of the initialization function:
an external C function called initname() in the shared library is called. The
optional file argument is ignored. (Note: using shared libraries is highly
system dependent, and not all systems support it.)
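For context, the etree.py named in the traceback is a tiny setuptools-generated stub whose only job is to hand the real compiled extension to imp.load_dynamic. It looks roughly like this (a sketch of the typical generated stub, not the literal file; the extension filename on Cygwin is an assumption):

# Sketch of the setuptools bootstrap stub (lxml/etree.py inside the egg).
# It resolves the compiled extension and loads it with imp.load_dynamic,
# which is the call failing with "Permission denied" above.
def __bootstrap__():
    global __bootstrap__, __loader__, __file__
    import sys, pkg_resources, imp
    __file__ = pkg_resources.resource_filename(__name__, 'etree.dll')
    __loader__ = None
    del __bootstrap__, __loader__
    imp.load_dynamic(__name__, __file__)
__bootstrap__()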
So I started reading the lxml website, which clearly states lxml's dependencies on both libxml2 and libxslt; unless they are statically linked, they also depend on iconv and zlib. So you're led to believe you need to install all of these. Don't! Continue reading. But if you're going to build from source (as easy_install may try to do), you'll need everything, including the development header libraries: libxml2-devel and libxslt-devel. Another place states that you also need Cython and to install with:
easy_install lxml==dev
The dependencies are shown in this picture from HERE:
So you might think you can get away with something like:
STATIC_DEPS=true pip install lxml
But that doesn't do it either, probably because the libraries used to compile Cygwin's Python have to be the same as those used for compiling lxml; I don't know. Notice how the lxml package refers to Cygwin "1.7.24"; my Cygwin is already "1.7.25", which you can check with uname -a. You can then inspect your static Python executable with file and ldd, and you realize that this also depends on the C compiler used for building Python/Cygwin under Windows or *nix. Smelling a nightmare, I decided that building my own was not the way to go. So next I tried to install the Python libraries (supplied as executables) meant for Windows-native Python. This didn't work since I never had Windows-native Python installed, and I was greeted with an error that the installer could not find Python in my registry. I could of course just extract the executable, but I wouldn't know where to put the binaries without the installer. So I had another idea...
There are 3 possible solutions to getting this to work, as far as I can see.
The easy way: installing a Windows-native Python interpreter. You lose some native Cygwin functionality unless you install it in the correct place (/usr/lib/python2.7) and make sure Cygwin can find and use it. This also uses different file permissions, a different case-sensitivity model, and a different character set (UTF-16LE) than Cygwin (UTF-8), potentially creating many other issues down the line! Difficulty: Easy
Continue hacking Cygwin's Python to make it work with the binary libraries used in (1). But this requires:
a) Uninstall and remove all Cygwin Python packages, except bare Python interpreter.
b) Remove all PIP and easy install traces.
c) Hacking the Windows registry to pretend to have Python27 installed:
HKEY_LOCAL_MACHINE\SOFTWARE\Python\PythonCore\2.7\InstallPath C:\Python27\
HKEY_LOCAL_MACHINE\SOFTWARE\Python\PythonCore\2.7\PythonPath C:\Python27\Lib;C:\Python27\DLLs;C:\Python27\Lib\lib-tk
HKEY_CLASSES_ROOT...
d) Install the Windows binary libraries.
e) All the rest should now hopefully work with PIP or easy_install. Difficulty: Medium!
Doing it properly by compiling Python and all libraries from scratch. Difficulty: Hard!
I successfully did (1), but I still think (2) is the smarter way of doing it. However, I have not tested it, which is why I don't consider this a good answer. BTW, one more quirk: I have to run the interpreter with python.exe -E to avoid an annoying "SyntaxError: invalid syntax" when hitting return!
Conclusion:
Apparently, you don't need the libxml2 and libxslt python packages to use lxml!
In my case I needed Scrapy, so I also had to install a few other packages.
$ pip.exe list
cssselect (0.9.1)
lxml (3.2.4)
pip (1.4.1)
pyOpenSSL (0.11)
pywin32 (218)
queuelib (1.1.1)
Scrapy (0.20.0)
setuptools (1.4.1)
six (1.4.1)
Twisted (13.2.0)
w3lib (1.5)
zope.interface (4.0.5)
$ ll /cygdrive/c/Python27/Lib/site-packages/
adodbapi
cssselect
isapi
lxml
OpenSSL
pip
pythonwin
pywin32_system32
queuelib
scrapy
twisted
w3lib
win32
win32com
win32comext
zope
cssselect-0.9.1-py2.7.egg-info
lxml-3.2.4-py2.7.egg-info
pip-1.4.1-py2.7.egg-info
queuelib-1.1.1-py2.7.egg-info
Scrapy-0.20.0-py2.7.egg-info
six-1.4.1-py2.7.egg-info
Twisted-13.2.0-py2.7.egg-info
w3lib-1.5-py2.7.egg-info
zope.interface-4.0.5-py2.7.egg-info
PyWin32.chm
setuptools-1.4.1-py2.7.egg
pyOpenSSL-0.11-py2.7.egg-info
pywin32-218-py2.7.egg-info
easy-install.pth
pywin32.pth
setuptools.pth
zope.interface-4.0.5-py2.7-nspkg.pth
pythoncom.py
six.py
pythoncom.pyc
six.pyc
pythoncom.pyo
pywin32.version.txt
README.txt
