I have a dependency tree of modules that works like this (→ indicating a dependency):
a → b, c
b → ruamel.yaml >= 0.16.5
c → ruamel.yaml < 0.16.6, >=0.12.4
It's very clear to me that ruamel.yaml 0.16.5 satisfies both of these constraints. However, when I pip install a, I get the following logs:
Collecting ruamel.yaml>=0.16.5
Downloading ruamel.yaml-0.16.10-py2.py3-none-any.whl (111 kB)
And then later:
ERROR: <package c> 0.4.0 has requirement ruamel.yaml<0.16.6,>=0.12.4, but you'll have ruamel-yaml 0.16.10 which is incompatible.
So pip has completely ignored the grandchild dependencies when choosing which packages to install, but it realises at the end that it has messed up. Why is pip not choosing the correct package here? Is there a way to help it work better?
I believe this is a well known problem that is currently being worked on. Message from one week ago: http://pyfound.blogspot.com/2020/03/new-pip-resolver-to-roll-out-this-year.html
In the meantime there are some measures that can be taken to try to mitigate this kind of issue:
Reverse the order of the dependencies (in your example a could list c before b)
Use an additional requirements.txt or constraints.txt file
Depending on the actual needs, an alternative tool could help (I believe poetry, pipenv, and most likely others as well might have better dependency resolvers, but they are not a one-to-one replacement for pip)
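As a concrete illustration of the constraints-file approach, a file pinning the shared dependency to a version both b and c accept might look like this (the package names come from the question; the exact pin is an assumption):

```shell
# Pin ruamel.yaml to 0.16.5, which satisfies both b (>=0.16.5)
# and c (<0.16.6,>=0.12.4) from the question's dependency tree
cat > constraints.txt <<'EOF'
ruamel.yaml==0.16.5
EOF
# Then install the top-level package against the constraints file:
# pip install a -c constraints.txt
```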
It appears to be already possible to test pip's future dependency resolver today:
Install pip from source
Run path/to/python -m pip install --unstable-feature=resolver ...
It also seems possible to partially test this dependency resolver in current releases of pip via the pip check command.
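For instance, pip check can be run against an environment at any time to surface conflicts among the packages already installed (it only inspects installed metadata; it does not fix anything):

```shell
# Report any installed packages whose declared dependencies are unmet;
# pip check exits nonzero when conflicts exist, so swallow that here
python -m pip check || echo "dependency conflicts detected"
```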
Some more references on the topic:
https://pradyunsg.me/blog/2020/03/27/pip-resolver-testing/
https://discuss.python.org/t/an-update-on-pip-and-dependency-resolution/1898/2
Related
I've been trying to install a package through pip on my Raspberry Pi 3 Model B.
My operating system is Raspbian (Debian-based). My pip version is 21.0.1 and my Python version is 3.7.4.
The command I'm using is:
python3 -m pip install librosa
The problem is that the dependency resolver takes way too long to resolve the conflicts.
After a few hours it keeps repeating this line over and over (I even left the installation running overnight for two days):
INFO: pip is looking at multiple versions of <Python from requires-Python> to determine which version is compatible with other requirements. this could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run you can press ctrl + c to do so.
I've tried giving it stricter constraints, such as adding "numpy > 1.20.0", but the same message popped up again and I have no clue what I can do now.
So as of pip 20.3, a new (not always working) resolver has been introduced. As of pip 21.0 the old (working) resolver is unsupported and slated for removal, depending on pip team resources.
Changes to the pip dependency resolver in 20.3
I hit the same issue trying to build jupyter. My solution was to pin pip back to the 20.2 release, which is the last release with the old resolver. This got past the point where my builds were choking under pip 21.1.1 with the new resolver.
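Before pinning, it can help to confirm which resolver the installed pip defaults to. A minimal check, assuming only that pip 20.3 was the first release to default to the new resolver:

```python
# Releases from 20.3 onward default to the new backtracking resolver.
import pip

major, minor = (int(part) for part in pip.__version__.split(".")[:2])
uses_new_resolver = (major, minor) >= (20, 3)
print(f"pip {pip.__version__}: new resolver by default: {uses_new_resolver}")
```

If this prints False, the environment is still on the old resolver and the backtracking behaviour cannot be the culprit.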
A second approach that might work (untested) is to use the flag:
--use-deprecated=legacy-resolver
which appears to have been added when 20.3 switched over to the new resolver. This would allow the benefits of newer pip releases, until the backtracking issue is resolved, assuming it works.
What is happening, according to the devs on this Github issue, is "pip downloads multiple versions [of each package] because the versions it downloaded conflict with other packages you specified, which is called backtracking and is a feature. The versions need to be downloaded to detect the conflicts." But it takes a very long time to download all of these versions. Pip explains this in detail, along with ways to resolve it or speed it up, at https://pip.pypa.io/en/latest/topics/dependency-resolution/.
If you run
pip install -r requirements.txt --use-deprecated=legacy-resolver
you will not get this backtracking behavior. Your install will complete, but you will see an error at the end that is useful for troubleshooting:
ERROR: pip's legacy dependency resolver does not consider dependency conflicts when selecting packages. This behaviour is the source of the following dependency conflicts.
apache-airflow-providers-amazon 2.6.0 requires boto3<1.19.0,>=1.15.0, but you'll have boto3 1.9.253 which is incompatible.
package_xyz 0.0.1 requires PyJWT==2.1.0, but you'll have pyjwt 1.7.1 which is incompatible.
Upgrading pip to 21.3.1 worked for me:
python.exe -m pip install --upgrade pip
I am asking myself which version of the library pip will install in this scenario:
requirements.txt contains:
numpy<=1.14
scikit-learn
Now imagine that scikit-learn depends on numpy>=1.10.
If I start pip install -r requirements.txt now, how will pip install the dependencies?
Does it parse the whole dependency structure before installing and finds a valid version of numpy?
Does it just parse the file and dependencies sequentially (package by package) and tries to go for the best "last" dependency?
In my example this would be:
numpy==1.14
numpy==latest
The essential question is: In which order will pip install its dependencies? How does it determine the proper version, respecting all cross dependencies?
EDIT: My initial guess would be that it has an internal list of valid versions and cancels out invalid versions by parsing all dependencies before installing. Then it takes the highest valid remaining version of each package.
First thing to know: most likely the behavior will change soon. pip is currently (today: September 2020) slowly rolling out a new dependency resolver. It can already be used today but is not the default. So depending on which dependency resolver you are using, results might differ.
A couple of pointers:
pip's documentation chapter "Installation Order"
pip's documentation chapter "Changes to the pip dependency resolver in 20.2 (2020)"
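The questioner's guess in the EDIT matches what the new resolver effectively does: gather the constraints from every requirement, discard candidates that violate any of them, and take the highest survivor. A toy sketch with a made-up candidate list (real resolution also backtracks across packages, which this omits):

```python
# Toy version of "collect constraints, drop invalid candidates, pick highest".
def parse(version):
    return tuple(int(x) for x in version.split("."))

# Hypothetical numpy versions available on the index
candidates = ["1.10", "1.12", "1.14", "1.16"]

# requirements.txt says numpy<=1.14; scikit-learn needs numpy>=1.10
constraints = [
    lambda v: parse(v) <= parse("1.14"),
    lambda v: parse(v) >= parse("1.10"),
]

valid = [v for v in candidates if all(c(v) for c in constraints)]
print(max(valid, key=parse))  # 1.14
```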
I am using requirements.txt to specify the package dependencies used in my Python application. Everything seems to work fine for packages that either have no internal dependencies or whose dependencies are not already installed.
The issue occurs when I try to install a package which has a nested dependency on some other package, and an older version of that package is already installed.
I know I can avoid this when installing a package manually by using pip install -U --no-deps <package_name>. I want to understand how to do this using requirements.txt, as the deployment and requirement installation is an automated process.
Note:
The already installed package is not something I am directly using in my project; it is part of a different project on the same server.
Thanks in advance.
Dependency resolution is a fairly complicated problem. A requirements.txt just specifies your dependencies with optional version ranges. If you want to "lock" your transitive dependencies (dependencies of dependencies) in place you would have to produce a requirements.txt that contains exact versions of every package you install with something like pip freeze. This doesn't solve the problem but it would at least point out to you on an install which dependencies conflict so that you can manually pick the right versions.
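For example, a fully pinned file can be generated from a working environment and reused for deployment (the file name here is arbitrary):

```shell
# Capture the exact version of every installed package
python -m pip freeze > requirements.lock.txt
# Reproduce that environment elsewhere:
# pip install -r requirements.lock.txt
```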
That being said the new (as of writing) officially supported tool for managing application dependencies is Pipenv. This tool will both manage the exact versions of transitive dependencies for you (so you won't have to maintain a "requirements.txt" manually) and it will isolate the packages that your code requires from the rest of the system. (It does this using the virtualenv tool under the hood). This isolation should fix your problems with breaking a colocated project since your project can have different versions of libraries than the rest of the system.
(TL;DR Try using Pipenv and see if your problem just disappears)
The installation information page of PyCryptodome says the following under the "Windows (pre-compiled)" section:
Install PyCryptodome as a wheel:
pip install pycryptodomex
To make sure everything works fine, run the test suite:
python -m Cryptodome.SelfTest
There are several problems with this though:
Contrary to what these instructions say, this will not install PyCryptodome as a wheel; rather, it will download it and try to build it, resulting in an error if you don't have the correct build environment installed for the C components included in this package (and avoiding that whole mess is the biggest benefit of using a wheel in the first place).
Even if I instead download the correct wheel file from PyCryptodome's PyPI page, I must (as far as I know?) use a command line as follows to install it:
pip install c:\some\path\name-of-wheel-file.whl
This in turn makes it install under the default "Crypto" package instead of the "Cryptodome" package explicitly mentioned in the instructions (and therefore colliding in a breaking fashion with any pre-existing installations of the PyCrypto package).
So, my question is:
Is there any way to install a wheel file under a different package name than the default one?
PyCryptodome does not seem to provide any specific wheel files for installing under this alternative package name, so if this is impossible, I have a big problem (because I already have PyCrypto installed). :-(
PS.
Some more context regarding the need for the alternative package name can be provided by the following quote from the same installation page that is linked above:
PyCryptodome can be used as:
1.
a drop-in replacement for the old PyCrypto library. You install it with:
pip install pycryptodome
In this case, all modules are installed under the Crypto package. You can test everything is right with:
python -m Crypto.SelfTest
One must avoid having both PyCrypto and PyCryptodome installed at the same time, as they will interfere with each other.
This option is therefore recommended only when you are sure that the whole application is deployed in a virtualenv.
2.
a library independent of the old PyCrypto. You install it with:
pip install pycryptodomex
You can test everything is right with:
python -m Cryptodome.SelfTest
In this case, all modules are installed under the Cryptodome package. PyCrypto and PyCryptodome can coexist.
So, again, all I want is to install it as described under alternative 2 in this quote, from a wheel file, but the problem is that the provided wheel files seem to only default to the package name described under alternative 1 in this quote (i.e. "Crypto").
As far as I know this is not possible. The only way to achieve it is to recompile the wheel yourself after modifying its name in setup.py.
After hours of trying to find out why some documented functions were missing from my Babel installation, I learned there are two branches of Babel development:
Babel has two separate development paths (0.9.x branch and trunk) in
parallel for about 4 years now despite very few developers working on
the project. We try to resolve that situation by releasing a stable
1.0 version but well, real life is not always friendly to open source contribution.
Babel's FAQ confirms that. I want to use Flask-Babel in my project. Its dependency in setup.py says I need just Babel, which means pip takes any version already installed in my environment or searches PyPI for the newest version, which is 0.9.6. Illogically, Flask-Babel uses functions which are not present in the 0.9.x branch. Maybe I am missing something, maybe I am just confused, but how can I easily install the trunk version, where most of the new features are? And how can I enforce using such a version in my setup.py? How does it all work for the people who use Flask-Babel? (I know, the last question is rather Flask-specific and should go here, but anyone else can answer all the other questions.)
Thank you for any suggestions. The bold questions are the most important, the rest is rather Flask-Babel-specific "nice to have".
Have you tried using pip with the url to the branch that you need?
$ sudo pip install http://svn.edgewall.org/repos/babel/trunk
After that, pip should be happy with the dependency:
$ sudo pip install Flask-Babel
...
Requirement already satisfied (use --upgrade to upgrade): Babel in /usr/local/lib/python2.7/dist-packages (from Flask-Babel)
...
Regarding how to force a dependency in your setup.py: since you're already using pip, you can give a requirements file a try.
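A requirements file for this case could list the trunk URL ahead of Flask-Babel, so pip satisfies the Babel dependency from trunk before resolving Flask-Babel (the URL is the one from the answer above; the file layout is an assumption):

```shell
# Write a requirements file that pulls Babel from trunk before Flask-Babel
cat > requirements.txt <<'EOF'
http://svn.edgewall.org/repos/babel/trunk
Flask-Babel
EOF
# Install everything in one go:
# pip install -r requirements.txt
```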