I recently had to bump a Google Cloud library because of a conflict that was causing a bug. Long story short, I had
google-cloud-pubsub==1.4.2
which I had to bump to 1.4.3. This in turn reverted the google-api-core module to 1.16.0, which created a conflict with another module, google-cloud-secret-manager, which required a higher version of google-api-core.
Now, I have removed google-cloud-secret-manager. But if I reinstall that module at its latest version, it bumps google-api-core to a version that is not compatible with google-cloud-pubsub. What I want instead is to pip install the highest version of google-cloud-secret-manager that is still compatible with google-api-core==1.16.0, without manually trying every version until I find the right match. Is that possible?
Is there a pip install option that fixes a dependency's version, so that I can easily install google-cloud-secret-manager without changing the version of its dependency google-api-core? Thank you
You can achieve this with a constraints file. Just put all your constraints into that file:
google-api-core==1.16.0
Then you can install via:
python -m pip install -c constraints.txt google-cloud-secret-manager
This will try every version of google-cloud-secret-manager, starting from the most recent version, until it finds a version that is compatible with the given constraints.
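If you also want to make sure the packages you already pinned stay where they are, you can add them to the same file; a constraints file only restricts versions, it never installs anything by itself. The versions below are the ones from the question:
google-api-core==1.16.0
google-cloud-pubsub==1.4.3
Then install as before:
python -m pip install -c constraints.txt google-cloud-secret-manager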
I am wondering which version of the library pip will install in this scenario:
requirements.txt contains:
numpy<=1.14
scikit-learn
Now imagine that scikit-learn depends on numpy>=1.10.
If I start pip install -r requirements.txt now, how will pip install the dependencies?
Does it parse the whole dependency structure before installing and finds a valid version of numpy?
Does it just parse the file and dependencies sequentially (package by package) and tries to go for the best "last" dependency?
In my example this would be:
numpy==1.14 (from the explicit pin)
numpy==latest (pulled in by scikit-learn)
The essential question is: In which order will pip install its dependencies? How does it determine the proper version, respecting all cross dependencies?
EDIT: My initial guess is that it builds an internal list of valid versions and rules out invalid versions by parsing all dependencies before installing; then it takes the highest remaining valid version of each package.
First thing to know: Most likely the order will change soon. pip is currently (today: September 2020) slowly rolling out a new dependency resolver. It can already be used today but is not the default. So depending on which dependency resolver you are using, results might differ.
A couple of pointers:
pip's documentation chapter "Installation Order"
pip's documentation chapter "Changes to the pip dependency resolver in 20.2 (2020)"
I have a dependency tree of modules that works like this (→ indicating a dependency):
a → b, c
b → ruamel.yaml >= 0.16.5
c → ruamel.yaml < 0.16.6, >=0.12.4
It's very clear to me that ruamel.yaml 0.16.5 will resolve both of these dependencies correctly. However, when I pip install a, I get the following logs:
Collecting ruamel.yaml>=0.16.5
Downloading ruamel.yaml-0.16.10-py2.py3-none-any.whl (111 kB)
And then later:
ERROR: <package c> 0.4.0 has requirement ruamel.yaml<0.16.6,>=0.12.4, but you'll have ruamel-yaml 0.16.10 which is incompatible.
So pip has completely ignored the grandchild dependencies when choosing which packages to install. But it realises that it has messed up at the end. Why is pip not choosing the correct package here? Is there a way to help it work better?
I believe this is a well known problem that is currently being worked on. Message from one week ago: http://pyfound.blogspot.com/2020/03/new-pip-resolver-to-roll-out-this-year.html
In the meantime there are some measures that can be taken to try and mitigate this kind of issue:
Reverse the order of the dependencies (in your example a could list c before b)
Use an additional requirements.txt or constraints.txt file (see the sketch after this list)
Depending on the actual needs, an alternative tool could help (I believe poetry, pipenv, and most likely others as well might have better dependency resolvers, but they are not a one-to-one replacement for pip)
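For the ruamel.yaml example, the constraints-file approach could look roughly like this (a sketch; the bound simply combines the requirements of b and c from the question). Put this into constraints.txt:
ruamel.yaml>=0.16.5,<0.16.6
Then install with:
python -m pip install -c constraints.txt a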
It appears to be already possible to test pip's future dependency resolver today:
Install pip from source
Run path/to/python -m pip install --unstable-feature=resolver ...
To some extent it also seems possible to test this kind of dependency checking in current releases of pip via the pip check command.
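For example, after an installation like the one above, running pip check will surface the broken requirement (the exact wording of the output differs between pip versions; the package names here mirror the example):
python -m pip check
<package c> 0.4.0 has requirement ruamel.yaml<0.16.6,>=0.12.4, but you have ruamel-yaml 0.16.10.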
Some more references on the topic:
https://pradyunsg.me/blog/2020/03/27/pip-resolver-testing/
https://discuss.python.org/t/an-update-on-pip-and-dependency-resolution/1898/2
I have two versions of my python build:
16.1206.43542
17.0817.221945+f4cc396
The only really difference I can see is the ending metadata. When I run pip install package, the version 16.1206.43542 is installed and not the latest. Is this the proper behavior? I would have thought pip would have honored the metadata, and installed the later package?
Thoughts? Ideas? Any would be welcome. For transparency, I am adding the SHA from a git build to the version.
I looked into this and found the answer: under normal Semantic Versioning, anything appended after the release version (normally introduced with a '-') labels that build as a pre-release build, which must then be requested explicitly to be installed. See https://semver.org/ (Topic 9).
For pip install, the version must be a non-pre-release build to be installed automatically.
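If you do want pip to pick up a build that it classifies as a pre-release, you can opt in explicitly; both of these are standard pip options (the package name and version are placeholders):
pip install --pre <package>
pip install "<package>==<exact version string>"
The first allows pre-release candidates in general; the second pins one specific version, which pip will accept even if it is a pre-release.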
I've just uploaded a new version of my package to PyPI (1.2.1.0-r4): I can download the egg file and install it with easy_install, and the version checks out correctly. But when I try to install using pip, it installs version 1.1.0.0 instead. Even if I explicitly specify the version to pip with pip install -Iv tome==1.2.1.0-r4, I get this message: Requested tome==1.2.1.0-r4, but installing version 1.1.0.0 -- and I don't understand why.
I double-checked with parse_version and confirmed that the version string for 1.2.1 is greater than that of 1.1.0, as shown:
>>> from pkg_resources import parse_version as pv
>>> pv('1.1.0.0') < pv('1.2.1.0-r4')
True
>>>
So any idea why it's choosing to install 1.1.0 instead?
This is an excellent question. It took me forever to figure out. This is the solution that works for me:
Apparently, if pip can find a local version of the package, it will prefer the local version to remote ones. I even disconnected my computer from the internet and tried it again -- when pip still installed the package successfully, and didn't even complain, the source was obviously local.
The really confusing part, in my case, was that pip found the newer versions on PyPI, reported them, and then went ahead and re-installed the older version anyway ... arggh. Also, it didn't tell me what it was doing, and why.
So how did I solve this problem?
You can get pip to give verbose output using the -v flag ... but one isn't enough. I RTFM-ed the help, which said you can do -v multiple times, up to 3x, for more verbose output. So I did:
pip install -vvv <my_package>
Then I looked through the output. One line caught my eye:
Source in /tmp/pip-build-root/ has version 0.0.11, which satisfies requirement <my_package>
I deleted that directory, after which pip installed the newest version from pypi.
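Concretely, that meant removing the stale build directory from the log above and re-running the install (the path will differ on your machine, and <my_package> is a placeholder):
rm -rf /tmp/pip-build-root
pip install <my_package>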
Try forcing pip to download the package again with:
pip install --no-cache-dir --upgrade <package>
Thanks to Marcus Smith, who does amazing work as a maintainer of pip, this was fixed in version 1.4 of pip, which was released on 2013-07-23.
Relevant information from the changelog for this version
Fixed a number of issues (#413, #709, #634, #602, and #939) related to
cleaning up and not reusing build directories. (Pull #865, #948)
I found here that there is a known bug in pip: it won't check the version if there's a build directory with unpacked sources. I checked this on my troublesome package, and after deleting its sources from the build directory, pip installed the required version.
If you are using a pip version that comes with a distribution package (e.g. Ubuntu's python-pip), you may need to install a newer pip version first:
Update pip to latest version:
sudo pip install -U pip
In case of "virtualenv", skip "sudo":
pip install -U pip
The following command may be required if your shell reports something like -bash: /usr/bin/pip: No such file or directory after the pip update:
hash -d pip
Now install your package as usual:
pip install -U foo
or
pip install foo==package.version.here
Got the same issue when updating pika 0.9.5 to 0.9.8. The only way that worked was installing from the tarball: pip install https://pypi.python.org/packages/source/p/pika/pika-0.9.8.tar.gz.
In my case, the Python version I was using (3.4) didn't satisfy Django 2.1's requirements (Python >= 3.5).
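If you hit the same thing, a quick check and workaround looks roughly like this (the bound below matches the Python 3.4 case above; adjust it for your interpreter):
python --version
pip install "Django<2.1"
The pin keeps pip on a release line whose Python requirement your interpreter still satisfies.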
In my case I had to delete the .pip folder in my home directory, and then I was able to get later versions of multiple libraries. Note that this was on Linux.
pip --version
pip 18.1 from /usr/lib/python2.7/site-packages/pip (python 2.7)
virtualenv --version
15.1.0
Just in case anyone else struggles with upgrading torchtext (or probably any other torch library):
Although https://pypi.org/project/torchtext/ states that you could run pip install torchtext, I had to install it similarly to torch by specifying --find-links, aka -f:
pip install torchtext===0.8.1 -f https://download.pytorch.org/whl/torch_stable.html
What irritated me was that PyCharm pointed me to the new version, but couldn't find it when attempting to upgrade to it. I guess that PyCharm uses its own mechanism to spot new versions. Then, when invoking pip under the hood, it didn't find the new version without the --find-links option.
In my case, I am pip installing a .tar.gz package from Artifactory that I make a lot of updates to. In order to overwrite my cached Python files and always grab/install the latest, I was able to run:
pip install --no-cache-dir --force-reinstall <path/to/tar.gz>
You should see this re-download any necessary files and install those, instead of using your local cache.
10 years on and pip still fails to work as expected 😖.
I wasted a couple of hours now banging my head against the wall trying to find out why pip won't install a development version of my package. In my case, there are versions 0.0.4 and 0.0.5.dev1 in a private gitlab.com package registry (hence the --extra-index-url argument below), but I believe that's not relevant to the problem.
Following a lot of the advice on this page, I first created a test venv in a faraway folder, cleared the pip cache, uninstalled the package in question, etc., to rule out the most common problems:
$ pip cache purge && \
pip uninstall --yes my-package && \
pip install --extra-index-url "https://_:${GITLAB_PASSWORD_TOOLS_VAULTTOOLS}@gitlab.com/api/v4/projects/<project-id>/packages/pypi/simple" \
--no-cache-dir \
--pre \
--upgrade my-package
output (using empty lines to separate output for commands):
WARNING: No matching packages
Files removed: 0

Found existing installation: my-package 0.0.4
Uninstalling my-package-0.0.4:
Successfully uninstalled my-package-0.0.4

Looking in indexes: https://pypi.org/simple, https://_:****@gitlab.com/api/v4/projects/<project-id>/packages/pypi/simple
Collecting my-package
Downloading https://gitlab.com/api/v4/projects/<project-id>/packages/pypi/files/f07 ... 397/my_package-0.0.5.dev1-py3-none-any.whl (16 kB)
Downloading https://gitlab.com/api/v4/projects/<project-id>/packages/pypi/files/775 ... 70e/my_package-0.0.4-py3-none-any.whl (16 kB)
...
Successfully installed my-package-0.0.4
So pip does see the dev package version, but chooses the earlier one nonetheless.
In an attempt to figure out what's going on, I published a 0.0.5 version: the error persists, pip sees all three versions, but still installs 0.0.4.
In a further, increasingly desperate attempt, I removed any versions prior to 0.0.5* from the gitlab.com package registry.
Only now did pip bother to actually display some useful information:
$ (same command as above)
... (similar output as above) ...
ERROR: Cannot install my-package==0.0.5 and my-package==0.0.5.dev1 because these package versions have conflicting dependencies.
The conflict is caused by:
my-package 0.0.5 depends on my-other-package<0.2.5 and >=0.2.4
my-package 0.0.5.dev1 depends on my-other-package<0.2.5 and >=0.2.4
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
OK, so there is something wrong with my package dependencies. Thanks for letting me know.
Seriously - I tried hard for a couple of hours using all kinds of pip ... -vvv and/or fixed versions such as my-package==0.0.5.dev1 - but I did not manage to get any useful output out of pip - until I wiped the entire history from my package registry 🤬.
Hope this at least helps someone in the same situation.
I found that if you use microversions, pip doesn't seem to recognize them. For example, we couldn't get version 1.9.9.1 to upgrade.
In my case, someone had published the latest version of a package with python2, so attempting to pip3 install it grabbed an older version that had been built with python3.
Handy things to check when debugging this:
If pip install claims to not be able to find the version, see whether pip search can see it.
Take a look at the "Download Files" section on the PyPI repo -- the filenames might suggest what's wrong (in my case I saw -py2- there clear as day; see the pip download sketch after this list).
As suggested by others, try running pip install --no-cache-dir in case pip isn't bothering to ask the internet because it already has your answer locally.
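A related trick for the "Download Files" point: let pip download the candidate it would install and inspect the wheel filename tags yourself (a sketch; the package name and target directory are placeholders):
pip download <package> --no-deps --no-cache-dir -d /tmp/pip-wheels
ls /tmp/pip-wheels
A -py2- tag in the downloaded filename gives the problem away immediately.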
I had hidden unversioned files under the Git tab in PyCharm that were being installed with pip install . even though I didn't see the files anywhere else.
It took me a long time to find; I'm posting this in the hope that it'll help somebody else.
If you need the path your package was installed from, run pip -v list. For an example, see this related post about pip -e: Why is an old version of a package of my python library installing by itself with pip -e?
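Roughly, that looks like this (the package name, version, and path are made up; the exact columns depend on your pip version):
pip -v list
Package     Version  Location
----------- -------- ------------------------
my-package  0.0.4    /home/me/src/my-package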