When I use conda environments, I often need packages that are only available on PyPI and must be installed with pip, even though many of their dependencies could be installed with conda. If I just install them with pip, I end up with an environment containing several pip-managed packages.
Currently I start a pip install (say for torchkge), wait until I notice it pulling a dependency such as pandas, hit Ctrl+C, run conda install pandas, and then redo the pip install, repeating until pip reports that all of the dependencies are satisfied and none need to be installed.
Is there a simple way of doing this dependency check and installing the available dependencies with conda?
You can use the command below. For example, to check the dependencies of pandas:
pip show pandas
In the output you will find the required modules listed under the Requires field.
OR
You can also use pipdeptree
$ pip install pipdeptree
then run
$ pipdeptree
and it will show you your dependencies in a tree form.
For pandas you can check:
$ pipdeptree -p pandas
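The `Requires:` line from `pip show` can also be parsed programmatically and fed to `conda install`. A minimal sketch, assuming the plain-text output format of current pip (the `conda install` step is shown only as a comment, since it mutates the environment):

```python
import subprocess

def parse_requires(show_output: str) -> list[str]:
    """Extract the package names from the 'Requires:' line of `pip show` output."""
    for line in show_output.splitlines():
        if line.startswith("Requires:"):
            value = line.split(":", 1)[1].strip()
            return [dep.strip() for dep in value.split(",")] if value else []
    return []

def pip_requires(package: str) -> list[str]:
    """Run `pip show <package>` and return its declared dependencies."""
    out = subprocess.run(["pip", "show", package],
                         capture_output=True, text=True, check=True).stdout
    return parse_requires(out)

# Then, for example:
#   for dep in pip_requires("torchkge"):
#       subprocess.run(["conda", "install", "-y", dep])
```

Note that this only covers direct dependencies; you would have to recurse (or use pipdeptree's JSON output) to get the full tree.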
I'm running into an issue with pip install --no-deps --no-build-isolation -r requirements.txt that I would not expect.
As a way of working around Python dependency hell with pip (adopting another package manager, e.g. poetry, is a longer-term project), we want to freeze our dependencies and install the exact versions of the various packages we need.
The list of dependencies includes:
fastparquet==0.4.1
...
numpy==1.17.4
The former depends on the latter.
In an attempt to get fastparquet to use the version of numpy we want, I thought that pip install --no-deps --no-build-isolation -r requirements.txt would work. --no-deps would ensure that dependencies of the packages listed won't be pulled down by pip, and --no-build-isolation would ensure we don't build fastparquet with a newer version of numpy as a result of pip's build isolation behavior.
Running pip install ... works without error. However, when running my code, I still get this issue when using fastparquet:
ValueError: numpy.ndarray size changed, may indicate binary incompatibility.
Expected 88 from C header, got 80 from PyObject
This error appears to be a numpy version compatibility issue.
Reinstalling fastparquet==0.4.1 via pip install --upgrade --force-reinstall fixes the issue, which suggests to me that, the first time around, pip was still building against a newer version of numpy.
Note that we are starting from a clean Python virtual environment with neither numpy nor fastparquet installed.
Any ideas? Thanks!
I think your analysis makes sense. The build environment and the runtime environment are separate: when building fastparquet, the build doesn't see which numpy has been installed, so pip just pulls the latest numpy into the isolated build environment. You may want to install numpy before fastparquet and use the --no-build-isolation option.
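Sketched as commands (versions taken from the question; this assumes an environment where those pinned versions still build):

```shell
# Install the pinned numpy first, so it is present at build time.
pip install numpy==1.17.4
# Build fastparquet against the installed numpy: --no-build-isolation makes
# the build use the environment's numpy, and --no-deps stops pip from
# resolving (and possibly upgrading) fastparquet's dependencies.
pip install --no-deps --no-build-isolation fastparquet==0.4.1
```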
Have you tried adding the option to requirements.txt?
fastparquet==0.4.1, --no-build-isolation
This works with pip versions newer than 20.2.2.
I have some problems with jenkins and creating a virtualenv. I'm using the shiningpanda plugin and the "Virtualenv Builder" build step combined with pyenv.
I can install packages with "pip install package", but I cannot install requirements from a requirements file, because packages later in the file cannot find the previously installed packages, e.g. numexpr cannot find/import numpy.
As I was typing my question, I found the answer to the problem: the current version (v0.21) of the ShiningPanda plugin does NOT support pip's requirements.txt in virtualenv builders.
https://wiki.jenkins-ci.org/display/JENKINS/ShiningPanda+Plugin
The current version (0.23) works in our setup like this (in a Build-Virtualenv Builder step, with Nature: Shell):
pushd %run_dir%
SET PYTHONPATH=%CD%
python -m pip install --upgrade -r configurations/requirements.txt
This has worked well even when libraries depend on each other.
I have a packages file (dependencies.conf) for pip including a bunch of packages that my app needs:
argparse==1.2.1
Cython==0.20.2
...
In my build process, I download all packages using:
pip install --download=build/modules -r conf/dependencies.conf
Then, in the deployment process, I want to install these files only if the installed version differs from the one I need, and in the correct (dependency) order.
I'm currently using the following:
for f in modules/*; do pip install -I $f; done
But this is wrong, since it doesn't validate the version (-I is there in order to downgrade packages if needed) and it doesn't handle the dependency order.
Is there a simple method to do that? (I'm basically trying to update the packages in machines that don't have internet connection)
Get the installed version using pip, with a command like:
eg. pip freeze | grep Jinja2
Jinja2==2.6
as explained in the following link Find which version of package is installed with pip
then compare this with the required version, and run pip install with the appropriate version if necessary.
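That comparison can be scripted with the standard library instead of grepping `pip freeze`. A minimal sketch (package names are whatever appears in your dependencies.conf):

```python
from importlib import metadata

def needs_install(name: str, wanted: str) -> bool:
    """True if `name` is missing, or installed at a version other than `wanted`."""
    try:
        return metadata.version(name) != wanted
    except metadata.PackageNotFoundError:
        return True

# Example: skip the `pip install -I` call when the version already matches.
#   if needs_install("Cython", "0.20.2"):
#       subprocess.run(["pip", "install", "-I", "Cython==0.20.2"])
```

For the ordering problem, note that pointing pip at the downloaded files with `pip install --no-index --find-links=build/modules -r conf/dependencies.conf` lets pip resolve the install order itself, without a network connection.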
I am trying to install version 1.2.2 of MySQL_python, using a fresh virtualenv created with the --no-site-packages option. The current version shown in PyPi is 1.2.3. Is there a way to install the older version? I have tried:
pip install MySQL_python==1.2.2
However, when installed, it still shows MySQL_python-1.2.3-py2.6.egg-info in the site packages. Is this a problem specific to this package, or am I doing something wrong?
TL;DR:
Update as of 2022-12-28:
pip install --force-reinstall -v
For example: pip install --force-reinstall -v "MySQL_python==1.2.2"
What these options mean:
--force-reinstall is an option to reinstall all packages even if they are already up-to-date.
-v is for verbose. You can combine for even more verbosity (i.e. -vv) up to 3 times (e.g. --force-reinstall -vvv).
Thanks to @Peter for highlighting this (and it seems that the context of the question has broadened since it was first asked!). The documentation discusses a caveat with using -I: it can break your installation if the package was installed with a different package manager, or if your package is/was a different version.
Original answer:
pip install -Iv (i.e. pip install -Iv MySQL_python==1.2.2)
What these options mean:
-I stands for --ignore-installed which will ignore the installed packages, overwriting them.
-v is for verbose. You can combine for even more verbosity (i.e. -vv) up to 3 times (e.g. -Ivvv).
For more information, see pip install --help
First, I see two issues with what you're trying to do. Since you already have an installed version, you should either uninstall the currently installed driver, or use pip install -I MySQL_python==1.2.2
However, you'll soon find out that this doesn't work. If you look at pip's installation log, or if you do a pip install -Iv MySQL_python==1.2.2 you'll find that the PyPI URL link does not work for MySQL_python v1.2.2. You can verify this here: http://pypi.python.org/pypi/MySQL-python/1.2.2
The download link 404s, and the fallback URLs redirect infinitely because of sourceforge.net's recent upgrade and PyPI's stale URL.
So to properly install the driver, you can follow these steps:
pip uninstall MySQL_python
pip install -Iv http://sourceforge.net/projects/mysql-python/files/mysql-python/1.2.2/MySQL-python-1.2.2.tar.gz/download
You can even use a version range with pip install command. Something like this:
pip install 'stevedore>=1.3.0,<1.4.0'
And if the package is already installed and you want to downgrade it add --force-reinstall like this:
pip install 'stevedore>=1.3.0,<1.4.0' --force-reinstall
One way, as suggested in this post, is to specify the version in the pip command:
pip install -Iv MySQL_python==1.2.2
That is, use == with the version number to install only that version. -I (--ignore-installed) ignores already-installed packages.
To install a specific python package version whether it is the first time, an upgrade or a downgrade use:
pip install --force-reinstall MySQL_python==1.2.4
MySQL_python version 1.2.2 is not available, so I used a different version. To view all available versions of a package from an index, exclude the version:
pip install MySQL_python==
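On newer pip (21.2+), there is also an experimental subcommand that lists the available versions of a package directly, as an alternative to the empty `==` trick above:

```shell
pip index versions MySQL_python
```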
I believe that if you already have a package installed, pip will not overwrite it with another version. Use -I to ignore the previously installed version.
Sometimes, the previously installed version is cached.
~$ pip install pillow==5.2.0
It returns the following:
Requirement already satisfied: pillow==5.2.0 in /home/ubuntu/anaconda3/lib/python3.6/site-packages (5.2.0)
We can use --no-cache-dir together with -I to overwrite this
~$ pip install --no-cache-dir -I pillow==5.2.0
Since this appeared to be a breaking change introduced in version 10 of pip, I downgraded to a compatible version:
pip install 'pip<10'
This command tells pip to install a version of the module lower than version 10. Do this in a virtualenv so you don't break your site installation of Python.
The command below worked for me
Python version - 2.7
package - python-jenkins
command - $ pip install 'python-jenkins>=1.1.1'
I recently ran into an issue when using pip's -I flag that I wanted to document somewhere:
-I will not uninstall the existing package before proceeding; it will just install it on top of the old one. This means that any files that should be deleted between versions will instead be left in place. This can cause weird behavior if those files share names with other installed modules.
For example, let's say there's a package named package. In one of package's files, it uses import datetime. In package 2.0.0 this points to the standard library datetime module, but in package 3.0.0 a local datetime.py was added as a replacement for the standard library version (for whatever reason).
Now let's say I run pip install package==3.0.0, but then later realize that I actually wanted version 2.0.0. If I now run pip install -I package==2.0.0, the old datetime.py file will not be removed, so any calls to import datetime will import the wrong module.
In my case, this manifested with strange syntax errors because the newer version of the package added a file that was only compatible with Python 3, and when I downgraded package versions to support Python 2, I continued importing the Python-3-only module.
Based on this, I would argue that uninstalling the old package is always preferable to using -I when updating installed package versions.
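The shadowing hazard described above is easy to reproduce without pip at all. A minimal sketch (`import_with_leftover` is a hypothetical helper; the generated file stands in for the stale datetime.py left behind by -I):

```python
import importlib
import pathlib
import sys
import tempfile

def import_with_leftover(name: str, source: str):
    """Import `name` while a leftover `<name>.py` shadows it on sys.path."""
    with tempfile.TemporaryDirectory() as d:
        pathlib.Path(d, f"{name}.py").write_text(source)
        sys.path.insert(0, d)
        saved = sys.modules.pop(name, None)   # forget any cached module
        try:
            importlib.invalidate_caches()
            return importlib.import_module(name)
        finally:
            sys.path.pop(0)
            sys.modules.pop(name, None)       # drop the shadowed import...
            if saved is not None:
                sys.modules[name] = saved     # ...and restore the real one
            importlib.invalidate_caches()

# The stale datetime.py wins over the standard library module:
stale = import_with_leftover("datetime", "LEFTOVER = True\n")
print(getattr(stale, "LEFTOVER", False))
```

Because the leftover file sits earlier on the import path, it is found before the standard library module, exactly as happens with a stale file inside an installed package directory.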
There are two ways you may install a package at a specific version:
A) pip install -Iv package-name==version
B) pip install -v package-name==version
For A
Here, if you use the -I option while installing (when you don't know whether the package is already installed), e.g. pip install -Iv "pyreadline==2.*", pip installs the requested version on top of any existing installation of the package at a different version, rather than cleanly replacing it.
For B
1. First, you may want to check that there are no broken requirements:
pip check
2. Then see what's already installed:
pip list
3. If the list contains a package that you wish to install at a specific version, the better option is to uninstall the installed version first:
pip uninstall package-name
4. Now you can go ahead and reinstall the same package pinned to the specific version:
pip install -v package-name==version
e.g. pip install -v "pyreadline==2.*"
If you want to upgrade to the latest version and you don't know what the latest version is, you can type:
pip install MySQL_python --upgrade
This will update MySQL_python to the latest available version; the same works for any other package.
The packaging dependency has had a new release in which it dropped LegacyVersion from its codebase.
The quick solution might be to pin packaging==21.3.