I am trying to install a piece of Python software using its requirements file.
>> cat requirements.txt
Cython==0.15.1
numpy==1.6.1
distribute==0.6.24
logilab-astng==0.23.1
logilab-common==0.57.1
netaddr==0.7.6
numexpr==2.0.1
ply==2.5
pycallgraph==0.5.1
pyflowtools==0.3.4.1
pylint==0.25.1
tables==2.3.1
wsgiref==0.1.2
So I create a virtual environment
>> mkvirtualenv parser
(parser)
>> pip freeze
distribute==0.6.24
wsgiref==0.1.2
(parser)
>> pip install -r requirements.txt
... and then the packages are downloaded but not installed, with errors: http://pastie.org/4079800
(parser)
>> pip freeze
distribute==0.6.24
wsgiref==0.1.2
Surprisingly, if I try to manually install each package, they install just fine.
For instance:
>> pip install numpy==1.6.1
(parser)
>> pip freeze
distribute==0.6.24
wsgiref==0.1.2
numpy==1.6.1
I am lost. What is going on?
PS: I am using pip v1.1 and python v2.7.2 with virtualenv and virtualenvwrapper
It looks like the numexpr package has an install-time dependency on numpy. Pip makes two passes through your requirements: first it downloads all packages and runs each one's setup.py to get its metadata, and then it installs them all in a second pass.
So, numexpr is trying to import from numpy in its setup.py, but when pip first runs numexpr's setup.py, it has not yet installed numpy.
This is also why you don't see this error when you install the packages one by one: if you install them one at a time, numpy will be fully installed in your environment before you pip install numexpr.
The only solution is to run pip install numpy before you ever run pip install -r requirements.txt -- you won't be able to do this in a single command with a single requirements.txt file.
More info here: https://github.com/pypa/pip/issues/25
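For example, a minimal sketch of that workaround, using the versions from the requirements file in the question:

pip install numpy==1.6.1           # install the build-time dependency first
pip install -r requirements.txt    # now numexpr's setup.py can import numpy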
I came across a similar issue and ended up with the following:
cat requirements.txt | sed -e '/^\s*#.*$/d' -e '/^\s*$/d' | xargs -n 1 python -m pip install
That reads requirements.txt line by line and runs pip on each entry. I cannot remember where I originally got this answer, so apologies for that, but I found some justification below:
How sed works: https://howto.lintel.in/truncate-empty-lines-using-sed/
Another similar answer but with git: https://stackoverflow.com/a/46494462/7127519
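For illustration, suppose requirements.txt contains a comment, a blank line and two real entries:

# pinned versions
numpy==1.6.1

numexpr==2.0.1

The sed filter strips the comment and the blank line, and xargs then runs pip install numpy==1.6.1 followed by pip install numexpr==2.0.1, so each package is fully installed before the next one is processed.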
Hope this helps as an alternative.
This can be quite annoying; it is a known pip limitation.
When you run pip install package_name, pip resolves the target package's dependencies and installs whatever that package needs.
But when you run pip install -r requirements.txt, pip processes the listed packages one by one from top to bottom, and sometimes a package appears above a dependency it needs at install time.
The solution is simple:
1. Run pip install package_name for the failing package first.
2. Or simply move the failing package to the bottom of requirements.txt, below the packages it needs, as sketched below.
3. Sometimes a particular version of a package cannot be installed; just install the newest version and update the pin in requirements.txt.
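For example, a sketch of option 2 with the packages from the first question (whether this actually helps depends on how your pip version orders its install steps):

# requirements.txt, reordered by hand so the install-time dependency comes first
numpy==1.6.1
# ...other pinned packages...
numexpr==2.0.1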
Is there a possibility to install a Python package with pipenv without also installing its dependencies?
I'm looking for an analogue of pip install package_name --no-dependencies for the Pipfile. I already tried to specify it with a marker, but that raises an exception.
[packages]
"psycopg2-binary" = "*"
"aiopg"={version = "*", markers="--no-dependencies"}
Currently, pipenv does not support this. One workaround is to add a script like the following to the end of the Pipfile:
[scripts]
install = "sh -c 'pipenv install ; pip install --no-deps aiopg'"
With this script, calling pipenv run install installs all dependencies from the [packages] section, including aiopg but excluding its dependencies.
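Then, from the project directory:

pipenv run install    # installs everything from [packages], then aiopg with --no-deps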
I'm not sure pipenv supports that, but I think the following option may work (never tried it):
Install via pip with --no-deps, using a requirements.txt file:
pip install <package> --no-deps -r requirements.txt
then import it into pipenv:
pipenv install -r /path/to/requirements.txt
If you already have a Pipfile in your current project, export your current pipenv file to a requirements.txt file
pipenv lock -r > requirements.txt
Combine the two files and then pipenv install from the combined requirements.txt file
pipenv install -r path/to/requirements.txt
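Putting those steps together, a rough sketch (untested, like the suggestion itself; file names are placeholders of my own):

pip install <package> --no-deps        # install the extra package without its dependencies
pipenv lock -r > requirements.txt      # export the existing Pipfile to requirements format
pip freeze >> requirements.txt         # append what is currently installed, including <package>
pipenv install -r requirements.txt     # import the combined file into pipenv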
Perhaps this link from the docs may help as well
I am dealing with a legacy Dockerfile. Here is a very simplified version of what I am dealing with:
FROM ubuntu:14.04
RUN apt-get -y update && apt-get -y install \
    python-pip \
    python-numpy  # ...and many other packages
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt # includes e.g., numpy==1.13.0
RUN pip install -r /tmp/requirements2.txt
RUN pip install -r /tmp/requirements3.txt
First, several packages are installed using apt, and then several packages are installed using pip. pip version 10 has been released, and part of the release is this new restriction:
Removed support for uninstalling projects which have been installed using distutils. distutils installed projects do not include metadata indicating what files belong to that install and thus it is impossible to actually uninstall them rather than just remove the metadata saying they've been installed while leaving all of the actual files behind.
This leads to the following problem in my setup. For example, first apt installs python-numpy. Later pip tries to install a newer version of numpy from e.g., /tmp/requirements1.txt, and tries to uninstall the older version, but because of the new restriction, it cannot remove this version:
Installing collected packages: numpy
Found existing installation: numpy 1.8.2
Cannot uninstall 'numpy'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
Now I know at this point there are several solutions.
I could skip installing python-numpy through apt. However, this causes issues because python-numpy installs a few other packages as requirements, and I do not know whether another part of the system relies on those packages. And in reality, there are several apt packages installed through the Dockerfile, and each one I remove seems to reveal another Cannot uninstall X error, while also removing a number of other packages along with it that our app may or may not rely on.
I could also use the --ignore-installed option when I try to pip install things that have already been installed through apt, but then again I have the same problem of every --ignore-installed argument revealing yet another thing that needs to be ignored.
I could pin pip at an older version that does not have this restriction, but I don't want to be stuck using an outdated version of pip forever.
I have been going around in circles trying to come up with a good solution that involves minimal changes to this legacy Dockerfile, and allows the app we deploy with that file to continue to function as it has been. Any suggestions as to how I can safely get around this problem of pip 10 not being able to install newer versions of distutils packages? Thank you!
UPDATE:
I did not realize that --ignore-installed could be used without a package as an argument to ignore all installed packages. I am considering whether or not this might be a good option for me, and have asked about it here.
This is the solution I ended up going with, and our apps have been running in production without any issues for close to a month with this fix in place:
All I had to do was to add
--ignore-installed
to the pip install lines in my Dockerfile that were raising errors. Using the same Dockerfile example from my original question, the fixed Dockerfile would look something like:
FROM ubuntu:14.04
RUN apt-get -y update && apt-get -y install \
    python-pip \
    python-numpy  # ...and many other packages
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt --ignore-installed # don't try to uninstall existing packages, e.g., numpy
RUN pip install -r /tmp/requirements2.txt
RUN pip install -r /tmp/requirements3.txt
The documentation I could find for --ignore-installed was unclear in my opinion (pip install --help simply says "Ignore the installed packages (reinstalling instead)."), and I asked about the potential dangers of this flag here, but have yet to get a satisfying answer. However, if there are any negative side effects, our production environment has yet to see them, and I think the risk is low to none (at least that has been our experience). I was able to confirm that in our case, when this flag was used, the existing installation was not uninstalled, but the newer installation was always the one used.
Update:
I wanted to highlight this answer by @ivan_pozdeev. He provides some information that this answer does not include, and he also outlines some potential side effects of my solution.
This is what worked for me--
pip install --ignore-installed <Your package name>
or
sudo pip install --ignore-installed <Your package name>
or (inside jupyter notebook)
import sys
!{sys.executable} -m pip install --ignore-installed <Your package name>
For Windows, run:
conda update --all
pip install --upgrade <Your package name>
OR
conda update --all
pip install <Your package name>
OR
pip install wrapt --upgrade --ignore-installed
pip install <Your package name>
Taken from: ERROR: Cannot uninstall 'wrapt'. during upgrade
You can just remove numpy manually but keep the other dependencies installed by apt. Then use pip as before to install the latest version of numpy.
#Manually remove just numpy installed by distutils
RUN rm /usr/lib/python2.7/dist-packages/numpy-1.8.2.egg-info
RUN rm -r /usr/lib/python2.7/dist-packages/numpy
RUN pip install -U pip
RUN pip install -r /tmp/requirements1.txt
The location of numpy should be the same in your image. But if you want to confirm it, you can run the container without installing the requirements files and issue the following commands in the Python console inside the container.
>>> import numpy
>>> print numpy.__file__
/usr/lib/python2.7/dist-packages/numpy/__init__.pyc
We have a Python/Django based web application, many components of which are installed using pip. I would like to ask if there is a way to download and save the particular Python packages that we are having pip install (for example: pip install django==1.5.1). We would like to end up with a collection of the packages, in the versions known to be working, with which the app was developed locally. Any and all advice will be appreciated.
If I understood your question correctly, you can run pip freeze > requirements.txt. This command writes all the libraries you have installed for your app into the file requirements.txt (if the file already exists, it is overwritten), which lets you later run pip install -r requirements.txt. However, be aware that your Django project should be running in a virtual environment; otherwise the requirements file will capture every Python package on your development machine, and the install command will then try to install them all.
The freeze command records the exact versions your app is currently using, so a later installation will install those same versions. Your requirements file will look something like:
Flask==0.8
Jinja2==2.6
Werkzeug==0.8.3
certifi==0.0.8
chardet==1.0.1
distribute==0.6.24
gunicorn==0.14.2
requests==0.11.1
Your packages are installed (if using virtualenv) at: ../<your project>/<your virtual env>/<lib>/<python version>/<site-packages>/
As for downloading, you can use the pip install --download command as @atupal suggested in his response; however, consider whether this is really needed, since you could also fork those libraries on GitHub to accomplish the same thing.
Here is a good source of information on how this works: http://www.pip-installer.org/en/latest/cookbook.html
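A rough sketch of the whole workflow (directory and file names are just placeholders):

virtualenv env                       # or mkvirtualenv myapp with virtualenvwrapper
source env/bin/activate
pip install django==1.5.1            # ...plus the rest of your packages
pip freeze > requirements.txt        # record the exact working versions
# later, on the deployment machine, inside its own virtualenv:
pip install -r requirements.txt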
Maybe what you want is:
Download the packages:
pip install --download /path/to/download/to packagename
OR
pip install --download=/path/to/packages/downloaded -r requirements.txt
Install all of those libraries you just downloaded:
pip install --no-index --find-links="/path/to/downloaded/dependencies" packagename
OR
pip install --no-index --find-links="/path/to/downloaded/packages" -r requirements.txt
Shamelessly stolen from this question
Create a requirements.txt file.
Put:
django==1.5.1
in the first line.
Then run pip install -r requirements.txt
Then you can complete that file...
Why does pip list generate a more comprehensive list than pip freeze?
$ pip list
feedparser (5.1.3)
pip (1.4.1)
setuptools (1.1.5)
wsgiref (0.1.2)
$ pip freeze
feedparser==5.1.3
wsgiref==0.1.2
Pip's documentation states:
freeze
Output installed packages in requirements format.
list
List installed packages.
What is a "requirements format"?
One may generate a requirements.txt via:
$ pip freeze > requirements.txt
A user can use this requirements.txt file to install all the dependencies. For instance:
$ pip install -r requirements.txt
The packages need to be in a specific format for pip to understand, such as:
# requirements.txt
feedparser==5.1.3
wsgiref==0.1.2
django==1.4.2
...
That is the "requirements format".
Here, django==1.4.2 implies install django version 1.4.2 (even though the latest is 1.6.x).
If you do not specify ==1.4.2, the latest version available would be installed.
You can read more in "Virtualenv and pip Basics",
and the official "Requirements File Format" documentation.
To answer the second part of this question: the two packages shown in pip list but not in pip freeze are setuptools (which provides easy_install) and pip itself.
It looks like pip freeze just doesn't list packages that pip itself depends on. You can use the --all flag to also show those packages.
From the documentation:
--all
Do not skip these packages in the output: pip, setuptools, distribute, wheel
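For example (the output is illustrative, and --all requires a newer pip than the 1.4.1 shown above):

$ pip freeze --all
feedparser==5.1.3
pip==20.0.2
setuptools==45.1.0
wheel==0.34.2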
The main difference is that the output of pip freeze can be dumped into a requirements.txt file and used later to re-construct the "frozen" environment.
In other words you can run:
pip freeze > frozen-requirements.txt
on one machine, and then later, on a different machine or in a clean environment, you can do:
pip install -r frozen-requirements.txt
and you'll get an identical environment with the exact same dependencies installed as you had in the original environment where you generated frozen-requirements.txt.
Look at the pip documentation, which describes the functionality of both as:
pip list
List installed packages, including editables.
pip freeze
Output installed packages in requirements format.
So there are two differences:
Output format: freeze gives us the standard requirements format, which may later be used with pip install -r to install from.
Output content: pip list includes editables, which pip freeze does not.
pip list shows ALL installed packages.
pip freeze shows the packages YOU installed via the pip (or pipenv, if you use that tool) command, in requirements format.
Note below that setuptools, pip and wheel are installed when pipenv shell creates my virtual environment. These packages were NOT installed by me using pip:
test1 % pipenv shell
Creating a virtualenv for this project…
Pipfile: /Users/terrence/Development/Python/Projects/test1/Pipfile
Using /usr/local/Cellar/pipenv/2018.11.26_3/libexec/bin/python3.8 (3.8.1) to create virtualenv…
⠹ Creating virtual environment...
<SNIP>
Installing setuptools, pip, wheel...
done.
✔ Successfully created virtual environment!
<SNIP>
Now review & compare the output of the respective commands where I've only installed cool-lib and sampleproject (of which peppercorn is a dependency):
test1 % pip freeze <== Packages I'VE installed w/ pip
-e git+https://github.com/gdamjan/hello-world-python-package.git#10<snip>71#egg=cool_lib
peppercorn==0.6
sampleproject==1.3.1
test1 % pip list <== All packages, incl. ones I've NOT installed w/ pip
Package Version Location
------------- ------- --------------------------------------------------------------------------
cool-lib 0.1 /Users/terrence/.local/share/virtualenvs/test1-y2Zgz1D2/src/cool-lib <== Installed w/ `pip` command
peppercorn 0.6 <== Dependency of "sampleproject"
pip 20.0.2
sampleproject 1.3.1 <== Installed w/ `pip` command
setuptools 45.1.0
wheel 0.34.2
My preferred method of generating a requirements file is:
pip list --format=freeze > requirements.txt
This method keeps just the package names and versions, without the local file paths that 'pip freeze' alone sometimes gives me. Local file paths in a requirements file make your codebase harder for other people to use, and some developers don't know how to fix them, so I prefer this method for ease of adoption.
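For illustration, with a hypothetical package mypkg that was installed from a local checkout, the two outputs might differ like this:

$ pip freeze
mypkg @ file:///home/me/src/mypkg
$ pip list --format=freeze
mypkg==0.1.0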
pip list
Lists installed packages: shows ALL installed packages, even those pip installed implicitly as dependencies.
pip freeze
Lists installed packages: only the packages that were installed using the pip command, in requirements format.
pip freeze has an --all flag to show all the packages.
The other difference is the output format, which you can check by running the two commands.
For those looking for a solution: if you accidentally made a pip requirements file with pip list instead of pip freeze and want to convert it into pip freeze format, I wrote this R script to do so.
library(tidyverse)

# read the pip-list-style file, with lines like "feedparser (5.1.3)"
pip_list = read_lines("requirements.txt")

# convert each line to freeze format, e.g. "feedparser==5.1.3"
pip_freeze = pip_list %>%
  str_replace_all(" \\(", "==") %>%
  str_replace_all("\\)$", "")

pip_freeze %>% write_lines("requirements.txt")
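Run it from the directory containing requirements.txt; note that it overwrites the file in place, so keep a backup if you want to preserve the original pip list output.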
`pip freeze > requirements.txt`
automatically writes my dependencies in (apparently) alphabetical order, like this:
matplotlib==1.2.0
numpy==1.6.2
pandas==0.9.1
The problem with this is that pip install -r requirements.txt (when I deploy my code with its dependencies listed in requirements.txt) will end up failing because matplotlib needs numpy to be installed first.
How can I ensure that matplotlib is listed after numpy in the requirements.txt file when I pip freeze it?
For your case the order does not matter: pip first builds every requirement (calling python setup.py egg_info for each) and only then installs them all, and numpy is required to already be installed while building matplotlib, regardless of where it appears in the file.
It is a problem with matplotlib, and they created a proposal to fix it: https://github.com/matplotlib/matplotlib/wiki/MEP11
See comments from this issue at pip issue tracker: https://github.com/pypa/pip/issues/25
This question is a duplicate of Matplotlib requirements with pip install in virtualenv.
You can try the command
pip install --no-deps -r requirements.txt
This installs the packages without their dependencies, and possibly you will get rid of the problems described above.
A file containing the packages in the desired order can be used like so:
pip freeze -r sorted-package-list.txt > requirements.txt
Where sorted-package-list.txt contains
numpy
matplotlib
Note: Packages not included in the sorted-package-list.txt file are appended at the end of the requirements file.
Example result:
numpy==1.14.1
matplotlib==2.2.3
## The following requirements were added by pip freeze:
pandas==0.23.4
Note that h5py (the HDF5 Python wrapper) has the same problem.
My workaround is to split the output of pip freeze in two: a short requirements file ${NUMPY_REQS} containing only numpy's pinned version, and a long one ${REQS} containing all other packages. Note the -v switch on the second grep, which inverts the match.
pip freeze | tee >( grep '^numpy' > ${NUMPY_REQS} ) | grep -v '^numpy' > ${REQS}
And then invoke pip install twice (e.g. when installing a virtual env):
# this installs numpy
pip install -r ${NUMPY_REQS}
# this installs everything else, h5py and/or matplotlib are happy
pip install -r ${REQS}
Note that this tee / grep magic combo works on Unix-like systems only. No idea how to achieve the same thing on Windows.