pipenv and wheel/pre-compiled packages - python

I'm currently using a build host for third-party packages, running
$ pip3 wheel --wheel-dir=/root/wheelhouse -r /requirements.txt
After a successful build I copy the directory /root/wheelhouse onto a new machine and install the compiled packages by running
$ pip3 install -r /requirements.txt --no-index --find-links=/root/wheelhouse
Is there something similar in pipenv?
Everything I've found on pipenv in combination with wheels is bug reports on GitHub.
Note that copying the venv directory is not an option: I'm using a Docker container and want to install the packages system-wide.
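There is no direct pipenv equivalent of pip wheel, but one workable approach (a sketch, assuming a pipenv recent enough to ship the requirements subcommand; older releases used pipenv lock -r instead) is to export the lock file to a plain requirements file and reuse the pip workflow above:
$ pipenv requirements > requirements.txt
$ pip3 wheel --wheel-dir=/root/wheelhouse -r requirements.txt
$ pip3 install -r requirements.txt --no-index --find-links=/root/wheelhouse
Because the last step is plain pip, the packages land system-wide in the container, with no virtualenv involved.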

Related

Vendoring dependencies not working with --find-links

I'm trying to build a package with its dependencies and then install it in a separate step.
I'm not using a requirements file; I'm using setup.cfg and pyproject.toml.
pip download vendor --dest ./build/dependencies --no-cache-dir
python setup.py check
python setup.py bdist_wheel
python setup.py sdist
That seems to download the dependencies into the ./build/dependencies folder, but I can't figure out how to install the wheel while looking in that folder for dependencies.
--find-links doesn't appear to work because I get "Could not find a version that satisfies the requirement.." errors when doing this:
python -m pip install --no-index $(ls dist/*.whl) --find-links="./build/dependencies"
It installs fine without --no-index, fetching from the internet.
I also tried running pip install with --target like this,
pip install -e . --target=./build/dependencies
But I get the same errors when trying to point to it with --find-links.
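One sequence worth trying (a sketch, not verified against this exact project): let pip wheel build wheels for the project and every one of its dependencies into a single directory, so that the --no-index install only ever has to resolve wheels rather than a mix of sdists and wheels:
pip wheel . --wheel-dir ./build/dependencies
python -m pip install --no-index --find-links="./build/dependencies" $(ls dist/*.whl)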

Cannot install some of requirements.txt python dependencies

I cannot figure out why pip doesn't complain when I install my Python dependencies from requirements.txt locally, but when I do it from a Docker container I get an error.
The requirements.txt content:
Flask~=1.1
grpcio
grpcio-tools
protobuf
iexfinance
numpy
pandas
pandas_datareader
pymongo
I've created my container like this:
docker run -it -p 8080:50051 -v ${pwd}:/app -w "/app" python:3.8-alpine
I've tried to install my dependencies using this command:
pip install -r requirements.txt
(Screenshots of the error output omitted.)
Alpine Linux uses musl libc, but most Python wheel files are compiled against glibc. Therefore, packages that have extensions written in C/C++ need to be compiled from source, and if you do not have a compiler installed, you will get an error.
Instead of installing a compiler and the dependencies that packages might require at compile time, I suggest using a Python Docker image that is not based on Alpine. For example, you can use python:3.8-slim or python:3.8, and Python packages that ship Linux wheels will not have to be compiled. All of the packages listed in the OP's requirements.txt can be installed from pre-compiled wheels when using python:3.8-slim.
So you can use these commands:
docker run -it -p 8080:50051 -v ${pwd}:/app -w "/app" python:3.8-slim
pip install -r requirements.txt
If you are concerned about the size of the resulting image, you can also use the --no-cache-dir flag in pip install to disable caching.
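Put together as a Dockerfile, the same approach might look like this (a minimal sketch; the paths and image tag are taken from the question):
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .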
The solution was to install alpine-sdk, which is a "meta-package" that pulls in the essential packages used to build new packages.
apk add --update alpine-sdk
I found the solution here:
Github: docker alpine issues
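If you do need to stay on Alpine, the same fix expressed as a Dockerfile might look like the sketch below; note that heavyweight packages such as numpy and pandas may still require additional -dev packages and long compile times:
FROM python:3.8-alpine
RUN apk add --update --no-cache alpine-sdk
COPY requirements.txt .
RUN pip install -r requirements.txt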

Docker build taking too long when installing grpcio via pip

I have a Dockerfile which installs a few packages via pip.
Some of them require grpcio, and it takes a few minutes just to build this part.
Does anyone have a tip to speed up this part?
Installing collected packages: python-dateutil, azure-common, azure-nspkg, azure-storage, jmespath, docutils, botocore, s3transfer, boto3, smmap2, gitdb2, GitPython, grpcio, protobuf, googleapis-common-protos, grpc-google-iam-v1, pytz, google-api-core, google-cloud-pubsub
Found existing installation: python-dateutil 2.7.3
Uninstalling python-dateutil-2.7.3:
Successfully uninstalled python-dateutil-2.7.3
Running setup.py install for grpcio: started
Running setup.py install for grpcio: still running...
Running setup.py install for grpcio: still running...
Running setup.py install for grpcio: still running...
Thanks.
I had the same issue, and it was solved by upgrading pip; newer pip releases recognize the manylinux wheel tags that grpcio publishes, so a pre-built wheel is installed instead of compiling from source:
$ pip3 install --upgrade pip
Here's a word from one of the maintainers of the grpc project:
pip grpcio install is (still) very slow #22815
Had the same issue; fixed it by using a virtualenv and a multi-stage Dockerfile:
FROM python:3.7-slim as base
# ---- compile image -----------------------------------------------
FROM base AS compile-image
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
build-essential \
gcc
RUN python -m venv /app/env
ENV PATH="/app/env/bin:$PATH"
COPY requirements.txt .
RUN pip install --upgrade pip
# pip install is fast here (while slow without the venv):
RUN pip install -r requirements.txt
# ---- build image -----------------------------------------------
FROM base AS build-image
COPY --from=compile-image /app/env /app/env
# Make sure we use the virtualenv:
ENV PATH="/app/env/bin:$PATH"
COPY . /app
WORKDIR /app
Here is my requirements.txt:
fastapi==0.27.*
grpcio-tools==1.21.*
uvicorn==0.7.*
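To build and run the image above (the tag name here is arbitrary):
docker build -t myapp .
docker run --rm -it myapp python
Only the build-image stage ends up in the final image, so the compiler toolchain from the compile stage adds nothing to its size.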
Some Docker images (looking at you, Alpine) can't use prebuilt wheels, because the standard manylinux wheels target glibc while Alpine uses musl. Use a base image that can, like the Debian-based ones.
Check out this nice write-up on why. I'll reproduce a quote from it that's especially apt:
But for Python, as Alpine doesn't use the standard tooling used for building Python extensions, when installing packages, in many cases Python (pip) won't find a precompiled installable package (a "wheel") for Alpine. And after debugging lots of strange errors you will realize that you have to install a lot of extra tooling and build a lot of dependencies just to use some of these common Python packages. 😩
-- Sebastián Ramírez

Copy installed packages using pip to another environment

I downloaded some packages into my environment using pip, and I want a copy of them to transfer to another environment. I know that using:
pip freeze > requirements.txt
will write the requirements to a file, but since my second environment does not have access to the internet, I cannot use:
pip install -r requirements.txt
to install those packages again.
Is there any way to copy the installed packages, or somehow install packages into a specified directory in my first environment?
Thanks
You can use pip download followed by pip install --find-links to achieve what you want. Here are the steps involved:
Get the requirements:
pip freeze > requirements.txt
Download the packages to a folder:
pip download -r requirements.txt -d path_to_the_folder
From the new environment, install from that folder (adding --no-index so pip never tries to reach the internet):
pip install -r requirements.txt --no-index --find-links=path_to_the_folder
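One caveat: pip download fetches files that match the machine it runs on. If the two environments differ in OS, architecture, or Python version, you can pin the target explicitly, assuming every package publishes wheels (pip requires --only-binary=:all: whenever --platform is given); the platform tag below is just an example:
pip download -r requirements.txt -d path_to_the_folder --platform manylinux2014_x86_64 --python-version 38 --only-binary=:all: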

How to save pip packages

We have a Python/Django-based web application, many components of which are installed using pip. I would like to ask if there is a way to download and save the particular Python packages that we are installing with pip (for example: pip install django==1.5.1). In the end we would like a collection of the packages, in the versions known to work, with which the app was developed locally. Any and all advice will be appreciated.
If I understood your question right, you can pip freeze > requirements.txt; this command writes all the libraries you have used/"downloaded" for your app into requirements.txt (if the file already exists, it will be overwritten). This allows you to later run pip install -r requirements.txt. However, be aware that your Django project must be running in a virtual environment; otherwise pip freeze will record every Python package on your development machine, and the install command will then attempt to install all of them.
The freeze command captures the exact versions currently in use, so a later installation will install those same versions. Your requirements file will look something like:
Flask==0.8
Jinja2==2.6
Werkzeug==0.8.3
certifi==0.0.8
chardet==1.0.1
distribute==0.6.24
gunicorn==0.14.2
requests==0.11.1
Your packages are installed (if using virtualenv) at: <your project>/<your virtualenv>/lib/<python version>/site-packages/
As for downloading, you can use the pip install --download command, as @atupal suggested in his response (note that this flag has since been removed from pip; modern versions use pip download instead). However, consider whether this is really needed; you could also fork those libraries on GitHub to accomplish the same.
Here is a good source of information on how this works: http://www.pip-installer.org/en/latest/cookbook.html
Maybe what you want is:
Download the packages:
pip install --download /path/to/download/to packagename
OR
pip install --download=/path/to/packages/downloaded -r requirements.txt
Install all of the libraries just downloaded:
pip install --no-index --find-links="/path/to/downloaded/dependencies" packagename
OR
pip install --no-index --find-links="/path/to/downloaded/packages" -r requirements.txt
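Note that pip install --download was deprecated and later removed; on any modern pip the equivalent two steps would be (same paths as above):
pip download -r requirements.txt -d /path/to/downloaded/packages
pip install --no-index --find-links="/path/to/downloaded/packages" -r requirements.txt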
Shamelessly stolen from this question
Create a requirements.txt file.
Put:
django==1.5.1
in the first line.
Then run pip install -r requirements.txt
Then you can complete that file...
