Not able to install any Python package in a Docker container - python

I am trying to create a Docker image with Ubuntu 16.04 as the base. I want to install a few Python packages like pandas, Flask, etc. I have kept all the packages in "requirements.txt". But when I try to build the image, I get:
Could not find a version that satisfies the requirement requests (from -r requirements.txt (line 1)) (from versions: )
No matching distribution found for requests (from -r requirements.txt (line 1))
Basically, I have not pinned any versions in "requirements.txt". I assumed pip would take the latest available and compatible version of each package, but I get the same error for every package.
My Dockerfile is as follows:
FROM ubuntu:16.04
RUN apt-get update -y && \
apt-get install -y python3-pip python3-dev build-essential cmake pkg-config libx11-dev libatlas-base-dev
# We copy just the requirements.txt first to leverage Docker cache
COPY ./requirements.txt /testing/requirements.txt
WORKDIR /testing
RUN pip3 install -r requirements.txt
and requirements.txt is:
pandas
requests
PyMySQL
Flask
Flask-Cors
Pillow
face-recognition
Flask-SocketIO
Where am I going wrong? Can anybody help?

I ran into the same situation. I observed that pip was looking for the network from within Docker, but the build behaved as if it were running standalone without a network, so it could not locate the packages. In these situations, either a
No matching distribution found
or sometimes
Retrying ...
error may occur.
I used the --network option in the docker build command, as below, to overcome this error; it makes the build use the host network so pip can download the required packages.
docker build --network=host -t tracker:latest .
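If host networking is undesirable, the underlying cause is often DNS resolution failing inside the build containers. A common alternative (a sketch, assuming a standard Linux host with systemd) is to give the Docker daemon explicit DNS servers and restart it:
# /etc/docker/daemon.json
{
  "dns": ["8.8.8.8", "8.8.4.4"]
}
# then:
sudo systemctl restart docker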

Try using this:
RUN python3.6 -m pip install --upgrade pip \
&& python3.6 -m pip install -r requirements.txt
By using it this way, you specify the Python version whose environment the packages are installed into.
Change it to python3.7 if you wish to use version 3.7.
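Before installing, it can help to confirm which interpreter a given pip invocation is bound to; pip reports this itself (the output shown is illustrative):
python3.6 -m pip --version
# e.g.: pip 21.3.1 from /usr/local/lib/python3.6/site-packages/pip (python 3.6)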

I suggest using the official Python image instead. Your Dockerfile then becomes:
FROM python:3
WORKDIR /testing
COPY ./requirements.txt /testing/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
... etc ...
Now, regarding Angular/Node: you have two options from here: 1) install Angular/Node on the Python image; or 2) use Docker's multi-stage build feature, building the Angular and Python images separately before merging them together. Option 2 is recommended, but it takes some work. It would probably look like this:
FROM node:8 as node
# Angular-specific build
FROM python:3 as python
# Python-specific build
# Then copy your data from the Angular image to the Python one:
COPY --from=node /usr/src/app/dist/angular-docker /usr/src/app
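A slightly fuller sketch of that multi-stage layout, assuming (hypothetically) that the Angular code lives in ./client and that the Angular CLI outputs to dist/angular-docker as above:
# Stage 1: build the Angular front end
FROM node:8 as node
WORKDIR /usr/src/app
COPY client/package*.json ./
RUN npm install
COPY client/ .
RUN npm run build
# Stage 2: the Python back end
FROM python:3 as python
WORKDIR /testing
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Copy the compiled front-end assets out of the first stage
COPY --from=node /usr/src/app/dist/angular-docker /usr/src/app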

Related

How to make docker build cache the pip installs as well?

I am working with the following Dockerfile.
FROM ubuntu
RUN apt-get update
RUN apt-get install -y \
python3.9 \
python3-pip
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
EXPOSE 8501
ENTRYPOINT [ "streamlit", "run" ]
CMD ["app.py"]
Whenever I rebuild the image, Docker uses the previously cached layers and builds quickly, except when it reaches the RUN pip install -r requirements.txt step, which it runs every time I rebuild (it is not cached). The problem is that one of the requirements in requirements.txt is streamlit==1.9.0, which has a large number of dependencies, so it slows down the rebuild. Long story short, Docker caches RUN apt-get install foo but not RUN pip install bar, which, I guess, is expected. How would you work around this to speed up the image rebuild when you have a long requirements.txt file?
It can't cache it because the files in app are changing constantly (I guess). The way it's usually done is to copy requirements.txt separately first, then install, then copy everything else. This way, Docker can cache the install step because requirements.txt didn't change.
FROM ubuntu
RUN apt-get update
RUN apt-get install -y \
python3.9 \
python3-pip
WORKDIR /app
# Copy ONLY requirements first; this layer is cached as long as the file doesn't change
# (note: Dockerfiles don't support trailing comments on COPY lines, so keep comments on their own lines)
COPY requirements.txt .
# This reuses the cache as long as the previous layer was cached as well
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8501
ENTRYPOINT [ "streamlit", "run" ]
CMD ["app.py"]
Docker BuildKit has introduced mounting during build time.
See https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/syntax.md
For the pip scenario in particular, check out https://dev.doroshev.com/docker-mount-type-cache/
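A minimal sketch of the cache-mount approach for pip, mirroring the Dockerfile above (the target path is pip's default cache directory; it requires building with BuildKit enabled):
# syntax=docker/dockerfile:1
FROM ubuntu
RUN apt-get update && apt-get install -y python3.9 python3-pip
WORKDIR /app
COPY requirements.txt .
# pip's download/wheel cache now survives rebuilds even when this layer re-runs
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt
COPY . .
Build it with DOCKER_BUILDKIT=1 docker build . (or any recent Docker version, where BuildKit is the default builder).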

How to install non-Python software in a container built from a Python image

I have a Python application that I want to containerise with Docker. I'm using a Dockerfile such as:
FROM python:3.9
WORKDIR /app
COPY myapp ./myapp
COPY requires.txt .
RUN pip install --no-cache-dir -r requires.txt
Now, my problem is that I have to install certain dependencies that are not just Python packages, for example Tesseract.
How do I install this in the built image? Is the python:3.9 image built on top of an OS, such as Ubuntu, where I can add software? If so, how? Or does it make more sense to start with a proper OS image, say Debian for instance, install Tesseract and Python from there (using RUN apt-get ...), and then use python -m pip ... to install the Python dependencies?
The python:3.9 image is based on the debian:bullseye image, so you can use apt to install tesseract like this:
FROM python:3.9
RUN apt update && apt install -y tesseract-ocr
WORKDIR /app
COPY myapp ./myapp
COPY requires.txt .
RUN pip install --no-cache-dir -r requires.txt
As for starting from a base OS image or from a python image, I always try to get as much for free as I can and use the image that I have to modify the least.
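If you want to confirm what OS a base image is built on, you can simply ask the image:
docker run --rm python:3.9 cat /etc/os-release
# the PRETTY_NAME line reports the underlying Debian release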
Another way is to load the required .tar files needed to install the specific package, or just use a precompiled binary.
As you can see in the documentation, they have binaries for Ubuntu, Debian, and Windows (use the Ubuntu builds for an Ubuntu-based Docker image).
You can call them directly from a supporting Python library or from the command line.
Also, make sure to provide the required language data (add those files in the Dockerfile) if needed.
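As an illustration of the wrapper-library route: assuming the tesseract-ocr package is installed as above and pytesseract plus Pillow are added to requires.txt, calling the binary from Python might look like this (scan.png is a placeholder file name):
from PIL import Image
import pytesseract

# pytesseract shells out to the tesseract binary installed via apt
text = pytesseract.image_to_string(Image.open("scan.png"), lang="eng")
print(text)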

docker image build fails at RUN pip install -r requirements.txt in a Flask app

I have a Flask application with the following requirements.txt:
chalice
matplotlib
sklearn
numpy
scipy
pandas
flask
flask_restful
and the following Dockerfile:
FROM python:3.6.1-alpine
WORKDIR /project
ADD . /project
RUN pip install -r requirements.txt
CMD ["python","app.py"]
Running the command docker image build -t clf_test .
generates the following error:
You are using pip version 9.0.1, however version 20.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
It seems that matplotlib can't be installed for some reason.
Running pip install -r requirements.txt locally doesn't produce any errors.
On Alpine, matplotlib must be built from source, and compiling it requires a number of supporting libraries as well as a working C compiler. You can figure out what these are and install them so that it builds properly (a sketch of this route follows the Dockerfile below)...
...or you can just base your Dockerfile on the non-alpine python:3.6.1 image and then apt-get install python3-matplotlib before installing your other requirements. E.g., this builds without errors:
FROM python:3.6.1
WORKDIR /project
ADD . /project
RUN apt-get update && apt-get install -y python3-matplotlib
RUN pip install -r requirements.txt
CMD ["python","app.py"]

Always installing VS Code plugins in a Docker container doesn't work

I am using VS Code with a Docker container. I have the following entry in my user settings.json:
"remote.containers.defaultExtensions": [
"ms-python.python",
"ms-azuretools.vscode-docker",
"ryanluker.vscode-coverage-gutters"
]
But when I build or rebuild the container, these plugins don't get installed automatically inside the container.
Am I doing something wrong?
Modified
Here is what my Dockerfile looks like:
FROM ubuntu:bionic
RUN apt-get update
RUN apt-get install -y python3.6 python3-pip
RUN apt-get install -y git libgl1-mesa-dev
# Currently not using requirements.txt to improve caching
#COPY requirements.txt /home/projects/my_project/
#WORKDIR /home/projects/my_project/
#RUN pip3 install -r requirements.txt
RUN pip3 install torch pandas PyYAML==5.1.2 autowrap Cython==0.29.14
RUN pip3 install numpy==1.17.3 open3d-python==0.7.0.0 pytest==5.2.4 pptk
RUN pip3 install scipy==1.3.1 natsort matplotlib lxml opencv-python==3.2.0.8
RUN pip3 install Pillow scikit-learn testfixtures
RUN pip3 install pip-licenses pylint pytest-cov
RUN pip3 install autopep8
COPY . /home/projects/my_project/
This might be an old question, but to whomever it may concern, here is one solution. I ran into this problem where, in particular, the Python extension would not install itself inside my Docker container in VS Code. To get it to install the Python extension (and, in my case, everything else), you have to pin the extension version, like:
"extensions": [
"ms-azuretools.vscode-docker",
"ms-python.python#2020.9.114305",
"ms-python.vscode-pylance"
]
If you want to see this in action you can clone my repository. Simply open the repo in VS Code, install the Remote - Containers extension, and it should start the Docker container all by itself.
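For context, that extensions list lives in the folder's .devcontainer/devcontainer.json. A minimal, illustrative file tying it to a Dockerfile like the one in the question (the name and paths are assumptions):
{
  "name": "my_project",
  "build": { "dockerfile": "../Dockerfile" },
  "extensions": [
    "ms-azuretools.vscode-docker",
    "ms-python.python#2020.9.114305",
    "ms-python.vscode-pylance"
  ]
}
Note that newer versions of the extension nest this list under customizations.vscode rather than at the top level.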

How to run SSL_library_init() from the Python 3.7 Docker image

Up until recently I had been using the OpenSSL library within the python:3.6.6-jessie Docker image and things worked as intended.
I'm using a very basic Dockerfile to install all the necessary dependencies:
FROM python:3.6.6-jessie
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
RUN apt-get -qq update
RUN apt-get install -y openssl
RUN apt-get upgrade -y openssl
ADD requirements.txt /code/
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
And I access and initialize the library itself with these two lines:
openssl = cdll.LoadLibrary("libssl.so")
openssl.SSL_library_init()
Things were working great with this approach.
This week I was upgrading Python and libraries, and as a result I switched to a newer Docker image:
FROM python:3.7.5
...
This immediately caused OpenSSL to stop working with this exception:
AttributeError: /usr/lib/x86_64-linux-gnu/libssl.so.1.1: undefined symbol: SSL_library_init
From this error I understand that libssl no longer provides the SSL_library_init method (or so it seems), which is rather weird, because the initializer name in the OpenSSL documentation is the same.
I also tried the -stretch and -buster variants, but the issue remains.
What is the correct approach to running SSL_library_init in these newer distributions? Maybe some additional Dockerfile configuration is required?
I think you need to install libssl1.0-dev
RUN apt-get install -y libssl1.0-dev
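For background: in OpenSSL 1.1.0 the SSL_library_init() entry point became a macro for OPENSSL_init_ssl(), so the old symbol is no longer exported by libssl.so.1.1. If you would rather adapt the code than pin the old library, a sketch that handles both generations:
from ctypes import cdll, c_uint64

openssl = cdll.LoadLibrary("libssl.so")
try:
    openssl.SSL_library_init()                  # OpenSSL 1.0.x
except AttributeError:
    # OpenSSL 1.1.0+: opts=0 and settings=NULL request the default initialization
    openssl.OPENSSL_init_ssl(c_uint64(0), None)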
