I have a folder xyz that contains a package I need to pip install with Docker. Normally the following command works:
pip install xyz/python
My Dockerfile is as follows:
FROM python:3.7
RUN mkdir /app
WORKDIR /app
RUN pip install xyz/python/
COPY . .
CMD ["./main.py"]
This is the end result of running docker-compose up
Building test_py
Step 1/6 : FROM python:3.7
---> 7c891de3e220
Step 2/6 : RUN mkdir /app
---> Using cache
---> dda26c8c800e
Step 3/6 : WORKDIR /app
---> Using cache
---> 494a714d91ef
Step 4/6 : RUN pip install xyz/python/
---> Running in 099f6997979a
ERROR: Invalid requirement: 'xyz/python/'
Hint: It looks like a path. File 'xyz/python/' does not exist.
WARNING: You are using pip version 22.0.4; however, version 22.1.1 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
ERROR: Service 'test_py' failed to build: The command '/bin/sh -c pip install xyz/python/' returned a non-zero code: 1
As mentioned in the comments:
You should just put COPY . . before RUN pip install xyz/python
(otherwise the xyz folder won't yet exist inside the image when the pip install step runs).
RUN mkdir /app can (and should) also be removed, because WORKDIR /app is itself the Dockerfile equivalent of mkdir -p /app && cd /app.
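A corrected Dockerfile, assuming the xyz folder sits in the build context next to the Dockerfile, could look like this (a minimal sketch of the fix described above):
FROM python:3.7
WORKDIR /app
# Copy the application, including the xyz folder, into the image first
COPY . .
# Now the local package path exists inside the image
RUN pip install xyz/python/
CMD ["./main.py"]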
Related
My Dockerfile
FROM python:3.7 AS builder
RUN python3 -m venv /venv
COPY requirements.txt .
RUN /venv/bin/pip3 install -r requirements.txt
FROM python:3.7
WORKDIR /home/sokov_admin/www/bot-telegram
COPY . .
CMD ["/venv/bin/python", "./bot.py"]
When I run the docker image I have this error:
docker: Error response from daemon: OCI runtime create failed:
container_linux.go:380: starting container process caused: exec:
"/venv/bin/python": stat /venv/bin/python: no such file or directory:
unknown.
What should I change in my code?
The example you show doesn't need any OS-level dependencies for Python dependency builds. That simplifies things significantly: you can do things in a single Docker build stage, without a virtual environment, and there wouldn't be any particular benefit from splitting it up.
FROM python:3.7
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["./bot.py"]
The place where a multi-stage build with a virtual environment helps is if you need a full C toolchain to build Python libraries. In this case, in a first stage, you install the C toolchain and set up the virtual environment. In the second stage you need to COPY --from=... the entire virtual environment to the final image.
# Builder stage:
FROM python:3.7 AS builder
# Install OS-level dependencies
RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive \
apt-get install --no-install-recommends --assume-yes \
build-essential
# libmysqlclient-dev, for example
# Create the virtual environment
RUN python3 -m venv /venv
ENV PATH=/venv/bin:$PATH
# Install Python dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip3 install -r requirements.txt
# If your setup.py/setup.cfg has a console script entry point,
# install the application too
# COPY . .
# RUN pip3 install .
# Final stage: must use _exactly_ the same base image as the builder
FROM python:3.7
# Install OS-level dependencies if needed (libmysqlclient, not ...-dev)
# RUN apt-get update && apt-get install ...
# Copy the virtual environment; must be _exactly_ the same path
COPY --from=builder /venv /venv
ENV PATH=/venv/bin:$PATH
# Copy in the application (if it wasn't `pip install`ed into the venv)
WORKDIR /app
COPY . .
# Say how to run it
EXPOSE 8000
CMD ["./bot.py"]
I have a requirements.txt file which contains the following package:
git+https://username:password@gitlab.mycompany.com/mypackage.git@master#egg=mypackage
I am able to build my docker image using a basic dockerfile.
However, I'm trying to use a more complex Dockerfile to get my Docker image to be as slim as possible:
FROM python:3.7-alpine as base
COPY . /app
WORKDIR /app
FROM base AS dependencies
COPY requirements.txt ./
RUN apk add --no-cache make automake gcc g++ git && \
pip install -r requirements.txt
FROM base
WORKDIR /app
COPY . /app
COPY --from=dependencies /root/.cache /root/.cache
COPY requirements.txt ./
RUN pip install -r requirements.txt && rm -rf /root/.cache
EXPOSE 8000
CMD python main.py
The problem is that during the last phase of the build I get an error that 'git' cannot be found, i.e. the build tries to pull 'mypackage' again instead of taking it from the "dependencies" stage. Any idea how to fix this?
The error:
Error [Errno 2] No such file or directory: 'git': 'git' while executing command git clone -q Cannot find command 'git' - do you have 'git' installed and in your PATH?
You don't have git in your last (3rd) stage, because you only installed git in dependencies, while the last stage derives from base, which is plain Alpine Python.
So when you try to RUN pip install -r requirements.txt && rm -rf /root/.cache, you fail on the requirement that uses the git protocol.
If you need your final image to be slim, there are a few options for fixing it:
- use a venv (Python virtual environment): create it in the 2nd stage and COPY it into the last one; then there is no need to install the requirements again (see the sketch below)
- download the requirements to local disk (e.g. with pip download) in the 2nd stage, then COPY them into the 3rd stage and install from there (you may still need gcc in the 3rd stage, but not git)
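A rough sketch of the venv option, adapted to the Alpine images from the question (stage names and paths are illustrative):
FROM python:3.7-alpine AS dependencies
# Build tools and git are only needed in this stage
RUN apk add --no-cache make automake gcc g++ git
RUN python -m venv /venv
ENV PATH=/venv/bin:$PATH
COPY requirements.txt .
RUN pip install -r requirements.txt

FROM python:3.7-alpine
# Reuse the already-built virtual environment; no git or compilers needed here
COPY --from=dependencies /venv /venv
ENV PATH=/venv/bin:$PATH
WORKDIR /app
COPY . /app
EXPOSE 8000
CMD ["python", "main.py"]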
I'm running Docker on Windows 10, and my Dockerfile looks like this:
FROM python:2.7-slim
WORKDIR E:/docker
ADD . E:/docker
RUN pip install -r requirements.txt
EXPOSE 80
ENV NAME World
CMD ["python", "app.py"]
When I run
docker build -t friendlyhello .
I get an error like this:
E:\docker-demo>docker build -t friendlyhello .
Sending build context to Docker daemon 4.608kB
Step 1/7 : FROM python:2.7-slim
---> d0d1b97dd328
Step 2/7 : WORKDIR E:/docker
---> Using cache
---> 305b573b82a5
Step 3/7 : ADD . E:/docker
---> 6d25bd33ba84
Step 4/7 : RUN pip install -r E:/docker-demo/requirements.txt
---> Running in e68dd3cd1c71
Could not open requirements file: [Errno 2] No such file or directory: 'E:/docker-demo/requirements.txt'
The command '/bin/sh -c pip install -r E:/docker-demo/requirements.txt' returned a non-zero code: 1
It runs the RUN step with /bin/sh -c, but I thought it should be cmd /S /C on Windows. pip install -r requirements.txt works when I run it directly on the command line. I have no idea how to resolve this. Thanks for any answer.
requirements.txt should be present in the same location as your Dockerfile.
Replace E:/docker with /tmp/ (or any suitable Linux-style path) in the Dockerfile and execute
docker build -t friendlyhello .
My Dockerfile:
FROM python:2.7-slim
WORKDIR /tmp/
ADD . /tmp/
RUN pip install -r requirements.txt
EXPOSE 80
ENV NAME World
CMD ["python", "app.py"]
One idea would be to specify the full path to requirements.txt and make sure that it exists.
You have to find the correct build context for Docker to start from.
Try navigating to the parent folder and using the -f argument of docker build to specify the path to the Dockerfile:
cd ../ParentFolder
docker build -f ../ParentFolder/ChildFolder/Dockerfile -t imagename .
Otherwise, try editing your Dockerfile.
I rewrote a sync Python lib to be async. How do I integrate it into my project?
I did the following:
- cloned it from GitHub and rewrote it
- built the lib using python3 setup.py bdist_wheel --universal and got the .whl file
How can I integrate it into my project?
Currently I have the following docker file:
FROM python:3.6
MAINTAINER ...
COPY requirements.txt requirements.txt
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
COPY . $APP_DIR
EXPOSE 8080
CMD python app.py
How do I copy the .whl file into the container and install it using pip3 install {...}.whl?
First, add WORKDIR /app before COPY requirements.txt to set your app's working directory inside the container. Then, if you have xxx.whl in the same folder as requirements.txt, copy it with COPY xxx.whl /app and install it with RUN pip install xxx.whl,
like this:
FROM python:3.6
MAINTAINER ...
# specify workdir
WORKDIR /app
COPY requirements.txt /app
# copy xxx.whl from host to container
COPY xxx.whl /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
# install xxx.whl
RUN pip install xxx.whl
COPY . /app
EXPOSE 8080
CMD python app.py
I have a requirements.txt file containing, amongst others:
Flask-RQ==0.2
-e git+https://token:x-oauth-basic@github.com/user/repo.git#egg=repo
When I try to build a Docker container using Docker Compose, it downloads both packages and installs them both, but when I run pip freeze there is no sign of the -e package. When I try to run the app, it looks as if this package hasn't been installed. Here's the relevant output from the build:
Collecting Flask-RQ==0.2 (from -r requirements.txt (line 3))
Downloading Flask-RQ-0.2.tar.gz
Obtaining repo from git+https://token:x-oauth-basic@github.com/user/repo.git#egg=repo (from -r requirements.txt (line 4))
Cloning https://token:x-oauth-basic@github.com/user/repo.git to ./src/repo
And here's my Dockerfile:
FROM python:2.7
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app/
RUN pip install -r requirements.txt
COPY . /usr/src/app
I find this situation very strange and would appreciate any help.
I ran into a similar issue, and one possible way that the problem can appear is from:
WORKDIR /usr/src/app
being set before pip install. pip will create the src/ directory (where the package is installed) inside of the WORKDIR. Now all of this shouldn't be an issue since your app files, when copied over, should not overwrite the src/ directory.
However, you might be mounting a volume to /usr/src/app. When you do that, you'll overwrite the /usr/src/app/src directory and then your package will not be found.
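For example, a docker-compose.yml volume mount like this (service name and paths are illustrative, not taken from the question) would shadow the src/ directory that pip created during the build:
services:
  web:
    build: .
    volumes:
      # The host directory is mounted over /usr/src/app at run time,
      # hiding the /usr/src/app/src directory created by pip install
      - .:/usr/src/app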
So one fix is to move WORKDIR after the pip install. So your Dockerfile will look like:
FROM python:2.7
RUN mkdir -p /usr/src/app
COPY requirements.txt /usr/src/app/
RUN pip install -r /usr/src/app/requirements.txt
COPY . /usr/src/app
WORKDIR /usr/src/app
This fixed it for me. Hopefully it'll work for you.
@mikexstudios is correct: this happens because pip stores the package source in /usr/src/app/src, but you're mounting a local directory over the top of it, meaning Python can't find the package source.
Rather than changing the position of WORKDIR, I solved it by changing the pip command to:
pip install -r requirements.txt --src /usr/local/src
Either approach should work.
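In the Dockerfile from the question, that would amount to changing only the pip line (a sketch, leaving the rest of the file unchanged):
RUN pip install -r requirements.txt --src /usr/local/src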
If you are receiving a similar error when installing a git repo from a requirements file inside a Docker container, you may have forgotten to install git.
Here is the error I received:
Downloading/unpacking CMRESHandler from
git+git://github.com/zigius/python-elasticsearch-logger.git (from -r
/home/ubuntu/requirements.txt (line 5))
Cloning git://github.com/zigius/python-elasticsearch-logger.git to
/tmp/pip_build_root/CMRESHandler
Cleaning up...
Cannot find command 'git'
Storing debug log for failure in /root/.pip/pip.log
The command '/bin/sh -c useradd ubuntu -b /home && echo
"ubuntu ALL = NOPASSWD: ALL" >> /etc/sudoers &&
chown -R ubuntu:ubuntu /home/ubuntu && pip install -r /home/ubuntu/requirements.txt returned a non-zero code: 1
Here is an example Dockerfile that installs git and then installs all requirements:
FROM python:3.5-slim
RUN apt-get update && apt-get install -y --no-install-recommends git
ADD . /code
WORKDIR /code
RUN pip install --upgrade pip setuptools && pip install -r requirements.txt
Now you can use git packages in your requirements file in a Dockerized environment.