I rewrote a sync Python lib to be async. How do I integrate it into my project?
I did the following:
cloned it from GitHub and rewrote it,
built the lib using python3 setup.py bdist_wheel --universal, and
got the resulting .whl file.
How can I integrate it into my project?
Currently I have the following Dockerfile:
FROM python:3.6
MAINTAINER ...
COPY requirements.txt requirements.txt
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
COPY . $APP_DIR
EXPOSE 8080
CMD python app.py
How do I copy the .whl file into the container and install it using pip3 install {...}.whl?
First, add WORKDIR /app before COPY requirements.txt to set your app's working directory inside the container. Then, if you have xxx.whl in the same folder as requirements.txt, copy it into the image with COPY xxx.whl /app and install it with RUN pip install xxx.whl,
like this:
FROM python:3.6
MAINTAINER ...
# specify workdir
WORKDIR /app
COPY requirements.txt /app
# copy xxx.whl from host to container
COPY xxx.whl /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
# install xxx.whl
RUN pip install xxx.whl
COPY . /app
EXPOSE 8080
CMD python app.py
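With that Dockerfile in place, you can build and run the image as usual (a minimal sketch; the tag myapp is just a placeholder, and -p 8080:8080 matches the EXPOSE line above):
docker build -t myapp .
docker run -p 8080:8080 myapp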
Related
I have folder xyz that contains a package I need to pip install with docker. Normally the following command works
pip install xyz/python
My Dockerfile is as follows
FROM python:3.7
RUN mkdir /app
WORKDIR /app
RUN pip install xyz/python/
COPY . .
CMD ["./main.py"]
This is the end result of running docker-compose up
Building test_py
Step 1/6 : FROM python:3.7
---> 7c891de3e220
Step 2/6 : RUN mkdir /app
---> Using cache
---> dda26c8c800e
Step 3/6 : WORKDIR /app
---> Using cache
---> 494a714d91ef
Step 4/6 : RUN pip install xyz/python/
---> Running in 099f6997979a
ERROR: Invalid requirement: 'xyz/python/'
Hint: It looks like a path. File 'xyz/python/' does not exist.
WARNING: You are using pip version 22.0.4; however, version 22.1.1 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
ERROR: Service 'test_py' failed to build: The command '/bin/sh -c pip install xyz/python/' returned a non-zero code: 1
As mentioned in the comments:
You should just put COPY . . before RUN pip install xyz/python
(otherwise the xyz folder won't be available in the docker context during the image build)
And RUN mkdir /app can/should also be removed, because WORKDIR /app itself is the Dockerfile equivalent of mkdir -p /app && cd /app.
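Applied to the question's Dockerfile, the fixed version would look roughly like this (a sketch based on the original):
FROM python:3.7
WORKDIR /app
COPY . .
RUN pip install xyz/python/
CMD ["./main.py"]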
My Dockerfile
FROM python:3.7 AS builder
RUN python3 -m venv /venv
COPY requirements.txt .
RUN /venv/bin/pip3 install -r requirements.txt
FROM python:3.7
WORKDIR /home/sokov_admin/www/bot-telegram
COPY . .
CMD ["/venv/bin/python", "./bot.py"]
When I run the docker image I have this error:
docker: Error response from daemon: OCI runtime create failed:
container_linux.go:380: starting container process caused: exec:
"/venv/bin/python": stat /venv/bin/python: no such file or directory:
unknown.
What should I change in my code?
The example you show doesn't need any OS-level dependencies for Python dependency builds. That simplifies things significantly: you can do things in a single Docker build stage, without a virtual environment, and there wouldn't be any particular benefit from splitting it up.
FROM python:3.7
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["./bot.py"]
The place where a multi-stage build with a virtual environment helps is if you need a full C toolchain to build Python libraries. In this case, in a first stage, you install the C toolchain and set up the virtual environment. In the second stage you need to COPY --from=... the entire virtual environment to the final image.
# Builder stage:
FROM python:3.7 AS builder
# Install OS-level dependencies
RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive \
apt-get install --no-install-recommends --assume-yes \
build-essential
# libmysql-client-dev, for example
# Create the virtual environment
RUN python3 -m venv /venv
ENV PATH=/venv/bin:$PATH
# Install Python dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip3 install -r requirements.txt
# If your setup.py/setup.cfg has a console script entry point,
# install the application too
# COPY . .
# RUN pip3 install .
# Final stage:
# must be _exactly_ the same image as the builder
# (Dockerfile comments must be on their own line, not trailing an instruction)
FROM python:3.7
# Install OS-level dependencies if needed (libmysqlclient, not ...-dev)
# RUN apt-get update && apt-get install ...
# Copy the virtual environment; must be _exactly_ the same path
COPY --from=builder /venv /venv
ENV PATH=/venv/bin:$PATH
# Copy in the application (if it wasn't `pip install`ed into the venv)
WORKDIR /app
COPY . .
# Say how to run it
EXPOSE 8000
CMD ["./bot.py"]
I am having lots of problems creating a Docker container using python:3.6-alpine for Plotly. Plotly also uses Pandas and NumPy. When I run my Dockerfile below, the RUN venv/bin/pip install -r requirements.txt step fails. Does anyone have recommendations? Am I missing requirements?
FROM python:3.6-alpine
RUN adduser -D visualdata
RUN pip install --upgrade pip
WORKDIR /home/visualdata
COPY requirements.txt requirements.txt
RUN python -m venv venv
RUN venv/bin/pip install -r requirements.txt
RUN venv/bin/pip install gunicorn
#RUN venv/bin/pip install install python3-pymysql
COPY app app
COPY migrations migrations
COPY visualdata.py config.py boot.sh ./
RUN chmod a+x boot.sh
ENV FLASK_APP visualdata.py
RUN chown -R visualdata:visualdata ./
USER visualdata
EXPOSE 8000
ENTRYPOINT ["./boot.sh"]
If you look at the official Python Docker image repository, there is a Dockerfile example that illustrates the pip step:
RUN pip install --no-cache-dir -r requirements.txt
You should be able to use pip directly instead of venv/bin/pip.
You do not really need to use a virtualenv in a docker container if you are only running one application inside. The container already provides its own isolated environment.
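Applied to the Dockerfile above, dropping the virtualenv might look like this (a sketch of just the changed portion, untested on Alpine):
FROM python:3.6-alpine
RUN adduser -D visualdata
WORKDIR /home/visualdata
COPY requirements.txt requirements.txt
# install directly into the image's Python; no venv needed
RUN pip install --upgrade pip \
 && pip install --no-cache-dir -r requirements.txt gunicorn
# ... the rest of the Dockerfile is unchanged ...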
Hello, I'm having difficulty writing a Dockerfile to run a Python spider script which is inside my project directory.
The script file is in scrapy_estate/tutorial/tutorial/spiders/. I thought of using the COPY command, but CMD ["python3","real_estate_spider.py"] still can't find the real_estate_spider.py file.
Here is my Dockerfile
FROM ubuntu:18.04
FROM python:3.6-onbuild
RUN apt-get update &&apt-get upgrade -y&& apt-get install python-pip -y
RUN pip install --upgrade pip
RUN pip install scrapy
WORKDIR /usr/local/bin
COPY scrapy_estate/tutorial/tutorial/spiders/real_estate_spider.py
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 80
CMD ["python3","real_estate_spider.py"]
Hope someone can help me :)
It looks like you are forgetting the './' destination in the COPY line for your script file.
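That is, the COPY instruction needs both a source and a destination (sketch):
COPY scrapy_estate/tutorial/tutorial/spiders/real_estate_spider.py ./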
I have a requirements.txt file containing, amongst others:
Flask-RQ==0.2
-e git+https://token:x-oauth-basic@github.com/user/repo.git#egg=repo
When I try to build a Docker container using Docker Compose, it downloads both packages and installs them both, but when I do a pip freeze there is no sign of the -e package. When I try to run the app, it looks as if this package hasn't been installed. Here's the relevant output from the build:
Collecting Flask-RQ==0.2 (from -r requirements.txt (line 3))
Downloading Flask-RQ-0.2.tar.gz
Obtaining repo from git+https://token:x-oauth-basic@github.com/user/repo.git#egg=repo (from -r requirements.txt (line 4))
Cloning https://token:x-oauth-basic@github.com/user/repo.git to ./src/repo
And here's my Dockerfile:
FROM python:2.7
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY requirements.txt /usr/src/app/
RUN pip install -r requirements.txt
COPY . /usr/src/app
I find this situation very strange and would appreciate any help.
I ran into a similar issue, and one possible way that the problem can appear is from:
WORKDIR /usr/src/app
being set before pip install. pip will create the src/ directory (where the package is installed) inside of the WORKDIR. Now all of this shouldn't be an issue since your app files, when copied over, should not overwrite the src/ directory.
However, you might be mounting a volume to /usr/src/app. When you do that, you'll overwrite the /usr/src/app/src directory and then your package will not be found.
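For example, a Compose volume like this (hypothetical) would shadow the installed sources:
services:
  web:
    build: .
    volumes:
      - .:/usr/src/app   # mounts over /usr/src/app/src, hiding the pip-installed package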
So one fix is to move WORKDIR after the pip install. Your Dockerfile will then look like:
FROM python:2.7
RUN mkdir -p /usr/src/app
COPY requirements.txt /usr/src/app/
RUN pip install -r /usr/src/app/requirements.txt
COPY . /usr/src/app
WORKDIR /usr/src/app
This fixed it for me. Hopefully it'll work for you.
@mikexstudios is correct: this happens because pip stores the package source in /usr/src/app/src, but you're mounting a local directory over top of it, meaning Python can't find the package source.
Rather than changing the position of WORKDIR, I solved it by changing the pip command to:
pip install -r requirements.txt --src /usr/local/src
Either approach should work.
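In the Dockerfile from the earlier answer, that is just the pip line changed (a sketch):
COPY requirements.txt /usr/src/app/
RUN pip install -r /usr/src/app/requirements.txt --src /usr/local/src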
If you are receiving a similar error when installing a git repo from a requirements file in a dockerized container, you may have forgotten to install git.
Here is the error I received:
Downloading/unpacking CMRESHandler from
git+git://github.com/zigius/python-elasticsearch-logger.git (from -r
/home/ubuntu/requirements.txt (line 5))
Cloning git://github.com/zigius/python-elasticsearch-logger.git to
/tmp/pip_build_root/CMRESHandler
Cleaning up...
Cannot find command 'git'
Storing debug log for failure in /root/.pip/pip.log
The command '/bin/sh -c useradd ubuntu -b /home && echo
"ubuntu ALL = NOPASSWD: ALL" >> /etc/sudoers &&
chown -R ubuntu:ubuntu /home/ubuntu && pip install -r /home/ubuntu/requirements.txt returned a non-zero code: 1
Here is an example Dockerfile that installs git and then installs all requirements:
FROM python:3.5-slim
# install git so pip can clone VCS requirements
RUN apt-get update && apt-get install -y --no-install-recommends git
ADD . /code
WORKDIR /code
RUN pip install --upgrade pip setuptools && pip install -r requirements.txt
Now you can use git packages in your requirements file in a dockerized environment.
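For example, with git installed, a VCS requirement like the one from the log above works; an https variant would be (sketch):
git+https://github.com/zigius/python-elasticsearch-logger.git#egg=CMRESHandler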