Django struggles to find Pillow and I'm not quite sure why.
Environment
An Alpine Linux-based Docker image, Django 2.2. Here are the relevant parts of:
The Dockerfile
RUN apk update \
&& apk add --virtual build-deps gcc python3-dev musl-dev jpeg-dev zlib-dev \
&& apk add --no-cache mariadb-dev mariadb-client
# install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
RUN pip install mysqlclient
COPY ./Pipfile /usr/src/cms/Pipfile
RUN pipenv install --skip-lock --system --dev
RUN apk del build-deps
The Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
[packages]
django = "==2.2"
markdown = "==3.1.1"
pillow = "==5.0.0"
[requires]
python_version = "3.6"
The issue
When I run python manage.py runserver 0.0.0.0:8000 from the container, I get the following error:
django.core.management.base.SystemCheckError: SystemCheckError: System check identified some issues:
ERRORS:
website.Photo.photo: (fields.E210) Cannot use ImageField because Pillow is not installed.
HINT: Get Pillow at https://pypi.org/project/Pillow/ or run command "pip install Pillow".
which is weird because pip install Pillow gives me
Requirement already satisfied: Pillow in /usr/local/lib/python3.7/site-packages (5.4.1)
About Pillow's conflict with PIL
While having a look at /usr/local/lib/python3.7/site-packages, I noticed that I had both PIL and Pillow. Is this:
1. the source of a conflict? Pillow's documentation is quite specific about the need to uninstall PIL first;
2. or simply Pillow using the very name PIL to maintain compatibility, as suggested in this discussion?
Given that i) pip uninstall PIL reports it is not installed, ii) print(PIL.PILLOW_VERSION) returns 5.0.0 in Python's shell, and iii) Django uses from PIL import Image (source), I would go for hypothesis 2. So if Pillow is installed in the container, why doesn't Django find it?
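A quick way to check hypothesis 2 from inside the container (a sketch; it relies only on pip show and the PIL.PILLOW_VERSION attribute used above):
pip show Pillow
python -c "import PIL; print(PIL.__file__); print(PIL.PILLOW_VERSION)"
If pip show lists Pillow and PIL.__file__ points into site-packages, the PIL directory is just Pillow's compatibility name, not a leftover PIL install.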
Current status
>>> from PIL import Image
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/usr/local/lib/python3.7/site-packages/PIL/Image.py", line 58, in <module>
from . import _imaging as core
ImportError: Error loading shared library libjpeg.so.8: No such file or directory (needed by /usr/local/lib/python3.7/site-packages/PIL/_imaging.cpython-37m-x86_64-linux-gnu.so)
I added jpeg-dev to the Dockerfile but, somehow, it does not seem to be enough. Still digging. Thanks for any clue.
Turns out, jpeg-dev (required at compile time) was not enough to satisfy all dependencies at run time. Adding libjpeg solved the issue. Updated Dockerfile:
# install mysqlclient
RUN apk update \
&& apk add --virtual build-deps gcc python3-dev musl-dev jpeg-dev zlib-dev \
&& apk add --no-cache mariadb-dev mariadb-client
# install dependencies
RUN pip install --upgrade pip
RUN pip install pipenv
RUN pip install mysqlclient
RUN apk add libjpeg   # <-- et voila
COPY ./Pipfile /usr/src/cms/Pipfile
RUN pipenv install --skip-lock --system --dev
RUN apk del build-deps
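For future debugging, the run-time dependencies of a compiled extension can be listed with ldd (the path is the one from the traceback above); any missing run-time library, like libjpeg here, shows up as an error line:
ldd /usr/local/lib/python3.7/site-packages/PIL/_imaging.cpython-37m-x86_64-linux-gnu.so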
In my case, as I am using Docker, I had to add
RUN python -m pip install Pillow
to my Dockerfile, before RUN pip install -r ./requirements/prod.txt.
Simply running python -m pip install Pillow in the terminal wouldn't work.
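A minimal sketch of that ordering (the requirements path is the one from this answer; the COPY line is an assumption about the project layout):
# Pillow first, then the rest of the requirements
RUN python -m pip install Pillow
COPY ./requirements ./requirements
RUN pip install -r ./requirements/prod.txt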
Related
I recently learned about the concept of building Docker images from a multi-stage Dockerfile.
I have been trying simple examples of multi-stage Dockerfiles, and they were working fine. However, when I tried implementing the concept for my own application, I ran into some issues.
My application does object detection in videos, so I use Python and TensorFlow.
Here is my Dockerfile:
FROM python:3-slim AS base
WORKDIR /objectDetector
COPY detect_objects.py .
COPY detector.py .
COPY requirements.txt .
ADD data /objectDetector/data/
ADD models /objectDetector/models/
RUN apt-get update && \
apt-get install protobuf-compiler -y && \
apt-get install ffmpeg libsm6 libxext6 -y && \
apt-get install gcc -y
RUN pip3 install update && python3 -m pip install --upgrade pip
RUN pip3 install tensorflow-cpu==2.9.1
RUN pip3 install opencv-python==4.6.0.66
RUN pip3 install opencv-contrib-python
WORKDIR /objectDetector/models/research
RUN protoc object_detection/protos/*.proto --python_out=.
RUN cp object_detection/packages/tf2/setup.py .
RUN python -m pip install .
RUN python object_detection/builders/model_builder_tf2_test.py
WORKDIR /objectDetector/models/research
RUN pip3 install wheel && pip3 wheel . --wheel-dir=./wheels
FROM python:3-slim
RUN pip3 install update && python3 -m pip install --upgrade pip
COPY --from=base /objectDetector /objectDetector
WORKDIR /objectDetector
RUN pip3 install --no-index --find-links=/objectDetector/models/research/wheels -r requirements.txt
When I try to run my application in the final stage of the container, I receive the following error:
root@3f062f9a5d64:/objectDetector# python detect_objects.py
Traceback (most recent call last):
File "/objectDetector/detect_objects.py", line 3, in <module>
import cv2
ModuleNotFoundError: No module named 'cv2'
So per my understanding, it seems that opencv-python is not successfully moved from the 1st stage to the 2nd.
I have been searching around, and I found some good blogs and questions tackling the issue of multi-stage Dockerfiles, specifically for Python libraries. However, it seems I am missing something here.
Here are some references that I have been following to solve the issue:
How do I reduce a python (docker) image size using a multi-stage build?
Multi-stage build usage for cuda,cudnn,opencv and ffmpeg #806
So my question is: how can we use OpenCV in a multi-stage Docker image?
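For reference, one common pattern (a sketch, not verified against this project: it assumes requirements.txt lists opencv-python and the other runtime packages) is to build wheels for everything in requirements.txt in the first stage, so that the final stage's pip install --no-index can actually find cv2:
# Stage 1 (sketch): build wheels for ALL runtime requirements, not only the
# object_detection package, so opencv-python lands in the wheel directory
WORKDIR /objectDetector
RUN pip3 wheel -r requirements.txt --wheel-dir=models/research/wheels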
On AWS EB I deployed a Django application, and there were no errors during that process. Health is green and OK, but the page gives an Internal Server Error, so I checked the logs and saw the error below.
... web: from .cv2 import
... web: ImportError: libGL.so.1: cannot open shared object file: No such file or directory
OpenCV should get installed while requirements.txt is processed during deployment, because it includes opencv-python==4.5.5.64,
so I am not quite sure what the above error is pointing at.
And this is how I import it in helpers.py:
import requests
import cv2
libGL.so is provided by the package libgl1; pip3 install opencv-python is not sufficient here.
Connect to the AWS instance via SSH and run:
apt-get update && apt-get install libgl1
Or even better, consider using docker containers for the project and add the installation commands to the Dockerfile.
Also, as https://stackoverflow.com/a/66473309/12416058 suggests, the package python3-opencv includes all of OpenCV's system dependencies, so installing it may prevent further errors.
To install python3-opencv:
apt-get update && apt-get install -y python3-opencv
pip install -r requirements.txt
To install in Dockerfile:
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install -r requirements.txt
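Alternatively (my own note, not part of the answer above): the headless OpenCV wheel ships without the GUI code and therefore does not need libGL at all:
# replace opencv-python with the GUI-free build in requirements.txt
pip install opencv-python-headless==4.5.5.64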
I have encountered a problem while trying to run my Django project in a new Docker container.
It is my first time using Docker and I can't seem to find a good way to run a Django project with it. Having tried multiple tutorials, I always get an error about psycopg2 not being installed.
requirements.txt:
-i https://pypi.org/simple
asgiref==3.2.7
django-cors-headers==3.3.0
django==3.0.7
djangorestframework==3.11.0
gunicorn==20.0.4
psycopg2-binary==2.8.5
pytz==2020.1
sqlparse==0.3.1
Dockerfile:
# pull official base image
FROM python:3.8.3-alpine
# set work directory
WORKDIR /usr/src/app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install dependencies
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
# copy project
COPY . .
# set project environment variables
# grab these via Python's os.environ
# these are 100% optional here
ENV PORT=8000
ENV SECRET_KEY_TWITTER="***"
While running docker-compose build, I get the following error:
Error: pg_config executable not found.
pg_config is required to build psycopg2 from source. Please add the directory
containing pg_config to the $PATH or specify the full executable path with the
option:
python setup.py build_ext --pg-config /path/to/pg_config build ...
or with the pg_config option in 'setup.cfg'.
If you prefer to avoid building psycopg2 from source, please install the PyPI
'psycopg2-binary' package instead.
I will gladly answer any questions that might lead to the solution.
Also, maybe someone can recommend a good tutorial on dockerizing Django apps?
I made it work. This is the code:
# python:3.9.5-slim and python:3.9.5-slim-buster also work as base images
FROM python:3.8.3-slim
RUN apt-get update \
&& apt-get -y install libpq-dev gcc \
&& pip install psycopg2
On Alpine Linux, you will need to compile all packages, even if a pre-compiled binary wheel is available on PyPI. On standard Linux-based images, you won't (https://pythonspeed.com/articles/alpine-docker-python/ - there are also other articles I've written there that might be helpful, e.g. on security).
So change your base image to python:3.8.3-slim-buster or python:3.8-slim-buster and it should work.
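A minimal sketch of that change, mirroring the question's Dockerfile: with a Debian slim base, psycopg2-binary installs as a pre-built wheel and no compiler is needed.
# only the base image changes; psycopg2-binary now installs from a binary wheel
FROM python:3.8-slim-buster
WORKDIR /usr/src/app
RUN pip install --upgrade pip
COPY ./requirements.txt .
RUN pip install -r requirements.txt
COPY . .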
These scripts work on a MacBook Air M1.
Dockerfile
FROM ubuntu:20.04
# ubuntu:20.04 ships without pip, so install python3-pip alongside the build deps
RUN apt-get update && apt-get -y install libpq-dev gcc python3-pip && pip3 install psycopg2
COPY requirements.txt /cs_account/
RUN pip3 install -r /cs_account/requirements.txt
requirements.txt
psycopg2-binary~=2.8.6
Updated answer, based on Zoltán Buzás' answer.
This worked for me. Try slim-buster image.
In your Dockerfile
FROM python:3.8.7-slim-buster
and in your requirements.txt file
psycopg2-binary~=<<version_number>>
I added this to the top answer because I was getting other errors like below:
gcc: error trying to exec 'cc1plus': execvp: No such file or directory
error: command 'gcc' failed with exit status 1
and
src/pyodbc.h:56:10: fatal error: sql.h: No such file or directory
#include <sql.h>
This is what I did to fix it; I am not sure how others were getting it to work without these packages, but maybe it was some of the other things I was doing.
My solution, found in other posts when googling those two errors:
FROM python:3.8.3-slim
RUN apt-get update \
&& apt-get -y install g++ libpq-dev gcc unixodbc unixodbc-dev
I've made a custom image with
FROM python:alpine
ADD requirements.txt /
RUN apk update --no-cache \
&& apk add build-base postgresql-dev libpq --no-cache --virtual .build-deps \
&& pip install --no-cache-dir --upgrade pip \
&& pip install --no-cache-dir -r /requirements.txt \
&& apk del .build-deps
RUN apk add postgresql-libs libpq --no-cache
and requirements.txt
django
djangorestframework
psycopg2-binary
I ran into a libxml dependency issue when creating a Docker container with Python, trying to install the dependency libraries from an Ubuntu image:
# pull official base image
FROM python:3.8.0-alpine
# set work directory
WORKDIR /usr/src/app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
FROM ubuntu:16.04
RUN apt-get update -y
RUN apt-get install g++ gcc libxml2 libxslt-dev -y
# install dependencies
FROM python:3.8.0-alpine
RUN pip install --upgrade pip
COPY requirements.txt .
RUN pip install -r requirements.txt
# copy project
COPY . /usr/src/app/
I get this compilation error:
Could not find function xmlCheckVersion in library libxml2. Is libxml2 installed?
You are installing those packages in the Ubuntu stage, but not in the Alpine one. In the builder pattern you would need to copy the files from the builder layer into the runtime layer. However, ubuntu != alpine, so compiled binaries will not work.
You will need to leverage the apk installer to add those packages to the alpine layer:
...
RUN apk update && apk add g++ gcc libxml2 libxslt-dev
RUN python -m pip install --upgrade pip
...
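Putting that together, a single-stage sketch (package names as given above; the work directory and environment variables are taken from the question):
FROM python:3.8.0-alpine
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install the compilers and libxml libraries on the Alpine layer itself
RUN apk update && apk add g++ gcc libxml2 libxslt-dev
RUN pip install --upgrade pip
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . /usr/src/app/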
I'm trying to use pyzbar 0.1.4 on a Flask server in Docker.
The image was created by us, based on Python 2.7 on Alpine.
I install ZBar with:
apk update
apk add zbar
I'm getting the following error when running the Dockerfile:
File "/usr/lib/python2.7/site-packages/pyzbar/pyzbar.py", line 8, in <module>
from .wrapper import (
File "/usr/lib/python2.7/site-packages/pyzbar/wrapper.py", line 166, in <module>
c_uint_p, # minor
File "/usr/lib/python2.7/site-packages/pyzbar/wrapper.py", line 159, in zbar_function
return prototype((fname, load_libzbar()))
File "/usr/lib/python2.7/site-packages/pyzbar/wrapper.py", line 135, in load_libzbar
raise ImportError('Unable to find zbar shared library')
ImportError: Unable to find zbar shared library
I'm trying to decode a QR image using that library.
Dockerfile
FROM buffetcontainerimages.azurecr.io/base/buffetcloud-python:0.1
RUN pip install --upgrade pip setuptools wheel
COPY wheeldir /opt/app/wheeldir
COPY *requirements.txt /opt/app/src/
RUN pip install --use-wheel --no-index --find-links=/opt/app/wheeldir \
-r /opt/app/src/requirements.txt
RUN pip install --use-wheel --no-index --find-links=/opt/app/wheeldir \
-r /opt/app/src/test-requirements.txt
COPY . /opt/app/src/
WORKDIR /opt/app/src
RUN python setup.py install
EXPOSE 5000
CMD dronedemo
And requirements.txt
requests>=2.18.4
flask>=0.12.2
mechanize>=0.3.6
regex>=2.4.136
PyPDF2>=1.26.0
bs4>=4.5.3
pyzbar>=0.1.4
openpyxl>=2.5.0
selenium>=3.9.0
matplotlib>=2.1.2
When I run pip install zbar:
pip install zbar
Collecting zbar
Downloading zbar-0.10.tar.bz2
...
zbarmodule.h:26:18: fatal error: zbar.h: No such file or directory
#include <zbar.h>
compilation terminated.
error: command 'gcc' failed with exit status 1
On Ubuntu, install zbar-tools:
sudo apt-get install zbar-tools
A simple test looks good:
FROM python:2.7
RUN apt-get update && \
apt-get install -y build-essential libzbar-dev && \
pip install zbar
I tried Alpine, but the zbar lib is only available in the edge branch; trying to get it to work was more trouble than it was worth.
P.S. Beware of images that are not in the Docker repo (I didn't know it was your own image).
Working example:
$ docker build -t yourimagenamehere .
Sending build context to Docker daemon 2.048kB
Step 1/2 : FROM python:2.7
---> 9e92c8430ba0
... trunc...
Successfully built d951cd32ea74
Successfully tagged yourimagenamehere:latest
$ docker run -it --rm yourimagenamehere
Python 2.7.14 (default, Dec 12 2017, 16:55:09)
[GCC 4.9.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import zbar
>>>
In an Ubuntu terminal, simply run this command to install zbar globally:
sudo apt-get install zbar-tools
I encountered the same issue (happy to have found this thread). Not sure if this has already been solved, but this might help you or future devs.
As usual, it worked locally on my machine, but I couldn't get it to work in a container.
What I tried initially:
Building an image based on a Python3 image
What solved the issue:
Build with FROM ubuntu:18.04
Within Ubuntu I was able to install the zbar shared library. According to https://pypi.org/project/pyzbar/ we need sudo apt-get install libzbar0
Set the LC_ALL & LANG ENV variables (not sure why; an additional error message asked for them)
Within requirements.txt downgrade Pillow==8.4.0 to Pillow==6.2.2
My Dockerfile:
FROM ubuntu:18.04
RUN apt-get update -y
# Gets the shared library for zbar
RUN apt-get install -y libzbar0
# Installs Python
RUN apt-get install -y python3-pip python3-dev build-essential
COPY . /app
WORKDIR /app
COPY requirements.txt .
RUN pip3 install -r requirements.txt
# Initially encountered an issue that indicated I had to set these ENVs
ENV LC_ALL C.UTF-8
ENV LANG C.UTF-8
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8080"]
And requirements.txt
fastapi==0.67.0
Pillow==6.2.2
pyzbar==0.1.8
urllib3==1.26.7
uvicorn==0.12.2
I agree with the second comment on your post, but you might want to try installing the pyzbar dependency through pip.
FROM buffetcontainerimages.azurecr.io/base/buffetcloud-python:0.1
RUN pip install --upgrade pip setuptools wheel pyzbar
COPY wheeldir /opt/app/wheeldir
COPY *requirements.txt /opt/app/src/
RUN pip install --use-wheel --no-index --find-links=/opt/app/wheeldir \
-r /opt/app/src/requirements.txt
RUN pip install --use-wheel --no-index --find-links=/opt/app/wheeldir \
-r /opt/app/src/test-requirements.txt
RUN pip install pyzbar
COPY . /opt/app/src/
WORKDIR /opt/app/src
RUN python setup.py install
EXPOSE 5000
CMD dronedemo