docker not installing requirements on docker - python

FROM ubuntu:latest
RUN apt-get update
RUN apt-get install
RUN apt install python3.9 -y
RUN apt-get install -y git
RUN apt-get -y install python3-pip
RUN git clone https://ACCESS_TOKEN@github.com/username/repo
WORKDIR ./appy/
RUN pip install -r requirements.txt
CMD ["python3.9", "main.py"]
Hey, for some reason I'm getting:
Traceback (most recent call last):
File "/appy/main.py", line 7, in <module>
import disnake
ModuleNotFoundError: No module named 'disnake'
requirements.txt
disnake=2.4.0
psutil=5.8.0
motor=2.5.1
aiohttp=3.7.4.post0
It appears that the packages in requirements.txt are not being installed properly. Any suggestions as to what could be causing this? When building the container there don't seem to be any errors, and I can visually see during the build that all packages are installed, including disnake.
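One likely cause, offered only as a guess since the build log isn't shown: on Ubuntu, the pip installed by python3-pip targets the default python3 interpreter, while CMD runs the separately installed python3.9, and WORKDIR ./appy/ only matches if the repository really clones into a directory named appy. Note also that pip version pins use ==, not a single =. A minimal sketch of the relevant lines, assuming the repository clones into /repo with main.py and requirements.txt at its root, and that python3.9 can import the pip module that python3-pip provides:
RUN git clone https://ACCESS_TOKEN@github.com/username/repo /repo
WORKDIR /repo
# install into the same interpreter that CMD will run
RUN python3.9 -m pip install -r requirements.txt
CMD ["python3.9", "main.py"]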

Related

Why can't my Gitlab pipeline find python packages installed in Dockerfile?

My file structure is as follows:
Dockerfile
.gitlab-ci.yml
Here is my Dockerfile:
FROM python:3
RUN apt-get update && apt-get install make
RUN apt-get install -y python3-pip
RUN pip3 install --upgrade pip
RUN pip3 install pygdbmi
RUN pip3 install pyyaml
RUN pip3 install Path
And here is my .gitlab-ci.yml file:
test-job:
  stage: test
  image: runners:test-harness
  script:
    - cd test-harness
    # - pip3 install pygdbmi
    # - pip3 install pyyaml
    - python3 main.py
  artifacts:
    untracked: false
    when: on_success
    expire_in: "30 days"
    paths:
      - test-harness/script.log
For some reason the pip3 install in the Dockerfile doesn't seem to be working, as I get the error:
python3 main.py
Traceback (most recent call last):
File "/builds/username/test-harness/main.py", line 6, in <module>
from pygdbmi.gdbcontroller import GdbController
ModuleNotFoundError: No module named 'pygdbmi'
When I uncomment the two commented lines in .gitlab-ci.yml:
# - pip3 install pygdbmi
# - pip3 install pyyaml
It works fine, but ideally I want those two packages to be installed in the Dockerfile, not in the .gitlab-ci.yml pipeline stage.
I've tried changing the WORKDIR as well as USER and it doesn't seem to have any effect.
Any ideas/solutions?
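One common cause, offered as a guess rather than a confirmed diagnosis: the runners:test-harness image that the job pulls has to actually be built from this Dockerfile and pushed to the registry the runner uses; if the tag referenced by image: still points at an older build (or at a different image entirely), the pip3 install layers never reach the job. A hedged sketch of the publish step, with a hypothetical registry path:
docker build -t registry.example.com/my-group/runners:test-harness .
docker push registry.example.com/my-group/runners:test-harness
After pushing, image: would need to reference that same tag (including the registry prefix) so the runner pulls the rebuilt image.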

How can we use opencv in a multistage docker image?

I recently learned about the concept of building Docker images based on a multi-stage Dockerfile.
I have been trying simple examples of multi-stage Dockerfiles, and they were working fine. However, when I tried implementing the concept for my own application, I ran into some issues.
My application is about object detection in videos, so I use Python and TensorFlow.
Here is my Dockerfile:
FROM python:3-slim AS base
WORKDIR /objectDetector
COPY detect_objects.py .
COPY detector.py .
COPY requirements.txt .
ADD data /objectDetector/data/
ADD models /objectDetector/models/
RUN apt-get update && \
    apt-get install protobuf-compiler -y && \
    apt-get install ffmpeg libsm6 libxext6 -y && \
    apt-get install gcc -y
RUN pip3 install update && python3 -m pip install --upgrade pip
RUN pip3 install tensorflow-cpu==2.9.1
RUN pip3 install opencv-python==4.6.0.66
RUN pip3 install opencv-contrib-python
WORKDIR /objectDetector/models/research
RUN protoc object_detection/protos/*.proto --python_out=.
RUN cp object_detection/packages/tf2/setup.py .
RUN python -m pip install .
RUN python object_detection/builders/model_builder_tf2_test.py
WORKDIR /objectDetector/models/research
RUN pip3 install wheel && pip3 wheel . --wheel-dir=./wheels
FROM python:3-slim
RUN pip3 install update && python3 -m pip install --upgrade pip
COPY --from=base /objectDetector /objectDetector
WORKDIR /objectDetector
RUN pip3 install --no-index --find-links=/objectDetector/models/research/wheels -r requirements.txt
When I try to run my application in the final stage of the container, I receive the following error:
root@3f062f9a5d64:/objectDetector# python detect_objects.py
Traceback (most recent call last):
File "/objectDetector/detect_objects.py", line 3, in <module>
import cv2
ModuleNotFoundError: No module named 'cv2'
So per my understanding, it seems that opencv-python is not successfully carried over from the first stage to the second.
I have been searching around and found some good blogs and questions tackling the issue of multi-stage Dockerfiles, specifically for Python libraries. However, it seems I am missing something here.
Here are some references that I have been following to solve the issue:
How do I reduce a python (docker) image size using a multi-stage build?
Multi-stage build usage for cuda,cudnn,opencv and ffmpeg #806
So my question is: How can we use opencv in a multistage docker image?
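One hedged way to get cv2 into the second stage, mirroring the wheel approach the Dockerfile already uses: build wheels for the third-party requirements (not only the object_detection package) in the first stage, and make sure opencv-python is listed in requirements.txt, since the final pip3 install --no-index only installs what that file names. A minimal sketch, assuming requirements.txt lists opencv-python:
# stage 1: also wheel the third-party requirements into the same wheel directory
RUN pip3 wheel -r /objectDetector/requirements.txt --wheel-dir=/objectDetector/models/research/wheels
# stage 2: unchanged, but now finds an opencv-python wheel in that directory
RUN pip3 install --no-index --find-links=/objectDetector/models/research/wheels -r requirements.txt
An alternative is to copy the first stage's site-packages directory (e.g. /usr/local/lib/python3.11/site-packages, depending on the python:3-slim version) into the second stage instead of reinstalling.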

Error using matplotlib in docker - ImportError: Cannot load backend 'TkAgg'

This is my error log:
% docker run --env=DISPLAY -t my-docker1:latest
Traceback (most recent call last):
File "/app/main.py", line 8, in <module>
matplotlib.use('TkAgg')
File "/usr/local/lib/python3.9/site-packages/matplotlib/__init__.py", line 1144, in use
plt.switch_backend(name)
File "/usr/local/lib/python3.9/site-packages/matplotlib/pyplot.py", line 296, in switch_backend
raise ImportError(
ImportError: Cannot load backend 'TkAgg' which requires the 'tk' interactive framework, as 'headless' is currently running
I'm also sharing my code and scripts so the issue can be reproduced.
main.py
import matplotlib
import matplotlib.pyplot as plt
import os
print(os.getenv('DISPLAY'))
import numpy as np
matplotlib.use('TkAgg')
plt.switch_backend('tkagg')
plt.imshow(np.random.randn(4, 5))
requirements.txt
cycler==0.11.0
fonttools==4.34.4
kiwisolver==1.4.3
lxml==4.9.1
matplotlib==3.5.2
numpy==1.23.1
packaging==21.3
pyparsing==3.0.9
six==1.16.0
soupsieve==2.3.2.post1
webencodings==0.5.1
lxml==4.9.1
Dockerfile
FROM python:3.9.13-slim-buster
WORKDIR /app
ADD main.py .
ADD requirements.txt .
RUN apt-get update
RUN apt-get install ffmpeg libsm6 libxext6 -y
RUN apt-get install python3-tk -y
RUN apt-get install libx11-dev -y
RUN apt-get install tk -y
RUN apt-get install python3-matplotlib -y
# RUN yum install python3-tkinter -y
# RUN pip3 install matplotlib
RUN pip3 install -r requirements.txt
RUN pip3 --no-cache-dir install -U --force-reinstall matplotlib
RUN export DISPLAY=:0
RUN export MPLBACKEND=TKAgg
RUN ls
ENTRYPOINT [ "python", "main.py" ]
I've read tons of similar questions (like this one) and tried them all, as you can see in my attempt, but none worked, unfortunately.
Any inputs, or even suggestions for alternate backends (I tried PyQt5, but that too didn't work) or alternate libraries similar to matplotlib, would be great. Thanks!
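Two hedged observations rather than a confirmed fix. First, RUN export DISPLAY=:0 only sets the variable for that single RUN shell and is discarded when the layer finishes; ENV is what persists into the running container. Second, TkAgg can only load when an X server is actually reachable at $DISPLAY (for example by mounting /tmp/.X11-unix and allowing the container with xhost on the host); otherwise matplotlib reports the 'headless' error above. The relevant Dockerfile lines would look like:
# ENV persists into the image, unlike RUN export
ENV DISPLAY=:0
ENV MPLBACKEND=TkAgg
If no display can be provided, a headless route is to drop matplotlib.use('TkAgg') and render to a file instead, e.g. matplotlib.use('Agg') followed by plt.savefig('plot.png').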

aws eb opencv-python "web: from .cv2 import"

In AWS EB I deployed a Django application, and there were no errors during that process. Health is green and OK, but the page is giving an Internal Server Error, so I checked the logs and saw the error below.
... web: from .cv2 import
... web: ImportError: libGL.so.1: cannot open shared object file: No such file or directory
opencv must be installed while requirements.txt is installed during the deployment process, because it includes opencv-python==4.5.5.64, so I'm not quite sure what the above error is pointing at.
And this is how I import it in helpers.py:
import requests
import cv2
libGL.so is installed by the system package libgl1; pip3 install opencv-python is not sufficient here.
Connect to the AWS instance via SSH and run:
apt-get update && apt-get install libgl1
Or even better, consider using Docker containers for the project and adding the installation commands to the Dockerfile.
Also, as https://stackoverflow.com/a/66473309/12416058 suggests, the package python3-opencv includes all system dependencies of OpenCV, so installing it may prevent further errors.
To install python3-opencv:
apt-get update && apt-get install -y python3-opencv
pip install -r requirements.txt
To install in Dockerfile:
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install -r requirements.txt
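If the application never calls OpenCV's GUI functions, another option (a suggestion, not part of the original answer) is to switch the pin to the matching headless release, which does not depend on libGL at runtime:
# requirements.txt: the headless build avoids the libGL.so.1 dependency
opencv-python-headless==4.5.5.64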

can't install pip in ubuntu 18.04 docker: /bin/sh: 1: pip: not found

I am getting the following error when using pip in my Docker image.
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y \
    software-properties-common
RUN add-apt-repository universe
RUN apt-get install -y \
    python3.6 \
    python3-pip
ENV PYTHONUNBUFFERED 1
RUN mkdir /api
WORKDIR /api
COPY . /api/
RUN pip install pipenv
RUN ls
RUN pipenv sync
I installed Python 3.6 and pip3 but am getting:
Step 9/11 : RUN pip install pipenv
---> Running in b184de4eb28e
/bin/sh: 1: pip: not found
To run pip for python3 use pip3, not pip.
Another solution:
You can add this line (after the apt-get install step). It will upgrade pip to the version you need, for instance:
RUN pip3 install --upgrade pip==20.0.1
and you can then use pip install with a requirements file (for instance):
RUN pip install -r requirements.txt
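A minimal sketch of the relevant Dockerfile lines under that approach, assuming the only goal is to call the Python 3 pip that python3-pip provides (exposed as pip3, or via python3 -m pip):
# use the Python 3 pip explicitly; plain "pip" is not on PATH in this image
RUN python3 -m pip install --upgrade pip
RUN python3 -m pip install pipenv
RUN pipenv sync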
