Python script cannot find installed libraries inside container?

I use the following Dockerfile, which builds fine into an image:
FROM python:3.8-slim-buster
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONPATH="$PYTHONPATH:/app"
WORKDIR /app
COPY ./app .
RUN apt-get update
RUN apt-get install -y python3-pymssql python3-requests
CMD ["python3", "main.py"]
main.py:
from pymssql import connect, InterfaceError, OperationalError
try:
    conn = connect(host, username, password, database)
except (InterfaceError, OperationalError) as e:
    raise
conn.close()
For some reason, the libraries installed by python3-pymssql are not present, even though I see them being installed during the docker build. When I do a docker run, I get the following error:
Traceback (most recent call last):
  File "main.py", line 1, in <module>
    from pymssql import connect, InterfaceError, OperationalError
ModuleNotFoundError: No module named 'pymssql'
I presume I could use pip3 install instead, but I would prefer to take advantage of the pre-built apt packages. Can you please let me know what I am missing?
Thank you for your help.

You have two copies of Python installed, 3.8 and 3.7:
root@367f37546ae7:/# python3 --version
Python 3.8.12
root@367f37546ae7:/# python3.7 --version
Python 3.7.3
The image you're basing this on, python:3.8-slim-buster, is based on Debian Buster, whose packages target Python 3.7. So if you install an apt package, apt pulls in Python 3.7 and installs the library for Python 3.7. Then you launch Python 3.8, which lacks the pymssql dependency.
You can:
Use a non-Python image, and install Python using apt. This way, you're guaranteed that the version of Python installed will be compatible with the Python dependencies from apt. For example, you could base your image on FROM debian:buster (see the sketch below).
Use a Python image, and install your dependencies with pip.
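For illustration, here is a rough, untested sketch of the first option; it reuses the layout from the question and assumes the Buster packages python3, python3-pymssql and python3-requests:
FROM debian:buster
ENV PYTHONDONTWRITEBYTECODE=1
WORKDIR /app
# apt's python3 (3.7 on Buster) is the only interpreter in this image,
# so the apt-installed libraries match the interpreter that runs main.py.
RUN apt-get update \
 && apt-get install -y --no-install-recommends python3 python3-pymssql python3-requests \
 && rm -rf /var/lib/apt/lists/*
COPY ./app .
CMD ["python3", "main.py"]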

You can install pymssql with pip only. The following Dockerfile builds and runs fine:
FROM python:3.8-slim-buster
ENV PYTHONDONTWRITEBYTECODE 1
RUN pip install pymssql
WORKDIR /app
COPY ./app .
CMD ["python3", "main.py"]

Related

How to run an application based on TensorFlow 2 in Docker container?

I am relatively new to TensorFlow, so I have been trying to run simple applications locally, and everything was going well.
At some point I wanted to Dockerize my application. Building the Docker image produced no errors; however, when I tried to run my application, I received the following error:
AttributeError: module 'tensorflow' has no attribute 'gfile'. Did you mean: 'fill'?
After googling about the problem, I understood that it is caused by version differences between TF1 and TF2.
One explanation of the problem that I found is here.
Locally, I am using TF2 (specifically 2.9.1), inside a virtual environment.
When dockerizing, I also confirmed from inside the docker container that my TF version is the same.
I also tried to run the container in interactive mode, create a virtual environment, and install all dependencies manually, exactly the same way I did locally, but still with no success.
My Dockerfile is as follows:
FROM python:3-slim
# ENV VIRTUAL_ENV=/opt/venv
# RUN python3 -m venv $VIRTUAL_ENV
# ENV PATH="$VIRTUAL_ENV/bin:$PATH"
WORKDIR /objectDetector
RUN apt-get update
RUN apt-get install -y protobuf-compiler
RUN apt-get install ffmpeg libsm6 libxext6 -y
RUN pip3 install update && python3 -m pip install --upgrade pip
RUN pip3 install tensorflow==2.9.1
RUN pip3 install tensorflow-object-detection-api
RUN pip3 install opencv-python
RUN pip3 install opencv-contrib-python
COPY detect_objects.py .
COPY detector.py .
COPY helloWorld.py .
ADD data data /objectDetector/data/
ADD models /objectDetector/models/
So my question is: How can I run an application using TensorFlow 2 from a Docker container?
Am I missing something here?
Thanks in advance for any help or explanation.
I believe that in TensorFlow 2.0, tf.gfile was replaced by tf.io.gfile.
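As an illustration, this is what the rename looks like in code (a minimal sketch; labels.txt is just a placeholder file name):
import tensorflow as tf

# TF1-style code that fails under TF2 with
# "module 'tensorflow' has no attribute 'gfile'":
#   with tf.gfile.GFile("labels.txt", "r") as f:
#       labels = f.read()

# TF2 equivalent: the same file API now lives under tf.io.gfile
with tf.io.gfile.GFile("labels.txt", "r") as f:
    labels = f.read()

# Legacy code can also be kept running via the compatibility module:
#   with tf.compat.v1.gfile.GFile("labels.txt", "r") as f:
#       labels = f.read()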
Can you try this?
Have a nice day,
Gabriel

Install Python.Net in docker images (linux containers)

I'm attempting to create a simple Python script that calls a .Net function through Python.Net (http://pythonnet.github.io/).
I'm trying to configure a Docker image containing Python, .Net and, of course, Python.Net, possibly all at their latest releases.
I have tried many approaches, starting from several base images (Ubuntu, Debian, Alpine, ...).
I managed to install Python (I tried several flavours and versions) and .Net (or Mono), but every time I get stuck when installing Python.Net.
I get errors like those discussed in Pip Pythonnet option --single-version-externally-managed not recognized or in Error Installing Python.net on Python 3.7 (suggested solutions in those posts didn't work for me).
As an example, here's a (failing!) dockerfile:
FROM python:slim-buster
# Install .Net
RUN apt update
RUN apt install wget -y
RUN wget https://dot.net/v1/dotnet-install.sh
RUN chmod 777 ./dotnet-install.sh
RUN ./dotnet-install.sh -c 5.0
#install Python.Net
RUN pip install pythonnet
Another attempt is:
FROM continuumio/anaconda
# Install .Net
RUN apt update
RUN apt install wget -y
RUN wget https://dot.net/v1/dotnet-install.sh
RUN chmod 777 ./dotnet-install.sh
RUN ./dotnet-install.sh -c 5.0
#install Python.Net
# See: https://stackoverflow.com/a/61948638/1288109
RUN conda create -n myenv python=3.7
SHELL ["conda", "run", "-n", "myenv", "/bin/bash", "-c"]
RUN conda install -c conda-forge pythonnet
ENV PATH=/root/.dotnet/:${PATH}
The Dockerfile above builds correctly, but when I run the container with /bin/bash and issue:
conda activate myenv
python myscript.py
I get an error like:
Traceback (most recent call last):
  File "factorial.py", line 1, in <module>
    import clr
ImportError: System.TypeInitializationException: The type initializer for 'Sys' threw an exception. ---> System.DllNotFoundException: /opt/conda/envs/myenv/lib/../lib/libmono-native.so assembly:<unknown assembly> type:<unknown type> member:(null)
(solutions found around the Internet, like https://github.com/pythonnet/pythonnet/issues/1034#issuecomment-626373989 didn't work, neither did attempts to use different versions of Python and .Net)
Note that the same error also occurs when replacing .Net with Mono.
How can a docker image be built, that can run a simple Python script that makes use of Python.Net?

Python project can't find modules inside of Docker

I'm trying to run a python project inside of docker using the following Dockerfile for machine learning purposes:
FROM python:3
RUN apt-get update \
&& apt-get install -yq --no-install-recommends \
python3 \
python3-pip
RUN pip3 install --upgrade pip==9.0.3 \
&& pip3 install setuptools
# for flask web server
EXPOSE 8081
# set working directory
ADD . /app
WORKDIR /app
# install required libraries
COPY requirements.txt ./
RUN pip3 install -r requirements.txt
# This is the runtime command for the container
CMD python3 app.py
And here is my requirements file:
flask
scikit-learn[alldeps]
pandas
textblob
numpy
matplotlib[alldeps]
But when I try to import textblob and pandas, I get a "No module named 'X'" error in my Docker CMD.
| warnings.warn(msg, category=FutureWarning)
| Traceback (most recent call last):
| File "app/app.py", line 12, in <module>
| from textblob import Textblob
| ImportError: No module named 'textblob'
exited with code 1
Folder structure
machinelearning:
    backend:
        app.py
        Dockerfile
        requirements.txt
    frontend:
        ... (frontend works fine.)
    docker-compose.yml
Does anyone know the solution to this problem?
(I'm fairly new to Docker, so I might just be missing something crucial.)
This worked for me
FROM python:3
RUN apt-get update
RUN apt-get install -y --no-install-recommends
# for flask web server
EXPOSE 8081
# set working directory
WORKDIR /app
# install required libraries
COPY requirements.txt .
RUN pip install -r requirements.txt
# copy source code into working directory
COPY . /app
# This is the runtime command for the container
CMD python3 app.py
On Linux, whenever you have the message:
ImportError: No module named 'XYZ'
check whether you can install the module or its dependencies with apt-get. Here is an example; it does not work for textblob, but the approach may help with other modules:
(The following does not work here; it is just an example of what often helps.)
# Python3:
sudo apt-get install python3-textblob
# Python2:
sudo apt-get install python-textblob
See Python error "ImportError: No module named" or How to solve cannot import name 'abort' from 'werkzeug.exceptions' error while importing Flask.
In the case of "textblob", this does not work for Python 2.7; I did not test it on Python 3, but it will likely not work there either. Still, in such cases it is worth a try.
And you do not need to just guess: you can search the apt cache with a regex to check whether a suitable package exists. For example:
$ apt-cache search "python.*blob"
libapache-directory-jdbm-java - ApacheDS JDBM Implementation
python-git-doc - Python library to interact with Git repositories - docs
python-swagger-spec-validator-doc - Validation of Swagger specifications (Documentation)
python3-azure-storage - Microsoft Azure Storage Library for Python 3.x
python3-bdsf - Python Blob Detection and Source Finder
python3-binwalk - Python3 library for analyzing binary blobs and executable code
python3-discogs-client - Python module to access the Discogs API
python3-git - Python library to interact with Git repositories - Python 3.x
python3-mnemonic - Implementation of Bitcoin BIP-0039 (Python 3)
python3-nosehtmloutput - plugin to produce test results in html - Python 3.x
python3-swagger-spec-validator - Validation of Swagger specifications (Python3 version)
python3-types-toml - Typing stubs for toml
python3-types-typed-ast - Typing stubs for typed-ast
None of these provide "textblob", which confirms that there is no apt package for it.

Always installing vscode plugin in docker container doesn't work

I am using VS Code with a Docker container. I have the following entry in my user settings.json:
"remote.containers.defaultExtensions": [
"ms-python.python",
"ms-azuretools.vscode-docker",
"ryanluker.vscode-coverage-gutters"
]
But when I build or rebuild the container, these plugins don't get installed automatically inside the container.
Am I doing something wrong ?
Modified
Here is what my Dockerfile looks like:
FROM ubuntu:bionic
RUN apt-get update
RUN apt-get install -y python3.6 python3-pip
RUN apt-get install -y git libgl1-mesa-dev
# Currently not using requirements.txt to improve caching
#COPY requirements.txt /home/projects/my_project/
#WORKDIR /home/projects/my_project/
#RUN pip3 install -r requirements.txt
RUN pip3 install torch pandas PyYAML==5.1.2 autowrap Cython==0.29.14
RUN pip3 install numpy==1.17.3 open3d-python==0.7.0.0 pytest==5.2.4 pptk
RUN pip3 install scipy==1.3.1 natsort matplotlib lxml opencv-python==3.2.0.8
RUN pip3 install Pillow scikit-learn testfixtures
RUN pip3 install pip-licenses pylint pytest-cov
RUN pip3 install autopep8
COPY . /home/projects/my_project/
This might be an old question, but to whom it may concern, here is one solution. I encountered this problem too: the Python extension in particular would not install itself inside my Docker container in VS Code. To get it to install the Python extension (and, in my case, everything else), you have to pin the extension version, like:
"extensions": [
"ms-azuretools.vscode-docker",
"ms-python.python#2020.9.114305",
"ms-python.vscode-pylance"
]
If you want to see this in action you can clone my repository. Simply open the repo in VS Code, install the Remote - Containers extension, and it should start the Docker container all by itself.
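For context, that "extensions" list belongs in the folder's .devcontainer/devcontainer.json, not in the user settings.json. A minimal sketch, assuming the Dockerfile from the question sits next to this file (the "name" value is just a placeholder):
{
    "name": "my_project",
    "build": {
        "dockerfile": "Dockerfile"
    },
    "extensions": [
        "ms-azuretools.vscode-docker",
        "ms-python.python#2020.9.114305",
        "ms-python.vscode-pylance"
    ]
}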

How to run SSL_library_init() from Python 3.7 docker image

Up until recently I've been using the openssl library within the python:3.6.6-jessie Docker image and things worked as intended.
I'm using very basic Dockerfile configuration to install all necessary dependencies:
FROM python:3.6.6-jessie
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /code
RUN apt-get -qq update
RUN apt-get install openssl
RUN apt-get upgrade -y openssl
ADD requirements.txt /code/
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
And I access and initialize the library itself with these lines:
from ctypes import cdll
openssl = cdll.LoadLibrary("libssl.so")
openssl.SSL_library_init()
Things were working great with this approach.
This week I was upgrading Python and the libraries, and as a result I switched to a newer Docker image:
FROM python:3.7.5
...
This immediately caused openssl to stop working because of this exception:
AttributeError: /usr/lib/x86_64-linux-gnu/libssl.so.1.1: undefined symbol: SSL_library_init
From this error I understand that libssl no longer provides the SSL_library_init method (or so it seems), which is rather weird because the initializer name in the openssl documentation is the same.
I also tried to resolve this using -stretch and -buster distributions but the issue remains.
What is the correct approach to run SSL_library_init in those newer distributions? Maybe some additional dockerfile configuration is required?
I think you need to install libssl1.0-dev
RUN apt-get install -y libssl1.0-dev
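For background: in OpenSSL 1.1.0 and later the SSL_library_init symbol was removed in favour of OPENSSL_init_ssl, and explicit initialization became optional. So, as an alternative to pinning the older libssl, a rough, untested sketch that calls the new entry point could look like this (it assumes the image ships libssl.so.1.1):
from ctypes import cdll, c_uint64

# OpenSSL >= 1.1.0 no longer exports SSL_library_init; OPENSSL_init_ssl(opts, settings)
# is its replacement, and calling it explicitly is optional because the library
# now initializes itself on first use.
openssl = cdll.LoadLibrary("libssl.so.1.1")
openssl.OPENSSL_init_ssl(c_uint64(0), None)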
