I am writing a Memgraph transformation in Python.
When I import modules such as "requests" or "networkx", the transformation works as expected.
I have Avro data with a schema registry, so I need to deserialize it. I followed the Memgraph example here: https://memgraph.com/docs/memgraph/2.3.0/import-data/kafka/avro#deserialization
When I save the transformation with those imports, I receive the error:
[Error] Unable to load module "/memgraph/internal_modules/test_try_plz.py";
Traceback (most recent call last): File "/memgraph/internal_modules/test_try_plz.py", line 4,
in <module> from confluent_kafka.schema_registry import SchemaRegistryClient ModuleNotFoundError:
No module named 'confluent_kafka' . For more details, visit https://memgr.ph/modules.
How can I update my transformation or Memgraph instance to include the confluent_kafka module?
The link provided in the error message did not provide any leads, at least not for me.
You cannot add Python dependencies to Memgraph in Memgraph Cloud (at least not on the free trial).
Instead, create your own Docker image and use that, e.g.:
FROM memgraph/memgraph:2.5.2
USER root
# Install Python
RUN apt-get update && apt-get install -y \
python3-pip \
python3-setuptools \
python3-dev \
&& pip3 install -U pip
# Install pip packages
COPY requirements.txt ./
RUN pip3 install -r requirements.txt
# Copy the local query modules
COPY transformations/ /usr/lib/memgraph/query_modules
COPY procedures/ /usr/lib/memgraph/query_modules
USER memgraph
And here is my requirements.txt; all of these packages are required for a transformation leveraging the Confluent schema-registry/Avro packages:
confluent_kafka==2.0.2
fastavro==1.7.1
requests==2.28.2
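For reference, here is a minimal sketch of what such a transformation can look like once confluent_kafka is available. The registry URL, topic handling, and Cypher query are illustrative assumptions, not part of the original setup:

import mgp
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Hypothetical registry URL; replace with your own.
_registry = SchemaRegistryClient({"url": "http://schema-registry:8081"})
_deserialize = AvroDeserializer(_registry)

@mgp.transformation
def avro_transform(messages: mgp.Messages
                   ) -> mgp.Record(query=str, parameters=mgp.Nullable[mgp.Map]):
    records = []
    for i in range(messages.total_messages()):
        msg = messages.message_at(i)
        # AvroDeserializer resolves the schema id embedded in the payload
        # against the registry and returns a plain dict.
        value = _deserialize(
            msg.payload(),
            SerializationContext(msg.topic_name(), MessageField.VALUE),
        )
        records.append(mgp.Record(
            query="CREATE (:Message {value: $value})",
            parameters={"value": str(value)},
        ))
    return records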
Related
In AWS EB I deployed an application (Django) and there were no errors in that process. Health is green and OK, but the page is giving an Internal Server Error, so I checked the logs and saw the error below.
... web: from .cv2 import
... web: ImportError: libGL.so.1: cannot open shared object file: No such file or directory
While installing requirements.txt during the deployment process, OpenCV must have been installed, because the file includes opencv-python==4.5.5.64.
So I'm not quite sure what the above error is pointing at.
And this is how I import it in helpers.py:
import requests
import cv2
libGL.so is installed with the package libgl1; pip3 install opencv-python is not sufficient here.
Connect to the AWS instance via SSH and run:
apt-get update && apt-get install libgl1
Or, even better, consider using Docker containers for the project and add the installation commands to the Dockerfile.
Also, as https://stackoverflow.com/a/66473309/12416058 suggests, the package python3-opencv includes all system dependencies of OpenCV, so installing it may prevent further errors.
To install python3-opencv:
apt-get update && apt-get install -y python3-opencv
pip install -r requirements.txt
To install it in a Dockerfile:
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install -r requirements.txt
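If you want to confirm the fix from inside the container, a small hypothetical check script (not part of the original answer) can verify that the shared library is resolvable before importing cv2:

import ctypes

try:
    ctypes.CDLL("libGL.so.1")  # provided by the libgl1 apt package
    print("libGL.so.1 found")
except OSError as exc:
    print(f"libGL.so.1 still missing: {exc}")

import cv2  # raises ImportError if libGL.so.1 (or another system dependency) is absent
print("OpenCV", cv2.__version__)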
I'm trying to run a Python project inside Docker, using the following Dockerfile, for machine learning purposes:
FROM python:3
RUN apt-get update \
&& apt-get install -yq --no-install-recommends \
python3 \
python3-pip
RUN pip3 install --upgrade pip==9.0.3 \
&& pip3 install setuptools
# for flask web server
EXPOSE 8081
# set working directory
ADD . /app
WORKDIR /app
# install required libraries
COPY requirements.txt ./
RUN pip3 install -r requirements.txt
# This is the runtime command for the container
CMD python3 app.py
And here is my requirements file:
flask
scikit-learn[alldeps]
pandas
textblob
numpy
matplotlib[alldeps]
But when I try to import textblob and pandas, I get a "No module named 'X'" error from my Docker CMD.
| warnings.warn(msg, category=FutureWarning)
| Traceback (most recent call last):
| File "app/app.py", line 12, in <module>
| from textblob import Textblob
| ImportError: No module named 'textblob'
exited with code 1
Folder structure
machinelearning:
backend:
app.py
Dockerfile
requirements.txt
frontend:
... (frontend works fine.)
docker-compose.yml
Does anyone know the solution to this problem?
(I'm fairly new to Docker, so I might just be missing something crucial.)
This worked for me. The main changes are setting the working directory first and installing requirements.txt before copying the rest of the source code into the image:
FROM python:3
RUN apt-get update
# for flask web server
EXPOSE 8081
# set working directory
WORKDIR /app
# install required libraries
COPY requirements.txt .
RUN pip install -r requirements.txt
# copy source code into working directory
COPY . /app
# This is the runtime command for the container
CMD python3 app.py
On Linux, whenever you have the message:
ImportError: No module named 'XYZ'
check whether you can install the module or its dependencies with apt-get. Here is an example that happens not to work for textblob, but that often helps with other modules:
# Python3:
sudo apt-get install python3-textblob
# Python2:
sudo apt-get install python-textblob
See Python error "ImportError: No module named" or How to solve cannot import name 'abort' from 'werkzeug.exceptions' error while importing Flask.
In the case of "textblob", this does not work for Python 2.7. I did not test it on Python 3, but it will likely not work there either; still, in such cases it is worth a try.
And there is no need to just guess; you can search through the apt cache with a regex:
$ apt-cache search "python.*blob"
libapache-directory-jdbm-java - ApacheDS JDBM Implementation
python-git-doc - Python library to interact with Git repositories - docs
python-swagger-spec-validator-doc - Validation of Swagger specifications (Documentation)
python3-azure-storage - Microsoft Azure Storage Library for Python 3.x
python3-bdsf - Python Blob Detection and Source Finder
python3-binwalk - Python3 library for analyzing binary blobs and executable code
python3-discogs-client - Python module to access the Discogs API
python3-git - Python library to interact with Git repositories - Python 3.x
python3-mnemonic - Implementation of Bitcoin BIP-0039 (Python 3)
python3-nosehtmloutput - plugin to produce test results in html - Python 3.x
python3-swagger-spec-validator - Validation of Swagger specifications (Python3 version)
python3-types-toml - Typing stubs for toml
python3-types-typed-ast - Typing stubs for typed-ast
This is how to check whether there are any Python packages for "textblob" out there; as the output above shows, there are none.
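As a complement (my own generic snippet, not from the original answer), you can also check from Python itself whether a module is importable before your code hits the ImportError:

import importlib.util

def module_available(name: str) -> bool:
    # True if the module can be found on this interpreter's search path.
    return importlib.util.find_spec(name) is not None

print(module_available("textblob"))  # False until textblob is installed via pip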
I am trying to create a Docker image with Ubuntu 16.04 as the base. I want to install a few Python packages like pandas, Flask, etc. I have kept all the packages in "requirements.txt". But when I try to build the image, I am getting:
Could not find a version that satisfies the requirement requests (from -r requirements.txt (line 1)) (from versions: )
No matching distribution found for requests (from -r requirements.txt (line 1))
Basically, I have not mentioned any versions in "requirements.txt"; I assumed it would take the latest available and compatible version of each package. But I am getting the same issue for every package.
My Dockerfile is as follows:
FROM ubuntu:16.04
RUN apt-get update -y && \
apt-get install -y python3-pip python3-dev build-essential cmake pkg-config libx11-dev libatlas-base-dev
# We copy just the requirements.txt first to leverage Docker cache
COPY ./requirements.txt /testing/requirements.txt
WORKDIR /testing
RUN pip3 install -r requirements.txt
and requirements.txt is:
pandas
requests
PyMySQL
Flask
Flask-Cors
Pillow
face-recognition
Flask-SocketIO
Where am I going wrong? Can anybody help?
I too ran into the same situation. I observed that pip was looking for the network from within Docker; it behaved as if it were running standalone, without a network, and so could not locate the packages. In these situations, either
No matching distribution found
or sometimes
Retrying ...
errors may occur.
I used the --network option in the docker build command, as below, to overcome this error; it makes the build use the host network to download the required packages.
docker build --network=host -t tracker:latest .
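If you want to check connectivity during the build (a hypothetical diagnostic, not part of the original answer), a short Python probe run in a RUN step will show whether PyPI is reachable from the build network:

import urllib.request

# Prints 200 when pip's package index is reachable from the build network.
print(urllib.request.urlopen("https://pypi.org/simple/", timeout=10).status)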
Try using this:
RUN python3.6 -m pip install --upgrade pip \
&& python3.6 -m pip install -r requirements.txt
By using it this way, you are specifying the version of Python whose environment the packages should be installed into.
Change it to python3.7 if you wish to use version 3.7.
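To see which interpreter and site-packages directory a given invocation targets (a quick generic check, not from the original answer), you can run:

import sys
import sysconfig

# The interpreter running this script, and the directory that
# `python -m pip` installs pure-Python packages into for it.
print(sys.executable)
print(sysconfig.get_paths()["purelib"])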
I suggest using the official Python image instead. Your Dockerfile then becomes:
FROM python:3
WORKDIR /testing
COPY ./requirements.txt /testing/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
... etc ...
Now, regarding Angular/Node: you have two options from here. 1) Install Angular/Node on the Python image; or 2) use Docker's multi-stage build feature to build the Angular-specific and Python-specific stages separately before merging them. Option 2 is recommended, but it takes some work. It would probably look like this:
FROM node:8 as node
# Angular-specific build
FROM python:3 as python
# Python-specific build
# Then copy your data from the Angular image to the Python one:
COPY --from=node /usr/src/app/dist/angular-docker /usr/src/app
There is a Python project in which I have dependencies defined in a "requirement.txt" file. One of the dependencies is gmpy2. When I run the docker build -t myimage . command, it gives me the following error at the step where setup.py install is executed:
In file included from src/gmpy2.c:426:0:
src/gmpy.h:252:20: fatal error: mpfr.h: No such file or directory
include "mpfr.h"
similarly other two errors are:
In file included from appscript_3x/ext/ae.c:14:0:
appscript_3x/ext/ae.h:26:27: fatal error: Carbon/Carbon.h: No such file
or directory
#include <Carbon/Carbon.h>
In file included from src/buffer.cpp:12:0:
src/pyodbc.h:56:17: fatal error: sql.h: No such file or directory
#include <sql.h>
Now the question is how I can define or install these internal dependencies required for a successful build of the image. As per my understanding, gmpy2 is written in C and depends on three other C libraries (GMP, MPFR, and MPC), and the build is unable to find them.
Following is my Dockerfile:
FROM python:3
COPY . .
RUN pip install -r requirement.txt
CMD [ "python", "./mike/main.py" ]
Install the extra dependencies with apt install libgmp-dev libmpfr-dev libmpc-dev, and then RUN pip install -r requirement.txt.
I think it will work, and you will be able to install all the dependencies and build the Docker image:
FROM python:3
COPY . .
RUN apt-get update -qq && \
apt-get install -y --no-install-recommends \
libmpc-dev \
libgmp-dev \
libmpfr-dev
RUN pip install -r requirement.txt
CMD [ "python", "./mike/main.py" ]
If apt does not run in your base image, you can use a full Linux distribution as the base image instead.
You will need to modify your Dockerfile to install the additional C libraries using apt-get install. (The default Python 3 image is based on a Debian image).
sudo apt-get install libgmp3-dev
sudo apt-get install libmpfr-dev
It looks like you can install the dependencies for pyodbc using
sudo apt-get install unixodbc-dev
However, I'm really unsure about the requirement for Carbon.h as that's an OS X specific header file. You may have an OS X specific dependency in your requirements file that won't work on a Linux based image.
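Once the image builds, a quick smoke test (my own illustrative snippet, assuming gmpy2 installed cleanly) confirms that the C libraries are linked:

import gmpy2

# A successful import already means GMP/MPFR/MPC were found at build time.
print(gmpy2.version())
print(gmpy2.sqrt(gmpy2.mpfr(2)))  # exercises MPFR under the hood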
I am trying to export my computer vision API, which is functioning correctly under macOS, to an Azure Function.
I tried to use the docker approach:
func azure functionapp publish --build-native-deps
but I keep getting the error:
can't import cv2 and imutils
log file
and
Exception: ImportError: libgthread-2.0.so.0: cannot open shared object file: No such file or directory
Here is the requirements.txt:
requirements.txt
How do I solve this problem? Or must I switch to AWS Lambda?
I have access to Kudu if that's helpful.
Thanks in advance!
The Azure team has updated the default function image to include libglib2.0-dev.
You will need to install the headless version of OpenCV through pip instead of the default:
https://pypi.org/project/opencv-python-headless/
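To sanity-check that the headless build works without GUI system libraries (an illustrative snippet, not from the original answer):

import cv2
import numpy as np

# opencv-python-headless omits the GUI bindings, so no libGL or libgthread
# is needed; image processing still works.
img = np.zeros((4, 4, 3), dtype=np.uint8)
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
print(cv2.__version__, gray.shape)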
I think the issue is the lack of the necessary library libgthread. To fix it, you need to add it to your Dockerfile and build your own image for your function deployment.
On Azure, please follow the section "Build the image from the Docker file" of the official document "Create a function on Linux using a custom image" to add the code below to the azure-functions/python:2.0 Dockerfile.
RUN apt-get update && \
apt-get install -y libglib2.0-dev
But this will add a new Docker image layer, so alternatively you can add libglib2.0-dev to azure-functions/base:2.0, like below:
# Line 19
RUN apt-get update && \
apt-get install -y gnupg wget unzip libglib2.0-dev && \
wget https://functionscdn.azureedge.net/public/ExtensionBundles/Microsoft.Azure.Functions.ExtensionBundle/1.0.0/Microsoft.Azure.Functions.ExtensionBundle.1.0.0.zip && \
mkdir -p /FuncExtensionBundles/Microsoft.Azure.Functions.ExtensionBundle/1.0.0 && \
unzip /Microsoft.Azure.Functions.ExtensionBundle.1.0.0.zip -d /FuncExtensionBundles/Microsoft.Azure.Functions.ExtensionBundle/1.0.0 && \
rm -f /Microsoft.Azure.Functions.ExtensionBundle.1.0.0.zip
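As with the libGL case earlier, you can confirm the library is resolvable from Python (a hypothetical check mirroring that snippet; libglib2.0-dev pulls in the glib runtime that provides libgthread-2.0.so.0):

import ctypes

# Raises OSError until the glib runtime providing libgthread-2.0.so.0
# is installed in the image.
ctypes.CDLL("libgthread-2.0.so.0")
print("libgthread-2.0.so.0 found")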