I am developing an Azure Function app that will use a custom Docker image.
This is my Dockerfile:
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.7
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
AzureFunctionsJobHost__Logging__Console__IsEnabled=true
# Adding "apt-get install make" here
RUN apt-get update && apt-get install -y make gcc g++ && rm -rf /var/lib/apt/lists/*
RUN pip install numpy
RUN pip install pandas
RUN pip install scipy
# Download, build, and install the TA-Lib C library that the Python TA-Lib package links against
RUN wget http://prdownloads.sourceforge.net/ta-lib/ta-lib-0.4.0-src.tar.gz && \
tar -xvzf ta-lib-0.4.0-src.tar.gz && \
cd ta-lib/ && \
./configure --prefix=/usr && \
make && \
make install
RUN rm -R ta-lib ta-lib-0.4.0-src.tar.gz
COPY requirements.txt /
RUN pip install -r /requirements.txt
COPY . /home/site/wwwroot
Everything works fine when I build and run it locally with Docker Desktop.
When I try to deploy to Azure using the VS Code interface, the build fails. Here is the output I get:
14:57:08 : building 'talib._ta_lib' extension
14:57:08 : creating build/temp.linux-x86_64-3.7
14:57:08 : creating build/temp.linux-x86_64-3.7/talib
14:57:08 : gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/usr/include -I/usr/local/include -I/opt/include -I/opt/local/include -I/opt/homebrew/include -I/opt/homebrew/opt/ta-lib/include -I/tmp/pip-install-rgqyohyf/ta-lib/.eggs/numpy-1.21.2-py3.7-linux-x86_64.egg/numpy/core/include -I/opt/python/3.7.9/include/python3.7m -c talib/_ta_lib.c -o build/temp.linux-x86_64-3.7/talib/_ta_lib.o
14:57:08 : talib/_ta_lib.c:613:28: fatal error: ta-lib/ta_defs.h: No such file or directory
14:57:08 : #include "ta-lib/ta_defs.h"
14:57:08 : ^
14:57:08 : compilation terminated.
14:57:09 : error: command 'gcc' failed with exit status 1
14:57:09 : ----------------------------------------
14:57:09 : ERROR: Failed building wheel for TA-Lib
14:57:09 : DEPRECATION: Could not build wheels for TA-Lib which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
14:57:16 : [11:57:16+0000] Running setup.py install for TA-Lib: started
14:57:16 : [11:57:16+0000] Running setup.py install for TA-Lib: finished with status 'error'
14:57:16 : ERROR: Command errored out with exit status 1:
I am not sure what to make of this error, or why it works fine in Docker locally.
Thanks
Amit
I am using https://hub.docker.com/r/tiangolo/meinheld-gunicorn-flask/ to host a Flask API.
To deploy, I use a Docker image which I host on Azure Web App for Containers. I am trying to connect my Flask application to Azure SQL Server, but I have trouble installing the correct ODBC drivers and pyodbc.
My Dockerfile:
FROM ubuntu:16.04
# apt-get and system utilities
RUN apt-get update && apt-get install -y \
curl apt-utils apt-transport-https debconf-utils gcc build-essential g++-5 \
&& rm -rf /var/lib/apt/lists/*
# adding custom MS repository
RUN curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
RUN curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
# install SQL Server drivers
RUN apt-get update && ACCEPT_EULA=Y apt-get -y install msodbcsql17
RUN apt-get -y install unixodbc unixodbc-dev
# install necessary locales
RUN apt-get update && apt-get install -y locales \
&& echo "en_US.UTF-8 UTF-8" > /etc/locale.gen \
&& locale-gen
FROM tiangolo/meinheld-gunicorn:python3.7
ENV LISTEN_PORT=80
EXPOSE 80
COPY /app /app
# Uncomment to install additional requirements from a requirements.txt file
COPY requirements.txt /
RUN pip install --no-cache-dir -U pip
RUN pip install --no-cache-dir -r /requirements.txt
Getting the error:
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DPYODBC_VERSION=4.0.25 -I/usr/local/include/python3.7m -c src/buffer.cpp -o build/temp.linux-x86_64-3.7/src/buffer.o -Wno-write-strings
2019-01-28T23:58:42.5006898Z In file included from src/buffer.cpp:12:0:
2019-01-28T23:58:42.5006979Z src/pyodbc.h:56:17: fatal error: sql.h: No such file or directory
2019-01-28T23:58:42.5007040Z #include <sql.h>
2019-01-28T23:58:42.5007105Z ^
2019-01-28T23:58:42.5007155Z compilation terminated.
2019-01-28T23:58:42.5007377Z error: command 'gcc' failed with exit status 1
I think it's because unixodbc is not correctly installed?
The second FROM line probably means that the later instructions run against a new image that doesn't have the libraries needed to compile pyodbc (a single-stage sketch follows below the link).
FROM tiangolo/meinheld-gunicorn:python3.7
ENV LISTEN_PORT=80
EXPOSE 80
For more information, see https://blog.alexellis.io/mutli-stage-docker-builds/
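A minimal sketch of a fix, assuming a single stage is enough: install the ODBC driver and the unixODBC headers directly on top of the tiangolo image, so that sql.h exists in the same image where pip compiles pyodbc. The debian/9 prod.list path is an assumption based on the python:3.7 base; pick the one matching the OS of your actual base image:
FROM tiangolo/meinheld-gunicorn:python3.7
# build tools plus unixodbc-dev, which provides the sql.h header that pyodbc compiles against
RUN apt-get update && apt-get install -y curl gnupg apt-transport-https gcc g++ unixodbc unixodbc-dev \
    && curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - \
    && curl https://packages.microsoft.com/config/debian/9/prod.list > /etc/apt/sources.list.d/mssql-release.list \
    && apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql17 \
    && rm -rf /var/lib/apt/lists/*
ENV LISTEN_PORT=80
EXPOSE 80
COPY requirements.txt /
RUN pip install --no-cache-dir -U pip && pip install --no-cache-dir -r /requirements.txt
COPY /app /app
With everything in one stage, nothing that pyodbc needs at compile time is thrown away by a later FROM.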
My Makefile:
SHELL := /bin/bash
.PHONY: all
all:
pip install runcython
makecython++ stitch_wrapper.pyx "" "stitch_rects.cpp ./hungarian/hungarian.cpp"
hungarian: hungarian/hungarian.so
hungarian/hungarian.so:
cd hungarian && \
TF_INC=$$(python -c 'import tensorflow as tf; print(tf.sysconfig.get_include())') && \
if [ `uname` == Darwin ];\
then g++ -std=c++11 -shared hungarian.cc -o hungarian.so -fPIC -I $$TF_INC -undefined dynamic_lookup;\
else g++ -std=c++11 -shared hungarian.cc -o hungarian.so -fPIC -I $$TF_INC; fi
I have already installed:
- cython
- runcython
- python-dev
- python3-dev
- cffi
Unfortunately I continue to get the error:
pkg-config: command not found
.cpp:4:20: fatal error: Python.h: No such file or directory
compilation terminated.
Makefile:5: recipe for target 'all' failed
make: *** [all] Error 1
If you already have
sudo apt-get install python-dev
sudo apt-get install python3-dev
sudo apt-get install libpython3-dev
sudo apt-get install libpython3.4-dev
sudo apt-get install libpython3.5-dev
the problem may be related to one of these packages being missing (a combined install command follows the list):
sudo apt-get install cython
sudo apt-get install pkgconf
sudo apt-get install libpkgconf
sudo apt-get install python-pkgconfig
sudo apt-get install python3-pkgconfig
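On Debian/Ubuntu, the headers and pkg-config from the list above can be pulled in together; a sketch (package names may vary slightly by release):
sudo apt-get update
sudo apt-get install -y python-dev python3-dev pkg-config cython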
Solved. Wow, these guys are fast... It's basically this: https://github.com/pyca/cryptography/issues/2750. It turned out that a security update for OpenSSL was released (the DROWN attack), and that update contained an unexpected function signature change which caused the incompatibility, so this was just bad luck for me.
I need to use pip install cryptography in a Docker container running Alpine Linux. Actually, it's another module, service_identity, but the problem resides in the cryptography module, which is a dependency.
I have the following Dockerfile
FROM alpine:3.3
RUN apk --update add build-base libffi-dev openssl-dev python-dev py-pip
RUN pip install cryptography
which fails with the following error:
generating cffi module 'build/temp.linux-x86_64-2.7/_openssl.c'
building '_openssl' extension
creating build/temp.linux-x86_64-2.7/build
creating build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7
gcc -fno-strict-aliasing -Os -fomit-frame-pointer -DNDEBUG -Os -fomit-frame-pointer -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_openssl.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o
build/temp.linux-x86_64-2.7/_openssl.c:726:6: error: conflicting types for 'BIO_new_mem_buf'
BIO *BIO_new_mem_buf(void *, int);
^
In file included from /usr/include/openssl/asn1.h:65:0,
from build/temp.linux-x86_64-2.7/_openssl.c:434:
/usr/include/openssl/bio.h:692:6: note: previous declaration of 'BIO_new_mem_buf' was here
BIO *BIO_new_mem_buf(const void *buf, int len);
^
error: command 'gcc' failed with exit status 1
OpenSSL 1.0.2g was released on 2016-03-01 (yesterday), and the Alpine package has already been updated to that version. Could this be related?
How can I resolve this issue? Maybe there are some environment variables I can set?
Update: I've been checking the OpenSSL repo on GitHub, and in fact BIO *BIO_new_mem_buf(void *buf, int len) in openssl/bio.h was changed to BIO *BIO_new_mem_buf(const void *buf, int len) during the 1.0.2f to 1.0.2g transition (search for "BIO_new_mem_buf" in https://github.com/openssl/openssl/compare/OpenSSL_1_0_2f...OpenSSL_1_0_2g). I don't know where this openssl/asn1.h is coming from, which includes an outdated version of openssl/bio.h, as it does not look like the one in the OpenSSL repo. Any ideas?
Ok, I see some are already working on this:
https://github.com/pyca/cryptography/issues/2750
For those who are still experiencing problems installing cryptography==2.1.4 on Alpine 3.7, like this:
writing manifest file 'src/cryptography.egg-info/SOURCES.txt'
running build_ext
generating cffi module 'build/temp.linux-x86_64-2.7/_padding.c'
creating build/temp.linux-x86_64-2.7
generating cffi module 'build/temp.linux-x86_64-2.7/_constant_time.c'
generating cffi module 'build/temp.linux-x86_64-2.7/_openssl.c'
building '_openssl' extension
creating build/temp.linux-x86_64-2.7/build
creating build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7
gcc -fno-strict-aliasing -Os -fomit-frame-pointer -g -DNDEBUG -Os -fomit-frame-pointer -g -DTHREAD_STACK_SIZE=0x100000 -fPIC -I/usr/include/python2.7 -c build/temp.linux-x86_64-2.7/_openssl.c -o build/temp.linux-x86_64-2.7/build/temp.linux-x86_64-2.7/_openssl.o -Wconversion -Wno-error=sign-conversion
build/temp.linux-x86_64-2.7/_openssl.c:493:30: fatal error: openssl/opensslv.h: No such file or directory
#include <openssl/opensslv.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1
Solution
Install these dependencies in the Alpine container:
$ apk add --no-cache libressl-dev musl-dev libffi-dev
To install these dependencies using a Dockerfile:
RUN apk add --no-cache \
libressl-dev \
musl-dev \
libffi-dev && \
pip install --no-cache-dir cryptography==2.1.4 && \
apk del \
libressl-dev \
musl-dev \
libffi-dev
Reference
Installation instructions for cryptography on Alpine can be found here:
https://cryptography.io/en/latest/installation/#building-cryptography-on-linux
A version from the time of writing is available on GitHub.
Here is the relevant portion:
Building cryptography on Linux
[skipping over the part for non-Alpine Linux] …
$ pip install cryptography
If you are on Alpine or just want to compile it yourself then
cryptography requires a compiler, headers for Python (if you're not
using pypy), and headers for the OpenSSL and libffi libraries
available on your system.
Alpine
Replace python3-dev with python-dev if you're using Python 2.
$ sudo apk add gcc musl-dev python3-dev libffi-dev openssl-dev
If you get an error with openssl-dev you may have to use libressl-dev.
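In other words, the same command with the LibreSSL headers swapped in (a sketch):
$ sudo apk add gcc musl-dev python3-dev libffi-dev libressl-dev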
If it fails because of the Rust version, then the following is recommended in cryptography's docs:
The Rust available by default in Alpine < 3.12 is older than the
minimum supported version. See the Rust installation instructions
for information about installing a newer Rust.
$ sudo apk add gcc musl-dev python3-dev libffi-dev openssl-dev cargo
In my case (python3.8-alpine), adding cargo resolved it.
Add this before install:
RUN apk -U upgrade
RUN apk add --no-cache libffi-dev openssl-dev
Alternatively use build-base:
RUN apk add --no-cache --upgrade --virtual .build-deps build-base
Details here: https://git.alpinelinux.org/aports/tree/main/build-base/APKBUILD?h=3.3-stable
Check whether you are building for the right architecture!
x86-64 (amd64) is one family and runs the same software across machines of that family; the other category is aarch64 (arm), used by chips such as the Apple Silicon M1 or your mobile phone's CPU.
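A quick sketch for checking and pinning the architecture with standard Docker flags (the image tag is just a placeholder):
# see which architecture this machine reports
uname -m                  # x86_64 vs aarch64
# build explicitly for amd64, e.g. when on an Apple Silicon M1 host
docker build --platform linux/amd64 -t myimage .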
I'm trying to install Flask-SocketIO using pip:
pip install flask-socketIO
But in the end I'm facing this error:
cc -O2 -fPIC -Wimplicit -DLIBEV_EMBED=1 -DEV_COMMON= -DEV_CLEANUP_ENABLE=0 -DEV_EMBED_ENABLE=0 -DEV_PERIODIC_ENABLE=0 -Ibuild/temp.linux-x86_64-2.7/libev -Ilibev -I/home/abhishek/.virtualenvs/test/include -c gevent/gevent.core.c -o build/temp.linux-x86_64-2.7/gevent/gevent.core.o
gevent/gevent.core.c:9:22: fatal error: pyconfig.h: No such file or directory
#include "pyconfig.h"
^
compilation terminated.
error: command 'cc' failed with exit status 1
----------------------------------------
Command "/home/abhishek/.virtualenvs/test/bin/pypy -c "import setuptools,
Looks like you need to have the Python development package (headers) installed.
On CentOS / RHEL:
yum install python-devel
yum install python-lxml
On Ubuntu / Debian:
sudo apt-get install python-dev libxml2-dev libxslt-dev
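Since the path in the error above points at a PyPy virtualenv, it may be the PyPy headers (rather than the CPython ones) that are missing; on Ubuntu/Debian that would be something along the lines of (package name is an assumption, check your distribution):
sudo apt-get install pypy-dev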