Can't install nmslib in docker - python

I can't install nmslib using pip in docker.
This is my Dockerfile:
FROM continuumio/anaconda3:4.4.0
MAINTAINER UNP, https://unp.education
EXPOSE 8000
RUN apt-get update && apt-get install -y apache2 \
apache2-dev \
emacs \
&& apt-get clean \
&& apt-get autoremove \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /var/www/devise-api/
COPY ./devise-api.wsgi /var/www/devise-api/devise-api.wsgi
COPY ./devise-api /var/www/devise-api/
RUN pip install -r requirements.txt
RUN /opt/conda/bin/mod_wsgi-express install-module
RUN mod_wsgi-express setup-server devise-api.wsgi --port=8000 \
--user www-data --group www-data \
--server-root=/etc/mod_wsgi-express-80
CMD /etc/mod_wsgi-express-80/apachectl start -D FOREGROUND
For now, requirements.txt contains only the line nmslib.
This is the output I get when running sudo docker build -t devise-api .:
Sending build context to Docker daemon 306.2MB
Step 1/11 : FROM continuumio/anaconda3:4.4.0
---> 795ad88c47ff
Step 2/11 : MAINTAINER UNP, https://unp.education
---> Using cache
---> cd5b1f7e6188
Step 3/11 : EXPOSE 8000
---> Using cache
---> 21ad868f0823
Step 4/11 : RUN apt-get update && apt-get install -y apache2 apache2-dev emacs && apt-get clean && apt-get autoremove && rm -rf /var/lib/apt/lists/*
---> Using cache
---> ebfc7c30b394
Step 5/11 : WORKDIR /var/www/devise-api/
---> Using cache
---> 8228e4b4d4fd
Step 6/11 : COPY ./devise-api.wsgi /var/www/devise-api/devise-api.wsgi
---> Using cache
---> c8cf5cfcf7dc
Step 7/11 : COPY ./devise-api /var/www/devise-api/
---> Using cache
---> 856e67f0b1de
Step 8/11 : RUN pip install -r requirements.txt
---> Running in 7260901af476
Collecting nmslib (from -r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/e1/95/1f7c90d682b79398c5ee3f9296be8d2640fa41de24226bcf5473c801ada6/nmslib-1.7.3.6.tar.gz (255kB)
Collecting pybind11>=2.0 (from nmslib->-r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/f2/7c/e71995e59e108799800cb0fce6c4b4927914d7eada0723dd20bae3b51786/pybind11-2.2.4-py2.py3-none-any.whl (145kB)
Requirement already satisfied: numpy in /opt/conda/lib/python3.6/site-packages (from nmslib->-r requirements.txt (line 1))
Building wheels for collected packages: nmslib
Running setup.py bdist_wheel for nmslib: started
Running setup.py bdist_wheel for nmslib: finished with status 'error'
Complete output from command /opt/conda/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-4qfop8hr/nmslib/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpl9tnakclpip-wheel- --python-tag cp36:
running bdist_wheel
running build
running build_ext
creating tmp
gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/opt/conda/include/python3.6m -c /tmp/tmp2pnwck3x.cpp -o tmp/tmp2pnwck3x.o -std=c++14
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/opt/conda/include/python3.6m -c /tmp/tmp5ktoxd0l.cpp -o tmp/tmp5ktoxd0l.o -fvisibility=hidden
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
building 'nmslib' extension
creating build
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/nmslib
creating build/temp.linux-x86_64-3.6/nmslib/similarity_search
creating build/temp.linux-x86_64-3.6/nmslib/similarity_search/src
creating build/temp.linux-x86_64-3.6/nmslib/similarity_search/src/space
creating build/temp.linux-x86_64-3.6/nmslib/similarity_search/src/method
gcc -pthread -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I./nmslib/similarity_search/include -I/opt/conda/include/python3.6m -I/root/.local/include/python3.6m -I/opt/conda/lib/python3.6/site-packages/numpy/core/include -I/opt/conda/include/python3.6m -c nmslib.cc -o build/temp.linux-x86_64-3.6/nmslib.o -O3 -march=native -fopenmp -DVERSION_INFO="1.7.3.6" -std=c++14 -fvisibility=hidden
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
nmslib.cc:16:31: fatal error: pybind11/pybind11.h: No such file or directory
#include <pybind11/pybind11.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
Failed building wheel for nmslib
Running setup.py clean for nmslib
Failed to build nmslib
Installing collected packages: pybind11, nmslib
Running setup.py install for nmslib: started
Then it stays like this forever.
Does anyone know what I could try to fix this?
Thank you a lot in advance!
Best regards
F

I just faced the same issue and fixed it by adding a RUN step before installing nmslib:
RUN pip install pip==9.0.3 pybind11
See: https://github.com/nmslib/nmslib/issues/307#issuecomment-384113900. Hope this helps in your case as well :-)
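For reference, applied to the Dockerfile above this just means adding the extra RUN line right before the requirements install. A minimal sketch (the pip pin is the one from the linked issue; a newer pip may also work):
# pybind11 must be importable before nmslib's setup.py builds its extension
RUN pip install pip==9.0.3 pybind11
RUN pip install -r requirements.txt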

Related

Error when installing Odoo on Mac with Python 3.10

I'm following this tutorial to install Odoo 15 on Mac with Python 3.10, but I get this error when running pip3 install -r requirements.txt:
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py", line 369, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '(cd "/private/var/folders/ht/sqtnbdnx7r5562trfyn3827w0000gn/T/pip-install-qiz3m1g1/gevent_22fc5c60d97046e4bea11df299f9facf/deps/c-ares" && if [ -r include/ares_build.h ]; then cp include/ares_build.h include/ares_build.h.orig; fi && sh ./configure --disable-dependency-tracking -C CFLAGS="-Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -arch arm64 -arch x86_64 -g" && cp src/lib/ares_config.h include/ares_build.h "$OLDPWD" && cat include/ares_build.h && if [ -r include/ares_build.h.orig ]; then mv include/ares_build.h.orig include/ares_build.h; fi) > configure-output.txt' returned non-zero exit status 77.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for gevent
Failed to build gevent
ERROR: Could not build wheels for gevent, which is required to install pyproject.toml-based projects
I haven't found any documentation addressing the error I'm getting.
You need to install these pyproject.toml-based projects and be careful about which Python you are using, 2 or 3.
pip install pyproject projects
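As a quick sanity check before re-running the install (assuming the python.org 3.10 build), it can help to confirm which interpreter pip3 actually belongs to and make sure the build tooling is current:
# confirm pip3 points at the Python 3.10 you expect
python3 --version
pip3 --version
# current pip/setuptools/wheel can sometimes pick up a prebuilt gevent wheel instead of compiling it
pip3 install --upgrade pip setuptools wheel
pip3 install -r requirements.txt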

Deploying an Azure Function using a custom image

I am developing an Azure Function app that will use a custom Docker image.
This is my Dockerfile:
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.7
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
AzureFunctionsJobHost__Logging__Console__IsEnabled=true
# Adding "apt-get install make" here
RUN apt-get update && apt-get install make && apt-get install -y gcc g++ && rm -rf /var/lib/apt/lists/*
RUN pip install numpy
RUN pip install pandas
RUN pip install scipy
RUN wget http://prdownloads.sourceforge.net/ta-lib/ta-lib-0.4.0-src.tar.gz && \
tar -xvzf ta-lib-0.4.0-src.tar.gz && \
cd ta-lib/ && \
./configure --prefix=/usr && \
make && \
make install
RUN rm -R ta-lib ta-lib-0.4.0-src.tar.gz
COPY requirements.txt /
RUN pip install -r /requirements.txt
COPY . /home/site/wwwroot
Everything works fine when I use my local Docker Desktop app.
When I try to deploy to Azure using the VS Code build interface, the Dockerfile fails to build. Here is the output I get:
14:57:08 : building 'talib._ta_lib' extension
14:57:08 : creating build/temp.linux-x86_64-3.7
14:57:08 : creating build/temp.linux-x86_64-3.7/talib
14:57:08 : gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/usr/include -I/usr/local/include -I/opt/include -I/opt/local/include -I/opt/homebrew/include -I/opt/homebrew/opt/ta-lib/include -I/tmp/pip-install-rgqyohyf/ta-lib/.eggs/numpy-1.21.2-py3.7-linux-x86_64.egg/numpy/core/include -I/opt/python/3.7.9/include/python3.7m -c talib/_ta_lib.c -o build/temp.linux-x86_64-3.7/talib/_ta_lib.o
14:57:08 : talib/_ta_lib.c:613:28: fatal error: ta-lib/ta_defs.h: No such file or directory
14:57:08 : #include "ta-lib/ta_defs.h"
14:57:08 : ^
14:57:08 : compilation terminated.
14:57:09 : error: command 'gcc' failed with exit status 1
14:57:09 : ----------------------------------------
14:57:09 : ERROR: Failed building wheel for TA-Lib
14:57:09 : DEPRECATION: Could not build wheels for TA-Lib which do not use PEP 517. pip will fall back to legacy 'setup.py install' for these. pip 21.0 will remove support for this functionality. A possible replacement is to fix the wheel build issue reported above. You can find discussion regarding this at https://github.com/pypa/pip/issues/8368.
14:57:16 : [11:57:16+0000] Running setup.py install for TA-Lib: started
14:57:16 : [11:57:16+0000] Running setup.py install for TA-Lib: finished with status 'error'
14:57:16 : ERROR: Command errored out with exit status 1:
I am not sure what to make of this error, or why it works fine in local Docker.
Thanks
Amit

Python: pandas install errors in a container

I want to install pandas on a Docker image containing Python. I used the command below to run a container:
docker run -p 8888:8888 -v /home/DATA/Project_NY/:/home/jovyan/work/Project_NY jupyter/scipy-notebook
I created a new notebook and then tried to install my requirements file with pip install -r "requirements.txt". I got the error below, yet when I tried pip install pandas inside that running container it worked perfectly:
requirements.txt content
SQLAlchemy==1.2.2
pandas==0.25.0
docker==3.3.0
python-json-logger
sshtunnel==0.1.4
jupyter
jupytext==0.8.4
matplotlib
seaborn
psycopg2-binary
The error is:
building 'pandas._libs.algos' extension
creating build/temp.linux-x86_64-3.9
creating build/temp.linux-x86_64-3.9/pandas
creating build/temp.linux-x86_64-3.9/pandas/_libs
gcc -pthread -B /opt/conda/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /opt/conda/include -fPIC -O2 -isystem /opt/conda/include -fPIC -DNPY_NO_DEPRECATED_API=0 -I./pandas/_libs -Ipandas/_libs/src/klib -Ipandas/_libs/src -I/opt/conda/lib/python3.9/site-packages/numpy/core/include -I/opt/conda/include/python3.9 -c pandas/_libs/algos.c -o build/temp.linux-x86_64-3.9/pandas/_libs/algos.o -Wno-unused-function
error: command 'gcc' failed: No such file or directory
----------------------------------------
ERROR: Failed building wheel for pandas
It seems the pandas version you are trying to install needs to be built from source for the system you're using as the Docker container.
You should get the same error if you run pip install pandas==0.25.0 inside the container.
Either use a different version of pandas or install gcc in the container (e.g., for Alpine, inside the Dockerfile: RUN apk add --no-cache --virtual .build-deps gcc).
EDIT: I think the 'jupyter/scipy-notebook' image uses conda, so maybe try:
docker run -p 8888:8888 -v /home/DATA/Project_NY/:/home/jovyan/work/Project_NY jupyter/scipy-notebook conda install gcc
The conda install gcc at the end is executed inside the container.
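If you'd rather bake this into an image than run it ad hoc, a minimal Dockerfile sketch on top of jupyter/scipy-notebook might look like the following (it assumes the image is Debian-based with apt available, and uses the stack's usual NB_UID variable to drop back to the notebook user):
FROM jupyter/scipy-notebook
# need root to install the compiler toolchain
USER root
RUN apt-get update && apt-get install -y --no-install-recommends gcc g++ \
    && rm -rf /var/lib/apt/lists/*
# switch back to the unprivileged notebook user for the pip install
USER ${NB_UID}
COPY requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt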

Failed to install PyKaldi on Ubuntu 18.04

I followed the instructions and ran the following commands to install PyKaldi:
git clone https://github.com/pykaldi/pykaldi.git
cd pykaldi
sudo apt-get install autoconf automake cmake curl g++ git graphviz libatlas3-base libtool make pkg-config subversion unzip wget zlib1g-dev
sudo apt install intel-mkl-64bit-2020.4-912
python3.7 -m pip install --upgrade pip setuptools
python3.7 -m pip install numpy pyparsing ninja==1.10.0
cd tools
sudo torify ./check_dependencies.sh /usr/bin/python3.7
sudo torify ./install_protobuf.sh /usr/bin/python3.7
sudo torify ./install_clif.sh /usr/bin/python3.7
sudo torify ./install_kaldi.sh
cd ..
python3.7 -m pip install setuptools
sudo apt-get install ninja-build
sudo python3.7 setup.py install
When it comes to the last line, I get the following error:
running install
running bdist_egg
running egg_info
writing pykaldi.egg-info/PKG-INFO
writing dependency_links to pykaldi.egg-info/dependency_links.txt
writing requirements to pykaldi.egg-info/requires.txt
writing top-level names to pykaldi.egg-info/top_level.txt
reading manifest file 'pykaldi.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'pykaldi.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
Using PYCLIF: /usr/local/bin/pyclif
Using CLIF_MATCHER: /usr/clang/bin/clif-matcher
-- Configuring done
-- Generating done
-- Build files have been written to: /home/soroushh/KhodnevisProjects/wav2vec-u/pykaldi/build
[6/505] Building CXX object kaldi/matrix/CMakeFiles/_matrix_ext.dir/matrix-ext.cc.o
FAILED: kaldi/matrix/CMakeFiles/_matrix_ext.dir/matrix-ext.cc.o
/usr/bin/c++ -D_matrix_ext_EXPORTS -I../kaldi/lib -I../kaldi -Ikaldi -I../tools/kaldi/src -I/usr/include/python2.7 -I/home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include -std=c++11 -I.. -isystem /home/soroushh/KhodnevisProjects/wav2vec-u/pykaldi/tools/kaldi/tools/openfst-1.6.7/include -O1 -Wall -Wno-sign-compare -Wno-unused-local-typedefs -Wno-deprecated-declarations -Winit-self -DKALDI_DOUBLEPRECISION=0 -DHAVE_EXECINFO_H=1 -DHAVE_CXXABI_H -DHAVE_MKL -I/opt/intel/mkl/include -m64 -msse -msse2 -pthread -g -fPIC -Wno-maybe-uninitialized -fPIC -MD -MT kaldi/matrix/CMakeFiles/_matrix_ext.dir/matrix-ext.cc.o -MF kaldi/matrix/CMakeFiles/_matrix_ext.dir/matrix-ext.cc.o.d -o kaldi/matrix/CMakeFiles/_matrix_ext.dir/matrix-ext.cc.o -c ../kaldi/matrix/matrix-ext.cc
In file included from /home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:4:0,
from /home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:12,
from /home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/arrayobject.h:4,
from ../kaldi/matrix/matrix-ext.cc:8:
/home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/npy_common.h:386:9: error: ‘Py_hash_t’ does not name a type
typedef Py_hash_t npy_hash_t;
^~~~~~~~~
In file included from /home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:12:0,
from /home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/arrayobject.h:4,
from ../kaldi/matrix/matrix-ext.cc:8:
/home/soroushh/.local/lib/python3.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:655:9: error: ‘npy_hash_t’ does not name a type; did you mean ‘npy_half’?
npy_hash_t hash;
^~~~~~~~~~
npy_half
[11/505] Building CXX object kaldi/chain/CMakeFiles/_chain_generic_numerator.dir/chain-generic-numerator-clifwrap.cc.o
ninja: build stopped: subcommand failed.
Command '['ninja', '-j', '6']' returned non-zero exit status 1.
I also tried different versions of PyKaldi, i.e. 1.8.0 and 1.10.0, but they show the same error.
Edit 1: When I execute /usr/bin/ninja -j 6 I get the following error: ninja: error: loading 'build.ninja': No such file or directory
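On the Edit 1 error: ninja looks for build.ninja in the current working directory, so it has to be run from the build directory that CMake generated (the log above says the build files were written to .../pykaldi/build), e.g.:
cd build        # the directory containing build.ninja
ninja -j 6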

Installing Scrapy on Ubuntu 14.04 fails

I'm getting an error installing Scrapy on my Ubuntu box. I'm using pip to install Scrapy. I'm aware that it needs setuptools to be installed; I installed that using the script provided on the setuptools website.
reading manifest file 'Twisted.egg-info/SOURCES.txt'
writing manifest file 'Twisted.egg-info/SOURCES.txt'
creating build/lib.linux-x86_64-2.7/twisted/internet/iocpreactor/iocpsupport
copying twisted/internet/iocpreactor/iocpsupport/iocpsupport.c -> build/lib.linux-x86_64-2.7/twisted/internet/iocpreactor/iocpsupport
copying twisted/internet/iocpreactor/iocpsupport/winsock_pointers.c -> build/lib.linux-x86_64-2.7/twisted/internet/iocpreactor/iocpsupport
copying twisted/test/raiser.c -> build/lib.linux-x86_64-2.7/twisted/test
copying twisted/runner/portmap.c -> build/lib.linux-x86_64-2.7/twisted/runner
copying twisted/python/sendmsg.c -> build/lib.linux-x86_64-2.7/twisted/python
running build_ext
x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c conftest.c -o conftest.o
building 'twisted.runner.portmap' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/twisted
creating build/temp.linux-x86_64-2.7/twisted/runner
x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c twisted/runner/portmap.c -o build/temp.linux-x86_64-2.7/twisted/runner/portmap.o
twisted/runner/portmap.c:10:20: fatal error: Python.h: No such file or directory
#include <Python.h>
^
compilation terminated.
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
----------------------------------------
Cleaning up...
Command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/Twisted/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-4QNuNV-record/install-record.txt --single-version-externally-managed --compile failed with error code 1 in /tmp/pip_build_root/Twisted
Storing debug log for failure in /root/.pip/pip.log
Any idea where the process goes wrong? I've got gcc and g++ installed already.
From the error fatal error: Python.h: No such file or directory, it looks like the Python development headers are not installed. Try this command and then try the install again.
sudo apt-get install python-dev
To install the libevent library, run this command:
sudo apt-get install libevent-dev
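Put together, the sequence before retrying the Scrapy install would look roughly like:
# headers needed to compile Twisted's C extensions, plus libevent
sudo apt-get install python-dev libevent-dev
sudo pip install scrapy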
I figured this out on Ubuntu 14.04 by (A) actually following the Scrapy docs for installing on Ubuntu, then (B) fixing an error that a lot of others were getting by following an S/O solution: Error while starting new scrapy project
(A) http://doc.scrapy.org/en/1.0/topics/ubuntu.html#topics-ubuntu
then
(B)
sudo pip install pyasn1 --upgrade
Right after installing Fedora 23, this is what I did (it may also work on Ubuntu):
[root@x ~]# pip install scrapy
[root@x ~]# dnf install python-cffi
[root@x ~]# dnf install openssl-devel
[root@x ~]# dnf install gcc
[root@x ~]# dnf install redhat-rpm-config
[root@x ~]# dnf install libxml
[root@x ~]# dnf install libxml2-devel
[root@x ~]# dnf install libxml-devel
[root@x ~]# dnf install glib2-devel gnet2-devel
[root@x ~]# dnf install libxslt-devel
sudo apt install python3-scrapy
I have tried this and it worked for me.
(You should have ssh installed.)
