Unable to install PIP & PYMSSQL in docker - python

I am trying to build a Docker image with:
Python
PIP
pymssql
Below is my Dockerfile:
FROM ubuntu:latest
RUN apt-get update -y
RUN apt-get update && apt-get install -y telnet && apt-get install -y ksh && apt-get install -y python2.7.x && apt-get update && apt-get install -y python-pip python-dev build-essential && apt-get -y clean && rm -rf /var/lib/apt/lists/*
RUN pip install pymssql==2.1.3
ENTRYPOINT ['python']
It is throwing the following error:

If you don't mind using Python 3 (otherwise, use a virtual env), I recommend you do:
RUN apt-get install -y python3-pip
Then you will be able to install whatever pip package you need by doing:
RUN pip3 install <your_pip_pkg>
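Applied to the Dockerfile in the question, that might look like the sketch below. Two assumptions on my part: the base is pinned to ubuntu:22.04 instead of latest, because newer Ubuntu releases refuse system-wide pip installs unless you pass extra flags, and the pymssql==2.1.3 pin is dropped because that release predates current Python versions.
FROM ubuntu:22.04
RUN apt-get update && \
    apt-get install -y telnet ksh python3 python3-dev python3-pip build-essential && \
    apt-get -y clean && rm -rf /var/lib/apt/lists/*
# pymssql's Linux wheels bundle FreeTDS, so extra FreeTDS packages are usually not needed
RUN pip3 install pymssql
ENTRYPOINT ["python3"]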

The python-pip package is only available up to Ubuntu Bionic; see python-pip.
You need to switch from focal to bionic, and the universe repository should be enabled.
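For example, keeping the Python 2 route from the question, a minimal sketch pinned to Bionic might look like this (freetds-dev is my addition in case pymssql has to build against FreeTDS, and it assumes the universe repository is enabled as noted above):
FROM ubuntu:18.04
RUN apt-get update && \
    apt-get install -y python-pip python-dev build-essential freetds-dev && \
    apt-get -y clean && rm -rf /var/lib/apt/lists/*
RUN pip install pymssql==2.1.3
ENTRYPOINT ["python"]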

Related

How to install PIP & PYMSSQL in docker

I have a Python program which is to be executed in the Azure Kubernetes.
Below is my docker file - I have Python installed
# Ubuntu base image with OpenJDK 8 and TomEE
FROM demo.azurecr.io/ubuntu/tomee/openjdk8:8.0.x
RUN apt-get update && apt-get install -y telnet && apt-get install -y ksh && apt-get install -y python2.7.x && apt-get -y clean && rm -rf /var/lib/apt/lists/*
However, I don't know how to install PIP and the related dependent libraries (e.g. pymssql).
The best option is to install Miniconda in the Docker image. I always use it when I need Python in an image that ships without Python or pip.
Here is the part that installs Miniconda in my simple Docker image:
FROM debian
RUN apt-get update && apt-get install -y curl wget
RUN rm -rf /var/lib/apt/lists/*
RUN wget \
https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh \
&& mkdir /root/.conda \
&& bash Miniconda3-latest-Linux-x86_64.sh -b \
&& rm -f Miniconda3-latest-Linux-x86_64.sh
# Make conda available on the PATH (the installer's default prefix with -b is /root/miniconda3)
ENV PATH="/root/miniconda3/bin:${PATH}"
RUN conda --version
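With conda on the PATH, a hypothetical follow-on step (not part of the original answer) could then install the package the question asks about:
# Install pymssql with the pip that ships with Miniconda
RUN pip install pymssql
# or, alternatively, from the conda-forge channel:
# RUN conda install -y -c conda-forge pymssql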

Docker installation failed on Windows

I am building an image using the following Dockerfile:
FROM ubuntu:18.04
RUN apt-get update \
&& apt-get install -y python3-pip python3-dev \
&& cd /usr/local/bin \
&& ln -s /usr/bin/python3 python \
&& pip3 install --upgrade pip
# Setup the Python's configs
RUN pip install --upgrade pip && \
pip install --no-cache-dir matplotlib==3.0.2 pandas==0.23.4 numpy==1.16.3 && \
pip install --no-cache-dir pybase64 && \
pip install --no-cache-dir scipy && \
pip install --no-cache-dir dask[complete] && \
pip install --no-cache-dir dash==1.6.1 dash-core-components==1.5.1 dash-bootstrap-components==0.7.1 dash-html-components==1.0.2 dash-table==4.5.1 dash-daq==0.2.2 && \
pip install --no-cache-dir plotly && \
pip install --no-cache-dir adjustText && \
pip install --no-cache-dir networkx && \
pip install --no-cache-dir scikit-learn && \
pip install --no-cache-dir tzlocal
# Setup the R configs
RUN apt-get update
RUN apt-get install -y software-properties-common
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E298A3A825C0D65DFD57CBB651716619E084DAB9
RUN add-apt-repository 'deb https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/'
RUN apt update
ENV DEBIAN_FRONTEND=noninteractive
RUN apt install -y r-base
RUN pip install rpy2==2.9.4
RUN apt-get -y install libxml2 libxml2-dev libcurl4-gnutls-dev libssl-dev
RUN echo "r <- getOption('repos'); r['CRAN'] <- 'https://cran.r-project.org'; options(repos = r);" > ~/.Rprofile
RUN Rscript -e "install.packages('BiocManager')"
RUN Rscript -e "BiocManager::install('ggplot2')"
RUN Rscript -e "BiocManager::install('DESeq2')"
RUN Rscript -e "BiocManager::install('RColorBrewer')"
RUN Rscript -e "BiocManager::install('ggrepel')"
RUN Rscript -e "BiocManager::install('factoextra')"
RUN Rscript -e "BiocManager::install('FactoMineR')"
RUN Rscript -e "BiocManager::install('apeglm')"
When I build this on Linux, I can launch the web app from the container and it runs fine.
But when I build it on Windows using Docker Toolbox, although the installation of factoextra and FactoMineR completes successfully, launching the web app raises an error:
Error in library("factoextra") : there is no package called ‘factoextra’
Do you have any idea what might cause this problem? It's very strange, because when I build the image the installation of these two packages seems to complete successfully.
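One way to narrow this down is to make the build itself fail if a package did not really install, e.g. with a check like the one below at the end of the Dockerfile (a debugging sketch, not a fix for the Docker Toolbox issue itself):
# Fails the image build if either package cannot be loaded
RUN Rscript -e "library(factoextra); library(FactoMineR)"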

Run install python packages on Dockerfile

I'm new to Docker and am currently trying to create a Dockerfile that installs the Python packages and their libraries shown here:
FROM balenalib/fincm3-debian-python:latest
# RUN install_packages git
RUN apt-get update && apt-get install python \
&& apt-get install pip3 \
apt-get install libportaudio0 libportaudio2 libportaudiocpp0 portaudio19-dev \
pip3 install pyaudio \
pip3 install numpy \
pip3 install matplotlib \
pip3 install scipy \
pip3 install librosa \
# Set our working directory
WORKDIR /usr/src/app
COPY Recorder.py /usr/src/app
# Recorder.py will run when container starts up on the device
CMD ["python","/usr/src/app/Recorder.py"]
However, when I try to push this Dockerfile, the following error is generated:
Error: The command '/bin/sh -c apt-get update && apt-get install python && apt-get install pip3 apt-get install libportaudio0 libportaudio2 libportaudiocpp0 portaudio19-dev pip3 install pyaudio
pip3 install numpy pip3 install matplotlib pip3 install scipy pip3 install librosa WORKDIR /usr/src/app' returned a non-zero code: 100
Moving the Python packages into requirements.txt and installing python3-pip worked with the python:3 base image.
# RUN install_packages git
RUN apt-get update \
&& apt-get install -y python \
&& apt-get install -y python3-pip
# The requirements file must be copied into the image before pip can read it
COPY requirements.txt .
RUN pip install -r requirements.txt
The package you are looking for is called python3-pip.
Next, you need both && (to separate commands) and \ (to continue the command line). So, in summary, that should be:
FROM balenalib/fincm3-debian-python:latest
RUN apt-get update && apt-get install -y python && \
apt-get install -y \
python3-pip libportaudio0 libportaudio2 libportaudiocpp0 \
portaudio19-dev && \
pip3 install pyaudio numpy matplotlib \
scipy librosa
# Set our working directory
WORKDIR /usr/src/app
COPY Recorder.py /usr/src/app
# Recorder.py will run when container starts up on the device
CMD ["python","/usr/src/app/Recorder.py"]
I believe you have more than one problem in this Dockerfile, and when you put all the commands together with && and \, you don't know which one is triggering the error. I suggest splitting them up for debugging purposes; once they all work, you can put them back together. Once you understand each individual error, it is easier to check and solve them. This question has valuable info: how to install pip in docker
Try this:
1. The packages trigger Y/n questions; pass -y to guarantee the install goes through non-interactively.
2. You are using backslashes as if they started new commands; a backslash only continues the line, so commands must be separated with && (you can use \ and then &&).
3. The pip3 and libportaudio0 apt packages don't exist (the pip package is called python3-pip):
E: Unable to locate package libportaudio0
I found the errors by dividing the Dockerfile like this and removing the problems mentioned:
RUN apt-get update
RUN apt-get install python -y \
&& apt-get install python3-pip -y
RUN apt-get install libportaudio2 libportaudiocpp0 portaudio19-dev -y
RUN pip3 install pyaudio numpy matplotlib \
scipy librosa
If you want to put the commands together:
RUN apt-get update \
&& apt-get install python -y \
&& apt-get install python3-pip -y \
&& apt-get install libportaudio2 libportaudiocpp0 portaudio19-dev -y \
&& pip3 install pyaudio numpy matplotlib \
scipy librosa
I also suggest adding a pip requirements file; it would make things cleaner.
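A minimal sketch of that requirements-file approach (the package list is taken from the question; requirements.txt is a new file placed next to the Dockerfile):
# requirements.txt
pyaudio
numpy
matplotlib
scipy
librosa
Then, in the Dockerfile:
COPY requirements.txt .
RUN pip3 install -r requirements.txt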

Can't install scipy

I am trying to install scipy from a Dockerfile and I cannot for the life of me figure out how.
Here is the Dockerfile:
FROM python:3.5
ENV HOME /root
# Install dependencies
RUN apt-get update
RUN apt-get install -y gcc
RUN apt-get install -y build-essential
RUN apt-get install -y zlib1g-dev
RUN apt-get install -y wget
RUN apt-get install -y unzip
RUN apt-get install -y cmake
RUN apt-get install -y python3-dev
RUN apt-get install -y gfortran
RUN apt-get install -y python-numpy
RUN apt-get install -y python-matplotlib
RUN apt-get install -y ipython
RUN apt-get install -y ipython-notebook
RUN apt-get install -y python-pandas
RUN apt-get install -y python-sympy
RUN apt-get install -y python-nose
# Install Python packages
RUN pip install --upgrade pip
RUN pip install cython
# Install scipy
RUN apt-get install -y python-scipy
This builds an image, but when I run the container and try to import scipy it says:
Python 3.5.1 (default, Mar 9 2016, 03:30:07)
[GCC 4.9.2] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import scipy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'scipy'
I have tried using RUN pip install scipy and RUN pip install git+https://github.com/scipy/scipy.git but those throw an error before completing the build.
You're using Python 3 but installing the Python 2 packages. Change your Dockerfile to the following:
FROM python:3.5
ENV HOME /root
ENV PYTHONPATH "/usr/lib/python3/dist-packages:/usr/local/lib/python3.5/site-packages"
# Install dependencies
RUN apt-get update \
&& apt-get upgrade -y \
&& apt-get autoremove -y \
&& apt-get install -y \
gcc \
build-essential \
zlib1g-dev \
wget \
unzip \
cmake \
python3-dev \
gfortran \
libblas-dev \
liblapack-dev \
libatlas-base-dev \
&& apt-get clean
# Install Python packages
RUN pip install --upgrade pip \
&& pip install \
ipython[all] \
numpy \
nose \
matplotlib \
pandas \
scipy \
sympy \
cython \
&& rm -fr /root/.cache
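If you want the build to prove that the import now works, a small optional check (my addition, not part of the original answer) can go at the end of the Dockerfile:
# Fails the build if scipy still cannot be imported
RUN python -c "import scipy; print(scipy.__version__)"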

Docker: "Unknown instruction: VIRTUALENV'

Dockerfile:
FROM ubuntu:14.04.2
RUN apt-get -y update && apt-get upgrade -y
RUN apt-get install python build-essential python-dev python-pip python-setuptools -y
RUN apt-get install libxml2-dev libxslt1-dev python-dev -y
RUN apt-get install libpq-dev postgresql-common postgresql-client -y
RUN apt-get install openssl openssl-blacklist openssl-blacklist-extra -y
RUN apt-get install nginx -y
RUN pip install virtualenv uwsgi
ADD canonicaliser_api ~
virtualenv ~/canonicaliser_api/venv
source ~/canonicaliser_api/venv/bin/activate
pip install -r ~/canonicaliser_api/requirements.txt
RUN echo "daemon off;" >> /etc/nginx/nginx.conf
EXPOSE 80
CMD service nginx start
Build error:
...
Successfully installed virtualenv uwsgi
Cleaning up...
---> 0c141e23f725
Removing intermediate container d9fd3c20365d
Step 8 : ADD canonicaliser_api ~
---> 89b4fb40dba5
Removing intermediate container b0c1ad946fc4
Step 9 : VIRTUALENV
Unknown instruction: VIRTUALENV
Is it supposed to remove those containers?
Why isn't it seeing virtualenv?
Is it supposed to remove those containers?
Yes. If you want to keep them for some reason, pass --rm=false to the docker build command.
Why isn't it seeing virtualenv?
It is seeing it, but because it's at the start of a line, Docker treats it as a Dockerfile instruction, and there is no VIRTUALENV instruction. Presumably, you meant to put RUN before each line after the ADD:
ADD canonicaliser_api ~
RUN virtualenv ~/canonicaliser_api/venv
# This one needs to be a single RUN so that activating the venv affects pip.
# Note: the default /bin/sh may not provide "source", so the POSIX "." is used instead.
RUN . ~/canonicaliser_api/venv/bin/activate && \
pip install -r ~/canonicaliser_api/requirements.txt
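An alternative sketch that avoids sourcing the activate script at all (keeping the same paths as above) is to call the virtualenv's own pip directly, since it installs into the venv without any activation:
RUN virtualenv ~/canonicaliser_api/venv && \
    ~/canonicaliser_api/venv/bin/pip install -r ~/canonicaliser_api/requirements.txt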
