ipython version mismatches python version in docker

I'm building a docker with the following Dockerfile:
FROM ubuntu:18.04
RUN \
apt-get update -y && \
apt-get install python3.8 -y && \ <----- I ask for python3.8
apt-get install python3-pip -y && \
pip3 install ipython
When I ran the image I was surprised to see that ipython runs on Python 3.6.9:
$ docker build --tag versions --file .\Dockerfile.txt .
$ docker run -d -t --name ver versions
ba9bd772bc6d247a6c83f2bf932a6c5172c23f00e1e6a35f14878608d0f35f89
$ docker exec -it ver bash
# ipython
Python 3.6.9 (default, Jan 26 2021, 15:33:00)

The package python3-pip depends on python3, and the default python3 for ubuntu 18.04 is version 3.6.
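A quick way to see this inside the container (a check added here for illustration; exact output will vary):
# pip3 --version              <----- reports the interpreter it targets, here "(python 3.6)"
# python3.8 -m pip --version  <----- fails, because pip was never installed for 3.8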
There are at least three options.
Use a python base image
The official python base images include pip. If possible, I would use one of these.
python:3.8 - includes compilers to install compiled packages
python:3.8-slim - does not include compilers
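For example, a minimal Dockerfile along these lines (a sketch, not part of the original answer):
FROM python:3.8-slim
RUN pip install ipython
Here ipython is guaranteed to run on the image's Python 3.8, because pip belongs to that interpreter.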
Install pip with the get-pip.py script
One can install pip without the system package manager, for example with the script get-pip.py:
wget https://bootstrap.pypa.io/get-pip.py
python3.8 get-pip.py
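Applied to the original ubuntu:18.04 Dockerfile, this could look roughly as follows (a sketch; python3.8-distutils and wget are assumptions added because get-pip.py needs distutils and a downloader):
FROM ubuntu:18.04
RUN \
apt-get update -y && \
apt-get install -y python3.8 python3.8-distutils wget && \
wget https://bootstrap.pypa.io/get-pip.py && \
python3.8 get-pip.py && \
python3.8 -m pip install ipython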
Use ubuntu 20.04
As NickODell comments, ubuntu 20.04 uses 3.8 as the default python3. But you will have less fine-grained control over the python version than with one of the official python docker images.
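A minimal sketch of that option (the DEBIAN_FRONTEND line is a precaution against interactive apt prompts during the build):
FROM ubuntu:20.04
ENV DEBIAN_FRONTEND=noninteractive
RUN \
apt-get update -y && \
apt-get install -y python3 python3-pip && \
pip3 install ipython
On ubuntu 20.04 the default python3 is 3.8, so ipython will report Python 3.8 as well.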

Related

Is there an option to install Python3.8.13 slim version on an RHEL8 based Docker?

I want exactly Python 3.8.13 to run in my container, but so far, when I used the command below, it generated a very large Docker image:
RUN yum update -y && yum install -y python3.8 python38-pip && yum clean all
The command "yum install python3.8" installs 3.8.13 and this is fine, but as mentioned, the end result (with other required elements) is a bit above 2 GB once built. I would like to make the image smaller and I am wondering if I can use a slim or alpine version of Python 3.8.13.
I was trying with the following commands:
yum install -y python3.8.13-slim
yum install -y python3.8.13-slim-buster
yum install -y python3.8-slim
None of these succeeded; yum does not recognize them as valid packages.
Is there a workaround for this?
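One possible direction, in line with the "Use a python base image" suggestion earlier on this page (a hedged sketch): slim and alpine are Docker image tags, not yum packages, so they are selected with FROM rather than installed with yum:
FROM python:3.8.13-slim
# Python 3.8.13 is already present in this base image; no yum install needed
RUN python --version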

How to install python 3.9 on Amazon Linux 2 with cloud-init and CDK

I'm trying to install Python 3.9 on an EC2 instance that uses Amazon Linux 2. I tried following this guide: https://computingforgeeks.com/install-latest-python-on-centos-linux/, and I was able to install Python 3.9 manually on the EC2 instance by SSH'ing in and running the commands. I'm now trying to set up the EC2 instance with a UserData script that calls some CloudFormationInit scripts to install dependencies, including Python 3.9, and my script is failing.
Here's part of the script that I'm using to install Python 3.9:
const installPythonString = `
#!/bin/bash
sudo amazon-linux-extras install -y epel
sudo yum -y update
sudo yum groupinstall "Development Tools" -y
sudo yum install openssl-devel libffi-devel bzip2-devel -y
gcc --version
sudo yum install wget -y
sudo mkdir -p /opt/python3.9/
sudo chown -R $USER:$USER /opt/python3.9/
wget https://www.python.org/ftp/python/3.9.9/Python-3.9.9.tgz -P /opt/python3.9
cd /opt/python3.9/
tar xvf Python-3.9.9.tgz
whoami
sudo chown -R $USER:$USER Python-3.9.9
cd Python-3.9.9/
ls -al
pwd
./configure --enable-optimizations
sudo make altinstall
python3.9 --version
pip3.9 --version
`;
init.addConfig('install_python39', new ec2.InitConfig([
  ec2.InitFile.fromString('/opt/install_python39.sh', installPythonString, {
    mode: '000755',
    owner: 'root',
    group: 'root',
  }),
  ec2.InitCommand.shellCommand('sudo sh install_python39.sh', {
    cwd: '/opt',
    key: 'install_python39',
  }),
]))
I'm getting the following errors when trying to start up the EC2 instance:
Python build finished successfully!
...
WARNING: The script pip3.9 is installed in '/usr/local/bin' which is not on PATH.
install_python39.sh: line 21: python3.9: command not found
install_python39.sh: line 22: pip3.9: command not found
Is there an easier way to install Python 3.9 on Amazon Linux 2 using CloudFormationInit?
It looks like python3.9 was installed to /usr/local/bin, which is not on $PATH, so the python3.9 command is not found.
Run the following commands in order.
export PATH="/usr/local/bin:$PATH" or echo 'export PATH="/usr/local/bin:$PATH"' >> ~/.bashrc (if you do this, relaunch the SSH session) to save it to .bashrc so you don't have to run the export every time you log in.
python3.9 --version
Additionally, if you keep having issues, follow this guide to install python3.9, which is what I used, and everything went flawlessly.
If you have python packages that need to be installed, I would recommend creating a requirements.txt and using pip3.9 install -r requirements.txt to install them.
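Since cfn-init runs the script non-interactively and never reads ~/.bashrc, another option is to make the script itself aware of the install location. A minimal sketch of lines that could be appended to the end of installPythonString above (the /usr/local/bin location comes from the warning in the build output):
# make altinstall placed the binaries in /usr/local/bin, which is not on PATH here
export PATH="/usr/local/bin:$PATH"
python3.9 --version
pip3.9 --version
# or, without touching PATH, call them by absolute path:
/usr/local/bin/python3.9 --version
/usr/local/bin/pip3.9 --version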

Installed R packages in Dockerfile won't be found when running container

I'm trying to install several R packages in a Python Docker image. I have this small Dockerfile:
# Python 3.7.5
FROM python:3.7.5
ENV PYTHONUNBUFFERED 1
# Install R 3.6
RUN echo 'deb http://cran.rstudio.com/bin/linux/debian buster-cran35/' >> /etc/apt/sources.list
RUN apt install dirmngr
RUN apt-key adv --keyserver keys.gnupg.net --recv-key 'E19F5F87128899B192B1A2C2AD5F960A256A04AF'
RUN apt update
RUN apt install -y r-base
# Install R dependencies
RUN R -e "install.packages('BiocManager', dependencies=TRUE, repos='http://cran.rstudio.com/')"
It doesn't throw any error. But when I run docker container exec -it <my container> bash and do:
Rscript -e 'installed.packages()' | grep BiocManager
There aren't any results. I don't know if this applies, but during building it throws:
The downloaded source packages are in
'/tmp/Rtmp7jBLWQ/downloaded_packages'
Maybe the problem is that it's installing the packages into a temp folder. Is there any way to install R packages without basing the image on the r-base image and using install2.r?
What a shame... I had a docker-compose file with a build: . clause and I forgot to run docker-compose build. That's why the changes didn't apply.
However, I used the r-base image to install the dependencies in an easier way. My final Dockerfile is:
FROM r-base:3.6.2
# Install R dependencies
RUN install2.r --error BiocManager
# Install Python 3.7
RUN apt update && apt install -y python3.7 python3-pip
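To confirm the package really is baked into the rebuilt image, the check from the question can be rerun after forcing a rebuild (commands assembled from the question and answer above; <my container> is a placeholder):
docker-compose build
docker-compose up -d
docker container exec -it <my container> Rscript -e 'installed.packages()' | grep BiocManager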

Building Python 3.6.4 on Linux from scratch

I'm trying to build Python 3.6.4 on LFS 8.2-systemd, so I run the configure command:
./configure --prefix=/usr \
--enable-shared \
--with-system-expat \
--with-system-ffi \
--with-ensurepip=yes
followed by make -j.
However, at this point the module "pyexpat" is not found by Python, even though the library exists at /usr/lib/libexpat.so.
After reading "building Python from source with zlib support", I created a symlink:
ln -s /usr/lib /usr/lib/x86_64-gnu-linux
If I run make install, I get an error:
ModuleNotFoundError: No module named pyexpat
My expat lib version is 2.2.5.
I'm doing the compilation inside env -i chroot /mnt bash
and my environment just contains a valid PATH and LX_ALL=POSIX variables.
I ran into this same problem for python 3.6.8, when I initially configured using:
./configure --prefix=/opt/python-3.6/ --enable-optimizations
However, when I retried using the command in the BLFS book:
./configure --prefix=/opt/python-3.6/ --enable-shared --with-system-expat --with-system-ffi --with-ensurepip=yes
My pyexpat started working.
That being said, I think it may be helpful to just retry, since my second command is functionally identical to yours.
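As a quick sanity check after make install (an extra step, not from the original answer), importing the module directly shows whether the system expat was picked up:
python3 -c "import pyexpat; print(pyexpat.EXPAT_VERSION)"
With --with-system-expat this should report the installed expat, e.g. expat_2.2.5 in the setup described above.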
Alternatively, on Ubuntu you can install Python 3.6 from a PPA:
sudo add-apt-repository ppa:jonathonf/python-3.6
sudo apt-get update
sudo apt-get install python3.6
To make python3 use the newly installed python 3.6 instead of the default 3.5 release, run the following two commands:
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.5 1
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.6 2
Finally, switch between the two python versions for python3 via this command:
sudo update-alternatives --config python3
After selecting version 3.6:
python3 -V

Installing second python on Debian

So I have a Debian machine for my Django production server.
I need to install a second python (2.7.1) to use with virtualenv.
But the build always complains that some modules are missing, so I have to search for them manually, apt-install them and rebuild. Is there either a way to resolve the build dependencies, or a pre-compiled .deb with python 2.7.1 for Debian Squeeze?
Sorry if this is a bit of a noobie question; I googled, honestly.
Get the Python 2.7.1 sources and compile them manually:
./configure --prefix=/path/to/python-2.7
make; make install
Python 2.7 is available for wheezy (testing), so you should be able to install it by adding the testing repository and doing some APT pinning.
1) add the repository in /etc/apt/sources.list
deb http://ftp.us.debian.org/debian testing main contrib non-free
2) do the actual pinning in /etc/apt/preferences
Package: *
Pin: release n=testing
Pin-Priority: 100
A Pin-Priority of under 500 basically means that no packages from testing are installed automatically, so you won't have problems with other packages.
3) install python2.7 from testing:
aptitude -t testing install python2.7
(or apt-get if you don't have aptitude)
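Before installing, apt-cache policy can confirm that the pin is picked up and which version would be installed (an extra check, not part of the original answer):
apt-cache policy python2.7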
Here are two methods for Debian GNU/Linux 6.0.7 (as of 18/07/2013):
The classic
Install dependencies
aptitude -y install build-essential python-pip libmysqlclient-dev libadns1-dev \
python-dev libreadline-dev libgdbm-dev zlib1g-dev libsqlite3-dev \
libssl-dev libbz2-dev libncurses5-dev libdb-dev
Download python
cd /tmp
wget http://python.org/ftp/python/2.7.5/Python-2.7.5.tar.xz
unxz -c Python*xz | tar xpf -
Compile
cd Python*
./configure --prefix=/opt/python2.7.5 --enable-shared
make
Install
make install
echo "/opt/python2.7.5/lib" > /etc/ld.so.conf.d/libpython2.7.conf
ldconfig
Test
/opt/python2.7.5/bin/python -c "print('Ok')"
Upgrade pip and virtualenv
easy_install pip
pip -v install --upgrade distribute==0.7.3
pip -v install --upgrade virtualenv==1.9.1
Create a user and its virtualenv
adduser user_app --home /opt/user_app
su user_app
virtualenv --no-site-packages --verbose -p /opt/python2.7.5/bin/python $HOME
Test again
su user_app
cd
source bin/activate
python -c "import sys; print sys.version"
The "pythonic"
Use the package pyenv.
pyenv install 2.7.5
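A minimal sketch of wiring a pyenv-built interpreter into a virtualenv, matching the question's goal (paths are the pyenv defaults; the installer URL is an assumption):
# install pyenv (one common way; see the pyenv docs for alternatives)
curl https://pyenv.run | bash
# build the interpreter, then point virtualenv at it
pyenv install 2.7.5
virtualenv -p ~/.pyenv/versions/2.7.5/bin/python ~/venv27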
Installing a chroot environment with debootstrap could also be a fast and secure solution.
It uses about 300 MB.
debootstrap wheezy /opt/debian7
chroot /opt/debian7
apt-get install python2.7
You can install and switch python versions using pythonbrew. I installed python 2.7.3 and python 2.7.9 on Debian 6 and Debian 7 and it works fine.
You can follow this tutorial: pythonbrew howto.
