A better way to deploy a Debian-Python hybrid application - python

I wrote a small application for Debian Linux that calls Python 2.7 to perform almost all of its functions.
The Python functions include, for example, remote database access, so the app depends on Python modules that are not present by default in every Linux distribution.
The app is packaged as a dpkg (.deb) file so it can be installed on many other machines (with the same Linux distribution) using dpkg -i MyApp01.
But the Python dependencies have to be installed separately for the app to work, for example: pip install mysql-connector-python-rf
Now I want to use Docker to ship my dependencies with the app and make it work on other machines without having to install them as above.
Can Docker be used to do this, and if so, how?
If not, is there a better approach to natively bundle the Python dependencies into the dpkg file (assuming the target machines have a similar environment)?

A container is an isolated environment, so you have to ship everything your program needs to run.
Your Dockerfile will be based on Debian, so begin with
FROM debian
and will have something like
RUN apt-get update \
&& apt-get install -y mysoft mydependency1 mydependency2
and also
RUN pip install xxx
and end with something like
CMD ["python","myapp.py"]
Since your Python program certainly does things like
import module1, module2
those Python modules will need to be installed in your Dockerfile with a RUN directive.
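Put together, a minimal Dockerfile for the case in the question might look like the sketch below; mysql-connector-python-rf and myapp.py come from the question and answer above, while the apt package names are placeholders you would adjust (on recent Debian releases they would be python3/python3-pip and pip3).
FROM debian
# system-level dependencies (example package names, adjust to your app)
RUN apt-get update \
 && apt-get install -y python python-pip \
 && rm -rf /var/lib/apt/lists/*
# Python modules the app imports
RUN pip install mysql-connector-python-rf
# the application itself
COPY myapp.py /app/myapp.py
WORKDIR /app
CMD ["python", "myapp.py"]
You would then build the image once with docker build -t myapp . and run it anywhere with docker run myapp, instead of distributing the .deb plus a list of pip commands.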

Related

sh: flake8: not found, though I installed it with python pip

Here I'm going to use flake8 within a Docker container. I installed flake8 using the following command and everything installed successfully.
$ sudo -H pip3 install flake8   # worked fine
and its location path is
/usr/local/lib/python3.8/dist-packages
Then I executed the following command, but the result was not as expected.
$ sudo docker-compose run --rm app sh -c "flake8"
It says,
sh: flake8: not found
Maybe the flake8 package is not installed in the correct location. Please help me with this issue.
When you docker-compose run a command, it runs in a new container with a new, isolated filesystem. If you ran pip install in a debugging shell in another container, it won't be visible there.
As a general rule, never install software into a running container, unless it's for very short-term debugging. This will get lost as soon as the container exits, and it's very routine to destroy and recreate containers.
If you do need a tool like this, you should install it in your Dockerfile instead:
FROM python:3.10
...
RUN pip3 install flake8
However, for purely developer-oriented tools like linters, it may be better to not install them in Docker at all. A Docker image contains a fixed copy of your application code and is intended to be totally separate from anything on your host system. You can do day-to-day development, including unit testing and style checking, in a non-Docker virtual environment, and use docker-compose up --build to get a container-based integration-test setup.
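Assuming the service is called app, as in your docker-compose run command, the workflow after editing the Dockerfile would look something like this:
# rebuild the image so flake8 is baked into it
docker-compose build app
# flake8 is now on the PATH inside the container
docker-compose run --rm app sh -c "flake8"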

How to convert my .py and .kv into the android application on windows

Currently, I am working on a project that requires me to develop an Android app.
For my project, I use Python and Kivy for the UI design. Following some tutorial videos on YouTube, I know I can turn my project into an Android app that can be launched on my Android phone, but that can be done easily using a Linux OS. My question is: how can I do that when I am using Windows, and what steps do I need to take?
P.S. I heard that it couldn't be done on Windows as of a few years ago; please correct me if I am wrong.
If you are using Windows 10, you can now run Buildozer from WSL (Ubuntu at least), as it's mostly equivalent to a native Linux installation; only minor fixes were necessary.
You could also install Docker and run any Linux distribution from there (Ubuntu is certainly still the safest choice, though).
If you are on an older version of Windows, the easiest way is to run a virtual machine; again, Ubuntu 18.04 is a fine choice.
Once you have any of these set up, the usual setup instructions apply:
apt update
apt install -y git zip unzip python3 python3-virtualenv python3-pip openjdk-8-jdk pkg-config autoconf libtool zlib1g-dev
pip install cython buildozer
buildozer android debug
The last command has to be run in your app's directory, which you can access once you have set up WSL by using Shift+right-click, selecting "open Linux shell here" (as seen in https://superuser.com/questions/1066261/how-to-access-windows-folders-from-bash-on-ubuntu-on-windows), and running the command inside it.
If you are using a VM, you'll need to setup file sharing with it so it can access your project directory, and run the command inside it.
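Concretely, the first build from your app's directory usually looks something like the sketch below (the WSL path is just an example); buildozer init generates the buildozer.spec file that you edit before building:
cd /mnt/c/path/to/your/app      # your app's directory as seen from WSL
buildozer init                  # creates buildozer.spec
# edit buildozer.spec: title, package.name, and e.g. requirements = python3,kivy
buildozer android debug         # the resulting .apk ends up in ./bin/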

How can I use pip to install Python packages into my Divio Docker project?

I'm used to using pip to install Python packages into my Django projects' virtual environments.
When I am working with a Divio Docker project locally, this does not work.
There are two things you need to be aware of when installing Python packages into a Docker project:
the package must be installed in the correct environment
if you want to use the installed package in the future, it needs to be installed in a more permanent way
The details below describe using a Divio project, but the principle will be similar for other Docker installations.
Installation in the correct environment
To use pip on the command line to install a Python package into a Dockerised project, you need to be using pip inside the Docker environment, that is, inside the container.
It's not enough to be in the directory where you have access to the project's files. In this respect, it's similar to using a virtual environment - you need to have the virtualenv activated. (Otherwise, your package will be installed not in the virtual environment, but in your host environment.)
To activate a virtual environment, you'd run something like source bin/activate on it.
To install a package within a Divio web container:
# start a bash prompt inside the project
docker-compose run --rm web bash
# install the package in the usual way
pip install rsa
rsa is now installed and available to use.
More permanent installation
So far however, the package will only be installed and available in that particular container. As soon as you exit the bash shell, the container will disappear. The next time you launch a web container, you will not find the rsa package there. That's because the container is launched each time from its image.
In order to have the package remain installed, you will need to include it in the image.
A Divio project includes a requirements.in file, listing Python packages that will be included in the image.
Add a new line containing rsa to the end of that file. Then run:
docker-compose build web
This will rebuild the Docker image. Next time you launch a container with (for example) docker-compose run --rm web bash, it will include that Python package.
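For example, after adding the line, the end of requirements.in would look like this (rsa is the example package from above):
# ...existing requirements stay as they are...
rsa
and you can rebuild and verify that the package is baked into the image:
docker-compose build web
docker-compose run --rm web python -c "import rsa"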
(The Divio Developer Handbook has some additional guidance on using pip.)
Note: I am a member of the Divio team. This question is one that we see quite regularly via our support channels.

What is the purpose of running a django application in a virtualenv inside a docker container?

What is the purpose of a virtualenv inside a Dockerised Django application? Python and other dependencies are already installed in the image, but at the same time it's necessary to install lots of packages using pip, so the supposed conflict between the two is still unclear to me.
Could you please explain the concept?
EDIT: Also, for example, I've created a virtualenv inside my Dockerised Django app, installed djangorestframework with pip (it shows up in pip freeze), and added it to INSTALLED_APPS in settings.py, but docker-compose up raises the error No module named rest_framework. I've checked and everything looks correct. Could this be a Docker/virtualenv conflict?
Docker and containerization might inspire the illusion that you do not need a virtual environment. distutils' Glyph makes a very compelling argument against this misconception in this PyCon talk.
The same fundamental advantages of virtualenv apply to a container just as they do to a non-containerized application, because fundamentally you're still running a Linux distribution.
Debian and Red Hat are fantastically complex engineering projects, integrating billions of lines of C code. For example, you can just apt install libavcodec, or yum install ffmpeg. Writing a working build system for one of those things is a PhD thesis. They integrate thousands of Python packages simultaneously into one working environment. They don't always tell you whether their tools use Python or not.
And so, you might want to docker exec some tools inside a container; they might be written in Python, and if you sudo pip install your application in there, now it's all broken.
So even in containers, isolate your application code from the system's.
Regardless of whether you're using Docker or not, you should always run your application in a virtual environment.
Now, in Docker in particular, using a virtualenv is a little trickier than it should be. Inside Docker, each RUN command runs in isolation, and no state other than filesystem changes is kept from line to line. To install into a virtualenv, you have to prepend the activation command on every line:
RUN apt-get install -y python-virtualenv
RUN virtualenv /appenv
RUN . /appenv/bin/activate; \
pip install -r requirements.txt
ENTRYPOINT . /appenv/bin/activate; \
run-the-app
A virtualenv is there to isolate packages in a specific environment. Docker is also there to isolate settings in a specific environment. So, in essence, if you use Docker there isn't much benefit to using a virtualenv as well.
Just pip install things into the Docker environment directly; it'll do no harm. To pip install the requirements, use the Dockerfile, where you can execute commands.
You can find a minimal example below (substitute the base image you actually use).
FROM python:3.9
COPY requirements.txt .
RUN pip install -r requirements.txt
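Building and running such an image would then look roughly like this; the image tag is a placeholder, and the import check assumes djangorestframework from the question is listed in requirements.txt:
# build the image from the Dockerfile above
docker build -t mydjangoapp .
# the requirements are now baked into the image
docker run --rm mydjangoapp python -c "import rest_framework"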

How to manage libraries in deployment

I run Vagrant on Mac OS X. I am coding inside a virtual machine with CentOS 6, and I have the same versions of Python and Ruby in my development and production environment. I have these restrictions:
I cannot manually install. Everything must come through RPM.
I cannot use pip install or gem install to install the libraries I want, as the system is managed through Puppet and everything I add will be removed.
yum has old packages. I usually cannot find the latest versions of the libraries.
I would like to put my libraries locally in a lib directory near my scripts, and create an RPM that includes those frozen versions of dependencies. I cannot find an easy way to bundle my libraries for my scripts and push everything into my production server. I would like to know the easiest way to gather my dependencies in Python and Ruby.
I tried:
virtualenv (with --relocatable option)
PYTHONPATH
sys.path.append("lib path")
I don't know which is the right way to go. Also, for Ruby, is there any way to solve my problems with Bundler? I see that Bundler is for Rails; does it work for small custom scripts?
I like the approach in Node.js and npm: all packages are stored locally in node_modules. I have the nodejs RPM installed, and I deploy a folder with my application to the production server. I would like to do the same in Ruby and Python.
I don't know Node, but what you describe for NPM seems to be exactly what a virtualenv is. Once the virtualenv is activated, pip installs only within that virtualenv - so puppet won't interfere. You can write out your current list of packages to a requirements.txt file with pip freeze, and recreate the whole thing again with pip install -r requirements.txt. Ideally you would then deploy with puppet, and the deploy step would involve creating or updating the virtualenv, activating it, then running that pip command.
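A minimal sketch of that workflow, assuming your scripts sit next to a requirements.txt:
# in your development environment: capture the exact versions in use
pip freeze > requirements.txt
# in the deploy step on the target machine: recreate the environment
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt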
Maybe take a look at Docker?
With Docker you could create an image of your specific environment and deploy that.
https://www.docker.com/whatisdocker/
