Docker container Python environments vs Python virtual environments

I program primarily in Python and have some experience with virtual environments. I am new to the software world and recently started looking at Docker to run my code. I would like some insight into what it does and how it works.
From my understanding, Docker containers are like virtual environments that run a set of instructions when executed and treat everything contained within them as a single entity (so nothing inside depends on anything outside?). As I read more about containers, they sound pretty much perfect and seem like they would eliminate any need for virtual environments, but again I am unsure. I would much appreciate some clarification on this, because I haven't been able to find a clear answer online.

The main purpose of a Python virtual environment is to isolate the environment for each project, meaning that each project can have its own dependencies, regardless of what dependencies every other project has.
But when it comes to Docker, you can treat each Docker image as an isolated environment: you do not need to create or maintain a virtual environment in a Dockerfile, as each Dockerfile should be based on a particular version of Python and should run a single project.
So in short, if you have 3 projects with different requirements:
Project A requires Python 3.6
Project B requires Python 3.7
Project C requires Python 3.8
you just need to choose a base image for each project:
Project A: FROM python:3.6
Project B: FROM python:3.7
Project C: FROM python:3.8
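To make that concrete, a minimal Dockerfile for Project A might look like the sketch below; the requirements.txt and app.py names are just illustrative assumptions, not part of the original question:
FROM python:3.6
WORKDIR /app
# Dependencies are installed straight into the image's Python; no virtualenv is needed
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]
Because the image itself is the isolation boundary, the pip install above plays the role a virtual environment would play on a shared machine.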

I prefer to think of containers as an OS on top of your OS. You can google lots of info about Docker, but in simple terms it is a thin layer that runs on top of your OS, uses your OS's resources (unlike a VM), and runs its own enclosed environment.

Neither of the two can replace the other. It depends on what you are doing.
A Python virtual environment is a way to encapsulate all of an app's dependencies inside a single environment (actually a directory). These dependencies are other apps and packages built for the OS version you are using.
A Docker container is a way to run something like a virtual machine with low resource consumption, by sharing a lot of your OS's files (more details in the Docker docs).
So,
if you need to create a development environment, it is recommended to use Docker, because you can duplicate the exact development experience for all developers. Everything will run in what is effectively a virtual machine with its own OS version and its own files. A Python virtual environment will not fully help other developers unless they are using the same OS version as you and can duplicate your exact steps to deploy your app.
But if you are creating a package that will be deployed to a remote server (say, by using Ansible), Docker will be an extra, unneeded step. A Python environment will do the job just fine without any issue.
Also, it is very common to have Docker images that include many Python virtual environments, one environment for each service. So even a Docker image can include a Python venv.
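As a rough sketch of that last point, a single image can carry one venv per service; the service names and pinned packages below are made up purely for illustration:
FROM python:3.11
# One virtual environment per service, inside the same image
RUN python -m venv /opt/venvs/service-a && \
    /opt/venvs/service-a/bin/pip install "requests==2.31.0"
RUN python -m venv /opt/venvs/service-b && \
    /opt/venvs/service-b/bin/pip install "requests==2.25.1"
# Each service is then started with its own interpreter, e.g.
# /opt/venvs/service-a/bin/python -m service_a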

Related

What workflow can I use for moving a Poetry environment from dev to prod?

I've been coding small personal projects using conda for years. Recently I've been working on a higher volume of projects without the need for scientific packages, so I decided to switch over to standard Python. I am now using poetry as my package manager.
My development environment is working fine, and I'm liking Poetry overall. But it's time to start rolling out my apps to several different machines. I'm letting some of my workmates use my apps, and I don't know how to go about getting my projects to them, as they are not developers.
Part of my app is a system_tray.py which loads on startup (Windows) and is basically the launcher for all the additional *.py files that perform different functions. This is the issue I need to solve during the small rollout.
system_tray.py
...
args = ['C:\\poetry\\Cache\\virtualenvs\\classroom-central-3HYMdsQi-py3.11\\Scripts\\python.exe',
'ixl_grade_input.py']
subprocess.call(args)
...
This obviously triggers the running of a separate *.py file. What I'm not sure of is what workflow would deal with this situation. How would I make this work in a development environment, and be able to roll this out into a production environment?
In my case, I can just manually modify everything and install Python on the individual machines along with pip installing the required modules, but this can't be the way a real rollout would go, and I'm looking to learn a more efficient method.
I'm aware of, but have yet to use, a requirements.txt file. I first thought that perhaps you just set up virtualenvs to mirror the production environment configurations. However, after trying to research this, it seems as though people roll their virtualenvs out into the production environments as well. I also understand that I can use poetry config to install the venv into the project directory, and then just use a relative path in the "args"?
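For reference, that last option would look something like this (this is the standard Poetry configuration setting; using it for this rollout is only a sketch of one possible approach):
poetry config virtualenvs.in-project true
poetry install   # the virtual environment is then created in ./.venv inside the project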
This is one workflow:
Add launch scripts to your pyproject.toml
Get the source code, Python and Poetry onto the target computer
Run poetry install to create a Poetry environment (a virtual environment under the hood)
Run your launch scripts with the poetry run command; they will automatically use the environment created by poetry install
Some examples of how to add launch scripts to Poetry's pyproject.toml:
[tool.poetry.scripts]
trade-executor = 'tradeexecutor.cli.main:app'
get-latest-release = 'tradeexecutor.cli.latest_release:main'
prepare-docker-env = 'tradeexecutor.cli.prepare_docker_env:main'
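With entries like those in place, running on the target machine might look like the following; the script name is taken from the example above and the --help flag is only illustrative:
poetry install
poetry run trade-executor --help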

Is there a way to pack a Python project with all its dependencies, not as an executable, but with access to the code?

EDIT (since I think the original post wasn't clear enough):
Is there a tool or an option to pack my Python project and all of its dependencies and sub-dependencies, the same way PyInstaller does? The difference is that PyInstaller generates an executable binary, and I'd like to keep the ability to make changes to the code after it's distributed (e.g. in a client's environment).
Original Post
I have a Python project with a lot of dependencies and sub-dependencies. It's currently distributed by using pipenv to create a virtual environment and fetch the third-party libraries, then PyInstaller to generate an executable, which is used on a virtual machine running another OS (I build the executable for the target machine's OS by building it in a Docker container with the same OS).
The thing is, I use some data and scripts from this virtual machine in my Python project, so I can't run it locally, and whenever there's a bug or an error, I have to make changes locally, rebuild (which takes some time), and only then move the executable to the VM in order to run it.
My goal is to have all of the code packed with its dependencies, but with the structure of my project, and not as an executable, so I can make quick changes on the VM itself.
The VM may not have an external connection, so I can't just install the dependencies on the machine.
Is there a tool that can help me do such a thing?
Note
Currently the Python version on the VM is different from the Python version the project uses. It's possible to install another version if necessary.

Can we run our Python project in a Python virtual environment in a production environment?

Problem:
We have two different Python services which should run on a single server, where we have a dependency clash: say Project A needs an older version of a module while Project B needs the same module but a newer version.
To isolate them, we found that Python virtual environments will solve this issue.
But the real question for me is whether a virtual environment is stable
and accepted for production-level usage.
Or is there any other way we can approach the problem?
Yes, you can.
You can create a virtual environment for the first service with one Python version, and for the second service you can use a different Python version.
You can point each service at its own environment in the command that runs it (for example in Supervisor, which we use for running processes):
[program:service1]
command=/path/to/virtualenv_for_service1/bin/python service1.py
[program:service2]
command=/path/to/virtualenv_for_service2/bin/python service2.py
It is perfectly acceptable to use a virtual environment in production. In fact, it is often encouraged as it will ensure that any updates to the Python packages for one of the projects will not break the other.
A good alternative is to use a separate Docker container for each of the projects.
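A rough sketch of what that could look like, with made-up image and directory names, assuming each project has its own Dockerfile built from an appropriate base image:
docker build -t project-a ./project-a
docker build -t project-b ./project-b
docker run -d --name project-a project-a
docker run -d --name project-b project-b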

Why is virtualenv necessary?

I am a beginner in Python.
I read virtualenv is preferred during Python project development.
I couldn't understand this point at all. Why is virtualenv preferred?
Virtualenv keeps your Python packages in a virtual environment localized to your project, instead of forcing you to install your packages system-wide.
There are a number of benefits to this.
The first and principal one is that you can have multiple virtualenvs, so you
can have multiple sets of packages for different projects, even
if those sets of packages would normally conflict with one another.
For instance, if one project you are working on runs on Django 1.4
and another runs on Django 1.6, virtualenvs can keep those projects
fully separate so you can satisfy both requirements at once (see the sketch after this list).
The second is that it makes it easy to release your project with its own dependent
modules; thus you can easily create your requirements.txt
file.
The third is that it allows you to switch to another installed Python interpreter for that project. Very useful (think old 2.x scripts), but sadly not available in the now built-in venv.
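As a rough sketch of that first point (project names are invented, version numbers taken from the example above, and the bin/ paths assume a Unix-like system):
# One isolated environment per project, each pinned to its own Django
virtualenv proj_django14
proj_django14/bin/pip install "Django==1.4"
virtualenv proj_django16
proj_django16/bin/pip install "Django==1.6"
# Each project now runs against its own Django by using its environment's bin/python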
Note that virtualenv is about "virtual environments" but is not the same as "virtualization" or "virtual machines" (this is confusing to some). For instance, VMWare is totally different from virtualenv.
A Virtual Environment, put simply, is an isolated working copy of Python which allows you to work on a specific project without worry of affecting other projects.
For example, you can work on a project which requires Django 1.3 while also maintaining a project which requires Django 1.0.
Virtualenv helps you create a local environment (not system-wide), specific to the project you are working on.
Hence, as you start working on multiple projects, your projects will have different dependencies (e.g. different Django versions), so you will need a different virtual environment for each project. Virtualenv does this for you.
Since you are using virtualenv, try virtualenvwrapper: https://pypi.python.org/pypi/virtualenvwrapper
It provides some utilities to create, switch and remove virtualenvs easily, e.g.:
mkvirtualenv <name>: to create a new virtualenv
workon <name>: to use a specified virtualenv
and some others
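A short session with those commands might look like this (the environment name and package are only examples):
mkvirtualenv blog-project     # create and activate a new virtualenv
workon blog-project           # switch to it later from any shell
pip install "Django==1.6"     # packages land in this environment only
deactivate                    # leave the environment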
Suppose you are working on multiple projects: one project requires a certain version of Python (or of a library) and another project requires some other version. If you are not working in a virtual environment, both projects will use the same version installed locally on your machine, which can cause errors for one of them.
With a virtual environment, on the other hand, you create an isolated setup where you can store all the libraries and versions separately. Each time, you can create a new virtual environment and work in it as a fresh one.
There is no real point to them in 2022; they are a mechanism to accomplish what C#, Java, Node, and many other ecosystems have done for years without virtual environments.
Projects need to be able to specify their package and interpreter dependencies in isolation from other projects. Virtual environments are a fine but legacy solution to that issue (versus a config file which specifies the interpreter version plus a local __pypackages__ directory).
PEP 582 aims to address this lack of functionality in the Python ecosystem.

Bundling/executing a Python script + modules on a remote machine

I have looked into other Python module distribution questions. My need is a bit different (I think!; I am a Python newbie+).
I have a bunch of Python scripts that I need to execute on remote machines. Here is what the target environment looks like:
The machines will have the base Python runtime installed
I will have an SSH account; I can log in or execute commands remotely using ssh
I can copy files (scp) into my home dir
I am NOT allowed to install anything on the machine; the machines may not even have access to the Internet
my scripts may use some 'exotic' Python modules, which most likely won't be present on the target machine
after the audit, my home directory will be nuked from the machine (leave no trace)
So what I'd like to do is:
copy a directory structure of Python scripts + modules to the remote machine (say into /home/audituser/scripts, with modules copied into /home/audituser/scripts/python_lib)
then execute a script (say /home/audituser/scripts/myscript.py). This script will need to resolve all modules used from the 'python_lib' subdirectory.
Is this possible? Or is there a better way of doing this? I guess what I am looking for is a way to 'relocate' the 3rd-party modules into the scripts dir.
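(For what it's worth, the module resolution described above is usually done by putting the bundled directory on sys.path; a minimal sketch, assuming the layout mentioned here and an invented module name:)
# Hypothetical first lines of myscript.py: make the bundled python_lib importable
import os
import sys
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "python_lib"))
import some_exotic_module  # now resolved from python_lib (illustrative name)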
thanks in advance!
Are the remote machines the same as each other? And, if so, can you set up a development machine that's effectively the same as the remote machines?
If so, virtualenv makes this almost trivial. Create a virtualenv on your dev machine, use the virtualenv copy of pip to install any third-party modules into it, build your script within it, then just copy that entire environment to each remote machine.
There are three things that make it potentially non-trivial:
If the remote machines don't (and can't) have virtualenv installed, you need to do one of the following:
In many cases, just copying over a --relocatable environment works. See the documentation section on "Making Environments Relocatable".
You can always bundle virtualenv itself, and pip install --user virtualenv (and, if they don't even have pip, a few steps before that) on each machine. This will leave the user account in a permanently-changed state. (But fortunately, your user account is going to be nuked, so who cares?)
You can write your own manual bootstrapping. See the section on "Creating Your Own Bootstrap Scripts".
By default, you get a lot more than you need—the Python executable, the standard library, etc.
If the machines aren't identical, this may not work, or at least might not be as efficient.
Even if they are, you're still often making your bundle orders of magnitude bigger.
See the documentation sections on Using Virtualenv without bin/python, --system-site-packages, and possibly bootstrapping.
If any of the Python modules you're installing also need C libraries (e.g., libxml2 for lxml), virtualenv doesn't help with that. In fact, you will need the C libraries to be almost exactly the same (same path, compatible version).
Three other alternatives:
If your needs are simple enough (or the least-simple parts involve things that virtualenv doesn't help with, like installing libxml2), it may be easier to just bundle the .egg/.tgz/whatever files for third-party modules, write a script that does a pip install --user and so on for each one, and then you're done (see the sketch after this list).
Just because you don't need a full app-distribution system doesn't mean you can't use one. py2app, py2exe, cx_freeze, etc. aren't all that complicated, especially in simple cases, and having a click-and-go executable to copy around is even easier than having an explicit environment.
zc.buildout is an amazingly flexible and manageable tool that can do the equivalent of any of the three alternatives. The main downside is that there's a much, much steeper learning curve.
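As a rough sketch of the "bundle the files and install them offline" alternative mentioned above, using modern pip flags and assuming a requirements.txt that lists the exotic modules:
# On a machine with Internet access: collect the third-party packages
pip download -d bundled_packages/ -r requirements.txt
# On the audited machine, after scp'ing bundled_packages/ over: install offline
pip install --user --no-index --find-links=bundled_packages/ -r requirements.txt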
You can use virtualenv to create a self-contained environment for your project. This can house your own script, as well as any dependency libraries. Then you can make the env relocatable (--relocatable), and sync it over to the target machine, activate it, and run your scripts.
If these machines do have network access (not internet, but just local network), you can also place the virtualenv on a shared location and activate from there.
It looks something like this:
virtualenv --no-site-packages portable_proj
cd portable_proj/
source bin/activate
# install some deps
pip install xyz
virtualenv --relocatable .
Now portable_proj can be distributed to other machines.
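On the target machine, usage might then look roughly like this (host and paths are illustrative and borrowed from the question):
# From your local machine: copy the environment and your scripts
scp -r portable_proj audituser@remote:/home/audituser/
# On the remote machine: activate the copied environment and run
source /home/audituser/portable_proj/bin/activate
python /home/audituser/scripts/myscript.py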
