Python multiprocessing.Process to use virtualenv

I am running a program inside a virtualenv, but the multiprocessing.Process spawned here appears to use the system python by default. How do I force it to use the virtualenv's python?
import os
from multiprocessing import Process

def function1(param):
    # do_something_here
    p = Process(target=func2, args=(param,))
    p.start()
    return something

def func2(param):
    os.system("which python")
Here it prints "/usr/bin/python", but I need it to use the virtualenv's python instead.

With sudo venv/bin/python, you effectively activated the virtualenv by using the python executable inside the virtualenv directly.
multiprocessing.Process spawns the child process with fork() and no exec(), so the child uses exactly the same python executable as the parent process.
You can confirm the python executable in use with:
>>> import sys
>>> print(sys.executable)
/Users/georgexsh/workspace/tmp/venv/bin/python
>>> print(sys.exec_prefix)
/Users/georgexsh/workspace/tmp/venv/bin/..
Do not use which to determine the path of the running python executable. which, as a shell command, searches each directory on $PATH for an executable file named "python". Because you ran the virtualenv's python directly, without sourcing its activate script first, $PATH was never patched with the virtualenv's bin directory; as a result, the shell command which python outputs the path of the system python executable.
In fact, at the Python layer, $PATH is irrelevant. Patching $PATH is a convenience at the shell layer: it lets you invoke the virtualenv's python by typing just "python" rather than its full path. What matters is which python executable is invoked, not how it gets invoked.
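To see this for yourself, here is a minimal sketch that prints the interpreter path from both the parent and the child. Whether the child is created by fork (inheriting the parent's process image) or spawn (re-executing sys.executable), it reports the same interpreter as the parent:

```python
import sys
from multiprocessing import Process

def report():
    # The child inherits (fork) or re-executes (spawn) the parent's
    # interpreter, so this prints the same path as the parent prints.
    print("child: ", sys.executable)

if __name__ == "__main__":
    print("parent:", sys.executable)
    p = Process(target=report)
    p.start()
    p.join()
```

If the parent was started with venv/bin/python, both lines show the virtualenv's interpreter.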

Your problem is here (copied from your comment):
@georgexsh I was running it using sudo. By default if you use sudo
then it will use system python. So, I have used "sudo venv/bin/python
main.py" to run the program. Even though I am using venv's python here
it returns "/usr/bin/python" for "os.system('which python')". I don't
understand this behaviour
Basically, what you describe here is a situation where your virtualenv is not active.
When you activate a virtualenv (. venv/bin/activate), the activation script changes your environment so that the virtualenv's bin directory is searched first, and the Python executable is found there. This is what virtualenv does.
By just executing the Python binary from the virtualenv's directory, your environment is not set up for the virtual environment, so any subsequent lookups of python use your default PATH, as virtualenv is not there to override it.
When you execute sudo, a new process/shell is created, and it does not inherit your virtual environment. You might be able to use sudo -E to pass the environment through, but that depends on your sudo configuration. The bulletproof version, which should work in every environment, is to execute a shell that first activates the virtualenv and then runs your script. Something like this:
sudo -- bash -c ". /home/test/mytest/bin/activate; which python"
This executes a bash shell as root, activates the virtual environment, and finally tells you which python it uses. Just adjust the command above to your virtual environment's path and it should work.
If your system is shared, keep in mind that allowing regular users to do this is horrible from a security perspective: a passwordless sudo rule for it would give them root access with little tweaking. If it is your own system and the root password is required anyway, it does not matter.

Related

Is there a single line way to run a command in a Python venv?

I have a command that only runs correctly inside a Python virtual environment I've configured (as intended). I know that I can run the command as
$ cmd args
once I've activated the venv. But (due to the constraints of the tool I'm using) I need to activate, run (and deactivate?) in one line: something equivalent to running
$ activate_somehow cmd args
outside the command line.
Is there a way to do this?
You can generally run something in a virtual environment simply by using a fully qualified path to the script. For example, if I have:
virtualenv .venv
Then I can install something into that virtual environment without activating it by running:
.venv/bin/pip install foo
This should be true for anything installed using standard Python mechanisms.
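The same idea carries over to Python code: call the environment's executables by absolute path instead of relying on PATH. A sketch, using sys.executable as a stand-in for a venv interpreter path such as .venv/bin/python:

```python
import subprocess
import sys

# Any interpreter can be addressed by its full path; for a venv this
# would be e.g. ".venv/bin/python" instead of sys.executable.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # the prefix of whichever interpreter we named
```

When the path points into a venv, the printed prefix is the venv directory, with no activation step involved.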
After looking into the generated bin/activate script, it seems like the only thing relevant to python is the VIRTUAL_ENV variable, so this should be enough to get going:
$ env VIRTUAL_ENV=path/to/venv python ...
Note that the python executable in the bin directory of the target environment is just a symlink to the globally installed interpreter, which does nothing other than set the process executable path. Assuming the program does not make use of that, utilizing the main binary itself seems harmless. In case you have installed a package which in turn installs some executables, just specify the absolute path:
$ env VIRTUAL_ENV=path/to/venv path/to/venv/bin/executable
You can create a simple wrapper script which runs activate, executes your command, and then deactivates simply by exiting the script in which your environment was activated.
#!/bin/sh
. ${venv-./env}/bin/activate
"$#"
This lets you set the environment variable venv to the path of the environment you want to use, or else uses ./env if it is unset. Perhaps a better design would be to pass the env as the first parameter:
#!/bin/sh
. "$1"/bin/activate
shift
"$#"
Either way, save this somewhere in your PATH ($HOME/bin is a common choice for your private scripts) and give it executable permission.
I found venv-run which should do what you ask:
pip install venv-run
venv-run cmd args
Larsk's answer is probably cleaner, but this is another possible way.
Assuming you use UNIX, your user is user, and you have a virtual environment in your home (or any) directory, i.e. /home/user/venv, you can make a script like:
#!/bin/sh
export VIRTUAL_ENV=/home/user/venv
export PATH=/home/user/venv/bin:$PATH
python3 "$#"
We can make this script executable (e.g. call it venv-python3 and run chmod +x venv-python3) and call it as such, or put it somewhere discoverable in PATH, say alongside python. Assuming you have sudo rights:
sudo cp venv-python3 /usr/bin/venv-python3
Then we can call that instead of the python executable. Since the variables are set within the script, an explicit call to deactivate is not necessary at exit.
Example:
user@machine ~ % venv-python3 --help
This works at least with virtualenv version 20.0.17, but if you adopt it, keep an eye on which variables bin/activate sets, in case this ever changes.
Yes, on Windows you can execute a Python file inside a virtual environment with a single command:
venv\Scripts\activate && python fall_detector.py
I installed pgadmin4 in my home directory in a virtual environment called "pgadmin4".
I use fish shell and it runs perfectly fine with:
~/pgadmin4/bin/python3 ~/pgadmin4/lib/python3.10/site-packages/pgadmin4/pgAdmin4.py
Just in case this helps somebody.

virtualenv & subprocess: How to get the correct path to a script

When someone installs my Python package, they can use the command mycmd (it's a console script added to the python bin/ directory). In turn, mycmd launches several other Python console scripts using subprocess:
subprocess.Popen(['celery', 'arg1', 'arg2'])
subprocess.Popen(['huey', 'arg1', 'arg2'])
...
(celery and huey commands are installed through my package's pip dependencies.)
This generally works fine, except in the situation where someone invokes mycmd directly without activating its virtualenv. For example, I am trying to use mycmd inside the process control system "circusd". See here where the circus.ini file invokes venv/bin/chaussette directly, without actually activating the venv. If I do this, I get the message celery: No such file or directory, I presume because the virtualenv is not activated and therefore those commands are not found on the path.
How can I ensure that when someone runs mycmd, the correct celery gets run, even if the virtualenv was not activated? (And should also work if the person is not using virtualenv at all, and cross-platform, etc.)
By the way, I am not using subprocess.Popen directly, but rather using Honcho, which provides a layer around it.
I solved this simply by adding my virtualenv bin path to the PATH used by circus.
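A variant of the same fix can live inside mycmd itself: console scripts such as celery and huey are installed next to the running interpreter, so prepending that directory to PATH before spawning makes bare command names resolve correctly even when the venv was never activated. A sketch, assuming a standard venv layout:

```python
import os
import shutil
import sys

# Console scripts are installed alongside the interpreter, so the
# environment's bin/ (Scripts\ on Windows) directory is:
bin_dir = os.path.dirname(sys.executable)

# Build an environment whose PATH searches that directory first.
env = dict(os.environ)
env["PATH"] = bin_dir + os.pathsep + env.get("PATH", "")

# Bare names now resolve inside the venv, e.g. the interpreter itself:
print(shutil.which(os.path.basename(sys.executable), path=env["PATH"]))

# Pass env=env to subprocess.Popen/run (or to Honcho) so the children
# inherit the patched PATH.
```

This also degrades gracefully outside a venv, since bin_dir is then just the system interpreter's directory.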

Mercurial update hook not activating Python virtual environment

I have a bash script that I'm trying to execute anytime an hg update occurs. The goal of this bash script is to switch to the correct virtualenv. For the sake of simplicity, this script is called .test - shown as follows:
#!/bin/bash
echo 'testing hg update hook'
source ~/.virtualenvs/myvirtualenv/bin/activate
Whenever I call this script from my shell using source .test, everything works properly; I can see the result of the echo, and my shell changes to reflect the activated virtualenv.
However, when I do an hg update, the virtualenv is not being activated. The script is firing, as I can see the echo result; however, my shell is not updated to reflect the activated virtualenv. Below is the hook setup in my .hg/hgrc file. Any ideas why my virtualenv isn't being activated in this hook?
[hooks]
# Update to the correct virtualenv when switching branches (hg update branchname)
update = source .test
UPDATE 1: Per this answer, I don't believe the hg update hook is firing in my current shell, which is why the virtualenv activates when I run the script manually but fails from the hook.
Your problem is that when you invoke a shell script, any changes to environment variables do not get exported to the calling shell (which is why you need to source activate from the surrounding shell).
The good news is that you don't strictly need to call activate in order to access a virtual environment. What activate will do is:
Add the virtualenv's bin directory to $PATH.
Set the VIRTUAL_ENV environment variable.
Modify your prompt.
None of this is necessary in order to use the virtualenv: you can execute the python binary in the virtualenv without ever using the activate script. The prompt is likely not relevant for your use case; you can add the directory (or just the python executable) to your path by symlinking it; and you need the VIRTUAL_ENV environment variable only for software that for some reason needs to be aware of the virtualenv it's running in. If necessary, you can figure it out from sys.executable. For example:
import os
import sys

def find_venv():
    python = sys.executable
    # Follow symlinks (at most a few levels) back to the real binary.
    for _ in range(10):
        if not os.path.islink(python):
            break
        python = os.path.realpath(python)
    return os.path.dirname(os.path.dirname(python))

if "VIRTUAL_ENV" not in os.environ:
    os.environ["VIRTUAL_ENV"] = find_venv()

Python: Is there any way how I can know under which virtualenv script is running?

I'm working on a project which consists of several subprojects, each using its own virtualenv, and sometimes I'm not sure that a script is running in the proper virtualenv. I have the pid of that script in memory.
Is there any way to know (and be sure that the env is correct) which virtualenv a script is running under?
I usually determine which virtualenv is running by its absolute path. From the python script, it can be found with the following:
import os
os.environ.get('VIRTUAL_ENV')
It reads the path from the environment variable VIRTUAL_ENV, which is defined by the virtualenv's activate script.
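Note that VIRTUAL_ENV is only present when the environment was activated through its shell script; if the venv's interpreter was invoked directly, the variable is unset. A check that works in both cases is to compare sys.prefix with sys.base_prefix (available since Python 3.3):

```python
import os
import sys

def current_venv():
    """Return the running venv's path, or None outside any venv."""
    # Inside a venv, sys.prefix points at the venv itself while
    # sys.base_prefix points at the base interpreter it was made from.
    if sys.prefix != sys.base_prefix:
        return sys.prefix
    # Fall back to the activate script's variable, if present.
    return os.environ.get("VIRTUAL_ENV")

print(current_venv())
```

This reports the venv no matter how the interpreter was started, so it is the more reliable check of the two.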

execute os.system('python ') inside a virtualenv

I'm using a virtualenv to execute a script, in this script I call:
os.system('python anotherScript.py')
My question is whether the script is executed in the same virtualenv as the caller script?
It's hard to tell, but if you are running this script under an activated virtualenv, the child should be under that virtual environment too. You can verify this by doing:
#script.py
import os
os.system('which python')
and from command-line
virtualenv newvirtualenv
source newvirtualenv/bin/activate
(newvirtualenv) user@ubuntu:~$ python script.py
you should see that it is under newvirtualenv/bin/python
Usually, you want to put an executable header (shebang) to use the current environment:
#!/usr/bin/env python
import os
os.system('which python')
This does not say "use newvirtualenv", but it gives you a little more confidence: if the script is executed under newvirtualenv, the child will definitely use newvirtualenv.
If you use /usr/bin/python, this is still okay under a virtualenv. But advanced programmers tend to have multiple virtual environments and multiple python versions, so depending on where they are, they can execute the script based on the environment. Just a small gain.
If you run newvirtualenv/bin/python script.py it will be under virtualenv regardless.
As long as the python binary is pointing at the virtualenv's version, you are good.
For example, when using anaconda to manage virtual envs inside the PyCharm IDE:
os.system('which python')  # /usr/bin/python
command = 'python3 xxxx.py'
os.system(command)  # uses /usr/bin/python3 as the interpreter
If I want to use some modules (e.g. cv2) installed in a certain virtual env:
command = '/path/to/anaconda3/envs/your_env_name/bin/python3 xxxx.py'
os.system(command)
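If the goal is simply to make the child run under the same interpreter as the caller, whatever environment that happens to be, sys.executable sidesteps the PATH question entirely:

```python
import os
import sys

# sys.executable is the interpreter running this script; reusing it
# guarantees the child process gets the same (virtual) environment.
os.system(f'"{sys.executable}" -c "import sys; print(sys.executable)"')
```

Replace the -c snippet with your script (e.g. anotherScript.py) to run it under the caller's environment.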

Categories

Resources