I have the virtualenv created and installed. I have also installed the jsnapy tool inside my virtual env.
This is the script that we are using:
Filename: venv.py
import os
os.system('/bin/bash --rcfile ~/TestAutomation/End2EndAutomation/bin/activate')
os.system('End2EndAutomation/bin/jsnapy')
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@sdno-server:~/TestAutomation$ ^C
What we need to know is how we can enter the virtualenv, run a command, and then deactivate it, all from a Python script.
[EDIT1]
I used the code given in the comment. It just enters the virtual env; when I issue exit, it runs the jsnapy command.
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@server:~/TestAutomation$ exit
exit
usage:
This tool enables you to capture and audit runtime environment of
networked devices running the Junos operating system (Junos OS)
Tool to capture snapshots and compare them
It supports four subcommands:
--snap, --check, --snapcheck, --diff
1. Take snapshot:
jsnapy --snap pre_snapfile -f main_configfil
Each call to os.system() creates a new bash instance and terminates the previous one. To run all the commands in one bash instance, you could put them all inside a single bash script and call that script from os.system():
run.sh
source ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate
Python
os.system('bash run.sh')  # `source run.sh` would fail under os.system's /bin/sh, and sourcing isn't needed here
Alternatively, you could write a multiline bash command, as long as it's all in one os.system() call.
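For example, a minimal sketch of that multiline variant (os.system() runs /bin/sh, so the portable `.` stands in for bash's `source`):

import os

# One shell instance executes all three lines, so the activation is
# still in effect when jsnapy runs.
os.system(""". ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate""")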
Two successive calls to os.system() will create two independent processes, one after the other. The second will run when the first finishes. Any effects of commands executed in the first process will have been forgotten and flushed when the second runs.
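A quick demonstration of that:

import os

os.system('cd /tmp')  # the cd happens in a throwaway shell and is lost
os.system('pwd')      # prints the Python process's directory, not /tmp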
You want to run the activation and the command which needs to be run in the virtualenv in the same process, i.e. the same single shell instance.
To do that, you can use bash -c '...' to run a sequence of commands. See below.
However, a better solution is to simply activate the virtual environment from within Python itself.
import os
import subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
execfile(p, dict(__file__=p))  # Python 2 built-in; see the Python 3 variant below
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])
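On Python 3, where execfile() is gone, the same in-process activation can be sketched with exec(). Note that activate_this.py is shipped by the virtualenv package; environments created with the stdlib venv module don't include it:

import os
import subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
with open(p) as f:
    exec(f.read(), dict(__file__=p))  # adjusts PATH, sys.path and sys.prefix
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])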
For completeness, here is the Bash solution, with comments.
import subprocess
subprocess.check_call(['bash', '-c', """
. ~/TestAutomation/End2EndAutomation/bin/activate
./End2EndAutomation/bin/jsnapy"""])
Preferring subprocess over os.system is recommended even in the os.system documentation itself.
There is no need to explicitly deactivate; when the bash command finishes, that will implicitly also deactivate the virtual environment.
The --rcfile trick is a nice idea, but it doesn't work when the shell you are calling isn't interactive.
Related
How do I run bash commands from Python while the session is maintained?
For example, if I pwd, then cd .., then pwd, it SHOULD end up in the parent of the original directory. I don't want to run all these commands as a single command with | or &; I want to run them on individual lines.
In general, processes can't modify the environment of their parent process, or of any other existing process. So you can't do this easily in the way you're describing, unless you deliberately save the environment from the child process somehow (e.g. end every bash command by redirecting env to a file, prefix each entry in that file with export, and source that file at the start of every subsequent command...).
Alternatives:
Add all of your interdependent bash commands to a single bash script and run that script from Python, rather than running the commands one by one.
Use os.chdir and os.environ to change the Python process's working directory and environment variables as needed before running each bash command, as in the sketch below.
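A minimal sketch of the second alternative (MY_VAR is just a placeholder name):

import os
import subprocess

os.chdir('..')                    # replaces a `cd ..` in the shell
os.environ['MY_VAR'] = 'value'    # replaces an `export MY_VAR=value`
subprocess.check_call(['pwd'])    # children inherit the new cwd and environment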
I am trying to run Python code on a build server. In order to keep the agent clean, I'm creating a virtual environment which can be deleted after the task. The Python script calls python via subprocess. The questions are:
Why does the call to subprocess not use the same Python virtual env the actual script was called in?
How can this be achieved?
Minimal example:
tmp.py:
from subprocess import check_output
import sys
# python interpreter used to call this script
print(sys.executable)
# check which python interpreter is used when calling subprocess
print(check_output('python -c "import sys\nprint(sys.executable)"').decode())
run.bat:
@echo off
python -m venv .\test_venv
call .\test_venv\Scripts\activate.bat
python tmp.py
output, where the second line is the default python installation on my computer:
λ run.bat
D:\tmp\pytest\test_venv\Scripts\python.exe
D:\tools\python\python.exe
desired output:
λ run.bat
D:\tmp\pytest\test_venv\Scripts\python.exe
D:\tmp\pytest\test_venv\Scripts\python.exe
I am on 64 bit Windows 10.
The subprocess you create uses the operating system's general PATH traversal to find and run the commands you specify, and doesn't know anything about the parent process.
You already know the value of sys.executable; if that's specifically what you want to run, say so:
print(check_output([sys.executable, "-c", "import sys\nprint(sys.executable)"], text=True))
(This also avoids the shell, which was providing no value at all. Without an explicit shell=True, your code would only work on Windows.)
(Conversely, on any sane platform, the environment, including the virtual environment, would be inherited by child processes.)
However, Python calling Python is almost always an antipattern. Instead, you want to refactor the code so you can import it and run it in the same process.
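For example, a hedged sketch of that refactor, assuming the child code lives in a hypothetical module worker exposing a main() function:

# instead of check_output([sys.executable, 'worker.py', ...]):
import worker  # hypothetical module holding the former child script

result = worker.main()  # runs in the same interpreter, same virtual env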
I need to run commands in the command prompt, but they only work when the command prompt is started at a particular location in the system. I need the following commands to run in a Python script:
import os
os.system("set OMP_NUM_THREADS=2")
os.system("explorer.exe /e,::{20D04FE0-3AEA-1069-A2D8-08002B30309D}"#
os.system("cd C:\CFD\crit_vel_01_02")
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
os.system("PAUSE")
The system does not recognise the command
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
unless it is run in the command shell that is installed with the program "fds", which is a fire dynamics simulator. I appreciate this seems quite specific to the program, but I am assuming there is some generic way that Python can run a command shell from a different location/with different settings.
The shortcut to the command prompt is called CMDfds and is installed in:
"C:\ProgramData\Microsoft\Windows\Start Menu\Programs\FDS6"
in the properties the target in the shortcut tab is:
"C:\Windows\System32\cmd.exe /k fdsinit"
Not sure it will work, but you can give subprocess.run with shell=True a try.
If shell is True, the specified command will be executed through the shell. This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user’s home directory.
Also try running the python script from the fds command shell. It seems to be initializing stuff in the shell.
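A sketch combining both suggestions; it assumes fdsinit is resolvable from cmd.exe (the question's shortcut runs it via cmd.exe /k) and reuses the paths from the question:

import os
import subprocess

# Copy the current environment and set the thread count there, which
# avoids the cmd.exe `set VAR=x && ...` trailing-space pitfall.
env = dict(os.environ, OMP_NUM_THREADS='2')

# One cmd.exe instance runs the whole line, so fdsinit's setup is
# still in effect when mpiexec starts; && stops at the first failure.
subprocess.run(
    'fdsinit && mpiexec -n 9 FDS crit_vel_01_02.fds',
    shell=True,
    cwd=r'C:\CFD\crit_vel_01_02',  # replaces the separate `cd` call
    env=env,
)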
The trouble with running programs with system commands is that they often have a different shell environment. In order to prevent problems arising from this it's a good idea to use absolute paths. In your case:
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
should be changed to:
os.system("/absolute/path/to/mpiexec -n 9 FDS crit_vel_01_02.fds")
Question
My default Python is 2.7, but I have a script that requires Python 3.4. I am trying to create a function in R that will:
Switch to Python 3.4
Run this script
Switch back to Python 2.7
Import results into R
To switch between Python versions, I use my cluster's "dotkit" system, like this:
use Python-2.7
use Python-3.4
"use" is a bash function that is imported in my .bashrc file. It sets all of my path variables (PATH, LIBRARY_PATH, LD_LIBRARY_PATH, CPATH, C_INCLUDE_PATH, etc). The problem is that when I try to call this function in R, I get the following error:
system('use Python-3.4')
sh: use: command not found
It seems like this is a problem with my PATH. I am using the correct shell:
system('echo $SHELL')
/bin/bash
My $PATH variable also looks good. However, when I create a script that essentially does the same thing:
load_py34.sh:
#!/bin/bash
source ~/.bashrc
use Python-3.4
and call this script through R, then it actually runs, but for some reason, it doesn't change my python version within R. (I have verified that this script works from the command line.)
> R
> system('python --version')
Python 2.7.1
> system('sh load_py34.sh')
Prepending: R-3.4 (ok)
> system('python --version')
Python 2.7.1
So I'm a little confused, but if anyone can help, I would really appreciate it.
Suggested fixes
When I combine them into a single command, I still have the same problem:
> system("sh load_py34.sh; python --version")
Prepending: Python-3.4 (already loaded)
Python 2.7.1
When I try calling bash directly, I still have a problem with the PATH:
> system("bash -c 'use Python-3.4; python --version'")
bash: use: command not found
Python 2.7.1
.bashrc is only loaded for interactive bash sessions.
"use" is a bash function that is imported in my .bashrc file. It sets
all of my path variables.
If set via export, the environment of the calling process will not be altered.
export [-fn] [name[=word]] ... The supplied names are marked for automatic export to the environment of subsequently executed commands. (https://man7.org/linux/man-pages/man1/bash.1.html)
Child processes do not normally have access to the parent process' environment. (This poses a problem because system() creates a sub-process.)
The source and . built-ins execute the commands in the current shell environment, which is why your script works.
Other commands (executables, non-shell-builtins) are executed by the fork-and-exec mechanism, whereby the executing shell process forks, creating a child process with an identical environment and state. This new child process is the process in which the command is executed. Changes to the environment of that process are not replicated to the parent's environment.
This means that you will not be able to rely on system('...') to modify the environment of the R process, or that of processes spawned by subsequent system() invocations.
In a single invocation to system(), you can construct a command-line that changes the environment of the spawned shell like so:
bash -c 'source ~/.bashrc; use Python-3.4; python --version'
Mind you, ~/.bashrc is not really the best place to put this functionality (might be subjective).
When you call system() it uses /bin/sh, not /bin/bash. sh doesn't read your .bashrc file when it starts up, so it does not know any of the functions you've defined there.
To use the function from your .bashrc, you must get bash to run it instead:
system("bash -c 'use Python-3.4; python --version'")