I'm on Windows using PowerShell and WSL 'Ubuntu 20.04 LTS'. I have no native Linux distro, and I can't use virtualisation because of nested-device reasons.
My goal is to use a Windows Python script in PowerShell to call WSL to decrypt some avd-snapshots into raw images. I have already tried os.popen, subprocess.Popen/run/call, win32com.client, multiprocessing, etc.
I can boot the WSL shell, but no further commands get passed to it. Does anybody know how to get the shell into focus and ready for more instructions?
Code Example:
from multiprocessing import Process
import win32com.client
import time, os, subprocess

def wsl_shell():
    shell = win32com.client.Dispatch("wscript.shell")
    shell.SendKeys("Start-Process -FilePath C:\\Programme\\WindowsApps\\CanonicalGroupLimited.Ubuntu20.04onWindows_2004.2021.825.0_x64__79rhkp1fndgsc\\ubuntu2004.exe {ENTER}")
    time.sleep(5)
    os.popen("ls -l")

if __name__ == '__main__':
    ps = Process(target=wsl_shell)
    ps.start()
There are a few ways of running WSL scripts/commands from Windows Python, but a SendKeys-based approach is usually the last resort, IMHO, since it's:
Often non-deterministic
Lacks any control logic
Also, avoid the ubuntu2004.exe (or, for other users who find this, the deprecated bash.exe command). The much more capable wsl.exe command is what you are looking for. It has a lot of options for running commands that the <distroname>.exe versions lack.
With that in mind, here are a few simplified examples:
Using os.system
import os
os.system('wsl ~ -e sh -c "ls -l > filelist.txt"')
After running this code in Windows Python, go into your Ubuntu WSL instance and you should find filelist.txt in your home directory.
This works because:
os.system can be used to launch the wsl command
The ~ tells WSL to start in the user's home directory (more deterministic, while being able to avoid specifying each path in this case)
wsl -e sh runs the POSIX shell in WSL (you could also use bash for this)
Passing -c "<command(s)>" to the shell runs those commands in the WSL shell
Given that, you can pretty much run any Linux command(s) from Windows Python. For multiple commands:
Either separate them with a semicolon. E.g.:
os.system('wsl ~ -e sh -c "ls -l > filelist.txt; gzip filelist.txt"')
Or better, just put them all in a script in WSL (with a shebang line), set it executable, and run the script via:
wsl -e /path/to/script.sh
That could even be a Linux Python script (assuming the correct shebang line in the script):
wsl -e /path/to/script.py
So if needed, you can even call Linux Python from Windows Python this way.
Using subprocess.run
The os.system syntax is great for "fire and forget" scripts where you don't need to process the results in Python, but often you'll want to capture the output of the WSL/Linux commands for processing in Python.
For that, use subprocess.run:
import subprocess
cp = subprocess.run(["wsl", "~", "-e", "ls", "-l"], capture_output=True)
print(cp.stdout)
As before, the -e argument can be any type of Linux script you want.
Note that subprocess.run also gives you the exit status of the command.
Related
I am trying to run Python code on a build server. In order to keep the agent clean, I'm creating a virtual environment which can be deleted after the task. The Python script calls python via subprocess. The questions are:
why does the call to subprocess not use the same python virtual env the actual script was called in?
How can this be achieved?
Minimal example:
tmp.py:
from subprocess import check_output
import sys
# python interpreter used to call this script
print(sys.executable)
# check which python interpreter is used when calling subprocess
print(check_output('python -c "import sys; print(sys.executable)"').decode())
run.bat:
@echo off
python -m venv .\test_venv
call .\test_venv\Scripts\activate.bat
python tmp.py
output, where the second line is the default python installation on my computer:
λ run.bat
D:\tmp\pytest\test_venv\Scripts\python.exe
D:\tools\python\python.exe
desired output:
λ run.bat
D:\tmp\pytest\test_venv\Scripts\python.exe
D:\tmp\pytest\test_venv\Scripts\python.exe
I am on 64 bit Windows 10.
The subprocess you create uses the operating system's general PATH traversal to find and run the commands you specify, and doesn't know anything about the parent process.
You already know the value of sys.executable; if that's specifically what you want to run, say so:
print(check_output([sys.executable, "-c", "import sys; print(sys.executable)"], text=True))
(This also avoids the shell, which was providing no value at all. Without an explicit shell=True, your code would only work on Windows.)
(Conversely, on any sane platform, the environment, including the virtual environment, would be inherited by child processes.)
However, Python calling Python is almost always an antipattern. Instead, you want to refactor the code so you can import it and run it in the same process.
I need to run commands in the command prompt, but they only work when the command prompt is started from a particular location in the system. I need the following commands to run in a Python script:
import os
os.system("set OMP_NUM_THREADS=2")
os.system("explorer.exe /e,::{20D04FE0-3AEA-1069-A2D8-08002B30309D}")
os.system(r"cd C:\CFD\crit_vel_01_02")
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
os.system("PAUSE")
the system does not recognise the command
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
unless this is run in the command shell which is installed on installation of the program "fds" which is a fire dynamics simulator. I appreciate this seems quite specific to the program but I am assuming there is some generic way that python can run command shell from a different location/with different settings.
The shortcut to the command prompt is called CMDfds and is installed in:
"C:\ProgramData\Microsoft\Windows\Start Menu\Programs\FDS6"
in the properties the target in the shortcut tab is:
"C:\Windows\System32\cmd.exe /k fdsinit"
Not sure it will work, but you can give subprocess.run with shell=True a try.
If shell is True, the specified command will be executed through the shell. This can be useful if you are using Python primarily for the enhanced control flow it offers over most system shells and still want convenient access to other shell features such as shell pipes, filename wildcards, environment variable expansion, and expansion of ~ to a user’s home directory.
Also try running the python script from the fds command shell. It seems to be initializing stuff in the shell.
The trouble with running programs with system commands is that they often have a different shell environment. In order to prevent problems arising from this it's a good idea to use absolute paths. In your case:
os.system("mpiexec -n 9 FDS crit_vel_01_02.fds")
should be changed to:
os.system("/absolute/path/to/mpiexec -n 9 FDS crit_vel_01_02.fds")
I have a Python script which includes some bash commands in the os.system() method.
If I convert this Python script to an exe using PyInstaller, will this exe file work properly on Windows, or will I face issues since Windows can't run bash commands?
The bash commands include the pdftk utility.
Example: pdftk input_pdf output output_pdf userpw password
Should I also install the pdftk utility on Windows?
What should I do or install to make it work on Windows?
Please help me.
Thank you
It won't work: os.system is OS-specific. On Windows it will just spawn a cmd process and try to execute the command there, and cmd != bash.
Edit: PowerShell has a lot of common bash commands implemented on Windows. You could detect in code which OS you are running on, and if PowerShell supports your bash commands, use the subprocess module to spawn PowerShell processes.
It probably will not work, from what I have seen when using bash commands inside of code on windows.
Solutions:
Change the commands to commands that work on Windows.
Use some kind of python api (if you know of one post it in the comments and I will put it here.) that allows you to use the commands you need.
Simply run the script using the bash terminal on Windows, but as far as I know you won't be able to make it an exe.
I am trying to install node.js and then check appium version using appium -v
import os,subprocess
os.system('node.msi')
os.system('exit')
os.system('appium -v')
node.msi is a Node installer on my computer. When I do it through cmd, appium -v works if I do it in a new cmd, but it doesn't work if I keep using the same cmd. So I was hoping that after exit, my code would work. Can someone point out what I am doing wrong here?
Most likely, the installation of node.msi modifies your system's PATH variable. This change does not become visible inside your running Python process.
If you know the path to your node installation, you can specify it explicitly in a call such as
subprocess.run([r'C:\node\bin\appium.exe', '-v'])
I assume that you are running Windows here. When a console starts, it reads its environment from the registry. That explains why it works when you open a second cmd console.
That means you have to ask Python to launch the command appium -v in a new console (and not only a new cmd.exe shell).
It can be done through os.system by using start:
os.system("start /W appium -v")
or depending on what is really appium:
os.system("start /W cmd /c appium -v")
You could also directly use the subprocess module (which offers more configuration than os.system):
p = subprocess.Popen("cmd /c appium -v", creationflags=subprocess.CREATE_NEW_CONSOLE)
p.wait()
Depending on what appium is, the following could be enough:
p = subprocess.Popen("appium -v", creationflags=subprocess.CREATE_NEW_CONSOLE)
p.wait()
I have the virtualenv created and installed. I have also installed the jsnapy tool inside my virtual env.
This is the script that we are using:
Filename : venv.py
import os
os.system('/bin/bash --rcfile ~/TestAutomation/End2EndAutomation/bin/activate')
os.system('End2EndAutomation/bin/jsnapy')
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@sdno-server:~/TestAutomation$ ^C
We need to know how we can get into the virtualenv, run a command, and deactivate it using a Python script.
[EDIT1]
I used the code given in the comment. It's just entering the virtual env. When I issue exit, it runs the jsnapy command.
ubuntu@server:~/TestAutomation$ python venv.py
(End2EndAutomation) ubuntu@server:~/TestAutomation$ exit
exit
usage:
This tool enables you to capture and audit runtime environment of
networked devices running the Junos operating system (Junos OS)
Tool to capture snapshots and compare them
It supports four subcommands:
--snap, --check, --snapcheck, --diff
1. Take snapshot:
jsnapy --snap pre_snapfile -f main_configfil
Each call to os.system() will create a new bash instance and terminate the previous one. To run all the commands in one bash instance you could put all your commands inside a single bash script and call that from os.system()
run.sh
source ~/TestAutomation/End2EndAutomation/bin/activate
End2EndAutomation/bin/jsnapy
deactivate
Python
os.system('bash run.sh')
Alternatively, you could write a multiline bash command, as long as it's all in one os.system() call.
Two successive calls to os.system() will create two independent processes, one after the other. The second will run when the first finishes. Any effects of commands executed in the first process will have been forgotten and flushed when the second runs.
You want to run the activation and the command which needs to be run in the virtualenv in the same process, i.e. the same single shell instance.
To do that, you can use bash -c '...' to run a sequence of commands. See below.
However, a better solution is to simply activate the virtual environment from within Python itself.
import os, subprocess

p = os.path.expanduser('~/TestAutomation/End2EndAutomation/bin/activate_this.py')
exec(open(p).read(), dict(__file__=p))  # Python 3; on Python 2, use execfile(p, dict(__file__=p))
subprocess.check_call(['./End2EndAutomation/bin/jsnapy'])
For completeness, here is the Bash solution, with comments.
import subprocess
subprocess.check_call(['bash', '-c', """
. ~/TestAutomation/End2EndAutomation/bin/activate
./End2EndAutomation/bin/jsnapy"""])
The preference for subprocess over os.system is recommended even in the os.system documentation.
There is no need to explicitly deactivate; when the bash command finishes, that will implicitly also deactivate the virtual environment.
The --rcfile trick is a nice idea, but it doesn't work when the shell you are calling isn't interactive.