convert a bash script to python3 - python

I have a bash script, which is running perfectly:
gvim --servername "servername" $1
if [ -f ${1%.tex}.pdf ];
then
evince ${1%.tex}.pdf &
fi
evince_vim_dbus.py GVIM servername ${1%.tex}.pdf $1 &
I am trying to convert it to python as:
#!/usr/bin/env python3
from subprocess import call
import sys, os
inp_tex = sys.argv[1]
oup_pdf = os.path.splitext(sys.argv[1])[0]+".pdf"
print(oup_pdf)
call(["gvim", "--servername", "servername", sys.argv[1]])
if os.path.exists(oup_pdf):
    call(["evince", oup_pdf])
call(["evince_vim_dbus.py", "GVIM", "servername", oup_pdf, inp_tex])
In the Python version, both the gvim and evince windows open, but the evince_vim_dbus.py line is not working. It isn't giving any error; it just doesn't produce the intended result the way the bash script does.
Trying with check_call (I have to kill it after a while; here's the traceback):
Traceback (most recent call last):
File "/home/rudra/vims.py", line 28, in <module>
check_call(["python","/home/rudra/bin/evince_vim_dbus.py", "GVIM", "servername", oup_pdf, inp_tex])
File "/usr/lib64/python3.5/subprocess.py", line 576, in check_call
retcode = call(*popenargs, **kwargs)
File "/usr/lib64/python3.5/subprocess.py", line 559, in call
return p.wait(timeout=timeout)
File "/usr/lib64/python3.5/subprocess.py", line 1658, in wait
(pid, sts) = self._try_wait(0)
File "/usr/lib64/python3.5/subprocess.py", line 1608, in _try_wait
(pid, sts) = os.waitpid(self.pid, wait_flags)
KeyboardInterrupt

I'm going to have a guess that your real problem isn't the evince_vim_dbus.py line itself, but rather the gvim line, because you pass it the server name 'servername' instead of simply servername, so it doesn't match the name used on the line that runs evince_vim_dbus.py.
I'm not familiar with gvim or its server functionality, but I'm guessing the evince_vim_dbus.py program connects to gvim using the given name, in which case it's going to fail since the server of the right name isn't running.
If that's not it, then maybe the problem is that subprocess.call() runs the given program and waits for it to exit, whereas in your original bash script you run evince with an ampersand, so bash does not wait for it. In that case evince_vim_dbus.py never runs at all until you exit Evince.
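In case the missing ampersands are indeed the problem, here is a sketch of a closer translation of the bash script: subprocess.Popen() starts a program and returns immediately (like a trailing & in the shell), while call() waits for it to finish.
#!/usr/bin/env python3
# Sketch: background the viewer and the dbus helper like the bash & did.
import os
import sys
from subprocess import call, Popen

inp_tex = sys.argv[1]
oup_pdf = os.path.splitext(inp_tex)[0] + ".pdf"

call(["gvim", "--servername", "servername", inp_tex])

if os.path.exists(oup_pdf):
    Popen(["evince", oup_pdf])  # like: evince ... &

Popen(["evince_vim_dbus.py", "GVIM", "servername", oup_pdf, inp_tex])  # like: ... &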

Related

Python psutil.wait raise timeout without good reason

I'm facing a strange situation, and I've searched on Google without any good results.
I'm running a Python script as a subprocess from a parent process, with nohup, using the subprocess package:
cmd = list()
cmd.append("nohup")
cmd.append(sys.executable)
cmd.append(os.path.abspath(script))
cmd.append(os.path.abspath(conf_path))
_env = os.environ.copy()
if env:
    _env.update({k: str(v) for k, v in env.items()})
p = subprocess.Popen(cmd, env=_env, cwd=os.getcwd())
After some time the parent process exits, while the subprocess (the one with nohup) continues to run.
After another minute or two the nohup process exits too and, for obvious reasons, becomes a zombie.
When running it on my local PC with Python 3.6 and Ubuntu 18.04, I can run the following code and everything works like a charm:
comp_process = psutil.Process(pid)
if comp_process.status() == "zombie":
    comp_status_code = comp_process.wait(timeout=10)
As I said, everything works like a charm: the zombie process is removed and I get the status code of the mentioned process.
But for some reason, when doing the same in a Docker container with the same Python version and Ubuntu version, it fails after the timeout (it doesn't matter whether it's 10 seconds or 10 minutes).
The error:
psutil.TimeoutExpired timeout after 60 seconds (pid=779)
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/psutil/_psposix.py", line 84, in wait_pid
    retpid, status = waitcall()
  File "/usr/local/lib/python3.6/dist-packages/psutil/_psposix.py", line 75, in waitcall
    return os.waitpid(pid, os.WNOHANG)
ChildProcessError: [Errno 10] No child processes

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File ".py", line 41, in run
    comp_status_code = comp_process.wait(timeout=60)
  File "/usr/local/lib/python3.6/dist-packages/psutil/__init__.py", line 1383, in wait
    return self._proc.wait(timeout)
  File "/usr/local/lib/python3.6/dist-packages/psutil/_pslinux.py", line 1517, in wrapper
    return fun(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/psutil/_pslinux.py", line 1725, in wait
    return _psposix.wait_pid(self.pid, timeout, self._name)
  File "/usr/local/lib/python3.6/dist-packages/psutil/_psposix.py", line 96, in wait_pid
    delay = check_timeout(delay)
  File "/usr/local/lib/python3.6/dist-packages/psutil/_psposix.py", line 68, in check_timeout
    raise TimeoutExpired(timeout, pid=pid, name=proc_name)
psutil.TimeoutExpired: psutil.TimeoutExpired timeout after 60 seconds (pid=779)
One possibility may be the lack of an init process to reap zombies. You can fix this by running with docker run --init, or using e.g. tini. See https://hynek.me/articles/docker-signals/
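For background, the reaping an init process does boils down to a loop around os.waitpid; here is a rough conceptual sketch (docker run --init / tini also forward signals, which this ignores):
import errno
import os

def reap_children():
    # Collect any exited children so they do not linger as zombies.
    # This is roughly the chore PID 1 has to do inside a container.
    while True:
        try:
            pid, _status = os.waitpid(-1, os.WNOHANG)
        except OSError as e:
            if e.errno == errno.ECHILD:  # no children at all
                return
            raise
        if pid == 0:  # children exist, but none have exited yet
            return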

Subprocess function python: Use in automation

I am trying to run this command in the terminal using Python:
./Pascal --set=settings/1_settings.txt --runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz
I need to run this script 180 times, using a different pval file every time. Hence, automating it via Python saves me a lot of time.
Currently I have a Python subprocess like this:
subprocess.call("./Pascal --set=settings/1_settings.txt
--runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz")
However, I am getting this error:
Traceback (most recent call last):
File "test_automation.py", line 4, in <module>
subprocess.call("./Pascal --set=settings/1_settings.txt --runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz")
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 522, in call
return Popen(*popenargs, **kwargs).wait()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 710, in __init__
errread, errwrite)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1335, in _execute_child
raise child_exception
The problem is that when I execute the exact same command in the terminal (outside of the Python code) it works fine. Am I using the syntax incorrectly?
Using os.system...
You could try using this to run your terminal command like so:
import os
os.system("./Pascal --set=settings/1_settings.txt --runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz")
And if this works, it is simple to execute in a for-loop to get it to run 180 times:
import os
for _ in range(180):
    os.system("./Pascal --set=settings/1_settings.txt --runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz")
Additionally, note that the ./ prefix is needed unless the directory containing Pascal is on your PATH -- the shell does not search the current working directory by default, so you can only shorten the command to just Pascal --set=settings... after adding that directory to PATH.
Using subprocess.call...
Personally, I think using os.system is a cleaner solution; however, you can use subprocess.call to do the same thing in one of two ways. Either:
call Pascal and pass the other arguments separately in a list, like:
import subprocess as s
s.call(['./Pascal', '--set=settings/1_settings.txt', '--runpathway=on', '--genescoring=sum', '--pval=1_snp_values.txt.gz'])
or just set shell=True and pass in the full command line as a string, like:
import subprocess as s
s.call('./Pascal --set=settings/1_settings.txt --runpathway=on --genescoring=sum --pval=1_snp_values.txt.gz', shell=True)
Hope this works for you!
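Since the goal is a different pval file on every run, here is a sketch of the loop using the list form of subprocess.call, assuming the settings and pval files are numbered 1 through 180 like the example command (adjust the naming pattern to your actual files):
import subprocess

for i in range(1, 181):
    # Assumed naming pattern, e.g. 1_settings.txt and 1_snp_values.txt.gz
    subprocess.call([
        './Pascal',
        '--set=settings/{}_settings.txt'.format(i),
        '--runpathway=on',
        '--genescoring=sum',
        '--pval={}_snp_values.txt.gz'.format(i),
    ])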

Python: pass ctrl-c to a process ran via os.spawnvpe

In my python script I have:
os.spawnvpe(os.P_WAIT, cmd[0], cmd, os.environ)
where cmd is something like ['mail', '-b', emails, ...], which allows me to run mail interactively and go back to the Python script after mail finishes.
The only problem is when I press Ctrl-C. It seems that "both mail and the python script react to it" (*), whereas I'd prefer that while mail is running, only mail reacts and no exception is raised by Python. Is it possible to achieve that?
(*) What happens exactly on the console is:
^C
(Interrupt -- one more to kill letter)
Traceback (most recent call last):
File "./tutster.py", line 104, in <module>
cmd(cmd_run)
File "./tutster.py", line 85, in cmd
code = os.spawnvpe(os.P_WAIT, cmd[0], cmd, os.environ)
File "/usr/lib/python3.4/os.py", line 868, in spawnvpe
return _spawnvef(mode, file, args, env, execvpe)
File "/usr/lib/python3.4/os.py", line 819, in _spawnvef
wpid, sts = waitpid(pid, 0)
KeyboardInterrupt
and then the mail is in fact sent (which is already bad, because the intention was to kill it), but the body is empty and the content is sent as an attachment with a .bin extension.
Wrap it in a try/except statement:
try:
    os.spawnvpe(os.P_WAIT, cmd[0], cmd, os.environ)
except KeyboardInterrupt:
    pass
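If you also want to keep the exit status from the original code = os.spawnvpe(...) call in your script, a minimal sketch combining the two (cmd and os as in your snippet):
try:
    code = os.spawnvpe(os.P_WAIT, cmd[0], cmd, os.environ)
except KeyboardInterrupt:
    # mail receives the same Ctrl-C (hence "Interrupt -- one more to
    # kill letter"); this only keeps Python from aborting as well.
    code = None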

Running powershell script within python script, how to make python print the powershell output while it is running

I am writing a Python script which checks various conditions and runs a PowerShell script accordingly, to help me automate migration from Windows XP to Windows 7. The PowerShell script prints its own output, giving the user updates on what is happening. I would like to take the output of the PowerShell script and print it as the output of the Python script. I have looked at some questions which seem to want the same thing, but they don't seem to work for me. Initially I tried using
import subprocess
subprocess.call(["C:\Users\gu2124\Desktop\helloworld.ps1"])
as was suggested here: Run PowerShell function from Python script. But I found out that this waits for the program to finish and does not give output, so I realized I need to use subprocess.Popen(), as was suggested here: Use Popen to execute a Powershell script in Python, how can I get the Powershell script's output and update it to web page? So I tried this
import subprocess
subprocess.Popen(["C:\Users\gu2124\Desktop\helloworld.ps1"], stdout=sys.stdout)
and I get this error
Traceback (most recent call last):
File "C:\Users\gu2124\Desktop\pstest.py", line 5, in <module>
subprocess.Popen(["C:\Users\gu2124\Desktop\helloworld.py1"], stdout=sys.stdout)
File "C:\Python27\lib\subprocess.py", line 701, in __init__
errread, errwrite), to_close = self._get_handles(stdin, stdout, stderr)
File "C:\Python27\lib\subprocess.py", line 848, in _get_handles
c2pwrite = msvcrt.get_osfhandle(stdout.fileno())
File "<string>", line 523, in __getattr__
File "C:\Program Files\PyScripter\Lib\rpyc.zip\rpyc\core\netref.py", line 150, in __getattr__
return syncreq(self, consts.HANDLE_GETATTR, name)
File "C:\Program Files\PyScripter\Lib\rpyc.zip\rpyc\core\netref.py", line 71, in syncreq
return conn.sync_request(handler, oid, *args)
File "C:\Program Files\PyScripter\Lib\rpyc.zip\rpyc\core\protocol.py", line 434, in sync_request
raise obj
AttributeError: DebugOutput instance has no attribute 'fileno'
I'm not completely sure what this means, but from what I think I understand after reading AttributeError: StringIO instance has no attribute 'fileno', it is because I am handling stdout incorrectly. I looked around more and found Why won't my python subprocess code work?, where the answers said to use stdout=subprocess.PIPE, so I tried this
import subprocess
subprocess.Popen(["C:\Users\gu2124\Desktop\helloworld.ps1"], stdout=subprocess.PIPE)
which also does not give me output
Finally I saw this http://www.pythonforbeginners.com/os/subprocess-for-system-administrators and changed my code to this
import subprocess
p = subprocess.Popen(["powershell","C:\Users\gu2124\Desktop\helloworld.ps1"], stdout=subprocess.PIPE)
print p.communicate
I thought that it may be because I am trying to run a PowerShell script from the command line, so I have to open powershell first. When I type these commands directly into the command line it works the way it should, but when I run it through the Python script it gives this
<bound method Popen.communicate of <subprocess.Popen object at 0x00000000026E4A90>>
which is an improvement, I guess, but not the "Hello World" I was expecting.
I have no idea what I should try next to get this to work. Any help would be greatly appreciated.
Also, in case the PowerShell script I am using is needed, here it is:
$strString = "Hello World"
write-host $strString
function ftest{
    $test = "Test"
    write-host $test
}
EDIT: I tried upgrading to Python 3.3 as was suggested in the first answer, but I still can't get it to work. I used the command p = subprocess.Popen(['powershell.exe', "C:\\Users\\gu2124\\Desktop\\helloworld.ps1"], stdout=sys.stdout) and am sure the file is there, but I am getting this error:
Traceback (most recent call last):
File "<pyshell#3>", line 1, in <module>
p = subprocess.Popen(['powershell.exe', "C:\\Users\\gu2124\\Desktop\\helloworld.ps1"], stdout=sys.stdout)
File "C:\Python27\lib\subprocess.py", line 701, in __init__
errread, errwrite), to_close = self._get_handles(stdin, stdout, stderr)
File "C:\Python27\lib\subprocess.py", line 848, in _get_handles
c2pwrite = msvcrt.get_osfhandle(stdout.fileno())
UnsupportedOperation: fileno
Make sure you can run powershell scripts (it is disabled by default). Likely you have already done this. http://technet.microsoft.com/en-us/library/ee176949.aspx
Set-ExecutionPolicy RemoteSigned
Run this Python script (save it as helloworld.py) against your PowerShell script:
# -*- coding: iso-8859-1 -*-
import subprocess, sys
p = subprocess.Popen(["powershell.exe",
                      "C:\\Users\\USER\\Desktop\\helloworld.ps1"],
                     stdout=sys.stdout)
p.communicate()
This code is based on python3.4 (or any 3.x series interpreter), though it should work on python2.x series as well.
C:\Users\MacEwin\Desktop>python helloworld.py
Hello World
I don't have Python 2.7 installed, but in Python 3.3 calling Popen with stdout set to sys.stdout worked just fine. Not before I had escaped the backslashes in the path, though.
>>> import subprocess
>>> import sys
>>> p = subprocess.Popen(['powershell.exe', 'C:\\Temp\\test.ps1'], stdout=sys.stdout)
>>> Hello World
_
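As an aside, a raw string is an equivalent way to avoid doubling the backslashes (same subprocess/sys imports and test.ps1 path as above):
p = subprocess.Popen(['powershell.exe', r'C:\Temp\test.ps1'], stdout=sys.stdout)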
In addition to the previous answers, I have some suggestions which makes your code more portable.
Instead of setting ExecutionPolicy globally to RemoteSigned (which imposes some security issues) you can use this to set it only for the PowerShell instance created by your Python script:
import subprocess, sys
p = subprocess.Popen('powershell.exe -ExecutionPolicy RemoteSigned -file "hello world.ps1"', stdout=sys.stdout)
p.communicate()
Note the quotes, which allow your PowerShell script's path/filename to contain spaces.
Furthermore, as shown in the above example, you can use a relative path to call your PowerShell script. The path is relative to your Python working directory.
This is how I get the output from Popen:
p = subprocess.Popen(["powershell","C:\Users\gu2124\Desktop\helloworld.ps1"], stdout=subprocess.PIPE)
p_out, p_err = p.communicate()
print(p_out)
From the docs on Popen.communicate(): the function returns a tuple (stdout_data, stderr_data).
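If the goal is to see the PowerShell output while the script is still running (rather than all at once when communicate() returns), you can read the pipe line by line instead; a minimal sketch using the same command as above (backslashes doubled):
import subprocess

p = subprocess.Popen(["powershell", "C:\\Users\\gu2124\\Desktop\\helloworld.ps1"],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                     universal_newlines=True)
# Print each line as soon as PowerShell emits it.
for line in iter(p.stdout.readline, ''):
    print(line.rstrip())
p.stdout.close()
p.wait()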

NameError: name 'buffer' is not defined with Ant Based framework batch file

I'm using a Python script to execute an Ant-based framework batch file (Helium.bat):
subprocess.Popen('hlm '+commands, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
However the script will always stop and display the following error when it executes the .bat file:
import codecs
File "C:\Python25\lib\codecs.py", line 1007, in <module>
strict_errors = lookup_error("strict")
File "C:\Python25\lib\codecs.py", line 1007, in <module>
strict_errors = lookup_error("strict")
File "C:\Python25\lib\encodings\__init__.py", line 31, in <module>
import codecs, types
File "C:\Python25\lib\types.py", line 36, in <module>
BufferType = buffer
NameError: name 'buffer' is not defined
If I execute the .bat file directly on the command line, there is no issue.
I think at least part of the problem is how you're executing the batch file. Give this a try:
# execute the batch file as a separate process and echo its output
Popen_kwargs = { 'stdout': subprocess.PIPE, 'stderr': subprocess.STDOUT,
                 'universal_newlines': True }
with subprocess.Popen('hlm '+commands, **Popen_kwargs).stdout as output:
    for line in output:
        print line,
This passes different arguments to Popen -- this version removes shell=True, which isn't needed for a batch file; sets stderr=subprocess.STDOUT, which redirects stderr to the same place stdout is going so no error messages are missed; and adds universal_newlines=True to make the output more readable.
Another difference is that it reads and prints the output from the Popen process, which effectively makes the Python script running the batch file wait until it has finished executing before continuing on -- which I suspect is important.
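A variant of the same idea (a sketch, reusing the 'hlm ' + commands string from the original snippet) that keeps the Popen object around so the batch file's exit status can be checked once its output has been consumed:
import subprocess

proc = subprocess.Popen('hlm ' + commands, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
for line in proc.stdout:
    print line,            # same Python 2 print as above
proc.stdout.close()
rc = proc.wait()
print 'hlm exited with status', rc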
