I am trying to automate the installation of a specific program using Sikuli and scripts on Windows 7. I needed to start the program installer and then use Sikuli to step through the rest of the installation. I did this using Python 2.7.
This code works as expected by creating a thread, calling the subprocess, and then continuing the main process:
import subprocess
from threading import Thread

class Installer(Thread):
    def __init__(self):
        Thread.__init__(self)

    def run(self):
        subprocess.Popen(["msiexec", "/i", "c:\path\to\installer.msi"], shell=True)

i = Installer()
i.run()
print "Will show up while installer is running."
print "Other things happen"
i.join()
This code does not operate as desired. It will start the installer but then hang:
import subprocess
from threading import Thread

class Installer(Thread):
    def __init__(self):
        Thread.__init__(self)

    def run(self):
        subprocess.call("msiexec /i c:\path\to\installer.msi")

i = Installer()
i.run()
print "Will not show up while installer is running."
print "Other things happen"
i.join()
I understand that subprocess.call will wait for the process to terminate. Why does that prevent the main thread from continuing? Shouldn't the main thread continue execution immediately after the process call?
Why is there such a difference in behavior?
I have only just recently started using threads.
You're calling i.run(), but what you should be calling is i.start(). start() invokes run() in a separate thread, but calling run() directly will execute it in the main thread.
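As a quick illustration of that fix, here is the first snippet reworked to call start() — a minimal sketch that otherwise keeps the question's code as posted (the .msi path is still just a placeholder):

import subprocess
from threading import Thread

class Installer(Thread):
    def run(self):
        # same call as in the question; a raw string avoids backslash escapes in the path
        subprocess.Popen(["msiexec", "/i", r"c:\path\to\installer.msi"], shell=True)

i = Installer()
i.start()   # start() runs run() in a new thread; calling run() would execute it here
print "Will show up while installer is running."
print "Other things happen"
i.join()    # wait for the worker thread to finish before exiting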
First.
You need to add command line parameters to your install command to make it a silent install:
http://msdn.microsoft.com/en-us/library/aa372024%28v=vs.85%29.aspx
The subprocess is probably hung waiting on an install process that will never end because it is waiting for user input.
Second.
If that doesn't work, you should be using Popen and communicate (see the sketch after this list):
How to use subprocess popen Python
Third.
If that still doesn't work, your installer is hanging somewhere and you should debug the underlying process there.
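Combining the first two suggestions, a minimal sketch (the .msi path is only the placeholder from the question): /qn is msiexec's switch for a fully unattended install, and communicate() waits for the process while draining its pipes so the call cannot block on output.

import subprocess

# /qn requests a fully silent install; the path is only a placeholder.
cmd = ["msiexec", "/i", r"c:\path\to\installer.msi", "/qn"]
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()   # waits for msiexec and collects any output
print "msiexec exited with code", p.returncode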
I have a script that is supposed to run 24/7 unless interrupted. This script is script A.
I want script A to call Script B, and have script A exit while B is running. Is this possible?
This is what I thought would work
#script_A.py
while(1):
    do some stuff
    do even more stuff
    if true:
        os.system("python script_B.py")
        sys.exit(0)
#script_B.py
time.sleep(some_time)
do something
os.system("python script_A.py")
sys.exit(0)
But it seems as if A doesn't actually exit until B has finished executing (which is not what I want to happen).
Is there another way to do this?
What you are describing sounds a lot like a function call:
def doScriptB():
    # do some stuff
    # do some more stuff

def doScriptA():
    while True:
        # do some stuff
        if Your Condition:
            doScriptB()
            return

while True:
    doScriptA()
If this is insufficient for you, then you have to detach the process from your Python process. This normally involves spawning the process in the background, which in bash is done by appending an ampersand to the command:
yes 'This is a background process' &
And detaching said process from the current shell, which, in a simple C program, is done by forking the process twice. I don't know how to do this in Python off-hand, but I would bet there is a module for it (see the sketch below).
This way, when the calling Python process exits, it won't terminate the spawned child, since the child is now independent.
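For what it's worth, the double-fork idiom can also be written directly in Python with the os module. This is a POSIX-only sketch, and the argv value is purely illustrative:

import os

def spawn_detached(argv):
    # POSIX double fork: the grandchild is re-parented to init and
    # survives after the calling process exits.
    pid = os.fork()
    if pid > 0:
        os.waitpid(pid, 0)      # reap the intermediate child right away
        return
    os.setsid()                 # new session, no controlling terminal
    if os.fork() > 0:
        os._exit(0)             # intermediate child exits immediately
    os.execvp(argv[0], argv)    # grandchild becomes the target program

spawn_detached(["python", "script_B.py"])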
It seems you want to detach the system call into a separate process so that script A does not wait for it.
script_A.py
import subprocess
import sys

while(1):
    do some stuff
    do even more stuff
    if true:
        subprocess.Popen([sys.executable, "script_B.py"])  # launch script_B without waiting for it
        sys.exit(0)
Anyway, it does not seem like good practice at all. Why not have script A watch the process list and exit if it finds script B already running? This is another example of how you could do it:
import subprocess
import sys
import psutil

while(1):
    # This section queries the currently running processes; check the command line,
    # since the process name itself will be the Python interpreter
    for proc in psutil.process_iter():
        pinfo = proc.as_dict(attrs=['pid', 'name', 'cmdline'])
        if "script_B.py" in (pinfo['cmdline'] or []):
            sys.exit(0)
    do some stuff
    do even more stuff
    if true:
        subprocess.Popen([sys.executable, "script_B.py"])  # launch script_B without waiting for it
        sys.exit(0)
I am trying to run the Robocopy command (but I am curious about any subprocess) from Python on Windows. The code is pretty simple and works well. It is:
from subprocess import Popen, PIPE

def copy(media_path, destination_path, log_path):
    with Popen(['Robocopy', media_path, destination_path, '/E', '/mir', '/TEE', '/log+:' + log_path],
               stdout=PIPE, bufsize=1, universal_newlines=True) as Robocopy:
        Robocopy.wait()
        returncode = Robocopy.returncode
Additionally I am running it in a separate thread with the following:
threading.Thread(target=copy, args=(media_path, destination_path, log_path,), daemon=True)
However, there are certain instances where I want to stop the Robocopy (akin to closing the CMD window if it were run from the command line).
Is there a good way to do this in Python?
We fought with reliably killing subprocesses on Windows for a while and eventually came across this:
https://github.com/andreisavu/python-process/blob/master/killableprocess.py
It implements a kill() method for killing your subprocess. We've had really good results with it.
You will need to somehow pass the process object out of the thread and call kill() from another thread, or poll in your thread with wait() using a timeout while monitoring some kind of global-ish flag.
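As a sketch of that second option (the stop_requested flag and the ten-second poll interval are my own additions, and wait(timeout=...) needs Python 3.3+): the copying thread polls Robocopy and kills it once the flag is set.

import subprocess
import threading

stop_requested = threading.Event()   # set from the main thread to cancel the copy

def copy(media_path, destination_path, log_path):
    proc = subprocess.Popen(['Robocopy', media_path, destination_path,
                             '/E', '/TEE', '/log+:' + log_path])
    while True:
        try:
            proc.wait(timeout=10)    # give Robocopy ten seconds at a time
            break                    # it finished on its own
        except subprocess.TimeoutExpired:
            if stop_requested.is_set():
                proc.kill()          # roughly equivalent to closing its console window
                proc.wait()
                break
    return proc.returncode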
If the process doesn't start other processes then process.kill() should work:
import subprocess

class InterruptableProcess:
    def __init__(self, *args):
        self._process = subprocess.Popen(args)

    def interrupt(self):
        self._process.kill()
I don't see why you would need it on Windows, but you could run Thread(target=self._process.wait, daemon=True).start() if you'd like.
If there is a possibility that the process may start other processes in turn, then you might need a Job object to kill all the descendant processes. It seems killableprocess.py, which is suggested by @rrauenza, uses this approach (I haven't tested it). See Python: how to kill child process(es) when parent dies?
I'm executing a function as a thread in Python. Right now, the program waits for the function to finish and only then terminates.
My goal is to start the background thread and then close the program that called it.
How can we do that? In the code below, the thread will take 30 minutes to execute. I want to stop the main program after starting the thread and let the thread keep running in the background.
thread = threading.Thread(target=function_that_runs_for_30_min)
thread.start()
print "Thread Started"
quit()
You cannot do that directly. A thread is just a part of a process. Once the process exits, all the threads are gone. You need to create a background process to achieve that.
You cannot use the multiprocessing module either, because it is a package that supports spawning processes using an API similar to the threading module (emphasis mine). As such, it has no provision to allow a process to keep running after the end of the calling one.
The only way I can imagine is to use the subprocess module to restart the script with a specific parameter. For a simple use case, adding a parameter is enough; for more complex command line parameters, the argparse module should be used. Example code:
import subprocess
import sys
# only to wait some time...
import time

def f(name):
    "Function that could run in background for a long time (30')"
    time.sleep(5)
    print 'hello', name

if __name__ == '__main__':
    if (len(sys.argv) > 1) and (sys.argv[1] == 'SUB'):
        # Should be an internal execution: start the lengthy function
        f('bar')
    else:
        # normal execution: start a subprocess with same script to launch the function
        p = subprocess.Popen("%s %s SUB" % (sys.executable, sys.argv[0]))
        # other processing...
        print 'END of normal process'
Execution:
C:\>python foo.py
END of normal process
C:\>
and five seconds later:
hello bar
I am running Python 2.7 on Ubuntu in Eclipse.
I am trying to call subprocess.Popen from a thread other than the main thread.
When I run this code from Eclipse:
#lsbt.py
import subprocess
import threading
import time

class someThread(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)

    def run(self):
        p = subprocess.Popen(["ls", "/usr"], stdout=subprocess.PIPE)
        out = p.communicate()
        print "Done" + out[0]

def main():
    test = someThread()
    test.daemon = True
    test.start()
    while True:
        time.sleep(3600)
The whole python program seems to exit at the subprocess.Popen() line.
Here is what eclipse says the call stack looks like:
<terminated>lsbt_1 lsbt.py [Python Run]
<terminated>lsbt.py
lsbt.py
<terminated, exit value: 137>lsbt.py
All debugging seems to stop as well and nothing is printed to the console.
When I run the subprocess code from the main thread in Eclipse, it seems to work well.
It does not seem to matter what command subprocess.Popen runs; the only thing that seems to matter is that it is not being run from the main thread.
When I run the python code from the terminal, it works.
Could it be a problem with Eclipse?
#aabarnert commented: IIRC, errno 137 on linux is ENOTTY
One way to do it is to set:
daemon = False
I'm not sure why this works for Eclipse, but it does.
From Python Documentation:
A thread can be flagged as a “daemon thread”. The significance of this flag is that the entire Python program exits when only daemon threads are left. The initial value is inherited from the creating thread. The flag can be set through the daemon property
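Applied to the question's code, that is a one-line change in main() — a sketch, assuming the rest of lsbt.py stays as posted:

def main():
    test = someThread()
    test.daemon = False   # non-daemon: the interpreter waits for this thread
    test.start()
    test.join()           # or keep the original sleep loop; join() waits explicitly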
Contents of check.py:
from multiprocessing import Process
import time
import sys

def slp():
    time.sleep(30)
    f = open("yeah.txt", "w")
    f.close()

if __name__ == "__main__":
    x = Process(target=slp)
    x.start()
    sys.exit()
On Windows 7, from cmd, if I call python check.py it doesn't exit immediately but instead waits for 30 seconds. And if I kill cmd, the child dies too: no "yeah.txt" is created.
How do I ensure the child continues to run even if the parent is killed, and also that the parent doesn't wait for the child process to end?
What you seem to want is running your script as a background process. The solution in How to start a background process in Python? should do; you will have to specify some command line parameter that tells your script to go into slp rather than spawning a new process.
Have a look at the subprocess module instead.
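A minimal sketch of that idea on Windows, following the self-restart pattern from the earlier answer (the DETACHED_PROCESS value and the "SUB" marker argument are my own assumptions): check.py relaunches itself as a detached process, so killing the cmd window no longer takes the child down, and the parent exits immediately.

import subprocess
import sys
import time

DETACHED_PROCESS = 0x00000008   # Win32 creation flag: child gets no console and is not tied to cmd

def slp():
    time.sleep(30)
    open("yeah.txt", "w").close()

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "SUB":
        slp()                    # detached child: do the long-running work here
    else:
        # relaunch this script detached, then exit right away
        subprocess.Popen([sys.executable, sys.argv[0], "SUB"],
                         creationflags=DETACHED_PROCESS)
        sys.exit()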