I am trying to start a java process that is meant to take a long time, using python's subprocess module.
What I am actually doing is using the multiprocessing module to start a new Process, and using that process, use subprocess module to run java -jar.
This works fine, but when I start the new process, the java process replaces the Python process that multiprocessing.Process started. I would like java to run as a child process, so that when the process that created the multiprocessing.Process dies, the process running java dies too.
Is this possible?
Thanks.
Edit: here's some code to clarify my question:
from subprocess import Popen
from multiprocessing import Process

def run_task():
    pargs = ["java", "-jar", "app.jar"]  # arguments must be split into a list
    p = Popen(pargs)
    p.communicate()
    return p

while True:
    a = a_blocking_call()
    process = Process(target=run_task)
    process.start()
    if not a:
        break
I want the process running run_task to be killed along with the process running java when the process executing the while loop reaches the break line. Is this possible?
I think you should show some code; it's not clear how you are using subprocess and multiprocessing together.
From the documentation it looks like subprocess should spawn a new process, not replace your Process-started one. Are you sure that isn't what's happening? A test case demonstrating the replacement would be good.
You may get some hints from Detach a subprocess started using python multiprocessing module
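One way to get the behavior asked for, sketched here with a hypothetical `sleep 60` standing in for `java -jar app.jar`: keep the Popen handles in the parent and terminate them explicitly when the loop breaks.

```python
import subprocess

# Keep a handle to every worker we start so the parent can clean them up.
workers = []

def run_task(cmd):
    # Start the long-running command as a direct child of this process
    # and remember the handle instead of blocking on communicate().
    p = subprocess.Popen(cmd)
    workers.append(p)
    return p

# Hypothetical stand-in for the java invocation from the question.
run_task(["sleep", "60"])

# When the loop decides to stop (the `break` in the question),
# terminate every child explicitly before exiting.
for p in workers:
    p.terminate()
    p.wait()  # reap the child so it does not linger as a zombie
```

This does not make the children die automatically if the parent is killed with SIGKILL; it only covers an orderly shutdown.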
I am new to multiprocessing in Python. Consider the following function:
def do_something_parallel(self):
    result_operation1 = doit.main(A, B)
    do_something_else(C)
The point is that I want doit.main to run in another process and to be non-blocking, so that the code in do_something_else runs immediately after the first call has been launched in the other process.
How can I do it using python subprocess module?
Is there a difference between spawning a subprocess and creating a new process alongside the current one? Why would we need child processes of another process?
Note: I do not want to use a multithreaded approach here.
EDIT: I wondered whether using the subprocess module and the multiprocessing module in the same function is prohibited?
The reason I want this is that I have two things to run: first an exe file, and second a function; each needs its own process.
If you want to run Python code in a separate process, you could use the multiprocessing module:
import multiprocessing

if __name__ == "__main__":
    multiprocessing.Process(target=doit.main, args=[A, B]).start()
    do_something_else()  # this runs immediately without waiting for main() to return
I wondered whether using the subprocess module and the multiprocessing module in the same function is prohibited?
No. You can use both subprocess and multiprocessing in the same function (moreover, multiprocessing may use subprocess to start its worker processes internally).
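To illustrate, here is a minimal sketch with both modules in one function; `echo` and the built-in `sum` are stand-ins for the exe file and the Python function mentioned in the question:

```python
import multiprocessing
import subprocess

def launch_both():
    # subprocess runs the external command in its own OS process.
    proc = subprocess.Popen(["echo", "external program"],
                            stdout=subprocess.DEVNULL)
    # multiprocessing runs a Python callable in a separate interpreter process.
    worker = multiprocessing.Process(target=sum, args=([1, 2, 3],))
    worker.start()
    proc.wait()    # both were already running concurrently at this point
    worker.join()
    return proc.returncode, worker.exitcode

if __name__ == "__main__":
    print(launch_both())  # → (0, 0)
```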
The reason I want this is that I have two things to run: first an exe file, and second a function; each needs its own process.
You don't need multiprocessing to run an external command without blocking (it runs in its own process anyway); subprocess.Popen() is enough:
import subprocess
p = subprocess.Popen(['command', 'arg 1', 'arg 2'])
do_something_else() # this runs immediately without waiting for command to exit
p.wait() # this waits for the command to finish
subprocess.Popen is definitely what you want if the "worker" process is an executable. Threading is what you need when things have to happen asynchronously, and multiprocessing is what you need to take advantage of multiple cores for improved performance (although you will likely find yourself also using threads to handle the asynchronous output of multiple parallel processes).
The main limitation of multiprocessing is passing information. When a new process is spawned, an entire separate instance of the Python interpreter is started with its own independent memory allocation. As a result, variables changed by one process are not changed for the other processes; for that you need the shared-memory objects also provided by the multiprocessing module.

One implementation I have done was a parent process that started several worker processes and passed each of them an input queue and an output queue. The function given to the child processes was a loop that did some calculations on inputs pulled from the input queue and pushed the results to the output queue. I then designated a special input that the child would recognize as a signal to end the loop and terminate the process.
On your edit - Popen will start the other process in parallel, as will multiprocessing. If you need the child process to communicate with the executable, be sure to pass the file stream handles to the child process somehow.
I have a Python script that runs another Python program and then gathers results from the logs. The only problem is that I want it to run for a limited number of seconds, so I want to kill the process after, say, 1 minute.
How can I do this?
I'm running an external program with the command os.system("./test.py")
You need more control over your child process than os.system allows for; subprocess, especially Popen and Popen objects, gives you enough control for managing child processes. For a timer, see threading.Timer in the standard library.
Check out the psutil module. It provides a cross-platform interface to retrieving information on all running processes, and allows you to kill processes also. (It can do more, but that's all you should need!)
Here's the basic idea of how you could use it:
import subprocess
import time

import psutil

# Launch './test.py' with Popen instead of os.system: os.system would block
# until the script finished, and it gives no handle on the child's PID.
proc = subprocess.Popen(['./test.py'])

time.sleep(60)

p = psutil.Process(proc.pid)
p.kill()
#!/usr/bin/env python
import os
import subprocess
import time

# 'yes' prints forever; discard its output.
process = subprocess.Popen(['yes'], stdout=open(os.devnull, 'w'))
time.sleep(60)
process.terminate()
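On Python 3.3+ you can avoid the fixed sleep entirely: Popen.wait(timeout=...) returns as soon as the child exits and raises TimeoutExpired if the deadline passes (a 1-second deadline here for brevity; use 60 for the question's case):

```python
import subprocess

process = subprocess.Popen(["sleep", "60"])
try:
    # Returns immediately if the child finishes before the deadline.
    process.wait(timeout=1)
except subprocess.TimeoutExpired:
    process.kill()   # deadline passed: kill the child...
    process.wait()   # ...and reap it
print(process.returncode)
```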
My application creates subprocesses. Usually, these processes run and terminate without any problems. However, sometimes they crash.
I am currently using the python subprocess module to create these subprocesses. I check if a subprocess crashed by invoking the Popen.poll() method. Unfortunately, since my debugger is activated at the time of a crash, polling doesn't return the expected output.
I'd like to be able to see the debugging window (not terminate it) and still be able to detect from the Python code whether a process has crashed.
Is there a way to do this?
When your debugger opens, the process isn't finished yet, and subprocess only knows whether a process is running or finished. So no, there is no way to do this via subprocess.
I found a workaround for this problem. I used the solution given in another question Can the "Application Error" dialog box be disabled?
Items of consideration:
subprocess.check_output() for your child processes' return codes
psutil for process & child analysis (and much more)
the threading library, to monitor these child states in your script as well, once you've decided how you want to handle the crashing
import psutil

# You can find your process id in various ways of your choosing.
myprocess = psutil.Process(process_id)
for child in myprocess.children():
    print("Status of child process is: {0}".format(child.status()))
You can also use the threading library to load your subprocess into a separate thread, and then perform the above psutil analyses concurrently with your other process.
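A small sketch of that threading idea; the Unix false command stands in for a child that exits abnormally with a nonzero code:

```python
import subprocess
import threading

status = {}

def monitor(cmd, key):
    # Run the child and record its exit code when it finishes,
    # without blocking the main thread.
    proc = subprocess.Popen(cmd)
    status[key] = proc.wait()

t = threading.Thread(target=monitor, args=(["false"], "job1"))
t.start()
# ... the main thread stays free to do other work here ...
t.join()
print(status)  # → {'job1': 1}
```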
If you find more, let me know; it's no coincidence I've found this post.
I have a python web application that needs to launch a long running process. The catch is I don't want it to wait around for the process to finish. Just launch and finish.
I'm running on windows XP, and the web app is running under IIS (if that matters).
So far I tried Popen, but that didn't seem to work: it waited until the child process finished.
Ok, I finally figured this out! This seems to work:
from subprocess import Popen
from win32process import DETACHED_PROCESS

pid = Popen(["C:\\python24\\python.exe", "long_run.py"],
            creationflags=DETACHED_PROCESS,
            shell=True).pid
print pid
print 'done'
# I can now close the console or anything I want and long_run.py continues!
Note: I added shell=True. Otherwise calling print in the child process gave me the error "IOError: [Errno 9] Bad file descriptor"
DETACHED_PROCESS is a Process Creation Flag that is passed to the underlying WINAPI CreateProcess function.
Instead of directly starting processes from your webapp, you could write jobs into a message queue. A separate service reads from the message queue and runs the jobs. Have a look at Celery, a Distributed Task Queue written in Python.
This almost works (from here):
from subprocess import Popen

pid = Popen(["C:\\python24\\python.exe", "long_run.py"]).pid
print pid
print 'done'
'done' will get printed right away. The problem is that the process above keeps running until long_run.py returns and if I close the process it kills long_run.py's process.
Surely there is some way to make a process completely independent of the parent process.
subprocess.Popen does that.
I create a subprocess using subprocess.Popen() that runs for a long time. It is called from its own thread, and the thread is blocked until the subprocess completes/returns.
I want to be able to interrupt the subprocess so the process terminates when I want.
Any ideas?
I think you're looking for the Popen.terminate or Popen.kill methods. They were added in Python 2.6.
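For example (a Python 3 sketch; sleep 60 stands in for the long-running command, and the half-second delay is just for illustration):

```python
import subprocess
import threading
import time

proc = subprocess.Popen(["sleep", "60"])

def interrupt_later(p, delay):
    # A second thread decides when the subprocess should die.
    time.sleep(delay)
    p.terminate()        # or p.kill() for a harder stop

threading.Thread(target=interrupt_later, args=(proc, 0.5)).start()
proc.wait()  # the thread blocked here wakes up as soon as terminate() fires
print(proc.returncode)   # negative signal number on POSIX
```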