I am writing a Python program that launches a subprocess (using Popen).
I am reading the stdout of the subprocess, doing some filtering, and writing to
the stdout of the main process.
When I kill the main process (Ctrl-C), the subprocess keeps running.
How do I kill the subprocess too? The subprocess is likely to run for a long time.
Context:
I'm launching only one subprocess at a time, I'm filtering its stdout.
The user might decide to interrupt to try something else.
I'm new to Python and I'm using Windows, so please be gentle.
Windows doesn't have POSIX-style signals, so the signal module is of limited use there. However, you can still catch the KeyboardInterrupt exception that is raised when Ctrl-C is pressed.
Something like this should get you going:
import subprocess

try:
    child = subprocess.Popen(blah)  # "blah" stands for your command and arguments
    child.wait()                    # block until the child exits
except KeyboardInterrupt:
    child.terminate()               # Ctrl-C in the parent: stop the child too
subprocess.Popen objects come with kill and terminate methods (they differ in which signal is sent to the process).
signal.signal allows you to install signal handlers, in which you can call the child's kill method.
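Putting the two together, here is a minimal sketch; the child command and the choice of SIGTERM are my own assumptions, and note that on Windows the signal module only supports a handful of signals:

import signal
import subprocess
import sys

child = subprocess.Popen(["python", "long_run.py"])  # hypothetical child command

def handle_sigterm(signum, frame):
    child.kill()   # stop the child before the parent exits
    sys.exit(1)

signal.signal(signal.SIGTERM, handle_sigterm)
child.wait()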
You can use the Python atexit module.
For example:
import atexit

def killSubprocess():
    mySubprocess.kill()   # mySubprocess is the Popen object created earlier

atexit.register(killSubprocess)
Related
How can I run a command from a python script and delegate to it signals like Ctrl+C?
I mean, when I run e.g.:
from subprocess import call
call(["child_proc"])
I want child_proc to handle Ctrl+C
I'm guessing that your problem is that you want the subprocess to receive Ctrl-C and not have the parent Python process terminate? If your child process initialises its own signal handler for Ctrl-C (SIGINT) then this might do the trick:
import signal, subprocess
old_action = signal.signal(signal.SIGINT, signal.SIG_IGN)
subprocess.call(['less', '/etc/passwd'])
signal.signal(signal.SIGINT, old_action) # restore original signal handler
Now you can hit Ctrl-C (which generates SIGINT), Python will ignore it but less will still see it.
However this only works if the child sets its signal handlers up properly (otherwise these are inherited from the parent).
How can I hook code into a Python process so that it sends an alert in case of a shutdown or abrupt termination of the process?
Use a supervisor daemon (e.g., Supervisor).
It's not clear what exactly you mean. Shutdown/abort of the process itself? Or of a child process?
Shutdown/abort of the process itself: Have a look at Python's atexit module; there you can register a callback for when your program exits cleanly. But there is absolutely no way to catch all circumstances: if your program fails because of a serious issue (e.g. a segfault), your atexit handlers will never get called. You need a supervising process to catch absolutely all aborts.
Shutdown/abort of a child process: If you use e.g. the subprocess module, you can simply call poll() or wait() on Popen objects to see if the spawned process is dead / wait for it to die. For a more advanced implementation, use Python's signal module to set a handler for SIGCHLD; this signal is sent to your process whenever one of its child processes terminates.
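As a rough sketch of the poll() approach (the alert function and child command here are placeholders, not part of the original answer):

import subprocess
import time

def send_alert(message):        # placeholder: wire this to mail, logging, etc.
    print("ALERT: " + message)

child = subprocess.Popen(["python", "long_run.py"])  # placeholder child

while child.poll() is None:     # poll() returns None while the child is alive
    time.sleep(1)               # or do useful work between checks

if child.returncode != 0:       # nonzero usually means an abnormal termination
    send_alert("child exited with code %d" % child.returncode)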
I have a python web application that needs to launch a long running process. The catch is I don't want it to wait around for the process to finish. Just launch and finish.
I'm running on windows XP, and the web app is running under IIS (if that matters).
So far I tried Popen, but that didn't seem to work: it waited until the child process finished.
Ok, I finally figured this out! This seems to work:
from subprocess import Popen
from win32process import DETACHED_PROCESS

pid = Popen([r"C:\python24\python.exe", "long_run.py"],
            creationflags=DETACHED_PROCESS, shell=True).pid
print pid
print 'done'
# I can now close the console or anything I want and long_run.py continues!
Note: I added shell=True. Otherwise calling print in the child process gave me the error "IOError: [Errno 9] Bad file descriptor"
DETACHED_PROCESS is a Process Creation Flag that is passed to the underlying WINAPI CreateProcess function.
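For what it's worth, on Python 3.7+ the same flag is exposed directly as subprocess.DETACHED_PROCESS, so win32process isn't needed; redirecting the child's standard streams should also avoid the "Bad file descriptor" error without resorting to shell=True. A hedged sketch (the child script is a placeholder):

import subprocess

proc = subprocess.Popen(
    ["python", "long_run.py"],                  # hypothetical child script
    creationflags=subprocess.DETACHED_PROCESS,  # Windows-only flag
    stdin=subprocess.DEVNULL,                   # detach the standard streams so
    stdout=subprocess.DEVNULL,                  # the child never touches the
    stderr=subprocess.DEVNULL,                  # parent's (possibly gone) console
)
print(proc.pid)  # the parent can exit now; the child keeps running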
Instead of directly starting processes from your webapp, you could write jobs into a message queue. A separate service reads from the message queue and runs the jobs. Have a look at Celery, a Distributed Task Queue written in Python.
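A minimal sketch of that pattern with Celery (the broker URL and task body are placeholders):

# tasks.py -- start a worker with: celery -A tasks worker
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")  # placeholder broker

@app.task
def long_run(arg):
    pass  # the long-running job body goes here

# In the web app, enqueue the job and return immediately:
# long_run.delay("some-arg")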
This almost works (from here):
from subprocess import Popen

pid = Popen([r"C:\python24\python.exe", "long_run.py"]).pid
print pid
print 'done'
'done' will get printed right away. The problem is that the parent process keeps running until long_run.py returns, and if I close the parent it kills long_run.py's process.
Surely there is some way to make a process completely independent of the parent process.
subprocess.Popen does that.
In Python, is it possible to do a non-blocking system call without forking off a thread? i.e., can I avoid:
import os
import thread

thread.start_new_thread(os.system, ('cmd',))
Use the subprocess module (Popen) and have the result written to a file. You can either "wait" for the subprocess to terminate or proceed with other business and poll for the result in the file etc.
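A rough sketch of that, with a placeholder command and output file:

import subprocess

out = open("result.txt", "w")                     # placeholder output file
proc = subprocess.Popen(["some_command", "arg"],  # placeholder command
                        stdout=out)

# ... proceed with other business ...

if proc.poll() is None:   # None means the child is still running
    print("still running")
else:
    out.close()
    print("finished with code %d" % proc.returncode)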
I create a subprocess using subprocess.Popen() that runs for a long time. It is called from its own thread, and the thread is blocked until the subprocess completes/returns.
I want to be able to interrupt the subprocess so the process terminates when I want.
Any ideas?
I think you're looking for the Popen.terminate or .kill methods. They were added in Python 2.6.
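A sketch of how that can look when the child is started from its own thread (the child command is a placeholder):

import subprocess
import threading

proc = None

def run_child():
    global proc
    proc = subprocess.Popen(["python", "long_run.py"])  # placeholder child
    proc.wait()   # this thread blocks until the child exits

worker = threading.Thread(target=run_child)
worker.start()

# Later, from the main thread, stop the child whenever you want:
# proc.terminate()   # or proc.kill() for a harder stop
# worker.join()      # wait() then returns and the thread finishes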