In Python, how can I do a non-blocking system call?

In Python, is it possible to do a non-blocking system call without forking off a thread? i.e., can I avoid:
import os
import thread

thread.start_new_thread(os.system, ('cmd',))

Use the subprocess module (Popen) and have the output written to a file. You can either wait() for the subprocess to terminate, or carry on with other work and poll the file for the result.
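A minimal sketch of that approach (the command name and output file are placeholders, not from the question):

import subprocess

# Placeholder command and output file, for illustration only
with open('output.txt', 'w') as outfile:
    proc = subprocess.Popen(['some_cmd', 'arg'], stdout=outfile)

# ... do other work here while the command runs ...

if proc.poll() is None:        # None means the child is still running
    print('still running')
else:
    print('finished, return code:', proc.returncode)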

Related

Using subprocess module to work in parallel (Multi-process)

New to multiprocessing in Python, consider that you have the following function:

def do_something_parallel(self):
    result_operation1 = doit.main(A, B)
    do_something_else(C)

Now the point is that I want doit.main to run in another process and to be non-blocking, so the code in do_something_else will run immediately after the first call has been launched in another process.
How can I do it using the Python subprocess module?
Is there a difference between launching a subprocess and creating a new process alongside the current one? Why would we need child processes of another process at all?
Note: I do not want to use a multithreaded approach here.
EDIT: I wondered whether using the subprocess module and the multiprocessing module in the same function is prohibited?
Reason I want this is that I have two things to run: first an exe file, and second a function, and each needs its own process.
If you want to run Python code in a separate process, you could use the multiprocessing module:

import multiprocessing

if __name__ == "__main__":
    multiprocessing.Process(target=doit.main, args=[A, B]).start()
    do_something_else()  # this runs immediately without waiting for main() to return
I wondered whether using the subprocess module and the multiprocessing module in the same function is prohibited?
No. You can use both subprocess and multiprocessing in the same function (moreover, multiprocessing may use subprocess to start its worker processes internally).
Reason I want this is that I have two things to run: first an exe file, and second a function, and each needs its own process.
You don't need multiprocessing to run an external command without blocking (it obviously runs in its own process); subprocess.Popen() is enough:
import subprocess
p = subprocess.Popen(['command', 'arg 1', 'arg 2'])
do_something_else() # this runs immediately without waiting for command to exit
p.wait() # this waits for the command to finish
subprocess.Popen is definitely what you want if the "worker" process is an executable. Threading is what you need when you need things to happen asynchronously, and multiprocessing is what you need if you want to take advantage of multiple cores for improved performance (although you will likely find yourself also using threads at the same time, to handle the asynchronous output of multiple parallel processes).

The main limitation of multiprocessing is passing information. When a new process is spawned, an entirely separate instance of the Python interpreter is started, with its own independent memory allocation. As a result, variables changed by one process are not changed for the other processes. For that you need shared-memory objects (also provided by the multiprocessing module). One implementation I have done was a parent process that started several worker processes and passed each of them an input queue and an output queue. The function given to the child processes was a loop that did some calculations on inputs pulled from the input queue and put the results on the output queue. I then designated a special input value that a child would recognize as the signal to end the loop and terminate the process; a sketch of that pattern is below.
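A minimal sketch of that worker pattern, assuming a None sentinel and a placeholder calculation (both my choices, not from the original answer):

import multiprocessing

def worker(in_queue, out_queue):
    # Loop until the sentinel (None) arrives, then exit
    for item in iter(in_queue.get, None):
        out_queue.put(item * item)  # placeholder calculation

if __name__ == '__main__':
    in_q = multiprocessing.Queue()
    out_q = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=worker, args=(in_q, out_q))
               for _ in range(4)]
    for w in workers:
        w.start()
    for n in range(10):
        in_q.put(n)
    for _ in workers:
        in_q.put(None)          # one sentinel per worker
    results = [out_q.get() for _ in range(10)]
    for w in workers:
        w.join()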
On your edit - Popen will start the other process in parallel, as will multiprocessing. If you need the child process to communicate with the executable, be sure to pass the file stream handles to the child process somehow.

Opening and communicating with a subprocess

I have a subprocess that I use. I must be able to asynchronously read from and write to this process via its stdout and stdin.
How can I do this? I have looked at subprocess, but the communicate method waits for process termination (which is not what I want), and the subprocess.stdout.read method can block.
The subprocess is not a Python script, but it can be edited if absolutely necessary. In total I will have around 15 of these subprocesses.
Have a look at how communicate is implemented.
There are essentially 2 ways to do it:
either use select() and be notified when you can read or write,
or delegate the reads and writes, each of which can block, to its own thread (a sketch of this approach is below).
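A minimal sketch of the thread approach, assuming Python 3 and a placeholder command name: a daemon thread does the blocking reads and feeds a queue that the main thread can poll without blocking.

import queue
import subprocess
import threading

def enqueue_output(stream, q):
    # The blocking reads happen here, in a dedicated thread
    for line in iter(stream.readline, b''):
        q.put(line)
    stream.close()

proc = subprocess.Popen(['some_program'],          # placeholder command
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE)
q = queue.Queue()
t = threading.Thread(target=enqueue_output, args=(proc.stdout, q), daemon=True)
t.start()

try:
    line = q.get_nowait()       # returns immediately, never blocks
except queue.Empty:
    pass                        # no output available yet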
Have you considered using a queue or a NoSQL DB for inter-process communication?
I would suggest Redis: have your processes read and write to different keys.
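A minimal sketch using the third-party redis package, assuming a Redis server is running on localhost (the key name is arbitrary):

import redis  # third-party package: pip install redis

r = redis.Redis()               # connects to localhost:6379 by default

# One process pushes work onto a list...
r.rpush('work_queue', 'task 1')

# ...another pops it off, blocking for up to 5 seconds
item = r.blpop('work_queue', timeout=5)   # (key, value) tuple, or None on timeout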
Have a look at sarge: http://sarge.readthedocs.org/en/latest/index.html
From the sarge docs:
If you want to interact with external programs from your Python applications, Sarge is a library which is intended to make your life easier than using the subprocess module in Python’s standard library.

How to get alert for shutdown of python process/abrupt termination?

How can we hook code into a Python process so that it sends an alert in case of shutdown or abrupt termination of the process?
Use Supervisor Daemon
It's not clear what exactly you mean: shutdown/abort of the process itself, or of a child process?
Shutdown/abort of the process itself: have a look at Python's atexit module; there you can register a callback for when your program exits cleanly. But there is absolutely no way to catch all circumstances: if your program fails because of a serious issue (e.g. a segfault), your atexit handlers will never be called. You need a supervising process to catch absolutely all aborts.
Shutdown/abort of a child process: if you use the subprocess module, you can simply call poll() or wait() on Popen objects to see if the spawned process is dead, or to wait for it to die. For a more advanced implementation, use Python's signal module to set a handler for SIGCHLD; this signal is sent to your process whenever one of its child processes terminates. A minimal handler sketch is below.
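A minimal SIGCHLD handler sketch (Unix only; the print is a stand-in for whatever alerting you need):

import os
import signal

def on_child_exit(signum, frame):
    # Reap the dead child and report how it died
    pid, status = os.waitpid(-1, os.WNOHANG)
    print('child %d exited with status %d' % (pid, status))

signal.signal(signal.SIGCHLD, on_child_exit)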

Stopping a Long-Running Subprocess

I create a subprocess using subprocess.Popen() that runs for a long time. It is called from its own thread, and the thread is blocked until the subprocess completes/returns.
I want to be able to interrupt the subprocess so the process terminates when I want.
Any ideas?
I think you're looking for the Popen.terminate or .kill methods. They were added in Python 2.6.
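For example, with sleep standing in for the long-running command:

import subprocess

p = subprocess.Popen(['sleep', '1000'])   # stand-in long-running command

# ... later, when you decide to stop it:
p.terminate()   # polite request (SIGTERM on Unix)
p.wait()        # reap the process so it doesn't linger as a zombie
# p.kill()      # forceful alternative (SIGKILL on Unix)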

kill subprocess when python process is killed?

I am writing a Python program that launches a subprocess (using Popen).
I am reading the stdout of the subprocess, doing some filtering, and writing to the stdout of the main process.
When I kill the main process (Ctrl-C) the subprocess keeps running.
How do I kill the subprocess too? The subprocess is likely to run a long time.
Context:
I'm launching only one subprocess at a time, and I'm filtering its stdout.
The user might decide to interrupt to try something else.
I'm new to Python and I'm using Windows, so please be gentle.
Windows doesn't have POSIX signals, so you can't use the signal module the way you would on Unix. However, you can still catch the KeyboardInterrupt exception that is raised when Ctrl-C is pressed.
Something like this should get you going:
import subprocess

try:
    child = subprocess.Popen(blah)
    child.wait()
except KeyboardInterrupt:
    child.terminate()
subprocess.Popen objects come with a kill and a terminate method (they differ in which signal they send to the process).
signal.signal allows you to install signal handlers, in which you can call the child's kill method; a sketch is below.
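A minimal sketch of that (the command is a placeholder, and since Windows cannot easily deliver SIGTERM, treat this as Unix-flavored):

import signal
import subprocess
import sys

child = subprocess.Popen(['some_command'])   # placeholder command

def cleanup(signum, frame):
    child.kill()        # take the child down with us
    sys.exit(1)

signal.signal(signal.SIGTERM, cleanup)

child.wait()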
You can use the Python atexit module.
For example:
import atexit

def killSubprocess():
    mySubprocess.kill()

atexit.register(killSubprocess)
