Python - Launch a Long Running Process from a Web App

I have a Python web application that needs to launch a long-running process. The catch is that I don't want it to wait around for the process to finish; just launch it and move on.
I'm running on Windows XP, and the web app is running under IIS (if that matters).
So far I've tried Popen, but that didn't seem to work: it waited until the child process finished.

Ok, I finally figured this out! This seems to work:
from subprocess import Popen
from win32process import DETACHED_PROCESS

pid = Popen(["C:\\python24\\python.exe", "long_run.py"],
            creationflags=DETACHED_PROCESS, shell=True).pid
print pid
print 'done'
# I can now close the console or anything I want and long_run.py continues!
Note: I added shell=True; otherwise calling print in the child process gave me the error "IOError: [Errno 9] Bad file descriptor".
DETACHED_PROCESS is a process creation flag that is passed to the underlying WinAPI CreateProcess function.
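On newer Python 3 versions the subprocess module exposes this flag itself, so win32process isn't needed; in any case it is just the raw CreateProcess value 0x00000008. A minimal sketch of that variant, with the child's stdio pointed at DEVNULL as an extra precaution against the bad-file-descriptor error (my choice, not part of the original answer):

import subprocess

# Use subprocess's own constant when available, otherwise the raw value.
flags = getattr(subprocess, "DETACHED_PROCESS", 0x00000008)

proc = subprocess.Popen(
    ["C:\\python24\\python.exe", "long_run.py"],
    creationflags=flags,
    stdin=subprocess.DEVNULL,    # give the child its own stdio so it never
    stdout=subprocess.DEVNULL,   # touches a closed console handle
    stderr=subprocess.DEVNULL,
)
print(proc.pid)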

Instead of directly starting processes from your webapp, you could write jobs into a message queue. A separate service reads from the message queue and runs the jobs. Have a look at Celery, a Distributed Task Queue written in Python.
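A minimal sketch of that approach, assuming a Celery worker and a broker (Redis at a hypothetical local URL) are already running:

from celery import Celery

app = Celery("jobs", broker="redis://localhost:6379/0")

@app.task
def long_run(arg):
    # the heavy work happens in the worker process, not in the web app
    ...

# In the web request handler: enqueue the job and return immediately.
long_run.delay("some-argument")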

This almost works (from here):
from subprocess import Popen

pid = Popen(["C:\\python24\\python.exe", "long_run.py"]).pid
print pid
print 'done'
'done' gets printed right away. The problem is that the parent process keeps running until long_run.py returns, and if I close that process it kills long_run.py's process too.
Surely there is some way to make a process completely independent of the parent process.

subprocess.Popen does that.

Related

Check if a subprocess is running in Linux using Python

I have an environment variable that allows a suite of applications to run under certain conditions, and the applications cannot run with the environment variable off.
My python script uses
p = subprocess.Popen(cmdStr3, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
to open the subprocess.
To test if the application is running, I have tried
try:
    os.kill(pid, 0)
    return True
except OSError:
    return False
and also checking p.returncode. However, these always report the process as running, because even if the application doesn't come up on screen, there are small helper processes of ~1 MB that keep running without the application fully starting, so the OS sees those and reports success. Is there a way around this?
Another issue is that os.kill doesn't work; the only way I have found to terminate the application is os.killpg at the end.
What I've learned from the comments is that the subprocess I call is actually a launcher that starts a child, which is the real application. My subprocess always runs, but with the environment variable off, the child application does not run. Is there a way to see if the child is running?
You can use p.poll() to check whether the process is still running; it returns None while the process is alive.
As for os.kill not working: if you want to kill a process launched with subprocess, use p.terminate().
Finally, if you want to wait until the child process dies, use p.wait().
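For the "is the child application running?" part specifically, one possibility (a sketch, and it pulls in psutil, which this answer does not mention) is to inspect the children of the launcher process you started; cmdStr3 is the command from the question:

import subprocess
import psutil

p = subprocess.Popen(cmdStr3, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

if p.poll() is None:                                  # the launcher itself is alive
    children = psutil.Process(p.pid).children(recursive=True)
    if children:
        print("child application running:", [c.name() for c in children])
    else:
        print("launcher is up, but no child application was started")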

Process() called from Pylons creates a fork

I'm trying to create a background process for some heavy calculations from the main Pylons process. Here's the code:
p = Process(target=instance_process,
            args=(instance_tuple.instance, parent_pipe, child_pipe))
p.start()
The process is created and started, but it seems to be a fork of the main process: it is listening on the same port and the whole application hangs. What am I doing wrong?
Thanks in advance.
Process IS a fork. If you look through its implementation you'll find that Process.start() calls fork. It does NOT, however, call any of the exec variants to change the execution context.
Still, this may have nothing to do with listening on the same port (unless the parent process is multi-threaded). At which point is the program hanging?
I know that when you try shutting down a Python program without terminating a child process created through multiprocessing, it will hang until that child process terminates.
This can happen if, for instance, you do not close the pipe between the processes.
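A minimal sketch of that pipe hygiene, with a placeholder calculation standing in for whatever the Pylons app actually does; each side closes the end it does not use, and the parent joins the worker so shutdown does not hang:

from multiprocessing import Process, Pipe

def instance_process(conn):
    result = sum(i * i for i in range(10000))  # placeholder for the real work
    conn.send(result)
    conn.close()                               # child closes its end when done

if __name__ == "__main__":
    parent_conn, child_conn = Pipe()
    p = Process(target=instance_process, args=(child_conn,))
    p.start()
    child_conn.close()          # parent drops its copy of the child end
    print(parent_conn.recv())
    p.join()                    # reap the worker so shutdown does not hang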

Start long running process using subprocess module

I am trying to start a Java process that is meant to run for a long time, using Python's subprocess module.
What I am actually doing is using the multiprocessing module to start a new Process, and inside that process using the subprocess module to run java -jar.
This works fine, but when I start the new process, the java process seems to replace the Python process created by multiprocessing.Process. I would like java to run as a child process, so that when the process that started the new multiprocessing.Process dies, the process running java dies too.
Is this possible?
Thanks.
Edit: here's some code to clarify my question:
from multiprocessing import Process
from subprocess import Popen

def run_task():
    pargs = ["java", "-jar", "app.jar"]   # argument list rather than one string
    p = Popen(pargs)
    p.communicate()
    return p

while True:
    a = a_blocking_call()
    process = Process(target=run_task)
    process.start()
    if not a:
        break
I want the process running run_task to be killed along with the process running java when the process executing the while loop reaches the break line. Is this possible?
I think you should show some code; it's not clear how you are using subprocess and multiprocessing together.
From the documentation it looks like subprocess should spawn a child and not replace your new Process-started process. Are you sure that isn't what's happening? A test case showing that it doesn't would be good.
You may get some hints out of Detach a subprocess started using python multiprocessing module
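One way to get the "java dies with its wrapper" behaviour, sketched under the assumption of a POSIX system: have the wrapper process forward termination to the Popen, and terminate the wrapper explicitly when the loop breaks (a_blocking_call and app.jar are the names from the question):

import signal
import sys
from multiprocessing import Process
from subprocess import Popen

def run_task():
    p = Popen(["java", "-jar", "app.jar"])

    def forward_term(signum, frame):
        p.terminate()            # kill java when the wrapper is told to stop
        sys.exit(0)

    signal.signal(signal.SIGTERM, forward_term)
    p.wait()

while True:
    a = a_blocking_call()
    worker = Process(target=run_task)
    worker.start()
    if not a:
        worker.terminate()       # SIGTERM to the wrapper, which kills java
        worker.join()
        break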

How to find out if a program crashed with subprocess?

My application creates subprocesses. Usually, these processes run and terminate without any problems. However, sometimes they crash.
I am currently using the Python subprocess module to create these subprocesses. I check whether a subprocess crashed by invoking the Popen.poll() method. Unfortunately, since my debugger is activated at the time of a crash, polling doesn't return the expected output.
I'd like to be able to see the debugging window (not terminate it) and still be able to detect in the Python code whether a process has crashed.
Is there a way to do this?
When your debugger opens, the process isn't finished yet - and subprocess only knows if a process is running or finished. So no, there is not a way to do this via subprocess.
I found a workaround for this problem. I used the solution given in another question Can the "Application Error" dialog box be disabled?
Items of consideration:
- subprocess.check_output() for your child processes' return codes
- psutil for process & child analysis (and much more)
- the threading library, to monitor these child states in your script as well, once you've decided how you want to handle the crashing, if desired
import psutil

myprocess = psutil.Process(process_id)  # you can find your process id in various ways of your choosing
for child in myprocess.children():
    print("Status of child process is: {0}".format(child.status()))
You can also use the threading library to load your subprocess into a separate thread, and then perform the above psutil analyses concurrently with your other process.
If you find more, let me know, it's no coincidence I've found this post.
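A rough sketch of that threading idea, assuming you just want a non-zero return code treated as a crash; the command and callback are placeholders, not anything from the answer above:

import subprocess
import threading

def watch(proc, on_crash):
    proc.wait()
    if proc.returncode != 0:
        on_crash(proc.returncode)

proc = subprocess.Popen(["some_command", "--flag"])   # hypothetical child process
watcher = threading.Thread(target=watch, args=(proc, lambda rc: print("crashed:", rc)), daemon=True)
watcher.start()
# ... the main program keeps going while the watcher waits on the child ...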

How do I check if a process is alive in Python on Linux?

I have a process id in Python. I know I can kill it with os.kill(), but how do I check whether it is alive? Is there a built-in function, or do I have to go to the shell?
Use the subprocess module to spawn the process.
There is a proc.poll() method: it returns None if the process is still alive, otherwise it returns the process's return code.
http://docs.python.org/library/subprocess.html
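A tiny sketch of what that looks like (the sleep command is just a stand-in for a real child process):

import subprocess

proc = subprocess.Popen(["sleep", "60"])
if proc.poll() is None:
    print("still alive")
else:
    print("exited with", proc.returncode)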
os.kill does not kill processes; it sends them signals (it's poorly named).
If you send signal 0, you can determine whether you are allowed to send other signals. The error code will tell you whether it's a permission problem or a missing process.
See man 2 kill for more info.
Also, if the process is your child, you can get a SIGCHLD when it dies, and you can use one of the wait calls to deal with it.
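A sketch of that signal-0 check on a POSIX system, distinguishing a missing process from one you simply aren't allowed to signal:

import errno
import os

def pid_alive(pid):
    try:
        os.kill(pid, 0)                     # signal 0: nothing is sent, only error checking
    except OSError as err:
        if err.errno == errno.ESRCH:        # no such process
            return False
        if err.errno == errno.EPERM:        # process exists, but we may not signal it
            return True
        raise
    return True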
With psutil you can check if a process id exists:
import psutil
print(psutil.pid_exists(1234))
