Python Subprocess Popen Stalling CGI Page

I am working on a tool that needs to run a parser and also output a separate analysis log. Currently it works through a web interface:
1. User goes to the form and submits a filename for parsing (the file is already on the system).
2. The form submits the information to a Python CGI script.
3. The Python CGI script runs and spawns a subprocess to run the parsing.
4. The parser finds the appropriate information for analysis and spawns a subprocess as well.
I am using
import subprocess
...
subprocess.Popen(["./program.py", input])
in my code, and I assumed from the documentation that we don't wait for the child process to terminate; we just keep running the script. My CGI script that starts all this does:
subprocess.Popen(["./program.py", input])
# HTML generation code
# JavaScript to refresh after 1 second to a different page
The HTML generation code just outputs a status saying the request has been processed, and the JavaScript then redirects to the main homepage.
The Problem
The CGI page hangs until the subprocesses finish, which is not what I want. I thought Popen did not wait for subprocesses to finish, but whenever I run this tool it stalls until everything completes. I want the script to finish and let the subprocesses run in the background, so the web pages still function properly and the user isn't left thinking everything has stalled behind a loading indicator.
I can't find any reason why Popen would do this; everywhere I read says it does not wait, but it seems to.
Another odd thing: the Apache logs show "Request body read timeout" before the script completes. Is Apache actually stalling the script, then?
Sorry I can't show the complete code as it's "confidential", but hopefully the logic is clear.

Apache probably waits for the child process to complete. You could try to daemonize the child (double fork, setsid), or better, just submit the job to a local service, e.g., by writing to a predefined file, using a message broker, or going through a higher-level interface such as celery.
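For reference, a minimal double-fork/setsid sketch of the daemonize idea (Unix only); run_parser() is a hypothetical stand-in for the real parsing job:
import os

def daemonize():
    # Double fork + setsid so the grandchild outlives the CGI script.
    pid = os.fork()
    if pid > 0:
        os.waitpid(pid, 0)   # reap the intermediate child; returns quickly
        return False         # parent: carry on generating the page
    os.setsid()              # new session, detached from the terminal
    if os.fork() > 0:
        os._exit(0)          # intermediate child exits immediately
    # Drop the inherited stdio so Apache's connection is not held open
    devnull = os.open(os.devnull, os.O_RDWR)
    for fd in (0, 1, 2):
        os.dup2(devnull, fd)
    return True

if daemonize():
    run_parser()             # hypothetical: the long-running job
    os._exit(0)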

Not sure exactly why this works, but I followed the answer in this thread:
How do I run another script in Python without waiting for it to finish?
and did:
import subprocess
import sys

p = subprocess.Popen([sys.executable, '/path/to/script.py'],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
Instead of:
p = subprocess.Popen([sys.executable, '/path/to/script.py'])
And for some reason the CGI script now terminates while the subprocesses keep running.
Any insight as to why there is a difference would be helpful. I don't see why defining those two parameters would make the stall go away.
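A plausible explanation (not confirmed in the thread): the child inherits the CGI script's stdout, which is the socket back to Apache, so Apache keeps the response open until every process holding that descriptor exits. Redirecting stdout/stderr hands the child different descriptors, so Apache sees EOF as soon as the CGI script itself exits. If that is the cause, DEVNULL (Python 3.3+) may be a safer target than PIPE, since a pipe nobody reads can fill up and block a chatty child:
import subprocess
import sys

# Point the child's stdio away from the CGI socket; unlike PIPE, DEVNULL
# can never fill up and block the child.
subprocess.Popen(
    [sys.executable, '/path/to/script.py'],
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)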

Related

Python Paramiko Child Process

I've written a script that uses the Paramiko library to log on to a server and execute a command. This command invokes the server to run another Python script, resulting in a child process, I believe. The server returns a signal indicating that the command was executed successfully, but it doesn't seem to wait for the new child process to complete, only for the original parent process. Is there any way to reference any/all child processes that were generated as a result of this command, and to wait until they have all completed before returning control to the initiating client?
Many thanks.
Without the code this will be difficult. I think you should create a REST service. You would POST to http://0.0.0.0/runCode, and this would kick off a process in a different thread; that ends the call while the thread keeps running. When the thread is done, it POSTs to http://0.0.0.0/afterProcessIsDone with the response from the work it did, and in that route you can do whatever you want with that response. If you need help with REST, check out Flask. It's easy and straight to the point for small projects.
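A minimal sketch of that hand-off, assuming Flask; the route names mirror the answer, and process_code() is a hypothetical stand-in for the real work:
import threading
import uuid

from flask import Flask, jsonify

app = Flask(__name__)
results = {}

def worker(job_id):
    results[job_id] = process_code()   # hypothetical long-running job

@app.route('/runCode', methods=['POST'])
def run_code():
    job_id = uuid.uuid4().hex
    threading.Thread(target=worker, args=(job_id,)).start()
    return jsonify(started=job_id)     # returns immediately

@app.route('/afterProcessIsDone')
def after_process_is_done():
    return jsonify(results)            # poll here for finished work

if __name__ == '__main__':
    app.run()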

python web thread

So I have a simple Python CGI script. The web front end is used to add things to a database, and I have an update() function that does some cleanup.
I want to run the update() function every time something is added to the site, but it needs to run in the background. That is, the webpage should finish loading without waiting for update() to finish.
Now I use:
- add stuff to db
- Thread(target=update).start()
- redirect to index page
The problem seems to be that Python does not want to finish the request (the redirect) until the update() thread is done.
Any ideas?
That is, the webpage should finish loading without waiting for the update() function to finish
CGI has to wait for the process, as a whole, to finish. Threads aren't helpful here.
You have three choices:
1. subprocess. Spawn a separate "no wait" subprocess to do the update. Provide all the information as command-line parameters.
2. multiprocessing. Have your CGI place a work request in a Queue, and run a separate listener that handles the update requests from that Queue (see the sketch after this list).
3. celery. Download Celery and use it to manage the separate worker process that does the background processing.
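A sketch of option 2 using the standard library's multiprocessing.connection, which lets a short-lived CGI process hand work to a long-running listener; the address, authkey, and update() are illustrative:
# listener.py - long-running worker, started once, outside the CGI
from multiprocessing.connection import Listener

with Listener(('localhost', 6000), authkey=b'change-me') as listener:
    while True:
        with listener.accept() as conn:
            request = conn.recv()   # e.g. the id of the row that was added
            update(request)         # hypothetical cleanup function

# In the CGI script: hand off the work and return immediately
from multiprocessing.connection import Client

with Client(('localhost', 6000), authkey=b'change-me') as conn:
    conn.send(row_id)               # row_id: whatever update() needs
# redirect to the index page right away; the listener runs update()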
you could add a database trigger to update the db in response to an event, e.g., if a specific column has changed
start a subprocess, e.g., subprocess.Popen([sys.executable, '-c', "from m import update; update()"]). It might not work depending on your CGI environment
or just touch an update file, to be picked up by an inotify script that runs the necessary updates in a separate process (see the sketch below)
switch to a different execution environment, e.g., some multithreaded WSGI server
as a heavy-weight option you could use celery, if it is easy to deploy in your environment
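A portable stand-in for the touch-file idea, polling the flag's mtime rather than using inotify (which is Linux-specific); the flag path and update() are illustrative:
import os
import time

FLAG = '/tmp/needs_update'          # illustrative path

# CGI side: touch the flag and return immediately
with open(FLAG, 'a'):
    os.utime(FLAG, None)

# Watcher side: a separate long-running process
last_seen = 0.0
while True:
    try:
        mtime = os.stat(FLAG).st_mtime
    except OSError:                 # flag not created yet
        mtime = 0.0
    if mtime > last_seen:
        last_seen = mtime
        update()                    # hypothetical cleanup function
    time.sleep(1)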

How to invoke an external application (Windows based) independent of the parent Python script?

I want to invoke an external GUI application from a Python script, triggered when a file is uploaded to the server.
I would like the process to be launched and kept running, while the Python script continues, eventually finishes its job, and quits. I have tried different options, but none proved successful.
Right now the script waits for the application to be closed before it exits and sends the response.
I tried subprocess.Popen, os.system, os.spawnl, and os.spawnlp, in the main thread as well as by calling these APIs in a separate thread. There are a lot of questions asked in this regard on Stack Overflow and other forums, but I couldn't find an exact solution.
Appreciate any help.
I had exactly the same problem, and it took me ages to find it, but here is your answer:
import win32api  # part of the pywin32 package

# hwnd=0, verb "open", program, parameters, working directory, SW_SHOWNORMAL=1
win32api.ShellExecute(0, "open", "python.exe", 'blah.py', '', 1)
This guarantees you an independent process; even after you exit the calling Python program, it will keep running.
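If you'd rather stay in the standard library, subprocess can do much the same on Windows with creation flags (DETACHED_PROCESS became a named constant in Python 3.7); a sketch:
import subprocess

# Launch blah.py with no console and in its own process group, so it
# survives the parent's exit; its stdio is cut loose as well.
subprocess.Popen(
    ["python.exe", "blah.py"],
    creationflags=subprocess.DETACHED_PROCESS
                  | subprocess.CREATE_NEW_PROCESS_GROUP,
    stdin=subprocess.DEVNULL,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)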

Kill a Python process if it doesn't finish in a certain time?

Here's the scenario:
I have a Python script that is called from a browser with AJAX. I want the script to run and return its output; however, if it fails to finish within 10 seconds, I want it to abort and return some other response. Also, at the very beginning, I want to log what the request was.
Here's how I'm thinking of architecting it:
Commander script: calls three scripts, spawned as subprocesses?
1. the main code, to execute and return its result
2. a second script, that waits 10 seconds, then returns False
3. a third script, to log the request in a database
The commander script will return either the result from (1) or an error if it hears back from (2) first.
How would you actually implement this? I can't figure out whether I should use the threading library, or the os library with subprocesses.
Or is there a better way to do this?
If this script runs on a Unix-like system, you can use SIGALRM to do it.
There's an example in the Python docs.
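The docs' pattern boils down to something like this (Unix only; do_work() and fallback() are hypothetical stand-ins for steps 1 and 2):
import signal

class Timeout(Exception):
    pass

def handler(signum, frame):
    raise Timeout

signal.signal(signal.SIGALRM, handler)
signal.alarm(10)              # deliver SIGALRM in 10 seconds
try:
    result = do_work()        # hypothetical: the main code
    signal.alarm(0)           # finished in time, cancel the alarm
except Timeout:
    result = fallback()       # hypothetical: the error path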
I highly suggest using Python's multiprocessing library. It has a Process.terminate() method.
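That collapses the three-script design into a few lines; a sketch, with do_work() as a hypothetical worker function:
from multiprocessing import Process

p = Process(target=do_work)   # hypothetical: the main code
p.start()
p.join(10)                    # wait at most 10 seconds
if p.is_alive():
    p.terminate()             # didn't finish in time: kill it
    p.join()                  # reap the terminated process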
check out:
subprocess.Popen.poll
and
subprocess.Popen.terminate
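On Python 3.3+ these combine neatly with Popen.wait(timeout=...); a sketch, with worker.py as an illustrative script name:
import subprocess
import sys

p = subprocess.Popen([sys.executable, "worker.py"])
try:
    p.wait(timeout=10)        # raises if not done within 10 seconds
except subprocess.TimeoutExpired:
    p.terminate()             # or p.kill() if it ignores termination
    p.wait()                  # reap it
print("exit code:", p.returncode)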

How to find out if a program crashed with subprocess?

My application creates subprocesses. Usually, these processes run and terminate without any problems. However, sometimes they crash.
I am currently using the Python subprocess module to create these subprocesses, and I check whether one has crashed by invoking the Popen.poll() method. Unfortunately, since my debugger is activated at the time of a crash, polling doesn't return the expected output.
I'd like to be able to see the debugging window (not terminate it) and still be able to detect in the Python code that a process has crashed.
Is there a way to do this?
When your debugger opens, the process isn't finished yet, and subprocess only knows whether a process is running or finished. So no, there is no way to do this via subprocess.
I found a workaround for this problem. I used the solution given in another question Can the "Application Error" dialog box be disabled?
Items of consideration:
- subprocess.check_output() for your child processes' return codes
- psutil for process and child analysis (and much more)
- the threading library, to monitor these child states in your script as well, once you've decided how you want to handle the crashing
import psutil

myprocess = psutil.Process(process_id)  # find your process id in whatever way you choose
for child in myprocess.children():
    print("Status of child process is: {0}".format(child.status()))
You can also use the threading library to run your subprocess in a separate thread, and then perform the above psutil analysis concurrently with your other processing (see the sketch below).
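A sketch of that combination, assuming psutil is installed; the poll interval is arbitrary:
import os
import threading
import time

import psutil

def watch_children(pid, interval=1.0):
    proc = psutil.Process(pid)
    while proc.is_running():
        for child in proc.children():
            print("child {0}: {1}".format(child.pid, child.status()))
        time.sleep(interval)

# Monitor this process's children without blocking the main work
threading.Thread(target=watch_children, args=(os.getpid(),),
                 daemon=True).start()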
If you find more, let me know; it's no coincidence I've found this post.
