I'm currently trying to figure out how to host a simple webserver with Python 3.7 to handle POST requests. I want to answer requests right after they are received, while the submitted POST data is used to play back a specific audio file on my Raspberry Pi. In two days of googling I couldn't figure out how to have the webserver run continuously while processing the incoming requests in the background.
I tried to use the subprocess module to run the playback script in the background, but I never found a way to have it run independently of the webserver. I always end up with my webserver getting a request which is then handled, but while this happens the webserver is inaccessible.
I would appreciate it if someone could point me in the right direction.
I always end up with my webserver getting a request which is then handled, but while this happens the webserver is inaccessible.
To solve this problem, you can create a separate thread or a process to handle the request while the main thread/process goes back to processing new requests.
The workflow will be something like this:
The main process receives a request.
The main process creates a new process to handle the request.
The main process goes back to listening for new requests while the new process processes the received request.
Assuming you are unfamiliar with multithreading and multiprocessing, I'd suggest you go read a little about these topics. Most likely, multithreading will solve your problem, so you can start from there. Here's a good article about it: An Intro to Threading in Python
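A minimal sketch of that idea using only the standard library (ThreadingHTTPServer is available from Python 3.7; play_audio is a hypothetical placeholder for whatever actually does the playback on the Raspberry Pi):

import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def play_audio(filename):
    pass  # placeholder for the actual playback logic

class PlaybackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        filename = self.rfile.read(length).decode()
        # hand the playback to a background thread and answer right away
        threading.Thread(target=play_audio, args=(filename,), daemon=True).start()
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'playback started')

ThreadingHTTPServer(('', 8000), PlaybackHandler).serve_forever()

The response is sent immediately, while the playback thread keeps running and the server goes back to accepting new requests.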
Related
I have a fairly simple problem that I would like to solve in Python.
I would like a webserver that has the following behavior: if it receives a POST request for /work, then it should add it to a work queue and execute some function on the data attached. If it receives a POST request for /cancel it should cancel whatever its current task is.
Unfortunately, the only way I can seem to get a BaseHTTPRequestHandler to handle multiple requests is to use a ThreadingMixIn, but that seems unnecessarily complicated, as I then have to use a set of locks to prevent multiple work tasks from executing concurrently.
I tried to use a BaseHTTPRequestHandler without a ThreadingMixIn and just spin off threads in do_POST, but that didn't work since apparently BaseHTTPRequestHandler closes its connection when the do_POST function returns.
Ideally, I'm looking for an interface that gives me the ability to close the connection to the client on my own terms, so I can do it in a worker thread, and manage the queue myself, rather than working around the ThreadingMixIn's behavior in this regard.
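The queue-plus-single-worker arrangement described above might look roughly like this sketch (using the Python 3 http.server names; handle_work is a placeholder, and cancellation is left out). Because only one worker thread consumes the queue, tasks never execute concurrently and no extra locks are needed:

import queue
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

work_queue = queue.Queue()

def handle_work(data):
    pass  # placeholder for the actual work function

def worker():
    # single consumer, so work tasks run one at a time
    while True:
        data = work_queue.get()
        handle_work(data)
        work_queue.task_done()

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get('Content-Length', 0)))
        if self.path == '/work':
            work_queue.put(body)
        self.send_response(202)
        self.end_headers()

threading.Thread(target=worker, daemon=True).start()
ThreadingHTTPServer(('', 8000), Handler).serve_forever()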
I've written a script that uses the Paramiko library to log on to a server and execute a command. This command actually makes the server execute another Python script (resulting in a child process, I believe). The server seems to send back a signal indicating that the command was executed successfully, but it doesn't appear to wait for the new child process to complete, only for the original parent process. Is there any way to reference any/all child processes that were generated as a result of this command and wait until they have all completed before returning control to the initiating client?
Many thanks.
Without the code this will be difficult. I think you should create a REST service. You would POST to http://0.0.0.0/runCode, which would kick off the work in a different thread, and that call would end there. The thread is still running; when it's done, it does a POST to http://0.0.0.0/afterProcessIsDone with the response from the work it carried out, and in that route you can do whatever you want with that response. If you need help with REST, check out Flask. It's pretty easy and straight to the point for small projects.
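A rough sketch of that flow with Flask (the route names follow the ones above; long_job and the handling of the result are placeholders):

import threading
from flask import Flask, request

app = Flask(__name__)

def long_job(data):
    # placeholder for the actual work; when finished, this thread (or the remote
    # script) could POST its result to /afterProcessIsDone
    pass

@app.route('/runCode', methods=['POST'])
def run_code():
    threading.Thread(target=long_job, args=(request.get_json(),)).start()
    return 'started', 202  # the call ends here while the thread keeps running

@app.route('/afterProcessIsDone', methods=['POST'])
def after_process_is_done():
    result = request.get_json()
    # do whatever you want with the result here
    return 'ok'

if __name__ == '__main__':
    app.run(host='0.0.0.0')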
I have an interesting task ahead of me. I need to connect to 45 different websocket streams and stuff them all into the correct places. Sometimes those streams stop working and need to be relaunched. Right now I'm running a process via bash for each Python file to ensure nothing crashes, but I'd rather have a single "server", if you will, managing the whole thing.
The goal is to eventually have a dashboard of sorts showing the status of each named subprocess, along with some stats. I'd like to be able, via the dashboard, to relaunch or just stop any of the processes without killing the server, and to relaunch those processes at any time.
Is this possible? Any direction?
Most python websockets examples are server/client. I'm just looking for an indefinite client with a relaunch on error w/o ever killing off the server.
I think Supervisor is what you are looking for.
It has no web dashboard "out of the box", but there are plugins that implement such a feature.
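For example, a [program] section in supervisord.conf for one of the stream clients might look roughly like this (the program name and script path are placeholders):

[program:stream_a]
; one section like this per websocket client script
command=python3 /home/pi/streams/stream_a.py
autostart=true
; relaunch the client whenever it exits or crashes
autorestart=true
stderr_logfile=/var/log/stream_a.err.log

You can then start, stop, or restart individual programs with supervisorctl without touching the others.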
I'm quite new to python threading/network programming, but have an assignment involving both of the above.
One of the requirements of the assignment is that I spawn a new thread for each new request, but I need to send to and receive from the browser at the same time.
I'm currently using the asyncore library in Python to catch each request, but as I said, I need to spawn a thread for each request, and I was wondering whether using both threads and the asynchronous loop is overkill, or the correct way to do it.
Any advice would be appreciated.
Thanks
EDIT:
I'm writing a proxy server, and I'm not sure if my client is persistent. My client is my browser (using Firefox for simplicity).
It seems to reconnect for each request. My problem is that if I open one tab with http://www.google.com and another with http://www.stackoverflow.com, I only get one request at a time from each tab, instead of multiple requests from Google and from SO.
I answered a question that sounds amazingly similar to yours, where someone had a homework assignment to create a client/server setup with each connection handled in a new thread: https://stackoverflow.com/a/9522339/496445
The general idea is that you have a main server loop constantly looking for a new connection to come in. When it does, you hand it off to a thread which will then do its own monitoring for new communication.
An extra bit about asyncore vs threading
From the asyncore docs:
There are only two ways to have a program on a single processor do “more than one thing at a time.” Multi-threaded programming is the simplest and most popular way to do it, but there is another very different technique, that lets you have nearly all the advantages of multi-threading, without actually using multiple threads. It’s really only practical if your program is largely I/O bound. If your program is processor bound, then pre-emptive scheduled threads are probably what you really need. Network servers are rarely processor bound, however.
As this quote suggests, using asyncore and threading should be for the most part mutually exclusive options. My link above is an example of the threading approach, where the server loop (either in a separate thread or the main one) does a blocking call to accept a new client. And when it gets one, it spawns a thread which will then continue to handle the communication, and the server goes back into a blocking call again.
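A bare-bones version of that blocking accept-then-spawn loop, with handle_client standing in for whatever your proxy does per connection:

import socket
import threading

def handle_client(conn, addr):
    # placeholder: read the request from conn, forward it, relay the reply
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('', 8080))
server.listen(5)

while True:
    conn, addr = server.accept()  # blocks until a client connects
    threading.Thread(target=handle_client, args=(conn, addr)).start()
    # the loop immediately goes back to accept() while the thread does the work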
In the pattern of using asyncore, you would instead use its async loop which will in turn call your own registered callbacks for various activity that occurs. There is no threading here, but rather a polling of all the open file handles for activity. You get the sense of doing things all concurrently, but under the hood it is scheduling everything serially.
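For contrast, a tiny asyncore echo server: everything runs in one thread, and asyncore.loop() polls the open sockets and invokes the handle_* callbacks as they become ready. (This follows the dispatcher pattern from the asyncore docs; asyncore has since been deprecated in favor of asyncio.)

import asyncore
import socket

class EchoHandler(asyncore.dispatcher_with_send):
    def handle_read(self):
        data = self.recv(8192)
        if data:
            self.send(data)  # echo back; buffered by dispatcher_with_send

class EchoServer(asyncore.dispatcher):
    def __init__(self, host, port):
        asyncore.dispatcher.__init__(self)
        self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
        self.set_reuse_addr()
        self.bind((host, port))
        self.listen(5)

    def handle_accept(self):
        pair = self.accept()
        if pair is not None:
            sock, addr = pair
            EchoHandler(sock)  # register the new connection with the loop

EchoServer('', 8080)
asyncore.loop()  # single-threaded polling loop dispatching the callbacks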
I've been searching for an answer to this for a while; it's possible that I haven't been searching for the right information, though.
I'm trying to send data to a server, and once it is received the server executes a Python script based on that data. I have been trying to spawn a thread and return, but I can't figure out how to "detach" the thread; I simply have to wait until the thread returns before I can return an HttpResponse(). This is unacceptable, as the website interface has many other features that need to remain usable while the thread runs on the server.
I'm not certain that was a clear explanation but I'll be more than happy to clarify if any part is confusing.
Have a look at Celery. It's quite nice in that you can accept the request, offload it quickly to workers, and return. It's simple to use.
http://celeryproject.org/
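A minimal sketch of that pattern, assuming a broker such as Redis running at its default address and a hypothetical run_script task:

# tasks.py
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def run_script(data):
    pass  # the long-running work goes here

# in the view: queue the job and return to the client immediately
# run_script.delay(data)

With a recent Celery, a worker started with "celery -A tasks worker" picks the job up in the background while the web process returns its response right away.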
Most simply, you can do this with subprocess.Popen. See here for some information regarding the subprocess module:
http://docs.python.org/library/subprocess.html
There are other (possibly better) methods to doing this, but this one seems to fit your requirements.
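A rough sketch, assuming the heavy work lives in a separate, hypothetical process_data.py script and the view is a Django view:

import subprocess
from django.http import HttpResponse

def start_job(request):
    # Popen returns immediately; the child process runs on its own
    subprocess.Popen(['python', 'process_data.py', request.POST.get('data', '')])
    return HttpResponse('started')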
Use a message queue system, like Celery (django-celery may help you).
Use an RDBMS and background process(es), either invoked periodically by cron or always running.
First, the web server inserts the data required by the background job into a database table. Then the background process (always running, or run periodically by cron) fetches the latest inserted row(s) and processes them.
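A rough sketch of that second approach, with SQLite standing in for the RDBMS and a hypothetical jobs table (id, payload, done):

import sqlite3
import time

def process(payload):
    pass  # the actual background work

conn = sqlite3.connect('jobs.db')
while True:
    rows = conn.execute('SELECT id, payload FROM jobs WHERE done = 0').fetchall()
    for job_id, payload in rows:
        process(payload)
        conn.execute('UPDATE jobs SET done = 1 WHERE id = ?', (job_id,))
        conn.commit()
    time.sleep(5)  # a cron-driven variant would exit here instead of sleeping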
Spawn a thread.
import threading
from django.http import HttpResponse

def my_view(request):  # hypothetical Django view; do_background_job and args come from your code
    worker_thread = threading.Thread(target=do_background_job, args=args)
    worker_thread.daemon = False  # non-daemon, so it keeps running after the response is sent
    worker_thread.start()
    return HttpResponse()
Even after the HttpResponse is sent, do_background_job keeps running. However, because the web server (Apache) may kill threads, execution of do_background_job is not guaranteed.