I am a bit confused with multiprocessing. I have a video processing script which can be run from the command line or launched from a PySide application using a subprocess call. The script seems to run fine from the command line and basically initializes a pool of workers which each process a separate video file.
When I run the program, however, the OS tells me my program is not responding. I would like to make use of all the cores on my system for multiprocessing, but I would also like to prevent this annoyance. What should I do to get around this? Do I start the initial script in a thread or something?
Since you mention PySide, I assume your program is a GUI one. In a GUI program, all processing must occur in a worker thread if you want to keep the UI responsive. So yes, the initial script must be started in a thread distinct from the main thread (the main thread is reserved for the UI).
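A minimal sketch of that idea (the script name video_processing.py is a placeholder): the blocking subprocess call moves off the UI thread into a plain worker thread, so the Qt event loop keeps spinning.

import subprocess
import threading

def run_script():
    # the blocking call now happens in the worker thread,
    # not in the main (UI) thread
    subprocess.call(["python", "video_processing.py"])  # placeholder script name

worker = threading.Thread(target=run_script, daemon=True)
worker.start()

The multiprocessing pool inside the script itself is unaffected; only the call that waits on it leaves the UI thread.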
I know that I can run a background process in Python using subprocess. But the problem is that when I make a GUI and then use subprocess with the close_fds=True parameter, the window changes to not responding.
So, what I want is to create a background process that runs separately alongside the main process and, when it is done, rejoins the main process.
BTW, I am using PySide2 as the GUI framework.
Any help would be appreciated
I think threading would be more beneficial to you: you can start a process in another thread without blocking the main thread that runs your GUI. Once the other thread has completed its task, it will join the main thread.
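A sketch of that approach, with hypothetical names (run_in_background, on_done): the blocking subprocess.run happens in a worker thread, and a callback fires when the process finishes. In a PySide2 app, the callback should emit a Qt Signal rather than touch widgets directly, since only the main thread may update the GUI.

import subprocess
import threading

def run_in_background(cmd, on_done):
    def worker():
        result = subprocess.run(cmd, close_fds=True, capture_output=True)
        on_done(result)  # runs in the worker thread, not the GUI thread
    thread = threading.Thread(target=worker, daemon=True)
    thread.start()
    return thread

# usage: the GUI stays responsive while the command runs
run_in_background(["sleep", "5"], lambda r: print("done:", r.returncode))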
I am writing a framework in Python 3.8 on Debian that will launch some multiprocessing processes. I want the configuration and launching of the processes to be done in functions OTHER than the main. The main file will be written by the end user of the framework and they should not need to know about these processes. Hence I tried to put the code that configures and launches the processes in helper functions or class methods that the main will call.
What I'm finding is that as soon as the launcher function / method exits, the processes die, even though the launcher functions / methods run (I think) in the same process as main, which is still running. I have put a long time.sleep in the launcher functions / methods right before they exit, and the processes stay alive for exactly that long.
I tried setting the 'daemon' flag, but that doesn't seem to solve it. If this is truly a limitation of multiprocessing, I can instruct the users of my framework to always put some boilerplate launcher code in their file, but it seems clunky. All help is appreciated!
The function that launches the processes was returning only the processes to main and NOT the managed queues that the processes were using. I changed it to return a dict with the processes AND the managed queues, and everything works, even though main does NOT use the queues.
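A sketch of what seems to have been happening (names are hypothetical): a Manager's server process is shut down once the last reference to it is garbage-collected, and the queue proxies are what keep that reference alive. Returning them from the launcher therefore keeps the workers' queues usable.

import multiprocessing as mp

def worker(q):
    q.put("started")
    # ... long-running work ...

def launch():
    manager = mp.Manager()
    q = manager.Queue()
    p = mp.Process(target=worker, args=(q,))
    p.start()
    # return the queue (and with it, the manager) along with the process;
    # if nothing outside this function holds a reference to them, the
    # manager's server process is torn down when launch() returns
    return {"process": p, "queue": q}

if __name__ == "__main__":
    handles = launch()  # keep this dict alive while the processes run
    handles["process"].join()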
I have been having trouble exiting a multiprocessing pool after a keyboard interrupt, and after a long time of trying I gave up. But if I just exit CMD once I'm done / have done what I needed with my script, what would be the downsides? I know this isn't good practice, but it doesn't really need to be good. I'm assuming everything gets killed after the command line is exited, but I'm not sure.
I think it depends. If it is a Python console, the resources might get cleaned up. But I know from software crashes in Python that sometimes the child processes stay alive even when the main application closes, usually when it was started directly via the file explorer.
To make sure that your sub-threads/processes get closed when the main application closes, you should set them as daemon. Then killing the main script kills the children as well.
see this Link
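A minimal sketch of the daemon flag described above:

import multiprocessing as mp
import time

def work():
    while True:
        time.sleep(1)  # stand-in for real work

if __name__ == "__main__":
    p = mp.Process(target=work, daemon=True)  # daemon: terminated with the parent
    p.start()
    time.sleep(5)
    # when the main process exits here, the daemon child is killed too

Note that daemon children are terminated on a normal parent exit; a hard kill of the parent can still leave them orphaned, which matches the crash behaviour described above.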
I wanted to run a function in parallel to the main program in Python. Suppose I have a speech recognition function. I want it to run in the background and interrupt the main program if a particular utterance is heard. But at the same time, I have other tasks to perform. So, the speech recognition should work as a separate process and maybe call a function when a command is heard.
I tried the python multiprocessing module, the thread module and the threading module. But all of these required me to wait until the process or thread is finished. What I want is something that will allow me to run functions in the background. They must call some callback function if a specific event occurs.
I hope I will find an effective way of doing this.
I tried the threading module. The code looked like this (pseudocode):
import threading as t

def detected(text):
    commands = []  # a list of commands
    if text in commands:
        # call some function according to the command
        pass

def speech_recognition():
    while True:
        # if speech is detected:
        #     record it
        #     process it and convert it to text
        #     if the text is a specified command:
        #         call detected(text) with the recognized text as argument
        pass

pr = t.Thread(target=speech_recognition)
pr.start()
# from here starts the main program, which does some other
# functions that don't need to be mentioned here
But this doesn't work. The speech recognition runs for a few seconds and then just quits. No exceptions raised, no system exit, nothing. It's the same when I try the multiprocessing and thread modules.
I don't know how CPU-intense speech recognition is, but I am pretty sure the problem you are describing is best solved with maximum decoupling between entities, i.e. separation in processes. Simple scenario: one of your processes runs your "main program", the other process is entirely responsible for speech recognition. You then implement a communication protocol between those processes. The "main program" still needs some kind of event system and asynchronous execution based on threading, because it needs to be able to listen and immediately react to events sent by the speech-reco-process. Hence, a working model would contain:
one main process which should not be CPU-bound
one child process, spawned via multiprocessing, handling speech recognition
a communication protocol enabling the transmission of data/events between main and child process
one additional thread in the main process that waits for events sent by the child process and makes sure that the main program reacts correspondingly
Main and child process run concurrently as scheduled by the operating system. Of course this works best on a system with at least two CPU cores. In the main process, the main thread and the other thread do not really run simultaneously -- only one thread can run at a time due to CPython's global interpreter lock (GIL).
The communication between main process and child process can be implemented with basic multiprocessing tools such as Queue or Pipe.
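A minimal sketch of that model, with a placeholder recognition loop: a multiprocessing.Queue carries events from the child process, and the additional thread in the main process blocks on it and reacts.

import multiprocessing as mp
import threading

def speech_recognition(events):
    # child process: the real recognition loop goes here;
    # push an event whenever a command is heard
    events.put("stop")

def listen(events):
    # additional thread in the main process: block on the queue and react
    while True:
        command = events.get()
        print("reacting to command:", command)

if __name__ == "__main__":
    events = mp.Queue()
    mp.Process(target=speech_recognition, args=(events,), daemon=True).start()
    threading.Thread(target=listen, args=(events,), daemon=True).start()
    # ... the main program continues here, not CPU-bound ...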
As you realize, you need to invest some serious thought into a problem like this. Don't try to solve this quick&dirty or just by trial and error. You need to make sure you understand your self-developed architecture.
Just use threading, and pass your thread either a function handle to call when it has data ready, or the application handle in the case of GUI applications, so that the thread can create an event with the data attached.
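A sketch of the function-handle variant (compute and the two-second sleep are stand-ins); in a GUI application the handle would post an event to the toolkit instead of printing.

import threading
import time

def compute():
    time.sleep(2)  # stand-in for the real work
    return 42

def worker(on_data):
    on_data(compute())  # call the supplied handle once the data is ready

t = threading.Thread(target=worker, args=(print,))
t.start()
# the main program keeps running; print fires when the data is ready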
Python has been really bumpy for me, because the last time I created a GUI client, the client seemed to hang when spawning a process, calling a shell script, or calling an outside application.
This has been my major problem with Python since then, and now I'm on a new project. Can someone give me pointers, and a word of advice, so that my GUI Python application stays interactive when spawning another process?
Simplest (not necessarily "best" in an abstract sense): spawn the subprocess in a separate thread, communicating results back to the main thread via a Queue.Queue instance -- the main thread must periodically check that queue to see if the results have arrived yet, but periodic polling isn't hard to arrange in any event loop.
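A sketch of that pattern (the module is named queue in Python 3; ["ls"] stands in for the real command): the worker thread pushes results into the queue, and the event loop polls it with get_nowait.

import queue  # Queue in Python 2
import subprocess
import threading

results = queue.Queue()

def run_subprocess(cmd):
    completed = subprocess.run(cmd, capture_output=True, text=True)
    results.put(completed.stdout)  # hand the output back to the main thread

threading.Thread(target=run_subprocess, args=(["ls"],), daemon=True).start()

def poll_results():
    # call this periodically from the GUI event loop (e.g. via a timer)
    try:
        print(results.get_nowait())
    except queue.Empty:
        pass  # nothing yet; check again on the next tick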
Your main GUI thread will freeze if you spawn off a process and wait for it to complete. Often, you can simply use subprocess and poll it now and then for completion rather than waiting for it to finish. This will keep your GUI from freezing.
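A sketch of the polling idea, with a hypothetical command and a timer hook to be wired into the GUI toolkit:

import subprocess

proc = subprocess.Popen(["long_running_command"])  # hypothetical command

def on_timer_tick():
    # wire this to a GUI timer; it returns immediately either way
    if proc.poll() is None:
        return  # still running; check again on the next tick
    print("finished with exit code", proc.returncode)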