Python multiprocessing: Running a process after its parent exited - python

I'm writing a Python program that runs under Mac OS and Linux, and I want to run some logic in a multiprocessing.Process. That logic will take a while, and I want it to continue running even after my program is finished and has exited. i.e., I want the main process to not wait for the auxiliary process to finish. I want the main process to exit as soon as it's finished.
I ran a few experiments, and it seems that this behavior is the default when using subprocess, but I can't make it happen using multiprocessing.Process, even when I run set_start_method('spawn').
Do you know of a way to get multiprocessing.Process to behave this way?

Looks like starting a new process, and then calling os.fork from it does the trick.
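A minimal sketch of that trick on POSIX. It forces the fork start method so the snippet is self-contained; the same double-fork idea works under spawn with the usual `__main__` guard. The marker file and the timings are illustrative assumptions, not part of the original answer:

```python
import multiprocessing
import os
import pathlib
import tempfile
import time

multiprocessing.set_start_method("fork", force=True)  # assumption: POSIX

MARKER = pathlib.Path(tempfile.gettempdir()) / "detached_worker.txt"
MARKER.unlink(missing_ok=True)

def long_running_work():
    time.sleep(0.5)               # stands in for the real long-running logic
    MARKER.write_text("done")

def detach_and_run():
    # Fork inside the helper process. Once this intermediate process
    # exits, the grandchild is re-parented to init/launchd and nobody
    # waits for it, so the main program can exit immediately.
    if os.fork() != 0:
        os._exit(0)               # intermediate process exits right away
    long_running_work()

start = time.monotonic()
p = multiprocessing.Process(target=detach_and_run)
p.start()
p.join()                          # returns almost immediately
print(f"join returned after {time.monotonic() - start:.2f}s")
```

The intermediate process exists only to fork and exit, so `join()` returns right away while the grandchild keeps running after the main program is gone.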

Related

How can I use a subprocess to do some periodic work and do some clean work after script exit using multiprocessing in Python?

For example, after I execute python main.py, the main process creates a child process using spawn to handle some work. After the last line of code has executed, I want the child process to do some closeout work and then exit cleanly.
It exits immediately after the script finishes if I set subprocess.daemon = True.
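One way to get this behavior (a sketch, not the only approach): keep the worker non-daemonic and have the parent signal it through a multiprocessing.Event from an atexit handler, so the closeout work runs after the main script's last line but before the worker exits. The worker body, queue, and timings here are illustrative assumptions:

```python
import atexit
import multiprocessing
import time

def worker(stop_event, done_queue):
    # Periodic work runs until the parent signals shutdown.
    while not stop_event.is_set():
        time.sleep(0.05)
    # Closeout work happens here, after the main script has finished.
    done_queue.put("cleanup done")

def shutdown(stop_event, proc):
    stop_event.set()
    proc.join()

stop = multiprocessing.Event()
q = multiprocessing.Queue()
p = multiprocessing.Process(target=worker, args=(stop, q))
p.start()
atexit.register(shutdown, stop, p)

time.sleep(0.2)  # stands in for the rest of main.py
```

Because atexit handlers run before multiprocessing's own exit hook joins non-daemon children, the worker gets its shutdown signal, runs its cleanup, and exits on its own instead of being killed the way a daemon process would be.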

How to kill a MATLAB Batch Process ran by Qt QProcess?

I currently have a Python program that calls a MATLAB script as batch like so:
matlab = QProcess()
matlab.start('matlab -noFigureWindows -batch "cd(users/script_directory/); MyScript.m;"')
#^ command to start MATLAB batch process in CMD
The issue I'm running into is that once this batch process starts, there's no way to kill it. So if my Python app gets force-closed, the MATLAB script keeps running and causes all sorts of issues, meaning I need to kill the process on app close.
I'm calling the MATLAB script as a QProcess and get the following message when I force-close the Python app before the MATLAB script finishes executing:
QProcess: Destroyed while process ("matlab") is still running.
With this, how do I stop the batch MATLAB process? Using 'ctrl-c' in CMD works for me sometimes to kill the process but I need it to be consistent to make the Python work right.
Similarly, can I just have it 'force quit' or 'restart' batch MATLAB or anything along those lines to clear all running processes?
A brute force way to kill it would be to just kill any matlab process via the process and system utilities library at the start of your application:
import psutil

for process in psutil.process_iter():
    if process.name().lower() == 'matlab.exe':
        process.terminate()
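An alternative to sweeping up every matlab.exe on the machine is to keep a handle to the one process you launched and terminate just that one on exit. A stdlib sketch, assuming you start the program yourself with subprocess.Popen instead of QProcess; `sleep 60` stands in for the actual matlab command line:

```python
import atexit
import subprocess

# Launch the external program ourselves so we hold a handle to it.
# "sleep 60" is a stand-in for the real matlab batch command.
proc = subprocess.Popen(["sleep", "60"])

def kill_external():
    # Terminate the child if it is still running; escalate to kill()
    # if it does not exit after SIGTERM.
    if proc.poll() is None:
        proc.terminate()
        try:
            proc.wait(timeout=5)
        except subprocess.TimeoutExpired:
            proc.kill()
            proc.wait()

atexit.register(kill_external)
```

This only helps for a normal exit; a force-close skips atexit handlers, which is why the psutil sweep at the next startup is still a useful backstop.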

Running a multithreaded C program via python subprocess

I have a multi-threaded C program, which runs perfectly fine as a standalone program. It uses pthread_create and pthread_join to execute some logic. Now, I am trying to execute this code from Python using subprocess. However, when executing via subprocess, it seems subprocess returns as soon as the main thread exits, but I wish to wait for the entire code to finish executing. Is it possible to do this?
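subprocess does wait for the whole child process; the early return most likely happens because, in C, the process dies as soon as main() returns, taking any unjoined threads with it. The fix belongs in the C code: pthread_join every thread before main returns. A quick sketch showing that subprocess.run blocks until the child process itself exits, using a shell command as a stand-in for the C binary:

```python
import subprocess
import time

# subprocess.run waits for the child *process* to exit, not just its
# main thread; "sleep 0.3; echo done" stands in for the C binary.
start = time.monotonic()
result = subprocess.run(["sh", "-c", "sleep 0.3; echo done"],
                        capture_output=True, text=True)
elapsed = time.monotonic() - start
```

If the C program joins its threads before returning from main, subprocess.run (or Popen.wait) will block for the full duration just as this stand-in does.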

Python script never ends in task scheduler

I am trying to set up a python code to be executed automatically.
I started with a small script:
import datetime
with open("out.txt", "a") as f:
    f.write(datetime.datetime.now().isoformat())
The task starts fine and executes (the file is modified), but it never shows as finished in the Task Scheduler.
This and this exist on SO, but have no real answer. The only workaround proposed in those threads is to force the end of the task after a given time in Windows, but that requires knowing how long the Python script will take, which will not be the case for my actual task.
How can the Task Scheduler know that a Python script is finished?
I run it the following way in the Task Scheduler:
program : cmd
arguments : /c C:\python27\python.exe C:\path\of\script.py
execute in : C:\path\of\
I tried some variations around this, like executing python instead of cmd, but it didn't change anything. I had hoped the /c would force the task to close.
As Gaurav Pundir mentioned, adding sys.exit(0) should end the script properly and thus the task. However, you do need import sys in order to use sys.exit(0). Hope this helps!
It looks like a bug to me.
Try looking for a Python console under Task Manager;
if it is not there, the program has exited successfully.
I have the same issue on Windows 10: the Python script ran successfully and there is no Python console under Task Manager, yet the scheduled task's status still says 'Running'.
There seems to be no correct fix for this issue when CMD is the intermediate launcher.
There is an [End] command in the Task Scheduler GUI, but clicking it only terminates the CMD/batch process, leaving the spawned python.exe process orphaned.
The real problem: there doesn't seem to be any way for cmd to pass the terminate signal on to python.exe, and neither can the task engine reliably determine whether python.exe is still alive.
I ran into the same problem: the Python file didn't stop in the Task Scheduler. I imported sys and wrote sys.exit(0), but I still had the same problem.
Finally, I decided to press "Update", which solved my problem; the status of the task was "Ready", not "Running". For information, I use Windows 11.

Script does not run on command prompt as in PyScripter

I'm developing a script in PyScripter. When I run it in PyScripter it runs fairly well.
However, the script contains two separate threads (one Thread object, plus the main flow of the script). When I run the script from the command prompt it gets stuck in the thread: it shows no sign of executing the main flow, and it never ends, unlike when I run it inside PyScripter. What should I do?
In your code, use the timeout argument of join() to put a time constraint on the thread. For instance
....
import threading

def your_task():
    ...  # the thread's work goes here

yourThread = threading.Thread(target=your_task)
yourThread.start()
yourThread.join(10.0)  # wait at most 10 seconds for the thread to finish
....
See the threading documentation for more on multithreading. Hope it helps you.
