Popen Detached Process is Closing When Parent Closes - python

Edit: I think this is an issue with VS Code, as when run from Sublime Text or PowerShell the started process does indeed remain running.
I have written a Python library that includes a function to launch an independent process. This is essentially the call (the partial is invoked by the surrounding function):
return partial(
    Popen,
    args=args,
    creationflags=DETACHED_PROCESS | CREATE_NEW_PROCESS_GROUP,
    close_fds=True
)
The launching all works well, but when the process that launched the detached process exits, the launched process quits as well.
How do I launch a process that does not exit when its parent process exits?
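This symptom usually traces back to inherited console or standard-stream handles. As a hedged sketch (the helper name launch_detached is mine, not from the library in question), redirecting the standard streams to DEVNULL in addition to the detach flags often keeps the child alive after the parent exits:

```python
import os
import subprocess

def launch_detached(args):
    # Hypothetical helper: launch a process that survives the parent's exit.
    # Redirecting the standard handles avoids inheriting the parent's console
    # pipes, which can keep the child tied to the parent (e.g. when launched
    # from an IDE's integrated terminal).
    kwargs = dict(
        stdin=subprocess.DEVNULL,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
        close_fds=True,
    )
    if os.name == "nt":
        # On Windows, detach from the parent's console and process group.
        kwargs["creationflags"] = (
            subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP
        )
    else:
        # On POSIX, a new session keeps the child out of the parent's
        # terminal job control.
        kwargs["start_new_session"] = True
    return subprocess.Popen(args, **kwargs)
```

This is a sketch under the assumption that the console/stream inheritance is the culprit, matching the asker's observation that the behavior differs between VS Code and a plain PowerShell.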

Related

How to kill a MATLAB batch process run by Qt QProcess?

I currently have a Python program that calls a MATLAB script as batch like so:
matlab = QProcess()
matlab.start('matlab -noFigureWindows -batch "cd(users/script_directory/); MyScript.m;"')
# ^ command to start the MATLAB batch process in CMD
The issue I'm running into is that once this batch process starts, there's no way to kill it. If my Python app gets force-closed, the MATLAB script keeps running and causes all sorts of issues, so I need to kill the process on app close.
I'm calling the MATLAB script as a QProcess and get the following message when I force-close the Python app before the MATLAB script finishes executing:
QProcess: Destroyed while process ("matlab") is still running.
With this, how do I stop the batch MATLAB process? Using Ctrl-C in CMD sometimes kills the process, but I need it to work consistently for the Python side to behave correctly.
Similarly, can I just have it 'force quit' or 'restart' batch MATLAB or anything along those lines to clear all running processes?
A brute-force way to kill it would be to terminate any MATLAB process via the process and system utilities library (psutil) at the start of your application:
import psutil

for process in psutil.process_iter():
    if process.name().lower() == 'matlab.exe':
        process.terminate()

Shell script in Python subprocess.Popen does not terminate child

I have an issue with interrupting a subprocess.Popen. This is the setup:
I have a Tkinter GUI that runs another Python script using Popen. This inner script (let's call it running_script) runs a shell script, which in turn executes a piece of C++ code via Popen, so the hierarchy looks like this:
GUI
\running_script
\shell-script
\c++
running_script works so that if it receives an interrupt, it sends SIGINT to shell-script. If I run the shell script with my piece of C++ code under running_script alone and press CTRL+C, everything works like a charm. However, if the GUI runs running_script via Popen, SIGINT is sent to running_script, which receives it properly and forwards the interrupt to shell-script; but instead of the inner process (the C++ code) terminating, shell-script terminates itself and the C++ process continues running, as if it had been run in the background, though it was not. When I execute ps -xaf the tree looks like this:
GUI
\running_script
\shell-script <defunct>
c++
So to reiterate: when I run it without the GUI it works like a charm, but with the GUI it behaves as explained above. I've tried sending shell-script a SIGTERM instead of SIGINT as well; the result is the same.
You could catch SIGINT in the shell script and make it send SIGINT to the C++ program, e.g.:
#!/bin/bash
./cpp_program &
procpid=$!

function killit() {
    kill -SIGINT $procpid
}

trap killit SIGINT
......
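On the Python side, an alternative sketch (assuming a POSIX system; the function names are illustrative, not from the thread) is to start the shell script as the leader of a new process group and signal the whole group, so the C++ grandchild receives the interrupt even if the shell exits first:

```python
import os
import signal
import subprocess

def run_group(args):
    # start_new_session=True makes the child a new session/process-group
    # leader, so a signal sent to the group reaches its descendants too.
    return subprocess.Popen(args, start_new_session=True)

def interrupt_group(proc, sig=signal.SIGINT):
    # Signal every process in the child's group (the shell script and
    # the C++ program it launched), not just the immediate child.
    os.killpg(os.getpgid(proc.pid), sig)
```

This sidesteps the defunct-shell problem entirely, because the kernel delivers the signal to each member of the group rather than relying on the shell to forward it.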

Kill external process on pythonw.exe

In Python, external processes can be started easily using the subprocess module. For instance, on Windows:
import subprocess

command = 'external_app'
my_process = subprocess.Popen(
    command,
    creationflags=subprocess.CREATE_NEW_PROCESS_GROUP,
    shell=True,
    stderr=subprocess.STDOUT,
    stdout=subprocess.PIPE,
    universal_newlines=True)
To kill the process, we can run:
import os
import signal

os.kill(my_process.pid, signal.CTRL_BREAK_EVENT)
This works fine using the command-line interpreter of Python (python.exe). But if I want to start and stop processes from within a graphical Python application without a command-line window, using pythonw.exe, the problem is that I can no longer stop the process with os.kill.
How can I kill an external process on Windows with pythonw.exe?
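One workaround (my suggestion, not from the thread) is to avoid console control events entirely and kill the process tree with the taskkill utility, which does not require the parent to own a console and therefore also works under pythonw.exe. A non-Windows fallback is included only so the sketch is self-contained:

```python
import os
import signal
import subprocess
import sys

def kill_tree(pid):
    if sys.platform == "win32":
        # /T kills the whole process tree, /F forces termination.
        # taskkill needs no console of its own, so this also works
        # from a GUI process started with pythonw.exe.
        subprocess.run(["taskkill", "/PID", str(pid), "/T", "/F"],
                       capture_output=True)
    else:
        # POSIX fallback, for illustration only.
        os.kill(pid, signal.SIGTERM)
```

Because the CTRL_BREAK_EVENT approach depends on the sender and receiver sharing a console, it breaks exactly in the pythonw.exe case the asker describes; delegating to an external killer avoids that dependency.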

Run python script from another python script but not as an child process

Is it possible to run a Python script from another Python script without waiting for it to terminate?
The parent process should terminate immediately after creating the child process.
I tried:
subprocess.Popen([sys.executable, "main.py"], stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)
and also:
os.system(...)
If you know that the other Python script has a main function, you could simply call that other script from your code:
import main
...
exit(main.main())
But here the other script executes in the context of the calling script. If you want to avoid that, you can use the os.exec... functions to launch a new Python interpreter:
import os
import sys
...
os.execl(sys.executable, "python", 'main.py')
The exec family of functions replaces (on Unix/Linux) the current Python interpreter with a new one.
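A small self-contained demonstration of that replacement semantics, run in a throwaway child so it does not replace your own interpreter: the line after os.execl never executes, because the process image is gone by then.

```python
import subprocess
import sys

# The demo program execs into a second interpreter; its final print is
# never reached because os.execl replaces the running process image.
demo = (
    "import os, sys\n"
    "print('before exec', flush=True)\n"
    "os.execl(sys.executable, 'python', '-c', \"print('after exec')\")\n"
    "print('never reached')\n"
)
out = subprocess.run([sys.executable, "-c", demo],
                     capture_output=True, text=True).stdout
```

The captured output contains "before exec" and "after exec" but not "never reached".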
You can just add & to start the script in the background:
import os
os.system('/path/to/script.sh &')
exit()
In this case the launched shell script will continue running even after the main Python script exits.
But keep in mind that this can cause zombie processes to appear in your system.
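A quick way to see this behavior (a POSIX-only sketch; the marker file path is made up for the demo) is to have a parent interpreter background a command with & and exit immediately, then observe that the backgrounded command still completes:

```python
import os
import subprocess
import sys
import tempfile
import time

marker = os.path.join(tempfile.gettempdir(), "bg_survivor_demo")
if os.path.exists(marker):
    os.remove(marker)

# The parent backgrounds a shell command with '&' and exits at once;
# the backgrounded command keeps running after the parent is gone and
# creates the marker file a second later.
parent_code = (
    "import os\n"
    f"os.system('sh -c \"sleep 1; touch {marker}\" &')\n"
)
subprocess.run([sys.executable, "-c", parent_code])  # parent returns at once
time.sleep(2)
survived = os.path.exists(marker)
```

The marker exists after the parent has exited, confirming the child was orphaned rather than killed.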

Python - subprocess dying causes program to be suspended

I am invoking a subprocess via some Python code resembling the following:
proc = subprocess.Popen(["bash", "-ic", ". foo.bash && some_func_from_foo"])
One of the processes I am opening spawns a bunch of child processes (or subprocesses, or forked processes, I'm not sure which), and one of those children dies. This dying is OK and expected behavior. However, I do not expect the actual Python program to be interrupted when this node dies; yet right after the child process dies the Python program is suspended and I see [1]+ Stopped rosrun my_package my_app.py.
Some more details: I am running a ROS launch file from the command line. When one of its nodes dies it gives the following sort of output:
terminate called after throwing an instance of 'std::runtime_error'
what(): ...
[my_node_name] process has died [pid 30816, exit code -6, ...]
And then the Python program gets sent to the background, putting me back in bash (I have to run fg 1 to resume the Python program).
Is there any way to prevent the Python process from getting backgrounded by a subprocess that spawns a child that dies?
subprocess may throw an exception. Have you tried wrapping your code in a try/except block to handle the errors?
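A more likely culprit (my assumption; the thread does not confirm it) is the -i flag: an interactive bash competes for the controlling terminal, and a background job that touches the terminal gets stopped with SIGTTIN/SIGTTOU, which matches the [1]+ Stopped message. A sketch that sources the file non-interactively with stdin detached:

```python
import subprocess

def run_shell_function(script, func):
    # No -i: the shell stays non-interactive and never tries to take
    # the controlling terminal. DEVNULL stdin means any read from the
    # terminal fails fast instead of stopping the job with SIGTTIN.
    return subprocess.Popen(
        ["bash", "-c", ". {} && {}".format(script, func)],
        stdin=subprocess.DEVNULL,
    )
```

The script and function names here are placeholders for the asker's foo.bash and some_func_from_foo; the point is dropping -i and detaching stdin, not the specific command.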
