Python: Kill a tensorflow subprocess - python

Is it possible to kill a process of another user with python by using:
import subprocess
def killProcess(pid):
    p = subprocess.Popen(['sudo', 'kill', '-9', str(pid)], stdout=subprocess.PIPE)
Because if I execute this, nothing happens. If I execute sudo kill -9 pid in a terminal it works no matter which user I am logged in as, so I think something is wrong with my Popen call. I am trying to kill subprocesses spawned with Python's multiprocessing module. Each of those subprocesses creates TensorFlow instances. When the main process has been killed, the subprocesses are still blocking the GPU's memory and therefore have to be killed.
I also tried the psutil.Process(pid).terminate() approach. But then I get the error message:
AccessDenied: psutil.AccessDenied (pid=326080)
Anyone has an idea?
Best regards!
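One way to see why "nothing happens" is to capture the command's output and exit status: Popen with only stdout=PIPE hides stderr, and sudo may be failing silently (for example, prompting for a password when no terminal is attached). A minimal diagnostic sketch, using sudo -n so it fails immediately instead of prompting:

```python
import subprocess

def kill_process(pid):
    """Try to kill another user's process and report why it failed, if it did."""
    try:
        result = subprocess.run(
            ["sudo", "-n", "kill", "-9", str(pid)],  # -n: fail rather than prompt
            capture_output=True, text=True,
        )
    except FileNotFoundError:
        print("sudo not available on this system")
        return 127
    if result.returncode != 0:
        # Without capturing stderr this failure would be invisible.
        print("kill failed:", result.stderr.strip())
    return result.returncode
```

If the return code is nonzero and stderr mentions a password, the fix is a sudoers rule allowing passwordless kill, not a change to the Python code.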

try using psutil,
import psutil

for proc in psutil.process_iter():
    if 'tensorflow' in proc.name():
        proc.kill()
or
[p.kill() for p in psutil.process_iter() if 'tensorflow' in p.name()]
Each Process object yielded by process_iter() has its own .kill() method.

Related

How to kill a MATLAB Batch Process ran by Qt QProcess?

I currently have a Python program that calls a MATLAB script as batch like so:
matlab = QProcess()
matlab.start('matlab -noFigureWindows -batch "cd(users/script_directory/); MyScript.m;"')
#^ command to start MATLAB batch process in CMD
The issue I'm running into is that once this batch process starts, there's no way to kill it. So if my Python app gets force-closed, the MATLAB script keeps running and causes all sorts of issues, meaning I need to kill the process on app close.
I'm calling the MATLAB script as a QProcess and get the following message when I force-close the Python app before the MATLAB script finishes executing:
QProcess: Destroyed while process ("matlab") is still running.
With this, how do I stop the batch MATLAB process? Using Ctrl-C in CMD sometimes kills the process, but I need something consistent for the Python app to work right.
Similarly, can I just have it 'force quit' or 'restart' batch MATLAB or anything along those lines to clear all running processes?
A brute force way to kill it would be to just kill any matlab process via the process and system utilities library at the start of your application:
import psutil
for process in psutil.process_iter():
    if process.name().lower() == 'matlab.exe':
        process.terminate()
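Since MATLAB may itself spawn child processes, a slightly more thorough variant kills the whole tree and matches both 'matlab' (Linux) and 'matlab.exe' (Windows). This is a sketch built on psutil's documented API; registering it with atexit only covers normal interpreter shutdown, not a hard force-kill of the Python app:

```python
import atexit
import psutil

def kill_matlab_tree():
    """Terminate any running MATLAB process together with its children."""
    for proc in psutil.process_iter(['name']):
        try:
            name = (proc.info['name'] or '').lower()
            if name.startswith('matlab'):  # matches 'matlab' and 'matlab.exe'
                # Terminate children first so they cannot outlive the parent.
                for child in proc.children(recursive=True):
                    child.terminate()
                proc.terminate()
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process vanished or is not ours; skip it

# Run the cleanup when the Python app exits normally.
atexit.register(kill_matlab_tree)
```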

Python killing process using psutil

I'm trying to use the Python psutil library to create and kill processes. My script starts a process and then tries to kill the started subprocess. I run the same code under Windows and Linux. Under Windows everything works well. Under Linux psutil starts the subprocess correctly (the started app is the script's child and runs with the same privileges as the script), but when I try to kill the process, psutil detaches from it instead of killing it. Here is the code that starts the app:
self.__proc = psutil.Popen(cmd, cwd=working_directory, env=env, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
After this I try to kill started child process:
self.__proc.kill()
self.__proc = None
I got same behavior using this:
while psutil.pid_exists(pid):
    p = psutil.Process(pid)  # raises NoSuchProcess if the pid is gone; never returns None
    p.kill()
Can anyone explain why I can't kill process started by me? What I'm doing wrong?
I'm using Python 2.7.
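A common cause of this symptom is that cmd is actually run through a wrapper (a shell, or a launcher script): killing self.__proc then kills only the wrapper, and the real worker is re-parented and survives. A sketch of killing the whole tree with psutil's documented children()/wait_procs() API:

```python
import psutil

def kill_proc_tree(pid, include_parent=True):
    """Kill a process and all of its descendants.

    If the command was started through a shell or launcher, pid refers
    to the wrapper; the real worker is a child, so walk the tree first.
    """
    try:
        parent = psutil.Process(pid)
    except psutil.NoSuchProcess:
        return
    children = parent.children(recursive=True)
    for child in children:
        child.kill()
    psutil.wait_procs(children, timeout=5)  # reap the killed children
    if include_parent:
        parent.kill()
        parent.wait(5)
```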

Python - subprocess dying causes program to be suspended

I am invoking a subprocess via some Python code resembling the following:
proc = subprocess.Popen(["bash", "-ic", ". foo.bash && some_func_from_foo"])
One of the processes I am opening spawns a bunch of child process (or subprocesses, or forked processes, not sure which), and one of the child processes dies. This dying is OK and expected behavior. However, I do not expect the actual Python program to be interrupted when this node dies, but right after the child process dies the Python program is suspended and I see [1]+ Stopped rosrun my_package my_app.py.
Some more details: I am running a ROS launch file from the command line. When one of its nodes dies it gives the following sort of output:
terminate called after throwing an instance of 'std::runtime_error'
what(): ...
[my_node_name] process has died [pid 30816, exit code -6, ...]
And then the Python program gets sent to the background, putting me back in bash (I have to run fg 1 to resume the Python program).
Is there any way to prevent the Python process from getting backgrounded by a subprocess that spawns a child that dies?
subprocess may throw an exception. Have you tried wrapping your code in a try/except block to handle the errors?
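Beyond catching launch errors, one thing worth trying: bash -i starts an interactive shell, and interactive shells enable job control, which can stop the parent's process group when the shell competes for the terminal (the [1]+ Stopped symptom). A sketch that drops -i and wraps the launch, with "echo sourced" standing in for ". foo.bash && some_func_from_foo":

```python
import subprocess

try:
    # -c instead of -ic: a non-interactive shell does not enable job
    # control, so it will not fight the parent for the terminal.
    proc = subprocess.Popen(["bash", "-c", "echo sourced"])
    rc = proc.wait()
except OSError as exc:  # bash missing or not executable
    rc = None
    print("failed to launch:", exc)
```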

Blocking and Non Blocking subprocess calls

I'm completely confused between subprocess.call() , subprocess.Popen(), subprocess.check_call().
Which is blocking and which is not ?
What I mean to say is if I use subprocess.Popen() whether the parent process waits for the child process to return/exit before it keep on its execution.
How does shell=True affect these calls?
Popen is nonblocking. call and check_call are blocking.
You can make the Popen instance block by calling its wait or communicate method.
If you look in the source code, you'll see call calls Popen(...).wait(), which is why it is blocking.
check_call calls call, which is why it blocks as well.
Strictly speaking, shell=True is orthogonal to the issue of blocking. However, shell=True causes Python to exec a shell and then run the command in the shell. If you use a blocking call, the call will return when the shell finishes. Since the shell may spawn a subprocess to run the command, the shell may finish before the spawned subprocess. For example,
import subprocess
import time
proc = subprocess.Popen('ls -lRa /', shell=True)
time.sleep(3)
proc.terminate()
proc.wait()
Here two processes are spawned: Popen spawns one subprocess running the shell. The shell in turn spawns a subprocess running ls. proc.terminate() kills the shell, but the subprocess running ls remains. (That is manifested by copious output, even after the python script has ended. Be prepared to kill the ls with pkill ls.)
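The blocking difference itself can be demonstrated with a small timing sketch (assuming a POSIX sleep binary is available):

```python
import subprocess
import time

# call() blocks until the child exits, so this takes about a second...
start = time.monotonic()
subprocess.call(["sleep", "1"])
call_elapsed = time.monotonic() - start

# ...while Popen() returns immediately; the child runs concurrently.
start = time.monotonic()
proc = subprocess.Popen(["sleep", "1"])
popen_elapsed = time.monotonic() - start

proc.wait()  # block explicitly once you are ready to synchronize
```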

Terminal Command Doesn't Finish

I have a command in the terminal which doesn't finish. I mean it's not like ls, which exits after executing. I am using this command in my Python code, so I need to get past it because I have to proceed. Any idea how to do it?
It looks like you can just use Python's Popen to create a child process and then not wait for the child process to complete
http://docs.python.org/library/subprocess.html
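Following that suggestion, a minimal sketch with "sleep 60" standing in for the never-finishing command:

```python
import subprocess

# Popen starts the command and returns immediately; the script
# continues while the child keeps running in the background.
proc = subprocess.Popen(["sleep", "60"])
print("still going, child pid:", proc.pid)

# ...do other work here...

proc.terminate()  # stop the child when it is no longer needed
proc.wait()       # reap it so it does not linger as a zombie
```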
