OS: Jessie
Python: 2.7
I want to use psutil to terminate the script I am currently executing. My problem is that I would like to kill it by its PID, but I don't know how to get the PID of my script.
I know I can terminate processes by name, but I think that's not a pretty solution.
Does anyone have an idea how to make this work?
I have set up my Pi with the PiCamera, a GUI and some sensors. I am using the cv2 library and the problem is that the windows won't close.
Therefore, I googled how to close them, but there wasn't any solution I could use. Killing the process is fine with me.
EDIT:
import psutil

def on_terminate(proc):
    print("process {} terminated with exit code {}".format(proc, proc.returncode))

procs = psutil.Process().children()
for p in procs:
    p.terminate()
gone, still_alive = psutil.wait_procs(procs, timeout=3, callback=on_terminate)
for p in still_alive:
    p.kill()
I found this snippet in the documentation. How can I make it work with PIDs?
os.getpid() combined with "How to terminate process from Python using pid?" was the answer.
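For reference, a minimal sketch of how those two pieces fit together, assuming the goal is for the script to terminate itself (the cleanup step is only a placeholder):

import os
import psutil

my_pid = os.getpid()          # PID of the script that is currently running
me = psutil.Process(my_pid)   # wrap it so it can be handled like any other process

# ... close cv2 windows, release the camera, etc. (placeholder) ...

me.terminate()                # send SIGTERM; me.kill() forces it if terminate is ignored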
I am debugging a multiprocess program with anaconda2 in pycharm community edition.
It has several background worker processes. Each worker process polls the input Queue without sleeping until it receives a task. In fact, I'm only interested in the main process, but the PyCharm debugger always steps into the subprocess; it looks as if the main process never gets to run and the task is never sent out. How can I keep the debugger out of the subprocess?
The worker subprocess looks like this:
class ILSVRC_worker:
    ...
    def run(self):
        cfg_parser = ConfigParser.ConfigParser()
        cfg_parser.read(self.cfg_path)
        data_factory = ILSVRC_DataFactory(cfg_parser)
        logger = mp.log_to_stderr(logging.INFO)
        while True:
            try:
                annotation_path = self.que_in.get(True, 0.1)
            except Queue.Empty:
                continue
            if annotation_path is None:
                # to exit the subprocess
                logger.info('exit the worker process')
                break
            ...
I could think of two ways to achieve this, but unfortunately I don't think either will be possible with the Community Edition.
If you have the PID of the process you could try attaching to it using the Tools > Attach to Process... functionality (I don't know if that is available in the Community Edition). This is difficult if you use a Pool, because you don't know which process the job is assigned to.
Another way would be to use a remote debugger and connect to it in the dispatched Python process, but this is only available in the Professional Edition.
I ended up testing my code without any multiprocessing.
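For completeness, a rough sketch of that workaround; the DEBUG_SINGLE_PROCESS flag and the start_worker helper are hypothetical, not part of the original code:

import multiprocessing as mp

DEBUG_SINGLE_PROCESS = True   # hypothetical flag used only while debugging

def start_worker(worker):
    if DEBUG_SINGLE_PROCESS:
        # Run the worker loop in the main process, so the debugger never
        # has to follow a child process.
        worker.run()
    else:
        p = mp.Process(target=worker.run)
        p.start()
        return p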
I am working on a Windows platform. In my Python script, I can make a call to an external program in the following ways:
os.system("C:\mainfolder\menu.exe C:\others\file1.inp C:\others\file2.inp")
os.popen("C:\mainfolder\menu.exe C:\others\file1.inp C:\others\file2.inp")
subprocess.call(["C:\mainfolder\menu.exe","C:\others\file1.inp" "C:\others\file2.inp"])
where:
menu.exe: is my external program.
file1 and file2: are input files to my external program.
All the above works fine. Now that my external program has finished successfully, I need to close it completely, along with all the windows it left open. I have gone through lots of other posts, Python documentation, etc. and found commands such as:
os.system("taskkill /im C:\mainfolder\menu.exe")
os.kill(proc.pid,9)
child.kill()
But they did not work. I spent a lot of time trying to find something that worked for me, until I realised that whatever commands I put after the call are never reached while the program is running, because Python does not know that my external program has finished. That is why I can easily terminate the program from the command line at any time just by typing taskkill /im menu.exe, but not from Python.
Does anybody know how to sort this out? Should I include something else when I make the call to my external program?
Here's some example code showing how to detect whether a program has opened a window. All you need to know is the title of the message box that menu.exe opens when it is finished:
import subprocess
import win32gui
import time

def enumHandler(hwnd, lParam):
    if win32gui.IsWindowVisible(hwnd):
        if 'Menu.exe Finished' in win32gui.GetWindowText(hwnd):
            proc.kill()

proc = subprocess.Popen([r"C:\mainfolder\menu.exe", r"C:\others\file1.inp", r"C:\others\file2.inp"])
while proc.poll() is None:
    win32gui.EnumWindows(enumHandler, None)
    time.sleep(1)
If you want the process to end before your script continues, i.e. you wait for it to finish, that is a blocking call: os.system() normally waits, and with subprocess the same is achieved with .communicate() (or .wait()).
If instead you want to end the process later in your program, i.e. run it asynchronously and non-blocking, get its PID when you start it; depending on whether shell=True or not, that will be the PID of the spawned shell or of the child process itself.
That PID can then be used to end the process, either immediately using psutil or os, or after waiting for it to finish, which uses little CPU time and lets you do other work (or use threads) in the meantime.
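A rough sketch of that non-blocking variant, using the same placeholder paths as above (psutil is used here to end the process tree by PID; this is an illustration, not the poster's exact code):

import subprocess
import psutil

# Start menu.exe without blocking; keep the Popen object (and its PID) around.
proc = subprocess.Popen([r"C:\mainfolder\menu.exe",
                         r"C:\others\file1.inp",
                         r"C:\others\file2.inp"])

# ... do other work while menu.exe runs ...

# Later, end it by PID: terminate any child processes first, then the parent.
parent = psutil.Process(proc.pid)
for child in parent.children(recursive=True):
    child.terminate()
parent.terminate()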
It might be a bit late to post my findings to this question, as I asked it some months back, but they may still be helpful for other readers.
With the help of the people who tried answering my question, especially Daniel's answer, I found a solution to my problem. I'm not sure it is the best one, but I got what I needed.
Basically, instead of looking for the word "Finished" in my pop-up window, I look for the results that my external program generates. If these results have been generated, it means the program has finished, so I then kill the process:
proc = subprocess.Popen([r"C:\mainfolder\menu.exe", r"C:\others\file1.inp", r"C:\others\file2.inp"])
while proc.poll() is None:
    if os.path.exists("D:\\Results_folder\\Solution.txt"):
        time.sleep(10)
        os.system('taskkill /im menu.exe')
I have a python script that runs another python program and then gathers results from the logs. The only problem is that I want it to run for a limited number of seconds. So I want to kill the process after, say, 1 minute.
How can I do this?
I'm running an external program with the command os.system("./test.py")
You need more control over your child process than os.system allows. The subprocess module, especially Popen and Popen objects, gives you enough control for managing child processes. For a timer, see the timer facilities in the standard library (for example threading.Timer).
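A minimal sketch of that idea, combining subprocess with a timer from the standard library (the 60-second limit is only an example):

import subprocess
from threading import Timer

proc = subprocess.Popen(['./test.py'])

timer = Timer(60, proc.kill)   # kill the child if it is still running after 60 s
timer.start()
try:
    proc.wait()                # returns as soon as test.py exits (or is killed)
finally:
    timer.cancel()             # don't fire the timer if the script finished early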
Check out the psutil module. It provides a cross-platform interface to retrieving information on all running processes, and allows you to kill processes also. (It can do more, but that's all you should need!)
Here's the basic idea of how you could use it:
import os
import psutil
import time

# Note: os.system blocks until './test.py' exits, so it is backgrounded here
# (alternatively, use subprocess.Popen, as in the example below).
os.system('./test.py &')

# Find the PID for './test.py'.
# psutil makes scanning the running processes easy.
pid = None
for proc in psutil.process_iter():
    if 'test.py' in ' '.join(proc.cmdline()):
        pid = proc.pid
        break

time.sleep(60)
if pid is not None:
    psutil.Process(pid).kill()
#!/usr/bin/env python
import time, os, subprocess
process = subprocess.Popen(
    ['yes'], stdout=open(os.devnull, 'w'))
time.sleep(60)
process.terminate()
I am trying to start a java process that is meant to take a long time, using python's subprocess module.
What I am actually doing is using the multiprocessing module to start a new Process, and inside that process using the subprocess module to run java -jar.
This works fine, but when I start the new process, the java process appears to replace the Python process started with multiprocessing.Process. I would like java to run as a child process, so that when the process that started the new multiprocessing.Process dies, the process running java dies too.
Is this possible?
Thanks.
Edit: here's some code to clarify my question:
from multiprocessing import Process
from subprocess import Popen

def run_task():
    pargs = ["java", "-jar", "app.jar"]
    p = Popen(pargs)
    p.communicate()[0]
    return p

while True:
    a = a_blocking_call()
    process = Process(target=run_task)
    process.start()
    if not a:
        break
I want the process running run_task to be killed along with the process running java when the process executing the while loop reaches the break line. Is this possible?
I think you should show some code, it's not clear how you are using subprocess and multiprocessing together.
From the documentation it looks like subprocess should spawn a new process and not replace your Process-started one. Are you sure that isn't what is happening? A test case showing that it doesn't spawn correctly would be good.
You may get some hints out of Detach a subprocess started using python multiprocessing module
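As a quick self-contained check of that (sleep 5 stands in here for java -jar app.jar, which is assumed not to be available): if subprocess behaves as documented, the worker and its child report two different PIDs, i.e. the child is spawned alongside the worker rather than replacing it.

import os
from multiprocessing import Process
from subprocess import Popen

def run_task():
    p = Popen(['sleep', '5'])   # placeholder for ['java', '-jar', 'app.jar']
    # Two distinct PIDs: the child is spawned, not exec'd over the worker.
    print('worker pid: {}, child pid: {}'.format(os.getpid(), p.pid))
    p.communicate()

if __name__ == '__main__':
    worker = Process(target=run_task)
    worker.start()
    worker.join()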
I have a python web application that needs to launch a long running process. The catch is I don't want it to wait around for the process to finish. Just launch and finish.
I'm running on Windows XP, and the web app is running under IIS (if that matters).
So far I tried Popen, but that didn't seem to work: it waited until the child process finished.
Ok, I finally figured this out! This seems to work:
from subprocess import Popen
from win32process import DETACHED_PROCESS
pid = Popen(["C:\python24\python.exe", "long_run.py"],creationflags=DETACHED_PROCESS,shell=True).pid
print pid
print 'done'
#I can now close the console or anything I want and long_run.py continues!
Note: I added shell=True. Otherwise calling print in the child process gave me the error "IOError: [Errno 9] Bad file descriptor"
DETACHED_PROCESS is a Process Creation Flag that is passed to the underlying WINAPI CreateProcess function.
Instead of directly starting processes from your webapp, you could write jobs into a message queue. A separate service reads from the message queue and runs the jobs. Have a look at Celery, a Distributed Task Queue written in Python.
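A minimal sketch of what that could look like with Celery; the broker URL and the long_run task are placeholders for whatever the real job is:

# tasks.py -- the worker is started separately, e.g. with:  celery -A tasks worker
from celery import Celery

app = Celery('tasks', broker='amqp://localhost')   # placeholder broker URL

@app.task
def long_run(arg):
    # the actual long-running work goes here
    pass

# In the web app the job is only enqueued, so the request returns immediately:
#   from tasks import long_run
#   long_run.delay('some-argument')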
This almost works (from here):
from subprocess import Popen
pid = Popen(["C:\python24\python.exe", "long_run.py"]).pid
print pid
print 'done'
'done' will get printed right away. The problem is that the parent process above keeps running until long_run.py returns, and if I close that process it kills long_run.py's process as well.
Surely there is some way to make a process completely independent of the parent process.
subprocess.Popen does that.