I'm trying to create a script in Python. The idea is to start 3 processes: 2 of them constantly print a message, and the third is there to kill them after a few seconds. The problem is that I don't know how to tell that third process which processes it should terminate.
from multiprocessing import Process
import time

def OkreciLevi():                      # "turn left"
    while True:
        print("okrecem levi")          # "turning the left one"
        time.sleep(3)

def OkreciDesni():                     # "turn right"
    while True:
        print("okrecem desni")         # "turning the right one"
        time.sleep(3)

def Koci(levi, desni):                 # "brake": kill both workers after a delay
    for vrednost in range(2):          # vrednost = "value"
        print(vrednost)
        time.sleep(3)
    levi.terminate()
    desni.terminate()
    print("kocim")                     # "braking"

if __name__ == '__main__':
    levi = Process(target=OkreciLevi)
    desni = Process(target=OkreciDesni)
    koci = Process(target=Koci, args=(levi, desni))
    koci.start()
    levi.start()
    desni.start()
    levi.join()
    desni.join()
    koci.join()
Assuming that you're on a *nix-like operating system, I guess that you need to:
Get the PIDs of the multiprocessing workers;
Send SIGTERM to them, for instance with os.kill.
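A minimal sketch of that approach, assuming a *nix system (the spin/watchdog function names, labels, and 6-second delay are mine, not the OP's):

import os
import signal
import time
from multiprocessing import Process

def spin(label):
    while True:
        print("spinning", label)
        time.sleep(3)

def watchdog(pids):
    # Knows the workers only by PID; plain integers pickle fine,
    # unlike the Process objects themselves.
    time.sleep(6)
    for pid in pids:
        os.kill(pid, signal.SIGTERM)   # default disposition terminates the child
    print("braking")

if __name__ == '__main__':
    levi = Process(target=spin, args=("left",))
    desni = Process(target=spin, args=("right",))
    levi.start()
    desni.start()
    koci = Process(target=watchdog, args=([levi.pid, desni.pid],))
    koci.start()
    levi.join()
    desni.join()
    koci.join()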
Related
I am trying to restart a Python process using the multiprocessing module, but "AssertionError: cannot start a process twice" appears.
My questions:
How can I restart the process?
Once it's terminated, why does it go into zombie mode?
How can I remove the zombie process?
import time
from multiprocessing import Process

def worker():
    while True:
        print("Inside the worker")
        time.sleep(10)

p1 = Process(target=worker, name="worker")
p1.start()
#p1.join()
time.sleep(3)
p1.terminate()
print("after Termination")
time.sleep(3)
p1.start()    # raises AssertionError: cannot start a process twice
Actually, I am trying to create a process-monitor function to watch the memory and CPU usage of all processes. If usage reaches a certain level, I want to restart the process in real time.
How can I restart the process?
You cannot restart a terminated process. You need to instantiate a new process.
Once it's terminated, why does it go into zombie mode?
Because on Unix-y systems the parent process needs to read the exit code before the kernel clears the corresponding entry from the process table.
How can I remove the zombie process?
You have multiple options. I'm citing the docs here:
Joining zombie processes
On Unix when a process finishes but has not been joined it becomes a zombie. There should never be very many because each time a new process starts (or active_children() is called) all completed processes which have not yet been joined will be joined. Also calling a finished process’s Process.is_alive will join the process. Even so it is probably good practice to explicitly join all the processes that you start.
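A short sketch of that reaping behavior on Unix (terminate() stands in for any way the child might die):

import multiprocessing as mp
import time

def task():
    time.sleep(30)

if __name__ == '__main__':
    p = mp.Process(target=task)
    p.start()
    p.terminate()
    time.sleep(0.5)     # child is dead but not yet reaped: a zombie on Unix
    p.join()            # parent reads the exit code; the zombie disappears
    print(p.exitcode)   # -15, i.e. killed by SIGTERM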
Actually, I am trying to create a process-monitor function to watch the memory and CPU usage of all processes.
You should take a look at the psutil module for that.
In case you just want to suspend (not kill) processes if memory consumption gets too high, you might be able to draw some inspiration from my answer here.
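A rough sketch of the monitoring idea with psutil (the 50%/90% thresholds are made-up placeholders):

import psutil

def should_restart(pid, mem_limit=50.0, cpu_limit=90.0):
    proc = psutil.Process(pid)
    # memory_percent() and cpu_percent() report usage relative to the machine
    if proc.memory_percent() > mem_limit or proc.cpu_percent(interval=1.0) > cpu_limit:
        proc.terminate()
        proc.wait()     # reap it so it does not linger as a zombie
        return True     # caller should create a fresh Process
    return False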
I hope this helps:
import time
from multiprocessing import Process

def worker():
    while True:
        print("Inside the worker")
        time.sleep(10)

def proc_start():
    p_to_start = Process(target=worker, name="worker")
    p_to_start.start()
    return p_to_start

def proc_stop(p_to_stop):
    p_to_stop.terminate()
    print("after Termination")

p = proc_start()
time.sleep(3)
proc_stop(p)
time.sleep(3)
p = proc_start()      # a brand-new Process instance, not a restart
print("start again")
time.sleep(3)
proc_stop(p)
terminate() will not let you restart the process, but kill() can be used and the process can be restarted. It works:
import time
from multiprocessing import Process

def worker():
    while True:
        print("Inside the worker")
        time.sleep(10)

p1 = Process(target=worker, name="worker")
p1.start()
#p1.join()
time.sleep(3)
p1.kill()             # Process.kill() (Python 3.7+) sends SIGKILL instead of SIGTERM
print("after kill")
time.sleep(3)
p1.start()
tl;dr: I have several threads, one of them listening to input() to keep the program running and exit on a keypress. But at one point in the program I need to stop this listener, or it will intercept the input meant for a subprocessed program.
Long version:
- The program should download some data, then hand it over to some other console program for processing.
- The program should run either until the download is finished or until an ENTER keypress is received.
- In both cases the download thread should end gracefully and the external processing should then run.
- Problem: the input() function is still listening and intercepts the first input meant for the subprocessed console program.
import os
import subprocess
import threading
import time

def thread_do_downloads():
    # does some downloads and will set the flag "flag_download_completed=True"
    # eventually to signal download completed
    # for this example just set the flag
    global flag_download_completed
    flag_download_completed = True

def do_stuff_with_downloaded_data():
    # this is of course not the program I would call,
    # but this example should show how the input would be intercepted
    if os.name == 'nt':
        parameters = ["set", "/p", "variable=Press Enter"]  # for this example (Windows) call "set", which waits for user input
    else:
        parameters = ["read", "variable"]  # hope this works for linux...
    p1 = subprocess.Popen(parameters, shell=True)
    p1.communicate()

def listen_for_keypress():
    input()
    print("keypress intercepted")

def main():
    dl = threading.Thread(target=thread_do_downloads)
    dl.start()
    kill_listener = threading.Thread(target=listen_for_keypress, daemon=True)  # daemon: to not have it lingering after main thread is done
    kill_listener.start()
    print("Press ENTER to stop downloading.")
    while True:
        if not kill_listener.is_alive() or flag_download_completed:
            break
        time.sleep(1)
    # here are some lines to make sure the download thread above completes gracefully
    do_stuff_with_downloaded_data()
    print("All done")

if __name__ == '__main__':
    flag_download_completed = False
    main()
Will result in:
Press ENTER to stop downloading.
Press Enter << stopped here until I pressed ENTER
keypress intercepted << stopped here until I pressed ENTER
All done
If you can keep the main thread on top of the console, you can take advantage of the fact that input() blocks the main thread until Enter is pressed. Once execution continues (because Enter was pressed), communicate to the running threads that they have to stop, using an Event. If you do want to listen for OS signals, take a look at the signal module (watch out: some features may be OS-dependent).
import threading
import time

def thread_do_downloads(stop_activated):
    # simulates the download; loops until the stop event is set
    while not stop_activated.is_set():
        time.sleep(0.5)
        print("ZZZZZZZ")

def do_stuff_with_downloaded_data():
    print("doing stuff with downloaded data")

def main():
    stop_activated = threading.Event()
    dl = threading.Thread(target=thread_do_downloads, args=(stop_activated,))
    dl.start()
    input("Press ENTER to stop downloading.")
    stop_activated.set()
    print("stopping (waiting for threads to finish...)")
    dl.join()
    # here are some lines to make sure the download thread above completes gracefully
    do_stuff_with_downloaded_data()
    print("All done")

if __name__ == '__main__':
    main()
EDIT (as per the OP's comment):
One of the complications in the original question is how to communicate the termination request to a subprocess. Because a process doesn't share memory with its parent (the process that spawned it), this can, indeed, only (or almost only) be done through actual OS signals. Because of this memory isolation, any flag set in the parent process will have no effect in the spawned subprocesses: the only ways of inter-process communication are OS signals, or files (or file-like structures) that both the parent and the child process know about and use to share information. Also, calling input() in the parent binds the standard input (stdin) to that process, which means that, by default, the subprocesses are unaware of the keys pressed in the parent (you could always bind the child's stdin to the parent's stdin, but that would complicate the code a bit more).
Fortunately, Popen instances offer a nice way to send signals to the child process: the TERM signal, which the subprocess can catch and is supposed to interpret as "Hey, you're going to be stopped real soon, so do your clean-up, close files and so on, and exit", and the KILL signal, which doesn't really tell the subprocess anything (it can't be caught): it just kills it. (On Linux, for instance, a KILL signal removes all access to memory from the killed process, so any further action that uses memory will cause an error.)
To demonstrate that, let's say we have a simple script.py file in the same directory as our main program, looking like this:
script.py:
#!/usr/bin/env python
import sys
import random
import time

def main():
    done = False
    while not done:
        time.sleep(0.5)
        print("I'm busy doing things!!")
        done = random.randint(0, 15) == 1

if __name__ == "__main__":
    main()
    sys.exit(0)  # This is pretty much unnecessary, though
A script that takes a random amount of time to run and that can, potentially, run quite long (at least long enough to demonstrate).
Now, we can create one (or many) subprocesses in a thread that runs that script.py file, regularly check their status (using poll()), and, if the user has requested a forced stop, send a TERM signal and, a bit later, a KILL if necessary.
import threading
import time
import subprocess

def thread_do_downloads(stop_activated):
    p = subprocess.Popen('./script.py', stdout=subprocess.PIPE)
    while p.poll() is None:
        time.sleep(0.5)
        print("Subprocess still running... Sleeping a bit... ZzzzzzZZZ")
        if stop_activated.is_set():
            print("Forced stop requested!!!")
            print("Trying to terminate the process nicely, with a SIGTERM:")
            p.terminate()
            time.sleep(0.5)
            if p.poll() is None:
                print("Not being nice anymore... Die, die, die!!")
                p.kill()
            print("This is what the subprocess 'said':\n%s" % p.stdout.read())
            return
    print("stopping normally")

def do_stuff_with_downloaded_data():
    print("doing stuff with downloaded data")

def listen_for_keypress(stop_activated):
    input("Press ENTER to stop downloading.")
    print("keypress intercepted")
    stop_activated.set()

def main():
    stop_activated = threading.Event()
    dl = threading.Thread(target=thread_do_downloads, args=(stop_activated,))
    dl.start()
    kill_listener = threading.Thread(target=listen_for_keypress, args=(stop_activated,), daemon=True)
    kill_listener.start()
    dl.join()
    print("Finished downloading data")
    # here are some lines to make sure the download thread above completes gracefully
    do_stuff_with_downloaded_data()
    print("All done")

if __name__ == '__main__':
    main()
I need this urgently in my Django site, but because of the time constraint, I cannot do any heavy modifications. This is probably the cheapest in-place modification.
If we just focus on either build or run...
Now I get the id back from build (or run).
All the heavy work is now in a separate function.
import multiprocessing as mp

def main():
    id = get_build_id(....)
    work = mp.Process(target=heavy_build_fn)
    work.start()
    return id
If I run this in the shell (I have not tested it on the actual Django app), the terminal does not return completely until the work process is done with its job. As a web app, I need to return the id right away. Can I put work in the background without blocking?
Thanks.
I've read How do I run another script in Python without waiting for it to finish?, but I want to know other ways to do it, for example sticking with multiprocessing. The Popen solution may not actually be what I want.
import multiprocessing as mp
import time

def build():
    print('I build things')
    with open('first.txt', 'w+') as f:
        f.write('')
    time.sleep(10)
    with open('myname.txt', 'w+') as f:
        f.write('3')
    return

def main():
    build_p = mp.Process(name='build process', target=build)
    build_p.start()
    build_p.join(2)    # waits at most 2 seconds for the child
    return 18

if __name__ == '__main__':
    v = main()
    print(v)
    print('done')
Console:
I build things
18
done
(the shell then sits and waits until the child process finishes; finally:)
user@user-P5E-VM-DO:~$ python mp3.py
I build things
18
done
user@user-P5E-VM-DO:~$
Remove the join() and you may have what you want.
join() waits for the process to end before returning.
The value will be returned before the child process finishes; however, your parent process will stay alive until the child processes complete. Not sure if that's an issue for you or not.
This code:
import multiprocessing as mp
import time

def build():
    print('I build things')
    for i in range(10):
        with open('testfile{}.txt'.format(i), 'w+') as f:
            f.write('')
        time.sleep(5)

def main():
    build_p = mp.Process(name='build process', target=build)
    build_p.start()
    return 18

if __name__ == '__main__':
    v = main()
    print(v)
    print('done')
Returns:
> python mptest.py
18
done
I build things
If you need to allow the parent process to exit while the child process continues, check out the answers here:
Run Process and Don't Wait
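A sketch of the idea from that thread, using subprocess instead of multiprocessing so the child is fully detached and the parent can exit (build_script.py is a hypothetical stand-in for the heavy work):

import subprocess

# start_new_session=True puts the child in its own session (POSIX),
# so it keeps running after the parent exits.
subprocess.Popen(
    ['python', 'build_script.py'],   # hypothetical script doing the build
    start_new_session=True,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)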
No; the easiest way to handle what you want is probably to use a message broker. Django Celery is a great solution. It will let you queue a task and return your view right away to the user. Your task will then be executed in the order it was queued.
I believe processes opened from Django are tied to the thread they were opened in, so your view will wait to return until your process is complete.
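A minimal sketch of the Celery approach, assuming a Celery app is already configured for the Django project (heavy_build_fn and get_build_id are the question's own placeholders):

# tasks.py
from celery import shared_task

@shared_task
def heavy_build():
    heavy_build_fn()              # the question's placeholder for the heavy work

# views.py
def build_view(request):
    id = get_build_id(...)        # elided in the question
    heavy_build.delay()           # queued on the broker; the view returns immediately
    return id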
I wrote a test program with two processes. The parent process gets data from a Queue, and the child puts data into it. There is a signal handler that tells the program to exit. However, sometimes it does not exit when I send SIGTERM to the PID (of the child process) that I printed, and it seems to deadlock.
import os
import sys
import multiprocessing
import time
import signal

bStop = False

def worker(que):
    # runs in the child: install a SIGTERM handler, then flood the queue
    signal.signal(signal.SIGTERM, sighandler)
    print 'worker:', os.getpid()
    for i in range(100000000):
        que.put(i)
    print 'STOP'

def sighandler(num, frame):
    print 'catch signal'
    q.put('STOP')
    sys.exit(0)

q = multiprocessing.Queue(100)
p = multiprocessing.Process(target=worker, args=(q,))
p.start()

# parent: consume until the 'STOP' sentinel arrives
for item in iter(q.get, 'STOP'):
    print 'get', item

print 'main stop'
p.join()
Unless you are running Python 3, you should be using xrange instead of range for a loop that large: Python 2's range builds the entire list in memory, and Python tends to choke once a list exceeds a certain size, so you really need to move to generators at that point.
That could very well be the issue you're seeing right now.
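In the worker above, that means (Python 2):

# Python 2: xrange yields numbers lazily instead of materializing a
# 100-million-element list before the loop even starts.
for i in xrange(100000000):
    que.put(i)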
Alright, I've been using the time module's time.sleep(x) function for a while... but I need something that won't pause the shell, so the user can continue using the program while it's counting.
To be more specific: suppose I had a program that needed to wait 5 seconds before executing a function. With time.sleep(), the user can't type anything into the shell during those 5 seconds because the program is sleeping. However, I need Python to count the 5 seconds in the background while the user is still able to use the shell. Is this possible?
Threading? You should handle one piece of your work in one worker thread and count (or time.sleep) in another, separate worker.
Here is an example that might help you understand and use threading with time.sleep:
import threading
import time

def sleeper():
    print('Starting to sleep')
    time.sleep(10)
    print('Just waking up..')
    print('snooze')
    print('oh no. I have to get up.')

def worker():
    print('Starting to work')
    time.sleep(1)  # this is also work. :)
    print('Done with Work')

t = threading.Thread(name='sleeper', target=sleeper)
w = threading.Thread(name='worker', target=worker)
w.start()
t.start()