I'm trying to understand how processes send messages to one another; my example is below.
I use the second function to do my main job, and the queue feeds the first function from time to time so it can do its own work, regardless of when it finishes. I have looked at many examples and tried different approaches, but with no success. Can anyone explain how to do this with my example?
from multiprocessing import Process, Queue, Manager
import time
def first(a, b):
    q.get()
    print a+b
    time.sleep(3)

def second():
    for i in xrange(10):
        print "second func"
        k += 1
        q.put = (i, k)

if __name__ == "__main__":
    processes = []
    q = Queue()
    manager = Manager()
    p = Process(target=first, args=(a, b))
    p.start()
    processes.append(p)
    p2 = Process(target=second)
    p2.start()
    processes.append(p2)
    try:
        for process in processes:
            process.join()
    except KeyboardInterrupt:
        print "Interrupt"
I have a Python function which calls another function using multiprocessing. How can I kill all the multiprocessing processes using Python?
This is the outer call, which itself runs as a separate process:
p = Process(target=api.queue_processor,
            args=(process_queue_in, process_queue_out, process_lock,
                  command_queue_in, command_queue_out, api.OBJECTIVE_0_5NA,
                  quit_event, log_file))
p.start()
This is the function that gets called:
def queue_processor(process_queue_in, process_queue_out, process_lock,
                    command_queue_in, command_queue_out, objective_type_NA,
                    quit_event, log_file=None):
    slide, roi_coordinates = process_obj[0]
    logger.info("Received for imaging {} with ROI {} and type {}".format(slide.tile_dir,
                                                                         roi_coordinates,
                                                                         slide.slide_type))
    p = Process(target=iad.image_slide,
                args=(slide.tile_dir, slide.slide_type, roi_coordinates, exception_queue,
                      objective_type_NA, logger, clog_path))
    p.start()  # process to kill
I want to kill this second, inner process (marked with the #process to kill comment).
We can kill or terminate a process immediately by using the terminate() method. Here we use it to terminate the child process created from the function before it finishes its execution.
import multiprocessing
import time

def Child_process():
    print('Starting function')
    time.sleep(5)
    print('Finished function')

P = multiprocessing.Process(target=Child_process)
P.start()
print("My Process has terminated, terminating main thread")
print("Terminating Child Process")
P.terminate()
print("Child Process successfully terminated")
I noticed that data received through a multiprocessing pipe cannot be evaluated directly. In the example below, the code gets stuck in the child process.
import multiprocessing as mp
def child(conn):
    while True:
        if conn.recv() == 1:
            conn.send(1)
        if conn.recv() == 2:
            conn.send(2)
    conn.close()

def main():
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=child, args=(child_conn,))
    p.start()
    while True:
        parent_conn.send(1)
        print(parent_conn.recv())
    p.join()

if __name__ == '__main__':
    main()
But if I assign conn.recv() to a variable in the child process, as shown below, then everything works.
def child(conn):
    while True:
        x = conn.recv()
        if x == 1:
            conn.send(1)
        if x == 2:
            conn.send(2)
    conn.close()
I assume this is because the parent and child processes are running concurrently, so the data being passed should only be evaluated as it is received. Is this the cause?
I am running Python 3.7 on Windows 10.
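For what it's worth, a minimal sketch of what is likely going on, assuming standard Pipe semantics: each conn.recv() call consumes one message, so calling recv() twice per loop iteration makes the child block waiting for a second message that the parent never sends. Assigning the result to a variable means recv() is called only once per iteration:

import multiprocessing as mp

def child(conn):
    # each recv() consumes one message; two recv() calls need two sends
    first = conn.recv()
    second = conn.recv()
    conn.send((first, second))
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=child, args=(child_conn,))
    p.start()
    parent_conn.send(1)        # matched by the child's first recv()
    parent_conn.send(2)        # matched by the child's second recv()
    print(parent_conn.recv())  # prints (1, 2)
    p.join()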
Here is a simple example:
from collections import deque
from multiprocessing import Process
global_dequeue = deque([])
def push():
    global_dequeue.append('message')

p = Process(target=push)
p.start()

def pull():
    print(global_dequeue)

pull()
The output is deque([]).
If I were to call the push function directly, not as a separate process, the output would be deque(['message']).
How can I get the message into the deque but still run the push function in a separate process?
You can share data by using a multiprocessing Queue object, which is designed to share data between processes:
from multiprocessing import Process, Queue
import time
def push(q):  # the Queue is passed to the function as an argument
    for i in range(10):
        q.put(str(i))  # put an element into the Queue
        time.sleep(0.2)
    q.put("STOP")  # put a poison pill to stop taking elements from the Queue in the master

if __name__ == "__main__":
    q = Queue()  # create Queue instance
    p = Process(target=push, args=(q,),)  # create Process
    p.start()  # start it
    while True:
        x = q.get()
        if x == "STOP":
            break
        print(x)
    p.join()  # join the process to our master process and continue the master's run
    print("Finish")
Let me know if it helped, feel free to ask questions.
You can also use Managers to achieve this.
Python 2: https://docs.python.org/2/library/multiprocessing.html#managers
Python 3: https://docs.python.org/3.8/library/multiprocessing.html#managers
Example of usage:
https://pymotw.com/2/multiprocessing/communication.html#managing-shared-state
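For completeness, a minimal sketch of the Manager approach, assuming a manager.list() proxy is used as the shared container in place of the module-level deque:

from multiprocessing import Process, Manager

def push(shared):
    shared.append('message')   # the append goes through the manager proxy

if __name__ == "__main__":
    manager = Manager()
    shared = manager.list()    # proxy list shared between processes
    p = Process(target=push, args=(shared,))
    p.start()
    p.join()
    print(list(shared))        # ['message']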
From the main process I am spawning a new process using multiprocessing.Process.
My aim is to run a heavy, CPU-intensive task in the child process; if the task takes too long to finish (controlled by the timeout_in variable), terminate it with a response, otherwise compute and get the result back from the child process.
I am able to terminate the child if it takes too long, but I am not able to get the result object back when the child process is not forcibly terminated.
from multiprocessing import Process, Queue

def do_threading(function, argument, timeout_in=1):
    # Making a queue for data exchange
    q = Queue()
    # Start function as a process
    p = Process(target=function, args=(argument, q,))
    p.start()
    # Wait for 10 seconds or until process finishes
    p.join(timeout_in)
    # If thread is still active
    if p.is_alive():
        print("running... let's kill it...")
        # print(q.get())
        # Terminate
        p.terminate()
        p.join()

def do_big_job(argument, q):
    # Do something with passed argument
    print(argument)
    # heavy computation
    result = 2**1234567
    # print("in child thread ", result)
    # Putting result in the queue for exchange
    q.put(result)

def main_2():
    print("Main thread starting...")
    do_threading(do_big_job, "Child thread starting...", timeout_in=10)

if __name__ == '__main__':
    main_2()
I think the problem comes from the fact that you create the Queue inside do_threading. So when your calculation runs normally (no timeout), the function returns and the queue goes away with it.
Here is an alternative code that works if there is no timeout:
from multiprocessing import Process, Queue

def do_threading(q, function, argument, timeout_in=1):
    # Start function as a process
    p = Process(target=function, args=(argument, q,))
    p.start()
    # Wait for 10 seconds or until process finishes
    p.join(timeout_in)
    print("time out")
    # If thread is still active
    if p.is_alive():
        print("running... let's kill it...")
        # print(q.get())
        # Terminate
        p.terminate()
        print("terminate")
        p.join()

def do_big_job(argument, q):
    # Do something with passed argument
    print(argument)
    # heavy computation
    result = 2**123
    # print("in child thread ", result)
    # Putting result in the queue for exchange
    q.put(result)

if __name__ == '__main__':
    q = Queue()  # Creating the queue in the main allows you to access it anytime
    print("Main thread starting...")
    do_threading(q, do_big_job, "Child thread starting...", timeout_in=10)
    if q.empty():
        pass
    else:
        print(q.get())  # get your result here
Try to catch the timeout exception on the queue instead of on the process, for example:
...
from multiprocessing.queues import Empty
...

def do_threading(q, function, argument, timeout_in=1):
    # Start function as a process
    p = Process(target=function, args=(argument, q,))
    p.start()
    try:
        print(q.get(True, timeout_in))
    except Empty:
        print("time out")
    p.terminate()
    p.join()
Or you can get the result in the else branch of your existing code:
...
# If thread is still active
if p.is_alive():
    print("running... let's kill it...")
    # Terminate
    p.terminate()
else:
    print(q.get())
I've gone through this SO thread (Synchronization issue using Python's multiprocessing module), but it doesn't provide the answer.
The following code:
from multiprocessing import Process, Lock

def f(l, i):
    l.acquire()
    print 'hello world', i
    l.release()
    # do something else

if __name__ == '__main__':
    lock = Lock()
    for num in range(10):
        Process(target=f, args=(lock, num)).start()
How do I get the processes to execute in order? I want each process to hold the lock for a few seconds and then release it, so that P1 finishes first, then P2 acquires the lock and runs, then P3, and so on. How can I make the processes execute in order?
It sounds like you just want to delay the start of each successive process. If that's the case, you can use a multiprocessing.Event to delay starting the next child in the main process. Just pass the event to the child, and have the child set the Event when it's done doing whatever should run prior to starting the next child. The main process can wait on that Event, and once it's signalled, clear it and start the next child.
from multiprocessing import Process, Event

def f(e, i):
    print 'hello world', i
    e.set()
    # do something else

if __name__ == '__main__':
    event = Event()
    for num in range(10):
        p = Process(target=f, args=(event, num))
        p.start()
        event.wait()
        event.clear()
This is not the purpose of locks; your code architecture does not fit your use case. I think you should refactor your code to this:
from multiprocessing import Process

def f(i):
    # do something here
    pass

if __name__ == '__main__':
    for num in range(10):
        print 'hello world', num
        Process(target=f, args=(num,)).start()
In this case it will print in order and then do the remaining work asynchronously.