Concurrent methods inside another function in Python

I am new to multiprocessing in Python, and so far all the examples I've seen are of this kind (with one or more functions in the file and then a 'main' guard):
from multiprocessing import Process

def f1(a):
    # do something

def f2(b):
    # do something

if __name__ == '__main__':
    f1(a1)
    p = Process(target=f2, args=(b2,))
    p.start()
    p.join()
If instead I have a function that calls two functions from another file, which should run concurrently, like in the following lines,
def function():
    # do something
    file2.f1(a)  # first concurrent method
    file2.f2(b)  # second concurrent method
how should I do it?
Can anyone give a simple example? I tried it this way, but it starts the whole program again after the first loop:
def function():
    # do something
    for i in range(3):
        p1 = Process(target=file2.f1, args=(a))  # first concurrent method
        p2 = Process(target=file2.f2, args=(b))  # second concurrent method
        p1.start()
        p2.start()
        p1.join()
        p2.join()

The issue seems to be that the args parameter is defined incorrectly: it should be a tuple, not a single variable:
def function():
    # do something
    for i in range(3):
        p1 = Process(target=file2.f1, args=(a, ))  # first concurrent method
        p2 = Process(target=file2.f2, args=(b, ))  # second concurrent method
        p1.start()
        p2.start()
        p1.join()
        p2.join()
If the order of execution is flexible, you can use the Pool class to trigger multiple calls:
from multiprocessing.pool import Pool

pool = Pool()
pool.map_async(f1, [arg] * 3)  # map passes each element directly as the single argument
pool.map_async(f2, [arg] * 3)
pool.close()
pool.join()
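If instead you want file2.f1 and file2.f2 running concurrently on each iteration of the loop, apply_async is another option. A minimal sketch, assuming file2, a and b from the question:

from multiprocessing import Pool
import file2  # the question's module

def function(a, b):
    # do something
    with Pool() as pool:
        results = []
        for i in range(3):
            # schedule both calls without waiting for either
            results.append(pool.apply_async(file2.f1, (a,)))
            results.append(pool.apply_async(file2.f2, (b,)))
        # wait for every scheduled call to finish (and re-raise any error)
        for r in results:
            r.get()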

Related

Start while loop in one multiprocessing function, from another multiprocessing function

So I want to start a nested while loop in one multiprocessing function from another multiprocessing function. In one function, I'm changing a variable (action) to "fn2", and in the other function there is a nested while loop whose condition is while action == "fn2":.
See code:
from multiprocessing import Process

running = True
action = None

def func1():
    global action
    if 1+1 == 2:
        action = "fn2"
        print(action)

def func2():
    while running:
        while action == "fn2":
            print("fn2")

if __name__ == '__main__':
    p1 = Process(target=func1)
    p1.start()
    p2 = Process(target=func2)
    p2.start()
    p1.join()
    p2.join()
However, when I run it, the code just prints "fn2" once (confirming that action is equal to "fn2"). But the nested loop inside func2() does not execute. Sorry if the answer is obvious, I'm new to multiprocessing.
I added two comments (with print statements) to highlight the error.
Basically, action is still None inside func2(), which is why the nested loop never runs:
from multiprocessing import Process

running = True
action = None

def func1():
    global action
    if 1+1 == 2:
        action = "fn2"
        print(action)

def func2():
    while running:
        print('got here')  # <--- loops infinitely here
        print(action)      # <--- this is None
        while action == "fn2":
            print("fn2")

if __name__ == '__main__':
    p1 = Process(target=func1)
    p1.start()
    p2 = Process(target=func2)
    p2.start()
    p1.join()
    p2.join()
In order to share values between processes (the docs call this "sharing state between processes"), you need to use Value or Array for shared memory on a single machine, or alternatively a Manager, which also works across a network of servers.
Here is a link:
https://docs.python.org/3/library/multiprocessing.html
The basic format looks like this:
from multiprocessing import Process, Value, Array

def f(n, a):
    n.value = 3.1415927
    for i in range(len(a)):
        a[i] = -a[i]

if __name__ == '__main__':
    num = Value('d', 0.0)
    arr = Array('i', range(10))
    p = Process(target=f, args=(num, arr))
    p.start()
    p.join()
    print(num.value)
    print(arr[:])
So in the case of the question, the variable action is equivalent to n (a single value) or a (an array), and it can be shared across processes this way.
Also note that you can pass arguments into the process functions with the args keyword: args=(num, arr, ...).
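Applied to the question, a minimal sketch could replace the string flag with a shared integer (1 standing in for "fn2"), since sharing a plain Python string via Value is awkward; the flag encoding here is an assumption, not part of the original code:

from multiprocessing import Process, Value

def func1(action):
    if 1 + 1 == 2:
        action.value = 1        # 1 stands in for "fn2"

def func2(action):
    # poll until func1 flips the shared flag, then run the inner work
    while action.value != 1:
        pass
    print("fn2")

if __name__ == '__main__':
    action = Value('i', 0)      # shared integer, visible to both processes
    p1 = Process(target=func1, args=(action,))
    p2 = Process(target=func2, args=(action,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()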

How to dynamically change arguments to a process in multiprocessing using python

I have code that spawns two processes, and the processes run a function taking two arguments.
I want to check a condition, say every 0.1 seconds, and change the arguments to both processes' target function without having to kill and restart the processes. How should I do this?
def func(arg1, arg2):
    # do something

def main():
    p1 = multiprocessing.Process(target=func, args=(arg1, arg2))
    p2 = multiprocessing.Process(target=func, args=(arg1, arg2))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
The main idea here is not to pass the actual arguments, but a communication channel. For one-way communication (passing new arguments) use a Queue. For bidirectional communication (passing arguments and receiving results) use a Pipe.
For more information see: https://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes
Following your example code, this could look something like this:
import multiprocessing
import time

def func(queue):
    while True:
        arg1, arg2 = queue.get()
        print(arg1, " ", arg2)  # sample usage

# create queues and pass them to your function
q1 = multiprocessing.Queue()
q2 = multiprocessing.Queue()
p1 = multiprocessing.Process(target=func, args=(q1,))
p2 = multiprocessing.Process(target=func, args=(q2,))
p1.start()
p2.start()

# sample arguments
args = [
    ("arg1.1", "arg1.2"),
    ("arg2.1", "arg2.2"),
    ("arg3.1", "arg3.2"),
    ("arg4.1", "arg4.2"),
    ("arg5.1", "arg5.2"),
    ("arg6.1", "arg6.2"),
]

# Here you would likely have your own way to generate new arguments.
for arg1, arg2 in args:
    q1.put((arg1, arg2))
    q2.put((arg1, arg2))
    time.sleep(1)

# Since the processes now run indefinitely, you have to kill them.
# Alternatively, you could send them a stop signal and let them return.
p1.kill()
p2.kill()
Queue.get() is a blocking call by default, meaning it waits until a result is available: https://docs.python.org/3/library/queue.html#queue.Queue.get
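As a sketch of the stop-signal alternative mentioned in the comment above (the None sentinel is an assumption, not part of the original answer), the worker can return cleanly when it sees a sentinel value on the queue, so join() works instead of kill():

import multiprocessing

def func(queue):
    while True:
        item = queue.get()          # blocks until something arrives
        if item is None:            # sentinel: stop cleanly
            return
        arg1, arg2 = item
        print(arg1, " ", arg2)

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=func, args=(q,))
    p.start()
    q.put(("arg1.1", "arg1.2"))
    q.put(("arg2.1", "arg2.2"))
    q.put(None)                     # tell the worker to return
    p.join()                        # returns once the worker has exited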

python update variable in loop and use it in another process

Why is the while loop ignored in work1? I would like to update the value in a loop in work1 and output this value in process work2. I also already tried with a Queue, but the problem is that I have only one variable which I would like to update in work1 and access in work2.
from multiprocessing import Process, Manager, Value
from ctypes import c_char_p
import time

def work1(string):
    i = 2
    string.value = i
    # while True:
    #     print("work1")
    #     string.value = i + 1
    #     time.sleep(2)

def work2(string):
    while True:
        print("Value set in work1 " + str(string.value))
        time.sleep(2)

if __name__ == '__main__':
    manager = Manager()
    string = manager.Value(int, 0)
    p1 = Process(target=work1, args=(string,))
    p1.start()
    p1.join()
    p2 = Process(target=work2, args=(string,))
    p2.start()
    p2.join()
That is because you didn't make your program parallel with two processes; the two processes run one after the other, since p1 is joined before p2 is started. What you need to do is start both processes before joining either of them, like in my modification below:
from multiprocessing import Process, Manager, Value
from ctypes import c_char_p
import time

def work1(string):
    i = 2
    string.value = i
    while True:
        i = i + 1
        string.value = i
        print("work1 set value to " + str(string.value))
        time.sleep(2)

def work2(string):
    while True:
        print("Value set in work1 " + str(string.value))
        time.sleep(2)

if __name__ == '__main__':
    manager = Manager()
    string = manager.Value(int, 0, lock=False)
    p1 = Process(target=work1, args=(string,))
    p2 = Process(target=work2, args=(string,))
    p1.start()
    p2.start()
    p2.join()
    p1.join()
Indeed, if you write the code this way, the joins never return because of the infinite while loops.
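If you do want the joins to return eventually, one option (a sketch, not part of the original answer; the 10-second run time is arbitrary) is to also share a stop flag, for example a multiprocessing.Event, and have both loops check it:

from multiprocessing import Process, Manager, Event
import time

def work1(string, stop):
    i = 2
    while not stop.is_set():
        i += 1
        string.value = i
        print("work1 set value to " + str(string.value))
        time.sleep(2)

def work2(string, stop):
    while not stop.is_set():
        print("Value set in work1 " + str(string.value))
        time.sleep(2)

if __name__ == '__main__':
    manager = Manager()
    string = manager.Value(int, 0)
    stop = Event()
    p1 = Process(target=work1, args=(string, stop))
    p2 = Process(target=work2, args=(string, stop))
    p1.start()
    p2.start()
    time.sleep(10)      # let the workers run for a while
    stop.set()          # signal both loops to finish
    p1.join()
    p2.join()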

What is wrong with this multiprocessing example?

Based on this pretty useful tutorial, I have tried to make a simple implementation of Python multiprocessing to measure its effectiveness. The modules multi1, multi2 and multi3 contain an ODE integration and export the calculated values to a csv (the details do not matter; they are just there so the script does something).
import multiprocessing
import multi1
import multi2
import multi3
import time

t0 = time.time()

if __name__ == '__main__':
    p1 = multiprocessing.Process(target = multi1.main(), args=())
    p2 = multiprocessing.Process(target = multi2.main(), args=())
    p3 = multiprocessing.Process(target = multi3.main(), args=())
    p1.start()
    p2.start()
    p3.start()
    p1.join()
    p2.join()
    p3.join()

t1 = time.time()

multi1.main()
multi2.main()
multi3.main()

t2 = time.time()

print t1-t0
print t2-t1
The problem is that the printed times are equal, so the multiprocessing didn't speed up the process. Why?
You called main in the parent process and passed its return value (probably None) as the target, so no actual work is done in your worker processes. Remove the call parentheses so that you pass the function itself without calling it, e.g.:
p1 = multiprocessing.Process(target=multi1.main, args=())
p2 = multiprocessing.Process(target=multi2.main, args=())
p3 = multiprocessing.Process(target=multi3.main, args=())
This is the same basic problem seen in the threaded case.
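Putting the fix together with the timing code, a sketch (multi1, multi2 and multi3 are the question's own modules) could look like this; the timing and the sequential comparison also belong under the __main__ guard so that child processes do not re-run them on platforms that spawn rather than fork:

import multiprocessing
import time
import multi1, multi2, multi3

if __name__ == '__main__':
    t0 = time.time()
    procs = [multiprocessing.Process(target=m.main)   # pass the function, do not call it
             for m in (multi1, multi2, multi3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    t1 = time.time()

    # sequential run for comparison
    multi1.main()
    multi2.main()
    multi3.main()
    t2 = time.time()

    print(t1 - t0)   # parallel time
    print(t2 - t1)   # sequential time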

Something strange happens with Python multiprocessing

I've just tested Python multiprocessing for reading a file or a global variable, but something strange happens.
For example:
import multiprocessing

a = 0

def test(lock, name):
    global a
    with lock:
        for i in range(10):
            a = a + 1
        print "in process %d : %d" % (name, a)

def main():
    lock = multiprocessing.Lock()
    p1 = multiprocessing.Process(target=test, args=(lock, 1))
    p2 = multiprocessing.Process(target=test, args=(lock, 2))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print "in main process : %d" % a

if __name__=='__main__':
    main()
The program reads a global variable, but the output is:
in process 1 : 10
in process 2 : 10
in main process : 0
It seems that the sub-processes cannot get and edit the global variable properly. Also, if I change the program to read a file, each sub-process reads the file completely, ignoring the lock.
So how does this happen? And how can I solve this problem?
Global variables are not shared between processes. When you create and start a new Process(), that process runs inside a separate "cloned" copy of the current Python interpreter. Updating the variable from within a Process() will only update the variable locally to the particular process it is updated in.
To share data between Python processes, we need a multiprocessing.Pipe(), a multiprocessing.Queue(), a multiprocessing.Value(), a multiprocessing.Array() or one of the other multiprocessing-safe containers.
Here's an example based on your code:
import multiprocessing

def worker(lock, counter, name):
    with lock:
        for i in range(10):
            counter.value += 1
        print "In process {}: {}".format(name, counter.value)

def main():
    lock = multiprocessing.Lock()
    counter = multiprocessing.Value('i', 0)
    p1 = multiprocessing.Process(target=worker, args=(lock, counter, 1))
    p2 = multiprocessing.Process(target=worker, args=(lock, counter, 2))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print "In main process: {}".format(counter.value)

if __name__=='__main__':
    main()
This gives me:
In process 1: 10
In process 2: 20
In main process: 20
Now, if you really want to use a global variable, you can use a multiprocessing.Manager(), but I think the first method is preferable, as this is a "heavier" solution. Here's an example:
import multiprocessing

manager = multiprocessing.Manager()
counter = manager.Value('i', 0)

def worker(lock, name):
    global counter
    with lock:
        for i in range(10):
            counter.value += 1
        print "In process {}: {}".format(name, counter.value)

def main():
    global counter
    lock = multiprocessing.Lock()
    p1 = multiprocessing.Process(target=worker, args=(lock, 1))
    p2 = multiprocessing.Process(target=worker, args=(lock, 2))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print "In main process: {}".format(counter.value)

if __name__=='__main__':
    main()
