Python multiprocess

I'm getting "EOFError: EOF when reading a line" when I try to take input:

def one():
    xyz = input("enter : ")
    print(xyz)
    time.sleep(1)

if __name__=='__main__':
    from multiprocessing import Process
    import time
    p1 = Process(target = one)
    p1.start()

The main process owns standard input; the forked child process doesn't. One thing that would work is multiprocessing.dummy, which creates threads instead of subprocesses:
def one():
    xyz = input("enter: ")
    print(xyz)
    time.sleep(1)

if __name__=='__main__':
    from multiprocessing.dummy import Process
    import time
    p1 = Process(target = one)
    p1.start()
Since threads live in the same process, they also share its standard input. For real multiprocessing, I suggest collecting interactive input in the main process and passing it to the worker as an argument, as in the sketch below.
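For example, a minimal sketch of that approach (reading in the parent, which owns stdin, and handing the string to the child):

from multiprocessing import Process
import time

def one(xyz):
    # the child only prints what the parent already read
    print(xyz)
    time.sleep(1)

if __name__ == '__main__':
    xyz = input("enter : ")  # read in the main process
    p1 = Process(target=one, args=(xyz,))
    p1.start()
    p1.join()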

Related

How can I check that the Process class from Python Multiprocessing has worked?

I've written the following code, which runs a function performing a stochastic simulation of a series of chemical reactions:
from multiprocessing import Process
from datetime import datetime
import os

v = range(1, 51)

def parallelfunc(*v):
    gillespie_tau_leaping(start_state, LHS, stoch_rate, state_change_array)

def info(title):
    print(title)
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

if __name__ == '__main__':
    info('main line')
    start = datetime.utcnow()
    p = Process(target=parallelfunc, args=(v))
    p.start()
    p.join()
    end = datetime.utcnow()
    sim_time = end - start
    print(f"Simulation utc time:\n{sim_time}")
I'm using the Process class from the multiprocessing library and am trying to run gillespie_tau_leaping 50 times. But I'm not sure it's working: gillespie_tau_leaping prints a number of values to the terminal, but they are only printed once; I'd expect them to be printed 50 times. The getpid calls above return the following:
main line
module name: __main__
parent process: 6188
process id: 27920
How can I tell whether my code has worked, and how can I get it to print the values from gillespie_tau_leaping 50 times to the terminal?
Cheers
Your code is running just one process: the call to Process spawns a single new process, and you do it only once (not in a loop). I would suggest using a multiprocessing Pool. Your code could look something like this:
from multiprocessing import Pool

def parallelfunc(*args):
    do_something()

def main():
    # a list of argument lists, one inner list per invocation
    func_args = [['arg1call1', 'arg2call1', 'arg3call1'],
                 ['arg1call2', 'arg2call2', 'arg3call2']]
    with Pool() as p:
        # starmap unpacks each inner list into the function's arguments
        results = p.starmap(parallelfunc, func_args)
    # do something with results, which is a list of return values
By default a multiprocessing Pool creates as many worker processes as you have CPU cores and manages the pool until the end of processing, taking care of all the inter-process communication.
This is really handy, because synchronizing processes by hand can be hard.
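Applied to your case, a rough sketch (assuming gillespie_tau_leaping and its arguments are defined as in your script) could be:

from multiprocessing import Pool

def run_once(run_id):
    # each call performs one independent simulation
    return gillespie_tau_leaping(start_state, LHS, stoch_rate, state_change_array)

if __name__ == '__main__':
    with Pool() as p:
        # run the simulation 50 times across the worker processes
        results = p.map(run_once, range(1, 51))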
Hope this helps

How do I share data between processes in Python?

Here is a simple example:

from collections import deque
from multiprocessing import Process

global_dequeue = deque([])

def push():
    global_dequeue.append('message')

p = Process(target=push)
p.start()

def pull():
    print(global_dequeue)

pull()
The output is deque([]). If I were to call the push function directly, not as a separate process, the output would be deque(['message']).
How can I get the message into the deque while still running push in a separate process?
You can share data by using a multiprocessing Queue object, which is designed to pass data between processes:

from multiprocessing import Process, Queue
import time

def push(q):  # the Queue is passed to the function as an argument
    for i in range(10):
        q.put(str(i))  # put an element in the Queue
        time.sleep(0.2)
    q.put("STOP")  # poison pill: tells the master to stop taking elements

if __name__ == "__main__":
    q = Queue()  # create the Queue instance
    p = Process(target=push, args=(q,))  # create the Process
    p.start()  # start it
    while True:
        x = q.get()
        if x == "STOP":
            break
        print(x)
    p.join()  # wait for the worker, then let the master continue
    print("Finish")

Let me know if this helped; feel free to ask questions.
You can also use Managers to achieve this:
Python 2: https://docs.python.org/2/library/multiprocessing.html#managers
Python 3: https://docs.python.org/3.8/library/multiprocessing.html#managers
Example of usage: https://pymotw.com/2/multiprocessing/communication.html#managing-shared-state
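A minimal sketch with a Manager (Python 3; the managed list proxies mutations back to the manager process):

from multiprocessing import Process, Manager

def push(shared):
    shared.append('message')  # the mutation goes through the manager proxy

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.list()
        p = Process(target=push, args=(shared,))
        p.start()
        p.join()
        print(list(shared))  # -> ['message']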

Multiprocessing messaging queue between functions or processes in Python

I'm trying to understand how processes message one another; below is my example.
I use the second function to do my main job, and the queue occasionally feeds the first function so it can do its own work, no matter when it finishes. I have looked at many examples and tried different ways, without success. Can anyone explain how to do this with my example?
from multiprocessing import Process, Queue, Manager
import time

def first(a, b):
    q.get()
    print a + b
    time.sleep(3)

def second():
    for i in xrange(10):
        print "seconf func"
        k += 1
        q.put = (i, k)

if __name__ == "__main__":
    processes = []
    q = Queue()
    manager = Manager()
    p = Process(target=first, args=(a, b))
    p.start()
    processes.append(p)
    p2 = Process(target=second)
    p2.start()
    processes.append(p2)
    try:
        for process in processes:
            process.join()
    except KeyboardInterrupt:
        print "Interupt"

Killing a process launched from a process that has ended - Python

I am trying to kill a process in Python that is launched from another process, and I am unable to find the correct place to put my ".terminate()".
To explain myself better, I will post some example code:
from multiprocessing import Process
import time

def function():
    print "Here is where I am creating the function I need to kill"
    ProcessToKill = Process(target = killMe)
    ProcessToKill.start()

def killMe():
    while True:
        print "kill me"
        time.sleep(0.5)

if __name__ == '__main__':
    Process1 = Process(target = function)
    Process1.start()
My question is, where can I place ProcessToKill.terminate(), ideally without having to change the overall structure of the code?
You can hold onto the ProcessToKill object so that you can kill it later:
from multiprocessing import Process
import time

def function():
    print "Here is where I am creating the function I need to kill"
    ProcessToKill = Process(target = killMe)
    ProcessToKill.start()
    return ProcessToKill

def killMe():
    while True:
        print "kill me"
        time.sleep(0.5)

if __name__ == '__main__':
    Process1 = function()
    time.sleep(5)
    Process1.terminate()
Here, I've removed your wrapping of function in another Process object, because for the example it seems redundant, but you should be able to do the same thing with a Process that runs another Process; one way is sketched below.
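For instance, a hedged sketch (Python 3 syntax, POSIX signals assumed) that keeps the nested structure: the inner process's PID is sent back to the main process over a Queue, and the main process signals it from there, which is roughly what terminate() does:

from multiprocessing import Process, Queue
import os
import signal
import time

def killMe():
    while True:
        print("kill me")
        time.sleep(0.5)

def function(q):
    ProcessToKill = Process(target=killMe)
    ProcessToKill.start()
    q.put(ProcessToKill.pid)  # hand the inner process's PID to the main process
    ProcessToKill.join()

if __name__ == '__main__':
    q = Queue()
    Process1 = Process(target=function, args=(q,))
    Process1.start()
    pid = q.get()  # PID of the inner process
    time.sleep(5)
    os.kill(pid, signal.SIGTERM)  # terminate the inner process
    Process1.join()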

Is there any way to pass 'stdin' as an argument to another process in python?

I'm trying to create a script which uses the multiprocessing module in Python. The script (let's call it myscript.py) gets its input from another script through a pipe.
Assume that I call the scripts like this:
$ python writer.py | python myscript.py
And here is the code:

# writer.py
import time, sys

def main():
    while True:
        print "test"
        sys.stdout.flush()
        time.sleep(1)

main()

# myscript.py
def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=())
    p1.start()
This clearly doesn't work, since the sys.stdin objects are different for the main process and p1. So I tried this to solve it:
# myscript.py
def get_input(temp):
    while True:
        text = temp.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=(sys.stdin,))
    p1.start()
but I come across this error:

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "in.py", line 12, in get_input
    text = temp.readline()
ValueError: I/O operation on closed file
So I guess the main process's stdin file was closed and I can't read from it. At this point, how can I pass main's stdin file to another process? If passing stdin is not possible, how can I use main's stdin from another process?
Update:
Okay, I need to clarify my question, since people think using multiprocessing is not really necessary. Consider myscript.py like this:
# myscript.py
def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

def do_more_things():
    while True:
        # some code here
        time.sleep(60 * 5)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=())
    p1.start()
    do_more_things()
So I really need to run the get_input() function in parallel with the main function (or other subprocesses).
Sorry for the confusion; I guess I couldn't make my question clear. I would appreciate it if you could tell me whether I can use the main process's stdin object in another process.
Thanks in advance.
The simplest thing is to swap get_input() and do_more_things(), i.e., read sys.stdin in the parent process:

import multiprocessing as mp
import sys

def get_input(stdin):
    for line in iter(stdin.readline, ''):
        print("hello", line, end='')
    stdin.close()

if __name__ == '__main__':
    p1 = mp.Process(target=do_more_things)
    p1.start()
    get_input(sys.stdin)
The next best thing is to use a Thread() instead of a Process() for get_input():

from threading import Thread
import sys

if __name__ == '__main__':
    t = Thread(target=get_input, args=(sys.stdin,))
    t.start()
    do_more_things()
If the above doesn't help, you could try os.dup():

import os
import sys
from multiprocessing import Process

newstdin = os.fdopen(os.dup(sys.stdin.fileno()))
try:
    p = Process(target=get_input, args=(newstdin,))
    p.start()
finally:
    newstdin.close()  # close our copy in the parent
do_more_things()
Each new process created with the multiprocessing module gets its own PID and its own standard streams; even when several processes write to the same terminal, each writes independently, hence the need for locks around shared output.
You're already creating two processes by separating the content into two scripts, and you create a third with get_input(). get_input could read standard input if it were a thread instead of a process, and then there would be no need for a sleep in the reader:
## reader.py
from threading import Thread
import sys

def get_input():
    text = sys.stdin.readline()
    while len(text) != 0:
        print 'hello ' + text
        text = sys.stdin.readline()

if __name__ == '__main__':
    thread = Thread(target=get_input)
    thread.start()
    thread.join()
This will only be a partial answer, as I'm unclear about subsequent parts of the question.
You start by saying that you anticipate calling your scripts:
$ python writer.py | python myscript.py
If you're going to do that, writer needs to write to standard output and myscript needs to read from standard input. The second script would look like this:
def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    get_input()
There's no need for the multiprocessing.Process object: you're already firing off two processes from the command line, and you're using the shell to connect them with an (anonymous) pipe (the "|" character) that connects standard output of the first script to standard input of the second.
The point of the Process object is to manage the launch of a second process from the first. You'd need to define a process, then start it; then you'd probably want to wait until it has terminated before exiting the first process (calling p1.join() after p1.start() suffices for this).
If you want to communicate between a pair of processes under Python's control, you'll probably want to use a multiprocessing.Pipe object, as sketched below. You can then easily communicate between the initial process and its spawned subordinate by reading and writing to/from the Pipe object rather than standard input and standard output. If you really want to redirect standard input and standard output, this is probably possible by messing with low-level file descriptors and/or by overriding/replacing the sys.stdin and sys.stdout objects, but I suspect you probably don't want (or need) to do this.
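For illustration, a minimal sketch of that Pipe-based approach (hypothetical names, not the asker's script; a None sentinel marks the end of input):

from multiprocessing import Process, Pipe

def worker(conn):
    # receive lines from the parent until the sentinel arrives
    for line in iter(conn.recv, None):
        print("hello", line)
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=worker, args=(child_conn,))
    p.start()
    for line in ("test", "test"):
        parent_conn.send(line)
    parent_conn.send(None)  # sentinel: no more input
    parent_conn.close()
    p.join()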
To read the piped-in input, use fileinput:

# myscript.py
import fileinput

if __name__ == '__main__':
    for line in fileinput.input():
        # do stuff here
        process_line(line)
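With no filename arguments, fileinput.input() falls back to reading standard input, so this works unchanged when invoked as python writer.py | python myscript.py.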
