How to pass arguments to subprocess python instance - python

I'm opening multiple Python instances using the multiprocessing package and subprocess objects. So basically, 10 different Python instances, each containing two sockets that serve as a client socket and a server socket.
Here is an example of how I launch two Python instances with two different files:
from time import sleep
from multiprocessing import Process
import subprocess

def task1():
    print('This is task1')
    subprocess.Popen(['python', 'server_client_pair1.py'])
    sleep(1)

def task2():
    # block for a moment
    sleep(1)
    # display a message
    print('This is task2')
    p1 = subprocess.Popen(['python', 'server_client_pair2.py'])
    sleep(1)

if __name__ == '__main__':
    # create a process
    process1 = Process(target=task1)
    sleep(.5)
    process2 = Process(target=task2)
    sleep(.5)
    # run the process
    process1.start()
    sleep(.5)
    process2.start()
    sleep(.5)
    # wait for the process to finish
    print('Waiting for the process...')
    process1.join()
    process2.join()
I need to pass an argument that sets the PORT variable (the port number) in the file ('server_client_pair.py'), incrementing it to PORT+1 on each loop iteration.
Right now I have working code that uses 10 different server_client_pair.py files (server_client_pair1.py, server_client_pair2.py, server_client_pair3.py, etc.).
I'm wondering how to do this with just one file. Any help would be welcome.
*edited the post for more info

First you need to add argument handling to your server_client_pair.py file; then something like this will work:
def task1():
    for i in range(10):
        subprocess.Popen(['python', 'server_client_pair.py', str(i)])
        sleep(1)
Check here to learn how to pass arguments to your Python files:
https://www.tutorialspoint.com/python/python_command_line_arguments.htm
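For the receiving side, here is a minimal sketch of what server_client_pair.py could look like (the BASE_PORT value and the sys.argv handling are assumptions for illustration, not the asker's actual code):

# server_client_pair.py -- read the loop index from the command line
import sys

BASE_PORT = 8000  # assumed base port; adjust to your setup
offset = int(sys.argv[1]) if len(sys.argv) > 1 else 0
PORT = BASE_PORT + offset  # each instance gets PORT, PORT+1, PORT+2, ...
print('Using port', PORT)
# ... create the client/server sockets bound to PORT here ...

For more structured parsing, the standard library's argparse module can replace the raw sys.argv access.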

Related

Trouble with starting a process out of another process

If I understand it correctly, Python doesn't allow a process to be started from within another process?! For example:
def function1():
    while True:
        wait_for_condition  # pseudocode
        # then....
        process2.start()

def function2():
    # does something
    ...

process2.join()
process1 = multiprocessing.Process(target=function1)
process2 = multiprocessing.Process(target=function2)
process1.start()
In my test, Python refused to start a process from within another process.
Is there a solution, or another way to solve this?
If not, I'd have another way to go, but it would require modifying the electronics (connect one output to one input and use this to let a process wait for an event and then start). I don't think that's a clean way, though; it's more of a workaround, and there would be a small risk of causing a short circuit if the input and output are not set correctly.
Edit:
The Task:
Three processes run in parallel. Each waits for an input from one attached sensor.
If one of these processes sees an input change, it should reset a counter (LED_counter) and start another process (LED_process) if not already started. After that, the process waits for an input change again.
Beside that...
The LED_process activates one output and counts down the LED_counter. When LED_counter reaches zero, the process terminates. If the code starts again, it must be able to restart from the top.
Edit 2:
Latest try with threading (the German words in the strings are translated in comments). If I run this code, the different threads mix together in some strange way, but so far I can't find a mistake. The same code with multiprocessing works fine:
import RPi.GPIO as GPIO
import time
import threading
import sys

LED_time = 10  # LEDs' active time

# Sensor inputs
SGT = 25
SGA = 23
SHT = 12

GPIO.setmode(GPIO.BCM)
GPIO.setup(SGT, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(SGA, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(SHT, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def Sens_check(Sensor, Name):
    print("Thread_{} aktiv".format(Name))  # "aktiv" = active
    while True:
        GPIO.wait_for_edge(Sensor, GPIO.FALLING)
        # LcGT.value = LED_time
        print("{} Offen".format(Name))  # "Offen" = open
        time.sleep(0.1)
        # note: this waits on SGT regardless of which Sensor was passed in
        GPIO.wait_for_edge(SGT, GPIO.RISING)
        print("{} Geschlossen".format(Name))  # "Geschlossen" = closed
        time.sleep(0.1)

SensGT_Thread = threading.Thread(target=Sens_check, args=(SGT, "Gartentor"))  # garden gate
SensGA_Thread = threading.Thread(target=Sens_check, args=(SGA, "Garage"))     # garage
SensHT_Thread = threading.Thread(target=Sens_check, args=(SHT, "Haustuere"))  # front door

try:
    SensGT_Thread.start()
    time.sleep(0.1)
    SensGA_Thread.start()
    time.sleep(0.1)
    SensHT_Thread.start()
    SensGT_Thread.join()
    SensGA_Thread.join()
    SensHT_Thread.join()
except:
    print("FAILURE")
finally:
    sys.exit(1)
Processes can only be started from within the process they were created in. In the code provided, process2 was created in the main process but then started from within another one (process1). Also, processes cannot be restarted, so a new Process object should be created each time .start is used.
Here's an example of starting processes within a process:
import multiprocessing
import time

def function1():
    print("Starting more processes")
    sub_procs = [multiprocessing.Process(target=function2) for _ in range(5)]
    for proc in sub_procs:
        proc.start()
    for proc in sub_procs:
        proc.join()
    print("Done with more processes")

def function2():
    print("Doing work")
    time.sleep(1)  # work
    print("Done with work")

print("Starting one subprocess")
process1 = multiprocessing.Process(target=function1)
process1.start()
print("Moving on without joining")

"""Output of this:
Starting one subprocess
Moving on without joining
Starting more processes
Doing work
Doing work
Doing work
Doing work
Doing work
Done with work
Done with work
Done with work
Done with work
Done with work
Done with more processes
"""

Python subprocess doesn't work without sleep

I'm working on a Python launcher that should execute a few programs from a list by calling subprocess. The code looks correct, but it behaves very strangely.
In short, it doesn't work without a sleep or input command in main.
Here is the example:
import threading
import subprocess
import time

def executeFile(file_path):
    subprocess.call(file_path, shell=True)

def main():
    file = None
    try:
        file = open('./config.ini', 'r')
    except:
        # TODO: add alert widget
        print("cant find a file")
    pathes = [path.strip() for path in file.readlines()]
    try:
        for idx in range(len(pathes)):
            print(pathes[idx])
            file_path = pathes[idx]
            newThread = threading.Thread(target=executeFile, args=(file_path,))
            newThread.daemon = True
            newThread.start()
    except:
        print("cant start thread")

if __name__ == '__main__':
    main()
    # IT WORKS WHEN SLEEP EXISTS
    time.sleep(10)
    # OR
    # input("Press enter to exit ;)")
but without input or sleep it doesn't work:
if __name__ == '__main__':
    # Doesn't work
    main()
Could someone please explain why this happens?
I have an idea but I'm not sure: maybe it's because subprocess is asynchronous, and the program executes and closes itself BEFORE the subprocesses run.
With sleep or input, the program suspends and the subprocesses have enough time to execute.
Thanks for any help!
As soon as the last thread is started, your main() returns. That in turn will exit your Python program. That stops all your threads.
From the documentation on daemon threads:
Note: Daemon threads are abruptly stopped at shutdown. Their resources (such as open files, database transactions, etc.) may not be released properly. If you want your threads to stop gracefully, make them non-daemonic and use a suitable signalling mechanism such as an Event.
The simple fix would be to not use daemon threads.
As an aside, I would suggest some changes to your loop. First, iterate over pathes directly instead of using indices. Second, catch errors for each thread separately, so one error doesn't leave the remaining files unprocessed.
for path in pathes:
    try:
        print(path)
        newThread = threading.Thread(target=executeFile, args=(path,))
        newThread.start()
    except:
        print("cant start thread for", path)
Another option would be to skip threads entirely, and just maintain a list of running subprocesses:
import os
import subprocess
import time

def manageprocs(proclist):
    """Check a list of subprocesses for processes that have
    ended and remove them from the list.

    :param proclist: list of Popen objects
    """
    # iterate over a copy; removing items from a list
    # while iterating over it skips elements.
    for pr in proclist[:]:
        if pr.poll() is not None:
            proclist.remove(pr)
    # since manageprocs is called from a loop,
    # keep CPU usage down.
    time.sleep(0.5)

def main():
    # Read config file
    try:
        with open('./config.ini', 'r') as f:
            pathes = [path.strip() for path in f.readlines()]
    except FileNotFoundError:
        print("cant find config file")
        exit(1)
    # List of subprocesses
    procs = []
    # Do not launch more processes concurrently than your
    # CPU has cores. That will only lead to the processes
    # fighting over CPU resources.
    maxprocs = os.cpu_count()
    # Launch all subprocesses.
    for path in pathes:
        while len(procs) == maxprocs:
            manageprocs(procs)
        procs.append(subprocess.Popen(path, shell=True))
    # Wait for all subprocesses to finish.
    while len(procs) > 0:
        manageprocs(procs)

if __name__ == '__main__':
    main()
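The same launch-at-most-N-at-a-time pattern is available with less bookkeeping via the standard library's concurrent.futures; a sketch, assuming the same config.ini format:

import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run(path):
    # each worker thread just blocks on one subprocess
    return subprocess.call(path, shell=True)

with open('./config.ini', 'r') as f:
    pathes = [line.strip() for line in f]

with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    exit_codes = list(pool.map(run, pathes))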

Multiprocessing callback message

I have a long-running process, and I want to keep track of which state it is currently in. There are N processes running at the same time, hence the multiprocessing issue.
I pass a Queue into the process to report state messages, and this Queue is then read (if not empty) in a thread every couple of seconds.
I'm using Spyder on Windows as my environment, and the behavior described below is in its console. I did not try it in a different environment.
from multiprocessing import Process, Queue, Lock
import time
from tqdm import tqdm  # needed for tqdm.write below

def test(process_msg: Queue):
    try:
        process_msg.put('Inside process message')
        # process...
        return  # to have exitstate = 0
    except Exception as e:
        process_msg.put(e)

callback_msg = Queue()

if __name__ == '__main__':
    p = Process(target=test, args=(callback_msg,))
    p.start()
    time.sleep(5)
    print(p)
    while not callback_msg.empty():
        msg = callback_msg.get()
        if type(msg) != Exception:
            tqdm.write(str(msg))
        else:
            raise msg
The problem is that whatever I do with the code, it never reads what is inside the Queue (also because nothing is ever put into it). It only works when I switch to the dummy version, which runs similarly to threading on only one CPU: from multiprocessing.dummy import Process, Queue, Lock
Apparently the test function has to be in a separate file. (On Windows, multiprocessing starts children with the 'spawn' method, so the child must be able to import the module that defines the target function; code defined in an interactive console such as Spyder's is not importable that way.)
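A minimal sketch of the split (the file names worker.py and main.py are assumptions):

# worker.py -- the target function lives in its own importable module
from multiprocessing import Queue

def test(process_msg: Queue):
    process_msg.put('Inside process message')

# main.py -- imports the target from the module above
from multiprocessing import Process, Queue
import time
from worker import test

if __name__ == '__main__':
    callback_msg = Queue()
    p = Process(target=test, args=(callback_msg,))
    p.start()
    time.sleep(5)
    while not callback_msg.empty():
        print(callback_msg.get())
    p.join()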

How to modify a variable in one thread and check it in another?

Below is code that demonstrates the problem. Please note that this is only an example; I am using the same logic in a more complicated application, where I can't use sleep because the amount of time it takes process1 to modify the variable depends on the speed of the internet connection.
from multiprocessing import Process

code = False

def func():
    global code
    code = True

pro = Process(target=func)
pro.start()
while code == False:
    pass
pro.terminate()
pro.join()
print('Done!')
On running this, nothing appears on the screen. When I terminate the program by pressing CTRL-C, the stack trace shows that the while loop was still being executed.
Python has a few concurrency libraries: threading, multiprocessing and asyncio (and more).
multiprocessing is a library that uses subprocesses to get around the GIL, which keeps a single Python process from running CPU-intensive tasks in parallel. Each child process has its own memory, so an ordinary global like code is simply copied, not shared. To share state between different multiprocessing.Processes, create it via a multiprocessing.Manager() instance. For example:
import multiprocessing
import time

def func(event):
    print("> func()")
    time.sleep(1)
    print("setting event")
    event.set()
    time.sleep(1)
    print("< func()")

def main():
    print("In main()")
    manager = multiprocessing.Manager()
    event = manager.Event()
    p = multiprocessing.Process(target=func, args=(event,))
    p.start()
    while not event.is_set():
        print("waiting...")
        time.sleep(0.2)
    print("OK! joining func()...")
    p.join()
    print('Done!')

if __name__ == "__main__":
    main()
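If all you need is a shared boolean flag rather than an Event, multiprocessing.Value also works; a minimal sketch (the one-second sleep stands in for the real work):

import multiprocessing
import time

def func(code):
    time.sleep(1)  # stand-in for the real work
    code.value = True

if __name__ == "__main__":
    code = multiprocessing.Value('b', False)  # shared boolean flag
    p = multiprocessing.Process(target=func, args=(code,))
    p.start()
    while not code.value:
        time.sleep(0.1)  # poll gently instead of busy-waiting
    p.join()
    print('Done!')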

Killing a process launched from a process that has ended - Python

I am trying to kill a process in Python that is launched from another process, and I am unable to find the correct place to put my ".terminate()".
To explain myself better I will post some example code:
from multiprocessing import Process
import time

def function():
    print "Here is where I am creating the function I need to kill"
    ProcessToKill = Process(target=killMe)
    ProcessToKill.start()

def killMe():
    while True:
        print "kill me"
        time.sleep(0.5)

if __name__ == '__main__':
    Process1 = Process(target=function)
    Process1.start()
My question is, where can I place ProcessToKill.terminate(), ideally without having to change the overall structure of the code?
You can hold onto the ProcessToKill object so that you can kill it later:
from multiprocessing import Process
import time

def function():
    print "Here is where I am creating the function I need to kill"
    ProcessToKill = Process(target=killMe)
    ProcessToKill.start()
    return ProcessToKill

def killMe():
    while True:
        print "kill me"
        time.sleep(0.5)

if __name__ == '__main__':
    Process1 = function()
    time.sleep(5)
    Process1.terminate()
Here, I've removed the wrapping of function in another Process object, because for this example it seems redundant; but you should be able to do the same thing with a Process that runs another Process.
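If function really must run in its own Process, one option is to hand the grandchild's PID back over a Queue and signal it from the main process. A sketch in Python 3, assuming a POSIX system (os.kill semantics differ on Windows):

import os
import signal
import time
from multiprocessing import Process, Queue

def killMe():
    while True:
        print("kill me")
        time.sleep(0.5)

def function(q):
    child = Process(target=killMe)
    child.start()
    q.put(child.pid)  # hand the PID back to the parent
    child.join()

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=function, args=(q,))
    p1.start()
    pid = q.get()
    time.sleep(5)
    os.kill(pid, signal.SIGTERM)  # terminate the grandchild
    p1.join()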
