Running multiple scripts from a single script in Python

I have two scripts, Server.py and ServerGUI.py. I want them to run independently and in parallel. Say I make another script, main.py. How can I run Server.py and ServerGUI.py from main.py?
Can you suggest the code for main.py?

To run two or more scripts from within a Python script, you can use the subprocess package together with nohup. This runs each script in the background, allowing you to run them in parallel from the same source script. As an option, this example also saves each script's standard output to a different file.
import os
from subprocess import call
from subprocess import Popen

# subprocess.call(['python', 'exampleScripts.py', somescript_arg1, somescript_val1, ...])
Popen(['nohup', 'python', 'exampleScripts.py'],
      stdout=open('null1', 'w'),
      stderr=open('logfile.log', 'a'),
      start_new_session=True)
Popen(['nohup', 'python', 'exampleScripts.py'],
      stdout=open('null2', 'w'),
      stderr=open('logfile.log', 'a'),
      start_new_session=True)
Popen(['nohup', 'python', 'exampleScripts.py'],
      stdout=open('null3', 'w'),
      stderr=open('logfile.log', 'a'),
      start_new_session=True)
Output: the start and end times of the scripts overlap, showing that the second one started before the first one ended.
(ds_tensorflow) C:\DataScience\SampleNotebooks\Threading>python RunScripts.py
(ds_tensorflow) C:\DataScience\SampleNotebooks\Threading>cat null*
2020-07-13 15:46:21.251606
List processing complete.
2020-07-13 15:46:29.130219
2020-07-13 15:46:22.501599
List processing complete.
2020-07-13 15:46:31.227954
2020-07-13 15:46:23.758498
List processing complete.
2020-07-13 15:46:32.431079
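Applied to the two scripts in the question, a minimal main.py along the same lines might look like this (a sketch only; the plain python interpreter name and the file locations are assumptions, adjust as needed):
from subprocess import Popen

# launch both scripts; Popen returns immediately, so they run in parallel
processes = [
    Popen(['python', 'Server.py']),
    Popen(['python', 'ServerGUI.py']),
]

# optionally block main.py until both scripts have exited
for p in processes:
    p.wait()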
You can also use the same idea with functions, if you want to keep the code in one place. This example runs two different functions two times each, in parallel.
Function example:
...
import threading
...

def some_function(size, i, out_list):
    # code
    pass

def other_function(size, i, out_list):
    # code
    pass

if __name__ == "__main__":
    jobs = []
    # same functions run multiple times
    # ('size' is assumed to be defined in the elided code above)
    threads = 2
    for i in range(0, threads):
        out_list = list()
        # pass the callable and its arguments separately, otherwise the
        # function runs immediately instead of in the new thread
        thread1 = threading.Thread(target=some_function, args=(size, i, out_list))
        jobs.append(thread1)
        thread2 = threading.Thread(target=other_function, args=(size, i, out_list))
        jobs.append(thread2)

    # Start the threads (i.e. calculate the random number lists)
    for j in jobs:
        j.start()

    # Ensure all of the threads have finished
    for j in jobs:
        j.join()

    # continue processing

You can use threading or multiprocessing for running the Python scripts in parallel.
Threading : https://www.tutorialspoint.com/python/python_multithreading.htm
Multiprocessing : https://www.tutorialspoint.com/multiprocessing-in-python#:~:text=The%20multiprocessing%20package%20supports%20spawning,is%20similar%20to%20threading%20module.
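For example, a minimal multiprocessing sketch along those lines (the script names are taken from the question; the plain python interpreter call is an assumption):
import subprocess
from multiprocessing import Process

def run_script(path):
    # each worker process launches one script and waits for it to finish
    subprocess.call(['python', path])

if __name__ == '__main__':
    processes = [Process(target=run_script, args=(s,))
                 for s in ['Server.py', 'ServerGUI.py']]
    for p in processes:
        p.start()
    for p in processes:
        p.join()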
Hope this helps you

You could do something along these lines (for the sake of example, I have assumed your scripts accept two arguments, arg1 and arg2. This needs to be modified according to your specific needs.):
If you have "main" functions in server.py and servergui.py:
import threading

from server import server_main
from servergui import server_gui_main

thread_list = []
thread_list.append(
    threading.Thread(target=server_main, args=(arg1, arg2))
)
thread_list.append(
    threading.Thread(target=server_gui_main, args=(arg1, arg2))
)

for thread in thread_list:
    thread.start()

for thread in thread_list:
    thread.join()
The above will start the two main functions in separate threads.
If you want separate parallel processes, use multiprocessing:
import multiprocessing

process_list = []
process_list.append(
    multiprocessing.Process(target=server_main, args=(arg1, arg2))
)
process_list.append(
    multiprocessing.Process(target=server_gui_main, args=(arg1, arg2))
)

for process in process_list:
    process.start()

for process in process_list:
    process.join()
As you see, the differences in the API of multiprocessing and threading are small. However, your performance may suffer from threading if you are performing CPU-bound tasks. This is because the Python GIL forces Python to run only one thread at any given moment. Thus, if you have CPU-intensive tasks you should use multiprocessing as this creates separate processes which do indeed run in parallel.
If you want to start server.py and servergui.py as if you start them from the command line:
import subprocess

# subprocess.run blocks until the command completes, so these two
# scripts run one after the other, not in parallel
subprocess.run(['python', 'server.py', 'arg1', 'arg2'])
subprocess.run(['python', 'servergui.py', 'arg1', 'arg2'])
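If the two scripts have to run at the same time rather than one after the other, a Popen-based variant of the same command-line approach (a sketch, with the same assumed arguments) would be:
import subprocess

procs = [
    subprocess.Popen(['python', 'server.py', 'arg1', 'arg2']),
    subprocess.Popen(['python', 'servergui.py', 'arg1', 'arg2']),
]

# Popen returns immediately; wait() blocks until each script has finished
for p in procs:
    p.wait()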

Related

Run multiple processes in parallel, continue each after each finishes

I'm on a Windows environment, and suppose I have two toy programs, called 2.bat and 5.bat, which look like timeout 2 and timeout 5, respectively.
I want to set up a script that runs both 2.bat and 5.bat in parallel, and when 2.bat finishes it is run again, and likewise for 5.bat. I'm pretty bad at Python, but after a bit of searching I see that I can do:
from subprocess import Popen

commands = ["2.bat", "5.bat"]
while True:
    procs = [Popen(i) for i in commands]
    for p in procs:
        p.wait()
This doesn't do what I want: it waits for both processes to finish, and then again executes both. What I want to do (in pseudocode) is as follows:
while True:
    in parallel, run 2.bat and 5.bat
    when 2.bat finishes, rerun 2.bat again
    when 5.bat finishes, rerun 5.bat again
Can I achieve this with subprocess, or do I need other libraries?
my solution would be:
from _thread import start_new_thread
from subprocess import Popen

commands = ["2.bat", "5.bat"]

def run_bat(file):
    # rerun the batch file every time it finishes
    while True:
        p = Popen(file)
        p.wait()

for command in commands:
    start_new_thread(run_bat, (command, ))

# keep the main thread alive so the worker threads keep running
while True:
    pass
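A variation on the same idea, sketched with the higher-level threading module so the main thread can simply join the workers instead of spinning in an empty loop:
import threading
from subprocess import Popen

commands = ["2.bat", "5.bat"]

def run_bat(file):
    # restart the given batch file every time it finishes
    while True:
        Popen(file).wait()

threads = [threading.Thread(target=run_bat, args=(c,)) for c in commands]
for t in threads:
    t.start()
for t in threads:
    t.join()  # blocks here forever, since run_bat never returns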

not able to terminate the process in multiprocessing python (linux)

I am new to Python and to multiprocessing. I am starting one process and calling a shell script from it. After terminating this process, the shell script keeps running in the background; how do I kill it? Please help.
Python script (test.py)
#!/usr/bin/python
import time
import os
import sys
import multiprocessing

# test process
def test_py_process():
    os.system("./test.sh")
    return

p = multiprocessing.Process(target=test_py_process)
p.start()
print 'STARTED:', p, p.is_alive()
time.sleep(10)
p.terminate()
print 'TERMINATED:', p, p.is_alive()
shell script (test.sh)
#!/bin/bash
for i in {1..100}
do
    sleep 1
    echo "Welcome $i times"
done
The reason is that the child process spawned by the os.system call spawns a child process itself. As explained in the multiprocessing docs, descendant processes of the process will not be terminated; they will simply become orphaned. So p.terminate() kills the process you created, but the OS process (/bin/bash ./test.sh) simply gets reparented and continues executing.
You could use subprocess.Popen instead:
import time
from subprocess import Popen

if __name__ == '__main__':
    p = Popen("./test.sh")
    print 'STARTED:', p, p.poll()
    time.sleep(10)
    p.kill()
    print 'TERMINATED:', p, p.poll()
Edit: @Florian Brucker beat me to it. He deserves the credit for answering the question first. Still keeping this answer for the alternate approach using subprocess, which is recommended over os.system() in the documentation for os.system() itself.
os.system runs the given command in a separate process. Therefore, you have three processes:
The main process in which your script runs
The process in which test_py_process runs
The process in which the bash script runs
Process 2 is a child process of process 1, and process 3 is a child of process 2.
When you call Process.terminate from within process 1, this sends the SIGTERM signal to process 2. That process will then terminate. However, the SIGTERM signal is not automatically propagated to the child processes of process 2! This means that process 3 is not notified when process 2 exits and hence keeps on running as a child of the init process.
The best way to terminate process 3 depends on your actual problem setting, see this SO thread for some suggestions.
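For example, one common approach on Linux (a sketch, not taken from the linked thread) is to start the shell script in its own process group and then signal the whole group, so the bash child is terminated together with the process that launched it:
import os
import signal
import subprocess
import time

# start_new_session=True puts test.sh (and its children) into a new process group
p = subprocess.Popen("./test.sh", start_new_session=True)
time.sleep(10)
# send SIGTERM to every process in that group, not just the immediate child
os.killpg(os.getpgid(p.pid), signal.SIGTERM)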

Simultaneously Call a Script from Python, passing data to each, and Returning data to the original

I am trying to create a Python script that calls 5 other Python scripts to run simultaneously while passing in an array; each of the 5 scripts then performs an operation on that array and returns a different array to the initial script.
The initial script then notices when all 5 have returned values and performs operations on these 5 arrays.
I think the solution is something like
os.system("./script1.py arg1"), os.system("./script2.py arg2")
but I'm unsure of how to proceed.
You can use a thread pool to run all of the commands in parallel. I also changed over to the subprocess module which grabs program outputs:
import multiprocessing.pool
import subprocess as subp

def worker(script):
    proc = subp.Popen(script, shell=True, stdout=subp.PIPE, stderr=subp.PIPE)
    out, err = proc.communicate()
    return script, out, err, proc.returncode

scripts = ['./script1.py arg1', './script2.py arg2']
pool = multiprocessing.pool.ThreadPool(len(scripts))
for script, out, err, returncode in pool.map(worker, scripts):
    # do your magic with each script's output here
    pass
pool.close()
pool.join()
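On Python 3.7+ the same pattern can also be written with concurrent.futures and subprocess.run (a sketch; the script names are the same placeholders as above):
import subprocess
from concurrent.futures import ThreadPoolExecutor

def worker(script):
    # run one script, capturing its stdout and stderr
    proc = subprocess.run(script, shell=True, capture_output=True)
    return script, proc.stdout, proc.stderr, proc.returncode

scripts = ['./script1.py arg1', './script2.py arg2']
with ThreadPoolExecutor(max_workers=len(scripts)) as pool:
    for script, out, err, returncode in pool.map(worker, scripts):
        print(script, returncode)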

How do I run multiple subprocesses in parallel and wait for them to finish in Python

I am trying to migrate a bash script to Python.
The bash script runs multiple OS commands in parallel, then waits for them to finish before resuming, i.e.:
command1 &
command2 &
.
commandn &
wait
command
I want to achieve the same using Python subprocess. Is this possible? How can I wait for a subprocess.call command to finish before resuming?
You can still use Popen which takes the same input parameters as subprocess.call but is more flexible.
subprocess.call: The full function signature is the same as that of the Popen constructor - this functions passes all supplied arguments directly through to that interface.
One difference is that subprocess.call blocks and waits for the subprocess to complete (it is built on top of Popen), whereas Popen doesn't block and consequently allows you to launch other processes in parallel.
Try the following:
from subprocess import Popen

commands = ['command1', 'command2']
procs = [Popen(i) for i in commands]
for p in procs:
    p.wait()
Expanding on Aaron and Martin's answer, here is a solution that uses subprocess and Popen to run n processes in parallel:
import math
import subprocess

commands = ['cmd1', 'cmd2', 'cmd3', 'cmd4', 'cmd5']
n = 2  # the number of parallel processes you want

# process the commands in batches of n; round up so a trailing,
# smaller batch (here: cmd5) is not skipped
for j in range(math.ceil(len(commands) / n)):
    procs = [subprocess.Popen(i, shell=True)
             for i in commands[j * n: min((j + 1) * n, len(commands))]]
    for p in procs:
        p.wait()
I find this to be useful when a tool like multiprocessing could cause undesired behavior.

python multiprocessing can not control multiple long running console exe?

I am a newbie in Python. I recently tried to use a Python script to call a console exe, which is a long-running process. I want to allow the exe to be called as many times as the CPU permits, and when the exe has finished its job it should release the CPU to other new jobs. So I think I may need a multiple-process control mechanism.
Since multiprocessing can only call Python callable functions, it cannot directly call the console exe, so I wrapped subprocess.Popen(cmd) in a Python function. However, after I did so, I found that the exe had already started before multiprocessing.Process.start() was called. And the problem is that it does not return control to me before it finishes what it is doing (which takes a long time). The program freezes while waiting. That is not what I want.
I am posting the code below:
import sys
import os
import multiprocessing
import subprocess

def subprocessExe(cmd):
    return subprocess.call(cmd, shell=False, stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE, creationflags=0x08000000)

if __name__ == '__main__':
    p = multiprocessing.Process(target=self.subprocessExe(exeFileName))
    p.start()
    print p, p.is_alive()
Thank you for your time!
You're calling subprocessExe when you create the multiprocessing.Process object. You should instead do this:
p = multiprocessing.Process(target=subprocessExec, args=(exeFileName,))
And then it should work.
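Putting that fix into the original script, a corrected sketch would look like the following (the exe path is a placeholder, and the creationflags option from the question is omitted for brevity):
import multiprocessing
import subprocess

def subprocessExe(cmd):
    # runs the console exe inside the worker process and waits for it there
    return subprocess.call([cmd], shell=False)

if __name__ == '__main__':
    exeFileName = 'example.exe'  # placeholder path to the console exe
    p = multiprocessing.Process(target=subprocessExe, args=(exeFileName,))
    p.start()               # returns immediately; the exe runs in the child process
    print(p, p.is_alive())
    p.join()                # block here only when you need the exe to be finished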
There are a number of things that are wrong in your test case. The following works for me:
import multiprocessing, subprocess

def subprocessExe(cmd):
    subprocess.call([cmd], shell=False)

p = multiprocessing.Process(target=subprocessExe, args=('/full/path/to/script.sh',))
p.start()

OR

subprocess.call([cmd], shell=True)
p = multiprocessing.Process(target=subprocessExe, args=('script.sh',))

OR

subprocess.Popen([cmd], shell=True, stdout=subprocess.PIPE,
                 stderr=subprocess.PIPE)
