Run Python files at the same time with subprocess - python

I have searched and tried a lot of code for this topic. I am trying to run two Python files, both at the same time.
This is my attempt:
import subprocess
subprocess.run("py pop1.py & py pop2.py", shell=True)
But this executes the first script and then the second one. That is not the target; my target is to run both files at the same time.

subprocess can do this all on its own, without shell=True and the & bashism.
import subprocess
# start processes running in parallel
p1 = subprocess.Popen(['py', 'pop1.py'])
p2 = subprocess.Popen(['py', 'pop2.py'])
# wait for both processes to complete
p1.wait()
p2.wait()
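If you also want to collect each script's output rather than just wait for it, one option is to wrap the blocking subprocess.run in a thread pool. A minimal sketch, assuming Python 3.7+ and the same pop1.py/pop2.py from the question:

import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_script(script):
    # subprocess.run blocks until the script exits, so each call
    # gets its own thread; capture_output requires Python 3.7+
    return subprocess.run(['py', script], capture_output=True, text=True)

with ThreadPoolExecutor() as pool:
    results = pool.map(run_script, ['pop1.py', 'pop2.py'])

for result in results:
    print(result.args, 'exited with', result.returncode)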

Related

Can't get a python file to run 2 other python files in parallel

So I have been trying for hours, even days, to figure out how to run 2 Python files simultaneously.
I have tried subprocesses, multiprocessing, bash and whatnot; I must be doing something wrong, I just don't know what.
I have 2 Python files and I want to run them in parallel, but note that neither of them ends. I want, while the first file is open and running, to run the second file.
Everything I have tried only opens the first file and stops there, since that script is supposed to run 24/7. Note that when I tried to use a separate bash file, it opened in Git Bash for some reason and then closed, doing nothing. I'm really desperate at this point, ngl.
Please do provide detailed answers with code; I have been scanning the whole internet (Stack Overflow included), I have tried everything and nothing seems to work.
import subprocess
import LemonBot_General
import LemonBot_Time
import multiprocessing

def worker(file):
    subprocess.Popen(["python3 LemonBot_Time.py"], stdout=subprocess.PIPE)
    subprocess.Popen(["python3 LemonBot_General.py"], stdout=subprocess.PIPE)

if __name__ == '__main__':
    files = ["LemonBot_General.py", "LemonBot_Time.py"]
    for i in files:
        p = multiprocessing.Process(target=worker, args=(i,))
        p.start()
This is the latest thing I tried, and it didn't work.
I also tried the subprocess commands alone; that didn't work either.
A bash file didn't work either.
EDIT: NEITHER of the files FINISHES. I want to run them in parallel.
You should be able to use Popen from subprocess. It worked for me. Note that if you remove the p.wait() line, the second file will quit as soon as this first file finishes.
import time
import subprocess
p = subprocess.Popen(['python', 'test_file.py'])
time.sleep(5)
print("Working well")
p.wait()
Use os.system('python3 myprogram.py') inside a threading.Thread() for each file.
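A minimal sketch of that suggestion, with placeholder script names:

import os
import threading

# os.system blocks until its command exits, so giving each call its own
# thread runs both scripts at the same time
files = ['myprogram1.py', 'myprogram2.py']  # placeholder names
threads = [threading.Thread(target=os.system, args=('python3 ' + f,))
           for f in files]
for t in threads:
    t.start()
for t in threads:
    t.join()  # optional: block here until both scripts exit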

Run Python scripts in parallel and wait for all to finish before executing more parallel scripts

I need to execute my Python scripts in parallel so I use the following batch file for it:
start python C:\myfolder\1.py
start python C:\myfolder\2.py
start python C:\myfolder\3.py
It works fine, but now I need to run three more scripts in parallel AFTER the above first three finish. How can I specify it in the same batch file?
You can easily do this in Python without using Windows batch commands.
Use the subprocess library to run the external scripts simultaneously and then wait for them all to complete.
import subprocess

processes = []
scripts = [
    r'C:\myfolder\1.py',
    r'C:\myfolder\2.py',
    r'C:\myfolder\3.py',
]

# start all scripts in parallel
for script in scripts:
    p = subprocess.Popen(['python', script])
    processes.append(p)

# wait for every one of them to finish
for p in processes:
    p.wait()

# Run the next batch of scripts here
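Since the question asks for a second batch after the first one finishes, the same pattern can be factored into a helper. A sketch, assuming the next three scripts are named 4.py-6.py (hypothetical names):

import subprocess

def run_batch(scripts):
    # start every script in the batch, then block until all have exited
    procs = [subprocess.Popen(['python', script]) for script in scripts]
    for p in procs:
        p.wait()

run_batch([r'C:\myfolder\1.py', r'C:\myfolder\2.py', r'C:\myfolder\3.py'])
run_batch([r'C:\myfolder\4.py', r'C:\myfolder\5.py', r'C:\myfolder\6.py'])  # hypothetical second batch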
I think it should work like this:
(
start python C:\myfolder\1.py
start python C:\myfolder\2.py
start python C:\myfolder\3.py
) | pause
The pipe is what makes the batch file wait: the started processes inherit the write end of the pipe, so pause on the right-hand side only sees end-of-input once all three scripts have exited and the pipe closes.

Running multiple python scripts in a sequence

I have scripts I would like to execute in sequence with a time delay between each of them.
The intention is to run scripts which scan for a string in file names and import those files into a folder. The time delay is to give each script time to finish copying the files before moving on to the next one.
I have tried the questions already posed on Stackoverflow:
Running multiple Python scripts
Run a python script from another python script, passing in args
But I don't understand why the lines below don't work.
import time
import subprocess
subprocess.call(r'C:\Users\User\Documents\get summary into folder.py', shell=True)
time.sleep(100)
subprocess.call(r'C:\Users\User\Documents\get summaries into folder.py', shell=True)
time.sleep(100)
The script opens the files but doesn't run them.
A couple of things. First, time.sleep takes seconds as its argument, so you're waiting 100 seconds after spawning each of these two processes; I guess you meant 0.100. In any case, if you just want to run your two scripts one after the other, it's better to use subprocess.Popen together with wait, so you never wait longer than necessary. Example below:
import subprocess

# a tiny script passed to `python -c`, so the example is self-contained
test_cmd = "".join([
    "import time;",
    "print('starting script{}...');",
    "time.sleep(1);",
    "print('script{} done.')",
])

for i in range(2):
    # wait() makes each script finish before the next one starts
    subprocess.Popen(["python", "-c", test_cmd.format(*[str(i)] * 2)]).wait()
    print('-' * 80)
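As for why the original version only opened the files: passing a bare .py path to the Windows shell launches whatever program is associated with .py files, which may be an editor rather than the interpreter. A sketch of the original two calls with the interpreter named explicitly:

import subprocess

# subprocess.call blocks until each script exits, so no sleep is needed
# in between; naming the interpreter avoids the .py file association
subprocess.call(['python', r'C:\Users\User\Documents\get summary into folder.py'])
subprocess.call(['python', r'C:\Users\User\Documents\get summaries into folder.py'])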

How to execute multiple bash commands in parallel in python

So, I have code which takes an input and starts a Spark job in the cluster, something like:
spark-submit driver.py -i input_path
Now I have a list of paths and I want to execute all of these simultaneously.
Here is what I tried
import subprocess

base_command = 'spark-submit driver.py -i %s'
for path in paths:
    command = base_command % path
    subprocess.Popen(command, shell=True)
My hope was that all of the shell commands would be executed simultaneously, but instead I am noticing that it executes one command at a time.
How do I execute all of these commands simultaneously?
Thanks
This is where multiprocessing.Pool comes in; it is designed for exactly this case. It maps many inputs across a fixed pool of workers automatically.
import subprocess
from multiprocessing import Pool

def run_command(path):
    command = "spark-submit driver.py -i {}".format(path)
    # use the blocking call (not Popen) so each pool worker holds
    # exactly one running job at a time
    subprocess.call(command, shell=True)

pool = Pool()
pool.map(run_command, paths)
By default the Pool starts one worker process per CPU core and distributes the paths across them, so that many commands run at the same time; pass a number to Pool() to control the degree of parallelism.
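An alternative sketch using threads instead of worker processes; threads are enough here because each worker just blocks on an external spark-submit (the paths shown are hypothetical):

import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_command(path):
    # subprocess.call blocks, so each thread holds exactly one running job
    subprocess.call('spark-submit driver.py -i {}'.format(path), shell=True)

paths = ['/data/input_a', '/data/input_b']  # hypothetical input paths
with ThreadPoolExecutor(max_workers=4) as pool:
    pool.map(run_command, paths)
# leaving the with block waits for all submitted jobs to finish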

How do I run multiple subprocesses in parallel and wait for them to finish in Python

I am trying to migrate a bash script to Python.
The bash script runs multiple OS commands in parallel and then waits for them to finish before resuming, i.e.:
command1 &
command2 &
.
commandn &
wait
command
I want to achieve the same using Python subprocess. Is this possible? How can I wait for a subprocess.call command to finish before resuming?
You can still use Popen which takes the same input parameters as subprocess.call but is more flexible.
subprocess.call: The full function signature is the same as that of the Popen constructor - this function passes all supplied arguments directly through to that interface.
One difference is that subprocess.call blocks and waits for the subprocess to complete (it is built on top of Popen), whereas Popen doesn't block and consequently allows you to launch other processes in parallel.
Try the following:
from subprocess import Popen

commands = ['command1', 'command2']
procs = [Popen(i) for i in commands]  # all commands start immediately
for p in procs:
    p.wait()  # then block until every one of them has finished
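Popen.wait() also returns the command's exit code, so if you want to check for failures the final loop can be written as a comprehension (a small follow-on to the snippet above):

exit_codes = [p.wait() for p in procs]
print(exit_codes)  # e.g. [0, 0] when both commands succeed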
Expanding on Aaron and Martin's answer, here is a solution that uses subprocess and Popen to run n processes in parallel:
import math
import subprocess

commands = ['cmd1', 'cmd2', 'cmd3', 'cmd4', 'cmd5']
n = 2  # the number of parallel processes you want

# run the commands in batches of n; ceiling division ensures a trailing
# partial batch (cmd5 here) is not dropped
for j in range(math.ceil(len(commands) / n)):
    procs = [subprocess.Popen(i, shell=True) for i in commands[j * n:(j + 1) * n]]
    for p in procs:
        p.wait()
I find this to be useful when using a tool like multiprocessing could cause undesired behavior.
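One caveat of the batch approach is that a new batch only starts once the slowest process in the current batch finishes. A rolling-window sketch that instead starts the next command as soon as any child exits:

import subprocess
import time

commands = ['cmd1', 'cmd2', 'cmd3', 'cmd4', 'cmd5']
n = 2  # maximum number of children alive at once

running = []
for cmd in commands:
    # wait for a free slot before starting the next command
    while len(running) >= n:
        running = [p for p in running if p.poll() is None]  # drop finished ones
        time.sleep(0.1)
    running.append(subprocess.Popen(cmd, shell=True))

for p in running:
    p.wait()  # wait for the stragglers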
