I have scripts I would like to execute in sequence with a time delay between each of them.
The intention is to run scripts that scan file names for a string and import the matching files into a folder. The time delay is meant to give each script time to finish copying its files before moving on to the next one.
I have tried the solutions from these questions already posed on Stack Overflow:
Running multiple Python scripts
Run a python script from another python script, passing in args
But I'm not understanding why the lines below don't work.
import time
import subprocess
subprocess.call(r'C:\Users\User\Documents\get summary into folder.py', shell=True)
time.sleep(100)
subprocess.call(r'C:\Users\User\Documents\get summaries into folder.py', shell=True)
time.sleep(100)
The script opens the files but doesn't run them.
A couple of things. First, time.sleep takes seconds as its argument, so you're waiting 100 seconds after spawning each of these two processes; I guess you meant 0.100. In any case, if you just want to run your two scripts synchronously, you're better off using subprocess.Popen.wait, so that you won't wait any longer than necessary. Example below:
import subprocess

# Inline test script: prints, sleeps for a second, prints again.
test_cmd = "".join([
    "import time;",
    "print('starting script{}...');",
    "time.sleep(1);",
    "print('script{} done.')"
])

for i in range(2):
    # wait() blocks until the spawned interpreter exits, so the second
    # script only starts once the first one has finished.
    subprocess.Popen(
        ["python", "-c", test_cmd.format(*[str(i)] * 2)]).wait()
    print('-' * 80)
I have searched and tried a lot of code on this topic. I am trying to run two Python files at the same time.
This is my attempt:
import subprocess
subprocess.run("py pop1.py & py pop2.py", shell=True)
But this executes the first script and then the second one, which is not what I want. My goal is to run both files at the same time.
subprocess can do this all on its own without invoking shell=True with the & bashism.
import subprocess
# start processes running in parallel
p1 = subprocess.Popen(['py', 'pop1.py'])
p2 = subprocess.Popen(['py', 'pop2.py'])
# wait for both processes to complete
p1.wait()
p2.wait()
I have a child python script that takes an argument and takes approx. 8 minutes to run.
e.g. python.exe child.py "2018-01-01"
I need to execute this script many times from a main script. I am considering using subprocess.Popen.
import os, sys, time, subprocess
for date in ["2018-01-01", "2018-01-02", "2018-01-03", ..., "2018-12-31"]
p = subprocess.Popen(['python.exe', "child.py", date])
time.sleep(600)
Because Popen does not know when the child script finishes, it would otherwise keep launching the child script with the next argument right away. So I had to set a sleep of 600 seconds (longer than the child script's approximate run time) so that each run safely starts only after the previous one has finished.
I wonder if there is a more efficient way of dealing with this situation.
If the scripts need to run synchronously, consider using subprocess. More specifically, the run function (Python >= 3.5), or the call function (< 3.5), which is similar to run but only returns the child's exit code. Both block the calling script until the child returns.
Your code would become:
import shlex
import subprocess
for date in ["2018-01-01", "2018-01-02", "2018-01-03", ..., "2018-12-31"]:
command = 'python.exe child.py %s' % date
args = shlex.split(command)
res = subprocess.run(args)
If you need it to run asynchronously, consider using xargs. If you really need to do it in Python, use multiprocessing or multiprocessing.dummy to do it.
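As a rough sketch of the multiprocessing.dummy approach (a thread-backed Pool; child.py and the dates come from the question, while the pool size of 4 is an arbitrary choice), it might look like this:

import subprocess
from multiprocessing.dummy import Pool  # thread pool with the multiprocessing API

def run_child(date):
    # Each call blocks its worker thread until child.py exits.
    return subprocess.run(['python.exe', 'child.py', date]).returncode

dates = ["2018-01-01", "2018-01-02", "2018-01-03"]  # extend to the full year as needed
with Pool(4) as pool:  # run up to 4 children at a time
    return_codes = pool.map(run_child, dates)
print(return_codes)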
How about this: call wait() so the current subprocess finishes before launching another one.
for date in ["2018-01-01", "2018-01-02", "2018-01-03", ..., "2018-12-31"]
p = subprocess.Popen(['python.exe', "child.py", date])
p.wait()
rc = p.returncode
print(rc)
So I have been trying for hours, even days, to figure out how to run 2 Python files simultaneously.
I have tried subprocesses, multiprocessing, bash and whatnot, I must be doing something wrong, I just don't know what.
I have 2 python files, and I want to run them in parallel, but note that neither of them end. I want, while the first file is open and running, to run a second file.
Everything I have tried only opens the first file and stops there, since the script there is supposed to be running 24/7. Note that when I tried to use a separate bash file, it, for some reason, opened on git and then closed, doing nothing. I'm really desperate at this point ngl
Please do provide detailed answers with code, as I have been scanning the whole internet (StackOverflow included), I have tried everything and nothing seems to be working..
import subprocess
import LemonBot_General
import LemonBot_Time
import multiprocessing
def worker(file):
    subprocess.Popen(["python3 LemonBot_Time.py"], stdout=subprocess.PIPE)
    subprocess.Popen(["python3 LemonBot_General.py"], stdout=subprocess.PIPE)

if __name__ == '__main__':
    files = ["LemonBot_General.py", "LemonBot_Time.py"]
    for i in files:
        p = multiprocessing.Process(target=worker, args=(i,))
        p.start()
This is the latest I tried and didn't work..
I also tried the subprocess commands alone, that didn't work as well.
Bash file also didn't work.
EDIT: NEITHER of the files FINISH. I want to run them in parallel.
You should be able to use Popen from subprocess. It worked for me. If you remove the p.wait() line, the second file will quit as soon as this first file finishes.
import time
import subprocess
p = subprocess.Popen(['python', 'test_file.py'])
time.sleep(5)
print("Working well")
p.wait()
Use os.system('python3 myprogram.py') inside a threading.Thread() for each file.
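A minimal sketch of that idea, assuming the two LemonBot file names from the question, could look like this:

import os
import threading

def run_script(path):
    # os.system blocks this thread until the script exits,
    # but the other thread keeps running in parallel.
    os.system('python3 {}'.format(path))

files = ["LemonBot_General.py", "LemonBot_Time.py"]
threads = [threading.Thread(target=run_script, args=(f,)) for f in files]
for t in threads:
    t.start()
for t in threads:
    t.join()  # both scripts run 24/7, so this simply keeps the launcher alive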
So, I have a script which takes an input and starts a Spark job on a cluster, something like
spark-submit driver.py -i input_path
Now, I have a list of paths and I want to execute all of these simultaneously.
Here is what I tried
import subprocess

base_command = 'spark-submit driver.py -i %s'
for path in paths:
    command = base_command % path
    subprocess.Popen(command, shell=True)
My hope was that all of the shell commands would be executed simultaneously, but instead I am noticing that they execute one at a time.
How do I execute all of these commands simultaneously?
Thanks
This is where Pool comes in; it is designed for just this case. It maps many inputs onto a pool of workers automatically. Here is a good resource on how to use it.
import subprocess
from multiprocessing import Pool

def run_command(path):
    command = "spark-submit driver.py -i {}".format(path)
    # Popen launches the job and returns immediately; add .wait() here
    # if each pool worker should block until its job finishes.
    subprocess.Popen(command, shell=True)

pool = Pool()
pool.map(run_command, paths)
Pool() creates a pool of worker processes (one per CPU core by default), and map distributes the items in paths across them, so the commands are launched in parallel.
I have a Python script that takes command line arguments. I get those arguments by reading a mongo database, so I need to iterate over the mongo query results and launch a separate process of the single script for each set of arguments.
The key is that the launched processes must be:
separate processes that share nothing
easy to kill all at once when I need to stop them.
I think the command killall -9 script.py would work and satisfy the second constraint.
Edit 1
From the answer below, the launcher.py program looks like this
def main():
    symbolPreDict = initializeGetMongoAllSymbols()
    keys = sorted(symbolPreDict.keys())
    for symbol in keys:
        # Display key.
        print(symbol)
        command = ['python', 'mc.py', '-s', str(symbol)]
        print(command)
        subprocess.call(command)

if __name__ == '__main__':
    main()
The problem is that mc.py has a call that blocks
receiver = multicast.MulticastUDPReceiver("192.168.0.2", symbolMCIPAddrStr, symbolMCPort)
while True:
    try:
        b = MD()
        data = receiver.read()  # This blocks
        ...
    except Exception, e:
        print str(e)
When I run the launcher, it just executes one of the mc.py instances (there are at least 39). How do I modify the launcher program to "run the launched script in the background" so that control returns to the launcher and it can launch more scripts?
Edit 2
The problem is solved by replacing subprocess.call(command) with subprocess.Popen(command).
One thing I noticed, though: if I run ps ax | grep mc.py, the PIDs all seem to be different. I don't think I care, since I can kill them all pretty easily with killall.
[Correction] kill them with pkill -f xxx.py
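For reference, a sketch of the launcher after that change (reusing the initializeGetMongoAllSymbols helper from the code above; keeping the Popen handles is optional but gives a second way to stop everything besides pkill):

import subprocess

def main():
    symbolPreDict = initializeGetMongoAllSymbols()
    processes = []
    for symbol in sorted(symbolPreDict.keys()):
        command = ['python', 'mc.py', '-s', str(symbol)]
        # Popen returns immediately, so the next mc.py starts right away
        processes.append(subprocess.Popen(command))
    return processes

if __name__ == '__main__':
    procs = main()
    # later, to stop them all from Python instead of pkill:
    # for p in procs: p.terminate()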
There are several options for launching scripts from a script. The easiest are probably to use the subprocess or os modules.
I have done this several times to launch things to separate nodes on a cluster. Using os it might look something like this:
import os

for i in range(len(operations)):
    os.system("python myScript.py {:} {:} > out.log".format(arg1, arg2))
Using killall, you should have no problem terminating processes spawned this way.
Another option is to use subprocess, which has a wide range of features and is much more flexible than os.system. An example might look like:
import subprocess

for i in range(len(operations)):
    command = ['python', 'myScript.py', 'arg1', 'arg2']
    subprocess.call(command)
In both of these methods, the processes are independent and share nothing other than a parent PID.
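If you want to satisfy the kill-them-all constraint from within Python rather than relying on killall, one variant is to keep the Popen handles and terminate them yourself (myScript.py and its arguments here are placeholders echoing the example above):

import subprocess

commands = [['python', 'myScript.py', 'arg1', 'arg2'] for _ in range(3)]
procs = [subprocess.Popen(cmd) for cmd in commands]  # all children start immediately
try:
    for p in procs:
        p.wait()
except KeyboardInterrupt:
    for p in procs:
        p.terminate()  # kill every child we started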