I am trying to run multiple Python scripts in parallel on Windows 7 (and 10). I run them all from another Python script, which performs further operations on the files those scripts are editing. I want the outer script to wait until the other scripts are done running. I have tried start /w, but that made each script wait before closing its console window.
Essentially, I want Python to wait until the 3 processes are done. The last script is just a print("done") and doesn't matter in itself. Solving this for 3 processes matters because I need to do the same thing with 30. (This runs on a server with plenty of available threads.)
This is the CMD command I am trying to run:
os.system("start python node1.py & start python node2.py & start python node3.py && start /w printstatement.py")
Any suggestions?
Use subprocess.Popen instead of os.system. You'll get 3 Popen instances that you can wait on. For example:
import subprocess
procs = [subprocess.Popen(['python', 'node{}.py'.format(n)])
         for n in range(1, 4)]
retcodes = [p.wait() for p in procs]
If you want separate console windows, as with CMD's start command, add creationflags=subprocess.CREATE_NEW_CONSOLE to the Popen call (Windows only). If you instead want separate consoles that don't create windows, use creationflags=subprocess.CREATE_NO_WINDOW (the named constant was added in Python 3.7; the raw value is 0x08000000). In that case the children still have console standard I/O; it just isn't rendered to a window.
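For instance, a sketch of the same loop with the new-console flag added (Windows only):
import subprocess

# one visible console window per script; swap in subprocess.CREATE_NO_WINDOW
# (Python 3.7+) to give each child console I/O without a window
procs = [subprocess.Popen(['python', 'node{}.py'.format(n)],
                          creationflags=subprocess.CREATE_NEW_CONSOLE)
         for n in range(1, 4)]
retcodes = [p.wait() for p in procs]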
Solution using asyncio:
import asyncio
commands = [
    'python node1.py',
    'python node2.py',
]

async def run_command(command):
    proc = await asyncio.create_subprocess_exec(*command.split())
    await proc.wait()

combined_task = asyncio.gather(*(run_command(command) for command in commands))
asyncio.get_event_loop().run_until_complete(combined_task)
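On Python 3.7+, asyncio.run is the more idiomatic entry point; a sketch of the same idea:
import asyncio

async def main():
    # start every script, then wait for all of them to finish
    procs = [await asyncio.create_subprocess_exec('python', 'node{}.py'.format(n))
             for n in range(1, 3)]
    await asyncio.gather(*(p.wait() for p in procs))

asyncio.run(main())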
Related
I have two Python scripts that use two different cameras for a project I am working on, and I am trying to run them both from another script or from within each other; either way is fine.
import os
os.system('python 1.py')
os.system('python 2.py')
My problem, however, is that they don't run at the same time; I have to quit the first one before the next one opens. I also tried doing it in bash with the & shell operator:
python 1.py &
python 2.py &
And this does in fact make them both run, but the issue is that they run endlessly in the background and I need a reasonably easy way to close them. Any suggestions for avoiding the issues with these implementations?
You could do it with multiprocessing:
import os
import time
import psutil
from multiprocessing import Process

def run_program(cmd):
    # Function that the processes will run
    os.system(cmd)

# Initiating Processes with desired arguments
program1 = Process(target=run_program, args=('python 1.py',))
program2 = Process(target=run_program, args=('python 2.py',))

# Start our processes simultaneously
program1.start()
program2.start()

def kill(proc_pid):
    process = psutil.Process(proc_pid)
    for proc in process.children(recursive=True):
        proc.kill()
    process.kill()

# Wait 5 seconds and kill first program
time.sleep(5)
kill(program1.pid)
program1.join()

# Wait another 1 second and kill second program
time.sleep(1)
kill(program2.pid)
program2.join()

# Print current status of our programs
print('1.py alive status: {}'.format(program1.is_alive()))
print('2.py alive status: {}'.format(program2.is_alive()))
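Since each Process here only wraps an os.system call (which itself starts a shell that starts Python, hence the psutil child-walking), the same effect can be had with subprocess.Popen directly; a sketch:
import time
import subprocess

# Popen starts each script with no shell in between, so kill() hits the script itself
proc1 = subprocess.Popen(['python', '1.py'])
proc2 = subprocess.Popen(['python', '2.py'])

time.sleep(5)
proc1.kill()
proc1.wait()

time.sleep(1)
proc2.kill()
proc2.wait()

print('1.py return code: {}'.format(proc1.returncode))
print('2.py return code: {}'.format(proc2.returncode))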
One possible method is to use systemd to control your processes (i.e. treat them as daemons).
This is how I control my Python servers, since they need to run in the background, completely detached from the current tty, so I can close my connection to the machine and the processes keep running. You can also stop a server later using systemctl, as explained below.
Instructions:
Create a .service file and save it in /etc/systemd/system, with contents along the lines of:
[Unit]
Description=daemon one
[Service]
ExecStart=/path/to/1.py
and repeat with a second unit file pointing at 2.py.
Then you can use systemctl to control your daemons.
First reload all config files with:
systemctl daemon-reload
then start either of your daemons (where my_daemon.service is one of your unit files):
systemctl start my_daemon
it should now be running and you should find it in:
systemctl list-units
You can also check its status with:
systemctl status my_daemon
and stop/restart them with:
systemctl stop|restart my_daemon
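Two caveats worth noting: ExecStart needs an absolute path to something executable, so if 1.py has no shebang line, point the unit at the interpreter instead; and an [Install] section is needed if you want systemctl enable to start it at boot. A slightly fuller unit might look like this (paths are placeholders):
[Unit]
Description=daemon one

[Service]
ExecStart=/usr/bin/python /path/to/1.py
Restart=on-failure

[Install]
WantedBy=multi-user.target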
Use subprocess.Popen. It starts a child process and gives you a handle to it; the pid attribute holds the child's process id.
pid = Popen(["python", "1.py"]).pid
Then check out these functions for communicating with the child process and checking whether it is still running.
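For example, poll() returns None while the child is still running, and terminate()/wait() stop and reap it; a minimal sketch:
from subprocess import Popen

p = Popen(["python", "1.py"])
if p.poll() is None:   # still running?
    p.terminate()      # ask it to stop
    p.wait()           # reap it and collect the exit code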
I need to execute my Python scripts in parallel, so I use the following batch file:
start python C:\myfolder\1.py
start python C:\myfolder\2.py
start python C:\myfolder\3.py
It works fine, but now I need to run three more scripts in parallel AFTER the above first three finish. How can I specify it in the same batch file?
You can do this easily in Python, without Windows batch commands.
You can use the subprocess library to run external scripts simultaneously and then wait for them all to complete.
import subprocess

processes = []
scripts = [
    r'C:\myfolder\1.py',
    r'C:\myfolder\2.py',
    r'C:\myfolder\3.py',
]

for script in scripts:
    p = subprocess.Popen(['python', script])
    processes.append(p)

for p in processes:
    p.wait()

# Run the other processes here
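The second batch can then reuse exactly the same pattern; a sketch assuming the next scripts are 4.py to 6.py (names are placeholders):
second_batch = [
    r'C:\myfolder\4.py',
    r'C:\myfolder\5.py',
    r'C:\myfolder\6.py',
]
processes = [subprocess.Popen(['python', s]) for s in second_batch]
for p in processes:
    p.wait()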
I think it should work like this:
(
start python C:\myfolder\1.py
start python C:\myfolder\2.py
start python C:\myfolder\3.py
) | pause
For an explanation see this answer.
When running a secondary Python script:
Is it possible to run a subprocess.Popen, subprocess.call, or even execfile in a new terminal (i.e. simply a different terminal than the one the current script runs in)?
Alternatively, if I open two terminals before running my main program, can I point the secondary script at the second terminal? (Somehow getting the IDs of the open terminals and then using a specific one of them for the subprocess.)
An example with two subprocesses: first.py should be called first, and only then second.py. The two scripts are interdependent: first.py goes into a wait mode until second.py runs, then first.py resumes. I don't know how to make this communication work between them as subprocesses.
import subprocess

command = ["python", "first.py"]
command2 = ["python", "second.py"]

n = 5
for i in range(n):
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    p2 = subprocess.Popen(command2, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        output = p.stdout.readline().strip()
        print output
        if output == 'stop':
            print 'success'
            p.terminate()
            p2.terminate()
            break
Environment: Ubuntu, Python 2.7.
I guess you want something like
subprocess.call(['xterm','-e','python',script])
Good old xterm has almost no frills; on a Freedesktop system, maybe run xdg-terminal instead. On Debian, try x-terminal-emulator.
However, making your program require X11 is in most cases a mistake. A better solution is to run the subprocesses with output to a log file (or a socket, or whatever) and then separately run tail -f on those files (in a different terminal, or from a different server over ssh, or with output to a logger that supports rsyslog, etc.), which keeps your program simple and modular, free from "convenience" dependencies.
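As a sketch of that approach, each script writes to its own log file that you can then tail -f from any terminal (script names taken from the question):
import subprocess

procs = []
for script in ['first.py', 'second.py']:
    log = open(script + '.log', 'w')
    # follow it separately with: tail -f first.py.log
    procs.append(subprocess.Popen(['python', script],
                                  stdout=log, stderr=subprocess.STDOUT))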
If you're using tmux, you can specify which target you want the command to run in:
tmux send -t foo.0 ls ENTER
So, if you've created a tmux session named foo (the 0 targets its first window), you should be able to do:
my_command = 'ls'
tmux_cmd = ['tmux', 'send', '-t', 'foo.0', my_command]
p = subprocess.Popen(tmux_cmd)
You can specify the tty of the terminal window you wish the command to be carried out in:
ls > /dev/ttys004
However, I would recommend going for the tmux approach for greater control (see my other answer).
I have a Python script that takes command line arguments. I get those arguments by reading a Mongo database. I need to iterate over the Mongo query results and launch a separate process of the single script for each set of command line arguments from the query.
Key is, I need the launched processes to be:
separate processes that share nothing
easy to kill all at once
I think the command killall -9 script.py would work and satisfies the second constraint.
Edit 1
From the answer below, the launcher.py program looks like this:
def main():
    symbolPreDict = initializeGetMongoAllSymbols()
    keys = sorted(symbolPreDict.keys())
    for symbol in keys:
        # Display key.
        print(symbol)
        command = ['python', 'mc.py', '-s', str(symbol)]
        print command
        subprocess.call(command)

if __name__ == '__main__':
    main()
The problem is that mc.py has a call that blocks:
receiver = multicast.MulticastUDPReceiver("192.168.0.2", symbolMCIPAddrStr, symbolMCPort)
while True:
    try:
        b = MD()
        data = receiver.read()  # This blocks
        ...
    except Exception, e:
        print str(e)
When I run the launcher, it just executes one of the mc.py instances (there should be at least 39). How do I modify the launcher program to run each launched script in the background, so that control returns to the launcher to launch more scripts?
Edit 2
The problem is solved by replacing subprocess.call(command) with subprocess.Popen(command).
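With that change, the loop returns immediately after starting each script instead of blocking on it; the relevant part of the launcher becomes (a sketch):
procs = []
for symbol in keys:
    command = ['python', 'mc.py', '-s', str(symbol)]
    procs.append(subprocess.Popen(command))  # returns immediately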
One thing I noticed, though: with ps ax | grep mc.py the PIDs are all different. I don't think I care, since I can kill them all easily.
[Correction] kill them with pkill -f xxx.py
There are several options for launching scripts from a script. The easiest are probably the subprocess and os modules.
I have done this several times to launch things to separate nodes on a cluster. Using os it might look something like this:
import os

for i in range(len(operations)):
    # the trailing '&' backgrounds each run (POSIX) so the loop doesn't block;
    # in practice, give each run its own log file so they don't overwrite each other
    os.system("python myScript.py {:} {:} > out.log &".format(arg1, arg2))
Using killall, you should have no problem terminating processes spawned this way.
Another option is to use subprocess, which has a wide range of features and is much more flexible than os.system. An example might look like:
import subprocess

for i in range(len(operations)):
    command = ['python', 'myScript.py', 'arg1', 'arg2']
    subprocess.call(command)  # note: call() blocks until each script exits; use Popen to run them concurrently
In both of these methods, the processes are independent and share nothing other than a parent PID.
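If you keep the Popen handles instead of using subprocess.call, you can also tear the whole batch down from Python rather than relying on killall; a sketch:
import subprocess

procs = [subprocess.Popen(['python', 'myScript.py', 'arg1', 'arg2'])
         for i in range(len(operations))]
# later, kill them all
for p in procs:
    p.terminate()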
I am trying to use the Python 2.7 subprocess library to programmatically add songs to the VLC player queue.
From here and here, I am able to launch VLC Player and play a song (or queue songs from the outset):
from subprocess import Popen
vlcpath = r'C:\Program Files (x86)\VideoLAN\VLC\vlc.exe'
musicpath1 = r'path\to\song1.mp3'
musicpath2 = r'path\to\song2.mp3'
p = Popen([vlcpath,musicpath1]) # launch VLC and play song
p = Popen([vlcpath,musicpath1,musicpath2]) # launch VLC and play/queue songs
The problem is that I do not know the entire queue playlist at launch. I want to be able to add songs to the queue of the VLC process already running. How do I accomplish this, please?
From here, I think the appropriate command line entry is:
vlc.exe --started-from-file --playlist-enqueue "2.wmv"
But I do not know the syntax to execute this in subprocess. I tried a couple of things, but couldn't get either to work:
calling Popen again (opens a new process)
calling p.communicate (I thought this was how to send stdin commands)
To run the command vlc.exe --started-from-file --playlist-enqueue "2.wmv" using the subprocess module on Windows:
from subprocess import Popen
cmd = 'vlc.exe --started-from-file --playlist-enqueue "2.wmv"'
p = Popen(cmd) # start and forget
assert not p.poll()  # fails if VLC has already exited with a non-zero status
To wait for the command to finish:
from subprocess import check_call
check_call(cmd) # start, wait until it is done, raise on non-zero exit status
But how do I run that command a second time on the same p process? Your code starts a new instance of VLC rather than running the command on top of the p that was already open. I found that if I run the vlc.exe --started-from-file --playlist-enqueue "2.wmv" command multiple times manually (in a command prompt window), it correctly launches VLC the first time and then adds to the queue on subsequent calls. So I think I just need to be able to run the code you suggested multiple times "on top of itself".
Each Popen() starts a new process, just as each manual run of the command in the command line starts a new process. Whether a new instance opens or the file is enqueued in the running one may come down to your current VLC configuration (e.g. its one-instance setting) or to differences in the command-line arguments you pass.
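Given that repeating the command manually enqueues correctly on your machine, the Python equivalent is simply to call Popen once per song; each invocation is a short-lived process that hands the file to the already-running VLC instance (a sketch reusing the paths from the question):
from subprocess import Popen

vlcpath = r'C:\Program Files (x86)\VideoLAN\VLC\vlc.exe'
for song in [musicpath1, musicpath2]:
    # each call exits after passing the song to the running instance
    Popen([vlcpath, '--started-from-file', '--playlist-enqueue', song]).wait()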