I have multiple Python files to run. How would I launch all of them from one .py script? This is what I came up with, but it shows the screen session and doesn't start the next script unless I exit out of it. Here's the code; it's not much:
import os
print("Launching Bot, just for you.")
print("Loading up shard 0")
try:
    os.system("screen python3.5 run_0.py > /dev/null")
except:
    print("Shard 0 failed")
print("Loading up shard 1")
try:
    os.system("screen python3.5 run_1.py > /dev/null")
except:
    print("Shard 1 failed")
print("Done running shards...")
I was doing some research and people said to use subprocess, but when I used it, it didn't run my command properly (I don't have a copy of that code; I lost it).
The problem is that the script runs fine, but I have to close the screen session to start the other one; I just want it to run the command without showing the output. Can you help?
You should import subprocess in your Python file (along with sys, for sys.executable). You can then start other programs with:
subprocess.Popen([sys.executable, "newprogram.py"])
You can mix that with the multiprocessing package to launch one process per program:
p = multiprocessing.Process(target=mp_worker, args=())
p.start()
where mp_worker launches the other program.
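Putting the two together, a minimal sketch might look like this (run_0.py and run_1.py are the question's script names; two trivial python -c commands stand in for them here so the example is self-contained):

```python
import multiprocessing
import subprocess
import sys

def mp_worker(cmd):
    # Each worker process runs one external program and blocks until it exits.
    subprocess.run(cmd)

def launch_all(commands):
    procs = [multiprocessing.Process(target=mp_worker, args=(c,)) for c in commands]
    for p in procs:
        p.start()   # all programs now run in parallel
    for p in procs:
        p.join()    # wait for every one of them to finish
    return [p.exitcode for p in procs]

if __name__ == '__main__':
    # Stand-ins for run_0.py / run_1.py: two trivial interpreter invocations.
    cmds = [[sys.executable, '-c', "print('shard 0')"],
            [sys.executable, '-c', "print('shard 1')"]]
    launch_all(cmds)
```

In real use you would replace each command list with e.g. `[sys.executable, 'run_0.py']`.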
Related
I am in a bit of a pickle here. I have a Python script (gather.py) that gathers information from an .xml file and uploads it into a database in an infinite loop that sleeps for 60 seconds; all of this is local. I am using Flask to run a webpage that will later pull information from the database, but at the moment all it does is display a sample page (main.py). I want main.py to start gather.py as a background process that won't prevent Flask from starting. I tried importing gather.py, but that halts the process indefinitely and Flask never starts.
After Googling for a while, it seems the best option is a task queue (Celery) and a message broker (RabbitMQ). That would be fine if the application did a lot of work in the background, but I only need it to do one or two things. So I did more digging and found posts stating that subprocess.Popen() could do the job. I tried it and I don't think it failed, since it raised no errors, but the database stays empty. I confirmed that both gather.py and main.py work independently. I tried running the following code in IDLE:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x049A1CF0>
Now, I don't know what this means. I tried using .value and .attrib, but understandably I get this:
AttributeError: 'Popen' object has no attribute 'value'
and
AttributeError: 'Popen' object has no attribute 'attrib'
Then I read in a StackOverflow post that stdout=subprocess.PIPE could cause the program to halt, so, in a 'just in case' moment, I ran:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x034A77D0>
Throughout all of this the database tables have remained empty. I am new to the subprocess module, but all of this checks out and I can't figure out why it isn't running gather.py. Is it because it has an infinite loop? If there is a better option, please let me know.
Python version: 3.4.4
PS. IDK if it'll matter but I am running a portable version of Python (PortableApps) on a Windows 10 PC. This is why I included sys.executable inside subprocess.Popen().
Solution 1 (all in python script):
Try using Thread and Queue. I do it like this:
from flask import Flask
from flask import request
import json
from queue import Queue  # lowercase "queue" on Python 3
from threading import Thread
import time

def task(q):
    q.put(0)
    t = time.time()
    while True:
        time.sleep(1)
        q.put(time.time() - t)

queue = Queue()
worker = Thread(target=task, args=(queue,))
worker.start()

app = Flask(__name__)

@app.route('/')
def message_from_queue():
    msg = "Running: calculate %f seconds" % queue.get()
    return msg

if __name__ == '__main__':
    app.run(host='0.0.0.0')
If you run this code, each access to '/' gets a value calculated by task in the background. You may need to block until the task produces a value, but there isn't enough information in the question to tell. Of course, you need to refactor your gather.py to accept a queue.
Solution 2 (using a system script):
For windows, create a .bat file and run both script from there:
@echo off
start python 'path\to\gather.py'
set FLASK_APP=app.py
flask run
This will run gather.py and then start the Flask server. If you use start /min python 'path\to\gather.py', gather will run in a minimized window.
subprocess.Popen will not open a Python program on its own, because the operating system treats a .py file as a document, not an executable; you have to name the interpreter explicitly, e.g. subprocess.Popen([sys.executable, 'myprogram.py']).
You can use:
os.system('python_file_path.py')
but it won't run as a background process (os.system blocks until the script exits).
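If a non-blocking launch is what's actually needed, a hedged sketch with subprocess.Popen (the script name below is a placeholder) avoids both problems: the interpreter is named explicitly and the call returns immediately:

```python
import subprocess
import sys

def launch_background(script):
    # Start the script and return immediately; output is discarded so the
    # child can't clutter or block the parent's console.
    return subprocess.Popen([sys.executable, script],
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)

proc = launch_background('gather.py')  # placeholder script name
# ... the launcher keeps running here while the child works ...
```

Call proc.wait() later only if you eventually want to block on the child.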
So I have been trying for hours, days even, to figure out a way to run 2 Python files simultaneously.
I have tried subprocess, multiprocessing, bash and whatnot; I must be doing something wrong, I just don't know what.
I have 2 Python files, and I want to run them in parallel, but note that neither of them ends. I want, while the first file is open and running, to run a second file.
Everything I have tried only opens the first file and stops there, since the script there is supposed to be running 24/7. Note that when I tried to use a separate bash file, it opened in git for some reason and then closed, doing nothing. I'm really desperate at this point, ngl.
Please do provide detailed answers with code, as I have been scanning the whole internet (StackOverflow included); I have tried everything and nothing seems to be working.
import subprocess
import LemonBot_General
import LemonBot_Time
import multiprocessing

def worker(file):
    subprocess.Popen(["python3 LemonBot_Time.py"], stdout=subprocess.PIPE)
    subprocess.Popen(["python3 LemonBot_General.py"], stdout=subprocess.PIPE)

if __name__ == '__main__':
    files = ["LemonBot_General.py", "LemonBot_Time.py"]
    for i in files:
        p = multiprocessing.Process(target=worker, args=(i,))
        p.start()
This is the latest thing I tried, and it didn't work.
I also tried the subprocess commands alone; that didn't work either.
Bash file also didn't work.
EDIT: NEITHER of the files FINISH. I want to run them in parallel.
You should be able to use Popen from subprocess. It worked for me. Note that if you remove the p.wait() line, this script will exit without waiting for test_file.py to finish.
import time
import subprocess
p = subprocess.Popen(['python', 'test_file.py'])
time.sleep(5)
print("Working well")
p.wait()
Use an os.system('python3 myprogram.py') call inside a threading.Thread() target, one thread per file.
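That suggestion might be sketched like this, reusing the question's file names (assumed to sit next to the launcher):

```python
import os
import sys
import threading

def run_script(path):
    # os.system blocks until the command finishes, so each script
    # gets its own thread to keep them running side by side.
    os.system('%s %s' % (sys.executable, path))

files = ['LemonBot_General.py', 'LemonBot_Time.py']  # names from the question
threads = [threading.Thread(target=run_script, args=(f,)) for f in files]
for t in threads:
    t.start()
for t in threads:
    t.join()  # drop the joins if the launcher should exit first
```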
I am developing some Python (version 3.6.1) code to install an application in Windows 7. The code used is this:
import subprocess

winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
output = subprocess.check_call(winCMD, shell=True)
The application installs successfully. The problem is that it always requires a reboot after it finishes (a popup with the message "You must restart your system for the configuration changes made to take effect. Click Yes to restart now or No if you plan to restart later.").
I tried to insert the parameter "/forcerestart" (source here) into the installation command, but it still stops to request the reboot:
def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /forcerestart /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)
Another attempt was to chain a command like the one below; although, since the previous command has not finished yet (as per my understanding), I realized it will never be called:
rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Has anyone had such an issue and managed to solve it?
As an ugly workaround, if you're not time-critical but you want to emphasise the "automatic" aspect, why not
run the installCMD in a thread
wait sufficiently long to be sure that the command has completed
perform the shutdown
like this:
import threading, time
import subprocess

def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)

t = threading.Thread(target=installApp)
t.start()

time.sleep(1800)  # half an hour should be enough

rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Another (safer) way would be to find out which file is created last in the installation, and monitor for its existence in a loop like this:
while not os.path.isfile("somefile"):
    time.sleep(60)
time.sleep(60)  # another minute for safety
# perform the reboot
To be clean, you'd have to use subprocess.Popen for the installation process, keep the handle in a global, and call terminate() on it from the main process; but since you're calling a shutdown anyway, that's not necessary.
(To be really clean, we wouldn't have to do this hack in the first place.)
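For completeness, that "clean" variant could be sketched as below; this is untested, and the setup.exe command comes straight from the question:

```python
import subprocess

installer = None  # module-level handle so other code can reach the process

def install_app():
    global installer
    installer = subprocess.Popen(
        r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"',
        shell=True)
    installer.wait()

def abort_install():
    # Safe no-op when nothing is running; otherwise kill the installer.
    if installer is not None and installer.poll() is None:
        installer.terminate()
```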
I have a script that collects data from the streaming API. I'm getting an error at random that I believe is coming from Twitter's end, for whatever reason. It doesn't happen at a specific time; I've seen it as early as 10 minutes after running my script, and other times after 2 hours.
My question is: how do I create another script (outside the running one) that can catch when mine terminates with an error, and then restart it after a delay?
I did some searching, and most results involved using bash on Linux; I'm on Windows. Other suggestions were to use Windows Task Scheduler, but that can only be set for a known time.
I came across the following code:
import os, sys, time

def main():
    print("AutoRes is starting")
    executable = sys.executable
    args = sys.argv[:]
    args.insert(0, sys.executable)
    time.sleep(1)
    print("Respawning")
    os.execvp(executable, args)

if __name__ == "__main__":
    main()
If I'm not mistaken, that runs inside the script itself, correct? The issue with that is my script is currently collecting data and I can't terminate it in order to edit it.
How about this?
from os import system
from time import sleep

while True:  # manually terminate when you want to stop streaming
    system('python streamer.py')
    sleep(300)  # sleep for 5 minutes
Meanwhile, when something goes wrong in streamer.py, end it from there by invoking sys.exit(1).
Make sure this script and streamer.py are in the same directory.
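A slightly more careful variant of the same loop, sketched with subprocess, restarts only when streamer.py exits with a non-zero status (the max_restarts parameter is an extra safety knob, not part of the original answer):

```python
import subprocess
import sys
import time

def watch(cmd, delay=300, max_restarts=None):
    # Re-run cmd until it exits cleanly (code 0), pausing between restarts.
    restarts = 0
    while True:
        code = subprocess.call(cmd)
        if code == 0:
            break  # clean exit: stop watching
        restarts += 1
        if max_restarts is not None and restarts >= max_restarts:
            break
        time.sleep(delay)
    return restarts

# usage: watch([sys.executable, 'streamer.py'])
```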
I need to run multiple programs one after the other and they each run in a console window.
I want the console window to be visible, but a new window is created for each program. This is annoying because each window opens in a different position from where the previous one closed, and it steals focus while I'm working in Eclipse.
This is the initial code I was using:
def runCommand(self, cmd, instream=None, outstream=None, errstream=None):
    proc = subprocess.Popen(cmd, stdin=instream, stdout=outstream, stderr=errstream)
    while True:
        retcode = proc.poll()
        if retcode is None:
            if mAbortBuild:
                proc.terminate()
                return False
            else:
                time.sleep(1)
        else:
            if retcode == 0:
                return True
            else:
                return False
I switched to opening a command prompt with 'cmd' when calling subprocess.Popen, and then calling proc.stdin.write(b'program.exe\r\n').
This seems to solve the one-command-window problem, but now I can't tell when the first program is done so that I can start the second. I want to stop and interrogate the log file from the first program before running the second one.
Any tips on how I can achieve this? Is there another option for running the programs in one window I haven't found yet?
Since you're using Windows, you could just create a batch file listing each program you want to run which will all execute in a single console window. Since it's a batch script you can do things like put conditional statements in it as shown in the example.
import os
import subprocess
import textwrap

# create a batch file with some commands in it
batch_filename = 'commands.bat'
with open(batch_filename, "wt") as batchfile:
    batchfile.write(textwrap.dedent("""
        python hello.py
        if errorlevel 1 (
            @echo non-zero exit code: %errorlevel% - terminating
            exit
        )
        time /t
        date /t
        """))

# execute the batch file as a separate process and echo its output
kwargs = dict(stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
              universal_newlines=True)
with subprocess.Popen(batch_filename, **kwargs).stdout as output:
    for line in output:
        print(line, end='')

try: os.remove(batch_filename)  # clean up
except os.error: pass
In section 17.5.3.1 (Constants) of the subprocess module documentation there's a description of the subprocess.CREATE_NEW_CONSOLE constant:
The new process has a new console, instead of inheriting its parent’s
console (the default).
As we see, by default, new process inherits its parent's console. The reason you observe multiple consoles being opened is the fact that you call your scripts from within Eclipse, which itself does not have console so each subprocess creates its own console as there's no console it could inherit. If someone would like to simulate this behavior it's enough to run Python script which creates subprocesses using pythonw.exe instead of python.exe. The difference between the two is that the former does not open a console whereas the latter does.
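The default can also be overridden per child: passing creationflags=subprocess.CREATE_NEW_CONSOLE to Popen gives that child its own console. A small sketch (the constant only exists on Windows, so we fall back to 0, i.e. "inherit", elsewhere):

```python
import subprocess
import sys

# CREATE_NEW_CONSOLE is defined only on Windows; 0 means "inherit the console".
flags = getattr(subprocess, 'CREATE_NEW_CONSOLE', 0)

proc = subprocess.Popen([sys.executable, '-c', "print('child')"],
                        creationflags=flags)
proc.wait()
```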
The solution is to have helper script — let's call it launcher — which, by default, creates console and runs your programs in subprocesses. This way each program inherits one and the same console from its parent — the launcher. To run programs sequentially we use Popen.wait() method.
--- script_run_from_eclipse.py ---
import subprocess
import sys
subprocess.Popen([sys.executable, 'helper.py'])
--- helper.py ---
import subprocess

programs = ['first_program.exe', 'second_program.exe']
for program in programs:
    subprocess.Popen([program]).wait()
    if input('Do you want to continue? (y/n): ').upper() == 'N':
        break