I have a script that collects data from the streaming API. I'm getting an error at random that I believe is coming from Twitter's end for whatever reason. It doesn't happen at a specific time; I've seen it as early as 10 minutes after running my script, and other times after 2 hours.
My question is: how do I create another script (outside the running one) that can detect whether the first one terminated with an error, and then restart it after a delay?
I did some searching and most results were about using bash on Linux, but I'm on Windows. Other suggestions were to use Windows Task Scheduler, but that can only be set for a known time.
I came across the following code:
import os, sys, time

def main():
    print("AutoRes is starting")
    executable = sys.executable
    args = sys.argv[:]
    args.insert(0, sys.executable)
    time.sleep(1)
    print("Respawning")
    os.execvp(executable, args)

if __name__ == "__main__":
    main()
If I'm not mistaken, that runs inside the script itself, correct? The issue with that is that my script is currently collecting data and I can't terminate it to edit it.
How about this?
from os import system
from time import sleep
while True:  # manually terminate when you want to stop streaming
    system('python streamer.py')
    sleep(300)  # sleep for 5 minutes
Meanwhile, when something goes wrong in streamer.py, end it from there by invoking sys.exit(1).
Make sure this script and streamer.py are in the same directory.
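A variant of that loop which only delays and restarts on failure: if streamer.py ends itself with sys.exit(1), the wrapper sees a non-zero exit code and restarts it, while a clean exit stops the loop. A minimal sketch (the supervise helper and its delay parameter are illustrative, not from the original answer):

```python
import subprocess
import sys
import time

def supervise(script, delay=300):
    """Re-run *script* with the current interpreter until it exits cleanly."""
    while True:
        code = subprocess.call([sys.executable, script])
        if code == 0:
            return  # clean exit: stop restarting
        print('%s exited with code %d, restarting in %d seconds'
              % (script, code, delay))
        time.sleep(delay)
```

Calling supervise('streamer.py') would then replace the unconditional while True loop.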
I am in a bit of a pickle here. I have a Python script (gather.py) that gathers information from an .xml file and uploads it into a database in an infinite loop that sleeps for 60 seconds; all of this is local. I am using Flask to run a webpage that will later pull information from the database, but at the moment all it does is display a sample page (main.py). I want main.py to start gather.py as a background process that won't prevent Flask from starting. I tried importing gather.py, but it halts the process indefinitely and Flask won't start.
After Googling for a while, it seems that the best option is to use a task queue (Celery) and a message broker (RabbitMQ). That would be fine if the application had to do a lot of work in the background, but I only need it to do one or two things. So I did more digging and found posts stating that subprocess.Popen() could do the job. I tried using it and I don't think it failed, since it didn't raise any errors, but the database is empty. I confirmed that both gather.py and main.py work independently. I tried running the following code in IDLE:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x049A1CF0>
Now, I don't know what this means. I tried using .value and .attrib, but understandably I get this:
AttributeError: 'Popen' object has no attribute 'value'
and
AttributeError: 'Popen' object has no attribute 'attrib'
Then I read in a StackOverflow post that stdout=subprocess.PIPE could cause the program to hang, so, in a 'just in case' moment, I ran:
subprocess.Popen([sys.executable, 'path\to\gather.py'], stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
and got this in return:
<subprocess.Popen object at 0x034A77D0>
Throughout all of this, the database tables have remained empty. I am new to the subprocess module, but all of this checks out and I can't figure out why it is not running gather.py. Is it because it has an infinite loop? If there is a better option, please let me know.
Python version: 3.4.4
PS: I don't know if it matters, but I am running a portable version of Python (PortableApps) on a Windows 10 PC. This is why I included sys.executable inside subprocess.Popen().
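For what it's worth, the &lt;subprocess.Popen object at 0x...&gt; line above is just the repr of the process handle, not an error. To find out whether the child actually ran and what it printed, check communicate() and returncode. A small sketch, using a throwaway child process as a stand-in for gather.py:

```python
import subprocess
import sys

# A stand-in child process; substitute the real path to gather.py.
proc = subprocess.Popen(
    [sys.executable, '-c', 'print("hello from child")'],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

out, _ = proc.communicate()            # wait for the child and read its output
print(out.decode().strip())            # whatever the child printed
print('exit code:', proc.returncode)   # 0 means it finished without error
```

Note that communicate() blocks until the child exits, so for a child with an infinite loop like gather.py, the non-blocking check is proc.poll().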
Solution 1 (all in python script):
Try using a Thread and a Queue. I do this:
from flask import Flask
from queue import Queue  # "Queue" on Python 2, "queue" on Python 3
from threading import Thread
import time

def task(q):
    q.put(0)
    t = time.time()
    while True:
        time.sleep(1)
        q.put(time.time() - t)

queue = Queue()
worker = Thread(target=task, args=(queue,))
worker.start()

app = Flask(__name__)

@app.route('/')
def message_from_queue():
    msg = "Running: calculate %f seconds" % queue.get()
    return msg

if __name__ == '__main__':
    app.run(host='0.0.0.0')
If you run this code, each access to '/' gets a value calculated by task in the background. Maybe you need to block until the task produces a value, but there isn't enough information in the question to say. Of course, you will need to refactor your gather.py to accept a queue like this.
Solution 2 (using a system script):
For Windows, create a .bat file and run both scripts from there:
@echo off
start python "path\to\gather.py"
set FLASK_APP=app.py
flask run
This will run gather.py and then start the Flask server. If you use start /min python "path\to\gather.py", gather.py will run minimized.
subprocess.Popen(['path\to\gather.py']) on its own will not open a Python script on Windows, because a .py file is a document rather than an executable; you have to pass the interpreter explicitly, as in subprocess.Popen([sys.executable, 'path\to\gather.py']).
You can also use:
os.system('python path\to\gather.py')
but it blocks until the script finishes, so it won't be a background process (depending on your script).
I have made a C# app which calls a Python script.
The C# app uses a Process object to call the Python script.
I have also redirected the sub-process standard output so I can process the output from the Python script.
But the problem is:
The output (via the print function) from Python always arrives all at once, when the script terminates.
I want the output to arrive in real time while the script is running.
I can say I have tried almost every method I could find on Google, like adding flushes of sys.stdout, redirecting stdout in Python, C# event-driven message receiving, or just using a while loop to wait for messages, etc.
I also wonder how Python IDEs like PyCharm run a script internally yet manage to print its output line by line without modifying the original script. How do they do that?
The python version is 2.7.
Any advice is appreciated. Thank you!
I ended up using a crude but working method to resolve it: a thread that periodically flushes sys.stdout. The code looks like this:
import sys
import threading
import time

run_thread = False

def flush_print():
    while run_thread:
        # print 'something'
        sys.stdout.flush()
        time.sleep(1)
and in the main function:
if __name__ == '__main__':
    thread = threading.Thread(target=flush_print)
    run_thread = True
    thread.start()
    # my big function with some prints; it blocks until completed
    run_thread = False
    thread.join()
Apparently this is ugly, but I had no better way to get it working.
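A simpler route, assuming you can change the call sites (or how the script is launched), is to flush immediately after each print; launching the interpreter with python -u disables the buffering without any code changes at all. A tiny sketch of the explicit flush (print_flush is an illustrative helper, not part of the original code):

```python
import sys

def print_flush(msg):
    """Print a line and push it through the pipe immediately."""
    print(msg)           # under Python 2 this would be: print msg
    sys.stdout.flush()   # without this, a redirected stdout buffers until exit

print_flush('progress: 1/3')
```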
I would like to have several scripts running on PythonAnywhere. In order to make sure that the scripts are not killed, I would like to check their status at five-minute intervals (based on https://help.pythonanywhere.com/pages/LongRunningTasks/).
Two questions arise:
1. In the script which runs every five minutes, I would like to check whether the other scripts (script2, script3) are still alive. If not, I would obviously like to restart them. But how do I run several scripts from one script (script1) without script1 getting "stuck"? I.e., how do I start two scripts at the same time from one script?
2. If I just try to run the script using import script2, I get this error:
ImportError: No module named script2
How do I tell Python that the script is in a different folder (because that has to be the issue)?
Thanks in advance!
Try this:
import time
import subprocess

def check_process(proc, path):
    if proc.poll() is None:  # poll() returns None while the child is still alive
        print('%s still running' % path)
        return proc
    else:  # any return code means the child has exited
        print('%s is dead. Re-running' % path)
        return subprocess.Popen(['python.exe', path])

script1 = subprocess.Popen(['python.exe', pathscript1])
script2 = subprocess.Popen(['python.exe', pathscript2])

while True:
    script1 = check_process(script1, pathscript1)
    script2 = check_process(script2, pathscript2)
    time.sleep(300)
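For the ImportError in question 2, the usual fix is to put the other script's folder on sys.path before importing it. A sketch, where /home/user/scripts is a hypothetical folder holding script2.py:

```python
import os
import sys

other_dir = '/home/user/scripts'   # hypothetical: the folder containing script2.py
if os.path.isdir(other_dir):
    sys.path.append(other_dir)     # make modules in that folder importable
    import script2                 # "import script2" now resolves
```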
I have multiple Python files to run. How would I launch all those files from one .py script? This is what I came up with, but it shows the screen session and doesn't really start the next shard until I exit out of it. Here's the code, not much:
import os

print("Launching Bot, just for you.")

print("Loading up shard 0")
try:
    os.system("screen python3.5 run_0.py > /dev/null")
except:
    print("Shard 0 failed")

print("Loading up shard 1")
try:
    os.system("screen python3.5 run_1.py > /dev/null")
except:
    print("Shard 1 failed")

print("Done running shards...")
I did some research and the advice was to use subprocess, but when I used it, it didn't run my command properly (I don't have a copy of that code, I lost it).
The problem is that I want to run the first Python script, and it works fine, but I have to close its screen session to start the other one. I just want it to run the commands without showing the output. Can you help?
You should use import subprocess in a Python file. You can then start instances of other programs with:
subprocess.Popen([sys.executable, "newprogram.py"])
You can mix that with the multiprocessing package to launch one process per program:
p = multiprocessing.Process(target=mp_worker, args=())
p.start()
where mp_worker launches the other program.
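Putting those suggestions together for the shard question above: a sketch that launches both run_*.py shards with Popen, which returns immediately, so neither shard blocks the launcher and their output is discarded. The file names are the ones from the question:

```python
import subprocess
import sys

shards = ['run_0.py', 'run_1.py']  # shard scripts from the question
procs = []
for script in shards:
    print('Loading up %s' % script)
    # Popen returns at once; DEVNULL hides the shard's output.
    procs.append(subprocess.Popen([sys.executable, script],
                                  stdout=subprocess.DEVNULL,
                                  stderr=subprocess.DEVNULL))
print('Done launching %d shards...' % len(procs))
```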
I have a Python script, say test.py, which interacts with the user every time it is executed.
test.py
#!/usr/bin/env python3
def foo():
    a = input()
    print("Hello, {}".format(a))

if __name__ == '__main__':
    foo()
I want to run this script every 5 minutes. So I tried adding it to my crontab, but this doesn't work as expected, since I am never prompted to enter an input.
I have tested my crontab and it is working fine.
Should I use libraries such as schedule for this task?
What would be the ideal way of going about in this scenario?
NOTE: I want a cross platform solution which works on MacOSX and GNU/Linux.
You could try the Python module pexpect (http://pexpect.sourceforge.net/doc/). The usage is something like:
import pexpect

child = pexpect.spawn('python3 test.py')  # the script from the question
child.expect('Here goes the prompt you would expect')
child.sendline('What you want to send to it')
It's not possible to have cron run your script interactively on your server.
Decouple your application instead: set up a database (like MySQL), have test.py run via cron every 5 minutes to check whether a new entry has been made in the database, and then have a local application that saves the user's input to that database.
If it is running via cron or schedule, it will not be running in your terminal; it will run as a background process. Therefore, it cannot be interactive and you will not be prompted.
If you just want a delay of 5 minutes, have you looked at the Linux sleep command or Python's time.sleep? That way you can keep a terminal open and just have the job run at intervals.
To keep it simple:
#!/usr/bin/env python3
import time

def foo():
    a = input()
    print("Hello, {}".format(a))

if __name__ == '__main__':
    while True:
        foo()
        time.sleep(300)  # sleeps 300 seconds
You can write a wrapper script that feeds the input via a heredoc and use that in the crontab:
#!/bin/bash
/usr/bin/python3 /home/user/project/test.py << EOF
some input here
EOF
Your crontab entry to run it every 5 minutes will look like:
*/5 * * * * /home/user/project/test.sh >> /home/user/project/LOG 2>&1