How do I run 100 Python files 2 minutes apart?

I want to run around 100 Python scripts every day, 2 minutes apart, to make sure that none of them overlap. I am using a Linux/Mac system.
Is there a dynamic way of doing this using crontab? Or is there perhaps a scheduler program that would make it easier?

I think the easiest way would be to develop your own Python script to manage this, using the time module to add the delay:
import time
time.sleep(120)  # 2 minute delay
To call each script from the managing script, read the file and execute it (Python 3):
exec(open('file.py').read())  # runs file.py in the current process
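If you would rather not run the files inside your own process, here is a rough sketch of the same idea using subprocess instead of exec, assuming (hypothetically) that all 100 scripts sit in one /path/to/scripts directory:
import glob
import subprocess
import sys
import time
scripts = sorted(glob.glob('/path/to/scripts/*.py'))  # hypothetical location of the 100 files
for i, script in enumerate(scripts):
    # Popen returns immediately, so each script starts 2 minutes after the previous one.
    subprocess.Popen([sys.executable, script])
    if i < len(scripts) - 1:
        time.sleep(120)  # 2 minute gap before the next launch
Swapping Popen for subprocess.run would instead wait for each script to finish before the pause, if avoiding overlap is the real concern.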

Is the goal here to have the files execute exactly two minutes apart, or are you hoping to have one job finish just before the next one executes?
If you are solely concerned with script1 finishing before script2 starts, crontab can do this in a single cron job with a command like this:
script1 && script2
There is a decent example of this in the comments of this Reddit post.
If you do in fact want the scripts to execute exactly 2 minutes apart, maybe you could approach it by setting a specific execution time for each one (see the sketch just below this answer). Of course this may be somewhat sensitive to failures and is not the most robust method, so adding some sort of event listener, etc. might be a better option.
Hope this helps a little bit!
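If you do go the fixed-start-time route, one option is to generate the 100 staggered crontab lines with a small helper and paste them in via crontab -e. This is only a sketch: the script directory, daily start time, and interpreter path are all assumptions.
import glob
scripts = sorted(glob.glob('/path/to/scripts/*.py'))  # assumed location of the 100 files
start_hour, start_minute = 3, 0                       # assumed daily start time of 03:00
for i, script in enumerate(scripts):
    total = start_hour * 60 + start_minute + 2 * i    # stagger each entry by 2 minutes
    print(f'{total % 60} {(total // 60) % 24} * * * /usr/bin/python3 {script}')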

Edit: This does not answer the OP's question, but I will leave it up here in case anyone else misinterpreted the question title and came searching for the corresponding answer.
I would write a (most likely bash) script that executes all 100 scripts, and then call that one script with crontab. The cron line for every 2 minutes is as follows:
*/2 * * * * <file>
Here is a bash one-liner that runs all Python scripts in a directory, assuming all 100 scripts are in the same directory (taken from here):
for f in *.py; do python "$f"; done

Related

How to automate my Selenium web scraper?

I was able to write a scraper using Selenium. Right now I'm trying to automate it so it runs periodically on a server and I don't have to bother running it from my local machine. I did a lot of googling but got no clue how to do that. Can anyone simplify things for me?
To run a Python script periodically on a Linux server you can create a cron job. Since you already have a Python script, which presumably fetches or scrapes data and saves it to a file, you can make a cron job and set the exact time it has to run, say for instance every 2 hours. You can do it using something like this:
crontab -e
This will open an editor in your terminal; at the bottom of the file, write the timing fields and the command to be executed:
0 */2 * * * /path/to/python /path/to/your/code.py
From this link you can work out how to fill in the five time fields: https://crontab.guru/#*_*_*_*_1
If you need any more help with cron jobs, take a look at this: https://www.geeksforgeeks.org/scheduling-python-scripts-on-linux/
You can simply recreate your script on PythonAnywhere and schedule it as a task, choosing the frequency at which you want the script to be executed. The current frequency options are: always, hourly, or daily.
I don't know if it'll really work, but:
import time
while True:
    # whole code here
    time.sleep(period_required)  # period in seconds
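If you do take the loop approach, it is worth guarding each run so a single Selenium failure does not kill the whole process. A rough sketch, where run_scraper and the interval are placeholders for your own code:
import time
import traceback
PERIOD_SECONDS = 2 * 60 * 60        # assumed interval, e.g. every 2 hours
def run_scraper():
    ...                             # your existing Selenium scraping code goes here
while True:
    try:
        run_scraper()
    except Exception:
        traceback.print_exc()       # log the failure but keep the loop alive
    time.sleep(PERIOD_SECONDS)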

Delay the execution of a Python script on Anaconda prompt?

I am running some scientific experiments that take a lot of time. Everything is written in Python 3 and I use the Anaconda Command Prompt to launch the Python scripts. I am working on Windows.
Right now, an experiment is running. I want to execute the next experiment as soon as the current one is finished; however, I will not be near this computer when that happens. Is there a way to execute a Python script with, say, a 4 hour delay so I do not waste a night of precious computation time?
Potentially, adding a long sleep statement in my main python script could do the trick but I was wondering if any of you has a more elegant solution to the problem.
There is a way with the Windows Task Scheduler; you should see the following:
https://www.youtube.com/watch?v=n2Cr_YRQk7o
When you set the trigger, set it as you like (in 4 hours).
While sleep could be a dirty workaround, you could also make use of the built-in task scheduler of Windows, as XtoR stated.
You could also call the other script at the end of your current one, by inserting the following bit of code into the first script:
import subprocess
proc = subprocess.Popen([path_to_python_executable, 'path_to_second_script'])
Personally I'm predisposed towards writing a quick wrapper script.
import subprocess
import sys
# We're just providing the Python path here. Make sure to change it according to your system settings.
python_path = r'C:\Python\python.exe'
# Here we specify the scripts you want to run.
run_script_one = r'C:\Path_to_first_script.py'
run_script_two = r'C:\Path_to_second_script.py'
# subprocess.call blocks until each script finishes, so the second one
# only starts after the first has completed.
subprocess.call([python_path, run_script_one])
subprocess.call([python_path, run_script_two])
sys.exit(0)
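For completeness, the plain sleep-then-run variant mentioned in the question would look roughly like this; the delay and script path are assumptions, and sys.executable reuses whichever Python the Anaconda Prompt started this wrapper with:
import subprocess
import sys
import time
DELAY_SECONDS = 4 * 60 * 60                     # assumed 4 hour delay
NEXT_SCRIPT = r'C:\path_to_second_script.py'    # hypothetical path to the next experiment
time.sleep(DELAY_SECONDS)
subprocess.call([sys.executable, NEXT_SCRIPT])  # blocks until the experiment finishes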

How can I get Python code to run over and over again?

I have a scraper that scrapes data from a website, then saves the data in .csv files. What I am looking for is a way to run this code every 10 minutes without using a loop. I have very little knowledge of how to do this. What approach would you use?
Are you using Windows or a Unix-based system?
If you're using Unix, you can schedule jobs to run at regular intervals with cron.
Execute the following in a terminal window:
# edit the crontab
crontab -e
Then add a line with the cron timing fields followed by the command you want to execute; for every 10 minutes that is:
*/10 * * * * /path/to/desired/Python_version /path/to/executable/file
It's impossible to repeat code without a loop.
I can only think that you want something like Task Scheduling on Windows (https://msdn.microsoft.com/pt-br/library/windows/desktop/aa383614.aspx) or crontab/timers on Linux.
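On Windows you can also register such a task from the command line rather than the GUI. Here is a sketch using schtasks driven from Python; the task name and both paths are made up, and paths containing spaces may need extra quoting:
import subprocess
subprocess.run([
    'schtasks', '/Create',
    '/SC', 'MINUTE', '/MO', '10',                     # run every 10 minutes
    '/TN', 'ScrapeEvery10Minutes',                    # hypothetical task name
    '/TR', r'C:\Python\python.exe C:\path\to\scraper.py',
])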

Python: permanent execution, program date dependent

I wrote a script.py that collects data from the web from Monday to Friday. The script is usually executed from another script in the main function. I want it to stop on Friday and start again on Monday automatically, then run from Monday to Friday.
At the moment I have to start it manually every Monday.
I wrote some code to stop it automatically on Fridays. Basically it looks like this:
import sys
from time import strftime, gmtime
if strftime("%a %H:%M", gmtime()) != "Fri 20:00":
    ...code...
else:
    sys.exit()
How do I run the main script permanently and open the other script automatically when needed? Help me improve this please, thanks.
EDIT: Actually I will reformulate the question:
Is there a proper way to run a script permanently, besides doing:
while 1 != 0:
    ...code here
Do you have any option to automatically schedule the program to start on Mondays, with cron or the Windows Task Scheduler?
Alternately, you could write a separate program that runs permanently and controls the startup and/or shutdown of script.py; a sketch of that idea follows at the end of this answer.
For your reformulated question, there's nothing wrong with using
while True:
    # do things in a loop forever
if that is indeed how the code needs to run.
It is avoidable if you want: you could restructure your code so that it does not need to run in an infinite loop. But there is no magic way to have a script 'keep running forever' without using some loop construct.
Though I'm wondering: do you not like
while 1 != 0:
because it looks a bit silly to say 1 != 0?
while True:
is a perfectly neat alternative.
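A rough sketch of that separate controlling program, assuming the worker is called script.py and should only be (re)started inside a Monday-to-Friday window:
import subprocess
import sys
import time
from datetime import datetime
while True:
    now = datetime.now()
    if now.weekday() < 5 and now.hour < 20:            # Mon(0)-Fri(4), before 20:00
        # Blocks until script.py exits (e.g. when its own Friday 20:00 check fires).
        subprocess.call([sys.executable, 'script.py'])
    time.sleep(300)                                    # idle for a while, then check again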

Running a Python script indefinitely (as a process, pretty much)

I have tests that I run which can take up to 15 minutes at a time. During these 15 minutes, a log file is periodically written to. However, most of the content is useless.
In response to this I have a Python script that parses out the useless text and displays the relevant data.
What I'm trying to achieve is similar to tail -f log_file: constantly updating the terminal with the newest additions to a file. I was thinking that if a Python script ran as a process, it could parse the log file whenever the tests write to it, then go back to sleep until the log file is written to again.
Any ideas how one can achieve this?
I already have a script that does the parsing; I just don't know how to make it do so continually and efficiently.
You could just have the script filter standard input, and pipe tail -f through it. When you're waiting on stdin, your script will sleep, so it's plenty efficient.
Eg.
python long_running_script.py & tail -f log_file | python filter_logs.py   # single &: the tests run in the background while tail follows the log
Your script can be something like
import sys
for line in sys.stdin:              # blocks until tail sends another line
    if filter_line(line):
        print(line, end='')         # the line already ends with a newline
Looks like you need something like pytailer:
http://code.google.com/p/pytailer/
While I never used it myself, the last example looks like what you want.
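If you would rather avoid the extra dependency, the core of such a follower is just a seek-to-end-and-poll loop. A minimal sketch; filter_line and the log file name stand in for your existing parsing code:
import time
def follow(path):
    with open(path) as f:
        f.seek(0, 2)                 # 2 = SEEK_END: start at the current end of the file
        while True:
            line = f.readline()
            if line:
                yield line
            else:
                time.sleep(0.5)      # nothing new yet, poll again shortly
# for line in follow('log_file'):
#     if filter_line(line):
#         print(line, end='')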
any ideas how one can achieve this?
This should be pretty easy to do. Most of what you want is already part of your OS.
python test.py | python log_parser.py
Be sure your tests write their log to stdout instead of some other file. This is often easy to do with small changes to the logging configuration.
Having implemented almost this exact tool, I had great success using the inotify capability in twisted
