Delay the execution of a Python script on Anaconda prompt? - python

I am running some scientific experiments that take a lot of time. Everything is written in Python 3 and I use the Anaconda Command Prompt to activate the Python scripts. I am working on Windows.
Right now, an experiment is running. I want to execute the next experiment as soon as the current one is finished; however, I will not be near this computer when that happens. Is there a way to execute a Python script with, say, a 4 hour delay so I do not waste a night of precious computation time?
Potentially, adding a long sleep statement to my main Python script could do the trick, but I was wondering if any of you have a more elegant solution to the problem.

There is a way to do this with the Windows Task Scheduler; the following video shows how:
https://www.youtube.com/watch?v=n2Cr_YRQk7o
When you set the trigger, set it as you like (e.g. in 4 hours).
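If you prefer the command line to the GUI, a one-off task can also be registered with schtasks. This is only a sketch: the task name, paths, and start time are placeholders, so double-check the flags with schtasks /? for your Windows version.
schtasks /create /tn "NextExperiment" /sc once /st 03:00 /tr "C:\Path\to\python.exe C:\Path\to\next_experiment.py"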

While sleep could be a dirty workaround, you could also make use of the built-in Task Scheduler of Windows, as XtoR stated.
You could also call the other script at the end of your current one by inserting the following bit of code into the first script.
import subprocess

# Launch the second script once the first one reaches this point.
process = subprocess.Popen([path_to_python_executable, 'path_to_second_script'])
Personally I'm predisposed towards writing a quick wrapper script.
import subprocess
import sys

# Path to the Python interpreter. Make sure to change this according to your system settings.
python_path = r'C:\Python\python.exe'

# Here we specify the scripts you want to run.
run_script_one = r'C:\Path_to_first_script.py'
run_script_two = r'C:\Path_to_second_script.py'

# subprocess.call blocks until each script has finished and returns its exit code.
subprocess.call([python_path, run_script_one])
subprocess.call([python_path, run_script_two])
sys.exit(0)
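To get the 4 hour delay the question asks for, the same wrapper can simply sleep before launching the next experiment. A minimal sketch, with the interpreter and script paths again as placeholders:
import subprocess
import time

python_path = r'C:\Python\python.exe'           # adjust to your system
next_experiment = r'C:\Path_to_next_script.py'  # placeholder path

time.sleep(4 * 60 * 60)  # wait roughly 4 hours
subprocess.call([python_path, next_experiment])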

Related

In Python, how can I run another Python script from the main script at the same time and close it when I stop the main script?

import subprocess
subprocess.Popen(['python', 'second_script.py'])
Does this open the second script and make them run concurrently? Also, will it close the second script if I stop the main one? If not, how can I do that?
You should do something like this
#! /usr/bin/env python3
import subprocess
pid = subprocess.Popen(['second-script']).pid
print(f'pid={pid}')
...
# do whatever you have to do here
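The question also asks how to stop the second script when the main one stops. One common way is to keep the Popen handle and terminate the child explicitly, for example in a try/finally block. A sketch, with 'second_script.py' as a placeholder:
import subprocess

proc = subprocess.Popen(['python', 'second_script.py'])
try:
    pass  # do whatever the main script has to do here
finally:
    # When the main script exits (normally or via Ctrl+C), stop the child too.
    proc.terminate()
    proc.wait()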
I don't clearly understand what you need.
If you want to use some functions that are coded in 'pythonScriptA.py' in another script 'pythonScriptB.py', you can import the first script:
from pythonScriptA import *
# use the imported functions here in ScriptB
If you want two scripts to run concurrently/in parallel, you should look into the multiprocessing library, or into any other library that allows Python code to be run in multiple processes or threads.
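A minimal multiprocessing sketch of that second option, assuming pythonScriptA exposes a main() function (the function name is just an assumption):
from multiprocessing import Process

import pythonScriptA

if __name__ == '__main__':
    # Run ScriptA's entry point in a separate process, concurrently with this script.
    worker = Process(target=pythonScriptA.main)  # main() is an assumed entry point
    worker.start()

    # ... do other work here ...

    worker.join()  # wait for ScriptA to finish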

How do I run 100 python files 2 minutes apart?

I want to run around 100 Python scripts 2 minutes apart every day to ensure that none of them overlap. I am using a Linux/Mac system.
Is there a dynamic way of doing this using crontab? Or perhaps is there a scheduler program that might make it easier?
I think the easiest way would be to develop your own python script to manage this, using the python time module to add the delay.
import time
time.sleep(120) # 2 minute delay.
To call each script, you can execute its contents like this:
exec(open('file.py').read())  # Python 3 equivalent of execfile('file.py')
Is the goal here to have the files execute exactly two minutes apart, or are you hoping to have one job just finish before the other executes?
If you are solely concerned with script1 finishing before script2, crontab should be able to do this in a single cron job with a command like this:
script1 && script2
There is a decent example of this in the comments of this Reddit post.
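For concreteness, such a chained cron entry might look roughly like this (the 6:00 start time and the paths are placeholders; adjust them to your setup):
0 6 * * * /usr/bin/python3 /path/to/script1.py && /usr/bin/python3 /path/to/script2.py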
If you do in fact want to have the scripts execute exactly 2 minutes apart, maybe you could approach this by setting a specific execution time for each script? Of course this may be somewhat sensitive to failures and not the most robust method, so adding some sort of event listener, etc. might be a better option.
Hope this helps a little bit!
Edit: This does not answer the OP's question, but I will leave it up here in case anyone else misinterpreted the question title and came searching for the corresponding answer.
I would write a (most likely bash) script that will execute all 100 scripts and then call that one script with crontab. The cron line for every 2 minutes is as follows:
*/2 * * * * <file>
Here is a bash script that runs all python scripts in a directory, assuming all 100 scripts are in the same directory (taken from here).
for f in *.py; do python "$f"; done
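If you would rather keep everything in Python, a similar loop with the 2 minute spacing built in could look like this sketch (assuming, as above, that all the scripts sit in one directory):
import glob
import subprocess
import time

for path in sorted(glob.glob('*.py')):
    subprocess.call(['python', path])  # run one script to completion
    time.sleep(120)                    # wait 2 minutes before starting the next one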

Automatically restart a Python program if it's killed

I run a Python Discord bot. I import some modules and have some events. Now and then, it seems like the script gets killed for some unknown reason, maybe because of an error/exception or some connection issue? I'm no Python expert, but I managed to get my bot working pretty well; I just don't exactly understand how it works under the hood (since the program does nothing besides waiting for events). Either way, I'd like it to restart automatically after it stops.
I use Windows 10 and just start my program either by double-clicking on it or through pythonw.exe if I don't want the window. What would be the best approach to verify that my program is still running (it doesn't have to be instant, the verification could be done every X minutes)? I thought of using a batch file or another Python script but I have no idea how to do such a thing.
Thanks for your help.
You can write another Python script (B) that calls your original script (A) using Popen from the subprocess module. In script B, wait for script A to finish; if A exits with a non-zero error code, relaunch it from B.
Here is an example python_code_B.py:
import subprocess

filename = 'my_python_code_A.py'

while True:
    # Be careful with .wait(): it blocks until the child process has finished
    # and returns its exit code.
    return_code = subprocess.Popen('python ' + filename, shell=True).wait()

    # If running 'my_python_code_A.py' produced an error, the while loop is
    # repeated; otherwise the program breaks out of the loop.
    if return_code != 0:
        continue
    else:
        break
This will generally work well on Unix / Windows systems. Tested on Win7/10 with latest code update.
Also, please run python_code_B.py from a 'real' terminal, which means running it from a command prompt or terminal window and not inside IDLE.
For the problem you stated, I prefer to use a Python subprocess call to rerun the script, or to use try blocks.
This might be helpful to you.
Check this sample try block code:
try:
    import xyz  # suppose this module does not exist or raises an error
except:
    pass  # move on to the next line of code and keep executing
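Along the same lines, a try-based restart loop inside the bot itself could look like this sketch (bot_main() is a hypothetical stand-in for whatever function starts your bot):
import time

def bot_main():
    ...  # hypothetical placeholder for the code that starts and runs the bot

while True:
    try:
        bot_main()
    except Exception as exc:
        # Log the failure and restart after a short pause instead of dying.
        print(f'Bot stopped with {exc!r}, restarting in 30 seconds...')
        time.sleep(30)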

hijacking terminal stdin from python

Is there a way in Python to hijack the terminal stdin? Unix-only solutions will do fine.
I'm currently writing a small wrapper around top, as I want to be able to monitor named processes, e.g. all running Python instances. Basically I'm calling pgrep to get process ids and then running top using the -p option.
Overall this script has worked satisfactorily for a few years now (well, with the caveat that top -p only accepts 20 pids...). However, I would now like to adjust the script to update the call to top if new processes matching the name pattern are born. This also works relatively nicely, but... any options set interactively in top get lost every time I update the pid list, by natural causes, since I stop and restart top. Therefore I would like to hijack the terminal stdin somehow, to be able to track whatever settings are in effect so I can reapply them after updating the pid list, or even hold off updating if necessary (e.g. if top is awaiting more instructions from the user).
Now perhaps what I'm trying to achieve is just silly and there are better ways to do it; if so, I'd highly appreciate enlightenment.
(The tag ps was used because the tag top does not exist and I'm too new here to define new tags; after all, the two utilities are related.)
thanks \p
What you are doing sounds like a bit of a hack. I would just write a Python script using psutil that does exactly what you want. Whatever information you are interested in, psutil should give it to you - and more.
Quick and dirty:
import psutil
import time
while True:
    processes = [p for p in psutil.process_iter() if 'python' in p.name()]
    for p in processes:
        # print out whatever information interests you
        print(
            p.pid,
            p.name(),
            p.cpu_percent(),
            p.io_counters().read_bytes,
            p.io_counters().write_bytes
        )
    time.sleep(10)
Link to Documentation: http://pythonhosted.org/psutil/
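If you do want to keep wrapping top itself, the pgrep-plus-restart step the question describes could look roughly like the following sketch, here using psutil in place of pgrep to collect the pids (top's -p flag and its 20 pid limit are as described in the question):
import subprocess

import psutil

# Collect the pids of all running python processes (the pgrep step).
pids = [str(p.pid) for p in psutil.process_iter() if 'python' in p.name()]

# top -p takes a comma-separated pid list; only the first 20 pids fit.
subprocess.call(['top', '-p', ','.join(pids[:20])])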

Python equivalent of IDL's stop and .reset

I'm relatively new to python but have a bit of experience using IDL. I was wondering if anyone knows if there are equivalent commands in python for IDL's stop and .reset commands.
If I'm running some IDL script I wrote that I put a stop command in, essentially what it does is stop the script there and give me access to the command line in the middle of the script. So I have access to all the functions and variables that I defined before the stop command, which I find really useful for debugging.
The .reset command I find extremely useful too. What it does is reset the IDL environment (clears all variables, functions, etc.). It's as if I closed that session and opened a new one, but without having to exit and restart IDL. I find that if I'm trying to debug a script I wrote, it's sometimes useful to start from scratch without having to restart IDL (or Python now). It would also be useful in Python to be able to un-import any modules I had previously imported.
Any help with these issues would be greatly appreciated.
Cheers
Related
Python Drop into REPL
Is it possible to go into ipython from code?
IPython (aside from being a far nicer REPL than the standard python interpreter) may do what you want:
from IPython.Shell import IPShellEmbed
start_shell = IPShellEmbed()
def times_2(x):
    return 2*x
a = 5
start_shell()
# now in IPython shell
# a -> 5
# times_2(a) -> 10
Note that any changes you make in the shell will not be sent back to the main python process on exit - if you set a = 10 in IPython (using the above example), a is still equal to 5 in the main python process.
edit: post on IPython-user mailing list where I first saw this technique.
stop sounds equivalent to use of the code module. .reset doesn't have an equivalent in Python short of gratuitous use of del.
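A small sketch of what that looks like with the code module (the variable names here are arbitrary):
import code

a = 5

def times_2(x):
    return 2 * x

# Drop into an interactive console that can see the current local variables,
# similar in spirit to IDL's stop.
code.interact(local=locals())

print('continuing after the interactive session')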
Use pdb, as in this short script. Run it at the command line, and the PDB prompt will magically appear allowing single stepping, evaluation of arbitrary expressions, etc.
#!/usr/bin/env python
import pdb

print(1)
print(2)
pdb.set_trace()  # execution pauses here and the pdb prompt appears
print(3)
print(4)
You could do %reset from within an IPython shell.
For stops, just add pdb breakpoints as mentioned.
pdb.set_trace() breaks out of the code but apparently does not allow you to ".continue" (the IDL command) from that point (from http://pythondammit.blogspot.fr/2012/04/equivalent-of-idls-stop-command.html).
An update to redacted's solution.
The interactive IPython is much more powerful and convenient than pdb.set_trace.
Try this script
from IPython import embed
a = 1
b = 2
print('before')
embed()
print('after')
c = 3
Put embed() where you want to interrupt.
Run the script and you will enter an interactive IPython shell, where you can view and modify the variables.
The script will continue after you exit the shell.
You probably just want to use a Python debugger for this.
Welcome to the Python community! I'm still learning, but imo Python's nicer than the Interactive Data Language.
Anyway, Ignacio's answer about using the code module looks like it may provide what you want, at least as far as a parallel to IDL's stop.
Another thing you may find useful is to go into Python's interactive mode and import your program. You can then interact with it by running functions, etc. (Admittedly, I'm no expert at this.) If you do this, you'll need a main() function in the file which drives the program. For example, you'd have something like:
import sys

def main():
    # do stuff
    return 0

if __name__ == '__main__':
    sys.exit(main())
instead of just:
# do stuff
This prevents the execution of the program when you pull it into the Python interpreter. For more, see Guido's article about main functions.
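For example, with a layout like the one above you could then load the file in an interactive session and drive it by hand (myprogram is a placeholder module name):
$ python
>>> import myprogram      # the __main__ guard keeps main() from running automatically
>>> myprogram.main()      # call the driver function manually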
