scheduler for running python scripts - python

I have a Python script which installs a build from a server to a node.
Now I want to schedule this task for a specific time so that it triggers automatically.
One way is to use the Windows scheduler to trigger a batch file which internally runs the Python script.
Does Python have built-in support for this?
It should be OS independent and run on both the Windows and Linux platforms.
Any ideas?

I've used this lightweight scheduler before to handle cron tasks -> https://github.com/mrhwick/schedule
I'd highly recommend it.
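For example, a minimal sketch using that package's schedule.every(...) API (the install_build function and the 02:00 time are placeholders for your own script and schedule):

import schedule
import time

def install_build():
    # Placeholder for the code that installs the build on the node
    print("installing build...")

schedule.every().day.at("02:00").do(install_build)  # run once a day at 02:00

while True:
    schedule.run_pending()
    time.sleep(60)

Note that this loop itself still has to be kept running (started at boot, by cron, or by the Task Scheduler), which is what the answers below get into.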

Related

How do I keep my python backend scripts running forever on Microsoft Windows?

I want to keep my Python scripts running forever on Windows Server 2012.
I tried using the MS Windows Task Scheduler, but it keeps creating new instances of the script every time and hence fills up my memory. Currently I run my scripts through the command prompt and keep them minimized, and I never log out of the server.
Any help is really appreciated.
You could use sc create (https://learn.microsoft.com/en-US/windows-server/administration/windows-commands/sc-create) to create a service, then use Scheduled Tasks to control it.
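For illustration, a hedged sketch of the sc create call (the service name and executable path are placeholders, and the executable has to behave as a Windows service, e.g. a pywin32-based wrapper around your script):

sc create MyPythonBackend binPath= "C:\scripts\my_backend_service.exe" start= auto
sc start MyPythonBackend

The trailing space after binPath= and start= is required by sc.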

Is there a way to stop and restart a self hosted Python script using Github Actions?

I have a project in which one of the tests consists of running a process indefinitely in order to collect data on the program execution.
It's a Python script that runs locally on a Linux machine, but I'd like other people on my team to have access to it as well, because there are specific moments where the process needs to be restarted.
Is there a way to set up a workflow on this machine that when dispatched, stops and restarts the process?
You can execute commands on your Linux host via GH Actions and SSH. Take a look at this action.

Cron job which will work under Linux and Windows

I've written a Python script, python_script.py, on Linux. Is there a way to set up a cron job which will be compatible with both Linux and Windows? In fact, even though I have implemented this script on Linux, it will be run as a cron-like job on Windows.
Otherwise, assume the script works well on Linux and Windows. How could we create an automatic task on Windows (similar to a cron job on Linux)?
Create an automatic task using the Task Scheduler in Windows: create a .bat file to run the Python script and schedule that task from the Windows scheduler. Hope this helps.
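A minimal sketch of that .bat wrapper and the scheduling command (all paths, the task name, and the 10:30 start time are assumptions):

run_python_script.bat:

@echo off
rem Hypothetical wrapper that runs the script with your Python interpreter
C:\Python27\python.exe C:\scripts\python_script.py

Then register it with the Task Scheduler from a command prompt:

schtasks /create /tn "RunPythonScript" /tr "C:\scripts\run_python_script.bat" /sc daily /st 10:30

On Linux the equivalent would simply be a crontab entry, so the script itself does not need to change.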

Scheduling tasks using Python's Schedule module

From the docs:
import schedule
import time

def job():
    print("I'm working...")

schedule.every(10).minutes.do(job)
schedule.every().hour.do(job)
schedule.every().day.at("10:30").do(job)

while True:
    schedule.run_pending()
    time.sleep(1)
I understand that while the program is running, it will call the function you tell it to run. What I don't understand is how you would go about making this an automated task that runs every day. Is the idea that you would call this from the command line and always leave that window open? If I shut off my computer, I would have to start it again, wouldn't I?
I feel there is something I am missing about creating an automated Python task in this case. I am in a Windows environment.
Here is the overview: running tasks as startup items means different things on each OS, and has nothing to do with Python specifically.
On Windows you could set it up as a Windows service, for example by wrapping your script with PyInstaller (which turns the script into an .exe file) and then running your.exe install --startup='auto' (a minimal sketch of this route is shown further down).
On Linux-based OSes you would need to check where to put the script, because the startup sequence has changed in the last few years. There are even management software packages to make this easier.
On macOS there are GUI tools for controlling startup services, as well as launchctl: http://www.macworld.com/article/2047747/take-control-of-startup-and-login-items.html
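As a hedged illustration of the Windows-service route, here is a minimal sketch that assumes the pywin32 package (win32serviceutil), which is what usually supplies the install --startup=auto command-line handling; the service name and the placeholder work inside the loop are assumptions, and PyInstaller would only be used afterwards to freeze this into an .exe:

import win32event
import win32service
import win32serviceutil

class ScheduleService(win32serviceutil.ServiceFramework):
    # Hypothetical service name / display name
    _svc_name_ = "PythonScheduleService"
    _svc_display_name_ = "Python Schedule Service"

    def __init__(self, args):
        win32serviceutil.ServiceFramework.__init__(self, args)
        self.stop_event = win32event.CreateEvent(None, 0, 0, None)

    def SvcStop(self):
        self.ReportServiceStatus(win32service.SERVICE_STOP_PENDING)
        win32event.SetEvent(self.stop_event)

    def SvcDoRun(self):
        # Wake up once a second; exit when the service is stopped
        while win32event.WaitForSingleObject(self.stop_event, 1000) == win32event.WAIT_TIMEOUT:
            pass  # placeholder: call schedule.run_pending() or your own job here

if __name__ == "__main__":
    # Supports: python this_script.py install --startup=auto / start / stop / remove
    win32serviceutil.HandleCommandLine(ScheduleService)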
You can take a look at the processes currently running on your computer by going to:
Windows: Task Manager (press Ctrl-Alt-Delete and select Task Manager)
(depending on your Windows version) click the Details tab. You will see the User name column be blank or show "System" if it's run as a system process.
Linux or Mac: in a terminal type ps -Al
Responding to comments:
System level - if nobody is logged in what is your computer doing? (your script?, web server?, protein folding?, dreaming of electric sheep?)
Yes, Python would be taking up resources each time you run a separate script. I have gigabytes of RAM, and Python takes <30 MB to run each script (depending on the size of the libraries, the size of the program, and whether it is I/O bound or CPU bound). Your system is currently running >100 processes and is able to run thousands. Don't worry about optimizing your program's footprint on the system until it's actually a problem.

Best way to make a Python scheduler

I am working with scrapy 0.20 on python 2.7
Question
What is the best Python scheduler?
My need
I need to run my spider, which is a Python script, every 3 hours.
What I have thought
I tried using the Windows scheduler features which come with Windows 7, and it works well. I am able to run a Python script every 3 hours, but I may deploy my Python script on a Linux server, so I may not be able to use this option.
I created a Java application using Quartz Scheduler. It works well, but this is a third-party library, which my manager may refuse.
I created a Windows service and made it fire the script every three hours. It works, but I may deploy my Python script on a Linux server, so I may not be able to use this option.
I am asking about the best practice for firing a Python script periodically.
I tried using the Windows scheduler features which come with Windows 7, and it works well.
So that works fine for you already. Good, no need to change your script to do scheduling work yourself.
but I may deploy my Python script on a Linux server, so I may not be able to use this option.
On Linux, you can use cron jobs to achieve this.
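For example, a crontab entry along these lines would fire the spider every 3 hours (the interpreter and script paths are assumptions):

# Hypothetical crontab line: minute 0 of every 3rd hour
0 */3 * * * /usr/bin/python /home/user/my_spider.py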
The other way would be to simply keep your script running the whole time, but pause it for the three hours in which you are doing nothing. So you don’t need to set up anything on the target machine, but just need to run the script in the background, and it will keep running and doing its job.
This is exactly how job schedulers work btw. They are launched early when the operating system starts, and then they just keep running forever and every short time interval (a minute or so) they check if there is any job on their list that needs to run now. And if that’s the case, they spawn a new process and run the job.
So if you wanted to make such a scheduler in Python, you would just keep it running forever, and once every time interval (in your case 3 hours because you only have a single job anyway), you start your job. That can be in a separate process, in a separate thread, or indirectly in a separate thread using asynchronous functions.
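A minimal sketch of that keep-it-running approach, assuming the spider is launched as a subprocess (the scrapy crawl myspider command is a placeholder for whatever your job actually is):

import subprocess
import time

INTERVAL = 3 * 60 * 60  # three hours, in seconds

while True:
    # Run the job in a separate process, then sleep until the next slot
    subprocess.call(["scrapy", "crawl", "myspider"])
    time.sleep(INTERVAL)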
The best way to deploy/schedule your Scrapy project is to use the scrapyd server.
You should install scrapyd.
sudo apt-get install scrapyd
You change your project config file to something like this:
[deploy:somename]
url = http://localhost:6800/   ## this is the default
project = scrapy_project
You deploy your project to the scrapyd server:
scrapy deploy somename
You change your poll interval in /etc/scrapyd/conf.d/default-000 to 3 hours (the default is 5 seconds):
poll_interval = 10800
You schedule your spider with something like:
curl http://localhost:6800/schedule.json -d project=scrapy_project -d spider=myspider
You can use the web service to monitor your jobs:
http://localhost:6800/
PS: I have only tested this under Ubuntu, so I am not sure whether a Windows version exists. If not, you can install a VM with Ubuntu to launch the spiders.
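If you prefer to trigger the run from Python instead of curl, the same schedule.json endpoint can be hit with the requests package (a small sketch, assuming the project and spider names from above):

import requests

# Schedule the spider via scrapyd's HTTP API (same call as the curl example)
response = requests.post(
    "http://localhost:6800/schedule.json",
    data={"project": "scrapy_project", "spider": "myspider"},
)
print(response.json())  # e.g. {"status": "ok", "jobid": "..."}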
Well, there's always the charming sched module (docs), which provides a generic scheduling interface.
Give it a time function and a sleep function, and it'll give you back a pretty nice and extensible scheduler.
It's not system-level, but if you can run it as a service, it should suffice.
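A small sketch of what that could look like for the every-3-hours case (the run_spider body and the interval are assumptions; sched only runs jobs that have been entered, so the job re-registers itself each time):

import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)
INTERVAL = 3 * 60 * 60  # three hours, in seconds

def run_spider():
    print("running the spider...")  # placeholder for the real job
    # Re-register so the job keeps repeating
    scheduler.enter(INTERVAL, 1, run_spider, ())

scheduler.enter(0, 1, run_spider, ())  # first run immediately
scheduler.run()  # blocks, running jobs as they come due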
