I've written a Python script, python_script.py, on Linux. Is there a way to create a cron job that is compatible with both Linux and Windows? Even though I implemented the script on Linux, it will be run as a scheduled job on Windows.
In any case, assume the script works well on both Linux and Windows. How could we create an automatic task on Windows (similar to a cron job on Linux)?
Create an automatic task using the Task Scheduler on Windows: create a .bat file that runs the Python script, then schedule that file as a task from the Windows Task Scheduler. Hope this helps.
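As a rough sketch (paths, the task name, and the schedule are illustrative), the .bat file could contain just the interpreter call, and the task can be registered from the command line with schtasks:

REM run_script.bat -- wrapper that Task Scheduler will call (paths are illustrative)
"C:\Python39\python.exe" "C:\scripts\python_script.py"

REM register the task to run daily at 09:00 (the task name is illustrative)
schtasks /Create /SC DAILY /ST 09:00 /TN "RunPythonScript" /TR "C:\scripts\run_script.bat"

Alternatively, you can create the task interactively through the Task Scheduler GUI.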
So far I simply open two sessions in the macOS Terminal (I have two bots so far) and run the bots.
Is there a better way to keep those bots alive? Is this the way most people normally do it?
As it stands, once a bot is started, I just leave the session open so it can listen for requests.
Thanks all.
Way 1: Run inside a Docker container
You need to install Docker Desktop on your Mac and run each Python script in a separate container (a minimal Dockerfile sketch follows the pros and cons below).
Pros:
Modern and correct way to run background processes
The running environment is independent of your local environment
Cons:
You need to install docker
You need to know/learn docker
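As a minimal sketch of Way 1 (the base image, dependency, and script name are illustrative), a Dockerfile for one bot could look like this:

# Dockerfile -- one container per bot (names are illustrative)
FROM python:3.11-slim
WORKDIR /app
COPY Telegram_Bot_WSFG1.0.py .
RUN pip install python-telegram-bot  # install whatever your bot actually depends on
CMD ["python", "Telegram_Bot_WSFG1.0.py"]

Build it with docker build -t bot1 . and run it in the background with docker run -d --restart unless-stopped bot1.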
Way 2: Run as a daemon process inside a Linux virtual machine (VM)
You need to install a hypervisor (VirtualBox, for example) and install Linux on it. After that, use it like a server for your Python scripts.
Pros:
Same as Way 1
Cons:
You need to install a hypervisor
You need to know Linux and how to daemonize a script (using systemd, for example; a minimal unit-file sketch follows this list)
A VM needs more resources than Docker
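As a minimal sketch of Way 2 (paths and names are illustrative), a systemd unit file inside the VM could look like this:

# /etc/systemd/system/telegram-bot.service -- illustrative unit file
[Unit]
Description=Telegram bot
After=network.target

[Service]
ExecStart=/usr/bin/python3 /opt/bots/Telegram_Bot_WSFG1.0.py
Restart=always

[Install]
WantedBy=multi-user.target

Enable and start it with systemctl enable --now telegram-bot, and check on it with systemctl status telegram-bot.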
Way 3: Run the process in the background, detached from the terminal
Just run nohup python3 /Users/ws_fingear/Documents/workspace/Telegram_Bot/Telegram_Bot_WSFG1.0.py > output.log &. The trailing & means the process will run in the background and keep running even if you close the terminal. Output will be written to the output.log file.
Pros:
Super simple
Cons:
A bad fit for a production environment
You can't control or stop the running process properly
I guess you're looking for Way 3.
Also, you can use syncthing on macOS to run the script as a daemon locally.
I recently learned that cron jobs on macOS will not run if the computer is off or asleep.
I had always assumed that cron jobs run regardless of whether the machine is on or off.
I have a crontab (with a couple of Python scripts) on Linux (Ubuntu) that is running on AWS (an EC2 instance). Will my cron jobs run regardless of whether my computer is on or off? I've always thought yes, because the crontab is running in the cloud (AWS) and not on my local machine. But now I'm not so sure.
Yes, crontab jobs should run as long as the computer they're set up on is running.
If you SSH from your laptop into a VM and set up a crontab job on the VM properly, it should run as long as the VM is on and running, even if your laptop is off.
You should be able to test this with a simple script that creates a file (or something similar), scheduled for 5 minutes in the future; then quickly turn off your laptop and check again in 10 minutes. If things are set up correctly, the file will have been created (or whatever you set the script to do will have happened) by the time you check, and you can continue confidently. If the file didn't appear, something is wrong with how the crontab job was set up.
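As a minimal sketch of such a test (the time and path are illustrative), add an entry like this with crontab -e on the VM, picking a time a few minutes in the future:

# create a marker file once at 14:35 (time and path are illustrative)
35 14 * * * touch /tmp/cron_test_marker

After the scheduled time has passed, running ls /tmp/cron_test_marker from a new SSH session tells you whether the job fired while your laptop was off.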
Coming from Linux, I have to build a Django application on Windows Server.
Everything works fine, but I have a Django command that needs to be run every hour.
On Linux, I would create a cron job.
I wrote a batch file that activates the virtual env and runs the Django command, and scheduled it with the Windows Task Scheduler, but the job fails, even though Task Scheduler says it executed properly.
What is simple on Linux seems to be rocket science on Windows...
I feel the Task Scheduler / batch-file approach isn't the right way to do this.
Is there a more Pythonic way to run a Django command on Windows Server every hour?
Thanks in advance,
Jan
Coming back late, but I solved it with a Django custom management command:
https://docs.djangoproject.com/en/3.1/howto/custom-management-commands/
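The answer doesn't include code, but a sketch of the shape of such a solution (file and command names are illustrative) is a long-running management command that does the work once per hour, so no external scheduler is involved:

# yourapp/management/commands/run_hourly.py -- illustrative sketch
import time
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Run the hourly job in a loop"

    def handle(self, *args, **options):
        while True:
            self.stdout.write("Running hourly job...")
            # call your actual logic here
            time.sleep(60 * 60)  # wait one hour between runs

You would start it once with python manage.py run_hourly (inside the virtual env) and leave it running.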
strophi
I have a Python script which installs a build from a server onto a node.
Now I want to schedule the task to a specific time, so that it triggers automatically.
One way is to use the Windows scheduler to trigger a batch file which, in turn, runs the Python script.
Does Python have built-in support for this?
It should be OS-independent and run on both Windows and Linux.
Any ideas?
I've used this lightweight scheduler before to handle cron tasks -> https://github.com/mrhwick/schedule
I'd highly recommend it.
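For reference, a minimal sketch using that library's documented API (pip install schedule; the job body and time are illustrative):

import time
import schedule

def install_build():
    print("Installing build on the node...")  # replace with your own logic

schedule.every().day.at("10:30").do(install_build)  # run daily at a specific time

while True:
    schedule.run_pending()
    time.sleep(60)  # check once a minute for due jobs

Because this is pure Python, the same script runs unchanged on both Windows and Linux.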
I am working with Scrapy 0.20 on Python 2.7.
Question
What is the best Python scheduler?
My need
I need to run my spider, which is a Python script, every 3 hours.
What I have thought
I tried the scheduler features that come with Windows 7, and they work well. I am able to run a Python script every 3 hours, but I may deploy my script on a Linux server, so I may not be able to use this option.
I created a Java application using Quartz Scheduler. It works well, but it is a third-party library, which my manager may refuse.
I created a Windows service and made it fire the script every three hours. It works, but again, I may deploy my script on a Linux server, so I may not be able to use this option.
I am asking about the best practice for firing a Python script on a schedule.
"I tried the scheduler features that come with Windows 7, and they work well."
So that works fine for you already. Good, no need to change your script to do scheduling work yourself.
"but I may deploy my script on a Linux server, so I may not be able to use this option."
On Linux, you can use cron jobs to achieve this.
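For example, a crontab entry like this (paths and the spider name are illustrative) runs a spider every 3 hours:

# run the spider at minute 0 of every 3rd hour (paths are illustrative)
0 */3 * * * cd /home/me/scrapy_project && scrapy crawl myspider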
The other way would be to simply keep your script running the whole time, but pause it for the three hours in which you are doing nothing. So you don’t need to set up anything on the target machine, but just need to run the script in the background, and it will keep running and doing its job.
This is exactly how job schedulers work btw. They are launched early when the operating system starts, and then they just keep running forever and every short time interval (a minute or so) they check if there is any job on their list that needs to run now. And if that’s the case, they spawn a new process and run the job.
So if you wanted to make such a scheduler in Python, you would just keep it running forever, and once every time interval (in your case 3 hours because you only have a single job anyway), you start your job. That can be in a separate process, in a separate thread, or indirectly in a separate thread using asynchronous functions.
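A minimal sketch of that idea (the spider command is illustrative; subprocess.call is used because the question targets Python 2.7):

import subprocess
import time

INTERVAL = 3 * 60 * 60  # three hours, in seconds

while True:
    # spawn the job as a separate process, then sleep until the next run
    subprocess.call(["scrapy", "crawl", "myspider"])
    time.sleep(INTERVAL)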
The best way to deploy and schedule your Scrapy project is to use the scrapyd server.
You should install scrapyd.
sudo apt-get install scrapyd
Change your project config file (scrapy.cfg) to something like this:
[deploy:somename]
url = http://localhost:6800/ ## this is the default
project = scrapy_project
Deploy your project to the scrapyd server:
scrapy deploy somename
Change your poll interval in /etc/scrapyd/conf.d/default-000 to 3 hours (the default is 5 seconds):
poll_interval = 10800
Schedule your spider with something like:
curl http://localhost:6800/schedule.json -d project=scrapy_project -d spider=myspider
You can use the web service to monitor your jobs:
http://localhost:6800/
PS: I have only tested this under Ubuntu, so I am not sure a Windows version exists. If not, you can install a VM with Ubuntu to launch the spiders.
Well, there's always the charming sched module (docs), which provides a generic scheduling interface.
Give it a time function and a sleep function, and it'll give you back a pretty nice and extensible scheduler.
It's not system-level, but if you can run it as a service, it should suffice.
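For reference, a minimal sketch with the standard-library sched module that re-schedules itself every 3 hours (the job body is illustrative):

import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)  # a time function and a sleep function

def job():
    print("Running the spider...")  # replace with your own logic
    scheduler.enter(3 * 60 * 60, 1, job)  # queue the next run in 3 hours

scheduler.enter(0, 1, job)  # queue the first run immediately
scheduler.run()             # blocks, executing jobs as they come due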