Coming from Linux, I have to build a Django application on Windows Server.
Everything works fine, but I have a Django command that needs to run every hour.
On Linux I would create a cron job.
I wrote a batch file that activates the virtual env and runs the Django command, and scheduled it with Windows Task Scheduler, but the job fails, even though Task Scheduler reports that it executed successfully.
What is simple on Linux seems to be rocket science on Windows...
I feel that the Task Scheduler / batch file approach isn't the right way to do this.
Is there a more Pythonic way to run a Django command on Windows Server every hour?
thanx in advance,
Jan
Coming back late, but I solved it with a Django custom management command:
https://docs.djangoproject.com/en/3.1/howto/custom-management-commands/
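A minimal sketch of such a command (the app name, command name, and task body below are placeholders, not part of the original answer):

# yourapp/management/commands/hourly_task.py  (hypothetical names)
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Work that should run once an hour"

    def handle(self, *args, **options):
        # put the actual hourly work here
        self.stdout.write(self.style.SUCCESS("hourly task finished"))

It can then be invoked with "python manage.py hourly_task" from Task Scheduler, cron, or whatever scheduler you end up using.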
strophi
Related
I'm developing a Django ERP application and want it to run without stopping. If we do runserver from a terminal, it may get stopped automatically at some point. So what I'm looking for is a way to tell the OS to run Django as it boots up. Would calling it from a Python subprocess, started by cron at startup, be a good idea?
I want to know if there's a way to have Windows Server 2019 automatically launch Django's web server. I also want the launch to be performed at startup and by SYSTEM.
I tried using batch scripts that launch manage.py from the venv's Python interpreter. When I launch the batch manually (i.e. double click) it works fine and dandy. But it appears that SYSTEM fails to run the script correctly when scheduling the task.
I made SYSTEM launch another script at startup (a simple Python script that creates a txt file from within its own venv) and it works.
If the Django launch script is launched by USER then it works.
The problem is with launching Django as SYSTEM. I've also tried Streamlit and the result is the same.
Do you have any ideas?
Sample batch script:
cd path\of\managepyfile\
C:\path_to_venv\Scripts\python manage.py runserver
We run a similar application (not Python), but one that uses a web server.
We have it set up as a task in Task Scheduler so that when the server starts up, it runs a PowerShell script that executes the command to start the web server.
Link to setup
Alternatively, you could use a web server like IIS, deploy the files to the www folder on the C drive, and run the site as an IIS service.
Setting it up on IIS can be a little tricky if you've never used IIS before. Happy to help out, as we have deployed our test access tool for one of our apps this way.
I recently learned that cron jobs in Mac OS terminal will not run if the computer is off/asleep.
I had always assumed that cron jobs run regardless of whether the machine is on or off.
I have a crontab (with a couple of Python scripts) on Linux (Ubuntu) that is running on AWS (an EC2 instance). Will my cron jobs run regardless of whether my computer is on or off? I've always thought yes, because the crontab is running on the cloud (AWS) and not on my local machine. But now I'm not so sure.
Yes, crontab jobs should run as long as the computer they're set up on is running.
If you SSH from your laptop into a VM and set up a crontab job on the VM properly, then it should run as long as the VM is on and running, even if your laptop is off.
You should be able to test this with a simple script that creates a file (or something similar), scheduled for 5 minutes in the future; then quickly turn off your laptop and check again in 10 minutes. If things are set up correctly, the file will be there (or whatever else you set the script to do will have happened) when you check. If it works, you can continue confidently. If it doesn't (the file didn't appear), then something is wrong with how the crontab job was set up.
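As a rough sketch of such a test (the file path and the crontab schedule below are arbitrary placeholders):

# cron_test.py - writes a timestamped file so you can tell whether the job ran
from datetime import datetime
from pathlib import Path

Path("/tmp/cron_test.txt").write_text("cron ran at " + datetime.now().isoformat() + "\n")

A crontab line such as */5 * * * * /usr/bin/python3 /home/ubuntu/cron_test.py (paths are placeholders) would run it every 5 minutes; if /tmp/cron_test.txt appears while your laptop is off, the job is clearly running on the VM and not on your machine.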
I am working with scrapy 0.20 on python 2.7
Question
What is the best Python scheduler?
My need
I need to run my spider, which is a Python script, every 3 hours.
What I have thought
I tried using the Windows Task Scheduler that comes with Windows 7, and it works well. I am able to run a Python script every 3 hours, but I may deploy my script on a Linux server, so I may not be able to use this option.
I created a Java application using Quartz Scheduler. It works well, but it is a third-party library, which my manager may refuse.
I created a Windows service and made it fire the script every three hours. It works, but again I may deploy my script on a Linux server, so I may not be able to use this option.
So I am asking about the best practice for firing a Python script on a schedule.
I tried using the Windows Task Scheduler that comes with Windows 7, and it works well.
So that works fine for you already. Good, no need to change your script to do scheduling work yourself.
but I may deploy my script on a Linux server, so I may not be able to use this option.
On Linux, you can use cron jobs to achieve this.
The other way would be to simply keep your script running the whole time, but have it sleep for the three hours in which it has nothing to do. That way you don't need to set up anything on the target machine; you just run the script in the background, and it will keep running and doing its job.
This is exactly how job schedulers work btw. They are launched early when the operating system starts, and then they just keep running forever and every short time interval (a minute or so) they check if there is any job on their list that needs to run now. And if that’s the case, they spawn a new process and run the job.
So if you wanted to make such a scheduler in Python, you would just keep it running forever, and once every time interval (in your case 3 hours because you only have a single job anyway), you start your job. That can be in a separate process, in a separate thread, or indirectly in a separate thread using asynchronous functions.
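A bare-bones sketch of that idea, assuming the spider is started with the usual scrapy crawl command (the spider name is a placeholder):

import subprocess
import time

INTERVAL = 3 * 60 * 60  # three hours, in seconds

while True:
    # launch the spider in a separate process and wait for it to finish
    subprocess.call(["scrapy", "crawl", "myspider"])
    time.sleep(INTERVAL)

Run that script in the background (or as a service) and it behaves like a tiny single-job scheduler.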
The best way to deploy and schedule your Scrapy project is to use the scrapyd server.
You should install scrapyd.
sudo apt-get install scrapyd
You change your project config file to something like this:
[deploy:somename]
url = http://localhost:6800/ ## this is the default
project = scrapy_project
You deploy your project to the scrapyd server:
scrapy deploy somename
You change the poll interval in /etc/scrapyd/conf.d/default-000 to 3 hours (the default is 5 seconds):
poll_interval = 10800
You schedule your spider with something like:
curl http://localhost:6800/schedule.json -d project=scrapy_project -d spider=myspider
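If you would rather make the same call from Python instead of curl, something like this should work (assuming the requests library is installed):

import requests

# same request as the curl command above, via scrapyd's JSON API
resp = requests.post(
    "http://localhost:6800/schedule.json",
    data={"project": "scrapy_project", "spider": "myspider"},
)
print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}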
You can use the web service to monitor your jobs:
http://localhost:6800/
PS: I only tested this under Ubuntu, so I am not sure whether a Windows version exists. If not, you can install a VM with Ubuntu to launch the spiders.
Well, there's always the charming sched module (docs), which provides a generic scheduling interface.
Give it a time function and a sleep function, and it'll give you back a pretty nice and extensible scheduler.
It's not system-level, but if you can run it as a service, it should suffice.
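For example, a rough sketch of a job that re-schedules itself every 3 hours with sched (the job body is a placeholder):

import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)
THREE_HOURS = 3 * 60 * 60

def run_job():
    print("running the spider")               # placeholder for the real work
    scheduler.enter(THREE_HOURS, 1, run_job)  # schedule the next run

scheduler.enter(THREE_HOURS, 1, run_job)      # first run, three hours from now
scheduler.run()                               # blocks and keeps the schedule going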
I need to install, on one of my Windows PCs, some software that will periodically send a short HTTP POST request to my remote development server. The request is always the same and should be sent every minute.
What would you recommend as the best approach for that?
The things I considered are:
1. Creating a Windows service
2. Using a script in python (I have cygwin installed)
3. Scheduled task using a batch file (although I don't want the black cmd window to pop up in my face every minute)
Thanks for any additional ideas or hints on how to best implement it.
import urllib  # Python 2 stdlib; on Python 3 use urllib.request or requests
import time

url = "http://example.com/endpoint"  # placeholder: your server's URL
post_data = "key=value"              # placeholder: the POST body
while True:
    urllib.urlopen(url, post_data)   # supplying data makes this a POST
    time.sleep(60)                   # wait one minute
If you have cygwin, you probably have cron - run a python script from your crontab.
This is trivially easy with a scheduled task which is the native Windows way to schedule tasks! There's no need for cygwin or Python or anything like that.
I have such a task running on my machine which pokes my Wordpress blog every few hours. The script is just a .bat file which calls wget. The task is configured to "Run whether user is logged on or not" which ensures that it runs when I'm not logged on. There's no "black cmd window".
You didn't say which version of Windows you are on, and if you are on XP (unlucky for you if you are), then the configuration is probably different, since the scheduled task interface changed quite a bit when Vista came out.