How to schedule a job with Flask on Windows? - python

It seems that flask-crontab cannot run on my machine because there is no such thing as cron on Windows.
This is the error I get: ModuleNotFoundError: No module named 'fcntl'
If this is the case, how can I write a scheduled job in a Flask application?
Specifically, I need to be able to deploy the application, so it won't be running on Windows once it goes into production; in the meantime, though, I can't debug it if I can't test the cron job on my own computer.
Any help is appreciated, e.g. pointing me to documentation that is useful, suggesting other extensions, etc. Thanks!
Pete

I would recommend you start using Celery. It is a great library for job scheduling, whether you want periodic jobs or on-demand tasks.
Follow this guide
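For a concrete starting point, here is a minimal sketch of a periodic task using Celery beat; the Redis broker URL, module name, and 60-second schedule below are placeholders rather than anything from your setup.
# tasks.py -- a minimal Celery beat sketch; broker URL and names are placeholders
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def cleanup():
    # replace this with the real scheduled work
    print("running scheduled cleanup")

# celery beat reads this schedule and sends the task to a worker every 60 seconds
app.conf.beat_schedule = {
    "cleanup-every-minute": {
        "task": "tasks.cleanup",
        "schedule": 60.0,
    },
}
You then run a worker (celery -A tasks worker) and the scheduler (celery -A tasks beat). Note that recent Celery versions are not officially supported on Windows, so for local testing you may need the solo pool (--pool=solo) on the worker.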

You can use Apache Airflow. It is written in Python. Its documentation is here.
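If you go that route, a scheduled job is expressed as a DAG; the sketch below is only illustrative, and the dag_id, schedule, and callable are made-up placeholders. Note also that Airflow itself does not run natively on Windows, so local testing usually happens under WSL or Docker.
# a minimal Airflow DAG sketch; dag_id, schedule and the callable are placeholders
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def my_job():
    print("running the scheduled job")

with DAG(
    dag_id="my_scheduled_job",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="run_my_job", python_callable=my_job)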

Related

Celery + Django on startup

I have written an application with Django and Celery; it is for a machine that runs some tasks and needs to be accessible over the network. So I need Django and Celery to start automatically when the machine's built-in PC boots. Can anyone please tell me the best and easiest way to do this? Can servers like Apache handle such a thing?
OK, in case anyone else has this problem: I wrote a .sh file for each one and then called them from the startup settings in Ubuntu.

Deploying multiple Gunicorn applications. Best process manager for easy one-click setup?

I am a Python programmer, and server administration has always been a bit hard for me to immerse myself in. I read tutorials and in practice just repeated the steps each time I set up a new project. I always used uWSGI, but realized that Gunicorn is easier for simple projects like mine.
Recently, I successfully set up my first single gunicorn application with the help of this article: https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-16-04
But what should I do if I want to launch another app with Gunicorn? Should I just make another systemd service file, like myproject.service? I'm looking for a convenient 'one-click' setup, so I can easily transfer my project to another machine or add more Gunicorn applications without much configuration. Or maybe I should use another process manager like Supervisor? What is the best solution for a newbie like me?
Sorry if my question's too dumb, but I'm really trying.
Thank you!

Celery and Python Client Server Architecture

I need help with Celery and Python. I was able to configure Celery after a lot of trouble, with help from this link: https://github.com/larsbutler/celery-examples. I tried the example program and it ran fine as long as tasks.py, demo.py, and celeryconfig.py were all in the same folder. When I moved demo.py to another folder and tried running it, it threw ImportError: Unable to import tasks module. I am using RabbitMQ. I have not been able to find many more resources on using Celery with Python.
I have two systems, Alice and Bravo. I want to call the function add, which resides on Alice, from Bravo. I need help configuring Celery for this purpose; it is like a client/server architecture.
FYI, the OP contacted me directly last Friday (duplicate post? :P) and I was able to help him resolve the issue. I think we can consider this question resolved.
As S. Lott mentioned, the proper solution was to add the other folder (containing demo.py) to PYTHONPATH.
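In code, the same effect can be had by putting the directory that holds tasks.py on sys.path before the import; a minimal sketch, where the relative path is a placeholder:
# demo.py -- make the directory containing tasks.py importable before importing from it
import os
import sys

# hypothetical location of the folder holding tasks.py and celeryconfig.py
TASKS_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "celery-examples"))
sys.path.insert(0, TASKS_DIR)

from tasks import add  # resolves now, even though demo.py lives elsewhere

add.delay(2, 3)  # send the task to the worker as before
Exporting PYTHONPATH with that directory before launching demo.py achieves the same thing without touching the code.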

Scheduling a regular event: Cron/Cron alternatives (including Celery)

Something I've had interest in is regularly running a certain set of actions at regular time intervals. Obviously, this is a task for cron, right?
Unfortunately, the Internet seems to be in a bit of disagreement there.
Let me elaborate a little about my setup. First, my development environment is in Windows, while my production environment is hosted on Webfaction (Linux). There is no real cron on Windows, right? Also, I use Django! And what's suggested for Django?
Celery of course! Unfortunately, setting up Celery has been more or less a literal nightmare for me - please see Error message 'No handlers could be found for logger “multiprocessing”' using Celery. And this is only ONE of the problems I've had with Celery. Others include a socket error which, as far as I can tell, I'm the only one ever to have run into.
Don't get me wrong, Celery seems REALLY cool. Unfortunately, there seems to be a lack of support, and some odd limitations built into its preferred backend, RabbitMQ. Unfortunately, no matter how cool a program is, if it doesn't work, well, it doesn't work!
That's where I hope all of you can come in. I'd like to know about cron or a cron-equivalent, which can be set up similarly (preferably identically) in both a Windows and a Linux environment.
(I've been struggling with Celery for about two weeks now and unfortunately I think it's time to toss in the towel and give up on it, at least for now.)
I had the same problem, and held off on trying to solve it with Celery (too complicated) or cron (external to the application), and ended up finding Advanced Python Scheduler (APScheduler). I've only just started using it, but it seems reasonably mature and stable, has decent documentation, and will take a number of scheduling formats (e.g. cron style).
From the documentation, here is how to run a function at a specific interval:
from apscheduler.scheduler import Scheduler  # APScheduler 2.x API

sched = Scheduler()
sched.start()

def hello_world():
    print("hello world")

# run hello_world every 10 seconds in a background thread
sched.add_interval_job(hello_world, seconds=10)
This is non-blocking, and I run something pretty identical by simply importing the module from my urls.py. Hope this helps.
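The snippet above uses the old APScheduler 2.x API; on current APScheduler (3.x) the rough equivalent is the sketch below. Because it runs in-process and has no Unix-only dependencies, it also works on Windows, which makes it a reasonable fit for this kind of problem.
# roughly the same thing under the APScheduler 3.x API
from apscheduler.schedulers.background import BackgroundScheduler

def hello_world():
    print("hello world")

sched = BackgroundScheduler()
sched.add_job(hello_world, "interval", seconds=10)
sched.start()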
A simple, non-Celery way to approach things would be to create custom django-admin commands to perform your asynchronous or scheduled tasks.
Then, on Windows, you use the at command to schedule these tasks. On Linux, you use cron.
I'd also strongly recommend ditching Windows if you can for a development environment. Your life will be so much better on Linux or even Mac OSX. Re-purpose a spare or old machine with Ubuntu for example, or run Ubuntu in a VM on your Windows box.
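To illustrate the management-command approach, a command lives under a management/commands/ directory inside an app; the app name, file name, and task body below are hypothetical.
# myapp/management/commands/run_scheduled_task.py  (hypothetical app and command names)
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Task meant to be triggered by cron on Linux or at/Task Scheduler on Windows"

    def handle(self, *args, **options):
        # put the actual scheduled work here
        self.stdout.write("scheduled task finished")
cron (or at) then simply invokes python manage.py run_scheduled_task on whatever interval you need.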
https://github.com/andybak/django-cron
Triggered by a single cron task but all the scheduling and configuration is done in Python.
Django Chronograph is a great alternative. You only need to set up one cron entry and then do everything in the Django admin. You can schedule tasks/commands from Django management.

Starting Tornado Web

I'm quite new to using Tornado Web as a web server, and am having a little difficulty keeping it running. I normally use Django and Nginx, and am used to starting/stopping/restarting the server. However, with Tornado I'm having trouble telling it to "run" without directly executing my main Python file for the site, i.e. "python ~/path/to/server.py".
I'm sure I'm getting this completely wrong - is there a way of 'bootstrapping' my script so that when Nginx starts, Tornado starts?
Any help would be appreciated!
A better way to do it is to use supervisord, as it is also written in Python.
No, there is not a way to have nginx spawn your tornado instance.
Typically you would use an external tool like daemontools or a system init script to run the tornado process.
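Whichever supervisor you pick, the process being kept alive is just your server.py; a minimal sketch of such a script, where the handler, route, and port are placeholders:
# server.py -- a minimal Tornado app; handler, route and port are placeholders
import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello from Tornado")

def make_app():
    return tornado.web.Application([(r"/", MainHandler)])

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
nginx then sits in front as a reverse proxy (proxy_pass to 127.0.0.1:8888), while supervisord or an init script keeps the Python process itself running.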
