crontabs and celery beat - python

I want to run an asynchronous task in my Python project once every day at a particular time.
I have researched the various ways of accomplishing this, but I am confused about the difference between Celery beat and crontabs and what each one does.
I would be glad if anyone could help me understand the difference between the two (if any), including their performance considerations.

As you can see in this code:
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (16, 16)
    },
}
Celery beat is the scheduler process itself. Think of each schedule entry like a function call with settings: the function is the task, and the settings are the arguments, name, and schedule. This is where crontab comes in: it is a schedule specification that tells Celery beat when to fire the task.
You can also see a list of the available crontab schedule forms in the Celery documentation.
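For reference, here are a few crontab schedule forms taken from Celery's documentation (the comments describe when each one fires; they would be used as the 'schedule' value in entries like the one above):

```python
from celery.schedules import crontab

crontab()                                   # every minute
crontab(minute=0, hour=0)                   # daily at midnight
crontab(minute=0, hour='*/3')               # every three hours
crontab(hour=7, minute=30, day_of_week=1)   # Mondays at 7:30
```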

Related

Execute task at specific times django

I am in the process of writing my own task app using Django and would like a few specific functions to be executed every day at a certain time (updating tasks, checking due dates, etc.). Is there a way to have Django run functions on a regular basis or how do I go about this in general?
Does it make sense to write an extra program with an infinite loop for this or are there better ways?
Celery is a good option here:
First steps with Django
Periodic Tasks
app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': 30.0,
        'args': (16, 16)
    },
}
app.conf.timezone = 'UTC'
With celery you can define periodic tasks at any given interval. Celery workers will then pick up those tasks when needed. You will need to run something like RabbitMQ or Redis to support the celery workers.
The alternative, simpler way is to add an entry to your urls.py that catches any URL you don't otherwise use, and treat each hit as a prompt to check your database for tasks that are due. This leverages the fact that your website will receive a lot of bot traffic. The timing isn't entirely reliable, but it doesn't require any extra setup.
You can also use django-cronjobs, or schedule your job with the schedule library.
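As for the "extra program with an infinite loop" idea from the question: it can be sketched with just the standard library (the job function and times here are placeholders), though the drawbacks become apparent quickly:

```python
import datetime as dt
import time

def seconds_until(hour, minute, now=None):
    """Seconds from `now` until the next occurrence of hour:minute."""
    now = now or dt.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += dt.timedelta(days=1)  # that time already passed today
    return (target - now).total_seconds()

def run_daily(job, hour, minute):
    """Block forever, calling `job` once a day at hour:minute."""
    while True:
        time.sleep(seconds_until(hour, minute))
        job()
```

The process must stay alive, and must be restarted after crashes or reboots; that supervision is exactly what cron or Celery beat handle for you.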

What Tool / App To Use For Running Django Function Periodically (Once A Day)?

Can you please recommend an app or a tool for Django 2.2 to run a function periodically? I have a list of products and want to update their price once a day. I've heard about Celery, but maybe there is something more simple that I can use? Thanks in advance.
Using Celery to run periodic tasks is relatively straightforward.
If you don't want to use Celery at all, you could write a custom management command and invoke it periodically via a cron job.
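A minimal sketch of that cron-plus-management-command approach, assuming a hypothetical app and command name (the command body and paths are placeholders):

```python
# yourapp/management/commands/update_prices.py  (hypothetical path)
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Update product prices (intended to run once a day via cron)"

    def handle(self, *args, **options):
        # fetch your products and update their prices here
        self.stdout.write("prices updated")

# Example crontab entry to run it daily at 03:00:
# 0 3 * * * /path/to/venv/bin/python /path/to/project/manage.py update_prices
```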
You can use Celery:
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "update-task-on-mathmod.org": {
        "task": "project.app1.tasks.task_that_run_daily",
        "schedule": crontab(minute=0, hour=0),  # execute daily at midnight
    }
}
And in the tasks file:
from celery import shared_task

@shared_task
def task_that_run_daily():
    print(".......running once a day.......")

How to do a job every half an hour

I want to do a job every half an hour. My application is based on Flask and runs on Windows.
Currently I create a task for the job using the Windows Task Scheduler service.
I want to know if there is another way I can do cyclic tasks using Flask’s built-in functions...
Sorry for my poor English.
I want to know if there is another way I can do [periodic] tasks using Flask’s built-in functions.
Being a minimalist microframework, I don't think Flask has or will ever have a built-in feature for scheduling periodic tasks.
The customary way is what you have already done: write some Flask code that can be called as an HTTP endpoint or a script, then use an OS scheduling tool to execute it (e.g. Windows Task Scheduler, or cron on UNIX/Linux).
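For example, Flask's built-in Click integration lets you expose the job as a CLI command that the OS scheduler then invokes (the command name and job body here are placeholders):

```python
# app.py
from flask import Flask

app = Flask(__name__)

@app.cli.command("half-hour-job")
def half_hour_job():
    """The periodic work; invoke with `flask --app app half-hour-job`."""
    print("doing the half-hourly work...")
```

The Windows Task Scheduler (or cron) entry then simply runs that `flask` command every 30 minutes.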
Otherwise, Flask works well with specialized libraries that take care of this, like Celery, whose periodic tasks handle the scheduling details and add features that may not be available otherwise.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'every-half-hour': {
        'task': 'tasks.your_task',
        'schedule': timedelta(minutes=30),
        'args': ('task_arg1', 'task_arg2')
    },
}
CELERY_TIMEZONE = 'UTC'
I'm not sure if this helps, but I've been testing the schedule module and it's easy to use and works well:
$ pip install schedule
and this is a sample from the official documentation:
import schedule
import time

def job():
    print("I'm working...")

schedule.every(30).minutes.do(job)

while True:
    schedule.run_pending()
    time.sleep(1)
Hope this helps =)

Django + Celery: how to chain tasks with parameters to periodic task

I have configured Django + Celery: everything works, and I can execute tasks called from views.py, e.g. mul.apply_async((2, 5), queue='celery', countdown=5).
I need to schedule a periodic task that will chain simple tasks with arguments passed from users.
I have read the docs (http://docs.celeryproject.org/en/latest/userguide/canvas.html) and know how to chain tasks, and I know how to make a periodic task without parameters: @periodic_task(run_every=(crontab(hour="*", minute="*", day_of_week="*")))
But how do I combine the two?
The workflow I want:
A user creates a project with parameters. 5 tasks are executed using those parameters.
Then I need to schedule all 5 tasks to repeat every 24 hours. Here I don't know how to pass the parameters (they are saved to the db).
In another answer I saw this syntax:
CELERYBEAT_SCHEDULE = {
    # crontab(hour=0, minute=0, day_of_week='saturday')
    'schedule-name': {  # example: 'file-backup'
        'task': 'some_django_app.tasks....',  # example: 'files.tasks.cleanup'
        'schedule': crontab(...),
        'args': (2, 3)
    },
}
But the problem is that this lives in Django's settings.py, not in tasks.py, so I cannot pass args dynamically.
The celery beat task you register could be a wrapper that performs the project/task logic inside of it, firing off other tasks as appropriate. You could fetch the project's tasks inside of your celery beat job.
CELERYBEAT_SCHEDULE.task -> 'some_django_app.project_beat_task'
Then the project beat task could retrieve the correct projects and all tasks associated with them, perhaps spawning a chain of tasks for each project.
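A sketch of that wrapper, assuming a hypothetical Project model that stores the user's parameters and hypothetical step_one/step_two/step_three tasks (none of these names come from the question):

```python
# tasks.py
from celery import chain, shared_task

@shared_task
def project_beat_task():
    from myapp.models import Project  # hypothetical model holding user parameters
    for project in Project.objects.filter(active=True):
        # Rebuild the chain at run time from the parameters saved in the db;
        # step_one..step_three stand in for your existing @shared_task functions.
        chain(
            step_one.s(project.param1),
            step_two.s(),
            step_three.s(),
        ).apply_async()
```

Because the wrapper reads the parameters from the database each time beat fires it, nothing user-specific needs to appear in settings.py.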

How to execute a command at exact time once a day in Django?

I am working on a Django web based project in which I need to build an application that works in the following sequence:
1) The user opens a page where they enter a command and a time.
2) The Django application executes that command at the given time each day until the user turns the scheduler off (by default it is on).
The problem I am facing is:
1) How should I execute the commands at the given time each day? To save the commands and times I created the following model in my models.py:
class commands(models.Model):
    username = models.ForeignKey(User)
    command = models.CharField(max_length=30)
    execution_time = models.DateField()
I have the time saved, but I have not found the right way to execute the command each day at that time.
Is it possible to do this with the pytz library?
For executing the commands I am using the paramiko library.
PS: I don't want to use any external library.
While you could have your Django app add and remove cron jobs on the system, another more Django-ish approach would be to use Celery. It is a task queue system that can run both synchronous and asynchronous tasks.
One specific feature of Celery is scheduled tasks: http://packages.python.org/celery/userguide/periodic-tasks.html
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    "runs-every-30-seconds": {
        "task": "tasks.add",
        "schedule": timedelta(seconds=30),
        "args": (16, 16)
    },
}
They also have a more granular version of the periodic task that replicates the scheduling of a crontab:
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Executes every Monday morning at 7:30 A.M.
    'every-monday-morning': {
        'task': 'tasks.add',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': (16, 16),
    },
}
Celery by itself is stand-alone, but there is the Django-specific django-celery version.
The benefit of this solution is that you do not need to edit and maintain a system-level crontab. It is a solution that is highly integrated with Django for this exact use case.
Also, a huge win over cron is that Celery can scale with your system. With a basic system crontab, the tasks live on the server that hosts the application. But what if you needed to ramp up your site and run it on 5 web application nodes? You would have to centralize that crontab. With Celery, you have many options for how to transport and store tasks. It is inherently distributed, available to all your application servers, and portable.
It seems to me that the proper way to do this would be to write a custom Django management command and execute it via cron. But you seem to be in luck, as others have felt a similar need and have written custom Django apps. Take django-cron, for example.
The solution for your problem is the standard cron application (the task planner on *nix systems). You can schedule a script using cron (by adding it to the crontab).
If your script must run in your Django application environment, you can set that up with the setup_environment function. You can read more about standalone scripts for Django applications in the documentation.
