Celery beat not sending crontab task when hour is set - python

I'm using celery 4.1 and all my periodic tasks work correctly except where I set the hour in a crontab task. I was thinking it had something to do with the timezone setting, but I can't seem to work out where the problem is.
dashboard/celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery

app = Celery('dashboard',
             broker='redis://',
             backend='redis://localhost',
             include=['dashboard.tasks'])

app.conf.update(
    result_expires=3600,
    enable_utc=False,
    timezone='America/New_York'
)

if __name__ == '__main__':
    app.start()
This works:
@app.task
@periodic_task(run_every=(crontab()))
def shutdown_vms():
    inst = C2CManage(['stop', 'kube'])
    inst.run()
    return
This works:
@app.task
@periodic_task(run_every=(crontab(minute=30, hour='*')))
def shutdown_vms():
    inst = C2CManage(['stop', 'kube'])
    inst.run()
    return
This doesn't work:
@app.task
@periodic_task(run_every=(crontab(minute=30, hour=6)))
def shutdown_vms():
    inst = C2CManage(['stop', 'kube'])
    inst.run()
    return
Beat picks up the task just fine:
<ScheduleEntry: dashboard.tasks.shutdown_vms dashboard.tasks.shutdown_vms() <crontab: 30 6 * * * (m/h/d/dM/MY)>>
But it never sends it. I've let the processes run over a weekend and it never submits the task. I don't know what I'm doing wrong. I do have other tasks that run on timedelta periodicity and they all work perfectly.
Any help would be awesome.
EDIT: host is set to use the America/New_York timezone.
EDIT2: running beat as a separate process:
celery -A dashboard worker -l info
celery -A dashboard beat -l debug
I run them detached mostly or use multi.
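For reference, a hedged sketch of how that detached/multi setup might look (the node name w1 is just an example):
celery multi start w1 -A dashboard -l info
celery -A dashboard beat -l debug --detach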

It looks like this bug is causing it:
https://github.com/celery/celery/issues/4177
There are several other reports indicating that the schedule is not calculated properly when not using UTC.
I switched Celery to UTC as the timezone, enabled UTC, and it works fine.
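A minimal sketch of the adjusted settings, assuming the same app object as above (with enable_utc=True the crontab hour is interpreted in UTC, so 6:30 America/New_York corresponds to 10:30 or 11:30 UTC depending on DST):
app.conf.update(
    result_expires=3600,
    enable_utc=True,
    timezone='UTC',
)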

I solved this problem by using celery==4.0.1.

An easy solution for the problem: in the Celery settings, update the following config:
app.conf.enable_utc = False
app.conf.timezone = "Asia/Calcutta"  # change to your timezone
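With these settings, crontab hours are interpreted in the configured timezone. A hedged sketch of a matching schedule entry (the entry name is made up; the task path is the one from the question above):
from celery.schedules import crontab

app.conf.beat_schedule = {
    'shutdown-vms-daily': {
        'task': 'dashboard.tasks.shutdown_vms',
        'schedule': crontab(minute=30, hour=6),  # 06:30 local time
    },
}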

Related

Why isn't celery periodic task working?

I'm trying to create a periodic task within a Django app.
I added this to my settings.py:
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'get_checkins': {
        'task': 'api.tasks.get_checkins',
        'schedule': timedelta(seconds=1)
    }
}
I'm just getting started with Celery and haven't figured out which broker I want to use, so I added this as well to just bypass the broker for the time being:
if DEBUG:
    CELERY_ALWAYS_EAGER = True
I also created a celery.py file in my project folder:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testproject.settings')
app = Celery('testproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
Inside my app, called api, I made a tasks.py file:
from celery import shared_task

@shared_task
def get_checkins():
    print('hello from get checkins')
I'm running the worker and beat with celery -A testproject worker --beat -l info
It starts up fine and I can see the task is registered under [tasks], but I don't see any jobs getting logged. There should be one per second. Can anyone tell me why this isn't executing?
I looked at your post and don't see any mention of which broker you are using along with Celery.
Have you installed a broker like RabbitMQ? Is it running, or is it logging some kind of error?
Celery needs a broker to send and receive messages.
Check the documentation here: http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#choosing-a-broker
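For example, a minimal sketch of pointing Celery at a broker from Django settings.py, assuming the CELERY namespace used in the celery.py above (Redis shown; RabbitMQ would use an amqp:// URL):
CELERY_BROKER_URL = 'redis://localhost:6379/0'
# or, for RabbitMQ:
# CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'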

celery set visibility_timeout not working

When running Django Celery tasks, I see that my task re-executes every hour, probably because of the default visibility_timeout setting, so I tried setting visibility_timeout = 120 to have the task re-execute after 120 seconds instead, in my celery.py config like this:
app.config_from_object('django.conf:settings')
installed_apps = [app_config.name for app_config in apps.get_app_configs()]
app.conf.broker_transport_options = {'visibility_timeout': 120}
app.autodiscover_tasks(lambda: installed_apps, force=True)
But it doesn't work. What is the correct way to change visibility_timeout?
I run my task from script running_celery_task.py with command: python manage.py shell < running_celery_task.py.
Here is running_celery_task.py
from project.app.tasks import my_task
my_task.delay()
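A hedged alternative sketch: since the celery.py above calls app.config_from_object('django.conf:settings') without a namespace, the old-style setting name could also be placed directly in Django settings.py (120 is in seconds, as above):
BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 120}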

Celery, periodic task execution, with concurrency

I would like to launch a periodic task every second but only if the previous task ended (db polling to send task to celery).
In the Celery documentation they are using the Django cache to make a lock.
I tried to use the example:
from __future__ import absolute_import
import datetime
import time
from celery import shared_task
from django.core.cache import cache

LOCK_EXPIRE = 60 * 5

@shared_task
def periodic():
    acquire_lock = lambda: cache.add('lock_id', 'true', LOCK_EXPIRE)
    release_lock = lambda: cache.delete('lock_id')
    a = acquire_lock()
    if a:
        try:
            time.sleep(10)
            print a, 'Hello ', datetime.datetime.now()
        finally:
            release_lock()
    else:
        print 'Ignore'
with the following configuration:
from datetime import timedelta

app.conf.update(
    CELERY_IGNORE_RESULT=True,
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERYBEAT_SCHEDULE={
        'periodic_task': {
            'task': 'app_task_management.tasks.periodic',
            'schedule': timedelta(seconds=1),
        },
    },
)
But in the console I never see the Ignore message, and I get Hello every second. It seems that the lock is not working correctly.
I launch the periodic task with:
celeryd -B -A my_app
and the worker with:
celery worker -A my_app -l info
Could you please correct my misunderstanding?
From the Django Cache Framework documentation about local-memory cache:
Note that each process will have its own private cache instance, which
means no cross-process caching is possible.
So basically each of your workers is dealing with its own cache. If you need a low-resource-cost cache backend, I would recommend the file-based or database cache; both allow cross-process caching.
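For example, a minimal sketch of a database-backed cache in Django settings (the table name 'celery_lock_cache' is arbitrary and must first be created with python manage.py createcachetable):
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'celery_lock_cache',
    }
}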

Celery Beat Windows Simple Example (not with Django)

I'm really struggling to set up a periodic task using Celery Beat on Windows 7 (unfortunately that is what I'm dealing with at the moment). The app that will be using celery is written with CherryPy, so the Django libraries are not relevant here. All I'm looking for is a simple example of how to start the Celery Beat Process in the background. The FAQ section says the following, but I haven't been able to actually do it yet:
Windows
The -B / --beat option to worker doesn't work?
Answer: That’s right. Run celery beat and celery worker as separate services instead.
My project layout is as follows:
proj/
    __init__.py  (empty)
    celery.py
    celery_schedule.py
    celery_settings.py  (these work)
    tasks.py
celery.py:
from __future__ import absolute_import
from celery import Celery
from proj import celery_settings
from proj import celery_schedule

app = Celery(
    'proj',
    broker=celery_settings.BROKER_URL,
    backend=celery_settings.CELERY_RESULT_BACKEND,
    include=['proj.tasks']
)

# Optional configuration, see the application user guide.
app.conf.update(
    CELERY_TASK_RESULT_EXPIRES=3600,
    CELERYBEAT_SCHEDULE=celery_schedule.CELERYBEAT_SCHEDULE
)

if __name__ == '__main__':
    app.start()
tasks.py
from __future__ import absolute_import
from proj.celery import app

@app.task
def add(x, y):
    return x + y
celery_schedule.py
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=3),
        'args': (16, 16)
    },
}
Running "celery worker --app=proj -l info" from the command line (from the parent directory of "proj") starts the worker thread just fine and I can execute the add task from the Python terminal. However, I just can't figure out how to start the beat service. Obviously the syntax is probably incorrect as well because I haven't gotten past the missing --beat option.
Just start another process via a new terminal window; make sure you are in the correct directory and execute the command celery beat (no '--' needed preceding the beat keyword).
If this does not solve your issue, rename your celery_schedule.py file to celeryconfig.py and include it in your celery.py file with app.config_from_object('celeryconfig') right above your if __name__ == '__main__' block, then spawn a new celery beat process: celery beat
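A sketch of how the tail of proj/celery.py might look after that rename, assuming celeryconfig.py (the renamed celery_schedule.py) is importable from the directory you run celery from:
app.config_from_object('celeryconfig')

if __name__ == '__main__':
    app.start()
You can then run celery worker --app=proj -l info and celery beat --app=proj -l info as two separate processes.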

Django celery task run at once on startup of celery server

I need to find out how to specify a kind of initial Celery task that will start all other tasks in a specially defined way. This initial task should run immediately on Celery server startup and never run again.
How about using the celeryd_after_setup or celeryd_init signal?
Following example code from the documentation:
from celery.signals import celeryd_init

@celeryd_init.connect(sender='worker12@example.com')
def configure_worker12(conf=None, **kwargs):
    ...
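A hedged sketch of using that signal to kick off a one-time startup task; connecting without a sender makes the handler fire for every worker, and myapp.tasks.start_all_tasks is a hypothetical placeholder for the initial task described in the question:
from celery.signals import celeryd_init

@celeryd_init.connect
def run_startup_tasks(sender=None, conf=None, **kwargs):
    # queue the hypothetical initial task once, as this worker initializes
    from myapp.tasks import start_all_tasks
    start_all_tasks.delay()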
I found a way to do this. It has one negative side: it is impossible to specify the current year, so the task will run again after a year. But usually the server restarts more often than that.
from datetime import datetime, timedelta
from celery.schedules import crontab
from celery.task import PeriodicTask

class InitialTasksStarter(PeriodicTask):
    starttime = datetime.now() + timedelta(minutes=1)
    run_every = crontab(month_of_year=starttime.month, day_of_month=starttime.day, hour=starttime.hour, minute=starttime.minute)

    def run(self, **kwargs):
        ....
        return True
