I'm having trouble running a very basic, actually the simplest, Celery example.
Can you tell me whether I'm doing something wrong here?
worker part:
from celery import Celery

celery = Celery('tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/1')

@celery.task
def add(x, y):
    return x + y
and run it with:
celery -A tasks worker --loglevel=INFO
then I open new terminal and type:
>>> from tasks import add
>>> result = add.delay(1,1)
>>>
but result.ready() is always False,
and in the Celery worker log I cannot see the task finishing, only the task being received and the child processes starting:
[2022-04-01 13:07:23,434: INFO/MainProcess] Task tasks.add[b0b98051-53b3-4fec-8f83-b1f74be27cfa] received
[2022-04-01 13:07:24,304: INFO/SpawnPoolWorker-19] child process 31632 calling self.run()
[2022-04-01 13:07:24,318: INFO/SpawnPoolWorker-20] child process 4648 calling self.run()
[2022-04-01 13:07:24,322: INFO/SpawnPoolWorker-21] child process 3984 calling self.run()
What am I doing wrong?
Thanks!
I tried to create a task that should run every minute in Celery, along with a Redis server.
To start Redis I ran "redis-server".
To start Celery I ran "celery -A tasks worker --loglevel=info".
This is my tasks.py file
from celery import Celery
from celery.schedules import crontab
from celery.task import periodic_task

app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')

@app.task
def add(x, y):
    return x + y

@periodic_task(run_every=(crontab(minute='1')), name="run_every_minute", ignore_result=True)
def run_every_minute():
    print("hehe")
    return "ok"
When I ran this in the Python console:
from tasks import run_every_minute
z = run_every_minute.delay()
I got this output in the terminal running Celery:
[2019-06-05 01:35:02,591: INFO/MainProcess] Received task: run_every_minute[06498b4b-1d13-45af-b91c-fb10476e0aa3]
[2019-06-05 01:35:02,595: WARNING/Worker-2] hehe
[2019-06-05 01:35:02,599: INFO/MainProcess] Task run_every_minute[06498b4b-1d13-45af-b91c-fb10476e0aa3] succeeded in 0.004713802001788281s: 'ok'
But this should execute every minute, since it's a periodic task. How can this happen?
Also, how can we execute a Celery task at a specific time, say 5:30 GMT (for example)?
OK, based on the comments:
First, periodic_task needs the scheduler/beat to be started (see Periodic Tasks); with it running, the scheduler will send the task according to the run_every parameter:
celery -A tasks beat
Next, if you need the task sent every minute, the crontab needs to look like this:
@periodic_task(run_every=(crontab(minute='*')), name="run_every_minute", ignore_result=True)
def run_every_minute():
    print("hehe")
    return "ok"
With minute='*', it will send the task every minute; minute='1' sends the task once per hour, at minute one.
Answering your last comment:
run_every=(crontab(minute='1'))
You have specified 'minute of hour' = 1, so celery beat runs your periodic task every hour at minute 1, e.g. 00:01, 01:01, and so on.
You should set the hour attribute for your crontab, probably as a range.
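As for running a task at a specific time such as 5:30 GMT: a crontab can pin both hour and minute. Here is a minimal sketch using the newer beat_schedule configuration style rather than the periodic_task decorator; the entry name 'daily-at-0530' is illustrative, and it assumes the app timezone is set so beat evaluates the crontab in GMT/UTC:
from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')
app.conf.timezone = 'UTC'  # beat evaluates crontab schedules in this timezone

app.conf.beat_schedule = {
    'daily-at-0530': {                           # hypothetical entry name
        'task': 'tasks.add',                     # task to send
        'schedule': crontab(hour=5, minute=30),  # fires once a day at 05:30 UTC/GMT
        'args': (2, 3),
    },
}
With this in place, celery -A tasks beat will send tasks.add to the broker every day at 05:30.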
Consider the code:
from celery import Celery, group
from time import time

app = Celery('tasks', broker='redis:///0', backend='redis:///1', task_ignore_result=False)

@app.task
def test_task(i):
    print('hi')
    return i

x = test_task.delay(3)
print(x.get())
I run it by calling python script.py, but I'm getting no results. Why?
You don't get any results because you've asked your Celery app to execute a task without starting a worker process to execute it. The process you did start is blocked on the call to get().
First things first: when using Celery, it is critical that tasks are not executed when a module is imported, so let's put the task execution inside a main() function, in a file called celery_test.py.
from celery import Celery, group
from time import time

app = Celery('tasks', broker='redis:///0', backend='redis:///1', task_ignore_result=False)

@app.task
def test_task(i):
    print('hi')
    return i

def main():
    x = test_task.delay(3)
    print(x.get())

if __name__ == '__main__':
    main()
Now let's start a pool of celery workers to execute tasks for this app. You can do this by opening a new terminal and executing the following.
celery -A celery_test worker --loglevel=INFO
The -A flag refers to the module where Celery will find the application to run workers for. You should see some output in the terminal indicating that the Celery worker is running and ready to process tasks.
Now, try executing your script again with python celery_test.py. You should see hi show up in the worker's log output, and the value 3 returned in the script that called get().
Be warned, if you've been playing with celery without running a worker, it probably has lots of tasks waiting in your broker to execute. The first time you start up the worker pool, you'll see them all execute in parallel until the broker runs out of tasks.
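If that backlog is unwanted, you can empty the broker before starting the worker with Celery's built-in purge command (a sketch, assuming the module name celery_test from above; note this irreversibly deletes every waiting message):
celery -A celery_test purge
It asks for confirmation before discarding the queued messages.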
I want to implement task priority in my Celery workers. I can do this by creating different queues for high-priority and low-priority tasks. But I also need to send broadcast tasks to all workers via a broadcast queue, and it's not working. Here is the tasks.py file:
from celery import Celery
from kombu import Queue, Exchange
from kombu.common import Broadcast

app = Celery('tasks')

app.conf.update(
    CELERY_RESULT_BACKEND='amqp',
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    BROKER_URL='amqp://',
    CELERY_QUEUES=(
        Queue('default', Exchange('default'), routing_key='default'),
        Queue('low_priority', Exchange('low_priority'), routing_key='low_priority'),
        Broadcast('broadcast_tasks'),
    ),
    CELERY_ROUTES={
        'tasks.broadcast': {'queue': 'broadcast_tasks'},
        'tasks.low_task': {'queue': 'low_priority'},
    },
    CELERY_DEFAULT_QUEUE='default',
    CELERY_DEFAULT_EXCHANGE='default',
    CELERY_DEFAULT_ROUTING_KEY='default',
)

@app.task
def broadcast():
    print("Broadcast called")

@app.task
def low_task():
    print("Low priority called")

@app.task
def def_task():
    print("Default called")
When I run celery workers with this command:
celery -A tasks -Q default worker --loglevel=info
celery -A tasks -Q default,low_priority worker --loglevel=info
Task priority works but broadcast tasks are not acknowledged.
When I run the same command without a queue argument, broadcast works but task priority does not:
celery -A tasks worker --loglevel=info
celery -A tasks worker --loglevel=info
As I understand, it happens because broadcast queues have unique names, like bcast.0b5dbce0-9bcb-48a5-8554-cbb7f32a6703 for each worker.
Does anyone have a good workaround? Thanks in advance!
You must explicitly consume the broadcast queue, so modify your command-line invocation as suggested in the comment above by ANDY_VAR.
A similar question was asked here:
start celery worker and enable it for broadcast queue
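Concretely (a sketch, reusing the queue names from the config above): each worker should include the broadcast queue in its -Q list alongside its other queues. Because 'broadcast_tasks' is declared as a Broadcast in CELERY_QUEUES, Celery can map the name onto the unique per-worker bcast.* queue:
celery -A tasks -Q default,broadcast_tasks worker --loglevel=info
celery -A tasks -Q default,low_priority,broadcast_tasks worker --loglevel=info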
If I have a celery task such as the following:
import requests

@celery.task(name='tasks.webrequest')
def webrequest(*args):
    try:
        response = requests.get('{0}'.format(args[0]), auth=(args[1], args[2]), verify=False, timeout=240)
    except Exception as e:
        print(e)
        response = 'cant talk to server'
    return response
and a Celery worker with only one core, so only one worker thread. Is there a way, and how would you go about it, to have that worker perform two or more of these tasks at once?
Currently I am starting the worker like so:
celery -A app.celery worker -l DEBUG
When I start it with the concurrency option (thanks Ale), it allows me to have more threads than CPUs:
celery -A app.celery worker -c 30 -l DEBUG
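Since these tasks are I/O-bound (they spend most of their time waiting on HTTP responses), a greenlet-based execution pool is another option worth sketching; this assumes the gevent package is installed, and the concurrency value 100 is illustrative:
celery -A app.celery worker -P gevent -c 100 -l DEBUG
With -P gevent, a single process can interleave many concurrent requests instead of dedicating one prefork process per task.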
I'm working with Celery http://celery.readthedocs.org/en/latest/index.html
I need to run a periodic task at a specific moment, but I only want to start my task after the Celery worker has started.
For that I'm trying to create my own "PeriodicTask", but I'm running into a problem.
When I start the worker and execute run_tasks.py in another terminal, it seems that my periodic task is executed only once.
How can I get my periodic task to run every 3 seconds?
Here is a part of the code.
Start celery :
celery worker --app=worker_manager.celery --loglevel=info
file tasks.py
from datetime import timedelta

from celery.task import PeriodicTask

class MyPeriodicTask(PeriodicTask):
    name = "periodic-task"
    run_every = timedelta(seconds=3)

    def run(self, **kwargs):
        logger = self.get_logger(**kwargs)
        logger.info("Running periodic task!")
file run_tasks.py
tasks.register(MyPeriodicTask)
wmi_collector_task = worker_app.tasks[MyPeriodicTask.name]
Thanks in advance.
To run periodic tasks you need to start celery beat. You can do this by passing the -B argument when starting the worker:
celery worker -B --app=worker_manager.celery --loglevel=info
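The -B flag embeds beat inside the worker, which is convenient for development; per the Celery docs, in production beat is usually run as its own process, e.g. (a sketch using the same app path as above):
celery beat --app=worker_manager.celery --loglevel=info
Either way, the schedule only starts ticking once beat is running, which matches the requirement that the periodic task begins only after startup.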