Django-Celery: worker does not execute task

I've been trying to run some tests with Celery, RabbitMQ and Django.
The problem is that once I have spawned a worker, I use Django to send a task via .delay(). The worker receives it, but from what it looks like, it cannot find a task of that name to execute. Which is weird, since a task with the exact same name is in the list of tasks the worker reports it can execute.
Here it is:
-> settings.py
BROKER_HOST = "127.0.0.1" #IP address of the server running RabbitMQ and Celery
BROKER_PORT = 5672
BROKER_URL = 'amqp://'
CELERY_IMPORTS = ('notify')
CELERY_IGNORE_RESULT = False
CELERY_RESULT_BACKEND = "amqp"
CELERY_IMPORTS = ("notify")
-> __init__.py (under notify module)
import celery
from celery import app as celery_app
from notify.user.tasks import send_email
-> tasks.py (under notify module)
import os
from celery import Celery, task
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.config_from_object('app.settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=False)
@task()
def send_email(recepient, title, subject):
    print('sending email')
And here is how I spawn my worker at the root of my project:
celery worker -n worker_one -l debug -A notify
Here is the error I get when the worker receives the task I sent:
The full contents of the message body was:
'[["Guillaume", "One title", "This is my subject"], {}, {"chain": null, "callbacks": null, "errbacks": null, "chord": null}]' (123b)
Traceback (most recent call last):
File "/Users/guillaumeboufflers/.pyenv/versions/smartbase/lib/python3.5/site-packages/celery/worker/consumer/consumer.py", line 561, in on_task_received
strategy = strategies[type_]
KeyError: 'notify.user.tasks.send_email'
...which is weird, because the task appears in the worker's registered task list:
[tasks]
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. notify.user.tasks.send_email
Thanks guys for helping..
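For reference, the single-app layout that the Celery documentation recommends for Django looks roughly like the sketch below; it assumes the project package is app and keeps the task path notify.user.tasks.send_email, instead of creating a second Celery() instance inside tasks.py.

# app/celery.py -- one Celery instance for the whole project (sketch)
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')

app = Celery('app')
app.config_from_object('django.conf:settings')  # reads the old-style uppercase settings (no namespace)
app.autodiscover_tasks()                        # finds tasks.py in each installed app

# notify/user/tasks.py -- register the task against that shared instance
from app.celery import app

@app.task
def send_email(recipient, title, subject):
    print('sending email')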

Related

Why does celery worker keep trying to connect to amqp even though the broker is sqs?

I tried to configure the broker via settings and directly from the celery file.
The settings that apply to Celery are below.
AWS_SQS_SECRET = os.environ.get("AWS_SQS_SECRET")
broker_url = 'sqs://%s:%s@' % (AWS_SQS_ACCESS, AWS_SQS_SECRET)
task_default_queue = os.environ.get("DEFAULT_QUEUE")
AWS_SQS_REGION = os.environ.get("AWS_REGION")
broker_backend = "SQS"
broker_transport_options = {
    "region": AWS_SQS_REGION,
    # 'queue_name_prefix': '%s-' % 'dev' , # os.environ.get('ENVIRONMENT', 'development'),
    'visibility_timeout': 7200,
    'polling_interval': 1,
}
accept_content = ['application/json']
result_serializer = 'json'
task_serializer = 'json'
Also, as I mentioned, I tried to configure directly from the celery file.
import os
from celery import Celery
from celery.schedules import crontab
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'MyApp.settings')
AWS_SQS_ACCESS = os.environ.get("AWS_SQS_ACCESS")
AWS_SQS_SECRET = os.environ.get("AWS_SQS_SECRET")
app = Celery('MyApp') #,, broker='sqs://%s:%s@' % (AWS_SQS_ACCESS, AWS_SQS_SECRET), backend='django-db'
# app.config_from_object('django.conf:settings') #, namespace='CELERY'
CELERY_CONFIG = {
    "CELERY_TASK_SERIALIZER": "json",
    "CELERY_ACCEPT_CONTENT": ["json"],
    "CELERY_RESULT_SERIALIZER": "json",
    "CELERY_RESULT_BACKEND": None,
    "CELERY_TIMEZONE": "America/Sao_Paulo",
    "CELERY_ENABLE_UTC": True,
    "CELERY_ENABLE_REMOTE_CONTROL": False,
}
BROKER_URL = 'sqs://%s:%s@' % (AWS_SQS_ACCESS, AWS_SQS_SECRET)
CELERY_CONFIG.update(
    **{
        "BROKER_URL": BROKER_URL,
        "BROKER_TRANSPORT": "sqs",
        "BROKER_TRANSPORT_OPTIONS": {
            "region": "sa-east-1",
            "visibility_timeout": 3600,
            "polling_interval": 60,
        },
    }
)
app.conf.update(**CELERY_CONFIG)
app.autodiscover_tasks()
During deployment on Elastic Beanstalk, in the service I am running the command:
$PYTHONPATH/celery -A celery worker -Q default-dev -n default-worker \
--logfile=/var/log/celery/celery-stdout-error.log --loglevel=DEBUG --concurrency=1
I tried to run this before:
$PYTHONPATH/celery -A MyApp worker -Q default-dev -n default-worker \
--logfile=/var/log/celery/celery-stdout-error.log --loglevel=DEBUG --concurrency=1
But I got the error: celery "unable to load app MyApp".
In the log file I get the following error:
[2022-06-10 15:58:25,678: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
Trying again in 6.00 seconds... (3/100)
My celery version is 5.2.7
If I understand the configuration docs correctly, we're supposed to set config as follows:
app.conf.broker_url = BROKER_URL
# or, alternatively
app.conf.update(broker_url=BROKER_URL)
i.e., use the lowercase names instead of the uppercase names.
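For example, the uppercase dict from the question could be expressed with the lowercase keys and applied directly to the app, roughly like this (values copied from the question; treat it as a sketch):

app.conf.update(
    broker_url='sqs://%s:%s@' % (AWS_SQS_ACCESS, AWS_SQS_SECRET),
    broker_transport_options={
        'region': 'sa-east-1',
        'visibility_timeout': 3600,
        'polling_interval': 60,
    },
    task_serializer='json',
    accept_content=['json'],
    result_serializer='json',
    result_backend=None,
    timezone='America/Sao_Paulo',
    enable_utc=True,
)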
If you face the same issue, just try restarting your Celery services.

Celery sees my task but does not execute it

I'm trying to run a Celery chain locally, with a Redis broker.
The problem: as you can see below the worker command, tasks are not run. Nothing happens.
Find below all the information needed.
The redis-server is running
redis-server
[15048] 05 Feb 11:36:30 * Server started, Redis version 2.4.5
[15048] 05 Feb 11:36:30 * DB loaded from disk: 0 seconds
[15048] 05 Feb 11:36:30 * The server is now ready to accept connections on port 6379
...
The celery beat is also running
celery beat -b redis://localhost:6379/0
celery beat v4.2.1 (windowlicker) is starting.
__ - ... __ - _
LocalTime -> 2020-02-05 11:37:30
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.default.Loader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%WARNING
. maxinterval -> 5.00 minutes (300s)
But when I run the worker, it can see the tasks, but does not execute them:
celery -A module.tasks worker --loglevel=info -Q training_queue -P eventlet
[tasks]
. module.tasks.generate_the_chain
. module.tasks.post_pipeline_message
. module.tasks.train_by_site_post
[2020-02-05 11:39:14,700: INFO/MainProcess] Connected to redis://localhost:6379/0
[2020-02-05 11:39:16,746: INFO/MainProcess] mingle: searching for neighbors
[2020-02-05 11:39:20,804: INFO/MainProcess] mingle: all alone
[2020-02-05 11:39:21,836: INFO/MainProcess] pidbox: Connected to redis://localhost:6379/0.
[2020-02-05 11:39:22,850: INFO/MainProcess] celery@HOST ready.
Nothing happens after this line.
BELOW is the code:
__init__.py
import os
from celery import Celery
CELERY_APP_NAME = 'app_name'
CELERY_BROKER = 'redis://localhost:6379/0'
CELERY_BACKEND = CELERY_BROKER
CELERY = Celery(CELERY_APP_NAME,
                backend=CELERY_BACKEND,
                broker=CELERY_BROKER,
                include=['module.tasks'])
And tasks.py
import os
import logging
import time
from celery import Celery, chain, signature
from celery.schedules import crontab
from module import CELERY
@CELERY.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    site_id = "f1ae"
    sign = signature(
        'module.tasks.generate_the_chain',
        args=(site_id),
        queue='training_queue'
    )
    sender.add_periodic_task(5.0, sign, name='training')

@CELERY.task
def generate_the_chain(site_id):
    print("Hello World")
    """
    This function run the chain of functions and is then
    scheduled with Celery beat
    """
    chain = signature(
        'module.tasks.site_post',
        args=(site_id),
        queue='training_queue'
    )
    chain |= signature(
        'module.tasks.post_pipeline_message',
        args=(),
        queue='training_queue'
    )
    chain.apply_async()

@CELERY.task
def site_post(site_id):
    print(f"Hello {site_id}")
    return True

@CELERY.task
def post_pipeline_message(success):
    if success is True:
        LOGGER.info("Pipeline build with success")
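One detail worth flagging in the snippets above: args=(site_id) is just a parenthesised string, not a one-element tuple, so the signature does not carry the arguments you might expect. A sketch with an explicit tuple (same task name as above):

sign = signature(
    'module.tasks.generate_the_chain',
    args=(site_id,),   # note the trailing comma: a 1-tuple, not a plain string
    queue='training_queue'
)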

Celery task not working in Django framework

I wrote code to send_email five times to a user as an asynchronous task, using Celery with a Redis broker in the Django framework. My Celery server is working and responding to the Celery CLI; it even receives the task from Django, but after that I get an error like:
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
task.py -
from celery.decorators import task
from django.core.mail import EmailMessage
import time
#task(name="Sending_Emails")
def send_email(to_email,message):
time1 = 1
while(time1 != 5):
print("Sending Email")
email = EmailMessage('Checking Asynchronous Task', message+str(time1), to=[to_email])
email.send()
time.sleep(1)
time1 += 1
views.py -
print("sending for Queue")
send_email.delay(request.user.email,"Email sent : ")
print("sent for Queue")
settings.py -
# CELERY STUFF
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/India'
celery.py -
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ECartApplication.settings')
app = Celery('ECartApplication')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I expect the email to be sent 5 times, but instead I get this error:
[tasks]
. ECartApplication.celery.debug_task
. Sending_Emails
[2019-05-19 12:41:27,695: INFO/SpawnPoolWorker-2] child process 3628 calling self.run()
[2019-05-19 12:41:27,696: INFO/SpawnPoolWorker-1] child process 5748 calling self.run()
[2019-05-19 12:41:28,560: INFO/MainProcess] Connected to redis://localhost:6379//
[2019-05-19 12:41:30,599: INFO/MainProcess] mingle: searching for neighbors
[2019-05-19 12:41:35,035: INFO/MainProcess] mingle: all alone
[2019-05-19 12:41:39,069: WARNING/MainProcess] c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2019-05-19 12:41:39,070: INFO/MainProcess] celery@vipin-PC ready.
[2019-05-19 12:41:46,448: INFO/MainProcess] Received task: Sending_Emails[db10dad4-a8ec-4ad2-98a6-60e8c3183dd1]
[2019-05-19 12:41:47,455: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
This is an issue when you are running Python on Windows 7/10.
There is a workaround: you just need to use the eventlet module, which you can install with pip:
pip install eventlet
After that, execute your worker with -P eventlet at the end of the command:
celery -A MyWorker worker -l info -P eventlet
The command below also works on Windows 11:
celery -A core worker --pool=solo -l info

celery task routes not working as expected

I am practicing Celery and I want to assign my task to a specific queue; however, it does not work as expected.
My __init__.py
import os
import sys
from celery import Celery
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(CURRENT_DIR)
app = Celery()
app.config_from_object('celery_config')
My celery_config.py
amqp = 'amqp://guest:guest@localhost:5672//'
broker_url = amqp
result_backend = amqp
task_routes = ([
    ('import_feed', {'queue': 'queue_import_feed'})
])
My tasks.py
from . import app
@app.task(name='import_feed')
def import_feed():
    pass
How I run my worker:
celery -A subscriber1.tasks worker -l info
My client's __init__.py :
import os
import sys
from celery import Celery
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(CURRENT_DIR)
app = Celery()
app.config_from_object('celery_config')
My client's celery_config.py:
from kombu.common import Broadcast
amqp = 'amqp://guest:guest@localhost:5672//'
BROKER_URL = amqp
CELERY_RESULT_BACKEND = amqp
Then in my client's shell I tried:
from publisher import app
result = app.send_task('import_feed')
Then my worker got the task, which I did not expect, since I assigned it to a specific queue. I then tried the command below in my client, and no task was received by my worker, even though this is the case where I expected it to receive one:
result = app.send_task('import_feed', queue='queue_import_feed')
It seems like I misunderstood something in the routing part. What I really want is for the import_feed task to run only if the queue_import_feed queue is specified when sending a task.
You can change the default queue that the worker processes.
app.send_task('import_feed') sends the task to the default celery queue.
app.send_task('import_feed', queue='queue_import_feed') sends the task to queue_import_feed, but your worker is only consuming tasks from the celery queue.
To make the worker consume specific queues, use the -Q switch:
celery -A subscriber1.tasks worker -l info -Q 'queue_import_feed'
Edit
In order to place a restriction on send_task such that a worker reacts to the import_feed task only when it is published with an explicit queue, you need to override send_task on Celery and also provide a custom AMQP class with default_queue set to None.
reactor.py
from celery.app.amqp import AMQP
from celery import Celery

class MyCelery(Celery):
    def send_task(self, name=None, args=None, kwargs=None, **options):
        if 'queue' in options:
            return super(MyCelery, self).send_task(name, args, kwargs, **options)

class MyAMQP(AMQP):
    default_queue = None
celery_config.py
from kombu import Exchange, Queue
...
task_exchange = Exchange('default', type='direct')
task_create_missing_queues = False
task_queues = [
    Queue('feed_queue', task_exchange, routing_key='feeds'),
]
task_routes = {
    'import_feed': {'queue': 'feed_queue', 'routing_key': 'feeds'}
}
__init__.py
celeree = MyCelery(amqp='reactor.MyAMQP')
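With that in place, a quick usage sketch (names taken from the snippets above): send_task only publishes when a queue is passed; otherwise the overridden method simply returns None and nothing is sent.

celeree.send_task('import_feed', queue='feed_queue')  # routed and published to feed_queue
celeree.send_task('import_feed')                      # no queue given: nothing is published, returns None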

Celery events specific to a queue

I have two Django projects, each with a Celery app:
- fooproj.celery_app
- barproj.celery_app
Each app is running its own Celery worker:
celery worker -A fooproj.celery_app -l info -E -Q foo_queue
celery worker -A barproj.celery_app -l info -E -Q bar_queue
Here's how I am configuring my Celery apps:
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings.local')
app = Celery('celery_app', broker=settings.BROKER_URL)
app.conf.update(
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
    CELERY_SEND_EVENTS=True,
    CELERY_DEFAULT_QUEUE=settings.CELERY_DEFAULT_QUEUE,
    CELERY_DEFAULT_EXCHANGE=settings.CELERY_DEFAULT_EXCHANGE,
    CELERY_DEFAULT_ROUTING_KEY=settings.CELERY_DEFAULT_ROUTING_KEY,
    CELERY_DEFAULT_EXCHANGE_TYPE='direct',
    CELERY_ROUTES=('proj.celeryrouters.MainRouter', ),
    CELERY_IMPORTS=(
        'apps.qux.tasks',
        'apps.lorem.tasks',
        'apps.ipsum.tasks',
        'apps.sit.tasks'
    ),
)
My router class:
from django.conf import settings
class MainRouter(object):
    """
    Routes Celery tasks to a proper exchange and queue
    """
    def route_for_task(self, task, args=None, kwargs=None):
        return {
            'exchange': settings.CELERY_DEFAULT_EXCHANGE,
            'exchange_type': 'direct',
            'queue': settings.CELERY_DEFAULT_QUEUE,
            'routing_key': settings.CELERY_DEFAULT_ROUTING_KEY,
        }
fooproj has settings:
BROKER_URL = 'redis://localhost:6379/0'
CELERY_DEFAULT_EXCHANGE = 'foo_exchange'
CELERY_DEFAULT_QUEUE = 'foo_queue'
CELERY_DEFAULT_ROUTING_KEY = 'foo_routing_key'
barproj has settings:
BROKER_URL = 'redis://localhost:6379/1'
CELERY_DEFAULT_EXCHANGE = 'foo_exchange'
CELERY_DEFAULT_QUEUE = 'foo_queue'
CELERY_DEFAULT_ROUTING_KEY = 'foo_routing_key'
As you can see, both projects use their own Redis database as a broker, their own MySQL database as a result backend, their own exchange, queue and routing key.
I am trying to have two Celery events processes running, one for each app:
celery events -A fooproj.celery_app -l info -c djcelery.snapshot.Camera
celery events -A barproj.celery_app -l info -c djcelery.snapshot.Camera
The problem is, both celery events processes are picking up tasks from all of my Celery workers! So in the fooproj database, I can see task results from barproj database.
Any idea how to solve this problem?
From http://celery.readthedocs.org/en/latest/getting-started/brokers/redis.html:
Monitoring events (as used by flower and other tools) are global and
is not affected by the virtual host setting.
This is caused by a limitation in Redis. The Redis PUB/SUB channels are global and not affected by the database number.
This seems to be one of Redis' caveats :(
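One possible workaround, assuming you can afford to run a second Redis instance, is to give each project its own Redis server (a different port) rather than just a different database number, so that the PUB/SUB channels carrying events are physically separate; a sketch of the two settings files:

# fooproj settings (sketch): its own Redis instance
BROKER_URL = 'redis://localhost:6379/0'

# barproj settings (sketch): a separate Redis instance on another port
BROKER_URL = 'redis://localhost:6380/0'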
