Question on Flask and Celery -
__init__.py
from celery import Celery
from flask import Flask

app = Flask(__name__)
app.config['CELERY_RESULT_BACKEND'] = 'redis://localhost'
app.config['CELERY_BROKER_URL'] = 'redis://localhost'
celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)
training.py
from flask import Blueprint, render_template
from flask_login import login_required  # assuming Flask-Login provides login_required
from project import celery

training = Blueprint('training', __name__)

@training.route('/projectdetails/training', methods=["GET", "POST"])
@login_required
def start_training():
    train_test.delay()
    return render_template('test.html')

@celery.task()
def train_test():
    # a ML training task.
    ...
I have my Redis server running, and my Celery worker is started with celery -A myproject.celery worker --loglevel=info.
This is the error I keep getting -
[2021-04-01 17:25:11,604: ERROR/MainProcess] Process 'ForkPoolWorker-8' pid:67580 exited with 'exitcode 2'
[2021-04-01 17:25:11,619: ERROR/MainProcess] Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 2.')
Traceback (most recent call last):
File "/Users/rohankamath/Desktop/lol/env/lib/python3.7/site-packages/billiard/pool.py", line 1267, in mark_as_worker_lost
human_status(exitcode)),
billiard.exceptions.WorkerLostError: Worker exited prematurely: exitcode 2.
I tried searching for the meaning of exit code 2, but couldn't find anything.
This has happened to me when there was an import error in my code. Could you try running it without Celery (calling the task function directly) and see if it works?
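For instance, a quick way to surface such an import error is to call the task function synchronously, with no worker or broker involved (a minimal sketch; it assumes training.py lives in the project package from the question):
# Run this in a plain Python shell: any ImportError or other exception in
# the task code shows up immediately instead of killing the forked worker
# process with exitcode 2.
from project.training import train_test  # assumed module path

train_test()          # direct, synchronous call
# train_test.apply()  # same thing, but routed through Celery's local execution path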
Related
I wrote code to send an email to a user 5 times as an asynchronous task, using Celery with the Redis broker in the Django framework. My Celery server is working and responding to the Celery CLI, and it even receives the task from Django, but after that I get an error like:
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
task.py -
from celery.decorators import task
from django.core.mail import EmailMessage
import time
@task(name="Sending_Emails")
def send_email(to_email, message):
    time1 = 1
    while time1 != 5:
        print("Sending Email")
        email = EmailMessage('Checking Asynchronous Task', message + str(time1), to=[to_email])
        email.send()
        time.sleep(1)
        time1 += 1
views.py -
print("sending for Queue")
send_email.delay(request.user.email,"Email sent : ")
print("sent for Queue")
settings.py -
# CELERY STUFF
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/India'
celery.py -
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ECartApplication.settings')
app = Celery('ECartApplication')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I expect the email to be sent 5 times, but I am getting this error:
[tasks]
. ECartApplication.celery.debug_task
. Sending_Emails
[2019-05-19 12:41:27,695: INFO/SpawnPoolWorker-2] child process 3628 calling self.run()
[2019-05-19 12:41:27,696: INFO/SpawnPoolWorker-1] child process 5748 calling self.run()
[2019-05-19 12:41:28,560: INFO/MainProcess] Connected to redis://localhost:6379//
[2019-05-19 12:41:30,599: INFO/MainProcess] mingle: searching for neighbors
[2019-05-19 12:41:35,035: INFO/MainProcess] mingle: all alone
[2019-05-19 12:41:39,069: WARNING/MainProcess] c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2019-05-19 12:41:39,070: INFO/MainProcess] celery@vipin-PC ready.
[2019-05-19 12:41:46,448: INFO/MainProcess] Received task: Sending_Emails[db10dad4-a8ec-4ad2-98a6-60e8c3183dd1]
[2019-05-19 12:41:47,455: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\users\vipin\appdata\local\programs\python\python37-32\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
This is an issue when you are running Python on Windows 7/10.
There is a workaround: you just need to use the eventlet module, which you can install with pip:
pip install eventlet
After that, execute your worker with -P eventlet at the end of the command:
celery -A MyWorker worker -l info -P eventlet
The command below also works on Windows 11:
celery -A core worker --pool=solo -l info
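As an alternative to passing -P on the command line, the pool can also be selected in the app configuration. A minimal sketch, assuming a Celery app object named app (worker_pool accepts values such as 'solo', 'eventlet', or 'prefork'):
from celery import Celery

app = Celery('MyWorker', broker='redis://localhost:6379')

# Use a single-threaded pool so the worker avoids the Windows fork/spawn
# problems that break the default prefork pool.
app.conf.worker_pool = 'solo'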
I am trying to establish a periodic task using Celery (4.2.0) and RabbitMQ (3.7.14), running with Python 3.7.2 on an Azure VM with Ubuntu 16.04. I am able to start the beat and the worker and see the message get kicked off from beat to the worker, but at that point I'm met with an error like so:
[2019-03-29 21:35:00,081: ERROR/MainProcess] Received unregistered task of type 'facebook-call.facebook_api'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
My code is as follows:
from celery import Celery
from celery.schedules import crontab
app = Celery('facebook-call', broker='amqp://localhost//')
@app.task
def facebook_api():
    {function here}

app.conf.beat.schedule = {
    'task': 'facebook-call.facebook_api',
    'schedule': crontab(hour=0, minute=0, day='0-6'),
}
I am starting the beat and worker processes using the name of the Python file that contains all of the code:
celery -A FacebookAPICall beat --loglevel=info
celery -A FacebookAPICall worker --loglevel=info
Again, the beat process starts and I can see the message being successfully passed to the worker but cannot figure out how to "register" the task so that it is processed by the worker.
I was able to resolve the issue by renaming the app from facebook-call to coincide with the name of the file FacebookAPICall
Before:
app = Celery('facebook-call', broker='amqp://localhost//')
After:
app = Celery('FacebookAPICall', broker='amqp://localhost//')
From reading the Celery documentation, I don't totally understand why the name of the app must also be the name of the .py file but that seems to do the trick.
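Another option that sidesteps the module/app naming question is to give the task an explicit name, so the string that beat sends always matches what the worker registers, regardless of the file name. A minimal sketch (the 'facebook-api-daily' entry key is just an illustrative name, and it uses the standard beat_schedule setting rather than the snippet above):
from celery import Celery
from celery.schedules import crontab

app = Celery('FacebookAPICall', broker='amqp://localhost//')

# Register the task under a fixed name instead of relying on the
# module-derived default, so beat and the worker always agree.
@app.task(name='facebook-call.facebook_api')
def facebook_api():
    pass

app.conf.beat_schedule = {
    'facebook-api-daily': {
        'task': 'facebook-call.facebook_api',
        'schedule': crontab(hour=0, minute=0),
    },
}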
I'm trying to have the worker run on Python 2 and the client on Python 3.
I notice that if I launch the client from Python 3, the worker code also seems to run on Python 3 instead of Python 2.
How can I resolve this? My worker code is written for Python 2, and the client that uses it has to run on Python 3.
This is how I've set it up.
VirtualEnv1: Python2.7.12 (tasks.py)
from celery import Celery
import time
app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')
app.conf.update(
task_serializer='json',
accept_content=['json'], # Ignore other content
result_serializer='json',
timezone='Europe/Oslo',
enable_utc=True,
)
@app.task
def add(x, y):
    time.sleep(5)
    print "Trying to process task"  # This can run only on Python2, and not Python3
    return x + y
Execution command:
celery -A tasks worker --loglevel=info -c 1
VirtualEnv2: Python3.5.2 (client.py)
from celery import Celery
from tasks import add
import time
app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')
app.conf.update(
task_serializer='json',
accept_content=['json'], # Ignore other content
result_serializer='json',
timezone='Europe/Oslo',
enable_utc=True,
)
result = add.delay(4, 8)
result.get()
And this is the error I get upon executing client.py (in python3)
Traceback (most recent call last):
File "client.py", line 2, in <module>
from tasks import add
File "/home/vishal/work/yobi/expr/tasks.py", line 17
print "Trying to process task"
Isn't this unexpected? I would have guessed that the worker code should run on Python2, and not Python3.
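One way to keep the Python 2-only worker code out of the Python 3 client is to avoid importing tasks.py at all and send the task by name instead. A minimal sketch using Celery's send_task, assuming the same Redis broker and backend as above:
from celery import Celery

# Python 3 client: no "from tasks import add", so the Python 2-only module
# is never parsed by the Python 3 interpreter.
app = Celery('tasks', backend='redis://localhost', broker='redis://localhost')

result = app.send_task('tasks.add', args=(4, 8))
print(result.get(timeout=30))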
I've been trying to run some tests with Celery, RabbitMQ, and Django.
The problem is that once I have spawned a worker, I use Django to send a task via .delay(), which is received by the worker, but from what it looks like, the worker does not find a task of that name to execute. This is strange, since a task with the exact same name is in the list of tasks the worker is supposed to be able to execute.
Here it is:
-> settings.py
BROKER_HOST = "127.0.0.1" #IP address of the server running RabbitMQ and Celery
BROKER_PORT = 5672
BROKER_URL = 'amqp://'
CELERY_IMPORTS = ('notify')
CELERY_IGNORE_RESULT = False
CELERY_RESULT_BACKEND = "amqp"
CELERY_IMPORTS = ("notify")
-> __init__.py (under notify module)
import celery
from celery import app as celery_app
from notify.user.tasks import send_email
-> tasks.py (under notify module)
import os
from celery import Celery, task
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.config_from_object('app.settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=False)
@task()
def send_email(recepient, title, subject):
    print('sending email')
And here is how I spawn my worker at the root of my project:
celery worker -n worker_one -l debug -A notify
Here is the error I get when the worker receives notification of the task I sent:
The full contents of the message body was:
'[["Guillaume", "One title", "This is my subject"], {}, {"chain": null, "callbacks": null, "errbacks": null, "chord": null}]' (123b)
Traceback (most recent call last):
File "/Users/guillaumeboufflers/.pyenv/versions/smartbase/lib/python3.5/site-packages/celery/worker/consumer/consumer.py", line 561, in on_task_received
strategy = strategies[type_]
KeyError: 'notify.user.tasks.send_email'
which is weird, because the task does appear in the worker's registered task list:
[tasks]
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. notify.user.tasks.send_email
Thanks guys for helping..
Whenever I run the celery worker, I get a warning:
./manage.py celery worker -l info --concurrency=8
and if I ignore this warning, my celery worker does not receive the celery beat tasks.
After googling, I also changed the worker name; this time I am not getting the warning, but the celery worker is still not receiving the celery beat scheduled tasks.
I have checked the celery beat logs, and celery beat is scheduling the tasks on time.
I have also checked celery flower, and it shows two workers: the first worker receives the tasks but does not execute them. How do I send all tasks to the second worker, or how can I disable the first kombu worker? What django-celery setting am I missing?
My django settings.py
RABBITMQ_USERNAME = "guest"
RABBITMQ_PASSWORD = "guest"
BROKER_URL = 'amqp://%s:%s@localhost:5672//' % (RABBITMQ_USERNAME,
                                                RABBITMQ_PASSWORD)
CELERY_DEFAULT_QUEUE = 'default'
CELERY_DEFAULT_EXCHANGE = 'default'
CELERY_DEFAULT_ROUTING_KEY = 'default'
CELERY_IGNORE_RESULT = True
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
celery_enable_utc=True
import djcelery
djcelery.setup_loader()
You have only started the worker. For a task to be executed, you must call it, for example with the your_task.delay() function.
For example, open another terminal, go into your project, and run python manage.py shell. Once inside your Django project's shell, import your task and call your_task.delay().
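A minimal sketch of that shell session (myapp.tasks and my_task are placeholders; substitute whatever your project defines):
# In a second terminal, from the project root:  python manage.py shell

# Placeholder module and task name; import whatever task your project defines.
from myapp.tasks import my_task

# Queue the task; the running worker should pick it up and execute it.
result = my_task.delay()
print(result.id)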
The following link has an example of Celery code with a RabbitMQ broker; I advise you to study it:
https://github.com/celery/celery/tree/master/examples/django