Celery KeyError: 'myproject.tasks.async_task'

When I try to run a Celery background task, it gives me this error: KeyError: 'myproject.tasks.async_task'
I'm using Python 3.8.10, Django 4.1.2, Celery 5.2.7 and RabbitMQ 3.8.2.
The screenshot below shows my project structure:
Below is my code for the background task:
settings.py:
CELERY_BROKER_URL = 'amqp://localhost:5672'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
myproject/__init__.py:
from __future__ import absolute_import, unicode_literals
from myproject.celery import app as celery_app
__all__ = ['celery_app']
myproject/celery.py:
from __future__ import absolute_import
import os, sys
from celery import Celery
import django
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
myproject/tasks.py:
from __future__ import absolute_import
from celery import shared_task
from time import sleep
import os


@shared_task
def sleepy(duration):
    sleep(duration)
    return None


@shared_task
def async_task(save_path, name_of_file):
    sleep(30)
    completeName = os.path.join(save_path, name_of_file + ".html")
    file1 = open(completeName, "w")
    toFile = 'test data'
    file1.write(toFile)
    file1.close()
    return 'task complete'
views.py:
from django.http import HttpResponse
from django.shortcuts import render


def add_product(request):
    if request.method == 'POST':
        id_col_data = request.POST.getlist('services')
        target_col = request.POST.get('target_col')
        file_name = request.POST.get('file_name')
        amz_columns_dict = {'id_col': id_col_data,
                            'target_col': target_col,
                            'wt_col': None}
        import os.path
        save_path = '/home/satyajit/Desktop/'
        if os.path.exists(save_path):
            try:
                from myproject.tasks import async_task
                name_of_file = file_name
                status = async_task.delay(save_path, name_of_file)  # celery task
                print('status---->', status)
            except Exception as e:
                print('task error is ------>', e)
                return render(request, 'data/error.html', {'message': 'async task error'})
        else:
            print('error occurred')
        return HttpResponse('product added successfully')
    return render(request, 'add_product.html')
When I run the Celery worker with celery -A myproject worker -l info to check that the Celery process works, I get the traceback below.
The full traceback details are in the screenshot below:

Related

Celery task hangs after calling .delay() in Django

While calling the .delay() method of an imported task from a Django application, the process gets stuck and the request never completes.
We also don't get any error on the console.
Setting up a set_trace() with pdb results in the same thing.
The following questions were reviewed, but they didn't help resolve the issue:
Calling celery task hangs for delay and apply_async
celery .delay hangs (recent, not an auth problem)
E.g.:
backend/settings.py
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", RABBIT_URL)
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BROKER", RABBIT_URL)
backend/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')
app = Celery('backend')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
app/tasks.py
import time
from celery import shared_task
@shared_task
def upload_file(request_id):
    time.sleep(request_id)
    return True
app/views.py
from rest_framework.views import APIView
from .tasks import upload_file
class UploadCreateAPIView(APIView):
    # other methods...

    def post(self, request, *args, **kwargs):
        id = request.data.get("id", None)
        # business logic ...
        print("Going to submit task.")
        import pdb; pdb.set_trace()
        upload_file.delay(id)  # <- this hangs the runserver as well as the set_trace()
        print("Submitted task.")
The issue was with the setup of the celery application with Django. We need to make sure that the celery app is imported and initialized in the following file:
backend/__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
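As a quick sanity check (a sketch that is not part of the original answer; the module names backend.celery and app.tasks are assumed from the layout above), the shared task should now appear in the app's task registry when run from python manage.py shell:
from backend.celery import app as celery_app
import app.tasks  # importing the module lets @shared_task register against celery_app

# accessing .tasks finalizes the app; the list should include 'app.tasks.upload_file'
print([name for name in celery_app.tasks if not name.startswith('celery.')])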
I've run into this issue where Celery calls through delay or apply_async may randomly hang the program indefinitely. I tried all the broker_transport_options and retry_policy options to let Celery recover, but it still happens. Then I found this solution, which enforces an execution time limit on a block/function by using the underlying Python signal handlers.
import signal
from contextlib import contextmanager


class TimeoutException(Exception):
    pass


@contextmanager
def time_limit(seconds):
    # SIGALRM is Unix-only and must be set from the main thread
    def signal_handler(signum, frame):
        raise TimeoutException("Timed out!")
    signal.signal(signal.SIGALRM, signal_handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)


def my_function():
    with time_limit(3):
        celery_call.apply_async(kwargs={"k1": "v1"}, expires=30)

Cannot call tasks gives module not found celery error

I have the following Celery implementation, and this is not a Django project.
Project structure
proj
└── proj
    ├── celery.py
    ├── conf.py
    ├── __init__.py
    ├── run.py
    └── tasks.py
conf.py
broker_url = "redis://localhost:6379/0"
result_backend = "redis://localhost:6379/0"
task_serializer = "json"
timezone = "Europe/London"
enable_utc = True
include = ['proj.tasks']
beat_schedule = {
    'add-every-2-seconds': {
        'task': 'proj.tasks.test',
        'schedule': 2.0,
        'args': (16, 16)
    },
}
run.py
from proj.tasks import add
res = add.delay(2, 2)
print(res.get())
tasks.py
from proj.celery import app


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(2.0, test.s('hello'), name='add every 10')

    # Calls test('world') every 30 seconds
    sender.add_periodic_task(3.0, test.s('world'), expires=10)


@app.task
def test(arg):
    print(arg)


@app.task
def add(x, y):
    return x + y


@app.task
def mul(x, y):
    return x * y


@app.task
def xsum(numbers):
    return sum(numbers)
When following the tutorial, celery.py:
from __future__ import absolute_import, unicode_literals
from celery import Celery
from proj import conf
app = Celery('proj')
app.config_from_object(conf)
# Optional configuration, see the application user guide.
app.conf.update(
    result_expires=3600,
)

if __name__ == '__main__':
    app.start()
Running celery -A proj worker -l info from the proj directory works, though.
When running run.py I get this error:
from celery import Celery
ImportError: cannot import name 'Celery'
Renaming celery.py to another name makes celery -A proj worker -l info fail with:
Error:
Unable to load celery application.
Module 'proj' has no attribute 'celery'
How to fix this issue?

Django celery trigger manage.py cmd @periodic_task

I want to run a manage.py command from Celery as a periodic task every X minutes, but every time I try to accomplish that as shown below I get the following error:
[2019-01-17 01:36:00,006: INFO/MainProcess] Received task: Delete unused media file(s)[3dd2b93b-e32a-4736-8b24-028b9ad8da35]
[2019-01-17 01:36:00,007: WARNING/ForkPoolWorker-3] Scanning for unused media files
[2019-01-17 01:36:00,008: WARNING/ForkPoolWorker-3] Unknown command: 'cleanup_unused_media --noinput --remove-empty-dirs'
[2019-01-17 01:36:00,008: INFO/ForkPoolWorker-3] Task Delete unused media file(s)[3dd2b93b-e32a-4736-8b24-028b9ad8da35] succeeded in 0.0008139749998008483s: None
tasks.py
from celery import Celery
from celery.schedules import crontab
from celery.task import periodic_task
from celery.utils.log import get_task_logger
import requests
from django.core import management
logger = get_task_logger(__name__)
app = Celery('tasks', broker='redis://127.0.0.1')
...
@periodic_task(run_every=(crontab(minute='*/90')), name="Delete unused media file(s)", ignore_result=True)
def delete_unused_media():
    try:
        print("Scanning for unused media files")
        management.call_command('cleanup_unused_media --noinput --remove-empty-dirs')
        return "success, old media files have been deleted"
    except Exception as e:
        print(e)
That management command comes from the following project:
https://github.com/akolpakov/django-unused-media
Is my management call simply wrong, or what's the deal?
Thanks in advance.
UPDATE (Celery Config):
settings.py
INSTALLED_APPS = [
    ...
    'celery',
    'django_unused_media',
    ...
]

# Celery Settings:
BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
...
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": [
            "redis://127.0.0.1:6379/0",
            # "redis://127.0.0.1:6379/1",
        ],
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "SOCKET_CONNECT_TIMEOUT": 15,  # in seconds
            "SOCKET_TIMEOUT": 15,  # in seconds
            "COMPRESSOR": "django_redis.compressors.zlib.ZlibCompressor",
            "CONNECTION_POOL_KWARGS": {"max_connections": 1000, "retry_on_timeout": True}
        }
    }
}
celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'MyProject.settings')
app = Celery('MyProject')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
__init__.py
from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
E.g. this task is working fine:
@periodic_task(run_every=(crontab(minute='*/1')), name="Get exchange rate(s)", ignore_result=True)
def get_exchange_rate():
    api_url = "https://api.coinmarketcap.com/v1/ticker/?limit=1"
    try:
        exchange_rate = requests.get(api_url).json()
        logger.info("BTC Exchange rate updated successfully.")
    except Exception as e:
        print(e)
        exchange_rate = dict()
    return exchange_rate
The issue is the way you're calling call_command. call_command takes the name of the command as its first argument, followed by the arguments passed positionally. You're passing the whole lot as a single string. Try changing it to:
management.call_command('cleanup_unused_media', '--noinput', '--remove-empty-dirs')
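For context, a sketch of the task from the question with only that line changed (same imports and decorator as above, untested):
@periodic_task(run_every=(crontab(minute='*/90')), name="Delete unused media file(s)", ignore_result=True)
def delete_unused_media():
    try:
        print("Scanning for unused media files")
        # each command-line token is passed as a separate positional argument
        management.call_command('cleanup_unused_media', '--noinput', '--remove-empty-dirs')
        return "success, old media files have been deleted"
    except Exception as e:
        print(e)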
For anyone using Django 1.7+, it seems that simply importing the settings module is not enough.
I got it working like this:
from celery import Celery
from celery.schedules import crontab
from celery.task import periodic_task
from celery.utils.log import get_task_logger
import requests, os, django
from django.core import management
logger = get_task_logger(__name__)
app = Celery('tasks', broker='redis://127.0.0.1')
@periodic_task(run_every=(crontab(minute='*/1')), name="Delete unused media file(s)", ignore_result=True)
def delete_unused_media():
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "MyProject.settings")
    django.setup()
    try:
        print("Scanning for unused media files")
        management.call_command('cleanup_unused_media', '--noinput', '--remove-empty-dirs')
        return "success, old media files have been deleted"
    except Exception as e:
        print(e)

Celery task not run at the specified time

I created a Celery task that should start each hour at the 0th minute, but it does not run. What am I doing wrong?
celery
from __future__ import absolute_import, unicode_literals
import os
import pytz
from celery import Celery
from datetime import datetime
from celery.schedules import crontab
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app.settings')
app = Celery('app', broker='amqp://rabbit:5672')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(crontab(minute=0, hour='0,1,2,3,4,5,6,7,8,9,10,11,12, \
                                                     13,14,15,16,17,18,19,20,21,22,23,24'),
                             task.s())


@app.task
def task():
    # any code
    pass
In the terminal, I see this info, but the task is not running
[2018-05-09 19:05:16,275: INFO/Beat] beat: Starting...
Try changing your task code to something like this:
from celery.task import periodic_task
from celery.schedules import crontab
@periodic_task(
    run_every=(crontab(minute='*/60')),
    name="task_name")
def run_some_task():
    '''some code'''
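As a hedged alternative (not from the original answer, reusing the app and task from the question): crontab(minute=0) with the default hour='*' already means "minute 0 of every hour", and note that 24 is not a valid hour value for crontab:
from celery.schedules import crontab


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # minute 0 of every hour; hour defaults to '*', so no hour list is needed
    sender.add_periodic_task(crontab(minute=0), task.s(), name='hourly task')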

Celery tasks in Django are always blocking

I have the following setup in my django settings:
CELERY_TASK_RESULT_EXPIRES = timedelta(minutes=30)
CELERY_CHORD_PROPAGATES = True
CELERY_ACCEPT_CONTENT = ['json', 'msgpack', 'yaml']
CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
BROKER_URL = 'django://'
CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'
I've included this under my installed apps:
'djcelery',
'kombu.transport.django'
My project structure is (django 1.5)
proj
|_proj
    __init__.py
    celery.py
|_apps
    |_myapp1
        |_models.py
        |_tasks.py
This is my celery.py file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings.dev')
app = Celery('proj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, related_name='tasks')
In the main __init__.py I have:
from __future__ import absolute_import
from .celery import app as celery_app
And finally in myapp1/tasks.py I define my task:
@task()
def retrieve():
    # Do my stuff
    pass
Now, if I launch a Django interactive shell and launch the retrieve task:
result = retrieve.delay()
it always seems to be a blocking call, meaning that the prompt is blocked until the function returns. The result status is SUCCESS and the function actually performs the operations, BUT it seems not to be async. What am I missing?
It seems like CELERY_ALWAYS_EAGER causes this:
if this is True, all tasks will be executed locally by blocking until
the task returns. apply_async() and Task.delay() will return an
EagerResult instance, which emulates the API and behavior of
AsyncResult, except the result is already evaluated.
That is, tasks will be executed locally instead of being sent to the
queue.
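If the tasks should actually run asynchronously, a minimal sketch of the fix (assuming a worker is running against the configured broker) is to disable or remove the eager settings in the Django settings:
# settings.py -- with eager mode off, .delay() returns an AsyncResult immediately
# and the task is sent to the broker instead of running inline in the caller
CELERY_ALWAYS_EAGER = False
CELERY_EAGER_PROPAGATES_EXCEPTIONS = False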
