I am trying to run a task with Celery. I followed this tutorial: link
Everything was set up successfully. The problem now is that I don't know how to execute a task. I ran celeryd and it couldn't find the task.
I want to know exactly what I need to call to execute the task, and how I need to configure the task on the RabbitMQ server and in django-admin.
I cannot find any full tutorial about this.
Django by Example has a full section on using Celery with RabbitMQ. There are also free tutorials and articles on this topic:
How to install Celery on Django and Create a Periodic Task
Django Celery Part 1
Django Celery Part 2
Task definition
app/tasks.py:
from celery import shared_task

@shared_task
def add(param1, param2):
    # whatever the task should do; this example just prints
    print("task")

Task execution:
from celery import current_app

# send the task by name, so the caller does not need to import the task module
current_app.send_task("app.tasks.add", ["param1", "param2"])
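If the worker cannot find the task (as in the question), make sure a worker process is actually running and has imported app/tasks.py. A minimal way to start one, assuming your Celery application is importable from the app package, is:

celery -A app worker --loglevel=info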
This might help you get an idea of how to run Celery.
It worked fine for me.
http://www.hiddentao.com/archives/2012/01/27/processing-long-running-django-tasks-using-celery-rabbitmq-supervisord-monit/
I have a FastAPI API that is run with Uvicorn. Now I want to add a queue system, and I think Celery and Flower could be great tools for me, since my API has some endpoints that use a lot of CPU and take a few seconds to respond. However, I have a couple of questions about adding Celery:
Does Celery replace Uvicorn? Do I still need it? I cannot find any example on the website that also covers Uvicorn, and when you run Celery it doesn't seem to need it...
I have read a lot about using Celery to create a queue for FastAPI. However, you can manage a queue in FastAPI without using Celery. Which is better, and why?
Does Celery replace Uvicorn?
No. Celery is not a replacement for Uvicorn. Uvicorn is meant to run your FastAPI application; Celery will not do that for you.
I have read a lot about using Celery to create a queue for FastAPI. However, you can manage a queue in FastAPI without using Celery. Which is better, and why?
I guess you mean BackgroundTasks here, but that is not a replacement for Celery. FastAPI's BackgroundTasks are meant to execute simple tasks, not CPU-bound ones.
Answering the question: ideally, you'd start both services, Uvicorn and Celery. You can see an example of how to do it here.
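To make the split concrete, here is a minimal sketch of the two processes. The Redis broker URL, the module names, and the heavy_computation task are all illustrative assumptions, not something from the linked example:

# tasks.py -- the Celery side; assumes a local Redis broker
from celery import Celery

celery_app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@celery_app.task
def heavy_computation(n):
    # CPU-bound work runs in a worker process, not in Uvicorn
    return sum(i * i for i in range(n))

# main.py -- the FastAPI side, served by Uvicorn as usual
from fastapi import FastAPI

from tasks import heavy_computation

app = FastAPI()

@app.post("/compute/{n}")
def compute(n: int):
    result = heavy_computation.delay(n)  # enqueue and return immediately
    return {"task_id": result.id}

You would then run the two as separate processes: uvicorn main:app for the web app, and celery -A tasks worker for the queue.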
Not that it matters much here, but I'm one of the Uvicorn maintainers.
I use Celery with Python 3 and Supervisor on Ubuntu.
I've been working on a new API that fetches an image from the internet using PIL (Pillow) and saves it on a server.
The problem is that I use Celery as a scheduler: the original API returns its result in a millisecond, but when I use PIL the wait grows to almost a second.
So, as a solution, I am looking for a way to make the Celery worker run in the background.
Is it possible?
What you probably want is to daemonize your Celery worker.
If you follow the steps in Celery's "Running the worker as a daemon" documentation, you will be able to do that.
It is a bit of an involved process, but it will let the Celery worker run in the background.
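Since you already use Supervisor, one common way to do this is a supervisord program entry for the worker. This is only a sketch; the proj module name, paths, and user are assumptions you would adapt to your setup:

[program:celeryworker]
; run the Celery worker as a managed background process
command=celery -A proj worker --loglevel=INFO
directory=/path/to/your/project
user=www-data
autostart=true
autorestart=true
; give long-running tasks time to finish on shutdown
stopwaitsecs=600

Once Supervisor picks this up (supervisorctl reread && supervisorctl update), the worker runs in the background and is restarted if it dies.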
I have a follow-on / clarification question related to an older question.
I have two servers (for now). One server runs a Django web application. The other server runs pure Python scripts that are cron-scheduled data acquisition and processing jobs for the web app.
There is a use case where user activity in the web application (updating a certain field) should trigger a series of actions on the backend server. I could stick with cron, but as we scale up I can imagine running into trouble. Celery seems like a good solution, except I'm unclear how to implement it. (Yes, I did read the getting-started guide.)
I want the web application to send tasks to a specific queue, but the backend server to actually execute the work.
Assuming that both servers use the same broker URL:
Do I need to define stub tasks in Django, or can I just use the celery.send_task method?
Should I still be using django-celery?
Meanwhile, the backend server will be running Celery with the full implementation of the tasks and workers?
I decided to try it and work through any issues that came up.
On my Django server, I did not use django-celery. I installed celery and redis (via pip) and followed most of the instructions in the First Steps with Django guide:
updated the proj/proj/settings.py file to include the bare minimum of Celery configuration, such as the BROKER_URL
created the proj/proj/celery.py file, but without the task defined at the bottom
updated the proj/proj/__init__.py file as documented (a minimal sketch of these two files follows below)
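For reference, this is roughly what those two files look like; proj is the project name from the guide, and the details are an assumption based on the First Steps with Django documentation of that era:

# proj/proj/celery.py -- defines the app, but no tasks on this server
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')

# proj/proj/__init__.py -- load the app whenever Django starts
from .celery import app as celery_app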
Since the server running Django wasn't actually going to execute any Celery tasks, in the view that would trigger a task I added the following:
from proj.celery import app as celery_app

try:
    # send it to celery for backend processing
    celery_app.send_task('tasks.mytask', kwargs={'some_id': obj.id, 'another_att': obj.att}, queue='my-queue')
except Exception as err:
    print('Issue sending task to Celery')
    print(err)
The other server had the following installed: celery and redis (I used an AWS ElastiCache Redis instance for this testing).
This server had the following files:
celeryconfig.py, with all of my Celery configuration and queues defined, pointing to the same BROKER_URL as the Django server
tasks.py, with the actual code for all of my tasks (sketches of both files follow below)
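These sketches show roughly what the worker-side files could contain; the broker endpoint, queue names, and task body are illustrative assumptions:

# celeryconfig.py -- same broker as the Django server, plus the queues this worker serves
from kombu import Queue

BROKER_URL = 'redis://your-elasticache-endpoint:6379/0'
CELERY_QUEUES = (
    Queue('my-queue1'),
    Queue('my-queue2'),
)

# tasks.py -- the actual task implementations
from celery import Celery

app = Celery('tasks')
app.config_from_object('celeryconfig')

@app.task(name='tasks.mytask')
def mytask(some_id, another_att):
    # do the real backend processing here
    ...

The explicit name='tasks.mytask' matters: it has to match the string the Django server passes to send_task, since the two sides never import each other's code.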
The Celery workers were then started on this server using the standard command: celery -A tasks worker -Q my-queue1,my-queue2
For testing, the above worked. Now I just need to make Celery run in the background and tune the number of workers per queue.
If anyone has additional comments or improvements, I'd love to hear them!
I have a web app running on Heroku using Flask and SQLAlchemy. I am now wondering how I can set up a scheduled task that runs daily and does some database-related work (deleting some rows, if you need to know :)).
The Heroku documentation recommends APScheduler, but I would like to do it with Heroku Scheduler. Whichever I choose, I would like to know how to connect to my Postgres database from within this scheduled task. I could not find any example or hint for that.
Thanks for your time,
Torsten
Heroku Scheduler will run any command you throw at it. The typical approach is to create a Python script/command as part of your Flask app; you can do something similar to http://flask-script.readthedocs.org/en/latest/. Then, within the scheduler, you would schedule it with something like:
python manage.py mytask
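As a rough illustration of that pattern (the mytask command and the Entry model are made up for this example; Flask-Script is the extension linked above):

# manage.py -- a minimal Flask-Script command for Heroku Scheduler to invoke
from flask_script import Manager

from myapp import app, db          # your Flask app and its SQLAlchemy instance
from myapp.models import Entry     # hypothetical model with rows to delete

manager = Manager(app)

@manager.command
def mytask():
    # the Manager binds the command to your app, so SQLAlchemy
    # uses the same Postgres settings as your web dynos
    Entry.query.filter(Entry.expired == True).delete()
    db.session.commit()

if __name__ == '__main__':
    manager.run()

You would then enter "python manage.py mytask" as the command in the Heroku Scheduler dashboard.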
I am considering using Celery in my project. I found a lot of information about how to use it, etc. What I am interested in is how to deploy/package my solution.
I need to run two components: the Django app, and then the celeryd worker (the component that sends emails). For example, I would like my Django app to use an email_ticket task that emails support tickets. I create tasks.py in the Django app:
from celery import task

@task
def email_ticket(from_addr, message):  # "from" is a reserved word in Python, so renamed here
    ...
Do I deploy my Django app and then just run celeryd as a separate process from the same path?
./manage.py celeryd ...
What about workers on different servers? Do I deploy the whole Django application and run only celeryd? I understand I could use Celery only for the worker, but I would like to use celerycam and celerybeat.
Any feedback is appreciated. Thanks.
This is covered in the documentation here. The gist is that you need to download some init scripts and set up some config. Once that's done, celeryd will start on boot and you'll be off and running.
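For django-celery, the init-script approach boils down to installing the generic celeryd init script and pointing it at your project via /etc/default/celeryd. The variable names below come from that documentation; the values are placeholders to adapt:

# /etc/default/celeryd
CELERYD_NODES="worker1"
# where manage.py lives
CELERYD_CHDIR="/path/to/your/django/project"
# run the worker through manage.py so the Django settings are loaded
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"

With that in place, the worker starts at boot like any other init.d service.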