I am considering using celery in my project. I found a lot of information about how to use it etc. What I am interested in is how to deploy/package my solution.
I need to run two components - the django app and the celeryd worker (the component that sends emails). For example, I would like my django app to use an email_ticket task that emails support tickets. I create tasks.py in the django app:
@task
def email_ticket(sender, message):  # "from" is a reserved word in Python, so use another name
    ...
Do I deploy my django app and then just run celeryd as a separate process from the same path?
./manage.py celeryd ...
What about workers on different servers? Deploy the whole django application and run only celeryd? I understand I could use celery only for the worker, but I would like to use celerycam and celerybeat.
Any feedback is appreciated. Thanks
This is covered in the documentation here. The gist is that you need to download the generic init scripts and set up some config. Once that's done, celeryd will start on boot and you'll be off and running.
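For reference, a rough sketch of what that init-script config might contain, assuming the generic celeryd init script from the Celery/django-celery daemonization docs (all paths, node names, and options below are placeholders):

# /etc/default/celeryd -- hedged example, adjust to your deployment
CELERYD_NODES="worker1"
CELERYD_CHDIR="/path/to/your/django/project"
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"
CELERYD_OPTS="--time-limit=300 --concurrency=4"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_USER="celery"
CELERYD_GROUP="celery"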
I am developing a small independent python application which uses Celery. I have built this using the django framework, but my application is back end only. This means that users do not need to visit my site; my application exists only to receive tasks from a celery queue and perform operations on the database. In order to perform operations on the database, I need to use Django models.
What I am trying to do is eliminate the rest of my django application and use ONLY celery and django models modules (including the dependencies required to run these).
In short, my simple celery application will run, receive instructions from my redis broker, and perform operations on the database using django models.
Is it possible to do this? If so, how?
Here is my project structure:
myproject/
--manage.py
--myproject/
----celery.py
----models.py
----settings.py
----tasks.py
----urls.py
----wsgi.py
Here is my settings.py:
In your project's settings.py, just add this at the beginning.
import os
import sys
import django
sys.path.insert(0, your_project_path)  # ensure python can find your project
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'
django.setup()
Then you can use the django ORM; remember to remove any middleware you don't need from your django settings.
you just need
import os, django
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'
django.setup()
(assuming you have set up your database and INSTALLED_APPS in settings.py)
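Putting those pieces into a standalone entry point, a minimal sketch might look roughly like this, following the asker's project layout (the broker URL is a placeholder and redis as the broker is only an assumption):

# myproject/celery.py -- minimal sketch of a Celery app that can use the Django ORM
from __future__ import absolute_import  # so this module doesn't shadow the celery package

import os
import django
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()  # must run before any Django models are imported

app = Celery('myproject', broker='redis://localhost:6379/0')
app.autodiscover_tasks(['myproject'])  # picks up myproject/tasks.py

The worker can then be started with something like celery -A myproject.celery worker --loglevel=info from the directory containing manage.py.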
You have a python script that requires some celery tasks, and you need the Django ORM too for the database interactions.
You can set up the django project, create an app for your purpose, include it in settings.py, and create the required models inside your app's models.py.
ref: What minimal files i need to use django ORM
Set up the environment for executing celery, i.e. a redis server, and integrate djcelery with the django project for the celery tasks.
You can use celery beat for periodic tasks, or delay() for on-demand ones.
ref: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html
You can import and use the django models like normal inside the celery tasks.
You can then run the celery tasks using:
i. celery -A tasks worker --loglevel=info
ii. celery -A tasks beat -l info (use beat if you have tasks written for periodic execution)
If a task needs to be executed asynchronously, either immediately or after a time interval, you can use task_name.delay(); call the tasks inside your python script using delay().
To use djcelery in your script, you may need to set up the django environment inside the script; just call django.setup().
I think this will help you to solve your problem.
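As an illustration, a hypothetical tasks.py along those lines might look like this (the Ticket model, its fields, and the module names are made up for the example):

# myproject/tasks.py -- hedged sketch; assumes the django environment is set up before import
from celery import shared_task
from myproject.models import Ticket  # hypothetical model

@shared_task
def purge_stale_tickets():
    # runs inside the worker process, using the Django ORM as usual
    Ticket.objects.filter(status='stale').delete()

From a view or a script you would then enqueue it asynchronously with purge_stale_tickets.delay().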
I am trying to run a task with Celery. I followed this tutorial link.
Everything has been set up successfully. The thing now is that I don't know how to execute a task. I run celeryd and it can't find the task.
I want to know what exactly I need to call to execute the task, and how I need to configure the task on the RabbitMQ server and in django-admin.
I cannot find any full tutorials about it.
Django by Example has a full section on using Celery with RabbitMQ. There are also free tutorials and articles on this topic:
How to install Celery on Django and Create a Periodic Task
Django Celery Part 1
Django Celery Part 2
task definition
app/tasks.py:
from celery import shared_task

@shared_task
def add(param1, param2):
    print("task")
task execution:
from celery import current_app
current_app.send_task("app.tasks.add", ["param1", "param2"])
This might help you to get an idea how to run Celery.
It worked fine for me.
http://www.hiddentao.com/archives/2012/01/27/processing-long-running-django-tasks-using-celery-rabbitmq-supervisord-monit/
I have a follow-on / clarification question related to an older question
I have 2 servers (for now). One server runs a django web application. The other server runs pure python scripts that are CRON-scheduled data acquisition & processing jobs for the web app.
There is a use case where user activity in the web application (updating a certain field) should trigger a series of actions by the backend server. I could stick with CRON but as we scale up, I can imagine running into trouble. Celery seems like a good solution except I'm unclear how to implement it. (Yes, I did read the getting started guide).
I want the web application to send tasks to a specific queue but the backend server to actually execute the work.
Assuming that both servers are using the same broker URL,
Do I need to define stub tasks in Django, or can I just use the celery.send_task method?
Should I still be using django-celery?
Meanwhile the backend server will be running Celery with the full implementation of the tasks and workers?
I decided to try it and work through any issues that came up.
On my django server, I did not use django-celery. I installed celery and redis (via pip) and followed most of the instructions in the First Steps with Django:
- updated the proj/proj/settings.py file to include the bare minimum of configuration for Celery, such as the BROKER_URL
- created the proj/proj/celery.py file, but without the task defined at the bottom (a sketch of this file and __init__.py follows this list)
- updated the proj/proj/__init__.py file as documented
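For reference, those two files might look roughly like this on the django server, following the First Steps with Django pattern (the project name proj is the asker's; everything else is boilerplate from that guide):

# proj/proj/celery.py -- sketch; no tasks are defined here
from __future__ import absolute_import

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')  # reads BROKER_URL etc. from settings.py

# proj/proj/__init__.py -- makes sure the Celery app is loaded with Django
from __future__ import absolute_import
from .celery import app as celery_app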
Since the server running django wasn't actually going to execute any
Celery tasks, in the view that would trigger a task, I added the
following:
from proj.celery import app as celery_app

try:
    # send it to celery for backend processing
    celery_app.send_task('tasks.mytask',
                         kwargs={'some_id': obj.id, 'another_att': obj.att},
                         queue='my-queue')
except Exception as err:
    print('Issue sending task to Celery')
    print(err)
The other server had the following installed: celery and redis (I used an AWS Elasticache redis instance for this testing).
This server had the following files:
- celeryconfig.py with all of my Celery configuration and queues defined, pointing to the same BROKER_URL as the django server
- tasks.py with the actual code for all of my tasks
The celery workers were then started on this server, using the standard command: celery -A tasks worker -Q my-queue1,my-queue2
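A minimal celeryconfig.py for that layout might look something like this, assuming tasks.py creates the Celery app and calls app.config_from_object('celeryconfig') (the redis URL and queue names are placeholders):

# celeryconfig.py -- hedged sketch for the worker server
from kombu import Queue

BROKER_URL = 'redis://your-elasticache-endpoint:6379/0'  # same broker as the django server

CELERY_QUEUES = (
    Queue('my-queue1'),
    Queue('my-queue2'),
)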
For testing, the above worked. Now I just need to make celery run in the background and optimize the number of workers/queue.
If anyone has additional comments or improvements, I'd love to hear them!
I decided I need to use an asynchronous queue system, and am setting up Redis/RQ/django-rq. I am wondering how I can start workers in my project.
django-rq provides a management command which is great, it looks like:
python manage.py rqworker high default low
But is it possible to start the worker when you start the django instance? Just wondering or is it something I will always have to start manually?
Thanks.
Django operates inside the request-response cycle and only runs when a request comes in, so it is a bad idea to attach such a command to Django startup.
Instead, I would recommend you look at supervisord - a process manager that can automatically launch services at system start, among other things.
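A hedged sketch of what such a supervisord program entry might look like (the program name and all paths are placeholders):

; /etc/supervisor/conf.d/rqworker.conf -- hypothetical example
[program:rqworker]
command=/path/to/venv/bin/python manage.py rqworker high default low
directory=/path/to/your/django/project
autostart=true
autorestart=true
stopsignal=TERM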
When I host a Django project on Heroku, Heroku provides a Procfile where you can specify what to start with the project.
It is my Procfile:
web: gunicorn RestApi.wsgi
worker: python manage.py rqworker default
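On Heroku the worker process type also needs at least one dyno allocated to it; if I recall the CLI correctly, that is done with something like:
heroku ps:scale worker=1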
I have a web app running on heroku using flask and SQLAlchemy. I am now wondering how I can set up a scheduled task that runs daily and does some database-related work (deleting some rows, if you need to know :)).
The documentation on heroku recommends using APScheduler, but I would like to do it with Heroku Scheduler. Regardless of that decision, I would like to know how I connect to my postgres database in this scheduler task. I could not find any example or hint for that.
thanks for your time
Torsten
Heroku Scheduler will run any command you throw at it. The typical way would be to create a Python script/command as part of your flask app. You can do something similar to http://flask-script.readthedocs.org/en/latest/. Then within the scheduler you would schedule it with something like:
python manage.py mytask
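A hedged sketch of such a manage.py using Flask-Script (the app module, SQLAlchemy instance, LogEntry model, and the cleanup query are all made up for the example):

# manage.py -- hypothetical example for a Flask app with Flask-Script and Flask-SQLAlchemy
from flask_script import Manager

from myapp import app, db          # your Flask app and SQLAlchemy instance
from myapp.models import LogEntry  # hypothetical model holding the rows to delete

manager = Manager(app)

@manager.command
def mytask():
    """Daily cleanup job invoked by Heroku Scheduler."""
    with app.app_context():  # make sure the db session has an application context
        LogEntry.query.filter(LogEntry.expired == True).delete()
        db.session.commit()

if __name__ == '__main__':
    manager.run()

The database connection comes from the same SQLAlchemy configuration the web dynos use (e.g. DATABASE_URL), so the scheduler job needs no extra connection setup of its own.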