I am developing a small standalone Python application that uses Celery. I built it with the Django framework, but the application is back end only. Users never need to visit a site; the application exists solely to receive tasks from the Celery queue and perform operations on the database. To perform those database operations, I need Django's model layer.
What I am trying to do is eliminate the rest of my Django application and use ONLY Celery and the Django models module (plus the dependencies required to run them).
In short, my simple Celery application will run continuously, receive instructions from my Redis broker, and perform database operations through Django models.
Is it possible to do this? If so, how?
Here is my project structure:
myproject/
--manage.py
--myproject/
----celery.py
----models.py
----settings.py
----tasks.py
----urls.py
----wsgi.py
Here is my settings.py:
In the script that bootstraps your project (for example celery.py, not settings.py itself), just add this at the beginning:
import os
import sys

import django

sys.path.insert(0, your_project_path)  # ensure Python can find your project
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'
django.setup()
Then you can use the Django ORM. Remember to remove the middleware you don't need from your Django settings.
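For illustration, a stripped-down settings.py for ORM-only use might look roughly like this; the app name, database engine, and credentials are placeholders, not taken from the question:

# myproject/settings.py -- minimal sketch for ORM-only use; all values are placeholders
SECRET_KEY = 'replace-me'

INSTALLED_APPS = [
    'django.contrib.contenttypes',  # commonly required by the model machinery
    'myproject',                    # the package that defines models.py
]

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'myuser',
        'PASSWORD': 'mypassword',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}

BROKER_URL = 'redis://localhost:6379/0'  # Celery 3.x-style broker setting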
You just need:
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'
django.setup()
(assuming you have set up your database and INSTALLED_APPS in settings.py)
You have a Python script that needs to run some Celery tasks, and you also need the Django ORM for the database interactions.
You can set up the Django project,
create an app for your purpose, add it to INSTALLED_APPS in settings.py, and define the required models in your app's models.py.
ref: What minimal files I need to use Django ORM
Set up the environment for running Celery, i.e. a Redis server, and integrate djcelery with the Django project for the task handling.
You can use celery beat for periodic tasks, or delay() for on-demand ones.
ref: http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html
You can import and use the Django models as normal inside the Celery tasks, as in the sketch below.
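For example, a tasks.py roughly like this would let a worker touch the database through the ORM; the Ticket model and its fields are only illustrative, not from the question:

# myproject/tasks.py -- illustrative sketch; Ticket is a placeholder model
from celery import shared_task

from myproject.models import Ticket  # any model defined in models.py


@shared_task
def close_ticket(ticket_id):
    # ordinary ORM calls work inside the task once Django is set up
    ticket = Ticket.objects.get(pk=ticket_id)
    ticket.status = 'closed'
    ticket.save()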
You can then run the Celery tasks using:
i. celery -A tasks worker --loglevel=info
ii. celery -A tasks beat -l info (use beat if you have tasks written for periodic execution)
If a task needs to be executed asynchronously, either immediately or after a time interval, you can use task_name.delay().
Call the tasks inside the Python script using delay().
To use djcelery in your script you may need to set up the Django environment inside the script first:
just call django.setup().
I think this will help you solve your problem.
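Putting those pieces together, a caller script might look roughly like this; the module layout and task name follow the sketch above and are assumptions:

# run_job.py -- sketch of a standalone caller, assuming the layout above
import os

import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()  # must run before importing anything that touches the models

from myproject.tasks import close_ticket

# queue the task; a worker started as shown above will pick it up and run it
close_ticket.delay(42)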
Related
Instead of putting my celery tasks in tasks.py I'd like to put them in a couple different files.
Is there a way to tell Celery to look in these files for tasks? Or is tasks.py hard-coded somewhere?
Here are my versions if it helps:
Django==1.8.7
celery==3.1.19
kombu==3.0.26
django-celery==3.1.17
According to the Celery docs, since Celery version 3.1 you don't need the django-celery app anymore, as Django support is integrated into Celery itself.
As per the tutorial, you just need an app object (an instance of celery.Celery) imported in the project's __init__.py.
app.autodiscover_tasks accepts two arguments:
packages, which can simply be lambda: settings.INSTALLED_APPS
related_name, which defaults to tasks but can be overridden
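As a rough illustration, the celery.py app could be configured along these lines; the module name background_tasks is hypothetical, chosen only to show the related_name override:

# proj/celery.py -- sketch for Celery 3.1; 'background_tasks' is a made-up module name
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')

# look for <app>/tasks.py in every installed app (the default) ...
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
# ... and additionally for modules named background_tasks.py
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, related_name='background_tasks')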
I want to use dynamic scheduler management for Celery. I know djcelery has that functionality with database support.
However, I do not use Django but Flask, and I couldn't find a Flask project or implementation that uses djcelery.schedulers.
Is it possible to use djcelery and implement a dynamic scheduler management system without Django?
Short answer: No, but...
You have to use Django. The scheduler's entries are instances of Django models, so you would have to set up the djcelery app somehow (see this code: https://github.com/celery/django-celery/blob/master/djcelery/schedulers.py). Also, you won't have the admin interface for adding scheduler entries.
This is just a guess, but you can try setting up Django's ORM standalone and syncing djcelery's models (see: Use Django ORM as standalone).
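An untested sketch of that guess, with every settings value below a placeholder: configure Django's settings in code, include djcelery, create its tables, and point beat at the database scheduler.

# standalone_djcelery.py -- untested sketch; all values are placeholders
import django
from django.conf import settings

settings.configure(
    INSTALLED_APPS=['djcelery'],
    DATABASES={'default': {'ENGINE': 'django.db.backends.sqlite3', 'NAME': 'schedule.db'}},
)
django.setup()

# create djcelery's tables (PeriodicTask, IntervalSchedule, ...) once
from django.core.management import call_command
call_command('migrate')  # or 'syncdb' on older Django versions

# then start beat with djcelery's scheduler, e.g.:
#   celery beat -A yourapp -S djcelery.schedulers.DatabaseScheduler
# (the Django setup above must have run inside that process before beat starts)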
You can also implement your own scheduler following the structure of djcelery/schedulers.py
Also see: Can celery celerybeat use a Database Scheduler without Django?
You can check out flask-djcelery. It configures djcelery with Flask, allows using the Django admin, and also provides a browsable REST API for managing tasks.
I have a follow-on / clarification question related to an older question
I have 2 servers (for now). One server runs a Django web application. The other server runs pure Python scripts that are cron-scheduled data acquisition and processing jobs for the web app.
There is a use case where user activity in the web application (updating a certain field) should trigger a series of actions by the backend server. I could stick with CRON but as we scale up, I can imagine running into trouble. Celery seems like a good solution except I'm unclear how to implement it. (Yes, I did read the getting started guide).
I want the web application to send tasks to a specific queue but the backend server to actually execute the work.
Assuming that both servers are using the same broker URL,
Do I need to define stub tasks in Django or can I just use the celery.send_task method?
Should I still be using django-celery?
Meanwhile the backend server will be running Celery with the full implementation of the tasks and workers?
I decided to try it and work through any issues that came up.
On my django server, I did not use django-celery. I installed celery and redis (via pip) and followed most of the instructions in the First Steps with Django:
updated the proj/proj/settings.py file to include the bare minimum of configuration for Celery, such as the BROKER_URL
created the proj/proj/celery.py file, but without the task defined at the bottom
updated the proj/proj/__init__.py file as documented
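For reference, the __init__.py change from that guide is just the following (the celery.py file itself is sketched earlier on this page), plus the broker setting in settings.py; the Redis URL is a placeholder:

# proj/proj/settings.py -- bare minimum Celery configuration (placeholder URL)
BROKER_URL = 'redis://localhost:6379/0'

# proj/proj/__init__.py (per First Steps with Django, Celery 3.1)
from __future__ import absolute_import

# ensure the Celery app is loaded whenever Django starts,
# so that @shared_task will use it
from .celery import app as celery_app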
Since the server running django wasn't actually going to execute any Celery tasks, in the view that would trigger a task I added the following:
from proj.celery import app as celery_app

try:
    # send it to celery for backend processing
    celery_app.send_task('tasks.mytask', kwargs={'some_id': obj.id, 'another_att': obj.att}, queue='my-queue')
except Exception as err:
    print('Issue sending task to Celery')
    print(err)
The other server had the following installed: celery and redis (I used an AWS Elasticache redis instance for this testing).
This server had the following files:
celeryconfig.py with all of my Celery configuration and queues defined, pointing to the same BROKER_URL as the django server (a rough sketch follows below)
tasks.py with the actual code for all of my tasks
The celery workers were then started on this server, using the standard command: celery -A tasks worker -Q my-queue1,my-queue2
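A celeryconfig.py along these lines would match that setup; the Redis URL and queue names are placeholders, and tasks.py would load it with app.config_from_object('celeryconfig'):

# celeryconfig.py -- sketch; broker URL and queue names are placeholders
from kombu import Queue

BROKER_URL = 'redis://my-elasticache-endpoint:6379/0'  # same broker as the django server

CELERY_QUEUES = (
    Queue('my-queue1'),
    Queue('my-queue2'),
)
CELERY_DEFAULT_QUEUE = 'my-queue1'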
For testing, the above worked. Now I just need to make celery run in the background and optimize the number of workers/queue.
If anyone has additional comments or improvements, I'd love to hear them!
I have a Django application which uses django-celery, celery and rabbitmq for offline, distributed processing.
Now the setup is such that I need to run the celery tasks (and in turn celery workers) in other nodes in the network (different from where the Django web app is hosted).
To do that, as I understand it, I will need to place all my Django code on these separate servers.
This way I will have to transfer all the django source code to all possible servers in the network, install dependencies and run some kind of an update system which will sync all the sources across nodes.
Is this the right way of doing things? Is there a simpler way of making the celery workers run outside the web application server where the Django code is hosted?
If there is indeed no way other than to copy the code and replicate it on all servers, is there a way to copy only the source files which the celery tasks need (which will include all models and views - not so small a task either)?
For this type of situation I have in the past made an egg of all of my celery task code that I can simply rsync or copy in some fashion to my worker nodes. This way you can edit your celery code in a single project that can be used both in your Django app and on your worker nodes.
So in summary, create a web-app-celery-tasks project, make it into an installable egg, and have a web-app package that depends on the celery-tasks egg.
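A minimal setup.py for such a tasks-only package might look like this; the distribution name, package layout, and version pins are only examples:

# web-app-celery-tasks/setup.py -- sketch; names and versions are examples
from setuptools import setup, find_packages

setup(
    name='web-app-celery-tasks',
    version='0.1.0',
    packages=find_packages(),   # e.g. a webapp_tasks package containing tasks.py
    install_requires=[
        'celery>=3.1',
        'Django>=1.8',          # needed if the tasks use the ORM
    ],
)

Building it with python setup.py bdist_egg (or sdist) gives an artifact you can install on both the web server and the worker nodes.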
I am considering using celery in my project. I found a lot of information about how to use it etc. What I am interested in is how to deploy/package my solution.
I need to run two components - django app and then celeryd worker (component that sends emails). For example I would like my django app to use email_ticket task that would email support tickets. I create tasks.py in the django app.
from celery import task

@task
def email_ticket(from_address, message):  # 'from' is a reserved word in Python, so the parameter is renamed here
    ...
Do I deploy my django app and then just run celeryd as a separate process from the same path?
./manage.py celeryd ...
What about workers on different servers? Do I deploy the whole Django application and run only celeryd? I understand I could use celery only for the worker, but I would like to use celerycam and celerybeat.
Any feedback is appreciated. Thanks.
This is covered in the documentation here. The gist is that you need to download some init scripts and set up some config. Once that's done, celeryd will start on boot and you'll be off and running.
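For the django-celery setup described above, the generic init script reads its settings from a file like /etc/default/celeryd; a typical example, with paths, node names, and users as placeholders, looks roughly like this:

# /etc/default/celeryd -- sketch; paths, node names and users are placeholders
CELERYD_NODES="worker1"

# where to chdir at start (your Django project directory)
CELERYD_CHDIR="/opt/myproject/"

# how to invoke the workers for a django-celery project
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"

# extra worker arguments
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# %n will be replaced with the node name
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# workers should run as an unprivileged user
CELERYD_USER="celery"
CELERYD_GROUP="celery"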