Practical examples of redis and celery - python

I am new to redis and celery. I have gone through the basic tutorials for both, but I do not understand how to apply them to a task-scheduling job.
I am unable to start with the scripting part: I do not see how to write a script that creates a queue, runs the workers, etc. I would need a practical example.

So here's a canonical example of how Celery can run with Redis (let the script filename be mytasks.py):
from celery import Celery

celery = Celery('tasks', broker='redis://localhost:6379/0')

@celery.task
def add(x, y):
    return x + y
As you can see, the broker argument is set to use the Redis instance installed on your local machine. The next thing is to start the celery worker:
$ celery -A mytasks worker --loglevel=info
Once your celery worker has started, you can use it to run your task just by importing the mytasks script, e.g. from the Python interpreter in interactive mode:
>>> from mytasks import add
>>> add.delay(1, 1)
After some time the result '2' will appear in the worker's console log (add.delay() itself returns an AsyncResult, not the computed value).
That's a basic example of how you can set up your task execution environment.
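Note that add.delay() only enqueues the task; to read the return value back in the calling process you also need a result backend. A minimal sketch, assuming you reuse the same local Redis (database 1 here is an arbitrary choice) as the backend:

from celery import Celery

# broker carries the task messages, backend stores the return values
celery = Celery('tasks',
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/1')

@celery.task
def add(x, y):
    return x + y

With the worker running you can then block on the result:

>>> from mytasks import add
>>> result = add.delay(1, 1)
>>> result.get(timeout=10)  # waits for the worker and returns the value
2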

Related

How to decouple Django and Celery?

Among my celery tasks, I have a task which uses a python module (theano) that runs on the GPU; this module can only be imported by one thread at a time. But to start the website, I have to run:
python manage.py runserver
celery -A celery_try worker -l info
So the module will be imported by both celery and the django website, which causes a conflict. Is there a way to decouple Django and Celery so that the module is only imported once?
For testing purposes you can run the django development server in single-threaded mode: python manage.py runserver --nothreading.
You want to import theano only in the celery worker process, not in the django web server process, right? OK, let's make the import conditional, so that it is imported in celery and not in django.
import os

try:
    # the next line raises an exception in django, but works fine in celery,
    # because the celery_worker variable is only set for the worker process
    is_worker = os.environ['celery_worker']
    import theano  # celery will import theano, django won't
except Exception as exc:
    # django will catch the exception (celery_worker doesn't exist) and print it here
    print(exc)
And start your celery worker with the celery_worker environment variable set:
celery_worker=yes celery -A celery_try worker -l info
To discriminate between the celery worker and django, let's set an environment variable in the celery process, but not in the django process. I called that variable celery_worker. To set it, I prepended celery -A celery_try worker -l info with a per-command environment variable assignment: celery_worker=yes. Now, in the Python code, I check whether the environment variable is present. If it is, we are in the celery worker and need to import theano.
If we're in django, os.environ['celery_worker'] is not defined and raises an exception, which is caught and printed.
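If you prefer not to rely on an exception, a roughly equivalent sketch (same assumption: the celery_worker variable is only ever set in the command that starts the worker) uses os.environ.get:

import os

# 'celery_worker' is a variable we set ourselves when launching the worker,
# e.g. celery_worker=yes celery -A celery_try worker -l info
if os.environ.get('celery_worker'):
    import theano  # imported in the worker process only, never in django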

Can I use celery without django?

I have API code which adds tasks to a queue and celery workers which consume those tasks.
Currently both share the same code base, but I want the celery workers to contain only plain Python tasks and no django code, since the workers will only be processing tasks and do not need django for that. Is that possible?
In order to start a celery worker I need to use this line:
celery -A django_project worker --queue high
What should I write instead of django_project there?
Yes, you can. Celery is a generic asynchronous task queue. In place of "django_project" you would point to your own module. See http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#application for an example.
Here is an example project layout using celery:
project-dir/
    mymodule/
        __init__.py
        celery.py
        tasks.py
    tests/
    setup.py
    etc, etc (e.g. tox.ini, requirements.txt, project management files)
In mymodule/celery.py:
# -*- coding: utf-8 -*-
from __future__ import absolute_import

from celery import Celery

app = Celery('mymodule',
             broker='amqp://',
             backend='amqp://',
             include=['mymodule.tasks'])

if __name__ == '__main__':
    app.start()
In mymodule/tasks.py:
from __future__ import absolute_import

from mymodule.celery import app

@app.task
def add(x, y):
    return x + y
You can definitely use Celery without a web framework like Django or Flask. Just create the Celery object and your tasks accordingly and run the following command:
celery -A filename.celery_object_name worker --loglevel=info
Later, just run the Python file that calls your tasks. You don't need to set up anything else; it works exactly the same with or without a web framework.
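For example, with the mymodule layout above, a plain Python script (no web framework involved) could enqueue a task and wait for its result like this - a sketch that assumes the broker and backend from mymodule/celery.py are reachable and a worker has been started with celery -A mymodule worker --loglevel=info:

from mymodule.tasks import add

result = add.delay(4, 4)       # sends the task to the broker
print(result.get(timeout=10))  # prints 8 once a worker has processed it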

How to invoke celery task using celery daemon

I have written this task in the tasks.py file, which is under my django app directory myapp.
# assumed imports for this snippet (Celery 3.x style); logger, crawler and
# the Url model come from the surrounding app
from datetime import datetime
from celery.schedules import crontab
from celery.task import periodic_task

# periodic task that runs every minute
@periodic_task(run_every=crontab(hour="*", minute="*", day_of_week="*"))
def news():
    '''
    Grab url
    '''
    logger.info("Start task")
    now = datetime.now()
    urls = []
    urls.append(crawler())  # crawler returns a dict obj
    for url_dict in list(reversed(urls)):
        for title, url in url_dict.items():
            # Save all the scraped urls in the database
            Url.objects.create(title=title, url=url)
    logger.info("Task finished: result = %s" % url)
The main objective of this task is to push the url and title to the django database every minute.
To run this celery task I need to invoke these commands using the Django ./manage.py utility (I am planning to host this app on Heroku):
python manage.py celeryd --verbosity=2 --loglevel=DEBUG
python manage.py celerybeat --verbosity=2 --loglevel=DEBUG
but I need to run these two commands as daemons in the background. How can I run these commands as daemons so that my celery tasks can run?
A quick fix is to put "&" after your commands, i.e.
python manage.py celeryd --verbosity=2 --loglevel=DEBUG &
python manage.py celerybeat --verbosity=2 --loglevel=DEBUG &
After hitting enter these tasks will act as daemons and still print out useful debug info. So this is great for the initial stage and sometimes for small applications that do not rely heavily on celery.
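If the shell session may be closed, a slightly more robust variant of the same idea (an addition of mine, not part of the original commands) is to combine & with nohup and redirect the output to log files:

nohup python manage.py celeryd --verbosity=2 --loglevel=DEBUG > celeryd.log 2>&1 &
nohup python manage.py celerybeat --verbosity=2 --loglevel=DEBUG > celerybeat.log 2>&1 &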
For development purposes I suggest using supervisor. See THIS POST, which gives really nice info on celery, django and supervisor integration. Read the "Running Celery workers as daemons" part of the post.
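A minimal supervisor program section for the worker might look like the following sketch; the paths, project name and user are placeholders you will need to adapt, and a similar section would be needed for celerybeat:

[program:celeryd]
command=/path/to/virtualenv/bin/python manage.py celeryd --loglevel=INFO
directory=/path/to/django_project
user=www-data
autostart=true
autorestart=true
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.err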

Need to run celery worker during Django unittest

I am working on a Django-based web app. During unit testing, I need to write a test which needs a Celery worker running in the background.
I have already used:
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
CELERY_ALWAYS_EAGER = True
BROKER_BACKEND = 'memory'
in my override settings, but these do not start a celery worker for me in the background when needed.
Any help would be much appreciated.
Celery won't get run by Django automatically.
You can start a worker process by running from your project root:
$ celery -A my_proj worker
my_proj should be the application name you configured with app = Celery('my_proj').
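Note that the eager settings from the question take the opposite approach: with CELERY_ALWAYS_EAGER the tasks are executed synchronously in the test process itself, so no separate worker is needed at all. A sketch of a test using those settings (myapp.tasks.my_task is a hypothetical task under test):

from django.test import TestCase, override_settings

from myapp.tasks import my_task  # hypothetical task under test


@override_settings(CELERY_ALWAYS_EAGER=True,
                   CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                   BROKER_BACKEND='memory')
class MyTaskTest(TestCase):
    def test_task_runs_inline(self):
        # .delay() runs immediately in-process because of CELERY_ALWAYS_EAGER
        result = my_task.delay(1, 2)
        self.assertTrue(result.successful())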

django celery daemon doesn't work

Here's my celery app config:
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tshirtmafia.settings')

app = Celery('tshirtmafia')
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
settings.py:
INSTALLED_APPS includes:
    'kombu.transport.django',
    'djcelery',
and also:
BROKER_URL = 'django://'
Here's my task:
# imports assumed for this snippet
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def test():
    send_mail('nesamone bus', 'Files have been successfully generated.', 'marijus.merkevicius@gmail.com',
              ['marijus.merkevicius@gmail.com'], fail_silently=False)
Now when I run python manage.py celeryd locally and then run test.delay() from the shell, it works.
Now I'm trying to deploy my app. With the exact same configuration, when I start python manage.py celeryd and run the test task from a shell in another window, it doesn't work.
I've also tried to set up a background daemon like this:
/etc/default/celeryd configuration:
# Name of nodes to start, here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Where to chdir at start. (CATMAID Django project dir.)
CELERYD_CHDIR="/home/tshirtnation/"
# Python interpreter from environment. (in CATMAID Django dir)
ENV_PYTHON="/usr/bin/python"
# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryd_multi"
# How to call "manage.py celeryctl"
CELERYCTL="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryctl"
# Extra arguments to celeryd
CELERYD_OPTS="--time-limit=300 --concurrency=1"
# Name of the celery config module.
CELERY_CONFIG_MODULE="celeryconfig"
# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="settings"
And I use default celery /etc/init.d/celeryd script.
So basically it seems like celeryd starts but doesn't work. I have no idea how to debug this or what might be wrong.
Let me know if you need anything else.
Celery turned out to be a very capricious child in the otherwise robust Django ecosystem, at least for me.
There is too little information here to pin down the reason for your problems.
The most common reason a Celery daemon fails is file system permissions.
But to narrow down the cause I'd try the following:
Start celery from the command line as the user who owns the django project:
celery -A proj worker -l info
If it works OK, go further
Start celery in verbose mode as the root user, just as the daemon would:
sudo sh -x /etc/init.d/celeryd start
This will show most of the problems with the daemon script - the celery user and group it uses - but unfortunately not all of them: permission failures are not visible.
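To see what the daemon actually did, check the worker log and the ownership of the directories configured above (with CELERYD_NODES="w1", the %n placeholder expands to w1) - a couple of commands I would run here:

sudo tail -n 50 /var/log/celery/w1.log
ls -l /var/log/celery /var/run/celery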
A little remark of mine:
Usually Celery is started by its own celery user, and the django project by another one. After a long fight with celery and the system, I gave up on the celery user and ran the celery process as the django project user.
And... do not forget to run these once:
update-rc.d celerybeat defaults
update-rc.d celeryd defaults
(this registers the daemons to start on Ubuntu, of course).
Good luck
