Is it possible to use django-celery models with RabbitMQ? - python

I'm interested in using the django-celery models to create and monitor recurring tasks. In particular, I am looking at creating recurring cron-like actions and starting/stopping them from the admin.
As I understand it, it is possible to use this only if I am also using Django's default DB as the celery broker. Is it ever going to be possible to use those models with a non-DB broker?
EDIT: To clarify, I am already using RabbitMQ as the broker. My question is: can I, while using RabbitMQ, still somehow use django-celery's models to dynamically create and manage recurring/scheduled tasks?

If you have an AMQP broker installed, you can simply set this in your celeryconfig:
BROKER_URL = 'amqp://127.0.0.1//'
Or replace the IP above with the address of the machine where the RabbitMQ server is running.
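For illustration, a slightly fuller celeryconfig might look like the sketch below. The host, credentials, and vhost are placeholders, and the scheduler line is an assumption: pointing celerybeat at django-celery's database scheduler is what lets the PeriodicTask models created in the admin drive the recurring tasks, independently of which broker carries the messages.

# celeryconfig.py -- sketch; user, password, host and vhost are placeholders
BROKER_URL = 'amqp://myuser:mypassword@rabbit.example.com:5672/myvhost'

# Assumption: use django-celery's database-backed scheduler so that the
# periodic tasks defined through the admin models are picked up by celerybeat.
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'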

Related

Using different Redis databases for different queues in Celery

I have a Django application that uses Celery with a Redis broker for asynchronous task execution. Currently, the app has 3 queues (and 3 workers) that connect to a single Redis instance for communication. Here, the first two workers are prefork-based and the third is gevent-based.
The Celery setting variables regarding the broker and backend look like this:
CELERY_BROKER_URL="redis://localhost:6379/0"
CELERY_RESULT_BACKEND="redis://localhost:6379/1"
Since Celery uses rpush-blpop to implement the FIFO queue, I was wondering if it would be correct, or even possible, to use different Redis databases for different queues, e.g. q1 uses database .../1 and q2 uses database .../2 for messaging. That way each worker would only listen to its dedicated database and pick up tasks from its queue with less competition.
Does this even make any sense?
If so, how do you implement something like this in Celery?
First, if you are worried about the load, please specify your expected numbers/rates.
In my opinion, you shouldn't be concerned about the Redis capability to handle your load.
Redis has its own scale-out / scale-in capabilities whenever you need them.
You can also use RabbitMQ as your broker (running RabbitMQ in Docker is dead simple as well), which again has its own scale-out capabilities to support a high load, so I don't think you should be worried about this point.
As far as I know, there's no way to point a single Redis broker at different databases per queue. You can create different Celery applications with different DBs, but then you cannot set dependencies between tasks (canvas: group, chain, etc.), so I wouldn't recommend that option.
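For what it's worth, the "different Celery applications with different DBs" option mentioned above would look roughly like the sketch below (module and task names are made up):

# two_apps.py -- sketch of two independent Celery apps on separate Redis databases
from celery import Celery

app_q1 = Celery('q1_app', broker='redis://localhost:6379/1')
app_q2 = Celery('q2_app', broker='redis://localhost:6379/2')

@app_q1.task
def task_for_q1(x):
    # handled only by workers started against app_q1
    return x * 2

@app_q2.task
def task_for_q2(x):
    # handled only by workers started against app_q2
    return x + 1

# Each worker is then started against its own app, e.g.
#   celery -A two_apps:app_q1 worker
#   celery -A two_apps:app_q2 worker
# Canvas primitives (chain, group, chord) cannot mix tasks from the two apps,
# which is the drawback pointed out above.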

How to import Django models in separate class

I'm using signals for post-processing data. Because a lot needs to happen, and I later want to run that logic in the background so the user doesn't have to wait for it, I want to run this code in a separate class.
I want to run the code in my post_save event.
But I get the following error:
ImportError: cannot import name 'ActivityDetail' from 'ryf_app.models'
The model definitely exists in my models.py file
What am I missing here?
If you want to run a task asynchronously or in the background, you might use a task queue like Celery. For the broker or cache DB, the options include Redis, RabbitMQ, and Amazon SQS. Celery has good documentation on using RabbitMQ as the broker; you can follow the link here.
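As a minimal sketch of that suggestion (the model name comes from the question's traceback; the task body, module layout, and Celery wiring are assumptions):

# tasks.py -- sketch; assumes a working Celery setup for the project
from celery import shared_task

@shared_task
def post_process_activity(activity_id):
    # Importing inside the task avoids circular imports at module load time
    from ryf_app.models import ActivityDetail
    detail = ActivityDetail.objects.get(pk=activity_id)
    # ... heavy post-processing goes here ...

# signals.py -- the handler only enqueues the work
from django.db.models.signals import post_save
from django.dispatch import receiver
from ryf_app.models import ActivityDetail
from .tasks import post_process_activity

@receiver(post_save, sender=ActivityDetail)
def on_activity_saved(sender, instance, created, **kwargs):
    post_process_activity.delay(instance.pk)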

Django with python asyncio to perform background task

I have two servers: a primary server that provides a REST API to accept data from users and maintains a product details list. This server is also responsible for sharing the product list (a subset of the product data) with a secondary server as soon as a product is updated or created.
Note also that the secondary URL depends on the product details; it is not a fixed server.
The primary server is written in Django. I have used Django model DB signals to catch product update, create, and delete events.
Now the problem is that I don't want to block my primary server's REST call until it has pushed the details to the secondary server. I need some kind of scheduler to do that, i.e. create a task that populates the data in the background without blocking my current thread.
I found that Python's asyncio module comes with a function, 'run_in_executor', and it is working so far, but I don't know what side effects it has when Django runs in a WSGI server. Can anyone explain, or suggest an alternative?
I found Django Channels, but it needs extra pieces such as running a worker thread separately and a Redis cache.
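The run_in_executor approach described in the question might look roughly like the sketch below (the URL, payload, and pool size are placeholders). Note that the blocking call actually runs on the thread pool as soon as it is submitted, even though the event loop itself is never run here, which is part of why its behaviour under a WSGI server is hard to reason about:

# sketch of the run_in_executor idea; `requests` and the URL are assumptions
import asyncio
import requests
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=4)
_loop = asyncio.new_event_loop()

def push_to_secondary(url, payload):
    # Blocking HTTP call, executed in a worker thread of the pool
    requests.post(url, json=payload, timeout=10)

def schedule_push(url, payload):
    # Submits the blocking call to the thread pool and returns immediately,
    # so the REST request/response cycle is not blocked
    _loop.run_in_executor(_executor, push_to_secondary, url, payload)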
You should use Django Celery for running tasks asynchronously or in the background.
Celery is a task queue with batteries included. It’s easy to use so that you can get started without learning the full complexities of the problem it solves.
You can get more information on celery from http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html#first-steps
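Following the first-steps guide linked above, a minimal setup for this use case might look like the sketch below (the project name, broker URL, and task body are assumptions):

# proj/celery.py -- sketch of a minimal Celery application for a Django project
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj', broker='amqp://localhost//')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task
def push_product_to_secondary(product_id, secondary_url):
    # Placeholder: load the product subset and POST it to the secondary server
    ...

The model signal handler can then call push_product_to_secondary.delay(product.pk, url) so the REST call returns immediately, while a separate worker process (started with celery -A proj worker) performs the actual push.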

How can I ensure a Celery task runs with the right settings?

I have two sites running essentially the same codebase, with only slight differences in settings. Each site is built in Django, with a WordPress blog integrated.
Each site needs to import blog posts from WordPress and store them in the Django database. When a user publishes a post, WordPress hits a webhook URL on the Django side, which kicks off a Celery task that grabs the JSON version of the post and imports it.
My initial thought was that each site could run its own instance of manage.py celeryd, each in its own virtualenv, and the two sites would stay out of each other's way. Each is daemonized with a separate upstart script.
But it looks like they're colliding somehow. I can run one at a time successfully, but if both are running, one instance won't receive tasks, or tasks will run with the wrong settings (in this case, each has a WORDPRESS_BLOG_URL setting).
I'm using a Redis queue, if that makes a difference. What am I doing wrong here?
Have you specified the name of the default queue that Celery should use? If you haven't set CELERY_DEFAULT_QUEUE, then both sites will be using the same queue and getting each other's messages. You need to set this to a different value for each site to keep the messages separate.
Edit
You're right, CELERY_DEFAULT_QUEUE is only for brokers like RabbitMQ. I think you need to set a different database number for each site, using a different number at the end of your broker URL.
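For example, each site's settings could point at a different Redis database number (a sketch; the exact setting name may vary with your Celery/django-celery version):

# Site A settings -- broker on Redis database 0
BROKER_URL = 'redis://localhost:6379/0'

# Site B settings -- broker on Redis database 1, so the two sites never share a queue
BROKER_URL = 'redis://localhost:6379/1'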
If you are using django-celery then make sure you don't have an instance of celery running outside of your virtualenvs. Then start the celery instance within your virtualenvs using manage.py celeryd like you have done. I recommend setting up supervisord to keep track of your instances.

fetching data from celery's backend (redis)

How do I get data (all I really need is the state of the task) from a Celery backend? I am using Redis.
Assuming that you have configured CELERY_RESULT_BACKEND to use Redis (see here), you can monitor your application using a variety of methods.
I believe that celeryctl should suffice.
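Programmatically, the task state can also be read straight from the result backend with AsyncResult (a sketch; task_id is whatever identifier .delay() or apply_async() returned when the task was queued):

from celery.result import AsyncResult

result = AsyncResult(task_id)
print(result.state)   # e.g. PENDING, STARTED, SUCCESS, FAILURE
print(result.result)  # the return value, or the exception on failure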
