In my Django project I implemented Celery, running with RabbitMQ as the broker.
What celery does:
Celery puts my tasks into a queue and then, under certain conditions, runs them. In other words, I interact with the RabbitMQ message queue exclusively through Celery's Python interface.
Problem:
I just want to push a simple string message into a RabbitMQ queue, to be consumed by a third-party application.
What I tried:
I could connect to RabbitMQ directly using the Pika library, but that feels a little clunky: if Celery is already connected to RabbitMQ, why not use that connection (if possible) to send simple messages to a specific queue, instead of opening another one with Pika?
Any insights appreciated.
You cannot use Celery to send arbitrary messages to your RabbitMQ server.
However, since you already use RabbitMQ as a broker, you already have all the necessary RabbitMQ support (py-amqp provides it, either directly or via librabbitmq), so you can easily send messages to the MQ server from your Celery tasks. If for whatever reason you do not like py-amqp, you can use Pika, as you mentioned.
Related
My site is written in Django. I need to run some tasks in the background of a container (I am using EC2).
Recently I researched Celery, but it requires Redis or another queue server to run. That rules Celery out for me, because I am not allowed to install anything else.
Question: Can I set up Celery standalone? If yes, how? If no, is there an alternative that can be installed standalone?
The answer is no - you cannot use Celery without a broker (Redis, RabbitMQ, or any other from the list of supported brokers).
I am not aware of a service that does both (queue management AND an execution environment for your tasks). The best services follow the UNIX philosophy - "do one thing, and do it well". A service like the one you describe would have to do two different, non-trivial things, which is probably why it most likely does not exist (at least not in the Python world).
I am implementing an online server using:
Flask
NGINX
Celery
Celery uses:
RabbitMQ as a broker
Redis as a Result backend.
I would like to know if it is possible to use Redis as a cache to avoid repeating big calculations when I receive the same request. For example, I want to return a cached result when I receive a POST with the same body.
If it is possible, do I have to configure it in Celery or in Redis? And how should I do it?
There are many existing extensions in the Flask ecosystem that let you do this easily, including Flask-Redis.
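As a rough sketch of the underlying idea, independent of any particular extension: key the cache on a stable hash of the request body and fall back to the real computation on a miss. The helper names here are made up, and `cache` can be any object with redis-py-style `get`/`set` (for example a `redis.Redis` client):

```python
import hashlib
import json


def body_key(body):
    """Stable cache key for a request body (dict key order does not matter)."""
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return "cache:" + digest


def cached_result(cache, body, compute, ttl=3600):
    """Return the cached answer for an identical body, else compute and store it.

    `cache` only needs get(key) and set(key, value, ex=seconds), which the
    redis-py client provides; results expire after `ttl` seconds.
    """
    key = body_key(body)
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)          # cache hit: skip the big calculation
    result = compute(body)              # cache miss: do the real work
    cache.set(key, json.dumps(result), ex=ttl)
    return result
```

This lives entirely in the Flask layer, so neither Celery nor Redis needs any special configuration for it: Redis just stores the serialized results.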
I have several applications that communicate using RabbitMQ. I need to process messages from the queue in a Python application; these messages are created and added to the queue by a different application. Is there a way to configure Celery to process tasks that were not created in Python? Or is there some way to achieve this without using Celery?
I have a follow-on / clarification question related to an older question
I have 2 servers (for now). 1 server runs a django web application. The other server runs pure python scripts that are CRON-scheduled data acquisition & processing jobs for the web app.
There is a use case where user activity in the web application (updating a certain field) should trigger a series of actions by the backend server. I could stick with CRON but as we scale up, I can imagine running into trouble. Celery seems like a good solution except I'm unclear how to implement it. (Yes, I did read the getting started guide).
I want the web application to send tasks to a specific queue but the backend server to actually execute the work.
Assuming that both servers are using the same broker URL,
Do I need to define stub tasks in Django, or can I just use the celery.send_task method?
Should I still be using django-celery?
Meanwhile the backend server will be running Celery with the full implementation of the tasks and workers?
I decided to try it and work through any issues that came up.
On my django server, I did not use django-celery. I installed celery and redis (via pip) and followed most of the instructions in the First Steps with Django:
updated the proj/proj/settings.py file to include the bare minimum of Celery configuration, such as the BROKER_URL
created the proj/proj/celery.py file, but without the task defined at the bottom
updated the proj/proj/__init__.py file as documented
Since the server running django wasn't actually going to execute any Celery tasks, I added the following to the view that triggers a task:
from proj.celery import app as celery_app

try:
    # send it to celery for backend processing
    celery_app.send_task(
        'tasks.mytask',
        kwargs={'some_id': obj.id, 'another_att': obj.att},
        queue='my-queue',
    )
except Exception as err:
    print('Issue sending task to Celery')
    print(err)
The other server had the following installed: celery and redis (I used an AWS Elasticache redis instance for this testing).
This server had the following files:
celeryconfig.py with all of my Celery configuration and queues defined, pointing to the same BROKER_URL as the django server
tasks.py with the actual code for all of my tasks
The celery workers were then started on this server, using the standard command: celery -A tasks worker -Q my-queue1,my-queue2
For testing, the above worked. Now I just need to make celery run in the background and optimize the number of workers/queue.
If anyone has additional comments or improvements, I'd love to hear them!
I'm using RabbitMQ to make my task pool run sequentially, one by one. But how can I add a time parameter so that a task only runs at a defined time in the future (like a scheduled task)?
RabbitMQ is not a task scheduler, even though the documentation talks about "scheduling" a task. You might consider using something like cron. You could also use a library like sched to build a scheduler in a Python process.
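As an illustration of the sched approach, here is a minimal in-process scheduler; the publish function is just a stand-in for whatever code pushes your real task to RabbitMQ:

```python
import sched
import time

# sched (stdlib) runs callables after a delay, in priority order.
scheduler = sched.scheduler(time.time, time.sleep)


def publish_to_queue(message):
    # Stand-in for your real "push this task to RabbitMQ" code.
    print("publishing:", message)


# Fire publish_to_queue 2 seconds from now (delay, priority, action, args).
scheduler.enter(2, 1, publish_to_queue, argument=("task-payload",))
scheduler.run()  # blocks until every scheduled event has fired
```

Note that sched is single-process and blocking; for anything long-lived you would run it in its own thread or process, whereas cron survives restarts for free.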
FYI It looks like this question has already been answered:
Delayed message in RabbitMQ
RabbitMQ has a plugin for delayed messages.
Using this plugin, the messages can be delivered to the respective queues after a certain delay. Thanks to this plugin, you can use RabbitMQ as a scheduler, even though it's not a task scheduler by nature.
You can use Celery, with RabbitMQ as the broker, for task scheduling. Here is the Celery documentation: http://docs.celeryproject.org/en/master/index.html