I use Celery and RabbitMQ for my project.
I have 3 servers (Main, A, B). A and B process the tasks sent from the Main server, then post the results back to it.
This is an organizational question: where do I need to install Celery and RabbitMQ?
As I understand it, RabbitMQ should be installed on the Main server (create a rabbitmq user, etc.) and Celery on the A and B servers. Or do A and B also need RabbitMQ installed?
Thanks!
There is no need to install RabbitMQ on all servers. Installing it on one server is sufficient. You just need to route tasks to the A and B servers.
Also, remember that AMQP is a network protocol; the producers, consumers and the broker can all reside on the same machine or on different machines. These are the roles involved:
Producer: A producer is a user application that sends messages.
Broker: A broker receives messages from producers and routes them to consumers. A broker consists of an exchange and one or more queues.
Consumer: A consumer is an application that receives messages and processes them.
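For example, here is a minimal sketch of how those roles map onto your three servers (the module name tasks, the add task and the hostname main-server are placeholders): the Celery client on Main is the producer, RabbitMQ on Main is the broker, and the workers on A and B are the consumers.

# tasks.py -- shared by Main, A and B ("main-server" is a placeholder for Main's hostname/IP)
from celery import Celery

# the broker (RabbitMQ) lives on the Main server only
app = Celery("tasks", broker="amqp://guest@main-server//")

@app.task
def add(x, y):
    return x + y

Servers A and B each run a worker against that broker (celery -A tasks worker), and calling add.delay(2, 2) on Main publishes a message that one of those workers picks up.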
Related
I am having issues running my Python Flask application from a Docker pull (remote pull).
In my app I use RabbitMQ as the message broker and Celery as the task scheduler. Everything works as expected when running locally, but when I put the application in Docker and pull it from a remote system, the app itself runs, but Celery and RabbitMQ are not running with it, so all tasks (called with method.delay()) hang forever and the HTTP request is never processed.
I need help putting my Python Flask application into Docker, since the application has asynchronous tasks to be processed with Celery. I am not sure how to modify docker-compose.yml to include the Celery service.
Thanks in advance.
I think you need to link the celery container with the rabbitmq container.
From https://docs.docker.com/compose/compose-file/#links
Link to containers in another service. Either specify both the service name and a link alias (SERVICE:ALIAS), or just the service name.
links:
- rabbitmq
Or
- rabbitmq:rabbitmq
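Once the celery (and web) containers are linked to the rabbitmq service, the broker is reachable from inside those containers under the service name rather than localhost. A minimal sketch of the Celery side (the app/module names are assumptions, and the hostname rabbitmq must match the service name in your docker-compose.yml):

# celery_app.py -- runs inside the linked container
from celery import Celery

# "rabbitmq" resolves to the broker container via the link/service name
app = Celery("myapp", broker="amqp://guest@rabbitmq//")

The worker itself then typically runs as its own service in docker-compose.yml, with a command such as celery -A celery_app worker, linked to rabbitmq in the same way as the web service.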
How can I configure a logstash agent that sends python logs to the redis broker?
I saw there's an option of using "beaver" as a background daemon, but I would rather use a Python module configured to send the logs directly instead of going through an indirect route.
Currently I'm using python-logstash, but I think it doesn't support inserting messages into a Redis queue.
Is it possible for a kombu producer to queue a message on rabbitmq to be processed by celery workers? It seems the celery workers do not understand the messages put on the queue by the kombu producer.
I understand that to communicate with RabbitMQ, you need a library that follows the AMQP specification.
Kombu is one such library: it can bind to a RabbitMQ exchange, and listen for and process messages by spawning any number of consumers.
Celery is an asynchronous task queue which has numerous add-ons, like the ability to write results to a DB or Redis cache, perform complex workflows, and so on.
That said, you can use kombu to read and write messages to/from RabbitMQ and use celery workers to process those messages.
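The part that usually trips people up is the message format: a plain kombu publish does not produce a message in the task format that celery workers expect. The simplest approach is to let Celery's own client (which uses kombu under the hood) serialize it. A minimal sketch, where the task name tasks.add and the broker URL are assumptions:

from celery import Celery

app = Celery("producer", broker="amqp://guest@localhost//")

# send_task publishes a properly formatted task message via kombu,
# without needing the task's code on the producer side;
# "tasks.add" is whatever task name the workers have registered
result = app.send_task("tasks.add", args=(2, 2))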
I have a celery setup running fine with rabbitmq as the broker. I also have CELERY_SEND_TASK_ERROR_EMAILS=True in my settings, so I receive emails if an exception is thrown while executing a task, which is fine.
My question is: is there a way, with either celery or rabbitmq, to receive an error notification, either from celery if the broker connection cannot be established, or from rabbitmq itself if the running rabbitmq-server dies?
I think the right tool for this job is a process control system like supervisord, which launches/watches processes and can trigger events when those processes die or restart. More specifically, using the plugin superlance, you can send an email when a process dies.
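If you also want a check from the Python side, one option is a small script run periodically (e.g. from cron) that tries to reach the broker and alerts on failure. A minimal sketch using kombu, with the broker URL and the alerting left as placeholders:

from kombu import Connection

BROKER_URL = "amqp://guest@localhost//"  # placeholder broker URL

try:
    # retries a few times, then raises if the broker cannot be reached
    Connection(BROKER_URL).ensure_connection(max_retries=3)
except Exception as exc:
    # broker unreachable -- send your alert email / notification here
    print("RabbitMQ is unreachable: %s" % exc)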
How can I use two different celery projects which consume messages from a single RabbitMQ installation?
Generally, these scripts work fine if I use a different rabbitmq instance for each of them. But on the production machine, I need them to share the same RabbitMQ backend.
Note: Due to some constraints, I cannot merge the new project into the existing one, so they will remain two different projects.
RabbitMQ has the ability to create virtual message brokers called virtual hosts or vhosts. Each one is essentially a mini-RabbitMQ server with its own queues. This lets you safely use one RabbitMQ server for multiple applications.
The rabbitmqctl add_vhost command creates a vhost.
By default Celery uses the default vhost, /:
celery worker --broker=amqp://guest@localhost//
But you can use any custom vhost:
celery worker --broker=amqp://guest@localhost/myvhost
Examples:
rabbitmqctl add_vhost new_host
rabbitmqctl add_vhost /another_host
celery worker --broker=amqp://guest@localhost/new_host
celery worker --broker=amqp://guest@localhost//another_host
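In the two projects this just means giving each Celery app its own broker URL; a minimal sketch (project names are examples, the vhosts match the ones created above):

# project one
from celery import Celery
app_one = Celery("project_one", broker="amqp://guest@localhost/new_host")

# project two -- isolated from project one even though both use the same RabbitMQ server
app_two = Celery("project_two", broker="amqp://guest@localhost//another_host")

Note that you will usually also have to grant your user permissions on each new vhost, e.g. rabbitmqctl set_permissions -p new_host guest ".*" ".*" ".*".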