I am using a RabbitMQ server with Python for sending and receiving messages.
This is the code I am using to send a message to the queue:
import numpy as np
import pandas as pd
import pika
connection = pika.BlockingConnection(pika.ConnectionParameters(
host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='Q1')
message = 'Hello World'
channel.basic_publish(exchange='',
                      routing_key='Q1',
                      body=message)
# Printing the Sending Confirmation of ID
print(" [x] Sent %r" % message)
connection.close()
Output:
[x] Sent 'Hello World'
This is the code I am using for receiving messages from the queue:
import pika
import sys
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='Q1')
def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)

channel.basic_consume(callback, queue='Q1', no_ack=True)
channel.start_consuming()
Output:
[x] Received 'Hello World'
The problem is that I want to save this message, i.e. "Hello World", to a variable and then use it in my program, but I am not able to save the message.
How can I save it to a variable?
And what would be the solution for multiple messages in the queue?
The problem is I want to save this message i.e. "Hello World" to a variable and then use it in my program
You've already got a variable you can use in your program: body. If you'd like to decouple the infrastructure code (i.e. RabbitMQ/pika) from the business logic, you can simply declare another function and pass the body to it.
def processing_function(message_received):
    print(" [x] Received %r" % message_received)

def callback(ch, method, properties, body):
    processing_function(body)
The idea is that pika calls callback once a message is received, and the body is then passed to processing_function, which performs the calculations.
If you're struggling to understand the callback function I'd recommend you to read this first.
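For completeness, here is a minimal sketch of a consumer built this way that also keeps every decoded message in a list, so the values stay available to the rest of the program (the queue name 'Q1' and the old-style basic_consume signature are taken from the question; in pika 1.x the call becomes channel.basic_consume(queue='Q1', on_message_callback=callback, auto_ack=True)):

import pika

received_messages = []  # every message body processed so far

def processing_function(message_received):
    # business logic lives here, decoupled from pika
    text = message_received.decode()  # the body arrives as bytes
    received_messages.append(text)
    print(" [x] Processed %r" % text)

def callback(ch, method, properties, body):
    processing_function(body)

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='Q1')
channel.basic_consume(callback, queue='Q1', no_ack=True)
channel.start_consuming()

Because callback runs once per delivery, this handles any number of messages in the queue; each one ends up appended to received_messages.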
Related
I am just starting to use RabbitMQ and Python in general. I have been reading the tutorials on the official RabbitMQ page, but I have no idea how to use RabbitMQ to do other things.
I have been trying to run the example from this tutorial (https://www.rabbitmq.com/tutorials/tutorial-three-python.html) and it runs well, BUT I need to know how I can create more than one function and call them through RabbitMQ messages. (I am also using this example (Python and RabbitMQ - Best way to listen to consume events from multiple channels?) to guide me.)
I hope someone has some idea how to do this (I will repeat again, I am very new to these topics).
This is some of the code I have.
I use this code to send the message, as in the tutorial:
import pika
import sys
url = 'amqp://oamogcgg:xxxxxxxxxxxxxxxxxxxxxxxxx@salamander.rmq.cloudamqp.com/oamogcgg'
params = pika.URLParameters(url)
connection = pika.BlockingConnection(params)
channel = connection.channel()
channel.exchange_declare(exchange='logs', exchange_type='fanout')
message = ' '.join(sys.argv[1:]) or "info: THIS IS A TEST MESSAGE !!!!!!"
channel.basic_publish(exchange='logs', routing_key='', body=message)
print(" [x] Sent %r" % message)
connection.close()
And this file is where I receive the message, as far as I understand:
import pika
import sys
import threading
from time import sleep

threads = []
# function 1
def validator1(channel):
    channel.queue_declare(queue='queue_name')
    print(' [*] Waiting for messages for validator1, press CTRL+C')

    def callback(ch, method, properties, body):
        print(" Received %s" % (body))
        sleep(2)  # I need to stop it for two minutes

    channel.basic_consume(callback, queue='queue_name', no_ack=True)
    channel.start_consuming()
# function 2
def validator2(channel):
    channel.queue_declare(queue='queue_name')
    print(' [*] Waiting for messages for validator2, press CTRL+C')

    def callback(ch, method, properties, body):
        print(" Received %s" % (body))
        sleep(2)  # I need to stop it for two minutes

    channel.basic_consume(callback, queue='queue_name', no_ack=True)
    channel.start_consuming()
def manager():
    url = 'amqp://oamogcgg:xxxxxxxxxxxxxxxxxxxxxxxxx@salamander.rmq.cloudamqp.com/oamogcgg'
    params = pika.URLParameters(url)

    # channel 1
    connection1 = pika.BlockingConnection(params)
    channel1 = connection1.channel()
    channel1.exchange_declare(exchange='logs', exchange_type='fanout')
    result = channel1.queue_declare(queue='', exclusive=True)
    queue_name = result.method.queue
    channel1.queue_bind(exchange='logs', queue=queue_name)

    # channel 2
    connection2 = pika.BlockingConnection(params)
    channel2 = connection2.channel()
    channel2.exchange_declare(exchange='logs', exchange_type='fanout')
    result = channel2.queue_declare(queue='', exclusive=True)
    queue_name = result.method.queue
    channel2.queue_bind(exchange='logs', queue=queue_name)

    # creating threads
    t1 = threading.Thread(target=validator1, args=(channel1,))
    t1.daemon = True
    threads.append(t1)
    t1.start()

    t2 = threading.Thread(target=validator2, args=(channel2,))
    t2.daemon = True
    threads.append(t2)
    t2.start()

    for t in threads:
        t.join()

manager()
If your functions are dependent (by dependent I mean one function works on the output of the other), then you can call those functions one by one in your callback function, as sketched below. Once the last function has executed successfully, you can acknowledge the message.
If the functions are independent, you can maintain a separate queue for each function you want to execute on the message. The same message can be routed to multiple queues using a fanout exchange, as described in RabbitMQ Tutorial 3.
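A rough sketch of the dependent case (step_one and step_two are placeholder names, not from the question), where the acknowledgement is sent only after the last step succeeds:

def step_one(body):
    # first piece of work; returns data for the next step (placeholder logic)
    return body.decode().upper()

def step_two(intermediate):
    # second piece of work, dependent on step_one's output
    print(" [x] Final result: %s" % intermediate)

def callback(ch, method, properties, body):
    intermediate = step_one(body)
    step_two(intermediate)
    # acknowledge only once every dependent step has finished
    ch.basic_ack(delivery_tag=method.delivery_tag)

Note that this requires consuming with manual acknowledgements, i.e. no_ack=False (auto_ack=False in pika 1.x), rather than the no_ack=True used in the question.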
I am trying to build a system where I can send messages to different users based on their subscriptions to certain events. Basically I have an API that gives me a live stream of events. Some of the users will be subscribed to those events. My task is to send a message to those users whenever such an event occurs. I am trying to design the system in Python.
Currently I have the following questions:
How to continuously poll for events from a live stream API in Python.
How to find out which users are subscribed to a particular event (Redis or MySQL).
How to send notifications to all the users of a particular event (pub/sub).
I am thinking of using Amazon SNS, but I am not quite sure about the overall architecture.
RabbitMQ is lightweight and easy to deploy on premise and in the cloud. It supports multiple messaging protocols. RabbitMQ can be deployed in distributed and federated configurations to meet high-scale, high-availability requirements.
Just a small example:
Producer sends messages to the "hello" queue. The consumer receives messages from that queue. This will create a Queue (hello) with a message on the RabbitMQ cluster.
#!/usr/bin/env python
import pika
RABBITMQ_USERNAME = 'ansible'
RABBITMQ_PASSWORD = 'ansible'
connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='example.eu-central-1.elb.amazonaws.com',
    heartbeat_interval=25,
    credentials=pika.PlainCredentials(RABBITMQ_USERNAME, RABBITMQ_PASSWORD)))
channel = connection.channel()
channel.queue_declare(queue='hello')
channel.basic_publish(exchange='',
                      routing_key='hello',
                      body='Hello World!')
print(" [x] Sent 'Hello World!'")
connection.close()
Receive a message from a named queue:
#!/usr/bin/env python
import pika
RABBITMQ_USERNAME = 'ansible'
RABBITMQ_PASSWORD = 'ansible'
connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='example.elb.amazonaws.com',
    heartbeat_interval=25,
    credentials=pika.PlainCredentials(RABBITMQ_USERNAME, RABBITMQ_PASSWORD)))
channel = connection.channel()
channel.queue_declare(queue='hello')
def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)

channel.basic_consume(callback,
                      queue='hello',
                      no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
def response(return_JSON):
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=HOST))
    channel = connection.channel()
    channel.exchange_declare(exchange=EXCHANGE, type='direct', durable='True')
    channel.queue_declare(queue=QUEUE_NAME, durable='True')
    channel.queue_bind(exchange=EXCHANGE, queue=QUEUE_NAME)
    channel.basic_publish(exchange=EXCHANGE, routing_key=RESPONSE_ROUTING_KEY, body=return_JSON)
    connection.close()
    print " [x] Finished sending msg to sender..."

def callback(ch, method, properties, body):
    print " [x] Received message from sender!"
    print body
    result_JSON = Work().compute(json.loads(body))
    if result_JSON is not None:
        #result_JSON = json.dumps(result_JSON)
        response(result_JSON)
    channel.basic_ack(delivery_tag=method.delivery_tag)
    print "Acknowledged to sender"

if __name__ == '__main__':
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=HOST))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE_NAME, durable='True')
    channel.basic_consume(callback, queue=QUEUE_NAME, no_ack=False)
    print " [*] Waiting for messages. To exit press CTRL+C"
    channel.start_consuming()
This often works for a while and then falls over. I only need to consume messages one at a time, process them, and send back a response message with a different routing key. Is there a smarter, more reliable way to create a minimal consumer that processes messages one at a time?
The error message is:
Connection reset by peer
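One common pattern for the "one message at a time" requirement (a sketch only, not from the original post; the host and queue names below are placeholders, and for simplicity the reply goes to a plain response queue on the default exchange rather than a direct exchange) is to set a prefetch of one, so the broker never delivers a second message until the current one is acknowledged:

import pika

# placeholder values, not from the original post
HOST = 'localhost'
QUEUE_NAME = 'work_queue'
RESPONSE_QUEUE = 'responses'

def process(body):
    # placeholder for the actual computation
    return body

def on_message(ch, method, properties, body):
    reply = process(body)
    if reply is not None:
        ch.basic_publish(exchange='', routing_key=RESPONSE_QUEUE, body=reply)
    # acknowledge only after the reply has been sent
    ch.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters(host=HOST))
channel = connection.channel()
channel.queue_declare(queue=QUEUE_NAME, durable=True)
channel.queue_declare(queue=RESPONSE_QUEUE, durable=True)
channel.basic_qos(prefetch_count=1)  # at most one unacknowledged message in flight
channel.basic_consume(on_message, queue=QUEUE_NAME, no_ack=False)  # old-style pika signature, as in the post
channel.start_consuming()

If processing takes a long time, the broker's heartbeat timeout can also close the connection ("Connection reset by peer"), so keeping the per-message work short, or raising the heartbeat interval, may help as well.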
I have implemented RabbitMQ on my servers. Basically the main server passes messages to the worker server.
The problem I am facing is that not all of the messages I pass are received by the worker server,
i.e. if I send 10 messages only 4 of them are received.
Any idea where I am going wrong?
Receiving code
import pika
connection = pika.BlockingConnection(pika.ConnectionParameters(
host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)

channel.basic_consume(callback,
                      queue='hello',
                      no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
Publishing code
import pika
import sys
connection = pika.BlockingConnection(pika.ConnectionParameters(
host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
message = ' '.join(sys.argv[1:]) or "Hello World!"
channel.basic_publish(exchange='',
                      routing_key='task_queue',
                      body=message,
                      properties=pika.BasicProperties(
                          delivery_mode=2,  # make message persistent
                      ))
print(" [x] Sent %r" % message)
connection.close()
Assuming that you are publishing to the same queue (the examples you posted suggest otherwise: the consumer reads from 'hello' while the publisher sends to 'task_queue'), I would recommend that you enable the confirm delivery flag. This will ensure that your message gets delivered; if it is not, basic_publish will either throw an exception or return False.
channel = connection.channel()
channel.confirm_delivery()

published = channel.basic_publish(...)
if not published:
    raise Exception("Unable to publish message!")
It might also be worthwhile to install the management plugin for RabbitMQ and inspect the queue before you start consuming messages. That way you can verify that the messages were published and, later, consumed.
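Putting both points together, a publisher sketch under those assumptions (same 'hello' queue as the consumer, confirms enabled) could look like the following; note that in newer pika releases (1.x) basic_publish raises an exception such as pika.exceptions.UnroutableError rather than returning False:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

# declare the SAME queue the consumer reads from
channel.queue_declare(queue='hello')
channel.confirm_delivery()

# older pika: basic_publish returns False if the broker did not confirm the message
published = channel.basic_publish(exchange='',
                                  routing_key='hello',
                                  body='Hello World!')
if not published:
    raise Exception("Unable to publish message!")

connection.close()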
I am trying to make a simple AMQP client using Python. I copied the code I found on the RabbitMQ website:
#!/usr/bin/env python
import pika
connection = pika.BlockingConnection(pika.ConnectionParameters(
host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='hello')
def callback(ch, method, properties, body):
    print(" [x] Received %r" % body)

channel.basic_consume(callback,
                      queue='hello',
                      no_ack=True)
print(' [*] Waiting for messages. To exit press CTRL+C')
channel.start_consuming()
This works, except that it always prints something like [x] Received b'my message'. Because of this I can't parse my JSON messages. How do I fix this?
You may use decode() to convert the bytes to a UTF-8 string and then print it out, something like:
data = b'your bytes'
print(data.decode())
Adding to yichucai's correct answer, I found that you can call the decode() method directly on the body variable inside print, like so:
print(" [x] Received %r" % body.decode())