Python Proton sending binary data to ActiveMQ

I am trying to write a simple string message to an ActiveMQ queue:
from proton import Message
from proton.utils import BlockingConnection

def write_to_amq(message, host_name, port, queue):
    conn = BlockingConnection(f'{host_name}:{port}')
    sender = conn.create_sender(queue)
    sender.send(Message(body='message'))
    conn.close()
The message gets to the queue just fine, but it appears to have some binary data in it when I view it on the ActiveMQ web UI. It reports the contents as SpESsESw�message. I was expecting the contents to just be message.
[Additional data point]
I am also seeing this in a separate Go program I have written using the pack.ag/amqp package.
func (s *amqpSender) SendResult(data string) error {
	session, err := s.client.NewSession()
	if err != nil {
		return fmt.Errorf("failure creating AMQP session: %s", err)
	}

	ctx := context.Background()
	sender, err := session.NewSender(
		amqp.LinkTargetAddress(s.workQueueName),
	)
	if err != nil {
		return fmt.Errorf("failure creating sender link: %s", err)
	}

	ctx, cancel := context.WithTimeout(ctx, s.timeout)
	defer func() {
		cancel()
		sender.Close(ctx)
	}()

	err = sender.Send(ctx, amqp.NewMessage([]byte(data)))
	if err != nil {
		return fmt.Errorf("failure sending message: %s", err)
	}
	return nil
}
When I send a different message to ActiveMQ, I get similar behavior, seeing Su�vMy message in the ActiveMQ Message Details. Could this just be a web UI anomaly?

It is not a web UI anomaly: if you receive the message over the OpenWire protocol, you'll see the same thing you see on the web page. It seems that ActiveMQ encodes the message properties inside the message payload, which is why those weird characters appear at the start of the body. My hypothesis is that ActiveMQ encodes the properties you see at the top right of the web page at the start of the body.
Furthermore, if you send a message with a text-based protocol like OpenWire or STOMP, you won't see any properties or weird bytes at the start of the body.
There are three potential solutions to this issue:
If you want to keep using ActiveMQ
As said here, you can add
<transportConnectors>
<transportConnector name="amqp" uri="amqp://localhost:5672?transport.transformer=jms"/>
</transportConnectors>
inside the ActiveMQ configuration so that AMQP messages are transformed into JMS TextMessages. However, if you're using Amazon MQ, the managed ActiveMQ service on AWS, this configuration is currently not available, so please open a ticket so they prioritize it.
Use a text-based protocol such as OpenWire or STOMP instead of AMQP
If you don't mind using another broker than ActiveMQ
Consider switching to RabbitMQ where AMQP is a first-class citizen.
I know the question is over a year old, but I hope this will still be helpful!

You need to tell the Python binding that you want to encode the body as a String value by adding the unicode encoding prefix to your string, so that the Python binding knows what to do with the data you are encoding. The way you are currently handing off the body results in a binary encoding instead, so the broker shows you the garbage data on the console, as it views this as a BytesMessage instead of a TextMessage.
It should work if you did something like the following:
sender.send(Message(body=u"message"))
More on Python string encoding and decoding here.
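For a concrete contrast, here is a minimal sketch assuming the same python-qpid-proton BlockingConnection API used in the question (host, port, and queue name are placeholders): a text body is encoded as an AMQP string, which the broker can surface as a TextMessage, while a bytes body is encoded as AMQP binary data and is treated as a BytesMessage.
from proton import Message
from proton.utils import BlockingConnection

conn = BlockingConnection('localhost:5672')   # placeholder host:port
sender = conn.create_sender('example-queue')  # placeholder queue name

# Text (unicode) body -> AMQP string -> shown as a TextMessage
sender.send(Message(body=u'message'))

# Bytes body -> AMQP binary -> shown as a BytesMessage (garbled in the UI)
sender.send(Message(body=b'message'))

conn.close()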

Related

Subscribe to Google PubSub from Azure Function App

I'm trying to figure out the best way to subscribe to a Google PubSub subscription from Azure Function App. My solution below "works" -- meaning I can start it up locally and it will pull messages that are published to the subscribed-to topic. But this can't be the right way of doing this, so I'm looking for better ideas.
I'm kicking off a timer and then listening for messages. But would a Durable Functions app be better? The patterns described in the documentation don't fit this scenario.
Another approach might be using the timer, but then pulling instead of listening asynchronously?
I want to log these messages and publish them to Azure ESB.
Any advice or examples would be appreciated. I'm happy to use C# or Python -- or any language for that matter.
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Google.Cloud.PubSub.V1;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace NetEaiDemo
{
    public class Function1
    {
        [FunctionName("Function1")]
        public async Task Run([TimerTrigger("*/30 * * * * *")]TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
            string projectId = "project1";
            string subscriptionId = "subscription1";
            SubscriptionName subscriptionName = new SubscriptionName(projectId, subscriptionId);
            SubscriberClient subscriber = await SubscriberClient.CreateAsync(subscriptionName);
            List<PubsubMessage> receivedMessages = new List<PubsubMessage>();

            // Start the subscriber listening for messages.
            await subscriber.StartAsync((msg, cancellationToken) =>
            {
                receivedMessages.Add(msg);
                Console.WriteLine($"Received message {msg.MessageId} published at {msg.PublishTime.ToDateTime()}");
                Console.WriteLine($"Text: '{msg.Data.ToStringUtf8()}'");

                // Stop this subscriber after one message is received.
                // This is non-blocking, and the returned Task may be awaited.
                subscriber.StopAsync(TimeSpan.FromSeconds(15));

                // Return Reply.Ack to indicate this message has been handled.
                return Task.FromResult(SubscriberClient.Reply.Ack);
            });
        }
    }
}

Contract testing with Kafka in Python environment?

I am working with multiple applications that communicate asynchronously using Kafka. These applications are managed by several departments and contract testing is appropriate to ensure that the messages used during communication follow the expected schema and will evolve according to the contract specification.
It sounded like the Pact library for Python is a good fit, because it helps create contract tests for HTTP and message integrations.
What I wanted to do is send an HTTP request and then listen on the appropriate, dedicated Kafka topic immediately after. But it seems that the test is forcing me to specify an HTTP status code, even though what I am expecting is a message from a queue, which has no HTTP status code. Furthermore, it seems that the HTTP request is sent before the consumer is listening. Here is some sample code.
import atexit
import unittest

from pact.consumer import Consumer as p_Consumer
from pact.provider import Provider as p_Provider
from confluent_kafka import Consumer as k_Consumer

pact = p_Consumer('Consumer').has_pact_with(p_Provider('Provider'))
pact.start_service()
atexit.register(pact.stop_service)

config = {'bootstrap.servers': 'server', 'group.id': 0, 'auto.offset.reset': 'latest'}
consumer = k_Consumer(config)
consumer.subscribe(['usertopic'])

def user():
    while True:
        msg = consumer.poll(timeout=1)
        if msg is None:
            continue
        else:
            return msg.value().decode()

class ContractTesting(unittest.TestCase):
    def test_get_user(self):
        expected = {
            'username': 'UserA',
            'id': 123,
            'groups': ['Editors']
        }
        (pact.given('UserA exists and is not an administrator')
             .upon_receiving('a request for UserA')
             .with_request(method='GET', path='/user/')
             .will_respond_with(200, body=expected))
        with pact:
            result = user()
            self.assertEqual(result, expected)
How would I carry out contract testing in Python using Kafka? It feels like I am going through a lot of hoops to carry out this test.
With Pact messages, you write tests against a different API. You don't use the standard HTTP one; in fact, the transport itself is ignored altogether, and it's just the payload (the message) we're interested in capturing and verifying. This allows us to test any queue without having to build specific interfaces for each one.
See this example: https://github.com/pact-foundation/pact-python/blob/02643d4fb89ff7baad63e6436f6a929256c6bf12/examples/message/tests/consumer/test_message_consumer.py#L65
You can read more about message pact testing here: https://docs.pact.io/getting_started/how_pact_works#non-http-testing-message-pact
And finally here are some Kafka examples for other languages that may be helpful: https://docs.pactflow.io/docs/examples/kafka/js/consumer
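As a rough sketch of what a message pact consumer test can look like with pact-python (adapted from the linked example; the consumer/provider names, event payload, and handler below are hypothetical, and in a Kafka setup the handler would be whatever function you pass each consumed record's decoded value to):
from pact import MessageConsumer, Provider

pact = MessageConsumer('UserConsumer').has_pact_with(Provider('UserProvider'))

expected_event = {
    'username': 'UserA',
    'id': 123,
    'groups': ['Editors'],
}

def handle_user_message(event):
    # Hypothetical handler under test; no broker or HTTP server is involved.
    assert event['groups'] == ['Editors']

(pact
 .given('UserA exists and is not an administrator')
 .expects_to_receive('a user event for UserA')
 .with_content(expected_event)
 .with_metadata({'Content-Type': 'application/json'}))

with pact:
    # The expected message is handed straight to the handler; on success the
    # contract is written to a pact file for the provider to verify.
    handle_user_message(expected_event)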

Connecting Python to a webpage via node-red

I am trying to take info from my Python program and update the web page with it in real time.
I am trying to use Node-RED and communicate via WebSockets.
My Python program is below:
#!/usr/bin/python
import time
import websocket

ws = websocket.WebSocket()
ws.connect("ws://localhost:1880/ws/example")
count = 0
while count < 50:
    print "Sending 'Hello, World'..."
    ws.send("Hello, World")
    print "Sent"
    time.sleep(5)
    count = count + 1
ws.close()
Using Node-RED I have set up my flow as follows:
[Node-RED flow screenshot]
However, when I run them both, my Python program says it is sending the message, yet the Node-RED console shows null for the msg value.
Please check your settings.js file -- do you have a URL prefix defined for either httpRoot or httpNodeRoot?
For instance, in my project, when I add a new websocket config node, this info box is shown:
By default, payload will contain the data to be sent over, or received from a
websocket. The listener can be configured to send or receive the entire message
object as a JSON formatted string. This path will be relative to /red.
If so, I believe you will have to modify the URL in your Python code, like so:
ws.connect("ws://localhost:1880/red/ws/example")
substituting your prefix, of course...

zeromq REP node will only get the first message (Req,Rep pattern)

I have a Node.js app which listens for messages from clients (Python apps).
The pattern I used for communication over ZeroMQ is the REQ/REP pattern.
The main app should get messages from many clients; it will not reply to them, just receive messages.
The problem is that the main app only gets the first message, and the following messages never show up in the Node.js app console.
In other words, every time I start the Node.js app I only get one message.
Here is my code:
Node.js app:
var zmq = require('zmq');

var responder = zmq.socket('rep');

responder.on('message', function(request) {
    console.log(request);
    // here, it seems this function will be called just once!
});

responder.bind('tcp://127.0.0.1:8000', function(err) {
    if (err) {
        console.log(err);
    } else {
        console.log('Listening on 8000...');
    }
});
Python (client) part:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://127.0.0.1:8000")
socket.send('blaaaa')
print 'message sent!'
The Python part is inside a function, and I can see the output of "message sent!" in the Python console (many 'message sent!' lines, in fact).
But I cannot see the messages in the Node.js app; only the first message shows up in its console.
When using the REQ/REP pattern you actually need to respond to a request before you are given the next one; you only handle one request at a time.
var responder = zmq.socket('rep');

responder.on('message', function(request) {
    console.log(request);
    responder.send('Here comes the reply!');
});
Respond, and you will receive the next one. If you do not wish to respond, then you need to choose a socket pair other than REQ/REP, e.g. PUSH/PULL (sketched below), or look at XREQ/XREP (ROUTER/DEALER) if you wish to handle multiple requests at the same time.
If in doubt, look up the send/receive pattern for each socket type at http://api.zeromq.org/2-1:zmq-socket
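For illustration, here is a minimal PUSH/PULL sketch in Python with both ends in one script (the original server is Node.js; the port is arbitrary). Note also that a REQ socket must strictly alternate send and receive, so the Python client above would need a socket.recv() after each send before it could send again.
import zmq

context = zmq.Context()

# "Server" side: bind a PULL socket and receive without ever replying.
puller = context.socket(zmq.PULL)
puller.bind("tcp://127.0.0.1:8001")

# Client side: connect a PUSH socket and send as many messages as you like.
pusher = context.socket(zmq.PUSH)
pusher.connect("tcp://127.0.0.1:8001")

for i in range(3):
    pusher.send_string("message %d" % i)

for i in range(3):
    print(puller.recv_string())  # no reply required, unlike REP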

Google Glass callbackUrl POST from Mirror API is empty?

Apologies, because the only web development I know is of the Django/Python kind, and I am probably guilty of mixing my code idioms (REST vs. Django URL dispatch workflow).
I have a URL handler which serves as a callbackUrl for a subscription in my Glassware. I am getting a POST to the handler, but the request object seems empty.
I am sure I am misunderstanding this, but can someone point me in the direction of getting the "REPLY" information from a POST notification to a callbackUrl?
My URL handler is:
class A600Handler(webapp2.RequestHandler):

    def post(self):
        """Process the value of A600 received and return a plot"""
        # I am seeing this in my logs, proving that I am getting a POST when Glass replies
        logging.info("Received POST to logA600")
        # This is returning None
        my_collection = self.request.get("collection")
        logging.info(my_collection)
        # I also tried this, but self.request.POST is empty '[]' and of type UnicodeMultiDict
        # json_request_data = json.loads(self.request.POST)

    # util.auth_required
    def get(self):
        """Process the value of A600 received and return a plot"""
        logging.info("Received GET to this logA600")
I have the following URL handler defined and can verify that the post function is getting a "ping" when the user hits reply, by looking at the App Engine logs.
MAIN_ROUTES = [
    ('/', MainHandler),
    ('/logA600', A600Handler),
]
How do I extract the payload, i.e. the voice-transcribed text sent by the user? I am not understanding the "parse_notification" example given in the docs.
Did you try request.body? The docs for request.POST state
"If you need to access raw or non-form data posted in the request, access this through the HttpRequest.body attribute instead."
If the API isn't using form data in its post, you'll likely find the contents in request.body. The docs to which you linked indicate that the content will be placed as JSON in the body instead of form data ("containing a JSON request body"). I would try json.loads(request.body).
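Applied to the handler above, that suggestion looks roughly like this (a sketch only; the exact shape of the notification JSON, including the "collection" field, is an assumption to check against the Mirror API docs):
import json
import logging

import webapp2

class A600Handler(webapp2.RequestHandler):

    def post(self):
        # The Mirror API posts a raw JSON body rather than form data, so
        # self.request.get(...) finds nothing; read self.request.body instead.
        notification = json.loads(self.request.body)
        logging.info(notification.get("collection"))
        # Acknowledge with a 200 promptly so the notification is not retried.
        self.response.set_status(200)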
I am also having this issue: the Mirror API calls my application with notifications, and those notifications are empty. My app runs on Tomcat, so it's a Java stack. All the samples process the notification like this:
BufferedReader notificationReader = new BufferedReader(
        new InputStreamReader(request.getInputStream()));
String notificationString = "";

// Count the lines as a very basic way to prevent Denial of Service attacks
int lines = 0;
while (notificationReader.ready()) {
    notificationString += notificationReader.readLine();
    lines++;
    // No notification would ever be this long. Something is very wrong.
    if (lines > 1000) {
        throw new IOException(
                "Attempted to parse notification payload that was unexpectedly long.");
    }
}

log.info("got raw notification " + notificationString);
For me this always logs as empty. Since a notification URL must be HTTPS, and for testing I could not use an IP address, I have set up a dyndns service to point to my service running on localhost:8080. This all seems to work, but I suspect that dyndns does some type of forward or redirect in which the POST data is removed.
How can I work around this for local development?
Updated:
Solved for me.
I found that closing the response before reading the request caused the problem: the request's input stream was already closed. Moving this
response.setContentType("text/html");
Writer writer = response.getWriter();
writer.append("OK");
writer.close();
to after I had fully read the request notification into a String solved the issue.
