Subscribe to Google PubSub from Azure Function App - python

I'm trying to figure out the best way to subscribe to a Google PubSub subscription from Azure Function App. My solution below "works" -- meaning I can start it up locally and it will pull messages that are published to the subscribed-to topic. But this can't be the right way of doing this, so I'm looking for better ideas.
I'm kicking off a timer and then listening for messages. Would a Durable Functions app be better? None of the patterns described in the documentation fit this scenario.
Another approach might be to keep the timer but pull messages synchronously instead of listening asynchronously.
I want to log these messages and publish them to Azure Service Bus.
Any advice or examples would be appreciated. I'm happy to use C# or Python -- or any language for that matter.
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Google.Cloud.PubSub.V1;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace NetEaiDemo
{
    public class Function1
    {
        [FunctionName("Function1")]
        public async Task Run([TimerTrigger("*/30 * * * * *")] TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
            string projectId = "project1";
            string subscriptionId = "subscription1";
            SubscriptionName subscriptionName = new SubscriptionName(projectId, subscriptionId);
            SubscriberClient subscriber = await SubscriberClient.CreateAsync(subscriptionName);
            List<PubsubMessage> receivedMessages = new List<PubsubMessage>();

            // Start the subscriber listening for messages.
            await subscriber.StartAsync((msg, cancellationToken) =>
            {
                receivedMessages.Add(msg);
                Console.WriteLine($"Received message {msg.MessageId} published at {msg.PublishTime.ToDateTime()}");
                Console.WriteLine($"Text: '{msg.Data.ToStringUtf8()}'");

                // Stop this subscriber after one message is received.
                // This is non-blocking, and the returned Task may be awaited.
                subscriber.StopAsync(TimeSpan.FromSeconds(15));

                // Return Reply.Ack to indicate this message has been handled.
                return Task.FromResult(SubscriberClient.Reply.Ack);
            });
        }
    }
}
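
For comparison, here is a minimal sketch of the synchronous-pull approach mentioned in the question, using the Python google-cloud-pubsub client; the project and subscription names are placeholders, and inside an Azure Function this body would sit in a timer-triggered handler.

from google.cloud import pubsub_v1

# Placeholder names; in practice these would come from app settings.
project_id = "project1"
subscription_id = "subscription1"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

# Pull a bounded batch instead of holding a streaming listener open.
response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 10},
    timeout=30,
)

ack_ids = []
for received in response.received_messages:
    print(f"Received {received.message.message_id}: {received.message.data.decode('utf-8')}")
    # Logging and forwarding to Azure Service Bus would happen here.
    ack_ids.append(received.ack_id)

# Acknowledge only the messages that were processed.
if ack_ids:
    subscriber.acknowledge(
        request={"subscription": subscription_path, "ack_ids": ack_ids}
    )

With this shape, each timer run drains whatever is queued and then exits, instead of keeping a long-lived streaming subscriber alive inside the function.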

Related

Contract testing with Kafka in Python environment?

I am working with multiple applications that communicate asynchronously using Kafka. These applications are managed by several departments and contract testing is appropriate to ensure that the messages used during communication follow the expected schema and will evolve according to the contract specification.
The pact library for Python sounded like a good fit because it helps create contract tests for HTTP and message integrations.
What I wanted to do is send an HTTP request and then listen on the appropriate, dedicated Kafka topic immediately after. But the test seems to force me to specify an HTTP status code, even though what I am expecting is a message from a queue that has no HTTP status code at all. Furthermore, it seems that the HTTP request is being sent before the consumer is listening. Here is some sample code.
import atexit
import unittest

from pact.consumer import Consumer as p_Consumer
from pact.provider import Provider as p_Provider
from confluent_kafka import Consumer as k_Consumer

pact = p_Consumer('Consumer').has_pact_with(p_Provider('Provider'))
pact.start_service()
atexit.register(pact.stop_service)

config = {'bootstrap.servers': 'server', 'group.id': 0, 'auto.offset.reset': 'latest'}
consumer = k_Consumer(config)
consumer.subscribe(['usertopic'])

def user():
    while True:
        msg = consumer.poll(timeout=1)
        if msg is None:
            continue
        else:
            return msg.value().decode()

class ContractTesting(unittest.TestCase):
    def test_get_user(self):
        expected = {
            'username': 'UserA',
            'id': 123,
            'groups': ['Editors']
        }
        (pact.given('UserA exists and is not an administrator')
             .upon_receiving('a request for UserA')
             .with_request(method='GET', path='/user/')
             .will_respond_with(200, body=expected))
        with pact:
            result = user()
            self.assertEqual(result, expected)
How would I carry out contract testing in Python using Kafka? It feels like I am going through a lot of hoops to carry out this test.
With Pact message testing, you write your tests against a different API. You don't use the standard HTTP one; in fact, the transport itself is ignored altogether, and it's just the payload - the message - that we're interested in capturing and verifying. This allows us to test any queue without having to build specific interfaces for each one.
See this example: https://github.com/pact-foundation/pact-python/blob/02643d4fb89ff7baad63e6436f6a929256c6bf12/examples/message/tests/consumer/test_message_consumer.py#L65
You can read more about message pact testing here: https://docs.pact.io/getting_started/how_pact_works#non-http-testing-message-pact
And finally here are some Kafka examples for other languages that may be helpful: https://docs.pactflow.io/docs/examples/kafka/js/consumer
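To give a rough idea of what that looks like in pact-python, here is a minimal sketch modelled on the linked example; handle_user_message is a stand-in for your own consumer-side handler, and the exact class and method names may differ between pact-python versions.

from pact import MessageConsumer, Provider

# Hypothetical application handler under test; you supply this.
def handle_user_message(event):
    return event

pact = MessageConsumer('UserConsumer').has_pact_with(Provider('UserProvider'))

expected = {
    'username': 'UserA',
    'id': 123,
    'groups': ['Editors'],
}

(pact
 .given('UserA exists and is not an administrator')
 .expects_to_receive('a user message on usertopic')
 .with_content(expected)
 .with_metadata({'topic': 'usertopic'}))

with pact:
    # No HTTP request and no Kafka broker here: the expected message is
    # handed straight to the consumer-side handler, and the contract
    # records what that handler expects to receive.
    result = handle_user_message(expected)
    assert result == expected

The Kafka transport drops out of the consumer test entirely; the broker-specific wiring only matters when the provider side of the contract is verified.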

Python Proton sending binary data to Active MQ

I am trying to write a simple string message to an ActiveMQ queue:
from proton import Message
from proton.utils import BlockingConnection

def write_to_amq(message, host_name, port, queue):
    conn = BlockingConnection(f'{host_name}:{port}')
    sender = conn.create_sender(queue)
    sender.send(Message(body='message'))
    conn.close()
The message gets to the queue just fine, but it appears to have some binary data in it when I view it on the ActiveMQ web UI. It reports the contents as SpESsESw�message, but I was expecting the contents to just be message.
[Additional data point]
I am also seeing this in a separate Go program I have written using the pack.ag/amqp package.
func (s *amqpSender) SendResult(data string) error {
    session, err := s.client.NewSession()
    if err != nil {
        return fmt.Errorf("failure creating AMQP session: %s", err)
    }

    ctx := context.Background()
    sender, err := session.NewSender(
        amqp.LinkTargetAddress(s.workQueueName),
    )
    if err != nil {
        return fmt.Errorf("failure creating sender link: %s", err)
    }

    ctx, cancel := context.WithTimeout(ctx, s.timeout)
    defer func() {
        cancel()
        sender.Close(ctx)
    }()

    err = sender.Send(ctx, amqp.NewMessage([]byte(data)))
    if err != nil {
        return fmt.Errorf("failure sending message: %s", err)
    }
    return nil
}
When I send a different message to ActiveMQ, I get similar behavior, seeing Su�vMy message in the ActiveMQ Message Details. Could this just be a web UI anomaly?
It is not a web anomaly: if you receive the message over the OpenWire protocol, you'll see the same thing as you see on the web page. It seems that ActiveMQ encodes the message properties inside the message payload, which is why those weird characters appear at the start. My hypothesis is that ActiveMQ encodes the properties you see at the top right of the web page at the start of the body.
Furthermore, if you were to send the message with a text-based protocol like OpenWire or STOMP, you wouldn't see any properties or weird bytes at the start of the body.
There are three potential solutions to this issue:
If you want to keep using ActiveMQ:
As said here, you can add
<transportConnectors>
    <transportConnector name="amqp" uri="amqp://localhost:5672?transport.transformer=jms"/>
</transportConnectors>
inside the ActiveMQ configuration so that AMQP messages are transformed into JMS TextMessages. However, if you're using Amazon MQ, AWS's managed ActiveMQ service, this configuration is currently not available, so please open a ticket so they prioritize it.
Alternatively, use a text-based protocol such as OpenWire or STOMP instead of AMQP; a minimal sketch of the STOMP option follows below.
If you don't mind using another broker than ActiveMQ:
Consider switching to RabbitMQ, where AMQP is a first-class citizen.
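For the STOMP route, here is a minimal sketch using the stomp.py client; the host, port, credentials, and queue name are all placeholders.

import stomp

# Placeholder connection details for an ActiveMQ broker with the STOMP
# connector enabled (default port 61613).
conn = stomp.Connection([('localhost', 61613)])
conn.connect('admin', 'admin', wait=True)

# Sent over STOMP, the body arrives as plain text, so the web UI shows
# just 'message' without the AMQP property bytes at the front.
conn.send(destination='/queue/work', body='message')

conn.disconnect()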
I know the question is over a year old, but I hope this will still be helpful!
You need to tell the Python binding that you want to encode the body as a String value by adding the unicode prefix to your string, so that the binding knows what to do with the data you are encoding. The way you are currently handing off the body results in a binary encoding instead, so the broker shows you the garbage data on the console because it views this as a BytesMessage instead of a TextMessage.
It should work if you did something like the following:
sender.send(Message(body=u"message"))
More on python string encoding and decoding here.

Send messages to Twilio Flex

Just to give you an idea of what we are trying to do: we have a Python system that works like a chatbot, answering some questions automatically, but at some point we need to hand the contact over to the service one by one, and we need to use Python to open the conversation in Flex.
I tried many ways to send messages to a chat on Flex, but as I said, we are only able to create a chat, not to send messages into it; or rather, we are able to send the messages, but they don't appear in Flex's chat interface (which is strange). I tried 1) the normal way via the API, 2) creating a Runtime function (same result as via the API: the chat is created but no messages appear), 3) applying Twilio Sync, and 4) using Twilio Proxy directly.
I can get the SID for every object created and see the chat on the Flex interface, but I can't see the messages sent to this chat in Flex's UI.
event = {
    'from': 'phone_number_from',
    'to': 'flex_phone_number',
    'body': 'Testing'
}

attrs = get_channel_attrs(event.get('from'))
channel = get_or_create_chat_channel(event.get('from'), event.get('to'), attrs)
print('Channel created: %s' % channel.sid)

task = get_or_create_ongoing_tasks(event.get('from'), channel.sid)
print('Task created: %s' % task.sid)

message = send_message(channel=channel, from_=event.get('from'), body=event.get('body'))
print('Message created: %s' % message.sid)
-----------------------
Channel created: CH99b4831f********************
Task created: WTe8eee516********************
Message created: IM08884be42********************
You may need to set additional attributes for the chat task to appear in Flex. Here are examples of TaskRouter attributes for a task created through web chat and one created through SMS. Both of these tasks appear in Flex.
Web chat:
{"channelSid":"CHc7221e1c8ac04b4d9f45xxxxxxxxxxxx","name":"Jane","channelType":"web"}
SMS:
{"channelSid":"CH86818963afed4d769fb3xxxxxxxxxxxx","endpoint":"sms","identity":"+15555555555","name":"+15555555555","title":"SMS request"}
Aaron, I am having a similar issue and was trying to have a look at the Flex Create Chat function, but it is not there. When I click "create a function" and filter by the Flex product, only a blank function template is provided, which generates this function code:
exports.handler = function(context, event, callback) {
    let twiml = new Twilio.twiml.VoiceResponse();
    // twiml.say("Hello World");
    callback(null, twiml);
};

Is there a way to broadcast a message to all (or filtered) WebSocket clients connected to a WebSocket server? [duplicate]

I'm assuming this isn't possible, but wanted to ask in case it is. I want to provide a status information web page and use WebSockets to push the data from the server to the browser. But my concern is the effect a large number of browsers will have on the server. Can I broadcast to all clients rather than send a discrete message to each client?
WebSockets uses TCP, which is point to point, and provides no broadcast support.
Not sure what your client/server setup is, but you can always keep a collection of all connected clients on the server and then iterate over each one and send the message.
A simple example using Node's Websocket library:
Server code
var WebSocketServer = require('websocket').server;

// Assumes an existing Node HTTP server instance named `server`.
var clients = [];
var socket = new WebSocketServer({
    httpServer: server,
    autoAcceptConnections: false
});

socket.on('request', function(request) {
    var connection = request.accept('any-protocol', request.origin);
    clients.push(connection);

    connection.on('message', function(message) {
        // Broadcast the message to all the clients.
        clients.forEach(function(client) {
            client.send(message.utf8Data);
        });
    });
});
As noted in other answers, WebSockets don't support multicast, but it looks like the 'ws' module maintains a list of connected clients for you, so it's pretty easy to iterate through them. From the docs:
const WebSocketServer = require('ws').Server;
const wss = new WebSocketServer({ port: 8080 });

wss.broadcast = function(data) {
    wss.clients.forEach(client => client.send(data));
};
Yes, it is possible to broadcast messages to multiple clients.
In Java,
@OnMessage
public void onMessage(String m, Session s) throws IOException {
    for (Session session : s.getOpenSessions()) {
        session.getBasicRemote().sendText(m);
    }
}
and it is explained here:
https://blogs.oracle.com/PavelBucek/entry/optimized_websocket_broadcast.
It depends on the server side, really. Here's an example of how it's done using Tomcat 7:
Tomcat 7 Chat Websockets Servlet Example
and an explanation of how it's constructed here.
Yes, you can, and there are many socket servers out there, written in various scripting languages, that do exactly this.
The Microsoft.Web.WebSockets namespace has a WebSocketCollection with broadcast capability. Look for the assembly on NuGet; the package name is Microsoft.WebSockets.

zeromq REP node will only get the first message (REQ/REP pattern)

I have a Node.js app which listens for messages from clients (Python apps).
The pattern I used for communication over ZeroMQ is the REQ/REP pattern.
The main app should get messages from many clients. It will not reply to them, just receive messages.
The problem is that the main app only gets the first message; the following messages never show up in the Node.js app's console.
In other words, every time I start the Node.js app I only get one message.
Here is my code:
Node.js app
var zmq = require('zmq');

var responder = zmq.socket('rep');

responder.on('message', function(request) {
    console.log(request);
    // Here, it seems this function is only called once!
});

responder.bind('tcp://127.0.0.1:8000', function(err) {
    if (err) {
        console.log(err);
    } else {
        console.log('Listening on 8000...');
    }
});
Python (client) part:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://127.0.0.1:8000")
socket.send('blaaaa')
print 'message sent!'
The Python part is inside a function. I can see the output of "message sent!" in the Python console (many 'message sent!' lines, in fact),
but I cannot see the messages in the Node.js app; only the first message shows up in the Node.js console.
When using the REQ/REP pattern you actually need to respond to a request before you are given the next one - you will only handle one request at a time.
var responder = zmq.socket('rep');

responder.on('message', function(request) {
    console.log(request);
    responder.send('Here comes the reply!');
});
Respond, and you will receive the next one. If you do not wish to respond, then you need to choose a socket pair other than REQ/REP - for example PUSH/PULL, or look at XREQ/XREP (router/dealer) if you wish to handle multiple requests at the same time.
If in doubt, look up the send/receive pattern for each socket type at http://api.zeromq.org/2-1:zmq-socket
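To illustrate the PUSH/PULL suggestion, here is a minimal pyzmq sketch; the port matches the placeholder used above, and in your setup the PULL side would live in the Node.js app rather than in Python.

import zmq

context = zmq.Context()

# Collector side: plays the role of the main app, which only receives.
pull_socket = context.socket(zmq.PULL)
pull_socket.bind("tcp://127.0.0.1:8000")

# Client side: replaces the REQ socket; no reply is expected.
push_socket = context.socket(zmq.PUSH)
push_socket.connect("tcp://127.0.0.1:8000")

for i in range(3):
    push_socket.send_string('blaaaa %d' % i)
    print('message sent!')

# All three messages arrive, with no reply required in between.
for _ in range(3):
    print(pull_socket.recv_string())

In the Node.js app the equivalent change would be creating the socket with zmq.socket('pull') and dropping any reply from the message handler.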
