proto3 grpc clients: can they all omit sending message fields? - python

Given the following proto definition, can all autogenerated gRPC clients produced by Google's code generation omit sending fields in PagingInfo?
In the Python client, I can omit sending such a field with code like:
request = SearchRequest(paging_info=dict(a=OptionalField(d='d', e='e')), query='blah')
grpc proto definition:
syntax = "proto3";
message OptionalField {
string d = 1;
string e = 2;
}
message PagingInfo {
OptionalField a = 1;
OptionalField b = 2;
OptionalField c = 3;
}
message SearchRequest {
string query = 1;
PagingInfo paging_info = 2;
}
message SearchResponse {
string a = 1;
}
service SearchService {
rpc Search (SearchRequest) returns (SearchResponse);
}

In proto3, all singular fields are optional (in the proto2 sense), so yes: any compliant implementation should be able to send a message that omits that field.
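For example, in Python (a minimal sketch; the module name search_pb2 is an assumption based on protoc's usual naming):

import search_pb2  # assumed name of the module protoc generates from the proto above

# paging_info is never set, so it is omitted from the serialized message.
request = search_pb2.SearchRequest(query='blah')
assert not request.HasField('paging_info')

# Setting only some of PagingInfo's sub-fields works the same way: b and c
# are simply absent on the wire.
request = search_pb2.SearchRequest(
    query='blah',
    paging_info=search_pb2.PagingInfo(a=search_pb2.OptionalField(d='d', e='e')),
)
assert request.HasField('paging_info')
assert not request.paging_info.HasField('b')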

Related

How to parse protobuf in python, byte data

Hello, I receive binary data encoded with protobuf and would like to decode the message.
My proto file:
syntax = "proto3";
message Server {
oneof msg {
Values values = 3;
Ack ack = 5;
}
int64 ack_id = 8;
}
message Ack {
int64 ack_id = 1;
bool is_ok = 2;
string error_msg = 3;
}
message Values {
message Value {
int64 timestamp = 1;
int64 id = 2;
oneof value {
string string_value = 3;
double double_value = 4;
sint32 int_value = 5;
}
}
repeated Value values = 1;
}
I compiled my proto file using protoc. Here is my example Python code to decode this proto message:
import server_pb2
values = server_pb2.Values.Value()
values.ParseFromString(data)
print(values.timestamp)
print(values.id)
print(values.double_value)
but the returned values are always zero:
0
0
0.0
example byte data input:
b'\x1a\x17\n\x15\x08\xc0\xd6\xb2\x9f\x06\x10\xaa\xed\xe3\xe4\n!\xe9\x0b!\xe7\xfd\xff\xef?@\x1a'
hex:
1a170a1508c0d6b29f0610aaede3e40a21e90b21e7fdffef3f401a
base64:
GhcKFQjA1rKfBhCq7ePkCiHpCyHn/f/vP0Aa
I don't know if this proto data is correct.
Your bytes are a serialized Server message, not a bare Values.Value, so parse them with the top-level type. Try:
import server_pb2
data = b'\x1a\x17\n\x15\x08\xc0\xd6\xb2\x9f\x06\x10\xaa\xed\xe3\xe4\n!\xe9\x0b!\xe7\xfd\xff\xef?@\x1a'
server = server_pb2.Server()
server.ParseFromString(data)
print(server)
Yields:
values {
  values {
    timestamp: 1676454720
    id: 2895705770
    double_value: 0.999999
  }
}
ack_id: 26
Update, responding to a comment. Continuing the above code:
for value in server.values.values:
    print(value.timestamp)
    print(value.id)
    print(value.double_value)
Yields:
1676454720
2895705770
0.999999
The proto reuses the field name values: Server has a field values of type Values, and Values contains a repeated field values of type Value. So the Python code reads redundantly:
server_values = server.values
values_values = server.values.values  # or server_values.values
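If you need to handle both arms of the Server oneof, WhichOneof tells you which one is set; a short sketch using the generated server_pb2 module:

import server_pb2

server = server_pb2.Server()
server.ParseFromString(data)

msg_case = server.WhichOneof('msg')  # 'values', 'ack', or None
if msg_case == 'values':
    for value in server.values.values:
        # Each Value also has a oneof; find out which arm carries the payload.
        value_case = value.WhichOneof('value')  # e.g. 'double_value'
        print(value.timestamp, value.id, getattr(value, value_case))
elif msg_case == 'ack':
    print(server.ack.ack_id, server.ack.is_ok)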

Return the enum key when decoding Protobuf in Python

Currently, I am decoding protobuf messages in Python where the output is:
{
"lat": 12.345678,
"lng": -12.345678,
"firmware_status": 3
}
In my case, 3 corresponds to FINISHED per the enum definition in the .proto file. (Note: I'm using protobuf v3.)
enum FirmwareStatus {
  UNKNOWN = 0;
  STARTED = 1;
  IN_PROGRESS = 2;
  FINISHED = 3;
  CANCELED = 4;
  RESTARTED = 5;
}
How would I pull the enum "key" or definition from protobuf so that my output would be:
{
"lat": 12.345678,
"lng": -12.345678,
"firmware_status": "FINISHED"
}
I couldn't find any functions in the protobuf python library to do this easily, but perhaps I missed something.
Currently, this is my decode function:
def decode_protobuf(uplink):
    """Decode a base64 encoded protobuf message.

    Parameters:
        uplink (str): base64 encoded protobuf message

    Returns:
        output (dict): decoded protobuf message in dictionary format
    """
    protobuf = proto.Uplink()
    decode_base64 = base64.b64decode(uplink)
    protobuf.ParseFromString(decode_base64)
    output = {}
    elems = protobuf.ListFields()
    for elem in elems:
        output[elem[0].name] = elem[1]
    return output
You can use json_format.MessageToDict, which renders enum values as their names by default (pass use_integers_for_enums=True to keep them as numbers):
https://github.com/protocolbuffers/protobuf/blob/main/python/google/protobuf/json_format.py#L134
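A minimal sketch of the decode function rewritten with MessageToDict (same proto.Uplink as in the question; preserving_proto_field_name keeps snake_case keys like firmware_status):

import base64

from google.protobuf import json_format

def decode_protobuf(uplink):
    """Decode a base64 encoded protobuf message into a dict."""
    protobuf = proto.Uplink()
    protobuf.ParseFromString(base64.b64decode(uplink))
    # Enum fields come back as their names (e.g. "FINISHED") unless
    # use_integers_for_enums=True is passed.
    return json_format.MessageToDict(protobuf, preserving_proto_field_name=True)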

converting python dict to protobuf

How can I convert the following dict to protobuf? I have to send a protobuf message as the payload to an MQTT broker.
I'm using Python 3.8.
publish_msg = {
    "token": "xxxxxxxx",
    "parms": {
        "fPort": 8,
        "data": b"MDQzYzAwMDE=",
        "confirmed": False,
        "devEUI": "8CF9572000023509"
    }
}
My protobuf is defined as follows:
syntax = "proto3";
package publish;
message DLParams{
string DevEUI = 1;
int32 FPort = 2;
bytes Data = 3;
bool Confirm = 4;
}
message DeviceDownlink {
string Token = 1;
DLParams Params = 2;
}
How about this?
import json

from google.protobuf import json_format

import DeviceDownlink_pb2  # the module protoc generates from the proto above

# Create an empty message
dl = DeviceDownlink_pb2.DeviceDownlink()

# Get the JSON string for your dict
json_string = json.dumps(publish_msg)

# Put the JSON contents into your message
json_format.Parse(json_string, dl)
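Two caveats with the JSON route: json.dumps will raise a TypeError on the bytes value in the dict (it would need to be a str), and json_format.Parse matches keys against the proto field names or their JSON names, so lowercase "token"/"parms" will likely not match Token/Params as defined above. A sketch that sidesteps both by building the message directly (publish_pb2 is an assumed name for the generated module):

import publish_pb2

params = publish_msg["parms"]
dl = publish_pb2.DeviceDownlink(
    Token=publish_msg["token"],
    Params=publish_pb2.DLParams(
        DevEUI=params["devEUI"],
        FPort=params["fPort"],
        Data=params["data"],       # already bytes, as the Data field expects
        Confirm=params["confirmed"],
    ),
)
payload = dl.SerializeToString()  # bytes payload for the MQTT publish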

How to use kafka.consumer.SimpleConsumer.seek()

The API doc is here: http://kafka-python.readthedocs.org/en/latest/apidoc/kafka.consumer.html
But when I run the following code, I get the exception "%d format: a number is required, not NoneType":
from kafka import KafkaClient, SimpleConsumer

client = KafkaClient("localhost:9092")
consumer = SimpleConsumer(client, "test-group", "test")
consumer.seek(0, whence=None)  # also tried (0, 2) and (0, 0)
run = True
while run:
    try:
        message = consumer.get_message(block=False, timeout=4000)
    except Exception as e:
        print "Exception while trying to read msg:", str(e)
When I use the following line instead, the exception is "seek() got an unexpected keyword argument 'partition'":
consumer.seek(0, whence=None, partition=None)  # also tried (0, 2) and (0, 0)
Any idea? Thanks.
In Kafka: The Definitive Guide there is sample code for seek(), written in Java (not Python, but it should convey the general idea).
public class SaveOffsetsOnRebalance implements ConsumerRebalanceListener {
    public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
        commitDBTransaction();
    }

    public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
        for (TopicPartition partition : partitions)
            consumer.seek(partition, getOffsetFromDB(partition));
    }
}
// Note: the book's listing shows an extra closing brace here; it is dropped so the class compiles.
consumer.subscribe(topics, new SaveOffsetsOnRebalance(consumer));
consumer.poll(0);
for (TopicPartition partition : consumer.assignment())
    consumer.seek(partition, getOffsetFromDB(partition));

while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100);
    for (ConsumerRecord<String, String> record : records) {
        processRecord(record);
        storeRecordInDB(record);
        storeOffsetInDB(record.topic(), record.partition(), record.offset());
    }
    commitDBTransaction();
}
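Back in Python: a sketch of the old kafka-python SimpleConsumer usage, under the assumption (based on that API's docs) that whence must be an int — 0 for relative to the earliest offset, 1 for relative to the current offset, 2 for relative to the tail — which is why whence=None blows up inside a %d format string:

from kafka import KafkaClient, SimpleConsumer

client = KafkaClient("localhost:9092")
consumer = SimpleConsumer(client, "test-group", "test")

# Rewind to the beginning of every partition the consumer owns.
consumer.seek(0, 0)
# Or jump to the tail of the log instead:
# consumer.seek(0, 2)

message = consumer.get_message(block=False, timeout=4000)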

Receiving AWS SQS Message and calling function based off received message with Node.js

I am using a Node.js server to talk to a Python DAO via Amazon SQS. I can send an SQS message to the DAO, but I have no idea how to send something back from the Python DAO and listen for it properly on the Node.js server. I'm also curious how to call another function based on the SQS message the Node.js server receives from the Python DAO. Any help would be greatly appreciated! Here's my code so far:
Node.js listening for SQS code:
app.get('/readDAOSQSMessage', function(req, res) {
    var params = {
        QueueUrl: DAO_QUEUE_URL,
        MaxNumberOfMessages: 1, // how many messages do we wanna retrieve?
        VisibilityTimeout: 60, // seconds - how long we want a lock on this job
        WaitTimeSeconds: 3 // seconds - how long should we wait for a message?
    };
    sqs.receiveMessage(params, function(err, data) {
        if (data.Messages) {
            var message = data.Messages[0],
                body = JSON.parse(message.Body);
            res.send("Username: " + body.username + " Password: " + body.password);
            removeFromQueue(message, DAO_QUEUE_URL);
        }
    });
});
Python sending SQS code:
queue = conn.get_queue('DAO-Queue')
writeQueue = conn.get_queue('DaoToServerQueue')
message = RawMessage()
message.set_body("This is my first message.")
writeQueue.write(message)
while True:
    rs = queue.get_messages()
    if len(rs) > 0:
        m = rs[0]
        print("Message Body: " + m.get_body())
        message = m.get_body()
        #unicodedata.normalize('NFKD',message).encode('ascii','ignore')
        #command = 'dao.'+message[1:-1]
        message = message.encode('ascii','ignore')
        print "message: "+message[1:-1]
        print "backslash: "+'\\'
        replace_str = '\\'+'"'
        print replace_str
        print message.find(replace_str)
        print "\nmessage type: "+str(type(message))
        message = message.replace(replace_str,'"')
        print "\nnew message: "+message
        command = 'dao.'+message[1:-1]
        print "\n command: "+command
        #unicodedata.normalize('NFDK',message).encode('ascii','ignore')
        #command = 'dao.'+ message
        #unicodedata.normalize('NFDK',command).encode('ascii','ignore')
        eval(command)
        print("Command: " + command)
        queue.delete_message(m)
Normally you want to listen to messages on the receiving side by long-polling.
function readMessages() {
    async.parallel([
        pollMessage,
    ], function(err, callbacks) {
        readMessages();
    });
}

function pollMessage(callback) {
    sqs.receiveMessage({
        QueueUrl: receiveQueue,
        WaitTimeSeconds: 5,
        MaxNumberOfMessages: 1,
        // VisibilityTimeout: 20
    }, function(err, data) {
        if (err) console.log(err);
        if (!data) {
            callback();
            return;
        }
        if (data.Messages) {
            // Loop over the messages (there should be only one, since
            // MaxNumberOfMessages is 1 above)
            for (var i = 0; i < data.Messages.length; i++) {
                var message = data.Messages[i];
                // Clean up after yourself: delete this message from the
                // queue so it's not delivered again
                removeFromQueue(message, callback);
                // Now do whatever you want with the message
                handleIncomeMessage(message);
            }
        } else {
            callback();
        }
    });
}
This code would run on your Node.js server (it could equally be deployed as a Lambda function).
You can decide what to do with a message by attaching a message type to the payload. Notice the handleIncomeMessage() function above, which can be something like:
function handleIncomeMessage(data) {
    var msg = JSON.parse(data.Body);
    var type = msg.type;
    console.log("Got a new sqs message: ", msg.type);
    switch (type) {
        case 'someCoolType': {
            Manager.proceedWithCoolType(msg.restOfPayload);
            break;
        }
    }
}
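On the Python side, the reply just needs to carry that type field. A sketch using boto3 rather than the legacy boto shown in the question (queue name and payload values are illustrative):

import json

import boto3

sqs = boto3.resource('sqs')
queue = sqs.get_queue_by_name(QueueName='DaoToServerQueue')

# The 'type' key is what handleIncomeMessage() switches on in Node.
queue.send_message(MessageBody=json.dumps({
    'type': 'someCoolType',
    'restOfPayload': {'username': 'alice', 'password': 'secret'},
}))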
