Python gRPC - reading in all messages before sending responses

I'm trying to understand whether a gRPC server using streams can wait for all client messages to be read in before sending its responses.
I have a trivial application where I send in several numbers I'd like to add and return.
I've set up a basic proto file to test this:
syntax = "proto3";

message CalculateRequest {
    int64 x = 1;
    int64 y = 2;
}

message CalculateReply {
    int64 result = 1;
}

service Svc {
    rpc CalculateStream (stream CalculateRequest) returns (stream CalculateReply);
}
On the server side I have implemented the following code, which returns each answer message as the corresponding request is received:
import logging
from concurrent import futures

import grpc

import contracts_pb2
import contracts_pb2_grpc


class CalculatorServicer(contracts_pb2_grpc.SvcServicer):
    def CalculateStream(self, request_iterator, context):
        for request in request_iterator:
            resultToOutput = request.x + request.y
            yield contracts_pb2.CalculateReply(result=resultToOutput)


def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
    contracts_pb2_grpc.add_SvcServicer_to_server(CalculatorServicer(), server)
    server.add_insecure_port('localhost:9000')
    server.start()
    server.wait_for_termination()


if __name__ == '__main__':
    print("We're up")
    logging.basicConfig()
    serve()
I'd like to tweak this to first read in all the numbers and then send these out at a later stage - something like the following:
class CalculatorServicer(contracts_pb2_grpc.SvcServicer):
    def CalculateStream(self, request_iterator, context):
        listToReturn = []
        for request in request_iterator:
            listToReturn.append(request.x + request.y)
        # ...
        # do some other stuff first before returning
        for item in listToReturn:
            yield contracts_pb2.CalculateReply(result=item)
Currently, this write-out-later implementation doesn't work: the code at the bottom is never reached. Is it by design that the connection seems to "close" before execution reaches that point?
The grpc.io website suggests that this should be possible with BiDirectional streaming:
for example, the server could wait to receive all the client messages before writing its responses, or it could alternately read a message then write a message, or some other combination of reads and writes.
Thanks in advance for any help :)

The issue here is the definition of "all client messages." At the transport level, the server has no way of knowing whether the client has finished independent of the client closing its connection.
You need to add some indication of the client having finished sending requests to the protocol itself: either add a bool field to the existing CalculateRequest, or add a top-level oneof with one of the options being something like a StopSendingRequests message.
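As an illustration, here is a minimal sketch of a buffering servicer along those lines. It assumes the proto has been extended with a hypothetical done flag on CalculateRequest; the field name and the buffering logic are assumptions, not part of the original question or answer:
# Hypothetical proto extension (not in the original .proto):
#   message CalculateRequest {
#       int64 x = 1;
#       int64 y = 2;
#       bool done = 3;  // client sets this on its final message
#   }
class CalculatorServicer(contracts_pb2_grpc.SvcServicer):
    def CalculateStream(self, request_iterator, context):
        results = []
        for request in request_iterator:
            if request.done:
                break  # the client signalled the end of its requests
            results.append(request.x + request.y)
        # all requests are in; now stream back the buffered replies
        for value in results:
            yield contracts_pb2.CalculateReply(result=value)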

Related

How to Compile a While Loop statement in PySpark on Apache Spark with Databricks

I'm trying to send data to my Data Lake with a while loop.
Basically, the intention is to loop continually and send data to my Data Lake whenever data is received from my Azure Service Bus, using the following code.
This code receives a message from my Service Bus:
import json

from azure.servicebus import ServiceBusClient


def myfunc():
    with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
        # max_wait_time specifies how long the receiver should wait with no incoming
        # messages before stopping receipt. Default is None (receive forever).
        with client.get_queue_receiver(QUEUE_NAME, session_id=session_id, max_wait_time=5) as receiver:
            for msg in receiver:
                # print("Received: " + str(msg))
                themsg = json.loads(str(msg))
                # complete the message so that the message is removed from the queue
                receiver.complete_message(msg)
                return themsg
This code assigns the message to a variable:
result = myfunc()
The following code sends the message to my data lake:
rdd = sc.parallelize([json.dumps(result)])
spark.read.json(rdd) \
.write.mode("overwrite").json('/mnt/lake/RAW/FormulaClassification/F1Area/')
I would like help looping through the code to continually check for messages and send the results to my data lake.
I believe the solution is accomplished with a while loop, but I'm not sure.
Just because you're using Spark doesn't mean you cannot loop.
First of all, you're only returning the first message from your receiver, so the code should look like this:
with client.get_queue_receiver(QUEUE_NAME, session_id=session_id, max_wait_time=5) as receiver:
    msg = next(receiver)
    # print("Received: " + str(msg))
    themsg = json.loads(str(msg))
    # complete the message so that the message is removed from the queue
    receiver.complete_message(msg)
    return themsg
To answer your question:
while True:
    result = json.dumps(myfunc())
    rdd = sc.parallelize([result])
    # you could use rdd.toDF() here instead of spark.read.json(rdd)
    spark.read.json(rdd) \
        .write.mode("overwrite").json('/mnt/lake/RAW/FormulaClassification/F1Area/')
Keep in mind that the output file names aren't consistent and you might not want them to be overwritten
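For example, a minimal sketch that gives each batch its own timestamped directory instead of overwriting; the path layout is an assumption, and sc, spark and myfunc come from the question:
import time

while True:
    result = myfunc()
    if result is None:
        continue  # nothing was received within max_wait_time
    rdd = sc.parallelize([json.dumps(result)])
    # one sub-directory per batch so earlier output is never overwritten
    path = '/mnt/lake/RAW/FormulaClassification/F1Area/batch_%d' % int(time.time())
    spark.read.json(rdd).write.mode("error").json(path)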
Alternatively, you could look into writing your own Source / SparkDataStream class that defines a Spark SQL source, so that you don't need a loop in your main method and the polling is handled natively by Spark.

KDB/Q Websocket example, getting `badmsg error: How to serialize my data when sending it over to my KDB server?

Lately, I've been trying to stream data from KDB to python.
I'm now using websockets and have gone through the doc https://code.kx.com/q/wp/kdb_and_websockets.pdf
On the python side, I've been trying ws4py, autobahn and websocket-client.
All work fine; essentially, my problem lies in the format of the message sent to the server to subscribe to the feed.
A little (open source) example:
class DummyClient(WebSocketClient):
    def opened(self):
        self.send(*what should I put here?*, binary=True)

    def closed(self, code, reason=None):
        print("Closed down", code, reason)

    def received_message(self, m):
        # not yet implemented
        pass

if __name__ == '__main__':
    try:
        ws = DummyClient('ws://host:port/', protocols=['http-only', 'chat'])
        ws.connect()
        ws.run_forever()
    except KeyboardInterrupt:
        ws.close()
When the connection opens, I'm supposed to subscribe to the feed by calling the server function loadPage.
I've tried different ways of encoding the list containing the function name and the argument, without success.
What I've tried:
np.array("['loadPage',[]]").tobytes()
"['loadPage',[]]".encode('utf8')
json formatting
hexadecimal formatting
Any help would be much appreciated!
Best,
Yael
I think what you'll need to do in this case is define .z.ws in your kdb server. This function will be called with whatever you pass over the websocket as an argument, so for example if you defined .z.ws like so:
.z.ws:{show x}
and then send a message over the WebSocket like so:
var ws = new WebSocket("ws://localhost:1234")
ws.send("Hello World")
This will be output in your kdb session, e.g.:
λ q
KDB+ 3.6 2018.06.14 Copyright (C) 1993-2018 Kx Systems
w64/ 4(16)core 8082MB jonat laptop-o8a8co1o 10.255.252.249 EXPIRE 2019.05.21 jonathon.mcmurray@aquaq.co.uk KOD #4160315
q).z.ws:{show x}
q)\p 1234
q)"Hello World"
So in your case if you want to call loadPage, it might be as simple as defining .z.ws like so:
.z.ws:loadPage
Then whatever you pass over the socket will be passed into loadPage (note this will not update if you change loadPage elsewhere; you can instead use .z.ws:{loadPage x} if you need to be able to update it dynamically). For a niladic function, this will just ignore whatever is passed in.
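For example, a minimal sketch of the Python side using the websocket-client package, assuming .z.ws:loadPage is defined on the server as above and the server listens on port 1234 (host, port and the payload string are placeholders):
import websocket

# connect to the kdb+ server's websocket port
ws = websocket.create_connection("ws://localhost:1234/")
# whatever text is sent here arrives in kdb as the argument to .z.ws,
# i.e. it is passed straight into loadPage as a character vector
ws.send("some argument")
ws.close()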

ROS message sent but not received

I'm using ROS in my project and I need to send a message from time to time. I have this function:
void RosNetwork::sendMessage(string msg, string channel) {
    _mtx.lock();
    ros::Publisher chatter_pub = _n.advertise<std_msgs::String>(channel.c_str(), 10);
    ros::Rate loop_rate(10);
    std_msgs::String msgToSend;
    msgToSend.data = msg.c_str();
    chatter_pub.publish(msgToSend);
    loop_rate.sleep();
    cout << "Message Sent" << endl;
    _mtx.unlock();
}
And I have this in python:
def callbackFirst(data):
    # rospy.loginfo(rospy.get_caller_id() + "I heard %s", data.data)
    print("Received message from first filter")

def callbackSecond(data):
    # rospy.loginfo(rospy.get_caller_id() + "I heard %s", data.data)
    print("Received message from second filter")

def listener():
    rospy.Subscriber("FirstTaskFilter", String, callbackFirst)
    print("subscribed to FirstTaskFilter")
    rospy.Subscriber("SecondTaskFilter", String, callbackSecond)
    print("subscribed to SecondTaskFilter")
    rospy.spin()
The listener runs in a Python thread.
I get to the function sendMessage (I see "Message Sent" in the terminal a lot of times), but the Python script never receives the message.
Update: I tested the Python callback with rostopic pub /FirstTaskFilter std_msgs/String "test" and this works perfectly.
Any thoughts?
You are re-advertising the publisher every time and then immediately using it to publish something.
This is problematic because subscribers need some time to subscribe to newly emerging publishers. If you publish messages before a subscriber has finished doing so, those messages will not arrive.
To avoid this problem, do not advertise a new publisher every time; do it only once in the constructor of your class and store the publisher in a member variable. Your code could look something like this:
RosNetwork() {
    _chatter_pub = _n.advertise<std_msgs::String>(channel.c_str(), 10);
    ros::Duration(1).sleep(); // optional, to make sure no message gets lost
}

void RosNetwork::sendMessage(string msg, string channel) {
    ...
    _chatter_pub.publish(msgToSend);
    ...
}
The one-second sleep after advertise makes sure that all existing subscribers can subscribe before you start publishing messages. It is only necessary if it is important that not a single message gets lost; in most practical cases it can be omitted.
The proper solution to your problem is to use pub.getNumSubscribers() and wait until it is > 0. Then publish.
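A minimal sketch of that wait, reusing the member publisher from the constructor above (the 100 Hz poll rate is an arbitrary choice):
// poll until the Python subscriber has actually connected, then publish
ros::Rate poll_rate(100);
while (_chatter_pub.getNumSubscribers() == 0 && ros::ok()) {
    poll_rate.sleep();
}
_chatter_pub.publish(msgToSend);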

How to limit Autobahn python subscriptions on a per session basis

I am using autobahnpython with twisted (wamp) on the server side and autobahnjs in the browser. Is there a straightforward way to allow/restrict subscriptions on a per-session basis? For example, a client should not be able to subscribe to topics relevant to other users.
While I am NOT using crossbar.io, I tried the Python code shown in the 'Example' section at the end of this page http://crossbar.io/docs/Authorization/ where an RPC call is first used to give authorization to a client. Of course, I am using my own authorization logic. Once this authorization succeeds, I'd like to give the client privileges to subscribe only to topics related to this client, like 'com.example.user_id'. My issue is that even if auth passes, I have not found a way to limit subscription requests in the ApplicationSession class, which is where the authorization takes place. How can I prevent a client who authorizes with user_id=user_a from subscribing to 'com.example.user_b'?
You can authorize by creating your own router. To do that, subclass Router() and override (at a minimum) the authorize() method:
def authorize(self, session, uri, action):
    return True
This method is pretty simple: if you return True, the session is authorized to do whatever it is attempting. You could make a rule that all subscriptions must start with 'com.example.USER_ID'; your Python code would then split the uri, take the third field, and compare it to the current session's user id, returning True if they match and False otherwise (a minimal sketch of this rule appears at the end of this answer). This is where things get a little weird, though. I have code that does a similar thing; here is my authorize() method:
@inlineCallbacks
def authorize(self, session, uri, action):
    authid = session._authid
    if authid is None:
        authid = 1
    log.msg("AuthorizeRouter.authorize: {} {} {} {} {}".format(authid,
        session._session_id, uri, IRouter.ACTION_TO_STRING[action], action))
    if authid != 1:
        rv = yield self.check_permission(authid, uri, IRouter.ACTION_TO_STRING[action])
    else:
        rv = yield True
    log.msg("AuthorizeRouter.authorize: rv is {}".format(rv))
    if not uri.startswith(self.svar['topic_base']):
        self.sessiondb.activity(session._session_id, uri, IRouter.ACTION_TO_STRING[action], rv)
    returnValue(rv)
Note that I dive into the session to get the _authid, which is bad karma (I think) because I should not be looking at these private variables, but I don't know where else to get it.
Also of note, this goes hand in hand with authentication. In my implementation, the _authid is the authenticated user id, which is similar to a unix user id (a positive unique integer). I am pretty sure this can be anything, like a string, so you should be OK with your 'user_b' as the _authid if you wish.
-g
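For reference, here is a minimal sketch of the per-user subscription rule described above; the URI layout and the use of session._authid are assumptions carried over from the code in this answer:
def authorize(self, session, uri, action):
    # allow only 'com.example.<user_id>...' topics owned by the caller
    parts = uri.split('.')
    if len(parts) < 3:
        return False
    return parts[2] == str(session._authid)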
I found a relatively simple solution using a Node guest. Here's the code:
// crossbar setup
var autobahn = require('autobahn');
var connection = new autobahn.Connection({
    url: 'ws://127.0.0.1:8080/ws',
    realm: 'realm1'
});

// Websocket to Scratch setup
// pull in the required node packages and assign variables for the entities
var WebSocketServer = require('websocket').server;
var http = require('http');
var ipPort = 1234; // ip port number for Scratch to use

// this connection is a crossbar connection
connection.onopen = function (session) {
    // create an http server that will be used to contain a WebSocket server
    var server = http.createServer(function (request, response) {
        // We are not processing any HTTP, so this is an empty function.
        // 'server' is a wrapper for the WebSocketServer we are going to create below.
    });

    // Create an IP listener using the http server
    server.listen(ipPort, function () {
        console.log('Webserver created and listening on port ' + ipPort);
    });

    // create the WebSocket Server and associate it with the httpServer
    var wsServer = new WebSocketServer({
        httpServer: server
    });

    // WebSocket server has been activated and a 'request' message has been received from client websocket
    wsServer.on('request', function (request) {
        // accept a connection request from Xi4S
        // myconnection is the WS connection to Scratch
        myconnection = request.accept(null, request.origin); // The server is now 'online'

        // Process each Xi4S message received
        myconnection.on('message', function (message) {
            console.log('message received: ' + message.utf8Data);
            session.publish('com.serial.data', [message.utf8Data]);
        });

        myconnection.on('close', function (myconnection) {
            console.log('Client closed connection');
            boardReset();
        });
    });
};

connection.open();

How to implement server push in Flask framework?

I am trying to build a small site with server push functionality on the Flask micro web framework, but I did not know if there is a framework that works with it directly.
I used Juggernaut, but it seems not to work with redis-py in its current version, and Juggernaut has been deprecated recently.
Does anyone have a suggestion for my case?
Have a look at Server-Sent Events (SSE). SSE is a browser API that lets you keep a socket open to your server, subscribing to a stream of updates. For more information, read Alex MacCaw's (author of Juggernaut) post on why he killed Juggernaut and why the simpler Server-Sent Events are in many cases a better tool for the job than WebSockets.
The protocol is really easy: just add the mimetype text/event-stream to your response. The browser will keep the connection open and listen for updates. An event sent from the server is a line of text starting with data: and a following newline.
data: this is a simple message
<blank line>
If you want to exchange structured data, just dump your data as json and send the json over the wire.
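For instance, a tiny helper along these lines (the function name is made up) would turn any JSON-able payload into one SSE event:
import json

def json_event(payload):
    # hypothetical helper: one SSE event per JSON payload
    return 'data: %s\n\n' % json.dumps(payload)

# json_event({'msg': 'hi'}) -> 'data: {"msg": "hi"}\n\n'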
An advantage is that you can use SSE in Flask without the need for an extra server. There is a simple chat application example on GitHub which uses Redis as a pub/sub backend.
import datetime

import flask
import redis

app = flask.Flask(__name__)
red = redis.StrictRedis()

def event_stream():
    pubsub = red.pubsub()
    pubsub.subscribe('chat')
    for message in pubsub.listen():
        print(message)
        yield 'data: %s\n\n' % message['data']

@app.route('/post', methods=['POST'])
def post():
    message = flask.request.form['message']
    user = flask.session.get('user', 'anonymous')
    now = datetime.datetime.now().replace(microsecond=0).time()
    red.publish('chat', u'[%s] %s: %s' % (now.isoformat(), user, message))

@app.route('/stream')
def stream():
    return flask.Response(event_stream(), mimetype="text/event-stream")
You do not need to use gunicorn to run the example app. Just make sure to use threading when running the app, because otherwise the SSE connection will block your development server:
if __name__ == '__main__':
    app.debug = True
    app.run(threaded=True)
On the client side you just need a JavaScript handler function which will be called when a new message is pushed from the server.
var source = new EventSource('/stream');
source.onmessage = function (event) {
    alert(event.data);
};
Server-Sent Events are supported by recent Firefox, Chrome and Safari browsers. Internet Explorer does not yet support Server-Sent Events, but is expected to support them in version 10. There are two recommended polyfills to support older browsers:
EventSource.js
jquery.eventsource
Redis is overkill: use Server-Sent Events (SSE)
Late to the party (as usual), but IMHO using Redis may be overkill.
As long as you're working in Python+Flask, consider using generator functions as described in this excellent article by Panisuan Joe Chasinga. The gist of it is:
In your client index.html:
var targetContainer = document.getElementById("target_div");
var eventSource = new EventSource("/stream");
eventSource.onmessage = function(e) {
    targetContainer.innerHTML = e.data;
};
...
<div id="target_div">Watch this space...</div>
In your Flask server:
import time

from flask import Flask, Response, render_template

app = Flask(__name__)

def get_message():
    '''this could be any function that blocks until data is ready'''
    time.sleep(1.0)
    s = time.ctime(time.time())
    return s

@app.route('/')
def root():
    return render_template('index.html')

@app.route('/stream')
def stream():
    def eventStream():
        while True:
            # wait for source data to be available, then push it
            yield 'data: {}\n\n'.format(get_message())
    return Response(eventStream(), mimetype="text/event-stream")
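As a quick check, you can watch such a stream from the command line; this assumes the Flask development server is running on localhost:5000 (the port is an assumption). The -N flag disables curl's output buffering so events appear as they arrive:
$ curl -N http://localhost:5000/stream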
As a follow-up to @peter-hoffmann's answer, I've written a Flask extension specifically to handle server-sent events. It's called Flask-SSE, and it's available on PyPI. To install it, run:
$ pip install flask-sse
You can use it like this:
from flask import Flask
from flask_sse import sse

app = Flask(__name__)
app.config["REDIS_URL"] = "redis://localhost"
app.register_blueprint(sse, url_prefix='/stream')

@app.route('/send')
def send_message():
    sse.publish({"message": "Hello!"}, type='greeting')
    return "Message sent!"
And to connect to the event stream from Javascript, it works like this:
var source = new EventSource("{{ url_for('sse.stream') }}");
source.addEventListener('greeting', function(event) {
    var data = JSON.parse(event.data);
    // do what you want with this data
}, false);
Documentation is available on ReadTheDocs. Note that you'll need a running Redis server to handle pub/sub.
As a committer of https://github.com/WolfgangFahl/pyFlaskBootstrap4 I ran into the same need and created a Flask blueprint for Server-Sent Events that has no dependency on Redis.
This solution builds on the other answers that have been given here in the past.
https://github.com/WolfgangFahl/pyFlaskBootstrap4/blob/main/fb4/sse_bp.py has the source code (see also sse_bp.py below).
There are unit tests at https://github.com/WolfgangFahl/pyFlaskBootstrap4/blob/main/tests/test_sse.py
The idea is that you can use different modes to create your SSE stream:
by providing a function
by providing a generator
by using a PubSub helper class
by using the PubSub helper class and pydispatch at the same time
As of 2021-02-12 this is alpha code which I want to share nevertheless. Please comment here or as issues in the project.
There is a demo at http://fb4demo.bitplan.com/events and a description of the example use, e.g. for a progress bar or time display, at http://wiki.bitplan.com/index.php/PyFlaskBootstrap4#Server_Sent_Events
example client javascript/html code
<div id="event_div">Watch this space...</div>
<script>
function fillContainerFromSSE(id, url) {
    var targetContainer = document.getElementById(id);
    var eventSource = new EventSource(url);
    eventSource.onmessage = function(e) {
        targetContainer.innerHTML = e.data;
    };
};
fillContainerFromSSE("event_div", "/eventfeed");
</script>
example server side code
def getTimeEvent(self):
    '''
    get the next time stamp
    '''
    time.sleep(1.0)
    s = datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S')
    return s

def eventFeed(self):
    '''
    create a Server Sent Event Feed
    '''
    sse = self.sseBluePrint
    # stream from the given function
    return sse.streamFunc(self.getTimeEvent)
sse_bp.py
'''
Created on 2021-02-06

@author: wf
'''
from flask import Blueprint, Response, request, abort, stream_with_context
from queue import Queue
from pydispatch import dispatcher
import logging


class SSE_BluePrint(object):
    '''
    a blueprint for server side events
    '''

    def __init__(self, app, name: str, template_folder: str = None, debug=False, withContext=False):
        '''
        Constructor
        '''
        self.name = name
        self.debug = debug
        self.withContext = withContext
        if template_folder is not None:
            self.template_folder = template_folder
        else:
            self.template_folder = 'templates'
        self.blueprint = Blueprint(name, __name__, template_folder=self.template_folder)
        self.app = app
        app.register_blueprint(self.blueprint)

        @self.app.route('/sse/<channel>')
        def subscribe(channel):
            def events():
                PubSub.subscribe(channel)
            self.stream(events)

    def streamSSE(self, ssegenerator):
        '''
        stream the Server Sent Events for the given SSE generator
        '''
        response = None
        if self.withContext:
            if request.headers.get('accept') == 'text/event-stream':
                response = Response(stream_with_context(ssegenerator), content_type='text/event-stream')
            else:
                response = abort(404)
        else:
            response = Response(ssegenerator, content_type='text/event-stream')
        return response

    def streamGen(self, gen):
        '''
        stream the results of the given generator
        '''
        ssegen = self.generateSSE(gen)
        return self.streamSSE(ssegen)

    def streamFunc(self, func, limit=-1):
        '''
        stream a generator based on the given function

        Args:
            func: the function to convert to a generator
            limit (int): optional limit of how often the generator should be applied, -1 for endless

        Returns:
            an SSE Response stream
        '''
        gen = self.generate(func, limit)
        return self.streamGen(gen)

    def generate(self, func, limit=-1):
        '''
        create a SSE generator from a given function

        Args:
            func: the function to convert to a generator
            limit (int): optional limit of how often the generator should be applied, -1 for endless

        Returns:
            a generator for the function
        '''
        count = 0
        while limit == -1 or count < limit:
            # wait for source data to be available, then push it
            count += 1
            result = func()
            yield result

    def generateSSE(self, gen):
        for result in gen:
            yield 'data: {}\n\n'.format(result)

    def enableDebug(self, debug: bool):
        '''
        set my debugging

        Args:
            debug(bool): True if debugging should be switched on
        '''
        self.debug = debug
        if self.debug:
            logging.basicConfig(level=logging.DEBUG,
                                format='%(asctime)s.%(msecs)03d %(levelname)s:\t%(message)s',
                                datefmt='%Y-%m-%d %H:%M:%S')

    def publish(self, message: str, channel: str = 'sse', debug=False):
        """
        Publish data as a server-sent event.

        Args:
            message(str): the message to send
            channel(str): If you want to direct different events to different
                clients, you may specify a channel for this event to go to.
                Only clients listening to the same channel will receive this event.
                Defaults to "sse".
            debug(bool): if True enable debugging
        """
        return PubSub.publish(channel=channel, message=message, debug=debug)

    def subscribe(self, channel, limit=-1, debug=False):
        def stream():
            for message in PubSub.subscribe(channel, limit, debug=debug):
                yield str(message)
        return self.streamGen(stream())


class PubSub:
    '''
    redis pubsub duck replacement
    '''
    pubSubByChannel = {}

    def __init__(self, channel: str = 'sse', maxsize: int = 15, debug=False, dispatch=False):
        '''
        Args:
            channel(string): the channel name
            maxsize(int): the maximum size of the queue
            debug(bool): whether debugging should be switched on
            dispatch(bool): if true use the pydispatch library - otherwise only a queue
        '''
        self.channel = channel
        self.queue = Queue(maxsize=maxsize)
        self.debug = debug
        self.receiveCount = 0
        self.dispatch = dispatch
        if dispatch:
            dispatcher.connect(self.receive, signal=channel, sender=dispatcher.Any)

    @staticmethod
    def reinit():
        '''
        reinitialize the pubSubByChannel dict
        '''
        PubSub.pubSubByChannel = {}

    @staticmethod
    def forChannel(channel):
        '''
        return a PubSub for the given channel

        Args:
            channel(str): the id of the channel

        Returns:
            PubSub: the PubSub for the given channel
        '''
        if channel in PubSub.pubSubByChannel:
            pubsub = PubSub.pubSubByChannel[channel]
        else:
            pubsub = PubSub(channel)
            PubSub.pubSubByChannel[channel] = pubsub
        return pubsub

    @staticmethod
    def publish(channel: str, message: str, debug=False):
        '''
        publish a message via the given channel

        Args:
            channel(str): the id of the channel to use
            message(str): the message to publish/send

        Returns:
            PubSub: the pub sub for the channel
        '''
        pubsub = PubSub.forChannel(channel)
        pubsub.debug = debug
        pubsub.send(message)
        return pubsub

    @staticmethod
    def subscribe(channel, limit=-1, debug=False):
        '''
        subscribe to the given channel

        Args:
            channel(str): the id of the channel to use
            limit(int): limit the maximum amount of messages to be received
            debug(bool): if True debugging info is printed
        '''
        pubsub = PubSub.forChannel(channel)
        pubsub.debug = debug
        return pubsub.listen(limit)

    def send(self, message):
        '''
        send the given message
        '''
        sender = object()
        if self.dispatch:
            dispatcher.send(signal=self.channel, sender=sender, msg=message)
        else:
            self.receive(sender, message)

    def receive(self, sender, message):
        '''
        receive a message
        '''
        if sender is not None:
            self.receiveCount += 1
            if self.debug:
                logging.debug("received %d:%s" % (self.receiveCount, message))
            self.queue.put(message)

    def listen(self, limit=-1):
        '''
        listen to my channel

        this is a generator for the queue content of received messages

        Args:
            limit(int): limit the maximum amount of messages to be received

        Return:
            generator: received messages to be yielded
        '''
        if limit > 0 and self.receiveCount > limit:
            return
        yield self.queue.get()

    def unsubscribe(self):
        '''
        unsubscribe me
        '''
        if self.dispatch:
            dispatcher.disconnect(self.receive, signal=self.channel)
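A minimal usage sketch of the PubSub helper above, independent of Flask; the channel name and message are arbitrary:
# publish one message on a channel, then consume it from the queue-backed subscription
pubsub = PubSub.publish(channel='demo', message='hello')
for message in PubSub.subscribe('demo', limit=1):
    print(message)  # prints: hello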
