I am writing a Python 3.5 program which handles some signals and serves this data to a small number of websocket clients.
I want the websocket server and the signal handling to happen in the same program, therefore I am using threading.
The problem is I don't know how to send data from the worker thread to the client.
The websocket server is implemented with a simple library called "websockets". The server is set up, and clients can connect and talk to the server from within the "new websocket client has connected" handler.
The server is set up with the help of an event loop:
start_server = websockets.serve(newWsHandler, host, port)
loop = asyncio.get_event_loop()
loop.run_until_complete(start_server)
loop.run_forever()
Because I want my program to do signal handling too, and loop.run_forever() is a blocking call, I create an endless worker thread before I start my server. This works as expected.
When the worker thread detects a signal change, it has to alert the connected websocket clients. But a simple client.send() does not work. Putting await in front of it does not work either (since that only works within coroutines, I think). I tried making a separate "async def" function and adding it to the event loop, but it gets a bit complicated because it's not on the same thread.
So the main question is: what is the best way to send something to a websocket client from a worker thread? I don't need to receive anything in response.
EDIT:
It will probably help if I add some mock code.
def signalHandler():
    # check signals
    ...
    if alert:
        connections[0].send("Alert")  # NEED HELP HERE

async def newWsHandler(websocket, path):
    connections.append(websocket)
    while True:
        # keep the connection open until the client disconnects
        msg = await websocket.recv()

# top level
connections = []
...
start_server = websockets.serve(newWsHandler, host, port)

signalThread = Thread(target=signalHandler)
signalThread.setDaemon(True)
signalThread.start()

loop = asyncio.get_event_loop()
loop.run_until_complete(start_server)
loop.run_forever()
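A minimal sketch of the direction hinted at above, assuming the worker thread is handed the event loop object: asyncio.run_coroutine_threadsafe() schedules a coroutine on the loop from another thread, so the actual send still happens on the loop. The host, port and check_signals() below are placeholders.

import asyncio
import threading
import time

import websockets

connections = []

async def newWsHandler(websocket, path):
    connections.append(websocket)
    try:
        while True:
            # keep the connection open until the client disconnects
            await websocket.recv()
    except websockets.ConnectionClosed:
        connections.remove(websocket)

async def notifyClients(message):
    # runs on the event loop thread, so awaiting send() here is fine
    for websocket in list(connections):
        await websocket.send(message)

def signalHandler(loop):
    while True:
        time.sleep(1)        # stand-in for the real signal polling
        if check_signals():  # hypothetical: whatever detects the change
            # hand the coroutine over to the loop running in the main thread
            asyncio.run_coroutine_threadsafe(notifyClients("Alert"), loop)

loop = asyncio.get_event_loop()
loop.run_until_complete(websockets.serve(newWsHandler, "localhost", 8765))
threading.Thread(target=signalHandler, args=(loop,), daemon=True).start()
loop.run_forever()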
Related
I am trying to implement a WebSocket connection to a server (Python app <=> Django app).
The whole system runs in a big asyncio loop with many tasks; the code snippet below is just a very small testing part.
I am able to send data to the server at any moment, and many of those messages will be requests that wait for a response. But I would also like to have an "always running" handler for all incoming messages. (When something in the Django database changes, I want to send the changes to the Python app.)
How can I include an always-running receiver, or add a callback to the websocket? I am not able to find any solution for this.
My code snippet:
import asyncio, json, websockets, logging

class UpdateConnection:
    async def connect(self, botName):
        self.sock = await websockets.connect('ws://localhost:8000/updates/bot/' + botName)

    async def send(self, data):
        try:
            await self.sock.send(json.dumps(data))
        except:
            logging.info("Websocket connection lost!")
            # Find a way to reconnect... or make the socket reconnect automatically

if __name__ == '__main__':
    async def DebugLoop(socketCon):
        await socketCon.connect("dev")
        print("Running..")
        while True:
            data = {"type": "debug"}
            await socketCon.send(data)
            await asyncio.sleep(1)

    uSocket = UpdateConnection()
    loop = asyncio.get_event_loop()
    loop.create_task(DebugLoop(uSocket))
    loop.run_forever()
After connecting, my debug server will start sending random messages to the client at random intervals, and I would like to handle them in an async way.
Thanks for any help :)
You don't have to make it so complicated. First of all, I suggest you use the connection patterns offered by the websockets module.
From the documentation:
connect() can be used as an infinite asynchronous iterator to reconnect automatically on errors:
async for websocket in websockets.connect(...):
    try:
        ...
    except websockets.ConnectionClosed:
        continue
Additionally, you simply keep the connection alive by awaiting incoming messages:
my_websocket = None

async for websocket in websockets.connect('ws://localhost:8000/updates/bot/' + botName):
    try:
        my_websocket = websocket
        async for message in websocket:
            pass  # here you could also process incoming messages
    except websockets.ConnectionClosed:
        my_websocket = None
        continue
As you can see we have a nested loop here:
The outer loop constantly reconnects to the server
The inner loop processes one incoming message at a time
If you are connected, and no messages are coming in from the server, this will just sleep.
The other thing that happens here is that my_websocket is set to the active connection, and unset again when the connection is lost.
In other parts of your script you can use my_websocket to send data. Note that you will need to check if it is currently set wherever you use it:
async def send(data):
    if my_websocket:
        await my_websocket.send(json.dumps(data))
This is just an illustration, you can also keep the websocket object as an object member, or pass it to another component through a setter function, etc.
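For instance, here is a rough sketch of the same idea wrapped in a class (the class and method names are just illustrative): the reconnect loop keeps self.websocket up to date, and any other task on the same event loop can call send() whenever it needs to push something.

import asyncio
import json

import websockets

class UpdateConnection:
    def __init__(self, url):
        self.url = url
        self.websocket = None  # set only while a connection is alive

    async def run(self):
        # reconnects forever; incoming messages are handled in the inner loop
        async for websocket in websockets.connect(self.url):
            try:
                self.websocket = websocket
                async for message in websocket:
                    print("received:", message)  # process incoming messages here
            except websockets.ConnectionClosed:
                continue
            finally:
                self.websocket = None

    async def send(self, data):
        if self.websocket:
            await self.websocket.send(json.dumps(data))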
I have a Python3 program that runs a "while True"-loop until stopped, which occasionally saves data to a MySQL database. I am creating an administrative website, separate from the Python program, where I will be able to observe this data.
I now want to be able to be notified, on the website, when changes have been made to the database. My thought was to set up a websocket connection, so that the Python program can send a message through the socket to all connected clients, i.e. all open browsers, if there has been any changes to the database table.
I have done something similar before, but in that case I had to wait for a websocket connection before the "while True"-loop would start. In the new scenario I want to be able to have multiple website clients at once, and let them connect at any time, as well as disconnect, without interrupting the Python program's loop.
This is a simplified version of my previous code, which I now want to update to be able to run both with & without websocket clients.
import asyncio
import websockets

async def run(ws):
    while True:
        db_has_updated = do_stuff()
        if db_has_updated:
            await ws.send(data)

socket_server = websockets.serve(run, "127.0.0.1", 5055)
asyncio.get_event_loop().run_until_complete(socket_server)
console_log("Waiting for socket connection...")
asyncio.get_event_loop().run_forever()
I just can't seem to be able to come up with the right search terms to find a solution, so I'm asking here instead.
I figured it out, finally! Here is my solution with a websocket server running in a separate thread from the other logic. I'm probably changing some things to make it neater, but this does everything I need. Feel free to ask any questions.
Be aware that this blocks when messaging all the connected clients. That is the way I needed it to work, but you could always thread/subprocess the logic/data-gen part of the program if you want it to run completely asynchronously.
#!/usr/bin/env python3

import asyncio
import websockets
import threading
import time
import random

def gen_data():
    print("Generating data...")
    time.sleep(3)
    data = random.randint(1, 10)
    return data

async def send(client, data):
    await client.send(data)

async def handler(client, path):
    # Register.
    print("Websocket Client Connected.", client)
    clients.append(client)
    while True:
        try:
            print("ping", client)
            pong_waiter = await client.ping()
            await pong_waiter
            print("pong", client)
            await asyncio.sleep(3)  # don't block the event loop between pings
        except Exception as e:
            clients.remove(client)
            print("Websocket Client Disconnected", client)
            break

clients = []
start_server = websockets.serve(handler, "localhost", 5555)

asyncio.get_event_loop().run_until_complete(start_server)
threading.Thread(target=asyncio.get_event_loop().run_forever).start()

print("Socket Server Running. Starting main loop.")

while True:
    data = str(gen_data())
    message_clients = clients.copy()
    for client in message_clients:
        print("Sending", data, "to", client)
        try:
            asyncio.run(send(client, data))
        except:
            # Clients might have disconnected during the messaging process,
            # just ignore that, they will have been removed already.
            pass
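Note that asyncio.run(send(client, data)) spins up a fresh event loop in the main thread, while the connection itself belongs to the loop running in the background thread. If that cross-thread call ever misbehaves, a common alternative is to keep a reference to the background loop and hand the coroutine to it with asyncio.run_coroutine_threadsafe(); a minimal sketch of that variation:

loop = asyncio.get_event_loop()
loop.run_until_complete(start_server)
threading.Thread(target=loop.run_forever, daemon=True).start()

while True:
    data = str(gen_data())
    for client in clients.copy():
        try:
            # schedule send() on the loop that owns the connection
            future = asyncio.run_coroutine_threadsafe(send(client, data), loop)
            future.result(timeout=5)  # optionally wait for the send to finish
        except Exception:
            # client disconnected mid-send; the handler removes it
            pass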
Sorry for the long post but I've been poking at this for over a week so I've tried a lot of different stuff. I know Python well enough but I don't have any experience with asyncio or non-blocking functions in Python.
I'm writing an API library/module/package/whatever for a web service that requires a websocket connection. There are many incoming messages to act on, and some control-related messages (web app level, not websocket control messages) that I need to send on occasion. I can easily receive messages over the connection and act on them. I can send messages, but only in response to received messages, because the receive loop is always blocking while waiting for messages. I don't want to wait for an incoming message to process an outgoing one, so the script doesn't have to hang on input until a new message is received. In my struggles to get two-way communication working as desired, I discovered I need to use something like Twisted, Tornado, or asyncio, but so far every implementation I've tried has failed. Note that the sending has to happen over the same connection. Opening a short-lived connection outside of the receive loop will not work. Here's what I've done so far:
The first iteration of the websocket code was using the websocket-client package. It was very close to the example from the docs:
import websocket
try:
    import thread
except ImportError:
    import _thread as thread
import time

def on_message(ws, message):
    # Send message frames to respective functions
    # for sorting, objectification, and processing
    pass

def on_error(ws, error):
    print(error)

def on_close(ws):
    print("### closed ###")

def on_open(ws):
    def run(*args):
        # Send initial frames required for server to send the desired frames
        pass
    thread.start_new_thread(run, ())

if __name__ == "__main__":
    websocket.enableTrace(True)
    ws = websocket.WebSocketApp(buildWebsocketURL(),
                                on_message=on_message,
                                on_error=on_error,
                                on_close=on_close)
    ws.on_open = on_open
    ws.run_forever()
This blocks any further execution outside of the loop. I tried reading up on the _thread module, but I couldn't find any indication that I could "communicate" with the websocket thread from outside. I tried setting up a pub/sub listener function that would forward data to ws.send() from another sender function, but it didn't work. No errors or anything, just no indication of any sent messages.
Next I tried the websockets module. This one seems to be built from the ground up to utilize asyncio. Again, I got a client built that would send initial messages and act on received messages, but the progress stopped there:
async def wsconnection():
    async with websockets.connect(getWebsocketURL()) as websocket:
        while True:
            message = await websocket.recv()
            if message == '{"type":"broadcaster.ready"}':
                subscriptions = getSubscriptions()  # Get subscriptions from ident data
                logging.info('Sending bookmarks to server as subscription keys')
                subscriptionupdate = '{{"type": "subscribe","subscription_keys": ["{0}"],"subscription_scope": "update"}}'.format(
                    '","'.join(subscriptions))
                subscriptioncontent = '{{"subscription_keys": ["{0}"],"subscription_scope": "content","type": "subscribe"}}'.format(
                    '","'.join(subscriptions))
                logging.debug(subscriptioncontent)
                await websocket.send(subscriptionupdate)
                await websocket.send(subscriptioncontent)
                await websocket.send(
                    '{"type":"message_lobby.read","lobby_id":"1","message_id:"16256829"}')
            sortframe(message)

asyncio.get_event_loop().run_until_complete(wsconnection())
I tried the aforementioned pub/sub listener applied here to no avail. Upon reading the docs for this module more thoroughly I tried getting the websocket protocol object (that contains the send() and recv() methods) outside of the loop then creating two coroutines(?), one listening for incoming messages and one listening for and sending outgoing messages. So far I've been completely unable to get the websocket protocol object without running the async with websockets.connect(getWebsocketURL()) as websocket: line within the scope of the wsconnection() function. I tried using websocket = websockets.client.connect() which according to the docs I thought would set the protocol object I need but it doesn't. All of the examples I can find don't seem to reveal any apparent way to structure the websockets sender and receiver in the way I require without extensive knowledge of asyncio.
I also poked around with autobahn, using similar code structures as above with both asyncio and Twisted, but I ran into all the same problems.
So far the closest I've gotten was with the websockets package above. The docs have an example snippet for a send/recv connection, but I can't really read what's going on there as it's all very specific to asyncio. I'm really having trouble wrapping my head around asyncio in general, and I think a big problem is that it seems to have evolved very rapidly recently, so there is a ton of very version-specific information floating around that conflicts. Not good for learning, unfortunately. This is what I tried using that example; it connects, receives initial messages, then the connection is lost/closed:
async def producer(message):
    print('Sending message')

async def consumer_handler(websocket, path):
    while True:
        message = await websocket.recv()
        await print(message)
        await pub.sendMessage('sender', message)

async def producer_handler(websocket, path):
    while True:
        message = await producer()
        await websocket.send(message)

async def wsconnect():
    async with websockets.connect(getWebsocketURL()) as websocket:
        path = "443"
        async def handler(websocket, path):
            consumer_task = asyncio.ensure_future(
                consumer_handler(websocket, path))
            producer_task = asyncio.ensure_future(
                producer_handler(websocket, path))
            done, pending = await asyncio.wait(
                [consumer_task, producer_task],
                return_when=asyncio.FIRST_COMPLETED,
            )
            for task in pending:
                task.cancel()

pub.subscribe(producer, 'sender')
asyncio.get_event_loop().run_until_complete(wsconnect())
So how do I structure this code to get sending and receiving over the same websocket connection? I also have various API calls to make in the same script while the websocket connection is open which further complicates things.
I'm using Python 3.6.6 and this script is intended to be imported as a module into other scripts so the websocket functionality will need to be wrapped up in a function or class for external calls.
I am in the exact same situation as you. I know that this is a very inelegant solution, because it still isn't full duplex, but I can't seem to find any example on the internet or Stack Overflow involving asyncio and the websockets module, which is what I used.
I don't think I completely understand your websockets example (is that server-side or client-side code?), but I'm going to explain my situation and "solution", and maybe that would be usable for you too.
So I have a server main function that has a websocket listening for messages in a loop with recv(). When I send "start", it will start a function that sends data every second to the JavaScript client in the browser. But while that function is sending data, I sometimes want to pause or stop the stream of data from my client by sending a stop message. The problem is that when I use recv() while the data sending has already begun, the server stops sending data and only waits for a message. I tried threads, multiprocessing and some other stuff, but eventually I came to this hopefully temporary solution: the client sends a "pong" message to the server immediately after it receives a piece of data, so that the server continues sending data on the next loop iteration, or stops sending data if the message is "stop" instead. So yeah, this is not real duplex, just fast half-duplex...
code on my python "server"
async def start_server(self, websocket, webserver_path):
    self.websocket = websocket
    self.webserver_path = webserver_path
    while True:
        command = await self.websocket.recv()
        print("received command")
        if command == "start":
            await self.analyze()
        await asyncio.sleep(1)
in my analyze function:
for i, row in enumerate(data):
    await self.websocket.send(json.dumps(row))
    msg = await self.websocket.recv()
    if msg == "stop":
        self.stopFlag = True
        return
    await asyncio.sleep(1)
main
start_server = websockets.serve(t.start_server, "127.0.0.1", 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
code on the javascript client
var ws = new WebSocket("ws://127.0.0.1:5678/");

ws.onmessage = function (event) {
    var datapoint = JSON.parse(event.data);
    console.log(counter);
    counter++;
    data.push(datapoint);
    if (data.length > 40) {
        var element = data.shift();
        render(data);
    }
    ws.send("pong"); // sending dummy message to let server continue
};
I know it is not THE solution, and I hope somebody else provides a better one, but since I have the same or a very similar problem and there are no other answers, I decided to post, and I hope it helps.
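For comparison, here is a minimal full-duplex sketch on the server side, assuming the same websockets package: the handler runs a receive task and a send task over the same connection, so incoming messages are handled immediately while outgoing data is pushed independently (the queue feeding the send side is just an illustration).

import asyncio
import json

import websockets

send_queue = asyncio.Queue()  # anything put here gets pushed to the client

async def consumer(websocket):
    # handle every incoming message as soon as it arrives
    async for message in websocket:
        print("received:", message)

async def producer(websocket):
    # push outgoing data independently of the receive loop
    while True:
        data = await send_queue.get()
        await websocket.send(json.dumps(data))

async def handler(websocket, path):
    consumer_task = asyncio.ensure_future(consumer(websocket))
    producer_task = asyncio.ensure_future(producer(websocket))
    done, pending = await asyncio.wait(
        [consumer_task, producer_task],
        return_when=asyncio.FIRST_COMPLETED,
    )
    for task in pending:
        task.cancel()

start_server = websockets.serve(handler, "127.0.0.1", 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()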
This is the basic TCP server from the asyncio tutorial:
import asyncio

class EchoServerClientProtocol(asyncio.Protocol):
    def connection_made(self, transport):
        peername = transport.get_extra_info('peername')
        print('Connection from {}'.format(peername))
        self.transport = transport

    def data_received(self, data):
        message = data.decode()
        print('Data received: {!r}'.format(message))

        print('Send: {!r}'.format(message))
        self.transport.write(data)

        print('Close the client socket')
        self.transport.close()

loop = asyncio.get_event_loop()
# Each client connection will create a new protocol instance
coro = loop.create_server(EchoServerClientProtocol, '127.0.0.1', 8888)
server = loop.run_until_complete(coro)

# Serve requests until CTRL+c is pressed
print('Serving on {}'.format(server.sockets[0].getsockname()))
try:
    loop.run_forever()
except KeyboardInterrupt:
    pass

# Close the server
server.close()
loop.run_until_complete(server.wait_closed())
loop.close()
Like all the other examples I found, it uses the blocking loop.run_forever().
How do I start the listening server and do something else in the meantime?
I have tried to move starting the server into a function and start that function with asyncio.async(), but with no success.
What am I missing here?
You can schedule several concurrent asyncio tasks before calling loop.run_forever().
@asyncio.coroutine
def other_task_coroutine():
    pass  # do something

start_tcp_server_task = loop.create_task(loop.create_server(
    EchoServerClientProtocol, '127.0.0.1', 8888))
other_task = loop.create_task(other_task_coroutine())
loop.run_forever()
When you call loop.create_task(loop.create_server()) or loop.create_task(other_task_coroutine()), nothing is actually executed: a coroutine object is created and wrapped in a task (consider a task to be a shell and the coroutine an instance of the code that will be executed in the task). The tasks are scheduled on the loop when created.
The loop will execute start_tcp_server_task first (as it's scheduled first) until a blocking IO event is pending or the passive socket is ready to listen for incoming connections.
You can see asyncio as a non-preemptible scheduler running on one CPU: once the first task interrupts itself or is done, the second task will be executed. Hence, when one task is executed, the other one has to wait until the running task finishes or yields (or "awaits" with Python 3.5). "Yielding" (yield from client.read()) or "awaiting" (await client.read()) means that the task gives control back to the loop's scheduler until client.read() can be executed (data is available on the socket).
Once the task has given control back to the loop, the loop can schedule the other pending tasks, process incoming events and schedule the tasks which were waiting for those events. Once there is nothing left to do, the loop will perform the only blocking call of the process: sleep until the kernel notifies it that events are ready to be processed.
In this context, you must understand that when using asyncio, everything running in the process must run asynchronously so the loop can do its work. You cannot use multiprocessing objects in the loop.
Note that asyncio.async(coroutine(), loop=loop) is equivalent to loop.create_task(coroutine()).
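As a small illustration of that cooperative scheduling (written with the Python 3.5 async/await syntax rather than the @asyncio.coroutine decorator used above; the task names are arbitrary):

import asyncio

async def ticker():
    # stands in for "something else" running alongside the server
    while True:
        print("tick")
        await asyncio.sleep(1)  # yields control back to the loop

async def slow_job():
    await asyncio.sleep(3)      # while this waits, ticker() keeps running
    print("slow job done")

loop = asyncio.get_event_loop()
loop.create_task(ticker())
loop.create_task(slow_job())
loop.run_forever()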
Additionally, you can consider running what you want in an executor.
For example:
coro = loop.create_server(EchoServerClientProtocol, '127.0.0.1', 8888)
server = loop.run_until_complete(coro)

async def execute(loop):
    await loop.run_in_executor(None, your_func_here, *args)

asyncio.async(execute(loop))
loop.run_forever()
run_in_executor() will run whatever function you give it in an executor (a thread pool by default), which won't block your server.
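A slightly more complete sketch of that idea, reusing the EchoServerClientProtocol from the question; blocking_work() is just a placeholder for whatever blocking call you need to run:

import asyncio
import time

def blocking_work():
    # placeholder for any blocking call you don't want on the loop
    time.sleep(5)
    return "done"

async def execute(loop):
    # the default executor is a thread pool, so this doesn't block the loop
    result = await loop.run_in_executor(None, blocking_work)
    print("executor returned:", result)

loop = asyncio.get_event_loop()
coro = loop.create_server(EchoServerClientProtocol, '127.0.0.1', 8888)
server = loop.run_until_complete(coro)
loop.create_task(execute(loop))
loop.run_forever()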
I have a small asynchronous server implemented using bottle and gevent.wsgi. There is a routine used to implement long poll that looks pretty much like the "Event Callbacks" example in the bottle documentation:
def worker(body):
    msg = msgbus.recv()
    body.put(msg)
    body.put(StopIteration)

@route('/poll')
def poll():
    body = gevent.queue.Queue()
    task = gevent.spawn(worker, body)
    return body
Here, msgbus is a ZMQ sub socket.
This all works fine, but if a client breaks the connection while worker is blocked on msgbus.recv(), that greenlet task will hang around "forever" (well, until a message is received), and will only find out about the disconnected client when it attempts to send a response.
I can use msgbus.poll(timeout=something) if I don't want to block forever waiting for ipc messages, but I still can't detect a client disconnect.
What I want to do is get something like a reference to the client socket so that I can use it in some kind of select or poll loop, or get some sort of asynchronous notification inside my greenlet, but I'm not sure how to accomplish either of these things with these frameworks (bottle and gevent).
Is there a way to get notified of client disconnects?
Aha! The wsgi.input variable, at least under gevent.wsgi, has an rfile member that is a file-like object. This doesn't appear to be required by the WSGI spec, so it might not work with other servers.
With this I was able to modify my code to look something like:
def worker(body, rfile):
    poll = zmq.Poller()
    poll.register(msgbus)
    poll.register(rfile, zmq.POLLIN)

    while True:
        events = dict(poll.poll())

        if rfile.fileno() in events:
            # client disconnect!
            break

        if msgbus in events:
            msg = msgbus.recv()
            body.put(msg)
            break

    body.put(StopIteration)

@route('/poll')
def poll():
    rfile = bottle.request.environ['wsgi.input'].rfile
    body = gevent.queue.Queue()
    task = gevent.spawn(worker, body, rfile)
    return body
And this works great...
...except on OpenShift, where you will have to use the alternate frontend on port 8000 with websockets support.