Creating a Simple Python WebSocket Server - python

I am trying to implement a simple WebSocket server in Python using this module. For learning purposes, the server should reply with a reversed version of what it received. For example, if the client sends "Hello Server", the server should respond with "revreS olleH". My code is based on the documentation here.
Since an example of a consumer() and producer() function/coroutine wasn't provided in the documentation, I took a stab at creating them, but I think I'm misunderstanding something that isn't obvious to me. The code currently returns the string 'nothing' instead of the reversed version of what the client sent.
FYI, since the machine I am using has Python 3.4.3, the code had to be adjusted to accommodate that version. That's why you'll see newer code commented out, for now. Lots of documentation is included too, as I learn this stuff.
Now, the codez...
index.py:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#########################
# Dependencies
#########################
# asyncio
# websockets
#########################
# Modules
#########################
import asyncio
import websockets
#########################
# Functions
#########################
# async indicates an asynchronous function.
# Calling them doesn't actually run them,
# but instead a coroutine object is returned,
# which can then be passed to the event loop to be executed later on.
# Python ≥ 3.5: async def producer(reply):
@asyncio.coroutine
def producer(reply=None):
    """Sends the reply to producer_handler."""
    if reply is None:
        return 'nothing'
    else:
        return reply

# Python ≥ 3.5: async def consumer(message):
@asyncio.coroutine
def consumer(message):
    """Reverses message then sends it to the producer."""
    reply = message[::-1]
    # Python ≥ 3.5: await producer(reply)
    yield from producer(reply)

# Python ≥ 3.5: async def consumer_handler(websocket):
@asyncio.coroutine
def consumer_handler(websocket):
    """Handles incoming websocket messages."""
    while True:
        # await calls an asynchronous function.
        # Python ≥ 3.5: message = await websocket.recv()
        message = yield from websocket.recv()
        # Python ≥ 3.5: await consumer(message)
        yield from consumer(message)

# Python ≥ 3.5: async def producer_handler(websocket):
@asyncio.coroutine
def producer_handler(websocket):
    """Handles outgoing websocket messages."""
    while True:
        # Python ≥ 3.5: message = await producer()
        message = yield from producer()
        # Python ≥ 3.5: await websocket.send(message)
        yield from websocket.send(message)

# Python ≥ 3.5: async def handler(websocket, path):
@asyncio.coroutine
def handler(websocket, path):
    """Enables reading and writing messages on the same websocket connection."""
    # A Future is an object that is supposed to have a result in the future.
    # ensure_future:
    #   schedules the execution of a coroutine object,
    #   wraps it in a future, then returns a Task object.
    #   If the argument is a Future, it is returned directly.
    # Python ≥ 3.5:
    # consumer_task = asyncio.ensure_future(consumer_handler(websocket))
    # producer_task = asyncio.ensure_future(producer_handler(websocket))
    consumer_task = asyncio.async(consumer_handler(websocket))
    producer_task = asyncio.async(producer_handler(websocket))
    # .wait:
    #   waits for the Futures and coroutine objects given
    #   by the sequence futures to complete. Coroutines will be
    #   wrapped in Tasks. Returns two sets of Future: (done, pending).
    # Python ≥ 3.5: done, pending = await asyncio.wait(
    done, pending = yield from asyncio.wait(
        # The futures.
        [consumer_task, producer_task],
        # FIRST_COMPLETED: the function will return when
        # any future finishes or is cancelled.
        return_when=asyncio.FIRST_COMPLETED,
    )
    for task in pending:
        task.cancel()

#########################
# Start script
#########################
def main():
    # Creates a WebSocket server.
    start_server = websockets.serve(handler, '127.0.0.1', 8000)
    # Get the event loop for the current context.
    # Run until the Future is done.
    asyncio.get_event_loop().run_until_complete(start_server)
    # Run until stop() is called.
    asyncio.get_event_loop().run_forever()

#########################
# Script entry point.
#########################
if __name__ == '__main__':
    main()
index.html:
<!DOCTYPE html>
<html>
    <head>
        <title>WebSocket demo</title>
    </head>
    <body>
        <script>
            // Create the websocket.
            var ws = new WebSocket("ws://127.0.0.1:8000/"),
                messages = document.createElement('ul');
            // Called when the websocket is opened.
            ws.onopen = function(event) {
                ws.send('Hello Server!');
            };
            // Called when a message is received from server.
            ws.onmessage = function(event) {
                var messages = document.getElementsByTagName('ul')[0],
                    message = document.createElement('li'),
                    content = document.createTextNode(event.data);
                message.appendChild(content);
                messages.appendChild(message);
            };
            document.body.appendChild(messages);
        </script>
    </body>
</html>

I'm not completely sure about this, but I think you misinterpreted the docs. The consumer shouldn't be calling the producer.
The "Hello Server!" the HTML file sends goes through consumer_handler to consumer to producer, but the yield from statements mean that the reversed string ends up back in consumer_handler, as the result of yield from consumer(message).
On the other hand, producer_handler calls producer many times without an argument (from message = yield from producer()), which is what creates the 'nothing' that gets sent to the HTML file. It never receives the consumer's string.
Instead, there should be a queue or something that the consumer pushes to and the producer takes from, like in this example.
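To make that concrete, here is a minimal sketch of the queue idea in the same Python 3.4 style as your code. The module-level queue is my own simplification to keep the sketch short; the linked example keeps per-connection state instead:

import asyncio
import websockets

# One queue connecting the consumer side to the producer side.
queue = asyncio.Queue()

@asyncio.coroutine
def consumer_handler(websocket):
    """Reverse each incoming message and push the reply onto the queue."""
    while True:
        message = yield from websocket.recv()
        yield from queue.put(message[::-1])

@asyncio.coroutine
def producer_handler(websocket):
    """Pop replies off the queue and send them to the client."""
    while True:
        reply = yield from queue.get()
        yield from websocket.send(reply)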
Thanks.

Related

Python Process blocking the rest of application

I have a program that basically does two things: it opens a websocket and stays listening for messages, and it starts a video stream in a forever loop.
I was trying to use multiprocessing to manage both things, but one piece stops the other from running.
The app is:
if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(start_client())

async def start_client():
    async with WSClient() as client:
        pass

class WSClient:
    async def __aenter__(self):
        async with websockets.connect(url, max_size=None) as websocket:
            self.websocket = websocket
            await self.on_open()  # it goes
            p = Process(target = init(self))  # This is the streaming method
            p.start()
            async for message in websocket:
                await on_message(message, websocket)  # listen for websocket messages
            return self
The init method is:
def init(ws):
    logging.info('Firmware Version: ' + getVersion())
    startStreaming(ws)
    return
Basically, startStreaming has an infinite loop in it.
In this configuration, the stream starts, but the websocket's on_message is never called, because the Process call freezes the rest of the application.
How can I run both methods?
Thanks
In your code, Process(target = init(self)) calls init(self) immediately, in the current process, and passes its return value to Process as the target; the streaming loop therefore runs (and blocks) before the new process is even started. What you want is for the new process to call init itself, with self as an argument. Here's how you can do that:
p = Process(target=init, args=(self,))
I have to note that you're passing an asynchronous websocket object to your init function. This will likely break, as asyncio objects usually aren't meant to be used from two threads, let alone two processes. Unless you're somehow recreating the websocket object in the new process and making a new event loop there too, what you're actually looking for is how to create an asyncio task.
Assuming startStreaming is already an async function, you should change the init function to this:
async def init(ws):  # note the async
    logging.info('Firmware Version: ' + getVersion())
    await startStreaming(ws)  # note the await
    return
and change the line creating and starting the process to this:
asyncio.create_task(init(self))
This will run your startStreaming function in a new task while you also read incoming messages at (basically) the same time.
Also, I'm not sure what you're trying to do with the async context manager, as everything could just be in a normal async function. If you're interested in using one for learning purposes, I'd suggest you check out contextlib.asynccontextmanager and have your message-reading code inside the async with statement in start_client rather than inside __aenter__, as sketched below.
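A minimal sketch of that suggestion, reusing url, init, and on_message from your code (everything else here is my assumption; requires Python 3.7+ for asynccontextmanager and create_task):

import asyncio
import contextlib
import websockets

@contextlib.asynccontextmanager
async def ws_client(url):
    """Yield a connected websocket; the connection closes on exit."""
    async with websockets.connect(url, max_size=None) as websocket:
        yield websocket

async def start_client():
    async with ws_client(url) as websocket:
        # start streaming as a background task on the same loop
        stream_task = asyncio.create_task(init(websocket))
        async for message in websocket:
            await on_message(message, websocket)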

How do I implement async generators?

I have subscribed to an MQ queue. Every time I get a message, I pass it to a function that then performs a number of time-consuming I/O actions on it.
The issue is that everything happens serially:
a request comes in, it picks up the request, performs the action by calling the function, and then picks up the next request.
I want to do this asynchronously, so that multiple requests can be dealt with concurrently.
results = []
queue = queue.subscribe(name)
async for message in queue:
    yield my_function(message)
The biggest issue is that my_function is slow, because it calls external web services, and I want my code to process other messages in the meantime.
I tried to implement it above, but it doesn't work! I am not sure how to implement async here.
I can't create a task, because I don't know how many requests will be received. It's an MQ which I have subscribed to. I loop over each message and perform an action. I don't want to wait for the function to complete before I perform the action on the next message. I want it to happen asynchronously.
If I understand your request, what you need is a queue that your request handlers fill and that you read from in the code that needs to do something with the results.
If you insist on an async iterator, it is straightforward to use a generator to expose the contents of a queue. For example:
def make_asyncgen():
    queue = asyncio.Queue(1)
    async def feed(item):
        await queue.put(item)
    async def exhaust():
        while True:
            item = await queue.get()
            yield item
    return feed, exhaust()
make_asyncgen returns two objects: an async function and an async generator. The two are connected in such a way that, when you call the function with an item, the item gets emitted by the generator. For example:
import random, asyncio

# Emulate a server that takes some time to process each message,
# and then provides a result. Here it takes an async function
# that it will call with the result.
async def serve(server_ident, on_message):
    while True:
        await asyncio.sleep(random.uniform(1, 5))
        await on_message('%s %s' % (server_ident, random.random()))

async def main():
    # create the feed function, and the generator
    feed, get = make_asyncgen()
    # subscribe to serve several requests in parallel
    asyncio.create_task(serve('foo', feed))
    asyncio.create_task(serve('bar', feed))
    asyncio.create_task(serve('baz', feed))
    # process results from all three servers as they arrive
    async for msg in get:
        print('received', msg)

asyncio.run(main())
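One design note, in case it isn't obvious from the sketch: the asyncio.Queue(1) bound gives you backpressure. Each feed(item) call suspends until the consumer has taken the previous item off the queue, so producers are naturally throttled to the speed of the code iterating the generator; an unbounded asyncio.Queue() would instead buffer without limit.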

How to fix this async generator object is not an iterator issue in Python?

I'm trying to mock a websockets data stream and I'm getting this error: 'async_generator' object is not an iterator
This is my generator code:
from time import sleep

mock_sf_record = '{"payload": ...}'

async def generateMessages():
    sleep(5)
    yield mock_sf_record
and the code that calls it:
async def subscribe(subscription):
    global RECEIVED_MESSAGES_CACHE
    ...
    while True:
        messageStream = await(next(generateMessages())) if ENV == 'dev' else await websocket.recv()
What can I do? What am I doing wrong? I'm basically using the generateMessages() generator to create a stream of messages, but this isn't working...
The code that is calling subscribe:
for subscription in SUBSCRIPTION_TYPES:
    loop.create_task(subscribe(subscription))
loop.run_forever()
More importantly, if I change the code to use a synchronous generator, this only generates messages for a single subscription, and I never seem to generate messages for any other subscription... it seems to block on a single thread. Why is this?
messageStream = (next(generateMessages())) if ENV == 'dev' else await websocket.recv()
and
# generator that generates mock SF data
from asyncio import sleep

mock_sf_record = '{"payload": ...}'

def generateMessages():
    sleep(5)
    yield mock_sf_record
Why does the synchronous generator cause problems?
The right way:
async def subscribe(subscription):
    global RECEIVED_MESSAGES_CACHE
    ...
    gen = generateMessages()  # init async generator
    messageStream = (await gen.__anext__()) if ENV == 'dev' else (await websocket.recv())
https://www.python.org/dev/peps/pep-0525/#support-for-asynchronous-iteration-protocol
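As an aside, the more idiomatic way to drain an async generator is async for, which calls __anext__ for you. A minimal sketch, reusing your names (and substituting asyncio.sleep for time.sleep, since a blocking sleep inside the generator would stall the whole event loop and starve your other subscription tasks):

import asyncio

mock_sf_record = '{"payload": ...}'

async def generateMessages():
    while True:
        await asyncio.sleep(5)  # non-blocking pause between mock records
        yield mock_sf_record

async def subscribe(subscription):
    async for messageStream in generateMessages():
        ...  # handle each mock record as it arrives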

Python websockets, how to send message from function

I'm writing an update to my code to send a WebSocket message to a connected web browser, telling it that it needs to update its data (it's a charting web app).
This message needs to be sent when the code has inserted new data into the MySQL database. I will write some JavaScript in the browser to go and get the update on receiving the message.
My test code:
import asyncio
#import time
import websockets

def readValues():
    '''do stuff that returns the values for database'''
    pass

def insertdata(val):
    '''insert values into mysql'''
    pass

async def ph(websocket, path):
    while True:
        message = 'update'
        # here we receive message that the data
        # has been added and need to message the
        # browser to update
        print('socket executed')
        await websocket.send(message)
        await asyncio.sleep(2)
        # shouldn't be needed as message
        # sent only when updated data
        # inserted (every 20s)

async def main():  # maybe use this to get/write to the database etc
    while True:    # instead of the loop at bottom
        print('main executed')
        await asyncio.sleep(20)

start_server = websockets.serve(ph, '0.0.0.0', 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_until_complete(main())
asyncio.get_event_loop().run_forever()

# below copied from current program
'''
while 1:
    try:
        a = readValues()  # read values from a function
        insertdata(a)     # function to write values to mysql
        # some method to send the message to the web browser via
        # websocket, that it needs to get the new data
        time.sleep(20)    # wait and then do it again
    except Exception as e:
        print(e)
'''
I can send a message using the message variable.
I need the readValues and insertdata functions to run continuously, every 20 seconds, regardless of what's happening with the WebSocket.
But I can't work out how to send a message to the browser from the function that updates the database, and I can't work out the best method to run the WebSocket process and the updating of the database at the same time.
I've written comments in the code to try and help you understand what I'm trying to do.
Hope you can understand. Thanks, guys.
Update: Thanks Nathan.
I changed the code and made two files, like the below:
Server:
import asyncio
import websockets

async def ph(websocket, path):
    while True:
        need_update = await websocket.recv()
        print('socket executed')
        await websocket.send(need_update)

start_server = websockets.serve(ph, '0.0.0.0', 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
Process file:
import asyncio
import time
import websockets

async def main():
    async with websockets.connect('ws://127.0.0.1:5678') as websocket:
        while 1:
            try:
                #a = readValues() #read values from a function
                #insertdata(a) #function to write values to mysql
                await websocket.send("updated")
                print('data updated')
                time.sleep(20)  # wait and then do it again
            except Exception as e:
                print(e)

asyncio.get_event_loop().run_until_complete(main())
I then ran both of these (exactly as shown) and opened a web browser with this:
<!DOCTYPE html>
<html>
    <head>
    </head>
    <body>
        <h3>
            Test
        </h3>
        <p>
            <div id="log"></div>
        </p>
        <script>
            // helper function: log message to screen
            function log(msg) {
                document.getElementById('log').innerText += msg + '\n';
            }
            // setup websocket with callbacks
            var ws = new WebSocket('ws://192.168.0.224:5678/');
            ws.onopen = function() {
                log('CONNECT');
            };
            ws.onclose = function() {
                log('DISCONNECT');
            };
            ws.onmessage = function(event) {
                log('MESSAGE: ' + event.data);
            };
        </script>
    </body>
</html>
Everything seems fine until I open the browser as above.
Then nothing comes to the browser apart from the 'CONNECT' result, and:
WebSocket connection is closed: code = 1006 (connection closed abnormally [internal]), no reason
appears on both scripts.
You need a socket connection between the "database handler" and the socket server.
Create a second script with the main loop:
async def main():
    async with websockets.connect(websocket_address) as websocket:
        while 1:
            try:
                a = readValues()  # read values from a function
                insertdata(a)     # function to write values to mysql
                await websocket.send("some token to recognize that it's the db socket")
                await asyncio.sleep(20)  # non-blocking wait; time.sleep(20) would stall the event loop
            except Exception as e:
                print(e)

asyncio.get_event_loop().run_until_complete(main())
Then, on the other script, you could have:
USERS = set()

def register(websocket):
    USERS.add(websocket)

async def ph(websocket, path):
    while True:
        register(websocket)  # not sure if you need to place it here
        need_update = await websocket.recv()
        # check unique token to verify that it's the database;
        # here we receive the message that the data has been added,
        # and we need to tell the browsers to update
        message = 'update'
        print('socket executed')
        if USERS:  # asyncio.wait doesn't accept an empty list
            await asyncio.wait([user.send(message) for user in USERS])

start_server = websockets.serve(ph, '0.0.0.0', 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
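One caveat on the broadcast line, in case you're on a newer Python: passing bare coroutines to asyncio.wait is deprecated since Python 3.8 (and rejected on 3.11+), so the same fan-out is safer written with asyncio.gather:

if USERS:
    # send to every registered client concurrently
    await asyncio.gather(*(user.send(message) for user in USERS))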

Python asyncio: unreferenced tasks are destroyed by garbage collector?

I am writing a program that accepts RPC requests over AMQP for executing network requests (CoAP). When processing RPC requests, the aioamqp callback generates tasks that are responsible for network IO. These tasks can be considered background tasks that will run indefinitely, streaming network responses over AMQP (in this case, one RPC request triggers an RPC response plus data streaming).
I noticed that in my original code the network task would be destroyed after seemingly random time intervals (before it was finished); asyncio would then print the warning "Task was destroyed but it is pending". This issue is similar to the one described here: https://bugs.python.org/issue21163.
For now I have circumvented the issue by storing a hard reference in a module-level list, which prevents the GC from destroying the task object. However, I was wondering if there is a better workaround? Ideally I would want to await the task in the RPC callback, but I noticed that this prevents any further AMQP operations from completing, e.g. creating a new AMQP channel stalls, and receiving RPC requests over AMQP also stalls. I am unsure what is causing this stalling, however (as the callback is itself a coroutine, I would expect waiting would not stall the entire aioamqp library).
I am posting the source below for the RPC client and server; both are based on the aioamqp/aiocoap examples. In the server, on_rpc_request is the AMQP RPC callback and send_coap_obs_request is the networking coroutine that gets destroyed when the obs_tasks.append(task) statement is removed.
client.py:
"""
CoAP RPC client, based on aioamqp implementation of RPC examples from RabbitMQ tutorial
"""
import base64
import json
import uuid
import asyncio
import aioamqp
class CoAPRpcClient(object):
def __init__(self):
self.transport = None
self.protocol = None
self.channel = None
self.callback_queue = None
self.waiter = asyncio.Event()
async def connect(self):
""" an `__init__` method can't be a coroutine"""
self.transport, self.protocol = await aioamqp.connect()
self.channel = await self.protocol.channel()
result = await self.channel.queue_declare(queue_name='', exclusive=True)
self.callback_queue = result['queue']
await self.channel.basic_consume(
self.on_response,
no_ack=True,
queue_name=self.callback_queue,
)
async def on_response(self, channel, body, envelope, properties):
if self.corr_id == properties.correlation_id:
self.response = body
self.waiter.set()
async def call(self, n):
if not self.protocol:
await self.connect()
self.response = None
self.corr_id = str(uuid.uuid4())
await self.channel.basic_publish(
payload=str(n),
exchange_name='',
routing_key='coap_request_rpc_queue',
properties={
'reply_to': self.callback_queue,
'correlation_id': self.corr_id,
},
)
await self.waiter.wait()
await self.protocol.close()
return json.loads(self.response)
async def rpc_client():
coap_rpc = CoAPRpcClient()
request_dict = {}
request_dict_json = json.dumps(request_dict)
print(" [x] Send RPC coap_request({})".format(request_dict_json))
response_dict = await coap_rpc.call(request_dict_json)
print(" [.] Got {}".format(response_dict))
asyncio.get_event_loop().run_until_complete(rpc_client())
server.py:
"""
CoAP RPC server, based on aioamqp implementation of RPC examples from RabbitMQ tutorial
"""
import base64
import json
import sys
import logging
import warnings
import asyncio
import aioamqp
import aiocoap
amqp_protocol = None
coap_client_context = None
obs_tasks = []
AMQP_COAP_NOTIFICATIONS_EXCHANGE_NAME = 'topic_coap'
AMQP_COAP_NOTIFICATIONS_TOPIC_NAME = 'topic'
AMQP_COAP_NOTIFICATIONS_ROUTING_KEY = 'coap.response'
def create_response_dict(coap_request, coap_response):
response_dict = {'request_uri': "", 'code': 0}
response_dict['request_uri'] = coap_request.get_request_uri()
response_dict['code'] = coap_response.code
if len(coap_response.payload) > 0:
response_dict['payload'] = base64.b64encode(coap_response.payload).decode('utf-8')
return response_dict
async def handle_coap_response(amqp_envelope, amqp_properties, coap_request, coap_response):
# create response dict:
response_dict = create_response_dict(coap_request, coap_response)
message = json.dumps(response_dict)
# create new channel:
global amqp_protocol
amqp_channel = await amqp_protocol.channel()
await amqp_channel.basic_publish(
payload=message,
exchange_name='',
routing_key=amqp_properties.reply_to,
properties={
'correlation_id': amqp_properties.correlation_id,
},
)
await amqp_channel.basic_client_ack(delivery_tag=amqp_envelope.delivery_tag)
print(" [.] handle_coap_response() published response: {}".format(response_dict))
def incoming_observation(coap_request, coap_response):
asyncio.async(handle_coap_notification(coap_request, coap_response))
async def handle_coap_notification(coap_request, coap_response):
# create response dict:
response_dict = create_response_dict(coap_request, coap_response)
message = json.dumps(response_dict)
# create new channel:
global amqp_protocol
amqp_channel = await amqp_protocol.channel()
await amqp_channel.exchange(AMQP_COAP_NOTIFICATIONS_EXCHANGE_NAME, AMQP_COAP_NOTIFICATIONS_TOPIC_NAME)
await amqp_channel.publish(message, exchange_name=AMQP_COAP_NOTIFICATIONS_EXCHANGE_NAME, routing_key=AMQP_COAP_NOTIFICATIONS_ROUTING_KEY)
print(" [.] handle_coap_notification() published response: {}".format(response_dict))
async def send_coap_obs_request(amqp_envelope, amqp_properties, request_dict, coap_request):
observation_is_over = asyncio.Future()
try:
global coap_client_context
requester = coap_client_context.request(coap_request)
requester.observation.register_errback(observation_is_over.set_result)
requester.observation.register_callback(lambda data, coap_request=coap_request: incoming_observation(coap_request, data))
try:
print(" [..] Sending CoAP obs request: {}".format(request_dict))
coap_response = await requester.response
except socket.gaierror as e:
print("Name resolution error:", e, file=sys.stderr)
return
except OSError as e:
print("Error:", e, file=sys.stderr)
return
if coap_response.code.is_successful():
print(" [..] Received CoAP response: {}".format(coap_response))
await handle_coap_response(amqp_envelope, amqp_properties, coap_request, coap_response)
else:
print(coap_response.code, file=sys.stderr)
if coap_response.payload:
print(coap_response.payload.decode('utf-8'), file=sys.stderr)
sys.exit(1)
exit_reason = await observation_is_over
print("Observation is over: %r"%(exit_reason,), file=sys.stderr)
finally:
if not requester.response.done():
requester.response.cancel()
if not requester.observation.cancelled:
requester.observation.cancel()
async def on_rpc_request(amqp_channel, amqp_body, amqp_envelope, amqp_properties):
print(" [.] on_rpc_request(): received RPC request: {}".format(amqp_body))
request_dict = {} # hardcoded to vdna.be for SO example
aiocoap_code = aiocoap.GET
aiocoap_uri = "coap://vdna.be/obs"
aiocoap_payload = ""
# as we are ready to send the CoAP request, ack the client already indicating we have received the RPC request
await amqp_channel.basic_client_ack(delivery_tag=amqp_envelope.delivery_tag)
coap_request = aiocoap.Message(code=aiocoap_code, uri=aiocoap_uri, payload=aiocoap_payload)
coap_request.opt.observe = 0
task = asyncio.ensure_future(send_coap_obs_request(amqp_envelope, amqp_properties, request_dict, coap_request))
# we have to keep a hard ref to this task, otherwise the python garbage collector destroyes the task before it is completed. See https://bugs.python.org/issue21163
# this is apparent from the "Task was destroyed but it is pending" exception thrown after random (lengthy) time intervals, probably the time interval is related to when the gc is triggered
# await task # this does not seem to work, as it prevents new amqp operations from executing (e.g. amqp channels do not get created)
# we are actually not interested in waiting for the task anyway, so instead just keep a hard ref to the task in the obs_tasks list
obs_tasks.append(task) # TODO: when do we remove the task from the list?
async def amqp_connect():
try:
(transport, protocol) = await aioamqp.connect('localhost', 5672)
print(" [x] Connected to AMQP broker")
return (transport, protocol)
except aioamqp.AmqpClosedConnection as ex:
print("closed connections: {}".format(ex))
raise ex
async def main():
"""Open AMQP connection to broker, subscribe to coap_request_rpc_queue and setup aiocoap client context """
try:
global amqp_protocol
(amqp_transport, amqp_protocol) = await amqp_connect()
channel = await amqp_protocol.channel()
await channel.queue_declare(queue_name='coap_request_rpc_queue')
await channel.basic_qos(prefetch_count=10, prefetch_size=0, connection_global=False)
await channel.basic_consume(on_rpc_request, queue_name='coap_request_rpc_queue')
print(" [x] Awaiting CoAP request RPC requests")
except aioamqp.AmqpClosedConnection as ex:
print("amqp_connect: closed connections: {}".format(ex))
exit()
global coap_client_context
coap_client_context = await aiocoap.Context.create_client_context()
if __name__ == "__main__":
loop = asyncio.get_event_loop()
loop.set_debug(True)
asyncio.async(main())
loop.run_forever()
When a task is scheduled, its _step callback is scheduled in the loop. That callback maintains a reference to the task through self. I have not checked the code, but I have high confidence that the loop maintains a reference to its callbacks. However, when a task awaits some awaitable or future, the _step callback is not scheduled. In that case, the task adds a done callback (which retains a reference to the task) to the future, but the loop does not retain references to tasks waiting for futures.
So long as something retains a reference to the future that the task is waiting on, all is well. However, if nothing retains a hard reference to the future, then the future can get garbage collected, and when that happens, the task can get garbage collected.
So, I'd look for things that your task calls where the future the task is waiting on might not be referenced.
In general, the future needs to be referenced so someone can set its result eventually, so it is very likely a bug if you have unreferenced futures.
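As for the workaround itself, a common pattern (a sketch, not specific to aioamqp) is to keep the hard references in a module-level set and let each task drop itself when it finishes, which also answers the TODO in on_rpc_request about when to remove tasks from the list:

import asyncio

background_tasks = set()

def spawn(coro):
    """Schedule coro and keep a hard reference to the task until it completes."""
    task = asyncio.ensure_future(coro)
    background_tasks.add(task)
    # the done callback drops the reference, so finished tasks can be collected
    task.add_done_callback(background_tasks.discard)
    return task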
