Python async events and serial async?

I have code (a Python class, I'll call it Arduino) that writes control packets to a serial port (using serial_asyncio), and the serial port replies with confirmation packets. However, the remote device also sends event packets to the Python side at random times. I want the class to provide the decoded packets to the class that instantiates it (I'll call it Controller). I am confused about how to handle this.
My first thought is to provide a callback to the Arduino class:
class Controller:
    def __init__(self):
        self.arduino = Arduino("/dev/ttyS0", self._serialPacketCallback)

    def _serialPacketCallback(self, obj: dict):
        pass  # handle spontaneous event packet from remote
But this does not seem very async-y. What is the asyncio way to do this? I think this would look like:
class Controller:
    def __init__(self):
        self.arduino = Arduino("/dev/ttyS0")

    async def _readEventPacket(self):
        pass

    # somewhere, not sure where, or how to start it:
    async def _handleEvents(self):
        while True:
            event = await self._readEventPacket()

    async def start(self):
        await self.arduino.start()
        await asyncio.wait([self._handleEvents()])

if __name__ == '__main__':
    controller = Controller()
    loop = asyncio.get_event_loop()
    loop.create_task(controller.start())
    loop.run_forever()
I've looked around and seen suggestions of using callbacks, multiprocessing pipes, and additional event loops, and I'm sure they work, but I'm not sure what the proper approach is. I don't want to start any additional event loops or threads, which leads me to think the callback is best, but that is not very async, and I'd like to know how to do this the async way, without additional event loops or callbacks.
An additional concern I want to articulate is that I'd like this to be as loosely coupled as possible, since the Arduino class will be used in other controllers.
Side note: when is it actually necessary in Python to create a new event loop?
Another question: how does the Arduino class generate an event and have Controller pick it up in await self._readEventPacket()?

The nice thing about asyncio is that you can always convert a callback-based interface to a coroutine-based one, and vice versa.
Let's assume your Arduino class implements a callback-based interface, like this (untested):
import asyncio
import serial_asyncio

class Arduino:
    def __init__(self, device, callback):
        self._device = device
        self._callback = callback

    async def start(self):
        reader, writer = await serial_asyncio.open_serial_connection(
            url=self._device, baudrate=115200)
        while True:
            data = await reader.read(1024)
            self._callback(data)
You can convert that interface into a coroutine-based one by using a queue:
def callback_to_coro():
    # Return two functions: one that can be passed as a callback to
    # code that expects it, and the other a coroutine that can be
    # awaited to get the values the callback was invoked with.
    queue = asyncio.Queue()
    return queue.put_nowait, queue.get
With that code you can implement Controller.read_event_packet like this:
class Controller:
    def __init__(self):
        callback, wait = callback_to_coro()
        self.arduino = Arduino("/dev/ttyS0", callback)
        self.read_event_packet = wait
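For completeness, here is a minimal sketch of how the pieces could be wired together; the main() scaffolding is my own illustration, not part of the original answer:

async def main():
    controller = Controller()
    # run the serial reader in the background; keep a reference to the task
    reader_task = asyncio.create_task(controller.arduino.start())
    while True:
        packet = await controller.read_event_packet()
        print("received:", packet)

asyncio.run(main())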


asyncio network operation in thread?

I have a Python asyncio script that needs to run a long-running task in a thread. During the operation of the thread, it needs to make network connections to another server. Is there any problem calling network/socket write functions from a thread, as opposed to doing it in the main thread?
I know that in the Twisted library, for example, one must always do network operations in the main thread. Are there any such limitations in asyncio? And if so, how does one get around this problem?
Here's my sample code:
import asyncio
import threading

# global servers dict keeps track of connected instances of each protocol
servers = {}

class SomeOtherServer(asyncio.Protocol):
    def __init__(self):
        self.transport = None

    def connection_made(self, transport):
        self.transport = transport
        servers["SomeOtherServer"] = self

    def connection_lost(self, exc):
        self.transport = None

class MyServer(asyncio.Protocol):
    def __init__(self):
        self.transport = None

    def connection_made(self, transport):
        self.transport = transport
        servers["MyServer"] = self

    def connection_lost(self, exc):
        self.transport = None

    def long_running_task(self, data):
        # some long running operations here, then write data to other server
        # other_server is also an instance of some sort of asyncio.Protocol
        # is it ok to call this like this, even though this method is running in a thread?
        other_server = servers["SomeOtherServer"]
        other_server.transport.write(data)

    def data_received(self, data):
        task_thread = threading.Thread(target=self.long_running_task, args=[data])
        task_thread.start()

async def main():
    global loop
    loop = asyncio.get_running_loop()
    other_server_obj = await loop.create_server(lambda: SomeOtherServer(), "localhost", 9001)
    my_server_obj = await loop.create_server(lambda: MyServer(), "localhost", 9002)
    async with other_server_obj, my_server_obj:
        while True:
            await asyncio.sleep(3600)

asyncio.run(main())
Note that data_received sets up and calls long_running_task in a thread, and long_running_task makes a network connection to another server from that task thread, not the main thread. Is this OK, or is there some other way this must be done?
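For what it's worth, asyncio objects, including transports, are not thread-safe, so the usual pattern is to schedule the write back onto the event loop thread with loop.call_soon_threadsafe rather than calling transport.write() directly from the worker thread. A minimal sketch of long_running_task written that way, using the global loop set in main():

    def long_running_task(self, data):
        # ... long-running, blocking work happens here, off the event loop ...
        other_server = servers["SomeOtherServer"]
        # transports are not thread-safe; hand the write back to the loop thread
        loop.call_soon_threadsafe(other_server.transport.write, data)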

Multiprocess management in Python with aiomultiprocess

I have a problem with multiprocessing in Python. I need to create async processes, which run for an undefined time, and the number of processes is also undefined. As soon as a new request arrives, a new process must be created with the specifications from the request. We use ZeroMQ for messaging. There is also a process which is started at the beginning and only ends when the whole script terminates.
Now I am searching for a solution for how I can await all processes while still being able to add additional processes.
asyncio.gather() was my first idea, but it needs the list of awaitables before it is called.
class Object:
    def __init__(self, var):
        self.var = var

    async def run(self):
        ...  # do async things

class object_controller:
    def __init__(self):
        self.ctx = zmq.Context()
        self.socket = self.ctx.socket(zmq.PULL)
        self.socket.connect("tcp://127.0.0.1:5558")
        self.static_process = AStaticProcess()
        self.sp = aiomultiprocess.Process(target=self.static_process.run)
        self.sp.start()
        # here I need a good way to await this process

    def process(self, var):
        object = Object(var)
        process = aiomultiprocess.Process(target=object.run)
        process.start()

    def listener(self):
        while True:
            msg = self.socket.recv_pyobj()
            # here I need to find a way to start and await this process while being
            # able to receive additional requests, which result in additional
            # processes which need to be awaited
This is some code which hopefully explains my problem. I need a kind of collector which awaits the processes.
After initialization, there is no interaction between the object and the controller, only over ZeroMQ (between the static process and the variable processes). There is also no return value.
If you need to start processes while concurrently waiting for new ones, then instead of explicitly awaiting each Process to find out when it finishes, let them execute in the background using asyncio.create_task(). This returns a Task object, which has an add_done_callback method you can use to do some work when the process completes:
class Object:
    def __init__(self, var):
        self.var = var

    async def run(self):
        ...  # do async things

class object_controller:
    def __init__(self):
        self.ctx = zmq.Context()
        self.socket = self.ctx.socket(zmq.PULL)
        self.socket.connect("tcp://127.0.0.1:5558")
        self.static_process = AStaticProcess()
        self.sp = aiomultiprocess.Process(target=self.static_process.run)
        self.sp.start()
        t = asyncio.create_task(self.sp.join())
        t.add_done_callback(self.handle_proc_finished)

    def process(self, var):
        object = Object(var)
        process = aiomultiprocess.Process(target=object.run)
        process.start()

    def listener(self):
        while True:
            msg = self.socket.recv_pyobj()
            process = aiomultiprocess.Process(...)
            process.start()
            t = asyncio.create_task(process.join())
            t.add_done_callback(self.handle_other_proc_finished)

    def handle_proc_finished(self, task):
        # do something
        ...

    def handle_other_proc_finished(self, task):
        # do something else
        ...
If you want to avoid using callbacks, you can also pass create_task a coroutine you define yourself, which waits for the process to finish and does whatever needs to be done afterward.
self.sp.start()
asyncio.create_task(wait_for_proc(self.sp))

async def wait_for_proc(proc):
    await proc.join()
    # do other stuff
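One caveat worth adding: the event loop holds only weak references to tasks, so a fire-and-forget task can be garbage-collected before it finishes. Keeping the tasks in a set avoids this; a sketch of that bookkeeping (my addition, based on the asyncio documentation):

    # keep strong references to in-flight tasks so they aren't garbage-collected
    self._tasks = set()

    task = asyncio.create_task(wait_for_proc(process))
    self._tasks.add(task)
    task.add_done_callback(self._tasks.discard)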
You need to create a list of tasks, or a future object, for the processes. Also, you cannot add a process to the event loop while awaiting other tasks.

Add item to asyncio queue from a request handler

I have a TCP server running, with a handler function which needs to take the contents of each request, add it to an asyncio queue, and reply with an OK status.
In the background I have an async coroutine running that detects when a new item is added and performs some processing.
How do I put items into the asyncio queue from the handler function, which isn't and can't be an async coroutine?
I am running a DICOM server (pynetdicom) which listens on port 104 for incoming TCP requests (DICOM C-STORE, specifically).
I need to save the contents of the request to a queue and return a 0x0000 response so that the listener is available to the network.
This is modeled by a producer-consumer pattern.
I have tried to define a consumer coroutine consume_dicom() that is currently stuck in await queue.get(), since I can't properly define the producer.
The producer simply needs to invoke queue.put(produce_item), but this happens inside a handle_store(event) function which is not part of the event loop and is called every time a request is received by the server.
import asyncio
from pynetdicom import (
    AE, evt,
    StoragePresentationContexts
)

class PacsServer():
    def __init__(self, par, listen=True):
        # Initialize other stuff...
        # Initialize DICOM server
        ae = AE(ae_title='DICOM-NODE')
        ae.supported_contexts = StoragePresentationContexts
        # When a C-STORE request comes, it will be passed to self.handle_store
        handlers = [(evt.EVT_C_STORE, self.handle_store)]
        # Define queue
        self.loop = asyncio.get_event_loop()
        self.queue = asyncio.Queue(loop=self.loop)
        # Define consumer
        self.loop.create_task(self.consume_dicom(self.queue))
        # Start server in the background with specified handlers
        self.scp = ae.start_server(('', 104), block=False, evt_handlers=handlers)
        # Start async loop
        self.loop.run_forever()

    def handle_store(self, event):
        # Request handling
        ds = event.dataset
        # Here I want to add to the queue, but this is not an async method,
        # so the following line is not valid:
        await self.queue.put(ds)
        return 0x0000

    async def consume_dicom(self, queue):
        while True:
            print("AWAITING FROM QUEUE")
            ds = await queue.get()
            do_some_processing(ds)
I would like to find a way to add items to the queue and return the OK status in the handle_store() function.
Since handle_store is running in a different thread, it needs to tell the event loop to enqueue the item. This is done with call_soon_threadsafe:
self.loop.call_soon_threadsafe(queue.put_nowait, ds)
Note that you need to call queue.put_nowait instead of queue.put because the former is a function rather than a coroutine. The function will always succeed for unbounded queues (the default), otherwise it will raise an exception if the queue is full.
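Applied to the handler above, that looks roughly like this (assuming the loop is stored as self.loop, as in the constructor):

    def handle_store(self, event):
        ds = event.dataset
        # put_nowait is a plain function, so it can be scheduled from this thread
        self.loop.call_soon_threadsafe(self.queue.put_nowait, ds)
        return 0x0000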

How to create parallel sockets in Python with asyncio and the Transport base class?

I used asyncio for my non-stop server in Python and implemented the connection_made, connection_lost, and data_received functions in my ServerClientProtocol.
I chose this class first because when I repeatedly sent data to a plain socket, the socket got closed and the program exited, and second because I thought it was async and would answer multiple incoming connections in parallel at the same time, but it doesn't.
How should I use it to answer sockets in parallel from one async thread?
This is my code:
import asyncio

class ServerClientProtocol(asyncio.Protocol):
    def connection_made(self, transport):
        self.transport = transport

    def connection_lost(self, exc):
        pass

    def data_received(self, data):
        server.server(self, data)

def main(*args):
    loop = asyncio.get_event_loop()
    coro = loop.create_server(ServerClientProtocol, '127.0.0.1', 50008)
    srv = loop.run_until_complete(coro)
    loop.run_forever()

if __name__ == '__main__':
    main()
server.server() might be blocking the other connections. If this is a long-running call, try using asyncio.start_server instead, and call server.server() using await loop.run_in_executor(None, server.server, data).
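A minimal sketch of that suggestion, assuming server.server() can be adapted to take just the received payload ('server' is the module from the question):

import asyncio

async def handle_client(reader, writer):
    loop = asyncio.get_running_loop()
    while True:
        data = await reader.read(1024)
        if not data:
            break
        # run the blocking call in the default thread pool so that
        # other connections keep being served in the meantime
        await loop.run_in_executor(None, server.server, data)
    writer.close()

async def main():
    srv = await asyncio.start_server(handle_client, '127.0.0.1', 50008)
    async with srv:
        await srv.serve_forever()

asyncio.run(main())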

Writing an "interactive" client with Twisted/Autobahn Websockets

Maybe I'm missing something here in the asynchronous design of Twisted, but I can't seem to find a way to call the sendMessage() method "externally". By this I mean sending messages without being solely inside the callback methods of Twisted/Autobahn WebSockets (like onOpen, or onMessage() when receiving data from the server).
Of course I could launch a thread and call my_protocol_instance.sendMessage("hello"), but that would defeat every purpose of the asynchronous design, right?
As a concrete example, I need a top-level wrapper class which opens the connection and manages it, and whenever I need to, I call my_class.send_my_toplevel_message(msg). How can I implement this?
Hope I've been clear in my explanation.
Thanks
Why do you need a thread to launch protocolInstance.sendMessage()?
This can be done in a normal reactor loop.
The core of Twisted is the reactor, and things look much simpler when you consider Twisted itself reactive, meaning it does something as a reaction (response) to something else.
Now I assume that the thread you are talking about also gets created, and sendMessage called, because of certain events, activity, or status. I can hardly imagine a case where you would need to send a message out of the blue, without any reason to react.
If there is an event which should trigger sendMessage, there is no need to invoke it in a thread: just use Twisted's mechanisms for catching that event, and then call sendMessage from that particular event's callback.
Now on to your concrete example: can you specify what "whenever I need" means exactly in the context of this question? An input from another connection? An input from the user? Looping activity?
I managed to implement what I needed by running Twisted in another thread, keeping my program free to run, and letting it trigger sends in Twisted with reactor.callFromThread().
What do you think?
# ----- twisted ----------
import logging
import threading
import Queue  # Python 2; on Python 3 this module is named queue

from twisted.internet import reactor
# import paths vary between Autobahn versions; recent releases expose these
# under autobahn.twisted.websocket
from autobahn.twisted.websocket import (
    WebSocketClientProtocol, WebSocketClientFactory, connectWS)

log = logging.getLogger(__name__)

class _WebSocketClientProtocol(WebSocketClientProtocol):
    def __init__(self, factory):
        self.factory = factory

    def onOpen(self):
        log.debug("Client connected")
        self.factory.protocol_instance = self
        self.factory.base_client._connected_event.set()

class _WebSocketClientFactory(WebSocketClientFactory):
    def __init__(self, *args, **kwargs):
        WebSocketClientFactory.__init__(self, *args, **kwargs)
        self.protocol_instance = None
        self.base_client = None

    def buildProtocol(self, addr):
        return _WebSocketClientProtocol(self)
# ------ end twisted -------

class BaseWBClient(object):
    def __init__(self, websocket_settings):
        self.settings = websocket_settings
        # instance to be set by the factory
        self.factory = None
        # this event will be triggered on onOpen()
        self._connected_event = threading.Event()
        # queue to hold not yet dispatched messages
        self._send_queue = Queue.Queue()
        self._reactor_thread = None

    def connect(self):
        log.debug("Connecting to %(host)s:%(port)d" % self.settings)
        self.factory = _WebSocketClientFactory(
            "ws://%(host)s:%(port)d" % self.settings,
            debug=True)
        self.factory.base_client = self
        c = connectWS(self.factory)
        # run the reactor in a daemon thread so the main program stays free
        self._reactor_thread = threading.Thread(target=reactor.run,
                                                args=(False,))
        self._reactor_thread.daemon = True
        self._reactor_thread.start()

    def send_message(self, body):
        if not self._check_connection():
            return
        log.debug("Queueing send")
        self._send_queue.put(body)
        # callFromThread is the only safe way to poke the reactor
        # from outside its own thread
        reactor.callFromThread(self._dispatch)

    def _check_connection(self):
        if not self._connected_event.wait(timeout=10):
            log.error("Unable to connect to server")
            self.close()
            return False
        return True

    def _dispatch(self):
        log.debug("Dispatching")
        while True:
            try:
                body = self._send_queue.get(block=False)
            except Queue.Empty:
                break
            self.factory.protocol_instance.sendMessage(body)

    def close(self):
        reactor.callFromThread(reactor.stop)
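A hypothetical usage sketch of the wrapper above (host, port, and message are illustrative, not from the original post):

client = BaseWBClient({'host': 'localhost', 'port': 9000})
client.connect()
# safe to call from any thread: the message is queued and handed to the
# reactor thread via reactor.callFromThread
client.send_message("hello")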
