Using Mongo Motor with async create_task - python

I want to submit data to MongoDB in a non-blocking way. Other questions on this site recommend either:
- phrasing this as a single-producer, single-consumer problem with queue-based communication, or
- using Motor with asyncio and create_task.
All examples in the Motor documentation use asyncio.get_event_loop() and run_until_complete(). However, that is a blocking way of running coroutines. I would like to use the non-blocking create_task instead.
Here's what I've tried:
import asyncio
import pprint

import motor.motor_asyncio

client = motor.motor_asyncio.AsyncIOMotorClient(my_login_url)

async def do_find_one():
    doc = client.my_db.my_collection.find_one()
    pprint.pprint(doc)

async def main():
    task = asyncio.create_task(do_find_one())
    print("other things")
    await task
    print("done")

asyncio.run(main())
However, instead of a MongoDB entry, task just prints a pending Future which I don't know how to get the value of. How am I supposed to be using motor with create_task?
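No answer is recorded here, but a minimal sketch of the presumable fix follows, assuming the missing piece is simply that find_one() itself has to be awaited inside the coroutine (create_task only schedules the outer coroutine; my_login_url is the placeholder from the question):

import asyncio
import pprint

import motor.motor_asyncio

client = motor.motor_asyncio.AsyncIOMotorClient(my_login_url)

async def do_find_one():
    # find_one() returns an awaitable; without the await you only ever
    # see a pending future instead of the document
    doc = await client.my_db.my_collection.find_one()
    pprint.pprint(doc)

async def main():
    # create_task schedules do_find_one() on the running loop without blocking
    task = asyncio.create_task(do_find_one())
    print("other things")  # runs while the query is still in flight
    await task             # suspend main() until the query has finished
    print("done")

asyncio.run(main())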

Related

How to use python asyncio to asynchronously dispatch tasks?

I am looking to implement a simple p2p file downloader in Python. The downloader needs to work within the following limitations:
- When queried, the tracker responds with exactly 2 peers.
- The tracker ignores requests made within 2 seconds of the previous request.
The downloader should attempt to download a single file as fast as possible. From my reading it sounds like I should be using asyncio. I thought I'd structure the code something like this pseudocode:
async def downloader(peer):
    while file is not downloaded:
        download a new block from the peer without blocking

response = synchronously query tracker for initial pair of peers
newPeers = extractPeers(response)
while file not downloaded:
    for peer in newPeers:
        dispatch new downloader(peer)
    wait 2 seconds without blocking downloaders
    response = query tracker for pair of peers without blocking downloaders
    newPeers = extractPeers(response)
What asyncio methods should I be using to "dispatch" new downloaders and query the tracker? From my testing it seems that awaiting an async function blocks the event loop:
import asyncio
from random import randint

async def myfunc(i):
    print("hello", i)
    await asyncio.sleep(randint(1, 3))
    print("world", i)

async def main():
    await myfunc(1)
    await myfunc(2)
    await myfunc(3)
    await myfunc(4)

asyncio.run(main())
This code runs each myfunc call as though it were defined without async. I'd appreciate anything, including basic pointers.
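No answer is attached here, but for what it's worth, a minimal sketch of the concurrency the test code is missing (not part of the original post): each await in main() runs to completion before the next starts, so the coroutines have to be scheduled together, for example with asyncio.gather or asyncio.create_task.

import asyncio
from random import randint

async def myfunc(i):
    print("hello", i)
    await asyncio.sleep(randint(1, 3))
    print("world", i)

async def main():
    # Schedule all four coroutines at once; gather suspends main() until
    # every one of them has finished, but their sleeps overlap.
    await asyncio.gather(myfunc(1), myfunc(2), myfunc(3), myfunc(4))

asyncio.run(main())

The same idea maps onto the downloader pseudocode: asyncio.create_task(downloader(peer)) dispatches a downloader without waiting for it, and await asyncio.sleep(2) between tracker queries yields to the running downloaders instead of blocking them.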

Python Process blocking the rest of application

I have a program that basically does two things: it opens a websocket and stays listening for messages, and it starts a video stream in a forever loop.
I was trying to use multiprocessing to manage both things, but one piece stops the other from running.
The app is
if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(start_client())

async def start_client():
    async with WSClient() as client:
        pass

class WSClient:
    async def __aenter__(self):
        async with websockets.connect(url, max_size=None) as websocket:
            self.websocket = websocket
            await self.on_open()  # it goes
            p = Process(target=init(self))  # This is the streaming method
            p.start()
            async for message in websocket:
                await on_message(message, websocket)  # listen for websocket messages
            return self
The init method is:
def init(ws):
    logging.info('Firmware Version: ' + getVersion())
    startStreaming(ws)
    return
startStreaming basically has an infinite loop in it.
In this configuration the stream starts, but the websocket's on_message is never called because the Process call freezes the rest of the application.
How can I run both methods?
Thanks
In your code, you're telling multiprocessing.Process to take the function returned by init and call it in a new process. What you want is for the process to call init itself (with an argument). Here's how you can do that:
p = Process(target=init, args=(self,))
I have to note that you're passing an asyncio websocket object to your init function. This will likely break, as asyncio objects generally aren't meant to be shared between two threads, let alone two processes. Unless you're somehow recreating the websocket object in the new process and making a new loop there too, what you're actually looking for is how to create an asyncio task.
Assuming startStreaming is already an async function, you should change the init function to this:
async def init(ws):  # note the async
    logging.info('Firmware Version: ' + getVersion())
    await startStreaming(ws)  # note the await
    return
and change the line creating and starting the process to this:
asyncio.create_task(init(self))
This will run your startStreaming function in a new task while you also read incoming messages at (basically) the same time.
Also, I'm not sure what you're trying to do with the async context manager, as everything could just live in a normal async function. If you're interested in using one for learning purposes, I'd suggest checking out contextlib.asynccontextmanager and putting your message-reading code inside the async with statement in start_client rather than inside __aenter__.
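Roughly, that restructuring could look something like the sketch below; url, init and on_message are the names from the question, and init is assumed to be the async version shown above, so treat this as an outline rather than a drop-in replacement:

import asyncio
from contextlib import asynccontextmanager

import websockets

@asynccontextmanager
async def ws_client(url):
    # yields a connected websocket; the connection is closed when the block exits
    async with websockets.connect(url, max_size=None) as websocket:
        yield websocket

async def start_client():
    async with ws_client(url) as websocket:
        # run the streaming loop as a task so it doesn't block message reading
        stream_task = asyncio.create_task(init(websocket))
        async for message in websocket:
            await on_message(message, websocket)
        stream_task.cancel()

asyncio.run(start_client())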

Starting two CPU-blocking listeners and waiting until one of them finishes

I am trying to listen simultaneously for a Telegram or a Discord message, whichever comes in first. For Telegram I'm using Telethon:
async def listentg():
    tgclient = TelegramClient('anon', conntg.tg_api_id, conntg.tg_api_hash)

    @tgclient.on(events.NewMessage(chats=conntg.canaltg, pattern=patternmatch))
    async def tg_event_handler(event):
        print("Telegram message listened")
        await tgclient.disconnect()

    await tgclient.start()
    await tgclient.run_until_disconnected()
And for Discord I'm using Discum
async def listendc():
    dcclient = discum.Client(token=conndc.tokendc, log=False)

    @dcclient.gateway.command
    def dc_event_handler(resp):
        if resp.event.message:
            print("Discord message listened")
            dcclient.gateway.close()

    dcclient.gateway.run()
I understand that to run CPU-blocking code simultaneously I have to use Python processes (https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.loop.run_in_executor), but I don't know how to do this and just wait for the first one to return a value.
You don't really need to run them in parallel; it is easier to have them run concurrently (but still waiting simultaneously). You can use Python's thread-based parallelism instead.
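A minimal sketch of that thread-based approach, assuming listentg and listendc stay as written in the question and each one stops itself from inside its own handler (the other listener keeps running in its thread until it disconnects, since a thread can't be killed from outside):

import asyncio
import concurrent.futures

def wait_for_first_message():
    # Each listener runs in its own thread with its own event loop via asyncio.run;
    # concurrent.futures.wait returns as soon as either of them finishes.
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=2)
    futures = [
        pool.submit(asyncio.run, listentg()),
        pool.submit(asyncio.run, listendc()),
    ]
    done, _pending = concurrent.futures.wait(
        futures, return_when=concurrent.futures.FIRST_COMPLETED)
    return next(iter(done)).result()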

ZMQ dealer does not receive message with asyncio

I'm setting up a listener for Hyperledger Sawtooth events with a pyzmq dealer socket and the provided asyncio functionality. Currently futures are returned but only sometimes finished, even though messages are sent to the Socket.
Weirdly this works for the connection message (only when sleeping before it as shown below) but not for event messages. I implemented this already with JavaScript and it works without problems. It seems that the issue does not lie with Sawtooth but rather in pyzmq's implementation of asyncio functionality or in my code.
class EventListener:
    def __init__(self):
        ...
        ctx = Context.instance()
        self._socket = ctx.socket(zmq.DEALER)
        self._socket.connect("tcp://127.0.0.1:4004")

    async def subscribe(self):
        ...
        await self._socket.send_multipart([connection_msg])

    async def receive(self):
        # Necessary wait, otherwise the future is never finished
        await asyncio.sleep(0.1)
        resp = await self._socket.recv_multipart()
        handle_response(resp)

    async def listen(self):
        while True:
            # here sleep is not helping
            # await asyncio.sleep(0.1)
            # following await is never finished
            resp = await self._socket.recv_multipart()
            handle_response(resp)

...
listener = listener.EventListener()
await asyncio.gather(
    listener.receive(), listener.subscribe())
await asyncio.create_task(listener.listen())
...
Debugging shows that the returned Future object is never changed from a pending to a finished state. So, is my code incorrect, do I need to await messages differently or is it possible that something is wrong with pyzmq's asyncio functionality? Also, why do I need to sleep in receive(), isn't that why we have asyncio?
There are several questions here, so this answer may not address all of them. Hopefully it will at least help others looking for a way to set up event listeners.
The Hyperledger Sawtooth python SDK provides option for clients to subscribe to the events. The SDK part of code that does what you're trying to do can be found at https://github.com/hyperledger/sawtooth-sdk-python/blob/master/sawtooth_sdk/messaging/stream.py
The example code to use the Hyperledger Sawtooth python SDK for event subscription can be found here https://github.com/danintel/sawtooth-cookiejar/blob/master/events/events_client.py
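One thing worth checking, purely as an assumption based on the snippet in the question rather than anything confirmed above: asyncio.gather starts receive() and subscribe() concurrently, so without the artificial sleep the recv_multipart() call may be awaited before the subscription request has actually gone out. Awaiting the coroutines in a fixed order sidesteps that, along the lines of:

listener = EventListener()
await listener.subscribe()   # send the subscription request first
await listener.receive()     # then read the subscription response
await listener.listen()      # and only then enter the endless event loop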

How can I send a message down a websocket running in a thread from tkinter?

I have a server script running:
async def give_time(websocket, path):
    while True:
        await websocket.send(str(datetime.datetime.now()))
        await asyncio.sleep(3)

start_server = websockets.serve(give_time, '192.168.1.32', 8765)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
which is working fine, just sending the current time every 3 seconds.
I can receive that string from the client which runs this code:
async def hello():
    # takes whatever text comes through the websocket and displays it in the socket_text label
    async with websockets.connect('ws://wilsons.lan:8765') as ws:
        while True:
            text = await ws.recv()
            logger.info('message received through websocket: {}'.format(text))
            socket_text.configure(text=text)  # socket_text is a tkinter object

loop = asyncio.new_event_loop()

def socketstuff():
    asyncio.set_event_loop(loop)
    asyncio.get_event_loop().run_until_complete(hello())

t = threading.Thread(target=socketstuff, daemon=True)
It runs in a thread so that I can run tkinter.mainloop in the main thread. This is the first time I ever used threading so I may be getting it wrong, but it seems to work at present.
What I need to do, is to be able to send a message down the websocket based on a tkinter event - currently just clicking a button next to a text box, but eventually more complex things. The clicking part works fine.
I'm having a lot of trouble with the sending of the message. I've tried a lot of different things, with and without async and await, although that may have just been panic.
The main problem seems to be that I can't access ws from outside the hello() function. This makes sense, as I'm using the with context manager. However, if I just use ws = websockets.connect('ws://host') then I get a websockets.py35.client.Connect object which, when I try to use its send (or indeed recv) methods, gives me an object has no attribute 'send' error.
I hope this is enough info - will happily post anything else required!
It turns out that the best way of solving this is not to use threads.
This post helped me solve it. It shows that you can run, in a coroutine, one iteration of the tkinter mainloop:
async def do_gui(root, interval=0.05):
    while True:
        root.update()
        await asyncio.sleep(interval)
However, the best way of getting a tkinter event to spawn a websocket message is to use an asyncio.Queue. Having the tkinter callback add an item to the queue with put_nowait(), and having a coroutine that runs alongside do_gui and uses await queue.get() to pull messages off the queue, works for me.
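A minimal sketch of that queue pattern, assuming everything now runs on a single asyncio loop in the main thread; the entry widget and Send button are illustrative placeholders, while the ws URL is the one from the question:

import asyncio
import tkinter as tk

import websockets

async def do_gui(root, interval=0.05):
    # run one tkinter mainloop iteration per pass, as described above
    while True:
        root.update()
        await asyncio.sleep(interval)

async def sender(queue):
    # pulls messages queued by the GUI callback and pushes them down the websocket
    async with websockets.connect('ws://wilsons.lan:8765') as ws:
        while True:
            message = await queue.get()
            await ws.send(message)

async def main():
    queue = asyncio.Queue()  # created inside the running loop
    root = tk.Tk()
    entry = tk.Entry(root)
    entry.pack()
    # the tkinter callback only queues the text; the sender coroutine does the I/O
    tk.Button(root, text="Send",
              command=lambda: queue.put_nowait(entry.get())).pack()
    await asyncio.gather(do_gui(root), sender(queue))

asyncio.run(main())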
