I've been using asyncio for a bit but I'm still fairly unfamiliar with it. My current issue is that while trying to wait for a response from a function with asyncio, the waiting (while loop) blocks the function from happening. Here is the code that sums up the problem:
import asyncio

response = 0

async def handle(x):
    await asyncio.sleep(0.1)
    return x

async def run():
    global response
    for number in range(1, 21):
        response = await handle(number)
        print(response)
        if response == 10:
            await wait_for_next(response)

async def wait_for_next(x):
    while response == x:
        print('waiting', response, x)
        await asyncio.sleep(0.5)
    print('done')

tasks = [run()]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
wait_for_next is supposed to wait for the next response, but the while loop blocks the run() function. How can I stop this from happening? Should I be using loop.run_in_executor, and if so, how?
(There were a couple of other examples of this I could find, but they were very specific, and I couldn't tell whether our problems/solutions were the same.)
As already noted, the loop gets stuck because await wait_for_next(response) blocks the execution flow until that coroutine finishes.
If you want a coroutine to start without blocking the execution flow, run it as an asyncio.Task (more about tasks) using the ensure_future function:
import asyncio

response = 0

async def handle(x):
    await asyncio.sleep(0.1)
    return x

async def run():
    global response
    for number in range(1, 21):
        response = await handle(number)
        print(response)
        if response == 10:
            # run wait_for_next "in background" instead of blocking flow:
            asyncio.ensure_future(wait_for_next(response))

async def wait_for_next(x):
    while response == x:
        print('waiting', response, x)
        await asyncio.sleep(0.5)
    print('done')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
Output:
1
2
3
4
5
6
7
8
9
10
waiting 10 10
11
12
13
14
done
15
16
17
18
19
20
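Note that on Python 3.7 and newer the same pattern is usually written with asyncio.create_task and asyncio.run. A minimal sketch of the equivalent (same handle and wait_for_next as above; only the scheduling calls change):

import asyncio

response = 0

async def handle(x):
    await asyncio.sleep(0.1)
    return x

async def wait_for_next(x):
    while response == x:
        print('waiting', response, x)
        await asyncio.sleep(0.5)
    print('done')

async def run():
    global response
    for number in range(1, 21):
        response = await handle(number)
        print(response)
        if response == 10:
            # create_task schedules the coroutine in the background
            asyncio.create_task(wait_for_next(response))

asyncio.run(run())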
I can't figure out how to stop the loop after one task is finished. In the sample, when WsServe counts to 5, I expect the loop to close. But instead of stopping I get RuntimeError: Cannot close a running event loop
#!/usr/bin/env python
import asyncio

async def rxer():
    i = 0
    while True:
        i += 1
        print('Rxer ', i)
        await asyncio.sleep(1)

async def WsServe():
    for i in range(5):
        print('WsServe', i)
        await asyncio.sleep(1)
    print('Finish')
    loop.stop()
    loop.close()

loop = asyncio.get_event_loop()
loop.create_task(rxer())
loop.run_until_complete(WsServe())
loop.run_forever()
The error comes from calling loop.close() from inside the loop. You don't need to bother with loop.close(); loop.stop() is quite sufficient to stop the loop. loop.close() is only relevant when you want to ensure that all the resources internally acquired by the loop are released. It is not needed when your process is about to exit anyway, and removing the call to loop.close() indeed eliminates the error.
But also, loop.stop() is incompatible with run_until_complete(). It happens to work in this code because the coroutine returns immediately after calling loop.stop(); if you added e.g. an await asyncio.sleep(1) after loop.stop(), you'd again get a (different) RuntimeError.
To avoid such issues, I suggest that you migrate to the newer asyncio.run API and avoid both run_until_complete and stop. Instead, you can just use an event to terminate the main function, and the loop with it:
# rxer() defined as before

async def WsServe(stop_event):
    for i in range(5):
        print('WsServe', i)
        await asyncio.sleep(1)
    print('Finish')
    stop_event.set()
    await asyncio.sleep(1)

async def main():
    asyncio.get_event_loop().create_task(rxer())
    stop_event = asyncio.Event()
    asyncio.get_event_loop().create_task(WsServe(stop_event))
    await stop_event.wait()

asyncio.run(main())

# python 3.6 and older:
#asyncio.get_event_loop().run_until_complete(main())
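Note that when main() returns, asyncio.run cancels the still-pending rxer() task before closing the loop, so the infinite loop doesn't keep the process alive.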
Check the commented-out lines in this version of your implementation:
import asyncio

async def rxer():
    i = 0
    while True:
        i += 1
        print('Rxer ', i)
        await asyncio.sleep(1)

async def WsServe():
    for i in range(5):
        print('WsServe', i)
        await asyncio.sleep(1)
    print('Finish')
    #loop.stop()
    #loop.close()

loop = asyncio.get_event_loop()
loop.create_task(rxer())
loop.run_until_complete(WsServe())
#loop.run_forever()
And here is the output:
Rxer 1
WsServe 0
Rxer 2
WsServe 1
Rxer 3
WsServe 2
Rxer 4
WsServe 3
Rxer 5
WsServe 4
Rxer 6
Finish
I'm trying to leave threads and start using async. I tried to write something simple so I can get more comfortable with async; for some reason my async code isn't acting async.
I rewrote the same code in threads and it worked fast and concurrently, unlike the async code.
Normal code
import time
import random

def display(x: int) -> None:
    time.sleep(random.randint(1, 8))
    print(x)

def main():
    for i in range(10):
        display(i)

if __name__ == '__main__':
    main()
Output
0
1
2
3
4
5
6
7
8
9
Async code
import time
import random
import asyncio

async def display(x: int) -> None:
    await asyncio.sleep(random.randint(1, 8))
    print(x)

async def main():
    for i in range(10):
        await display(i)

if __name__ == '__main__':
    event_loop = asyncio.get_event_loop()
    event_loop.run_until_complete(main())
    event_loop.close()
Output
0
1
2
3
4
5
6
7
8
9
Thread code
import time
import random
import threading

def display(x: int) -> None:
    time.sleep(random.randint(1, 8))
    print(x)

def main():
    threads = []
    for i in range(10):
        t = threading.Thread(target=display, args=[i])
        threads.append(t)
        t.start()
    for t in threads:
        t.join()

if __name__ == '__main__':
    main()
Output
5
9
3
0
4
2
1
8
6
7
display(i) calls display with argument i, which returns an awaitable (a coroutine object). await then immediately waits for it, blocking the loop right there until it finishes.
If you want to schedule them all together and then wait at the end, you need to collect the awaitables in a list and then wait for all of them at once.
import time
import random
import asyncio

async def display(x: int) -> None:
    await asyncio.sleep(random.randint(1, 8)/10)
    print(x)

async def main():
    awaitables = []
    for i in range(10):
        awaitables.append(display(i))
    await asyncio.wait(awaitables)

if __name__ == '__main__':
    event_loop = asyncio.get_event_loop()
    event_loop.run_until_complete(main())
    event_loop.close()
And yes, because otherwise someone will surely point it out, you can also write that in a list comprehension:
async def main():
    await asyncio.wait([display(i) for i in range(10)])
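One caveat worth noting: on newer Python versions (deprecated since 3.8, a TypeError since 3.11), asyncio.wait no longer accepts bare coroutines, only tasks. asyncio.gather still accepts coroutines directly, so an equivalent that keeps working (reusing display from the example above):

async def main():
    # gather wraps each coroutine in a task and runs them concurrently
    await asyncio.gather(*(display(i) for i in range(10)))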
Further note:
I'm sure you are aware of it, but I think it is important to mention anyway: the code runs asynchronously, but not in parallel. Running multiple computation-heavy functions with asyncio or threading.Thread still executes them on a single CPU core without any speedup, because the GIL lets the interpreter run Python bytecode in only one thread at a time.
I have several HTTP requests to fire simultaneously. I am trying to use async for to do this.
import asyncio

async def ticker(delay, to):
    for i in range(to):
        yield i
        print(i)
        await asyncio.sleep(delay)  # instead of aiohttp request
        print(i, ' fin')

async def main():
    async for x in ticker(1, 2):
        pass

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
I keep getting sequential calls with the following output:
0
0 fin
1
1 fin
Instead I need the output as shown below:
0
1
0 fin
1 fin
Could you please advise me on how to do this?
The problem is that async for is exactly what you don't need.
async for is designed to iterate while waiting for a task to complete between each iteration; you want to iterate (starting requests) without waiting for the previous task(s) to finish.
You'll want something like
import asyncio

async def do_request():
    await asyncio.sleep(1)  # stands in for the real aiohttp request

async def main():
    # fire all requests at once and wait for them together
    await asyncio.gather(*[
        do_request() for i in range(10)
    ])

asyncio.run(main())
Comment with a follow-up if that doesn't answer your question.
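For instance, applied to the ticker example from the question, a minimal sketch (the loop body is pulled out of the async generator into a plain coroutine, hypothetically named tick here) that produces the desired interleaving:

import asyncio

async def tick(delay, i):
    print(i)
    await asyncio.sleep(delay)  # instead of aiohttp request
    print(i, ' fin')

async def main():
    # start both "requests" concurrently instead of driving them with async for
    await asyncio.gather(*(tick(1, i) for i in range(2)))

asyncio.run(main())

This prints 0 and 1 immediately, then 0 fin and 1 fin after one second, because both sleeps run concurrently.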
I have 2 functions: the first, def_a, is an asynchronous function, and the second, def_b, is a regular function that is registered via add_done_callback and called with the result of def_a.
My code looks like this:
import asyncio

def def_b(result):
    next_number = result.result()
    # some work on the next_number
    print(next_number + 1)

async def def_a(number):
    await some_async_work(number)
    return number + 1

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(def_a(1))
task.add_done_callback(def_b)
response = loop.run_until_complete(task)
loop.close()
And it works perfectly.
The problem began when the second function, def_b, also became asynchronous. Now it looks like this:
async def def_b(result):
    next_number = result.result()
    # some asynchronous work on the next_number
    print(next_number + 1)
But now I cannot pass it to the add_done_callback function, because it's not a regular function.
My question is: is it possible to pass def_b to add_done_callback if def_b is asynchronous, and if so, how?
add_done_callback is considered a "low level" interface. When working with coroutines, you can chain them in many ways, for example:
import asyncio

async def my_callback(result):
    print("my_callback got:", result)
    return "My return value is ignored"

async def coro(number):
    await asyncio.sleep(number)
    return number + 1

async def add_success_callback(fut, callback):
    result = await fut
    await callback(result)
    return result

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(coro(1))
task = add_success_callback(task, my_callback)
response = loop.run_until_complete(task)
print("response:", response)
loop.close()
Keep in mind add_done_callback will still call the callback if your future raises an exception (but calling result.result() will raise it).
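A small self-contained illustration of that behavior (boom and on_done are made-up names for the demo): the callback still fires when the task fails, and calling result() inside it re-raises the exception.

import asyncio

def on_done(fut):
    try:
        print("result:", fut.result())  # re-raises the task's exception
    except ValueError as exc:
        print("task failed:", exc)

async def boom():
    raise ValueError("no luck")

async def main():
    task = asyncio.ensure_future(boom())
    task.add_done_callback(on_done)
    try:
        await task
    except ValueError:
        pass  # already reported by the callback

asyncio.run(main())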
This only works for one future job; if you have multiple async jobs, they will block each other. A better way is to use asyncio.as_completed() to iterate over the future list:
import asyncio

async def __after_done_callback(future_result):
    # await for something...
    pass

async def __future_job(number):
    await some_async_work(number)
    return number + 1

async def main():
    tasks = [asyncio.ensure_future(__future_job(x)) for x in range(100)]  # create 100 future jobs
    for f in asyncio.as_completed(tasks):
        result = await f  # await is only legal inside a coroutine
        await __after_done_callback(result)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()
You can try the aiodag library. It's a very lightweight wrapper around asyncio that abstracts away some of the async plumbing that you usually have to think about. From this example you won't be able to tell that things are running asynchronously since it's just 1 task that depends on another, but it is all running async.
import asyncio
from aiodag import task

@task
async def def_b(result):
    # some asynchronous work on the next_number
    print(result + 1)

@task
async def def_a(number):
    await asyncio.sleep(number)
    return number + 1

async def main():
    a = def_a(1)
    b = def_b(a)  # this makes task b depend on task a
    return await b

loop = asyncio.get_event_loop()
asyncio.set_event_loop(loop)
response = loop.run_until_complete(main())
I have a program with one producer and two slow consumers, and I'd like to rewrite it with coroutines in such a way that each consumer handles only the latest value produced for it (i.e. skips values generated while it was processing the old one). (I used threads and threading.Queue(), but it blocks on put() because the queue is full most of the time.)
After reading answer to this question I decided to use asyncio.Event and asyncio.Queue. I wrote this prototype program:
import asyncio

async def l(event, q):
    h = 1
    while True:
        # ready
        event.set()
        # get value to process
        a = await q.get()
        # process it
        print(a * h)
        h *= 2

async def m(event, q):
    i = 1
    while True:
        # pass element to consumer, when it's ready
        if event.is_set():
            await q.put(i)
            event.clear()
        # produce value
        i += 1

el = asyncio.get_event_loop()
ev = asyncio.Event()
qu = asyncio.Queue(2)
tasks = [
    asyncio.ensure_future(l(ev, qu)),
    asyncio.ensure_future(m(ev, qu))
]
el.run_until_complete(asyncio.gather(*tasks))
el.close()
and I noticed that the l coroutine blocks on the q.get() line and never prints anything.
It works as I expect after adding asyncio.sleep() in both coroutines (I get 1, 11, 21, ...):
import asyncio
import time

async def l(event, q):
    h = 1
    a = 1
    event.set()
    while True:
        # await asyncio.sleep(1)
        a = await q.get()
        # process it
        await asyncio.sleep(1)
        print(a * h)
        event.set()

async def m(event, q):
    i = 1
    while True:
        # pass element to consumer, when it's ready
        if event.is_set():
            await q.put(i)
            event.clear()
        await asyncio.sleep(0.1)
        # produce value
        i += 1

el = asyncio.get_event_loop()
ev = asyncio.Event()
qu = asyncio.Queue(2)
tasks = [
    asyncio.ensure_future(l(ev, qu)),
    asyncio.ensure_future(m(ev, qu))
]
el.run_until_complete(asyncio.gather(*tasks))
el.close()
...but I'm looking for a solution without it.
Why is this happening? How can I fix it? I don't think I can just await l() from m, as both of them have state (in the original program the first draws the solution with PyGame and the second plots results).
The code is not working as expected because the task running the m function is never suspended. The task keeps incrementing i whenever event.is_set() is False, and because it never yields control, the task running function l never gets a chance to run. You therefore need a way to suspend the task running function m; awaiting another coroutine is one way to do this, which is why adding asyncio.sleep makes it work as expected.
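Before the full solution below, a minimal sketch of the idea: forcing a suspension point with asyncio.sleep(0), which yields control to the event loop without adding a real delay, already unblocks the consumer:

async def m(event, q):
    i = 1
    while True:
        # pass element to consumer, when it's ready
        if event.is_set():
            await q.put(i)
            event.clear()
        i += 1
        await asyncio.sleep(0)  # yield to the event loop so l() gets a turn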
I think the following code will work as you expect. The LeakyQueue ensures that only the last value from the producer is processed by the consumer. With no extra delay, producer and consumer run in lockstep and the consumer consumes every value; if you increase the delay argument, you can simulate a slow consumer that only processes the latest value created by the producer.
import asyncio

class LeakyQueue(asyncio.Queue):
    async def put(self, item):
        if self.full():
            await self.get()  # drop the oldest item to make room
        await super().put(item)

async def consumer(queue, delay=0):
    h = 1
    while True:
        a = await queue.get()
        if delay:
            await asyncio.sleep(delay)
        print('consumer', a)
        h += 2

async def producer(queue):
    i = 1
    while True:
        # wrapping put in a task forces a suspension point on every
        # iteration, so the consumer task gets scheduled
        await asyncio.ensure_future(queue.put(i))
        print('producer', i)
        i += 1

loop = asyncio.get_event_loop()
queue = LeakyQueue(maxsize=1)
tasks = [
    asyncio.ensure_future(consumer(queue, 0)),
    asyncio.ensure_future(producer(queue))
]
loop.run_until_complete(asyncio.gather(*tasks))