I was experimenting with some basic constructs in Python's asyncio and came across the following scenarios:
Snippet 1
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    await n
    print("CCDD...")
    await t

asyncio.run(main())
Snippet 2
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    await n
    await t
    print("CCDD...")

asyncio.run(main())
Snippet 3
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    print("CCDD...")
    await n
    await t

asyncio.run(main())
I find it difficult to understand why the first two snippets produce the same output while the last snippet's output is different.
Output of snippets 1, 2
A...
B...
CCDD...
Output of snippet 3
CCDD...
A...
B...
It's all a matter of thinking about the sequencing. First off, n/15/B is always the 15-second task and t/10/A is the 10-second one. For all snippets, that means A will be printed before B, since you start both tasks at roughly the same time.
In snippet 1, you start them both and then wait for the 15-second task, so both will have finished before main prints CCDD (awaiting the 10-second task after that returns immediately because it has already finished). Hence you see A B CCDD.
In snippet 2, you wait for both the 15-second and the 10-second task to finish before main prints CCDD, again resulting in A B CCDD.
In snippet 3, you start both tasks and then immediately print CCDD before waiting for either of them. This gives you CCDD A B.
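To make that sequencing visible, here is a minimal sketch of my own (not part of the original snippets) that shortens the delays and stamps each print with the elapsed time; it shows that both tasks start as soon as main first awaits, and that only the position of the print relative to the awaits changes:

import asyncio
import time

START = time.monotonic()

def log(msg):
    # print the message together with the time elapsed since startup
    print(f"{time.monotonic() - START:4.1f}s  {msg}")

async def A():
    await asyncio.sleep(1.0)   # stands in for the 10-second sleep
    log("A...")

async def B():
    await asyncio.sleep(1.5)   # stands in for the 15-second sleep
    log("B...")

async def main():
    t = asyncio.create_task(A())   # both tasks are scheduled here and start
    n = asyncio.create_task(B())   # running at the first await below
    await n                        # snippet 1: wait for the longer task
    log("CCDD...")
    await t                        # already finished, returns immediately

asyncio.run(main())

# Expected output, matching snippets 1 and 2:
# 1.0s  A...
# 1.5s  B...
# 1.5s  CCDD...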
Related
I would like to know how I can order a program like this:
import asyncio

async def multi_coro():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('5'))
        print('3-')
        tg.create_task(coro('6'))
        print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro())
        print('1-')
        tg.create_task(coro('7'))
        print('2-')
    print('8-')

asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
asyncio.run(main())
My goal is to get this output:
1-
2-
3-
4-
5
6
7
5
6
7
8-
But the actual output is:
1-
2-
3-
4-
7
5
6
7
5
6
8-
Using asyncio.gather() instead, the result is the same as with TaskGroup:
async def gather():
    print('3-')
    await asyncio.gather(*[coro('4'), coro('5')])
    print('6-')
So I would like to know what I can do to make the task group in multi_coro run before the second task in main() is called.
Edit:
import asyncio

async def another_coro(i):
    print(i)
    await asyncio.sleep(.1)

async def coro(i, tg):
    if i == 1:
        tg.create_task(another_coro(i * 10))
        tg.create_task(another_coro(i * 100))
    else:
        print(i)
        await asyncio.sleep(.1)

async def main():
    async with asyncio.TaskGroup() as tg:
        for i in range(0, 3):
            tg.create_task(coro(i, tg))

asyncio.run(main())
This prints 0 => 2 => 10 => 100.
But I would like a way to get 0 => 10 => 100 => ... No matter what the rest of the sequence looks like, what I'm after is having 10 and 100 come directly after 0.
The thing there is that your first task, the one calling multi_coro, won't run until your code allows the asyncio loop to run inside your main code. Once it is awaited, and since the TaskGroup it creates is exited before the coroutine finishes, the sub-coroutines it creates will also run.
Inserting an await asyncio.sleep(0) before creating the coro that will print 7 would work, but that would not be deterministic: the flow passes to the event loop, which will take the opportunity to step through existing tasks, but that behavior is implementation-dependent.
Exiting the task group and entering another, on the other hand, will await all existing tasks created in that group, and that is reliable behavior:
import asyncio

async def multi_coro():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('5'))
        print('3-')
        tg.create_task(coro('6'))
        print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro())
        print('1-')
    # here we end a task group, forcing the created tasks
    # to run up to completion.
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('7'))
        print('2-')
    print('8-')
This won't print your desired order - it is deterministically "1- (sync) 3- 4- 5 6 5 6 (both tasks in the first group, alternating as they hit the sleep(.1)) 2- (sync) 7 7 (the new task) 8- (sync)" - but with this understanding you have the keys to enforce the particular order you need.
Keep in mind a task group is a normal object you can just pass along as a parameter - so, instead of creating a new task group inside multi_coro, you can pass tg to it as an argument, and the group will await any tasks created there along with the others:
import asyncio

async def multi_coro(tg):
    tg.create_task(coro('5'))
    print('3-')
    tg.create_task(coro('6'))
    print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro(tg))
        tg.create_task(coro(7))
    async with asyncio.TaskGroup() as tg:
        print('1-')
        tg.create_task(coro('8'))
        print('2-')
    print('9-')

asyncio.run(main())
printing:
3-
4-
7
5
6
7
5
6
1-
2-
8
8
9-
I get this warning:
D:\pythonstuff\demo.py:28: DeprecationWarning: The explicit passing of coroutine objects to asyncio.wait() is deprecated since Python 3.8, and scheduled for removal in Python 3.11.
await asyncio.wait([
Waited 1 second!
Waited 5 second!
Time passed: 0hour:0min:5sec
Process finished with exit code 0
When I run the code:
import asyncio
import time

class class1():
    async def function_inside_class(self):
        await asyncio.sleep(1)
        print("Waited 1 second!")

    async def function_inside_class2(self):
        await asyncio.sleep(5)
        print("Waited 5 second!")

def tic():
    global _start_time
    _start_time = time.time()

def tac():
    t_sec = round(time.time() - _start_time)
    (t_min, t_sec) = divmod(t_sec, 60)
    (t_hour, t_min) = divmod(t_min, 60)
    print('Time passed: {}hour:{}min:{}sec'.format(t_hour, t_min, t_sec))

object = class1()

async def main():
    tic()
    await asyncio.wait([
        object.function_inside_class(),
        object.function_inside_class2()
    ])
    tac()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()
Are there any good alternatives to asyncio.wait? I don't want a warning in the console every time I launch my application.
Edit: I don't want to just hide the warning, that's bad practice; I'm looking for other ways to do the same or a similar thing, not another async library that restores the old behavior.
You can just call it this way, as recommended in the docs here.
Example from the docs:
async def foo():
    return 42

task = asyncio.create_task(foo())
done, pending = await asyncio.wait({task})
So your code would become:
await asyncio.wait([
    asyncio.create_task(object.function_inside_class()),
    asyncio.create_task(object.function_inside_class2())
])
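If you don't need the (done, pending) sets that asyncio.wait returns, a common alternative is asyncio.gather, which takes the coroutines directly and emits no deprecation warning. A minimal sketch of main rewritten that way, reusing the same class1, tic and tac from the question:

async def main():
    tic()
    # gather wraps each coroutine in a task and waits for all of them
    await asyncio.gather(
        object.function_inside_class(),
        object.function_inside_class2(),
    )
    tac()

# asyncio.run creates and closes the event loop for you, replacing the
# get_event_loop() / run_until_complete() / close() sequence
asyncio.run(main())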
I have two async tasks.
update_x updates x every 2 seconds, and take_action detects the change in x and prints it.
If I remove await asyncio.sleep(1) in take_action, the value of x is never printed; the program gets stuck in the while loop in take_action. Can someone explain why asyncio.sleep(1) is important here?
Below is the simplified code. In reality, update_x() subscribes to a URL that publishes data, and take_action detects a change in some value in that data. Is there a better way to design this?
import asyncio

x = 0

def print_x():
    print("x is {}".format(x))

async def take_action():
    y = x
    while True:
        if y != x:
            y = x
            print_x()
        await asyncio.sleep(1)

async def update_x():
    global x
    while True:
        x += 1
        await asyncio.sleep(2)

async def main(loop):
    print('creating task')
    task1 = loop.create_task(update_x())
    task2 = loop.create_task(take_action())
    await task1
    await task2

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))
    loop.close()
Can someone explain why asyncio.sleep(1) is important here?
Because without it you are left with a loop that doesn't await anything, and therefore never yields control to the event loop (and thereby to other coroutines). sleep helps because it suspends the current coroutine, giving the event loop a chance to execute other runnable tasks.
In other words, async/await is based on cooperative multitasking, and a non-awaiting loop doesn't cooperate.
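As a small illustration of that cooperation (my own sketch, not part of the original answer), even await asyncio.sleep(0) is enough to hand control back to the event loop once per iteration, so the polling loop no longer starves the updating task:

import asyncio

async def watcher(state):
    last = state["x"]
    while True:
        if state["x"] != last:
            last = state["x"]
            print("x is now", last)
        # zero-second sleep: no real delay, but it yields to the event
        # loop so other coroutines (like the updater) get to run
        await asyncio.sleep(0)

async def updater(state):
    for _ in range(3):
        await asyncio.sleep(0.5)
        state["x"] += 1

async def main():
    state = {"x": 0}
    task = asyncio.create_task(watcher(state))
    await updater(state)
    task.cancel()   # stop the infinite watcher once the updater is done

asyncio.run(main())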
Is there a better way to design this?
The way to write that code without introducing arbitrary sleeps is by using a synchronization device. asyncio.Event is simple and a good match for this use case:
async def take_action(evt):
    y = x
    while True:
        if y != x:
            y = x
            print_x()
        await evt.wait()  # wait for x to change
        evt.clear()

async def update_x(evt):
    global x
    while True:
        x += 1
        evt.set()  # notify take_action that we've changed x
        await asyncio.sleep(2)

async def main(loop):
    evt = asyncio.Event()
    task1 = loop.create_task(update_x(evt))
    task2 = loop.create_task(take_action(evt))
    await task1
    await task2
Another popular approach is to use a queue to transfer objects between two (or more) coroutines.
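For completeness, here is a minimal sketch of that queue variant (my own, not from the original answer) using asyncio.Queue: the producer puts each new value on the queue and the consumer just awaits the next item, so there is no polling and no manual signalling:

import asyncio

async def update_x(queue):
    x = 0
    while x < 5:
        x += 1
        await queue.put(x)     # hand the new value to the consumer
        await asyncio.sleep(2)
    await queue.put(None)      # sentinel: no more values

async def take_action(queue):
    while True:
        x = await queue.get()  # suspends until a value is available
        if x is None:
            break
        print("x is {}".format(x))

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(update_x(queue), take_action(queue))

asyncio.run(main())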
I have several HTTP requests to fire simultaneously, and I am trying to use async for to do this.
import asyncio

async def ticker(delay, to):
    for i in range(to):
        yield i
        print(i)
        await asyncio.sleep(delay)  # instead of aiohttp request
        print(i, ' fin')

async def main():
    async for x in ticker(1, 2):
        pass

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
But the calls keep running one after another, producing the following:
0
0 fin
1
1 fin
Instead I need the output as shown below:
0
1
0 fin
1 fin
Could you please advise me on how to do this?
The problem is that async for is exactly what you don't need.
async for is designed to iterate while waiting for a task to complete between each iteration; you want to iterate (starting requests) without waiting for the previous task(s) to finish.
You'll want something like
async def do_request():
    await asyncio.sleep(1)

async def main():
    await asyncio.gather(*[
        do_request() for i in range(10)
    ])
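Since the question mentions aiohttp, here is a hedged sketch of what the same gather pattern looks like with real requests; the URL is only a placeholder:

import asyncio
import aiohttp

async def do_request(session, url):
    # each request runs concurrently once gather schedules them
    async with session.get(url) as resp:
        return await resp.text()

async def main():
    urls = ["https://example.com/"] * 10   # placeholder URLs
    async with aiohttp.ClientSession() as session:
        pages = await asyncio.gather(*[do_request(session, u) for u in urls])
        print(len(pages), "responses received")

asyncio.run(main())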
Comment with a follow-up if that doesn't answer your question.
I have two functions: the first, def_a, is an asynchronous function, and the second, def_b, is a regular function that is registered as a callback on the result of def_a with add_done_callback.
My code looks like this:
import asyncio

def def_b(result):
    next_number = result.result()
    # some work on the next_number
    print(next_number + 1)

async def def_a(number):
    await some_async_work(number)
    return number + 1

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(def_a(1))
task.add_done_callback(def_b)
response = loop.run_until_complete(task)
loop.close()
And it works perfectly.
The problem began when the second function, def_b, also became asynchronous. Now it looks like this:
async def def_b(result):
    next_number = result.result()
    # some asynchronous work on the next_number
    print(next_number + 1)
But now I cannot pass it to add_done_callback, because it's not a regular function.
My question is: is it possible, and if so how, to provide def_b to add_done_callback when def_b is asynchronous?
add_done_callback is considered a "low level" interface. When working with coroutines, you can chain them in many ways, for example:
import asyncio

async def my_callback(result):
    print("my_callback got:", result)
    return "My return value is ignored"

async def coro(number):
    await asyncio.sleep(number)
    return number + 1

async def add_success_callback(fut, callback):
    result = await fut
    await callback(result)
    return result

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(coro(1))
task = add_success_callback(task, my_callback)
response = loop.run_until_complete(task)
print("response:", response)
loop.close()
Keep in mind add_done_callback will still call the callback if your future raises an exception (but calling result.result() will raise it).
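If you want the chaining helper above to cover that case as well, here is a small sketch of my own (a hypothetical chain_callback helper, not from the original answer) that forwards either the result or the exception to the async callback, mirroring what a plain done callback sees when it calls result.result():

async def chain_callback(fut, callback):
    # forward the outcome of fut to the async callback,
    # whether it succeeds or raises
    try:
        result = await fut
    except Exception as exc:
        await callback(None, exc)
        raise
    await callback(result, None)
    return result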
This only works for one future job; if you have multiple async jobs, they will block each other. A better way is to use asyncio.as_completed() to iterate over the future list:
import asyncio

async def __after_done_callback(future_result):
    # await for something...
    pass

async def __future_job(number):
    await some_async_work(number)
    return number + 1

async def main():
    tasks = [asyncio.ensure_future(__future_job(x)) for x in range(100)]  # create 100 future jobs
    for f in asyncio.as_completed(tasks):
        result = await f
        await __after_done_callback(result)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()
You can try the aiodag library. It's a very lightweight wrapper around asyncio that abstracts away some of the async plumbing you usually have to think about. From this example you won't be able to tell that things are running asynchronously, since it's just one task that depends on another, but it is all running async.
import asyncio
from aiodag import task

@task
async def def_b(result):
    # some asynchronous work on the next_number
    print(result + 1)

@task
async def def_a(number):
    await asyncio.sleep(number)
    return number + 1

async def main():
    a = def_a(1)
    b = def_b(a)  # this makes task b depend on task a
    return await b

loop = asyncio.get_event_loop()
asyncio.set_event_loop(loop)
response = loop.run_until_complete(main())