I would like to know how I can order the execution of a program like this:
import asyncio

async def multi_coro():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('5'))
        print('3-')
        tg.create_task(coro('6'))
        print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro())
        print('1-')
        tg.create_task(coro('7'))
        print('2-')
    print('8-')

asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
asyncio.run(main())
My goal is to get this return:
1-
2-
3-
4-
5
6
7
5
6
7
8-
But the return is:
1-
2-
3-
4-
7
5
6
7
5
6
8-
Using asyncio.gather(), the result is the same as with TaskGroup:
async def gather():
    print('3-')
    await asyncio.gather(*[coro('4'), coro('5')])
    print('6-')
So I would like to know what I can do to run the task group in multi_coro before the second task in main() is called.
Edit:
import asyncio

async def another_coro(i):
    print(i)
    await asyncio.sleep(.1)

async def coro(i, tg):
    if i == 1:
        tg.create_task(another_coro(i * 10))
        tg.create_task(another_coro(i * 100))
    else:
        print(i)
        await asyncio.sleep(.1)

async def main():
    async with asyncio.TaskGroup() as tg:
        for i in range(0, 3):
            tg.create_task(coro(i, tg))

asyncio.run(main())
The printed order is 0 => 2 => 10 => 100.
But I would like a way to get 0 => 10 => 100 => ... No matter what the rest of the sequence is, what is sought is having 10 and 100 printed directly after 0.
The thing there is that your first task, calling multi_coro, won't itself run until your code allows the asyncio loop to run inside your main code. Once it is awaited, as the TaskGroup it creates is exited before the coroutine finishes, the sub-coros it creates will also run.
Inserting an await asyncio.sleep(0) before creating the coro that will print 7 would work, but that would not be deterministic: the flow passes to the event loop, which will take the opportunity to step through existing tasks, but that behavior is implementation dependent.
Exiting the task group and entering another one, on the other hand, will await all existing tasks created in that group, and is reliable behavior:
import asyncio

async def multi_coro():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('5'))
        print('3-')
        tg.create_task(coro('6'))
        print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro())
        print('1-')
    # here we end a task group, forcing the created tasks
    # to run up to completion.
    async with asyncio.TaskGroup() as tg:
        tg.create_task(coro('7'))
        print('2-')
    print('8-')
This won't print your desired order. It is deterministically "1- (sync) 3- 4- (when the first group exits and multi_coro runs) 5 6 5 6 (both tasks in the first group, alternating as they hit the sleep(.1)) 2- (sync) 7 7 (new task) 8- (sync)", but understanding this, you have the keys to enforce the particular order you need.
Keep in mind a task group is a normal object you can just pass along as a parameter. So, instead of creating a new task group inside multi_coro, you can pass it tg as an argument, and the loop will await any tasks created there along with the others:
import asyncio

async def multi_coro(tg):
    tg.create_task(coro('5'))
    print('3-')
    tg.create_task(coro('6'))
    print('4-')

async def coro(i):
    print(i)
    await asyncio.sleep(.1)
    print(i)

async def main():
    async with asyncio.TaskGroup() as tg:
        tg.create_task(multi_coro(tg))
        tg.create_task(coro(7))
    async with asyncio.TaskGroup() as tg:
        print('1-')
        tg.create_task(coro('8'))
        print('2-')
    print('9-')

asyncio.run(main())
printing:
3-
4-
7
5
6
7
5
6
1-
2-
8
8
9-
I was experimenting with some basic constructs in python asyncio. I came across the following scenarios:
Snippet 1
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    await n
    print("CCDD...")
    await t

asyncio.run(main())
Snippet 2
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    await n
    await t
    print("CCDD...")

asyncio.run(main())
Snippet 3
import asyncio

async def A():
    await asyncio.sleep(10)
    print("A...")

async def B():
    await asyncio.sleep(15)
    print("B...")

async def main():
    t = asyncio.create_task(A())
    n = asyncio.create_task(B())
    print("CCDD...")
    await n
    await t

asyncio.run(main())
I find it difficult to understand why the output produced by the first two snippets is the same, while the output produced by the last snippet differs from the first two.
Output of snippets 1, 2
A...
B...
CCDD...
Output of snippet 3
CCDD...
A...
B...
It's all a matter of thinking about the sequencing. First off, n/15/B is always the 15-second task and t/10/A is the 10-second one. For all snippets, that means A will be printed before B, as you start them at roughly the same time.
In snippet 1, you start them both then wait for the 15-second task, meaning that both will finish before main prints CCDD (waiting for the 10-second task after that but it's already finished). Hence you see A B CCDD.
In snippet 2, you wait for both the 15-second and the 10-second task to finish before main prints CCDD, again resulting in A B CCDD.
In snippet 3, you start both tasks then immediately print CCDD before waiting for them both. This gives you CCDD A B.
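To make the overlap concrete, here is a scaled-down sketch of snippet 1 (0.10 s and 0.15 s in place of 10 s and 15 s, my substitution) that also times the whole run:

```python
import asyncio
import time

async def A():
    await asyncio.sleep(0.10)
    print("A...")

async def B():
    await asyncio.sleep(0.15)
    print("B...")

async def main():
    start = time.monotonic()
    t = asyncio.create_task(A())  # both tasks are scheduled here
    n = asyncio.create_task(B())  # and start running at the first await
    await n                       # snippet 1 ordering: wait for the longer task
    print("CCDD...")
    await t                       # the shorter task has already finished
    # The sleeps overlap, so the total is ~0.15 s, not 0.10 s + 0.15 s.
    assert time.monotonic() - start < 0.22

asyncio.run(main())
```

The assertion passing shows the two sleeps really ran concurrently, which is why A and B always print before CCDD in this ordering.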
I have several http requests to fire simultaneously. I am trying to use async for to do this.
import asyncio

async def ticker(delay, to):
    for i in range(to):
        yield i
        print(i)
        await asyncio.sleep(delay)  # instead of aiohttp request
        print(i, ' fin')

async def main():
    async for x in ticker(1, 2):
        pass

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
I keep getting sequential calls with the following output:
0
0 fin
1
1 fin
Instead I need the output as shown below:
0
1
0 fin
1 fin
Could you please advise me on how to do this?
The problem is that async for is exactly what you don't need.
async for is designed to iterate while waiting for a task to complete between each iteration; you want to iterate (starting requests) without waiting for the previous task(s) to finish.
You'll want something like
import asyncio

async def do_request():
    await asyncio.sleep(1)  # stand-in for the actual request

async def main():
    await asyncio.gather(*[
        do_request() for i in range(10)
    ])

asyncio.run(main())
Comment with a follow-up if that doesn't answer your question.
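For completeness, here is a self-contained sketch of that pattern with results collected; fetch is a hypothetical stand-in for the real aiohttp call, not part of the question's code:

```python
import asyncio

async def fetch(i):
    # Hypothetical stand-in for an aiohttp request: sleep, then return a value.
    await asyncio.sleep(0.1)
    return i * 2

async def main():
    # All ten "requests" start immediately; gather preserves input order.
    return await asyncio.gather(*(fetch(i) for i in range(10)))

results = asyncio.run(main())
print(results)  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Because all the coroutines are handed to gather at once, the ten 0.1-second waits overlap instead of running one after another.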
I've been using asyncio for a bit but I'm still fairly unfamiliar with it. My current issue is that while trying to wait for a response from a function with asyncio, the waiting (while loop) blocks the function from happening. Here is the code that sums up the problem:
import asyncio

response = 0

async def handle(x):
    await asyncio.sleep(0.1)
    return x

async def run():
    global response
    for number in range(1, 21):
        response = await handle(number)
        print(response)
        if response == 10:
            await wait_for_next(response)

async def wait_for_next(x):
    while response == x:
        print('waiting', response, x)
        await asyncio.sleep(0.5)
    print('done')

tasks = [run()]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
wait_for_next is supposed to wait for the next response, but the while loop blocks the run() function. How could I stop this happening? Should I be using loop.run_in_executor, and if so, how?
(There were a couple of other examples of this I could find, but they were very specific and I didn't understand if our problems/solutions would be the same.)
As already noted, the loop gets stuck because await wait_for_next(response) blocks the execution flow until that coroutine finishes.
If you want some of your coroutines to start without blocking the execution flow, you can run them as an asyncio.Task (more about tasks) using the ensure_future function:
import asyncio

response = 0

async def handle(x):
    await asyncio.sleep(0.1)
    return x

async def run():
    global response
    for number in range(1, 21):
        response = await handle(number)
        print(response)
        if response == 10:
            # run wait_for_next "in background" instead of blocking flow:
            asyncio.ensure_future(wait_for_next(response))

async def wait_for_next(x):
    while response == x:
        print('waiting', response, x)
        await asyncio.sleep(0.5)
    print('done')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run())
Output:
1
2
3
4
5
6
7
8
9
10
waiting 10 10
11
12
13
14
done
15
16
17
18
19
20
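On Python 3.7+, asyncio.create_task is the preferred spelling of ensure_future for coroutines. A minimal sketch of the same fire-and-forget idea (background is an illustrative name of my own, not from the question):

```python
import asyncio

async def background(x):
    # Hypothetical work running "in background".
    await asyncio.sleep(0.1)
    return x * 2

async def main():
    # create_task schedules background() without blocking this coroutine.
    task = asyncio.create_task(background(21))
    print("not blocked")
    result = await task  # await later, only when the result is needed
    print(result)

asyncio.run(main())
```

The task starts running as soon as the current coroutine yields to the event loop; awaiting it later just collects the result.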
I was wondering how I could use asyncio to handle tasks similar to what nodeJS does. I want to run tasks at the same time without opening threads.
Example:
import asyncio

@asyncio.coroutine
def my_coroutine(task_name, seconds_to_sleep=3):
    print('{0} sleeping for: {1} seconds'.format(task_name, seconds_to_sleep))
    yield from asyncio.sleep(seconds_to_sleep)
    print('{0} is finished'.format(task_name))

loop = asyncio.get_event_loop()
tasks = [
    my_coroutine('task1', 4),
    my_coroutine('task2', 3),
    my_coroutine('task3', 2)]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()
will output:
task1 sleeping for: 4 seconds
task2 sleeping for: 3 seconds
task3 sleeping for: 2 seconds
task3 is finished
task2 is finished
task1 is finished
but when I try to do it with a different task it won't work like that.
import asyncio
import timeit

@asyncio.coroutine
def my_coroutine(task_name):
    print('order placed for ' + task_name)
    print(timeit.timeit('1 + 3 ', number=50000000))
    print(task_name + ' done')

loop = asyncio.get_event_loop()
tasks = [
    my_coroutine('task1'),
    my_coroutine('task2'),
    my_coroutine('task3')]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()
outputs
order placed for task2
0.6677237730912453
task2 done
order placed for task1
0.6627442526498016
task1 done
order placed for task3
0.665618849882418
task3 done
asyncio doesn't run things in parallel. It runs one task until it awaits, then moves on to the next. The sleeps in your first example are what make the tasks yield control to each other. Your second example doesn't await anything, so each task runs until completion before the event loop can give control to another task.
If you add something awaitable (e.g., asyncio.sleep) into your coroutine, each one will yield control and give the others a chance to run.
@asyncio.coroutine
def my_coroutine(task_name):
    print('order placed for ' + task_name)
    yield from asyncio.sleep(0)  # Another coroutine will resume here.
    print(timeit.timeit('1 + 3 ', number=50000000))
    yield from asyncio.sleep(0)  # Another coroutine will resume here.
    print(task_name + ' done')
The asyncio documentation says the following, so asyncio tasks run concurrently but not in parallel:
asyncio is a library to write concurrent code using the async/await
syntax.
And, @asyncio.coroutine has been deprecated since Python 3.8 and removed in Python 3.11, so instead you should use async def, as shown below:
# @asyncio.coroutine
async def test():
    print("Test")
And for example, with this code below:
import asyncio

async def test1():
    for _ in range(0, 3):
        print("Test1")

async def test2():
    for _ in range(0, 3):
        print("Test2")

async def test3():
    for _ in range(0, 3):
        print("Test3")

async def call_tests():
    await asyncio.gather(test1(), test2(), test3())

asyncio.run(call_tests())
test1(), test2() and test3() are run serially as shown below:
Test1 # 0 second
Test1 # 0 second
Test1 # 0 second
Test2 # 0 second
Test2 # 0 second
Test2 # 0 second
Test3 # 0 second
Test3 # 0 second
Test3 # 0 second
And, if using await asyncio.sleep(1) in them as shown below:
import asyncio

async def test1():
    for _ in range(0, 3):
        print("Test1")
        await asyncio.sleep(1)  # Here

async def test2():
    for _ in range(0, 3):
        print("Test2")
        await asyncio.sleep(1)  # Here

async def test3():
    for _ in range(0, 3):
        print("Test3")
        await asyncio.sleep(1)  # Here

async def call_tests():
    await asyncio.gather(test1(), test2(), test3())

asyncio.run(call_tests())
They are run alternately sleeping 1 second each time as shown below:
Test1 # 1 second
Test2 # 1 second
Test3 # 1 second
Test1 # 2 seconds
Test2 # 2 seconds
Test3 # 2 seconds
Test1 # 3 seconds
Test2 # 3 seconds
Test3 # 3 seconds
And, if using await asyncio.sleep(0) in them as shown below:
import asyncio

async def test1():
    for _ in range(0, 3):
        print("Test1")
        await asyncio.sleep(0)  # Here

async def test2():
    for _ in range(0, 3):
        print("Test2")
        await asyncio.sleep(0)  # Here

async def test3():
    for _ in range(0, 3):
        print("Test3")
        await asyncio.sleep(0)  # Here

async def call_tests():
    await asyncio.gather(test1(), test2(), test3())

asyncio.run(call_tests())
They are run alternately without sleeping as shown below:
Test1 # 0 second
Test2 # 0 second
Test3 # 0 second
Test1 # 0 second
Test2 # 0 second
Test3 # 0 second
Test1 # 0 second
Test2 # 0 second
Test3 # 0 second
I'm quite new in this python asyncio topic. I have a simple question:
I have a task containing two coroutines to be run concurrently. The first coroutine (my_coroutine) just prints something continuously until seconds_to_sleep is reached. The second coroutine (seq_coroutine) calls 4 other coroutines sequentially, one after the other. My goal is to stop the loop at the end of seq_coroutine, whenever it is completely finished. To be exact, I want my_coroutine to stay alive only until seq_coroutine is finished. Can someone help me with that?
My code is like this:
import asyncio

async def my_coroutine(task, seconds_to_sleep=3):
    print("{task_name} started\n".format(task_name=task))
    for i in range(1, seconds_to_sleep):
        await asyncio.sleep(1)
        print("\n{task_name}: second {seconds}\n".format(task_name=task, seconds=i))

async def coroutine1():
    print("coroutine 1 started")
    await asyncio.sleep(1)
    print("coroutine 1 finished\n")

async def coroutine2():
    print("coroutine 2 started")
    await asyncio.sleep(1)
    print("coroutine 2 finished\n")

async def coroutine3():
    print("coroutine 3 started")
    await asyncio.sleep(1)
    print("coroutine 3 finished\n")

async def coroutine4():
    print("coroutine 4 started")
    await asyncio.sleep(1)
    print("coroutine 4 finished\n")

async def seq_coroutine():
    await coroutine1()
    await coroutine2()
    await coroutine3()
    await coroutine4()

def main():
    main_loop = asyncio.get_event_loop()
    task = [asyncio.ensure_future(my_coroutine("task1", 11)),
            asyncio.ensure_future(seq_coroutine())]
    try:
        print('loop is started\n')
        main_loop.run_until_complete(asyncio.gather(*task))
    finally:
        print('loop is closed')
        main_loop.close()

if __name__ == "__main__":
    main()
This is the output of this program:
loop is started
task1 started
coroutine 1 started
task1: second 1
coroutine 1 finished
coroutine 2 started
task1: second 2
coroutine 2 finished
coroutine 3 started
task1: second 3
coroutine 3 finished
coroutine 4 started
task1: second 4
coroutine 4 finished
task1: second 5
task1: second 6
task1: second 7
task1: second 8
task1: second 9
task1: second 10
loop is closed
I only want to have something like this:
loop is started
task1 started
coroutine 1 started
task1: second 1
coroutine 1 finished
coroutine 2 started
task1: second 2
coroutine 2 finished
coroutine 3 started
task1: second 3
coroutine 3 finished
coroutine 4 started
task1: second 4
coroutine 4 finished
loop is closed
I just found a suitable solution for my problem.
I won't remove my post and I'll post my solution so that it may help others who face the same question.
I used asyncio.wait(task, return_when=asyncio.FIRST_COMPLETED), which returns as soon as the first task finishes.
This is the solution:
import asyncio
from asyncio.tasks import FIRST_COMPLETED
from concurrent.futures import CancelledError

async def my_coroutine(task, seconds_to_sleep=3):
    print("{task_name} started\n".format(task_name=task))
    for i in range(1, seconds_to_sleep):
        await asyncio.sleep(1)
        print("\n{task_name}: second {seconds}\n".format(task_name=task, seconds=i))

async def coroutine1():
    print("coroutine 1 started")
    await asyncio.sleep(1)
    print("coroutine 1 finished\n")

async def coroutine2():
    print("coroutine 2 started")
    await asyncio.sleep(1)
    print("coroutine 2 finished\n")

async def coroutine3():
    print("coroutine 3 started")
    await asyncio.sleep(1)
    print("coroutine 3 finished\n")

async def coroutine4():
    print("coroutine 4 started")
    await asyncio.sleep(1)
    print("coroutine 4 finished\n")

async def seq_coroutine(loop):
    await coroutine1()
    await coroutine2()
    await coroutine3()
    await coroutine4()

def main():
    main_loop = asyncio.get_event_loop()
    task = [asyncio.ensure_future(my_coroutine("task1", 11)),
            asyncio.ensure_future(seq_coroutine(main_loop))]
    try:
        print('loop is started\n')
        done, pending = main_loop.run_until_complete(
            asyncio.wait(task, return_when=asyncio.FIRST_COMPLETED))
        print("Completed tasks: {completed}\nPending tasks: {pending}".format(completed=done, pending=pending))
        # canceling the tasks
        for task in pending:
            print("Cancelling {task}: {task_cancel}".format(task=task, task_cancel=task.cancel()))
    except CancelledError as e:
        print("Error happened while canceling the task: {e}".format(e=e))
    finally:
        print('loop is closed')

if __name__ == "__main__":
    main()
You can use a variable to signal to another coroutine. asyncio.Event is usually used:
import asyncio
import random

async def clock(name, event):
    print("* {} started".format(name))
    i = 0
    while not event.is_set():
        await asyncio.sleep(0.1)
        i += 1
        print("* {}: {}".format(name, i))
    print("* {} done".format(name))
    return i

async def coro(x):
    print("coro() started", x)
    await asyncio.sleep(random.uniform(0.2, 0.5))
    print("coro() finished", x)

async def seq_coroutine(name):
    event = asyncio.Event()
    clock_task = asyncio.ensure_future(clock(name, event))
    # await asyncio.sleep(0)  # if you want to give a chance to clock() to start
    await coro(1)
    await coro(2)
    await coro(3)
    await coro(4)
    event.set()
    i = await clock_task
    print("Got:", i)

def main():
    main_loop = asyncio.get_event_loop()
    main_loop.run_until_complete(seq_coroutine("foo"))
    main_loop.close()

if __name__ == "__main__":
    main()
You can also use await event.wait() to block a piece of code until the event is set:
async def xxx(event):
    print("xxx started")
    await event.wait()
    print("xxx ended")
Here's another way to do the same thing, which I think is cleaner in representing the dependence between jobs:
import asyncio

async def poll():
    i = 0
    while True:
        print("First", i)
        i += 1
        await asyncio.sleep(20)
        print("Second", i)
        i += 1
        await asyncio.sleep(20)

async def stop():
    poller = asyncio.ensure_future(poll())
    await asyncio.sleep(5)
    poller.cancel()

main_loop = asyncio.get_event_loop()
main_loop.run_until_complete(stop())
main_loop.close()
Basically, instead of breaking the entire event loop on a single job ending and then cancelling the job there, we just cancel the dependent job directly when the parent job finishes.
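If the cancelled job needs to run cleanup code, it can catch asyncio.CancelledError before the task dies. A sketch with shortened delays of my own choosing (0.1 s ticks, cancelled after 0.25 s):

```python
import asyncio

async def poll():
    try:
        i = 0
        while True:
            print("tick", i)
            i += 1
            await asyncio.sleep(0.1)
    except asyncio.CancelledError:
        print("poller cleaned up")
        raise  # re-raise so the task is actually marked cancelled

async def stop():
    poller = asyncio.create_task(poll())
    await asyncio.sleep(0.25)
    poller.cancel()
    try:
        await poller  # wait until the cancellation has been processed
    except asyncio.CancelledError:
        pass

asyncio.run(stop())
```

Awaiting the cancelled task (and swallowing the expected CancelledError) guarantees the cleanup finished before the parent moves on, rather than leaving the cancellation to happen at some later loop iteration.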