Yield from coroutine vs yield from task
In this link, there is an example given by @dano:
import asyncio

@asyncio.coroutine
def test1():
    print("in test1")

@asyncio.coroutine
def dummy():
    yield from asyncio.sleep(1)
    print("dummy ran")

@asyncio.coroutine
def main():
    test1()
    yield from dummy()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
The output is only
dummy ran
I can't add a comment to that directly, so I have to ask a new question here:
(1) Why isn't test1() executed in order in such a coroutine function?
Can a coroutine only be used in the following two ways?
yield from cor()
asyncio.async(cor())
Where did test1() go?
(2) I also have some problems understanding the difference between the following two ways of using a coroutine function. Are they the same?
yield from asyncio.async(cor())
asyncio.async(cor())
I use the following code to illustrate:
import random
import datetime
global a, b, c
import asyncio

a = [random.randint(0, 1 << 256) for i in range(500000)]
b = list(a)
c = list(a)

@asyncio.coroutine
def test1():
    global b
    b.sort(reverse=True)
    print("in test1")

@asyncio.coroutine
def test2():
    global c
    c.sort(reverse=True)
    print("in test2")

@asyncio.coroutine
def dummy():
    yield from asyncio.sleep(1)
    print("dummy ran")

@asyncio.coroutine
def test_cor():
    for i in asyncio.sleep(1):
        yield i

@asyncio.coroutine
def main():
    test1()
    print("hhhh_______________________")
    asyncio.async(test1())
    asyncio.async(test2())
    print("hhhh_______________________")
    print("hhh")
    asyncio.async(dummy())
    yield from test_cor()

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
print("hhhhhhh")
However, the output is:
hhhh_______________________
hhhh_______________________
hhh
in test1
in test2
hhhhhhh
It didn't even execute the dummy() function!
And then I used the following test2():
@asyncio.coroutine
def test2():
    # global c
    # c.sort(reverse=True)
    print("in test2")
(3) Without the sorting, I think test2 should run faster, so "in test1" should be printed after "in test2". However, the output didn't change. I don't know why.
(4) I also tried removing the sorting from both test1() and test2(); then, surprisingly, dummy() ran and produced the following output. Why??
hhhh_______________________
hhhh_______________________
hhh
in test1
in test2
dummy ran
hhhhhhh
I don't know how these things happen... I am really bewildered.
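For reference, the behaviour behind question (1) is easier to see with a minimal sketch (mine, not from the linked answer), written in modern async/await syntax; asyncio.ensure_future() is the current name for the asyncio.async() used above:

import asyncio

async def cor():
    print("cor ran")

async def main():
    cor()                                # only creates a coroutine object; it never runs
                                         # (Python even warns "coroutine 'cor' was never awaited")
    await cor()                          # runs cor() and waits for it to finish
    task = asyncio.ensure_future(cor())  # schedules cor() on the event loop as a Task
    await task                           # and here we wait for that Task to finish

asyncio.run(main())

So a bare test1() call inside main() builds the coroutine/generator object and immediately throws it away; nothing ever drives it, which is why "in test1" never appears in the first example.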
Related
I'm trying to run 2 async functions test1() and test2() with loop.run_until_complete() alternately in Python as shown below:
import asyncio

async def test1():
    for _ in range(3):
        print("Test1")
        await asyncio.sleep(1)

async def test2():
    for _ in range(3):
        print("Test2")
        await asyncio.sleep(1)

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(test1()) # Here
loop.run_until_complete(test2()) # Here
But as shown below, they don't run alternately with loop.run_until_complete():
Test1
Test1
Test1
Test2
Test2
Test2
I know that if I use loop.run_forever() with loop.create_task() as shown below:
import asyncio

async def test1(loop):
    for _ in range(3):
        print("Test1")
        await asyncio.sleep(1)
    loop.stop() # Extra code to stop "loop.run_forever()"

async def test2(loop):
    for _ in range(3):
        print("Test2")
        await asyncio.sleep(1)
    loop.stop() # Extra code to stop "loop.run_forever()"

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(test1(loop)) # Here
loop.create_task(test2(loop)) # Here
loop.run_forever() # Here
I can run them alternately as shown below, but loop.run_forever() runs forever, so the extra loop.stop() call is needed to stop it, which is troublesome. In addition, I know that asyncio.gather() can also run them alternately, but it needs await, which I don't want:
Test1
Test2
Test1
Test2
Test1
Test2
So, how can I run them with loop.run_until_complete() alternately?
If you don't insist on loop.run_until_complete, you can achieve what you want by using asyncio.gather, like so:
import asyncio

async def test1():
    for _ in range(3):
        print("Test1")
        await asyncio.sleep(1)

async def test2():
    for _ in range(3):
        print("Test2")
        await asyncio.sleep(1)

async def main():
    tasks = [test1(), test2()]
    new_items = await asyncio.gather(*tasks)
    return new_items

if __name__ == '__main__':
    results = asyncio.run(main())
and the results would be as you'd expect:
==> python3 stack_overflow.py
Test1
Test2
Test1
Test2
Test1
Test2
You can run the 2 async functions test1() and test2() alternately by calling an intermediate async function call_tests() with loop.run_until_complete(). In call_tests(), you need asyncio.get_running_loop() and loop.create_task(), and await is needed for the last loop.create_task() so they run alternately, as shown below:
import asyncio

async def test1():
    for _ in range(3):
        print("Test1")
        await asyncio.sleep(1)

async def test2():
    for _ in range(3):
        print("Test2")
        await asyncio.sleep(1)

async def call_tests(): # Here
    loop = asyncio.get_running_loop() # Here
    loop.create_task(test1()) # Here
    await loop.create_task(test2()) # "await" is needed

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(call_tests()) # Call "call_tests()"
Finally, you can run them alternately as shown below:
Test1
Test2
Test1
Test2
Test1
Test2
Be careful: if you use await for both the first and the last loop.create_task() as shown below:
# ...
async def call_tests():
    loop = asyncio.get_running_loop()
    await loop.create_task(test1()) # Here
    await loop.create_task(test2()) # Here
# ...
You cannot run them alternately as shown below:
Test1
Test1
Test1
Test2
Test2
Test2
And if you use await only for the first loop.create_task() as shown below:
# ...
async def call_tests():
    loop = asyncio.get_running_loop()
    await loop.create_task(test1()) # Here
    loop.create_task(test2())
# ...
You cannot run them alternately, and the last loop.create_task() exits without completing, as shown below:
Test1
Test1
Test1
Test2
And if you don't use await for either the first or the last loop.create_task() as shown below:
# ...
async def call_tests():
    loop = asyncio.get_running_loop()
    loop.create_task(test1()) # No "await"
    loop.create_task(test2()) # No "await"
# ...
You can run them alternately, but both the first and the last loop.create_task() exit without completing, as shown below:
Test1
Test2
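If you want them to run alternately and also both complete, one variation (my sketch, not part of the answer above) is to create both tasks first and then await each of them:

# ...
async def call_tests():
    loop = asyncio.get_running_loop()
    task1 = loop.create_task(test1()) # schedule both tasks first ...
    task2 = loop.create_task(test2())
    await task1                       # ... then wait for each to finish
    await task2
# ...

Because both tasks are scheduled before the first await, they run alternately, and run_until_complete() only returns after both have finished.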
In addition, if you use asyncio.gather(), call_tests() is not needed, as shown below:
import asyncio

async def test1():
    for _ in range(3):
        print("Test1")
        await asyncio.sleep(1)

async def test2():
    for _ in range(3):
        print("Test2")
        await asyncio.sleep(1)

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(asyncio.gather(test1(), test2()))
Then, you can run them alternately as shown below:
Test1
Test2
Test1
Test2
Test1
Test2
I'm facing this problem with an async function.
my_class.py
class cl:
    async def foo():
        return

    async def bar(foo):
        # some code here
        return result
main.py
from my_class import cl
import asyncio
c = cl()
r = asyncio.run(c.foo)
x = cl.bar(r)
s = asyncio.run(x)
How can I pass the return value of foo() to the function bar()? Right now I get this error:
ValueError: The future belongs to a different loop than the one specified as the loop argument
THANKS!!
Futures in Python imply that the result will be available in the future, as the asynchronous function has not "finished" executing yet. I assume you want to wait for foo to "finish" running in bar before using its value. This should do it:
import asyncio

class cl:
    async def foo(self):
        return

    async def bar(self):
        value = await self.foo() # waits for foo, then gets its value
        # some code here
        return result

c = cl()
x = c.bar()
s = asyncio.run(x)
Note: I changed some minor syntax so this code snippet can execute as-is
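If you would rather keep foo() and bar() as separate steps, with bar() taking the value as a parameter as in your original code, here is a sketch (the return values and bar's signature are placeholders of my own): do both calls inside one coroutine and use a single asyncio.run(), so everything shares one event loop.

import asyncio

class cl:
    async def foo(self):
        return 42                  # placeholder return value

    async def bar(self, value):    # assumed signature: bar takes foo's result
        result = value + 1         # placeholder for "some code here"
        return result

async def main():
    c = cl()
    r = await c.foo()              # get foo's return value
    s = await c.bar(r)             # pass it on to bar
    return s

print(asyncio.run(main()))         # one asyncio.run(), so only one event loop

Each asyncio.run() call creates its own event loop, which is the usual way to end up with objects that "belong to a different loop".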
I want to speed up some API requests... for that I tried to figure out how to do it and copied some code that runs, but when I try my own code it's no longer asynchronous. Maybe someone can spot the mistake?
Copied code (I guess from Stack Overflow):
#!/usr/bin/env python3
import asyncio

@asyncio.coroutine
def func_normal():
    print('A')
    yield from asyncio.sleep(5)
    print('B')
    return 'saad'

@asyncio.coroutine
def func_infinite():
    for i in range(10):
        print("--%d" % i)
    return 'saad2'

loop = asyncio.get_event_loop()
tasks = func_normal(), func_infinite()
a, b = loop.run_until_complete(asyncio.gather(*tasks))
print("func_normal()={a}, func_infinite()={b}".format(**vars()))
loop.close()
My "own" code (I need at the end a list returned and merge the results of all functions):
import asyncio
import time

@asyncio.coroutine
def say_after(start, count, say, yep=True):
    retl = []
    if yep:
        time.sleep(5)
    for x in range(start, count):
        retl.append(x)
    print(say)
    return retl

def main():
    print(f"started at {time.strftime('%X')}")
    loop = asyncio.get_event_loop()
    tasks = say_after(10, 20, "a"), say_after(20, 30, "b", False)
    a, b = loop.run_until_complete(asyncio.gather(*tasks))
    print("func_normal()={a}, func_infinite()={b}".format(**vars()))
    loop.close()
    c = a + b
    #print(c)
    print(f"finished at {time.strftime('%X')}")

main()
Or am I completely wrong and should I solve this with multithreading? What would be the best way for API requests that return a list I need to merge?
I added a comment to each section that needs improvement and removed some parts to simplify the code.
In fact, I didn't find any performance uplift from wrapping range() in a coroutine and using async def; it might be worth it with heavier operations.
import asyncio
import time

# @asyncio.coroutine IS DEPRECATED since Python 3.8
@asyncio.coroutine
def say_after(wait=True):
    result = []
    if wait:
        print("I'm sleeping!")
        time.sleep(5)
        print("'morning!")
    # This BLOCKs the thread, but releases the GIL so other threads can run.
    # But asyncio runs in ONE thread, so this still harms concurrency.
    # A normal for loop is a BLOCKING operation.
    for i in range(5):
        result.append(i)
        print(i, end='')
    print()
    return result

def main():
    start = time.time()
    # The loop argument will be DEPRECATED from Python 3.10.
    # Make main() a coroutine, then use asyncio.run(main()).
    # It will run in the asyncio event loop, without explicitly passing the loop.
    loop = asyncio.get_event_loop()
    tasks = say_after(), say_after(False)
    # As we will use asyncio.run(main()) from now on, this should be await-ed.
    a, b = loop.run_until_complete(asyncio.gather(*tasks))
    print(f"Took {time.time() - start:5f}")
    loop.close()

main()
Better way:
import asyncio
import time

async def say_after(wait=True):
    result = []
    if wait:
        print("I'm sleeping!")
        await asyncio.sleep(2) # 'await' a coroutine version of it instead.
        print("'morning!")

    # wrap the iterator in a generator - or coroutine
    async def asynchronous_range(end):
        for _i in range(end):
            yield _i

    # use it with async for
    async for i in asynchronous_range(5):
        result.append(i)
        print(i, end='')
    print()
    return result

async def main():
    start = time.time()
    tasks = say_after(), say_after(False)
    a, b = await asyncio.gather(*tasks)
    print(f"Took {time.time() - start:5f}")

asyncio.run(main())
Result
Your code:
DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def say_after(wait=True):
I'm sleeping!
'morning!
01234
01234
Took 5.003802
Better async code:
I'm sleeping!
01234
'morning!
01234
Took 2.013863
Note that the fixed code now finishes its job while the other task is sleeping.
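As for the actual API-request question: a blocking HTTP library does not become concurrent just because you wrap it in a coroutine. One option (a sketch only; blocking_api_call is a placeholder for your real request code) is to push the blocking calls into worker threads with asyncio.to_thread (Python 3.9+) and merge the results with asyncio.gather:

import asyncio
import time

def blocking_api_call(start, count):
    time.sleep(1)                       # stands in for a blocking API request
    return list(range(start, count))

async def main():
    t0 = time.time()
    # asyncio.to_thread runs each blocking call in a worker thread,
    # so both requests wait for their "I/O" at the same time.
    results = await asyncio.gather(
        asyncio.to_thread(blocking_api_call, 10, 20),
        asyncio.to_thread(blocking_api_call, 20, 30),
    )
    merged = [x for sub in results for x in sub]  # merge the returned lists
    print(merged)
    print(f"took {time.time() - t0:.2f}s")        # roughly 1s instead of 2s

asyncio.run(main())

An async HTTP client (e.g. aiohttp) would avoid the threads entirely, but the thread route lets you keep an existing blocking client.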
I am trying to execute all the functions that share a common decorator without calling each of them individually. For example:
@run
@v1
def test1():
    # do something
    ...

@run
@v1
@v2
def test2():
    # do something
    ...

@run
@v2
def test3():
    # do something
    ...

@run
def test4():
    # do something
    ...
I want to execute the test functions based on the decorators, e.g. @run executes all 4 tests and @v1 executes only the first two. How can I do that? Any guidance will be helpful.
You could probably use the decorator to "register" your functions in a list:
_to_run = [] # list of functions to run

def run(func):
    _to_run.append(func) # add the decorated function to the list
    return func

@run
def test1():
    print('test1')
    return 1

@run
def test2():
    print('test2')

def test3():
    print('test3')

if __name__ == '__main__':
    for test in _to_run: # iterate over registered functions
        x = test()
        print('Returned:', x)
On the other hand, you could as well create this list explicitly, without decorators.
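If you also want the @v1 / @v2 style selection from the question, one possible extension (a sketch of my own, not from the answer above) is a parameterized decorator that registers each function under a tag:

from collections import defaultdict

_registry = defaultdict(list)   # tag name -> list of registered functions

def tag(name):
    def decorator(func):
        _registry[name].append(func)   # register under this tag
        return func
    return decorator

run = tag('run')
v1 = tag('v1')
v2 = tag('v2')

@run
@v1
def test1():
    print('test1')

@run
@v1
@v2
def test2():
    print('test2')

def run_tagged(name):
    for func in _registry[name]:   # call every function registered under the tag
        func()

run_tagged('v1')   # runs test1 and test2
run_tagged('run')  # runs everything decorated with @run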
I have a blocking, non-async code like this:
def f():
    def inner():
        while True:
            yield read()
    return inner()
With this code, the caller can choose when to stop the function from generating data. How do I change this to async? This solution doesn't work:
async def f():
    async def inner():
        while True:
            yield await coroutine_read()
    return inner()
...because yield can't be used in async def functions. If I remove async from the inner() signature, I can't use await anymore.
Update:
Starting with Python 3.6 we have asynchronous generators and are able to use yield directly inside coroutines.
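So on 3.6+ the original f() can stay almost unchanged; here is a minimal sketch (coroutine_read() below is just a stand-in for the real awaitable):

import asyncio

async def coroutine_read():
    await asyncio.sleep(0.1)    # stand-in for the real asynchronous read
    return "data"

def f():
    async def inner():
        while True:
            yield await coroutine_read()   # allowed inside async def since 3.6
    return inner()

async def main():
    n = 0
    async for item in f():      # the caller still decides when to stop
        print(item)
        n += 1
        if n >= 3:
            break

asyncio.run(main())

f() itself can stay a plain function, because it only builds and returns the async generator.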
On older Python versions, as noted above, you can't use yield inside async functions. If you want to create a coroutine-generator there, you have to do it manually, using the __aiter__ and __anext__ magic methods:
import asyncio

# `coroutine_read()` generates some data:
i = 0

async def coroutine_read():
    global i
    i += 1
    await asyncio.sleep(i)
    return i

# `f()` is an asynchronous iterator.
# Since we don't raise `StopAsyncIteration`,
# it works "like" `while True`, until we manually break.
class f:
    def __aiter__(self):  # __aiter__ must return the iterator directly (not a coroutine) on modern Python
        return self

    async def __anext__(self):
        return await coroutine_read()

# Use f() as an asynchronous iterator with `async for`:
async def main():
    async for i in f():
        print(i)
        if i >= 3:
            break

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
Output:
1
2
3
[Finished in 6.2s]
You may also like to see this other post, where StopAsyncIteration is used.
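For completeness, a short sketch (mine, not from that post) of an __anext__ that raises StopAsyncIteration, so the async for loop ends without a manual break:

import asyncio

class count_to:
    def __init__(self, limit):
        self.limit = limit
        self.i = 0

    def __aiter__(self):
        return self

    async def __anext__(self):
        if self.i >= self.limit:
            raise StopAsyncIteration   # tells `async for` to stop
        self.i += 1
        await asyncio.sleep(0.1)
        return self.i

async def main():
    async for i in count_to(3):
        print(i)                       # prints 1, 2, 3

asyncio.run(main())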