understanding async/await with an example - python

I want to understand why async/await didn't work while looping over a range; the code is below. Is it that the function I called is not asynchronous? I'm not able to understand it. The problem is that the same thing worked when I used a non-async process instead of the for loop. Is there something I'm missing? Thanks
import asyncio
import time

async def test():
    response = {}
    s = time.time()
    tasks = []
    answers = []
    for k in range(4):
        print(k)
        tasks.append(asyncio.create_task(take_time()))
    answers = await asyncio.gather(*tasks)
    response['ans'] = answers
    response["Time Taken"] = round((time.time() - s), 2)
    print(response)
    return response

async def take_time():
    # time.sleep(4)
    # await asyncio.sleep(4)
    for i in range(100000000):
        o = i
    return str(o)

if __name__ == '__main__':
    asyncio.run(test())
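
As an illustration (this sketch is not part of the original post): the for loop inside take_time is pure CPU work and never awaits anything, so the event loop never gets a chance to switch between the four tasks and they run one after another. With genuinely awaitable work, such as the commented-out asyncio.sleep(4), the same structure does run concurrently:

import asyncio
import time

async def take_time():
    # awaiting yields control to the event loop, so the other tasks can run
    await asyncio.sleep(4)
    return "done"

async def test():
    s = time.time()
    tasks = [asyncio.create_task(take_time()) for _ in range(4)]
    answers = await asyncio.gather(*tasks)
    # roughly 4 seconds total instead of 16, because the sleeps overlap
    print(answers, round(time.time() - s, 2))

if __name__ == '__main__':
    asyncio.run(test())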

Related

Python asyncio double await

For some reason I need to write a double await, but I don't exactly know why. Can someone explain this to me?
I've created a small example of the issue I ran into.
import asyncio
from random import randint

async def work():
    return randint(1, 100)

async def page():
    return asyncio.gather(*[
        work()
        for _ in range(10)
    ])

async def run():
    results = await (await page())
    return max(list(results))

result = asyncio.run(run())
The confusing part is the line results = await (await page()).
To actually execute awaitable objects, you need to await them.
Your page here is a coroutine function; when called, it returns a coroutine, which is an awaitable object.
When you write await page(), you run its body; on return it hands you another awaitable object, namely the result of calling asyncio.gather(). You need to await that one as well, which is why two awaits are required.
If you don't, you'd see:
RuntimeError: await wasn't used with future
Instead, you could do the nested await inside the coroutine that calls gather():
import asyncio
from random import randint

async def work():
    return randint(1, 100)

async def page():
    return await asyncio.gather(*[work() for _ in range(10)])

async def run():
    results = await page()
    return max(list(results))

result = asyncio.run(run())
print(result)
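
Another way to see the same point (this variant is not from the original answer): asyncio.gather() already returns an awaitable future, so if page is a plain function rather than a coroutine, a single await suffices:

import asyncio
from random import randint

async def work():
    return randint(1, 100)

def page():
    # a plain function returning the gather future, no coroutine wrapper
    return asyncio.gather(*[work() for _ in range(10)])

async def run():
    results = await page()  # one await: page() itself is not a coroutine
    return max(results)

print(asyncio.run(run()))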

Why am I getting RuntimeError while using threading.Thread()?

Today is another question day. I am doing requests in Python, and I am trying to speed things up. If you recall, I already asked a question about this. Today I finished that code and moved on, only to face a similar problem.
Running this code produces a RuntimeError. It says:
RuntimeError: There is no current event loop in thread 'Thread-2'
I don't know why I am getting this error, since I am using threading.Thread() instead of asyncio. Why is it asking for an event loop?
def func(mininum, maximum=None):
    if maximum == None:
        maximum = mininum
    import asyncio
    import aiohttp
    from bs4 import BeautifulSoup

    async def find_account(i_d, session):
        try:
            async with session.get(f'https://web.roblox.com/users/{i_d}/profile') as response:
                if response.status == 200:
                    r = await response.read()
                    soup = BeautifulSoup(r, 'html.parser')
                    stuff = list(soup.find_all('h2'))
                    stuff = stuff[0]
                    stuff = list(stuff)
                    stuff = stuff[0]
                    print(f'{i_d} is done')
                    return str(stuff) + ' ID: {id}'.format(id=i_d)
                else:
                    return 'None'
        except aiohttp.ServerDisconnectedError:
            await find_account(i_d, session)
        except asyncio.exceptions.TimeoutError:
            await find_account(i_d, session)

    async def id_range(minimum, maximum):
        tasks = []
        async with aiohttp.ClientSession() as session:
            for i in range(minimum, maximum + 1):
                tasks.append(asyncio.create_task(find_account(i_d=i, session=session)))
            return await asyncio.gather(*tasks)

    event_loop = asyncio.get_event_loop()
    return event_loop.run_until_complete(id_range(mininum, maximum))


import threading

all = []
p1 = threading.Thread(target=func, args=(1, 1000)).start()
p2 = threading.Thread(target=func, args=(1001, 2000)).start()
p1: threading.Thread
p2: threading.Thread
p1.join()
p2.join()

# Comment if you need more traceback.
You cannot use the async/await keywords when using threading.
Instead, the OS will decide automatically when to switch threads.
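
As an aside that goes beyond the original answer: the specific RuntimeError comes from calling asyncio.get_event_loop() in a worker thread, which has no event loop by default. A minimal sketch of one common workaround, giving each thread its own loop via asyncio.run() (crawl and worker are illustrative names, not the question's code):

import asyncio
import threading

async def crawl(minimum, maximum):
    # stand-in for the aiohttp work in the question
    await asyncio.sleep(0.1)
    return list(range(minimum, maximum + 1))

def worker(minimum, maximum, out):
    # asyncio.run() creates (and later closes) a fresh event loop for this
    # thread, so no pre-existing "current event loop" is needed
    out.extend(asyncio.run(crawl(minimum, maximum)))

results = []
t1 = threading.Thread(target=worker, args=(1, 5, results))
t2 = threading.Thread(target=worker, args=(6, 10, results))
t1.start()
t2.start()
t1.join()
t2.join()
print(results)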

How to use an asyncio loop inside another asyncio loop

I have been trying all kinds of things to be able to use an asyncio loop inside another asyncio loop. Most of the time my tests just end in errors, such as:
RuntimeError: This event loop is already running
My example code below is just the base test I started with, so you can see the basics of what I am trying to do. I tried so many things after this test that it just got too confusing, so I figured I should keep it simple when asking for help. If anyone can point me in the right direction, that would be great. Thank you for your time!
import asyncio

async def fetch(data):
    message = 'Hey {}!'.format(data)
    other_data = ['image_a.com', 'image_b.com', 'image_c.com']
    images = sub_run(other_data)
    return {'message': message, 'images': images}

async def bound(sem, data):
    async with sem:
        r = await fetch(data)
        return r

async def build(dataset):
    tasks = []
    sem = asyncio.Semaphore(400)
    for data in dataset:
        task = asyncio.ensure_future(bound(sem, data))
        tasks.append(task)
    r = await asyncio.gather(*tasks)
    return r

def run(dataset):
    loop = asyncio.get_event_loop()
    future = asyncio.ensure_future(build(dataset))
    responses = loop.run_until_complete(future)
    loop.close()
    return responses

async def sub_fetch(data):
    image = 'https://{}'.format(data)
    return image

async def sub_bound(sem, data):
    async with sem:
        r = await sub_fetch(data)
        return r

async def sub_build(dataset):
    tasks = []
    sem = asyncio.Semaphore(400)
    for data in dataset:
        task = asyncio.ensure_future(sub_bound(sem, data))
        tasks.append(task)
    r = await asyncio.gather(*tasks)
    return r

def sub_run(dataset):
    loop = asyncio.get_event_loop()
    future = asyncio.ensure_future(sub_build(dataset))
    responses = loop.run_until_complete(future)
    loop.close()
    return responses

if __name__ == '__main__':
    dataset = ['Joe', 'Bob', 'Zoe', 'Howard']
    responses = run(dataset)
    print(responses)
Running loop.run_until_complete inside a running event loop would block the outer loop, thus defeating the purpose of using asyncio. Because of that, asyncio event loops aren't recursive, and one shouldn't need to run them recursively. Instead of creating an inner event loop, await a task on the existing one.
In your case, remove sub_run and simply replace its usage:
images = sub_run(other_data)
with:
images = await sub_build(other_data)
And it will work just fine, running the sub-coroutines and not continuing with the outer coroutine until the inner one is complete, as you likely intended from the sync code.
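
For context, here is fetch with the suggested replacement applied; everything else in the program stays as in the question:

async def fetch(data):
    message = 'Hey {}!'.format(data)
    other_data = ['image_a.com', 'image_b.com', 'image_c.com']
    # await the sub-coroutines on the already-running loop
    # instead of starting a second event loop
    images = await sub_build(other_data)
    return {'message': message, 'images': images}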

Call Async Functions Dynamically with exec()

So, I'm implementing a Discord Bot using discord.py, and I'm trying to dynamically call functions based on commands. I was able to test dynamic function calls with exec() fine, but they seem to fall apart with the async calls needed for discord.py.
What I'm trying to do with this example is call the hello function and print Hello World into Discord by typing !hello in chat.
@client.event
async def on_message(message):
    call = 'await ' + message.content.lower()[1:] + '(message)'
    exec(call)

async def hello(message):
    await client.send_message(message.channel, 'Hello World')
Unfortunately, this code doesn't seem to do anything, I'm assuming because of how exec() handles async calls. Any help would be appreciated.
Instead of exec(), use globals() to get your function:
import asyncio

async def main():
    s = "foo"
    param = "hello"
    coro = globals().get(s)
    if coro:
        result = await coro(param)
        print("got:", result)
    else:
        print("unknown function:", s)

async def foo(param):
    print(param)
    await asyncio.sleep(0.11)
    return ":-)"

loop = asyncio.get_event_loop()
response = loop.run_until_complete(main())
loop.close()
However, allowing the user to access anything in globals() might be dangerous. Instead, it would be much better to whitelist your commands, for example using:
import asyncio

my_commands = {}

def register(cmd):
    my_commands[cmd.__name__] = cmd
    return cmd

async def main():
    s = "foo"
    param = "hello"
    coro = my_commands.get(s)
    if coro:
        result = await coro(param)
        print("got:", result)
    else:
        print("unknown function:", s)

@register
async def foo(param):
    """I am the mighty foo command!"""
    print(param)
    await asyncio.sleep(0.11)
    return ":-)"

loop = asyncio.get_event_loop()
response = loop.run_until_complete(main())
loop.close()
See also, to list the registered commands and their docstrings:
for k, v in my_commands.items():
    print("{}: {}".format(k, v.__doc__ or "no docs"))

python asyncio add_done_callback with async def

I have two functions: the first one, def_a, is an asynchronous function, and the second one, def_b, is a regular function that is registered on def_a's task with add_done_callback and is called with def_a's result.
My code looks like this:
import asyncio

def def_b(result):
    next_number = result.result()
    # some work on the next_number
    print(next_number + 1)

async def def_a(number):
    await some_async_work(number)
    return number + 1

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(def_a(1))
task.add_done_callback(def_b)
response = loop.run_until_complete(task)
loop.close()
And it works perfectly.
The problem began when the second function, def_b, also became asynchronous. Now it looks like this:
async def def_b(result):
    next_number = result.result()
    # some asynchronous work on the next_number
    print(next_number + 1)
But now I cannot pass it to the add_done_callback function, because it's not a regular function.
My question is: is it possible, and how, to pass def_b to the add_done_callback function if def_b is asynchronous?
add_done_callback is considered a "low level" interface. When working with coroutines, you can chain them in many ways, for example:
import asyncio

async def my_callback(result):
    print("my_callback got:", result)
    return "My return value is ignored"

async def coro(number):
    await asyncio.sleep(number)
    return number + 1

async def add_success_callback(fut, callback):
    result = await fut
    await callback(result)
    return result

loop = asyncio.get_event_loop()
task = asyncio.ensure_future(coro(1))
task = add_success_callback(task, my_callback)
response = loop.run_until_complete(task)
print("response:", response)
loop.close()
Keep in mind add_done_callback will still call the callback if your future raises an exception (but calling result.result() will raise it).
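
A small sketch of that caveat (not part of the original answer): the done-callback still runs when the task fails, and calling result() inside it re-raises the stored exception:

import asyncio

def on_done(fut):
    # runs even though the task failed; fut.result() re-raises the exception
    try:
        print("result:", fut.result())
    except ValueError as exc:
        print("task failed with:", exc)

async def boom():
    raise ValueError("oops")

async def main():
    task = asyncio.ensure_future(boom())
    task.add_done_callback(on_done)
    try:
        await task
    except ValueError:
        pass  # already reported by the callback

asyncio.run(main())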
This only works for one future job. If you have multiple async jobs, they will block each other; a better way is to use asyncio.as_completed() to iterate over the future list:
import asyncio

async def __after_done_callback(future_result):
    # await for something...
    pass

async def __future_job(number):
    await some_async_work(number)
    return number + 1

async def main():
    tasks = [asyncio.ensure_future(__future_job(x)) for x in range(100)]  # create 100 future jobs
    for f in asyncio.as_completed(tasks):
        result = await f
        await __after_done_callback(result)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())  # await is only valid inside a coroutine, hence the main() wrapper
loop.close()
You can try the aiodag library. It's a very lightweight wrapper around asyncio that abstracts away some of the async plumbing that you usually have to think about. From this example you won't be able to tell that things are running asynchronously since it's just 1 task that depends on another, but it is all running async.
import asyncio
from aiodag import task

@task
async def def_b(result):
    # some asynchronous work on the next_number
    print(result + 1)

@task
async def def_a(number):
    await asyncio.sleep(number)
    return number + 1

async def main():
    a = def_a(1)
    b = def_b(a)  # this makes task b depend on task a
    return await b

loop = asyncio.get_event_loop()
asyncio.set_event_loop(loop)
response = loop.run_until_complete(main())
