I'm essentially making a pinger that has a 2D list of key/webhook pairs, and after pinging a key, sends the response to a webhook.
the 2d list goes as follows:
some_list = [["key1", "webhook1"], ["key2", "webhook2"]]
My program is essentially a loop, and I'm not too sure how I can rotate through the some_list data in the function.
Here's a little demo of what my script looks like:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)
        await asyncio.sleep(10)
        await do_ping(some_pair)
I've tried:
async def main():
    for entry in some_list:
        asyncio.run(do_ping(entry))
but due to the do_ping function being a self-calling loop, it just calls the first one over and over again, and never gets to the ones after it. Hoping to find a solution to this, whether it's threading or the like, and if you have a better way of structuring the some_list values (which I assume would be a dictionary), feel free to drop that feedback as well.
You made your method recursive (await do_ping(some_pair)), so it never returns and the loop in main never gets to continue. I would restructure the application like this:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        while True:
            tasks = await gen_tasks(s, some_pair)
            results = await asyncio.gather(*tasks)
            await asyncio.sleep(10)

async def main():
    tasks = [do_ping(entry) for entry in some_list]
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
Alternatively, you could move the repeat-and-sleep logic into main:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)

async def main():
    while True:
        tasks = [do_ping(entry) for entry in some_list]
        await asyncio.gather(*tasks)
        await asyncio.sleep(10)

if __name__ == "__main__":
    asyncio.run(main())
You could also start the tasks before calling sleep, and gather them afterwards. That would make the pings start more consistently at 10-second intervals, instead of every 10 seconds plus the time it takes to gather the results:
async def main():
    while True:
        tasks = [
            asyncio.create_task(do_ping(entry))
            for entry in some_list
        ]
        await asyncio.sleep(10)
        await asyncio.wait(tasks)
EDIT: As pointed out by creolo, you should only create a single ClientSession object. See https://docs.aiohttp.org/en/stable/client_reference.html:
Session encapsulates a connection pool (connector instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling.
async def do_ping(session, some_pair):
    tasks = await gen_tasks(session, some_pair)
    results = await asyncio.gather(*tasks)

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            tasks = [
                asyncio.create_task(do_ping(session, entry))
                for entry in some_list
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)
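As for the question's suggestion of restructuring some_list as a dictionary: mapping each key directly to its webhook reads more naturally than indexing inner lists. A minimal sketch against the last version above (the pairs name is illustrative, and it assumes gen_tasks accepts a (key, webhook) tuple the same way it accepted an inner list):

pairs = {"key1": "webhook1", "key2": "webhook2"}

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            tasks = [
                # each (key, webhook) tuple plays the role of one inner list
                asyncio.create_task(do_ping(session, (key, webhook)))
                for key, webhook in pairs.items()
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)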
For example, this code:
async def f1(num):
    while True:
        print(num)
        await asyncio.sleep(2)

class ExampleClass:
    def __init__(self):
        self.tasks = []

    async def main(self):
        for i in range(10):
            self.tasks.append(asyncio.create_task(f1(i)))
        await asyncio.gather(*self.tasks)

    def add_new_task(self, task):
        self.tasks.append(task)
Then somewhere outside I call
ExampleClass.add_new_task(task)
What I need is to add new tasks and execute them asynchronously with the existing ones.
Maybe I should use some other construct to implement what I want?
What is important is that my tasks probably need to execute forever (forever polling).
The asyncio.gather function is not really convenient for such a task. However, you can take a look at asyncio.TaskGroup, which was added in Python 3.11 and makes it easier to add new tasks dynamically.
import asyncio

async def main():
    async with asyncio.TaskGroup() as group:
        # Create some tasks
        tasks = [
            group.create_task(asyncio.sleep(1.0))
            for _ in range(10)
        ]
        # Wait for some tasks to finish
        done, tasks = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
        # Add more tasks (/!\ `tasks` is now a set)
        tasks.add(group.create_task(asyncio.sleep(2.0)))
        # Wait for the rest to complete
        await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
So I came to a possible solution for my case based on Louis's idea.
The approximate idea is to use a fake async infinite loop in which all my tasks are executed, and in which I add new tasks within asyncio.TaskGroup:
async def fake_generator():
    while True:
        yield None

fake_gen = fake_generator()

async with asyncio.TaskGroup() as tg:
    async for _ in fake_gen:
        await add_new_data()
        for element in self.data:
            if not self.data[element]['in_use']:
                tg.create_task(job(element))
                self.data[element]['in_use'] = True
        await asyncio.sleep(60 * 60)
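As a side note, the fake generator is not strictly required for this pattern; a plain while True inside the TaskGroup behaves the same way. A minimal sketch under the same assumptions as the fragment above (add_new_data, job, and self.data come from it):

async with asyncio.TaskGroup() as tg:
    while True:
        await add_new_data()
        for element in self.data:
            if not self.data[element]['in_use']:
                # schedule new work alongside the already-running tasks
                tg.create_task(job(element))
                self.data[element]['in_use'] = True
        await asyncio.sleep(60 * 60)  # poll for new work every hour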
I'm trying to run some asynchronous functions from inside a synchronous function. The problem is that, as I understand it, the functions just don't run when called like that, so how do I do it? I don't want to make the maze_move function asynchronous.
async def no_stop():
    # some logic
    await asyncio.sleep(4)

async def stop(stop_time):
    await asyncio.sleep(stop_time)
    # some logic

def maze_move():
    no_stop()
    stop(1.5)

async def main(websocket):
    global data_from_client, data_from_server, power_l, power_r
    get_params()
    get_data_from_server()
    get_data_from_client()
    while True:
        msg = await websocket.recv()
        allow_data(msg)
        cheker(data_from_client)
        data_from_server['IsBrake'] = data_from_client['IsBrake']
        data_from_server['powerL'] = power_l
        data_from_server['powerR'] = power_r
        await websocket.send(json.dumps(data_from_server))
        print(data_from_client['IsBrake'])

start_server = websockets.serve(main, 'localhost', 8080)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
How about:
def maze_move():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(no_stop())
    loop.run_until_complete(stop(1.5))
If you wanted to run two coroutines concurrently, then:
def maze_move():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(no_stop(), stop(1.5)))
Update Based on Updated Question
I am guessing what it is you want to do (see my comment to your question):
First, you cannot call coroutines such as stop directly from maze_move: stop() does not run stop, it just returns a coroutine object. So maze_move has to be modified. I will assume you do not want to make it a coroutine itself (although why not, as long as you already have to modify it?). Further assuming you want to invoke maze_move from a coroutine that runs other coroutines concurrently, you can create a new coroutine, e.g. maze_move_runner, that runs maze_move in a separate thread so it does not block the other concurrently running coroutines:
import asyncio

async def no_stop():
    # some logic
    print('no stop')
    await asyncio.sleep(4)

async def stop(stop_time):
    await asyncio.sleep(stop_time)
    print('stop')
    # some logic

async def some_coroutine():
    print('Enter some_coroutine')
    await asyncio.sleep(2)
    print('Exit some_coroutine')
    return 1

def maze_move():
    # In case we are being run directly and not in a separate thread:
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        # This thread has no current event loop, so:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    loop.run_until_complete(no_stop())
    loop.run_until_complete(stop(1.5))
    return 'Done!'

async def maze_move_runner():
    loop = asyncio.get_running_loop()
    # Run in another thread:
    return await loop.run_in_executor(None, maze_move)

async def main():
    results = await asyncio.gather(some_coroutine(), maze_move_runner())
    print(results)

asyncio.run(main())
Prints:
Enter some_coroutine
no stop
Exit some_coroutine
stop
[1, 'Done!']
But this would be the most straightforward solution:
async def maze_move():
    await no_stop()
    await stop(1.5)
    return 'Done!'

async def main():
    results = await asyncio.gather(some_coroutine(), maze_move())
    print(results)
If you have an already running event loop, you can define an async function inside of a sync function and launch it as a task:
def maze_move():
    async def amaze_move():
        await no_stop()
        await stop(1.5)
    return asyncio.create_task(amaze_move())
This function returns an asyncio.Task object, which can be used in an await expression or not, depending on requirements. This way you won't have to make maze_move itself an async function, although I don't know why that would be a goal. Only an async function can run no_stop and stop, so you've got to have an async function somewhere.
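For completeness, a minimal usage sketch (the caller name is hypothetical): because maze_move schedules the task on the already-running loop, a caller may fire and forget it, or keep the returned Task and await it later:

async def caller():
    task = maze_move()  # amaze_move starts running on the current loop
    ...                 # do other async work here
    await task          # optional: only if you need to wait for completion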
I'm trying to build an application with Python's asyncio module that is scalable and uses other asyncio-based modules. The idea is to easily add tasks as the application grows, using synchronization primitives for resources shared between the tasks, yet I'm confused about which design would best fit the intent.
import asyncio

async def task1():
    while True:
        # Code from task1
        await asyncio.sleep(1)

async def task2():
    while True:
        # Code from task2
        await asyncio.sleep(1)

async def task3():
    while True:
        # Code from task3
        await asyncio.sleep(1)

async def main():
    await asyncio.gather(
        task1(),
        task2(),
        task3()
    )

if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(main())
In the above approach, each task is its own while loop, yielding at the end of each iteration. The alternative:
import asyncio

async def task1():
    # Code from task1
    await asyncio.sleep(1)

async def task2():
    # Code from task2
    await asyncio.sleep(1)

async def task3():
    # Code from task3
    await asyncio.sleep(1)

async def main():
    while True:
        await asyncio.gather(
            task1(),
            task2(),
            task3()
        )

if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(main())
In this last approach, the gather call is the one inside the while loop.
I also wonder whether that is even necessary: since asyncio tasks run cooperatively, perhaps I would get the same result by awaiting each coroutine in series, saving the use of mutexes?
Is there anything asynchronous happening besides the asyncio.sleep(1)? If not, then why bother with asynchronicity?
In the first piece of code, your three tasks are running independently of each other; you might handle 3 items in task1, 4 items in task2, and none in task3.
In the second, you specifically handle one thing in task1, one thing in task2, one thing in task3, and then start over again. If the tasks run at different speeds, this seems inefficient.
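To make the difference concrete, here is a minimal sketch (the sleep durations are invented for illustration): with independent while loops, the fast task keeps iterating while the slow one is still working, whereas gather-in-a-loop caps every round at the slowest task:

import asyncio
import time

async def fast():
    await asyncio.sleep(0.1)  # simulated fast work

async def slow():
    await asyncio.sleep(1.0)  # simulated slow work

async def independent_loops():
    async def run(step, name):
        count = 0
        start = time.monotonic()
        while time.monotonic() - start < 3:
            await step()
            count += 1
        print(name, "iterations:", count)
    # first approach: each worker loops at its own pace
    await asyncio.gather(run(fast, "fast"), run(slow, "slow"))

asyncio.run(independent_loops())  # fast completes far more iterations than slow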
When a user sends !bot in a special channel, the code runs a function that takes 20-30 seconds (depending on some_var). My problem is that if several people write !bot, the code runs the function multiple times in parallel. How do I queue these requests?
I tried to understand asyncio and discord.ext.tasks, but I can't figure out how they work.
@client.command()
async def test(ctx, *args):
    data = args
    if data:
        answer = some_function(data[0])  # takes from 5 seconds to 1 minute or more
        await ctx.send(answer)
Everything works well, but I just don't want to load the system so much; I need a loop that processes the requests first in, first out.
You can use an asyncio.Queue to queue tasks, then process them sequentially in a background loop:
import asyncio
from discord.ext import commands, tasks

queue = asyncio.Queue()
bot = commands.Bot('!')

@tasks.loop(seconds=1.0)
async def executor():
    task = await queue.get()
    await task
    queue.task_done()

@executor.before_loop
async def before():
    await bot.wait_until_ready()

@bot.command()
async def example(ctx, num: int):
    await queue.put(helper(ctx, num))

async def helper(ctx, num):
    await asyncio.sleep(num)
    await ctx.send(num)

executor.start()
bot.run('token')
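Note that tasks.loop(seconds=1.0) waits a second between queued jobs; if you would rather drain the queue back to back, a plain background consumer is an alternative. A hedged sketch against the same queue and bot as above (bot.loop.create_task is the discord.py 1.x way to start it; discord.py 2.x schedules startup work in setup_hook instead):

async def executor():
    await bot.wait_until_ready()
    while True:
        job = await queue.get()  # waits until a command queues a job
        await job                # run jobs strictly one at a time (FIFO)
        queue.task_done()

bot.loop.create_task(executor())  # instead of executor.start()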
Make some_function() async, then await it. Then all test commands will be processed concurrently.
async def some_function(...):
    # something to do
@client.command()
async def test(ctx, *args):
    if args:
        answer = await some_function(args[0])
        await ctx.send(answer)
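One caveat, since the question says some_function can run for 20-30 seconds: declaring it async only helps if it actually awaits something. If it is blocking (CPU-bound or doing blocking I/O), one option is to hand it to a worker thread so the event loop stays responsive. A sketch assuming Python 3.9+ for asyncio.to_thread:

import asyncio

@client.command()
async def test(ctx, *args):
    if args:
        # run the blocking function in a thread; the event loop keeps
        # serving other commands in the meantime
        answer = await asyncio.to_thread(some_function, args[0])
        await ctx.send(answer)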
I mean, what do I get from using async for? Here is the code I wrote with async for; AIter(10) could be replaced with get_range().
But the code runs synchronously, not asynchronously.
import asyncio

async def get_range():
    for i in range(10):
        print(f"start {i}")
        await asyncio.sleep(1)
        print(f"end {i}")
        yield i

class AIter:
    def __init__(self, N):
        self.i = 0
        self.N = N

    def __aiter__(self):
        return self

    async def __anext__(self):
        i = self.i
        if i >= self.N:
            raise StopAsyncIteration
        print(f"start {i}")
        await asyncio.sleep(1)
        print(f"end {i}")
        self.i += 1
        return i

async def main():
    async for p in AIter(10):
        print(f"finally {p}")

if __name__ == "__main__":
    asyncio.run(main())
The result I expected was:
start 1
start 2
start 3
...
end 1
end 2
...
finally 1
finally 2
...
However, the real result is:
start 0
end 0
finally 0
start 1
end 1
finally 1
start 2
end 2
I know I could get the expected result by using asyncio.gather or asyncio.wait.
But it is hard for me to understand what I gain by using async for here instead of a simple for.
What is the right way to use async for if I want to loop over several Future objects and use each one as soon as it is finished? For example:
async for f in feature_objects:
    data = await f
    with open("file", "w") as fi:
        fi.write(data)
But it is hard for me to understand what I gain by using async for here instead of a simple for.
The underlying misunderstanding is expecting async for to automatically parallelize the iteration. It doesn't do that, it simply allows sequential iteration over an async source. For example, you can use async for to iterate over lines coming from a TCP stream, messages from a websocket, or database records from an async DB driver.
None of the above would work with an ordinary for, at least not without blocking the event loop. This is because for calls __next__ as a blocking function and doesn't await its result. You cannot manually await elements obtained by for because for expects __next__ to signal the end of iteration by raising StopIteration. If __next__ is a coroutine, the StopIteration exception won't be visible before awaiting it. This is why async for was introduced, not just in Python, but also in other languages with async/await and generalized for.
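For example, here is a minimal sketch of sequential async iteration over a TCP stream (the host, port, and line-oriented server are assumptions for illustration). Each line is awaited as it arrives without blocking the event loop, which an ordinary for over a socket could not do:

import asyncio

async def read_lines():
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888)
    # asyncio.StreamReader is an async iterable of lines (Python 3.8+)
    async for line in reader:
        print('received:', line.decode().rstrip())
    writer.close()
    await writer.wait_closed()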
If you want to run the loop iterations in parallel, you need to start them as parallel coroutines and use asyncio.as_completed or equivalent to retrieve their results as they come:
async def x(i):
    print(f"start {i}")
    await asyncio.sleep(1)
    print(f"end {i}")
    return i

# run x(0)..x(9) concurrently and process results as they arrive
for f in asyncio.as_completed([x(i) for i in range(10)]):
    result = await f
    # ... do something with the result ...
If you don't care about reacting to results immediately as they arrive, but you need them all, you can make it even simpler by using asyncio.gather:
# run x(0)..x(9) concurrently and process results when all are done
results = await asyncio.gather(*[x(i) for i in range(10)])
(Adding on to the accepted answer, for Charlie's bounty.)
Assuming you want to consume each yielded value concurrently, a straightforward way would be:
import asyncio

async def process_all():
    tasks = []
    async for obj in my_async_generator:
        # Python 3.7+. Use ensure_future for older versions.
        task = asyncio.create_task(process_obj(obj))
        tasks.append(task)
    await asyncio.gather(*tasks)

async def process_obj(obj):
    ...
Explanation:
Consider the following code, without create_task:
async def process_all():
    async for obj in my_async_generator:
        await process_obj(obj)
This is roughly equivalent to:
async def process_all():
    obj1 = await my_async_generator.__anext__()
    await process_obj(obj1)

    obj2 = await my_async_generator.__anext__()
    await process_obj(obj2)

    ...
Basically, the loop cannot continue because its body is blocking. The way to go is to delegate the processing of each iteration to a new asyncio task, which starts without blocking the loop. Then gather waits for all of the tasks, which means waiting for every iteration to be processed.
Code based on the fantastic answer from @matan129, just missing the async generator to make it runnable; once I have that (or if someone wants to contribute it) I will finalize this:
import time
import asyncio

# Placeholder async generator so the snippet runs; replace with a real source.
async def my_async_generator():
    for i in range(3):
        await asyncio.sleep(1)  # simulated asynchronous production of items
        yield i

async def process_all():
    """
    Example where the async for loop lets us loop through many things concurrently,
    without blocking on each individual iteration, while still blocking (waiting)
    for all tasks to run.

    ref:
        - https://stackoverflow.com/questions/56161595/how-to-use-async-for-in-python/72758067#72758067
    """
    tasks = []
    async for obj in my_async_generator():
        # Python 3.7+. Use ensure_future for older versions.
        task = asyncio.create_task(process_obj(obj))  # concurrently dispatches a coroutine to be executed.
        tasks.append(task)
    await asyncio.gather(*tasks)

async def process_obj(obj):
    await asyncio.sleep(5)  # expensive IO

if __name__ == '__main__':
    # - test asyncio
    s = time.perf_counter()
    asyncio.run(process_all())
    # - print stats
    elapsed = time.perf_counter() - s
    print(f"{__file__} executed in {elapsed:0.2f} seconds.")
    print('Success, done!\a')