Running function in IOLoop.current().run_in_executor? - python

I read the following code
async def f():
    client = session.client("ec2")
    for id in ids:
        await IOLoop.current().run_in_executor(None, lambda: client.terminate(id))

How does it compare to the following code? Will client.terminate run in parallel, even though each call is awaited?

for id in ids:
    client.terminate(id)

Will client.terminate be run in parallel?
No, it still runs in sequence.
IOLoop.current().run_in_executor runs the blocking function in a separate thread and returns an asyncio.Future, while await waits until the Future wrapping client.terminate finishes; only then does the loop continue.
The difference between the two options you gave is this:
If the program has other coroutines to run, with the first option those coroutines are not blocked, while with the second option they have to wait for your for loop to finish.
Here is an example to make this clear (loop.run_in_executor stands in for IOLoop.current().run_in_executor for simplicity's sake):
test.py:
import asyncio
import concurrent.futures
import time

def client_terminate(id):
    print(f"start terminate {id}")
    time.sleep(5)
    print(f"end terminate {id}")

async def f():
    loop = asyncio.get_running_loop()
    for id in range(2):
        with concurrent.futures.ThreadPoolExecutor() as pool:
            await loop.run_in_executor(pool, client_terminate, id)
        # client_terminate(id)

async def f2():
    await asyncio.sleep(1)
    print("other task")

async def main():
    await asyncio.gather(*[f(), f2()])

asyncio.run(main())
The run output is:
$ python3 test.py
start terminate 0
other task
end terminate 0
start terminate 1
end terminate 1
You can see that the two client_terminate calls in the for loop still run in sequence, BUT the function f2, which prints "other task", runs in between them: the executor does not block the asyncio scheduler from scheduling f2.
Additional:
If you comment out the two lines for await loop.run_in_executor and the thread pool, and call client_terminate(id) directly, the output becomes:
$ python3 test.py
start terminate 0
end terminate 0
start terminate 1
end terminate 1
other task
This means that if you don't wrap the blocking function in a Future, the other task has to wait for your for loop to finish, which leaves the event loop idle the whole time.
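If you do want the terminate calls to run in parallel, submit all the executor futures first and await them together with asyncio.gather. A minimal sketch (the function name f_parallel, the 1-second sleep, and the return value are illustrative stand-ins for the real client.terminate):

```python
import asyncio
import concurrent.futures
import time

def client_terminate(id):
    # Illustrative stand-in for the real blocking call.
    print(f"start terminate {id}")
    time.sleep(1)
    print(f"end terminate {id}")
    return id

async def f_parallel():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        # Submit every call before awaiting any of them, so the
        # threads run concurrently instead of one after another.
        futures = [loop.run_in_executor(pool, client_terminate, i) for i in range(2)]
        return await asyncio.gather(*futures)

results = asyncio.run(f_parallel())
print(results)
```

With this version both "start terminate" lines print before either "end terminate", and the whole loop takes roughly one sleep instead of two.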


Python asyncio unable to run multiple tasks properly

I have the following code snippet, which I expect to run two async functions (func1 and func2) concurrently, where:
Worker is an infinite loop that keeps fetching items from a global asyncio.Queue instance, whether the queue is empty or not, and simply prints some output, and Worker.start() is the method that starts that loop.
worker1 = Worker()
worker2 = Worker()
worker3 = Worker()

async def func1():
    print("Running func1...")
    await asyncio.gather(worker1.start(), worker2.start(), worker3.start())
    print("func1 done...")

async def func2():
    print("Running func2...")
    await asyncio.sleep(2)
    print("func2 done...")

async def background_tasks():
    asyncio.create_task(func1())
    asyncio.create_task(func2())

if __name__ == '__main__':
    asyncio.run(background_tasks())
I expect the two functions to run concurrently, producing output similar to this:
Running func1...
Running func2...
worker1 printing object
worker2 printing object
worker3 waiting
worker2 printing object
func2 done...
worker1 printing object
worker2 printing object
... (expecting no "func1 done..." because of the infinite loop)
But I actually get result output like this:
Running func1...
Running func2...
Process finished with exit code 0
It seems both functions started but never finished properly; even func2 has no ending output. I am unable to find a solution and am hoping to get some help, thanks in advance!
The code is asynchronous, so after you have created your two tasks the interpreter moves on to the next line of code. There is none, so background_tasks returns and your two coroutines are destroyed with it. You need to await the tasks before exiting the function if you want the coroutines to run to completion:
async def background_tasks():
    task1 = asyncio.create_task(func1())
    task2 = asyncio.create_task(func2())
    await task1
    await task2
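Equivalently, asyncio.gather can schedule and await both tasks in one call. A small self-contained sketch (with finite sleeps standing in for the original infinite workers so the snippet terminates, and events recorded in a list instead of printed):

```python
import asyncio

events = []

async def func1():
    events.append("func1 start")
    await asyncio.sleep(0.1)
    events.append("func1 done")

async def func2():
    events.append("func2 start")
    await asyncio.sleep(0.05)
    events.append("func2 done")

async def background_tasks():
    # gather wraps both coroutines in tasks and waits for all of
    # them, so asyncio.run does not return until both finish.
    await asyncio.gather(func1(), func2())

asyncio.run(background_tasks())
print(events)
```

Both functions start before either finishes, showing they ran concurrently.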

Python run non-blocking async function from sync function

Is there a way to call an async function from a sync one without waiting for it to complete?
My current tests:
Issue: Waits for test_timer_function to complete
async def test_timer_function():
    await asyncio.sleep(10)
    return

def main():
    print("Starting timer at {}".format(datetime.now()))
    asyncio.run(test_timer_function())
    print("Ending timer at {}".format(datetime.now()))
Issue: Does not call test_timer_function
async def test_timer_function():
    await asyncio.sleep(10)
    return

def main():
    print("Starting timer at {}".format(datetime.now()))
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    asyncio.ensure_future(test_timer_function())
    print("Ending timer at {}".format(datetime.now()))
Any suggestions?
Async functions do not really run in the background: they always run in a single thread.
That means that when there are parallel tasks in (normal) async code, they only execute when you give the asyncio loop a chance to run; that happens when your code uses await, calls one of async for or async with, or returns from a coroutine that is running as a task.
In non-async code, you have to enter the loop and pass control to it in order for the async code to run. That is what asyncio.run does, and what asyncio.ensure_future does not: ensure_future just registers a task to be executed whenever the asyncio loop has time for it, but you return from the function without ever passing control to the loop, so your program just finishes.
One thing that can be done is to start a secondary thread where the asyncio code will run: that thread runs its own asyncio loop, and you can communicate with tasks in it through global variables and normal thread data structures like queues.
The minimal changes for your code are:
import asyncio
import threading
from datetime import datetime

now = datetime.now

async def test_timer_function():
    await asyncio.sleep(2)
    print(f"ending async task at {now()}")
    return

def run_async_loop_in_thread():
    asyncio.run(test_timer_function())

def main():
    print(f"Starting timer at {now()}")
    t = threading.Thread(target=run_async_loop_in_thread)
    t.start()
    print(f"Ending timer at {now()}")
    return t

if __name__ == "__main__":
    t = main()
    t.join()
    print(f"asyncio thread exited normally at {now()}")
(Please, when posting Python code, include the import lines and the lines that call your functions, so the code actually runs: it is not a lot of boilerplate, unlike in some other languages, and it turns your snippets into complete, ready-to-run examples.)
Printout when running this snippet at the console:
Starting timer at 2022-10-20 16:47:45.211654
Ending timer at 2022-10-20 16:47:45.212630
ending async task at 2022-10-20 16:47:47.213464
asyncio thread exited normally at 2022-10-20 16:47:47.215417
The answer is simply no: it's not going to happen in a single thread.
First issue:
In your first snippet, main() is a sync function. It stops at the line asyncio.run(test_timer_function()) until the event loop finishes its work.
What is the loop's only task? test_timer_function! That task does give control back to the event loop, but not to the caller main. If the event loop had other tasks, they would cooperate with each other; but that cooperation happens among the tasks of the event loop, not between the event loop and its caller.
So it will wait 10 seconds; there is nothing else there that could use those 10 seconds to do its own work.
Second issue:
You didn't even run the event loop. Check the documentation for ensure_future.
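If you truly need fire-and-forget from sync code, a common pattern is a single long-lived loop thread plus asyncio.run_coroutine_threadsafe. A sketch (the 0.2-second sleep, the "done" return value, and the daemon-thread setup are illustrative assumptions):

```python
import asyncio
import threading

async def test_timer_function():
    await asyncio.sleep(0.2)
    return "done"

# One long-lived event loop running in a daemon thread.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

# Submit the coroutine without blocking the caller; this returns
# a concurrent.futures.Future immediately.
future = asyncio.run_coroutine_threadsafe(test_timer_function(), loop)
print("submitted without waiting")

# Later, the sync code can (optionally) block to collect the result.
print(future.result())
```

The call site returns immediately; only the final future.result() blocks, and you can skip it entirely for true fire-and-forget.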

How to close the loop if one of the task is completed in asyncio

I have 3 tasks. -
def task_a():
    while True:
        file.write()
        asyncio.sleep(10)

def task_b():
    while True:
        file.write()
        asyncio.sleep(10)

def task_c():
    # do something
main.py -
try:
    loop = asyncio.get_event_loop()
    A = loop.create_task(task_a)
    B = loop.create_task(task_b)
    C = loop.create_task(task_c)
    awaitable_pending_tasks = asyncio.all_tasks()
    execution_group = asyncio.gather(*awaitable_pending_tasks, return_exceptions=True)
    fi_execution = loop.run_until_complete(execution_group)
finally:
    loop.run_forever()
I want to make sure that the loop exits when task_c is completed.
I tried loop.close() in the finally block, but since it's async, it closes partway through.
task_a and task_b write to a file, and another running process checks the time the file was last modified. If that is more than a minute ago, it results in an error (which I don't want); hence the while loop, with a sleep() after each write.
Once task_c is complete, I need the loop to stop.
Other answers on StackOverflow looked complicated to understand.
Is there a way to do this?
You could call loop.run_until_complete or asyncio.run (but not run_forever) with a function that starts the tasks you need and then awaits only the one that should terminate the loop (untested):
async def main():
    asyncio.create_task(task_a())
    asyncio.create_task(task_b())
    await task_c()
    tasks = set(asyncio.all_tasks()) - {asyncio.current_task()}
    for t in tasks:
        t.cancel()
    await asyncio.gather(*tasks, return_exceptions=True)

asyncio.run(main())
# or asyncio.get_event_loop().run_until_complete(main())
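An alternative sketch of the same idea uses asyncio.wait with return_when=FIRST_COMPLETED, so the function unwinds as soon as any awaited task finishes. The sleeps and the return value here are illustrative stand-ins for the real file-writing tasks:

```python
import asyncio

async def task_a():
    # Stand-in for the original file-writing loop.
    while True:
        await asyncio.sleep(0.05)

async def task_c():
    await asyncio.sleep(0.2)
    return "c finished"

async def main():
    a = asyncio.create_task(task_a())
    c = asyncio.create_task(task_c())
    # Return as soon as any of the tasks finishes; here that can
    # only be task_c, since task_a loops forever.
    done, pending = await asyncio.wait({a, c}, return_when=asyncio.FIRST_COMPLETED)
    for t in pending:
        t.cancel()
    await asyncio.gather(*pending, return_exceptions=True)
    return done.pop().result()

result = asyncio.run(main())
print(result)
```

asyncio.run then closes the loop for you once main returns, so no manual loop.close() is needed.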

Concurrent future polling of series of blocking calls

I'm trying to build a polling mechanism for a long-running task in Python. To do this, I use a concurrent Future and poll it with .done(). The task consists of many iterations that are themselves blocking, which I wrapped in an async function. I don't have access to the code of the blocking functions, as I'm calling third-party software. This is a minimal example of my current approach:
import asyncio
import time

async def blocking_iteration():
    time.sleep(1)

async def long_running():
    for i in range(5):
        print(f"sleeping {i}")
        await blocking_iteration()

async def poll_run():
    future = asyncio.ensure_future(long_running())
    while not future.done():
        print("before polling")
        await asyncio.sleep(0.05)
        print("polling")
    future.result()

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(poll_run())
    loop.close()
The result of this is:
before polling
sleeping 0
sleeping 1
sleeping 2
sleeping 3
sleeping 4
polling
From my current understanding of Python's asyncio mechanism, I expected the loop to unblock after the first sleep, return control to the await statement in poll_run, and only run the second iteration of long_running after the subsequent poll.
So desired output is something like this:
before polling
sleeping 0
polling
before polling
sleeping 1
polling
before polling
sleeping 2
polling
before polling
sleeping 3
polling
before polling
sleeping 4
polling
Can this be achieved with the current approach somehow, or is it possible in a different way?
EDIT
Thanks to @drjackild I was able to solve it by changing
async def blocking_iteration():
    time.sleep(1)

into

def blocking():
    time.sleep(1)

async def blocking_iteration():
    loop = asyncio.get_event_loop()
    await loop.run_in_executor(None, blocking)
time is a synchronous library and blocks the whole main thread while executing. If you have such blocking calls in your program, you can avoid blocking with thread or process pool executors (you can read about them here). Or change blocking_iteration to use asyncio.sleep instead of time.sleep.
UPD. Just to make it clear, here is the non-blocking version, which uses loop.run_in_executor with the default executor. Please note that blocking_iteration is now without async:
import asyncio
import time

def blocking_iteration():
    time.sleep(1)

async def long_running():
    loop = asyncio.get_event_loop()
    for i in range(5):
        print(f"sleeping {i}")
        await loop.run_in_executor(None, blocking_iteration)

async def poll_run():
    task = asyncio.create_task(long_running())
    while not task.done():
        print("before polling")
        await asyncio.sleep(0.05)
        print("polling")
    print(task.result())

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(poll_run())
    loop.close()

Python asyncio two tasks and only one is running

I am new to Python and struggling to understand why my coroutine is not working.
In the current code, only one job runs and the other always stays idle. Why?
import asyncio

class Worker:
    def job1_sync(self):
        count = 0
        while True:
            print('JOB A:', count)
            count = count + 1

    def job2_sync(self):
        count = 0
        while True:
            print('JOB B:', count)
            count = count + 1

    async def job1(self):
        await self.job1_sync()

    async def job2(self):
        await self.job2_sync()

worker = Worker()
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(asyncio.gather(worker.job1(), worker.job2()))
Asyncio does not do multitasking or multithreading. What it does is schedule tasks within one thread, using a cooperative model.
That is, the event loop only runs again when the current task awaits something that would "block", and only then does it schedule another task. Under the hood, async functions are coroutines, and calls to await make the coroutine yield to the event loop, which resumes it at a later point, when the awaited condition arises.
Here you never await anything, so job1 never relinquishes control, and the event loop never gets a chance to distribute computing power to other tasks.
Now, if your job actually relinquished control, say by awaiting a delay, then your code would work:
async def job1_sync(self):  # note the async: only async functions can await
    count = 0
    while True:
        print('JOB A:', count)
        count = count + 1
        await asyncio.sleep(1)  # the main event loop gets control here
TL;DR: asyncio is useful for what it says: doing things asynchronously, allowing other tasks to make progress while the current task waits for something. Nothing runs in parallel.
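The scheduling point can be made visible with a tiny sketch: awaiting even asyncio.sleep(0) suspends the current coroutine just long enough for the event loop to run the other task, so the two jobs interleave. The job names and counts here are illustrative, and results are collected in a list instead of printed:

```python
import asyncio

order = []

async def job(name, n):
    for i in range(n):
        order.append((name, i))
        # sleep(0) yields to the event loop, which then schedules
        # the other pending task before resuming this one.
        await asyncio.sleep(0)

async def main():
    await asyncio.gather(job("A", 3), job("B", 3))

asyncio.run(main())
print(order)
```

Remove the await and job "A" would run all its iterations before "B" ever starts, which is exactly the problem in the question above.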
