I'm trying to build a polling mechanism for a long-running task in Python. To do this, I'm using a concurrent Future and polling with .done(). The task consists of many iterations that are themselves blocking, which I wrapped in an async function. I don't have access to the code of the blocking functions, as I'm calling third-party software. This is a minimal example of my current approach:
import asyncio
import time

async def blocking_iteration():
    time.sleep(1)

async def long_running():
    for i in range(5):
        print(f"sleeping {i}")
        await blocking_iteration()

async def poll_run():
    future = asyncio.ensure_future(long_running())
    while not future.done():
        print("before polling")
        await asyncio.sleep(0.05)
        print("polling")
    future.result()

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(poll_run())
    loop.close()
The result of this is:
before polling
sleeping 0
sleeping 1
sleeping 2
sleeping 3
sleeping 4
polling
From my current understanding of the asyncio mechanism in Python, I had expected the first sleep to unblock and return control to the event loop, which would go back to the await statement in poll_run and only run the second iteration of long_running after the subsequent poll.
So desired output is something like this:
before polling
sleeping 0
polling
before polling
sleeping 1
polling
before polling
sleeping 2
polling
before polling
sleeping 3
polling
before polling
sleeping 4
polling
Can this be achieved with the current approach somehow, or is it possible in a different way?
EDIT
Thanks to @drjackild I was able to solve it by changing
async def blocking_iteration():
    time.sleep(1)

into

def blocking():
    time.sleep(1)

async def blocking_iteration():
    loop = asyncio.get_event_loop()
    await loop.run_in_executor(None, blocking)
time is a synchronous library and blocks the whole main thread while executing. If you have such blocking calls in your program, you can avoid blocking with thread or process pool executors (you can read about them here). Or change your blocking_iteration to use asyncio.sleep instead of time.sleep.
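To illustrate the asyncio.sleep alternative mentioned above, here is a minimal sketch (my own shortened version, with 3 iterations and smaller delays, and events collected in a list instead of only printed): because asyncio.sleep yields to the event loop, the poller now gets control between iterations.

```python
import asyncio

log = []

async def blocking_iteration():
    await asyncio.sleep(0.1)  # yields to the event loop instead of blocking it

async def long_running():
    for i in range(3):
        log.append(f"sleeping {i}")
        await blocking_iteration()

async def poll_run():
    task = asyncio.ensure_future(long_running())
    while not task.done():
        log.append("before polling")
        await asyncio.sleep(0.05)
        log.append("polling")

asyncio.run(poll_run())
print(log)
```

With the sleeps interleaved, "polling" entries now appear between the "sleeping" entries instead of all at the end.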
UPD. Just to make it clear, here is the non-blocking version, which uses loop.run_in_executor with the default executor. Please pay attention that blocking_iteration is now defined without async.
import asyncio
import concurrent.futures
import time

def blocking_iteration():
    time.sleep(1)

async def long_running():
    loop = asyncio.get_event_loop()
    for i in range(5):
        print(f"sleeping {i}")
        await loop.run_in_executor(None, blocking_iteration)

async def poll_run():
    task = asyncio.create_task(long_running())
    while not task.done():
        print("before polling")
        await asyncio.sleep(0.05)
        print("polling")
    print(task.result())

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(poll_run())
    loop.close()
Related
Is there a way to call an async function from a sync one without waiting for it to complete?
My current tests:
Issue: Waits for test_timer_function to complete
async def test_timer_function():
    await asyncio.sleep(10)
    return

def main():
    print("Starting timer at {}".format(datetime.now()))
    asyncio.run(test_timer_function())
    print("Ending timer at {}".format(datetime.now()))
Issue: Does not call test_timer_function
async def test_timer_function():
    await asyncio.sleep(10)
    return

def main():
    print("Starting timer at {}".format(datetime.now()))
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    asyncio.ensure_future(test_timer_function())
    print("Ending timer at {}".format(datetime.now()))
Any suggestions?
Async functions really do not run in the background: they always run in a single thread.
That means that when there are parallel tasks in async code (normal async code), those tasks only execute when you give the asyncio loop a chance to run - this happens when your code uses await, calls one of async for or async with, or returns from a coroutine function that is running as a task.
In non-async code, you have to enter the loop and pass control to it in order for the async code to run - that is what asyncio.run does and asyncio.ensure_future does not: ensure_future just registers a task to be executed whenever the asyncio loop has time for it. But you return from the function without ever passing control to the async loop, so your program just finishes.
One thing that can be done is to start a secondary thread where the asyncio code will run: this thread runs its own asyncio loop, and you can communicate with tasks in it by using global variables and normal thread data structures like Queues.
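As a sketch of that Queue-based communication (the names here are my own, not from the original post): the asyncio loop runs in a worker thread, and a thread-safe queue.Queue carries the result back to the main thread.

```python
import asyncio
import queue
import threading

results = queue.Queue()  # thread-safe channel back to the main thread

async def timer_task(q):
    await asyncio.sleep(0.1)
    q.put("timer finished")

def run_async_loop_in_thread(q):
    # This thread runs its own asyncio loop to completion.
    asyncio.run(timer_task(q))

t = threading.Thread(target=run_async_loop_in_thread, args=(results,))
t.start()
print("main thread keeps going")  # printed before the timer finishes
t.join()
msg = results.get()
print(msg)
```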
The minimal changes for your code are:
import asyncio
import threading
from datetime import datetime

now = datetime.now

async def test_timer_function():
    await asyncio.sleep(2)
    print(f"ending async task at {now()}")
    return

def run_async_loop_in_thread():
    asyncio.run(test_timer_function())

def main():
    print(f"Starting timer at {now()}")
    t = threading.Thread(target=run_async_loop_in_thread)
    t.start()
    print(f"Ending timer at {now()}")
    return t

if __name__ == "__main__":
    t = main()
    t.join()
    print(f"asyncio thread exited normally at {now()}")
(Please, when posting Python code, include the import lines and the lines that call your functions so your code actually runs: it is not a lot of boilerplate, as may be needed in other languages, and it turns your snippets into complete, ready-to-run examples.)
printout when running this snippet at the console:
Starting timer at 2022-10-20 16:47:45.211654
Ending timer at 2022-10-20 16:47:45.212630
ending async task at 2022-10-20 16:47:47.213464
asyncio thread exited normally at 2022-10-20 16:47:47.215417
The answer is simply no. It's not gonna happen in a single thread.
First issue:
In your first issue, main() is a sync function. It stops at the line asyncio.run(test_timer_function()) until the event loop finishes its work.
What is its only task? test_timer_function! This task does give control back to the event loop, but not to the caller main! So if the event loop had other tasks too, they would cooperate with each other - but only among the tasks of the event loop, not between the event loop and the caller.
So it will wait 10 seconds. There is no one else here to use those 10 seconds to do their work.
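A small sketch of that distinction (my own example, with a shorter sleep): inside the loop, two tasks do share the waiting time, but the synchronous caller still blocks until asyncio.run returns.

```python
import asyncio

async def test_timer_function():
    await asyncio.sleep(0.1)
    return "timer done"

async def other_work():
    # This can run during the timer's sleep, because both are loop tasks.
    return "other work done"

async def run_both():
    return await asyncio.gather(test_timer_function(), other_work())

results = asyncio.run(run_both())  # the sync caller waits here for the whole loop
print(results)
```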
Second issue:
You didn't even run the event loop. Check the documentation for ensure_future.
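A minimal sketch of what is missing (my own code, not from the question): the future scheduled by ensure_future only executes once the loop itself runs, for example via run_until_complete.

```python
import asyncio

async def test_timer_function():
    await asyncio.sleep(0.1)
    return "done"

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
future = asyncio.ensure_future(test_timer_function(), loop=loop)
# Nothing has run yet; the task is only scheduled.
result = loop.run_until_complete(future)  # now the loop actually runs the task
loop.close()
print(result)
```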
Using asyncio is it possible to create a task, then continue with the "main" code execution and await for the task results later on?
Consider the following code
from functools import reduce
import asyncio

async def a_task():
    print('a_task(): before sleep')
    # waiting for something to happen
    await asyncio.sleep(30)
    print('a_task(): after sleep')
    return 42

async def main():
    # Create a Task
    print('main() before create_task')
    task = asyncio.create_task(a_task())
    print('main() task created')
    print('Doing stuff here between task creation and await')
    # Computing 200000! should take a few seconds...
    # use a smaller number if it's too slow on your machine
    x = reduce(lambda a, b: a * b, range(1, 200000))
    print('Stuff done')
    print('main() awaiting task')
    task_result = await task
    print('main() task awaited')
    return task_result

#%%
if __name__ == '__main__':
    results = asyncio.run(main())
    print(results)
This returns
main() before create_task
main() task created
Doing stuff here between task creation and await
Stuff done
main() awaiting task
a_task(): before sleep <<---- Task only starts running here!!
a_task(): after sleep
main() task awaited
42
a_task is created before we start computing 200000!, but it is executed only when we call await task. Is it possible to make a_task start running before we compute 200000!, and keep it running in the background?
I read the docs, this, this, etc... they all mention that tasks are what should be used to execute code in the background, but I can't understand how to run them without hanging the main code.
I believe the problem here is the following: creating a task with create_task does not schedule it for immediate execution. It needs an await (or something similar) to trigger the event loop's switch to something different. In your current code, the creation of the task is followed by synchronous code, so the event loop cannot suspend the execution of main and start running the task. One way to make it behave the way you expect is to put await asyncio.sleep(0) before the evaluation of the factorial: when main() hits that await, the event loop suspends main and switches to a_task.
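Here is a sketch of the await asyncio.sleep(0) trick (my own ordering demo, with a trivial computation standing in for the factorial and events recorded in a list):

```python
import asyncio

order = []

async def a_task():
    order.append("task started")
    await asyncio.sleep(0)
    order.append("task resumed")

async def main():
    asyncio.create_task(a_task())
    await asyncio.sleep(0)   # yield once: the event loop starts a_task
    order.append("sync work begins")
    sum(range(10_000))       # stand-in for the factorial computation
    order.append("sync work done")
    await asyncio.sleep(0)   # yield again so a_task can finish

asyncio.run(main())
print(order)
```

The task now starts before the synchronous work begins, but can only resume once main yields to the loop again.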
Another approach that may be interesting for you is asyncio.gather (docs link), which schedules multiple async tasks and then suspends the execution of main until all of the tasks complete.
from functools import reduce
import asyncio

async def a_task():
    print("a_task(): before sleep")
    # waiting for something to happen
    await asyncio.sleep(30)
    print("a_task(): after sleep")
    return 42

async def doing_something_else():
    print("We start doing something else")
    x = reduce(lambda a, b: a * b, range(1, 200000))
    print("We finish doing something else")

async def main():
    # Create a Task
    print("main() before create_task")
    task = asyncio.create_task(a_task())
    print("main() task created")
    print("main() before create_task 2")
    task_2 = asyncio.create_task(doing_something_else())
    print("main() task 2 created")
    print("main() gathering tasks")
    # task_result = await task
    await asyncio.gather(task, task_2)
    print("main() tasks finished")
    # return task_result

#%%
if __name__ == "__main__":
    results = asyncio.run(main())
    print(results)
A program I am developing has a long-running process in another thread. I would like to interrupt that thread in the event something goes awry.
Other SO posts I've seen, like this one, use syntax similar to this:
while True:
    if condition_here:
        break
    else:
        await asyncio.sleep(1)
which does work in catching KeyboardInterrupts. However, I'm not a big fan of using while loops like this and would like to avoid this if at all possible.
For some example code, here is what I currently have (which does not catch the interrupts until after the thread is done):
import asyncio
import time
from threading import Thread

def some_long_process():
    time.sleep(60)

async def main():
    thread = Thread(target=some_long_process)
    thread.start()
    # Doesn't work
    loop = asyncio.get_event_loop()
    await loop.run_in_executor(None, thread.join)
    # await asyncio.wait([loop.run_in_executor(None, thread.join)])
    # await asyncio.wait_for(loop.run_in_executor(None, thread.join), None)
    # await asyncio.gather(asyncio.to_thread(thread.join))
    # Works
    # while thread.is_alive():
    #     await asyncio.sleep(1)

if __name__ == '__main__':
    asyncio.run(main())
I'm also open to suggestions to reconsider my entire approach to the way this is designed if this isn't possible. Thanks for your time.
Goal: To cancel pending tasks safely across all threads. Then safely end all the asyncio loops across all threads.
Code explanation:
I opened two threads, one for the server to run and the other for background processing. Each thread has its own separate asyncio loop.
Desired Function:
When I receive a message called onClose from the client, I want to immediately and safely shut down all processes across all the threads. The desired function is in func_websocket_connection(), after print('Running On Close Function').
Techniques Tried:
Of course I tried os._exit(0) to abruptly stop everything. It accomplishes what I want, but I also know it is not safe and can corrupt data being processed. I also tried
print('Running On Close Function')
loop = asyncio.get_event_loop()
loop = loop.stop()
which also works but I get Task was destroyed but it is pending!
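One common way to avoid that warning (a sketch of the general pattern, not code from this post: the names and the list-based logging are my own) is to cancel the pending tasks and wait for them to finish before stopping the loop:

```python
import asyncio

events = []

async def worker():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        events.append("worker cancelled cleanly")
        raise

async def shutdown(loop):
    # Cancel every task except this shutdown task itself, then wait
    # for the cancellations to finish before stopping the loop.
    tasks = [t for t in asyncio.all_tasks(loop) if t is not asyncio.current_task()]
    for t in tasks:
        t.cancel()
    await asyncio.gather(*tasks, return_exceptions=True)
    loop.stop()

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(worker())
loop.create_task(shutdown(loop))
loop.run_forever()
loop.close()
print(events)
```

Because every task has finished or been awaited after cancellation, no "Task was destroyed but it is pending!" warning is emitted.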
Non-Server Code:
import asyncio
import websockets
from threading import Thread
from time import sleep

#=============================================
# Open Websocket Server:
def func_run_server():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    asyncio.ensure_future(func_websocket_connection())
    loop.run_forever()

# Background Processing Function 1:
async def func_websocket_connection():
    for i in range(100):
        await asyncio.sleep(0.5)
        print('Run Step Server 0')
        # Some If Statement
        if i == 10:
            print('Running On Close Function')

#=============================================
# Open Background Processing:
def func_run_background():
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    future_run_step0 = asyncio.ensure_future(func_run_step0())
    future_run_step1 = asyncio.ensure_future(func_run_step1())
    loop.run_until_complete(future_run_step0)
    loop.run_until_complete(future_run_step1)

# Background Processing Function 1:
async def func_run_step0():
    await asyncio.sleep(5.0)
    print('Run Step 0')

# Background Processing Function 2:
async def func_run_step1():
    await asyncio.sleep(5.0)
    print('Run Step 1')

#================================================================================
# Running two separate threads
Thread(target=func_run_server).start()
Thread(target=func_run_background).start()
I am new to Python and struggling to understand why my coroutine is not working.
In the current code, only one job runs and the other always stays idle. Why?
import asyncio

class Worker:
    def job1_sync(self):
        count = 0
        while True:
            print('JOB A:', count)
            count = count + 1

    def job2_sync(self):
        count = 0
        while True:
            print('JOB B:', count)
            count = count + 1

    async def job1(self):
        await self.job1_sync()

    async def job2(self):
        await self.job2_sync()

worker = Worker()
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.run_until_complete(asyncio.gather(worker.job1(), worker.job2()))
Asyncio does not do preemptive multitasking or multithreading. What it does is schedule tasks within one thread, using a cooperative model.
That is, the event loop only runs again when the current task awaits something that will "block", and only then does it schedule another task. Under the hood, async functions are coroutines, and calls to await make the coroutine yield to the event loop, which resumes it at a later point, when the awaited condition arises.
Here you never await anything, so job1 never relinquishes control, and the event loop never has a chance to distribute computing power to other tasks.
Now if your job was to actually relinquish control, say by triggering a delay, then your code would work:
async def job1_sync(self):  # note the async: only async functions can await
    count = 0
    while True:
        print('JOB A:', count)
        count = count + 1
        await asyncio.sleep(1)  # main event loop gets control
TLDR: asyncio is useful for what it says: doing stuff asynchronously, allowing other tasks to make progress while the current task waits for something. Nothing runs in parallel.
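To see the cooperation concretely, here is a bounded sketch (my own variation: three iterations instead of an infinite loop, and output collected in a list): each await asyncio.sleep(0) hands control to the other job, so the two interleave.

```python
import asyncio

out = []

async def job(name):
    for count in range(3):
        out.append(f"JOB {name}: {count}")
        await asyncio.sleep(0)  # yield so the other job can run

async def main():
    await asyncio.gather(job("A"), job("B"))

asyncio.run(main())
print(out)
```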