Run a function every n seconds in Python with asyncio

I have an application that already runs indefinitely with asyncio's run_forever event loop, and I also need to run a specific function every 10 seconds.
import asyncio

def do_something():
    pass

a = asyncio.get_event_loop()
a.run_forever()
I would like to call the function do_something every 10 seconds. How can I achieve this without replacing the asyncio event loop with a while loop?
Edit:
I can achieve this with the code below:
import time

def do_something():
    pass

while True:
    time.sleep(10)
    do_something()
But I don't want a while loop running indefinitely in my application; I would rather stick with asyncio's run_forever(). So how do I call the same function every 10 seconds with asyncio? Is there a scheduler of some kind that will not block my ongoing work?

asyncio does not ship with a built-in scheduler, but it is easy enough to build your own: simply combine a while loop with asyncio.sleep to run code every few seconds.
import asyncio

async def every(__seconds: float, func, *args, **kwargs):
    while True:
        func(*args, **kwargs)
        await asyncio.sleep(__seconds)

a = asyncio.get_event_loop()
a.create_task(every(1, print, "Hello World"))
...
a.run_forever()
Note that the design has to be slightly different if func is itself a coroutine or a long-running blocking subroutine. In the former case, await func(...); in the latter, use asyncio's thread support (e.g. loop.run_in_executor) so the call does not block the event loop.
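For illustration, here is a minimal sketch of those two variants (the names every_async and every_blocking are illustrative, not part of the answer above):
import asyncio

async def every_async(seconds: float, coro_func, *args, **kwargs):
    # coro_func is an async def; await it on each iteration
    while True:
        await coro_func(*args, **kwargs)
        await asyncio.sleep(seconds)

async def every_blocking(seconds: float, func, *args, **kwargs):
    # func is a long-running blocking callable; run it in the default
    # thread pool so it does not stall the event loop
    loop = asyncio.get_running_loop()
    while True:
        await loop.run_in_executor(None, lambda: func(*args, **kwargs))
        await asyncio.sleep(seconds)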

You can achieve it with
import asyncio

async def do_something():
    while True:
        await asyncio.sleep(10)
        # ...rest of code...

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    # note: run_until_complete() never returns here, because do_something() loops forever
    loop.run_until_complete(do_something())
    loop.run_forever()
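If you want run_forever() to keep servicing other work as well, a minimal sketch of the alternative is to schedule the periodic coroutine as a task instead of blocking on it (same do_something as above):
import asyncio

async def do_something():
    while True:
        await asyncio.sleep(10)
        # ... periodic work here ...

loop = asyncio.get_event_loop()
task = loop.create_task(do_something())  # scheduled, but does not block
# ... set up the rest of the application here ...
loop.run_forever()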

Python create both sync and async function in same main

I need two functions in my Python 3.11 code.
One function must be sync: it retrieves some data from a local machine, so I need to wait for it to finish.
The other function must be async: it takes the data from the first function and sends it to the server. Since I don't know how long that can take (5 to 30 seconds), this function must not interrupt the first one.
In practice, the second function always starts when the first one finishes, but the first one always starts again and does not care about the second. This code runs 24/7.
My attempt:
import time
import asyncio

async def task1():
    print("Recover data... waiting")
    time.sleep(3)
    print("End data recover")
    return "slow"

async def task2(p):
    print("I'm so" + p)
    time.sleep(10)
    print("END--->")

async def main():
    while True:
        print("create task1 and wait to finish")
        x = await task1()
        print("create task2 and not wait to finishing")
        asyncio.create_task(task2(x))

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(main())
loop.run_forever()
I don't need asyncio as a hard requirement; I just want to meet the goal without running the machine out of memory. Thanks
Basically, yes: that is the way you do it. If you need some task to complete before going on with code in a given place, the thing to do is to wait for that data; if it is in an async func, then you use the await keyword, just as depicted in the code above.
If there are other tasks running in parallel with main, then it would be nice if the code in task1 returned execution to the asyncio loop while it waits for its result. That blocking code could run in another thread via await loop.run_in_executor(None, sync_call, *args), or the time.sleep calls could simply become await asyncio.sleep(...).
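A minimal sketch of that idea, assuming time.sleep(3) stands in for the real blocking data retrieval (retrieve_data is a hypothetical helper; the other names come from the question):
import asyncio
import time

def retrieve_data():
    # blocking work (stands in for reading from the local machine)
    time.sleep(3)
    return "slow"

async def task1():
    loop = asyncio.get_running_loop()
    print("Recover data... waiting")
    # run the blocking call in the default thread pool so the event
    # loop stays free to keep running task2 in the background
    result = await loop.run_in_executor(None, retrieve_data)
    print("End data recover")
    return result

async def task2(p):
    print("I'm so " + p)
    await asyncio.sleep(10)  # non-blocking stand-in for the upload
    print("END--->")

async def main():
    while True:
        x = await task1()              # wait for the sync-style work
        asyncio.create_task(task2(x))  # fire and forget the upload

asyncio.run(main())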

How to call an async function from a sync function and get the result, while a loop is already running

I have an asyncio loop already running, and from a coroutine I'm calling a sync function. Is there any way to call an async function from that sync function and get its result?
I tried the code below; it is not working.
I want to print the output of hel() inside i() without changing i() into an async function.
Is it possible, and if yes, how?
import asyncio

async def hel():
    return 4

def i():
    loop = asyncio.get_running_loop()
    x = asyncio.run_coroutine_threadsafe(hel(), loop)  ## need to change
    y = x.result()                                     ## this lines
    print(y)

async def h():
    i()

asyncio.run(h())
This is one of the most commonly asked types of question here. The tools to do this are in the standard library and require only a few lines of setup code. However, the result is not 100% robust and needs to be used with care. This is probably why it's not already a high-level function.
The basic problem with running an async function from a sync function is that async functions contain await expressions. Await expressions pause the execution of the current task and allow the event loop to run other tasks. Therefore async functions (coroutines) have special properties that allow them to yield control and resume again where they left off. Sync functions cannot do this. So when your sync function calls an async function and that function encounters an await expression, what is supposed to happen? The sync function has no ability to yield and resume.
A simple solution is to run the async function in another thread, with its own event loop. The calling thread blocks until the result is available. The async function behaves like a normal function, returning a value. The downside is that the async function now runs in another thread, which can cause all the well-known problems that come with threaded programming. For many cases this may not be an issue.
This can be set up as follows. This is a complete script that can be imported anywhere in an application. The test code that runs in the if __name__ == "__main__" block is almost the same as the code in the original question.
The thread is lazily initialized so it doesn't get created until it's used. It's a daemon thread so it will not keep your program from exiting.
The solution doesn't care if there is a running event loop in the main thread.
import asyncio
import threading

_loop = asyncio.new_event_loop()
_thr = threading.Thread(target=_loop.run_forever, name="Async Runner",
                        daemon=True)

# This will block the calling thread until the coroutine is finished.
# Any exception that occurs in the coroutine is raised in the caller.
def run_async(coro):  # coro is a coroutine, see example
    if not _thr.is_alive():
        _thr.start()
    future = asyncio.run_coroutine_threadsafe(coro, _loop)
    return future.result()

if __name__ == "__main__":
    async def hel():
        await asyncio.sleep(0.1)
        print("Running in thread", threading.current_thread())
        return 4

    def i():
        y = run_async(hel())
        print("Answer", y, threading.current_thread())

    async def h():
        i()

    asyncio.run(h())
Output:
Running in thread <Thread(Async Runner, started daemon 28816)>
Answer 4 <_MainThread(MainThread, started 22100)>
In order to call an async function from a sync method you would normally use asyncio.run. However, asyncio.run is meant to be the single entry point of an async program, and asyncio refuses to start it while an event loop is already running in the same thread, so you can't do that here.
That being said, the project https://github.com/erdewit/nest_asyncio patches the asyncio event loop to allow exactly this, so after applying it you should be able to just call asyncio.run in your sync function.
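A minimal sketch of that approach, assuming nest_asyncio is installed (pip install nest_asyncio):
import asyncio
import nest_asyncio

nest_asyncio.apply()  # patch asyncio so the running loop can be re-entered

async def hel():
    return 4

def i():
    # normally forbidden inside a running loop; allowed after apply()
    y = asyncio.run(hel())
    print(y)

async def h():
    i()

asyncio.run(h())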

asyncio running from a synchronous (standard) function

Having read the documents and watched a number of videos, I am testing asyncio as an alternative to threading.
The docs are here:
https://docs.python.org/3/library/asyncio.html
I have constructed the following code, expecting it to produce this output:
before the sleep
hello
world
But in fact it produces this (world comes before hello):
before the sleep
world
hello
Here is the code:
import asyncio
import time

def main():
    ''' main entry point for the program '''
    # create the event loop and add to the loop
    # or run directly.
    asyncio.run(main_async())
    return

async def main_async():
    ''' the main async function '''
    await foo()
    await bar()
    return

async def foo():
    print('before the sleep')
    await asyncio.sleep(2)
    # time.sleep(0)
    print('world')
    return

async def bar():
    print('hello')
    await asyncio.sleep(0)
    return

if __name__ == '__main__':
    ''' This is executed when run from the command line '''
    main()
The main() function calls the async main_async() function, which in turn calls both the foo and bar async functions, and both of those run the await asyncio.sleep(x) command.
So my question is: why does hello world come out in the wrong (unexpected) order, given that I was expecting world to be printed approximately 2 seconds after hello?
You awaited foo() immediately, so bar() was never scheduled until foo() had run to completion; the execution of main_async will never do things after an await until the await has completed. If you want to schedule them both and let them interleave, replace:
await foo()
await bar()
with something like:
await asyncio.gather(foo(), bar())
which will convert both awaitables to tasks, scheduling both on the running asyncio event loop, then wait for both tasks to run to completion. With both scheduled at once, when one blocks on an await (and only await-based blocks, because only await yields control back to the event loop), the other will be allowed to run (and control can only return to the other task when the now running task awaits or finishes).
Basically, you have to remember that asyncio is cooperative multitasking. If you're only executing one task, and that task performs an await, there is nothing else to schedule, so nothing else runs until that await completes. If you block by any means other than an await, you still hold the event loop, and nothing else will get a chance to run, even if it's ready to go. So to gain any benefit from asyncio you need to be careful to:
Ensure other tasks are launched in time to occupy the event loop while the original task(s) are blocking on await.
Ensure you only block via await, so you don't monopolize the event loop unnecessarily.
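Putting that together, here is a minimal corrected sketch using the same foo and bar as in the question:
import asyncio

async def foo():
    print('before the sleep')
    await asyncio.sleep(2)  # yields to the event loop, so bar() can run
    print('world')

async def bar():
    print('hello')
    await asyncio.sleep(0)

async def main_async():
    # schedule both coroutines as tasks and wait for both to finish
    await asyncio.gather(foo(), bar())

asyncio.run(main_async())
# prints: before the sleep / hello / world (world about 2 seconds later)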

Is there a way to make a loop that runs in the background while other code continues to run in python?

I'm really new to programming. I was wondering if there is a way to run a while loop in the background of code that is already running in Python?
I was thinking of something like
while True:
    print("gibberish")
print("pass")
with an output of something like:
'gibberish
gibberish
pass
gibberish.....'
(It doesn't have to be in this order as long as I get a similar result)
You can use either multiprocessing or threading:
import threading

def background_code():
    while some_condition:
        print("gibberish")
        ...

thread = threading.Thread(target=background_code, args=(), kwargs={})
thread.start()

print("pass")
...
Both multiprocessing and threading have very similar APIs, and which one to use depends on your use case - the distinction between processes and threads is not one for this question. You're probably going to want threading for what you're currently working on, but there are different situations in which you'd prefer one or the other.
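For comparison, a minimal multiprocessing sketch of the same idea (an illustration only; the __main__ guard matters here because child processes may be spawned rather than forked):
import multiprocessing
import time

def background_code():
    while True:
        print("gibberish")
        time.sleep(0.5)

if __name__ == "__main__":
    proc = multiprocessing.Process(target=background_code, daemon=True)
    proc.start()      # runs in a separate process
    print("pass")
    time.sleep(2)     # the main process keeps doing its own work
    proc.terminate()  # daemon processes are also killed at interpreter exit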
You can refer to the following code.
import threading

def func1():
    for i in range(10):
        print("gibberish")

def func2():
    print("pass")

t1 = threading.Thread(target=func1)
t2 = threading.Thread(target=func2)

if __name__ == '__main__':
    t1.start()
    t2.start()
What this does is run func1 and func2 concurrently, so that each provided method runs as a background task for the other.
Here is something similar using asyncio (requires python 3.7+):
import asyncio

async def loop():
    while True:
        print("gibberish")
        await asyncio.sleep(0.5)

async def main():
    future = asyncio.ensure_future(loop())
    for i in range(100):
        print("pass")
        await asyncio.sleep(1)
    future.cancel()

asyncio.get_event_loop().run_until_complete(main())
This will print two gibberish lines for each pass. You can change the sleep timings to change the ratio.
Here, main and loop are coroutines, where only one is executed at a time. The await ... calls are points where execution is potentially yielded to other coroutines.
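On Python 3.7+ the same answer can be written with asyncio.create_task and asyncio.run, which create and close the event loop for you; a lightly modernised sketch of the block above:
import asyncio

async def loop():
    while True:
        print("gibberish")
        await asyncio.sleep(0.5)

async def main():
    task = asyncio.create_task(loop())  # preferred over ensure_future inside a coroutine
    for i in range(100):
        print("pass")
        await asyncio.sleep(1)
    task.cancel()

asyncio.run(main())  # creates, runs, and closes the event loop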

How can I make a set of functions that can be used synchronously as well as asynchronously?

Imagine I have a set of functions like this:
import time

def func1():
    func2()

def func2():
    time.sleep(1)  # simulate I/O operation
    print('done')
I want these to be usable synchronously:
# this would take two seconds to complete
func1()
func1()
as well as asynchronously, for example like this:
# this would take 1 second to complete
future = asyncio.gather(func1.run_async(), func1.run_async())
loop = asyncio.get_event_loop()
loop.run_until_complete(future)
The problem is, of course, that func1 somehow has to propagate the "context" it's running in (synchronously vs. asynchronously) to func2.
I want to avoid writing an asynchronous variant of each of my functions because that would result in a lot of duplicate code:
def func1():
    func2()

def func2():
    time.sleep(1)  # simulate I/O operation
    print('done')

# duplicate code below...

async def func1_async():
    await func2_async()

async def func2_async():
    await asyncio.sleep(1)  # simulate I/O operation
    print('done')
Is there any way to do this without having to implement an asynchronous copy of all my functions?
Here's my "not-an-answer-answer," which I know that Stack Overflow loves...
Is there any way to do this without having to implement an asynchronous copy of all my functions?
I don't think that there is. Making a "blanket translator" to convert functions to native coroutines seems next-to-impossible. That's because making a synchronous function asynchronous is about more than throwing an async keyword in front of it and a couple of await statements within it. Keep in mind that anything that you await must be awaitable.
Your def func2(): time.sleep(1) illustrates that point. Synchronous functions will make blocking calls, such as time.sleep(); asynchronous (native coroutines) will await non-blocking coroutines. Making this function asynchronous, as you point out, requires not just using async def func(), but awaiting asyncio.sleep(). Now let's say instead of time.sleep(), you're calling a more complex, blocking function. You build some sort of fancy decorator that slaps a function attribute called run_async, which is a callable, onto the decorated function. But how does that decorator know how to "translate" the blocking calls within func2() into their coroutine equivalents, if those are even defined? I can't think of any magic that would be smart enough to convert all of the calls in a synchronous function to their awaitable counterparts.
In your comments, you mention that this is for HTTP requests. For a real-world example, consider the differences in call signatures and APIs between the requests and aiohttp packages. In aiohttp, .text() is an instance method; in requests, .text is a property. How could you build something smart enough to know differences such as that?
I don't mean to be discouraging, but I think that using threading would be more realistic.
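One thread-based compromise (a sketch, not something proposed in the question itself): keep the functions synchronous and wrap them with asyncio.to_thread (Python 3.9+) at the call site when you need them in an async context:
import asyncio
import time

def func2():
    time.sleep(1)  # blocking I/O stand-in
    print('done')

def func1():
    func2()

async def main():
    # each call runs in the default thread pool, so the two overlap
    # and finish in roughly 1 second instead of 2
    await asyncio.gather(asyncio.to_thread(func1), asyncio.to_thread(func1))

func1()              # plain synchronous use
asyncio.run(main())  # asynchronous use without rewriting func1/func2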
So I found a way to achieve this, but since this is literally the first time I've done anything with async I can't guarantee that this doesn't have any bugs or that it's not a terrible idea.
The concept is actually pretty simple: Define your functions like normal asynchronous functions using async def and await where necessary, and then add a wrapper around them that automatically awaits the function if no event loop is running. Proof of concept:
import asyncio
import functools
import time

class Hybrid:
    def __init__(self, func):
        self._func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        coro = self._func(*args, **kwargs)
        loop = asyncio.get_event_loop()
        if loop.is_running():
            # if the loop is running, we must've been called from a
            # coroutine - so we'll return a future
            return loop.create_task(coro)
        else:
            # if the loop isn't running, we must've been called synchronously,
            # so we'll start the loop and let it execute the coroutine
            return loop.run_until_complete(coro)

    def run_async(self, *args, **kwargs):
        return self._func(*args, **kwargs)

@Hybrid
async def func1():
    await func2()

@Hybrid
async def func2():
    await asyncio.sleep(0.1)

def twice_sync():
    func1()
    func1()

def twice_async():
    future = asyncio.gather(func1.run_async(), func1.run_async())
    loop = asyncio.get_event_loop()
    loop.run_until_complete(future)

for func in [twice_sync, twice_async]:
    start = time.time()
    func()
    end = time.time()
    print('{:>11}: {} sec'.format(func.__name__, end - start))

# output:
# twice_sync: 0.20142340660095215 sec
# twice_async: 0.10088586807250977 sec
However, this approach does have its limitations. If you have a synchronous function calling a hybrid function, calling the synchronous function from an asynchronous function will change its behavior:
@Hybrid
async def hybrid_function():
    return "Success!"

def sync_function():
    print('hybrid returned:', hybrid_function())

async def async_function():
    sync_function()

sync_function()  # this prints "Success!" as expected

loop = asyncio.get_event_loop()
loop.run_until_complete(async_function())  # but this prints a coroutine
Take care to account for this!
