In one of my projects, I need to run three different database updater functions at different intervals.
For instance, function one needs to run every 30 seconds, function two every 60 seconds, and function three every 5 minutes (notably due to API call restrictions).
I've been trying to achieve this in Python, looking up every possible solution, but I cannot seem to find anything that works for my use case. I am rather new to Python.
Here is (somewhat) what I have, using asyncio.
import asyncio

def updater1(url1, url2, time):
    print(f"Doing my thing here every {time} seconds")

def updater2(url1, url2, time):
    print(f"Doing my thing here every {time} seconds")

def updater3(url, time):
    print(f"Doing my thing here every {time} seconds")

async def func1():
    updater1(rankUrl, statsUrl, 30)
    await asyncio.sleep(30)

async def func2():
    updater2(rankUrl, statsUrl, 60)
    await asyncio.sleep(60)

async def func3():
    updater3(url, 300)
    await asyncio.sleep(300)

# Initiate async loops
while True:
    asyncio.run(func1())
    asyncio.run(func2())
    asyncio.run(func3())
The issue is that these tasks run one after another, while what I am trying to achieve is for them to run independently of each other, each starting when the script is initiated and repeating on its own interval.
Any idea on how this could be done is much appreciated - I am open to new concepts and ideas if you have any for me to explore :)
Don't use asyncio.run() on individual coroutines: asyncio.run() is itself not asynchronous, so the call won't return until the funcN() coroutine is done.
Create a single top-level coroutine that then runs others as tasks:
async def main():
    task1 = asyncio.create_task(func1())
    task2 = asyncio.create_task(func2())
    task3 = asyncio.create_task(func3())
    await asyncio.wait([task1, task2, task3])
The above kicks off three independent tasks, then waits for all 3 to complete.
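For completeness, here is a minimal runnable sketch of that pattern applied to the question's periodic updaters. The while True loops move inside each coroutine, and the URL arguments are placeholders since the real values are not shown in the question:

import asyncio

def updater1(url1, url2, interval):
    print(f"Doing my thing here every {interval} seconds")

def updater2(url1, url2, interval):
    print(f"Doing my thing here every {interval} seconds")

def updater3(url, interval):
    print(f"Doing my thing here every {interval} seconds")

async def func1():
    while True:
        updater1("rank-url", "stats-url", 30)   # placeholder URLs
        await asyncio.sleep(30)

async def func2():
    while True:
        updater2("rank-url", "stats-url", 60)   # placeholder URLs
        await asyncio.sleep(60)

async def func3():
    while True:
        updater3("some-url", 300)               # placeholder URL
        await asyncio.sleep(300)

async def main():
    # Each task runs its own infinite loop; they interleave whenever one awaits.
    task1 = asyncio.create_task(func1())
    task2 = asyncio.create_task(func2())
    task3 = asyncio.create_task(func3())
    await asyncio.wait([task1, task2, task3])

asyncio.run(main())

Note that asyncio.sleep(30) measures from the end of one update to the start of the next; if an update itself takes noticeable time, the effective period is slightly longer.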
Related
I need to have two functions in my Python 3.11 code.
One function must be sync: it retrieves some data from a local machine, so I need to wait for it to finish.
The other function must be async: it takes the data from the first function and sends it to the server. Since I don't know how long that can take (5 seconds to 30 seconds), this function must not interrupt the first one.
Practically, the second function always starts when the first finishes, but the first always starts again without caring about the second one. This code runs 24/7.
My attempt:
import time
import asyncio

async def task1():
    print("Recover data... waiting")
    time.sleep(3)
    print("End data recover")
    return "slow"

async def task2(p):
    print("I'm so" + p)
    time.sleep(10)
    print("END--->")

async def main():
    while True:
        print("create task1 and wait to finish")
        x = await task1()
        print("create task2 and not wait to finishing")
        asyncio.create_task(task2(x))

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
loop.create_task(main())
loop.run_forever()
I don't need to use asyncio as a requirement; I just want to meet the goal without running the machine out of memory. Thanks
Basically, "yes". As in: that is the way you do it. If you need some task to complete before going on with the code at a given place, the thing to do is to wait for that data; if you are in an async function, you use the await keyword, just as depicted in the code above.
If there are other tasks running in parallel with main, then it would be nice if the code in task1 returned execution to the asyncio loop while it waits for its result. That blocking code could run in another thread via await loop.run_in_executor(None, sync_call, *args), or the wait could simply be await asyncio.sleep(...).
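A minimal sketch of that idea applied to the question's code, assuming the time.sleep(3) stands in for the blocking local-machine work:

import asyncio
import time

def blocking_recover():
    # Stand-in for the blocking data recovery from the question.
    print("Recover data... waiting")
    time.sleep(3)
    print("End data recover")
    return "slow"

async def task1():
    loop = asyncio.get_running_loop()
    # Run the blocking call in the default thread pool so the event loop stays free.
    return await loop.run_in_executor(None, blocking_recover)

async def task2(p):
    print("I'm so " + p)
    await asyncio.sleep(10)   # non-blocking wait instead of time.sleep
    print("END--->")

async def main():
    while True:
        x = await task1()                # always wait for the slow recovery
        asyncio.create_task(task2(x))    # fire-and-forget the send step

asyncio.run(main())

Since each iteration is paced by task1, the loop does not spin; just be aware that fire-and-forget task2 calls can pile up if they regularly take longer than one task1 cycle.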
I have a function that makes a POST request with a lot of processing. All of that takes 30 seconds.
I need to execute this function every 6 minutes, so I used asyncio for that... But it's not asynchronous: my API is blocked until the end of the function. Later I will have processing that takes 5 minutes to execute.
def update_all():
    # do request and treatment (30 secs)
    ...

async def run_update_all():
    while True:
        await asyncio.sleep(6 * 60)
        update_all()

loop = asyncio.get_event_loop()
loop.create_task(run_update_all())
So, I don't understand why, during the execution of update_all(), all incoming requests are pending, waiting for the end of update_all(), instead of being handled asynchronously.
I found an answer with the help of larsks's hint.
I did this:
def update_all():
    # Do the synchronous post request and treatment that takes a long time
    ...

async def launch_async():
    loop = asyncio.get_event_loop()
    while True:
        await asyncio.sleep(120)
        loop.run_in_executor(None, update_all)

asyncio.create_task(launch_async())
With that code I'm able to launch a synchronous function every X seconds without blocking the main thread of FastAPI :D
I hope that will help other people in the same situation as me.
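For reference, a minimal sketch of how that pattern could be wired into a FastAPI app, assuming a 120-second interval as above; the startup hook and function names here are just one way to register it:

import asyncio
from fastapi import FastAPI

app = FastAPI()

def update_all():
    # Synchronous request and long treatment (stand-in for the real work).
    ...

async def launch_async():
    loop = asyncio.get_running_loop()
    while True:
        await asyncio.sleep(120)
        # Off-load the blocking call to the default thread pool so the
        # event loop keeps serving incoming requests in the meantime.
        loop.run_in_executor(None, update_all)

@app.on_event("startup")
async def start_background_updater():
    asyncio.create_task(launch_async())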
I have an API call which takes a long time to complete. So I want to split it into many smaller jobs, run them in parallel, and wait for the results before sending the response.
My snip code:
@app.post("/data")
async def get_tsp_events():
    queries = [foo, bar, foo, bar]
    tasks = [asyncio.create_task(do_something(param1, param2)) for query in queries]
    events = await asyncio.gather(*tasks)
    return events

async def do_something(arg1, arg2):
    log("Start time")
    # This takes a lot of time
    events = [event for event in range(10000000000)]
    log("End time")
    return events
As far as I can see, all the tasks run in sequence, just like normal code (as if asyncio.create_task() and asyncio.gather() were not used).
I'm new to Python and my questions are:
Am I wrong somewhere? Where?
Is there any solution that can help me?
Thank you all
In fact, async and await don't introduce real parallelism; they help when you perform genuinely asynchronous underlying operations, e.g. with aiohttp or aiofile. Even then, they only bring concurrency for IO-bound tasks.
If you want real parallelism for CPU-bound tasks, use multiprocessing.
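A minimal sketch of that suggestion using a process pool from the standard library, reusing the names from the question but with a smaller CPU-bound computation; everything else here is an assumption:

import asyncio
from concurrent.futures import ProcessPoolExecutor

def do_something(arg1, arg2):
    # CPU-bound work runs in a separate process, off the event loop.
    return sum(range(10_000_000))

async def get_tsp_events(queries):
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        tasks = [loop.run_in_executor(pool, do_something, query, None)
                 for query in queries]
        return await asyncio.gather(*tasks)

if __name__ == "__main__":
    # The __main__ guard matters: worker processes re-import this module.
    results = asyncio.run(get_tsp_events(["foo", "bar", "foo", "bar"]))
    print(len(results))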
I am pretty new to asyncio and I am using it with Discord.py to create a bot. Once a day I need to update a spreadsheet, but the spreadsheet has gotten a little long, so the update now triggers the loop's default timeout. Is there any way to overcome this? I have seen run_until_complete, but as you can see below there is an await asyncio.sleep(86400), which from my understanding will not work with run_until_complete because it would wait for a day. I would also be fine with just changing the timeout for that function and then changing it back after it completes, but I have not been able to find any resources on that.
Here is the function that needs to repeat every day:
async def updateSheet():
    while True:
        print("Updating Sheet at " + datetime.now().strftime("%H:%M"))
        user.updateAllUsers(os.getenv('CID'), os.getenv('CS'), subs)  # This is the function that takes too long
        print("Done Updating")
        await asyncio.sleep(86400)
and here is how I am adding it to the loop (because I am using Discord.py):
@client.event
async def on_ready():
    print('We have logged in as {0.user}'.format(client))
    client.loop.create_task(updateSheet())
Any and all help will be appreciated, since as long as this is down my project loses precious time. :)
If something is blocking, the direct approach would be to convert it into a task, which might not be possible in your case. So we would have to use something like APScheduler (APS) to schedule the jobs.
from apscheduler.scheduler import Scheduler  # APScheduler 2.x-style import assumed by this snippet

sched = Scheduler()
sched.start()

@sched.cron_schedule(day='mon-fri')
def task():
    user.updateAllUsers(os.getenv('CID'), os.getenv('CS'), subs)
Make sure you do this in a separate file, and use an async-compatible scheduler (for example AsyncIOScheduler in newer APScheduler versions) for the tasks.
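As a sketch of that newer-API variant, the same job could be registered on the bot's event loop with AsyncIOScheduler; user, subs, client, os and the environment variables are assumed to come from the question's code:

import os
from apscheduler.schedulers.asyncio import AsyncIOScheduler

def update_spreadsheet():
    # Synchronous jobs run in APScheduler's default thread pool,
    # so this long call does not block the Discord event loop.
    user.updateAllUsers(os.getenv('CID'), os.getenv('CS'), subs)

@client.event
async def on_ready():
    scheduler = AsyncIOScheduler()
    scheduler.add_job(update_spreadsheet, "interval", days=1)
    scheduler.start()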
You can simply measure how much time the function takes to execute and subtract it from 86400:
import time
import asyncio
from datetime import datetime

async def updateSheet():
    while True:
        start = time.monotonic()
        print("Updating Sheet at " + datetime.now().strftime("%H:%M"))
        user.updateAllUsers(os.getenv('CID'), os.getenv('CS'), subs)  # This is the function that takes too long
        end = time.monotonic()
        total = end - start
        sleep_time = 86400 - total
        await asyncio.sleep(sleep_time)
I really suggest that you run the blocking functions in a non-blocking way; refer to one of my previous answers for more info (What does "blocking" mean).
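A minimal sketch combining both ideas (run the blocking call in an executor and subtract the elapsed time), where user, subs, os and the environment variables are assumed to come from the question's surrounding code:

import asyncio
import time
from datetime import datetime

async def updateSheet():
    loop = asyncio.get_running_loop()
    while True:
        start = time.monotonic()
        print("Updating Sheet at " + datetime.now().strftime("%H:%M"))
        # Run the slow, blocking update in a worker thread so the bot's
        # event loop (heartbeats, commands) keeps running in the meantime.
        await loop.run_in_executor(
            None, user.updateAllUsers, os.getenv('CID'), os.getenv('CS'), subs
        )
        elapsed = time.monotonic() - start
        await asyncio.sleep(max(0, 86400 - elapsed))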
In Python, there are three main types of awaitable objects: coroutines, Tasks, and Futures.
I can await a coroutine, and also a task.
Awaiting a coroutine
import asyncio

async def nested():
    return 42

async def main():
    print(await nested())  # will print "42".

asyncio.run(main())
Awaiting a task
import asyncio

async def nested():
    return 42

async def main():
    task = asyncio.create_task(nested())
    await task

asyncio.run(main())
What is the value of wrapping the coroutine in a task in the first place? It looks like they do the same thing.
When would I need to use a task vs a coroutine?
A coroutine is just a function that runs in the context of the current awaitable. It can yield execution to the event loop on behalf of the caller (the one who calls await). Think of a function that is allowed to pause its thread. You can call one coroutine from another, but they still share the same thread.
A Task, on the other hand, immediately posts a separate job to the event loop. The task itself is a handle to that job. You may await a task, but it can run on its own just fine "in parallel": in a single-threaded context this means the task can run while other jobs are yielding (e.g. waiting for I/O). A task may even complete before you call await.
Example without tasks:
import asyncio

async def main():
    job_1 = asyncio.sleep(5)
    job_2 = asyncio.sleep(2)
    # will sleep for 5 seconds
    await job_1
    # will sleep for another 2 seconds
    await job_2

asyncio.run(main())
Example with tasks:
import asyncio

async def main():
    job_1 = asyncio.sleep(5)
    job_2 = asyncio.create_task(asyncio.sleep(2))
    # will sleep for 5 seconds
    await job_1
    # by this time, job_2 is complete, because the previous job
    # yielded at some point, allowing other jobs to run,
    # so this await takes no time
    await job_2

asyncio.run(main())
In this case there's no real difference: by awaiting the coroutine directly, it gets scheduled as part of the task that awaits it. However, that means it is driven by its parent.
By wrapping a coroutine in a task, it gets independently scheduled on the event loop, meaning it is not driven by the containing task anymore (it has its own lifecycle) and it can be interacted with more richly (e.g. cancelled or have callbacks added to it).
Think "function" versus "thread", really. A coroutine is just a function which can be suspended (if it awaits stuff), but it still only exists within the lexical and dynamic context of its caller. A task is freed from that context, it makes the wrapped coroutine live its own life in the same way a thread makes the wrapped function (target) live its own life.
Creating a Task schedules the passed coroutine to be run on an event loop. You can use the Task to cancel the underlying coroutine.
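A small sketch of that richer interaction (cancellation and done-callbacks), reusing the nested() coroutine from the question but with a longer sleep so there is something to cancel:

import asyncio

async def nested():
    await asyncio.sleep(10)
    return 42

async def main():
    task = asyncio.create_task(nested())
    task.add_done_callback(lambda t: print("task finished or was cancelled"))

    await asyncio.sleep(1)   # give the task a chance to start
    task.cancel()            # request cancellation of the underlying coroutine

    try:
        await task
    except asyncio.CancelledError:
        print("caught the cancellation")

asyncio.run(main())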