I would like to call an async function from purely synchronous code.
I would like to execute that async function in the background without blocking my program.
My idea is to use the threading module:
from threading import Thread
import asyncio

async def func1():
    ...

def func2():
    ...

if __name__ == '__main__':
    Thread(target=func1).start()
    Thread(target=func2).start()
Any idea?
Thanks in advance!
Since Python 3.7 there is asyncio.run.
Replace
Thread(target=func1).start()
with
Thread(target=asyncio.run, args=(func1(),)).start()
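Putting it together, a minimal sketch of the full program with that change applied:

from threading import Thread
import asyncio

async def func1():
    ...

def func2():
    ...

if __name__ == '__main__':
    # asyncio.run creates a fresh event loop in the new thread and runs func1 to completion
    Thread(target=asyncio.run, args=(func1(),)).start()
    Thread(target=func2).start()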
I want to call 4 methods at once so that they run in parallel in Python. These methods make HTTP calls and do some basic operations like verifying the response. I want to call them at once so that the total time taken is lower: say each method takes ~20 min to run, I want all 4 methods to return a response in 20 min and not 20*4 = 80 min.
It is important to note that the 4 methods I'm trying to run in parallel are async functions. When I tried using ThreadPoolExecutor to run the 4 methods in parallel I didn't see much difference in the time taken.
Example code, edited from @tomerar's comment below:
from concurrent.futures import ThreadPoolExecutor

async def foo_1():
    print("foo_1")

async def foo_2():
    print("foo_2")

async def foo_3():
    print("foo_3")

async def foo_4():
    print("foo_4")

with ThreadPoolExecutor() as executor:
    for foo in [await foo_1, await foo_2, await foo_3, await foo_4]:
        executor.submit(foo)
Looking for suggestions
You can use ThreadPoolExecutor from concurrent.futures:
from concurrent.futures import ThreadPoolExecutor

def foo_1():
    print("foo_1")

def foo_2():
    print("foo_2")

def foo_3():
    print("foo_3")

def foo_4():
    print("foo_4")

with ThreadPoolExecutor() as executor:
    for foo in [foo_1, foo_2, foo_3, foo_4]:
        executor.submit(foo)
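Since the methods in the original question are async functions, one variation (a sketch combining the executor with asyncio.run, not part of the answer above) is to submit each coroutine object together with asyncio.run, so every worker thread runs its own event loop:

from concurrent.futures import ThreadPoolExecutor
import asyncio

async def foo_1():
    print("foo_1")

async def foo_2():
    print("foo_2")

async def foo_3():
    print("foo_3")

async def foo_4():
    print("foo_4")

with ThreadPoolExecutor() as executor:
    # each submit call hands a coroutine object to asyncio.run inside a worker thread
    futures = [executor.submit(asyncio.run, coro()) for coro in (foo_1, foo_2, foo_3, foo_4)]
    for f in futures:
        f.result()  # wait for all four to finish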
You can use the multiprocessing module in Python. It's quite simple:
from multiprocessing import Pool
pool = Pool()
result1 = pool.apply_async(solve1, [A]) # evaluate "solve1(A)"
result2 = pool.apply_async(solve2, [B]) # evaluate "solve2(B)"
answer1 = result1.get(timeout=10)
answer2 = result2.get(timeout=10)
You can see the full details in the multiprocessing documentation.
I am trying to use asyncio together with threading for a Discord bot. I found this script, which I changed to fit my needs:
import time
import threading as th
import asyncio
import discord

class discordvars(object):
    client = discord.Client()
    TOKEN = ('---')
    running_discordthread = False

    discordloop = asyncio.get_event_loop()
    discordloop.create_task(client.start(TOKEN))
    discordthread = th.Thread(target=discordloop.run_forever)

def start():
    if discordvars.running_discordthread == False:
        discordvars.discordthread.start()
        print("Discord-Client started...")
        discordvars.running_discordthread = True
    else:
        print("Discord-Client already running...")
        time.sleep(2)

def stop():
    if discordvars.running_discordthread == True:
        discordvars.discordloop.call_soon_threadsafe(discordvars.discordloop.stop())
        print("Requested Discord-Client stop!")
        discordvars.discordthread.join()
        print(discordvars.discordthread.isAlive())
        time.sleep(1)
        print("Discord-Client stopped...")
        discordvars.running_discordthread = False
    else:
        print("Discord-Client not running...")
        time.sleep(2)

@discordvars.client.event
async def on_message(message):
    if message.content.startswith('!test'):
        embed = discord.Embed(title="test", color=0x0071ce, description="test")
        await message.channel.send(embed=embed)
Starting the script with the start() function works great. Stopping with the stop() function also works somehow. If I call the stop() function it prints "False", so I am thinking that the thread was stopped. But if I then call the start() function I get an error:
RuntimeError: threads can only be started once
This script is part of a big project, so I am calling the functions from another script, but I think that shouldn't be the problem.
What is the problem? Thanks in advance.
You cannot re-start the existing thread, but you can start a new thread that runs the event loop. You can achieve that by moving the assignment to discordthread to the start function.
And your call to call_soon_threadsafe is wrong. You need to pass discordloop.stop to it, without parentheses. That refers to the actual function without calling it right away, and allows the loop thread to call it, which was intended:
discordloop.call_soon_threadsafe(discordloop.stop)
Finally, your init function is missing a global declaration for the variables you assign that are intended as globals.
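A minimal sketch of that change, reusing the names from the question (the class-level discordthread = th.Thread(...) assignment would be dropped, and only start() changes):

def start():
    if not discordvars.running_discordthread:
        # create a fresh Thread object each time, since a Thread can only be started once
        discordvars.discordthread = th.Thread(target=discordvars.discordloop.run_forever)
        discordvars.discordthread.start()
        print("Discord-Client started...")
        discordvars.running_discordthread = True
    else:
        print("Discord-Client already running...")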
I am having some difficulty making this function run in the background without blocking my entire program. How can I run this function in a loop without blocking the rest of my program?
This is the function:
while True:
    schedule.run_pending()
Thank you for any reply.
Edit:
def FunctioninLoop():
    while True:
        schedule.run_pending()

async def MyFunction():
    ScheduleToexecute = schedule.every().minute.do(Functionscheduled)
    t = Thread(target=FunctioninLoop())
    t.start()
    print("The execution is going on")
Threads are what you are looking for.
Consider the following code:
from threading import Thread

def myfunc(a, b, c):
    pass

# Creates a thread
t = Thread(target=myfunc, args=(1, 2, 3))

# Launches the thread (while not blocking the main execution)
t.start()

somecode
somecode
somecode

# Waits for the thread to return (not a must)
t.join()
Hope I've helped! :)
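Applied to the code from the question, the important detail is to pass the function object itself to Thread, without parentheses; otherwise FunctioninLoop() is called immediately in the current thread and blocks. A minimal sketch reusing FunctioninLoop from the question:

import schedule
from threading import Thread

def FunctioninLoop():
    while True:
        schedule.run_pending()

t = Thread(target=FunctioninLoop)  # pass the function, do not call it here
t.start()
print("The execution is going on")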
import threading

pender = threading.Thread(target=schedule.run_pending)  # creating the thread does not block
pender.start()  # runs schedule.run_pending() in the background
print("life goes on until...")
pender.join()  # blocks until schedule.run_pending() is complete
You can use Python's subprocess module:
https://docs.python.org/3.2/library/subprocess.html
import os

def myfunction():
    ..........

os.spawnl(os.P_NOWAIT, myfunction())
I have the following code:
import time

def wait10seconds():
    for i in range(10):
        time.sleep(1)
    return 'Counted to 10!'

print(wait10seconds())
print('test')
Now my question is: how do you make print('test') run before wait10seconds() has finished, without swapping the two lines?
I want the output to be the following:
test
Counted to 10!
Anyone know how to fix this?
You can use threads for this, like so:
from threading import Thread

my_thread = Thread(target=wait10seconds)  # create a new thread that executes the function
my_thread.start()  # start it
print('test')  # print the test
my_thread.join()  # wait for the function to end
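The snippet above does not print the function's return value; if you also want "Counted to 10!" printed once the countdown finishes, one option (a sketch using concurrent.futures, not part of the answer above) is:

from concurrent.futures import ThreadPoolExecutor
import time

def wait10seconds():
    for i in range(10):
        time.sleep(1)
    return 'Counted to 10!'

with ThreadPoolExecutor() as executor:
    future = executor.submit(wait10seconds)  # starts counting in a worker thread
    print('test')                            # runs while the countdown is still going
    print(future.result())                   # blocks until 'Counted to 10!' is available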
You can use a Timer. Taken from the Python docs page:
from threading import Timer

def hello():
    print("hello, world")

t = Timer(30.0, hello)
t.start()  # after 30 seconds, "hello, world" will be printed
If you are using Python 3.5+, you can use asyncio:
import asyncio

async def wait10seconds():
    for i in range(10):
        await asyncio.sleep(1)
    return 'Counted to 10!'

print(asyncio.run(wait10seconds()))
asyncio.run is new in Python 3.7; for Python 3.5 and 3.6 you won't be able to use asyncio.run, but you can achieve the same thing by working with the event loop directly.
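A rough equivalent for 3.5/3.6 using the event loop directly (a sketch, not the only way):

import asyncio

async def wait10seconds():
    for i in range(10):
        await asyncio.sleep(1)
    return 'Counted to 10!'

loop = asyncio.get_event_loop()
print(loop.run_until_complete(wait10seconds()))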
Below is the code snippet that I ran:
from concurrent.futures import ProcessPoolExecutor
import asyncio
import contextvars

ctx = contextvars.ContextVar('ctx', default=None)
pool = ProcessPoolExecutor(max_workers=2)

def task():
    print(f'inside pool process, ctx: {ctx.get()}')
    ctx.set('co co')
    return ctx.get()

async def execute():
    loop = asyncio.get_event_loop()
    ctx.set('yo yo')

    ctx_from_pool = await loop.run_in_executor(pool, task)
    ctx_from_async = ctx.get()

    print(f'ctx_from_async: {ctx_from_async}')
    print(f'ctx_from_pool: {ctx_from_pool}')
    ctx.set('bo bo')

def main():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.ensure_future(execute()))
    ctx_from_main = ctx.get()
    print(f'ctx_from_main: {ctx_from_main}')

main()
Output:
inside pool process, ctx: yo yo
ctx_from_async: yo yo
ctx_from_pool: co co
ctx_from_main: None
My understanding is that the reason the contextvars change ctx.set('co co') made in the pool process is not propagated back to the main process is that, when the task is handed to the pool, the variables are copied via pickling, so the change is applied to a different copy of the variable than the one the main process reads. However, I am not completely sure of this, as I don't have a lot of experience with ProcessPoolExecutor.
Could someone shed some additional light on this? Also, what can be done to manipulate contextvars seamlessly across the asyncio loop and the process pool executor?