aiohttp - multiple websockets, single session? - python

I am becoming familiar with these libraries, but I am stumped by the following situation:
I wish to continuously process update messages from two different websites' websockets.
But I am having trouble achieving this with a single session variable, given that aiohttp.ClientSession() objects should be created within coroutines.
import asyncio
import aiohttp

url1 = 'wss://example.com'

async def main():
    session = aiohttp.ClientSession()
    async with session.ws_connect(url1) as ws1:
        async for msg in ws1:
            ...  # do some processing

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()
The above works for a single websocket connection. But because async for msg in ws1: is an infinite loop, I can't see where to put the ws2 version of this async loop.

Since each websocket needs its own infinite loop, you can abstract it into a coroutine that serves that websocket and accepts the session from its caller. The calling coroutine will create the serve tasks "in the background" using loop.create_task, which can also be called from a coroutine:
async def setup():
    session = aiohttp.ClientSession()
    loop = asyncio.get_event_loop()
    loop.create_task(serve(url1, session))
    loop.create_task(serve(url2, session))

async def serve(url, session):
    async with session.ws_connect(url) as ws:
        async for msg in ws:
            ...

loop = asyncio.get_event_loop()
loop.run_until_complete(setup())
loop.run_forever()
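On Python 3.7+ the same structure can be written without touching the event loop directly; a minimal sketch, assuming two placeholder URLs:

import asyncio
import aiohttp

url1 = 'wss://one.example.com'   # placeholder URLs, as in the question
url2 = 'wss://two.example.com'

async def serve(session, url):
    async with session.ws_connect(url) as ws:
        async for msg in ws:
            ...  # process msg

async def main():
    # One shared session; gather() runs both serve() loops concurrently.
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(serve(session, url1), serve(session, url2))

asyncio.run(main())

The async with around the session also guarantees the session is closed when both serve tasks finish, which the setup() version above does not do.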

Related

Running asynchronous functions in non-asynchronous functions

I'm trying to run some asynchronous functions from inside another, non-asynchronous function. As I understand it, coroutines can't simply be called like regular functions, so how do I do this? I don't want to make the maze_move function asynchronous.
import asyncio
import json
import websockets

async def no_stop():
    # some logic
    await asyncio.sleep(4)

async def stop(stop_time):
    await asyncio.sleep(stop_time)
    # some logic

def maze_move():
    no_stop()
    stop(1.5)

async def main(websocket):
    global data_from_client, data_from_server, power_l, power_r
    get_params()
    get_data_from_server()
    get_data_from_client()
    while True:
        msg = await websocket.recv()
        allow_data(msg)
        cheker(data_from_client)
        data_from_server['IsBrake'] = data_from_client['IsBrake']
        data_from_server['powerL'] = power_l
        data_from_server['powerR'] = power_r
        await websocket.send(json.dumps(data_from_server))
        print(data_from_client['IsBrake'])

start_server = websockets.serve(main, 'localhost', 8080)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
How about:
def maze_move():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(no_stop())
    loop.run_until_complete(stop(1.5))
If you wanted to run the two coroutines concurrently, then:
def maze_move():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(no_stop(), stop(1.5)))
Update Based on Updated Question
I am guessing at what you want to do (see my comment on your question):
First, you cannot call coroutines such as stop directly from maze_move: stop() does not actually run stop, it just creates and returns a coroutine object. So maze_move has to be modified. I will assume you do not want to make it a coroutine itself (though why not, since you already have to modify it?). Further assuming you want to invoke maze_move from a coroutine that is running other coroutines concurrently, you can create a new coroutine, e.g. maze_move_runner, that runs maze_move in a separate thread so that it does not block the other coroutines:
import asyncio

async def no_stop():
    # some logic
    print('no stop')
    await asyncio.sleep(4)

async def stop(stop_time):
    await asyncio.sleep(stop_time)
    print('stop')
    # some logic

async def some_coroutine():
    print('Enter some_coroutine')
    await asyncio.sleep(2)
    print('Exit some_coroutine')
    return 1

def maze_move():
    # In case we are being run directly and not in a separate thread:
    try:
        loop = asyncio.get_running_loop()
    except RuntimeError:
        # This thread has no running event loop, so create one:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
    loop.run_until_complete(no_stop())
    loop.run_until_complete(stop(1.5))
    return 'Done!'

async def maze_move_runner():
    loop = asyncio.get_running_loop()
    # Run maze_move in another thread so it does not block the event loop:
    return await loop.run_in_executor(None, maze_move)

async def main():
    results = await asyncio.gather(some_coroutine(), maze_move_runner())
    print(results)

asyncio.run(main())
Prints:
Enter some_coroutine
no stop
Exit some_coroutine
stop
[1, 'Done!']
But this would be the most straightforward solution:
async def maze_move():
    await no_stop()
    await stop(1.5)
    return 'Done!'

async def main():
    results = await asyncio.gather(some_coroutine(), maze_move())
    print(results)
If you have an already running event loop, you can define an async function inside the sync function and launch it as a task:
def maze_move():
    async def amaze_move():
        await no_stop()
        await stop(1.5)
    return asyncio.create_task(amaze_move())
This function returns an asyncio.Task object, which can be used in an await expression or not, depending on your requirements. This way you won't have to make maze_move itself an async function, although I don't know why that would be a goal: only an async function can run no_stop and stop, so you have to have an async function somewhere.
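For illustration, a minimal sketch of how this task-returning variant might be used from an already running loop (names as in the snippet above):

async def main():
    task = maze_move()   # schedules amaze_move() on the running loop
    ...                  # do other concurrent work here
    await task           # optionally wait for it to finish

asyncio.run(main())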

Python - Run multiple async functions simultaneously

I'm essentially making a pinger that has a 2D list of key/webhook pairs; after pinging a key, it sends the response to the corresponding webhook.
The 2D list goes as follows:
some_list = [["key1", "webhook1"], ["key2", "webhook2"]]
My program is essentially a loop, and I'm not too sure how I can rotate through the some_list data in the function.
Here's a little demo of what my script looks like:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)
        await asyncio.sleep(10)
        await do_ping(some_pair)
I've tried:
async def main():
    for entry in some_list:
        asyncio.run(do_ping(entry))
but because the do_ping function is a self-calling loop, it just runs the first one over and over and never gets to the ones after it. I'm hoping to find a solution to this, whether it's threading or something similar. And if you have a better way of structuring the some_list values (which I assume would be a dictionary), feel free to drop that feedback as well.
You made your method recursive with await do_ping(some_pair): it never returns, so the loop in main cannot continue. I would restructure the application like this:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        while True:
            tasks = await gen_tasks(s, some_pair)
            results = await asyncio.gather(*tasks)
            await asyncio.sleep(10)

async def main():
    tasks = [do_ping(entry) for entry in some_list]
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
Alternatively you could move the repeat-and-sleep logic into main:
async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)

async def main():
    while True:
        tasks = [do_ping(entry) for entry in some_list]
        await asyncio.gather(*tasks)
        await asyncio.sleep(10)

if __name__ == "__main__":
    asyncio.run(main())
You could also start the tasks before sleeping and gather them afterwards. That makes the pings start at consistent 10-second intervals, instead of every 10 seconds plus however long it takes to gather the results:
async def main():
    while True:
        tasks = [
            asyncio.create_task(do_ping(entry))
            for entry in some_list
        ]
        await asyncio.sleep(10)
        await asyncio.wait(tasks)
EDIT: As pointed out by creolo, you should only create a single ClientSession object. See https://docs.aiohttp.org/en/stable/client_reference.html:
Session encapsulates a connection pool (connector instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling.
async def do_ping(session, some_pair):
    tasks = await gen_tasks(session, some_pair)
    results = await asyncio.gather(*tasks)

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            tasks = [
                asyncio.create_task(do_ping(session, entry))
                for entry in some_list
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)
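All of these snippets call gen_tasks, which the question never shows. Purely as a hypothetical sketch, assuming each pair is ["key", "webhook_url"] and that "pinging a key" means a GET whose JSON result is POSTed to the paired webhook (the ping URL here is invented):

async def ping_and_forward(session, key, webhook):
    # Hypothetical ping endpoint; the real target is not shown in the question.
    async with session.get(f"https://api.example.com/ping/{key}") as r:
        payload = await r.json()
    await session.post(webhook, json=payload)  # forward the result

async def gen_tasks(session, some_pair):
    key, webhook = some_pair
    return [asyncio.create_task(ping_and_forward(session, key, webhook))]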

discord.py main thread being blocked

I have a very simple bot that makes a request to https://jsonplaceholder.typicode.com/todos/1 every 0.5 seconds (this is a practice project; I know it's not recommended). The bot starts, but because of the high volume of requests the thread is blocked and sayHi() never executes.
What I want is a way to execute the requests in another thread, if possible.
@bot.command()
async def sayHi(ctx, arg):
    print("Hi")

async def make_request():
    r = requests.get("https://jsonplaceholder.typicode.com/todos/1")
    print(r)

@tasks.loop(seconds=0.5)
async def scan_loop():
    await make_request()

@bot.event
async def on_ready():
    print("Bot is running")
    scan_loop.start()

bot.run(TOKEN)
Of course, you can use multiprocessing or threading to run the request in another thread, but there is a second way: using the asynchronous aiohttp library instead of the blocking requests library.
So, your make_request function will be something like this with aiohttp:
import aiohttp

async def make_request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://jsonplaceholder.typicode.com/todos/1") as response:
            print(await response.json())
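If you would rather keep using requests, you can instead hand the blocking call to a worker thread; a minimal sketch, assuming Python 3.9+ for asyncio.to_thread:

import asyncio
import requests

async def make_request():
    # requests.get blocks, so run it in a worker thread
    # instead of on the bot's event loop:
    r = await asyncio.to_thread(
        requests.get, "https://jsonplaceholder.typicode.com/todos/1"
    )
    print(r.json())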

Asyncio not running Aiohttp requests in parallel

I want to run many HTTP requests in parallel using Python.
I tried the aiohttp module together with asyncio.
import aiohttp
import asyncio

async def main():
    async with aiohttp.ClientSession() as session:
        for i in range(10):
            async with session.get('https://httpbin.org/get') as response:
                html = await response.text()
                print('done' + str(i))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
I expected it to execute all the requests in parallel, but they are executed one by one.
I later solved this using threading, but I would like to know what's wrong with this approach.
You need to make the requests concurrently. Currently you have a single task, defined by main(), so the HTTP requests within that task run serially.
You could also use asyncio.run() if you are on Python 3.7+; it abstracts away creation of the event loop:
import aiohttp
import asyncio

async def getResponse(session, i):
    async with session.get('https://httpbin.org/get') as response:
        html = await response.text()
        print('done' + str(i))

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [getResponse(session, i) for i in range(10)]  # create list of tasks
        await asyncio.gather(*tasks)  # execute them concurrently

asyncio.run(main())
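If the request count grows, a common refinement (not part of the original answer) is to cap how many requests are in flight at once with asyncio.Semaphore; a minimal sketch:

import aiohttp
import asyncio

async def getResponse(session, sem, i):
    async with sem:  # at most 5 requests in flight at a time
        async with session.get('https://httpbin.org/get') as response:
            html = await response.text()
            print('done' + str(i))

async def main():
    sem = asyncio.Semaphore(5)
    async with aiohttp.ClientSession() as session:
        tasks = [getResponse(session, sem, i) for i in range(10)]
        await asyncio.gather(*tasks)

asyncio.run(main())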

Aiohttp - should I keep the session alive 24/7 when polling restful api?

Sorry, library first-timer here. I am polling a RESTful endpoint every 10 seconds.
It's not obvious to me which of the following is appropriate:
import aiohttp
import asyncio

async def poll(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as r:
            return await r.text()

async def main():
    while True:
        await asyncio.sleep(10)
        print(await poll('http://example.com/api'))

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()
Or this one, where the session variable persists forever:
import aiohttp
import asyncio

async def poll(url):
    async with aiohttp.ClientSession() as session:
        while True:
            await asyncio.sleep(10)
            async with session.get(url) as r:
                print(await r.text())

loop = asyncio.get_event_loop()
loop.create_task(poll('http://example.com/api'))
loop.run_forever()
I expect the latter is desirable, but coming from the non-asynchronous requests library, I'm not used to the idea of sessions. Will I actually experience faster response times because of connection pooling or other things?
From the official documentation:
Don’t create a session per request. Most likely you need a
session per application which performs all requests altogether.
A session contains a connection pool inside. Connection reusage and
keep-alives (both are on by default) may speed up total performance.
So the latter is better, and you will definitely have a faster experience.
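Putting that together, a minimal sketch of the recommended shape, with one session for the life of the application (URL as in the question):

import aiohttp
import asyncio

async def main():
    # One ClientSession for the whole application, per the docs quoted above.
    async with aiohttp.ClientSession() as session:
        while True:
            async with session.get('http://example.com/api') as r:
                print(await r.text())
            await asyncio.sleep(10)

asyncio.run(main())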
