I have a very simple bot that makes a request to https://jsonplaceholder.typicode.com/todos/1 every 0.5 seconds (this is a practice project; I know it's not recommended). Upon running the bot, it starts, but because the high volume of requests blocks the thread, it never executes sayHi().
What I want: a way to execute the requests in another thread, if possible.
@bot.command()
async def sayHi(ctx, arg):
    print("Hi")

async def make_request():
    r = requests.get("https://jsonplaceholder.typicode.com/todos/1")
    print(r)

@tasks.loop(seconds=0.5)
async def scan_loop():
    await make_request()

@bot.event
async def on_ready():
    print("Bot is running")
    scan_loop.start()

bot.run(TOKEN)
Of course, you can use multiprocessing or threading to run the request in another thread, but there is a second way: using the asynchronous aiohttp library instead of the blocking requests library.
So, your make_request function will be something like this with aiohttp:
import aiohttp

async def make_request():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://jsonplaceholder.typicode.com/todos/1") as response:
            print(await response.json())
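If you would rather keep the blocking requests call, the threading route mentioned above can be done without managing threads yourself by handing the call to the event loop's thread pool executor. A minimal sketch, assuming the same URL and the requests library:

import asyncio
import requests

async def make_request():
    loop = asyncio.get_running_loop()
    # Run the blocking requests.get in a worker thread so the event loop
    # (and commands such as sayHi) stays responsive.
    r = await loop.run_in_executor(
        None, requests.get, "https://jsonplaceholder.typicode.com/todos/1"
    )
    print(r)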
I am trying to set up a schedule that runs a subroutine. In the example below, the subroutine should send a message to a Discord channel when the schedule is triggered. At first I attempted to just send the message, but got an error. I then tried to work out how to solve this and have tried different approaches using asyncio, none of which have worked.
If anyone is able to give me any pointers on how I could do this then it would be much appreciated.
import discord
import asyncio
import time
import schedule  # pip install schedule

client = discord.Client()

@client.event
async def on_ready():
    print("Connected!")

async def example(message):
    await client.get_channel(CHANNEL_ID).send(message)

client.run(SECRET_KEY)

def scheduledEvent():
    loop = asyncio.get_event_loop()
    loop.run_until_complete(example("Test Message"))
    loop.close()

schedule.every().minute.do(scheduledEvent)

while True:
    schedule.run_pending()
    time.sleep(1)
You can't run your blocking schedule code in the same thread as your asynchronous event loop (your current code won't even try to schedule tasks until the bot has already disconnected). Instead, you should use the built-in tasks extension, which allows you to schedule tasks.
import discord
from discord.ext import tasks, commands

CHANNEL_ID = 1234
TOKEN = 'abc'

client = discord.Client()

@client.event
async def on_ready():
    print("Connected!")

@tasks.loop(minutes=1)
async def example():
    await client.get_channel(CHANNEL_ID).send("Test Message")

@example.before_loop
async def before_example():
    await client.wait_until_ready()

example.start()
client.run(TOKEN)
When a user sends !bot in a special channel, the code runs a function that takes 20-30 seconds (depending on some_var). My problem is that if several people write !bot, the code runs it multiple times in parallel. How do I queue these requests?
I have tried to understand asyncio and discord.ext.tasks, but I can't figure out how they work.
@client.command()
async def test(ctx, *args):
    data = args
    if data:
        answer = some_function(data[0])  # takes from 5 seconds to 1 minute or more
        await ctx.send(answer)
Everything works well, but I just don't want to load the system so much; I need a loop that processes requests first in, first out.
You can use an asyncio.Queue to queue tasks, then process them sequentially in a background loop:
import asyncio
from discord.ext import commands, tasks

queue = asyncio.Queue()
bot = commands.Bot('!')

@tasks.loop(seconds=1.0)
async def executor():
    task = await queue.get()
    await task
    queue.task_done()

@executor.before_loop
async def before():
    await bot.wait_until_ready()

@bot.command()
async def example(ctx, num: int):
    await queue.put(helper(ctx, num))

async def helper(ctx, num):
    await asyncio.sleep(num)
    await ctx.send(num)

executor.start()
bot.run('token')
Make some_function() async and then await it. Then all test commands will be processed concurrently.
async def some_function(...):
    # something to do

@client.command()
async def test(ctx, *args):
    if args:
        answer = await some_function(args[0])
        await ctx.send(answer)
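Note that if some_function does blocking work internally (CPU-bound code or a blocking library), declaring it async alone will not free the event loop while it runs. One option (a sketch, assuming some_function stays an ordinary synchronous function) is to hand it off to the default thread pool executor:

import asyncio

@client.command()
async def test(ctx, *args):
    if args:
        loop = asyncio.get_running_loop()
        # Run the blocking some_function in a worker thread so other
        # commands keep being processed while it runs.
        answer = await loop.run_in_executor(None, some_function, args[0])
        await ctx.send(answer)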
I am making a Discord bot with Python. When a user types a command, the bot fetches data from a URL and shows it. I use aiohttp for asynchronous HTTP requests, but the documentation of discord.py says:
Since it is better to not create a session for every request, you should store it in a variable and then call session.close on it when it needs to be disposed.
So I changed all my code from
async with aiohttp.ClientSession() as session:
    async with session.get('url') as response:
        # something to do
into
# Global variable
session = aiohttp.ClientSession()

async with session.get('url') as response:
    # something to do
All HTTP requests use the globally defined session. But when I run this code and stop it with a keyboard interrupt (Ctrl + C), these warning messages appear:
Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x0000015A45ADBDD8>
Unclosed connector
connections: ['[(<aiohttp.client_proto.ResponseHandler object at 0x0000015A464925E8>, 415130.265)]']
connector: <aiohttp.connector.TCPConnector object at 0x0000015A454B3320>
How can I close the ClientSession when the program is stopped by a keyboard interrupt?
What I tried:
I tried the following, but nothing worked well.
Making a class and closing the session in its __del__. __del__ was not called on keyboard interrupt.
class Session:
    def __init__(self):
        self._session = aiohttp.ClientSession()

    def __del__(self):
        self._session.close()
An infinite loop in main, catching KeyboardInterrupt. The program blocks in bot.run(), so it never reaches that code.
from discord.ext import commands

if __name__ == "__main__":
    bot = commands.Bot()
    bot.run(token)  # blocked
    try:
        while True:
            sleep(1)
    except KeyboardInterrupt:
        session.close()
Closing the session when the bot is disconnected. on_disconnect was not called on keyboard interrupt.
@bot.event
async def on_disconnect():
    await session.close()
Edit: I missed await before session.close(), but that was just a mistake I made when writing this question. Everything I tried still didn't work as expected when written correctly with await.
You must await the closing of a ClientSession object:
await session.close()
Notice coroutine in the docs here. Your attempt #3 is probably best suited for this problem, as it is naturally an async function.
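Another way to make attempt #3's idea reliable (a sketch, not part of the original answers, assuming discord.py's commands.Bot) is to override the bot's close() coroutine, which discord.py awaits during its own shutdown, including after a Ctrl + C:

import aiohttp
from discord.ext import commands

class MyBot(commands.Bot):
    session = None  # shared aiohttp.ClientSession, created once the loop is running

    async def on_ready(self):
        # on_ready can fire more than once, so only create the session once.
        if self.session is None:
            self.session = aiohttp.ClientSession()
        print("Connected!")

    async def close(self):
        # discord.py awaits close() while shutting down (including Ctrl + C),
        # so the session is closed before the event loop is torn down.
        if self.session is not None:
            await self.session.close()
        await super().close()

# bot = MyBot(command_prefix="!")  # plus intents on discord.py 2.x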
I tried the following code and it seems to work well.
import asyncio
import aiohttp

class Session:
    def __init__(self):
        self._session = aiohttp.ClientSession()

    def __del__(self):
        loop = asyncio.get_event_loop()
        loop.run_until_complete(self.close())

    async def close(self):
        await self._session.close()

session = Session()
I need to repeatedly parse the content of one link. The synchronous way gives me 2-3 responses per second; I need it faster (yes, I know, too fast is bad too).
I found some async examples, but all of them show how to handle the results after all links are parsed, whereas I need to process each response immediately after receiving it, something like this, but this code doesn't give any speed improvement:
import aiohttp
import asyncio
import time

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    while True:
        async with aiohttp.ClientSession() as session:
            html = await fetch(session, 'https://example.com')
            print(time.time())
            # do_something_with_html(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
but this code doesn't give any speed improvement
asyncio (and async/concurrency in general) gives a speed improvement for I/O-bound work that can interleave.
When everything you do is await something and you never create any parallel tasks (using asyncio.create_task(), asyncio.ensure_future(), etc.), you are basically doing classic synchronous programming :)
So, how to make the requests faster:
import aiohttp
import asyncio
import time

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def check_link(session):
    html = await fetch(session, 'https://example.com')
    print(time.time())
    # do_something_with_html(html)

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            asyncio.create_task(check_link(session))
            await asyncio.sleep(0.05)

asyncio.run(main())
Notice: the async with aiohttp.ClientSession() as session: must be above (outside) the while True: for this to work. Actually, having a single ClientSession() for all your requests is good practice anyway.
I gave up on async; threading solved my problem, thanks to this answer:
https://stackoverflow.com/a/23102874/5678457
from threading import Thread
import requests
import time

class myClassA(Thread):
    def __init__(self):
        Thread.__init__(self)
        self.daemon = True
        self.start()

    def run(self):
        while True:
            r = requests.get('https://ex.com')
            print(r.status_code, time.time())

for i in range(5):
    myClassA()
Sorry, library first-timer here. I am polling a RESTful endpoint every 10 seconds.
It's not obvious to me which of the following is appropriate:
import aiohttp
import asyncio

async def poll(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as r:
            return await r.text()

async def main():
    while True:
        await asyncio.sleep(10)
        print(await poll('http://example.com/api'))

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()
Or the session variable persists forever:
import aiohttp
import asyncio

async def poll(url):
    async with aiohttp.ClientSession() as session:
        await asyncio.sleep(10)
        async with session.get(url) as r:
            print(await r.text())

loop = asyncio.get_event_loop()
loop.create_task(poll('http://example.com/api'))
loop.run_forever()
I expect the latter is desirable, but coming from the non-asynchronous requests library, I'm not used to the idea of sessions. Will I actually experience faster response times because of connection pooling or other things?
From the official documentation:
Don’t create a session per request. Most likely you need a session per application which performs all requests altogether.
A session contains a connection pool inside. Connection reusage and keep-alives (both are on by default) may speed up total performance.
Surely the latter one is better, and you will definitely have a faster experience.
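For a poller that runs indefinitely, one common shape (a sketch, reusing the example URL from the question) is to keep a single session open for the lifetime of the loop, so every request goes through the same connection pool:

import aiohttp
import asyncio

async def poll_forever(url):
    # One session for the whole loop: connections are pooled and reused.
    async with aiohttp.ClientSession() as session:
        while True:
            async with session.get(url) as r:
                print(await r.text())
            await asyncio.sleep(10)

asyncio.run(poll_forever('http://example.com/api'))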