I am trying to set up a socket.io server using python-socketio.
Here is a minimal working example:
import asyncio
from aiohttp import web
import socketio
import random

sio = socketio.AsyncServer(async_mode='aiohttp')
app = web.Application()
sio.attach(app)

@sio.on('connect')
def connect(sid, environ):
    print("connected: ", sid)

@sio.on('sendText')
async def message(sid, data):
    print("message ", data)
    # await asyncio.sleep(1 * random.random())
    # print('waited', data)

@sio.on('disconnect')
def disconnect(sid):
    print('disconnect ', sid)

if __name__ == '__main__':
    web.run_app(app, host='0.0.0.0', port=8080)
This runs fine, and I can execute (here in node.js) for instance
const io = require('socket.io-client');
const socket = io('ws://localhost:8080');
socket.emit('sendText', 'hey 1')
socket.emit('sendText', 'hey 2')
socket.emit('sendText', 'hey 3')
If I run the server and then the node script above, I get on the server side:
connected: c1e687f0e2724b339fcdbefdb5aaa8f8
message hey 1
message hey 2
message hey 3
However, if I uncomment the lines with await sleep in the code, I only receive the first message:
connected: 816fb6700f5143f7875b20a252c65f33
message hey 1
waited hey 1
I don't understand why the next messages are not appearing.
Can only one instance of async def message run at the same time? Or why?
I am sure that I am not understanding something very fundamental about how this works. I would be very grateful if someone could point out what I am not understanding.
I'm the author of the python-socketio package. There are two problems here, I think. I can answer your question:
Can only one instance of async def message run at the same time? Or why?
My Socket.IO server serializes the events that are received from a given client. So, for example, if client A sends an event whose handler runs for one minute, any additional events sent by A during that minute will be queued, waiting for the first event to complete. If client B sends an event during that minute, it will be handled immediately. The reason events from a client are artificially serialized is to prevent race conditions or other side effects from occurring as a result of two or more handlers for the same client running in parallel. This serialization of events can be turned off with the async_handlers option:
sio = socketio.AsyncServer(async_mode='aiohttp', async_handlers=True)
Using aiohttp 2.3.7 and async_handlers=True, your three events are received at more or less the same time, and then all handlers wait in parallel during their sleep periods.
Unfortunately, this does not explain the 2nd and 3rd events never reaching the server. I have verified that these events are properly queued and executed in sequence with aiohttp 2.2.5, but this breaks with 2.3.0 all the way to 2.3.7. My current theory is that a change introduced in 2.3.0 is causing messages that arrive while the task is sleeping to get dropped, but I haven't found why that happens yet.
I am trying to implement a WebSocket connection to a server (Python app <=> Django app).
The whole system runs in a big asyncio loop with many tasks. The code snippet below is just a very small testing part.
I am able to send any data to the server at any moment, and many of these will be of the type "request something and wait for a response". But I would also like to have some "always running" handler for all incoming messages. (When something in the Django database changes, I want to send the changes to the Python app.)
How can I include an always-running receiver or add a callback to the websocket? I am not able to find any solution for this.
My code snippet:
import asyncio, json, websockets, logging

class UpdateConnection:
    async def connect(self, botName):
        self.sock = await websockets.connect('ws://localhost:8000/updates/bot/' + botName)

    async def send(self, data):
        try:
            await self.sock.send(json.dumps(data))
        except:
            logging.info("Websocket connection lost!")
            # Find a way to reconnect... or make the socket reconnect automatically

if __name__ == '__main__':
    async def DebugLoop(socketCon):
        await socketCon.connect("dev")
        print("Running..")
        while True:
            data = {"type": "debug"}
            await socketCon.send(data)
            await asyncio.sleep(1)

    uSocket = UpdateConnection()
    loop = asyncio.get_event_loop()
    loop.create_task(DebugLoop(uSocket))
    loop.run_forever()
After the connection is made, my debug server will start sending random messages to the client at random intervals, and I would like to somehow handle them in an async way.
Thanks for any help :)
You don't have to make it so complicated. First of all, I suggest you use the connection patterns offered by the websockets module itself.
From the documentation:
connect() can be used as an infinite asynchronous iterator to reconnect automatically on errors:
async for websocket in websockets.connect(...):
    try:
        ...
    except websockets.ConnectionClosed:
        continue
Additionally, you simply keep the connection alive by awaiting incoming messages:
my_websocket = None

async for websocket in websockets.connect('ws://localhost:8000/updates/bot/' + botName):
    try:
        my_websocket = websocket
        async for message in websocket:
            pass  # here you could also process incoming messages
    except websockets.ConnectionClosed:
        my_websocket = None
        continue
As you can see we have a nested loop here:
The outer loop constantly reconnects to the server
The inner loop processes one incoming message at a time
If you are connected, and no messages are coming in from the server, this will just sleep.
The other thing that happens here is that my_websocket is set to the active connection, and unset again when the connection is lost.
In other parts of your script you can use my_websocket to send data. Note that you will need to check if it is currently set wherever you use it:
async def send(data):
    if my_websocket:
        await my_websocket.send(json.dumps(data))
This is just an illustration, you can also keep the websocket object as an object member, or pass it to another component through a setter function, etc.
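For instance, here is a minimal sketch (my own illustration, not part of the original answer) of keeping the connection as a member of the UpdateConnection class from the question; the run() method name is my own choice, and the reconnect iterator requires a websockets version that supports it:

import asyncio, json, logging, websockets

class UpdateConnection:
    def __init__(self, url):
        self.url = url
        self.sock = None  # set only while a connection is alive

    async def run(self):
        # reconnect forever; process incoming messages as they arrive
        async for websocket in websockets.connect(self.url):
            self.sock = websocket
            try:
                async for message in websocket:
                    logging.info("received: %s", message)
            except websockets.ConnectionClosed:
                continue
            finally:
                self.sock = None

    async def send(self, data):
        if self.sock:
            await self.sock.send(json.dumps(data))

Your main code could then schedule loop.create_task(conn.run()) once and call await conn.send(...) from any other task in the program.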
I have a Python3 program that runs a "while True"-loop until stopped, which occasionally saves data to a MySQL database. I am creating an administrative website, separate from the Python program, where I will be able to observe this data.
I now want to be able to be notified, on the website, when changes have been made to the database. My thought was to set up a websocket connection, so that the Python program can send a message through the socket to all connected clients, i.e. all open browsers, if there have been any changes to the database table.
I have done something similar before, but in that case I had to wait for a websocket connection before the "while True"-loop would start. In the new scenario I want to be able to have multiple website clients at once, and let them connect at any time, as well as disconnect, without interrupting the Python program's loop.
This is a simplified version of my previous code, which I now want to update to be able to run both with & without websocket clients.
import asyncio
import websockets

async def run(ws):
    while True:
        db_has_updated = do_stuff()
        if db_has_updated:
            await ws.send(data)

socket_server = websockets.serve(run, "127.0.0.1", 5055)
asyncio.get_event_loop().run_until_complete(socket_server)
console_log("Waiting for socket connection...")
asyncio.get_event_loop().run_forever()
I just can't seem to be able to come up with the right search terms to find a solution, so I'm asking here instead.
I figured it out, finally! Here is my solution, with a websocket server running in a separate thread from the other logic. I'll probably change some things to make it neater, but this does everything I need. Feel free to ask any questions.
Be aware that this blocks when messaging all the connected clients. That is the way I needed it to work, but you could always thread/subprocess the logic/data-gen part of the program if you want it to run completely asynchronously.
#!/usr/bin/env python3

import asyncio
import websockets
import threading
import time
import random

def gen_data():
    print("Generating data...")
    time.sleep(3)
    data = random.randint(1, 10)
    return data

async def send(client, data):
    await client.send(data)

async def handler(client, path):
    # Register.
    print("Websocket Client Connected.", client)
    clients.append(client)
    while True:
        try:
            print("ping", client)
            pong_waiter = await client.ping()
            await pong_waiter
            print("pong", client)
            time.sleep(3)
        except Exception as e:
            clients.remove(client)
            print("Websocket Client Disconnected", client)
            break

clients = []
start_server = websockets.serve(handler, "localhost", 5555)

asyncio.get_event_loop().run_until_complete(start_server)
threading.Thread(target=asyncio.get_event_loop().run_forever).start()

print("Socket Server Running. Starting main loop.")

while True:
    data = str(gen_data())
    message_clients = clients.copy()
    for client in message_clients:
        print("Sending", data, "to", client)
        try:
            asyncio.run(send(client, data))
        except:
            # Clients might have disconnected during the messaging process,
            # just ignore that, they will have been removed already.
            pass
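To try this out, a minimal test client could look like the following (my own sketch, not part of the answer above; it just connects to the port used there and prints whatever the main loop broadcasts):

import asyncio
import websockets

async def listen():
    async with websockets.connect("ws://localhost:5555") as ws:
        while True:
            message = await ws.recv()
            print("received:", message)

asyncio.get_event_loop().run_until_complete(listen())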
I'm trying to write an IRC bot that continues to work normally while it executes a long (10+ seconds) function.
I started by writing the bot using socket. When I called a 'blocking' function (a computation that takes a few seconds to execute), the bot naturally stopped responding and did not record any messages sent in chat while the function was computing.
I did some googling and saw a lot of people recommend using Twisted.
I implemented a basic IRC bot, heavily based on some examples:
# twisted imports
from twisted.words.protocols import irc
from twisted.internet import reactor, protocol
from twisted.python import log

# system imports
import time, sys, datetime

def a_long_function():
    time.sleep(180)
    print("finished")

class BotMain(irc.IRCClient):
    nickname = "testIRC_bot"

    def connectionMade(self):
        irc.IRCClient.connectionMade(self)

    def connectionLost(self, reason):
        irc.IRCClient.connectionLost(self, reason)

    # callbacks for events

    def signedOn(self):
        """Signed to server"""
        self.join(self.factory.channel)

    def joined(self, channel):
        """Joined channel"""

    def privmsg(self, user, channel, msg):
        """Received message"""
        user = user.split('!', 1)[0]
        if 'test' in msg.lower():
            print("timeout started")
            a_long_function()
            msg = "test finished"
            self.msg(channel, msg)
        if 'ping' in msg.lower():
            self.msg(channel, "pong")
            print("pong")

class BotMainFactory(protocol.ClientFactory):
    """A factory for BotMains"""

    protocol = BotMain

    def __init__(self, channel, filename):
        self.channel = channel
        self.filename = filename

    def clientConnectionLost(self, connector, reason):
        """Try to reconnect on connection lost"""
        connector.connect()

    def clientConnectionFailed(self, connector, reason):
        print("connection failed:", reason)
        reactor.stop()

if __name__ == '__main__':
    log.startLogging(sys.stdout)
    f = BotMainFactory("#test", "log.txt")
    reactor.connectTCP("irc.freenode.net", 6667, f)
    reactor.run()
This approach is definitely better than my earlier socket implementation, because now the bot still receives the messages sent while it executes a_long_function().
However, it only 'sees' these messages after the function is complete. This means that when I log the messages to a txt file, all messages received while a_long_function() was executing get the same timestamp - the time the function finished - and not the time they were actually sent in the chatroom.
Also, the bot still isn't able to send any messages while it's executing the long function.
Could someone point me in the right direction of how I should go about changing the code so that this long function can be executed asynchronously, so that the bot can still log and reply to messages as it's executing?
Thanks in advance.
Edit:
I came across this answer, which gave me the idea that I could add deferLater calls into my a_long_function to split it into smaller chunks (that take, say, 1s to execute), and have the bot resume normal operation in between to reply to and log any messages that were sent to the IRC channel in the meantime. Or perhaps add a timer that counts how long a_long_function has been running for, and if it's longer than a threshold, call a deferLater to let the bot catch up on the buffered messages.
This does seem like a bit of a hack though - is there a more elegant solution?
No, there is not really a more elegant solution, unless you want to use threading, which might look more elegant but could easily lead to an unstable program. If you can avoid it, go with the deferral solution.
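For illustration, here is a rough sketch (mine, not from the original answer) of the deferral approach as a method on BotMain, assuming a_long_function can be split into small slices; do_chunk() is a hypothetical placeholder for one slice of the real computation:

from twisted.internet import defer, reactor, task

@defer.inlineCallbacks
def a_long_function_chunked(self, channel):
    for i in range(180):
        do_chunk(i)  # hypothetical: roughly one second of the real work
        # yield a Deferred that fires on a later reactor iteration, so
        # incoming IRC messages are processed between chunks
        yield task.deferLater(reactor, 0, lambda: None)
    self.msg(channel, "test finished")

privmsg() would then call self.a_long_function_chunked(channel) instead of the blocking a_long_function() and return immediately, so the bot keeps logging and replying while the work proceeds in the background.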
To call a function asynchronously, you can use the asyncio package along with async/await and coroutines. Keep in mind that the async/await syntax is Python 3 only, not Python 2.
Using async/await:
#!/usr/bin/env python3
# countasync.py

import asyncio

async def count():
    print("One")
    await asyncio.sleep(1)
    print("Two")

async def main():
    await asyncio.gather(count(), count(), count())

if __name__ == "__main__":
    import time
    s = time.perf_counter()
    asyncio.run(main())
    elapsed = time.perf_counter() - s
    print(f"{__file__} executed in {elapsed:0.2f} seconds.")
There is a really good tutorial you can read here that goes over using asyncio, in depth.
Hope it helps!
Sorry for the long post but I've been poking at this for over a week so I've tried a lot of different stuff. I know Python well enough but I don't have any experience with asyncio or non-blocking functions in Python.
I'm writing an API library/module/package/whatever for a web service that requires a websocket connection. There are many incoming messages to act on, and some control-related messages (web app level, not websocket control messages) that I need to send on occasion. I can easily receive messages over the connection and act on them. I can send messages, but only in response to received messages, because the receive loop is always blocking while waiting for messages. I don't want to wait for an incoming message to process an outgoing one, so the script shouldn't have to hang on input until a new message is received. In my struggles to get two-way communication working as desired, I discovered I need to use something like Twisted, Tornado, or asyncio, but so far every implementation I've tried has failed. Note that the sending has to happen over the same connection. Opening a short-lived connection outside of the receive loop will not work. Here's what I've done so far:
The first iteration of the websocket code was using the websocket-client package. It was very close to the example from the docs:
import websocket
try:
    import thread
except ImportError:
    import _thread as thread
import time

def on_message(ws, message):
    # Send message frames to respective functions
    # for sorting, objectification, and processing
    pass

def on_error(ws, error):
    print(error)

def on_close(ws):
    print("### closed ###")

def on_open(ws):
    def run(*args):
        # Send initial frames required for server to send the desired frames
        pass
    thread.start_new_thread(run, ())

if __name__ == "__main__":
    websocket.enableTrace(True)
    ws = websocket.WebSocketApp(buildWebsocketURL(),
                                on_message = on_message,
                                on_error = on_error,
                                on_close = on_close)
    ws.on_open = on_open
    ws.run_forever()
This blocks any further execution outside of the loop. I tried reading up on the _thread module, but I couldn't find any indication that I could "communicate" with the websocket thread from outside. I tried setting up a pub/sub listener function that would forward data to ws.send() from another sender function, but it didn't work. No errors or anything, just no indication of any sent messages.
Next I tried the Websockets module. This one seems to be built from the ground up to utilize asyncio. Again, I got a client built that would send initial messages and act on received messages, but the progress stopped there:
async def wsconnection():
    async with websockets.connect(getWebsocketURL()) as websocket:
        while True:
            message = await websocket.recv()
            if message == '{"type":"broadcaster.ready"}':
                subscriptions = getSubscriptions()  # Get subscriptions from ident data
                logging.info('Sending bookmarks to server as subscription keys')
                subscriptionupdate = '{{"type": "subscribe","subscription_keys": ["{0}"],"subscription_scope": "update"}}'.format(
                    '","'.join(subscriptions))
                subscriptioncontent = '{{"subscription_keys": ["{0}"],"subscription_scope": "content","type": "subscribe"}}'.format(
                    '","'.join(subscriptions))
                logging.debug(subscriptioncontent)
                await websocket.send(subscriptionupdate)
                await websocket.send(subscriptioncontent)
                await websocket.send(
                    '{"type":"message_lobby.read","lobby_id":"1","message_id:"16256829"}')
            sortframe(message)

asyncio.get_event_loop().run_until_complete(wsconnection())
I tried the aforementioned pub/sub listener applied here to no avail. Upon reading the docs for this module more thoroughly I tried getting the websocket protocol object (that contains the send() and recv() methods) outside of the loop then creating two coroutines(?), one listening for incoming messages and one listening for and sending outgoing messages. So far I've been completely unable to get the websocket protocol object without running the async with websockets.connect(getWebsocketURL()) as websocket: line within the scope of the wsconnection() function. I tried using websocket = websockets.client.connect() which according to the docs I thought would set the protocol object I need but it doesn't. All of the examples I can find don't seem to reveal any apparent way to structure the websockets sender and receiver in the way I require without extensive knowledge of asyncio.
I also poked around with autobahn with similar code structures as above using both asyncio and Twisted but I came up with all the same problems as above.
So far the closest I've gotten was with the Websockets package above. The docs have an example snippet for a send/recv connection, but I can't really read what's going on there as it's all very specific to asyncio. I'm really having trouble wrapping my head around asyncio in general, and I think a big problem is that it seems to have evolved very rapidly recently, so there is a ton of conflicting, version-specific information floating around. Not good for learning, unfortunately. This is what I tried using that example; it connects, receives initial messages, then the connection is lost/closed:
async def producer(message):
    print('Sending message')

async def consumer_handler(websocket, path):
    while True:
        message = await websocket.recv()
        await print(message)
        await pub.sendMessage('sender', message)

async def producer_handler(websocket, path):
    while True:
        message = await producer()
        await websocket.send(message)

async def wsconnect():
    async with websockets.connect(getWebsocketURL()) as websocket:
        path = "443"
        async def handler(websocket, path):
            consumer_task = asyncio.ensure_future(
                consumer_handler(websocket, path))
            producer_task = asyncio.ensure_future(
                producer_handler(websocket, path))
            done, pending = await asyncio.wait(
                [consumer_task, producer_task],
                return_when=asyncio.FIRST_COMPLETED,
            )
            for task in pending:
                task.cancel()

pub.subscribe(producer, 'sender')
asyncio.get_event_loop().run_until_complete(wsconnect())
So how do I structure this code to get sending and receiving over the same websocket connection? I also have various API calls to make in the same script while the websocket connection is open which further complicates things.
I'm using Python 3.6.6 and this script is intended to be imported as a module into other scripts so the websocket functionality will need to be wrapped up in a function or class for external calls.
I am in the exact same situation as you. I know that this is a very inelegant solution,
because it still isn't full-duplex, but I can't seem to find any example on the internet or Stack Overflow involving asyncio and the websockets module, which is what I used.
I don't think I completely understand your websockets example (is that server-side or client-side code?), but I'm going to explain my situation and "solution", and maybe that will be usable for you too.
So I have a server main function that has a websocket listening for messages in a loop with recv(). When I send "start", it will start a function that sends data every second to the JavaScript client in the browser. But while the function is sending data, I sometimes want to pause or stop the stream of data from my client by sending a stop message. The problem is that when I use recv() while the data sending has already begun, the server stops sending data and only waits for a message. I tried threads, multiprocessing, and some other stuff, but eventually I came to the (hopefully temporary) solution of having the client send a "pong" message to the server immediately after it receives a piece of data, so that the server continues sending data on the next loop iteration, or stops sending data if the "pong" message is "stop" instead, for example. But yeah, this is not real duplex, just fast half-duplex...
code on my python "server"
async def start_server(self, websocket, webserver_path):
    self.websocket = websocket
    self.webserver_path = webserver_path
    while True:
        command = await self.websocket.recv()
        print("received command")
        if command == "start":
            await self.analyze()
        await asyncio.sleep(1)
in my analyze function:
for i, row in enumerate(data):
    await self.websocket.send(json.dumps(row))
    msg = await self.websocket.recv()
    if msg == "stop":
        self.stopFlag = True
        return
    await asyncio.sleep(1)
main
start_server = websockets.serve(t.start_server, "127.0.0.1", 5678)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
code on the javascript client
var ws = new WebSocket("ws://127.0.0.1:5678/");
ws.onmessage = function (event) {
var datapoint = JSON.parse(event.data);
console.log(counter);
counter++;
data.push(datapoint);
if (data.length > 40){
var element = data.shift();
render(data);
}
ws.send("pong");//sending dummy message to let server continue
};
I know it is not THE solution, and I hope somebody else provides a better one, but since I have the same or a very similar problem and there are no other answers, I decided to post, and I hope it helps.
I'm using python-xmpp to send jabber messages. Everything works fine except that every time I want to send messages (every 15 minutes) I need to reconnect to the jabber server, and in the meantime the sending client is offline and cannot receive messages.
So I want to write a really simple, indefinitely running xmpp client, that is online the whole time and can send (and receive) messages when required.
My trivial (non-working) approach:
import time
import xmpp

class Jabber(object):
    def __init__(self):
        server = 'example.com'
        username = 'bot'
        passwd = 'password'
        self.client = xmpp.Client(server)
        self.client.connect(server=(server, 5222))
        self.client.auth(username, passwd, 'bot')
        self.client.sendInitPresence()
        self.sleep()

    def sleep(self):
        self.awake = False
        delay = 1
        while not self.awake:
            time.sleep(delay)

    def wake(self):
        self.awake = True

    def auth(self, jid):
        self.client.getRoster().Authorize(jid)
        self.sleep()

    def send(self, jid, msg):
        message = xmpp.Message(jid, msg)
        message.setAttr('type', 'chat')
        self.client.send(message)
        self.sleep()

if __name__ == '__main__':
    j = Jabber()
    time.sleep(3)
    j.wake()
    j.send('receiver@example.org', 'hello world')
    time.sleep(30)
The problem here seems to be that I cannot wake it up. My best guess is that I need some kind of concurrency. Is that true, and if so how would I best go about that?
EDIT: After looking into all the options concerning concurrency, I decided to go with twisted and wokkel. If I could, I would delete this post.
There is a good example on the homepage of xmpppy itself (which is another name for python-xmpp), which does almost what you want: xtalk.py
It is basically a console jabber client, but it shouldn't be hard to rewrite it into the bot you want.
It's always online and can send and receive messages. I don't see a need for a multiprocessing (or other concurrency) module here, unless you need to receive and send messages at the exact same time.
A loop over the Process(timeout) method is a good way to wait and process any new incoming stanzas while keeping the connection up.
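As a rough sketch (mine, not from xtalk.py), reusing the connection code from the question and the standard xmpppy calls, the always-online loop could look like this:

import xmpp

client = xmpp.Client('example.com', debug=[])
client.connect(server=('example.com', 5222))
client.auth('bot', 'password', 'bot')
client.sendInitPresence()

def on_message(conn, msg):
    # message handlers are called from inside Process(); reply or queue work here
    conn.send(xmpp.Message(msg.getFrom(), 'hello back', typ='chat'))

client.RegisterHandler('message', on_message)

while client.isConnected():
    # wait up to one second for incoming stanzas, then loop around;
    # outgoing messages can be sent between iterations with client.send(...)
    client.Process(1)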