I have a class containing a function that I would like to invoke by calling a flask-restful endpoint. Is there a way to define an asynchronous function that would await/subscribe to this endpoint being called? I can make changes to the Flask app if required (but can't switch to SocketIO), or write some sort of async requests function. I can only work with the base Anaconda 3.7 library, and I don't have any additional message brokers installed or available.
class DaemonProcess:
    def __init__(self):
        pass

    async def await_signal(self):
        signal = await ...  # pseudocode: somehow await the endpoint http://ip123/signal being called
        self.process_signal(signal)  # do stuff with signal
For context, this isn't the main objective of the process. I simply want to be able to tell my process, remotely or via a UI, to shut down worker processes either gracefully or forcefully. The only other idea I came up with is polling a database table repeatedly to see if a signal has been inserted, but time is of the essence and that would require polling at intervals too short for my liking; an asynchronous approach would be preferred. The database would be SQLite3, and it doesn't appear to support update_hook callbacks.
Here's a sample pattern to send a signal and process it:
import asyncio
import aiotools
async def process(reader, writer):
    data = await reader.read(100)
    writer.write(data)
    print(f"We got a message {data} - time to do something about it.")
    await writer.drain()
    writer.close()

@aiotools.server
async def worker(loop, pidx, args):
    server = await asyncio.start_server(process, '127.0.0.1', 8888,
                                        reuse_port=True, loop=loop)
    print(f'[{pidx}] started')
    yield  # wait until terminated
    server.close()
    await server.wait_closed()
    print(f'[{pidx}] terminated')

class DaemonProcess:
    def start(self):
        aiotools.start_server(worker, num_workers=4)
if __name__ == '__main__':
    # Run the above server using 4 worker processes.
    d = DaemonProcess()
    d.start()
If you save it in a file, for example process.py, you should be able to start it:
python3 process.py
Now once you have this daemon running in the background, you should be able to ping it (see a sample client below):
import asyncio

async def tcp_echo_client(message):
    reader, writer = await asyncio.open_connection('127.0.0.1', 8888)
    print(f'Send: {message!r}')
    writer.write(message.encode())
    await writer.drain()
    data = await reader.read(100)
    print(f'Received: {data.decode()!r}')
    print('Close the connection')
    writer.close()
    await writer.wait_closed()
And now, somewhere in your Flask view, you should be able to invoke:
asyncio.run(tcp_echo_client('I want my daemon to do something for me'))
Notice this all uses localhost 127.0.0.1 and port 8888, so those need to be available; if you have your own IPs and ports, you'll need to configure them accordingly.
Also notice the use of aiotools, a module providing a set of common asyncio patterns (daemons, etc.).
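For illustration, here is a minimal sketch of what the Flask side could look like with flask-restful (the Shutdown resource, the /signal route, and the message text are made up for this example; tcp_echo_client is the client coroutine shown above):

import asyncio

from flask import Flask
from flask_restful import Api, Resource

app = Flask(__name__)
api = Api(app)

class Shutdown(Resource):
    def post(self):
        # Forward the signal to the daemon over the TCP socket shown above.
        asyncio.run(tcp_echo_client('shut down workers gracefully'))
        return {'status': 'signal sent'}

api.add_resource(Shutdown, '/signal')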
I am trying to implement a WebSocket connection to a server (Python app <=> Django app).
The whole system runs in a big asyncio loop with many tasks; the code snippet below is just a very small testing part.
I am able to send data to the server at any moment, and many of those sends will request something and wait for a response. But I would also like an "always running" handler for all incoming messages (when something in the Django database changes, I want to push the changes to the Python app).
How can I include an always-running receiver, or add a callback to the websocket? I am not able to find any solution for this.
My code snippet:
import asyncio, json, websockets, logging

class UpdateConnection:
    async def connect(self, botName):
        self.sock = await websockets.connect('ws://localhost:8000/updates/bot/' + botName)

    async def send(self, data):
        try:
            await self.sock.send(json.dumps(data))
        except:
            logging.info("Websocket connection lost!")
            # Find a way to reconnect... or make the socket reconnect automatically
if __name__ == '__main__':
    async def DebugLoop(socketCon):
        await socketCon.connect("dev")
        print("Running..")
        while True:
            data = {"type": "debug"}
            await socketCon.send(data)
            await asyncio.sleep(1)

    uSocket = UpdateConnection()
    loop = asyncio.get_event_loop()
    loop.create_task(DebugLoop(uSocket))
    loop.run_forever()
After connecting, my debug server will start sending random messages to the client at random intervals, and I would like to handle them in an async way.
Thanks for any help :)
You don't have to make it so complicated. First of all, I suggest you use the patterns offered by the websockets module itself.
From the documentation:
connect() can be used as an infinite asynchronous iterator to reconnect automatically on errors:
async for websocket in websockets.connect(...):
    try:
        ...
    except websockets.ConnectionClosed:
        continue
Additionally, you simply keep the connection alive by awaiting incoming messages:
my_websocket = None

async for websocket in websockets.connect('ws://localhost:8000/updates/bot/' + botName):
    try:
        my_websocket = websocket
        async for message in websocket:
            pass  # here you could also process incoming messages
    except websockets.ConnectionClosed:
        my_websocket = None
        continue
As you can see we have a nested loop here:
The outer loop constantly reconnects to the server
The inner loop processes one incoming message at a time
If you are connected, and no messages are coming in from the server, this will just sleep.
The other thing that happens here is that my_websocket is set to the active connection, and unset again when the connection is lost.
In other parts of your script you can use my_websocket to send data. Note that you will need to check if it is currently set wherever you use it:
async def send(data):
    if my_websocket:
        await my_websocket.send(json.dumps(data))
This is just an illustration, you can also keep the websocket object as an object member, or pass it to another component through a setter function, etc.
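For example, here is a minimal sketch of how this could be folded back into the UpdateConnection class from the question (the receive_loop name and the logging calls are my own additions, and async for over websockets.connect() needs a reasonably recent websockets version):

import asyncio, json, logging, websockets

class UpdateConnection:
    def __init__(self):
        self.sock = None

    async def receive_loop(self, botName):
        # Reconnects automatically and handles every incoming message.
        async for websocket in websockets.connect('ws://localhost:8000/updates/bot/' + botName):
            try:
                self.sock = websocket
                async for message in websocket:
                    logging.info("update received: %s", message)
            except websockets.ConnectionClosed:
                self.sock = None
                continue

    async def send(self, data):
        if self.sock:
            await self.sock.send(json.dumps(data))

You would then start receive_loop() as its own task next to your other tasks, e.g. loop.create_task(uSocket.receive_loop("dev")).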
I have a python socket server using asyncio and websockets. When the websocket is active 100+ devices will connect and hold their connection waiting for commands/messages.
There are two threads: the first thread accepts connections, adds their details to a global variable, and then waits for messages from the device:
async def thread1(websocket, path):
    client_address = await websocket.recv()
    CONNECTIONS[client_address] = websocket
    async for message in websocket:
        ...  # do something with message

start_server = websockets.serve(thread1, host, port)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.ensure_future(thread2())
asyncio.get_event_loop().run_forever()
The second thread processes some user data and once it needs to send a command it accesses a global variable to get the websocket info:
async def thread2():
    ...  # some data processing
    soc = CONNECTIONS[ipaddress]
    await soc.send("some message")
My question: What's the best way to allow another thread to send messages?
I can keep the global variable safe using thread locking and a function made only to process that data; however, global variables aren't ideal. I cannot send information between threads, since thread1 is stuck waiting to receive messages.
The first thing I would like to point out is the incorrect use of the term thread. You are using asyncio, where the relevant concept is the coroutine (a coroutine is wrapped into an asyncio task). How it differs from threads can be found, for example, here.
The websockets server spawns a new task for each incoming connection (there are as many spawned tasks as connections). I don't see anything wrong with the global object, at least in a small script; however, below I give an example where I place this state in a separate class.
Also, in this case, special synchronization between coroutines is not required, since they run via cooperative multitasking (in fact, they all execute in one thread, yielding control at certain points).
Here is a simple example in which the server stores a dictionary of incoming connections and starts a task that notifies all clients every 2 seconds, sending them the current time. The server also prints confirmations from clients to the console.
# ws_server.py
import asyncio
import websockets
import datetime

class Server:
    def __init__(self, host, port):
        self.host = host
        self.port = port
        self.connections = {}
        self.is_active = False
        self.server = None

    async def start(self):
        self.is_active = True
        self.server = await websockets.serve(self.handler, self.host, self.port)
        asyncio.create_task(self.periodic_notifier())

    async def stop(self):
        self.is_active = False
        self.server.close()
        await self.wait_closed()

    async def wait_closed(self):
        await self.server.wait_closed()

    async def handler(self, websocket, path):
        self.connections[websocket.remote_address] = websocket
        try:
            async for message in websocket:
                print(message)
        except websockets.ConnectionClosedError:
            pass
        del self.connections[websocket.remote_address]
        print(f"Connection {websocket.remote_address} is closed")

    async def periodic_notifier(self):
        while self.is_active:
            await asyncio.gather(
                *[ws.send(f"Hello time {datetime.datetime.now()}") for ws in self.connections.values()],
                return_exceptions=True)
            await asyncio.sleep(2)

async def main():
    server = Server("localhost", 8080)
    await server.start()
    await server.wait_closed()

asyncio.run(main())
# ws_client.py
import asyncio
import websockets

async def client():
    uri = "ws://localhost:8080"
    async with websockets.connect(uri) as websocket:
        async for message in websocket:
            print(message)
            await websocket.send(f"ACK {message}")

asyncio.run(client())
I have two websocket servers, call them Main and Worker, and this is the desired workflow:
Client sends message to Main
Main sends message to Worker
Worker responds to Main
Main responds to Client
Is this doable? I couldn't find any WS client functionality in Channels. I tried naively to do this (in consumers.py):
import json
import websockets
from channels.generic.websocket import AsyncWebsocketConsumer

class SampleConsumer(AsyncWebsocketConsumer):
    async def receive(self, text_data):
        async with websockets.connect(url) as worker_ws:
            await worker_ws.send(json.dumps({ 'to': 'Worker' }))
            result = json.loads(await worker_ws.recv())
            await self.send(text_data=json.dumps({ 'to': 'Client' }))
However, it seems that the with section blocks (Main doesn't seem to accept any further messages until the response is received from Worker). I suspect it is because websockets runs its own loop, but I don't know for sure. (EDIT: I compared id(asyncio.get_running_loop()) and it seems to be the same loop. I have no clue why it is blocking then.)
The response { "to": "Client" } does not need to be here, I would be okay even if it is in a different method, as long as it triggers when the response from Worker is received.
Is there a way to do this, or am I barking up the wrong tree?
If there is no way to do this, I was thinking of having a thread (or process? or a separate application?) that communicates with Worker, and uses channel_layer to talk to Main. Would this be viable? I would be grateful if I could get a confirmation (and even more so for a code sample).
EDIT: I think I see what is going on (though I'm still investigating). I believe one connection from Client instantiates one consumer, and while different instances can all run at the same time, within one consumer instance a second method doesn't seem to start until the previous one has finished. Is this correct? I'm now looking at whether moving the request-and-wait-for-response code into a thread would work.
I was in the same position where I wanted to process the message in my Django app whenever I receive it from another WebSocket server.
I took the idea of using the WebSockets client library and keeping it running as a separate process using the manage.py command from this post on the Django forum.
You can define an async coroutine client(websocket_url) to listen to messages received from the WebSocket server.
import asyncio
import websockets

async def client(websocket_url):
    async for websocket in websockets.connect(websocket_url):
        print("Connected to Websocket server")
        try:
            async for message in websocket:
                # Process message received on the connection.
                print(message)
        except websockets.ConnectionClosed:
            print("Connection lost! Retrying..")
            continue  # retries the websocket connection with exponential backoff
In the above code connect() acts as an infinite asynchronous iterator. More on that here.
You can run the above coroutine inside handle() method of the custom management command class.
runwsclient.py

import asyncio

from django.core.management.base import BaseCommand

# Assuming client() from above lives in e.g. myapp/ws_client.py; adjust the import to your project layout.
from myapp.ws_client import client

class Command(BaseCommand):
    def handle(self, *args, **options):
        URL = "ws://example.com/messages"
        print(f"Connecting to websocket server {URL}")
        asyncio.run(client(URL))
Finally, run the manage.py command.
python manage.py runwsclient
You can also pass a handler to client(ws_url, msg_handler) to process each message, so that the processing logic remains outside the client.
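A minimal sketch of that handler variant (the msg_handler parameter and print_handler are illustrative, not part of the original code):

import asyncio
import websockets

async def client(websocket_url, msg_handler):
    async for websocket in websockets.connect(websocket_url):
        try:
            async for message in websocket:
                # Delegate processing to the caller-supplied handler.
                await msg_handler(message)
        except websockets.ConnectionClosed:
            continue

async def print_handler(message):
    print(f"Got update: {message}")

# e.g. in handle(): asyncio.run(client(URL, print_handler))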
Update 31/05/2022:
I have created a django package to integrate the above functionality with the minimal setup: django-websocketclient
Correct, Django Channels does not provide a websocket client, as it is mainly used as a server.
From your code, it doesn't seem like you really need websocket communication between Main and Worker, as you just fire up a socket, send a single message, receive the response, and close the socket. This is the classical use case for regular HTTP, so if you do not really need to keep the connection alive, I suggest you use a regular HTTP endpoint instead and use aiohttp as a client.
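For illustration, a minimal sketch of that HTTP approach using aiohttp inside the consumer (the Worker URL is made up, and the payloads mirror the ones from the question):

import json

import aiohttp
from channels.generic.websocket import AsyncWebsocketConsumer

class SampleConsumer(AsyncWebsocketConsumer):
    async def receive(self, text_data):
        # Non-blocking HTTP round trip to the Worker instead of a one-shot websocket.
        async with aiohttp.ClientSession() as session:
            async with session.post('http://worker.example.com/process',
                                    json={'to': 'Worker'}) as resp:
                result = await resp.json()
        await self.send(text_data=json.dumps({'to': 'Client'}))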
However, if you do really need a client, then you should open the socket once on client connection and close it when the client disconnects. You can do something like this.
import asyncio
import json
import websockets

async def create_ws(on_create, on_message):
    uri = "wss://localhost:8765"
    async with websockets.connect(uri) as websocket:
        await on_create(websocket)
        async for message in websocket:
            await on_message(message)

class WebsocketClient:
    def __init__(self, channel):
        self.channel = channel
        self.ws = None
        # Run the connect/receive loop as a background task (an async __init__ is not possible).
        self.task = asyncio.create_task(create_ws(self.on_create, self.on_message))

    async def on_create(self, ws):
        self.ws = ws

    async def on_message(self, message):
        await self.channel.send(text_data=json.dumps(message))

    async def send(self, message):
        await self.ws.send(message)

    async def close(self):
        self.task.cancel()
        if self.ws is not None:
            await self.ws.close()
Then in your consumer, you can use the client as follows:
class SampleConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        self.ws_client = WebsocketClient(self)
        await self.accept()  # accept the client connection

    async def receive(self, text_data):
        await self.ws_client.send(text_data)

    async def disconnect(self, code):
        await self.ws_client.close()
It seems I managed to do it using the latest idea I posted — launching a thread to handle the connection to Worker. Something like this:
import asyncio
import json
import threading

import websockets
from channels.generic.websocket import AsyncWebsocketConsumer

class SampleConsumer(AsyncWebsocketConsumer):
    async def receive(self, text_data):
        threading.Thread(
            target=asyncio.run,
            args=(self.talk_to_worker(
                url,
                { 'to': 'Worker' },
            ),)
        ).start()

    async def talk_to_worker(self, url, message):
        async with websockets.connect(url) as worker_ws:
            await worker_ws.send(json.dumps(message))
            result = json.loads(await worker_ws.recv())
            await self.send(text_data=json.dumps({ 'to': 'Client' }))
It may actually be smarter to do it with HTTP requests in each direction (since both endpoints can be HTTP servers), but this seems to work.
I'm new to websockets. I've been using the examples on the Getting Started page of the websockets docs, mainly the Synchronization Example.
In this use case, I have a sqlite3 database on localhost. I edit that database from a python GUI program on localhost which just imports the database code layer directly. The client then tells the websocket server to send out some extracted data to all clients.
(Eventually this will be on a LAN, with the server machine running a Flask API.)
This is working, with the code below, but it's not clear if I'm doing it correctly. Basically I want to send websockets messages when certain database activity takes place, and I'm confused about how to do a 'simple' non-async send, when invoked from code, ultimately in response to a GUI interaction, as opposed to doing a send in response to an incoming websocket message. In pseudo-code:
def send(ws, msg):
    ws.send(msg)

send(ws, 'OK!')
The way I'm accomplishing that is wrapping the async def that does the sending in a non-async 'vanilla' def.
The websocket server code:
# modified from https://websockets.readthedocs.io/en/stable/intro.html#synchronization-example
import asyncio
import websockets

USERS = set()

async def register(websocket):
    print("register: " + str(websocket))
    USERS.add(websocket)

async def unregister(websocket):
    print("unregister: " + str(websocket))
    USERS.remove(websocket)

# each new connection calls trackerHandler, resulting in a new USERS entry
async def trackerHandler(websocket, path):
    await register(websocket)
    try:
        async for message in websocket:
            await asyncio.wait([user.send(message) for user in USERS])
    finally:
        await unregister(websocket)

start_server = websockets.serve(trackerHandler, "localhost", 8765)

asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_forever()
In the database interface code (on localhost this file is just imported directly by the GUI app; on the LAN server it is the file specified in the WSGI call in Flask):
import asyncio
import json
import websockets

# uri = "ws://localhost:8765"

# wrap the asynchronous send function inside a synchronous function
def wsSend(uri, msg):
    async def send():
        async with websockets.connect(uri) as websocket:
            # await websocket.send(str.encode(str(msg)))
            await websocket.send(json.dumps({"msg": msg}))
            # print(f"> {msg}")
            # greeting = await websocket.recv()
            # print(f"< {greeting}")
    asyncio.get_event_loop().run_until_complete(send())
...
...
def tdbPushTables(uri, teamsViewList=None, assignmentsViewList=None, teamsCountText="---", assignmentsCountText="---"):
    # uri = "ws://localhost:8765"
    if not teamsViewList:
        teamsViewList = tdbGetTeamsView()
    if not assignmentsViewList:
        assignmentsViewList = tdbGetAssignmentsView()
    if uri == 'pusher':
        pusher_client.trigger('my-channel', 'teamsViewUpdate', teamsViewList)
        pusher_client.trigger('my-channel', 'assignmentsViewUpdate', teamsViewList)
    else:
        wsSend(uri, json.dumps({
            "teamsView": teamsViewList,
            "assignmentsView": assignmentsViewList,
            "teamsCount": teamsCountText,
            "assignmentsCount": assignmentsCountText}))
It's actually the client that initiates the call to tdbPushTables:
def buildLists(self):
    self.teamsList = tdbGetTeamsView()
    self.assignmentsList = tdbGetAssignmentsView()
    self.updateCounts()
    tdbPushTables('ws://localhost:8765', self.teamsList, self.assignmentsList, self.teamsCountText, self.assignmentsCountText)
Feels spooky. Is it spooky or is this actually the right way to do it? Should I be using the websockets module for the server, but a different module to do the 'simple'/synchronous sending of the websocket message to the server?
Two known side effects of this solution: 1) it opens and closes the websocket connection on every call - probably not really a problem...?, and 2) it results in non-fatal handled messages like this in the server transcript:
register: <websockets.server.WebSocketServerProtocol object at 0x041C46F0>
Task exception was never retrieved
future: <Task finished coro=<WebSocketCommonProtocol.send() done, defined at C:\Users\caver\AppData\Roaming\Python\Python37\site-packages\websockets\protocol.py:521> exception=ConnectionClosedOK('code = 1000 (OK), no reason')>
Traceback (most recent call last):
File "C:\Users\caver\AppData\Roaming\Python\Python37\site-packages\websockets\protocol.py", line 555, in send
await self.ensure_open()
File "C:\Users\caver\AppData\Roaming\Python\Python37\site-packages\websockets\protocol.py", line 812, in ensure_open
raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedOK: code = 1000 (OK), no reason
unregister: <websockets.server.WebSocketServerProtocol object at 0x041C46F0>
EDIT: looks like the websocket (singular) module has a synchronous interface, and the websockets (plural) docs explain that if you want to go synchronous you should use a different module; so, this works:
(instead of importing asyncio and websockets)
import json
from websocket import create_connection

def wsSend(uri, msg):
    ws = create_connection(uri)
    ws.send(json.dumps({"msg": msg}))
    ws.close()
It does still result in the same handled traceback showing up in the server transcript each time wsSend is called; there's probably a way to silence that traceback output, but, regardless, it still doesn't seem to affect anything.
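One plausible way to silence those "Task exception was never retrieved" tracebacks (an assumption about the cause, not tested against this exact setup) is to broadcast with asyncio.gather(..., return_exceptions=True) in the server's trackerHandler, so a send to a just-closed connection is collected instead of being left as an unretrieved task exception:

async def trackerHandler(websocket, path):
    await register(websocket)
    try:
        async for message in websocket:
            # Collect per-connection send failures instead of leaving them unretrieved.
            await asyncio.gather(*[user.send(message) for user in USERS],
                                 return_exceptions=True)
    finally:
        await unregister(websocket)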
Your code feels spooky, because you are mixing async code with synchronous code.
Based on personal experience, the code is simpler to follow if you keep most of the code asynchronous.
The structure will become something like:
import asyncio
import websockets

async def main():
    # Create websocket connection
    async with websockets.connect(uri) as websocket:
        await your_function_that_does_some_processing(websocket)

asyncio.get_event_loop().run_until_complete(main())
Keep in mind that big sections of blocking code can cause trouble.
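If you do have blocking sections (for example, the sqlite3 calls), one common way to keep them from stalling the event loop is to push them to a thread pool with run_in_executor; a minimal sketch, where query_database is a hypothetical stand-in for your blocking function:

import asyncio

def query_database():
    # Placeholder for a blocking call, e.g. sqlite3 queries.
    return "some rows"

async def main():
    loop = asyncio.get_running_loop()
    # Run the blocking function in the default thread pool without blocking the loop.
    rows = await loop.run_in_executor(None, query_database)
    print(rows)

asyncio.get_event_loop().run_until_complete(main())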
I am trying to create a client which uses an asyncio.Queue to feed the messages I want to send to the server. Receiving data from the websocket server works great. Sending data which is just generated by the producer works, too. To explain what works and what fails, here's my code first:
import sys
import asyncio
import websockets

class WebSocketClient:
    def __init__(self):
        self.send_queue = asyncio.Queue()
        #self.send_queue.put_nowait('test-message-1')

    async def startup(self):
        await self.connect_websocket()
        consumer_task = asyncio.create_task(
            self.consumer_handler()
        )
        producer_task = asyncio.create_task(
            self.producer_handler()
        )
        done, pending = await asyncio.wait(
            [consumer_task, producer_task],
            return_when=asyncio.ALL_COMPLETED
        )
        for task in pending:
            task.cancel()

    async def connect_websocket(self):
        try:
            self.connection = await websockets.client.connect('ws://my-server')
        except ConnectionRefusedError:
            sys.exit('error: cannot connect to backend')

    async def consumer_handler(self):
        async for message in self.connection:
            await self.consumer(message)

    async def consumer(self, message):
        self.send_queue.put_nowait(message)
        # await self.send_queue.put(message)
        print('mirrored message %s now in queue, queue size is %s' % (message, self.send_queue.qsize()))

    async def producer_handler(self):
        while True:
            message = await self.producer()
            await self.connection.send(message)

    async def producer(self):
        result = await self.send_queue.get()
        self.send_queue.task_done()
        #await asyncio.sleep(10)
        #result = 'test-message-2'
        return result

if __name__ == '__main__':
    wsc = WebSocketClient()
    asyncio.run(wsc.startup())
Connecting works great. If I send something from my server to the client, this works great too and prints the message in consumer(). But producer never gets any message I put in send_queue inside consumer().
The reason I chose send_queue.put_nowait() in consumer() was to prevent deadlocks. Using await self.send_queue.put(message) instead of self.send_queue.put_nowait(message) makes no difference.
I thought maybe the queue does not work at all, so I put something into the queue right at creation in __init__(): self.send_queue.put_nowait("test-message-1"). This works and is sent to my server. So the basic concept of the queue and await queue.get() works.
I also thought maybe there is some issue with the producer, so let's just generate messages during runtime: result = "test-message-2" instead of result = await self.send_queue.get(). This works too: every 10 seconds 'test-message-2' is sent to my server.
EDIT: This also happens if I try to add stuff from another source to the queue on the fly. I built a small asyncio socket server which pushes any message to the queue, which works great, and you can see the messages I added from the other source with qsize() in consumer(), but still no successful queue.get(). So the queue itself seems to work, just not get(). This is btw the reason for the queue, too: I would like to send data from quite different sources.
So, this is the point where I'm stuck. My wild guess is that the queue I use in producer() is not the same as in consumer(), something which happens quite easily with threading if you use non-thread-safe queues like asyncio.Queue, but as I understand it I don't use threading at all, just coroutines. So what else went wrong here?
Just for context: it's Ubuntu 20.04 with Python 3.8.2 inside a Docker container.
Thanks,
Ernesto
Just for the record: the solution to my problem was quite simple. I defined send_queue outside the event loop created by my websocket client, so asyncio.Queue() internally called events.get_event_loop() and got its own loop, which was not part of the main loop and therefore never ran; that's why await queue.get() never got anything back.
In normal mode you don't see any message hinting at this issue. But the Python documentation came to the rescue: of course it is mentioned at https://docs.python.org/3/library/asyncio-dev.html; enabling logging.DEBUG gave the hints I needed to find the problem.
It should look like this:
class WebSocketClient:
    async def startup(self):
        self.send_queue = asyncio.Queue()
        await self.connect_websocket()
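For reference, a minimal sketch of how to turn on the diagnostics that surface this kind of problem (asyncio debug mode plus DEBUG-level logging, as described on the linked asyncio-dev page):

import asyncio
import logging

# asyncio emits extra warnings in debug mode (slow callbacks, futures used on the wrong loop, ...).
logging.basicConfig(level=logging.DEBUG)

wsc = WebSocketClient()
asyncio.run(wsc.startup(), debug=True)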