I am building a simple chat app and I have been trying to emit messages to rooms with Flask-SocketIO.
The "mesage_event" event from the client reaches the server well, but then, I cannot see anything on the client side. I don't know whether the server emits something to the room but on the client side, I cannot see anything in the console. I can only successfully send to all clients with broadcasting.
Here is my code on the server side:
#socketio.on("send msg")
def sendsocket(data):
print("send msg start:", data["msg"])
msg = data["msg"]
room = data["room"]
emit("message_event", msg, room = room)
And the client side:
1- for sending the message:
socket.emit('send msg', {'msg': msg, 'room': room})
2- for triggering the event handler:
socket.on('message_event', data => {
    console.log("message received:", data);
});
You are missing two things.
First, you need to send a "join" event to the server.
<script>
    function joinRoom() {
        console.log("ask server to join room");
        socket.emit("join", { "user": Date.now(), "room": "Notifications" });
    }
</script>
<body>
    <button onclick="joinRoom()">Join</button>
</body>
For example, here I attached the trigger to a button. And to make it easier to initially test adding users to rooms, I use Date.now() as the username. You can open different tabs to serve as different users.
Second, you need a handler for that join event.
There is an example in the Rooms and Namespaces section of the Flask-SocketIO docs.
#socketio.on("join")
def on_join(data):
user = data["user"]
room = data["room"]
print(f"client {user} wants to join: {room}")
join_room(room)
emit("room_message", f"Welcome to {room}, {user}", room=room)
In the handler, you need to call join_room to add the user to a room under the current namespace. Note the part about the namespace: by default, all connections are under the root (/) namespace. If you have custom namespaces, each namespace has its own rooms.
There is also a corresponding leave_room method.
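For illustration, here is a minimal sketch of what room handlers under a custom namespace could look like. The "/chat" namespace and the "leave" event are placeholder names, and socketio is assumed to be the same SocketIO instance as above.
from flask_socketio import emit, join_room, leave_room

# Rooms under the hypothetical "/chat" namespace are independent of the
# rooms under the default "/" namespace.
@socketio.on("join", namespace="/chat")
def on_chat_join(data):
    join_room(data["room"])
    emit("room_message", f"{data['user']} joined", room=data["room"], namespace="/chat")

@socketio.on("leave", namespace="/chat")
def on_chat_leave(data):
    leave_room(data["room"])
    emit("room_message", f"{data['user']} left", room=data["room"], namespace="/chat")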
Here is the complete server-side code:
#socketio.on("connect")
def connect():
print("client wants to connect")
emit("status", { "data": "Connected. Hello!" })
#socketio.on("join")
def on_join(data):
user = data["user"]
room = data["room"]
print(f"client {user} wants to join: {room}")
join_room(room)
emit("room_message", f"Welcome to {room}, {user}", room=room)
Here is the complete client-side code:
<script type="text/javascript" charset="utf-8">
const socket = io();
socket.on("connect", () => {
console.log("connect");
});
socket.on("status", (status) => {
console.log("received status: " + status.data);
});
socket.on("room_message", (msg) => {
console.log("message from room: " + msg);
});
function joinRoom() {
console.log("ask server to join room");
socket.emit("join", { "user": Date.now(), "room": "Notifications" });
}
</script>
<body>
<button onclick="joinRoom()">Join</button>
</body>
Now you can open multiple tabs and connect each one to the server. The server side should show the following messages:
client wants to connect
client wants to connect
client wants to connect
client 1582428076724 wants to join: Notifications
client 1582428080023 wants to join: Notifications
client 1582428082916 wants to join: Notifications
And in the console of the first user to join the room (1582428076724), you should see the following logs as the other users join the room:
connect
received status: Connected. Hello!
ask server to join room
message from room: Welcome to Notifications, 1582428076724
message from room: Welcome to Notifications, 1582428080023
message from room: Welcome to Notifications, 1582428082916
I don't know why the send and emit functions from flask_socketio do not work, but if I use socketio (the SocketIO server instance), it sends the message successfully:
socketio.emit('EVENT_NAME', data, to='YOUR_ROOM')
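A likely explanation: the context-aware emit and send from flask_socketio only work inside a Socket.IO event handler, where they can pick up the client's context; from an ordinary route or a background task you have to go through the SocketIO server instance. A minimal sketch, assuming the usual app/SocketIO setup ("my_room" is just a placeholder room name):
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on("send msg")
def sendsocket(data):
    # inside an event handler the context-aware emit works
    emit("message_event", data["msg"], room=data["room"])

def notify_room(msg):
    # outside a handler there is no Socket.IO context, so use the
    # server instance directly ("my_room" is a placeholder)
    socketio.emit("message_event", msg, to="my_room")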
class WSTestView(WebsocketConsumer):
    def connect(self):
        self.accept()
        self.send(json.dumps({'status': 'sent'}))  # client receives this

    def receive(self, text_data=None, bytes_data=None):
        notifications = Notification.objects.filter(receiver=text_data)  # receives the user id
        serializer = NotificationSerializer(notifications, many=True).data
        self.send(serializer)  # client does not receive this
Frontend
// ...
useEffect(() => {
    socket.onmessage = (e) => { console.log(e.data) }
}, [])
// ...
I've just started with django-channels and am working on a consumer that sends the user's notifications when it receives the user's id, but on the frontend the onmessage event does not receive anything. How can I fix this, and is there a better way to implement it?
If you're able to connect, it's probably your frontend and not django-channels. The most likely reason is that, because of componentDidMount/useEffect, your onmessage handler binds only after the server has already sent the message.
As a test, to rule that out, try putting
socket.onmessage = (e) => { console.log(e.data) }
right after the new WebSocket(...) call. Let me know if that prints something to the console. Another way to test/isolate it is with an online WebSocket tester such as https://www.lddgo.net/en/network/websocket
class WSTestView(WebsocketConsumer):
    def connect(self):
        self.accept()
        self.send(json.dumps({'status': 'sent'}))

    def receive(self, text_data=None, bytes_data=None):
        notifications = Notification.objects.filter(receiver=text_data)  # receives the user id
        serializer = NotificationSerializer(notifications, many=True).data
        self.send(json.dumps(serializer))  # <-------
Had to convert the serialized data into a JSON string.
Frontend
useEffect(() => {
    socket.onmessage = (e) => { console.log(e.data) }
}, [socket]) // <-------
Had to put the socket variable into the dependency array.
I'm very new to django-channels so this is probably a very simple question.
On our website there is a permanent "Messages" button in the header, and I want users to be notified about new messages immediately, so I use Channels for this. When a new message is created, I send the number of unread conversations to the client through Channels:
class Message(..):
    def save(...):
        notify_recipient(self)

def notify_recipient(self):
    Group('%s' % self.recipient).send({
        "text": json.dumps({
            "message": {"text": truncatechars(self.text, 100)},
            "unreaded_conversations": Conversation.objects.get_unreaded_conversations(self.recipient).count(),
        }),
    })
And in base.html:
const webSocketBridge = new channels.WebSocketBridge();
webSocketBridge.connect('/notifications/');
webSocketBridge.listen(function (action, stream) {
    console.log(action, stream);
    var conversations_sidebar = $('#id_conversations_sidebar');
    var messages_list = $('#messagesList');
    if (action.unreaded_conversations) {
        $('#id_unreaded_conversations_count').text(action.unreaded_conversations);
    }
});
On the other hand, there is a page /chat/detail/<username>/ where users chat with each other. This chat should be live, so I need to receive messages through Channels.
For now, I've added the rendered message to the notify_recipient method, but the problem is that it then always has to render the message, even when the user is not on the /chat/detail/<username>/ URL, which is not efficient.
Do you know how to receive rendered messages only when the user is in the current chat?
routing.py
@channel_session_user
def message_handler(message):
    message.reply_channel.send({"accept": True})

@channel_session_user_from_http
def ws_connect(message):
    Group("%s" % message.user).add(message.reply_channel)
    message.reply_channel.send({"accept": True})

channel_routing = [
    route("websocket.receive", message_handler),  # we register our message handler
    route("websocket.connect", ws_connect),       # we register our connect handler
]
I am using AutobahnPython with Twisted (WAMP) on the server side and AutobahnJS in the browser. Is there a straightforward way to allow/restrict subscriptions on a per-session basis? For example, a client should not be able to subscribe to topics relevant to other users.
While I am NOT using crossbar.io, I tried the Python code shown in the 'Example' section at the end of http://crossbar.io/docs/Authorization/, where an RPC call is first used to authorize a client. Of course, I am using my own authorization logic. Once this authorization succeeds, I'd like to give the client privileges to subscribe only to topics related to that client, like 'com.example.user_id'. My issue is that even when auth passes, I have not found a way to limit subscription requests in the ApplicationSession class, which is where the authorization takes place. How can I prevent a client who authorizes with user_id=user_a from subscribing to 'com.example.user_b'?
You can authorize by creating your own router. To do that, subclass Router() and override (at a minimum) the authorize() method:
def authorize(self, session, uri, action):
    return True
This method is pretty simple: if you return True, the session is authorized to do whatever it is attempting. You could make a rule that all subscriptions must start with 'com.example.USER_ID'; your Python code would then split the URI, take the third field, and compare it to the current session's id, returning True if they match and False otherwise (a sketch of this rule follows the notes after the code below). This is where things get a little weird, though. I have code that does a similar thing; here is my authorize() method:
@inlineCallbacks
def authorize(self, session, uri, action):
    authid = session._authid
    if authid is None:
        authid = 1
    log.msg("AuthorizeRouter.authorize: {} {} {} {} {}".format(
        authid, session._session_id, uri, IRouter.ACTION_TO_STRING[action], action))
    if authid != 1:
        rv = yield self.check_permission(authid, uri, IRouter.ACTION_TO_STRING[action])
    else:
        rv = yield True
    log.msg("AuthorizeRouter.authorize: rv is {}".format(rv))
    if not uri.startswith(self.svar['topic_base']):
        self.sessiondb.activity(session._session_id, uri, IRouter.ACTION_TO_STRING[action], rv)
    returnValue(rv)
Note that I dive into the session to get the _authid, which is probably bad karma because I should not be looking at these private variables, but I don't know where else to get it.
Also of note, this goes hand in hand with authentication. In my implementation, the _authid is the authenticated user id, which is similar to a unix user id (a positive unique integer). I am pretty sure it can be anything, like a string, so you should be fine using 'user_b' as the _authid if you wish.
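For the prefix rule described above (all subscriptions must start with 'com.example.USER_ID'), the body of authorize() could look roughly like the sketch below. It is only an illustration: it reads the id from session._authid exactly as the code above does, and it assumes your user ids stringify to the third URI component.
def authorize(self, session, uri, action):
    # expect topic URIs of the form com.example.<user_id>[.suffix]
    parts = uri.split(".")
    if len(parts) < 3 or parts[0] != "com" or parts[1] != "example":
        return False
    # allow only if the third component matches the authenticated id
    return parts[2] == str(session._authid)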
I found a relatively simple solution using a Node guest. Here's the code:
// crossbar setup
var autobahn = require('autobahn');
var connection = new autobahn.Connection({
    url: 'ws://127.0.0.1:8080/ws',
    realm: 'realm1'
});

// Websocket to Scratch setup
// pull in the required node packages and assign variables for the entities
var WebSocketServer = require('websocket').server;
var http = require('http');
var ipPort = 1234; // ip port number for Scratch to use

// this connection is a crossbar connection
connection.onopen = function (session) {
    // create an http server that will be used to contain a WebSocket server
    var server = http.createServer(function (request, response) {
        // We are not processing any HTTP, so this is an empty function. 'server' is a wrapper for the
        // WebSocketServer we are going to create below.
    });

    // Create an IP listener using the http server
    server.listen(ipPort, function () {
        console.log('Webserver created and listening on port ' + ipPort);
    });

    // create the WebSocket Server and associate it with the httpServer
    var wsServer = new WebSocketServer({
        httpServer: server
    });

    // WebSocket server has been activated and a 'request' message has been received from the client websocket
    wsServer.on('request', function (request) {
        // accept a connection request from Xi4S
        // myconnection is the WS connection to Scratch
        var myconnection = request.accept(null, request.origin); // The server is now 'online'

        // Process Xi4S messages
        myconnection.on('message', function (message) {
            console.log('message received: ' + message.utf8Data);
            session.publish('com.serial.data', [message.utf8Data]);
            // Process each message type received
        });

        // Handle the client closing the connection (registered once per connection,
        // outside the message handler)
        myconnection.on('close', function (myconnection) {
            console.log('Client closed connection');
            boardReset();
        });
    });
};
connection.open();
When a client makes a GET request to '/test', a simple string is exchanged between Node.js and Python via AMQP, but I don't know how to transmit the response back to the client (since the process is async).
test.py
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='localhost'))
channel = connection.channel()
channel.queue_declare(queue='task_queue', durable=True)
print ' [*] Waiting for messages. To exit press CTRL+C'

def callback(ch, method, props, body):
    print " [x] Received %r" % (body,)
    response = body + " MODIFIED"
    #response = get_a_concept()
    print " [x] Done"
    ch.basic_publish(exchange='',
                     routing_key=props.reply_to,
                     properties=pika.BasicProperties(correlation_id=props.correlation_id),
                     body=str(response))
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_qos(prefetch_count=1)
channel.basic_consume(callback,
                      queue='task_queue')
channel.start_consuming()
app.js
var connection = amqp.createConnection({ host: 'localhost' });

connection.addListener('ready', function() {
    var exchange = connection.exchange('', {
        'type': 'direct',
        durable: false
    }, function() {
        var queue = connection.queue('incoming', {
            durable: false,
            exclusive: true
        }, function() {
            queue.subscribe(function(msg) {
                // got response here, how to transmit it to the node that made the exchange?
                console.log("received message: ");
                console.log(msg.data.toString());
            });
        });
    });
});
The user's request publishes a message to Python, but how do I reply to that user once the work is finished?
app.get('/test', loadUser, function(req, res) {
    console.log("sent");
    exchange.publish('task_queue', "funciona!", {
        'replyTo': 'incoming'
    });
    res.redirect('/home');
});
(Note: I'm not sure if this is the best implementation. Hints, suggestions welcomed!)
I solved it as follows:
The requesting side sets the reply-to and correlation-id headers when sending a message and stores the information needed to process the reply (in my case, NodeJS, a callback) in a list with the correlation-id as index.
The responding side publishes to a direct exchange with reply-to as routing key and sets the correlation-id on the message.
Now when the message arrives back at the requester, it simply gets (and removes) the needed info from the list and handles the response.
Edit: Of course there's some more work to be done if you want to handle timeouts, etc., but that all depends on your use case.
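For comparison with the Node.js description above, the requester side of this reply-to/correlation-id pattern looks roughly like this in Python with a recent pika. Only the 'task_queue' name and the payload come from the question; everything else (the pending dict, the call() helper) is illustrative.
import uuid
import pika

pending = {}  # correlation_id -> reply body (None while still waiting)

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

# exclusive, auto-named reply queue for this requester
result = channel.queue_declare(queue='', exclusive=True)
callback_queue = result.method.queue

def on_response(ch, method, props, body):
    # match the reply to its request via the correlation id
    if props.correlation_id in pending:
        pending[props.correlation_id] = body

channel.basic_consume(queue=callback_queue,
                      on_message_callback=on_response,
                      auto_ack=True)

def call(payload):
    corr_id = str(uuid.uuid4())
    pending[corr_id] = None
    channel.basic_publish(exchange='',
                          routing_key='task_queue',
                          properties=pika.BasicProperties(reply_to=callback_queue,
                                                          correlation_id=corr_id),
                          body=payload)
    while pending[corr_id] is None:  # in Node this would be a stored callback, not a loop
        connection.process_data_events(time_limit=1)
    return pending.pop(corr_id)

print(call("funciona!"))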
I have an isolated Python script that simply captures data from Twitter's streaming API and then, on receipt of each message, publishes it to the "tweets" channel using Redis pub/sub. Here is that script:
import redis
import tweetstream

# Redis connection used to publish the tweets (same settings as in views.py below)
db = redis.Redis(host="localhost", port=6379, db=0)

def main():
    username = "username"
    password = "password"
    track_list = ["apple", "microsoft", "google"]
    with tweetstream.FilterStream(username, password, track=track_list) as stream:
        for tweet in stream:
            text = tweet["text"]
            user = tweet["user"]["screen_name"]
            message = {"text": text, "user": user}
            db.publish("tweets", message)

if __name__ == '__main__':
    try:
        print "Started..."
        main()
    except KeyboardInterrupt:
        print '\nGoodbye!'
My server-side socket.io implementation uses django-socketio (based on gevent-socketio), https://github.com/stephenmcd/django-socketio, which provides a few helper decorators as well as a broadcast_channel method. Because it's done in Django, I've put this code in views.py simply so that it gets imported. My views.py code:
def index(request):
    return render_to_response("twitter_app/index.html", {
    }, context_instance=RequestContext(request))

def _listen(socket):
    db = redis.Redis(host="localhost", port=6379, db=0)
    client = db.pubsub()
    client.subscribe("tweets")
    tweets = client.listen()
    while True:
        tweet = tweets.next()
        tweet_data = ast.literal_eval(tweet["data"])
        message = {"text": tweet_data["text"], "user": tweet_data["user"], "type": "tweet"}
        socket.broadcast_channel(message)

@on_subscribe(channel="livestream")
def subscribe(request, socket, context, channel):
    g = Greenlet.spawn(_listen, socket)
The client-side socket.io JavaScript simply connects, subscribes to the "livestream" channel, and logs any messages received on that channel:
var socket = new io.Socket();
socket.connect();

socket.on('connect', function() {
    socket.subscribe("livestream");
});

socket.on('message', function(data) {
    console.log(data);
});
The obvious problem with this code is that each time a new user or browser window opens the page, a new _listen greenlet is spawned, so the tweets get subscribed to and broadcast once per user, and the client receives duplicate messages. My question is: where would the proper place be to put the _listen method so that it's only created once, regardless of the number of clients? Also, keep in mind that broadcast_channel is a method of a socket instance.
The problem was that I was using socket.broadcast_channel when I should have been using socket.send.
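For reference, the corrected _listen then looks roughly like this; only the last line changes, the rest is the question's own code:
def _listen(socket):
    db = redis.Redis(host="localhost", port=6379, db=0)
    client = db.pubsub()
    client.subscribe("tweets")
    tweets = client.listen()
    while True:
        tweet = tweets.next()
        tweet_data = ast.literal_eval(tweet["data"])
        message = {"text": tweet_data["text"], "user": tweet_data["user"], "type": "tweet"}
        socket.send(message)  # send to this client's socket instead of broadcasting to the channel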