I'm attempting to send information from outside consumers.py so that it can be displayed on the client end via consumers.py.
I've referenced the previous question "Send message using Django Channels from outside Consumer class", but the .group_send and .group_add methods don't seem to exist, so I feel it's possible I'm missing something very easy.
Consumers.py
from channels.generic.websocket import WebsocketConsumer
from asgiref.sync import async_to_sync
class WSConsumer(WebsocketConsumer):
    def connect(self):
        async_to_sync(self.channel_layer.group_add)("appDel", self.channel_name)
        self.accept()
        self.render()
appAlarm.py
import pandas as pd
import requests as r
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

def appFunc(csvUpload):
    # csvUpload=pd.read_csv(request.FILES['filename'])
    csvFile = pd.DataFrame(csvUpload)
    colUsernames = csvFile.usernames
    print(colUsernames)

    channel_layer = get_channel_layer()

    for user in colUsernames:
        req = r.get('https://reqres.in/api/users/2')
        print(req)
        t = req.json()
        data = t['data']['email']
        print(user + " " + data)
        message = user + " " + data
        async_to_sync(channel_layer.group_send)(
            'appDel',
            {'type': 'render', 'message': message}
        )
It's throwing this error:
async_to_sync(channel_layer.group_send)(
AttributeError: 'NoneType' object has no attribute 'group_send'
and it will throw the same error for group_add when I strip things back further to figure out what's going on, but per the documentation I feel like this should be working.
To anyone looking at this in the future: I was not able to use Redis (or even Memurai on Windows) due to cost. I ended up using server-sent events (SSE), specifically django-eventstream, and so far it has worked great since I didn't need the client to interact with the server; for a chat application this would not work.
django-eventstream creates an endpoint at /events/ that the client can connect to in order to receive a streaming HTTP response.
Sending data from externalFunc.py:
send_event('test', 'message', {'text': 'Hello World'})
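For context, a minimal sketch of how the /events/ endpoint can be exposed in urls.py, assuming a single event channel named 'test' (the channel used in the send_event call above); check the django-eventstream docs for the exact setup:
# urls.py (sketch): mount the SSE endpoint the client connects to
from django.urls import include, path
import django_eventstream

urlpatterns = [
    path('events/', include(django_eventstream.urls), {'channels': ['test']}),
]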
Event listener in HTML page:
var es = new ReconnectingEventSource('/events/');

es.addEventListener('message', function (e) {
    console.log(e.data);
    // Parse the event payload and append its text to the page
    var para = document.createElement("P");
    const obj = JSON.parse(e.data);
    para.innerText = obj.text;
    document.body.appendChild(para);
}, false);

es.addEventListener('stream-reset', function (e) {
    // Fired when the client fell behind and the stream was reset
}, false);
Related
I'm sending a stream of data from a Flask server via yield. I'm able to see the stream if I go directly to the API URL; however, I don't know how to receive it on the frontend. I would appreciate the help. Here is what my backend looks like:
STREAM
def redis_stream():
    global lock
    channel = r.pubsub()
    channel.subscribe('CellGridMapClose')
    for msg in channel.listen():
        if msg['type'] == 'message':
            obj = tm.CellGridMapping()
            obj.ParseFromString(msg['data'])
            objects = obj.objects
            movement = []
            for vehicle in objects:
                x, y = vehicle.pos.x, vehicle.pos.y
                movement.append({'posx': x/50, 'posy': y/40})
            yield bytes(json.dumps(movement), 'utf-8')
ROUTE
@app.route('/redis-stream')
def redis_data():
    return Response(redis_stream(), mimetype='application/json')
This is how my frontend looks. I've tried many variants; this is the latest one, but it's not working.
FRONTEND
const response = await axios.get("/redis-stream", {responseType: 'arraybuffer'});
console.log(response.data);
class WSTestView(WebsocketConsumer):
    def connect(self):
        self.accept()
        self.send(json.dumps({'status': 'sent'}))  # client receives this

    def receive(self, text_data=None, bytes_data=None):
        notifications = Notification.objects.filter(receiver=text_data)  # receives user id
        serializer = NotificationSerializer(notifications, many=True).data
        self.send(serializer)  # client does not receive this
Frontend
// ...
useEffect(() => {
    socket.onmessage = (e) => { console.log(e.data) }
}, [])
// ...
I've just started with django-channels and am working on a consumer that sends the user's notifications when it receives the user's id, but on the frontend the onmessage event does not receive anything. How can I fix this, and is there a better way to implement it?
It's probably your frontend and not django-channels if you're able to connect to it. The most probable reason is that, because of componentDidMount/useEffect, your onmessage handler binds after the server has already sent the message.
Just as a test, to rule that out, try putting the
socket.onmessage = (e) => { console.log(e.data) }
right after the new WebSocket(...). Let me know if that prints something in the console. Another way to test/isolate it is with an online WebSocket tester like this one: https://www.lddgo.net/en/network/websocket
class WSTestView(WebsocketConsumer):
    def connect(self):
        self.accept()
        self.send(json.dumps({'status': 'sent'}))

    def receive(self, text_data=None, bytes_data=None):
        notifications = Notification.objects.filter(receiver=text_data)  # receives user id
        serializer = NotificationSerializer(notifications, many=True).data
        self.send(json.dumps(serializer))  # <-------
Had to convert the serialized data into a JSON string.
Frontend
useEffect(() => {
    socket.onmessage = (e) => { console.log(e.data) }
}, [socket])  // <-------
Had to put the socket variable into the dependency array
I'm using Django Channels to update the user on the current status of a potentially long-running task, and I'm facing a WebSocket DISCONNECT that I would like to track down.
The setup looks pretty straightforward. settings.py defines the channel layer:
ASGI_APPLICATION = "config.routing.application"
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            "hosts": [('127.0.0.1', 6379)],
        },
    },
}
In routing.py we basically follow the default suggestion from the Channels docs (the pattern simply matches a UUID, which we use as a public-facing identifier):
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.conf.urls import url
from lektomat.consumers import analysis
application = ProtocolTypeRouter({
    # (http->django views is added by default)
    "websocket": AuthMiddlewareStack(
        URLRouter([
            # Use a regex to match the UUID as the Django version with its '<uuid:analysis_id>' does not work for Channels.
            url(r"^ws/analysis/(?P<analysis_id>[a-f0-9]{8}-[a-f0-9]{4}-[1-5][a-f0-9]{3}-[89aAbB][a-f0-9]{3}-[a-f0-9]{12})/$",
                analysis.AnalysisConsumer),
        ])),
})
The consumer does not do much more than send some initial data over the newly established websocket connection and log info all over the place:
import logging
import json
from channels.db import database_sync_to_async
from uuid import UUID
from typing import Tuple
from django.utils import timezone
from channels.generic.websocket import AsyncWebsocketConsumer
from myapp.models import AnalysisResult
logger = logging.getLogger(__name__)
class AnalysisConsumer(AsyncWebsocketConsumer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.analysis_id = None
        self.analysis_group_name = 'analysis_'

    async def connect(self):
        self.analysis_id = self.scope['url_route']['kwargs']['analysis_id']
        self.analysis_group_name = "analysis_{}".format(self.analysis_id)
        await self.channel_layer.group_add(self.analysis_group_name, self.channel_name)
        logger.info("%s – AnalysisConsumer: connect()ed and joined group %s.", str(timezone.now()), self.analysis_group_name)
        dummy_percent = 11
        dummy_text = 'Dummy'
        logger.debug("%s – Sending initial channel update to group %s with a %d percent and status_text=%s", str(timezone.now()), self.analysis_group_name, dummy_percent, dummy_text)
        await self.send(text_data=json.dumps({
            'progress_percent': dummy_percent,
            'status_text': dummy_text
        }))
        await self.accept()

    async def disconnect(self, code):
        logger.info("%s – AnalysisConsumer: disconnecting with code=%s (internal group name=%s).", str(timezone.now()), code, self.analysis_group_name)
        await self.channel_layer.group_discard(self.analysis_group_name, self.channel_name)
        logger.info("%s – AnalysisConsumer: disconnect(%s)ed and left room %s", str(timezone.now()), code, self.analysis_group_name)

    async def receive(self, text_data=None, bytes_data=None):
        logger.info("%s – unexpectedly received data from the websocket, text_data=%s, bytes_data=%s", str(timezone.now()), text_data, str(bytes_data))
And finally, the client's javascript is connecting to the websocket endpoint:
var ws_scheme = window.location.protocol == "https:" ? "wss" : "ws";
var ws_uri = ws_scheme + '://' + window.location.host + '/ws/analysis/' + '{{ result_id }}' + '/';
var socket = new WebSocket(ws_uri);
socket.onopen = function open() {
    let now = new Date();
    console.info(now.toLocaleString() + ':' + now.getMilliseconds() + ' – WebSocket connection created.');
};
socket.onmessage = function(e) {
    console.log("WebSocket message received.")
    const data = JSON.parse(e.data);
    console.log("WebSocket message: " + data.status_text + " at " + data.progress_percent + " percent.");
};
socket.onclose = function(e) {
    let now = new Date();
    console.error(now.toLocaleString() + ':' + now.getMilliseconds() + ' – Analysis socket closed with event code = ' + e.code + ' and reason=' + e.reason);
};
socket.onerror = function(error) {
    let now = new Date();
    let msg = now.toLocaleString() + ':' + now.getMilliseconds() + ' – WebSocket error: ' + error;
    console.error(msg);
}
The Redis backend is up and running.
But: The websocket connection is closed right after its start. More precisely, the browser's JS console logs (in German, faulty translation is mine):
(!) Firefox cannot connect to server at ws://localhost:8000/ws/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b/
(!) 10.8.2020, 22:30:21:317 – WebSocket error: [object Event]
onerror http://localhost:8000/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b:149
(Async: EventHandlerNonNull)
<anonym> http://localhost:8000/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b:146
(!) 10.8.2020, 22:30:21:319 – Analysis socket closed with event code = 1006 and reason=
onclose http://localhost:8000/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b:143
(Async: EventHandlerNonNull)
<anonym> http://localhost:8000/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b:141
The server console says:
request: <AsgiRequest: GET '/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b'>
HTTP GET /analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b 200 [1.37, 127.0.0.1:51562]
WebSocket HANDSHAKING /ws/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b/ [127.0.0.1:51574]
2020-08-10 20:30:20.549519+00:00 – AnalysisConsumer: connect()ed and joined group analysis_d222ebe1-5a2a-4797-9466-24db1de5d24b.
2020-08-10 20:30:20.610167+00:00 – Sending initial channel update to group analysis_d222ebe1-5a2a-4797-9466-24db1de5d24b with a 11 percent and status_text=Dummy
WebSocket DISCONNECT /ws/analysis/d222ebe1-5a2a-4797-9466-24db1de5d24b/ [127.0.0.1:51574]
Thus, the server receives the connecting request, establishes the websocket connection and immediately disconnects. The client responds about half a second later with a mere error with code 1006 (which doesn't tell much). The consumer's disconnect() is never called.
Who might be initiating the WebSocket DISCONNECT? It doesn't seem to be any of the application code. Pointers on what's missing here are appreciated.
That consumer is trying to send a message before accepting the connection.
I just tested and Daphne throws this exception:
File "...site-packages/daphne/ws_protocol.py", line 193, in handle_reply
"Socket has not been accepted, so cannot send over it"
And the clients receive a CloseEvent with code 1006
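A minimal sketch of the fix, reusing the consumer from the question: accept the connection first, then send the initial update.
    async def connect(self):
        self.analysis_id = self.scope['url_route']['kwargs']['analysis_id']
        self.analysis_group_name = "analysis_{}".format(self.analysis_id)
        await self.channel_layer.group_add(self.analysis_group_name, self.channel_name)
        await self.accept()  # accept first ...
        await self.send(text_data=json.dumps({  # ... only then send over the socket
            'progress_percent': 11,
            'status_text': 'Dummy'
        }))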
This looks like the same problem I described here.
The last message from your server before the disconnect shows it trying to send something (large?). I discovered that sending large WebSocket messages may lead to a disconnect with error 1006.
You may try to configure your WebSocket server to send data in small chunks. It helped in my case.
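Purely as an illustration (a hypothetical helper, not from the linked answer), splitting a large text payload into smaller frames from an AsyncWebsocketConsumer could look like this; the client then reassembles the pieces:
# Hypothetical helper: send a large text payload in smaller websocket frames
async def send_in_chunks(consumer, payload, chunk_size=4096):
    for start in range(0, len(payload), chunk_size):
        await consumer.send(text_data=payload[start:start + chunk_size])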
I have created a consumer that implements some text operations using a shared task, which returns the response back through the channel layer.
#consumers.py
import json
import pdb
from asgiref.sync import async_to_sync
from channels.generic.websocket import AsyncWebsocketConsumer
from . import tasks
COMMANDS = {
    'help': {
        'help': 'Display help message.',
    },
    'sum': {
        'args': 2,
        'help': 'Calculate sum of two integer arguments. Example: `sum 12 32`.',
        'task': 'add'
    },
    'status': {
        'args': 1,
        'help': 'Check website status. Example: `status twitter.com`.',
        'task': 'url_status'
    },
}
class Consumer(AsyncWebsocketConsumer):
    async def receive(self, text_data):
        text_data_json = json.loads(text_data)
        message = text_data_json['message']
        # response_message = 'Please type `help` for the list of the commands.'
        message_parts = message.split()
        if message_parts:
            command = message_parts[0].lower()
            if command == 'help':
                response_message = 'List of the available commands:\n' + '\n'.join([f'{command} - {params["help"]} ' for command, params in COMMANDS.items()])
            elif command in COMMANDS:
                if len(message_parts[1:]) != COMMANDS[command]['args']:
                    response_message = f'Wrong arguments for the command `{command}`.'
                else:
                    getattr(tasks, COMMANDS[command]['task']).delay(self.channel_name, *message_parts[1:])
                    # response_message = f'Command `{command}` received.'
                    response_message = message
        await self.channel_layer.send(
            self.channel_name,
            {
                'type': 'chat_message',
                'message': response_message
            }
        )
#tasks.py
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer

@shared_task
def add(channel_name, x, y):
    message = '{}+{}={}'.format(x, y, int(x) + int(y))
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.send)(channel_name, {"type": "chat.message", "message": message})
I want to expose this channel as an API that can be accessed using an HTTP request, for which I have written the following view.
views.py
@csrf_exempt
@api_view(['POST'])
def api(request):
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.send)('test_channel', {'type': 'hello'})
    ret = async_to_sync(channel_layer.receive)(channel_name)
    return JsonResponse({"msg": ret})
When receiving from the view, I get back the same message that I sent. How can I share the channel, or handle incoming messages, without connecting over WebSockets from the template?
If you just want the POST request to send a message:
views.py
@csrf_exempt
@api_view(['POST'])
def api(request):
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.send)('test_channel', {'type': 'chat.message'})
    return JsonResponse({"msg": "sent"})
You will need to ensure you have subscribed to test_channel in your consumer, and you will need a chat_message method on that consumer, as sketched below.
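A minimal sketch of one way to wire that up. Since consumers' channel names are auto-generated, this sketch uses a group (reusing the name 'test_channel' as a group name) rather than a raw channel name:
# consumers.py (sketch): join a group on connect and handle 'chat.message' events
import json
from channels.generic.websocket import AsyncWebsocketConsumer

class Consumer(AsyncWebsocketConsumer):
    async def connect(self):
        await self.channel_layer.group_add("test_channel", self.channel_name)
        await self.accept()

    async def chat_message(self, event):
        # Invoked for channel-layer events whose 'type' is 'chat.message'
        await self.send(text_data=json.dumps({"message": event.get("message", "")}))
With a group, the view would call async_to_sync(channel_layer.group_send)('test_channel', {'type': 'chat.message', 'message': ...}) instead of channel_layer.send.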
If you want to wait for the response in your POST:
You're not going to be able to do this using channel_layer.send, since it is asynchronous to the extent that you don't have any concept of a response. In addition, there might not even be an instance of your consumer running, since Channels only creates instances when it has open websocket connections that route to them.
So I think you can do one of the following:
Creating an instance of your consumer and sending a message to it from synchronous Python code is going to be very complex. I suggest you do not take this approach; it is complex, dirty, and likely to break in all sorts of unexpected ways.
Instead, I suggest moving the code you want to share between your HTTP view and your websocket view into a single place (not part of the consumer) where they can both call those functions, as in the sketch below.
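A minimal sketch of that idea, with hypothetical names (a services.py module and a perform_command helper that both the consumer and the HTTP view can import):
# services.py (hypothetical shared module)
def perform_command(command, *args):
    """Run a chat command and return the response text."""
    if command == 'sum':
        x, y = args
        return '{}+{}={}'.format(x, y, int(x) + int(y))
    return 'Unknown command `{}`.'.format(command)

# views.py (sketch): the HTTP endpoint calls the shared function directly
# (in a real project: from .services import perform_command)
from django.http import JsonResponse

def api(request):
    result = perform_command('sum', request.POST.get('x'), request.POST.get('y'))
    return JsonResponse({'msg': result})
The websocket consumer can call the same perform_command function instead of (or in addition to) dispatching a Celery task.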
I have an isolated Python script that captures data from Twitter's streaming API and, on receipt of each message, publishes it to the "tweets" channel using Redis pub/sub. Here is that script:
def main():
    username = "username"
    password = "password"
    track_list = ["apple", "microsoft", "google"]

    with tweetstream.FilterStream(username, password, track=track_list) as stream:
        for tweet in stream:
            text = tweet["text"]
            user = tweet["user"]["screen_name"]
            message = {"text": text, "user": user}
            db.publish("tweets", message)

if __name__ == '__main__':
    try:
        print "Started..."
        main()
    except KeyboardInterrupt:
        print '\nGoodbye!'
My server-side socket.io implementation is done using django-socketio (based on gevent-socketio, https://github.com/stephenmcd/django-socketio), which simply provides a few helper decorators as well as a broadcast_channel method. Because it's done in Django, I've put this code in views.py so that it gets imported. My views.py code:
def index(request):
    return render_to_response("twitter_app/index.html", {
    }, context_instance=RequestContext(request))

def _listen(socket):
    db = redis.Redis(host="localhost", port=6379, db=0)
    client = db.pubsub()
    client.subscribe("tweets")
    tweets = client.listen()
    while True:
        tweet = tweets.next()
        tweet_data = ast.literal_eval(tweet["data"])
        message = {"text": tweet_data["text"], "user": tweet_data["user"], "type": "tweet"}
        socket.broadcast_channel(message)

@on_subscribe(channel="livestream")
def subscribe(request, socket, context, channel):
    g = Greenlet.spawn(_listen, socket)
The client-side socket.io JavaScript simply connects, subscribes to the channel "livestream", and logs any messages received on that channel:
var socket = new io.Socket();
socket.connect();
socket.on('connect', function() {
    socket.subscribe("livestream");
});
socket.on('message', function(data) {
    console.log(data);
});
The obvious problem with this code is that each time a new user or browser window opens the page, a new _listen greenlet is spawned, the tweets get subscribed to and broadcast once per client, and duplicate messages end up being received on the client. My question is: where is the proper place to put the _listen method so that it's only created once, regardless of the number of clients? Also keep in mind that broadcast_channel is a method of a socket instance.
The problem was that I was using socket.broadcast_channel when I should have been using socket.send.
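A minimal sketch of the corrected loop under that fix, assuming the same django-socketio setup as above: each client's greenlet forwards messages only to its own socket instead of re-broadcasting to the whole channel.
def _listen(socket):
    db = redis.Redis(host="localhost", port=6379, db=0)
    client = db.pubsub()
    client.subscribe("tweets")
    for tweet in client.listen():
        if tweet["type"] != "message":
            continue
        tweet_data = ast.literal_eval(tweet["data"])
        message = {"text": tweet_data["text"], "user": tweet_data["user"], "type": "tweet"}
        socket.send(message)  # send to this client only, instead of broadcast_channel()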