I'm building a Django web app where I need to stream stock market trades to a web page in real time. While researching approaches, I came across Pusher and RabbitMQ.
With RabbitMQ I would just send the messages to RabbitMQ and consume them from Django in order to get them onto the web page. While looking for other solutions, I also found Pusher. What isn't clear to me is the technical difference between the two. I don't understand where I would use RabbitMQ and where I would use Pusher. Can someone explain how they differ? Thanks in advance!
You may be thinking of data delivery, non-blocking operations or push notifications. Or you want to use publish/subscribe, asynchronous processing, or work queues. All these are patterns, and they form part of messaging.
RabbitMQ is a messaging broker - an intermediary for messaging. It gives your applications a common platform to send and receive messages, and your messages a safe place to live until received.
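For a concrete feel, publishing trades to RabbitMQ and consuming them from Python could look roughly like this with the pika client (the queue name and payload are made up; the consumer would still need to forward each trade to the browser, e.g. over WebSockets):

    import json
    import pika

    # Publisher side: push one trade onto a queue (names here are made up).
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="trades")
    channel.basic_publish(exchange="",
                          routing_key="trades",
                          body=json.dumps({"symbol": "AAPL", "price": 187.5}))
    connection.close()

    # Consumer side: a long-running process that handles each trade as it arrives.
    def on_trade(ch, method, properties, body):
        print("got trade:", json.loads(body))

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="trades")
    channel.basic_consume(queue="trades", on_message_callback=on_trade, auto_ack=True)
    channel.start_consuming()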
Pusher is a hosted service that makes it super-easy to add real-time data and functionality to web and mobile applications.
Pusher sits as a real-time layer between your servers and your clients. Pusher maintains persistent connections to the clients - over WebSocket if possible, falling back to HTTP-based connectivity - so that as soon as your servers have new data they want to push to the clients, they can do so instantly via Pusher.
Pusher offers libraries to integrate into all the main runtimes and frameworks: PHP, Ruby, Python, Java, .NET, Go and Node on the server, and JavaScript, Objective-C (iOS) and Java (Android) on the client.
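With the Python library, for example, pushing a trade out to subscribed browsers is roughly a one-liner (the credentials, channel and event names below are placeholders):

    import pusher

    # Placeholders: substitute your own app_id/key/secret/cluster.
    pusher_client = pusher.Pusher(
        app_id="YOUR_APP_ID",
        key="YOUR_KEY",
        secret="YOUR_SECRET",
        cluster="eu",
    )

    # Trigger an event; every browser subscribed to the "trades" channel
    # receives it over the persistent connection Pusher maintains for you.
    pusher_client.trigger("trades", "new-trade", {"symbol": "AAPL", "price": 187.5})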
I need to implement a chat application for my web service (written with Django + the REST API framework). After some Google searching, I found that the available Django chat applications are all deprecated and no longer supported, and all the DIY (do-it-yourself) solutions I found use the Tornado or Twisted framework.
So, my question is: is it OK to build a Django-only, synchronous chat application, or do I need to use an asynchronous framework? I have very little experience in backend programming, so I want to keep everything as simple as possible.
Django, like many other web frameworks, is constructed around the concept of receiving an HTTP request from a web client, processing the request and sending a response. Breaking down that flow (simplified for the sake of clarity):
The remote client opens a TCP connection with your Django server.
The client sends an HTTP request to the server, with a path, some headers and possibly a body.
The server sends an HTTP response.
The connection is closed.
The server goes back to waiting for a new connection.
A chat server, if it needs to be somewhat real-time, needs to be different: it needs to maintain many simultaneous open connections with connected clients, so that when new messages are published on a channel, the appropriate clients are notified accordingly.
A modern way of implementing that is using WebSockets. The communication flow between the client and server starts with an HTTP request, like the one described above, but the client sends a special Upgrade HTTP request to the server, asking for the session to switch over from the simple request/response paradigm to a persistent, "full-duplex" communication model, where both the client and server can send messages at any time, in either direction.
The fact that the connections with multiple simultaneous clients need to stay open means you can't have a simple execution model where the server handles one request at a time, which is usually what happens in what you call synchronous servers. Tornado and Twisted have a different model for doing networking: asynchronous, event-driven (non-blocking) I/O, so that many connections can be left open and served simultaneously by a single process, which is what makes a chat service possible.
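As a rough illustration of that asynchronous model (a sketch, not a production design), a Tornado WebSocket chat handler that fans every message out to all open connections could look like this:

    import tornado.ioloop
    import tornado.web
    import tornado.websocket

    class ChatHandler(tornado.websocket.WebSocketHandler):
        clients = set()

        def open(self):
            ChatHandler.clients.add(self)

        def on_message(self, message):
            # A single event loop serves every open connection; no thread per client.
            for client in ChatHandler.clients:
                client.write_message(message)

        def on_close(self):
            ChatHandler.clients.discard(self)

    if __name__ == "__main__":
        app = tornado.web.Application([(r"/chat", ChatHandler)])
        app.listen(8888)
        tornado.ioloop.IOLoop.current().start()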
Synchronous approach nevertheless
Having said that, there are ways to implement a very simple, non-scalable chat service with noticeable latency:
Clients perform POST requests to your server to send messages to channels.
Clients perform periodical GET requests to the server to ask for any new messages to the channels they're subscribed to. The rate at which they send these requests is basically the refresh rate of the chat app.
With this approach, your server will work significantly harder than if it had an asynchronous execution model maintaining persistent connections, but it will work.
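A minimal sketch of that polling approach in plain Django, assuming a hypothetical Message model with channel, text and created fields and matching URL patterns, might look like this:

    import json

    from django.http import JsonResponse
    from django.views.decorators.http import require_GET, require_POST

    from .models import Message  # hypothetical model: channel, text, created

    @require_POST
    def post_message(request, channel):
        body = json.loads(request.body)
        msg = Message.objects.create(channel=channel, text=body["text"])
        return JsonResponse({"id": msg.id})

    @require_GET
    def poll_messages(request, channel):
        # The client sends the id of the last message it has seen and repeats
        # this request every few seconds -- that interval is the refresh rate.
        since = int(request.GET.get("since", 0))
        messages = Message.objects.filter(channel=channel, id__gt=since).order_by("id")
        return JsonResponse({"messages": [{"id": m.id, "text": m.text} for m in messages]})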
If you're going to make a chat app, you'll want to use WebSockets. They'll make getting updates to all clients participating in a conversation significantly easier, and they'll give you real-time conversations within your app. Having said that, I've never seen WebSockets used within a synchronous framework.
Is it OK to make a Django-only, synchronous chat application? There are too many unanswered questions to give a reasonable answer. How many people will use this chat app? How many people per conversation? How long will this app be around? If you're looking to make something simple for yourself and a couple of friends, build it with what you know. If you're getting paid to make this app, use WebSockets and an asynchronous framework.
You certainly can develop a synchronous chat app; you don't necessarily need to use an asynchronous framework. But it all comes down to what you want your app to do: how many people will use it, and will there be multiple users and multiple chats going on at the same time?
I'm writing a Python real-time chat feature embedded in a web app, and I'm a little confused about the real-time implementation. I need to push real-time messages to different users.
I plan to use WebSockets, but I'm not quite sure how to keep track of those sockets (e.g. in an array) so that once a user sends a message to the server, the server can find the related sockets and push the message out.
So, any ideas about this? Or what's the common way to implement a real-time chat feature?
Thanks in advance.
You need a WebSocket-aware web server, like Tornado, to handle the WebSocket traffic. To multiplex chat messages between different chats and users, there are solutions like Redis and ZeroMQ.
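A rough sketch of the Redis option with the redis-py client (channel names are made up): each server process publishes the messages it receives and subscribes to the rooms it serves, forwarding anything that arrives to its local websockets.

    import json
    import redis

    r = redis.Redis()

    # When a user sends a message, publish it on that room's channel.
    def send_chat_message(room, user, text):
        r.publish("chat:%s" % room, json.dumps({"user": user, "text": text}))

    # Each server process subscribes to the rooms it serves and forwards
    # whatever arrives to its locally connected websockets.
    def listen(room, forward):
        pubsub = r.pubsub()
        pubsub.subscribe("chat:%s" % room)
        for item in pubsub.listen():
            if item["type"] == "message":
                forward(json.loads(item["data"]))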
However, it sounds like you have little experience and no starting point, so beginning with a working example is a better approach. Please study these existing real-time chat implementations for Python:
https://github.com/heroku-examples/python-websockets-chat
https://github.com/nellessen/Tornado-Redis-Chat
https://github.com/tornadoweb/tornado/blob/master/demos/websocket/chatdemo.py
http://ferretfarmer.net/2013/09/05/tutorial-real-time-chat-with-django-twisted-and-websockets-part-1/
I want to be able to schedule delivery of a lightweight message from a server to a client. This is new territory to me so I'd appreciate some advice on the possible approaches available.
The client is running on a Raspberry Pi using node.js (because I'm using node libraries to control a piece of attached hardware). Eventually there will be multiple clients like it.
The server could be anything, though I'm most familiar with Python, django and node.
I want to be able to access the server from a browser and cause it to schedule a future message to the client, effectively a push notification with a tiny bit of data.
I'm looking at pub-sub and messaging systems to do this; I started writing a system that uses Node on both ends and sockets, but the approach I want is more occasional, fire-and-forget messages, not constant real-time data exchange. I'm also not a huge fan of node-cron-style scheduling: I'd like to be able to retrieve and alter scheduled events, and it felt somewhat heavy-handed to layer this on top of a cron system.
My current solution uses Python on the server (so I can write a Django web interface) with Celery and RabbitMQ, using a named queue per client. The client subscribes to that specific queue using node-amqp, and off we go. This also allows me to create queues that multiple clients can be interested in, which is a neat bonus.
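For concreteness, the scheduling part of my current setup looks roughly like this (the queue names and payloads are just illustrative):

    # tasks.py -- queue names and payloads below are illustrative
    from datetime import datetime, timedelta

    import pika
    from celery import Celery

    app = Celery("scheduler", broker="amqp://guest:guest@localhost//")

    @app.task
    def deliver(client_id, payload):
        # Runs when the scheduled time arrives: push the payload onto the
        # client's named queue, which the node-amqp consumer is subscribed to.
        connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        channel = connection.channel()
        queue = "client.%s" % client_id
        channel.queue_declare(queue=queue, durable=True)
        channel.basic_publish(exchange="", routing_key=queue, body=payload)
        connection.close()

    # From the Django view: schedule a delivery five minutes from now, and keep
    # the task id around so the scheduled event can later be revoked/replaced.
    #   result = deliver.apply_async(args=["pi-01", '{"led": "on"}'],
    #                                eta=datetime.utcnow() + timedelta(minutes=5))
    #   app.control.revoke(result.id)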
This answer makes me think I'm doing the right thing -- but as I'm new to this stuff, it feels like I might be missing something. Are there alternatives I should consider in the world of server-client messaging?
Since you are already using Python, you could take a look at Python Remote Objects (Pyro).
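For reference, a remote call with Pyro4 looks roughly like the sketch below; the object, host and payload names are made up, and note that both ends of a Pyro call need to run Python:

    import Pyro4

    @Pyro4.expose
    class Notifier(object):
        def notify(self, payload):
            print("received:", payload)

    # On the receiving machine: expose the object and wait for calls.
    daemon = Pyro4.Daemon(host="0.0.0.0", port=9090)
    daemon.register(Notifier(), objectId="notifier")
    daemon.requestLoop()

    # On the sending machine: call it like a local object.
    #   proxy = Pyro4.Proxy("PYRO:notifier@raspberrypi.local:9090")
    #   proxy.notify({"led": "on"})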
I am planning to integrate real-time notifications into a web application that I am currently working on. I have decided to go with XMPP for this and selected the Openfire server, which I thought suitable for my needs.
The front end uses the Strophe library to fetch the notifications from my Openfire server over BOSH. However, the notifications and other messages are to be posted by my application, so I think that code needs to reside in the backend.
Initially I thought of going with PHP XMPP libraries like XMPPHP and JAXL, but I think this would cause a lot of overhead, since each script would have to repeat the same steps (connection, authentication, etc.), which would make the PHP end slow and unresponsive.
Now I am thinking of creating a middleware application acting as a web service that the PHP code will call, with this application handling the XMPP side. The benefit is that this app (a server, if you will) only has to connect once, and then it sits there listening on a port. I am also planning to build it asynchronously, so that it first takes all the requests from my PHP app and then, when there are no more requests, goes about publishing the notifications. I am planning to create this service in Python using SleekXMPP.
This is just what I have planned. I am new to XMPP and this whole web service business, and I would like your comments regarding issues like memory and CPU usage, advantages, disadvantages, scalability, security, etc.
Thanks in advance.
PS: if something like this already exists (although I didn't find anything after a lot of Googling), please direct me to it.
EDIT ---
The middleware service should do the following (but not be limited to it):
1. Publishing notifications for different levels of groups and community pages.
2. Notifications for a single user on some event.
3. User registration (though this can be done with the user service plugin).
EDIT ---
It should also be able to create pub-sub nodes and subscribe and unsubscribe users from those nodes.
I also want to store the notifications and messages in a database (Openfire doesn't). Would that be a good choice?
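For reference, the publishing side of the middleware I have in mind would look roughly like this with SleekXMPP (JIDs, node names and credentials are placeholders):

    # JIDs, node names and credentials below are placeholders.
    import sleekxmpp

    class NotificationService(sleekxmpp.ClientXMPP):

        def __init__(self, jid, password):
            sleekxmpp.ClientXMPP.__init__(self, jid, password)
            self.register_plugin("xep_0060")  # publish-subscribe
            self.add_event_handler("session_start", self.start)

        def start(self, event):
            self.send_presence()
            self.get_roster()

        def publish_notification(self, node, payload):
            # payload is an xml.etree.ElementTree.Element built by the caller
            self["xep_0060"].publish("pubsub.myserver.example", node, payload=payload)

    if __name__ == "__main__":
        xmpp = NotificationService("notifier@myserver.example", "secret")
        xmpp.connect()
        xmpp.process(block=True)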
It seems to me like XMPP is a bit of a heavy-weight solution for what you're doing, given that communication is just one-way and you're just sending notifications (no real-time multi-user chat etc.).
I would consider using something like Socket.io (http://socket.io) for the server<=>client channel along with Redis (http://redis.io) for persistence.
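Socket.io itself is a Node library, but if you would rather stay in Python, a rough sketch of the same idea with the python-socketio package plus Redis for persistence could look like this (event names and keys are made up):

    import json

    import redis
    import socketio

    r = redis.Redis()
    sio = socketio.Server(cors_allowed_origins="*")
    app = socketio.WSGIApp(sio)   # serve with any WSGI server

    @sio.event
    def connect(sid, environ):
        # In a real app you would authenticate here and call
        # sio.enter_room(sid, "user:<id>") so notifications can be targeted.
        print("client connected:", sid)

    def notify(user_id, payload):
        # Persist the notification first (Openfire won't store it for you),
        # then push it to whoever is currently connected.
        r.lpush("notifications:%s" % user_id, json.dumps(payload))
        sio.emit("notification", payload, room="user:%s" % user_id)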
I want to implement a lightweight message queue proxy. Its job is to receive messages from a web application (PHP) and send them to the message queue server asynchronously. The reason for this proxy is that the MQ isn't always available and is sometimes lagging, or even down, but I want to make sure the messages are delivered and the web application returns immediately.
So, PHP would send the message to the MQ proxy running on the same host. That proxy would save the messages to SQLite for persistence, in case of crashes. At the same time it would send the messages from SQLite to the MQ in batches when the connection is available, and delete them from SQLite.
Now, the way I understand it, there are these components in this service:
message listener (listens for messages from PHP and writes them to an Incoming Queue)
DB flusher (reads messages from the Incoming Queue and saves them to the database; needed because of SQLite's single-threadedness)
MQ connection handler (keeps the connection to the MQ server alive by reconnecting)
message sender (collects messages from the SQLite DB, sends them to the MQ server, then removes them from the DB)
I was thinking of using Twisted for #1 (TCPServer), but I'm having problems integrating it with the other components, which aren't event-driven. Intuition tells me that each of these components should run in a separate thread, because they are all IO-bound and independent of each other, though I could just as easily put them in a single thread. Even so, I couldn't find any good, clear (to me) examples of how to implement such a worker thread alongside Twisted's main loop.
The example I started with is chatserver.py, which uses service.Application and internet.TCPServer objects. If I start my own thread prior to creating the TCPServer service, it runs a few times, but then it stops and never runs again. I'm not sure why this is happening, but it's probably because I'm not using threads with Twisted correctly.
Any suggestions on how to implement a separate worker thread and keep Twisted? Do you have any alternative architectures in mind?
You're basically considering writing an ad-hoc extension to your messaging server, the job of which is to provide whatever reliability guarantees you've asked of it.
Instead, perhaps you should take the hardware where you were planning to run this new proxy and run another MQ node on it. The new node should take care of persisting and relaying messages that you deliver to it while the other nodes are overloaded or offline.
Maybe it's not the best bang for your buck to use a separate thread in Twisted to get around a blocking call, but sometimes the least evil solution is the best. Here's a link that shows you how to integrate threading into Twisted:
http://twistedmatrix.com/documents/10.1.0/core/howto/threading.html
Sometimes, in a pinch, easy-to-implement beats hours or days of research that may all turn out to be for naught.
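As a minimal sketch of that approach (loosely in the spirit of the howto above, with the actual SQLite/MQ work left out), you can push the blocking flush into the reactor's thread pool on a timer:

    from twisted.internet import reactor, task
    from twisted.internet.threads import deferToThread

    def flush_to_mq():
        # Blocking work: read pending rows from SQLite, publish them to the MQ
        # in a batch, delete them on success. (Details omitted.)
        pass

    def schedule_flush():
        d = deferToThread(flush_to_mq)              # runs in the reactor's thread pool
        d.addErrback(lambda failure: failure.printTraceback())

    loop = task.LoopingCall(schedule_flush)
    reactor.callWhenRunning(loop.start, 5.0)        # kick off a flush every 5 seconds

    # ... set up the TCPServer / protocol factory here, as in chatserver.py ...
    reactor.run()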
A neat solution to this problem would be to use the key-value store Redis. It's a high-speed persistent data store with plenty of clients, including PHP and Python clients (handy if you want a timed/batch process to handle the messages). It saves you creating a database and also covers your persistence story, and it runs fine on Cygwin/Windows as well as POSIX environments.
Both have a very clean and simple API. Redis also offers a publish/subscribe mechanism, should you need it, although it sounds like that would be of limited value here, given that your MQ is only intermittently available.
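A rough sketch of that idea with the Python client (key names are made up): PHP LPUSHes each message onto a Redis list, and a small Python process flushes the list to the MQ in batches whenever the connection is up, putting messages back if the publish fails.

    import redis

    r = redis.Redis()

    def flush_batch(send_to_mq, batch_size=100):
        # Pop up to batch_size of the oldest messages (PHP LPUSHes new ones).
        batch = []
        for _ in range(batch_size):
            item = r.rpop("outgoing")
            if item is None:
                break
            batch.append(item)
        if not batch:
            return
        try:
            send_to_mq(batch)                 # your MQ publish; raises if the MQ is down
        except Exception:
            # MQ unavailable: put the messages back so nothing is lost.
            for item in reversed(batch):
                r.rpush("outgoing", item)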