Scheduled message passing from server to clients: what system to use?

I want to be able to schedule delivery of a lightweight message from a server to a client. This is new territory to me so I'd appreciate some advice on the possible approaches available.
The client is running on a Raspberry Pi using node.js (because I'm using node libraries to control a piece of attached hardware). Eventually there will be multiple clients like it.
The server could be anything, though I'm most familiar with Python, Django and node.
I want to be able to access the server from a browser and cause it to schedule a future message to the client, effectively a push notification with a tiny bit of data.
I'm looking at pub-sub and messaging systems to do this; I started writing a system that uses node on both ends with sockets, but the approach I want is more fire-and-forget occasional messages, not constant realtime data exchange. I'm also not a huge fan of node-cron style scheduling: I'd like to be able to retrieve and alter scheduled events, and it felt heavy-handed to layer that on top of a cron system.
My current solution uses Python on the server (so I can write a Django web interface) with Celery and RabbitMQ, using a named queue per client. The client subscribes to that specific queue using node-amqp, and off we go. This also allows me to create queues that multiple clients can be interested in, which is a neat bonus.
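Simplified, the scheduling side looks something like this (the broker URL, task body and queue name are placeholders rather than my real setup):
from datetime import datetime, timedelta
from celery import Celery
app = Celery('notifications', broker='amqp://guest@localhost//')
@app.task
def notify(payload):
    pass  # the Pi consumes the raw message off the queue via node-amqp
# queue a message for one client, an hour from now
notify.apply_async(args=[{'led': 'on'}],
                   queue='client-device1',
                   eta=datetime.utcnow() + timedelta(hours=1))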
This answer makes me think I'm doing the right thing -- but as I'm new to this stuff, it feels like I might be missing something. Are there alternatives I should consider in the world of server-client messaging?

Since you are already using Python, you could take a look at Python Remote Objects (Pyro).
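A minimal sketch of the idea with Pyro4 (the host, names and URI hand-off are placeholders):
# server
import Pyro4
@Pyro4.expose
class Notifier(object):
    def notify(self, message):
        print("got: %s" % message)
daemon = Pyro4.Daemon(host="192.168.1.10")
uri = daemon.register(Notifier)
print(uri)  # give this URI to the clients out-of-band
daemon.requestLoop()
# client
import Pyro4
notifier = Pyro4.Proxy(uri_from_server)  # the URI printed above
notifier.notify("hello")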

How to communicate between a NodeJS graphql-apollo server and a Python project?

Since this stack is new to me, I figured it wouldn't hurt to ask the community for input.
I'm exploring a new type of project and have been brainstorming how/if it's possible to communicate between an apollo-graphql server (NodeJS) and a client Python program. I have a web development background and I intend to keep my app and server in Typescript/Node.
However ...
As a project and exercise, I would like to play around with connecting a watering system via the GPIO package in Python on a Raspberry Pi and then send the updates to my apollo-graphql server. Conceptually, I think a traditional RESTful API would make this straightforward; with GraphQL, however, things are a bit more nebulous to me. It would be nice to build out hardware-connected services in various languages as appropriate and not be boxed into having everything in Typescript/Node, and to that end Python is the most widely used language within the Raspberry Pi community.
Has anyone done this before, or something similar, and have insights and experience to share?
edit: since posting I have become more familiar with GraphQL and played around with it in Postman. The server simply exposes an endpoint that I can POST queries to (and, I assume, mutations and everything else), and the requests are handled by my NodeJS apollo-graphql server. I ended up using this python package to query against my node API and it seems to work well.
Your server is node and your client is python. The server doesn't care at all whether your client is python or java or C#, it only wants to see GraphQL requests.
I suggest using a Python-based GraphQL client package such as python-graphql-client.
You're right that in the end everything looks like a POST. Using an actual client might make life a little bit easier for you though.
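At the wire level it's just an HTTP POST of JSON, so even plain requests works; a minimal sketch (the endpoint URL and the mutation are invented for illustration):
import requests
query = '''
mutation($moisture: Float!) {
  reportMoisture(moisture: $moisture) { ok }
}
'''
resp = requests.post('http://localhost:4000/graphql',
                     json={'query': query, 'variables': {'moisture': 0.42}})
print(resp.json())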
You can try calling your GraphQL service using a RESTful wrapper: https://graphql.org/blog/rest-api-graphql-wrapper/. There's a similar question asked before: link.
Secondly, instead of calling the server directly, you could also try using a message broker in the middle. In short: the Pi sends messages to a queue that is polled by your server. There are several solutions available, such as RabbitMQ.
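With RabbitMQ, the Pi-side publisher is only a few lines with the pika client, for example (the host and queue name are placeholders):
import json
import pika
connection = pika.BlockingConnection(pika.ConnectionParameters('broker.local'))
channel = connection.channel()
channel.queue_declare(queue='sensor-readings', durable=True)
channel.basic_publish(exchange='',
                      routing_key='sensor-readings',
                      body=json.dumps({'moisture': 0.42}))
connection.close()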

XMPP: Openfire, PHP and Python web service

I am planning to integrate real-time notifications into a web application that I am currently working on. I have decided to go with XMPP for this and selected the Openfire server, which I thought would suit my needs.
The front end uses the Strophe library to fetch the notifications from my Openfire server over BOSH. However, the notifications and other messages are to be posted by my application, and hence I think this code needs to reside at the backend.
Initially I thought of going with PHP XMPP libraries like XMPPHP and JAXL, but I think this would cause a lot of overhead, as each script would have to repeat the same steps (connection, authentication, etc.), and that would make the PHP end slow and unresponsive.
Now I am thinking of creating a middleware application acting as a web service that PHP will call, with this application handling the XMPP side. The benefit is that this app (a server, if you will) has to connect just once and then sits there listening on a port. I am also planning to build it in an asynchronous way, so that it first takes all the requests from my PHP app and then, when there are no more requests, goes about doing the notification publishing. I am planning to create this service in Python using SleekXMPP.
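To make that concrete, the core of the service would look something like this with SleekXMPP (the JID, password and recipient are placeholders):
import sleekxmpp
class NotifyService(sleekxmpp.ClientXMPP):
    def __init__(self, jid, password):
        sleekxmpp.ClientXMPP.__init__(self, jid, password)
        self.add_event_handler('session_start', self.start)
    def start(self, event):
        self.send_presence()
        self.get_roster()
    def notify(self, recipient, text):
        # called whenever PHP hands us a notification to deliver
        self.send_message(mto=recipient, mbody=text, mtype='chat')
service = NotifyService('notifier@myserver', 'secret')
if service.connect():
    service.process(block=False)  # one long-lived connection, running in the background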
This is just what I have planned. I am new to XMPP and this whole web service business, and would like your comments regarding issues like memory and CPU usage, advantages, disadvantages, scalability, security, etc.
Thanks in advance.
PS: if something like this already exists (although I didn't find anything after a lot of Googling), please direct me there.
EDIT ---
The middleware service should do the following (but is not limited to it):
1. Publishing notifications for different levels of groups and community pages.
2. Notifications for a single user on some event.
3. User registration (can be done using the user service plugin, though).
EDIT ---
Also, it should be able to create pub-sub nodes and subscribe and unsubscribe users from these nodes.
I also want to store the notifications and messages in a database (Openfire doesn't). Would that be a good choice?
It seems to me that XMPP is a bit of a heavyweight solution for what you're doing, given that communication is only one-way and you're just sending notifications (no real-time multi-user chat, etc.).
I would consider using something like Socket.io (http://socket.io) for the server<=>client channel along with Redis (http://redis.io) for persistence.
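Roughly, the server side could then be as small as this (key and channel names are just examples):
import json
import redis
r = redis.Redis()
def send_notification(user_id, payload):
    message = json.dumps(payload)
    r.lpush('notifications:%s' % user_id, message)  # persist it for later retrieval
    r.publish('user:%s' % user_id, message)         # push it to connected listeners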

Best way for client to fire off separate process without blocking client/server communication

The end result I am trying to achieve is to allow a server to assign specific tasks to a client when it makes its connection. A simplified version would be like this:
Client connects to Server
Server tells Client to run some network task
Client receives task and fires up another process to complete task
Client tells Server it has started
Server tells Client it has another task to do (and so on...)
A couple of notes
There would be a cap on how many tasks a client can do
The client would need to be able to monitor the task/process (running? died?)
It would be nice if the client could receive data back from the process to send to the server if needed
At first, I was going to try threading, but I have heard Python doesn't do threading properly (is that right or wrong?).
Then I thought of firing off a system call from Python and recording the PID, then sending certain signals to it for status and stop (SIGUSR1, SIGUSR2, SIGINT). But I'm not sure that will work, because I don't know if I can capture data from another process, and if you can, I don't have a clue how it would be accomplished (stdout or a socket file?).
What would you guys suggest as far as the best way to handle this?
Use spawnProcess to spawn a subprocess. If you're using Twisted already, then this should integrate pretty seamlessly into your existing protocol logic.
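A minimal sketch of the shape of it (the child command is a placeholder):
import sys
from twisted.internet import protocol, reactor
class TaskProtocol(protocol.ProcessProtocol):
    def outReceived(self, data):
        print("task output: %r" % data)         # data arriving on the child's stdout
    def processEnded(self, reason):
        print("task ended: %s" % reason.value)  # fires when the child exits or dies
reactor.spawnProcess(TaskProtocol(), sys.executable, [sys.executable, 'task.py'])
reactor.run()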
Use Celery, a Python distributed task queue. It probably does everything you want or can be made to do everything you want, and it will also handle a ton of edge cases you might not have considered yet (what happens to existing jobs if the server crashes, etc.)
You can communicate with Celery from your other software using a messaging queue like RabbitMQ; see the Celery tutorials for details on this.
It will probably be most convenient to use a database such as MySQL or PostgreSQL to store information about tasks and their results, but you may be able to engineer a solution that doesn't use a database if you prefer.
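For instance, a task plus a status check looks roughly like this (the broker/backend URLs and the task body are placeholders):
from celery import Celery
app = Celery('tasks', broker='amqp://guest@localhost//', backend='rpc://')
@app.task
def network_task(target):
    return 'done with %s' % target  # the real network work would go here
result = network_task.delay('192.168.1.50')  # queues the task and returns immediately
print(result.state)                          # e.g. PENDING, SUCCESS, FAILURE
print(result.get(timeout=60))                # block until the result arrives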

I'm looking for a network service that'll let me send messages to selected clients

I have a program which will be running on multiple devices on a network. These programs will need to send data between each other - to specified devices (not all devices).
server = server.Server('192.168.1.10')
server.identify('device1')
server.send('device2', 'this will be pickled and sent to device2')
That's some basic example code for what I need to do. Of course, it will also need to receive.
I was looking at building my own simple message passing server using Twisted when someone pointed me in the direction of MPI. I've never looked into the MPI protocol before and that website gives rather vague examples.
Is MPI a good approach? Are there better alternatives?
MPI is really good at handling the communications of a tightly-coupled program running across several or many machines in a cluster. If you're running very loosely coupled programs that only interact occasionally, or the machines are more distributed than within a cluster (say, scattered around a LAN), then MPI is probably not what you're looking for.
There are several Open Source message brokers that already handle this kind of stuff for you, and come with a full API ready to use.
You should take a look at:
ActiveMQ, which has a Python STOMP client.
RabbitMQ has a Python client too - see Building RabbitMQ apps using Python.
You could build it yourself, but that would be reinventing the wheel (on a side note: I was halfway through building my own message broker before I realised it and started looking at existing solutions - building one takes a lot of work).
Consider using something like ZeroMQ. It supports the most useful messaging idioms - push/pull, publish/subscribe and so on, and although it's not 100% clear from your question which one you need, I'm pretty sure you will find the answer there.
They have a great user guide here, and the Python bindings are well-developed and supported. Some code samples are here.
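Your example maps almost directly onto ZeroMQ pub/sub with topic filtering; a sketch (addresses and device names are placeholders):
import pickle
import zmq
ctx = zmq.Context()
# sending side
pub = ctx.socket(zmq.PUB)
pub.bind('tcp://*:5556')
pub.send_multipart([b'device2', pickle.dumps('this will be pickled and sent to device2')])
# receiving side, on device2
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://192.168.1.10:5556')
sub.setsockopt(zmq.SUBSCRIBE, b'device2')  # only receive messages addressed to us
topic, payload = sub.recv_multipart()
data = pickle.loads(payload)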
You can use MPI's dynamic process facilities to create communication between different codes. In this case the server program should publish "MPI ports" with different IDs; clients look up these ports and try to connect to them, and only the server can accept each connection. Once the communication is established, the codes can exchange data.
Another possibility is to run the different programs under MPI's Multiple Instruction (MPMD) mode. In this case all the programs are executed at the same time and there is no need to create port communicators; after they start, you can create particular communicators between the groups of programs you select.
Tell me what kind of method you need and I can provide C code implementing the functions.
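In Python terms (via mpi4py), the port mechanism above looks roughly like this; the port hand-off and error handling are simplified:
from mpi4py import MPI
# server side: open a port and wait for a client
port = MPI.Open_port()
print('port:', port)                # hand this string to clients out-of-band
comm = MPI.COMM_WORLD.Accept(port)  # blocks until a client connects
print(comm.recv(source=0, tag=0))
# client side, with the port string obtained from the server:
comm = MPI.COMM_WORLD.Connect(port)
comm.send('hello from device1', dest=0, tag=0)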

Message queue proxy in Python + Twisted

I want to implement a lightweight Message Queue proxy. Its job is to receive messages from a web application (PHP) and send them to the Message Queue server asynchronously. The reason for this proxy is that the MQ isn't always available, and is sometimes lagging or even down, but I want to make sure the messages are delivered and the web application returns immediately.
So, PHP would send the message to the MQ proxy running on the same host. That proxy would save the messages to SQLite for persistence, in case of crashes. At the same time it would send the messages from SQLite to the MQ in batches when the connection is available, and delete them from SQLite.
Now, the way I understand, there are these components in this service:
message listener (listens for messages from PHP and writes them to an Incoming Queue)
DB flusher (reads messages from the Incoming Queue and saves them to the database; a separate component because of SQLite's single-threadedness)
MQ connection handler (keeps the connection to the MQ server online by reconnecting)
message sender (collects messages from the SQLite db and sends them to the MQ server, then removes them from the db)
I was thinking of using Twisted for #1 (TCPServer), but I'm having trouble integrating it with the other parts, which aren't event-driven. Intuition tells me that each of these parts should run in a separate thread, because they are all IO-bound and independent of each other, but I could easily put them in a single thread. Even so, I couldn't find any good and clear (to me) examples of how to run such a worker thread alongside Twisted's main loop.
The example I started with is chatserver.py, which uses service.Application and internet.TCPServer objects. If I start my own thread prior to creating the TCPServer service, it runs a few times, but then it stops and never runs again. I'm not sure why this is happening, but it's probably because I'm not using threads with Twisted correctly.
Any suggestions on how to implement a separate worker thread and keep Twisted? Do you have any alternative architectures in mind?
You're basically considering writing an ad-hoc extension to your messaging server, whose job is to provide whatever reliability guarantees you've asked of it.
Instead, perhaps you should take the hardware where you were planning to run this new proxy and run another MQ node on it. The new node should take care of persisting and relaying messages that you deliver to it while the other nodes are overloaded or offline.
Maybe it's not the best bang for your buck to use a separate thread in Twisted to get around a blocking call, but sometimes the least evil solution is the best. Here's a link that shows you how to integrate threading into Twisted:
http://twistedmatrix.com/documents/10.1.0/core/howto/threading.html
Sometimes in a pinch easy-to-implement is faster than hours/days of research which may all turn out to be for nought.
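The core pattern from that page, applied here, is just this (the function body is a placeholder):
from twisted.internet import reactor
from twisted.internet.threads import deferToThread
def flush_batch(messages):
    return len(messages)  # the blocking SQLite/MQ work happens here, in a pool thread
def done(count):
    print('flushed %d messages' % count)
deferToThread(flush_batch, ['msg1', 'msg2']).addCallback(done)
reactor.run()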
A neat solution to this problem would be to use the key-value store Redis. It's a high-speed persistent data store with plenty of clients, including PHP and Python ones (useful if you want a timed/batch process to handle the messages). It saves you creating a database and also deals with your persistence story. It runs fine on Cygwin/Windows as well as POSIX environments.
PHP Redis client is here.
Python client is here.
Both have a very clean and simple API. Redis also offers a publish/subscribe mechanism, should you need it, although it sounds like that would be of limited value when the queue you're publishing to is only intermittently available.
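As a sketch, the queue half of that looks like this with the Python client (key names and the delivery function are made up):
import redis
r = redis.Redis()
# PHP does LPUSH onto 'outgoing'; the Python batch worker pops like this:
while True:
    message = r.brpoplpush('outgoing', 'processing', timeout=5)
    if message is None:
        continue                          # nothing queued right now
    deliver_to_mq(message)                # hypothetical: your MQ delivery call
    r.lrem('processing', 0, message)      # drop it once safely delivered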
