Streaming responses between a Node.js server and Python

I need advice on how to set up streaming responses from a Node.js server to Python, and from Python back to Node.js.
There are four files:
a) The Node.js script
b) The serverConn.py script
c) The Python file that takes input from the serverConn.py script, called count.py
d) The Python file that takes input from count.py and returns it to serverConn.py, to be sent back to the Node.js server
I read up on zerorpc and I am currently using it to send input from Node.js to Python, but the Python script also has to send the input to another Python script, which in turn has to send it to yet another Python script and then send the result back to the Node.js server.
Everything seems complicated for someone who just got out of college lol.
I saw this thread and was also wondering if I could use something like it for the two Python scripts to communicate with each other:
How to get a python script to listen for inputs from another script
This is my Node.js script:
var zerorpc = require("zerorpc");

var server = new zerorpc.Server({
    hello: function(name, reply) {
        var catalog = "3D";
        reply(null, catalog + name);
    }
});
server.bind("tcp://0.0.0.0:4244");
And this is my Python script:
import zerorpc

class serverConn:
    def __init__(self):
        self.c = "tcp://127.0.0.1:4244"

    def client(self):
        c = zerorpc.Client()
        c.connect(self.c)
        catalog = c.hello("")
        return catalog

s = serverConn()
s.client()
The Python script gets the input from the Node.js file and sends it to the matchcount.py file; matchcount.py sends it to the calculate.py file, and calculate.py sends it back to Node.js.
Are there any tips on how I can go about this, and will the link I posted help?
Thank you.

OK, so you have Node.js -> Python #1 -> Python #2 -> Python #3. The Node.js process is a client only. Python #1 and #2 are each a server to the previous process and a client to the next Python process. Python #3 is a server only.
Thus your Node.js process should use a zerorpc client to invoke a given procedure on Python #1. Python #1 will run a zerorpc server exposing that procedure; in turn, the procedure will use a zerorpc client to call Python #2, and so on. When the last process returns from its procedure, zerorpc returns the value back to the calling process. If each of your procedures always returns the result of its remote procedure call, the result will eventually come back to the Node.js process.
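The chain described above can be sketched in one file, using the standard library's xmlrpc modules as a stand-in for zerorpc (same request/reply pattern, no extra dependency to install). The ports and the handler names here are illustrative, not from the question.

```python
# nodejs -> python#1 -> python#2 -> python#3, all simulated locally.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def make_server(port, func):
    # Each "python process" is a server exposing one procedure, "handle".
    srv = SimpleXMLRPCServer(("127.0.0.1", port), logRequests=False, allow_none=True)
    srv.register_function(func, "handle")
    threading.Thread(target=srv.serve_forever, daemon=True).start()

# python #3: server only -- does the final computation
make_server(18003, lambda text: text.upper())

# python #2: server to #1, client to #3 -- always returns the remote result
def relay2(text):
    return ServerProxy("http://127.0.0.1:18003").handle(text)
make_server(18002, relay2)

# python #1: server to the caller, client to #2
def relay1(text):
    return ServerProxy("http://127.0.0.1:18002").handle(text)
make_server(18001, relay1)

# The Node.js side is a client only; simulate it here.
result = ServerProxy("http://127.0.0.1:18001").handle("3Dcatalog")
print(result)  # 3DCATALOG
```

Because every relay simply returns the result of its own remote call, the value computed by the last server propagates all the way back to the first caller, which is exactly the zerorpc setup the answer describes.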

Related

Start a Python script with message data directly from the Messenger API in a safe way

My network configuration: I have a reverse proxy (nginx) handling HTTPS; behind it will be a Golang server (gs).
I want gs to run my Python script with the data that comes in as JSON with the POST at /webhook.
I thought about using sys.argv, but I am not sure whether, or how, to make it safe. Is an injection attack possible?
My plan was to make gs parse the JSON and run:
python3 respond.py -txt "this is message sent from messenger" -mid 0000000000 -pld "payload if a button was pressed"
Python would create the message and send it to Facebook by itself, so it would have to be called for every message. The traffic isn't big, but if there is a better solution I would like to find it.
The other thing I considered was to run python3 listening on a port and forward the raw incoming JSON to it over TCP (the JSON that the Golang server receives).
Does the script have to run often, or only for some requests?
If it runs often, consider implementing a pipe between the processes and have the Python script listen for incoming messages via the pipe.
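On the injection question: the risk exists only when the command line is handed to a shell. If the caller passes an argument list directly (in Go, `exec.Command` with a slice of arguments works the same way), the message text arrives as a single argv entry, verbatim, and shell metacharacters in it do nothing. Here is a minimal Python sketch of that principle; the inline `-c` script stands in for respond.py.

```python
# Demonstrate that an argument list (no shell) cannot be injected.
import subprocess
import sys

# Hostile-looking input full of shell metacharacters.
msg = 'this is "a message"; rm -rf / $(whoami)'

# Safe: no shell is involved; msg is one argv element, passed verbatim.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", msg],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip() == msg)  # True: the text arrived unmodified
```

The unsafe variant is building one command string and running it through a shell (`shell=True` in Python, or `sh -c` from Go); that is the form where `$(...)` and `;` would execute.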

How to run a Python script continuously that can receive commands from node

I have set up a Raspberry Pi connected to an LED strip which is controllable from my phone via a Node server I have running on the RasPi. It triggers a simple python script that sets a colour.
I'm looking to expand the functionality such that I have a Python script continuously running, and I can send colours to it; it will consume the new colour and display both the old and new colours side by side. I.e., the Python script can receive commands and manage state.
I've looked into whether to use a simple loop or a daemon for this, but I don't understand how to both run a script continuously and receive new commands.
Is it better to keep state in the Node server and keep sending a lot of simple commands to a basic python script or to write a more involved python script that can receive few simpler commands and continuously update the lights?
IIUC, you don't necessarily need to have the python script running continuously. It just needs to store state, and you can do this by writing the state to a file. The script can then just read the last state file at startup, decide what to do from thereon, perform action, then update the state file.
In case you do want to actually run the script continuously, you need a way to accept the commands. The simplest way for a daemon to accept commands is probably through signals: you can use custom signals, e.g. SIGUSR1 and SIGUSR2, to send and receive these notifications. These may be sufficient if your daemon only needs to accept very simple requests.
For more complex requests, where you need to actually accept messages, you can listen on a Unix socket or a TCP socket. The socket module in the standard library can help you with that. If you want to build a more complex command server, then you may even want to consider running a full HTTP server, though that looks like overkill for the current situation.
Is it better to keep state in the Node server and keep sending a lot of simple commands to a basic python script or to write a more involved python script that can receive few simpler commands and continuously update the lights?
There's no straightforward answer to that. It depends on case by case basis, how complex the state is, how frequently you need to change colour, how familiar you are with the languages, etc.
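A minimal sketch of the socket approach from the answer above: a continuously running script that keeps the old/new colour state and accepts commands over a local TCP port. The port number is arbitrary and the LED-driving code is stubbed out with a comment.

```python
# Long-running script: keeps colour state, accepts commands over TCP.
import socket
import threading

state = {"old": None, "new": "off"}

def handle(conn):
    with conn:
        colour = conn.recv(64).decode().strip()
        state["old"], state["new"] = state["new"], colour
        # Here you would drive the LED strip with state["old"] / state["new"].
        conn.sendall('{} {}\n'.format(state["old"], state["new"]).encode())

def serve(port=15555):
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

threading.Thread(target=serve, daemon=True).start()

# The Node server would connect and send the new colour; simulate a client:
import time; time.sleep(0.2)
c = socket.create_connection(("127.0.0.1", 15555))
c.sendall(b"red\n")
reply = c.recv(64).decode().strip()
print(reply)  # old and new colour side by side: "off red"
```

From Node, the equivalent client is a few lines with the net module (`net.connect` then `socket.write`), so each colour change is one short message rather than a new process launch.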
Another option is to have the Node app call the Python script as a child process, pass it any needed vars, and read Python's output as well, like so:
var exec = require('child_process').exec;
var child = exec('python file.py var1 var2', function (error, stdout, stderr) {
    console.log(stdout);
});

How to implement a server for a python scripts testing system (with stdio)?

Could you advise how to implement a server for script testing, such as the one on Coursera?
There is an input where students attach their script.
The next steps are quite foggy for me; as I see them:
server gets the script
server checks the extension of the script
server runs a bash script with testing data (stdin), waits for the result >> creates an output file
the output file is compared with the solution
server sends a response
Is this right? Are there any other solutions?
I think this is enough to start from: https://github.com/DMOJ/judge.
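The steps listed in the question can be sketched directly with the subprocess module: run the submission with the test input on stdin, capture stdout, and compare it with the expected solution. The file name, timeout, and stand-in student script below are illustrative.

```python
# Minimal grading loop: check extension, run with test stdin, compare output.
import pathlib
import subprocess
import sys
import tempfile

# Stand-in student submission (doubles its input).
submission = pathlib.Path(tempfile.mkdtemp()) / "solution.py"
submission.write_text("print(int(input()) * 2)\n")

def grade(script, stdin_data, expected, timeout=5):
    if script.suffix != ".py":                 # step 2: check the extension
        return "rejected"
    try:                                       # step 3: run with test input
        out = subprocess.run([sys.executable, str(script)],
                             input=stdin_data, capture_output=True,
                             text=True, timeout=timeout)
    except subprocess.TimeoutExpired:
        return "timeout"
    # step 4: compare the output with the solution
    return "pass" if out.stdout.strip() == expected else "fail"

verdict = grade(submission, "21\n", "42")
print(verdict)  # pass
```

A real judge additionally sandboxes the submission (resource limits, restricted filesystem/network), which is exactly what the DMOJ project linked in the answer provides.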

Django/Python - Serial line concurrency

I'm currently working on a gateway with embedded Linux and a web server. The goal of the gateway is to retrieve data from electrical devices over an RS485/Modbus line and display it on a server.
I'm using nginx and Django, and the web front end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to nginx. These CGI requests are answered with JSON responses thanks to Django. The responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following :
Randomly timed CGI call -> urls.py -> ModbusCGI.py (imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
Next to that, I wanted to implement a datalogger, to store data in a DB at regular intervals. I made a script that also imports ModbusComm.py, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and CGI scripts call the same function in ModbusComm.py at the same time), which results in an error.
I'm sure this problem would also occur if there were a lot of users on the server (CGI requests sent at the same time). Or not? (Is a queue system already managed for CGI requests? I'm a bit lost.)
So my goal would be to make a queue system that can handle calls from several Python scripts: make them wait while it's not their turn, call a function with the right arguments when it is their turn (actually using the Modbus line), and send the response back to the Python script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better way to do this.
If I'm not clear enough, don't hesitate to make me aware of it :)
I had the same problem when I had to allow multiple processes to read some Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process ("serial port server") that exclusively works with the serial port. All other processes work with that port through that standalone process, via some inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the "serial port server", sends its request, and receives the response. All the actual serial port communication is done by the "serial port server" in a sequential way, to ensure consistency.
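The core of that pattern can be sketched in a few lines. Here a worker thread and a queue stand in for the standalone process and the Unix socket, and a stub function stands in for the real Modbus read; the point is that only one piece of code ever touches the serial line, and everyone else goes through the queue.

```python
# "Serial port server" pattern: one owner of the line, a queue in front of it.
import queue
import threading

requests = queue.Queue()

def serial_read(register):
    # Stand-in for the real Modbus read over the serial line.
    return register * 2

def port_server():
    # Only this thread touches the (fake) serial port, strictly one
    # request at a time, so frames can never interleave on the wire.
    while True:
        register, reply_q = requests.get()
        reply_q.put(serial_read(register))

threading.Thread(target=port_server, daemon=True).start()

def read_register(register):
    # What the CGI code and the datalogger would both call.
    reply_q = queue.Queue()
    requests.put((register, reply_q))
    return reply_q.get(timeout=5)

print(read_register(7))  # 14
```

Across separate processes (Django workers plus a datalogger), the queue becomes a Unix or TCP socket that the serial port server listens on, as in the answer, but the serialization logic is the same.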

help with python forking child server for doing ajax push, long polling

Alright, I only know some basic Python, but if I can get help with this then I am considering making it open source.
What I am trying to do:
- (Done) Ajax send for init content
- Python server recv command "init" to send most recent content
- (Done) Ajax recv content and then immediately calls back to python server
- Python server recv command "wait", sets up child, and waits for command "new" from ajax
- (Done) Ajax sends "new" command
- Python server wakes up all waiting children and sends newest content
- (Done) Ajax sends "wait", and so forth
I have already written the Python server part in PHP, but it uses 100% CPU, so I knew I had to use a forking socket daemon to be able to have multiple processes sitting there waiting. Now, I could write this with PHP, but the extensions it needs have to be installed manually, which can be a problem when asking a host to install them on shared accounts and so forth. So I turned to Python, which also gives more flexibility and runs faster. Plus more people could use it.
So, if anyone could help with this, or give some direction, that would be great.
I am working on the code myself, I just do not know it well enough. I can add the if statements for the different commands and add the MySQL connection myself. If I end up having any problems, I will ask here. I love this site.
Look at subprocess.
Read all of these related questions on StackOverflow: https://stackoverflow.com/search?q=[python]+web+subprocess
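The wait/new cycle in the question (clients block on "wait" until a "new" command wakes them all with the latest content) can be sketched with a condition variable. Threads stand in here for the forked children; all names are illustrative.

```python
# Long-poll core: waiters block until publish() wakes them with new content.
import threading

cond = threading.Condition()
content = {"version": 0, "body": "init"}

def wait_for_new(seen_version, timeout=5):
    # The "wait" command: block until content is newer than what we saw.
    with cond:
        cond.wait_for(lambda: content["version"] > seen_version, timeout)
        return content["version"], content["body"]

def publish(body):
    # The "new" command: update content and wake every waiting client.
    with cond:
        content["version"] += 1
        content["body"] = body
        cond.notify_all()

results = []
t = threading.Thread(target=lambda: results.append(wait_for_new(0)))
t.start()
import time; time.sleep(0.1)   # the client is now parked in "wait"
publish("hello")               # "new" arrives; every waiter wakes
t.join()
print(results[0])  # (1, 'hello')
```

A waiter that shows up late (its `seen_version` already behind) returns immediately with the newest content, which matches the "init"/"wait" handshake in the list above.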
