I'm currently working on a gateway with embedded Linux and a web server. The goal of the gateway is to retrieve data from electrical devices over an RS485/Modbus line and to display it on a server.
I'm using Nginx and Django, and the web front-end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to Nginx. These CGI requests are answered with JSON responses thanks to Django. The responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following:
Randomly timed CGI call -> urls.py -> ModbusCGI.py (imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
Next to that, I wanted to implement a datalogger to store data in a DB at regular intervals. I made a script that also imports ModbusComm.py, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and the CGI scripts call the same function in ModbusComm.py at the same time), which results in an error.
I'm sure this problem would also occur with a lot of users on the server (CGI requests sent at the same time). Or would it? (Is a queue system already in place for CGI requests? I'm a bit lost.)
So my goal is to build a queue system that can accept calls from several Python scripts, make each one wait until it's its turn, call the requested function with the right arguments when its turn comes (actually using the Modbus line), and send the response back to the calling script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better ways to do this.
If I'm not clear enough, don't hesitate to let me know :)
I had the same problem when I had to allow multiple processes to read Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process (a “serial port server”) that exclusively owns the serial port. All other processes work with that port through the standalone process via an inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the “serial port server”, sends its request, and receives the response. All the actual serial port communication is done by the “serial port server” sequentially, to ensure consistency.
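For illustration, a minimal sketch of that pattern, assuming pymodbus 3.x and a Unix socket; the socket path, serial settings, and request fields are placeholders, not your actual setup:

```python
# serial_server.py - owns the Modbus serial line; all other processes talk
# to it over a Unix socket, so requests are naturally handled one at a time.
# Sketch only: socket path, serial settings, and request fields are placeholders.
import json
import os
import socket

from pymodbus.client import ModbusSerialClient  # assuming pymodbus 3.x

SOCK_PATH = "/tmp/serial_server.sock"

def main():
    modbus = ModbusSerialClient(port="/dev/ttyUSB0", baudrate=9600)
    modbus.connect()

    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(5)

    while True:
        conn, _ = srv.accept()      # one client at a time = serialized access
        with conn:
            req = json.loads(conn.recv(4096).decode())
            rr = modbus.read_holding_registers(req["address"],
                                               count=req["count"],
                                               slave=req["unit"])
            reply = ({"registers": rr.registers} if not rr.isError()
                     else {"error": str(rr)})
            conn.sendall(json.dumps(reply).encode())

if __name__ == "__main__":
    main()
```

Your Django views and the datalogger would then each connect to SOCK_PATH, send one JSON request, and read the reply; only this process ever touches the serial line, so simultaneous callers simply wait in the listen queue.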
I am working on a project in which I have a couple of remote IoT devices that send messages via UDP. I am looking to make a server that can receive this constant flow of UDP messages and store them in a database. Additionally, I would like to make a (REST) API that allows the information in this database to be accessed (every 15/30 minutes or so) from other applications.
Does anyone have any suggestions for how to do this (preferably in python)?
So far I am able to do the following (in python):
I know how to make a UDP client and server, and send messages between them using "socket". This link provided a useful explanation.
I know how to create a Flask server, store random data in a database using SQLAlchemy, and make the database content available via an API that can be accessed via Postman. This link showed me how.
What I am not able to do:
Tying everything together is where the problem arises. Specifically, I don't know how to combine the methods above so that everything works at the same time (in the same loop, so to speak). Both Flask and the UDP server run their own blocking loops and listen for events, so I don't see how those processes would work simultaneously.
One thing I was considering is to run the UDP server plus database insertion in one terminal, and the Flask/API server in another terminal. That would mean the database is opened and accessed by multiple programs at the same time. Is that possible? It would be like opening a single Excel sheet multiple times (which is not permitted, I would think).
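For what it's worth, here is an untested sketch of the single-process alternative I keep circling around: the UDP listener in a daemon thread, Flask in the main thread, both talking to the same SQLite file (ports, table name, and message format are made up):

```python
# Rough sketch: UDP listener in a daemon thread, Flask in the main thread,
# both sharing one SQLite database. Ports, table, and message format are
# placeholders.
import socket
import sqlite3
import threading

from flask import Flask, jsonify

DB_PATH = "readings.db"
app = Flask(__name__)

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute("CREATE TABLE IF NOT EXISTS readings (payload TEXT)")

def udp_listener():
    db = sqlite3.connect(DB_PATH)   # each thread needs its own connection
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 9999))
    while True:
        data, _addr = sock.recvfrom(4096)
        db.execute("INSERT INTO readings (payload) VALUES (?)", (data.decode(),))
        db.commit()

@app.route("/readings")
def readings():
    db = sqlite3.connect(DB_PATH)
    rows = db.execute("SELECT payload FROM readings").fetchall()
    return jsonify([r[0] for r in rows])

if __name__ == "__main__":
    init_db()
    threading.Thread(target=udp_listener, daemon=True).start()
    app.run(port=5000)
```

As far as I can tell, SQLite does allow several connections to the same database file (unlike an open Excel sheet), so the two-terminal approach might work too; each thread or process just needs its own connection.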
I also came across this library, which allows you to combine Flask with Flask-Sockets, but that doesn't seem to support UDP as far as I understand.
Many thanks!
My network configuration: I have a reverse proxy (nginx) handling HTTPS; behind it will be a Golang server (gs).
I want gs to run my Python script with the data that comes in as JSON with the POST at /webhook.
I thought about using sys.argv, but I am not sure whether it is safe, or how to make it safe. Is an injection attack possible?
My plan was to make gs parse the JSON and run:
python3 respond.py -txt "this is message sent from messenger" -mid 0000000000 -pld "payload if a button was pressed"
Python would create the message and send it to Facebook by itself, so it would have to be called for every message. The traffic isn't big, but if there is a better solution I would like to find it.
Another thing I considered was to run python3 listening on a port and forward the raw incoming JSON to it over TCP (the JSON that the Golang server receives).
Does the script have to run often, or only for some requests?
If yes, consider implementing a pipe between the processes and having the Python script listen for incoming messages via the pipe.
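For illustration, a minimal sketch of the pipe approach, assuming gs spawns the script once and writes one JSON object per line to its stdin; the field names mirror the flags above but are otherwise placeholders:

```python
# respond.py - long-running worker fed by gs through a pipe (stdin).
# Reads one JSON object per line; field names are illustrative.
import json
import sys

def handle(msg):
    # Here the script would build and send the Facebook message.
    print(f"sending: {msg.get('txt')} (mid={msg.get('mid')}, pld={msg.get('pld')})",
          file=sys.stderr)

for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    try:
        handle(json.loads(line))
    except json.JSONDecodeError:
        print("skipping malformed line", file=sys.stderr)
```

Since the JSON travels over a pipe rather than a shell command line, no shell is involved and there is no injection surface; the same holds on the Go side as long as it uses exec.Command with explicit arguments instead of invoking a shell.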
I'm working on an application in web.py which can send commands to a device through a website with buttons.
I know which buttons are pushed on my website, and I get some parameters back in my Python server program. Python program -> gets the basic commands out of an SQLite database -> adds the received parameters. But I also need to set up a connection with the remote device over Ethernet (a simple socket connection) to send these commands to the device. That's where I got stuck.
So I have the website working correctly, and I also have a small separate terminal program, written just to make the connection with the device, with a simple terminal interface to send commands. So basically I have the two major parts of the program working, but not simultaneously, and I can't figure out how to fit them together.
I have been reading about letting the web server run in a separate thread, or maybe I have to open and close the socket connection to the device each time I get information (command/parameters) from the website? Can somebody push me in the right direction?
NB: the server is running on a Raspberry Pi
Yes, your problem appears to be caused by the socket connection not being thread-safe.
Each request to a web.py server runs in its own thread, so if you want to access the socket connection to your device, you have to use locks, or manage a connection pool if your device supports multiple connections.
To force web.py to run in single-threaded mode, please see the following answer:
Forcing single threaded request handling with web.py
Note that you don't have to lock all requests (as in that answer); you may put a lock only on the part of the code where the connection is used.
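A minimal sketch of that narrower locking, assuming a single shared socket to the device; the host, port, and URL scheme are placeholders:

```python
# Sketch: one shared device socket guarded by a lock, so concurrent web.py
# request threads take turns on the wire. Host and port are placeholders.
import socket
import threading

import web

urls = ("/command/(.+)", "Command")
device_lock = threading.Lock()
device = socket.create_connection(("192.168.1.50", 5000))

class Command:
    def GET(self, cmd):
        # Only the actual socket I/O is serialized; request parsing,
        # database lookups, etc. can still run concurrently.
        with device_lock:
            device.sendall(cmd.encode() + b"\n")
            reply = device.recv(1024)
        return reply.decode()

if __name__ == "__main__":
    web.application(urls, globals()).run()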
Short version of my question:
How do I design a single Python script that can listen and respond to inputs received via HTTP or a serial port, and also initiate communications via these channels on its own? My problem is that I don't understand how to design a single script that both (i) uses a web framework to listen on some port for HTTP inputs, and (ii) also does other work that's independent of incoming HTTP requests.
Long version:
I want to use Python to design a system that does the following:
Listens to a serial port for occasional reports. Specifically, I have a network of JeeNode sensors (wireless Arduino-compatible modules) that talk to a central JeeLink, which connects to my computer via USB and talks to my Python script via pySerial.
Listens to a web URL for occasional inputs. Specifically, users send commands to the system via SMS to a Twilio number. Twilio intercepts the SMS messages and posts them to a URL I designate, and I use the Bottle micro web-framework to listen for new HTTP requests.
Responds to both types (serial and HTTP) of inputs. For example, if a user texts the command "Sleep", I want to (i) tell the sensors to go to sleep via the serial port -> JeeLink (which will then forward the command onto the remotes); and (ii) reply to the sender -- and maybe other users -- that the command has been received and is being executed.
Occasionally initiates its own communications to users (via HTTP -> Twilio -> SMS) or remote sensors (via serial -> JeeLink) without any precipitating input event. Two examples: (1) I want to report out to users or remote sensors every N minutes even if I haven't received any new inputs. (2) I want to tell users remotes have actually entered Sleep mode. Because the remotes are battery-powered, they spend most of the time in an inaccessible low-power mode. They can only receive new commands from the JeeLink when they initiate a wireless "check-in" every 5 min. So while technically remotes go to sleep (or wake up, etc.) in response to a user command, commands and responses are effectively independent.
My problem is that all of the usage examples of web frameworks I've seen seem to assume that all precipitating events occur via HTTP requests. I can create a Bottle object, and use decorators to bind code to that object that gets executed whenever it sees an HTTP request matching some specified URL path. But I don't know how to do that while simultaneously doing other work that's independent of HTTP events, for example listening to the serial port.
After struggling a lot, the potential solutions I'm considering now are:
Splitting the functionality into separate scripts. A.py listens for text messages via HTTP and writes the relevant information to some database; B.py continuously reads the database for new records and reacts accordingly, as well as listening to the serial monitor and doing other work. This seems like it would work fine, but it feels inelegant, and I suspect there's a simpler solution I'm unaware of.
Maybe the answer is related to Python decorators? I use various decorators to specify the URL paths that, when a matching HTTP request comes in, execute the code bound to the decorator. So I'm guessing that maybe there's a way to specify some other kind of decorator that, rather than listening for HTTP requests, gets executed when my "main" Python code tells it to? But I don't know enough about decorators to know if this is true.
It seems like you are trying to write an asynchronous application to manage your network of nodes via HTTP. You want to respond to incoming communications on multiple channels as they occur, you want to initiate communications on a schedule, on multiple channels, and you want those two forms of communication to interact. All of these communications are with an outside world that is slow, so it behooves you not to block if you don't need to.
It will probably be easiest to maintain your system if you organize your code into several Python modules, split by area of concern: serial interface code, HTTP interface code, common processing code-paths, etc. Weave those components together in a central control module that imports your libraries and knows how to start and stop cleanly. Then you can test the serial interface independent of the web interface, and potentially reuse some of those modules in other projects.
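As one illustration of that structure (not a prescribed design), a sketch that gives Bottle a daemon thread and keeps the serial port in the main loop, with a queue as the hand-off; the port name, route, and baud rate are placeholders:

```python
# control.py - sketch: Bottle serves HTTP in a daemon thread while the main
# loop owns the serial port; a queue carries commands between the two.
import queue
import threading

import serial  # pySerial
from bottle import Bottle, request, run

commands = queue.Queue()
app = Bottle()

@app.post("/sms")
def incoming_sms():
    # Twilio posts the SMS body; just queue it for the serial loop.
    commands.put(request.forms.get("Body", ""))
    return "OK"

def serve_http():
    run(app, host="0.0.0.0", port=8080, quiet=True)

def main():
    threading.Thread(target=serve_http, daemon=True).start()
    link = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)
    while True:
        line = link.readline()              # poll the JeeLink
        if line:
            print("serial report:", line.strip())
        try:
            cmd = commands.get_nowait()     # forward queued SMS commands
            link.write(cmd.encode() + b"\n")
        except queue.Empty:
            pass

if __name__ == "__main__":
    main()
```

Because readline uses a timeout, the main loop keeps ticking even when the JeeLink is silent, so the same loop can also trigger the time-based reports every N minutes.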
EDIT: Question updated. Thanks, Slott.
I have a TCP Server in Python.
It is a server with asynchronous behaviour.
The message format is Binary Data.
Currently I have a python client that interacts with the code.
What I want to be able to do eventually is implement a web-based front end to this client.
I just wanted to know what the correct design for such an application would be.
Start with any WSGI-based web server. werkzeug is a choice.
Asynchronous TCP/IP is a seriously complicated problem. HTTP is synchronous. So using a synchronous web server to present asynchronous data is always a problem. Always.
The best you can do is buffer things and have two processes in your web application (a sketch follows the list below).
A TCP/IP process that collects data from the remote server and buffers it in a file (or files) somewhere.
WSGI web process which handles GET/POST processing.
GET requests will fetch some or all of the buffer and display it.
POST requests will send a message to the TCP/IP server.
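A minimal sketch of the WSGI half under those assumptions, using werkzeug; the buffer and outbox file names, and the newline-delimited hand-off to the TCP/IP process, are placeholders:

```python
# wsgi_front.py - sketch of the web half: GET reads the buffer file that the
# TCP/IP process writes; POST appends an outbound message for it to send.
# File names and the line-based format are placeholders.
import json

from werkzeug.serving import run_simple
from werkzeug.wrappers import Request, Response

BUFFER_FILE = "inbound.log"   # written by the TCP/IP process
OUTBOX_FILE = "outbox.log"    # read by the TCP/IP process

@Request.application
def app(request):
    if request.method == "GET":
        try:
            with open(BUFFER_FILE) as f:
                lines = f.read().splitlines()
        except FileNotFoundError:
            lines = []
        return Response(json.dumps(lines), mimetype="application/json")
    if request.method == "POST":
        with open(OUTBOX_FILE, "a") as f:
            f.write(request.get_data(as_text=True).strip() + "\n")
        return Response("queued", status=202)
    return Response("method not allowed", status=405)

if __name__ == "__main__":
    run_simple("localhost", 8000, app)
```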
For Web-based, talk HTTP. Use JSON or XML as data formats.
Be standards-compliant and make use of the vast number of libraries out there. Don't reinvent the wheel. This way you have less headaches in the long run.
If you need to maintain a connection to a backend server across multiple HTTP requests, Twisted's HTTP server is an ideal choice, since it's built to manage multiple connections easily.
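For illustration, a minimal Twisted HTTP server looks like this (the resource and port are placeholders); because everything runs in one reactor loop, a persistent connection to the backend can live in the same process alongside the HTTP handlers:

```python
# Sketch: minimal Twisted HTTP server. One reactor loop can serve HTTP and
# keep a persistent TCP connection to the backend at the same time.
from twisted.internet import reactor
from twisted.web import resource, server

class Status(resource.Resource):
    isLeaf = True

    def render_GET(self, request):
        # A real handler would consult the long-lived backend connection.
        return b"backend status: ok\n"

reactor.listenTCP(8080, server.Site(Status()))
reactor.run()
```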