I'm working on a web.py application that sends commands to a device through a website with buttons.
I know which buttons are pushed on my website and I get some parameters back in my Python server program: the program fetches the basic commands from an SQLite database and adds the received parameters. But I also need to simultaneously set up a connection with the remote device over Ethernet (a simple socket connection) to send these commands to the device. That's where I got stuck.
So I have the website working correctly, and I also have a small separate terminal program that just connects to the device and provides a simple terminal interface to send commands. Basically I have the two major parts of the program working, but not simultaneously, and I can't figure out how to fit them together.
I have been reading about running the web server in a separate thread, or maybe I have to open and close the socket connection to the device each time I get information (command/parameters) from the website? Can somebody push me in the right direction?
NB: the server is running on a Raspberry Pi
Yes, your problem appears to be caused by the socket connection not being thread-safe.
Each request to the web.py server runs in its own thread, so if you want to access the socket connection to your device, you have to use locks or manage a connection pool, if your device supports multiple connections.
To force web.py to run in single-threaded mode, see the following answer:
Forcing single threaded request handling with web.py
Note that you don't have to lock all requests (as in that answer); you can lock only the part of the code where the connection is used, as sketched below.
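Here's a minimal sketch of that idea, assuming a single persistent connection (the device address, port, and command framing are hypothetical):

import socket
import threading

DEVICE_ADDR = ("192.168.1.50", 9000)   # hypothetical device address

_lock = threading.Lock()
_conn = socket.create_connection(DEVICE_ADDR)

def send_command(command):
    # Safe to call from any web.py request thread: the lock ensures
    # only one thread talks to the device at a time.
    with _lock:
        _conn.sendall(command.encode() + b"\n")
        return _conn.recv(4096)

Each request handler then calls send_command() with the command built from the database plus the received parameters.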
Related
I'm currently working on a gateway with embedded Linux and a web server. The goal of the gateway is to retrieve data from electrical devices over an RS485/Modbus line and to display it on a server.
I'm using Nginx and Django, and the web front-end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to Nginx. These CGI requests are answered with JSON responses via Django. The responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following:
Randomly timed CGI call -> urls.py -> ModbusCGI.py (which imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
Next to that, I wanted to implement a datalogger to store data in a DB at regular intervals. I made a script that also imports ModbusComm.py, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and the CGI scripts call the same function in ModbusComm.py at the same time), which results in an error.
I'm sure this problem would also occur if there were a lot of users on the server (CGI requests sent at the same time). Or not? (Is there already a queue system for CGI requests? I'm a bit lost.)
So my goal would be to make a queue system that can handle calls from several Python scripts: make them wait while it's not their turn, call a function with the right arguments when it is their turn (actually using the Modbus line), and send the response back to the calling script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better ways to do this.
If I'm not clear enough, don't hesitate to make me aware of it :)
I had the same problem when I had to allow multiple processes to read some Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process (a "serial port server") that works with the serial port exclusively. All other processes work with the port through that standalone process, via some inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the "serial port server", sends its request, and receives the response. All the actual serial port communication is done by the "serial port server", sequentially, to ensure consistency.
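Here's a minimal sketch of such a server, assuming a Unix socket path and a hypothetical read_register() helper (real code would do the Modbus exchange with a library such as pymodbus):

import os
import socketserver

SOCKET_PATH = "/tmp/serial_port_server.sock"   # hypothetical path

def read_register(request):
    # Placeholder for the actual serial/Modbus exchange
    return b"OK " + request

class Handler(socketserver.StreamRequestHandler):
    def handle(self):
        request = self.rfile.readline().strip()
        self.wfile.write(read_register(request) + b"\n")

if os.path.exists(SOCKET_PATH):
    os.unlink(SOCKET_PATH)                     # remove a stale socket file

# UnixStreamServer handles one request at a time, so access to the
# serial line is naturally sequential.
with socketserver.UnixStreamServer(SOCKET_PATH, Handler) as server:
    server.serve_forever()

The CGI scripts and the datalogger would then each open a connection to SOCKET_PATH instead of touching the Modbus line directly.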
I need some help: I am at the early design stage of a client-server application, and I don't know which of the two options (web service or socket programming) is the right one for my software.
All programming is in python.
The layout:
The PC will need to run a server service: this server gets commands from the local computer and sends them to the MiniPC.
The MiniPC will need to run a client service: when it identifies a command (method), it goes to the hardware (connected by serial, USB, ...), does something, and returns the result to the MiniPC.
The MiniPC takes the hardware result and sends it to the logging server and to the main PC.
Notes:
The PC can control several MiniPCs.
The amount of data in one hardware response can be up to 10 KB.
Commands from the PC to a MiniPC are small (strings).
Logging data can be up to 10 KB.
Questions:
What is your recommendation for the protocol: web (HTTP) or socket programming?
Do you have any suggestions for the design?
You should be able to use socket programming for this. Set up a socket server on the PC and a client on each MiniPC device. The clients wait for input (read from the socket) from the PC and then send back the output they get from the hardware. In terms of design, I see two things. First, the socket server can run select() to handle multiple clients. Second, you probably want to bump up the SO_SNDBUF socket option for the MiniPC sockets and SO_RCVBUF for the server on the PC to multiples of 10 KB. What is your argument for considering web?
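For the design side, here's a minimal sketch of the select()-based server (the port and buffer size are hypothetical):

import select
import socket

LISTEN_ADDR = ("0.0.0.0", 5000)   # hypothetical port
BUF_SIZE = 64 * 1024              # comfortably above the 10 KB responses

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(LISTEN_ADDR)
server.listen()

sockets = [server]
while True:
    readable, _, _ = select.select(sockets, [], [])
    for s in readable:
        if s is server:                       # a new MiniPC connecting
            conn, addr = s.accept()
            conn.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUF_SIZE)
            sockets.append(conn)
        else:                                 # a result coming back from a MiniPC
            data = s.recv(BUF_SIZE)
            if not data:                      # client disconnected
                sockets.remove(s)
                s.close()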
I have done a similar project with ARM-based controllers instead of BeagleBones: feel free to ask me questions in the comments.
Firstly, technically your BeagleBones are servers, since they run a daemon service which is event-triggered, and the PCs are clients (but that is just pedantry).
Secondly, due to the limitations of embedded devices, I was not able to get an efficient web server running on them, so the choice was simple. I would advise you to stick with socket programming, but add network services such as DHCP, support for TCP/UDP/UDP multicast, ping, echo, ...
Finally, the important question in terms of performance is the following:
What is the physical layer of communication?
Ethernet? Wi-Fi? Bluetooth/ZigBee? I2C/CAN/...?
I will guess it's Ethernet: on a shared medium, Ethernet (IEEE 802.3) doesn't scale well because of CSMA (see http://fr.wikipedia.org/wiki/CSMA). If you want to have several devices (dozens), you will need switches/routers to split the network into sub-networks and avoid congestion.
I have a Python application, to be more precise a network application, that can't go down. This means I can't kill the PID, since it actually talks with other servers and clients and so on: many € per minute of downtime, you know, the usual 24/7 system.
Anyway, in my hobby projects I also work a lot with WSGI frameworks, and I noticed that I have the same problem even during off-peak hours.
Anyway, imagine a normal server using TCP/UDP (put here your favourite WSGI/SIP/classified-information server/etc.).
Now you perform a git pull on the remote server and the new Python files land on the server (these files will of course ONLY affect the data processing and not the actual sockets, so there is no need to re-create the sockets or touch the network part in any way).
I don't usually use file monitors, since I prefer to use a SIGNAL to wake up the internal app updater.
Now imagine the following code:
from mysuper.app import handler

# sock is an already-connected socket object
while True:
    data = sock.recv(4096)
    if data:
        sock.send(handler(data))
Let's imagine that handler is an app with DB connections, cache connections, etc.
What is the best way to update the handler?
Is it safe to call reload(handler) ?
Will this break DB connections ?
Will DB Connections survive to this restart ?
Will current transactions be lost ?
Will this create anti-matter ?
What are the best-practice patterns that you usually use, if there are any?
It's safe to call reload(), with one caveat: reload() takes a module object, not a function, so you reload the module that defines handler (here mysuper.app) and then rebind handler, because the existing from ... import binding still points at the old function.
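Here's a minimal sketch of the signal-driven update you mention, using your own module name and Python 3's importlib.reload (in Python 2, the builtin reload() plays the same role):

import importlib
import signal

import mysuper.app
from mysuper.app import handler

def reload_app(signum, frame):
    global handler
    importlib.reload(mysuper.app)      # re-execute the module's code
    handler = mysuper.app.handler      # rebind the name the loop actually calls

signal.signal(signal.SIGHUP, reload_app)   # kill -HUP <pid> triggers the update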
Whether this breaks DB connections depends on where you initialize them. If you make the connections inside handler(), then yes, they'll be garbage collected when the handler() call falls out of scope. But you wouldn't be connecting inside your main loop, would you? I'd highly recommend something like:
dbconnection = connect(...)
while True:
    ...
    sock.send(handler(data, dbconnection))
if for no other reason than that you won't be making an expensive connection inside a tight loop.
That said, I'd recommend going with an entirely different architecture. Make a listener process that does basically nothing more than listen for UDP datagrams, send them to a message queue like RabbitMQ, and wait for the reply message to send the results back to the client. Then write your actual servers that get their requests from the message queue, process them, and send a reply message back.
If you want to upgrade the UDP server, launch the new instance listening on another port. Update your firewall rules to redirect incoming traffic to the new port. Reload the rules. Kill the old process. Voila: seamless cutover.
The real win is from uncoupling your backend. Since multiple processes can listen for the same messages from your frontend "proxy" service, you can run several in parallel, on different machines if you want to. To upgrade the backend, start a new instance and then kill the old one, so that at least one instance is running at all times.
To scale your proxy, have multiple instances running on different ports or different hosts, and configure your firewall to randomly redirect incoming datagrams to one of the proxies.
To scale your backend, run more instances.
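Here's a minimal sketch of such a backend worker using pika (the queue name and the handle() function are hypothetical; the proxy is assumed to set reply_to and correlation_id in the usual RPC pattern):

import pika

def handle(body):
    # Placeholder for the actual request processing
    return b"processed: " + body

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="requests")            # hypothetical queue name

def on_request(ch, method, props, body):
    ch.basic_publish(exchange="",
                     routing_key=props.reply_to,   # reply queue named by the proxy
                     properties=pika.BasicProperties(
                         correlation_id=props.correlation_id),
                     body=handle(body))
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="requests", on_message_callback=on_request)
channel.start_consuming()

To scale, start more copies of this worker; to upgrade, start a new copy and then kill an old one.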
A typical situation with a server/web application is that the application needs to be shut down and restarted to implement an upgrade.
What are the possible/common schemes (and available software) to avoid losing data that clients sent to the server during the short time the application was gone?
An example scheme that could work: for a simple web server where the client connects to port 80, rather than having the client connect directly to the web server application, there could be a simple application in between that listens on port 80 and seamlessly forwards/returns data to/from the "actual" web server application (on some other port). When the web server needs to be shut down and restarted, the relay app could detect this and buffer all incoming data until the web server comes back to life. This way there is always an application listening on port 80 and data is never lost (within buffer-size and time limits, of course). Does such a simple buffer-on-recipient-unavailable intermediary already exist?
I'm mostly interested in solutions for a single application instance, not ones with multiple instances (in which case a clever rolling-update scheme could be used), but in the interest of a complete set of answers, any response would be great!
To avoid this, run multiple application servers behind a load balancer. Before bringing one down, ensure the load balancer is not sending it new clients. Bring it down; traffic will go to the other application servers, and when it comes back up traffic will be sent to it again.
If you have only one application server, simply "buffering" network traffic is a poor solution. When the server comes back up, it no longer has any of the TCP state information, and the old incoming connections have nowhere to go anyway.
I have written a simple Twisted application that connects to a server that listens on one or more ports; the Twisted app usually connects to a few of the open ports at a time. This server is a serial logger that connects to serial devices and exposes the serial line data through a raw TCP socket, and I need to log all this data to disk.
My current app logs any received information to disk without issue.
What I now need to do, but am unable to make progress on, is add the ability to interact with my application through stdin. I need to be able to issue commands to the local application, but also send text commands through the connected sockets.
I have a basic prompt using basic.LineReceiver added to my reactor, but I can't figure out how to send the data to the server, or even whether this is the correct way of doing it.
A simplified example would be helpful to show what I need to do.
Thanks
J
To add an interactive console to your Twisted app, see this article -- it explains how to use twisted.internet.stdio for the purpose.
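Here's a minimal sketch of that approach (the command handling is a stub; wiring the typed commands through to your already-connected protocols is left to your app):

from twisted.internet import reactor, stdio
from twisted.protocols import basic

class CommandConsole(basic.LineReceiver):
    delimiter = b"\n"   # stdin lines end in \n, not the default \r\n

    def connectionMade(self):
        self.transport.write(b"> ")

    def lineReceived(self, line):
        # Dispatch the typed command here: act locally, or write it
        # out over one of your existing connections to the serial logger.
        self.sendLine(b"got command: " + line)
        self.transport.write(b"> ")

stdio.StandardIO(CommandConsole())
reactor.run()

In your real application you would only add the stdio.StandardIO(...) line before your existing reactor.run().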