Is it possible to use Django to communicate with some kind of server process? For example, on my Django website I want to have a form where I input connection details (host and port), and after connecting I want to send requests or events to another server process (some simple action like moving a slider or clicking a button). Can I use Python socket programming for this, or is there an easier way?
You can use any Python package with Django, just as in any "normal" Python program. If you have a module that communicates with your server, you can use it; if not, you will have to write one yourself, most likely with socket programming.
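For example, a view could open a short-lived TCP connection to the host and port submitted in the form and push the command over it. A minimal sketch using Python's socket module follows; the form, the view name, and the command format are illustrative, not from your question:

import socket

from django import forms
from django.http import HttpResponse

class SendCommandForm(forms.Form):
    host = forms.CharField()
    port = forms.IntegerField()
    command = forms.CharField()  # e.g. "slider 42" or "button pressed"

def send_command(request):
    form = SendCommandForm(request.POST)
    if not form.is_valid():
        return HttpResponse("invalid form", status=400)
    data = form.cleaned_data
    # Open a short-lived TCP connection, send the command, read the reply.
    with socket.create_connection((data["host"], data["port"]), timeout=5) as sock:
        sock.sendall(data["command"].encode() + b"\n")
        reply = sock.recv(4096)
    return HttpResponse(reply)

Wire send_command into urls.py like any other view.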
Related
I'm currently working on a gateway with embedded Linux and a web server. The goal of the gateway is to retrieve data from electrical devices over an RS485/Modbus line and to display it via the server.
I'm using Nginx and Django, and the web front-end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to Nginx. These CGI requests are answered with JSON responses by Django. The responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following:
Randomly timed CGI call -> urls.py -> ModbusCGI.py (imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
Next to that, I wanted to implement a datalogger to store data in a DB at regular intervals. I made a script that also imports ModbusComm.py, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and the CGI scripts call the same function in ModbusComm.py simultaneously), which results in an error.
I'm sure this problem would also occur with many users on the server (CGI requests sent at the same time). Or not? (Is a queue system already in place for CGI requests? I'm a bit lost.)
So my goal would be a queue system that can handle calls from several Python scripts: make them wait until it's their turn, call a function with the right arguments when it is (actually using the Modbus line), and send the response back to the calling script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better ways to do this.
If I'm not clear enough, don't hesitate to make me aware of it :)
I had the same problem when I had to allow multiple processes to read Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process (a "serial port server") that works with the serial port exclusively. All other processes use that port through the standalone process, via some inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the "serial port server", sends its request, and receives the response. All the actual serial port communication is done by the "serial port server" sequentially, to ensure consistency.
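A rough sketch of the idea, with a line-oriented request format that I made up for illustration ("READ <address>"); the actual Modbus read is stubbed out where your ModbusComm.py would be called:

import os
import socket

SOCK_PATH = "/tmp/serial-port-server.sock"  # illustrative path

def read_modbus_register(address):
    # Placeholder for the real serial/Modbus read (your ModbusComm.py).
    return b"42"

def main():
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCK_PATH)
    server.listen(1)
    while True:
        # Serving one client at a time serializes all access to the
        # serial line, so Modbus frames can never interleave.
        conn, _ = server.accept()
        with conn:
            request = conn.recv(1024)  # e.g. b"READ 100"
            address = int(request.split()[1])
            conn.sendall(read_modbus_register(address))

if __name__ == "__main__":
    main()

Both the CGI view and the datalogger then connect to SOCK_PATH, send their request, and wait for the reply; queuing happens naturally in the listening socket's backlog.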
I'm working on an application in web.py which can send commands to a device through a website with buttons.
I know which buttons are pushed on my website, and I get some parameters back in my Python server program. The Python program gets the basic commands out of an SQLite database and adds the received parameters. But I also need to set up a connection with the remote device over Ethernet (a simple socket connection) to send these commands to the device. That's where I got stuck.
So I have the website working correctly, and I also have a small separate terminal program that just connects to the device, with a simple terminal interface to send commands. So basically I have the two major parts of the program working, but not simultaneously, and I can't figure out how to fit them together.
I have been reading about letting the web server run in a separate thread, or maybe I have to open and close the socket connection to the device each time I get information (command/parameters) from the website? Can somebody push me in the right direction?
NB: the server is running on a Raspberry Pi
Yes, your problem appears to be caused by the socket connection not being thread-safe.
Each request to a web.py server runs in its own thread, so if you want to access the socket connection to your device, you have to use locks or manage a connection pool (if your device supports multiple connections).
To force web.py to run in single-threaded mode, please see the following answer:
Forcing single threaded request handling with web.py
Note that you don't have to lock entire requests (as in that answer); you may lock only the part of the code where the connection is used.
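A minimal sketch of that idea in web.py, assuming the device connection is a plain TCP socket (the address, URL, and command format are illustrative):

import socket
import threading

import web

urls = ("/send/(.*)", "SendCommand")

device_lock = threading.Lock()
device = socket.create_connection(("192.168.1.50", 5000))  # illustrative address

class SendCommand:
    def GET(self, command):
        # Only the socket I/O is serialized; the rest of the request
        # handling still runs concurrently in other threads.
        with device_lock:
            device.sendall(command.encode() + b"\n")
            reply = device.recv(4096)
        return reply.decode(errors="replace")

if __name__ == "__main__":
    web.application(urls, globals()).run()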
I want to create a Python application that is always listening on a parametrized port. Whenever a request comes in on that port, the application will parse it and perform tasks based on the request.
Is this type of application called a service? (I have zero knowledge of services.) Where can I find beginner's tips and guides on this type of development?
This is called a server; there are examples at the bottom of the Python socket documentation page.
HTH.
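For a flavour of those examples, here is a minimal sketch of such a server: it listens on a port, reads each request, and is the place to parse the request and dispatch to your tasks (the port is illustrative):

import socket

HOST, PORT = "", 9999  # empty host = listen on all interfaces

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)
    while True:
        conn, addr = server.accept()
        with conn:
            data = conn.recv(1024)
            # Parse `data` and do your tasks here.
            conn.sendall(b"got: " + data)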
This is socket programming. Writing raw sockets is cumbersome, so you can use any web server written in Python instead. My recommendation is Werkzeug, which is very simple. Also have a look at Flask, which is built on top of Werkzeug.
In case you are trying to build your own protocol engine, Twisted is a framework that will help you achieve that.
You can use threads or the Twisted framework (arguably the easier option) to create a server; a minimal Twisted example follows.
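This sketch uses the classic echo protocol as a stand-in for your own request handling (the port is illustrative):

from twisted.internet import protocol, reactor

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        # Parse the request and act on it; here we just echo it back.
        self.transport.write(data)

class EchoFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Echo()

reactor.listenTCP(9999, EchoFactory())
reactor.run()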
What's the way to go to build an HTML GUI for, e.g., a multiplexed TCP server in Python?
I am familiar with building websites with Django, but the thing I don't understand is: how does the TCP server part communicate with the Django views? How could I implement the data sharing (or do I not see the wood for the trees)?
The problem for me is the mapping between the stateless "get and leave" HTTP requests and the stateful Python module running as a daemon.
greetings
Edit: my standalone application skeleton:
#!/usr/bin/python
from django.core.management import setup_environ
import settings
setup_environ(settings)

from myapp.models import fanzy

def main():
    for each in fanzy.objects.all():
        print each.id, each.foo

if __name__ == '__main__':
    main()
Django is just Python, so anything you've written in Python can be imported and referenced in the 'views' that you write for Django to serve back as HTTP responses.
In answer to another part of your question: the way an HTTP server handling TCP connections communicates with the Python framework is most commonly through a protocol called WSGI. The WSGI specification itself (PEP 333, updated by PEP 3333) is a good place to get more knowledge about the principles of WSGI.
With regards to running a background process and serving up a view of that process's activities, it may be better to keep the two problems separate. You could write data to a file or a database from the background process, and then access and serve this data via your web application, as sketched below.
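A minimal sketch of that split, with an illustrative file path and state format; the daemon periodically dumps its state, and a Django view simply reads the file back:

import json
import time

STATE_FILE = "/tmp/tcpserver-state.json"  # illustrative path

# In the daemon process: periodically persist whatever state the
# TCP server accumulates (get_state is your own callable).
def daemon_loop(get_state):
    while True:
        with open(STATE_FILE, "w") as f:
            json.dump(get_state(), f)
        time.sleep(5)

# In a Django view: serve the last persisted state.
from django.http import HttpResponse

def status(request):
    with open(STATE_FILE) as f:
        return HttpResponse(f.read(), content_type="application/json")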
These are just general comments, because your question is not totally clear. Please feel free to respond with further questions.
It's not always as easy as importing the libraries, mostly because of process lifetimes. For example, if you run Django through CGI with one request per process, then your TCP server won't stay alive between views. Similarly, if you use multiple processes to handle requests (e.g. using FastCGI), you'll have several servers running at the same time.
If you want permanent network connections that live independently of request lifetimes, you'll need to run the TCP server in an external (daemon) process. This is the standard procedure for some caching schemes, where all your Django processes share cached data via a single daemon (e.g. Redis).
Basically, you have two approaches.
Global connection
Either establish a connection per Django process (if you have more than one) as a global object and forward requests to it from your views. This is most convenient if your TCP server is coded to handle multiple requests per connection. However, you'll have problems if your Django process is multi-threaded.
Connection per request
If your TCP server can accept multiple short-lived connections, this is also a viable approach. Just open the connection for the lifetime of your view. If this pattern is used often enough, you can even add a piece of middleware that opens the connection and stores it on the request object, as in the sketch below.
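A sketch of such middleware, written in the new-style Django middleware form and with an illustrative daemon address; views can then use request.tcp_conn for the duration of the request:

import socket

class TCPConnectionMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # One connection per request, closed when the response is ready.
        request.tcp_conn = socket.create_connection(("127.0.0.1", 9999), timeout=5)
        try:
            return self.get_response(request)
        finally:
            request.tcp_conn.close()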
I have written a simple Twisted application that connects to a server listening on one or more ports; the app usually connects to a few of the open ports at a time. This server is a serial logger that connects to serial devices and exposes the serial line data through a raw TCP socket, and I need to log all this data to disk.
My current app logs any received information to disk without issue.
What I now need to do, but am unable to make progress on, is add the ability to interact with my application through stdin. I need to be able to issue commands to the local application, but also to send text commands through the connected sockets.
I have a basic prompt using basic.LineReceiver and have added it to my reactor, but I can't figure out how to send the data to the server, or even whether this is the right approach.
A simplified example would be helpful to show what I need to do.
Thanks
J
To add an interactive console to your Twisted app, see this article -- it explains how to use twisted.internet.stdio for the purpose.
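A minimal sketch of the technique: hook a LineReceiver up to stdin with stdio.StandardIO, and dispatch each typed line either as a local command or out over your existing connections:

from twisted.internet import reactor, stdio
from twisted.protocols import basic

class Console(basic.LineReceiver):
    delimiter = b"\n"  # stdin lines end with \n, not the default \r\n

    def lineReceived(self, line):
        if line == b"quit":
            reactor.stop()
        else:
            # Here you would forward `line` to your connected protocol
            # instances (e.g. via their transports).
            self.sendLine(b"command: " + line)

stdio.StandardIO(Console())
reactor.run()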