EDIT: Question updated. Thanks, Slott.
I have a TCP Server in Python.
It is a server with asynchronous behaviour.
The message format is binary data.
Currently I have a Python client that interacts with the code.
What I want to be able to do eventually is implement a web-based front end to this client.
I just wanted to know what the correct design for such an application should be.
Start with any WSGI-based web server; Werkzeug is one choice.
Asynchronous TCP/IP is a seriously complicated problem. HTTP is synchronous, so using a synchronous web server to present asynchronous data is always a problem. Always.
The best you can do is to buffer things and have two processes in your web application.
A TCP/IP process that collects data from the remote server and buffers it in a file (or files) somewhere.
A WSGI web process that handles GET/POST processing.
GET requests will fetch some or all of the buffer and display it.
POST requests will send a message to the TCP/IP server.
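A minimal sketch of that two-process split, assuming the remote server lives at backend.example.com:9000 and that /tmp/feed.buf is the shared buffer file (both names are invented for illustration). First the buffering TCP/IP process:

    # collector.py -- sketch of the TCP/IP process that buffers the
    # asynchronous stream into a file.
    import socket

    BUFFER_PATH = "/tmp/feed.buf"
    BACKEND = ("backend.example.com", 9000)   # assumed address, not from the question

    def run():
        with socket.create_connection(BACKEND) as conn, open(BUFFER_PATH, "ab") as buf:
            while True:
                chunk = conn.recv(4096)
                if not chunk:
                    break          # remote server closed the connection
                buf.write(chunk)   # append the raw binary data
                buf.flush()

    if __name__ == "__main__":
        run()

And the WSGI side, runnable under Werkzeug or any other WSGI server:

    # webapp.py -- sketch of the web process: GET reads the buffer,
    # POST forwards a message to the TCP/IP server.
    import socket

    BUFFER_PATH = "/tmp/feed.buf"
    BACKEND = ("backend.example.com", 9000)   # assumed address, not from the question

    def application(environ, start_response):
        if environ["REQUEST_METHOD"] == "POST":
            length = int(environ.get("CONTENT_LENGTH") or 0)
            message = environ["wsgi.input"].read(length)
            with socket.create_connection(BACKEND) as conn:
                conn.sendall(message)           # relay the message to the TCP/IP server
            start_response("204 No Content", [])
            return [b""]
        try:
            with open(BUFFER_PATH, "rb") as buf:
                body = buf.read()               # return whatever has been buffered so far
        except FileNotFoundError:
            body = b""
        start_response("200 OK", [("Content-Type", "application/octet-stream")])
        return [body]

    if __name__ == "__main__":
        from werkzeug.serving import run_simple
        run_simple("localhost", 8080, application)

A real deployment would trim or rotate the buffer file and frame the binary messages, but the split is the point: the synchronous web process never blocks on the asynchronous TCP stream.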
For Web-based, talk HTTP. Use JSON or XML as data formats.
Be standards-compliant and make use of the vast number of libraries out there. Don't reinvent the wheel. That way you will have fewer headaches in the long run.
If you need to maintain a connection to a backend server across multiple HTTP requests, Twisted's HTTP server is an ideal choice, since it is built to manage many connections easily.
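For illustration, a rough Twisted sketch where the backend address and the idea of exposing only the latest backend message are assumptions; the point is that one long-lived backend connection is shared by every HTTP request handled by the same process:

    from twisted.internet import reactor
    from twisted.internet.protocol import Protocol, ReconnectingClientFactory
    from twisted.web.resource import Resource
    from twisted.web.server import Site

    LATEST = {"data": b""}   # shared state, visible to every HTTP request

    class BackendProtocol(Protocol):
        def dataReceived(self, data):
            LATEST["data"] = data   # remember the newest backend message

    class BackendFactory(ReconnectingClientFactory):
        protocol = BackendProtocol

    class Status(Resource):
        isLeaf = True
        def render_GET(self, request):
            request.setHeader(b"Content-Type", b"application/octet-stream")
            return LATEST["data"]

    if __name__ == "__main__":
        reactor.connectTCP("backend.example.com", 9000, BackendFactory())  # assumed backend
        reactor.listenTCP(8080, Site(Status()))
        reactor.run()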
I'm currently working on a gateway with embedded Linux and a web server. The goal of the gateway is to retrieve data from electrical devices over an RS485/Modbus line and to display that data on a server.
I'm using Nginx and Django, and the web front end is delivered as "static" files. A JavaScript file repeatedly makes AJAX calls that send CGI requests to Nginx. Django answers these CGI requests with JSON responses; the responses are mostly data that has been read from the appropriate Modbus device.
The exact path is the following:
Randomly timed CGI call -> urls.py -> ModbusCGI.py (which imports another script, ModbusComm.py) -> ModbusComm.py creates a Modbus client and immediately tries to read with it.
In addition, I wanted to implement a datalogger to store data in a DB at regular intervals. I made a script that also imports ModbusComm.py, but it doesn't work: sometimes multiple Modbus frames are sent at the same time (the datalogger and the CGI scripts call the same function in ModbusComm.py at the same time), which results in an error.
I'm sure this problem would also occur if there were a lot of users on the server (CGI requests sent at the same time). Or would it? (Is there already a queue system managing CGI requests? I'm a bit lost.)
So my goal would be to make a queue system that can handle calls from several Python scripts => make them wait while it's not their turn => call a function with the right arguments when it is their turn (actually using the Modbus line), and send the response back to the calling Python script so it can generate the JSON response.
I really don't know how to achieve that, and I'm sure there are better ways to do this.
If I'm not clear enough, don't hesitate to make me aware of it :)
I had the same problem when I had to allow multiple processes to read Modbus (and not only Modbus) data through a serial port. I ended up with a standalone process (a "serial port server") that works exclusively with the serial port. All other processes talk to that port through the standalone process via some inter-process communication mechanism (we used Unix sockets).
This way, when an application wants to read a Modbus register, it connects to the "serial port server", sends its request and receives the response. All of the actual serial port communication is done by the "serial port server" sequentially, to ensure consistency.
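A minimal sketch of that pattern, assuming a hypothetical read_register() helper inside your ModbusComm.py and a one-line JSON protocol over the Unix socket (both choices are illustrative, not part of the original setup):

    # serial_server.py -- the standalone "serial port server".
    import contextlib
    import json
    import os
    import socketserver

    from ModbusComm import read_register   # hypothetical helper in your existing module

    SOCKET_PATH = "/tmp/modbus.sock"

    class ModbusRequestHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # One JSON request per connection, e.g. {"unit": 1, "address": 100}
            request = json.loads(self.rfile.readline())
            value = read_register(request["unit"], request["address"])
            self.wfile.write(json.dumps({"value": value}).encode() + b"\n")

    if __name__ == "__main__":
        with contextlib.suppress(FileNotFoundError):
            os.unlink(SOCKET_PATH)   # remove a stale socket from a previous run
        # A single-threaded server: requests are handled strictly one after
        # another, so only one Modbus transaction is on the wire at a time.
        with socketserver.UnixStreamServer(SOCKET_PATH, ModbusRequestHandler) as server:
            server.serve_forever()

The Django views and the datalogger would then each connect to /tmp/modbus.sock, write one JSON line and read one line back, instead of touching the Modbus line directly.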
I need to implement a chat application for my web service (which is written in Django + REST framework). After doing some Google searching, I found that the Django chat applications that are available are all deprecated and no longer supported, and all the DIY (do-it-yourself) solutions I found use the Tornado or Twisted frameworks.
So, my question is: is it OK to make a Django-only, synchronous chat application, or do I need an asynchronous framework? I have very little experience in backend programming, so I want to keep everything as simple as possible.
Django, like many other web frameworks, is constructed around the concept of receiving an HTTP request from a web client, processing the request and sending a response. Breaking down that flow (simplified for the sake of clarity):
The remote client opens a TCP connection with your Django server.
The client sends an HTTP request to the server, with a path, some headers and possibly a body.
The server processes the request and sends an HTTP response.
The connection is closed.
The server goes back to waiting for a new connection.
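For reference, an ordinary Django view lives entirely inside one such cycle; a minimal sketch (the view is invented for illustration):

    # views.py -- one request in, one response out; nothing persists
    # between calls.
    from django.http import JsonResponse

    def ping(request):
        return JsonResponse({"message": "pong"})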
A chat server, if it needs to be somewhat real-time, needs to be different: it needs to maintain many simultaneous open connections with connected clients, so that when new messages are published on a channel, the appropriate clients are notified accordingly.
A modern way of implementing that is to use WebSockets. The communication flow between client and server starts with an HTTP request, like the one described above, but the client sends a special Upgrade HTTP request, asking for the session to switch over from the simple request/response paradigm to a persistent, "full-duplex" communication model, where both the client and the server can send messages at any time, in either direction.
The fact that the connections with multiple simultaneous clients need to be persistent means you can't have a simple execution model where your server takes care of a single request at a time, which is usually what happens in what you call synchronous servers. Tornado and Twisted have a different model for doing networking, based on non-blocking, event-driven I/O, so that multiple connections can be left open and taken care of simultaneously by a server, making a chat service possible.
Synchronous approach nevertheless
Having said that, there are ways to implement a very simple, non-scalable chat service with noticeable latency:
Clients perform POST requests to your server to send messages to channels.
Clients perform periodical GET requests to the server to ask for any new messages to the channels they're subscribed to. The rate at which they send these requests is basically the refresh rate of the chat app.
With this approach, your server will work significantly harder than if it had an asynchronous execution model for maintaining persistent connections, but it will work.
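A minimal sketch of the polling approach, assuming a hypothetical Message model with channel and text fields (none of this is from the original answer):

    # views.py -- send with POST, poll with GET.
    from django.http import JsonResponse
    from django.views.decorators.csrf import csrf_exempt
    from django.views.decorators.http import require_GET, require_POST

    from .models import Message   # assumed model: channel, text, auto pk

    @csrf_exempt
    @require_POST
    def post_message(request, channel):
        message = Message.objects.create(channel=channel, text=request.POST["text"])
        return JsonResponse({"id": message.pk})

    @require_GET
    def poll_messages(request, channel):
        since = int(request.GET.get("since", 0))
        messages = Message.objects.filter(channel=channel, pk__gt=since).order_by("pk")
        return JsonResponse({
            "messages": [{"id": m.pk, "text": m.text} for m in messages],
        })

The JavaScript client POSTs to the first view to send a message and GETs the second one every few seconds, passing the last message id it has seen as the since parameter.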
If you're going to make a chat app, you'll want to use WebSockets. They'll make pushing updates to all clients participating in a conversation significantly easier, and they'll give you real-time conversations within your app. Having said that, I've never seen WebSockets used within a synchronous framework.
Is it OK to make a Django-only, synchronous chat application? There are too many unanswered questions for a reasonable answer. How many people will use this chat app? How many people per conversation? How long will this app be around? If you're looking to make something simple for you and a couple of friends, make what you know. If you're getting paid to make this app, use WebSockets and an asynchronous framework.
You certainly can develop a synchronous chat app; you don't necessarily need to use an asynchronous framework. But it all comes down to: what do you want your app to do? How many people will use the app? Will there be multiple users and multiple chats going on at the same time?
I tried to search the internet for this subject, but I didn't find any answers.
If someone knows how I can use a browser as a client with Python sockets, that would be very helpful.
To use the browser as a client to a python (server) socket, you simply need to point it to the right endpoint.
Assuming you are running the browser and the python server on the same machine, and that you're opening port 1234 on the server socket, you simply need to open the localhost:1234 URL in your browser.
Of course, what happens next is entirely dependent on how you handle the communication in your program. Most browsers will understand plain text put directly on the socket, but you probably want to talk HTTP.
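For example, a minimal sketch that speaks just enough HTTP over a plain socket for a browser pointed at http://localhost:1234 to render a page (the port matches the example above):

    import socket

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("localhost", 1234))
        srv.listen(1)
        while True:
            conn, addr = srv.accept()
            with conn:
                conn.recv(4096)   # read (and ignore) the browser's request
                body = b"<h1>Hello from a raw socket</h1>"
                conn.sendall(
                    b"HTTP/1.1 200 OK\r\n"
                    b"Content-Type: text/html\r\n"
                    b"Content-Length: " + str(len(body)).encode() + b"\r\n"
                    b"Connection: close\r\n"
                    b"\r\n" + body
                )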
It's worth mentioning that using a plain socket to communicate with a browser is, at best, uncommon. There may be better solutions, depending on what, exactly, you want to do:
If you just want to quickly serve a few files from a directory (often called a "directory listing"), you can use SimpleHTTPServer (see the sketch after this list)
If you're trying to build a website, you should look into a web framework, such as Django, Flask or CherryPy
If you want a lower-level highly asynchronous scalable communication, Tornado is a popular choice
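For the first option, a one-file sketch using only the standard library (http.server is the Python 3 name of the SimpleHTTPServer module; the port is arbitrary):

    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    if __name__ == "__main__":
        # Serves the current working directory on http://localhost:8000
        ThreadingHTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()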
You might want to consider using WebSockets. They essentially function like regular TCP sockets, but are initiated with an HTTP handshake, which makes them suitable for browsers. They are supported in recent versions of all major browsers. There are many libraries available that adapt common Python web servers to serve WebSockets as well, for example:
https://pypi.python.org/pypi/gevent-websocket/
if you like gevent.
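As an illustration, a minimal echo server using gevent-websocket's WSGI handler (the port and the echo behaviour are just for the sketch):

    from gevent import pywsgi
    from geventwebsocket.handler import WebSocketHandler

    def echo_app(environ, start_response):
        ws = environ.get("wsgi.websocket")
        if ws is None:
            # The request was not a WebSocket upgrade.
            start_response("400 Bad Request", [("Content-Type", "text/plain")])
            return [b"Expected a WebSocket request"]
        while True:
            message = ws.receive()
            if message is None:   # client closed the connection
                break
            ws.send(message)      # echo it back
        return []

    if __name__ == "__main__":
        server = pywsgi.WSGIServer(("", 8000), echo_app, handler_class=WebSocketHandler)
        server.serve_forever()

On the browser side you would open it with new WebSocket("ws://localhost:8000/").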
They also support SSL/TLS, which is used via URLs starting with "wss://" on the browser side. More information here:
https://www.websocket.org/
I need to develop an application in Python handling a few thousand persistent TCP connections in parallel. Clients connect to the server at bootstrap and send messages (in binary format) from time to time. The server also sends binary messages, both in reply to clients' messages and asynchronously. Basically it is a persistent connection initiated by the client, because I have no way to reach clients that are behind a NAT.
The question is: which libraries/frameworks should I consider for this task? Spawning a thread for each client is not an option, and I'm not aware of a thread pool library for Python. I also recently discovered gevent. Which other options do I have?
This link is an excellent read. It lists all the available event driven and asynchronous network frameworks within Python and also has good analysis of the performance for each framework.
It appears that the Tornado framework is one of the most-performant when developing such applications.
Hope this helps
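For instance, a minimal Tornado TCP server sketch for many persistent binary connections (the 4-byte length-prefix framing and the port are assumptions, not something the question specifies):

    from tornado.ioloop import IOLoop
    from tornado.iostream import StreamClosedError
    from tornado.tcpserver import TCPServer

    class BinaryServer(TCPServer):
        async def handle_stream(self, stream, address):
            while True:
                try:
                    header = await stream.read_bytes(4)            # length prefix
                    length = int.from_bytes(header, "big")
                    payload = await stream.read_bytes(length)      # the binary message
                    # Reply here; the server can also write to `stream` later,
                    # asynchronously, from other callbacks.
                    await stream.write(header + payload)
                except StreamClosedError:
                    break

    if __name__ == "__main__":
        BinaryServer().listen(8000)
        IOLoop.current().start()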
'greenlet' is a lightweight concurrency package. See http://greenlet.readthedocs.org/en/latest/.
Besides greenlets, you might also want to consider multiprocessing. See http://docs.python.org/2/library/multiprocessing.html.
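gevent, which the question mentions, builds on greenlets: each connection is handled in its own lightweight greenlet rather than an OS thread. A minimal echo sketch (the port and the echo behaviour are illustrative only):

    from gevent.server import StreamServer

    def handle(sock, address):
        while True:
            data = sock.recv(4096)
            if not data:          # client disconnected
                break
            sock.sendall(data)    # echo the binary message back

    if __name__ == "__main__":
        StreamServer(("0.0.0.0", 8000), handle).serve_forever()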
The title may be a bit vague, but here's my goal: I have a frontend webserver which takes incoming HTTP requests, does some preprocessing on them, and then passes the requests off to my real webserver to get the HTTP response, which is then passed back to the client.
Currently, my frontend is built off of BaseHTTPServer.HTTPServer and the backend is CherryPy.
So the question is: Is there a way to take these HTTP requests / client connections and insert them into a CherryPy server to get the HTTP response? One obvious solution is to run an instance of the CherryPy backend on a local port or using UNIX domain sockets, and then the frontend webserver establishes a connection with the backend and relays any requests/responses. Obviously, this isn't ideal due to the overhead.
What I'd really like is for the CherryPy backend to not bind to any port, but just sit there waiting for the frontend to pass the client's socket (as well as the modified HTTP Request info), at which point it does its normal CherryPy magic and returns the request directly to the client.
I've been perusing the CherryPy source to find some way to accomplish this, and currently am attempting to modify wsgiserver.CherryPyWSGIServer, but it's getting pretty hairy and is probably not the best approach.
Is your main app a WSGI application? If so, you could write some middleware that wraps around it and does all the request wrangling before passing on to the main application.
If this is possible, it would avoid having to run two web servers and all the problems you are encountering.
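A rough sketch of that idea, assuming the CherryPy side can be exposed as a WSGI application via cherrypy.Application (the root class and the preprocessing step are invented for illustration):

    import cherrypy

    class Root:
        @cherrypy.expose
        def index(self):
            return "hello from the real application"

    def preprocessing_middleware(app):
        def wrapper(environ, start_response):
            # Do the "frontend" request wrangling here (rewrite headers,
            # reject requests, etc.), then hand off to the wrapped app.
            environ["HTTP_X_PREPROCESSED"] = "1"
            return app(environ, start_response)
        return wrapper

    wsgi_app = preprocessing_middleware(cherrypy.Application(Root(), "/"))

    if __name__ == "__main__":
        # Any WSGI server can now host both layers in a single process.
        from wsgiref.simple_server import make_server
        make_server("localhost", 8080, wsgi_app).serve_forever()

The important part is that both layers run in one process under one WSGI server, so there is no second listening port and no proxy hop.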
Answered the Upgrade question at Handling HTTP/1.1 Upgrade requests in CherryPy. Not sure if that addresses this one or not.