Using client and server websockets in same python script - python

I am new to Python - and work on Slackware Linux with Python 3.4.3. I prefer simple no-dependency solutions within one single python script.
I am building a daemonized server program (A) which I need to access through both a regular shell CLI and GUIs in my web browser: it serves various files, uses a corresponding database, and updates a Firefox tab through Python's webbrowser module. Currently, I access process (A) via the CLI or a threaded network socket. This all started to work in a localhost scenario with all processes running on one machine.
Now, it turns out that the WebSocket protocol would make my setup dramatically simpler and cut out the traditional request flows that use Apache and complex frameworks as middlemen.
1st central question: How do I access daemon (A) with WebSockets from the CLI? I thought about firing up a non-daemon version of my server program, now called (B), and sending a call to its counterpart (A) via the WebSocket protocol. This would make process (B) a WebSocket CLIENT and process (A) a WebSocket SERVER. Is such communication possible at all today?
2nd question: Which is the best-suited template solution for this scenario that works with Python 3.4.3? I started to play with Pithikos' very sleek python-websocket-server template (see https://github.com/Pithikos/python-websocket-server), but I am unable to use it as a CLIENT (initiating the network call) to call its SERVER equivalent (receiving the call while residing in a daemonized process).

Problem 'solved': I gave up on the zero-dependency, zero-library idea:
pip install websockets
https://websockets.readthedocs.io
It works like a charm. The WebSocket server sits in the daemon process and receives and processes WebSocket client calls that come from the CLI processes and from the HTML GUIs.
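The resolution above settles on the third-party websockets package. For completeness, the pattern the question asks about (server in the daemon process, client fired up from the CLI, both defined in one script) can be sketched with nothing but the standard library's asyncio streams. This is plain TCP framing, not real WebSocket; once websockets is installed, websockets.serve and websockets.connect slot into the same roles. The names handle_client, send_command, and main are illustrative, not from the original post:

```python
import asyncio

async def handle_client(reader, writer):
    """Daemon (A) side: answer one command per line from a CLI or GUI client."""
    line = await reader.readline()
    writer.write(b"ack: " + line)
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def send_command(port, command):
    """CLI (B) side: connect to the daemon, send one command, return the reply."""
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(command.encode() + b"\n")
    await writer.drain()
    reply = await reader.readline()
    writer.close()
    await writer.wait_closed()
    return reply.decode().strip()

async def main():
    # Port 0 lets the OS pick a free port; a real daemon would use a fixed one.
    server = await asyncio.start_server(handle_client, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    reply = await send_command(port, "status")
    server.close()
    await server.wait_closed()
    return reply

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Note that async/await requires Python 3.5+; on the asker's 3.4.3 the equivalent code would use `@asyncio.coroutine` and `yield from`.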

Related

Sending commands to and receiving their output from a running Python application (via remote connection)

I have a long-running server application written in Python 3, which I would like to interactively debug from time to time. For that I would like to run Python commands "within" the server application process, inspect the values of global variables, etc., like in a REPL or the standard Python console.
It seems the Python standard library's code module and its InteractiveConsole class is what I am looking for. I was thinking of running that in a separate thread so that the main application is not blocked while I communicate with it.
However, it seems that class provides interaction via standard input and output. That might not be exactly what I need. Is there a way to make that interactive console listen / connect to a socket and send input and output through this socket, so that I can connect to the console via a TCP connection?
Or is there another, better way to implement my requirement without this code module?

Dronekit Python - Is it possible to send commands from Nodejs?

I'm trying to use DroneKit Python 2 for creating a minimalistic GCS (ground station). From the examples it looks like Python scripts always finish, and the connection with the vehicle is lost. So, is there any way to code a Python script that works like a thread and only exits once it gets a command from Node.js? Node.js has the python-shell module that is supposed to send messages to Python via STDIN. My goal is to run the Python script from Node.js python-shell and then send commands to DroneKit (connect, arm, takeoff, etc.). Thanks for your help!
I believe you can build your GUI in Node.js and use UDP or WebSocket to communicate with the Python code, where you build a DroneKit wrapper exposed over UDP or WebSocket.

Calculation Server Architecture

I have a Rails app. This app takes parameters from the users and sends them to what I think would be called a slave application server that runs heavy calculations in Python. This slave server is located on the same physical box and runs Python's SimpleHTTPServer (just a basic web server).
I set up the slave to receive commands through POST requests and run the calculations. Is it appropriate for the Python server to receive these requests through GET/POST even though it is on the same box, or should I be using another protocol?
**Note: I have looked into rubypython and other direct connectors, but I need a separate app server to run the calculations.

Communicate with long-running Python program

I have a long-running Python program on a server that already listens for messages on one serial port and forwards them out another serial port.
What do I need to do to allow that program to accept data from a web server (that ultimately gets that data from a web browser on a laptop)?
The options I've seen so far are:
- Flask.
- The solution at "Communicating with python program running on server" doesn't seem to work for me, because (I may be doing this wrong) the long-running Python program can't seem to grab port 80, I guess because the web server is already running on port 80 (serving other pages).
- Have a CGI script write the data to a file, and have the long-running script read the data from that file. I'm a little reluctant to do this on a system where flash wear-out may be a concern.
- Somehow (?) convert the long-running script to a FastCGI script that includes everything it used to do plus new stuff to accept data from the web server.
- Somehow (?) convert the long-running script to a WSGI script that includes everything it used to do plus new stuff to accept data from the web server.
- Write a brief web script that the web server starts up, which communicates with the long-running script using asynchat / asyncore / sockets / Twisted. Those seem designed for communication between two different computers, and so feel like overkill between a long-running Python script and a web server (perhaps with a short-lived CGI or FastCGI script between them) running on the same machine.
- Perhaps some other option?
Is there a standard "pythonic" way for a web server to hand off data to a Python program that is already up and running? (Rather than the much more common case of a web server starting a Python program and handing off data to that freshly-started program.)
(Details that I suspect aren't relevant: my server runs Lighttpd on Ubuntu Linux running on a Beaglebone Black).
(Perhaps this question should be moved to https://softwareengineering.stackexchange.com/ ?)
You could set up your Python process to use any other port (e.g. 8091). Then configure your web server to forward certain (or all) requests to that port using ProxyPass. Example for Apache:
<VirtualHost yourdomain.for.python.thread>
ServerName localhost
ServerAdmin webmaster@example.com
ProxyRequests off
ProxyPass / http://127.0.0.1:8091/
</VirtualHost>
I've done this before to quickly get a Django server in development mode to show pages via a web server. If you actually want to serve HTML content, this is not the most efficient way to go.

What does Tornado do with active requests when it is stopped?

Question pretty much says it all. If I am running Tornado on a server with Supervisor, what happens to active requests when I deploy code and need to restart the Tornado server? Are they dropped mid-request? Are they allowed to finish?
Supervisord sends a signal such as HUP or TERM to the Tornado process; the important point is how Tornado deals with it.
Unfortunately, Tornado will simply exit when it gets a signal like HUP, TERM, or INT, dropping active requests mid-flight.
Tornado has a submodule named autoreload that lets an application detect changes to its code files and reload itself, but it only works in debug mode with a single process, and not in WSGI applications; it is a development tool.
However, you can define a function that calls tornado.autoreload._reload manually and register it for the HUP signal; tornado.autoreload.add_reload_hook can register functions to be called on reload.
Because Tornado does not manage processes well in fork mode, the usual advice is to run several independent processes on different ports. In that mode, _reload works as it does with the debug flag set.
In any case, test and benchmark to make sure it behaves well in your application.
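The drain-then-exit behaviour the answer is reaching for can be sketched framework-neutrally with the stdlib signal module (Unix only; the class name GracefulServer is invented for the example — with Tornado you would set the flag from a handler registered via IOLoop.add_callback_from_signal, call the HTTPServer's stop(), and stop the IOLoop once requests have drained):

```python
import os
import signal

class GracefulServer:
    """On SIGTERM, stop accepting new requests and allow exit only once
    the in-flight ones have drained — the behaviour Tornado does not
    provide out of the box."""
    def __init__(self):
        self.active = 0
        self.stopping = False
        signal.signal(signal.SIGTERM, self._on_term)

    def _on_term(self, signum, frame):
        self.stopping = True              # refuse new work from now on

    def start_request(self):
        if self.stopping:
            raise RuntimeError("shutting down, not accepting requests")
        self.active += 1

    def finish_request(self):
        self.active -= 1

    def may_exit(self):
        """True once it is safe to let Supervisor restart the process."""
        return self.stopping and self.active == 0
```

Supervisor's TERM then becomes a request to drain rather than an immediate kill; pair it with a sensible `stopwaitsecs` so Supervisor escalates to KILL only if draining takes too long.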
