Flask-SocketIO server using polling instead of websockets - python

I'm working on a Flask-SocketIO server that works just fine.
However, I'm getting lots of requests like this in my server log:
"GET /socket.io/?EIO=3&transport=polling&t=LBS1TQt HTTP/1.1"
Here's the code I'm working with:
from flask import Flask, render_template, redirect, url_for
from flask_socketio import SocketIO, emit
import json
def load_config():
    # load configuration from a JSON file
    return json.load(open('/etc/geekdj/config.json'))
config = load_config()
geekdj = Flask(__name__)
geekdj.config["DEBUG"] = config["debug"]
geekdj.config["SECRET_KEY"] = config["secret_key"]
geekdj.config.from_envvar("FLASKR_SETTINGS", silent=True)
socketio = SocketIO(geekdj)
@geekdj.route('/')
def index():
    return render_template('index.html')

# SocketIO event handlers
@socketio.on('connect')
def chat_connect():
    print('connected')

@socketio.on('disconnect')
def chat_disconnect():
    print("Client disconnected")

@socketio.on('broadcast')
def chat_broadcast(message):
    print("test")
    emit("chat", {'data': message['data']})

if __name__ == "__main__":
    socketio.run(geekdj, port=8000)
and the JS in index.html:
<script src="//cdn.socket.io/socket.io-1.4.5.js"></script>
<script type="text/javascript" charset="utf-8">
  $(document).ready(function(){
      // the socket.io documentation recommends sending an explicit packet upon connection
      // this is especially important when using the global namespace
      var socket = io.connect('http://localhost:8000');

      socket.on('connection', function(socket) {
          socket.emit('foo', {foo: "bar"});
          socket.join("test");
      });

      socket.on('joined', function(data) {
          console.log('Joined room!');
          console.log(data["room"]);
      });
  });
</script>
I'd prefer to be using actual WebSockets if possible. Does anyone know why Socket.IO is falling back to polling?

Did the first answer work? If so, you should accept it. If not, post your requirements.txt please.
I had the same problem and found the resolution by fully absorbing the documentation page:
The asynchronous services that this package relies on can be selected
among three choices:
eventlet is the best performant option, with support for long-polling and WebSocket transports.
gevent is supported in a number of different configurations. The long-polling transport is fully supported with the gevent package,
but unlike eventlet, gevent does not have native WebSocket support.
To add support for WebSocket there are currently two options.
Installing the gevent-websocket package adds WebSocket support to
gevent or one can use the uWSGI web server, which comes with
WebSocket functionality. The use of gevent is also a performant
option, but slightly lower than eventlet.
The Flask development server based on Werkzeug can be used as well, with the caveat that it lacks the performance of the other two
options, so it should only be used to simplify the development
workflow. This option only supports the long-polling transport.
Basically, I had neither eventlet nor gevent-websocket in my virtual environment. I installed eventlet, and the transport upgrade to WebSocket was near instantaneous! Hope this helps.
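For reference, a minimal sketch of making that choice explicit via the async_mode argument (this argument is optional; with eventlet installed, Flask-SocketIO selects it automatically):
# pip install eventlet
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
# async_mode is optional; 'eventlet' is picked automatically when the package is installed
socketio = SocketIO(app, async_mode='eventlet')

if __name__ == '__main__':
    # socketio.run() now uses eventlet's WSGI server, which supports the WebSocket transport
    socketio.run(app, port=8000)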

I found the solution in this other Q/A.
It turns out that SocketIO sets a cookie with the most recent connection type that worked. In my case, it was polling.
So, I changed the SocketIO connect statement in my JS from
var socket = io.connect('http://localhost:8000');
to
var socket = io.connect(null, {port: 8000, rememberTransport: false});
and now there is activity under the WS (WebSocket) type in the Network tab of the Chrome developer tools (which there wasn't previously).

Related

Python socket.io server error 400 (NodeJS server works)

I'm trying to connect a JavaScript client to a Python websocket server through an Apache2 proxy.
The client is dead simple:
const socket = io({
    transports: ['websocket']
});
I have a NodeJS websocket server and a working Apache2 reverse proxy setup.
Now I want to replace the NodeJS server with a Python server - but none of the example implementations from socket.io work. With each of them, my client reports an "error 400" when setting up the websocket connection.
The Python server examples come from here:
https://github.com/miguelgrinberg/python-socketio/tree/master/examples/server
Error 400 stands for "Bad Request" - but I know that my requests are fine because my NodeJS server understands them.
When not running behind a proxy, all the Python examples work fine.
What could be the problem?
I found the solution - all the Python socket.io server examples that I referred to are not configured to run behind a reverse proxy. The reason is that the socket.io server manages a list of allowed request origins, and the automatic creation of that list fails in the reverse-proxy situation.
This function creates the automatic list of allowed origins (engineio/asyncio_server.py):
def _cors_allowed_origins(self, environ):
    default_origins = []
    if 'wsgi.url_scheme' in environ and 'HTTP_HOST' in environ:
        default_origins.append('{scheme}://{host}'.format(
            scheme=environ['wsgi.url_scheme'], host=environ['HTTP_HOST']))
        if 'HTTP_X_FORWARDED_HOST' in environ:
            scheme = environ.get(
                'HTTP_X_FORWARDED_PROTO',
                environ['wsgi.url_scheme']).split(',')[0].strip()
            default_origins.append('{scheme}://{host}'.format(
                scheme=scheme, host=environ['HTTP_X_FORWARDED_HOST'].split(
                    ',')[0].strip()))
As you can see, it only adds origins built from {scheme}. Behind a reverse proxy, {scheme} will always be "http", so if the initial request was HTTPS based, its origin will not be in the list of allowed origins.
The solution to this problem is very simple: when creating the socket.io server, you either tell it to allow all origins or specify your origin explicitly:
import socketio
sio = socketio.AsyncServer(cors_allowed_origins="*") # allow all
# or
sio = socketio.AsyncServer(cors_allowed_origins="https://example.com") # allow specific
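The same keyword argument exists on the synchronous server class as well; a sketch, assuming you are using the WSGI variant rather than AsyncServer (https://example.com is just a placeholder origin):
import socketio

# the standard (WSGI) server accepts the same cors_allowed_origins argument
sio = socketio.Server(cors_allowed_origins=["https://example.com"])
app = socketio.WSGIApp(sio)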

How to serve Flask With Http Server [duplicate]

This question already has answers here:
Are a WSGI server and HTTP server required to serve a Flask app? (3 answers)
How to serve static files in Flask (24 answers)
How to run functions in parallel? (8 answers)
Closed 3 years ago.
I would like to develop an app that uses both Flask and httpd.
Flask serves the HTML-related files, and httpd serves local files.
The idea is to browse the local files served by httpd from the Flask HTML pages.
Although Flask and httpd use different port numbers, the httpd server side does not seem to be running: a "connection refused" error occurs when connecting to the httpd server.
Edit - the intent of the question:
I want to run Flask's built-in web server and Python's HTTPServer simultaneously from a single script.
This is only for viewing the files myself, not for exposing anything to the network.
I'm looking for a mechanism that can be done entirely within the app.py script, without using a WSGI server.
Edit - additional information:
This question uses Flask and Python's HTTPServer, but using the Node.js http-server instead of Python's HTTPServer seems to work well
(with run() commented out).
I would like to stay entirely in Python if possible, without using the Node.js http-server.
https://www.npmjs.com/package/http-server
C:\Users\User\Videos>http-server
Starting up http-server, serving ./
Available on:
http://127.0.0.1:8080
Hit CTRL-C to stop the server
...
version
Flask==1.0.2
Python 3.7
Can I not start each server with the following code?
templates/index.html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title></title>
</head>
<body>
<video src="http://localhost:8080/video.mp4"></video>
</body>
</html>
python(app.py)
from http.server import SimpleHTTPRequestHandler, HTTPServer
from flask import Flask, render_template
app = Flask(__name__)
@app.route('/')
def hello_world():
    return render_template('index.html')

class Handler(SimpleHTTPRequestHandler):
    def __init__(self, *args, directory=None, **kwargs):
        super().__init__(*args,
                         directory=r'C:\Users\User\Videos',
                         **kwargs)

def run(server_class=HTTPServer, handler_class=Handler):
    server_address = ('localhost', 8000)
    httpd = server_class(server_address, handler_class)
    httpd.serve_forever()

if __name__ == '__main__':
    app.run(host='localhost', port=5000)
    run()
I may not have explained this well. I'm sorry.
Thank you very much.
Can I not start each server with the following code?
Yes, there are many other ways.
WSGI
WSGI, which stands for Web Server Gateway Interface, is defined in PEP 333:
This document specifies a proposed standard interface between web
servers and Python web applications or frameworks, to promote web
application portability across a variety of web servers.
framework side
Flask, Django and many other frameworks all implement this interface. So when you write an app in Flask, the app object implements WSGI, and any web server that knows how to serve a WSGI app can serve it.
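As a small illustration (not from the original answer): the app object created by Flask is itself the WSGI callable, so even the reference wsgiref server from the standard library can serve it:
# minimal sketch: serving a Flask app through plain WSGI with wsgiref
from wsgiref.simple_server import make_server
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return 'served through a generic WSGI server'

# any WSGI-capable server could stand in for make_server here
make_server('localhost', 5000, app).serve_forever()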
web server side
There are many choices and you can find more at wsgi.org:
gunicorn
uWSGI
...
Basically, you can choose any of these to start your server and use httpd to proxy requests to it. Or you can use mod_wsgi:
mod_wsgi is an Apache module that embeds a Python application within
the server and allow them to communicate through the Python WSGI
interface as defined in the Python PEP 333.
Note
The bundled server in Flask is not suitable for production use; you can see this question for more details.
For Updated Question
I want to run Flask's built-in web server and HTTPServer
simultaneously from a script.
Just Run Shell Command
You can start another process and call shell command using Popen:
if __name__ == '__main__':
    from subprocess import Popen
    # launch the standard-library file server as a separate process
    p = Popen('python -m http.server', shell=True)
    app.run(host='localhost', port=5000)
Use whatever command you like, you can even start a node http server here.
Use Thread/Process
You can start http server in another thread or process, here I use threading for example:
if __name__ == '__main__':
    from threading import Thread
    # run the plain HTTPServer (the run() function above) in a background thread
    Thread(target=run, daemon=True).start()
    app.run(host='localhost', port=5000)
Use Flask To Serve File
Instead of starting two servers binding to two ports, you can actually use flask to serve file as well. In this way, you only start one server binding to one port:
from flask import send_from_directory

@app.route('/videos/<path:filename>')
def download_file(filename):
    return send_from_directory(r'C:\Users\User\Videos',
                               filename, as_attachment=True)
You can see documentation for more details.
app.run() is a blocking call; the lines below it will never be executed.
Run your servers from separate files, or in different threads/processes.

Python Twisted Authentication Without HTTP Authentication

I'm setting up a Flask server, and want to use a Twisted ReverseProxyResource to proxy over another local server. In Flask, I have a current_user.is_authenticated boolean which I use to protect pages. How can I lock the ReverseProxyResource using this variable, so that it cannot be accessed when a user is logged out? The twisted.web.guard module requires me to set up another HTTP authentication system entirely, and the library doesn't seem to have any other built-in solution.
I've set up some demo code (based on a previous question) attempting to place the Flask server inside the Twisted reactor. I'm using the ReverseProxyResource to allow for two-way communication with a Shellinabox server on port 4200.
from flask import Flask
from twisted.internet import reactor
from twisted.web.proxy import ReverseProxyResource
from twisted.web.resource import Resource
from twisted.web.server import Site
from twisted.web.wsgi import WSGIResource
app = Flask(__name__)
@app.route('/example')
def index():
    return 'Flask within Twisted?'

flask_site = WSGIResource(reactor, reactor.getThreadPool(), app)
root = Resource()
site_example = ReverseProxyResource('localhost', 4200, b'')
root.putChild(b'ssh', site_example)

reactor.listenTCP(8081, Site(root))
reactor.run()
I'd be okay switching to another reverse proxy, but only Twisted has worked so far. I'm also not wanting to switch to the Klein framework due to what I already have established in Flask.

Flask as proxy doesn't work when deployed to tornado

I wrote a Flask app as a kind of proxy, to analyse the data passing through it and provide a web page where I can see the result. Everything seemed to go well when using the default development server that comes with Flask, i.e. using:
app.run()
But when I tried to deploy the app to a server, for example tornado or wsgiref.simple_server from the Python standard library, using:
from tornado.wsgi import WSGIContainer
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from app import app
http_server = HTTPServer(WSGIContainer(app))
http_server.listen(5000)
IOLoop.instance().start()
or
from wsgiref.simple_server import make_server
from app import app
httpd = make_server('', 5000, app)
httpd.serve_forever()
Both of these result in 404 errors for requests that previously got 200 OK.
The requests my app gets, since it serves as a proxy, are with absolute urls in the request lines like POST http://example.com/test HTTP/1.1. When I'm using the development server, this request is handled by the function registered under /test normally, something like:
@app.route('/test', methods=['GET', 'POST'])
def handle_test():
    ...
and the request.url, as I checked, is http://example.com/test.
When using the other two ways, the request is handled by the error handler with code 404, and the handle_test() function is never invoked. The request.url, which seems to cause the problem, is http://example.com/http://example.com/test, definitely not what I want.
So I want to know:
1. What changed the URL to the wrong one, and when this happens.
2. Why the app behaves differently on the default development server and on servers like tornado.
3. And of course, how to get rid of this problem.
Thanks!
Tornado does not currently support proxy-style requests; see https://github.com/tornadoweb/tornado/issues/1036
Tornado's WSGIContainer is also a poor choice for proxies because of its single-threaded concurrency model (see http://www.tornadoweb.org/en/stable/wsgi.html#tornado.wsgi.WSGIContainer). Even if the aforementioned bug were fixed, your proxy would perform poorly. I recommend either using a multithreaded WSGI server (like gunicorn or uwsgi, although I don't know whether they support proxy-style requests) or rewriting the proxy as a native Tornado app (without flask) to take advantage of asynchronous features.
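If you go the native-Tornado route, a rough sketch of the idea might look like the following (the upstream URL and handler name are placeholders, and this ignores headers, errors and non-GET methods):
# minimal sketch of forwarding requests with Tornado's async HTTP client
import tornado.ioloop
import tornado.web
from tornado.httpclient import AsyncHTTPClient

class ProxyHandler(tornado.web.RequestHandler):
    async def get(self, path):
        client = AsyncHTTPClient()
        # forward the path to the upstream server without blocking the IOLoop
        upstream = await client.fetch('http://example.com/' + path, raise_error=False)
        self.set_status(upstream.code)
        self.write(upstream.body or b'')

if __name__ == '__main__':
    app = tornado.web.Application([(r'/(.*)', ProxyHandler)])
    app.listen(5000)
    tornado.ioloop.IOLoop.current().start()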

Python Web Server

I want a simple python web server for the following use case:
I want to write a simple server that will accept HTTP requests from my application running on Google App Engine.
The server will accept HTTP requests, and then send iphone notifications. (Basically, I need this extra server to account for the lack of socket support in google app engine).
I guess I need the server to be able to maintain this persistent connection with Apple's Push Notification Service. So I'll need to have some sort of thread always open for this. So I need some sort of web server that can accept the request pass it off to the other thread with the persistent connection to APNS.
Maybe multiple processes and one of Python's queuing tools to communicate between them? Accept the HTTP request, then enqueue a message to the other process?
I was wondering what someone with a bit of experience would suggest. I'm starting to think that maybe even writing my own simple server is a good option (http://fragments.turtlemeat.com/pythonwebserver.php).
One option would be the (appropriately named) SimpleHTTPServer, which is part of the Python standard library. Another, more flexible but more complicated option would be to write your server in Twisted.
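SimpleHTTPServer only serves static files, but the same standard-library module family can also handle the incoming notification requests; a rough sketch (Python 3 spelling, and the handler name is illustrative):
from http.server import BaseHTTPRequestHandler, HTTPServer

class NotifyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        payload = self.rfile.read(length)  # hand this off to the thread holding the APNS connection
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'queued')

if __name__ == '__main__':
    HTTPServer(('0.0.0.0', 8000), NotifyHandler).serve_forever()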
I've been writing simple http servers using gevent and bottle -- an example:
#!/usr/bin/env python
import os

import gevent.monkey
gevent.monkey.patch_all()

import bottle
bottle.debug(True)

import gevent.wsgi
from bottle import route, run, request, response, static_file, abort

@route('/echo')
def echo():
    s = request.GET.get('s', 'o hai')
    return '<html><head><title>echo server</title></head><body>%s</body></html>\r\n' % (s)

@route('/static/:filename')
def send_static(filename):
    root = os.getcwd() + '/static'
    return static_file(filename, root=root)

if __name__ == '__main__':
    app = bottle.app()
    wsgi_server = gevent.wsgi.WSGIServer(('0.0.0.0', 8000), app)
    print 'Starting wsgi server on port 8000'
    wsgi_server.serve_forever()
So you could write a simple server that sticks a job into a Queue (see gevent.queue) and have another worker greenlet that handles reading requests from the queue and processing them...
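A rough sketch of that queue/worker layout, building on the bottle example above (names such as apns_worker are illustrative placeholders, not an actual APNS client):
import gevent
from gevent.queue import Queue
from bottle import route, request

notifications = Queue()

def apns_worker():
    # placeholder for the greenlet that would hold the persistent APNS connection
    while True:
        message = notifications.get()  # yields cooperatively until a job is queued
        print('would push to APNS: %r' % (message,))

gevent.spawn(apns_worker)

@route('/notify')
def notify():
    notifications.put(request.GET.get('msg', ''))
    return 'queued\n'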
