Can I create an HTTP server without using
python -m http.server [port number]
i.e. in an old-school style, with raw sockets and such?
Latest code and error...
import socketserver

response = """HTTP/1.0 500 Internal Server Error
Content-type: text/html
Invalid Server Error"""

class MyTCPHandler(socketserver.BaseRequestHandler):
    """
    The RequestHandler class for our server.

    It is instantiated once per connection to the server, and must
    override the handle() method to implement communication to the
    client.
    """

    def handle(self):
        # self.request is the TCP socket connected to the client
        self.data = self.request.recv(1024).strip()
        self.request.sendall(response)

if __name__ == "__main__":
    HOST, PORT = "localhost", 8000
    server = socketserver.TCPServer((HOST, PORT), MyTCPHandler)
    server.serve_forever()
TypeError: 'str' does not support the buffer interface
Yes, you can, but it's a terrible idea -- in fact, even http.server is at best a toy implementation.
You're better off writing whatever webapp you want as a standard WSGI application (most Python web frameworks do that -- Django, Pyramid, Flask...), and serving it with one of the dozens of production-grade HTTP servers that exist for Python.
uWSGI (https://uwsgi-docs.readthedocs.org/en/latest/) is my personal favorite, with Gevent a close second.
If you want more info about how it's done, I recommend that you read the source code of the CherryPy server (http://www.cherrypy.org/). While not as powerful as the aforementioned uWSGI, it's a good reference implementation written in pure Python that serves WSGI apps through a thread pool.
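For context, a WSGI application is just a callable that takes environ and start_response; here is a minimal sketch (the stdlib wsgiref server at the end is only for quick local testing, not production):
# a minimal WSGI application: any WSGI-compliant server (uWSGI, gevent,
# CherryPy's server, or wsgiref for a quick test) can host this callable
def application(environ, start_response):
    body = b"Hello, world!"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # stdlib test server only; swap in a production-grade server for real use
    from wsgiref.simple_server import make_server
    make_server("localhost", 8000, application).serve_forever()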
Sure you can, and servers like Tornado already do it this way.
For simple test servers which can do only HTTP/1.0 GET requests and handle only a single request at a time, it should not be that hard once you understand the basics of the HTTP protocol. But if you care even a bit about performance, it gets complex fast.
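To illustrate, and to address the TypeError above (in Python 3 a socket wants bytes, not str), here is a minimal sketch of a single-request-at-a-time HTTP/1.0 server on a plain socket; the port and response body are arbitrary choices:
import socket

# minimal single-threaded HTTP/1.0 server on a raw socket
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("localhost", 8000))
listener.listen(1)

while True:
    conn, addr = listener.accept()
    request = conn.recv(1024)          # read (part of) the request; ignored here
    body = b"<html>Hello, world!</html>"
    response = (b"HTTP/1.0 200 OK\r\n"
                b"Content-Type: text/html\r\n"
                b"Content-Length: " + str(len(body)).encode() + b"\r\n"
                b"\r\n" + body)
    conn.sendall(response)             # note: bytes, not str
    conn.close()                       # HTTP/1.0: close after each response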
The API for Python's wsgiref module precludes hop-by-hop headers (as defined in RFC 2616).
I'm unclear on how to get the server to terminate a connection after a response (since there doesn't seem to be a way to add Connection: close).
This problem comes up when testing small WSGI apps and Bottle micro-services. Calls from curl get blocked by open connections from a browser. I have to click a browser refresh to terminate the connection so that the pending curl request can be answered.
Obviously, this should be a server side decision (terminate connection after a response) rather than client-side. I'm unclear how to implement this.
This really comes down to the WSGI server you are hosting your framework with. The best solution with Bottle is to run it through gevent.
# imports assumed for this snippet
import signal
import gevent
import bottle
from gevent.pywsgi import WSGIServer              # gevent's WSGI server
from beaker.middleware import SessionMiddleware   # only if you use beaker sessions
from whitenoise import WhiteNoise                 # only if whitenoise serves your static files

# mainappRoute, beakerconfig and staticfolder are assumed to be defined elsewhere
botapp = bottle.app()
for Route in (mainappRoute,):  # handle multiple files containing routes
    botapp.merge(Route)
botapp = SessionMiddleware(botapp, beakerconfig)  # in case you are using beaker sessions
botapp = WhiteNoise(botapp)  # in case you want whitenoise to handle static files
botapp.add_files(staticfolder, prefix='static/')  # add static route to whitenoise
server = WSGIServer(("0.0.0.0", int(80)), botapp)  # gevent async web server

def shutdown():
    print('Shutting down ...')
    server.stop(timeout=60)
    exit(signal.SIGTERM)

gevent.signal(signal.SIGTERM, shutdown)
gevent.signal(signal.SIGINT, shutdown)  # CTRL C
server.serve_forever()  # spawn the server
You can remove the whitenoise and beaker pieces if they aren't necessary; I kept them there as an example, and as a suggestion that you use them if this is outward facing.
This is purely asynchronous on every connection.
I'm working on a scientific experiment where about two dozen participants play a turn-based game with/against each other. Right now, it's a Python web app with a WSGI interface. I'd like to improve the usability with websockets: when all players have finished their turns, I'd like to notify all clients to update their status. Right now, everyone has to either wait for the turn timeout, or continually reload and wait for the "turn is still in progress" error message not to appear again (busy waiting, effectively).
I read through multiple websocket libraries' documentation and I understand how websockets work, but I'm not sure about the architecture for mixing WSGI and websockets: Can I have a websockets and a WSGI server in the same process (and if so, how, using really any websockets library) and just call my_websocket.send_message() from a WSGI handler, or should I have a separate websockets server and do some IPC? Or should I not mix them at all?
edit, 6 months later: I ended up starting a separate websockets server process (using Autobahn), instead of integrating it with the WSGI server. The reason was that it's much easier and cleaner to separate the two of them, and talking to the websockets server from the WSGI process (server-to-server communication) was straightforward and worked on the first attempt using websocket-client.
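A minimal sketch of that kind of server-to-server call with websocket-client (the URL and message format here are made up; use whatever your separate websockets process expects):
# sketch: notify the separate websockets server from a WSGI request handler
import json
from websocket import create_connection  # pip install websocket-client

def notify_turn_finished(game_id):
    # hypothetical endpoint; the websockets server must be listening here
    ws = create_connection("ws://localhost:9000/notify")
    ws.send(json.dumps({"event": "turn_finished", "game": game_id}))
    ws.close()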
Here is an example that does what you want:
https://github.com/tavendo/AutobahnPython/tree/master/examples/twisted/websocket/echo_wsgi
It runs a WSGI web app (Flask-based in this case, but can be anything WSGI conforming) plus a WebSocket server under 1 server and 1 port.
You can send WS messages from within Web handlers. Autobahn also provides PubSub on top of WebSocket, which greatly simplifies the sending of notifications (via WampServerProtocol.dispatch) like in your case.
http://autobahn.ws/python
Disclosure: I am author of Autobahn and work for Tavendo.
"but I'm not sure about the architecture for mixing WSGI and websockets"
I made something for exactly this: use WSocket.
Simple WSGI HTTP + Websocket server, framework, middleware and app.
Includes:
- Server (WSGI) included - works with any WSGI framework
- Middleware - adds Websocket support to any WSGI framework
- Framework - simple Websocket WSGI web application framework
- App - event-based app for Websocket communication
  (when an external server is used, some clients like Firefox require an HTTP/1.1 server for the Middleware, Framework and App)
- Handler - adds Websocket support to wsgiref (Python's built-in WSGI server)
- Client - coming soon...
Common features:
- only a single file, less than 1000 lines
- websocket sub-protocols supported
- websocket message compression supported (used if the client asks for it)
- receives and sends ping and pong messages (with an automatic pong sender)
- receives and sends binary or text messages
- works for messages with or without mask
- closing messages supported
- auto and manual close
Example using the Bottle web framework and the WSocket middleware:
from bottle import request, Bottle
from wsocket import WSocketApp, WebSocketError, logger, run
from time import sleep

logger.setLevel(10)  # for debugging

bottle = Bottle()
app = WSocketApp(bottle)
# app = WSocketApp(bottle, "WAMP")

@bottle.route("/")
def handle_websocket():
    wsock = request.environ.get("wsgi.websocket")
    if not wsock:
        return "Hello World!"

    while True:
        try:
            message = wsock.receive()
            if message is not None:
                print("participator : " + message)
                wsock.send("you : " + message)
                sleep(2)
                wsock.send("you : " + message)
        except WebSocketError:
            break

run(app)
I'm new to using Cloud9 IDE (c9) and so far it looks great, except for a few minor things.
I see from the docs that to start up a simple node.js http server, you have to pass in process.env.PORT in place of the regular port such as "8080".
Node Hello World example:
var http = require('http');
http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(process.env.PORT, process.env.IP);
What I want to know is: on c9, can you only start services on ports using JavaScript / Node.js? Or do other languages work just as well, perhaps with some other method of passing the port? Specifically Python + Twisted?
I uploaded some Twisted code that was working locally for me, but it wouldn't work on c9 because it was trying to access local ports (which are already in use). Here is the error:
twisted.internet.error.CannotListenError: Couldn't listen on any:8080: [Errno 98] Address already in use.
How would one make the following example work, if even possible, running on c9?
Python + Twisted Hello World example:
from twisted.web import server, resource
from twisted.internet import reactor

class Simple(resource.Resource):
    isLeaf = True
    def render_GET(self, request):
        return "<html>Hello, world!</html>"

site = server.Site(Simple())
reactor.listenTCP(8080, site)
reactor.run()
Initial searches through the documentation and github issues did not turn much up. I'm hoping this is possible and I just missed the right parameter to pass.
Edit: Updated output below
Node Code
console.log(process.env.PORT)
console.log(process.env.IP)
Terminal output
Running Node Process
Tip: you can access long running processes, like a server, at 'http://private-cloud.mrchampe.c9.io'.
Important: in your scripts, use 'process.env.PORT' as port and 'process.env.IP' as host.
8080
127.6.70.129
Python Code
import os
print os.environ["PORT"]
print os.environ["IP"]
Terminal output
Running Python Process
8080
127.6.70.129
Twisted code
import os

import twisted
from twisted.web import server, resource
from twisted.internet import reactor

class Simple(resource.Resource):
    isLeaf = True
    def render_GET(self, request):
        return "<html>Hello, world!</html>"

site = server.Site(Simple())
reactor.listenTCP(int(os.environ["PORT"]), interface=os.environ["IP"])
reactor.run()
Terminal Output
Running Python Process
hello world
Traceback (most recent call last):
File "python/hello.py", line 17, in <module>
reactor.listenTCP(int(os.environ["PORT"]), interface=os.environ["IP"])
TypeError: listenTCP() takes at least 3 non-keyword arguments (2 given)
The listenTCP TypeError is strange because two arguments work locally but not on Cloud9. I don't see why using these arguments does not work.
I have the above code hosted on this public Cloud9 project for anyone to take a look. Thanks!
process.env.PORT and process.env.IP from Node.js sound like os.environ["PORT"] and os.environ["IP"] in Python. Perhaps you can try:
reactor.listenTCP(int(os.environ["PORT"]), site, interface=os.environ["IP"])
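Put together with the question's example, the whole script would look roughly like this (a sketch; it assumes the PORT and IP environment variables are set, as the earlier output shows):
import os
from twisted.web import server, resource
from twisted.internet import reactor

class Simple(resource.Resource):
    isLeaf = True
    def render_GET(self, request):
        return "<html>Hello, world!</html>"

site = server.Site(Simple())
# pass the site as the factory, and bind to the host c9 expects
reactor.listenTCP(int(os.environ["PORT"]), site, interface=os.environ["IP"])
reactor.run()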
This is probably a limitation of the c9 environment management, so that users don't abuse their service too much. I would assume that they have some level of management of the resources Node.js uses, and thus allow it to open ports.
If that were the case and I had to work with Cloud9, I would probably approach it as follows:
- create a Node.js service which would act as a proxy, listening on twisted's behalf
- create a new reactor with overridden listenTCP and listenUDP methods, which would bind these to the Node.js proxy service.
The proxy would work as follows: Node.js would initially listen on one "management" TCP port. Then, when the twisted service starts up, it would create a TCP connection between itself and Node.js through that port. Whenever listenTCP or listenUDP is called, these commands would be dispatched to the Node.js service, which in return would open the port, and all of the communication through that port would then be proxied to twisted through that existing TCP connection.
It's worth mentioning that Jean-Paul's answer worked for me as well, but I had to use 'address' instead of 'interface':
http_server.listen(int(os.environ.get("PORT")), address=os.environ["IP"])
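That listen(..., address=...) call is the Tornado form; in context it would look roughly like the following sketch (the handler and application here are just placeholders, not part of the original answer):
import os
import tornado.ioloop
import tornado.web
import tornado.httpserver

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello, world!")

application = tornado.web.Application([(r"/", MainHandler)])
http_server = tornado.httpserver.HTTPServer(application)
# bind to the port and address that c9 provides via the environment
http_server.listen(int(os.environ.get("PORT")), address=os.environ["IP"])
tornado.ioloop.IOLoop.instance().start()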
I want a simple python web server for the following use case:
I want to write a simple server that will accept HTTP requests from my application running on Google App Engine.
The server will accept HTTP requests, and then send iphone notifications. (Basically, I need this extra server to account for the lack of socket support in google app engine).
I guess I need the server to be able to maintain this persistent connection with Apple's Push Notification Service. So I'll need to have some sort of thread always open for this. So I need some sort of web server that can accept the request pass it off to the other thread with the persistent connection to APNS.
Maybe multiple processes and one of Python's queuing tools to communicate between them? Accept the HTTP request, then enqueue a message to the other process?
I was wondering what someone with a bit of experience would suggest. I'm starting to think that maybe even writing my own simple server is a good option (http://fragments.turtlemeat.com/pythonwebserver.php).
One option would be the (appropriately named) SimpleHTTPServer, which is part of the Python standard library. Another, more flexible but more complicated option would be to write your server in Twisted.
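Note that SimpleHTTPServer only serves files from the current directory; to accept a request and hand it off to other code, as the question needs, you would subclass a handler from the related BaseHTTPServer module. A rough Python 2 sketch (the handler name, port and queue are hypothetical):
# Python 2 stdlib sketch: accept HTTP requests and hand them to a queue
# (in Python 3 these live in http.server and queue instead)
import BaseHTTPServer
import Queue

notifications = Queue.Queue()  # a worker thread would consume from this

class PushHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.getheader('content-length', 0))
        payload = self.rfile.read(length)
        notifications.put(payload)      # enqueue for the APNS worker
        self.send_response(202)
        self.end_headers()

if __name__ == '__main__':
    BaseHTTPServer.HTTPServer(('0.0.0.0', 8080), PushHandler).serve_forever()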
I've been writing simple http servers using gevent and bottle -- an example:
#!/usr/bin/env python

import os

import gevent.monkey
gevent.monkey.patch_all()

import bottle
bottle.debug(True)

import gevent.wsgi
from bottle import route, run, request, response, static_file, abort

@route('/echo')
def echo():
    s = request.GET.get('s', 'o hai')
    return '<html><head><title>echo server</title></head><body>%s</body></html>\r\n' % (s)

@route('/static/:filename')
def send_static(filename):
    root = os.getcwd() + '/static'
    return static_file(filename, root=root)

if __name__ == '__main__':
    app = bottle.app()
    wsgi_server = gevent.wsgi.WSGIServer(('0.0.0.0', 8000), app)
    print 'Starting wsgi server on port 8000'
    wsgi_server.serve_forever()
So you could write a simple server that sticks a job into a Queue (see gevent.queue) and have another worker greenlet that handles reading requests from the queue and processing them...
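A rough sketch of that queue-plus-worker pattern (the /notify route and the send_to_apns function are placeholders for whatever your APNS code actually does):
import gevent
from gevent.queue import Queue
from bottle import route, request

jobs = Queue()

@route('/notify', method='POST')
def notify():
    jobs.put(request.body.read())  # hand the payload to the worker
    return 'queued\n'

def apns_worker():
    # long-lived greenlet holding the persistent APNS connection
    while True:
        payload = jobs.get()       # blocks (cooperatively) until a job arrives
        send_to_apns(payload)      # placeholder for the real push code

gevent.spawn(apns_worker)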
Because cross-domain XML-RPC requests are not possible in JavaScript,
I need to create a Python app which exposes both some HTML over HTTP and an XML-RPC service on the same domain.
Creating an HTTP request handler and a SimpleXMLRPCServer in Python is quite easy,
but they both have to listen on a different port, which means a different domain.
Is there a way to create something that will listen on a single port on localhost
and expose both the HTTP request handler and the XML-RPC request handler?
Right now I have two different services:
httpServer = HTTPServer(('localhost', 8001), HttpHandler)
xmlRpcServer = SimpleXMLRPCServer(('localhost', 8000), requestHandler=RequestHandler)
Update
- I cannot install Apache on the device
- The hosted page will be a single HTML page
- The only client will be the device on which the Python service itself runs
Both of them are subclasses of SocketServer.TCPServer. There must be some way to refactor them so that one server instance can dispatch to both.
An easier alternative may be to keep the HTTPServer in front and proxy XML RPC to the SimpleXMLRPCServer instance.
The solution was actually quite simple, based on Wai Yip Tung's reply:
All I had to do was keep using the SimpleXMLRPCServer instance,
but modify the handler:
class RequestHandler(SimpleXMLRPCRequestHandler):
    rpc_paths = ('/RPC2',)

    def do_GET(self):
        # implementation here
This will cause the handler to respond to GET requests as well as the original POST (XML-RPC) requests.
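Fleshed out, a minimal version might look like this (the HTML body and the echo function are just illustrative placeholders):
from SimpleXMLRPCServer import SimpleXMLRPCServer, SimpleXMLRPCRequestHandler

class RequestHandler(SimpleXMLRPCRequestHandler):
    rpc_paths = ('/RPC2',)  # XML-RPC POSTs are only accepted on this path

    def do_GET(self):
        # serve a single static HTML page for every GET request
        body = "<html><body>Hello from the same port as XML-RPC</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

server = SimpleXMLRPCServer(('localhost', 8000), requestHandler=RequestHandler)
server.register_function(lambda s: s, 'echo')  # example RPC method
server.serve_forever()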
Using HTTPServer for serving content is not a good idea. You should use a web server like Apache and use Python as CGI (or a more advanced interface like mod_wsgi).
Then the web server runs on one port, you can serve HTML directly from it, and you can write as many CGI scripts as you like in Python, for example one that handles XML-RPC requests using CGIXMLRPCRequestHandler.
from SimpleXMLRPCServer import CGIXMLRPCRequestHandler

class MyFuncs:
    def div(self, x, y):
        return x // y

handler = CGIXMLRPCRequestHandler()
handler.register_function(pow)
handler.register_function(lambda x, y: x + y, 'add')
handler.register_introspection_functions()
handler.register_instance(MyFuncs())
handler.handle_request()
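Calling such a CGI endpoint from the client side is then plain XML-RPC; a sketch (the /cgi-bin/rpc.py path is just an assumed location for the script above):
# Python 2 client sketch (use xmlrpc.client in Python 3)
import xmlrpclib

proxy = xmlrpclib.ServerProxy("http://localhost/cgi-bin/rpc.py")
print proxy.add(2, 3)   # -> 5
print proxy.div(10, 3)  # -> 3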