The following code works fine with python.exe but fails with pythonw.exe. I'm using Python 3.1 on Windows 7.
from http.server import BaseHTTPRequestHandler, HTTPServer

class FooHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers['Content-Length'])
        data = self.rfile.read(length)
        print(data)
        self.send_response(200)
        self.send_header('Content-Length', '0')
        self.end_headers()

httpd = HTTPServer(('localhost', 8000), FooHandler)
httpd.serve_forever()
Something goes wrong when I start sending responses: nothing gets written back, and if I try another HTTP connection it won't connect. I also tried using self.wfile, but no luck either.
You are printing to stdout. pythonw.exe doesn't have a stdout, as it is not connected to a terminal. My guess is that this has something to do with it.
Try redirecting stdout to a file, or, quicker, just remove the print().
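If you want to keep the print() for debugging, a minimal sketch of the redirect approach could look like this (the log file name is just an example):

import sys

# Under pythonw.exe there is no usable stdout/stderr, so point them at a file.
log = open('server.log', 'a', buffering=1)
sys.stdout = log
sys.stderr = log  # BaseHTTPRequestHandler also logs each request to stderr

from http.server import BaseHTTPRequestHandler, HTTPServer

class FooHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers['Content-Length'])
        data = self.rfile.read(length)
        print(data)  # now ends up in server.log instead of a missing console
        self.send_response(200)
        self.send_header('Content-Length', '0')
        self.end_headers()

httpd = HTTPServer(('localhost', 8000), FooHandler)
httpd.serve_forever()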
Related
Why does subprocess.call("xdotool key XF86AudioPlay") in do_GET only last for the duration of my http request?
I am trying to play/pause Spotify on my local machine through an HTTP request, i.e. when a request is received, emulate a keypress.
When I hit localhost:8002, the music plays for ~200 ms, but as soon as the request finishes, it stops.
import http.server
import socketserver
import subprocess

MYCMD = "xdotool key XF86AudioPlay"
PORT = 8002

class MyRequestHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        # Send the html message
        subprocess.call(MYCMD, shell=True)
        self.wfile.write(b'works')
        return

Handler = MyRequestHandler

with socketserver.TCPServer(("", PORT), Handler) as httpd:
    print("serving at port", PORT)
    httpd.serve_forever()
This does not really make sense.
Normally subprocess.call() waits for the called process to finish.
In your example it just simulates pressing a key in the server's X session:
you simulate pressing a key that toggles audio playback.
What I would expect is that you somehow perform two HTTP requests,
the first one starting the audio and the second one stopping it.
Probably the two requests are executed with a delay of about 200 ms.
What I propose is to add the following lines before the subprocess call
with open("/tmp/mylogfile.log", "a") as fout:
    fout.write("call subprocess\n")
and the following two lines after the subprocess call:
with open("/tmp/mylogfile.log", "a") as fout:
    fout.write("called subprocess\n")
I am quite sure that you will see two requests instead of one.
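A slightly expanded sketch of that logging idea (reusing http.server, subprocess and MYCMD from the question; the log path and format are only examples) also records the request path and a timestamp, which makes it obvious whether a second request, e.g. a browser also fetching /favicon.ico, is toggling the audio again:

import time

class MyRequestHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        # Log every incoming request with its path and a timestamp.
        with open("/tmp/mylogfile.log", "a") as fout:
            fout.write("%f GET %s - call subprocess\n" % (time.time(), self.path))
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        subprocess.call(MYCMD, shell=True)
        with open("/tmp/mylogfile.log", "a") as fout:
            fout.write("%f GET %s - called subprocess\n" % (time.time(), self.path))
        self.wfile.write(b'works')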
So I'm just trying to get a GET request to go through from a Node.js app to a Python app, and for some reason the code works when I run it in a Jupyter Notebook but not otherwise.
Here's the Python code.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HTTPServer_RequestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        print('got a request')

server_address = ('localhost', 9000)
httpd = HTTPServer(server_address, HTTPServer_RequestHandler)
print('running server...')
httpd.serve_forever()
and here's the node code
var request = require('request');

request.get({
    uri: 'http://localhost:9000',
    type: 'GET '
}, function(res, err, body){
    console.log(res)
    console.log(err)
    console.log(body)
});
This of course prints an error on the Node side, but the interesting thing is that in a Jupyter Notebook I get the "running server..." output and I also get "got a request" whenever I run the Node code; however, when I run Python normally I don't get any output at all.
Using a very simple BaseHTTPServer (code below) I am experiencing the following problem:
Starting the server in the background using ./testserver.py & works fine and the server responds on port 12121. When I type disown the server still responds. After closing the terminal, the server stops after the next request with an Input/output error in test.log.
Steps to reproduce:
$ ./testserver &
$ bg
$ disown
Close the terminal, send a request to the server -> the server does not respond.
The only solution I found was to redirect the stdout and stderr:
$ ./testserver > /dev/null 2>&1
or, as #Daniel stated, to start it the usual way with nohup.
Has anyone experienced this bug before, or why is it desired behaviour for an HTTPServer to crash if no output is possible?
testserver.py
#! /usr/bin/env python
import time
import BaseHTTPServer

HOST_NAME = '0.0.0.0'
PORT_NUMBER = 12121

class MyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_HEAD(s):
        s.send_response(200)
        s.send_header("Content-type", "text/html")
        s.end_headers()

    def do_GET(s):
        """Respond to a GET request."""
        s.send_response(200)
        s.send_header("Content-type", "text/html")
        s.end_headers()
        s.wfile.write("MANUAL SERVER SUCCESS")

if __name__ == '__main__':
    server_class = BaseHTTPServer.HTTPServer
    httpd = server_class((HOST_NAME, PORT_NUMBER), MyHandler)
    try:
        httpd.serve_forever()
    except Exception as e:
        with open('test.log', 'w') as f:
            f.write(str(e))
You can only write strings to files, not exceptions. And you have to redirect stdout and stderr somewhere, otherwise any output will hang your program. Why don't you use nohup? That's the normal way to start programs without a terminal.
To make this clear: there is no special behavior in HTTPServer. HTTPServer writes log information to the terminal, so you need a terminal or a redirection to a file.
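As a concrete example (the log file name is only illustrative), the usual invocation would be something like:
$ nohup ./testserver.py > testserver.log 2>&1 &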
I have the strange problem that the response to a request in SimpleHTTPRequestHandler is blocked if a new process is spawned by subprocess.Popen inside the request handler. But shouldn't Popen be asynchronous?
The behavior is reproducible with the following files, using Python 2.6.7 on my OS X machine as well as ActivePython 2.6 on a SLES machine:
webserver.py:
#!/usr/bin/env python
import SimpleHTTPServer
import SocketServer
import subprocess
import uuid

class MyRequestHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_POST(self):
        print 'in do_POST'
        uuid_gen = str(uuid.uuid1())
        subprocess.Popen(['python', 'sleep.py'])
        print 'subprocess spawned'
        return self.wfile.write(uuid_gen)

Handler = MyRequestHandler
server = SocketServer.TCPServer(('127.0.0.1', 9019), Handler)
server.serve_forever()
sleep.py:
import time
time.sleep(10)
When I now start the webserver and then do a curl POST request to localhost:9019, the webserver instantly prints:
$python2.6 webserver2.py
in do_POST
subprocess spawned
but on the console where I do the curl request it shows the following behavior:
$curl -X POST http://127.0.0.1:9019/
<wait ~10 seconds>
cd8ee24a-0ad7-11e3-a361-34159e02ccec
When I run the same setup with Python 2.7 the answer arrives on the curl site instantly.
How can this happen, since Popen doesn't seem to block the print, but just the return?
The problem is that I'm bound to Python 2.6 for legacy reasons, so what would be the best workaround for this behavior to let the request return instantly?
Ok, so after Nikhil referred me to this question https://stackoverflow.com/questions/3973789/... I figured out that I need self.send_response(200), self.send_header("Content-Length", str(len(uuid_gen))) and self.end_headers().
This means the working code looks like this:
#!/usr/bin/env python
import SimpleHTTPServer
import SocketServer
import subprocess
import uuid

class MyRequestHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_POST(self):
        print 'in do_POST'
        uuid_gen = str(uuid.uuid1())
        subprocess.Popen(['python', 'sleep.py'])
        print 'subprocess spawned'
        self.send_response(200)
        self.send_header("Content-Length", str(len(uuid_gen)))
        self.end_headers()
        self.wfile.write(uuid_gen)

Handler = MyRequestHandler
server = SocketServer.TCPServer(('127.0.0.1', 9019), Handler)
server.serve_forever()
The problem seemed to be that the HTTP connection is left open for pipelining, even though I'm still not quite sure why the response then arrives exactly when the Popen process finishes (presumably because, without a Content-Length, curl waits for the connection to close, and the spawned child inherits the open socket, so the connection only closes once sleep.py exits).
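If that inherited-socket explanation is right, an alternative workaround (a sketch, not verified against Python 2.6 here) would be to keep the child from inheriting the handler's file descriptors by replacing the Popen line in the handler above with:

# Sketch: close_fds=True stops the child from inheriting the client socket,
# so the connection can be closed as soon as the handler returns.
subprocess.Popen(['python', 'sleep.py'], close_fds=True)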
I'm playing a little with Python 3.2.2 and want to write a simple web server to access some data remotely. This data will be generated by Python so I don't want to use the SimpleHTTPRequestHandler as it's a file server, but a handler of my own.
I copied an example from the internet, but I'm stuck because the response output stream refuses to write the body content.
This is my code:
import http.server
import socketserver

PORT = 8000

class MyHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(200)
        self.send_header("Content-type", "text/html")
        self.end_headers()

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/html")
        self.end_headers()
        print(self.wfile)
        self.wfile.write("<html><head><title>Title goes here.</title></head>")
        self.wfile.write("<body><p>This is a test.</p>")
        # If someone went to "http://something.somewhere.net/foo/bar/",
        # then s.path equals "/foo/bar/".
        self.wfile.write("<p>You accessed path: %s</p>" % self.path)
        self.wfile.write("</body></html>")
        self.wfile.close()

try:
    server = http.server.HTTPServer(('localhost', PORT), MyHandler)
    print('Started http server')
    server.serve_forever()
except KeyboardInterrupt:
    print('^C received, shutting down server')
    server.socket.close()
What should be a correct code for writing the response body?
Thanks a lot.
Edit:
The error is:
...
File "server.py", line 16, in do_GET
self.wfile.write("<html><head><title>Title goes here.</title></head>")
File "C:\Python32\lib\socket.py", line 297, in write
return self._sock.send(b)
TypeError: 'str' does not support the buffer interface
In Python 3, str is a different type from the Python 2.x string. Convert it to bytes using either
self.wfile.write(bytes("<html><head><title>Title goes here.</title></head></html>", "utf-8"))
or
self.wfile.write("<html><head><title>Title goes here.</title></head></html>".encode("utf-8"))
For Python 3, prefix the string literals with a b:
self.wfile.write(b"<foo>bar</foo>")
Based on the comments in your code, you're probably looking for self.headers.getheaders('referer'), i.e.:
if 'http://www.icamefromthissite.com/' in self.headers.getheaders('referer'):
    # do something
Just use this when using Python 3.X
self.wfile.write(bytes("<body><p>This is a test.</p>", "utf-8"))
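Putting the pieces together, a sketch of the questioner's do_GET adapted to Python 3 might look like this (building the body as a str, encoding it once, and letting the handler close wfile itself; the Content-Length header is optional here but tidy):

import http.server

PORT = 8000

class MyHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Build the whole body as a str first, then encode it to bytes once.
        body = ("<html><head><title>Title goes here.</title></head>"
                "<body><p>This is a test.</p>"
                "<p>You accessed path: %s</p>"
                "</body></html>" % self.path)
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)
        # No explicit self.wfile.close(); the base handler takes care of that.

server = http.server.HTTPServer(('localhost', PORT), MyHandler)
server.serve_forever()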