I have written this HTTP web server in Python which simply sends the reply "Website Coming Soon!" to the browser/client, but I want the server to send back the URL given by the client. For example, if I request
http://localhost:13555/ChessBoard_x16_y16.bmp
then the server should reply with that same URL instead of the "Website Coming Soon!" message.
How can I do this?
Server Code:
import sys
import http.server
from http.server import HTTPServer
from http.server import SimpleHTTPRequestHandler
#import usb.core

class MyHandler(SimpleHTTPRequestHandler):  # handles client requests (by me)
    #def __init__(self, req, client_addr, server):
    #    SimpleHTTPRequestHandler.__init__(self, req, client_addr, server)

    def do_GET(self):
        response = "Website Coming Soon!"
        self.send_response(200)
        self.send_header("Content-type", "application/json;charset=utf-8")
        self.send_header("Content-length", len(response))
        self.end_headers()
        self.wfile.write(response.encode("utf-8"))
        self.wfile.flush()
        print(response)

HandlerClass = MyHandler
Protocol = "HTTP/1.1"
port = 13555
server_address = ('localhost', port)
HandlerClass.protocol_version = Protocol

try:
    httpd = HTTPServer(server_address, MyHandler)
    print("Server Started")
    httpd.serve_forever()
except:
    print('Shutting down server due to some problems!')
    httpd.socket.close()
You can do what you're asking, sort of, but it's a little complicated.
When a client (e.g., a web browser) connects to your web server, it sends a request that looks like this:
GET /ChessBoard_x16_y16.bmp HTTP/1.1
Host: localhost:13555
This assumes your client is using HTTP/1.1, which is likely true of anything you'll find these days. If you expect HTTP/1.0 or earlier clients, life is much more difficult because there is no Host: header.
Using the value of the Host header and the path passed as an argument to the GET request, you can construct a URL that in many cases will match the URL the client was using.
But it won't necessarily match in all cases:

- There may be a proxy in between the client and your server, in which case both the path and the hostname/port seen by your code may differ from those used by the client.
- There may be packet manipulation rules in place that modify the destination IP address and/or port, so that the connection seen by your code does not match the parameters used by the client.
In your do_GET method, you can access the request headers via the self.headers attribute and the request path via self.path. For example:

def do_GET(self):
    # self.path already begins with '/', so no extra slash is needed
    response = 'http://%s%s' % (self.headers['host'], self.path)
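Built out into a complete handler, the echo looks like the sketch below (adapted from your existing do_GET; the only change beyond the response text is switching Content-type to text/plain, which fits a plain-text body better than the original application/json):

def do_GET(self):
    # Reconstruct the URL the client requested from the Host header
    # and the request path, then echo it back as the response body
    response = 'http://%s%s' % (self.headers['host'], self.path)
    self.send_response(200)
    self.send_header("Content-type", "text/plain;charset=utf-8")
    self.send_header("Content-length", str(len(response)))
    self.end_headers()
    self.wfile.write(response.encode("utf-8"))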
Related
How can I keep my Python HTTP server connected (streaming) to my browser in real time? (Updating an image endlessly, like the Raspberry Pi's motionEye)
import http.server
import socketserver

class MyHttpRequestHandler(http.server.SimpleHTTPRequestHandler):
    def _set_response(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.send_header("Connection", "keep-alive")
        self.send_header("keep-alive", "timeout=999999, max=99999")
        self.end_headers()

    def do_GET(self):
        #self.send_response(204)
        #self.end_headers()
        if self.path == '/':
            self.path = 'abc.jpg'
        return http.server.SimpleHTTPRequestHandler.do_GET(self)

# Create an object of the above class
handler_object = MyHttpRequestHandler

PORT = 8000
my_server = socketserver.TCPServer(("", PORT), handler_object)

# Start the server
my_server.serve_forever()
Just keep writing, as in:

while True:
    self.wfile.write(b"data")

This, however, won't get you into event-stream / server-sent-events territory without external helper libraries, as far as I'm aware.
I came across the same issue, and then found by chance (after much debugging) that you need to send line breaks (\r\n or \n\n) to get the packets actually sent:
import http.server
import time

class MyHttpRequestHandler(http.server.BaseHTTPRequestHandler):
    value = 0

    # One can also set protocol_version = 'HTTP/1.1' here

    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.send_header("Connection", "keep-alive")
        self.end_headers()
        while True:
            self.wfile.write(str(self.value).encode())
            self.wfile.write(b'\r\n')  # Or \n\n, necessary to flush
            self.value += 1
            time.sleep(1)

PORT = 8000
my_server = http.server.HTTPServer(("", PORT), MyHttpRequestHandler)

# Start the server
my_server.serve_forever()
This enables you to send Server-Sent Events (SSE), HTTP long polling, or even JSON/raw HTTP streams with the http.server library.
As the comment in the code says, you can also set the protocol version to HTTP/1.1 to enable keep-alive by default. If you do so, you will have to specify Content-Length for every packet sent, otherwise the connection will never be terminated.
It is probably best to combine this with a threaded server to allow concurrent connections, and perhaps also to set a keep-alive on the socket itself.
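A minimal threaded variant of the server above, assuming Python 3.7+ where http.server ships ThreadingHTTPServer:

from http.server import ThreadingHTTPServer

# Each request is handled in its own thread, so one long-lived
# streaming connection does not block other clients.
my_server = ThreadingHTTPServer(("", PORT), MyHttpRequestHandler)
my_server.serve_forever()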
A little bit of explanation: I have multiple clients and a very simple HTTP server written in Python.
Out of all the clients, one client sends a POST request to the HTTP server with 4 values (let's call this client "client alpha"), and all the remaining clients send an HTTP POST request just to establish a connection to the server (let's call these clients "clients beta"). The reason the beta clients send the request is so that they can receive the values that were sent by client alpha.
from http.server import BaseHTTPRequestHandler, HTTPServer
import logging

class S(BaseHTTPRequestHandler):
    def _set_response(self):
        self.send_response(200)
        self.send_header('Content-type', 'int')
        self.end_headers()

    def breakRequest(self, body):
        # Split a body like "a=1&b=2&c=3&d=4" into its four values
        l = []
        x = body.split("&")
        for i in x:
            a = i.split("=")
            l.append(a[1])
        return l[0], l[1], l[2], l[3]

    def do_POST(self):
        content_length = int(self.headers['Content-Length'])  # <--- Gets the size of data
        post_data = self.rfile.read(content_length)  # <--- Gets the data itself
        var1, var2, var3, var4 = self.breakRequest(post_data.decode('utf-8'))
        if var1 != 'ard':
            s = "\n" + var1 + "\n" + var2 + "\n" + var3 + "\n" + var4 + "\n"
            logging.info(s)
        logging.info("POST request,\nPath: %s\nHeaders:\n%s\n\nBody:\n%s\n",
                     str(self.path), str(self.headers), post_data.decode('utf-8'))
        self._set_response()
        self.wfile.write("1".encode('utf-8'))  # acknowledge with a bare "1"

def run(server_class=HTTPServer, handler_class=S, port=6060):
    logging.basicConfig(level=logging.INFO)
    server_address = ('', port)
    httpd = server_class(server_address, handler_class)
    logging.info('Starting httpd...\n')
    try:
        httpd.serve_forever()
    except KeyboardInterrupt:
        pass
    httpd.server_close()
    logging.info('Stopping httpd...\n')

if __name__ == '__main__':
    from sys import argv
    if len(argv) == 2:
        run(port=int(argv[1]))
    else:
        run()
What client alpha sends:
client alpha sends 4 values, which are stored in var1, var2, var3, var4.
What client beta sends:
client beta sends the HTTP POST request only once, to establish the connection to the server.
What I am trying to achieve:
once a beta client has established the connection to the server, I am trying to make the server store the values received from client alpha into var1, var2, var3, var4 and then send these values out to all the beta clients at once. Once the values have been sent out, wait; when new values are received by the server from client alpha, send these new values to the beta clients.
Every time the IP address of a beta client changes, it sends the request again to re-establish the connection.
Also: I am not very good at Python, and what I currently have is all thanks to Google; I kept searching for examples, implementing and testing them, and ended up with Python code that receives and stores the HTTP POST data into variables.
I will highly appreciate your help. Thanks in advance, and sorry for any mistakes.
You're talking about having the server connected to 4 different clients and PUSHING data to them when a specific event occurs. You are going to need to look at either using WebSockets (https://pypi.org/project/websockets/) or Server-Sent Events (https://medium.com/code-zen/python-generator-and-html-server-sent-events-3cdf14140e56).
Those are the only two methods by which a server can push data to other clients: since the clients stay connected, the server knows that they exist.
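To illustrate the WebSocket route, here is a minimal broadcast sketch using the websockets package (assuming websockets 10.1 or newer, where the handler takes a single argument; the port and handler name are made up for the example):

import asyncio
import websockets

connected = set()  # every currently-connected client

async def handler(websocket):
    connected.add(websocket)
    try:
        async for message in websocket:
            # Relay whatever one client (your "client alpha") sends
            # to every connected client (your "clients beta")
            websockets.broadcast(connected, message)
    finally:
        connected.remove(websocket)

async def main():
    async with websockets.serve(handler, "localhost", 6060):
        await asyncio.Future()  # run forever

asyncio.run(main())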
I am putting together an HTTP POST server/client example in order to send data to and request data from a server that handles multiple connections. I am using the HTTPServer class from the standard library. The code seems to work fine, but the communication slows down randomly. I have checked the traffic with Wireshark and can see some strange messages going by.
I have looked at different solutions on the internet, but I have not found anything unusual in my code.
The client code is just a simple HTTP POST request.
Server code:
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from socketserver import ThreadingMixIn

SV_HOST = 'localhost'  # example values; the original snippet does not show these
SV_PORT = 8000

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        body = self.rfile.read(content_length)
        data = {
            'ids': [5, 6]
        }
        self.send_response(200)
        self.send_header('Content-type', 'application/json')
        self.end_headers()
        self.wfile.write(json.dumps(data).encode())
        return

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    """Handle requests in a separate thread."""

test = HTTPServer((SV_HOST, SV_PORT), Handler)
test.timeout = 5
print('Starting server, use <Ctrl-C> to stop')
test.serve_forever()
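One detail worth noting in the snippet: it instantiates plain HTTPServer, so the ThreadedHTTPServer subclass defined above is never actually used and requests are handled one at a time. To serve them concurrently, the threaded class would be constructed instead:

# Use the ThreadingMixIn subclass so each request runs in its own thread
test = ThreadedHTTPServer((SV_HOST, SV_PORT), Handler)
test.serve_forever()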
Here are the Wireshark messages that I see (screenshot omitted; the capture shows entries flagged as "TCP segment of a reassembled PDU"):
I would appreciate it if someone could clarify what I am doing wrong, if anything. Is "TCP segment of a reassembled PDU" normal?
Short version: Is there any easy API for encoding an HTTP request (and decoding the response) without actually transmitting and receiving the encoded bytes as part of the process?
Long version: I'm writing some embedded software which uses paramiko to open an SSH session with a server. I then need to make an HTTP request across an SSH channel opened with transport.open_channel('direct-tcpip', <remote address>, <source address>).
requests has transport adapters, which let you substitute your own transport. But the send interface provided by BaseAdapter just accepts a PreparedRequest object which (a) doesn't provide the remote address in any useful way; you need to parse the URL to find out the host and port, and (b) doesn't provide an encoded version of the request, only a dictionary of headers and the encoded body (if any). It also gives you no help in decoding the response. HTTPAdapter defers the whole lot, including encoding the request, making the network connection, sending the bytes, receiving the response bytes, and decoding the response, to urllib3.
urllib3 likewise defers to http.client, and http.client's HTTPConnection class has encoding and network operations all jumbled up together.
Is there a simple way to say, "Give me a bunch of bytes to send to an HTTP server," and "Here's a bunch of bytes from an HTTP server; turn them into a useful Python object"?
This is the simplest implementation of this that I can come up with:
from http.client import HTTPConnection
import requests
from requests.structures import CaseInsensitiveDict
from urllib.parse import urlparse
from argparse import ArgumentParser

class TunneledHTTPConnection(HTTPConnection):
    def __init__(self, transport, *args, **kwargs):
        self.ssh_transport = transport
        HTTPConnection.__init__(self, *args, **kwargs)

    def connect(self):
        # Open the "TCP connection" as a channel through the SSH transport
        self.sock = self.ssh_transport.open_channel(
            'direct-tcpip', (self.host, self.port), ('localhost', 0)
        )

class TunneledHTTPAdapter(requests.adapters.BaseAdapter):
    def __init__(self, transport):
        self.transport = transport

    def close(self):
        pass

    def send(self, request, **kwargs):
        scheme, location, path, params, query, anchor = urlparse(request.url)
        if ':' in location:
            host, port = location.split(':')
            port = int(port)
        else:
            host = location
            port = 80
        connection = TunneledHTTPConnection(self.transport, host, port)
        connection.request(method=request.method,
                           url=request.url,
                           body=request.body,
                           headers=request.headers)
        r = connection.getresponse()
        resp = requests.Response()
        resp.status_code = r.status
        resp.headers = CaseInsensitiveDict(r.headers)
        resp.raw = r
        resp.reason = r.reason
        resp.url = request.url
        resp.request = request
        resp.connection = connection
        resp.encoding = requests.utils.get_encoding_from_headers(r.headers)
        requests.cookies.extract_cookies_to_jar(resp.cookies, request, r)
        return resp

if __name__ == '__main__':
    import paramiko

    parser = ArgumentParser()
    parser.add_argument('-p', help='Port the SSH server listens on', default=22)
    parser.add_argument('host', help='SSH server to tunnel through')
    parser.add_argument('username', help='Username on SSH server')
    parser.add_argument('url', help='URL to perform HTTP GET on')
    args = parser.parse_args()

    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect(args.host, args.p, username=args.username)
    transport = client.get_transport()

    s = requests.Session()
    s.mount(args.url, TunneledHTTPAdapter(transport))
    response = s.get(args.url)
    print(response.text)
There are various options to BaseAdapter.send that it doesn't handle, and it completely ignores issues like connection pooling and so on, but it gets the job done.
You could write your own SOCKS4 proxy, run it on localhost, then point your HTTP requests at it. For example, https://urllib3.readthedocs.io/en/latest/advanced-usage.html describes how to use a SOCKS proxy with urllib3.
SOCKS4 is basically a simple handshake followed by raw HTTP/TCP traffic. The handshake conveys the target IP address and port. So your proxy can do the handshake to satisfy the client that it is a SOCKS server, then the proxy can send the "real" traffic straight to the SSH session (and proxy the responses in the reverse direction).
The cool thing about this approach is that it will work with tons of clients, since SOCKS has been widespread for a long time.
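To make the handshake concrete, here is a rough server-side sketch of the SOCKS4 CONNECT exchange (assumptions: CONNECT requests only, no SOCKS4a hostname extension, no short-read or error handling; the function name is made up):

import socket
import struct

def socks4_handshake(client_sock):
    # Request layout: VN(1) CD(1) DSTPORT(2) DSTIP(4) USERID... NUL
    vn, cd, dst_port, dst_ip = struct.unpack('!BBH4s', client_sock.recv(8))
    while client_sock.recv(1) != b'\x00':  # consume USERID up to the NUL
        pass
    # Reply: VN=0, CD=90 (request granted); the port/IP fields in the
    # reply are ignored by the client
    client_sock.sendall(struct.pack('!BBH4s', 0, 90, dst_port, dst_ip))
    # Everything after this point is raw TCP traffic to relay onward
    return socket.inet_ntoa(dst_ip), dst_port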
I found a Python HTTP web server at http://www.linuxjournal.com/content/tech-tip-really-simple-http-server-python
import sys
import BaseHTTPServer
from SimpleHTTPServer import SimpleHTTPRequestHandler

HandlerClass = SimpleHTTPRequestHandler
ServerClass = BaseHTTPServer.HTTPServer
Protocol = "HTTP/1.0"

if sys.argv[1:]:
    port = int(sys.argv[1])
else:
    port = 8000
server_address = ('127.0.0.1', port)

HandlerClass.protocol_version = Protocol
httpd = ServerClass(server_address, HandlerClass)

sa = httpd.socket.getsockname()
print "Serving HTTP on", sa[0], "port", sa[1], "..."
httpd.serve_forever()
If the directory has a file named index.html, that file will be served as the initial file. If there is no index.html, the files in the directory will be listed.
How do I modify the script so that it sends custom HTML to the browser?
As the name and documentation imply, SimpleHTTPServer is dead-simple, and intended to be used as sample code for building your own servers on top of the frameworks in the standard library.
So, if you want to do anything with it, you probably want to copy and modify the source, or just use it as inspiration.
And if you want to do anything serious, you probably want to use a framework made for writing real HTTP servers like tornado or twisted, or just use a stock HTTP server and delegate the dynamic pages to Python via, say, WSGI.
But if you really want to do this, you can. There's nothing stopping you from subclassing SimpleHTTPServer.SimpleHTTPRequestHandler and overriding its methods. For example:
from SimpleHTTPServer import SimpleHTTPRequestHandler
from StringIO import StringIO

class MyHandler(SimpleHTTPRequestHandler):
    def send_head(self):
        if self.translate_path(self.path).endswith('/foo'):
            # gaping_security_hole is a placeholder: build your custom HTML here
            body = gaping_security_hole(self.path)
            self.send_response(200)
            self.send_header("Content-type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            return StringIO(body)
        else:
            # SimpleHTTPRequestHandler is an old-style class in Python 2,
            # so call the base implementation directly rather than via super()
            return SimpleHTTPRequestHandler.send_head(self)
Obviously you can check whatever you want there instead of endswith('/foo'). For example, as you can see from the source, the default implementation checks os.path.isdir, and if it's true checks whether it endswith('/'), and whether the directory has anything named index.html or index.htm, before deciding what to do.
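Wiring the subclass into a server then looks just like the script in the question; a minimal sketch reusing the same Python 2 modules:

import BaseHTTPServer

httpd = BaseHTTPServer.HTTPServer(('127.0.0.1', 8000), MyHandler)
httpd.serve_forever()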