I have an HTTPS proxy written in Python. Everything worked until I made the proxy transparent: I redirected ports 80 and 443 to 8888 (the proxy port). Now only requests over HTTP work; when I request an HTTPS website, I get 400 and 401 errors on the server. Any ideas? When I make a request over HTTPS, I see encrypted text on the proxy side (in the log) and I don't see any CONNECT request.
The proxy is at this address: https://github.com/allfro/pymiproxy
But I've changed it a bit to work as a transparent proxy:
def _connect_to_host(self):
    # Get hostname and port to connect to
    if self.is_connect:
        self.hostname = self.getheader('host')
        self.port = '443'
        #self.hostname, self.port = self.path.split(':')
    else:
        self.path = "http://" + self.headers.getheader('host') + self.path
    u = urlparse(self.path)
Correct me if this is OT, but how do you test your HTTPS over HTTPS proxy?
Python does not support this!
Neither does curl/libcurl/pycurl etc.
You may be able to find a browser that supports it though.
Also, I haven't found anything in the linked GitHub code where you listen on an SSL-enabled socket...
Proxy diagrams (shameless plug) https://stackoverflow.com/a/20330304/705086
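For what it's worth, here is a minimal sketch (Python 3, not the pymiproxy code; "proxy.crt"/"proxy.key" are placeholder filenames) of what a transparent HTTPS interceptor has to do: since the redirected browser starts TLS immediately instead of sending CONNECT, the proxy must terminate TLS on the listening side with its own certificate before it can see the request.
import socket, ssl

LISTEN_PORT = 8888

# The proxy's own certificate, which the client must be willing to trust.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile='proxy.crt', keyfile='proxy.key')

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(('', LISTEN_PORT))
listener.listen(5)

while True:
    raw_client, addr = listener.accept()
    # Terminate TLS here; the plain-text request is visible only after this.
    client = context.wrap_socket(raw_client, server_side=True)
    print(client.recv(1024))
    # A real interceptor would now open its own TLS connection to the
    # destination host and relay data in both directions.
    client.close()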
I am trying to get the IP address of the requesting computer on my server. I can successfully get the IP address from the request header if the request comes from a web browser (code example below). However, I cannot fetch the client IP if I send the request via curl/Postman. I checked the nginx log and found that my public IP is logged when I send a curl request. How can I achieve this?
PS: I am using the Sanic Framework.
client_ip = request.headers.get('x-real-ip')
In sanic's docs for Request Data:
ip (str) - IP address of the requester.
Therefore, just use
client_ip = request.ip
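For example, a minimal handler (a sketch only; the app name and route are made up, and request.remote_addr behaviour depends on your Sanic version and proxy configuration):
from sanic import Sanic
from sanic.response import json

app = Sanic("client_ip_demo")  # hypothetical app name

@app.route("/whoami")
async def whoami(request):
    # request.ip is the address of the direct TCP peer (nginx, when proxied);
    # request.remote_addr consults forwarded headers when proxy support is configured.
    return json({"ip": request.ip, "remote_addr": request.remote_addr})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
If requests arrive through nginx, request.ip will be nginx's address, so make sure nginx sets a forwarded header (such as the x-real-ip you already pass) and that Sanic is configured to trust it.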
I am trying to make a server in Python using sockets that I can connect to from any web browser. I am using "localhost" as the host and 8888 as the port.
When I attempt to connect to it, the content I want to be shown appears for a split second, and then it goes away with the browser saying "The connection was reset".
I've made it do something very simple to test if it still does it, and it does.
Is there a way to stop this?
import time
import socket

HOST = "localhost"
PORT = 8888

def function(sck):
    sck.send(bytes("test", "UTF-8"))
    sck.close()

ssck = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ssck.bind((HOST, PORT))
ssck.listen(1)

while True:
    sck, addr = ssck.accept()
    function(sck)
Probably the same problem as Perl: Connection reset with simple HTTP server, Ultra simple HTTP socket server, written in PHP, behaving unexpectedly, HTTP Server Not Sending Complete File To WGET, and Firefox. Connection reset by peer?. That is, you don't read the HTTP request from the browser but simply send your response and close the connection.
tl;dr
your function should be
def function(sck):
    sck.send(bytes("HTTP/1.1 200 OK\n\n<header><title>test page</title></header><body><h1>test page!</h1></body>", "UTF-8"))
    sck.close()
With a server as simple as that, you're only creating a TCP socket.
The HTTP protocol expects the client to ask for a page with a request like:
GET /somepath/somepage.html HTTP/1.1
Host: somehost.com
OtherHeader: look at the HTTP spec
The response should then be something like:
HTTP/1.1 200 OK
Some-Header: value

<header></header><body></body>
Note the blank line that separates the headers from the body.
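Putting the two answers together, here is a sketch (untested, Python 3) of the same toy server that reads the request before replying and sends a complete HTTP response:
import socket

HOST = "localhost"
PORT = 8888

def handle(sck):
    # Read (and discard) the browser's request up to the blank line that ends
    # the headers, so the client never sees the connection reset mid-request.
    request = b""
    while b"\r\n\r\n" not in request:
        chunk = sck.recv(1024)
        if not chunk:
            break
        request += chunk
    body = b"<html><head><title>test page</title></head><body><h1>test page!</h1></body></html>"
    headers = ("HTTP/1.1 200 OK\r\n"
               "Content-Type: text/html\r\n"
               "Content-Length: %d\r\n"
               "Connection: close\r\n\r\n" % len(body)).encode("ascii")
    sck.sendall(headers + body)
    sck.close()

ssck = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
ssck.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
ssck.bind((HOST, PORT))
ssck.listen(1)
while True:
    sck, addr = ssck.accept()
    handle(sck)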
Below is the simple script I'm using to redirect regular HTTP requests arriving on port 8080; it redirects them (or at least causes a redirect) right away, depending on the source IP address.
It works for HTTP, but I would like the same behavior for HTTPS requests coming in over port 443. Assume that if the redirection were not present, incoming clients to this simple server would be able to handshake with the target they are being redirected to via a self-signed certificate.
import SimpleHTTPServer
import SocketServer

LISTEN_PORT = 8080
source = "127.0.0.1"
target = "http://target/"

class simpleHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_POST(self):
        clientAddressString = ''.join(str(self.client_address))
        if source in clientAddressString:
            # redirect incoming request
            self.send_response(301)
            new_path = '%s%s' % (target, self.path)
            self.send_header('Location', new_path)
            self.end_headers()

handler = SocketServer.TCPServer(("", LISTEN_PORT), simpleHandler)
handler.serve_forever()
I can use a self-signed certificate and have access to the files "server.crt" and "server.key" that are normally used for this connection (without the redirecting Python server in the middle). I am not sure what happens when I put a redirection in between like this, although I assume it has to be part of the handshake chain.
How can I achieve this behavior?
Is there anything I should modify apart from the new target and the response code within the request headers?
I will split my answer into Networking and Python parts.
On the networking side, you cannot redirect at the SSL layer, so you need a full HTTPS server and must redirect the GET/POST request once the SSL handshake is complete. The response code, and the actual do_POST or do_GET implementation, would be exactly the same for both HTTP and HTTPS.
As a side note, don't you get any issues with redirecting POSTs? When you do a 301 on a POST, the browser will not resend the POST data to the new target, so something is likely to break at the application level.
On the Python side, you can augment an HTTP server to an HTTPs one by wrapping the socket:
import BaseHTTPServer, SimpleHTTPServer
import ssl
handler = BaseHTTPServer.HTTPServer(("", LISTEN_PORT), simpleHandler)
handler.socket = ssl.wrap_socket(handler.socket, certfile='path/to/combined/PKCS12/container', server_side=True)
handler.serve_forever()
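If you would rather keep the separate "server.crt" and "server.key" files mentioned in the question instead of a combined container, the same wrapping can (as an untested sketch) take both files:
# Sketch using the question's filenames; keyfile/certfile are the standard
# ssl.wrap_socket arguments for a separate private key and certificate.
handler.socket = ssl.wrap_socket(handler.socket,
                                 certfile='server.crt',
                                 keyfile='server.key',
                                 server_side=True)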
Hope this helps.
Using Python httplib or http.client, what code do I need in my HTTP client to:
use an HTTP HEAD request,
contact a web server by specifying only its IP address,
contact a web server without specifying any webpage (or homepage) in the request, and
extend its HTTP connection using Keep-Alive messages?
I used the following code example, but it has two problems:
It does not extend the HTTP connection using Keep-Alive, and
it gives me the error "500 Domain Not Found" if I use the IP address instead of the domain name.
import http.client
Connection = http.client.HTTPConnection("www.python.org")
Connection.request("HEAD", "")
response = Connection.getresponse()
print(response.status, response.reason)
requests allows you to:
send requests with the HEAD method:
import requests
resp = requests.head("http://www.python.org")
use sessions for automatic Keep-Alive: info
s = requests.Session()
resp = s.head("http://www.python.org")
resp2 = s.get("http://www.python.org/")
Regarding using the IP address instead of the domain: that has nothing to do with your request. Most sites use some kind of virtual hosting, so they don't respond to a bare IP address, only to specific domain names. If you ask for the IP address you may get a 500 error or an error message.
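If you really do need to contact the server by IP, one workaround (a sketch; 203.0.113.10 is a documentation placeholder, substitute the real address) is to keep sending the site's Host header so the virtual host can still be selected:
import requests

# The Host header tells the virtually-hosted server which site you want,
# even though the URL only contains an IP address.
resp = requests.head("http://203.0.113.10/", headers={"Host": "www.python.org"})
print(resp.status_code, resp.headers.get("Server"))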
I am facing the following scenario:
I am forced to use an HTTP proxy to connect to an HTTPS server. For several reasons I need access to the raw data (before encryption), so I am using the socket library instead of one of the HTTP-specific libraries.
I thus first connect a TCP socket to the HTTP proxy and issue the CONNECT command.
At this point, the HTTP proxy accepts the connection and seemingly forwards all further data to the target server.
However, if I now try to switch to SSL, I receive
error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
indicating that the socket attempted the handshake with the HTTP proxy and not with the HTTPS target.
Here's the code I have so far:
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('proxy',9502))
s.send("""CONNECT en.wikipedia.org:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:15.0) Gecko/20100101 Firefox/15.0.1
Proxy-Connection: keep-alive
Host: en.wikipedia.org
""")
print s.recv(1000)
ssl = socket.ssl(s, None, None)
ssl.connect(("en.wikipedia.org",443))
What would be the correct way to open an SSL socket to the target server after connecting to the HTTP proxy?
(Note that, in general, it would be easier to use an existing HTTPS library such as PyCurl instead of implementing it all yourself.)
Firstly, don't call your variable ssl. This name is already used by the ssl module, so you don't want to shadow it.
Secondly, don't call connect a second time. You're already connected; what you need is to wrap the socket. Since Python doesn't do any certificate verification by default, you'll need to verify the remote certificate and verify the host name too.
Here are the steps involved:
Establish your plain-text connection and use CONNECT like you're doing in the first few lines.
Read the HTTP response you get, and make sure you get a 200 status code. (You'll need to read the header line by line).
Use ssl_s = ssl.wrap_socket(s, cert_reqs=ssl.CERT_REQUIRED, ssl_version=ssl.PROTOCOL_TLSv1, ca_certs='/path/to/cabundle.pem') to wrap the socket, then verify the host name. It's worth reading this answer: the connect method and what it does after wrapping the socket.
Then, use ssl_s as if it was your normal socket. Don't call connect again.
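Roughly, those steps look like this (a Python 3 sketch using the question's proxy address; ssl.create_default_context() is used here instead of the older wrap_socket call and verifies the certificate and hostname for you):
import socket, ssl

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('proxy', 9502))  # proxy host/port from the question
s.sendall(b"CONNECT en.wikipedia.org:443 HTTP/1.1\r\nHost: en.wikipedia.org\r\n\r\n")

# Read the proxy's reply up to the blank line and check for a 200 status.
reply = b""
while b"\r\n\r\n" not in reply:
    chunk = s.recv(4096)
    if not chunk:
        break
    reply += chunk
status_line = reply.split(b"\r\n")[0]
if b" 200 " not in status_line:
    raise RuntimeError("proxy refused CONNECT: %r" % status_line)

# Wrap the established tunnel; do not call connect() again.
context = ssl.create_default_context()
ssl_s = context.wrap_socket(s, server_hostname="en.wikipedia.org")
ssl_s.sendall(b"HEAD / HTTP/1.1\r\nHost: en.wikipedia.org\r\nConnection: close\r\n\r\n")
print(ssl_s.recv(200))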
Works with Python 3.
<proxy> is an IP or domain name
<port> is 443 or 80 or whatever your proxy is listening on
<endpoint> is the final server you want to connect to via the proxy
<cn> is an optional SNI field your final server could be expecting
import socket, ssl

def getcert_sni_proxy(cn, endpoint, PROXY_ADDR=("<proxy>", <port>)):
    # prepare the CONNECT phrase
    CONNECT = "CONNECT %s:%s HTTP/1.0\r\nConnection: close\r\n\r\n" % (endpoint, 443)
    # connect to the actual proxy
    conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    conn.connect(PROXY_ADDR)
    conn.send(str.encode(CONNECT))
    conn.recv(4096)
    # set up the ssl layer
    context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    # connect to the final endpoint via the proxy, sending an optional server name (SNI) [cn here]
    sock = context.wrap_socket(conn, server_hostname=cn)
    # retrieve the certificate from the server
    certificate = ssl.DER_cert_to_PEM_cert(sock.getpeercert(True))
    return certificate
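A hypothetical call, with made-up proxy details, might look like:
# Fetch (and print) the PEM certificate of en.wikipedia.org through a proxy
# at proxy.example.com:8080 (both values are only for illustration).
pem = getcert_sni_proxy("en.wikipedia.org", "en.wikipedia.org",
                        PROXY_ADDR=("proxy.example.com", 8080))
print(pem)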