Python HTTP HEAD request keepalive

Using Python httplib (http.client in Python 3), what code do I need in my HTTP client to:
send an HTTP HEAD request,
contact a web server by specifying only its IP address,
contact a web server without specifying any page (i.e. no path) in the request, and
keep its HTTP connection open using keep-alive?
I used the following code example, but it has two problems:
it does not keep the HTTP connection alive, and
it gives me a "500 Domain Not Found" error if I use the IP address instead of the domain name.
import http.client
# use "/" as the path; an empty path produces a malformed request line
Connection = http.client.HTTPConnection("www.python.org")
Connection.request("HEAD", "/")
response = Connection.getresponse()
print(response.status, response.reason)

The requests library allows you to:
send requests with the HEAD method:
import requests
resp = requests.head("http://www.python.org")
use sessions for automatic keep-alive:
s = requests.Session()
resp = s.head("http://www.python.org")
resp2 = s.get("http://www.python.org/")
Regarding using the IP address instead of the domain: that has nothing to do with your request itself. Most sites use some form of virtual hosting, so they respond only to specific domain names, not to the bare IP address. If you request by IP address, you may get a 500 or some other error.
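If you do need to reach a server by IP address with http.client, one workaround is to connect to the IP but send the site's domain in the Host header, so virtual hosting can still resolve the site; reusing the same HTTPConnection object then keeps the underlying TCP connection open, since HTTP/1.1 connections are persistent by default. A minimal sketch (the IP address below is a placeholder, not necessarily python.org's real address):
import http.client
# placeholder IP; substitute the address the domain actually resolves to
conn = http.client.HTTPConnection("151.101.128.223")
conn.request("HEAD", "/", headers={"Host": "www.python.org"})
response = conn.getresponse()
print(response.status, response.reason)
response.read()  # drain the (empty) HEAD body so the connection can be reused
# a second request on the same object reuses the open socket (keep-alive)
conn.request("HEAD", "/", headers={"Host": "www.python.org"})
response = conn.getresponse()
print(response.status, response.reason)
conn.close()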

Related

Why do I get an "HTTP/1.1 400 Bad Request" error on execution?

I want to send a request to a website through a proxy. The script looks like this, and it's made with the socket library in Python:
import socket
TargetDomainName="www.stackoverflow.com"
TargetIP="151.101.65.69"
TargetPort=80
ProxiesIP=["107.151.182.247"]
ProxiesPort=[80]
Connect=f"CONNECT {TargetDomainName} HTTP/1.1"
Connection=socket.socket(socket.AF_INET, socket.SOCK_STREAM)
Connection.connect((ProxiesIP[0],ProxiesPort[0]))
Connection.sendto(str.encode(Connect),(TargetIP, TargetPort))
Connection.sendto(("GET /" + TargetIP + " HTTP/1.1\r\n").encode('ascii'), (TargetIP, TargetPort))
Connection.sendto(("Host: " + ProxiesIP[0] + "\r\n\r\n").encode('ascii'), (TargetIP, TargetPort))
print (Connection.recv(1028))
Connection.close()
My question is: why do I get the 400 Bad Request error?
You did not indicate whether the 400 reply is coming from the proxy or the target server. But both of your commands are malformed.
Your CONNECT command is missing a port number, a Host header (required since you are requesting HTTP/1.1), and the trailing line breaks that terminate the command properly.
Your GET command is sent to the target server (if CONNECT is successful) and should not be requesting a resource by IP address. It is also sending the wrong value for the Host header. The command is relative to the target server, so it needs to specify the target server's host name.
Also, you should be using send()/sendall() instead of sendto().
Try something more like this instead:
import socket
TargetDomainName = "www.stackoverflow.com"
TargetIP = "151.101.65.69"
TargetPort = 80
ProxiesIP = ["107.151.182.247"]
ProxiesPort = [80]
Connection = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
Connection.connect((ProxiesIP[0], ProxiesPort[0]))
# ask the proxy to open a tunnel to the target, with the port number included
Connection.sendall((f"CONNECT {TargetDomainName}:{TargetPort} HTTP/1.1\r\n").encode("ascii"))
Connection.sendall((f"Host: {TargetDomainName}:{TargetPort}\r\n\r\n").encode("ascii"))
# read the proxy's own reply before talking to the target
print(Connection.recv(1028))
# now send the real request through the tunnel, with the target's Host header
Connection.sendall(("GET / HTTP/1.1\r\n").encode('ascii'))
Connection.sendall((f"Host: {TargetDomainName}\r\n").encode('ascii'))
Connection.sendall(("Connection: close\r\n\r\n").encode('ascii'))
print(Connection.recv(1028))
Connection.close()
You really need to read the proxy's reply before sending the GET command. The proxy will send its own HTTP reply indicating whether it successfully connected to the target server or not.
You really should not be implementing HTTP manually, though; there are plenty of HTTP libraries for Python that can handle these details for you. Python even has one built in: http.client
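For instance, http.client can establish the same CONNECT tunnel for you via its set_tunnel() method. A minimal sketch, reusing the proxy address from the question (which may no longer be live):
import http.client
# connect to the proxy, then tunnel through it to the target
conn = http.client.HTTPConnection("107.151.182.247", 80)
conn.set_tunnel("www.stackoverflow.com", 80)  # issues the CONNECT request for you
conn.request("GET", "/", headers={"Connection": "close"})
response = conn.getresponse()
print(response.status, response.reason)
conn.close()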

How to get the ip address of a request machine using python (On Sanic)

I am trying to get the IP address of the requesting machine on my server. I can successfully get the IP address from the request headers if the request comes from a web browser (code example below). However, I cannot fetch the client IP if I send the request via curl or Postman. I checked the nginx log and found that my public IP is logged when I send a curl request. How can I achieve this?
PS: I am using the Sanic framework.
client_ip = request.headers.get('x-real-ip')
From Sanic's docs for Request Data:
ip (str) - IP address of the requester.
Therefore, just use:
client_ip = request.ip
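Note that behind a reverse proxy such as nginx, request.ip will be the proxy's address, so falling back between the header and the socket peer covers both cases. A minimal sketch of a handler combining the two (the app name, route, and port are arbitrary):
from sanic import Sanic
from sanic.response import json
app = Sanic("ip_demo")
@app.route("/whoami")
async def whoami(request):
    # prefer the X-Real-IP header set by nginx; fall back to the socket peer
    client_ip = request.headers.get("x-real-ip") or request.ip
    return json({"client_ip": client_ip})
if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)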

python 'requests' package with proxy not working

Here is my test code:
import requests
url = 'https://api.ipify.org/'
proxyapi = 'http://ip.11jsq.com/index.php/api/entry?method=proxyServer.generate_api_url&packid=1&fa=0&fetch_key=&qty=1&time=1&pro=&city=&port=1&format=txt&ss=1&css=&dt=1'
proxy = {'http' : 'http://{}'.format(requests.get(proxyapi).text)}
print ('Downloading with fresh proxy.', proxy)
resp = requests.get(url, proxies=proxy)
print ('Fresh proxy response status.', resp.status_code)
print (resp.text)
#terminal output
Downloading with fresh proxy. {'http': 'http://49.84.152.176:30311'}
Fresh proxy response status. 200
222.68.154.34#my public ip address
There is no error message, and it seems that the requests library never applies the proxy settings. The proxy API is valid: I've checked the proxy in my web browser, and when visiting https://api.ipify.org/ through it, the desired proxy server's IP address is returned.
I am using Python 3.6.4 and requests 2.18.4.
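A likely cause (an inference; no answer is recorded here): the target URL uses https, but the proxies dict only defines an entry for the http scheme, and requests selects a proxy by the URL's scheme, so the HTTPS request goes out directly. A sketch with both keys set, reusing the proxy address from the output above:
import requests
proxy_server = "49.84.152.176:30311"  # example address from the output above
proxies = {
    "http": f"http://{proxy_server}",
    "https": f"http://{proxy_server}",  # https:// URLs are proxied only via this key
}
resp = requests.get("https://api.ipify.org/", proxies=proxies)
print(resp.text)  # should print the proxy's address if the tunnel works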

Python simple HTTPS forwarder

Below is the simple script I'm using to redirect regular HTTP requests on port 8080; it redirects them (or at least triggers a redirect) based on the source IP address.
It works for HTTP; however, I would like the same behavior for HTTPS requests coming in over port 443. Assume that if the redirection were not present, clients connecting to this simple server would be able to handshake with the redirect target via a self-signed certificate.
import SimpleHTTPServer
import SocketServer
LISTEN_PORT = 8080
source = "127.0.0.1"
target = "http://target/"
class simpleHandler(SimpleHTTPServer.SimpleHTTPRequestHandler):
    def do_POST(self):
        clientAddressString = ''.join(str(self.client_address))
        if source in clientAddressString:
            # redirect incoming request
            self.send_response(301)
            new_path = '%s%s' % (target, self.path)
            self.send_header('Location', new_path)
            self.end_headers()
handler = SocketServer.TCPServer(("", LISTEN_PORT), simpleHandler)
handler.serve_forever()
I can use a self-signed certificate and have access to the files "server.crt" and "server.key" that are normally used for this connection (without the redirecting Python server in the middle). I am not sure what happens when I put a redirect in between like this, although I assume it has to become part of the handshaking chain.
How can I achieve this behavior?
Is there anything I should modify apart from the new target and the response code within the request headers?
I will split my answer into Networking and Python parts.
On the networking side, you cannot redirect at the SSL layer; you need a full HTTPS server and can only redirect the GET/POST request once the SSL handshake is complete. The response code and the actual do_POST or do_GET implementation would be exactly the same for both HTTP and HTTPS.
As a side note, don't you get any issues with redirecting POSTs? When you answer 301 to a POST, the browser will not resend the POST data to your new target, so something is likely to break at the application level.
On the Python side, you can upgrade an HTTP server to an HTTPS one by wrapping the socket:
import BaseHTTPServer
import ssl
handler = BaseHTTPServer.HTTPServer(("", LISTEN_PORT), simpleHandler)
# certfile expects a PEM file, typically the certificate and private key combined
handler.socket = ssl.wrap_socket(handler.socket, certfile='path/to/combined.pem', server_side=True)
handler.serve_forever()
Hope this helps.
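On Python 3, the equivalent uses http.server and ssl.SSLContext; a minimal sketch, assuming the server.crt and server.key files mentioned in the question:
import http.server
import ssl
LISTEN_PORT = 443
target = "http://target/"
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # send the client to the same path on the redirect target
        self.send_response(301)
        self.send_header("Location", "%s%s" % (target, self.path))
        self.end_headers()
    do_POST = do_GET  # handle POSTs the same way, subject to the caveat above
httpd = http.server.HTTPServer(("", LISTEN_PORT), RedirectHandler)
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")
httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()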

SOAP web service behind proxy, access using python-suds

I have this strange scenario with Python suds.
I have a SOAP service (Java) running on a local IP, say http://10.0.0.1:8080/services/
I use suds with HTTP basic auth within the local network, and it's working fine.
from suds.client import Client
c = Client(url, username="user", password="pass")
But I want to make it accessible from outside, so I asked the system admin: "Can you set up an external IP with a reverse proxy for this SOAP service?"
"Yes, but the company firewall doesn't allow port 8080, so your rule will be:
http://10.0.0.1:8080/services/* <-> https://example.com/services/*"
The rule was then set up, but I just can't get the client to work. I tried all kinds of transports:
from suds.transport.https import WindowsHttpAuthenticated
from suds.transport.http import HttpAuthenticated
#from suds.transport.https import HttpAuthenticated
from suds.client import Client
http = HttpAuthenticated(username="jlee", password="jlee")
#https = HttpAuthenticated(username="jlee", password="jlee")
ntlm = WindowsHttpAuthenticated(username="jlee", password="jlee")
url = "https://example.com/services/SiteManager?wsdl"
c = Client(url, transport = http)
it always returns:
suds.transport.TransportError: HTTP Error 403: Forbidden ( The server denied the specified Uniform Resource Locator (URL). Contact the server administrator. )
I tried to access the URL https://example.com/services/SiteManager?wsdl from Chrome, and it returns 403 too!
But if I sign in first via other routes (my server is running other HTTP pages on Tomcat) and then access the URL again, the WSDL description page shows up!
Can anybody tell me what's wrong with this? Is it to do with the configuration of the reverse proxy server, or with the suds transport?
Thanks very much!
Jackie
Found the solution by talking to the system admin (who's in charge of setting up the reverse proxy): there is a checkbox option in MS DMZ (the reverse proxy server) to allow HTTP basic auth.
