I would like to make various HTTP requests and display the actual response status code and reason regardless of any HTTP exceptions. For example, if a request returns 503 or 404, I just want to display that status code and handle it rather than dumping an exception stack.
However, in the following code the reason variable is never populated when the request is unsuccessful, so the request summary is never displayed.
Any suggestions?
import http.server
import socketserver
import socket
import requests

PORT = 5000
URL1 = "https://foo/"
# URL2 =

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.end_headers()
        self.wfile.write(("<br>Running on: %s" % socket.gethostname()).encode('utf-8'))
        if self.path == '/status':
            self.wfile.write(("<h1>status</h1>").encode('utf-8'))
            try:
                response = requests.get(URL1, verify=False)
                self.wfile.write(("<br>Request client connection : {}, Response Status: {}, Response Reason: {}".format(response.url, response.status_code, response.reason)).encode('utf-8'))
            except:
                self.wfile.write(("exception").encode('utf-8'))
                #self.wfile.write(("<br>Request client connection : {}, Response Status: {}, Response Reason: {}".format(response.url, response.status_code, response.reason)).encode('utf-8'))
            return
        return

httpd = socketserver.TCPServer(("", PORT), Handler)
print("serving at port: %s" % PORT)
httpd.serve_forever()
In this case the code is not checking for unsuccessful requests in the right place: the bare except will catch transport errors (DNS failure, refused connection, timeout), but requests never raises an exception just because the server returned 404 or 503. In those cases the call succeeds and response.status_code and response.reason are populated normally. If you do want a failed status code to raise an exception, call response.raise_for_status(). See also Response.raise_for_status in the requests documentation.
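A minimal sketch of the distinction (the URL below is a placeholder): requests only raises for transport-level failures, while an HTTP error status such as 404 or 503 still produces a normal Response whose status_code and reason you can display.

```python
import requests

def summarize(url):
    """Return a one-line summary of the response, or of the transport error."""
    try:
        response = requests.get(url, timeout=5)
        # 404, 503, etc. land here too: requests does not raise for HTTP
        # error codes unless you explicitly call response.raise_for_status().
        return "Status: {}, Reason: {}".format(response.status_code, response.reason)
    except requests.exceptions.RequestException as err:
        # DNS failure, refused connection, timeout, ...
        return "Request failed: {}".format(err)

print(summarize("https://foo/"))
```

Inside the handler above, the same try/except around requests.get lets you write either the summary line or the error into wfile without an exception stack reaching the console.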
Related
I wrote a webserver with SimpleHTTPRequestHandler and a small script that sends a GET request with an argument to the server, to which the server responds with code 200.
What I want is for the script to close as soon as the server replies with 200 (which should happen when self.send_response(200) is run, so I think the problem might be somewhere around there), instead of staying open until the called file, jg_server_ed.py, has finished.
Webserver code:
# server.py
import http.server
import socketserver
import subprocess
import webbrowser
from urllib.parse import urlparse

class Handler(http.server.SimpleHTTPRequestHandler):
    def do_POST(self):
        print(f"request body: {self.path}")
        return

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/html")
        self.end_headers()
        if self.path.startswith("/?sitename="):
            try:
                body_split = self.path.split("/?sitename=")
                if body_split != "" or []:
                    print(f"Incoming request: {body_split[-1]}")
                    subprocess.call(f"python jg_server_ed.py {body_split[-1]}")
                else:
                    pass
            except Exception as err:
                print(err)
        else:
            pass
        return super().do_GET()

if __name__ == "__main__":
    print("Serving on port 8080, IP 0.0.0.0.")
    socketserver.TCPServer(('0.0.0.0', 8080), Handler).serve_forever()
GET request sending script thingy:
# submit_to_server.py
import urllib3

http = urllib3.PoolManager()
url = input("URL: ")
print("Sending request...")
r = http.request('GET', f"http://server_ip_address/?sitename={url}")
if r.status == "200":
    exit()
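A hedged guess at the cause: subprocess.call blocks until jg_server_ed.py exits, and until do_GET returns, the response (including the 200 queued by send_response) is never completed, so the client keeps waiting. subprocess.Popen starts the child and returns immediately, letting the handler finish the response while the script keeps running. (Note also that urllib3's r.status is an int, so r.status == "200" is never true; compare against 200.) A minimal demonstration of the blocking difference:

```python
import subprocess
import sys
import time

# subprocess.call waits for the child to exit; subprocess.Popen returns
# as soon as the child has started. In do_GET that difference decides
# whether the HTTP response is completed before or after the child runs.
start = time.time()
proc = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(2)"])
launch_time = time.time() - start
print("Popen returned after %.3f s" % launch_time)
proc.wait()  # only wait if the handler actually needs the result
```

In the handler this would mean swapping subprocess.call(...) for subprocess.Popen([...]) and not waiting on it.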
This is my server that is handling post requests
from http.server import HTTPServer, BaseHTTPRequestHandler

class requestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('content-type', 'text/html')
        self.end_headers()
        output = ''
        output += '<html><body>'
        output += '<h1> List of Followers</h1>'
        self.wfile.write(output.encode())

    def do_POST(self):
        content_length = int(self.headers['Content-Length'])  # <--- Gets the size of data
        post_data = self.rfile.read(content_length)  # <--- Gets the data itself
        print(post_data.decode('utf-8'))

def main():
    PORT = 8080
    server_address = ('localhost', PORT)
    server = HTTPServer(server_address, requestHandler)
    print("Server running on port %s" % PORT)
    server.serve_forever()

if __name__ == '__main__':
    main()
This is my client that is sending a post request
import requests
import sys

try:
    payload = {"name": "Me", "job": "programmer"}
    r = requests.post('http://localhost:8080/', json=payload)
except requests.exceptions.RequestException as e:
    print("WTF IS GOING ON")
    print(e)
    sys.exit(1)
So this is being printed out by my server console
Server running on port 8080
{"name": "kenny", "job": "programmer"}
But this is being printed out by my client console
WTF IS GOING ON
('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
I'm not sure what this means and I've tried as much as I can before asking for help from the community. I appreciate the help
I wasn't sending a proper response code back to the client. Adding this in the do_POST method fixed it:
self.send_response(200, "OK")
self.end_headers()
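Put together, the do_POST from the question with the fix applied might look like this (a sketch; the class name is arbitrary):

```python
from http.server import HTTPServer, BaseHTTPRequestHandler

class RequestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])  # size of the body
        post_data = self.rfile.read(content_length)           # the body itself
        print(post_data.decode('utf-8'))
        # Without these two lines the client sees
        # "Remote end closed connection without response".
        self.send_response(200, "OK")
        self.end_headers()
```

With the response sent, requests.post on the client side returns a normal Response with r.status_code == 200 instead of raising ConnectionError.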
I can't connect to anything on my network using the IP address of the host. I can open a browser and connect and I can ping the host just fine. Here is my code:
from httplib import HTTPConnection

addr = '192.168.14.203'
conn = HTTPConnection(addr)
conn.request('HEAD', '/')
res = conn.getresponse()
if res.status == 200:
    print "ok"
else:
    print "problem : the query returned %s because %s" % (res.status, res.reason)
The following error gets returned:
socket.error: [Errno 51] Network is unreachable
If I change the addr var to google.com I get a 200 response. What am I doing wrong?
You should check the address and your proxy settings.
For making HTTP requests I recommend the requests library. It's much more high-level and user-friendly than httplib, and it makes it easy to set proxies:
import requests
addr = "http://192.168.14.203"
response = requests.get(addr)
# if you need to set a proxy:
response = requests.get(addr, proxies={"http": "...proxy address..."})
# to avoid using any proxy if your system sets one by default
response = requests.get(addr, proxies={"http": None})
URL = "MY HTTP REQUEST URL"
XML = "<port>0</port>"
parameter = urllib.urlencode({'XML': XML})
response = urllib.urlopen(URL, parameter)
print response.read()
IOError: ('http protocol error', 0, 'got a bad status line', None)
I am trying to send XML to a server and get back XML. Is there any way to fix / ignore this exception?
I know that the status line is empty which is raising this error.
Have a look at what your server actually returns! It probably isn't a valid HTTP response. You could use something like this to send a raw HTTP request to the server:
from socket import socket

host = 'localhost'
port = 80
path = "/your/url"
xmlmessage = "<port>0</port>"

s = socket()
s.connect((host, port))
s.send("POST %s HTTP/1.1\r\n" % path)
s.send("Host: %s\r\n" % host)
s.send("Content-Type: text/xml\r\n")
s.send("Content-Length: %d\r\n\r\n" % len(xmlmessage))
s.send(xmlmessage)
for line in s.makefile():
    print line,
s.close()
The response should look something like:
HTTP/1.1 200 OK
<response headers>
<response body>
I'm using httplib to access an api over https and need to build in exception handling in the event that the api is down.
Here's an example connection:
connection = httplib.HTTPSConnection('non-existent-api.com', timeout=1)
connection.request('POST', '/request.api', xml, headers={'Content-Type': 'text/xml'})
response = connection.getresponse()
This should time out, so I was expecting an exception to be raised, but response.read() just returns an empty string.
How can I know if there was a timeout? Even better, what's the best way to gracefully handle the problem of a 3rd-party api being down?
Even better, what's the best way to gracefully handle the problem of a 3rd-party api being down?
What do you mean by the API being down? That it returns HTTP 404, 500, and so on, or that it can't be reached at all?
First of all, I don't think you can know whether a web service is down without trying to access it, so for the first case I recommend something like this:
import httplib

conn = httplib.HTTPConnection('www.google.com')  # HTTP rather than HTTPS here, for simplicity
conn.request('HEAD', '/')  # Just send an HTTP HEAD request
res = conn.getresponse()
if res.status == 200:
    print "ok"
else:
    print "problem : the query returned %s because %s" % (res.status, res.reason)
And to check whether the API is reachable at all, you are better off using a try/except:
import httplib
import socket

try:
    # You don't need the timeout unless you also want to measure the response time ...
    conn = httplib.HTTPSConnection('www.google.com')
    conn.connect()
except (httplib.HTTPException, socket.error) as ex:
    print "Error: %s" % ex

You can mix the two approaches if you want something more general. Hope this helps!
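Combining the two ideas, a sketch in Python 3 terms (http.client replaced httplib, and socket.error is an alias of OSError): one helper, named here for illustration, that reports both unreachability and the returned status.

```python
import http.client
import socket

def check_service(host, use_https=True, timeout=5):
    """Return (reachable, status, reason); status and reason are None
    when the host cannot be reached at all."""
    conn_cls = http.client.HTTPSConnection if use_https else http.client.HTTPConnection
    conn = conn_cls(host, timeout=timeout)
    try:
        conn.request('HEAD', '/')
        res = conn.getresponse()
        return True, res.status, res.reason
    except (http.client.HTTPException, socket.error):
        return False, None, None
    finally:
        conn.close()
```

A reachable host yields something like (True, 200, 'OK'), while an unreachable one yields (False, None, None), so the caller can branch on both failure modes.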
Older urllib and httplib APIs don't expose a timeout argument. You have to import socket and set a default timeout for all new sockets there:
import socket
socket.setdefaulttimeout(10)  # or whatever timeout you want
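The same idea in Python 3 terms, as a sketch: http.client replaced httplib, a timed-out socket raises socket.timeout, and other connection failures surface as OSError. The helper name is mine:

```python
import http.client
import socket

socket.setdefaulttimeout(10)  # default for sockets created without an explicit timeout

def head_status(host, timeout=5):
    """Return the status code as a string, or a short error label."""
    conn = http.client.HTTPSConnection(host, timeout=timeout)
    try:
        conn.request('HEAD', '/')
        return str(conn.getresponse().status)
    except socket.timeout:
        return "timeout"
    except OSError as err:  # DNS failure, refused connection, ...
        return "unreachable: %s" % err
    finally:
        conn.close()
```

This answers the original question directly: a timeout is distinguishable from the API returning an error status, because only the former raises.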
This is what I found to be working correctly with httplib2. Posting it as it might still help someone :
import httplib2, socket

def check_url(url):
    h = httplib2.Http(timeout=0.1)  # 100 ms timeout
    try:
        resp = h.request(url, 'HEAD')
    except (httplib2.HttpLib2Error, socket.error) as ex:
        print "Request timed out for ", url
        return False
    return int(resp[0]['status']) < 400