import urllib.request

request = urllib.request.Request('http://1.0.0.8/')
try:
    response = urllib.request.urlopen(request)
    print("Server Online")
    # do stuff here
except urllib.error.HTTPError as e:  # 404, 500, etc.
    print("Server Offline")
    # do stuff here
I'm trying to write a simple program that checks whether each web server in a list of LAN servers is up. For now I'm just using one IP.
When I run it with the IP of a web server, I get back "Server Online".
When I run it with an IP that doesn't have a web server, I get
"urllib.error.URLError: <urlopen error [WinError 10061] No connection could be made because the target machine actively refused it>"
but I would rather have a simple "Server Offline" output. I'm not sure how to get it to print "Server Offline" instead.
In your code above you're only catching HTTPError exceptions.
Just add another except clause that references the exception you're actually seeing, in this case URLError:
import urllib.request

request = urllib.request.Request('http://1.0.0.8/')
try:
    response = urllib.request.urlopen(request)
    print("Server Online")
    # do stuff here
except urllib.error.HTTPError as e:
    print("Server Offline")
    # do stuff here
except urllib.error.URLError as e:
    print("Server Offline")
    # do stuff here
Well, we could also combine the errors:
except (urllib.error.URLError, urllib.error.HTTPError):
    print("Poor server is offline.")
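Putting the pieces together, here is a minimal sketch that checks a whole list of LAN addresses; the helper name `check_server` and the sample IPs are illustrative, and it deliberately treats an HTTPError (404, 500, ...) as "online", since the server did answer:

```python
import urllib.request
import urllib.error

def check_server(url, timeout=2):
    """Return True if the server answers an HTTP request, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, even if with an error status (404, 500, ...)
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, timed out, host unreachable, ...
        return False

for ip in ['1.0.0.8', '1.0.0.9']:
    status = "Online" if check_server('http://{}/'.format(ip)) else "Offline"
    print("Server {}: {}".format(ip, status))
```

Whether an HTTP error status counts as "online" or "offline" is a design choice; swap the `HTTPError` branch's return value if you only care about healthy responses.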
In case of a connection error, I want Python to wait and re-try. Here's the relevant code, where "link" is some link:
import requests
import urllib.request
import urllib.parse
from random import randint

try:
    r = requests.get(link)
except ConnectionError or TimeoutError:
    print("Will retry again in a little bit")
    time.sleep(randint(2500, 3000))
    r = requests.get(link)
Except I still periodically get a connection error, and I never see the text "Will retry again in a little bit", so I know the code is not re-trying. What am I doing wrong? I'm pasting parts of the error output below in case I'm misreading it. TIA!
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
During handling of the above exception, another exception occurred:
requests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None))
During handling of the above exception, another exception occurred:
requests.exceptions.ConnectionError: ('Connection aborted.', TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None))
For me, using a custom user agent in the request fixes this issue. With this method you spoof your browser.
Works:
url = "https://www.nasdaq.com/market-activity/stocks/amd"
headers = {'User-Agent': 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.6) Gecko/20070802 SeaMonkey/1.1.4'}
response = requests.get(url, headers=headers)
Doesn't work:
url = "https://www.nasdaq.com/market-activity/stocks/amd"
response = requests.get(url)
The second request is not inside a try block, so exceptions from it are not caught. Also, in the try-except block you're not catching other exceptions that may occur.
You could use a loop to attempt a connection two times, and break if the request is successful.
for _ in range(2):
    try:
        r = requests.get(link)
        break
    except (ConnectionError, TimeoutError):
        print("Will retry again in a little bit")
    except Exception as e:
        print(e)
    time.sleep(randint(2500, 3000))
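The retry-loop idea above can be generalized into a small helper that retries any callable; the names (`retry`, `flaky`) are illustrative, and with requests you would normally pass `requests.exceptions.RequestException` as the exception tuple:

```python
import time

def retry(func, attempts=3, delay=1, exceptions=(OSError,)):
    """Call func(); on a listed exception, sleep and try again.

    Re-raises the last exception if every attempt fails.
    """
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except exceptions:
            if attempt == attempts:
                raise
            print("Will retry again in a little bit")
            time.sleep(delay)

# Example with a flaky function that fails twice, then succeeds:
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise OSError("connection aborted")
    return "ok"

print(retry(flaky, attempts=5, delay=0))  # prints "ok" after two retries
```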
I think you should use
except (ConnectionError, TimeoutError) as e:
    print("Will retry again in a little bit")
    time.sleep(randint(2500, 3000))
    r = requests.get(link)
See this similar question, or check the docs.
I had the same problem. It turns out that urllib3 relies on socket.py, which raises an OSError, so you need to catch that:
try:
    r = requests.get(link)
except OSError as e:
    print("There was an error: {}".format(e))
I've implemented some code that allows a client to connect to a socket server, introduces itself and the server then goes into an infinite loop which sends "commands" (strings) to the client from a Redis list. The server uses the Redis 'blpop' method to block until a string arrives which is then sent off to the client and the response awaited.
However, in testing (with a Python client socket script on another local workstation) I find that if I break the client connection (Ctrl+C) to simulate an interruption in connectivity, the server happily writes the next received string to the client and reports an empty response, but ONLY throws the broken-pipe exception when a second string is written :/ Thus, two writes are "lost" before anything is caught. Here's my code:
# Create global Redis resource
rds_cnx = redis.StrictRedis(host='localhost', port=6379, db=6)

def initialise_server():
    """ Setup server socket """
    try:
        srv_skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv_skt.bind((IP, PORT))
        srv_skt.listen(1)
        print("Listening on:[{}:{}]".format(IP, PORT))
        return srv_skt
    except socket.error as skt_err:  # e.g. port in use
        print("Could not initialise tcp server:[{}]".format(skt_err))
        sys.exit(1)
    except Exception as exp:
        print("Unable to setup server socket:[{}]".format(exp))
        sys.exit(1)

def main():
    server_socket = initialise_server()
    while True:
        client_socket, remote_address = server_socket.accept()
        try:
            # Block and wait for connection and data
            initial_data = client_socket.recv(1024).decode()
            print("Connection from [{}] - Data:[{}]".format(remote_address, initial_data))
            while True:
                wait_for_queue_command(client_socket)
        except (BrokenPipeError, socket.error, Exception) as sck_exp:
            print("Exception in client loop:[{}]".format(sck_exp))
            continue
        except KeyboardInterrupt:
            # Close client socket
            client_socket.shutdown(2)
            client_socket.close()
            print('Caught Ctrl+c ... Shutting down.')
            break
    # Tear down context
    server_socket.shutdown(2)  # Param ref: 0 = done receiving, 1 = done sending, 2 = both
    server_socket.close()
def wait_for_queue_command(client_skt):
    """ Block while waiting for a command from the Redis list
    :param client_skt: socket
    :return: None
    """
    print('Waiting for command...')
    queue_cmd = rds_cnx.blpop('queuetest', 0)
    print("Received something from the queue:")
    pprint(queue_cmd)
    try:
        # client_skt.settimeout(15)
        client_skt.send(queue_cmd[1])
        # Block for response
        response_data = client_skt.recv(1024).decode()
        print("Response:[{}]".format(response_data))
    except BrokenPipeError as brkn_p:
        print('Outbound write detected "Broken Pipe":[{}]'.format(brkn_p))
        ''' Here one would decide to either re-schedule the command or
            ignore the error and move on to the next command. A "pause"
            (sleep) could also be useful?
        '''
        raise
    except socket.timeout as sck_tmo:
        print('Socket timed out:[{}]'.format(sck_tmo))
    except socket.error as sck_err:
        print('Socket error:[{}]'.format(sck_err))
        raise
    print('Command handling complete.')
Is there a better way to handle such a situation? I've had a cursory look at Twisted, but it seems very difficult to achieve this specific blocking behavior and the other code needed to handle specific responses from the client.
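One observation on the behavior described: on TCP, `recv()` returning an empty bytes object means the peer has closed the connection, so the "empty response" is itself the disconnect signal and can be turned into an error before a second write is attempted. A minimal sketch (the helper name `recv_or_fail` is illustrative):

```python
import socket

def recv_or_fail(sock, bufsize=1024):
    """recv() that raises ConnectionError when the peer has closed."""
    data = sock.recv(bufsize)
    if data == b'':
        # An empty read on a TCP socket means orderly shutdown by the peer
        raise ConnectionError("peer closed the connection")
    return data

# Demonstration with a local socket pair:
a, b = socket.socketpair()
a.sendall(b'hello')
print(recv_or_fail(b))  # b'hello'
a.close()
try:
    recv_or_fail(b)
except ConnectionError as exc:
    print("detected disconnect:", exc)
```

Calling this instead of a bare `recv()` in the response-reading step would let the server re-queue the command immediately rather than losing two writes.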
I have implemented a quick solution to check for an internet connection in one of my Python programs, using what I found on SO:
def check_internet(self):
    try:
        response = urllib2.urlopen('http://www.google.com', timeout=2)
        print "you are connected"
        return True
    except urllib2.URLError as err:
        print err
        print "you are disconnected"
It works well ONCE, and shows that I am not connected if I try it while offline. But if I re-establish the connection and try again, it still says I am not connected.
Is the urllib2 connection not closed somehow? Should I do something to reset it?
This could be because of server-side caching.
Try this:
def check_internet(self):
    try:
        header = {"pragma": "no-cache"}  # Tells the server to send a fresh copy
        req = urllib2.Request("http://www.google.com", headers=header)
        response = urllib2.urlopen(req, timeout=2)
        print "you are connected"
        return True
    except urllib2.URLError as err:
        print err
I haven't tested it. But according to the 'pragma' definition, it should work.
There is a good discussion here if you want to know about pragma: Difference between Pragma and Cache-control headers?
This is how I used to check my connectivity for one of my applications.
import httplib
import socket

test_con_url = "www.google.com"                   # For connection testing
test_con_resource = "/intl/en/policies/privacy/"  # may change in future
test_con = httplib.HTTPConnection(test_con_url)   # create a connection

try:
    test_con.request("GET", test_con_resource)  # do a GET request
    response = test_con.getresponse()
except httplib.ResponseNotReady as e:
    print "Improper connection state"
except socket.gaierror as e:
    print "Not connected"
else:
    print "Connected"
test_con.close()
I tested the code enabling/disabling my LAN connection repeatedly and it works.
It will be faster to just make a HEAD request so no HTML will be fetched.
Also I am sure google would like it better this way :)
# uncomment for python2
# import httplib
import http.client as httplib

def have_internet():
    conn = httplib.HTTPConnection("www.google.com")
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()
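A variant of the HEAD check that takes the host, port, and timeout as parameters is easier to test against a local server; the function name `can_reach` and its defaults are illustrative:

```python
import http.client

def can_reach(host, port=80, timeout=2):
    """HEAD the root of host:port; True if any HTTP response comes back."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("HEAD", "/")
        conn.getresponse()
        return True
    except (OSError, http.client.HTTPException):
        # DNS failure, refused connection, timeout, or a protocol error
        return False
    finally:
        conn.close()
```

Usage for the internet check would then be `can_reach("www.google.com")`.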
I have a problem with socket communication in my Python app. I'm using Python 3.2 on Windows 7.
For my task I must use UDP sockets. If I run my app from the IDE (Eclipse), communication between client and server is fine.
BUT:
if I run my app from the Command Prompt, the client can't communicate with the server (I get errno 11004, getaddrinfo failed). On Windows XP the app works fine.
I tried turning off the firewall, but it doesn't help.
Why can't I communicate from cmd?
Client connection:
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(15)
except socket.error as msg:
    print(msg)
    s.close()
    s = None

addr = (HOST, int(PORT))
msg = "CONNECT"
s.sendto(bytes(msg, "ascii"), addr)
try:
    data = s.recvfrom(1024)[0]
except socket.timeout as err:
    print("Connection lost! /cry")
    sys.exit(1)
and server code:
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
except socket.error as msg:
    print(msg)
    s = None

if s is None:
    print('could not open socket')
    sys.exit(1)

s.settimeout(45)
s.bind(('0.0.0.0', PORT))
print("Server created. Waiting for player.")
sost = "Server"
while True:
    try:
        (data, addr) = s.recvfrom(1024)
    except socket.timeout as err:
        print("Nobody wants to connect! /cry")
        sys.exit(1)
    if data == b"CONNECT":
        print("User from {0} connected".format(addr))
        s.sendto(b"CONNECT_OK", addr)
        break
PS: Sorry for my english :)
Can you print out the value of the addr variable when you get the exception? Maybe something silly is going on with the address (an embedded line feed or something like that).
It's probably safest to add an addr.strip() call.
I'm using httplib to access an api over https and need to build in exception handling in the event that the api is down.
Here's an example connection:
connection = httplib.HTTPSConnection('non-existent-api.com', timeout=1)
connection.request('POST', '/request.api', xml, headers={'Content-Type': 'text/xml'})
response = connection.getresponse()
This should time out, so I was expecting an exception to be raised, but instead response.read() just returns an empty string.
How can I know if there was a timeout? Even better, what's the best way to gracefully handle the problem of a 3rd-party api being down?
Even better, what's the best way to gracefully handle the problem of a 3rd-party api being down?
What do you mean by the API being down: the API returning HTTP 404, 500, etc.,
or the API not being reachable at all?
First of all, I don't think you can know whether a web service is down before trying to access it, so for the first case I'd recommend something like this:
import httplib

conn = httplib.HTTPConnection('www.google.com')  # I used HTTP here, not HTTPS, for simplicity
conn.request('HEAD', '/')  # Just send a HTTP HEAD request
res = conn.getresponse()
if res.status == 200:
    print "ok"
else:
    print "problem : the query returned %s because %s" % (res.status, res.reason)
And for checking whether the API is reachable at all, I think you'd be better off doing a try/except:
import httplib
import socket

try:
    # I don't think you need the timeout unless you also want to measure the response time ...
    conn = httplib.HTTPSConnection('www.google.com')
    conn.connect()
except (httplib.HTTPException, socket.error) as ex:
    print "Error: %s" % ex
You can mix the two approaches if you want something more general. Hope this helps.
In old versions of Python, urllib and httplib didn't expose a timeout parameter. You had to import socket and set a default timeout there:
import socket
socket.setdefaulttimeout(10)  # or whatever timeout you want
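A quick sketch of how that module-level default propagates to sockets created afterwards (the value 10 is arbitrary):

```python
import socket

socket.setdefaulttimeout(10)  # applies to sockets created from now on

s = socket.socket()
print(s.gettimeout())  # 10.0: the default was picked up at creation time
s.close()

socket.setdefaulttimeout(None)  # restore the blocking default
```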
This is what I found to work correctly with httplib2. Posting it as it might still help someone:
import httplib2, socket

def check_url(url):
    h = httplib2.Http(timeout=0.1)  # 100 ms timeout
    try:
        resp = h.request(url, 'HEAD')
    except (httplib2.HttpLib2Error, socket.error) as ex:
        print "Request timed out for ", url
        return False
    return int(resp[0]['status']) < 400