Python Requests ProxyError not caught - python

Why is the proxy error not caught by the first except clause? I don't quite understand why it falls through to the second clause (or, if I remove the second clause, why it just raises the error).
from requests.exceptions import ProxyError

try:
    login(acc)
except ProxyError:
    pass
except Exception as e:
    print(e)
Output:
HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: /mail (Caused by ProxyError('Cannot connect to proxy.', error('Tunnel connection failed: 403 Forbidden',)))

You've hit a bit of an edge case here. The ProxyError exception is not actually the requests.exceptions exception; it is an exception with the same name from the embedded urllib3 library, and it is wrapped in a MaxRetryError exception.
This is really a bug, and was indeed filed as such a while ago, see issue #3050. It was fixed with this pull request, to raise the proper requests.exceptions.ProxyError exception instead. This fix has been released as part of requests 2.9.2.
Normally, requests unwraps the MaxRetryError exception for you, but not for this specific exception. If you can’t upgrade to 2.9.2 or newer you can catch it specifically (unwrapping two layers now):
from requests.exceptions import ConnectionError
from requests.packages.urllib3.exceptions import MaxRetryError
from requests.packages.urllib3.exceptions import ProxyError as urllib3_ProxyError

try:
    # ...
except ConnectionError as ce:
    if (isinstance(ce.args[0], MaxRetryError) and
            isinstance(ce.args[0].reason, urllib3_ProxyError)):
        # oops, requests should have handled this, but didn't.
        # see https://github.com/kennethreitz/requests/issues/3050
        pass
    else:
        # not a proxy problem; don't swallow other connection errors
        raise
or apply the change from the pull request to your local install of requests.
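On 2.9.2 or newer the straightforward except ProxyError: clause from the question works as intended. A minimal sketch, assuming an unreachable proxy (the 127.0.0.1:9 address is just a stand-in that nothing listens on; the helper name is mine, not from the question):

```python
import requests


def fetch_via_proxy(url, proxy):
    """Return the response, or None if the proxy cannot be reached."""
    try:
        return requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=5)
    except requests.exceptions.ProxyError:
        # On requests 2.9.2+ a failed proxy connection surfaces as
        # requests.exceptions.ProxyError, so this clause now matches.
        return None
```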

Python caught exception and continue

I have a little script in Python that runs in a loop.
Every 30 seconds it fetches a URL with requests to check whether the content of the page has changed.
But sometimes (about once a day) I get a script error:
ConnectionError: HTTPSConnectionPool(host='www.example.com', port=443): Max retries exceeded with url: /test/ (Caused by NewConnectionError(<urllib3.connection.VerifiedHTTPSConnection object at 0x.......>: [Errno -3] Temporary failure in name resolution))
What is the best way to intercept the exception and, if it occurs, wait another 30 seconds and continue the script instead of stopping it?
Should I catch ConnectionError or NewConnectionError?
Put a try/except around your code. Catching requests.exceptions.ConnectionError is enough: the NewConnectionError in your traceback is a urllib3 exception that requests wraps in its own ConnectionError before raising it.
import requests

try:
    # your code here
except requests.exceptions.ConnectionError:
    pass
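Putting that together with the 30-second loop from the question, a minimal sketch (the timeout, the change-detection logic, and the `interval`/`rounds` parameters are placeholders of mine, not from the original script):

```python
import time

import requests


def poll(url, interval=30, rounds=None):
    """Fetch url every `interval` seconds; skip the round on connection trouble."""
    last = None
    done = 0
    while rounds is None or done < rounds:
        done += 1
        try:
            text = requests.get(url, timeout=10).text
        except requests.exceptions.ConnectionError:
            # Covers the NewConnectionError case too: requests wraps urllib3's
            # exception in its own ConnectionError before it reaches us.
            time.sleep(interval)
            continue
        if last is not None and text != last:
            print("content changed")
        last = text
        time.sleep(interval)
```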

How can I access a peer's cert chain from a python-requests response/exception object?

I use python-requests to talk to HTTPS web services, some of which present incomplete X509 certificate chains. I'm having trouble figuring out how to access the invalid/incomplete certificates in order to explain the error to the user.
Here's an example, illustrated by https://ssllabs.com/ssltest, where the server sends only the leaf certificate, and not the intermediate certificate that is necessary for validation and missing from certifi's root CA store.
When I try to connect with python-requests, I get an exception that isn't very useful:
requests.get('https://web.service.com/path')
SSLError: HTTPSConnectionPool(host='web.service.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
Obviously, I can use separate tools to figure out what's wrong in any particular case (e.g. gnutls-cli, openssl s_client, SSLLabs, etc.).
However, what I really want is to be able to catch and diagnose the problem with the certificate chain in my Python code, so that I can present a more specific error message to the user. This answer suggests a monkey-patch to the response object; it's not particularly elegant, but it works, though only when the response object is returned successfully, and not in the case of an exception.
What are the cleanest ways to instrument requests to save the peer's certificate chain in the exception object returned when requests fails to validate the certificate chain itself?
Take requests.get("https://151.101.1.69") (Stack Overflow's IP) as an example:
import requests

try:
    requests.get("https://151.101.1.69")
except requests.exceptions.SSLError as e:
    cert = e.args[0].reason.args[0]._peer_cert
Then cert is a dict containing the peer's certificate. As I'm not very familiar with SSL, I don't know whether it is enough for your case.
BTW, in this case the error is "hostname '151.101.1.69' doesn't match either of '*.stackexchange.com', ...omitted. I'm not sure about the structure of the exception in your real case, so you may need to find it on your own. I think it should have the same name, _peer_cert.
Update
The above method doesn't work when the handshake fails... but it can still be done:
import ssl

import OpenSSL
import requests

try:
    requests.get("https://fpslinux1.finalphasesystems.com/")
except requests.exceptions.SSLError:
    cert = ssl.get_server_certificate(('fpslinux1.finalphasesystems.com', 443))
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, cert)
    print(cert.get_issuer())
    print(cert.get_subject().get_components())
Yes, it is a little dirty, but I don't have a better method: an SSL socket doesn't even return invalid certificates at the C level.
To use OpenSSL, you need to install pyopenssl.
To use OpenSSL, you need to install pyopenssl.

Python: SSLError, bad handshake, Unexpected EOF

I have an issue with connecting to a specific site using Python requests and get this error:
HTTPSConnectionPool(host='XXXXXXXXX', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))
How can I work around this? (Setting verify=False makes no difference.) I suspect the server is at fault here, as it gets an overall rating of F when I run the SSL Labs test.
I am fairly new to Python and requests.
My code:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=True)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)
Faced with the same error, my troubles went away after running pip install ndg-httpsclient. yum install python-ndg_httpsclient or apt-get install python-ndg-httpsclient (or apt-get install python3-ndg-httpsclient) probably works too.
As mentioned in another answer (https://stackoverflow.com/a/36499718/1657819), this error may happen if you're behind a proxy, so disabling the proxy may help:
unset https_proxy
The root cause might be this open bug in the requests library: "Session.verify=False ignored when REQUESTS_CA_BUNDLE environment variable is set".
We've seen similar issues start all of a sudden on a specific host. It turned out that the environment variable had recently been set there, which caused sessions to run with verification effectively enabled even though session.verify was initialized to False. Once we removed the REQUESTS_CA_BUNDLE environment variable, the errors stopped.
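If you cannot control the environment the process runs in, a possible workaround is the documented Session.trust_env switch, which stops requests from reading environment settings at all. A sketch (note that trust_env=False also disables *_proxy variables and netrc lookup, which may or may not be what you want):

```python
import requests

# A session that ignores environment configuration entirely, so
# REQUESTS_CA_BUNDLE (and friends) can no longer override its settings.
session = requests.Session()
session.trust_env = False
session.verify = False  # now actually honoured
```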
Set verify=False; it will help:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=False)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)
or try with urllib:
import ssl
import urllib.error
import urllib.request

url = 'https://XXXXXXXXXX'

# Ignore SSL certificate errors
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

try:
    site = urllib.request.urlopen(url, context=ctx)
    print(site.read().decode())
except urllib.error.URLError as e:
    print(e)

Cannot catch ConnectionError with requests

I'm doing this:
import requests
r = requests.get("http://non-existent-domain.test")
And getting
ConnectionError: HTTPConnectionPool(host='non-existent-domain.test', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x10b0170f0>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))
However, if I try to catch it like this:
try:
    r = requests.get("http://non-existent-domain.test")
except ConnectionError:
    print("ConnectionError")
Nothing changes; the ConnectionError is still unhandled. How do I catch it properly?
That's a different ConnectionError. You are catching the built-in one, but requests has its own. So this should be
try:
    r = requests.get("http://non-existent-domain.test")
except requests.ConnectionError:
    print("ConnectionError")
# Output: ConnectionError
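The two classes are unrelated in the exception hierarchy, which is why the built-in except clause never matched. A quick check (assuming a reasonably recent requests):

```python
import requests

# requests.ConnectionError is an alias for requests.exceptions.ConnectionError...
assert requests.ConnectionError is requests.exceptions.ConnectionError

# ...and it descends from RequestException / IOError, not from the built-in
# ConnectionError, so a bare `except ConnectionError:` can never catch it.
assert issubclass(requests.ConnectionError, requests.exceptions.RequestException)
assert not issubclass(requests.ConnectionError, ConnectionError)
```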

Python requests.get raises ConnectionError for HTTPS url

I'm trying this simple Python 2.7 code:
import requests

response = requests.get(url="https://sslbl.abuse.ch", verify=False)
print(response)
I'm using verify=False in order to ignore verifying the SSL certificate.
I'm getting the following exception:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='sslbl.abuse.ch', port=443): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10054] An existing connection was forcibly closed by the remote host
If I try another https url (like twitter.com) everything is ok.
What can be the problem? How can I get the response like a browser does?
UPDATE:
after upgrading requests I get the same ConnectionError, but some warnings were added:
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
I do not use Python 2.7 for my tasks, but I tried to open the URL you provided with Python 3.2 (it should work on all Python 3.x, I assume). No exception was raised. This is what I did (>>> prompts omitted):
from urllib.request import urlopen

url = "https://sslbl.abuse.ch"
response = urlopen(url)
type(response)
# <class 'http.client.HTTPResponse'>
Following the example in the Python docs, see the output of this:
i = 0
with urlopen(url) as response:
    for line in response:
        line = line.decode('utf-8')
        if "Show more information about this SSL certificate" in line:
            i += 1
print(i)
# 1060
I suggest using Python 3.x. Hope this helps!
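If you must stay on Python 2.7, the SNIMissingWarning and InsecurePlatformWarning in the update point at the root cause: without SNI the server may present the wrong certificate. The urllib3 security documentation linked from those warnings recommends installing the pyOpenSSL extras so that SNI and a real SSLContext become available:

```shell
pip install pyopenssl ndg-httpsclient pyasn1
```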
