Python: SSLError, bad handshake, Unexpected EOF

I have an issue connecting to a specific site using Python requests and get this error:
HTTPSConnectionPool(host='XXXXXXXXX', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))
How can I work around this? (Setting verify=False makes no difference.) I suspect the server is at fault here, as it gets an overall rating of F when I run the SSL Labs test against it.
I am fairly new to Python and requests.
My code:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=True)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)

I faced the same error, and my troubles went away after doing pip install ndg-httpsclient. yum install python-ndg_httpsclient or apt-get install python-ndg-httpsclient (or apt-get install python3-ndg-httpsclient) probably works too.
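For context on why this helps: ndg-httpsclient (together with pyOpenSSL and pyasn1) lets urllib3 do its TLS through pyOpenSSL, which adds SNI support on older Python builds. A hedged sketch of wiring that in explicitly, assuming an older requests release that still vendors urllib3 under requests.packages:

import requests
from requests.packages.urllib3.contrib import pyopenssl

# Swap urllib3's stdlib-ssl machinery for pyOpenSSL; this is what
# installing ndg-httpsclient/pyOpenSSL makes possible, and it adds
# SNI support on old interpreters.
pyopenssl.inject_into_urllib3()

print(requests.get('https://XXXXXXXXXX', verify=True))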

As mentioned in another answer (https://stackoverflow.com/a/36499718/1657819), this error may happen if you're behind a proxy, so disabling the proxy may help:
unset https_proxy
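If changing the shell environment is not an option, requests can also be told to ignore proxy-related environment variables per session via the trust_env flag; a short sketch:

import requests

session = requests.Session()
# Skip environment settings such as http_proxy/https_proxy
# (note this also makes requests ignore REQUESTS_CA_BUNDLE and .netrc).
session.trust_env = False
print(session.get('https://XXXXXXXXXX'))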

The root cause might be this open bug in the requests library: "Session.verify=False ignored when REQUESTS_CA_BUNDLE environment variable is set".
We saw similar issues start all of a sudden on a specific host. It turned out the environment variable had recently been set there, which caused new sessions to behave as if session.verify were not False despite being initialized to False. Once we removed the REQUESTS_CA_BUNDLE environment variable, the errors stopped.
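Until you can rely on a fixed requests, a defensive workaround is to clear the variable for the current process before building the session; a minimal sketch:

import os
import requests

# Remove REQUESTS_CA_BUNDLE for this process only, so the buggy
# interaction with session.verify cannot kick in (the parent
# shell's environment is untouched).
os.environ.pop('REQUESTS_CA_BUNDLE', None)

session = requests.Session()
session.verify = False
print(session.get('https://XXXXXXXXXX'))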

Set verify=False; it will help:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=False)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)
or try with urllib:
import urllib.request
import urllib.error
import ssl

# Ignore SSL certificate errors
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

url = 'https://XXXXXXXXXX'
try:
    site = urllib.request.urlopen(url, context=ctx)
    print(site.read().decode())
except urllib.error.URLError as e:
    print(e)
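Note that with verify=False, requests emits an InsecureRequestWarning on every call. If you accept the risk and want a quiet log, urllib3 provides a switch for exactly this warning:

import urllib3
import requests

# Acknowledge that certificate verification is deliberately off
# and silence the per-request InsecureRequestWarning.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

print(requests.get('https://XXXXXXXXXX', verify=False))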

Related

Python requests.get "unable to get local issuer certificate" - but self-signed cert is installed (Windows)

I’m writing a Python script that will monitor our Tesla PowerWall Gateway, but am stuck on this SSL problem.
HTTPSConnectionPool(host='powerwall', port=443): Max retries exceeded with url: /api/system_status/soe (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)')))
import json
import os
import requests
import sys
from requests.auth import HTTPDigestAuth

if __name__ == "__main__":
    scriptPath = os.path.split(os.path.abspath(__file__))[0]  # Where am I, and the local copy of the cert
    #scriptPath = os.path.split(requests.certs.where(),)[0]   # Where is requests looking for certs?
    cert = os.path.join(scriptPath, 'PW2.pem')
    #os.environ['REQUESTS_CA_BUNDLE'] = cert
    #os.environ['REQUESTS_CA_BUNDLE'] = scriptPath
    try:
        response = None
        query = "https://powerwall/api/system_status/soe"
        with requests.Session() as session:
            session.auth = HTTPDigestAuth('myEmail', 'PW_PWD')
            session.timeout = 20
            session.verify = True
            #session.verify = cert
            #session.load_cert_chain = "PW2.pem"
            #session.load_cert_chain = cert
            response = session.get(query)
    except Exception as e:
        print(str(e))
Despite all I’ve tried I still can’t get past this error. Yes, setting verify=False is an obvious work-around, but I’m trying to do this the ‘right’ way.
Setup:
Windows 10 PC
Python 3.8.2
I’ve downloaded the certificate from the Gateway and added it to the Local Machine store on my PC, in the Trusted Root Certification Authorities folder.
Windows can open it OK, showing the various SANs, including “powerwall”, which is how I’m addressing it in my call to requests.get. That says to me the integrity of the cert is good. (Its 'intended purposes' are Server Authentication & Client Authentication.)
I’ve installed python-certifi-win32, then later uninstalled it and installed pip-system-certs as per this SO answer to no avail.
I’ve added the PW’s cert to cacert.pem in the folder returned by requests.certs.where(): C:\Python38\lib\site-packages\certifi\cacert.pem
The commented-out lines are variations I've tried along the way.
The requests documentation mentions this issue: "For example: Self-signed SSL certificates specified in REQUESTS_CA_BUNDLE will not be taken into account." It also suggests a way around it, but that wasn't successful either.
What have I missed?
Please don’t tell me it’s the 2047 expiry date of the cert…
TIA.
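One diagnostic worth running before anything else is to pull the certificate the Gateway actually serves straight off the socket with the standard library, and compare it byte for byte with the downloaded PW2.pem; a small sketch, assuming the host name powerwall resolves as in the question:

import ssl

# Fetch the server's leaf certificate as PEM without verifying it,
# so it can be diffed against the local PW2.pem copy.
pem = ssl.get_server_certificate(('powerwall', 443))
print(pem)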

SSL Error CERTIFICATE_VERIFY_FAILED with requests BUT NOT with urllib.request

If I try to use requests.get() to connect to an HTTPS server (a Jenkins), I get the SSL error CERTIFICATE_VERIFY_FAILED certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
HTTPS connections work fine if I use curl or any browser.
The HTTPS server is an internal server but uses an SSL cert from DigiCert. It is a wildcard certificate, and the same certificate is used for a lot of other servers (like IIS servers) in my company, which work fine with requests.
If I use the urllib package, the HTTPS connection is also fine.
I don't understand why requests doesn't work, and I'm asking what I can do to make requests work.
And no! verify=False is not the solution ;-)
For the SSLContext in the second function I have to call the load_default_certs() method.
My system: Windows 10, Python 3.10, requests 2.28.1, urllib3 1.26.10, certifi 2022.6.15. Packages were installed today.
url = 'https://redmercury.acme.org/'

def use_requests(url):
    import requests
    try:
        r = requests.get(url)
        print(r)
    except Exception as e:
        print(e)

def use_magic_code_from_stackoverflow(url):
    import urllib.request
    import ssl
    ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # ssl_context.verify_mode = ssl.CERT_REQUIRED
    # ssl_context.check_hostname = True
    ssl_context.load_default_certs()  # WITHOUT this I got SSL error(s)
    https_handler = urllib.request.HTTPSHandler(context=ssl_context)
    opener = urllib.request.build_opener(https_handler)
    ret = opener.open(url, timeout=2)
    print(ret.status)

def use_urllib_requests(url):
    import urllib.request
    with urllib.request.urlopen(url) as response:
        print(response.status)

use_requests(url)                       # SSL error
use_magic_code_from_stackoverflow(url)  # server answers with 200
use_urllib_requests(url)                # server answers with 200
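A likely explanation for the difference: requests verifies against certifi's bundled CA file, while an SSLContext with load_default_certs() on Windows also pulls in the system certificate store, where the DigiCert chain (or a corporate intermediate) presumably lives. One way to make requests use the operating-system store on Python 3.10+ is the third-party truststore package (pip install truststore); treat this as a hedged sketch rather than the canonical fix:

import truststore

# Patch ssl.SSLContext globally so that requests (and anything else
# built on the ssl module) verifies against the OS trust store
# instead of certifi's bundle.
truststore.inject_into_ssl()

import requests
r = requests.get('https://redmercury.acme.org/')
print(r)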

Python Requests ProxyError not caught

Why would the proxy error not be caught by the first except clause? I don't quite understand why it falls through to the second clause (or, if I remove the second clause, why it just raises the error).
from requests.exceptions import ProxyError

try:
    login(acc)
except ProxyError:
    pass
except Exception as e:
    print e
Output:
HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: /mail (Caused by ProxyError('Cannot connect to proxy.', error('Tunnel connection failed: 403 Forbidden',)))
You've hit a bit of an edge case here. The ProxyError exception is not actually the requests.exceptions exception; it is an exception with the same name from the embedded urllib3 library, and it is wrapped in a MaxRetryError exception.
This is really a bug, and was indeed filed as such a while ago; see issue #3050. It was fixed with this pull request to raise the proper requests.exceptions.ProxyError exception instead. The fix was released as part of requests 2.9.2.
Normally, requests unwraps the MaxRetryError exception for you, but not for this specific exception. If you can't upgrade to 2.9.2 or newer, you can catch it specifically (unwrapping two layers now):
from requests.exceptions import ConnectionError
from requests.packages.urllib3.exceptions import MaxRetryError
from requests.packages.urllib3.exceptions import ProxyError as urllib3_ProxyError

try:
    # ...
except ConnectionError as ce:
    if (isinstance(ce.args[0], MaxRetryError) and
            isinstance(ce.args[0].reason, urllib3_ProxyError)):
        # oops, requests should have handled this, but didn't.
        # see https://github.com/kennethreitz/requests/issues/3050
        pass
or apply the change from the pull request to your local install of requests.

Python requests.get raises ConnectionError for HTTPS url

I'm trying this simple Python 2.7 code:
import requests
response = requests.get(url="https://sslbl.abuse.ch", verify=False)
print response
I'm using verify=False to skip SSL certificate verification.
I'm getting the following exception:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='sslbl.abuse.ch', port=443): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10054] An existing connection was forcibly closed by the remote host
If I try another HTTPS URL (like twitter.com), everything is OK.
What could the problem be? How can I get the response the way a browser does?
UPDATE:
After upgrading requests I get the same ConnectionError, but some warnings were added:
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
I do not use Python 2.7 for my tasks, but I tried to open the URL you provided with Python 3.2 (it should work for all Python 3.x, I assume). No exception was raised. This is what I did (>>> are omitted):
from urllib.request import urlopen
url = "https://sslbl.abuse.ch"
response = urlopen(url)
type(response)
<class 'http.client.HTTPResponse'>
From the Python docs, see the output of this:
i = 0
with urlopen(url) as response:
    for line in response:
        line = line.decode('utf-8')
        if "Show more information about this SSL certificate" in line:
            i += 1
print(i)
1060
I suggest using Python 3.x. Hope this helps!
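As a side note on the update above: the SNIMissingWarning is likely the real clue. sslbl.abuse.ch appears to require SNI, which old Python 2 builds lack, while other sites may tolerate its absence. You can check whether your interpreter supports it directly (the flag exists on Python 2.7.9+ and all Python 3); a tiny sketch:

import ssl

# True if this build of Python supports the TLS SNI extension;
# servers that require SNI may abort the handshake when it is missing.
print(ssl.HAS_SNI)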

Python urllib2 giving "network unreachable error" if the URL is https

I am trying to fetch some URLs using the urllib2 library.
a = urllib2.urlopen("http://www.google.com")
ret = a.read()
The code above works fine and gives the expected result. But when I make the URL https, it gives a "network unreachable" error:
a = urllib2.urlopen("https://www.google.com")
urllib2.URLError: <urlopen error [Errno 101] Network is unreachable>
Is there any problem with SSL? My Python version is 2.6.5. I am also behind an academic proxy server; I have the settings in my bash file. Anyway, since the http URL opens, the proxy shouldn't be the problem here.
Normally the issue in cases like this is that the proxy you are behind has an out-of-date or untrusted SSL certificate. urllib is fussier than most browsers when it comes to SSL, which is why you might be getting this error.
The http URL didn't give an error because the http_proxy variable was already set. Setting https_proxy makes the error disappear:
export http_proxy="http://{proxy-address}"
Set the same thing for https_proxy:
export https_proxy="http://{proxy-address}"
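Alternatively, the proxy can be configured inside Python instead of through the shell. A minimal sketch using urllib2's own proxy support, with {proxy-address} being the same placeholder as above:

import urllib2

# Route both plain and TLS traffic through the academic proxy
# without relying on http_proxy/https_proxy environment variables.
proxy = urllib2.ProxyHandler({
    'http': 'http://{proxy-address}',
    'https': 'http://{proxy-address}',
})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)  # make it the default for urlopen()

a = urllib2.urlopen("https://www.google.com")
print a.getcode()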
