I'm trying this simple Python 2.7 code:
import requests
response = requests.get(url="https://sslbl.abuse.ch", verify=False)
print response
I'm using verify=False in order to ignore verifying the SSL certificate.
I'm getting the following exception:
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='sslbl.abuse.ch', port=443): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 10054] An existing connection was forcibly closed by the remote host
If I try another HTTPS URL (like twitter.com), everything is OK.
What could be the problem? How can I get the response like a browser does?
UPDATE:
After upgrading the requests version I get the same ConnectionError, but some warnings were added:
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
C:\Python27\lib\site-packages\requests\packages\urllib3\util\ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
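Those two warnings point at a likely cause: this Python 2.7 build has no SNI support, which may be why the server drops the connection. A commonly suggested workaround, sketched here under the assumption that pyOpenSSL, ndg-httpsclient and pyasn1 are installed (e.g. via pip install requests[security]), is to let urllib3 use pyOpenSSL:
import requests

try:
    # Older requests versions bundle urllib3 under requests.packages
    from requests.packages.urllib3.contrib import pyopenssl
except ImportError:
    from urllib3.contrib import pyopenssl

pyopenssl.inject_into_urllib3()  # adds SNI support on older Python/OpenSSL builds

response = requests.get("https://sslbl.abuse.ch", verify=False)
print(response)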
I do not use Python 2.7 for my tasks, but I tried to open the URL you provided with Python 3.2 (it should work for all Python 3.x, I assume). There was no exception raised. This is what I did (>>> are omitted):
from urllib.request import urlopen
url = "https://sslbl.abuse.ch"
response = urlopen(url)
type(response)
<class 'http.client.HTTPResponse'>
From the Python docs, see the output of this:
i = 0
with urlopen(url) as response:
    for line in response:
        line = line.decode('utf-8')
        if "Show more information about this SSL certificate" in line:
            i += 1
print(i)
1060
I suggest using Python 3.x. Hope this helps!
If I try to use requests.get() to connect to an HTTPS server (a Jenkins), I get this SSL error: CERTIFICATE_VERIFY_FAILED certificate verify failed: unable to get local issuer certificate (_ssl.c:997)'))
HTTPS connections work fine if I use curl or any browser.
The HTTPS server is an internal server but uses an SSL cert from DigiCert. It is a wildcard certificate, and the same certificate is used for a lot of other servers (like IIS servers) in my company, which work fine together with requests.
If I use the urllib package, the HTTPS connection is also fine.
I don't understand why requests doesn't work, and I'm asking what I can do to make requests work.
And no! verify=False is not the solution ;-)
For the SSLContext in the second function I have to call the load_default_certs() method.
My system: Windows 10, Python 3.10, requests 2.28.1, urllib3 1.26.10, certifi 2022.6.15. Packages are installed today.
url = 'https://redmercury.acme.org/'

def use_requests(url):
    import requests
    try:
        r = requests.get(url)
        print(r)
    except Exception as e:
        print(e)

def use_magic_code_from_stackoverflow(url):
    import urllib.request
    import ssl
    ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # ssl_context.verify_mode = ssl.CERT_REQUIRED
    # ssl_context.check_hostname = True
    ssl_context.load_default_certs()  # WITHOUT this I got SSL error(s)
    # previous context
    https_handler = urllib.request.HTTPSHandler(context=ssl_context)
    opener = urllib.request.build_opener(https_handler)
    ret = opener.open(url, timeout=2)
    print(ret.status)

def use_urllib_requests(url):
    import urllib.request
    with urllib.request.urlopen(url) as response:
        print(response.status)

use_requests(url)                       # SSL error
use_magic_code_from_stackoverflow(url)  # server answers with 200
use_urllib_requests(url)                # server answers with 200
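One plausible explanation, judging from the behavior above: load_default_certs() pulls in the Windows certificate store, while requests trusts only certifi's bundle, which does not help when the server sends an incomplete chain. Two hedged sketches that keep verification enabled; the bundle path and the third-party truststore package are my assumptions, not part of the original setup:
import requests

# Option 1: point requests at a CA bundle that contains the full DigiCert
# chain (intermediates + root). The path below is a placeholder.
r = requests.get('https://redmercury.acme.org/', verify=r'C:\certs\digicert-chain.pem')
print(r.status_code)

# Option 2 (Python 3.10+): let requests use the Windows certificate store via
# the third-party `truststore` package (pip install truststore).
import truststore
truststore.inject_into_ssl()
r = requests.get('https://redmercury.acme.org/')
print(r.status_code)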
I'm currently working on using zeep as a client binding to an application that we do not control (so we cannot change its behavior).
Unfortunately for me, the WSDL is hosted on an https:// page, while the binding itself ONLY supports HTTP, so I cannot simply change the binding address to HTTPS to make this work.
When first creating the zeep Client object, I assume this becomes a Python requests prepared request, which is now forced to accept only SSL.
Question: Is there a way to tell zeep or Python requests that the next response won't be HTTPS?
Example:
from requests import Session
from zeep import Client
from zeep.transports import Transport
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
session = Session()
session.verify = False
transport = Transport(session=session)
client = Client('https://example.local:8443/www/core-service/services/LoginService?wsdl', transport=transport)
with client.settings(raw_response=True):
    print(client.service.login('0', 'user', 'password'))
This would return this error because the next call is towards an http address:
requests.exceptions.SSLError: HTTPSConnectionPool(host='localhost', port=9090): Max retries exceeded with url: /core-service/services/LoginService (Caused by SSLError(SSLError(1, '[SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:877)'),))
You can set the force_https setting to False in order to avoid the HTTPS forcing.
https://python-zeep.readthedocs.io/en/master/settings.html#settings
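A minimal sketch of what that might look like with the code from the question, assuming a zeep version that exposes zeep.Settings as described in the linked docs:
from requests import Session
from zeep import Client, Settings
from zeep.transports import Transport

session = Session()
session.verify = False

settings = Settings(force_https=False)  # keep http:// binding addresses as-is
transport = Transport(session=session)
client = Client(
    'https://example.local:8443/www/core-service/services/LoginService?wsdl',
    transport=transport,
    settings=settings,
)

with client.settings(raw_response=True):
    print(client.service.login('0', 'user', 'password'))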
I'm using urllib3 to get the HTML content of a website via a SOCKS proxy.
Unfortunately this does not work for every website; running it on a few sites gives me an error.
import urllib3
from urllib3.contrib.socks import SOCKSProxyManager
proxy = SOCKSProxyManager('socks5://myProxyIP:8080')
r = proxy.request('GET', "https://urlofwebsite", preload_content=False)
urllib3.exceptions.MaxRetryError: SOCKSHTTPSConnectionPool(host='urlofwebsite', port=443): Max retries exceeded with url: /index.php (Caused by SSLError(SSLError(1, u'[SSL: UNSUPPORTED_PROTOCOL] unsupported protocol (_ssl.c:727)'),))
I assume this might be an issue with the SSL (or TLS) version the site is using, but as I am not the owner of the server, I have to edit my script to handle this.
Is it possible to change the setting in urllib3 to accept this connection?
Thanks!
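If it really is a protocol-version mismatch, one hedged option (a sketch, not tested against that server) is to pass a specific ssl_version through SOCKSProxyManager's connection-pool keywords, which urllib3 forwards to the HTTPS connection pool:
import ssl
from urllib3.contrib.socks import SOCKSProxyManager

# Same placeholder proxy and URL as above; PROTOCOL_TLSv1 is only an example,
# adjust it to whatever the server actually supports.
proxy = SOCKSProxyManager('socks5://myProxyIP:8080', ssl_version=ssl.PROTOCOL_TLSv1)
r = proxy.request('GET', 'https://urlofwebsite', preload_content=False)
print(r.status)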
I use python-requests to talk to HTTPS web services, some of which present incomplete certificate X509 chains. I'm having trouble figuring out how to access the invalid/incomplete certificates in order to explain the error to the user.
Here's an example illustrated by https://ssllabs.com/ssltest, where the server sends only the leaf certificate, and not the intermediate certificate which is necessary for validation, but missing from certifi's root CA store:
When I try to connect with python-requests, I get an exception that isn't very useful:
requests.get('https://web.service.com/path')
SSLError: HTTPSConnectionPool(host='web.service.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
Obviously, I can use separate tools to figure out what's wrong in any particular case (e.g. gnutls-cli, openssl s_client, SSLLabs, etc.).
However, what I really want to be able to do is to be able to catch and diagnose the problem with the certificate chain in my Python code, so that I can present a more specific error message to the user. This answer suggests a monkey-patch to the response object; it's not particularly elegant, but it works—though only when the response object is returned successfully, and not in the case of an exception.
What are the cleanest ways to instrument requests to save the peer's certificate chain in the exception object returned when requests fails to validate the certificate chain itself?
Take requests.get("https://151.101.1.69") (Stack Overflow's IP) as an example:
import requests

try:
    requests.get("https://151.101.1.69")
except requests.exceptions.SSLError as e:
    cert = e.args[0].reason.args[0]._peer_cert
Then cert is a dict containing the peer's certificate. As I'm not very familiar with SSL, I don't know if it is enough for your case.
BTW, in this case the error is "hostname '151.101.1.69' doesn't match either of '*.stackexchange.com', ...omitted. I'm not sure about the structure of the exception in your real case, so you may need to find it on your own. I think it should have the same _peer_cert attribute.
Update
The above method doesn't work when the handshake fails... but it can still be done:
import requests

try:
    requests.get("https://fpslinux1.finalphasesystems.com/")
except requests.exceptions.SSLError:
    import ssl
    import OpenSSL
    cert = ssl.get_server_certificate(('fpslinux1.finalphasesystems.com', 443))
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, cert)
    print(cert.get_issuer())
    print(cert.get_subject().get_components())
Yes, it is a little dirty, but I don't have a better method, as an ssl socket doesn't even return invalid certs from the C level :/
To use OpenSSL, you need to install pyopenssl.
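If you need more than the leaf certificate, a hedged sketch of my own (not part of the answer above) is to ask pyOpenSSL for the whole chain the server presents; verification is deliberately off here because the goal is diagnosis, not a trusted connection:
import socket
from OpenSSL import SSL

def get_peer_cert_chain(host, port=443):
    # TLS_CLIENT_METHOD needs a reasonably recent pyOpenSSL; older versions
    # can use SSL.SSLv23_METHOD instead.
    ctx = SSL.Context(SSL.TLS_CLIENT_METHOD)
    ctx.set_verify(SSL.VERIFY_NONE, lambda *args: True)  # diagnosis only
    sock = socket.create_connection((host, port))
    conn = SSL.Connection(ctx, sock)
    conn.set_tlsext_host_name(host.encode())  # send SNI
    conn.set_connect_state()
    conn.do_handshake()
    chain = conn.get_peer_cert_chain()
    conn.close()
    sock.close()
    return chain

for cert in get_peer_cert_chain('fpslinux1.finalphasesystems.com'):
    print(cert.get_subject(), '<-', cert.get_issuer())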
I have an issue with connecting to a specific site using Python requests and get this error:
HTTPSConnectionPool(host='XXXXXXXXX', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')",),))
How can I work around this? (Setting verify=False does not make a difference.) I suspect the server is at fault here, as it gets an overall rating of F at ssllabs when I run their test.
I am fairly new to Python and requests.
My code:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=True)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)
Faced with the same error, my troubles went away after doing pip install ndg-httpsclient. yum install python-ndg_httpsclient or apt-get install python-ndg-httpsclient (or apt-get install python3-ndg-httpsclient) probably works too.
As mentioned in another question (https://stackoverflow.com/a/36499718/1657819), this error may happen if you're behind a proxy, so disabling the proxy may help:
unset https_proxy
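If unsetting the variable globally is not an option, a hedged requests-level alternative (a sketch only) is to have the session ignore environment settings, proxies included:
import requests

session = requests.Session()
session.trust_env = False  # ignore http_proxy/https_proxy and related env vars
response = session.get('https://XXXXXXXXXX')
print(response.status_code)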
The root cause might be this open bug in the requests library: "Session.verify=False ignored when REQUESTS_CA_BUNDLE environment variable is set".
We've seen similar issues start all of a sudden on a specific host. It turned out that the env variable had been set there recently, which caused requests sessions to run with session.verify not equal to False despite being initialized to False. Once we removed the REQUESTS_CA_BUNDLE environment variable, the errors stopped.
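For completeness, a small sketch of clearing that variable from Python before building the session, assuming you really do intend session.verify = False for this call:
import os
import requests

# If the override is present, drop it for this process so session.verify wins.
os.environ.pop('REQUESTS_CA_BUNDLE', None)

session = requests.Session()
session.verify = False
print(session.get('https://XXXXXXXXXX').status_code)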
Set verify=False; it will help:
import requests

try:
    site = requests.get('https://XXXXXXXXXX', verify=False)
    print(site)
except requests.exceptions.RequestException as e:
    print(e)
or try with urllib:
import urllib.request
import urllib.error
import ssl

url = 'https://XXXXXXXXXX'

# Ignore SSL certificate errors
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

try:
    site = urllib.request.urlopen(url, context=ctx)
    print(site.read().decode())
except urllib.error.URLError as e:
    print(e)