SSL version in httplib2 - EOF occurred in violation of protocol - python

I'm issuing an HTTPS GET request to a REST service I own with httplib2, but I'm getting the error:
[Errno 8] _ssl.c:504: EOF occurred in violation of protocol
All other clients work fine (browser, Java client, etc.), with the minor exception that PHP curl needed to be set to use SSLv3.
I've searched around and it seems that this is indeed an error regarding the SSL version, but I can't seem to find a way to change it in httplib2.
Is there any way around it besides changing the following lines in the source code?
# We should be specifying SSL version 3 or TLS v1, but the ssl module
# doesn't expose the necessary knobs. So we need to go with the default
# of SSLv23.
return ssl.wrap_socket(sock, keyfile=key_file, certfile=cert_file,
                       cert_reqs=cert_reqs, ca_certs=ca_certs)

I developed this workaround for httplib2:
import httplib2

# Start of the workaround for SSLv3
# This is a monkey patch / module function override
# to allow pages that only work with SSLv3.
# Build the appropriate socket wrapper for ssl.
try:
    import ssl  # python 2.6
    httplib2.ssl_SSLError = ssl.SSLError

    def _ssl_wrap_socket(sock, key_file, cert_file,
                         disable_validation, ca_certs):
        if disable_validation:
            cert_reqs = ssl.CERT_NONE
        else:
            cert_reqs = ssl.CERT_REQUIRED
        # Our fix for sites that only accept SSLv3
        try:
            # Try SSLv3 first
            tempsock = ssl.wrap_socket(sock, keyfile=key_file, certfile=cert_file,
                                       cert_reqs=cert_reqs, ca_certs=ca_certs,
                                       ssl_version=ssl.PROTOCOL_SSLv3)
        except ssl.SSLError, e:
            # Fall back to the default (SSLv23)
            tempsock = ssl.wrap_socket(sock, keyfile=key_file, certfile=cert_file,
                                       cert_reqs=cert_reqs, ca_certs=ca_certs,
                                       ssl_version=ssl.PROTOCOL_SSLv23)
        return tempsock

    httplib2._ssl_wrap_socket = _ssl_wrap_socket
except (AttributeError, ImportError):
    # No ssl module available: fall back to the old socket.ssl API.
    import socket
    import httplib
    httplib2.ssl_SSLError = None

    def _ssl_wrap_socket(sock, key_file, cert_file,
                         disable_validation, ca_certs):
        if not disable_validation:
            raise httplib2.CertificateValidationUnsupported(
                "SSL certificate validation is not supported without "
                "the ssl module installed. To avoid this error, install "
                "the ssl module, or explicitly disable validation.")
        ssl_sock = socket.ssl(sock, key_file, cert_file)
        return httplib.FakeSocket(sock, ssl_sock)

    httplib2._ssl_wrap_socket = _ssl_wrap_socket
# End of the workaround for SSLv3

if __name__ == "__main__":
    h1 = httplib2.Http()
    resp, content = h1.request("YOUR_SSL3_ONLY_LINK_HERE", "GET")
    print(content)
This workaround is based on the workarounds for urllib2 presented in this bug report: http://bugs.python.org/issue11220.
Update: I'm now presenting a solution for httplib2. I didn't notice you were using httplib2; I thought it was urllib2.

Please refer to another Stack Overflow thread that specifies a solution. The fix is to force the SSL version to TLSv1, as mentioned in the response by user favoretti within the provided link.
Hopefully this works.
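For completeness, here is a minimal, untested sketch of what forcing TLSv1 could look like, reusing the monkey-patch approach from the answer above; the only change is passing ssl_version=ssl.PROTOCOL_TLSv1 to ssl.wrap_socket, and the URL is a placeholder (httplib2 itself does not expose a parameter for this in that version):
# Untested sketch: force TLSv1 by patching httplib2's socket wrapper.
import ssl
import httplib2

def _tls1_wrap_socket(sock, key_file, cert_file, disable_validation, ca_certs):
    cert_reqs = ssl.CERT_NONE if disable_validation else ssl.CERT_REQUIRED
    return ssl.wrap_socket(sock, keyfile=key_file, certfile=cert_file,
                           cert_reqs=cert_reqs, ca_certs=ca_certs,
                           ssl_version=ssl.PROTOCOL_TLSv1)  # force TLSv1

httplib2._ssl_wrap_socket = _tls1_wrap_socket

h = httplib2.Http()
resp, content = h.request("https://example.org/", "GET")  # placeholder URL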

Related

SSL Error CERTIFICATE_VERIFY_FAILED with requests BUT NOT with urllib.request

If I try to use requests.get() to connect to an HTTPS server (a Jenkins), I get the SSL error CERTIFICATE_VERIFY_FAILED: certificate verify failed: unable to get local issuer certificate (_ssl.c:997).
HTTPS connections work fine if I use curl or any browser.
The HTTPS server is an internal server but uses an SSL cert from DigiCert. It is a wildcard certificate, and the same certificate is used for a lot of other servers (like IIS servers) in my company, which work fine together with requests.
If I use the urllib package, the HTTPS connection is also fine.
I don't understand why requests doesn't work, and I'm asking what I can do so that requests works.
And no! verify=False is not the solution ;-)
For the SSLContext in the second function I have to call the load_default_certs() method.
My system: Windows 10, Python 3.10, requests 2.28.1, urllib3 1.26.10, certifi 2022.6.15. Packages were installed today.
url = 'https://redmercury.acme.org/'

def use_requests(url):
    import requests
    try:
        r = requests.get(url)
        print(r)
    except Exception as e:
        print(e)

def use_magic_code_from_stackoverflow(url):
    import urllib.request
    import ssl
    ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # ssl_context.verify_mode = ssl.CERT_REQUIRED
    # ssl_context.check_hostname = True
    ssl_context.load_default_certs()  # WITHOUT this I got SSL error(s)
    # previous context
    https_handler = urllib.request.HTTPSHandler(context=ssl_context)
    opener = urllib.request.build_opener(https_handler)
    ret = opener.open(url, timeout=2)
    print(ret.status)

def use_urllib_requests(url):
    import urllib.request
    with urllib.request.urlopen(url) as response:
        print(response.status)

use_requests(url)                       # SSL error
use_magic_code_from_stackoverflow(url)  # server answers with 200
use_urllib_requests(url)                # server answers with 200
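One difference worth noting for the setup above: requests verifies against certifi's bundled CA file by default, while load_default_certs() also pulls in the Windows certificate store, where an internal or intermediate CA may live. A minimal sketch of pointing requests at an explicit CA bundle, assuming the missing chain has been exported to a local PEM file (the path below is a placeholder):
# Sketch: use an explicit CA bundle with requests instead of certifi's default.
import os
import requests

ca_bundle = "C:/certs/company-ca-bundle.pem"  # placeholder path to an exported CA chain

# Option 1: per request
r = requests.get("https://redmercury.acme.org/", verify=ca_bundle)
print(r.status_code)

# Option 2: via environment variable, picked up by requests automatically
os.environ["REQUESTS_CA_BUNDLE"] = ca_bundle
r = requests.get("https://redmercury.acme.org/")
print(r.status_code)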

Get a server certificate despite handshake failure in Python

I am writing a tool to monitor server certificate expiration. I'm using the Python 3 ssl and socket modules to get the server cert with a pretty basic method: create a default context, disable hostname validation and certificate verification, call SSLSocket.connect(), then SSLSocket.getpeercert(). The sole purpose is grabbing the server certificate, and that is all.
This is all within a private network and I am not concerned with validation.
I have some devices that require client certs signed by a private CA (which my tool doesn't have), so the handshake fails on SSLSocket.connect(), making SSLSocket.getpeercert() impossible.
I know that the server certificate is indeed being provided to my client (along with that pesky Certificate Request) during the handshake. I can see it in a packet capture, as well as just using the openssl s_client command line.
Here is my code.
def get_cert(self, host, port):
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with ctx.wrap_socket(socket.socket(), server_hostname=host) as s:
        s.settimeout(10)
        s.connect((host, port))
        binary_cert = s.getpeercert(True)
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_ASN1, binary_cert)
    pem_cert = OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert).decode()
    return pem_cert
Is there any way to get a little lower into the handshake messages to get the server cert, even though the handshake ultimately fails?
My current solution is to just run openssl s_client -connect host:port using subprocess.run() in the event of an ssl.SSLError.
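(For reference, that subprocess fallback might look roughly like the sketch below; the exact openssl flags and the PEM extraction are assumptions, not something from the question.)
# Sketch: shell out to openssl s_client as a fallback; flags and parsing are assumptions.
import subprocess

def get_cert_via_openssl(host, port):
    proc = subprocess.run(
        ["openssl", "s_client", "-connect", "%s:%d" % (host, port), "-showcerts"],
        input=b"", capture_output=True, timeout=15)
    out = proc.stdout.decode(errors="replace")
    # Keep only the first PEM block (the server certificate).
    start = out.find("-----BEGIN CERTIFICATE-----")
    end = out.find("-----END CERTIFICATE-----", start)
    if start == -1 or end == -1:
        return None
    return out[start:end + len("-----END CERTIFICATE-----")] + "\n"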
You may catch the exception that do_handshake() produces and then continue to process the server certificate.
import OpenSSL
import socket

dst = ('10.10.10.10', 443)
sock = socket.create_connection(dst)
context = OpenSSL.SSL.Context(OpenSSL.SSL.SSLv23_METHOD)
connection = OpenSSL.SSL.Connection(context, sock)
connection.set_connect_state()
try:
    connection.do_handshake()
except OpenSSL.SSL.Error:
    # The handshake fails (e.g. the server demands a client cert we don't have),
    # but the peer's certificate chain has already been received.
    print(connection.get_peer_cert_chain())
Tested on Python 2.7.17 and 3.8.5.
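Since the goal in the question is expiration monitoring, here is a small follow-up sketch for pulling the expiry out of that chain; get_notAfter() and has_expired() are pyOpenSSL's, and the timestamp parsing assumes the usual YYYYMMDDhhmmssZ format:
# Sketch: read the expiry of the leaf certificate from the chain obtained above.
from datetime import datetime

chain = connection.get_peer_cert_chain()
if chain:
    leaf = chain[0]  # first entry is the server (leaf) certificate
    not_after = leaf.get_notAfter().decode("ascii")  # e.g. b'20250101000000Z'
    expires = datetime.strptime(not_after, "%Y%m%d%H%M%SZ")
    print("expires:", expires, "expired:", leaf.has_expired())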
It looks like there's unfortunately no way to do it with Python's ssl module in versions < 3.10. In those versions, the only way to get the peer certificate that I can see is through the low-level _ssl.SSLSocket.getpeercert() method, and that immediately throws an exception if the handshake is not complete.
Since Python 3.10, there's a new _ssl.SSLSocket.get_unverified_chain() method that does not do the handshake check, so perhaps something like this abomination could work?
ssock = context.wrap_socket(sock, do_handshake_on_connect=False)
try:
    ssock.do_handshake()
except ssl.SSLError as e:
    pass
certs = ssock._sslobj._sslobj.get_unverified_chain()
... but I have not tested it.

Bypass SSL verification when using SUDS to consume a web service

I'm using SUDS to consume a web service. I tried the following:
client = Client(wsdl_url)
list_of_methods = [method for method in client.wsdl.services[0].ports[0].methods]
print(list_of_methods)
I got this error:
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)>
I saw this link, but it is only a solution for Python 2.7. How can I bypass SSL verification with SUDS? Or is there any non-Python solution (for example, adding a fake certificate in Windows)? I'm using Python 3 (so I have to use urllib instead of urllib2).
A suds client uses a subclass of suds.transport.Transport to process requests.
The default transport used is an instance of suds.transport.https.HttpAuthenticated, but you can override this when you instantiate the client by passing a transport keyword argument.
The http and https transports are implemented using urllib.request (or urllib2 for Python 2) by creating a URL opener. The list of handlers used to create this opener is retrieved by calling the u2handlers() method on the transport class. This means that you can create your own transport by subclassing the default and overriding that method to use an HTTPSHandler with a specific ssl context, e.g.:
from suds.client import Client
from suds.transport.https import HttpAuthenticated
from urllib.request import HTTPSHandler
import ssl

class CustomTransport(HttpAuthenticated):

    def u2handlers(self):
        # use handlers from superclass
        handlers = HttpAuthenticated.u2handlers(self)
        # create custom ssl context, e.g.:
        ctx = ssl.create_default_context(cafile="/path/to/ca-bundle.pem")
        # configure context as needed...
        ctx.check_hostname = False
        # add a https handler using the custom context
        handlers.append(HTTPSHandler(context=ctx))
        return handlers

# instantiate client using this transport
c = Client("https://example.org/service?wsdl", transport=CustomTransport())
This code worked for me:
from suds.client import Client
import ssl

if hasattr(ssl, '_create_unverified_context'):
    ssl._create_default_https_context = ssl._create_unverified_context

cli = Client('https://your_lik_to?wsdl')
print(cli)
You can add the code below before instantiating your suds client:
import ssl

try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    pass
else:
    ssl._create_default_https_context = _create_unverified_https_context
See my own website for details: https://lucasmarques.me/bypass-ssl
This is what I came up with that seems to work well:
import ssl
import urllib2

from suds.transport.https import HttpAuthenticated

class MyTransport(HttpAuthenticated):

    def u2handlers(self):
        """
        Get a collection of urllib handlers.
        @return: A list of handlers to be installed in the opener.
        @rtype: [Handler,...]
        """
        handlers = []
        context = ssl._create_unverified_context()
        handlers.append(urllib2.HTTPSHandler(context=context))
        return handlers
Cheers!
You can use https://pypi.python.org/pypi/suds_requests to leverage the requests library for the transport. This gives you the ability to disable SSL verification.
Or try my new SOAP library; it supports this out of the box: http://docs.python-zeep.org/en/latest/#transport-options
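For example, a minimal sketch with zeep, using a placeholder WSDL URL; verification is disabled on the requests session handed to zeep's Transport:
# Sketch: zeep client with certificate verification disabled via the requests session.
import requests
from zeep import Client
from zeep.transports import Transport

session = requests.Session()
session.verify = False  # skip SSL certificate verification

client = Client("https://example.org/service?wsdl",  # placeholder WSDL
                transport=Transport(session=session))
print(client.service)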
I use this:
import ssl
from unittest import mock  # or `import mock` on Python 2

from suds.client import Client

with mock.patch('ssl._create_default_https_context', ssl._create_unverified_context):
    client = Client(url)
See: https://bitbucket.org/jurko/suds/issues/78/allow-bypassing-ssl-certificate#comment-39029255

How to validate server's ssl certificate in python?

I have configured my server to serve only HTTPS, using a self-signed certificate. I have a client that has to validate the server's certificate and, after that, download a file from the server.
How do I implement the validation in the client? Is there any code example?
My question is similar to this one: How can the SSL client validate the server's certificate?
but despite the fine explanation there, I didn't find any help.
So far, in my code I create a directory and then I download the file with urllib2:
[...] # imports

def dir_creation(path):
    try:
        os.makedirs(path)
    except OSError as exception:
        if exception.errno != errno.EEXIST:
            raise

def file_download(url):
    ver_file = urllib2.urlopen(url)
    data = ver_file.read()
    with open(local_filename, "wb") as code:
        code.write(data)

dir_creation(path)
file_download(url)
Rather than configuring your server to present a self-signed certificate, you should use a self-signed certificate as a certificate authority to sign the server certificate. (How to do this is beyond the scope of your question, but I'm sure you can find help on Stack Overflow or elsewhere.)
Now you must configure your client to trust your certificate authority. In python (2.7.9 or later), you can do this using the ssl module:
import ssl
... # create socket
ctx = ssl.create_default_context(cafile=path_to_ca_certificate)
sslsock = ctx.wrap_socket(sock, server_hostname=server_hostname)  # hostname must match the server certificate
You can then transmit and read data on the secure socket. See the ssl module documentation for more explanation.
The urllib2 API is simpler:
import urllib2
resp = urllib2.urlopen(url, cafile=path_to_ca_certificate)
resp_body = resp.read()
If you wish to use Requests, according to the documentation you can supply a path to the CA certificate as the argument to the verify parameter:
resp = requests.get(url, verify=path_to_ca_certificate)
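Putting it together for the original goal (download a file over verified HTTPS), here is a minimal sketch using requests; url, local_filename, and path_to_ca_certificate are the placeholders from the question:
# Sketch: verified HTTPS download using requests; names are placeholders from the question.
import os
import errno
import requests

def dir_creation(path):
    try:
        os.makedirs(path)
    except OSError as exception:
        if exception.errno != errno.EEXIST:
            raise

def file_download(url, local_filename, path_to_ca_certificate):
    resp = requests.get(url, verify=path_to_ca_certificate)
    resp.raise_for_status()  # fail loudly on HTTP errors
    with open(local_filename, "wb") as out:
        out.write(resp.content)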

Using SocksiPy with SSL

I'm trying to use SocksIPy with ssl module (from stdlib) to grab a site's remote certificate but SocksIPy won't play with ssl.
The code below connects to check.torproject.org and reports that we are not using Tor (meaning SocksIPy is not working), which is bad.
I'm not sure SocksIPy is the best solution for this, but I haven't been able to find any other way to proxify a raw socket (or to get pycurl/urllib2 to use SOCKS proxies and give me SSL certs!).
To clarify, my issue is that the socket is not being proxied. I'd like to get the SSL certificate through a proxy of my choosing, and that's not happening.
It seems that right now I can have either the proxy or SSL, but not both. Help!
import socks
import ssl

s = socks.socksocket()
s.setproxy(socks.PROXY_TYPE_SOCKS5, "127.0.0.1", 9050)
ss = ssl.wrap_socket(s)
ss.connect(('check.torproject.org', 443))
ss.write("""GET / HTTP/1.0\r
Host: check.torproject.org\r\n\r\n""")
# print ss.getpeercert()
print ss.read(), ss.read(), ss.read()
ss.close()
I have tested this code while running tcpdump so it should work.
import socks
import ssl

s = socks.socksocket()
s.setproxy(socks.PROXY_TYPE_SOCKS5, "127.0.0.1", port=9050)
s.connect(('83.94.121.246', 443))
ss = ssl.wrap_socket(s)
print ss.send("hello")
ss.close()
I didn't review ssl.py, but I guess you have to call connect() on the socks object and not the ssl object.
Put ssl.wrap_socket below connect. It doesn't work properly otherwise.
Use validation and a CA certfile. Getting the certificate from the server requires creating the SSL object with validation turned on and giving it a CA certificates file. If you can't find one on your system, you can download the one provided by the cURL project (based on Mozilla's) as a local file: http://curl.haxx.se/docs/caextract.html
Note: the SocksIPy project hasn't been updated in quite a while and doesn't support Python 3.
Fixed version of original code:
import socks
import ssl

s = socks.socksocket()
s.setproxy(socks.PROXY_TYPE_SOCKS5, "127.0.0.1", port=9050)
s.connect(('check.torproject.org', 443))
ss = ssl.wrap_socket(s, cert_reqs=ssl.CERT_REQUIRED, ca_certs="cacert.pem")
print "Peer cert: ", ss.getpeercert()
ss.write("""GET / HTTP/1.0\r\nHost: check.torproject.org\r\n\r\n""")
content = []
while True:
    data = ss.read()
    if not data:
        break
    content.append(data)
ss.close()
content = "".join(content)
assert "This browser is configured to use Tor" in content
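Since the note above says SocksIPy doesn't support Python 3, here is a minimal Python 3 sketch of the same idea using the PySocks fork and an ssl.SSLContext; PySocks, its set_proxy() call, and the cacert.pem path are assumptions beyond the original answer:
# Sketch: Python 3 equivalent using the PySocks fork (pip install PySocks).
import socks  # PySocks
import ssl

ctx = ssl.create_default_context(cafile="cacert.pem")  # CA bundle path is an assumption

s = socks.socksocket()
s.set_proxy(socks.SOCKS5, "127.0.0.1", 9050)  # Tor's default SOCKS port
s.connect(("check.torproject.org", 443))      # connect first, then wrap

ss = ctx.wrap_socket(s, server_hostname="check.torproject.org")
print("Peer cert:", ss.getpeercert())
ss.sendall(b"GET / HTTP/1.0\r\nHost: check.torproject.org\r\n\r\n")
print(ss.recv(4096).decode(errors="replace"))
ss.close()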
