I'm trying to use urllib.request.urlopen on a website whose URL starts with "https". The error output is:
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
There are many great threads covering this error, including this one, which mentions the SSL Labs rating. I am able to use urllib.request.urlopen on every other "https" site I have tested.
SSL Labs shows the following output:
Key RSA 2048 bits (e 65537)
Issuer Let's Encrypt Authority X3
AIA: http://cert.int-x3.letsencrypt.org/
Signature algorithm SHA256withRSA
Extended Validation No
Certificate Transparency No
OCSP Must Staple No
Revocation information OCSP
Revocation status Good (not revoked)
DNS CAA No (more info)
Trusted Yes
To clarify, my question is: is there a solution for completing the handshake, that doesn't include bypassing the certificate verification? And if there is a solution, can it be solved entirely inside a python script on linux, macOS and Windows?
I cannot answer this question for urllib, but I was able to overcome this problem using Python requests instead. Note this will only work if there is a trusted certificate chain for the website in question but the server is perhaps missing a root or intermediate certificate.
Using the SSL Labs server test (linked here), run the test and scroll down to "Certification Paths". If it is the case that there are trusted cert paths but the server is for some reason not providing the complete chain, you can download the full trusted path there as text. Copy the full certificate chain, save it as a .pem file, and pass the path of this file to the requests function:
r = requests.get(url, verify="path/to/chain.pem")
The requests module can throw all sorts of SSL-related certification failures, many of which will be server-side problems, and you really want to avoid disabling SSL verification. This solution is only for the somewhat rare case where a full trusted chain exists but the issuer or server in question has sloppily omitted a root or intermediate cert.
You can work around this by adding your missing intermediate certificate to the SSL context's trust store:
import ssl
import urllib.request

cert_text = '''
-----BEGIN CERTIFICATE-----
...put your actual certificate text here...
-----END CERTIFICATE-----
'''

context = ssl.create_default_context()  # load default trusted certificates
# fill cert_text out depending on which specific intermediate cert you're
# missing, then add it to the context's trust store
context.load_verify_locations(cadata=cert_text)

# 'site' is the https URL that was failing before
urllib.request.urlopen(site, context=context)
Note that if you only need to talk to the one server for which you're doing this, you could just pass an appropriate cafile or capath argument to create_default_context().
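As a sketch of that alternative (the chain file path here is hypothetical), you can make the downloaded chain the only trust anchor for the connection:

```python
import ssl
import urllib.request

def open_with_custom_ca(url, chain_path):
    """Open a URL trusting only the certificates in chain_path (a PEM file).

    chain_path is a hypothetical path to the full chain you downloaded;
    create_default_context(cafile=...) makes it the sole set of trust anchors.
    """
    context = ssl.create_default_context(cafile=chain_path)
    return urllib.request.urlopen(url, context=context)
```

This is the simpler choice when the script only ever talks to that one server, since nothing is added to the process-wide defaults.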
I was able to solve this issue (for Debian-based systems; I'm running Debian 9). I still need to test solutions on macOS and Windows.
On the SSL Labs report, under the "Certification Paths" header, it showed:
1 Sent by server www.exampleSITE.com
BLAH BLAH BLAH
2 Extra download Let's Encrypt Authority X3
BLAH BLAH BLAH
3 In trust store DST Root CA X3 Self-signed
BLAH BLAH BLAH
I navigated to /etc/ssl/certs/ and noticed there were no Let's Encrypt certificates present. I then downloaded the .pem and rehashed:
cd /etc/ssl/certs
sudo wget https://letsencrypt.org/certs/lets-encrypt-x3-cross-signed.pem
sudo c_rehash
Then I tested the Python line that was giving me an error earlier:
page = urllib.request.urlopen('https://www.exampleSITE.com').read()
and it successfully retrieved the page.
I am using the Python requests module to hit a REST API. I have to use SSL for security measures.
I see that I can set
requests.get(url, verify='/path/ca/bundle/')
However, I am confused as to what needs to be passed as the CA_BUNDLE?
I get the server certificate using
cert = ssl.get_server_certificate((server, port))
Can someone let me know how I should use this certificate in my request? Should I convert the cert to an X509/.pem/.der/.crt file?
Solved it. Apparently I needed to get the entire certificate chain and create a CA bundle out of it.
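For anyone else landing here: creating the bundle is just concatenating the PEM blocks, leaf to root, into one file. A minimal sketch (all file names are hypothetical):

```python
def build_ca_bundle(cert_paths, bundle_path="chain.pem"):
    """Concatenate PEM certificate files (leaf to root) into one CA bundle
    suitable for requests' verify= argument. Paths here are hypothetical."""
    with open(bundle_path, "w") as bundle:
        for path in cert_paths:
            with open(path) as cert_file:
                # one certificate block per line group, newline-separated
                bundle.write(cert_file.read().rstrip() + "\n")
    return bundle_path

# usage sketch:
# requests.get(url, verify=build_ca_bundle(["intermediate.pem", "root.pem"]))
```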
I am very new to python and cannot seem to figure out how to accomplish this task. I want to connect to a website and extract the certificate information such as issuer and expiration dates.
I have looked all over and tried all kinds of steps, but because I am new I am getting lost in the sockets, wrappers, etc.
To make matters worse, I am in a proxy environment and it seems to really complicate things.
Does anyone know how I could connect and extract the information while behind the proxy?
As explained in this answer:
You can still get the server certificate with the ssl.get_server_certificate() function, but it returns it in PEM format.
import ssl
print(ssl.get_server_certificate(('server.test.com', 443)))
From here, I would use M2Crypto or OpenSSL to read the cert and get values:
# M2Crypto
import ssl
import M2Crypto

cert = ssl.get_server_certificate(('www.google.com', 443))
x509 = M2Crypto.X509.load_cert_string(cert)
x509.get_subject().as_text()
# 'C=US, ST=California, L=Mountain View, O=Google Inc, CN=www.google.com'
The Python ssl library doesn't deal with proxies, though.
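If you'd rather avoid the extra M2Crypto dependency, the standard library can already give you the issuer and validity dates: getpeercert() returns the certificate as a parsed dict after a verified handshake. A sketch (the host is a placeholder, and note this connects directly, so it does not solve the proxy part):

```python
import socket
import ssl

def get_cert_info(host, port=443):
    """Return (issuer, notBefore, notAfter) for a server's certificate.

    Uses the default verifying context, so the chain must validate.
    This opens a direct connection and does not tunnel through a proxy.
    """
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # parsed dict, not raw PEM
    # cert["issuer"] is a tuple of RDNs, each a tuple of (name, value) pairs
    issuer = {name: value for rdn in cert["issuer"] for (name, value) in rdn}
    return issuer, cert["notBefore"], cert["notAfter"]
```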
I am developing the client- and server-side of a Python3 application. They must communicate over TLS using self-signed certs.
The connection should always be established, even if both have never seen the other, thus neither has the other's cert in its trust store. Verification shall happen after the handshake with a custom method.
However, Python's ssl library attempts to verify the certificate during handshake and this fails if the incoming cert is unknown and has no valid certificate chain. Setting verify_mode to CERT_NONE is also not an option, since I do require the certificates from both sides for my custom verification method.
So my question: how can I require a certificate from the other side but turn off automatic verification during the handshake? Or can I pass a custom verifier method that gets called?
Thanks!
You can use ssl.get_server_certificate((host,port)). It will return the certificate in PEM format.
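To sketch the client side of "require the cert but verify it yourself" (the host is a placeholder): with verify_mode set to CERT_NONE the handshake no longer rejects unknown certs, yet getpeercert(binary_form=True) still hands you the peer's certificate in DER form for your own checks afterwards.

```python
import socket
import ssl

def fetch_peer_cert_der(host, port=443):
    """Complete a TLS handshake without built-in verification and return
    the peer's certificate as DER bytes for custom post-handshake checks."""
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE  # accept any cert during handshake
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # binary_form=True returns DER even when verify_mode is CERT_NONE
            return tls.getpeercert(binary_form=True)
```

Note the asymmetry: this covers the client receiving the server's cert; getting a server to request a client certificate without the stdlib verifying it is less straightforward.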
import OpenSSL
key = ...
signature = ...
data = ...
x509 = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_ASN1, key)
OpenSSL.crypto.verify(x509, signature, data, 'sha1')
So far, I am able to do all of this without any problems. However, it doesn't feel like this is enough security, since the key itself is given to me via a URL (that I am supposed to trust*), and the method to build the signature is publicly available.
So, say the key is said to be verified by "VeriSign Class 3 Code Signing 2010 CA", can anyone tell me how I can go about checking that this is a valid claim?
I'm guessing I need to have the VeriSign certificate locally on my machine. Assuming that I do, where do I go from there?
Thanks!
*the URL is given to me as a parameter in a JSON request. Sure, the URL will be HTTPS and I can check the domain name and all that. But it seems like I should be doing checks on the certificate itself
You are right that you should check the certificate itself. And yes, you need the VeriSign root certificate(s) (and any other intermediate certificates to have the complete chain of trust) which signed the certificate to be checked.
Current Symantec (VeriSign) root certificates can be found here in a zip file.
Download and unzip the zip file, find all certificates you wish to trust, and put them together (in PEM format) into one certificate bundle file.
Now you need to do the actual verification. Unfortunately, the OpenSSL call you need is X509_verify_certificate. I looked at the source for both pyopenssl and M2Crypto and neither expose that call, so there's no direct Python code you can call to verify the certificate with either of those packages.
However, since you are using pyopenssl you obviously have the openssl library available. Thus you probably already have or can easily install the openssl command-line tool set. If so, you can call the openssl verify command through a pipe by doing something like this:
import subprocess

import OpenSSL.crypto

cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_ASN1, key)

# the command line likes PEM format
cert_pem = OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert)

# the bundle that you created from the zip extraction
certificate_bundle = 'verisign-root-bundle.pem'

# Pipe the cert to the openssl verify command and check the return code;
# a return code of 0 means the verify succeeded
p = subprocess.Popen(['openssl', 'verify', '-CAfile', certificate_bundle],
                     stdin=subprocess.PIPE)
p.communicate(input=cert_pem)
if p.returncode == 0:
    print('Certificate Verified.')
else:
    print('Problem with certificate')
The above pipe runs the command
openssl verify -CAfile ca.bundle certificate.pem
Finally, if you're not familiar with openssl, the command to show certificates is
openssl x509 -inform PEM -text -in certificate.pem
Hope this helps!
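A hedged side note: later pyOpenSSL releases did grow a wrapper for exactly this check, X509StoreContext. If your pyOpenSSL is new enough, you can stay entirely in Python instead of shelling out (function and variable names below are illustrative):

```python
import OpenSSL.crypto

def verify_against_bundle(cert_pem, ca_pem_list):
    """Verify a PEM certificate against a list of trusted PEM CA certs.

    Raises X509StoreContextError on failure; returns None on success.
    Requires a pyOpenSSL release that exposes X509StoreContext.
    """
    store = OpenSSL.crypto.X509Store()
    for ca_pem in ca_pem_list:
        store.add_cert(OpenSSL.crypto.load_certificate(
            OpenSSL.crypto.FILETYPE_PEM, ca_pem))
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, cert_pem)
    OpenSSL.crypto.X509StoreContext(store, cert).verify_certificate()
```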
Maybe I only partly address your question. It seems that your largest worry is the security of the channel via which you obtain the key. You do not show any code of how you obtain that key, but you said that you retrieve it via HTTPS and now you want to verify the authenticity of this connection by certificate verification.
You can comfortably do so using the well-established third-party web client framework requests.
Quote from the docs:
Requests can verify SSL certificates for HTTPS requests, just like a web browser. To check a host's SSL certificate, you can use the verify argument:
requests.get(url, verify=True)
Also:
You can pass verify the path to a CA_BUNDLE file with certificates of trusted CAs.
The latter could look like
requests.get(url, verify='/path/to/cert.pem')
In case you really want to take control (and reduce complexity), load the right file from http://www.symantec.com/page.jsp?id=roots and take the verify='/path/to/cert.pem' approach. I guess you need http://www.symantec.com/content/en/us/enterprise/verisign/roots/Class-3-Public-Primary-Certification-Authority-G2.pem
I have a problem. I'm testing some things with Apple Passbook in Python, and I'm using M2Crypto to obtain the signature.
The code is:
from M2Crypto import SMIME

def passwordCallback(*args, **kwds):
    return password

smime = SMIME.SMIME()
smime.load_key(key, certificate, callback=passwordCallback)
pk7 = smime.sign(SMIME.BIO.MemoryBuffer(manifest), flags=SMIME.PKCS7_DETACHED | SMIME.PKCS7_BINARY)
pem = SMIME.BIO.MemoryBuffer()
pk7.write(pem)
der = ''.join(l.strip() for l in pem.read().split('-----')[2].splitlines()).decode('base64')
The code is supposed to work and generate the signature content; the problem is with "key" and "certificate".
These two variables are the file names of certificate.pem and key.pem, but I have only downloaded the pass.cert file from the Apple Developer portal.
How is it possible to obtain these two files, with OpenSSL or something similar?
SOLVED:
I solved it with this link:
http://www.raywenderlich.com/3443/apple-push-notification-services-tutorial-part-12
You need to either obtain a certificate from a third-party certification authority (CA) or create a self-signed certificate, using something like the process described in OpenSSL. If you are just testing some code, a self-signed cert will work, but a CA-issued cert provides other users some indication that you are who the cert says you are. You could create a self-signed cert claiming to be Tim_Cook@apple.com, but no reputable CA would issue you such a cert.
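For reference, the conversion in that tutorial boils down to two openssl invocations; here is a sketch driving them from Python (all file names are hypothetical, and Certificates.p12 is assumed to be an export from Keychain Access that includes the private key):

```python
import subprocess

def make_pass_pems(cer_path="pass.cert", p12_path="Certificates.p12"):
    """Convert the DER certificate downloaded from the developer portal to
    certificate.pem, and extract key.pem from a Keychain .p12 export.
    File names are hypothetical placeholders."""
    subprocess.run(["openssl", "x509", "-inform", "der",
                    "-in", cer_path, "-out", "certificate.pem"], check=True)
    subprocess.run(["openssl", "pkcs12", "-in", p12_path,
                    "-nocerts", "-out", "key.pem"], check=True)
```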