Manually verify certificates in Python3 ssl

I am developing the client- and server-side of a Python3 application. They must communicate over TLS using self-signed certs.
The connection should always be established, even if the two sides have never seen each other before and thus neither has the other's cert in its trust store. Verification should happen after the handshake, using a custom method.
However, Python's ssl library attempts to verify the certificate during the handshake, and this fails if the incoming cert is unknown and has no valid certificate chain. Setting verify_mode to CERT_NONE is also not an option, since I do require the certificates from both sides for my custom verification method.
So my question: how can I require a certificate from the other side but turn off automatic verification during the handshake? Or can I pass a custom verifier method that gets called instead?
Thanks!

You can use ssl.get_server_certificate((host,port)). It will return the certificate in PEM format.
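For example, a minimal sketch of that (the host and port are placeholders; get_server_certificate opens its own connection and does not validate the peer):

import ssl

# Hypothetical endpoint: fetch the peer's certificate without validating it,
# then hand the PEM string to your custom verification routine.
pem_cert = ssl.get_server_certificate(("server.example.com", 8443))
print(pem_cert)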

Related

urllib Error - SSL: CERTIFICATE_VERIFY_FAILED - in CONDA environment [duplicate]

I'm trying to use urllib.request.urlopen on a website starting with "https". The error output is:
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed
There are many great threads covering this error, including one that mentions the SSL Labs rating. I am able to use urllib.request.urlopen on every other "https" site I have tested.
SSL Labs shows the following output:
Key: RSA 2048 bits (e 65537)
Issuer: Let's Encrypt Authority X3
AIA: http://cert.int-x3.letsencrypt.org/
Signature algorithm: SHA256withRSA
Extended Validation: No
Certificate Transparency: No
OCSP Must Staple: No
Revocation information: OCSP
Revocation status: Good (not revoked)
DNS CAA: No
Trusted: Yes
To clarify, my question is: is there a solution for completing the handshake that doesn't involve bypassing certificate verification? And if there is, can it be done entirely inside a Python script on Linux, macOS, and Windows?
I cannot answer this question for urllib, but I was able to overcome the problem using Python requests instead. Note this will only work if a trusted certificate chain exists for the website in question but the server is missing a root or intermediate certificate.
Using the SSL Labs server test, run the test and scroll down to "Certification Paths". If there are trusted cert paths but the server is for some reason not providing the complete chain, you can download the full trusted path as text. Copy the full certificate chain, save it as a .pem file, and pass the path of this file to the requests call:
r = requests.get(url, verify="path/to/chain.pem")
The requests module can throw all sorts of SSL-related certificate failures, many of which will be server-side problems, and you really want to avoid disabling SSL verification. This solution is only for the somewhat rare case where a full trusted chain exists but the server in question has sloppily omitted a root or intermediate cert.
You can work around this by adding the missing intermediate certificate to the trusted certificates of the SSL context you hand to urllib:
import ssl
import urllib.request

# fill this out depending on which specific intermediate cert you're missing
cert_text = '''
-----BEGIN CERTIFICATE-----
...put your actual certificate text here...
-----END CERTIFICATE-----
'''

context = ssl.create_default_context()           # load default trusted certificates
context.load_verify_locations(cadata=cert_text)  # add your missing intermediate cert
urllib.request.urlopen(site, context=context)    # 'site' is the https:// URL being opened
Note that if you only need to talk to the one server for which you're doing this, you could just pass an appropriate cafile or capath argument to create_default_context().
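A minimal sketch of that variant, assuming the downloaded chain was saved as chain.pem (the path is a placeholder):

import ssl
import urllib.request

# Hypothetical path: trust only the certificates in chain.pem for this context
context = ssl.create_default_context(cafile="path/to/chain.pem")
urllib.request.urlopen("https://www.exampleSITE.com", context=context)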
I was able to solve this issue (for a Debian-based system; I'm running Debian 9). I still need to test solutions on macOS and Windows.
On the SSL Labs report, under the "Certification Paths" header, it showed:
1 Sent by server www.exampleSITE.com
BLAH BLAH BLAH
2 Extra download Let's Encrypt Authority X3
BLAH BLAH BLAH
3 In trust store DST Root CA X3 Self-signed
BLAH BLAH BLAH
I navigated to /etc/ssl/certs/ and noticed there were no Let's Encrypt certificates present. I then downloaded the .pem and rehashed:
cd /etc/ssl/certs
sudo wget https://letsencrypt.org/certs/lets-encrypt-x3-cross-signed.pem
sudo c_rehash
Then I tested the Python line that was giving me an error earlier:
page = urllib.request.urlopen('https://www.exampleSITE.com').read()
and it successfully retrieved the page.

Python Requests: SSL Verify

I am using the Python requests module to hit a REST API. I have to use SSL for security.
I see that I can set
requests.get(url, verify='/path/ca/bundle/')
However, I am confused as to what needs to be passed as the CA_BUNDLE.
I get the server certificate using
cert = ssl.get_server_certificate((server, port))
Can someone let me know how I should use this certificate in my request? Should I convert the cert to an X509/.pem/.der/.crt file?
Solved it. Apparently I needed to get the entire certificate chain and create a CA bundle out of it.
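A rough sketch of what that ends up looking like, assuming the intermediate and root certificates of the chain have been concatenated into a single ca_bundle.pem (the file name and URL are placeholders):

import requests

# Hypothetical bundle: the PEM-encoded intermediate and root certificates
# of the server's chain, concatenated into one file.
response = requests.get("https://server.example.com/api", verify="ca_bundle.pem")
print(response.status_code)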

Does PyOpenSSL verify_certificate() do signature verification

I use PyOpenSSL verify_certificate() to verify certificate chains. My code seems to work, but I was wondering if the function also checks the signatures along the certificate chain. Let's assume we have the chain ca_cert -> i_ca_cert -> s_cert. Thus ca_cert signed i_ca_cert and i_ca_cert signed s_cert. Does verify_certificate() check whether the signer's (RSA) key was used to sign the certificate and whether the signature is correct, for every certificate along the chain?
But I was wondering if the function also checks the signatures along the certificate chain
Of course it does; otherwise, what would be the purpose of chain verification? From the OpenSSL documentation (man 1ssl verify on Linux):
The final operation is to check the validity of the certificate chain. The validity period is checked against the current system time and the notBefore and notAfter dates in the certificate. The certificate signatures are also checked at this point.
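For reference, a minimal sketch of chain verification with pyOpenSSL, using the chain named above (the file paths are placeholders):

from OpenSSL import crypto

def load_cert(path):
    # Load a PEM-encoded certificate from disk
    with open(path, "rb") as f:
        return crypto.load_certificate(crypto.FILETYPE_PEM, f.read())

store = crypto.X509Store()
store.add_cert(load_cert("ca_cert.pem"))    # root CA
store.add_cert(load_cert("i_ca_cert.pem"))  # intermediate CA

# verify_certificate() raises X509StoreContextError if any signature,
# validity-period, or chain check fails; it returns None on success.
crypto.X509StoreContext(store, load_cert("s_cert.pem")).verify_certificate()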

I set up HTTPS, but data is still returned to clients without a certificate

I am developing an iOS app, and I want the data returned by my server to be readable only by my app.
So I created a self-signed certificate and set up HTTPS in Tornado like this:
http_server = tornado.httpserver.HTTPServer(application, ssl_options={
    "certfile": os.path.join(data_dir, "mydomain.crt"),
    "keyfile": os.path.join(data_dir, "mydomain.key"),
})
http_server.listen(443)
When I open my server's API URL in Chrome/Safari, the browsers warn me, but the data can still be read.
The browsers don't have my certificate/key pair, so why can they access my server and read the data?
According to public/private key theory:
the browser has to send its public key, which is included in its certificate
if my server trusts the certificate in some way, my server encrypts the response using the browser's public key
the browser receives the response and decrypts it using its own private key
In step 2, my server should not trust the browser's certificate! Am I right?
Thanks.
According to public/private key theory:
the browser has to send its public key, which is included in its certificate
if my server trusts the certificate in some way, my server encrypts the response using the browser's public key
the browser receives the response and decrypts it using its own private key
No, that's not how it works.
In SSL/TLS with only server authentication (most HTTPS sites), the server sends its certificate first, the client checks whether it trusts that certificate, the client and server negotiate a shared secret using the server's public key (how this is done depends on the cipher suite), and an encrypted channel is set up using keys derived from this shared secret.
In SSL/TLS with mutual authentication, an extra step involves the client sending its certificate to the server and signing something at the end of the handshake, to prove to the server that it is indeed the holder of this certificate.
It's only in the second case that the browser has a certificate and a private key, and even then the client certificate is never used for any encryption.
The code you're using here only sets up certfile and keyfile, which means you've configured your server for a connection where only the server is authenticated. When you're bypassing the browser warning, you're merely telling it to trust the server certificate (since it's self-signed in your case), so the connection can indeed proceed.
If you want to authenticate the client, you'll need to configure the server to request (and require) a client certificate. You'll also need to set up the client certificate (with its private key) in the client (whether it's the browser or your app). This is independent of the server certificate and its private key.
The Tornado documentation seems to indicate the ssl_options parameter uses the ssl.wrap_socket options, so you should look into those if you want to use client certificate authentication (in particular cert_reqs and ca_certs).
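A minimal sketch of that, reusing application and data_dir from the snippet above and assuming a ca.crt file containing the CA that signed your client certificates (the file name is a placeholder):

import os
import ssl
import tornado.httpserver

# Same certfile/keyfile as before, plus cert_reqs/ca_certs so the server
# requests and verifies a client certificate during the handshake.
http_server = tornado.httpserver.HTTPServer(application, ssl_options={
    "certfile": os.path.join(data_dir, "mydomain.crt"),
    "keyfile": os.path.join(data_dir, "mydomain.key"),
    "cert_reqs": ssl.CERT_REQUIRED,
    "ca_certs": os.path.join(data_dir, "ca.crt"),
})
http_server.listen(443)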
Note that, in general, authenticating an app (as opposed to the user of an app) using a client certificate only works as long as no-one is able to decompile the app. The app will contain the private key one way or another, so someone could get hold of it. This problem is of course even worse if you use the same private key for all the copies of your app.
I'm by no means knowledgeable in this field, but the certificate is only meant to go so far as to help ensure that the server is who it says it is.
Anyone can view the page if they trust the server's certificate.
To get the functionality you want, you probably want to use some form of authentication, even something basic like a given value in an HTTP header field.
Here is a bizarre tip: you can hack the User-Agent, so Tornado will only allow the string you chose. I don't know if iOS browsers offer this, but in Chrome on PC you can override your user agent in
Developer Tools -> Settings -> Overrides.
use:
self.request.headers["User-Agent"]
Because it is a string, you can then just allow a specific string to pass:
if personnalized_ua not in self.request.headers["User-Agent"]:
    self.redirect("no-way.html")
And if you want to allow access only from iPhones, for example, use the user_agents library (see the sketch below).
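A rough sketch of that idea with the third-party user_agents package (the handler name and redirect target are just illustrative):

import tornado.web
from user_agents import parse  # pip install user-agents

class IPhoneOnlyHandler(tornado.web.RequestHandler):
    def get(self):
        ua = parse(self.request.headers.get("User-Agent", ""))
        if ua.device.family != "iPhone":  # crude check; the User-Agent is trivially spoofed
            self.redirect("no-way.html")
            return
        self.write("hello iPhone")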

How do I use m2crypto to validate an X509 certificate chain in a non-SSL setting

I'm trying to figure out how to, using m2crypto, validate the chain of trust from a public key version of an X509 certificate back to one of a set of known root CAs when the chain may be arbitrarily long. The SSL.Context module looks promising, except that I'm not doing this in the context of an SSL connection and I can't see how the information passed to load_verify_locations is used.
Essentially, I'm looking for the interface that's equivalent to:
openssl verify pub_key_x509_cert
Is there something like that in m2crypto?
Thanks.
I have modified a different M2Crypto patch, and with it we are able to verify an X509 certificate against a chain of CAs; it also allows the use of Certificate Revocation Lists (CRLs).
The heart of allowing chain verification with M2Crypto is exposing "verify_cert()" on an X509_Store_Context.
Basic flow is:
Add your CAs/CRLs to an X509_Store
Use an X509_Store_Context to verify the certificate of interest
My patch enhances CRL support as well as allowing chain verification.
https://bugzilla.osafoundation.org/show_bug.cgi?id=12954#c2
We are using this patch as part of Pulp; the wiki page below shares some more info on how we do the verification with a chain:
https://fedorahosted.org/pulp/wiki/CertChainVerification
The patch might need to be updated slightly, and it would need unit tests before I could check it in. Contributions welcome!
Another, more convoluted way would be to create an in-memory SSL session where you do the validation. The Twisted wrapper effectively works this way: Twisted acts as a dumb network pipe without knowing anything about the data, and M2Crypto encrypts/decrypts the data in memory, doing certificate validation on the side.
