I'm attempting to use requests to access a remote server over SSL. Unfortunately it's misconfigured such that it responds with the error TLSV1_UNRECOGNIZED_NAME during the SNI handshake, which is ignored by browsers but raises an exception in requests.
This appears to be the same issue as this question, but in Python rather than Java: SSL handshake alert: unrecognized_name error since upgrade to Java 1.7.0
The connection works perfectly in Python 2, which doesn't support SNI. How can I disable SNI in Python 3 so that the connection works?
I couldn't find a way to disable SNI at the requests level, but I found a hack that tricks it into thinking SNI isn't supported. Add this code before importing requests (not after, or it will be too late):
import ssl
ssl.HAS_SNI = False
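In context, a minimal sketch of how the hack is used (the URL is a placeholder; older requests/urllib3 read ssl.HAS_SNI when they are first imported, which is why the ordering matters):
import ssl
ssl.HAS_SNI = False  # must run before requests/urllib3 are first imported

import requests

# placeholder URL for the misconfigured server
r = requests.get('https://broken-sni.example.com')
print(r.status_code)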
I'd like to modify the extensions that I send in the TLS ClientHello packet with Python.
I've read through most of the urllib3 source code on GitHub, but I still don't know how it determines which TLS extensions to use.
I am aware that this is quite low-level, and the creators of urllib3 may just import another package to do this for them. If that's the case, which package do they use?
If not, how is this determined?
Thanks in advance for any assistance.
The HTTPS support in urllib3 uses the ssl package, which in turn uses the OpenSSL C library. ssl does not provide any way to directly fiddle with TLS extensions, except for setting the hostname in the TLS handshake (i.e. the server_name extension, aka SNI).
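To illustrate, that one knob is exposed through wrap_socket (the hostname here is just an example):
import socket
import ssl

ctx = ssl.create_default_context()
# server_hostname is the only direct TLS-extension control the ssl module
# offers: it fills in the server_name (SNI) extension of the ClientHello
conn = ctx.wrap_socket(socket.create_connection(('example.org', 443)),
                       server_hostname='example.org')
print(conn.version())
conn.close()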
I'm trying to figure out why my Python code throws an SSLCertVerificationError for valid Let's Encrypt certificates on a virtual host with multiple domains and certificates at the same IP. If I delete all certificates except one, it's fine, but with more than one certificate, requests ignores the domain to which Python sent the request and pulls the most recent Let's Encrypt certificate, which is incorrect, causing an SSLCertVerificationError for that domain.
My understanding was that under SNI (Server Name Indication), requests should only pull the certificate for the domain to which the request is being made, not simply the most recent one. I have checked, and I'm running Python 3.8 and requests 2.5 under a version of Nginx compiled with SNI support. I can suppress the error by turning off SSL validation, but that seems a poor workaround.
Any idea what is going on?
Why does SNI work fine when a browser requests a page from Nginx, pulling the proper certificate, but fail when the same request is made through Python's requests package?
I have read everything I can find, and the docs say it should just work under current builds of Nginx, requests, OpenSSL, etc., but it clearly isn't here.
To replicate: requests.get('https://kedrosky.org') runs error-free from a local machine. But in scripts run on that server -- a hosted domain -- a newer certificate for the wrong domain is returned, causing an SSLCertVerificationError.
The problem is that the server configuration is likely only done properly for IPv4, even though the domain also resolves to an IPv6 address. Over IPv4 it returns the correct certificate:
$ openssl s_client -connect kedrosky.org:443 -4
...
subject=CN = kedrosky.com
But with IPv6 it returns a different certificate (this needs IPv6 connectivity to the internet on your local machine):
$ openssl s_client -connect kedrosky.org:443 -6
...
subject=CN = paulandhoward.com
Likely this is because there is only a listen 443 directive but no listen [::]:443, the latter being needed for IPv6. In that case virtual hosts only work properly over IPv4; over IPv6 the server just returns the default, i.e. usually the first certificate configured. A sketch of the fix follows.
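A minimal sketch of the relevant Nginx server block (the certificate paths are placeholders for whatever Let's Encrypt issued):
server {
    listen 443 ssl;
    listen [::]:443 ssl;  # without this line, IPv6 clients get the default vhost
    server_name kedrosky.org;
    ssl_certificate     /etc/letsencrypt/live/kedrosky.org/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/kedrosky.org/privkey.pem;
}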
And the reason that you are seeing different results from different hosts is that one has only IPv4 connectivity while the other can do IPv6 too.
I am checking the HTTP protocol in use for this site http://www.dlf.in/
Chrome developer tools show it to be http/1.1.
However, the command-line tool is-http2, and ALPN negotiation in Python, seem to indicate that http/1.1 is not available and that only http1 is. What's going on here?
I am doing the ALPN negotiation in Python as follows (OpenSSL version: 1.0.2h, Python version: 3.5.1):
import ssl
import socket

port = 443
domain = 'www.dlf.in'

# Offer h2, the legacy SPDY variants, and http/1.1 via ALPN
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(['h2', 'spdy/3.1', 'spdy/3', 'spdy/2', 'http/1.1'])

conn = ctx.wrap_socket(socket.socket(socket.AF_INET, socket.SOCK_STREAM),
                       server_hostname=domain)
conn.connect((domain, port))

# The protocol the server selected, or None if ALPN negotiation failed
print(conn.selected_alpn_protocol())
That site doesn't have HTTPS on it. Browsers only support HTTP/2 over HTTPS, even though the spec allows it over plain HTTP in theory.
Using telnet and HTTPie to send HTTP requests (both 1.0 and 1.1), the web server responds with:
HTTP/1.1 200 OK
I tried it with 2.0 and got:
HTTP/1.1 505 HTTP Version Not Supported
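For reference, the raw test looked something like this (the exact request lines are an assumption):
$ telnet www.dlf.in 80
GET / HTTP/2.0
Host: www.dlf.in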
This site doesn't support SSL (I tried curl https://www.dlf.in and got an HTTP 504 response).
curl is a command-line tool that also implements HTTP/2 in the clear (non-SSL) and supports the upgrade mechanism (via the --http2 flag). Using that flag I saw that the website doesn't support cleartext HTTP/2. (There is also an option to connect directly with HTTP/2 in the clear, without the upgrade mechanism, using --http2-prior-knowledge, but that doesn't work either.)
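The two checks look roughly like this (both flags are standard curl options):
$ curl -v --http2 http://www.dlf.in/
$ curl -v --http2-prior-knowledge http://www.dlf.in/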
I'm writing an application on Google App Engine, and I'm trying to send HTTPS requests (GET/POST) from GAE to a private server.
Is there any method to achieve a request with:
- sending request with client certificate/key;
- verify server certificate;
AND using SNI support?
I've tried to use:
urllib2 -> but it can't verify the server CA;
urlfetch -> it only verifies the server CA;
urllib3 -> I'm getting
"_ssl.c:529: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed", caused by missing SNI support;
the requests lib -> same as urllib3.
I've also tried, for SNI, putting these libraries on Google App Engine as indicated in "using requests with TLS doesn't give SNI support":
pyOpenSSL
ndg-httpsclient
pyasn1
But pyOpenSSL has C dependencies, so there is no way to use it; it's not supported as a third-party library.
TL;DR: sending a request from GAE to a private server over SSL with a client cert, server-CA verification, and SNI support seems to be impossible.
I think the matter is:
the Python version in GAE, which is 2.7.5 and not 2.7.9 (with backported SNI compatibility);
maybe also the SSL version included in GAE not supporting SNI (ssl has no HAS_SNI attribute; see the check below).
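A quick way to verify that last point from inside the GAE runtime (a sketch; CPython sets ssl.HAS_SNI only when the underlying OpenSSL supports SNI):
import ssl

# False (or a missing attribute) means this build cannot send SNI
print(getattr(ssl, 'HAS_SNI', False))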
How can I do that?
I'm running an older version of Python (2.6) and Tornado (2.3). In my program I have an instance of HTTPClient running fetch() to request an HTTPS URL on Facebook. However, it's trying to make the request over SSLv3. Since Facebook disabled SSLv3 when POODLE happened, the request throws a handshake failure.
I can't figure out where to change the protocol, if I even can. Is there any way I can change it to use TLS with these older versions? This is a legacy application that I was just handed to fix ASAP, so I'm not sure of the implications of updating any of the libraries.
Here's the error I'm receiving:
SSL Error on 16: [Errno 1] _ssl.c:492: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure
Thanks!
In the end, I upgraded Tornado to version 3.2, since a change to simple_httpclient switched its protocol from SSLv3 to TLSv1, as stated here: http://tornado.readthedocs.org/en/latest/releases/v3.2.0.html#tornado-simple-httpclient
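A quick way to reproduce the underlying handshake behavior outside Tornado (a sketch using the stdlib ssl module of that era; Facebook's minimum TLS version today may be stricter than it was then):
import socket
import ssl

# SSLv3 handshakes fail against servers that disabled it after POODLE,
# while TLSv1 succeeded at the time; newer builds may omit SSLv3 entirely
for name in ('PROTOCOL_SSLv3', 'PROTOCOL_TLSv1'):
    proto = getattr(ssl, name, None)
    if proto is None:
        print(name, 'not available in this build')
        continue
    try:
        s = ssl.wrap_socket(socket.create_connection(('www.facebook.com', 443)),
                            ssl_version=proto)
        print(name, 'handshake OK')
        s.close()
    except ssl.SSLError as e:
        print(name, 'handshake failed:', e)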