I'm running an older version of Python (2.6) and Tornado (2.3). In my program, an HTTPClient instance calls fetch() on an https URL at Facebook. However, it tries to make the request over SSLv3, and since Facebook disabled SSLv3 when POODLE happened, the request fails with a handshake error.
I can't figure out where to change the protocol, if I even can. Is there any way I can change it to use TLS with these older versions? This is a legacy application that I was just given to fix asap, so I'm not sure of the implication of updating any of the libraries.
Here's the error I'm receiving:
SSL Error on 16: [Errno 1] _ssl.c:492: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure
Thanks!
In the end, I upgraded tornado to version 3.2, since a change to simple_httpclient switched its default protocol from SSLv3 to TLSv1, as stated here: http://tornado.readthedocs.org/en/latest/releases/v3.2.0.html#tornado-simple-httpclient
When using a urllib3.ProxyManager() with an HTTPS proxy URL I'm seeing a warning called InvalidProxyConfigurationWarning on version 1.25.9 of urllib3. I didn't get this warning before, what does it mean?
This warning is new in urllib3 v1.25.9 and means that your proxy which is configured to use HTTPS is not doing what you intended.
See this issue for more information: https://github.com/urllib3/urllib3/issues/1850
Copied below is the text from the issue.
urllib3 up to v1.25.x doesn't support HTTPS proxies. When connecting to an HTTPS URL, urllib3 contacts the proxy via HTTP instead of HTTPS even if your proxy URL specifies HTTPS. In urllib3 v1.26.x we're planning on supporting HTTPS proxies properly and are giving an early warning to users to switch their proxy URLs from HTTPS to HTTP to not encounter issues when upgrading later.
import urllib3
# HTTPS proxy, should change!
http = urllib3.ProxyManager("https://1.2.3.4")
http.request("GET", "https://example.com") # Warning would be raised here.
# Switch to this, will maintain current behavior when connecting to HTTPS URLs.
http = urllib3.ProxyManager("http://1.2.3.4")
http.request("GET", "https://example.com") # Warning won't be raised, same behavior as above.
Your proxy may also be configured externally, such as in an HTTPS_PROXY environment variable, via requests.Session(proxy_url=...), or by your OS.
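If proxy URLs come from several places in your configuration, a small helper (my own sketch, not part of urllib3) can normalize them to the http:// scheme the library expects:

```python
from urllib.parse import urlsplit, urlunsplit

def downgrade_proxy_scheme(proxy_url):
    """Rewrite an https:// proxy URL to http://, leaving everything
    else (host, port, credentials) untouched."""
    parts = urlsplit(proxy_url)
    if parts.scheme == "https":
        parts = parts._replace(scheme="http")
    return urlunsplit(parts)

print(downgrade_proxy_scheme("https://1.2.3.4:8080"))  # http://1.2.3.4:8080
print(downgrade_proxy_scheme("http://1.2.3.4"))        # unchanged
```

Run your proxy settings through something like this before handing them to ProxyManager and the warning goes away, with identical behavior on urllib3 1.25.x.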
(FYI I'm the current lead maintainer of urllib3)
I am checking the HTTP protocol in use for this site: http://www.dlf.in/
Chrome developer tools shows it to be http/1.1 as in the image below.
However, the command-line tool is-http2, as well as ALPN negotiation in Python, seems to indicate that http/1.1 is not available and only http/1 is. What's going on here?
I am doing the ALPN negotiation in python as follows (openssl version : OpenSSL 1.0.2h and python version 3.5.1)
import ssl
import socket
port = 443
domain = 'www.dlf.in'
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(['h2', 'spdy/3.1', 'spdy/3', 'spdy/2', 'http/1.1'])
conn = ctx.wrap_socket(socket.socket(socket.AF_INET, socket.SOCK_STREAM),
server_hostname=domain)
conn.connect((domain, port))
print(conn.selected_alpn_protocol())
That site doesn't have HTTPS on it. Browsers only support HTTP/2 over HTTPS despite the spec allowing it over HTTP in theory.
Using telnet and HTTPie to send HTTP requests (both 1.0 and 1.1), the web server responds with:
HTTP/1.1 200 OK
I tried it with 2.0 and got:
HTTP/1.1 505 HTTP Version Not Supported
This site doesn't support SSL (I tried curl https://www.dlf.in and got an HTTP 504 response).
curl is a command-line tool that also implements clear-text (non-SSL) HTTP/2 and supports the upgrade mechanism (via the --http2 flag). Using that flag, I saw that the website doesn't support clear-text HTTP/2 either. (There is also an option to connect directly with clear-text HTTP/2, skipping the upgrade mechanism, using --http2-prior-knowledge, but that doesn't work either.)
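For reference, the clear-text upgrade curl performs is just an ordinary HTTP/1.1 request carrying Upgrade: h2c headers; a server that supports h2c answers 101 Switching Protocols. A minimal sketch of building such a request by hand (the empty HTTP2-Settings value stands for a base64url-encoded empty SETTINGS payload):

```python
def build_h2c_upgrade_request(host, path="/"):
    """Build the HTTP/1.1 request that offers a clear-text HTTP/2
    (h2c) upgrade, as curl --http2 does over plain HTTP."""
    lines = [
        "GET {} HTTP/1.1".format(path),
        "Host: {}".format(host),
        "Connection: Upgrade, HTTP2-Settings",
        "Upgrade: h2c",
        "HTTP2-Settings: ",  # empty SETTINGS payload, base64url-encoded
        "",
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

req = build_h2c_upgrade_request("www.dlf.in")
```

Sending these bytes over a plain socket to port 80 and checking whether the status line comes back 101 Switching Protocols (h2c supported) or 200/505 (not supported) reproduces what curl --http2 does.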
Using Python 3.4 and requests 2.11.1 package to get 'https://www.nfm.com' website, requests throws an SSLError [SSL: UNKNOWN_PROTOCOL]. I'm able to get a valid response from other https sites such as pythonanywhere.com and amazon.com. The error is only encountered trying to requests.get('https://www.nfm.com') or requests.get('https://www.nfm.com', verify=False).
I checked NFM's certificate in Chrome and it's a valid thawte SHA256 SSL CA. Is this a problem with NFM or is there something I need to configure on my end to get a response object from this website?
According to SSLLabs, the server supports only TLS 1.1 and higher. My guess is that you have an older OpenSSL version which is not able to speak these versions. Support for TLS 1.1 and TLS 1.2 was added with OpenSSL 1.0.1 years ago, but, for example, Apple ships only the very old version 0.9.8 on Mac OS X, which supports at most TLS 1.0 (which the server does not accept).
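You can check which OpenSSL your Python is linked against, and whether it exposes the TLS 1.1/1.2 protocol constants, directly from the ssl module:

```python
import ssl

# The OpenSSL build this interpreter is linked against.
print(ssl.OPENSSL_VERSION)

# If these constants are missing, the linked OpenSSL predates 1.0.1
# and cannot speak TLS 1.1 or TLS 1.2.
print(hasattr(ssl, "PROTOCOL_TLSv1_1"))
print(hasattr(ssl, "PROTOCOL_TLSv1_2"))
```

If the version printed is 0.9.8, that confirms the diagnosis: no build of requests can negotiate TLS 1.1+ on top of it.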
I need to connect to a ftp server which requires TLS 1.2
ftplib has an attribute called FTP_TLS.ssl_version, but I can't choose ssl.PROTOCOL_TLSv1_2 because it's only available in Python 3.4 and will only arrive in Python 2.7.9, which has not been released as of this post.
There is no way I can change my program to use Python 3.4 so what are my options?
One could assume that the default would already be to connect with the best TLS version possible. An explicit setting of TLS 1.2 just means that the client will not accept anything below TLS 1.2 from the server.
Unfortunately ftplib decided to hard code the version to TLSv1 and thus reduce the connection to TLS 1.0 even if the OpenSSL would support better versions. Since there is no way with older python versions to explicitly request TLS 1.1 or TLS 1.2 you need to request SSLv23 which automatically requests the best version possible:
import ssl
from ftplib import FTP_TLS
ftps = FTP_TLS('127.0.0.1')
# set protocol to SSLv23 to request the best version
ftps.ssl_version = ssl.PROTOCOL_SSLv23
ftps.login()
ftps.prot_p()
ftps.retrlines('LIST')
ftps.quit()
The only change to normal use of ftplib is to set ssl_version to ssl.PROTOCOL_SSLv23 and thus it will request the best version possible. If this will be TLS 1.2 depends on the server and on the supported versions in the client. With Ubuntu TLS 1.2 is disabled on the client side up to version 13.10, so it will use at most TLS 1.1. With Ubuntu 14.04 it will use TLS 1.2 if the server supports it.
A side effect of this change is that it will not send an AUTH TLS command to the FTP server, but instead the older AUTH SSL command, but most servers will probably not care. Another side effect is that it will also allow TLS 1.0 or SSL 3.0 if the server does not support anything better. If you don't want this you have to fiddle with the SSL context options, but it looks like this is only available with python3.
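For completeness, on Python 3 (3.7+ for minimum_version) the context-based approach alluded to above looks roughly like this; the host is a placeholder:

```python
import ssl
from ftplib import FTP_TLS

# Build a client context that refuses anything below TLS 1.2.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

ftps = FTP_TLS(context=ctx)
# ftps.connect("ftp.example.com")   # placeholder host
# ftps.login(); ftps.prot_p()
```

Unlike the SSLv23 trick, this keeps the normal handshake but hard-fails if the server offers only TLS 1.0/1.1 or SSL 3.0.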
I'm attempting to use requests to access a remote server over SSL. Unfortunately it's misconfigured such that it responds with the error TLSV1_UNRECOGNIZED_NAME during the SNI handshake, which is ignored by browsers but raises an exception in requests.
This appears to be the same issue as this question, but in Python rather than Java: SSL handshake alert: unrecognized_name error since upgrade to Java 1.7.0
The connection works perfectly in Python 2, which doesn't support SNI. How can I disable SNI in Python 3 so that the connection works?
I couldn't find a way to disable SNI on the requests level, but I found a hack that will trick it into thinking SNI isn't supported. Add this code before importing requests (not after, or it will be too late):
import ssl
ssl.HAS_SNI = False