Why am I seeing InvalidProxyConfigurationWarning when using an HTTPS proxy with urllib3? - python

When using urllib3.ProxyManager() with an HTTPS proxy URL, I'm seeing an InvalidProxyConfigurationWarning on urllib3 version 1.25.9. I didn't get this warning before; what does it mean?

This warning is new in urllib3 v1.25.9 and means that the proxy you've configured with an HTTPS URL is not doing what you intended.
See this issue for more information: https://github.com/urllib3/urllib3/issues/1850
Copied below is the text from the issue.
urllib3 up to v1.25.x doesn't support HTTPS proxies. When connecting to an HTTPS URL, urllib3 contacts the proxy via HTTP instead of HTTPS even if your proxy URL specifies HTTPS. In urllib3 v1.26.x we're planning on supporting HTTPS proxies properly and are giving an early warning to users to switch their proxy URLs from HTTPS to HTTP to not encounter issues when upgrading later.
import urllib3
# HTTPS proxy, should change!
http = urllib3.ProxyManager("https://1.2.3.4")
http.request("GET", "https://example.com") # Warning would be raised here.
# Switch to this, will maintain current behavior when connecting to HTTPS URLs.
http = urllib3.ProxyManager("http://1.2.3.4")
http.request("GET", "https://example.com") # Warning won't be raised, same behavior as above.
Your proxy may be configured externally, such as in an HTTPS_PROXY environment variable, via requests.Session(proxy_url=...), or by your OS.
(FYI I'm the current lead maintainer of urllib3)
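To silence the warning on 1.25.x while keeping the current on-the-wire behavior, you can normalize the proxy URL's scheme before constructing the ProxyManager. A minimal sketch, assuming the helper name is my own illustration and not part of urllib3's API:

```python
from urllib.parse import urlsplit, urlunsplit

def downgrade_proxy_scheme(proxy_url):
    """Rewrite an https:// proxy URL to http://, matching what
    urllib3 1.25.x actually does on the wire anyway."""
    parts = urlsplit(proxy_url)
    if parts.scheme == "https":
        parts = parts._replace(scheme="http")
    return urlunsplit(parts)

print(downgrade_proxy_scheme("https://1.2.3.4"))  # -> http://1.2.3.4
```

Once on urllib3 v1.26+, you can drop the downgrade and keep the https:// URL if you genuinely want TLS to the proxy itself.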

Related

incorrect http protocol shown by chrome developer tools?

I am checking the HTTP protocol in use for the site http://www.dlf.in/
Chrome developer tools shows it to be http/1.1, as in the image below.
However, the command line tool is-http2, as well as ALPN negotiation in Python, seems to indicate that http/1.1 is not available and only http1 is. What's going on here?
I am doing the ALPN negotiation in Python as follows (OpenSSL 1.0.2h, Python 3.5.1):
import ssl
import socket
port = 443
domain = 'www.dlf.in'
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(['h2', 'spdy/3.1', 'spdy/3', 'spdy/2', 'http/1.1'])
conn = ctx.wrap_socket(socket.socket(socket.AF_INET, socket.SOCK_STREAM),
                       server_hostname=domain)
conn.connect((domain, port))
print(conn.selected_alpn_protocol())
That site doesn't have HTTPS on it. Browsers only support HTTP/2 over HTTPS despite the spec allowing it over HTTP in theory.
Using telnet and HTTPie to send HTTP requests (both 1.0 and 1.1), the web server responds with:
HTTP/1.1 200 OK
I tried it with 2.0 and got:
HTTP/1.1 505 HTTP Version Not Supported
This site doesn't support SSL (I tried curl https://www.dlf.in and got an HTTP 504 response).
curl is a command-line tool that also implements HTTP/2 in the clear (non-SSL) and supports the upgrade mechanism (via the --http2 flag). Using that flag I saw that the website doesn't support cleartext HTTP/2. (There is also an option to connect directly with cleartext HTTP/2, skipping the upgrade mechanism, using --http2-prior-knowledge, but that doesn't work either.)
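For reference, the upgrade mechanism curl uses with --http2 is just an ordinary HTTP/1.1 request carrying Upgrade and HTTP2-Settings headers; a server that supports cleartext HTTP/2 (h2c) answers 101 Switching Protocols. A rough sketch of the request it sends (header values are illustrative; an empty HTTP2-Settings payload means "all defaults"):

```python
def build_h2c_upgrade_request(host, path="/"):
    """Build the HTTP/1.1 request that offers an upgrade to
    cleartext HTTP/2 (h2c), as curl --http2 does."""
    # HTTP2-Settings carries a base64url-encoded SETTINGS frame;
    # an empty value is valid and means all-default settings.
    return (
        "GET {path} HTTP/1.1\r\n"
        "Host: {host}\r\n"
        "Connection: Upgrade, HTTP2-Settings\r\n"
        "Upgrade: h2c\r\n"
        "HTTP2-Settings: \r\n"
        "\r\n"
    ).format(path=path, host=host)
```

Sending this over a plain socket and reading the status line tells you whether the server honors the upgrade (101) or just answers in HTTP/1.1 (200).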

Python - Requests HTTP Library SSL Key

I am using the requests library to communicate with HTTPS websites. This works great; my only problem is that Wireshark no longer captures plain text information in the "Decrypted SSL Data" tab as it does after following this tutorial:
https://jimshaver.net/2015/02/11/decrypting-tls-browser-traffic-with-wireshark-the-easy-way/
Set up an environment variable that allows Chrome and Firefox to store SSL keys in a file; Wireshark uses this file in real time.
Is there a way I can modify a simple HTTPS request script such as this:
import requests
resp = requests.get("https://www.google.com", allow_redirects=True)
to also store the ssl key into file as chrome and firefox do?
From what I understand about OpenSSL implementations that do something similar, you'd have to find the master secret and session key in memory. Is that doable or practical when running from the command line?
This appears to be possible now with Requests.
I set SSLKEYLOGFILE=secrets.log, ran a request via requests.get(), and secrets.log is now populated with TLS secrets. I am using requests v2.25.1 and urllib3 v1.26.3.
Apparently, it took a while for OpenSSL to provide APIs necessary to extract keying information, and then time for bindings to be created in pyOpenSSL to utilize those APIs and then for that to bubble up to urllib3.
See this issue for more details: https://github.com/psf/requests/issues/3674
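A minimal sketch of the setup (the log path is arbitrary; the environment variable must be set before the first TLS context is created, so in practice set it before importing requests):

```python
import os
import ssl
import tempfile

keylog_path = os.path.join(tempfile.gettempdir(), "secrets.log")

# Must happen before any TLS connection is made (i.e. before
# importing/using requests), since the variable is read at
# context-creation time.
os.environ["SSLKEYLOGFILE"] = keylog_path

# Under the hood (Python 3.8+), this is the ssl-module feature that
# recent urllib3 wires the environment variable to:
ctx = ssl.create_default_context()
ctx.keylog_filename = keylog_path  # NSS key log format, same as browsers
```

Point Wireshark's "(Pre)-Master-Secret log filename" preference at the same file and it will decrypt the captured TLS traffic live, just as with Chrome and Firefox.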
Run
openssl s_client -connect www.google.com:443 -showcerts
and you will see all the certificates that the Google site uses.

Change SSL Protocol Tornado 2.3/Python 2.6

I'm running an older version of Python (2.6) and Tornado (2.3). In my program I have an instance of HTTPClient calling fetch() on an HTTPS Facebook URL. However, it's trying to make the request over SSLv3. Since Facebook disabled SSLv3 when POODLE happened, the request is throwing a handshake failure.
I can't figure out where to change the protocol, if I even can. Is there any way I can change it to use TLS with these older versions? This is a legacy application that I was just given to fix asap, so I'm not sure of the implication of updating any of the libraries.
Here's the error I'm receiving:
SSL Error on 16: [Errno 1] _ssl.c:492: error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure
Thanks!
In the end I upgraded Tornado to version 3.2, since a change to simple_httpclient switched its default protocol from SSLv3 to TLSv1, as noted in the release notes: http://tornado.readthedocs.org/en/latest/releases/v3.2.0.html#tornado-simple-httpclient
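If upgrading had not been an option, the underlying fix is to pin the protocol version away from SSLv3 wherever the socket gets wrapped. On Python 2.6 that means passing ssl_version=ssl.PROTOCOL_TLSv1 to ssl.wrap_socket(); a sketch of the same pin in modern terms:

```python
import ssl

# Python 2.6 equivalent of the fix:
#   ssl.wrap_socket(sock, ssl_version=ssl.PROTOCOL_TLSv1)
# Modern Python expresses the same pin through an SSLContext:
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1  # refuse SSLv3 and below
```

Where exactly to apply this inside Tornado 2.3's simple_httpclient would require patching its socket-wrapping code, which is why upgrading was the simpler path.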

How to disable SNI in Python Requests?

I'm attempting to use requests to access a remote server over SSL. Unfortunately it's misconfigured such that it responds with the error TLSV1_UNRECOGNIZED_NAME during the SNI handshake, which is ignored by browsers but raises an exception in requests.
This appears to be the same issue as this question, but in Python rather than Java: SSL handshake alert: unrecognized_name error since upgrade to Java 1.7.0
The connection works perfectly in Python 2, which doesn't support SNI. How can I disable SNI in Python 3 so that the connection works?
I couldn't find a way to disable SNI on the requests level, but I found a hack that will trick it into thinking SNI isn't supported. Add this code before importing requests (not after, or it will be too late):
import ssl
ssl.HAS_SNI = False

python requests can not send https request

I installed requests.
But I cannot send an HTTPS request, even one as simple as this:
requests.get('https://www.google.com')
The error message is:
_ssl.c:504: error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
UPDATE: I can send HTTPS requests successfully using other methods; both urllib2 and httplib work.
This is not a problem with requests, or even with Python.
You are most likely using a proxy server that has not been properly configured to handle SSL connections. The exact problem is very much dependent on the proxy server. A Google search on the 140770FC error will give you many hits and discussions on how to diagnose this.
Note that even if you have not configured a proxy on your local machine, a corporate firewall could still be forcing HTTPS connections to go over a proxy at the network boundaries.
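A quick way to check whether an environment- or OS-level proxy is being picked up is the stdlib's getproxies(), which is the same lookup requests relies on under the hood:

```python
import urllib.request

# Returns a mapping like {'https': 'http://proxy.corp:8080'} built from
# HTTP_PROXY/HTTPS_PROXY environment variables and OS proxy settings;
# empty dict means no proxy is configured.
proxies = urllib.request.getproxies()
print(proxies)
```

If an unexpected proxy shows up for the https scheme, that proxy (or a transparent one at the network boundary) is where the handshake is being mangled.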
