Requests failing to connect to a TLS server - python

I'm having an issue tracking down why requests fails to connect to a specific host.
The following works just fine via curl or a browser:
curl https://banking4.anz.com
However if I use requests:
requests.get('https://banking4.anz.com')
I get:
SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)
On the wire, I see only the ClientHello and then the server disconnects immediately, so it doesn't seem like an SSL or cipher incompatibility (I'd expect an SSL-layer error for those). What else could be the issue in this case?
I'm on python 3.6.1 with requests 2.14.2 (with security extras).

This server is broken in multiple ways.
For one, it only understands DES-CBC3-SHA, which is considered insecure and is not included in the default cipher set used by requests. Additionally, it looks like it only checks a limited number of the ciphers offered in the ClientHello, and thus will not see that DES-CBC3-SHA is offered by the client if too many other offers come before this cipher.
A quick workaround for this broken server is to offer only the single cipher the server supports:
import requests
requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS = 'DES-CBC3-SHA'
requests.get('https://banking4.anz.com')
But note that this sets requests' default cipher list to an insecure value, so this method should not be used if your application also connects to other sites. Instead, have a look at the more complex solution of using your own HTTPAdapter with specific cipher settings for just the broken site, sketched below.
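A minimal sketch of that adapter-based approach, assuming the urllib3 bundled with requests exposes create_urllib3_context (it does in recent versions); the LegacyCipherAdapter name is just an illustration:
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context

class LegacyCipherAdapter(HTTPAdapter):
    """Transport adapter that uses a restricted cipher list for one mount point."""
    def __init__(self, ciphers, **kwargs):
        self._ciphers = ciphers
        super().__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        kwargs['ssl_context'] = create_urllib3_context(ciphers=self._ciphers)
        return super().init_poolmanager(*args, **kwargs)

    def proxy_manager_for(self, *args, **kwargs):
        kwargs['ssl_context'] = create_urllib3_context(ciphers=self._ciphers)
        return super().proxy_manager_for(*args, **kwargs)

session = requests.Session()
# Only this host gets the weak cipher; every other request keeps the secure defaults.
session.mount('https://banking4.anz.com', LegacyCipherAdapter('DES-CBC3-SHA'))
response = session.get('https://banking4.anz.com')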

Related

python-requests how to send cipher name/http2

I am trying to replicate the following client request via python-requests.
In the client connection details I see the HTTP version, which is 2.0, and the TLS version, which is 1.3; as far as I know, requests can use TLS 1.3. My requests are failing as of now.
I also wonder if I need to pass certificates. I would like to understand how this request differs from a regular request, which would simply be:
r = requests.get('someurl')
How can I make requests use the exact client connection shown above? I don't fully understand each of these settings. How would I use h2 ALPN with that specific cipher name? I am not expecting a solid answer to the question; an explanation would be much more helpful!
python-requests doesn't support HTTP/2 requests. You can use the httpx package instead.
HTTPX is a fully featured HTTP client for Python 3, which provides sync and async APIs, and support for both HTTP/1.1 and HTTP/2.
Example
import ssl
import httpx

# Create an SSL context.
# ssl.PROTOCOL_TLS selects the highest protocol version that both the client
# and server support. Despite the name, this option can select both "SSL" and
# "TLS" protocols.
ssl_context = ssl.SSLContext(protocol=ssl.PROTOCOL_TLS)

# Advertise HTTP/2 via ALPN.
ssl_context.set_alpn_protocols(["h2"])

# Restrict the cipher suites offered in the ClientHello.
CIPHERS = 'ECDH+AESGCM:ECDH+CHACHA20:DH+AESGCM:DH+CHACHA20:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:DH+HIGH:RSA+AESGCM:RSA+AES:RSA+HIGH:!aNULL:!eNULL:!MD5:!3DES'
ssl_context.set_ciphers(CIPHERS)

# HTTP/2 support needs the optional dependencies: pip install httpx[http2]
# The verify parameter accepts a standard library ssl.SSLContext.
with httpx.Client(http2=True, verify=ssl_context) as client:
    response = client.get('https://example.com')

print(response.http_version)
# outputs HTTP/2
Instead of using ssl.SSLContext, you can also use httpx.create_ssl_context() to set the ssl context.
As far as I know, python-requests is a library which currently¹ doesn't support HTTP/2.0. This question has been answered here.
However, there are Python libraries like httpx that support HTTP/2.0!
Kind regards,
¹ as of Feb 16, 2021

Creating a Charles proxy alternative using Python

I am using Charles Proxy right now to monitor traffic between my devices and a website. The traffic is SSL and I am able to read it in Charles. The issue is that Charles makes the content hard to read when I am filtering through hundreds of variables in a JSON object. I created a program that will filter the JSON after exporting the Charles log. My next step is to get rid of Charles completely and create my own proxy in Python that can view HTTP and HTTPS data. I was wondering if Scapy or any other existing libraries would work for this. I am interested in Scapy because I can save the proxy log as a pcap file.
Reading through mitmproxy's source would be overwhelming, since it's a huge code base. If you would like to implement the proxy server from scratch, here is what I learned while developing Proxyman:
Learn how to set up a tiny proxy server: basically, open a listening socket on your port (9090, for example). Accept any incoming request and read the first line of the HTTP message. This can be done with a lightweight http-parser or any Python parser. The raw request line looks like:
CONNECT google.com:443 HTTP/1.1
Parse out the host (google.com) and resolve its IP: open a socket connection to the destination and start relaying data back and forth between the client <-> the destination server.
This first step is essential for implementing the HTTP proxy. Use the http-parser to parse the rest of the HTTP message, so you can get the headers and body from the request/response and present them in the UI.
Learn how HTTPS and SSL work: use OpenSSL to generate a self-signed certificate, and learn how to generate the chain certificates too.
Learn how to import those certificates into the macOS keychain by using the security CLI or Apple's Security framework.
When that's done, it's time to start HTTPS interception: repeat the second step and do the SSL handshake with the appropriate certificate on both sides (client -> your proxy server, and your proxy server -> destination).
Parse the HTTP message as usual and get the rest of the message.
Overall, there are a lot of open-source projects out there, but I suggest starting from a simple version before moving on; a minimal tunneling sketch follows below.
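To make the first step concrete, here is a minimal sketch of a blind CONNECT tunnel in Python (no TLS interception yet); the port 9090 and the buffer sizes are arbitrary choices, and for simplicity it assumes the request line arrives in a single read:
import socket
import select
import threading

def handle_client(client_sock):
    request = client_sock.recv(4096)
    first_line = request.split(b'\r\n', 1)[0].decode()  # e.g. "CONNECT google.com:443 HTTP/1.1"
    method, target, _ = first_line.split()
    if method != 'CONNECT':
        client_sock.close()
        return
    host, _, port = target.partition(':')
    upstream = socket.create_connection((host, int(port or 443)))
    client_sock.sendall(b'HTTP/1.1 200 Connection Established\r\n\r\n')
    # Relay bytes in both directions until either side closes.
    sockets = [client_sock, upstream]
    while True:
        readable, _, _ = select.select(sockets, [], [])
        for sock in readable:
            data = sock.recv(4096)
            if not data:
                client_sock.close()
                upstream.close()
                return
            (upstream if sock is client_sock else client_sock).sendall(data)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(('127.0.0.1', 9090))
listener.listen(5)
while True:
    conn, _ = listener.accept()
    threading.Thread(target=handle_client, args=(conn,), daemon=True).start()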
Hope that could help you.

Manually verify certificates in Python3 ssl

I am developing the client- and server-side of a Python3 application. They must communicate over TLS using self-signed certs.
The connection should always be established, even if both have never seen the other, thus neither has the other's cert in its trust store. Verification shall happen after the handshake with a custom method.
However, Python's ssl library attempts to verify the certificate during handshake and this fails if the incoming cert is unknown and has no valid certificate chain. Setting verify_mode to CERT_NONE is also not an option, since I do require the certificates from both sides for my custom verification method.
So my question: how can I require a certificate from the other side but turn off automatic verification during the handshake? Or maybe I can pass a custom verifier method that gets called?
Thanks!
You can use ssl.get_server_certificate((host, port)). It will return the certificate in PEM format.
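A minimal sketch of how that can be used for the original goal; the hostname and the fingerprint pinning are just illustrations, and the second variant (handshake with verification disabled, then inspecting the DER certificate) is an additional option beyond what the answer states:
import ssl
import socket
import hashlib

host, port = 'example.com', 443  # hypothetical endpoint

# Option 1: grab the certificate out of band as PEM, as suggested above.
pem_cert = ssl.get_server_certificate((host, port))

# Option 2: handshake with verification disabled, then verify manually.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
with socket.create_connection((host, port)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        der_cert = tls.getpeercert(binary_form=True)
        # Custom verification, e.g. pin the certificate's SHA-256 fingerprint.
        fingerprint = hashlib.sha256(der_cert).hexdigest()
        print('peer fingerprint:', fingerprint)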

Python httplib disable certificate validation

I have the following code to use:
def createCon(host, auth):
    con = httplib.HTTPSConnection(host)
    return con

def _readJson(con, url):
    con.putrequest("GET", url)
    ...
    r = con.getresponse()
It is working on a specific server, but on another I'm getting SSLError. If I open the URL from a browser, I need to accept the certificate and then it works well. But how can I either accept the certificate or disable its validation? It is a self-signed certificate stored in a Java keystore, so I would prefer to disable the verification...
The code is meant to reuse the connection between the requests, so I would prefer not to modify it deeply.
How could I do this? I tried to modify the examples but haven't succeeded.
con.putrequest("GET",url, verify=False)
or
con.request.verify=False
I do not know how I could access the session or request objects or modify the default settings.
UPDATE: this does not help:
socket.ssl.cert_reqs='CERT_NONE'
Well, the actual error message is weird...:
SSLError:'[Errno 1] _ssl.c:492: error:100AE081:elliptic curve routines:EC_GROUP_new_by_curve_name:unknown group'
Regards:
Bence
Your error message points to a bug in the OpenSSL version you use. See https://bugzilla.redhat.com/show_bug.cgi?id=1022468. In short: the client advertises capabilities it does not have, and if the server picks such a capability you get this error message. This needs to be fixed by upgrading your local OpenSSL installation. A workaround on the server side should be possible too, if you have control over the server.
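Separately from the OpenSSL bug above, if the goal in the question title is just to skip certificate validation: on Python 2.7.9+ HTTPSConnection accepts a context argument. A minimal sketch (the hostname is hypothetical):
import httplib  # http.client on Python 3
import ssl

# Assumes Python 2.7.9+, where HTTPSConnection gained the context parameter.
con = httplib.HTTPSConnection('self-signed.example.com',
                              context=ssl._create_unverified_context())
con.request('GET', '/')
print(con.getresponse().status)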

How does Python urllib2 https work?

Looking at the documentation for urllib2, it says it supports HTTPS connections. However, what it doesn't make clear is how you enable it. Do you, for example, take HTTPBasicAuth and replace the HTTP with HTTPS, or do you just need to pass an HTTPS URL when you actually open the connection?
Python < 2.7.9:
You can simply pass an HTTPS URL when you open the connection. Heed the warning in the urllib2 documentation that states:
"Warning: HTTPS requests do not do any verification of the server’s certificate."
As such, I recommend using the Python Requests library, which provides a better interface and many features, including SSL cert verification and Unicode support.
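For completeness, here is what the plain urllib2 form looks like; a minimal sketch, assuming Python 2 and a reachable https:// URL (the credentials below are hypothetical). Remember that before 2.7.9 the server certificate is not verified:
import urllib2

# Just pass an https:// URL; the handler is picked automatically.
response = urllib2.urlopen('https://example.com/')
print(response.getcode())

# Basic auth works the same way over HTTPS: install an HTTPBasicAuthHandler
# and keep the https:// scheme in the URL.
password_mgr = urllib2.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, 'https://example.com/', 'user', 'secret')  # hypothetical credentials
opener = urllib2.build_opener(urllib2.HTTPBasicAuthHandler(password_mgr))
print(opener.open('https://example.com/').getcode())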
Update 2015-01-20:
Python 2.7.9 added HTTPS hostname verification as standard. See the change comment at https://docs.python.org/2/library/httplib.html#httplib.HTTPSConnection
Thanks to @EnnoGröper for the notice.
