Notes:
Versions: Python 2.7.11, requests 2.10.0, OpenSSL 1.0.2d (9 Jul 2015).
Please read the comment below by Martijn Pieters before reproducing.
Initially I tried to download the file from https://www.neco.navy.mil/necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx using the code below.
Code 1:
>>> import requests
>>> requests.get("https://www.neco.navy.mil/necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx",verify=False)
Error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\adapters.py", line 447, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: SysCallError(10054, 'WSAECONNRESET')",)
After googling and searching I found that tweaking SSL verification and using a session with a custom adapter can solve the problem, but I still got errors. Please find the code and errors below.
Code 2:
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
import ssl

class MyAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)

s = requests.Session()
s.mount('https://', MyAdapter())
print "Mounted"
r = s.get("https://www.neco.navy.mil/necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx", stream=True, timeout=120)
Error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.py", line 480, in get
    return self.request('GET', url, **kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.py", line 468, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\mob140003207\AppData\Local\Enthought\Canopy\User\lib\site-packages\requests\adapters.py", line 447, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: SysCallError(10054, 'WSAECONNRESET')",)
First of all, I confirm that the host, www.neco.navy.mil, is not accessible from everywhere. From some networks (geographies) it works; from others the connection just hangs:
$ curl www.neco.navy.mil
curl: (7) couldn't connect to host
$ curl https://www.neco.navy.mil
curl: (7) couldn't connect to host
Second, when the connection can be established, there is a certificate problem:
$ curl -v https://www.neco.navy.mil
* Rebuilt URL to: https://www.neco.navy.mil/
* Hostname was NOT found in DNS cache
* Trying 205.85.2.133...
* Connected to www.neco.navy.mil (205.85.2.133) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: none
CApath: /etc/ssl/certs
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS alert, Server hello (2):
* SSL certificate problem: unable to get local issuer certificate
* Closing connection 0
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: http://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
To make sure, I just fed it to the Qualys SSL tester:
The CA (DoD Root CA 2) is not trusted. Moreover, it's not in the chain. Note that OpenSSL's validation process needs the whole chain:
Firstly a certificate chain is built up starting from the supplied certificate and ending in the root CA. It is an error if the whole chain cannot be built up.
But there's only www.neco.navy.mil -> DODCA-28. It may be related to the TLD and extra security measures, but the C grade alone isn't much anyway ;-)
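A minimal sketch of what "the whole chain" means on the client side, using only the stdlib ssl module (the CA bundle path is hypothetical; it would need to contain both DODCA-28 and DoD Root CA 2 concatenated in PEM form):

```python
import ssl

# A default context trusts only the system roots. Since the server sends
# just leaf -> DODCA-28, verification fails unless the missing links are
# supplied locally.
ctx = ssl.create_default_context()

# Hypothetical bundle with the intermediate (DODCA-28) and the root
# (DoD Root CA 2) concatenated:
# ctx.load_verify_locations(cafile='/path/to/DODCA-28_and_DoD_Root_CA_2.pem')

# A default context keeps strict verification on:
print(ctx.verify_mode == ssl.CERT_REQUIRED)
```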
On the Python side it won't be much different. If you don't have access to the CA, you can only disable certificate validation entirely (after you have the connectivity problem solved, of course). If you have it, you can use cafile.
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import urllib2
import ssl

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
r = urllib2.urlopen('https://www.neco.navy.mil/'
    'necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx',
    timeout=5, context=ctx)
print(len(r.read()))

r = urllib2.urlopen('https://www.neco.navy.mil/'
    'necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx',
    timeout=5, cafile='/path/to/DODCA-28_and_DoD_Root_CA_2.pem')
print(len(r.read()))
To reproduce with a specific version of Python, use a simple Dockerfile like the following:
FROM python:2.7.11
WORKDIR /opt
ADD . ./
CMD dpkg -s openssl | grep Version && ./app.py
Then run:
docker build -t ssl-test .
docker run --rm ssl-test
This snippet works for me (Python 2.7.11 64-bit + requests==2.10.0) on Windows 7:
import requests
import ssl
import shutil
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager

class MyAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)

if __name__ == "__main__":
    s = requests.Session()
    s.mount('https://', MyAdapter())
    print "Mounted"
    filename = "N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx"
    r = s.get(
        "https://www.neco.navy.mil/necoattach/{0}".format(filename),
        verify=False, stream=True, timeout=120)
    if r.status_code == 200:
        with open(filename, 'wb') as f:
            r.raw.decode_content = True
            shutil.copyfileobj(r.raw, f)
I use Python 2.7.6, and this simple example still works on my Ubuntu 14.04:
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning

requests.packages.urllib3.disable_warnings(InsecureRequestWarning)

with open('out.docx', 'wb') as h:
    r = requests.get("https://www.neco.navy.mil/necoattach/N6945016R0626_2016-06-20__INFO_NAS_Pensacola_Base_Access.docx", verify=False, stream=True)
    for block in r.iter_content(1024):
        h.write(block)
Related
If I try to use requests.get() to connect to an HTTPS server (a Jenkins), I get the SSL error: CERTIFICATE_VERIFY_FAILED certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
HTTPS connections work fine if I use curl or any browser.
The HTTPS server is an internal server, but it uses an SSL cert from DigiCert. It is a wildcard certificate, and the same certificate is used for a lot of other servers (like IIS servers) in my company, which work fine together with requests.
If I use the urllib package, the HTTPS connection is also fine.
I don't understand why requests doesn't work, and I'm asking what I can do so that requests does work.
And no! verify=False is not the solution ;-)
For the SSLContext in the second function, I had to call load_default_certs().
My system: Windows 10, Python 3.10, requests 2.28.1, urllib3 1.26.10, certifi 2022.6.15. Packages were installed today.
url = 'https://redmercury.acme.org/'

def use_requests(url):
    import requests
    try:
        r = requests.get(url)
        print(r)
    except Exception as e:
        print(e)

def use_magic_code_from_stackoverflow(url):
    import urllib.request
    import ssl
    ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # ssl_context.verify_mode = ssl.CERT_REQUIRED
    # ssl_context.check_hostname = True
    ssl_context.load_default_certs()  # WITHOUT this I got SSL error(s)
    https_handler = urllib.request.HTTPSHandler(context=ssl_context)
    opener = urllib.request.build_opener(https_handler)
    ret = opener.open(url, timeout=2)
    print(ret.status)

def use_urllib_requests(url):
    import urllib.request
    with urllib.request.urlopen(url) as response:
        print(response.status)

use_requests(url)                       # SSL error
use_magic_code_from_stackoverflow(url)  # server answers with 200
use_urllib_requests(url)                # server answers with 200
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "main.py", line 34, in <module>
page = http.robotCheck()
File "C:\Users\user001\Desktop\automation\Loader.py", line 21, in robotCheck
request = requests.get('http://*redacted*.onion/login')
File "C:\Users\user001\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\user001\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\user001\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\sessions.py", line 512, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\user001\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\sessions.py", line 622, in send
r = adapter.send(request, **kwargs)
File "C:\Users\user001\AppData\Local\Programs\Python\Python37-32\lib\site-packages\requests\adapters.py", line 513, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='*redacted*.onion', port=80): Max retries exceeded with url: /login (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0348E230>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))
Here's the code of my Loader class. Loader is responsible for uploading and making requests to a resource over Tor. I configured the proxy, but it still throws the error.
I disabled my VPN, the Windows firewall, and everything else I could.
import requests

class Loader:
    url = "http://*redacted*.onion/login"
    user = "username"
    password = 'password'
    manageUrl = ''

    def __init__(self, config):
        self.config = config
        self.restartSession()

    def restartSession(self):
        self.userSession = requests.session()
        self.userSession.proxies['http'] = 'http://127.0.0.1:9050'
        self.userSession.proxies['https'] = 'https://127.0.0.1:9051'

    def robotCheck(self):
        request = requests.get('http://*redacted*.onion/login')
        print(request)
        #self.session.post(self.robotCheckUrl, data=checkResult)

    def authorization(self):
        self.session.get(self.url)
        authPage = self.session.post(self.url, data=self.getAuthData())

    def getAuthData(self):
        return {'login': self.user, 'password': self.password}

Code which calls the Loader class:

http = Loader(Config())
page = http.robotCheck()
Tor is a SOCKS proxy, so the proxy configuration needs to be slightly different.
Change the following lines:
self.userSession.proxies['http'] = 'http://127.0.0.1:9050'
self.userSession.proxies['https'] = 'https://127.0.0.1:9051'
To:
self.userSession.proxies['http'] = 'socks5h://127.0.0.1:9050'
self.userSession.proxies['https'] = 'socks5h://127.0.0.1:9050'
Port 9051 is the Tor Controller port. For both HTTP and HTTPS SOCKS connections over Tor, use port 9050 (the default SOCKS port).
The socks5h scheme is necessary to have DNS names resolved over Tor instead of by the client. This privatizes DNS lookups and is necessary to be able to resolve .onion addresses.
EDIT: I was able to make a SOCKS request for a .onion address using the following example:
import socket
import requests
s = requests.session()
s.proxies['http'] = 'socks5h://127.0.0.1:9050'
s.proxies['https'] = 'socks5h://127.0.0.1:9050'
print(s.proxies)
r = s.get('http://***site***.onion/')
Make sure you have the most up-to-date requests library with pip3 install -U requests[socks]
Wrap like this to get the exception only.
def robotCheck(self):
    try:
        request = requests.get('http://hydraruzxpnew4af.onion/login')
        print(request)
        #self.session.post(self.robotCheckUrl, data=checkResult)
    except requests.exceptions.RequestException as e:
        print('exception caught', e)
You may get these errors for the following reasons:
the server may refuse your connection (you're sending too many requests from the same IP address in a short period of time)
check your proxy settings
For your reference: https://github.com/requests/requests/issues/1198
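If the refusals come from rate limiting, spacing retries out usually helps. A minimal sketch (a generic helper of my own, not part of requests) that retries a callable with exponential backoff:

```python
import time

def with_backoff(fn, attempts=4, base_delay=1.0):
    """Call fn(); on failure, sleep base_delay, 2*base_delay, 4*base_delay, ...
    and retry. Re-raises the last exception if every attempt fails."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage sketch (url is whatever you are fetching):
# r = with_backoff(lambda: requests.get(url, timeout=10))
```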
I'm using a REST API from euronext.com; to go any further I need to verify the server certificate and send my own client certificate through the requests module.
I already did some testing with curl, and both .crt/.pem files were accepted.
But requests is still throwing:
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): saturn-api-h.euronext.com
Traceback (most recent call last):
File "C:\python36\lib\site-packages\urllib3\contrib\pyopenssl.py", line 441, in wrap_socket
cnx.do_handshake()
File "C:\python36\lib\site-packages\OpenSSL\SSL.py", line 1806, in do_handshake
self._raise_ssl_error(self._ssl, result)
File "C:\python36\lib\site-packages\OpenSSL\SSL.py", line 1546, in _raise_ssl_error
_raise_current_error()
File "C:\python36\lib\site-packages\OpenSSL\_util.py", line 54, in exception_from_error_queue
raise exception_type(errors)
OpenSSL.SSL.Error: [('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')]
What I did try to solve the issue :
Follow the requests documentation
Update requests module to 2.18.4
Install pyOpenSSL 17.5.0
Check the .crt/.pem formatting
Curl tests
Working curl:
curl -i -vvv -X POST https://saturn-api-h.euronext.com/SaturnWebServices/rest/Authentication/AuthenticateUser -H "Content-Type: application/json" --cert ./client.crt --cacert ./digicert-full-chain.crt
With a valid Authorization header it returns a 200 status code; without one, 401 "Access denied!". If the certificate validation fails, it redirects to euronext.com with status code 302.
Problematic python:
endpoint = 'https://saturn-api-h.euronext.com/SaturnWebServices/rest/Authentication/AuthenticateUser'
headers = { 'Content-Type': 'application/json', } #'Authorization': 'Basic <auth_string>',
r = requests.post(endpoint, headers = headers, verify = './digicert-full-chain.crt', cert = './client.crt')
Certificates:
digicert-full-chain.crt is containing the full chain from DigiCert:
DigiCertAssuredIDRootCA.pem
DigiCertSHA2AssuredIDCA.pem
DigiCertSHA2SecureServerCA.pem
client.crt contains our certificate and its key.
Why is the curl command working whereas Python's requests module is failing?
Is there any way to show the complete handshake process from the requests module?
I solved the problem myself: a certificate was missing from the full chain.
Doing cat /usr/local/lib/python3.4/dist-packages/certifi/cacert.pem > certifi-digicert.pem; cat digicert-full-chain.pem >> certifi-digicert.pem and passing the resulting certificate bundle to the verify argument worked.
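The same concatenation can be done from Python. A generic sketch (the paths in the usage comment are examples, e.g. certifi's cacert.pem plus your DigiCert chain file):

```python
def build_ca_bundle(base_bundle, extra_chain, out_path):
    """Concatenate a base CA bundle and an extra chain file into one PEM
    file suitable for requests' verify= argument."""
    with open(out_path, 'w') as out:
        for path in (base_bundle, extra_chain):
            with open(path) as f:
                out.write(f.read())
    return out_path

# Usage sketch:
# bundle = build_ca_bundle(
#     '/usr/local/lib/python3.4/dist-packages/certifi/cacert.pem',
#     'digicert-full-chain.pem', 'certifi-digicert.pem')
# requests.post(endpoint, verify=bundle, cert='./client.crt')
```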
As a side note, I used the strace command to compare certificate locations from both python and curl:
strace <python_command> |& grep open | grep -E 'crt|pem'
strace <curl_command> |& grep open | grep -E 'crt|pem'
I also checked with Wireshark to get the full handshake process, the pyOpenSSL module was getting an error after the server's Certificate Request. Alert (Level: Fatal, Description: Unknown CA).
I have a problem with Python (I'm a Python noob and still learning it).
I'm using version 2.7.9 on a Debian 9 system. I installed the paho and tinkerforge Python packages.
I developed a script using the Paho MQTT client to connect to my mosquitto broker. I want to use an encrypted connection. The connection works fine unencrypted but fails when encrypted. The encrypted connection works fine with openHAB (MQTT subscriber) and MQTTFX (MQTT subscriber and producer).
I'm using these parameters in my script:
self.client = mqtt.Client()
self.client.tls_set("/home/pi/ca-cert.pem",
                    "/home/pi/IWILR1-1-cert.pem",
                    "/home/pi/IWILR1-1.pem",
                    tls_version=ssl.PROTOCOL_TLSv1)
# tls_insecure_set(False) keeps peer verification enabled
self.client.tls_insecure_set(False)
self.client.on_connect = self.mqtt_on_connect
self.client.on_disconnect = self.mqtt_on_disconnect
self.client.on_message = self.mqtt_on_message
self.device_proxies = {}
self.device_proxy_classes = {}
for subclass in DeviceProxy.subclasses():
    self.device_proxy_classes[subclass.DEVICE_CLASS.DEVICE_IDENTIFIER] = subclass

def connect(self):
    if self.broker_username is not None:
        self.client.username_pw_set(self.broker_username, self.broker_password)
    self.client.connect(self.broker_host, self.broker_port)
    self.client.loop_start()
But now I get this error in Python:
sudo python /home/pi/brick-mqtt-proxy.py
Traceback (most recent call last):
File "/home/pi/brick-mqtt-proxy.py", line 1250, in <module>
proxy.connect()
File "/home/pi/brick-mqtt-proxy.py", line 1109, in connect
self.client.connect(self.broker_host, self.broker_port)
File "/usr/local/lib/python2.7/dist-packages/paho/mqtt/client.py", line 760, in connect
return self.reconnect()
File "/usr/local/lib/python2.7/dist-packages/paho/mqtt/client.py", line 919, in reconnect
sock.do_handshake()
File "/usr/lib/python2.7/ssl.py", line 840, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)
and mosquitto logs these errors:
1504896114: New connection from 143.93.197.20 on port 8883.
1504896114: OpenSSL Error: error:14094418:SSL routines:SSL3_READ_BYTES:tlsv1 alert unknown ca
1504896114: OpenSSL Error: error:140940E5:SSL routines:SSL3_READ_BYTES:ssl handshake failure
1504896114: Socket error on client <unknown>, disconnecting.
Mosquitto conf
# Place your local configuration in /etc/mosquitto/conf.d/
#
# A full description of the configuration file is at
# /usr/share/doc/mosquitto/examples/mosquitto.conf.example
pid_file /var/run/mosquitto.pid
persistence true
persistence_location /var/lib/mosquitto/
log_type all
log_facility 5
log_timestamp true
log_dest file /var/log/mosquitto/mosquitto.log
include_dir /etc/mosquitto/conf.d
port 8883
cafile /etc/mosquitto/ca_certificates/ca-cert.pem
certfile /etc/mosquitto/certs/server-cert.pem
keyfile /etc/mosquitto/certs/server-key.pem
Only the server and CA certificates match the broker hostname; each client uses its own hostname for the CN. I hope that's right?
I hope you can help me fix my problem.
PS: I used a self-signed certificate! TLS version 1.2.
If you are using TLS v1.2, you need to change tls_version=ssl.PROTOCOL_TLSv1 in the tls_set() call (second line) to tls_version=ssl.PROTOCOL_TLSv1_2. Note the constant is spelled TLSv1_2, not TLSv1.2 as you might expect. This worked for me.
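A short sketch of the fix (the certificate paths are the ones from the question; paho-mqtt itself is only shown in a comment, the runnable part just checks the stdlib constant):

```python
import ssl

# The constant is PROTOCOL_TLSv1_2 (with an underscore);
# "ssl.PROTOCOL_TLSv1.2" would not even parse as Python.
print(hasattr(ssl, 'PROTOCOL_TLSv1_2'))

# With paho-mqtt the call would then read:
# self.client.tls_set("/home/pi/ca-cert.pem",
#                     "/home/pi/IWILR1-1-cert.pem",
#                     "/home/pi/IWILR1-1.pem",
#                     tls_version=ssl.PROTOCOL_TLSv1_2)
```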
Try providing something like the below. The default port for SSL is 8883, and mosquitto can start multiple listeners; in this case, non-SSL on 1883 and SSL on 8883.
port 1883
listener 8883
I use Python 2.6 and request the Facebook API (HTTPS). I guess my service could be the target of man-in-the-middle attacks.
Reading the urllib module documentation again this morning, I discovered this warning:
Warning: When opening HTTPS URLs, it is not attempted to validate the server certificate. Use at your own risk!
Do you have hints / URLs / examples on how to do full certificate validation?
Thanks for your help.
You could create a urllib2 opener which does the validation for you using a custom handler. The following code is an example that works with Python 2.7.3. It assumes you have downloaded http://curl.haxx.se/ca/cacert.pem to the same folder where the script is saved.
#!/usr/bin/env python

import urllib2
import httplib
import ssl
import socket
import os

CERT_FILE = os.path.join(os.path.dirname(__file__), 'cacert.pem')

class ValidHTTPSConnection(httplib.HTTPConnection):
    "This class allows communication via SSL."

    default_port = httplib.HTTPS_PORT

    def __init__(self, *args, **kwargs):
        httplib.HTTPConnection.__init__(self, *args, **kwargs)

    def connect(self):
        "Connect to a host on a given (SSL) port."
        sock = socket.create_connection((self.host, self.port),
                                        self.timeout, self.source_address)
        if self._tunnel_host:
            self.sock = sock
            self._tunnel()
        self.sock = ssl.wrap_socket(sock,
                                    ca_certs=CERT_FILE,
                                    cert_reqs=ssl.CERT_REQUIRED)

class ValidHTTPSHandler(urllib2.HTTPSHandler):

    def https_open(self, req):
        return self.do_open(ValidHTTPSConnection, req)

opener = urllib2.build_opener(ValidHTTPSHandler)

def test_access(url):
    print "Accessing", url
    page = opener.open(url)
    print page.info()
    data = page.read()
    print "First 100 bytes:", data[0:100]
    print "Done accessing", url
    print ""

# This should work
test_access("https://www.google.com")

# Accessing a page with a self-signed certificate should not work
# At the time of writing, the following page uses a self-signed certificate
test_access("https://tidia.ita.br/")
Running this script you should see output like this:
Accessing https://www.google.com
Date: Mon, 14 Jan 2013 14:19:03 GMT
Expires: -1
...
First 100 bytes: <!doctype html><html itemscope="itemscope" itemtype="http://schema.org/WebPage"><head><meta itemprop
Done accessing https://www.google.com
Accessing https://tidia.ita.br/
Traceback (most recent call last):
File "https_validation.py", line 54, in <module>
test_access("https://tidia.ita.br/")
File "https_validation.py", line 42, in test_access
page = opener.open(url)
...
File "/usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1177, in do_open
raise URLError(err)
urllib2.URLError: <urlopen error [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed>
If you have a trusted Certificate Authority (CA) file, you can use Python 2.6 and later's ssl library to validate the certificate. Here's some code:
import os.path
import ssl
import sys
import urlparse
import urllib

def get_ca_path():
    '''Download the Mozilla CA file cached by the cURL project.

    If you have a trusted CA file from your OS, return the path
    to that instead.
    '''
    cafile_local = 'cacert.pem'
    cafile_remote = 'http://curl.haxx.se/ca/cacert.pem'
    if not os.path.isfile(cafile_local):
        print >> sys.stderr, "Downloading %s from %s" % (
            cafile_local, cafile_remote)
        urllib.urlretrieve(cafile_remote, cafile_local)
    return cafile_local

def check_ssl(hostname, port=443):
    '''Check that an SSL certificate is valid.'''
    print >> sys.stderr, "Validating SSL cert at %s:%d" % (
        hostname, port)
    cafile_local = get_ca_path()
    try:
        server_cert = ssl.get_server_certificate((hostname, port),
                                                 ca_certs=cafile_local)
    except ssl.SSLError:
        print >> sys.stderr, "SSL cert at %s:%d is invalid!" % (
            hostname, port)
        raise

class CheckedSSLUrlOpener(urllib.FancyURLopener):
    '''A URL opener that checks that SSL certificates are valid.

    On SSL error, it will raise ssl.SSLError.
    '''
    def open(self, fullurl, data=None):
        urlbits = urlparse.urlparse(fullurl)
        if urlbits.scheme == 'https':
            if ':' in urlbits.netloc:
                hostname, port = urlbits.netloc.split(':')
                port = int(port)
            else:
                hostname = urlbits.netloc
                if urlbits.port is None:
                    port = 443
                else:
                    port = urlbits.port
            check_ssl(hostname, port)
        return urllib.FancyURLopener.open(self, fullurl, data)

# Plain usage - can probably do once per day
check_ssl('www.facebook.com')

# URL Opener
opener = CheckedSSLUrlOpener()
opener.open('https://www.facebook.com/find-friends/browser/')

# Make it the default
urllib._urlopener = opener
urllib.urlopen('https://www.facebook.com/find-friends/browser/')
Some dangers with this code:
You have to trust the CA file from the cURL project (http://curl.haxx.se/ca/cacert.pem), which is a cached version of Mozilla's CA file. It's also over HTTP, so there is a potential MITM attack. It's better to replace get_ca_path with one that returns your local CA file, which will vary from host to host.
There is no attempt to see if the CA file has been updated. Eventually, root certs will expire or be deactivated, and new ones will be added. A good idea would be to use a cron job to delete the cached CA file, so that a new one is downloaded daily.
It's probably overkill to check certificates every time. You could manually check once per run, or keep a list of 'known good' hosts over the course of the run. Or, be paranoid!
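The cron-job idea from the second point can also live in the script itself. A small sketch (the helper name is mine) that decides whether the cached CA file needs re-downloading:

```python
import os
import time

def ca_file_is_stale(path, max_age_days=1):
    """True if the cached CA bundle is missing or older than max_age_days,
    in which case the caller should fetch a fresh copy."""
    if not os.path.isfile(path):
        return True
    age_seconds = time.time() - os.path.getmtime(path)
    return age_seconds > max_age_days * 86400

# Usage sketch, wired into the answer's get_ca_path idea:
# if ca_file_is_stale('cacert.pem'):
#     urllib.urlretrieve('http://curl.haxx.se/ca/cacert.pem', 'cacert.pem')
```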