I'm trying to connect to an internal site using urlopen, and it fails repeatedly with an SSL error regardless of the cafile I provide.
I've tried all the approaches explained in the Stack Overflow answers, but with no luck.
First Way:
urllib2.urlopen(url, cafile=certifi.where())
Second Way:
context = ssl.create_default_context(cafile=certifi.where())
urllib2.urlopen(url, context=context)
Third Way:
ctx = ssl.create_default_context()
ctx.load_verify_locations(cafile=certifi.where())
urllib2.urlopen(url, context=ctx)
Whichever way I try, I get the following error.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "urllib2.py", line 429, in open
response = self._open(req, data)
File "urllib2.py", line 447, in _open
'_open', req)
File "urllib2.py", line 407, in _call_chain
result = func(*args)
File "urllib2.py", line 1241, in https_open
context=self._context)
File "urllib2.py", line 1198, in do_open
raise URLError(err)
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)>
I've also appended the CA certificate to the file returned by certifi.where(). I'm using Python 2.7.14.
Am I missing something here, or does the Python version I use not support this? Also, how can I debug this, i.e. find out whether there is any issue with the CA certificate itself?
Thanks.
EDIT: I don't want to opt out of SSL verification, as suggested in one of the answers to urllib and "SSL: CERTIFICATE_VERIFY_FAILED" Error. The other answer there tells me to pass cafile to urlopen, which doesn't work in my case. I've tried the solutions given in the answers to this question, but no luck.
Also, openssl reports the following.
[root@host1 ~]# openssl s_client -connect url -CAfile "cacert.pem"
...
Certificate chain
...
Server certificate
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
...
No client certificate CA names sent
...
SSL handshake has read 2098 bytes and written 415 bytes
...
Verify return code: 2 (unable to get issuer certificate)
---
HTTP/1.0 408 Request Time-out
Cache-Control: no-cache
Connection: close
Content-Type: text/html
<html><body><h1>408 Request Time-out</h1>
Your browser didn't send a complete request in time.
</body></html>
closed
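For what it's worth, the openssl output above already hints at the cause: "Verify return code: 2 (unable to get issuer certificate)" usually means the issuer of one of the presented certificates (often an intermediate CA) is not in the CA file. Below is a minimal probe to check whether the bundle verifies the server at all (a sketch only; internal.example.com is a stand-in for the real host):

# Minimal probe: does the CA bundle verify the internal host at all?
# (Sketch only: internal.example.com is a placeholder for the real host,
# CA_BUNDLE is the PEM file the CA certificate was appended to.)
from __future__ import print_function
import socket
import ssl

import certifi

HOST = 'internal.example.com'
CA_BUNDLE = certifi.where()

ctx = ssl.create_default_context(cafile=CA_BUNDLE)
sock = socket.create_connection((HOST, 443))
try:
    ssock = ctx.wrap_socket(sock, server_hostname=HOST)
    print('chain verified; peer subject:', ssock.getpeercert()['subject'])
    ssock.close()
except ssl.SSLError as err:
    # "unable to get issuer certificate" here usually means the issuer
    # (often an intermediate CA) is missing from CA_BUNDLE.
    print('handshake failed:', err)
finally:
    sock.close()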
Related
I am attempting to use Python sockets to make an Extensible Provisioning Protocol (EPP) request to a domain registrar, which only accepts requests over SSL.
Certificate file: www.myDomain.se.crt
Key File: mydomain.pem
openssl s_client -connect epptestv3.iis.se:700 -cert www.myDomain.se.crt -key mydomain.pem
When I make the request using the openssl client I successfully get a greeting response from the registrar, but when I use the following code in Python I get an SSL certificate error.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(15)
sock.connect(('epptestv3.iis.se', 700))
sock.settimeout(60) # regular timeout
ssl_keyfile='myDomain.pem'
ssl_certfile='www.myDomain.se.crt'
ssl_ciphers='AES256-GCM-SHA384'
ssl_version=ssl.PROTOCOL_TLSv1_2
sock = ssl.wrap_socket(sock,
ssl_keyfile,
ssl_certfile,
ssl_version=ssl_version,
ciphers=ssl_ciphers,
server_side=False,
cert_reqs=ssl.CERT_REQUIRED,
ca_certs=None
)
After executing the script, I get the following error:
Traceback (most recent call last):
File "server_connect.py", line 54, in <module>
ca_certs=ssl_keyfile
File "/usr/lib/python2.7/ssl.py", line 933, in wrap_socket
ciphers=ciphers)
File "/usr/lib/python2.7/ssl.py", line 601, in __init__
self.do_handshake()
File "/usr/lib/python2.7/ssl.py", line 830, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
Any idea what's wrong here?
From your code:
cert_reqs=ssl.CERT_REQUIRED,
ca_certs=None
From the documentation of wrap_socket:
If the value of this parameter is not CERT_NONE, then the ca_certs parameter must point to a file of CA certificates.
Essentially your code asks to validate the server's certificate (CERT_REQUIRED) while specifying at the same time that you have no trusted roots (ca_certs=None). Without trusted root certificates, no validation can be done.
Note that changing your code to use CERT_NONE instead would be a bad idea. It would probably "work", since no certificate validation is done, but it would be open to man-in-the-middle attacks.
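A minimal sketch of the corrected call, assuming the registrar's CA chain has been saved to a local PEM bundle (ca_bundle.pem is a hypothetical filename):

import socket
import ssl

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('epptestv3.iis.se', 700))

# With CERT_REQUIRED, ca_certs must point to a PEM file containing the CA
# chain that signed the server certificate, so validation has roots to use.
ssl_sock = ssl.wrap_socket(sock,
                           keyfile='mydomain.pem',
                           certfile='www.myDomain.se.crt',
                           cert_reqs=ssl.CERT_REQUIRED,
                           ssl_version=ssl.PROTOCOL_TLSv1_2,
                           ca_certs='ca_bundle.pem')  # hypothetical CA bundle

Note that wrap_socket does not check the hostname; if you need that as well, call ssl.match_hostname(ssl_sock.getpeercert(), 'epptestv3.iis.se') after the handshake.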
I'm trying to send a request to an HTTPS URL to get data; the domain requires a security certificate when I open it in the browser.
My issue is how to call the URL from my Python code and get the response data.
I've written the following code:
conn = HTTPSConnectionPool(BETTING_CONFG['api_url'],
                           maxsize=BETTING_CONFG['connection_max_size'])
response = conn.request_encode_body('POST', service_uri, headers=headers,
                                    encode_multipart=False, body=body)
and I get the following response:
Response: status = 200, payload = {"_status":"error","payload":{"_code":"0-2","_message":"invalid_app_key"}} .
and these warnings on the terminal:
/usr/local/lib/python2.7/dist-packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
/usr/local/lib/python2.7/dist-packages/urllib3/connectionpool.py:821: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)
[555WIN] 2016-05-30 14:02:06,043 - INFO - Betting Response: status = 200, payload = {"_status":"error","payload":{"_code":"0-2","_message":"invalid_app_key"}} .
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/handlers.py", line 76, in emit
if self.shouldRollover(record):
File "/usr/lib/python2.7/logging/handlers.py", line 156, in shouldRollover
msg = "%s\n" % self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 724, in format
return fmt.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 464, in format
record.message = record.getMessage()
File "/usr/lib/python2.7/logging/__init__.py", line 324, in getMessage
msg = str(self.msg)
TypeError: __str__ returned non-string (type dict)
Logged from file jsonapi.py, line 137
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/__init__.py", line 851, in emit
msg = self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 724, in format
return fmt.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 464, in format
record.message = record.getMessage()
File "/usr/lib/python2.7/logging/__init__.py", line 324, in getMessage
msg = str(self.msg)
TypeError: __str__ returned non-string (type dict)
When I added the certificate to Chrome and sent the request from Postman, it worked fine.
Any help on how to fix this?
Please understand that your Chrome certificate store is not the same certificate store that is used by your Python application.
It would be much easier if you could simply get a valid SSL certificate instead of trying to make self-signed ones work.
Also, be sure to upgrade your Python and urllib3; those warning messages are not to be ignored! Resolve them first!
SSL certificates used to be expensive, but now you can get valid, fully supported certificates for free from Let's Encrypt. I run my own website using their certificates and I can assure you that Python has no problem loading them.
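Once Python and urllib3 are upgraded, a minimal sketch of a verified request using certifi looks like this (api.example.com and the field names are placeholders, not your real endpoint):

import certifi
import urllib3

# Verify the server certificate against the Mozilla CA bundle shipped with certifi.
http = urllib3.PoolManager(cert_reqs='CERT_REQUIRED', ca_certs=certifi.where())
response = http.request('POST', 'https://api.example.com/betting',
                        fields={'app_key': 'your-real-app-key'})
print(response.status, response.data)

Also note that the "invalid_app_key" payload in your output is an application-level error from the API and is separate from the TLS warnings.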
I am having trouble using pyVmomi with Python 2.7.5. I get SSL certificate errors when trying to run the sample scripts from the SDK. I tried all the solutions mentioned in this post, but none of them worked for me.
Below is the complete console output.
/usr/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:315: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#snimissingwarning.
SNIMissingWarning
/usr/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Traceback (most recent call last):
File "hello_world_vcenter.py", line 105, in <module>
main()
File "hello_world_vcenter.py", line 80, in main
port=int(args.port))
File "/usr/lib/python2.7/site-packages/pyVim/connect.py", line 663, in SmartConnect
sslContext)
File "/usr/lib/python2.7/site-packages/pyVim/connect.py", line 552, in __FindSupportedVersion
sslContext)
File "/usr/lib/python2.7/site-packages/pyVim/connect.py", line 472, in __GetServiceVersionDescription
tree = __GetElementTreeFromUrl(url, sslContext)
File "/usr/lib/python2.7/site-packages/pyVim/connect.py", line 440, in __GetElementTreeFromUrl
sock = requests.get(url)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 67, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 53, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 447, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
That looks like you're using a self-signed certificate. When connecting via SmartConnect, pass your own sslContext and disable certificate verification:
from pyVim.connect import SmartConnect
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLSv1)
context.verify_mode = ssl.CERT_NONE
si = SmartConnect(host='somehost.com', port=443, user='someone', pwd='secret', sslContext=context)
... or use a signed ssl certificate.
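Alternatively, if the vCenter certificate is signed by an internal CA you trust, a sketch that keeps verification on and loads that CA instead (assumes a Python with ssl.create_default_context, i.e. 2.7.9+; the bundle path is a placeholder):

import ssl
from pyVim.connect import SmartConnect

# Keep verification enabled, but trust the internal CA that signed the
# vCenter certificate (the PEM path below is hypothetical).
context = ssl.create_default_context(cafile='/path/to/internal-ca.pem')
si = SmartConnect(host='somehost.com', port=443,
                  user='someone', pwd='secret', sslContext=context)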
There are a few similar questions on here (e.g. here).
I recently got an SSL certificate for my site:
https://ram.rachum.com/
It works great in browsers. But it fails for requests:
>>> import requests
>>> requests.get('https://ram.rachum.com')
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
requests.get('https://ram.rachum.com')
File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 354, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 460, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 250, in send
raise SSLError(e)
SSLError: hostname 'ram.rachum.com' doesn't match either of '*.webfaction.com', 'webfaction.com'
Why? Why does requests look at the webfaction certificate rather than my own certificate, which is valid for ram.rachum.com?
You are using a requests library without support for SNI (Server Name Indication), but you have multiple SSL certificates behind the same IP address, which requires SNI. You can verify this with openssl s_client. Without giving a name for SNI, the server just serves the default certificate for this IP, which is *.webfaction.com:
openssl s_client -connect ram.rachum.com:443
...
0 ...CN=*.webfaction.com
But if you specify a hostname for SNI it returns the expected certificate:
openssl s_client -connect ram.rachum.com:443 -servername ram.rachum.com
...
0 ...CN=ram.rachum.com...
Maybe you need to upgrade your requests library and other modules too; see using requests with TLS doesn't give SNI support.
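If upgrading is not immediately possible, one commonly suggested route to SNI on an older Python 2.7 is urllib3's pyOpenSSL backend (a sketch, assuming pyOpenSSL, ndg-httpsclient and pyasn1 are installed and that your requests version still bundles urllib3 under requests.packages):

# pip install pyopenssl ndg-httpsclient pyasn1
import requests
from requests.packages.urllib3.contrib import pyopenssl

# Route urllib3's TLS handling through pyOpenSSL, which supports SNI
# even on Python versions whose ssl module does not.
pyopenssl.inject_into_urllib3()

response = requests.get('https://ram.rachum.com')
print(response.status_code)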
I'm trying to access a journal article hosted by an academic service provider (SP), using a Python script.
The server authenticates using a Shibboleth login. I read Logging into SAML/Shibboleth authenticated server using python and tried to implement a login with Python Requests.
The script starts by querying the SP for the link leading to my IDP institution, and is then supposed to authenticate automatically with the IDP. The first part works, but when following the link to the IDP it chokes on an SSL error.
Here is what I used:
import requests
import lxml.html
LOGINLINK = 'https://www.jsave.org/action/showLogin?redirectUri=%2F'
USERAGENT = 'Mozilla/5.0 (X11; Linux x86_64; rv:28.0) Gecko/20100101 Firefox/28.0'
s = requests.session()
s.headers.update({'User-Agent' : USERAGENT})
# getting the page where you can search for your IDP
# need to get the cookies so we can continue
response = s.get(LOGINLINK)
rtext = response.text
print('Don\'t see your school?' in rtext) # prints True
# POSTing the name of my institution
data = {
'institutionName' : 'tubingen',
'submitForm' : 'Search',
'currUrl' : '%2Faction%2FshowBasicSearch',
'redirectUri' : '%2F',
'activity' : 'isearch'
}
response = s.post(BASEURL + '/action/showLogin', data=data)
rtext = response.text
print('university of tubingen' in rtext) # prints True
# get the link that leads to the IDP
tree = lxml.html.fromstring(rtext)
loginlinks = tree.cssselect('a.extLogin')
if loginlinks:
    loginlink = loginlinks[0].get('href')
else:
    exit(1)
print('continuing to IDP')
response = s.get(loginlink)
rtext = response.text
print('zentrale Anmeldeseite' in rtext)
This yields:
continuing to IDP...
2014-04-04 10:04:06,010 - INFO - Starting new HTTPS connection (1): idp.uni-tuebingen.de
Traceback (most recent call last):
File "/usr/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 480, in urlopen
body=body, headers=headers)
File "/usr/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 285, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.4/http/client.py", line 1066, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python3.4/http/client.py", line 1104, in _send_request
self.endheaders(body)
File "/usr/lib/python3.4/http/client.py", line 1062, in endheaders
self._send_output(message_body)
File "/usr/lib/python3.4/http/client.py", line 907, in _send_output
self.send(msg)
File "/usr/lib/python3.4/http/client.py", line 842, in send
self.connect()
File "/usr/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 164, in connect
ssl_version=resolved_ssl_version)
File "/usr/lib/python3.4/site-packages/requests/packages/urllib3/util.py", line 639, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib/python3.4/ssl.py", line 344, in wrap_socket
_context=self)
File "/usr/lib/python3.4/ssl.py", line 540, in __init__
self.do_handshake()
File "/usr/lib/python3.4/ssl.py", line 767, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: TLSV1_ALERT_INTERNAL_ERROR] tlsv1 alert internal error (_ssl.c:598)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.4/site-packages/requests/adapters.py", line 330, in send
timeout=timeout
File "/usr/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 504, in urlopen
raise SSLError(e)
requests.packages.urllib3.exceptions.SSLError: [SSL: TLSV1_ALERT_INTERNAL_ERROR] tlsv1 alert internal error (_ssl.c:598)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "./try.py", line 154, in <module>
response = s.get(loginlink)
File "/usr/lib/python3.4/site-packages/requests/sessions.py", line 395, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python3.4/site-packages/requests/sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3.4/site-packages/requests/sessions.py", line 486, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3.4/site-packages/requests/adapters.py", line 385, in send
raise SSLError(e)
requests.exceptions.SSLError: [SSL: TLSV1_ALERT_INTERNAL_ERROR] tlsv1 alert internal error (_ssl.c:598)
Using s.get(loginlink, verify=False) yields exactly the same error. Simply using urllib.request.urlopen(loginlink) does so, too.
Printing and pasting the link into Firefox, on the other hand, works fine.
After trying with openssl s_client, it looks like the destination idp.uni-tuebingen.de:443 only supports SSLv3 and misbehaves on anything newer. Forcing SSLv3 gives:
$ openssl s_client -connect idp.uni-tuebingen.de:443 -ssl3
CONNECTED(00000003)
depth=3 C = DE, O = Deutsche Telekom AG, OU = T-TeleSec Trust Center, CN = Deutsche Telekom Root CA 2
...
But with the default setup, or when forcing TLSv1 (-tls1), it only returns an alert:
openssl s_client -connect idp.uni-tuebingen.de:443
CONNECTED(00000003)
140493591938752:error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error:s23_clnt.c:741:
So you need to find a way to force SSLv3 for this connection. I'm not that familiar with the Python side at this point, but maybe the chapter "Example: Specific SSL Version" at http://docs.python-requests.org/en/latest/user/advanced/ helps.
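For reference, the pattern from that chapter looks roughly like this (a sketch: it assumes the local OpenSSL build still ships SSLv3, which many current builds have removed, and that requests still exposes urllib3 under requests.packages):

import ssl
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager  # newer requests: urllib3.poolmanager

class Ssl3HttpAdapter(HTTPAdapter):
    """Transport adapter that forces SSLv3 for the hosts it is mounted on."""
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_SSLv3)

s = requests.Session()
s.mount('https://idp.uni-tuebingen.de', Ssl3HttpAdapter())
response = s.get(loginlink)  # loginlink from the script above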
And why it works with Firefox: browsers usually retry with a downgraded SSL version if connecting with the safer versions fails. That is, everybody works around broken stuff, so the owner of the broken stuff has no incentive to fix it :(