I am passing data to Elasticsearch (ES) through a Python script. First, I secured ES with a self-signed certificate and everything worked as expected. Then I switched to a more trusted certificate (Let's Encrypt). Note that I can reach my ES cluster without any problems: the Let's Encrypt cert is trusted by my browser and by an application that talks to ES. But when I try to pass data from my Python script to ES with the new certificate, I get the following error:
urllib3.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",)
I would have expected this error with a self-signed cert, but not with Let's Encrypt. The only way I can avoid it is by setting verify=False, which is not a long-term solution.
Before I received the error message mentioned above, I got the following error:
elasticsearch.exceptions.SSLError: ConnectionError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)) caused by: SSLError([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777))
I found a workaround for this by doing pip install requests. However, afterwards I receive the first error I mentioned (bad handshake). I know this means the certificate is not trusted. But how can that be, if verification works for a self-signed cert but not for a Let's Encrypt cert that is trusted by a browser and an app? For example, if I call ES at https://my-IP:9200, my browser shows no warning, whereas it did show one with the self-signed cert.
Some additional info:
python3
urllib3 1.25.7
certifi 2019.9.11
Ubuntu 18.04
So basically everything is up to date. I also tried the suggested solution of downgrading certifi and/or urllib3, but it doesn't work. One suggestion is to downgrade urllib3 below version 1.25 (but, as I said, that doesn't work in my case).
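A minimal diagnostic sketch along these lines (the host and port are placeholders, and using certifi's bundle directly is an assumption about how urllib3 is configured here) can show which CA bundle is consulted and whether the handshake validates against it:

import socket
import ssl

import certifi

host, port = "my-domain.example", 9200  # placeholders for the real ES endpoint

print("certifi CA bundle:", certifi.where())

# Build a context from the same bundle requests/urllib3 would use and try the handshake.
context = ssl.create_default_context(cafile=certifi.where())
with socket.create_connection((host, port)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("negotiated:", tls.version())
        print("peer subject:", dict(pair[0] for pair in tls.getpeercert()["subject"]))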
Any ideas?
You need to install the certificate first, and then use it in the connection to Elasticsearch from Python:
import json
from elasticsearch import Elasticsearch, exceptions

host = 'mydomain.com:9200'
client = Elasticsearch(host, http_auth=('admin', 'pass'), scheme="https", use_ssl=True,
                       ca_certs='C:/my_path/CertificateFile.cer.pem', port=443)

try:
    info = json.dumps(client.info(), indent=4)
    print("Elasticsearch client info():", info)
except exceptions.ConnectionError as err:
    print("\nElasticsearch info() ERROR:", err)
    print("\nThe client host:", host, "is invalid or cluster is not running")
    client = None
Response:
> Elasticsearch client info(): {
> "name": "my_name",
> "cluster_name": "my_cluster_name",
> "cluster_uuid": "fBRShbkSRy2vcfQJZsojGA",
> "version": {
> "number": "7.3.0",
> "build_flavor": "default",
> "build_type": "tar",
> "build_hash": "de777fa",
> "build_date": "2019-07-24T18:30:11.767338Z",
> "build_snapshot": false,
> "lucene_version": "8.1.0",
> "minimum_wire_compatibility_version": "6.8.0",
> "minimum_index_compatibility_version": "6.0.0-beta1"
> },
> "tagline": "You Know, for Search" }
elasticsearch.yml:
xpack.security.enabled: true
xpack.ml.enabled: false
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: elastic-certificates.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: elastic-certificates.p12
xpack.security.http.ssl.truststore.path: elastic-certificates.p12
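As a side note, the Python client's ca_certs parameter expects a PEM file containing the CA that signed the HTTP certificate. If that CA only lives inside the PKCS#12 keystore referenced above, a sketch like the following (assuming the cryptography package is available and the keystore has no password; both are assumptions) can export it:

from cryptography.hazmat.primitives.serialization import Encoding, pkcs12

# Read the keystore referenced by elasticsearch.yml and dump its certificates as PEM.
with open("elastic-certificates.p12", "rb") as f:
    key, cert, extra_certs = pkcs12.load_key_and_certificates(f.read(), password=None)

with open("elastic-ca.pem", "wb") as out:
    for c in [cert] + list(extra_certs or []):
        out.write(c.public_bytes(Encoding.PEM))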
Related
I'm writing a Python script that will monitor our Tesla PowerWall Gateway, but am stuck on this SSL problem:
HTTPSConnectionPool(host='powerwall', port=443): Max retries exceeded with url: /api/system_status/soe (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)')))
import json
import os
import sys

import requests
from requests.auth import HTTPDigestAuth

if __name__ == "__main__":
    scriptPath = os.path.split(os.path.abspath(__file__))[0]   # Where am I, and the local copy of the cert
    #scriptPath = os.path.split(requests.certs.where(),)[0]    # Where is requests looking for certs?
    cert = os.path.join(scriptPath, 'PW2.pem')
    #os.environ['REQUESTS_CA_BUNDLE'] = cert
    #os.environ['REQUESTS_CA_BUNDLE'] = scriptPath

    try:
        response = None
        query = "https://powerwall/api/system_status/soe"
        with requests.Session() as session:
            session.auth = HTTPDigestAuth('myEmail', 'PW_PWD')
            session.timeout = 20
            session.verify = True
            #session.verify = cert
            #session.load_cert_chain = "PW2.pem"
            #session.load_cert_chain = cert
            response = session.get(query)
    except Exception as e:
        print(str(e))
Despite all I've tried, I still can't get past this error. Yes, setting verify=False is an obvious workaround, but I'm trying to do this the 'right' way.
Setup:
Windows 10 PC
Python 3.8.2
I’ve downloaded the certificate from the Gateway and added it to the Local Machine store on my PC, in the Trusted Root Certification Authorities folder.
Windows can open it OK, showing the various SANs, including “powerwall”, which is how I’m addressing it in my call to requests.get. That says to me the integrity of the cert is good. (Its 'intended purposes' are Server Authentication & Client Authentication.)
I’ve installed python-certifi-win32, then later uninstalled it and installed pip-system-certs as per this SO answer to no avail.
I’ve added the PW’s cert to cacert.pem in the folder returned by requests.certs.where(): C:\Python38\lib\site-packages\certifi\cacert.pem
The commented-out lines are variations I've tried along the way.
The requests documentation mentions this issue ("For example: Self-signed SSL certificates specified in REQUESTS_CA_BUNDLE will not be taken into account.") and a way around it, but that wasn't successful either.
What have I missed?
Please don’t tell me it’s the 2047 expiry date of the cert…
TIA.
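One avenue not shown in the attempts above (a sketch only: it assumes PW2.pem holds the gateway's certificate, and the SSLContextAdapter class name is made up for illustration) is to mount a requests transport adapter that uses an explicit SSLContext built from that file:

import os
import ssl

import requests
from requests.adapters import HTTPAdapter

class SSLContextAdapter(HTTPAdapter):
    """Transport adapter that hands a pre-built SSLContext to urllib3."""
    def __init__(self, ssl_context=None, **kwargs):
        self._ssl_context = ssl_context
        super().__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        if self._ssl_context is not None:
            kwargs["ssl_context"] = self._ssl_context
        return super().init_poolmanager(*args, **kwargs)

cert = os.path.join(os.path.dirname(os.path.abspath(__file__)), "PW2.pem")
context = ssl.create_default_context(cafile=cert)  # trust the gateway's own certificate

with requests.Session() as session:
    session.mount("https://powerwall", SSLContextAdapter(ssl_context=context))
    response = session.get("https://powerwall/api/system_status/soe", timeout=20)
    print(response.status_code)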
I'm trying to send a GET request to a host with (supposedly) correct certificates.
It's a university task, and they gave me these certificates (which are only valid for 30 seconds).
But the code below gives me the error that certificate verify failed: self signed certificate
The packet I got back from the host says Fatal Error: Unknown CA.
What could cause the issue? Thanks!
import http.client
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
context.load_cert_chain('clientcert.pem', keyfile='clientkey.pem')

connection = http.client.HTTPSConnection(IP)  # IP holds the host address
connection.request("GET", "/")
response = connection.getresponse()
print("response:", response)
The error message seems to be self-explanatory. Self-signed SSL certificates always cause security warnings/errors. You will either need to add your self-signed certificate as an exception or add the self-signed CA to the OS trusted certificate store.
You may also try something equivalent to the --insecure option in curl.
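For example, a minimal sketch of the first option (assuming the CA that signed the server's certificate was provided as ca.pem; that filename, and the host being addressed by raw IP, are assumptions):

import http.client
import ssl

# Trust the provided CA for server verification and present the client certificate.
context = ssl.create_default_context(cafile='ca.pem')
context.load_cert_chain('clientcert.pem', keyfile='clientkey.pem')
context.check_hostname = False  # assumption: the host is addressed by raw IP, not a name in the cert

connection = http.client.HTTPSConnection(IP, context=context)  # IP as in the snippet above
connection.request("GET", "/")
print(connection.getresponse().status)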
I use python-requests to talk to HTTPS web services, some of which present incomplete certificate X509 chains. I'm having trouble figuring out how to access the invalid/incomplete certificates in order to explain the error to the user.
Here's an example, illustrated by https://ssllabs.com/ssltest, where the server sends only the leaf certificate and not the intermediate certificate, which is necessary for validation but missing from certifi's root CA store.
When I try to connect with python-requests, I get an exception that isn't very useful:
requests.get('https://web.service.com/path')
SSLError: HTTPSConnectionPool(host='web.service.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
Obviously, I can use separate tools to figure out what's wrong in any particular case (e.g. gnutls-cli, openssl s_client, SSLLabs, etc.).
However, what I really want to be able to do is to be able to catch and diagnose the problem with the certificate chain in my Python code, so that I can present a more specific error message to the user. This answer suggests a monkey-patch to the response object; it's not particularly elegant, but it works—though only when the response object is returned successfully, and not in the case of an exception.
What are the cleanest ways to instrument requests to save the peer's certificate chain in the exception object returned when requests fails to validate the certificate chain itself?
Take requests.get("https://151.101.1.69") (Stack Overflow's IP) as an example:

import requests

try:
    requests.get("https://151.101.1.69")
except requests.exceptions.SSLError as e:
    cert = e.args[0].reason.args[0]._peer_cert

Then cert is a dict containing the peer's certificate. As I'm not very familiar with SSL, I don't know if it is enough for your case.
BTW, in this case the error is "hostname '151.101.1.69' doesn't match either of '*.stackexchange.com', ..." (omitted). I'm not sure about the structure of the exception in your real case, so you may need to find it on your own. I think it should have the same attribute name, _peer_cert.
Update:
The above method doesn't work when the handshake fails... but it can still be done:
import ssl

import OpenSSL
import requests

try:
    requests.get("https://fpslinux1.finalphasesystems.com/")
except requests.exceptions.SSLError:
    cert = ssl.get_server_certificate(('fpslinux1.finalphasesystems.com', 443))
    cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, cert)
    print(cert.get_issuer())
    print(cert.get_subject().get_components())
Yes, it is a little dirty, but I don't have a better method, as an SSL socket doesn't even return invalid certs at the C level :/
To use OpenSSL, you need to install pyopenssl.
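Wrapping the idea above in a small helper (a sketch only, built from the calls already shown) lets you catch the SSLError and report the server's certificate details alongside the original exception:

import ssl

import OpenSSL
import requests

def get_with_cert_diagnostics(url, host, port=443):
    """Fetch url; on a certificate failure, report the server's leaf certificate."""
    try:
        return requests.get(url)
    except requests.exceptions.SSLError as err:
        # Fetch the offending certificate out-of-band and attach its details.
        pem = ssl.get_server_certificate((host, port))
        cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, pem)
        raise RuntimeError(
            "Certificate verification failed for %s: issuer=%s, subject=%s"
            % (url, cert.get_issuer(), cert.get_subject())
        ) from err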
I'm trying to solve the problem
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:579)
when I connect to a handle server.
I also used
ssl._create_default_https_context = ssl._create_unverified_context
as some user suggested, but I'm not able to fix the issue.
Any other solution?
Thanks
Does your server have a valid certificate, signed by a Certification Authority?
If it uses a self-signed certificate, I would suggest that you save a copy of the public certificate in your Python project and pass the certificate path in the verify parameter of requests.
You can save the certificate by accessing the server in Firefox, clicking the lock icon next to the address bar, selecting the certificate, then More Details, then View Certificate, then exporting it.
You will get a .pem file, let's say: "my_server_certificate.pem".
Then when you create your Session object on requests you can pass the parameter:
session = requests.Session()
session.verify = "my_server_certificate.pem"
I had similar problems when using Charles Proxy with my Python scripts. I hope this helps you solve your problem as well.
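For example, any request made through that session then verifies against the exported certificate (the URL below is a placeholder, not the actual handle server):

import requests

session = requests.Session()
session.verify = "my_server_certificate.pem"   # exported from Firefox as described above

# Placeholder endpoint; substitute the real handle server URL.
response = session.get("https://my-handle-server.example/api/handles/123")
print(response.status_code)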
I have a python script that uses the VirusTotal API. It has been working with no problems, but all of a sudden when I run the script I am getting the following error:
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)>
I believe it may be our web proxy that is causing the issue. Is there a way to prevent it from verifying the cert? Here is the portion of the code that uses the API:
import urllib
import urllib2

json_out = []
url = "https://www.virustotal.com/vtapi/v2/file/report"
parameters = {"resource": my_list,
              "apikey": "<MY API KEY>"}
data = urllib.urlencode(parameters)
req = urllib2.Request(url, data)
response = urllib2.urlopen(req)
json_out.append(response.read())
I believe it may be our web proxy that is causing the issue. Is there a way to prevent it from verifying the cert?
If you assume that an SSL-intercepting proxy is denying the connection, then you have to fix the problem at the proxy; i.e., there is no way to instruct the proxy from your application not to check the certificate.
If instead you assume that there is an SSL-intercepting proxy and thus the certificate you receive is not signed by a CA you trust, then you should get the proxy's CA and trust it in your application (see the cafile parameter in the documentation). Disabling validation is almost never the right way; instead, fix it so that validation works.
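A sketch of that approach applied to the code above (Python 2.7.9+; the proxy-ca.pem filename is an assumption for wherever the proxy's CA certificate has been exported):

import urllib
import urllib2

url = "https://www.virustotal.com/vtapi/v2/file/report"
parameters = {"resource": my_list, "apikey": "<MY API KEY>"}
data = urllib.urlencode(parameters)
req = urllib2.Request(url, data)

# Trust the intercepting proxy's CA explicitly instead of disabling verification.
response = urllib2.urlopen(req, cafile="proxy-ca.pem")
print(response.read())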
There are two possibilities:
You are using a self-signed certificate. Browsers do not trust such certificates, so make sure you are using a CA-signed, trusted certificate.
If you are using a CA-signed, trusted certificate, check that the CA chain certificates (root and intermediate) are installed.
You can refer to this article; it may help you: https://access.redhat.com/articles/2039753