import requests
data = {'foo':'bar'}
url = 'https://foo.com/bar'
r = requests.post(url, data=data)
If the URL uses a self-signed certificate, this fails with:
requests.exceptions.SSLError: [Errno 1] _ssl.c:507: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
I know that I can pass False to the verify parameter, like this:
r = requests.post(url, data=data, verify=False)
However, what I would like to do is point requests to a copy of the public key on disk and tell it to trust that certificate. Something like this:
r = requests.post(url, data=data, verify='/path/to/public_key.pem')
With the verify parameter you can provide a custom certificate authority bundle:
requests.get(url, verify=path_to_bundle_file)
From the docs:
You can pass verify the path to a CA_BUNDLE file with certificates of
trusted CAs. This list of trusted CAs can also be specified through
the REQUESTS_CA_BUNDLE environment variable.
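If you make many calls, the same thing can be set once at session level instead of per request; a minimal sketch (the bundle path is a placeholder):
import requests

s = requests.Session()
s.verify = '/path/to/ca_bundle.pem'  # every request on this session verifies against this bundle
r = s.get('https://foo.com/bar')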
The easiest approach is to export the REQUESTS_CA_BUNDLE variable pointing to your private certificate authority or a specific certificate bundle. On the command line you can do that as follows:
export REQUESTS_CA_BUNDLE=/path/to/your/certificate.pem
python script.py
If you have your own certificate authority and you don't want to type the export each time, you can add REQUESTS_CA_BUNDLE to your ~/.bash_profile as follows:
echo "export REQUESTS_CA_BUNDLE=/path/to/your/certificate.pem" >> ~/.bash_profile ; source ~/.bash_profile
A case where multiple certificates were needed was solved as follows:
Concatenate the root PEM files, myCert-A-Root.pem and myCert-B-Root.pem, into a single file, then set the REQUESTS_CA_BUNDLE variable to that file in ~/.bash_profile.
$ cp myCert-A-Root.pem ca_roots.pem
$ cat myCert-B-Root.pem >> ca_roots.pem
$ echo "export REQUESTS_CA_BUNDLE=~/PATH_TO/CA_CHAIN/ca_roots.pem" >> ~/.bash_profile ; source ~/.bash_profile
All of the answers to this question point to the same path: get the PEM file, but they don't tell you how to get it from the website itself.
Getting the PEM file from the website itself is a valid option if you trust the site, such as an internal corporate server. If you already trust the site, why bother? Because verifying against a pinned certificate still protects you and others from inadvertently reusing your code against a site that isn't safe.
Here is how you can get the PEM file.
Click on the lock icon next to the URL.
Navigate to where you can see the certificates, and open them.
Download the PEM CERT chain.
Put the .PEM file somewhere your script can access it and try verify=r"path\to\pem_chain.pem" within your requests call.
r = requests.get(url, verify=r'path\to\pem_chain.pem')
Setting export SSL_CERT_FILE=/path/file.crt should do the job.
If you're behind a corporate network firewall like I was, ask your network admin where your corporate certificates are, then:
import os

# Point both requests (REQUESTS_CA_BUNDLE) and the ssl module (SSL_CERT_FILE)
# at the corporate certificate bundle before making any requests.
os.environ["REQUESTS_CA_BUNDLE"] = 'path/to/corporate/cert.pem'
os.environ["SSL_CERT_FILE"] = 'path/to/corporate/cert.pem'
This fixed issues I had with requests and openssl.
In a dev environment, using Poetry as the virtual environment provider on a Mac with Python 3.8, I used this answer https://stackoverflow.com/a/42982144/15484549 as a base and appended the content of my self-signed root certificate to the certifi cacert.pem file.
The steps in detail:
cd project_folder
poetry add requests
# or if you use something else, make sure certifi is among the dependencies
poetry shell
python
>>> import certifi
>>> certifi.where()
/path/to/the/certifi/cacert.pem
>>> exit()
cat /path/to/self-signed-root-cert.pem >> /path/to/the/certifi/cacert.pem
python the_script_you_want_to_run.py
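The cat step can also be done from Python, which avoids hunting for the path by hand; a sketch (note that upgrading certifi overwrites its bundle, so the append must be repeated after upgrades):
import certifi

# Append the self-signed root certificate to certifi's CA bundle.
with open('/path/to/self-signed-root-cert.pem') as src, open(certifi.where(), 'a') as bundle:
    bundle.write('\n' + src.read())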
I know it is an old thread, but I ran into this issue recently. My Python requests code did not accept the self-signed certificate, while curl did. It turns out requests is very strict about self-signed certificates: it needs to be a root CA certificate. In other words:
Basic Constraints: CA:TRUE
Key Usage: Digital Signature, Non Repudiation, Key Encipherment, Certificate Sign
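To check whether your certificate carries these constraints, you can inspect it with a recent version of the cryptography package (my own suggestion; the thread itself does not mention it):
from cryptography import x509
from cryptography.x509.oid import ExtensionOID

# Load the PEM certificate and print whether it is flagged as a CA.
# Raises x509.ExtensionNotFound if the certificate has no Basic Constraints at all.
with open('/path/to/self-signed-root-cert.pem', 'rb') as f:
    cert = x509.load_pem_x509_certificate(f.read())
bc = cert.extensions.get_extension_for_oid(ExtensionOID.BASIC_CONSTRAINTS).value
print('CA:TRUE' if bc.ca else 'CA:FALSE')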
In case anyone happens to land here (like I did) looking to add a CA (in my case Charles Proxy) for httplib2, it looks like you can append it to the cacerts.txt file included with the Python package.
For example:
cat ~/Desktop/charles-ssl-proxying-certificate.pem >> /usr/local/google-cloud-sdk/lib/third_party/httplib2/cacerts.txt
The environment variables referenced in other solutions appear to be requests-specific and were not picked up by httplib2 in my testing.
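If you would rather not modify the bundled file, httplib2.Http also accepts a ca_certs argument pointing at a bundle of your own; a minimal sketch (the path is a placeholder):
import httplib2

# Verify against a custom bundle instead of the bundled cacerts.txt.
h = httplib2.Http(ca_certs='/path/to/custom_ca_bundle.pem')
response, content = h.request('https://example.com')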
You may try merging the environment settings (this is what picks up REQUESTS_CA_BUNDLE) into a prepared request:
settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
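For context, this is part of the prepared-request flow from the requests advanced docs, where environment settings are merged in explicitly before sending; a sketch along those lines:
import requests

s = requests.Session()
req = requests.Request('GET', 'https://foo.com/bar')
prepped = s.prepare_request(req)

# Merge environment settings (proxies, REQUESTS_CA_BUNDLE, etc.) into the send kwargs.
settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
response = s.send(prepped, **settings)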
You can read more here: http://docs.python-requests.org/en/master/user/advanced/
Related
I have a .p7b file from which I need to extract the base64 server certificate. I have an openssl command that does this in my terminal, but the problem is that I have a whole folder full of .p7b certs from which I need to extract the server certs.
Here is my openssl command:
openssl pkcs7 -inform DER -outform PEM -in p7bfile.p7b -print_certs > base64_server_cert.cer
I did some googling and found a post saying to use the call function from the subprocess module to run this in Python, but when I try that it runs with no errors and there is no output in the folder, so I'm not sure whether I'm using the Python syntax incorrectly or this is the wrong module to use. If I try printing the assigned "decrypted" variable I just get 1, which is the command's return code (non-zero, so openssl is failing).
Example of my python code:
from subprocess import call
def main():
decrypted = call(['openssl', 'pkcs7', '-inform', 'DER', '-outform', 'PEM', '-in', 'p7bfile.p7b', '-print_certs', '>', 'base64_server_cert.cer'])
if __name__ == '__main__':
main()
Is there a better way to run an OpenSSL command like this to extract the server cert? I don't care about extracting the CA or intermediate certs in the file; just the server cert matters to me.
As noted above, I attempted to run the code using call from the subprocess module, formatting the openssl command that I can successfully run from my terminal, but it doesn't appear to take any action when run in Python (no cert is output like when I run the command locally).
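The likely culprit is the '>' redirection: subprocess.call does not invoke a shell, so '>' and the output filename are passed to openssl as literal arguments, and openssl exits with status 1 (the value you saw). A sketch of one way around it, capturing stdout and writing the file from Python (folder name is a placeholder):
import subprocess
from pathlib import Path

# Extract the certificates from every .p7b file in a folder.
for p7b in Path('p7b_files').glob('*.p7b'):
    result = subprocess.run(
        ['openssl', 'pkcs7', '-inform', 'DER', '-outform', 'PEM',
         '-in', str(p7b), '-print_certs'],
        capture_output=True, text=True, check=True)
    # Write the PEM output where the shell redirection would have put it.
    p7b.with_suffix('.cer').write_text(result.stdout)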
I'm trying to add a certificate into a Dockerfile, needed for Python requests package:
FROM python:3.9-slim-buster
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV PYTHONPATH="$PYTHONPATH:/app"
WORKDIR /app
COPY ./app .
COPY ./certs/*.crt /usr/local/share/ca-certificates/
RUN update-ca-certificates
RUN pip3 install requests
CMD ["python3", "main.py"]
With the above Dockerfile, I get the following error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain
Based on my tests, that is because requests is using certifi and is looking only inside /usr/local/lib/python3.9/site-packages/certifi/cacert.pem. If I add my certificates inside cacert.pem, everything works as expected and the errors are gone.
What is the pythonic way to deal with this issue? Ideally, I would prefer to drop certificates into a directory instead of modifying a file. Is there a way to "force" Python requests to look inside /etc/ssl/certs for certificates, as well as into the certifi cacert.pem file? If I list the /etc/ssl/certs directory contents, it contains my .pem certificates.
Running apt-get update will not update ca-certificates; I'm already using the latest version. When I execute update-ca-certificates, the new certificates are detected:
STEP 10/11: RUN update-ca-certificates
Updating certificates in /etc/ssl/certs...
2 added, 0 removed; done.
Thank you for your help.
The only reasonable solution I found is:
from json import dumps

from requests import post
from requests.exceptions import HTTPError, RequestException, SSLError

try:
    result = post(url=url, data=dumps(data), headers=headers, verify='/etc/ssl/certs')
except (HTTPError, RequestException, SSLError) as e:
    raise
Setting verify='/etc/ssl/certs' makes requests use that directory, where update-ca-certificates placed the self-signed certificates.
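Alternatively (my own suggestion, not part of the answer above), you can point requests at the combined bundle that update-ca-certificates regenerates on Debian-based images, either with ENV REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt in the Dockerfile or from Python:
import os

# On Debian-based images, update-ca-certificates writes the combined bundle here,
# including any .crt files copied into /usr/local/share/ca-certificates/.
os.environ['REQUESTS_CA_BUNDLE'] = '/etc/ssl/certs/ca-certificates.crt'

import requests
r = requests.get('https://internal.example.com')  # hypothetical internal URL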
I have been following the steps below to install Jupyter in AWS EC2:
https://chrisalbon.com/aws/basics/run_project_jupyter_on_amazon_ec2/
I gave 8888 as port.
I then launched jupyter notebook.
Then I went to my instance URL:
https://ec2-XX-XX-XX-XXX.eu-west-3.compute.amazonaws.com:8888/
I have a public IP so I also tried https://XX-XX-XX-XXX:8888/
But it does not load anything either way.
I made sure that 8888 port is authorized in security groups on my EC2 instance.
Any idea how I can dig into where the issue is?
[EDIT 1]:
I followed these steps:
c = get_config()
# Kernel config
c.IPKernelApp.pylab = 'inline' # if you want plotting support always in your notebook
# Notebook config
c.NotebookApp.certfile = u'/home/ec2-user/Notebooks/certs/Mycert_file.pem'  # location of your certificate file
c.NotebookApp.ip = '*'
c.NotebookApp.open_browser = False  # so that the notebook does not open a browser by default
c.NotebookApp.password = u'sha1:XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'  # the hashed password we generated above
# Set the port to 8888, the port we set up in the AWS EC2 set-up
c.NotebookApp.port = 8888
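For reference, the hashed password referenced in the config can be generated like this (a sketch; the exact import has moved between notebook/IPython versions):
from notebook.auth import passwd

# Prompts for a password and prints the 'sha1:...' hash to paste into the config.
print(passwd())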
[EDIT 2]:
Previously to these steps I did this:
sudo openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout Mycert_file.pem -out Mycert_file.pem
Now, to locate my .pem file I did the following: find /home -name "*.pem"
I found the location of my .pem file which is /home/ec2-user/Notebooks/certs/Mycert_file.pem
[EDIT 3]:
I will also add that I am already running an RStudio session on this instance on port 8787. I assume this is not impacting my attempt to install Jupyter, but I wanted to point it out just in case.
So I found the issue, regarding the config file. The tutorial said to press the Esc key to save the config file, but that was not saving the file for me, so I simply used :wq! and it saved.
But I still can't make it work.
So, as advised, I used jupyter notebook --debug
Here are the logs:
I'm using Windows 10 OS.
I want to count the number of AWS IP addresses.
I use Python 2.7.14 and boto 2.6.0.
I added a file named boto.config in the C:\Users\Administrator folder.
The content of the boto.config is:
[Credentials]
aws_access_key_id=******
aws_secret_access_key=*****
The script is:
#!/usr/bin/env python
# -*- encoding: utf8 -*-
import boto.ec2
from pprint import pprint

conn = boto.ec2.connect_to_region('cn-north-1')
reservations = conn.get_all_instances()
InstanceMap = []
for reservation in reservations:
    for instance in reservation.instances:
        # Collect the IPs of instances tagged env=test.
        if 'env' in instance.tags and instance.tags['env'] == 'test':
            InstanceMap.append(instance.ip_address)

# Use a raw string so the backslash is not treated as an escape.
f = open(r'F:\ip.txt', 'w')
pprint(InstanceMap, f)
When I run this script, it shows the following error:
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)
How can I solve this problem?
I was having the same issue with boto3 and Python 3.7 on a Windows 10 machine. As it turned out, since I was using a corporate device with a proxy installed, the *.amazonaws.com certificate was getting replaced by the proxy certificate. This proxy certificate chain needed to be trusted by the Python certifi module. Whether or not you have a proxy, the method below should resolve the SSL: CERTIFICATE_VERIFY_FAILED error.
Here is what I did, to resolve the issue -
Find the path where cacert.pem is located -
Install certifi if you don't have it: pip install certifi
import certifi
certifi.where()
C:\\Users\\[UserID]\\AppData\\Local\\Programs\\Python\\Python37-32\\lib\\site-packages\\certifi\\cacert.pem
Set the AWS_CA_BUNDLE environment variable to the cacert.pem path:
AWS_CA_BUNDLE=C:\Users\[UserID]\AppData\Local\Programs\Python\Python37-32\Lib\site-packages\certifi\cacert.pem
Download the chain of certificates from the amazonaws.com URL. For example: go to https://sts.amazonaws.com/xyz in a browser and export the root, all the intermediate certificates, and the domain cert, saving each as a base64-encoded .cer file. Open the certificates in Notepad and copy all the contents.
Now open cacert.pem in Notepad and add the contents of every downloaded certificate (-----BEGIN CERTIFICATE----- ... -----END CERTIFICATE-----) at the end.
Restart the command prompt or PowerShell; the SSL verification error should be resolved.
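If you prefer not to rely on a machine-level environment variable, the same thing can be done per process; a boto3 sketch (region and API call are placeholders):
import os
import certifi

# Point the AWS SDK at certifi's bundle, which now includes the proxy chain.
os.environ['AWS_CA_BUNDLE'] = certifi.where()

import boto3
ec2 = boto3.client('ec2', region_name='us-east-1')
print(ec2.describe_regions()['Regions'][0])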
Do not use is_secure = False in your organization's environments; it essentially disables SSL verification.
Try adding is_secure=False as below, in order to skip SSL verification:
conn = boto.ec2.connect_to_region('cn-north-1',is_secure=False)
Try providing the credentials directly, as below. If this works, you know the keys in your boto config are stale; if it returns the same issue, you need to check your API key and secret on AWS.
API_KEY = 'Actual API_KEY'
API_SECRET = 'Actual Secret'
conn = boto.ec2.connect_to_region('us-east-2',
                                  aws_access_key_id=API_KEY,
                                  aws_secret_access_key=API_SECRET,
                                  is_secure=False)