HTTPS with Python 2.7 CGIHTTPServer

I already have a web service running using the Python 2.7 CGIHTTPServer. It's great. It's lightweight. However, I now require it to work with HTTPS and a certificate that I have. There are very few instructions on how to do this; in fact, I found only one article covering Python 2.7.
My question is simple and very narrow. Given the instructions below, how do I launch it? I already have a Python script that is transaction based: you call it, it processes your request. It needs SSL.
https://blog.farville.com/15-line-python-https-cgi-server
This expects a directory structure:
/ssl_server.py
/localhost.pem
/html/index.html   # HTML lives here, aka the "root directory"
/html/cgi/         # Python scripts live here
Self-signed SSL cert made with openssl like this:
openssl req -x509 -sha256 -newkey rsa:2048 -keyout localhost.pem \
-out localhost.pem -days 3650 -nodes
ssl_server.py:
#!/usr/bin/env python
import os, sys
import BaseHTTPServer
import CGIHTTPServer
import cgitb; cgitb.enable() ## This line enables CGI error reporting
import ssl
server = BaseHTTPServer.HTTPServer
handler = CGIHTTPServer.CGIHTTPRequestHandler
server_address = ("", 8443)
handler.cgi_directories = ["/cgi"]
os.chdir("html")
srvobj = server(server_address, handler)
srvobj.socket = ssl.wrap_socket(srvobj.socket, certfile="../localhost.pem",
                                server_side=True)
# Force the use of a subprocess rather than the normal fork behavior,
# since forking does not work with SSL-wrapped sockets.
handler.have_fork = False
srvobj.serve_forever()
So what now? Again, keep in mind that I already have another Python script that successfully processes web requests.

I added a keyfile argument next to certfile and executed python ssl_server.py; it worked:
srvobj.socket = ssl.wrap_socket(srvobj.socket, certfile="/etc/letsencrypt/live/myowndomain.com/cert.pem", keyfile="/etc/letsencrypt/live/myowndomain.com/privkey.pem", server_side=True)
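As for hooking in an existing script: CGIHTTPServer serves anything under html/cgi/ as a CGI program, so drop the script there, make it executable, and have it print CGI headers before its output. A minimal Python 2 sketch (the file name hello.py is hypothetical):
#!/usr/bin/env python
# html/cgi/hello.py -- minimal CGI handler sketch (hypothetical file name)
import cgi

form = cgi.FieldStorage()            # parses GET and POST parameters
print "Content-Type: text/plain"     # CGI response header
print                                # blank line ends the headers
print "Hello, %s" % form.getvalue("name", "world")
Then chmod +x html/cgi/hello.py and request https://localhost:8443/cgi/hello.py?name=you.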

Related

How to extract a base64 server cert from a p7b file in python

I have a p7b file I need to extract the base64 server certificate from. I have an openssl command I can do this on in my terminal, but the problem is I have a whole folder full of p7b certs I need to extract the server certs from.
Here is my openSSL command:
openssl pkcs7 -inform DER -outform PEM -in p7bfile.p7b -print_certs > base64_server_cert.cer
I did some googling and found a post saying to use the call command to run this in Python, but when I try that it runs with no errors and yet there's no output file in the folder, so I'm not sure if I'm using the Python syntax incorrectly or this is the wrong module to use. If I print the assigned "decrypted" variable I just get 1 (I'm assuming the output is only 0 or 1).
Example of my python code:
from subprocess import call

def main():
    decrypted = call(['openssl', 'pkcs7', '-inform', 'DER', '-outform', 'PEM',
                      '-in', 'p7bfile.p7b', '-print_certs', '>',
                      'base64_server_cert.cer'])

if __name__ == '__main__':
    main()
Is there a better way to run an openssl command like this to extract the server cert? I don't care about extracting the CA or intermediate certs in the file; just the server cert is all that matters to me.
As described above, I attempted to run the code using call from the subprocess module, formatting the openssl command that I can successfully run from my terminal, but it doesn't appear to take any action when run from Python (no cert is output like when I run the command locally).
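For what it's worth, the likely culprit is the '>': subprocess does not invoke a shell, so the redirection operator is passed to openssl as a literal argument (hence the exit code 1 and no output file). A minimal sketch that opens the output file in Python instead:
from subprocess import call

def main():
    # Open the output file ourselves and hand it to subprocess as stdout;
    # '>' only works when a shell interprets the command line.
    with open('base64_server_cert.cer', 'w') as out:
        rc = call(['openssl', 'pkcs7', '-inform', 'DER', '-outform', 'PEM',
                   '-in', 'p7bfile.p7b', '-print_certs'], stdout=out)
    print(rc)  # 0 on success

if __name__ == '__main__':
    main()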

When I use Python boto to connect to AWS EC2, it shows SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)

I'm using Windows 10.
I want to count the number of IP addresses of my AWS instances.
I use Python 2.7.14 and boto 2.6.0.
I added a file named boto.config in the C:\Users\Administrator folder.
The content of the boto.config is:
[Credentials]
aws_access_key_id=******
aws_secret_access_key=*****
The script is:
#!/usr/bin/env python
# -*- encoding: utf8 -*-
import boto.ec2
from pprint import pprint

conn = boto.ec2.connect_to_region('cn-north-1')
reservations = conn.get_all_instances()
InstanceMap = []
for reservation in reservations:
    for instance in reservation.instances:
        # Collect the IPs of instances tagged env=test
        if 'env' in instance.tags and instance.tags['env'] == 'test':
            InstanceMap.append(instance.ip_address)
f = open(r'F:\ip.txt', 'w')
pprint(InstanceMap, f)
When I run this script, it shows this error:
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)
How can I solve this problem?
I was having the same issue with boto3 and Python 3.7 on a Windows 10 machine. As it turned out, since I was using a corporate device with a proxy installed, the *.amazonaws.com certificate was getting replaced by the proxy's certificate. This proxy certificate chain needed to be trusted by Python's certifi module. Whether or not you have a proxy, the method below should resolve the SSL: CERTIFICATE_VERIFY_FAILED error.
Here is what I did, to resolve the issue -
Find the path where cacert.pem is located -
Install certifi if you don't have it. Command: pip install certifi
import certifi
certifi.where()
C:\\Users\\[UserID]\\AppData\\Local\\Programs\\Python\\Python37-32\\lib\\site-packages\\certifi\\cacert.pem
Set AWS_CA_BUNDLE environment variable to the cacert.pem path -
AWS_CA_BUNDLE=C:\Users\[UserID]\AppData\Local\Programs\Python\Python37-32\Lib\site-packages\certifi\cacert.pem
Download the chain of certificates from the amazonaws.com URL. For example: go to https://sts.amazonaws.com/xyz in a browser and export the root, all the intermediate certificates, and the domain cert, saving each as a base64-encoded .cer file. Open the certificates in Notepad and copy all the contents.
Now open cacert.pem in Notepad and simply append the contents of every downloaded certificate (---BEGIN CERTIFICATE--- *** ---END CERTIFICATE---) at the end.
Restart the command prompt or PowerShell; the SSL verification error should be resolved.
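To sanity-check the result, here is a quick sketch (Python 3) that handshakes against the endpoint using the updated bundle; sts.amazonaws.com is just an example host:
import socket
import ssl

import certifi

# Build a context from certifi's bundle (the file we just appended to)
# and attempt a TLS handshake; it raises an SSL error if verification fails.
ctx = ssl.create_default_context(cafile=certifi.where())
with socket.create_connection(('sts.amazonaws.com', 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname='sts.amazonaws.com') as tls:
        print('verified OK,', tls.version())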
Do not use is_secure = False in your organization's environments. It essentially disables SSL verification.
Try adding is_secure=False as below, in order to skip SSL verification:
conn = boto.ec2.connect_to_region('cn-north-1',is_secure=False)
Try providing the credentials directly, as below. That way, if it works, you'll know the keys in your boto config are stale; if it returns the same error, then you need to check your API key and secret on AWS.
API_KEY = 'Actual API_KEY'
API_SECRET = 'Actual Secret'
conn = boto.ec2.connect_to_region('us-east-2',aws_access_key_id=API_KEY,aws_secret_access_key=API_SECRET,is_secure=False)

How to figure out http and ssl version using python

Is there a good way in Python to check whether a website/URL supports HTTP/2, and which SSL/TLS versions it supports?
What I am looking for is achievable by the use of commands
openssl s_client -connect domain:443 -nextprotoneg ''
The output of this openssl command contains the line Protocols advertised by server: h2, spdy/3.1, http/1.1, from which I can figure out HTTP/2 support.
curl -v -o /dev/null --silent https://www.example.com
The line TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 in the output of the above command tells me about the SSL version used.
I don't want to run these commands and parse the output because I feel there should be a better way to do it.
You can use the hyper lib.
from hyper import HTTPConnection

conn = HTTPConnection('google.com:443')
# If the request was made over HTTP/1.1, the result will be None.
if conn.request('HEAD', '/') is not None:
    # Returns a string identifying the protocol version used by the current SSL channel.
    print(conn._sock._sck.version())  # a little bit dirty, but it works
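If you'd rather avoid a third-party dependency, the stdlib ssl module (Python 3.5+) can answer both questions via ALPN, the successor to the NPN extension the openssl command above uses. A sketch, with www.google.com as an example host:
import socket
import ssl

host = 'www.google.com'
ctx = ssl.create_default_context()
ctx.set_alpn_protocols(['h2', 'http/1.1'])     # offer HTTP/2 and HTTP/1.1
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.version())                   # e.g. 'TLSv1.3'
        print(tls.selected_alpn_protocol())    # 'h2' if the server speaks HTTP/2
This reports the negotiated TLS version and whether the server selected h2; enumerating every TLS version the server accepts would take one handshake per version.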

How to get Python requests to trust a self signed SSL certificate?

import requests
data = {'foo':'bar'}
url = 'https://foo.com/bar'
r = requests.post(url, data=data)
If the URL uses a self-signed certificate, this fails with
requests.exceptions.SSLError: [Errno 1] _ssl.c:507: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
I know that I can pass False to the verify parameter, like this:
r = requests.post(url, data=data, verify=False)
However, what I would like to do is point requests to a copy of the public key on disk and tell it to trust that certificate.
Something like:
r = requests.post(url, data=data, verify='/path/to/public_key.pem')
With the verify parameter you can provide a custom certificate authority bundle
requests.get(url, verify=path_to_bundle_file)
From the docs:
You can pass verify the path to a CA_BUNDLE file with certificates of
trusted CAs. This list of trusted CAs can also be specified through
the REQUESTS_CA_BUNDLE environment variable.
The easiest is to export the variable REQUESTS_CA_BUNDLE that points to your private certificate authority, or a specific certificate bundle. On the command line you can do that as follows:
export REQUESTS_CA_BUNDLE=/path/to/your/certificate.pem
python script.py
If you have your own certificate authority and you don't want to type the export each time, you can add REQUESTS_CA_BUNDLE to your ~/.bash_profile as follows:
echo "export REQUESTS_CA_BUNDLE=/path/to/your/certificate.pem" >> ~/.bash_profile ; source ~/.bash_profile
Case where multiple certificates are needed was solved as follows:
Concatenate the multiple root pem files, myCert-A-Root.pem and myCert-B-Root.pem, into one file. Then set the requests REQUESTS_CA_BUNDLE var to that file in my ~/.bash_profile.
$ cp myCert-A-Root.pem ca_roots.pem
$ cat myCert-B-Root.pem >> ca_roots.pem
$ echo "export REQUESTS_CA_BUNDLE=~/PATH_TO/CA_CHAIN/ca_roots.pem" >> ~/.bash_profile ; source ~/.bash_profile
All of the answers to this question point to the same path: get the PEM file, but they don't tell you how to get it from the website itself.
Getting the PEM file from the website itself is a valid option if you trust the site, such as an internal corporate server. And if you trust the site, why bother doing this rather than disabling verification? Because it helps protect you and others from inadvertently re-using your code against a site that isn't safe.
Here is how you can get the PEM file.
Click on the lock next to the url.
Navigate to where you can see the certificates and open the certificates.
Download the PEM CERT chain.
Put the .PEM file somewhere your script can access it and try verify=r"path\to\pem_chain.pem" within your requests call.
r = requests.get(url, verify=r'\path\to\public_key.pem')
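If you'd rather fetch the PEM programmatically than click through the browser UI, the stdlib can grab the leaf certificate; a sketch (the host name is a placeholder, and note this returns only the server cert, not intermediates, so it suffices for a self-signed cert but not a full chain):
import ssl

# Fetch the server's (leaf) certificate as PEM text.
pem = ssl.get_server_certificate(('self-signed.example.com', 443))
with open('server_cert.pem', 'w') as f:
    f.write(pem)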
Setting export SSL_CERT_FILE=/path/file.crt should do the job.
If you're behind a corporate network firewall like I was, ask your network admin where your corporate certificates are, then:
import os
os.environ["REQUESTS_CA_BUNDLE"] = 'path/to/corporate/cert.pem'
os.environ["SSL_CERT_FILE"] = 'path/to/corporate/cert.pem'
This fixed issues I had with requests and openssl.
In a dev environment, using Poetry as the virtual env provider on a Mac with Python 3.8, I used this answer https://stackoverflow.com/a/42982144/15484549 as a base and appended the content of my self-signed root certificate to the certifi cacert.pem file.
The steps in detail:
cd project_folder
poetry add requests
# or if you use something else, make sure certifi is among the dependencies
poetry shell
python
>>> import certifi
>>> certifi.where()
/path/to/the/certifi/cacert.pem
>>> exit()
cat /path/to/self-signed-root-cert.pem >> /path/to/the/certifi/cacert.pem
python the_script_you_want_to_run.py
I know it is an old thread; however, I ran into this issue recently. My Python requests code would not accept the self-signed certificate, but curl would. It turns out requests is very strict about self-signed certificates: the certificate needs to be a root CA certificate. In other words:
Basic Constraints: CA:TRUE
Key Usage: Digital Signature, Non Repudiation, Key Encipherment, Certificate Sign
In case anyone happens to land here (like I did) looking to add a CA (in my case Charles Proxy) for httplib2, it looks like you can append it to the cacerts.txt file included with the Python package.
For example:
cat ~/Desktop/charles-ssl-proxying-certificate.pem >> /usr/local/google-cloud-sdk/lib/third_party/httplib2/cacerts.txt
The environment variables referenced in other solutions appear to be requests-specific and were not picked up by httplib2 in my testing.
You may try:
settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
You can read more here: http://docs.python-requests.org/en/master/user/advanced/
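For context, that call belongs to the prepared-request flow; a sketch of how the pieces fit together (the URL and bundle path are placeholders):
import requests

s = requests.Session()
s.verify = '/path/to/ca_bundle.pem'            # session-level trust root
req = requests.Request('POST', 'https://foo.com/bar', data={'foo': 'bar'})
prepped = s.prepare_request(req)

# Merge session settings with environment ones (REQUESTS_CA_BUNDLE, proxies, ...).
settings = s.merge_environment_settings(prepped.url, {}, None, None, None)
r = s.send(prepped, **settings)
print(r.status_code)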

How to make a rest call request from python using client certificate (in windows)

I'm trying to make a RESTful call in Python (2.7) to a Tomcat server, and it must be done using SSL with a client certificate.
The following line is how the call to tomcat is done:
result = requests.get(url, headers=headers, verify=settings.SLA_CA_SERVER_CERTIFICATE, cert=(settings.SLA_CLIENT_CERTIFICATE_PUBLIC, settings.SLA_CLIENT_CERTIFICATE_PRIVATE), **kwargs)
I got the following error:
[Errno 336265225] _ssl.c:355: error:140B0009:SSL routines:SSL_CTX_use_PrivateKey_file:PEM lib ()
I've tried using .pem files as well as .key and .crt files in the cert parameter and had no luck. The private key doesn't have a password. Any clue why I'm getting this error?
Thank you very much
I was creating the public certificate and private key from a .p12 file using openssl on Windows.
I created them with openssl on Linux (Ubuntu) instead, and it worked.
Just for reference, the commands used to create the keys were:
openssl pkcs12 -in path.p12 -out newfile.crt.pem -clcerts -nokeys
openssl pkcs12 -in path.p12 -out newfile.key.pem -nocerts -nodes
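With the two .pem files those commands produce, the requests call then takes the pair via cert; a sketch (the URL and paths are placeholders):
import requests

# verify: CA bundle used to validate the server's certificate;
# cert:   (client certificate, unencrypted private key) pair.
result = requests.get(
    'https://tomcat.example.com/rest/endpoint',
    verify='/path/to/ca_server_certificate.pem',
    cert=('/path/to/newfile.crt.pem', '/path/to/newfile.key.pem'),
)
print(result.status_code)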
