SSL error CERTIFICATE_VERIFY_FAILED with Locust when using Docker - python

It's my first try at Locust, and unfortunately I don't know Python.
I'm trying a simple request to a valid https server, and I see this error:
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate
After some research I tried to add this:
import gevent
import geventhttpclient.connectionpool
geventhttpclient.connectionpool.SSLConnectionPool.default_options = {
    "cert_reqs": gevent.ssl.CERT_NONE,
}
or this:
import requests
requests.packages.urllib3.disable_warnings() # disable SSL warnings
I run Locust as instructed:
docker-compose up --scale worker=4
How can I test https sites with Locust?
Thanks in advance
Regards

You can turn verification off by adding the method below:
def on_start(self):
    """on_start is called when a simulated user starts, before any task is scheduled"""
    self.client.verify = False
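For context, on_start belongs on your user class. Here is a minimal locustfile sketch; the class name, host, and endpoint are illustrative, and the import is guarded so the snippet also loads where locust isn't installed:

```python
try:
    from locust import HttpUser, task
except ImportError:  # fallback so the sketch can be read without locust installed
    HttpUser, task = object, (lambda f: f)

class InsecureUser(HttpUser):
    # illustrative host; point it at the https server under test
    host = "https://self-signed.example.com"

    def on_start(self):
        """Called once per simulated user, before any task is scheduled."""
        # self.client is a requests session; skip certificate verification
        self.client.verify = False

    @task
    def index(self):
        self.client.get("/")
```

Since `self.client` is a requests session, setting `verify = False` here applies to every request the user makes afterwards.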

While connecting to a server with a self-signed certificate I had a similar problem. I successfully disabled the certificate verification using the following code (from Locust):
import gevent
from geventhttpclient.url import URL
from geventhttpclient import HTTPClient
def insecure_ssl_context_factory():
    context = gevent.ssl.create_default_context()
    context.check_hostname = False
    context.verify_mode = gevent.ssl.CERT_NONE
    return context
url = URL(server)
http = HTTPClient.from_url(url, insecure=True, ssl_context_factory=insecure_ssl_context_factory)

Related

Python requests.get "unable to get local issuer certificate" - but self-signed cert is installed (Windows)

I’m writing a Python script that will monitor our Tesla PowerWall Gateway, but am stuck on this SSL problem.
HTTPSConnectionPool(host='powerwall', port=443): Max retries exceeded with url: /api/system_status/soe (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1108)')))
import json
import os
import requests
import sys
from requests.auth import HTTPDigestAuth

if __name__ == "__main__":
    scriptPath = os.path.split(os.path.abspath(__file__))[0]  # Where am I, and the local copy of the cert
    #scriptPath = os.path.split(requests.certs.where(),)[0]   # Where is requests looking for certs?
    cert = os.path.join(scriptPath, 'PW2.pem')
    #os.environ['REQUESTS_CA_BUNDLE'] = cert
    #os.environ['REQUESTS_CA_BUNDLE'] = scriptPath
    try:
        response = None
        query = "https://powerwall/api/system_status/soe"
        with requests.Session() as session:
            session.auth = HTTPDigestAuth('myEmail', 'PW_PWD')
            session.timeout = 20
            session.verify = True
            #session.verify = cert
            #session.load_cert_chain = "PW2.pem"
            #session.load_cert_chain = cert
            response = session.get(query)
    except Exception as e:
        print(str(e))
Despite all I’ve tried I still can’t get past this error. Yes, setting verify=False is an obvious work-around, but I’m trying to do this the ‘right’ way.
Setup:
Windows 10 PC
Python 3.8.2
I’ve downloaded the certificate from the Gateway and added it to the Local Machine store on my PC, in the Trusted Root Certification Authorities folder.
Windows can open it OK, showing the various SANs, including “powerwall”, which is how I’m addressing it in my call to requests.get. That says to me the integrity of the cert is good. (Its 'intended purposes' are Server Authentication & Client Authentication.)
I’ve installed python-certifi-win32, then later uninstalled it and installed pip-system-certs as per this SO answer to no avail.
I’ve added the PW’s cert to cacert.pem in the folder returned by requests.certs.where(): C:\Python38\lib\site-packages\certifi\cacert.pem
The commented-out code are variations I’ve performed along the way.
The documentation for 'requests' mentions this issue: "For example: Self-signed SSL certificates specified in REQUESTS_CA_BUNDLE will not be taken into account." and a way around it, but that wasn't successful either.
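For reference, the usual alternative to editing certifi's cacert.pem in site-packages is to extend a trust store of your own: at the ssl level that means a default context plus load_verify_locations() for the device's PEM (requests accepts the same idea as a bundle-file path via session.verify). A minimal sketch, with the PEM path illustrative:

```python
import ssl

# Start from the platform's default trust settings; such a context
# verifies peers and checks hostnames out of the box.
ctx = ssl.create_default_context()

# Additionally trust the device certificate (illustrative path):
# ctx.load_verify_locations(r"C:\path\to\PW2.pem")

print(ctx.verify_mode == ssl.CERT_REQUIRED)
print(ctx.check_hostname)
```

The advantage over appending to certifi's bundled cacert.pem is that the extra trust anchor survives certifi upgrades.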
What have I missed?
Please don’t tell me it’s the 2047 expiry date of the cert…
TIA.

SSL Error CERTIFICATE_VERIFY_FAILED with requests BUT NOT with urllib.request

If I try to use requests.get() to connect to an HTTPS server (a Jenkins), I get the SSL error CERTIFICATE_VERIFY_FAILED: certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
HTTPS connections work fine if I use curl or any browser.
The HTTPS server is an internal server but uses an SSL cert from DigiCert. It is a wildcard certificate, and the same certificate is used for a lot of other servers (like IIS servers) in my company, which work fine together with requests.
If I use the urllib package, the HTTPS connection is also fine.
I don't understand why requests doesn't work, and I'm asking what I can do to make requests work.
And no! verify=False is not the solution ;-)
For the SSLContext in the second function I had to call load_default_certs().
My system: Windows 10, Python 3.10, requests 2.28.1, urllib3 1.26.10, certifi 2022.6.15. Packages were installed today.
url = 'https://redmercury.acme.org/'

def use_requests(url):
    import requests
    try:
        r = requests.get(url)
        print(r)
    except Exception as e:
        print(e)

def use_magic_code_from_stackoverflow(url):
    import urllib.request
    import ssl
    ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # ssl_context.verify_mode = ssl.CERT_REQUIRED
    # ssl_context.check_hostname = True
    ssl_context.load_default_certs()  # WITHOUT this I got SSL error(s)
    # previous context
    https_handler = urllib.request.HTTPSHandler(context=ssl_context)
    opener = urllib.request.build_opener(https_handler)
    ret = opener.open(url, timeout=2)
    print(ret.status)

def use_urllib_requests(url):
    import urllib.request
    with urllib.request.urlopen(url) as response:
        print(response.status)

use_requests(url)                       # SSL error
use_magic_code_from_stackoverflow(url)  # server answers with 200
use_urllib_requests(url)                # server answers with 200
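The load_default_certs() call is the heart of the difference: requests verifies against the certifi bundle it ships with, while a context that calls load_default_certs() also pulls in the OS trust store (on Windows, the system certificate store, which presumably held the issuer that certifi's bundle could not supply here). The distinction can be seen with the stdlib alone:

```python
import ssl

# A bare client context starts with no trust anchors at all:
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
print(len(ctx.get_ca_certs()))  # 0

# load_default_certs() pulls in the platform's CA store (on Windows this
# includes the system certificate store, not just a bundled PEM file):
ctx.load_default_certs()
print(len(ctx.get_ca_certs()))  # typically > 0 on a configured system
```

This is also why urllib worked out of the box: urllib's default HTTPS context is built by ssl.create_default_context(), which loads the platform store, whereas requests deliberately pins itself to certifi.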

asyncpg error: "no pg_hba.conf entry for host" in Heroku

I'm using asyncpg in Python to connect to my Heroku PostgreSQL database:
import asyncpg
async def create_db_pool():
    bot.pg_con = await asyncpg.create_pool(dsn="postgres://....", host="....amazonaws.com", user="xxx", database="yyy", port="5432", password="12345")
It was working perfectly until I received an email from Heroku advising me of maintenance: Maintenance (DATABASE_URL on myappname) is starting now. We will update you when it has completed.
then this error appeared:
asyncpg.exceptions.InvalidAuthorizationSpecificationError: no pg_hba.conf entry for host "123.456.789.10", user "xxx", database "yyy", SSL off
I tried to follow some suggestions, like setting ssl=True,
but this error appeared:
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate (_ssl.c:1108)
and with ssl="allow", this error:
asyncpg.exceptions.InvalidPasswordError: password authentication failed for user "xxx"
what can I do to fix this?
Using the solution from this worked.
import ssl
ssl_object = ssl.create_default_context()
ssl_object.check_hostname = False
ssl_object.verify_mode = ssl.CERT_NONE
# connect elsewhere
pool = await asyncpg.create_pool(uri, ssl=ssl_object)
Note: you don't need to supply any certificate, as mentioned in the comment, since we set verify_mode so that certificates are not checked.

Bypass SSL when I'm using SUDS for consume web service

I'm using SUDS to consume a web service. I tried the following:
client = Client(wsdl_url)
list_of_methods = [method for method in client.wsdl.services[0].ports[0].methods]
print(list_of_methods)
I got this error:
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)>
I saw a link, but it is only a solution for Python 2.7. How can I bypass SSL verification with SUDS? Or is there any non-Python solution (for example, adding a fake certificate in Windows)? I'm using Python 3, so I have to use urllib instead of urllib2.
A suds client uses a subclass of suds.transport.Transport to process requests.
The default transport used is an instance of suds.transport.https.HttpAuthenticated, but you can override this when you instantiate the client by passing a transport keyword argument.
The http and https transports are implemented using urllib.request (or urllib2 for Python 2) by creating an urlopener. The list of handlers used to create this urlopener is retrieved by calling the u2handlers() method on the transport class. This means you can create your own transport by subclassing the default and overriding that method to use an HTTPSHandler with a specific ssl context, e.g.:
from suds.client import Client
from suds.transport.https import HttpAuthenticated
from urllib.request import HTTPSHandler
import ssl
class CustomTransport(HttpAuthenticated):
    def u2handlers(self):
        # use handlers from superclass
        handlers = HttpAuthenticated.u2handlers(self)
        # create custom ssl context, e.g.:
        ctx = ssl.create_default_context(cafile="/path/to/ca-bundle.pem")
        # configure context as needed...
        ctx.check_hostname = False
        # add an https handler using the custom context
        handlers.append(HTTPSHandler(context=ctx))
        return handlers
# instantiate client using this transport
c = Client("https://example.org/service?wsdl", transport=CustomTransport())
This code worked for me:
from suds.client import Client
import ssl
if hasattr(ssl, '_create_unverified_context'):
    ssl._create_default_https_context = ssl._create_unverified_context
cli = Client('https://your_lik_to?wsdl')
print(cli)
You can add the code below before instantiating your suds client:
import ssl
try:
    _create_unverified_https_context = ssl._create_unverified_context
except AttributeError:
    pass
else:
    ssl._create_default_https_context = _create_unverified_https_context
See my own website for details: https://lucasmarques.me/bypass-ssl
This is what I came up with that seems to work well:
from suds.transport.https import HttpAuthenticated
from urllib.request import HTTPSHandler
import ssl

class MyTransport(HttpAuthenticated):
    def u2handlers(self):
        """
        Get a collection of urllib handlers.
        @return: A list of handlers to be installed in the opener.
        @rtype: [Handler,...]
        """
        handlers = []
        context = ssl._create_unverified_context()
        handlers.append(HTTPSHandler(context=context))
        return handlers
Cheers!
You can use https://pypi.python.org/pypi/suds_requests to leverage the requests library for the transport. This gives you the ability to disable the ssl verification.
Or try my new soap library, it supports it out of the box: http://docs.python-zeep.org/en/latest/#transport-options
I use this:
import ssl
from unittest import mock

with mock.patch('ssl._create_default_https_context', ssl._create_unverified_context):
    client = Client(url)
See: https://bitbucket.org/jurko/suds/issues/78/allow-bypassing-ssl-certificate#comment-39029255
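This works because, when no explicit context is given, the stdlib's HTTPS machinery (which suds' default transport reaches via urllib.request) obtains its context by calling ssl._create_default_https_context(); inside the with-block that name temporarily points at the unverified factory, and mock.patch restores it afterwards. A self-contained sketch of the effect:

```python
import ssl
from unittest import mock

with mock.patch('ssl._create_default_https_context', ssl._create_unverified_context):
    # Anything asking for the default HTTPS context inside the block
    # gets one that skips certificate checks:
    patched_ctx = ssl._create_default_https_context()

print(patched_ctx.verify_mode == ssl.CERT_NONE)   # True
print(patched_ctx.check_hostname)                  # False
```

The upside over assigning to ssl._create_default_https_context directly is that verification is only disabled for the duration of the block.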

HTTPS request in twisted that checks the certificate

In my twisted app I want to make an asynchronous request to Akismet to check for spam. Akismet reasonably uses HTTPS, so I've been following the web client guide on SSL in the docs. But there's this part that worries me:
Here’s an example which shows how to use Agent to request an HTTPS URL with no certificate verification.
I very much want certificate verification to prevent Man-In-The-Middle attacks. So how do I add it?
My test code without verification is this:
from twisted.internet import reactor
from twisted.web.client import Agent
from twisted.internet.ssl import ClientContextFactory

class WebClientContextFactory(ClientContextFactory):
    def getContext(self, hostname, port):
        print("getting context for {}:{}".format(hostname, port))
        # FIXME: no attempt to verify certificates!
        return ClientContextFactory.getContext(self)

agent = Agent(reactor, WebClientContextFactory())

def success(response):
    print("connected!")

def failure(failure):
    print("failure: {}".format(failure))

def stop(ignored):
    reactor.stop()

# www.pcwebshop.co.uk uses a self-signed cert
agent.request("GET", "https://www.pcwebshop.co.uk/")\
    .addCallbacks(success, failure)\
    .addBoth(stop)

reactor.run()
I'd like it to fail due to inability to verify the cert.
I'm using Twisted 15.1.0.
Actually, Agent's default initializer passes in BrowserLikePolicyForHTTPS as the contextFactory, which has the ability to verify the server certificate.
Simply using this:
agent = Agent( reactor )
will produce the following error:
failure: [Failure instance: Traceback (failure with no frames):
<class 'twisted.web._newclient.ResponseNeverReceived'>:
[<twisted.python.failure.Failure <class 'OpenSSL.SSL.Error'>>]]
Make sure you have installed the service_identity package using pip.
If you need custom cert verification, you can create a custom policy by passing the pem in, as described here:
from twisted.internet.ssl import Certificate
from twisted.python.filepath import FilePath
from twisted.web.client import BrowserLikePolicyForHTTPS

customPolicy = BrowserLikePolicyForHTTPS(
    Certificate.loadPEM(FilePath("your-trust-root.pem").getContent())
)
agent = Agent(reactor, customPolicy)
Thanks for pointing this out. This seems to be a bug in the documentation. Prior to version 14.0, it was accurate; Twisted would not validate HTTPS, and that was a big problem. However, as you can see in the release notes for that version, Twisted (at least in versions 14.0 and greater) does validate TLS on HTTPS connections made with Agent. (It still does not do so for getPage, the old, bad, HTTP client; do not use getPage.)
I have filed this bug to track fixing the documentation to be accurate.
