Python code to test Postgres connectivity and get response code

I am working on an AWS Lambda function to test URL connectivity and get the response code. I have used the code below:
import urllib.request
from urllib.error import URLError, HTTPError

def fn_getresponsecode(url):
    try:
        conn = urllib.request.urlopen(url)
        return conn.getcode()
    except HTTPError as e:
        return e.code
    except URLError as e:
        return e.reason

fn_getresponsecode("https://stackoverflow.com")
Now I want to test a Postgres connection from the Lambda function and check whether it connects. Is there a way to do that? Also, I do not have the password.
On Unix I have tested it using the command below:
pg_isready -h test.test1.region.rds.amazon.com -p port_number -U user_name
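No answer was posted for this part, but here is a rough standard-library sketch (the endpoint is the placeholder from the question; unlike pg_isready, this does not speak the Postgres wire protocol, it only checks TCP reachability):

```python
import socket

def fn_testpgconnectivity(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds.

    This only proves the port is reachable from the Lambda's network;
    it does not authenticate or speak the Postgres protocol.
    """
    try:
        with socket.create_connection((host, int(port)), timeout=timeout):
            return True
    except OSError:  # DNS failure, refused, timed out, unreachable
        return False

# placeholder endpoint mirroring the pg_isready example above
print(fn_testpgconnectivity("test.test1.region.rds.amazon.com", 5432))
```

If this returns False from Lambda even though pg_isready works from your Unix box, the usual suspects are the Lambda's VPC/subnet configuration and the RDS security group.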

Related

How Handle Expired SSL/TLS Certificate with Python Requests?

What's the correct way to handle an expired certificate with Python Requests?
I want the code to differentiate between a "connection error" and connection with an "expired TLS certificate".
import requests

def conn(URL):
    try:
        response = requests.get(URL)
    except requests.exceptions.RequestException:
        print(URL, "Cannot connect")
        return False
    print(URL, "connection successful")
    return True

# valid cert
conn("https://www.google.com")
# nonexistent domain
conn("https://unexistent-domain-example.com")
# expired cert
conn("https://expired-rsa-dv.ssl.com")
You can look at the exception details and check whether 'CERTIFICATE_VERIFY_FAILED' appears in them.
import requests

def conn(URL):
    try:
        response = requests.get(URL)
    except requests.exceptions.RequestException as e:
        if 'CERTIFICATE_VERIFY_FAILED' in str(e):
            print('CERTIFICATE_VERIFY_FAILED')
        print(URL, f"Cannot connect: {e}")
        print('--------------------------')
        return False
    print(URL, "connection successful")
    return True

# valid cert
conn("https://www.google.com")
# nonexistent domain
conn("https://unexistent-domain-example.com")
# expired cert
conn("https://expired-rsa-dv.ssl.com")
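A variant of the same idea (my own sketch, not from the answer above): instead of matching on the exception string, catch requests.exceptions.SSLError separately. It is raised for certificate problems and is a subclass of RequestException, so it must be listed first:

```python
import requests

def classify(url, timeout=10):
    """Return 'ok', 'ssl-error' or 'connection-error' for a URL."""
    try:
        requests.get(url, timeout=timeout)
        return "ok"
    except requests.exceptions.SSLError:          # certificate problems
        return "ssl-error"
    except requests.exceptions.RequestException:  # DNS failure, refused, timeout...
        return "connection-error"

print(classify("https://expired-rsa-dv.ssl.com"))
```

Note this still doesn't tell you *why* the certificate failed verification (expired vs. untrusted vs. hostname mismatch); for that the lower-level approach in the next answer is needed.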
requests is a great tool for making requests, but your task is to check the server certificate's expiration date, which requires a lower-level API. The algorithm is: retrieve the server certificate, parse it, and check its end date.
To get the certificate from the server there's ssl.get_server_certificate(). It returns the certificate in PEM encoding.
There are plenty of ways to parse a PEM-encoded certificate (check this question); I'd stick with the "undocumented" one.
To parse the time from a string you can use ssl.cert_time_to_seconds().
To parse the URL you can use urllib.parse.urlparse(). To get the current timestamp you can use time.time().
Code:
import ssl
from time import time
from urllib.parse import urlparse
from pathlib import Path

def conn(url):
    parsed_url = urlparse(url)
    cert = ssl.get_server_certificate((parsed_url.hostname, parsed_url.port or 443))
    # save cert to a temporary file (a filename is required by _test_decode_cert())
    temp_filename = Path(__file__).parent / "temp.crt"
    with open(temp_filename, "w") as f:
        f.write(cert)
    try:
        parsed_cert = ssl._ssl._test_decode_cert(temp_filename)
    except Exception:
        return
    finally:  # delete the temporary file
        temp_filename.unlink()
    return ssl.cert_time_to_seconds(parsed_cert["notAfter"]) > time()
It'll throw an exception on any connection error; if needed, you can handle that with a try..except around the get_server_certificate() call.
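To illustrate the final comparison in isolation (the date below is the example string from the ssl module docs): cert_time_to_seconds() converts the certificate's notAfter field to a Unix timestamp, which can then be compared with time.time():

```python
import ssl
from time import time

# notAfter strings look like "Jan  5 09:34:43 2018 GMT"
expires_at = ssl.cert_time_to_seconds("Jan  5 09:34:43 2018 GMT")

# the certificate is still valid while its end date lies in the future
print(expires_at > time())  # False here, since 2018 is in the past
```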

How to check if user is connected to internet in python?

I am using Python 3.5 with urllib3. I have seen various examples, but they were for urllib2, which is why I am asking. I have some code that needs internet access; if the user is not connected, I want to show a warning that the device is not connected to the internet. How can I do this?
You could do something like this where you check for connection to a site.
import urllib.request

try:
    x = urllib.request.urlopen('https://www.google.com')
except Exception as e:
    print(str(e))
I've not used urllib before, but you can try something like:
try:
    # you put your code here;
    # it will raise an exception if a
    # network connection is not available,
    # so you catch that
    ...
except Exception:
    # your warning code
    ...
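A self-contained sketch combining both suggestions (the probe target, a public DNS server on port 53, is my own choice, not from the answers): open a short TCP connection and treat any failure as "offline":

```python
import socket

def is_connected(host="8.8.8.8", port=53, timeout=3):
    """Try a quick TCP connection; False means no usable network."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not is_connected():
    print("Warning: device is not connected to the internet")
```

Unlike fetching a web page, this avoids HTTP overhead and redirects, and fails fast thanks to the timeout.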

Delete DNS 'A' Records from Domain Controller using Python

I have a DC with "example.com" and many DNS records/FQDNs with 'A' records (Windows servers).
Ex: Server1.example.com A 172.3.2.1
I'm using Python and trying to delete the record (server1).
Unfortunately, it's giving me a response of None.
I am using the dnspython library.
def DeleteDNSRecords(serverList):
    try:
        for server in serverList:
            updater = update.Update(server, 'A')
            response = updater.delete(server, 'A')
            print(str(response))
    except Exception as e:
        print(e)
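No answer was posted, but here is a hedged sketch of what the dnspython API expects (zone name, record name, and server IP are placeholders): Update() takes the zone, delete() only adds the deletion to the in-memory message and always returns None, which explains the output above. The message still has to be sent with dns.query:

```python
def delete_a_record(zone, name, dns_server_ip):
    """Delete an A record via an RFC 2136 dynamic update (needs dnspython)."""
    import dns.query
    import dns.update

    update = dns.update.Update(zone)   # the zone, e.g. "example.com"
    update.delete(name, "A")           # relative record name, e.g. "server1"
    # delete() only builds the message; it must be sent to the DNS server:
    return dns.query.tcp(update, dns_server_ip, timeout=10)

# placeholders mirroring the example record in the question:
# delete_a_record("example.com", "server1", "172.3.2.1")
```

Note that an AD-integrated zone configured for secure dynamic updates will refuse an unsigned update like this one; in that case the update must be authenticated (e.g. GSS-TSIG).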

Xmlrpc ServerProxy returns socket.gaierror

I'm trying to connect to a Magento API using XML-RPC.
When the URL is valid, I have no problem. But I'd like to catch errors when the URL is not valid. If I try with an invalid URL I get:
socket.gaierror: [Errno 8] nodename nor servname provided, or not known
I'm trying to catch it but I can't find a way to do it.
I'm using Python 3.5:
from xmlrpc.client import ServerProxy
from socket import gaierror

params = {
    "encoding": "utf-8",
    "verbose": False,
    "transport": SpecialTransport()  # I use a SpecialTransport class
}
try:
    client = ServerProxy("https://my.bad.url", **params)
except gaierror:
    print("Error")
The problem is that I never go through the except.
I don't understand what I'm doing wrong.
Thanks!
I'm answering my own question.
I've finally been able to make it work like this. The key point is that ServerProxy does not open a network connection when it is constructed, so the gaierror is only raised on the first actual call:
import sys
from socket import gaierror
from xmlrpc.client import Fault, ServerProxy

# Connect to the url
client = ServerProxy('https://my.bad.url', **params)
# Try to login to Magento to get a session
try:
    session = client.login('username', 'password')
except gaierror:
    # Error resolving / connecting to the url
    print('Connection error')
    sys.exit(2)
except Fault:
    # Error with the login
    print('Login error')
    sys.exit(2)
else:
    print('Success')
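A minimal standalone demonstration of why the constructor never raises (the .invalid hostname is a reserved, never-resolving name chosen for this example): ServerProxy only touches the network on the first method call:

```python
import socket
from xmlrpc.client import ServerProxy

# no exception here, even though the host can never resolve
client = ServerProxy("http://no-such-host.invalid")

try:
    client.system.listMethods()  # the first call triggers DNS resolution
except socket.gaierror as e:
    print("caught gaierror on the first call:", e)
```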

Python problems with FancyURLopener, 401, and "Connection: close"

I'm new to Python, so forgive me if I am missing something obvious.
I am using urllib.FancyURLopener to retrieve a web document. It works fine when authentication is disabled on the web server, but fails when authentication is enabled.
My guess is that I need to subclass urllib.FancyURLopener to override the get_user_passwd() and/or prompt_user_passwd() methods. So I did:
class my_opener(urllib.FancyURLopener):
    # Redefine
    def get_user_passwd(self, host, realm, clear_cache=0):
        print "get_user_passwd() called; host %s, realm %s" % (host, realm)
        return ('name', 'password')
Then I attempt to open the page:
try:
    opener = my_opener()
    f = opener.open('http://1.2.3.4/whatever.html')
    content = f.read()
    print "Got it: ", content
except IOError:
    print "Failed!"
I expect FancyURLopener to handle the 401, call my get_user_passwd(), and retry the request.
It does not; I get the IOError exception when I call "f = opener.open()".
Wireshark tells me that the request is sent, and that the server is sending a "401 Unauthorized" response with two headers of interest:
WWW-Authenticate: BASIC
Connection: close
The connection is then closed, I catch my exception, and it's all over.
It fails the same way even if I retry the "f = opener.open()" after IOError.
I have verified that my my_opener() class is working by overriding the http_error_401() method with a simple "print 'Got 401 error'". I have also tried to override the prompt_user_passwd() method, but that doesn't happen either.
I see no way to proactively specify the user name and password.
So how do I get urllib to retry the request?
Thanks.
I just tried your code on my webserver (nginx) and it works as expected:
1. GET from the urllib client
2. HTTP/1.1 401 Unauthorized from the server, with headers:
   Connection: close
   WWW-Authenticate: Basic realm="Restricted"
3. The client tries again with an Authorization header:
   Authorization: Basic <Base64encoded credentials>
4. The server responds with 200 OK + content
So I guess your code is right (I tried it with Python 2.7.1) and maybe the webserver you are trying to access is not working as expected. Here is the code, tested against the free HTTP basic auth test site browserspy.dk (they seem to use Apache; the code works as expected):
import urllib

class my_opener(urllib.FancyURLopener):
    # Redefine
    def get_user_passwd(self, host, realm, clear_cache=0):
        print "get_user_passwd() called; host %s, realm %s" % (host, realm)
        return ('test', 'test')

try:
    opener = my_opener()
    f = opener.open('http://browserspy.dk/password-ok.php')
    content = f.read()
    print "Got it: ", content
except IOError:
    print "Failed!"
