Testing the connection to a Postgres DB - Python

I would like to show a button on the GUI when the software can connect to a specific Postgres DB. I wrote a small test function: if it can connect to the DB it returns True, if not it returns False.
The code works, but there is an issue: if there is no connection (I just pull out the internet cable, nothing else changes), it simply takes too much time.
Could you help me make the code faster when there is no connection?
Here is my simple test function:
import psycopg2

def postgres_test():
    try:
        conn = psycopg2.connect("dbname='mydb' user='myuser' host='my_ip' password='mypassword'")
        conn.close()
        return True
    except psycopg2.OperationalError:
        return False

Thanks for the comments. And yes, it was timeout-related.
Here is my faster code:
import psycopg2

def postgres_test():
    try:
        conn = psycopg2.connect("dbname='mydb' user='myuser' host='my_ip' password='mypassword' connect_timeout=1")
        conn.close()
        return True
    except psycopg2.OperationalError:
        return False
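If you prefer keyword arguments over a DSN string, the same check can be written as below; a minimal sketch, where the credentials are the placeholders from the question and the function name and timeout_seconds parameter are my own:
import psycopg2

def postgres_test(timeout_seconds=1):
    # Same check in keyword-argument form; connect_timeout bounds how
    # long libpq waits before giving up on an unreachable host.
    try:
        conn = psycopg2.connect(dbname="mydb",
                                user="myuser",
                                host="my_ip",
                                password="mypassword",
                                connect_timeout=timeout_seconds)
        conn.close()
        return True
    except psycopg2.OperationalError:
        return False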

To test a Postgres connection with Python, first install this package:
pip install psycopg2-binary
and try this code:
import psycopg2

conn = psycopg2.connect(dbname="db_name",
                        user="user_name",
                        host="127.0.0.1",
                        password="******",
                        port="5432")
cursor = conn.cursor()
cursor.execute('SELECT * FROM information_schema.tables')
rows = cursor.fetchall()
for table in rows:
    print(table)
conn.close()
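If you only need to verify connectivity rather than inspect the schema, a lighter probe such as SELECT 1 avoids pulling the whole catalog. A minimal sketch reusing the placeholder credentials above (can_connect and the connect_timeout value are my additions):
import psycopg2

def can_connect():
    # "SELECT 1" is enough to prove the server answers queries, and it
    # avoids scanning the whole information_schema catalog.
    try:
        conn = psycopg2.connect(dbname="db_name",
                                user="user_name",
                                host="127.0.0.1",
                                password="******",
                                port="5432",
                                connect_timeout=3)
        with conn.cursor() as cursor:
            cursor.execute("SELECT 1")
            cursor.fetchone()
        conn.close()
        return True
    except psycopg2.OperationalError:
        return False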

One connection to DB for app, or a connection on every execution?

I'm using the psycopg2 library to connect to my PostgreSQL database.
Every time I want to execute a query, I make a new connection like this:
import psycopg2

def run_query(query):
    with psycopg2.connect("dbname=test user=postgres") as connection:
        cursor = connection.cursor()
        cursor.execute(query)
        cursor.close()
But I think it would be faster to make one connection for the whole app execution, like this:
import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")

def run_query(query):
    cursor = connection.cursor()
    cursor.execute(query)
    cursor.close()
So which is the better way to connect to my database for the whole execution time of my app?
I've tried both ways and both worked, but I want to know which is better and why.
You should strongly consider using a connection pool, as other answers have suggested. This will be less costly than creating a connection every time you query, and it can handle workloads that a single connection couldn't.
Create a file called something like mydb.py, and include the following:
import psycopg2
import psycopg2.pool
from contextlib import contextmanager

dbpool = psycopg2.pool.ThreadedConnectionPool(host=<<YourHost>>,
                                              port=<<YourPort>>,
                                              dbname=<<YourDB>>,
                                              user=<<YourUser>>,
                                              password=<<YourPassword>>)

@contextmanager
def db_cursor():
    conn = dbpool.getconn()
    try:
        with conn.cursor() as cur:
            yield cur
        conn.commit()
        """
        You can have multiple exception types here.
        For example, if you wanted to specifically check for the
        23503 "FOREIGN KEY VIOLATION" error type, you could do:

        except psycopg2.Error as e:
            conn.rollback()
            if e.pgcode == '23503':
                raise KeyError(e.diag.message_primary)
            else:
                raise Exception(e.pgcode)
        """
    except:
        conn.rollback()
        raise
    finally:
        dbpool.putconn(conn)
This will allow you to run queries like so:
import mydb

def myfunction():
    with mydb.db_cursor() as cur:
        cur.execute("""Select * from blahblahblah...""")
Both ways are bad. The first one is particularly bad, because opening a database connection is quite expensive. The second is bad, because you will end up with either a single connection (which is too few) or one connection per process or thread (which is usually too many).
Use a connection pool.
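For completeness, here is a single-threaded variant of the pooling idea using psycopg2's SimpleConnectionPool; a minimal sketch reusing the DSN from the question (the pool bounds of 1 and 5 are arbitrary):
import psycopg2.pool

# Pool of 1 to 5 connections; the DSN is the question's placeholder.
pool = psycopg2.pool.SimpleConnectionPool(1, 5, "dbname=test user=postgres")

def run_query(query):
    conn = pool.getconn()   # borrow a connection instead of opening one
    try:
        with conn.cursor() as cursor:
            cursor.execute(query)
        conn.commit()
    finally:
        pool.putconn(conn)  # return it to the pool, even on error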

Using unittest in python with docker and psycopg2

BRIEF DESCRIPTION OF THE PROBLEM: Psycopg2 won't connect to a test DB within docker from unittest, but connects fine from console.
ERROR MESSAGE:
psycopg2.OperationalError: server closed the connection unexpectedly
This probably means the server terminated abnormally before or while processing the request.
DETAILS:
I'm trying to set up a testing database in docker that will be created and populated before testing and then removed afterwards.
Here's the function that sets up the database:
import docker
from docker.errors import NotFound

def set_up_empty_test_db():
    client = docker.from_env()
    try:
        testdb = client.containers.get("TestDB")
        testdb.stop()
        testdb.remove()
        testdb = client.containers.run(
            "postgres",
            ports={5432: 5433},
            detach=True,
            name="TestDB",
            environment=["POSTGRES_PASSWORD=yourPassword"],
        )
    except NotFound:
        testdb = client.containers.run(
            "postgres",
            ports={5432: 5433},
            detach=True,
            name="TestDB",
            environment=["POSTGRES_PASSWORD=yourPassword"],
        )
    while testdb.status != "running":
        testdb = client.containers.get("TestDB")
    return
If I launch this function from the console, it works without an error and I can see the TestDB container running. I can successfully initiate a connection:
conn = psycopg2.connect("host='localhost' user='postgres' password='yourPassword' port='5433'")
But it doesn't work when unit testing. Here's the testing code:
class TestCreateCity(unittest.TestCase):
    def setUp(self):
        set_up_empty_test_db()
        con = psycopg2.connect("host='localhost' user='postgres' password='yourPassword' port='5433'")
        self.assertIsNone(con.isolation_level)
        cur = con.cursor()
        sql_file = open(os.path.join(ROOT_DIR + "/ddl/creates/schema_y.sql"), "r")
        cur.execute(sql_file.readline())
        con.commit()
        con.close()
        self.session = Session(creator=con)

    def test_create_city(self):
        cs.create_city("Test_CITY", "US")
        q = self.session.query(City).filter(City.city_name == "Test_CITY").one()
        self.assertIs(q)
        self.assertEqual(q.city_name, "Test_CITY")
        self.assertEqual(q.country_code, "US")
It breaks when trying to initiate the connection. Please advise.
I know this is an old question, but I needed to do the same thing today. You're trying to connect to the Postgres server too quickly after starting it - that's why it works from the console.
All you need to do is replace:
set_up_empty_test_db()
con = psycopg2.connect("host='localhost' user='postgres' password='yourPassword' port='5433'")
with:
set_up_empty_test_db()
con = None
while con is None:
    try:
        con = psycopg2.connect("host='localhost' user='postgres' password='yourPassword' port='5433'")
    except psycopg2.OperationalError:
        time.sleep(0.5)
Hope this helps someone else. Cheers!
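If the container can never become ready (bad image, port clash), the loop above spins forever. A bounded variant, as a sketch (connect_with_retry, attempts and delay are illustrative names, not from the answer):
import time
import psycopg2

def connect_with_retry(dsn, attempts=20, delay=0.5):
    # Bounded version of the loop above: the test fails fast with the
    # real error instead of hanging if the container never comes up.
    last_error = None
    for _ in range(attempts):
        try:
            return psycopg2.connect(dsn)
        except psycopg2.OperationalError as e:
            last_error = e
            time.sleep(delay)
    raise last_error

con = connect_with_retry("host='localhost' user='postgres' password='yourPassword' port='5433'")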

Using cursors with mysqldb and flask

It's more of a theoretical question, but I have been trying to find a correct answer for hours and I haven't arrived at a solution yet. I have a big Flask app and it contains multiple routes:
@app.route('/try')
@app.route('/new')
and many others. I am using MySQLdb for the database. Previously I had this at the start of the application:
import MySQLdb as mysql
db = mysql.connect('localhost', 'root', 'password', 'db')
cursor = db.cursor()
It works fine, but after a while it generates the error "local variable 'cursor' referenced before assignment". This may be because MySQL closes the connection after some time. So I put cursor = db.cursor() in every route function and close it after I have done the processing, like this:
db = mysql.connect('localhost', 'root', 'password', 'db')

@app.route('/')
def home():
    cursor = db.cursor()
    # ...some processing...
    cursor.close()
    return render_template('home.html')

@app.route('/new')
def home_new():
    cursor = db.cursor()
    # ...some processing...
    cursor.close()
    return render_template('homenew.html')
Now I want to ask: is this approach right? Should I define a cursor for each request and close it?
This is how I have my MySQLdb set up:
def requestConnection():
    "Create new connection. Return connection."
    convt = cv.conversions.copy()
    convt[3] = int
    conn = db.connect(host=c.SQL_HOST, port=c.SQL_PORT, user=c.SQL_USER,
                      passwd=c.SQL_PASSWD, db=c.SQL_DB, conv=convt,
                      use_unicode=True, charset="utf8")
    return conn

def requestCursor(conn):
    return conn.cursor(db.cursors.DictCursor)
Then, at the start of every SQL function, I do this:
def executeQuery(query):
    "Execute a given query. Used for debug purposes."
    conn = requestConnection()
    cur = requestCursor(conn)
    cur.execute(query)
    r = cur.fetchall()
    cur.close()
    conn.close()
    return r
I change the conversions because I had to convert int values in the DB from Float to int for my work, but you can skip this step. If you don't skip it, you need these imports:
import MySQLdb as db  # https://github.com/farcepest/MySQLdb1
import MySQLdb.converters as cv
Hope it helps!
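Another common approach with Flask specifically is to keep one connection per request on the application context, so every route gets a fresh cursor without repeating connection code. This is a sketch of the pattern from Flask's documentation, assuming a recent Flask; the connection details are the question's placeholders:
import MySQLdb as mysql
from flask import Flask, g

app = Flask(__name__)

def get_db():
    # Lazily open one connection per request and cache it on Flask's
    # application context.
    if "db" not in g:
        g.db = mysql.connect('localhost', 'root', 'password', 'db')
    return g.db

@app.teardown_appcontext
def close_db(exc):
    # Runs after every request, whether or not it raised.
    db = g.pop("db", None)
    if db is not None:
        db.close()

@app.route('/')
def home():
    cursor = get_db().cursor()
    # ...some processing...
    cursor.close()
    return "ok"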

Sqlite insert query not working with python?

I have been trying to insert data into the database using the following code in Python:
import sqlite3 as db

conn = db.connect('insertlinks.db')
cursor = conn.cursor()
db.autocommit(True)
a = "asd"
b = "adasd"
cursor.execute("Insert into links (link,id) values (?,?)", (a, b))
conn.close()
The code runs without any errors, but the database is never updated. I tried adding conn.commit(), but it gives an error saying the module was not found. Please help?
You do have to commit after inserting:
cursor.execute("Insert into links (link,id) values (?,?)",(a,b))
conn.commit()
or use the connection as a context manager:
with conn:
    cursor.execute("Insert into links (link,id) values (?,?)", (a, b))
or set autocommit correctly by setting the isolation_level keyword parameter of the connect() method to None:
conn = db.connect('insertlinks.db', isolation_level=None)
See Controlling Transactions.
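To confirm which variant actually persisted the row, you can reopen the database and select it back; a quick sketch using the table and file from the question:
import sqlite3 as db

conn = db.connect('insertlinks.db')
cursor = conn.cursor()
cursor.execute("Select link, id from links")
print(cursor.fetchall())   # an empty list means the insert was never committed
conn.close()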
It may be a bit late, but setting autocommit = True saved me a lot of time, especially if you have a script that runs bulk actions such as update/insert/delete...
Reference: https://docs.python.org/2/library/sqlite3.html#sqlite3.Connection.isolation_level
This is the pattern I usually use in my scripts:
import sqlite3

def get_connection():
    conn = sqlite3.connect('../db.sqlite3', isolation_level=None)
    cursor = conn.cursor()
    return conn, cursor

def get_jobs():
    conn, cursor = get_connection()
    if conn is None:
        raise DatabaseError("Could not get connection")
I hope it helps you!
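A context-managed variant of the same helper guarantees the connection is closed even if a query raises; a minimal sketch (db_cursor and some_table are illustrative names):
import sqlite3
from contextlib import contextmanager

@contextmanager
def db_cursor(path='../db.sqlite3'):
    # Same isolation_level=None (autocommit) setup as get_connection(),
    # but the connection is always closed when the block ends.
    conn = sqlite3.connect(path, isolation_level=None)
    try:
        yield conn.cursor()
    finally:
        conn.close()

# usage:
# with db_cursor() as cursor:
#     cursor.execute("SELECT * FROM some_table")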

About MySQLdb conn.autocommit(True)

I have installed Python 2.7 (64-bit) and MySQL-python-1.2.3.win-amd64-py2.7.exe.
I use the following code to insert data :
class postcon:
    def POST(self):
        conn = MySQLdb.connect(host="localhost", user="root", passwd="mysql", db="dang", charset="utf8")
        cursor = conn.cursor()
        n = cursor.execute("insert into d_message (mid,title,content,image) values(2,'xx','ccc','fff')")
        cursor.close()
        conn.close()
        if n:
            raise web.seeother('/')
This results in n being printed as 1, but the data isn't visible in the mysql client.
Google says I must add conn.autocommit(True),
but I don't know why MySQLdb turns it off.
By default, MySQLdb's autocommit is off. You can set autocommit to True in your MySQLdb connection like this:
conn = MySQLdb.connect(host="localhost", user="root", passwd="mysql", db="dang", charset="utf8")
conn.get_autocommit()  # will return False
conn.autocommit(True)
conn.get_autocommit()  # should return True now
cursor = conn.cursor()
I don't know if there's a specific reason to use autocommit with GAE (assuming you are using it). Otherwise, you can just manually commit.
class postcon:
    def POST(self):
        conn = MySQLdb.connect(host="localhost", user="root", passwd="mysql", db="dang", charset="utf8")
        cursor = conn.cursor()
        n = cursor.execute("insert into d_message (mid,title,content,image) values(2,'xx','ccc','fff')")
        conn.commit()  # This right here
        cursor.close()
        conn.close()
        if n:
            raise web.seeother('/')
Note that you should probably check whether the insert happened successfully and, if not, roll back the transaction.
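A sketch of that check-and-rollback, built around the insert from the question (MySQLdb.Error is the library's base error type):
import MySQLdb

conn = MySQLdb.connect(host="localhost", user="root", passwd="mysql",
                       db="dang", charset="utf8")
cursor = conn.cursor()
try:
    n = cursor.execute("insert into d_message (mid,title,content,image) "
                       "values(2,'xx','ccc','fff')")
    conn.commit()
except MySQLdb.Error:
    conn.rollback()   # undo the partial transaction on failure
    raise
finally:
    cursor.close()
    conn.close()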
Connector/Python Connection Arguments
Turning on autocommit can be done directly when you connect to a database:
import mysql.connector as db
conn = db.connect(host="localhost", user="root", passwd="pass", db="dbname", autocommit=True)
or
import mysql.connector
db = mysql.connector.connect(option_files='my.conf', autocommit=True)
Or call conn.commit() before calling close.
