The following cx_Oracle code works fine when the database is up:
#!C:\Python27
import cx_Oracle

try:
    conn = cx_Oracle.connect("scott/tiger#oracle")
    try:
        curs = conn.cursor()
        curs.execute("SELECT dummy FROM sys.dual")
        print curs.fetchone()[0]
    finally:
        curs.close()
finally:
    conn.close()
But if the database happens to be down when I run this script, a NameError is raised:
Traceback (most recent call last):
File "C:\Users\ArtMetzer\Documents\Code\Python\db_conn_test.py", line 14, in <module>
conn.close()
NameError: name 'conn' is not defined
This makes sense to me: cx_Oracle wasn't able to instantiate a connection, so the variable conn never got set, and hence has no close() method.
In Python, what's the best way to ensure your database connection closes, while still gracefully handling the condition of a down database?
Doing something like the following seems like a massive kludge to me:
finally:
    try:
        conn.close()
    except NameError:
        pass
You can try initializing conn to None beforehand and testing for that in the finally block. This works because the only place the variable is set to anything else is when the connection is opened, so a non-None value implies the connection was opened and None implies it was not:
#!C:\Python27
import cx_Oracle

conn = None
try:
    conn = cx_Oracle.connect("scott/tiger#oracle")
    try:
        curs = conn.cursor()
        curs.execute("SELECT dummy FROM sys.dual")
        print curs.fetchone()[0]
    finally:
        curs.close()
finally:
    if conn is not None:
        conn.close()
According to the docs, you don't really need to bother closing the connection:
https://cx-oracle.readthedocs.io/en/latest/user_guide/connection_handling.html#closing-connections
The important line being:
Alternatively, you may prefer to let connections be automatically cleaned up when references to them go out of scope. This lets cx_Oracle close dependent resources in the correct order.
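If you do want the connection closed deterministically without the nested try/finally, here is a minimal sketch using contextlib.closing (assuming Python 2.7, to match the shebang in the question). The connect() call happens inside the with statement, so a down database raises before any cleanup is attempted and there is no half-initialized conn to worry about:

import cx_Oracle
from contextlib import closing

# closing() guarantees close() is called on the wrapped object,
# even if the body of the with block raises
with closing(cx_Oracle.connect("scott/tiger#oracle")) as conn:
    with closing(conn.cursor()) as curs:
        curs.execute("SELECT dummy FROM sys.dual")
        print curs.fetchone()[0]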
(not exactly an answer, but comments don't have nice formatting)
Try this:
#!C:\Python27
import cx_Oracle

try:
    conn = cx_Oracle.connect("scott/tiger#oracle")
    try:
        curs = conn.cursor()
        curs.execute("SELECT dummy FROM sys.dual")
        print curs.fetchone()[0]
    finally:
        curs.close()
        conn.close()
except Exception as e:
    print e
Not ideal, but it should work better. I'm also wondering why there is so much nesting. Why not do this:
#!C:\Python27
import cx_Oracle

try:
    conn = cx_Oracle.connect("scott/tiger#oracle")
    curs = conn.cursor()
    curs.execute("SELECT dummy FROM sys.dual")
    print curs.fetchone()[0]
    curs.close()
    conn.close()
except Exception as e:
    print e
BTW, I have this assumption that the connection and the cursor will close automatically on exit, removing the need to close them explicitly.
Related
I'm using the psycopg2 library to connect to my PostgreSQL database.
Every time I want to execute a query, I make a new connection like this:
import psycopg2

def run_query(query):
    with psycopg2.connect("dbname=test user=postgres") as connection:
        cursor = connection.cursor()
        cursor.execute(query)
        cursor.close()
But I think it's faster to make one connection for the whole app's execution, like this:
import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")

def run_query(query):
    cursor = connection.cursor()
    cursor.execute(query)
    cursor.close()
So which is the better way to connect to my database for the whole execution time of my app?
I've tried both ways and both worked, but I want to know which is better and why.
You should strongly consider using a connection pool, as other answers have suggested. This will be less costly than creating a connection every time you query, and it can handle workloads that a single connection couldn't cope with.
Create a file called something like mydb.py, and include the following:
import psycopg2
import psycopg2.pool
from contextlib import contextmanager

# Note: ThreadedConnectionPool requires minimum and maximum pool sizes;
# the values below are just examples.
dbpool = psycopg2.pool.ThreadedConnectionPool(minconn=1,
                                              maxconn=10,
                                              host=<<YourHost>>,
                                              port=<<YourPort>>,
                                              dbname=<<YourDB>>,
                                              user=<<YourUser>>,
                                              password=<<YourPassword>>,
                                              )

@contextmanager
def db_cursor():
    conn = dbpool.getconn()
    try:
        with conn.cursor() as cur:
            yield cur
            conn.commit()
        """
        You can have multiple exception types here.
        For example, if you wanted to specifically check for the
        23503 "FOREIGN KEY VIOLATION" error type, you could do:

        except psycopg2.Error as e:
            conn.rollback()
            if e.pgcode == '23503':
                raise KeyError(e.diag.message_primary)
            else:
                raise Exception(e.pgcode)
        """
    except:
        conn.rollback()
        raise
    finally:
        dbpool.putconn(conn)
This will allow you to run queries like so:
import mydb

def myfunction():
    with mydb.db_cursor() as cur:
        cur.execute("""Select * from blahblahblah...""")
Both ways are bad. The first one is particularly bad, because opening a database connection is quite expensive. The second is bad, because you will end up with either a single connection (which is too few) or one connection per process or thread (which is usually too many).
Use a connection pool.
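For example, here is a minimal sketch using psycopg2's built-in pool and the same DSN as in the question (the pool sizes are just placeholders):

import psycopg2.pool

# a small pool shared by the whole app; use ThreadedConnectionPool
# instead if several threads will run queries concurrently
pool = psycopg2.pool.SimpleConnectionPool(minconn=1, maxconn=5,
                                          dsn="dbname=test user=postgres")

def run_query(query):
    conn = pool.getconn()            # borrow a connection from the pool
    try:
        with conn:                   # commits on success, rolls back on error
            with conn.cursor() as cursor:
                cursor.execute(query)
    finally:
        pool.putconn(conn)           # always hand the connection back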
With psycopg2, connecting to and querying the database works like so:
conn = psycopg2.connect('connection string')
with conn:
    cur = conn.cursor()
    cur.execute("SELECT * FROM pg_stat_activity")  # simple query
    rows = cur.fetchall()
    for row in rows:
        print(row)
After trial and error, I found out that with conn is absolutely necessary, or you will get many unexplained locks.
My question is: is there a way to set up the connection to avoid the need to use it?
From https://www.psycopg.org/docs/usage.html,
Warning
Unlike file objects or other resources, exiting the connection’s with
block doesn’t close the connection, but only the transaction
associated to it. If you want to make sure the connection is closed
after a certain point, you should still use a try-catch block:
conn = psycopg2.connect(DSN)
try:
    # connection usage
finally:
    conn.close()
In psycopg, the context manager has been implemented in such a way that the with statement only terminates the transaction and does not close the connection for you. The connection needs to be closed by you separately.
In case of an error with your transaction, you have the option to rollback and raise the error.
One way to do this is to write the connection closing logic yourself.
import psycopg2

def with_connection(func):
    """
    Function decorator for passing connections
    """
    def connection(*args, **kwargs):
        # Here, you may even use a connection pool
        conn = psycopg2.connect(DSN)
        try:
            rv = func(conn, *args, **kwargs)
        except Exception as e:
            conn.rollback()
            raise e
        else:
            # Can decide to see if you need to commit the transaction or not
            conn.commit()
        finally:
            conn.close()
        return rv
    return connection
@with_connection
def run_sql(conn, arg1):
    cur = conn.cursor()
    cur.execute(SQL, (arg1,))
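A hypothetical call site would then look like this; the decorator injects conn, so the caller only passes its own arguments:

run_sql("some value for arg1")   # opens, commits or rolls back, and closes the connection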
Since version 2.5, psycopg2 supports the with statement the way you would expect it to behave.
Docs
conn = psycopg2.connect(DSN)

with conn:
    with conn.cursor() as curs:
        curs.execute(SQL1)

with conn:
    with conn.cursor() as curs:
        curs.execute(SQL2)

conn.close()
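If you also want the connection itself closed automatically, one option (a sketch, not part of the docs example above) is to wrap it in contextlib.closing, so the outer with closes the connection while the inner with still handles the transaction:

from contextlib import closing

import psycopg2

with closing(psycopg2.connect(DSN)) as conn:
    with conn:                       # commits or rolls back the transaction
        with conn.cursor() as curs:
            curs.execute(SQL1)
# here the transaction has been committed and the connection is closed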
I am using the code below to update the records of my table, but it doesn't work: it doesn't update my table.
conn = mysql.connector.connect(host='localhost',
                               database='rps',
                               user='root',
                               password='')
cur = conn.cursor()
cur.execute("""UPDATE players SET score=%s WHERE chat_id =%s""", (int(self.p1.score), str(self.p1.chat__id)))
cur.execute("""UPDATE players SET score=%s WHERE chat_id =%s""", (int(self.p2.score), str(self.p2.chat__id)))
conn.commit()
cur.close()
conn.close()
It is completely true, but I don't know how to solve this problem.
Can you help me solve it, guys?
Thanks
You can wrap your query execution in a try/except block:
try:
    cursor.execute(some_statement)
except MySQLdb.IntegrityError as e:
    # handle a specific error condition
    pass
except MySQLdb.Error as e:
    # handle a generic error condition
    pass
except MySQLdb.Warning as e:
    # handle warnings, if the cursor you're using raises them
    pass
If you don't catch any exceptions, you should check your program logic.
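One concrete way to check that logic (an addition to the answer above, using mysql.connector as in the question) is to look at cursor.rowcount after each UPDATE; if it is 0, the WHERE clause simply matched no rows:

cur.execute("""UPDATE players SET score=%s WHERE chat_id =%s""",
            (int(self.p1.score), str(self.p1.chat__id)))
print(cur.rowcount)   # 0 means no row had that chat_id, so nothing was updated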
There's something wrong in my Python script: when I try to put some data in my database and print it, it looks like it's working, but when I rerun the code, or if I check phpMyAdmin, there's no data saved in the DB. Does anyone have an idea how to solve this problem?
import mysql.connector
from mysql.connector import Error, errorcode

def connect():
    """ Connect to MySQL database """
    try:
        conn = mysql.connector.connect(host='localhost',
                                       database='Temperature',
                                       user='Temperature',
                                       password='mypass')
        if conn.is_connected():
            print('Connected to MySQL database')

        cur = conn.cursor()

        query = "INSERT INTO Temp(temp, humi) " \
                "VALUES(315, 55)"
        try:
            cur.execute(query)
        except mysql.connector.ProgrammingError as e:
            print(e)

        query = "SELECT * FROM Temp"
        try:
            cur.execute(query)
            for reading in cur.fetchall():
                print(str(reading[0]) + " " + str(reading[1]))
        except mysql.connector.ProgrammingError as e:
            print(e)

    except Error as e:
        print(e)

    finally:
        conn.close()


if __name__ == '__main__':
    connect()
You will need to add conn.commit() before conn.close(). That should solve the problem.
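Applied to the script above, that means committing once the INSERT has executed, for example (a sketch of the relevant lines; only the commit call is new):

cur.execute(query)
conn.commit()   # without this, the INSERT is discarded when the connection closes,
                # because mysql.connector does not autocommit by default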
Let's say that you have the following code:
import sqlite3
conn = sqlite3.connect('mydb')
cur = conn.cursor()
# some database actions
cur.close()
conn.close()
# more code below
If I try to use the conn or cur objects later on, how could I tell that they are closed? I cannot find a .isclosed() method or anything like it.
You could wrap it in a try/except statement:
>>> conn = sqlite3.connect('mydb')
>>> conn.close()
>>> try:
...     one_row = conn.execute("SELECT * FROM my_table LIMIT 1;")
... except sqlite3.ProgrammingError as e:
...     print(e)
Cannot operate on a closed database.
This relies on a shortcut specific to sqlite3.
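If you need this check in more than one place, you could wrap the same probe in a small helper (a sketch building on the idea above; is_closed is not part of the sqlite3 API):

import sqlite3

def is_closed(conn):
    # probe the connection with a trivial statement; a closed sqlite3
    # connection raises ProgrammingError
    try:
        conn.execute("SELECT 1")
        return False
    except sqlite3.ProgrammingError:
        return True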
How about making sure that the connection and cursor are never closed?
You could have a state-based program that you can guarantee only calls close() at the right time.
Or wrap them in other objects that have a pass implementation of close(). Or add an _isclosed member set by close() and accessed by isclosed().
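A rough sketch of that last idea (the class name and method names are made up for illustration):

import sqlite3

class TrackedConnection(object):
    """A thin wrapper that remembers whether close() has been called."""

    def __init__(self, path):
        self._conn = sqlite3.connect(path)
        self._closed = False

    def close(self):
        self._conn.close()
        self._closed = True

    def isclosed(self):
        return self._closed

    def __getattr__(self, name):
        # delegate everything else (cursor(), execute(), commit(), ...)
        return getattr(self._conn, name)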