Is there any way I can convert a weak reference to a strong one in Python?
# utils.py
import mysql.connector

def connect_db():
    cnx = mysql.connector.connect(user="root", database="test_db")
    cursor = cnx.cursor()
    return cursor
# main.py
from utils import connect_db

def main():
    cursor = connect_db()
    cursor.execute("some commands")
I got this error:
ReferenceError: weakly-referenced object no longer exists
First, let's see why the error is happening. You open a connection and bind it to a function-local name cnx. Then you create a cursor, which holds only a weak reference to the connection. As soon as you return the cursor, the connection has no strong references left and becomes eligible for garbage collection. By the time you attempt to execute a query, the connection has already been garbage collected.
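You can watch the same mechanism with nothing but the standard weakref module (a minimal sketch; the Connection class here is an illustrative stand-in, not mysql.connector's):

import weakref

class Connection:
    pass

def make_ref():
    cnx = Connection()       # the local name is the only strong reference
    return weakref.ref(cnx)  # a weak reference does not keep cnx alive

ref = make_ref()
print(ref())  # None: the Connection was collected when make_ref returned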
As you noted, making the reference from cursor to connection a strong one would solve your immediate problem. At the same time, there is a reason that the API was designed this way. You don't want to have too many connections lingering around, all because some cursors did not get garbage collected.
Instead, the right answer is to handle the connection explicitly, instead of burying it in a function that returns a cursor. Among other things, the work should be done in an enclosing with block so that shutdown is explicit in case of error. Since the existing implementation does not appear to support context management, you will have to write out the try/finally explicitly.
For example, you could return both the connection and the cursor:
def connect_db():
    cnx = mysql.connector.connect(user="root", database="test_db")
    cursor = cnx.cursor()
    return cnx, cursor

def main():
    cnx = None
    try:
        cnx, cursor = connect_db()
        cursor.execute("some commands")
    finally:
        if cnx is not None:
            cnx.close()
A more elegant solution might be to make your own context manager for the database instead of returning two separate objects (similar to https://stackoverflow.com/a/67645694/2988730, but more encapsulated):
class connect_db:
    def __enter__(self):
        self.cnx = mysql.connector.connect(user="root", database="test_db")
        self.cursor = self.cnx.cursor()
        return self.cursor

    def __exit__(self, *args):
        self.cursor.close()
        self.cnx.close()
def main():
    with connect_db() as cursor:
        cursor.execute("some commands")
Related
I'm trying to develop a simple application where multiple business objects are saved to an SQLite database. I have several business objects for which I want to write save/update/delete methods. In each class, and in each method within those classes, I always create a new connection. For example:
import sqlite3

db = "mydb.db"

class BusinessObject1:
    ...
    def save_to_db(self, db):
        conn = sqlite3.connect(db)
        cur = conn.cursor()
        with conn:
            cur.execute("...")

    def delete_from_db(self, db):
        conn = sqlite3.connect(db)
        cur = conn.cursor()
        with conn:
            cur.execute("...")

class BusinessObject2:
    ...
    def save_to_db(self, db):
        conn = sqlite3.connect(db)
        cur = conn.cursor()
        with conn:
            cur.execute("...")

    def delete_from_db(self, db):
        conn = sqlite3.connect(db)
        cur = conn.cursor()
        with conn:
            cur.execute("...")
It doesn't feel like a good design (it's not DRY). Can someone propose a better one? I have around 20 business objects with 4-8 methods each, plus a record table that includes all these objects. Typing conn = sqlite3.connect(db) every time simply cannot be the right way. And if I decide to switch to MySQL or PostgreSQL, I would need to refactor the whole project.
Thanks!
Then you might want to separate the connection handling from the class, or you can do as the comment above suggests and put the connection inside the class constructor.
If for some reason you can't do either, you can still add a staticmethod to your class that handles the connection. It's not ideal, but it's still better than your current solution:
@staticmethod
def create_connection(db_file):
    conn = None
    try:
        with sqlite3.connect(db_file) as conn:
            return conn
    except sqlite3.Error as e:
        print("Can't connect to database, error:", e, sep="\n")
    return conn
Then you add a parameter to your other methods that takes the connection returned by create_connection:
def save_to_db(self, connection):
    cur = connection.cursor()
    cur.execute("...")
This way you can separate your connection object from your class object:
if __name__ == "__main__":
    conn = BusinessObject1.create_connection(db)
    b1 = BusinessObject1(...)
    b1.save_to_db(conn)
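To take the first suggestion (separating the connection handling) further, here is a sketch of a small base class that every business object could inherit; the name DbBase and the table are illustrative, not from the question:

import sqlite3

class DbBase:
    """Owns the connection; subclasses share the execute helper."""
    def __init__(self, db_file):
        self.conn = sqlite3.connect(db_file)

    def execute(self, sql, params=()):
        with self.conn:  # commits on success, rolls back on error
            return self.conn.execute(sql, params)

class BusinessObject1(DbBase):
    def save_to_db(self):
        self.execute("INSERT INTO business_object1 (name) VALUES (?)", ("example",))

Switching to MySQL or PostgreSQL would then mostly mean changing DbBase rather than every method.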
I am new to Python and MySQL and am having some trouble with a connection class that I am using to execute queries.
Here is what I have so far:
class DbConnection:
    def __init__(self):
        db = mysql.connector.connect(
            host=cfg.mysql["host"],
            user=cfg.mysql["user"],
            passwd=cfg.mysql["passwd"],
            database=cfg.mysql["database"]
        )
        self.cursor = db.cursor()

    def query(self, sql):
        self.cursor.execute(sql)
        return self.cursor.fetchall()

test = DbConnection()
test.query('SELECT * FROM DaycareDogs')
When I try to run this (or any query), I get "ReferenceError: weakly-referenced object no longer exists".
I am very new to coding and this is my first real project, so I am learning on the fly.
Is there something I am missing? I have seen a few other similar problems and did what was recommended but still no luck.
Any advice?
Thanks!
Try making the database connection an attribute of the class by adding self. in front of it:
class DbConnection:
    def __init__(self):
        self.db = mysql.connector.connect(
            host=cfg.mysql["host"],
            user=cfg.mysql["user"],
            passwd=cfg.mysql["passwd"],
            database=cfg.mysql["database"]
        )
        self.cursor = self.db.cursor()

    def query(self, sql):
        self.cursor.execute(sql)
        return self.cursor.fetchall()

test = DbConnection()
test.query('SELECT * FROM DaycareDogs')
Since the variable db is in the local scope, it is destroyed when the __init__ constructor ends, and so is the connection; the cursor's weak reference to it then points to nothing, which is what raises the ReferenceError.
I have written a function for connecting to a database using pymysql. Here is my code:
def SQLreadrep(sql):
    connection = pymysql.connect(host=############,
                                 user=#######,
                                 password=########,
                                 db=#########)
    with connection.cursor() as cursor:
        cursor.execute(sql)
        rows = cursor.fetchall()
    connection.commit()
    connection.close()
    return rows
I pass the SQL into this function and return the rows. However, I am making many quick queries to the database (something like "SELECT sku WHERE object='2J4423K'").
What is a way to avoid so many connections?
Should I be avoiding this many connections to begin with?
Could I crash a server using this many connections and queries?
Let me answer your last question first. Your function acquires a connection but closes it prior to returning. So, unless you were multithreading or multiprocessing, I see no reason why you would ever be using more than one connection at a time, and you should not be crashing the server.
The way to avoid the overhead of creating and closing so many connections would be to "cache" the connection. One way to do that would be to replace your function by a class:
import pymysql
class DB(object):
    def __init__(self, datasource, db_user, db_password):
        self.conn = pymysql.connect(db=datasource, user=db_user, password=db_password)

    def __del__(self):
        self.conn.close()

    def query(self, sql):
        with self.conn.cursor() as cursor:
            cursor.execute(sql)
            self.conn.commit()
            return cursor.fetchall()
Then you instantiate an instance of the DB class and invoke its query method. When the DB instance is garbage collected, the connection will be closed automatically.
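A quick usage sketch (the credentials, table, and query are placeholders, not from the question):

db = DB("my_database", "my_user", "my_password")
rows = db.query("SELECT sku FROM parts WHERE object = '2J4423K'")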
I have two programs: one that fills and updates a database, and another that selects info from the database every 10 seconds.
I use Pymysql.
When I update the database I commit the data. I can see the results in the database from the command line, but the other program keeps giving the same output and doesn't get the new data!
Do I need to make a special query other than SELECT?
Do I need to close the connection and reopen it before every query?
I create the GetData instance when starting the program, and get_data is called every 10 seconds.
class GetData:
    def __init__(self):
        self.conn = pymysql.connect(host='localhost', user='root', password='', db='mydb',
                                    charset='utf8mb4', cursorclass=pymysql.cursors.DictCursor)

    def get_data(self, data):
        with self.conn.cursor() as cursor:
            self.sql = "SELECT id_data, somedata FROM mytable WHERE (%s = 'example');"
            cursor.execute(self.sql, (data,))
            return cursor.fetchall()

    def close_conn(self):
        self.conn.close()
The program that fills the database:
class FillDb:
    def __init__(self):
        self.conn = pymysql.connect(host='localhost', user='root', password='', db='mydb',
                                    charset='utf8mb4', cursorclass=pymysql.cursors.DictCursor)
        # added this line but it doesn't help!
        self.conn.autocommit(True)

    def add_in_db(self, data):
        with self.conn.cursor() as cursor:
            self.sql = "INSERT INTO mytable (somedata) VALUES (%s);"
            cursor.execute(self.sql, (data,))
            self.conn.commit()
Why you did not see the updates:
The cause of the behavior is InnoDB's default isolation level REPEATABLE READ. With REPEATABLE READ, the first nonlocking SELECT establishes a snapshot representing the data at that point in time. All consecutive nonlocking SELECTs read from that same snapshot. Updates to the DB from other transactions are not reflected in that snapshot, thus remaining transparent.
Committing the transaction (or closing it and creating a new one) will cause a new snapshot to be created with the next query, representing the data in the DB at that point in time. This is how MySQL implements Consistent Nonlocking Reads as part of their ACID compliance strategy.
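An alternative to committing after every read (not part of the original answer, just a sketch) is to lower the session's isolation level to READ COMMITTED, so each SELECT gets its own fresh snapshot:

conn = pymysql.connect(host='localhost', user='root', password='', db='mydb')
with conn.cursor() as cursor:
    cursor.execute("SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED")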
Why with self.conn works and what it does:
In PyMySQL, there are two (relevant) context manager implementations: one on the Cursor (more or less 'documented') and one on the Connection (can be found in the code :D).
When you used with self.conn.cursor() as cursor: it was the cursor's implementation that was in effect. Entering the context returned self (the cursor object returned from the cursor() method on self.conn); leaving the context ultimately closed that cursor. It has no effect on the transaction.
When using with self.conn as cursor it is the connection's implementation that is in effect. Entering the context returns the cursor from calling self.cursor(); leaving the context does a commit or rollback on the transaction. The cursor is closed implicitly as well.
So, the implicit call to self.commit when leaving the context of the connection's implementation 'expires' the existing snapshot in your transaction and forces the creation of a new one in the next iteration of your loop, which potentially contains your inserts, as long as their commit has completed before the creation of said new snapshot.
I resolved the same issue by adding self.conn.commit() after cursor.fetchall().
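Applied to the GetData class above, that fix would look like this (a sketch):

def get_data(self, data):
    sql = "SELECT id_data, somedata FROM mytable WHERE (%s = 'example');"
    with self.conn.cursor() as cursor:
        cursor.execute(sql, (data,))
        rows = cursor.fetchall()
    self.conn.commit()  # end the transaction; the next call reads a fresh snapshot
    return rows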
I am opening a MySQL connection in the main function and using that connection in multiple functions called by the main function.
Is there anything wrong with passing the cursor from the main function instead of passing the connection?
I.e.:
Pass in cursor from main function
def main():
    conn = pymysql.connect(...)
    with conn as cursor:
        func1(cursor)
        func2(cursor)
    conn.close()

def func1(cursor):
    cursor.execute('select ...')

def func2(cursor):
    cursor.execute('insert ...')
Pass in connection from main function
def main():
    conn = pymysql.connect(...)
    func1(conn)
    func2(conn)
    conn.close()

def func1(conn):
    with conn as cursor:
        cursor.execute('select ...')

def func2(conn):
    with conn as cursor:
        cursor.execute('insert ...')
The answer comes from the Law of Demeter: pass the cursor.
This also leads to slightly shorter code. Here the difference is trivial, but sometimes it can be substantial (e.g., passing a database name vs. passing a cursor).