Description
I have a database that I built in Python 3 using sqlite3. Up to this point, I have not had any issues with commit() saving changes (for both insert and delete commands). However, I am now trying to use an update command and have not been able to save the changes (it only changes the DB in working memory, despite calling commit()).
The goal of this code snippet is to replace the null values in the database with an empty string, as I have another function that cannot handle null data. I found a solution for that here: Find null values from table and replace it with space - sqlite.
Details
Here is the current code that I am trying to execute:
self.cursor.execute(f'UPDATE {tbl_name} SET {col_name} = IFNULL({col_name}, "")')
self.conn.commit()
This code basically goes through the entire database one column at a time and replaces the null values.
Note that self is defined as follows:
Database.conn = sqlite3.connect(self.location + self.name)
Database.cursor = sqlite3.connect(self.location + self.name).cursor()
As stated earlier, this operates correctly; however, it will not commit the changes to the actual database. I have verified this both with DB Browser for SQLite and by pulling the data again after closing and re-running the program.
I will also note that if I close this program and reinitialize it to run again, it errors out because the DB is still locked, despite the last lines of my code being:
Database.conn.commit() # Save (commit) the changes
Database.conn.close() # Close database
Conclusion
Thanks in advance as I have been beating my head against the wall with this one and have yet to find a problem like this elsewhere!
Your database connection has nothing to do with your cursor.
You do
Database.conn = sqlite3.connect(self.location + self.name)
Database.cursor = sqlite3.connect(self.location + self.name).cursor()
So, because you create a separate connection for the cursor, a subsequent Database.conn.commit() won't commit any changes you made through that cursor.
Create your cursor like this, so that the cursor is tied to the same connection:
Database.conn = sqlite3.connect(self.location + self.name)
Database.cursor = Database.conn.cursor()
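For completeness, here is a minimal sketch of the corrected flow (the file path, table name, and column name are placeholders):
import sqlite3

# One connection; the cursor is created from that same connection.
conn = sqlite3.connect("example.db")          # placeholder path
cursor = conn.cursor()

tbl_name, col_name = "my_table", "my_column"  # placeholder names
# Replace NULLs in this column with an empty string.
cursor.execute(f"UPDATE {tbl_name} SET {col_name} = IFNULL({col_name}, '')")

conn.commit()  # now persists, because the cursor belongs to this connection
conn.close()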
Related
I am trying to update a SQL Server table through Python, but unfortunately it does not update.
I get a success message, but no data is updated.
If I call the same SQL script from within SQL Server, it updates correctly.
Let me show you my script. This is my Python code:
PredString = '99'
conn = pymssql.connect(server="MyServer", database="MyDB", port="1433", user="****", password="******")
dfUpdate = pd.read_sql("EXEC UpdatePredictions '" + PredString + "'", conn)
conn.close()
print(dfUpdate)
This is the SQL Server stored procedure:
alter procedure UpdatePredictions
    (@PredString varchar(max))
as
begin
    update MyTable
    set PredMths = @PredString
    select 'Updated.'
end
When I run the Python code I get "Updated", but no records are actually updated.
But when I call from SQL Server:
EXEC UpdatePredictions '99'
I get message "Updated" and records are actually updated
What am I doing wrong here? How can I get Python to update the table?
Thanks to the guys who answered in the comments.
As no one has posted it as an answer, I will, so I can mark it and other people can find the answer easily in the future.
The problem was that the Python connection wasn't committing the update statement.
Therefore I had to add this line after sending the update:
conn.commit()
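Putting it together, a minimal sketch based on the code from the question (server, database, and credentials are the placeholders used there):
import pandas as pd
import pymssql

PredString = '99'
conn = pymssql.connect(server="MyServer", database="MyDB", port="1433", user="****", password="******")
dfUpdate = pd.read_sql("EXEC UpdatePredictions '" + PredString + "'", conn)
conn.commit()  # persist the changes made by the stored procedure
conn.close()
print(dfUpdate)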
So, I have the following code that inserts the data from an old database into a new one:
...
cur_old.execute("""SELECT DISTINCT module FROM all_students_users_log_course266""")
module_rows = cur_old.fetchall()
for row in module_rows:
    cur_new.execute("""INSERT INTO modules(label) SELECT %s WHERE NOT EXISTS (SELECT 1 FROM modules WHERE label=%s)""", (row[0], row[0]))
...
The last line executes a query where labels are inserted into the new database table. I tested this query on pgAdmin and it works as I want.
However, when I execute the script, nothing is inserted into the modules table. (Actually, the sequences are updated, but no data is stored in the table.)
Do I need to do anything else after I call the execute method from the cursor?
(P.S. The script runs to the end without any errors.)
You forgot to do connection.commit(). Any alteration in the database has to be followed by a commit on the connection. For example, the sqlite3 documentation states it clearly in the first example:
# Save (commit) the changes.
conn.commit()
And the first example in the psycopg2 documentation does the same:
# Make the changes to the database persistent
>>> conn.commit()
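Applied to the loop from the question, a minimal sketch could look like this (conn_new is an assumed name for the connection that cur_new was created from):
cur_old.execute("""SELECT DISTINCT module FROM all_students_users_log_course266""")
module_rows = cur_old.fetchall()

for row in module_rows:
    cur_new.execute("""INSERT INTO modules(label) SELECT %s WHERE NOT EXISTS (SELECT 1 FROM modules WHERE label=%s)""", (row[0], row[0]))

conn_new.commit()  # persist the inserts; without this they are rolled back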
As Evert said, the commit() was missing. An alternative to always specifying it in your code is using the autocommit feature.
http://initd.org/psycopg/docs/connection.html#connection.autocommit
For example like this:
with psycopg2.connect("...") as dbconn:
    dbconn.autocommit = True
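As a rough sketch (the connection string and table are placeholders), with autocommit enabled each statement is persisted without an explicit commit():
import psycopg2

with psycopg2.connect("dbname=mydb user=myuser") as dbconn:  # placeholder DSN
    dbconn.autocommit = True
    cur = dbconn.cursor()
    cur.execute("INSERT INTO modules (label) VALUES (%s)", ("example",))
    # no dbconn.commit() needed: each statement is committed immediately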
My set-up:
1. MySQL server.
2. Host running a Python script.
(1) and (2) are different machines on the network.
The python script generates data which must be stored in a MySQL-database.
I use this (example-)code to achieve that:
def sqldata(date, result):
    con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')

    with con:
        cur = con.cursor()
        cur.execute('INSERT INTO tabel(titel, nummer) VALUES (%s, %s)', (date, result))
The script generates one data point approximately every minute, which means that a new connection is opened and closed every minute. I'm wondering if it would be a better idea to open the connection at the start of the script and only close it when the script terminates, effectively leaving the connection open indefinitely.
This then raises the question of how to handle/recover when the SQL server "leaves" the network for a while (e.g. due to a reboot).
While typing my question, this question appeared in the "Similar Questions" section. It is, however, from 2008 and possibly outdated, and the four answers it received seem to contradict each other.
What are the current insights in this matter?
Well, the referenced answer makes a valid point, but it may not answer all your questions. I cannot provide a full running Python script for you here, but let me explain how I would go about it:
Rule 1: Generally, most MySQL functions return values that you should always check, so that you can react to unwanted behavior.
Rule 2: Open a connection at the beginning of your script and use this one and only connection throughout your script.
Obviously, you could check whether there is an existing connection in your sqldata function, and if not, open a new one and assign it to the global con object.
if not con:
    con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')
And if there is already a connection, you could check its "up status" by performing a simple query with a fixed, expected result, to see whether the SQL server is running.
if con:
    cur = con.cursor()
    cur.execute('SELECT COUNT(*) FROM tabel')
    if cur.fetchone():
        ....
Basically, you could even skip this, because if you don't get a cursor back, and you check that first before using it, then you already know whether the server is alive or not.
So CHECK, CHECK and CHECK. You should check everything you get back from a function so that you have good error handling. Using a connection or a cursor without checking it first can leave you talking to a None object and crash your script.
And the last BIG HINT I can give you is to use multi-row inserts. You can actually insert hundreds of rows if you just add the values, comma-separated, to your insert string:
# consider result would be filled like this
result = '("First Song",1),("Second Song",2),("Third Song",3)'
# then this will insert 3 rows with one call
returned = cur.execute('INSERT INTO tabel (titel, nummer) VALUES ' + result)
# since literally it will execute
# INSERT INTO tabel (titel, nummer) VALUES ("First Song",1),("Second Song",2),("Third Song",3)
# returned holds the number of affected rows, so you can check it for errors
if returned:
    ....
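Putting Rule 2 and the checks together, a rough sketch could look like the following (assuming mdb is MySQLdb, as in the question; the connection parameters are the demo values from the question, and the global con with a ping()-based check is just one way to do it):
import MySQLdb as mdb

con = None  # single, long-lived connection for the whole script

def get_connection():
    # Return a usable connection, reconnecting if the server went away.
    global con
    if con is None:
        con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')
    else:
        try:
            con.ping()  # cheap "up status" check
        except mdb.OperationalError:
            con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')
    return con

def sqldata(date, result):
    connection = get_connection()
    cur = connection.cursor()
    cur.execute('INSERT INTO tabel(titel, nummer) VALUES (%s, %s)', (date, result))
    connection.commit()  # persist the insert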
I'm using Python and MySQLdb to add rows to my database. It seems that when my script exits, the rows get deleted. My last lines before the script exits do a "select *" on the table, which shows my one row. When I re-run the script, the first lines (after opening the connection) do the same "select *" and return zero results. I'm really at a loss here. I've been working for about 2 hours on this, and can't understand what could be accessing my database.
Also, between running the scripts, I run the "select *" manually from a terminal with zero results.
If I manually add a row from the terminal, it seems to last.
The query to insert the row:
cursor.execute("INSERT INTO sessions(username, id, ip) VALUES (%s, %s, %s)", (username, SessionID, IP]))
The query I use to check the data:
cursor.execute("select * from sessions")
print cursor.fetchall()
This shows the row before the program exits, then shows nothing when the program is run again.
Thanks in advance for all the help.
It looks like you need to call connection.commit() on your changes after you execute the query (replace connection with your DB connection variable).
http://docs.python.org/library/sqlite3.html
Connection.commit():
This method commits the current transaction. If you don’t call this method, anything you did since the last call to commit() is not visible from other database connections. If you wonder why you don’t see the data you’ve written to the database, please check you didn’t forget to call this method.
Check this other question: Python MySQLdb update query fails
You can find some examples on how to commit, how to connect using autocommit, etc.
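In terms of the snippet from the question, a minimal sketch would be (conn stands for whatever MySQLdb connection object the cursor was created from):
cursor.execute("INSERT INTO sessions(username, id, ip) VALUES (%s, %s, %s)", (username, SessionID, IP))
conn.commit()  # without this, the INSERT is rolled back when the connection closes

cursor.execute("select * from sessions")
print(cursor.fetchall())  # the row now survives across runs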
I'm trying to figure out if it's possible to replace record values in a Microsoft Access (either .accdb or .mdb) database using pyodbc. I've pored over the documentation and noted where it says that "Row Values Can Be Replaced", but I have not been able to make it work.
More specifically, I'm attempting to replace a row value with the value of a Python variable. I've tried:
- setting the connection's autocommit to True
- making sure that it's not a data type issue
Here is a snippet of the code where I execute a SQL query and use fetchone() to grab just one record (I know that with this script the query only returns one record). I then grab the existing value of a field (the field position integer is stored in the z variable) and look up the new value I want to write to the field in an existing Python dictionary created earlier in the script.
pSQL = "SELECT * FROM %s WHERE %s = '%s'" % (reviewTBL, newID, basinID)
cursor.execute(pSQL)
record = cursor.fetchone()
if record:
    oldVal = record[z]
    val = codeCrosswalk[oldVal]
    record[z] = val
I've tried everything I can think of but cannot get it to work. Am I just misunderstanding the help documentation?
The script runs successfully, but the newly assigned value never seems to commit. I even tried putting print str(record[z]) after the record[z] = val line to see if the field in the table has the new value, and the new value prints as if it worked... but when I check the table after the script has finished, the old values are still in the field.
I would much appreciate any insight into this... I was hoping this would work the way it does with VBA in MS Access, where you can use an ADO Recordset to loop through the records in a table and assign values to a field from a variable.
thanks,
Tom
The "Row values can be replaced" from the pyodbc documentation refers to the fact that you can modify the values on the returned row objects, for example to perform some cleanup or conversion before you start using them. It does not mean that these changes will automatically be persisted in the database. You will have to use sql UPDATE statements for that.