I have a simple database application in Python with SQLite. I wrote a small program to create a database and insert some values into it. The database is created, but the new values are not inserted, and I don't know why:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import sqlite3 as lite
import sys

def CreateTable():
    try:
        connection = lite.connect(':memory:')
        with connection:
            cursor = connection.cursor()
            sql = 'CREATE TABLE IF NOT EXISTS Authors' + '(ID INT PRIMARY KEY NOT NULL, FIRSTNAME TEXT, LASTNAME TEXT, EMAIL TEXT)'
            cursor.execute(sql)
            data = '\n'.join(connection.iterdump())
            with open('authors.sql', 'w') as f:
                f.write(data)
    except lite.Error, e:
        if connection:
            connection.rollback()
    finally:
        if connection:
            cursor.close()
            connection.close()

def Insert(firstname, lastname, email):
    try:
        connection = lite.connect('authors.sql')
        with connection:
            cursor = connection.cursor()
            sql = "INSERT INTO Authors VALUES (NULL, %s, %s, %s)" % (firstname, lastname, email)
            cursor.execute(sql)
            data = '\n'.join(connection.iterdump())
            with open('authors.sql', 'w') as f:
                f.write(data)
    except lite.Error, e:
        if connection:
            connection.rollback()
    finally:
        if connection:
            cursor.close()
            connection.close()

CreateTable()
Insert('Tibby', 'Molko', 'tibby.molko#yahoo.co.uk')
You are not calling commit on your connection. You should also not write the database file yourself; the database engine writes to the file for you.
Try working through the first few examples in the sqlite3 documentation; it should be clear then.
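For illustration, a minimal sketch of the commit-based approach, letting SQLite manage the file itself (the authors.db file name and the simplified INTEGER PRIMARY KEY schema are assumptions, not your exact setup):

import sqlite3 as lite

connection = lite.connect('authors.db')  # SQLite creates and writes this file
try:
    connection.execute(
        'CREATE TABLE IF NOT EXISTS Authors '
        '(ID INTEGER PRIMARY KEY, FIRSTNAME TEXT, LASTNAME TEXT, EMAIL TEXT)')
    connection.execute(
        'INSERT INTO Authors (FIRSTNAME, LASTNAME, EMAIL) VALUES (?, ?, ?)',
        ('Tibby', 'Molko', 'tibby.molko#yahoo.co.uk'))
    connection.commit()  # without this, the INSERT is never persisted
finally:
    connection.close()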
You have misunderstood what connection.iterdump() is for. It produces SQL text: instructions for SQLite to execute again at a later date. It is not the database itself. If all you wanted was to output SQL statements, you could just write those statements directly; there is little point in passing them through SQLite first.
You also cannot 'connect' SQLite to a text file of SQL statements; you would have to load those statements as text and replay them all. That is not what you wanted here, I think.
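If you really did want a text dump and to replay it later, a rough sketch would look like this (the file names are illustrative):

import sqlite3 as lite

# Dump an existing database to SQL text; this is a script, not a database file.
source = lite.connect('authors.db')
with open('authors.sql', 'w') as f:
    f.write('\n'.join(source.iterdump()))
source.close()

# Replaying the dump means executing those statements again.
restored = lite.connect(':memory:')
with open('authors.sql') as f:
    restored.executescript(f.read())
restored.close()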
You can connect to an existing database to insert additional rows. Each time you want to add data, just connect:
def CreateTable():
    connection = lite.connect('authors.db')
    try:
        with connection:
            cursor = connection.cursor()
            sql = '''\
                CREATE TABLE IF NOT EXISTS Authors (
                    ID INTEGER PRIMARY KEY,  -- auto-assigned when NULL is inserted
                    FIRSTNAME TEXT,
                    LASTNAME TEXT,
                    EMAIL TEXT)
                '''
            cursor.execute(sql)
    finally:
        connection.close()

def Insert(firstname, lastname, email):
    connection = lite.connect('authors.db')
    try:
        with connection:
            cursor = connection.cursor()
            sql = "INSERT INTO Authors VALUES (NULL, ?, ?, ?)"
            cursor.execute(sql, (firstname, lastname, email))
    finally:
        connection.close()
Note that using the connection as a context manager already ensures that the transaction is either committed or rolled back, depending on whether an exception was raised.
On the whole, you want to be informed of exceptions here; if you cannot connect to the database, you want to know about it, so I simplified the connection handling accordingly. Closing a connection automatically closes any remaining cursors.
Last but far from least, I switched your insertion to using SQL parameters. Never use string interpolation where parameters can be used instead. Parameters let the database cache statement parse results and, above all, prevent SQL injection attacks.
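With those two functions in place, the calls from your original script stay the same:

CreateTable()
Insert('Tibby', 'Molko', 'tibby.molko#yahoo.co.uk')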
You cannot connect to a text file of SQL commands; sqlite3.connect expects or creates a binary database file.
You also didn't commit. Writes to the database must be committed; read (SELECT) operations don't need it.
try:
    with connection:
        cursor = connection.cursor()
        sql = "INSERT INTO Authors VALUES (NULL, ?, ?, ?)"
        cursor.execute(sql, (firstname, lastname, email))
        connection.commit()  # needed for writes; the with-block also commits on success
finally:
    connection.close()
Related
I cannot figure out why I cannot get my cursor.execute statement to work. I do not get any Python errors; the code just keeps falling into the except branch. I am connected to SQL Server and I can print a list of columns in the table. Here is my code.
import pyodbc

# Connect to the database
connection = pyodbc.connect(
    r'DRIVER={SQL Server};'
    r'SERVER=******;'
    r'DATABASE=****;'
    r'UID=****;'
    r'PWD=*****'
)

try:
    # Create a cursor
    cursor = connection.cursor()
    # Execute a SQL statement
    cursor.execute("INSERT INTO employee table (FName, SSN) VALUES (?, ?)", "john", "123-123-1234")
    connection.commit()
    print("success")
except pyodbc.Error:
    print("error")
    connection.rollback()

# Close the cursor and connection
cursor.close()
connection.close()
I get a 42000 error when I print the exception, but the syntax looks correct to me.
cursor.execute("INSERT INTO employee table (FName, SSN) VALUES (?, ?)", "john", "123-123-1234")
I see two problems here.
First, the syntax for INSERT should look like this:
INSERT INTO tablename (column1, column2) VALUES (value1, value2)
But instead of tablename, you have employee table. If employee is the name of the table, then just use that; you don't need the extra word table hanging out there.
Second, you're passing the two values "john", "123-123-1234" as separate arguments to execute. pyodbc happens to accept that form, but the standard DB-API way is a single argument: a list/tuple containing all the desired values, like so:
cursor.execute("...", ("john", "123-123-1234"))
I am trying to test this function, which inserts data into Postgres via a GCP Cloud Function. I am having trouble testing it. It says some JSON is required; however, what exactly is required here for it to be called?
import psycopg2

def hello_world(self):
    # establishing the connection
    conn = psycopg2.connect(
        database='somedb',
        user='someuser',
        password='somepwd',
        host='XX.XXX.XXX.XXX',
        port='5432'
    )
    # creating a cursor object
    cursor = conn.cursor()
    # list that contains records to be inserted into the table
    data = [('Babita', 'Bihar'), ('Anushka', 'Hyderabad'),
            ('Anamika', 'Banglore'), ('Sanaya', 'Pune'),
            ('Radha', 'Chandigarh')]
    # inserting records into the employee table
    for d in data:
        cursor.execute("INSERT into employee(name, state) VALUES (%s, %s)", d)
    # print("List has been inserted to employee table successfully...")
    # Commit your changes in the database
    conn.commit()
    # Closing the connection
    conn.close()
I am creating a Python app that will store my homework in a database (managed with phpMyAdmin). Here comes my problem:
At the moment I store every entry with an ID (1, 2, 3, 4...), a date (23/06/2018...) and a task (read one chapter of a book). Now I would like to sort the entries by date, because when I check what I have to do, I would prefer to see first whatever is due soonest.
For example, if I have two tasks, one for 25/07/2018 and the other for 11/07/2018, I would like the 11/07/2018 one shown first, no matter if it was added later than the 25/07/2018 one. I am using Python (3.6), pymysql and phpMyAdmin to manage the database.
I had one idea to get this working: maybe I could run a Python script every 2 hours that sorts all the elements in the database, but I have no clue how to do that.
Below is the code that inserts the values into the database and then shows them all.
import pymysql

def dba():
    connection = pymysql.connect(host='localhost',
                                 user='root',
                                 password='Adminhost123..',
                                 db='deuresc',
                                 charset='utf8mb4',
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            # Create a new record
            sql = "INSERT INTO `deures` (`data`, `tasca`) VALUES (%s, %s)"
            cursor.execute(sql, (data, tasca))

        # connection is not autocommit by default. So you must commit to save
        # your changes.
        connection.commit()

        with connection.cursor() as cursor:
            # Read a single record
            sql = "SELECT * FROM `deures` WHERE `data`=%s"
            cursor.execute(sql, (data,))
            resultat = cursor.fetchone()
            print('Has introduït: ' + str(resultat))
    finally:
        connection.close()

def dbb():
    connection = pymysql.connect(host='localhost',
                                 user='root',
                                 password='Adminhost123..',
                                 db='deuresc',
                                 charset='utf8mb4',
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            # Read all records
            sql = "SELECT * FROM `deures`"
            cursor.execute(sql)
            resultat = cursor.fetchall()
            for i in resultat:
                print(i)
    finally:
        connection.close()
Can someone help?
You don't sort the database. You sort the results of the query when you ask for data. So in your dbb function you should do:
SELECT * FROM `deures` ORDER BY `data`
assuming that data is the field with the date.
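A sketch of dbb with the ordered query (only the SQL string changes; the connection settings are copied from your code):

import pymysql

def dbb():
    connection = pymysql.connect(host='localhost',
                                 user='root',
                                 password='Adminhost123..',
                                 db='deuresc',
                                 charset='utf8mb4',
                                 cursorclass=pymysql.cursors.DictCursor)
    try:
        with connection.cursor() as cursor:
            # ORDER BY sorts the result set; the stored rows stay untouched
            sql = "SELECT * FROM `deures` ORDER BY `data`"
            cursor.execute(sql)
            for row in cursor.fetchall():
                print(row)
    finally:
        connection.close()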
I have been trying to insert data into the database using the following code in python:
import sqlite3 as db
conn = db.connect('insertlinks.db')
cursor = conn.cursor()
db.autocommit(True)
a="asd"
b="adasd"
cursor.execute("Insert into links (link,id) values (?,?)",(a,b))
conn.close()
The code runs without any errors, but no update to the database takes place. I tried adding conn.commit(), but it gives an error saying module not found. Please help?
You do have to commit after inserting:
cursor.execute("Insert into links (link,id) values (?,?)",(a,b))
conn.commit()
or use the connection as a context manager:
with conn:
cursor.execute("Insert into links (link,id) values (?,?)", (a, b))
or turn on autocommit by setting the isolation_level keyword parameter of the connect() call to None:
conn = db.connect('insertlinks.db', isolation_level=None)
See Controlling Transactions.
It may be a bit late, but setting autocommit saved me a lot of time, especially if you have a script that runs bulk actions such as updates/inserts/deletes...
Reference: https://docs.python.org/2/library/sqlite3.html#sqlite3.Connection.isolation_level
This is the way I usually do it in my scripts:
def get_connection():
    conn = sqlite3.connect('../db.sqlite3', isolation_level=None)
    cursor = conn.cursor()
    return conn, cursor

def get_jobs():
    conn, cursor = get_connection()
    if conn is None:
        raise sqlite3.DatabaseError("Could not get connection")
I hope it helps you!
When using sqlite3 from Python, how do I determine whether a row has been successfully inserted into a table? e.g.
conn = sqlite3.connect("test.db")
c = conn.cursor()
c.execute("INSERT INTO TEST VALUES ('sample text')")
c.commit()
c.close()
If no exception was thrown when calling execute() or commit(), it was inserted when you called commit().
Committing a transaction successfully is a guarantee from the database layer that the insert was written to disk.
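As a minimal sketch of that "no exception means success" pattern (the single-column TEST schema is an assumption):

import sqlite3

conn = sqlite3.connect("test.db")
try:
    conn.execute("CREATE TABLE IF NOT EXISTS TEST (value TEXT)")
    conn.execute("INSERT INTO TEST VALUES ('sample text')")
    conn.commit()  # reached only if execute() raised nothing
    print("insert committed")
except sqlite3.Error as exc:
    conn.rollback()
    print("insert failed:", exc)
finally:
    conn.close()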
You can get all the rows and see if it's in there with:
SELECT * FROM TEST
But SQLite will give you an error message if it didn't work.
You can also count() the rows before and after inserting.
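A rough sketch of the count-before-and-after check (again assuming a single-column TEST table):

import sqlite3

conn = sqlite3.connect("test.db")
conn.execute("CREATE TABLE IF NOT EXISTS TEST (value TEXT)")

before = conn.execute("SELECT COUNT(*) FROM TEST").fetchone()[0]
conn.execute("INSERT INTO TEST VALUES ('sample text')")
conn.commit()
after = conn.execute("SELECT COUNT(*) FROM TEST").fetchone()[0]

print("row inserted" if after == before + 1 else "no row inserted")
conn.close()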
You could try something like this to have an error message:
try:
    c.execute("INSERT INTO TEST VALUES ('sample text')")
except sqlite3.OperationalError as msg:
    print(msg)
You should call commit on the connection you made (the database connection, conn), not on the cursor.
conn = sqlite3.connect("test.db")
c = conn.cursor()
c.execute("INSERT INTO TEST VALUES ('sample text')")
#commit the changes to db
conn.commit()
conn.close()
First, you should call commit on the connection object, not the cursor, i.e.
conn.commit(), not c.commit()
Then, after conn.commit(), you can examine lastrowid on the cursor to determine whether the insert was successful:
c.lastrowid
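For instance, a small sketch of that check (the one-column TEST schema is assumed):

import sqlite3

conn = sqlite3.connect("test.db")
c = conn.cursor()
c.execute("CREATE TABLE IF NOT EXISTS TEST (value TEXT)")
c.execute("INSERT INTO TEST VALUES ('sample text')")
conn.commit()
print("inserted row id:", c.lastrowid)
conn.close()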