Python - SQLite db: update entries from CSV file

I am trying to update existing sqlite db rows with rows from my CSV file.
Not delete and insert, but update the existing rows (keeping their ids).
The sqlite db has 4 columns and only the 4th column differs (needs updating).
If it's not possible to update only one column, I can accept updating the whole row as long as it keeps its place in the db.
db before update:
cfthostname,cftshortname,cftenv,cert_time
1904h.net,1904h,tst,DD/MM/RRRR
19053.net,19053,tst,26/03/2021
2210010315.net,2210010315,prd,DD/MM/RRRR
1809m.net,1809m,tst,26/03/2021
13jw.net,13jw,acc,DD/MM/RRRR
csv to update:
cfthostname,cftshortname,cftenv,cert_time
1904h.net,1904h,tst,13/05/2023
19053.net,19053,tst,23/07/2023
13jw.net,13jw,acc,14/06/2029
update code:
import sqlite3
import csv

conn = sqlite3.connect("C:\\db.sqlite3")
cursor = conn.cursor()
[...]
with open('C:\\csv\\update.csv', 'rt') as fin:
    dr = csv.DictReader(fin)
    to_db = [(i['hostname'], i['shortname'], i['env'], i['cert_time']) for i in dr]
cursor.executemany("UPDATE itpassed_host SET hostname = ?, shortname = ?, env = ?, cert_time = ?", to_db)
conn.commit()
conn.close()
I also tried wrapping to_db in extra parentheses, but it produces the same result in the db:
cursor.executemany("UPDATE itpassed_host SET hostname = ?, shortname = ?, env = ?, cert_time = ?", (to_db))
db after update:
cfthostname,cftshortname,cftenv,cert_time
13jw.net,13jw,acc,14/06/2029
13jw.net,13jw,acc,14/06/2029
13jw.net,13jw,acc,14/06/2029
13jw.net,13jw,acc,14/06/2029
13jw.net,13jw,acc,14/06/2029
How do I update only the rows from the update CSV correctly in the db?

Use a WHERE clause. Without one, every UPDATE statement touches all rows of the table, which is why the whole table ended up holding the values of the last CSV row.
import sqlite3
import csv

conn = sqlite3.connect("db.sqlite3")
cursor = conn.cursor()
with open('update.csv', 'rt') as fin:
    dr = csv.DictReader(fin)
    to_db = [(i['cert_time'], i['cfthostname'], i['cftshortname'], i['cftenv']) for i in dr]
cursor.executemany("UPDATE itpassed_host SET cert_time = ? WHERE cfthostname = ? AND cftshortname = ? AND cftenv = ?", to_db)
conn.commit()
conn.close()
Notice that the tuple order changed in the to_db line: the new cert_time value comes first, followed by the three key columns, matching the order of the placeholders in the statement.
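A quick way to confirm the UPDATEs actually matched something is cursor.rowcount, which after executemany reports the total number of rows changed. A minimal sketch against an in-memory table, assuming the column names from the CSV header:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for db.sqlite3
cursor = conn.cursor()
cursor.execute("CREATE TABLE itpassed_host "
               "(cfthostname TEXT, cftshortname TEXT, cftenv TEXT, cert_time TEXT)")
cursor.executemany("INSERT INTO itpassed_host VALUES (?, ?, ?, ?)",
                   [("1904h.net", "1904h", "tst", "DD/MM/RRRR"),
                    ("19053.net", "19053", "tst", "26/03/2021")])

# new cert_time first, then the three key columns, matching the placeholders
to_db = [("13/05/2023", "1904h.net", "1904h", "tst")]
cursor.executemany("UPDATE itpassed_host SET cert_time = ? "
                   "WHERE cfthostname = ? AND cftshortname = ? AND cftenv = ?",
                   to_db)
print(cursor.rowcount)  # 1 -> only the matching row was touched
conn.commit()
```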

Related

Python Sqlite3 not updating database

I'm working on a project and I'm facing an issue when trying to update my database.
I'm running the following command:
con = sqlite3.connect("DATASETS/SQLite.db")
cur = con.cursor()
with open("DATASETS/test.csv", "r") as fin:
    dr = csv.DictReader(fin, ["element1", "element2", "element3", "element4", "element5", "element6", "element7", "element8", "id"])
    to_db = [(i["element1"], i["element2"], i["element3"], i["element4"], i["element5"], i["element6"], i["element7"], i["element8"], i["id"]) for i in dr]
cur.executemany("UPDATE Table SET element1 = ?, element2 = ?, element3 = ?, element4 = ?, element5 = ?, element6 = ?, element7 = ?, element8 = ? WHERE ID = ?;", to_db)
con.commit()
con.close()
The data is being filtered into a dataset CSV with 6000+ rows and the delimiter ",".
The code runs without errors, but when I opened the DB to check, not a single row had been updated.
I don't know if this is a VSCode bug, a version issue (Python 3.8.3), or something like that, and I wanted to ask if anyone has already seen this.
Thanks!!
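One likely culprit: the IDs built from the CSV never match the IDs in the table, and an UPDATE whose WHERE clause matches nothing completes silently. cursor.rowcount makes this visible, sketched here with an in-memory table and hypothetical data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE demo (id INTEGER PRIMARY KEY, element1 TEXT)")
cur.execute("INSERT INTO demo VALUES (1, 'old')")

# An id that exists nowhere: no error, no warning, nothing updated
cur.executemany("UPDATE demo SET element1 = ? WHERE id = ?", [("new", 99)])
print(cur.rowcount)  # 0 -> the WHERE clause matched no rows

cur.executemany("UPDATE demo SET element1 = ? WHERE id = ?", [("new", 1)])
print(cur.rowcount)  # 1 -> one row actually changed
con.commit()
```

If rowcount stays 0, the next step would be comparing the CSV's id values against the table's.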

CSV import to database

I am getting the error 'sqlite3.ProgrammingError: Incorrect number of bindings supplied. The current statement uses 4, and there are 1 supplied.' The code below should create a database and a table with the titles listed below, then take values from a CSV file and add them under the allotted headings. Any help would be appreciated!
import const
import sqlite3

SEP = ','
DATA_FILENAME = 'pokemon.csv'

con = sqlite3.connect('poki.db')
cur = con.cursor()
cur.execute('DROP TABLE IF EXISTS poki')
cur.execute('CREATE TABLE poki( pokemon TEXT, species_id INTEGER,'
            ' height REAL, weight REAL)')
values = 'INSERT INTO poki VALUES (?, ?, ?, ?)'
for line in DATA_FILENAME:
    list_of_values = line.strip().split(SEP)
    cur.execute(values, list_of_values)
cur.close()
con.commit()
con.close()
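The binding error comes from the for loop: it iterates over the filename string 'pokemon.csv' character by character, so each execute receives a one-element list. A sketch of the likely fix, opening the file and letting the csv module split the rows (sample data is written first so the sketch runs standalone):

```python
import csv
import sqlite3

# sample data so the sketch runs standalone (made-up rows)
with open("pokemon.csv", "w", newline="") as f:
    f.write("bulbasaur,1,0.7,6.9\ncharmander,4,0.6,8.5\n")

con = sqlite3.connect(":memory:")  # 'poki.db' in the original
cur = con.cursor()
cur.execute("CREATE TABLE poki (pokemon TEXT, species_id INTEGER,"
            " height REAL, weight REAL)")

# Iterate over the opened file, not the filename string: looping over
# 'pokemon.csv' itself yields one character at a time, which is why the
# statement saw 4 placeholders but only 1 value supplied.
with open("pokemon.csv", newline="") as f:
    for list_of_values in csv.reader(f):
        cur.execute("INSERT INTO poki VALUES (?, ?, ?, ?)", list_of_values)
con.commit()
print(cur.execute("SELECT COUNT(*) FROM poki").fetchone()[0])  # 2
```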

Postgresql: how to copy a column from a table to another using psycopg2?

I am using psycopg2 in Python to manage my database.
I created two tables. One, called data, is the one I want to populate. To do so I created a temporary table called temporarytable from a .csv file.
I want to copy the column numberofinhabitants, so I am doing the following:
### Add Column
import psycopg2 as ps
stat1 = """ ALTER TABLE data ADD COLUMN numberofinhabitants integer"""
con = ps.connect(dbname = 'mydb', user='postgres', host='localhost', password='mypd')
con.autocommit = True
cur = con.cursor()
cur.execute(stat1)
cur.close()
con.close()
### Copy Column
stat2 = """INSERT INTO data (numberofinhabitants) SELECT numberofinhabitants FROM temporarytable"""
con = ps.connect(dbname = 'mydb', user='postgres', host='localhost', password='mypd')
con.autocommit = True
cur = con.cursor()
cur.execute(stat2)
cur.close()
con.close()
but I get the following error
ProgrammingError: column "numberofinhabitants" does not exist
LINE 1: INSERT INTO data (numberofinhabitants) SELECT numberofinhabi...
^
HINT: There is a column named "numberofinhabitants" in table "data", but it cannot be referenced from this part of the query.
Below is a screenshot from pgAdmin3 after SELECT * FROM temporarytable;
I think the problem is that the column was created with a quoted mixed-case name, and quoted identifiers are case-sensitive in PostgreSQL. You should try this as stat2:
stat2 = """INSERT INTO data (numberofinhabitants) SELECT "numberOfInhabitants" FROM temporarytable"""
Note that you also need double quotes around any column name that contains uppercase characters.

Python 3.4 transmiting table, columns and values as variables using MySQLdb

I am using Python 3.4
I have this piece of code:
import MySQLdb

table = "my_table"
columns = ("column1", "column2")
values = ("value1", "value2")

conn = MySQLdb.connect(host="localhost",
                       user="user",
                       passwd="password",
                       db="my_database")
cursor = conn.cursor()
# execute an insert
cursor.execute("INSERT INTO my_table (column1, column2) VALUES ('value1', 'value2')")
conn.commit()
cursor.close()
conn.close()
Q: How can I pass the table name, columns and the values all as variables?
I would like to do something like this:
sql = "INSERT INTO %s %s VALUES %s" % (my_table, columns, values)
cursor.execute(sql)
You will have to do it as a two-step process, because execute escapes every value it is given, so it cannot be used to substitute table or column names:
sql = "INSERT INTO {} ({}) VALUES ({})".format(table, ','.join(columns), ','.join(['%s'] * len(columns)))
# Generates: INSERT INTO my_table (column1,column2) VALUES (%s,%s)
cursor.execute(sql, values)
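The string-building half can be checked without a database at all; note that MySQLdb uses %s placeholders, not ?:

```python
table = "my_table"
columns = ("column1", "column2")
values = ("value1", "value2")

# Only identifiers go through format(); the values stay out of the SQL
# text and are passed to execute() separately so the driver escapes them.
sql = "INSERT INTO {} ({}) VALUES ({})".format(
    table, ",".join(columns), ",".join(["%s"] * len(columns)))
print(sql)  # INSERT INTO my_table (column1,column2) VALUES (%s,%s)
# cursor.execute(sql, values)  # then run it against a real connection
```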

Insert data into MySQL table from Python script

I have a MySQL table named TBLTEST with two columns, ID and qSQL. Each qSQL holds a SQL query.
I have another table, FACTRESTTBL.
There are 10 rows in the table TBLTEST.
For example, on TBLTEST let's take id = 4 and qSQL = "select id, city, state from ABC".
How can I insert into FACTRESTTBL from TBLTEST using Python, maybe using a dictionary?
Thanks!
You can use MySQLdb for Python.
Sample code (you'll need to debug it as I have no way of running it here):
#!/usr/bin/python
import MySQLdb

# Open database connection
db = MySQLdb.connect("localhost", "testuser", "test123", "TESTDB")

# prepare a cursor object using cursor() method
cursor = db.cursor()

# Select qSQL with id=4.
cursor.execute("SELECT qSQL FROM TBLTEST WHERE id = 4")

# Fetch a single row using fetchone() method.
results = cursor.fetchone()
qSQL = results[0]
cursor.execute(qSQL)

# Fetch all the rows in a list of lists.
qSQLresults = cursor.fetchall()
for row in qSQLresults:
    id = row[0]
    city = row[1]
    # SQL query to INSERT a record into the table FACTRESTTBL.
    cursor.execute('''INSERT into FACTRESTTBL (id, city)
                      values (%s, %s)''',
                   (id, city))

# Commit your changes in the database
db.commit()

# disconnect from server
db.close()
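Since the sample above can't run without a MySQL server, the same flow can be sketched with sqlite3 (table names borrowed from the question, data made up; the main difference is sqlite3's ? placeholder style instead of %s):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE TBLTEST (id INTEGER, qSQL TEXT)")
cur.execute("CREATE TABLE ABC (id INTEGER, city TEXT, state TEXT)")
cur.execute("CREATE TABLE FACTRESTTBL (id INTEGER, city TEXT)")
cur.execute("INSERT INTO ABC VALUES (1, 'Austin', 'TX')")
cur.execute("INSERT INTO TBLTEST VALUES (4, 'select id, city, state from ABC')")

# Fetch the stored query, run it, then insert its rows into the other table
cur.execute("SELECT qSQL FROM TBLTEST WHERE id = 4")
qSQL = cur.fetchone()[0]
cur.execute(qSQL)
for row in cur.fetchall():
    cur.execute("INSERT INTO FACTRESTTBL (id, city) VALUES (?, ?)",
                (row[0], row[1]))
con.commit()
```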
