Insert into sqlite3 from variables - Python

I've been trying to create a database where the column names of my table come from a list:
import sqlite3
L = ["Nom","Age","Taille"]
list2 = ["Karl", "11", "185"]
M = []
R = 0
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE master ("+",".join(L)+")")
Then either:
for e in L:
    R += 1
    con.execute("INSERT INTO master("+",".join(L)+") VALUES (?,?,?)", list2[R-1])
or
for e in L:
    R += 1
    con.execute("INSERT INTO master(e) VALUES (?)", list2[R-1])
or
listX = [list2[0], list2[1], list2[2]]
con.executemany("INSERT INTO master ("+",".join(L)+") VALUES ("+",".join(M)+")", (listX))

Check the documentation: https://docs.python.org/3.8/library/sqlite3.html
In your case:
import sqlite3
con = sqlite3.connect(":memory:")
columns = ["Nom", "Age", "Taille"]
columns_str = '"' + '","'.join(columns) + '"'
con.execute(f"CREATE TABLE people ({columns_str})")
data = [
    ('Karl', 11, 185)
]
stmt = f"INSERT INTO people ({columns_str}) VALUES (?, ?, ?)"
con.executemany(stmt, data)
Also, probably don't call your table master - that'll get very confusing later. Names like L and list2 also don't help. Be clear when naming your variables; name them after what they mean or contain. Future you will thank you.
A little bit cleaner perhaps:
import sqlite3
con = sqlite3.connect(":memory:")
columns = ("Nom", "Age", "Taille")
con.execute("CREATE TABLE people (%s, %s, %s)" % columns)
data = [
    ('Karl', 11, 185)
]
stmt = "INSERT INTO people (%s, %s, %s) VALUES (?, ?, ?)" % columns
con.executemany(stmt, data)
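To confirm that the rows actually landed, you could read them back; a small usage sketch using the same in-memory connection as above:
# read the rows back to verify the insert (uses the con from above)
for row in con.execute("SELECT Nom, Age, Taille FROM people"):
    print(row)  # expected output: ('Karl', 11, 185)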

Related

Error occurred when I use timediff in Python

I wrote this SQL code in my database and it worked:
INSERT INTO moni (sn, dgw, tgw, delay_gw_server) VALUES ('2020060002', '2020-07-05', '11:12:17', timediff(NOW(), (cast(concat(dgw, ' ', tgw) as datetime))))
Can I use this code in my Python code, like this?
date = '2020-07-07'
time = '17:17:17'
concat = '%s %s' % (date, time)
dt = datetime.datetime.strptime(concat, '%Y-%m-%d %H:%M:%S')
diff = (datetime.datetime.now() - dt)
mycursor = mydb.cursor()
sql = "INSERT INTO moni (sn, dgw, tgw, delay_gw_server) VALUES (%s, %s, %s, %s)"
val = (sn, date, time, diff)
mycursor.execute(sql, val)
mydb.commit()
Or this code?
date = '2020-07-07'
time = '17:17:17'
concat = '%s %s' % (date, time)
dt = datetime.datetime.strptime(concat, '%Y-%m-%d %H:%M:%S')
mycursor = mydb.cursor()
sql = "INSERT INTO moni (sn, dgw, tgw, delay_gw_server) VALUES (%s, %s, %s, %s)"
val = (sn, date, time, timediff(NOW(), dt))
mycursor.execute(sql, val)
mydb.commit()

Python 3 MySQL find number of rows error?

I have tried several variations of this, and for that reason I am coming here for guidance. Where is this incorrect?
with connection.cursor() as cur:
    sql = 'select * from table where var1 = %s, var2 = %s, var3 = %s, var4 = %s, var5 = %s'
    cur.execute(sql, (var1val, var2val, var3val, var4val, var5val))
    connection.commit()
    l_fetch = cur.fetchall()
    rc = int(l_fetch.rowcount)
    print('rc len lerr_log: ' + rc)
    if(rc > 0):
        #result found
cur.fetchall() returns a list; it doesn't have a rowcount attribute.
The number of rows is available as cur.rowcount or as len(l_fetch).
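A minimal sketch of the corrected block, keeping the names from the question; note the WHERE conditions are joined with AND (the comma-separated form is not valid SQL), and `table` is backquoted only because it happens to be a reserved word:
with connection.cursor() as cur:
    sql = ('SELECT * FROM `table` '
           'WHERE var1 = %s AND var2 = %s AND var3 = %s AND var4 = %s AND var5 = %s')
    cur.execute(sql, (var1val, var2val, var3val, var4val, var5val))
    l_fetch = cur.fetchall()               # a list of result rows
    rc = len(l_fetch)                      # or cur.rowcount after execute()
    print('rc len lerr_log: ' + str(rc))
    if rc > 0:
        pass  # result found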

My Python program is not writing to the SQL database

This is my code. I'm getting data from an MCP3008 and I want to write those values into a SQL database, but it's not working. The code runs without errors, but when I open the SQL database it's empty.
Python Program:
spi = spidev.SpiDev()
spi.open(0,0)

def ReadChannel(channel):
    adc = spi.xfer2([1,(8+channel)<<4,0])
    data = ((adc[1]&3) << 8) + adc[2]
    return data

TIMES = 480

def AdcRead(adc_value = []):
    time_start = time.time()
    i = 0
    while True:
        time_current = time.time()
        if time_current > time_start + i / float(TIMES):
            print('{}: {}'.format(i, time_current))
            data = ReadChannel(0)
            adc_value.append(data)
            i += 1
        if i > 223:
            max_value = max(adc_value)
            break
    print(adc_value)
    return max_value

amp = AdcRead() * 0.8
amp = amp + 0.0
print("Binario: {}").format(amp)
output = 240*(amp/1024)*30
print("Potennia: {}").format(output)
amp_out = output/240
print ("Amperes: {}").format(amp_out)
output_h = output/3600
price = output_h * 0.15
ts = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime())

db = MySQLdb.connect("localhost","root","pass","auto_room_control")
cursor = db.cursor()
sql = "INSERT INTO auto_room_control VALUES ('%s', '%d', '%d', '%d', '%d' )", (ts, amp_out, output, output_h, price)
try:
    cursor.execute(sql)
    db.commit()
except:
    db.rollback()
db.close()
And when I execute the MySql command:
mysql> USE auto_room_control;
mysql> SELECT * FROM power_consumption;
Empty set (0.00 sec)
I get that "Empty set(0.00 sec)". What am I doing wrong?
There are several issues.
Change
sql = "INSERT INTO auto_room_control VALUES ('%s', '%d', '%d', '%d', '%d' )", (ts, amp_out, output, output_h, price)
try:
cursor.execute(sql)
db.commit()
except:
db.rollback()
to
sql = "INSERT INTO auto_room_control VALUES (%s, %s, %s, %s, %s)"
sqldata = (ts, amp_out, output, output_h, price)
try:
cursor.execute(sql, sqldata)
db.commit()
except:
db.rollback()
because the %s placeholders are then interpreted by the database driver, and that is the cleanest way to prevent SQL injection.
Then you can simplify this further to:
sql = "INSERT INTO auto_room_control VALUES (%s, %s, %s, %s, %s)"
sqldata = (ts, amp_out, output, output_h, price)
with db as cursor:
cursor.execute(sql, sqldata)
The with statement does the committing and rolling back on its own.
The problem is here:
sql = "INSERT INTO auto_room_control VALUES ('%s', '%d', '%d', '%d', '%d' )", (ts, amp_out, output, output_h, price)
You should write this:
sql = "INSERT INTO auto_room_control VALUES ('%s', %d, %d, %d, %d )" % (ts, amp_out, output, output_h, price)
In your case, you were just creating a tuple containing a string and another tuple, instead of actually formatting the string.
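As a plain-Python illustration with hypothetical values: the trailing comma builds a tuple, whereas the % operator actually substitutes the values into the string.
ts = '2016-01-01 12:00:00'  # hypothetical timestamp

# a comma after the string builds a 2-tuple (template, values); execute() cannot run a tuple
bad = "INSERT INTO auto_room_control VALUES ('%s', %d)", (ts, 42)

# the % operator formats the values into the template, producing a single SQL string
good = "INSERT INTO auto_room_control VALUES ('%s', %d)" % (ts, 42)

print(type(bad))   # tuple
print(type(good))  # str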

Python does not commit on insert query

I am trying to bulk insert locations into WordPress. I have defined functions to check and add terms and taxonomies:
def checkTerm(term,con):
    cur = con.cursor()
    query = "SELECT term_id FROM wp_terms as t WHERE t.name = '%s'" % term
    print query
    cur.execute(query)
    rows = cur.fetchall()
    if rows: return rows[0][0]
    else: return None

def addTerm(term,slug,con):
    cur = con.cursor()
    try:
        query = "INSERT INTO `wp_terms` (`name`,`slug`,`term_group`) VALUES ('%s','%s',0)" % (term,slug)
        print query
        cur.execute(query)
        con.commit()
        rows = checkTerm(term,con)
        if rows: return rows[0][0]
        else: return None
    except:
        return None

def checkTaxonomy(term_id,con):
    cur = con.cursor()
    query = "SELECT tt.term_taxonomy_id,tt.parent FROM wp_term_taxonomy AS tt INNER JOIN wp_terms AS t ON tt.term_id = t.term_id WHERE tt.taxonomy = 'project_location' AND t.term_id = '%s'" % term_id
    print query
    cur.execute(query)
    rows = cur.fetchall()
    if rows: return rows
    else: return None

def addTaxonomy(term_id,taxonomy,description,parent,count,con):
    cur = con.cursor()
    query = "INSERT INTO `wp_term_taxonomy` (`term_id`,`taxonomy`,`description`,`parent`,`count`) VALUES ('%s','%s','%s','%s','%s')" % (term_id,taxonomy,description,parent,count)
    print query
    cur.execute(query)
    con.commit()
    rows = checkTaxonomy(term_id,con)
    if rows: return rows
    else: return None
I store the cities in a dictionary of dictionaries:
df = pd.read_table('./Argentina.csv',sep='\t',header=None,engine='python')
for line in xrange(len(df)):
    stringa = str(df[17][line])
    location = str(df[1][line])
    population = int(df[14][line])
    if population < limit_pop: continue
    string_state = stringa.split("/")
    country = string_state[1]
    state = string_state[2]
    if not country in states:
        states[country] = {}
    if not state in states[country]:
        states[country][state] = [location]
    else:
        states[country][state].append(location)
Then I try to insert the terms and taxonomies into the WordPress DB:
con = mdb.connect('localhost', 'root', 'mypassword', 'Wordpress')
for country in states:
    country_id = checkTerm(country.replace("_"," "),con)
    if not country_id:
        country_id = addTerm(country.replace("_"," "),country,con)
    taxonomy = checkTaxonomy(country_id,con)
    if not taxonomy:
        taxonomy = addTaxonomy(country_id,'project_location','','0','0',con)
    parent = dict((y, x) for x, y in taxonomy)
    if not 0 in parent:
        taxonomy = addTaxonomy(country_id,'project_location','','0','0',con)
    for state in states[country]:
        state_id = checkTerm(state.replace("_"," "),con)
        if not state_id:
            state_id = addTerm(state.replace("_"," "),state,con)
        taxonomy = checkTaxonomy(state_id,con)
        if not taxonomy:
            taxonomy = addTaxonomy(state_id,'project_location','',country_id,'0',con)
        parent = dict((y, x) for x, y in taxonomy)
        if not country_id in parent:
            taxonomy = addTaxonomy(state_id,'project_location','',country_id,'0',con)
        for location in states[country][state]:
            location_id = checkTerm(location.replace("_"," "),con)
            if not location_id:
                location_id = addTerm(location.replace("_"," "),location,con)
            taxonomy = checkTaxonomy(location_id,con)
            if not taxonomy:
                taxonomy = addTaxonomy(location_id,'project_location','',state_id,'0',con)
            parent = dict((y, x) for x, y in taxonomy)
            if not state_id in parent:
                taxonomy = addTaxonomy(location_id,'project_location','',state_id,'0',con)
When I execute the script I observe this behaviour:
SELECT term_id FROM wp_terms as t WHERE t.name = 'Argentina'
INSERT INTO `wp_terms` (`name`,`slug`,`term_group`) VALUES ('Argentina','Argentina',0)
SELECT term_id FROM wp_terms as t WHERE t.name = 'Argentina'
SELECT tt.term_taxonomy_id,tt.parent FROM wp_term_taxonomy AS tt INNER JOIN wp_terms AS t ON tt.term_id = t.term_id WHERE tt.taxonomy = 'project_location' AND t.term_id = 'None'
INSERT INTO `wp_term_taxonomy` (`term_id`,`taxonomy`,`description`,`parent`,`count`) VALUES ('None','project_location','','0','0')
SELECT tt.term_taxonomy_id,tt.parent FROM wp_term_taxonomy AS tt INNER JOIN wp_terms AS t ON tt.term_id = t.term_id WHERE tt.taxonomy = 'project_location' AND t.term_id = 'None'
And the script stops with the following error:
./import.py:59: Warning: Truncated incorrect DOUBLE value: 'None'
cur.execute(query)
./import.py:69: Warning: Incorrect integer value: 'None' for column 'term_id' at row 1
cur.execute(query)
Traceback (most recent call last):
File "./import.py", line 115, in <module>
parent = dict((y, x) for x, y in taxonomy)
TypeError: 'NoneType' object is not iterable
This means that the INSERT statements are not taking effect. I don't understand: I con.commit() the query, but it is still not executed. Where is the problem?
Solution:
I changed
import MySQLdb as mdb
to
import mysql.connector
and
con = mdb.connect(host='localhost', user='root', password='password', database='Wordpress');
to
con = mysql.connector.connect(host='localhost',user='root', password='password',database= 'Wordpress',autocommit=False,buffered=False);
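A minimal sketch of the swapped setup, assuming mysql-connector-python is installed; the parameterized query in checkTerm is an extra illustration of letting the driver bind values, not part of the original fix:
import mysql.connector

con = mysql.connector.connect(host='localhost', user='root',
                              password='password', database='Wordpress',
                              autocommit=False, buffered=False)

def checkTerm(term, con):
    cur = con.cursor()
    # the driver substitutes %s safely instead of interpolating the value into the SQL string
    cur.execute("SELECT term_id FROM wp_terms AS t WHERE t.name = %s", (term,))
    rows = cur.fetchall()
    return rows[0][0] if rows else None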

Getting different results with mysql when using python multiprocessing

I can't figure out what I'm doing wrong (or how to correct it). It might be easier to show some code (it's a bit simplified from what I'm doing, but it proves my point):
from multiprocessing import Pool
import MySQLdb
import sys

#sql connection
try:
    conn = MySQLdb.connect (host = "127.0.0.1",user = "user",passwd = "password", db = "mydb")
except MySQLdb.Error, e:
    print "Error %d: %s" % (e.args[0], e.args[1])
    sys.exit (1)

#with database
cursor = conn.cursor ()
cursor.execute ("DROP TABLE IF EXISTS data_table")
cursor.execute ("""
    CREATE TABLE data_table(
        value CHAR(80)
    ) ENGINE=MyISAM
""")
cursor.execute (""" INSERT INTO data_table (value) VALUES ('%s')""" % [0, 0]) #need to insert basecase
conn.commit()

def build_table(i,x): # i is index, x is data[i]
    conn = MySQLdb.connect (host = "127.0.0.1",user = "user",passwd = "password", db = "mydb")
    cursor = conn.cursor ()
    #print i,x
    target_sum = 100
    for s in range(target_sum + 1):
        for c in range(target_sum + 1):
            #print c, i
            cursor.execute ("""
                INSERT INTO data_table (value)
                SELECT '%s'
                FROM dual
                WHERE ( SELECT COUNT(*) FROM data_table WHERE value='%s' )
                = 1
                AND NOT EXISTS
                ( SELECT * FROM data_table WHERE value='%s' )
            """ % ([s, i+1], [s - c * x, i], [s, i+1]))
    conn.commit()
    conn.close()

data = [2,5,8]
pool = Pool(processes=4)
for i, x in enumerate(data):
    build_table(i,x) #creates 250 records
    #pool.apply_async(build_table, (i, x))
pool.close()
pool.join()
print 'completed'
It basically creates a table in MySQL. The code above creates 250 entries (which is correct), but if you comment out build_table(i,x) in the for loop and uncomment pool.apply_async(build_table, (i, x)) it creates only 52 records. Why is there a difference when multiprocessing the same function, and is there anything I can do to fix it so the results are the same (I thought quickly committing updates would fix it, but no luck)?
If I change pool = Pool(processes=4) to processes=1, it works, but I guess that's expected because it's not really multiprocessing at that point. Also, if it helps, I'm using InnoDB.
UPDATE: when I change to MyISAM I get 240 results (not quite the 250 I need, but much better than 52).
UPDATE2: the MySQL commands were combined into a single statement, and the results seem to vary. Sometimes I get 248 results in the database, sometimes 240 or fewer. Maybe multiprocessing is causing this divergence between expected and actual results?
I would try to combine the two SELECTs and the INSERT into one INSERT statement, turning:
#print c, i
cursor.execute(""" SELECT value FROM data_table WHERE value='%s' """ % ([s - c * x, i]))
if cursor.rowcount == 1:
    cursor.execute(""" SELECT value FROM data_table WHERE value='%s' """ % [s, i+1])
    if cursor.rowcount == 0:
        cursor.execute (""" INSERT INTO data_table (value) VALUES ('%s')""" % [s, i+1])
Into something like:
#print c, i
cursor.execute ("""
    INSERT INTO data_table (value)
    SELECT '%s'
    FROM dual
    WHERE ( SELECT COUNT(*) FROM data_table WHERE value='%s' )
          = 1
      AND NOT EXISTS
          ( SELECT * FROM data_table WHERE value='%s' )
""" % ([s, i+1], [s - c * x, i], [s, i+1]))
Not sure about the syntax in the last line. You'll need to pass 3 parameters.
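If you want to pass the three values as real parameters instead of % string formatting, here is a hedged sketch (MySQLdb uses %s placeholders when you supply a parameter tuple; the list values are stringified first so they match what the other statements store):
new_value = str([s, i + 1])        # the value being inserted
prev_value = str([s - c * x, i])   # the value that must already exist exactly once
cursor.execute("""
    INSERT INTO data_table (value)
    SELECT %s
    FROM dual
    WHERE ( SELECT COUNT(*) FROM data_table WHERE value = %s ) = 1
      AND NOT EXISTS ( SELECT * FROM data_table WHERE value = %s )
""", (new_value, prev_value, new_value))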
