Inserting a variable to the database using sqlite in Python - python

I want to add some data from variables into my database using the sqlite3 library in Python. I create a table and then run the INSERT statement. Here is my simple code:
import sqlite3
db = sqlite3.connect("dbse.sqlite")
cursor= db.cursor()
cursor.execute("CREATE TABLE Myt (Test TEXT)")
variable = ('aaa')
cursor.execute('INSERT INTO Myt VALUES (?)' , variable)
db.commit()
but after running the code, this error comes up:
cursor.execute('INSERT INTO Myt VALUES (?)' , variable)
sqlite3.ProgrammingError: Incorrect number of bindings supplied. The current statement uses 1, and there are 3 supplied.
When I insert a variable containing a one-character value, it works well, but when I use a variable with more than one character, it doesn't work.
I use Python 3.2.3.
Do you have an idea how to solve it?

Your variable should be a tuple:
variable = ('aaa',) # Notice the comma
When creating a one-element tuple, you need a trailing comma. As a side note, bear in mind that the tuple() constructor won't give you what you want either:
>>> tuple('aaa')
('a', 'a', 'a')
>>> ('aaa',)
('aaa',)

cursor.execute() expects the second argument to be a sequence. Your variable is a string, which happens to be a sequence of length 3:
>>> len(variable)
3
>>> list(variable)
['a', 'a', 'a']
This is what causes your confusing error message: .execute sees a 3-element sequence while the statement has only 1 placeholder. Pass the variable to .execute in a one-element tuple:
cursor.execute('INSERT INTO Myt VALUES (?)', (variable,))
Note the comma there.
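Putting the fix together, here is a minimal working version of the original script (using an in-memory database here so it can be re-run without a leftover file):

```python
import sqlite3

db = sqlite3.connect(":memory:")  # in-memory DB for demonstration
cursor = db.cursor()
cursor.execute("CREATE TABLE Myt (Test TEXT)")

variable = 'aaa'
# Wrap the single value in a one-element tuple so the number of
# supplied parameters matches the single ? placeholder.
cursor.execute('INSERT INTO Myt VALUES (?)', (variable,))
db.commit()

print(cursor.execute('SELECT Test FROM Myt').fetchone())  # ('aaa',)
```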

Related

Unable to insert multiple rows to sql table from python

Even though I pass a list as params, I get the error below while executing the query:
TypeError: ('Params must be in a list, tuple, or Row', 'HY000')
I am trying to pass multiple rows to a SQL table using executemany.
Please find my code below; I am new to Python.
query = """INSERT INTO TableTest (Summary) Values (%s)"""
val = [('SPHXNORF2ASW1'),('CHNSIRSDB1USAA'),('NKOLGTPRAVPNVM'),('STAMHO3WANCEG'),('SPHXNORWANCEG1'),('SPHXNORWANCE'),('STAMHO3WANCEG')]
cursor = conn.cursor()
cursor.executemany(query,val)
conn.commit()
In Python, wrapping a value in parentheses does not make it a tuple.
So, this code
val = [('SPHXNORF2ASW1'),('CHNSIRSDB1USAA'),('NKOLGTPRAVPNVM'),('STAMHO3WANCEG'),('SPHXNORWANCEG1'),('SPHXNORWANCE'),('STAMHO3WANCEG')]
will be converted to
['SPHXNORF2ASW1', 'CHNSIRSDB1USAA', 'NKOLGTPRAVPNVM', 'STAMHO3WANCEG', 'SPHXNORWANCEG1', 'SPHXNORWANCE', 'STAMHO3WANCEG']
If you want each element to be a tuple, add a trailing comma as below, or use a one-element list ('[]') instead of a tuple.
Use:
val = [('SPHXNORF2ASW1',),('CHNSIRSDB1USAA',),('NKOLGTPRAVPNVM',),('STAMHO3WANCEG',),('SPHXNORWANCEG1',),('SPHXNORWANCE',),('STAMHO3WANCEG',)]
Or:
[ ['SPHXNORF2ASW1'], ['CHNSIRSDB1USAA'], ['NKOLGTPRAVPNVM'], ['STAMHO3WANCEG'], ['SPHXNORWANCEG1'], ['SPHXNORWANCE'], ['STAMHO3WANCEG']]
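The difference is easy to demonstrate with the standard-library sqlite3 module (the original snippet appears to use a different driver with %s placeholders, but the tuple requirement for executemany is the same; here only the first three values are used to keep the sketch short):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableTest (Summary TEXT)")

# ('SPHXNORF2ASW1') is just a parenthesized string; the trailing
# comma is what makes each element a one-element tuple.
val = [('SPHXNORF2ASW1',), ('CHNSIRSDB1USAA',), ('NKOLGTPRAVPNVM',)]
conn.executemany("INSERT INTO TableTest (Summary) VALUES (?)", val)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM TableTest").fetchone()[0])  # 3
```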

Properly format SQL query when insert into variable number of columns

I'm using psycopg2 to interact with a PostgreSQL database. I have a function whereby any number of columns (from a single column to all columns) in a table could be inserted into. My question is: how would one properly, dynamically, construct this query?
At the moment I am using string formatting and concatenation and I know this is the absolute worst way to do this. Consider the below code where, in this case, my unknown number of columns (i.e. keys from a dict is in fact 2):
dictOfUnknownLength = {'key1': 3, 'key2': 'myString'}

def createMyQuery(user_ids, dictOfUnknownLength):
    fields, values = list(), list()
    for key, val in dictOfUnknownLength.items():
        fields.append(key)
        values.append(val)
    fields = str(fields).replace('[', '(').replace(']', ')').replace("'", "")
    values = str(values).replace('[', '(').replace(']', ')')
    query = f"INSERT INTO myTable {fields} VALUES {values} RETURNING someValue;"
which produces:
INSERT INTO myTable (key1, key2) VALUES (3, 'myString') RETURNING someValue;
This provides a correctly formatted query but is of course prone to SQL injections and the like and, as such, is not an acceptable method of achieving my goal.
In other queries I am using the recommended methods of query construction when handling a known number of variables (%s and separate argument to .execute() containing variables) but I'm unsure how to adapt this to accommodate an unknown number of variables without using string formatting.
How can I elegantly and safely construct a query with an unknown number of specified insert columns?
To add to your worries, the current .replace() methodology is prone to edge cases where fields or values contain [, ], or ': they will be replaced regardless and may corrupt your query.
You can always use .join() to build the column list from a variable number of fields, and likewise join the right number of %s placeholders after VALUES, then pass your arguments to .execute().
Note: You may also want to consider the case where the number of fields is not equal to the number values.
import psycopg2

conn = psycopg2.connect("dbname=test user=postgres")
cur = conn.cursor()

dictOfUnknownLength = {'key1': 3, 'key2': 'myString'}

def createMyQuery(user_ids, dictOfUnknownLength):
    # Directly assign keys/values.
    fields, values = list(dictOfUnknownLength.keys()), list(dictOfUnknownLength.values())
    if len(fields) != len(values):
        # Raise an error? SQL won't work in this case anyways...
        pass
    # Stringify the fields and build one placeholder per value.
    fieldsParam = ', '.join(fields)                # "key1, key2"
    valuesParam = ', '.join(['%s'] * len(values))  # "%s, %s"
    # "INSERT ... (key1, key2) VALUES (%s, %s) ..."
    query = 'INSERT INTO myTable ({}) VALUES ({}) RETURNING someValue;'.format(fieldsParam, valuesParam)
    # Anti-SQL-injection: pass the placeholder values as the second
    # argument so the driver quotes and escapes them.
    # .execute('INSERT ... (key1, key2) VALUES (%s, %s) ...', [3, 'myString'])
    cur.execute(query, values)
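The same placeholder-building technique can be exercised with the standard-library sqlite3 module (? placeholders instead of psycopg2's %s), which makes it easy to try without a running PostgreSQL server. Note that the column names themselves are still interpolated into the SQL string, so they must come from a trusted source; psycopg2 also offers a dedicated psycopg2.sql module for composing identifiers safely.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (key1 INTEGER, key2 TEXT)")

dictOfUnknownLength = {'key1': 3, 'key2': 'myString'}

# Build the column list and one placeholder per value.
fields = ', '.join(dictOfUnknownLength)                     # "key1, key2"
placeholders = ', '.join(['?'] * len(dictOfUnknownLength))  # "?, ?"
query = 'INSERT INTO myTable ({}) VALUES ({})'.format(fields, placeholders)

# The values travel separately, so they are never interpolated into SQL.
conn.execute(query, list(dictOfUnknownLength.values()))
print(conn.execute("SELECT key1, key2 FROM myTable").fetchone())  # (3, 'myString')
```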

cx_Oracle returns empty query when using bindings

I'm facing a strange problem when doing queries in my sql application. I'm working with python3 and cx_Oracle 5.1.2. My test table is organized as it follows:
CREATE TABLE people (
sin CHAR(15),
name VARCHAR(40),
PRIMARY KEY (sin))
with the following values inserted (sin, name):
('1','a'), ('2','b'), ('3','c')
When I do a simple select using an unsafe query:
curs.execute("select name from people where sin = '1'")
The result is 'a', as expected, but if I use bindings:
curs.execute("select name from people where sin = :v", v='1')
The result is empty. I already tried to change this to the positional '?' parameter, set the size of 'v' through setinputsizes(v=15) but nothing appears to work.
Is there something that I am missing?
Thanks,
The problem lies in your use of the CHAR datatype instead of VARCHAR2.
You can observe the difference even in SQL*Plus.
If we bind a VARCHAR2 variable then no rows are selected:
SQL> variable v varchar2(15)
SQL> exec :v := '1';
PL/SQL procedure successfully completed.
SQL> select name from people where sin = :v;
no rows selected
If instead we bind a CHAR variable, which is the same data type as the column, then one row is selected:
SQL> variable v char(15)
SQL> exec :v := '1';
PL/SQL procedure successfully completed.
SQL> select name from people where sin = :v;
NAME
----------------------------------------
a
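The underlying cause is Oracle's comparison semantics: a CHAR(15) column blank-pads its values to the full 15 characters, and a VARCHAR2 bind value is compared with nonpadded semantics. The effect can be illustrated in plain Python (an analogy only, not Oracle itself):

```python
# A CHAR(15) column stores '1' blank-padded to the full width.
stored = '1'.ljust(15)          # "1              "

# A VARCHAR2 bind variable is compared as-is (nonpadded semantics):
print(stored == '1')            # False: no row matches

# A CHAR bind variable triggers blank-padded comparison, roughly
# equivalent to padding both sides to the same width first:
print(stored == '1'.ljust(15))  # True: the row is found
```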
Therefore you either need to change the column data type from CHAR to VARCHAR2 (by the way, plain VARCHAR is deprecated as well) or instruct cx_Oracle to use a FIXED_CHAR bind type:
>>> v = curs.var(cx_Oracle.FIXED_CHAR, 15)
>>> v.setvalue(0, '1')
>>> print(v)
<cx_Oracle.FIXED_CHAR with value '1'>
>>> result = curs.execute("select name from people where sin = :sin", sin=v)
>>> for r in result: print(r)
('a',)

SQLAlchemy WHERE IN single value (raw SQL)

I'm having trouble with SQLAlchemy when doing a raw SQL which checks against multiple values.
my_sess.execute(
"SELECT * FROM table WHERE `key`='rating' AND uid IN :uids",
params=dict(uids=some_list)
).fetchall()
There are 2 scenarios for this query, one that works and one that doesn't. If some_list = [1], it throws an SQL error saying I have a syntax error near ')'. But if some_list = [1, 2], the query executes successfully.
Any reason why this would happen?
No; SQL parameters only ever deal with scalar values, so a list cannot be bound to a single placeholder. You'll have to generate the SQL here; if you need raw SQL, use:
statement = "SELECT * FROM table WHERE `key`='rating' AND uid IN ({})".format(
    ', '.join([':i{}'.format(i) for i in range(len(some_list))]))
my_sess.execute(
    statement,
    params={'i{}'.format(i): v for i, v in enumerate(some_list)}
).fetchall()
That is, generate enough named parameters to hold all values in some_list with string formatting, then build a matching params dictionary to fill them.
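The same expansion works with any DB-API driver; here is a self-contained sketch of the technique using the standard-library sqlite3 module (? placeholders, and a plain column name since SQLite does not use MySQL-style backticks):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (key TEXT, uid INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [("rating", 1), ("rating", 2), ("other", 3)])

some_list = [1]  # the one-element case that broke the original query

# One placeholder per value: "?", "?, ?", ...
placeholders = ", ".join("?" * len(some_list))
statement = "SELECT * FROM t WHERE key='rating' AND uid IN ({})".format(placeholders)

rows = conn.execute(statement, some_list).fetchall()
print(rows)  # [('rating', 1)]
```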
Better still would be to use a literal_column() object to do all the generating for you:
from sqlalchemy.sql import literal_column
uid_in = literal_column('uid').in_(some_list)
statement = "SELECT * FROM table WHERE `key`='rating' AND {}".format(uid_in)
my_sess.execute(
    statement,
    params={'uid_{}'.format(i): v for i, v in enumerate(some_list, 1)}
).fetchall()
but then you could perhaps just generate the whole statement using the sqlalchemy.sql.expression module, as this would make supporting multiple database dialects much easier.
Moreover, the uid_in object already holds references to the right values for the bind parameters; instead of turning it into a string as we do with the str.format() action above, SQLAlchemy would have the actual object plus the associated parameters and you would no longer have to generate the params dictionary either.
The following should work:
from sqlalchemy.sql import table, literal_column, select
tbl = table('table')
key_clause = literal_column('key') == 'rating'
uid_clause = literal_column('uid').in_(some_list)
my_sess.execute(select('*', key_clause & uid_clause, [tbl]))
where the sqlalchemy.sql.select() takes a column spec (here hard-coded to *), a where clause (generated from the two clauses with & to generate a SQL AND clause) and a list of selectables; here your one sqlalchemy.sql.table() value.
Quick demo:
>>> from sqlalchemy.sql import table, literal_column, select
>>> some_list = ['foo', 'bar']
>>> tbl = table('table')
>>> key_clause = literal_column('key') == 'rating'
>>> uid_clause = literal_column('uid').in_(some_list)
>>> print(select('*', key_clause & uid_clause, [tbl]))
SELECT *
FROM "table"
WHERE key = :key_1 AND uid IN (:uid_1, :uid_2)
but the actual object tree generated from all this contains the actual values for the bind parameters too, so my_sess.execute() can access these directly.

Inserting multiple types into an SQLite database with Python

I'm trying to create an SQLite 3 database from Python. I have a few types I'd like to insert into each record: a float, and then 3 groups of n floats, currently tuples but they could be arrays or lists. I'm not well-enough versed in Python to understand all the differences. My problem is the INSERT statement.
DAS = 12345
lats = (42,43,44,45)
lons = (10,11,12,13)
times = (1,2,3,4,5,6,7,8,9)
import sqlite3
connection = sqlite3.connect("test.db")
cursor = connection.cursor()
cursor.execute( "create table foo(DAS LONG PRIMARY KEY,lats real(4),lons real(4), times real(9) )" )
I'm not sure what comes next. Something along the lines of:
cmd = "INSERT into foo values (?,?,?,?), ..."
cursor.execute(cmd)
How should I best build the SQL insert command given this data?
The type real(4) does not mean an array/list/tuple of 4 reals; the 4 is just a modifier on the 'real' type. SQLite largely ignores declared column types because of its manifest typing, though they can still affect column affinity.
You have a few options, such as storing a text representation (e.g. from repr) or using four separate columns, one for each value.
You can modify this with various hooks provided by the Python's SQLite library to handle some of the transformation for you, but separate columns (with functions to localize and handle various statements, so you don't repeat yourself) is probably the easiest to work with if you need to search/etc. in SQL on each value.
If you do store a text representation, ast.literal_eval (or eval, under special conditions) will convert back into a Python object.
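As a sketch of the text-representation option (using an in-memory database; ast.literal_eval safely parses the stored tuple back into a Python object):

```python
import ast
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE foo (DAS INTEGER PRIMARY KEY, lats TEXT)")

lats = (42, 43, 44, 45)
# Store the tuple's text representation in a single TEXT column.
conn.execute("INSERT INTO foo VALUES (?, ?)", (12345, repr(lats)))

stored = conn.execute("SELECT lats FROM foo WHERE DAS = 12345").fetchone()[0]
print(ast.literal_eval(stored))  # (42, 43, 44, 45)
```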
Something like this:
db = sqlite3.connect("test.db")
cursor = db.cursor()
cursor.execute("insert into foo values (?,?,?,?)", (val1, val2, val3, val4))
db.commit() # Autocommit is off by default (and rightfully so)!
Please note that I am not using string formatting to inject the actual data into the query; instead I let the library do this work for me. That way the data is quoted and escaped correctly.
EDIT: Obviously, considering your database schema, this doesn't work as-is. It is impractical to store a collection-type value in a single field of an SQLite database. If I understand you correctly, you should create a separate column for every value you are storing in a single row. That will be a lot of columns, sure, but that's the most natural way to do it.
(A month later), two steps:
1. flatten e.g. DAS, lats, lons, times into one long list, say 18 elements long
2. generate "INSERT INTO tablename VALUES (?,?, ... 18 question marks)" and execute that.
Test = 1

def flatten(*args):
    """ 1, (2,3), [4,5] -> [1, 2, 3, 4, 5] """
    # 1 level only -- SO [python] [flatten] zzz
    flat = []
    for a in args:
        # Treat strings as scalars; in Python 3 they are iterable too.
        if hasattr(a, "__iter__") and not isinstance(a, str):
            flat.extend(a)
        else:
            flat.append(a)
    if Test:
        print("test flatten:", flat)
    return flat

def sqlinsert(db, tablename, *args):
    flatargs = flatten(*args)  # one long list
    ncol = len(flatargs)
    qmarks = "?" + (ncol - 1) * ",?"
    insert = "INSERT INTO %s VALUES (%s)" % (tablename, qmarks)
    if Test:
        print("test sqlinsert:", insert)
    if db:
        db.execute(insert, flatargs)
        # or, for many rows: db.executemany(insert, rows)
    return insert

#...............................................................................
if __name__ == "__main__":
    print(sqlinsert(None, "Table", "hidiho", (4, 5), [6]))
