Insert tuple in sqlite request [duplicate] - python

All I want to do is send a query like:
SELECT * FROM table WHERE col IN (110, 130, 90);
So I prepared the following statement
SELECT * FROM table WHERE col IN (:LST);
Then I use
sqlite3_bind_text(stmt, 1, "110, 130, 90", -1, SQLITE_STATIC);
Unfortunately this becomes
SELECT * FROM table WHERE col IN ('110, 130, 90');
and is useless (note the two additional single quotes). I already tried putting extra ' characters in the string, but they get escaped. I didn't find an option to turn off the escaping or to prevent the text from being enclosed in single quotes. The last thing I can think of is not using a prepared statement, but I'd only take that as a last resort. Do you have any ideas or suggestions?
Thanks
Edit:
The number of parameters is dynamic, so it might be three numbers as in the example above, or one, or twelve.

You can dynamically build a parameterized SQL statement of the form
SELECT * FROM TABLE WHERE col IN (?, ?, ?)
and then call sqlite3_bind_int once for each "?" you added to the statement.
There is no way to directly bind a text parameter to multiple integer (or, for that matter, multiple text) parameters.
Here's pseudo code for what I have in mind:
-- Args is an array of parameter values
paramlist = ''
for i = Lo(Args) to Hi(Args)
    paramlist = paramlist + ', ?'
-- strip the leading ', ' before splicing the placeholders into the statement
sql = 'SELECT * FROM TABLE WHERE col IN (' + Right(paramlist, Length(paramlist) - 2) + ')'
for i = Lo(Args) to Hi(Args)
    sql_bind_int(sql, i, Args[i])
-- execute query here.

I just faced this question myself, but answered it by creating a temporary table and inserting all the values into that, so that I could then do:
SELECT * FROM TABLE WHERE col IN (SELECT col FROM temporarytable);
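For instance, a minimal Python sketch of that temporary-table approach (the database, table, and column names are placeholders, not from the original post) might look like this:
import sqlite3

values = [110, 130, 90]
conn = sqlite3.connect("test.db")
cur = conn.cursor()

# Fill a temporary table with the values we want to match against.
cur.execute("CREATE TEMP TABLE wanted (col INTEGER)")
cur.executemany("INSERT INTO wanted (col) VALUES (?)", [(v,) for v in values])

# The IN clause now reads from the temporary table instead of a literal list.
rows = cur.execute(
    "SELECT * FROM mytable WHERE col IN (SELECT col FROM wanted)"
).fetchall()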

Even simpler, build your query like this:
"SELECT * FROM TABLE WHERE col IN (" + ",".join(["?"] * len(lst)) + ")"

Depending on your build of sqlite (it's not part of the default build), you may be able to use:
SELECT * FROM table WHERE col IN carray(?42);
and then bind ?42 using (assuming the C API):
int32_t data[] = {110, 130, 90};
sqlite3_carray_bind(
    stmtPtr, 42,
    data, sizeof(data)/sizeof(data[0]),
    CARRAY_INT32, SQLITE_TRANSIENT);
I haven't actually tested that, I just read the docs: https://sqlite.org/carray.html

You cannot pass an array as one parameter, but you can pass each array value as a separate parameter (IN (?, ?, ?)).
The safe way to do this for a dynamic number of parameters (you should not use string concatenation, .format(), etc. to insert the values themselves into the query; that can lead to SQL injection) is to generate the query string with the needed number of ? placeholders and then bind the array elements. Use array concatenation or spread syntax (* or ... in most languages) if you need to pass other parameters too.
Here is an example for Python 3:
c.execute('SELECT * FROM TABLE WHERE col IN ({}) LIMIT ?'
          .format(', '.join(['?'] * len(values))),
          [*values, limit])

One solution (which I haven't tried yet in code, only on the SQLite shell) is to use the json_each function from SQLite.
So you could do something like:
SELECT * FROM table
WHERE col IN (SELECT value FROM json_each(?));
The caveat is that you'd have to manually assemble a valid JSON array with the values you're trying to bind.
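As a hedged Python sketch of that idea (assuming a build of SQLite with the JSON1 functions; the table and column names are placeholders), the JSON array can be built with json.dumps and bound as a single parameter:
import json
import sqlite3

values = [110, 130, 90]
conn = sqlite3.connect("test.db")
rows = conn.execute(
    "SELECT * FROM mytable WHERE col IN (SELECT value FROM json_each(?))",
    (json.dumps(values),),  # one bound parameter: the string "[110, 130, 90]"
).fetchall()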

A much simpler and safer answer involves generating only the placeholder mask (as opposed to the data part of the query) and letting the driver's parameter binding do its job.
Suppose we have some ids in an array, and some cb callback:
/* we need to generate a '?' for each item in our mask */
const mask = Array(ids.length).fill('?').join();
db.get(`
SELECT *
FROM films f
WHERE f.id
IN (${mask})
`, ids, cb);

Working on the same functionality led me to this approach:
(nodejs, es6, Promise)
var deleteRecords = function (tblName, data) {
    return new Promise((resolve, reject) => {
        var jdata = JSON.stringify(data);
        this.run(`DELETE FROM ${tblName} WHERE id IN (?)`, jdata.substr(1, jdata.length - 2), function (err) {
            err ? reject('deleteRecords failed with : ' + err) : resolve();
        });
    });
};

This works fine as well (JavaScript ES6):
let myList = [1, 2, 3];
`SELECT * FROM table WHERE col IN (${myList.join()});`

You can try this with RSQLite in R:
lst <- c("a", "b", "c")
dbGetQuery(db_con, paste0("SELECT * FROM table WHERE col IN (", paste0(shQuote(lst), collapse=", ") , ");"))

My solution for node (ES6, Promises):
let records = await db.all(`
SELECT * FROM table
WHERE (column1 = ?) and column2 IN ( ${[...val2s].fill('?').join(',')} )
`, [val1, ...val2s])
Works with a variable number of possible values.
This uses sqlite-async but you can modify it for the callback style version trivially.

If you are using Python, the easiest way to handle this in practice is to create a local function that tests against a string value of the list (which can be passed as a bind variable).
I used this when providing "Query By Example" functionality in a Python GUI app.
Pros:
- can use a common approach in parsing and building the SQL across entries, as I would when parsing LIKE xxx and > xxx etc.
- just one extra call to set the function up - either at connection time or when the function call is detected in the created SQL
Cons:
- the function needs to parse the string list for each row; this is bad if the query is running against a large table
- embedded commas, blanks and other similar stuff may be difficult to handle
For example:
- user enters IN 18C, 356, 013 into the Account field in the application
- application creates SQL with ... WHERE inz( Account , ? ) ...
- application creates the string bind value 18C, 356, 013
- application issues <sqlite3.connection>.create_function("inz", 2, inz ) to bind the local Python function inz (see below) to the SQLite function inz
- application issues the query
The coding for the inz function is as follows:
def inz( val , possibles ) :
    """implements the IN list function allowing one bind variable
    use <sqlite3.connection>.create_function("inz", 2, inz )
    and ensure that the bind variable is a comma delimited list
    in string form (without quotes)
    matches can be string or integer but do not allow for leading
    or trailing spaces or contained commas, quotes etc or floating points
    """
    poss = [ x.strip() for x in possibles.split(',') ]
    if val in poss :
        return True
    if isinstance( val, int ) :
        ipos = [ int(x) for x in poss if x.isdecimal() ]
        if val in ipos :
            return True
    return False
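A hedged usage sketch of the above (the database, table, and column names are placeholders; the bind value is the comma-delimited string from the example):
import sqlite3

conn = sqlite3.connect("accounts.db")
conn.create_function("inz", 2, inz)  # register the Python function with SQLite

# Rows whose Account value matches any entry in the bound list are returned.
rows = conn.execute(
    "SELECT * FROM accounts WHERE inz(Account, ?)",
    ("18C, 356, 013",),
).fetchall()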

For example, if you want the sql query:
select * from table where col in (110, 130, 90)
What about:
my_list = [110, 130, 90]
my_list_str = repr(my_list).replace('[','(').replace(']',')')
cur.execute("select * from table where col in %s" % my_list_str )

Related

How to generate a unique random number when insert in MySQL?

I have a database for articles and want to generate a unique random integer for each article so that it can be visited through a URL like https://blablabla.com/articles/8373734 etc.
I could achieve that in the Python backend, but how do we achieve this in MySQL statements?
For example, a new article is finished and inserted into the database:
INSERT into article_table (title, date, url_id) VALUES ('asdf', '11/11/1111', 8373734)
The url_id here is the unique random integer (1000~10000000) that is automatically generated.
I believe the primary key ID and auto-increment are a good way to solve this, but my question is:
In a practical scenario, do companies literally use the primary ID or auto-increment? This may expose how many pieces of data you have (ever had) in the database. Take https://www.zhihu.com/question/41490222 for example: I tried hundreds of numbers around 41490222 and all return 404 Not Found. It seems the numbers are assigned very sparsely, which is unlikely to be achieved by auto-increment.
Is there any efficient way to generate such a random number without checking for duplicates in a loop?
Use the MySQL function RAND():
select FLOOR(RAND() * 999999)
You can use UUID(), or if it has to be numeric UUID_SHORT() for that.
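On the Python backend side the question mentions, a hedged sketch of the usual pattern (assuming a UNIQUE index on url_id and the pymysql driver; the table and column names come from the question, everything else is an assumption) is to insert and retry on collision rather than checking first:
import random
import pymysql

def insert_article(conn, title, date):
    # Relies on a UNIQUE index on url_id: on a collision, retry with a new number.
    while True:
        url_id = random.randint(1000, 10000000)
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "INSERT INTO article_table (title, date, url_id) VALUES (%s, %s, %s)",
                    (title, date, url_id),
                )
            conn.commit()
            return url_id
        except pymysql.err.IntegrityError:
            conn.rollback()  # duplicate url_id, pick another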
Albeit my SQL skills are a bit rusty, I think you might want to create a function using the RAND function:
DELIMITER $$
CREATE FUNCTION GetRandomValue() RETURNS INT
    READS SQL DATA
BEGIN
    DECLARE newUrlId INT DEFAULT 0;
    WHILE newUrlId = 0
          OR EXISTS (SELECT 1 FROM yourTable WHERE url_id = newUrlId) DO
        SET newUrlId = FLOOR(RAND() * 999999);
    END WHILE;
    RETURN newUrlId;
END$$
DELIMITER ;
Then again, why create such a fuss when you could use other ways to create "bigger random numbers",
for example:
function createBiggerNumber(id) {
    return (id * constants.MySecretMultiplyValue) + constants.MySecretAddedValue;
}
function extractIdFromBiggerNumber(number) {
    return (number - constants.MySecretAddedValue) / constants.MySecretMultiplyValue;
}
The logic is to combine it with the primary key (id), so we don't need to re-check whether the value already exists.
DELIMITER $$
DROP TRIGGER IF EXISTS `auto_number`$$
CREATE TRIGGER `auto_number` BEFORE INSERT ON users
FOR EACH ROW BEGIN
    SET new.auto_number = CONCAT(new.id, LEFT(UUID(), 8));
END$$
DELIMITER ;
https://gist.github.com/yogithesymbian/698b27138a5ba89d2a32e3fc7ddd3cfb

How to write multi column in clause with sqlalchemy

Please suggest: is there a way to write a multi-column IN clause query using SQLAlchemy?
Here is example of the actual query:
SELECT url FROM pages WHERE (url_crc, url) IN ((2752937066, 'http://members.aye.net/~gharris/blog/'), (3799762538, 'http://www.coxandforkum.com/'));
I have a table that has a two-column primary key, and I'm hoping to avoid adding one more key just to be used as an index.
PS I'm using a MySQL DB.
Update: This query will be used for batch processing, so I would need to put a few hundred pairs into the IN clause. With the IN-clause approach I hope there is a known fixed limit on how many pairs I can put into one query, the way Oracle has a 1000-element limit by default.
Using an AND/OR combination might be limited by the length of the query in characters, which would be variable and less predictable.
Assuming that you have your model defined in Page, here's an example using tuple_:
from sqlalchemy import select, tuple_

keys = [
    (2752937066, 'http://members.aye.net/~gharris/blog/'),
    (3799762538, 'http://www.coxandforkum.com/')
]
select([
    Page.url
]).select_from(
    Page
).where(
    tuple_(Page.url_crc, Page.url).in_(keys)
)
Or, using the query API:
session.query(Page.url).filter(tuple_(Page.url_crc, Page.url).in_(keys))
I do not think this is currently possible in sqlalchemy, and not all RDBMSs support this.
You can always transform this to an OR(AND...) condition though:
filter_rows = [
    (2752937066, 'http://members.aye.net/~gharris/blog/'),
    (3799762538, 'http://www.coxandforkum.com/'),
]
qry = session.query(Page)
qry = qry.filter(or_(*(and_(Page.url_crc == crc, Page.url == url) for crc, url in filter_rows)))
print qry
should produce something like (for SQLite):
SELECT pages.id AS pages_id, pages.url_crc AS pages_url_crc, pages.url AS pages_url
FROM pages
WHERE pages.url_crc = ? AND pages.url = ? OR pages.url_crc = ? AND pages.url = ?
-- (2752937066L, 'http://members.aye.net/~gharris/blog/', 3799762538L, 'http://www.coxandforkum.com/')
Alternatively, you can combine two columns into just one:
filter_rows = [
    (2752937066, 'http://members.aye.net/~gharris/blog/'),
    (3799762538, 'http://www.coxandforkum.com/'),
]
qry = session.query(Page)
qry = qry.filter((func.cast(Page.url_crc, String) + '|' + Page.url).in_(["{}|{}".format(*_frow) for _frow in filter_rows]))
print qry
which produces the below (for SQLite), so you can use IN:
SELECT pages.id AS pages_id, pages.url_crc AS pages_url_crc, pages.url AS pages_url
FROM pages
WHERE (CAST(pages.url_crc AS VARCHAR) || ? || pages.url) IN (?, ?)
-- ('|', '2752937066|http://members.aye.net/~gharris/blog/', '3799762538|http://www.coxandforkum.com/')
I ended up using the text()-based solution: generating "(a,b) in ((:a1,:b1), (:a2,:b2), ...)" with named bind variables and building a dictionary with the bind variables' values.
from sqlalchemy import text

params = {}
enum_pairs = []
for counter, r in enumerate(records):
    a_param = "a%s" % counter
    params[a_param] = r['a']
    b_param = "b%s" % counter
    params[b_param] = r['b']
    pair_text = "(:%s,:%s)" % (a_param, b_param)
    enum_pairs.append(pair_text)
multicol_in_enumeration = ','.join(enum_pairs)
multicol_in_clause = text(
    " (a,b) in (" + multicol_in_enumeration + ")")
q = session.query(Table.id, Table.a,
                  Table.b).filter(multicol_in_clause).params(params)
Another option I thought about was using MySQL upserts, but this would make the whole thing even less portable to other DB engines than using a multi-column IN clause.
Update: SQLAlchemy has the sqlalchemy.sql.expression.tuple_(*clauses, **kw) construct, which can be used for the same purpose. (I haven't tried it yet.)

Python sqlite parameter extension problem

I have a table with three columns, cell, trx and type.
This is the query I'm trying to run:
db.execute("SELECT cell,trx FROM tchdrop").fetchall()
It gives the correct output.
However, when I try t = ("cell", "trx") and then
db.execute("SELECT ?,? FROM tchdrop", t).fetchall()
the output is [(u'cell', u'trx'), (u'cell', u'trx')] (which is wrong)
I'm doing this to figure out how to extract columns dynamically which is a part of a bigger problem.
The placeholder (?) of the Python DB-API (e.g. sqlite3) doesn't support passing column names, so you have to use Python string formatting like this:
a = ("cell", "trx")
query = "SELECT {0},{1} FROM tchdrop".format(*a)
db.execute(query)
EDIT:
if you don't know how many columns you want to pass, you can do something like this:
a = ("cell", "trx", "foo", "bar")
a = ", ".join(a)
query = "SELECT {0} FROM tchdrop".format(a)
# OUTPUT : 'SELECT cell, trx, foo, bar FROM tchdrop'
db.execute(query)
The library replaces the specified values ("cell", "trx") with their quoted SQL equivalent, so what you effectively get is SELECT 'cell', 'trx' FROM tchdrop. The result is correct.
What you are trying to achieve is not possible with the ? syntax. Instead, do the string replacement yourself. You can check column names with a regular expression (like ^[a-zA-Z_]+$) for more security.
For example:
columns = ",".join(("cell", "trx"))
db.execute("SELECT %s FROM tchdrop" % columns).fetchall()

What is the correct way to form MySQL queries in python?

I am new to python, I come here from the land of PHP. I constructed a SQL query like this in python based on my PHP knowledge and I get warnings and errors
cursor_.execute("update posts set comment_count = comment_count + "+str(cursor_.rowcount)+" where ID = " + str(postid))
# rowcount here is int
What is the right way to form queries?
Also, how do I escape strings to form SQL-safe ones? For example, if I want to escape -, ', ", etc., I used to use addslashes. How do we do it in Python?
Thanks
First of all, it's high time to learn to pass variables to queries safely, using the method Matus expressed. To be clearer:
tuple = (foovar, barvar)
cursor.execute("QUERY WHERE foo = ? AND bar = ?", tuple)
If you only need to pass one variable, you must still make it a tuple: insert a comma at the end to tell Python to treat it as a one-tuple: tuple = (onevar,)
Your example would be of form:
cursor_.execute("update posts set comment_count = comment_count + ? where id = ?",
(cursor_.rowcount, postid))
You can also use named parameters like this:
cursor_.execute("update posts set comment_count = comment_count + :count where id = :id",
{"count": cursor_.rowcount, "id": postid})
This time the parameters aren't a tuple, but a dictionary that is formed in pairs of "key": value.
From the Python manual:
t = (symbol,)
c.execute( 'select * from stocks where symbol=?', t )
This way you prevent SQL injection (I suppose this is the "SQL safe" you refer to) and also have the formatting solved.

Inserting multiple types into an SQLite database with Python

I'm trying to create an SQLite 3 database from Python. I have a few types I'd like to insert into each record: a float, and then 3 groups of n floats, currently a tuple but possibly an array or list. I'm not well-enough versed in Python to understand all the differences. My problem is the INSERT statement.
DAS = 12345
lats = (42,43,44,45)
lons = (10,11,12,13)
times = (1,2,3,4,5,6,7,8,9)
import sqlite3
connection = sqlite3.connect("test.db")
cursor = connection.cursor()
cursor.execute( "create table foo(DAS LONG PRIMARY KEY,lats real(4),lons real(4), times real(9) )" )
I'm not sure what comes next. Something along the lines of:
cmd = "INSERT into foo values (?,?,?,?), ..."
cursor.execute(cmd)
How should I best build the SQL insert command given this data?
The type real(4) does not mean an array/list/tuple of 4 reals; the 4 alters the 'real' type. However, SQLite mostly ignores column types due to its manifest typing, but they can still affect column affinity.
You have a few options, such as storing the text representation (from repr) or using four columns, one for each.
You can modify this with various hooks provided by the Python's SQLite library to handle some of the transformation for you, but separate columns (with functions to localize and handle various statements, so you don't repeat yourself) is probably the easiest to work with if you need to search/etc. in SQL on each value.
If you do store a text representation, ast.literal_eval (or eval, under special conditions) will convert back into a Python object.
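For instance, a minimal sketch of the text-representation option (the table name foo2 is invented here so it doesn't clash with the schema above):
import ast
import sqlite3

lats = (42, 43, 44, 45)
conn = sqlite3.connect("test.db")
conn.execute("CREATE TABLE IF NOT EXISTS foo2 (DAS INTEGER PRIMARY KEY, lats TEXT)")
conn.execute("INSERT OR REPLACE INTO foo2 VALUES (?, ?)", (12345, repr(lats)))

# Reading the row back turns the stored text into a tuple again.
stored = conn.execute("SELECT lats FROM foo2 WHERE DAS = ?", (12345,)).fetchone()[0]
lats_back = ast.literal_eval(stored)  # (42, 43, 44, 45)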
Something like this:
db = sqlite3.connect("test.db")
cursor = db.cursor()
cursor.execute("insert into foo values (?,?,?,?)", (val1, val2, val3, val4))
db.commit() # Autocommit is off by default (and rightfully so)!
Please note, that I am not using string formatting to inject actual data into the query, but instead make the library do this work for me. That way the data is quoted and escaped correctly.
EDIT: Obviously, considering your database schema, it doesn't work. It is impractical to attempt to store a collection-type value in a single field of a sqlite database. If I understand you correctly, you should just create a separate column for every value you are storing in the single row. That will be a lot of columns, sure, but that's the most natural way to do it.
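As a rough sketch of that one-column-per-value layout (the table and column names are invented for illustration):
import sqlite3

DAS = 12345
lats = (42, 43, 44, 45)
lons = (10, 11, 12, 13)

conn = sqlite3.connect("test.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS foo_wide ("
    "DAS INTEGER PRIMARY KEY, "
    "lat1 REAL, lat2 REAL, lat3 REAL, lat4 REAL, "
    "lon1 REAL, lon2 REAL, lon3 REAL, lon4 REAL)"
)
conn.execute(
    "INSERT OR REPLACE INTO foo_wide VALUES (?,?,?,?,?,?,?,?,?)",
    (DAS, *lats, *lons),
)
conn.commit()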
(A month later), two steps:
1. flatten e.g. DAS lats lons times to one long list, say 18 long
2. generate "Insert into tablename xx (?,?,... 18 question marks )" and execute that.
Test = 1

def flatten( *args ):
    """ 1, (2,3), [4,5] -> [1 2 3 4 5] """
    # 1 level only -- SO [python] [flatten] zzz
    all = []
    for a in args:
        all.extend( a if hasattr( a, "__iter__" ) else [a] )
    if Test: print "test flatten:", all
    return all

def sqlinsert( db, tablename, *args ):
    flatargs = flatten( *args )  # one long list
    ncol = len(flatargs)
    qmarks = "?" + (ncol-1) * ",?"
    insert = "Insert into %s values (%s)" % (tablename, qmarks)
    if Test: print "test sqlinsert:", insert
    if db:
        db.execute( insert, flatargs )
        # db.executemany( insert, map( flatargs, rows ))
    return insert

#...............................................................................
if __name__ == "__main__":
    print sqlinsert( None, "Table", "hidiho", (4,5), [6] )
