I am trying to insert all the elements of a list into a table with a single INSERT query. I have tried converting the list into a list of tuples and adding the elements from the list directly, but neither worked, and I don't know the best practice here, as I am no SQL shark. Below are the two different outputs I have before running the query. I don't know which is easier to work with, but the code example uses the plain list of elements.
Output
['testuser', 'AskeMeyer']
and
[('testuser',), ('AskeMeyer',)]
Code to query
try:
    conn = psycopg2.connect(host=ENDPOINT, port=PORT, database=DBNAME, user=USER, password=PASS, sslmode='require', sslrootcert="SSLCERTIFICATE")
    cur = conn.cursor()
    var_string = ', '.join(map(str, res))
    sql = 'INSERT INTO users_from_group(name) VALUES %s;' % var_string)
    cur.execute(sql)
Error from the above:
Database connection failed due to syntax error at or near ")"
You need to wrap each value in parentheses and quotes in your insert statement. Note that in PostgreSQL string literals take single quotes (double quotes are for identifiers), for example:
var_string = ', '.join([f"('{name}')" for name in ['testuser', 'AskeMeyer']])
Also, there is a stray closing parenthesis at the end of your sql statement, which causes the syntax error:
sql = 'INSERT INTO users_from_group(name) VALUES %s;' % var_string)
Should be
sql = 'INSERT INTO users_from_group(name) VALUES %s;' % var_string
This is not the proper way to store a list. You should save it as JSON instead:
json.dumps(['testuser', 'AskeMeyer'])
and then save the resulting string.
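For completeness, a minimal sketch of that approach with psycopg2; the user_groups table and its members jsonb column are assumptions made for illustration:
import json
import psycopg2

conn = psycopg2.connect(host=ENDPOINT, port=PORT, database=DBNAME, user=USER, password=PASS, sslmode='require')
cur = conn.cursor()
# Serialise the whole list and store it in a single JSON column
members = json.dumps(['testuser', 'AskeMeyer'])
cur.execute("INSERT INTO user_groups (members) VALUES (%s)", (members,))
conn.commit()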
Try this, giving each element its own parenthesized, quoted value so every name becomes a separate row:
var_string = ', '.join("('" + str(x) + "')" for x in res)
sql = 'INSERT INTO users_from_group(name) VALUES %s;' % var_string
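As a safer alternative to building the VALUES string by hand, psycopg2's execute_values helper can expand a list of row tuples and handle the quoting for you; a minimal sketch, reusing the connection and cursor from the question:
from psycopg2.extras import execute_values

names = ['testuser', 'AskeMeyer']
# Each element becomes one (name,) row tuple; execute_values does the quoting/escaping
execute_values(cur, "INSERT INTO users_from_group (name) VALUES %s", [(n,) for n in names])
conn.commit()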
This question already has answers here:
imploding a list for use in a python MySQLDB IN clause
(8 answers)
Closed 1 year ago.
I have a python list, say l
l = [1,5,8]
I want to write a sql query to get the data for all the elements of the list, say
select name from students where id = |IN THE LIST l|
How do I accomplish this?
Answers so far have been templating the values into a plain SQL string. That's absolutely fine for integers, but if we wanted to do it for strings we get the escaping issue.
Here's a variant using a parameterised query that would work for both:
placeholder= '?' # For SQLite. See DBAPI paramstyle.
placeholders= ', '.join(placeholder for unused in l)
query= 'SELECT name FROM students WHERE id IN (%s)' % placeholders
cursor.execute(query, l)
The easiest way is to turn the list into a tuple first:
t = tuple(l)
query = "select name from students where id IN {}".format(t)
Don't complicate it; the solution is simple.
l = [1,5,8]
l = tuple(l)
params = {'l': l}
cursor.execute('SELECT * FROM table where id in %(l)s',params)
I hope this helped !!!
The SQL you want is
select name from students where id in (1, 5, 8)
If you want to construct this from the python you could use
l = [1, 5, 8]
sql_query = 'select name from students where id in (' + ','.join(map(str, l)) + ')'
The map function will transform the list into a list of strings that can be glued together by commas using the str.join method.
Alternatively:
l = [1, 5, 8]
sql_query = 'select name from students where id in (' + ','.join((str(n) for n in l)) + ')'
if you prefer generator expressions to the map function.
UPDATE: S. Lott mentions in the comments that the Python SQLite bindings don't support sequences. In that case, you might want
select name from students where id = 1 or id = 5 or id = 8
Generated by
sql_query = 'select name from students where ' + ' or '.join(('id = ' + str(n) for n in l))
Join the list values with commas using str.join, and use the format operator to build the query string:
myquery = "select name from students where id in (%s)" % ",".join(map(str, mylist))
(Thanks, blair-conrad)
I like bobince's answer:
placeholder= '?' # For SQLite. See DBAPI paramstyle.
placeholders= ', '.join(placeholder for unused in l)
query= 'SELECT name FROM students WHERE id IN (%s)' % placeholders
cursor.execute(query, l)
But I noticed this:
placeholders= ', '.join(placeholder for unused in l)
Can be replaced with:
placeholders= ', '.join(placeholder*len(l))
I find this more direct if less clever and less general. Here l is required to have a length (i.e. refer to an object that defines a __len__ method), which shouldn't be a problem. But placeholder must also be a single character. To support a multi-character placeholder use:
placeholders= ', '.join([placeholder]*len(l))
If you're using PostgreSQL with the Psycopg2 library you can let its tuple adaptation do all the escaping and string interpolation for you, e.g.:
ids = [1,2,3]
cur.execute(
    "SELECT * FROM foo WHERE id IN %s",
    [tuple(ids)])
I.e. just make sure that you're passing the IN parameter as a tuple. If it's a list you can use the = ANY array syntax:
cur.execute(
    "SELECT * FROM foo WHERE id = ANY (%s)",
    [list(ids)])
Note that both of these get turned into the same query plan, so you should just use whichever is easier: if your ids come in a tuple use the former, if they're stored in a list use the latter.
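Putting the two variants together, a minimal runnable sketch (the connection string and the foo table are assumptions for illustration):
import psycopg2

conn = psycopg2.connect("dbname=test user=postgres")
cur = conn.cursor()
ids = [1, 2, 3]
# IN needs a tuple parameter...
cur.execute("SELECT * FROM foo WHERE id IN %s", (tuple(ids),))
print(cur.fetchall())
# ...while = ANY takes the list directly
cur.execute("SELECT * FROM foo WHERE id = ANY (%s)", (ids,))
print(cur.fetchall())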
A fix for @umounted's answer, which breaks with a one-element tuple, since (1,) is not valid SQL:
>>> random_ids = [1234,123,54,56,57,58,78,91]
>>> cursor.execute("create table test (id)")
>>> for item in random_ids:
...     cursor.execute("insert into test values (%d)" % item)
>>> sublist = [56,57,58]
>>> cursor.execute("select id from test where id in %s" % str(tuple(sublist)).replace(',)',')'))
>>> a = cursor.fetchall()
>>> a
[(56,), (57,), (58,)]
Other solution for sql string:
cursor.execute("select id from test where id in (%s)" % ('"'+'", "'.join(l)+'"'))
Just use an inline if expression with the tuple function:
query = "SELECT * FROM hr_employee WHERE id IN %s" % (str(tuple(employee_ids)) if len(employee_ids) != 1 else "(" + str(employee_ids[0]) + ")")
To run a SELECT where the field is in a list of strings (instead of ints), use repr(tuple(map(str, l))). Full example:
l = ['a','b','c']
sql = f'''
select name
from students
where id in {repr(tuple(map(str, l)))}
'''
print(sql)
Returns:
select name from students where id in ('a', 'b', 'c')
For a list of dates in Oracle, this worked
l = ['2020-11-24', '2020-12-28']
dates_str = ','.join([f'DATE {repr(s)}' for s in l])
dates_str = f'({dates_str})'
sql_cmd = f'''
select *
from students
where date in {dates_str}
'''
Returns:
select * from students where date in (DATE '2020-11-24',DATE '2020-12-28')
If you need to get the list of dates from a pandas df, it's df['date'].dt.strftime('%Y-%m-%d').unique()
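A small self-contained sketch of that pandas step, with made-up dates for illustration:
import pandas as pd

df = pd.DataFrame({'date': pd.to_datetime(['2020-11-24', '2020-12-28', '2020-11-24'])})
# Unique dates formatted as strings, ready to wrap in DATE literals as above
unique_dates = df['date'].dt.strftime('%Y-%m-%d').unique()
print(list(unique_dates))  # ['2020-11-24', '2020-12-28']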
And since I often need it too, here is how to build the column list of a SELECT from a Python list:
# single list
f"select {','.join(l)}"
# multi list in different tables
sql_cmd = f'''
select {','.join(f't1.{s}' for s in l1)},
{','.join(f't1.{s}' for s in l2)},
{','.join(f't2.{s}' for s in l3)}
'''
placeholders = ', '.join("'{" + str(i) + "}'" for i in range(len(l)))
query = "select name from students where id in (%s)" % placeholders
query = query.format(*l)
cursor.execute(query)
This should solve your problem.
A simpler solution:
lst = [1, 2, 3, 'a', 'b', 'c']
query = f"""SELECT * FROM table WHERE id IN ({str(lst)[1:-1]})"""
l = [1] # or [1,2,3]
query = "SELECT * FROM table WHERE id IN :l"
params = {'l' : tuple(l)}
cursor.execute(query, params)
The :var notation seems simpler. (Python 3.7)
For example, if you want the sql query:
select name from students where id in (1, 5, 8)
What about:
my_list = [1, 5, 8]
cur.execute("select name from studens where id in %s" % repr(my_list).replace('[','(').replace(']',')') )
This uses parameter substitution and takes care of the single value list case:
l = [1,5,8]
get_operator = lambda x: '=' if len(x) == 1 else 'IN'
get_value = lambda x: int(x[0]) if len(x) == 1 else x
query = 'SELECT * FROM table where id ' + get_operator(l) + ' %s'
cursor.execute(query, (get_value(l),))
This will work whether the list has one value or more than one:
t = str(tuple(l))
if t.endswith(',)'):
    t = t[:-2] + ')'  # drop the trailing comma from a single-element tuple
query = "select name from students where id IN {}".format(t)
I have to connect to an Oracle database and see if a table exists. While I can get a list of the tables, I'm having trouble checking whether the table I'm looking for is in that list. Some tables have an associated table that I'll have to join on, some do not, so I have to check.
What is in my list: ('NYSDOH_CI_EI_HOSPITAL',)
sql = "SELECT table_name FROM all_tables"
cur.execute(sql)
searchstr = 'NYSDOH_CI_EI_HOSPITAL'
p = re.compile(searchstr)
#create data array to load in SQL results in.
ciDataSet = []
cxRows = cur.fetchall()
for i in cxRows:
#print i # list of tables
if p.match(str(i)):
print i
It doesn't find it, even if I use a wildcard.
fetchall() returns a list of tuples.
So when you do
for i in cxRows:
'i' is of type tuple. In your case, each tuple will have only a single value. You can access it with i[0] and match that against p.
Currently you are converting the whole tuple to a string, so the regular expression will not match.
Corrected code:
sql = "SELECT table_name FROM all_tables"
cur.execute(sql)
searchstr = 'NYSDOH_CI_EI_HOSPITAL'
p = re.compile(searchstr)
#create data array to load in SQL results in.
ciDataSet = []
cxRows = cur.fetchall()
for i in cxRows:
#print i # list of tables
if p.match(str(i[0])):
print i
To improve on the syntax of @vaichidrewar, you could simplify the fetch loop to:
for tabname, in cur:
    if p.match(str(tabname)):
        print(tabname)
But it's going to be more efficient to do the reg exp matching in the query:
sql = "select table_name from all_tables where regexp_like(table_name, :tn, 'i')"
searchstr = 'EMP'
cur.execute(sql, (searchstr,))
for tabname, in cur:
    print(tabname)
The 'i' option does a case-insensitive match. You can adjust the regexp as you like.
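If the goal is only to check whether one specific table exists, a bind variable and an exact match avoid the regular expression entirely; a minimal sketch using the same cursor (the dict-style named bind is an assumption about the driver in use):
sql = "SELECT COUNT(*) FROM all_tables WHERE table_name = :tn"
cur.execute(sql, {'tn': 'NYSDOH_CI_EI_HOSPITAL'})
table_exists = cur.fetchone()[0] > 0
print(table_exists)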
I am trying to use a dict to do a SQL INSERT. The logic would basically be:
INSERT INTO table (dict.keys()) VALUES dict.values()
However, I am having a tough time figuring out the correct syntax / flow to do this. This is what I currently have:
# data = {...}
sorted_column_headers_list = []
sorted_column_values_list = []
for k, v in data.items():
sorted_column_headers_list.append(k)
sorted_column_values_list.append(v)
sorted_column_headers_string = ', '.join(sorted_column_headers_list)
sorted_column_values_string = ', '.join(sorted_column_values_list)
cursor.execute("""INSERT INTO title (%s)
VALUES (%s)""",
(sorted_column_headers_string, sorted_column_values_string))
From this I get a SQL exception (I think related to the fact that commas are also included in some of the values that I have). What would be the correct way to do the above?
I think the comment on using this with MySQL is not quite complete. MySQLdb doesn't do parameter substitution in the columns, just the values (IIUC) - so maybe more like
placeholders = ', '.join(['%s'] * len(myDict))
columns = ', '.join(myDict.keys())
sql = "INSERT INTO %s ( %s ) VALUES ( %s )" % (table, columns, placeholders)
# valid in Python 2
cursor.execute(sql, myDict.values())
# valid in Python 3
cursor.execute(sql, list(myDict.values()))
You're not getting escaping on the columns though, so you might want to check them first....
See http://mail.python.org/pipermail/tutor/2010-December/080701.html for a more complete solution
You want to add parameter placeholders to the query for the values; the column names themselves cannot be bound as parameters, so they have to be interpolated into the string. This might get you what you need:
qmarks = ', '.join('?' * len(myDict))
qry = "Insert Into Table (%s) Values (%s)" % (', '.join(myDict.keys()), qmarks)
cursor.execute(qry, list(myDict.values()))
Always good answers here, but in Python 3, you should write the following:
placeholder = ", ".join(["%s"] * len(dict))
stmt = "insert into `{table}` ({columns}) values ({values});".format(table=table_name, columns=",".join(dict.keys()), values=placeholder)
cur.execute(stmt, list(dict.values()))
Don't forget to convert dict.values() to a list because in Python 3, dict.values() returns a view, not a list.
Also, do NOT interpolate dict.values() into stmt yourself: joining the values strips the quoting from strings, which causes a MySQL error on insert. Always pass the values to cur.execute() as parameters.
I'm a little late to the party, but there is another way that I tend to prefer, since my data is usually in the form of a dict already. If you list the bind variables in the form %(columnName)s, you can use a dictionary to bind them at execute time. This partially solves the problem of column ordering, since the variables are bound by name. I say partially because you still have to make sure that the columns and values portions of the insert are mapped correctly; but the dictionary itself can be in any order.
There is probably a more pythonic way to achieve all this, but pulling the column names into a list and working off it ensures we have a static ordering to build the columns & values clauses.
data_dict = {'col1': 'value 1', 'col2': 'value 2', 'col3': 'value 3'}
columns = data_dict.keys()
cols_comma_separated = ', '.join(columns)
binds_comma_separated = ', '.join(['%(' + item + ')s' for item in columns])
sql = f'INSERT INTO yourtable ({cols_comma_separated}) VALUES ({binds_comma_separated})'
cur.execute(sql, data_dict)
Now whether or not it is a good idea to dynamically build your columns & values clause like this is a topic for a SQL injection thread.
table='mytable'
columns_string= '('+','.join(myDict.keys())+')'
values_string = '('+','.join(map(str,myDict.values()))+')'
sql = """INSERT INTO %s %s
VALUES %s"""%(table, columns_string,values_string)
I tried @furicle's solution but it still inserts everything as a string - if your dict is a mixed one then this may not work as you want it to. I had a similar issue and this is what I came up with - it is only a query builder, and you could adapt it to work with any database of your choice. Have a look!
def ins_query_maker(tablename, rowdict):
    keys = tuple(rowdict)
    dictsize = len(rowdict)
    sql = ''
    for i in range(dictsize):
        if type(rowdict[keys[i]]).__name__ == 'str':
            sql += '\'' + str(rowdict[keys[i]]) + '\''
        else:
            sql += str(rowdict[keys[i]])
        if i < dictsize - 1:
            sql += ', '
    query = "insert into " + str(tablename) + " " + str(keys) + " values (" + sql + ")"
    print(query)  # for demo purposes we do this
    return query  # in real code we do this
This is crude and still needs sanity checks, etc, but it works as intended.
for a dict:
tab = {'idnumber': 1, 'fname': 'some', 'lname': 'dude', 'dob': '15/08/1947', 'mobile': 5550000914, 'age' : 70.4}
running the query I get the following output
results of query generated by the suite
This code worked for me (Python 3):
fields = (str(list(dictionary.keys()))[1:-1])
values = (str(list(dictionary.values()))[1:-1])
sql = 'INSERT INTO Table (' + fields + ') VALUES (' + values + ')'
cursor.execute(sql)
It does rely on the dictionary returning its keys and values in the same order. This is actually guaranteed: keys() and values() iterate in corresponding order as long as the dict isn't modified in between, and since Python 3.7 that order is insertion order.
When constructing queries dynamically it's important to ensure that both identifiers and values are correctly quoted. Otherwise you risk
SQL injection if untrusted data is processed
Errors if the column names require quoting (for example embedded spaces)
Data corruption or errors if values are incorrectly quoted (for example 2021-07-11 unquoted may be evaluated as 2003)
Quoting values is best delegated to the DB-API connector. However connector packages don't always provide a way to quote identifiers, so you may need to do this manually. MySQL uses backticks (`) to quote identifiers.
This code quotes identifiers and values. It works for MySQLdb, mysql.connector and pymysql and works for Python 3.5+.
data = {'col1': val1, 'col2': val2, ...}
# Compose a string of quoted column names
cols = ','.join([f'`{k}`' for k in data.keys()])
# Compose a string of placeholders for values
vals = ','.join(['%s'] * len(data))
# Create the SQL statement
stmt = f'INSERT INTO `tbl` ({cols}) VALUES ({vals})'
# Execute the statement, delegating the quoting of values to the connector
cur.execute(stmt, tuple(data.values()))
This is based on other answers here, but it uses backticks around column names for cases in which you are using reserved words as column names, and it ensures that column names only contain letters, numbers, and underscores to thwart SQL injection attacks.
I've also written a similar upsert that works the same way as the insert but which overwrites data that duplicates the primary key.
import mysql.connector
import re
cnx = mysql.connector.connect(...)
def checkColumnNames(data):
    for name in data.keys():
        assert re.match(r'^[a-zA-Z0-9_]+$', name), "Bad column name: " + name

def insert(table, data):
    checkColumnNames(data)
    assert table, "No table specified"
    placeholders = ', '.join(['%s'] * len(data))
    columns = '`,`'.join(data.keys())
    sql = "INSERT INTO `%s` (`%s`) VALUES (%s);" % (table, columns, placeholders)
    cnx.cursor().execute(sql, list(data.values()))

def upsert(table, data):
    checkColumnNames(data)
    assert table, "No table specified"
    placeholders = ', '.join(['%s'] * len(data))
    columns = '`,`'.join(data.keys())
    updates = '`' + '`=%s,`'.join(data.keys()) + '`=%s'
    sql = "INSERT INTO `%s` (`%s`) VALUES (%s) ON DUPLICATE KEY UPDATE %s" % (table, columns, placeholders, updates)
    cnx.cursor().execute(sql, list(data.values()) + list(data.values()))
Example usage
insert("animals", {
"id": 1,
"name": "Bob",
"type": "Alligator"
})
cnx.commit()
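A matching usage sketch for the upsert variant (same hypothetical animals table), which overwrites the row sharing the primary key:
upsert("animals", {
    "id": 1,
    "name": "Bob",
    "type": "Crocodile"  # replaces the earlier "Alligator" row with id 1
})
cnx.commit()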
I used this thread for my use case and tried to keep it much simpler:
ins_qry = "INSERT INTO {tablename} ({columns}) VALUES {values};" .format(
tablename=my_tablename,
columns=', '.join(myDict.keys()),
values=tuple(myDict.values())
)
cursor.execute(ins_qry)
Make sure to commit the inserted data with db_connection.commit(), and use cursor.lastrowid if you need the primary key of the inserted row.
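A minimal sketch of those two steps, assuming a DB-API connection named db_connection as above:
cursor.execute(ins_qry)
db_connection.commit()
new_id = cursor.lastrowid  # auto-generated primary key of the row just inserted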
This works for me (note the s after the closing parenthesis of the named placeholder):
cursor.execute("INSERT INTO table (col) VALUES ( %(col_value)s )",
{'col_value': 123})
If you have a list containing a number of dictionaries,
for example: lst = [d1, d2, d3, d4]
then the following worked for me:
for i in lst:
    placeholders = ', '.join(['%s'] * len(i))
    columns = ', '.join(i.keys())
    sql = "INSERT INTO %s ( %s ) VALUES ( %s )" % (table, columns, placeholders)
    cursor.execute(sql, list(i.values()))
    conn.commit()
Note: don't forget to commit, otherwise you won't see the inserted columns and values in the table.
columns = ', '.join(str(x).replace('/', '_') for x in row_dict.keys())
values = ', '.join("'" + str(x).replace('/', '_') + "'" for x in row_dict.values())
sql = "INSERT INTO %s ( %s ) VALUES ( %s );" % ("tablename", columns, values)
Applicable for Python 3.
Let's say our data is:
data = {
"name" : "fani",
"surname": "dogru",
"number" : 271990
}
This is my shorter version:
tablo = "table_name"
cols = ','.join([f" {k}" for k in data.keys()])
vals = ','.join([f"'{k}'" for k in data.values()])
stmt = f'INSERT INTO {tablo} ({cols}) VALUES ({vals})'
What about:
keys = str(dict.keys()).replace('[', '(').replace(']', ')').replace("'", '')
vals = str(dict.values()).replace('[', '(').replace(']', ')')
cur.execute('INSERT INTO table %s VALUES %s' % (keys, vals))
For Python 3:
keys = str(dict.keys())[9:].replace('[', '').replace(']', '').replace("'", '')
vals = str(dict.values())[11:].replace('[', '').replace(']', '')
...