I am using Postgres with psycopg2 in a Python/Flask web app.
I am having an encoding problem that I'm sure is something stupid I'm missing (I'm new to programming). The following statement works perfectly:
cur.execute("SELECT column_name FROM information_schema.columns where table_name = %s;", (tablename,))
I use fetchall() to create a list of column names in my table. However, another statement doesn't work:
cur.execute("ALTER TABLE %s ADD COLUMN %s varchar;", (tablename, col,))
Here is the error:
psycopg2.ProgrammingError
ProgrammingError: syntax error at or near "E'flatresponses_1'"
LINE 1: ALTER TABLE E'flatresponses_1' ADD COLUMN E'What was the bes...
('flatresponses_1' is the 'tablename', and 'What was the best...' is the start of 'col'.)
I did 'print cur.query' and here is the result:
>>> print cur.query
ALTER TABLE E'flatresponses_1' ADD COLUMN E'What was the best part of your ENT clinic visit today? Why?' varchar;
I'm getting the E'...' escaped-string syntax in the second query but not the first. I've also tried str(tablename).
What am I missing?!
I ended up using the AsIs psycopg2 extension as described in this post. Worked like a charm!
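For anyone else hitting this, here is a minimal sketch of that approach (the connection settings are placeholders, and note that AsIs does no quoting or validation, so only pass it names you control):

import psycopg2
from psycopg2.extensions import AsIs

conn = psycopg2.connect("dbname=postgres user=postgres host=localhost")
cur = conn.cursor()

tablename = "flatresponses_1"
col = "What was the best part of your ENT clinic visit today? Why?"

# AsIs splices the value into the query verbatim instead of wrapping it
# in an E'...' string literal; the column name contains spaces, so it
# still has to be double-quoted by hand to be a valid identifier
cur.execute(
    "ALTER TABLE %s ADD COLUMN %s varchar;",
    (AsIs(tablename), AsIs('"' + col.replace('"', '""') + '"')),
)
conn.commit()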
Table and column names aren't text values; they are identifiers (type name in the system catalogs). They don't take escaped string literals, so you need something other than a plain %s placeholder for them.
http://www.postgresql.org/docs/9.2/static/datatype-character.html
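If you are on a newer psycopg2 (2.7 or later), the psycopg2.sql module can compose identifiers for you; a sketch using the names from the question:

import psycopg2
from psycopg2 import sql

conn = psycopg2.connect("dbname=postgres user=postgres host=localhost")
cur = conn.cursor()

# sql.Identifier double-quotes the names as identifiers,
# while %s placeholders stay reserved for values
query = sql.SQL("ALTER TABLE {} ADD COLUMN {} varchar;").format(
    sql.Identifier("flatresponses_1"),
    sql.Identifier("What was the best part of your ENT clinic visit today? Why?"),
)
cur.execute(query)
conn.commit()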
Related
My objective is to store a JSON object into a MySQL database field of type json, using the mysql.connector library.
import mysql.connector
import json
jsonData = json.dumps(origin_of_jsonData)
cnx = mysql.connector.connect(**config_defined_elsewhere)
cursor = cnx.cursor()
cursor.execute('CREATE DATABASE dataBase')
cnx.database = 'dataBase'
cursor = cnx.cursor()
cursor.execute('CREATE TABLE `table` (id_field INT NOT NULL, json_data_field JSON NOT NULL, PRIMARY KEY (id_field))')
Now, the code below WORKS just fine; the focus of my question is the use of '%s':
insert_statement = "INSERT INTO `table` (id_field, json_data_field) VALUES (%s, %s)"
values_to_insert = (1, jsonData)
cursor.execute(insert_statement, values_to_insert)
My problem with that: I strictly stick to '...{}'.format(aValue) (or f'...{aValue}') when combining variables into a string, and thus avoid %s (whatever my reasons for that, let's not debate them here; it is simply how I would like to keep it wherever possible, hence my question).
In any case, I am simply unable, whichever way I try, to store jsonData into the MySQL database with something that resembles the structure above while using '...{}'.format() (in whatever shape or form) instead of %s. For example, I have (among many iterations) tried
insert_statement = "INSERT INTO `table` (id_field, json_data_field) VALUES ({}, {})".format(1, jsonData)
cursor.execute(insert_statement)
but no matter how I turn and twist it, I keep getting the following error:
ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '[some_content_from_jsonData})]' at line 1
Now my question(s):
1) Is there a way to avoid the use of %s here that I am missing?
2) If not, why? What is it that makes this impossible? Is it the cursor.execute() function, or is it the fact that it is a JSON object, or is it something completely different? Shouldn't {}.format() be able to do everything that %s could do, and more?
First of all: NEVER DIRECTLY INSERT YOUR DATA INTO YOUR QUERY STRING!
Using %s in a MySQL query string is not the same as using it in a Python string.
In Python, you just format the string, so 'hello %s!' % 'world' becomes 'hello world!'. In SQL, %s marks a parameter: the query and the data are sent to the server separately. You are also not bound to this syntax; the Python DB-API specification allows several parameter styles (see PEP 249). Parameter binding has several advantages over inserting your data directly into the query string:
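For illustration, here is a short sketch of the two styles mysql.connector accepts, 'format' (positional) and 'pyformat' (named); the table and column names are just the ones from the question:

# positional ('format') style: values passed as a tuple
cursor.execute(
    "INSERT INTO `table` (id_field, json_data_field) VALUES (%s, %s)",
    (1, jsonData),
)

# named ('pyformat') style: values passed as a dict
cursor.execute(
    "INSERT INTO `table` (id_field, json_data_field) VALUES (%(id)s, %(data)s)",
    {"id": 2, "data": jsonData},
)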
Prevents SQL injection
Say you have a query to authenticate users by password. You would do that with the following query (of course you would normally salt and hash the password, but that is not the topic of this question):
SELECT 1 FROM users WHERE username='foo' AND password='bar'
The naive way to construct this query would be:
"SELECT 1 FROM users WHERE username='{}' AND password='{}'".format(username, password)
However, what would happen if someone entered ' OR '1'='1 as the password? The formatted query would then become
SELECT 1 FROM users WHERE username='foo' AND password='' OR '1'='1'
which is true for every row, so the check passes no matter what the real password is. When using parameter insertion:
execute('SELECT 1 FROM users WHERE username=%s AND password=%s', (username, password))
this can never happen, because the data is sent to the server separately from the query and is never interpreted as SQL.
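A minimal runnable sketch of the same parameterized check with mysql.connector (the connection settings and the users table are assumptions for the example):

import mysql.connector

cnx = mysql.connector.connect(user="app", password="secret", database="mydb")
cursor = cnx.cursor()

username, password = "foo", "' OR '1'='1"  # malicious input is harmless here

# the values travel separately from the SQL text and can never change its meaning
cursor.execute(
    "SELECT 1 FROM users WHERE username=%s AND password=%s",
    (username, password),
)
print(cursor.fetchone())  # None instead of a bogus match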
Performance
If you run the same query many times with different data, the performance difference between using a formatted query and parameter insertion can be significant. With parameter insertion, the server only has to compile the query once (as it is the same every time) and execute it with different data, but with string formatting, it will have to compile it over and over again.
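For example, when inserting many rows, executemany() lets you send one statement with many parameter sets (a sketch; the table name and data are assumptions, and cursor/cnx are the ones from the question):

rows = [(1, '{"a": 1}'), (2, '{"b": 2}'), (3, '{"c": 3}')]

# one statement, many parameter sets; the driver can batch or prepare it
cursor.executemany(
    "INSERT INTO `table` (id_field, json_data_field) VALUES (%s, %s)",
    rows,
)
cnx.commit()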
In addition to what was said above, I would like to add some details that I did not immediately understand, and that others (newbies like me ;)) may also find helpful:
1) "parameter insertion" is meant only for values; it will not work for table names, column names, etc. For those, Python string substitution works fine when building the SQL statement
2) the cursor.execute() function requires the parameters to be passed as a tuple (as specified here, albeit not immediately clear, at least to me: https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html)
EXAMPLE for both in one function:
def checkIfRecordExists(column, table, condition_name, condition_value):
    ...
    sqlSyntax = 'SELECT {} FROM {} WHERE {} = %s'.format(column, table, condition_name)
    cursor.execute(sqlSyntax, (condition_value,))
Note both the use of .format in the initial sql syntax definition and the use of (condition_value,) in the execute function.
I have to delete some rows, matched by date, from MySQL using Python.
I have over 2000 tables, so I need to finish this code; I can't handle this much by clicking my mouse. I really need help.
Well, my guess was like this:
sql = "delete from finance.%s where date='2000-01-10'"

def Del():
    for i in range(0, len(data_s)):
        curs.execute(sql, (data_s[i]))
        conn.commit()
However, it doesn't work.
When I just type it like this, it works:
>>> query="delete from a000020 where date ='2000-01-25'"
>>> curs.execute(query)  # curs = conn.cursor()
But if I add %s to the statement, it doesn't work:
>>> table='a000050'
>>> query="delete from %s where date ='2000-01-25'"
>>> curs.execute(query,table)
ProgrammingError: (1064, u"You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''a000050' where date ='2000-01-25'' at line 1")
This doesn't work either:
>>> curs.execute(query,(table))
ProgrammingError: (1064, u"You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''a000050' where date ='2000-01-25'' at line 1")
A bit different... but the same:
>>> curs.execute(query,(table,))
I have read many questions on here, but just adding () or , doesn't fix it...
Because I'm a beginner with Python and MySQL, I really need your help. Thank you for reading.
I had the same issue and I fixed it by concatenating the table name into the string:
def Del():
    for i in range(0, len(data_s)):
        x = "delete from finance." + data_s[i] + " where date='2000-01-10'"
        print x  # to check the sql statement :)
        curs.execute(x)
        conn.commit()
Good question, have a look at the MySQLdb User's Guide:
paramstyle
String constant stating the type of parameter marker formatting
expected by the interface. Set to 'format' = ANSI C printf format
codes, e.g. '...WHERE name=%s'. If a mapping object is used for
conn.execute(), then the interface actually uses 'pyformat' = Python
extended format codes, e.g. '...WHERE name=%(name)s'. However, the API
does not presently allow the specification of more than one style in
paramstyle.
Note that any literal percent signs in the query string passed to execute() must be escaped, i.e. %%.
Parameter placeholders can only be used to insert column values. They
can not be used for other parts of SQL, such as table names,
statements, etc.
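So for your case, a sketch along those lines (assuming data_s holds table names you control, since the table name has to be formatted into the string, while the date goes through a real %s parameter):

sql = "DELETE FROM finance.`{}` WHERE date = %s"

def Del():
    for table in data_s:
        # table name formatted in (no placeholder possible for identifiers),
        # date value bound as a parameter
        curs.execute(sql.format(table), ('2000-01-10',))
    conn.commit()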
Hope this helps.
I'm inserting data from some csv files into my SQLite3 database with a python script I wrote. When I run the script, it inserts the first row into the database, but gives this error when trying to insert the second:
sqlite3.IntegrityError: columns column_name1, column_name2 are not unique.
It is true the values in column_name1 and column_name2 are same in the first two rows of the csv file. But, this seems a bit strange to me, because reading about this error indicated that it signifies a uniqueness constraint on one or more of the database's columns. I checked the database details using SQLite Expert Personal, and it does not show any uniqueness constraints on the current table. Also, none of the fields that I am entering specify the primary key. It seems that the database automatically assigns those. Any thoughts on what could be causing this error? Thanks.
import sqlite3
import csv

if __name__ == '__main__':
    conn = sqlite3.connect('ts_database.sqlite3')
    c = conn.cursor()
    fileName = "file_name.csv"
    f = open(fileName)
    csv_f = csv.reader(f)
    for row in csv_f:
        command = "INSERT INTO table_name(column_name1, column_name2, column_name3)"
        command += " VALUES (%s, '%s', %s);" % (row[0], row[1], row[2])
        print command
        c.execute(command)
    conn.commit()
    f.close()
If SQLite is reporting an IntegrityError, it's very likely that there really is a PRIMARY KEY or UNIQUE constraint on those two columns and that you are mistaken when you state there is not. Ensure that you're really looking at the same instance of the database.
Also, do not write your SQL statement using string interpolation. It's dangerous and also difficult to get correct (as you probably know considering you have single quotes on one of the fields). Using parameterized statements in SQLite is very, very simple.
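For example, a sketch of the same loop using ? placeholders (names taken from your script):

insert_sql = ("INSERT INTO table_name (column_name1, column_name2, column_name3) "
              "VALUES (?, ?, ?);")

for row in csv_f:
    # values are bound by the driver, never spliced into the SQL text
    c.execute(insert_sql, (row[0], row[1], row[2]))
conn.commit()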
The error may be due to duplicate column names in the INSERT INTO statement. I am guessing it is a typo and you meant column_name3 in the INSERT INTO statement.
I'm trying to insert a record into an sqlite database using named parameters in python (with the sqlite3 module).
The values I want to insert are in a dictionary, but the dictionary keys might contain dashes, for example {'request-id': 100, 'year': '2015'}.
I'm trying to execute the following:
import sqlite3
conn = sqlite3.connect('database.db')
cursor = conn.cursor()
cursor.execute('''CREATE TABLE IF NOT EXISTS requests (request_id text, year text)''')
query = '''INSERT INTO requests (request_id, year) VALUES (:request-id, :year)'''
cursor.execute(query, {'request-id': 100, 'year': '2015'})
conn.commit()
conn.close()
I get this error during the insert statement:
sqlite3.OperationalError: no such column: id
It seems like dashes are not well accepted as named parameters.
There are many workarounds for this, like creating a new dictionary where dashes in the keys are replaced by underscores, but I'd like to know if I could use some escaping technique or something else to avoid that.
Thanks for your help
The documentation for sqlite3_bind_* states that parameter names must be composed of alphanumeric characters, and doesn't mention a way of escaping them.
Your query is probably being parsed as :request - id, i.e. :request minus id, and since there's no such column id, SQLite throws an error.
(Also, as Prerak Sola points out, you create the table with a date column but try to insert to a year column which doesn't exist.)
SQL parameter names have no quoting or escaping mechanism; you have to use the same rules as for an unquoted identifier.
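So the practical workaround is the one you already mentioned: translate the keys once before binding. A sketch, reusing the names from your script:

params = {'request-id': 100, 'year': '2015'}

# rename dashed keys to the underscore form used in the query
safe_params = {key.replace('-', '_'): value for key, value in params.items()}

query = 'INSERT INTO requests (request_id, year) VALUES (:request_id, :year)'
cursor.execute(query, safe_params)
conn.commit()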
I have a field called comments.
I am effectively trying to read values from one large table into multiple tables.
Hence, my select query fetches the comment field for me.
I am constructing a Python script to do the copying from table to table.
My insert query fails when it encounters a comment field like "Sorry! we can't process your order" because of the single quote.
I have tried using dollar quoting ($$), but in vain.
Here is what I am trying:
#!/usr/bin/python
import psycopg2

conn = psycopg2.connect("dbname='postgres' user='postgres' host='localhost'")
mark = conn.cursor()

# fetching the rows and doing other stuff

addthis = "insert into my_table(something) values($$" + str(row[8]) + "$$)"
mark.execute(addthis)
conn.commit()
Appreciate the help!
Your insert statement should use a placeholder. In the case of psycopg2, it is %s.
You should pass the parameter(s) as a second argument to execute(). That way you don't have quoting issues and you guard against SQL-injection attack.
For example:
addthis = "INSERT INTO my_table (something) VALUES (%s);"
mark.execute(addthis, ('a string you wish to insert',))
You could use a placeholder, as suggested by bernie. This is the preferred way.
There are however situations where using a placeholder is not possible. You then have to escape the quotes manually. In PostgreSQL a single quote inside a string literal is escaped by doubling it, and you have to supply the surrounding quotes yourself:
addthis = "insert into my_table(something) values('%s')" % str(row[8]).replace("'", "''")
mark.execute(addthis)