I am trying to load data into the table trial and it says Invalid column name 'Name'.
I am passing the values for Name and Area dynamically.
cursor.execute("insert into trial (NameofTheProperty, AreaofTheProperty)
values (Name, Area)")
cnxn.commit()
You need to put quotes around the column values so that they are not interpreted as column names:
insert into
trial (NameofTheProperty, AreaofTheProperty)
values
("Name", "Area")
Now, since you mentioned that you dynamically insert these values into the query, you can just let your database driver handle the quotes and other things like type conversions:
property_name = "Name"
property_area = "Area"
cursor.execute("""
insert into
trial (NameofTheProperty, AreaofTheProperty)
values
(?, ?)""", (property_name, property_area))
cnxn.commit()
This is called query parameterization and is considered the safest and most robust way to insert values into SQL queries. These ? values are called "placeholders".
Note that the database driver will put quotes around string values automatically - no need to do it manually.
I have seen some posts suggesting using ? as a placeholder when inserting Python variables into a SQL query, but all of these examples show the question mark at the end of the query, followed by the Python variable. What if you want to insert a Python variable in the middle of a query and want to avoid SQL injection? I am using Python 3.6 and SQLite.
Update* - This code is working:
import sqlite3

id='13'
text='YES'
db=sqlite3.connect('NEW_Inventory.sqlite')
cursor=db.cursor()
query=('''
INSERT
OR REPLACE
INTO
text (id, text)
VALUES
(?,
(SELECT
CASE
WHEN exists(SELECT 1 FROM text WHERE id=?)
THEN 'good'
ELSE 'Hello'
END
)
)''')
cursor.execute(query, (id, id))
db.commit()
You need to pass the parameters to execute() as a tuple. In your case you need to call it like this:
cursor.execute(query, (id, id))
where query is your parameterised SQL query string.
I assume that your code defines id somewhere; otherwise execute() will try to use the built-in function id() to construct the query, resulting in another error.
It is also worth mentioning that if you have only one parameter, it must still be passed as a tuple, like this: (id,). Avoid the common mistake of writing (id), which is not a tuple.
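For example, a single-parameter query against the same text table could look like this (a minimal sketch using your existing cursor):
cursor.execute("SELECT 1 FROM text WHERE id=?", (id,))  # note the trailing comma - (id,) is a one-element tuple
row = cursor.fetchone()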
With the MySQLdb package in Python, I want to insert records while checking some unique keys. The method I used is executemany. The arguments are an SQL statement and a tuple. But when I executed it, it raised an error saying "not all arguments converted".
The codes are as following:
dData = [[u'Daniel', u'00-50-56-C0-00-12', u'Daniel']]
sql = "INSERT INTO app_network_white_black_list (biz_id, shop_id, type, mac_phone, remarks, create_time) " \
"VALUES ({bsid}, {shop_id}, {type}, %s, %s, NOW()) " \
"ON DUPLICATE KEY UPDATE type={type}, remarks=%s, create_time=NOW()".format(bsid=bsid, shop_id=shop_id, type=dType)
cur.executemany(sql, tuple(dData))
Someone said this is a bug, but they didn't suggest a way around it. If this is a bug, please provide a workaround.
What's going wrong
After checking the link in your comment below and doing some more research and testing, I was able to reproduce the error with MySQLdb versions 1.2.4b4 and 1.2.5. As explained in unubtu's answer, this has to do with the limitations of a regular expression that appears in cursors.py. The exact regular expression is slightly different in each release, probably because people keep finding cases it doesn't handle and adjusting the expression instead of looking for a better approach entirely.
What the regular expression does is try to match the VALUES ( ... ) clause of the INSERT statement and identify the beginning and end of the tuple expression it contains. If the match succeeds, executemany tries to convert the single-row insert statement template into a multiple-row insert statement so that it runs faster. I.e., instead of executing this for every row you want to insert:
INSERT INTO table
(foo, bar, ...)
VALUES
(%s, %s, ...);
It tries to rewrite the statement so that it only has to execute once:
INSERT INTO table
(foo, bar, ...)
VALUES
(1, 2, ...),
(3, 4, ...),
(5, 6, ...),
...;
The problem you're running into is that executemany assumes you only have parameter placeholders in the tuple immediately after VALUES. When you also have placeholders later on, it takes this:
INSERT INTO table
(foo, bar, ...)
VALUES
(%s, %s, ...)
ON DUPLICATE KEY UPDATE baz=%s;
And tries to rewrite it like this:
INSERT INTO table
(foo, bar, ...)
VALUES
(1, 2, ...),
(3, 4, ...),
(5, 6, ...),
...
ON DUPLICATE KEY UPDATE baz=%s;
The problem here is that MySQLdb is trying to do string formatting at the same time that it's rewriting the query. Only the VALUES ( ... ) clause needs to be rewritten, so MySQLdb tries to put all your parameters into the matching group (%s, %s, ...), not realizing that some parameters need to go into the UPDATE clause instead.
If you only send parameters for the VALUES clause to executemany, you'll avoid the TypeError but run into a different problem. Notice that the rewritten INSERT ... ON DUPLICATE UPDATE query has numeric literals in the VALUES clause, but there's still a %s placeholder in the UPDATE clause. That's going to throw a syntax error when it reaches the MySQL server.
When I first tested your sample code, I was using MySQLdb 1.2.3c1 and couldn't reproduce your problem. Amusingly, the reason that particular version of the package avoids these problems is that the regular expression is broken and doesn't match the statement at all. Since it doesn't match, executemany doesn't attempt to rewrite the query, and instead just loops through your parameters calling execute repeatedly.
What to do about it
First of all, don't go back and install 1.2.3c1 to make this work. You want to be using updated code where possible.
You could move to another package, as unubtu suggests in the linked Q&A, but that would involve some amount of adjustment and possibly changes to other code.
What I would recommend instead is to rewrite your query in a way that is more straightforward and takes advantage of the VALUES() function in your UPDATE clause. This function allows you to refer back to the values that you would have inserted in the absence of a duplicate key violation, by column name (examples are in the MySQL docs).
With that in mind, here's one way to do it:
dData = [[u'Daniel', u'00-50-56-C0-00-12', u'Daniel']] # exact input you gave
sql = """
INSERT INTO app_network_white_black_list
(biz_id, shop_id, type, mac_phone, remarks, create_time)
VALUES
(%s, %s, %s, %s, %s, NOW())
ON DUPLICATE KEY UPDATE
type=VALUES(type), remarks=VALUES(remarks), create_time=VALUES(create_time);
""" # keep parameters in one part of the statement
# generator expression takes care of the repeated values
cur.executemany(sql, ((bsid, shop_id, dType, mac, rem) for mac, rem in dData))
This approach should work because there are no parameters in the UPDATE clause, meaning MySQLdb will be able to successfully convert the single-line insert template with parameters into a multi-line insert statement with literal values.
Some things to note:
You don't have to supply a tuple to executemany; any iterable is fine.
Multiline strings make for much more readable SQL statements in your Python code than implicitly concatenated strings; when you separate the statement from the string delimiters, it's easy to quickly grab the statement and copy it into a client application for testing.
If you're going to parameterize part of your query, why not parameterize all of your query? Even if only part of it is user input, it's more readable and maintainable to handle all your input values the same way.
That said, I didn't parameterize NOW(). My preferred approach here would be to use CURRENT_TIMESTAMP as the column default and take advantage of DEFAULT in the statement. Others might prefer to generate this value in the application and supply it as a parameter. If you're not worried about version compatibility, it's probably fine as-is.
If you can't avoid having parameter placeholders in the UPDATE clause – e.g., because the UPDATE value(s) can't be hard-coded in the statement or derived from the VALUES tuple – you'll have to iterate over execute instead of using executemany (see the sketch after these notes).
Each row of dData has three elements, but there are only two row-level %s placeholders (mac_phone and remarks) for them to go into; you'll need to reconcile that before the generator expression above will unpack cleanly.
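Here is a sketch of that fallback, assuming each row of dData has been trimmed to (mac, remarks) pairs and reusing bsid, shop_id and dType from your code; the second rem parameter in the UPDATE clause is purely illustrative:
sql = """
INSERT INTO app_network_white_black_list
(biz_id, shop_id, type, mac_phone, remarks, create_time)
VALUES
(%s, %s, %s, %s, %s, NOW())
ON DUPLICATE KEY UPDATE
remarks=%s, create_time=NOW();
"""
# loop over execute() because executemany cannot rewrite a statement
# that also has placeholders outside the VALUES clause
for mac, rem in dData:
    cur.execute(sql, (bsid, shop_id, dType, mac, rem, rem))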
I used MySQL Connector/Python API, NOT MySQLdb.
I need to dynamically insert values into a sparse table so I wrote the Python code like this:
cur.executemany("UPDATE myTABLE SET %s=%s WHERE id=%s" % data)
where
data=[('Depth', '17.5cm', Decimal('3003')), ('Input_Voltage', '110 V AC', Decimal('3004'))]
But it resulted in an error:
TypeError: not enough arguments for format string
Is there any solution for this problem? Is it possible to use executemany when there is a substitution of a field name in the query?
Thanks.
Let's start with the original method:
As the error message suggests, you have a problem with your SQL syntax (not Python). If you insert your values, you are effectively trying to execute:
UPDATE myTABLE SET 'Depth'='17.5cm' WHERE id='3003'
You should notice that you are trying to assign a value to a string 'Depth', not a database field. The reason for this is that the %s substitution of the mysql module is only possible for values, not for tables/fields or other object identifiers.
In the second try you are not using the substitution anymore. Instead you use generic Python string interpolation, which, however, looks similar. This does not work for you because your code has an extra comma and an extra pair of brackets. It should read:
cur.execute("UPDATE myTABLE SET %s=%s WHERE id=%s" % data)
I also replaced executemany with execute because this method will work only for a single row. However your example only has one row, so there is no need to use executemany anyway.
The second method has some drawbacks however. The substitution is not guaranteed to be quoted or formatted in a correct manner for the SQL query, which might cause unexpected behaviour for certain inputs and may be a security concern.
I would rather ask why it is necessary to provide the field name dynamically in the first place. This should not be necessary and might cause some trouble.
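If you really cannot avoid a dynamic field name, one hedged approach is to validate it against a whitelist of known column names and parameterize only the value and the id; this is a sketch under those assumptions (ALLOWED_FIELDS is hypothetical), not an endorsement:
ALLOWED_FIELDS = {'Depth', 'Input_Voltage'}  # assumed set of legal column names

for field, value, row_id in data:
    if field not in ALLOWED_FIELDS:
        raise ValueError("unexpected field name: %r" % field)
    # the identifier is interpolated only after the whitelist check;
    # the value and the id still go to the driver as parameters
    cur.execute("UPDATE myTABLE SET {}=%s WHERE id=%s".format(field), (value, row_id))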
I have a field called comments.
I am effectively trying to read values from one large table into multiple tables.
Hence, my select query fetches the comment field for me.
I am constructing a Python script to do the copying from table to table.
My insert query fails when it encounters a comment field like "Sorry! we can't process your order" because of the single quote.
I have tried using $$ (dollar) quoting, but in vain.
Here is what I am trying
#!/usr/bin/python
import psycopg2
conn = psycopg2.connect("dbname='postgres' user='postgres' host='localhost'")
mark=conn.cursor()
# fetching the rows and doing other stuff
addthis="insert into my_table(something) values("$$"+str(row[8])+"$$")
mark.execute(addthis)
conn.commit()
Appreciate the help!
Your insert statement should use a placeholder. In the case of psycopg2, it is %s.
You should pass the parameter(s) as a second argument to execute(). That way you don't have quoting issues, and you guard against SQL injection attacks.
For example:
addthis = "INSERT INTO my_table (something) VALUES (%s);"
mark.execute(addthis, ('a string you wish to insert',))
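Applied to your table-to-table copy, the same idea might look roughly like this (a sketch: it assumes the comment text really sits in row[8] and that rows holds the result of your earlier SELECT):
insert_sql = "INSERT INTO my_table (something) VALUES (%s);"
for row in rows:  # rows assumed fetched earlier, e.g. with mark.fetchall()
    mark.execute(insert_sql, (str(row[8]),))  # psycopg2 quotes the value for you
conn.commit()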
You could use a placeholder, as suggested by bernie. This is the preferred way.
There are, however, situations where using a placeholder is not possible. You then have to escape the quotes manually. In PostgreSQL a single quote inside a string literal is escaped by doubling it (and you have to add the surrounding quotes yourself):
addthis = "insert into my_table(something) values('%s')" % str(row[8]).replace("'", "''")
mark.execute(addthis)
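If you genuinely cannot use a placeholder and must build the string yourself, you can also let psycopg2 do the quoting for you via psycopg2.extensions.adapt instead of hand-rolled replace() calls. A sketch, assuming row[8] holds the comment text:
from psycopg2.extensions import adapt

# adapt() wraps the value in quotes and doubles any embedded single quotes,
# e.g. "Sorry! we can't ..." becomes 'Sorry! we can''t ...'
quoted = adapt(str(row[8])).getquoted().decode()
addthis = "insert into my_table(something) values(%s)" % quoted
mark.execute(addthis)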