Insert python list into Postgres database - python

I am having trouble formatting the list for insertion using psycopg2.
Here is a sample of the code I am trying to run.
Basically, I am reading data from one table and trying to insert it into another table.
Code:
cur.execute("""select data from \"Table1\" where lat=-20.004189 and lon=-63.848004""")
rows = cur.fetchall()
print rows
cur.execute("""INSERT INTO \"%s\" (data) VALUES (ARRAY%s)""" % (args.tableName,rows)))
The result returned by first select query is like this:
[([6193, 3975, 4960, 5286, 3380, 970, 3328, 3173, 2897, 2457, 2443, 2674, 2172, 2740, 3738, 4907, 3691, 4234, 3651, 3215],)]
When I try to insert this into another table I get the following format error.
cur.execute(cur.mogrify("""INSERT INTO \"%s\" (data) VALUES (%s)""" % (args.tableName,rows)))
psycopg2.ProgrammingError: syntax error at or near "["
LINE 1: INSERT INTO "DUMMY1km" (data) VALUES ([([6193, 3975, 4960, 5...
I tried cur.mogrify, but it does not seem to help.
Please let me know if anyone has a workaround for this issue.
Thanks
Adi

I don't think mogrify is needed here. Use executemany and pass rows as the second argument.
cur.executemany(
    """INSERT INTO "%s" (data) VALUES (%%s)""" % (args.tableName,), rows)
Using parametrized arguments helps prevent SQL injection.
The table name cannot be parametrized, so we do have to use string interpolation to place the table name in the SQL query. %%s escapes the percent sign and becomes %s after string interpolation.
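To see what the interpolation produces, here is a quick illustration; the table name "DUMMY1km" is just taken from the traceback above as an example:
query = """INSERT INTO "%s" (data) VALUES (%%s)""" % ("DUMMY1km",)
print query  # INSERT INTO "DUMMY1km" (data) VALUES (%s)
# executemany then binds each one-element tuple from fetchall() to the remaining %s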
By the way, (as a_horse_with_no_name has already pointed out) you can use the INSERT INTO ... SELECT form of INSERT to perform both SQL queries as one:
cur.execute(
    """INSERT INTO %s (data)
       SELECT data FROM "Table1"
       WHERE lat=-20.004189 AND lon=-63.848004""" % (args.tableName,))
Per the question in the comments, if there are multiple fields, then the SQL becomes:
cur.executemany(
    """INSERT INTO {t} (lat,lon,data1,data2)
       VALUES (%s,%s,%s,%s)""".format(t=args.tableName), rows)
(If you use the format method, then you don't have to escape all the other %ss.)
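As an aside, if your psycopg2 is 2.7 or newer, the psycopg2.sql module can quote the table name safely instead of interpolating it by hand; a minimal sketch:
from psycopg2 import sql

# sql.Identifier handles the double-quoting of the table name;
# the data values remain ordinary parameters
query = sql.SQL("INSERT INTO {} (data) VALUES (%s)").format(
    sql.Identifier(args.tableName))
cur.executemany(query, rows)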

Related

python pymssql error with query execution

This is the pymssql query I am working with:
query = 'INSERT INTO [dbo].[helios_devops_data_curr] ("iipm.l3_it_org", "iipm.it_custodian","iipm.it_executive") VALUES ({}{}{})'.format("'Innovation and Technology'", "'bob tom'", "'bob tom'")
I'm using these values as an example; they are not the real values I'm trying to upload. However, the errors are the same:
(109, b'There are more columns in the INSERT statement than values specified in the VALUES clause. The number of values in the VALUES clause must match the number of columns specified in the INSERT statement.DB-Lib error message 20018, severity 15:\nGeneral SQL Server error: Check messages from the SQL Server\n')
I'm not sure why this error is occurring, as there are clearly 3 columns and 3 values being inserted.
Any help on this would be appreciated.
Your format placeholders need commas between them; without the commas, the back-to-back quotes are parsed by SQL as escaped quotes inside a single string literal, so the VALUES clause ends up with one value instead of three:
query = 'INSERT INTO [dbo].[helios_devops_data_curr] ("iipm.l3_it_org", "iipm.it_custodian","iipm.it_executive") VALUES ({}, {}, {})'.format("'Innovation and Technology'", "'bob tom'", "'bob tom'")
Looking at the docs, it appears that pymssql prefers C-style string-formatting symbols as parameter placeholders; maybe something like this:
query = "INSERT INTO dbo.helios_devops_data_curr(iipm.l3_it_org, iipm.it_custodian, iipm.it_executive) VALUES (%s, %s, %s)"
params = ("Innovation and Technology", "bob tom", "bob tom")
cursor.execute(query, params)
Here is another SO question with an example.
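Putting it together, a self-contained sketch (the connection details are placeholders, and the dotted column names are bracket-quoted, as T-SQL requires for such identifiers):
import pymssql

# placeholder credentials; substitute your own server details
conn = pymssql.connect(server='myserver', user='user',
                       password='secret', database='mydb')
cursor = conn.cursor()
cursor.execute(
    "INSERT INTO dbo.helios_devops_data_curr "
    "([iipm.l3_it_org], [iipm.it_custodian], [iipm.it_executive]) "
    "VALUES (%s, %s, %s)",
    ("Innovation and Technology", "bob tom", "bob tom"))
conn.commit()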

Getting column is of type jsonb[] but expression is of type text[] when trying to insert with psycopg2

I'm using execute_values to insert the contents of many tables into other tables. I had already registered an adapter from dict to Json when I received an earlier error, but I don't know how to handle this one:
psycopg2.ProgrammingError: column "rules" is of type jsonb[] but expression is of type text[]
LINE 1: ...UES (1,'tturxvrtgvvsrqgzsedcoyqujakyepjordrbbjdw',ARRAY['{"i...
The only way I found to handle it was something like this issue, but then I would need to treat each column individually ...
Maybe there is a way to create a new adapter, but I couldn't work out how to do that from the docs.
register_adapter(dict, Json)
execute_values(
    dest_cursor,
    f'''
    INSERT INTO {t} VALUES %s ON CONFLICT DO NOTHING;
    ''',
    records,
)
Is there an automatic way to deal with this, like register_adapter?
I found the answer: from this issue I learned that I could use the template parameter, like so:
execute_values(
    dest_cursor,
    f'''
    INSERT INTO {t} VALUES %s ON CONFLICT DO NOTHING;
    ''',
    records,
    template="(%s, %s, %s::jsonb[], %s, %s)",  # as many placeholders as the target table has columns
)
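For reference, a minimal end-to-end sketch of the same idea; the table layout (id int, name text, rules jsonb[]) is made up for illustration:
from psycopg2.extensions import register_adapter
from psycopg2.extras import Json, execute_values

# adapt plain dicts to JSON on the way out
register_adapter(dict, Json)

# one row for the hypothetical (id, name, rules) table
records = [(1, 'foo', [{"x": 1}, {"x": 2}])]
execute_values(
    dest_cursor,
    "INSERT INTO my_table VALUES %s ON CONFLICT DO NOTHING",
    records,
    template="(%s, %s, %s::jsonb[])",  # cast the array column explicitly
)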

Successful connection to MS SQL DB on Azure using pymssql, INSERT statement executing with no error message but 0 rows inserted

I'm using pymssql to connect to a MS SQL DB on Azure and insert records from a CSV file. I've verified that the connection is successful, and that the list used for the executemany parameter contains all of the data in the correct format and with the correct number of values. However, when I run the script 0 rows are inserted - but no error is thrown.
I looked around and it seems like most others who have experienced something similar were missing the commit(), but that isn't the issue here.
Here is the code. Any help is greatly appreciated.
with open('file.csv') as csvfile:
    data = csv.reader(csvfile)
    next(data)  # skip the header row
    dicts = ({'col1': line[0], 'col2': line[1], 'col3': line[2],
              'col4': int(line[3]), 'col5': int(line[4]),
              'col6': float(line[5])} for line in data)
    to_db = ((i['col1'], i['col2'], i['col3'], i['col4'], i['col5'], i['col6'])
             for i in dicts)
    cursor.executemany(
        'INSERT INTO myTable VALUES (%s, %s, %s, %d, %d, %f)',
        to_db)
    print str(cursor.rowcount) + " rows inserted"
conn.commit()
Edit: If I execute the query using cursor.execute() and include the values explicitly in the query then I can successfully insert rows into the database (see below for example).
cursor.execute("INSERT INTO myTable VALUES ('4/18/2016','test','test',0,0,0.0)")
But if I use the cursor.executemany(operation, parameters) syntax and pass a list of the values as the parameter, it results in an incorrect-syntax error.
cursor.executemany("INSERT INTO myTable VALUES(%s,%s,%s,%d,%d,%f)",list_of_values)
I was just reading in the module reference that only %s and %d are supported. So I'm thinking that might be the issue. But how do I pass a float?
Using the float placeholder (%f) was in fact the issue. Only %s and %d are supported, but they are purely placeholders and do not affect formatting the way they typically do in Python, so really only %s is needed. The working code is as follows:
cursor.executemany("INSERT INTO myTable VALUES(%s,%s,%s,%s,%s,%s)",list_of_values)

How to handle apostrophes in MySQL-Python?

A Python API is giving back u"'HOPPE'S No. 9'" as a value for a particular product attribute. I'm then looking to insert it into the DB, also using Python (python-mysqldb), with the following query:
INSERT INTO mytable (rating, Name) VALUES('5.0 (7)', 'HOPPE'S No. 9');
MySQL rejects this, and the suggested approach to handling a single quote in MySQL is to escape it first. This I need to do in Python, so I try:
In [5]: u"'HOPPE'S No. 9'".replace("'", "\'")
Out[5]: u"'HOPPE'S No. 9'"
When I incorporate this in my program, MySQL still rejects it. So I double-escape the apostrophe, and then an insert happens successfully. Thing is, it contains the escape character (so what gets written is 'HOPPE\'S No. 9').
If I need the second escape character, but it gets left in when I add it, how can I handle the escaping without having the escape character included in the string that gets inserted?
Edit: Based on theBjorn's suggestion, tried:
actualSQL = "INSERT INTO %s (%s) VALUES(%s);"
#cur.execute(queryString)
cur.execute(actualSQL,
(configData["table"], sqlFieldMappingString, sqlFieldValuesString))
but it looks like I'm back to where I was when I was trying to escape using the single escape with .replace():
Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''mytable' ('rating, Name, Image, mfg, price, URL') VALUES('\'5.0 (3)\', \'AR-1' at line 1
You should never construct sql that way. Use parameterized code instead:
cursor.execute(
"insert into mytable (rating, name) values (%s, %s);",
("5.0 (7)", "HOPPE'S No. 9")
)
Your latest problem is due to the misconception that this is string interpolation, which it isn't (the use of %s is confusing), thus:
actualSQL = "INSERT INTO %s (%s) VALUES(%s);"
is wrong. It is possible to construct your SQL string, but it is probably easier to do so in two steps, so we don't trip over SQL parameter markers that look like string-interpolation markers. Assuming you have the values in a tuple named field_values:
params = ["%s"] * len(field_values) # create a list with the correct number of parameter markers
sql = "insert into %s (%s) values (%s)" % ( # here we're using string interpolation, but not with the values
configData["table"],
sqlFieldMappingString,
', '.join(params)
)
If you print sql it should look like my example above. Now you can execute it with:
cursor.execute(sql, field_values)
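To make that concrete, here is the two-step construction with hypothetical inputs (the table and field names are invented for illustration):
# hypothetical inputs
configData = {"table": "mytable"}
sqlFieldMappingString = "rating, name"
field_values = ("5.0 (7)", "HOPPE'S No. 9")

params = ["%s"] * len(field_values)
sql = "insert into %s (%s) values (%s)" % (
    configData["table"], sqlFieldMappingString, ', '.join(params))
print sql  # insert into mytable (rating, name) values (%s, %s)
cursor.execute(sql, field_values)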

python, mysql, inserting string into table, error 1054

I am having this problem:
OperationalError: (1054, "Unknown column 'Ellie' in 'field list'")
With the code below, I'm trying to insert data from JSON into a MySQL database. The problem happens whenever I try to insert a string, in this case "Ellie". This is something to do with string interpolation, I think, but I can't get it to work despite trying some other solutions I have seen here.
CREATE TABLE
con = MySQLdb.connect('localhost','root','','tweetsdb01')
cursor = con.cursor()
cursor.execute("CREATE TABLE IF NOT EXISTS User(user_id BIGINT NOT NULL PRIMARY KEY, username varchar(25) NOT NULL,user varchar(25) NOT NULL) CHARACTER SET utf8 COLLATE utf8_unicode_ci ENGINE=InnoDB")
con.commit()
INSERT INTO
def populate_user(a,b,c):
    con = MySQLdb.connect('localhost','root','','tweetsdb01')
    cursor = con.cursor()
    cursor.execute("INSERT INTO User(user_id,username,user) VALUES(%s,%s,%s)"%(a,b,c))
    con.commit()
    cursor.close()
READ FILE - this calls the populate method above
def read(file):
    json_data = open(file)
    tweets = []
    for i in range(10):
        tweet = json.loads(json_data.readline())
        populate_user(tweet['from_user_id'],tweet['from_user_name'],tweet['from_user'])
Use parametrized SQL:
cursor.execute("INSERT INTO User(user_id,username,user) VALUES (%s,%s,%s)", (a,b,c))
(Notice that the values (a,b,c) are passed to execute as its second argument, not folded into the first argument through string interpolation.) MySQLdb will properly quote the arguments for you.
PS. As Vajk Hermecz notes, the problem occurs because the string 'Ellie' is not being properly quoted.
When you do the string interpolation with "(%s,)" % (a,) you get
(Ellie,), whereas what you really want is ('Ellie',). But don't bother doing the quoting yourself; it is safer and easier to use parametrized SQL.
Your problem is that you are adding the values into the query without any escaping, so the query is simply broken. You could do something like:
cursor.execute("INSERT INTO User(user_id,username,user) VALUES(\"%s\",\"%s\",\"%s\")"%(a,b,c))
But that would just introduce SQL injection into your code.
NEVER construct SQL statements by concatenating query and data. Use parametrized queries instead.
The proper solution here would be:
cursor.execute("INSERT INTO User(user_id,username,user) VALUES(%s,%s,%s)", (a,b,c))
So the problem with your code was that you used the % operator, which does string formatting, and ended up passing only a single argument to cursor.execute. The proper solution is, instead of doing the string formatting yourself, to give cursor.execute the query in the first parameter and the tuple of arguments in the second parameter.
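To make the difference concrete, a side-by-side sketch (the user_id value is invented for illustration):
# broken: % interpolation pastes Ellie into the SQL unquoted,
# so MySQL parses it as a column name -> error 1054
cursor.execute("INSERT INTO User(user_id,username,user) VALUES(%s,%s,%s)" % (1, 'Ellie', 'Ellie'))

# correct: pass the tuple as a second argument and let MySQLdb quote it
cursor.execute("INSERT INTO User(user_id,username,user) VALUES(%s,%s,%s)", (1, 'Ellie', 'Ellie'))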
