How to insert NULL values into a PostgreSQL database using Python?

I have a list of tuples with data something like this:
list1 = [(1100, 'abc', '{"1209": "Y", "1210": "Y"}'), (1100, 'abc', None)]

def insert_sample_data(col_val):
    cur = self.con.cursor()
    sql = """insert into sampletable values {}""".format(col_val)
    cur.execute(sql)
    self.con.commit()
    cur.close()

values1 = ', '.join(map(str, list1))  # bulk insert
insert_sample_data(values1)
Table Structure:
ssid int, name varchar, rules jsonb
When I try to insert the data, it throws an error saying column "none" does not exist. How can I load rows that contain None so that they end up as NULL in the table?
I looked at this solution, but it does not help in this case: How to insert 'NULL' values into PostgreSQL database using Python?

As @shmee states, you need to use something like executemany and parameterize your values instead of using format, which is vulnerable to SQL injection.
I would modify your code as follows:
def insert_sample_data(self, values):  # added self since you are referencing it below
    with self.con.cursor() as cur:
        sql = "insert into sampletable values (%s, %s, %s)"  # use %s for parameters
        cur.executemany(sql, values)  # pass the list of tuples directly
    self.con.commit()

list1 = [(1100, 'abc', '{"1209": "Y", "1210": "Y"}'), (1100, 'abc', None)]
self.insert_sample_data(list1)  # pass the list directly
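Note that when a parameter is None, psycopg2 transmits it as SQL NULL, which is exactly what the question needs. For large lists, executemany issues one statement per tuple and can be slow; psycopg2 (2.7+) also offers execute_values in psycopg2.extras, which folds everything into a single statement. A minimal sketch of that variant, reusing the table and connection object from the question:

from psycopg2.extras import execute_values

def insert_sample_data(self, values):
    with self.con.cursor() as cur:
        # the single %s is expanded by execute_values into a VALUES list;
        # None entries become SQL NULL automatically
        execute_values(cur, "insert into sampletable values %s", values)
    self.con.commit()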

Related

psycopg2 prepared delete statement

I am struggling to generate a delete query where the parameters are a set of value pairs.
I need to delete rows matched on a pair of values, for example:
delete from table where col1 = %s and col2 = %s
which can be executed in Python like:
cur = conn.cursor()
cur.execute(query, (col1_value, col2_value))
Now I would like to run a query:
delete from table where (col1, col2) in ( (col1_value1, col2_value1), (col1_value2, col2_value2) );
I can generate the full SQL with the values inlined and execute that, but I can't quite get a parameterized statement to work.
I tried:
delete from table where (col1, col2) in %s
and
delete from table where (col1, col2) in (%s)
But when I try to execute:
cur.execute(query, list_of_col_value_tuples)
or
cur.execute(query, tuple_of_col_value_tuples)
I get an exception that indicates that psycopg2 cannot convert arguments to strings.
Is there any way to use psycopg2 to execute a query like this?
You could dynamically add %s placeholders to your query:
cur = con.cursor()
query = "delete from table where (role, username) in (%s)"
options = [('admin', 'foo'), ('user', 'bar')]
placeholders = '%s,' * len(options)
query = query % placeholders[:-1] # remove last comma
print(query)
print(cur.mogrify(query, options).decode('utf-8'))
Out:
delete from table where (role, username) in (%s,%s)
delete from table where (role, username) in (('admin', 'foo'),('user', 'bar'))
Alternatively, build the query with the psycopg2.sql module, as answered there; a sketch of that approach follows.
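For reference, a minimal sketch of the psycopg2.sql variant. The identifier is composed safely and the values are still passed as bound parameters (the table name here is hypothetical; cur is an open cursor):

from psycopg2 import sql

options = [('admin', 'foo'), ('user', 'bar')]
query = sql.SQL("delete from {} where (role, username) in ({})").format(
    sql.Identifier('my_table'),  # hypothetical table name
    sql.SQL(', ').join(sql.Placeholder() * len(options)),  # one %s per pair
)
cur.execute(query, options)  # each tuple is adapted to a (value, value) row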
Actually the resolution is quite easy if carefully constructed.
Among the miscellaneous goodies of psycopg2 there is the function execute_values.
All of the examples psycopg2 gives for it deal with inserts, but the function simply converts the argument list into a VALUES list, so it also works for a delete if the query is formatted like so:
qry = "delete from table where (col1, col2) in (%s)"
The call:
execute_values(cur, qry, argslist=<list of value tuples>)
will make the delete perform exactly as required.
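Put together, a minimal runnable sketch (table and column names are illustrative; con is an open connection):

from psycopg2.extras import execute_values

pairs = [('admin', 'foo'), ('user', 'bar')]  # illustrative value tuples
qry = "delete from table where (col1, col2) in (%s)"
with con.cursor() as cur:
    # %s is expanded to ('admin', 'foo'), ('user', 'bar')
    execute_values(cur, qry, pairs)
con.commit()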

(TypeError: Can't convert 'int' object to str implicitly) when pushing data into a table using Python

Age and phone_num are int values; the rest are all strings. When I try to push this into the DB using the code below, I get the error above:
insert_query = "insert into employee.details (name,emp_id,age,contact,address) values('"+name+"','"+emp_id+"',"+age+","+phone_num+",'"+address+"')"
cursor = connection.cursor()
result = cursor.execute(insert_query)
print("Table updated successfully ")
I think you were getting this error because Python cannot concatenate integers and strings unless the integers are explicitly converted using str().
I assume you are using sqlite3? If so, here is the proper syntax for the query:
insert_query = """INSERT INTO employee.details (name, emp_id, age, contact, address) VALUES (?, ?, ?, ?, ?)"""
cur = conn.cursor()
cur.execute(insert_query, (name, emp_id, age, phone_num, address))
one_row = cur.fetchone() # This will only get one row of the data
all_data = cur.fetchall() # This will get all rows of data in a list of tuples
conn.commit()
conn.close() # only if this is last db change
Passing the values as a tuple lets the driver escape strings and prevents SQL injection. It also binds the integers directly, so the implicit int-to-str conversion that caused your error never happens.
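Note that the ? placeholder style is specific to qmark-style drivers such as sqlite3; psycopg2 and pymysql use %s instead, still as a bound parameter rather than string formatting. A sketch of the same insert under that assumption:

insert_query = """INSERT INTO employee.details (name, emp_id, age, contact, address)
                  VALUES (%s, %s, %s, %s, %s)"""
cursor = connection.cursor()
cursor.execute(insert_query, (name, emp_id, age, phone_num, address))
connection.commit()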

How to insert a dictionary into a PostgreSQL table with psycopg2

How do I insert a Python dictionary into a PostgreSQL table? I keep getting the following error, so my query is not formatted correctly:
Error syntax error at or near "To" LINE 1: INSERT INTO bill_summary VALUES(To designate the facility of...
import psycopg2
import json
import psycopg2.extras
import sys

with open('data.json', 'r') as f:
    data = json.load(f)

con = None
try:
    con = psycopg2.connect(database='sanctionsdb', user='dbuser')
    cur = con.cursor(cursor_factory=psycopg2.extras.DictCursor)
    cur.execute("CREATE TABLE bill_summary(title VARCHAR PRIMARY KEY, summary_text VARCHAR, action_date VARCHAR, action_desc VARCHAR)")
    for d in data:
        title = d['title']
        summary_text = d['summary-text']
        action_date = d['action-date']
        action_desc = d['action-desc']
        q = "INSERT INTO bill_summary VALUES(" + str(title) + str(summary_text) + str(action_date) + str(action_desc) + ")"
        cur.execute(q)
    con.commit()
except psycopg2.DatabaseError, e:
    if con:
        con.rollback()
    print 'Error %s' % e
    sys.exit(1)
finally:
    if con:
        con.close()
You should use the dictionary as the second parameter to cursor.execute(). See the example code after this statement in the documentation:
Named arguments are supported too using %(name)s placeholders in the query and specifying the values into a mapping.
So your code may be as simple as this:
with open('data.json', 'r') as f:
    data = json.load(f)
    print(data)
    """ above prints something like this:
    {'title': 'the first action', 'summary-text': 'some summary', 'action-date': '2018-08-08', 'action-desc': 'action description'}
    use the json keys as named parameters:
    """
    cur = con.cursor()
    q = "INSERT INTO bill_summary VALUES(%(title)s, %(summary-text)s, %(action-date)s, %(action-desc)s)"
    cur.execute(q, data)
    con.commit()
Note also this warning (from the same page of the documentation):
Warning: Never, never, NEVER use Python string concatenation (+) or string parameters interpolation (%) to pass variables to a SQL query string. Not even at gunpoint.
q = "INSERT INTO bill_summary VALUES(" +str(title)+str(summary_text)+str(action_date)+str(action_desc)+")"
You're writing the query the wrong way by concatenating the values directly; they should instead be comma-separated elements, like this:
q = "INSERT INTO bill_summary VALUES({0},{1},{2},{3})".format(str(title), str(summary_text), str(action_date), str(action_desc))
Since you're not specifying the column names, I assume they are in the same order as the values in your insert query. There are basically two ways of writing an insert query in PostgreSQL. One is by specifying the column names and their corresponding values, like this:
INSERT INTO TABLE_NAME (column1, column2, column3,...columnN)
VALUES (value1, value2, value3,...valueN);
The other way is to omit the column names, which you may do if you are adding values for all the columns of the table. However, make sure the order of the values matches the order of the columns in the table. This is the form you have used in your query:
INSERT INTO TABLE_NAME VALUES (value1,value2,value3,...valueN);
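That said, per the documentation warning quoted above, the same insert is better expressed with bound parameters than with format, so the driver escapes the values for you; a minimal sketch:

q = "INSERT INTO bill_summary VALUES (%s, %s, %s, %s)"
cur.execute(q, (title, summary_text, action_date, action_desc))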

Query format with f-string

I have a very big dictionary that I want to insert into a MySQL table. The dictionary keys are the column names in the table. I'm constructing my query like this as of now:
bigd = {'k1':'v1', 'k2':10}
cols = str(bigd.keys()).strip('[]')
vals = str(bigd.values()).strip('[]')
query = "INSERT INTO table ({}) values ({})".format(cols,vals)
print query
Output:
"INSERT INTO table ('k2', 'k1') values (10, 'v1')"
And this works in Python 2.7.
But in Python 3.6 if I use string literals like this:
query = f"INSERT INTO table ({cols}) values ({vals})"
print(query)
It prints this:
"INSERT INTO table (dict_keys(['k1', 'k2'])) values (dict_values(['v1', 10]))"
Any tips?
For your curiosity: in Python 3, dict.keys() and dict.values() return view objects, so casting them to str gives the dict_keys(...) / dict_values(...) representation that ends up in your f-string.
You could just cast to tuples and then insert:
cols = tuple(bigd.keys())
vals = tuple(bigd.values())
q = f"INSERT INTO table {cols} values {vals}"
but, as the comment notes, this isn't a safe approach.
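A safer pattern, assuming a DB-API driver that uses %s placeholders (e.g. pymysql or MySQLdb): interpolate only the column names, which cannot be bound as parameters and therefore must come from a trusted source, and let the driver bind the values:

bigd = {'k1': 'v1', 'k2': 10}
cols = ', '.join(bigd.keys())                 # column names: trusted input only
placeholders = ', '.join(['%s'] * len(bigd))  # one bound parameter per column
query = f"INSERT INTO table ({cols}) VALUES ({placeholders})"
cursor.execute(query, tuple(bigd.values()))   # values are escaped by the driver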

pymysql: How to format types on query?

I'm trying to insert rows into a MySQL table using pymysql (Python 3); the relevant code is the following.
def saveLogs(DbConnection, tableName, results):
    for row in results:
        formatStrings = ",".join(["?"] * len(row))
        sql = "INSERT INTO %s VALUES (%s);" % (tableName, formatStrings)
        DbConnection.cursor().execute(sql, tuple(row))
    DbConnection.commit()
I'm using "?" for the types, but I get the error not all arguments converted during string formatting. row is a list composed of strings, ints and datetime.datetime. I guess the issue is the "?" but I have checked the PEP 249 and it's still not clear to me how should I do it. Any suggestions?
Use string formatting for the table name only (though make sure you trust the source or have a proper validation in place). For everything else, use query parameters:
def saveLogs(DbConnection, tableName, results):
    cursor = DbConnection.cursor()
    sql = "INSERT INTO {0} VALUES (%s, %s, %s)".format(tableName)
    for row in results:
        cursor.execute(sql, row)
    DbConnection.commit()
There is also the executemany() method:
def saveLogs(DbConnection, tableName, results):
    cursor = DbConnection.cursor()
    cursor.executemany("INSERT INTO {0} VALUES (%s, %s, %s)".format(tableName), results)
    DbConnection.commit()
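Both versions hard-code three %s placeholders; if your rows have a different (but uniform) width, the placeholder list can be built from the first row instead. A sketch, assuming tableName is trusted:

def saveLogs(DbConnection, tableName, results):
    if not results:
        return
    placeholders = ", ".join(["%s"] * len(results[0]))  # one %s per column
    sql = "INSERT INTO {0} VALUES ({1});".format(tableName, placeholders)
    with DbConnection.cursor() as cursor:
        cursor.executemany(sql, [tuple(row) for row in results])
    DbConnection.commit()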
