Python: error in SQL syntax while writing JSON to a MySQL database

I want to receive json data from MQTT and store it in my database.
When I execute my code I receive this error:
mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '%s)' at line 1
Here is my code:
import mysql.connector
import json

mydb = mysql.connector.connect(
    host="localhost",
    user="***",
    passwd="***",
    database="database"
)
mycursor = mydb.cursor()

def DATA_REPARTITION(Topic, jsonData):
    if Topic == "test":
        #print ("Start")
        INSERT_DEBIT(jsonData)

def INSERT_DEBIT(jsonData):
    #Read json from MQTT
    print("Start read data to insert")
    json_Dict = json.loads(jsonData)
    debit = json_Dict['debit']

    #Insert into DB Table
    sql = ("INSERT INTO debit (data_debit) VALUES (%s)")
    val=(debit)
    mycursor.execute(sql,val)
    mydb.commit()
    print(mycursor.rowcount, "record inserted.")

    mycursor.close()
    mydb.close()
Thanks for your help; I've been working on this problem for the last 2 days.

You've written your parameterized query properly for MySQL:
sql = ("INSERT INTO debit (data_debit) VALUES (%s)")
The problem is in how you're passing the arguments:
val=(debit)
mycursor.execute(sql,val)
The parentheses don't make debit into a tuple of 1 value. They don't do anything at all; val is just the same value as debit.
But execute wants a sequence of separate values, not 1 value.
To fix this, you need to add a comma. Commas are what create tuples in Python:
val = debit,
If you're wondering why this raises a SQL error, instead of a Python error about val not being iterable… Most likely, val is a string. And strings are iterable. They just iterate their characters. If val is, say, '$100', then you're passing the arguments '$', '1', '0', and '0', to fit a parameterized query with only one parameter.
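For completeness, here is a minimal sketch of the fix applied to the INSERT_DEBIT function from the question (the names, table, and connection objects are the ones defined above; writing the tuple as (debit,) is just a more explicit spelling of the trailing comma):

def INSERT_DEBIT(jsonData):
    # Parse the JSON payload received over MQTT
    json_Dict = json.loads(jsonData)
    debit = json_Dict['debit']

    # One %s placeholder, so pass a one-element tuple of parameters
    sql = "INSERT INTO debit (data_debit) VALUES (%s)"
    val = (debit,)          # the trailing comma is what makes this a tuple
    mycursor.execute(sql, val)
    mydb.commit()
    print(mycursor.rowcount, "record inserted.")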

Related

pymysql.err.OperationalError: 1054, "Unknown column 'X' in 'where clause'"

I'm getting the following error in console, where the column name actually is the value passed through the query:
pymysql.err.OperationalError: (1054, "Unknown column 'LiqNac83437' in 'where clause'")
This is my function:
sql = f"""
SELECT
detallev.clave,
detallev.cantidad,
venta.fecha
FROM
detallev
INNER JOIN
venta ON detallev.ven_id = venta.ven_id
WHERE
clave = '{codigoBarras}'
AND (fecha BETWEEN {fecha_inicio} AND {fecha_final});"""
print(sql)
with bd:
with bd.cursor() as cursor:
cursor.execute(sql)
resultado = cursor.fetchall()
cursor.close()
which is called by:
@app.get('/{sucursal}/reporte/articulos/')
def reporte_articulo(sucursal: Origenes, clave: str = '', fecha_inicial: str = '', fecha_final: str = fechaHoy(), username: str = Depends(valida_usuario)):
    return reporte_articulos.reporte_articulo(sucursal, clave, fecha_inicial, fecha_final)
I'm using FastAPI, Python, and MySQL.
I've already tried following these solutions with no luck:
Solution 1
Solution 2
and several other solutions outside Stack Overflow; I've already tried wrapping the concatenated value in different ways.
When I run this query directly in MySQL Workbench it works perfectly; it only fails when called from the API.
When the value passed to the function is only numbers, such as "47839234215", instead of numbers and letters like "LiqNac83437", it works as expected.
This happens because you are substituting the values yourself, and in this case you have not properly quoted the interpolated values. The database sees LiqNac83437 and thinks it is a column name, because it is not quoted.
For this reason, and to avoid SQL injection problems, you should let the database connector do the quoting:
sql = """
SELECT
detallev.clave,
detallev.cantidad,
venta.fecha
FROM
detallev
INNER JOIN
venta ON detallev.ven_id = venta.ven_id
WHERE
clave = ?
AND fecha BETWEEN ? AND ?;"""
with bd.cursor() as cursor:
cursor.execute(sql, (codigoBarras, fecha_inicio, fecha_final))
resultado = cursor.fetchall()
cursor.close()

Safely Inserting Strings Into a SQLite3 UNION Query Using Python

I'm aware that the best way to prevent SQL injection is to write Python queries of this form (or similar):
query = 'SELECT %s %s from TABLE'
fields = ['ID', 'NAME']
cur.execute(query, fields)
The above will work for a single query, but what if we want to do a UNION of 2 SQL commands? I've set this up via sqlite3 for the sake of repeatability, though technically I'm using pymysql. It looks as follows:
import sqlite3

conn = sqlite3.connect('dummy.db')
cur = conn.cursor()

query = 'CREATE TABLE DUMMY(ID int AUTO INCREMENT, VALUE varchar(255))'
query2 = 'CREATE TABLE DUMMY2(ID int AUTO INCREMENT, VALUE varchar(255))'
try:
    cur.execute(query)
    cur.execute(query2)
except:
    print('Already made table!')

tnames = ['DUMMY', 'DUMMY2']
sqlcmds = []
for i in range(0, 2):
    query = 'SELECT %s FROM {}'.format(tnames[i])
    sqlcmds.append(query)

valid_fields = ['VALUE', 'VALUE']
sqlcmd = ' UNION '.join(sqlcmds)
cur.execute(sqlcmd, valid_fields)
When I run this, I get a sqlite Operational Error:
sqlite3.OperationalError: near "%": syntax error
I've validated the query prints as expected with this output:
INSERT INTO DUMMY VALUES(%s) UNION INSERT INTO DUMMY VALUES(%s)
All looks good there. What is the issue with the string substitutions here? I can confirm that running a query with direct string substitution works fine. I've tried it with both selects and inserts.
EDIT: I'm aware there are multiple ways to do this with executemany and a few others. I need to do this with UNION for my purposes, because this is a very, very simplified example of the operational code I'm using.
The code below executes a few INSERTs at once:
import sqlite3

conn = sqlite3.connect('dummy.db')
cur = conn.cursor()

query = 'CREATE TABLE DUMMY(ID int AUTO INCREMENT NOT NULL, VALUE varchar(255))'
try:
    cur.execute(query)
except:
    print('Already made table!')

valid_fields = [('ya dummy',), ('stupid test example',)]
cur.executemany('INSERT INTO DUMMY (VALUE) VALUES (?)', valid_fields)
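Since the question is specifically about a UNION, here is a minimal self-contained sketch of the usual split (the in-memory tables and the whitelist are invented for the example): identifiers such as table names cannot be bound as parameters, so they are validated against a fixed whitelist and formatted in, while the values go through placeholders (? for sqlite3, %s for pymysql).

import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE DUMMY (ID int, VALUE varchar(255))')
cur.execute('CREATE TABLE DUMMY2 (ID int, VALUE varchar(255))')
cur.execute("INSERT INTO DUMMY (ID, VALUE) VALUES (1, 'ya dummy')")
cur.execute("INSERT INTO DUMMY2 (ID, VALUE) VALUES (2, 'stupid test example')")

# Identifiers are checked against a whitelist before being formatted in
allowed = {'DUMMY', 'DUMMY2'}
tnames = ['DUMMY', 'DUMMY2']
if not set(tnames) <= allowed:
    raise ValueError('unexpected table name')

# Values, by contrast, are bound through placeholders across the UNION
sqlcmd = ' UNION '.join(
    'SELECT VALUE FROM {} WHERE ID = ?'.format(t) for t in tnames)
cur.execute(sqlcmd, (1, 2))   # one bound value per ? placeholder
print(cur.fetchall())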

Not all parameters were used in the SQL statement when using Python and MySQL

Hi, I am working with Python and MySQL on this project. I initialize the database and try to create the table record, but it seems I cannot load data into the table. Can anyone here help me out with this?
import mysql.connector
mydb = mysql.connector.connect( host="localhost",user="root",password="asd619248636",database="mydatabase")
mycursor = mydb.cursor()
mycursor.execute("CREATE TABLE record (temperature FLOAT(20), humidity FLOAT(20))")
sql = "INSERT INTO record (temperature,humidity) VALUES (%d, %d)"
val = (2.3,4.5)
mycursor.execute(sql,val)
mydb.commit()
print(mycursor.rowcount, "record inserted.")
and the error shows "Not all parameters were used in the SQL statement":
mysql.connector.errors.ProgrammingError: Not all parameters were used in the SQL statement
Changing the following should fix your problem:
sql = "INSERT INTO record (temperature,humidity) VALUES (%s, %s)"
val = ("2.3","4.5") # You can also use (2.3, 4.5)
mycursor.execute(sql,val)
The connector uses %s as the placeholder regardless of the Python data type and converts the values itself. Your code is throwing an error because the connector doesn't recognize printf-style codes like %d or %f as parameter markers.
For more info on this you can look here
Simply change the insert statement to
sql = "INSERT INTO record (temperature,humidity) VALUES (%s, %s)"
then it works fine
This works for me.
# Insert from dataframe to table in SQL Server
import time
import pandas as pd
import pyodbc
from sqlalchemy import create_engine

# create timer
start_time = time.time()

df = pd.read_csv("C:\\your_path_here\\CSV1.csv")

conn_str = (
    r'DRIVER={SQL Server Native Client 11.0};'
    r'SERVER=Excel-Your_Server_Name;'
    r'DATABASE=NORTHWND;'
    r'Trusted_Connection=yes;'
)
cnxn = pyodbc.connect(conn_str)
cursor = cnxn.cursor()

for index, row in df.iterrows():
    cursor.execute('INSERT INTO dbo.Table_1([Name],[Address],[Age],[Work]) values (?,?,?,?)',
                   row['Name'],
                   row['Address'],
                   row['Age'],
                   row['Work'])
cnxn.commit()
cursor.close()
cnxn.close()
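If the dataframe is large, a batched variant may be worth trying; this is a sketch assuming the same df, cnxn, and table as above, using pyodbc's executemany with its fast_executemany option.

# Batched version of the same insert: one executemany() call instead of
# one execute() per row; fast_executemany sends the parameters in bulk.
cursor = cnxn.cursor()
cursor.fast_executemany = True
cursor.executemany(
    'INSERT INTO dbo.Table_1([Name],[Address],[Age],[Work]) values (?,?,?,?)',
    df[['Name', 'Address', 'Age', 'Work']].values.tolist())
cnxn.commit()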

How to store a Python dictionary in a MySQL DB through Python

I am trying to store the following dictionary into a MySQL DB by converting the dictionary into a string and then trying to insert it, but I am getting the following error. How can this be solved, or is there any other way to store a dictionary in a MySQL DB?
dic = {'office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0']}}
d = str(dic)
# Sql query
sql = "INSERT INTO ep_soft(ip_address, soft_data) VALUES ('%s', '%s')" % ("192.xxx.xx.xx", d )
soft_data is a VARCHAR(500)
Error:
execution exception (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0' at line 1")
Any suggestions or help please?
First of all, don't ever construct raw SQL queries like that. Never ever. This is what parameterized queries are for. You're asking for a SQL injection attack.
If you want to store arbitrary data, such as Python dictionaries, you should serialize that data. JSON would be a good choice for the format.
Overall your code should look like this:
import MySQLdb
import json

db = MySQLdb.connect(...)
cursor = db.cursor()

dic = {'office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0']}}
sql = "INSERT INTO ep_soft(ip_address, soft_data) VALUES (%s, %s)"

cursor.execute(sql, ("192.xxx.xx.xx", json.dumps(dic)))
db.commit()   # commit on the connection, not the cursor
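To get the dictionary back out, a short sketch reusing the same connection and table (the WHERE clause is only for illustration):

cursor.execute("SELECT soft_data FROM ep_soft WHERE ip_address = %s",
               ("192.xxx.xx.xx",))
row = cursor.fetchone()
restored = json.loads(row[0])   # the JSON string becomes a Python dict again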
Change your code as below:
dic = {'office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0']}}
d = str(dic)
# Sql query
sql = """INSERT INTO ep_soft(ip_address, soft_data) VALUES (%r, %r)""" % ("192.xxx.xx.xx", d )
Try this:
dic = {'office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0']}}
"INSERT INTO `db`.`table`(`ip_address`, `soft_data`) VALUES ('{}', '{}')".format("192.xxx.xx.xx", str(dic))
Change db and table to the values you need.
It is a good idea to sanitize your inputs, and '.format' is useful when you need to use the same variable multiple times within a query. (Not that you need to for this example.)
dic = {'office': {'component_office': ['Word2010SP0', 'PowerPoint2010SP0']}}
ip = '192.xxx.xx.xx'

with conn.cursor() as cur:
    cur.execute("INSERT INTO `ep_soft`(`ip_address`, `soft_data`) VALUES ({0}, '{1}')".format(cur.escape(ip), json.dumps(dic)))
    conn.commit()
If you do not use cur.escape(variable), you will need to enclose the placeholder {} in quotes.
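For comparison, here is a fully parameterized sketch of the same insert (assuming a pymysql-style connection named conn and the variables above), which sidesteps both the manual escaping and the quoting question entirely:

import json

with conn.cursor() as cur:
    # Both values are bound through %s placeholders; the driver does the quoting
    cur.execute("INSERT INTO `ep_soft`(`ip_address`, `soft_data`) VALUES (%s, %s)",
                (ip, json.dumps(dic)))
conn.commit()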
This answer has some pseudo-code regarding the connection object, and the flavor of MySQL is MemSQL, but other than that it should be straightforward to follow.
import json
#... do something
a_big_dict = getAHugeDict() #build a huge python dict
conn = getMeAConnection(...)
serialized_dict = json.dumps(a_big_dict) #serialize dict to string
#Something like this to hold the serialization...
qry_create = """
CREATE TABLE TABLE_OF_BIG_DICTS (
ROWID BIGINT NOT NULL AUTO_INCREMENT,
SERIALIZED_DICT BLOB NOT NULL,
UPLOAD_DT TIMESTAMP NULL DEFAULT CURRENT_TIMESTAMP,
KEY (`ROWID`) USING CLUSTERED COLUMNSTORE
);
"""
conn.execute(qry_create)
#Something like this to hold em'
qry_insert = """
INSERT INTO TABLE_OF_BIG_DICTS (SERIALIZED_DICT)
SELECT '{SERIALIZED_DICT}' as SERIALIZED_DICT;
"""
#Send it to db
conn.execute(qry_insert.format(SERIALIZED_DICT=serialized_dict))
#grab the latest
qry_read = """
SELECT a.SERIALIZED_DICT
from TABLE_OF_BIG_DICTS a
JOIN
(
SELECT MAX(UPLOAD_DT) AS MAX_UPLOAD_DT
FROM TABLE_OF_BIG_DICTS
) b
ON a.UPLOAD_DT = b.MAX_UPLOAD_DT
LIMIT 1
"""
#something like this to read the latest dict...
df_dict = conn.sql_to_dataframe(qry_read)
dict_str = df_dict.iloc[df_dict.index.min()][0]
#dicts never die they just get rebuilt
dict_better = json.loads(dict_str)

mysql-connector-python cannot work with GBK string "赵孟頫"

Look at this code in a Python shell:
>>> s = u'赵孟頫'.encode('gbk')
>>> s
'\xd5\xd4\xc3\xcf\xee\\'
The last byte of the GBK encoding of '赵孟頫' is \x5c, the same as a backslash, and it causes a SQL error.
mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''?????\\')' at line 4
Here is my code:
# db is a mysql.connector connection object
sql = '''
    INSERT INTO scraped_products(
        site_prd_id, site_id, brand)
    VALUES(
        %(site_prd_id)s, %(site_id)s, %(brand)s)
'''
dat = {
    'site_prd_id': 'test',
    'site_id': 1,
    'brand': u'赵孟頫'.encode('gbk'),
}
self.db.ping(True, 3, 1)
self.db.cursor().execute(sql, dat)
I have a solution, though it needs some extra work to get it fully working. The following code example converts the data into a MySQL hexadecimal literal and sends it to MySQL without escaping, quoting, or converting it. It's a bit of a different way of executing queries, but I hope it will serve for now:
import mysql.connector

cnx = mysql.connector.connect(database='test', user='root',
                              charset='gbk', use_unicode=False)
cur = cnx.cursor()

cur.execute("DROP TABLE IF EXISTS gbktest")
table = (
    "CREATE TABLE gbktest ("
    "id INT AUTO_INCREMENT KEY, "
    "c1 VARCHAR(40)"
    ") CHARACTER SET 'gbk'"
)
cur.execute(table)

def gbk_to_hexstr(value):
    """Convert value to a Hexadecimal Literal for MySQL"""
    return "0x{0}".format(''.join(
        ["{0:x}".format(ord(c)) for c in value.encode('gbk')]))

# Convert your Unicode data using gbk_to_hexstr
data = {
    'c1': gbk_to_hexstr(u'赵孟頫'),
}

# Use MySQLCursor.execute() _not_ passing data as second argument
cur.execute("INSERT INTO gbktest (c1) VALUES ({c1})".format(**data))

cur.execute("SELECT c1 FROM gbktest")

# Print the inserted data as Unicode
for row in cur:
    print(row[0].decode('gbk').encode('utf8'))
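As a quick sanity check, based on the bytes shown in the question ('\xd5\xd4\xc3\xcf\xee\x5c'), the helper produces a literal with no backslash in it at all:

>>> gbk_to_hexstr(u'赵孟頫')
'0xd5d4c3cfee5c'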
