Saving to database to prevent SQL injection in Python

I have a script that pulls data from a weather API and saves the info into a MySQL database on localhost. I want the UPDATE script to prevent any SQL injection, but the following doesn't seem to run the UPDATE at all. There isn't an error; the query just doesn't appear to have been executed when I check the database.
Can anyone suggest the problem? I am using the mysql.connector package.
def save_to_database(self, uid):
    sql = "UPDATE weather_data " \
          "SET temperature=%s, temperature_feels=%s, humidity=%s, precipitation=%s, weather_status=%s " \
          "WHERE UID =%s"

    temperature = self.weather_data['temperature']
    temperature_feels = self.weather_data['temperature_feels']
    humidity = self.weather_data['humidity']
    precipitation = self.weather_data['precipitation']
    weather_status = self.weather_data['type']

    print(sql)

    c = self._db.cursor()
    c.execute(sql, (temperature, temperature_feels, humidity, precipitation, weather_status, uid))
UPDATE:
The following works fine, but isn't 'safe':
def save_weather_forecast(self, uid):
    print(self.weather_data)
    sql = "UPDATE weather_data SET temperature = " + str(self.weather_data['temperature']) + ", " \
          + "temperature_feels = " + str(self.weather_data['temperature_feels']) + ", " \
          + "humidity = " + str(self.weather_data['humidity']) + ", " \
          + "weather_status = '" + str(self.weather_data['type']) + "', " \
          + "precipitation = " + str(self.weather_data['precipitation']) \
          + " WHERE UID = '" + str(uid) + "'"

    print(sql)

    c = self._db.cursor()
    c.execute(sql)
    c.close()

The Python DB API specification turns auto-commit off by default, which means you have to commit any transaction manually, otherwise it never takes effect in the database.
Committing is done on the connection object, so you need to add:
self._db.commit()
after the c.execute() line.
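Put together, a minimal sketch of the parameterized version with the missing commit (assuming self._db is an open mysql.connector connection, as in the question):

def save_to_database(self, uid):
    # Placeholders keep the values out of the SQL string, so the driver escapes them.
    sql = ("UPDATE weather_data "
           "SET temperature=%s, temperature_feels=%s, humidity=%s, "
           "precipitation=%s, weather_status=%s "
           "WHERE UID=%s")

    params = (
        self.weather_data['temperature'],
        self.weather_data['temperature_feels'],
        self.weather_data['humidity'],
        self.weather_data['precipitation'],
        self.weather_data['type'],
        uid,
    )

    c = self._db.cursor()
    try:
        c.execute(sql, params)
        self._db.commit()  # without this, the UPDATE never becomes visible in the database
    finally:
        c.close()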

Related

python multiple sql execution by using insert or update

I get the following error when executing SQL in Python:
TypeError: not enough arguments for format string
I used the executemany function to execute multiple SQL statements at once without writing a for loop.
I want the SQL statement to use the list in "idx".
fruit_idx_list = ['123', '456', '789']
fruit_list = [('123', 'apple', 70, 7), ('456', 'strawberry', 60, 6), ('789', 'banana', 100, 10)]

sql = "MERGE INTO fruit_test " \
      + "USING DUAL " \
      + "ON idx = {idx}".format(idx=fruit_idx_list) \
      + "WHEN NOT MATCHED THEN " \
      + "INSERT (idx, name, price, vat) VALUES (%s, %s, %s, %s) " \
      + "WHEN MATCHED THEN " \
      + "UPDATE SET idx = %s name = %s, price = %s vat = %s; "
# the suspicious part 1: the ON idx = {idx} line, where the whole list is formatted in
# the suspicious part 2: the UPDATE SET line

cur.executemany(sql, fruit_list)
conn.commit()
I think there are two suspicious parts that could be causing the SQL error.
One is the part that passes fruit_idx_list into the SQL via format(); the other is the UPDATE statement part.
Please help me figure it out. Thanks in advance.
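There is no accepted answer quoted here, but as a hedged sketch of one way to avoid mixing format() with %s placeholders (assuming the driver uses the %s paramstyle as in the snippet, and that each row should match on its own idx), idx can become a bound parameter like the rest, with every value supplied per row:

# Sketch only: all eight placeholders are filled from the row tuple, so the
# placeholder count matches the values and nothing is string-formatted into the SQL.
sql = ("MERGE INTO fruit_test "
       "USING DUAL "
       "ON (idx = %s) "
       "WHEN NOT MATCHED THEN "
       "  INSERT (idx, name, price, vat) VALUES (%s, %s, %s, %s) "
       "WHEN MATCHED THEN "
       "  UPDATE SET name = %s, price = %s, vat = %s")

# One tuple per row, with values repeated in the order the placeholders appear.
rows = [(idx, idx, name, price, vat, name, price, vat)
        for idx, name, price, vat in fruit_list]

cur.executemany(sql, rows)
conn.commit()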

MySQL crashes during data transfer from large csv using LOAD DATA from python

I have a large CSV file of 30 million rows (1.6 GB), and I am using pymysql to load the data from the CSV into MySQL tables.
I have removed all constraints from the table schema to make the load faster, and have also set the timeout values to large values.
def setTimeOutLimit(connection):
    try:
        with connection.cursor() as cursor:
            query = "SET GLOBAL innodb_lock_wait_timeout = 28800"
            cursor.execute(query)

            query2 = "SET innodb_lock_wait_timeout = 28800"
            cursor.execute(query2)

            query3 = "SET GLOBAL connect_timeout = 28800"
            cursor.execute(query3)

            query4 = "SET GLOBAL wait_timeout = 28800"
            cursor.execute(query4)

            query5 = "SET GLOBAL interactive_timeout = 28800"
            cursor.execute(query5)

            query6 = "SET GLOBAL max_allowed_packet = 1073741824"
            cursor.execute(query6)

    except:
        conn.close()
        sys.exit(" Could not set timeout limit ")
The data gets inserted into the table, but I need to make one of the columns a primary key, so I am creating another table that makes that column the primary index while ignoring duplicate values (tableName_1 is the old table, tableName is the new table).
def createNewTableFromOld(connection, tableName):
    try:
        pprint(" Creating new table from old table with constraints ")
        with connection.cursor() as cursor:
            query = (" CREATE TABLE " + tableName +
                     " LIKE " + tableName + "_1")
            cursor.execute(query)

            query2 = (" ALTER TABLE " + tableName +
                      " ADD PRIMARY KEY(TimeStamp) ")
            cursor.execute(query2)

            query3 = (" INSERT IGNORE INTO " + tableName +
                      " SELECT * FROM " + tableName + "_1")
            cursor.execute(query3)

            query4 = ("DROP TABLE " + tableName + "_1")
            cursor.execute(query4)

        connection.commit()

    except:
        conn.close()
        sys.exit(" Could not create table with Primary Key ")
During this method's execution, somewhere after 5-6 minutes, I get this error:
pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query ([WinError 10054] An existing connection was forcibly closed by the remote host)')
And when I check the services, MYSQL80 has automatically crashed and stopped. I have also set max_allowed_packet_size to 1 GB in the my.ini file, and all timeouts are manually set to 8 hours. What could be the issue?
The original table schema is:
query = ("CREATE TABLE IF NOT EXISTS " + table + " ("
" TimeStamp DECIMAL(15, 3), " +
" Value DECIMAL(30, 11), " +
" Quality INT, " +
" TagName varchar(30) )"
)
I finally solved the issue by setting innodb_buffer_pool_size in the my.ini file to 2 GB; earlier it was only 4 MB.
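A quick sketch of how one might confirm the change took effect after editing my.ini and restarting the MySQL80 service (the connection details below are placeholders, not from the original post):

import pymysql

# Hypothetical connection parameters; substitute your own host/user/password.
connection = pymysql.connect(host="localhost", user="root", password="***")
try:
    with connection.cursor() as cursor:
        cursor.execute("SHOW VARIABLES LIKE 'innodb_buffer_pool_size'")
        name, value = cursor.fetchone()
        # Expect roughly 2048 MB once the new my.ini value is picked up.
        print(name, int(value) // (1024 * 1024), "MB")
finally:
    connection.close()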

optimizing display of a 5MB query result using python and VueJS

I have a web application using Python on the server (bottle) and VueJS on the client.
One frontend component displays a paginated result of a query that returns more than 10k records. The records are built as Python dicts and sent to the frontend. The size of the result is approximately 5.5 MB, which with my internet connection means more than 2 seconds of waiting.
The API path that handles the request looks like this:
@cmCampaigns.get('/api/campaignManager/campaigns')
@authorize()
def get():
    resp = {}
    usr = authlayer.current_user
    user_id = dao.App().getUserID(usr.username)
    resp["campaigns"] = CampaignsData().get_revcontent_campaigns(user_id)
    return resp
and the query itself looks like this:
def get_campaigns(self, user_id):
    query = "SELECT rc.account_id, rc.campaign_id, " \
            "rc.campaign_name, rc.start_date, rc.end_date, " \
            "rc.enabled, rc.default_bid, " \
            "rc.budget, rc.cost, ctr, rc.country_codes, " \
            "'revcontent' AS provider, " \
            "replace(JSON_EXTRACT(ac.account_json,'$.client_id'),'\"','') AS account_name, " \
            "CASE enabled " \
            "WHEN enabled = 'active' THEN 'on' " \
            "WHEN enabled = 'inactive' THEN 'off' " \
            "END AS enabled_val " \
            "FROM prv_campaigns AS rc " \
            "INNER JOIN websites AS ws " \
            "ON rc.website_id = ws.website_id " \
            "INNER JOIN website_users AS wu " \
            "ON wu.website_id = ws.website_id " \
            "INNER JOIN prv_accounts ac " \
            "ON rc.account_id = ac.account_id " \
            "WHERE wu.user_id = %s " \
            "ORDER BY id DESC"
    try:
        data = self.db.query(query, user_id)
        return data
    except Exception as e:
        logging.exception(e.message)
        return -1, e.message
I tried to optimize the query as much as possible, but it is still not enough.
What "best practice" solutions are there for this common scenario? Compressing the dict before sending it? I was thinking about pagination, but then my filtering and sorting logic lives in the client, so a lot of the data would obviously be gone... Any recommendations? Thanks.
For pagination you can use LIMIT and OFFSET.
Optimization? Run EXPLAIN on the query and post the output.
Let's take a quick look:
SELECT rc.account_id,
       rc.campaign_id,
       rc.campaign_name,
       rc.start_date,
       rc.end_date,
       rc.enabled,
       rc.default_bid,
       rc.budget,
       rc.cost,
       ctr,
       rc.country_codes,
       'revcontent' AS provider,
       Replace(Json_extract(ac.account_json, '$.client_id'), '"', '') AS account_name,
       CASE enabled
         WHEN enabled = 'active' THEN 'on'
         WHEN enabled = 'inactive' THEN 'off'
       END AS enabled_val
FROM   prv_campaigns AS rc
       INNER JOIN websites AS ws
               ON rc.website_id = ws.website_id
       INNER JOIN website_users AS wu
               ON wu.website_id = ws.website_id
       INNER JOIN prv_accounts ac
               ON rc.account_id = ac.account_id
WHERE  wu.user_id = %s
ORDER  BY id DESC
The enabled column should be numeric or an ENUM; it will save string-comparison time.
Replace(Json_extract(ac.account_json, '$.client_id'), '"', '')? Extract the account name once and do that work on the client side.
2 inner joins? That might be a design issue.
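As a rough illustration of the LIMIT/OFFSET suggestion, here is a sketch of how the query could accept page parameters. The page and page_size names are made up for the example, the SELECT list is trimmed for brevity, and the exact way parameters are passed depends on the db wrapper used in the question:

def get_campaigns_page(self, user_id, page=1, page_size=500):
    # Same query as above, with LIMIT/OFFSET placeholders appended.
    query = ("SELECT rc.account_id, rc.campaign_id, rc.campaign_name "
             "FROM prv_campaigns AS rc "
             "INNER JOIN websites AS ws ON rc.website_id = ws.website_id "
             "INNER JOIN website_users AS wu ON wu.website_id = ws.website_id "
             "WHERE wu.user_id = %s "
             "ORDER BY rc.campaign_id DESC "
             "LIMIT %s OFFSET %s")
    offset = (page - 1) * page_size
    return self.db.query(query, (user_id, page_size, offset))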

How do I securely parameterize a dynamic python mysql query?

I am wondering how to securely parameterize a dynamic mysql query in python. By dynamic, I mean it changes depending on how the if statements evaluate.
I understand how to parameterize a MySQL query in Python by passing the values as a separate argument rather than interpolating them with the percent sign, as follows.
c.execute("SELECT * FROM foo WHERE bar = %s AND baz = %s", (param1, param2))
Here is an example of a 'dynamic query'. I am looking to find a more secure way than using the percent sign.
def queryPhotos(self, added_from, added, added_to):
    sql = "select * from photos where 1=1 "
    if added_from is not None:
        sql = sql + "and added >= '%s' " % added_from
    if added is not None:
        sql = sql + "and added = '%s' " % added
    if added_to is not None:
        sql = sql + "and added <= '%s' " % added_to
Thank you for your insight.
Thanks to @Nullman I came to an answer.
def queryPhotos(self, added_from, added, added_to):
    vars = []
    sql = "select * from photos where 1=1 "
    if added_from is not None:
        sql = sql + "and added >= %s "
        vars.append(added_from)
    if added is not None:
        sql = sql + "and added = %s "
        vars.append(added)
    if added_to is not None:
        sql = sql + "and added <= %s "
        vars.append(added_to)
    vars = tuple(vars)
    results = c.execute(sql, vars)
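One small follow-up note: with mysql.connector, cursor.execute() itself does not return the rows, so the result set is typically read from the cursor afterwards, roughly like this:

c.execute(sql, vars)
rows = c.fetchall()  # the actual result set; execute() returns None in mysql.connector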

"(u'24000', u'[24000] [Microsoft][ODBC SQL Server Driver]Invalid Cursor State)" in python pypyodbc

I want to execute the procedure and get the output parameters
import pypyodbc
command = "DECLARE @l_Stat INT \r\n" \
          "DECLARE @s_Ms VARCHAR(200) \r\n" \
          "EXEC p_Log_InsertParam_2011v1 " \
          "@l_TesterId=?, " \
          "@l_ObiektId=?, " \
          "@l_RejPId=?, " \
          "@s_ParamName=?, " \
          "@f_ParamValue=?, " \
          "@f_ParamMinValue=?, " \
          "@f_ParamMaxValue=?, " \
          "@b_CheckParam=?, " \
          "@l_Status=@l_Stat output, " \
          "@s_Msg=@s_Ms output \r\n" \
          "SELECT @l_Stat, @s_Ms\r\n"
connection = pypyodbc.connect(self.ConnectionString)
cursor = connection.cursor()

params = (453879185, 23192812, 645872, '/APL/CTRL_GZ/PID/SP', 35.0, 0, 0, True)
# params = (testerid,
#           obiektid,
#           rejpid,
#           paramname,
#           paramvalue,
#           paramminvalue,
#           parammaxvalue,
#           checkparam)

result = cursor.execute(command, params)
pars = result.fetchall()[0]
print pars
if pars.__len__() >= 2:
    l_status = pars[0]
    s_msg = pars[1]
    print "{}: {};".format(l_status, s_msg)
It runs fine up to the line result = cursor.execute(command, params); the procedure executes properly.
The problem appears when I try to fetch the result in the line pars = result.fetchall()[0]:
u'24000', u'[24000] [Microsoft][ODBC SQL Server Driver]Invalid Cursor State'
How can I get rid of this error?
I was able to reproduce your issue when my p_Log_InsertParam_2011v1 stored procedure did not have
SET NOCOUNT ON;
as its first executable statement. When I added it at the beginning of my stored procedure the error went away.
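If the stored procedure itself cannot be modified, a possible workaround (a sketch only, not something from the original answer) is to set NOCOUNT at the start of the submitted batch, since the procedure inherits the session setting unless it overrides it:

# Sketch: prepend SET NOCOUNT ON to the batch so row-count messages from the
# procedure's internal statements do not precede the final SELECT's result set.
command = "SET NOCOUNT ON \r\n" \
          "DECLARE @l_Stat INT \r\n" \
          "DECLARE @s_Ms VARCHAR(200) \r\n" \
          "EXEC p_Log_InsertParam_2011v1 " \
          "@l_TesterId=?, @l_ObiektId=?, @l_RejPId=?, @s_ParamName=?, " \
          "@f_ParamValue=?, @f_ParamMinValue=?, @f_ParamMaxValue=?, @b_CheckParam=?, " \
          "@l_Status=@l_Stat output, @s_Msg=@s_Ms output \r\n" \
          "SELECT @l_Stat, @s_Ms\r\n"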
