How to solve a second-order SQL injection vulnerability? - python

I'm facing a challenge while deploying a backend application built with Python Flask. The codegate scan integrated into the CI/CD process is flagging some of the code as potentially vulnerable. Below are the 2 types of vulnerabilities:
Second order SQL injection.
SQL injection
I'm using raw SQL queries to get the data from the DB, in conjunction with Flask-SQLAlchemy and Pandas.
Below is a sample of the code where codegate reports the Second order SQL injection issue.
def get_user_data(user_id: str):
    query: str = USER_ID_QUERY.format(user_id=user_id)
    result = db.session.execute(query)
    result = result.fetchall()
    if len(result) < 1:
        raise Exception("User not Found")
    return result[0][0]
Query
USER_ID_QUERY = """SELECT USER_ID FROM USER WHERE USER_ID = '{user_id}'"""
Vulnerability description - Method get_user_data at line 236 of
src\utils.py gets database data from the execute element. This
element’s value then flows through the code without being properly
sanitized or validated, and is eventually used in a database query in
method check_access at line 50 of src\service.py. This may enable an
Second-Order SQL Injection attack.
I have tried the solution below after digging through the internet, but the scan still reports the same finding.
Binding the parameters using text and bindparams from SQLAlchemy:
query = text(USER_ID_QUERY).bindparams(user_id=user_id)
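For context, the fuller parameterized version I tried looks roughly like the sketch below. db is the Flask-SQLAlchemy instance from the code above; check_access is the method named in the scanner message, and its query here is only a hypothetical stand-in to illustrate binding the value that was read from the DB:

from sqlalchemy import text

# named placeholders instead of str.format
USER_ID_QUERY = text("SELECT USER_ID FROM USER WHERE USER_ID = :user_id")
# hypothetical query for check_access, shown only for illustration
ACCESS_QUERY = text("SELECT ROLE FROM USER_ACCESS WHERE USER_ID = :user_id")

def get_user_data(user_id: str):
    # the value is bound by the driver, never formatted into the SQL string
    result = db.session.execute(USER_ID_QUERY, {"user_id": user_id}).fetchall()
    if len(result) < 1:
        raise Exception("User not Found")
    return result[0][0]

def check_access(user_id: str):
    # the ID fetched from the DB is also passed as a bound parameter,
    # which is the flow the second-order finding points at
    db_user_id = get_user_data(user_id)
    return db.session.execute(ACCESS_QUERY, {"user_id": db_user_id}).fetchall()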
Could someone please help me understand what is wrong here, or what can be done to resolve these painful issues?

Related

python pyodbc SQLite sql injections

I use pyodbc in my Python Flask project for the SQLite DB connection.
I know and understand SQL injections, but this is my first time dealing with them.
I tried to execute some SQL injections against it myself to test it.
I have a function which concatenates the SQL string in my database.py file:
def open_issue(self, data_object):
    cursor = self.conn.cursor()
    # data_object is the issue i get from the user
    name = data_object["name"]
    text = data_object["text"]
    rating_sum = 0
    # if the user provides an issue
    if name:
        # check if issue is already in db
        test = cursor.execute(f'''SELECT name FROM issue WHERE name = "{name}"''')
        data = test.fetchall()
        # if not in db insert
        if len(data) == 0:
            # insert the issue
            cursor.executescript(f'''INSERT INTO issue (name, text, rating_sum)
                                     VALUES ("{name}", "{text}", {rating_sum})''')
        else:
            print("nothing inserted!")
In the api.py file the open_issue() function gets called:
@self.app.route('/open_issue')
def insertdata():
    # data sent from client
    # data_object = flask.request.json
    # unit test dictionary
    data_object = {"name": "injection-test-table",
                   "text": "'; CREATE TABLE 'injected_table-1337';--"}
    DB().open_issue(data_object)
The "'; CREATE TABLE 'injected_table-1337';--" sql injection has not created the injected_table-1337, instead it got inserted normally like a string into the text column of the injection-test-table.
So i don't really know if i am safe for the standard ways of SQL injection (this project will only be hosted locally but good security is always welcome)
And secondary: are there ways with pyodbc to check if a string contains sql syntax or symbols, so that nothing will get inserted in my example or do i need to check the strings manually?
Thanks a lot
As it turns out, with SQLite you are at much less risk of SQL injection issues because, by default, neither Python's built-in sqlite3 module nor the SQLite ODBC driver allows multiple statements to be executed in a single .execute call (commonly known as an "anonymous code block"). This code:
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = f"SELECT * FROM table1 WHERE txt='{thing}'"
crsr.execute(sql)
throws this for sqlite3
sqlite3.Warning: You can only execute one statement at a time.
and this for SQLite ODBC
pyodbc.Error: ('HY000', '[HY000] only one SQL statement allowed (-1) (SQLExecDirectW)')
Still, you should follow best practices and use a proper parameterized query
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = "SELECT * FROM table1 WHERE txt=?"
crsr.execute(sql, (thing, ))
because this will also correctly handle parameter values that would cause errors if injected directly, e.g.,
thing = "it's good to avoid SQL injection"

Redshift / SQLAlchemy: 25P02 error/warning with all queries

I'm querying Redshift with SQLAlchemy over an ODBC connection. No matter what I do, I get the following warning:
C:\Anaconda3\lib\site-packages\sqlalchemy\engine\default.py:324:
SAWarning: Exception attempting to detect unicode returns:
ProgrammingError("(pyodbc.ProgrammingError) ('25P02', '[25P02]
[Amazon][Amazon Redshift] (30) Error occurred while trying to execute
a query: [SQLState 25P02] ERROR: current transaction is aborted,
commands ignored until end of transaction block\n (30)
(SQLExecDirectW)')") "detect unicode returns: %r" % de)
It is not an error, just a warning. I still get the correct results. For example, simple queries like this:
from sqlalchemy import create_engine
engine = create_engine("mssql+pyodbc://@MY_CONN")
with engine.connect() as conn:
    ct = conn.execute("SELECT COUNT(1) FROM my_table").scalar()
    print(ct)
Will produce the correct count but still show that warning. I've done some research that indicates this might be related to autocommit options, but when I run the following code, I still get the warning, and this time with an incorrect result of 0:
ct = (
    conn.execute(text("SELECT COUNT(1) FROM my_table").execution_options(autocommit=True)).scalar()
)
Besides, I would think autocommit has nothing to do with read queries.
Any insights into this?
As per my comment, the probable cause of this error is the use of "mssql+pyodbc". This dialect is intended for Microsoft SQL Server, so it is probably making incompatible metadata queries in the background, causing the warning.
To work with Redshift, try using a PostgreSQL dialect or a Redshift dialect (e.g. https://github.com/sqlalchemy-redshift/sqlalchemy-redshift).
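For example, with the sqlalchemy-redshift dialect installed (plus a PostgreSQL driver such as psycopg2), the engine URL would look something like the sketch below; the cluster host, port, credentials, and database name are placeholders:

from sqlalchemy import create_engine

# placeholder connection details for a Redshift cluster
engine = create_engine(
    "redshift+psycopg2://user:password@my-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/mydb"
)

with engine.connect() as conn:
    ct = conn.execute("SELECT COUNT(1) FROM my_table").scalar()
    print(ct)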

Too few parameters error, while no parameter placeholders are used

I am trying to execute a SQL query against an Access database using pyodbc, and I get the following error:
pyodbc.Error: ('07002', '[07002] [Microsoft][ODBC Microsoft Access Driver]
Too few parameters. Expected 1. (-3010) (SQLExecDirectW)')
The problem is that I am not using any additional parameters. Here is the code:
access_con_string = r"Driver={};Dbq={};".format(driver, base)
cnn = pyodbc.connect(access_con_string)
db_cursor = cnn.cursor()
expression = """SELECT F_ARODES.ARODES_INT_NUM, F_ARODES.TEMP_ADRESS_FOREST,F_AROD_LAND_USE.ARODES_INT_NUM, F_ARODES.ARODES_TYP_CD
FROM F_ARODES LEFT JOIN F_AROD_LAND_USE ON F_ARODES.ARODES_INT_NUM = F_AROD_LAND_USE.ARODES_INT_NUM
WHERE (((F_AROD_LAND_USE.ARODES_INT_NUM) Is Null) AND ((F_ARODES.ARODES_TYP_CD)="wydziel") AND ((F_ARODES.TEMP_ACT_ADRESS)=True));"""
db_cursor.execute(expression)
The query itself, if run inside MS-Access, works fine. Also, the connection is OK, as other queries are executed properly.
What am I doing wrong?
Constants in such queries are problematic - you never know the exact underlying syntax for booleans, strings etc. - even if it works in MS-Access, it can be different inside the intermediary library you're using.
The safest way is to extract them as parameters anyway:
expression = """SELECT F_ARODES.ARODES_INT_NUM, F_ARODES.TEMP_ADRESS_FOREST,F_AROD_LAND_USE.ARODES_INT_NUM, F_ARODES.ARODES_TYP_CD FROM F_ARODES LEFT JOIN F_AROD_LAND_USE ON F_ARODES.ARODES_INT_NUM = F_AROD_LAND_USE.ARODES_INT_NUM WHERE (((F_AROD_LAND_USE.ARODES_INT_NUM) Is Null)
AND ((F_ARODES.ARODES_TYP_CD)=?) AND ((F_ARODES.TEMP_ACT_ADRESS)=?));"""
db_cursor.execute(expression, "wydziel", True)
I had a similar problem with an update I was trying to perform with pyodbc. When executed in Access, the query worked fine, and the same was true when using the application (it allows some queries from within the app). But when run in Python with pyodbc, the same text would throw errors. I determined that the problem was the double quotes (the OP's query has a set of them as well). The query began to work when I replaced them with single quotes.
This does not work:
Update ApplicationStandards Set ShortCutKey = "I" Where ShortName = "ISO"
This does:
Update ApplicationStandards Set ShortCutKey = 'I' Where ShortName = 'ISO'
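A parameterized version sidesteps the quoting question entirely, since the literal values never appear inside the SQL text (a sketch, assuming a pyodbc cursor and connection named like db_cursor and cnn in the question above):

sql = "UPDATE ApplicationStandards SET ShortCutKey = ? WHERE ShortName = ?"
db_cursor.execute(sql, ("I", "ISO"))
cnn.commit()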

pypyodbc: OPENJSON incorrect syntax near keyword "WITH"

I'm trying to use OPENJSON in a Python script to import some basic JSON into a SQL database. I initially tried with a more complex JSON file, but simplified it for the sake of this post. Here's what I have:
sql_statement = "declare #json nvarchar(max) = '{\"name\":\"James\"}'; SELECT * FROM OPENJSON(#json) WITH (name nvarchar(20))"
cursor.execute(sql_statement)
cursor.commit()
connection.close()
The error I receive:
pypyodbc.ProgrammingError: (u'42000', u"[42000] [Microsoft][ODBC SQL
Server Driver][SQL Server]Incorrect syntax near the keyword 'with'. If
this statement is a common table expression, an xmlnamespaces clause
or a change tracking context clause, the previous statement must be
terminated with a semicolon.")
Any thoughts on why I'm seeing this error? I was successfully able to execute other SQL queries with the same pypyodbc / database configuration.
The problem could be that your database is running at an older compatibility level, where OPENJSON is not available.
To find the compatibility level of your database, run following SQL statement:
SELECT compatibility_level FROM sys.databases WHERE name = 'your_db_name';
If the result is 120 or lower, you'll need to update your compatibility level to 130, by running:
ALTER DATABASE your_db_name SET COMPATIBILITY_LEVEL = 130;
Note: if your database is actually Azure SQL DB, you should check the version as well, as OPENJSON is not available in versions prior to 12.x.
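If you would rather run that check from the Python side, a minimal sketch using the same pypyodbc cursor would be (the database name is a placeholder):

cursor.execute(
    "SELECT compatibility_level FROM sys.databases WHERE name = ?",
    ("your_db_name",),
)
print(cursor.fetchone()[0])  # needs to be 130 or higher for OPENJSON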

Does pgdb protect against injection attacks?

I have a piece of code like this:
db = pgdb.connect(
    database=connection['database'],
    user=connection['user'],
    host=connection['host'])
cursor = db.cursor()
# ask database
query = '''
    SELECT a, b, c
    FROM table
    WHERE a ILIKE %s;'''
try:
    cursor.execute(query, userInput)
except pgdb.Error, error:
    error = str(error)
    print json.dumps({
        'errorMessage': 'ERROR: %s' % error
    })
I have read in another forum that Python modules like MySQLdb do escaping to prevent injection attacks. I have also looked through the documentation on pgdb, but it is pretty thin. Lastly, I tried running my own injection attacks against my own test database, but I'm not sure whether my tests are sufficient. What would be a good way to test this out?
All DB-API modules protect against SQL injection when you use the execute method with all variable input kept in the parameter list (userInput in your example, which is safe).
It turns out that for pgdb the way it does this is indeed by escaping each of the parameters to get SQL literal values before injecting them into the placeholders in the SQL query. That needn't necessarily be the case: some database connectors can pass parameters to their server as separate structures rather than as part of the query string, and there are potential performance benefits from doing that. Ultimately, though, you shouldn't really care which method is being used - you deliver the parameters separately to the DB-API connector, and it is responsible for making that work in a secure way.
Of course, if you start dropping variables into the query yourself instead (e.g. "WHERE a ILIKE '%s'" % userInput), pgdb or any other connector can't stop you from hurting yourself.
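To make the contrast concrete, here is a minimal sketch of the safe and unsafe patterns side by side (note that the parameters go to execute as a sequence, here a one-element list):

# safe: the connector receives userInput separately from the SQL text
query = "SELECT a, b, c FROM table WHERE a ILIKE %s;"
cursor.execute(query, [userInput])

# unsafe: the value is interpolated into the SQL string before execution,
# so the connector has no chance to escape or quote it for you
cursor.execute("SELECT a, b, c FROM table WHERE a ILIKE '%s';" % userInput)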
