Getting the table name to select from from the user - Python

My problem is that I have a SELECT query, but the table to select the data from needs to be specified by the user, from an HTML file. Can anyone suggest a way to do this?
I am querying a PostgreSQL database and the SQL queries are in a Python file.

Create a variable table_name and verify that it contains only characters allowed in a table name. Then interpolate it into the SQL query:
sql = "SELECT ... FROM {} WHERE ...".format(table_name)  # replace ... with real SQL
If you don't verify it and the user sends something nasty, you run the risk of SQL injection.
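A minimal sketch of that validation, assuming a regex check plus an optional allowlist (the table names below are hypothetical; with psycopg2 you could also use psycopg2.sql.Identifier to quote the identifier):
import re

# Hypothetical allowlist of tables the user may query.
ALLOWED_TABLES = {"employees", "departments"}

def build_select(table_name):
    # Accept only plain identifiers: letters, digits, underscores.
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", table_name):
        raise ValueError("invalid table name: {}".format(table_name))
    # Stricter still: require the name to be a known table.
    if table_name not in ALLOWED_TABLES:
        raise ValueError("unknown table: {}".format(table_name))
    return "SELECT * FROM {} WHERE id = %s".format(table_name)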

Related

PostgreSQL Queries using Python

I am trying to access tables from a database using Python. There was some code on this website: https://rnacentral.org/help/public-database
import psycopg2
import psycopg2.extras

def main():
    conn_string = "host='hh-pgsql-public.ebi.ac.uk' dbname='pfmegrnargs' user='reader' password='NWDMCE5xdipIjRrp'"
    conn = psycopg2.connect(conn_string)
    cursor = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
    # retrieve a list of RNAcentral databases
    query = "SELECT * FROM rnc_database"
    cursor.execute(query)
    for row in cursor:
        print(row)
When I run this code, I get back a list of databases.
I want to access tables from one of these databases, but I don't know what the schema for those tables is or what the values in each returned row represent. I have been looking at 'PostgreSQL to Python' resources, but all of them are about accessing tables when you already know the names of the tables and the columns within them. Is there code for how I can access the table names from the database?
Thank you
Edit: sorry, I thought I had linked the website before.
The dataset you want to use has a schema diagram here: https://rnacentral.org/help/public-database
For general-purpose exploration I would use a tool like https://dbeaver.io/; it will show you all the schemas in the DB, the tables inside each schema, and so forth, once you point its connection settings at your database.
If you want to keep using a Python script to explore the DB, this SQL query should help:
SELECT *
FROM pg_catalog.pg_tables
WHERE schemaname != 'pg_catalog' AND
schemaname != 'information_schema';
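A rough sketch of running that query from a Python script, reusing the public connection string from the question (it selects just the schema and table name columns for readability):
import psycopg2

conn = psycopg2.connect("host='hh-pgsql-public.ebi.ac.uk' dbname='pfmegrnargs' "
                        "user='reader' password='NWDMCE5xdipIjRrp'")
cur = conn.cursor()
# list every user-visible table, skipping the system schemas
cur.execute("""
    SELECT schemaname, tablename
    FROM pg_catalog.pg_tables
    WHERE schemaname NOT IN ('pg_catalog', 'information_schema')
""")
for schema, table in cur.fetchall():
    print(schema, table)
conn.close()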

python pyodbc SQLite sql injections

I use pyodbc in my Python Flask project for the SQLite DB connection.
I know and understand SQL injection, but this is my first time dealing with it.
I tried to execute some injection attempts against my own code.
I have a function which concatenates the SQL string in my database.py file:
def open_issue(self, data_object):
    cursor = self.conn.cursor()
    # data_object is the issue I get from the user
    name = data_object["name"]
    text = data_object["text"]
    rating_sum = 0
    # if the user provides an issue
    if name:
        # check if issue is already in db
        test = cursor.execute(f'''SELECT name FROM issue WHERE name = "{name}"''')
        data = test.fetchall()
        # if not in db insert
        if len(data) == 0:
            # insert the issue
            cursor.executescript(f'''INSERT INTO issue (name, text, rating_sum)
                                     VALUES ("{name}", "{text}", {rating_sum})''')
        else:
            print("nothing inserted!")
In the api.py file the open_issue() function gets called:
#self.app.route('/open_issue')
def insertdata():
    # data sent from client
    # data_object = flask.request.json
    # unit test dictionary
    data_object = {"name": "injection-test-table",
                   "text": "'; CREATE TABLE 'injected_table-1337';--"}
    DB().open_issue(data_object)
The "'; CREATE TABLE 'injected_table-1337';--" sql injection has not created the injected_table-1337, instead it got inserted normally like a string into the text column of the injection-test-table.
So i don't really know if i am safe for the standard ways of SQL injection (this project will only be hosted locally but good security is always welcome)
And secondary: are there ways with pyodbc to check if a string contains sql syntax or symbols, so that nothing will get inserted in my example or do i need to check the strings manually?
Thanks a lot
As it turns out, with SQLite you are at much less risk of SQL injection because, by default, neither Python's built-in sqlite3 module nor the SQLite ODBC driver allows multiple statements to be executed in a single .execute call (commonly known as an "anonymous code block"). This code:
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = f"SELECT * FROM table1 WHERE txt='{thing}'"
crsr.execute(sql)
throws this for sqlite3
sqlite3.Warning: You can only execute one statement at a time.
and this for SQLite ODBC
pyodbc.Error: ('HY000', '[HY000] only one SQL statement allowed (-1) (SQLExecDirectW)')
Still, you should follow best practices and use a proper parameterized query
thing = "'; CREATE TABLE bobby (id int primary key); --"
sql = "SELECT * FROM table1 WHERE txt=?"
crsr.execute(sql, (thing, ))
because this will also correctly handle parameter values that would cause errors if injected directly, e.g.,
thing = "it's good to avoid SQL injection"

Test PostgreSQL query on a table with JSON columns using Python

Example query:
SELECT error->>'message' as message
FROM error_cases
In reality, my query is way more complicated and I would like to make sure that future code changes won't destroy the data this query outputs. I would like to compare result of this query with some particular output I already have.
I am using testing.postgresql library to create temporary database, run the query, save the output and destroy the database.
My query uses Postgresql ->> notation. I get the error:
psycopg2.errors.UndefinedFunction: operator does not exist: text ->> unknown
To reproduce, first I create the table:
cur.execute('CREATE TABLE error_cases (error TEXT NOT NULL)')
Then I insert data:
cur.execute('''INSERT INTO error_cases VALUES ('{"message": "someMessage"}')''')
And select:
select (error->>'message') as message from error_cases
I've looked at SQLAlchemy to query the data, but the problem is that I want to test this particular query I have. In SQLAlchemy, for retrieving JSON, I can't use the PostgreSQL ->> notation that is in my query.
Is there any other way to run a query containing the ->> operator on a database created using testing.postgresql?
I've just located the issue, and it's quite basic: the ->> operator is defined for json/jsonb values, not for text, so it should be
CREATE TABLE error_cases (error JSONB NOT NULL)
instead of
CREATE TABLE error_cases (error TEXT NOT NULL)
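A minimal end-to-end sketch of such a test, assuming testing.postgresql and psycopg2 are installed; the table and query come from the question, the assertion is illustrative:
import psycopg2
import testing.postgresql

# Spin up a throwaway PostgreSQL instance (destroyed on exiting the block).
with testing.postgresql.Postgresql() as pg:
    conn = psycopg2.connect(**pg.dsn())
    cur = conn.cursor()
    cur.execute('CREATE TABLE error_cases (error JSONB NOT NULL)')
    cur.execute('''INSERT INTO error_cases VALUES ('{"message": "someMessage"}')''')
    cur.execute("SELECT error->>'message' AS message FROM error_cases")
    # compare the query output against the expected result
    assert cur.fetchall() == [('someMessage',)]
    conn.close()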

pymysql SELECT query with field of table that the user gives as input

I am wondering if it's possible to make a SELECT query where the user gives the field of the table and the value they want. For example:
field = input("Field: ")
value = input("Value: ")
cursorobject.execute('SELECT id FROM users WHERE {}=\'{}\'')
result = cursorobject.fetchall()
for x in result:
    print(x)
And if it's not possible, is there any other way to do it?
PS: this one is not working.
Of course you can construct the text of your query as you want using variables. E.g.
query = 'SELECT id FROM users WHERE {}=\'{}\''
print(query.format(field,value))
But keep in mind that you should validate the contents of the variables very carefully before executing the query, to avoid SQL injection. For example, the contents of the variables should not contain quotes.
E.g. the code below, with these specific values of the variables, will return the full list of users:
field='name'
value='name\' or \'1\'=\'1'
query = 'SELECT id FROM users WHERE {}=\'{}\''
print(query.format(field,value))
The produced query would be:
SELECT id FROM users WHERE name='name' or '1'='1'
Following your edit, you should replace your 3rd line with:
cursorobject.execute('SELECT id FROM users WHERE {}=\'{}\''.format(field,value))
And to do your best to avoid SQL injection, you should use the built-in query parameterization features of your framework, pymysql:
cursorobject.execute('SELECT id FROM users WHERE {}=%s'.format(field), (value,))
Simply format the query for field and pass value as a parameter in the second argument of cursor.execute, which must receive an iterable (i.e., a tuple or list):
# PREPARED STATEMENT
sql = 'SELECT id FROM users WHERE {} = %s'
# EXECUTE QUERY
cursorobject.execute(sql.format(field), (value,))
result = cursorobject.fetchall()
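Since placeholders can only carry values, not identifiers, the field part still needs validation of its own. A sketch using a hypothetical allowlist of permitted columns (the column names are illustrative):
# Hypothetical allowlist of columns users may filter on.
ALLOWED_FIELDS = {"id", "name", "email"}

field = input("Field: ")
value = input("Value: ")

if field not in ALLOWED_FIELDS:
    raise ValueError("unknown field: {}".format(field))

# the identifier comes from the allowlist; the value goes through a placeholder
cursorobject.execute('SELECT id FROM users WHERE {} = %s'.format(field), (value,))
for x in cursorobject.fetchall():
    print(x)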

how to check/print psycopg2 dynamic query Compose without creating conn.cursor()

I am writing a unit test for a query builder, in which I assemble a query from a couple of user input fields.
e.g.
from psycopg2 import sql

query = sql.SQL("SELECT {fields} FROM {table}").format(
    fields=sql.SQL('*'),
    table=sql.Identifier(topic))
I just want to check that the query is what I intended; there is no need to execute it.
I tried printing query and got a Composed object which looks like:
Composed([SQL('SELECT '), Composed([Identifier('*')]), SQL(' FROM '), Identifier('topic'), SQL(' '), SQL(''), SQL(' ')...)
Is there a way to transform the dynamic SQL, as a Composed object, into a SQL query string like
SELECT * FROM topic
I don't have Postgres set up for the unit test, so I cannot use
query.as_string(conn)
Any hints? Many thanks
You need cursor.mogrify(query, params), but to call it you need a cursor, and to create a cursor you need to open a connection. I don't believe you can get the final query without a connection, because the rendered query depends on the server and database (server version, database encoding, quoting style).
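That said, for a unit test you may not need the final string at all: psycopg2's Composable objects support equality comparison, so you can compare the Composed against an expected value without any connection. A sketch, with build_query standing in for the hypothetical builder under test:
from psycopg2 import sql

def build_query(topic):
    # stand-in for the query builder under test
    return sql.SQL("SELECT {fields} FROM {table}").format(
        fields=sql.SQL('*'),
        table=sql.Identifier(topic))

# Composable implements __eq__, so no connection is needed for this check.
expected = sql.SQL("SELECT {fields} FROM {table}").format(
    fields=sql.SQL('*'),
    table=sql.Identifier('topic'))
assert build_query('topic') == expected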
