When diagnosing SQL query problems, it would sometimes be useful to see the query string after parameters have been interpolated into it by MySQLdb's safe interpolation.
Is there a way to get that information from either a MySQL exception object or from the connection object itself?
Use MySQL's own ability to log the queries and watch for them.
Perhaps you could use the slow query log (slow_query_log)?
If you cannot turn on MySQL's internal ability to log all queries, you need to write down each query before you execute it. You could store them in a log file of your own, or in a table (or in some other system). If that were the case, and if I were you, I'd create a wrapper for the connection with logging ability, as sketched below.
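A minimal sketch of that wrapper idea (the class name, log file, and credentials are illustrative, not part of MySQLdb): wrap the cursor and log each query with its parameters before executing it. Some MySQLdb versions also keep the final interpolated string on a private cursor attribute (_last_executed in old MySQL-python, _executed in mysqlclient), but being private, those can change between releases.

import logging

import MySQLdb

logging.basicConfig(filename='queries.log', level=logging.DEBUG)

class LoggingCursor:
    # Thin wrapper: record every query (and its parameters) before running it
    def __init__(self, cursor):
        self._cursor = cursor

    def execute(self, query, args=None):
        logging.debug('query=%r args=%r', query, args)
        return self._cursor.execute(query, args)

    def __getattr__(self, name):
        # Delegate fetchone/fetchall/etc. to the wrapped cursor
        return getattr(self._cursor, name)

conn = MySQLdb.connect(host='localhost', user='user', passwd='secret', db='test')  # placeholder credentials
cursor = LoggingCursor(conn.cursor())
cursor.execute('SELECT * FROM account WHERE id = %s', (42,))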
I am attempting to execute a raw SQL insert statement in SQLAlchemy. SQLAlchemy throws no errors when the constructed insert statement is executed, but the rows do not appear in the database.
As far as I can tell, it isn't a syntax error (see point 2), and it isn't an engine error, as the ORM can execute an equivalent write properly (see point 1). It is finding the table it's supposed to write to (see point 3). I think it's a problem with a transaction not being committed, and I have attempted to address this (see point 4), but that hasn't solved the issue. Is it possible to create a nested transaction, and if so, what would start the 'first' one, so to speak?
Thank you for any answers.
Some background:
1. I know that the ORM facilitates this; I have used that feature and it works, but it is too slow for our application. We decided to try raw SQL for this particular write function, due to how often it's called, and the ORM for everything else. An equivalent method using the ORM works perfectly, and the same engine is used for both, so it can't be an engine problem, right?
2. I've issued an example of the SQL that the raw-SQL method constructs directly to the database, and it reads in fine, so I don't think it's a syntax error.
3. It's communicating with the database properly and can find the table, as any syntax errors in table and column names throw a programmatic error, so it's not just throwing stuff into the 'void', so to speak.
4. My first thought after reading around was that it was a transaction error: a transaction was being created and not closed. I therefore constructed the execute statement as follows, to ensure a transaction was properly created and committed.
with self.Engine.connect() as connection:
connection.execute(Insert_Statement)
connection.commit
The so-called 'Insert_Statement' has been converted to text using SQLAlchemy's text() function. I don't quite understand why it won't execute if I pass the constructed string directly to execute(), but I mention it in case it's relevant.
Other things that may be relevant:
Python 3 is running on one EC2 instance and the Postgres database on another. The table in question is a TimescaleDB hypertable taking real-time data, hence the need for very fast writes, but that's probably not relevant.
I'm currently using pg8000 as the dialect, for no particular reason other than that psycopg2 was throwing errors when trying to execute an equivalent method using the ORM.
Just so this question is answered in case anyone else ends up here:
The issue was a failure to call commit as a method, as @snakecharmerb pointed out. Gord Thompson also provided an alternative using 'begin', which commits automatically, rather than 'connect', which gives a 'commit as you go' style transaction. Both are sketched below.
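For reference, a minimal sketch of both corrected variants in SQLAlchemy 1.4/2.0 style (the engine URL, table, and values are placeholders standing in for the question's self.Engine and Insert_Statement):

from sqlalchemy import create_engine, text

engine = create_engine('postgresql+pg8000://user:secret@host/db')  # placeholder URL
insert_statement = text('INSERT INTO my_table (col) VALUES (:val)')  # placeholder statement

# 'commit as you go': commit must be *called* as a method
with engine.connect() as connection:
    connection.execute(insert_statement, {'val': 1})
    connection.commit()  # note the parentheses; bare connection.commit does nothing

# 'begin once': the block commits automatically on success and rolls back on error
with engine.begin() as connection:
    connection.execute(insert_statement, {'val': 1})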
I need to move data from one database to another.
I can use Python; my counterpart can't.
How can I select all the data from a table and save it as INSERT statements, using SQLAlchemy?
Is there a way to create a back up like this?
As others have suggested in comments, using the database's backup program (mysqldump, pg_dump, etc.) is your best bet; that will make sure that the data is transferred correctly for the underlying database.
Outputting INSERT statements will be risky; even the built-in SQLAlchemy facility for doing this comes with a big red warning, complete with a picture of a dragon, indicating that it can be dangerous.
If you nevertheless need to do this, and the data is generally trusted and doesn't contain much in the way of odd types, you can do the following (sketched after the list):
1. Create (but do not execute) an insert expression, as though you were inserting the rows back into the database.
2. Use the .compile() method with the relevant dialect parameter and literal_binds set to True.
3. Manually double-check that the output is, in fact, valid for the database; as per the warning in the SQLAlchemy FAQ, this method is not very dependable and may expose you to attacks if it's part of any production system.
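A minimal sketch of those steps, assuming a PostgreSQL source and a table named my_table (both placeholders; swap in your own dialect and table):

from sqlalchemy import MetaData, Table, create_engine
from sqlalchemy.dialects import postgresql

engine = create_engine('postgresql+pg8000://user:secret@host/db')  # placeholder URL
my_table = Table('my_table', MetaData(), autoload_with=engine)  # reflect the table

with engine.connect() as conn:
    rows = conn.execute(my_table.select()).mappings().all()

for row in rows:
    # Step 1: build, but do not execute, the insert expression
    stmt = my_table.insert().values(**row)
    # Step 2: render the bound values inline; this is the step the FAQ warns about
    compiled = stmt.compile(dialect=postgresql.dialect(),
                            compile_kwargs={'literal_binds': True})
    print(str(compiled) + ';')

Step 3 remains manual: read over the generated statements before anyone runs them.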
I wouldn't recommend formatting up INSERT statements by hand; you're unlikely to do a better job than SQLAlchemy...
I am building a little interface where I would like users to be able to write out their entire SQL statement and then see the data that is returned. However, I don't want a user to be able to do anything malicious, e.g. delete from user_table;. In fact, the only thing I would like users to be able to do is run SELECT statements. I know there aren't per-user permissions in SQLite, so I am thinking that what I'll have to do is apply a set of rules that reject certain queries, maybe a regex string or something (regex scares me a little bit). Any ideas on how to accomplish this?
def input_is_safe(input):
input = input.lower()
if "select" not in input:
return False
#more stuff
return True
I can suggest a different approach to your problem: restrict access to your database to read-only. That way, even if users try to execute DELETE/UPDATE queries, they will not be able to damage your data.
Here is how to open a read-only connection from Python:
db = sqlite3.connect('file:/path/to/database?mode=ro', uri=True)
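With that connection, any write fails up front; a quick check (the path and table name are placeholders):

import sqlite3

db = sqlite3.connect('file:/path/to/database?mode=ro', uri=True)
try:
    db.execute('DELETE FROM user_table')
except sqlite3.OperationalError as exc:
    print(exc)  # attempt to write a readonly database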
Python's sqlite3 execute() method will only execute a single SQL statement, so if you ensure that all statements start with the SELECT keyword, you are reasonably protected from dumb stuff like SELECT 1; DROP TABLE USERS. But you should check SQLite's SQL syntax to ensure there is no way to embed a data definition or data modification statement as a subquery.
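You can see that guard in action; sqlite3 rejects stacked statements before running anything (the exact exception type has varied across Python versions, hence the broad except):

import sqlite3

conn = sqlite3.connect(':memory:')
try:
    conn.execute('SELECT 1; DROP TABLE users')
except (sqlite3.Warning, sqlite3.ProgrammingError) as exc:
    print(exc)  # You can only execute one statement at a time.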
My personal opinion is that if "regex scares you a little bit", you might as well just put your computer in a box and mail it off to <stereotypical country of hackers>. Letting untrusted users write SQL code is playing with fire, and you need to know what you're doing or you'll get fried.
Open the database as read only, to prevent any changes.
Many statements, such as PRAGMA or ATTACH, can be dangerous. Use an authorizer callback (C docs) to allow only SELECTs.
Queries can run for a long time, or generate a large amount of data. Use a progress handler to abort queries that run for too long. (A sketch combining all three points follows.)
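A sketch of all three points with Python's sqlite3 module (the path is a placeholder, and the allowed-action set is deliberately minimal; real SELECTs that call SQL functions may also need sqlite3.SQLITE_FUNCTION in the allowed set):

import sqlite3

db = sqlite3.connect('file:/path/to/database?mode=ro', uri=True)  # 1. read-only

# 2. Authorizer: permit only the opcodes a plain SELECT needs, deny the rest
ALLOWED = (sqlite3.SQLITE_SELECT, sqlite3.SQLITE_READ)

def authorizer(action, arg1, arg2, db_name, trigger):
    return sqlite3.SQLITE_OK if action in ALLOWED else sqlite3.SQLITE_DENY

db.set_authorizer(authorizer)

# 3. Progress handler: called every 10000 SQLite VM instructions; a non-zero
# return aborts the running query with OperationalError. Reset the counter
# before each query in real use.
counter = {'steps': 0}

def progress():
    counter['steps'] += 1
    return 1 if counter['steps'] > 1000 else 0

db.set_progress_handler(progress, 10000)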
I'm using Django's ORM to insert thousands of objects into a Postgres DB. It works fine, but sometimes one of those records has a wrong format, so the insert operation fails.
I can't make this kind of insert ignore errors, so I'd like to see the SQL executed by the operation; however, bulk_create only returns a list of the objects.
When in debug mode (DEBUG = True), you can use the django.db.backends logger.
https://docs.djangoproject.com/en/1.8/topics/logging/#django-db-backends
In production I would use PostgreSQL's own logging, because saving these queries from within the Django process will (probably) have a major impact on performance.
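In debug mode, a settings snippet along these lines echoes every query to the console (the handler name is arbitrary):

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',  # SQL statements are logged at DEBUG level
        },
    },
}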
I have the following python code:
row = conn.execute('''SELECT admin FROM account WHERE password = ?''',
(request.headers.get('X-Admin-Pass'),)).fetchone()
My question is whether this code is secure against SQL injection. Since I use a parameterized query, it should be. However, since I am passing user input straight from the header, I am a little worried :)
Any thoughts about the issue?
The way you are inserting the data into the database will ensure that an SQL injection attack will not work; the execute method automatically escapes the parameters you pass as a tuple in its second argument.
You are doing that correctly.
If your module follows Python's DB-API spec, then you're parameterizing fine. Unless you want to do research into preventing specific SQL attacks, parameterizing your queries is a good umbrella against SQL injection.
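To make the contrast concrete, a minimal sketch using the table and header from the question:

password = request.headers.get('X-Admin-Pass')

# UNSAFE: the header value is spliced into the SQL text and will be parsed as SQL
row = conn.execute(
    "SELECT admin FROM account WHERE password = '%s'" % password).fetchone()

# SAFE: the value travels separately as a bound parameter, never parsed as SQL
row = conn.execute(
    'SELECT admin FROM account WHERE password = ?', (password,)).fetchone()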