I'm new to sqlalchemy and have been trying to figure this out for days!
I have some python code which is executing the following line:
mdb_session.query(PendingConfig).filter(PendingConfig.id == config.id).delete()
It deletes all rows in a table called PendingConfig that have an id equal to the given config.id.
I want to log the underlying SQL query that SQLAlchemy generates, but I don't know how to do that, since delete() just returns an integer equal to the number of rows deleted.
I tried setting up a logger, but that had its own issues, as I explained in this post.
Need help on this!
If you really want to get the SQL that was actually run by the MySQL server, then you can enable the MySQL query log or slow query log, and read it from the database server.
See https://dev.mysql.com/doc/refman/5.7/en/slow-query-log.html
The MySQL Server doesn't know anything about Python, it just knows that a client sent it a query to execute.
If it were sent as a server-side prepared statement, the SQL text would contain ? placeholders, but as far as I know that isn't how this works with SQLAlchemy's MySQL drivers: the DBAPI driver interpolates the parameter values into the SQL text on the client before sending the query to MySQL, so the server log shows the complete statement.
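If all you want is to see what SQLAlchemy itself sends, you can also turn on engine-level logging on the client side instead of reading the server log. A minimal sketch, reusing the mdb_session and PendingConfig names from the question (the connection URL shown is just a placeholder):

import logging

# Log every statement (and its parameters) that the SQLAlchemy engine emits.
logging.basicConfig()
logging.getLogger("sqlalchemy.engine").setLevel(logging.INFO)

# Equivalently, pass echo=True when the engine is created:
# engine = create_engine("mysql+pymysql://user:pw@host/db", echo=True)

# The DELETE statement and its bound parameters now show up in the log output.
mdb_session.query(PendingConfig).filter(PendingConfig.id == config.id).delete()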
Related
I am attempting to execute a raw SQL insert statement in SQLAlchemy. SQLAlchemy throws no errors when the constructed insert statement is executed, but the rows do not appear in the database.
As far as I can tell, it isn't a syntax error (see no. 2), it isn't an engine error since the ORM can execute an equivalent write properly (see no. 1), and it is finding the table it's supposed to write to (see no. 3). I think the problem is a transaction not being committed, and I have attempted to address this (see no. 4), but that hasn't solved the issue. Is it possible to create a nested transaction, and what would start the 'first' one, so to speak?
Thank you for any answers.
Some background:
I know that the ORM facilitates this and I have used that feature, and it works, but it is too slow for our application. We decided to try raw SQL for this particular write function because of how often it is called, and to keep the ORM for everything else. An equivalent method using the ORM works perfectly, and the same engine is used for both, so it can't be an engine problem, right?
I've issued an example of the SQL that the raw-SQL method constructs to the database directly, and it runs fine, so I don't think it's a syntax error.
It's communicating with the database properly and can find the table, since any typos in table or column names raise an error, so it's not just throwing statements into the 'void', so to speak.
My first thought after reading around was that it was a transaction error, with a transaction being created and never closed, so I constructed the execute call as follows to ensure a transaction was properly created and committed.
with self.Engine.connect() as connection:
    connection.execute(Insert_Statement)
    connection.commit
The so-called 'Insert_Statement' has been converted to a text object using SQLAlchemy's text() function. I don't quite understand why it won't execute if I pass the constructed string directly to execute(), but I mention it in case it's relevant.
Other things that may be relevant:
Python 3 is running on one EC2 instance and the Postgres database on another. The table in question is a TimescaleDB hypertable taking real-time data, hence the need for very fast writes, but that's probably not relevant.
Currently using pg8000 as the dialect, for no particular reason other than that psycopg2 was throwing errors when trying to execute an equivalent method using the ORM.
Just so this question is answered in case anyone else ends up here:
The issue was a failure to call commit as a method, as @snakecharmerb pointed out. Gord Thompson also provided an alternative using 'begin', which commits automatically, rather than connect(), which gives a 'commit as you go' style transaction.
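For anyone skimming, a minimal sketch of both working patterns, assuming a SQLAlchemy 1.4+/2.0-style engine and the Insert_Statement text object from the original question:

# Insert_Statement is assumed to already be a sqlalchemy text() construct.

# 'commit as you go': commit() has to actually be called on the connection.
with self.Engine.connect() as connection:
    connection.execute(Insert_Statement)
    connection.commit()

# 'begin once': the block commits automatically if no exception is raised.
with self.Engine.begin() as connection:
    connection.execute(Insert_Statement)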
I need to extract a bunch of records from a SingleStore database and insert them into another table. For performance, the ideal way to do this is to build a query string with an INSERT INTO ... SELECT statement and run it iteratively on a daily basis.
I can't seem to get Python to actually execute the query in the database, even though it appears to run successfully.
fh = 'file_containing_insert_select_query.sql'
qry = open(fh).read()

for i in range(2):
    qry_new = some_custom_function_to_replace_dates(qry, i)
    engine = tls.custom_engine_function()
    engine.execute(qry_new)
I've verified that the sql statements created by my custom function can be copy/pasted to a sql editor and executed successfully, but it won't run in python... any thoughts?
After executing the query above, you need to send a commit to the database using connection.commit()
(where connection is the object holding your database connection) so the rows inserted by the Python program are actually saved.
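A rough sketch of that fix using the names from the question (tls.custom_engine_function and some_custom_function_to_replace_dates are the asker's own helpers); engine.begin() opens a transaction and commits it when the block exits cleanly:

from sqlalchemy import text

qry = open('file_containing_insert_select_query.sql').read()
engine = tls.custom_engine_function()

for i in range(2):
    qry_new = some_custom_function_to_replace_dates(qry, i)
    # The transaction opened by begin() is committed when the with-block ends without error.
    with engine.begin() as connection:
        connection.execute(text(qry_new))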
If you want it to run really fast, it's usually better to use set-oriented SQL, like INSERT INTO targetTbl … SELECT FROM …;
That way, the data doesn't have to round-trip through the client app.
I know that you can get the SQL of a given QuerySet using
print query.query
but as we know from a previous question (Potential Django Bug In QuerySet.query?) the returned SQL is not properly quoted. See http://code.djangoproject.com/browser/django/trunk/django/db/models/sql/query.py
Is there any way to get the raw, executable SQL (quoted) for a given QuerySet without actually executing it?
Django never creates the raw SQL itself, so no. To prevent SQL injection, Django passes the parameters separately to the database driver at the last step. The best way to get the actual SQL is to look at your query log, which you cannot do before you execute the query.
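If executing the query first is acceptable, Django records every statement it ran (only when DEBUG = True) in connection.queries. A small sketch, with a hypothetical MyModel used purely for illustration:

from django.db import connection, reset_queries

reset_queries()
list(MyModel.objects.filter(name="example"))  # force the QuerySet to evaluate

# Each entry is a dict holding the 'sql' text and the 'time' it took.
print(connection.queries[-1]["sql"])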
I have a python script that connects to a local MySQL db. I know it is connecting correctly because I can do this and get the proper results:
cursor.execute("SELECT * FROM reel")
But when I try to do any insert statements it just does nothing. No error messages, no exceptions. Nothing shows up in the database when I check it from sqlyog. This is what my code looks like:
self.cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups)
where tups is a list of tuples that look like ('0000-00-00 00:00:00', 'text'). No errors show up, and if I copy-paste the generated SQL query into sqlyog it works. I've also tried generating the full query string and calling cursor.execute() on it, again with no errors and no result. Anyone know what I'm doing wrong?
You need to do a connection.commit() (commit() is a method of the connection object, not the cursor) after self.cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups)
Starting with 1.2.0, MySQLdb disables autocommit by default, as required by the DB-API standard (PEP-249). If you are using InnoDB tables or some other type of transactional table type, you'll need to do connection.commit() before closing the connection, or else none of your changes will be written to the database.
Conversely, you can also use connection.rollback() to throw away any changes you've made since the last commit.
Important note: Some SQL statements -- specifically DDL statements like CREATE TABLE -- are non-transactional, so they can't be rolled back, and they cause pending transactions to commit.
This is a FAQ.
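A minimal sketch of the whole pattern with hypothetical connection parameters; the key point is that commit() is called on the connection object, not the cursor:

import MySQLdb

# Placeholder credentials, not the asker's real ones.
db = MySQLdb.connect(host="localhost", user="user", passwd="pw", db="mydb")
cursor = db.cursor()

tups = [('0000-00-00 00:00:00', 'text')]
cursor.executemany("INSERT INTO reel (etime,etext) VALUES (%s,%s)", tups)

db.commit()  # without this, InnoDB rolls the INSERT back when the connection closes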
I'm using Elixir in a project that connects to a postgres database. I want to run the following query on the database I'm connected to, but I'm not sure how to do it as I'm rather new to Elixir and SQLAlchemy. Anyone know how?
VACUUM FULL ANALYZE table
Update
The error is: "UnboundExecutionError: Could not locate a bind configured on SQL expression or this Session". I get the same result with session.close() issued beforehand. I did try metadata.bind.execute() and that worked for a simple SELECT. But for the VACUUM it said "InternalError: (InternalError) VACUUM cannot run inside a transaction block", so now I'm trying to figure out how to turn that off.
Update 2
I can get the query to execute, but I'm still getting the same error - even when I create a new session and close the previous one.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
# ... insert stuff
old_session.commit()
old_session.close()
new_sess = sessionmaker(autocommit=True)
new_sess.configure(bind=create_engine('postgres://user:pw@host/db', echo=True))
sess = new_sess()
sess.execute('VACUUM FULL ANALYZE table')
sess.close()
and the output I get is
2009-12-10 10:00:16,769 INFO sqlalchemy.engine.base.Engine.0x...05ac VACUUM FULL ANALYZE table
2009-12-10 10:00:16,770 INFO sqlalchemy.engine.base.Engine.0x...05ac {}
2009-12-10 10:00:16,770 INFO sqlalchemy.engine.base.Engine.0x...05ac ROLLBACK
finishing failed run, (InternalError) VACUUM cannot run inside a transaction block
'VACUUM FULL ANALYZE table' {}
Update 3
Thanks to everyone who responded. I wasn't able to find the solution I wanted, but I think I'm just going to go with the one described here PostgreSQL - how to run VACUUM from code outside transaction block?. It's not ideal, but it works.
Dammit. I knew the answer was going to be right under my nose. Assuming you set up your connection like I did:
metadata.bind = 'postgres://user:pw@host/db'
The solution to this was as simple as
conn = metadata.bind.engine.connect()
old_lvl = conn.connection.isolation_level     # remember psycopg2's current isolation level
conn.connection.set_isolation_level(0)        # 0 = autocommit, so no transaction block is opened
conn.execute('vacuum analyze table')
conn.connection.set_isolation_level(old_lvl)  # restore the previous isolation level
This is similar to what was suggested in PostgreSQL - how to run VACUUM from code outside transaction block?,
because underneath it all SQLAlchemy uses psycopg2 to make the connection to Postgres, and Connection.connection is a proxy to the psycopg2 connection. Once I realized this, the problem came back to mind and I decided to take another whack at it.
Hopefully this helps someone.
You need to bind the session to an engine
session.bind = metadata.bind
session.execute('YOUR SQL STATEMENT')
UnboundExecutionError says that your session is not bound to an engine, and there is no way to discover the engine from the query passed to execute(). You can either use engine.execute() directly or pass an additional mapper parameter (either a mapper or a mapped model corresponding to the table used in the query) to session.execute() to help SQLAlchemy discover the proper engine.
The InternalError says that you are trying to execute this statement inside an explicitly started transaction (one opened with a BEGIN statement). Have you issued some statements before it without calling commit()? If so, just call commit() or rollback() to close the transaction before doing VACUUM. Also note that there are several parameters to sessionmaker() that tell SQLAlchemy when a transaction should be started.
If you have access to SQLAlchemy session, you can execute arbitrary SQL statements via its execute method:
session.execute("VACUUM FULL ANALYZE table")
(Depending on the Postgres version) you most likely do not want to run "VACUUM FULL".
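On SQLAlchemy versions newer than the one in the question, another way to dodge the transaction-block error is the AUTOCOMMIT isolation option on a connection. A rough sketch, with a placeholder connection URL:

from sqlalchemy import create_engine, text

engine = create_engine('postgresql://user:pw@host/db')  # placeholder URL

# AUTOCOMMIT stops psycopg2 from wrapping the statement in a transaction block,
# which is exactly what VACUUM complains about.
with engine.connect().execution_options(isolation_level="AUTOCOMMIT") as conn:
    conn.execute(text("VACUUM ANALYZE table"))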