Execute SQL INSERT statements via SQLAlchemy within a for loop - Python

I need to extract a batch of records from a SingleStore database and insert them into another table. For performance, the ideal approach is to build a query string containing an INSERT INTO ... SELECT statement and run it iteratively on a daily basis.
I can't seem to get Python to execute the query in the database, even though it appears to run successfully:
fh = 'file_containing_insert_select_query.sql'
qry = open(fh).read()
for i in range(2):
    qry_new = some_custom_function_to_replace_dates(qry, i)
    engine = tls.custom_engine_function()
    engine.execute(qry_new)
I've verified that the SQL statements created by my custom function can be copied and pasted into a SQL editor and executed successfully, but they won't run from Python... any thoughts?

After executing the query above, you need to send a commit to the database using connection.commit()
(where connection is the database connection object), so that the rows inserted by the Python program are actually saved.

if you want it to run really fast, it’s usually better to use set-oriented SQL, like INSERT INTO targetTbl … SELECT FROM …;
That way, it doesn’t have to round-trip through the client app.
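A sketch of that set-oriented pattern, using sqlite3 so it runs without a server (srcTbl, targetTbl, and the date filter are placeholders):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE srcTbl (id INTEGER, day TEXT)")
conn.execute("CREATE TABLE targetTbl (id INTEGER, day TEXT)")
conn.executemany("INSERT INTO srcTbl VALUES (?, ?)",
                 [(1, "2020-01-01"), (2, "2020-01-02")])

# One statement moves every matching row inside the database engine,
# with no per-row round trip through the client.
conn.execute("INSERT INTO targetTbl SELECT id, day FROM srcTbl "
             "WHERE day >= '2020-01-01'")
conn.commit()

moved = conn.execute("SELECT COUNT(*) FROM targetTbl").fetchone()[0]
print(moved)  # → 2
```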

Related

Preventing writable modifications to Oracle database, using Python.

Currently using the cx_Oracle module in Python to connect to my Oracle database. I would like to allow the user of the program to run only read-only statements, like SELECT, and NOT INSERT/DELETE queries.
Is there something I can do to the connection/cursor variables once I establish the connection to prevent writable queries?
I am using the Python Language.
Appreciate any help.
Thanks.
One possibility is to issue the statement "set transaction read only" as in the following code:
import cx_Oracle
conn = cx_Oracle.connect("cx_Oracle/welcome")
cursor = conn.cursor()
cursor.execute("set transaction read only")
cursor.execute("insert into c values (1, 'test')")
That will result in the following error:
ORA-01456: may not perform insert/delete/update operation inside a READ ONLY transaction
Of course you'll have to make sure that you create a Connection class that calls this statement when it is first created and after each and every commit() and rollback() call. And it can still be circumvented by calling a PL/SQL block that performs a commit or rollback.
The only other possibility that I can think of right now is to create a restricted user or role which simply doesn't have the ability to insert, update, delete, etc. and make sure the application uses that user or role. This one at least is fool proof, but a lot more effort up front!
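The Connection-class idea can be sketched generically. Since cx_Oracle needs a live Oracle server, this example uses sqlite3's PRAGMA query_only as a stand-in for Oracle's SET TRANSACTION READ ONLY; the wrapper re-issues the read-only statement after every commit() and rollback(), as described above:

```python
import sqlite3

class ReadOnlyConnection:
    """Wraps a DB-API connection and re-asserts read-only mode after
    every commit() and rollback()."""

    def __init__(self, conn, readonly_sql="PRAGMA query_only = ON"):
        self._conn = conn
        self._readonly_sql = readonly_sql
        self._set_read_only()

    def _set_read_only(self):
        self._conn.execute(self._readonly_sql)

    def commit(self):
        self._conn.commit()
        self._set_read_only()

    def rollback(self):
        self._conn.rollback()
        self._set_read_only()

    def cursor(self):
        return self._conn.cursor()

conn = ReadOnlyConnection(sqlite3.connect(":memory:"))
blocked = False
try:
    conn.cursor().execute("CREATE TABLE c (a INTEGER, b TEXT)")
except sqlite3.OperationalError:  # "attempt to write a readonly database"
    blocked = True
print(blocked)  # → True
```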

How to get sqlalchemy query for a delete statement?

I'm new to sqlalchemy and have been trying to figure this out for days!
I have some python code which is executing the following line:
mdb_session.query(PendingConfig).filter(PendingConfig.id == config.id).delete()
It deletes all rows in the table mapped by PendingConfig that have an id equal to the given config.id.
I want to log the underlying SQL query that sqlalchemy generates, but I don't know how to do that, since delete() returns only an integer equal to the number of rows deleted.
I tried setting up a logger, but that had its own issues, as I explained in this post.
Need help on this!
If you really want to get the SQL that was actually run by the MySQL server, then you can enable the MySQL query log or slow query log, and read it from the database server.
See https://dev.mysql.com/doc/refman/5.7/en/slow-query-log.html
The MySQL Server doesn't know anything about Python, it just knows that a client sent it a query to execute.
If it's a parameterized query, it will contain ? placeholders in the SQL text. With the usual MySQL drivers (MySQLdb, PyMySQL), though, parameter values are interpolated into the SQL string on the client side before it is sent to MySQL, so the query log will show the complete statement.
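If the goal is just to see the DELETE before (or instead of) running it, one option is to build the statement as an object and render it with literal_binds. This is a sketch; the mapped class below is a made-up stand-in for PendingConfig:

```python
from sqlalchemy import Column, Integer, delete
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class PendingConfig(Base):
    __tablename__ = "pending_config"  # hypothetical table name
    id = Column(Integer, primary_key=True)

# Build the DELETE as a statement object, then render it with the
# parameter values inlined so the full SQL can be logged.
stmt = delete(PendingConfig).where(PendingConfig.id == 42)
sql = str(stmt.compile(compile_kwargs={"literal_binds": True}))
print(sql)  # DELETE FROM pending_config WHERE pending_config.id = 42
```

Alternatively, create_engine(..., echo=True) makes SQLAlchemy log every statement it actually emits.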

when pymysql using local_infile, data is not insert database

I wrote code that builds and executes SQL query statements using pymysql in Python.
If I paste the SQL query statement generated by the code directly into the db, it works normally.
However, if I execute the same statement from the code with cursor.execute(sql), the db ends up containing no data.
When connecting, I also passed the local_infile=True option. How should the code be written?
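One common cause of this symptom is a missing commit after the LOAD DATA statement: with pymysql you need local_infile=True on the connection, the local_infile variable enabled on the server, and conn.commit() afterwards. Since pymysql needs a live MySQL server, this sketch shows the same load-then-commit shape with sqlite3, reading the file contents in Python:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (a TEXT, b TEXT)")

# Stands in for the local infile; two ';'-separated rows.
data = io.StringIO("x;1\ny;2\n")
rows = list(csv.reader(data, delimiter=";"))
conn.executemany("INSERT INTO mytable VALUES (?, ?)", rows)
conn.commit()  # with pymysql: conn.commit() after cursor.execute(load_sql)

count = conn.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]
print(count)  # → 2
```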

Unable to insert data on Google Cloud SQL but UID field is still AUTO incrementing

I am running into MySQL behavior on Google Cloud SQL that I have never seen before.
Every MySQL command we try works from a Python script except INSERT. We can create the table and show tables, but when we insert data, nothing appears in the table. Yet if we copy that exact same insert statement to the MySQL command line and run it, the insert works fine.
BUT here is the unusual part. Even though the Python script fails to insert data, the UID AUTO_INCREMENT field has incremented for every empty and failed insert. For example, if the Python script fails to insert a row, the next time we run an insert from the MySQL command line, we see that the UID field has incremented by one.
It is as if MySQL started to insert the data, auto-incremented the UID field, but then the data never arrived.
We are using MySQL on Google Cloud SQL. The insert is a simple test:
insert into queue (filename, text) VALUES ('test', 'test')
Any ideas what this is or how to debug it?
It turns out AUTOCOMMIT is set to OFF on Google Cloud SQL, so every SQL insert must be followed by a commit before the rows are saved. (This also explains the incrementing UID: InnoDB hands out AUTO_INCREMENT values outside the transaction, so even an insert that is later rolled back consumes an id.)
For example:
import MySQLdb as mdb
db = mdb.connect(ip, login, pword)
query = "insert into tbname (entity, attribute) VALUES('foo', 'bar')"
cursor = db.cursor()
cursor.execute(query)
db.commit()

Data loaded to MySQL via python disappears

I've looked around to see whether anyone has had this problem, but it looks like not! Basically my problem is as follows:
I try loading data into a MySQL db using the MySQLdb library for Python.
I seem to succeed, since I'm able to retrieve the items I loaded within the same Python instance.
Once the Python code is run and closed, when I try to retrieve the data, either by running a query in MySQL Workbench or by running Python code at the command prompt, I cannot retrieve it.
So in summary, I do load the data in, but the moment I close the Python instance, the data seems to disappear.
To try to isolate the problem, I placed a time.sleep(60) line into my code so that, once the Python code loads the data, I can try retrieving the data from MySQL Workbench using queries, but I still can't.
I thought perhaps I was saving data into different instances, but I checked things like the port etc. and they are identical!
I've spent 4-5 hours trying to figure it out, but I'm starting to lose hope. Help much appreciated. Please find my code below:
db = MySQLdb.connect("localhost","root","password","mydb")
cursor = db.cursor()
cursor.execute("SELECT VERSION()")
data = cursor.fetchone()
print data
cursor.execute("LOAD DATA LOCAL INFILE 'filepath/file.txt' INTO TABLE addata FIELDS TERMINATED BY ';' LINES TERMINATED BY '\r\n'")
data = cursor.fetchall()
print data ###At this point data displays warnings etc
cursor.execute("select * from addata")
data = cursor.fetchmany(10)
print data ###Here I can see that the data is loaded
time.sleep(60) ##Here while the code is sleeping I go to mysql workbench and try the query "select * from addata".. It returns nothing:(
You almost certainly need to commit the data after you have loaded it.
If your program exits without committing the data, the DB will roll back your transaction, on the assumption that something has gone wrong.
You may be able to enable autocommit as part of your connection request; otherwise you should call commit() on the connection object.
