Calling several SQL procedures at once with Python

I have created a stored procedure in MySQL which accepts several arguments and does its thing.
While I have no problem executing the following in the MySQL client:
CALL my_pr(var1, var2, var3); CALL my_pr(var4, var5, var6);
when I try to execute it (or any other two statements at once) via Python, I get the following error:
Commands out of sync; you can't run this command now
But when I execute them one by one, everything works smoothly.
I am adding each statement to a list and then executing them via:
for stm in sql_stms:
    mycursor.execute(stm)
mydb.commit()
Each stm is set to a single query or a multi-statement query in some code above, and my sql_stms contains several INSERT, SELECT and DELETE queries plus tens (or sometimes hundreds) of calls to the stored procedure.
My goal is to speed up the process; currently the slowest part of my code is submitting queries to SQL, so I believe that submitting multiple queries at once will be somewhat faster.
Any ideas and suggestions are welcome.

The connector is probably not expecting more than one result set. With mysql-connector-python, multi-statement support is enabled per call by passing multi=True to execute() rather than to cursor(); execute() then returns an iterator with one result per statement:
results = mycursor.execute(stm, multi=True)
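For example, a minimal sketch of that pattern, assuming mysql-connector-python (whether multi is accepted depends on your connector version, so treat this as illustrative):
# sql_stms is assumed to hold the semicolon-separated statements from the question
for stm in sql_stms:
    # execute(..., multi=True) yields one result per statement in the string
    for result in mycursor.execute(stm, multi=True):
        if result.with_rows:      # only SELECT-like statements return rows
            result.fetchall()     # drain them so the connection stays in sync
mydb.commit()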

The interface is not designed to easily get two "result sets" at once.
There is very little advantage in trying to run two statements together. Simply run them one at a time.
You can, on the other hand, build a third stored procedure that makes those two CALLs. But again, why bother?

Related

Calling SQL Server stored procedure using Python pymssql

I am using pymssql to execute an MS SQL stored procedure from Python. When I try to execute a stored procedure, it does not seem to get executed. The code completes without any error, but upon verifying I can see the procedure was not really executed. What baffles me is that usual queries like SELECT and similar ones work. What might be missing here? I have tried the two ways below. The stored procedure does not have any parameters or arguments.
cursor.execute("""exec procedurename""")
and
cursor.callproc('procedurename',())
EDIT: The procedure loads a table with some latest data. When I execute the proc locally it loads the table with the latest data, but I can see the latest data is not being loaded when it is run from Python using pymssql.
Thanks to AlwaysLearning for providing the crucial clue to fixing the issue: I added connection.commit() after the procedure call and it started working!
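In other words, something along these lines (a sketch only; the server, credentials and database name are placeholders, and the procedure name is taken from the question):
import pymssql

conn = pymssql.connect(server='myserver', user='myuser',
                       password='mypassword', database='mydb')
cursor = conn.cursor()

# Either form issues the call; the important part is the commit afterwards,
# because pymssql wraps work in an implicit transaction by default.
cursor.callproc('procedurename', ())
# cursor.execute("EXEC procedurename")

conn.commit()   # without this, the procedure's data changes are rolled back
conn.close()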

Impala open connection in python

I'm after a way of querying Impala through Python which enables you to keep a connection open and pass queries to it.
I can connect quite happily to Impala using this sort of code:
import subprocess
sql = 'some sort of sql statement;'
cmds = ['impala-shell', '-k', '-B', '-i', 'impala.company.corp', '-q', sql]
out, err = subprocess.Popen(cmds, stderr=subprocess.PIPE, stdout=subprocess.PIPE).communicate()
print(out.decode())
print(err.decode())
I can also switch out the -q and sql for -f and a file with sql statements as per the documentation here.
When I run this for multiple SQL statements, the name node it uses is the same for all the queries, and it will stop if there is a failure in the code (unless I use the option to continue); this is all expected.
What I'm trying to get to is where I can run a query or two, check the results using some python logic and then continue if it meets my criteria.
I have tried splitting up my code into individual queries using sqlparse and running them one by one. This works well in isolation, but suppose one statement is drop table if exists x; and the next is create table x (blah string);. If x did actually exist, then because the second statement runs on a different node, the metadata change from the drop hasn't reached that node yet, and it fails with a "table x already exists" or similar error.
I'd think that, as well as getting round this metadata issue, it would just make more sense to keep a connection open to Impala whilst I run all the statements, but I'm struggling to work this out.
Does anyone have any code that has this functionality?
You may want to look at impyla, the Impala/Hive Python client, if you haven't done so already.
As for the second part of your question, using Impala's SYNC_DDL option will guarantee that DDL changes are propagated across impalads before the next DDL statement is executed.
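A rough sketch with impyla's DB-API interface, keeping one connection open for the whole sequence (host, port and auth settings are placeholders; GSSAPI roughly corresponds to impala-shell's -k Kerberos flag):
from impala.dbapi import connect

conn = connect(host='impala.company.corp', port=21050,
               auth_mechanism='GSSAPI')
cur = conn.cursor()

# Ask Impala to propagate DDL changes to all impalads before returning.
cur.execute('SET SYNC_DDL=1')

cur.execute('drop table if exists x')
cur.execute('create table x (blah string)')

# Check a result in Python before deciding whether to continue.
cur.execute('select count(*) from x')
print(cur.fetchall())

conn.close()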

How to execute many SELECT statements at once using python sqlite

I have some business logic that iterates many, many times and needs to perform a simple query each time. Rather than make a call to the database every time, I would like to store the SELECT statements as an array of strings or something similar and then execute all of the statements at once after the loop. Is this possible with Python and sqlite?
The documentation says:
execute() will only execute a single SQL statement. If you try to execute more than one statement with it, it will raise a Warning. Use executescript() if you want to execute multiple SQL statements with one call.
However, executescript() does not allow you to access all the results.
To get multiple query results, you have to do the loop yourself:
def execute_many_selects(cursor, queries):
    return [cursor.execute(query).fetchall() for query in queries]
SQLite is an embedded library, so there is no client/server communication overhead when doing multiple database calls.
I suspect you'd be better off if you work out a "larger" query and then decompose the result set after retrieving the information.
In other words, rather than three calls to the database (one each for Alice, Betty and Claire), use something like:
select stuff from a_table
where person in ('Alice', 'Betty', 'Claire')
and then process the actual data taking person into account.
Obviously, that will only work in the case where you can figure out the query before executing any of the person-based actions, but it looks like that's the case anyway, based on your question.
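A short sketch of that approach with sqlite3, assuming a hypothetical table a_table(person, stuff) as in the example above:
import sqlite3
from collections import defaultdict

conn = sqlite3.connect('example.db')   # placeholder database file
cur = conn.cursor()

people = ('Alice', 'Betty', 'Claire')
placeholders = ', '.join('?' for _ in people)
cur.execute(
    'SELECT person, stuff FROM a_table WHERE person IN (%s)' % placeholders,
    people,
)

# One round trip, then decompose the rows per person afterwards.
stuff_by_person = defaultdict(list)
for person, stuff in cur.fetchall():
    stuff_by_person[person].append(stuff)

print(stuff_by_person['Alice'])
conn.close()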

Execute .sql file in Python with MySQLdb

I have a .sql file containing a bunch of SQL queries, with each query spanning multiple lines. I want to execute these queries in MySQL via Python using MySQLdb.
sqlite3 has "a nonstandard shortcut" for this purpose called executescript(), but there doesn't seem to be any equivalent function in MySQLdb.
I noticed this old question from 2 years ago which asks the same thing, but I found the answers unsatisfying. The answers are basically:
1. Use subprocess to run the mysql command and send it your .sql file.
This works, but it is rather inelegant, and it introduces unwanted complexity with error handling and such.
2. If each query is on a single line, just execute each line separately.
But in my case, they span multiple lines, so this won't work.
3. If each query is not on a single line, somehow join them.
But, how? I mean, I can hack up something easily enough so there's no need for you to reply with half-baked answers here, and maybe that's what I'll end up doing, but is there already an established library that does this? I'd feel more comfortable with a comprehensive and correct solution rather than a hack.
MySQLdb seems to allow this out of the box; you just have to call cursor.nextset() to cycle through the returned result sets.
db = conn.cursor()
db.execute('SELECT 1; SELECT 2;')
more = True
while more:
    print(db.fetchall())
    more = db.nextset()
If you want to be absolutely sure this support is enabled, or to disable it again afterwards, you can use something like this:
MYSQL_OPTION_MULTI_STATEMENTS_ON = 0
MYSQL_OPTION_MULTI_STATEMENTS_OFF = 1
conn.set_server_option(MYSQL_OPTION_MULTI_STATEMENTS_ON)
# Multiple statement execution here...
conn.set_server_option(MYSQL_OPTION_MULTI_STATEMENTS_OFF)
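Putting the pieces together for the original .sql-file case, a rough sketch (connection details are placeholders, and it assumes the file contains plain semicolon-terminated statements, with no DELIMITER blocks or semicolons inside string literals):
import MySQLdb

conn = MySQLdb.connect(host='localhost', user='me', passwd='secret', db='mydb')

MYSQL_OPTION_MULTI_STATEMENTS_ON = 0
conn.set_server_option(MYSQL_OPTION_MULTI_STATEMENTS_ON)

with open('script.sql') as f:
    script = f.read()

cur = conn.cursor()
cur.execute(script)                  # send the whole script in one round trip
while True:
    if cur.description is not None:  # this statement produced rows
        cur.fetchall()               # drain (or inspect) them
    if cur.nextset() is None:        # advance to the next statement's result
        break

conn.commit()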

How to run DDL script with kinterbasdb

Is there a way to execute a DDL script from Python with the kinterbasdb library for a Firebird database?
Basically I'd like to replicate the 'isql -i myscript.sql' command.
It has been a while since I used kinterbasdb, but as far as I know you should be able to do this with any query command that can also be used for INSERT, UPDATE and DELETE (i.e. anything that does not produce a result set). So Connection.execute_immediate and Cursor.execute should work.
Did you actually try this?
BTW: With Firebird it is advisable not to mix DDL and DML in one transaction.
EDIT:
I just realised that you might have meant a full DDL script with multiple statements. If that is what you mean, then no, you cannot; you need to execute each statement individually.
You might be able to use an EXECUTE BLOCK statement, but you may need to modify your script so much that it would be easier to simply try to split the actual script into individual statements.
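For reference, a rough sketch of the statement-by-statement route with kinterbasdb (the DSN and credentials are placeholders, and the naive split on ';' only holds for plain DDL scripts with no semicolons inside literals, trigger bodies or EXECUTE BLOCKs):
import kinterbasdb

con = kinterbasdb.connect(dsn='localhost:/data/mydb.fdb',
                          user='sysdba', password='masterkey')

with open('myscript.sql') as f:
    script = f.read()

# Naive split into individual statements.
statements = [s.strip() for s in script.split(';') if s.strip()]

for stmt in statements:
    con.execute_immediate(stmt)
    con.commit()   # keep each DDL statement in its own transaction, as advised above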
