I have a DDL object (create_function_foo) that contains a CREATE FUNCTION statement. On its first line I put DROP FUNCTION IF EXISTS foo;, but engine.execute(create_function_foo) raises:
sqlalchemy.exc.InterfaceError: (InterfaceError) Use multi=True when executing multiple statements
I passed multi=True as a parameter to create_engine, engine.execute_options and engine.execute, but it doesn't work.
NOTE: engine is my instance returned by create_engine
NOTE: I'm using Python 3.2 + mysql.connector 1.0.12 + SQLAlchemy 0.8.2
create_function_foo = DDL("""\
DROP FUNCTION IF EXISTS foo;
CREATE FUNCTION `foo`(
SID INT
) RETURNS double
READS SQL DATA
BEGIN
...
END
""")
Where should I put it?
multi=True is a requirement of MySQL Connector. You cannot set this flag by passing it to SQLAlchemy methods. Do this instead:
conn = session.connection().connection  # raw DBAPI connection underneath the ORM session
cursor = conn.cursor()  # get mysql db-api cursor
cursor.execute(sql, multi=True)
More info here: http://www.mail-archive.com/sqlalchemy@googlegroups.com/msg30129.html
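Note that MySQL Connector's execute(..., multi=True) returns an iterator with one result object per statement, so it seems safest to consume it to make sure both the DROP and the CREATE actually run. A minimal sketch along those lines (same session and sql names as above):

conn = session.connection().connection   # raw DBAPI connection
cursor = conn.cursor()

# execute(..., multi=True) yields one result object per statement;
# iterating ensures every statement in the string is processed
for result in cursor.execute(sql, multi=True):
    if result.with_rows:
        result.fetchall()   # drain any rows so the next statement can run

cursor.close()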
Yeah... this seems like a bummer to me. I don't want to use the ORM, so the accepted answer didn't work for me. I did this instead:
with open('sql_statements_file.sql') as sql_file:
    for statement in sql_file.read().split(';'):
        if len(statement.strip()) > 0:
            connection.execute(statement + ';')
And then this failed for a CREATE FUNCTION... YMMV.
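The split on ';' is exactly what breaks for CREATE FUNCTION, because the function body itself contains semicolons. Since the DDL in the question is really just two statements, another option (an untested sketch, with the function body elided as in the question) is to execute them separately so multi=True is never needed:

drop_function_foo = "DROP FUNCTION IF EXISTS foo"

create_function_foo = """\
CREATE FUNCTION `foo`(
    SID INT
) RETURNS double
READS SQL DATA
BEGIN
    ...
END
"""

# one statement per execute() call, so the connector never needs multi=True
connection.execute(drop_function_foo)
connection.execute(create_function_foo)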
There are some cases where SQLAlchemy does not provide a generic way of accessing certain DBAPI features, such as dealing with multiple result sets. In these cases, you should work with the raw DBAPI connection directly.
From the SQLAlchemy documentation:
connection = engine.raw_connection()
try:
    cursor = connection.cursor()
    cursor.execute("select * from table1; select * from table2")
    results_one = cursor.fetchall()
    cursor.nextset()
    results_two = cursor.fetchall()
    cursor.close()
finally:
    connection.close()
You can also do the same using MySQL Connector, as shown here:
operation = 'SELECT 1; INSERT INTO t1 VALUES (); SELECT 2'
for result in cursor.execute(operation, multi=True):
    if result.with_rows:
        print("Rows produced by statement '{}':".format(result.statement))
        print(result.fetchall())
    else:
        print("Number of rows affected by statement '{}': {}".format(
            result.statement, result.rowcount))
I have a Python application, in which I'm calling a MySQL stored procedure from my view, like so:
import mysql.connector
proc = 'audit_report'
parms = [data['schoolid'],
         dateToISO(data['startdatedefault'], 'from'),
         dateToISO(data['enddatedefault'], 'to'),
         joinIntList(data['studypgms'], joinWith),
         joinIntList(data['fedpgms'], joinWith),
         joinIntList(data['statuses'], joinWith),
         data['fullssndefault']]
conn = mysql.connector.connect(user='usr', database='db', password='pwd')
cursor = conn.cursor(dictionary=True)
cursor.callproc(proc, parms)
for result in cursor.stored_results():
print(result.fetchall())
I am getting the data back as a list of tuples, the standard output. Since I'm using connector version 2.1.7, the docs say that adding
dictionary=True
to the cursor declaration should cause the row set to be returned as a list of dictionaries, with the column name as the key of each dictionary. The main difference between my application and the example in the docs is that I'm using cursor.callproc(), whereas the examples use cursor.execute() with actual SQL code.
I tried
print(cursor.column_names)
to see if I could get the column names that way, but all I get is
('@_audit_report_arg1', '@_audit_report_arg2', '@_audit_report_arg3', '@_audit_report_arg4', '@_audit_report_arg5', '@_audit_report_arg6', '@_audit_report_arg7')
which looks more like the input parameters to the stored procedure.
Is there any way to actually get the column names of the returned data? The procedure is somewhat complex and contains crosstab-type manipulation, but calling the same stored procedure from MySQL Workbench happily supplies the column names.
Normally, knowing what the output is supposed to be, I could hard-code column names, except this procedure crosstabs the data for the last few columns, and it is unpredictable what they will be until after the query runs.
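If the metadata is available on the result objects themselves, I could in principle build the dictionaries by hand. Something like this sketch (untested; it assumes the objects yielded by cursor.stored_results() expose column_names for the procedure's result set, not the @_audit_report_argN input parameters, the way regular cursors do):

cursor.callproc(proc, parms)

for result in cursor.stored_results():
    # pair each row with the result set's own column names
    columns = result.column_names
    rows = [dict(zip(columns, row)) for row in result.fetchall()]
    print(rows)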
Thanks...
You can use pymysql in Python 3 and it should work fine!
import pymysql.cursors

connection = pymysql.connect(host='',
                             user='',
                             password='',
                             db='test',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    with connection.cursor() as cursor:
        # Read a single record
        sql = "query"
        cursor.execute(sql)
        result = cursor.fetchone()
        num_fields = len(cursor.description)
        field_names = [i[0] for i in cursor.description]
        print(field_names)
finally:
    connection.close()
I am connecting to mysql database via mysql connector and running a simple query to pull a list of IDs. I need to loop over that list and pass them into some other code. For some reason I am getting a list of tuples. Is this expected behavior? If not, what am I doing wrong?
Here is the snippet of my code:
import mysql.connector
conn = mysql.connector.connect(host='127.0.0.1', database='t', user='r', password='pwd')
cursor = conn.cursor()
query = ( "select id from T where updated < '%s'" % (run_date) )
cursor.execute(query)
for row in cursor:
    print(row)
cursor.close()
I am getting the following back (from an INT field in d/b):
(Decimal('991837'),)
(Decimal('991838'),)
(Decimal('991839'),)
(Decimal('991871'),)
(Decimal('991879'),)
(Decimal('991899'),)
(Decimal('992051'),)
(Decimal('992299'),)
(Decimal('992309'),)
If you want to access just the data in the row, you need to go through a dictionary.
First you must enable it on the cursor:
cur = db.cursor(buffered=True, dictionary=True)
Then each row will look like this:
{'id': Decimal('991837')}
where the key is the column name from your query (id here), not the type.
So when you need to access the value, do this:
import mysql.connector
conn = mysql.connector.connect(host='127.0.0.1', database='t', user='r', password='pwd')
cursor = conn.cursor(dictionary=True)
query = ( "select id from T where updated < '%s'" % (run_date) )
cursor.execute(query)
for row in cursor:
    print(row['id'])
cursor.close()
I hope this works for you; I was looking for this solution for the past two days and found no answers. The only way I could debug it was to open the debugger and print out all the variables.
Have fun with Python :)
Yes, this is expected behavior. Using the cursor as an iterable is basically equivalent to looping over it using the fetchone() method. From the documentation for fetchone() (emphasis mine):
This method retrieves the next row of a query result set and returns a
single sequence, or None if no more rows are available. By default,
the returned tuple consists of data returned by the MySQL server,
converted to Python objects. If the cursor is a raw cursor, no such
conversion occurs;
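So each row comes back as a one-element tuple. If you only want the bare IDs, you can unpack them as you go; a minimal sketch, assuming the same cursor and query as in the question:

cursor.execute(query)

# option 1: tuple-unpack each single-column row in the loop
for (row_id,) in cursor:
    print(row_id)

# option 2: build a plain list of IDs in one go
cursor.execute(query)
ids = [row[0] for row in cursor.fetchall()]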
I'm attempting to transition a code base from using MySQLdb to pymysql. I'm encountering the following problem and wonder if anyone has seen something similar.
In a nutshell, if I call a stored procedure through the pymysql cursor callproc() method a subsequent 'select' call through the execute() method using the same or a different cursor returns incorrect results. I see the same results for Python 2.7.2 and Python 3.2.2
Is the callproc() method locking up the server somehow? Code is shown below:
conn = pymysql.connect(host='localhost', user='me', passwd='pwd',db='mydb')
curr = conn.cursor()
rargs = curr.callproc("getInputVar", (args,))
resultSet = curr.fetchone()
print("Result set : {0}".format(resultSet))
# curr.close()
#
# curr = conn.cursor()
curr.execute('select * from my_table')
resultSet = curr.fetchall()
print("Result set len : {0}".format(len(resultSet)))
curr.close()
conn.close()
I can uncomment the close() and cursor creation calls above but this doesn't change the result. If I comment out the callproc() invocation the select statement works just fine.
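One thing I have not ruled out is whether the procedure call leaves extra result sets pending on the connection, which could confuse the following SELECT. A sketch of what I could try (assuming pymysql's nextset() behaves as the DB-API describes):

rargs = curr.callproc("getInputVar", (args,))
resultSet = curr.fetchone()
print("Result set : {0}".format(resultSet))

# drain any remaining result sets produced by the stored procedure call
while curr.nextset():
    pass

curr.execute('select * from my_table')
print("Result set len : {0}".format(len(curr.fetchall())))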
I have a similar problem with (committed) INSERT statements not appearing in the database, with PyMySQL 0.5 for Python 3.2 and MySQL Community Server 5.5.19.
I found a solution for my case: instead of using the execute() method, I used the executemany() method, explained in the module reference at
http://code.google.com/p/pymssql/wiki/PymssqlModuleReference
There is also a link to examples there.
Update
A little later today, I found out that this is not yet the full solution.
Calling exit() too quickly at the end of the Python script makes the data get lost in the database.
So I added a time.sleep() before closing the connection and before exit()ing the script, and finally all the data appeared!
(I also switched to using a MyISAM table.)
import pymysql
conn = pymysql.connect(host='localhost', user='root', passwd='', db='mydb', charset='utf8')
conn.autocommit(True)
cur = conn.cursor()
# CREATE tables (SQL statements generated by MySQL workbench, and exported with Menu -> Database -> Forward Engineer)
cur.execute("""
SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0;
SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0;
SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='TRADITIONAL';
DROP SCHEMA IF EXISTS `mydb` ;
CREATE SCHEMA IF NOT EXISTS `mydb` DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci ;
USE `mydb` ;
# […]
SET SQL_MODE=@OLD_SQL_MODE;
SET FOREIGN_KEY_CHECKS=@OLD_FOREIGN_KEY_CHECKS;
SET UNIQUE_CHECKS=@OLD_UNIQUE_CHECKS;
""")
# Fill lookup tables:
cur.executemany("insert into mydb.number(tagname,name,shortform) values (%s, %s, %s)", [('ЕД','singular','sg'), ('МН','plural','p')] )
cur.executemany("insert into mydb.person(tagname,name,shortform) values (%s, %s, %s)", [('1-Л','first','1st'), ('2-Л','second','2nd'), ('3-Л','third','3rd')] )
cur.executemany("insert into mydb.pos(tagname,name,shortform) values (%s, %s, %s)", [('S','noun','s'), ('A','adjective','a'), ('ADV','adverb','adv'), ('NUM','numeral','num'), ('PR','preposition','pr'), ('COM','composite','com'), ('CONJ','conjunction','conj'), ('PART','particle','part'), ('P','word-clause','p'), ('INTJ','interjection','intj'), ('NID','foreign-named-entity','nid'), ('V','verb','v')] )
#[…]
import time
time.sleep(3)
cur.close()
conn.close()
time.sleep(3)
exit()
I suggest the forum/group https://groups.google.com/forum/#!forum/pymysql-users for further discussion with the developer.
Does pyodbc have an execute scalar function?
Something like ExecuteScalar in the .NET SQL library (ADO.NET)?
The pyodbc cursor has a fetchone() method.
cursor.execute("select user_name from users where user_id=?", userid)
row = cursor.fetchone()
if row:
    print row.user_name
    # or print row[0]
I don't think so, but SQLAlchemy does (apart from using the ORM etc., it can also be used as a handy higher-level interface to DB-API libraries). As an example:
import sqlalchemy
# using mssql as an example because sqlalchemy uses pyodbc as the default driver for MS Sql Server
engine = sqlalchemy.create_engine("mssql://myserver/mydb")
# first column of first row is returned
username = engine.scalar("select username from users where userid = 1")
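Note that engine.scalar() belongs to the old implicit-execution API and is removed in SQLAlchemy 2.x; there, the rough equivalent looks like the sketch below (same hypothetical mssql URL as above):

import sqlalchemy

engine = sqlalchemy.create_engine("mssql+pyodbc://myserver/mydb")

with engine.connect() as conn:
    # Connection.scalar() returns the first column of the first row, or None
    username = conn.scalar(
        sqlalchemy.text("select username from users where userid = :uid"),
        {"uid": 1},
    )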
You can simplify the pyodbc call like this:
name = cursor.execute("select user_name from users where user_id=?", userid).fetchval()
because
fetchval()
Returns the first column of the first row if there are
results
and
execute(sql, *parameters)
Prepares and executes a SQL statement, returning the Cursor object itself
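If you want the .NET-style call shape, it is easy to wrap this in a tiny helper (just an illustrative sketch; execute_scalar is a made-up name, not part of pyodbc):

def execute_scalar(cursor, sql, *params):
    """Return the first column of the first row, or None if the query returns no rows."""
    row = cursor.execute(sql, *params).fetchone()
    return row[0] if row else None

# usage, equivalent to the fetchval() example above
name = execute_scalar(cursor, "select user_name from users where user_id=?", userid)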
I have this statement, and I want to get the number of rows that the query returns:
cursor = connection.cursor()
query = "SELECT * from table"
cursor.execute(query)
res = cursor.fetchall()
cursor = connection.cursor()
query = "SELECT * from table"
cursor.execute(query)
print cursor.rowcount
According to the Python Database API Specification v2.0, the rowcount attribute of the cursor object should return the number of rows that the last query produced or affected (the latter is for queries that alter the database). If your database module conforms to the API specification, you should be able to use the rowcount attribute.
The num_rows() function you are looking for does not exist in the MySQLdb module. There is an internal module called _mysql which has a result class with a num_rows method, but you shouldn't be using that - the very existence of _mysql is considered an implementation detail.
The most voted answer did not work for me. It should be like this:
cursor = connection.cursor()
query = "SELECT * FROM table"
cursor.execute(query)
cursor.fetchall()  # consume the result first; with an unbuffered cursor, rowcount may not be set until the rows are fetched
print(cursor.rowcount)
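If all you actually need is the count and not the data itself, it may be cheaper to let the database count the rows (a sketch; table is a placeholder name as in the snippets above):

cursor = connection.cursor()
cursor.execute("SELECT COUNT(*) FROM table")
(row_count,) = cursor.fetchone()
print(row_count)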