Invalid Cursor State Python SQL Statement - python

I am writing a simple Python program to retrieve information from a localhost database I set up.
I am encountering the following error:
('24000', '[24000] [Microsoft][ODBC Driver 17 for SQL Server]Invalid cursor state')
My code:
try:
    cursor = conn.cursor()
    cursor.execute(getSQLStatement('SelectRecipe'), [item])
    recipe = cursor.fetchone()[0]
    cursor.close()
except Exception as e:
    print(e)
    return
The query I am running works fine in the database client, so the query itself isn't the issue; it is a simple SELECT statement:
SELECT
    Recipe
FROM recipes
WHERE Name = ?  -- (item)
My desired result is just the most basic statement, so I can get it running and then expand the complexity later, but something is wrong even at this level. I am just trying to get the value of one column from one record with this code. When I searched for similar issues, the answer was always related to multiple result sets being returned, but my query produces just one result set and I am only running it once. I am scratching my head here trying to find what I am missing.
The error seems to be related to the
recipe = cursor.fetchone()[0]
line, as the code runs fine without it; however, I am then not sure how else to get the data into a variable.
I'm wondering if anyone could help me out here, whether there is a better way to get my data or there is something wrong with my implementation of this approach. Thanks.
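Two things worth checking, sketched below. First, cursor.fetchone() returns None when no row matches, so indexing the result unconditionally can fail on its own with a TypeError. Second, with pyodbc an "Invalid cursor state" often appears when the batch produces more than one result set (for example, a stored procedure or a script with extra statements); fetching from the first result set, or adding SET NOCOUNT ON for SQL Server procedures, is the usual remedy. Here is a minimal defensive version of the lookup; it is written against stdlib sqlite3 so it runs without a SQL Server instance, the table and values are made up, and the same DB-API pattern applies with pyodbc:

```python
import sqlite3

# Made-up schema and data, standing in for the asker's recipes table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recipes (Name TEXT, Recipe TEXT)")
conn.execute("INSERT INTO recipes VALUES ('pasta', 'Boil water; add pasta.')")

def get_recipe(conn, item):
    cursor = conn.cursor()
    try:
        # Parameterised single-row lookup.
        cursor.execute("SELECT Recipe FROM recipes WHERE Name = ?", (item,))
        row = cursor.fetchone()
        # Guard against "no matching row" before indexing.
        return row[0] if row is not None else None
    finally:
        cursor.close()

print(get_recipe(conn, "pasta"))    # Boil water; add pasta.
print(get_recipe(conn, "missing"))  # None, not a TypeError
```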

Related

Query works in postgres shell but sometimes fails to return results in psycopg2

I'm completely stumped.
The query looks something like this:
WITH e AS (
    INSERT INTO TEAMS(TEAM_NAME, SPORT_ID, TEAM_GENDER)
    VALUES ('Cameroon U23', '1', 'M')
    ON CONFLICT (TEAM_NAME, SPORT_ID, TEAM_GENDER)
    DO NOTHING
    RETURNING TEAM_ID
)
SELECT * FROM e
UNION
SELECT TEAM_ID FROM TEAMS WHERE LOWER(TEAM_NAME)=LOWER('Cameroon U23') AND SPORT_ID='1' AND LOWER(TEAM_GENDER)=LOWER('M');
And the python code like this:
sqlString = """WITH e AS (
    INSERT INTO TEAMS(TEAM_NAME, SPORT_ID, TEAM_GENDER)
    VALUES (%s, %s, %s)
    ON CONFLICT (TEAM_NAME, SPORT_ID, TEAM_GENDER)
    DO NOTHING
    RETURNING TEAM_ID
)
SELECT * FROM e
UNION
SELECT TEAM_ID FROM TEAMS WHERE LOWER(TEAM_NAME)=LOWER(%s) AND SPORT_ID=%s AND LOWER(TEAM_GENDER)=LOWER(%s);"""
cur.execute(sqlString, (TEAM_NAME, SPORT_ID, TEAM_GENDER, TEAM_NAME, SPORT_ID, TEAM_GENDER))
fetch = cur.fetchone()[0]
The error that I get is on cur.fetchone()[0], because cur.fetchone() doesn't return any values for some reason. I have also tried cur.fetchall(), but it's the same issue.
This query works every time without fail in the normal postgres shell. However, in my python code using psycopg2, it will sometimes error out and not return anything. When I check the DB from the shell, the data I am looking for is there so it is the select query that should be returning something but isn't.
I am not sure if this is relevant, but I am creating concurrent connections (not connection pools) and doing multiple of these queries at once. Each query has a different team, however, to prevent deadlock.
I have found the issue. It was to do with my use of concurrency. I was wrong in saying that each query has a different team; the teams might sometimes be the same.
The main issue was that my INSERT would try to put some data in and find a duplicate, because a concurrent query was also trying to insert the same data. But then, for some reason, the SELECT wouldn't find that data. I don't know exactly what the issue is, but that's my understanding.
I had to change to doing a SELECT, checking if there was a result, then doing an INSERT if there wasn't, and then doing a final SELECT if the INSERT didn't return anything. The INSERT sometimes does not return anything because it encounters a conflict with an entry that appeared after the first SELECT was executed.
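The SELECT, then INSERT if missing, then final SELECT fallback described above can be sketched as follows. This is a hypothetical illustration using stdlib sqlite3, with INSERT OR IGNORE standing in for Postgres's ON CONFLICT DO NOTHING, so it runs without a server; the control flow is the same with psycopg2:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE teams (
    team_id     INTEGER PRIMARY KEY,
    team_name   TEXT,
    sport_id    TEXT,
    team_gender TEXT,
    UNIQUE (team_name, sport_id, team_gender))""")

FIND = ("SELECT team_id FROM teams "
        "WHERE team_name = ? AND sport_id = ? AND team_gender = ?")

def get_or_create_team(conn, name, sport_id, gender):
    cur = conn.cursor()
    cur.execute(FIND, (name, sport_id, gender))
    row = cur.fetchone()
    if row is None:
        # INSERT OR IGNORE swallows the conflict if a concurrent
        # writer created the row between our SELECT and INSERT.
        cur.execute("INSERT OR IGNORE INTO teams "
                    "(team_name, sport_id, team_gender) VALUES (?, ?, ?)",
                    (name, sport_id, gender))
        conn.commit()
        # Final SELECT: the row exists now whether we or a peer inserted it.
        cur.execute(FIND, (name, sport_id, gender))
        row = cur.fetchone()
    return row[0]

a = get_or_create_team(conn, "Cameroon U23", "1", "M")
b = get_or_create_team(conn, "Cameroon U23", "1", "M")
print(a == b)  # True: the second call finds the row the first created
```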
EDIT:
Nevermind. The problem was, in fact, that my deadlock_timeout was too low. My program wasn't actually reaching deadlock (where two processes are each waiting on the other and cannot proceed because each depends on the other finishing). So increasing deadlock_timeout to be larger than the average time for one of my processes to complete was the solution.
THIS WILL NOT WORK if your program is actually reaching deadlock. In that case, fix it, because it should not be reaching deadlock ever.
Hope this helps someone.

Mariadb python: Commands out of sync

So I was trying to create a password manager for myself, using Python and MariaDB. After creating a table named pw, which contains three columns (Name, Account and Passwords), I tried to create a function, Search_Passwords(app_name), which I can use to enter a keyword to search the database and get the right passwords back. However, I ran into this error message:
(screenshot: "Commands out of sync" error message)
I'm new to Python and MariaDB (using it because, for some reason, MySQL doesn't work...). I tried to look up some answers but still can't figure it out. Can anyone help, please? Below are other pieces of code I think might be related.
(screenshot: what MariaDB's table looks like)
(screenshot: Search_Passwords())
(screenshot: class UseDataBase)
(screenshot: the reference version of Search_Passwords() I found online)
Sorry if my code is not perfect... :(
MariaDB Connector/Python uses unbuffered result sets by default, which means that before executing a statement on another cursor, all pending result sets must be fetched or the cursor must be closed.
For example, the following script
import mariadb

conn = mariadb.connect()
cursor1 = conn.cursor()
cursor1.execute("select 1 from dual")
cursor2 = conn.cursor()
cursor2.execute("select 2 from dual")
will throw the exception mariadb.InterfaceError: Commands out of sync; you can't run this command now.
To avoid this, you need to create a buffered cursor:
cursor1 = conn.cursor(buffered=True)

Python Mysql get variable value after select completes

So I am trying to do an IF statement after the SELECT is done in Python. It's hard to explain, so it is better to provide an example:
customer = cursor.execute("SELECT * FROM customer")
try:
    customer['name'] = "john"
    # (do something here)
except:
    customer['name'] != "john"
    # (do something else here)
I hope this makes sense, and sorry if this is not enough information; I am trying to think of how to explain this. I don't want to use a WHERE clause in the SELECT, because I don't want it to exclude rows just because their 'name' is not "john".
First of all, you are not telling us which library you are using for working with MySQL. I assume it's MySQLdb.
After executing a SQL query with cursor.execute you have to call fetchall or fetchone. In the latter case you will be able to do customer['name'] (provided you use a dictionary cursor such as MySQLdb.cursors.DictCursor; the default cursor returns tuples).
Example:
results = cursor.execute("SELECT * FROM customers")
customer = cursor.fetchone()
You shouldn't be using try/except here, since there is no obvious case where an exception is thrown or expected.
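Putting that together, here is a sketch of the fetch-then-branch pattern the question is after. It is illustrated with stdlib sqlite3 and sqlite3.Row standing in for MySQLdb's DictCursor, so that customer['name'] works and the snippet runs as-is; the table and names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # rows support customer['name'], like DictCursor
conn.execute("CREATE TABLE customer (name TEXT)")
conn.execute("INSERT INTO customer VALUES ('john')")
conn.execute("INSERT INTO customer VALUES ('mary')")

cursor = conn.cursor()
cursor.execute("SELECT * FROM customer ORDER BY name")  # no WHERE: every row comes back

greetings = []
for customer in cursor.fetchall():
    if customer['name'] == 'john':
        greetings.append('hello john')    # (do something here)
    else:
        greetings.append('someone else')  # (do something else here)

print(greetings)  # ['hello john', 'someone else']
```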

Python MySQLdb returning variable amount of rows on same call

I'm working on a project but I'm kinda stuck due to a weird problem. I pull data from an external API and save it into my own SQL database with a Python script. After pulling the data I check if it's already present in my database. I do this with the following code snippet:
def getDatabaseMatchesForSummoner(summonerId):
    sqlSelect = 'SELECT gameId FROM playerMatchHistory WHERE playerId=%s'
    try:
        cur.execute(sqlSelect, (summonerId,))
        db.commit()
    except MySQLdb.Error as e:  # catch the driver's error class, not the module
        db.rollback()
        print e
    gameIds = []
    print cur.rowcount
    for i in range(cur.rowcount):
        gameIds += [str(cur.fetchone()[0])]
    return gameIds
Now the problem is the following: this piece of code tends to return a number of rows that does not agree with my actual database. For instance, for a particular summoner ID it returns 7 rows, whereas if I enter the query into phpMyAdmin I get 10, the correct amount. I've been searching for some hours now and I honestly can't find anything wrong with it. I tried some other things like fetchall(), other string formatting, etc. I really hope someone can point out what's wrong.
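For what it's worth, relying on cur.rowcount to drive the fetch loop is fragile: for SELECT statements, some drivers and cursor classes report -1 or an estimate rather than the true row count. Iterating over fetchall() avoids the problem entirely. A minimal sketch using stdlib sqlite3 (where rowcount is in fact -1 for SELECTs), so it runs without a MySQL server; the same DB-API calls work with MySQLdb:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE playerMatchHistory (playerId INTEGER, gameId INTEGER)")
conn.executemany("INSERT INTO playerMatchHistory VALUES (?, ?)",
                 [(1, 100), (1, 101), (1, 102)])

def get_database_matches(conn, summoner_id):
    cur = conn.cursor()
    cur.execute("SELECT gameId FROM playerMatchHistory WHERE playerId = ? "
                "ORDER BY gameId", (summoner_id,))
    # Don't trust rowcount for SELECTs; just fetch everything.
    return [str(row[0]) for row in cur.fetchall()]

print(get_database_matches(conn, 1))  # ['100', '101', '102']
```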

Preferred method of adding data to MySQL database

My set-up:
1. MySQL server.
2. A host running a Python script.
(1) and (2) are different machines on the network.
The Python script generates data which must be stored in a MySQL database.
I use this (example) code to achieve that:
import MySQLdb as mdb

def sqldata(date, result):
    con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')
    with con:
        cur = con.cursor()
        cur.execute('INSERT INTO tabel(titel, nummer) VALUES(%s, %s)', (date, result))
The script generates one data point approx. every minute. So this means that a new connection is opened and closed every minute. I'm wondering if it would be a better idea to open the connection at the start of the script and only close it when the script terminates, effectively leaving the connection open indefinitely.
This then obviously raises the question of how to handle/recover when the SQL server "leaves" the network (e.g. due to a reboot) for a while.
While typing my question, this question appeared in the "Similar Questions" section. It is, however, from 2008 and possibly outdated, and the 4 answers it received seem to contradict each other.
What are the current insights in this matter?
Well, the referenced answer makes a fair point, but maybe it does not answer all your questions. I cannot provide a full running Python script for you here, but let me explain how I would go about it:
Rule 1: Generally, most MySQL functions return values that you should always check, so that you can react to unwanted behavior.
Rule 2: Open a connection at the beginning of your script and use this one and only connection throughout your script.
Obviously you could check in your sqldata function whether there is an existing connection, and if not, open a new one and assign it to the global con object.
if not con:
    con = mdb.connect('sql.lan', 'demouser', 'demo', 'demo')
And if there is a connection already, you could check its "up status" by performing a simple query with a fixed expected result, to see if the SQL server is running.
if con:
    cur = con.cursor()
    cur.execute('SELECT COUNT(*) FROM tabel')
    # MySQLdb's execute() returns a row count, not a result object;
    # fetch the row to confirm the server actually answered.
    if cur.fetchone() is not None:
        ....
Basically you could avoid this, because if you don't get a cursor back, and you check that first before using it, then you already know if the server is alive or not.
So CHECK, CHECK and CHECK. You should check everything you get back from a function to have good error handling. Just using a connection or a cursor without checking it first can leave you calling methods on a None object and crash your script.
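One way to follow Rule 2 (a single long-lived connection) while still surviving a server reboot is to wrap statement execution in a small retry helper that reconnects on a connection-level error. The sketch below is illustrative only: it uses stdlib sqlite3 and an in-memory database so it is runnable as-is, and the helper name execute_with_retry is made up; with MySQLdb you would catch MySQLdb.OperationalError and reconnect with mdb.connect(...) instead.

```python
import sqlite3

def connect():
    # Stand-in for mdb.connect('sql.lan', 'demouser', 'demo', 'demo');
    # "reconnecting" just means calling this factory again.
    return sqlite3.connect(":memory:")

def execute_with_retry(conn, factory, sql, params=(), retries=1):
    """Run sql on conn; on a connection-level error, reconnect and retry."""
    for attempt in range(retries + 1):
        try:
            cur = conn.cursor()
            cur.execute(sql, params)
            return conn, cur  # hand conn back in case it was replaced
        except sqlite3.OperationalError:
            if attempt == retries:
                raise  # give up after the final attempt
            conn = factory()  # server went away: open a fresh connection

conn = connect()
conn, cur = execute_with_retry(conn, connect, "SELECT 1")
value = cur.fetchone()[0]
print(value)  # 1
```

MySQLdb connections also offer conn.ping(), which can serve as a cheap liveness probe before running the real query.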
And the last BIG HINT I can give you is to use multi-row inserts. You can insert hundreds of rows in one call; the standard DB-API way is executemany, which MySQLdb batches into a single multi-row INSERT for you:
# consider result would be filled like this
result = [("First Song", 1), ("Second Song", 2), ("Third Song", 3)]
# then this will insert 3 rows with one call
cur.executemany('INSERT INTO tabel (titel, nummer) VALUES (%s, %s)', result)
# and now you can check cur.rowcount for any error
if cur.rowcount != len(result):
....

Categories

Resources