Error while testing postgresql database with python

I wanted to get started with using databases in Python. I chose PostgreSQL as the database "language". I have already created several databases, but now I simply want to check whether a database exists with Python. For this I already read this answer: Checking if a postgresql table exists under python (and probably Psycopg2) and tried to use their solution:
import sys
import psycopg2

con = None
try:
    con = psycopg2.connect(database="testdb", user="test", password="abcd")
    cur = con.cursor()
    cur.execute("SELECT exists(SELECT * from information_schema.testdb)")
    ver = cur.fetchone()[0]
    print ver
except psycopg2.DatabaseError, e:
    print "Error %s" % e
    sys.exit(1)
finally:
    if con:
        con.close()
But unfortunately, I only get the output
Error relation "information_schema.testdb" does not exist
LINE 1: SELECT exists(SELECT * from information_schema.testdb)
Am I doing something wrong, or did I miss something?

Your question confuses me a little, because you say you want to look to see if a database exists, but you look in the information_schema.tables view. That view would tell you if a table existed in the currently open database. If you want to check if a database exists, assuming you have access to the 'postgres' database, you could:
import sys
import psycopg2, psycopg2.extras

dbname = 'db_to_check_for_existance'
con = None
try:
    con = psycopg2.connect(database="postgres", user="postgres")
    cur = con.cursor(cursor_factory=psycopg2.extras.DictCursor)
    cur.execute("select * from pg_database where datname = %(dname)s", {'dname': dbname})
    answer = cur.fetchall()
    if len(answer) > 0:
        print "Database {} exists".format(dbname)
    else:
        print "Database {} does NOT exist".format(dbname)
except Exception, e:
    print "Error %s" % e
    sys.exit(1)
finally:
    if con:
        con.close()
What is happening here is that you are looking in the system catalog pg_database. The column 'datname' contains each of the database names. Your code would supply db_to_check_for_existance as the name of the database you want to check for existence. For example, you could replace that value with 'postgres' and you would get the 'exists' answer; if you replace the value with 'aardvark' you would probably get the does NOT exist report.

If you're trying to see if a database exists:
curs.execute("SELECT exists(SELECT 1 from pg_catalog.pg_database where datname = %s)", ('mydb',))
It sounds like you may be confused by the difference between a database and a table.
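For completeness, a minimal sketch of that check wrapped in a connection (the maintenance database "postgres" and the target name 'mydb' are assumptions; adjust them to your setup):
import psycopg2
# Connect to the always-present maintenance database, then check pg_database
con = psycopg2.connect(database="postgres", user="postgres")
cur = con.cursor()
cur.execute("SELECT exists(SELECT 1 from pg_catalog.pg_database where datname = %s)", ('mydb',))
if cur.fetchone()[0]:
    print "Database mydb exists"
else:
    print "Database mydb does NOT exist"
con.close()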


mysql.connector.errors.InternalError: Unread result found when trying to insert into remote database [duplicate]

I am inserting JSON data into a MySQL database.
I am parsing the JSON and then inserting it into a MySQL db using the Python connector.
Through trial, I can see the error is associated with this piece of code:
for steps in result['routes'][0]['legs'][0]['steps']:
    query = ('SELECT leg_no FROM leg_data WHERE travel_mode = %s AND Orig_lat = %s AND Orig_lng = %s AND Dest_lat = %s AND Dest_lng = %s AND time_stamp = %s')
    if steps['travel_mode'] == "pub_tran":
        travel_mode = steps['travel_mode']
        Orig_lat = steps['var_1']['dep']['lat']
        Orig_lng = steps['var_1']['dep']['lng']
        Dest_lat = steps['var_1']['arr']['lat']
        Dest_lng = steps['var_1']['arr']['lng']
        time_stamp = leg['_sent_time_stamp']
    if steps['travel_mode'] == "a_pied":
        query = ('SELECT leg_no FROM leg_data WHERE travel_mode = %s AND Orig_lat = %s AND Orig_lng = %s AND Dest_lat = %s AND Dest_lng = %s AND time_stamp = %s')
        travel_mode = steps['travel_mode']
        Orig_lat = steps['var_2']['lat']
        Orig_lng = steps['var_2']['lng']
        Dest_lat = steps['var_2']['lat']
        Dest_lng = steps['var_2']['lng']
        time_stamp = leg['_sent_time_stamp']
    cursor.execute(query, (travel_mode, Orig_lat, Orig_lng, Dest_lat, Dest_lng, time_stamp))
    leg_no = cursor.fetchone()[0]
    print(leg_no)
I have inserted the higher-level details and am now searching the database to associate this lower-level information with its parent. The only way to find this unique value is to search via the origin and destination coordinates together with the time_stamp. I believe the logic is sound, and by printing the leg_no immediately after this section, I can see values which at first inspection appear to be correct.
However, when added to the rest of the code, it causes subsequent sections where more data is inserted using the cursor to fail with this error -
raise errors.InternalError("Unread result found.")
mysql.connector.errors.InternalError: Unread result found.
The issue seems similar to MySQL Unread Result with Python
Is the query too complex and needs splitting or is there another issue?
If the query is indeed too complex, can anyone advise how best to split this?
EDIT: As per @Gord's help, I've tried to dump any unread results:
cursor.execute(query, (leg_travel_mode, leg_Orig_lat, leg_Orig_lng, leg_Dest_lat, leg_Dest_lng))
leg_no = cursor.fetchone()[0]
try:
    cursor.fetchall()
except mysql.connector.errors.InterfaceError as ie:
    if ie.msg == 'No result set to fetch from.':
        pass
    else:
        raise
cursor.execute(query, (leg_travel_mode, leg_Orig_lat, leg_Orig_lng, leg_Dest_lat, leg_Dest_lng, time_stamp))
But, I still get
raise errors.InternalError("Unread result found.")
mysql.connector.errors.InternalError: Unread result found.
[Finished in 3.3s with exit code 1]
scratches head
EDIT 2 - when I print the ie.msg, I get -
No result set to fetch from
All that was required was for buffered to be set to true!
cursor = cnx.cursor(buffered=True)
The reason is that without a buffered cursor the results are loaded "lazily", meaning that fetchone() actually fetches only one row from the full result set of the query. When you use the same cursor again, it will complain that you still have n-1 results (where n is the number of rows in the result set) waiting to be fetched. However, when you use a buffered cursor, the connector fetches ALL rows behind the scenes and you just take one from it, so the MySQL db won't complain.
I was able to recreate your issue. MySQL Connector/Python apparently doesn't like it if you retrieve multiple rows and don't fetch them all before closing the cursor or using it to retrieve some other stuff. For example
import mysql.connector
cnxn = mysql.connector.connect(
    host='127.0.0.1',
    user='root',
    password='whatever',
    database='mydb')
crsr = cnxn.cursor()
crsr.execute("DROP TABLE IF EXISTS pytest")
crsr.execute("""
    CREATE TABLE pytest (
        id INT(11) NOT NULL AUTO_INCREMENT,
        firstname VARCHAR(20),
        PRIMARY KEY (id)
        )
    """)
crsr.execute("INSERT INTO pytest (firstname) VALUES ('Gord')")
crsr.execute("INSERT INTO pytest (firstname) VALUES ('Anne')")
cnxn.commit()
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
crsr.execute("SELECT firstname FROM pytest") # InternalError: Unread result found.
If you only expect (or care about) one row then you can put a LIMIT on your query
crsr.execute("SELECT firstname FROM pytest LIMIT 0, 1")
fname = crsr.fetchone()[0]
print(fname)
crsr.execute("SELECT firstname FROM pytest") # OK now
or you can use fetchall() to get rid of any unread results after you have finished working with the rows you retrieved.
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
try:
    crsr.fetchall()  # fetch (and discard) remaining rows
except mysql.connector.errors.InterfaceError as ie:
    if ie.msg == 'No result set to fetch from.':
        # no problem, we were just at the end of the result set
        pass
    else:
        raise
crsr.execute("SELECT firstname FROM pytest")  # OK now
cursor.reset() is really what you want.
fetchall() is not good because you may end up moving unnecessary data from the database to your client.
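A minimal sketch of that suggestion, reusing the pytest table from the example above (reset() is the cursor method this answer recommends; verify that your connector version exposes it):
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
crsr.reset()  # discard any unread rows on this cursor without transferring them
crsr.execute("SELECT firstname FROM pytest")  # OK now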
The problem is about buffering: maybe you disconnected from the previous MySQL connection and now it cannot perform the next statement. There are two ways to give a buffer to the cursor. First, only for a particular cursor, using the following command:
import mysql.connector
cnx = mysql.connector.connect()
# Only this particular cursor will buffer results
cursor = cnx.cursor(buffered=True)
Alternatively, you can enable buffering for every cursor created from a connection:
import mysql.connector
# All cursors created from cnx2 will be buffered by default
cnx2 = mysql.connector.connect(buffered=True)
cursor = cnx2.cursor()
If you disconnected from MySQL, the latter option works for you.
Enjoy coding
If you want to get only one result from a query, and want to reuse the same connection for other queries afterwards, limit your SQL SELECT to one row by adding "limit 1" at the end of the query.
e.g. "SELECT field FROM table WHERE x=1 LIMIT 1;"
This method is faster than using "buffered=True".
Set the consume_results argument on the connect() method to True.
cnx = mysql.connector.connect(
    host="localhost",
    user="user",
    password="password",
    database="database",
    consume_results=True
)
Now instead of throwing an exception, it basically does fetchall().
Unfortunately, this still makes it slow if you have a lot of unread rows.
There is also the possibility that your MySQL Workbench connection has been disconnected. Establish the connection again; this solved the problem for me.
Call cursor.reset() and then create the tables and load the entries.
Would setting the cursor within the for loop, executing it, and then closing it again in the loop help?
Like:
for steps in result['routes'][0]['legs'][0]['steps']:
    cursor = cnx.cursor()
    ....
    leg_no = cursor.fetchone()[0]
    cursor.close()
    print(leg_no)

Python insert data into MySQL

I've encountered a problem when I try to insert values into MySQL using the Python connector.
The problem is that I'm trying to pass an input as a value to MySQL, but the input is used as the name of a column instead of the value of a field. Can anyone let me know what I am doing wrong?
My code is:
import mysql.connector
from mysql.connector import errorcode

def main():
    try:
        connection = mysql.connector.connect(user='root', passwd='', host='localhost', port='3306', database='game_01')
        print("Welcome")
        name_of_char = input("Your name?: ")
        con = connection.cursor()
        con.execute("INSERT into charachter (name,intel,strenght,agil) values(%s,0,0,0)" % str(name_of_char))
        con.execute("SELECT * FROM charachter")
        for items in con:
            print(items[1])
    except mysql.connector.Error as err:
        print(err)
    else:
        connection.close()

main()
Thanks.
P.S. The error is: 1054: Unknown column in 'field list'. I forgot to mention that in the post. It seems that if I enter one of the table's attributes (an existing column name), it will work, but it won't add any value.
If you are using the MySQLdb driver, after executing a query that inserts into or updates the database, you should call connection.commit() to complete the save operation.
try this:
con = connection.cursor()
con.execute("INSERT into `charachter` (`name`,`intel`,`strenght`,`agil`) values('%s',0,0,0)" % str(name_of_char))
connection.commit()
If you use any other driver, you should set the auto_commit option to true.
see this:
How can I insert data into a MySQL database?
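As a side note (not part of the answer above), a parameterized query avoids both the quoting problem and SQL injection; a minimal sketch, assuming the same charachter table:
con = connection.cursor()
# Let the connector handle quoting/escaping instead of Python string formatting
con.execute("INSERT INTO charachter (name, intel, strenght, agil) VALUES (%s, 0, 0, 0)", (name_of_char,))
connection.commit()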

Sybase sybpydb queries not returning anything

I am currently connecting to a Sybase 15.7 server using sybpydb. It seems to connect fine:
import sys
sys.path.append('/dba/sybase/ase/15.7/OCS-15_0/python/python26_64r/lib')
sys.path.append('/dba/sybase/ase/15.7/OCS-15_0/lib')
import sybpydb
conn = sybpydb.connect(user='usr', password='pass', servername='serv')
is working fine. Changing any of my connection details results in a connection error.
I then select a database:
curr = conn.cursor()
curr.execute('use db_1')
however, now when I try to run queries, it always returns None
print curr.execute('select * from table_1')
I have tried running the use and select queries in the same execute, I have tried including go commands after each, I have tried using curr.connection.commit() after each, all with no success. I have confirmed, using dbartisan and isql, that the same queries I am using return entries.
Why am I not getting results from my queries in python?
EDIT:
Just some additional info. In order to get the sybpydb import to work, I had to change two environment variables. I added the lib paths (the same ones that I added to sys.path) to $LD_LIBRARY_PATH, i.e.:
setenv LD_LIBRARY_PATH "$LD_LIBRARY_PATH":/dba/sybase/ase/15.7/OCS-15_0/python/python26_64r/lib:/dba/sybase/ase/15.7/OCS-15_0/lib
and I had to change the SYBASE path from 12.5 to 15.7. All this was done in csh.
If I print conn.error(), after every curr.execute(), I get:
("Server message: number(5701) severity(10) state(2) line(0)\n\tChanged database context to 'master'.\n\n", 5701)
I completely understand why you might be confused by the documentation. It doesn't seem to be on par with other db extensions (e.g. psycopg2).
When connecting with most standard db extensions you can specify a database. Then, when you want to get the data back from a SELECT query, you either use fetch (an ok way to do it) or the iterator (the more pythonic way to do it).
import sybpydb as sybase

conn = sybase.connect(user='usr', password='pass', servername='serv')
cur = conn.cursor()
cur.execute("use db_1")
cur.execute("SELECT * FROM table_1")
print "Query Returned %d row(s)" % cur.rowcount
for row in cur:
    print row

# Alternate less-pythonic way to read query results
# for row in cur.fetchall():
#     print row
Give that a try and let us know if it works.
Python 3.x working solution:
import sybpydb

try:
    conn = sybpydb.connect(dsn="Servername=serv;Username=usr;Password=pass")
    cur = conn.cursor()
    cur.execute('select * from db_1..table_1')
    # table header
    header = tuple(col[0] for col in cur.description)
    print('\t'.join(header))
    print('-' * 60)
    res = cur.fetchall()
    for row in res:
        line = '\t'.join(str(col) for col in row)
        print(line)
    cur.close()
    conn.close()
except sybpydb.Error:
    for err in cur.connection.messages:
        print(f'Error {err[0]}, Value {err[1]}')

How To Connect To Multiple sqlite3 Database with Python

This is somewhat related to my earlier question:
Reading A Big File With Python
The problem was with runtime, so I was advised to use an sqlite3 database, and it reduced the time to milliseconds, and I am very happy. Now the only problem I have is connecting to different database files in the same folder. All the database files have the same tables.
The code I am using reads only the first one and doesn't seem to check the other databases.
The expected output is that when the teacher enters a student's ID, the related records are returned if found in the database table.
My code is something like this, but I am sure I am doing something wrong; pardon me if it's a silly mistake, as I am using sqlite3 for the first time.
#other codes above not related to this part
databases = []
directory = "./Databases"
for filename in os.listdir(directory):
    flname = os.path.join(directory, filename)
    databases.append(flname)

for database in databases:
    conn = sqlite3.connect(database)
    conn.text_factory = str
    cur = conn.cursor()
    sqlqry = "SELECT * FROM tbl_1 WHERE std_ID='%s';" % (sudentID)
    try:
        c = cur.execute(sqlqry)
        data = c.fetchall()
        for i in data:
            print "[INFO] RECORD FOUND"
            print "[INFO] STUDENT ID: "+i[1]
            print "[INFO] STUDENT NAME: "+i[2]
            #and some other info
        conn.close()
    except sqlite3.Error as e:
        print "[INFO] "+e
Thanks for any guidance.
@Whiskey, sometimes it helps to try to break the problem down into a minimal example and see if that works or where it breaks. Since you are able to see the database names being printed as they are opened, my guess would be a problem with the query, or possibly with the data in the db, even though the records seem to be there. When you say it doesn't find the record you're looking for, does it just print out nothing, or does it print out the "[INFO]" line in your exception handler?
I put together the following minimal example, and it seems to be working as far as my understanding of your problem goes. My only other piece of advice, to add to everyone else's, would be to parametrize your query rather than using the raw input directly, to make your app a little more secure. Hope it helps:
import os, sqlite3

"""
Create the test databases:
sqlite3 Databases/test_db1.db
sqlite> CREATE TABLE foo ( id INTEGER NOT NULL, name VARCHAR(100), PRIMARY KEY (id) );
sqlite>
sqlite3 Databases/test_db2.db
sqlite> CREATE TABLE foo ( id INTEGER NOT NULL, name VARCHAR(100), PRIMARY KEY (id) );
sqlite> INSERT INTO foo VALUES (2, 'world');
"""

databases = []
student_id = 2
directory = "./Databases"
for filename in os.listdir(directory):
    flname = os.path.join(directory, filename)
    databases.append(flname)

for database in databases:
    try:
        with sqlite3.connect(database) as conn:
            conn.text_factory = str
            cur = conn.cursor()
            sqlqry = "SELECT * FROM foo WHERE id=:1;"
            c = cur.execute(sqlqry, [student_id])
            for row in c.fetchall():
                print "-- found: %s=%s" % (row[0], row[1])
    except sqlite3.Error, err:
        print "[INFO] %s" % err
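Not from the answers above, but a related approach worth noting: SQLite can ATTACH several database files to one connection, so a single connection can query all of them without reconnecting in a loop. A minimal sketch reusing the test databases created above:
import sqlite3

student_id = 2
conn = sqlite3.connect("./Databases/test_db1.db")
# Attach the second file under the schema name "db2"
conn.execute("ATTACH DATABASE './Databases/test_db2.db' AS db2")
cur = conn.cursor()
for schema in ("main", "db2"):
    cur.execute("SELECT * FROM %s.foo WHERE id = ?" % schema, [student_id])
    for row in cur.fetchall():
        print "-- found in %s: %s=%s" % (schema, row[0], row[1])
conn.close()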

Python, Sqlite not saving results on the file

I have this code in Python:
conn = sqlite3.connect("people.db")
cursor = conn.cursor()
sql = 'create table if not exists people (id integer, name VARCHAR(255))'
cursor.execute(sql)
conn.commit()
sql = 'insert into people VALUES (3, "test")'
cursor.execute(sql)
conn.commit()
sql = 'insert into people VALUES (5, "test")'
cursor.execute(sql)
conn.commit()
print 'Printing all inserted'
cursor.execute("select * from people")
for row in cursor.fetchall():
    print row
cursor.close()
conn.close()
But it seems it never saves to the database; there are always the same elements in the db, as if nothing was being saved.
On the other hand, if I try to access the db file via sqlite I get this error:
Unable to open database "people.db": file is encrypted or is not a database
I found in some other answers the suggestion to use conn.commit instead of conn.commit(), but it does not change the results.
Any idea?
BINGO! I had the same problem, and one of the reasons was very simple. I am using Debian Linux, and the error was
Unable to open database "people.db": file is encrypted or is not a database
The database file was in the same dir as my Python script. The connect line was
conn = sqlite3.connect('./testcases.db')
I changed this to
conn = sqlite3.connect('testcases.db')
No dot and slash. Error fixed, all works.
If someone finds it useful, you're welcome.
This seems to work alright for me ("In database" increases on each run):
import random, sqlite3

conn = sqlite3.connect("people.db")
cursor = conn.cursor()
sql = 'create table if not exists people (id integer, name VARCHAR(255))'
cursor.execute(sql)
for x in xrange(5):
    cursor.execute('insert into people VALUES (?, "test")', (random.randint(1, 10000),))
conn.commit()
cursor.execute("select count(*) from people")
print "In database:", cursor.fetchone()[0]
You should commit after making changes, i.e.:
myDatabase.commit()
Can you open the db with a tool like SQLite Administrator? That would prove the db format is ok.
If I search for that error, the solutions point to version issues between sqlite and the db driver used. Maybe you can check your versions, or AKX could post the working combination.
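A quick way to check those versions from Python (a minimal sketch, independent of this question's setup):
import sqlite3
print "sqlite3 module version:", sqlite3.version
print "SQLite library version:", sqlite3.sqlite_version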
Regards, khz
