Can't execute DROP INDEX through a psycopg2 connection in Python

I'm trying to use psycopg2 in Python to drop an index in PostgreSQL:
import psycopg2

connection = psycopg2.connect(host=hostname, user=username, password=password, dbname=database)
cur = connection.cursor()
statement = "DROP INDEX IF EXISTS idx_my_id"
cur.execute(statement)
connection.commit()
The same statement completes in pgAdmin 4 in a second, but from Python the execution never finishes.
pg_stat_activity shows wait_event_type is Lock and wait_event is relation.
What went wrong?

This won't fit in a comment, so I'll put it here. Maybe it's because you don't commit your connections?
Add this to your code, close all the other connections, and try again:
connection.set_session(autocommit=True)
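Roughly, that suggestion looks like this (a minimal sketch, reusing the placeholder connection parameters from the question):

import psycopg2

connection = psycopg2.connect(host=hostname, user=username,
                              password=password, dbname=database)
# DDL such as DROP INDEX needs an exclusive lock on the table; running it in
# autocommit mode at least keeps this script from holding its own transaction open.
connection.set_session(autocommit=True)

with connection.cursor() as cur:
    # This will still block if another session (pgAdmin, an idle script) holds a
    # lock on the indexed table, so make sure those sessions have committed first.
    cur.execute("DROP INDEX IF EXISTS idx_my_id")

connection.close()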

Related

MariaDB Python database always returns empty [duplicate]

I have a Python script which needs to update a MySQL database. So far I have:
import MySQLdb

dbb = MySQLdb.connect(host="localhost",
                      user="user",
                      passwd="pass",
                      db="database")
try:
    curb = dbb.cursor()
    curb.execute("UPDATE RadioGroups SET CurrentState=1 WHERE RadioID=11")
    print "Row(s) were updated :" + str(curb.rowcount)
    curb.close()
except MySQLdb.Error, e:
    print "query failed<br/>"
    print e
The script prints Row(s) were updated : with the correct number of rows that have a RadioID of 11. If I change the RadioID to a number not present in the table, it says Row(s) were updated :0. However, the database doesn't actually update; the CurrentState field just stays the same. If I copy and paste the SQL statement into phpMyAdmin it works fine.
Use
dbb.commit()
after
curb.execute("UPDATE RadioGroups SET CurrentState=1 WHERE RadioID=11")
to commit all the changes that you 'loaded' into the MySQL server.
As Lazykiddy pointed out, you have to commit your changes after you send them to MySQL.
You could also enable the autocommit setting right after initializing the MySQL connection:
dbb.autocommit(True)
Then it will automatically commit the changes you make during your code's execution.
The two answers are correct. However, you can also do this:
dbb = MySQLdb.connect(host="localhost",
                      user="user",
                      passwd="pass",
                      db="database",
                      autocommit=True)
add autocommit=True
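Putting the commit fix together, a rough sketch of the corrected script (Python 2 syntax and the placeholder credentials kept from the question):

import MySQLdb

dbb = MySQLdb.connect(host="localhost",
                      user="user",
                      passwd="pass",
                      db="database")
try:
    curb = dbb.cursor()
    curb.execute("UPDATE RadioGroups SET CurrentState=1 WHERE RadioID=11")
    dbb.commit()  # without this the UPDATE is rolled back when the connection closes
    print "Row(s) were updated :" + str(curb.rowcount)
    curb.close()
except MySQLdb.Error, e:
    dbb.rollback()  # undo the partial work on error
    print "query failed<br/>"
    print e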

Python: cx_Oracle cursor.execute() hangs on UPDATE query

I have looked at similar questions but nothing has worked for me so far.
So here it is: I want to update my table through a Python script using the cx_Oracle module. I can execute a SELECT query, but whenever I try to execute an UPDATE query my program just hangs (freezes). I realize that I need to call connection.commit() after cursor.execute() when updating a table, but my code never gets that far. I have added the code snippet below that I am using to debug.
Any suggestions?
Code
import cx_Oracle

def getConnection():
    ip = '127.0.0.1'
    port = 1521
    service_name = 'ORCLCDB.localdomain'
    username = 'username'
    password = 'password'
    dsn = cx_Oracle.makedsn(ip, port, service_name=service_name)  # (CONNECT_DATA=(SERVICE_NAME=ORCLCDB.localdomain)))
    return cx_Oracle.connect(username, password, dsn)  # connection

def debugging():
    con = getConnection()
    print(con)
    cur = con.cursor()
    print('Updating')
    cur.execute('UPDATE EMPLOYEE SET LATITUDE = 53.540943 WHERE EMPLOYEEID = 1')
    print('committing')
    con.commit()
    con.close()
    print('done')

debugging()
Here is the corresponding output:
<cx_Oracle.Connection to username#(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=127.0.0.1)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=ORCLCDB.localdomain)))>
Updating
Solution
After a bit of poking around, I found the underlying cause! I had made changes to the table using Oracle SQL Developer but had not committed them; when the Python script tried to modify the table it froze waiting on that lock. Once I committed my changes in Oracle SQL Developer before running the Python script, it worked fine!
Do you have any way to look in the database? To understand whether the problem is in the Python program or not, we need to check v$session in the database to see whether something is blocked:
select sid, event, last_call_et, status from v$session where sid = xxx
where xxx is the SID of the session that connected with Python.
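If you want to run that check from Python itself, a minimal sketch with cx_Oracle could look like the following (it reuses the getConnection() helper from the question; the SID value is a placeholder and querying v$session may require extra privileges):

# Run the diagnostic query from a second session.
diag = getConnection()
cur = diag.cursor()
cur.execute(
    "SELECT sid, event, last_call_et, status FROM v$session WHERE sid = :sid",
    sid=123)  # replace 123 with the SID of the hung Python session
for row in cur.fetchall():
    print(row)
diag.close()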
By the way, I would commit explicitly after the cursor execute:
cur.execute('UPDATE EMPLOYEE SET LATITUDE = 53.540943 WHERE EMPLOYEEID = 1')
con.commit()
Hope it helps
Best

Running PostgreSQL functions in Python gives an error

I would like to run the PostgreSQL function query below in Python, but I keep getting the error message shown after the code:
cursor = connection.cursor()
cursor.execute("""SELECT
    ST_Distance_sphere(st_makepoint(32.836956,39.925018)
                      ,st_makepoint(28.990174,41.036857))""")
df = cursor.fetchall()
df
InFailedSqlTransaction: current transaction is aborted, commands ignored until end of transaction block
How can I fix this? Thank you.
According to the psycopg docs, you probably have an error in a previous command (SQL):
There was a problem in the previous command to the database, which
resulted in an error. The database will not recover automatically from
this condition: you must run a rollback() before sending new commands
to the session (if this seems too harsh, remember that PostgreSQL
supports nested transactions using the SAVEPOINT command).
I highly recommend wrapping your database calls in a try/except/finally clause, or using a with statement.
Here is an example from http://www.postgresqltutorial.com/postgresql-python/transaction/:
conn = psycopg2.connect(dsn)

# transaction 1
with conn:
    with conn.cursor() as cur:
        cur.execute(sql)

# transaction 2
with conn:
    with conn.cursor() as cur:
        cur.execute(sql)

conn.close()
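Alternatively, a minimal sketch with an explicit rollback() when a statement fails (dsn and the query are the placeholders used above):

import psycopg2

connection = psycopg2.connect(dsn)
cursor = connection.cursor()
try:
    cursor.execute("""SELECT
        ST_Distance_sphere(st_makepoint(32.836956,39.925018)
                          ,st_makepoint(28.990174,41.036857))""")
    rows = cursor.fetchall()
except psycopg2.Error as e:
    connection.rollback()  # clears the aborted transaction so new commands work
    print(e)
else:
    connection.commit()
finally:
    cursor.close()
    connection.close()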

pymysql.err.Error: Already closed

I am trying to create a login function, but it only works once. For example, when I give a wrong user id and password I correctly get the "Couldn't login" message; after dismissing that message and entering the correct user id and password I get "pymysql.err.Error: Already closed". Below is the sample code.
import pymysql

# Connect to the database
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='',
                             db='python_code',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)

class LoginModel:
    def check_user(self, data):
        try:
            with connection.cursor() as cursor:
                # Read a single record
                sql = "SELECT `username` FROM `users` WHERE `username`=%s"
                cursor.execute(sql, (data.username))
                user = cursor.fetchone()
                print(user)
                if user:
                    if (user, data.password):
                        return user
                    else:
                        return False
                else:
                    return False
        finally:
            connection.close()
You have a mismatch with respect to the number of times you're creating the connection (once) and the number of times you're closing the connection (once per login attempt).
One fix would be to move your:
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='',
                             db='python_code',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)
into your def check_user(). It would work because you'd create and close the connection on each invocation (as others have pointed out, the finally clause always gets executed).
That's not a great design, because getting database connections tends to be relatively expensive. So keeping the connection creation outside of the method is preferred... which means you must remove the connection.close() within the method.
I think you're mixing up connection.close() with cursor.close(). You want to do the latter, not the former. In your example you don't have to explicitly close the cursor because that happens automatically with your with connection.cursor() as cursor: line.
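A rough sketch of that second option, keeping the module-level connection and letting the with block close only the cursor (the password check is left schematic here):

class LoginModel:
    def check_user(self, data):
        with connection.cursor() as cursor:  # closes the cursor, not the connection
            sql = "SELECT `username` FROM `users` WHERE `username`=%s"
            cursor.execute(sql, (data.username,))
            user = cursor.fetchone()
        # No connection.close() here, so the next login attempt can reuse the connection.
        if user:
            return user  # verify data.password against the stored hash here
        return False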
Change finally to except, or remove the try block completely.
This is the culprit code:
finally:
    connection.close()
Per the docs:
"A finally clause is always executed before leaving the try statement, whether an exception has occurred or not"
From: https://docs.python.org/2/tutorial/errors.html
You didn't describe alternative behavior for what you would like to see happen instead of this, but my answer addresses the crux of your question.
Had the same issue. The finally clause is needed for Postgres with the psycopg2 driver: when used with a context manager (with clause), it closes the cursor but not the connection. The same does not apply to PyMySQL.
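For reference, a small sketch of that psycopg2 behaviour: a with conn: block only ends the transaction (commit on success, rollback on error); it does not close the connection, so an explicit close is still needed (dsn is a placeholder):

import psycopg2

conn = psycopg2.connect(dsn)
with conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())

print(conn.closed)  # 0 -> the connection is still open after the with block
conn.close()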

MySQLdb Python problem: daemon does not see database changes

I have written a Python daemon that reads from a database. I do the following:
import MySQLdb

db = MySQLdb.connect('localhost', user, password, database)
while True:
    cursor = db.cursor()
    sql = "SELECT id FROM task WHERE status='pending'"
    r = cursor.execute(sql)
    if r != 0:
        result = cursor.fetchall()
        # .....
The problem is that when the database changes, the daemon does not detect it. How can I make it refresh?
What can I do?
Thanks!
This is just a guess since I don't have a full view of your code, but since you're connecting outside the loop, changes to the database won't cause db to reconnect.
Again, just a guess; I'm not sure whether you're threading or not, how the change coincides with the daemon, etc.
I solved it by enabling autocommit on the connection:
db.autocommit(True)
With autocommit on, each SELECT starts a fresh transaction, so the loop sees rows committed by other sessions instead of the snapshot taken when the first read ran. Thanks to all, friends!
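A minimal sketch of the loop with autocommit enabled (assuming MySQLdb and the placeholder names from the question):

import time
import MySQLdb

db = MySQLdb.connect('localhost', user, password, database)
db.autocommit(True)  # every statement runs in its own transaction

while True:
    cursor = db.cursor()
    if cursor.execute("SELECT id FROM task WHERE status='pending'"):
        for (task_id,) in cursor.fetchall():
            pass  # process the pending task here
    cursor.close()
    time.sleep(5)  # avoid hammering the server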
