I just want to ask: how do you reopen a connection after closing it?
I have this script:
import mysql.connector

cnx = mysql.connector.connect(user='',
                              password='', host='',
                              database='')
cursor = cnx.cursor()

add_data = ...    # my INSERT statement
dummy_data = ...  # my dictionary of data

cursor.execute(add_data, dummy_data)
cnx.commit()
cnx.close()
After inserting one set of values, I cannot insert any more.
This is the error:
"OperationalError: 2055: Lost connection to MySQL server at system error 9: Bad file descriptor"
Thanks in advance!
You need a new connection object after you close the old one:
cnx2 = mysql.connector.connect(user='',
                               password='', host='',
                               database='')
But why would you close the connection? You should close the cursor and keep the connection open for all operations.
I would define an open-connection function and a close-connection function, and call them at the beginning and end of the program respectively, as sketched below.
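A minimal sketch of that pattern, assuming mysql.connector and placeholder credentials (the function names and the query are illustrative, not from the original post):

import mysql.connector

def open_connection():
    # Create the single connection used for the whole program run.
    return mysql.connector.connect(user='', password='', host='', database='')

def close_connection(cnx):
    # Close the connection once, at the end of the program.
    cnx.close()

cnx = open_connection()
try:
    cursor = cnx.cursor()
    cursor.execute("SELECT 1")   # placeholder query
    print(cursor.fetchone())
    cursor.close()               # close cursors as you go, keep the connection open
finally:
    close_connection(cnx)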
Related
I followed the answer here to create a connection using psycopg2. It works on the first call to the endpoint. The second try gives this error: psycopg2.InterfaceError: connection already closed. Below is a snippet of my code:
from config import conn

with conn:
    with conn.cursor() as cursor:
        cursor.execute("""
            select ...
        """)
        pos = cursor.fetchone()

        cursor.execute("""
            select ...
        """)
        neg = cursor.fetchone()

conn.close()
Since you're closing the connection in the last line of your code, it cannot be used again (when the endpoint is called a second time) without reconnecting.
You can either delete the last line (which would mean never closing the connection) and move conn.close() to the app's shutdown event, or call psycopg2.connect each time you need a connection, as sketched below.
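A minimal sketch of the second option, assuming a helper function that opens a fresh connection on every call (the function name, connection parameters, and queries are placeholders):

import psycopg2

def fetch_counts():
    # Open a fresh connection for this call; closing it is safe because
    # the next call will create a new one.
    conn = psycopg2.connect(user='...', password='...', host='...', database='...')
    try:
        with conn:
            with conn.cursor() as cursor:
                cursor.execute("select ...")
                pos = cursor.fetchone()
                cursor.execute("select ...")
                neg = cursor.fetchone()
        return pos, neg
    finally:
        conn.close()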
I am trying to get Python to run a stored procedure in my SQL Server which kicks off a series of procedures that involve importing a file, processing it, and outputting a couple of files.
So far I have got my code to the point where it accepts an input into a table, but then the whole thing hangs when it calls the stored procedure.
Checking sp_who2 on the server, the session is waiting on PREEMPTIVE_OS_PIPEOPS, which, from my searching, means it is waiting on something outside of SQL Server to finish before proceeding.
Is someone able to shed some light on whether it is possible to use pyodbc to blindly kick off a stored procedure and then close the connection?
My belief is that just telling the procedure to run and then closing out should fix the issue, but I am having trouble finding the code for this.
Python code:
connection2 = pypyodbc.connect('Driver={SQL Server}; Server=server;Database=db', timeout=1)
cursor2 = connection2.cursor()
cursor2.execute("{CALL uspGoBabyGo}")
connection2.close()
return 'file uploaded successfully'
Stored procedure:
BEGIN
SET NOCOUNT ON;
EXECUTE [dbo].[uspCableMapImport]
END
After more searching (prompted by the script no longer posting the record to the table), I found the solution to the issue: I needed to add the autocommit=True line to the script. The code is now as follows:
import pyodbc
from datetime import datetime

connection = pyodbc.connect('Driver={SQL Server};'
                            'Server=Server;Database=DB;Trusted_Connection=yes')
connection.autocommit = True
cursor = connection.cursor()

referee = file.filename.rsplit('.')[0]
SQLCommand = ("INSERT INTO RequestTable(Reference, Requested) "
              "VALUES ('" + str(referee) + "', '" + datetime.now().strftime('%Y-%m-%d') + "')")
cursor.execute(SQLCommand)
connection.commit()

SQLCommand2 = "{CALL uspGoBabyGo}"
cursor.execute(SQLCommand2)
connection.commit()

connection.close()
I have a program using Python + the mysql-connector library + MySQL, installed on several machines on a network, all connecting to the same database. How do I refresh a connection without restarting the program on the other machines?
The program is installed on more than one machine and connects to the same MySQL database, which is located on a server. The program works properly, but when new data is entered or modified (SQL INSERT, UPDATE, ... statements) from one machine, it is not reflected on the other machines until the program is restarted; in other words, a new connection seems to be necessary to see the new data.
So, I would like to know how to refresh the connection without restarting the program on the other machines.
Here is the sample Connection code:
import mysql.connector as mc
conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)
cur = conn.cursor()
How do I refresh this connection while the program is running?
Closing and reopening the connection may also solve your problem; see the following example, which reconnects inside a loop:
import mysql.connector as mc

conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)

while True:
    # check if you're still connected; if not, connect again
    if not conn.is_connected():
        conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)

    cur = conn.cursor()
    sql = "INSERT INTO db_name.myTable (name) VALUES (%(val)s);"
    val = {'val': "Slim Shady"}
    cur.execute(sql, val)
    conn.commit()
    conn.close()
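As an aside, mysql-connector connections also expose a reconnect() method, so instead of building a brand-new connection object you can try re-establishing the existing one; a small sketch (the attempts and delay values are illustrative):

# Re-establish the same connection object if the server dropped it.
if not conn.is_connected():
    conn.reconnect(attempts=3, delay=1)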
After inserting or updating, you should commit to make sure all the data is written to the DB. Then it will be visible to everyone:
conn.commit()
I am using the psycopg2 library to handle the connection with a Postgres database.
Are the following two approaches to handling the db connection comparable?
Snippet 1:
cnx = connect(user=<...>, password=<...>, host=<...>, database=<...>)
cursor = cnx.cursor()
cursor.execute(sql)
cnx.close()
Snippet 2:
with psycopg2.connect(user=<...>, password=<...>, host=<...>, database=<...>) as cnx:
    cursor = cnx.cursor()
    cursor.execute(sql)
I am especially interested in whether using with automatically closes the connection. I was told that the second approach is preferable because it closes the connection automatically, yet when I test it with cnx.closed it shows the connection is still open.
The with block, on exit, closes (commits) a transaction, not the connection. From the documentation:
Note that, unlike file objects or other resources, exiting the connection’s with block doesn’t close the connection but only the transaction associated with it [...]
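A small sketch of that behaviour, with placeholder connection parameters (closed is 0 while the connection is open and non-zero once it has been closed):

import psycopg2

cnx = psycopg2.connect(user='...', password='...', host='...', database='...')

with cnx:                      # exiting the block commits (or rolls back) the transaction
    with cnx.cursor() as cursor:
        cursor.execute("SELECT 1")

print(cnx.closed)              # 0 -> the connection is still open
cnx.close()                    # close it explicitly when you are done
print(cnx.closed)              # non-zero -> now it is closed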
I am trying to create a login function, but it only works once. For example, when I give a wrong user ID and password I get the correct error message ("Couldn't login"), but after dismissing that message and entering the correct user ID and password I get "pymysql.err.Error: Already closed". Below is the sample code:
import pymysql

# Connect to the database
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='',
                             db='python_code',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)

class LoginModel:
    def check_user(self, data):
        try:
            with connection.cursor() as cursor:
                # Read a single record
                sql = "SELECT `username` FROM `users` WHERE `username`=%s"
                cursor.execute(sql, (data.username))
                user = cursor.fetchone()
                print(user)
                if user:
                    if (user, data.password):
                        return user
                    else:
                        return False
                else:
                    return False
        finally:
            connection.close()
You have a mismatch between the number of times you create the connection (once) and the number of times you close it (once per login attempt).
One fix would be to move your:
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='',
                             db='python_code',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)
into your def check_user(). That would work because you'd create and close the connection on each invocation (as others have pointed out, the finally clause always gets executed).
That's not a great design, though, because getting database connections tends to be relatively expensive, so keeping the connection creation outside of the method is preferred... which means you must remove the connection.close() inside the method.
I think you're mixing up connection.close() with cursor.close(). You want to do the latter, not the former. In your example you don't have to explicitly close the cursor, because that happens automatically thanks to your with connection.cursor() as cursor: line. A sketch of the resulting method is below.
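A minimal sketch of check_user() with the module-level connection kept open (the password check is left as a placeholder because the original snippet doesn't show how passwords are actually verified):

class LoginModel:
    def check_user(self, data):
        # Reuse the module-level connection; the with block closes only the cursor.
        with connection.cursor() as cursor:
            sql = "SELECT `username` FROM `users` WHERE `username`=%s"
            cursor.execute(sql, (data.username,))
            user = cursor.fetchone()
        if not user:
            return False
        # Placeholder: verify data.password against the stored credentials here.
        return user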
Change finally to except, or remove the try block completely.
This is the culprit code:
finally:
connection.close()
Per the docs:
"A finally clause is always executed before leaving the try statement, whether an exception has occurred or not"
From: https://docs.python.org/2/tutorial/errors.html
You didn't describe what you would like to happen instead, but this addresses the crux of your question.
Had the same issue. The finally clause is needed for Postgres with the psycopg2 driver: when used with a context manager (with clause), it closes the cursor but not the connection. The same does not apply to PyMySQL.