How to refresh a MySQL connection? - Python

I have a program using Python + the MySQL Connector/Python driver + MySQL, installed on several machines on a network, all using the same database. How do I refresh a connection without restarting the program on the other machines?
The program is installed on more than one machine and connects to the same MySQL database, which is located on a server. The program works properly, but
when new data is entered or modified (SQL INSERT, UPDATE... statements) from one machine, it is not reflected on the other machines until the program is restarted, which suggests that a new connection is needed to see the new data.
So, I would like to know how to refresh the connection without restarting the program on the other machines.
Here is the sample connection code:
import mysql.connector as mc
conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)
cur = conn.cursor()
How do I refresh this connection while the program is running?

Closing and reopening the connection may also solve your problem;
see the following example, in a loop:
import mysql.connector as mc

conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)

while True:
    # Check if you're still connected; if not, connect again
    if not conn.is_connected():
        conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)
    cur = conn.cursor()
    sql = "INSERT INTO db_name.myTable (name) VALUES (%(val)s);"
    val = {'val': "Slim Shady"}
    cur.execute(sql, val)
    conn.commit()
    conn.close()
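Connector/Python also provides a reconnect() method on the connection object, which re-establishes the same connection in place instead of building a new one. A minimal sketch, reusing the placeholders from the question:

import mysql.connector as mc

conn = mc.connect(host='host', user='user', passwd='passw', db='db_name', port=port_number)

# Later, if the connection has dropped, re-establish it in place
# (up to 3 attempts, 1 second apart).
if not conn.is_connected():
    conn.reconnect(attempts=3, delay=1)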

After inserting or updating, you should do a commit to make sure all data is written to the DB. Then it will be visible to everyone:
conn.commit()
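Note also that with InnoDB's default REPEATABLE READ isolation level, a reading connection keeps the snapshot of its currently open transaction, so the machines that only read may need to end their own transaction before re-querying. A minimal sketch of that reading side (fetch_latest and the column are illustrative, not from the question):

# End the current read transaction so the next query sees a fresh
# snapshot under REPEATABLE READ (a rollback would work as well).
def fetch_latest(conn):
    conn.commit()
    cur = conn.cursor()
    cur.execute("SELECT name FROM db_name.myTable")
    return cur.fetchall()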

Related

MySQL connection with Python through PythonAnywhere

I want to connect to a MySQL database using Python through PythonAnywhere, without creating a Flask/Django application.
I have seemingly managed to connect through MySQLdb, using the code below, but I do not receive any output when I run the code. Any solutions?
import MySQLdb

db = MySQLdb.connect(
    host="myuser.mysql.pythonanywhere-services.com",
    user="myuser",
    passwd=XXX,
    db="myuser$db_name"
)
cursor = db.cursor()
cursor.execute("SELECT * FROM table_name")
for x in cursor:
    print(x)
cursor.close()
db.close()
You retrieve all rows in the table, without error:

cursor.execute("SELECT * FROM table_name")
for x in cursor:
    print(x)

Yet you see no output. This is normal for a table that contains zero rows.
Consider doing one or more INSERTs, and a COMMIT, prior to the query.
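For example, a minimal sketch, assuming table_name has a single text column called name (the column is a placeholder, not from the question):

# Populate the table first, commit, and only then query it.
cursor.execute("INSERT INTO table_name (name) VALUES (%s)", ("example",))
db.commit()

cursor.execute("SELECT * FROM table_name")
for x in cursor:
    print(x)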

Psycopg2 cannot connect to a server

I was writing a proprietary script that queries a company database to pull certain information. I was using Psycopg2. At this point the lines I was using were like:
conn = psycopg2.connect("dbname='somedb' user='usr' host='something.azure.com' password='pswd' port='5432'")
cur = conn.cursor()
query = "something"
cur.execute(query)
results = cur.fetchall()
The script was running fine until a few days ago, when, after quick successions of runs during debugging, I started getting:
psycopg2.OperationalError: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
A closer look made me realize I had not properly closed the connection. So I modified it to:
with psycopg2.connect("dbname='somedb' user='usr' host='something.azure.com' password='pswd' port='5432'") as conn:
    cur = conn.cursor()
    query = "something"
    cur.execute(query)
    results = cur.fetchall()
The error persisted even when no additional connections were present on the server end, so hitting max_connections is unlikely to be the reason here.
Strangely, I can access the same server through PGAdmin or use sqlalchemy.create_engine with pandas.read_sql on the same machine. Additionally, the script runs fine on another colleague's machine while mimicking my IP address through VPN.
Edit: sqlalchemy worked on my machine exactly once, after which I started to get the same error.
My sqlalchemy code:
engine = create_engine('postgresql://{0}:{1}@{2}:5432/db'.format(USR, PSWD, HOST))
query = "something"
df = pd.read_sql(sql=query, con=engine)
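The # between the password and host in the originally posted URL looks like a typo for @; and if the password itself contains special characters, building the URL explicitly sidesteps the escaping problem entirely. A minimal sketch, assuming SQLAlchemy 1.4+ and the same USR/PSWD/HOST placeholders:

from sqlalchemy import create_engine
from sqlalchemy.engine import URL

# Build the URL from parts so the password never needs manual escaping.
url = URL.create(
    drivername="postgresql",
    username=USR,
    password=PSWD,
    host=HOST,
    port=5432,
    database="db",
)
engine = create_engine(url)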

try & except pyodbc with no internet connection

I am writing to and reading from a SQL database with try/except blocks. The idea behind the try/except is that if for some reason the internet is down or we cannot connect to the server, we write the SQL transactions locally to a text file and later use those statements to update the table. That being said, the try/except only seems to work if there is a connection to the server. We have a table BAR in the DB database on server FOO:
try:
    conn = pyodbc.connect('DRIVER={SQL Server};SERVER=FOO;DATABASE=DB;UID=user;PWD=password')
    cursor = conn.cursor()
    cursor.execute("UPDATE BAR SET Date = '"+time+"' WHERE ID = "+ID)
    conn.commit()
except:
    f = open("vistorlog.txt", "a")
    f.write("UPDATE BAR SET Date = '"+time+"' WHERE ID = "+ID+"\n")
    f.close()
The only instance where this try/except works is when there is an issue with the SQL statement, i.e. "UPDATE BARS..." fails because there is no table named BARS. If I change the server to FOOS (or, in a real-life scenario, unplug the ethernet cord and leave the table/server names legitimate), the try/except doesn't work - the program freezes with no error.
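The freeze is most likely the login attempt blocking for a long time rather than the exception being swallowed; pyodbc.connect accepts a timeout argument (the login timeout, in seconds), after which a failed connection raises and the except branch runs. A hedged sketch of the same code with that change, also catching pyodbc.Error rather than everything:

import pyodbc

try:
    # timeout is the login timeout in seconds; an unreachable server
    # now raises instead of blocking indefinitely.
    conn = pyodbc.connect('DRIVER={SQL Server};SERVER=FOO;DATABASE=DB;UID=user;PWD=password',
                          timeout=5)
    cursor = conn.cursor()
    cursor.execute("UPDATE BAR SET Date = ? WHERE ID = ?", (time, ID))
    conn.commit()
except pyodbc.Error:
    with open("vistorlog.txt", "a") as f:
        f.write("UPDATE BAR SET Date = '" + time + "' WHERE ID = " + ID + "\n")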

Python & SQL Server stored procedure hang

I am trying to get Python to run a stored procedure on my SQL Server which kicks off a series of procedures that involve importing a file, processing it, and outputting a couple of files.
So far I have got my code to accept an input to a table, but then the whole thing hangs when it calls the stored procedure.
Checking sp_who2 on the server, it is waiting on PREEMPTIVE_OS_PIPEOPS, which, according to my searching, means it is waiting on something outside of SQL Server to finish before proceeding.
Is someone able to shed some light on whether it is possible to use pyodbc to blindly kick off a stored procedure and then close the connection?
My belief is that just telling the procedure to run and then closing out should fix the issue, but I am having trouble finding the code for this.
Python code:
connection2 = pypyodbc.connect('Driver={SQL Server}; Server=server;Database=db', timeout=1)
cursor2 = connection2.cursor()
cursor2.execute("{CALL uspGoBabyGo}")
connection2.close()
return 'file uploaded successfully'
Stored procedure:
BEGIN
    SET NOCOUNT ON;
    EXECUTE [dbo].[uspCableMapImport]
END
After more searching (the script had stopped posting the record to the table), I found the solution to the issue: I needed to add an autocommit = True line to the script. The code is now as follows:
connection = pyodbc.connect('Driver={SQL Server};Server=Server;Database=DB;Trusted_Connection=yes')
connection.autocommit = True
cursor = connection.cursor()
referee = file.filename.rsplit('.')[0]
SQLCommand = ("INSERT INTO RequestTable(Reference, Requested) VALUES ('" + str(referee) + "', '" + datetime.now().strftime('%Y-%m-%d') + "')")
cursor.execute(SQLCommand)
connection.commit()
SQLCommand2 = "{CALL uspGoBabyGo}"
cursor.execute(SQLCommand2)
connection.commit()
connection.close()
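As a side note, a parameterized query would avoid the manual quoting of referee and the date entirely; a minimal sketch using pyodbc's ? placeholders:

# pyodbc quotes both values itself, so the date needs no manual quotes.
SQLCommand = "INSERT INTO RequestTable(Reference, Requested) VALUES (?, ?)"
cursor.execute(SQLCommand, (referee, datetime.now().strftime('%Y-%m-%d')))
connection.commit()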

"with psycopg2.connect" is it clossing connection automatically?

I am using the psycopg2 library to handle connections with a Postgres database.
Are the following two approaches to handling a db connection comparable?
Snippet 1:
cnx = connect(user=<...>, password=<...>, host=<...>, database=<...>)
cursor = cnx.cursor()
cursor.execute(sql)
cnx.close()
Snippet 2:
with psycopg2.connect(user=<...>, password=<...>, host=<...>, database=<...>) as cnx:
    cursor = cnx.cursor()
    cursor.execute(sql)
I am especially interested in whether using with automatically closes the connection. I was informed that the second approach is preferable because it closes the connection automatically, yet when I test it with cnx.closed it shows an open connection.
The with block tries on exit to close (commit) the transaction, not the connection. From the documentation:
Note that, unlike file objects or other resources, exiting the connection’s with block doesn’t close the connection but only the transaction associated with it [...]
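So the with block in Snippet 2 commits (or rolls back) but leaves cnx open. If you want the block to close the connection too, one option is to wrap it in contextlib.closing; a minimal sketch with placeholder credentials:

import psycopg2
from contextlib import closing

# closing() calls cnx.close() on exit; the inner "with cnx" still
# commits on success and rolls back on an exception.
with closing(psycopg2.connect(user="usr", password="pswd",
                              host="localhost", database="somedb")) as cnx:
    with cnx:
        with cnx.cursor() as cursor:
            cursor.execute("SELECT 1")
            print(cursor.fetchone())

print(cnx.closed)  # non-zero: the connection is now closed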
