Timeout of pyodbc connection - recycle conn? - python

At the beginning of my process, I connect to a SQL Server database
conn = pyodbc.connect("DRIVER=" + driver + ";SERVER=" + server + ";PORT=" + port + ";DATABASE=" + dbn + ";UID=" + user +";PWD=" + pwd)
and then use that connection and execute queries with this code:
cur = conn.cursor()
try:
    cur.execute(sql)
    conn.commit()
    if cur.description is not None:
        # the statement returned rows; read them into a DataFrame
        # (note this executes the query a second time)
        data = pd.io.sql.read_sql_query(sql, conn)
        return data
except Exception as err:
    conn.rollback()
    raise err
finally:
    cur.close()
It works fine, but once the code has not executed a query for several hours, I get the following error:
cur.execute(sql)
pyodbc.OperationalError: ('08S01', '[08S01] [FreeTDS][SQL Server]Write to the server failed (20006) (SQLExecDirectW)')
It seems that there is a timeout. Is there any way to recycle the existing connection (conn)?

Related

Data is not deleted from MySQL. Is there another way to do this?

I am trying to delete some records from a MySQL database using pymysql, but I am not able to. The same command works if I execute it in MySQL directly.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Connect to the MySQL database
logger.info("Connecting to MySQL database")
try:
    conn = pymysql.connect(host=rds_host, user=name, passwd=password,
                           database=db_name,
                           cursorclass=pymysql.cursors.DictCursor)
    cur = conn.cursor()
except pymysql.MySQLError:
    logger.error("ERROR: Unexpected error: Could not connect to MySQL instance.")
    sys.exit()
logger.info("SUCCESS: Connection to MySQL database succeeded")

def _clean_general_logs():
    cur.execute(
        "DELETE FROM desker.general_log WHERE event_time <= '2019-08-12 04:36:42.457536' ORDER BY event_time DESC LIMIT 10")
    conn.commit()
    cur.close()
    conn.close()
    return {"result": True}

Fast Connection to a SQL Server with pyodbc

I am grabbing JSON data from a messaging bus and dumping that JSON into a database. I had this working pretty well with psycopg2, doing ~3000 entries/sec into Postgres. For a number of reasons we've since moved to SQL Server 2016, and my insert rate dropped to around 100 entries per second.
I've got a function called insert_into() that inserts the JSON into the database. All I've really done to my insert_into() function is change the library to pyodbc and the connection string. It seems that my slowdown comes from setting up and then tearing down the connection each time the function is called ('conn' in the code below). If I move the line that sets up the connection outside of my insert_into() function, my speed comes back. I was just wondering two things:
What's the proper way to set up connections like this from a SQL Server perspective?
Was this even the best way to do this in Postgres?
For SQL Server, the server is 2016, using ODBC driver 17, SQL authentication.
Slow for SQL Server:
def insert_into():
    conn = None
    try:
        conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=server1;DATABASE=json;UID=user;PWD=pass')
        cur = conn.cursor()
        for i in buffer_list:
            command = 'INSERT INTO jsonTable (data) VALUES (%s)' % ("'" + i + "'")
            cur.execute(command)
        cur.close()
        conn.commit()
    except (Exception, pyodbc.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
Fast for SQL Server:
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=server1;DATABASE=json;UID=user;PWD=pass')

def insert_into():
    #conn = None
    try:
        cur = conn.cursor()
        for i in buffer_list:
            command = 'INSERT INTO jsonTable (data) VALUES (%s)' % ("'" + i + "'")
            cur.execute(command)
        cur.close()
        conn.commit()
    except (Exception, pyodbc.DatabaseError) as error:
        print(error)
This daemon runs 24/7 and any advice on setting up a fast connection to MSSQL will be greatly appreciated.

MySQL remote query works, but remote insert fails via Python

I'm running Ubuntu 16.04 with MySQL. I've opened the MySQL server for remote connections, and my remote Python script can query my database, but all attempts to INSERT fail without any error log entry.
It also looks like my remote INSERTs are reaching the server, because the AUTO_INCREMENT ID increases even though no rows appear when I run the Python INSERT code.
Any insight is appreciated!
Simple table schema:
CREATE TABLE test (
    ID int NOT NULL AUTO_INCREMENT,
    x INT,
    PRIMARY KEY (ID)
);
This works directly on the server:
INSERT INTO test (x) VALUES (10);
This is the Python query that's working:
try:
    connection = db.Connection(host=HOST, port=PORT, user=USER, passwd=PASSWORD, db=DB)
    cursor = connection.cursor()
    print("Connected to Server")
    cursor.execute("SELECT * FROM test")
    result = cursor.fetchall()
    for item in result:
        print(item)
except Exception as e:
    print('exception connecting to server db: ' + str(e))
finally:
    print('closing connection...')
    connection.close()
And the Python INSERT that's not working:
try:
    connection = db.Connection(host=HOST, port=PORT, user=USER, passwd=PASSWORD, db=DB)
    cursor = connection.cursor()
    print("Connected to Server")
    cursor.execute("INSERT INTO test (x) VALUES (10);")
except Exception as e:
    print('exception connecting to server db: ' + str(e))
finally:
    print('closing connection...')
    connection.close()
Thanks
Add this line after the execute() call:
cursor.execute("INSERT INTO test (x) VALUES (10)")
connection.commit()
When making changes to the database, you must commit them; otherwise no changes will take effect. Closing the connection without a commit rolls the transaction back, which is why the AUTO_INCREMENT counter advances but no rows appear.

Connecting through pyodbc, strange error

I'm getting the following error after running this:
connection = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=xxxxx.xxxx.net;PORT=1443;DATABASE=xxx;UID=xxxx;PWD=xxxx')
cursor = connection.cursor()
SQLCommand = ("SELECT [Date]...FROM [dbo].[...]")
cursor.execute(SQLCommand)
results = cursor.fetchone()
connection.close()
Error: libc++abi.dylib: terminating with uncaught exception of type std::runtime_error: collate_byname::collate_byname failed to construct for C/en_US.UTF-8/C/C/C/C
Any ideas on how to remediate?

Connecting to Oracle Database Using Python

I am trying to connect to an Oracle database using the following script:
import cx_Oracle

username = user
password = pwd
host = host
port = '1521'
service = service
dbconn = cx_Oracle.connect(username + '/' + password + '@' + host + ':' + port + '/' + service)
But I got the following error:
TNS:Connect timeout occurred
Can anybody please suggest what is going wrong here?
# importing module
import cx_Oracle

# Create a table in the Oracle database
con = None
cursor = None
try:
    con = cx_Oracle.connect('scott/tiger@localhost')
    # Now execute the SQL query
    cursor = con.cursor()
    # Creating a table; srollno is a number column
    cursor.execute("create table student(srollno number, "
                   "name varchar2(10), efees number(10, 2))")
    print("Table created successfully")
except cx_Oracle.DatabaseError as e:
    print("There is a problem with Oracle", e)
# the finally block runs even if an error occurs,
# so the database resources are always closed
finally:
    if cursor:
        cursor.close()
    if con:
        con.close()
