So I have been making websites using PHP with a MySQL database, phpMyAdmin, and XAMPP. I am trying to switch from PHP to Python. All of the tutorials seem to use SQLite instead of MySQL. As far as I understand, SQLite is serverless and can't hold certain data types like DATETIME, but I need datetimes in my website. How would I connect to MySQL from a Python Flask project, or is there a different way I need to do this?
You need to use a client library like PyMySQL.
To install PyMySQL, run:
pip install PyMySQL
Then use a function like this, which returns the DB connection object:
import pymysql

def make_connection():
    try:
        db = pymysql.connect(host='localhost',
                             user='root',
                             password='',
                             db='DatabaseName',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)
        return db
    except Exception as error:
        print(error)
        return None
As @tirth-mehta mentioned, if you want to connect without any ORM, you should use a client library. And if it feels too painful to remember to close the connection in every function call, you can use a decorator like this:
import pymysql

DB_CONFIG = {'host': '127.0.0.1',
             'database': 'dbname',
             'user': 'root',
             'password': ''}

def connect(func):
    def _connect(*args, **kwargs):
        conn = pymysql.connect(**DB_CONFIG)
        rv = None
        try:
            rv = func(conn, *args, **kwargs)
        except Exception as e:
            print(e)
        else:
            conn.commit()
        finally:
            conn.close()
        return rv
    return _connect
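To see the same decorator idea run end to end without a MySQL server, here is a sketch that substitutes the stdlib sqlite3 module for pymysql; in a real app you would swap `sqlite3.connect(":memory:")` for `pymysql.connect(**DB_CONFIG)` (the function and table names below are illustrative, not from the original answer):

```python
import sqlite3
from functools import wraps

def with_connection(func):
    """Open a connection, pass it to func as the first argument,
    commit on success, roll back on error, always close."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        conn = sqlite3.connect(":memory:")  # stand-in for pymysql.connect(**DB_CONFIG)
        try:
            result = func(conn, *args, **kwargs)
            conn.commit()
            return result
        except Exception:
            conn.rollback()
            raise
        finally:
            conn.close()
    return wrapper

@with_connection
def create_and_count(conn):
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (x INTEGER)")
    cur.execute("INSERT INTO t VALUES (1), (2)")
    cur.execute("SELECT COUNT(*) FROM t")
    return cur.fetchone()[0]

print(create_and_count())  # 2
```

The decorated function receives the connection as its first argument and never touches connection lifecycle itself, which is exactly what the decorator above buys you.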
I am confused about how to use a context manager with a pyodbc connection. As far as I know, it is usually necessary to close the database connection, and using a context manager is good practice for that (for pyodbc, I saw some examples which close only the cursor). Long story short, I am creating a Python app which pulls data from SQL Server, and I want to read the data into a pandas DataFrame.
I did some searching on contextlib and wrote a script sql_server_connection:
import pyodbc
import contextlib

@contextlib.contextmanager
def open_db_connection(server, database):
    """
    Context manager to automatically close DB connection.
    """
    conn = pyodbc.connect('DRIVER={SQL Server};SERVER='+server+';DATABASE='+database+';Trusted_Connection=yes;')
    try:
        yield
    except pyodbc.Error as e:
        print(e)
    finally:
        conn.close()
I then called this in another script:
from sql_server_connection import open_db_connection

with open_db_connection(server, database) as conn:
    df = pd.read_sql_query(query_string, conn)
which raises this error:
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\io\sql.py", line 436, in read_sql_query
return pandas_sql.read_query(
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\io\sql.py", line 2116, in read_query
cursor = self.execute(*args)
File "C:\ProgramData\Anaconda3\lib\site-packages\pandas\io\sql.py", line 2054, in execute
cur = self.con.cursor()
AttributeError: 'NoneType' object has no attribute 'cursor'
I didn't define a cursor here because I expected pandas to handle it, as it did before I thought about closing the connection. If the approach above is wrong, how would I close the connection? Or does pyodbc handle it?
Thanks!
You yield nothing (None) from your open_db_connection.
import pyodbc
import contextlib

@contextlib.contextmanager
def open_db_connection(server, database):
    """
    Context manager to automatically close DB connection.
    """
    conn = pyodbc.connect('DRIVER={SQL Server};SERVER='+server+';DATABASE='+database+';Trusted_Connection=yes;')
    try:
        yield conn  # yield the connection, not None
    except pyodbc.Error as e:
        print(e)
    finally:
        conn.close()
Also, I should point out two things:
pyodbc does not expect the user to close the connections (docs):
Connections are automatically closed when they are deleted (typically when they go out of scope) so you should not normally need to call this, but you can explicitly close the connection if you wish.
pandas expects a SQLAlchemy connectable, a SQLAlchemy URL str, or a sqlite3 connection in its read_sql_* functions (docs), not a pyODBC connection, so your mileage may vary.
pandas.read_sql_query(sql, con, index_col=None, coerce_float=True, params=None, parse_dates=None, chunksize=None, dtype=None)
con: SQLAlchemy connectable, str, or sqlite3 connection
Using SQLAlchemy makes it possible to use any DB supported by that library. If a DBAPI2 object, only sqlite3 is supported.
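If you do want to hand pandas a SQLAlchemy connectable, one common pattern is to URL-encode the ODBC connection string and wrap it in a `mssql+pyodbc` URL (this is the SQLAlchemy-documented `odbc_connect` form; the server and database names below are placeholders, and only the stdlib string handling is shown since creating the engine requires SQLAlchemy and a driver installed):

```python
from urllib.parse import quote_plus

def make_mssql_url(server, database):
    """Build a SQLAlchemy URL for pyodbc from an ODBC connection string.
    An engine created from this URL is what pandas.read_sql_query expects."""
    odbc_str = (
        "DRIVER={SQL Server};"
        f"SERVER={server};"
        f"DATABASE={database};"
        "Trusted_Connection=yes;"
    )
    # quote_plus percent-encodes '=', ';', spaces, etc. for the URL query string
    return "mssql+pyodbc:///?odbc_connect=" + quote_plus(odbc_str)

url = make_mssql_url("myserver", "mydb")
# then: engine = sqlalchemy.create_engine(url)
#       df = pd.read_sql_query(query_string, engine)
```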
I would probably simplify to the following:
No need for error handling; this function must return a connection (I also split the string over several lines and used f-strings rather than concatenation).
import pyodbc

def open_db_connection(server, database):
    """
    Open a pyodbc connection to the given server and database.
    """
    return pyodbc.connect(
        "DRIVER={SQL Server};"
        f"SERVER={server};"
        f"DATABASE={database};"
        "Trusted_Connection=yes;"
    )
Call the above function directly in the argument list to scope the connection to inside read_sql_query. You might want to do error handling here as well, but that depends on what you're writing.
import pandas as pd

from sql_server_connection import open_db_connection

df = pd.read_sql_query(
    query_string,
    open_db_connection(server, database),
)
I'm using the psycopg2 library to connect to my PostgreSQL database.
Every time I want to execute a query, I make a new connection like this:
import psycopg2

def run_query(query):
    with psycopg2.connect("dbname=test user=postgres") as connection:
        cursor = connection.cursor()
        cursor.execute(query)
        cursor.close()
But I think it would be faster to make one connection for the whole app execution, like this:
import psycopg2

connection = psycopg2.connect("dbname=test user=postgres")

def run_query(query):
    cursor = connection.cursor()
    cursor.execute(query)
    cursor.close()
So which is the better way to connect to my database for the whole execution time of my app?
I've tried both ways and both worked, but I want to know which is better and why.
You should strongly consider using a connection pool, as other answers have suggested. This will be less costly than creating a connection every time you query, and it can handle workloads that one connection alone couldn't deal with.
Create a file called something like mydb.py, and include the following:
import psycopg2
import psycopg2.pool
from contextlib import contextmanager

# ThreadedConnectionPool takes the pool bounds as its first arguments
dbpool = psycopg2.pool.ThreadedConnectionPool(minconn=1,
                                              maxconn=10,
                                              host=<<YourHost>>,
                                              port=<<YourPort>>,
                                              dbname=<<YourDB>>,
                                              user=<<YourUser>>,
                                              password=<<YourPassword>>,
                                              )

@contextmanager
def db_cursor():
    conn = dbpool.getconn()
    try:
        with conn.cursor() as cur:
            yield cur
        conn.commit()
        """
        You can have multiple exception types here.
        For example, if you wanted to specifically check for the
        23503 "FOREIGN KEY VIOLATION" error type, you could do:

        except psycopg2.Error as e:
            conn.rollback()
            if e.pgcode == '23503':
                raise KeyError(e.diag.message_primary)
            else:
                raise Exception(e.pgcode)
        """
    except:
        conn.rollback()
        raise
    finally:
        dbpool.putconn(conn)
This will allow you to run queries like so:
import mydb

def myfunction():
    with mydb.db_cursor() as cur:
        cur.execute("""Select * from blahblahblah...""")
Both ways are bad. The first one is particularly bad, because opening a database connection is quite expensive. The second is bad, because you will end up with either a single connection (which is too few) or one connection per process or thread (which is usually too many).
Use a connection pool.
Using Flask, I'm attempting to build a web application for user authentication; however, the database is not being created and I keep receiving an OperationalError.
When I use SQLite it functions perfectly, but when I use MySQL I receive the above-mentioned error.
I am providing SQLALCHEMY_DATABASE_URI='mysql+pymysql://username:password@localhost:3306/db_name' in the environment file.
Is it acceptable to use the Python MySQL (pymysql) connector to create the database?
import pymysql as pm

def creating_user_db():
    mydb = None
    cursor = None
    try:
        mydb = pm.connect(host='localhost',
                          user='username',
                          passwd='password')
        mydb.autocommit = False
        cursor = mydb.cursor()
        cursor.execute('CREATE DATABASE IF NOT EXISTS db_name;')
    except Exception as e:
        print(e)
    finally:
        if cursor is not None:
            cursor.close()
        if mydb is not None:
            mydb.close()
Please correct me where I made a mistake and assist me with this.
Thanks in advance!
I use Python 3.4 from the Anaconda distribution. Within this distribution, I found the pymysql library to connect to an existing MySQL database, which is located on another computer.
import sys

import pymysql

config = {
    'user': 'my_user',
    'passwd': 'my_passwd',
    'host': 'my_host',
    'port': my_port
}

try:
    cnx = pymysql.connect(**config)
except pymysql.err.OperationalError:
    sys.exit("Invalid Input: Wrong username/database or password")
I now want to write test code for my application, in which I want to create a very small database at the setUp of every test case, preferably in memory. However, when I try this out of the blue with pymysql, it cannot make a connection.
def setUp(self):
    config = {
        'user': 'test_user',
        'passwd': 'test_passwd',
        'host': 'localhost'
    }
    cnx = pymysql.connect(**config)
pymysql.err.OperationalError: (2003, "Can't connect to MySQL server on 'localhost' ([Errno 61] Connection refused)")
I have been googling around, and found some things about SQLite and MySQLdb. I have the following questions:
Is sqlite3 or MySQLdb suitable for quickly creating a database in memory?
How do I install MySQLdb within the Anaconda package?
Is there an example of a test database, created in the setUp? Is this even a good idea?
I do not have a MySQL server running locally on my computer.
You can mock a MySQL DB using testing.mysqld (pip install testing.mysqld).
Due to some noisy error logs that crop up, I like this setup when testing:
import unittest

import testing.mysqld
from sqlalchemy import create_engine

# prevent generating a brand new db every time. Speeds up tests.
MYSQLD_FACTORY = testing.mysqld.MysqldFactory(cache_initialized_db=True, port=7531)

def tearDownModule():
    """Tear down databases after test script has run.
    https://docs.python.org/3/library/unittest.html#setupclass-and-teardownclass
    """
    MYSQLD_FACTORY.clear_cache()

class TestWhatever(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.mysql = MYSQLD_FACTORY()
        cls.db_conn = create_engine(cls.mysql.url()).connect()

    def setUp(self):
        self.mysql.start()
        self.db_conn.execute("""CREATE TABLE `foo` (blah)""")

    def tearDown(self):
        self.db_conn.execute("DROP TABLE foo")

    @classmethod
    def tearDownClass(cls):
        cls.mysql.stop()  # from source code we can see this kills the pid

    def test_something(self):
        # something useful
        pass
pymysql, MySQLdb, and sqlite3 all want a real database to connect to.
If you just want to test your code, you should mock the pymysql module in the module you want to test, and use it accordingly
(in your test code you can set up the mock object to return hardcoded results for predefined SQL statements).
Check the documentation on native Python mocking library at:
https://docs.python.org/3/library/unittest.mock.html
Or, for Python 2:
https://pypi.python.org/pypi/mock
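For instance, if the code under test only calls cursor.execute and fetchall, a MagicMock standing in for the pymysql connection is enough, with no server and no pymysql install needed (the function and SQL below are made-up examples, not from the original question):

```python
from unittest import mock

def get_user_names(conn):
    """Code under test: fetches names through any DB-API connection."""
    cur = conn.cursor()
    cur.execute("SELECT name FROM users")
    return [row[0] for row in cur.fetchall()]

# In the test, a MagicMock plays the pymysql connection.
fake_conn = mock.MagicMock()
fake_conn.cursor.return_value.fetchall.return_value = [("alice",), ("bob",)]

print(get_user_names(fake_conn))  # ['alice', 'bob']
# and you can verify the SQL that was actually issued:
fake_conn.cursor.return_value.execute.assert_called_once_with("SELECT name FROM users")
```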
I have built a site using Django and I am receiving this annoying error when I am trying to execute a query.
If I restart the Apache server, the error will go away for a short time.
Traceback:
File "/usr/local/lib/python2.7/site-packages/django/core/handlers/base.py" in get_response
100. response = callback(request, *callback_args, **callback_kwargs)
File "/home/fran/cron/views/set_caches.py" in set_caches
24. cursor.execute(query, [category['id']])
File "/usr/local/lib/python2.7/site-packages/django/db/backends/util.py" in execute
15. return self.cursor.execute(sql, params)
File "/usr/local/lib/python2.7/site-packages/django/db/backends/mysql/base.py" in execute
86. return self.cursor.execute(query, args)
File "build/bdist.linux-i686/egg/MySQLdb/cursors.py" in execute
155. charset = db.character_set_name()
Exception Type: InterfaceError at /blablabla/
Exception Value: (0, '')
This is caused by a global cursor. Try creating and closing the cursor within each method where a raw query is needed.
cursor = connection.cursor()
cursor.execute(query)
cursor.close()
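contextlib.closing expresses the same open-use-close lifecycle per call and guarantees the close even on error; sketched here with stdlib sqlite3 purely so it runs anywhere, but Django's connection.cursor() can be wrapped the same way:

```python
import sqlite3
from contextlib import closing

conn = sqlite3.connect(":memory:")

def fetch_one(query):
    # cursor is created per call and closed on exit, even if execute raises
    with closing(conn.cursor()) as cursor:
        cursor.execute(query)
        return cursor.fetchone()[0]

print(fetch_one("SELECT 7 * 6"))  # 42
```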
You get this error when you have a db.close() call and later try to access the database without creating a new connection. Try to find if you close the connection to the database when you don't mean to.
I agree with Moberg. This error is caused when we try to access the database after we have closed the connection. In my case it was caused by wrong indentation in the code. Below is my (buggy) code, with conn.close() accidentally inside the loop:
conn = connect()
cur = conn.cursor()
tk = get_tickers(cur)
for t in tk:
    prices = read_price(t, cur)
    if prices != None:
        update_price(t, cur)
        print 'Price after update of ticker ', t, ':'
        p_open, p_high, p_low, p_close = read_price(t, cur)
        print p_open, p_high, p_low, p_close
    else:
        print 'Price for ', t, ' is not available'
    conn.close()
I got the same error as reported by Marian. After dedenting conn.close(), everything worked well. Confirmed that global conn is not an issue.
I had the same problem as of April 2019, using Python 3.7 and MySQL.
At intermittent intervals, the string (0, '') would be appended at random to my SQL statements, causing errors. I solved the issue by commenting out the closing of the database connection and leaving only the closing of the cursors across my code.
import pymysql

def set_db():
    db = pymysql.connect(host='localhost',
                         user="root",
                         passwd="root",
                         db="DATABASE")
    return db

def execute_sql(cnx, sql_clause, fetch_all):
    if sql_clause and sql_clause is not None:
        try:
            cnx.execute(sql_clause)
        except Exception as e:
            print("Error in sql: " + sql_clause + str(e))
            return 0
        if fetch_all:
            result = cnx.fetchall()
        else:
            result = cnx.fetchone()
        return result
    else:
        print("Empty sql.")
        return 0

db = set_db()
cnx = db.cursor()
sql = "SELECT * FROM TABLE"
result = execute_sql(cnx, sql, 1)
cnx.close()  # close the cursor
# db.close()  # do not close the db connection
...
I had the same issue using threading with Python 3 and PyMySQL. I was getting deadlocks and then I would get hit with InterfaceError (0, '').
My issue was that I was trying to do a rollback on exception of the query. I believe this rollback was trying to use a connection that no longer existed, which gave me the InterfaceError. I took the rollback out (because I am OK with not rolling back for this query) and just let things go. This fixed my issue.
def delete_q_msg(self, assetid, queuemsgtypeid, msgid):
    """
    Given the parameters below, remove items from the msg queue equal to or older than this.
    If appropriate, send them into a history table to be processed later.
    :param assetid:
    :param queuemsgtypeid:
    :param msgid:
    :return:
    """
    params = (assetid, queuemsgtypeid, msgid,)
    db_connection = self._connect_to_db()
    sp_sql = "{db}.ps_delete_q_msg".format(db=self._db_settings["database"])
    return_value = []
    try:
        with db_connection.cursor() as cursor:
            cursor.callproc(sp_sql, params)
            return_value = cursor.fetchall()
        db_connection.commit()
    except Exception as ex:
        # i think we dont want rollback here
        # db_connection.rollback()
        raise Exception(ex)
    finally:
        db_connection.close()
    return return_value
I can confirm this is caused by a global cursor which is then later used in some functions. My symptoms were the exact same: intermittent interface errors that would temporarily be cleared up by an apache restart.
from django.db import connection

cursor = connection.cursor()  # BAD

def foo():
    cursor.execute('select * from bar')
But I am using Django on top of Oracle 11.2, so I do not believe this is a bug in the MySQL/Python driver. It is probably due to the caching done by apache/mod_wsgi.
I had the same issue with Flask + pymysql; I was getting an empty tuple as a result in the except block, something like "(\"(0, '')\",)" to be specific.
It turned out that the connection was getting closed, and later the code tried accessing it, which resulted in this error.
So I solved it by referring to the above solutions: I used a function for the connection, which assured me a fresh connection every time I had to access the db.
You can recreate this issue by inserting conn.close() just before accessing the cursor.
For reference I used this site, which helped me solve the issue:
https://hackersandslackers.com/python-mysql-pymysql/
For me, removing the conn.close() from my function worked. I was trying to access the database again after closing.
I am using Flask with AWS.
You can also try restarting your Flask application if it has been running for a long time. And if, as in my case, you are also using AWS RDS with MySQL Workbench, check whether your session has expired and update the access key and ID.
Hope this helps.
I had this same problem, and what worked for me in Django is what is described in this answer, which consists of:
Replacing
'ENGINE': 'django.db.backends.mysql'
with
'ENGINE': 'mysql_server_has_gone_away'
in
settings.DATABASES['default']['ENGINE']
and installing the package below with pip:
mysql_server_has_gone_away==1.0.0
with connections.cursor() as cursor:
    res = cursor.execute(sql)
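For reference, the change lands in settings.py roughly like this (a hedged sketch based on the package's documented usage; database name and credentials are placeholders, and the mysql_server_has_gone_away backend wraps the stock MySQL backend to reconnect when the server drops idle connections):

```python
# settings.py (fragment; all credentials below are placeholders)
DATABASES = {
    "default": {
        "ENGINE": "mysql_server_has_gone_away",  # instead of "django.db.backends.mysql"
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",
        "PORT": "3306",
    }
}
```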