I am trying to access a pyodbc connection from multiple places within my code.
Currently the user passes in the connection details via the command line, and I process them using plac.annotations. The problem is, I don't know how to share this object across the whole project. What I have so far is:
A Singleton Class for storing the connection
class DatabaseInstance:
    """
    Singleton class holding a database connection.
    """

    class __DatabaseInstance:
        def __init__(self, server, database, schema, table, user, password):
            self.server = server
            self.database = database
            self.schema = schema
            self.table = table
            self.username = user
            self.passw = password

        def __str__(self):
            return "{} DB: {}#[{}].[{}].[{}] # {}".format(
                repr(self),
                self.server,
                self.database,
                self.schema,
                self.table,
                self.username,
            )

        def get_connection(self):
            """
            Return the shared connection, creating it on first use.
            """
            if DatabaseInstance.connection:
                return DatabaseInstance.connection
            else:
                DatabaseInstance.connection = pyodbc.connect(
                    "DRIVER={SQL Server};SERVER="
                    + self.server
                    + ";PORT=1433;DATABASE="
                    + self.database
                    + ";UID="
                    + self.username
                    + ";PWD="
                    + self.passw
                )
                return DatabaseInstance.connection

    instance = None
    connection = None

    def __init__(self, server, database, schema, table, user, password):
        if not DatabaseInstance.instance:
            DatabaseInstance.instance = DatabaseInstance.__DatabaseInstance(
                server, database, schema, table, user, password
            )

    def __getattr__(self, name):
        return getattr(self.instance, name)
Now, in my main, I get the params and create an instance for the database:
connection = DatabaseInstance(
    server=server,
    database=database,
    schema=schema,
    table=table,
    user=user,
    password=passw,
)
The application needs to access this object from different modules and submodules, but connection is within the scope of a function.
Is there a better way to do it than just passing down this object from function to function up until it is used by the necessary functions?
Don't instantiate the singleton in your main. Instantiate DatabaseInstance in your module as db, and when you need access to the singleton, reach into that module and use it:
from . import thatmodule

db = thatmodule.db.get_connection()
To answer your final question: it is perfectly fine to instantiate in your main and pass the object down into code that needs access to the database; it just makes for a more verbose API.
Both approaches work. The choice is up to you.
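To make the module-level approach concrete, here is a minimal sketch. The module name db.py and the configure/get_connection helpers are inventions of mine for illustration, not from the original code, and I use sqlite3 as a stand-in so the sketch runs anywhere; in your code you would call pyodbc.connect with your SQL Server connection string instead.

```python
# db.py (sketch): the module object itself is a natural singleton,
# because Python caches imported modules in sys.modules.
import sqlite3  # stand-in for pyodbc so this sketch is runnable anywhere

_dsn = None
_connection = None

def configure(dsn):
    """Record the connection details once, e.g. after parsing the CLI args."""
    global _dsn
    _dsn = dsn

def get_connection():
    """Create the connection lazily on first use, then reuse it."""
    global _connection
    if _connection is None:
        _connection = sqlite3.connect(_dsn)  # pyodbc.connect(...) in real code
    return _connection
```

Any other module then just does import db and calls db.get_connection(), after the main entry point has called db.configure(...) once.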
Related
I have a CRUD with insert and update functions, with a commit at the end of each one, as follows:
@staticmethod
def insert(db: Session, item: Item) -> None:
    db.add(item)
    db.commit()

@staticmethod
def update(db: Session, item: Item) -> None:
    ...
    db.commit()
I have an endpoint which receives a sqlalchemy session from a FastAPI dependency and needs to insert and update atomically (DB transaction).
What's the best practice when working with transactions? I can't work with the CRUD since it does more than one commit.
How should I handle the transactions? Where do you commit your session? in the CRUD? or only once in the FastAPI dependency function for each request?
I had the same problem while using FastAPI. I couldn't find a way to use commit in separate methods and have them behave transactionally.
What I ended up doing was a flush instead of the commit, which sends the changes to the db, but doesn't commit the transaction.
One thing to note is that in FastAPI every request opens a new session and closes it once it's done. This is a rough example of what is happening, using the example from the SQLAlchemy docs:
def run_my_program():
    # This happens in the `database = SessionLocal()` of the `get_db` method below
    session = Session()
    try:
        ThingOne().go(session)
        ThingTwo().go(session)
        session.commit()
    except:
        session.rollback()
        raise
    finally:
        # This is the same as the `get_db` method below
        session.close()
The session that is generated for the request is already a transaction. When you commit that session, what it is actually doing is this:
When using the Session in its default mode of autocommit=False, a new transaction will be begun immediately after the commit, but note that the newly begun transaction does not use any connection resources until the first SQL is actually emitted.
In my opinion, after reading that, it makes sense to handle the commit and rollback at the endpoint scope.
I created a dummy example of how this would work. I use everything from the FastAPI guide.
def create_user(db: Session, user: UserCreate):
    """
    Create user record
    """
    fake_hashed_password = user.password + "notreallyhashed"
    db_user = models.User(email=user.email, hashed_password=fake_hashed_password)
    db.add(db_user)
    db.flush()  # Changed this to a flush
    return db_user
And then use the CRUD operations in the endpoint as follows:
from typing import List

from fastapi import Depends, HTTPException
from sqlalchemy.orm import Session
...

def get_db():
    """
    Get SQLAlchemy database session
    """
    database = SessionLocal()
    try:
        yield database
    finally:
        database.close()
@router.post("/users", response_model=List[schemas.User])
def create_users(user_1: schemas.UserCreate, user_2: schemas.UserCreate, db: Session = Depends(get_db)):
    """
    Create two users
    """
    try:
        user_1 = crud.create_user(db=db, user=user_1)
        user_2 = crud.create_user(db=db, user=user_2)
        db.commit()
        return [user_1, user_2]
    except:
        db.rollback()
        raise HTTPException(status_code=400, detail="Duplicated user")
In the future I might investigate moving this to a middleware, but I don't think you can get the behavior you want by committing inside the CRUD methods.
A more pythonic approach is to let a context manager perform a commit or rollback depending on whether or not there was an exception.
A Transaction is a nice abstraction of what we are trying to accomplish.
class Transaction:
    def __init__(self, session: Session = Depends(get_session)):
        self.session = session

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            # rollback and let the exception propagate
            self.session.rollback()
            return False

        self.session.commit()
        return True
And, use it in your APIs, like so:
def some_api(tx: Transaction = Depends(Transaction)):
    with tx:
        ThingOne().go()
        ThingTwo().go()
No need to pass session to ThingOne and ThingTwo. Inject it into them, like so:
class ThingOne:
    def __init__(self, session: Session = Depends(get_session)):
        ...

class ThingTwo:
    def __init__(self, session: Session = Depends(get_session)):
        ...
I would also inject ThingOne and ThingTwo in the APIs as well:
def some_api(tx: Transaction = Depends(Transaction),
             one: ThingOne = Depends(ThingOne),
             two: ThingTwo = Depends(ThingTwo)):
    with tx:
        one.go()
        two.go()
I am currently working on a Flask app that can have multiple databases connected to it.
Each request to the app should be handled by a certain database, depending on the URL.
I am now trying to replace Flask-SQLAlchemy with plain SQLAlchemy, so I can use scoped sessions to take care of my problem.
I have a session_registry in order to store the sessions:
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, scoped_session
class SessionRegistry(object):
    _registry = {}

    def get_database_connection(self, name, **kwargs):
        return self._registry[name]

    def add_database_connection(self, url, name, **kwargs):
        if name not in self._registry:
            engine = create_engine(url)
            Session = sessionmaker(bind=engine)
            session = scoped_session(Session)
            self._registry[name] = session
        return True if self._registry[name] is not None else False
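For reference, here is how the registry can be exercised in isolation. I repeat the class (with the lookup keyed by name) and use an in-memory SQLite URL purely for the demo; the 'reports' name is a placeholder of mine, not part of the real app:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, scoped_session

class SessionRegistry(object):
    """Maps a name to a scoped_session, keyed by name so lookups match."""
    _registry = {}

    def get_database_connection(self, name, **kwargs):
        return self._registry[name]

    def add_database_connection(self, url, name, **kwargs):
        if name not in self._registry:
            engine = create_engine(url)
            self._registry[name] = scoped_session(sessionmaker(bind=engine))
        return self._registry[name] is not None

registry = SessionRegistry()
registry.add_database_connection("sqlite://", "reports")  # in-memory SQLite, demo only
session = registry.get_database_connection("reports")     # a scoped_session: calling it
                                                          # yields the thread-local Session
```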
The problem that I have now is, that I don't know how to pass it to my routes in order to use that session. Here is an example class where I am trying to use it:
class SomeJob():
    def get(self, lim=1000, order="asc"):
        if order == "desc":
            result = session.query(SomeModel).order_by(
                SomeModel.id.desc()).limit(lim).all()
        else:
            result = session.query(SomeModel).order_by(
                SomeModel.id.asc()).limit(lim).all()
        # deserialize to json
        schemaInstance = SomeSchema(many=True)
        json_res = schemaInstance.dump(result)
        # return json
        return json_res
My question now is, how do I pass that session to the object properly?
I've been searching StackOverflow questions and reading SQLAlchemy and Flask-SQLAlchemy docs, and have still not figured out how to get reflection working (still new to SQLAlchemy).
When I try to map the table using the engine, I get the error "sqlalchemy.exc.ArgumentError: Mapper Mapper|User|user could not assemble any primary key columns for mapped table 'user'".
Despite that, the user table does have a column 'id' as its primary key in the database. I'm not sure if there's something else I need to do here first. I had thought that if I could reflect, my User model class would automatically get properties named after the database columns, and I wouldn't have to define them manually.
Here is my code that I've cobbled together so far:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.orm import mapper

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = "mysql+pymysql://" + dbUser + ":" + dbPass + "@" + dbAddress + ":3306/" + dbName
app.config["SQLALCHEMY_ECHO"] = True
db = SQLAlchemy(app)
engine = db.engine
meta = db.metadata
Now with that, I know that db will give me a good database session. I'm not sure yet about what I'm doing wrong with engine and meta. I've seen engine being used to create the context differently, but I think I'm creating that with this line (from above):
db = SQLAlchemy(app)
Here's the place where I'm trying to reflect a model class:
class User(db.Model):
    try:
        self = db.Model.metadata.tables('user', metadata)
        #self = Tadb.Model.metadata.tables('user', meta, autoload=True, autoload_with=engine, extend_existing=True)
        #Table('user', metadata, autoload_with=engine, extend_existing=True)
        #self = Table('user', meta, autoload_with=engine, extend_existing=True)
        #self.metadata.reflect(extend_existing=True, only=['user'])
    except Exception as e:
        print("In init User(): failed to map table user - " + str(e))
I get the mapper error on this line (from above):
self = db.Model.metadata.tables('user', metadata)
I've tried the other lines as well, but it doesn't seem to know what Table is...I have Flask-SQLAlchemy 2.3.2.
Am I making any obvious mistakes here?
You can try declaring the id column (or whatever its real name is in the database) along with all the columns you will need in your code:
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(50), nullable=False)
This has taken me several days of effort, reading over 30 StackOverflow Q/As and going down quite a few rabbit holes in the SQLAlchemy and Flask-SQLAlchemy documentation, trying and discarding quite a few code fragments. This is what I've pieced together. Note: this is with Flask-SQLAlchemy version 2.3.2.
I thought it was rather unusual that Flask-SQLAlchemy provides an engine and metadata after app initialization, yet the documentation on their page imports separate engine and metadata modules from SQLAlchemy.
Not only that, they also import a separate session engine, whereas I have simply used the Flask-initialized database context to get a session, and it works.
On top of that, this does what 99% of the Q/As here on StackOverflow say is impossible with Flask-SQLAlchemy: automapping / reflection of the database tables. That is, I do not have to declare properties on the classes at all; they come from the database directly.
Here's the code:
import requests
import json
from flask import Flask, render_template, request, redirect, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)

# The file I am importing below is named config and is in the same folder as app.py.
# It has JSON-formatted text and looks like this (without the '#' signs; replace the
# <> placeholders with your db address, username, password, and database name --
# database servers can host multiple databases, each with its own name):
# {
#   "dbAddress": "<some IP or URL to your database server>",
#   "dbName": "<database name>",
#   "dbUser": "<database user name>",
#   "dbPass": "<database password>"
# }
with open('config') as data_file:
    data = json.load(data_file)
dbAddress = data["dbAddress"]
dbName = data["dbName"]
dbUser = data["dbUser"]
dbPass = data["dbPass"]

app.config['SQLALCHEMY_DATABASE_URI'] = "mysql+pymysql://" + dbUser + ":" + dbPass + "@" + dbAddress + ":3306/" + dbName
app.config["SQLALCHEMY_ECHO"] = True
db = SQLAlchemy(app)
db.Model.metadata.reflect(bind=db.engine)
class User(db.Model):
    __tablename__ = 'user'

    def __init__(self, db, username, password, email):
        try:
            self = db.session.query(User).filter(User.username == username)  # .one()
        except Exception as e:
            print("In init User(): Failed to load an existing user into the model for user '" + username + "' " + str(e))
        self.username = username
        self.password = password
        self.emailAddress = email
        try:
            db.session.add(self)
            db.session.commit()
            print("In init User(): Inserted or updated user '" + username + "'")
        except Exception as e:
            print("In init User(): insert or update exception on user '" + username + "': " + str(e))

    def delete(db, userid):
        try:
            self = db.session.query(User).filter(User.id == userid).one()
            db.session.delete(self)
            db.session.commit()
        except Exception as e:
            print("In User.delete(): failed to delete userid '" + str(userid) + "': " + str(e))

    def getAllUsers(db):
        return db.session.query(User).all()

    # Setting this to blank would "logout" the user.
    # This is because csrf_protect prevents POST requests from going through other
    # than login and signup.
    def updateSessionToken(db, userid, token):
        try:
            self = db.session.query(User).filter(User.id == userid).one()
            self.sessionToken = token
            db.session.add(self)
            db.session.commit()
            print("In User.updateSessionToken(): Successfully updated the token for userid '" + str(userid) + "'.")
            return True
        except Exception as e:
            print("In User.updateSessionToken() failed to update token: " + str(e))
            return False

    def checkSessionToken(db, userid, givenToken):
        user = None
        try:
            user = db.session.query(User).filter(User.id == userid).one()
        except Exception as e:
            print("In checkSessionToken(): issue looking up userid " + str(userid) + ": " + str(e))
        if user:
            if user.sessionToken == givenToken:
                print("In User.checkSessionToken(): token match confirmed for userid '" + str(user.id) + "', username '" + user.username + "'.")
                return True
            else:
                print("In checkSessionToken() - token and given token do not match")
        else:
            print("In checkSessionToken() - user by given id '" + str(userid) + "' not found.")
        return False
I'm trying to separate the read and write DB operations via Flask-SQLAlchemy. I'm using binds to connect to the MySQL databases. I want to perform the write operations on the master and the reads on the slaves. There does not seem to be a built-in way to handle this.
I'm new to Python and was surprised that much-needed functionality like this is not already built into Flask-SQLAlchemy. Any help is appreciated. Thanks
There is no official support, but you can customize the Flask-SQLAlchemy session to use master/slave connections:
from functools import partial
from sqlalchemy import orm
from flask import current_app
from flask_sqlalchemy import SQLAlchemy, get_state
class RoutingSession(orm.Session):
    def __init__(self, db, autocommit=False, autoflush=True, **options):
        self.app = db.get_app()
        self.db = db
        self._bind_name = None
        orm.Session.__init__(
            self, autocommit=autocommit, autoflush=autoflush,
            bind=db.engine,
            binds=db.get_binds(self.app),
            **options,
        )

    def get_bind(self, mapper=None, clause=None):
        try:
            state = get_state(self.app)
        except (AssertionError, AttributeError, TypeError) as err:
            current_app.logger.info(
                "can't get configuration, using default bind. Error: %s", err)
            return orm.Session.get_bind(self, mapper, clause)

        # If there are no binds configured, use the default SQLALCHEMY_DATABASE_URI
        if not state or not self.app.config['SQLALCHEMY_BINDS']:
            return orm.Session.get_bind(self, mapper, clause)

        # If an exact bind was requested, use it
        if self._bind_name:
            return state.db.get_engine(self.app, bind=self._bind_name)
        else:
            # Otherwise connect to the default bind
            return orm.Session.get_bind(self, mapper, clause)

    def using_bind(self, name):
        bind_session = RoutingSession(self.db)
        vars(bind_session).update(vars(self))
        bind_session._bind_name = name
        return bind_session


class RouteSQLAlchemy(SQLAlchemy):
    def __init__(self, *args, **kwargs):
        SQLAlchemy.__init__(self, *args, **kwargs)
        self.session.using_bind = lambda s: self.session().using_bind(s)

    def create_scoped_session(self, options=None):
        if options is None:
            options = {}
        scopefunc = options.pop('scopefunc', None)
        return orm.scoped_session(
            partial(RoutingSession, self, **options),
            scopefunc=scopefunc,
        )
Then the default session will use the master; when you want to select from the slave, you request that bind explicitly. Here are the examples:
In your app:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql:///master'
app.config['SQLALCHEMY_BINDS'] = {
    'slave': 'postgresql:///slave'
}
db = RouteSQLAlchemy(app)
Select from master:
db.session.query(User).filter_by(id=1).first()
Select from slave:
db.session.using_bind('slave').query(User).filter_by(id=1).first()
Here is the documentation http://packages.python.org/Flask-SQLAlchemy/binds.html
So I have a controller that renders a page. In the controller, I call multiple functions from the model, each of which creates its own session. For example:
def page(request):
    userid = authenticated_userid(request)
    user = User.get_by_id(userid)
    things = User.get_things()
    return {'user': user, 'things': things}
Where in the model I have:
class User:
    ...
    def get_by_id(self, userid):
        return DBSession.query(User)...

    def get_things(self):
        return DBSession.query(Thing)...
My question is: is creating a new session for each function optimal, or should I start a session in the controller and use the same session throughout it (assuming I'm both querying and inserting into the database in the controller)? E.g.:
def page(request):
    session = DBSession()
    userid = authenticated_userid(request)
    user = User.get_by_id(userid, session)
    things = User.get_things(session)
    ...
    return {'user': user, 'things': things}

class User:
    ...
    def get_by_id(self, userid, session=None):
        if not session:
            session = DBSession()
        return session.query(User)...

    def get_things(self, session=None):
        if not session:
            session = DBSession()
        return session.query(Thing)...
Your first code is OK, if your DBSession is a ScopedSession. DBSession() is not a constructor then, but just an accessor function for thread-local storage. You might speed things up a bit by passing the session explicitly, but premature optimization is the root of all evil.
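To see what "accessor function for thread-local storage" means in practice, here is a minimal sketch with an in-memory SQLite engine (a stand-in of mine, not your setup): every call to the scoped session "constructor" hands back the same thread-local session.

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, scoped_session

engine = create_engine("sqlite://")                    # stand-in engine for the demo
DBSession = scoped_session(sessionmaker(bind=engine))  # like the Pyramid DBSession

s1 = DBSession()  # looks like a constructor call...
s2 = DBSession()  # ...but both return the same thread-local Session
assert s1 is s2

DBSession.remove()  # discard the thread-local session, e.g. at the end of a request
```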