I have a Flask application that accesses a DB via a class that encapsulates DB access. I need to use this same class outside of the Flask application for some regular jobs that access the same DB.
/databases/database.db
/website/application/myblueprint/views.py
/website/application/myblueprint/db_class.py
/scripts/log_reading.py
Both views.py and log_reading.py need to use db_class.py, but you can't import from above your own package.
I could make db_class.py its own package and install it in each
venv, but then every time I edit it I have to reinstall it in each
place. Plus there's the overhead of the setup boilerplate for a single module.
I could put the file on the Python site path, either by moving
it or by adding to the path (sketched below), but that feels wrong and I'm not sure
it would work with venvs.
I could symlink it, but that also feels wrong.
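For clarity, the "adding to the path" option above would look something like this in scripts/log_reading.py (the paths are guesses based on the layout above, and DbClass is only a stand-in for whatever the class is actually called):

import os
import sys

# make /website importable so the blueprint's db_class module can be found
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "website"))

from application.myblueprint.db_class import DbClass  # DbClass is a placeholder name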
I'm not using Flask models for the DB, but I don't think that would solve my problem anyway.
If you want to use db or session, then you have to create an application context; outside of an application context you can't use them. Alternatively, you can build a session of your own from the same config:
from flask import Flask
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
import os

APP_SETTINGS = os.getenv("APP_SETTINGS", "your config path")

def _create_db_session(connection_string):
    engine = create_engine(connection_string)
    session_factory = sessionmaker(bind=engine)
    session = scoped_session(session_factory)
    return session

app = Flask(__name__)
app.config.from_object(APP_SETTINGS)
db_url = app.config.get("SQLALCHEMY_DATABASE_URI")
session = _create_db_session(db_url)

# then you can use
# session.query to query the database
# session.commit(), session.remove()
# the same operations you normally do with a DB session
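For example, in a regular job such as scripts/log_reading.py it might look like this (LogEntry and its attributes are made up purely for illustration; the session comes from the snippet above):

# a sketch of using the session outside any Flask request
for entry in session.query(LogEntry).filter(LogEntry.level == "ERROR"):
    print(entry.message)
session.commit()
session.remove()   # dispose of the thread-local session when the job is done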
I'd like to use Flask's application factory mechanism for my application. The issue I have is that the databases I use within some blueprints are located differently, so I'm using binds to point to them. The tables themselves are in production and already in use, so I need to reflect them in order to use them within my application.
The problem is that I can't get the reflect function working because of the application context. I always get the message that I'm working outside the application context. I fully understand that and see that db really is outside, but I don't have any idea anymore on how to involve it.
I tried different variations of passing app via current_app to my models.py, but nothing worked.
config.py:
class Config(object):
#Secret key
SECRET_KEY = 'my_very_secret_key'
ITEMS_PER_PAGE = 25
SQLALCHEMY_BINDS = {
'mysql_bind': 'mysql+mysqlconnector://localhost:3306/tmpdb'
}
SQLALCHEMY_TRACK_MODIFICATIONS = False
main.py:
from webapp import create_app
app = create_app('config.Config')
if __name__ == '__main__':
    app.run(debug=True)
webapp/__init__.py:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
db = SQLAlchemy()
def create_app(config_object):
    app = Flask(__name__)
    app.config.from_object(config_object)
    db.init_app(app)

    from .main import create_module as main_create_module
    main_create_module(app)

    return app
webapp/main/__init__.py:
def create_module(app):
    from .controller import blueprint
    app.register_blueprint(blueprint)
webapp/main/controller.py:
from flask import Blueprint, render_template, current_app as app
from .models import db, MyTable # <-- Problem might be here ...
blueprint = Blueprint('main', __name__)

@blueprint.route('/')
def index():
    resp = db.session.query(MyTable.name, db.func.count(MyTable.versions))\
        .filter(MyTable.versions != '')\
        .group_by(MyTable.name).all()
    if resp:
        return render_template('index.html', resp=resp)
    else:
        return 'Nothing happened'
webapp/main/models.py:
from .. import db # <-- and here ...
db.reflect(bind='mysql_bind')
class MyTable(db.Model):
__bind_key__ = 'mysql_bind'
__table__ = db.metadata.tables['my_table']
Expected result would be to get the reflection working in different blueprints.
Got it working, full solution here:
https://github.com/researcher2/stackoverflow_56885380
I have used sqlite3 for the test; run the create_db.py script to set up the db. Run Flask with debug.sh, since with recent versions you can't seem to just app.run() inside __main__ anymore.
Explanation
As I understand it, a blueprint is just a way to group together several views if you need to use them multiple times in a single app or across multiple apps. You can add a different route prefix as you desire.
A db object is not associated with a blueprint; it is associated with an app, which provides the configuration information. Once inside the blueprint views you will have access to the db object, with the relevant app context automatically available.
Regarding the db.reflect, you need to make the call inside create_app and pass it the app object (preferred), or import the app inside the model, which is spaghetti.
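For example, a minimal sketch of doing the reflection inside create_app, using the factory from your question (the reflect call then replaces the module-level one in models.py):

def create_app(config_object):
    app = Flask(__name__)
    app.config.from_object(config_object)
    db.init_app(app)

    # reflect the existing production tables while an app context is active,
    # before the blueprint (and therefore models.py) gets imported
    with app.app_context():
        db.reflect(bind='mysql_bind')

    from .main import create_module as main_create_module
    main_create_module(app)
    return app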
Multiple DBs can be accessed using binding as you've shown.
So your blueprints will have access to all tables imported and flask-sqlalchemy knows which db connection to use based on the binding.
I'm normally a fan of explicitly defining tables so you have access to the ORM objects and fields in code completion. Do you have lots of tables/fields, or maybe you are creating something that queries table metadata for total automation on any schema, like a schema viewer or something like that?
This might be useful for others coming to this post:
https://flask-sqlalchemy.palletsprojects.com/en/2.x/contexts/
Brilliant! Thank you very much. Got it also working. Your tip gave me a hint to find another way:
@blueprint.route('/')
def index():
    # pushing app_context() to import MyTable
    # now I can use db.reflect() also in models.py
    with app.app_context():
        from .models import MyTable
        results = db.session.query(MyTable).all()
        print(results)
        for row in results:
            print(row)
            print(row.versions)
            print(row.name)
    if results:
        return render_template('my_table.html', results=results)
    else:
        return 'Nothing happened'
Then the reflection can be done inside models.py. The link you posted is really helpful; I don't know why I did not stumble over it myself ...
Anyway, I do now have a lot more possibilities than before!
Cheers, mate!
Our application uses Flask + Gunicorn. Now we want to make it able to reload the db configuration while it is running, which means it can switch to a new db without restarting the process. With the help of a config center we can dispatch config at runtime, but how can I re-init the global variable db?
db = SQLAlchemy()
def create_app():
app = Flask(__name__)
app.config.from_object(dynamic_config)
db.init_app(app)
And assume that at some time we dispatch a new config. How can db be re-initialized with the new config? Or is it safe to just replace it with a new SQLAlchemy() instance, like this:
from flask_sqlalchemy import SQLAlchemy
from models import set_db  # which will set the global db to the new instance
from app import app

def callback(old, new):
    new_db = SQLAlchemy()
    # re-construct config with old and new
    # now app.config is updated
    new_db.init_app(app)
    set_db(new_db)
Is it OK to do this? My concern is that it may cause thread-safety issues and break transactions.
Help me with this, many thanks
If I were you, I would use the SQLALCHEMY_BINDS config, or different instances of db with different configs, instead of changing the configuration every time.
And I think it isn't good practice to change the db structure from the application (in case you are doing that).
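A rough sketch of the binds idea (the names and URLs are invented; it assumes db = SQLAlchemy() as in your snippet):

class Config:
    SQLALCHEMY_DATABASE_URI = "sqlite:///default.db"      # default bind
    SQLALCHEMY_BINDS = {
        "secondary": "mysql+mysqlconnector://localhost:3306/other_db",
    }

class ItemOnSecondary(db.Model):
    __bind_key__ = "secondary"          # routed to the 'secondary' engine
    id = db.Column(db.Integer, primary_key=True)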
I have a python app that uses Flask RESTful as well as Flask SQLAlchemy. Part of the API I'm writing has the side effect of spinning off Timer objects. When a Timer expires, it executes some database queries. I'm seeing an issue in which code that is supposed to update rows in the database (a sqlite backend) is actually not issuing any UPDATE statements. I have verified this by turning the SQLALCHEMY_ECHO flag on to log the SQL statements. Whether or not the code works seems to be random. About half the time it fails to issue the UPDATE statement. See full example below.
My guess here is that SQLAlchemy Flask does not work properly when called from a worker thread. I think part of the point of Flask SQLAlchemy is to manage the SQLAlchemy sessions for you per API request. Obviously since there are no API requests going on when the Timer expires, I could see where things may not work properly.
Just to test this, I went ahead and wrote a simple data access layer using python's sqlite3 interface and it seems to solve the problem.
I'd really rather not have to rewrite a bunch of data access code though. Is there a way to get Flask SQLAlchemy to work properly in this case?
Sample code
Here's where I set up the flask app and save off the SQLAlchemy db object:
from flask import Flask
from flask_restful import Api
from flask_sqlalchemy import SQLAlchemy
from flask_cors import CORS
import db_conn
flask_app = Flask(__name__)
flask_app.config.from_object('config')
CORS(flask_app)
api = Api(flask_app)
db_conn.db = SQLAlchemy(flask_app)
api.add_resource(SomeClass, '/abc/<some_id>/def')
Here's how I create the ORM models:
import db_conn
db = db_conn.db
class MyTable(db.Model):
__tablename__ = 'my_table'
id = db.Column(db.Integer, primary_key=True)
phase = db.Column(db.Integer, nullable=False, default=0)
def set_phase(self, phase):
self.phase = phase
db.session.commit()
Here's the API handler with timer and the database call that is failing:
from flask_restful import Resource
from threading import Timer
from models import MyTable
import db_conn
import global_store
class SomeClass(Resource):
    def put(self, some_id):
        global_store.saved_id = some_id
        self.timer = Timer(60, self.callback)
        self.timer.start()
        return '', 204
def callback(self):
row = MyTable.query.filter_by(id=global_store.saved_id).one()
# sometimes this works, sometimes it doesn't
row.set_phase(1)
db_conn.db.session.commit()
I'm guessing that in your callback you aren't actually changing the value of the object. SQLAlchemy won't issue DB UPDATE calls if the session state is not dirty. So if the phase is already 1 for some reason, there is nothing to do.
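If you want to confirm that, one thing you could try in the callback (a sketch using SQLAlchemy's Session.is_modified()):

row = MyTable.query.filter_by(id=global_store.saved_id).one()
row.phase = 1
# False here means the attribute already had that value,
# so there is no net change and no UPDATE will be emitted on commit
print(db_conn.db.session.is_modified(row))
db_conn.db.session.commit()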
I'm trying to test my flask application using unittest. I want to refrain from flask-testing because I don't like to get ahead of myself.
I've really been struggling with this unittest thing now. It is confusing because there's the request context and the app context and I don't know which one I need to be in when I call db.create_all().
It seems like when I add to the database, it adds my models to the database specified in my app module (__init__.py), but not the database specified in the setUp(self) method.
I have some methods that must populate the database before every test_ method.
How can I point my db to the right path?
def setUp(self):
#self.db_gd, app.config['DATABASE'] = tempfile.mkstemp()
app.config['TESTING'] = True
# app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + app.config['DATABASE']
basedir = os.path.abspath(os.path.dirname(__file__))
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + \
os.path.join(basedir, 'test.db')
db = SQLAlchemy(app)
db.create_all()
#self.app = app.test_client()
#self.app.testing = True
self.create_roles()
self.create_users()
self.create_buildings()
#with app.app_context():
# db.create_all()
# self.create_roles()
# self.create_users()
# self.create_buildings()
def tearDown(self):
#with app.app_context():
#with app.request_context():
db.session.remove()
db.drop_all()
#os.close(self.db_gd)
#os.unlink(app.config['DATABASE'])
Here is one of the methods that populates my database:
def create_users(self):
#raise ValueError(User.query.all())
new_user = User('Some User Name', 'xxxxx@gmail.com', 'admin')
new_user.role_id = 1
new_user.status = 1
new_user.password = generate_password_hash(new_user.password)
db.session.add(new_user)
Places I've looked at:
http://kronosapiens.github.io/blog/2014/08/14/understanding-contexts-in-flask.html
http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xvi-debugging-testing-and-profiling
And the flask documentation:
http://flask.pocoo.org/docs/0.10/testing/
One issue that you're hitting is the limitations of Flask contexts. This is the primary reason I think long and hard before including a Flask extension in my project, and Flask-SQLAlchemy is one of the biggest offenders. I say this because in most cases it is completely unnecessary to depend on the Flask app context when dealing with your database. Sure, it can be nice, especially since Flask-SQLAlchemy does a lot behind the scenes for you (mainly you don't have to manually manage your session, metadata or engine), but keeping that in mind, those things can easily be done on your own, and in return you get the benefit of unrestricted access to your database, with no worry about the Flask context. Here is an example of how to set up your db manually; first I will show the Flask-SQLAlchemy way, then the manual plain SQLAlchemy way:
the flask-sqlalchemy way:
import flask
from flask_sqlalchemy import SQLAlchemy
app = flask.Flask(__name__)
db = SQLAlchemy(app)
# define your models using db.Model as base class
# and define columns using classes inside of db
# ie: db.Column(db.String(255),nullable=False)
# then create database
db.create_all() # <-- gives error if not currently running flask app
the standard sqlalchemy way:
import flask
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base
# first we need our database engine for the connection
engine = sa.create_engine(MY_DB_URL,echo=True)
# the line above is part of the benefit of using flask-sqlalchemy,
# it passes your database uri to this function using the config value
# SQLALCHEMY_DATABASE_URI, but that config value is one reason we are
# tied to the application context
# now we need our session to create queries with
Session = sa.orm.scoped_session(sa.orm.sessionmaker())
Session.configure(bind=engine)
session = Session()
# now we need a base class for our models to inherit from
Model = declarative_base()
# and we need to tie the engine to our base class
Model.metadata.bind = engine
# now define your models using Model as base class and
# anything that would have come from db, ie: db.Column
# will be in sa, ie: sa.Column
# then when you're ready, to create your db just call
Model.metadata.create_all()
# no flask context management needed now
If you set your app up like that, any context issues you're having should go away.
As a separate answer: to actually just force what you need to work, you can use the test_request_context function. In setUp do self.ctx = app.test_request_context(), then activate it with self.ctx.push(), and when you're done get rid of it in tearDown with self.ctx.pop().
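A minimal sketch of that, assuming app and db are imported from your application module:

def setUp(self):
    self.ctx = app.test_request_context()
    self.ctx.push()              # make the request/app context current for this test
    db.create_all()
    self.create_roles()
    self.create_users()
    self.create_buildings()

def tearDown(self):
    db.session.remove()
    db.drop_all()
    self.ctx.pop()               # tear the context back down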
I was planning for my structure to be something similar to this:
appname/libs/user
/football
/tax
Where the user library will have the models for the User, the controller which would expose the REST JSON API, and the views.
The problems that I'm facing currently can be divided into two major questions, and which mainly stem from using Django for a while. I'm fairly new to CherryPy and SqlAlchemy.
How do I define models in each of these libraries? The problem I face is that I have to inherit the declarative Base and the engine in every model and run it as a standalone app for its models to be generated. Is there a mechanism where I can plug in the libraries and the database should pull all the models and create them? (Something that Django does.)
How do I define routes/APIs? (like a urls.py)
How about defining the declarative base (SQLAlchemy) in appname/db/__init__.py, and for each of the libs importing the base from appname in appname/libs/NAME/models.py:
import appname.db
Base = appname.db.Base
class MyUser(Base):
...
To get a database session, just use a scoped session. For example, this could be the basic content of appname/db/__init__.py (or just db.py if you don't want to define additional base models in appname/db/models.py):
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy import engine_from_config
__all__ = ['session', 'ses']
ses = session = scoped_session(sessionmaker())
def load_engine(config):
engine = engine_from_config(config, prefix='')
session.configure(bind=engine)
Then set a tool to remove the session from the thread locals when the request has ended:
import cherrypy
from appname import db
def remove_db_session():
db.session.remove()
cherrypy.tools.removedbs = cherrypy.Tool('on_end_request', remove_db_session)
From that point forward, just use the session as normal from any part of your application, for example:
from appname import db
from appname.libs.user import models
class User:
exposed = True
def GET(self, id):
db.ses.query(models.User).filter_by(id=id)
# and let the tool to remove the session
# when the request finish
By the way, to enable that removedbs tool, just make sure that the cherrypy.tools.removedbs = ... assignment is executed somewhere. I usually put that in appname/cptools/tool, and then in the config dictionary or file set tools.removedbs.on to True.
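For example, something like this (a sketch; Root and the module path are placeholders for your own names):

import cherrypy
import appname.cptools.tool        # importing this module registers cherrypy.tools.removedbs

config = {'/': {'tools.removedbs.on': True}}
cherrypy.quickstart(Root(), '/', config)   # Root() stands in for your application root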
Working with CherryPy means that you will build the application tree yourself; there is no extra magic, so you need a central place to build the full tree if you want to use the MethodDispatcher or the DefaultDispatcher.
In this case I recommend the MethodDispatcher; this post can probably give you a little more perspective, and this one from my blog emulates the GitHub API without any base handler.
There is an alternative to use more Django-like routes with the RoutesDispatcher, but you will lose a lot of functionality from the tools.
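For completeness, a tiny sketch of that alternative (the controller and route names are made up, and it needs the routes package installed):

import cherrypy

class UserController:
    def index(self):
        return "user list"

# map URLs to controller methods instead of building an object tree
dispatcher = cherrypy.dispatch.RoutesDispatcher()
dispatcher.connect('users', '/users', controller=UserController(), action='index')

conf = {'/': {'request.dispatch': dispatcher}}
cherrypy.tree.mount(root=None, config=conf)
cherrypy.engine.start()
cherrypy.engine.block()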
To show you an example with the MethodDispatcher and building your own object tree, from the current structure you can have a build function on the appname/builder.py and make something like this:
from appname.views import Main
from appname.libs import user, football
appmap = {'user': user,
          'football': football}
def build_app(*apps):
root = Main()
for app in apps:
appmodule = appmap[app]
appmodule.attach(root)
return root
And inside the appname/libs/user/__init__.py
from appname.libs.user import views
def build_tree():
root = views.Main()
root.management = views.Management()
return root
def attach(root):
root.user = build_tree()
That's just a way to structure the application, there is also a repository with cherrypy recipes which are pretty helpful.