OK, I am trying to do something a little bit esoteric with my Flask application.
I want to have some conditional logic in the model structure that is based on information in a configuration file.
At present when I call my Flask App, a configuration object is specified like this:
app = create_app('swarm.settings.DevConfig')
The db object is created in models.py:
from flask_sqlalchemy import SQLAlchemy
db = SQLAlchemy()
class MyClass(db.Model):
    ...
I would like models.py to accommodate a variety of ORMs and OGMs (not limited to SQLAlchemy and py2neo) so that I can develop a Flask app that is SQL/graph agnostic.
if __SOME_CONFIG__['db_mapper'] == 'SQLAlchemy':
    from flask_sqlalchemy import SQLAlchemy
    db = SQLAlchemy()

    class MapperModel(db.Model):
        ...

elif __SOME_CONFIG__['db_mapper'] == 'Py2Neo':
    from py2neo import Graph, Node, Relationship
    db = Graph()

    class MapperModel(Node):
        ...

class MyClass(MapperModel):
    ...
I can't see a way to use current_app to achieve this because I am creating my db object before the code is aware of an app object.
Is there a simple way to load the current configuration object from inside models.py? Should I just load the configuration in models.py from a separate file, without reference to the app's current configuration object?
Create a function which returns a db object, and initialize that object when you instantiate the Flask application:
from flask_sqlalchemy import SQLAlchemy
from py2neo import Graph

def create_dbobject(someconfig):
    if someconfig == 'Py2Neo':
        return Graph()  # py2neo exposes a Graph connection object
    # default to SQLAlchemy
    return SQLAlchemy()

app = create_app(...)
db = create_dbobject('someconfig')
This way you no longer have to worry about extension initialization inside models.py, and it's good practice to keep extension initialization in the place where the app is created.
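A minimal sketch of wiring this into the factory, assuming a hypothetical DB_MAPPER attribute on the settings class (that attribute and the app.extensions key below are not part of the question's code):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from py2neo import Graph

def create_app(config_object):
    app = Flask(__name__)
    app.config.from_object(config_object)
    # DB_MAPPER is an assumed config attribute, not one the question defines
    if app.config.get('DB_MAPPER') == 'Py2Neo':
        db = Graph()
    else:
        db = SQLAlchemy()
        db.init_app(app)
    # stash the mapper so blueprints/models can look it up later
    app.extensions['mapper_db'] = db
    return app

app = create_app('swarm.settings.DevConfig')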
I'd like to use Flask's application factory mechanism for my application. The issue I have is that the databases I use within some blueprints are located in different places, so I'm using binds to point to them. The tables themselves are in production and already in use, so I need to reflect them in order to use them within my application.
The problem is that I can't get the reflect function working because of the application context. I always get the message that I'm working outside the application context. I fully understand that and can see that db really is outside it, but I have no idea left on how to bring it in.
I tried different variations of passing app via current_app to my models.py, but nothing worked.
config.py:
class Config(object):
    # Secret key
    SECRET_KEY = 'my_very_secret_key'
    ITEMS_PER_PAGE = 25
    SQLALCHEMY_BINDS = {
        'mysql_bind': 'mysql+mysqlconnector://localhost:3306/tmpdb'
    }
    SQLALCHEMY_TRACK_MODIFICATIONS = False
main.py:
from webapp import create_app

app = create_app('config.Config')

if __name__ == '__main__':
    app.run(debug=True)
webapp/__init__.py:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def create_app(config_object):
    app = Flask(__name__)
    app.config.from_object(config_object)
    db.init_app(app)

    from .main import create_module as main_create_module
    main_create_module(app)

    return app
webapp/main/__init__.py:
def create_module(app):
    from .controller import blueprint
    app.register_blueprint(blueprint)
webapp/main/controller.py:
from flask import Blueprint, render_template, current_app as app
from .models import db, MyTable  # <-- Problem might be here ...

blueprint = Blueprint('main', __name__)

@blueprint.route('/')
def index():
    resp = db.session.query(MyTable.name, db.func.count(MyTable.versions))\
        .filter(MyTable.versions != '')\
        .group_by(MyTable.name).all()
    if resp:
        return render_template('index.html', resp=resp)
    else:
        return 'Nothing happened'
webapp/main/models.py:
from .. import db  # <-- and here ...

db.reflect(bind='mysql_bind')

class MyTable(db.Model):
    __bind_key__ = 'mysql_bind'
    __table__ = db.metadata.tables['my_table']
Expected result would be to get the reflection working in different blueprints.
Got it working, full solution here:
https://github.com/researcher2/stackoverflow_56885380
I used sqlite3 for the test; run the create_db.py script to set up the db. Run Flask with debug.sh, since with recent versions you can't seem to just call app.run() inside __main__ anymore.
Explanation
As I understand it, a blueprint is just a way to group together several views if you need to use them multiple times in a single app or across multiple apps. You can add a different route prefix as you desire.
A db object is not associated with a blueprint; it is associated with an app, which provides the configuration information. Once inside the blueprint views you will have access to the db object, with the relevant app context automatically available.
Regarding db.reflect, you need to make the call inside create_app and pass it the app object (preferred), or import the app inside the model, which is spaghetti.
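A minimal sketch of that, reusing the module names from the question (treat this as an outline under those assumptions rather than the exact code from the linked repo):
# webapp/__init__.py (sketch)
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def create_app(config_object):
    app = Flask(__name__)
    app.config.from_object(config_object)
    db.init_app(app)

    # reflect the bound database here and pass the app explicitly, so the
    # tables are present in db.metadata before the models are imported
    db.reflect(bind='mysql_bind', app=app)

    from .main import create_module as main_create_module
    main_create_module(app)
    return app
With that in place, models.py can drop its module-level db.reflect() call and just reference db.metadata.tables['my_table'].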
Multiple DBs can be accessed using binding as you've shown.
So your blueprints will have access to all imported tables, and Flask-SQLAlchemy knows which db connection to use based on the binding.
I'm normally a fan of explicitly defining tables so you have access to the ORM objects and fields in code completion. Do you have lots of tables/fields, or maybe you are creating something to query table metadata for total automation on any schema? Like a schema viewer or something like that.
This might be useful for others coming to this post:
https://flask-sqlalchemy.palletsprojects.com/en/2.x/contexts/
Brilliant! Thank you very much. Got it also working. Your tip gave me a hint to find another way:
@blueprint.route('/')
def index():
    # pushing app_context() to import MyTable
    # now I can use db.reflect() also in models.py
    with app.app_context():
        from .models import MyTable
    results = db.session.query(MyTable).all()
    print(results)
    for row in results:
        print(row)
        print(row.versions)
        print(row.name)
    if results:
        return render_template('my_table.html', results=results)
    else:
        return 'Nothing happened'
Then the reflection can be done inside models.py. The link you posted is really helpful; I don't know why I did not stumble over it myself ...
Anyway, I do now have a lot more possibilities than before!
Cheers, mate!
Our application uses Flask + gunicorn. Now we want to make it able to reload the db configuration while it is running, which means it can switch to a new db without restarting the process. With the help of a config center we can dispatch config at runtime, but how can I re-init the global variable db?
db = SQLAlchemy()

def create_app():
    app = Flask(__name__)
    app.config.from_object(dynamic_config)
    db.init_app(app)
And assume that at some point we dispatch a new config: how can db be re-initialized with the new config? Or is it safe to just replace it with a new SQLAlchemy() instance, like this:
from flask_sqlalchemy import SQLAlchemy
from models import set_db  # which will set the global db to the new instance
from app import app

def callback(old, new):
    new_db = SQLAlchemy()
    # re-construct config with old and new
    # now app.config is updated
    new_db.init_app(app)
    set_db(new_db)
Is it ok to do this? My concern is that it could cause problems with thread safety and might break in-flight transactions.
Help me with this, many thanks
If I were you I would use the SQLALCHEMY_BINDS config, or different instances of db with different configs, instead of changing the configuration every time.
Also, I don't think it is good practice to change the db structure from the application (in case that is what you are doing).
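As a rough illustration of the binds approach (a sketch only; the bind names and connection strings below are placeholders):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

class Config(object):
    SQLALCHEMY_DATABASE_URI = 'sqlite:///default.db'
    SQLALCHEMY_BINDS = {
        'primary': 'mysql+mysqlconnector://localhost:3306/primary_db',
        'reporting': 'mysql+mysqlconnector://localhost:3306/reporting_db',
    }
    SQLALCHEMY_TRACK_MODIFICATIONS = False

db = SQLAlchemy()

class ReportRow(db.Model):
    # __bind_key__ routes this model to the 'reporting' engine
    __bind_key__ = 'reporting'
    id = db.Column(db.Integer, primary_key=True)

app = Flask(__name__)
app.config.from_object(Config)
db.init_app(app)
Each model picks its database through __bind_key__, so nothing has to be swapped at runtime.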
I have a working sample Flask-Admin application on GitHub which queries a view (itself based on MySQL's information_schema.TABLES) in order to dynamically update a ModelView's column_labels and column_descriptions properties. There's also a small template modification to add tooltips to the header row containing the column comments from the database.
I put a lot of effort into figuring this out because it seemed foolish to re-type column descriptions in my Python code when they had already been entered into the database using the COMMENT SQL keyword during table creation.
In the simple test app, everything works as expected; I receive the model and session parameters given to the ModelView's __init__ method, and I use the session to query the other model for the column labels / comments, update self.column_labels and self.column_descriptions on the view, then call super().
However, in a more complicated app, I get the dreaded RuntimeError: No application found. Either work inside a view function or push an application context when I try to query the other model within a ModelView's __init__ method. The only marked difference I would note between the demo app and my "real" app is that I import a SQLAlchemy object instantiated in another .py file and then invoke its init_app() in my app.py to wire it up to the Flask instance.
Edit: which is exactly what the problem is; SQLAlchemy instances created in the usual way, as demonstrated in the Flask-SQLAlchemy Quickstart, get a proper Flask application instance in their __init__(); when I use db.init_app(app), I get the No application found error instead.
My question is: at which point does a Flask-Admin ModelView start existing inside a Flask application context? Why are circumstances different for import db from somewhere.py; db.init_app(app) vs. db = SQLAlchemy(app)? Is there any way that I can trace the start-up process of a Flask application and hook into that exact moment so I can see what's going on here?
Here are the two basic parts that are involved (complete source for each is on GitHub, as noted above):
The model that provides the column "metadata," including comments
# models.py
# [SQLAlchemy imports and declaration of 'Base']

class ColumnProperty(Base):
    # this view is based on MySQL's 'information_schema.TABLES'
    __tablename__ = 'v_column_properties'
    parent_table = Column(String(64), primary_key=True)
    name = Column(String(64), primary_key=True)
    # [some fields elided]
    comment = Column(String(1024))  # type: str
and
The ModelView which queries the above model upon instantiation
# views.py
from flask_admin.contrib.sqla import ModelView

class TestView(ModelView):
    def __init__(self, model, session, **kwargs):
        from models import ColumnProperty as cp
        descriptions = {}
        q = session.query(cp).filter(cp.parent_table == model.__tablename__)
        for row in q.all():
            descriptions[row.name] = row.comment
        self.column_descriptions = descriptions
        super(TestView, self).__init__(model, session, **kwargs)
I would override the render method of ModelView and set your column_descriptions there. Obviously you could set up some kind of caching mechanism to reduce the number of database queries (a small sketch of that follows the example below).
Something like this (completely untested) - note the use of self.session and self.model in the query:
# views.py
from flask_admin.contrib.sqla import ModelView

class TestView(ModelView):
    def render(self, template, **kwargs):
        from models import ColumnProperty as cp
        descriptions = {}
        q = self.session.query(cp).filter(cp.parent_table == self.model.__tablename__)
        for row in q.all():
            descriptions[row.name] = row.comment
        self.column_descriptions = descriptions
        return super(TestView, self).render(template, **kwargs)
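As for the caching idea, a very simple variant (again a sketch, untested) is to remember the result on the view instance so the query only runs on the first render:
from flask_admin.contrib.sqla import ModelView

class CachedTestView(ModelView):
    _cached_descriptions = None

    def render(self, template, **kwargs):
        from models import ColumnProperty as cp
        if self._cached_descriptions is None:
            # query the column metadata once and keep it for later renders
            q = self.session.query(cp).filter(cp.parent_table == self.model.__tablename__)
            self._cached_descriptions = {row.name: row.comment for row in q.all()}
        self.column_descriptions = self._cached_descriptions
        return super(CachedTestView, self).render(template, **kwargs)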
I realize I'm a bit late to the show but since I had a similar problem I'll share my findings.
It seems that in order to get an application context for the ModelView you have to initialize it inside a with app.app_context(): block, like this in the case of your app.py:
...
# app.py
with app.app_context():
    admin = Admin(app, name="{}-admin".format(cfg['database']),
                  index_view=AdminIndexView(name='Admin Home', url='/'),
                  template_mode='bootstrap3')
    admin.add_view(TestView(Test, db.session))
    admin.add_view(ColumnPropertiesView(ColumnProperty, db.session))
...
If you prefer using an application factory pattern, you could do something along these lines:
# app.py
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_admin import Admin, AdminIndexView

db = SQLAlchemy()

def create_app():
    """The application factory"""
    app = Flask(__name__)
    app.config.from_object('app.config')
    db.init_app(app)
    with app.app_context():
        admin = Admin(app, name="{}-admin".format(cfg['database']),
                      index_view=AdminIndexView(name='Admin Home', url='/'),
                      template_mode='bootstrap3')
        from models import Test, ColumnProperty
        from views import TestView, ColumnPropertiesView
        admin.add_view(TestView(Test, db.session))
        admin.add_view(ColumnPropertiesView(ColumnProperty, db.session))
    return app
I'm trying to test my flask application using unittest. I want to refrain from flask-testing because I don't like to get ahead of myself.
I've really been struggling with this unittest thing. It is confusing because there's the request context and the app context, and I don't know which one I need to be in when I call db.create_all().
It seems like when I add to the database, it adds my models to the database specified in my app module (__init__.py) file, but not the database specified in the setUp(self) method.
I have some methods that must populate the database before every test_ method.
How can I point my db to the right path?
def setUp(self):
    #self.db_gd, app.config['DATABASE'] = tempfile.mkstemp()
    app.config['TESTING'] = True
    # app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + app.config['DATABASE']
    basedir = os.path.abspath(os.path.dirname(__file__))
    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + \
        os.path.join(basedir, 'test.db')
    db = SQLAlchemy(app)
    db.create_all()
    #self.app = app.test_client()
    #self.app.testing = True
    self.create_roles()
    self.create_users()
    self.create_buildings()
    #with app.app_context():
    #    db.create_all()
    #    self.create_roles()
    #    self.create_users()
    #    self.create_buildings()

def tearDown(self):
    #with app.app_context():
    #with app.request_context():
    db.session.remove()
    db.drop_all()
    #os.close(self.db_gd)
    #os.unlink(app.config['DATABASE'])
Here is one of the methods that populates my database:
def create_users(self):
    #raise ValueError(User.query.all())
    new_user = User('Some User Name', 'xxxxx#gmail.com', 'admin')
    new_user.role_id = 1
    new_user.status = 1
    new_user.password = generate_password_hash(new_user.password)
    db.session.add(new_user)
Places I've looked at:
http://kronosapiens.github.io/blog/2014/08/14/understanding-contexts-in-flask.html
http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xvi-debugging-testing-and-profiling
And the flask documentation:
http://flask.pocoo.org/docs/0.10/testing/
One issue that you're hitting is the limitations of Flask contexts; this is the primary reason I think long and hard before including a Flask extension in my project, and Flask-SQLAlchemy is one of the biggest offenders. I say this because in most cases it is completely unnecessary to depend on the Flask app context when dealing with your database. Sure, it can be nice, especially since Flask-SQLAlchemy does a lot behind the scenes for you (mainly you don't have to manually manage your session, metadata or engine), but keeping that in mind, those things can easily be done on your own, and by doing that you get the benefit of unrestricted access to your database, with no worry about the Flask context. Here is an example of how to set up your db manually; first I will show the Flask-SQLAlchemy way, then the manual plain SQLAlchemy way.
the flask-sqlalchemy way:
import flask
from flask_sqlalchemy import SQLAlchemy
app = flask.Flask(__name__)
db = SQLAlchemy(app)
# define your models using db.Model as base class
# and define columns using classes inside of db
# ie: db.Column(db.String(255),nullable=False)
# then create database
db.create_all() # <-- gives error if not currently running flask app
the standard sqlalchemy way:
import flask
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base
# first we need our database engine for the connection
engine = sa.create_engine(MY_DB_URL,echo=True)
# the line above is part of the benefit of using flask-sqlalchemy,
# it passes your database uri to this function using the config value
# SQLALCHEMY_DATABASE_URI, but that config value is one reason we are
# tied to the application context
# now we need our session to create querys with
Session = sa.orm.scoped_session(sa.orm.sessionmaker())
Session.configure(bind=engine)
session = Session()
# now we need a base class for our models to inherit from
Model = declarative_base()
# and we need to tie the engine to our base class
Model.metadata.bind = engine
# now define your models using Model as base class and
# anything that would have come from db, ie: db.Column
# will be in sa, ie: sa.Column
# then, when you're ready to create your db, just call
Model.metadata.create_all()
# no flask context management needed now
If you set your app up like that, any context issues you're having should go away.
As a separate answer, to actually just force what you have to work: you can use the test_request_context function, i.e. in setUp do self.ctx = app.test_request_context(), then activate it with self.ctx.push(), and when you're done get rid of it in tearDown with self.ctx.pop().
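Put together, a sketch of that setUp/tearDown (the import path for app and db below is hypothetical; use whatever your project actually exposes):
import unittest
from myapp import app, db  # hypothetical import path

class MyTestCase(unittest.TestCase):
    def setUp(self):
        # push a request context so db operations see the app
        self.ctx = app.test_request_context()
        self.ctx.push()
        db.create_all()

    def tearDown(self):
        db.session.remove()
        db.drop_all()
        # pop the context pushed in setUp
        self.ctx.pop()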
I'm trying to create a "modular application" in Flask using Blueprints.
When creating models, however, I'm running into the problem of having to reference the app in order to get the db-object provided by Flask-SQLAlchemy. I'd like to be able to use some blueprints with more than one app (similar to how Django apps can be used), so this is not a good solution.*
It's possible to do a switcharoo and have the Blueprint create the db instance, which the app then imports together with the rest of the blueprint. But then, any other blueprint wishing to create models needs to import from that blueprint instead of the app.
My questions are thus:
Is there a way to let Blueprints define models without any awareness of the app they're being used in later -- and have several Blueprints come together? By "awareness" I mean having to import the app module/package from your Blueprint.
Am I wrong from the outset? Are Blueprints not meant to be independent of the app and be redistributable (à la Django apps)?
If not, then what pattern should you use to create something like that? Flask extensions? Should you simply not do it -- and maybe centralize all models/schemas à la Ruby on Rails?
Edit: I've been thinking about this myself now, and this might be more related to SQLAlchemy than Flask because you have to have the declarative_base() when declaring models. And that's got to come from somewhere, anyway!
Perhaps the best solution is to have your project's schema defined in one place and spread it around, like Ruby on Rails does. Declarative SQLAlchemy class definitions are really more like schema.rb than Django's models.py. I imagine this would also make it easier to use migrations (from alembic or sqlalchemy-migrate).
I was asked to provide an example, so let's do something simple: Say I have a blueprint describing "flatpages" -- simple, "static" content stored in the database. It uses a table with just shortname (for URLs), a title and a body. This is simple_pages/__init__.py:
from flask import Blueprint, render_template
from .models import Page

flat_pages = Blueprint('flat_pages', __name__, template_folder='templates')

@flat_pages.route('/<page>')
def show(page):
    page_object = Page.query.filter_by(name=page).first()
    return render_template('pages/{}.html'.format(page), page=page_object)
Then, it would be nice to let this blueprint define its own model (this is simple_pages/models.py):
# TODO Somehow get ahold of a `db` instance without referencing the app
# I might get used in!
class Page(db.Model):
    name = db.Column(db.String(255), primary_key=True)
    title = db.Column(db.String(255))
    content = db.Column(db.String(255))

    def __init__(self, name, title, content):
        self.name = name
        self.title = title
        self.content = content
This question is related to:
Flask-SQLAlchemy import/context issue
What's your folder layout for a Flask app divided in modules?
And various others, but all replies seem to rely on importing the app's db instance, or doing the reverse. The "Large app how to" wiki page also uses the "import your app in your blueprint" pattern.
* Since the official documentation shows how to create routes, views, templates and assets in a Blueprint without caring about what app it's "in", I've assumed that Blueprints should, in general, be reusable across apps. However, this modularity doesn't seem that useful without also having independent models.
Since Blueprints can be hooked into an app more than once, it might simply be the wrong approach to have models in Blueprints?
I believe the truest answer is that modular blueprints shouldn't concern themselves directly with data access, but instead rely on the application providing a compatible implementation.
So, given your example blueprint:
from flask import current_app, Blueprint, render_template

flat_pages = Blueprint('flat_pages', __name__, template_folder='templates')

@flat_pages.record
def record(state):
    db = state.app.config.get("flat_pages.db")
    if db is None:
        raise Exception("This blueprint expects you to provide "
                        "database access through flat_pages.db")

@flat_pages.route('/<page>')
def show(page):
    db = current_app.config["flat_pages.db"]
    page_object = db.find_page_by_name(page)
    return render_template('pages/{}.html'.format(page), page=page_object)
From this, there is nothing preventing you from providing a default implementation.
def setup_default_flat_pages_db(db):
    class Page(db.Model):
        name = db.Column(db.String(255), primary_key=True)
        title = db.Column(db.String(255))
        content = db.Column(db.String(255))

        def __init__(self, name, title, content):
            self.name = name
            self.title = title
            self.content = content

    class FlatPagesDBO(object):
        def find_page_by_name(self, name):
            return Page.query.filter_by(name=name).first()

    return FlatPagesDBO()
And in your configuration.
app.config["flat_pages.db"] = setup_default_flat_pages_db(db)
The above could be made cleaner by not relying on direct inheritance from db.Model and instead just using a vanilla declarative_base from SQLAlchemy, but this should represent the gist of it.
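A rough sketch of that declarative_base variant, with the engine and session wiring left to the application (untested; the names here are illustrative):
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Page(Base):
    __tablename__ = 'pages'
    name = sa.Column(sa.String(255), primary_key=True)
    title = sa.Column(sa.String(255))
    content = sa.Column(sa.String(255))

class FlatPagesDBO(object):
    def __init__(self, session):
        self.session = session

    def find_page_by_name(self, name):
        return self.session.query(Page).filter_by(name=name).first()

# in the application setup (sketch):
# engine = sa.create_engine(app.config['SQLALCHEMY_DATABASE_URI'])
# Base.metadata.create_all(engine)
# session = sa.orm.scoped_session(sa.orm.sessionmaker(bind=engine))
# app.config["flat_pages.db"] = FlatPagesDBO(session)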
I have a similar need to make Blueprints completely modular, with no reference to the app. I came up with a possibly clean solution but I'm not sure how correct it is and what its limitations are.
The idea is to create a separate db object (db = SQLAlchemy()) inside the blueprint and call the init_app() and create_all() methods from where the root app is created.
Here's some sample code to show how the project is structured:
The app is called jobs and the blueprint is called status and it is stored inside the blueprints folder.
blueprints.status.models.py
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()  # <--- The db object belonging to the blueprint

class Status(db.Model):
    __tablename__ = 'status'
    id = db.Column(db.Integer, primary_key=True)
    job_id = db.Column(db.Integer)
    status = db.Column(db.String(120))
models.py
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()  # <--- The db object belonging to the root app

class Job(db.Model):
    __tablename__ = 'job'
    id = db.Column(db.Integer, primary_key=True)
    state = db.Column(db.String(120))
factory.py
from .blueprints.status.models import db as status_db  # blueprint db
from .blueprints.status.routes import status_handler   # blueprint handler
from .models import db as root_db                      # root db
from flask import Flask

def create_app():
    app = Flask(__name__)

    # Create database resources.
    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////path/to/app.db'
    root_db.init_app(app)
    status_db.init_app(app)  # <--- Init blueprint db object.
    with app.app_context():
        root_db.create_all()
        status_db.create_all()  # <--- Create blueprint db.

    # Register blueprint routes.
    app.register_blueprint(status_handler, url_prefix="/status")

    return app
I tested it with gunicorn with gevent worker and it works. I asked a separate question about the robustness of the solution here:
Create one SQLAlchemy instance per blueprint and call create_all multiple times
You asked "Are Blueprints not meant to be independent of the app and be redistributable (à la Django apps)? "
The answer is yes. Blueprints are not similar to Django App.
If you want to use different app/configurations, then you need to use "Application Dispatching" and not blueprints. Read this
[1]: http://flask.pocoo.org/docs/patterns/appdispatch/#app-dispatch [1]
Also, the link here [1] http://flask.pocoo.org/docs/blueprints/#the-concept-of-blueprints [1]
It clearly says and I quote "A blueprint in Flask is not a pluggable app because it is not actually an application – it’s a set of operations which can be registered on an application, even multiple times. Why not have multiple application objects? You can do that (see Application Dispatching), but your applications will have separate configs and will be managed at the WSGI layer."
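For completeness, application dispatching amounts to combining separately configured apps at the WSGI layer, roughly like this (a sketch; the module names are placeholders, and DispatcherMiddleware lives in werkzeug.middleware.dispatcher in recent Werkzeug versions):
from werkzeug.middleware.dispatcher import DispatcherMiddleware
# hypothetical factories, each with its own config and extensions
from frontend_app import create_app as create_frontend
from backend_app import create_app as create_backend

# mount two independently configured Flask apps under one WSGI entry point
application = DispatcherMiddleware(create_frontend(), {
    '/backend': create_backend(),
})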