With flask-sqlalchemy, does anyone know why the second approach of construction in http://pythonhosted.org/Flask-SQLAlchemy/api.html doesn't suggest db.app = app as well? It seems the major difference between the first and second construction methods is simply that the first does db.app = app whilst the second does db.app = None
Thanks!
The two methods of initialization are pretty standard for Flask extensions and follow an implicit convention on how extensions are to be initialized. In this section of the Flask documentation you can find a note that explains it:
As you noticed, init_app does not assign app to self. This is intentional! Class based Flask extensions must only store the application on the object when the application was passed to the constructor. This tells the extension: I am not interested in using multiple applications.
When the extension needs to find the current application and it does not have a reference to it, it must either use the current_app context local or change the API in a way that you can pass the application explicitly.
The idea can be summarized as follows:
If you use the SQLAlchemy(app) constructor then the extension will assume that app is the only application, so it will store a reference to it in self.app.
If you use the init_app(app) constructor then the extension will assume that app is one of possibly many applications. So instead of saving a reference it will rely on current_app to locate the application every time it needs it.
The practical difference between the two ways to initialize extensions is that the first format requires the application to exist, because it must be passed in the constructor. The second format allows the db object to be created before the application exists because you pass nothing to the constructor. In this case you postpone the call to db.init_app(app) until you have an application instance. The typical situation in which the creation of the application instance is delayed is if you use the application factory pattern.
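The convention can be sketched with a minimal stand-in extension. This is not the real `SQLAlchemy` class, just an illustration of the initialization pattern, with a `FakeApp` stand-in so the sketch runs without Flask:

```python
class MyExtension:
    """Minimal sketch of the standard Flask extension init convention."""

    def __init__(self, app=None):
        self.app = app  # only set when the app is passed to the constructor
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        # Deliberately does NOT set self.app: with init_app the extension
        # supports multiple applications, and would look the current one
        # up via flask.current_app at request time instead.
        if not hasattr(app, "extensions"):
            app.extensions = {}
        app.extensions["myext"] = self


class FakeApp:
    """Stand-in for flask.Flask, so this sketch runs without Flask."""


# First pattern: a single application, bound at construction time.
app = FakeApp()
ext = MyExtension(app)
assert ext.app is app

# Second pattern (application factory): create first, bind later.
ext2 = MyExtension()
assert ext2.app is None  # mirrors db.app = None in Flask-SQLAlchemy
ext2.init_app(FakeApp())
```

The second pattern is exactly what lets you create `db = SQLAlchemy()` at module import time and call `db.init_app(app)` later inside a factory function.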
Related
I want to create a Flask extension which depends on another Flask extension. For the sake of argument, say that it's Flask-Foo, and that it needs Flask-Redis to store some specific data in a Redis database.
I know that I can add an install dependency to Flask-Redis. However I don't understand how I should instantiate and initialize Flask-Redis.
The setup for Flask-Foo sets up the Flask-Redis object. The drawback of this is that it assumes that the app isn't also using Flask-Redis for some other reason, configured explicitly outside of Flask-Foo. If it is, we get two objects which exist side-by-side, which seems wrong.
The user has to themselves instantiate and configure Flask-Redis. Flask-Foo checks that it has been initialized for that app, and complains otherwise. The problem with this is that it seems to impose boilerplate on the user - why should they have to set up Flask-Redis to use Flask-Foo, when they have no other knowledge or interest in the configuration of Flask-Redis? Furthermore, aren't we asking for trouble if this means that Flask-Foo.init_app() always has to be called after Flask-Redis.init_app()?
Don't use Flask-Redis. Use the Redis package directly, and manage the connection in Flask-Foo code. This would probably avoid the above problems. But it seems unelegant - we will basically have to resolve problems solved by Flask-Redis. If Flask-Foo goes on to support an alternative database, it will become complicated as we have to maintain code to manage the different types of connection.
Just to be clear, this is not a question specifically about Flask-Redis or how it works! I just want to understand what is generally the right way to build an extension on top of an extension.
You can pass the dependent extension to init_app. See http://flask.pocoo.org/docs/1.0/extensiondev/
flask_foo/__init__.py
class FooManager:
    def __init__(self, app=None, db=None, **kwargs):
        self.app = app
        if app is not None:
            self.init_app(app, db, **kwargs)

    def init_app(self, app, db, **kwargs):
        self.db = db
        app.config.setdefault('xxx', xxx)
        # Bind Flask-Foo to app
        app.foo_manager = self
Now, you can get foo_manager object from current_app like this:
models.py
from flask import current_app

db = current_app.foo_manager.db

class XXX(db.Model):
    pass
Finally, you may need to register Foo inside an app_context():
run.py
with app.app_context():
    FooManager(app, db)  # or xx = FooManager(); xx.init_app(app, db)
Wonderful, the dependent-extension approach works well for us.
Other tip: https://stackoverflow.com/a/51739367/5204664
A Flask extension has the same structure as a Python package. You should specify all of its requirements in the setup.py file.
For example, flask-babel:
install_requires=[
    'Flask',
    'Babel>=2.3',
    'Jinja2>=2.5'
],
Is there any way I can import my Flask-SQLAlchemy models into a Jupyter notebook? I would like to be able to explore my models and data in the notebook.
I haven't tried this but I believe it can be done, with a little bit of work.
tl;dr
Import the app, db, and the models you want to use. Push the app context before doing a query. If you understood all this, you're done.
In more detail
In the code which sets up your Flask app, you have a Flask-SQLAlchemy object, which is usually defined something like this:
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()
And somewhere else you have your models:
from db_setup import db

class MyThing(db.Model):
    thing_id = db.Column(db.Integer(), primary_key=True)
And further somewhere you have the app:
from flask import Flask
from db_setup import db
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = '...'
db.init_app(app)
Now, in your IPython notebook, you have to be able to import the last two pieces above:
from app_setup import app
from models import MyThing
To run a query, you have to be in the app context (see http://flask.pocoo.org/docs/1.0/api/#flask.Flask.app_context):
with app.app_context():
    things = MyThing.query.filter(MyThing.thing_id < 100).all()
You should be able to run any query there. If I remember correctly, even outside of the with block the objects in things will still be valid, and you can retrieve their properties, etc.
If you want to explicitly commit, you can import db from where it's defined, and do
db.session.commit()
Just like queries made through the model class, db.session only works inside an app context.
Technicalities
Don't worry about this section unless you got the above working but you want to tweak how you did it.
First of all, you might not want to use an app created in exactly the same way that you create it in your Flask code. For example, you might want to use a different config. Instead of importing the module where app is defined, you could just create a new Flask app in your notebook. You still have to import db (to do db.init_app(app)) and MyThing. But these modules probably don't have any configuration code in, since the configuration is all done at the level of the app.
Secondly, instead of use with, you could also explicitly do
my_context = app.app_context()
my_context.push()
then your SQLAlchemy code, and then later
my_context.pop()
This has two advantages. You can just push the context once, before using it in multiple notebook cells. The with block only works inside one cell.
Furthermore, storing the context in a variable after creating it means that you can re-use the same context. For the purposes of SQLAlchemy, the context acts a bit like a transaction. If you make a change to an object in one context, it won't apply in another context unless you committed it to the database. If you store a model object in a Python variable, you won't be able to do anything with it inside a different context.
You could also store the context in a variable, then use it in multiple with blocks.
my_context = app.app_context()

with my_context:
    thing1 = MyThing.query.order_by(MyThing.thing_id).first()

# (Maybe in another cell)
with my_context:
    print(thing1.thing_id)
A last consideration is that it might make sense to define your models using vanilla SQLAlchemy instead of Flask-SQLAlchemy. This would mean that you wouldn't need all the context handling above, just a database connection to create a session. This would make it much easier to import the models in non-Flask code, but the tradeoff is that using them in Flask becomes a bit harder.
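As a sketch of that last option, here is what the same model looks like in plain SQLAlchemy, assuming SQLAlchemy 1.4+ (for `sqlalchemy.orm.declarative_base`); the table name and the in-memory SQLite URL are placeholders:

```python
from sqlalchemy import create_engine, Column, Integer
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class MyThing(Base):
    __tablename__ = "my_thing"
    thing_id = Column(Integer, primary_key=True)

# Plain SQLAlchemy needs only an engine and a session; no app context.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
session.add(MyThing(thing_id=1))
session.commit()
things = session.query(MyThing).filter(MyThing.thing_id < 100).all()
```

In a notebook you would point `create_engine` at your real database URL; queries then work in any cell with no `app_context()` involved.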
The setup of the problem is simple enough:
a user selects a language preference (this preference can be read from the user's session);
based on this choice, load the appropriate .mo from the available translations;
(no separate domains are set up, if it makes any difference)
Problem: since this has to be done outside the scope of the Flask app, the app cannot be instantiated in order to use @babel.localeselector. Instead, I use a simple function based on the webapp2 i18n extension which, using Babel's support function, loads a given translation and returns a translation instance (Translations: "PROJECT VERSION"). (inb4 'why not use webapp2 already?': too many libs already.)
From this point on, it is not clear to me what to do with this instance. How can I get Babel to use this specific instance? (at the moment, it always uses the default one, no 'best_match' involved).
Solved by just using the Flask app in the way I wanted to avoid: on every request there is a callback to the app instance via the localeselector decorator, and the language is set beforehand in an attribute on flask.g. Basically, by the book I guess.
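A minimal sketch of that final approach, using only Flask itself. The supported-language set and the session key are assumptions; with Flask-Babel you would register `get_locale` via the `@babel.localeselector` decorator:

```python
from flask import Flask, g, session

app = Flask(__name__)
app.secret_key = "dev"  # placeholder; sessions require a secret key

SUPPORTED_LANGUAGES = {"en", "pt", "es"}  # assumed set of translations

@app.before_request
def load_language():
    # Read the user's stored preference on every request, as described.
    g.lang = session.get("lang", "en")

def get_locale():
    # The kind of callback you would register with @babel.localeselector.
    lang = getattr(g, "lang", None)
    return lang if lang in SUPPORTED_LANGUAGES else "en"
```

The callback runs per request, so the "instantiate outside the app" problem disappears: the app exists, and the selector just reads what `load_language` put on `flask.g`.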
According to the Flask documentation for the Flask.make_response method, the only types allowed to be returned from a view are instances of app.response_class, str, unicode, a wsgi function, or a tuple with a response object already following one of the above.
I'd like to be able to return my own models and queries from a view and having generic code building an adequate response, serializing the object to an accepted format, building collections from queries, applying filtering conditions, etc. Where's the best place to hook that up?
I considered the following possibilities, but can't figure the one correct way to do it.
Subclassing Response and setting app.response_class
Subclassing Flask redefining Flask.make_response
Wrapping app.route into another decorator
Flask.after_request
?
edit1: I already have many APIs with the behavior I need implemented in the views, but I'd like to avoid repeating that everywhere.
edit2: I'm actually building a Flask-extension with many default practices used in my applications. While a plain decorator would certainly work, I really need a little more magic than that.
Why don't you just create a function make_response_from_custom_object and end your views with
return make_response_from_custom_object(custom_object)
If it is going to be common, I would put it into a @response_from_custom_object decorator, but hooking into Flask seems overkill. You can chain decorators, so wrapping app.route does not make sense either; all you need is
@app.route(...)
@response_from_custom_object
def view(...):
    ...
If you can do it the simple and explicit way, there is no sense to make your code do magic and thus be less comprehensible.
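A sketch of that decorator approach; the `Widget` class and its `to_dict()` convention are invented for illustration:

```python
from functools import wraps
from flask import Flask, jsonify

app = Flask(__name__)

def response_from_custom_object(view):
    """Serialize objects exposing a to_dict() method; pass others through."""
    @wraps(view)
    def wrapper(*args, **kwargs):
        result = view(*args, **kwargs)
        if hasattr(result, "to_dict"):
            return jsonify(result.to_dict())
        return result
    return wrapper

class Widget:
    def __init__(self, name):
        self.name = name

    def to_dict(self):
        return {"name": self.name}

@app.route("/widget")
@response_from_custom_object
def get_widget():
    return Widget("sprocket")
```

The view stays explicit about its serialization, and any view that doesn't need it simply omits the decorator.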
The farther up the chain you go (make_response, dispatch_request + handle_user_error, full_dispatch_request, rewrite Flask from scratch) the more functionality you will have to re-create.
The easiest thing to do in this case is to override response_class and do the serialization there - that leaves you with all the magic that Flask does in make_response, full_dispatch_request etc., but still gives you control over how to respond to exceptions and serialize responses. It also leaves all of Flask's hooks in place, so consumers of your extension can override behavior where they need to (and they can re-use their existing knowledge of Flask's request lifecycle).
Python is dynamic by nature, so while this probably isn't best practice, you can reassign the make_response method on the application to whatever you want.
To avoid having to recreate the default functionality, you can save a reference to the original function and use that to implement your new function.
I used this recently in a project to add the ability to return instances of a custom Serializable class directly from flask views.
app = Flask("StarCorp")

__original_make_response = app.make_response

def convert_custom_object(obj):
    # Check if the returned object is "Serializable"
    if not isinstance(obj, Serializable):
        # Nope, do whatever flask normally does
        return __original_make_response(obj)
    # It is, get a `dict` from `obj` using the `json` method
    data = obj.json
    data.pop(TYPE_META)  # Don't share the type meta info of an object with users
    # Let flask turn the `dict` into a `json` response
    return __original_make_response(data)

app.make_response = convert_custom_object
Since flask extensions typically provide an init_app(app) method I'm sure you could build an extension that monkey patches the passed in application object in a similar manner.
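Such an extension could look like the following sketch. The `Serializable` base class and its `json` property are assumptions mirroring the answer above, and `jsonify` is used so the dict works across Flask versions:

```python
from flask import Flask, jsonify

class Serializable:
    """Marker base class; subclasses expose a `json` property."""
    @property
    def json(self):
        raise NotImplementedError

class SerializerExtension:
    def __init__(self, app=None):
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        original_make_response = app.make_response

        def make_response(rv):
            if isinstance(rv, Serializable):
                # Serialize our custom objects; defer everything else
                # to Flask's original make_response.
                return original_make_response(jsonify(rv.json))
            return original_make_response(rv)

        app.make_response = make_response

# Usage sketch:
class User(Serializable):
    @property
    def json(self):
        return {"id": 1}

app = Flask("demo")
SerializerExtension(app)

@app.route("/user")
def user():
    return User()
```

Because the patch is applied per application instance inside init_app, it composes with the factory pattern and leaves other apps untouched.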
I am trying to do some pre-processing at Django startup (I put a startup script that runs once in urls.py) and then use the created instance of an object in my views. How would I go about doing that?
Try to use the singleton design pattern.
You can use a Context Processor to add it to your template context.
If you want it in the View, rather than the Template, then you can either have a base View class that has this, or just import the reference into the module your view is in (and access it directly).
Be aware that each Django worker process may have its own copy of the object in memory, so this should really only be used for read-only access. If you make changes to it, you are likely to find yourself in a world of hurt.
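The singleton suggestion can be sketched like this; in Django, a module-level instance achieves the same effect, since each process imports a module only once:

```python
class StartupResource:
    """Classic singleton: expensive setup runs once per process."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            # Do the expensive pre-processing exactly once here,
            # e.g. load a large file or model into memory.
            cls._instance.data = {"ready": True}
        return cls._instance

# Every caller (e.g. every Django view) gets the same object back.
a = StartupResource()
b = StartupResource()
```

A view would just construct (or import) `StartupResource()` and read from it; per the caveat above, treat the shared instance as read-only.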