Hooking into SQLAlchemy models - Python

I'm looking to hook into the model creation cycle for SQLAlchemy models, for example on create or on save (like in the Ruby ORM ActiveRecord; in fact I'm moving a model from ActiveRecord to SQLAlchemy).
Events look like what I need: http://docs.sqlalchemy.org/en/rel_0_7/core/event.html, but I haven't found more detailed examples yet and I'd like to hear about someone's experience with this.
Are there similar facilities in SQLAlchemy for doing things with a model/instance based on certain cues, e.g. after_create?

Events are pretty simple once you get the hang of it.
Here is a quick example using events:

import uuid

from sqlalchemy import Column, String
from sqlalchemy.event import listen

from mypackage.models import Base


def generate_license(mapper, connect, target):
    target.generate_license()


class User(Base):
    __tablename__ = "users"

    id = Column(String(36), primary_key=True)  # primary key needed for the mapping to work
    license = Column(String(256))

    def generate_license(self):
        if not self.license:
            self.license = str(uuid.uuid4())
        return self.license


listen(User, 'before_insert', generate_license)
Alternately, you can use decorators:

from sqlalchemy.event import listens_for
…

class User(Base):
    …

@listens_for(User, 'before_insert')
def generate_license(mapper, connect, self):
    …

However, registering the listener inside the class body (as a static method) does not work:

from sqlalchemy.event import listens_for
…

class User(Base):
    …

    @listens_for(User, 'before_insert')
    @staticmethod
    def generate_license(mapper, connect, self):
        …

This will raise

NameError: name 'User' is not defined

because the name User is not yet bound while its own class body is being executed; the listener has to be registered after the class definition, as in the examples above.
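As for the after_create-style cue asked about in the question: the closest mapper-level analogue is the 'after_insert' event, which fires right after the row has been inserted and is registered the same way as 'before_insert' above. A minimal sketch reusing the User model from the first example (the hook name here is just for illustration):

def report_created(mapper, connection, target):
    # runs after the INSERT for this instance has been emitted
    print("created user with license", target.license)

listen(User, 'after_insert', report_created)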

How to fix my approach to use the same models in vanilla SQLAlchemy and Flask-SQLAlchemy?

I came across several approaches for using vanilla SQLAlchemy models in Flask-SQLAlchemy. Using models that inherit from Base works like a charm with Flask-SQLAlchemy, but I really like that convenience stuff:

Job.query.all()              # does not work
db.session.query(Job).all()  # works

So I started to work on this and put together some code, but I am stuck and need some help to get this nice and clean.
The following block is a general definition that does not inherit from either. It is imported and supposed to be used from both Flask-SQLAlchemy and vanilla SQLAlchemy at some point.

class VanillaMachine():
    __tablename__ = 'machine'

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    status = Column(Integer)
And there is a factory that takes either db.Model or Base and returns Machine with the correct parent:

class MachineFactory:
    def __init__(self, *args, **kwargs):
        pass

    def __new__(cls, *args, **kwargs):
        return type('Machine',
                    (object, VanillaMachine, args[0]),
                    VanillaMachine.__dict__.copy())
I am quite sure that there's something off with that code, but I am not sure where. If I use it like

db = SQLAlchemy()

from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()

Machine1 = MachineFactory(db.Model)
Machine2 = MachineFactory(Base)

there is an error message:

sqlalchemy.exc.ArgumentError: Column object 'id' already assigned to Table 'machine'

Can you help me get this straight in a nice, reliable way?
I know that you could just use a function, pass the parent as an argument into VanillaMachine and use some if statement, but that would be too straightforward, right? :)
Edit:
Other approaches I came across are:

1. Using the Flask application context in order to work with the Flask-SQLAlchemy models:

   with app.app_context():
       pass

   or

   app.app_context().push()

   But this is too focused on Flask for me and does not allow me to cleanly separate the models, make them independent and adapt them to the context.

2. Supplying an alternative Base class via db = SQLAlchemy(app, model_class=Base), see here. This might work for me, but I did not evaluate it so far (a rough sketch of this option follows below).
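A minimal sketch of that second option, assuming a Flask app object and that your Flask-SQLAlchemy version accepts a declarative base as model_class (the Job model here is only for illustration):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/test_app.db'

# db.Model is built on top of the supplied base, so vanilla models and
# Flask-SQLAlchemy models end up sharing the same metadata
db = SQLAlchemy(app, model_class=Base)

class Job(db.Model):
    id = db.Column(db.Integer, primary_key=True)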
I found a good solution inspired by a factory pattern and Declarative Mixins as mentioned in the SQLAlchemy docs. For complex multi-level inheritance scenarios a different approach is needed, using @declared_attr.cascading.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy import Column, Integer, String
from sqlalchemy import MetaData
from sqlalchemy.ext.declarative import declarative_base
from flask_sqlalchemy import SQLAlchemy

SQLALCHEMY_DATABASE_URI = 'sqlite:///' + '/tmp/test_app.db'
engine = create_engine(SQLALCHEMY_DATABASE_URI, echo=True)

# for vanilla
Base = declarative_base()

# for Flask (import from app once initialized)
db = SQLAlchemy()


class MachineMixin:
    __tablename__ = 'machine'

    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    status = Column(Integer)


class ModelFactory:
    @staticmethod
    def create(which_model, which_parent):
        if which_parent == 'flask_sqlalchemy':
            parent = db.Model
        elif which_parent == 'pure_sqlalchemy':
            parent = Base
        # now use type() to inherit, fill __dict__ and assign a name
        obj = type(which_model.__name__ + '_' + which_parent,
                   (which_model, parent),
                   {})
        return obj


test_scenario = 'pure_sqlalchemy'  # or 'flask_sqlalchemy'

Machine = ModelFactory.create(MachineMixin, test_scenario)

if test_scenario == 'flask_sqlalchemy':
    db.metadata.drop_all(bind=engine)
    db.metadata.create_all(bind=engine)
elif test_scenario == 'pure_sqlalchemy':
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)

Session = sessionmaker(bind=engine)
session = Session()
session.add(Machine(name='Bob', status=1))
session.commit()
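In the 'flask_sqlalchemy' scenario the generated class is an ordinary Flask-SQLAlchemy model, so the Machine.query convenience from the question becomes available once an app is wired up. A rough sketch (the Flask app and its config are placeholders, not part of the original code):

from flask import Flask

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = SQLALCHEMY_DATABASE_URI
db.init_app(app)

Machine = ModelFactory.create(MachineMixin, 'flask_sqlalchemy')

with app.app_context():
    db.create_all()
    db.session.add(Machine(name='Bob', status=1))
    db.session.commit()
    print(Machine.query.all())  # the .query shortcut works here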

Access peewee's Meta subclass in Model's outer class method

I'm trying to wrap peewee models and classes into another interface and I want to dynamically assign a model to a database. I'm using the peewee.Proxy class for this, but I don't want to use a global variable to make the initialization of this proxy available. I wanted to add a class method for changing the Meta (inner) class of the base model, but I get the following error:

AttributeError: type object 'BaseModel' has no attribute 'Meta'

The code that I have:
import peewee as pw


class BaseModel(pw.Model):
    class Meta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls.Meta.database.initialize(database)

Of course I could access this variable by calling BaseModel.Meta.database, but that is less intuitive in my opinion.
Have you got any suggestions?
Peewee transforms the inner "Meta" class into an object accessible at "ModelClass._meta" after the class is constructed, so change ".Meta" to "._meta":

class BaseModel(pw.Model):
    class Meta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls._meta.database.initialize(database)
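A quick usage sketch of that class method (the database file name is just a placeholder):

db = pw.SqliteDatabase('app.db')  # placeholder database
BaseModel.configure_proxy(db)     # binds the proxy shared by all models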
I don't know exactly why you are having this problem, and I'd be interested in the full answer. The problem is with the name Meta; I'm guessing there's something by that name defined in pw.Model, but I haven't been through it all yet.
That said, this (for example) works:

import peewee as pw


class BaseModel(pw.Model):
    class MyMeta:
        database = pw.Proxy()

    @classmethod
    def configure_proxy(cls, database: pw.Database):
        cls.MyMeta.database.initialize(database)

Pass database parameter into peewee Meta class

I am using peewee to manage CRUD operations on a Postgres database.
According to the project documentation, the connection to the database and the ORM models should be set up through a Meta class which the other model classes then inherit:
from peewee import *

db = PostgresqlDatabase('table', **{})


class BaseModel(Model):
    class Meta:
        database = db


class Product(BaseModel):
    name = CharField(unique=True)
I'd like to be able to encapsulate this setup in a Persistence class, as follows (so as not to create any global variables):
class Persistence():
    db = None

    class BaseModel(Model):
        class Meta:
            database = Persistence.db

    class Product(BaseModel):
        name = CharField(unique=True)

    def __init__(self):
        self.db = PostgresqlDatabase('table', **{})
Unfortunately this doesn't work; it fails with:

AttributeError: type object 'Persistence' has no attribute 'db'

I don't think this would work as expected anyway (disregarding the AttributeError), because even if the variable were in scope at the time BaseModel is created, it would be None and would not change when the Persistence class is instantiated.
Is there a way to scope the db variable correctly so that it uses a class attribute on Persistence?
Can I pass this db connection to peewee through another mechanism?
Your question conflates two separate issues: Python scoping and peewee database initialization. It would be helpful to be clear about where exactly the problem lies.
For the peewee part of your question, you probably want to defer initialization of the database. To do this, you create a database object as a placeholder -- or use a Proxy, depending on how late you want to defer things.
Example of deferred initialization:

db = SqliteDatabase(None)


class BaseModel(Model):
    class Meta:
        database = db


# Declare other models as subclasses of BaseModel, e.g.
class Foo(BaseModel):
    data = TextField()


class Persistence(object):
    def __init__(self, db_file):
        db.init(db_file)
Alternatively you can use a proxy, which is documented here: http://docs.peewee-orm.com/en/latest/peewee/database.html#dynamically-defining-a-database
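For comparison, a minimal sketch of the proxy variant (Proxy and its initialize() method are part of peewee's public API; the class and file names are placeholders):

from peewee import Proxy, SqliteDatabase, Model, TextField

db_proxy = Proxy()  # stand-in until a real database is known


class BaseModel(Model):
    class Meta:
        database = db_proxy


class Foo(BaseModel):
    data = TextField()


class Persistence(object):
    def __init__(self, db_file):
        # bind the real database as late as needed
        db_proxy.initialize(SqliteDatabase(db_file))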
You can make use of 'Using' execution contexts to have dynamic db connections. For example, I have something like this:

from peewee import *
from playhouse.shortcuts import RetryOperationalError

from config import settings


class RetryMySQLDatabase(RetryOperationalError, MySQLDatabase):
    pass


db_connection = {}
for conn in settings.DB:
    dbs = settings.DB[conn]
    db_connection[conn] = RetryMySQLDatabase(
        dbs["DB_NAME"],
        host=dbs["DB_HOST"],
        user=dbs["DB_USER"],
        password=dbs["DB_PASS"]
    )

db = db_connection["default"]  # if you don't want to have a default connection, you can make use of Proxy() here


class BaseModel(Model):
    class Meta:
        database = db


class Booking(BaseModel):
    id = BigIntegerField(db_column='ID', primary_key=True)
    name = CharField(db_column='NAME', null=True)

And while using the model class you can specify which db connection to use:

with Using(db_connection["read_only"], [Booking]):
    booking_data = Booking.get(Booking.id == 123)
Ref:
'Using' execution context: http://docs.peewee-orm.com/en/2.10.2/peewee/database.html?highlight=re#using-multiple-databases
Proxy class: http://docs.peewee-orm.com/en/latest/peewee/database.html#dynamically-defining-a-database
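Note that Using comes from the peewee 2.x API (the first link above points at the 2.10.2 docs); it was removed in peewee 3, where the closest equivalent I'm aware of is Database.bind_ctx(). A rough sketch of the same call under peewee 3:

# peewee 3: temporarily bind Booking to the read-only connection
with db_connection["read_only"].bind_ctx([Booking]):
    booking_data = Booking.get(Booking.id == 123)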

How to manage a peewee database in a separate module?

I want to have my database implementation in a separate module or class, but I am struggling with a few details. A simple example:

from peewee import *

db = SqliteDatabase(':memory:')


class BaseModel(Model):
    class Meta:
        database = db


class User(BaseModel):
    name = CharField()


db.connect()
db.create_tables([User,])
db.commit()


@db.atomic()
def add_user(name):
    User.create(name=name).save()


@db.atomic()
def get_user(name):
    return User.get(User.name == name)
So far this is working fine. I can implement my interface to the database here and import this as a module.
Now I want to be able to choose the database file at runtime, so I need a way to define the Model classes without defining SqliteDatabase('somefile') first. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from:

from peewee import *


class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)

        class BaseModel(Model):
            class Meta:
                database = self.db

        class User(BaseModel):
            name = CharField()

        self.User = User

        self.db.connect()
        self.db.create_tables([User,])
        self.db.commit()

    # @self.db.atomic()  # error: self is not known on this level
    def add_user(self, name):
        self.User.create(name=name).save()

    # @self.db.atomic()  # error: self is not known on this level
    def get_user(self, name):
        return self.User.get(self.User.name == name)
Now I can call, for example, database = Database('database.db') or choose any other file name. I can even use multiple database instances in the same program, each with its own file.
However, there are two problems with this approach:

1. I still need to specify the database driver (SqliteDatabase) before defining the Model classes. To solve this I define the Model classes within the __init__() method and then create an alias with self.User = User. I don't really like this approach (it just doesn't feel like neat code), but at least it works.

2. I cannot use the @db.atomic() decorator, since self is not known at class level; I would need an instance here.

So this class approach does not seem to work very well. Is there some better way to define the Model classes without having to choose where you want to store your database first?
If you need to change the database driver at runtime, then Proxy is the way to go:

# database.py
import peewee as pw

proxy = pw.Proxy()


class BaseModel(pw.Model):
    class Meta:
        database = proxy


class User(BaseModel):
    name = pw.CharField()


def add_user(name):
    with proxy.atomic() as txn:
        User.create(name=name).save()


def get_user(name):
    with proxy.atomic() as txn:
        return User.get(User.name == name)
From now on, each time you load the module it won't need a database to be initialized. Instead, you can initialize one at runtime and switch between multiple databases as follows:

# main.py
import peewee as pw

import database as db

sqlite_1 = pw.SqliteDatabase('sqlite_1.db')
sqlite_2 = pw.SqliteDatabase('sqlite_2.db')

db.proxy.initialize(sqlite_1)
sqlite_1.create_tables([db.User], safe=True)
db.add_user(name="Tom")

db.proxy.initialize(sqlite_2)
sqlite_2.create_tables([db.User], safe=True)
db.add_user(name="Jerry")
But if the connection is the only thing that matters, then the init() method will be enough.
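A minimal sketch of that init() variant, assuming the same User model (SqliteDatabase(None) creates a database object in a deferred, uninitialized state; the module and file names are placeholders):

# database_deferred.py
import peewee as pw

db = pw.SqliteDatabase(None)  # deferred: no file chosen yet


class BaseModel(pw.Model):
    class Meta:
        database = db


class User(BaseModel):
    name = pw.CharField()


# later, at runtime, once the file name is known:
db.init('runtime_choice.db')
db.create_tables([User], safe=True)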
Now I want to be able to choose the database file at runtime. So I need a way to define the Model classes without defining SqliteDatabase('somefile') before. I tried to encapsulate everything in a new Database class, which I can later import and create an instance from.
Peewee uses the Meta class to define the name of the table (Model.Meta.db_table) and the database (Model.Meta.database).
Set these attributes before calling any Model-specific code (either to create a table or for DML statements).
'Allow to define database dynamically'
Question: I cannot use the @db.atomic() decorator since self is not known at class level.
Do it the same way as you do it with self.User: define the decorated functions inside __init__(), where self is known, and assign them to the instance.
I wonder about atomic() instead of atomic, but you say it is working fine.
class Database:
    def __init__(self, dbfile):
        self.db = SqliteDatabase(dbfile)
        ...

        @self.db.atomic()
        def __add_user(name):
            # closes over self, so no self parameter is needed here
            self.User.create(name=name).save()
        self.add_user = __add_user

        @self.db.atomic()
        def __get_user(name):
            return self.User.get(self.User.name == name)
        self.get_user = __get_user
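Usage then stays the same as before (a sketch; the file name is a placeholder and the elided model setup from the question is assumed):

database = Database('users.db')
database.add_user('Bob')
print(database.get_user('Bob').name)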
Related: Define models separately from Database() initialization

Sharing an event handler across all models

I'm trying to share a simple piece of functionality across all my models (timestamping) and am going with the "augmenting the Base" approach as described in the SQLAlchemy docs. So far I did this:

import sqlalchemy as sa
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from datetime import datetime as dt


class EntityBase(object):
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

    id = sa.Column(sa.Integer, primary_key=True)
    last_update = sa.Column(sa.DateTime, default=dt.utcnow)


def update_entity(mapper, connection, target):
    target.last_update = dt.utcnow()


Entity = declarative_base(cls=EntityBase)

sa.event.listen(Entity, 'before_insert', update_entity)
sa.event.listen(Entity, 'before_update', update_entity)
All my models are derived from the Entity class. But at runtime I get sqlalchemy.orm.exc.UnmappedClassError: Class 'sqlalchemy.ext.declarative.Base' is not mapped. What am I doing wrong?
UPDATE
I've circumvented the problem by doing some simple preprocessing, like this:

def setupEntities():
    ...
    for cls in Entity.__subclasses__():
        listen(cls, 'before_insert', update_entity)
        listen(cls, 'before_update', update_entity)
    ...

...but I'd like to hear about the right way to do it.
What's interesting here is that the way you did it above probably should work, and it would be handy, so I've added a ticket for that: http://www.sqlalchemy.org/trac/ticket/2585
For now, a way you can do this is to set an event listener for new mappings along with your Base:

from sqlalchemy import event
from sqlalchemy.event import listen
from sqlalchemy.orm import mapper

Entity = declarative_base(cls=EntityBase)


@event.listens_for(mapper, 'mapper_configured')
def set_events(mapper, class_):
    if issubclass(class_, Entity):
        listen(class_, 'before_update', update_entity)
        listen(class_, 'before_insert', update_entity)

In fact, if I implement the feature for #2585, I'd probably have to do it very similarly to this.
