SQLAlchemy Mixin Expected Mapped Entity got Implementation - python

I have a joined-inheritance set of models.
class Asset(db.Model):
    __tablename__ = 'assets'
    asset_id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    asset_type_id = db.Column(db.ForeignKey('asset_types.asset_type_id'))
    ...
    __mapper_args__ = {
        'polymorphic_on': asset_type_id
    }

class Vehicle(Asset):
    __tablename__ = 'vehicles'
    asset_id = db.Column(db.ForeignKey('assets.asset_id'), primary_key=True)
    ...
    __mapper_args__ = {
        'polymorphic_identity': 2
    }
I would like to mixin the Vehicle into a class with self referencing methods. Like so:
from library.models import Vehicle as EsdbVehicle

class Vehicle(EsdbVehicle):
    def update_usage(self, db_session, odometer):
        self.odometer = odometer
        db_session.commit()
When I query the EsdbVehicle model vehicles = db_session.query(EsdbVehicle).all() I get 115 results, which matches my DB. However when I query the implementing Vehicle I get 0 results. What am I missing?

I cannot be 100% sure without setting up a DB connection and running your code, but my instinct is that this fails because subclassing a mapped class looks to SQLAlchemy like another level of table inheritance. Since that is not what you intend, and the class definition is incomplete for that purpose, SQLAlchemy gets confused; what you really want is a mixin.
I would try something more like this:
class Vehicle(Asset):
    # stuff...
    __mapper_args__ = {
        'polymorphic_on': 'vehicle_subtype',
        'polymorphic_identity': 'base_vehicle'
    }

class UsageMixin(object):
    def update_usage(self, *args):
        # stuff...
        pass

class VehiclesThatUpdate(Vehicle, UsageMixin):
    __mapper_args__ = {
        'polymorphic_identity': 'updating_vehicles'
    }
The reason that should work: every time you extend an ORM-mapped class, you are essentially telling SQLAlchemy you want some kind of table inheritance. In this case you can get away with just adding a polymorphic identity (for single-table inheritance), because you aren't adding attributes (columns), you're just adding functionality.
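A runnable sketch of that idea, with made-up column and class names (not the original models): a single-table-inheritance subclass that adds only a mixin method and a new polymorphic identity, no new columns.

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Vehicle(Base):
    __tablename__ = "vehicles"
    id = Column(Integer, primary_key=True)
    odometer = Column(Integer, default=0)
    subtype = Column(String)  # hypothetical discriminator column
    __mapper_args__ = {
        "polymorphic_on": subtype,
        "polymorphic_identity": "base_vehicle",
    }

class UsageMixin:
    def update_usage(self, db_session, odometer):
        self.odometer = odometer
        db_session.commit()

class VehiclesThatUpdate(Vehicle, UsageMixin):
    # single-table inheritance: no new table or columns, just a new identity
    __mapper_args__ = {"polymorphic_identity": "updating_vehicles"}

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
session.add(VehiclesThatUpdate())
session.commit()

# querying the subclass filters on the discriminator automatically
v = session.query(VehiclesThatUpdate).first()
v.update_usage(session, 42)
print(v.odometer)  # 42
```

Querying `Vehicle` still returns the row as well, since subclass rows are Vehicles too.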
If this does not work out (or is incomplete) just add a comment and I'll adjust.
EDIT: Another thought. You could also skip the inheritance altogether and just write functions that operate on the Vehicle class; they do not need to be methods. I often do this to avoid extra coupling between my actual SQL model and the class/method structure.
The more you use ORM's the more you start to realize that often times using less of their features is better.

Related

Abstract class - MongoEngine

Per the MongoEngine docs (http://docs.mongoengine.org/guide/defining-documents.html#abstract-classes), abstract classes are a nice way to provide extra functionality to a group of Document classes.
I am trying to achieve this, but am struggling to create methods that do even simple things as get a document by id.
I have something like the following:
class BaseDocument(Document):
    meta = {
        'abstract': True,
    }

    def delete_by_id(self, id):
        document = self.objects(_id=id)  # <---- self is a User object, not the User collection
        document.delete()

class User(BaseDocument):
    _id = StringField(primary_key=True)
    name = StringField()
    email = StringField()

new_user = User(_id="123", name="123", email="123@123.com")
new_user.delete_by_id("123")  # <--- throws 'QuerySetManager' object is not callable
Is there no way to have the BaseDocument method delete_by_id delete a document without referencing the User class itself inside the method? Such an implementation would make BaseDocument so much more "inheritable".
Thank you for any help!!
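One likely way out (my assumption, not from the post): MongoEngine's `objects` manager lives on the Document class, not on instances, which is why calling it through `self` fails. Making the helper a classmethod lets `cls` resolve to whichever subclass the call was made on. A plain-Python sketch of that dispatch, with a string stand-in for the real queryset call so no MongoDB is needed:

```python
class BaseDocument:
    @classmethod
    def delete_by_id(cls, id):
        # stand-in for: cls.objects(_id=id).delete()
        # `cls` is the concrete subclass (e.g. User), so the class-level
        # manager would be reachable here in real MongoEngine code
        return f"{cls.__name__}.objects(_id={id}).delete()"

class User(BaseDocument):
    pass

print(User.delete_by_id("123"))  # User.objects(_id=123).delete()
```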

SQLAlchemy creating a backref on a single table inheritance subclass?

I'm trying to create a backref in a Single table inheritance situation, where one of the subclassed objects have a mixin that declares the relationship (HasFooMixin), but this results in the following issue:
sqlalchemy.exc.ArgumentError: Error creating backref 'bar' on relationship 'ImplementedBase.foos': property of that name exists on mapper 'mapped class Foo->foos'
I thought maybe the name 'bar' was used somewhere else, but regardless of what I name the backref, the same error is raised. It seems to be caused by an already-existing backref of that name, yet I can't find one anywhere in my codebase.
Any idea how to solve this? Basically I want a two-way reference from bar --> foos and from foo --> bars, (but only for the polymorphic class ImplementedBase if possible)
Below are the model details.
class BaseClass(db.Model):
    type = db.Column(db.Text, default="base")
    ...
    __mapper_args__ = {
        'polymorphic_identity': 'base',
        'polymorphic_on': type
    }

class ImplementedBase(BaseClass, HasFooMixin):
    __mapper_args__ = {
        'polymorphic_identity': 'implementedbase'
    }

class HasFooMixin(object):
    @declared_attr
    def foos(cls):
        return cls.__table__.c.get('foos', db.relationship('Foo', secondary=mtm_foo_bar, backref="bar"))

# Table for MTM mapping Foo <---> Bar
mtm_foo_bar = db.Table('associate_foo_bar',
    db.Model.metadata,
    db.Column("foo_id", db.Integer, db.ForeignKey("foo.id")),
    db.Column("bar_id", db.Integer, db.ForeignKey("bar.id"))
)

How to create custom classes that inherit from sqlalchemy models in external library

Rather than creating mixin classes that models inherit from, I have a use case that requires me to configure classes the other way around: the classes that would normally be mixins need to inherit from the models, and also be the classes that model objects are created from. This is because the models and the mapper configurations live in an external library, separate from the main repository.
I need to pass the host for the engine from the main repository to the models library before any of the models are loaded, so they can load with the declarative base already configured. After the engine information is passed in, the session, Base class, and everything else are created within a sort of base class that the models inherit from. Here is a simplified example:
class SQLAlchemyBase(object):
    metadata = None
    Session = None
    Base = object
    sessionfactory = sessionmaker()

    def initialize(self, host):
        engine = create_engine(host)
        self.metadata = MetaData(bind=engine)
        self.Session = scoped_session(self.sessionfactory)
        self.Base = declarative_base(metadata=self.metadata)

models = SQLAlchemyBase()
(The models inherit from models.Base)
So the SQLAlchemyBase will be imported into the main repository, the initialize method will be called with the host for the engine, and the models can then be imported. The main repository has its own classes with the same names as the models, with additional methods that a normal mixin class would provide.
However, I am unable to create model objects using the classes in the main repository, because I can't get the mappers to play nicely with this unusual inheritance from the external models library. Additionally, the models library contains models with multiple levels of inherited polymorphic relationships. Here is an example similar to one of the more basic inherited polymorphic relationships:
Models Library
class Foo(models.Base):
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)
    foo_bar_id = Column(Integer, ForeignKey("foo_bar.id"))
    foo_bar = relationship(FooBar, backref=backref("foos"))
    __mapper_args__ = {"polymorphic_on": type}

class Bar(Foo):
    __mapper_args__ = {"polymorphic_identity": "bar"}

class FooBar(models.Base):
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)
Main Repository
from separate_library.models import models, Foo as BaseFoo, Bar as BaseBar, FooBar as BaseFooBar

class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass
The original error I was getting was something like this:
InvalidRequestError: One or more mappers failed to initialize - can't proceed with initialization of other mappers.
Original exception was: Multiple classes found for path Foo in the registry of this declarative base. Please use a fully module-qualified path.
So I tried putting the full path in the relationships. Then it started giving me an error like this:
FlushError: Attempting to flush an item of type as a member of collection FooBar.foos. Expected an object of type or a polymorphic subclass of this type. If is a subclass of , configure mapper Mapper|Foo|foo to load this subtype polymorphically, or set enable_typechecks=False to allow any subtype to be accepted for flush.
Essentially, the main problem is getting the classes in the main module to point to and act like the model classes. For example, when I try to create relationships, it says it expected an object of type separate_library.models.Foo instead of main_module.models.Foo. Additionally, in the polymorphic relationships, I can't get the polymorphic_identity to populate for the polymorphic_on column. For example, Bar in the main repository will have the type column empty when the object is initially created.
One idea I tried was to add a metaclass to the declarative base in the models library and modify the mappers in the __init__ method during their initialization. I made progress this way, but haven't gotten it to work completely.
Sorry for the complex explanation, but this is a complex problem. I am not able to change anything about the models or the use case, unfortunately. I have to work within these constraints. If anyone can offer ideas on how to configure the mappers for the classes in the main repository to act like the models in the model library, I would be very grateful.
There are three problems here:

1. When you write foo_bar = relationship(FooBar, backref=backref("foos")), the FooBar needs to refer to the subclass FooBar, not the BaseFooBar.
2. Bar needs to inherit from Foo for the inheritance mechanism to work; it cannot inherit from BaseFoo.
3. Your base classes should not have mappers attached to them; otherwise the inheritance mechanism gets out of whack.

The solutions to these problems, in order:

1. Use a string to refer to the class name. This confines the consumer to naming their classes a certain way; let's accept this restriction for now.
2. Use a metaclass to dynamically change the base class. The metaclass needs to derive from the metaclass of Base, because SQLAlchemy's declarative extension makes liberal use of metaclasses. We'll see that the metaclass approach can also solve problem 1 in a flexible way.
3. Use __abstract__ = True.
Simplest possible example:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base, declared_attr, DeclarativeMeta

class BaseMeta(DeclarativeMeta):
    def __new__(cls, name, bases, attrs):
        if not attrs.get("__abstract__"):
            if len(bases) != 1:
                # you'll need to handle multiple inheritance if you have
                # that as well
                raise NotImplementedError()
            base, = bases
            extra_bases = tuple(b._impl for b in base.__bases__
                                if hasattr(b, "_impl"))
            bases += extra_bases
            self = super(BaseMeta, cls).__new__(cls, name, bases, attrs)
            if getattr(base, "__abstract__", False):
                base._impl = self
            return self
        else:
            return super(BaseMeta, cls).__new__(cls, name, bases, attrs)

Base = declarative_base(metaclass=BaseMeta)

class BaseFoo(Base):
    __abstract__ = True
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)

    @declared_attr
    def foo_bar_id(cls):
        return Column(Integer, ForeignKey("foo_bar.id"))

    @declared_attr
    def foo_bar(cls):
        return relationship(lambda: BaseFooBar._impl, backref=backref("foos"))

    __mapper_args__ = {"polymorphic_on": type}

class BaseBar(BaseFoo):
    __abstract__ = True
    __mapper_args__ = {"polymorphic_identity": "bar"}

class BaseFooBar(Base):
    __abstract__ = True
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)

class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass

print(Bar.__bases__)  # (<class '__main__.BaseBar'>, <class '__main__.Foo'>)
The basic idea of the metaclass is to inject the class Foo into the bases of Bar, based on the fact that BaseBar inherits from BaseFoo, and the fact that Foo implements BaseFoo (by inheriting from it).
You can add more complicated stuff on top, such as multiple-inheritance support or graceful error handling (e.g. warning users when they are missing a subclass for one of your base classes, or when they have provided multiple subclasses for the same base class).

SQLAlchemy raises exception on query when with_polymorphic used for model with order_by mapper arg presented with string

I'm stuck on an issue with SQLAlchemy inheritance (I have a mixin that sets __mapper_args__).
To reproduce it:

1. The model should have a __mapper_args__ attribute with the order_by param set to any string.
2. Add a with_polymorphic('*') call to a query with this model.
class User(Base):
    id = Column(Integer, primary_key=True)
    name = Column(String)

    __mapper_args__ = {'order_by': 'User.name'}
With this construction everything works just fine, until we add with_polymorphic('*') to the query.
db.query(User).with_polymorphic('*')
This fails with exception
File "/python2.7/dist-packages/sqlalchemy/sql/visitors.py", line 267, in clone
'no_replacement_traverse' in elem._annotations:
AttributeError: 'str' object has no attribute '_annotations'
I suppose this is kind of a bug. But since it reproduces on SQLA 0.7-0.9, I have doubts about the way I ran into this issue. Maybe I'm doing something wrong?
This issue was reproduced on small testcase.
P.S.
Originally I needed this mixin in my project:
class DocMixin(object):
    id = Column(Integer)
    title = Column(String(255))

    @declared_attr
    def __mapper_args__(cls):
        return {'order_by': 'title'}
Don't use strings in the order_by mapper parameter; use the column directly. Changing 'User.name' to name makes your example pass.
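A minimal runnable sketch of the "use the column object, not a string" idea (model and data are made up; note that the mapper-level order_by parameter was deprecated in later SQLAlchemy versions, so here the ordering is applied per query, still via the column object):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class DocMixin:
    # plain columns on a mixin are copied onto each mapped subclass
    id = Column(Integer, primary_key=True)
    title = Column(String(255))

class Doc(DocMixin, Base):
    __tablename__ = "docs"

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

session = Session(engine)
session.add_all([Doc(title="zebra"), Doc(title="apple")])
session.commit()

# pass the column object (Doc.title), not the string 'title'
titles = [d.title for d in session.query(Doc).order_by(Doc.title)]
print(titles)  # ['apple', 'zebra']
```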

SQLAlchemy: Initializing attribute based on other attribute

I'm working on a Python project using SQLAlchemy. I have following class (I have omitted some methods irrelevant to the question):
class Cmd(Base):
    __tablename__ = "commands"

    dbid = Column(Integer, Sequence("commands_seq"), primary_key=True)
    cmd_id = Column(SmallInteger)
    instance_dbid = Column(Integer, ForeignKey("instances.dbid"))
    type = Column(String(20))

    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "Cmd"
    }

    def __init__(self, cmd_id):
        self.cmd_id = cmd_id
        self.cmd_name = event_names[self.cmd_id]
As you see, on initializing instance of the class the attribute cmd_name is created from cmd_id attribute using event_names list (also omitted, it's a simple list containing command names).
I create a Cmd object, add it to the session, and commit the session. After closing the application and launching it again, I try to load this Cmd with a SQLAlchemy query. The object is loaded, but of course __init__ is not called and cmd_name is not set.
I would like to know if there is some simple way of executing some code (self.cmd_name = event_names[self.cmd_id]) after getting Cmd object with query. Of course I could do a special method and always launch it after query, but I'm seeking more elegant, automatic way.
I've read the documentation and found some information about ORM Event listeners, but they seem to be too much for such simple case. I've also found piece about Attribute Events, but they work with column_property and relationship only. Is there any short, elegant way to achieve what I want?
You can use the @reconstructor decorator:
from sqlalchemy.orm import reconstructor

class Cmd(Base):
    __tablename__ = "commands"

    dbid = Column(Integer, Sequence("commands_seq"), primary_key=True)
    cmd_id = Column(SmallInteger)
    instance_dbid = Column(Integer, ForeignKey("instances.dbid"))
    type = Column(String(20))

    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "Cmd"
    }

    def __init__(self, cmd_id):
        self.cmd_id = cmd_id
        self.cmd_name = event_names[self.cmd_id]

    @reconstructor
    def init_db_load(self):
        self.cmd_name = event_names[self.cmd_id]
See this doc under "Constructors and Object Initialization".
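A self-contained demonstration of the pattern, stripped down from the models above (the event_names table and column set are simplified stand-ins, not the original project's):

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base, reconstructor

Base = declarative_base()
event_names = {0: "start", 1: "stop"}  # hypothetical id -> name lookup

class Cmd(Base):
    __tablename__ = "commands"
    dbid = Column(Integer, primary_key=True)
    cmd_id = Column(Integer)

    def __init__(self, cmd_id):
        self.cmd_id = cmd_id
        self.cmd_name = event_names[cmd_id]

    @reconstructor
    def init_db_load(self):
        # called when an instance is loaded from the DB; __init__ is not
        self.cmd_name = event_names[self.cmd_id]

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Cmd(cmd_id=1))
    session.commit()

# a fresh session, so the object below comes from the database, not __init__
with Session(engine) as session:
    loaded = session.query(Cmd).first()
    print(loaded.cmd_name)  # stop
```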
