I'm working on a Python project using SQLAlchemy. I have the following class (I have omitted some methods that are irrelevant to the question):
class Cmd(Base):
    __tablename__ = "commands"
    dbid = Column(Integer, Sequence("commands_seq"), primary_key=True)
    cmd_id = Column(SmallInteger)
    instance_dbid = Column(Integer, ForeignKey("instances.dbid"))
    type = Column(String(20))
    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "Cmd"
    }

    def __init__(self, cmd_id):
        self.cmd_id = cmd_id
        self.cmd_name = event_names[self.cmd_id]
As you can see, when an instance of the class is initialized, the cmd_name attribute is created from the cmd_id attribute using the event_names list (also omitted; it's a simple list containing command names).
I create a Cmd object, add it to the session, and commit the session. After closing the application and launching it again, I try to load this Cmd with a SQLAlchemy query. The object is loaded, but of course __init__ is not called and cmd_name is not set.
I would like to know if there is a simple way to execute some code (self.cmd_name = event_names[self.cmd_id]) after a Cmd object is fetched by a query. Of course I could write a special method and always call it after querying, but I'm looking for a more elegant, automatic way.
I've read the documentation and found some information about ORM event listeners, but they seem like too much for such a simple case. I also found a piece about attribute events, but they work only with column_property and relationship. Is there any short, elegant way to achieve what I want?
You can use the @reconstructor decorator:
from sqlalchemy.orm import reconstructor

class Cmd(Base):
    __tablename__ = "commands"
    dbid = Column(Integer, Sequence("commands_seq"), primary_key=True)
    cmd_id = Column(SmallInteger)
    instance_dbid = Column(Integer, ForeignKey("instances.dbid"))
    type = Column(String(20))
    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "Cmd"
    }

    def __init__(self, cmd_id):
        self.cmd_id = cmd_id
        self.cmd_name = event_names[self.cmd_id]

    @reconstructor
    def init_db_load(self):
        self.cmd_name = event_names[self.cmd_id]
See this doc under "Constructors and Object Initialization".
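For example (a minimal sketch, assuming a configured Session factory and the event_names list from the question), the reconstructor runs when the instance is loaded by the query, so cmd_name is available again:

session = Session()
loaded = session.query(Cmd).filter_by(cmd_id=3).first()
# init_db_load() was invoked automatically when the row was loaded
print loaded.cmd_name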
Related
I'm trying to create a backref in a single table inheritance situation, where one of the subclasses uses a mixin that declares the relationship (HasFooMixin), but this results in the following error:
sqlalchemy.exc.ArgumentError: Error creating backref 'bar' on relationship 'ImplementedBase.foos': property of that name exists on mapper 'mapped class Foo->foos'
I thought maybe the name 'bar' was already used somewhere else, but regardless of what I name it, the same error is generated. The error seems to say that a backref with that name already exists, but I can't find one anywhere in my codebase.
Any idea how to solve this? Basically I want a two-way reference from bar --> foos and from foo --> bars (but only for the polymorphic class ImplementedBase, if possible).
Below are the model details.
class BaseClass(db.Model):
    type = db.Column(db.Text, default="base")
    ...
    __mapper_args__ = {
        'polymorphic_identity': 'base',
        'polymorphic_on': type
    }

class ImplementedBase(BaseClass, HasFooMixin):
    __mapper_args__ = {
        'polymorphic_identity': 'implementedbase'
    }

class HasFooMixin(object):
    @declared_attr
    def foos(cls):
        return cls.__table__.c.get('foos', db.relationship('Foo', secondary=mtm_foo_bar, backref="bar"))

# Table for MTM mapping Foo <---> Bar
mtm_foo_bar = db.Table('associate_foo_bar',
    db.Model.metadata,
    db.Column("foo_id", db.Integer, db.ForeignKey("foo.id")),
    db.Column("bar_id", db.Integer, db.ForeignKey("bar.id"))
)
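(For what it's worth, a hedged sketch of one way to get the two-way link without the duplicated-backref error is to declare the relationship explicitly on both classes with back_populates instead of creating it from the mixin's backref. The Foo class and the 'foo'/'bar' table names below are assumptions inferred from the association table, not code from the question.)

class Foo(db.Model):
    __tablename__ = 'foo'
    id = db.Column(db.Integer, primary_key=True)
    # two-way link: Foo.bars <-> ImplementedBase.foos
    bars = db.relationship('ImplementedBase', secondary=mtm_foo_bar,
                           back_populates='foos')

class ImplementedBase(BaseClass):
    __mapper_args__ = {
        'polymorphic_identity': 'implementedbase'
    }
    # declared only on the polymorphic subclass, as requested
    foos = db.relationship('Foo', secondary=mtm_foo_bar,
                           back_populates='bars')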
I have a joined-inheritance set of models.
class Asset(db.Model):
    __tablename__ = 'assets'
    asset_id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    asset_type_id = db.Column(db.ForeignKey('asset_types.asset_type_id'))
    ...
    __mapper_args__ = {
        'polymorphic_on': asset_type_id
    }

class Vehicle(Asset):
    __tablename__ = 'vehicles'
    asset_id = db.Column(db.ForeignKey('assets.asset_id'), primary_key=True)
    ...
    ...
    __mapper_args__ = {
        'polymorphic_identity': 2
    }
I would like to mix Vehicle into a class with self-referencing methods, like so:
from library.models import Vehicle as EsdbVehicle

class Vehicle(EsdbVehicle):
    def update_usage(self, db_session, odometer):
        self.odometer = odometer
        db_session.commit()
When I query the EsdbVehicle model, vehicles = db_session.query(EsdbVehicle).all(), I get 115 results, which matches my DB. However, when I query the implementing Vehicle I get 0 results. What am I missing?
I cannot be 100% sure about this without setting up a DB connection and running your code, but my instinct tells me this is not working as expected because it looks to SQLAlchemy like you're trying to do table inheritance (and since you're not, and the class definition is incomplete, SQLAlchemy is confused), when what you really want is a mixin.
I would try something more like this:
class Vehicle(Asset):
    # stuff...
    __mapper_args__ = {
        'polymorphic_on': 'vehicle_subtype',
        'polymorphic_identity': 'base_vehicle'
    }

class UsageMixin(object):
    def update_usage(self, *args):
        # stuff...
        pass

class VehiclesThatUpdate(Vehicle, UsageMixin):
    __mapper_args__ = {
        'polymorphic_identity': 'updating_vehicles'
    }
The reason that should work is that every time you extend an ORM-mapped class, you are essentially telling SQLAlchemy you want to perform some kind of table inheritance. In this case, you can get away with just adding a type (for single table inheritance) because you aren't adding attributes (columns); you're just adding functionality.
If this does not work out (or is incomplete) just add a comment and I'll adjust.
EDIT: Another thought. The other thing you could do is skip the inheritance altogether and just write functions that operate on the Vehicle class. They do not necessarily need to be methods. I often do this to avoid extra coupling between my actual SQL model and the class/method structure.
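For example (a rough sketch of that idea; the filter value is just a placeholder):

# Plain functions that operate on the mapped class -- no subclass,
# no extra polymorphic identity involved.
def update_usage(db_session, vehicle, odometer):
    vehicle.odometer = odometer
    db_session.commit()

vehicle = db_session.query(EsdbVehicle).filter(EsdbVehicle.asset_id == 42).one()
update_usage(db_session, vehicle, odometer=120000)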
The more you use ORMs, the more you start to realize that oftentimes using fewer of their features is better.
I'm stuck on an issue when using SQLAlchemy inheritance (I have a mixin with __mapper_args__ set on it).
To reproduce it:
The model should have a __mapper_args__ attribute with the order_by param set to any string.
Add a with_polymorphic('*') call to a query with this model.
class User(Base):
    id = Column(Integer, primary_key=True)
    name = Column(String)
    __mapper_args__ = {'order_by': 'User.name'}
With this construction everything works just fine, except when we add with_polymorphic('*') to the query.
db.query(User).with_polymorphic('*')
This fails with the following exception:
File "/python2.7/dist-packages/sqlalchemy/sql/visitors.py", line 267, in clone
'no_replacement_traverse' in elem._annotations:
AttributeError: 'str' object has no attribute '_annotations'
I suppose this is some kind of a bug, but since it reproduces on SQLAlchemy 0.7-0.9, I have doubts about the way I've run into this issue. Maybe I'm doing something wrong?
This issue was reproduced in a small test case.
P.S.
Originally I needed this mixin in my project:
class DocMixin(object):
    id = Column(Integer)
    title = Column(String(255))

    @declared_attr
    def __mapper_args__(cls):
        return {'order_by': 'title'}
Don't use strings in the order_by mapper parameter; use the column directly. Changing 'User.name' to name makes your example pass the tests.
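For example, a minimal sketch of the corrected mapping (the __tablename__ is added here only to keep the snippet self-contained):

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # pass the Column object itself, not the string 'User.name'
    __mapper_args__ = {'order_by': name}

db.query(User).with_polymorphic('*')  # no longer raises AttributeError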
I am using the SQLAlchemy versioned object example as a reference.
Example: http://docs.sqlalchemy.org/en/rel_0_7/orm/examples.html#versioned-objects
When I update a record, I get no errors. The case_history table is being created, but the version number stays at '1' and the case_history table is empty.
(Yes I am aware that I am using 'case' as a class name. Is that bad?)
Here are my code snippets:
models.py:
from history_meta import Versioned, versioned_session

# set up the base class
class Base(object):
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

    id = Column(Integer, primary_key=True)
    header_row = Column(SmallInteger)

    def to_dict(self):
        serialized = dict((column_name, getattr(self, column_name))
                          for column_name in self.__table__.c.keys())
        return serialized

Base = declarative_base(cls=Base)

class case(Versioned, Base):
    title = Column(String(32))
    description = Column(Text)

    def __repr__(self):
        return self.title
app.py:
engine = create_engine(SQLALCHEMY_DATABASE_URI)
Session = sessionmaker(bind=engine)
versioned_session(Session)
db = Session()
...
@app.route('/<name>/:record', method='POST')
def default(name, record):
    myClass = getattr(sys.modules[__name__], name)
    db.query(myClass).filter(myClass.id == record).update(request.json)
    for u in db.query(case).filter(case.id == record):
        print u.version  # Version is always 1
    db.commit()  # I added this just to test versioning.
Any clue as to why the versioning isn't happening?
For others who find their way here...
Remember: even if filter() matches a single object, the update() method is a bulk operation and acts differently. It is possible the version is only incremented by an event like after_update(), which does not fire for bulk operations.
Read more on caveats for the update() operation here.
An update query will not cause the version to increment even though the data changes. There might be ways to 'listen' for that type of change, but I don't know.
You have to change an attribute on an instance of the mapped class:
# Get an instance of the class
myItem = db.query(myClass).get(record)
# Change an attribute
myItem.title = "foo"
# Commit if necessary
db.commit()
In SQLAlchemy Declarative, how do I set up default values for columns, such that transient or pending object instances will have those default values? A short example:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
class A(Base):
    __tablename__ = "A"
    id = Column(Integer, primary_key=True)
    word = Column(String, default="adefault")

a = A()
print a.word
Naively, I would expect the output of this to be adefault. Of course, the output is actually None. Even when adding the object to a session, it stays None and only gets filled in when I commit (or flush) the session and re-read the instance value from the database.
Is there any way to set an attribute default without flushing the instance to the database? I tried investigating the ColumnDefault documentation, and there doesn't seem to be an obvious way to inspect the type/Python value so as to set it manually in a custom declarative base class.
Add a constructor to your class and set the default value there. The constructor doesn't run when rows are loaded from the database, so it is fine to do this.
class A(Base):
    __tablename__ = "A"
    id = Column(Integer, primary_key=True)
    word = Column(String)

    def __init__(self):
        self.word = "adefault"

a = A()
print a.word
There are examples of using __init__ in similar ways in the SA Docs.
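If you still want callers to be able to pass other column values as keyword arguments (as the default declarative constructor allows), one variant is to delegate to the base constructor. This is only a sketch of that idea, not code from the docs:

class A(Base):
    __tablename__ = "A"
    id = Column(Integer, primary_key=True)
    word = Column(String, default="adefault")  # INSERT-time default can stay

    def __init__(self, **kwargs):
        kwargs.setdefault("word", "adefault")  # Python-side default
        super(A, self).__init__(**kwargs)      # keep the usual kwargs behaviour

a = A()
print a.word                # adefault
print A(word="other").word  # other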