I have three classes in my model, where one class is inherited by the other two:
class Item(Base):
    __tablename__ = 'item'

    id = Column(Integer, primary_key=True)
    title = Column(Unicode(300))
    type = Column(Unicode(50))

    __mapper_args__ = {
        'polymorphic_on': type
    }

class Note(Item):
    __tablename__ = 'note'

    id = Column(Integer, ForeignKey('item.id'), primary_key=True)
    extra = Column(Text)

    __mapper_args__ = {
        'polymorphic_identity': 'note'
    }

class Task(Item):
    __tablename__ = 'task'

    id = Column(Integer, ForeignKey('item.id'), primary_key=True)
    another_extra = Column(Text)

    __mapper_args__ = {
        'polymorphic_identity': 'task'
    }
So when I execute session.query(Item).all(), I get a list that includes both Note and Task objects, but I don't want that: I want my objects to be instances of the Item class, with just id, title and type, not the extra fields. How should I write the query?
to clarify more, currently, I get:
[
<models.Note object at 0x7f25ac3ffe80>,
<models.Task object at 0x7f25ac3ffe80>,
<models.Task object at 0x7f25ac3ffe80>,
...
]
But I want to get:
[
<models.Item object at 0x000000000000>,
<models.Item object at 0x000000000000>,
<models.Item object at 0x000000000000>,
...
]
NOTE: This may be problematic in a multi-threaded application.
You could use a context manager to temporarily block the polymorphism:
from contextlib import contextmanager
from sqlalchemy import inspect

@contextmanager
def no_poly(class_):
    mapper = inspect(class_).mapper
    polycol = mapper.polymorphic_on
    mapper.polymorphic_on = None
    try:
        yield class_
    finally:
        # restore the discriminator even if the block raises
        mapper.polymorphic_on = polycol
Base.metadata.drop_all(engine)
Base.metadata.create_all(engine)
task = Task(title='Task Title', another_extra='something')
s = Session()
s.add(task)
s.commit()

# Opening a new session: if the pk already exists in the
# identity map, the query will return whatever type that pk
# is pointing at.
s = Session()
with no_poly(Item) as class_:
    inst = s.query(class_).all()
    print(inst)  # [<__main__.Item object at 0x000001443685DDD8>]

s = Session()  # new session again
inst = s.query(Item).all()
print(inst)  # [<__main__.Task object at 0x00000144368770B8>]
Something to be mindful of, as noted in the comments in my example: if the identity of the object is already present in the identity map, you will get back whatever type is held there, regardless of the class you query on.
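If the goal is only the base-table columns, another route that sidesteps mapper polymorphism entirely is to query those columns explicitly, which returns plain row tuples rather than mapped instances. A minimal, self-contained sketch (the engine and session setup here are illustrative):

```python
# Sketch: selecting columns of the base table avoids polymorphic
# loading entirely -- no Note/Task instances are created.
from sqlalchemy import Column, ForeignKey, Integer, Text, Unicode, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    title = Column(Unicode(300))
    type = Column(Unicode(50))
    __mapper_args__ = {'polymorphic_on': type}

class Task(Item):
    __tablename__ = 'task'
    id = Column(Integer, ForeignKey('item.id'), primary_key=True)
    another_extra = Column(Text)
    __mapper_args__ = {'polymorphic_identity': 'task'}

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(Task(title='Task Title', another_extra='something'))
session.commit()

# Plain rows carrying only id, title and type.
rows = session.query(Item.id, Item.title, Item.type).all()
print(rows)
```

The trade-off is that you get Row objects, not Item instances, so this only fits when read-only access to the base columns is enough.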
Related
I want to store generic-purpose references to database entries in an extra table of an SQL database.
My data model in SQLAlchemy looks like this:
class Entity(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10))

    def __init__(self, name):
        self.name = name

class Entry(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10))
    entity_id = db.Column(db.Integer, db.ForeignKey('entity.id'))

    def __init__(self, name, entity_id):
        self.name = name
        self.entity_id = entity_id

class Thing(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10))
    entity_id = db.Column(db.Integer, db.ForeignKey('entity.id'))

    def __init__(self, name, entity_id):
        self.name = name
        self.entity_id = entity_id
As soon as an Entry or a Thing is created, entity_id should reference a newly created row in Entity. To solve this I create an entity first, commit it, then fetch the id of the newly created entity and create the Entry or Thing with that id.
# Create a new Entity object
entity = Entity(entity_title)
# Commit the entity
db.session.add(entity)
db.session.commit()
# Get the entity ID
entity_id = Entity.query.filter_by(name=entity_title).first().id
# Create a new Entry/Thing object
entry = Entry(name, entity_id)
# Commit the entry
db.session.add(entry)
db.session.commit()
This way seems quite inefficient. Is there maybe another way to solve this?
Instead of dealing with foreign keys manually, use relationships: they serve this exact purpose, letting you work with in-memory objects and their links to each other without having to flush() in the correct order to figure out which foreign keys go where.
class Entry(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10))
    entity_id = db.Column(db.Integer, db.ForeignKey('entity.id'))
    entity = relationship(Entity)

entry = Entry(name="foo", entity=Entity(name="foo"))
db.session.add(entry)
db.session.commit()
Notice how you don't have to deal with entity_id at all?
In your particular situation, you may also find inheritance useful:
class Entity(db.Model):
    __tablename__ = "entities"

    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(10))
    type = db.Column(db.Enum("entry", "thing"))

    def __init__(self, name):
        self.name = name

    __mapper_args__ = {
        "polymorphic_on": type,
    }

class Entry(Entity):
    __tablename__ = "entries"
    id = db.Column(db.Integer, db.ForeignKey('entities.id'), primary_key=True)
    __mapper_args__ = {"polymorphic_identity": "entry"}

class Thing(Entity):
    __tablename__ = "things"
    id = db.Column(db.Integer, db.ForeignKey('entities.id'), primary_key=True)
    __mapper_args__ = {"polymorphic_identity": "thing"}

entry = Entry(name="foo")  # automatically deals with the entities table
db.session.add(entry)
db.session.commit()
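A runnable version of the same idea in plain SQLAlchemy, with the Flask-SQLAlchemy db.* wrappers dropped (model and engine names here are illustrative):

```python
# Sketch: with joined-table inheritance, one add() populates both the
# child table and the shared parent "entities" table.
from sqlalchemy import Column, Enum, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Entity(Base):
    __tablename__ = 'entities'
    id = Column(Integer, primary_key=True)
    name = Column(String(10))
    type = Column(Enum('entry', 'thing', name='entity_type'))
    __mapper_args__ = {'polymorphic_on': type}

class Entry(Entity):
    __tablename__ = 'entries'
    id = Column(Integer, ForeignKey('entities.id'), primary_key=True)
    __mapper_args__ = {'polymorphic_identity': 'entry'}

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(Entry(name='foo'))  # no manual entity_id bookkeeping
session.commit()

# Querying the parent class returns the Entry, loaded polymorphically.
entity_names = [e.name for e in session.query(Entity)]
kinds = [type(e).__name__ for e in session.query(Entity)]
```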
You can use session.flush()
entity = Entity(entity_title)
# Commit the entity
db.session.add(entity)
db.session.flush()
After that, the generated id is available in entity.id.
session.flush() emits the pending SQL (so the row exists and entity.id is populated), but it does not commit your session.
In your case
# Create a new Entity object
entity = Entity(entity_title)
# Add the entity and flush it
db.session.add(entity)
db.session.flush()

# No longer needed:
# entity_id = Entity.query.filter_by(name=entity_title).first().id

# Create a new Entry/Thing object using the flushed id
entry = Entry(name, entity.id)
# Commit the entry
db.session.add(entry)
db.session.commit()
Docs:
http://docs.sqlalchemy.org/en/latest/orm/session_api.html#sqlalchemy.orm.session.Session.flush
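To make the flush-vs-commit distinction concrete, a small self-contained sketch (the model and names are illustrative):

```python
# Sketch: flush() emits the INSERT so the autoincrement id becomes
# available, but the transaction itself is not committed yet.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Entity(Base):
    __tablename__ = 'entity'
    id = Column(Integer, primary_key=True)
    name = Column(String(10))

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

entity = Entity(name='demo')
session.add(entity)
id_before_flush = entity.id   # still None: nothing sent to the DB yet
session.flush()
id_after_flush = entity.id    # populated by the flush, before commit
session.commit()
```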
Using Python 3.5 and SQLAlchemy 1.0.14 (ORM).
I have a table of items declared as such:
from sqlalchemy.ext.declarative.api import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    type = Column(String)
    # other non relevant attributes
My Items can be of many different types, the type identifier being stored in type.
For a few of those object types, I need specific methods or attributes to be available.
To achieve that I tried to use single-table inheritance, with SpecialisedItem as a subclass of Item:
class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'specialitem',
    }

    def specialised_method(self):
        return "I am special"
Now when I load my items, I'd want all specialised items (having type=='specialitem') to be loaded as such, while any other type value would result in the parent class Item being loaded.
That doesn't work: I get AssertionError: No such polymorphic_identity 'normal' is defined when loading the items.
I would like to avoid creating inherited classes that do nothing, just to cover all possible type values; instead, any unmapped type should fall back to the parent class Item.
Is there any way to achieve that effect?
Minimal test case for reference:
from sqlalchemy.engine import create_engine
from sqlalchemy.ext.declarative.api import declarative_base
from sqlalchemy.orm.session import sessionmaker
from sqlalchemy.sql.schema import Column
from sqlalchemy.sql.sqltypes import Integer, String

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'special',
    }
    specialAttribute = Column(String)

    def specialised_method(self):
        return "I am special"

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
session = Session()
session.add(Item(type='normal'))
session.add(Item(type='special'))
session.commit()

# loading only specialized items works
for item in session.query(Item).filter_by(type="special"):
    print(item.specialised_method())

# loading other items fails
for item in session.query(Item):
    print(item.type)
Thanks,
Guillaume
A mapping of "polymorphic identity" identifiers to Mapper instances is stored in the polymorphic_map dict. You can install a custom polymorphic_map that returns the parent class's mapper for undefined polymorphic identities.
from collections import defaultdict

from sqlalchemy.engine import create_engine
from sqlalchemy.ext.declarative.api import declarative_base
from sqlalchemy.orm.session import sessionmaker
from sqlalchemy.sql.schema import Column
from sqlalchemy.sql.sqltypes import Integer, String
from sqlalchemy import event

Base = declarative_base()

class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    # other non relevant attributes

    __mapper_args__ = {
        'polymorphic_on': type,
    }

class SpecialisedItem(Item):
    __mapper_args__ = {
        'polymorphic_identity': 'special',
    }
    specialAttribute = Column(String)

    def specialised_method(self):
        return "I am special"

# http://docs.sqlalchemy.org/en/rel_1_1/orm/events.html#sqlalchemy.orm.events.MapperEvents.mapper_configured
@event.listens_for(Item, 'mapper_configured')
def receive_mapper_configured(mapper, class_):
    # unknown discriminator values fall back to the base mapper
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    # to prevent 'incompatible polymorphic identity' warning, not mandatory
    mapper._validate_polymorphic_identity = None
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
session = Session()
session.add(Item(type='normal'))
session.add(Item(type='special'))
session.commit()

# loading only specialized items works
for item in session.query(Item).filter_by(type="special"):
    print(item.specialised_method())

# loading other items now falls back to the parent Item class
for item in session.query(Item):
    print(item.type)
A reusable decorator solution, based on @r-m-n's answer. The custom class is replaced with collections.defaultdict, which does the same thing.
from collections import defaultdict

from sqlalchemy import event

def receive_mapper_configured(mapper, class_):
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    # to prevent 'incompatible polymorphic identity' warning, not necessary
    mapper._validate_polymorphic_identity = None

def polymorphic_fallback(mapper_klass):
    event.listens_for(mapper_klass, 'mapper_configured')(receive_mapper_configured)
    return mapper_klass
Then in your code you can just add this decorator to base classes:
@polymorphic_fallback
class Item:
    ...

class SpecificItem(Item):
    ...
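Putting the decorator together with a model like the one in the question gives a self-contained sketch. Note that it leans on the private _validate_polymorphic_identity attribute and on replacing polymorphic_map, so treat it as version-dependent rather than a guaranteed API:

```python
# Sketch of the fallback decorator end to end; relies on private mapper
# internals, so behaviour may vary across SQLAlchemy versions.
from collections import defaultdict

from sqlalchemy import Column, Integer, String, create_engine, event
from sqlalchemy.orm import declarative_base, sessionmaker

def receive_mapper_configured(mapper, class_):
    # unknown discriminator values fall back to this (the base) mapper
    mapper.polymorphic_map = defaultdict(lambda: mapper, mapper.polymorphic_map)
    mapper._validate_polymorphic_identity = None  # private API: silences a warning

def polymorphic_fallback(mapper_klass):
    event.listens_for(mapper_klass, 'mapper_configured')(receive_mapper_configured)
    return mapper_klass

Base = declarative_base()

@polymorphic_fallback
class Item(Base):
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    type = Column(String, index=True)
    __mapper_args__ = {'polymorphic_on': type}

class SpecialisedItem(Item):
    __mapper_args__ = {'polymorphic_identity': 'special'}

    def specialised_method(self):
        return "I am special"

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Item(type='normal'), Item(type='special')])
session.commit()
session.expunge_all()  # force fresh loads, bypassing the identity map

loaded = {item.type: type(item).__name__ for item in session.query(Item)}
```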
So I want to execute a filter on all Columns of my Database Model which uses table inheritance. I am by no means sure if this is actually do-able or not.
To get started let's use the same inheritance example from the SQLAlchemy Doc just slightly modified. I've omitted the imports here.
class Employee(Base):
    __tablename__ = 'employee'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    type = Column(String(50))

    __mapper_args__ = {
        'polymorphic_identity': 'employee',
        'polymorphic_on': type
    }

    @classmethod
    def get_all(cls, session, query):
        _filters = []
        for prop in class_mapper(cls).iterate_properties:
            if isinstance(prop, ColumnProperty):
                _col = prop.columns[0]
                _attr = getattr(cls, _col.name)
                _filters.append(cast(_attr, String).match(query))
        result = session.query(cls)
        result = result.filter(or_(*_filters))
        return result.all()

class Engineer(Employee):
    __tablename__ = 'engineer'
    id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
    engineer_name = Column(String(30))
    foo = Column(String(10))

    __mapper_args__ = {
        'polymorphic_identity': 'engineer',
    }

class Manager(Employee):
    __tablename__ = 'manager'
    id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
    manager_name = Column(String(30))
    bar = Column(String(20))

    __mapper_args__ = {
        'polymorphic_identity': 'manager',
    }
Now let's say I would like to query all Employee rows where any of the fields matches a query string. The get_all method shown above only searches the columns known to the Employee class itself.
Is there some way to query in all columns of the entire inheritance chain?
It's pretty ugly, but one way would be to find all the subclasses that inherit from Employee, then left join those tables and add their columns to the query.
How to get subclasses:
https://stackoverflow.com/a/5883218/443900
Have not tested this, but something like this should work.
@classmethod
def get_all(cls, session, query):
    _filters = []
    for prop in class_mapper(cls).iterate_properties:
        if isinstance(prop, ColumnProperty):
            _col = prop.columns[0]
            _attr = getattr(cls, _col.name)
            _filters.append(cast(_attr, String).match(query))
    result = session.query(cls)
    result = result.filter(or_(*_filters))
    # get the subclasses
    subclasses = set()
    for child in cls.__subclasses__():
        if child not in subclasses:
            subclasses.add(child)
            # join the subclass
            result = result.outerjoin(child)
            # recurse to get the columns from the subclass
            result = child.get_all(session, result)
    # return a query, not a result, to allow for the recursion.
    # you might need to tweak this.
    return result
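An alternative sketch that avoids the recursion altogether: walk every mapper in the hierarchy via self_and_descendants, collect the string columns, and run a single query over with_polymorphic (which outer-joins the subclass tables). like() is used here instead of match(), since MATCH is not available on all backends; names and data are illustrative:

```python
# Sketch: gather string columns from every mapper in the hierarchy and
# search them all in one polymorphic query.
from sqlalchemy import (Column, ForeignKey, Integer, String, create_engine,
                        inspect, or_)
from sqlalchemy.orm import declarative_base, sessionmaker, with_polymorphic

Base = declarative_base()

class Employee(Base):
    __tablename__ = 'employee'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    type = Column(String(50))
    __mapper_args__ = {'polymorphic_identity': 'employee', 'polymorphic_on': type}

class Engineer(Employee):
    __tablename__ = 'engineer'
    id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
    engineer_name = Column(String(30))
    __mapper_args__ = {'polymorphic_identity': 'engineer'}

class Manager(Employee):
    __tablename__ = 'manager'
    id = Column(Integer, ForeignKey('employee.id'), primary_key=True)
    manager_name = Column(String(30))
    __mapper_args__ = {'polymorphic_identity': 'manager'}

def search_all(session, term):
    poly = with_polymorphic(Employee, '*')  # outer-joins engineer and manager
    filters = [
        col.like(f'%{term}%')
        for mapper in inspect(Employee).self_and_descendants
        for col in mapper.local_table.c
        if isinstance(col.type, String)
    ]
    return session.query(poly).filter(or_(*filters)).all()

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([
    Engineer(name='alice', engineer_name='ally'),
    Manager(name='bob', manager_name='bobby'),
])
session.commit()

found = search_all(session, 'bobby')
```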
I have two related classes as below:
class IP(Base):
    __tablename__ = 'ip'

    id = Column(Integer, primary_key=True)
    value = Column(String, unique=True)

    topics = relationship('Topic')

class Topic(Base):
    __tablename__ = 'topic'

    id = Column(Integer, primary_key=True)
    value = Column(String)
    ip_id = Column(Integer, ForeignKey('ip.id'))

    ip = relationship('IP')

if __name__ == '__main__':
    Base.metadata.create_all(engine)
    topics = [
        Topic(value='t1', ip=IP(value='239.255.48.1')),
        Topic(value='t2', ip=IP(value='239.255.48.1')),
        Topic(value='t3', ip=IP(value='239.255.48.1'))
    ]
    session.add_all(topics)
The above doesn't work, as it tries to add separate IP entries with the same value. Is it possible to create the IP or get the existing one, so that I can use it like below?
topics = [
    Topic(value='t1', ip=create_or_get(value='239.255.48.1')),
    Topic(value='t2', ip=create_or_get(value='239.255.48.1')),
    Topic(value='t3', ip=create_or_get(value='239.255.48.1'))
]
Sure, just create the function:
def create_or_get(value):
    obj = session.query(IP).filter(IP.value == value).first()
    if not obj:
        obj = IP(value=value)
        session.add(obj)
    return obj
Of course, it needs a session, but if you use the scoped_session factory this is straightforward. Alternatively, you might look into events, but that gets too complicated for the problem being solved.
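As a concrete illustration, here is the helper exercised end to end. With the session's default autoflush, the second create_or_get() call flushes the pending IP row and then finds it, so only one row is ever created (the setup code is illustrative):

```python
# Sketch: autoflush means the second lookup flushes the pending IP and
# finds it, so every Topic ends up sharing one IP row.
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class IP(Base):
    __tablename__ = 'ip'
    id = Column(Integer, primary_key=True)
    value = Column(String, unique=True)

class Topic(Base):
    __tablename__ = 'topic'
    id = Column(Integer, primary_key=True)
    value = Column(String)
    ip_id = Column(Integer, ForeignKey('ip.id'))
    ip = relationship('IP')

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

def create_or_get(value):
    obj = session.query(IP).filter(IP.value == value).first()
    if not obj:
        obj = IP(value=value)
        session.add(obj)
    return obj

topics = [Topic(value=v, ip=create_or_get('239.255.48.1'))
          for v in ('t1', 't2', 't3')]
session.add_all(topics)
session.commit()

ip_count = session.query(IP).count()
ip_ids = {t.ip_id for t in topics}
```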
The following totally incomplete snippet defines a basic SQLAlchemy relationship using declarative syntax...
Base = declarative_base()

class Movie(Base):
    __tablename__ = 'movies'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    director = relationship("People", uselist=False)

class People(Base):
    __tablename__ = 'people'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
To access the director name it would be something like:
assert isinstance(movie, Movie) # <-- retrieved with query or whatever
director_name = movie.director.name
If, for convenience, I always want the director relationship to just give me the director's name, rather than a People instance, how do you do this? eg: it should work just like this:
assert isinstance(movie, Movie)
director_name = movie.director # <-- should get the string directly
I'm 99% sure I've done this before but can't find any reference code or documentation on it anymore. I'm going a bit crazy trying to locate it. Stack Overflow will be a good/permanent reference location for the answer.
The association proxy is used for all kinds of "object reference -> attribute reference" transformations on the Python side. The docs have been newly updated and rewritten:
http://www.sqlalchemy.org/docs/orm/extensions/associationproxy.html
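A sketch of what that looks like here. The original snippet omits the foreign key column, so a hypothetical director_id is added to make the relationship resolvable:

```python
# Sketch: association_proxy makes movie.director_name read through to
# the related People row's name attribute.
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class People(Base):
    __tablename__ = 'people'
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

class Movie(Base):
    __tablename__ = 'movies'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    director_id = Column(Integer, ForeignKey('people.id'))  # hypothetical FK
    director = relationship('People', uselist=False)
    # proxies movie.director_name -> movie.director.name
    director_name = association_proxy('director', 'name')

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(Movie(name='Alien', director=People(name='Ridley Scott')))
session.commit()

name = session.query(Movie).first().director_name
```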
What if you use property?
class Movie(Base):
    __tablename__ = 'movies'
    id = Column(Integer, primary_key=True)
    name = Column(String)

    _director = relationship("People", uselist=False)

    @property
    def director_name(self):
        return self._director.name