Is there anything wrong with inheritance in which the child class is only used to present the parent's values in a different way?
Example:
class Parent(db.Model):
    __tablename__ = u'parent'
    parent_entry_id = db.Column(db.Integer, primary_key=True)
    parent_entry_value = db.Column(db.BigInteger)

class Child(Parent):
    __tablename__ = u'child'

    @property
    def extra_value(self):
        return unicode(self.parent_entry_id) + unicode(self.parent_entry_value)
No new columns will be added to the Child class, so Joined Table, Single Table, or Concrete Table Inheritance seems unnecessary to me.
If you're simply changing how you display the data from the class, I'm pretty sure you don't need a __tablename__.
Additionally, though I don't know your exact problem domain, I would simply add the property to the original class. You could argue that you're adding extra behavior to the original class, but that seems like a flimsy argument in this case.
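A minimal sketch of that suggestion, using plain SQLAlchemy declarative in place of the question's Flask-SQLAlchemy `db.Model` (class and column names are taken from the question; `str` stands in for Python 2's `unicode`):

```python
from sqlalchemy import BigInteger, Column, Integer
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    parent_entry_id = Column(Integer, primary_key=True)
    parent_entry_value = Column(BigInteger)

    @property
    def extra_value(self):
        # Alternate presentation of the stored values; no subclass needed.
        return str(self.parent_entry_id) + str(self.parent_entry_value)

p = Parent(parent_entry_id=1, parent_entry_value=42)
print(p.extra_value)
```

The property never touches the database, so there is no mapping or inheritance configuration to worry about.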
Might be a bit of an inelegant question title, but hopefully this skeleton setup explains things a little more clearly:
class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Number(Base):
    __tablename__ = 'number'
    id = Column(Integer, primary_key=True)
    users_id = Column(Integer, ForeignKey('user.id'))
    user = relationship('User', backref=backref('numbers'))
    value = Column(String)

joe = User(name='Joe')
joe.numbers = [
    # Here we need to know that the class we want is named "Number".
    # However, in some contexts (think abstract base classes or mixins) we might
    # not necessarily know that, or have a way to import/reference it.
    Number(value='212-555-1234'),
    Number(value='201-555-1111'),
    Number(value='917-555-8989')]
Basically there is a table of Users, and each User can have an arbitrary number of Numbers associated with it.
Is there a clean way, through the attributes of User alone, to find a reference to the Number class (and be able to create instances from it) without importing Number directly? The best I've come up with, with considerable influence from this question, is:
from sqlalchemy.orm import object_mapper

number_class = object_mapper(joe).relationships['numbers'].mapper.class_
joe.numbers = [number_class(value='212-555-1234') ...]
... but this seems rather obtuse, and I'm not fully comfortable relying on it.
The most valid reason I can think to want to be able to do this is in the case of mixins -- if there were some base class that needed the ability to append new numbers to a user without concrete knowledge of what class to use.
There are a few ways to do this, but I'd argue the easiest (and clean enough) approach is to put what you need on the User class, because User is already implementation-bound to Number: it imports and uses Number when creating the relationship. So you could add a User.add_number() method that takes the arguments for a new number, creates the Number instance, and appends it to self.numbers.
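A sketch of that idea, using the question's models; the `add_number()` method name is my invention, and plain `declarative_base` stands in for whatever `Base` the question uses:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import backref, declarative_base, relationship

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    name = Column(String)

    def add_number(self, value):
        # User is already implementation-bound to Number, so a factory
        # method here spares callers (including mixins) from needing
        # the class at all.
        number = Number(value=value)
        self.numbers.append(number)
        return number

class Number(Base):
    __tablename__ = 'number'
    id = Column(Integer, primary_key=True)
    users_id = Column(Integer, ForeignKey('user.id'))
    user = relationship('User', backref=backref('numbers'))
    value = Column(String)

joe = User(name='Joe')
joe.add_number('212-555-1234')
```

A mixin can then call `self.add_number(...)` without ever importing or naming Number.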
I'm trying to define a one-to-many relationship with SQLAlchemy, where a Parent has many Children:
class Parent(Base):
    __tablename__ = "parent"
    id = Column(String, primary_key=True)
    children = relationship("Child")

class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    feed_type_id = Column(String, ForeignKey("parent.id"))
From the business rules, a Parent has only a few Children (between 10 and 30), and most of the time I will need access to all of them, so I think it's a good idea for relationship() to load all the children into memory for performance (first question: am I right?). Occasionally, though, I need to get one particular child, and I don't want to do something like:
def search_bar_attr(some_value):
    for bar in foo.bars:
        if bar.attr == some_value:
            return bar
lazy="dynamic" returns a query object that allows further filtering, but I think it's slow compared to an eagerly loaded collection because a dynamic relationship queries the database every time.
Second question: Is there some configuration that covers all my needs?
You can construct the same query that lazy="dynamic" does by using .with_parent.
from sqlalchemy.orm import object_session

class Parent(Base):
    ...

    @property
    def children_dynamic(self):
        return object_session(self).query(Child).with_parent(self, Parent.children)
You can even add a function to reduce boilerplate if you have to write a lot of these:
def dynamicize(rel):
    @property
    def _getter(self):
        # Query the related (child) class, constrained to this parent row.
        return object_session(self).query(rel.mapper.class_).with_parent(self, rel)
    return _getter

class Parent(Base):
    ...

    children = relationship("Child")
    children_dynamic = dynamicize(children)
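A self-contained sketch of the property approach against an in-memory SQLite database; the `children_query` name is illustrative, and the standalone `with_parent()` criterion function is used so the snippet stays valid on newer SQLAlchemy versions:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import (Session, declarative_base, object_session,
                            relationship, with_parent)

Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(String, primary_key=True)
    children = relationship("Child")

    @property
    def children_query(self):
        # The same SELECT lazy="dynamic" would issue, without changing
        # how .children itself loads.
        return (object_session(self)
                .query(Child)
                .filter(with_parent(self, Parent.children)))

class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    attr = Column(String)
    feed_type_id = Column(String, ForeignKey("parent.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = Session(engine)
session.add(Parent(id="p1", children=[Child(attr="a"), Child(attr="b")]))
session.commit()

p = session.query(Parent).one()
match = p.children_query.filter(Child.attr == "b").one()
```

So `p.children` stays a plain eagerly or lazily loaded list, while `p.children_query` gives you a filterable query when you need one.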
You don't need a helper function like that one; you don't even need to load all of the child objects into memory.
When you want to search for a child with a certain attribute, you can do:
# get a session object, usually with sessionmaker() configured to bind to your engine instance

# returns a list of all Child objects that match the filter:
c = session.query(Child).filter_by(some_attribute="some value here").all()

# or, to get a child that belongs to a certain parent and has a certain attribute,
# first get the parent object (p):
c = session.query(Child).filter_by(feed_type_id=p.id).filter_by(some_attr="some attribute of the children of the parent object p").all()
No one strategy will give you everything. However, you can choose a default strategy and then override it.
My recommendation would be to:
Add lazy="joined" to your relationship so that by default, loading a Parent also loads all of its children.

In cases where you want to query for a set of children based on properties of their parents but don't need the parent objects, use the join function on the query with filters referring to both the parent and the child.

In cases where you need a query similar to what lazy="dynamic" would give you, use a loader option such as sqlalchemy.orm.lazyload to override the joined eager loading for that query, and then use with_parent to construct a query like the one you would have gotten with lazy="dynamic".
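The three recommendations above can be sketched together against an in-memory SQLite database; `lazyload` is one of several loader options that can override the joined-eager default per query:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, lazyload, relationship

Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(String, primary_key=True)
    # Default strategy: eager-load the children with a JOIN on every Parent load.
    children = relationship("Child", lazy="joined")

class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    attr = Column(String)
    feed_type_id = Column(String, ForeignKey("parent.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = Session(engine)
session.add(Parent(id="p1", children=[Child(attr="x"), Child(attr="y")]))
session.commit()

# Children filtered on both tables via an explicit join; no Parent
# objects or collections are populated:
kids = (session.query(Child)
        .join(Parent, Child.feed_type_id == Parent.id)
        .filter(Parent.id == "p1", Child.attr == "y")
        .all())

# Per-query override of the joined-eager default:
p = session.query(Parent).options(lazyload(Parent.children)).one()
```

With the override in place, `p.children` falls back to an ordinary lazy load if you touch it, instead of having been joined in up front.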
For example, using Flask-SQLAlchemy and jsontools to serialize to JSON as shown here, and given a model like this:
class Engine(db.Model):
    __tablename__ = "engines"
    id = db.Column(db.Integer, primary_key=True)
    this = db.Column(db.String(10))
    that = db.Column(db.String(10))
    parts = db.relationship("Part")

    schema = ["id"
             , "this"
             , "that"
             , "parts"
             ]

    def __json__(self):
        return self.schema

class Part(db.Model):
    __tablename__ = "parts"
    id = db.Column(db.Integer, primary_key=True)
    engine_id = db.Column(db.Integer, db.ForeignKey("engines.id"))
    code = db.Column(db.String(10))

    def __json__(self):
        return ["id", "code"]
How do I change the schema attribute before query so that it takes effect on the return data?
enginelist = db.session.query(Engine).all()
return enginelist
So far, I have succeeded with subclassing and single-table inheritance like so:
class Engine_smallschema(Engine):
    __mapper_args__ = {'polymorphic_identity': 'smallschema'}

    schema = ["id"
             , "this"
             , "that"
             ]
and
enginelist = db.session.query(Engine_smallschema).all()
return enginelist
...but it seems there should be a better way, without needing to subclass (and I'm not sure subclassing is wise). I've tried various things, such as setting an attribute or calling a method to set an internal variable. The problem is that the query doesn't accept the instance object I hand it, and I don't know SQLAlchemy well enough yet to know whether queries can be executed against pre-made instances of these classes.
I can also loop through the returned objects, setting a new schema, and get the wanted JSON, but this isn't a solution for me because it launches new queries (I usually request the small dataset first).
Any other ideas?
The JSON serialization takes place in Flask, not in SQLAlchemy, so the __json__ method is not consulted until after you return from your view function. This therefore has nothing to do with SQLAlchemy; it is governed by the custom encoding function, which presumably you can change.
I would actually suggest not attempting to do it this way if you have different sets of attributes you want to serialize for a model. Setting a magic attribute on an instance that affects how it's serialized violates the principle of least surprise. Instead, you can, for example, make a Serializer class that you can initialize with the list of fields you want to be serialized, then pass your Engine to it to produce a dict that can be readily converted to JSON.
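A minimal sketch of such a Serializer; the class and method names here are hypothetical, not part of Flask or SQLAlchemy:

```python
class Serializer:
    """Serialize only an explicit whitelist of attributes from a model."""

    def __init__(self, fields):
        self.fields = fields

    def to_dict(self, obj):
        # Pull only the requested attributes off the model instance;
        # the result is a plain dict, ready for json.dumps or jsonify.
        return {name: getattr(obj, name) for name in self.fields}

small = Serializer(["id", "this", "that"])
# engines = db.session.query(Engine).all()
# payload = [small.to_dict(e) for e in engines]
```

The field list lives with the caller, not on the instance, so different views can serialize the same Engine objects differently without mutating them.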
If you insist on doing it your way, you can probably just do this:
for e in enginelist:
    e.__json__ = lambda: ["id", "this", "that"]
Of course, you can change __json__ to be a property instead if you want to avoid the lambda.
I have a simple SqlAlchemy application:
import sqlalchemy as sa
import sqlalchemy.ext.declarative as dec
import sqlalchemy.engine.url as saurl
import sqlalchemy.orm as saorm
import sqlalchemy.schema as sch
import abc

class ItemTable():
    __tablename__ = 'book_items'

    @abc.abstractmethod
    def _source_key(self):
        pass

    rowid = sa.Column(sa.Integer, sa.Sequence('book_page_id_seq'), primary_key=True)
    src = sa.Column(sa.String, nullable=False, index=True, default=_source_key)
    dlState = sa.Column(sa.Integer, nullable=False, index=True, default=0)
    url = sa.Column(sa.String, nullable=False, unique=True, index=True)
    # [...snip...]

Base = dec.declarative_base(cls=ItemTable)

class TestItem(Base):
    _source_key = 'test'

    def __init__(self, *args, **kwds):
        # Set the default value of `src`. Somehow, despite the fact that
        # `self.src` is being set to a string, it still works.
        self.src = self._source_key
        print(self)
        print(type(self))
        print(super())
        print("IsInstance of ItemTable", isinstance(self, ItemTable))
        print("IsInstance of Table", isinstance(self, sch.Table))
        super().__init__(*args, **kwds)

def test():
    test = TestItem()

if __name__ == "__main__":
    test()
The idea is that the table schema is defined in ItemTable, and certain member attributes are declared abstract. This ensures that child classes define those attributes, which are then used as column defaults by the instantiated child class via some __init__() hijinks.
Anyways, this much works.
The issue I'm having is that I cannot for the life of me figure out what the hell the parents of TestItem(Base) are. I know it inherits from ItemTable(), but the intermediate inheritance of dec.declarative_base(cls=ItemTable) is inserting a whole bunch of methods and "stuff" into TestItem(Base), and I don't know what is there, or where it's coming from.
I'm pretty sure there are some functions that would make my life a LOT easier with regard to modifying a row in the table, but since I don't know what TestItem(Base) is actually inheriting from, I have no idea where to look at all in the SqlAlchemy documentation.
The documentation does say about declarative_base():
The new base class will be given a metaclass that produces appropriate
Table objects and makes the appropriate mapper() calls based on the
information provided declaratively in the class and any subclasses of
the class.
Which makes me think that possibly TestItem(Base) is a child-class of Table, but isinstance(self, sch.Table) returns false, so either it's not, or the metaclass muckery is completely breaking isinstance.
Also, TestItem(Base) being a child-class of Table wouldn't make any sense logically, because you get instances of TestItem(Base) returned when you query, with each instance representing a row.
Anyways, I'm thoroughly confused.
Update:
@Veedrac in the comments pointed out that ClassName.mro() gives you the full inheritance chain. In this case:
TestItem.mro() ->
    [<class '__main__.TestItem'>, <class 'sqlalchemy.ext.declarative.api.Base'>, <class '__main__.ItemTable'>, <class 'object'>]
The fun thing here is that there are zero instances of sqlalchemy.ext.declarative.api.Base anywhere in the SqlAlchemy documentation.
The only thing documented along the sqlalchemy.ext.declarative.api path at all is _declarative_constructor, and there are ~2 unhelpful sentences there.
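One way to poke at what the declarative machinery attached, assuming SQLAlchemy 1.4+ (where declarative_base also lives in sqlalchemy.orm): `inspect()` on a declarative class returns the Mapper that drives queries and row mapping, which is where most of the mystery "stuff" comes from.

```python
import sqlalchemy as sa
import sqlalchemy.orm as saorm

Base = saorm.declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = sa.Column(sa.Integer, primary_key=True)

mapper = sa.inspect(Item)                    # the Mapper, not a Table
print(list(mapper.columns.keys()))           # column names the class maps
print(Item.__table__ is mapper.local_table)  # the Table hangs off the class
print(Item.mro())                            # full inheritance chain
```

So the class is never a Table subclass; the metaclass builds a separate Table object (reachable as `Item.__table__`) and a Mapper that links the two, which is why `isinstance(self, sch.Table)` returns False.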
Well, the end solution to my issues here was to just flat-out dump SqlAlchemy entirely.
I know exactly how to achieve what I want using SQL. I assumed that SqlAlchemy would make things easier, but it just led to inheritance nightmares and lots of bizarre issues.
I'm sure there is a way to manage what I want in SqlAlchemy, but the documentation is so terrible that I'm unable to find it.
Assuming I have a class called Customer that is defined in SQLAlchemy to represent the customer table, I want to write a search method so that ...
    results = Customer.search(query)
will return the results based on the method. Do I want to do this as a @classmethod?
@classmethod
def search(cls, query):
    # do some stuff
    return results
Can I use cls in place of DBSession.query?
    DBSession.query(Customer).filter(...)
    cls.query(Customer).filter(...)
To use cls.query, you have to assign a query_property to your model classes!
You probably want to use this in other model classes as well, so you might want to do that in your model Base class somewhere in your model/__init__.py:
    Base.query = Session.query_property()
Then you can simply write:
    cls.query.filter(...)
Note that you no longer specify the class to query for; that is handled automatically by the query_property mechanism.
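A self-contained sketch of that setup, assuming a scoped_session bound to an in-memory SQLite engine (the table and data are illustrative):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, scoped_session, sessionmaker

engine = create_engine("sqlite://")
Session = scoped_session(sessionmaker(bind=engine))

Base = declarative_base()
Base.query = Session.query_property()  # every model now has cls.query

class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String)

    @classmethod
    def search(cls, name):
        # No need to name the class inside the query; cls.query is
        # already bound to Customer via the query_property.
        return cls.query.filter(cls.name == name).all()

Base.metadata.create_all(engine)
Session.add(Customer(name="Alice"))
Session.commit()
results = Customer.search("Alice")
```

Because the property lives on Base, every model class that inherits from it gets the same cls.query for free.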
I just recently wrote some similar code, following this reference.
class Users(Base):
    __tablename__ = 'users'

    @classmethod
    def by_id(cls, userid):
        return Session.query(Users).filter(Users.id==userid).first()
So the answer to your first question appears to be "yes". Not sure about the second, as I didn't substitute cls for DBSession.
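A runnable sketch of the same pattern with cls substituted for the hard-coded class name (in-memory SQLite and a plain sessionmaker are assumptions of this example):

```python
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

engine = create_engine("sqlite://")
Session = sessionmaker(bind=engine)()

Base = declarative_base()

class Users(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)

    @classmethod
    def by_id(cls, userid):
        # cls replaces the hard-coded Users, so subclasses and renames
        # keep working without touching the method body.
        return Session.query(cls).filter(cls.id == userid).first()

Base.metadata.create_all(engine)
Session.add(Users(id=3))
Session.commit()
u = Users.by_id(3)
```

Inside a @classmethod, cls and the literal class name are interchangeable here; cls is just the more maintainable spelling.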