I am trying to create nested Python classes using the 3 argument type function. I want to construct an analogue of this:
In [15]: class CC:
    ...:     class DD:
    ...:         pass
    ...:
The naive attempt is
In [17]: AA = type('AA', (), {'BB': type('BB', (), {})})
but that is not quite right, since BB is actually created outside of, and before, AA, and is only put inside AA later.
The difference is demonstrated by:
In [18]: AA.BB
Out[18]: __main__.BB
In [16]: CC.DD
Out[16]: __main__.CC.DD
How can I create nested classes reflectively/dynamically that are completely equivalent to nested definitions?
I want to use this to reflectively generate a graphene-sqlalchemy API. The idiom there is to create an outer Graphene class with an inner Meta class pointing to the corresponding SQLAlchemy model class (http://docs.graphene-python.org/projects/sqlalchemy/en/latest/tutorial/#schema), e.g.:
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class UserModel(Base):
    __tablename__ = 'department'

    id = Column(Integer, primary_key=True)
    name = Column(String)
    last_name = Column(String)
from graphene_sqlalchemy import SQLAlchemyObjectType

class User(SQLAlchemyObjectType):
    class Meta:
        model = UserModel
        # only return specified fields
        only_fields = ("name",)
        # exclude specified fields
        exclude_fields = ("last_name",)
The User class above is pretty cookie-cutter and should be constructible programmatically from the UserModel class. This should then be doable for an entire schema.
In the Python 3.7 type documentation, it says:
The [1st argument] is the class name and becomes the __name__ attribute.
So, I think the only difference between your 2 examples is that AA.BB.__name__ is AA.BB in the 1st example, and BB in the 2nd. If you want the __name__ to be the same, you can do this:
AA = type('AA', (), {'BB': type('AA.BB', (), {})})
Otherwise, as far as I can tell, both examples are functionally equivalent.
Actually, the only difference you get there is the __qualname__ class attribute.
The __qualname__ is created by the code object running the class body and is passed as an ordinary attribute to the metaclass __new__ method (usually type).
Therefore, all you need to get this level of equivalence is to explicitly pass a proper __qualname__ when creating the class:
In [9]: AA = type('AA', (), {'BB': type('BB', (), {'__qualname__': 'AA.BB'})})
In [10]: AA
Out[10]: __main__.AA
In [11]: AA.BB
Out[11]: __main__.AA.BB
This will likely be enough for you, but beware that there are more subtle differences between dynamically created classes and declaratively defined ones. One of the last metaclass questions I answered is exactly about that, but with the opposite approach: the challenge there was to be able to distinguish classes created in the two styles.
Detect if class was defined declarative or functional - possible?
(warning: that contains what likely is the "deepest black magic" Python code I ever put in an answer here)
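Tying this back to the graphene-sqlalchemy goal, here is a minimal sketch (the helper name make_nested_class is mine, not from any library) that builds an outer class with an inner class and patches __qualname__ so the result looks like a declaratively nested definition:

```python
def make_nested_class(outer_name, inner_name, inner_attrs=None):
    """Build a class named outer_name containing a nested class inner_name,
    with __qualname__ fixed up to match a static nested definition."""
    attrs = dict(inner_attrs or {})
    attrs['__qualname__'] = f'{outer_name}.{inner_name}'
    inner = type(inner_name, (), attrs)
    return type(outer_name, (), {inner_name: inner})

AA = make_nested_class('AA', 'BB')
print(AA.BB.__qualname__)  # AA.BB
```

For the graphene use case, the same pattern could build the inner Meta class by passing model, only_fields, etc. as inner_attrs.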
Is there anything wrong with inheritance in which the child class is only used to present the parent's values in a different way?
Example:
class Parent(db.Model):
    __tablename__ = u'parent'

    parent_entry_id = db.Column(db.Integer, primary_key=True)
    parent_entry_value = db.Column(db.BigInteger)

class Child(Parent):
    __tablename__ = u'child'

    @property
    def extra_value(self):
        return unicode(self.parent_entry_id) + unicode(self.parent_entry_value)
No new values will be added to the Child class, so Joined Table, Single Table, or Concrete Table Inheritance is, as far as I can tell, not needed.
If you're simply changing how you display the data from the class, I'm pretty sure you don't need a __tablename__.
Additionally, though I don't know your exact problem domain, I would simply just add the property on the original class. You could argue that you're adding some extra behavior to your original class, but that seems like a bit of a flimsy argument in this case.
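A minimal, SQLAlchemy-free sketch of that suggestion (the names here are illustrative): put the derived, read-only property straight on the parent class instead of introducing a subclass:

```python
class Parent:
    def __init__(self, entry_id, value):
        self.parent_entry_id = entry_id
        self.parent_entry_value = value

    @property
    def extra_value(self):
        # A derived view of existing fields; no extra table or subclass needed.
        return f'{self.parent_entry_id}{self.parent_entry_value}'

p = Parent(1, 100)
print(p.extra_value)  # 1100
```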
I have 3 marshmallow Schemas with Nested fields that form a dependency cycle/triangle. If I use the boilerplate from two-way nesting, I seem to have no problem.
from marshmallow import Schema, fields
class A(Schema):
id = fields.Integer()
b = fields.Nested('B')
class B(Schema):
id = fields.Integer()
c = fields.Nested('C')
class C(Schema):
id = fields.Integer()
a = fields.Nested('A')
However, I have my own, thin subclass of fields.Nested that looks something like the following:
from marshmallow import fields

class NestedRelationship(fields.Nested):
    def __init__(self, nested, include_data=True, **kwargs):
        super(NestedRelationship, self).__init__(nested, **kwargs)
        self.schema.is_relationship = True
        self.schema.include_relationship_data = include_data
When I change each Schema to use NestedRelationship instead of the native Nested type, I get:
marshmallow.exceptions.RegistryError: Class with name 'B' was not found. You may need to import the class.
NestedRelationship is a relatively thin subclass and I am surprised at the difference in behavior. Am I doing something wrong here? Am I not calling super appropriately?
The problem is with your extra code that accesses self.schema. When you define the A.b field, it tries to resolve the schema named 'B', which hasn't been defined yet. marshmallow.fields.Nested, on the other hand, does not try to resolve the schema at construction time and thus does not have this problem.
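One way around this is to defer the schema tweaks until the schema is first accessed rather than doing them in __init__. Here is a marshmallow-free sketch of that lazy-resolution idea (the Registry class is a stand-in for marshmallow's class registry, and all names are illustrative):

```python
class Registry:
    """Stand-in for marshmallow's class registry (illustrative only)."""
    classes = {}

class LazyNested:
    """Stores the target schema by name and resolves it only on first
    access, so forward references to not-yet-defined classes work."""
    def __init__(self, nested, include_data=True):
        self.nested = nested              # a string name; NOT resolved here
        self.include_data = include_data
        self._schema = None

    @property
    def schema(self):
        if self._schema is None:
            # Resolution happens lazily, long after __init__ ran.
            cls = Registry.classes[self.nested]
            self._schema = cls()
            self._schema.is_relationship = True
            self._schema.include_relationship_data = self.include_data
        return self._schema

# Forward reference: the field is created before 'B' exists.
field = LazyNested('B')

class B:
    pass

Registry.classes['B'] = B
print(field.schema.is_relationship)  # True
```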
I have a simple SqlAlchemy application:
import sqlalchemy as sa
import sqlalchemy.ext.declarative as dec
import sqlalchemy.engine.url as saurl
import sqlalchemy.orm as saorm
import sqlalchemy.schema as sch

import abc

class ItemTable():
    __tablename__ = 'book_items'

    @abc.abstractmethod
    def _source_key(self):
        pass

    rowid = sa.Column(sa.Integer, sa.Sequence('book_page_id_seq'), primary_key=True)
    src = sa.Column(sa.String, nullable=False, index=True, default=_source_key)
    dlState = sa.Column(sa.Integer, nullable=False, index=True, default=0)
    url = sa.Column(sa.String, nullable=False, unique=True, index=True)

    # [...snip...]
Base = dec.declarative_base(cls=ItemTable)

class TestItem(Base):
    _source_key = 'test'

    def __init__(self, *args, **kwds):
        # Set the default value of `src`. Somehow, despite the fact that
        # `self.src` is being set to a string, it still works.
        self.src = self._source_key

        print(self)
        print(type(self))
        print(super())
        print("IsInstance of ItemTable", isinstance(self, ItemTable))
        print("IsInstance of Table", isinstance(self, sch.Table))

        super().__init__(*args, **kwds)

def test():
    test = TestItem()

if __name__ == "__main__":
    test()
The idea is that the table schema is defined in ItemTable, and certain member attributes are declared abstract. This ensures child classes define those attributes, which are then used as value defaults by the instantiated child class via some __init__() hijinks.
Anyways, this much works.
The issue I'm having is that I cannot for the life of me figure out what the hell the parents of TestItem(Base) are. I know it inherits from ItemTable(), but the intermediate inheritance of dec.declarative_base(cls=ItemTable) is inserting a whole bunch of methods and "stuff" into TestItem(Base), and I don't know what is there, or where it's coming from.
I'm pretty sure there are some functions that would make my life a LOT easier with regard to modifying a row in the table, but since I don't know what TestItem(Base) is actually inheriting from, I have no idea where to look at all in the SqlAlchemy documentation.
The documentation does say about declarative_base():
The new base class will be given a metaclass that produces appropriate
Table objects and makes the appropriate mapper() calls based on the
information provided declaratively in the class and any subclasses of
the class.
Which makes me think that possibly TestItem(Base) is a child-class of Table, but isinstance(self, sch.Table) returns false, so either it's not, or the metaclass muckery is completely breaking isinstance.
Also, TestItem(Base) being a child-class of Table wouldn't make any sense logically, because you get instances of TestItem(Base) returned when you query, with each instance representing a row.
Anyways, I'm thoroughly confused.
Update:
@Veedrac in the comments pointed out that ClassName.mro() gives you the full inheritance chain. In this case:
TestItem.mro() ->
[<class '__main__.TestItem'>, <class 'sqlalchemy.ext.declarative.api.Base'>, <class '__main__.ItemTable'>, <class 'object'>]
The fun thing here is that there are zero instances of sqlalchemy.ext.declarative.api.Base anywhere in the SqlAlchemy documentation.
The only thing documented along the sqlalchemy.ext.declarative.api path at all is _declarative_constructor, and there are ~2 unhelpful sentences there.
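When the documentation is silent, a stdlib-only trick is to walk the MRO and diff each class's __dict__ to see which names each base actually contributes. A sketch with plain classes (the same function could be pointed at TestItem):

```python
def names_by_base(cls):
    """Map each class in cls's MRO to the attribute names it introduces,
    walking from most-derived to object so overrides are credited first."""
    seen, out = set(), {}
    for base in cls.mro():
        out[base.__name__] = sorted(set(vars(base)) - seen)
        seen |= set(vars(base))
    return out

class Mixin:
    def helper(self): pass

class Model(Mixin):
    def save(self): pass

contributions = names_by_base(Model)
print('save' in contributions['Model'])    # True
print('helper' in contributions['Mixin'])  # True
```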
Well, the end solution to my issues here was to just flat-out dump SqlAlchemy entirely.
I know exactly how to achieve what I want using SQL. I assumed that SqlAlchemy would make things easier, but it just led to inheritance nightmares and lots of bizarre issues.
I'm sure there is a way to manage what I want in SqlAlchemy, but the documentation is so terrible that I'm unable to find it.
I need to get a list of a model's properties which are actually relationships (that is, they were created by relationship()).
Say I have a model Thing in models.py:
class Thing(db.Model):
    id = db.Column(...)
    bar_id = db.Column(...)
    foo_id = db.Column(...)

    foo = db.relationship('Foo')
    bar = db.relationship('Bar')
Later on, I want to take models.Thing and get a list of relationship-properties, that is ['foo', 'bar'].
Currently I'm checking every attribute indicated by dir(models.Thing) that happens to be of type sqlalchemy.orm.attributes.InstrumentedAttribute for the class of its property attribute — which can be either a ColumnProperty or RelationshipProperty. This does the job but I was wondering if there's another way.
I could probably just find all attributes ending in _id and derive the relationship name, but this could break for some cases.
How about setting a __relationships__ = ['foo', 'bar']?
Or is there something built into SQLAlchemy to help me out?
There is indeed - take a look at sqlalchemy.inspection.inspect. Calling inspect on a mapped class (for example, your Thing class) will return a Mapper, which has a relationships attribute that is dict-like:
from sqlalchemy.inspection import inspect
thing_relations = inspect(Thing).relationships.items()
Instead of using inspect you can also use
model.__mapper__.relationships
You just need to use the inspect module from sqlalchemy
from sqlalchemy import inspect
i = inspect(model)
i.relationships
If you need the class of each referred model you can do:
referred_classes = [r.mapper.class_ for r in i.relationships]
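Putting the answers above together into a self-contained sketch (requires SQLAlchemy; the model names here are illustrative, not from the question's app):

```python
from sqlalchemy import Column, ForeignKey, Integer, inspect
from sqlalchemy.orm import configure_mappers, declarative_base, relationship

Base = declarative_base()

class Foo(Base):
    __tablename__ = 'foos'
    id = Column(Integer, primary_key=True)

class Thing(Base):
    __tablename__ = 'things'
    id = Column(Integer, primary_key=True)
    foo_id = Column(Integer, ForeignKey('foos.id'))
    foo = relationship('Foo')

configure_mappers()  # resolve the string-based relationship targets

# Relationship names, with no dir() scanning or *_id heuristics:
rel_names = [name for name, rel in inspect(Thing).relationships.items()]
print(rel_names)  # ['foo']

# The mapped class each relationship refers to:
referred_classes = [rel.mapper.class_ for rel in inspect(Thing).relationships]
print(referred_classes == [Foo])  # True
```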
I have two mapped classes with a one-to-many relation:
class Part(...):
    product = relationship('products', backref=backref('parts'))

class Product(...):
    pass
Given Part.product, I can introspect this relationship, namely get the attribute name, and also get the backref attribute name:
>>> rel = Part.product  # imagine it's passed in as a function parameter
>>> rel.property.key
'product'
>>> rel.property.backref[0]
'parts'
I can also access the relationship the other way round:
>>> rel = Product.parts
>>> rel
<sqlalchemy.orm.attributes.InstrumentedAttribute object at 0x3744fd0>
>>> rel.property.key
'parts'
However, I cannot find out how to access the original attribute name (a.k.a. the backref's backref attribute, a.k.a. 'product' in the example):
>>> rel.property.backref is None
True
Where do I have to tickle Product.parts to obtain 'product'?
I tried to reproduce the situation you described and also got Product.parts.property.backref = None.
After debugging in PyCharm, I found that another property holds the name of the reverse attribute:
print Product.parts.property.backref
>>> None
print Product.parts.property.back_populates
>>> product
I would suggest considering back_populates in this case as a workaround.
back_populates is described in the documentation under Relationship Configuration: Relationships with Backref. According to the documentation, you would need to define your models like this:
class Part(...):
    product = relationship('Product', back_populates='parts')

class Product(...):
    parts = relationship('Part', back_populates='product')
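A runnable version of that setup (requires SQLAlchemy; the table names and columns are mine for illustration), showing that the reverse attribute name is then visible from both sides:

```python
from sqlalchemy import Column, ForeignKey, Integer, inspect
from sqlalchemy.orm import configure_mappers, declarative_base, relationship

Base = declarative_base()

class Product(Base):
    __tablename__ = 'products'
    id = Column(Integer, primary_key=True)
    parts = relationship('Part', back_populates='product')

class Part(Base):
    __tablename__ = 'parts'
    id = Column(Integer, primary_key=True)
    product_id = Column(Integer, ForeignKey('products.id'))
    product = relationship('Product', back_populates='parts')

configure_mappers()  # resolve the string-based relationship targets

# Both directions now expose the name of the reverse attribute:
print(inspect(Product).relationships['parts'].back_populates)  # product
print(inspect(Part).relationships['product'].back_populates)   # parts
```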