SQLAlchemy relationship introspection - python

I have two mapped classes with a one-to-many relation:
class Part(...):
    product = relationship('Product', backref=backref('parts'))

class Product(...):
    pass
Given Part.product, I can introspect this relationship, namely get the attribute name, and also get the backref attribute name:
>>> rel = Part.product  # imagine it's passed in as a function parameter
>>> rel.property.key
'product'
>>> rel.property.backref[0]
'parts'
I can also access the relationship the other way round:
>>> rel = Product.parts
>>> rel
<sqlalchemy.orm.attributes.InstrumentedAttribute object at 0x3744fd0>
>>> rel.property.key
'parts'
However, I cannot find out how to access the original attribute name (i.e. the backref's backref attribute, 'product' in the example):
>>> rel.property.backref is None
True
Where do I have to tickle Product.parts to obtain 'product'?

I tried to reproduce the situation you described and got Product.parts.property.backref = None too.
After debugging in PyCharm I found that a different property holds the name of the original attribute:
>>> print(Product.parts.property.backref)
None
>>> print(Product.parts.property.back_populates)
product
I would suggest using back_populates explicitly in this case as a workaround.
back_populates is described in the documentation under Relationship Configuration: Linking Relationships with Backref. According to the documentation, you would define your models like this:
class Part(...):
    product = relationship('Product', back_populates='parts')

class Product(...):
    parts = relationship('Part', back_populates='product')
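With back_populates set on both sides (and note that the side generated by a backref gets back_populates filled in as well, which is what the debugging output above shows), you can recover the reverse attribute name from either direction. A small helper sketch; the function name is my own:

def reverse_attr_name(rel):
    # Given an InstrumentedAttribute, return the attribute name on the other side.
    prop = rel.property
    if prop.back_populates:
        return prop.back_populates
    if prop.backref:
        # backref may be a plain string or the (name, kwargs) tuple from backref()
        return prop.backref if isinstance(prop.backref, str) else prop.backref[0]
    return None

reverse_attr_name(Part.product)   # 'parts'
reverse_attr_name(Product.parts)  # 'product'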


Creating nested classes in Python reflectively

I am trying to create nested Python classes using the 3 argument type function. I want to construct an analogue of this:
In [15]: class CC:
    ...:     class DD:
    ...:         pass
    ...:
The naive attempt is
In [17]: AA = type('AA', (), {'BB': type('BB', (), {})})
but that is not quite right, since BB is actually created outside
and before AA and is only put inside AA later.
The difference is demonstrated by:
In [18]: AA.BB
Out[18]: __main__.BB
In [16]: CC.DD
Out[16]: __main__.CC.DD
How can I create nested classes reflectively/dynamically that are completely equivalent to nested definitions?
I want to use this to reflectively generate a graphene-sqlalchemy API. The idiom there is to create an outer Graphene class with an inner Meta class pointing to the corresponding SQLAlchemy model class (http://docs.graphene-python.org/projects/sqlalchemy/en/latest/tutorial/#schema), e.g.:
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class UserModel(Base):
    __tablename__ = 'department'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    last_name = Column(String)

from graphene_sqlalchemy import SQLAlchemyObjectType

class User(SQLAlchemyObjectType):
    class Meta:
        model = UserModel
        # only return specified fields
        only_fields = ("name",)
        # exclude specified fields
        exclude_fields = ("last_name",)
The User class above is pretty cookie cutter and should be constructable programmatically from the UserModel class. This should then be doable for an entire schema.
In the Python 3.7 type documentation, it says:
The [1st argument] is the class name and becomes the __name__ attribute.
So, I think the only difference between your 2 examples is that AA.BB.__name__ is AA.BB in the 1st example, and BB in the 2nd. If you want the __name__ to be the same, you can do this:
AA = type('AA', (), {'BB': type('AA.BB', (), {})})
Otherwise, as far as I can tell, both examples are functionally equivalent.
Actually, the only difference you get there is the __qualname__ class attribute.
The __qualname__ is created by the code object running the class body, and passed as an ordinary attribute to the metaclass __new__ method (usually type).
Therefore all you need to get this level of equivalence is to explicitly pass a proper __qualname__ when creating the class:
In [9]: AA = type('AA', (), {'BB': type('BB', (), {'__qualname__': 'AA.BB'})})
In [10]: AA
Out[10]: __main__.AA
In [11]: AA.BB
Out[11]: __main__.AA.BB
This will likely be enough for you, but beware that there are more subtle differences between classes created dynamically. One of the last metaclass questions I answered is exactly about that, but with the opposite approach: the challenge was to actually be able to distinguish classes created with both styles.
Detect if class was defined declarative or functional - possible?
(warning: that contains what likely is the "deepest black magic" Python code I ever put in an answer here)
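Applied to the graphene-sqlalchemy case from the question, a rough, untested sketch might look like this; it assumes UserModel is defined as in the question and that graphene picks the options up from the Meta attribute of the finished class:

from graphene_sqlalchemy import SQLAlchemyObjectType

# Build the inner Meta first, then the outer type, fixing up __qualname__ so the
# result introspects like a nested definition.
Meta = type('Meta', (), {'model': UserModel,
                         'only_fields': ('name',),
                         '__qualname__': 'User.Meta'})
User = type('User', (SQLAlchemyObjectType,), {'Meta': Meta, '__qualname__': 'User'})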

How can I use "class" as an enum value in Python/SQLAlchemy?

I have a model in SQLAlchemy of which one column is an enum. Wanting to stay as close to vanilla SQLAlchemy and the Python3 stdlib as possible, I've defined the enum against the stdlib's enum.Enum class, and then fed that to SQLAlchemy using its sqlalchemy.Enum class (as recommended somewhere in the SQLAlchemy documentation.)
class TaxonRank(enum.Enum):
    domain = "domain"
    kingdom = "kingdom"
    phylum = "phylum"
    class_ = "class"
    order = "order"
    family = "family"
    genus = "genus"
    species = "species"
And in the model:
rank = sqlalchemy.Column(sqlalchemy.Enum(TaxonRank), name="rank", nullable=False)
This works well, except for forcing me to use class_ instead of class for one of the enum values (naturally to avoid conflict with the Python keyword; it's illegal syntax to attempt to access TaxonRank.class.)
I don't really mind using class_, but the issue I'm having is that class_ is the value that ends up getting stored in the database. This, in turn, is causing me issues with my CRUD API, wherein I allow the user to do things like "filter on rank where rank ends with ss." Naturally this doesn't match anything because the value actually ends with ss_!
For record display I've been putting in some hacky case-by-case translation to always show the user class in place of class_. Doing something similar with sorting and filtering, however, is more tricky because I do both of those at the SQL level.
So my question: is there a good way around this mild annoyance? I don't really care about accessing TaxonRank.class_ in my Python, but perhaps there's a way to subclass the stdlib's enum.Enum to force the string representation of the class_ attribute (and thus the value that actually gets stored in the database) to the desired class?
Thanks to Sergey Shubin for pointing out to me an alternative form for defining an enum.Enum.
TaxonRank = enum.Enum("TaxonRank", [
    ("domain", "domain"),
    ("kingdom", "kingdom"),
    ("phylum", "phylum"),
    ("class", "class"),
    ("order", "order"),
    ("family", "family"),
    ("genus", "genus"),
    ("species", "species")
])
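With the functional form the member's name really is "class"; since SQLAlchemy's Enum type stores member names rather than values, that is also the string that ends up in the database, so the "ends with ss" filter from the question matches again. The keyword-named member stays reachable through subscription or getattr, a quick sketch:

>>> TaxonRank["class"]                  # attribute access would be a SyntaxError
<TaxonRank.class: 'class'>
>>> getattr(TaxonRank, "class") is TaxonRank["class"]
True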
I have been working on an interface for a Russian and English database. I am using PostgreSQL, but it will probably work for any other database's enumeration type. This is the solution:
In mymodel.py:
from sqlalchemy import Column, Integer, Text
from sqlalchemy.dialects.postgresql import ENUM
from .meta import Base
from enum import Enum

class NounVar(Enum):
    abstract = 1
    proper = 2
    concrete = 3
    collective = 4
    compound = 5

class Nouns(Base):
    __tablename__ = 'nouns'
    id = Column(Integer, primary_key=True)
    name = Column(Text)
    runame = Column(Text)
    variety = Column("variety", ENUM(NounVar, name='variety_enum'))
And then further in default.py:
from .models.mymodel import Nouns, NounVar

class somecontainer():
    def somecallable(self):
        page = Nouns(
            name="word",
            runame="слово",
            variety=NounVar.concrete)
        self.request.dbsession.add(page)
I hope it works for you.

SQLAlchemy; getting list of related tables/classes

Say I have a Thing class that is related to some other classes, Foo and Bar.
class Thing(Base):
    FooKey = Column('FooKey', Integer,
                    ForeignKey('FooTable.FooKey'), primary_key=True)
    BarKey = Column('BarKey', Integer, ForeignKey('BarTable.BarKey'), primary_key=True)
    foo = db.relationship('Foo')
    bar = db.relationship('Bar')
I want to get a list of the classes/tables related to Thing, created by my relationship() calls, e.g. [Foo, Bar]. Is there any way to do this?
This is a closely related question:
SQLAlchemy, Flask: get relationships from a db.Model. That identifies the string names of the relationships, but not the target classes.
Context:
I'm building unit tests for my declarative base mapping of a SQL database. A lot of dev work is going into it and I want robust checks in place.
Using the Mapper as described in that other question gets you on the right path. As mentioned in the docs [0], you will get a collection of sqlalchemy.orm.relationships.RelationshipProperty objects, and then you can use class_ on the mapper associated with each RelationshipProperty to get to the class:
from sqlalchemy.inspection import inspect
rels = inspect(Thing).relationships
clss = [rel.mapper.class_ for rel in rels]
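For the unit-test context mentioned in the question, a compact sketch (assuming Foo and Bar are the mapped classes behind the two relationships):

from sqlalchemy.inspection import inspect

def test_thing_related_classes():
    related = {rel.mapper.class_ for rel in inspect(Thing).relationships}
    assert related == {Foo, Bar}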

Flask-SQLAlchemy .like() method raises NotImplementedError

I am having trouble building a Flask-SQLAlchemy query with a like() method, which should build a query using the SQL LIKE statement.
According to the SQLAlchemy docs, the like method can be called on a column like this:
select([sometable]).where(sometable.c.column.like("%foobar%"))
I have a ModelClass that subclasses the Flask-SQLAlchemy db.Model class, defined like this:
class ModelClass(db.Model):
    # Some other columns ...
    field1 = db.Column(db.Integer(), db.ForeignKey('my_other_class.id'))
    rel1 = db.relationship("MyOtherClass", foreign_keys=[field1])
I then have a loop where I am building up filters dynamically. Outside the loop I use these filters to filter a query. The inside of my loop, slightly modified, looks like this:
search_term = '%{}%'.format(search_string)
my_filter = getattr(ModelClass, field_string).like(search_term)
This raises an error at the line with the like method:
NotImplementedError: <function like_op at 0x101c06668>
It raises this error for any text string. The Python docs for a NotImplementedError say:
This exception is derived from RuntimeError. In user defined base
classes, abstract methods should raise this exception when they
require derived classes to override the method.
This isn't an AttributeError, so I think the like method exists, but something else is wrong and I'm not sure what.
Update
Now that I'm looking more closely at the model definition I think the problem might be that I'm doing this on a relationship and not a Column type.
I saw that type(getattr(ModelClass, field_string)) gives:
<sqlalchemy.orm.attributes.InstrumentedAttribute object at 0x102018090>
Since this is not a Column type I looked at the values for field_string and saw that one of the values being passed was actually rel1.
So I guess that's the "answer" but I'm still confused why calling .like() on rel1 didn't raise an AttributeError.
So I've confirmed the issue is that I was trying to apply the .like() method to a relationship attribute instead of a column.
I changed my code to call the child model class directly as opposed to trying to go across the relationship from the parent to access the child class columns. Something like this:
search_term = '%{}%'.format(search_string)
my_filter = getattr(ChildModelClass, field_string).like(search_term)
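If you still need to search "across" the relationship from the parent query, one option (a sketch; ChildModelClass stands in for whatever MyOtherClass is called in your schema) is to join to the child model and apply the filter there:

search_term = '%{}%'.format(search_string)
results = (ModelClass.query
           .join(ModelClass.rel1)
           .filter(getattr(ChildModelClass, field_string).like(search_term))
           .all())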
As @ACV said, calling methods such as like(), is_(), is_not(), etc. on relationship attributes raises NotImplementedError. So, to work around this problem, I called the method directly on the real column attribute instead of the relationship. E.g. if I have the following two attributes in a model:
user_id = db.Column(db.Integer, db.ForeignKey('user.id', ondelete='CASCADE'), index=True)
user = db.relationship(
    'User', backref=db.backref('readings', lazy='dynamic', cascade='all, delete-orphan'))
I did the following query to filter the instances whose attribute user IS NOT NULL. (Note that I'm using MyModel.user_id instead of MyModel.user to successfully run the query):
MyModel.query.filter(MyModel.user_id.is_not(None))
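Another option, if you prefer to express the check through the relationship attribute itself, is the relationship-level has() operator (valid for many-to-one relationships like user above); a sketch, where the related User class and its name column are assumed:

# rows that have a related User at all
MyModel.query.filter(MyModel.user.has())
# rows whose related User matches a column criterion
MyModel.query.filter(MyModel.user.has(User.name.like('%smith%')))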

SQLAlchemy: get relationships from a db.Model

I need to get a list of a model's properties which are actually relationships (that is, they were created by relationship()).
Say I have a model Thing in a models module:
class Thing(db.Model):
    id = db.Column(...)
    bar_id = db.Column(...)
    foo_id = db.Column(...)
    foo = db.relationship('Foo')
    bar = db.relationship('Bar')
Later on, I want to take models.Thing and get a list of relationship-properties, that is ['foo', 'bar'].
Currently I'm checking every attribute indicated by dir(models.Thing) that happens to be of type sqlalchemy.orm.attributes.InstrumentedAttribute for the class of its property attribute — which can be either a ColumnProperty or RelationshipProperty. This does the job but I was wondering if there's another way.
I could probably just find all attributes ending in _id and derive the relationship name, but this could break for some cases.
How about setting a __relationships__ = ['foo', 'bar']?
Or is there something built into SQLAlchemy to help me out?
There is indeed - take a look at sqlalchemy.inspection.inspect. Calling inspect on a mapped class (for example, your Thing class) will return a Mapper, which has a dict-like relationships attribute:
from sqlalchemy.inspection import inspect
thing_relations = inspect(Thing).relationships.items()
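The items are (name, RelationshipProperty) pairs, so the list of attribute names the question asks for falls out directly:

relationship_names = [name for name, rel in thing_relations]  # ['foo', 'bar']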
Instead of using inspect you can also use
model.__mapper__.relationships
You just need to use the inspect function from sqlalchemy:
from sqlalchemy import inspect
i = inspect(model)
i.relationships
If you need the class of each referred model you can do:
referred_classes = [r.mapper.class_ for r in i.relationships]
