Why does the attribute 'name' of the sqlalchemy.Column instance disappear?

I'm writing a history of attribute changes for SQLAlchemy models. I keep the list of attributes that matter for the history in the class attribute history_attributes.
from sqlalchemy import Integer, Column, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class HistoryAttribute:
    def __init__(self, descriptor, display_name, use_message=False):
        self.name = descriptor.name
        self.display_name = display_name
        self.use_message = use_message

class User(Base):
    __tablename__ = 'user'

    id = Column(Integer, primary_key=True)
    first_name = Column(String(100))
    email = Column(String(100))
    history_attributes = (
        HistoryAttribute(descriptor=first_name, display_name='User name'),
        email,
    )

print(User.history_attributes[0].name)  # None
print(User.history_attributes[1].name)  # email
Why does the "name" attribute of the Column instance disappear if I pass the column to the constructor of another class? Of course, I can write first_name = Column('first_name', String(100)) and the code will work fine,
but I don't want to set Column.name explicitly. I worked around the problem by passing a namedtuple to the HistoryAttribute constructor instead, but that feels like a crutch.

It's not that the name attribute disappears, but that it hasn't been initialized yet. Look at the value of first_name as passed to HistoryAttribute: Column(String(100)); it does not contain any mention of the string first_name. SQLAlchemy will fill the name in later after the class is defined.
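A quick way to see the timing (a sketch against the classes above; the values in the comments are what I'd expect, not captured output):

from sqlalchemy import Column, String

col = Column(String(100))
print(col.name)   # None: a Column created without an explicit name has no name yet

# After the User class body has been processed by declarative, the name is set:
print(User.__table__.c.first_name.name)   # 'first_name'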

As has been discussed, the issue is that first_name.name is only filled in after the class body has been executed.
SQLAlchemy does have a mechanism for deferring an attribute definition until configuration time. Try something like:
from sqlalchemy.ext.declarative import declared_attr

class User(Base):
    # ...

    @declared_attr
    def history_attributes(cls):
        return (
            HistoryAttribute(descriptor=cls.first_name, display_name='User name'),
            # ...
        )
That will defer the construction of history_attributes until a point where the column names have been populated.
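An alternative that sidesteps the timing issue entirely (my sketch, not from the original answers): have HistoryAttribute keep a reference to the Column and resolve .name lazily, by which time declarative has filled it in.

class HistoryAttribute:
    def __init__(self, descriptor, display_name, use_message=False):
        self._descriptor = descriptor
        self.display_name = display_name
        self.use_message = use_message

    @property
    def name(self):
        # Resolved at access time, after the declarative machinery has
        # populated Column.name from the attribute key.
        return self._descriptor.name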

Related

Access backref on relationship when filtering in SQLAlchemy

I have a table named ExtendedUser that has a one-to-one relationship with a User table, with a backref named extended_user on the User table. The User table has a one-to-many relationship with the UserPost table, with a backref named posts on the User table.
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)

class UserPost(Base):
    __tablename__ = "user_posts"

    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey("users.id"))
    user = relationship("User", backref="posts")

class ExtendedUser(Base):
    __tablename__ = "extended_users"

    user_id = Column(Integer, ForeignKey("users.id"), primary_key=True)
    user = relationship("User", backref="extended_user", uselist=False, lazy="joined")
Starting from ExtendedUser, I want to select all the ExtendedUsers whose User has no posts.
So what I've unsuccessfully tried is
sess.query(ExtendedUser).filter(not_(ExtendedUser.user.posts.any()))
But that does not work; I get an error:
AttributeError: Neither 'InstrumentedAttribute' object nor 'Comparator' object associated with ExtendedUser.user has an attribute 'posts'
How can I shape my query so that only ExtendedUsers whose User has no UserPosts are returned?
The error occurs because the .user attribute is a different thing depending on whether you access it from the class, or an instance of the class. For example, this raises the exception that you are getting:
ExtendedUser.user.posts
This is because accessing .user on the class ExtendedUser returns an InstrumentedAttribute object, and instances of InstrumentedAttribute have no attribute called posts.
This works, provided the instance's user attribute is populated:
inst = ExtendedUser(user=User())
inst.user.posts
The above works because we've accessed .user on an instance of ExtendedUser, which returns the related User instance, and User instances have an attribute called posts.
This differing behavior between class and instance attribute access is controlled by Python's descriptor protocol.
One way to achieve your objective would be to use a subquery to query for unique user_ids in the user_posts table, and test that the ExtendedUser's user_id isn't in the result:
q = sess.query(ExtendedUser).\
    filter(
        not_(
            ExtendedUser.user_id.in_(
                sess.query(UserPost.user_id).distinct()
            )
        )
    )
The following is a working example, but first I had to change the definition of ExtendedUser.user a little:
class ExtendedUser(Base):
    ...
    user = relationship(
        "User", backref=backref("extended_user", uselist=False), lazy="joined")
Note the use of the backref function, which allows me to set uselist=False on User.extended_user. Your example has uselist=False on ExtendedUser.user, but it isn't needed there: the foreign key is defined on ExtendedUser, so ExtendedUser.user can only ever point to one user, and SQLAlchemy automatically knows that side shouldn't be a list. Without that change I'd get a TypeError: Incompatible collection type: ExtendedUser is not list-like exception.
OK here's the example:
sess = Session()

user_1 = User(extended_user=ExtendedUser())
user_2 = User(extended_user=ExtendedUser())
user_3 = User(extended_user=ExtendedUser())
user_1.posts = [UserPost()]

sess.add_all([user_1, user_2, user_3])
sess.commit()

q = sess.query(ExtendedUser).\
    filter(
        not_(
            ExtendedUser.user_id.in_(
                sess.query(UserPost.user_id).distinct()
            )
        )
    )

print(q.all())  # [ExtendedUser(user_id=2), ExtendedUser(user_id=3)]
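As a side note (not part of the original answer), the same filter can also be expressed with SQLAlchemy's relationship operators, which emit correlated EXISTS subqueries; a sketch against the models above:

q = sess.query(ExtendedUser).filter(
    ~ExtendedUser.user.has(User.posts.any())
)
print(q.all())  # expected to match the subquery version above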

SQLAlchemy: Subclassed model db entry gets properties of other subclassed model

I'm having an issue with SQLAlchemy in a Flask app where I have two models, Instructor and Student, that subclass the same model User. When I create a Student object, it's listed in the database under User with properties that should be unique to Instructor. Here are my (simplified) classes:
class User(db.Model):
    __tablename__ = "user"

    discriminator = db.Column("type", db.String(50))  # Student or Instructor
    __mapper_args__ = {"polymorphic_on": discriminator}

    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(50))

class Instructor(User):
    __mapper_args__ = {"polymorphic_identity": "instructor"}

    reputation = db.Column(db.Integer, default=1)
    approved_for_teaching = db.Column(db.Boolean, default=False)

class Student(User):
    __mapper_args__ = {"polymorphic_identity": "student"}

    lessons_taken = db.Column(db.Integer, default=0)
I'm creating a Student like this:
new_student = Student(name=user_name)
db.session.add(new_student)
db.session.commit()
But when inspecting the object in the database, the student gets attributes from the Instructor model, i.e. reputation (value 1) and approved_for_teaching (value False). Am I doing something wrong here, or is this expected behaviour? I could understand the columns having to be present, since the models share the same table (user) in the db, but then I'd expect the values to be NULL or something. Thanks!
It is expected behaviour, since you're using single table inheritance:
Single table inheritance represents all attributes of all subclasses within a single table. A particular subclass that has attributes unique to that class will persist them within columns in the table that are otherwise NULL if the row refers to a different kind of object.
Though the documentation mentions leaving the columns belonging to other classes NULL, the client-side defaults you've given the columns kick in during inserts to the user table, which underlies all the classes inheriting from User, even when those columns are not part of the particular subclass. A context-sensitive default function could perhaps be used to avoid that, for example:
def polymorphic_default(discriminator, identity, value):
    def default(context):
        # This should be replaced with context.get_current_parameters() in 1.2 and above
        if context.current_parameters[discriminator.name] == identity:
            return value
    return default

...

class Instructor(User):
    __mapper_args__ = {"polymorphic_identity": "instructor"}

    reputation = db.Column(
        db.Integer, default=polymorphic_default(User.discriminator, 'instructor', 1))
    approved_for_teaching = db.Column(
        db.Boolean, default=polymorphic_default(User.discriminator, 'instructor', False))
but that seems like a lot of work to avoid a rather small issue.
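If the defaults only exist to populate newly created Instructor objects, a simpler route (my sketch, not part of the original answer) is to drop the client-side column defaults and set the values in the subclass constructor, so they never apply to rows of other polymorphic identities:

class Instructor(User):
    __mapper_args__ = {"polymorphic_identity": "instructor"}

    reputation = db.Column(db.Integer)             # no client-side default
    approved_for_teaching = db.Column(db.Boolean)

    def __init__(self, **kwargs):
        # Only Instructor instances ever receive these values.
        kwargs.setdefault("reputation", 1)
        kwargs.setdefault("approved_for_teaching", False)
        super().__init__(**kwargs)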

How can I determine the type (e.g. many-to-one) of a dynamic SQLAlchemy relationship?

Suppose I have the following SQLAlchemy classes defined:
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import backref, relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Person(Base):
    __tablename__ = 'person'

    id = Column(Integer, primary_key=True)
    computers = relationship('Computer', backref=backref('owner', lazy='dynamic'))

class Computer(Base):
    __tablename__ = 'computer'

    id = Column(Integer, primary_key=True)
    ownerid = Column(Integer, ForeignKey('person.id'))
Suppose further that I have accessed the lazy query object this way:
relation = getattr(Computer, 'owner')
How can I determine if relation refers to a single instance of Person (that is, in a many-to-one relationship, like in this example), or if relation refers to a collection of instances (like in a one-to-many relationship)? In other words, how can I determine the relationship type of a dynamic SQLAlchemy relationship object?
If we suppose model = Computer and relation = 'owner' as in the question, then the following attribute is True if and only if the relation is a list of instances as opposed to a single instance:
model._sa_class_manager[relation].property.uselist
You can then use this to test whether or not to call the one() method on the result of getattr(model, relation):
if model._sa_class_manager[relation].property.uselist:
    related_instances = getattr(model, relation)
else:
    related_instance = getattr(model, relation).one()
I am not confident, however, that this is the best solution.
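For what it's worth, the public inspection API (available since SQLAlchemy 0.8) exposes the same information without reaching into _sa_class_manager; a sketch, shown for Person.computers since its direction is unambiguous:

from sqlalchemy import inspect
from sqlalchemy.orm import interfaces

rel = inspect(Person).relationships['computers']
print(rel.uselist)                            # True: 'computers' is a collection
print(rel.direction is interfaces.ONETOMANY)  # True: one-to-many from Person to Computer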

SQLAlchemy: How to order query results (order_by) on a relationship's field?

Models
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, ForeignKey
from sqlalchemy import Integer
from sqlalchemy import Unicode
from sqlalchemy import TIMESTAMP
from sqlalchemy.orm import relationship

BaseModel = declarative_base()

class Base(BaseModel):
    __tablename__ = 'base'

    id = Column(Integer, primary_key=True)
    location = Column(Unicode(12), ForeignKey("locationterrain.location"), unique=True)
    name = Column(Unicode(45))
    ownerid = Column(Integer, ForeignKey("player.id"))
    occupierid = Column(Integer, ForeignKey("player.id"))
    submitid = Column(Integer, ForeignKey("player.id"))
    updateid = Column(Integer, ForeignKey("player.id"))

    owner = relationship("Player",
                         primaryjoin='Base.ownerid==Player.id',
                         join_depth=3,
                         lazy='joined')
    occupier = relationship("Player",
                            primaryjoin='Base.occupierid==Player.id',
                            join_depth=3,
                            lazy='joined')
    submitter = relationship("Player",
                             primaryjoin='Base.submitid==Player.id',
                             join_depth=3,
                             lazy='joined')
    updater = relationship("Player",
                           primaryjoin='Base.updateid==Player.id',
                           join_depth=3,
                           lazy='joined')

class Player(BaseModel):
    __tablename__ = 'player'

    id = Column(Integer, ForeignKey("guildmember.playerid"), primary_key=True)
    name = Column(Unicode(45))
Searching
bases = dbsession.query(Base)
bases = bases.order_by(Base.owner.name)
This doesn't work. I've searched everywhere and read the documentation,
but I just don't see how I can sort my Base query by its 'owner' relationship's name.
It always results in:
AttributeError: Neither 'InstrumentedAttribute' object nor 'Comparator' object has an attribute 'name'
This must be easy... but I don't see it. I also looked into Comparators, which seemed logical, but I don't see where the query part for the ORDER BY is generated or what I should be returning, since everything is generated dynamically. And writing a comparator for each of my 'player' relationships to do such a simple thing seems overcomplicated.
SQLAlchemy wants you to think in terms of SQL. If you do a query for "Base", that's:
SELECT * FROM base
easy. So how, in SQL, would you select the rows from "base" and order by the "name" column in a totally different table, that is, "player"? You use a join:
SELECT base.* FROM base JOIN player ON base.ownerid=player.id ORDER BY player.name
SQLAlchemy has you use the identical thought process - you join():
session.query(Base).join(Base.owner).order_by(Player.name)
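A follow-up note (mine, not part of the original answer): since Base has several relationships to Player, joining Player more than once in the same query needs an alias per relationship, for example:

from sqlalchemy.orm import aliased

Owner = aliased(Player)
Occupier = aliased(Player)

bases = (
    dbsession.query(Base)
    .join(Owner, Base.owner)
    .join(Occupier, Base.occupier)
    .order_by(Owner.name, Occupier.name)
)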
Use the order_by argument
Note the order_by argument to relationship() from sqlalchemy.orm; the parent/children relationship here is many-to-many.
class Parent(BaseModel):
    __tablename__ = "parent"

    id = Column(Integer, primary_key=True)
    name = Column(String(100), nullable=False)
    children = relationship(
        'Children',
        secondary=child_parent_mapping,
        backref=backref(
            'parents',
        ),
        order_by="Parent.order_id",
    )
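To make the order_by parameter concrete, here is a minimal self-contained sketch (hypothetical Author/Book models, not from the answer above) that orders a relationship collection by a column on the related class:

from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

SketchBase = declarative_base()

class Author(SketchBase):
    __tablename__ = 'author'

    id = Column(Integer, primary_key=True)
    # The collection loads already sorted by the related Book.title column.
    books = relationship('Book', order_by='Book.title', backref='author')

class Book(SketchBase):
    __tablename__ = 'book'

    id = Column(Integer, primary_key=True)
    title = Column(String(100))
    author_id = Column(Integer, ForeignKey('author.id'))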

Sqlalchemy: avoiding multiple inheritance and having abstract base class

So I have a bunch of tables using SQLAlchemy that are modelled as objects which inherit from the result of a call to declarative_base(). I.e.:
Base = declarative_base()

class Table1(Base):
    # __tablename__ & such here

class Table2(Base):
    # __tablename__ & such here
Etc. I then wanted to have some common functionality available to each of my DB table classes, the easiest way to do this according to the docs is to just do multiple inheritance:
Base = declarative_base()

class CommonRoutines(object):
    @classmethod
    def somecommonaction(cls):
        # body here

class Table1(CommonRoutines, Base):
    # __tablename__ & such here

class Table2(CommonRoutines, Base):
    # __tablename__ & such here
The thing I don't like about this is A) multiple inheritance in general is a bit icky (it gets tricky resolving things like super() calls, etc.), B) if I add a new table I have to remember to inherit from both Base and CommonRoutines, and C) really, that CommonRoutines class "is-a" type of table in a sense. Really, what CommonBase is is an abstract base class which defines a set of fields & routines common to all tables. Put another way: it's an abstract table.
So, what I'd like is this:
Base = declarative_base()

class AbstractTable(Base):
    __metaclass__ = ABCMeta  # make into abstract base class

    # define common attributes for all tables here, like maybe:
    id = Column(Integer, primary_key=True)

    @classmethod
    def somecommonaction(cls):
        # body here

class Table1(AbstractTable):
    # __tablename__ & Table1 specific fields here

class Table2(AbstractTable):
    # __tablename__ & Table2 specific fields here
But this of course doesn't work, as I then A) have to define a __tablename__ for AbstractTable, B) run into all sorts of headaches from the ABC machinery, and C) have to indicate some sort of DB relationship between AbstractTable and each individual table.
So my question: is it possible to achieve this in a reasonable way? Ideally I'd like to enforce:
No multiple inheritance
CommonBase/AbstractTable be abstract (ie cannot be instantiated)
SQLAlchemy version 0.7.3 introduced the __abstract__ directive which is used for abstract classes that should not be mapped to a database table, even though they are subclasses of sqlalchemy.ext.declarative.api.Base. So now you create a base class like this:
Base = declarative_base()

class CommonRoutines(Base):
    __abstract__ = True

    id = Column(Integer, primary_key=True)

    def __init__(self):
        # ...
Notice how CommonRoutines doesn't have a __tablename__ attribute. Then create subclasses like this:
class Foo(CommonRoutines):
    __tablename__ = 'foo'

    name = Column(...)

    def __init__(self, name):
        super().__init__()
        self.name = name
        # ...
This will map to the table foo and inherit the id attribute from CommonRoutines.
Source and more information: http://docs.sqlalchemy.org/en/rel_0_7/orm/extensions/declarative.html#abstract
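A quick check sketch (mine, assuming the classes above with a concrete column type for name):

print('id' in Foo.__table__.c)               # True: the column is inherited from CommonRoutines
print(hasattr(CommonRoutines, '__table__'))  # False: __abstract__ classes are not mapped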
It is pretty straightforward: you just make declarative_base() return a Base class which inherits from your CommonBase by using the cls= parameter. This is also shown in the Augmenting The Base docs. Your code might then look similar to this:
class CommonBase(object):
    @classmethod
    def somecommonaction(cls):
        # body here

Base = declarative_base(cls=CommonBase)

class Table1(Base):
    # __tablename__ & Table1 specific fields here

class Table2(Base):
    # __tablename__ & Table2 specific fields here
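A small usage sketch (hypothetical table definition, assuming the Base built above and an implemented somecommonaction):

from sqlalchemy import Column, Integer

class Table1(Base):
    __tablename__ = 'table1'

    id = Column(Integer, primary_key=True)

# The common classmethod is available on every mapped class via CommonBase.
Table1.somecommonaction()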
You can use AbstractConcreteBase to make an abstract base model:
from sqlalchemy.ext.declarative import AbstractConcreteBase

class AbstractTable(AbstractConcreteBase, Base):
    id = Column(Integer, primary_key=True)

    @classmethod
    def somecommonaction(cls):
        # body here
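Subclasses then follow the concrete-inheritance pattern from the SQLAlchemy docs, declaring their own table plus concrete mapper arguments; a hedged sketch with hypothetical names (it assumes the id column on AbstractTable is propagated to the subclass table, as with other __abstract__ bases):

from sqlalchemy import Column, String

class Table1(AbstractTable):
    __tablename__ = 'table1'

    name = Column(String(50))

    __mapper_args__ = {
        'polymorphic_identity': 'table1',
        'concrete': True,
    }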
If you want to have several models with common columns, then you can use __abstract__ and @declared_attr to inherit shared table attributes. Example:
Base = declarative_base()

class CommonRoutines(Base):
    __abstract__ = True

    id = Column(Integer, primary_key=True)
    modified_at = Column(DateTime)

    @declared_attr
    def modified_by(self):
        # `user.id` is another table called `user` with an `id` field
        return Column(Integer, ForeignKey('user.id', name='fk_modified_by_user_id'))

    def __init__(self):
        self.modified_by = None
        super().__init__()

class Foo(CommonRoutines):
    __tablename__ = 'foo'

    name = Column(...)
With this solution you will have a foo table with the fields declared on Foo (name) and the ones inherited from CommonRoutines (id, modified_at and modified_by).
