Appropriate use of server_default=FetchedValue() - python

While checking some code in our backend, I found several uses of server_default=FetchedValue() in a Flask application using SQLAlchemy, and the usage varies from case to case.
For instance, when it is used on non-primary-key columns, the column is left for the server to fill in and the value is fetched on first use, when a SELECT statement is executed, as documented:
__tablename__ = 'dummy_type'
dummy_type_code = db.Column(db.String, primary_key=True)
dummy_ind = db.Column(db.Boolean, nullable=False, server_default=FetchedValue())

__tablename__ = 'dummy_recommendation'
dummy_recommendation_id = db.Column(
    db.Integer, primary_key=True, server_default=FetchedValue())
dummy_recommendation_guid = db.Column(
    UUID(as_uuid=True), nullable=False, server_default=FetchedValue())
In other cases, server_default=FetchedValue() is applied to the primary key itself:
__tablename__ = 'dummy_note'
dummy_note_guid = db.Column(
    UUID(as_uuid=True), primary_key=True, server_default=FetchedValue())
It is not clear to me what the appropriate use is in these examples. In the first examples, what is the advantage of deferring value generation to the server and only fetching it on first use? Am I missing something? The use on a primary key is not clear to me at all.
Can you please clarify the usage?
Thanks
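For context, a minimal sketch of what server_default=FetchedValue() is meant to pair with. The class name DummyNote is hypothetical (derived from the table name above), and gen_random_uuid() is only an example of a database-side default; FetchedValue() itself does not create any default:
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import FetchedValue, text
from sqlalchemy.dialects.postgresql import UUID

db = SQLAlchemy()

class DummyNote(db.Model):  # hypothetical class name for the 'dummy_note' table
    __tablename__ = 'dummy_note'
    # FetchedValue() only tells the ORM that the database supplies this value
    # (e.g. via a column DEFAULT or a trigger): the column is omitted from the
    # INSERT and the server-generated value is fetched back afterwards.
    dummy_note_guid = db.Column(
        UUID(as_uuid=True), primary_key=True, server_default=FetchedValue())

# If the default should also be emitted in the CREATE TABLE DDL, a literal
# server default could be used instead of FetchedValue(), for example:
#     server_default=text("gen_random_uuid()")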

Related

What could be the cause of "Dependency rule tried to blank-out primary key column"

When I try to delete a category instance identified by 'id', together with its category_image and file instances, like this:
c = Category.query.get(id)
for ci in c.images:
    db.session.delete(ci)
db.session.flush()
for ci in c.images:
    db.session.delete(ci.file)
db.session.flush()  # if I call db.session.commit() here instead, all is fine
db.session.delete(c)
db.session.commit()
I get AssertionError: Dependency rule tried to blank-out primary key column 'category_image.id_category' on instance ''. But when I replace the flush that follows deleting the category_image files with a commit, it works. I noticed this after I turned CategoryImage into an intermediary (association) table. Before that change it had its own, non-composite primary key and everything worked properly. Here are my current model definitions.
class File(db.Model):
    __tablename__ = 'file'
    id_file = Column(Integer, Sequence('seq_id_file'), primary_key=True, nullable=False)
    name = Column(Text, nullable=False)
    path = Column(Text, nullable=False, unique=True)
    protected = Column(Boolean, nullable=False, default=False)

class Category(db.Model):
    __tablename__ = 'category'
    id_category = Column(Integer, Sequence('seq_id_category'), primary_key=True, nullable=False)
    name = Column(UnicodeText, nullable=False, unique=True)
    images = relationship('CategoryImage', backref='images')

class CategoryImage(db.Model):
    __tablename__ = 'category_image'
    __table_args__ = (
        PrimaryKeyConstraint('id_category', 'id_file', name='seq_id_category_image'),
    )
    id_category = Column(Integer, ForeignKey(Category.id_category), nullable=False)
    id_file = Column(Integer, ForeignKey(File.id_file), nullable=False)
    id_size_type = Column(Integer, nullable=False)
    file = relationship(File)
Now I'm trying to figure out what just happened. Correct me if I'm using things wrong.
I just noticed that I have to delete the objects related through the intermediate model in the same order as they are declared in __table_args__, i.e. PrimaryKeyConstraint('id_category', 'id_file'). So when I perform it this way: session.delete(category_image), session.delete(category), session.delete(file), and then commit (or flush before each step and commit at the end), everything works fine. If anyone spots something about this in the SQLAlchemy docs, let me know.
Here is what is happening. Calling session.delete() on an object only marks it for deletion; it is not yet deleted from the database. When you then call flush() after the delete, the database still holds the row (nothing is committed yet) while the session already treats the object as deleted, so the two views become inconsistent. To make the delete go smoothly, wrap your delete operations in a single transaction and, once the objects are removed from the session, call db.session.commit() once so the session and the database agree.
Hope it helps.
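A minimal sketch of the ordering the asker found to work (association rows, then the category, then the files), with a single commit at the end; it assumes the models from the question:
c = Category.query.get(id)

# Keep references to the files before the association rows are deleted.
files = [ci.file for ci in c.images]

# Delete the association rows first, then the category, then the files,
# and commit once so the whole operation runs in one transaction.
for ci in list(c.images):
    db.session.delete(ci)
db.session.delete(c)
for f in files:
    db.session.delete(f)
db.session.commit()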

sqlalchemy.exc.InvalidRequestError: One or more mappers failed to initialize - can't proceed with initialization of other mappers

This error happened when I tried to access the page. I didn't get any errors when I created the tables, but it seems there are still problems.
The models are like this:
class User(UserMixin, db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(64), index=True, unique=True)
    sell_items = db.relationship('Item', backref='user')

class Item(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    item_name = db.Column(db.String(64), index=True)
    item_image = db.Column(db.String(200), index=True)
    price = db.Column(db.Float(10), index=True)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    user = db.relationship('User', backref='sell_items')
The whole error message is this:
Triggering mapper: 'Mapper|User|user'. Original exception was: Error creating backref 'user' on relationship 'User.sell_items': property of that name exists on mapper 'Mapper|Item|item'
How can I fix this? What I want is to refer to the username of the user who sells the item, but I cannot. There is a problem with the relationships between the models.
When you use backref, the reverse relationship is created automatically, so it should only be declared on one side of the relationship. In your case, you can remove sell_items from the User model, and User will automatically get the relationship from Item.
To declare the relationship explicitly on both sides (for example, if you want to customize its name), use back_populates='name_of_relationship_on_other_model' instead.
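A sketch of what the explicitly two-sided version could look like with back_populates, keeping the question's column definitions (UserMixin and the other Item columns are omitted for brevity):
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(64), index=True, unique=True)
    # Each side names the attribute on the other model explicitly.
    sell_items = db.relationship('Item', back_populates='user')

class Item(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    item_name = db.Column(db.String(64), index=True)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    user = db.relationship('User', back_populates='sell_items')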
In your Item class, replace this line:
user = db.relationship('User', backref='sell_items')
with this line:
user_id = db.Column(db.Integer, db.ForeignKey('user.id'), nullable=False)
It should work that way. From there you can query like item = Item.query.first(), then item.sell_items... to get the user who posted the item.
I hope it helps.

SQLAlchemy not cascade deleting multiple levels down

I'm a little new to SQLAlchemy. I've searched around for an answer to my question but I have found nothing that works for my situation.
In short, deleting a record in the Release model deletes all the records in the other models as long as there are no related records in TestResult. However, if there are related TestResult records, deleting a Release does not work. It almost seems as if deleting a parent deletes a child and the child's child, but not the child's child's child. Here is some code to help highlight this:
class Release(db.Model):
    __tablename__ = 'releases'
    id = db.Column(db.Integer, primary_key=True)
    platform_id = db.Column(db.Integer, db.ForeignKey('platforms.id'))
    name = db.Column(db.String(20), unique=True)
    builds = db.relationship('ReleaseBuilds', cascade='all,delete', lazy='dynamic', order_by="desc(ReleaseBuilds.date_created)")

class ReleaseBuilds(db.Model):
    __tablename__ = 'release_builds'
    id = db.Column(db.Integer, primary_key=True)
    release_id = db.Column(db.Integer, db.ForeignKey('releases.id'))
    name = db.Column(db.String(150), nullable=False)
    artifacts = db.relationship('ReleaseBuildArtifacts', cascade='all,delete', backref='builds', lazy='dynamic')
    deployments = db.relationship('Deployments', cascade='all,delete', lazy='dynamic')
    tests = db.relationship('Test', cascade='delete', lazy='dynamic')

class ReleaseBuildArtifacts(db.Model):
    __tablename__ = 'release_build_artifacts'
    id = db.Column(db.Integer, primary_key=True)
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'))
    application_id = db.Column(db.Integer, db.ForeignKey('applications.id'))
    rpm = db.Column(db.String(300))
    build = db.relationship('ReleaseBuilds')
    application = db.relationship('Application')

class Deployments(db.Model):
    __tablename__ = 'deployments'
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'), primary_key=True)
    environment_id = db.Column(db.Integer, db.ForeignKey('environments.id'), primary_key=True)
    date_deployed = db.Column(db.DateTime(timezone=False), default=datetime.datetime.utcnow)
    environment = db.relationship('Environment', foreign_keys=[environment_id])

class TestType(db.Model):
    __tablename__ = 'test_types'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(50), unique=True)

class Test(db.Model):
    __tablename__ = 'tests'
    id = db.Column(db.Integer, primary_key=True)
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'), nullable=False)
    environment_id = db.Column(db.Integer, db.ForeignKey('environments.id'), nullable=False)
    test_type_id = db.Column(db.Integer, db.ForeignKey('test_types.id'))
    name = db.Column(db.String(300))
    environments = db.relationship('Environment', foreign_keys=[environment_id])
    results = db.relationship('TestResult', cascade='all,delete', lazy='dynamic')
    __table_args__ = (
        ForeignKeyConstraint(['release_build_id', 'environment_id'], ['deployments.release_build_id', 'deployments.environment_id']),
    )

class TestResult(db.Model):
    __tablename__ = 'test_results'
    id = db.Column(db.Integer, primary_key=True)
    test_id = db.Column(db.Integer, db.ForeignKey('tests.id'), nullable=False)
    name = db.Column(db.String(500))
    passed = db.Column(db.Boolean)
Any suggestions as to why this cascade delete is not working?
I came across a similar issue in our project, where we define cascades on the ORM level and also use lazy=dynamic relationships. This caused the cascade not to run on the bottom-most children.
Dynamic loading causes the relationship to return a Query object when accessed.
Delete on queries is deliberately limited, for performance reasons, as documented here:
https://docs.sqlalchemy.org/en/13/orm/query.html
The method does not offer in-Python cascading of relationships - it is assumed that ON DELETE CASCADE/SET NULL/etc. is configured for any foreign key references which require it, otherwise the database may emit an integrity violation if foreign key references are being enforced.
After the DELETE, dependent objects in the Session which were impacted by an ON DELETE may not contain the current state, or may have been deleted. This issue is resolved once the Session is expired, which normally occurs upon Session.commit() or can be forced by using Session.expire_all(). Accessing an expired object whose row has been deleted will invoke a SELECT to locate the row; when the row is not found, an ObjectDeletedError is raised.
Therefore a solution for your problem could be either defining the cascades at the database level, or using other types of relationships.
A related question was raised here: SQLAlchemy delete doesn't cascade
EDIT: the solution I applied was changing the loading type at the query level, via options.
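As for the database-level option mentioned above, a minimal sketch for the Test/TestResult pair from the question (unrelated columns omitted). The essential pieces are ondelete='CASCADE' on the foreign key and passive_deletes=True on the relationship, so that SQLAlchemy leaves the child deletes to the database:
class Test(db.Model):
    __tablename__ = 'tests'
    id = db.Column(db.Integer, primary_key=True)
    # passive_deletes=True stops the ORM from loading and deleting the
    # children itself; the database's ON DELETE CASCADE handles them.
    results = db.relationship('TestResult', passive_deletes=True, lazy='dynamic')

class TestResult(db.Model):
    __tablename__ = 'test_results'
    id = db.Column(db.Integer, primary_key=True)
    test_id = db.Column(db.Integer,
                        db.ForeignKey('tests.id', ondelete='CASCADE'),
                        nullable=False)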

SQLAlchemy lazy=dynamic with m2m relationship using association object pattern

I have a simple m2m relationship between users and roles tables:
users_roles = db.Table('users_roles',
    db.Column('user_id', db.Integer, db.ForeignKey('users.id')),
    db.Column('role_id', db.Integer, db.ForeignKey('roles.id')),
    db.Column('is_primary', db.Boolean)
)

class User(db.Model):
    __tablename__ = 'users'
    id = db.Column('id', db.Integer, primary_key=True)
    roles = db.relationship('Role', secondary=users_roles, lazy='dynamic', backref=db.backref('users', lazy='dynamic'))

class Role(db.Model):
    __tablename__ = 'roles'
    id = db.Column('id', db.Integer, primary_key=True)
    users = db.relationship('User', secondary=users_roles, lazy='dynamic', backref=db.backref('roles', lazy='dynamic'))
To add a record to the users_roles table, I have to do something like this:
role = Role.query.get(1)
user = User()
user.roles.append(role)
db.session.add(user)
db.session.commit()
That is okay, but I have a column named is_primary in the users_roles table that should also be populated.
I changed my code to use the Association Object Pattern as described in the SQLAlchemy documentation.
Now my code looks like this:
from sqlalchemy.ext.associationproxy import association_proxy

class User(db.Model):
    __tablename__ = 'users'
    id = db.Column('id', db.Integer, primary_key=True)

class Role(db.Model):
    __tablename__ = 'roles'
    id = db.Column('id', db.Integer, primary_key=True)

class UserRole(db.Model):
    __tablename__ = 'users_roles'
    user_id = db.Column(db.Integer, db.ForeignKey('users.id'), primary_key=True)
    role_id = db.Column(db.Integer, db.ForeignKey('roles.id'), primary_key=True)
    is_primary = db.Column(db.Boolean)
    user = db.relationship(User, backref="users_roles")
    role = db.relationship(Role, backref="users_roles")

User.roles = association_proxy("users_roles", "role")
Role.users = association_proxy("users_roles", "user")
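For reference, a short sketch of how a users_roles row with is_primary set could be created under this mapping (assuming the same db.session as in the earlier snippet):
user = User()
role = Role.query.get(1)

# The extra column lives on the association object itself.
link = UserRole(user=user, role=role, is_primary=True)

db.session.add(user)
db.session.add(link)
db.session.commit()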
It works nicely, but I still have a problem.
Is it possible for User.roles (added with the association proxy) to return an AppenderBaseQuery that I can add more filters to, e.g. User.query.get(1).roles.filter_by(...)?
I used to be able to do that with the plain many-to-many relationship using lazy='dynamic' in the relationship declaration, but after mapping a class onto the association table it seems I cannot do it anymore.
Is there a way to achieve that?
@IfLoop I followed your recommendation in this post. Your help would be much appreciated.
Well, I ended up filtering roles using the following code:
roles = Role.query.filter_by(...).join(UserRole).join(User).filter_by(id=1)
I still want to be able to do something like this:
roles = User.query.get(1).roles.filter_by(...).all()
Anyway if I get no answers in a few days I will accept this as an answer.
Way too late to help you, but I asked myself the same question, and this paragraph of the docs sheds some light on what is possible with association proxies:
https://docs.sqlalchemy.org/en/14/orm/extensions/associationproxy.html#querying-with-association-proxies
In summary, from what I understood: this is not explicitly possible, but "for association proxies where the immediate target is a related object or collection, relationship-oriented operators can be used instead, such as .has() and .any()".
I'm not sure this will help me, but I'm leaving it here in case it ever points someone to their solution.
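A small sketch of the kind of relationship-oriented query the quoted passage refers to, using the models above (the Role.id criterion is just an illustrative filter):
# The proxy supports .any()/.has(), which translate to an EXISTS subquery
# through the users_roles association, instead of returning a filterable
# AppenderBaseQuery.
users_with_role_1 = User.query.filter(User.roles.any(Role.id == 1)).all()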

How to use joinedload/contains_eager for query-enabled relationships (lazy='dynamic' option) in SQLAlchemy

I have the following model classes declared by SQLAlchemy:
class User(Base):
id = Column(Integer, primary_key=True)
name = Column(String, nullable=False, unique=True)
created_at = Colmn(DateTime, nullable=False, default=func.now())
class Post(Base):
id = Column(Integer, primary_key=True)
user_id = Column(Integer, ForeignKey(User.id), nullable=False)
user = relationship(User, backref=backref('posts', lazy='dynamic'))
title = Column(String, nullable=False)
body = Column(Text, nullable=False)
created_at = Colmn(DateTime, nullable=False, default=func.now())
As shown above, these models have a relationship whose backref, named posts, is set to be query-enabled (through the lazy='dynamic' option), because some users may have a large set of posts while most users don't.
With these models, I tried joinedload for User.posts, but I faced the error:
>>> users = session.query(User).options(joinedload(User.posts))[:30]
Traceback (most recent call last):
...
InvalidRequestError: 'User.posts' does not support object population - eager loading cannot be applied.
Is there any way to work around this situation? I need both of the following:
Sometimes User.posts can be sliced, to avoid eager loading the large set of posts written by heavy users.
However, usually User.posts should not produce 1+N queries.
The problem is that the posts property on User is a dynamic relationship; it's supposed to return a Query object. There's no way for the property to know, or safely communicate, that this time all of the related items are already loaded.
The simple workaround is to have two properties: one that uses the normal lazy loading behavior (which you can set to eager load for specific queries where it makes sense), and another that always returns a dynamic relationship.
class User(Base):
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False, unique=True)
    created_at = Column(DateTime, nullable=False, default=func.now())

class Post(Base):
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey(User.id), nullable=False)
    user = relationship(User, backref=backref('posts'))
    title = Column(String, nullable=False)
    body = Column(Text, nullable=False)
    created_at = Column(DateTime, nullable=False, default=func.now())

User.post_query = relationship(Post, lazy="dynamic")
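With the two attributes split this way, usage could look roughly like this (a sketch; it assumes the session from the question and the post_query name chosen above):
from sqlalchemy.orm import joinedload

# Eager loading works through the ordinary 'posts' relationship again:
users = session.query(User).options(joinedload(User.posts))[:30]

# while 'post_query' still returns a Query that can be filtered or sliced
# for heavy users:
recent_posts = users[0].post_query.order_by(Post.created_at.desc())[:10]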

Categories

Resources