Is it possible to use passive_deletes with lazy='select' in sqlalchemy? - python

I was trying to do something like this:
class Parent(Base):
    __tablename__ = 'parents'
    id = Column(Integer, primary_key=True)

class Child(Base):
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parents.id'))
    Parent = relationship('Parent', backref=backref('children', passive_deletes=True))

parent = db.session.query(Parent).filter(id=some_id).first()
print parent.children
db.session.delete(parent)
db.session.commit()
I don't want SQLAlchemy to emit a lot of queries to delete the children; I have a foreign key constraint for that instead. But I am getting this error:
sqlalchemy.exc.IntegrityError: (IntegrityError) null value in column "parent" violates not-null constraint
It is because I don't use the lazy='dynamic' option in the relationship parameters. But I can't use that option, because joinedload would not work with lazy='dynamic'. How can I avoid both problems?

First of all, please refine your question, as it contains several mistakes. A working solution looks like this:
class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))
    parent = relationship('Parent', backref=backref('children', cascade='all, delete-orphan'))

parent = db.session.query(Parent).filter_by(id=some_id).first()
db.session.delete(parent)
db.session.commit()
If you want to delete a parent together with its children, you have to pass cascade='all, delete-orphan' to the relationship.
Unfortunately, I couldn't reproduce the example with the passive_deletes=True directive from the docs:
Passive Deletes
See my question:
Not working example with passive deletes directive in sqlalchemy doc
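For completeness, the configuration described in the Passive Deletes section of the docs pairs an ON DELETE CASCADE foreign key with passive_deletes=True on the relationship, so that the database, not the ORM, removes the children. Below is a minimal sketch of that setup (not the accepted solution above, and untested against the asker's schema):

from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import backref, relationship

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    # ON DELETE CASCADE: the database removes child rows when the parent goes
    parent_id = Column(Integer, ForeignKey('parent.id', ondelete='CASCADE'))
    # passive_deletes=True: the ORM does not load the children just to delete
    # them; it relies on the foreign key's ON DELETE behaviour instead
    parent = relationship('Parent',
                          backref=backref('children',
                                          cascade='all, delete-orphan',
                                          passive_deletes=True))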

Related

SqlAlchemy drop/create all duplicate relation

Using SQLAlchemy and declarative base to define two related tables:
JiraBase = declarative_base(cls=_Shared)

class JiraSource(JiraBase):
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    name = Column(String)
    __tablename__ = 'jira_source'

class JiraProject(JiraBase):
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    source = Column(BigInteger, ForeignKey('jira_source.id'), nullable=False)
    project_id = Column(String, nullable=False)
    project_key = Column(String, nullable=False)
    name = Column(String)
    __table_args__ = (Index('source', 'project_id', 'project_key', unique=True),)
    __tablename__ = 'jira_project'
The idea behind the Index is that projects are uniquely identified by an id and key for a given JiraSource.
The tables are being dropped and created via
JiraBase.metadata.drop_all(bind=self.engine)
JiraBase.metadata.create_all(bind=self.engine)
This actually drops and creates the tables fine; however, it also throws an exception:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.DuplicateTable) relation "source" already exists
[SQL: CREATE UNIQUE INDEX source ON jira_issue (issue_id)]
(Background on this error at: http://sqlalche.me/e/13/f405)
I'm thinking it's somehow because both a ForeignKey and an Index refer to source; is that bad practice or something? When I comment out the Index entirely, or drop the source field from it, the error goes away. Is reusing a column that is already a foreign key in an index the problem?
Any idea why I'm seeing this error? For now I'm simply wrapping it in a try/except and ignoring it in code.
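As a side note (an observation, not part of the original post): the first positional argument to Index() is the index name, not a column, so Index('source', 'project_id', 'project_key', unique=True) creates an index literally named source over (project_id, project_key). Postgres index names must be unique per schema, and the traceback shows another table (jira_issue) also creating an index named source, hence the duplicate-relation error. A hedged sketch of one way around it, with a made-up unique index name and the source column actually included in the index:

from sqlalchemy import BigInteger, Column, ForeignKey, Index, String
from sqlalchemy.ext.declarative import declarative_base

JiraBase = declarative_base()  # the _Shared mixin from the post is omitted here

class JiraSource(JiraBase):
    __tablename__ = 'jira_source'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    name = Column(String)

class JiraProject(JiraBase):
    __tablename__ = 'jira_project'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    source = Column(BigInteger, ForeignKey('jira_source.id'), nullable=False)
    project_id = Column(String, nullable=False)
    project_key = Column(String, nullable=False)
    name = Column(String)
    # The first argument to Index() is the index *name*; give each index a
    # unique, explicit name ('ix_jira_project_source_project_key' is made up).
    __table_args__ = (
        Index('ix_jira_project_source_project_key',
              'source', 'project_id', 'project_key', unique=True),
    )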

SQLAlchemy not cascade deleting multiple levels down

I'm a little new to SQLAlchemy. I've searched around for an answer to my question but I have found nothing that works for my situation.
In short, deleting a record in the Release model will delete all the records in the other models as long as there are no related records in TestResult. However, if there are related records in TestResult, then deleting a Release will not work. It almost seems as if deleting a parent will delete a child and the child's child, but not the child's child's child. Here is some code to help highlight this:
class Release(db.Model):
    __tablename__ = 'releases'
    id = db.Column(db.Integer, primary_key=True)
    platform_id = db.Column(db.Integer, db.ForeignKey('platforms.id'))
    name = db.Column(db.String(20), unique=True)
    builds = db.relationship('ReleaseBuilds', cascade='all,delete', lazy='dynamic', order_by="desc(ReleaseBuilds.date_created)")

class ReleaseBuilds(db.Model):
    __tablename__ = 'release_builds'
    id = db.Column(db.Integer, primary_key=True)
    release_id = db.Column(db.Integer, db.ForeignKey('releases.id'))
    name = db.Column(db.String(150), nullable=False)
    artifacts = db.relationship('ReleaseBuildArtifacts', cascade='all,delete', backref='builds', lazy='dynamic')
    deployments = db.relationship('Deployments', cascade='all,delete', lazy='dynamic')
    tests = db.relationship('Test', cascade='delete', lazy='dynamic')

class ReleaseBuildArtifacts(db.Model):
    __tablename__ = 'release_build_artifacts'
    id = db.Column(db.Integer, primary_key=True)
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'))
    application_id = db.Column(db.Integer, db.ForeignKey('applications.id'))
    rpm = db.Column(db.String(300))
    build = db.relationship('ReleaseBuilds')
    application = db.relationship('Application')

class Deployments(db.Model):
    __tablename__ = 'deployments'
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'), primary_key=True)
    environment_id = db.Column(db.Integer, db.ForeignKey('environments.id'), primary_key=True)
    date_deployed = db.Column(db.DateTime(timezone=False), default=datetime.datetime.utcnow)
    environment = db.relationship('Environment', foreign_keys=[environment_id])

class TestType(db.Model):
    __tablename__ = 'test_types'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(50), unique=True)

class Test(db.Model):
    __tablename__ = 'tests'
    id = db.Column(db.Integer, primary_key=True)
    release_build_id = db.Column(db.Integer, db.ForeignKey('release_builds.id'), nullable=False)
    environment_id = db.Column(db.Integer, db.ForeignKey('environments.id'), nullable=False)
    test_type_id = db.Column(db.Integer, db.ForeignKey('test_types.id'))
    name = db.Column(db.String(300))
    environments = db.relationship('Environment', foreign_keys=[environment_id])
    results = db.relationship('TestResult', cascade='all,delete', lazy='dynamic')
    __table_args__ = (
        ForeignKeyConstraint(['release_build_id', 'environment_id'],
                             ['deployments.release_build_id', 'deployments.environment_id']),
    )

class TestResult(db.Model):
    __tablename__ = 'test_results'
    id = db.Column(db.Integer, primary_key=True)
    test_id = db.Column(db.Integer, db.ForeignKey('tests.id'), nullable=False)
    name = db.Column(db.String(500))
    passed = db.Column(db.Boolean)
Any suggestions as to why this cascade delete is not working?
I came across a similar issue in our project, where we define cascades at the ORM level and also use lazy='dynamic' relationships. This caused the cascade not to run on the bottom-most children.
Dynamic loading causes the relationship to return a Query object when accessed.
Delete on queries is quite limited, in order to increase performance, as documented here:
https://docs.sqlalchemy.org/en/13/orm/query.html
The method does not offer in-Python cascading of relationships - it is assumed that ON DELETE CASCADE/SET NULL/etc. is configured for any foreign key references which require it, otherwise the database may emit an integrity violation if foreign key references are being enforced.
After the DELETE, dependent objects in the Session which were impacted by an ON DELETE may not contain the current state, or may have been deleted. This issue is resolved once the Session is expired, which normally occurs upon Session.commit() or can be forced by using Session.expire_all(). Accessing an expired object whose row has been deleted will invoke a SELECT to locate the row; when the row is not found, an ObjectDeletedError is raised.
Therefore a solution for your problem could be either defining cascades on the database level, or using other types of relationships.
A related question was raised here: SQLAlchemy delete doesn't cascade
EDIT: the solution I applied was changing the loading type at the query level (in options).
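For the database-level route mentioned above, a trimmed sketch limited to the Test/TestResult pair (not the poster's full schema, and the connection URI is a placeholder) could look like this:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql:///example'  # placeholder
db = SQLAlchemy(app)

class Test(db.Model):
    __tablename__ = 'tests'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(300))
    # passive_deletes=True: do not try to cascade in Python; the database's
    # ON DELETE CASCADE on test_results.test_id removes the rows instead
    results = db.relationship('TestResult', passive_deletes=True, lazy='dynamic')

class TestResult(db.Model):
    __tablename__ = 'test_results'
    id = db.Column(db.Integer, primary_key=True)
    test_id = db.Column(db.Integer,
                        db.ForeignKey('tests.id', ondelete='CASCADE'),
                        nullable=False)
    name = db.Column(db.String(500))
    passed = db.Column(db.Boolean)

The same ondelete/passive_deletes pair would be needed at each level of the hierarchy if the whole chain should be removed by the database.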

How to nullify children's foreign key when the parent is deleted, using SQLAlchemy?

I have a basic Flask application with Parent and Child models like this:
class Parent(db.Model):
    __tablename__ = 'parents'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String)

class Child(db.Model):
    __tablename__ = 'children'
    id = db.Column(db.Integer, primary_key=True)
    parent_id = db.Column(db.Integer, db.ForeignKey('parents.id'), nullable=False)
    parent = db.relationship('Parent', backref=db.backref('children', cascade='all,delete'))
    name = db.Column(db.String)
I am using Postgres as the database, in case it matters.
Now I want to do the following: remove cascade='all,delete' from Child and make parent_id nullable, i.e. when a Parent is removed from the DB, its Child rows stay in place with parent_id == NULL.
I know that I could specify this in the schema creation script by adding a constraint to the FK. But I just want to mark the column as nullable and let SQLAlchemy take care of nulling out the children's FK.
It's explained in the relevant section of the documentation in great detail. Make sure you also read the "ORM-level “delete” cascade vs. FOREIGN KEY level “ON DELETE” cascade" section to understand the differences between the proposed solutions.
ORM level
Now I want to do following: remove cascade='all,delete' from child and make this parent_id nullable.
Do that and you will get the exact behaviour you want.
class Child(db.Model):
    __tablename__ = 'children'
    id = db.Column(db.Integer, primary_key=True)
    parent_id = db.Column(db.Integer, db.ForeignKey('parents.id'), nullable=True)
    parent = db.relationship('Parent', backref=db.backref('children'))
    name = db.Column(db.String)
Also note that all is a synonym for save-update, merge, refresh-expire, expunge, delete, so all, delete is the same as simply all.
DB level
If you want to have an ON DELETE SET NULL constraint at the database level, you can specify ondelete='SET NULL' in the ForeignKey definition or do nothing (since it's default behaviour for the foreign key). To get it working at the DB level you also need to set passive_deletes to either True or 'all' (see the docs for the difference).
class Child(db.Model):
    __tablename__ = 'children'
    id = db.Column(db.Integer, primary_key=True)
    parent_id = db.Column(db.Integer, db.ForeignKey('parents.id', ondelete='SET NULL'), nullable=True)
    parent = db.relationship('Parent', backref=db.backref('children', passive_deletes=True))
    name = db.Column(db.String)
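To illustrate the effect (a sketch only; parent is assumed to be an already-committed Parent instance that has children):

db.session.delete(parent)
db.session.commit()              # the database sets children.parent_id to NULL

child = Child.query.first()      # re-fetched after the commit expired the session
print(child.parent_id)           # None
print(child.parent)              # None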

One-to-many relationship to multiple models

I have a model Thing and a model Action. There is a one-to-many relationship between Things and Actions. However, I would like to be able to subclass Action to have (for example) BuildAction, HealAction and BloodyStupidAction. Is it possible using Flask-SQLAlchemy to do this and maintain the single one-to-many relationship?
This problem is described in the SQLAlchemy docs under Inheritance Configuration. If your different subclasses will share the same database table, you should use single table inheritance.
Code example:
class Thing(db.Model):
    __tablename__ = 'thing'
    id = db.Column(db.Integer, primary_key=True)
    actions = db.relationship('Action', backref=db.backref('thing'))

class Action(db.Model):
    __tablename__ = 'action'
    id = db.Column(db.Integer, primary_key=True)
    thing_id = db.Column(db.Integer, db.ForeignKey('thing.id'))
    discriminator = db.Column('type', db.String(50))
    __mapper_args__ = {'polymorphic_on': discriminator}

class BuildAction(Action):
    __mapper_args__ = {'polymorphic_identity': 'build_action'}
    time_required = db.Column(db.Integer)
Each subclass of Action should inherit the thing relationship defined in the parent class. The action.type column records which Action subclass each row of the table represents.
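For illustration, usage then looks roughly like this (a sketch; the time_required value is made up). The single relationship accepts any Action subclass, and rows come back as instances of the concrete subclass chosen via the type discriminator:

thing = Thing()
thing.actions.append(BuildAction(time_required=5))
db.session.add(thing)
db.session.commit()

# the collection holds instances of the concrete subclasses
print([type(a).__name__ for a in thing.actions])  # ['BuildAction']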

What's the proper way to describe an association object with SQLAlchemy the declarative way?

I'm looking for a way to describe an association object the declarative way. Beyond storing the foreign keys in the association table, I need to store information like the creation date of the association.
Today, my model looks like this:
# Define the User class
class User(Base):
    __tablename__ = 'users'

    # Define User fields
    id = schema.Column(types.Integer(unsigned=True),
                       schema.Sequence('users_seq_id', optional=True), primary_key=True)
    password = schema.Column(types.Unicode(64), nullable=False)

# Define the UserSubset class
class UserSubset(Base):
    __tablename__ = 'subsets'

    # Define UserSubset fields
    id = schema.Column(types.Integer(unsigned=True),
                       schema.Sequence('subsets_seq_id', optional=True), primary_key=True)
    some_short_description = schema.Column(types.Unicode(50), nullable=False)

# Define the subset memberships table
subset_memberships = schema.Table('group_memberships', Base.metadata,
    schema.Column('user_id', types.Integer(unsigned=True), ForeignKey('users.id')),
    schema.Column('subset_id', types.Integer(unsigned=True), ForeignKey('subsets.id')),
    schema.Column('created', types.DateTime(), default=now, nullable=False),
)
Can I connect everything in an association object? Or should I stop using the declarative way?
What you are using at the moment is just a many-to-many relation. How to work with association objects is described in the docs.
There is also an extension called associationproxy which simplifies the relation.
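For reference, a minimal association-object sketch based on the models above might look like this (untested; the class name UserSubsetMembership and the relationship names are made up for illustration). The membership table becomes a mapped class that carries the created column:

import datetime

from sqlalchemy import Column, DateTime, ForeignKey, Integer, Unicode
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    password = Column(Unicode(64), nullable=False)

class UserSubset(Base):
    __tablename__ = 'subsets'
    id = Column(Integer, primary_key=True)
    some_short_description = Column(Unicode(50), nullable=False)

class UserSubsetMembership(Base):
    # Association object: one row per (user, subset) pair, plus extra data.
    __tablename__ = 'group_memberships'
    user_id = Column(Integer, ForeignKey('users.id'), primary_key=True)
    subset_id = Column(Integer, ForeignKey('subsets.id'), primary_key=True)
    created = Column(DateTime, default=datetime.datetime.utcnow, nullable=False)

    user = relationship('User', backref='subset_memberships')
    subset = relationship('UserSubset', backref='user_memberships')

A membership is then created explicitly, e.g. UserSubsetMembership(user=some_user, subset=some_subset), and associationproxy can hide the intermediate object if plain user.subsets access is preferred.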
As you can see in the manual, configuring a one-to-many relation is really simple:
class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    addresses = relation("Address", backref="user")

class Address(Base):
    __tablename__ = 'addresses'
    id = Column(Integer, primary_key=True)
    email = Column(String(50))
    user_id = Column(Integer, ForeignKey('users.id'))
Many-to-many relations aren't much harder:
There’s nothing special about many-to-many with declarative. The secondary argument to relation() still requires a Table object, not a declarative class. The Table should share the same MetaData object used by the declarative base:
keywords = Table('keywords', Base.metadata,
    Column('author_id', Integer, ForeignKey('authors.id')),
    Column('keyword_id', Integer, ForeignKey('keywords.id'))
)

class Author(Base):
    __tablename__ = 'authors'
    id = Column(Integer, primary_key=True)
    keywords = relation("Keyword", secondary=keywords)
You should generally not map a class to the association table and also pass that same table as the secondary argument of a relation, since the ORM may issue duplicate INSERT and DELETE statements.
Anyway, what you seem to be doing might be better served with inheritance. Of course, there can be complex table relations that will be a pathological case for the declarative way, but this doesn't seem to be one of them.
One more thing: code comments should state what the following code does and why, not how it does it. Having a # Define the User class comment is almost like having a line of code saying a = 1 # assign value 1 to variable "a".
