Reusing SQLAlchemy models across projects

I have some standard SQLAlchemy models that I reuse across projects. Something like this:
from sqlalchemy import Column, Integer, String, Unicode
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Category(Base):
    __tablename__ = 'category'

    id = Column(Integer, primary_key=True)
    slug = Column(String(250), nullable=False, unique=True)
    title = Column(Unicode(250), nullable=False)

    def __call__(self):
        return self.title
I'd like to put this in a shared library and import it into each new project instead of cutting and pasting it, but I can't, because the declarative_base instance is defined separately in the project. If there's more than one, they won't share sessions. How do I work around this?
Here's another question that suggests using mixin classes. Could that work? Will SQLAlchemy accurately import foreign keys from mixin classes?

When you call
Base = declarative_base()
SQLAlchemy creates a new MetaData object for this Base.
To reuse your models, bind the main application's metadata to the reusable models' Base, before any of the reusable models are imported:
Base.metadata = my_main_app.db.metadata
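A minimal sketch of that import order, assuming the reusable models live in a hypothetical shared_models package (Base in shared_models/base.py, the model classes in shared_models/category.py) and that my_main_app.db.metadata is the consuming project's MetaData:

# in the main project, before any shared model class is imported
from shared_models import base

# rebind the library's Base to the application's MetaData
base.Base.metadata = my_main_app.db.metadata

# only now import the model modules; their tables are registered on the
# application's metadata and therefore share its engine and sessions
from shared_models.category import Category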
Mixin classes are useful for repeating column declarations and for extending class behaviour.
To wire up reusable apps based on mixins, you must manually define a concrete class for each model in the consuming project.
Will SQLAlchemy accurately import foreign keys from mixin classes?
A mixin class with a foreign key and a constraint:
from sqlalchemy import Column, DateTime, ForeignKey, Integer
from sqlalchemy.schema import UniqueConstraint
from sqlalchemy.ext.declarative import declared_attr

class MessageMixIn(object):
    ttime = Column(DateTime)

    @declared_attr
    def sometable_id(cls):
        return Column(Integer, ForeignKey('sometable.id'))

    @declared_attr
    def __table_args__(cls):
        return (UniqueConstraint('sometable_id', 'ttime'), {})
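For completeness, a minimal sketch of the concrete class each project would declare (the Message name and its id column are assumptions, not part of the original answer):

class Message(MessageMixIn, Base):
    __tablename__ = 'message'
    id = Column(Integer, primary_key=True)

# When Message is constructed, the @declared_attr callables run: the
# ForeignKey to 'sometable.id' and the UniqueConstraint are attached to
# the 'message' table on this project's metadata.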

Related

Is it possible to rename the metadata attribute of a SQLAlchemy declarative base?

I'm trying to set up a database with a few specific fields (and I can't move away from the specification). One of the fields would be a column called metadata, but sqlalchemy prevents that:
sqlalchemy.exc.InvalidRequestError: Attribute name 'metadata' is reserved for the MetaData instance when using a declarative base class.
Is there a decent workaround for this? Do I need to monkeypatch the declarative_base function to rename the metadata attribute? I couldn't find an option to rename that attribute in the api docs.
Here's some example code that will fail with the above error:
#!/usr/bin/env python3.7
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy import Column, Integer

class CustomBase(object):
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

DBBase = declarative_base(cls=CustomBase)

class Data(DBBase):
    id = Column(Integer, primary_key=True)
    metadata = Column(Integer)

if __name__ == "__main__":
    print(dir(Data()))
You can do it like this:
class Data(DBBase):
    id = Column(Integer, primary_key=True)
    # metadata = Column(Integer)
    metadata_ = Column("metadata", Integer)
The Column constructor takes the database column name as an optional first argument; see https://docs.sqlalchemy.org/en/13/core/metadata.html#sqlalchemy.schema.Column:
The name field may be omitted at construction time and applied later
In other words, you can keep metadata as the column name in the database while using a different attribute name on the class.
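A quick usage sketch (session here is assumed to be a configured Session), just to show the split between the Python attribute and the database column name:

# Python code uses the attribute name metadata_ ...
row = session.query(Data).filter(Data.metadata_ == 42).first()

# ... while the emitted SQL uses the real column name, roughly:
# SELECT data.id, data.metadata FROM data WHERE data.metadata = ?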

Create a table from a SQLAlchemy model with a different name?

I am importing data from csv files into a table created using the SQLAlchemy declarative api.
I receive updates to this data which I want to stage in a temporary table with the same structure for preprocessing.
E.g:
from sqlalchemy import Column, Integer
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyModel(Base):
    __tablename__ = "mymodel"
    test_column = Column(Integer, primary_key=True)
I can use MyModel.__table__.create() to create this table.
Can I use a similar construct to create another table with the same model and a different name?
What would be the recommended way to achieve this?
Just extend your existing model and give it a different table name:
class StagingMyModel(MyModel):
    __tablename__ = "staging_mymodel"
This worked for me:
from sqlalchemy import Column, Integer
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyBase(Base):
    __abstract__ = True
    test_column = Column(Integer, primary_key=True)

class MyModel(MyBase):
    __tablename__ = "mymodel"

class MyStagingModel(MyBase):
    __tablename__ = "mymodel_staging"
Reference: https://docs.sqlalchemy.org/en/14/orm/declarative_tables.html#using-deferredreflection
See also https://sparrigan.github.io/sql/sqla/2016/01/03/dynamic-tables.html for an approach that probably jibes with what you were originally thinking of with ModelName.__table__.create().
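A further option, not taken from the answers above but offered as a hedged sketch: SQLAlchemy can copy a Core Table under a new name with Table.to_metadata() (1.4+; Table.tometadata() in earlier versions), which avoids declaring a second model at all:

# copy the mapped table into the same MetaData under a different name
staging_table = MyModel.__table__.to_metadata(Base.metadata, name="staging_mymodel")
staging_table.create(engine)   # 'engine' is assumed to be configured elsewhere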

SQLAlchemy cannot find a class name

Simplified, I have the following class structure (in a single file):
Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(BigInteger, primary_key=True)
    # ... skip other attrs ...

class Auction(Base):
    __tablename__ = 'auction'
    id = Column(BigInteger, primary_key=True)
    # ... skipped ...
    item_id = Column('item', BigInteger, ForeignKey('item.id'))
    item = relationship('Item', backref='auctions')
I get the following error from this:
sqlalchemy.exc.InvalidRequestError
InvalidRequestError: When initializing mapper Mapper|Auction|auction, expression
'Item' failed to locate a name ("name 'Item' is not defined"). If this is a
class name, consider adding this relationship() to the Auction class after
both dependent classes have been defined.
I'm not sure how Python cannot find the Item class, as even when passing the class, rather than the name as a string, I get the same error. I've been struggling to find examples of how to do simple relationships with SQLAlchemy so if there's something fairly obvious wrong here I apologise.
This all turned out to be because of the way I've set SQLAlchemy up in Pyramid. Essentially you need to follow this section to the letter and make sure you use the same declarative_base instance as the base class for each model.
I was also not binding a database engine to my DBSession which doesn't bother you until you try to access table metadata, which happens when you use relationships.
If the classes live in a subpackage, add imports for Item and Auction to that subpackage's __init__.py.
The SQLAlchemy documentation on Importing all SQLAlchemy Models states in part:
However, due to the behavior of SQLAlchemy's "declarative" configuration mode, all modules which hold active SQLAlchemy models need to be imported before those models can successfully be used. So, if you use model classes with a declarative base, you need to figure out a way to get all your model modules imported to be able to use them in your application.
Once I imported all of the models (and relationships), the error about not finding the class name was resolved.
Note: My application does not use Pyramid, but the same principles apply.
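As a hedged illustration of the last two answers (the module layout is an assumption), the package initializer simply imports every model module so that all classes are registered on the shared Base before any mapper is configured:

# models/__init__.py  (hypothetical layout)
from .base import Base          # the single shared declarative Base
from .item import Item          # importing the modules registers the classes,
from .auction import Auction    # so relationship('Item') can be resolved by name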
My case
Two models were defined in separate files, one Parent and one Child, related by a foreign key. When I tried to use a Child object in Celery, it raised:
sqlalchemy.exc.InvalidRequestError: When initializing mapper Mapper|Child|child, expression 'Parent' failed to locate a name ("name 'Parent' is not defined"). If this is a class name, consider adding this relationship() to the <class 'app.models.child'>
parent.py
from app.models import *

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    name = Column(String(60), nullable=False, unique=True)
    number = Column(String(45), nullable=False)
child.py
from app.models import *

class Child(Base):
    __tablename__ = 'child'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    parent_id = Column(ForeignKey('parent.id'), nullable=False)
    name = Column(String(60), nullable=False)
    parent = relationship('Parent')
Solution
Add an import statement for Parent at the beginning of child.py.
child.py (modified)
from app.models import *
from app.models.parent import Parent  # import Parent in child.py 👈👈

class Child(Base):
    __tablename__ = 'child'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    parent_id = Column(ForeignKey('parent.id'), nullable=False)
    name = Column(String(60), nullable=False)
    parent = relationship('Parent')
Why this worked
The order in which model modules get imported is not fixed.
In my case, Child was being loaded before Parent, so SQLAlchemy could not resolve the name 'Parent'. Importing Parent explicitly ensures it is registered before Child is mapped.
Namaste 🙏
I solved the same error by inheriting from db.Model instead of Base, but I'm using Flask-SQLAlchemy.
Eg:
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class someClass(db.Model):
    someRelation = db.relationship("otherClass")
Also, even though this doesn't apply to the OP, for anyone landing here having gotten the same error, check to make sure that none of your table names have dashes in them.
For example, a table named "movie-genres" that is then used as a secondary in a SQLAlchemy relationship will generate the same error, "name 'movie' is not defined", because the string is evaluated as a Python expression: the dash is read as a minus sign, so SQLAlchemy tries to look up a name movie. Switching to underscores (instead of dashes) solves the problem.
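A minimal sketch of that gotcha, with hypothetical Movie/Genre models and the usual imports (Table, Column, Integer, ForeignKey, relationship) assumed:

movie_genres = Table(
    "movie_genres", Base.metadata,   # underscores: resolvable as a single name
    Column("movie_id", ForeignKey("movie.id"), primary_key=True),
    Column("genre_id", ForeignKey("genre.id"), primary_key=True),
)

class Movie(Base):
    __tablename__ = "movie"
    id = Column(Integer, primary_key=True)
    # secondary="movie-genres" would fail: the string is evaluated as the
    # expression `movie - genres`, giving "name 'movie' is not defined"
    genres = relationship("Genre", secondary="movie_genres")

class Genre(Base):
    __tablename__ = "genre"
    id = Column(Integer, primary_key=True)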
My Solution
A single models.py file that imports the other model modules (split it further if you need to).
models.py
from sqlalchemy import Boolean, BigInteger, Column, DateTime, Float, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

# import the model modules only after Base is defined; always import this
# module rather than parent/child directly
from .parent import Parent
from .child import Child
parent.py
from sqlalchemy import Boolean, BigInteger, Column, DateTime, Float, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from .models import Base   # reuse the single Base; do NOT call declarative_base() again here

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    name = Column(String(60), nullable=False, unique=True)
    number = Column(String(45), nullable=False)
child.py
from sqlalchemy import Boolean, BigInteger, Column, DateTime, Float, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from .models import Base   # again, reuse the Base defined in models.py

class Child(Base):
    __tablename__ = 'child'
    id = Column(BigInteger, primary_key=True, autoincrement=True)
    parent_id = Column(ForeignKey('parent.id'), nullable=False)
    name = Column(String(60), nullable=False)
    parent = relationship('Parent')
Why this worked
Same as Deepam's answer, but with a single models.py file that imports the other model modules.
I had a different error, but the answers in here helped me fix it.
The error I received:
sqlalchemy.exc.InvalidRequestError: When initializing mapper mapped class Parent->parents, expression 'Child' failed to locate a name ('Child'). If this is a class name, consider adding this relationship() to the <class 'parent.Parent'> class after both dependent classes have been defined.
My set-up is similar to Deepam's answer.
Briefly, what I do differently:
I have a separate .py file for each db.Model.
I use a construct/fill-database .py script that pre-fills the db.Model objects, either multi-threaded or single-threaded.
What caused the error:
The error occurred only in the multi-threaded set-up.
The construct/fill script imported Parent, but not Child.
What fixed it:
Adding an import for Child fixed it.
I had yet another solution, but this helped clue me in. I was trying to implement versioning, from https://docs.sqlalchemy.org/en/14/orm/examples.html#versioning-objects using the "history_mapper" class.
I got this same error. All I had to do to fix it was change the order in which my models were imported.
Use back_populates for relationship mapping in both models.
Also keep in mind to import both models in models/__init__.py.
Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(BigInteger, primary_key=True)
    # ... skip other attrs ...
    auctions = relationship('Auction', back_populates='item')

class Auction(Base):
    __tablename__ = 'auction'
    id = Column(BigInteger, primary_key=True)
    # ... skipped ...
    item_id = Column('item', BigInteger, ForeignKey('item.id'))
    item = relationship('Item', back_populates='auctions')

SQLAlchemy "event.listen" for all models

I have created_by and updated_by fields in each of my models. These fields are filled automatically via sqlalchemy.event.listen (formerly MapperExtension). For each model I write:
event.listen(Equipment, 'before_insert', get_created_by_id)
event.listen(Equipment, 'before_update', get_updated_by_id)
When there are many models, the code gets ugly. Is it possible to apply event.listen to all models at once, or to several of them?
UPD: I'm trying to do it like this:
import pylons
from sqlalchemy import event, sql
from sqlalchemy import Table, ForeignKey, Column
from sqlalchemy.databases import postgresql
from sqlalchemy.schema import UniqueConstraint, CheckConstraint
from sqlalchemy.types import String, Unicode, UnicodeText, Integer, DateTime,\
    Boolean, Float
from sqlalchemy.orm import relation, backref, synonym, relationship
from sqlalchemy import func
from sqlalchemy import desc
from sqlalchemy.orm.exc import NoResultFound
from myapp.model.meta import Session as s
from myapp.model.meta import metadata, DeclarativeBase
from pylons import request

def created_by(mapper, connection, target):
    identity = request.environ.get('repoze.who.identity')
    if identity:
        id = identity['user'].user_id
        target.created_by = id

def updated_by(mapper, connection, target):
    identity = request.environ.get('repoze.who.identity')
    if identity:
        id = identity['user'].user_id
        target.updated_by = id

from sqlalchemy.ext.declarative import declared_attr
from sqlalchemy.ext.declarative import has_inherited_table

class TestMixin(DeclarativeBase):
    __tablename__ = 'TestMixin'
    id = Column(Integer, autoincrement=True, primary_key=True)

event.listen(TestMixin, 'before_insert', created_by)
event.listen(TestMixin, 'before_update', updated_by)

class MyClass(TestMixin):
    __tablename__ = 'MyClass'
    __mapper_args__ = {'concrete': True}
    id = Column(Integer, autoincrement=True, primary_key=True)
    created_by = Column(Integer, ForeignKey('user.user_id',
        onupdate="cascade", ondelete="restrict"))
    updated_by = Column(Integer, ForeignKey('user.user_id',
        onupdate="cascade", ondelete="restrict"))
When I add a new MyClass object, created_by is None. If I register event.listen on MyClass directly, everything works. What's wrong?
Inherit all your models from the base class and subscribe to that base class:
event.listen(MyBaseMixin, 'before_insert', get_created_by_id, propagate=True)
event.listen(MyBaseMixin, 'before_update', get_updated_by_id, propagate=True)
See more on Mixin and Custom Base Classes
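A hedged sketch of that pattern (the MyBaseMixin class and the plain Integer columns are assumptions; the listener functions are the get_created_by_id/get_updated_by_id handlers from the question, and DeclarativeBase is the base from myapp.model.meta):

from sqlalchemy import Column, Integer, event

class MyBaseMixin(object):
    created_by = Column(Integer)   # plain columns; a ForeignKey column would need @declared_attr
    updated_by = Column(Integer)

class Equipment(MyBaseMixin, DeclarativeBase):
    __tablename__ = 'equipment'
    id = Column(Integer, primary_key=True)

# registered once on the mixin; propagate=True makes the listeners fire for
# Equipment and for every other mapped subclass of MyBaseMixin
event.listen(MyBaseMixin, 'before_insert', get_created_by_id, propagate=True)
event.listen(MyBaseMixin, 'before_update', get_updated_by_id, propagate=True)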
In newer versions of sqlalchemy (1.2+), the following event targets are available:
mapped classes (that is, subscribe to every model)
unmapped superclasses (that is, Base, and mixins, using the propagate=True flag)
Mapper objects
Mapper class itself
So, in order to listen to all instance events, you can listen on Mapper itself:
from typing import Set, Optional
import sqlalchemy as sa
import sqlalchemy.orm.query
import sqlalchemy.event

@sa.event.listens_for(sa.orm.Mapper, 'refresh', named=True)
def on_instance_refresh(target: type,
                        context: sa.orm.query.QueryContext,
                        attrs: Optional[Set[str]]):
    ssn: sqlalchemy.orm.Session = context.session
    print(target, attrs)
This way you get an application-wide event listener.
If you only want to listen to your own models, use the Base class instead:
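A small sketch of that Base-class variant, assuming the declarative Base and a hypothetical handler name:

@sa.event.listens_for(Base, 'before_insert', propagate=True)
def on_before_insert(mapper, connection, target):
    # fires for every model that inherits from this Base, but not for
    # models mapped against a different declarative Base
    print("inserting", target)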
