SQLAlchemy: avoiding multiple inheritance and having an abstract base class - python

So I have a bunch of tables using SQLAlchemy that are modelled as objects which inherit from the result of a call to declarative_base(). I.e.:
Base = declarative_base()

class Table1(Base):
    # __tablename__ & such here

class Table2(Base):
    # __tablename__ & such here
Etc. I then wanted to have some common functionality available to each of my DB table classes, the easiest way to do this according to the docs is to just do multiple inheritance:
Base = declarative_base()

class CommonRoutines(object):
    @classmethod
    def somecommonaction(cls):
        # body here

class Table1(CommonRoutines, Base):
    # __tablename__ & such here

class Table2(CommonRoutines, Base):
    # __tablename__ & such here
The thing I don't like about this is A) multiple inheritance in general is a bit icky (it gets tricky resolving things like super() calls, etc.), B) if I add a new table I have to remember to inherit from both Base and CommonRoutines, and C) really, that "CommonRoutines" class "is-a" type of table in a sense. Really, what CommonBase is is an abstract base class which defines a set of fields & routines common to all tables. Put another way: "it's-a" abstract table.
So, what I'd like is this:
Base = declarative_base()

class AbstractTable(Base):
    __metaclass__ = ABCMeta  # make into abstract base class

    # define common attributes for all tables here, like maybe:
    id = Column(Integer, primary_key=True)

    @classmethod
    def somecommonaction(cls):
        # body here

class Table1(AbstractTable):
    # __tablename__ & Table1 specific fields here

class Table2(AbstractTable):
    # __tablename__ & Table2 specific fields here
But this of course doesn't work, as A) I then have to define a __tablename__ for AbstractTable, B) the ABC aspect of things causes all sorts of headaches, and C) I'd have to indicate some sort of DB relationship between AbstractTable and each individual table.
So my question: is it possible to achieve this in a reasonable way? Ideally I'd like to enforce:
No multiple inheritance
CommonBase/AbstractTable be abstract (i.e. it cannot be instantiated)

SQLAlchemy version 0.7.3 introduced the __abstract__ directive which is used for abstract classes that should not be mapped to a database table, even though they are subclasses of sqlalchemy.ext.declarative.api.Base. So now you create a base class like this:
Base = declarative_base()

class CommonRoutines(Base):
    __abstract__ = True

    id = Column(Integer, primary_key=True)

    def __init__(self):
        # ...
Notice how CommonRoutines doesn't have a __tablename__ attribute. Then create subclasses like this:
class Foo(CommonRoutines):
    __tablename__ = 'foo'

    name = Column(...)

    def __init__(self, name):
        super().__init__()
        self.name = name
        # ...
This will map to the table foo and inherit the id attribute from CommonRoutines.
Source and more information: http://docs.sqlalchemy.org/en/rel_0_7/orm/extensions/declarative.html#abstract
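For reference, here is a minimal, self-contained sketch of the pattern above (the in-memory SQLite engine and the String type for name are assumptions added for illustration). Only foo gets a table; the abstract base itself is never mapped.

from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class CommonRoutines(Base):
    __abstract__ = True
    id = Column(Integer, primary_key=True)

class Foo(CommonRoutines):
    __tablename__ = 'foo'
    name = Column(String(50))

engine = create_engine('sqlite://')      # throwaway in-memory database
Base.metadata.create_all(engine)         # creates only the 'foo' table
print('foo' in Base.metadata.tables)     # True
print(Foo.__table__.columns.keys())      # includes both 'name' and the inherited 'id'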

It is pretty straightforward: you just make declarative_base() return a Base class which inherits from your CommonBase using the cls= parameter, as also shown in the Augmenting The Base docs. Your code might then look similar to the below:
class CommonBase(object):
    @classmethod
    def somecommonaction(cls):
        # body here

Base = declarative_base(cls=CommonBase)

class Table1(Base):
    # __tablename__ & Table1 specific fields here

class Table2(Base):
    # __tablename__ & Table2 specific fields here

You can use AbstractConcreteBase to make an abstract base model:
from sqlalchemy import Column, Integer
from sqlalchemy.ext.declarative import AbstractConcreteBase

class AbstractTable(AbstractConcreteBase, Base):
    id = Column(Integer, primary_key=True)

    @classmethod
    def somecommonaction(cls):
        # body here
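Note that AbstractConcreteBase sets up concrete table inheritance, so each subclass is mapped to its own full table. Following the concrete-inheritance pattern in the SQLAlchemy docs, a subclass would look roughly like this (the table name, polymorphic identity, and extra column are assumptions for illustration):

from sqlalchemy import Column, String

class Table1(AbstractTable):
    __tablename__ = 'table1'
    __mapper_args__ = {'polymorphic_identity': 'table1', 'concrete': True}

    name = Column(String(50))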

If you want to have several models with common columns, then you can use __abstract__ and @declared_attr to inherit shared table attributes. Example:
Base = declarative_base()

class CommonRoutines(Base):
    __abstract__ = True

    id = Column(Integer, primary_key=True)
    modified_at = Column(DateTime)

    @declared_attr
    def modified_by(self):
        # `user.id` is another table called `user` with an `id` field
        return Column(Integer, ForeignKey('user.id', name='fk_modified_by_user_id'))

    def __init__(self):
        self.modified_by = None
        super().__init__()

class Foo(CommonRoutines):
    __tablename__ = 'foo'

    name = Column(...)
With this solution you will have a foo table with the fields of the Foo class (name) plus the ones from CommonRoutines (id, modified_at and modified_by).
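As a quick check of that claim, a sketch of creating the schema (assuming name is given a concrete type such as String instead of the elided Column(...), and adding a minimal user table as the target of the foreign key):

from sqlalchemy import create_engine, Column, Integer

class User(Base):
    # assumed referent of the 'user.id' foreign key above
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)

engine = create_engine('sqlite://')           # throwaway in-memory database
Base.metadata.create_all(engine)
print(sorted(Foo.__table__.columns.keys()))   # ['id', 'modified_at', 'modified_by', 'name']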

Related

How to create an abstract subclass of SQLAlchemy's Table(Base) class

I am using SQLAlchemy to create tables in my project. I have a requirement where all these tables should have some specific attributes and functions. I want to create a structure such that all tables inherit from an abstract class which includes these attributes and functions.
Here's an example of what I want to achieve:
Base = declarative_base()

# pseudo
class Table(ABC, Base):
    # like @abstractattribute
    some_attribute = list()

    @staticmethod
    def some_func():
        pass

class Users(Table):
    __tablename__ = "users"

    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)

    some_attribute = list()

    @staticmethod
    def some_func():
        do_something()
By doing this, I hope that I can use these classes in something like:
Base.metadata.create_all(engine)
while also being able to call:
Users.some_func()
I understand that this code wouldn't work as is, due to issues like having ABC and Base at the same time, not having @abstractattribute, and needing to add __tablename__ and a primary-key Column to the class Table.
I am thinking of using a decorator to achieve this, but I am not sure how to implement it correctly. This is the outline of my idea:
class Table(ABC):
    some_attribute = None

    @staticmethod
    def some_func():
        pass

# create decorator
def sql_table():
    def decorator(abstract_class):
        class SQLTable(Base):  # How do I name the class correctly?
            __tablename__ = abstract_class.__dict__["__tablename__"]
            some_attribute = abstract_class.__dict__["some_attribute"]
            for name, obj in abstract_class.__dict__.items():
                if isinstance(obj, Column):
                    locals()[name] = obj
            # How do I get the some_func function?

@sql_table
class Users(Table):
    __tablename__ = "users"

    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)

    some_attribute = "some_val"

    @staticmethod
    def some_func():
        do_something()
Any help or suggestions on how to implement this (not necessarily with decorators) would be greatly appreciated.
Thanks to @snakecharmerb and @ljmc I have found a solution that works for me, although there seem to be many ways one can achieve this.
The solution that works for me is:
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy import Column, Integer, String

Base = declarative_base()

class Table(Base):
    __abstract__ = True

    @declared_attr
    def __tablename__(cls) -> str:  # so I don't have to specify it any more
        return cls.__name__.lower()

    some_attribute = set()  # this is the default

    @staticmethod
    def some_func():  # define default behavior (or pass)
        do_something()

class Users(Table):
    # define columns as usual
    user_id = Column(Integer, primary_key=True)
    username = Column(String, nullable=False)

    some_attribute = set(["a"])  # overwrite the default

    @staticmethod
    def some_func():  # overwrite the default behavior
        do_something_else()
Now, this could be improved upon by specifying a type for some_attribute (typing is awesome).
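A quick sanity check of how this behaves, assuming do_something and do_something_else are defined elsewhere: the table name is derived automatically from the class name and the static method is available on every subclass.

from sqlalchemy import create_engine

print(Users.__tablename__)            # 'users', derived from the class name
Users.some_func()                     # runs the overridden behavior

engine = create_engine('sqlite://')   # throwaway in-memory database
Base.metadata.create_all(engine)      # creates the 'users' table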

How to extend the SQLAlchemy Base class with a static method

I have multiple classes similar to the following:
class Weather(Base):
__tablename__ = "Weather"
id = Column(Integer, primary_key=True, nullable=False, autoincrement=True)
temperature = Column(Integer)
humidity = Column(Integer)
wind_speed = Column(Float)
wind_direction = Column(String)
I want to add a method df() that returns the Pandas DataFrame of that table. I know I can write it like this:
class Weather(Base):
    __tablename__ = "Weather"

    id = Column(Integer, primary_key=True, nullable=False, autoincrement=True)
    temperature = Column(Integer)
    humidity = Column(Integer)
    wind_speed = Column(Float)
    wind_direction = Column(String)

    @staticmethod
    def df():
        with engine.connect() as conn:
            return pd.read_sql_table(Weather.__tablename__, conn)
But I want to implement this for every table. I guess that if I can extend the Base class with this method, I should be able to implement it once and use it in every class. Everything I have tried has failed because I do not have access to the __tablename__ attribute.
SOLUTION
I ended up with a mix of both answers. I used the first method proposed by @snakecharmerb (it allows introducing the change without modifying the rest of the code) together with the @classmethod proposed by @RomanPerekhrest (which is the bit I was missing).
class MyBase:
    __tablename__ = None

    @classmethod
    def df(cls):
        with engine.connect() as conn:
            return pd.read_sql_table(cls.__tablename__, conn)

Base = declarative_base(cls=MyBase)
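A minimal usage sketch of the combined approach (the in-memory engine is a placeholder, pandas is imported as pd, and Weather is assumed to be re-declared against this new Base as in the question): every model now exposes df() and reads its own table.

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('sqlite://')   # placeholder; df() looks this module-level name up at call time
Base.metadata.create_all(engine)

df = Weather.df()                     # reads the 'Weather' table into a DataFrame
print(df.columns.tolist())            # column names of the Weather table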
You can do this by passing a custom class to the declarative_base function:
class MyBase:
    __abstract__ = True

    @staticmethod
    def df():
        with engine.connect() as conn:
            return pd.read_sql_table(Weather.__tablename__, conn)

Base = orm.declarative_base(cls=MyBase)

class MyModel(Base):
    __tablename__ = 'tbl'
    ...
Alternatively, you can create a mixin that provides the static method and have classes inherit from it selectively.
class DFMixin:
    @staticmethod
    def df():
        with engine.connect() as conn:
            return pd.read_sql_table(Weather.__tablename__, conn)

class MyModel(Base, DFMixin):
    __tablename__ = 'tbl'
    ...
The mixin gives you more flexibility if not all of your models are going to need the dataframe functionality.
Declare an auxiliary class (say DfBase) with a classmethod df(cls) having the desired behavior.
Then each derived class will access its __tablename__ attribute seamlessly via the cls object, which refers to the derived class itself.
class DfBase:
    __tablename__ = None

    @classmethod
    def df(cls):
        with engine.connect() as conn:
            return pd.read_sql_table(cls.__tablename__, conn)

class Weather(Base, DfBase):
    __tablename__ = "Weather"
    ...

How to define two parent classes for SQLAlchemy entities?

Until now I have had a parent class Entity for all my ORM classes:
class AbstractEntity():
    id = Column(Integer, primary_key=True)

    @declared_attr
    def __tablename__(self):
        return AbstractEntity.table_name_for_class(self)
    ...

Entity = declarative_base(cls=AbstractEntity)

class Drink(Entity):
    name = Entity.stringColumn()
I want my classes only to inherit from a single class Entity, not from a class Base and a mixin Entity. That works fine.
However, now I would like to introduce another parent class EntityAssociation that I can use as a parent for all my association classes that are used for many-to-many relationships, e.g.
class DrinkIngretients(EntityAssociation):
    drink_id = Entity.foreign_key(Drink)
    ingredient_id = Entity.foreign_key(Ingredient)
    ...
The class EntityAssociation should inherit from Base = declarative_base() but not from AbstractEntity. (It should not include the column id that is defined in AbstractEntity.)
=> How can I implement that inheritance structure?
I tried
class AbstractEntity():
    id = Column(Integer, primary_key=True)

    @declared_attr
    def __tablename__(self):
        return AbstractEntity.table_name_for_class(self)
    ...

Base = declarative_base()

class Entity(Base, AbstractEntity):
    pass

class EntityAssociation(Base):
    pass
However, the behavior of
Entity = declarative_base(cls=AbstractEntity)
and
class Entity(Base, AbstractEntity):
    pass
seems to be different.
I get the error: "Class does not have a table or tablename specified and does not inherit from an existing table-mapped class."
=> How can I specify that the classes Entity and EntityAssociation should not have extra table names?
=> Any other suggestions on how to get the wanted inheritance structure?
The __abstract__ flag did the trick:
class EntityRelation(Base):
    __abstract__ = True

class Entity(Base, AbstractEntity):
    __abstract__ = True
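For completeness, a compact sketch of the resulting hierarchy under simplified assumptions (the table-name helper is replaced with a lowercased class name, and the association class gets its own explicit key and table name): neither abstract class produces a table, while the concrete subclasses do.

from sqlalchemy import Column, Integer
from sqlalchemy.ext.declarative import declarative_base, declared_attr

Base = declarative_base()

class AbstractEntity:
    id = Column(Integer, primary_key=True)

    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()    # simplified stand-in for table_name_for_class()

class Entity(Base, AbstractEntity):
    __abstract__ = True                # no table for Entity itself

class EntityAssociation(Base):
    __abstract__ = True                # no table, and no inherited id column

class Drink(Entity):
    pass                               # gets id plus the auto-derived __tablename__ 'drink'

class DrinkIngredients(EntityAssociation):
    __tablename__ = 'drink_ingredients'
    drink_id = Column(Integer, primary_key=True)

print(sorted(Base.metadata.tables))    # ['drink', 'drink_ingredients']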

SQLAlchemy ORM with dynamic table schema

I am trying to get the SQLAlchemy ORM to create a class Field that describes all fields in my database:
from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Field(Base):
    __tablename__ = 'fields'
    __table_args__ = {'schema': 'SCM'}

    id = Column(String(20), primary_key=True)
The issue is that table fields describes different fields in different schemas, i.e.
SCM.fields
TDN.fields
...
I need class Field to
Be initialized with object fieldset before records can be read from db
Schema determined by fieldset.get_schema() before table <schema>.fields is read.
Something like this:
session.query(Field(fieldset)).filter(Field.id == 'some field')
However, adding
def __init__(self, fieldset):
    pass
to class Field results in
__init__() takes 1 positional argument...
I could lump all fields tables into one schema and add a column 'schema_name', but I still need Field to have a link to its fieldset.
Can this be done using the SQLAlchemy ORM, or should I switch to SQLAlchemy Core where I would have more control over object instantiation?
So the problem is solvable, and the solution is documented as Augmenting the Base:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declared_attr, declarative_base

class Field:
    __tablename__ = 'fields'

    @declared_attr
    def __table_args__(cls):
        # schema will be part of the class name, which works for me
        # Example: class FieldSCM --> Field with schema SCM
        return dict(schema=cls.__name__[5:].upper())

    id = Column(String(20), primary_key=True)

Field = declarative_base(cls=Field)

class FieldSet:
    def __init__(self, schema):
        self.fieldtype = type('Field' + schema.upper(), (Field,), {})
Proof of concept:
FieldSet('ork').fieldtype.__table__
Table('fields', MetaData(bind=None), Column('id', String(length=20), table=<fields>, primary_key=True, nullable=False), schema='ORK')
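If it helps, this is roughly how the dynamically created class could then be queried (the connection string, the session setup, and the filter value are placeholders; the database is assumed to actually contain an ORK.fields table):

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('postgresql://user:pass@host/dbname')   # placeholder URL
Session = sessionmaker(bind=engine)
session = Session()

ork_fields = FieldSet('ork').fieldtype
row = session.query(ork_fields).filter(ork_fields.id == 'some field').first()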

An alias of an SQLAlchemy class?

This is my base class:
class Product(Base):
    __tablename__ = 'PRODUCT'
    __table_args__ = {'quote': False}
    ...
    id = Column(Integer, name='id_prod', primary_key=True)
    type = Column(String(100), name='id_typ_prod')

    __mapper_args__ = {'polymorphic_on': type}
So, naturally we have a number of classes that extend from this Product, e.g. Phone and Cable, each of them mapping to its own table.
class Phone(Product):
    __tablename__ = 'PHONE'
    ...
Now for some reasons I want to create an 'alias' class, a class that does not have a corresponding table in the database. Something like this:
class VapourWare(Product):
    ...
If I do
class VapourWare(Product):
    __tablename__ = 'PRODUCT'
    __mapper_args__ = {'polymorphic_identity': 'VapourWare'}
It seems to work. But is it the right or recommended way? I am repeating __tablename__ = 'PRODUCT' here.
To some degree it depends on what you're trying to achieve, but from your example it seems that what you're trying to do is called Single Table Inheritance in the SA docs. The example listed on the linked page seems very much like your example, with Employee == Product and Manager == VapourWare (insert Dilbert joke here).
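Adapted to this example, the single-table-inheritance pattern just omits __tablename__ on the subclass entirely, so there is no need to repeat 'PRODUCT' (a sketch; the polymorphic identity string is carried over from the question):

class VapourWare(Product):
    # no __tablename__ here: rows live in the parent PRODUCT table
    __mapper_args__ = {'polymorphic_identity': 'VapourWare'}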
