Metaclass conflict when separating classes into separate files - python

The problem was in the import. What I should have done was write: from SomeInterface import SomeInterface. (Really, I should name the module in lowercase, someinterface.py, per the Python style guide, PEP 8.)
I have a file model.py that defines all classes related to my DB, as well as instantiating my Base.
# model.py
metadata = MetaData()
DeclarativeBase = declarative_base()
metadata = DeclarativeBase.metadata

class Bar(DeclarativeBase):
    __tablename__ = 'Bar'
    __table_args__ = {}
    # column and relation definitions
The file model.py is autogenerated so I can't really touch it. What I did instead was create a file called modelaugmented.py where I add extra functionality to some of the model classes via inheritance.
# modelaugmented.py
from model import *
import SomeInterface

class BarAugmented(Bar, SomeInterface):
    pass

# SomeInterface.py
class SomeInterface(object):
    def some_method(self):
        pass
The problem I'm having is that for classes like BarAugmented, I'm getting the following error:
TypeError: Error when calling the metaclass bases
metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
I only get this error when SomeInterface is in a separate file instead of being inside modelaugmented.py.
I understand that the metaclasses for SomeInterface and Bar are different. The problem is that I can't figure out how to resolve it. I tried the solution suggested in Triple inheritance causes metaclass conflict... Sometimes, which works in the example given, but not in my case. Not sure if SQLAlchemy has anything to do with it.
class MetaAB(type(DeclarativeBase), type(SomeInterface)):
    pass

class BarAugmented(Bar, SomeInterface):
    __metaclass__ = MetaAB
But then I get the error:
TypeError: Error when calling the metaclass bases
multiple bases have instance lay-out conflict
Using SQLAlchemy 0.8 and Python 2.7.
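As the edit at the top explains, the root cause was the import: `import SomeInterface` binds the *module*, so the module object itself ended up in the bases tuple. A minimal, SQLAlchemy-free sketch (with a hypothetical stand-in module) reproduces both error messages:

```python
import types

# `import SomeInterface` binds a module, not a class; a ModuleType
# instance standing in the bases tuple triggers the same errors.
fake_module = types.ModuleType("someinterface")

# First error: type(module) and type are unrelated metaclasses.
try:
    type("BarAugmented", (object, fake_module), {})
except TypeError as exc:
    first_error = str(exc)

# Second error: building MetaAB from type(SomeInterface) mixed the
# incompatible C-level layouts of type and module.
try:
    type("MetaAB", (type, type(fake_module)), {})
except TypeError as exc:
    second_error = str(exc)

print(first_error)
print(second_error)
```

Importing the class (`from someinterface import SomeInterface`) makes both errors disappear, with no custom metaclass needed.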

OK, there must be something I'm missing, because I created a similar file layout to yours (I think) and it works on my machine. I appreciate that you kept your question short and simple, but maybe it's missing some little detail that alters... something? Dunno... (maybe SomeInterface has an abc.ABCMeta metaclass?) If you update your question, please let me know through a comment to this answer, and I'll try to update my answer.
Here it goes:
File stack29A.py (equivalent to your model.py):
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker, scoped_session

DSN = "mysql://root:foobar@localhost/so_tests?charset=utf8"
engine = create_engine(DSN)
Session = scoped_session(sessionmaker(bind=engine))
session = Session()
DeclarativeBase = declarative_base()

class Bar(DeclarativeBase):
    __tablename__ = 'Bar'
    _id = Column('_id', Integer, primary_key=True)
    email = Column('email', String(50))
File stack29B.py (equivalent to your someinterface.py):
class SomeInterface(object):
    def some_method(self):
        print "hellou"
File stack29C.py (equivalent to your modelaugmented.py):
from stack29A import Bar
from stack29B import SomeInterface

class BarAugmented(Bar, SomeInterface):
    pass
File stack29D.py (like a kind of main.py: table creator and sample):
from stack29C import BarAugmented
from stack29A import session, engine, DeclarativeBase

if __name__ == "__main__":
    DeclarativeBase.metadata.create_all(engine)
    b1 = BarAugmented()
    b1.email = "foo@bar.baz"
    b2 = BarAugmented()
    b2.email = "baz@bar.foo"
    session.add_all([b1, b2])
    session.commit()
    b3 = session.query(BarAugmented)\
                .filter(BarAugmented.email == "foo@bar.baz")\
                .first()
    print "b3.email: %s" % b3.email
    b3.some_method()
If I run the "main" file (stack29D.py) everything works as expected:
(venv_SO)borrajax@borrajax:~/Documents/Tests$ python ./stack29D.py
b3.email: foo@bar.baz
hellou

Related

How to resolve mypy error with sqlalchemy enum?

mypy reports an error in the following code:
import enum
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Enum

Base = declarative_base()

class MyEnum(enum.Enum):
    A = 1
    B = 2

class MyTable(Base):
    __tablename__ = 'my_table'
    col = Column(Enum(MyEnum), nullable=False)

c = MyTable(col=MyEnum.A)
Following is the error:
a.py:16: error: Incompatible type for "col" of "MyTable" (got "MyEnum", expected "str")
How do I make this error go away without adding a "type: ignore" ? I could also replace MyEnum.A with MyEnum.A.name to make the error go away. But this doesn't look clean and is also not suggested in sqlalchemy documentation.
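As an aside on the `MyEnum.A.name` workaround mentioned above: SQLAlchemy's `Enum` type persists member *names* by default, and `.name` is a plain `str`, which is why it satisfies the `str` type mypy inferred for the column. A stdlib-only sketch:

```python
import enum

# .name is a str (what SQLAlchemy's Enum stores by default);
# .value is the underlying payload.
class MyEnum(enum.Enum):
    A = 1
    B = 2

print(MyEnum.A.name)   # A
print(MyEnum.A.value)  # 1
```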
I don't know exactly what makes this error go away, but after some refactoring and mypy configuration on my setup, the error disappeared.
I installed sqlalchemy-stubs
pip install sqlalchemy-stubs
and created a setup.cfg file:
[mypy]
files = **/*.py
plugins =
    sqlalchemy.ext.mypy.plugin
import enum
from sqlalchemy import Column, Enum
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyEnum(enum.Enum):
    """My enum."""

    ONE = 1
    TWO = 2

class MyTable(Base):
    """My table."""

    __tablename__ = 'my_table'
    col = Column(Enum(MyEnum), nullable=False)

table = MyTable(col=MyEnum.ONE)
Since sqlalchemy does not use type annotations, you have to introduce them yourself in your script. The dynamically created Base class is of type DeclarativeMeta. If you type annotate the variable Base, mypy does not show the error any more.
from sqlalchemy.orm.decl_api import DeclarativeMeta
Base: DeclarativeMeta = declarative_base()
Now the Base variable is properly type annotated. I think the DeclarativeMeta class is not meant to be exposed in the API, so I'm not sure how sustainable this solution will be.
This is, however, not surprising, and one may ask what the purpose of enforcing static type checking in a dynamically typed language is. But that's for another day.

How to fix my approach to use the same models in vanilla SQLAlchemy and Flask-SQLAlchemy?

I came across several approaches on how to use vanilla SQLAlchemy models in Flask-SQLAlchemy.
It works like a charm to use models that inherit from Base with Flask-SQLAlchemy.
But I really like that convenience stuff...
Job.query.all() # Does not work
db.session.query(Job).all() # Works
So I started to work on this and put together some code, but I am stuck and need some help to get this nice and clean.
The following block is a general definition that does not inherit from either.
It is imported and supposed to be used from Flask-SQLAlchemy and vanilla SQLAlchemy at some point.
class VanillaMachine():
    __tablename__ = 'machine'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    status = Column(Integer)
And there is a factory that takes either db.Model or Base and returns Machine with the correct parent:
class MachineFactory:
    def __init__(self, *args, **kwargs):
        pass

    def __new__(cls, *args, **kwargs):
        return type('Machine', (object, VanillaMachine, args[0]),
                    VanillaMachine.__dict__.copy())
I am quite sure that there's something off with that code, but I am not sure where.
If I use it like
db = SQLAlchemy()

from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()

Machine1 = MachineFactory(db.Model)
Machine2 = MachineFactory(Base)
there is an error message:
sqlalchemy.exc.ArgumentError: Column object 'id' already assigned to Table 'machine'
Can help me to get this straight in a nice, reliable way?
I know that you could just use a function, pass the parent as an argument into VanillaMachine, and use some if statement, but that would be too straightforward, right? :)
Edit:
Other approaches I came across are
using the Flask context to use Flask-SQLAlchemy models
with app.app_context():
    pass
or
app.app_context().push()
But this is too focused on Flask for me and does not allow me to clearly separate the models, make them independent, and adjust them to the context.
supplying an alternative Base class to db = SQLAlchemy(app, model_class=Base), see here. This might work for me, but I did not evaluate this so far.
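A stdlib-only sketch of why the factory above fails with "Column object 'id' already assigned to Table 'machine'": copying a class `__dict__` copies *references*, so both generated classes end up sharing the very same column objects, and SQLAlchemy refuses to attach one Column to two tables. `FakeColumn` below is a hypothetical stand-in for `sqlalchemy.Column`.

```python
# FakeColumn stands in for sqlalchemy.Column to show the sharing problem.
class FakeColumn:
    pass

class VanillaMachine:
    id = FakeColumn()

# Copy the class namespace twice, as the factory does; the copies
# still point at the same FakeColumn instance.
attrs = {k: v for k, v in vars(VanillaMachine).items()
         if not k.startswith('__')}
Machine1 = type('Machine', (object,), dict(attrs))
Machine2 = type('Machine', (object,), dict(attrs))

print(Machine1.id is Machine2.id)  # True: one column, two classes
```

The mixin approach in the accepted answer sidesteps this because declarative creates fresh `Column` objects per mapped subclass.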
I found a good solution inspired by a Factory pattern and
Declarative Mixins as mentioned in the SQLAlchemy docs.
For complex multi-level inheritance scenarios a different approach is needed, using @declared_attr.cascading.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy import Column, Integer, String
from sqlalchemy import MetaData
from sqlalchemy.ext.declarative import declarative_base
from flask_sqlalchemy import SQLAlchemy

SQLALCHEMY_DATABASE_URI = 'sqlite:///' + '/tmp/test_app.db'
engine = create_engine(SQLALCHEMY_DATABASE_URI, echo=True)

# for vanilla
Base = declarative_base()
# for Flask (import from app once initialized)
db = SQLAlchemy()

class MachineMixin:
    __tablename__ = 'machine'
    id = Column(Integer, primary_key=True)
    name = Column(String(100))
    status = Column(Integer)

class ModelFactory:
    @staticmethod
    def create(which_model, which_parent):
        if which_parent == 'flask_sqlalchemy':
            parent = db.Model
        elif which_parent == 'pure_sqlalchemy':
            parent = Base
        # now use type() to inherit, fill __dict__ and assign a name
        obj = type(which_model.__name__ + '_' + which_parent,
                   (which_model, parent),
                   {})
        return obj

test_scenario = 'pure_sqlalchemy'  # or 'flask_sqlalchemy'

Machine = ModelFactory.create(MachineMixin, test_scenario)

if test_scenario == 'flask_sqlalchemy':
    db.metadata.drop_all(bind=engine)
    db.metadata.create_all(bind=engine)
elif test_scenario == 'pure_sqlalchemy':
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)

Session = sessionmaker(bind=engine)
session = Session()
session.add(Machine(name='Bob', status=1))
session.commit()

How to create custom classes that inherit from sqlalchemy models in external library

Rather than creating mixin classes that models inherit from, I have a use case that requires me to configure classes the other way around. The classes that would normally be mixin classes need to be the classes that inherit from the models, as well as the classes that model objects are created from. This is because the models and the mapper configurations are in an external library, separate from the main repository.
I need to pass in the host for the engine from the main repository to the models library before any of the models are loaded, so they can load with the declarative base already configured. After the engine information is passed in, the session, Base class, and everything else is created within a sort of base class that the models inherit from. Here is a simplified example:
class SQLAlchemyBase(object):
    metadata = None
    Session = None
    Base = object
    sessionfactory = sessionmaker()

    def initialize(self, host):
        engine = create_engine(host)
        self.metadata = MetaData(bind=engine)
        self.Session = scoped_session(self.sessionfactory)
        self.Base = declarative_base(metadata=self.metadata)

models = SQLAlchemyBase()
(The models inherit from models.Base)
So SQLAlchemyBase will be imported into the main repository, the initialize method will be called, passing in the host for the engine, and the models can then be imported. The main repository has its own classes with the same names as the models; they have the additional methods that a normal mixin class would have, to extend functionality.
However, I am unable to create model objects using the classes in the main repository, because I can't get the mappers to play nice with this unusual inheritance that extends from the external models library. Additionally, in the models library there are models that have multiple levels of inherited polymorphic relationships. Here is an example that is similar to one of the more basic inherited polymorphic relationships:
Models Library
class Foo(models.Base):
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)
    foo_bar_id = Column(Integer, ForeignKey("foo_bar.id"))
    foo_bar = relationship(FooBar, backref=backref("foos"))
    __mapper_args__ = {"polymorphic_on": type}

class Bar(Foo):
    __mapper_args__ = {"polymorphic_identity": "bar"}

class FooBar(models.Base):
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)
Main Repository
from separate_library.models import models, Foo as BaseFoo, Bar as BaseBar, FooBar as BaseFooBar

class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass
The original error I was getting was something like this:
InvalidRequestError: One or more mappers failed to initialize - can't proceed with initialization of other mappers.
Original exception was: Multiple classes found for path Foo in the registry of this declarative base. Please use a fully module-qualified path.
So I tried putting the full path in the relationships. Then it started giving me an error like this:
FlushError: Attempting to flush an item of type as a member of collection FooBar.foos. Expected an object of type or a polymorphic subclass of this type. If is a subclass of , configure mapper Mapper|Foo|foo to load this subtype polymorphically, or set enable_typechecks=False to allow any subtype to be accepted for flush.
Essentially, the main problem is getting the classes in the main module to point to and act like the model classes. For example, when I try to create relationships, it says it expected an object of type separate_library.models.Foo instead of main_module.models.Foo. Additionally, in the polymorphic relationships, I can't get the polymorphic_identity to populate for the polymorphic_on column. For example, Bar in the main repository will have the type column empty when the object is initially created.
One idea I tried was to add a metaclass to the declarative base in the models library and modify the mappers in the __init__ method during their initialization. I made progress this way, but haven't gotten it to work completely.
Sorry for the complex explanation, but this is a complex problem. I am not able to change anything about the models or the use case, unfortunately. I have to work within these constraints. If anyone can offer ideas on how to configure the mappers for the classes in the main repository to act like the models in the model library, I would be very grateful.
There are three problems here:
1. When you write foo_bar = relationship(FooBar, backref=backref("foos")), the FooBar needs to refer to the subclass FooBar, not BaseFooBar.
2. Bar needs to inherit from Foo for the inheritance mechanism to work; it cannot inherit from BaseFoo.
3. Your base classes should not have mappers attached to them; otherwise the inheritance mechanism gets out of whack.
The solutions to these problems, in order:
1. Use a string to refer to the class name. This confines the consumer to naming their classes a certain way. Let's accept this restriction for now.
2. We can use a metaclass to dynamically change the base class. The metaclass needs to derive from the metaclass of Base, because SQLAlchemy's declarative extension makes liberal use of metaclasses. We'll see that the metaclass approach can also solve problem 1 in a flexible way.
3. Use __abstract__ = True.
Simplest possible example:
from sqlalchemy import *
from sqlalchemy.ext.declarative import declarative_base, declared_attr, DeclarativeMeta

class BaseMeta(DeclarativeMeta):
    def __new__(cls, name, bases, attrs):
        if not attrs.get("__abstract__"):
            if len(bases) != 1:
                # you'll need to handle multiple inheritance if you
                # have that as well
                raise NotImplementedError()
            base, = bases
            extra_bases = tuple(b._impl for b in base.__bases__
                                if hasattr(b, "_impl"))
            bases += extra_bases
            self = super(BaseMeta, cls).__new__(cls, name, bases, attrs)
            if getattr(base, "__abstract__", False):
                base._impl = self
            return self
        else:
            return super(BaseMeta, cls).__new__(cls, name, bases, attrs)

Base = declarative_base(metaclass=BaseMeta)

class BaseFoo(Base):
    __abstract__ = True
    __tablename__ = "foo"
    id = Column(Integer, primary_key=True)
    type = Column(String)

    @declared_attr
    def foo_bar_id(cls):
        return Column(Integer, ForeignKey("foo_bar.id"))

    @declared_attr
    def foo_bar(cls):
        return relationship(lambda: BaseFooBar._impl, backref=backref("foos"))

    __mapper_args__ = {"polymorphic_on": type}

class BaseBar(BaseFoo):
    __abstract__ = True
    __mapper_args__ = {"polymorphic_identity": "bar"}

class BaseFooBar(Base):
    __abstract__ = True
    __tablename__ = "foo_bar"
    id = Column(Integer, primary_key=True)

class Foo(BaseFoo):
    @classmethod
    def custom_create_method(cls, **kw):
        foo_obj = cls(**kw)
        models.session.add(foo_obj)
        models.session.flush()

class Bar(BaseBar):
    pass

class FooBar(BaseFooBar):
    pass

print(Bar.__bases__)  # (<class '__main__.BaseBar'>, <class '__main__.Foo'>)
print(Bar.__bases__) # (<class '__main__.BaseBar'>, <class '__main__.Foo'>)
The basic idea of the metaclass is to inject the class Foo into the bases of Bar, based on the fact that BaseBar inherits from BaseFoo, and the fact that Foo implements BaseFoo (by inheriting from it).
You can add more complicated stuff on top, such as multiple-inheritance support or graceful error handling (e.g. warning users that they're missing a subclass for one of your base classes, or that they've provided multiple subclasses for the same base class).
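The base-injection idea can be reduced to a SQLAlchemy-free, Python 3 sketch (class names are illustrative): a metaclass records each concrete subclass of an abstract base in `_impl`, then splices the recorded classes into the bases of later concrete subclasses.

```python
# When a concrete class subclasses an abstract base, remember it in
# _impl; when a later concrete class subclasses a deeper abstract base,
# inject the recorded concrete classes as extra bases.
class InjectingMeta(type):
    def __new__(mcls, name, bases, attrs):
        if bases and not attrs.get("__abstract__", False):
            base = bases[0]
            extra = tuple(b._impl for b in base.__bases__
                          if hasattr(b, "_impl"))
            bases = bases + extra
            cls = super().__new__(mcls, name, bases, attrs)
            if getattr(base, "__abstract__", False):
                base._impl = cls
            return cls
        return super().__new__(mcls, name, bases, attrs)

class BaseFoo(metaclass=InjectingMeta):
    __abstract__ = True

class BaseBar(BaseFoo):
    __abstract__ = True

class Foo(BaseFoo):   # recorded as BaseFoo._impl
    pass

class Bar(BaseBar):   # Foo is injected as an extra base
    pass

print(Bar.__bases__)  # (<class '__main__.BaseBar'>, <class '__main__.Foo'>)
```

In the real answer the metaclass additionally derives from DeclarativeMeta so that SQLAlchemy's own class instrumentation still runs.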

sqlalchemy.exc.InvalidRequestError: SQL expression, column, or mapped entity expected

I am running into this sqlachemy error that I haven't been able to understand:
sqlalchemy.exc.InvalidRequestError: SQL expression, column, or mapped entity expected - got '<class '__main__.JobRecord'>'
What does this error mean? What are possible causes?
This is the method that triggers the error:
@classmethod
def find_job_record_from_pk(cls, pk):
    '''
    return the job record with the given pk
    '''
    job_record = MlcDb.get_session().query(cls).filter(cls.pk == pk).first()
    return job_record
Mapping:
@classmethod
def define_mapping(cls):
    '''
    SQLAlchemy mapping definition
    '''
    cls.mapper = mapper(cls, cls.table,
        polymorphic_on=cls.table.c.item_type,
        properties={
            'item_type': synonym('_JobRecord__item_type', map_column=True),
            'version': synonym('_JobRecord__version', map_column=True),
            'state': synonym('_JobRecord__state', map_column=True),
            'date_created': synonym('_JobRecord__date_created', map_column=True)
        }
    )
In my case the error occurred because I forgot to extend "Base" in the class definition.
Based on the given error, it's probable that the class was declared as follows:
class JobRecord:
    pro1 = Column()
instead of:
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()

class JobRecord(Base):  # here is the missing extend
    pro1 = Column()
The message says that the class is not recognized as a mapped entity.
According to the documentation
Classes mapped using the Declarative system are defined in terms of a base class which maintains a catalog of classes and tables relative to that base - this is known as the declarative base class
And it's important to notice that:
Our application will usually have just one instance of this base in a commonly imported module. We create the base class using the declarative_base() function
Of course, all of this applies to my particular case.
I hope this is useful.
more info:
https://docs.sqlalchemy.org/en/13/orm/tutorial.html
Try filter_by(pk=pk) instead.

Dynamic Table Creation and ORM mapping in SqlAlchemy

I'm fairly new to using relational databases, so I prefer using a good ORM to simplify things. I spent time evaluating different Python ORMs and I think SQLAlchemy is what I need. However, I've come to a mental dead end.
I need to create a new table to go along with each instance of a player I create in my app's player table. I think I know how to create the table by changing the name of the table through the metadata then calling the create function, but I have no clue on how to map it to a new dynamic class.
Can someone give me some tips to help me get past my brain freeze? Is this even possible?
Note: I'm open to other ORMs in Python if what I'm asking is easier to implement. Just show me how :-)
We are spoiled by SQLAlchemy. What follows below is taken directly from the tutorial, and is really easy to set up and get working. (Because it is done so often, the documentation moved to fully declarative style in Aug 2011.)
Setup your environment (I'm using the SQLite in-memory db to test):
>>> from sqlalchemy import create_engine
>>> engine = create_engine('sqlite:///:memory:', echo=True)
>>> from sqlalchemy import Table, Column, Integer, String, MetaData
>>> metadata = MetaData()
Define your table:
>>> players_table = Table('players', metadata,
...     Column('id', Integer, primary_key=True),
...     Column('name', String),
...     Column('score', Integer)
... )
>>> metadata.create_all(engine) # create the table
If you have logging turned on, you'll see the SQL that SQLAlchemy creates for you.
Define your class:
>>> class Player(object):
...     def __init__(self, name, score):
...         self.name = name
...         self.score = score
...
...     def __repr__(self):
...         return "<Player('%s','%s')>" % (self.name, self.score)
Map the class to your table:
>>> from sqlalchemy.orm import mapper
>>> mapper(Player, players_table)
<Mapper at 0x...; Player>
Create a player:
>>> a_player = Player('monty', 0)
>>> a_player.name
'monty'
>>> a_player.score
0
That's it, you now have your players table.
It's a very old question. Anyway, if you prefer the ORM, it's quite easy to generate a table class with type:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String

Base = declarative_base()

Test = type('Test', (Base,), {
    '__tablename__': 'test',
    'test_id': Column(Integer, primary_key=True, autoincrement=True),
    'fldA': Column(String),
    # ... other columns
})

Base.metadata.create_all(engine)

# query with a session created via sqlalchemy's sessionmaker
session.query(Test).all()
With a class factory, it's easy to assign names to classes and database tables.
If you are looking to create dynamic classes and tables, you can use the following technique, based on this tutorial I found (http://sparrigan.github.io/sql/sqla/2016/01/03/dynamic-tables.html); I modified how he did it a bit.
from sqlalchemy import create_engine
from sqlalchemy import Column, Integer, Float, DateTime, String, MetaData
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///test.db', echo=True)
metadata = MetaData()
Session = sessionmaker(bind=engine)
session = Session()  # create a Session
Base = declarative_base()
First include all the needed dependencies and create your session and Base.
The key to creating it dynamically is this here:
attr_dict = {'__tablename__': 'default',
             'id': Column(Integer, primary_key=True, autoincrement=True)}
You could create a table from just this by taking advantage of the 'type' function in Python.
myClass = type('ClassnameHere', (Base,), attr_dict)
Note that we are passing in attr_dict; this gives the required table name and column information to our class, but the difference is that we define the class name through a string! This means you could, for example, loop over an array of strings to start creating tables dynamically!
Next all you have to do is simply call
Base.metadata.create_all(engine)
Because the dynamic class we created inherits from Base the command will simply create the tables!
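The loop idea mentioned above can be sketched with plain classes (names are hypothetical; with declarative models the dict would carry `__tablename__` and `Column` objects, and `FakeBase` would be the declarative `Base`):

```python
# Build one class per name in a list using three-argument type().
class FakeBase:
    pass

classes = {}
for name in ['Players', 'Scores', 'Matches']:
    attr_dict = {'__tablename__': name.lower()}
    classes[name] = type(name, (FakeBase,), attr_dict)

print(classes['Players'].__name__)      # Players
print(classes['Scores'].__tablename__)  # scores
```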
You can now add a row to this table, for example, like this:
SomeRow = myClass(id='2')
session.add(SomeRow)
session.commit()
This can go even further if you don't know the column names either. Just refer to the article to learn how to do that.
You would essentially do something like this though:
firstColName = "Ill_decide_later"
secondColName = "Seriously_quit_bugging_me"
new_row_vals = myClass(**{firstColName: 14, secondColName: 33})
The ** operator takes the dictionary and unpacks it into keyword arguments, so firstColName and secondColName are passed as if by assignment; it is essentially the same thing as this:
new_row_vals = myClass(firstColName=14, secondColName=33)
The advantage of this technique is now you can dynamically add to the table without even having to define the column names!
These column names could be stored in a string array for example or whatever you wanted and you just take it from there.
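A stdlib-only sketch of the ** unpacking described above (the column names and the FakeRow class are hypothetical stand-ins for a mapped class):

```python
# FakeRow accepts arbitrary keyword arguments, like a declarative model.
class FakeRow:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

firstColName = "Ill_decide_later"
secondColName = "Seriously_quit_bugging_me"

# ** unpacks the dict into keyword arguments determined at runtime.
row = FakeRow(**{firstColName: 14, secondColName: 33})
print(row.Ill_decide_later)         # 14
print(getattr(row, secondColName))  # 33
```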
Maybe look at SQLSoup, which is a layer over SQLAlchemy.
You can also create the tables using plain SQL and, to map them dynamically, use these libraries if they don't already have a create-table function.
Or alternatively create a dynamic class and map it:
tableClass = type(str(table.fullname), (BaseTable.BaseTable,), {})
mapper(tableClass, table)
where BaseTable can be any Python class you want all your table classes to inherit from; e.g. such a base class may have some utility or common methods, such as basic CRUD methods:
class BaseTable(object): pass
Otherwise you need not pass any bases to type(...).
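A small sketch of that BaseTable idea: a plain base class carrying shared helper methods that every generated table class inherits (`to_dict` is a hypothetical example of such a CRUD-style utility).

```python
# A common base class for all generated table classes.
class BaseTable(object):
    def to_dict(self):
        # expose instance attributes as a plain dict
        return dict(vars(self))

# Dynamically create a table class inheriting the shared helpers.
tableClass = type('some_table', (BaseTable,), {})

row = tableClass()
row.name = 'monty'
print(row.to_dict())  # {'name': 'monty'}
```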
You can use the declarative method for dynamically creating tables in the database:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Table, Column, Integer, String, MetaData, ForeignKey

Base = declarative_base()

class Language(Base):
    __tablename__ = 'languages'
    id = Column(Integer, primary_key=True)
    name = Column(String(20))
    extension = Column(String(20))

    def __init__(self, name, extension):
        self.name = name
        self.extension = extension
I faced the same problem when I was trying to automate simple CRUD tasks using SQLAlchemy.
Here is simple explanation and some code: http://www.devx.com/dbzone/Article/42015
Maybe I didn't quite understand what you want, but this recipe creates identical columns in tables with different __tablename__ values:
class TBase(object):
    """Base class is a 'mixin'.

    Guidelines for declarative mixins are at:
    http://www.sqlalchemy.org/docs/orm/extensions/declarative.html#mixin-classes
    """
    id = Column(Integer, primary_key=True)
    data = Column(String(50))

    def __repr__(self):
        return "%s(data=%r)" % (
            self.__class__.__name__, self.data
        )

class T1Foo(TBase, Base):
    __tablename__ = 't1'

class T2Foo(TBase, Base):
    __tablename__ = 't2'

engine = create_engine('sqlite:///foo.db', echo=True)

Base.metadata.create_all(engine)

sess = sessionmaker(engine)()

sess.add_all([T1Foo(data='t1'), T1Foo(data='t2'), T2Foo(data='t3'),
              T1Foo(data='t4')])

print sess.query(T1Foo).all()
print sess.query(T2Foo).all()

sess.commit()
This is based on an example from the SQLAlchemy docs.
