I want to automatically run a class method defined in a base class on any derived class during the creation of the class. For instance:
class Base(object):
    @classmethod
    def runme(cls):
        print "I am being run"
    def __metaclass__(cls, parents, attributes):
        clsObj = type(cls, parents, attributes)
        clsObj.runme()
        return clsObj

class Derived(Base):
    pass
What happens here is that when Base is created, runme() fires. But nothing happens when Derived is created.
The question is: how can I make runme() also fire when Derived is created?
This is what I have thought so far: If I explicitly set Derived's metaclass to Base's, it will work. But I don't want that to happen. I basically want Derived to use the Base's metaclass without me having to explicitly set it so.
See this answer. Basically, when calling type(cls,parents,attributes), you are creating a class without passing in the information about what that class's metaclass is. Thus, the class that is returned doesn't have the metaclass you want it to; instead it has metaclass type. (Ironically, by defining __metaclass__ to do as it does, you are explicitly causing your class to not have that metaclass.) Instead of directly calling type, you need to call type.__new__(meta, cls, parents, attrs), where meta is the metaclass.
However, you can't achieve this when you define __metaclass__ inline. From inside your __metaclass__ method, you have no way to refer to that method, because it's a method of a class that hasn't been defined yet. You want to do something like
def __metaclass__(cls, bases, attrs):
    return type.__new__(metaclassGoesHere, cls, bases, attrs)
. . . but there's nothing you can put in for metaclassGoesHere to make it do the right thing, because what you're trying to refer to is the method inside which you're trying to refer to it.
So just define your metaclass externally.
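For instance, a minimal Python 3 sketch of an externally defined metaclass (the `created` list is added here only to make the effect visible; the original code uses print):

```python
created = []  # records which classes triggered runme (illustrative only)

class RunOnCreate(type):
    def __new__(meta, name, parents, attributes):
        clsObj = super().__new__(meta, name, parents, attributes)
        clsObj.runme()  # fires for Base and for every subclass
        return clsObj

class Base(metaclass=RunOnCreate):
    @classmethod
    def runme(cls):
        created.append(cls.__name__)

class Derived(Base):
    pass
```

Because Derived inherits Base's metaclass automatically, runme fires for both class definitions.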
Rename your runme() to __init__(self) and DO NOT override the __init__() method in the subclass; it will then get called every time you make an instance of Derived:
class Base(object):
    def __init__(self):
        print "I am being run"

class Derived(Base):
    pass

dummy_instance = Derived()
Copy and paste that, and it will print I am being run.
Try this:
class MyMetaclass(type):
    def __new__(cls, name, bases, dct):
        new_cls = super(MyMetaclass, cls).__new__(cls, name, bases, dct)
        new_cls.runme()  # call runme on the newly created class
        return new_cls
If you want to do something when the class is created, you have to do it in the __init__ method of the __metaclass__.
class foo(object):
    @classmethod
    def method(cls):
        print 'I am being run by', cls.__name__

    class __metaclass__(type):
        def __init__(self, *args, **kwargs):
            type.__init__(self, *args, **kwargs)
            self.method()

class bar(foo): pass
Which prints:
I am being run by foo
I am being run by bar
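For completeness, on Python 3.6+ the same effect is usually achieved without any metaclass via __init_subclass__ (a sketch, not from the original answers; note that unlike the metaclass approach it fires for subclasses only, not for the base class itself):

```python
ran = []  # records subclass creation (illustrative only)

class Base:
    @classmethod
    def runme(cls):
        ran.append(cls.__name__)

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.runme()  # runs for every subclass, but not for Base itself

class Derived(Base):
    pass
```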
Simple Example
The goal is to create an abstract enum class through a metaclass deriving from both abc.ABCMeta and enum.EnumMeta. For example:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    pass

class A(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        pass

class B(A, enum.IntEnum, metaclass=ABCEnumMeta):
    X = 1

class C(A):
    pass
Now, on Python 3.7, this code will be interpreted without error (on 3.6.x and presumably below, it will not). In fact, everything looks great: our MRO shows B derives from both A and IntEnum.
>>> B.__mro__
(<enum 'B'>, __main__.A, abc.ABC, <enum 'IntEnum'>, int, <enum 'Enum'>, object)
Abstract Enum is not Abstract
However, even though B.foo has not been defined, we can still instantiate B without any issue, and call foo().
>>> B.X
<B.X: 1>
>>> B(1)
<B.X: 1>
>>> B(1).foo()
This seems rather weird, since any other class whose metaclass derives from ABCMeta cannot be instantiated while it has abstract methods, even if I use a custom metaclass.
>>> class NewMeta(type):
...     pass
...
>>> class AbcNewMeta(abc.ABCMeta, NewMeta):
...     pass
...
>>> class D(metaclass=NewMeta):
...     pass
...
>>> class E(A, D, metaclass=AbcNewMeta):
...     pass
...
>>> E()
TypeError: Can't instantiate abstract class E with abstract methods foo
Question
Why does a class using a metaclass derived from EnumMeta and ABCMeta effectively ignore ABCMeta, while any other class using a metaclass derived from ABCMeta uses it? This is true even if I define a custom __new__.
class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __new__(cls, name, bases, dct):
        # Commented-out lines reflect other variants that don't work:
        #return abc.ABCMeta.__new__(cls, name, bases, dct)
        #return enum.EnumMeta.__new__(cls, name, bases, dct)
        return super().__new__(cls, name, bases, dct)
I'm rather confused, since this seems to fly in the face of what a metaclass is: the metaclass should define how the class is defined and behaves, and in this case, defining a class using a metaclass that is both abstract and an enumeration creates a class that silently ignores the abstract component. Is this a bug, is this intended, or is there something greater I am not understanding?
As stated in @chepner's answer, what is going on is that the Enum metaclass overrides the normal metaclass's __call__ method, so an Enum class is never instantiated through the normal path, and thus ABCMeta's abstractmethod check never triggers.
However, on class creation the metaclass's __new__ runs normally, and the abstract-class machinery that marks the class as abstract does create the __abstractmethods__ attribute on the created class.
So all you have to do for this to work as you intend is to further customize your metaclass to perform the abstract check in __call__:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __call__(cls, *args, **kw):
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(f"Can't instantiate abstract class {cls.__name__} "
                            f"with abstract methods {set(cls.__abstractmethods__)}")
        return super().__call__(*args, **kw)
This makes the B(1) expression fail with the same error as abstract-class instantiation.
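Put together, a runnable sketch of that check (Python 3.7+; the error message text here is illustrative, mimicking CPython's):

```python
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __call__(cls, *args, **kw):
        # Reject lookup/instantiation while abstract methods remain.
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(
                "Can't instantiate abstract class %s with abstract methods %s"
                % (cls.__name__, set(cls.__abstractmethods__)))
        return super().__call__(*args, **kw)

class A(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        pass

class B(A, enum.IntEnum, metaclass=ABCEnumMeta):
    X = 1

try:
    B(1)
except TypeError as exc:
    print(exc)  # the member lookup now fails because foo is still abstract
```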
Note, however, that an Enum class can't be further subclassed anyway, and simply creating it without the missing abstract methods may already violate what you want to check. That is: in your example above, class B can be declared and B.X will work, even with the missing foo method. If you want to prevent that, put the same check in the metaclass's __new__:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __new__(mcls, *args, **kw):
        cls = super().__new__(mcls, *args, **kw)
        if issubclass(cls, enum.Enum) and getattr(cls, "__abstractmethods__", None):
            raise TypeError("...")
        return cls

    def __call__(cls, *args, **kw):
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(f"Can't instantiate abstract class {cls.__name__} "
                            f"with abstract methods {set(cls.__abstractmethods__)}")
        return super().__call__(*args, **kw)
(Unfortunately, the ABC abstract-method check in CPython seems to be performed in native code, outside the ABCMeta.__call__ method; otherwise, instead of mimicking the error, we could just call ABCMeta.__call__ explicitly, overriding super's behavior, rather than hardcoding the TypeError there.)
Calling an enumerated type doesn't create a new instance. Members of the enumerated type are created immediately at class-creation time by the meta class. The __new__ method simply performs lookup, which means ABCMeta is never invoked to prevent instantiation.
B(1).foo() works because, once you have an instance, it doesn't matter if the method was marked as abstract. It's still a real method, and can be called as such.
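This lookup behavior is easy to see with a plain Enum (a minimal illustration, not from the original question):

```python
import enum

class Color(enum.Enum):
    RED = 1

# Members are created while the class body is executed; calling the
# class afterwards only looks an existing member up by value.
assert Color(1) is Color.RED
assert Color["RED"] is Color.RED
```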
I have a base class extending unittest.TestCase, and I want to patch that base class, such that classes extending this base class will have the patches applied as well.
Code Example:
@patch("some.core.function", mocked_method)
class BaseTest(unittest.TestCase):
    # methods
    pass

class TestFunctions(BaseTest):
    # methods
    pass
Patching the TestFunctions class directly works, but patching the BaseTest class does not change the functionality of some.core.function in TestFunctions.
You probably want a metaclass here: a metaclass simply defines how a class is created.
By default, all classes are created using Python's built-in class type:
>>> class Foo:
...     pass
...
>>> type(Foo)
<class 'type'>
>>> isinstance(Foo, type)
True
So classes are actually instances of type.
Now, we can subclass type to create a custom metaclass (a class that creates classes):
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""
We need to control the creation of our classes, so we want to override type.__new__ here and apply the patch decorator to every new class:
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""
    def __new__(meta, name, bases, attrs):
        cls = type.__new__(meta, name, bases, attrs)
        cls = patch("some.core.function", mocked_method)(cls)
        return cls
And now you simply set the metaclass using __metaclass__ = PatchMeta:
class BaseTest(unittest.TestCase):
    __metaclass__ = PatchMeta
    # methods
The issue is this line:
cls = patch("some.core.function", mocked_method)(cls)
So currently we always decorate with arguments "some.core.function" and mocked_method.
Instead you could make it so that it uses the class's attributes, like so:
cls = patch(*cls.patch_args)(cls)
And then add patch_args to your classes:
class BaseTest(unittest.TestCase):
    __metaclass__ = PatchMeta
    patch_args = ("some.core.function", mocked_method)
Edit: As @mgilson mentioned in the comments, patch() modifies the class's methods in place instead of returning a new class. Because of this, we can replace the __new__ with this __init__:
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""
    def __init__(cls, *args, **kwargs):
        super(PatchMeta, cls).__init__(*args, **kwargs)
        patch(*cls.patch_args)(cls)
Which is arguably cleaner.
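A Python 3 sketch of the same metaclass (using the metaclass= keyword, since Python 3 silently ignores the __metaclass__ attribute; os.getcwd stands in for some.core.function, and remember that patch() as a class decorator only wraps methods whose names start with "test"):

```python
import os
from unittest import mock

class PatchMeta(type):
    """Apply patch_args to every class created with this metaclass."""
    def __init__(cls, *args, **kwargs):
        super().__init__(*args, **kwargs)
        if getattr(cls, "patch_args", None):
            mock.patch(*cls.patch_args)(cls)

class BaseTest(metaclass=PatchMeta):
    # Illustrative target: patch os.getcwd with a stand-in callable.
    patch_args = ("os.getcwd", lambda: "/patched")

    def test_cwd(self):
        return os.getcwd()

class TestFunctions(BaseTest):
    pass
```

Both BaseTest().test_cwd() and TestFunctions().test_cwd() now see the patched function, because the subclass is also created through PatchMeta.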
Generally, I prefer to do this sort of thing in setUp. You can make sure that the patch gets cleaned up after the test is completed by making use of the tearDown method (or alternatively, registering the patch's stop method with addCleanup):
class BaseTest(unittest.TestCase):
    def setUp(self):
        super(BaseTest, self).setUp()
        my_patch = patch("some.core.function", mocked_method)
        my_patch.start()
        self.addCleanup(my_patch.stop)

class TestFunctions(BaseTest):
    # methods
    pass
Provided that you're disciplined enough to always call super in your overridden setUp methods, it should work just fine.
I have a setup like this:
class Meta(type):
    @property
    def test(self):
        return "Meta"

class Test(object):
    __metaclass__ = Meta
    test = "Test"

class TestSub(object):
    test = "TestSub"

print(Test.test, TestSub.test)
Which yields the following output:
('Meta', 'TestSub')
What I would have expected would be:
('Test', 'TestSub')
I know why that happens: test is assigned on Test before the metaclass Meta is executed. But I have no idea about how to implement a clean way of changing this behavior. I know I could hack around in __init__ and __new__ of the Meta class, but that seems dirty because I'd have to modify it for each new property. So is there a clean way (like writing a new decorator) to get this?
I also don't like the idea of creating an intermediate class just to work around this, but would accept it as a last resort.
In fact, your Test class's test attribute is not overwritten. It's still there:
>>> Test.__dict__['test']
'Test'
However, doing Test.test doesn't access it, because, according to the documentation:
If an instance’s dictionary has an entry with the same name as a data descriptor, the data descriptor takes precedence.
property creates a data descriptor. So by using a property in the metaclass, you block access to the ordinary class variable on the class.
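The precedence is easy to verify (Python 3 syntax for brevity):

```python
class Meta(type):
    @property
    def test(cls):
        return "Meta"

class Test(metaclass=Meta):
    test = "Test"

# The metaclass property (a data descriptor) wins on class-level access...
assert Test.test == "Meta"
# ...but the class attribute still exists, and instances see it normally.
assert Test.__dict__["test"] == "Test"
assert Test().test == "Test"
```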
How best to solve this is not clear, because it's not clear what you're trying to accomplish with this structure. Given the code you posted, it's not clear why you're using a metaclass at all. If you just want override class attributes on subclasses, you can do it with simple inheritance as described in unutbu's answer.
If that's what you want, don't use a metaclass. Use inheritance:
class Base(object):
    @property
    def test(self):
        return "Meta"

class Test(Base):
    test = "Test"

class TestSub(object):
    test = "TestSub"

print(Test.test, TestSub.test)
yields
('Test', 'TestSub')
I'm not sure if this is what you need:
class Meta(type):
    @property
    def test(self):
        return 'Meta'
    def __new__(cls, name, bases, dct):
        if 'test' not in dct:
            dct['test'] = Meta.test
        return super(Meta, cls).__new__(cls, name, bases, dct)

class Test(object):
    __metaclass__ = Meta
    test = "Test"

class TestSub(object):
    test = "TestSub"

class TestWithoutTest(object):
    __metaclass__ = Meta

print(Test.test, TestSub.test, TestWithoutTest.test)
I'm having a load of confusion between the __metaclass__ property of a class and actual inheritance, and how __new__ is called in either of these scenarios. My issue comes from digging through some model code in the django framework.
Let's say I wanted to append an attribute to a class as it's defined in the child's Meta subclass:
class Parent(type):
    def __new__(cls, name, base, attrs):
        meta = attrs.pop('Meta', None)
        new_class = super(Parent, cls).__new__(cls, name, base, attrs)
        new_class.fun = getattr(meta, 'funtime', None)
        return new_class
I don't understand why the actual __new__ method is called in django's code, but when I try to code something like that it doesn't work.
From what I've experienced, the following does not actually call the __new__ method of the parent:
class Child(Parent):
    class Meta:
        funtime = 'yaaay'

C = Child()
When I try to do this it complains with the TypeError:
TypeError: __new__() takes exactly 4 arguments (1 given)
However the source code I have been looking at appears to work in that way.
I understand that it could be done with a metaclass:
class Child(object):
    __metaclass__ = Parent
But I don't understand why their way works for them and not for me, since the non-__metaclass__ approach would be cleaner for making a distributable module.
Could somebody please point me in the right direction on what I'm missing?
Thanks!
In django, Model is not a metaclass; the metaclass is actually ModelBase. That's why their way works and yours doesn't.
Moreover, recent django uses a helper function, six.with_metaclass, to wrap ModelBase.
If we want to follow django's style, Parent and Child class will look like
def with_metaclass(meta, base=object):
    """Create a base class with a metaclass."""
    return meta("NewBase", (base,), {})

class ParentBase(type):
    def __new__(cls, name, base, attrs):
        meta = attrs.pop('Meta', None)
        new_class = super(ParentBase, cls).__new__(cls, name, base, attrs)
        new_class.fun = getattr(meta, 'funtime', None)
        return new_class

class Parent(with_metaclass(ParentBase)):
    pass

class Child(Parent):
    class Meta:
        funtime = 'yaaay'

c = Child()
>>> c.fun
'yaaay'
Let us focus on Parent. It is almost equivalent to
NewBase = ParentBase("NewBase", (object,), {})

class Parent(NewBase):
    pass
The key is how to understand ParentBase("NewBase", (object,), {}).
Let us recall type().
type(name, bases, dict)
With three arguments, return a new type object. This is essentially a dynamic form of the class statement. The name string is the class name and becomes the __name__ attribute; the bases tuple itemizes the base classes and becomes the __bases__ attribute; and the dict dictionary is the namespace containing definitions for the class body and becomes the __dict__ attribute. For example, the following two statements create identical type objects:

class X(object):
    a = 1

X = type('X', (object,), dict(a=1))
Since ParentBase is a metaclass (a subclass of type), ParentBase("NewBase", (object,), {}) is very similar to type("NewBase", (object,), {}). In this case, the only difference is that the dynamically created class is an instance of ParentBase rather than of type.
In other words, the metaclass of NewBase is ParentBase. Parent is equivalent to
class NewBase(object):
    __metaclass__ = ParentBase

class Parent(NewBase):
    pass
Finally, we got a __metaclass__.
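In Python 3 the helper is unnecessary, since the metaclass is passed as a keyword in the class statement. A sketch of the same Parent/Child example in that style:

```python
class ParentBase(type):
    def __new__(cls, name, bases, attrs):
        # Pull the inner Meta class out of the body and copy its setting.
        meta = attrs.pop('Meta', None)
        new_class = super().__new__(cls, name, bases, attrs)
        new_class.fun = getattr(meta, 'funtime', None)
        return new_class

class Parent(metaclass=ParentBase):
    pass

class Child(Parent):
    class Meta:
        funtime = 'yaaay'

assert Child().fun == 'yaaay'
```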
In a metaclass that extends type, __new__ is used to create a class.
In a class, __new__ is used to create an instance.
A metaclass is a class that creates classes. You're confusing class inheritance with metaclasses.
Your Child class inherits from Parent, and you want to create an instance of Child. However, since Parent is a metaclass, Parent.__new__ shouldn't be used to create an instance of a class.
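The two roles of __new__ can be seen side by side (an illustrative Python 3 sketch):

```python
class Meta(type):
    def __new__(mcls, name, bases, attrs):
        # Runs once, when the *class* statement is executed.
        attrs["made_by"] = "Meta"
        return super().__new__(mcls, name, bases, attrs)

class Widget(metaclass=Meta):
    def __new__(cls):
        # Runs every time an *instance* is created.
        instance = super().__new__(cls)
        instance.kind = "instance"
        return instance

assert Widget.made_by == "Meta"
assert Widget().kind == "instance"
```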
In a recent project I try to do something like this (more complex, but the same result):
class MetaA(type):
    def __new__(cls, name, args, kwargs):
        print kwargs["foo"]
        return type.__new__(cls, name, args, kwargs)

class A(object):
    __metaclass__ = MetaA
    foo = "bar"

class B(A):
    pass
I get this:
bar
Traceback (most recent call last):
File "C:\Users\Thorstein\Desktop\test.py", line 10, in <module>
class B(A):
File "C:\Users\Thorstein\Desktop\test.py", line 3, in __new__
print kwargs["foo"]
KeyError: 'foo'
Are class attributes not inherited? If so, is there any workaround possible in a similar framework to the above?
EDIT:
Might be easier to see what I mean using an actual (simplified) example from the program..
class GameObjectMeta(type):
    def __new__(cls, name, bases, attrs):
        attrs["dark_color"] = darken(*attrs["color"])
        return type.__new__(cls, name, bases, attrs)

class GameObject(object):
    __metaclass__ = GameObjectMeta
    color = (255, 255, 255)  # RGB tuple

class Monster(GameObject):
    pass
Basically, I want to run a function on the base color to make a darker one that's saved on the class (multiple instances of the class will want the same color, but there will be a longer class hierarchy). I hope this makes more sense.
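One way to make the color example work is to use the metaclass __init__ and look the attribute up on the finished class, where normal inheritance applies (a Python 3 sketch; darken is a stand-in helper that simply halves each channel):

```python
def darken(r, g, b):
    # Hypothetical helper: halve each RGB channel.
    return (r // 2, g // 2, b // 2)

class GameObjectMeta(type):
    def __init__(cls, name, bases, attrs):
        super().__init__(name, bases, attrs)
        # cls.color uses normal attribute lookup, so inherited colors work.
        cls.dark_color = darken(*cls.color)

class GameObject(metaclass=GameObjectMeta):
    color = (255, 255, 255)

class Monster(GameObject):
    color = (200, 100, 50)
```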
It's not supposed to inherit those. The metaclass receives the attributes defined on the class it is instantiating, not those of its base classes. The whole point of the metaclass constructor is to get access to what is actually given in the class body.
In fact, the attributes of the base class really aren't "in" the subclass at all. It's not as if they are "copied" to the subclass when you define it, and in fact at creation time the subclass doesn't even "know" what attributes its superclasses might have. Rather, when you access B.foo, Python first tries to find the attribute on B, then if it doesn't find it, looks on its superclasses. This happens dynamically every time you try to read such an attribute. The metaclass mechanism isn't supposed to have access to attributes of the superclass, because those really aren't there on the subclass.
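A quick demonstration of that point (Python 3 syntax):

```python
class Meta(type):
    def __new__(mcls, name, bases, attrs):
        # attrs holds only what this class body defines; inherited
        # names are absent.
        attrs["saw_foo"] = "foo" in attrs
        return super().__new__(mcls, name, bases, attrs)

class A(metaclass=Meta):
    foo = "bar"

class B(A):
    pass

assert A.saw_foo is True
assert B.saw_foo is False   # foo was not in B's class body
assert B.foo == "bar"       # but lookup falls back to A dynamically
```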
A perhaps related problem is that your definition of __new__ is incorrect. The arguments to the metaclass __new__ are cls, name, bases, and attrs. The third argument is the list of base classes, not "args" (whatever you intend that to be). The fourth argument is the list of attributes define in the class body. If you want access to inherited attributes you can get at them via the bases.
Anyway, what are you trying to accomplish with this scheme? There's probably a better way to do it.
If you use the __init__ instead of __new__ then you can inherit the base class attributes:
class MetaA(type):
    def __init__(self, name, bases, attr):
        print self.foo  # don't use attr, because foo only exists for A, not B
        print attr      # proof that the class body does not contain base class attributes
        super(MetaA, self).__init__(name, bases, attr)

class A(object):
    __metaclass__ = MetaA
    foo = 'bar'

class B(A):
    pass
when A is created returns:
bar
{'__module__': '__main__', 'foo': 'bar', '__metaclass__': <class '__main__.MetaA'>}
when B is created returns:
bar
{'__module__': '__main__'}
Note (and this is what @BrenBarn was saying in his answer) that foo is not in attr when B is created. That's why the code in the OP's question raises KeyError. However, self.foo looks for foo in B and then, if it can't find it there, looks in its bases, i.e. A.
The metaclass of both classes can be determined by looking at their class:
>>> print A.__class__
<class '__main__.MetaA'>
>>> print B.__class__
<class '__main__.MetaA'>
The fact that the metaclass __init__ method is called when creating and initializing B, printing self.foo and attr to stdout, also proves that B's metaclass is MetaA.