Auto instantiation of class after import in Python - python

I want to automatically instantiate some classes in Python just after their modules have been imported, so there is no need for a direct instantiation call.
My setup can be stripped down to the following code:
class MetaTest(type):
    _instances = {}

    def __init__(cls, name, bases, attrs):
        super(MetaTest, cls).__init__(name, bases, attrs)
        MetaTest._instances[name] = cls()

class Foo(object):
    __metaclass__ = MetaTest

    def __init__(self):
        super(Foo, self).__init__()

class Bar(Foo):
    def __init__(self):
        super(Bar, self).__init__()
But the instantiation fails in the super() call with:

    super(Foo, self).__init__()
NameError: global name 'Foo' is not defined

Is there any way to do this?

Solution 1: Modify sys.modules
This is a Python 2 version.
Adding the class to its module works, because it solves the problem that the class does not exist yet when you use super():
import sys

class MetaTest(type):
    _instances = {}

    def __init__(cls, name, bases, attrs):
        setattr(sys.modules[cls.__module__], cls.__name__, cls)
        super(MetaTest, cls).__init__(name, bases, attrs)
        MetaTest._instances[name] = cls()

class Foo(object):
    __metaclass__ = MetaTest

    def __init__(self):
        print('init Foo')
        super(Foo, self).__init__()

class Bar(Foo):
    def __init__(self):
        print('init Bar')
        super(Bar, self).__init__()

print(MetaTest._instances)
Output:
init Foo
init Bar
init Foo
{'Foo': <__main__.Foo object at 0x10cef90d0>, 'Bar': <__main__.Bar object at 0x10cef9110>}
Solution 2: Use the future library
The future library offers a super() for Python 2 that works just like the one in Python 3, i.e. without arguments.
This helps:
from builtins import super

class MetaTest(type):
    _instances = {}

    def __init__(cls, name, bases, attrs):
        super(MetaTest, cls).__init__(name, bases, attrs)
        MetaTest._instances[name] = cls()

class Foo(object):
    __metaclass__ = MetaTest

    def __init__(self):
        print('init Foo')
        super().__init__()

class Bar(Foo):
    def __init__(self):
        print('init Bar')
        super().__init__()

print(MetaTest._instances)
Output:
init Foo
init Bar
init Foo
{'Foo': <__main__.Foo object at 0x1066b39d0>, 'Bar': <__main__.Bar object at 0x1066b3b50>}

First note: for this sort of code, it is easier to use Python 3. More about this in the first solution and at the end of the answer.
So, the name of the class itself, Foo, won't, obviously, be available inside the body of class Foo: this name is only bound after the class creation is completely performed.
In Python 3, there is the parameterless version of the super() call, which can direct one to the superclass with no need for an explicit reference to the class itself. It is one of the few exceptions the language makes to its otherwise very predictable mechanisms: it binds data for the super call at compile time. However, even in Python 3 it is not possible to call a method that makes use of super while the class is still being created - i.e. before returning from the metaclass __new__ and __init__ methods. The data needed by the parameterless super is silently bound only after that.
However, in Python 3.6+ there is a mechanism that removes the need for a metaclass altogether: a class method named __init_subclass__ is called on the superclasses after a subclass is created, and it can make normal use of (parameterless) super. This would work with Python 3.6:
class Base:
    _instances = {}

    def __init_subclass__(cls):
        __class__._instances[cls.__name__] = cls()
        ...

class Foo(Base):
    def __init__(self):
        super().__init__()
Here, __class__ (not obj.__class__) is another magic name, available along with parameterless super, that always references the class in whose body it is used, never a subclass.
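A complete, runnable version of this sketch (Python 3.6+):

```python
class Base:
    _instances = {}

    def __init_subclass__(cls, **kwargs):
        # Runs after each subclass body has finished executing, at
        # which point the subclass' __class__ cell is already set,
        # so parameterless super() inside its methods works.
        super().__init_subclass__(**kwargs)
        __class__._instances[cls.__name__] = cls()

class Foo(Base):
    def __init__(self):
        super().__init__()

class Bar(Foo):
    def __init__(self):
        super().__init__()

# Each subclass was instantiated automatically at definition time:
print(sorted(Base._instances))   # ['Bar', 'Foo']
```

Note that Base itself is not registered: __init_subclass__ only fires for subclasses.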
Now, a couple of considerations on other approaches: no code within the metaclass itself can instantiate the class before the metaclass returns, and class-body code that needs to refer to the class by name does not help either. A class decorator would also not work, because the name is only bound after the decorator returns. If you are coding within a framework that has an event loop, you could, in the metaclass __init__, schedule an event to create a class instance - that would work, but you need to be coding within such a context (GObject, Qt, Django, SQLAlchemy, Zope/Pyramid and Python's asyncio/Tulip are examples of frameworks with an event system that could make the needed callbacks).
Now, if you really need this in Python 2, I think the easiest way is to create a function, called at the footer of each of your modules, that "registers" the recently created classes.
Something along the lines of:
class MetaTest(type):
    _instances = {}

def register():
    import sys
    # introspect the caller's frame to get that module's globals
    module_namespace = sys._getframe(1).f_globals
    for name, obj in module_namespace.items():
        if isinstance(obj, MetaTest):
            MetaTest._instances[obj.__name__] = obj()
Again: the call to this "register" function should come as the last line of each of your modules.
The "meta" thing here is introspecting the frame of the caller itself to get its global variables automatically. You can do away with that by making globals() a required explicit parameter to the register function, which also keeps the code easier for third parties to understand.
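A sketch of that explicit-parameter variant (in Python 3 syntax for brevity; with Python 2, use the __metaclass__ attribute instead of the metaclass keyword):

```python
class MetaTest(type):
    _instances = {}

def register(namespace):
    # Instantiate every class in the given namespace whose metaclass
    # is MetaTest; call it at the end of a module as register(globals()).
    for name, obj in list(namespace.items()):
        if isinstance(obj, MetaTest):
            MetaTest._instances[name] = obj()

class Foo(metaclass=MetaTest):
    def __init__(self):
        super().__init__()   # the name Foo is bound by now, so this is safe

register(globals())
print(MetaTest._instances)
```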
Why Python 3
As of today, Python 2 is two years from end of life, and a lot of improvements have happened in the language over the last eight years, including features in the class and metaclass machinery. I've seen a lot of people stick with Python 2 because "python" is linked to Python 2 in their Linux installs, so they think that is the default. It isn't: it is just there for backwards compatibility - more and more Linux installs use Python 3 as the default, usually installed alongside Python 2. And for your projects, it is nicer to use an isolated environment created with virtualenv anyway.

What jsbueno said. I've been trying to make this work in Python 2, but without much success. Here's the closest I got:
class MetaTest(type):
    _instances = {}

    def __init__(cls, name, bases, attrs):
        super(MetaTest, cls).__init__(name, bases, attrs)
        MetaTest._instances[name] = cls()

class Foo(object):
    __metaclass__ = MetaTest

class Bar(Foo):
    pass

for name, obj in Bar._instances.items():
    print name, obj, type(obj)
output
Foo <__main__.Foo object at 0xb74626ec> <class '__main__.Foo'>
Bar <__main__.Bar object at 0xb746272c> <class '__main__.Bar'>
However, if you try to give either Foo or Bar __init__ methods, then it all falls to pieces, for the reasons given by jsbueno.
Here's a Python 3.6 version using the argless form of super that's more successful:
class MetaTest(type):
    _instances = {}

    def __init__(cls, name, bases, attrs):
        print('cls: {}\nname: {}\nbases: {}\nattrs: {}\n'.format(cls, name, bases, attrs))
        super().__init__(name, bases, attrs)
        MetaTest._instances[name] = cls()

class Foo(metaclass=MetaTest):
    def __init__(self):
        super().__init__()

class Bar(Foo):
    def __init__(self):
        super().__init__()

for name, obj in Bar._instances.items():
    print(name, obj, type(obj))
output
cls: <class '__main__.Foo'>
name: Foo
bases: ()
attrs: {'__module__': '__main__', '__qualname__': 'Foo', '__init__': <function Foo.__init__ at 0xb7192dac>, '__classcell__': <cell at 0xb718c2b4: MetaTest object at 0xb718d3cc>}
cls: <class '__main__.Bar'>
name: Bar
bases: (<class '__main__.Foo'>,)
attrs: {'__module__': '__main__', '__qualname__': 'Bar', '__init__': <function Bar.__init__ at 0xb7192d64>, '__classcell__': <cell at 0xb718c2fc: MetaTest object at 0xb718d59c>}
Foo <__main__.Foo object at 0xb712514c> <class '__main__.Foo'>
Bar <__main__.Bar object at 0xb71251ac> <class '__main__.Bar'>
I must confess that I'm not very comfortable with this whole concept of classes automatically creating an instance of themselves: "Explicit is better than implicit". I think jsbueno's suggestion of using a registration function is a good compromise.

Related

How is a class constructed?

For the code below in Python 3,

class Spam(object):
    def __init__(self, name):
        self.name = name

    def bar(self):
        print('Am Spam.bar')

the metaclass for Spam is type and the base class for Spam is object.
My understanding is that
the purpose of the base class is to inherit properties, while the metaclass constructs the given class definition, as shown below:
body = """
def __init__(self, name):
    self.name = name

def bar(self):
    print('Am ', self.name)
"""
clsdict = type.__prepare__('type', 'Spam', (object,))
exec(body, globals(), clsdict)
Spam = type('Spam', (object,), clsdict)
s = Spam('xyz')
s.bar()
Given the signature def __prepare__(metacls, name, bases),
does __prepare__() require passing 'type' as the first argument?
type.__prepare__ is a bit special, in that it ignores all and any arguments passed to it and returns an empty dict.
>>> type.__prepare__()
{}
That said, you are not calling __prepare__ correctly. It is called with the name of the class to be created, its bases, and any keyword arguments the class is being created with: metaclass.__prepare__(name, bases, **kwds). Thus,
class MyClass(SomeBase, arg="value", metaclass=MyMeta):
    ...

will have __prepare__ called as:

MyMeta.__prepare__("MyClass", (SomeBase,), arg="value")
However, most user-defined metaclasses define their __prepare__ as a classmethod, meaning the metaclass is implicitly passed, so your __prepare__ definition can look like:

@classmethod
def __prepare__(metaclass, name, bases, **kwargs):
    ...

But __prepare__ is still called in the same way as before. It is through the magic of descriptors that the metaclass argument is added.
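For instance, a metaclass (the names here are made up) whose classmethod __prepare__ returns an OrderedDict, so that the order of definitions in the class body can be recorded:

```python
from collections import OrderedDict

class OrderingMeta(type):
    @classmethod
    def __prepare__(metacls, name, bases, **kwargs):
        # Called before the class body runs; whatever mapping is
        # returned here is used as the namespace for the body.
        return OrderedDict()

    def __new__(metacls, name, bases, namespace, **kwargs):
        cls = super().__new__(metacls, name, bases, dict(namespace))
        cls._definition_order = [key for key in namespace
                                 if not key.startswith('__')]
        return cls

class Spam(metaclass=OrderingMeta):
    def bar(self): pass
    def baz(self): pass
    def quux(self): pass

print(Spam._definition_order)   # ['bar', 'baz', 'quux']
```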

super not working with class decorators?

Let's define a simple class decorator function, which creates a subclass and adds 'Dec' to the original class name only:

def decorate_class(klass):
    new_class = type(klass.__name__ + 'Dec', (klass,), {})
    return new_class
Now apply it to a simple subclass definition:

class Base(object):
    def __init__(self):
        print 'Base init'

@decorate_class
class MyClass(Base):
    def __init__(self):
        print 'MyClass init'
        super(MyClass, self).__init__()
Now, if you try to instantiate the decorated MyClass, it will end up in an infinite loop:

c = MyClass()
# ...
# File "test.py", line 40, in __init__
#   super(MyClass, self).__init__()
# RuntimeError: maximum recursion depth exceeded while calling a Python object
It seems super can't handle this case and does not skip the current class in the inheritance chain.
The question: how does one correctly use a class decorator on classes that use super?
Bonus question: how does one get the final class from the proxy object created by super? I.e. from the super(Base, self).__init__ expression, determine the parent class that defines the called __init__.
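Regarding the bonus question: a bound super object exposes its parts as attributes (__thisclass__, __self__ and __self_class__), so the parent class that defines the called method can be recovered by walking the MRO from just past __thisclass__. A sketch:

```python
class Base(object):
    def __init__(self):
        self.ready = True

class MyClass(Base):
    def __init__(self):
        super(MyClass, self).__init__()

obj = MyClass()
proxy = super(MyClass, obj)

# Where the MRO search starts, and the class of the bound instance:
print(proxy.__thisclass__.__name__)    # MyClass
print(proxy.__self_class__.__name__)   # MyClass

# The class that actually defines the __init__ the proxy would call:
mro = proxy.__self_class__.__mro__
start = mro.index(proxy.__thisclass__) + 1
defining = next(cls for cls in mro[start:] if '__init__' in cls.__dict__)
print(defining.__name__)               # Base
```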
If you just want to change the class's .__name__ attribute, make a decorator that does that.
from __future__ import print_function

def decorate_class(klass):
    klass.__name__ += 'Dec'
    return klass

class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Python 3 output
MyClass init
Base init
<class '__main__.MyClass'> MyClassDec
Note the difference in the repr of cls. (I'm not sure why you'd want to change a class's name though, it sounds like a recipe for confusion, but I guess it's ok for this simple example).
As others have said, a @decorator isn't intended to create a subclass. You can do it in Python 3 by using the arg-less form of super (i.e., super().__init__()), and you can make it work in both Python 3 and Python 2 by explicitly supplying the parent class rather than using super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        Base.__init__(self)

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 & 3 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Finally, if we just call decorate_class using normal function syntax rather than as a @decorator, we can use super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

MyClassDec = decorate_class(MyClass)

c = MyClassDec()
cls = c.__class__
print(cls, cls.__name__)
The output is the same as in the last version.
Since your decorator returns an entirely new class with a different name, the name MyClass doesn't even refer to that class. This is not the case class decorators are intended for: they are meant to add functionality to an existing class, not to outright replace it with some other class.
Still, if you are using Python 3, the solution is simple:

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super().__init__()
Otherwise, I doubt there is any straightforward solution; you just need to change your implementation. If you rename the class, you need to overwrite __init__ as well to use the newer name.
The problem is that your decorator creates a subclass of the original one. That means that super(MyClass) now points to... the original class itself!
I cannot even explain how the zero-argument form of super manages to do the job in Python 3; I could not find anything explicit about it in the reference manual. I assume it must bind to the class in whose body it appears at the time of definition. But I cannot imagine a way to get that result in Python 2.
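For what it's worth, the zero-argument super in Python 3 works through a hidden closure cell: the compiler gives any method that mentions super (or __class__) a free variable named __class__, bound to the class being defined. Because the decorator's subclass merely inherits the method, the cell still points at the original class, and the MRO search starts past it - no recursion. A small demonstration:

```python
class Base(object):
    def __init__(self):
        self.trace = getattr(self, 'trace', []) + ['Base']

class MyClass(Base):
    def __init__(self):
        self.trace = ['MyClass']
        super().__init__()   # compiled roughly as super(__class__, self)

# The compiler added a __class__ free variable to __init__,
# and its cell holds the original class object:
print(MyClass.__init__.__code__.co_freevars)                     # ('__class__',)
print(MyClass.__init__.__closure__[0].cell_contents is MyClass)  # True

# A subclass created afterwards (as the decorator does) inherits the
# method unchanged, so super() still searches from past MyClass:
MyClassDec = type('MyClassDec', (MyClass,), {})
print(MyClassDec().trace)   # ['MyClass', 'Base']
```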
If you want to be able to use super in the decorated class in Python 2, you should not create a derived class, but directly modify the original class in place.
For example, here is a decorator that prints a line before and after calling any method (note the helper factory, needed so each wrapper binds its own name and method rather than the loop's last values):

import types

def decorate_class(klass):
    def make_wrapper(name, method):         # bind per iteration
        def meth(*args, **kwargs):          # define a wrapper
            print "Before", name
            result = method(*args, **kwargs)
            print "After", name
            return result
        return meth
    for name, method in klass.__dict__.items():    # iterate the class attributes
        if isinstance(method, types.FunctionType): # identify the methods
            setattr(klass, name, make_wrapper(name, method))  # install the wrapper
    return klass
With your example it gives, as expected:

>>> c = MyClass()
Before __init__
MyClass init
Base init
After __init__

Python __super black magic failed

I want to add an attribute to every class created by a metaclass. For example, when a class named C is created, I want to add an attribute C._C__sup whose value is the descriptor super(C).
Here is what I've tried:
class Meta(type):
    def __init__(cls, name, bases, dict):  # Not overriding type.__new__
        cls.__dict__['_' + name + '__sup'] = super(cls)
        # Not calling type.__init__; do I need it?

class C(object):
    __metaclass__ = Meta

c = C()
print c._C__sup
This gives me:
TypeError: Error when calling the metaclass bases
'dictproxy' object does not support item assignment
Some background information:
(You don't have to read this part)
Inspired by this article, what I'm doing is trying to avoid "hardcoding" the class name when using super:
The idea there is to use the unbound super objects as private
attributes. For instance, in our example, we could define the private
attribute __sup in the class C as the unbound super object
super(C):
>>> C._C__sup = super(C)
With this definition inside the methods the syntax self.__sup.meth
can be used as an alternative to super(C, self).meth. The advantage
is that you avoid to repeat the name of the class in the calling
syntax, since that name is hidden in the mangling mechanism of private
names. The creation of the __sup attributes can be hidden in a
metaclass and made automatic. So, all this seems to work: but
actually this is not the case.
Use setattr instead of assignment to cls.__dict__:
class Meta(type):
    def __init__(cls, name, bases, clsdict):  # Not overriding type.__new__
        setattr(cls, '_' + name + '__sup', super(cls))
        super(Meta, cls).__init__(name, bases, clsdict)

class C(object):
    __metaclass__ = Meta

    def say(self):
        return 'wow'

class D(C):
    def say(self):
        return 'bow' + self.__sup.say()

c = C()
print(c._C__sup)
# <super: <class 'C'>, <C object>>
d = D()
print(d.say())
prints
bowwow
By the way, it is a good idea to call
super(Meta, cls).__init__(name, bases, clsdict)
inside Meta.__init__ to allow Meta to participate in class hierarchies which
might need super to properly call a chain of __init__s. This seems
particularly appropriate since you are building a metaclass to assist with the
use of super.

How can one locate where an inherited variable comes from in Python?

If you have multiple layers of inheritance and know that a particular variable exists, is there a way to trace back to where the variable originated, without having to navigate backwards through each file and class? Possibly by calling some sort of function?
Example:
parent.py

class parent(object):
    def __init__(self):
        findMe = "Here I am!"

child.py

from parent import parent

class child(parent):
    pass

grandson.py

from child import child

class grandson(child):
    def printVar(self):
        print self.findMe
Try to locate where the findMe variable came from with a function call.
If the "variable" is an instance variable - that is, if at any point in the chain of __init__ methods you do:

def __init__(self):
    self.findMe = "Here I am!"

it is an instance variable from that point on, and cannot, for all practical purposes, be distinguished from any other instance variable. (Unless you put a mechanism in place, like a class with a special __setattr__ method, that keeps track of attributes changing and can introspect which part of the code set the attribute - see the last example in this answer.)
Please also note that in your example,

class parent(object):
    def __init__(self):
        findMe = "Here I am!"

findMe is defined as a local variable of that method and does not even exist after __init__ finishes.
Now, if your variable is set as a class attribute somewhere on the inheritance chain:

class parent(object):
    findMe = False

class childone(parent):
    ...

it is possible to find the class where findMe is defined by introspecting each class' __dict__ in the MRO (method resolution order) chain. Of course, there is no way, and no sense, in doing that without introspecting all classes in the MRO chain - except if one keeps track of attributes as they are defined, as in the example below - but introspecting the MRO itself is a one-liner in Python:
def __init__(self):
    super().__init__()
    ...
    findme_definer = [cls for cls in self.__class__.__mro__
                      if "findMe" in cls.__dict__][0]
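The same one-liner, wrapped as a standalone helper (the function name is made up):

```python
def find_defining_class(obj, attrname):
    """Return the first class in type(obj).__mro__ whose own __dict__
    defines attrname, or None (e.g. for plain instance attributes)."""
    for cls in type(obj).__mro__:
        if attrname in cls.__dict__:
            return cls
    return None

class parent(object):
    findMe = False

class child(parent):
    pass

class grandson(child):
    pass

print(find_defining_class(grandson(), 'findMe'))   # the class parent
```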
Again - it would be possible to have a metaclass in your inheritance chain that keeps track of all defined attributes in the inheritance tree, using a dictionary to retrieve where each attribute was defined. The same metaclass could also auto-decorate all __init__ methods (or all methods), and set a special __setattr__, so that it could track instance attributes as they are created, as described above.
That can be done, is a bit complicated, would be hard to maintain, and probably is a signal you are taking the wrong approach to your problem.
So, the metaclass to record just class attributes could simply be (Python 3 syntax - define a __metaclass__ attribute in the class body if you are still using Python 2.7):

class MetaBase(type):
    definitions = {}

    def __init__(cls, name, bases, dct):
        for attr in dct.keys():
            cls.__class__.definitions[attr] = cls

class parent(metaclass=MetaBase):
    findMe = 5

    def __init__(self):
        print(self.__class__.definitions["findMe"])
Now, if one wants to find which of the superclasses defined an attribute of the current instance - i.e. a "live" tracking mechanism - wrapping each method in each class can work, but it is a lot trickier.
I've made it anyway - even if you won't need this much, this combines both methods: keeping track of class attributes in the metaclass' definitions dictionary, and of instance attributes in a per-instance _definitions dictionary - since in each created instance an arbitrary method might have been the last to set a particular instance attribute. (This is pure Python 3, and may not be that straightforward to port to Python 2, due to the "unbound method" objects Python 2 uses where Python 3 has plain functions.)
from threading import current_thread
from functools import wraps
from types import MethodType
from collections import defaultdict

def method_decorator(func, cls):
    @wraps(func)
    def wrapper(self, *args, **kw):
        self.__class__.__class__.current_running_class[current_thread()].append(cls)
        result = MethodType(func, self)(*args, **kw)
        self.__class__.__class__.current_running_class[current_thread()].pop()
        return result
    return wrapper

class MetaBase(type):
    definitions = {}
    current_running_class = defaultdict(list)

    def __init__(cls, name, bases, dct):
        for attrname, attr in dct.items():
            cls.__class__.definitions[attr] = cls
            if callable(attr) and attrname != "__setattr__":
                setattr(cls, attrname, method_decorator(attr, cls))

class Base(object, metaclass=MetaBase):
    def __setattr__(self, attr, value):
        if not hasattr(self, "_definitions"):
            super().__setattr__("_definitions", {})
        self._definitions[attr] = self.__class__.current_running_class[current_thread()][-1]
        return super().__setattr__(attr, value)
Example classes for the code above:

class Parent(Base):
    def __init__(self):
        super().__init__()
        self.findMe = 10

class Child1(Parent):
    def __init__(self):
        super().__init__()
        self.findMe1 = 20

class Child2(Parent):
    def __init__(self):
        super().__init__()
        self.findMe2 = 30

class GrandChild(Child1, Child2):
    def __init__(self):
        super().__init__()

    def findall(self):
        for attr in "findMe findMe1 findMe2".split():
            print("Attr '{}' defined in class '{}' ".format(attr, self._definitions[attr].__name__))
And on the console one will get this result:
In [87]: g = GrandChild()
In [88]: g.findall()
Attr 'findMe' defined in class 'Parent'
Attr 'findMe1' defined in class 'Child1'
Attr 'findMe2' defined in class 'Child2'

Python metaclasses: Why isn't __setattr__ called for attributes set during class definition?

I have the following python code:
class FooMeta(type):
    def __setattr__(self, name, value):
        print name, value
        return super(FooMeta, self).__setattr__(name, value)

class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123

    def a(self):
        pass

I would have expected __setattr__ of the metaclass to be called for both FOO and a. However, it is not called at all. When I assign something to Foo.whatever after the class has been defined, the method is called.
What's the reason for this behaviour, and is there a way to intercept the assignments that happen during the creation of the class? Using attrs in __new__ won't work since I'd like to check whether a method is being redefined.
A class block is roughly syntactic sugar for building a dictionary, and then invoking a metaclass to build the class object.
This:
class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123

    def a(self):
        pass

Comes out pretty much as if you'd written:

d = {}
d['__metaclass__'] = FooMeta
d['FOO'] = 123

def a(self):
    pass

d['a'] = a
Foo = d.get('__metaclass__', type)('Foo', (object,), d)
Only without the namespace pollution (and in reality there's also a search through all the bases to determine the metaclass, or whether there's a metaclass conflict, but I'm ignoring that here).
The metaclass' __setattr__ can control what happens when you try to set an attribute on one of its instances (the class object), but inside the class block you're not doing that, you're inserting into a dictionary object, so the dict class controls what's going on, not your metaclass. So you're out of luck.
Unless you're using Python 3.x! In Python 3.x you can define a __prepare__ classmethod (or staticmethod) on a metaclass, which controls what object is used to accumulate attributes set within a class block before they're passed to the metaclass constructor. The default __prepare__ simply returns a normal dictionary, but you could build a custom dict-like class that doesn't allow keys to be redefined, and use that to accumulate your attributes:
from collections import MutableMapping

class SingleAssignDict(MutableMapping):
    def __init__(self, *args, **kwargs):
        self._d = dict(*args, **kwargs)

    def __getitem__(self, key):
        return self._d[key]

    def __setitem__(self, key, value):
        if key in self._d:
            raise ValueError(
                'Key {!r} already exists in SingleAssignDict'.format(key)
            )
        else:
            self._d[key] = value

    def __delitem__(self, key):
        del self._d[key]

    def __iter__(self):
        return iter(self._d)

    def __len__(self):
        return len(self._d)

    def __contains__(self, key):
        return key in self._d

    def __repr__(self):
        return '{}({!r})'.format(type(self).__name__, self._d)

class RedefBlocker(type):
    @classmethod
    def __prepare__(metacls, name, bases, **kwargs):
        return SingleAssignDict()

    def __new__(metacls, name, bases, sad):
        return super().__new__(metacls, name, bases, dict(sad))

class Okay(metaclass=RedefBlocker):
    a = 1
    b = 2

class Boom(metaclass=RedefBlocker):
    a = 1
    b = 2
    a = 3
Running this gives me:
Traceback (most recent call last):
  File "/tmp/redef.py", line 50, in <module>
    class Boom(metaclass=RedefBlocker):
  File "/tmp/redef.py", line 53, in Boom
    a = 3
  File "/tmp/redef.py", line 15, in __setitem__
    'Key {!r} already exists in SingleAssignDict'.format(key)
ValueError: Key 'a' already exists in SingleAssignDict
Some notes:
1. __prepare__ has to be a classmethod or staticmethod, because it's being called before the metaclass' instance (your class) exists.
2. type still needs its third parameter to be a real dict, so you have to have a __new__ method that converts the SingleAssignDict to a normal one.
3. I could have subclassed dict, which would probably have avoided (2), but I really dislike doing that because of how the non-basic methods like update don't respect your overrides of the basic methods like __setitem__. So I prefer to subclass collections.MutableMapping and wrap a dictionary.
4. The actual Okay.__dict__ object is a normal dictionary, because it was set by type, and type is finicky about the kind of dictionary it wants. This means that overwriting class attributes after class creation does not raise an exception. You can overwrite the __dict__ attribute after the superclass call in __new__ if you want to maintain the no-overwriting behaviour forced by the class object's dictionary.
Sadly this technique is unavailable in Python 2.x (I checked). The __prepare__ method isn't invoked, which makes sense: in Python 2.x the metaclass is determined by the __metaclass__ magic attribute rather than a special keyword in the class block, which means the dict object used to accumulate attributes for the class block already exists by the time the metaclass is known.
Compare Python 2:
class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123

    def a(self):
        pass

Being roughly equivalent to:

d = {}
d['__metaclass__'] = FooMeta
d['FOO'] = 123

def a(self):
    pass

d['a'] = a
Foo = d.get('__metaclass__', type)('Foo', (object,), d)
Where the metaclass to invoke is determined from the dictionary, versus Python 3:
class Foo(metaclass=FooMeta):
    FOO = 123

    def a(self):
        pass

Being roughly equivalent to:

d = FooMeta.__prepare__('Foo', ())
d['FOO'] = 123

def a(self):
    pass

d['a'] = a
Foo = FooMeta('Foo', (), d)
Where the dictionary to use is determined from the metaclass.
There are no assignments happening during the creation of the class. Or rather: they are happening, but not in the context you think. All class attributes are collected from the class body's scope and passed to the metaclass' __new__ as the last argument:

class FooMeta(type):
    def __new__(self, name, bases, attrs):
        print attrs
        return type.__new__(self, name, bases, attrs)

class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123
Reason: when the code in the class body executes, there's no class yet, which means there's no opportunity for the metaclass to intercept anything yet.
Class attributes are passed to the metaclass as a single dictionary and my hypothesis is that this is used to update the __dict__ attribute of the class all at once, e.g. something like cls.__dict__.update(dct) rather than doing setattr() on each item. More to the point, it's all handled in C-land and simply wasn't written to call a custom __setattr__().
It's easy enough to do whatever you want to the attributes of the class in your metaclass's __init__() method, since you're passed the class namespace as a dict, so just do that.
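For the original goal of catching a method being redefined (here taken to mean: overriding a name a base class already defines), a check in the metaclass' __init__ is enough, since the namespace dict is available there. A sketch in Python 3 syntax (names are illustrative); note it cannot catch a name bound twice inside one class body - that needs a custom __prepare__ mapping as shown above:

```python
class NoOverrideMeta(type):
    def __init__(cls, name, bases, attrs):
        super().__init__(name, bases, attrs)
        for attrname in attrs:
            if attrname.startswith('__'):
                continue  # skip dunders added by the class machinery
            # Does any class further up the MRO already define this name?
            for base in cls.__mro__[1:]:
                if attrname in base.__dict__:
                    raise TypeError('{}.{} overrides {}.{}'.format(
                        name, attrname, base.__name__, attrname))

class Base(metaclass=NoOverrideMeta):
    def greet(self):
        return 'hi'

class Ok(Base):
    def extra(self):
        return 'ok'

try:
    class Bad(Base):
        def greet(self):          # redefines Base.greet
            return 'oops'
except TypeError as exc:
    msg = str(exc)

print(msg)   # Bad.greet overrides Base.greet
```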
During class creation, your namespace is evaluated to a dict and passed as an argument to the metaclass, together with the class name and base classes. Because of that, assigning a class attribute inside the class definition doesn't work the way you expect: it doesn't create an empty class and then assign each attribute. You also can't have duplicate keys in a dict, so attributes are already deduplicated during class creation. Only by setting an attribute after the class definition can you trigger your custom __setattr__.
Because the namespace is a dict, there's no way for you to check for duplicated methods, as suggested in your other question. The only practical way to do that is parsing the source code.
