The Scenario:
class A:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret
    def same_name_method(self):
        do_some_staff
    def method_a(self):
        pass

class B:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret
    def same_name_method(self):
        do_another_staff
    def method_b(self):
        pass

class C(A, B):
    def __init__(self, *args, **kwargs):
        # I want to init both class A and B's key and secret
        ## I want to rename class A and B's same method
        any_ideas()
    ...
What I Want:
I want an instance of class C to initialize both class A and class B, because they use different API keys.
And I want to rename class A's and B's same_name_method, so I won't be confused about which same_name_method is which.
What I Have Done:
For problem one, I have done this:
class C(A, B):
    def __init__(self, *args, **kwargs):
        A.__init__(self, a_api_key, a_api_secret)
        B.__init__(self, b_api_key, b_api_secret)
Comment: I know about super(), but for this situation I do not know how to use it.
For problem two, I added a __new__ to class C:
def __new__(cls, *args, **kwargs):
    cls.platforms = []
    cls.rename_method = []
    for platform in cls.__bases__:
        # fetch platform module name
        module_name = platform.__module__.split('.')[0]
        cls.platforms.append(module_name)
        # rename attr
        for k, v in platform.__dict__.items():
            if not k.startswith('__'):
                setattr(cls, module_name + '_' + k, v)
                cls.rename_method.append(k)
    for i in cls.rename_method:
        delattr(cls, i)  ## this line will raise AttributeError!!
    return super().__new__(cls)
Comment: because I rename the methods and add the new names as attributes of cls, I need to delete the old method attributes, but I do not know how to delattr them. For now I just leave them alone and do not delete the old methods.
Question:
Any Suggestions?
So, you want some pretty advanced things, some complicated things, and you don't understand well how classes behave in Python.
So, for your first point - initializing both classes, and any other method that should run in all the base classes - the correct solution is to make cooperative calls to super() methods.
A call to super() in Python returns a very special proxy object that reflects the methods available in the next class, following the proper Method Resolution Order (MRO).
So, if A.__init__ and B.__init__ have to be called, both methods should include a super().__init__ call - and one will call the other's __init__ in the appropriate order, regardless of how they are used as bases in subclasses. As object also has an __init__, the last super().__init__ will simply reach it, and that is a no-op. If you have more methods in your classes that should run in all base classes, you'd rather build a proper common base class, so that the top-most super() call doesn't try to propagate to a non-existent method.
Otherwise, it is just:
class A:
    def __init__(self, akey, asecret, **kwargs):
        self.key = akey
        self.secret = asecret
        super().__init__(**kwargs)

class B:
    def __init__(self, bkey, bsecret, **kwargs):
        self.key = bkey
        self.secret = bsecret
        super().__init__(**kwargs)

class C(A, B):
    # does not even need an explicit `__init__`.
    pass
I think you can get the idea. Of course, the parameter names have to differ - ideally, when writing C you don't have to worry about parameter order - but when calling C you do have to worry about supplying all mandatory parameters for C and its bases. If you can't rename the parameters in A or B to be distinct, you could instead rely on parameter order for the call, with each __init__ consuming two positional parameters - but that will require some extra care with the inheritance order.
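For illustration, a minimal sketch of that positional approach might look like this (the distinct attribute names a_key/b_key are mine, added so the two credential pairs don't overwrite each other):

class A:
    def __init__(self, key, secret, *args, **kwargs):
        self.a_key = key        # stored under a distinct name (my assumption, not in the original)
        self.a_secret = secret
        super().__init__(*args, **kwargs)   # pass the remaining positionals along the MRO

class B:
    def __init__(self, key, secret, *args, **kwargs):
        self.b_key = key
        self.b_secret = secret
        super().__init__(*args, **kwargs)

class C(A, B):
    pass

# A (first in the MRO) consumes the first pair, B the second:
c = C("a_api_key", "a_api_secret", "b_api_key", "b_api_secret")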
So - up to this point, it is basic Python multiple-inheritance "howto", and should be pretty straightforward. Now comes your strange stuff.
As for the auto-renaming of methods: first things first -
are you quite sure you need inheritance? Maybe having your granular classes for each external service, and a registry and dispatch class that calls the methods on the others by composition, would be saner. (I will come back to this later.)
Are you aware that __new__ is called for each instantiation of the class, and all class-attribute mangling you are performing there happens at each new instance of your classes?
So, if the needed method renaming + shadowing has to take place at class creation time, you can do that using the special method __init_subclass__, which exists since Python 3.6. It is a special class method that is called once for each class derived from the class it is defined on. So, just create a base class from which A and B themselves will inherit, and move a properly modified version of the thing you are putting in __new__ there (see the sketch below). If you are not using Python 3.6, this should be done in the __new__ or __init__ of a metaclass, not in the __new__ of the class itself.
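A rough sketch of what that could look like (RenamingBase is my name for the common base class, and the prefixing logic mirrors the __new__ from the question, under the assumption that each platform class lives in its own module):

class RenamingBase:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # runs once per derived class, at class creation time - not per instance
        for base in cls.__bases__:
            if base is RenamingBase:
                continue
            prefix = base.__module__.split('.')[0]
            for name, value in base.__dict__.items():
                if not name.startswith('_'):
                    # expose e.g. "<module>_same_name_method" on the new class
                    setattr(cls, prefix + '_' + name, value)
        # Note: delattr(cls, name) would still raise AttributeError here, because the
        # original names live on A and B, not on the new class - which is exactly
        # what happened in the question's __new__.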
Another approach would be to have a custom __getattribute__ method - this could be crafted to provide namespaces for the base classes. It would work only on instances, not on the classes themselves (but could be made to, again, by using a metaclass). __getattribute__ can even hide the same-named methods.
class Base:
    @classmethod
    def _get_base_modules(cls):
        result = {}
        for base in cls.__bases__:
            module_name = base.__module__.split(".")[0]
            result[module_name] = base
        return result

    def _proxy(self, module_name):
        class base:
            def __dir__(base_self):
                return dir(self._base_modules[module_name])
            def __getattr__(base_self, attr):
                original_value = self._base_modules[module_name].__dict__[attr]
                if hasattr(original_value, "__get__"):
                    original_value = original_value.__get__(self, self.__class__)
                return original_value
        base.__name__ = module_name
        return base()

    def __init_subclass__(cls):
        cls._base_modules = cls._get_base_modules()
        cls._shadowed = {
            name
            for module_class in cls._base_modules.values()
            for name in module_class.__dict__
            if not name.startswith("_")
        }

    def __getattribute__(self, attr):
        if attr.startswith("_"):
            return super().__getattribute__(attr)
        cls = self.__class__
        if attr in cls._shadowed:
            raise AttributeError(attr)
        if attr in cls._base_modules:
            return self._proxy(attr)
        return super().__getattribute__(attr)

    def __dir__(self):
        return super().__dir__() + list(self._base_modules)

class A(Base):
    ...

class B(Base):
    ...

class C(A, B):
    ...
As you can see - this is some fun, but it starts getting really complicated - and all the hoops that are needed to retrieve the actual attributes from the superclasses after adding an artificial namespace seem to indicate your problem is not really calling for inheritance after all, as I suggested above.
Since you have your small, functional, atomic classes for each "service", you could use a plain, simple, non-meta-at-all class that works as a registry for the various services - and you can even enhance it to call the equivalent method in several of the services it is handling with a single call:
class Services:
    def __init__(self):
        self.registry = {}

    def register(self, cls, key, secret):
        name = cls.__module__.split(".")[0]
        service = cls(key, secret)
        self.registry[name] = service

    def __getattr__(self, attr):
        if attr in self.registry:
            return self.registry[attr]
        raise AttributeError(attr)
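A hypothetical usage sketch, assuming A and B live in modules named exchange_a and exchange_b (those module names, and the key variables, are placeholders, not from the question):

services = Services()
services.register(A, a_api_key, a_api_secret)   # registered under "exchange_a"
services.register(B, b_api_key, b_api_secret)   # registered under "exchange_b"

services.exchange_a.same_name_method()   # A's version, no renaming needed
services.exchange_b.same_name_method()   # B's version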
Related
I have a python abstract base class as follows:
from abc import ABCMeta, abstractmethod

class Node(object):
    """
    All concrete node classes should inherit from this
    """
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name
        self.inputs = dict()

    def add_input(self, key, value=None, d=None):
        self.inputs[key] = (d, value)

    def bind_input(self):
        print "Binding inputs"

    @abstractmethod
    def run(self):
        pass
Now, various derived classes will inherit from this Node class and override the run method. It is always the case that bind_input() must be the first thing called in the run method. Currently, for all derived classes, the developer has to remember to call self.bind_input() first. This is not a huge problem per se, but out of curiosity: is it possible to ensure from the base class itself that bind_input is called before executing the child object's run?
The usual object-oriented approach is this:
def run(self):
    self.bind_input()
    return self.do_run()

@abstractmethod
def do_run(self):
    pass  # override this method
Have your subclasses override the inner method, instead of the outer one.
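A minimal sketch of a concrete subclass under that pattern, assuming Node.run has been reworked as above (ConcreteNode and its body are illustrative, not from the question):

class ConcreteNode(Node):
    def do_run(self):
        # subclass-specific work; bind_input() has already run by the time we get here
        return sorted(self.inputs)

node = ConcreteNode("example")
node.add_input("x", value=42)
node.run()   # prints "Binding inputs" first, then runs do_run()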
If you have multiple layers of inheritance and know that a particular variable exists, is there a way to trace back to where the variable originated, without having to navigate backwards by looking through each file and class - possibly by calling some sort of function that will do it?
Example:
parent.py
class parent(object):
    def __init__(self):
        findMe = "Here I am!"
child.py
from parent import parent
class child(parent):
    pass
grandson.py
from child import child
class grandson(child):
    def printVar(self):
        print self.findMe
Try to locate where the findMe variable came from with a function call.
If the "variable" is an instance variable - , so , if at any point in chain of __init__ methods you do:
def __init__(self):
    self.findMe = "Here I am!"
It is an instance variable from that point on, and cannot, for all practical purposes, be distinguished from any other instance variable. (Unless you put in place a mechanism, like a class with a special __setattr__ method, that keeps track of attributes changing and can introspect back which part of the code set the attribute - see the last example in this answer.)
Please also note that in your example,
class parent(object):
    def __init__(self):
        findMe = "Here I am!"
findMe is defined as a local variable to that method and does not even exist after __init__ is finished.
Now, if your variable is set as a class attribute somewhere on the inheritance chain:
class parent(object):
    findMe = False

class childone(parent):
    ...
It is possible to find the class where findMe is defined by introspecting each class's __dict__ in the MRO (Method Resolution Order) chain. Of course, there is no way, and no sense, in doing that without introspecting all classes in the MRO chain - unless one keeps track of attributes as they are defined, as in the example further below - but introspecting the MRO itself is a one-liner in Python:
def __init__(self):
    super().__init__()
    ...
    findme_definer = [cls for cls in self.__class__.__mro__ if "findMe" in cls.__dict__][0]
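As a standalone sketch, using the class-attribute example from above:

class parent(object):
    findMe = False

class childone(parent):
    pass

obj = childone()
definer = [cls for cls in type(obj).__mro__ if "findMe" in cls.__dict__][0]
print(definer)   # <class '__main__.parent'>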
Again - it would be possible to have a metaclass for your inheritance chain which would keep track of all attributes defined in the inheritance tree, and use a dictionary to retrieve where each attribute is defined. The same metaclass could also auto-decorate all __init__ methods (or all methods), and set a special __setattr__ so that it could track instance attributes as they are created, as described above.
That can be done, it is a bit complicated, it would be hard to maintain, and it is probably a signal you are taking the wrong approach to your problem.
So, the metaclass to record just class attributes could simply be (Python 3 syntax - define a __metaclass__ attribute in the class body if you are still using Python 2.7):
class MetaBase(type):
    definitions = {}
    def __init__(cls, name, bases, dct):
        for attr in dct.keys():
            cls.__class__.definitions[attr] = cls

class parent(metaclass=MetaBase):
    findMe = 5
    def __init__(self):
        print(self.__class__.definitions["findMe"])
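A quick usage sketch of that metaclass (the childone subclass is mine, for illustration):

class childone(parent):
    pass

childone()   # prints <class '__main__.parent'> - findMe was recorded as defined in parent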
Now, if one wants to find which of the superclasses defined an attribute of the current class, a "live" tracking mechanism, wrapping each method in each class, can work - but it is a lot trickier.
I've made it - even if you won't need this much, this combines both methods - keeping track of class attributes in the metaclass's definitions dictionary and of instance attributes in an instance _definitions dictionary - since in each created instance an arbitrary method might have been the last to set a particular instance attribute. (This is pure Python 3, and may not be that straightforward to port to Python 2 due to the "unbound method" that Python 2 uses, which is a simple function in Python 3.)
from threading import current_thread
from functools import wraps
from types import MethodType
from collections import defaultdict

def method_decorator(func, cls):
    @wraps(func)
    def wrapper(self, *args, **kw):
        self.__class__.__class__.current_running_class[current_thread()].append(cls)
        result = MethodType(func, self)(*args, **kw)
        self.__class__.__class__.current_running_class[current_thread()].pop()
        return result
    return wrapper

class MetaBase(type):
    definitions = {}
    current_running_class = defaultdict(list)
    def __init__(cls, name, bases, dct):
        for attrname, attr in dct.items():
            cls.__class__.definitions[attrname] = cls
            if callable(attr) and attrname != "__setattr__":
                setattr(cls, attrname, method_decorator(attr, cls))

class Base(object, metaclass=MetaBase):
    def __setattr__(self, attr, value):
        if not hasattr(self, "_definitions"):
            super().__setattr__("_definitions", {})
        self._definitions[attr] = self.__class__.current_running_class[current_thread()][-1]
        return super().__setattr__(attr, value)
Example Classes for the code above:
class Parent(Base):
    def __init__(self):
        super().__init__()
        self.findMe = 10

class Child1(Parent):
    def __init__(self):
        super().__init__()
        self.findMe1 = 20

class Child2(Parent):
    def __init__(self):
        super().__init__()
        self.findMe2 = 30

class GrandChild(Child1, Child2):
    def __init__(self):
        super().__init__()

    def findall(self):
        for attr in "findMe findMe1 findMe2".split():
            print("Attr '{}' defined in class '{}' ".format(attr, self._definitions[attr].__name__))
And on the console one will get this result:
In [87]: g = GrandChild()
In [88]: g.findall()
Attr 'findMe' defined in class 'Parent'
Attr 'findMe1' defined in class 'Child1'
Attr 'findMe2' defined in class 'Child2'
I am almost sure that there is a proper term for what I want to do, but since I'm not familiar with it, I will try to describe the whole idea explicitly. So what I have is a collection of classes that all inherit from one base class. The classes consist almost entirely of different methods that are relevant within each class only. However, there are several methods that share a similar name, general functionality and also some logic, but their implementation is still mostly different. So what I want to know is whether it's possible to create a method in the base class that executes some logic that is common to all the methods, but still continues the execution in the class-specific method. Hopefully that makes sense, but I will try to give a basic example of what I want.
So consider a base class that looks something like that:
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)

    def access(self):
        LOGIC_SHARED
And an example of a derived class:
class App1(App):
    def __init__(self, testName):
        . . .
        super(App1, self).__init__(testName)

    def access(self):
        LOGIC_SPECIFIC
So what I'd like to achieve is that the LOGIC_SHARED part in the base class's access method is executed when calling the access method of any App subclass, before executing the LOGIC_SPECIFIC part, which is (as the name says) specific to each access method of the derived classes.
If that makes any difference, the LOGIC_SHARED mostly consists of logging and maintenance tasks.
Hope that is clear enough and the idea makes sense.
NOTE 1:
There are class specific parameters which are being used in the LOGIC_SHARED section.
NOTE 2:
It is important to implement that behavior using only Python built-in functions and modules.
NOTE 3:
The LOGIC_SHARED part looks something like that:
try:
    self.localLog.info("Checking the actual link for %s", self.application)
    self.link = self.checkLink(self.application)
    self.localLog.info("Actual link found!: %s", self.link)
except:
    self.localLog.info("No links found. Going to use the default link: %s", self.link)
So, there are plenty of specific class instance attributes that I use and I'm not sure how to use these attributes from the base class.
Sure - just put the specific logic in its own "private" method, which can be overridden by the derived classes, and leave access in the Base.
class Base(object):
    def access(self):
        # Shared logic 1
        self._specific_logic()
        # Shared logic 2

    def _specific_logic(self):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self):
        # DerivedA specific logic
        pass

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self):
        # DerivedB specific logic
        pass

def test():
    x = Base()
    x.access()  # Shared logic 1
                # Shared logic 2

    a = DerivedA()
    a.access()  # Shared logic 1
                # Derived A specific logic
                # Shared logic 2

    b = DerivedB()
    b.access()  # Shared logic 1
                # Derived B specific logic
                # Shared logic 2
The easiest way to do what you want is to simply call the parent class's access method inside the child's access method.
class App(object):
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)

    def access(self):
        LOGIC_SHARED

class App1(App):
    def __init__(self, testName):
        super(App1, self).__init__(testName)

    def access(self):
        App.access(self)
        # or use super
        super(App1, self).access()
However, your shared functionality is mostly logging and maintenance. Unless there is a pressing reason to put this inside the parent class, you may want to consider refactoring the shared functionality into a decorator function. This is particularly useful if you want to reuse similar logging and maintenance functionality for a range of methods inside your class.
You can read more about function decorators here: http://www.artima.com/weblogs/viewpost.jsp?thread=240808, or here on Stack Overflow: How to make a chain of function decorators?.
def logged(method):
    def decorated_method(self, *args, **kwargs):
        LOGIC_SHARED
        return method(self, *args, **kwargs)
    return decorated_method
Remember that in Python, functions are first-class objects. That means you can take a function and pass it as a parameter to another function. A decorator function makes use of this: the decorator takes another function as a parameter (here called method) and then creates a new function (here called decorated_method) that takes the place of the original function.
Your App1 class then would look like this:
class App1(App):
    @logged
    def access(self):
        LOGIC_SPECIFIC
This really is shorthand for this:
class App1(App):
    def access(self):
        LOGIC_SPECIFIC

decorated_access = logged(App1.access)
App1.access = decorated_access
I would find this more elegant than adding methods to the superclass to capture shared functionality.
If I understand this comment correctly (How to execute BaseClass method before it gets overridden by DerivedClass method in Python), you want additional arguments passed to the parent class's method to be used in the derived class.
Based on Jonathon Reinhart's answer, here is how you could do it:
class Base(object):
    def access(self,
               param1, param2,  # first: common parameters
               *args,           # second: positional parameters
               **kwargs         # third: keyword arguments
               ):
        # Shared logic 1
        self._specific_logic(param1, param2, *args, **kwargs)
        # Shared logic 2

    def _specific_logic(self, param1, param2, *args, **kwargs):
        # Nothing special to do in the base class
        pass
        # Or you could even raise an exception instead:
        # raise Exception('Called access on Base class instance')

class DerivedA(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param3):
        # DerivedA specific logic
        pass

class DerivedB(Base):
    # overrides Base implementation
    def _specific_logic(self, param1, param2, param4):
        # DerivedB specific logic
        pass

def test():
    x = Base()

    a = DerivedA()
    a.access("param1", "param2", "param3")         # Shared logic 1
                                                   # Derived A specific logic
                                                   # Shared logic 2

    b = DerivedB()
    b.access("param1", "param2", param4="param4")  # Shared logic 1
                                                   # Derived B specific logic
                                                   # Shared logic 2
I personally prefer Jonathon Reinhart's answer, but seeing as you seem to want more options, here's two more. I would probably never use the metaclass one, as cool as it is, but I might consider the second one with decorators.
With Metaclasses
This method uses a metaclass for the base class that will force the base class's access method to be called first, without having a separate private function, and without having to explicitly call super or anything like that. End result: no extra work/code goes into inheriting classes.
Plus, it works like maaaagiiiiic </spongebob>
Below is the code that will do this. Here http://dbgr.cc/W you can step through the code live and see how it works:
#!/usr/bin/env python

class ForceBaseClassFirst(type):
    def __new__(cls, name, bases, attrs):
        """
        """
        print("Creating class '%s'" % name)

        def wrap_function(fn_name, base_fn, other_fn):
            def new_fn(*args, **kwargs):
                print("calling base '%s' function" % fn_name)
                base_fn(*args, **kwargs)
                print("calling other '%s' function" % fn_name)
                other_fn(*args, **kwargs)
            new_fn.__name__ = "wrapped_%s" % fn_name
            return new_fn

        if name != "BaseClass":
            print("setting attrs['access'] to wrapped function")
            attrs["access"] = wrap_function(
                "access",
                getattr(bases[0], "access", lambda: None),
                attrs.setdefault("access", lambda: None)
            )
        return type.__new__(cls, name, bases, attrs)

class BaseClass(object):
    __metaclass__ = ForceBaseClassFirst
    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes:")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
This uses a metaclass to replace OtherClass's access function with a function that wraps a call to BaseClass's access function and a call to OtherClass's access function. See the best explanation of metaclasses here https://stackoverflow.com/a/6581949.
Stepping through the code should really help you understand the order of things.
With Decorators
This functionality could also easily be put into a decorator, as shown below. Again, a steppable/debuggable/runnable version of the code below can be found here http://dbgr.cc/0
#!/usr/bin/env python

def superfy(some_func):
    def wrapped(self, *args, **kwargs):
        # NOTE might need to be changed when dealing with
        # multiple inheritance
        base_fn = getattr(self.__class__.__bases__[0], some_func.__name__, lambda *args, **kwargs: None)
        # bind the parent class' function and call it
        base_fn.__get__(self, self.__class__)(*args, **kwargs)
        # call the child class' function
        some_func(self, *args, **kwargs)
    wrapped.__name__ = "superfy(%s)" % some_func.__name__
    return wrapped

class BaseClass(object):
    def access(self):
        print("in BaseClass access function")

class OtherClass(BaseClass):
    @superfy
    def access(self):
        print("in OtherClass access function")

print("OtherClass attributes")
print("----------------------")
for k, v in OtherClass.__dict__.iteritems():
    print("%15s: %r" % (k, v))
print("")

o = OtherClass()
print("Calling access on OtherClass instance")
print("-------------------------------------")
o.access()
The decorator above retrieves the BaseClass' function of the same name, and calls that first before calling the OtherClass' function.
Maybe this simple approach can help.
import logging

class App:
    def __init__(self, testName):
        self.localLog = logging.getLogger(testName)
        self.application = None
        self.link = None

    def access(self):
        print('There is something BaseClass must do')
        print('The application is ', self.application)
        print('The link is ', self.link)

class App1(App):
    def __init__(self, testName):
        # ...
        super(App1, self).__init__(testName)

    def access(self):
        self.application = 'Application created by App1'
        self.link = 'Link created by App1'
        super(App1, self).access()
        print('There is something App1 must do')

class App2(App):
    def __init__(self, testName):
        # ...
        super(App2, self).__init__(testName)

    def access(self):
        self.application = 'Application created by App2'
        self.link = 'Link created by App2'
        super(App2, self).access()
        print('There is something App2 must do')
and the test result:
>>>
>>> app = App('Baseclass')
>>> app.access()
There is something BaseClass must do
The application is None
The link is None
>>> app1 = App1('App1 test')
>>> app1.access()
There is something BaseClass must do
The application is Application created by App1
The link is Link created by App1
There is something App1 must do
>>> app2 = App2('App2 text')
>>> app2.access()
There is something BaseClass must do
The application is Application created by App2
The link is Link created by App2
There is something App2 must do
>>>
By adding a combine function we can combine two functions and execute them one after the other, as below.
def combine(*fun):
    def new(*s):
        for i in fun:
            i(*s)
    return new

class base():
    def x(self, i):
        print 'i', i

class derived(base):
    def x(self, i):
        print 'i*i', i*i
    x = combine(base.x, x)

new_obj = derived()
new_obj.x(3)
Output below:
i 3
i*i 9
It need not be a single-level hierarchy; it can have any number of levels of nesting, as sketched below.
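For instance, a third level could chain onto derived.x the same way (the grandchild class here is illustrative):

class grandchild(derived):
    def x(self, i):
        print 'i*i*i', i*i*i
    x = combine(derived.x, x)

grandchild().x(2)
# i 2
# i*i 4
# i*i*i 8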
I have the following python code:
class FooMeta(type):
    def __setattr__(self, name, value):
        print name, value
        return super(FooMeta, self).__setattr__(name, value)

class Foo(object):
    __metaclass__ = FooMeta

    FOO = 123

    def a(self):
        pass
I would have expected __setattr__ of the metaclass to be called for both FOO and a. However, it is not called at all. When I assign something to Foo.whatever after the class has been defined, the method is called.
What's the reason for this behaviour and is there a way to intercept the assignments that happen during the creation of the class? Using attrs in __new__ won't work since I'd like to check if a method is being redefined.
A class block is roughly syntactic sugar for building a dictionary, and then invoking a metaclass to build the class object.
This:
class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123
    def a(self):
        pass
Comes out pretty much as if you'd written:
d = {}
d['__metaclass__'] = FooMeta
d['FOO'] = 123
def a(self):
    pass
d['a'] = a
Foo = d.get('__metaclass__', type)('Foo', (object,), d)
Only without the namespace pollution (and in reality there's also a search through all the bases to determine the metaclass, or whether there's a metaclass conflict, but I'm ignoring that here).
The metaclass' __setattr__ can control what happens when you try to set an attribute on one of its instances (the class object), but inside the class block you're not doing that, you're inserting into a dictionary object, so the dict class controls what's going on, not your metaclass. So you're out of luck.
Unless you're using Python 3.x! In Python 3.x you can define a __prepare__ classmethod (or staticmethod) on a metaclass, which controls what object is used to accumulate attributes set within a class block before they're passed to the metaclass constructor. The default __prepare__ simply returns a normal dictionary, but you could build a custom dict-like class that doesn't allow keys to be redefined, and use that to accumulate your attributes:
from collections.abc import MutableMapping

class SingleAssignDict(MutableMapping):
    def __init__(self, *args, **kwargs):
        self._d = dict(*args, **kwargs)

    def __getitem__(self, key):
        return self._d[key]

    def __setitem__(self, key, value):
        if key in self._d:
            raise ValueError(
                'Key {!r} already exists in SingleAssignDict'.format(key)
            )
        else:
            self._d[key] = value

    def __delitem__(self, key):
        del self._d[key]

    def __iter__(self):
        return iter(self._d)

    def __len__(self):
        return len(self._d)

    def __contains__(self, key):
        return key in self._d

    def __repr__(self):
        return '{}({!r})'.format(type(self).__name__, self._d)
class RedefBlocker(type):
    @classmethod
    def __prepare__(metacls, name, bases, **kwargs):
        return SingleAssignDict()

    def __new__(metacls, name, bases, sad):
        return super().__new__(metacls, name, bases, dict(sad))

class Okay(metaclass=RedefBlocker):
    a = 1
    b = 2

class Boom(metaclass=RedefBlocker):
    a = 1
    b = 2
    a = 3
Running this gives me:
Traceback (most recent call last):
File "/tmp/redef.py", line 50, in <module>
class Boom(metaclass=RedefBlocker):
File "/tmp/redef.py", line 53, in Boom
a = 3
File "/tmp/redef.py", line 15, in __setitem__
'Key {!r} already exists in SingleAssignDict'.format(key)
ValueError: Key 'a' already exists in SingleAssignDict
Some notes:
__prepare__ has to be a classmethod or staticmethod, because it's being called before the metaclass' instance (your class) exists.
type still needs its third parameter to be a real dict, so you have to have a __new__ method that converts the SingleAssignDict to a normal one
I could have subclassed dict, which would probably have avoided (2), but I really dislike doing that because of how the non-basic methods like update don't respect your overrides of the basic methods like __setitem__. So I prefer to subclass collections.abc.MutableMapping and wrap a dictionary.
The actual Okay.__dict__ object is a normal dictionary, because it was set by type and type is finicky about the kind of dictionary it wants. This means that overwriting class attributes after class creation does not raise an exception. You can overwrite the __dict__ attribute after the superclass call in __new__ if you want to maintain the no-overwriting forced by the class object's dictionary.
Sadly this technique is unavailable in Python 2.x (I checked). The __prepare__ method isn't invoked, which makes sense, as in Python 2.x the metaclass is determined by the __metaclass__ magic attribute rather than a special keyword in the class block; which means the dict object used to accumulate attributes for the class block already exists by the time the metaclass is known.
Compare Python 2:
class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123
    def a(self):
        pass
Being roughly equivalent to:
d = {}
d['__metaclass__'] = FooMeta
d['FOO'] = 123
def a(self):
    pass
d['a'] = a
Foo = d.get('__metaclass__', type)('Foo', (object,), d)
Where the metaclass to invoke is determined from the dictionary, versus Python 3:
class Foo(metaclass=FooMeta):
    FOO = 123
    def a(self):
        pass
Being roughly equivalent to:
d = FooMeta.__prepare__('Foo', ())
d['FOO'] = 123
def a(self):
    pass
d['a'] = a
Foo = FooMeta('Foo', (), d)
Where the dictionary to use is determined from the metaclass.
There are no assignments happening during the creation of the class. Or: they are happening, but not in the context you think they are. All class attributes are collected from the class body scope and passed to the metaclass' __new__, as the last argument:
class FooMeta(type):
    def __new__(self, name, bases, attrs):
        print attrs
        return type.__new__(self, name, bases, attrs)

class Foo(object):
    __metaclass__ = FooMeta
    FOO = 123
Reason: when the code in the class body executes, there's no class yet. Which means there's no opportunity for metaclass to intercept anything yet.
Class attributes are passed to the metaclass as a single dictionary and my hypothesis is that this is used to update the __dict__ attribute of the class all at once, e.g. something like cls.__dict__.update(dct) rather than doing setattr() on each item. More to the point, it's all handled in C-land and simply wasn't written to call a custom __setattr__().
It's easy enough to do whatever you want to the attributes of the class in your metaclass's __init__() method, since you're passed the class namespace as a dict, so just do that.
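A brief sketch of that idea (the override check is illustrative; note that, as the next answer explains, duplicates inside a single class body cannot be detected here, because only the last definition survives in the dict):

class FooMeta(type):
    def __init__(cls, name, bases, attrs):
        super(FooMeta, cls).__init__(name, bases, attrs)
        for key in attrs:
            if not key.startswith('__') and any(hasattr(base, key) for base in bases):
                # the attribute shadows something inherited from a base class
                print('%s.%s overrides an inherited attribute' % (name, key))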
During the class creation, your namespace is evaluated to a dict and passed as an argument to the metaclass, together with the class name and base classes. Because of that, assigning a class attribute inside the class definition wouldn't work the way you expect. It doesn't create an empty class and assign everything. You also can't have duplicated keys in a dict, so during class creation attributes are already deduplicated. Only by setting an attribute after the class definition you can trigger your custom __setattr__.
Because the namespace is a dict, there's no way for you to check duplicated methods, as suggested by your other question. The only practical way to do that is parsing the source code.
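A hedged sketch of that source-parsing idea using the standard ast module (find_redefined_methods is my name; it only looks at top-level defs in each class body):

import ast
from collections import Counter

def find_redefined_methods(source):
    duplicates = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ClassDef):
            names = Counter(child.name for child in node.body
                            if isinstance(child, ast.FunctionDef))
            dups = [n for n, count in names.items() if count > 1]
            if dups:
                duplicates[node.name] = dups
    return duplicates

src = """
class Foo(object):
    def a(self):
        pass
    def a(self):   # redefinition we want to catch
        pass
"""
print(find_redefined_methods(src))   # {'Foo': ['a']}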
I'm writing a decorator for methods that must inspect the parent methods (the methods of the same name in the parents of the class in which I'm decorating).
Example (from the fourth example of PEP 318):
def returns(rtype):
    def check_returns(f):
        def new_f(*args, **kwds):
            result = f(*args, **kwds)
            assert isinstance(result, rtype), \
                "return value %r does not match %s" % (result, rtype)
            return result
        new_f.func_name = f.func_name
        # here I want to reach the class owning the decorated method f,
        # it should give me the class A
        return new_f
    return check_returns

class A(object):
    @returns(int)
    def compute(self, value):
        return value * 3
So I'm looking for the code to type in place of # here I want...
Thanks.
As bobince said, you can't access the surrounding class, because at the time the decorator is invoked, the class does not exist yet. If you need access to the full dictionary of the class and the bases, you should consider a metaclass:
__metaclass__
This variable can be any callable accepting arguments for name, bases, and dict. Upon class creation, the callable is used instead of the built-in type().
Basically, we convert the returns decorator into something that just tells the metaclass to do some magic on class construction:
class CheckedReturnType(object):
    def __init__(self, meth, rtype):
        self.meth = meth
        self.rtype = rtype

def returns(rtype):
    def _inner(f):
        return CheckedReturnType(f, rtype)
    return _inner

class BaseInspector(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, CheckedReturnType):
                # do your wrapping & checking here, base classes are in bases
                # reassign to dct
                pass
        return type.__new__(mcs, name, bases, dct)

class A(object):
    __metaclass__ = BaseInspector

    @returns(int)
    def compute(self, value):
        return value * 3
Mind that I have not tested this code, please leave comments if I should update this.
There are some articles on metaclasses by the highly recommendable David Mertz, which you might find interesting in this context.
here I want to reach the class owning the decorated method f
You can't because at the point of decoration, no class owns the method f.
class A(object):
    @returns(int)
    def compute(self, value):
        return value * 3
Is the same as saying:
class A(object):
    pass

@returns(int)
def compute(self, value):
    return value * 3

A.compute = compute
Clearly, the returns() decorator is built before the function is assigned to an owner class.
Now when you write a function to a class (either inline, or explicitly like this) it becomes an unbound method object. Now it has a reference to its owner class, which you can get by saying:
>>> A.compute.im_class
<class '__main__.A'>
So you can read f.im_class inside ‘new_f’, which is executed after the assignment, but not in the decorator itself.
(And even then it's a bit ugly relying on a CPython implementation detail if you don't need to. I'm not quite sure what you're trying to do, but things involving “get the owner class” are often doable using metaclasses.)
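For instance, here is a minimal sketch of reading the owning class at call time, inside new_f (this uses type(args[0]) rather than im_class, so it is just one possible variant, not the answer's exact approach):

def returns(rtype):
    def check_returns(f):
        def new_f(*args, **kwds):
            # at call time the instance is args[0], so its class is reachable here,
            # even though no class existed when the decorator itself ran
            owner = type(args[0])
            result = f(*args, **kwds)
            assert isinstance(result, rtype), \
                "%s.%s returned %r, not %s" % (owner.__name__, f.__name__, result, rtype)
            return result
        return new_f
    return check_returns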