I have a Python abstract base class as follows:
from abc import ABCMeta, abstractmethod

class Node(object):
    """
    All concrete node classes should inherit from this
    """
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name
        self.inputs = dict()

    def add_input(self, key, value=None, d=None):
        self.inputs[key] = (d, value)

    def bind_input(self):
        print "Binding inputs"

    @abstractmethod
    def run(self):
        pass
Now, various derived classes will inherit from this Node class and override the run method. It is always the case that bind_input() must be the first thing called in the run method. Currently, for all derived classes the developer has to remember to call self.bind_input() first. This is not a huge problem per se, but out of curiosity: is it possible to ensure from the base class itself that bind_input is called before executing the child object's run?
The usual object-oriented approach is this:
def run(self):
    self.bind_input()
    return self.do_run()

@abstractmethod
def do_run(self):
    pass  # override this method
Have your subclasses override the inner method, instead of the outer one.
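For example, a concrete subclass (ConcreteNode is a hypothetical name, shown only to illustrate the pattern) then implements do_run and gets bind_input for free:

class ConcreteNode(Node):
    def do_run(self):
        # bind_input() has already run by the time we get here
        print "Running", self.name

node = ConcreteNode("example")
node.add_input("x", value=42)
node.run()   # prints "Binding inputs", then "Running example"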
The Scenario:
class A:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret

    def same_name_method(self):
        do_some_stuff

    def method_a(self):
        pass

class B:
    def __init__(self, key, secret):
        self.key = key
        self.secret = secret

    def same_name_method(self):
        do_another_stuff

    def method_b(self):
        pass

class C(A, B):
    def __init__(self, *args, **kwargs):
        # I want to init both class A and B's key and secret
        ## I want to rename class A and B's same_name_method
        any_ideas()
    ...
What I Want:
I want an instance of class C to initialize both class A and class B, because they use different API keys.
And I want to rename class A's and class B's same_name_method, so I won't be confused about which same_name_method is which.
What I Have Done:
For problem one, I have done this:
class C(A, B):
    def __init__(self, *args, **kwargs):
        A.__init__(self, a_api_key, a_api_secret)
        B.__init__(self, b_api_key, b_api_secret)
Comment: I know about super(), but for this situation I do not know how to use it.
For problem two, I added a __new__ method to class C:
def __new__(cls, *args, **kwargs):
    cls.platforms = []
    cls.rename_method = []
    for platform in cls.__bases__:
        # fetch platform module name
        module_name = platform.__module__.split('.')[0]
        cls.platforms.append(module_name)
        # rename attr
        for k, v in platform.__dict__.items():
            if not k.startswith('__'):
                setattr(cls, module_name + '_' + k, v)
                cls.rename_method.append(k)
    for i in cls.rename_method:
        delattr(cls, i)  ## this line will raise AttributeError!!
    return super().__new__(cls)
Comment: because I rename the methods and add the new names as attributes on cls, I need to delete the old method attributes, but I don't know how to delattr them. For now I just leave them alone and don't delete the old methods.
Question:
Any Suggestions?
So, you want some pretty advanced and complicated things, and you don't yet have a complete picture of how classes behave in Python.
So, for your first point - initializing both classes, and any other method that should run in all the classes - the correct solution is to make use of cooperative calls to super() methods.
A call to super() in Python returns a very special proxy object that reflects all the methods available in the next class, following the proper Method Resolution Order (MRO).
So, if A.__init__ and B.__init__ both have to be called, each method should include a super().__init__ call - and one will call the other's __init__ in the appropriate order, regardless of how they are used as bases in subclasses. As object also has an __init__, the last super().__init__ simply calls object.__init__, which is a no-op. If you have more methods in your classes that should run in all base classes, you'd rather build a proper common base class, so that the top-most super() call doesn't try to propagate to a non-existent method.
Otherwise, it is just:
class A:
    def __init__(self, akey, asecret, **kwargs):
        self.key = akey
        self.secret = asecret
        super().__init__(**kwargs)

class B:
    def __init__(self, bkey, bsecret, **kwargs):
        self.key = bkey
        self.secret = bsecret
        super().__init__(**kwargs)

class C(A, B):
    # does not even need an explicit `__init__`.
    pass
I think you can get the idea. Of course, the parameter names have to differ - ideally, when writing C you don't have to worry about parameter order - but when calling C you have to worry about supplying all mandatory parameters for C and its bases. If you can't rename the parameters in A or B to be distinct, you could rely on parameter order for the call instead, with each __init__ consuming two positional parameters - but that will require some extra care with inheritance order.
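For illustration, constructing C then just means supplying all four keyword arguments (the values here are placeholders):

c = C(akey="KEY-A", asecret="SECRET-A", bkey="KEY-B", bsecret="SECRET-B")
# C's MRO is C -> A -> B -> object: A.__init__ consumes akey/asecret and forwards
# bkey/bsecret to B.__init__ via super().__init__(**kwargs).
# Note that both __init__ methods write to self.key/self.secret, so the values set
# last (B's) win - use distinct attribute names if both credentials must be kept.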
So - up to this point, it is basic Python multiple-inheritance "howto", and should be pretty straightforward. Now comes your strange stuff.
As for the auto-renaming of methods: first things first -
are you quite sure you need inheritance? Maybe having your granular classes for each external service, and a registry-and-dispatch class that calls the methods on the others by composition, would be more sane. (I may come back to this later.)
Are you aware that __new__ is called for every instantiation of the class, and that all the class-attribute mangling you are performing there happens again for each new instance of your classes? (Incidentally, the delattr call fails because those methods live in the base classes' __dict__, not in C's own __dict__, so there is nothing on C itself to delete.)
So, if the needed method-renaming and shadowing has to take place at class creation time, you can do that using the special method __init_subclass__ that exists since Python 3.6. It is a special class method that is called once for each derived class of the class it is defined on. So, just create a base class from which A and B themselves will inherit, and move a properly modified version of what you are putting in __new__ there. If you are not using Python 3.6, this should be done in the __new__ or __init__ of a metaclass, not in the __new__ of the class itself.
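A minimal sketch of that idea (the PlatformBase name and the prefixing scheme are illustrative, not the OP's exact code):

class PlatformBase:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # runs once per subclass, at class-creation time (Python 3.6+)
        prefix = cls.__module__.split(".")[0]
        for name, attr in list(cls.__dict__.items()):
            if callable(attr) and not name.startswith("_"):
                setattr(cls, prefix + "_" + name, attr)

class A(PlatformBase):
    def same_name_method(self):
        return "A"

# A now also exposes the method under a module-prefixed alias; for a class defined
# in a package named platform_a, that would be A().platform_a_same_name_method().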
Another approach would be to have a custom __getattribute__ method - this could be crafted to provide namespaces for the base classes. It would work only on instances, not on the classes themselves (but could be made to, again, by using a metaclass). __getattribute__ can even hide the same-name methods.
class Base:
    @classmethod
    def _get_base_modules(cls):
        result = {}
        for base in cls.__bases__:
            # use the base class's module, not cls's, as the namespace name
            module_name = base.__module__.split(".")[0]
            result[module_name] = base
        return result

    def _proxy(self, module_name):
        class base:
            def __dir__(base_self):
                return dir(self._base_modules[module_name])
            def __getattr__(base_self, attr):
                original_value = self._base_modules[module_name].__dict__[attr]
                if hasattr(original_value, "__get__"):
                    # bind descriptors (functions, properties) to the real instance
                    original_value = original_value.__get__(self, self.__class__)
                return original_value
        base.__name__ = module_name
        return base()

    def __init_subclass__(cls):
        cls._base_modules = cls._get_base_modules()
        cls._shadowed = {
            name
            for module_class in cls._base_modules.values()
            for name in module_class.__dict__
            if not name.startswith("_")
        }

    def __getattribute__(self, attr):
        if attr.startswith("_"):
            return super().__getattribute__(attr)
        cls = self.__class__
        if attr in cls._shadowed:
            raise AttributeError(attr)
        if attr in cls._base_modules:
            return self._proxy(attr)
        return super().__getattribute__(attr)

    def __dir__(self):
        return super().__dir__() + list(self._base_modules)

class A(Base):
    ...

class B(Base):
    ...

class C(A, B):
    ...
As you can see - this is some fun, but it starts getting really complicated - and all the hoops that have to be jumped through to retrieve the actual attributes from the superclasses after adding an artificial namespace seem to indicate your problem is not really calling for inheritance after all, as I suggested above.
Since you have your small, functional, atomic classes for each "service", you could use a plain, simple, non-meta-at-all class that works as a registry for the various services - and you can even enhance it to call the equivalent method on several of the services it is handling with a single call:
class Services:
    def __init__(self):
        self.registry = {}

    def register(self, cls, key, secret):
        name = cls.__module__.split(".")[0]
        service = cls(key, secret)
        self.registry[name] = service

    def __getattr__(self, attr):
        if attr in self.registry:
            return self.registry[attr]
        raise AttributeError(attr)
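Usage would look something like this, assuming A and B live in modules named platform_a and platform_b (hypothetical names - register() keys the registry by each class's top-level module):

services = Services()
services.register(A, "a-api-key", "a-api-secret")
services.register(B, "b-api-key", "b-api-secret")

services.platform_a.same_name_method()   # unambiguously A's method
services.platform_b.same_name_method()   # unambiguously B's method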
In most Python examples, when super is used to call a parent class's constructors, it appears at the top.
Is it bad form to have it at the bottom of an init method?
In the examples below, super is at the bottom of A's constructor, but at the top of B's constructor.
class A:
    def __init__(self):
        # Do some stuff
        b = result_of_complex_operation()
        super(A, self).__init__(b)

class B:
    def __init__(self):
        super(B, self).__init__()
        # Do some stuff
This totally depends on the use case. Consider this.
class Foo():
    def __init__(self):
        print(self.name)

    @property
    def name(self):
        return self.__class__.__name__

class Bar(Foo):
    def __init__(self, name):
        self.name = name
        super().__init__()

    @property
    def name(self):
        return self.__name

    @name.setter
    def name(self, name):
        self.__name = name
If you'd invoke super() before setting self.name within Bar.__init__ you'd get an AttributeError because the required name has not yet been set.
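With the classes above the ordering works out, since Bar sets the name before Foo.__init__ tries to read it:

b = Bar("spam")   # Bar.__init__ sets self.name, then Foo.__init__ prints it
# output: spam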
Is it bad form to have it at the bottom of an init method?
You're asking the wrong question. Regardless of whether it's bad form or not, there are valid use cases for moving the superclass initialization to the bottom of a subclass's constructor. Where to put the call to the superclass's constructor depends entirely on the implementation of the superclass's constructor.
For example, suppose you have a superclass. When constructing the superclass, you want to give an attribute a certain value depending on an attribute of the subclasses:
class Superclass:
    def __init__(self):
        if self.subclass_attr:
            self.attr = 1
        else:
            self.attr = 2
As you can see from above, we expect the subclasses to have the attribute subclass_attr. So what does this mean? We can't initialize Superclass until we've given the subclass the subclass_attr attribute.
Thus, we have to defer calling the superclass's constructor until we've initialized subclass_attr. In other words, the call to super has to be put at the bottom of the subclass's constructor:
class Subclass(Superclass):
    def __init__(self):
        self.subclass_attr = True
        super(Subclass, self).__init__()
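With that ordering, construction works because subclass_attr already exists by the time Superclass.__init__ reads it:

s = Subclass()
print(s.attr)   # 1 - subclass_attr was already True when Superclass.__init__ ran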
In the end, the choice of where to put super should not be based upon some style, but on what's necessary.
Let's define a simple class decorator function, which creates a subclass and only adds 'Dec' to the original class name:
def decorate_class(klass):
    new_class = type(klass.__name__ + 'Dec', (klass,), {})
    return new_class
Now apply it on a simple subclass definition:
class Base(object):
    def __init__(self):
        print 'Base init'

@decorate_class
class MyClass(Base):
    def __init__(self):
        print 'MyClass init'
        super(MyClass, self).__init__()
Now, if you try to instantiate the decorated MyClass, it ends up in infinite recursion:
c = MyClass()
# ...
# File "test.py", line 40, in __init__
# super(MyClass, self).__init__()
# RuntimeError: maximum recursion depth exceeded while calling a Python object
It seems super can't handle this case and does not skip the current class in the inheritance chain.
The question: how does one correctly use a class decorator on classes that use super()?
Bonus question: how do you get the final class from the proxy object created by super? I.e. determine, from the super(Base, self).__init__ expression, which parent class actually defines the __init__ being called.
If you just want to change the class's .__name__ attribute, make a decorator that does that.
from __future__ import print_function

def decorate_class(klass):
    klass.__name__ += 'Dec'
    return klass

class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Python 3 output
MyClass init
Base init
<class '__main__.MyClass'> MyClassDec
Note the difference in the repr of cls. (I'm not sure why you'd want to change a class's name though, it sounds like a recipe for confusion, but I guess it's ok for this simple example).
As others have said, an @decorator isn't intended to create a subclass. You can do it in Python 3 by using the arg-less form of super (i.e., super().__init__()). And you can make it work in both Python 3 and Python 2 by explicitly supplying the parent class rather than using super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        Base.__init__(self)

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 & 3 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Finally, if we just call decorate_class using normal function syntax rather than as an @decorator, we can use super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

MyClassDec = decorate_class(MyClass)

c = MyClassDec()
cls = c.__class__
print(cls, cls.__name__)
The output is the same as in the last version.
Since your decorator returns an entirely new class with a different name, the name MyClass no longer refers to the original class at all. This is not the case class decorators are intended for: they are intended to add extra functionality to an existing class, not to outright replace it with some other class.
Still, if you are using Python 3, the solution is simple -
@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super().__init__()
Otherwise, I doubt there is any straightforward solution; you would just need to change your implementation. If the decorator renames the class, it also needs to overwrite __init__ with a version that refers to the new name.
The problem is that your decorator creates a subclass of the original one. That means that super(MyClass, self) now points to... the original class itself!
I cannot even explain how the zero-argument form of super manages to do the job in Python 3 - I could not find anything explicit in the reference manual. I assume it must use the class in which the call textually appears at definition time. But I cannot imagine a way to get that result in Python 2.
If you want to be able to use super in the decorated class in Python 2, you should not create a derived class, but directly modify the original class in place.
For example, here is a decorator that prints a line before and after calling any method:
import types

def decorate_class(klass):
    def make_wrapper(name, method):  # factory so each wrapper keeps its own name/method
        def meth(*args, **kwargs):   # define a wrapper
            print "Before", name
            result = method(*args, **kwargs)
            print "After", name
            return result
        return meth
    for name, method in klass.__dict__.items():            # iterate the class attributes
        if isinstance(method, types.FunctionType):          # identify the methods
            setattr(klass, name, make_wrapper(name, method))  # tell the class to use the wrapper
    return klass
With your example it gives as expected:
>>> c = MyClass()
Before __init__
MyClass init
Base init
After __init__
When I define a class, I like to include type checking (using assert) of the input variables. I am now defining a 'specialized' class Rule which inherits from an abstract base class (ABC) BaseRule, similar to the following:
import abc

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def resources(self):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        assert all(isinstance(resource, Resource) for resource in resources)  # type checking
        self._resources = resources

    @property
    def resources(self):
        return self._resources

class Resource(object):
    def __init__(self, domain):
        self.domain = domain

if __name__ == "__main__":
    resources = [Resource("facebook.com")]
    rule = Rule(resources)
The assert statement in the __init__ function of the Rule class ensures that the resources input is a list (or other iterable) of Resource objects. However, this would also be the case for other classes which inherit from BaseRule, so I would like to incorporate this assertion in the abstractproperty somehow. How might I go about this?
See the mypy documentation on abstract base classes and type annotations: https://mypy.readthedocs.io/en/latest/class_basics.html#abstract-base-classes-and-multiple-inheritance
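For reference, a rough sketch of what that looks like with type annotations (the constraint is then checked statically by mypy rather than with runtime asserts; the class names mirror the question's):

from abc import ABCMeta, abstractmethod
from typing import Sequence

class Resource:
    def __init__(self, domain: str) -> None:
        self.domain = domain

class BaseRule(metaclass=ABCMeta):
    @property
    @abstractmethod
    def resources(self) -> Sequence[Resource]:
        ...

class Rule(BaseRule):
    def __init__(self, resources: Sequence[Resource]) -> None:
        self._resources = resources

    @property
    def resources(self) -> Sequence[Resource]:
        return self._resources

# mypy would flag e.g. Rule(["facebook.com"]) as passing a list of str where
# Sequence[Resource] is expected; nothing is checked at runtime.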
Make your base class have a non-abstract property that calls separate abstract getter and setter methods. The property can do the validation you want before calling the setter. Other code (such as the __init__ method of a derived class) that wants to trigger the validation can do so by doing its assignment via the property:
import abc

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @property
    def resources(self):  # this property isn't abstract and shouldn't be overridden
        return self._get_resources()

    @resources.setter
    def resources(self, value):
        assert all(isinstance(resource, Resource) for resource in value)
        self._set_resources(value)

    @abc.abstractmethod
    def _get_resources(self):  # these methods should be, instead
        pass

    @abc.abstractmethod
    def _set_resources(self, value):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        self.resources = resources  # assign via the property to get type-checking!

    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value
You might even consider moving the __init__ method from Rule into the BaseRule class, since it doesn't need any knowledge about Rule's concrete implementation.
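One way that could look (a sketch: it reuses the validating property from above and the Resource class from the question, so subclasses only supply the storage methods):

import abc

class Resource(object):
    def __init__(self, domain):
        self.domain = domain

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, resources):
        self.resources = resources  # goes through the validating property below

    @property
    def resources(self):
        return self._get_resources()

    @resources.setter
    def resources(self, value):
        assert all(isinstance(resource, Resource) for resource in value)
        self._set_resources(value)

    @abc.abstractmethod
    def _get_resources(self):
        pass

    @abc.abstractmethod
    def _set_resources(self, value):
        pass

class Rule(BaseRule):
    # no __init__ needed any more
    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value

rule = Rule([Resource("facebook.com")])  # validated in BaseRule.__init__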
If you have multiple layers of inheritance and know that a particular variable exists, is there a way to trace back to where the variable originated? Without having to navigate backwards by looking through each file and class. Possibly by calling some sort of function that will do it?
Example:
parent.py
class parent(object):
    def __init__(self):
        findMe = "Here I am!"
child.py
from parent import parent

class child(parent):
    pass
grandson.py
from child import child

class grandson(child):
    def printVar(self):
        print self.findMe
Try to locate where the findMe variable came from with a function call.
If the "variable" is an instance variable - that is, if at any point in the chain of __init__ methods you do:
def __init__(self):
    self.findMe = "Here I am!"
it is an instance variable from that point on, and cannot, for all practical purposes, be distinguished from any other instance variable. (Unless you put in place a mechanism, like a class with a special __setattr__ method, that keeps track of attributes being set and can introspect which part of the code set each attribute - see the last example in this answer.)
Please also note that in your example,
class parent(object):
    def __init__(self):
        findMe = "Here I am!"
findMe is defined as a local variable to that method and does not even exist after __init__ is finished.
Now, if your variable is set as a class attribute somewhere on the inheritance chain:
class parent(object):
    findMe = False

class childone(parent):
    ...
It is possible to find the class where findMe is defined by introspecting each class's __dict__ along the MRO (method resolution order) chain. Of course, there is no way (and no sense) to do that without introspecting all the classes in the MRO chain - unless one keeps track of attributes as they are defined, as in the example below - but introspecting the MRO itself is a one-liner in Python:
def __init__(self):
    super().__init__()
    ...
    findme_definer = [cls for cls in self.__class__.__mro__
                      if "findMe" in cls.__dict__][0]
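Put together with the class-attribute example above (grandchild is an illustrative extra subclass), a runnable sketch looks like this:

class parent(object):
    findMe = False

class childone(parent):
    pass

class grandchild(childone):
    def where_is_findme(self):
        # first class in the MRO whose own __dict__ defines the attribute
        return [cls for cls in self.__class__.__mro__
                if "findMe" in cls.__dict__][0]

print(grandchild().where_is_findme())   # <class '__main__.parent'>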
Again - it would be possible to have a metaclass on your inheritance chain that keeps track of all attributes defined across the inheritance tree, and uses a dictionary to retrieve where each attribute is defined. The same metaclass could also auto-decorate all __init__ methods (or all methods), and set a special __setattr__, so that it could track instance attributes as they are created, as described above.
That can be done, but it is a bit complicated, would be hard to maintain, and is probably a signal you are taking the wrong approach to your problem.
So, the metaclass to record just the class attributes could simply be (Python 3 syntax - define a __metaclass__ attribute in the class body if you are still using Python 2.7):
class MetaBase(type):
    definitions = {}
    def __init__(cls, name, bases, dct):
        for attr in dct.keys():
            cls.__class__.definitions[attr] = cls

class parent(metaclass=MetaBase):
    findMe = 5
    def __init__(self):
        print(self.__class__.definitions["findMe"])
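For example, with an extra subclass (childone here is illustrative), the registry records which class defined which attribute:

class childone(parent):
    other = 10

c = childone()                        # parent.__init__ prints <class '__main__.parent'>
print(MetaBase.definitions["other"])  # <class '__main__.childone'>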
Now, if one wants to find which superclass actually set an attribute on the current instance, a "live" tracking mechanism, wrapping each method of each class, can work - but it is a lot trickier.
I've made it work below - even if you won't need this much, it combines both approaches: keeping track of class attributes in the metaclass's definitions and of instance attributes in a per-instance _definitions dictionary - since in each created instance an arbitrary method might have been the last to set a particular instance attribute. (This is pure Python 3, and may not be straightforward to port to Python 2 due to the "unbound methods" Python 2 uses, which are plain functions in Python 3.)
from threading import current_thread
from functools import wraps
from types import MethodType
from collections import defaultdict

def method_decorator(func, cls):
    @wraps(func)
    def wrapper(self, *args, **kw):
        # record which class's method is currently running (per thread)
        self.__class__.__class__.current_running_class[current_thread()].append(cls)
        result = MethodType(func, self)(*args, **kw)
        self.__class__.__class__.current_running_class[current_thread()].pop()
        return result
    return wrapper

class MetaBase(type):
    definitions = {}
    current_running_class = defaultdict(list)
    def __init__(cls, name, bases, dct):
        for attrname, attr in dct.items():
            cls.__class__.definitions[attrname] = cls
            if callable(attr) and attrname != "__setattr__":
                setattr(cls, attrname, method_decorator(attr, cls))

class Base(object, metaclass=MetaBase):
    def __setattr__(self, attr, value):
        if not hasattr(self, "_definitions"):
            super().__setattr__("_definitions", {})
        # the attribute is credited to whichever class's method is running right now
        self._definitions[attr] = self.__class__.current_running_class[current_thread()][-1]
        return super().__setattr__(attr, value)
Example Classes for the code above:
class Parent(Base):
    def __init__(self):
        super().__init__()
        self.findMe = 10

class Child1(Parent):
    def __init__(self):
        super().__init__()
        self.findMe1 = 20

class Child2(Parent):
    def __init__(self):
        super().__init__()
        self.findMe2 = 30

class GrandChild(Child1, Child2):
    def __init__(self):
        super().__init__()

    def findall(self):
        for attr in "findMe findMe1 findMe2".split():
            print("Attr '{}' defined in class '{}' ".format(attr, self._definitions[attr].__name__))
And on the console one will get this result:
In [87]: g = GrandChild()
In [88]: g.findall()
Attr 'findMe' defined in class 'Parent'
Attr 'findMe1' defined in class 'Child1'
Attr 'findMe2' defined in class 'Child2'