AttributeError not generated in case of passing method reference - python

I have a simple question about Python 2.7:
I have created an abstract base class and a child class:
from abc import ABCMeta, abstractmethod

class Base:
    """
    Abstract base class for all entities.
    """
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name

    def send_data(self):
        self.send_data()

class Child(Base):
    def __init__(self, name):
        super(Child, self).__init__(name=name)
When an object of the child class is created and send_data is called, I get the following error, which is the expected behavior:
sample = Child('test')
sample.send_data()
# …
RuntimeError: maximum recursion depth exceeded
But when only a reference to send_data is used in the base class, and the call is made on a child class object, I think the expected behavior is to receive an AttributeError, yet I am surprised to see that no error is generated. Please explain.
from abc import ABCMeta, abstractmethod

class Base:
    """
    Abstract base class for all entities.
    """
    __metaclass__ = ABCMeta

    def __init__(self, name, parent):
        self.name = name
        self.parent = parent

    def send_data(self):
        self.send_data

sample = Child('test')
sample.send_data()
# No error

In your first example you simply created a recursive function:
def send_data(self):
    self.send_data()
This calls itself, without end, and that's why you end up with a recursion depth exception.
Your second example doesn't actually call the method:
def send_data(self):
    self.send_data
The only difference here is that you forgot to use ().
None of this has anything to do with abstract base classes or inheritance. You didn't mark the send_data function as abstract, and even if you did, all that using abstractmethod does is make it impossible to create an instance of a class without a concrete implementation to replace it.
You won't get an AttributeError just because you defined a method on an ABCMeta class. And note that methods are just attributes on a class; they don't live in a separate namespace. self.send_data references the bound method, not some other, separate attribute. Referencing the method without calling it does nothing else.
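
If it helps, here is a minimal sketch (Python 2.7 style, as in the question) of what marking send_data as abstract actually does: a subclass without a concrete override simply cannot be instantiated, and the failure is a TypeError at construction time, not an AttributeError. The Incomplete and Complete class names are just for illustration.

from abc import ABCMeta, abstractmethod

class Base(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def send_data(self):
        pass

class Incomplete(Base):
    pass  # no concrete send_data provided

class Complete(Base):
    def send_data(self):
        return "sent"

Complete().send_data()   # works: the override satisfies the ABC
# Incomplete()           # TypeError: Can't instantiate abstract class Incomplete
#                        # with abstract methods send_data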

Related

OOP in Python beginner issues, finding all methods of the same starting name

I would like to create an abstract method in parent class which would be overridden in subclasses. This method would print all methods in the given subclass which start with 'on_'.
from abc import ABC, abstractmethod

class abstract_class(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        pass

class sub(abstract_class):
    an_object = sub()

    def get_all_on_methods(self):
        for attribute in dir(self):
            if attribute.startswith("on_"):
                print(attribute)

    def nothin(self):
        print("nothin")

    def on_fallback(self):
        raise NotImplementedError()

    def on_no(self):
        raise NotImplementedError()

sub.get_all_on_methods()
I have two problems. First, I have:
Unresolved reference 'sub'
Second, I don't know whether my approach is actually all that good.
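
No answer is quoted for this question, but as a rough sketch, one way to clear the "Unresolved reference" is to move the instantiation out of the class body, since the name sub does not exist until the class statement has finished executing; the dir()-based listing from the question then works as intended:

from abc import ABC, abstractmethod

class abstract_class(ABC):
    @abstractmethod
    def get_all_on_methods(self):
        pass

class sub(abstract_class):
    def get_all_on_methods(self):
        # dir() yields attribute names, so filter on the "on_" prefix
        for attribute in dir(self):
            if attribute.startswith("on_"):
                print(attribute)

    def on_fallback(self):
        raise NotImplementedError()

    def on_no(self):
        raise NotImplementedError()

# instantiate only after the class statement completes
an_object = sub()
an_object.get_all_on_methods()   # prints on_fallback, on_no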

What is the way of enforcing explicit initialization of parent class by an implementer?

Consider the following sample snippet
import abc

class BASE(metaclass=abc.ABCMeta):
    def __init__(self, name):
        assert name is not None, "Name must be provided."
        self.num = 3

    @abc.abstractmethod
    def compute(self):
        pass

class CHILD(BASE):
    def __init__(self, name):
        '''
        '''

    def compute(self):
        return self.num + 34
On execution it gives the following sensible error:
AttributeError: 'CHILD' object has no attribute 'num'
In the present situation BASE is not being initialized: if we add a print call to it as below, nothing is printed at all.
class BASE(metaclass=abc.ABCMeta):
    def __init__(self, name):
        print(name)
        assert name is not None, "Name must be provided."
        self.num = 3
Can we do anything in this class design to make sure that an implementer subclassing from BASE must explicitly call the initializer of BASE?
Can we do anything in this class design to make sure that an implementer subclassing from BASE must explicitly call the initializer of BASE?
I guess you can do something like this:
class BASE(metaclass=abc.ABCMeta):
    '''
    when subclassing this class, ensure you explicitly
    call super().__init__() for this parent class
    '''
    def __init__(self, name):
        assert name is not None, "Name must be provided."
        self.num = 3
From the comments section:
I just wanted to understand how, in real-world problems, designs are made to ensure that such mistakes do not happen
I have written many subclasses where I did not explicitly call the parent class's super().__init__(). So I am not sure you can say that all subclasses must call the parent class __init__ to avoid "mistakes" (as you call them). It is up to the author of the parent class to document properly how to subclass it, and it is then still up to the author of the subclass to follow that documentation or not. It is not something you can "enforce".
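
For what it's worth, a minimal sketch of a subclass that follows that documented contract, reusing the BASE and CHILD names from the question, simply forwards to the parent initializer:

import abc

class BASE(metaclass=abc.ABCMeta):
    '''
    When subclassing, explicitly call super().__init__(name)
    so that self.num is set.
    '''
    def __init__(self, name):
        assert name is not None, "Name must be provided."
        self.num = 3

    @abc.abstractmethod
    def compute(self):
        pass

class CHILD(BASE):
    def __init__(self, name):
        super().__init__(name)   # explicit parent initialization

    def compute(self):
        return self.num + 34

print(CHILD("example").compute())   # 37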

How to incorporate type checking in an abstract base class in Python

When I define a class, I like to include type checking (using assert) of the input variables. I am now defining a 'specialized' class Rule which inherits from an abstract base class (ABC) BaseRule, similar to the following:
import abc

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def resources(self):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        assert all(isinstance(resource, Resource) for resource in resources)  # type checking
        self._resources = resources

    @property
    def resources(self):
        return self._resources

class Resource(object):
    def __init__(self, domain):
        self.domain = domain

if __name__ == "__main__":
    resources = [Resource("facebook.com")]
    rule = Rule(resources)
The assert statement in the __init__ function of the Rule class ensures that the resources input is a list (or other iterable) of Resource objects. However, this would also be the case for other classes which inherit from BaseRule, so I would like to incorporate this assertion in the abstractproperty somehow. How might I go about this?
See the mypy documentation on type annotations for abstract base classes: https://mypy.readthedocs.io/en/latest/class_basics.html#abstract-base-classes-and-multiple-inheritance
Make your base class have a non-abstract property that calls separate abstract getter and setter methods. The property can do the validation you want before calling the setter. Other code (such as the __init__ method of a derived class) that wants to trigger the validation can do so by doing its assignment via the property:
class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @property
    def resources(self):  # this property isn't abstract and shouldn't be overridden
        return self._get_resources()

    @resources.setter
    def resources(self, value):
        assert all(isinstance(resource, Resource) for resource in value)
        self._set_resources(value)

    @abc.abstractmethod
    def _get_resources(self):  # these methods should be, instead
        pass

    @abc.abstractmethod
    def _set_resources(self, value):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        self.resources = resources  # assign via the property to get type-checking!

    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value
You might even consider moving the __init__ method from Rule into the BaseRule class, since it doesn't need any knowledge about Rule's concrete implementation.
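
A rough sketch of that suggestion, keeping the same names and the Python 2-style __metaclass__ declaration used above: __init__ moves into BaseRule, and Rule only has to supply the storage methods.

import abc

class Resource(object):
    def __init__(self, domain):
        self.domain = domain

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, resources):
        # assigning through the property runs the type check
        self.resources = resources

    @property
    def resources(self):
        return self._get_resources()

    @resources.setter
    def resources(self, value):
        assert all(isinstance(resource, Resource) for resource in value)
        self._set_resources(value)

    @abc.abstractmethod
    def _get_resources(self):
        pass

    @abc.abstractmethod
    def _set_resources(self, value):
        pass

class Rule(BaseRule):
    # no __init__ needed: BaseRule.__init__ already does the validated assignment
    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value

rule = Rule([Resource("facebook.com")])
print(rule.resources[0].domain)   # facebook.com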

Can I ensure that python base class method is always called

I have a python abstract base class as follows:
from abc import ABCMeta, abstractmethod

class Node(object):
    """
    All concrete node classes should inherit from this
    """
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name
        self.inputs = dict()

    def add_input(self, key, value=None, d=None):
        self.inputs[key] = (d, value)

    def bind_input(self):
        print "Binding inputs"

    @abstractmethod
    def run(self):
        pass
Now, various derived classes will inherit from this Node class and override the run method. It is always the case that bind_input() must be the first thing called in the run method. Currently, for all derived classes, the developer has to remember to call self.bind_input() first. This is not a huge problem per se, but out of curiosity: is it possible to ensure, from the base class itself, that bind_input is called before executing the child object's run?
The usual object-oriented approach is this:
def run(self):
    self.bind_input()
    return self.do_run()

@abstractmethod
def do_run(self):
    pass  # override this method
Have your subclasses override the inner method, instead of the outer one.
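
Pieced together with the Node class from the question (Python 2 syntax, as in the question), the template-method approach looks like this; the ConcreteNode name is just illustrative:

from abc import ABCMeta, abstractmethod

class Node(object):
    __metaclass__ = ABCMeta

    def __init__(self, name):
        self.name = name
        self.inputs = dict()

    def bind_input(self):
        print "Binding inputs"

    def run(self):
        # the base class guarantees bind_input() always runs first
        self.bind_input()
        return self.do_run()

    @abstractmethod
    def do_run(self):
        pass   # subclasses override this instead of run()

class ConcreteNode(Node):
    def do_run(self):
        print "Running %s" % self.name

ConcreteNode("example").run()
# Binding inputs
# Running example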

How can one locate where an inherited variable comes from in Python?

If you have multiple layers of inheritance and know that a particular variable exists, is there a way to trace back to where the variable originated, without having to navigate backwards through each file and class? Possibly by calling some sort of function that will do it?
Example:
parent.py

class parent(object):
    def __init__(self):
        findMe = "Here I am!"

child.py

from parent import parent

class child(parent):
    pass

grandson.py

from child import child

class grandson(child):
    def printVar(self):
        print self.findMe
Try to locate where the findMe variable came from with a function call.
If the "variable" is an instance variable - , so , if at any point in chain of __init__ methods you do:
def __init__(self):
    self.findMe = "Here I am!"
It is an instance variable from that point on, and cannot, for all practical purposes, be distinguished from any other instance variable. (Unless you put a mechanism in place, like a class with a special __setattr__ method, that keeps track of attributes being set and can introspect which part of the code set each attribute - see the last example in this answer.)
Please also note that in your example,
class parent(object):
    def __init__(self):
        findMe = "Here I am!"
findMe is defined as a local variable to that method and does not even exist after __init__ is finished.
Now, if your variable is set as a class attribute somewhere on the inheritance chain:
class parent(object):
    findMe = False

class childone(parent):
    ...
It is possible to find the class where findMe is defined by introspecting each class's __dict__ along the MRO (method resolution order) chain. Of course, there is no way, and no sense, in doing that without introspecting all classes in the MRO chain - unless one keeps track of attributes as they are defined, like in the example below - but introspecting the MRO itself is a one-liner in Python:
def __init__(self):
    super().__init__()
    ...
    findme_definer = [cls for cls in self.__class__.__mro__ if "findMe" in cls.__dict__][0]
Again - it would be possible to have a metaclass for your inheritance chain that keeps track of all attributes defined in the inheritance tree, and uses a dictionary to record where each attribute is defined. The same metaclass could also auto-decorate all __init__ methods (or all methods), and set a special __setattr__ so that it could track instance attributes as they are created, as mentioned above.
That can be done, but it is a bit complicated, would be hard to maintain, and is probably a sign that you are taking the wrong approach to your problem.
So, the metaclass to record just class attributes could simply be (Python 3 syntax - define a __metaclass__ attribute in the class body if you are still using Python 2.7):
class MetaBase(type):
    definitions = {}

    def __init__(cls, name, bases, dct):
        for attr in dct.keys():
            cls.__class__.definitions[attr] = cls

class parent(metaclass=MetaBase):
    findMe = 5

    def __init__(self):
        print(self.__class__.definitions["findMe"])
Now, if one wants to find which of the superclasses set an attribute on the current instance, only a "live" tracking mechanism, wrapping each method in each class, can work - and it is a lot trickier.
I've written it anyway - even if you won't need this much, it combines both methods: keeping track of class attributes in the metaclass's definitions dictionary and of instance attributes in a per-instance _definitions dictionary, since in each created instance an arbitrary method might have been the last to set a particular attribute. (This is pure Python 3, and may not be straightforward to port to Python 2 because of the "unbound methods" Python 2 uses, which are plain functions in Python 3.)
from threading import current_thread
from functools import wraps
from types import MethodType
from collections import defaultdict

def method_decorator(func, cls):
    @wraps(func)
    def wrapper(self, *args, **kw):
        # record which class's method is currently executing on this thread
        self.__class__.__class__.current_running_class[current_thread()].append(cls)
        result = MethodType(func, self)(*args, **kw)
        self.__class__.__class__.current_running_class[current_thread()].pop()
        return result
    return wrapper

class MetaBase(type):
    definitions = {}
    current_running_class = defaultdict(list)

    def __init__(cls, name, bases, dct):
        for attrname, attr in dct.items():
            cls.__class__.definitions[attrname] = cls
            if callable(attr) and attrname != "__setattr__":
                setattr(cls, attrname, method_decorator(attr, cls))

class Base(object, metaclass=MetaBase):
    def __setattr__(self, attr, value):
        if not hasattr(self, "_definitions"):
            super().__setattr__("_definitions", {})
        self._definitions[attr] = self.__class__.current_running_class[current_thread()][-1]
        return super().__setattr__(attr, value)
Example Classes for the code above:
class Parent(Base):
    def __init__(self):
        super().__init__()
        self.findMe = 10

class Child1(Parent):
    def __init__(self):
        super().__init__()
        self.findMe1 = 20

class Child2(Parent):
    def __init__(self):
        super().__init__()
        self.findMe2 = 30

class GrandChild(Child1, Child2):
    def __init__(self):
        super().__init__()

    def findall(self):
        for attr in "findMe findMe1 findMe2".split():
            print("Attr '{}' defined in class '{}'".format(attr, self._definitions[attr].__name__))
And on the console one will get this result:
In [87]: g = GrandChild()
In [88]: g.findall()
Attr 'findMe' defined in class 'Parent'
Attr 'findMe1' defined in class 'Child1'
Attr 'findMe2' defined in class 'Child2'
