Inherit wrapper from abstract method - python

I would like to systematically wrap some overridden method of a base class.
I am using ABC for the base class. I tried to wrap the @abstractmethod, putting the decorator before or after it, but it doesn't work. As I understand it, the whole wrapped method is overridden.
from functools import wraps
from abc import ABC, abstractmethod

def print_before(func):
    @wraps(func)
    def out(*args, **kwargs):
        print('Hello')
        return func(*args, **kwargs)
    return out

class Base(ABC):
    @print_before
    @abstractmethod
    def test(self):
        pass

class Extend(Base):
    def test(self):
        print('World')
Here is what happens when we test:
Extend().test()
Result:
World
Desired:
Hello
World
I guess I’m not using the right method to get this behavior. What would be a nice Pythonic way to run some code before and after an overridden method?

As you noticed, the decorator does not change overridden methods. You could decorate the method every time you create a subclass. You can even do it automatically with the magic method __init_subclass__.
class Base(ABC):
    ...
    def __init_subclass__(cls):
        cls.test = print_before(cls.test)
But I would not recommend this approach: it will probably interfere with the ABC machinery, and classes that inherit from Extend get decorated twice if they don't override the method.
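If you do go the __init_subclass__ route anyway, you can guard against the double decoration by only wrapping classes that define test themselves. This sketch is my addition, not from the original answer; the check against cls.__dict__ is the key idea:

```python
from functools import wraps
from abc import ABC, abstractmethod

def print_before(func):
    @wraps(func)
    def out(*args, **kwargs):
        print('Hello')
        return func(*args, **kwargs)
    return out

class Base(ABC):
    @abstractmethod
    def test(self):
        pass

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Only wrap classes that define their own test(), so
        # grandchildren that merely inherit it are not wrapped twice.
        if 'test' in cls.__dict__:
            cls.test = print_before(cls.test)

class Extend(Base):
    def test(self):
        print('World')

class GrandChild(Extend):
    pass  # inherits Extend's (already wrapped) test()

GrandChild().test()  # prints Hello, then World
```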
Here is a much easier approach. We define a concrete "public" method on Base that calls an abstract "private" method. The child classes then only have to implement the "private" method.
class Base(ABC):
    def test(self):
        # do something before
        result = self._do_work()
        # do something after
        return result

    @abstractmethod
    def _do_work(self):
        pass

class Extend(Base):
    def _do_work(self):
        # your implementation here
        pass

# use it like this:
e = Extend()
e.test()
Another advantage is that you can change the behaviour of the "wrapper" in a child class which would be difficult with the decorator.
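To illustrate that advantage, here is a small sketch (the class name Quiet and the return value are made up) in which a child replaces the wrapper itself, skipping the before/after hooks entirely:

```python
from abc import ABC, abstractmethod

class Base(ABC):
    def test(self):
        print('before')             # hook before the real work
        result = self._do_work()
        print('after')              # hook after the real work
        return result

    @abstractmethod
    def _do_work(self):
        ...

class Quiet(Base):
    def _do_work(self):
        return 42

    def test(self):
        # Override the wrapper itself: run the work without the hooks.
        return self._do_work()

assert Quiet().test() == 42  # nothing is printed
```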


How to create a class decorator that can add multiple methods to a class, while preserving the IDE's ability to type-hint the methods

The issue
I would like to be able to re-use methods by implementing them with a decorator, while preserving my IDE's ability to type-hint the methods added, such that:
@methods(methods=[implement_foo, implement_bar])
class K:
    pass

# OR

@methods(methods=[Foo, Bar])
class K:
    pass

k = K()

#### THE ISSUE
k.  # <- IDE should recognize the methods .foo() or .bar(), but does not.
My issue is much like How to create a class decorator that can add multiple methods to a class?, but, as mentioned, while preserving the type hints and using only one decorator.
What I have tried
I can make it work with one decorator, but not with multiple.
Example with one decorator called implement_method
def implement_method(cls):
    class Inner(cls):
        def __init__(self, *args, **kargs):
            super(Inner, self).__init__(*args, **kargs)
        def method(self):
            pass
    return Inner

@implement_method
class K:
    pass
And the type hint works for a new instance of K.
I imagine that one of the issues is using a loop, but I am unable to come up with a different solution. The following is my best attempt:
def methods(methods):
    def wrapper(cls):
        for method in methods:
            cls = method(cls)
        return cls
    return wrapper

class Bar:
    def bar(self):
        pass

@methods(methods=[Bar])
class K:
    pass

k = K()
k.  # <- not finding bar()
Since your question is a two-part one: I have an answer for the first part and am quite stuck on the second. You can modify the signatures of functions using the inspect module, but I have not found anything similar for classes, and I am not sure it is possible. So my answer will focus on the first part:
One decorator for multiple functions:
Let's look at the decorator first:
def add_methods(*methods):
    def wrapper(cls):
        for method in methods:
            setattr(cls, method.__name__, staticmethod(method))
        return cls
    return wrapper
We use *methods as a parameter so that we can add as many methods as we want as arguments.
Then we define a wrapper for the class, and in it we iterate over all the methods we want to add, using setattr to attach each one to the class. Notice the staticmethod wrapping the original method; you can leave it out if you want the methods to receive the argument self.
The wrapper then returns the class, and the decorator returns the wrapper.
Let's write some simple methods next:
def method_a():
    print("I am a banana!")

def method_b():
    print("I am an apple!")
Now we create a simple class using our decorator:
@add_methods(method_a, method_b)
class MyClass:
    def i_was_here_before(self):
        print("Hah!")
And finally test it:
my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()
Our output:
Hah!
I am a banana!
I am an apple!
A word of caution
Usually it is not advisable to change the signature of functions or classes without a good reason (and sometimes even with a good reason).
Alternate Solution
Given that you will need to apply the decorator to each class anyway, you could also just use a superclass like this:
class Parent:
    @staticmethod
    def method_a():
        print("I am a banana!")

    @staticmethod
    def method_b():
        print("I am an apple!")

class MyClass(Parent):
    def i_was_here_before(self):
        print("Hah!")
my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()
Since Python supports multiple inheritance, this approach scales to several such parent classes, and it also gives you the correct type hints.
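Composing several such "method bundles" via multiple inheritance gets you what the decorator list in the question was aiming for. A sketch with hypothetical mixin names (these are not from the original question):

```python
class BananaMixin:
    @staticmethod
    def method_a():
        return "I am a banana!"

class AppleMixin:
    @staticmethod
    def method_b():
        return "I am an apple!"

# Inherit from as many bundles as you like; IDEs resolve
# these attributes statically, so completion works.
class MyClass(BananaMixin, AppleMixin):
    def i_was_here_before(self):
        return "Hah!"

m = MyClass()
m.method_a()  # → "I am a banana!"
m.method_b()  # → "I am an apple!"
```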
Complete working example:
def add_methods(*methods):
    def wrapper(cls):
        for method in methods:
            setattr(cls, method.__name__, staticmethod(method))
        return cls
    return wrapper

def method_a():
    print("I am a banana!")

def method_b():
    print("I am an apple!")

@add_methods(method_a, method_b)
class MyClass:
    def i_was_here_before(self):
        print("Hah!")
my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()

Can a parent's class method call an abstract method which will be implemented in a child class?

If I have a parent class with two methods:
class Parent():
    @abstractmethod
    @staticmethod
    def functionA():
        pass

    def functionB():
        return __class__.functionA() + 1
And I implement a child class:
class Child(Parent):
    def functionA(): # this function is different for each kind of child
        return 3
In the end, the purpose of the child classes would be to call functionB() only.
Does it work? Of course, I could place functionB() into each child class and make it work, but because functionB() is the same for every kind of child class, I don't want to write repeated code in each one.
Also, is my use of __class__ appropriate here?
First, functionB itself should be a class method.
@classmethod
def functionB(cls):
    return cls.functionA() + 1
Second, you still have to decorate functionA as a static method in each child class; otherwise, you are replacing the inherited static method with an instance method.
class Child(Parent):
    @staticmethod
    def functionA():
        return 3
Here is a working version:
from abc import ABC, abstractmethod

class Parent(ABC):
    @staticmethod # not needed, but serves as documentation
    @abstractmethod
    def a(): pass

    @classmethod
    def b(cls): return cls.a() + 1

class Child(Parent):
    @staticmethod
    def a(): return 3
We can test it:
>>> c = Child()
>>> c.b()
4
Noteworthy things:
We need to use the ABC base class (or the ABCMeta metaclass) in order to have access to the abstractmethod decorator, as explained in the abc module documentation.
__class__ is not a keyword; it is an attribute of objects. We cannot just do things like __class__.a() because there is nothing to get the __class__ from. We could address this using the self parameter of an ordinary method, but what we are really trying to do here is to get behaviour that doesn't require an instance, yet depends on which derived class we are using. And that is what classmethod is for, and why there are separate classmethod and staticmethod decorators.
When you use classmethod, the class will be passed as a parameter, like how the instance is passed when you use a normal method. By convention, we name it cls. For more information about classmethod, please see this excellent talk by Raymond Hettinger (on the Python dev team).
Our implementations of the abstract method must also be decorated in the child classes, because they are still not ordinary methods - so Python needs to know not to pass an instance.
The staticmethod decoration on the abstract method needs to be listed first, before the abstractmethod decoration, for technical reasons. It effectively does nothing here; the abstractmethod would already behave like a staticmethod when we call e.g. Parent.a() (no instance; we can't create one anyway, since it's an abstract class).
We could also use classmethod instead of staticmethod for the a methods. This would allow children of Child to inherit the Child behaviour without explicitly writing their own a implementations. In this case, we would want an explicit classmethod decoration on the base abstract method, rather than staticmethod; and of course we would need to add cls parameters to each a implementation.
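A sketch of that classmethod variant, showing a grandchild inheriting the implementation without writing its own a (the GrandChild class is my addition for illustration):

```python
from abc import ABC, abstractmethod

class Parent(ABC):
    @classmethod
    @abstractmethod
    def a(cls):
        ...

    @classmethod
    def b(cls):
        return cls.a() + 1

class Child(Parent):
    @classmethod
    def a(cls):
        return 3

class GrandChild(Child):
    pass  # inherits Child's a() without re-implementing it

print(Child.b())       # → 4
print(GrandChild.b())  # → 4
```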

How to add a classmethod in Python dynamically

I'm using Python 3.
I know about the #classmethod decorator. Also, I know that classmethods can be called from instances.
class HappyClass(object):
    @classmethod
    def say_hello():
        print('hello')
HappyClass.say_hello() # hello
HappyClass().say_hello() # hello
However, I don't seem to be able to create class methods dynamically AND let them be called from instances. Let's say I want something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        # create a class method say_dynamic
        ...

SadClass.say_dynamic() # prints "dynamic!"
SadClass().say_dynamic() # prints "dynamic!"
I've played with cls.__dict__ (which produces exceptions), and with setattr(cls, 'say_dynamic', blahblah) (which only makes the thingie callable from the class and not the instance).
If you ask me why, I wanted to make a lazy class property. But it cannot be called from instances.
@classmethod
def search_url(cls):
    if hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
Maybe because the property hasn't been called from the class yet...
In summary, I want to add a lazy, class method that can be called from the instance... Can this be achieved in an elegant (nottoomanylines) way?
Any thoughts?
How I achieved it
Sorry, my examples were very bad ones :\
Anyway, in the end I did it like this...
@classmethod
def search_url(cls):
    if not hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
And the setattr does work, but I had made a mistake when testing it...
You can add a function to a class at any point, a practice known as monkey-patching:
class SadClass:
    pass

@classmethod
def say_dynamic(cls):
    print('hello')

SadClass.say_dynamic = say_dynamic
>>> SadClass.say_dynamic()
hello
>>> SadClass().say_dynamic()
hello
Note that you are using the classmethod decorator, but your function accepts no arguments, which indicates that it's designed to be a static method. Did you mean to use staticmethod instead?
If you want to create class methods, do not create them in the __init__ method, as they would then be recreated on every instance creation. The following works, however:
class SadClass(object):
    pass

def say_dynamic(cls):
    print("dynamic!")

SadClass.say_dynamic = classmethod(say_dynamic)
# or
setattr(SadClass, 'say_dynamic', classmethod(say_dynamic))

SadClass.say_dynamic() # prints "dynamic!"
SadClass().say_dynamic() # prints "dynamic!"
Of course, in the __init__ method the self argument is an instance, and not the class: to put the method in the class there, you can hack something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        @classmethod
        def say_dynamic(cls):
            print("dynamic!")
        setattr(self.__class__, 'say_dynamic', say_dynamic)
But it will again reset the method for each instance creation, possibly needlessly. And notice that your code most probably fails because you are calling the SadClass.say_dynamic() before any instances are created, and thus before the class method is injected.
Also, notice that a classmethod gets the implicit class argument cls; if you do want your function to be called without any arguments, use the staticmethod decorator.
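A minimal sketch of that staticmethod alternative, added dynamically the same way (the names here are made up for illustration):

```python
class SadClass:
    pass

def say_static():
    # No implicit cls or self argument at all.
    return 'static!'

# Wrap in staticmethod so lookups through the class or an
# instance do not try to bind an argument.
SadClass.say_static = staticmethod(say_static)

print(SadClass.say_static())    # → static!
print(SadClass().say_static())  # → static!
```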
As a side note, you can just use an instance attribute to hold a function:
>>> class Test:
... pass
...
>>> t=Test()
>>> t.monkey_patch=lambda s: print(s)
>>> t.monkey_patch('Hello from the monkey patch')
Hello from the monkey patch

Can an abstract class force the inheriting class to implement a method as static?

Python 3.2 in case that matters...
The following code shows that the "concrete class" can either implement some_method as a static method or an instance method:
import abc

class SomeAbstractClass(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def some_method(self): pass

class ValidConcreteClass1(SomeAbstractClass):
    @staticmethod
    def some_method():
        print("foo!")

class ValidConcreteClass2(SomeAbstractClass):
    def some_method(self):
        print("foo!")

ValidConcreteClass1.some_method()
instance = ValidConcreteClass2()
instance.some_method()
My question is, can I force the implementation of some_method to be static in the inheriting class?
I noticed @abc.abstractstaticmethod and thought this was the answer, but the following code still runs just fine. I would expect it to reject ValidConcreteClass2 because some_method is not static:
import abc

class SomeAbstractClass(metaclass=abc.ABCMeta):
    @abc.abstractstaticmethod
    def some_method(self): pass

class ValidConcreteClass1(SomeAbstractClass):
    @staticmethod
    def some_method():
        print("foo!")

class ValidConcreteClass2(SomeAbstractClass):
    def some_method(self):
        print("foo!")

ValidConcreteClass1.some_method()
instance = ValidConcreteClass2()
instance.some_method()
I think some clarification is needed:
First, in Python every method is virtual - really virtual; so whether a method is static, or bound to a class or an instance, is a matter for the subclass, not the parent. There's no real reason for wanting to prevent that - what's your purpose?
Second, ABCs check for abstractness at instantiation time - if you try to instantiate a class that still has any abstract method, an error is raised. But ABCs can't do anything about static or class methods that are invoked from the class itself - there's NO CHECK performed on the method itself, just an attribute set on it; it's ABCMeta that does the dirty work when the class is instantiated.
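A small demonstration of that timing - defining an incomplete subclass raises nothing; only instantiating it does (class names here are made up):

```python
import abc

class AbstractThing(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def some_method(self):
        ...

class Incomplete(AbstractThing):
    pass  # defining the class raises no error at all

try:
    Incomplete()  # the abstractness check fires only here
except TypeError as exc:
    print('refused:', exc)
```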
Third, the purpose of abstractstaticmethod is to allow an abstract method - hence, something that must still be overridden somehow by subclasses - to be static and usable from anywhere. Again, no check is done on the method itself, so the following code is perfectly legal:
import abc

class SomeAbstractClass(metaclass=abc.ABCMeta):
    @abc.abstractstaticmethod
    def some_method():
        return 123

class ValidConcreteClass1(SomeAbstractClass):
    def some_method(self):
        return 456

inst = ValidConcreteClass1()
print(inst.some_method())
print(SomeAbstractClass.some_method())
The only reason abstractstaticmethod/abstractclassmethod exist is that the following does not work, because decorated methods lack a __dict__:
class NotWorking(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    @staticmethod
    def some_method(self):
        return "asd"
One last thing: if you really wanted to, you could probably add such functionality by extending ABCMeta, but I won't give you a hint on how to do that unless you tell me why you're doing it :-)

python: calling super().__init__ too early in the __init__ method?

I have a class hierarchy where __init__ in class Base performs some pre-initialization and then calls method calculate. The calculate method is defined in class Base, but it's expected to be redefined in derived classes. The redefined calculate will use some of the attributes that are only available in class Derived:
class Base:
    def __init__(self, args):
        # perform some pre-initialization
        ...
        # now call method "calculate"
        self.calculate()

class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args)
        # do some work and create new instance attributes
        ...
        self.additional_attr = additional_attr
This is not going to work because calculate method in class Derived will be invoked before self.additional_attr is assigned.
I can't move super().__init__(args) call to the end of the __init__ method because some of the work it does has to happen before processing additional_attr.
What to do?
Perhaps you shouldn't have the calculate() call in your constructor then. If you can't construct a derived object by allowing the base constructor to complete first, then you must be doing something wrong IMHO. A sensible approach would be to move that call out of the constructor and perhaps create a factory method to make that call automatically. Then use that method if you need precalculated instances.
class Base(object):
    def __init__(self, args):
        # perform some initialization
        pass

    def calculate(self):
        # do stuff
        pass

    @classmethod
    def precalculated(cls, args):
        # construct first
        newBase = cls(args)
        # now call method "calculate"
        newBase.calculate()
        return newBase

class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args)
        # do some work and create new instance attributes
        self.additional_attr = additional_attr

    @classmethod
    def precalculated(cls, args, additional_attr): # also if you want
        newDerived = cls(args, additional_attr)
        newDerived.calculate()
        return newDerived

newBase = Base('foo')
precalculatedBase = Base.precalculated('foo')
newDerived = Derived('foo', 'bar')
precalculatedDerived = Derived.precalculated('foo', 'bar')
This is bad design, IMHO, and you're abusing the object system of Python. Consider that in other OO languages like C++, you don't even have control over the creation of base classes: the derived class's constructor calls the base constructor before your code runs. Such behavior is almost always expected of well-behaved class hierarchies, and changing it can only lead to problems.
Sure, you can do some patching (such as assigning self.additional_attr before the call to super's constructor, or other tricks), but the better way would be to change your design so that it won't require such hacks. Since you've presented an abstract example here, it's hard to give more comprehensive design advice.
In order for something like this to work, you need to design a protocol that allows the base and derived class(es) to cooperate with each other to accomplish the object initialization task:
class Base:
    def __init__(self, args, *additional_args):
        # perform some pre-initialization
        # ...
        # perform any further initialization needed by derived classes
        self.subclass_setup(*additional_args)
        # now call method "calculate"
        self.calculate()

    def subclass_setup(self, *args):
        pass

class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args, additional_attr)

    def subclass_setup(self, additional_attr):
        # do some work and create new instance attributes
        # ...
        self.additional_attr = additional_attr
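To show the protocol end to end, here is a runnable sketch; the bodies of calculate and the doubling logic are made up for illustration, not part of the original answer:

```python
class Base:
    def __init__(self, args, *additional_args):
        self.args = args                       # pre-initialization
        self.subclass_setup(*additional_args)  # let subclasses finish setup
        self.calculate()                       # safe: all attributes exist now

    def subclass_setup(self, *args):
        pass

    def calculate(self):
        self.result = None

class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args, additional_attr)

    def subclass_setup(self, additional_attr):
        self.additional_attr = additional_attr

    def calculate(self):
        # additional_attr is already assigned when this runs
        self.result = self.additional_attr * 2

d = Derived('foo', 21)
print(d.result)  # → 42
```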
Can you pass additional_attr as a parameter to the __init__ method of the base class and propagate it from there to the calculate method?
Say, something like:
class Base(object):
    def __init__(self, args, additional_attr):
        print('Args for base class:%s' % (args,))
        self.calculate(additional_attr)

class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args, additional_attr)

    def calculate(self, val):
        print('Arg for calculate:%s' % (val,))
        self.additional_attr = val

>>> d = Derived(['test', 'name'], 100)
Args for base class:['test', 'name']
Arg for calculate:100
This is a roundabout way, but with no information about what the pre-initialization steps are, it is hard to say whether this approach would help you.
