I'm wondering if it's possible to have a class which automatically wraps all of its methods, without it being necessary to explicitly decorate each of them.
For example, instead of doing this:
class MyClass:
    @mydecorator
    def mymethod1(...):
        ...

    @mydecorator
    def mymethod2(...):
        ...
I'd like to do something like this:
class MyClass(metaclass=DecoratedMethods):
    def mymethod1(...):
        ...

    def mymethod2(...):
        ...
Here I'm hinting at metaclasses, but I'm not sure it's the right solution path.
I've just discovered the __prepare__ protocol. That would let me do something naive like decorate every function or callable in the class namespace, but that's not really what I want: I only want to decorate methods (class methods and instance methods).
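For concreteness, here's the kind of naive metaclass sketch I mean (mydecorator is a made-up logging wrapper). It blindly wraps every non-dunder callable in the namespace, and classmethod/staticmethod objects slip through because they aren't callable, which is part of why it isn't quite what I want:

import functools

def mydecorator(func):
    # stand-in decorator: just logs each call
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f'calling {func.__name__}')
        return func(*args, **kwargs)
    return wrapper

class DecoratedMethods(type):
    def __new__(mcls, name, bases, namespace):
        for attr, value in list(namespace.items()):
            if callable(value) and not attr.startswith('__'):
                namespace[attr] = mydecorator(value)
        return super().__new__(mcls, name, bases, namespace)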
Python has so many metaprogramming facilities, I'd be surprised if there wasn't a way... Or at least a better way than decorating each method manually?
I'm using Python 3.6.
Thanks!
Related
I have a base class A and a decorator behavior. Each provides different behavior, but sometimes they need to be used together.
Is there a way to implement a new class decorator new_behavior that applies behavior and "injects" A as a parent class?
Something like this:
@new_behavior
class B:
    ...
So B would behave just as if it were declared as class B(A):, but B would also inherit everything @behavior applies?
Broadly speaking, by the time a decorator gets a chance to operate on a class, it's too late to change fundamental properties of the class, like its bases. But that doesn't necessarily mean you can't do what you want, it only rules out direct approaches.
You could have your decorator create a new class with the desired bases, and add the contents of the old class to the new one. But there are a lot of subtle details that might go wrong, like methods that don't play correctly with super and other stuff that make it somewhat challenging. I would not want to do this on a whim.
One possible option that might be simpler than most is to make a new class that inherits from both the class you're decorating, and the base class you want to add. That isn't exactly the same as injecting a base class as a base of the decorated, but it will usually wind up with the same MRO, and super should work just fine. Here's how I'd implement that:
def new_behavior(cls):
    class NewClass(cls, A):  # do the multiple inheritance by adding A here
        pass
    NewClass.__name__ = f'New{cls.__name__}'  # should modify __qualname__ too
    return NewClass
I'm not applying any other decorators in that code, but you could do that by changing the last line to return some_other_decorator(NewClass) or just applying the decorator to the class statement with @decorator syntax. In order to make introspection nicer, you might want to modify a few attributes of NewClass before returning it. I demonstrate altering the __name__ attribute, but you would probably also want to change __qualname__ (which I've skipped because getting something appropriate would be a bit more fiddly and annoying), and maybe some others that I can't think of off the top of my head.
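For illustration, here's a quick usage sketch (the greet method is made up for the demo). Note that super() in B works because the decorated class ends up ahead of A in the new class's MRO:

class A:
    def greet(self):
        return 'hello from A'

@new_behavior
class B:
    def greet(self):
        return 'B says: ' + super().greet()

b = B()              # B is now really NewB, with MRO (NewB, B, A, object)
print(b.greet())     # B says: hello from A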
class X:
    @decorator
    def method(self):
        return

class Y(X):
    def method(self):
        return
Is there any way where the applied decorators still applies on child class method without explicitly decorating them?
Nope, overriding a method creates a completely new function object. It's the same logic by which super().__init__ does not get called automatically, and the general Python guideline of being explicit: if decorators were inherited by default but you did not want one on your override, how would you opt out?
It doesn't seem like too much work to be explicit here and decorate.
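For example, with a stand-in decorator (assuming decorator stands for whatever you used on X):

import functools

def decorator(func):
    # stand-in: just tags calls so the wrapping is visible
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        print('wrapped call to', func.__name__)
        return func(self, *args, **kwargs)
    return wrapper

class X:
    @decorator
    def method(self):
        return

class Y(X):
    @decorator          # repeated explicitly on the override
    def method(self):
        return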
In Java, for example, you can make a class MyClass with certain methods that are specified but not implemented in MyClass, but must be implemented in any class MySubClass that inherits from MyClass. So basically there is some common functionality among all subclasses you want, so you put it in MyClass, and there is some functionality unique (but required) for each subclass, so you want it in each subclass. How can this behavior be achieved in Python?
(I know there are concise terms to describe what I'm asking, so feel free to let me know what these are and how I can better describe my question.)
A very basic example, but the abc docs provide a few more:
import abc

class Foo(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def bar(self):
        raise NotImplementedError

class FooBar(Foo):
    pass

f = FooBar()
# TypeError: Can't instantiate abstract class FooBar with abstract methods bar
You can't require the implementation of a method in a subclass in a way that breaks at compile time, but the convention for a base-class method that must be implemented in subclasses is to raise NotImplementedError.
Something like this:
class MyBase(object):
    def my_method(self, *args, **kwargs):
        raise NotImplementedError("You should implement this method on a subclass of MyBase")
Then your subclasses can implement my_method, but this will break only when the method is called. If you have comprehensive unit tests, as you should, this won't be a problem.
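To illustrate the trade-off compared with the abc approach (the Incomplete subclass is made up):

class Incomplete(MyBase):
    pass

obj = Incomplete()   # unlike the abc version, instantiation succeeds
obj.my_method()      # NotImplementedError is raised only at call time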
Let's say I have a class with some utility methods:
class Utils:
    @staticmethod
    def do_stuff():
        # some stuff
        Utils.do_other_stuff()
        # some more stuff

    @staticmethod
    def do_other_stuff():
        # something other
I don't really like the Utils.do_other_stuff() part.
If it were an instance method, I would reference it via self, but here I have to write the full class name.
Is this where @classmethod is a good idea to use, or is it overkill? Or is there some cleaner way to write Utils, perhaps as a module?
If you need a reference to the current class (which could be a subclass), then definitely make it a classmethod.
That's not overkill; the amount of work Python does to bind a class method is no different from a static method, or a regular method for that matter.
However, don't use classes here unless you have to. Python is not Java, you do not have to use a class and functions can live outside of classes just fine.
@classmethod is the way to go:
class Utils:
    @classmethod
    def do_stuff(cls):
        # some stuff
        cls.do_other_stuff()
        # some more stuff

    @classmethod
    def do_other_stuff(cls):
        # something other
Just a clarification related to Martijn Pieters' comment: I usually avoid @staticmethod and always prefer @classmethod, because it lets me refer to the class and its methods. (I don't agree with the suggestion to write modules of functions… I'm an OOP supporter :P)
It doesn't look like Utils will ever be subclassed or instantiated; it's just a wrapper for static methods. In that case, these methods can all be turned into module-level functions, perhaps in a separate utils module:
# No class!
def do_stuff():
    ...
    do_other_stuff()
    ...

def do_other_stuff():
    ...
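Callers then just import the module, and the module name does the namespacing the class used to do (assuming the functions live in a file named utils.py):

import utils

utils.do_stuff()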
I have a one-to-many class inheritance structure as follows:
class SuperClass:
    def func1():
        print 'hello'

    def func2():
        print 'ow'

class SubClass1(SuperClass):
    def func1():
        print 'hi'

class SubClass2(SuperClass):
    def func1():
        print 'howdy'
...
I want to add functionality to SuperClass so that I can use it when I create SubClass1, SubClass2, and so on, but I cannot edit the code for SuperClass directly. My current solution is:
def func3():
    print 'yes!'

SuperClass.func3 = func3
Is there a better and/or more pythonic way to achieve this?
This is called "monkeypatching", and is perfectly reasonable in some cases.
For example if you have to use someone else's code (that you can't modify) that depends on SuperClass, and you need to change that code's behavior, your only real choice is to replace methods on SuperClass.
However, in your case, there doesn't seem to be any good reason to do this. You're defining all of the subclasses of SuperClass, so why not just add another class in between?
class Intermediate(SuperClass):
    def func3():
        pass

class SubClass1(Intermediate):
    def func1():
        print 'hi'
This isn't good enough for "functionality that should have been in SuperClass but wasn't" if other code you can't control needs that functionality… but when it's only your code that needs that functionality, it's just as good, and a lot simpler.
If even the subclasses aren't under your control, often you can just derive a new class from each one that is. For example:
class Func3Mixin(object):
    def func3():
        pass

class F3SubClass1(SubClass1, Func3Mixin):
    pass

class F3SubClass2(SubClass2, Func3Mixin):
    pass
Now you just construct instances of F3SubClass1 instead of SubClass1. Code that was expecting a SubClass1 instance can use an F3SubClass1 just fine. And Python's duck typing makes this kind of "mixin-oriented programming" especially simple: inside the implementation of Func3Mixin.func3, you can use attributes and methods of SuperClass, despite the fact that Func3Mixin itself isn't statically related to SuperClass in any way, because you know that any runtime object that is a Func3Mixin will also be a SuperClass.
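Here's a self-contained sketch of that in action, with the missing self parameters filled in (the classes as posted won't run; see the note below):

class SuperClass:
    def func1(self):
        print 'hello'

class SubClass1(SuperClass):
    def func1(self):
        print 'hi'

class Func3Mixin(object):
    def func3(self):
        # func1 isn't defined anywhere on Func3Mixin, but any runtime
        # instance of the mixin is also a SuperClass instance
        self.func1()

class F3SubClass1(SubClass1, Func3Mixin):
    pass

F3SubClass1().func3()   # prints 'hi'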
Meanwhile, even when monkeypatching is appropriate, it isn't necessarily the best answer. For example, if you're patching to work around a bug in some third-party code, and that code has a nice license and a source repository that makes it easy to maintain your own patches, you can just fork it, create a fixed copy, and use that instead of the original.
Also, it's worth pointing out that none of your classes are actually usable as written—any attempt to call any of the methods will raise a TypeError because they're missing the self argument. But the way you've monkeypatched in func3, it will fail in exactly the same way as func1. (And the same is true for the alternatives I sketched above.)
Finally, all of your classes here are classic classes rather than new-style, because you forgot to make SuperClass inherit from object. If you can't change SuperClass, of course, that's not your fault—but you may want to fix it anyway by making your subclasses (or Intermediate) multiply inherit from object and SuperClass. (If you've been paying attention: yes, this means you can mix-in new-style-classness. Although under the covers you have to understand metaclasses to understand why.)
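That fix is a one-liner (a sketch assuming the Intermediate approach from earlier, with self added):

class Intermediate(SuperClass, object):   # mixing in object makes Intermediate new-style
    def func3(self):
        print 'yes!'

class SubClass1(Intermediate):
    def func1(self):
        print 'hi'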