class parent:
    def fun1(self):
        print("foo")
    fun2 = fun1

class child(parent):
    def fun1(self):
        print("bar")

child().fun2()
This code outputs "foo". I'm pretty sure I understand why this is happening, but I'm wondering if there is a straightforward way to implement what I want (inheritable aliases that behave according to MRO, so it outputs "bar") or a reason why this is bad programming practice.
There is no way to directly do what you want.
To understand why, you need to understand the way method lookup works. It's explained in detail in the Descriptor HOWTO, and I tried to write a less expert-level explanation here, so I'll assume you read both of those and just show the effects:
>>> child.fun1
<function __main__.child.fun1>
>>> child.fun2
<function __main__.parent.fun1>
>>> child().fun2
<bound method parent.fun1 of <__main__.child object at 0x10b735ba8>>
Notice that child.fun2 is parent.fun1, not child.fun1. Which explains why child().fun2 ends up as a bound method around parent.fun1, even though the self ends up as a child instance. Why? Well, obviously fun2 is not in child.__dict__ (or we wouldn't need an MRO). And in parent.__dict__, it can't be anything but parent.fun1.
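If you want to verify those lookup facts yourself, a quick check against the classes from the question:

print('fun2' in child.__dict__)                             # False, so the MRO search continues past child
print(child.__mro__)                                        # (child, parent, object)
print(parent.__dict__['fun2'] is parent.__dict__['fun1'])   # True: the alias was fixed at class creation
print(child().fun2.__func__ is parent.fun1)                 # True: the bound method wraps parent.fun1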
So, what are the workarounds?
You could make fun2 into a property that forwards to fun1, as Patrick Haugh suggested. Or you can just do this:
def fun2(self):
    return self.fun1()
This solution has the benefit of being dead simple: anyone who knows Python will understand why it works.
But Patrick's solution has the advantage of making not just child().fun2(), but also child().fun2 (as an object you can pass around, compare for identity, etc.) work the way you want it to.
Meanwhile, there is an idiom closely related to what you're asking for, where a set of public methods that you don't expect to override call a "protected" implementation method that you do. For example, a simple 1D array-math class might do this:
from operator import add

class Array1D(list):  # illustrative wrapper; the idiom only needs the methods below
    def _math(lhs, rhs, op):
        # not actually a useful implementation...
        return [op(x, y) for x, y in zip(lhs, rhs)]
    def __add__(self, other):
        return self._math(other, add)
    def __radd__(self, other):
        return self._math(other, add)
    # etc.
And now there's no asymmetry between __add__ and __radd__, only between the "public" interface (__add__ and __radd__) and the "protected" one (_math).
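If a subclass then overrides only the protected hook, both public methods pick up the change automatically. A small sketch, reusing the illustrative Array1D wrapper above (SaturatingArray is a made-up name):

class SaturatingArray(Array1D):
    # overriding only the protected hook changes __add__ and __radd__ at once
    def _math(lhs, rhs, op):
        return [min(op(x, y), 255) for x, y in zip(lhs, rhs)]

print(SaturatingArray([1, 250]) + [10, 10])  # [11, 255]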
Here's a very straightforward way—that's perfectly OK:
class Parent:
    def fun1(self):
        print("foo")

    def fun2(self):
        return self.fun1()

class Child(Parent):
    def fun1(self):
        print("bar")

Child().fun2()  # -> bar
You could have fun2 be a property that returns self.fun1
class Parent:
    def fun1(self):
        print('foo')

    @property
    def fun2(self):
        return self.fun1

class Child(Parent):
    def fun1(self):
        print('bar')

Child().fun2()
# bar
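Because the property returns the bound method itself, the alias is also usable as a value you can pass around or compare, not just call:

c = Child()
print(c.fun2)            # <bound method Child.fun1 of <__main__.Child object at 0x...>>
print(c.fun2 == c.fun1)  # True: both wrap the same function bound to the same instance
c.fun2()                 # bar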
Suppose I have the following code:
class Classy:
    def other(self):
        print("other")

    def method(self):
        print("method")
        self.other()

obj = Classy()
obj.method()
The output:
method
other
So I invoke another method of the object/class from inside the class: I call the 'other' method within the 'method' method.
Now if I run the following code:
class Classy:
    def other(self):
        print("other")

    def method(self):
        print("method")
        Classy.other(self)

obj = Classy()
obj.method()
The output is the same. Now my question is: What is the difference between these two?
I am not sure whether it is just a different style of calling, so they are basically the same, or whether there is a difference in the logic. If there is, I would be interested in an example where the difference matters.
Let's set it up so we can run them side by side:
class Classy:
    def other(self):
        print("Classy.other")

    def method(self):
        print("Classy.method")
        self.other()

class NotClassy:
    def other(self):
        print("NotClassy.other")

    def method(self):
        print("NotClassy.method")
        NotClassy.other(self)
So far, so good:
>>> Classy().method()
Classy.method
Classy.other
>>> NotClassy().method()
NotClassy.method
NotClassy.other
But what if inheritance gets involved, as it so often does in oop? Let's define two subclasses that inherit method but override other:
class ClassyToo(Classy):
    def other(self):
        print("ClassyToo.other")

class NotClassyToo(NotClassy):
    def other(self):
        print("NotClassyToo.other")
Then things get a bit problematic; although the subclasses have almost identical implementation, and the parent classes seemed to behave exactly the same, the outputs here are different:
>>> ClassyToo().method()
Classy.method
ClassyToo.other
>>> NotClassyToo().method()
NotClassy.method
NotClassy.other # what about NotClassyToo.other??
By calling NotClassy.other directly, rather than invoking the method on self, we've bypassed the overridden implementation in NotClassyToo. self might not always be an instance of the class the method is defined in, which is also why you see super getting used - your classes should cooperate in inheritance to ensure the right behaviour.
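super is the usual tool for that cooperation: an override can extend the inherited behaviour instead of replacing it. A minimal sketch reusing the Classy class above (ClassyThree is a made-up subclass):

class ClassyThree(Classy):
    def other(self):
        super().other()            # run the parent's version first
        print("ClassyThree.other")

ClassyThree().method()
# Classy.method
# Classy.other
# ClassyThree.other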
From a famous example, I learned the difference between method, classmethod and staticmethod in a Python class.
Source:
What is the difference between @staticmethod and @classmethod in Python?
class A(object):
    def foo(self, x):
        print("executing foo(%s,%s)" % (self, x))

    @classmethod
    def class_foo(cls, x):
        print("executing class_foo(%s,%s)" % (cls, x))

    @staticmethod
    def static_foo(x):
        print("executing static_foo(%s)" % x)

    # My guesses
    def My_Question(self, x):
        self.foo(x)
        A.class_foo(x)
        A.static_foo(x)

a = A()
Now I am wondering how to call a method, a @classmethod, and a @staticmethod from inside the class.
I put my guesses in the My_Question function above, please correct me if I am wrong with any of these.
Yes, your guesses will work. Note that it is also possible/normal to call staticmethods and classmethods outside the class:
class A():
    ...

A.class_foo()
A.static_foo()
Also note that inside regular instance methods, it's customary to call the staticmethods and class methods directly on the instance (self) rather than the class (A):
class A():
    def instance_method(self):
        self.class_foo()
        self.static_foo()
This allows inheritance to work as you might expect: if I create a B subclass of A and call B().instance_method(), class_foo will get B instead of A as the cls argument. And if I override static_foo on B to do something slightly different from A.static_foo, the overridden version will be called as well.
Some examples might make this more clear:
class A(object):
    @staticmethod
    def static():
        print("Static, in A")

    @staticmethod
    def staticoverride():
        print("Static, in A, overrideable")

    @classmethod
    def clsmethod(cls):
        print("class, in A", cls)

    @classmethod
    def clsmethodoverrideable(cls):
        print("class, in A, overridable", cls)

    def instance_method(self):
        self.static()
        self.staticoverride()
        self.clsmethod()
        self.clsmethodoverrideable()

class B(A):
    @classmethod
    def clsmethodoverrideable(cls):
        print("class, in B, overridable", cls)

    @staticmethod
    def staticoverride():
        print("Static, in B, overrideable")

a = A()
b = B()
a.instance_method()
b.instance_method()
...
After you've run that, try it by changing all of the self. to A. inside instance_method. Rerun and compare. You'll see that all of the references to B have gone (even when you're calling b.instance_method()). This is why you want to use self rather than the class.
As @wim said, what you have is right. Here's the output when My_Question is called.
>>> a.My_Question("My_Answer=D")
executing foo(<__main__.A object at 0x0000015790FF4668>,My_Answer=D)
executing class_foo(<class '__main__.A'>,My_Answer=D)
executing static_foo(My_Answer=D)
Rather than try to explain verbosely, this code snippet should do it.
def decorator(function):
    return lambda self: function(self) + 1

# imported from library
class A(object):
    # override
    def method_1(self):
        pass

    # override
    def method_2(self):
        pass

class B(A):
    def method_1(self):
        return 1

class C(B):
    @decorator
    def method_2(self):
        return 1

class D(B):
    @decorator
    def method_2(self):
        return 2

print(C().method_1())  # 1
print(C().method_2())  # 2
print(D().method_2())  # 3
This works well, but since decorator is only used on method_2, maybe it should be pulled into the class.
class B(A):
    def method_1(self):
        return 1

    @staticmethod
    def decorator(function):
        return lambda self: function(self) + 1

class C(B):
    @B.decorator
    def method_2(self):
        return 1
This works, but it's not clear to me whether this is actually better; in particular, whether Python treats B.decorator like an external function that just happens to be defined on B, or whether C inheriting from B makes this more efficient.
What I actually want is some way to define the decorator on B and use it on C like this.
class C(B):
    @self.decorator
    def method_2(self):
        return 1
This doesn't work. Is there a better alternative to either perhaps? Thanks.
Besides the fact that there is no reason to move the decorator definition inside the class, as pointed out in the comments, here are some clarifications:
When you use the @staticmethod decorator on your decorator definition, you get exactly the same behavior as if it were defined outside any class body.
If you use @classmethod instead, as in
class B(A):
    @classmethod
    def decorator(cls, function):
        return lambda self: function(self) + 1
the difference is that the decorator is passed the class it is defined in as its first parameter. This could be of some use, or not; note that the decorated method itself will still be a regular instance method, unless you make use of the @classmethod or @staticmethod decorators inside the decorator body itself, changing the wrapper function it returns.
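For example, a minimal sketch of that @classmethod variant in use (A is stubbed out here, since the original comes from a library):

class A:
    def method_1(self): pass
    def method_2(self): pass

class B(A):
    def method_1(self):
        return 1

    @classmethod
    def decorator(cls, function):
        # cls is B; the wrapped function is still a plain instance method
        return lambda self: function(self) + 1

class C(B):
    @B.decorator
    def method_2(self):
        return 1

print(C().method_2())  # 2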
As for efficiency: the most "efficient" thing is to just leave the decorator outside any class body. The difference for a @staticmethod-defined decorator will be minimal, probably impossible to measure at all, but should involve two or three fewer attribute accesses on class creation (which usually won't be inside a critical loop in the application). So, no, there is no real difference there.
Now, this is not what you are asking, but someone could come here looking for a way to automatically re-apply a decorator from a method in a superclass when overriding that method in a subclass. That is not easy to do, but should be possible with specially prepared decorators (which would annotate themselves on the wrapped method).
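Not part of the answer above, but one possible shape for that, assuming Python 3.6+ for __init_subclass__ (plus_one and the _decorator attribute are invented names):

def plus_one(function):
    def wrapper(self):
        return function(self) + 1
    wrapper._decorator = plus_one      # the wrapper remembers which decorator made it
    return wrapper

class Base:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # if an override shadows a decorated method, re-apply the same decorator
        for name, attr in list(cls.__dict__.items()):
            if not callable(attr) or hasattr(attr, "_decorator"):
                continue
            for base in cls.__mro__[1:]:
                inherited = base.__dict__.get(name)
                if inherited is not None and hasattr(inherited, "_decorator"):
                    setattr(cls, name, inherited._decorator(attr))
                    break

    @plus_one
    def value(self):
        return 1

class Sub(Base):
    def value(self):    # written undecorated; re-decorated automatically
        return 10

print(Base().value())   # 2
print(Sub().value())    # 11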
I'm using Python 3.
I know about the #classmethod decorator. Also, I know that classmethods can be called from instances.
class HappyClass(object):
    @classmethod
    def say_hello():
        print('hello')

HappyClass.say_hello()    # hello
HappyClass().say_hello()  # hello
However, I don't seem to be able to create class methods dynamically AND let them be called from instances. Let's say I want something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        # create a class method say_dynamic

SadClass.say_dynamic()    # prints "dynamic!"
SadClass().say_dynamic()  # prints "dynamic!"
I've played with cls.__dict__ (which produces exceptions), and with setattr(cls, 'say_dynamic', blahblah) (which only makes the thingie callable from the class and not the instance).
If you ask me why, I wanted to make a lazy class property. But it cannot be called from instances.
@classmethod
def search_url(cls):
    if hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
Maybe because the property hasn't been called from the class yet...
In summary, I want to add a lazy, class method that can be called from the instance... Can this be achieved in an elegant (nottoomanylines) way?
Any thoughts?
How I achieved it
Sorry, my examples were very bad ones :\
Anyway, in the end I did it like this...
@classmethod
def search_url(cls):
    if not hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
And the setattr does work, but I had made a mistake when testing it...
You can add a function to a class at any point, a practice known as monkey-patching:
class SadClass:
    pass

@classmethod
def say_dynamic(cls):
    print('hello')

SadClass.say_dynamic = say_dynamic
>>> SadClass.say_dynamic()
hello
>>> SadClass().say_dynamic()
hello
Note that you are using the classmethod decorator, but your function accepts no arguments, which indicates that it's designed to be a static method. Did you mean to use staticmethod instead?
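In other words, either of these two spellings would be consistent (a quick sketch):

class HappyClass(object):
    @classmethod
    def say_hello(cls):     # classmethod: receives the class as its first argument
        print('hello from', cls.__name__)

    @staticmethod
    def say_hi():           # staticmethod: receives no implicit argument at all
        print('hi')

HappyClass.say_hello()      # hello from HappyClass
HappyClass().say_hello()    # hello from HappyClass
HappyClass().say_hi()       # hi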
If you want to create class methods, do not create them in the __init__ function, as they would then be recreated on every instance creation. However, the following works:
class SadClass(object):
    pass

def say_dynamic(cls):
    print("dynamic!")

SadClass.say_dynamic = classmethod(say_dynamic)
# or
setattr(SadClass, 'say_dynamic', classmethod(say_dynamic))

SadClass.say_dynamic()    # prints "dynamic!"
SadClass().say_dynamic()  # prints "dynamic!"
Of course, in the __init__ method the self argument is an instance, and not the class: to put the method in the class there, you can hack something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        @classmethod
        def say_dynamic(cls):
            print("dynamic!")
        setattr(self.__class__, 'say_dynamic', say_dynamic)
But it will again reset the method on each instance creation, possibly needlessly. And notice that your code most probably fails because you are calling SadClass.say_dynamic() before any instances are created, and thus before the class method is injected.
Also, notice that a classmethod gets the implicit class argument cls; if you do want your function to be called without any arguments, use the staticmethod decorator.
As a side note, you can just use an instance attribute to hold a function:
>>> class Test:
... pass
...
>>> t=Test()
>>> t.monkey_patch=lambda s: print(s)
>>> t.monkey_patch('Hello from the monkey patch')
Hello from the monkey patch
I'd like to be able to do this:
class A(object):
    @staticandinstancemethod
    def B(self=None, x, y):
        print(self is None and "static" or "instance")

A.B(1, 2)
A().B(1, 2)
This seems like a problem that should have a simple solution, but I can't think of or find one.
It is possible, but please don't. I couldn't help but implement it though:
class staticandinstancemethod(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, klass=None):
        def newfunc(*args, **kw):
            return self.f(obj, *args, **kw)
        return newfunc
...and its use:
>>> class A(object):
...     @staticandinstancemethod
...     def B(self, x, y):
...         print(self is None and "static" or "instance")
...
>>> A.B(1, 2)
static
>>> A().B(1, 2)
instance
Evil!
Since you'd like the static method case to be used to create a new class anyway, you'd best just make it a normal method and call it at the end of the __init__ method.
Or, if you don't want that, create a separate factory function outside the class that will instantiate a new, empty object, and call the desired method on it.
There probably are ways of doing exactly what you are asking for, but they would wander through the inner mechanisms of Python, be confusing, and be incompatible across Python 2.x and 3.x, and I can't see a real need for it.
From what you're saying, is this along the lines of what you're looking for?
I'm not sure there is a built-in way to do exactly what you're describing.
class Foo(object):
    def __init__(self, a=None, b=None):
        self.a = a
        self.b = b

    def Foo(self):
        if self.a is None and self.b is None:
            form = CreationForm()
        else:
            form = EditingForm()
        return form
The answer to your question is no, you can't do that.
What I would do, since Python also supports regular functions, is define a function outside that class, then call that function from a normal method. The caller can decide which one is needed.
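For instance, a minimal sketch of that arrangement (make_form, Form, and the form classes are placeholder names):

class CreationForm: ...
class EditingForm: ...

def make_form(data=None):
    # plain module-level function: usable with no instance at all
    return CreationForm() if data is None else EditingForm()

class Form:
    def __init__(self, data=None):
        self.data = data

    def form(self):
        # normal method that simply delegates to the module-level function
        return make_form(self.data)

print(make_form())           # a CreationForm, no instance needed
print(Form("stuff").form())  # an EditingForm, via an instance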