I have a class hierarchy where __init__ in class Base performs some pre-initialization and then calls method calculate. The calculate method is defined in class Base, but it's expected to be redefined in derived classes. The redefined calculate will use some of the attributes that are only available in class Derived:
class Base:
    def __init__(self, args):
        # perform some pre-initialization
        ...
        # now call method "calculate"
        self.calculate()


class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args)
        # do some work and create new instance attributes
        ...
        self.additional_attr = additional_attr
This is not going to work, because the calculate method in class Derived will be invoked before self.additional_attr is assigned.
I can't move super().__init__(args) call to the end of the __init__ method because some of the work it does has to happen before processing additional_attr.
What to do?
Perhaps you shouldn't have the calculate() call in your constructor then. If you can't construct a derived object by allowing the base constructor to complete first, then you must be doing something wrong IMHO. A sensible approach would be to move that call out of the constructor and perhaps create a factory method to make that call automatically. Then use that method if you need precalculated instances.
class Base(object):
    def __init__(self, args):
        # perform some initialization
        pass

    def calculate(self):
        # do stuff
        pass

    @classmethod
    def precalculated(cls, args):
        # construct first
        newBase = cls(args)
        # now call method "calculate"
        newBase.calculate()
        return newBase


class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args)
        # do some work and create new instance attributes
        self.additional_attr = additional_attr

    @classmethod
    def precalculated(cls, args, additional_attr):  # also if you want
        newDerived = cls(args, additional_attr)
        newDerived.calculate()
        return newDerived


newBase = Base('foo')
precalculatedBase = Base.precalculated('foo')
newDerived = Derived('foo', 'bar')
precalculatedDerived = Derived.precalculated('foo', 'bar')
This is bad design, IMHO, and you're abusing the object system of Python. Consider that in other OO languages, like C++, you don't even have control over the creation of base classes: the derived class's constructor calls the base constructor before your code runs. Such behavior is almost always expected of well-behaved class hierarchies, and changing it can only lead to problems.
Sure, you can do some patching (such as assigning self.additional_attr before the call to super's constructor, or other tricks), but the better way would be to change your design so that it won't require such hacks. Since you've presented an abstract example here, it's hard to give more comprehensive design advice.
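For illustration, a minimal sketch of the "assign before the super constructor" patch mentioned above, reusing the names from the question (again, not a recommendation):

class Derived(Base):
    def __init__(self, args, additional_attr):
        # Patch: make the attribute available before Base.__init__ runs,
        # so the overridden calculate() can already see it. This couples
        # Derived to the fact that Base.__init__ calls calculate().
        self.additional_attr = additional_attr
        super().__init__(args)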
In order for something like this to work, you need to design a protocol that allows the base and derived class(es) to cooperate with each other to accomplish the object initialization task:
class Base:
    def __init__(self, args, *additional_args):
        # perform some pre-initialization
        # ...

        # perform any further initialization needed by derived classes
        self.subclass_setup(*additional_args)

        # now call method "calculate"
        self.calculate()

    def subclass_setup(self, *args):
        pass


class Derived(Base):
    def __init__(self, args, additional_attr):
        super().__init__(args, additional_attr)

    def subclass_setup(self, additional_attr):
        # do some work and create new instance attributes
        # ...
        self.additional_attr = additional_attr
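For completeness, a quick usage sketch of the protocol above; it assumes a calculate method is defined somewhere in the hierarchy, as in the original question:

class Demo(Derived):
    def calculate(self):
        # additional_attr has already been set by subclass_setup at this point
        print('calculating with', self.additional_attr)


d = Demo('foo', 'bar')  # prints: calculating with bar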
Can you pass additional_attr as a parameter to the __init__ method of the base class and propagate it from there to the calculate method?
Say something like:
class Base(object):
    def __init__(self, args, additional_attr):
        print('Args for base class:%s' % (args))
        self.calculate(additional_attr)


class Derived(Base):
    def __init__(self, args, additional_attr):
        super(Derived, self).__init__(args, additional_attr)

    def calculate(self, val):
        print('Arg for calculate:%s' % (val))
        self.additional_attr = val


>>> d = Derived(['test', 'name'], 100)
Args for base class:['test', 'name']
Arg for calculate:100
This is a roundabout way, but with no information about what the pre-initialization steps are, it is hard to say whether the above approach would help you.
The issue
I would like to be able to re-use methods by implementing them with a decorator, while preserving my IDE's ability to type-hint the methods added, such that:
@methods(methods=[implement_foo, implement_bar])
class K:
    pass

# OR

@method(methods=[Foo, Bar])
class K:
    pass


k = K()

#### THE ISSUE
k.  # <- IDE should recognize the methods .foo() or .bar(), but does not.
My issue is much like How to create a class decorator that can add multiple methods to a class?, but, as mentioned, I want to preserve the type hints and use only one decorator.
What I have tried
I can make it work with one decorator, but not with multiple.
Example with one decorator called implement_method
def implement_method(cls):
    class Inner(cls):
        def __init__(self, *args, **kargs):
            super(Inner, self).__init__(*args, **kargs)

        def method(self):
            pass
    return Inner


@implement_method
class K:
    pass
And type hinting works for a new instance of K.
I imagine that one of the issues is using a loop, but I am unable to come up with a different solution. The following is my best attempt:
def methods(methods):
    def wrapper(cls):
        for method in methods:
            cls = method(cls)
        return cls
    return wrapper


class Bar:
    def bar(self):
        pass


@methods(methods=[Bar])
class K:
    pass


k = K()
k.  # <- not finding bar()
Since your question is a two-part one:
I have an answer for your first part and I am quite stuck on the second. You can modify the signatures of functions using the inspect module (a short sketch is at the end of this answer), but I have not found anything similar for classes and I am not sure it is possible. So for my answer I will focus on your first part:
One decorator for multiple functions:
Let's look at the decorator first:
def add_methods(*methods):
    def wrapper(cls):
        for method in methods:
            setattr(cls, method.__name__, staticmethod(method))
        return cls
    return wrapper
We use *methods as a parameter so that we can add as many methods as we want as arguments.
Then we define a wrapper for the class and in it iterate over all methods we want to add using setattr to add the method to the class. Notice the staticmethod wrapping the original method. You can leave this out if you want the methods to receive the argument self.
Then we return from the wrapper returning the class and return from the decorator returning the wrapper.
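As a side note, if you do leave out the staticmethod wrapper, the added functions behave like ordinary instance methods and receive self. A quick sketch of that variant (the name add_instance_methods and the example class are mine, not from the question):

def add_instance_methods(*methods):
    def wrapper(cls):
        for method in methods:
            # no staticmethod() here: the functions become instance methods
            setattr(cls, method.__name__, method)
        return cls
    return wrapper


def describe(self):
    print('I am a %s' % type(self).__name__)


@add_instance_methods(describe)
class Fruit:
    pass


Fruit().describe()  # I am a Fruit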
Let's write some simple methods next:
def method_a():
    print("I am a banana!")


def method_b():
    print("I am an apple!")
Now we create a simple class using our decorator:
@add_methods(method_a, method_b)
class MyClass:
    def i_was_here_before(self):
        print("Hah!")
And finally test it:
my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()
Our output:
Hah!
I am a banana!
I am an apple!
A word of caution
Usually it is not advisable to change the signature of functions or classes without a good reason (and sometimes even with a good reason).
Alternate Solution
Given that you will need to apply the decorator to each class anyway, you could also just use a superclass like this:
class Parent:
    @staticmethod
    def method_a():
        print("I am a banana!")

    @staticmethod
    def method_b():
        print("I am an apple!")


class MyClass(Parent):
    def i_was_here_before(self):
        print("Hah!")


my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()
Since Python supports multiple inheritance, this should work better, and it also gives you the correct hints.
Complete working example:
def add_methods(*methods):
    def wrapper(cls):
        for method in methods:
            setattr(cls, method.__name__, staticmethod(method))
        return cls
    return wrapper


def method_a():
    print("I am a banana!")


def method_b():
    print("I am an apple!")


@add_methods(method_a, method_b)
class MyClass:
    def i_was_here_before(self):
        print("Hah!")


my_instance = MyClass()
my_instance.i_was_here_before()
my_instance.method_a()
my_instance.method_b()
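And, as mentioned at the top, a short sketch of what modifying a function's reported signature with the inspect module looks like; it only changes what inspect.signature() and help() report, not how the function is actually called (greet is just an example function):

import inspect


def greet(*args, **kwargs):
    print('hi')


# Advertise a nicer signature for tooling; the function itself is unchanged.
greet.__signature__ = inspect.Signature([
    inspect.Parameter('name', inspect.Parameter.POSITIONAL_OR_KEYWORD),
])

print(inspect.signature(greet))  # (name)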
I understand that the concept of super().__init__() has to do with inheritance, and I have seen code with parameters inside the __init__ call. However, I came across a code sample that had this:
class Maze(tk.Tk, object):
    def __init__(self):
        super(Maze, self).__init__()
The parameters are now inside the super() parentheses. What is the difference, and why might one be used over the other? Thank you
This is the original way super() was intended to work:
super(Maze, self).__init__()
This is also the only way that it did work in Python 2.
So, why the arguments?
You want to call __init__() of the class which is the superclass of Maze (probably tk.Tk), bound to self. To do that, you have to pass the arguments Maze and self to super, so that it knows what to do.
What does it actually do?
super(Maze, self).__init__ has to determine type(self) to extract the MRO from it, i.e. the order in which classes are inherited from one another.
Then, in that list, it finds the class which is just above Maze and looks for an __init__ in that class or any class above it. When it finds it, it binds that __init__ method to self (i.e. fixes its first argument, so you don't have to pass it).
You could implement that version of super yourself. It would be something like this:
import functools


class my_super:
    def __init__(self, cls, obj):
        self.cls = cls
        self.obj = obj

    def __getattribute__(self, method_name):
        cls = object.__getattribute__(self, 'cls')
        obj = object.__getattribute__(self, 'obj')
        mro = type(obj).__mro__
        mro_above_cls = mro[mro.index(cls) + 1:]
        for super_cls in mro_above_cls:
            if hasattr(super_cls, method_name):
                method = getattr(super_cls, method_name)
                # bind the found method to the wrapped object
                return functools.partial(method, obj)
Note that you don't have to call this from a method at all. You could do this:
class A:
    def f(self):
        print('A')


class B(A):
    def f(self):
        print('B')


a = A()
b = B()

super(B, b).f()     # prints: A
my_super(B, b).f()  # prints: A
What about the version without arguments?
super(Maze, self).__init__() is very explicit, but almost all Python code passes the current class and self as the arguments anyway, so Python 3 made it easier by providing a magic super() which figures out what you want.
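So inside a method the two spellings are equivalent. A minimal self-contained sketch (Base stands in for tk.Tk here):

class Base:
    def __init__(self):
        print("Base.__init__")


class Maze(Base):
    def __init__(self):
        super().__init__()              # Python 3 zero-argument form
        # super(Maze, self).__init__()  # fully explicit form, same effect


Maze()  # prints: Base.__init__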
This is a feature I miss in several languages and wonder if anyone has any idea how it can be done in Python.
The idea is that I have a base class:
class Base(object):
    def __init__(self):
        self.my_data = 0

    def my_rebind_function(self):
        pass
and a derived class:
class Child(Base):
    def __init__(self):
        super().__init__()
        # Do some stuff here
        self.my_rebind_function()  # <==== This is the line I want to get rid of

    def my_rebind_function(self):
        # Do stuff with self.my_data
        ...
As can be seen above, I have a rebind function which I want called after Child.__init__ has done its job. And I want this done for all inherited classes, so it would be great if it were performed by the base class, so I do not have to retype that line in every child class.
It would be nice if the language had a function like __finally__, operating similarly to how it works with exceptions. That is, it should run after all __init__ functions (of all derived classes) have been run. So the call order would be something like:
Base1.__init__()
...
BaseN.__init__()
LeafChild.__init__()
LeafChild.__finally__()
BaseN.__finally__()
...
Base1.__finally__()
And then object construction is finished. This is also kind of similar to unit testing with setup, run and teardown functions.
You can do this with a metaclass, like this:
class Meta(type):
    def __call__(cls, *args, **kwargs):
        print("start Meta.__call__")
        instance = super().__call__(*args, **kwargs)
        instance.my_rebind_function()
        print("end Meta.__call__\n")
        return instance


class Base(metaclass=Meta):
    def __init__(self):
        print("Base.__init__()")
        self.my_data = 0

    def my_rebind_function(self):
        pass


class Child(Base):
    def __init__(self):
        super().__init__()
        print("Child.__init__()")

    def my_rebind_function(self):
        print("Child.my_rebind_function")
        # Do stuff with self.my_data
        self.my_data = 999


if __name__ == '__main__':
    c = Child()
    print(c.my_data)
By overriding Meta.__call__ you can hook in after all __init__ (and __new__) methods of the class tree have run and before the instance is returned. This is the place to call your rebind function. To understand the call order I added some print statements. The output will look like this:
start Meta.__call__
Base.__init__()
Child.__init__()
Child.my_rebind_function
end Meta.__call__
999
If you want to read on and get deeper into the details, I can recommend the following great article: https://blog.ionelmc.ro/2015/02/09/understanding-python-metaclasses/
I may still not fully understand, but this seems to do what (I think) you want:
class Base(object):
    def __init__(self):
        print("Base.__init__() called")
        self.my_data = 0
        self.other_stuff()
        self.my_rebind_function()

    def other_stuff(self):
        """ empty """

    def my_rebind_function(self):
        """ empty """


class Child(Base):
    def __init__(self):
        super(Child, self).__init__()

    def other_stuff(self):
        print("In Child.other_stuff() doing other stuff I want done in Child class")

    def my_rebind_function(self):
        print("In Child.my_rebind_function() doing stuff with self.my_data")


child = Child()
Output:
Base.__init__() called
In Child.other_stuff() doing other stuff I want done in Child class
In Child.my_rebind_function() doing stuff with self.my_data
If you want a "rebind" function to be invoked after each instance of a type which inherits from Base is instantiated, then I would say this "rebind" function can live outside the Base class (or any class inheriting from it).
You can have a factory function that gives you the object you need when you invoke it(for example give_me_a_processed_child_object()). This factory function basically instantiates an object and does something to it before it returns it to you.
Putting logic in __init__ is not a good idea because it obscures logic and intention. When you write kid = Child(), you don't expect many things to happen in the background, especially things that act on the instance of Child that you just created. What you expect is a fresh instance of Child.
A factory function, however, transparently does something to an object and returns it to you. This way you know you're getting an already processed instance.
Finally, you wanted to avoid adding "rebind" methods to your Child classes, which you now can, since all that logic can be placed in your factory function.
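A sketch of what such a factory could look like (give_me_a_processed_child_object is the hypothetical name used above; the rebind helper stands in for whatever post-processing you need):

class Base:
    def __init__(self):
        self.my_data = 0


class Child(Base):
    pass


def rebind(obj):
    # whatever work you would otherwise have put into my_rebind_function
    obj.my_data = 999
    return obj


def give_me_a_processed_child_object():
    # construct a plain instance, then transparently process it before handing it out
    return rebind(Child())


kid = give_me_a_processed_child_object()
print(kid.my_data)  # 999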
There is a BaseClient
class BaseClient(object):
that is later inherited by a lot of classes
class Account(BaseClient):
    def create(self, **params):
        pass
and a few others.
class MainClass(Account, User):
    pass
There are a few classes that define the same create function:
def create(self, **params):
    pass
How can I add a unique class label, like
MainClass.Account.create()
Now it is working as
MainClass.create()
Update:
There are a lot of duplicate functions like create() that are going to override the ones they inherit from. I would like to qualify the call with the class, like Account, so that when I call
MainClass.Account.create()
MainClass.User.create()
they act as two different functions.
In other words, you have multiple inheritance, with:
class Base1(object):
    def create(self): ...


class Base2(object):
    def create(self): ...


class C(Base1, Base2):
    def create(self): ...
In class C, you can choose whether to call the implementation from the parent classes or not.
Option 1: do not implement create in class C
If you don't implement method create in C, then Base1.create is going to be used.
Note that this situation, where C inherits from Base1 and Base2, is treated as if C inherits from Base1 and Base1 inherits from Base2.
You can see that if you print C.__mro__
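For example (the exact module shown depends on where the classes are defined):

print(C.__mro__)
# (<class '__main__.C'>, <class '__main__.Base1'>, <class '__main__.Base2'>, <class 'object'>)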
See also this thread about MRO: Method Resolution Order (MRO) in new style Python classes
Option 2: do not call the base implementation
class C(Base1, Base2):
    def create(self):
        pass
Now Base1.create is no longer going to be called.
Option 3: call only one of the bases
class C(Base1, Base2):
    def create(self):
        Base2.create(self)
Now Base1.create is not going to be called, but Base2.create is.
Option 4: call each of the base implementations
class C(Base1, Base2):
    def create(self):
        Base1.create(self)
        Base2.create(self)
Both Base1.create and Base2.create will be called.
Option 5: use super to call all base implementations
Although option 4 may seem like a very nice solution here, in some configurations, like diamond inheritance, it could cause a method to be called multiple times. So an alternative approach is to use super, which uses the MRO (see Option 1) to determine which base implementation to use. By following the MRO, it avoids diamond-inheritance problems. However, it has to be used systematically on all classes, and even then it has its caveats.
class CommonBase(object):
    def create(self):
        pass


class Base1(CommonBase):
    def create(self):
        super(Base1, self).create()


class Base2(CommonBase):
    def create(self):
        super(Base2, self).create()


class C(Base1, Base2):
    def create(self):
        super(C, self).create()
Here, C().create() will call all four create methods, each once.
You can't control this from outside the class as a client; it can only be controlled inside a class, in your case inside MainClass, by calling super to reach the create method of one or the other base class: Account or User.
class MainClass(Account, User):
    # your own convention: by default create() dispatches to Account.create
    def create(self, **params):
        # next in the MRO after MainClass is Account
        super(MainClass, self).create(**params)

    # create2() dispatches to User.create
    def create2(self, **params):
        # next in the MRO after Account is User
        super(Account, self).create(**params)
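A quick check of the dispatch (assuming Account.create and User.create each print something so you can tell them apart):

m = MainClass()
m.create()   # ends up in Account.create
m.create2()  # ends up in User.create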
I'm using Python 3.
I know about the @classmethod decorator. Also, I know that classmethods can be called from instances.
class HappyClass(object):
    @classmethod
    def say_hello():
        print('hello')


HappyClass.say_hello()    # hello
HappyClass().say_hello()  # hello
However, I don't seem to be able to create class methods dynamically AND let them be called from instances. Let's say I want something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        # create a class method say_dynamic
        ...


SadClass.say_dynamic()    # prints "dynamic!"
SadClass().say_dynamic()  # prints "dynamic!"
I've played with cls.__dict__ (which produces exceptions), and with setattr(cls, 'say_dynamic', blahblah) (which only makes the thingie callable from the class and not the instance).
If you ask me why, I wanted to make a lazy class property. But it cannot be called from instances.
@classmethod
def search_url(cls):
    if hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
Maybe because the property hasn't been called from the class yet...
In summary, I want to add a lazy, class method that can be called from the instance... Can this be achieved in an elegant (nottoomanylines) way?
Any thoughts?
How I achieved it
Sorry, my examples were very bad ones :\
Anyway, in the end I did it like this...
@classmethod
def search_url(cls):
    if not hasattr(cls, '_search_url'):
        setattr(cls, '_search_url', reverse('%s-search' % cls._meta.model_name))
    return cls._search_url
And the setattr does work, but I had made a mistake when testing it...
You can add a function to a class at any point, a practice known as monkey-patching:
class SadClass:
    pass

@classmethod
def say_dynamic(cls):
    print('hello')

SadClass.say_dynamic = say_dynamic


>>> SadClass.say_dynamic()
hello
>>> SadClass().say_dynamic()
hello
Note that you are using the classmethod decorator, but your function accepts no arguments, which indicates that it's designed to be a static method. Did you mean to use staticmethod instead?
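For reference, a minimal sketch of the two signatures side by side (the method names are just examples):

class HappyClass(object):
    @classmethod
    def say_hello_cls(cls):          # receives the class implicitly
        print('hello from', cls.__name__)

    @staticmethod
    def say_hello_static():          # receives nothing implicitly
        print('hello')


HappyClass.say_hello_cls()       # hello from HappyClass
HappyClass().say_hello_static()  # hello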
If you want to create class methods, do not create them in the __init__ function, as they would then be recreated on each instance creation. However, the following works:
class SadClass(object):
    pass


def say_dynamic(cls):
    print("dynamic!")


SadClass.say_dynamic = classmethod(say_dynamic)
# or
setattr(SadClass, 'say_dynamic', classmethod(say_dynamic))

SadClass.say_dynamic()    # prints "dynamic!"
SadClass().say_dynamic()  # prints "dynamic!"
Of course, in the __init__ method the self argument is an instance, and not the class: to put the method in the class there, you can hack something like
class SadClass(object):
    def __init__(self, *args, **kwargs):
        @classmethod
        def say_dynamic(cls):
            print("dynamic!")

        setattr(self.__class__, 'say_dynamic', say_dynamic)
But it will again reset the method on each instance creation, possibly needlessly. And notice that your code most probably fails because you are calling SadClass.say_dynamic() before any instances are created, and thus before the class method is injected.
Also, notice that a classmethod gets the implicit class argument cls; if you do want your function to be called without any arguments, use the staticmethod decorator.
As a side note, you can just use an instance attribute to hold a function:
>>> class Test:
...     pass
...
>>> t = Test()
>>> t.monkey_patch = lambda s: print(s)
>>> t.monkey_patch('Hello from the monkey patch')
Hello from the monkey patch