Inheriting from decorated classes - python

I'm trying to decorate a class with another class. I also want to inherit from the decorated class, but I get some errors. Here's my code:
class Decorator:
    def __init__(self, decorated):
        pass

@Decorator
class Foo:
    pass

class Goo(Foo):
    pass
The error I get when I try to subclass from Foo is this:
Traceback (most recent call last):
  File "test.py", line 9, in <module>
    class Goo(Foo):
TypeError: __init__() takes exactly 2 positional arguments (4 given)
By replacing Decorator's __init__ with...
def __init__(self, *args):
    for arg in args:
        print(arg)
... I get the following output:
<class '__main__.Foo'>
Goo
(<__main__.Decorator object at 0x010073B0>,)
{'__module__': '__main__'}
What are those parameters and how should I be using them inside Decorator?

I'll try to answer the "what are those parameters" question. This code:
@Decorator
class Foo:
    pass
is equivalent to:
class Foo:
    pass

Foo = Decorator(Foo)
This means that Foo ends up being an instance of the Decorator class instead of being a class.
When you try to use this instance as a base of a class (Goo), Python has to determine a metaclass that will be used to create the new class. In this case it uses Foo.__class__, which is Decorator. Then it calls the metaclass with (name, bases, dict) arguments and expects it to return a new class.
This is how you end up with these arguments in Decorator.__init__.
More about this can be found here:
http://www.python.org/download/releases/2.2.3/descrintro/#metaclasses
(particularly the "When a class statement is executed..." part)
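To make the mechanics concrete, here is a minimal sketch (reusing the question's names with the *args version of __init__; the exact contents of the namespace dict are illustrative):
class Decorator:
    def __init__(self, *args):
        for arg in args:
            print(arg)

@Decorator
class Foo:      # prints <class '__main__.Foo'>; Foo is now a Decorator instance
    pass

# `class Goo(Foo): pass` is then roughly equivalent to:
namespace = {'__module__': '__main__'}
Goo = type(Foo)('Goo', (Foo,), namespace)   # type(Foo) is Decorator
# so Decorator gets called with (name, bases, dict), and Goo ends up bound to
# yet another Decorator instance rather than to a class.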

Are you trying to add a MixIn to a class after the class has been defined?
If so, you could inject the MixIn this way:
def inject_class(mixin):
    def _inject_class(cls):
        return type(cls.__name__, (mixin,) + cls.__bases__, dict(cls.__dict__))
    return _inject_class

class MixIn(object):
    def mix(self):
        print('mix')

@inject_class(MixIn)
class Foo(object):
    def foo(self):
        print('foo')

class Goo(Foo):
    def goo(self):
        print('goo')

goo = Goo()
goo.mix()
goo.foo()
goo.goo()
prints
mix
foo
goo
If you don't want the generality of inject_class, you could make a specialized class decorator which mixes in Decorator only:
def decorate(cls):
    class Decorator(object):
        def deco(self):
            print('deco')
    return type(cls.__name__, (Decorator,) + cls.__bases__, dict(cls.__dict__))

@decorate
class Foo(object):
    def foo(self):
        print('foo')
The result is the same.
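A quick check (assuming the decorate version above) that the decorated Foo is still a real class and can be subclassed:
f = Foo()
f.foo()            # foo
f.deco()           # deco -- mixed in by the decorator

class Goo(Foo):    # subclassing works because decorate() returns a genuine class
    def goo(self):
        print('goo')

Goo().deco()       # deco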

I had the same problem and the following solution works for me:
from functools import update_wrapper

class decoratorBase():
    def __new__(cls, logic):
        self = object.__new__(cls)
        self.__init__(logic)
        def new(cls):
            # cls is the decorated class type, not the decorator class type itself
            self._createInstance(cls)
            self._postInstanceCreation()
            return self
        self._logic.__new__ = new
        # return the wrapped class and not a wrapper
        return self._logic
    def __init__(self, logic):
        # logic is the decorated class
        self._logic = logic
    def _createInstance(self, cls):
        self._logicInstance = object.__new__(cls)
        self._logicInstance.__init__()
    def _postInstanceCreation(self):
        pass

class factory(decoratorBase):
    def __init__(self, *largs, **kwargs):
        super().__init__(*largs, **kwargs)
        self.__instance = None
    def _createInstance(self, cls):
        self._logicInstance = None
        self._cls = cls
    def _postInstanceCreation(self):
        update_wrapper(self, self._cls)
    def __call__(self, userData, *largs, **kwargs):
        logicInstance = object.__new__(self._cls)
        logicInstance.__init__(*largs, **kwargs)
        logicInstance._update(userData)
        return logicInstance

class singelton(decoratorBase):
    def _postInstanceCreation(self):
        update_wrapper(self, self._logicInstance)
    def __call__(self, userData):
        self._logicInstance._update(userData)
        return self._logicInstance

class base():
    def __init__(self):
        self.var = 0
        print("Create new object")
    def __call__(self):
        self.var += self._updateValue()
    def _update(self, userData):
        print("Update object static value with {0}".format(userData))
        self.var = userData

@factory
class factoryTestBase(base):
    def __call__(self):
        super().__call__()
        print("I'm a factory, here is the proof: {0}".format(self.var))
    def _updateValue(self):
        return 1

class factoryTestDerived(factoryTestBase):
    def _updateValue(self):
        return 5

@singelton
class singeltonTestBase(base):
    def __call__(self):
        super().__call__()
        print("I'm a singelton, here is the proof: {0}".format(self.var))
    def _updateValue(self):
        return 1

class singeltonTestDerived(singeltonTestBase):
    def _updateValue(self):
        return 5
The magic in this approach is the overloading of the __new__() method, both for the decorator itself and for the "wrapper" returned by the decorator. I put the word wrapper in quotes because there actually is no wrapper: the decorated class is altered by the decorator and returned. Using this scheme, you are able to inherit from a decorated class. The most important thing is the change to the __new__() method of the decorated class, which is made by the following lines:
def new(cls):
    self._createInstance(cls)
    self._postInstanceCreation()
    return self
self._logic.__new__ = new
Using this, you have access to the decorator's methods, like self._createInstance(), during creation of an object from a decorated class. You even have the opportunity to inherit from your decorators (as shown in the example).
Now let's run a simple example:
>>> factoryObjCreater = factoryTestBase()
>>> factoryObj1 = factoryObjCreater(userData = 1)
Create new object
Update object static value with 1
>>> factoryObj2 = factoryObjCreater(userData = 1)
Create new object
Update object static value with 1
>>> factoryObj1()
I'm a factory, here is the proof: 2
>>> factoryObj2()
I'm a factory, here is the proof: 2
>>> factoryObjDerivedCreater = factoryTestDerived()
>>> factoryObjDerived1 = factoryObjDerivedCreater(userData = 2)
Create new object
Update object static value with 2
>>> factoryObjDerived2 = factoryObjDerivedCreater(userData = 2)
Create new object
Update object static value with 2
>>> factoryObjDerived1()
I'm a factory, here is the proof: 7
>>> factoryObjDerived2()
I'm a factory, here is the proof: 7
>>> singeltonObjCreater = singeltonTestBase()
Create new object
>>> singeltonObj1 = singeltonObjCreater(userData = 1)
Update object static value with 1
>>> singeltonObj2 = singeltonObjCreater(userData = 1)
Update object static value with 1
>>> singeltonObj1()
I'm a singelton, here is the proof: 2
>>> singeltonObj2()
I'm a singelton, here is the proof: 3
>>> singeltonObjDerivedCreater = singeltonTestDerived()
Create new object
>>> singeltonObjDerived1 = singeltonObjDerivedCreater(userData = 2)
Update object static value with 2
>>> singeltonObjDerived2 = singeltonObjDerivedCreater(userData = 2)
Update object static value with 2
>>> singeltonObjDerived1()
I'm a singelton, here is the proof: 7
>>> singeltonObjDerived2()
I'm a singelton, here is the proof: 12

Related

decorating methods causes method to pass in objects [duplicate]

Problem Description
I want to use a decorator to define a class method, but this requires me to manually pass in the 'self' object, which I shouldn't have to provide.
def func_wrapper(func):
    def call_func(self):
        print(self.a)
        func()
    return call_func

def func():
    print('hello')

class test:
    def __init__(self, func):
        self.a = 0
        self.call_func = func_wrapper(func)

mytest = test(func)
#mytest.call_func() # why does this not work?
mytest.call_func(mytest) # this works
I want to be able to call mytest.call_func(), but this doesn't work, presumably because call_func is bound to func_wrapper and not to mytest. If I manually pass in the object, e.g. mytest.call_func(mytest), it works, but I don't want to have to pass the object in manually; that creates inconsistent call signatures if someone inherits from the test class and writes their own call_func method, because then the method would be properly bound to the class.
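A minimal sketch of the binding behaviour I am referring to (hypothetical names; a plain function only becomes a bound method when looked up on the class, not when it is stored on an instance):
def greet(self):
    print('hello from', self)

class C:
    pass

C.method = greet   # stored on the class: attribute lookup binds it
c = C()
c.method()         # works, self is filled in automatically

c.plain = greet    # stored on the instance: no binding happens
try:
    c.plain()
except TypeError as exc:
    print(exc)     # greet() missing 1 required positional argument: 'self'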
Solution Attempts
def func_wrapper2(func, obj):
    def call_func():
        print(obj.a)
        func()
    return call_func

class test:
    def __init__(self, func):
        self.a = 0
        self.call_func = func_wrapper2(func, self)
This solution lets me call call_func() as desired, but here func_wrapper2 is not a true decorator, as it has to be passed the object as well.
Looking on the web I found this blog https://medium.com/@vadimpushtaev/decorator-inside-python-class-1e74d23107f6 which talks about this issue and recommends defining the decorator either in a nested class or in a helper class. However, their solution doesn't seem to work, and I am getting type errors from passing the wrong number of inputs.
class test2:
    class test2helper:
        @classmethod
        def func_wrapper(func):
            print(self.a)
            func()
    def __init__(self):
        self.a = 0
    @test2helper.func_wrapper
    def call_func(self):
        print('hello')
So what is the proper way to use decorators with class methods? Every way to do it seems to cause different issues with how the self is being handled. I am going to use the func_wrapper2 design unless there is a better way to do this.
You are missing one level:
class test2:
    class test2helper:
        @classmethod
        def decorator(cls, func):       # this must return a function!
            def func_wrapper(self):     # ... namely this one, the "wrapper"
                print(self.a)           # ... where you have access to the instance
                func(self)              # ... upon which the method is called
            return func_wrapper
    def __init__(self):
        self.a = 0
    @test2helper.decorator
    def call_func(self):
        print('hello')
>>> t = test2()
>>> t.call_func()
0
hello
Or, if you want to go with the earlier attempt without nested class:
def decorator(func):        # you are decorating an unbound function!
    def func_wrapper(obj):
        print(obj.a)
        func(obj)           # which has to be passed all the arguments
    return func_wrapper

class test:
    def __init__(self):
        self.a = 0
    @decorator
    def call_func(self):
        print('hello')
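For completeness, a quick check that this variant behaves the same way:
t = test()
t.call_func()
# 0
# hello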
You can define a class decorator to do what you want:
def class_decorator(cls):
    def call_func(self):
        print(self.a)
        return func()
    setattr(cls, 'call_func', call_func)
    return cls

def func():
    print('hello')

@class_decorator
class Test:
    def __init__(self, func):
        self.a = 0

mytest = Test(func)
mytest.call_func() # This now works.
Output:
0
hello

python extending a memoized class gives a compilation error

I have given up memoization of a class as a bag-of-worms that I didn't want to explore and here is one example of why. The question I ask is "how does one extend or inherit from a memoized class" but it's very possible I have made a mistake. The memoize class below is a cut-down version of the one by brandizzi in How can I memoize a class instantiation in Python? and googling the subject finds more involved such classes.
class memoize(object):
    def __init__(self, cls):
        self.cls = cls
        # I didn't understand why this was needed
        self.__dict__.update(cls.__dict__)
    # bit about static methods not needed
    def __call__(self, *args):
        try:
            self.cls.instances
        except:
            self.cls.instances = {}
        key = '//'.join(map(str, args))
        if key not in self.cls.instances:
            self.cls.instances[key] = self.cls(*args)
        return self.cls.instances[key]
class Foo():
    def __init__(self, val):
        self.val = val
    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self.val, id(self))

class Bar(Foo):
    def __init__(self, val):
        super().__init__(val)

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
# produces exactly what I expect
# [Foo<0,3071981964>, Foo<0,3071982092>, Foo<1,3071982316>]
# [Bar<0,3071983340>, Bar<0,3071983404>, Bar<1,3071983436>]

Foo = memoize(Foo)

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
# and now Foo has been memoized so Foo(0) always produces the same object
# [Foo<0,3071725804>, Foo<0,3071725804>, Foo<1,3071726060>]
# [Bar<0,3071711916>, Bar<0,3071711660>, Bar<1,3071725644>]

# this produces a compilation error that I don't understand
class Baz(Foo):
    def __init__(self, val):
        super().__init__(val)

# Traceback (most recent call last):
#   File "/tmp/foo.py", line 49, in <module>
#     class Baz(Foo):
# TypeError: __init__() takes 2 positional arguments but 4 were given
This "recipe" is indeed a very bad idea: once you rebind Foo to memoize(Foo), Foo is a memoize instance and not class Foo anymore. This breaks all expectations with respect to Python's type system and object model. In this case, it is about how the class statement works. Actually, this:
class Titi():
    x = 42
    def toto(self):
        print(self.x)
is syntactic sugar for:
def toto(self):
    print(self.x)

Titi = type("Titi", (object,), {"x": 42, "toto": toto})
del toto
Note that this happens at runtime (like everything in Python except parsing / bytecode compilation), and that type is itself a class, so calling type creates a new class, which is a type instance (the class of a class is called a 'metaclass', and type is the default metaclass).
So with Foo now being a memoize instance instead of a type instance, and since memoize is not a proper metaclass (its __init__ method's signature is incompatible), the whole thing just cannot work.
To get this to work, you'd have to make memoize a proper metaclass (this is a simplified example assuming a single arg named param but it can be generalized if you want to):
class FooType(type):
    def __new__(meta, name, bases, attrs):
        if "_instances" not in attrs:
            attrs["_instances"] = dict()
        return type.__new__(meta, name, bases, attrs)
    def __call__(cls, param):
        if param not in cls._instances:
            cls._instances[param] = super(FooType, cls).__call__(param)
        return cls._instances[param]

class Foo(metaclass=FooType):
    def __init__(self, param):
        self._param = param
        print("%s init(%s)" % (self, param))
    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self._param, id(self))

class Bar(Foo):
    pass

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
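One design consequence worth noting, sketched here as a small check (assuming the code above has just been run): since __new__ gives every class body its own _instances dict, each subclass keeps its own cache rather than sharing Foo's.
assert Foo(0) is Foo(0)        # memoized per class
assert Bar(0) is Bar(0)
assert Foo(0) is not Bar(0)    # Foo and Bar use separate _instances caches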

can a function be static and non-static in python 2

Let's say I have this class:
class Test(object):
    def __init__(self, a):
        self.a = a
    def test(self, b):
        if isinstance(self, Test):
            return self.a + b
        else:
            return self + b
This would ideally in my world do this:
>>> Test.test(1,2)
3
>>> Test(1).test(2)
3
Now this doesn't work because you get this error:
TypeError: unbound method test() must be called with Test instance as first argument (got int instance instead)
In Python 3 this works fine, and I have the sneaking suspicion this is possible with a decorator in Python 2, but my Python-fu isn't strong enough to get it to work.
Plot twist: so what happens when I need something on self when it's not called statically?
If you want something that will actually receive self if called on an instance, but can also be called on the class, writing your own descriptor type may be advisable:
import types

class ClassOrInstanceMethod(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped
    def __get__(self, instance, owner):
        if instance is None:
            instance = owner
        return self.wrapped.__get__(instance, owner)

class demo(object):
    @ClassOrInstanceMethod
    def foo(self):
        # self will be the class if this is called on the class
        print(self)
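A short usage sketch of the descriptor above (the printed reprs will vary):
d = demo()
d.foo()     # self is the instance, e.g. <__main__.demo object at 0x...>
demo.foo()  # self is the class itself: <class '__main__.demo'>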
For the original version of your question, you could just write it like any other static method, with @staticmethod. Calling a static method on an instance works the same as calling it on the class:
class Test(object):
    @staticmethod
    def test(a, b):
        return a + b
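A quick check of the static-method version (note that both operands are always passed explicitly):
print(Test.test(1, 2))     # 3
print(Test().test(1, 2))   # 3 -- calling it on an instance works the same way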

Python proxy class

I'm trying to create a Proxy class to another class. I want this class to be passed into the proxy in its constructor and then for the proxy to dynamically create all the same methods of this class on itself.
This is what I have so far, which is not working:
import inspect
from optparse import OptionParser

class MyClass:
    def func1(self):
        print 'MyClass.func1'
    def func2(self):
        print 'MyClass.func1'

class ProxyClass:
    def __init__(self):
        myClass = MyClass()
        members = inspect.getmembers(MyClass, predicate=inspect.ismethod)
        for member in members:
            funcName = member[0]
            def fn(self):
                print 'ProxyClass.' + funcName
                return myClass[funcName]()
            self.__dict__[funcName] = fn

proxyClass = ProxyClass()
proxyClass.func1()
proxyClass.func2()
I think it is the line self.__dict__[funcName] = fn that needs to be changed, but I am not sure what to change it to.
I'm new to Python so if there is a completely different Pythonic way of doing this I would be happy to hear about that too.
I would not explicitly copy the methods of the wrapped class. You can use the magic method __getattr__ to control what happens when you call something on the proxy object, including decorating it as you like; __getattr__ has to return a callable object, so you can make that callable do whatever you need to (in addition to calling the original method).
I have included an example below.
class A:
    def foo(self): return 42
    def bar(self, n): return n + 5
    def baz(self, m, n): return m ** n

class Proxy:
    def __init__(self, proxied_object):
        self.__proxied = proxied_object
    def __getattr__(self, attr):
        def wrapped_method(*args, **kwargs):
            print("The method {} is executing.".format(attr))
            result = getattr(self.__proxied, attr)(*args, **kwargs)
            print("The result was {}.".format(result))
            return result
        return wrapped_method

proxy = Proxy(A())
proxy.foo()
proxy.bar(10)
proxy.baz(2, 10)
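Run as-is, the example above should print along these lines:
The method foo is executing.
The result was 42.
The method bar is executing.
The result was 15.
The method baz is executing.
The result was 1024.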

Python : Set method attribute from within method

I am trying to make a python decorator that adds attributes to methods of a class so that I can access and modify those attributes from within the method itself. The decorator code is
from types import MethodType

class attribute(object):
    def __init__(self, **attributes):
        self.attributes = attributes
    def __call__(self, function):
        class override(object):
            def __init__(self, function, attributes):
                self.__function = function
                for att in attributes:
                    setattr(self, att, attributes[att])
            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)
            def __get__(self, instance, owner):
                return MethodType(self, instance, owner)
        retval = override(function, self.attributes)
        return retval
I tried this decorator on the toy example that follows.
class bar(object):
    @attribute(a=2)
    def foo(self):
        print self.foo.a
        self.foo.a = 1
Though I am able to access the value of attribute 'a' from within foo(), I can't set it to another value. Indeed, when I call bar().foo(), I get the following AttributeError.
AttributeError: 'instancemethod' object has no attribute 'a'
Why is this? More importantly how can I achieve my goal?
Edit
Just to be more specific, I am trying to find a simple way to implement static variables that live inside class methods. Continuing from the example above, I would like to instantiate b = bar(), call both the foo() and doo() methods, and then access b.foo.a and b.doo.a later on.
class bar(object):
    @attribute(a=2)
    def foo(self):
        self.foo.a = 1
    @attribute(a=4)
    def doo(self):
        self.foo.a = 3
The best way to do this is to not do it at all.
First of all, there is no need for an attribute decorator; you can just assign it yourself:
class bar(object):
    def foo(self):
        print self.foo.a
        self.foo.a = 1
    foo.a = 2
However, this still encounters the same errors. You need to do:
self.foo.__dict__['a'] = 1
You can instead use a metaclass...but that gets messy quickly.
On the other hand, there are cleaner alternatives.
You can use defaults:
def foo(self, a):
    print a[0]
    a[0] = 2
foo.func_defaults = foo.func_defaults[:-1] + ([2],)
Of course, my preferred way is to avoid this altogether and use a callable class (a "functor", in C++ terms):
class bar(object):
    def __init__(self):
        self.foo = self.foo_method(self)
    class foo_method(object):
        def __init__(self, bar):
            self.bar = bar
            self.a = 2
        def __call__(self):
            print self.a
            self.a = 1
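A brief usage sketch of the callable-class approach (Python 2, matching the snippet above):
b = bar()
b.foo()   # prints 2, then sets a to 1
b.foo()   # prints 1 from then on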
Or just use classic class attributes:
class bar(object):
    def __init__(self):
        self.a = 1
    def foo(self):
        print self.a
        self.a = 2
If what you want is to hide a from derived classes, use what passes for private attributes in Python:
class bar(object):
    def __init__(self):
        self.__a = 1 # this will be implicitly name-mangled to _bar__a
    def foo(self):
        print self.__a
        self.__a = 2
EDIT: You want static attributes?
class bar(object):
    a = 1
    def foo(self):
        print self.a
        self.a = 2
EDIT 2: If you want static attributes visible to only the current function, you can use PyExt's modify_function:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
It's slightly ugly and hackish. But it works.
My recommendation would be just to use double underscores:
class bar(object):
    __a = 1
    def foo(self):
        print self.__a
        self.__a = 2
Although this is visible to the other functions, it's invisible to anything else (actually, it's there, but it's mangled).
FINAL EDIT: Use this:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
    foo.a = foo.func_globals['a']

b = bar()
b.foo() # prints 1
b.foo() # prints 2
# external access
b.foo.a[0] = 77
b.foo() # prints 77
While you can accomplish your goal by replacing self.foo.a = 1 with self.foo.__dict__['a'] = 1, it is generally not recommended.
If you are using Python 2 (and not Python 3), whenever you retrieve a method from an instance, a new instance-method object is created, which is a wrapper around the original function defined in the class body.
The instance method is a rather transparent proxy to the function - you can retrieve the function's attributes through it, but not set them - that is why setting an item in self.foo.__dict__ works.
Alternatively, you can reach the function object itself using self.foo.im_func: the im_func attribute of an instance method points to the underlying function.
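A minimal Python 2 sketch of that alternative, assuming the attribute decorator and bar class from the question:
class bar(object):
    @attribute(a=2)
    def foo(self):
        print self.foo.a           # reads are proxied through to the wrapped object
        self.foo.im_func.a = 1     # writes have to go through im_func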
Based on other contributors' answers, I came up with the following workaround. First, wrap a dictionary in a class that resolves non-existent attributes against the wrapped dictionary, as in the following code.
class DictWrapper(object):
    def __init__(self, d):
        self.d = d
    def __getattr__(self, key):
        return self.d[key]
Credits to Lucas Jones for this code.
Then implement an addstatic decorator with a statics attribute that will store the static attributes.
class addstatic(object):
    def __init__(self, **statics):
        self.statics = statics
    def __call__(self, function):
        class override(object):
            def __init__(self, function, statics):
                self.__function = function
                self.statics = DictWrapper(statics)
            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)
            def __get__(self, instance, objtype):
                from types import MethodType
                return MethodType(self, instance)
        retval = override(function, self.statics)
        return retval
The following code is an example of how the addstatic decorator can be used on methods.
class bar(object):
    @addstatic(a=2, b=3)
    def foo(self):
        self.foo.statics.a += 1
        self.foo.statics.b += 2
Then, playing with an instance of the bar class yields:
>>> b = bar()
>>> b.foo.statics.a
2
>>> b.foo.statics.b
3
>>> b.foo()
>>> b.foo.statics.a
3
>>> b.foo.statics.b
5
The reason for using this statics dictionary follows jsbueno's answer, which suggests that what I want would require overloading the dot operator of an instance method wrapping the foo function, which I am not sure is possible. Of course, the method's attributes could be set in self.foo.__dict__, but since that is not recommended (as suggested by brainovergrow), I came up with this workaround. I am not certain it would be recommended either, and I guess it is open for comments.
