I have given up on memoization of a class as a can of worms I didn't want to explore, and here is one example of why. The question I ask is "how does one extend or inherit from a memoized class?", though it's quite possible I have made a mistake. The memoize class below is a cut-down version of the one by brandizzi in How can I memoize a class instantiation in Python?, and googling the subject turns up more involved such classes.
class memoize(object):
    def __init__(self, cls):
        self.cls = cls
        # I didn't understand why this was needed
        self.__dict__.update(cls.__dict__)
        # bit about static methods not needed

    def __call__(self, *args):
        try:
            self.cls.instances
        except AttributeError:
            self.cls.instances = {}
        key = '//'.join(map(str, args))
        if key not in self.cls.instances:
            self.cls.instances[key] = self.cls(*args)
        return self.cls.instances[key]
class Foo():
    def __init__(self, val):
        self.val = val

    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self.val, id(self))

class Bar(Foo):
    def __init__(self, val):
        super().__init__(val)

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
# produces exactly what I expect
# [Foo<0,3071981964>, Foo<0,3071982092>, Foo<1,3071982316>]
# [Bar<0,3071983340>, Bar<0,3071983404>, Bar<1,3071983436>]
Foo = memoize(Foo)
f1,f2,f3 = [Foo(i) for i in (0,0,1)]
print([f1,f2,f3])
b1,b2,b3 = [Bar(i) for i in (0,0,1)]
print([b1,b2,b3])
# and now Foo has been memoized so Foo(0) always produces the same object
# [Foo<0,3071725804>, Foo<0,3071725804>, Foo<1,3071726060>]
# [Bar<0,3071711916>, Bar<0,3071711660>, Bar<1,3071725644>]
# this produces a compilation error that I don't understand
class Baz(Foo):
    def __init__(self, val):
        super().__init__(val)
# Traceback (most recent call last):
# File "/tmp/foo.py", line 49, in <module>
# class Baz(Foo):
# TypeError: __init__() takes 2 positional arguments but 4 were given
This "recipe" is indeed a very bad idea - once you rebind Foo to memoize(Foo), Foo is a memoize instance and not class Foo anymore. This breaks all expectations wrt/ python's type and the whole object model. In this case, it about how the class statement works. Actually, this:
class Titi():
    x = 42

    def toto(self):
        print(self.x)
is syntactic sugar for:
def toto(self):
    print(self.x)

Titi = type("Titi", (object,), {"x": 42, "toto": toto})
del toto
Note that this happens at runtime (like everything in Python except parsing / bytecode compilation), and that type is a class so calling type creates a new class which is a type instance (this is named a 'metaclass' - the class of a class - and type is the default metaclass).
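A quick check of that last point (my own two-liner, not part of the original answer):

# calling type directly returns a new class object, itself a type instance
C = type("C", (object,), {"x": 42})
assert isinstance(C, type)
assert C().x == 42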
So with Foo now being a memoize instance instead of a type instance, and since memoize is not a proper metaclass (its __init__ method's signature is incompatible with the (name, bases, dict) call the class statement makes), the whole thing just cannot work.
To get this to work, you'd have to make memoize a proper metaclass (this is a simplified example assuming a single arg named param but it can be generalized if you want to):
class FooType(type):
    def __new__(meta, name, bases, attrs):
        if "_instances" not in attrs:
            attrs["_instances"] = dict()
        return type.__new__(meta, name, bases, attrs)

    def __call__(cls, param):
        if param not in cls._instances:
            cls._instances[param] = super(FooType, cls).__call__(param)
        return cls._instances[param]

class Foo(metaclass=FooType):
    def __init__(self, param):
        self._param = param
        print("%s init(%s)" % (self, param))

    def __repr__(self):
        return "{}<{},{}>".format(self.__class__.__name__, self._param, id(self))

class Bar(Foo):
    pass

f1, f2, f3 = [Foo(i) for i in (0, 0, 1)]
print([f1, f2, f3])
b1, b2, b3 = [Bar(i) for i in (0, 0, 1)]
print([b1, b2, b3])
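For reference, a run of this version should print something along these lines (ids vary from run to run; the annotations are mine, not part of the original answer):

# Foo<0,...> init(0)                    <- each distinct (class, param) is built once
# Foo<1,...> init(1)
# [Foo<0,idA>, Foo<0,idA>, Foo<1,idB>]  <- Foo(0) is cached: same id twice
# Bar<0,...> init(0)                    <- Bar gets its own _instances dict
# Bar<1,...> init(1)
# [Bar<0,idC>, Bar<0,idC>, Bar<1,idD>]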
I have the following classes implementing a "Delegation Design Pattern" with an additional DelegatorParent class:
class DelegatorParent():
    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee():
    def myMethod(self):
        return 'myMethod'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    def __getattr__(self, attrname):
        return getattr(self.delegatee, attrname)

a = Delegator()
result = a.myMethod()
Everything looks fine.
Now I would like to put an abstract method in DelegatorParent, to ensure that "myMethod" is always defined.
from abc import ABCMeta, abstractmethod

class DelegatorParent():
    __metaclass__ = ABCMeta

    @abstractmethod
    def myMethod(self):
        pass

    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee():
    def myMethod(self):
        return 'myMethod'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    def __getattr__(self, attrname):
        return getattr(self.delegatee, attrname)

    # This method seems unnecessary, but if I erase it an exception is
    # raised because the abstract method's restriction is violated
    def myMethod(self):
        return self.delegatee.myMethod()

a = Delegator()
result = a.myMethod()
Can you help me find an "elegant" way to remove myMethod from Delegator? Intuition tells me that it is somehow redundant, considering that a custom __getattr__ method is defined.
And more importantly, notice that with this implementation, if I forget to define myMethod in ConcreteDelegatee, the program runs, but it may crash at runtime when I call Delegator.myMethod(), which is exactly what I wanted to avoid by using abstract methods in DelegatorParent.
Obviously a simple solution would be to move @abstractmethod to the Delegator class, but I want to avoid doing that because in my program DelegatorParent is a very important class (and Delegator is just an auxiliary class).
You can decide to automatically implement the abstract methods delegated to ConcreteDelegatee.
For each abstract method, check if its name exists in the ConcreteDelegatee class and implement this method as a delegate to that class's method.
from abc import ABCMeta, abstractmethod

class DelegatorParent(object):
    __metaclass__ = ABCMeta

    def __init__(self):
        self.a = 'whatever'

    @abstractmethod
    def myMethod(self):
        pass

class Delegatee(object):
    pass

class ConcreteDelegatee(Delegatee):
    def myMethod(self):
        return 'myMethod'

    def myMethod2(self):
        return 'myMethod2'

class Delegator(DelegatorParent):
    def __new__(cls, *args, **kwargs):
        def make_delegated(name):
            # bind name per method: a closure created directly inside the
            # loop below would late-bind and always use the last name
            def delegated(this, *a, **kw):
                return getattr(this.delegatee, name)(*a, **kw)
            return delegated

        implemented = set()
        for name in cls.__abstractmethods__:
            if hasattr(ConcreteDelegatee, name):
                setattr(cls, name, make_delegated(name))
                implemented.add(name)
        cls.__abstractmethods__ = frozenset(cls.__abstractmethods__ - implemented)
        obj = super(Delegator, cls).__new__(cls, *args, **kwargs)
        obj.delegatee = ConcreteDelegatee()
        return obj

    def __getattr__(self, attrname):
        # Called only for attributes not defined by this class (or its bases).
        # Retrieve attribute from current behavior delegate class instance.
        return getattr(self.delegatee, attrname)

# All abstract methods are delegated to ConcreteDelegatee
a = Delegator()
print(a.myMethod())   # correctly prints 'myMethod'
print(a.myMethod2())  # correctly prints 'myMethod2'
This solves the main problem (prevent ConcreteDelegatee from forgetting to define myMethod). Other abstract methods are still checked if you forgot to implement them.
The __new__ method is in charge of the delegation, which frees your __init__ from doing it.
Since you use ABCMeta, you must define the abstract methods. One could remove your method from the __abstractmethods__ set, but it is a frozenset. Either way, it involves listing all abstract methods.
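To illustrate, the rejected approach would look something like this sketch (assuming the Delegator class above):

# __abstractmethods__ is a frozenset, so it cannot be mutated in place;
# rebinding it means spelling out every method you want to drop:
Delegator.__abstractmethods__ = Delegator.__abstractmethods__ - {'myMethod'}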
So, instead of playing with __getattr__, you can use a simple descriptor.
For instance:
class Delegated(object):
    def __init__(self, attrname=None):
        self.attrname = attrname

    def __get__(self, instance, owner):
        if instance is None:
            return self
        delegatee = instance.delegatee
        return getattr(delegatee, self.attrname)

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)

    myMethod = Delegated('myMethod')
An advantage here: the developer has the explicit information that "myMethod" is delegated.
If you try:
a = Delegator()
result = a.myMethod()
It works! But if you forget to implement myMethod in Delegator class, you have the classic error:
Traceback (most recent call last):
File "script.py", line 40, in <module>
a = Delegator()
TypeError: Can't instantiate abstract class Delegator with abstract methods myMethod
Edit
This implementation can be generalized as follow:
from abc import ABCMeta, abstractmethod

class DelegatorParent():
    __metaclass__ = ABCMeta

    @abstractmethod
    def myMethod1(self):
        pass

    @abstractmethod
    def myMethod2(self):
        pass

    def __init__(self):
        self.a = 'whatever'

class ConcreteDelegatee1():
    def myMethod1(self):
        return 'myMethod1'

class ConcreteDelegatee2():
    def myMethod2(self):
        return 'myMethod2'

class DelegatedTo(object):
    def __init__(self, attrname):
        self.delegatee_name, self.attrname = attrname.split('.')

    def __get__(self, instance, owner):
        if instance is None:
            return self
        delegatee = getattr(instance, self.delegatee_name)
        return getattr(delegatee, self.attrname)

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee1 = ConcreteDelegatee1()
        self.delegatee2 = ConcreteDelegatee2()
        DelegatorParent.__init__(self)

    myMethod1 = DelegatedTo('delegatee1.myMethod1')
    myMethod2 = DelegatedTo('delegatee2.myMethod2')
a = Delegator()
result = a.myMethod2()
Here, we can specify the delegatee name and delegatee method.
Here is my current solution. It solves the main problem (prevent ConcreteDelegatee from forgetting to define myMethod), but I'm still not convinced, because I still need to define myMethod inside Delegator, which seems redundant.
from abc import ABCMeta, abstractmethod

class DelegatorParent(object):
    __metaclass__ = ABCMeta

    def __init__(self):
        self.a = 'whatever'

    @abstractmethod
    def myMethod(self):
        pass

class Delegatee(object):
    def checkExistence(self, attrname):
        if not callable(getattr(self, attrname, None)):
            error_msg = "Can't instantiate " + str(self.__class__.__name__) + " without abstract method " + attrname
            raise NotImplementedError(error_msg)

class ConcreteDelegatee(Delegatee):
    def myMethod(self):
        return 'myMethod'

    def myMethod2(self):
        return 'myMethod2'

class Delegator(DelegatorParent):
    def __init__(self):
        self.delegatee = ConcreteDelegatee()
        DelegatorParent.__init__(self)
        for method in DelegatorParent.__abstractmethods__:
            self.delegatee.checkExistence(method)

    def myMethod(self, *args, **kw):
        return self.delegatee.myMethod(*args, **kw)

    def __getattr__(self, attrname):
        # Called only for attributes not defined by this class (or its bases).
        # Retrieve attribute from current behavior delegate class instance.
        return getattr(self.delegatee, attrname)

# if I forget to implement myMethod inside ConcreteDelegatee,
# the following line will correctly raise an exception saying
# that 'myMethod' is missing inside 'ConcreteDelegatee'.
a = Delegator()
print(a.myMethod())   # correctly prints 'myMethod'
print(a.myMethod2())  # correctly prints 'myMethod2'
In Python I can add a method to a class with the @classmethod decorator. Is there a similar decorator to add a property to a class? I can better show what I'm talking about.
class Example(object):
    the_I = 10

    def __init__(self):
        self.an_i = 20

    @property
    def i(self):
        return self.an_i

    def inc_i(self):
        self.an_i += 1

    # is this even possible?
    @classproperty
    def I(cls):
        return cls.the_I

    @classmethod
    def inc_I(cls):
        cls.the_I += 1

e = Example()
assert e.i == 20
e.inc_i()
assert e.i == 21

assert Example.I == 10
Example.inc_I()
assert Example.I == 11
Is the syntax I've used above possible or would it require something more?
The reason I want class properties is so I can lazy load class attributes, which seems reasonable enough.
Here's how I would do this:
class ClassPropertyDescriptor(object):
    def __init__(self, fget, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        return self.fget.__get__(obj, klass)()

    def __set__(self, obj, value):
        if not self.fset:
            raise AttributeError("can't set attribute")
        type_ = type(obj)
        return self.fset.__get__(obj, type_)(value)

    def setter(self, func):
        if not isinstance(func, (classmethod, staticmethod)):
            func = classmethod(func)
        self.fset = func
        return self

def classproperty(func):
    if not isinstance(func, (classmethod, staticmethod)):
        func = classmethod(func)
    return ClassPropertyDescriptor(func)
class Bar(object):
    _bar = 1

    @classproperty
    def bar(cls):
        return cls._bar

    @bar.setter
    def bar(cls, value):
        cls._bar = value

# test instance instantiation
foo = Bar()
assert foo.bar == 1

baz = Bar()
assert baz.bar == 1

# test static variable
baz.bar = 5
assert foo.bar == 5

# test setting variable on the class
Bar.bar = 50
assert baz.bar == 50
assert foo.bar == 50
The setter doesn't work when we assign on the class itself (Bar.bar = 50): that assignment is handled by type(Bar).__setattr__, so Python never calls Bar.bar.__set__ and the descriptor is simply replaced. Adding a metaclass definition solves this:
class ClassPropertyMetaClass(type):
    def __setattr__(self, key, value):
        if key in self.__dict__:
            obj = self.__dict__.get(key)
            if obj and type(obj) is ClassPropertyDescriptor:
                return obj.__set__(self, value)
        return super(ClassPropertyMetaClass, self).__setattr__(key, value)
# and update class define:
#     class Bar(object):
#         __metaclass__ = ClassPropertyMetaClass
#         _bar = 1

# and update ClassPropertyDescriptor.__set__:
#     def __set__(self, obj, value):
#         if not self.fset:
#             raise AttributeError("can't set attribute")
#         if inspect.isclass(obj):
#             type_ = obj
#             obj = None
#         else:
#             type_ = type(obj)
#         return self.fset.__get__(obj, type_)(value)
Now all will be fine.
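Note that the commented snippets use the Python 2 __metaclass__ hook. On Python 3, where that hook is ignored, the class would presumably be declared like this instead (my adaptation, not part of the original answer):

import inspect  # needed by the updated __set__ above

class Bar(object, metaclass=ClassPropertyMetaClass):
    _bar = 1

    @classproperty
    def bar(cls):
        return cls._bar

    @bar.setter
    def bar(cls, value):
        cls._bar = value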
If you define classproperty as follows, then your example works exactly as you requested.
class classproperty(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, owner):
        return self.f(owner)
The caveat is that you can't use this for writable properties: e.I = 20 will not be intercepted (a descriptor without __set__ is a non-data descriptor, so the assignment simply shadows the property with an instance attribute), and Example.I = 20 will overwrite the property object itself.
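A quick sketch of that caveat, reusing the Example class from the question (my illustration):

e = Example()
assert Example.I == 10   # read goes through the descriptor
e.I = 20                 # no __set__, so this just shadows it on the instance
assert e.I == 20
assert Example.I == 10   # the class-level property is untouched...
Example.I = 20           # ...until this replaces the descriptor itself
assert Example.I == 20   # now just a plain class attribute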
[answer written based on python 3.4; the metaclass syntax differs in 2 but I think the technique will still work]
You can do this with a metaclass...mostly. Dappawit's answer almost works, but I think it has a flaw:
class MetaFoo(type):
    @property
    def thingy(cls):
        return cls._thingy

class Foo(object, metaclass=MetaFoo):
    _thingy = 23
This gets you a classproperty on Foo, but there's a problem...
print("Foo.thingy is {}".format(Foo.thingy))
# Foo.thingy is 23
# Yay, the classmethod-property is working as intended!
foo = Foo()
if hasattr(foo, "thingy"):
print("Foo().thingy is {}".format(foo.thingy))
else:
print("Foo instance has no attribute 'thingy'")
# Foo instance has no attribute 'thingy'
# Wha....?
What the hell is going on here? Why can't I reach the class property from an instance?
I was beating my head on this for quite a while before finding what I believe is the answer. Python @property descriptors are a subset of descriptors, and, from the descriptor documentation (emphasis mine):
The default behavior for attribute access is to get, set, or delete the
attribute from an object’s dictionary. For instance, a.x has a lookup chain
starting with a.__dict__['x'], then type(a).__dict__['x'], and continuing
through the base classes of type(a) excluding metaclasses.
So the method resolution order doesn't include our class properties (or anything else defined in the metaclass). It is possible to make a subclass of the built-in property decorator that behaves differently, but (citation needed) I've gotten the impression googling that the developers had a good reason (which I do not understand) for doing it that way.
That doesn't mean we're out of luck; we can access the properties on the class itself just fine...and we can get the class from type(self) within the instance, which we can use to make @property dispatchers:
class Foo(object, metaclass=MetaFoo):
    _thingy = 23

    @property
    def thingy(self):
        return type(self).thingy
Now Foo().thingy works as intended for both the class and the instances! It will also continue to do the right thing if a derived class replaces its underlying _thingy (which is the use case that got me on this hunt originally).
This isn't 100% satisfying to me -- having to do setup in both the metaclass and object class feels like it violates the DRY principle. But the latter is just a one-line dispatcher; I'm mostly okay with it existing, and you could probably compact it down to a lambda or something if you really wanted.
If you use Django, it has a built-in classproperty decorator.
from django.utils.decorators import classproperty
For Django 4, use:
from django.utils.functional import classproperty
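A minimal usage sketch (the class and attribute names here are made up; only the import paths come from Django itself):

from django.utils.functional import classproperty  # Django 3.1+

class Deployment:
    _region = "eu-west-1"

    @classproperty
    def region(cls):
        return cls._region

assert Deployment.region == "eu-west-1"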
I think you may be able to do this with the metaclass, since the metaclass is to the class what the class is to the instance. I know you can assign a __call__() method to the metaclass to override calling the class, MyClass(). I wonder if using the property decorator on the metaclass operates similarly.
Wow, it works:
class MetaClass(type):
    def getfoo(self):
        return self._foo
    foo = property(getfoo)

    @property
    def bar(self):
        return self._bar

class MyClass(object):
    __metaclass__ = MetaClass
    _foo = 'abc'
    _bar = 'def'

print MyClass.foo
print MyClass.bar
Note: This is in Python 2.7. Python 3+ uses a different technique to declare a metaclass. Use: class MyClass(metaclass=MetaClass):, remove __metaclass__, and the rest is the same.
As far as I can tell, there is no way to write a setter for a class property without creating a new metaclass.
I have found that the following method works. Define a metaclass with all of the class properties and setters you want. IE, I wanted a class with a title property with a setter. Here's what I wrote:
class TitleMeta(type):
    @property
    def title(self):
        return getattr(self, '_title', 'Default Title')

    @title.setter
    def title(self, title):
        self._title = title
        # Do whatever else you want when the title is set...
Now make the actual class you want as normal, except have it use the metaclass you created above.
# Python 2 style:
class ClassWithTitle(object):
    __metaclass__ = TitleMeta
    # The rest of your class definition...

# Python 3 style:
class ClassWithTitle(object, metaclass=TitleMeta):
    pass  # Your class definition...
It's a bit weird to define this metaclass as we did above if we'll only ever use it on the single class. In that case, if you're using the Python 2 style, you can actually define the metaclass inside the class body. That way it's not defined in the module scope.
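For completeness, that inline variant might look like this (a sketch; Python 2 only, since the __metaclass__ hook does not exist on Python 3):

class ClassWithTitle(object):
    # Python 2 picks up a __metaclass__ name defined in the class body
    class __metaclass__(type):
        @property
        def title(self):
            return getattr(self, '_title', 'Default Title')

        @title.setter
        def title(self, title):
            self._title = title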
def _create_type(meta, name, attrs):
    type_name = f'{name}Type'
    type_attrs = {}
    for k, v in attrs.items():
        if type(v) is _ClassPropertyDescriptor:
            type_attrs[k] = v
    return type(type_name, (meta,), type_attrs)

class ClassPropertyType(type):
    def __new__(meta, name, bases, attrs):
        Type = _create_type(meta, name, attrs)
        cls = super().__new__(meta, name, bases, attrs)
        cls.__class__ = Type
        return cls

class _ClassPropertyDescriptor(object):
    def __init__(self, fget, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, owner):
        if self in obj.__dict__.values():
            return self.fget(obj)
        return self.fget(owner)

    def __set__(self, obj, value):
        if not self.fset:
            raise AttributeError("can't set attribute")
        return self.fset(obj, value)

    def setter(self, func):
        self.fset = func
        return self

def classproperty(func):
    return _ClassPropertyDescriptor(func)

class Bar(metaclass=ClassPropertyType):
    __bar = 1

    @classproperty
    def bar(cls):
        return cls.__bar

    @bar.setter
    def bar(cls, value):
        cls.__bar = value

bar = Bar()
assert Bar.bar == 1
Bar.bar = 2
assert bar.bar == 2
nbar = Bar()
assert nbar.bar == 2
I happened to come up with a solution very similar to @Andrew's, only DRY:
import inspect

class MetaFoo(type):
    def __new__(mc1, name, bases, nmspc):
        nmspc.update({'thingy': MetaFoo.thingy})
        return super(MetaFoo, mc1).__new__(mc1, name, bases, nmspc)

    @property
    def thingy(cls):
        if not inspect.isclass(cls):
            cls = type(cls)
        return cls._thingy

    @thingy.setter
    def thingy(cls, value):
        if not inspect.isclass(cls):
            cls = type(cls)
        cls._thingy = value

class Foo(metaclass=MetaFoo):
    _thingy = 23

class Bar(Foo):
    _thingy = 12
This has the best of all answers:
- The "metaproperty" is added to the class, so that it will still be a property of the instance
- No need to redefine thingy in any of the classes
- The property works as a "class property" for both instance and class
- You have the flexibility to customize how _thingy is inherited
In my case, I actually customized _thingy to be different for every child, without defining it in each class (and without a default value), by:

def __new__(mc1, name, bases, nmspc):
    nmspc.update({'thingy': MetaFoo.thingy, '_thingy': None})
    return super(MetaFoo, mc1).__new__(mc1, name, bases, nmspc)
If you only need lazy loading, then you could just have a class initialisation method.
EXAMPLE_SET = False

class Example(object):
    @classmethod
    def initclass(cls):
        global EXAMPLE_SET
        if EXAMPLE_SET:
            return
        cls.the_I = 'ok'
        EXAMPLE_SET = True

    def __init__(self):
        Example.initclass()
        self.an_i = 20

try:
    print(Example.the_I)
except AttributeError:
    print('ok class not "loaded"')

foo = Example()
print(foo.the_I)
print(Example.the_I)
But the metaclass approach seems cleaner, and with more predictable behavior.
Perhaps what you're looking for is the Singleton design pattern. There's a nice SO QA about implementing shared state in Python.
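For reference, the shared-state idea usually looks something like the classic Borg recipe (my paraphrase, not the linked answer):

class Borg(object):
    _shared_state = {}

    def __init__(self):
        # every instance shares one __dict__, hence one set of attributes
        self.__dict__ = self._shared_state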
I am trying to make a python decorator that adds attributes to methods of a class so that I can access and modify those attributes from within the method itself. The decorator code is
from types import MethodType

class attribute(object):
    def __init__(self, **attributes):
        self.attributes = attributes

    def __call__(self, function):
        class override(object):
            def __init__(self, function, attributes):
                self.__function = function
                for att in attributes:
                    setattr(self, att, attributes[att])

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, owner):
                return MethodType(self, instance, owner)

        retval = override(function, self.attributes)
        return retval
I tried this decorator on the toy example that follows.
class bar(object):
    @attribute(a=2)
    def foo(self):
        print self.foo.a
        self.foo.a = 1
Though I am able to access the value of attribute 'a' from within foo(), I can't set it to another value. Indeed, when I call bar().foo(), I get the following AttributeError.
AttributeError: 'instancemethod' object has no attribute 'a'
Why is this? More importantly how can I achieve my goal?
Edit
Just to be more specific, I am trying to find a simple way to implement static variables that live inside class methods. Continuing from the example above, I would like to instantiate b = bar(), call both foo() and doo() methods, and then access b.foo.a and b.doo.a later on.
class bar(object):
    @attribute(a=2)
    def foo(self):
        self.foo.a = 1

    @attribute(a=4)
    def doo(self):
        self.doo.a = 3
The best way to do this is to not do it at all.
First of all, there is no need for an attribute decorator; you can just assign it yourself:
class bar(object):
    def foo(self):
        print self.foo.a
        self.foo.a = 1

    foo.a = 2
However, this still encounters the same errors. You need to do:
self.foo.__dict__['a'] = 1
You can instead use a metaclass...but that gets messy quickly.
On the other hand, there are cleaner alternatives.
You can use defaults:
def foo(self, a=None):  # placeholder default so func_defaults is non-empty
    print a[0]
    a[0] = 2

foo.func_defaults = foo.func_defaults[:-1] + ([2],)
Of course, my preferred way is to avoid this altogether and use a callable class ("functor" in C++ words):
class bar(object):
    def __init__(self):
        self.foo = self.foo_method(self)

    class foo_method(object):
        def __init__(self, bar):
            self.bar = bar
            self.a = 2

        def __call__(self):
            print self.a
            self.a = 1
Or just use classic class attributes:
class bar(object):
    def __init__(self):
        self.a = 1

    def foo(self):
        print self.a
        self.a = 2
If it's that you want to hide a from derived classes, use whatever private attributes are called in Python terminology:
class bar(object):
    def __init__(self):
        self.__a = 1  # this will be implicitly mangled to _bar__a

    def foo(self):
        print self.__a
        self.__a = 2
EDIT: You want static attributes?
class bar(object):
    a = 1

    def foo(self):
        print self.a
        self.a = 2
EDIT 2: If you want static attributes visible to only the current function, you can use PyExt's modify_function:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
It's slightly ugly and hackish. But it works.
My recommendation would be just to use double underscores:
class bar(object):
    __a = 1

    def foo(self):
        print self.__a
        self.__a = 2
Although this is visible to the other functions, it's invisible to anything else (actually, it's there, but it's mangled).
FINAL EDIT: Use this:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
    foo.a = foo.func_globals['a']

b = bar()
b.foo()  # prints 1
b.foo()  # prints 2

# external access
b.foo.a[0] = 77
b.foo()  # prints 77
While you can accomplish your goal by replacing self.foo.a = 1 with self.foo.__dict__['a'] = 1, it is generally not recommended.
If you are using Python 2 (and not Python 3), whenever you retrieve a method from an instance, a new instance-method object is created which is a wrapper to the original function defined in the class body.
The instance method is a rather transparent proxy to the function: you can retrieve the function's attributes through it, but not set them. That is why setting an item in self.foo.__dict__ works.
Alternatively, you can reach the function object itself using self.foo.im_func: the im_func attribute of an instance method points to the underlying function.
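Concretely, that route would look something like this (a Python 2 sketch of my own):

class bar(object):
    def foo(self):
        print self.foo.a        # reads are proxied through the method wrapper
        self.foo.im_func.a = 1  # writes must target the underlying function
    foo.a = 2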
Based on other contributors' answers, I came up with the following workaround. First, wrap a dictionary in a class resolving non-existent attributes to the wrapped dictionary, such as the following code.
class DictWrapper(object):
    def __init__(self, d):
        self.d = d

    def __getattr__(self, key):
        return self.d[key]
Credits to Lucas Jones for this code.
Then implement an addstatic decorator with a statics attribute that will store the static attributes.
class addstatic(object):
    def __init__(self, **statics):
        self.statics = statics

    def __call__(self, function):
        class override(object):
            def __init__(self, function, statics):
                self.__function = function
                self.statics = DictWrapper(statics)

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, objtype):
                from types import MethodType
                return MethodType(self, instance)

        retval = override(function, self.statics)
        return retval
The following code is an example of how the addstatic decorator can be used on methods.
class bar(object):
    @addstatic(a=2, b=3)
    def foo(self):
        self.foo.statics.a += 1
        self.foo.statics.b += 2
Then, playing with an instance of the bar class yields:
>>> b = bar()
>>> b.foo.statics.a
2
>>> b.foo.statics.b
3
>>> b.foo()
>>> b.foo.statics.a
3
>>> b.foo.statics.b
5
The reason for using this statics dictionary follows jsbueno's answer, which suggests that what I want would require overloading the dot operator of an instance method wrapping the foo function, which I am not sure is possible. Of course, the method's attribute could be set in self.foo.__dict__, but since that is not recommended (as suggested by brainovergrow), I came up with this workaround. I am not certain this would be recommended either and I guess it is up for comments.
Is there a way to programmatically wrap a class object?
Given
class A(object):
    def __init__(self):
        pass  # ...

    def f0(self, a):
        pass  # ...

    def f1(self, a, b):
        pass  # ...

I want another class that wraps an A, such as

class B(object):
    def __init__(self):
        self.a = A()

    def f0(self, a):
        try:
            self.a.f0(a)
        except Exception, ex:
            pass  # ...

    def f1(self, a, b):
        try:
            self.a.f1(a, b)
        except Exception, ex:
            pass  # ...
Is there a way to do create B.f0 & B.f1 by reflection/inspection of class A?
If you want to create class B by calling a function on a predefined class A, you can simply do B = wrap_class(A) with a function wrap_class that looks like this:
import copy
import types  # needed for types.FunctionType below

def wrap_class(cls):
    'Wraps a class so that exceptions in its methods are caught.'
    # The copy is necessary so that mutable class attributes are not
    # shared between the old class cls and the new class:
    new_cls = copy.deepcopy(cls)
    # vars() is used instead of dir() so that the attributes of base classes
    # are not modified, but one might want to use dir() instead:
    for (attr_name, value) in vars(cls).items():
        if isinstance(value, types.FunctionType):
            # func_wrapper() is defined further down in this answer
            setattr(new_cls, attr_name, func_wrapper(value))
    return new_cls

B = wrap_class(A)
As Jürgen pointed out, this creates a copy of the class; this is only needed, however, if you really want to keep your original class A around (like suggested in the original question). If you don't care about A, you can simply decorate it with a wrapper that does not perform any copy, like so:
def wrap_class(cls):
    'Wraps a class so that exceptions in its methods are caught.'
    # vars() is used instead of dir() so that the attributes of base classes
    # are not modified, but one might want to use dir() instead:
    for (attr_name, value) in vars(cls).items():
        if isinstance(value, types.FunctionType):
            setattr(cls, attr_name, func_wrapper(value))
    return cls

@wrap_class
class A(object):
    …  # Original A class, with methods that are not wrapped with exception catching
The decorated class A catches exceptions.
The metaclass version is heavier, but its principle is similar:
import types

def func_wrapper(f):
    'Returns a version of function f that prints an error message if an exception is raised.'
    def wrapped_f(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except Exception, ex:
            print "Function", f, "raised", ex
    return wrapped_f

class ExceptionCatcher(type):
    'Metaclass that wraps methods with func_wrapper().'
    def __new__(meta, cname, bases, cdict):
        # cdict contains the attributes of class cname:
        for (attr_name, value) in cdict.items():
            if isinstance(value, types.FunctionType):  # Various attribute types can be wrapped differently
                cdict[attr_name] = func_wrapper(value)
        return super(ExceptionCatcher, meta).__new__(meta, cname, bases, cdict)

class B(object):
    __metaclass__ = ExceptionCatcher  # ExceptionCatcher will be used for creating class B

    class_attr = 42  # Will not be wrapped

    def __init__(self):
        pass

    def f0(self, a):
        return a*10

    def f1(self, a, b):
        1/0  # Raises a division by zero exception!

# Test:
b = B()
print b.f0(3.14)
print b.class_attr
print b.f1(2, 3)
This prints:
31.4
42
Function <function f1 at 0x107812d70> raised integer division or modulo by zero
None
What you want to do is in fact typically done by a metaclass, which is a class whose instances are classes: this is a way of building the B class dynamically based on its parsed Python code (the code for class A, in the question). More information on this can be found in the nice, short description of metaclasses given in Chris's Wiki (in part 1 and parts 2-4).
Metaclasses are an option, but generally hard to understand, as is too much reflection when it isn't needed in simple cases, because it is easy to catch too many (internal) functions. If the wrapped functions are a stable, known set, and B might gain other functions, you can delegate explicitly function by function and still keep your error handling code in one place:
class B(object):
    def __init__(self):
        a = A()
        self.f0 = errorHandler(a.f0)
        self.f1 = errorHandler(a.f1)
You might do the assignments in a loop if they are many, using getattr/setattr (see the sketch after errorHandler below). The errorHandler function will need to return a function which wraps its argument with error handling code.
def errorHandler(f):
    def wrapped(*args, **kw):
        try:
            return f(*args, **kw)
        except Exception:
            pass  # log or something
    return wrapped
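The loop version mentioned above might look like this (a sketch assuming the delegated method names are known up front):

class B(object):
    def __init__(self):
        a = A()
        for name in ('f0', 'f1'):  # the stable, known set of methods
            setattr(self, name, errorHandler(getattr(a, name)))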
You can also use errorHandler as a decorator on new functions that do not delegate to the A instance:
class B(object):
    ...

    @errorHandler
    def f_new(self):
        ...
This solution keeps B simple and it is quite explicit what's going on.
You could try it old-school with __getattr__:
class B(object):
    def __init__(self):
        self.a = A()

    def __getattr__(self, name):
        a_method = getattr(self.a, name, None)
        if not callable(a_method):
            raise AttributeError("Unknown attribute %r" % name)
        def wrapper(*args, **kwargs):
            try:
                return a_method(*args, **kwargs)
            except Exception, ex:
                pass  # ...
        return wrapper
Or with updating B's dict:
class B(object):
    def __init__(self):
        def make_wrapper(attr):
            # factory so each wrapper binds its own attr; a closure made
            # directly in the loop below would late-bind the loop variable
            def wrapper(*args, **kwargs):
                try:
                    return attr(*args, **kwargs)
                except Exception, ex:
                    pass  # ...
            return wrapper

        a = A()
        for attr_name in dir(a):
            attr = getattr(a, attr_name)
            if callable(attr):
                setattr(self, attr_name, make_wrapper(attr))  # or try self.__dict__[x] = y
I'm trying to decorate a class with another class. I also want to inherit from the decorated class, but I get some errors. Here's my code:
class Decorator:
    def __init__(self, decorated):
        pass

@Decorator
class Foo:
    pass

class Goo(Foo):
    pass
The error I get when I try to subclass from Foo is this:
Traceback (most recent call last):
File "test.py", line 9, in
class Goo(Foo):
TypeError: __init__() takes exactly 2 positional arguments (4 given)
By adding another init function to Decorator...
def __init__(self, *args):
    for arg in args:
        print(arg)
... I get the following output:
<class '__main__.Foo'>
Goo
(<__main__.Decorator object at 0x010073B0>,)
{'__module__': '__main__'}
What are those parameters and how should I be using them inside Decorator?
I'll try to answer the "what are those parameters" question. This code:
@Decorator
class Foo:
    pass

is equivalent to:

class Foo:
    pass

Foo = Decorator(Foo)
This means that Foo ends up being an instance of the Decorator class instead of being a class.
When you try to use this instance as a base of a class (Goo), Python will have to determine a metaclass that will be used to create the new class. In this case it will use Foo.__class__, which equals Decorator. Then it will call the metaclass with (name, bases, dict) arguments and expect it to return a new class.
This is how you end up with these arguments in Decorator.__init__.
More about this can be found here:
http://www.python.org/download/releases/2.2.3/descrintro/#metaclasses
(particularly the "When a class statement is executed..." part)
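To make the two call shapes concrete, a Decorator.__init__ that distinguishes them might look like this (my illustration, not from the linked page); note that Goo would still end up as a Decorator instance rather than a class:

class Decorator:
    def __init__(self, *args):
        if len(args) == 3:
            # invoked as a metaclass: Decorator(name, bases, namespace)
            name, bases, namespace = args
            print("metaclass call:", name, bases, namespace)
        else:
            # invoked as a class decorator: Decorator(decorated_class)
            (self.decorated,) = args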
Are you trying to add a MixIn to a class after the class has been defined?
If so, you could inject the MixIn this way:
def inject_class(mixin):
    def _inject_class(cls):
        return type(cls.__name__, (mixin,) + cls.__bases__, dict(cls.__dict__))
    return _inject_class

class MixIn(object):
    def mix(self):
        print('mix')

@inject_class(MixIn)
class Foo(object):
    def foo(self):
        print('foo')

class Goo(Foo):
    def goo(self):
        print('goo')

goo = Goo()
goo.mix()
goo.foo()
goo.goo()
prints
mix
foo
goo
If you don't want the generality of inject_class, you could make a specialized class decorator which mixes in Decorator only:
def decorate(cls):
    class Decorator(object):
        def deco(self):
            print('deco')
    return type(cls.__name__, (Decorator,) + cls.__bases__, dict(cls.__dict__))

@decorate
class Foo(object):
    def foo(self):
        print('foo')
the result is the same.
I had the same problem and the following solution works for me:
from functools import update_wrapper

class decoratorBase():
    def __new__(cls, logic):
        self = object.__new__(cls)
        self.__init__(logic)

        def new(cls):
            # cls is the decorated class type, not the decorator class type itself
            self._createInstance(cls)
            self._postInstanceCreation()
            return self

        self._logic.__new__ = new
        # return the wrapped class and not a wrapper
        return self._logic

    def __init__(self, logic):
        # logic is the decorated class
        self._logic = logic

    def _createInstance(self, cls):
        self._logicInstance = object.__new__(cls)
        self._logicInstance.__init__()

    def _postInstanceCreation(self):
        pass

class factory(decoratorBase):
    def __init__(self, *largs, **kwargs):
        super().__init__(*largs, **kwargs)
        self.__instance = None

    def _createInstance(self, cls):
        self._logicInstance = None
        self._cls = cls

    def _postInstanceCreation(self):
        update_wrapper(self, self._cls)

    def __call__(self, userData, *largs, **kwargs):
        logicInstance = object.__new__(self._cls)
        logicInstance.__init__(*largs, **kwargs)
        logicInstance._update(userData)
        return logicInstance

class singelton(decoratorBase):
    def _postInstanceCreation(self):
        update_wrapper(self, self._logicInstance)

    def __call__(self, userData):
        self._logicInstance._update(userData)
        return self._logicInstance

class base():
    def __init__(self):
        self.var = 0
        print("Create new object")

    def __call__(self):
        self.var += self._updateValue()

    def _update(self, userData):
        print("Update object static value with {0}".format(userData))
        self.var = userData

@factory
class factoryTestBase(base):
    def __call__(self):
        super().__call__()
        print("I'm a factory, here is the proof: {0}".format(self.var))

    def _updateValue(self):
        return 1

class factoryTestDerived(factoryTestBase):
    def _updateValue(self):
        return 5

@singelton
class singeltonTestBase(base):
    def __call__(self):
        super().__call__()
        print("I'm a singelton, here is the proof: {0}".format(self.var))

    def _updateValue(self):
        return 1

class singeltonTestDerived(singeltonTestBase):
    def _updateValue(self):
        return 5
The magic in this approach is the overloading of the __new__() method, both for the decorator itself and for the "wrapper" which is returned by the decorator. I put the word wrapper in quotes because actually there is no wrapper: instead, the decorated class is altered by the decorator and then returned. Using this scheme, you are able to inherit from a decorated class. The most important thing is the change of the __new__() method of the decorated class, which is made by the following lines:
def new(cls):
    self._createInstance(cls)
    self._postInstanceCreation()
    return self

self._logic.__new__ = new
Using this, you have access to decorator methods like self._createInstance() during the creation of an object from a decorated class. You even have the opportunity to inherit from your decorators (as shown in the example).
Now let's run a simple example:
>>> factoryObjCreater = factoryTestBase()
>>> factoryObj1 = factoryObjCreater(userData = 1)
Create new object
Update object static value with 1
>>> factoryObj2 = factoryObjCreater(userData = 1)
Create new object
Update object static value with 1
>>> factoryObj1()
I'm a factory, here is the proof: 2
>>> factoryObj2()
I'm a factory, here is the proof: 2
>>> factoryObjDerivedCreater = factoryTestDerived()
>>> factoryObjDerived1 = factoryObjDerivedCreater(userData = 2)
Create new object
Update object static value with 2
>>> factoryObjDerived2 = factoryObjDerivedCreater(userData = 2)
Create new object
Update object static value with 2
>>> factoryObjDerived1()
I'm a factory, here is the proof: 7
>>> factoryObjDerived2()
I'm a factory, here is the proof: 7
>>> singeltonObjCreater = singeltonTestBase()
Create new object
>>> singeltonObj1 = singeltonObjCreater(userData = 1)
Update object static value with 1
>>> singeltonObj2 = singeltonObjCreater(userData = 1)
Update object static value with 1
>>> singeltonObj1()
I'm a singelton, here is the proof: 2
>>> singeltonObj2()
I'm a singelton, here is the proof: 3
>>> singeltonObjDerivedCreater = singeltonTestDerived()
Create new object
>>> singeltonObjDerived1 = singeltonObjDerivedCreater(userData = 2)
Update object static value with 2
>>> singeltonObjDerived2 = singeltonObjDerivedCreater(userData = 2)
Update object static value with 2
>>> singeltonObjDerived1()
I'm a singelton, here is the proof: 7
>>> singeltonObjDerived2()
I'm a singelton, here is the proof: 12
>>>