Wrapping all class methods from a foreign package - python

I've been trying for many hours to find a way to implement a class wrapper that works for any given class. All the solutions I've seen either require access to the class definition (the metaclass solution) or completely ignore @staticmethod/@classmethod decorators on the wrapped class/instance methods (the setattr solution, where e.g. the @staticmethod decorator is ignored and self is still passed to that function as the first argument).
How can this be done in Python 2.7?
Take, for example, the foreign class SQLAlchemy from flask_sqlalchemy. The end result would be that I could call another function before executing the requested function call.
This is my failed attempt:
class WithWrapping(object):
    def __init__(self, other_cls, *args, **kwargs):
        self.other = other_cls(*args, **kwargs)
        self.other_cls = other_cls

    def __getattr__(self, attr):
        other_attr = self.other.__getattribute__(attr)
        if callable(other_attr):
            other_attr = getattr(self.other_cls, attr, None)
            if not other_attr:
                other_attr = getattr(self.other, attr, None)

            def wrapper(*args, **kwargs):
                # Do something here prior to the function call
                r = other_attr(*args, **kwargs)
                if r == self.other:
                    return self
                return r
            return wrapper
        else:
            return other_attr
This wouldn't work if you run the following:
from flask_sqlalchemy import SQLAlchemy

db = WithWrapping(SQLAlchemy)

class Foo(db.TypeDecorator):
    pass
What you'd get is:
TypeError                                 Traceback (most recent call last)
<ipython-input-4-9813b186f710> in <module>()
      1 from flask_sqlalchemy import SQLAlchemy
      2 db = WithWrapping(SQLAlchemy)
----> 3 class Foo(db.TypeDecorator):
      4     pass

TypeError: Error when calling the metaclass bases
    function() argument 1 must be code, not str
I tried debugging this, but it's quite hard to follow the many inner calls involved.

I'm not 100% sure what you want to achieve, but you can go through the components of an object or a class and wrap them with your own decorator, using things like dir(), callable() and setattr().
import functools

class A(object):
    def __init__(self):
        pass

    def myop(self, a, b):
        return a + b

def mydec(fn):
    @functools.wraps(fn)
    def wrapper(obj, *args, **kwargs):
        print("called")
        return fn(obj, *args, **kwargs)
    return wrapper

for name in dir(A):
    if not name.startswith("_"):  # probably want to think carefully about this!
        attr = getattr(A, name)
        if callable(attr):
            setattr(A, name, mydec(attr))
results in
>>> a = A()
>>> a.myop(1, 3)
called
4
>>>
Is this similar to what you had in mind?
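To also address the staticmethod/classmethod concern from the question: a possible refinement (a sketch only; wrap_class_methods and trace are names I'm introducing, and I have not tested this against SQLAlchemy) is to inspect the raw class __dict__, which still exposes the staticmethod/classmethod wrapper objects before the descriptor protocol unwraps them:
import functools

def trace(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print("called")  # do something prior to the call
        return fn(*args, **kwargs)
    return wrapper

def wrap_class_methods(cls, dec):
    # cls.__dict__ holds the raw staticmethod/classmethod objects,
    # whereas getattr() would hand back already-unwrapped callables.
    for name, attr in list(cls.__dict__.items()):
        if name.startswith("_"):
            continue
        if isinstance(attr, staticmethod):
            # re-wrap so no self/cls is injected into the call
            setattr(cls, name, staticmethod(dec(attr.__func__)))
        elif isinstance(attr, classmethod):
            setattr(cls, name, classmethod(dec(attr.__func__)))
        elif callable(attr):
            setattr(cls, name, dec(attr))
    return cls
Calling wrap_class_methods(A, trace) decorates plain, static and class methods alike, and re-applying the staticmethod/classmethod wrappers preserves the original calling conventions (this assumes Python 2.7+, where those wrappers expose __func__).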

Related

Dynamic Wrapper in Python

I'm looking to create a dynamic wrapper class that exposes the API calls from a provided object using data in the object.
Statically it looks like this:
class Concrete:
    def __init__(self, data):
        self.data = data

    def print_data(self):
        print(self.data)

class Wrapper:
    '''
    One day this will wrap a variety of objects. But today
    it can only handle Concrete objects.
    '''
    def wrap_it(self, concrete):
        self.cco = concrete  # concrete object = cco

    def print_data(self):
        self.cco.print_data()

cco = Concrete(5)
wcco = Wrapper()
wcco.wrap_it(cco)
wcco.print_data()
Produces
5
I'd like to figure out how to do the same thing but make wrap_it dynamic. It should search the concrete object, find its functions, and create functions of the same name that call the corresponding function on the concrete object. I imagine that the solution involves inspect.signature or at least some use of *args and **kwargs, but I've not seen an example of how to put all this together.
You can use the __getattr__ magic method to hook getting undefined attributes, and forward them to the concrete object:
class DynamicWrapper():
    def wrap_it(self, concrete):
        self.cco = concrete

    def __getattr__(self, k):
        def wrapper(*args, **kwargs):
            print(f'DynamicWrapper calling {k} with args {args} {kwargs}')
            return getattr(self.cco, k)(*args, **kwargs)
        if hasattr(self.cco, k):
            return wrapper
        else:
            raise AttributeError(f'No such field/method: {k}')
cco = Concrete(5)
dwcco = DynamicWrapper()
dwcco.wrap_it(cco)
dwcco.print_data()
Use the dir() function to get the attributes of the given object, check if they are callable and assign them to your wrapper, like this:
class Wrapper:
    def wrap_it(self, objToWrap):
        for attr in dir(objToWrap):
            if not attr.startswith('__') and callable(getattr(objToWrap, attr)):
                exec('self.%s = objToWrap.%s' % (attr, attr))
And now, for testing.
>>> cco = Concrete(5)
>>> wcco = Wrapper()
>>> wcco.wrap_it(cco)
>>> wcco.print_data()
5
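As an aside, the same loop can be written without exec by fetching the bound method with getattr and attaching it with setattr; a minimal sketch of that variant:
class Wrapper:
    def wrap_it(self, objToWrap):
        for attr in dir(objToWrap):
            if not attr.startswith('__') and callable(getattr(objToWrap, attr)):
                # bound methods can be assigned directly onto the wrapper
                setattr(self, attr, getattr(objToWrap, attr))
This avoids evaluating generated source strings, which is both faster and safer.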

How can I delay the __init__ call until an attribute is accessed?

I have a test framework that requires test cases to be defined using the following class patterns:
class TestBase:
    def __init__(self, params):
        self.name = str(self.__class__)
        print('initializing test: {} with params: {}'.format(self.name, params))

class TestCase1(TestBase):
    def run(self):
        print('running test: ' + self.name)
When I create and run a test, I get the following:
>>> test1 = TestCase1('test 1 params')
initializing test: <class '__main__.TestCase1'> with params: test 1 params
>>> test1.run()
running test: <class '__main__.TestCase1'>
The test framework searches for and loads all TestCase classes it can find, instantiates each one, then calls the run method for each test.
load_test(TestCase1(test_params1))
load_test(TestCase2(test_params2))
...
load_test(TestCaseN(test_params3))
...
for test in loaded_tests:
    test.run()
However, I now have some test cases for which I don't want the __init__ method called until the time that the run method is called, but I have little control over the framework structure or methods. How can I delay the call to __init__ without redefining the __init__ or run methods?
Update
The speculations that this originated as an XY problem are correct. A coworker asked me this question a while back when I was maintaining said test framework. I inquired further about what he was really trying to achieve and we figured out a simpler workaround that didn't involve changing the framework or introducing metaclasses, etc.
However, I still think this is a question worth investigating: if I wanted to create new objects with "lazy" initialization ("lazy" as in lazy evaluation generators such as range, etc.) what would be the best way of accomplishing it? My best attempt so far is listed below, I'm interested in knowing if there's anything simpler or less verbose.
First solution: use property, the elegant way to write setters and getters in Python.
class Bars(object):
    def __init__(self):
        self._foo = None

    @property
    def foo(self):
        if not self._foo:
            print("lazy initialization")
            self._foo = [1, 2, 3]
        return self._foo

if __name__ == "__main__":
    f = Bars()
    print(f.foo)
    print(f.foo)
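Running this prints the initialization message only once; the second access returns the cached list directly:
lazy initialization
[1, 2, 3]
[1, 2, 3]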
Second solution: the proxy solution, usually implemented with a decorator.
In short, a proxy is a wrapper that wraps the object you need. A proxy can provide additional functionality to the object it wraps without changing the object's code; it's a surrogate that provides the ability to control access to an object. The following code comes from user Cyclone.
class LazyProperty:
    def __init__(self, method):
        self.method = method
        self.method_name = method.__name__

    def __get__(self, obj, cls):
        if not obj:
            return None
        value = self.method(obj)
        print('value {}'.format(value))
        setattr(obj, self.method_name, value)
        return value

class test:
    def __init__(self):
        self._resource = None

    @LazyProperty
    def resource(self):
        print("lazy")
        self._resource = tuple(range(5))
        return self._resource

if __name__ == '__main__':
    t = test()
    print(t.resource)
    print(t.resource)
    print(t.resource)
This is meant for truly compute-once lazy properties. I like it because it avoids sticking extra attributes on objects, and once activated it does not waste time checking for attribute presence.
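You can see the bypass directly: after the first access, the computed value sits in the instance dictionary under the same name, shadowing the descriptor:
>>> t = test()
>>> t.resource
lazy
value (0, 1, 2, 3, 4)
(0, 1, 2, 3, 4)
>>> 'resource' in t.__dict__  # the descriptor is now shadowed
True
>>> t.resource                # no more "lazy" output
(0, 1, 2, 3, 4)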
Metaclass option
You can intercept the call to __init__ using a metaclass. Create the object with __new__ and overwrite the __getattribute__ method to check if __init__ has been called or not and call it if it hasn't.
class DelayInit(type):
    def __call__(cls, *args, **kwargs):
        def init_before_get(obj, attr):
            if not object.__getattribute__(obj, '_initialized'):
                # Set the flag *before* calling __init__: the obj.__init__
                # lookup below goes through this very method, so leaving
                # the flag False would recurse forever.
                obj._initialized = True
                obj.__init__(*args, **kwargs)
            return object.__getattribute__(obj, attr)
        cls.__getattribute__ = init_before_get
        new_obj = cls.__new__(cls, *args, **kwargs)
        new_obj._initialized = False
        return new_obj

class TestDelayed(TestCase1, metaclass=DelayInit):
    pass
In the example below, you'll see that the init print won't occur until the run method is executed.
>>> new_test = TestDelayed('delayed test params')
>>> new_test.run()
initializing test: <class '__main__.TestDelayed'> with params: delayed test params
running test: <class '__main__.TestDelayed'>
Decorator option
You could also use a decorator that has a similar pattern to the metaclass above:
def delayinit(cls):
    def init_before_get(obj, attr):
        if not object.__getattribute__(obj, '_initialized'):
            # As above, mark as initialized first so the attribute
            # lookups below do not re-enter this function.
            obj._initialized = True
            obj.__init__(*obj._init_args, **obj._init_kwargs)
        return object.__getattribute__(obj, attr)
    cls.__getattribute__ = init_before_get

    def construct(*args, **kwargs):
        obj = cls.__new__(cls, *args, **kwargs)
        obj._init_args = args
        obj._init_kwargs = kwargs
        obj._initialized = False
        return obj
    return construct

@delayinit
class TestDelayed(TestCase1):
    pass
This will behave identically to the example above.
In Python, there is no way that you can avoid calling __init__ when you instantiate a class cls. If calling cls(args) returns an instance of cls, then the language guarantees that cls.__init__ will have been called.
So the only way to achieve something similar to what you are asking is to introduce another class that will postpone the calling of __init__ in the original class until an attribute of the instantiated class is being accessed.
Here is one way:
def delay_init(cls):
class Delay(cls):
def __init__(self, *arg, **kwarg):
self._arg = arg
self._kwarg = kwarg
def __getattribute__(self, name):
self.__class__ = cls
arg = self._arg
kwarg = self._kwarg
del self._arg
del self._kwarg
self.__init__(*arg, **kwarg)
return getattr(self, name)
return Delay
This wrapper function works by catching any attempt to access an attribute of the instantiated class. When such an attempt is made, it changes the instance's __class__ to the original class, calls the original __init__ method with the arguments that were used when the instance was created, and then returns the requested attribute. The function can be used as a decorator for your TestCase1 class:
class TestBase:
    def __init__(self, params):
        self.name = str(self.__class__)
        print('initializing test: {} with params: {}'.format(self.name, params))

class TestCase1(TestBase):
    def run(self):
        print('running test: ' + self.name)
>>> t1 = TestCase1("No delay")
initializing test: <class '__main__.TestCase1'> with params: No delay
>>> t2 = delay_init(TestCase1)("Delayed init")
>>> t1.run()
running test: <class '__main__.TestCase1'>
>>> t2.run()
initializing test: <class '__main__.TestCase1'> with params: Delayed init
running test: <class '__main__.TestCase1'>
>>>
Be careful where you apply this function though. If you decorate TestBase with delay_init, it will not work, because it will turn the TestCase1 instances into TestBase instances.
In my answer I'd like to focus on cases where the initialiser (dunder init) has side effects. For instance, pysftp.Connection creates an SSH connection, which may be undesirable until it's actually used.
In a great blog series about conceiving the wrapt package (a nit-picky decorator implementation), the author describes the transparent object proxy. That code can be customised for the problem in question.
class LazyObject:
    _factory = None
    '''Callable responsible for creation of target object'''

    _object = None
    '''Target object created lazily'''

    def __init__(self, factory):
        self._factory = factory

    def __getattr__(self, name):
        if self._object is None:  # "is None", so a falsy target is not rebuilt
            self._object = self._factory()
        return getattr(self._object, name)
Then it can be used as:
obj = LazyObject(lambda: dict(foo = 'bar'))
obj.keys() # dict_keys(['foo'])
But len(obj), obj['foo'] and other language constructs that invoke the Python object protocols (dunder methods like __len__ and __getitem__) will not work. However, for the many cases that are limited to regular methods, this is a solution.
To proxy the object-protocol implementations generically, neither __getattr__ nor __getattribute__ can be used. The documentation for the latter notes:
This method may still be bypassed when looking up special methods as the result of implicit invocation via language syntax or built-in functions. See Special method lookup.
Since a complete solution is sometimes demanded, there are manual implementations such as werkzeug's LocalProxy and django's SimpleLazyObject. However, a clever workaround is possible.
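To illustrate what "manual" means here, a sketch (LazyObjectWithLen is my naming, not from any library) that forwards a single dunder through the LazyObject above; every protocol method you care about needs the same explicit treatment:
class LazyObjectWithLen(LazyObject):
    def __len__(self):
        # Special methods are looked up on the type, bypassing
        # __getattr__, so each one must be forwarded by hand.
        if self._object is None:
            self._object = self._factory()
        return len(self._object)
With this, len(LazyObjectWithLen(lambda: dict(foo='bar'))) returns 1, while obj['foo'] still fails until __getitem__ gets the same treatment.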
Luckily there's a dedicated package (based on wrapt) for the exact use case, lazy-object-proxy which is described in this blog post.
from lazy_object_proxy import Proxy

obj = Proxy(lambda: dict(foo = 'bar'))
obj.keys()  # dict_keys(['foo'])
len(obj)  # 1
obj['foo']  # 'bar'
One alternative would be to write a wrapper that takes a class as input and returns a class whose initialization is delayed until any member is accessed. This could, for example, be done like this:
def lazy_init(cls):
    class LazyInit(cls):
        def __init__(self, *args, **kwargs):
            self.args = args
            self.kwargs = kwargs
            self._initialized = False

        def __getattr__(self, attr):
            if not self.__dict__['_initialized']:
                cls.__init__(self,
                             *self.__dict__['args'], **self.__dict__['kwargs'])
                self._initialized = True
            return self.__dict__[attr]
    return LazyInit
This could then be used as such
load_test(lazy_init(TestCase1)(test_params1))
load_test(lazy_init(TestCase2)(test_params2))
...
load_test(lazy_init(TestCaseN)(test_params3))
...
for test in loaded_tests:
    test.run()
Answering your original question (and the problem I think you are actually trying to solve), "How can I delay the init call until an attribute is accessed?": don't call init until you access the attribute.
Said another way: you can make the class initialization coincide with the attribute call. What you seem to actually want is to 1) create a collection of TestCase# classes along with their associated parameters, and 2) run each test case.
Probably your original problem came from thinking you had to initialize all your TestCase classes up front in order to build a list of them to iterate over. But in fact you can store class objects themselves in lists, dicts, etc. That means you can use whatever method you have for finding all TestCase classes, store those class objects in a dict with their relevant parameters, and then iterate over that dict, instantiating each class and calling its run() method.
It might look like:
tests = {TestCase1: 'test 1 params', TestCase2: 'test 2 params', TestCase3: 'test 3 params'}
for test_case, param in tests.items():
    test_case(param).run()
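One caveat: dict keys are unique, so the layout above cannot run the same TestCase class twice with different parameters. A list of (class, params) pairs lifts that restriction:
tests = [(TestCase1, 'test 1 params'), (TestCase1, 'other params'), (TestCase2, 'test 2 params')]
for test_case, param in tests:
    test_case(param).run()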
Overriding __new__
You could do this by overriding the __new__ method and replacing the __init__ method with a custom function.
def init(cls, real_init):
    def wrapped(self, *args, **kwargs):
        # This will run during the first call to `__init__`
        # made after `__new__`. Here we re-assign the original
        # __init__ back to the class and assign a custom function
        # to the instance's `__init__`.
        cls.__init__ = real_init

        def new_init():
            if new_init.called is False:
                real_init(self, *args, **kwargs)
                new_init.called = True
        new_init.called = False
        self.__init__ = new_init
    return wrapped

class DelayInitMixin(object):
    def __new__(cls, *args, **kwargs):
        cls.__init__ = init(cls, cls.__init__)
        return object.__new__(cls)

class A(DelayInitMixin):
    def __init__(self, a, b):
        print('inside __init__')
        self.a = sum(a)
        self.b = sum(b)

    def __getattribute__(self, attr):
        init = object.__getattribute__(self, '__init__')
        if not init.called:
            init()
        return object.__getattribute__(self, attr)

    def run(self):
        pass

    def fun(self):
        pass
Demo:
>>> a = A(range(1000), range(10000))
>>> a.run()
inside __init__
>>> a.a, a.b
(499500, 49995000)
>>> a.run(), a.__init__()
(None, None)
>>> b = A(range(100), range(10000))
>>> b.a, b.b
inside __init__
(4950, 49995000)
>>> b.run(), b.__init__()
(None, None)
Using cached properties
The idea is to do the heavy calculation only once by caching results. This approach will lead to much more readable code if the whole point of delaying initialization is improving performance.
Django comes with a nice decorator called @cached_property. I tend to use it a lot in both code and unit tests for caching the results of heavy properties.
A cached_property is a non-data descriptor. Hence, once the key is set in the instance's dictionary, access to the property will always get the value from there.
class cached_property(object):
    """
    Decorator that converts a method with a single self argument into a
    property cached on the instance.

    Optional ``name`` argument allows you to make cached properties of other
    methods. (e.g. url = cached_property(get_absolute_url, name='url') )
    """
    def __init__(self, func, name=None):
        self.func = func
        self.__doc__ = getattr(func, '__doc__')
        self.name = name or func.__name__

    def __get__(self, instance, cls=None):
        if instance is None:
            return self
        res = instance.__dict__[self.name] = self.func(instance)
        return res
Usage:
class A:
    @cached_property
    def a(self):
        print('calculating a')
        return sum(range(1000))

    @cached_property
    def b(self):
        print('calculating b')
        return sum(range(10000))
Demo:
>>> a = A()
>>> a.a
calculating a
499500
>>> a.b
calculating b
49995000
>>> a.a, a.b
(499500, 49995000)
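For what it's worth, Python 3.8 added an equivalent decorator to the standard library, so on modern Python the hand-rolled descriptor above is unnecessary:
from functools import cached_property  # standard library since Python 3.8

class A:
    @cached_property
    def a(self):
        print('calculating a')
        return sum(range(1000))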
I think you can use a wrapper class to hold the real class you want to instantiate, and call __init__ yourself in your code, like this (Python 3 code):
class Wrapper:
    def __init__(self, cls):
        self.cls = cls
        self.instance = None

    def your_method(self, *args, **kwargs):
        if not self.instance:
            self.instance = self.cls()  # the real __init__ runs here
        return self.instance(*args, **kwargs)

class YourClass:
    def __init__(self):
        print("calling __init__")
It's a clumsy way, but it works without any tricks.

python equivalent of functools 'partial' for a class / constructor

I want to create a class that behaves like collections.defaultdict, without having the usage code specify the factory. EG:
instead of
class Config(collections.defaultdict):
    pass
this:
Config = functools.partial(collections.defaultdict, list)
This almost works, but
isinstance(Config(), Config)
fails. I am betting this clue means there are more devious problems deeper in also. So is there a way to actually achieve this?
I also tried:
class Config(object):
    __init__ = functools.partial(collections.defaultdict, list)
I don't think there's a standard method to do it, but if you need it often, you can just put together your own small function:
import functools
import collections

def partialclass(cls, *args, **kwds):
    class NewCls(cls):
        __init__ = functools.partialmethod(cls.__init__, *args, **kwds)
    return NewCls

if __name__ == '__main__':
    Config = partialclass(collections.defaultdict, list)
    assert isinstance(Config(), Config)
I had a similar problem but also required instances of my partially applied class to be pickle-able. I thought I would share what I ended up with.
I adapted fjarri's answer by peeking at Python's own collections.namedtuple. The below function creates a named subclass that can be pickled.
from functools import partialmethod
import sys
def partialclass(name, cls, *args, **kwds):
    new_cls = type(name, (cls,), {
        '__init__': partialmethod(cls.__init__, *args, **kwds)
    })

    # The following is copied nearly verbatim from `namedtuple`'s source.
    #
    # For pickling to work, the __module__ variable needs to be set to the frame
    # where the named tuple is created. Bypass this step in environments where
    # sys._getframe is not defined (Jython for example) or sys._getframe is not
    # defined for arguments greater than 0 (IronPython).
    try:
        new_cls.__module__ = sys._getframe(1).f_globals.get('__name__', '__main__')
    except (AttributeError, ValueError):
        pass

    return new_cls
At least in Python 3.8.5 it just works with functools.partial:
import functools

class Test:
    def __init__(self, foo):
        self.foo = foo

PartialClass = functools.partial(Test, 1)

instance = PartialClass()
instance.foo
If you actually need working explicit type checks via isinstance, you can simply create a not too trivial subclass:
class Config(collections.defaultdict):
    def __init__(self):  # no arguments here
        # call the defaultdict init with the list factory
        super(Config, self).__init__(list)
You'll have no-argument construction with the list factory and
isinstance(Config(), Config)
will work as well.
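A quick check of both properties:
>>> cfg = Config()
>>> cfg['missing'].append(1)  # the list factory kicks in
>>> cfg['missing']
[1]
>>> isinstance(cfg, Config)
True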
Could use *args and **kwargs:
class Foo:
    def __init__(self, a, b):
        self.a = a
        self.b = b

    def printy(self):
        print("a:", self.a, ", b:", self.b)

class Bar(Foo):
    def __init__(self, *args, **kwargs):
        return super().__init__(*args, b=123, **kwargs)

if __name__ == "__main__":
    bar = Bar(1)
    bar.printy()  # Prints: "a: 1 , b: 123"

Decorated class loses access to its attributes

I implemented a decorator that worked like a charm until I added class attributes to the decorated class. When I instantiate the class, it cannot access the class attributes. Take the following minimal working example:
from module import specialfunction
class NumericalMathFunctionDecorator:
    def __init__(self, enableCache=True):
        self.enableCache = enableCache

    def __call__(self, wrapper):
        def numericalmathfunction(*args, **kwargs):
            func = specialfunction(wrapper(*args, **kwargs))
            """
            Do some setup to func with decorator arguments (e.g. enableCache)
            """
        return numericalmathfunction

@NumericalMathFunctionDecorator(enableCache=True)
class Wrapper:
    places = ['home', 'office']
    configs = {
        'home':
        {
            'attr1': 'path/at/home',
            'attr2': 'jhdlt'
        },
        'office':
        {
            'attr1': 'path/at/office',
            'attr2': 'sfgqs'
        }
    }

    def __init__(self, where='home'):
        # Look for setup configuration in 'Wrapper.configs[where]'.
        assert where in Wrapper.places, "Only valid places are {}".format(Wrapper.places)
        self.__dict__.update(Wrapper.configs[where])

    def __call__(self, X):
        """Do stuff with X and return the result
        """
        return X ** 2

model = Wrapper()  # 1
When I instantiate the Wrapper class (#1), I get the following error:
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-5-a99bd3d544a3> in <module>()
     15         assert where in Wrapper.places, "Only valid places are {}".format(Wrapper.places)
     16
---> 17 model = Wrapper()

<ipython-input-5-a99bd3d544a3> in numericalmathfunction(*args, **kwargs)
      5     def __call__(self, wrapper):
      6         def numericalmathfunction(*args, **kwargs):
----> 7             func = wrapper(*args, **kwargs)
      8         return numericalmathfunction
      9

<ipython-input-5-a99bd3d544a3> in __init__(self, where)
     13     def __init__(self, where='home'):
     14         # Look for setup configuration on 'Wrapper.configs[where]'.
---> 15         assert where in Wrapper.places, "Only valid places are {}".format(Wrapper.places)
     16
     17 model = Wrapper()

AttributeError: 'function' object has no attribute 'places'
I guess that with the decorator, Wrapper becomes a function that loses access to the class attributes...
Any ideas on how I can solve this? Maybe there is a workaround.
You replaced Wrapper (which was a class) with the numericalmathfunction function object. That object doesn't have any of the class attributes, no.
In essence, the decorator does this:
class Wrapper:
    # ...

Wrapper = NumericalMathFunctionDecorator(enableCache=True)(Wrapper)
so whatever the NumericalMathFunctionDecorator.__call__ method returns has now replaced the class; all references to Wrapper now reference that return value. And when you use the name Wrapper in the __init__ method, you are referencing that global, not the original class.
You can still access the current class with type(self), or just reference those attributes via self (where the name lookup falls through to the class):
def __init__(self, where='home'):
    # Look for setup configuration in 'self.configs[where]'.
    assert where in self.places, "Only valid places are {}".format(self.places)
    self.__dict__.update(self.configs[where])
or
def __init__(self, where='home'):
    # Look for setup configuration via the actual class.
    cls = type(self)
    assert where in cls.places, "Only valid places are {}".format(cls.places)
    self.__dict__.update(cls.configs[where])
In both cases you can end up referencing an attribute on a subclass if you ever subclass Wrapper (which you cannot do in this case anyway, as you would have to fish the class out of the decorator closure).
Alternatively, you could store the original class as an attribute on the returned function:
def __call__(self, wrapper):
    def numericalmathfunction(*args, **kwargs):
        func = specialfunction(wrapper(*args, **kwargs))
        """
        Do some setup to func with decorator arguments (e.g. enableCache)
        """
    numericalmathfunction.__wrapped__ = wrapper
    return numericalmathfunction
then use that reference in your __init__:
def __init__(self, where='home'):
    # Walk back to the original class through the decorator layers.
    cls = Wrapper
    while hasattr(cls, '__wrapped__'):
        # remove any decorator layers to get to the original
        cls = cls.__wrapped__
    assert where in cls.places, "Only valid places are {}".format(cls.places)
    self.__dict__.update(cls.configs[where])
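As a side note (my addition, not part of the original answer): functools.update_wrapper, and therefore functools.wraps, already sets a __wrapped__ attribute on the wrapper, so the same unwrapping loop works if the decorator is written as:
import functools

def __call__(self, wrapper):
    @functools.wraps(wrapper, updated=[])  # sets numericalmathfunction.__wrapped__ = wrapper
    def numericalmathfunction(*args, **kwargs):
        func = specialfunction(wrapper(*args, **kwargs))
    return numericalmathfunction
The updated=[] argument stops wraps from trying to merge the class's __dict__ into the function's.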

add a decorated function to a class

I have a decorated function (simplified version):
class Memoize:
    def __init__(self, function):
        self.function = function
        self.memoized = {}

    def __call__(self, *args, **kwds):
        hash = args
        try:
            return self.memoized[hash]
        except KeyError:
            self.memoized[hash] = self.function(*args)
            return self.memoized[hash]

@Memoize
def _DrawPlot(self, options):
    pass  # do something...
now I want to add this method to a pre-existing class:
ROOT.TChain.DrawPlot = _DrawPlot
when I call this method:
chain = TChain()
chain.DrawPlot(opts)
I got:
self.memoized[hash] = self.function(*args)
TypeError: _DrawPlot() takes exactly 2 arguments (1 given)
why doesn't it propagate self?
The problem is that you have defined your own callable class and then tried to use it as a method. When you use a function as an attribute, accessing the function calls its __get__ method to return something other than the function itself: the bound method. When you use your own class without defining __get__, the attribute access just returns your instance, without implicitly passing self.
Descriptors are explained on http://docs.python.org/reference/datamodel.html#descriptors if you are not familiar with them. The __get__, __set__, and __delete__ methods change how interacting with your object as an attribute works.
You could implement memoize as a function and use the built-in __get__ magic that functions already have:
import functools

def memoize(f):
    @functools.wraps(f)
    def memoized(*args, _cache={}):
        # This abuses the normally-unwanted behaviour of mutable default arguments.
        if args not in _cache:
            _cache[args] = f(*args)
        return _cache[args]
    return memoized
or by modifying your class along the lines of
import functools

class Memoize(object):  # inherit object
    def __init__(self, function):
        self.function = function
        self.memoized = {}

    def __call__(self, *args):  # don't accept kwargs you don't want.
        # I removed "hash = args" because it shadowed a builtin function and
        # because it was untrue: it wasn't a hash, it was something you
        # intended for Python to hash for you.
        try:
            return self.memoized[args]
        except KeyError:
            self.memoized[args] = self.function(*args)
            return self.memoized[args]

    def __get__(self, obj, type):
        if obj is None:  # We looked up on the class
            return self
        return functools.partial(self, obj)
Note that both of these choke if any of the arguments you pass in are mutable (well, unhashable technically). This might be suitable for your case, but you may also want to deal with the case where args is unhashable.
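If you do need to tolerate unhashable arguments, one sketch (my addition, not from the original answer) falls back to an uncached call whenever hashing fails:
import functools

def memoize(f):
    cache = {}
    @functools.wraps(f)
    def memoized(*args):
        try:
            return cache[args]
        except TypeError:
            # an argument is unhashable: skip the cache entirely
            return f(*args)
        except KeyError:
            cache[args] = f(*args)
            return cache[args]
    return memoized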
