Essentially this is what I want to accomplish:
class Move(object):
    def __init__(self, Attr):
        if Attr:
            self.attr = Attr
        if hasattr(self, "attr"):
            __call__ = self.hasTheAttr
        else:
            __call__ = self.hasNoAttr

    def hasNoAttr(self):
        #no args!

    def hasTheAttr(func, arg1, arg2):
        #do things with the args

    __call__ = hasNoAttr
I know that doesn't work; it just uses hasNoAttr all the time. My first thought was to use a decorator, but I'm not all that familiar with them, and I couldn't figure out how to base one on whether or not a class attribute exists.
Actual question part: how would I be able to deterministically make a function either x function or y function, depending on a condition?
You can't really do this sort of thing with __call__: with other (non-magic) methods, you can just monkey-patch them on the instance, but magic methods are looked up on the class rather than the instance, so with __call__ and other magic methods you need to delegate to the appropriate method within the magic method itself:
class Move(object):
    def __init__(self, Attr):
        if Attr:
            self.attr = Attr
        if hasattr(self, "attr"):
            self._func = self.hasTheAttr
        else:
            self._func = self.hasNoAttr

    def hasNoAttr(self):
        #no args!

    def hasTheAttr(func, arg1, arg2):
        #do things with the args

    def __call__(self, *args):
        return self._func(*args)
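A runnable sketch of the same delegation pattern, with the placeholder bodies filled in with illustrative return values so it executes:

```python
class Move(object):
    def __init__(self, attr=None):
        if attr:
            self.attr = attr
        # Pick the implementation once, at construction time.
        if hasattr(self, "attr"):
            self._func = self.has_the_attr
        else:
            self._func = self.has_no_attr

    def has_no_attr(self):
        return "no attr"

    def has_the_attr(self, arg1, arg2):
        return ("attr", self.attr, arg1, arg2)

    def __call__(self, *args):
        # __call__ is found on the class, so delegate from here.
        return self._func(*args)

print(Move()())            # -> no attr
print(Move("jump")(1, 2))  # -> ('attr', 'jump', 1, 2)
```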
I am attempting to wrap a class from a third-party package in such a way that my new class looks exactly like a subclass of the third-party class. The third-party class does not support inheritance, and it has nontrivial features, such as functions that have a __getitem__ method. I can wrap almost every attribute and method using a solution based on Wrapping a class whose methods return instances of that class and How can I intercept calls to python's "magic" methods in new style classes?. However, I still need to override the __init__ method of the third-party class. How can I do that? Note: I am using new-style classes.
Code so far:
import copy

class WrapperMetaclass(type):
    """
    Works with the `Wrapper` class to create proxies for the wrapped object's magic methods.
    """
    def __init__(cls, name, bases, dct):
        def make_proxy(name):
            def proxy(self, *args):
                return getattr(self._obj, name)
            return proxy
        type.__init__(cls, name, bases, dct)
        if cls.__wraps__:
            ignore = set("__%s__" % n for n in cls.__ignore__.split())
            for name in dir(cls.__wraps__):
                if name.startswith("__"):
                    if name not in ignore and name not in dct:
                        setattr(cls, name, property(make_proxy(name)))
class Wrapper(object):
    """
    Used to provide a (nearly) seamless inheritance-like interface for classes that do not support direct inheritance.
    """
    __metaclass__ = WrapperMetaclass
    __wraps__ = None
    # note that the __init__ method will be ignored by WrapperMetaclass
    __ignore__ = "class mro new init setattr getattr getattribute dict"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)
    def __getattr__(self, name):
        orig_attr = self._obj.__getattribute__(name)
        if callable(orig_attr) and not hasattr(orig_attr, '__getitem__'):
            def hooked(*args, **kwargs):
                result = orig_attr(*args, **kwargs)
                if result is self._obj:
                    return self
                elif isinstance(result, self.__wraps__):
                    return self.__class__(result)
                else:
                    return result
            return hooked
        else:
            return orig_attr

    def __setattr__(self, attr, val):
        object.__setattr__(self, attr, val)
        if getattr(self._obj, attr, self._obj) is not self._obj:  # update _obj's member if it exists
            setattr(self._obj, attr, getattr(self, attr))
class ClassToWrap(object):
    def __init__(self, data):
        self.data = data

    def theirfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

    def __str__(self):
        return str(self.data)

class Wrapped(Wrapper):
    __wraps__ = ClassToWrap

    def myfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

# can't instantiate Wrapped directly! This is the problem!
obj = ClassToWrap(0)
wr0 = Wrapped(obj)
print wr0
>> 0
print wr0.theirfun()
>> 1
This works, but for truly seamless inheritance-like behavior, I need to instantiate Wrapped directly, e.g.
wr0 = Wrapped(0)
which currently throws
ValueError: wrapped object must be of <class '__main__.ClassToWrap'>
I attempted to override by defining a new proxy for __init__ in WrapperMetaclass, but rapidly ran into infinite recursions.
My codebase is complex with users at different skill levels, so I can't afford to use monkey-patching or solutions that modify the definition of the example classes ClassToWrap or Wrapped. I am really hoping for an extension to the code above that overrides Wrapped.__init__.
Please note that this question is not simply a duplicate of e.g. Can I exactly mimic inheritance behavior with delegation by composition in Python?. That post does not have any answer that is nearly as detailed as what I'm already providing here.
It sounds like you just want the Wrapper.__init__ method to work differently than it currently does. Rather than taking an already existing instance of the __wraps__ class, it should take the arguments that the other class expects in its constructor and build the instance for you. Try something like this:
def __init__(self, *args, **kwargs):
    if self.__wraps__ is None:
        raise TypeError("base class Wrapper may not be instantiated")
    else:
        self._obj = self.__wraps__(*args, **kwargs)
If you want Wrapper to remain the same for some reason, you could put the logic in a new Wrapped.__init__ method instead:
def __init__(self, data):                               # I'm explicitly naming the argument here, but you could use *args
    super(Wrapped, self).__init__(self.__wraps__(data)) # and **kwargs to make it extensible
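To see the idea in isolation, here is a minimal, runnable sketch of the revised constructor. This is a trimmed-down stand-in, not the full metaclass-based Wrapper from the question; the magic-method proxying is omitted and only plain attribute delegation is shown:

```python
class Wrapper(object):
    """Minimal sketch: build the wrapped instance from the constructor args."""
    __wraps__ = None

    def __init__(self, *args, **kwargs):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        # Construct the wrapped object instead of requiring one to be passed in.
        object.__setattr__(self, "_obj", self.__wraps__(*args, **kwargs))

    def __getattr__(self, name):
        # Called only when normal lookup fails; delegate to the wrapped object.
        return getattr(self._obj, name)

class ClassToWrap(object):
    def __init__(self, data):
        self.data = data

class Wrapped(Wrapper):
    __wraps__ = ClassToWrap

wr0 = Wrapped(0)  # now instantiable directly, no pre-built ClassToWrap needed
print(wr0.data)   # -> 0
```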
Disclaimer:
This article is more a recipe than a question, but I found the subject quite interesting, with almost no references on the Web.
If there is a better place on StackOverflow to publish this kind of article, please let me know.
Subject:
How can I force Python to invoke a different function depending on whether an attribute is accessed through the class or through an instance, i.e. invoke different methods for MyClass.my_method() and MyClass().my_method()?
Use case:
Let's say we have a custom Enum implementation (based on the Python 3.6 Enum, but with some customization). As a user of this Enum, we want to create a CustomEnum that inherits not just from Enum but also from str: class MyEnum(str, Enum). We also want to add an encoding and decoding feature. Our idea is to use MyEnum.encode to encode any object that includes our enum members, but leave the original str.encode in power for instances of our enum class.
In short: MyEnum.encode should invoke our custom encoding function, which makes perfect sense from this point of view. A MyEnum() instance is a string, so MyEnum().encode should invoke the encode method inherited from str.
Solution:
Write a descriptor, which will work as a switch.
Full answer in my first post.
Solution:
As far as I know, descriptors are the only objects that can distinguish whether they are invoked on a class or on an instance, because of the __get__ signature: __get__(self, instance, instance_type). This property allows us to build a switch on top of it.
from types import FunctionType

class boundmethod(object):
    def __init__(self, cls_method=None, instance_method=None, doc=None):
        self._cls_method = cls_method
        self._instance_method = instance_method
        if cls_method:
            self._method_name = cls_method.__name__
        elif instance_method:
            self._method_name = instance_method.__name__
        if doc is None and cls_method is not None:
            doc = cls_method.__doc__
        self.__doc__ = doc
        self._method = None
        self._object = None

    def _find_method(self, instance, instance_type, method_name):
        for base in instance_type.mro()[1:]:
            method = getattr(base, method_name, None)
            if _is_descriptor(method):  # helper, as in the enum module internals
                method = method.__get__(instance, base)
            if method and method is not self:
                try:
                    return method.__func__
                except AttributeError:
                    return method

    def __get__(self, instance, instance_type):
        if instance is None:
            self._method = self._cls_method or self._find_method(instance, instance_type, self._method_name)
            self._object = instance_type
        else:
            self._method = self._instance_method or self._find_method(instance, instance_type, self._method_name)
            self._object = instance
        return self

    @staticmethod
    def cls_method(obj=None):
        def constructor(cls_method):
            if obj is None:
                return boundmethod(cls_method, None, cls_method.__doc__)
            else:
                return type(obj)(cls_method, obj._instance_method, obj.__doc__)
        if isinstance(obj, FunctionType):
            return boundmethod(obj, None, obj.__doc__)
        else:
            return constructor

    @staticmethod
    def instance_method(obj=None):
        def constructor(instance_method):
            if obj is None:
                return boundmethod(None, instance_method, instance_method.__doc__)
            else:
                return type(obj)(obj._cls_method, instance_method, obj.__doc__)
        if isinstance(obj, FunctionType):
            return boundmethod(None, obj, obj.__doc__)
        else:
            return constructor

    def __call__(self, *args, **kwargs):
        if self._method:
            try:
                return self._method(self._object, *args, **kwargs)
            except TypeError:
                return self._method(*args, **kwargs)
        return None
Example:
>>> class Walkmen(object):
... #boundmethod.cls_method
... def start(self):
... return 'Walkmen start class bound method'
... #boundmethod.instance_method(start)
... def start(self):
... return 'Walkmen start instance bound method'
>>> print Walkmen.start()
Walkmen start class bound method
>>> print Walkmen().start()
Walkmen start instance bound method
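The core idea can be shown with a much smaller descriptor. This is a hypothetical minimal version, not the full boundmethod above; it skips inheritance handling and docstring plumbing, and just switches on whether __get__ received an instance:

```python
class classinstancemethod(object):
    """Descriptor dispatching to one function for class access, another for instance access."""
    def __init__(self, cls_func, instance_func):
        self.cls_func = cls_func
        self.instance_func = instance_func

    def __get__(self, instance, owner):
        # instance is None when the attribute is accessed through the class itself.
        if instance is None:
            return lambda *a, **kw: self.cls_func(owner, *a, **kw)
        return lambda *a, **kw: self.instance_func(instance, *a, **kw)

class Walkmen(object):
    def _start_cls(cls):
        return 'class bound'
    def _start_instance(self):
        return 'instance bound'
    start = classinstancemethod(_start_cls, _start_instance)

print(Walkmen.start())    # -> class bound
print(Walkmen().start())  # -> instance bound
```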
I hope it will help some of you guys.
Best.
I actually just asked a similar question (Python descriptors and inheritance); I hadn't seen this one at the time. My solution uses descriptors and a metaclass for inheritance.
from my answer:
import types

class dynamicmethod:
    '''
    Descriptor to allow dynamic dispatch on calls to class.Method vs obj.Method
    fragile when used with inheritance; to inherit and then overwrite or extend,
    a dynamicmethod class must have dynamicmethod_meta as its metaclass
    '''
    def __init__(self, f=None, m=None):
        self.f = f
        self.m = m

    def __get__(self, obj, objtype=None):
        if obj is not None and self.f is not None:
            return types.MethodType(self.f, obj)
        elif objtype is not None and self.m is not None:
            return types.MethodType(self.m, objtype)
        else:
            raise AttributeError('No associated method')

    def method(self, f):
        return type(self)(f, self.m)

    def classmethod(self, m):
        return type(self)(self.f, m)

def make_dynamicmethod_meta(meta):
    class _dynamicmethod_meta(meta):
        def __prepare__(name, bases, **kwargs):
            d = meta.__prepare__(name, bases, **kwargs)
            for base in bases:
                for k, v in base.__dict__.items():
                    if isinstance(v, dynamicmethod):
                        if k in d:
                            raise ValueError('Multiple base classes define the same dynamicmethod')
                        d[k] = v
            return d
    return _dynamicmethod_meta

dynamicmethod_meta = make_dynamicmethod_meta(type)

class A(metaclass=dynamicmethod_meta):
    @dynamicmethod
    def a(self):
        print('Called from obj {} defined in A'.format(self))

    @a.classmethod
    def a(cls):
        print('Called from class {} defined in A'.format(cls))

class B(A):
    @a.method
    def a(self):
        print('Called from obj {} defined in B'.format(self))

A.a()
A().a()
B.a()
B().a()
results in:
Called from class <class 'A'> defined in A
Called from obj <A object at ...> defined in A
Called from class <class 'B'> defined in A
Called from obj <B object at ...> defined in B
There is an answered question about classmethod and property combined together: Using property() on classmethods
I still don't understand the cause of the problem, please help.
My understanding of classmethod was that it simply replaces self with cls. With this in mind I wrote several classmethods during the past few years and now I see I was wrong all that time.
So what is the difference between #classmethod and #cm from the code below?
def cm(func):
    def decorated(self, *args, **kwargs):
        return func(self.__class__, *args, **kwargs)
    return decorated

class C:
    V = 0

    @property
    @classmethod
    def inc1(cls):
        cls.V += 1
        print("V1 =", cls.V)

    @property
    @cm
    def inc3(cls):
        cls.V += 3
        print("V3 =", cls.V)

c = C()
#c.inc1 # fails with: TypeError: 'classmethod' object is not callable
c.inc3 # works
inc3 with cm works, but inc1 with classmethod does not.
what is the difference between #classmethod and #cm from the code below?
A decorator is applied at class creation time, before any instance exists.
In your case, since @cm returns a function that calls func(self.__class__, *args, **kwargs), which relies on self, it must be used as an instance method.
On the other hand, @classmethod can be used before an instance is created.
def cm(func):
    def decorated(self, *args, **kwargs):
        return func(self.__class__, *args, **kwargs)
    return decorated

class C:
    @classmethod
    def inc1(cls):
        (blablabla)

    @cm
    def inc3(cls):
        (blablabla)

C().inc1() # works as an instance method
C.inc1()   # works as a classmethod
C().inc3() # works as an instance method
C.inc3()   # TypeError: unbound method decorated() must be called with C instance as first argument (got nothing instead)
For a combination of classmethod and property, it can be done by returning a customized descriptor object. Reference:
class ClassPropertyDescriptor(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        return self.f.__get__(obj, klass)()

def classproperty(func):
    if not isinstance(func, (classmethod, staticmethod)):
        func = classmethod(func)
    return ClassPropertyDescriptor(func)

class C:
    @classproperty
    def inc1(cls):
        (blablabla)

C.inc1 # works as a classmethod property
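Filling in the placeholder body gives a runnable sketch of the same classproperty recipe (the body shown here, returning cls.V + 1, is just an illustrative stand-in):

```python
class ClassPropertyDescriptor(object):
    """Descriptor combining classmethod and property behaviour for reads."""
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        # Bind the wrapped classmethod to the class, then call it.
        return self.f.__get__(obj, klass)()

def classproperty(func):
    if not isinstance(func, (classmethod, staticmethod)):
        func = classmethod(func)
    return ClassPropertyDescriptor(func)

class C(object):
    V = 41

    @classproperty
    def inc1(cls):
        return cls.V + 1

print(C.inc1)    # -> 42, works on the class
print(C().inc1)  # -> 42, and on instances
```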
[Edit]
Q. What does the classmethod() call do with the method it decorates to achieve that?
The implementation can be done using a descriptor:
class ClassMethodDescriptor(object):
    def __init__(self, f):
        self.f = f

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        def newfunc(*args):
            return self.f(klass, *args)
        return newfunc

def myclassmethod(func):
    return ClassMethodDescriptor(func)

class C:
    @myclassmethod
    def inc1(cls):
        (blablabla)

C.inc1() # works as a classmethod
Q. Why is the result not callable?
Because ClassMethodDescriptor does not define a __call__ method; myclassmethod returns a ClassMethodDescriptor, which is not callable, so stacking @property on top fails when the property tries to call it.
The difference is that a classmethod object is not callable, while the function returned by cm is. This means that when property makes a call to the stored getter (which it is supposed to do), it works as you'd expect for cm, but not for classmethod, since classmethod does not implement __call__.
A classmethod does not know anything about the instance and does not require one.
An instance method knows about its instance and its class.
class Foo:
    some = 'some'

class Bar(Foo):
    def __init__(self):
        self.some = 'not some'

    @classmethod
    def cls_some(cls):
        print(cls.some)

    def instance_some(self):
        print(self.some)

Bar.cls_some()
>>>some
Bar().instance_some()
>>>not some
Also, as you can see, you don't need an instance to call a classmethod.
My apologies if this question has already been answered somewhere, but if it has I have not been able to locate the answer.
I would like to create a sub-class of a parent class in such a way that there will be a delay (e.g. time.sleep()) before each call to the corresponding parent class method. I would like to do this in such a way that I do not need to replicate each parent class method in the child class. In fact, I would like to have a generic method that would work with virtually any parent class -- so that I do not even need to know all the parent class methods.
The delay would be specified when instantiating the sub-class.
For example:
class Parent():
    ....
    def method1(self):
        ....
    def method2(self):
        ....

class Child(Parent):
    def __init__(self, delay):
        self.delay = delay
    ....

child = Child(1)
A call to child.method1() would result in a 1 second delay before Parent.method1() is called.
I think the previously given answers have not really addressed your specific need to delay ALL methods of the parent class without having to decorate each of them. You said you do NOT want to have to replicate the parent class methods in the child class just so that you can delay them. This answer uses the same delay wrapper from S.Lott, but also uses a metaclass (http://www.voidspace.org.uk/python/articles/metaclasses.shtml):
#!/usr/bin/env python
from types import FunctionType
import time

def MetaClassFactory(function):
    class MetaClass(type):
        def __new__(meta, classname, bases, classDict):
            newClassDict = {}
            for attributeName, attribute in classDict.items():
                if type(attribute) == FunctionType:
                    attribute = function(attribute)
                newClassDict[attributeName] = attribute
            return type.__new__(meta, classname, bases, newClassDict)
    return MetaClass

def delayed(func):
    def wrapped(*args, **kwargs):
        time.sleep(2)
        func(*args, **kwargs)
    return wrapped

Delayed = MetaClassFactory(delayed)

class MyClass(object):
    __metaclass__ = Delayed

    def a(self):
        print 'foo'

    def b(self):
        print 'bar'
The MetaClassFactory wraps every function in the delayed decorator. If you wanted to make sure certain built-ins like the init function were not delayed, you could just check for that name in the MetaClassFactory and ignore it.
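The example above uses Python 2 syntax (`__metaclass__`, `print` statements). A sketch of the same idea in Python 3, with a shorter delay so it is quick to try out; the names and the 0.1-second duration are illustrative choices, not part of the original answer:

```python
import time
from types import FunctionType

def metaclass_factory(function):
    """Build a metaclass that wraps every plain function attribute with `function`."""
    class MetaClass(type):
        def __new__(meta, classname, bases, class_dict):
            new_dict = {}
            for name, attribute in class_dict.items():
                if isinstance(attribute, FunctionType):
                    attribute = function(attribute)
                new_dict[name] = attribute
            return type.__new__(meta, classname, bases, new_dict)
    return MetaClass

def delayed(func):
    def wrapped(*args, **kwargs):
        time.sleep(0.1)  # short delay for demonstration
        return func(*args, **kwargs)
    return wrapped

Delayed = metaclass_factory(delayed)

class MyClass(metaclass=Delayed):  # Python 3 metaclass syntax
    def a(self):
        return 'foo'

print(MyClass().a())  # pauses ~0.1s, then prints foo
```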
Really, what you have here is a design that involves a Strategy object.
Your best approach is to fix the parent class to include a call to a "delay object". A default delay object does nothing.
This violates the "so that I do not even need to know all the parent class methods" hoped-for feature set.
Method lookup doesn't have a handy __getmethod__ that corresponds to __getattribute__; this gap makes it difficult to tap into Python's internals for method invocation.
class ZeroDelay(object):
    def __call__(self):
        pass

class ShortDelay(ZeroDelay):
    def __init__(self, duration=1.0):
        self.duration = duration
    def __call__(self):
        time.sleep(self.duration)

class Parent(object):
    delay = ZeroDelay()

    def method1(self):
        self.delay()
        ....

    def method2(self):
        self.delay()
        ...

class Child(Parent):
    delay = ShortDelay(1)
EDIT: Of course, you can decorate each method, also.
def delayed(delayer):
    def wrap(a_method):
        def do_delay(*args, **kw):
            delayer()
            return a_method(*args, **kw)
        return do_delay
    return wrap

class Parent(object):
    delay = ZeroDelay()

    @delayed(delay)  # the bare class-body name is visible here; self does not exist yet
    def method1(self):
        ....

    @delayed(delay)
    def method2(self):
        ...
S.Lott's solution is a good one. If you need more granularity (i.e. to delay only certain methods, not all of them), you can go with a decorator:
from time import sleep

def delayed(func):
    '''This is the decorator'''
    def wrapped(*args, **kwargs):
        sleep(2)
        func(*args, **kwargs)
    return wrapped

class Example(object):
    @delayed
    def method(self, str):
        print str

e = Example()
print "Brace! I'm delaying!"
e.method("I'm done!")
The idea is that you add @delayed before the definition of those methods you want to delay.
EDIT: Even more granularity: setting an arbitrary delay:
from time import sleep

def set_delay(seconds):
    def delayed(func):
        '''This is the decorator'''
        def wrapped(*args, **kwargs):
            sleep(seconds)
            func(*args, **kwargs)
        return wrapped
    return delayed

class Example(object):
    @set_delay(1)
    def method(self, str):
        print str

    @set_delay(2)
    def method_2(self, str):
        print str

e = Example()
print "Brace! I'm delaying!"
e.method("I'm done!")
e.method_2("I'm also done!")
You can achieve what you want by overriding __getattribute__:
class Child(Parent):
    def __init__(self, delay):
        self.delay = delay

    def __getattribute__(self, name):
        attr = object.__getattribute__(self, name)
        if hasattr(attr, '__call__'):
            def proxFct(*args, **kwargs):
                time.sleep(object.__getattribute__(self, "delay"))
                return attr(*args, **kwargs)
            return proxFct
        else:
            return attr
Update: Updated according to delnan's comment
Update 2: Updated according to delnan's second comment
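A self-contained sketch of this __getattribute__ approach, with a hypothetical Parent class and a short delay so it runs quickly:

```python
import time

class Parent(object):
    def method1(self):
        return "method1 result"

class Child(Parent):
    def __init__(self, delay):
        self.delay = delay

    def __getattribute__(self, name):
        attr = object.__getattribute__(self, name)
        if callable(attr):
            def delayed_call(*args, **kwargs):
                # Sleep first, then forward to the real attribute.
                # object.__getattribute__ avoids recursing into this hook.
                time.sleep(object.__getattribute__(self, "delay"))
                return attr(*args, **kwargs)
            return delayed_call
        return attr

child = Child(0.1)
print(child.method1())  # pauses ~0.1s, then prints: method1 result
```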
I want to be able to ask a class's __init__ method what it's parameters are. The straightforward approach is the following:
code = cls.__init__.__func__.__code__
code.co_varnames[:code.co_argcount]
However, that won't work if the class has any decorators. It will give the parameter list for the function returned by the decorator. I want to get down to the original __init__ method and get those original parameters. In the case of a decorator, the decorator function is going to be found in the closure of the function returned by the decorator:
cls.__init__.__func__.__closure__[0]
However, it is more complicated if there are other things in the closure, which decorators may do from time to time:
def Something(test):
    def decorator(func):
        def newfunc(self):
            stuff = test
            return func(self)
        return newfunc
    return decorator

def test():
    class Test(object):
        @Something(4)
        def something(self):
            print Test
    return Test

test().something.__func__.__closure__
(<cell at 0xb7ce7584: int object at 0x81b208c>, <cell at 0xb7ce7614: function object at 0xb7ce6994>)
And then I have to decide whether I want the parameters from the decorator or the parameters from the original function. The function returned by the decorator could have *args and **kwargs as its parameters. What if there are multiple decorators and I have to decide which one I care about?
So what is the best way to find a function's parameters even when the function may be decorated? Also, what is the best way to go down a chain of decorators back to the decorated function?
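For reference, later Python versions address this directly: a decorator that uses functools.wraps records the original function on a __wrapped__ attribute, and inspect.signature (Python 3.3+) follows that chain back automatically. A sketch, with the decorator from the question rewritten to cooperate:

```python
import functools
import inspect

def something(test):
    def decorator(func):
        @functools.wraps(func)  # records func as newfunc.__wrapped__
        def newfunc(*args, **kwargs):
            return func(*args, **kwargs)
        return newfunc
    return decorator

class Test(object):
    @something(4)
    def __init__(self, alpha, beta=2):
        self.alpha, self.beta = alpha, beta

# inspect.signature unwraps decorated functions by default.
sig = inspect.signature(Test.__init__)
print(list(sig.parameters))  # -> ['self', 'alpha', 'beta']
```

This only works for decorators that opt in via functools.wraps (or set __wrapped__ themselves); a decorator that discards the original function still loses the parameter names.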
Update:
Here is effectively how I am doing this right now (names have been changed to protect the identity of the accused):
import abc
import collections

IGNORED_PARAMS = ("self",)
DEFAULT_PARAM_MAPPING = {}
DEFAULT_DEFAULT_PARAMS = {}

class DICT_MAPPING_Placeholder(object):
    def __get__(self, obj, type):
        DICT_MAPPING = {}
        for key in type.PARAMS:
            DICT_MAPPING[key] = None
        for cls in type.mro():
            if "__init__" in cls.__dict__:
                cls.DICT_MAPPING = DICT_MAPPING
                break
        return DICT_MAPPING

class PARAM_MAPPING_Placeholder(object):
    def __get__(self, obj, type):
        for cls in type.mro():
            if "__init__" in cls.__dict__:
                cls.PARAM_MAPPING = DEFAULT_PARAM_MAPPING
                break
        return DEFAULT_PARAM_MAPPING

class DEFAULT_PARAMS_Placeholder(object):
    def __get__(self, obj, type):
        for cls in type.mro():
            if "__init__" in cls.__dict__:
                cls.DEFAULT_PARAMS = DEFAULT_DEFAULT_PARAMS
                break
        return DEFAULT_DEFAULT_PARAMS

class PARAMS_Placeholder(object):
    def __get__(self, obj, type):
        func = type.__init__.__func__
        # unwrap decorators here
        code = func.__code__
        keys = list(code.co_varnames[:code.co_argcount])
        for name in IGNORED_PARAMS:
            try: keys.remove(name)
            except ValueError: pass
        for cls in type.mro():
            if "__init__" in cls.__dict__:
                cls.PARAMS = tuple(keys)
                break
        return tuple(keys)

class BaseMeta(abc.ABCMeta):
    def __init__(self, name, bases, dict):
        super(BaseMeta, self).__init__(name, bases, dict)
        if "__init__" not in dict:
            return
        if "PARAMS" not in dict:
            self.PARAMS = PARAMS_Placeholder()
        if "DEFAULT_PARAMS" not in dict:
            self.DEFAULT_PARAMS = DEFAULT_PARAMS_Placeholder()
        if "PARAM_MAPPING" not in dict:
            self.PARAM_MAPPING = PARAM_MAPPING_Placeholder()
        if "DICT_MAPPING" not in dict:
            self.DICT_MAPPING = DICT_MAPPING_Placeholder()

class Base(collections.Mapping):
    """
    Dict-like class that uses its __init__ params for default keys.
    Override PARAMS, DEFAULT_PARAMS, PARAM_MAPPING, and DICT_MAPPING
    in the subclass definition to give non-default behavior.
    """
    __metaclass__ = BaseMeta

    def __init__(self):
        pass

    def __nonzero__(self):
        """Handle bool casting instead of __len__."""
        return True

    def __getitem__(self, key):
        action = self.DICT_MAPPING[key]
        if action is None:
            return getattr(self, key)
        try:
            return action(self)
        except AttributeError:
            return getattr(self, action)

    def __iter__(self):
        return iter(self.DICT_MAPPING)

    def __len__(self):
        return len(self.DICT_MAPPING)

print Base.PARAMS
# ()
print dict(Base())
# {}
At this point Base reports uninteresting values for the four constants, and the dict version of its instances is empty. However, if you subclass, you can override any of the four, or you can include other parameters in the __init__:
class Sub1(Base):
    def __init__(self, one, two):
        super(Sub1, self).__init__()
        self.one = one
        self.two = two

Sub1.PARAMS
# ("one", "two")
dict(Sub1(1, 2))
# {"one": 1, "two": 2}

class Sub2(Base):
    PARAMS = ("first", "second")
    def __init__(self, one, two):
        super(Sub2, self).__init__()
        self.first = one
        self.second = two

Sub2.PARAMS
# ("first", "second")
dict(Sub2(1, 2))
# {"first": 1, "second": 2}
Consider this decorator:
def rickroll(old_function):
    return lambda junk, junk1, junk2: "Never Going To Give You Up"

class Foo(object):
    @rickroll
    def bar(self, p1, p2):
        return p1 * p2

print Foo().bar(1, 2)
In it, the rickroll decorator takes the bar method, discards it, replaces it with a new function that ignores its differently-named (and possibly numbered!) parameters and instead returns a line from a classic song.
There are no further references to the original function, and the garbage collector can come and remove it any time it likes.
In such a case, I cannot see how you could find the parameter names p1 and p2. In my understanding, even the Python interpreter itself has no idea what they used to be called.
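That said, if the decorator opts in by applying functools.wraps (or sets __wrapped__ itself), the original function stays reachable and its parameter names survive. A hypothetical variant of the same decorator:

```python
import functools
import inspect

def rickroll(old_function):
    @functools.wraps(old_function)  # keeps a reference via __wrapped__
    def replacement(*junk_args, **junk_kwargs):
        return "Never Going To Give You Up"
    return replacement

class Foo(object):
    @rickroll
    def bar(self, p1, p2):
        return p1 * p2

print(Foo().bar(1, 2))  # -> Never Going To Give You Up
# The original parameter names survive via __wrapped__:
print(list(inspect.signature(Foo.bar).parameters))  # -> ['self', 'p1', 'p2']
```

Without that cooperation from the decorator, the conclusion above stands: the names are gone.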