python - Dynamically overriding a method at instance level and 'updating' all references

I would like to override a method of an instance of class A with a method from class B, but in such a way that all references to the old method of the instance (made before overriding it) then 'link' to the new one. In code:
import types

class A:
    def foo(self):
        print('A')

class B:
    def foo(self):
        print('B')

class C:
    def __init__(self, a):
        self.func = a.foo

    def do_something(self):
        self.func()

a = A()
c = C(a)

method_name = 'foo'  # it has to be dynamic
new_method = getattr(B, method_name)
setattr(a, method_name, types.MethodType(new_method, a))

c.do_something()  # still prints A, I want it to print B now
I want c.func to hold the new method from class B after the attribute of a has been set (without doing anything with the c object).
Is there a way to set the attribute of the instance a so that all previously made references then refer to the new method?
Sorry if this question is kind of stupid, I am not that much into this.

You could do it like this, for example:
...
def retain(foo):
    return lambda *args, **kwargs: getattr(foo.__self__, foo.__name__)(*args, **kwargs)

class C:
    def __init__(self, a):
        self.func = retain(a.foo)
...
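Plugging this into the snippet from the question (same A, B, import types and setattr call as above; the literal 'foo' here just stands in for the dynamic method_name), the stored reference now resolves the method at call time:

a = A()
c = C(a)                                       # C stores retain(a.foo) instead of a.foo
setattr(a, 'foo', types.MethodType(B.foo, a))
c.do_something()                               # prints 'B' now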

Just adding to Alex's answer.
In my case, the described dynamic partly comes from a need for serialization and deserialization. To serialize certain method references, I used to use func.__name__.
However, c.func.__name__ would only return <lambda> using Alex's approach. I prevented this by creating a callable class that uses the retain function but stores the method's name separately, which in my case is enough because it's just a few specific references I need to serialize.
import types

def retain(foo):
    return lambda *args, **kwargs: getattr(foo.__self__, foo.__name__)(*args, **kwargs)

class M:
    def __init__(self, method):
        self.method_name = method.__name__
        self.method = retain(method)

    def __call__(self, *args, **kwargs):
        self.method(*args, **kwargs)

class A:
    def foo(self):
        print('A')

class B:
    def foo(self):
        print('B')

class C:
    def __init__(self, method):
        self.func = M(method)

    def do_something(self):
        self.func()

a = A()
c = C(a.foo)
setattr(a, 'foo', types.MethodType(getattr(B, 'foo'), a))
c.do_something()  # now prints B

# when serializing
method_name = c.func.method_name

Related

Automatic override of parent methods

I'm trying to add a wrapper to each method in a class by subclassing it and reassigning the methods in the constructor of the new class. However, I'm getting the same reference for all subclassed methods. How is this possible?
class A:
    def foo(self):
        print("foo")

    def bar(self):
        print("bar")

class B(A):
    def __init__(self):
        super().__init__()
        methods = [
            (method_name, getattr(self, method_name))
            for method_name in dir(self) if not method_name.startswith('_')
        ]
        for (method_name, f) in methods:
            def wrapper(*args, **kwargs):
                print('wrapped')
                return f(*args, **kwargs)
            setattr(self, method_name, wrapper)

b = B()
b.foo()
>>> wrapped
>>> foo
b.bar()
>>> wrapped
>>> foo
This is a spin on a common Python gotcha: late-binding closures.
What is happening is that the last value of f is being bound to all your wrapped methods.
A common workaround is binding the changing variable to a keyword argument or using functools.partial (a sketch of the latter follows the example below).
For your example you can use it as a keyword argument.
class A:
    def foo(self, baz='foo'):
        print(baz)

    def bar(self, baz='bar'):
        print(baz)

class B(A):
    def __init__(self):
        super().__init__()
        methods = [
            (method_name, getattr(self, method_name))
            for method_name in dir(self) if not method_name.startswith('_')
        ]
        for (method_name, f) in methods:
            # here you can use an implied private keyword argument
            # to minimize the chance of conflicts
            def wrapper(*args, _f=f, **kwargs):
                print('wrapped')
                return _f(*args, **kwargs)
            setattr(self, method_name, wrapper)

b = B()
b.foo()
b.foo('baz')
b.foo(baz='baz')
b.bar()
I added a few more calls to your method to demonstrate that it still works with different forms of calls.
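For reference, here is one way the functools.partial workaround mentioned above could look (a sketch, not from the original answer; wrapped_call is a helper name made up for this example):

import functools

class A:
    def foo(self, baz='foo'):
        print(baz)

    def bar(self, baz='bar'):
        print(baz)

def wrapped_call(f, *args, **kwargs):
    print('wrapped')
    return f(*args, **kwargs)

class B(A):
    def __init__(self):
        super().__init__()
        methods = [
            (method_name, getattr(self, method_name))
            for method_name in dir(self) if not method_name.startswith('_')
        ]
        for (method_name, f) in methods:
            # partial freezes the current value of f, so each wrapped method
            # keeps its own target instead of the last one from the loop
            setattr(self, method_name, functools.partial(wrapped_call, f))

b = B()
b.foo()   # wrapped / foo
b.bar()   # wrapped / bar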

Add an automated function call to each method

Is it possible to create a "constructor", or rather an "initializer", for each function, instead of having to manually write it at the top of each function in the class?
So, each time a function in a class is called, another assigned function (unknown to the caller) is always called first (called pre_check in the example below).
Here is an example using super(), but I then have to manually copy the call inside each function.
class Helper():
    def pre_check(self):
        print("Helper fcn")

class Parent(Helper):
    def __init__(self):
        print("Initializer")

    def foo(self):
        super().pre_check()  # <---- new code
        # ... existing code here ...

    def bar(self):
        super().pre_check()  # <---- new code
        # ... existing code here ...

    def many_more_functions(self):
        super().pre_check()  # <---- new code
        # ... existing code here ...

m = Parent()
m.foo()
m.bar()
Note how __init__ in Parent is not supposed to run pre_check.
You can use a decorator for the class that will in turn decorate all public methods defined in the class:
import inspect

def addhelper(helpmethod):
    def deco(cls):
        def decomethod(method):
            def inner(self, *args, **kwargs):
                helpmethod(self)
                return method(self, *args, **kwargs)
            # copy signature, doc and names from the original method
            inner.__signature__ = inspect.signature(method)
            inner.__doc__ = method.__doc__
            inner.__name__ = method.__name__
            inner.__qualname__ = method.__qualname__
            return inner
        # search all methods declared in cls with a name not starting with _
        for name, meth in inspect.getmembers(
                cls, lambda x: inspect.isfunction(x)
                and not x.__name__.startswith('_')
                and x.__qualname__.startswith(cls.__name__)):
            # replace each method with its decoration
            setattr(cls, name, decomethod(meth))
        return cls
    return deco

class Helper():
    def pre_check(self):
        print("Helper fcn")

@addhelper(Helper.pre_check)
class Parent(Helper):
    def __init__(self):
        print("Initializer")

    def foo(self):
        # super().pre_check() # <----
        print('in foo')

    def bar(self):
        # super().pre_check() # <----
        print('in bar')

    def many_more_functions(self):
        # super().pre_check() # <----
        print('in many_more_functions')
We can now use it:
>>> p = Parent()
Initializer
>>> p.foo()
Helper fcn
in foo
>>> p.bar()
Helper fcn
in bar
>>> p.many_more_functions()
Helper fcn
in many_more_functions
Use __init_subclass__ to change subclasses as they are created. You can wrap the methods of subclasses:
import functools
import inspect

class Helper():
    def __init_subclass__(cls):
        for field, value in cls.__dict__.items():
            # add additional checks as desired, e.g. exclude __special_methods__
            if inspect.isfunction(value) and not getattr(value, 'checked', False):
                setattr(cls, field, cls._check(value))  # wrap method

    @classmethod
    def _check(cls, fcn):
        """Create a wrapper to inspect the arguments passed to methods"""
        @functools.wraps(fcn)
        def checked_fcn(*args, **kwargs):
            print(fcn, "got", args, kwargs)
            return fcn(*args, **kwargs)
        return checked_fcn

class Parent(Helper):
    def __init__(self):
        print("Initializer")

    def foo(self):
        print("Foo")
Note that this will wrap all methods, including special methods such as __init__:
>>> Parent().foo()
<function Parent.__init__ at 0x1029b2378> got (<__main__.Parent object at 0x102c09080>,) {}
Initializer
<function Parent.foo at 0x1029b2158> got (<__main__.Parent object at 0x102c09080>,) {}
Foo
You can extend the check in __init_subclass__ with arbitrary rules to filter out functions. For example, field[:2] == field[-2:] == "__" excludes special methods.
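As a concrete illustration of that kind of filter (my own sketch, not part of the original answer), the loop in __init_subclass__ above could skip dunder names like this, reusing the _check classmethod defined earlier:

import inspect

class Helper():
    def __init_subclass__(cls):
        for field, value in cls.__dict__.items():
            if field[:2] == field[-2:] == "__":
                continue  # leave __init__ and other special methods unwrapped
            if inspect.isfunction(value) and not getattr(value, 'checked', False):
                setattr(cls, field, cls._check(value))  # _check as defined above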
You can use a metaclass and apply a decorator to each method of the classes created with that metaclass.
Code:
def decorate(f):
    def do_something(self, a):
        if f(self, a) > 18:
            return "Eligible to vote"
        else:
            return "Not eligible to vote"
    return do_something

class Meta(type):
    def __new__(cls, name, bases, namespace, **kwds):
        namespace = {k: v if k.startswith('__') else decorate(v)
                     for k, v in namespace.items()}
        return type.__new__(cls, name, bases, namespace)

class MetaInstance(metaclass=Meta):
    def foo1(self, val):
        return val + 15

    def foo2(self, val):
        return val + 9

obj1 = MetaInstance()
print(obj1.foo1(5))
print(obj1.foo2(2))

Is there a use for the code cls().__init__() in a classmethod or elsewhere?

I have seen cls().__init__() used in a classmethod, but it seems that the code could have used a simple cls() instead. As in:
class SomeCrazyClass:
    @classmethod
    def newclass(cls):
        return cls().__init__()

    @classmethod
    def newclass2(cls):
        return cls()
Is this just a poor coding style choice or is there a practical use of cls().__init__() in some situation?
The difference between cls().__init__() and cls() is that the former calls __init__ on the instance a second time and, since __init__ returns None, the whole expression evaluates to None, whereas the latter returns the actual instance.
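A quick interactive check of that difference, using a made-up Spam class:

>>> class Spam:
...     pass
...
>>> Spam().__init__() is None   # the extra __init__ call evaluates to None
True
>>> isinstance(Spam(), Spam)    # plain cls() hands back the instance itself
True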
Still, one imaginary scenario where calling __init__ again is useful is lazy initialization of a class; there may be other use cases as well.
For example in the below code the instance variables are loaded only on the first access of an attribute:
def init(cls, real_init):
    def wrapped(self, *args, **kwargs):
        # swallow the implicit __init__ call made at instantiation time
        # and put the real __init__ back for later explicit calls
        cls.__init__ = real_init
    return wrapped

class A(object):
    def __new__(cls, *args, **kwargs):
        cls.__init__ = init(cls, cls.__init__)
        instance = object.__new__(cls)
        return instance

    def __getattr__(self, attr):
        expected_attrs = ('a', 'b')
        if attr in expected_attrs:
            # the first access of 'a' or 'b' runs the real __init__ lazily
            self.__init__(range(10000), range(1000))
        return object.__getattribute__(self, attr)

    def __init__(self, a, b):
        print('inside __init__')
        self.a = sum(a)
        self.b = sum(b)
Demo:
>>> a = A()
>>> a.__dict__
{}
>>> a.a, a.b
inside __init__
(49995000, 499500)
>>> a.__dict__
{'a': 49995000, 'b': 499500}
>>> a = A()
>>> a.__init__(range(10**5), range(10**4))
inside __init__
>>> a.a, a.b
(4999950000, 49995000)
Calling __init__ explicitly like this also lets us make use of a return value from it, which is normally not possible, since Python requires __init__ to return None when it is invoked implicitly.
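As a small illustration of that last point (my own sketch with a made-up Lazy class):

class Lazy:
    def __init__(self):
        self.ready = True
        return self            # only acceptable when __init__ is called explicitly

obj = object.__new__(Lazy)     # create the instance without running __init__
same = obj.__init__()          # explicit call: the return value can be captured
print(same is obj)             # True

# Lazy()                       # implicit call: TypeError, __init__ should return None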

Python: Set method attribute from within method

I am trying to make a python decorator that adds attributes to methods of a class so that I can access and modify those attributes from within the method itself. The decorator code is
from types import MethodType

class attribute(object):
    def __init__(self, **attributes):
        self.attributes = attributes

    def __call__(self, function):
        class override(object):
            def __init__(self, function, attributes):
                self.__function = function
                for att in attributes:
                    setattr(self, att, attributes[att])

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, owner):
                return MethodType(self, instance, owner)

        retval = override(function, self.attributes)
        return retval
I tried this decorator on the toy example that follows.
class bar(object):
    @attribute(a=2)
    def foo(self):
        print self.foo.a
        self.foo.a = 1
Though I am able to access the value of attribute 'a' from within foo(), I can't set it to another value. Indeed, when I call bar().foo(), I get the following AttributeError.
AttributeError: 'instancemethod' object has no attribute 'a'
Why is this? More importantly how can I achieve my goal?
Edit
Just to be more specific, I am trying to find a simple way to implement static variables that live inside class methods. Continuing from the example above, I would like to instantiate b = bar(), call both the foo() and doo() methods, and then access b.foo.a and b.doo.a later on.
class bar(object):
    @attribute(a=2)
    def foo(self):
        self.foo.a = 1

    @attribute(a=4)
    def doo(self):
        self.doo.a = 3
The best way to do this is to not do it at all.
First of all, there is no need for an attribute decorator; you can just assign it yourself:
class bar(object):
    def foo(self):
        print self.foo.a
        self.foo.a = 1
    foo.a = 2
However, this still encounters the same errors. You need to do:
self.foo.__dict__['a'] = 1
You can instead use a metaclass...but that gets messy quickly.
On the other hand, there are cleaner alternatives.
You can use defaults:
def foo(self, a=None):  # placeholder default so func_defaults below is not None
    print a[0]
    a[0] = 2

foo.func_defaults = foo.func_defaults[:-1] + ([2],)
Of course, my preferred way is to avoid this altogether and use a callable class ("functor" in C++ words):
class bar(object):
    def __init__(self):
        self.foo = self.foo_method(self)

    class foo_method(object):
        def __init__(self, bar):
            self.bar = bar
            self.a = 2

        def __call__(self):
            print self.a
            self.a = 1
Or just use classic class attributes:
class bar(object):
    def __init__(self):
        self.a = 1

    def foo(self):
        print self.a
        self.a = 2
If it's that you want to hide a from derived classes, use name-mangled attributes, Python's version of "private" attributes:
class bar(object):
    def __init__(self):
        self.__a = 1  # this will be implicitly mangled to _bar__a

    def foo(self):
        print self.__a
        self.__a = 2
EDIT: You want static attributes?
class bar(object):
    a = 1

    def foo(self):
        print self.a
        self.a = 2
EDIT 2: If you want static attributes visible to only the current function, you can use PyExt's modify_function:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
It's slightly ugly and hackish. But it works.
My recommendation would be just to use double underscores:
class bar(object):
    __a = 1

    def foo(self):
        print self.__a
        self.__a = 2
Although this is visible to the other functions, it's invisible to anything else (actually, it's there, but it's mangled).
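For what it's worth, a quick look at how that mangling appears from the outside (my own illustration):

>>> class bar(object):
...     __a = 1
...
>>> bar._bar__a        # outside the class body the attribute only exists under its mangled name
1
>>> hasattr(bar, '__a')
False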
FINAL EDIT: Use this:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
    foo.a = foo.func_globals['a']

b = bar()
b.foo()  # prints 1
b.foo()  # prints 2

# external access
b.foo.a[0] = 77
b.foo()  # prints 77
While you can accomplish your goal by replacing self.foo.a = 1 with self.foo.__dict__['a'] = 1, it is generally not recommended.
If you are using Python 2 (and not Python 3), whenever you retrieve a method from an instance a new instance-method object is created, which is a wrapper around the original function defined in the class body.
The instance method is a rather transparent proxy to the function: you can retrieve the function's attributes through it, but not set them; that is why setting an item in self.foo.__dict__ works.
Alternatively you can reach the function object itself using self.foo.im_func: the im_func attribute of an instance method points to the underlying function.
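A short Python 2 session showing both routes (my own sketch):

>>> class bar(object):
...     def foo(self):
...         pass
...
>>> b = bar()
>>> b.foo.im_func.a = 1       # set the attribute on the underlying function object
>>> b.foo.__dict__['a'] = 2   # or write through the proxied __dict__
>>> b.foo.a                   # reads are proxied from the method to the function
2
>>> bar.foo.a                 # the attribute lives on the function, shared by all instances
2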
Based on other contributors' answers, I came up with the following workaround. First, wrap a dictionary in a class that resolves nonexistent attributes to the wrapped dictionary, as in the following code.
class DictWrapper(object):
    def __init__(self, d):
        self.d = d

    def __getattr__(self, key):
        return self.d[key]
Credits to Lucas Jones for this code.
Then implement an addstatic decorator with a statics attribute that will store the static attributes.
class addstatic(object):
    def __init__(self, **statics):
        self.statics = statics

    def __call__(self, function):
        class override(object):
            def __init__(self, function, statics):
                self.__function = function
                self.statics = DictWrapper(statics)

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, objtype):
                from types import MethodType
                return MethodType(self, instance)

        retval = override(function, self.statics)
        return retval
The following code is an example of how the addstatic decorator can be used on methods.
class bar(object):
    @addstatic(a=2, b=3)
    def foo(self):
        self.foo.statics.a += 1
        self.foo.statics.b += 2
Then, playing with an instance of the bar class yields:
>>> b = bar()
>>> b.foo.statics.a
2
>>> b.foo.statics.b
3
>>> b.foo()
>>> b.foo.statics.a
3
>>> b.foo.statics.b
5
The reason for using this statics dictionary follows jsbueno's answer, which suggests that what I want would require overloading the dot operator of an instance method wrapping the foo function, which I am not sure is possible. Of course, the method's attributes could be set in self.foo.__dict__, but since that is not recommended (as suggested by brainovergrow), I came up with this workaround. I am not certain this would be recommended either and I guess it is up for comments.

Python introspection: Automatic wrapping of methods

Is there a way to programmatically wrap a class object, so that every method call on an object of type A gets wrapped? Given
class A(object):
    def __init__(self):
        ## ..
    def f0(self, a):
        ## ...
    def f1(self, a, b):
        ## ..
I want another class that wraps an A, such as
class B(object):
    def __init__(self):
        self.a = A()

    def f0(self, a):
        try:
            self.a.f0(a)
        except Exception, ex:
            ## ...

    def f1(self, a, b):
        try:
            self.a.f1(a, b)
        except Exception, ex:
            ## ...
Is there a way to create B.f0 and B.f1 by reflection/inspection of class A?
If you want to create class B by calling a function on a predefined class A, you can simply do B = wrap_class(A) with a function wrap_class that looks like this:
import copy
import types

def wrap_class(cls):
    'Wraps a class so that exceptions in its methods are caught.'
    # The copy is necessary so that mutable class attributes are not
    # shared between the old class cls and the new class:
    new_cls = copy.deepcopy(cls)
    # vars() is used instead of dir() so that the attributes of base classes
    # are not modified, but one might want to use dir() instead:
    for (attr_name, value) in vars(cls).items():
        if isinstance(value, types.FunctionType):
            setattr(new_cls, attr_name, func_wrapper(value))  # func_wrapper() is defined below
    return new_cls

B = wrap_class(A)
As Jürgen pointed out, this creates a copy of the class; this is only needed, however, if you really want to keep your original class A around (like suggested in the original question). If you don't care about A, you can simply decorate it with a wrapper that does not perform any copy, like so:
def wrap_class(cls):
    'Wraps a class so that exceptions in its methods are caught.'
    # vars() is used instead of dir() so that the attributes of base classes
    # are not modified, but one might want to use dir() instead:
    for (attr_name, value) in vars(cls).items():
        if isinstance(value, types.FunctionType):
            setattr(cls, attr_name, func_wrapper(value))
    return cls

@wrap_class
class A(object):
    …  # Original A class, with methods that are not wrapped with exception catching
The decorated class A catches exceptions.
The metaclass version is heavier, but its principle is similar:
import types

def func_wrapper(f):
    'Returns a version of function f that prints an error message if an exception is raised.'
    def wrapped_f(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except Exception, ex:
            print "Function", f, "raised", ex
    return wrapped_f

class ExceptionCatcher(type):
    'Metaclass that wraps methods with func_wrapper().'
    def __new__(meta, cname, bases, cdict):
        # cdict contains the attributes of class cname:
        for (attr_name, value) in cdict.items():
            if isinstance(value, types.FunctionType):  # Various attribute types can be wrapped differently
                cdict[attr_name] = func_wrapper(value)
        return super(meta, ExceptionCatcher).__new__(meta, cname, bases, cdict)

class B(object):
    __metaclass__ = ExceptionCatcher  # ExceptionCatcher will be used for creating class B
    class_attr = 42  # Will not be wrapped

    def __init__(self):
        pass

    def f0(self, a):
        return a*10

    def f1(self, a, b):
        1/0  # Raises a division by zero exception!

# Test:
b = B()
print b.f0(3.14)
print b.class_attr
print b.f1(2, 3)
This prints:
31.4
42
Function <function f1 at 0x107812d70> raised integer division or modulo by zero
None
What you want to do is in fact typically done by a metaclass, which is a class whose instances are a class: this is a way of building the B class dynamically based on its parsed Python code (the code for class A, in the question). More information on this can be found in the nice, short description of metaclasses given in Chris's Wiki (in part 1 and parts 2-4).
Metaclasses are an option, but they are generally hard to understand, as is too much reflection when it is not needed in simple cases, because it is easy to catch too many (internal) functions. If the wrapped functions are a stable, known set, and B might gain other functions, you can delegate explicitly, function by function, and still keep your error-handling code in one place:
class B(object):
    def __init__(self):
        a = A()
        self.f0 = errorHandler(a.f0)
        self.f1 = errorHandler(a.f1)
You might do the assignments in a loop if there are many of them, using getattr/setattr (see the sketch after the errorHandler definition below).
The errorHandler function will need to return a function which wraps its argument with error-handling code.
def errorHandler(f):
    def wrapped(*args, **kw):
        try:
            return f(*args, **kw)
        except:
            pass  # log or something
    return wrapped
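If there are many methods, the explicit assignments can become the loop mentioned above; a minimal sketch reusing errorHandler and the A instance from this answer (the explicit name list is just an assumption):

class B(object):
    def __init__(self):
        a = A()
        for name in ('f0', 'f1'):   # or compute the list, e.g. from dir(a) with a filter
            setattr(self, name, errorHandler(getattr(a, name)))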
You can also use errorHandler as a decorator on new functions that do not delegate to the A instance.
class B(A):
    ...

    @errorHandler
    def f_new(self):
        ...
This solution keeps B simple and it is quite explicit what's going on.
You could try it old-school with __getattr__:
class B(object):
    def __init__(self):
        self.a = A()

    def __getattr__(self, name):
        a_method = getattr(self.a, name, None)
        if not callable(a_method):
            raise AttributeError("Unknown attribute %r" % name)
        def wrapper(*args, **kwargs):
            try:
                return a_method(*args, **kwargs)
            except Exception, ex:
                pass  # ...
        return wrapper
Or with updating B's dict:
class B(object):
    def __init__(self):
        a = A()
        def make_wrapper(attr):
            # the factory function binds each attr in its own scope,
            # avoiding the late-binding closure gotcha discussed above
            def wrapper(*args, **kwargs):
                try:
                    return attr(*args, **kwargs)
                except Exception, ex:
                    pass  # ...
            return wrapper
        for attr_name in dir(a):
            if attr_name.startswith('_'):
                continue  # skip special attributes such as __class__
            attr = getattr(a, attr_name)
            if callable(attr):
                setattr(self, attr_name, make_wrapper(attr))  # or try self.__dict__[x] = y
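A quick self-contained check of either delegating variant (my own sketch; this stand-in A replaces the elided one from the question, and the wrapper is assumed to swallow exceptions and return None):

class A(object):
    def f0(self, a):
        return a * 10

    def f1(self, a, b):
        return a / b

b = B()
print b.f0(3.14)   # 31.4, delegated to A.f0 through the wrapper
print b.f1(2, 0)   # the ZeroDivisionError is caught inside the wrapper, so this prints None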
