Assign method locals to class attributes - python

When debugging, I sometimes find it useful to access the values of local variables within a class method without necessarily invoking pdb.set_trace().
class Calculator:
    # ...
    def add_and_format(self, x, y):
        sum = x + y
        return '{:.2f}'.format(sum)
If I want to access sum following a call to add_and_format, I could edit the code to be self.sum = x + y and inspect it that way. With many local variables, however, it would be simpler if I could use a @debug decorator around the method, or some debug_function() class method, to persist all local variables as attributes. I have in mind something basically equivalent to
def my_method(self, *args, **kwargs):
    # ... method code
    for k, v in locals().items():
        setattr(self, k, v)
    return result
I've seen this answer, which is a good partial solution, but I'm unsure how to get it to work for this use case.
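For what it's worth, here is one way such a decorator could be sketched (my own illustration, not from the linked answer; the name debug is arbitrary): install a trace function with sys.settrace, grab the method frame's locals at its 'return' event, and copy them onto self.

import sys
from functools import wraps

def debug(method):
    """Sketch: persist a method's local variables as instance attributes."""
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        captured = {}
        def tracer(frame, event, arg):
            # The 'return' event fires just before the frame exits,
            # when its locals are fully populated.
            if event == 'return' and frame.f_code is method.__code__:
                captured.update(frame.f_locals)
            return tracer
        old_trace = sys.gettrace()
        sys.settrace(tracer)
        try:
            result = method(self, *args, **kwargs)
        finally:
            sys.settrace(old_trace)
        captured.pop('self', None)  # don't clobber attributes with self itself
        for k, v in captured.items():
            setattr(self, k, v)
        return result
    return wrapper

Tracing slows the wrapped call down considerably, so this is strictly a debugging aid.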


Rename every method skipping the object name

Not sure if this is a valid question or just nonsense, but I have not found an answer online.
I know that it is possible to rename a function in Python this way:
SuperMethod = myObject.SuperMethod
I would like to know if it is possible to rename every method of an object, that is, to call every method of a particular object without explicitly prefixing it with the object's name (similar to VBA's With clause).
I know this will have all kind of naming issues.
You can update the globals() dict with the object's callables after filtering out the internal methods that start and end with '__':
class A:
    def __init__(self, i):
        self.i = i

    def x(self):
        print(self.i + 1)

    def y(self):
        print(self.i + 2)

myObject = A(1)

globals().update({k: getattr(myObject, k) for k, v in A.__dict__.items()
                  if not k.startswith('__') and not k.endswith('__') and callable(v)})

x()
y()
This outputs:
2
3

does @property update changed elements in an attribute or calculate it again?

I was wondering whether using @property in Python to update an attribute overwrites it or simply updates it, as the speed is very different in the two cases.
And in case it gets overwritten, what alternative can I use? Example:
class sudoku:
    def __init__(self, puzzle):
        self.grid = {(i, j): puzzle[i][j] for i in range(9) for j in range(9)}
        self.elements
        self.forbidden = set()

    @property
    def elements(self):
        self.rows = [[self.grid[(i, j)] for j in range(9)] for i in range(9)]
        self.columns = [[self.grid[(i, j)] for i in range(9)] for j in range(9)]
        self.squares = {(i, j): [self.grid[(3*i+k, 3*j+l)] for k in range(3) for l in range(3)]
                        for i in range(3) for j in range(3)}
        self.stack = [self.grid]
        self.empty = {k for k in self.grid.keys() if self.grid[k] == 0}
Basically, I work with the grid attribute, and whenever I need to update the other attributes I call elements. I prefer to call it manually, though. The question, however, is: if I change self.grid[(i,j)], does Python recalculate each attribute from scratch because self.grid changed, or does it only change the i-th row, j-th column, etc.?
Thank you
edit: added example code
As is, your question is totally unclear - but anyway, since you don't seem to understand what a property is and how it works...
class Obj(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y

    @property
    def x(self):
        return self._x / 2

    @x.setter
    def x(self, value):
        self._x = value * 2
Here we have a class with a get/set ("binding") property x, backed by a protected attribute _x.
The "#property" syntax here is mainly syntactic sugar, you could actually write this code as
class Obj(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def get_x(self):
        return self._x / 2

    def set_x(self, value):
        self._x = value * 2

    x = property(fget=get_x, fset=set_x)
The only difference from the previous version is that the get_x and set_x functions remain available as methods. Then if we have an Obj instance:
obj = Obj(2, 4)
Then
x = obj.x
is just a shortcut for
x = obj.get_x()
and
obj.x = 42
is just a shortcut for
obj.set_x(42)
How this "shortcut" works is fully documented here, with a whole chapter dedicated to the property type.
As you can see there's nothing magical here, and once you get (no pun intended) the descriptor protocol and how the property class uses it, you can answer the question by yourself.
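To make that concrete, here is a stripped-down pure-Python stand-in for the property type (a sketch for illustration only; the real builtin is implemented in C and also handles fdel and docstrings):

class MyProperty(object):
    """Bare-bones stand-in for the builtin property type."""
    def __init__(self, fget=None, fset=None):
        self.fget = fget
        self.fset = fset

    def __get__(self, obj, objtype=None):
        if obj is None:           # accessed on the class itself
            return self
        return self.fget(obj)     # obj.x      ->  fget(obj)

    def __set__(self, obj, value):
        self.fset(obj, value)     # obj.x = v  ->  fset(obj, v)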
Note that properties will ALWAYS add some overhead (vs. plain attributes or a direct method call), since there are more levels of indirection and more method calls involved, so it's best to only use them when it really makes sense.
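If you want to see that overhead for yourself, here is a quick sketch with timeit (my own example; absolute numbers vary by machine and Python version):

import timeit

setup = """
class Plain(object):
    def __init__(self):
        self.x = 1

class WithProp(object):
    def __init__(self):
        self._x = 1

    @property
    def x(self):
        return self._x

p = Plain()
w = WithProp()
"""

print(timeit.timeit('p.x', setup=setup))  # plain attribute lookup
print(timeit.timeit('w.x', setup=setup))  # property lookup: noticeably slower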
EDIT: now that you've posted your code, I confirm that you don't understand Python's "properties" - not only the technical side of it but even the basic concept of a "computed attribute".
The point of computed attributes in general (the builtin property type being just one generic implementation) is to have the interface of a plain attribute (something whose value you can get with value = obj.attrname and eventually set with obj.attrname = somevalue) while actually invoking a getter (and eventually a setter) under the hood.
Your elements "property" while technically implemented as a read-only property, is really a method that initializes half a dozen attributes of your class, doesn't return anything (well it implicitely returns None) and which return value is actually never used (of course). This is definitly not what computed attributes are for. This should NOT be a property, it should be a plain function (with some explicit name such as "setup_elements" or whatever makes sense here).
# nb1: class names should be CamelCased
# nb2: in Python 2.x, you want to inherit from 'object'
class Sudoku(object):
    def __init__(self, puzzle):
        self.grid = {(i, j): puzzle[i][j] for i in range(9) for j in range(9)}
        self.setup_elements()
        self.forbidden = set()

    def setup_elements(self):
        self.rows = [[self.grid[(i, j)] for j in range(9)] for i in range(9)]
        self.columns = [[self.grid[(i, j)] for i in range(9)] for j in range(9)]
        self.squares = {(i, j): [self.grid[(3*i+k, 3*j+l)] for k in range(3) for l in range(3)]
                        for i in range(3) for j in range(3)}
        self.stack = [self.grid]
        self.empty = {k for k, v in self.grid.items() if v == 0}
Now to answer your question:
if I change self.grid[(i,j)], does python calculate each attribute from scratch because self.grid was changed
self.grid is a plain attribute, so just rebinding self.grid[(i, j)] doesn't make "python" calculate anything else, of course. None of your object's other attributes will be impacted. Actually Python (the interpreter) has no mind-reading ability and will only do exactly what you asked for, nothing less, nothing more, period.
or does it only change the i-th row, j-th column
This :
obj = Sudoku(some_puzzle)
obj.grid[(1, 1)] = "WTF did you expect ?"
will NOT (I repeat: "NOT") do anything else than assigning the literal string "WTF did you expect ?" to obj.grid[(1, 1)]. None of the other attributes will be updated in any way.
Now if your question was: "if I change something in self.grid and call self.setup_elements() after, will Python recompute all attributes or only update self.rows[xxx] and self.columns[yyy]", then the answer is simple: Python will do exactly what you asked for: it will execute self.setup_elements(), line after line, statement after statement. Plain and simple. No magic here; the only thing you gain from making it a property instead of a plain method is that you don't have to type the () to invoke it.
So if what you expected from making this elements() method a property was some impossible magic happening behind the scenes to detect that you actually only wanted to recompute the impacted elements, then bad news: this is not going to happen, and you will have to explicitly tell the interpreter how to do so. Computed attributes might be part of the solution here, but not by any magic - you will have to write all the code needed to intercept assignments to any of those attributes and recompute what needs to be recomputed.
Beware, since all those attributes are mutable containers, just wrapping each of them into properties won't be enough - consider this:
class Foo(object):
    def __init__(self):
        self._bar = {"a": 1, "b": 2}

    @property
    def bar(self):
        print("getting self._bar")
        return self._bar

    @bar.setter
    def bar(self, value):
        print("setting self._bar to {}".format(value))
        self._bar = value
>>> f = Foo()
>>> f.bar
getting self._bar
{'a': 1, 'b': 2}
>>> f.bar['z'] = "WTF ?"
getting self._bar
>>> f.bar
getting self._bar
{'a': 1, 'b': 2, 'z': 'WTF ?'}
>>> bar = f.bar
getting self._bar
>>> bar
{'a': 1, 'b': 2, 'z': 'WTF ?'}
>>> bar["a"] = 99
>>> f.bar
getting self._bar
{'a': 99, 'b': 2, 'z': 'WTF ?'}
As you can see, we could mutate self._bar without the bar.setter function ever being invoked - because f.bar["x"] = "y" is actually NOT assigning to f.bar (which would require f.bar = "something else") but getting the f._bar dict through the Foo.bar getter, then invoking __setitem__() on that dict.
So if you want to intercept something like f.bar["x"] = "y", you will also have to write some dict-like object that intercepts all mutator access on the dict itself (__setitem__, but also __delitem__, etc.) and notifies f of those changes, and change your property so that it returns an instance of this dict-like object instead.
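For illustration, here is a minimal sketch of such a dict-like object (the names NotifyingDict and _on_change are mine, not from the answer):

class NotifyingDict(dict):
    """dict subclass that tells its owner about mutations."""
    def __init__(self, owner, *args, **kwargs):
        super(NotifyingDict, self).__init__(*args, **kwargs)
        self._owner = owner

    def __setitem__(self, key, value):
        super(NotifyingDict, self).__setitem__(key, value)
        self._owner._on_change(key)   # owner recomputes whatever depends on key

    def __delitem__(self, key):
        super(NotifyingDict, self).__delitem__(key)
        self._owner._on_change(key)

A complete version would also have to cover update(), pop(), popitem(), setdefault() and clear().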

How can a decorator pass variables into a function without changing its signature?

Let me first acknowledge that what I want to do may be considered anything from silly to evil, but I want to find out if I can do it in Python anyway.
Let's say I have a function decorator that takes keyword arguments defining variables, and I want to access those variables in the wrapped function. I might do something like this:
from functools import wraps

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            return f(extras, *args, **kwargs)
        return wrapped
    return wrapper
Now I can do something like:
@more_vars(a='hello', b='world')
def test(deco_vars, x, y):
    print(deco_vars['a'], deco_vars['b'])
    print(x, y)

test(1, 2)
# Output:
# hello world
# 1 2
The thing I don't like about this is that when you use this decorator, you have to change the call signature of the function, adding the extra variable in addition to slapping on the decorator. Also, if you look at the help for the function, you see an extra variable that you're not expected to use when calling the function:
help(test)
# Output:
# Help on function test in module __main__:
#
# test(deco_vars, x, y)
This makes it look like the user is expected to call the function with 3 parameters, but obviously that won't work. So you'd have to also add a message to the docstring indicating that the first parameter isn't part of the interface, it's just an implementation detail and should be ignored. That's kind of crappy, though. Is there any way to do this without hanging these variables on something in the global scope? Ideally, I'd like it to look like the following:
@more_vars(a='hello', b='world')
def test(x, y):
    print(a, b)
    print(x, y)

test(1, 2)
# Output:
# hello world
# 1 2
help(test)
# Output:
# Help on function test in module __main__:
#
# test(x, y)
I am content with a Python 3 only solution if one exists.
You could do this with some trickery that inserts the variables passed to the decorator into the function's local variables:
import sys
from functools import wraps
from types import FunctionType

def is_python3():
    return sys.version_info >= (3, 0)

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            fn_globals = {}
            fn_globals.update(globals())
            fn_globals.update(extras)
            if is_python3():
                func_code = '__code__'
            else:
                func_code = 'func_code'
            call_fn = FunctionType(getattr(f, func_code), fn_globals)
            return call_fn(*args, **kwargs)
        return wrapped
    return wrapper

@more_vars(a="hello", b="world")
def test(x, y):
    print("locals: {}".format(locals()))
    print("x: {}".format(x))
    print("y: {}".format(y))
    print("a: {}".format(a))
    print("b: {}".format(b))

if __name__ == "__main__":
    test(1, 2)
Can you do this? Sure! Should you do this? Probably not!
(Code available here.)
EDIT: answer edited for readability. Latest answer is on top, original follows.
If I understand correctly:

- you want the new arguments to be defined as keywords in the @more_vars decorator
- you want to use them in the decorated function
- and you want them to be hidden from normal users (the exposed signature should still be the normal signature)
Have a look at the @with_partial decorator in my library makefun. It provides this functionality out of the box:
from makefun import with_partial

@with_partial(a='hello', b='world')
def test(a, b, x, y):
    """Here is a doc"""
    print(a, b)
    print(x, y)
It yields the expected output and the docstring is modified accordingly:
test(1, 2)
help(test)
yields
hello world
1 2
Help on function test in module <...>:
test(x, y)
<This function is equivalent to 'test(x, y, a=hello, b=world)', see original 'test' doc below.>
Here is a doc
To answer the question in your comment: the function creation strategy in makefun is exactly the same as the one in the famous decorator library: compile + exec. No magic here, but decorator has been using this trick for years in real-world applications, so it is quite solid. See def _make in the source code.
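To picture that strategy, here is a bare-bones illustration of the compile + exec trick (my own sketch of the general idea, not makefun's actual code): the wrapper's source is generated as a string with the desired visible signature, then compiled into a real function.

def _target_(x, y, a, b):
    print(a, b)
    print(x, y)

# Generate a function whose *visible* signature is test(x, y), but which
# forwards to the target with the extra arguments filled in.
src = (
    "def test(x, y):\n"
    "    return _target_(x, y, a='hello', b='world')\n"
)
namespace = {'_target_': _target_}
exec(compile(src, '<generated>', 'exec'), namespace)
test = namespace['test']

test(1, 2)   # hello world / 1 2
help(test)   # shows: test(x, y)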
Note that the makefun library also provides a partial(f, *args, **kwargs) function if you want to create the decorator yourself for some reason (see below for inspiration).
If you wish to do this manually, here is a solution that should work as you expect; it relies on the wraps function provided by makefun to modify the exposed signature:
from inspect import signature
from makefun import wraps, remove_signature_parameters

def more_vars(**extras):
    def wrapper(f):
        # (1) capture the signature of the function to wrap and remove the invisible arg
        func_sig = signature(f)
        new_sig = remove_signature_parameters(func_sig, 'invisible_args')

        # (2) create a wrapper with the new signature
        @wraps(f, new_sig=new_sig)
        def wrapped(*args, **kwargs):
            # inject the invisible args again
            kwargs['invisible_args'] = extras
            return f(*args, **kwargs)

        return wrapped
    return wrapper
You can test that it works:
@more_vars(a='hello', b='world')
def test(x, y, invisible_args):
    a = invisible_args['a']
    b = invisible_args['b']
    print(a, b)
    print(x, y)

test(1, 2)
help(test)
You can even make the decorator definition more compact if you use decopatch to remove the useless level of nesting:
from inspect import signature
from decopatch import function_decorator, DECORATED
from makefun import wraps, remove_signature_parameters

@function_decorator
def more_vars(f=DECORATED, **extras):
    # (1) capture the signature of the function to wrap and remove the invisible arg
    func_sig = signature(f)
    new_sig = remove_signature_parameters(func_sig, 'invisible_args')

    # (2) create a wrapper with the new signature
    @wraps(f, new_sig=new_sig)
    def wrapped(*args, **kwargs):
        kwargs['invisible_args'] = extras
        return f(*args, **kwargs)

    return wrapped
Finally, if you would rather not depend on any external library, the most pythonic way is to create a function factory (but then you cannot use it as a decorator):
def make_test(a, b, name=None):
    def test(x, y):
        print(a, b)
        print(x, y)
    if name is not None:
        test.__name__ = name
    return test

test = make_test(a='hello', b='world')
test2 = make_test(a='hello', b='there', name='test2')
I'm the author of makefun and decopatch by the way ;)
It sounds like your only problem is that help is showing the signature of the raw test as the signature of the wrapped function, and you don't want it to.
The only reason that's happening is that wraps (or, rather, update_wrapper, which wraps calls) explicitly copies this from the wrappee to the wrapper.
You can decide exactly what you do and don't want to copy. If what you want to do differently is simple enough, it's just a matter of filtering stuff out of the default WRAPPER_ASSIGNMENTS and WRAPPER_UPDATES. If you want to change other stuff, you may need to fork update_wrapper and use your own version—but functools is one of those modules that has a link to the source right at the top of the docs, because it's meant to be used as readable sample code.
In your case, it may just be a matter of wraps(f, updated=[]), or you may want to do something fancy, like use inspect.signature to get the signature of f, modify it to remove the first parameter, and build a wrapper explicitly around that to fool even the inspect module.
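Here is a sketch of that fancier route (my own illustration, built on the question's original decorator): setting __signature__ on the wrapper is honored by both inspect.signature and help().

import inspect
from functools import wraps

def more_vars(**extras):
    def wrapper(f):
        @wraps(f)
        def wrapped(*args, **kwargs):
            return f(extras, *args, **kwargs)
        # Report f's signature minus the first (injected) parameter
        sig = inspect.signature(f)
        params = list(sig.parameters.values())[1:]
        wrapped.__signature__ = sig.replace(parameters=params)
        return wrapped
    return wrapper

@more_vars(a='hello', b='world')
def test(deco_vars, x, y):
    print(deco_vars['a'], deco_vars['b'])
    print(x, y)

help(test)   # shows: test(x, y)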
I've found a solution to this problem, although the solution is by most standards almost certainly worse than the problem itself. With some clever rewriting of the decorated function's bytecode, you can redirect all references to variables of a given name to a new closure you can dynamically create for the function. This solution only works for the standard CPython, and I have only tested it with 3.7.
import inspect
from dis import opmap, Bytecode
from types import FunctionType, CodeType

def more_vars(**vars):
    '''Decorator to inject more variables into a function.'''
    def wrapper(f):
        code = f.__code__
        new_freevars = code.co_freevars + tuple(vars.keys())
        new_globals = [var for var in code.co_names if var not in vars.keys()]
        new_locals = [var for var in code.co_varnames if var not in vars.keys()]
        payload = b''.join(
            filtered_bytecode(f, new_freevars, new_globals, new_locals))
        new_code = CodeType(code.co_argcount,
                            code.co_kwonlyargcount,
                            len(new_locals),
                            code.co_stacksize,
                            code.co_flags & ~inspect.CO_NOFREE,
                            payload,
                            code.co_consts,
                            tuple(new_globals),
                            tuple(new_locals),
                            code.co_filename,
                            code.co_name,
                            code.co_firstlineno,
                            code.co_lnotab,
                            code.co_freevars + tuple(vars.keys()),
                            code.co_cellvars)
        closure = tuple(get_cell(v) for (k, v) in vars.items())
        return FunctionType(new_code, f.__globals__, f.__name__, f.__defaults__,
                            (f.__closure__ or ()) + closure)
    return wrapper

def get_cell(val=None):
    '''Create a closure cell object with initial value.'''
    # If you know a better way to do this, I'd like to hear it.
    x = val
    def closure():
        return x  # pragma: no cover
    return closure.__closure__[0]

def filtered_bytecode(func, freevars, globals, locals):
    '''Get the bytecode for a function with adjusted closed-over variables.

    Any references to globals or locals in the bytecode which exist in the
    freevars are modified to reference the freevars instead.
    '''
    opcode_map = {
        opmap['LOAD_FAST']: opmap['LOAD_DEREF'],
        opmap['STORE_FAST']: opmap['STORE_DEREF'],
        opmap['LOAD_GLOBAL']: opmap['LOAD_DEREF'],
        opmap['STORE_GLOBAL']: opmap['STORE_DEREF'],
    }
    freevars_map = {var: idx for (idx, var) in enumerate(freevars)}
    globals_map = {var: idx for (idx, var) in enumerate(globals)}
    locals_map = {var: idx for (idx, var) in enumerate(locals)}

    for instruction in Bytecode(func):
        if instruction.opcode not in opcode_map:
            yield bytes([instruction.opcode, instruction.arg or 0])
        elif instruction.argval in freevars_map:
            yield bytes([opcode_map[instruction.opcode],
                         freevars_map[instruction.argval]])
        elif 'GLOBAL' in instruction.opname:
            yield bytes([instruction.opcode,
                         globals_map[instruction.argval]])
        elif 'FAST' in instruction.opname:
            yield bytes([instruction.opcode,
                         locals_map[instruction.argval]])
This behaves exactly as I wanted:
In [1]: @more_vars(a='hello', b='world')
   ...: def test(x, y):
   ...:     print(a, b)
   ...:     print(x, y)
   ...:

In [2]: test(1, 2)
hello world
1 2

In [3]: help(test)
Help on function test in module __main__:

test(x, y)
This is almost certainly not ready for production use. I would be surprised if there weren't edge cases that behave unexpectedly, and possibly even segfault. I'd probably file this under the "educational curiosity" heading.

str.format() with lazy dict?

I want to use str.format() and pass it a custom lazy dictionary.
str.format() should only access the keys in the lazy dict that it needs.
Is this possible?
Which interface needs to be implemented by the lazy_dict?
Update
This is not what I want:
'{0[a]}'.format(d)
I need something like this:
'{a}'.format(**d)
This needs to run on Python 2.7.
When you do '{a}'.format(**d), the **d part transforms the "lazy" dict into a regular one. That is where every key gets accessed, and format() can't do anything about it.
You could craft some proxy objects which are put in place of the elements and which do the "real" work only on string access.
Something like
class LazyProxy(object):
    def __init__(self, prx):
        self.prx = prx

    def __format__(self, fmtspec):
        return format(self.prx(), fmtspec)

    def __repr__(self):
        return repr(self.prx())

    def __str__(self):
        return str(self.prx())
You can put these elements into a dict, such as
interd = {k: LazyProxy(lambda k=k: lazydict[k])  # k=k binds each key at definition time
          for k in lazydict.iterkeys()}
I didn't test this, but I think this fulfills your needs.
After the last edit, it now works with !r and !s as well.
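A quick way to check the laziness (my own example; here the lazy dict is simulated with plain callables, and make_value stands in for whatever expensive computation is being deferred):

def make_value(k):
    print('computing %s' % k)   # shows which keys actually get computed
    return k.upper()

lazydict = {'a': lambda: make_value('a'), 'b': lambda: make_value('b')}
interd = {k: LazyProxy(lazydict[k]) for k in lazydict}

print('{a}'.format(**interd))   # prints "computing a" then "A"; 'b' is never computed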
You can use the __format__ method (Python 3 only). See the doc here.
If I understand your question correctly, you want to pass a custom dictionary that computes values only when needed. First, let's look at an implementation of __getitem__():
>>> class LazyDict(object):
...     def __init__(self, d):
...         self.d = d
...     def __getitem__(self, k):
...         print k  # <-- tracks the needed keys
...         return self.d[k]
...
>>> d = LazyDict({'a': 19, 'b': 20})
>>> '{0[a]}'.format(d)
a
'19'
This shows that only key 'a' is accessed; 'b' is not, so you already have your lazy access.
But also, any object attribute is usable for str.format this way, and by using the @property decorator, you can access function results:
class MyObject(object):
    def __init__(self):
        self.a = 19
        self.b = 20

    def __getitem__(self, var):
        # this lets you access any attribute of your instance,
        # or even the result of a function if it is decorated by @property:
        return getattr(self, var)

    @property
    def c(self):
        return 21
Example of usage:
>>> m = MyObject()
>>> '{0[c]}'.format(m)
'21'
But note that this also works, making the formatting string a little more specific, while avoiding the need to implement __getitem__():
>>> '{0.c}'.format(m)
'21'

Intercept operator lookup on metaclass

I have a class that needs to do some magic with every operator, like __add__, __sub__ and so on.
Instead of creating each function in the class, I have a metaclass which defines every operator in the operator module.
import operator

class MetaFuncBuilder(type):
    def __init__(self, *args, **kw):
        super().__init__(*args, **kw)
        attr = '__{0}{1}__'
        for op in (x for x in dir(operator) if not x.startswith('__')):
            oper = getattr(operator, op)
            # ... I have my magic replacement functions here:
            # `func` for `__operators__` and `__ioperators__`
            # and `rfunc` for `__roperators__`
            setattr(self, attr.format('', op), func)
            setattr(self, attr.format('r', op), rfunc)
The approach works fine, but I think it would be better if I generated the replacement operator only when needed.
Lookup of operators should be on the metaclass, because x + 1 is done as type(x).__add__(x, 1) instead of x.__add__(x, 1), but it doesn't get caught by the __getattr__ or __getattribute__ methods.
That doesn't work:
class Meta(type):
    def __getattr__(self, name):
        if name in ['__add__', '__sub__', '__mul__', ...]:
            func = lambda: ...  # generate magic function
            return func
Also, the resulting "function" must be a method bound to the instance used.
Any ideas on how can I intercept this lookup? I don't know if it's clear what I want to do.
For those questioning why I need to do this kind of thing, check the full code here.
That's a tool to generate functions (just for fun) that could work as a replacement for lambdas.
Example:
>>> f = FuncBuilder()
>>> g = f ** 2
>>> g(10)
100
>>> g
<var [('pow', 2)]>
Just for the record, I don't want to know another way to do the same thing (I won't declare every single operator on the class... that will be boring and the approach I have works pretty fine :). I want to know how to intercept attribute lookup from an operator.
Some black magic lets you achieve your goal:
operators = ["add", "mul"]
class OperatorHackiness(object):
"""
Use this base class if you want your object
to intercept __add__, __iadd__, __radd__, __mul__ etc.
using __getattr__.
__getattr__ will called at most _once_ during the
lifetime of the object, as the result is cached!
"""
def __init__(self):
# create a instance-local base class which we can
# manipulate to our needs
self.__class__ = self.meta = type('tmp', (self.__class__,), {})
# add operator methods dynamically, because we are damn lazy.
# This loop is however only called once in the whole program
# (when the module is loaded)
def create_operator(name):
def dynamic_operator(self, *args):
# call getattr to allow interception
# by user
func = self.__getattr__(name)
# save the result in the temporary
# base class to avoid calling getattr twice
setattr(self.meta, name, func)
# use provided function to calculate result
return func(self, *args)
return dynamic_operator
for op in operators:
for name in ["__%s__" % op, "__r%s__" % op, "__i%s__" % op]:
setattr(OperatorHackiness, name, create_operator(name))
# Example user class
class Test(OperatorHackiness):
def __init__(self, x):
super(Test, self).__init__()
self.x = x
def __getattr__(self, attr):
print "__getattr__(%s)" % attr
if attr == "__add__":
return lambda a, b: a.x + b.x
elif attr == "__iadd__":
def iadd(self, other):
self.x += other.x
return self
return iadd
elif attr == "__mul__":
return lambda a, b: a.x * b.x
else:
raise AttributeError
## Some test code:
a = Test(3)
b = Test(4)
# let's test addition
print(a + b) # this first call to __add__ will trigger
# a __getattr__ call
print(a + b) # this second call will not!
# same for multiplication
print(a * b)
print(a * b)
# inplace addition (getattr is also only called once)
a += b
a += b
print(a.x) # yay!
Output
__getattr__(__add__)
7
7
__getattr__(__mul__)
12
12
__getattr__(__iadd__)
11
Now you can use your second code sample literally by inheriting from my OperatorHackiness base class. You even get an additional benefit: __getattr__ will only be called once per instance and operator, and there is no additional layer of recursion involved for the caching. We hereby circumvent the problem of method calls being slow compared to method lookup (as Paul Hankin correctly noticed).
NOTE: The loop to add the operator methods is only executed once in your whole program, so the preparation takes constant overhead in the range of milliseconds.
The issue at hand is that Python looks up __xxx__ methods on the object's class, not on the object itself -- and if one is not found there, it does not fall back to __getattr__ or __getattribute__.
The only way to intercept such calls is to have a method already there. It can be a stub function, as in Niklas Baumstark's answer, or it can be the full-fledged replacement function; either way, however, there must be something already there or you will not be able to intercept such calls.
If you are reading closely, you will have noticed that your requirement for having the final method be bound to the instance is not possible -- you can do it, but Python will never call it, as Python looks at the class of the instance, not the instance itself, for __xxx__ methods. Niklas Baumstark's solution of making a unique temp class for each instance is as close as you can get to that requirement.
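A quick demonstration of that asymmetry (my own example, not from the answer):

class C(object):
    def __getattr__(self, name):
        if name == '__add__':
            return lambda other: 99
        raise AttributeError(name)

c = C()
print(c.__add__(1))   # 99: explicit attribute access does reach __getattr__
try:
    c + 1             # implicit special-method lookup skips __getattr__ entirely
except TypeError as e:
    print(e)          # unsupported operand type(s) for +: 'C' and 'int'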
It looks like you are making things too complicated. You can define a mixin class and inherit from it. This is both simpler than using metaclasses and will run faster than using __getattr__.
class OperatorMixin(object):
    def __add__(self, other):
        return func(self, other)

    def __radd__(self, other):
        return rfunc(self, other)

    # ... other operators defined too
Then have every class that needs these operators inherit from OperatorMixin:
class Expression(OperatorMixin):
    ...  # the regular methods for your class
Generating the operator methods when they're needed isn't a good idea: __getattr__ is slow compared to regular method lookup, and since the methods are stored once (on the mixin class), it saves almost nothing.
If you want to achieve your goal without metaclasses, you can append the following to your code:
def get_magic_wrapper(name):
    def wrapper(self, *a, **kw):
        print('Wrapping')
        res = getattr(self._data, name)(*a, **kw)
        return res
    return wrapper

_magic_methods = ['__str__', '__len__', '__repr__']
for _mm in _magic_methods:
    setattr(ShowMeList, _mm, get_magic_wrapper(_mm))
It reroutes the methods in _magic_methods to the self._data object by adding these attributes to the class iteratively. To check that it works:
>>> l = ShowMeList(range(8))
>>> len(l)
Wrapping
8
>>> l
Wrapping
[0, 1, 2, 3, 4, 5, 6, 7]
>>> print(l)
Wrapping
[0, 1, 2, 3, 4, 5, 6, 7]
