Override Python's 'in' operator?

If I am creating my own class in Python, what function should I define so as to allow the use of the in operator, e.g.
class MyClass(object):
    ...

m = MyClass()
if 54 in m:
    ...

MyClass.__contains__(self, item)

A more complete answer is:
class MyClass(object):
    def __init__(self):
        self.numbers = [1, 2, 3, 4, 54]
    def __contains__(self, key):
        return key in self.numbers
Here you would get True when asking if 54 was in m:
>>> m = MyClass()
>>> 54 in m
True
See documentation on overloading __contains__.

Another way to get the desired behavior is to implement __iter__.
If you don't define __contains__, Python falls back to __iter__ (if it is defined) to check whether your data structure contains the specified value.
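As a minimal sketch of that fallback, reusing the example above but defining only __iter__, the in test still works because Python iterates to look for the value:

class MyClass(object):
    def __init__(self):
        self.numbers = [1, 2, 3, 4, 54]
    def __iter__(self):
        return iter(self.numbers)

m = MyClass()
print(54 in m)  # True -- found by iteration, no __contains__ needed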


str.format() with lazy dict?

I want to use str.format() and pass it a custom lazy dictionary.
str.format() should only access the key in the lazy dict it needs.
Is this possible?
Which interface needs to be implemented by the lazy_dict?
Update
This is not what I want:
'{0[a]}'.format(d)
I need something like this:
'{a}'.format(**d)
This needs to run on Python 2.7.
When you do '{a}'.format(**d), the **d part converts the "lazy" dict into a regular one; that is the point where all keys are accessed, and format() can't do anything about it.
You could craft some proxy objects which are put in place of the elements, and on string access they do the "real" work.
Something like
class LazyProxy(object):
    def __init__(self, prx):
        self.prx = prx
    def __format__(self, fmtspec):
        return format(self.prx(), fmtspec)
    def __repr__(self):
        return repr(self.prx())
    def __str__(self):
        return str(self.prx())
You can put these elements into a dict, such as

interd = {k: LazyProxy(lambda k=k: lazydict[k]) for k in lazydict.iterkeys()}

(the k=k default argument binds each key at definition time, so the lambdas don't all share the final value of k).
I didn't test this, but I think this fulfills your needs.
After the last edit, it now works with !r and !s as well.
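As a hedged, self-contained usage sketch (Python 2.7; make_value and the callable-valued lazydict are hypothetical stand-ins for the real lazy source):

def make_value(k):
    print 'computing %s' % k  # shows when the real work happens
    return k.upper()

lazydict = {'a': lambda: make_value('a'), 'b': lambda: make_value('b')}
interd = {k: LazyProxy(lambda k=k: lazydict[k]()) for k in lazydict.iterkeys()}

print '{a}'.format(**interd)  # prints "computing a" only, then "A"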
You can use the __format__ method (available since Python 2.6; see the data model documentation).
If I understand your question correctly, you want to pass a custom dictionary that computes values only when they are needed. The first thing to look at is an implementation of __getitem__():
>>> class LazyDict(object):
...     def __init__(self, d):
...         self.d = d
...     def __getitem__(self, k):
...         print k  # <-- tracks the needed keys
...         return self.d[k]
...
>>> d = LazyDict({'a': 19, 'b': 20})
>>> '{0[a]}'.format(d)
a
'19'
This shows that only key 'a' is accessed; 'b' is not, so you already have your lazy access.
But also, any object attribute is usable with str.format this way, and using the @property decorator, you can access function results:
class MyObject(object):
    def __init__(self):
        self.a = 19
        self.b = 20
    def __getitem__(self, var):
        return getattr(self, var)
    # __getitem__ lets you reach any attribute of your instance,
    # or even the result of a function if it is decorated with @property:
    @property
    def c(self):
        return 21
Example of usage:
>>> m = MyObject()
>>> '{0[c]}'.format(m)
'21'
But note that this also works, making the format string a little more specific while avoiding the need for a __getitem__() implementation:
>>> '{0.c}'.format(m)
'21'

Using map on methods in Python

I have some classes in Python:
class Class1:
    def method(self):
        return 1

class Class2:
    def method(self):
        return 2
and a list myList whose elements are all either instances of Class1 or Class2. I'd like to create a new list whose elements are the return values of method called on each element of myList. I have tried using a "virtual" base class
class Class0:
    def method(self):
        return 0

class Class1(Class0):
    def method(self):
        return 1

class Class2(Class0):
    def method(self):
        return 2
But if I try map(Class0.method, myList) I just get [0, 0, 0, ...]. I'm a bit new to Python, and I hear that "duck typing" is preferred to actual inheritance, so maybe this is the wrong approach. Of course, I can do
[myList[index].method() for index in xrange(len(myList))]
but I like the brevity of map. Is there a way to still use map for this?
You can use
map(lambda e: e.method(), myList)
But I think this is better:
[e.method() for e in myList]
P.S.: I don't think there is ever a need for range(len(collection)).
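If you genuinely need the index alongside each element, enumerate is the idiomatic replacement for range(len(...)):

pairs = [(index, e.method()) for index, e in enumerate(myList)]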
The operator.methodcaller tool is exactly what you're looking for:

from operator import methodcaller
map(methodcaller("method"), myList)
Alternatively you can use a list comprehension:
[obj.method() for obj in myList]
This is best:
[o.method() for o in myList]
Map seems to be favored by people pining for Haskell or Lisp, but Python has fine iterative structures you can use instead.
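To make the comparison concrete, here is a small sketch using the Class1/Class2 definitions from the question (on Python 2, where map returns a list directly):

from operator import methodcaller

myList = [Class1(), Class2(), Class1()]

print map(lambda e: e.method(), myList)    # [1, 2, 1]
print map(methodcaller("method"), myList)  # [1, 2, 1]
print [e.method() for e in myList]         # [1, 2, 1]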

__add__ all elements of a list

I'd like to combine a list of class instances of a class for which the __add__ method is defined.
i.e., I have a list of class instances L=[A,B,C,D] and I want their sum E = A+B+C+D, but generalized so that instead of the + syntax I could do something like E = sum(L).
What function should I use to do that? Is the __add__ method adequate, or do I need to define a different class method (e.g. __iadd__) in order to accomplish this?
(if this turns out to be a duplicate, how should I be asking the question?)
import operator
reduce(operator.add, L)
sum() starts from 0, so it will try to add the integer 0 to an instance of your class. Define __radd__ so that, for example, int + Foo(1) is defined:
class Foo(object):
    def __init__(self, val):
        self.val = val
    def __add__(self, other):
        return self.val + other.val
    def __radd__(self, other):
        return other + self.val
A = Foo(1)
B = Foo(2)
L = [A,B]
print(A+B)
# 3
print(sum(L))
# 3
The reduce function allows you to apply any binary function or method to all the elements of a sequence. So, you could write:
reduce(YourClass.__add__, sequence)
If not all objects in the sequence are instances of the same class, then instead use this:
import operator
reduce(operator.add, sequence)
Or this:
reduce(lambda x, y: x + y, sequence)
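Note that on Python 3, reduce is no longer a builtin; the same approach needs an import from functools:

from functools import reduce
import operator

total = reduce(operator.add, sequence)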

Intercept operator lookup on metaclass

I have a class that needs to do some magic with every operator, like __add__, __sub__, and so on.
Instead of creating each function in the class, I have a metaclass which defines every operator in the operator module.
import operator

class MetaFuncBuilder(type):
    def __init__(self, *args, **kw):
        super().__init__(*args, **kw)
        attr = '__{0}{1}__'
        for op in (x for x in dir(operator) if not x.startswith('__')):
            oper = getattr(operator, op)
            # ... I have my magic replacement functions here:
            # `func` for `__operators__` and `__ioperators__`,
            # and `rfunc` for `__roperators__`
            setattr(self, attr.format('', op), func)
            setattr(self, attr.format('r', op), rfunc)
The approach works fine, but I think it would be better if I generated the replacement operator only when needed.
Lookup of the operators has to happen on the metaclass, because x + 1 is executed as type(x).__add__(x, 1) rather than x.__add__(x, 1); yet the lookup is not caught by the __getattr__ or __getattribute__ methods.
That doesn't work:
class Meta(type):
    def __getattr__(self, name):
        if name in ['__add__', '__sub__', '__mul__', ...]:
            func = lambda: ...  # generate magic function
            return func
Also, the resulting "function" must be a method bound to the instance used.
Any ideas on how I can intercept this lookup? I don't know if it's clear what I want to do.
For those wondering why I need this kind of thing, check the full code here.
It's a tool to generate functions (just for fun) that could work as replacements for lambdas.
Example:
>>> f = FuncBuilder()
>>> g = f ** 2
>>> g(10)
100
>>> g
<var [('pow', 2)]>
Just for the record, I don't want to know another way to do the same thing (I won't declare every single operator on the class... that would be boring, and the approach I have works just fine :). I want to know how to intercept attribute lookup from an operator.
Some black magic lets you achieve your goal:
operators = ["add", "mul"]

class OperatorHackiness(object):
    """
    Use this base class if you want your object
    to intercept __add__, __iadd__, __radd__, __mul__ etc.
    using __getattr__.

    __getattr__ will be called at most _once_ during the
    lifetime of the object, as the result is cached!
    """

    def __init__(self):
        # create an instance-local base class which we can
        # manipulate to our needs
        self.__class__ = self.meta = type('tmp', (self.__class__,), {})

# add operator methods dynamically, because we are damn lazy.
# This loop is however only executed once in the whole program
# (when the module is loaded)
def create_operator(name):
    def dynamic_operator(self, *args):
        # call getattr to allow interception by the user
        func = self.__getattr__(name)
        # save the result in the temporary base class
        # to avoid calling getattr twice
        setattr(self.meta, name, func)
        # use the provided function to calculate the result
        return func(self, *args)
    return dynamic_operator

for op in operators:
    for name in ["__%s__" % op, "__r%s__" % op, "__i%s__" % op]:
        setattr(OperatorHackiness, name, create_operator(name))
# Example user class
class Test(OperatorHackiness):
    def __init__(self, x):
        super(Test, self).__init__()
        self.x = x

    def __getattr__(self, attr):
        print "__getattr__(%s)" % attr
        if attr == "__add__":
            return lambda a, b: a.x + b.x
        elif attr == "__iadd__":
            def iadd(self, other):
                self.x += other.x
                return self
            return iadd
        elif attr == "__mul__":
            return lambda a, b: a.x * b.x
        else:
            raise AttributeError
## Some test code:

a = Test(3)
b = Test(4)

# let's test addition
print(a + b)  # this first call to __add__ will trigger a __getattr__ call
print(a + b)  # this second call will not!

# same for multiplication
print(a * b)
print(a * b)

# inplace addition (getattr is also only called once)
a += b
a += b
print(a.x)  # yay!
Output
__getattr__(__add__)
7
7
__getattr__(__mul__)
12
12
__getattr__(__iadd__)
11
Now you can use your second code sample literally by inheriting from my OperatorHackiness base class. You even get an additional benefit: __getattr__ will be called only once per instance and operator, and there is no additional layer of recursion involved for the caching. We hereby circumvent the problem of method calls being slow compared to method lookup (as Paul Hankin correctly noted).
NOTE: The loop to add the operator methods is only executed once in your whole program, so the preparation takes constant overhead in the range of milliseconds.
The issue at hand is that Python looks up __xxx__ methods on the object's class, not on the object itself -- and if it is not found, it does not fall back to __getattr__ nor __getattribute__.
The only way to intercept such calls is to have a method already there. It can be a stub function, as in Niklas Baumstark's answer, or it can be the full-fledged replacement function; either way, however, there must be something already there or you will not be able to intercept such calls.
If you read closely, you will have noticed that your requirement that the final method be bound to the instance cannot be met directly -- you can bind such a method, but Python will never call it, because it looks for __xxx__ methods on the class of the instance, not on the instance itself. Niklas Baumstark's solution of making a unique temporary class for each instance is as close as you can get to that requirement.
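A minimal demonstration of that lookup rule: for new-style classes, a dunder attached to the instance is simply never consulted by the operator machinery.

class C(object):
    pass

c = C()
c.__add__ = lambda other: 42  # on the instance: ignored by `+`
try:
    c + 1
except TypeError:
    print("instance __add__ was never consulted")

C.__add__ = lambda self, other: 42  # on the class: found
print(c + 1)  # 42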
It looks like you are making things too complicated. You can define a mixin class and inherit from it. This is both simpler than using metaclasses and will run faster than using __getattr__.
class OperatorMixin(object):
    def __add__(self, other):
        return func(self, other)
    def __radd__(self, other):
        return rfunc(self, other)
    # ... the other operators, defined the same way

Then every class you want to have these operators can inherit from OperatorMixin:

class Expression(OperatorMixin):
    # ... the regular methods of your class
Generating the operator methods when they're needed isn't a good idea: __getattr__ is slow compared to regular method lookup, and since the methods are stored once (on the mixin class), it saves almost nothing.
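For illustration, here is a hedged, runnable sketch of the mixin idea; func and rfunc above are placeholders for the asker's magic, so this version substitutes a deferred-computation scheme in the spirit of the FuncBuilder example:

class OperatorMixin(object):
    def __add__(self, other):
        return Deferred(lambda x: self(x) + other)
    def __pow__(self, other):
        return Deferred(lambda x: self(x) ** other)
    # ... the remaining operators follow the same pattern

class Deferred(OperatorMixin):
    def __init__(self, fn=lambda x: x):
        self.fn = fn
    def __call__(self, x):
        return self.fn(x)

f = Deferred()
g = f ** 2
print(g(10))  # 100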
If you want to achieve your goal without metaclasses, you can append the following to your code:
def get_magic_wrapper(name):
    def wrapper(self, *a, **kw):
        print('Wrapping')
        res = getattr(self._data, name)(*a, **kw)
        return res
    return wrapper

_magic_methods = ['__str__', '__len__', '__repr__']
for _mm in _magic_methods:
    setattr(ShowMeList, _mm, get_magic_wrapper(_mm))
It reroutes the methods in _magic_methods to the self._data object, by adding these attributes to the class iteratively. To check if it works:
>>> l = ShowMeList(range(8))
>>> len(l)
Wrapping
8
>>> l
Wrapping
[0, 1, 2, 3, 4, 5, 6, 7]
>>> print(l)
Wrapping
[0, 1, 2, 3, 4, 5, 6, 7]

How to call same method for a list of objects?

Suppose code like this:
class Base:
    def start(self):
        pass
    def stop(self):
        pass

class A(Base):
    def start(self):
        pass  # ... do something for A
    def stop(self):
        pass  # ... do something for A

class B(Base):
    def start(self):
        pass  # ... do something for B
    def stop(self):
        pass  # ... do something for B

a1 = A(); a2 = A()
b1 = B(); b2 = B()
all = [a1, b1, b2, a2, .....]
Now I want to call methods start and stop (maybe also others) for each object in the list all. Is there any elegant way for doing this except of writing a bunch of functions like
def start_all(all):
    for item in all:
        item.start()

def stop_all(all):
    ...
This will work
all = [a1, b1, b2, a2, .....]
map(lambda x: x.start(), all)
A simple example:

all = ["MILK", "BREAD", "EGGS"]
map(lambda x: x.lower(), all)
>>> ['milk', 'bread', 'eggs']

and in Python 3:

all = ["MILK", "BREAD", "EGGS"]
list(map(lambda x: x.lower(), all))
>>> ['milk', 'bread', 'eggs']
It seems like there would be a more Pythonic way of doing this, but I haven't found it yet.
I use "map" sometimes if I'm calling the same function (not a method) on a bunch of objects:
map(do_something, a_list_of_objects)
This replaces a bunch of code that looks like this:
do_something(a)
do_something(b)
do_something(c)
...
But this can also be achieved with a pedestrian "for" loop:
for obj in a_list_of_objects:
do_something(obj)
The downside is that a) you're creating a list as a return value from "map" that's just thrown away, and b) it might be more confusing than the simple loop variant.
You could also use a list comprehension, but that's a bit abusive as well (once again, creating a throw-away list):
[ do_something(x) for x in a_list_of_objects ]
For methods, I suppose either of these would work (with the same reservations):
map(lambda x: x.method_call(), a_list_of_objects)
or
[ x.method_call() for x in a_list_of_objects ]
So, in reality, I think the pedestrian (yet effective) "for" loop is probably your best bet.
The approach
for item in all:
item.start()
is simple, easy, readable, and concise. This is the main approach Python provides for this operation. You can certainly encapsulate it in a function if that helps something. Defining a special function for this for general use is likely to be less clear than just writing out the for loop.
The *_all() functions are so simple that for a few methods I'd just write the functions. If you have lots of identical functions, you can write a generic function:
def apply_on_all(seq, method, *args, **kwargs):
    for obj in seq:
        getattr(obj, method)(*args, **kwargs)
Or create a function factory:
def create_all_applier(method, doc=None):
    def on_all(seq, *args, **kwargs):
        for obj in seq:
            getattr(obj, method)(*args, **kwargs)
    on_all.__doc__ = doc
    return on_all

start_all = create_all_applier('start', "Start all instances")
stop_all = create_all_applier('stop', "Stop all instances")
...
Maybe map, but since you don't want to make a list, you can write your own:

def call_for_all(f, seq):
    for i in seq:
        f(i)

then you can do:

call_for_all(lambda x: x.start(), all)
call_for_all(lambda x: x.stop(), all)

By the way, all is a built-in function, so don't shadow it ;-)
Starting in Python 2.6 there is an operator.methodcaller function.
So you can get something more elegant (and fast):
from operator import methodcaller
map(methodcaller('method_name'), list_of_objects)
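methodcaller can also bind positional and keyword arguments, which the lambda form would otherwise have to hard-code; for example:

from operator import methodcaller

strs = ['aq', 'bq', 'cq']
print(list(map(methodcaller('replace', 'q', 'x'), strs)))  # ['ax', 'bx', 'cx']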
Taking @Ants Aasma's answer one step further, you can create a wrapper that takes any method call and forwards it to all elements of a given list:
class AllOf:
    def __init__(self, elements):
        self.elements = elements
    def __getattr__(self, attr):
        def on_all(*args, **kwargs):
            for obj in self.elements:
                getattr(obj, attr)(*args, **kwargs)
        return on_all
That class can then be used like this:
class Foo:
    def __init__(self, val="quux!"):
        self.val = val
    def foo(self):
        print "foo: " + self.val

a = [Foo("foo"), Foo("bar"), Foo()]
AllOf(a).foo()
Which produces the following output:
foo: foo
foo: bar
foo: quux!
With some work and ingenuity it could probably be enhanced to handle attributes as well (returning a list of attribute values).
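A hedged sketch of that enhancement, assuming a given name is uniformly either a method or a data attribute across the elements:

class AllOf:
    def __init__(self, elements):
        self.elements = elements
    def __getattr__(self, attr):
        values = [getattr(obj, attr) for obj in self.elements]
        if all(callable(v) for v in values):
            # method call: forward to every element, collect the results
            def on_all(*args, **kwargs):
                return [v(*args, **kwargs) for v in values]
            return on_all
        return values  # data attribute: return the list of values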
If you would like to have a generic function while avoiding referring to method name using strings, you can write something like that:
def apply_on_all(seq, method, *args, **kwargs):
    for obj in seq:
        getattr(obj, method.__name__)(*args, **kwargs)

# to call:
apply_on_all(all, A.start)
This is similar to other answers, but has the advantage of using an explicit attribute reference (A.start) instead of a string. That helps avoid refactoring errors: it's easy to rename the start method and forget to update the strings that refer to it.
The best solution, in my opinion, depends on whether you need the result of the method and whether your method takes any arguments except self.
If you don't need the result, I would simply write a for loop:
for instance in lst:
instance.start()
If you need the result but the method takes no arguments, I would use map:
strs = ['A', 'B', 'C']
lower_strs = list(map(str.lower, strs)) # ['a', 'b', 'c']
And finally, if you need the result and the method takes arguments, a list comprehension works great:
strs = ['aq', 'bq', 'cq']
qx_strs = [i.replace('q', 'x') for i in strs] # ['ax', 'bx', 'cx']
