Python one-liner to call list of functions

I've got some old code where I stored lists of functions in Python as class attributes. These lists are used as a sort of event hook.
To call each function in the list with appropriate arguments, I've used one-liners mixing map with lambda expressions. I'm now concerned that there is unnecessary overhead in using lambda expressions like this. I guess the recommended way would be to drop both map and lambda and just use a standard for loop, for readability.
Is there a better (read faster) one-liner to do this, though?
For example:
class Foo:
    """Dummy class demonstrating event hook usage."""
    pre = []    # list of functions to call before entering loop.
    mid = []    # list of functions to call inside loop, with value
    post = []   # list of functions to call after loop.

    def __init__(self, verbose=False, send=True):
        """Attach functions when initialising class."""
        self._results = []
        if verbose:
            self.mid.append( self._print )
        self.mid.append( self._store )
        if send:
            self.post.append( self._send )

    def __call__(self, values):
        # call each function in self.pre (no functions there)
        map( lambda fn: fn(), self.pre )
        for val in values:
            # call each function in self.mid, with one passed argument
            map( lambda fn: fn(val), self.mid )
        # call each fn in self.post, with no arguments
        map( lambda fn: fn(), self.post )

    def _print(self, value):
        """Print argument, when verbose=True."""
        print value

    def _store(self, value):
        """Store results"""
        self._results.append(value)

    def _send(self):
        """Send results somewhere"""

# create instance of Foo
foo = Foo(verbose=True)
# equivalent to: foo.__call__( ... )
foo( [1, 2, 3, 4] )
Is there a better way to write those one-liner map calls?

The recommended way is definitely to use for loops; however, if you insist on using map, then operator.methodcaller might be just what you need:
>>> def foo(*args):
...     print 'foo', args
...
>>> def bar(*args):
...     print 'bar', args
...
>>> from operator import methodcaller
>>>
>>> map(methodcaller('__call__',1,2,3),[foo,bar])
foo (1, 2, 3)
bar (1, 2, 3)
[None, None]
A word of caution about using map for this: it won't work as written if you port your code to Python 3, since map became lazy there (it returns an iterator that does nothing until it is consumed).
You could also use list comprehensions pretty trivially (and that works on Python 3 as well):
[fn() for fn in self.pre]
[fn(val) for fn in self.mid]
etc.
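For instance, here is a sketch of the question's __call__ rewritten with comprehensions instead of map and lambda (it behaves the same on Python 2 and 3, since nothing relies on map's return value):
def __call__(self, values):
    [fn() for fn in self.pre]           # call the pre hooks
    for val in values:
        [fn(val) for fn in self.mid]    # call each mid hook with the value
    [fn() for fn in self.post]          # call the post hooks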

First of all, "I'm concerned that there is unnecessary overhead" is no way to optimise your code. Use a profiler to find the hotspots.
Secondly, your code could do with comments to let the reader know what is going on.
Finally, until proven otherwise, the following is a fine way to accomplish the task:
import itertools

for func in self.pre: func()

# apply every function in self.mid to every value in values
for func, val in itertools.product(self.mid, values):
    func(val)
If you wanted to capture the values, you could use a list comprehension; if you wanted to delay evaluation, you could use a generator expression.
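For instance, sketches of those two variants, reusing self.mid and values from the question:
import itertools

# capture the values eagerly
results = [func(val) for func, val in itertools.product(self.mid, values)]

# or delay evaluation until the results are actually iterated over
lazy_results = (func(val) for func, val in itertools.product(self.mid, values))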

>>> def chain(*fn):
...     return lambda *args, **kwargs: [f(*args, **kwargs) for f in fn]
...
>>> def add(x, y):
...     return x + y
...
>>> def multiply(x, y):
...     return x * y
...
>>> chained = chain(add, multiply)
>>> chained(2, 6)
[8, 12]

Related

Ways to define and use partially bound functions

The two ways I'm aware of to create a partially-bound function that can be called later are:
apply_twice = lambda f: lambda x: f(f(x))
square2x = apply_twice(lambda x: x*x)
square2x(2)
# 16
And
def apply_twice(f):
    def apply(x):
        return f(f(x))
    return apply
square_2x=apply_twice(lambda x: x*x)
square_2x(4)
# 256
Are there any other common ways to pass around or use partially-bound functions?
functools.partial can be used to partially apply an ordinary Python function. This is especially useful if you already have a regular function and want to apply only some of the arguments.
from functools import partial

def apply_twice(f, x):
    return f(f(x))

square2x = partial(apply_twice, lambda x: x*x)
print(square2x(4))
It's also important to remember that functions are only one type of callable in Python, and we're free to define callables ourselves as ordinary user-defined classes. So if you have some complex operation that you want to behave like a function, you can always write a class, which lets you document in more detail what it is and what the different parts mean.
class MyApplyTwice:
    def __init__(self, f):
        self.f = f

    def __call__(self, x):
        return self.f(self.f(x))
square2x = MyApplyTwice(lambda x: x*x)
print(square2x(4))
While overly verbose in this example, it can be helpful to write your function out as a class if it's going to be storing state long-term or might be doing confusing mutable things with its state. It's also useful to keep in mind for learning purposes, as it's a healthy reminder that closures and objects are two sides of the same coin. They're really the same thing, viewed in a different light.
You can also do this with functools.partial():
import functools

def apply_twice(f, x):
    return f(f(x))

square_2x = functools.partial(apply_twice, lambda x: x*x)
What you're describing isn't really "partial binding"; I'm assuming you mean partial application.
Partial application is when you create a function that does the same thing as another function by fixing some of its arguments, producing a function of smaller arity (the arity of a function is the number of arguments it takes).
So, for example,
def foo(a, b, c):
    return a + b + c
A partially applied version of foo would be something like:
def partial_foo(a, b):
    return foo(a, b, 42)
Or, with a lambda expression:
partial_foo = lambda a, b: foo(a, b, 42)
Note, however, that the above goes against the official style guidelines: per PEP 8, you shouldn't assign a lambda expression to a name; if you're going to do that, just use a full function definition.
The functools module has a helper for partial application:
import functools
partial_foo = functools.partial(foo, c=42)
Note, you may have heard about "currying", which sometimes gets confused with partial application. Currying is when you decompose an n-arity function into n 1-arity functions. So, more concretely, for foo:
curried_foo = lambda a: lambda b: lambda c: a + b + c
Or in long form:
def curried_foo(a):
    def _curr0(b):
        def _curr1(c):
            return a + b + c
        return _curr1
    return _curr0
And the important part, curried_foo(1)(2)(3) == foo(1, 2, 3)
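If you wanted to automate that decomposition, a rough sketch of a generic curry helper (a hypothetical utility, not part of the standard library) could be built on inspect.signature:
from inspect import signature

def curry(fn):
    """Collect arguments one call at a time until fn's arity is met."""
    arity = len(signature(fn).parameters)
    def take(*collected):
        if len(collected) >= arity:
            return fn(*collected)
        return lambda arg: take(*collected, arg)
    return take

curried_foo = curry(foo)
assert curried_foo(1)(2)(3) == foo(1, 2, 3)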

how to store functions inside an array and call one single function as array[index] [duplicate]

How can I bind arguments to a Python function so that I can call it later without arguments (or with fewer additional arguments)?
For example:
def add(x, y):
    return x + y
add_5 = magic_function(add, 5)
assert add_5(3) == 8
What is the magic_function I need here?
It often happens with frameworks and libraries that people accidentally call a function immediately when trying to give arguments to a callback: for example on_event(action(foo)). The solution is to bind foo as an argument to action, using one of the techniques described here. See for example How to pass arguments to a Button command in Tkinter? and Using a dictionary as a switch statement in Python.
Some APIs, however, allow you to pass the to-be-bound arguments separately, and will do the binding for you. Notably, the threading API in the standard library works this way. See thread starts running before calling Thread.start. If you are trying to set up your own API like this, see How can I write a simple callback function?.
Explicitly binding arguments is also a way to avoid problems caused by late binding when using closures. This is the problem where, for example, a lambda inside a for loop or list comprehension produces separate functions that compute the same result. See What do lambda function closures capture? and Creating functions (or lambdas) in a loop (or comprehension).
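To make the late-binding pitfall concrete, here is a small sketch (the names are illustrative only):
from functools import partial

# Every lambda closes over the same i, which is looked up at call time,
# so all three callbacks see the final value of i.
callbacks = [lambda: i for i in range(3)]
print([cb() for cb in callbacks])    # [2, 2, 2]

# Binding the current value explicitly, e.g. with functools.partial
# (or a default argument), gives the expected result.
callbacks = [partial(lambda n: n, i) for i in range(3)]
print([cb() for cb in callbacks])    # [0, 1, 2]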
functools.partial returns a callable wrapping a function with some or all of the arguments frozen.
import sys
import functools
print_hello = functools.partial(sys.stdout.write, "Hello world\n")
print_hello()
Hello world
The above usage is equivalent to the following lambda.
print_hello = lambda *a, **kw: sys.stdout.write("Hello world\n", *a, **kw)
Using functools.partial:
>>> from functools import partial
>>> def f(a, b):
... return a+b
...
>>> p = partial(f, 1, 2)
>>> p()
3
>>> p2 = partial(f, 1)
>>> p2(7)
8
If functools.partial is not available then it can be easily emulated:
>>> import sys
>>> make_printer = lambda s: lambda: sys.stdout.write("%s\n" % s)
>>> print_hello = make_printer("hello")
>>> print_hello()
hello
Or
def partial(func, *args, **kwargs):
    def f(*args_rest, **kwargs_rest):
        kw = kwargs.copy()
        kw.update(kwargs_rest)
        return func(*(args + args_rest), **kw)
    return f

def f(a, b):
    return a + b

p = partial(f, 1, 2)
print p()  # -> 3

p2 = partial(f, 1)
print p2(7)  # -> 8

d = dict(a=2, b=3)
p3 = partial(f, **d)
print p3(), p3(a=3), p3()  # -> 5 6 5
lambdas allow you to create a new unnamed function with fewer arguments and call the function:
>>> def foobar(x, y, z):
... print(f'{x}, {y}, {z}')
...
>>> foobar(1, 2, 3) # call normal function
1, 2, 3
>>> bind = lambda x: foobar(x, 10, 20) # bind 10 and 20 to foobar
>>> bind(1)
1, 10, 20
>>> bind = lambda: foobar(1, 2, 3) # bind all elements
>>> bind()
1, 2, 3
You can also use functools.partial. If you are planning to use named argument binding in the function call this is also applicable:
>>> from functools import partial
>>> barfoo = partial(foobar, x=10)
>>> barfoo(y=5, z=6)
10, 5, 6
Note that if you bind arguments from the left, you need to pass the remaining arguments by name; if you bind from the right, it works as expected.
>>> barfoo(5, 6)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: foobar() got multiple values for argument 'x'
>>> f = partial(foobar, z=20)
>>> f(1, 1)
1, 1, 20
This would work, too:
def curry(func, *args):
    def curried(*innerargs):
        return func(*(args + innerargs))
    curried.__name__ = "%s(%s, ...)" % (func.__name__, ", ".join(map(str, args)))
    return curried
>>> w=curry(sys.stdout.write, "Hey there")
>>> w()
Hey there
Functors can be defined this way in Python. They're callable objects. The "binding" merely sets argument values.
class SomeFunctor(object):
    def __init__(self, arg1, arg2=None):
        self.arg1 = arg1
        self.arg2 = arg2

    def __call__(self, arg1=None, arg2=None):
        a1 = arg1 or self.arg1
        a2 = arg2 or self.arg2
        # do something
        return
You can do things like
x = SomeFunctor(3.456)
x(arg2=123)
y = SomeFunctor(3.456, 123)
y()
The question asks generally about binding arguments, but all answers are about functions. In case you are wondering, partial also works with class constructors (i.e. using a class instead of a function as a first argument), which can be useful for factory classes. You can do it as follows:
from functools import partial
class Animal(object):
    def __init__(self, weight, num_legs):
        self.weight = weight
        self.num_legs = num_legs

animal_class = partial(Animal, weight=12)
snake = animal_class(num_legs=0)
print(snake.weight)  # prints 12

How to apply class decorator at base of all decorators on methods

I am using this approach to decorate all matching methods of a class:
import inspect

def decallmethods(decorator, prefix='test_'):
    def dectheclass(cls):
        for name, m in inspect.getmembers(cls, inspect.ismethod):
            if name.startswith(prefix):
                setattr(cls, name, decorator(m))
        return cls
    return dectheclass

@decallmethods(login_testuser)
class TestCase(object):
    def setUp(self):
        pass

    def test_1(self):
        print "test_1()"

    def test_2(self):
        print "test_2()"
This works, but if I have other decorators on a method, it gets applied on top of them.
I mean, right now the result is:
@login_testuser
@other
def test_2(self):
    print "test_2()"
But I want:
@other
@login_testuser
def test_2(self):
    print "test_2()"
This is most certainly a bad idea, but what you want can be done to some extent, and it is going to take a while to explain. First off, rather than thinking of decorators as syntactic sugar, think of them as what they really are: a function (that is, a closure) with another function that exists inside it. Now that this is out of the way, suppose we have a function:
def operation(a, b):
    print('doing operation')
    return a + b
Calling it simply does this:
>>> hi = operation('hello', 'world')
doing operation
>>> print(hi)
helloworld
Now define a decorator that prints something before and after calling its inner function (equivalent to the other decorator that you want to apply later):
def other(f):
    def other_inner(*a, **kw):
        print('other start')
        result = f(*a, **kw)
        print('other finish')
        return result
    return other_inner
With that, build a new function and decorate it:
@other
def o_operation(a, b):
    print('doing operation')
    return a + b
Remember, this is basically equivalent to o_operation = other(operation).
Run this to ensure it works:
>>> r2 = o_operation('some', 'inner')
other start
doing operation
other finish
>>> print(r2)
someinner
Finally, define the decorator that you want to sit immediately around operation (not around o_operation); with your existing code, it instead ends up outermost:
def inject(f):
    def injected(*a, **kw):
        print('inject start')
        result = f(*a, **kw)
        print('inject finish')
        return result
    return injected

@inject
@other
def i_o_operation(a, b):
    print('doing operation')
    return a + b
Run the above:
>>> i_o_operation('hello', 'foo')
inject start
other start
doing operation
other finish
inject finish
'hellofoo'
As mentioned, decorators are really closures, which is why the wrapped functions effectively live inside them. You can reach them through the __closure__ attribute:
>>> i_o_operation.__closure__
(<cell at 0x7fc0eabd1fd8: function object at 0x7fc0eabce7d0>,)
>>> i_o_operation.__closure__[0].cell_contents
<function other_inner at 0x7fc0eabce7d0>
>>> print(i_o_operation.__closure__[0].cell_contents('a', 'b'))
other start
doing operation
other finish
ab
See how this effectively calls the function inside the injected closure directly, as if it had been unwrapped. What if that closure could be replaced with one that did the injection? For our protection, __closure__ and cell.cell_contents are read-only, so what needs to be done is to construct completely new functions with the intended closures, using the FunctionType constructor (found in the types module).
Back to the problem. Since what we have now is:
i_o_operation = inject(other(operation))
And what we want is
o_i_operation = other(inject(operation))
We effectively have to strip the call to other out of i_o_operation, then wrap other back around the outside to produce o_i_operation. (Dragons follow after the break.)
First, construct a function that effectively calls inject(operation) by taking the closure one level deep (so that f will contain just the original operation) and mixing it with the code produced by inject(f):
i_operation = FunctionType(
    i_o_operation.__code__,
    globals=globals(),
    closure=i_o_operation.__closure__[0].cell_contents.__closure__,
)
Since i_o_operation is the result of inject(f), we can take its code to produce a new function. The globals argument is a required formality, and the closure is taken from the nested level; with that, the first part of the target is produced. Verify that other is not called:
>>> i_operation('test', 'strip')
inject start
doing operation
inject finish
'teststrip'
Neat. However, we still want other wrapped around the outside of this to finally produce o_i_operation. We need to somehow put this new function into a closure, and one way to do that is to create a surrogate function that produces one:
def closure(f):
    def surrogate(*a, **kw):
        return f(*a, **kw)
    return surrogate
And simply use it to construct and extract our closure:
o_i_operation = FunctionType(
    i_o_operation.__closure__[0].cell_contents.__code__,
    globals=globals(),
    closure=closure(i_operation).__closure__,
)
Call this:
>>> o_i_operation('job', 'complete')
other start
inject start
doing operation
inject finish
other finish
'jobcomplete'
Looks like we finally got what we need. While this doesn't answer your exact problem, it starts down the right track and is already pretty hairy.
Now for the actual problem: a function that will ensure a given decorator becomes the innermost (final) callable wrapped around the original, undecorated function. That is, for a given target and f(g(...(callable)...)), we want to emulate the result of f(g(...(target(callable))...)). This is the code:
from types import FunctionType

def strip_decorators(f):
    """
    Strip all decorators from f. Assumes each is a function whose
    closure's first cell holds the target function.
    """
    # list of the returned wrapper functions, not the decorators themselves
    decorators = []
    while f.__closure__:
        # Assume the first item is the target method
        decorators.append(f)
        f = f.__closure__[0].cell_contents
    return decorators, f

def inject_decorator(decorator, f):
    """
    Inject a decorator at the most inner function within the stack of
    closures in `f`.
    """
    def closure(f):
        def surrogate(*a, **kw):
            return f(*a, **kw)
        return surrogate

    decorators, target_f = strip_decorators(f)
    result = decorator(target_f)

    while decorators:
        # pop out the last one in
        decorator = decorators.pop()
        result = FunctionType(
            decorator.__code__,
            globals=globals(),
            closure=closure(result).__closure__,
        )

    return result
To test this, we use a typical use case: HTML tags.
def italics(f):
    def i(s):
        return '<i>' + f(s) + '</i>'
    return i

def bold(f):
    def b(s):
        return '<b>' + f(s) + '</b>'
    return b

def underline(f):
    def u(s):
        return '<u>' + f(s) + '</u>'
    return u

@italics
@bold
def hi(s):
    return s
Running the test.
>>> hi('hello')
'<i><b>hello</b></i>'
Our target is to inject the underline decorator (specifically the u(hi) callable) into the most inner closure. This can be done like so, with the function we have defined above:
>>> hi_u = inject_decorator(underline, hi)
>>> hi_u('hello')
'<i><b><u>hello</u></b></i>'
Works with undecorated functions:
>>> def pp(s):
... return s
...
>>> pp_b = inject_decorator(bold, pp)
>>> pp_b('hello')
'<b>hello</b>'
A major assumption was made for this first-cut version of the rewriter: every decorator in the chain has a closure of length one, that one element being the function being decorated. Take this decorator, for instance:
def prefix(p):
    def decorator(f):
        def inner(*args, **kwargs):
            new_args = [p + a for a in args]
            return f(*new_args, **kwargs)
        return inner
    return decorator
Example usage:
>>> @prefix('++')
... def prefix_hi(s):
...     return s
...
>>> prefix_hi('test')
'++test'
Now try to inject a bold decorator like so:
>>> prefix_hi_bold = inject_decorator(bold, prefix_hi)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 18, in inject_decorator
ValueError: inner requires closure of length 2, not 1
This is simply because the closure formed by decorator within prefix has two elements: one is the prefix string p and the other is the actual function, and inner, being nested inside, expects both to be present in its closure. Resolving that would require more code to analyse and reconstruct the details.
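One quick way to see this (assuming the prefix decorator and prefix_hi from above) is to look at what prefix_hi's closure actually holds; the output below is shown schematically:
print(prefix_hi.__code__.co_freevars)
# ('f', 'p')  -- inner needs two free variables, not one
for cell in prefix_hi.__closure__:
    print(cell.cell_contents)
# <function prefix_hi ...>   (the undecorated function)
# '++'                       (the prefix string)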
Anyway, this explanation took quite a few words, so I hope it makes sense and gets you started down the right track.
If you want to turn inject_decorator into a decorator, and/or mix it into your class decorator, best of luck; most of the hard work is already done.

Better Function Composition in Python

I work in Python. Recently, I discovered a wonderful little package called fn. I've been using it for function composition.
For example, instead of:
baz(bar(foo(x)))
with fn, you can write:
(F() >> foo >> bar >> baz)(x)
When I saw this, I immediately thought of Clojure:
(-> x foo bar baz)
But notice how, in Clojure, the input is on the left. I wonder if this possible in python/fn.
You can't replicate the exact syntax, but you can make something similar:
def f(*args):
    result = args[0]
    for func in args[1:]:
        result = func(result)
    return result
Seems to work:
>>> f('a test', reversed, sorted, ''.join)
' aestt'
You can't get that exact syntax, although you can get something like F(x)(foo, bar, baz). Here's a simple example:
class F(object):
    def __init__(self, arg):
        self.arg = arg

    def __call__(self, *funcs):
        arg = self.arg
        for f in funcs:
            arg = f(arg)
        return arg

def a(x):
    return x + 2

def b(x):
    return x**2

def c(x):
    return 3*x
>>> F(2)(a, b, c)
48
>>> F(2)(c, b, a)
38
This is a bit different from Blender's answer since it stores the argument, which can later be re-used with different functions.
This is sort of like the opposite of normal function application: instead of specifying the function up front and leaving some arguments to be specified later, you specify the argument and leave the function(s) to be specified later. It's an interesting toy but it's hard to think why you'd really want this.
If you want to use fn, with a little hack you can get a bit closer to Clojure syntax:
>>> def r(x): return lambda: x
>>> (F() >> r(x) >> foo >> bar >> baz)()
See how I added another function at the beginning of the composition chain that will just return x when called. The problem with this is that you still have to call your composed function, just without any arguments.
I think @Blender's answer is your best bet for emulating Clojure's threading macro in Python.
I came up with this
def _composition(arg, *funcs_and_args):
    """
    list(_composition(
        [1, 2, 3],
        (filter, lambda x: x % 2 == 1),
        (map, lambda x: x + 3),
    ))
    #=> [4, 6]
    """
    for func_and_args in funcs_and_args:
        func, *b = func_and_args
        arg = func(*b, arg)
    return arg
This seems to work for simple input. Not sure it is worth the effort for complex input, e.g., ((42, 'spam'), {'spam': 42}).
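For the record, a sketch of one way to handle that more general case, where each step carries its own positional and keyword arguments (a hypothetical variant, not part of the original answer):
def _composition_kw(arg, *steps):
    # each step is a (func, args, kwargs) triple; arg is threaded in as the
    # last positional argument, matching _composition above
    for func, args, kwargs in steps:
        arg = func(*args, arg, **kwargs)
    return arg

print(_composition_kw(
    [1, 2, 3],
    (filter, (lambda x: x % 2 == 1,), {}),
    (sorted, (), {'reverse': True}),
))
#=> [3, 1]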
def compose(function, *functions):
    return function if not functions else \
        lambda *args, **kwargs: function(compose(*functions)(*args, **kwargs))

def rcompose(*functions):
    return compose(*reversed(functions))

def postfix(arg, *functions):
    return rcompose(*functions)(arg)
Example:
>>> postfix(1, str, len, hex)
'0x1'
>>> postfix(1, hex, len)
3
My compose function, which returns a function:
def compose(*args):
    length = len(args)
    def _composeInner(lastResult, index):
        if (length - 1) < index:
            return lastResult
        return _composeInner(args[index](lastResult), index + 1)
    return lambda x: _composeInner(x, 0)
Usage:
fn = compose(
    lambda x: x * 2,
    lambda x: x + 2,
    lambda x: x + 1,
    lambda x: x / 3
)
result = fn(6)  # -> 5
I understand what you mean, and I agree it doesn't read well. In my opinion this Python library does it better:
>>> from math import sin
>>> from compositions.compositions import Compose
>>> foo = Compose(lambda x: x)
>>> bar = Compose(lambda x: x**2)
>>> baz = Compose(lambda x: sin(x))
>>> (baz*bar*foo)(x)

How to call same method for a list of objects?

Suppose code like this:
class Base:
    def start(self):
        pass

    def stop(self):
        pass

class A(Base):
    def start(self):
        ...  # do something for A

    def stop(self):
        ...  # do something for A

class B(Base):
    def start(self):
        ...  # do something for B

    def stop(self):
        ...  # do something for B

a1 = A(); a2 = A()
b1 = B(); b2 = B()
all = [a1, b1, b2, a2, .....]
Now I want to call the methods start and stop (and maybe others) on each object in the list all. Is there an elegant way to do this besides writing a bunch of functions like
def start_all(all):
    for item in all:
        item.start()
def stop_all(all):
This will work
all = [a1, b1, b2, a2,.....]
map(lambda x: x.start(), all)
A simple example:
all = ["MILK", "BREAD", "EGGS"]
map(lambda x: x.lower(), all)
# ['milk', 'bread', 'eggs']
And in Python 3:
all = ["MILK", "BREAD", "EGGS"]
list(map(lambda x: x.lower(), all))
# ['milk', 'bread', 'eggs']
It seems like there would be a more Pythonic way of doing this, but I haven't found it yet.
I use "map" sometimes if I'm calling the same function (not a method) on a bunch of objects:
map(do_something, a_list_of_objects)
This replaces a bunch of code that looks like this:
do_something(a)
do_something(b)
do_something(c)
...
But can also be achieved with a pedestrian "for" loop:
for obj in a_list_of_objects:
do_something(obj)
The downside is that a) you're creating a list as the return value from "map" that's just being thrown away, and b) it might be more confusing than the simple loop variant.
You could also use a list comprehension, but that's a bit abusive as well (once again, creating a throw-away list):
[ do_something(x) for x in a_list_of_objects ]
For methods, I suppose either of these would work (with the same reservations):
map(lambda x: x.method_call(), a_list_of_objects)
or
[ x.method_call() for x in a_list_of_objects ]
So, in reality, I think the pedestrian (yet effective) "for" loop is probably your best bet.
The approach
for item in all:
item.start()
is simple, easy, readable, and concise. This is the main approach Python provides for this operation. You can certainly encapsulate it in a function if that helps something. Defining a special function for this for general use is likely to be less clear than just writing out the for loop.
The *_all() functions are so simple that for a few methods I'd just write the functions. If you have lots of identical functions, you can write a generic function:
def apply_on_all(seq, method, *args, **kwargs):
    for obj in seq:
        getattr(obj, method)(*args, **kwargs)
Or create a function factory:
def create_all_applier(method, doc=None):
    def on_all(seq, *args, **kwargs):
        for obj in seq:
            getattr(obj, method)(*args, **kwargs)
    on_all.__doc__ = doc
    return on_all

start_all = create_all_applier('start', "Start all instances")
stop_all = create_all_applier('stop', "Stop all instances")
...
Maybe map, but since you don't want to make a list, you can write your own:
def call_for_all(f, seq):
    for i in seq:
        f(i)
then you can do:
call_for_all(lambda x: x.start(), all)
call_for_all(lambda x: x.stop(), all)
By the way, all is a built-in function; don't overwrite it ;-)
Starting in Python 2.6 there is an operator.methodcaller function.
So you can get something more elegant (and fast):
from operator import methodcaller
map(methodcaller('method_name'), list_of_objects)
Taking @Ants Aasma's answer one step further, you can create a wrapper that takes any method call and forwards it to all elements of a given list:
class AllOf:
    def __init__(self, elements):
        self.elements = elements

    def __getattr__(self, attr):
        def on_all(*args, **kwargs):
            for obj in self.elements:
                getattr(obj, attr)(*args, **kwargs)
        return on_all
That class can then be used like this:
class Foo:
    def __init__(self, val="quux!"):
        self.val = val

    def foo(self):
        print "foo: " + self.val

a = [Foo("foo"), Foo("bar"), Foo()]
AllOf(a).foo()
Which produces the following output:
foo: foo
foo: bar
foo: quux!
With some work and ingenuity it could probably be enhanced to handle attributes as well (returning a list of attribute values).
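For what it's worth, a rough sketch of that enhancement could collect the return values and fall back to plain attribute access when the attribute isn't callable:
class AllOf:
    def __init__(self, elements):
        self.elements = elements

    def __getattr__(self, attr):
        values = [getattr(obj, attr) for obj in self.elements]
        if all(callable(v) for v in values):
            def on_all(*args, **kwargs):
                # call the method on every element and collect the results
                return [v(*args, **kwargs) for v in values]
            return on_all
        # plain attributes: just return the list of values
        return values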
If you would like to have a generic function while avoiding referring to method name using strings, you can write something like that:
def apply_on_all(seq, method, *args, **kwargs):
    for obj in seq:
        getattr(obj, method.__name__)(*args, **kwargs)

# to call:
apply_on_all(all, A.start)
This is similar to other answers, but has the advantage of using only an explicit attribute lookup (i.e. A.start), which can eliminate refactoring errors: it's easy to rename the start method and forget to change the strings that refer to it.
The best solution, in my opinion, depends on whether you need the result of the method and whether your method takes any arguments except self.
If you don't need the result, I would simply write a for loop:
for instance in lst:
    instance.start()
If you need the result but the method takes no arguments, I would use map:
strs = ['A', 'B', 'C']
lower_strs = list(map(str.lower, strs)) # ['a', 'b', 'c']
And finally, if you need the result and method does take some arguments, list comprehension would work great:
strs = ['aq', 'bq', 'cq']
qx_strs = [i.replace('q', 'x') for i in strs] # ['ax', 'bx', 'cx']
