Given a function foo:
def foo(x):
    pass
Printing its representation by invoking str or repr gives you something boring like this:
>>> str(foo)
'<function foo at 0x119e0c8c8>'
I'd like to know if it is possible to override a function's __str__ method to print something else. Essentially, I'd like to do:
>>> str(foo)
"I'm foo!"
Now, I understand that the description of a function should come from __doc__ which is the function's docstring. However, this is merely an experiment.
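For reference, the docstring convention looks like this (a tiny illustration of my own, not part of the original question):

def foo(x):
    """Return x unchanged (an example docstring)."""
    return x

print(foo.__doc__)  # Return x unchanged (an example docstring)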
In attempting to figure out a solution to this problem, I came across implementing __str__ for classes: How to define a __str__ method for a class?
This approach involved defining a metaclass with an __str__ method, and then attempting to assign the __metaclass__ hook in the actual class.
I wondered whether the same could be done for the function class, so here's what I tried:
In [355]: foo.__class__
Out[355]: function

In [356]: class fancyfunction(type):
     ...:     def __str__(self):
     ...:         return self.__name__
     ...:

In [357]: foo.__class__.__metaclass__ = fancyfunction
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
I figured it wouldn't work, but it was worth a shot!
So, what's the best way to implement __str__ for a function?
A function in Python is just a callable object. Using def to define a function is one way to create such an object. But there is actually nothing stopping you from creating a callable type and creating an instance of it to get a function.
So the following two things are basically equal:
def foo ():
    print('hello world')

class FooFunction:
    def __call__ (self):
        print('hello world')

foo = FooFunction()
Except that the last one obviously allows us to set the function type’s special methods, like __str__ and __repr__.
class FooFunction:
    def __call__ (self):
        print('hello world')
    def __str__ (self):
        return 'Foo function'

foo = FooFunction()
print(foo)  # Foo function
But creating a type just for this becomes a bit tedious and it also makes it more difficult to understand what the function does: After all, the def syntax allows us to just define the function body. So we want to keep it that way!
Luckily, Python has this great feature called decorators which we can use here. We can create a function decorator that will wrap any function inside a custom type which calls a custom function for the __str__. That could look like this:
import functools

def with_str (str_func):
    def wrapper (f):
        class FuncType:
            def __call__ (self, *args, **kwargs):
                # call the original function
                return f(*args, **kwargs)
            def __str__ (self):
                # call the custom __str__ function
                return str_func()

        # decorate with functools.wraps to make the resulting function appear like f
        return functools.wraps(f)(FuncType())
    return wrapper
We can then use that to add a __str__ function to any function by simply decorating it. That would look like this:
def foo_str ():
    return 'This is the __str__ for the foo function'

@with_str(foo_str)
def foo ():
    print('hello world')

>>> str(foo)
'This is the __str__ for the foo function'
>>> foo()
hello world
Obviously, doing this has some limitations and drawbacks since you cannot exactly reproduce what def would do for a new function inside that decorator.
For example, using the inspect module to look at the arguments will not work properly: for the callable type it will include the self argument, and when using the generic decorator it will only be able to report the details of wrapper. However, there might be some solutions, for example those discussed in this question, that will allow you to restore some of that functionality.
But that usually means you are investing a lot of effort just to get a __str__ to work on a function object, which will probably be used very rarely. So you should think about whether you actually need a __str__ implementation for your functions, and what kind of operations you will perform on those functions.
If you find yourself wrapping functions, it's useful to look at functools.partial. It's primarily for binding arguments of course, but that's optional. It's also a class that wraps functions, removing the boilerplate of doing so from scratch.
from functools import partial

class foo(partial):
    def __str__(self):
        return "I'm foo!"

@foo
def foo():
    pass

assert foo() is None
assert str(foo) == "I'm foo!"
Related
I am trying to figure out how to get the names of all decorators on a method. I can already get the method name and docstring, but cannot figure out how to get a list of decorators.
I'm surprised that this question is so old and no one has taken the time to add the actual introspective way to do this, so here it is:
The code you want to inspect...
def template(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

baz = template
che = template

class Foo(object):
    @baz
    @che
    def bar(self):
        pass
Now you can inspect the above Foo class with something like this...
import ast
import inspect

def get_decorators(cls):
    target = cls
    decorators = {}

    def visit_FunctionDef(node):
        decorators[node.name] = []
        for n in node.decorator_list:
            name = ''
            if isinstance(n, ast.Call):
                name = n.func.attr if isinstance(n.func, ast.Attribute) else n.func.id
            else:
                name = n.attr if isinstance(n, ast.Attribute) else n.id

            decorators[node.name].append(name)

    node_iter = ast.NodeVisitor()
    node_iter.visit_FunctionDef = visit_FunctionDef
    node_iter.visit(ast.parse(inspect.getsource(target)))
    return decorators

print get_decorators(Foo)
That should print something like this...
{'bar': ['baz', 'che']}
or at least it did when I tested this with Python 2.7.9 real quick :)
If you can change the way you call the decorators from
class Foo(object):
    @many
    @decorators
    @here
    def bar(self):
        pass
to
class Foo(object):
    @register(many,decos,here)
    def bar(self):
        pass
then you could register the decorators this way:
def register(*decorators):
    def register_wrapper(func):
        for deco in decorators[::-1]:
            func = deco(func)
        func._decorators = decorators
        return func
    return register_wrapper
For example:
def many(f):
    def wrapper(*args, **kwds):
        return f(*args, **kwds)
    return wrapper

decos = here = many

class Foo(object):
    @register(many,decos,here)
    def bar(self):
        pass

foo = Foo()
Here we access the tuple of decorators:
print(foo.bar._decorators)
# (<function many at 0xb76d9d14>, <function decos at 0xb76d9d4c>, <function here at 0xb76d9d84>)
Here we print just the names of the decorators:
print([d.func_name for d in foo.bar._decorators])
# ['many', 'decos', 'here']
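Note that func_name is the Python 2 spelling; on Python 3 the equivalent attribute is __name__, so the same listing would presumably be written as:

print([d.__name__ for d in foo.bar._decorators])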
I had the same question. In my unit tests I just wanted to make sure decorators were used by given functions/methods.
The decorators were tested separately so I didn't need to test the common logic for each decorated function, just that the decorators were used.
I finally came up with the following helper function:
import inspect

def get_decorators(function):
    """Returns list of decorators names

    Args:
        function (Callable): decorated method/function

    Return:
        List of decorators as strings

    Example:
        Given:
            @my_decorator
            @another_decorator
            def decorated_function():
                pass

        >>> get_decorators(decorated_function)
        ['@my_decorator', '@another_decorator']

    """
    source = inspect.getsource(function)
    index = source.find("def ")
    return [
        line.strip().split()[0]
        for line in source[:index].strip().splitlines()
        if line.strip()[0] == "@"
    ]
With the list comprehension, it is a bit "dense" but it does the trick and in my case it's a test helper function.
It works if you are interested only in the decorator names, not potential decorator arguments. If you want to support decorators taking arguments, something like line.strip().split()[0].split("(")[0] could do the trick (untested).
Finally, you can remove the "@" if you'd like by replacing line.strip().split()[0] with line.strip().split()[0][1:].
As Faisal notes, you could have the decorators themselves attach metadata to the function, but to my knowledge it isn't automatically done.
That's because decorators are "syntactic sugar". Say you have the following decorator:
def MyDecorator(func):
    def transformed(*args):
        print "Calling func " + func.__name__
        func()
    return transformed
And you apply it to a function:
@MyDecorator
def thisFunction():
    print "Hello!"
This is equivalent to:
thisFunction = MyDecorator(thisFunction)
You could embed a "history" into the function object, perhaps, if you're in control of the decorators. I bet there's some other clever way to do this (perhaps by overriding assignment), but I'm not that well-versed in Python unfortunately. :(
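As a rough sketch of that "history" idea (my own illustration, not part of the original answer), the decorator could record itself on the function it returns so the information can be read back later:

def MyDecorator(func):
    def transformed(*args):
        print("Calling func " + func.__name__)
        return func(*args)
    # record the decoration on the wrapper so it can be inspected later
    transformed._decorator_history = getattr(func, "_decorator_history", []) + ["MyDecorator"]
    return transformed

@MyDecorator
def thisFunction():
    print("Hello!")

print(thisFunction._decorator_history)  # ['MyDecorator']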
That's not possible in my opinion. A decorator is not some kind of attribute or meta data of a method. A decorator is a convenient syntax for replacing a function with the result of a function call. See http://docs.python.org/whatsnew/2.4.html?highlight=decorators#pep-318-decorators-for-functions-and-methods for more details.
It is impossible to do in a general way, because
@foo
def bar ...
is exactly the same as
def bar ...
bar = foo (bar)
You may do it in certain special cases, like probably @staticmethod, by analyzing function objects, but not better than that.
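For instance, the @staticmethod case can be detected by looking at the raw object stored in the class dictionary, which still carries the wrapper type (a small sketch of my own):

class Foo(object):
    @staticmethod
    def bar():
        pass

print(isinstance(vars(Foo)["bar"], staticmethod))  # True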
You can't, and what's worse, there exist libraries that help hide the fact that you have decorated a function in the first place. See functools or the decorator library (@decorator, if I could find it) for more information.
I want to assign a static method to a class variable in Python, and below is what my code looks like.
class Klass:
    classVariable = None

    @staticmethod
    def method():
        print "method called"

Klass.classVariable = Klass.method

Klass.method()
Klass.classVariable()
This gave me an error at the last line,
TypeError: unbound method method() must be called with Klass instance as first argument (got nothing instead).
But when I change the static method to class method it works. Can anyone give me any idea of why this is the case?
Backstory (descriptor protocol)
First, we need to know a little about python descriptors...
For this answer, it should be enough to know the following:
1. Functions are descriptors.
2. Binding behavior of methods (i.e. how a method knows what self to pass) is implemented via the function's __get__ method and the built-in descriptor protocol.
3. When you put a descriptor foo on a class, accessing the descriptor actually calls the .__get__ method. (This is really just a generalization of statement 2.)
In other words:
class Foo(object):
    val = some_descriptor
When I do:
result = Foo.val
Python actually does:
Foo.val.__get__(None, Foo)
When I do:
f = Foo()
f.val
python does:
f = Foo()
type(f).val.__get__(f, type(f))
Now the good stuff.
It looks like (on Python 2.x) staticmethod is implemented such that its __get__ method returns a regular function. You can see this by printing the type of Klass.method:
print type(Klass.method) # <type 'function'>
So what we've learned is that the method returned by Klass.method.__get__ is just a regular function.
When you put that regular function onto a class, its __get__ method returns an instancemethod (which expects a self argument). This isn't surprising ... We do it all the time:
class Foo(object):
    def bar(self):
        print self
Is no different to python than:
def bar(self):
    print self

class Foo(object):
    pass

Foo.bar = bar
except that the first version is a lot easier to read and doesn't clutter your module namespace.
So now we've explained how your staticmethod turned into an instance method. Is there anything we can do about it?
Solution
When you put the method onto the class, designate it as a staticmethod again and it will work out Ok.
class Klass(object):  # inheriting from object is a good idea.
    classVariable = None

    @staticmethod
    def method():
        print("method called")

Klass.classVariable = staticmethod(Klass.method)  # Note extra staticmethod

Klass.method()
Klass.classVariable()
Appendix -- Re-implementation of @staticmethod
If you're a little bit curious how you might implement staticmethod to not have this problem -- here's an example:
class StaticMethod(object):
    def __init__(self, fn):
        self.fn = fn

    def __get__(self, inst, cls):
        return self

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)


class Klass(object):
    classVariable = None

    @StaticMethod
    def method():
        print("method called")


Klass.classVariable = Klass.method

Klass.method()
Klass.classVariable()
Klass().method()
Klass().classVariable()
The trick here is that my __get__ doesn't return a function. It returns itself. When you put it on a different class (or the same class), its __get__ will still just return itself. Since it is returning itself from __get__, it needs to pretend to be a function (so it can be called after being "gotten"), so I implement a custom __call__ method to do the right thing (pass through to the delegate function and return the result).
Please note, I'm not advocating that you use this StaticMethod instead of staticmethod. It'll be less efficient and not as introspectible (and probably confusing for your code readers). This is only for educational purposes.
Introduction
I have a Python class, which contains a number of methods. I want one of those methods to have a static counterpart—that is, a static method with the same name—which can handle more arguments. After some searching, I have found that I can use the @staticmethod decorator to create a static method.
Problem
For convenience, I have created a reduced test case which reproduces the issue:
class myclass:
    @staticmethod
    def foo():
        return 'static method'

    def foo(self):
        return 'public method'

obj = myclass()

print(obj.foo())
print(myclass.foo())
I expect that the code above will print the following:
public method
static method
However, the code prints the following:
public method
Traceback (most recent call last):
  File "sandbox.py", line 14, in <module>
    print(myclass.foo())
TypeError: foo() missing 1 required positional argument: 'self'
From this, I can only assume that calling myclass.foo() tries to call its non-static counterpart with no arguments (which won't work because non-static methods always accept the argument self). This behavior baffles me, because I expect any call to the static method to actually call the static method.
I've tested the issue in both Python 2.7 and 3.3, only to receive the same error.
Questions
Why does this happen, and what can I do to fix my code so it prints:
public method
static method
as I would expect?
While it's not strictly possible to do, as rightly pointed out, you could always "fake" it by redefining the method on instantiation, like this:
class YourClass(object):
    def __init__(self):
        self.foo = self._instance_foo

    @staticmethod
    def foo():
        print "Static!"

    def _instance_foo(self):
        print "Instance!"
which would produce the desired result:
>>> YourClass.foo()
Static!
>>> your_instance = YourClass()
>>> your_instance.foo()
Instance!
A similar question is here: override methods with same name in python programming
Functions are looked up by name, so you are just redefining foo with an instance method. There is no such thing as an overloaded function in Python. You either write a new function with a separate name, or you provide the arguments in such a way that a single function can handle the logic for both cases.
In other words, you can't have a static version and an instance version of the same name. If you look at its vars you'll see one foo.
In [1]: class Test:
   ...:     @staticmethod
   ...:     def foo():
   ...:         print 'static'
   ...:     def foo(self):
   ...:         print 'instance'
   ...:

In [2]: t = Test()

In [3]: t.foo()
instance

In [6]: vars(Test)
Out[6]: {'__doc__': None, '__module__': '__main__', 'foo': <function __main__.foo>}
Because attribute lookup in Python is something within the programmer's control, this sort of thing is technically possible. If you put any value into writing code in a "pythonic" way (using the preferred conventions and idioms of the python community), it is very likely the wrong way to frame a problem / design. But if you know how descriptors can allow you to control attribute lookup, and how functions become bound functions (hint: functions are descriptors), you can accomplish code that is roughly what you want.
For a given name, there is only one object that will be looked up on a class, regardless of whether you are looking the name up on an instance of the class, or the class itself. Thus, the thing that you're looking up has to deal with the two cases, and dispatch appropriately.
(Note: this isn't exactly true; if an instance has a name in its attribute namespace that collides with one in the namespace of its class, the value on the instance will win in some circumstances. But even in those circumstances, it won't become a "bound method" in the way that you probably would wish it to.)
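To illustrate that parenthetical note (the names here are my own), a plain function stored on an instance shadows the class attribute but is not turned into a bound method:

class C(object):
    def greet(self):
        return "class method"

def plain_greet(self):
    return "instance function"

c = C()
c.greet = plain_greet   # instance attribute shadows the class attribute
print(c.greet(c))       # 'instance function' -- we must pass the instance ourselves
print(C().greet())      # 'class method' -- normal bound-method behaviour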
I don't recommend designing your program using a technique such as this, but the following will do roughly what you asked. Understanding how this works requires a relatively deep understanding of python as a language.
class StaticOrInstanceDescriptor(object):
    def __get__(self, cls, inst):
        if cls is None:
            return self.instance.__get__(self)
        else:
            return self.static

    def __init__(self, static):
        self.static = static

    def instance(self, instance):
        self.instance = instance
        return self


class MyClass(object):
    @StaticOrInstanceDescriptor
    def foo():
        return 'static method'

    @foo.instance
    def foo(self):
        return 'public method'


obj = MyClass()
print(obj.foo())
print(MyClass.foo())
which does print out:
% python /tmp/sandbox.py
static method
public method
Ended up here from google so thought I would post my solution to this "problem"...
class Test():
    def test_method(self=None):
        if self is None:
            print("static bit")
        else:
            print("instance bit")
This way you can use the method like a static method or like an instance method.
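For example, with the definition above:

>>> Test.test_method()
static bit
>>> Test().test_method()
instance bit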
When you try to call MyClass.foo(), Python will complain since you did not pass the one required self argument. @coderpatros's answer has the right idea, where we provide a default value for self, so it's no longer required. However, that won't work if there are additional arguments besides self. Here's a function that can handle almost all types of method signatures:
import inspect
from functools import wraps

def class_overload(cls, methods):
    """ Add classmethod overloads to one or more instance methods """
    for name in methods:
        func = getattr(cls, name)

        # required positional arguments
        pos_args = 1   # start at one, as we assume "self" is positional_only
        kwd_args = []  # [name:str, ...]
        sig = iter(inspect.signature(func).parameters.values())
        next(sig)
        for s in sig:
            if s.default is s.empty:
                if s.kind == s.POSITIONAL_ONLY:
                    pos_args += 1
                    continue
                elif s.kind == s.POSITIONAL_OR_KEYWORD:
                    kwd_args.append(s.name)
                    continue
            break

        @wraps(func)
        def overloaded(*args, **kwargs):
            # most common case: missing positional arg or 1st arg is not a cls instance
            isclass = len(args) < pos_args or not isinstance(args[0], cls)
            # handle ambiguous signatures, func(self, arg:cls, *args, **kwargs);
            # check if missing required positional_or_keyword arg
            if not isclass:
                for i in range(len(args) - pos_args, len(kwd_args)):
                    if kwd_args[i] not in kwargs:
                        isclass = True
                        break
            # class method
            if isclass:
                return func(cls, *args, **kwargs)
            # instance method
            return func(*args, **kwargs)

        setattr(cls, name, overloaded)

class Foo:
    def foo(self, *args, **kwargs):
        isclass = self is Foo
        print("foo {} method called".format(["instance", "class"][isclass]))

class_overload(Foo, ["foo"])

Foo.foo()    # "foo class method called"
Foo().foo()  # "foo instance method called"
You can use the isclass bool to implement the different logic for class vs instance method.
The class_overload function is a bit beefy and will need to inspect the signature when the class is declared. But the actual logic in the runtime decorator (overloaded) should be quite fast.
There's one signature that this solution won't work for: a method with an optional, first, positional argument of type Foo. It's impossible to tell if we are calling the static or instance method just by the signature in this case. For example:
def bad_foo(self, other:Foo=None):
    ...

bad_foo(f)  # f.bad_foo(None) or Foo.bad_foo(f) ???
Note, this solution may also report an incorrect isclass value if you pass in incorrect arguments to the method (a programmer error, so may not be important to you).
We can get a possibly more robust solution by doing the reverse of this: first start with a classmethod, and then create an instance method overload of it. This is essentially the same idea as @Dologan's answer, though I think mine is a little less boilerplatey if you need to do this on several methods:
from types import MethodType

def instance_overload(self, methods):
    """ Adds instance overloads for one or more classmethods """
    for name in methods:
        setattr(self, name, MethodType(getattr(self, name).__func__, self))

class Foo:
    def __init__(self):
        instance_overload(self, ["foo"])

    @classmethod
    def foo(self, *args, **kwargs):
        isclass = self is Foo
        print("foo {} method called:".format(["instance", "class"][isclass]))

Foo.foo()    # "foo class method called"
Foo().foo()  # "foo instance method called"
Not counting the code for class_overload or instance_overload, the code is equally succinct. Often signature introspection is touted as the "pythonic" way to do these kinds of things, but I think I'd recommend the instance_overload solution instead: isclass will be correct for any method signature, including cases where you call with incorrect arguments (a programmer error).
I've only seen examples for setting the __repr__ method in class definitions. Is it possible to change the __repr__ for functions either in their definitions or after defining them?
I've attempted without success...
>>> def f():
...     pass
>>> f
<function f at 0x1026730c8>
>>> f.__repr__ = lambda: '<New repr>'
>>> f
<function __main__.f>
Yes, if you're willing to forgo the function actually being a function.
First, define a class for our new type:
import functools

class reprwrapper(object):
    def __init__(self, repr, func):
        self._repr = repr
        self._func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kw):
        return self._func(*args, **kw)

    def __repr__(self):
        return self._repr(self._func)
Add in a decorator function:
def withrepr(reprfun):
    def _wrap(func):
        return reprwrapper(reprfun, func)
    return _wrap
And now we can define the repr along with the function:
@withrepr(lambda x: "<Func: %s>" % x.__name__)
def mul42(y):
    return y*42
Now repr(mul42) produces '<Func: mul42>'
No, because repr(f) is done as type(f).__repr__(f) instead.
In order to do that, you'd need to change the __repr__ function for the given class, which in this case is the built-in function class (types.FunctionType). Since in Python you cannot edit built-in classes, only subclass them, you cannot.
However, there are two approaches you could follow:
1. Wrap some functions as kwatford suggested.
2. Create your own representation protocol with your own repr function. For example, you could define a myrepr function that looks for a __myrepr__ method first (which you cannot add to the function class, but which you can add to individual function objects as you suggest, as well as to your custom classes and objects) and that falls back to repr if __myrepr__ is not found. A possible implementation for this would be:
def myrepr(x):
    try:
        x.__myrepr__
    except AttributeError:
        return repr(x)
    else:
        return x.__myrepr__()
Then you could define __myrepr__ methods and use the myrepr function. Alternatively, you could also do __builtins__.repr = myrepr to make your function the default repr and keep using repr. This approach would end up doing exactly what you want, though editing __builtins__ may not always be desirable.
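A minimal sketch of that last idea, using the Python 3 builtins module (note that this only affects code that looks up the name repr in builtins; implicit conversions such as %r formatting still go through type(x).__repr__):

import builtins

_builtin_repr = builtins.repr

def myrepr(x):
    try:
        f = x.__myrepr__
    except AttributeError:
        return _builtin_repr(x)
    return f()

builtins.repr = myrepr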
This appears to be difficult. Kwatford's approach only solves this problem partially, since it does not work for functions in classes, because self would be treated like a positional argument, as explained in Decorating Python class methods - how do I pass the instance to the decorator? However, the solution for that question is not applicable to this case, unfortunately, as using __get__() and functools.partial would override the custom __repr__().
I need to decorate an object's method. It needs to happen at runtime because the decorators applied to the object depend on the arguments that the user gave when calling the program (arguments supplied with argv), so the same object could be decorated 3 times, 2 times, or not at all.
Here is some context. The program is a puzzle solver; the main behavior is to find a solution for the puzzle automatically, by which I mean without user intervention. And here is where the decoration comes into play: one of the things I want to do is draw a graph of what happened during the execution, but I want to do so only when the flag --draw-graph is used.
Here is what I've tried:
class GraphDecorator(object):
    def __init__(self, wrappee):
        self.wrappee = wrappee

    def method(self):
        # do my stuff here
        self.wrappee.method()
        # do more of stuff here

    def __getattr__(self, attr):
        return getattr(self.wrappee, attr)
And why it did NOT work:
It did not work because of the way I built the application: when a method that did not exist in my decorator class was called, it fell back to the implementation of the decorated class. The problem is that the application always starts by invoking the method run, which did not need to be decorated, so the undecorated fallback was used, and from inside that undecorated method only undecorated methods were ever called. What I needed was to replace the method on the object, not to proxy it:
# method responsible for replacing the undecorated form with the decorated one
def graphDecorator(obj):
    old_method = obj.method

    def method(self):
        # do my stuff here
        old_method()
        # do more of my stuff

    setattr(obj, 'method', method)  # replace with the decorated form
And here is my problem: the decorated form does not receive self when it is called, resulting in a TypeError because of the wrong number of arguments.
The problem was that I couldn't use func(self) as a method. The reason is that setattr() does not bind the function to the instance, so the function acts like a static method rather than a bound method. Thanks to the introspective nature of Python, I was able to come up with this solution:
def decorator(obj):
    old_func = obj.func  # can't call 'by name' because of recursion

    def decorated_func(self):
        # do my stuff here
        old_func()  # does not need to pass obj
        # do some other stuff here

    # here is the magic, this gets the type of a 'normal method' of a class
    method = type(obj.func)

    # this binds the method to the object, so self is passed by default
    obj.func = method(decorated_func, obj)
I think this is the best way to decorate an object's method at runtime, though it would be nice to find a way to call method() directly, without the line method = type(obj.func).
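One way to avoid that line (a sketch of my own, assuming types.MethodType is acceptable here) is to use types.MethodType directly, which performs the same binding:

from types import MethodType

class Thing(object):
    def func(self):
        print("original func")

def decorator(obj):
    old_func = obj.func          # already bound to obj
    def decorated_func(self):
        # do my stuff here
        old_func()               # no need to pass obj
        # do some other stuff here
    obj.func = MethodType(decorated_func, obj)  # bind explicitly

t = Thing()
decorator(t)
t.func()  # runs the decorated version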
You might want to use __getattribute__ instead of __getattr__ (the latter being only called if "standard" lookup fails):
class GraphDecorator(object):
    def __init__(self, wrappee):
        self.__wrappee = wrappee

    def method(self):
        # do my stuff here
        self.__wrappee.method()
        # do more of stuff here

    def __getattribute__(self, name):
        try:
            wrappee = object.__getattribute__(self, "_GraphDecorator__wrappee")
            return getattr(wrappee, name)
        except AttributeError:
            return object.__getattribute__(self, name)
I need to decorate an object's method. It needs to happen at runtime because the decorators applied to the object depend on the arguments that the user gave when calling the program (arguments supplied with argv), so the same object could be decorated 3 times, 2 times, or not at all.
The above is unfortunately incorrect, and what you are trying to do is unnecessary.
You can do this at runtime like so. Example:
import sys
args = sys.argv[1:]

class MyClass(object):
    pass

if args[0] == '--decorateWithFoo':
    MyClass = decoratorFoo(MyClass)
if args[1] == '--decorateWithBar':
    MyClass = decoratorBar(MyClass)
The syntax:
@deco
define something
Is the same thing as:
define something
something = deco(something)
You could also make a decorator factory @makeDecorator(command_line_arguments).
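A sketch of what such a factory could look like (the makeDecorator name and the placeholder decorators are my own illustration, not from the original answer):

import sys

def decoratorFoo(cls):   # placeholder decorators for the sketch
    cls.foo_enabled = True
    return cls

def decoratorBar(cls):
    cls.bar_enabled = True
    return cls

def makeDecorator(args):
    def decorator(cls):
        if '--decorateWithFoo' in args:
            cls = decoratorFoo(cls)
        if '--decorateWithBar' in args:
            cls = decoratorBar(cls)
        return cls
    return decorator

@makeDecorator(sys.argv[1:])
class MyClass(object):
    pass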
"It needs to be at runtime because the decorators applied on the object depends on the arguments that the user gave when calling the program"
The don't use decorators. Decorators are only syntactical support for wrappers, you can just as well use normal function/method calls instead.
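For example (a hedged sketch; the Solver class, the wrapper, and the --draw-graph handling are illustrative assumptions based on the question, not the actual program):

import sys

class Solver(object):
    def run(self):
        print("solving...")

def graph_wrapper(method):
    def wrapped(*args, **kwargs):
        print("recording graph step")   # stand-in for the graph-drawing logic
        return method(*args, **kwargs)
    return wrapped

solver = Solver()
if '--draw-graph' in sys.argv:
    solver.run = graph_wrapper(solver.run)  # a plain call, no @ syntax needed
solver.run()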