I can access a Python function's attributes inside the function itself with code like the following:
def aa():
    print aa.__name__
    print aa.__hash__
    # other similar attributes
However, if the above aa() function is a template for writing other functions, say bb(), I have to write:
def bb():
    print bb.__name__
    print bb.__hash__
    # other similar attributes
Is there a "pointer" similar to the self argument in a class method so I could write code like this?
def whatever():
    print self.__name__
    print self.__hash__
    # other similar attributes
I searched and found a suggestion to use a class to solve this problem, but it would be troublesome to redefine all the existing functions. Any suggestions?
There is no generic way for a function to refer to itself. Consider using a decorator instead. If all you want, as you indicated, is to print information about the function, that can be done easily with a decorator:
from functools import wraps

def showinfo(f):
    @wraps(f)
    def wrapper(*args, **kwds):
        print(f.__name__, f.__hash__)
        return f(*args, **kwds)
    return wrapper

@showinfo
def aa():
    pass
If you really do need to reference the function, then just add it to the function arguments:
def withself(f):
    @wraps(f)
    def wrapper(*args, **kwds):
        return f(f, *args, **kwds)
    return wrapper

@withself
def aa(self):
    print(self.__name__)
    # etc.
Edit to add alternate decorator:
You can also write a simpler (and probably faster) decorator that will make the wrapped function work correctly with Python's introspection:
def bind(f):
    """Decorate function `f` to pass a reference to the function
    as the first argument"""
    return f.__get__(f, type(f))

@bind
def foo(self, x):
    "This is a bound function!"
    print(self, x)
>>> foo(42)
<function foo at 0x02A46030> 42
>>> help(foo)
Help on method foo in module __main__:
foo(self, x) method of builtins.function instance
This is a bound function!
This leverages Python's descriptor protocol: functions have a __get__ method that is used to create bound methods. The decorator simply uses the existing method to make the function a bound method of itself. It will only work for standalone functions; if you wanted a method to be able to reference itself, you would have to do something more like the original solution.
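For anyone curious how that works under the hood, here is a minimal sketch (my own illustration, with made-up names) of what __get__ does when a function is bound to itself:

def greet():
    pass

# Functions are descriptors: calling __get__ manually returns a bound method,
# here bound to the function object itself.
bound = greet.__get__(greet, type(greet))
print(bound.__func__ is greet)  # True, the underlying function
print(bound.__self__ is greet)  # True, the "instance" it is bound to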
http://docs.python.org/library/inspect.html looks promising:
import inspect

def foo():
    felf = globals()[inspect.getframeinfo(inspect.currentframe()).function]
    print felf.__name__, felf.__doc__
you can also use the sys module to get the name of the current function:
import sys

def bar():
    felf = globals()[sys._getframe().f_code.co_name]
    print felf.__name__, felf.__doc__
You can at least say self = bb in the first line, and then you only need to change that line when you change the function name, instead of every other reference.
My code editor highlights the variable self the same way it does for classes, too.
How about a quick hack to make your own "self" name, like this:
>>> def f():
...     self = f
...     print "My name is ", self.__name__, "and I am", self.__hash__
...
>>> f()
My name is f and I am <method-wrapper '__hash__' of function object at 0x00B50F30>
>>> x = f
>>> x()
My name is f and I am <method-wrapper '__hash__' of function object at 0x00B50F30>
>>>
__repr__ is used to return a string representation of an object, but in Python a function is also an object itself, and can have attributes.
How do I set the __repr__ of a function?
I see here that an attribute can be set for a function outside the function, but typically one sets a __repr__ within the object definition itself, so I'd like to set the repr within the function definition itself.
My use case is that I am using tenacity to retry a networking function with exponential backoff, and I want to log the (informative) name of the function I have called last.
retry_mysql_exception_types = (InterfaceError, OperationalError, TimeoutError, ConnectionResetError)

def return_last_retry_outcome(retry_state):
    """return the result of the last call attempt"""
    return retry_state.outcome.result()

def my_before_sleep(retry_state):
    print("Retrying {}: attempt {} ended with: {}\n".format(retry_state.fn, retry_state.attempt_number, retry_state.outcome))

@tenacity.retry(wait=tenacity.wait_random_exponential(multiplier=1, max=1200),
                stop=tenacity.stop_after_attempt(30),
                retry=tenacity.retry_if_exception_type(retry_mysql_exception_types),
                retry_error_callback=return_last_retry_outcome,
                before_sleep=my_before_sleep)
def connect_with_retries(my_database_config):
    connection = mysql.connector.connect(**my_database_config)
    return connection
Currently retry_state.fn displays something like <function <lambda> at 0x1100f6ee0>, as @chepner says, but I'd like to add more information to it.
You could use a decorator that replaces the function with an instance of a class defining __call__ and __repr__:
class CustomReprFunc:
    def __init__(self, f, custom_repr):
        self.f = f
        self.custom_repr = custom_repr

    def __call__(self, *args, **kwargs):
        return self.f(*args, **kwargs)

    def __repr__(self):
        return self.custom_repr(self.f)

def set_repr(custom_repr):
    def set_repr_decorator(f):
        return CustomReprFunc(f, custom_repr)
    return set_repr_decorator

@set_repr(lambda f: f.__name__)
def func(a):
    return a

print(repr(func))
I think a custom decorator could help:
import functools

class reprable:
    """Decorates a function with a repr method.

    Example:
        >>> @reprable
        ... def foo():
        ...     '''Does something cool.'''
        ...     return 4
        ...
        >>> foo()
        4
        >>> foo.__name__
        'foo'
        >>> foo.__doc__
        'Does something cool.'
        >>> repr(foo)
        'foo: Does something cool.'
        >>> type(foo)
        <class '__main__.reprable'>
    """
    def __init__(self, wrapped):
        self._wrapped = wrapped
        functools.update_wrapper(self, wrapped)

    def __call__(self, *args, **kwargs):
        return self._wrapped(*args, **kwargs)

    def __repr__(self):
        return f'{self._wrapped.__name__}: {self._wrapped.__doc__}'
Demo: http://tpcg.io/uTbSDepz.
It's already set.
>>> repr(lambda x:x)
'<function <lambda> at 0x1100f6ee0>'
The problem is that the function type is immutable, so you can't just assign a new function to function.__repr__, and you also can't create a subtype of function in order to override __repr__. (Not that creating instances of the subclass would be easy, even if it were possible to define it.)
You can't do this for actual functions; the function type is immutable, and already defines a __repr__, and __repr__ is looked up on the type, not the instance, so changing __repr__ on a given function doesn't change behavior.
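A quick sketch showing both dead ends (my own illustration; the exact error message varies by Python version):

import types

def f():
    pass

# Assigning to the instance is allowed but ignored, because special methods
# like __repr__ are looked up on the type, not the instance:
f.__repr__ = lambda: "my repr"
print(repr(f))  # still the default <function f at 0x...>

# And the function type itself refuses modification:
try:
    types.FunctionType.__repr__ = lambda self: "my repr"
except TypeError as e:
    print(e)  # the built-in function type is immutable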
While probably not useful in this case, you can make your own callable class (analogous to C++ functors), and those can define their own __repr__. For example:
class myfunction:
    @staticmethod  # Avoids need to receive unused self
    def __call__(your, args, here):
        ...  # do stuff and return as if it were a function

    @classmethod  # Know about class, but again, instance is useless
    def __repr__(cls):
        return f'{cls.__name__}(a, b, c)'
which you could convert to a singleton instance of the class (making it equivalent to a plain function in how you use it) at the end by just doing:
myfunction = myfunction()
to replace the class with a single instance of the class.
Note: In real code, I'd almost certainly just change where I'm printing it to print in a more useful way without modifying the function. This doesn't have much overhead over a plain function or a wrapped plain function (since we put the function itself in __call__ rather than wrapping, making it faster, but requiring a separate class for each "friendly repr function"), but it's just not the job of the function to decide how to represent itself in a human-friendly way; that's your job, based on the situation.
You can change retry_state.fn to retry_state.fn.__name__. I use many decorators like this. If you add a decorator, it will be called each time the function of interest is called.
import functools

def display_function(func):
    """ This decorator prints before and after running """
    @functools.wraps(func)
    def function_wrapper(*args, **kwargs):
        print(f'\nNow: Calling {func.__name__}.')
        entity = func(*args, **kwargs)
        print(f'Done: Calling {func.__name__}.\n')
        return entity
    return function_wrapper
Additionally, the retrying module in python allows you to do some of what you're doing by default. I often use a decorator:
import retrying

@retrying.retry(wait_exponential_multiplier=1000, wait_exponential_max=10000)
I am trying to figure out how to get the names of all decorators on a method. I can already get the method name and docstring, but cannot figure out how to get a list of decorators.
I'm surprised that this question is so old and no one has taken the time to add the actual introspective way to do this, so here it is:
The code you want to inspect...
def template(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

baz = template
che = template

class Foo(object):
    @baz
    @che
    def bar(self):
        pass
Now you can inspect the above Foo class with something like this...
import ast
import inspect

def get_decorators(cls):
    target = cls
    decorators = {}

    def visit_FunctionDef(node):
        decorators[node.name] = []
        for n in node.decorator_list:
            name = ''
            if isinstance(n, ast.Call):
                name = n.func.attr if isinstance(n.func, ast.Attribute) else n.func.id
            else:
                name = n.attr if isinstance(n, ast.Attribute) else n.id
            decorators[node.name].append(name)

    node_iter = ast.NodeVisitor()
    node_iter.visit_FunctionDef = visit_FunctionDef
    node_iter.visit(ast.parse(inspect.getsource(target)))
    return decorators

print get_decorators(Foo)
That should print something like this...
{'bar': ['baz', 'che']}
or at least it did when I tested this with Python 2.7.9 real quick :)
If you can change the way you call the decorators from
class Foo(object):
    @many
    @decorators
    @here
    def bar(self):
        pass
to
class Foo(object):
    @register(many, decos, here)
    def bar(self):
        pass
then you could register the decorators this way:
def register(*decorators):
    def register_wrapper(func):
        for deco in decorators[::-1]:
            func = deco(func)
        func._decorators = decorators
        return func
    return register_wrapper
For example:
def many(f):
    def wrapper(*args, **kwds):
        return f(*args, **kwds)
    return wrapper

decos = here = many

class Foo(object):
    @register(many, decos, here)
    def bar(self):
        pass

foo = Foo()
Here we access the tuple of decorators:
print(foo.bar._decorators)
# (<function many at 0xb76d9d14>, <function decos at 0xb76d9d4c>, <function here at 0xb76d9d84>)
Here we print just the names of the decorators:
print([d.func_name for d in foo.bar._decorators])
# ['many', 'decos', 'here']
I had the same question. In my unit tests I just wanted to make sure decorators were used by given functions/methods.
The decorators were tested separately so I didn't need to test the common logic for each decorated function, just that the decorators were used.
I finally came up with the following helper function:
import inspect

def get_decorators(function):
    """Returns list of decorator names

    Args:
        function (Callable): decorated method/function

    Return:
        List of decorators as strings

    Example:
        Given:

        @my_decorator
        @another_decorator
        def decorated_function():
            pass

        >>> get_decorators(decorated_function)
        ['@my_decorator', '@another_decorator']

    """
    source = inspect.getsource(function)
    index = source.find("def ")
    return [
        line.strip().split()[0]
        for line in source[:index].strip().splitlines()
        if line.strip()[0] == "@"
    ]
With the list comprehension, it is a bit "dense" but it does the trick and in my case it's a test helper function.
It works if you are interested only in the decorator names, not potential decorator arguments. If you want to support decorators taking arguments, something like line.strip().split()[0].split("(")[0] could do the trick (untested).
Finally, you can remove the "@" if you'd like by replacing line.strip().split()[0] with line.strip().split()[0][1:].
As Faisal notes, you could have the decorators themselves attach metadata to the function, but to my knowledge it isn't automatically done.
That's because decorators are "syntactic sugar". Say you have the following decorator:
def MyDecorator(func):
    def transformed(*args):
        print "Calling func " + func.__name__
        func()
    return transformed
And you apply it to a function:
@MyDecorator
def thisFunction():
    print "Hello!"
This is equivalent to:
thisFunction = MyDecorator(thisFunction)
You could embed a "history" into the function object, perhaps, if you're in control of the decorators. I bet there's some other clever way to do this (perhaps by overriding assignment), but I'm not that well-versed in Python unfortunately. :(
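For example, here is a sketch of what "embedding a history" could look like if you control the decorators (hypothetical names, not from the answer above):

import functools

def recording_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    # Append ourselves to a running history kept on the wrapped function
    wrapper._decorators = getattr(func, "_decorators", []) + [recording_decorator]
    return wrapper

@recording_decorator
def thisFunction():
    print("Hello!")

print([d.__name__ for d in thisFunction._decorators])  # ['recording_decorator']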
That's not possible, in my opinion. A decorator is not some kind of attribute or metadata of a method. A decorator is a convenient syntax for replacing a function with the result of a function call. See http://docs.python.org/whatsnew/2.4.html?highlight=decorators#pep-318-decorators-for-functions-and-methods for more details.
It is impossible to do in a general way, because
@foo
def bar(): ...
is exactly the same as
def bar(): ...
bar = foo(bar)
You may do it in certain special cases, like probably @staticmethod, by analyzing function objects, but not better than that.
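For instance, a small sketch of the @staticmethod special case (my own illustration, not part of the original answer):

class Foo(object):
    @staticmethod
    def bar():
        pass

# Looking in the class __dict__ bypasses the descriptor machinery,
# so the staticmethod wrapper object is still visible:
print(isinstance(Foo.__dict__["bar"], staticmethod))  # True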
You can't, and even worse, there exist libraries that help hide the fact that you have decorated a function to begin with. See functools or the decorator library (@decorator, if I could find it) for more information.
Introduction
I have a Python class, which contains a number of methods. I want one of those methods to have a static counterpart—that is, a static method with the same name—which can handle more arguments. After some searching, I have found that I can use the #staticmethod decorator to create a static method.
Problem
For convenience, I have created a reduced test case which reproduces the issue:
class myclass:
    @staticmethod
    def foo():
        return 'static method'

    def foo(self):
        return 'public method'

obj = myclass()

print(obj.foo())
print(myclass.foo())
I expect that the code above will print the following:
public method
static method
However, the code prints the following:
public method
Traceback (most recent call last):
File "sandbox.py", line 14, in <module>
print(myclass.foo())
TypeError: foo() missing 1 required positional argument: 'self'
From this, I can only assume that calling myclass.foo() tries to call its non-static counterpart with no arguments (which won't work because non-static methods always accept the argument self). This behavior baffles me, because I expect any call to the static method to actually call the static method.
I've tested the issue in both Python 2.7 and 3.3, only to receive the same error.
Questions
Why does this happen, and what can I do to fix my code so it prints:
public method
static method
as I would expect?
While it's not strictly possible to do, as rightly pointed out, you could always "fake" it by redefining the method on instantiation, like this:
class YourClass(object):
    def __init__(self):
        self.foo = self._instance_foo

    @staticmethod
    def foo():
        print "Static!"

    def _instance_foo(self):
        print "Instance!"
which would produce the desired result:
>>> YourClass.foo()
Static!
>>> your_instance = YourClass()
>>> your_instance.foo()
Instance!
A similar question is here: override methods with same name in python programming
functions are looked up by name, so you are just redefining foo with an instance method. There is no such thing as an overloaded function in Python. You either write a new function with a separate name, or you provide the arguments in such a way that it can handle the logic for both.
In other words, you can't have a static version and an instance version of the same name. If you look at its vars you'll see one foo.
In [1]: class Test:
   ...:     @staticmethod
   ...:     def foo():
   ...:         print 'static'
   ...:     def foo(self):
   ...:         print 'instance'
   ...:

In [2]: t = Test()

In [3]: t.foo()
instance

In [6]: vars(Test)
Out[6]: {'__doc__': None, '__module__': '__main__', 'foo': <function __main__.foo>}
Because attribute lookup in Python is something within the programmer's control, this sort of thing is technically possible. If you put any value into writing code in a "pythonic" way (using the preferred conventions and idioms of the python community), it is very likely the wrong way to frame a problem / design. But if you know how descriptors can allow you to control attribute lookup, and how functions become bound functions (hint: functions are descriptors), you can accomplish code that is roughly what you want.
For a given name, there is only one object that will be looked up on a class, regardless of whether you are looking the name up on an instance of the class, or the class itself. Thus, the thing that you're looking up has to deal with the two cases, and dispatch appropriately.
(Note: this isn't exactly true; if an instance has a name in its attribute namespace that collides with one in the namespace of its class, the value on the instance will win in some circumstances. But even in those circumstances, it won't become a "bound method" in the way that you probably would wish it to.)
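A tiny sketch of that caveat (my own illustration): a callable stored on the instance shadows the class attribute, but it never becomes a bound method:

class C(object):
    def method(self):
        return "class version"

c = C()
c.method = lambda: "instance version"  # stored in c.__dict__, shadows the class attribute
print(c.method())    # 'instance version' (called with no self; not a bound method)
print(C().method())  # 'class version'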
I don't recommend designing your program using a technique such as this, but the following will do roughly what you asked. Understanding how this works requires a relatively deep understanding of python as a language.
class StaticOrInstanceDescriptor(object):
    def __get__(self, cls, inst):
        if cls is None:
            return self.instance.__get__(self)
        else:
            return self.static

    def __init__(self, static):
        self.static = static

    def instance(self, instance):
        self.instance = instance
        return self

class MyClass(object):
    @StaticOrInstanceDescriptor
    def foo():
        return 'static method'

    @foo.instance
    def foo(self):
        return 'public method'

obj = MyClass()
print(obj.foo())
print(MyClass.foo())
which does print out:
% python /tmp/sandbox.py
static method
public method
Ended up here from google so thought I would post my solution to this "problem"...
class Test():
    def test_method(self=None):
        if self is None:
            print("static bit")
        else:
            print("instance bit")
This way you can use the method like a static method or like an instance method.
When you try to call MyClass.foo(), Python will complain since you did not pass the one required self argument. @coderpatros's answer has the right idea, where we provide a default value for self, so it's no longer required. However, that won't work if there are additional arguments besides self. Here's a function that can handle almost all types of method signatures:
import inspect
from functools import wraps

def class_overload(cls, methods):
    """ Add classmethod overloads to one or more instance methods """
    for name in methods:
        func = getattr(cls, name)

        # required positional arguments
        pos_args = 1  # start at one, as we assume "self" is positional_only
        kwd_args = []  # [name:str, ...]
        sig = iter(inspect.signature(func).parameters.values())
        next(sig)
        for s in sig:
            if s.default is s.empty:
                if s.kind == s.POSITIONAL_ONLY:
                    pos_args += 1
                    continue
                elif s.kind == s.POSITIONAL_OR_KEYWORD:
                    kwd_args.append(s.name)
                    continue
            break

        @wraps(func)
        def overloaded(*args, **kwargs):
            # most common case: missing positional arg or 1st arg is not a cls instance
            isclass = len(args) < pos_args or not isinstance(args[0], cls)
            # handle ambiguous signatures, func(self, arg:cls, *args, **kwargs);
            # check if missing required positional_or_keyword arg
            if not isclass:
                for i in range(len(args) - pos_args, len(kwd_args)):
                    if kwd_args[i] not in kwargs:
                        isclass = True
                        break
            # class method
            if isclass:
                return func(cls, *args, **kwargs)
            # instance method
            return func(*args, **kwargs)

        setattr(cls, name, overloaded)

class Foo:
    def foo(self, *args, **kwargs):
        isclass = self is Foo
        print("foo {} method called".format(["instance", "class"][isclass]))

class_overload(Foo, ["foo"])

Foo.foo()    # "foo class method called"
Foo().foo()  # "foo instance method called"
You can use the isclass bool to implement the different logic for class vs instance method.
The class_overload function is a bit beefy and will need to inspect the signature when the class is declared. But the actual logic in the runtime decorator (overloaded) should be quite fast.
There's one signature that this solution won't work for: a method with an optional, first, positional argument of type Foo. It's impossible to tell if we are calling the static or instance method just by the signature in this case. For example:
def bad_foo(self, other: Foo = None):
    ...

bad_foo(f)  # f.bad_foo(None) or Foo.bad_foo(f) ???
Note, this solution may also report an incorrect isclass value if you pass in incorrect arguments to the method (a programmer error, so may not be important to you).
We can get a possibly more robust solution by doing the reverse of this: first start with a classmethod, and then create an instance method overload of it. This is essentially the same idea as @Dologan's answer, though I think mine is a little less boilerplatey if you need to do this on several methods:
from types import MethodType

def instance_overload(self, methods):
    """ Adds instance overloads for one or more classmethods """
    for name in methods:
        setattr(self, name, MethodType(getattr(self, name).__func__, self))

class Foo:
    def __init__(self):
        instance_overload(self, ["foo"])

    @classmethod
    def foo(self, *args, **kwargs):
        isclass = self is Foo
        print("foo {} method called:".format(["instance", "class"][isclass]))

Foo.foo()    # "foo class method called"
Foo().foo()  # "foo instance method called"
Not counting the code for class_overload or instance_overload, the code is equally succinct. Often signature introspection is touted as the "pythonic" way to do these kinds of things, but I think I'd recommend the instance_overload solution instead; isclass will be correct for any method signature, including cases where you call with incorrect arguments (a programmer error).
It is straightforward to delegate a called function's data to another function:
def test2(a, b):
    huh = locals()
    print huh

def test(a, b='hoho'):
    test2(**locals())
However, locals() contains self when a method is called and this gets in the way when attempting to do the same thing in a single line for method calls:
class X(object):
    def test2(self, a, b):
        huh = locals()
        print huh

    def test(self, a, b='hoho'):
        self.test2(**locals())  # no workie
        test2(**locals())       # no workie either
You should not be using locals() at all here; use *args and **kw to catch arguments and pass those on:
def test(self, *args, **kw):
    self.test2(*args, **kw)
Some time ago I wrote a function that introspects the function being called and passes named arguments only for parameters that actually exist; see this answer.
But I'm with Martijn in general, passing locals() somewhere else smells like a bad idea.
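For reference, here is a sketch of that idea (my own illustration, not the linked answer): filter locals() down to the parameters the callee actually accepts:

import inspect

def call_with_matching(func, available):
    """Pass on only the entries of `available` that `func` accepts."""
    params = inspect.signature(func).parameters
    return func(**{k: v for k, v in available.items() if k in params})

class X(object):
    def test2(self, a, b):
        print(locals())

    def test(self, a, b='hoho'):
        # locals() still contains self, but only a and b get forwarded
        call_with_matching(self.test2, locals())

X().test(1)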
For example, if I'm decorating a method like so
def my_decorator(fn):
    # Do something based on the class that fn is a method of
    def decorated_fn(*args, **kwargs):
        fn(*args, **kwargs)
    return decorated_fn

class MyClass(object):
    @my_decorator
    def my_method(self, param):
        print "foo"
Is it possible in my_decorator to determine where fn came from?
Short answer: No.
Longer answer: You can do it by mucking about in the stack trace (see the inspect module) but it's not a great idea.
Full answer: At the time the function gets decorated, it's still an unbound function. Try the following:
def my_dec(fn):
    print dir(fn)  # Has "func_code" and "func_name"
    return fn

class A(object):
    @my_dec
    def test(self):
        pass

print dir(A.test)  # Has "im_class" and "im_self"
You can see that the raw function gets passed to the decorator, while the bound function is available after the class is declared.
The way to accomplish this is to use the function decorator in conjunction with either a metaclass or a class decorator. In either case, the function decorator can set a flag on the function, and the metaclass or class decorator can look for it and do the appropriate thing.
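A rough sketch of that flag-and-collect approach using a class decorator (my own illustration; the collect_decorated name is made up):

def my_decorator(fn):
    def decorated_fn(*args, **kwargs):
        return fn(*args, **kwargs)
    decorated_fn._decorated = True  # flag for the class decorator to find
    return decorated_fn

def collect_decorated(cls):
    # The class decorator runs after the class body, so it knows the class
    # and can associate every flagged method with it.
    cls._decorated_methods = [name for name, attr in vars(cls).items()
                              if getattr(attr, "_decorated", False)]
    return cls

@collect_decorated
class MyClass(object):
    @my_decorator
    def my_method(self, param):
        print("foo")

print(MyClass._decorated_methods)  # ['my_method']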
No. You'll have to defer it until decorated_fn() is called.