How to prevent a function from being cast to bool - python

The following Python code has a bug:

class Location(object):
    def is_nighttime():
        return ...

if location.is_nighttime:
    close_shades()

The bug is that the programmer forgot to call is_nighttime (or forgot to use a @property decorator on the method), so the method object itself evaluates as True under bool without ever being called.
Is there a way to prevent the programmer from doing this, both in the case above, and in the case where is_nighttime is a standalone function instead of a method? For example, something in the following spirit?
is_nighttime.__bool__ = TypeError

In theory, you could wrap the function in a function-like object with a __call__ that delegates to the function and a __bool__ that raises a TypeError. It'd be really unwieldy and would probably cause more bad interactions than it'd catch - for example, these objects won't work as methods unless you add more special handling for that - but you could do it:
class NonBooleanFunction(object):
    """A function wrapper that prevents a function from being interpreted as a boolean."""
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)
    def __bool__(self):
        raise TypeError
    __nonzero__ = __bool__  # Python 2 name for __bool__
@NonBooleanFunction
def is_nighttime():
    return True  # We're at the Sun-Earth L2 point or something.

if is_nighttime:
    ...  # TypeError!
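Calling the wrapped function still works as usual, since __call__ delegates to the original; a quick sketch of the intended usage, reusing the names from the question:

if is_nighttime():  # the parentheses make this a call, which is fine
    close_shades()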
There's still a lot of stuff you can't catch:
nighttime_list.append(is_nighttime) # No TypeError ._.
And you have to remember to explicitly apply this to any functions you don't want being treated as booleans. You also can't do much about functions and methods you don't control; for example, you can't apply this to str.islower to catch things like if some_string.islower:.
If you want to catch things like this, I recommend using static analysis tools instead. I think IDEs like PyCharm might warn you, and there should be linting tools that can catch this.
If you want these things to work as methods, here's the extra handling for that:
import functools

class NonBooleanFunction(object):
    ...  # other methods omitted for brevity
    def __get__(self, instance, owner):
        if instance is None:
            return self
        return NonBooleanFunction(functools.partial(self.func, instance))
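With that __get__ in place the wrapper behaves as a descriptor, so methods work too; a minimal sketch, assuming the full NonBooleanFunction from above:

class Location(object):
    @NonBooleanFunction
    def is_nighttime(self):
        return True

loc = Location()
loc.is_nighttime()    # works: self is bound via functools.partial
if loc.is_nighttime:  # raises TypeError instead of being silently truthy
    pass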

This is something you can approach with static code analysis.
For instance, pylint has a related warning:
using-constant-test (W0125):
Using a conditional statement with a constant value. Emitted when a conditional statement (If or ternary if) uses a constant value for its test. This might not be what the user intended to do.
Demo:
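(The original post does not show test.py; a minimal file along these lines, reconstructed here, would reproduce the outputs below. For the second run, add the parentheses on line 4.)

def is_nighttime():
    return True

if is_nighttime:  # line 4: the function is referenced but never called
    pass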
If is_nighttime is not called:
$ pylint test.py
************* Module test
C: 1, 0: Missing module docstring (missing-docstring)
C: 1, 0: Missing function docstring (missing-docstring)
W: 4, 0: Using a conditional statement with a constant value (using-constant-test)
If called:
$ pylint test.py
************* Module test
C: 1, 0: Missing module docstring (missing-docstring)
C: 1, 0: Missing function docstring (missing-docstring)

Short answer: if is_nighttime():, with parentheses to call it.
Longer answer:
is_nighttime refers to a function object, which is not None. if expects a boolean condition, so the name is_nighttime is converted with bool(). Since a function object is neither zero nor None, it evaluates to True.
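A quick demonstration of why the uncalled function is always truthy:

def is_nighttime():
    return False

print(bool(is_nighttime))  # True: the function object itself is truthy
if is_nighttime:           # this branch always runs, even though is_nighttime() is False
    print("always taken")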

Related

Why was Python decorator chaining designed to work backwards? What is the logic behind this order?

To start with, my question here is about the semantics and the logic behind why the Python language was designed this way in the case of chained decorators. Please notice the nuance of how this differs from the question "How does decorator chaining work?". It seems quite a number of other users had the same doubts about the call order of chained Python decorators. It is not as if I can't add a __call__ and see the order for myself. I get this; my point is, why was it designed to start from the bottom when it comes to chained Python decorators?
E.g.
def first_func(func):
    def inner():
        x = func()
        return x * x
    return inner

def second_func(func):
    def inner():
        x = func()
        return 2 * x
    return inner

@first_func
@second_func
def num():
    return 10

print(num())
Quoting the documentation on decorators:
The decorator syntax is merely syntactic sugar, the following two function definitions are semantically equivalent:
def f(arg):
    ...
f = staticmethod(f)

@staticmethod
def f(arg):
    ...
From this it follows that the decoration in
@a
@b
@c
def fun():
    ...

is equivalent to

fun = a(b(c(fun)))
IOW, it was designed like that because it's just syntactic sugar.
For proof, let's just decorate an existing function and not return a new one:
def dec1(f):
    print(f"dec1: got {vars(f)}")
    f.dec1 = True
    return f

def dec2(f):
    print(f"dec2: got {vars(f)}")
    f.dec2 = True
    return f

@dec1
@dec2
def foo():
    pass

print(f"Fully decked out: {vars(foo)}")
prints out
dec2: got {}
dec1: got {'dec2': True}
Fully decked out: {'dec2': True, 'dec1': True}
TL;DR
g(f(x)) means applying f to x first, then applying g to the output.
Omit the parentheses, add @ before and a line break after each function name:

@g
@f
x

(This syntax is only valid if x is the definition of a function/class.)
Abstract explanation
The reasoning behind this design decision becomes fairly obvious IMHO, if you remember what the decorator syntax - in its most abstract and general form - actually means. So I am going to try the abstract approach to explain this.
It is all about syntax
To be clear here, the distinguishing factor in the concept of the "decorator" is neither the object underneath it (so to speak) nor the operation it performs. It is the special syntax and the restrictions on it. Thus, a decorator at its core is nothing more than a feature of Python grammar.
The decorator syntax requires a target to be decorated. Initially (see PEP 318) the target could only be function definitions; later class definitions were also allowed to be decorated (see PEP 3129).
Minimal valid syntax
Syntactically, this is valid Python:
def f(): pass

@f
class Target: pass  # or `def target(): pass`
However, this will (perhaps unsurprisingly) cause a TypeError upon execution. As has been reiterated multiple times here and in other posts on this platform, the above is equivalent to this:
def f(): pass
class Target: pass
Target = f(Target)
Minimal working decorator
The TypeError stems from the fact that f does not accept a positional argument. This is the obvious logical restriction imposed by what a decorator is supposed to do. Thus, to achieve not only syntactically valid code, but also have it run without errors, this is sufficient:
def f(x): pass
@f
class Target: pass
This is still not very useful, but it is enough for the most general form of a working decorator.
Decoration is just application of a function to the target and assigning the output to the target's name.
Chaining functions ⇒ Chaining decorators
We can ignore the target and what it is or does and focus only on the decorator. Since it merely stands for applying a function, the order of operations comes into play, as soon as we have more than one. What is the order of operation, when we chain functions?
def f(x): pass
def g(x): pass
class Target: pass
Target = g(f(Target))
Well, just like in the composition of purely mathematical functions, this implies that we apply f to Target first and then apply g to the result of f. Despite g appearing first (i.e. further left), it is not what is applied first.
Since stacking decorators is equivalent to nesting functions, it seems obvious to define the order of operation the same way. This time, we just skip the parentheses, add an @ symbol in front of the function name, and a line break after it.
def f(x): pass
def g(x): pass
@g
@f
class Target: pass
But, why though?
If after the explanation above (and reading the PEPs for historic background), the reasoning behind the order of operation is still not clear or still unintuitive, there is not really any good answer left, other than "because the devs thought it made sense, so get used to it".
PS
I thought I'd add a few things for additional context based on all the comments around your question.
Decoration vs. calling a decorated function
A source of confusion seems to be the distinction between what happens when applying the decorator versus calling the decorated function.
Notice that in my examples above I never actually called target itself (the class or function being decorated). Decoration is itself a function call: adding @f above the target calls f and passes the target to it as the first positional argument.
A "decorated function" might not even be a function
The distinction is very important because nowhere does it say that a decorator actually needs to return a callable (function or class). f being just a function means it can return whatever it wants. This is again valid and working Python code:
def f(x): return 3.14

@f
def target(): return "foo"

try:
    target()
except Exception as e:
    print(repr(e))
print(target)
Output:
TypeError("'float' object is not callable")
3.14
Notice that the name target does not even refer to a function anymore. It just holds the 3.14 returned by the decorator. Thus, we cannot even call target. The function behind it is lost immediately, before it ever becomes available in the global namespace. That is because f just completely ignores its first positional argument x.
Replacing a function
Expanding this further, if we want, we can have f return a function. Not doing that seems very strange, considering it is used to decorate a function. But it doesn't have to be related to the target at all. Again, this is fine:
def bar(): return "bar"
def f(x): return bar
@f
def target(): return "foo"
print(target())
print(target is bar)
Output:
bar
True
It comes down to convention
The way decorators are actually overwhelmingly used out in the wild, is in a way that still keeps a reference to the target being decorated around somewhere. In practice it can be as simple as this:
def f(x):
    print(f"applied `f({x.__name__})`")
    return x

@f
def target(): return "foo"
Just running this piece of code outputs applied f(target). Again, notice that we don't call target here, we only called f. But now, the decorated function is still target, so we could add the call print(target()) at the bottom and that would output foo after the other output produced by f.
The fact that most decorators don't just throw away their target comes down to convention. You (as a developer) would not expect your function/class to simply be thrown away completely, when you use a decorator.
Decoration with wrapping
This is why real-life decorators typically either return the reference to the target outright at the end (like in the last example) or return a different callable which itself calls the target, meaning a reference to the target is kept in that new callable's local namespace. These functions are what is usually referred to as wrappers:
def f(x):
    print(f"applied `f({x.__name__})`")
    def wrapper():
        print(f"wrapper executing with {locals()=}")
        return x()
    return wrapper

@f
def target(): return "foo"

print(f"{target()=}")
print(f"{target.__name__=}")
Output:
applied `f(target)`
wrapper executing with locals()={'x': <function target at 0x7f1b2f78f250>}
target()='foo'
target.__name__='wrapper'
As you can see, what the decorator left us is wrapper, not what we originally defined as target. And the wrapper is what we call, when we write target().
Wrapping wrappers
This is the kind of behavior we typically expect when we use decorators. It is therefore not surprising that multiple stacked decorators behave the way they do. They are called from the inside out (as explained above), and each adds its own wrapper around what it receives from the one applied before:
def f(x):
    print(f"applied `f({x.__name__})`")
    def wrapper_from_f():
        print(f"wrapper_from_f executing with {locals()=}")
        return x()
    return wrapper_from_f

def g(x):
    print(f"applied `g({x.__name__})`")
    def wrapper_from_g():
        print(f"wrapper_from_g executing with {locals()=}")
        return x()
    return wrapper_from_g

@g
@f
def target(): return "foo"

print(f"{target()=}")
print(f"{target.__name__=}")
Output:
applied `f(target)`
applied `g(wrapper_from_f)`
wrapper_from_g executing with locals()={'x': <function f.<locals>.wrapper_from_f at 0x7fbfc8d64f70>}
wrapper_from_f executing with locals()={'x': <function target at 0x7fbfc8d65630>}
target()='foo'
target.__name__='wrapper_from_g'
This shows very clearly the difference between the order in which the decorators are called and the order in which the wrapped/wrapping functions are called.
After the decoration is done, we are left with wrapper_from_g, which is referenced by our target name in global namespace. When we call it, wrapper_from_g executes and calls wrapper_from_f, which in turn calls the original target.
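As a side note, the metadata clobbering seen above (target.__name__ ending up as 'wrapper_from_g') is usually avoided with functools.wraps; a minimal sketch:

import functools

def f(x):
    @functools.wraps(x)  # copies __name__, __doc__, etc. from x onto wrapper
    def wrapper():
        return x()
    return wrapper

@f
def target(): return "foo"

print(target.__name__)  # 'target', not 'wrapper'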

How to make a proper function wrapper

I used a naive approach to write a wrapper: get all *args and **kwargs and pass them on to the wrapped function. But something went wrong, so I simplified the example to its core to illustrate my trouble.
# simplest wrapper possible: just pass the args
def wraps(f):
    def call(*argv, **kw):
        # add some meaningful manipulations later
        return f(*argv, **kw)
    return call

# check the wrapper behaves identically
class M:
    def __init__(this, param):
        this.param = param

M.__new__ = M.__new__
m1 = M(1)
M.__new__ = wraps(M.__new__)
m2 = M(2)
m1 is instantiated normally, but m2 fails with the following error:
TypeError: object.__new__() takes exactly one argument (the type to instantiate)
The question is how to define the wraps and call functions properly so that the wrapper behaves identically to the wrapped function, regardless of which function is wrapped.
Obviously this is not the end objective, since a primitive lambda x: x would otherwise suffice. It is a starting point from which I can introduce further complications.
The short answer: it's impossible. You cannot define a perfect wrapper in Python (and in many other languages, too).
The slightly longer version: a Python function is a first-class object, and all manipulations applicable to objects can be performed on a function too. So you cannot assume that some complex procedure will limit itself to merely calling the function passed as an argument and will not use the function object in other, less obvious ways.
Much more verbose speculation with examples
Functions defined on only part of their domain are pretty common:
def half(i):
    if i < 0:
        raise ValueError
    if i & 1:
        raise ValueError
    return i / 2
Pretty straightforward. Now we can get a little more confusing:
class Veggy:
    def __init__(this, kind):
        this.kind = kind
    def pr(this):
        print(this.kind)

def assess(v):
    if v.kind in ['tomato', 'carrot']:
        raise ValueError
    v.pr()
Here a Veggy instance is used as a function proxy, but it also has a public attribute kind, which the assess function checks before executing.
The same thing can be done with a function object, since it too has additional attributes besides being callable:
def test(x):
    return x + x

def assess4(f, *argv, **kw):
    if f.__name__ != 'test':
        raise ValueError
    if f.__module__ != '__main__':
        raise ValueError
    if len(f.__code__.co_code) % 8 == 4:
        raise ValueError
    return f(*argv, **kw)
Writing a correct wrapper becomes a challenge, and the challenge can be complicated further:
def assess0(f, *argv, **kw):
    if len(f.__code__.co_code) % 8 == 0:
        kw['arg'] = True
        return f(*argv[1:], **kw)
    else:
        kw['arg'] = False
        return f(*argv[:-1], **kw)
A universal wrapper would have to handle both assess0 and assess4 correctly, which is pretty much impossible. And we have not even touched id() magic: a caller that checks id() casts the one acceptable function object in stone, since no wrapper can ever share it.
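To illustrate that last point, a minimal sketch (the assess_id checker is hypothetical, just to show why no wrapper can pass an identity check):

def test(x):
    return x + x

KNOWN_ID = id(test)  # the caller pinned the exact function object

def assess_id(f, *argv, **kw):
    if id(f) != KNOWN_ID:
        raise ValueError("not the exact function object we expected")
    return f(*argv, **kw)

assess_id(test, 21)             # fine: returns 42
assess_id(lambda x: x + x, 21)  # ValueError: any wrapper is a different object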
Coding etiquette
So you cannot write a perfect wrapper. Why would anyone bother to write one? Why are wrappers so common when they cannot guarantee behavioral equivalence and can introduce non-trivial changes in code flow?
The simple answer is coding conventions: the famous substitution principle. Code should keep its behavioral properties when one object is substituted with another of the same type. Python puts little focus on declaring and enforcing types; a rigorous type system is not a must, and you can establish APIs and protocols through documentation and type annotations, as the Python language itself does.
Programs must be written for people to read, and only incidentally for machines to execute. OOP conventions are all in people's minds. The Python developers broke convention by giving the overriding of some object methods non-standard behavior, and this unconventional treatment makes it impossible to use naive decorators to transform the __init__ and __new__ methods.
The final solution
If Python treats __new__ as special, then a generic wrapper should do the same.
# simplest wrapper possible: just pass the args
def wraps(f):
    def call(*argv, **kw):
        # add some meaningful manipulations later
        return f(*argv, **kw)
    def call_new(*argv, **kw):
        # add some meaningful manipulations later
        return f(argv[0])
    if f is object.__new__:
        return call_new
    # elif other_special_case: pass
    else:
        return call
Now it successfully passes the test:
# check the wrapper behaves identically
class M:
    def __init__(this, param):
        this.param = param

M.__new__ = M.__new__
m1 = M(1)
M.__new__ = wraps(M.__new__)
m2 = M(2)
The drawback is that you have to implement a distinct workaround for every other convention-breaking function besides __new__ to make your wrapper even semi-applicable in a universal context. But that is the best you can get out of Python.

Python function loses identity after being decorated

(Python 3)
First of all, I feel my title isn't quite what it should be, so if you stick through the question and come up with a better title, please feel free to edit it.
I have recently learned about Python Decorators and Python Annotations, and so I wrote two little functions to test what I have recently learned.
One of them, called wraps, is supposed to mimic the behaviour of functools.wraps, while the other, called ensure_types, is supposed to check, for a given function and through its annotations, whether the arguments passed to that function are of the correct types.
This is the code I have for those functions:
def wraps(original_func):
    """Update the decorated function with some important attributes from the
    one that was decorated so as not to lose good information"""
    def update_attrs(new_func):
        # Update the __annotations__
        for key, value in original_func.__annotations__.items():
            new_func.__annotations__[key] = value
        # Update the __dict__
        for key, value in original_func.__dict__.items():
            new_func.__dict__[key] = value
        # Copy the __name__
        new_func.__name__ = original_func.__name__
        # Copy the docstring (__doc__)
        new_func.__doc__ = original_func.__doc__
        return new_func
    return update_attrs  # return the decorator
def ensure_types(f):
    """Uses f.__annotations__ to check the expected types for the function's
    arguments. Raises a TypeError if there is no match.
    If an argument has no annotation, object is returned and so, regardless of
    the argument passed, isinstance(arg, object) evaluates to True"""
    @wraps(f)  # say that test_types is wrapping f
    def test_types(*args, **kwargs):
        # Loop through the positional args, get their name and check the type
        for i in range(len(args)):
            # function.__code__.co_varnames is a tuple with the names of the
            # arguments in the order they appear in the function def statement
            var_name = f.__code__.co_varnames[i]
            if not isinstance(args[i], f.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        # Loop through the named args, get their value and check the type
        for key in kwargs.keys():
            if not isinstance(kwargs[key], f.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types
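Used on its own, ensure_types behaves as intended; a quick sketch (my example, not from the original post):

@ensure_types
def greet(name: str):
    print(name)

greet("hello")  # fine
greet(3)        # TypeError: Bad type for function argument named 'name'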
Supposedly, everything is alright until now. Both wraps and ensure_types are supposed to be used as decorators. The problem comes when I define a third decorator, debug_dec, that is supposed to print to the console when a function is called, along with its arguments. The function:
def debug_dec(f):
    """Does some annoying printing for debugging purposes"""
    @wraps(f)
    def profiler(*args, **kwargs):
        print("{} function called:".format(f.__name__))
        print("\tArgs: {}".format(args))
        print("\tKwargs: {}".format(kwargs))
        return f(*args, **kwargs)
    return profiler
That also works fine on its own. The problem comes when I try to use debug_dec and ensure_types at the same time.
@ensure_types
@debug_dec
def testing(x: str, y: str = "lol"):
    print(x)
    print(y)

testing("hahaha", 3)  # does not raise the TypeError I expected
But if I change the order in which the decorators are applied, it works just fine.
Can someone please help me understand what is going wrong, and whether there is any way of solving the problem besides swapping those two lines?
EDIT
If I add the lines:
print(testing.__annotations__)
print(testing.__code__.co_varnames)

The output is as follows:

{'y': <class 'str'>, 'x': <class 'str'>}
('args', 'kwargs', 'i', 'var_name', 'key')
Although wraps maintains the annotations, it doesn't maintain the function signature. You see this when you print out the co_varnames. Since ensure_types does its checking by comparing the names of the arguments with the names in the annotation dict, it fails to match them up, because the wrapped function has no arguments named x and y (it just accepts generic *args and **kwargs).
You could try using the decorator module, which lets you write decorators that act like functools.wrap but also preserve the function signature (including annotations).
There is probably also a way to make it work "manually", but it would be a bit of a pain. Basically, you would have wraps store the original function's argspec (the names of its arguments), then have ensure_types use this stored argspec instead of the wrapper's argspec when checking the types. Essentially, your decorators would pass the argspec along in parallel with the wrapped functions. However, using decorator is probably easier.
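A minimal sketch of that manual approach, assuming the wraps and ensure_types from the question (the __wrapped__ attribute name is my choice here, mirroring what functools.wraps sets):

# In the question's wraps, additionally remember the original function:
#     new_func.__wrapped__ = original_func
# ensure_types can then dig down to the original signature:

def ensure_types(f):
    real = getattr(f, "__wrapped__", f)  # innermost function, if recorded
    def test_types(*args, **kwargs):
        for i, arg in enumerate(args):
            var_name = real.__code__.co_varnames[i]
            if not isinstance(arg, real.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        for key, value in kwargs.items():
            if not isinstance(value, real.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types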

Decorator for overloading in Python

I know it's not Pythonic to write functions that care about the type of the arguments, but there are cases when it's simply impossible to ignore types because they are handled differently.
Having a bunch of isinstance checks in your function is just ugly; is there any function decorator available that enables function overloads? Something like this:
@overload(str)
def func(val):
    print('This is a string')

@overload(int)
def func(val):
    print('This is an int')
Update:
Here's some comments I left on David Zaslavsky's answer:
With a few modification[s], this will suit my purposes pretty well. One other limitation I noticed in your implementation, since you use func.__name__ as the dictionary key, you are prone to name collisions between modules, which is not always desirable. [cont'd]
[cont.] For example, if I have one module that overloads func, and another completely unrelated module that also overloads func, these overloads will collide because the function dispatch dict is global. That dict should be made local to the module, somehow. And not only that, it should also support some kind of 'inheritance'. [cont'd]
[cont.] By 'inheritance' I mean this: say I have a module first with some overloads. Then two more modules that are unrelated but each import first; both of these modules add new overloads to the already existing ones that they just imported. These two modules should be able to use the overloads in first, but the new ones that they just added should not collide with each other between modules. (This is actually pretty hard to do right, now that I think about it.)
Some of these problems could possibly be solved by changing the decorator syntax a little bit:
first.py

@overload(str, str)
def concatenate(a, b):
    return a + b

@concatenate.overload(int, int)
def concatenate(a, b):
    return str(a) + str(b)

second.py

from first import concatenate

@concatenate.overload(float, str)
def concatenate(a, b):
    return str(a) + b
Since Python 3.4 the functools module supports a @singledispatch decorator. It works like this:

from functools import singledispatch

@singledispatch
def func(val):
    raise NotImplementedError

@func.register
def _(val: str):
    print('This is a string')

@func.register
def _(val: int):
    print('This is an int')
Usage:

func("test")  # "This is a string"
func(1)       # "This is an int"
func(None)    # NotImplementedError
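One caveat worth noting: the annotation-based @func.register form shown above requires Python 3.7+. On 3.4-3.6 you pass the type to register explicitly:

@func.register(str)
def _(val):
    print('This is a string')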
Quick answer: there is an overload package on PyPI which implements this more robustly than what I describe below, although using a slightly different syntax. It's declared to work only with Python 3 but it looks like only slight modifications (if any, I haven't tried) would be needed to make it work with Python 2.
Long answer: In languages where you can overload functions, the name of a function is (either literally or effectively) augmented by information about its type signature, both when the function is defined and when it is called. When a compiler or interpreter looks up the function definition, then, it uses both the declared name and the types of the parameters to resolve which function to access. So the logical way to implement overloading in Python is to implement a wrapper that uses both the declared name and the parameter types to resolve the function.
Here's a simple implementation:
from collections import defaultdict

def determine_types(args, kwargs):
    return tuple(type(a) for a in args), \
           tuple((k, type(v)) for k, v in kwargs.items())

function_table = defaultdict(dict)

def overload(arg_types=(), kwarg_types=()):
    def wrap(func):
        named_func = function_table[func.__name__]
        named_func[arg_types, kwarg_types] = func
        def call_function_by_signature(*args, **kwargs):
            return named_func[determine_types(args, kwargs)](*args, **kwargs)
        return call_function_by_signature
    return wrap
overload should be called with two optional arguments, a tuple representing the types of all positional arguments and a tuple of tuples representing the name-type mappings of all keyword arguments. Here's a usage example:
>>> @overload((str, int))
... def f(a, b):
...     return a * b
>>> @overload((int, int))
... def f(a, b):
...     return a + b
>>> print(f('a', 2))
aa
>>> print(f(4, 2))
6
>>> @overload((str,), (('foo', int), ('bar', float)))
... def g(a, foo, bar):
...     return foo*a + str(bar)
>>> @overload((str,), (('foo', float), ('bar', float)))
... def g(a, foo, bar):
...     return a + str(foo*bar)
>>> print(g('a', foo=7, bar=4.4))
aaaaaaa4.4
>>> print(g('b', foo=7., bar=4.4))
b30.8
Shortcomings of this include
It doesn't actually check that the function the decorator is applied to is even compatible with the arguments given to the decorator. You could write

@overload((str, int))
def h():
    return 0

and you'd get an error when the function was called.
It doesn't gracefully handle the case where no overloaded version exists corresponding to the types of the arguments passed (it would help to raise a more descriptive error)
It distinguishes between named and positional arguments, so something like
g('a', 7, bar=4.4)
doesn't work.
There are a lot of nested parentheses involved in using this, as in the definitions for g.
As mentioned in the comments, this doesn't deal with functions having the same name in different modules.
All of these could be remedied with enough fiddling, I think. In particular, the issue of name collisions is easily resolved by storing the dispatch table as an attribute of the function returned from the decorator. But as I said, this is just a simple example to demonstrate the basics of how to do it.
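A minimal sketch of that last remedy, combining the per-function dispatch table with the @concatenate.overload syntax proposed in the question (my illustration, positional arguments only, not from the original answer):

def overload(*arg_types):
    """Create a dispatcher whose overload table lives on the dispatcher itself."""
    def wrap(func):
        table = {arg_types: func}
        def dispatcher(*args):
            return table[tuple(type(a) for a in args)](*args)
        def register(*more_types):
            # used as @dispatcher.overload(int, int): extends THIS table only
            def add(another_func):
                table[more_types] = another_func
                return dispatcher
            return add
        dispatcher.overload = register
        return dispatcher
    return wrap

@overload(str, str)
def concatenate(a, b):
    return a + b

@concatenate.overload(int, int)
def concatenate(a, b):
    return str(a) + str(b)

print(concatenate("a", "b"))  # ab
print(concatenate(1, 2))      # 12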
This doesn't directly answer your question, but if you really want to have something that behaves like an overloaded function for different types and (quite rightly) don't want to use isinstance then I'd suggest something like:
def func(int_val=None, str_val=None):
    if sum(x is not None for x in (int_val, str_val)) != 1:
        raise ValueError("exactly one value should be passed in")
    if int_val is not None:
        print('This is an int')
    if str_val is not None:
        print('This is a string')
In use the intent is obvious, and it doesn't even require the different options to have different types:
func(int_val=3)
func(str_val="squirrel")
Yes, there is an overload decorator in the typing module that can be used to make complex type hints easier. Note that it only informs static type checkers; the final, undecorated definition is the one that actually runs, and it must perform any runtime dispatch itself (here via isinstance).
from collections.abc import Sequence
from typing import overload

@overload
def double(input_: int) -> int:
    ...

@overload
def double(input_: Sequence[int]) -> list[int]:
    ...

def double(input_: int | Sequence[int]) -> int | list[int]:
    if isinstance(input_, Sequence):
        return [i * 2 for i in input_]
    return input_ * 2
See the typing documentation on overload for more details.

Is there a Python equivalent to Ruby's respond_to?

Is there a way to see if a class responds to a method in Python, like in Ruby:

class Fun
  def hello
    puts 'Hello'
  end
end

fun = Fun.new
puts fun.respond_to? 'hello' # true
Also is there a way to see how many arguments the method requires?
Hmmm .... I'd think that hasattr and callable would be the easiest way to accomplish the same goal:
class Fun:
    def hello(self):
        print('Hello')

hasattr(Fun, 'hello')  # -> True
callable(Fun.hello)    # -> True
You could, of course, call callable(Fun.goodbye) from within an exception-handling suite:

try:
    callable(Fun.goodbye)
except AttributeError:
    return False
As for introspection on the number of required arguments; I think that would be of dubious value to the language (even if it existed in Python) because that would tell you nothing about the required semantics. Given both the ease with which one can define optional/defaulted arguments and variable argument functions and methods in Python it seems that knowing the "required" number of arguments for a function would be of very little value (from a programmatic/introspective perspective).
Has method:
func = getattr(Fun, "hello", None)
if callable(func):
    ...
Arity:
import inspect
args, varargs, varkw, defaults = inspect.getargspec(Fun.hello)
arity = len(args)
Note that arity can be pretty much anything if you have varargs and/or varkw not None.
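On modern Python, inspect.getargspec is gone (deprecated since 3.0 and removed in 3.11), so the equivalent with inspect.signature would be along these lines (a sketch; signature also understands varargs, keyword-only parameters, and defaults):

import inspect

class Fun:
    def hello(self):
        print('Hello')

sig = inspect.signature(Fun.hello)
arity = len(sig.parameters)
print(arity)  # 1: just `self` when inspecting the plain function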
dir(instance) returns a list of an object's attributes.
getattr(instance, "attr") returns an object's attribute.
callable(x) returns True if x is callable.
class Fun(object):
    def hello(self):
        print("Hello")

f = Fun()
callable(getattr(f, 'hello'))
I am no Ruby expert, so I am not sure if this answers your question. I think you want to check if an object has a method. There are numerous ways to do so. You can use the hasattr() function to see if an object has the method:
hasattr(fun, "hello") #True
Or you can follow the Python guideline of "it's easier to ask forgiveness than permission" and just catch the exception thrown when the object doesn't have the method:
try:
    fun.hello2()
except AttributeError:
    print("fun does not have the attribute hello2")
