Is there a Python equivalent to Ruby's respond_to?

Is there a way to see if a class responds to a method in Python? Like in Ruby:

class Fun
  def hello
    puts 'Hello'
  end
end

fun = Fun.new
puts fun.respond_to? 'hello'  # true

Also, is there a way to see how many arguments the method requires?

Hmm ... I'd think that hasattr and callable would be the easiest way to accomplish the same goal:

class Fun:
    def hello(self):
        print('Hello')

hasattr(Fun, 'hello')   # -> True
callable(Fun.hello)     # -> True

You could, of course, do the lookup from within an exception-handling suite (inside some helper function), since accessing a missing attribute raises AttributeError:

try:
    return callable(Fun.goodbye)
except AttributeError:
    return False
As for introspection on the number of required arguments: I think that would be of dubious value, because it would tell you nothing about the required semantics. Given the ease with which one can define optional/defaulted arguments and variable-argument functions and methods in Python, it seems that knowing the "required" number of arguments for a function would be of very little value (from a programmatic/introspective perspective).

Has method:

func = getattr(Fun, "hello", None)
if callable(func):
    ...

Arity:

import inspect
args, varargs, varkw, defaults = inspect.getargspec(Fun.hello)
arity = len(args)

Note that the arity can be pretty much anything if varargs and/or varkw are not None.
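Note that inspect.getargspec is deprecated on Python 3 and was removed in Python 3.11; on modern Python, inspect.signature is the supported tool. A minimal sketch of counting required arguments with it (the helper name required_arity is just illustrative):

import inspect

def required_arity(func):
    """Count parameters that have no default and are not *args/**kwargs."""
    sig = inspect.signature(func)
    return sum(
        1 for p in sig.parameters.values()
        if p.default is inspect.Parameter.empty
        and p.kind not in (p.VAR_POSITIONAL, p.VAR_KEYWORD)
    )

class Fun:
    def hello(self, name, greeting='Hello'):
        print(greeting, name)

required_arity(Fun.hello)    # 2 -- 'self' and 'name'
required_arity(Fun().hello)  # 1 -- bound method, so just 'name'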

dir(instance) returns a list of an object's attribute names.
getattr(instance, "attr") returns an object's attribute.
callable(x) returns True if x is callable.

class Fun(object):
    def hello(self):
        print('Hello')

f = Fun()
callable(getattr(f, 'hello'))  # -> True

I am no Ruby expert, so I am not sure if this answers your question. I think you want to check whether an object has a method. There are numerous ways to do so. You can use the hasattr() function to see if the object has the method:

hasattr(fun, "hello")  # True

Or you can follow the Python EAFP guideline ("easier to ask forgiveness than permission") and just catch the exception thrown when the object doesn't have the method:

try:
    fun.hello2()
except AttributeError:
    print("fun does not have the attribute hello2")

Related

How to write unittest for variable assignment in python?

This is in Python 2.7. I have a class A, and there are some attributes that should throw an error when set by the user:

myA = A()
myA.myattribute = 9  # this should throw an error

I want to write a unittest that ensures that this throws an error.
After creating a test class that inherits from unittest.TestCase, I tried to write a test like this:

myA = A()
self.assertRaises(AttributeError, eval('myA.myattribute = 9'))

But this throws a syntax error. However, if I run eval('myA.myattribute = 9') on its own, it throws the attribute error, as it should.
How do I write a unittest to test this correctly?
Thanks.
You can also use assertRaises as a context manager:

with self.assertRaises(AttributeError):
    myA.myattribute = 9

The documentation for assertRaises shows more examples and has a lot more detail on this subject.
From that documentation:
If only the exception and possibly the msg arguments are given, return a context manager so that the code under test can be written inline rather than as a function:

with self.assertRaises(SomeException):
    do_something()
which is exactly what you are trying to do.
self.assertRaises takes a callable (and, optionally, one or more arguments for that callable) as its arguments; you were providing the value that results from calling the callable with its arguments. Note also that eval only accepts expressions, so it can never evaluate an assignment like 'myA.myattribute = 9'. A form that passes the callable and its arguments separately uses setattr, which performs the assignment when assertRaises invokes it:

self.assertRaises(AttributeError, setattr, myA, 'myattribute', 9)
However, you should use assertRaises as a context manager, which allows you to write the much more natural:

with self.assertRaises(AttributeError):
    myA.myattribute = 9
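
For reference, a minimal sketch of what a class like A might look like so that the assignment raises AttributeError (assuming myattribute is meant to be read-only and exposed as a property with no setter; the names are just illustrative):

import unittest

class A(object):
    def __init__(self):
        self._myattribute = 0

    @property
    def myattribute(self):
        return self._myattribute  # read-only: no setter defined

class TestA(unittest.TestCase):
    def test_setting_myattribute_raises(self):
        myA = A()
        with self.assertRaises(AttributeError):
            myA.myattribute = 9  # assigning to a property with no setter

if __name__ == '__main__':
    unittest.main()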

How to prevent a function from being cast to bool

The following Python code has a bug:

class Location(object):
    def is_nighttime():
        return ...

if location.is_nighttime:
    close_shades()

The bug is that the programmer forgot to call is_nighttime (or forgot to use a @property decorator on the method), so the method object itself is evaluated as True in the boolean context, and the method is never called.
Is there a way to prevent the programmer from doing this, both in the case above and in the case where is_nighttime is a standalone function instead of a method? For example, something in the following spirit?

is_nighttime.__bool__ = TypeError
In theory, you could wrap the function in a function-like object with a __call__ that delegates to the function and a __bool__ that raises a TypeError. It'd be really unwieldy and would probably cause more bad interactions than it'd catch - for example, these objects won't work as methods unless you add more special handling for that - but you could do it:
class NonBooleanFunction(object):
    """A function wrapper that prevents a function from being interpreted as a boolean."""
    def __init__(self, func):
        self.func = func
    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)
    def __bool__(self):
        raise TypeError
    __nonzero__ = __bool__  # Python 2 spelling of __bool__

@NonBooleanFunction
def is_nighttime():
    return True  # We're at the Sun-Earth L2 point or something.

if is_nighttime:  # TypeError!
    ...
There's still a lot of stuff you can't catch:
nighttime_list.append(is_nighttime) # No TypeError ._.
And you have to remember to explicitly apply this to any functions you don't want being treated as booleans. You also can't do much about functions and methods you don't control; for example, you can't apply this to str.islower to catch things like if some_string.islower:.
If you want to catch things like this, I recommend using static analysis tools instead. I think IDEs like PyCharm might warn you, and there should be linting tools that can catch this.
If you want these things to work as methods, here's the extra handling for that:
import functools

class NonBooleanFunction(object):
    ...  # other methods omitted for brevity
    def __get__(self, instance, owner):
        if instance is None:
            return self
        return NonBooleanFunction(functools.partial(self.func, instance))
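
With that in place, a quick sketch of the method case in action (the House class here is hypothetical, just to show the descriptor handling at work):

class House(object):
    @NonBooleanFunction
    def is_empty(self):
        return False

h = House()
h.is_empty()    # fine: __get__ binds self via functools.partial
if h.is_empty:  # TypeError, as intended
    ...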
This is something you can approach with static code analysis.
For instance, pylint has a related warning:
using-constant-test (W0125):
Using a conditional statement with a constant value. Emitted when a conditional statement (If or ternary if) uses a constant value for its test. This might not be what the user intended to do.
Demo:
If is_nighttime is not called:
$ pylint test.py
************* Module test
C: 1, 0: Missing module docstring (missing-docstring)
C: 1, 0: Missing function docstring (missing-docstring)
W: 4, 0: Using a conditional statement with a constant value (using-constant-test)
If called:
$ pylint test.py
************* Module test
C: 1, 0: Missing module docstring (missing-docstring)
C: 1, 0: Missing function docstring (missing-docstring)
Short answer: if is_nighttime():, with parentheses to call it.
Longer answer:
is_nighttime points to a function object, and if simply evaluates the truthiness of its condition. The bare name is_nighttime is therefore converted to a boolean without the function being called; since a function object is neither zero nor None, it evaluates to True.
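
A toy illustration of that difference:

def is_nighttime():
    return False

bool(is_nighttime)    # True  -- the function object itself is truthy
bool(is_nighttime())  # False -- the result of actually calling it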

Python function loses identity after being decorated

(Python 3)
First of all, I feel my title isn't quite what it should be, so if you stick through the question and come up with a better title, please feel free to edit it.
I have recently learned about Python Decorators and Python Annotations, and so I wrote two little functions to test what I have recently learned.
One of them, called wraps, is supposed to mimic the behaviour of functools.wraps, while the other, called ensure_types, is supposed to check, through a given function's annotations, whether the arguments passed to it are of the correct types.
This is the code I have for those functions:
def wraps(original_func):
    """Update the decorated function with some important attributes from the
    one that was decorated so as not to lose good information"""
    def update_attrs(new_func):
        # Update the __annotations__
        for key, value in original_func.__annotations__.items():
            new_func.__annotations__[key] = value
        # Update the __dict__
        for key, value in original_func.__dict__.items():
            new_func.__dict__[key] = value
        # Copy the __name__
        new_func.__name__ = original_func.__name__
        # Copy the docstring (__doc__)
        new_func.__doc__ = original_func.__doc__
        return new_func
    return update_attrs  # return the decorator
def ensure_types(f):
    """Uses f.__annotations__ to check the expected types for the function's
    arguments. Raises a TypeError if there is no match.
    If an argument has no annotation, object is returned and so, regardless of
    the argument passed, isinstance(arg, object) evaluates to True"""
    @wraps(f)  # say that test_types is wrapping f
    def test_types(*args, **kwargs):
        # Loop through the positional args, get their name and check the type
        for i in range(len(args)):
            # function.__code__.co_varnames is a tuple with the names of the
            # arguments in the order they are in the function def statement
            var_name = f.__code__.co_varnames[i]
            if not isinstance(args[i], f.__annotations__.get(var_name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(var_name))
        # Loop through the named args, get their value and check the type
        for key in kwargs.keys():
            if not isinstance(kwargs[key], f.__annotations__.get(key, object)):
                raise TypeError("Bad type for function argument named '{}'".format(key))
        return f(*args, **kwargs)
    return test_types
Supposedly, everything is alright until now. Both wraps and ensure_types are meant to be used as decorators. The problem comes when I define a third decorator, debug_dec, that is supposed to print to the console whenever a function is called, along with its arguments. The function:
def debug_dec(f):
    """Does some annoying printing for debugging purposes"""
    @wraps(f)
    def profiler(*args, **kwargs):
        print("{} function called:".format(f.__name__))
        print("\tArgs: {}".format(args))
        print("\tKwargs: {}".format(kwargs))
        return f(*args, **kwargs)
    return profiler
That also works nicely. The problem comes when I try to use debug_dec and ensure_types at the same time:

@ensure_types
@debug_dec
def testing(x: str, y: str = "lol"):
    print(x)
    print(y)

testing("hahaha", 3)  # raises no TypeError, although it should
But if I change the order with which the decorators are called, it works just fine.
Can someone please help me understand what is going wrong, and whether there is any way of solving the problem besides swapping those two lines?
EDIT
If I add the lines:
print(testing.__annotations__)
print(testing.__code__.co_varnames)
The output is as follows:
#{'y': <class 'str'>, 'x': <class 'str'>}
#('args', 'kwargs', 'i', 'var_name', 'key')
Although wraps maintains the annotations, it doesn't maintain the function signature. You see this when you print out the co_varnames. Since ensure_types does its checking by comparing the names of the arguments with the names in the annotation dict, it fails to match them up, because the wrapped function has no arguments named x and y (it just accepts generic *args and **kwargs).
You could try using the decorator module, which lets you write decorators that act like functools.wraps but also preserve the function signature (including annotations).
There is probably also a way to make it work "manually", but it would be a bit of a pain. Basically, what you would have to do is have wraps store the original function's argspec (the names of its arguments), then have ensure_types use this stored argspec instead of the wrapper's argspec when checking the types. Essentially, your decorators would pass the argspec along in parallel with the wrapped functions. However, using decorator is probably easier.
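On Python 3.4+ there is also a standard-library route: functools.wraps sets a __wrapped__ attribute on the wrapper, and inspect.signature follows that attribute back to the original function. A minimal sketch of ensure_types rebuilt on top of those two (note this assumes both decorators use functools.wraps rather than the hand-rolled wraps above, since the hand-rolled version doesn't set __wrapped__):

import functools
import inspect

def ensure_types(f):
    sig = inspect.signature(f)  # follows __wrapped__ set by functools.wraps
    hints = f.__annotations__
    @functools.wraps(f)
    def test_types(*args, **kwargs):
        # Map positional and keyword arguments onto the real parameter names
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            if not isinstance(value, hints.get(name, object)):
                raise TypeError("Bad type for function argument named '{}'".format(name))
        return f(*args, **kwargs)
    return test_types

With both decorators built on functools.wraps, the stacking order stops mattering, because inspect.signature always unwraps back to the original testing function.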

How can I overload in Python?

I'm trying to make a function that does different things when called on different argument types. Specifically, one of the functions should have the signature
def myFunc(string, string):
and the other should have the signature
def myFunc(list):
How can I do this, given that I'm not allowed to specify whether the arguments are strings or lists?
Python does not support overloading, not even by argument count. You need to do:

def foo(string_or_list, string=None):
    if isinstance(string_or_list, list):
        ...
    else:
        ...

which is pretty silly, or just rethink your design so that you don't have to overload.
There is a recipe at http://code.activestate.com/recipes/577065-type-checking-function-overloading-decorator/ which does what you want: basically, you wrap each version of your function with @takes and @returns type declarations; when you call the function, it tries each version until it finds one that does not throw a type error.
Edit: here is a cut-down version; it's probably not a good thing to do, but if you gotta, here's how:
from collections import defaultdict

def overloaded_function(overloads):
    """
    Accepts a sequence of ((arg_types,), fn) pairs.
    Creates a dispatcher function.
    """
    dispatch_table = defaultdict(list)
    for arg_types, fn in overloads:
        dispatch_table[len(arg_types)].append([list(arg_types), fn])
    def dispatch(*args):
        for arg_types, fn in dispatch_table[len(args)]:
            if all(isinstance(arg, arg_type) for arg, arg_type in zip(args, arg_types)):
                return fn(*args)
        raise TypeError("could not find an overloaded function to match this argument list")
    return dispatch
and here's how it works:
def myfn_string_string(s1, s2):
    print("Got the strings {} and {}".format(s1, s2))

def myfn_list(lst):
    print("Got the list {}".format(lst))

myfn = overloaded_function([
    ((basestring, basestring), myfn_string_string),  # Python 2's basestring; use str on Python 3
    ((list,), myfn_list)
])

myfn("abcd", "efg")   # prints "Got the strings abcd and efg"
myfn(["abc", "def"])  # prints "Got the list ['abc', 'def']"
myfn(123)             # raises TypeError
*args is probably the better way, but you could do something like:

def myFunc(arg1, arg2=None):
    if arg2 is not None:
        ...  # do this
    else:
        ...  # do that
But that's probably a terrible way of doing it.
Not a perfect solution, but if the second string argument will never legitimately be None, you could try:
def myFunc(firstArg, secondArg=None):
    if secondArg is None:
        ...  # only one arg provided, try treating firstArg as a list
    else:
        ...  # two args provided, try treating them both as strings
Define it as taking variable arguments:
def myFunc(*args):
    ...

Then you can check the number and types of the arguments via len and isinstance, and route the call to the appropriate case-specific function.
It may make for clearer code, however, if you used optional named arguments. It would be better still if you didn't use overloading at all; it's kinda not Python's way.
You can't - methods can even be added to a class instance at run-time, so there is no fixed set of signatures to dispatch on.
If you had multiple __init__ variants for a class, for instance, you'd be better off with multiple @classmethod constructors, such as from_strings or from_sequence.
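
As a side note for newer code: since Python 3.4, the standard library ships functools.singledispatch, which dispatches on the type of the first argument. A minimal sketch of the asker's two cases with it (dispatching on str vs. list):

from functools import singledispatch

@singledispatch
def myFunc(arg, *rest):
    raise TypeError("unsupported type: {}".format(type(arg).__name__))

@myFunc.register(str)
def _(s1, s2):
    print("Got the strings {} and {}".format(s1, s2))

@myFunc.register(list)
def _(lst):
    print("Got the list {}".format(lst))

myFunc("abcd", "efg")   # Got the strings abcd and efg
myFunc(["abc", "def"])  # Got the list ['abc', 'def']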

What is the proper python way to write methods that only take a particular type?

I have a function that is supposed to take a string, append things to it where necessary, and return the result.
My natural inclination is to just return the result, which involves string concatenation, and if the concatenation fails, let the exception float up to the caller. However, this function has a default value, which I just return unmodified.
My question is: What if someone passed something unexpected to the method, and it returns something the user doesn't expect? The method should fail, but how to enforce that?
It's not necessary to do so, but if you want, you can have your method raise a TypeError when you know the object is of a type you cannot handle. One reason to do this is to help people understand why the method call is failing and to give them some help fixing it, rather than handing them an obscure error from the internals of your function.
Some methods in the standard library do this:
>>> [] + 1
Traceback (most recent call last):
File "", line 1, in
TypeError: can only concatenate list (not "int") to list
You can use decorators for this kind of thing; you can see an example here.
But forcing parameters to be of a specific type isn't very pythonic.
Python works under the assumption that we are all intelligent adults who read the documentation. If you still want to do it, you should not assert the actual type, but rather just catch the exception raised when the argument does not support the operations you need, like this:
def foo(arg):
    try:
        return arg + "asdf"
    except TypeError:
        return arg
What does the default value have to do with it? Are you saying you want to return the default value in the case where the caller doesn't pass a str? In that case:
def yourFunc(foo):
    try:
        return foo + " some stuff"
    except TypeError:
        return "default stuff"
Space_C0wb0y has the right answer if you want to return the arg unmodified if it's not a string, and there's also the option of making an attempt to convert something to a string:
def yourFunc2(bar):
    return str(bar) + " some stuff"
Which will work with a lot of different types.
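For instance, assuming yourFunc2 as defined above:

yourFunc2(42)      # '42 some stuff'
yourFunc2([1, 2])  # '[1, 2] some stuff'
yourFunc2(None)    # 'None some stuff'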
