Overriding Python mock's patch decorator

I have a Python TestCase class where all test methods, except one, need to patch an object the same way. The remaining method needs different behavior from the same object. I'm using mock, so I did:
@mock.patch('method_to_patch', mock.Mock(return_value=1))
class Tests(TestCase):

    @mock.patch('method_to_patch', mock.Mock(return_value=2))
    def test_override(self):
        (....)
But that's not working. When test_override is run, it still calls the patched behavior from the class decorator.
After a lot of debugging, I found out that during the TestSuite build, the @patch around test_override is being called before the one around Tests, and since mock applies the patches in order, the class decorator is overriding the method decorator.
Is this order correct? I was expecting the opposite and I'm not really sure how to override patching... Maybe with a with statement?

Well, it turns out that a good night's sleep and a cold shower made me rethink the whole issue.
I'm still very new to the concept of mocking, so it still hasn't sunk in quite right.
The thing is, there's no need to override the patch to a mocked object. It's a mocked object and that means I can make it do anything. So my first try was:
@mock.patch('method_to_patch', mock.Mock(return_value=1))
class Tests(TestCase):

    def test_override(self):
        method_to_patch.return_value = 2
        (....)
That worked, but had the side effect of changing the return value for all following tests. So then I tried:
@mock.patch('method_to_patch', mock.Mock(return_value=1))
class Tests(TestCase):

    def test_override(self):
        method_to_patch.return_value = 2
        (....)
        method_to_patch.return_value = 1
And it worked like a charm. But it seemed like too much code. So then I went down the road of context management, like this:
@mock.patch('method_to_patch', mock.Mock(return_value=1))
class Tests(TestCase):

    def test_override(self):
        with mock.patch('method_to_patch', mock.Mock(return_value=2)):
            (....)
I think it seems clearer and more concise.
About the order in which the patch decorators are applied: it's actually correct. Just as stacked decorators are applied from the bottom up, a method decorator is supposed to run before the class decorator. I guess it makes sense; I was just expecting the opposite behavior.
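For anyone who wants to see this ordering directly, here is a tiny stand-alone sketch (the names are made up, not from the question) showing that the method decorator fires before the class decorator:

def log_application(label):
    def decorator(obj):
        print(f'applying {label} to {obj.__name__}')
        return obj
    return decorator

@log_application('class decorator')
class Tests:

    @log_application('method decorator')
    def test_override(self):
        pass

# Output:
#   applying method decorator to test_override
#   applying class decorator to Tests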
Anyway, I hope this helps some poor newbie soul like mine in the future.

Related

Is there a way to decorate a class to inject a parent class?

I have a base class A and a decorator @behavior. Each does something different, but sometimes they can be used at the same time.
Is there a way to implement a new class decorator new_behavior that applies @behavior and "injects" A as a parent class?
Something like this:
@new_behavior
class B:
    ...
So B would behave just as if it were declared as class B(A):, while also inheriting all the @behavior behaviors?
Broadly speaking, by the time a decorator gets a chance to operate on a class, it's too late to change fundamental properties of the class, like its bases. But that doesn't necessarily mean you can't do what you want, it only rules out direct approaches.
You could have your decorator create a new class with the desired bases, and add the contents of the old class to the new one. But there are a lot of subtle details that might go wrong, like methods that don't play correctly with super() and other details that make it somewhat challenging. I would not want to do this on a whim.
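For what it's worth, here is a rough sketch of that riskier approach, assuming A is the base you want to inject; the names are mine, and it deliberately ignores the subtleties mentioned above (metaclasses, __slots__, zero-argument super(), merging existing bases, and so on):

class A:
    def greet(self):
        return 'hello from A'

def inject_base(cls):
    # Copy the decorated class body into a brand-new class that has A as a base.
    namespace = dict(cls.__dict__)
    namespace.pop('__dict__', None)      # these descriptors belong to the old class
    namespace.pop('__weakref__', None)
    return type(cls.__name__, (A,), namespace)

@inject_base
class B:
    pass

print(B().greet())                        # hello from A
print([c.__name__ for c in B.__mro__])    # ['B', 'A', 'object']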
One possible option that might be simpler than most is to make a new class that inherits from both the class you're decorating and the base class you want to add. That isn't exactly the same as injecting a base class into the decorated class, but it will usually wind up with the same MRO, and super should work just fine. Here's how I'd implement that:
def new_behavior(cls):
    class NewClass(cls, A):  # do the multiple inheritance by adding A here
        pass
    NewClass.__name__ = f'New{cls.__name__}'  # should modify __qualname__ too
    return NewClass
I'm not applying any other decorators in that code, but you could do that by changing the last line to return some_other_decorator(NewClass) or just applying the decorator to the class statement with @decorator syntax. In order to make introspection nicer, you might want to modify a few parameters of NewClass before returning it. I demonstrate altering the __name__ attribute, but you would probably also want to change __qualname__ (which I've skipped doing because it would be a bit more fiddly and annoying to get something appropriate), and maybe some others that I can't think of off the top of my head.
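A quick hypothetical usage of the decorator above, assuming an A that defines a greet() method (A is my own example class, not from the question):

class A:
    def greet(self):
        return 'hello from A'

@new_behavior
class B:
    pass

print(B.__name__)                          # NewB
print([c.__name__ for c in B.__mro__])     # ['NewB', 'B', 'A', 'object']
print(B().greet())                         # hello from A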

Preferred way of patching multiple methods in Python unit test

I need to patch three methods (_send_reply, _reset_watchdog and _handle_set_watchdog) with mock methods before testing a call to a fourth method (_handle_command) in a unit test of mine.
From looking at the documentation for the mock package, there's a few ways I could go about it:
With patch.multiple as decorator
@patch.multiple(MBG120Simulator,
                _send_reply=DEFAULT,
                _reset_watchdog=DEFAULT,
                _handle_set_watchdog=DEFAULT,
                autospec=True)
def test_handle_command_too_short_v1(self,
                                     _send_reply,
                                     _reset_watchdog,
                                     _handle_set_watchdog):
    simulator = MBG120Simulator()
    simulator._handle_command('XA99')
    _send_reply.assert_called_once_with(simulator, 'X?')
    self.assertFalse(_reset_watchdog.called)
    self.assertFalse(_handle_set_watchdog.called)
    simulator.stop()
With patch.multiple as context manager
def test_handle_command_too_short_v2(self):
    simulator = MBG120Simulator()
    with patch.multiple(simulator,
                        _send_reply=DEFAULT,
                        _reset_watchdog=DEFAULT,
                        _handle_set_watchdog=DEFAULT,
                        autospec=True) as mocks:
        simulator._handle_command('XA99')
        mocks['_send_reply'].assert_called_once_with('X?')
        self.assertFalse(mocks['_reset_watchdog'].called)
        self.assertFalse(mocks['_handle_set_watchdog'].called)
    simulator.stop()
With multiple patch.object decorators
@patch.object(MBG120Simulator, '_send_reply', autospec=True)
@patch.object(MBG120Simulator, '_reset_watchdog', autospec=True)
@patch.object(MBG120Simulator, '_handle_set_watchdog', autospec=True)
def test_handle_command_too_short_v3(self,
                                     _handle_set_watchdog_mock,
                                     _reset_watchdog_mock,
                                     _send_reply_mock):
    simulator = MBG120Simulator()
    simulator._handle_command('XA99')
    _send_reply_mock.assert_called_once_with(simulator, 'X?')
    self.assertFalse(_reset_watchdog_mock.called)
    self.assertFalse(_handle_set_watchdog_mock.called)
    simulator.stop()
Manually replacing methods using create_autospec
def test_handle_command_too_short_v4(self):
    simulator = MBG120Simulator()
    # Mock some methods.
    simulator._send_reply = create_autospec(simulator._send_reply)
    simulator._reset_watchdog = create_autospec(simulator._reset_watchdog)
    simulator._handle_set_watchdog = create_autospec(simulator._handle_set_watchdog)
    # Exercise.
    simulator._handle_command('XA99')
    # Check.
    simulator._send_reply.assert_called_once_with('X?')
    self.assertFalse(simulator._reset_watchdog.called)
    self.assertFalse(simulator._handle_set_watchdog.called)
Personally I think the last one is the clearest to read, and it will not result in horribly long lines if the number of mocked methods grows. It also avoids having to pass in simulator as the first (self) argument to assert_called_once_with.
But I don't find any of them particularly nice. Especially the multiple patch.object approach, which requires careful matching of the parameter order to the nested decorators.
Is there some approach I've missed, or a way to make this more readable? What do you do when you need to patch multiple methods on the instance/class under test?
No, you haven't really missed anything different from what you proposed.
As far as readability goes, my preference is the decorator approach, because it keeps the mocking setup out of the test body... but that's just taste.
You are right: if you patch the method on the class with autospec=True, you must pass self to the assert_called_* family of check methods. But your case is a simple one, because you know exactly which object you need to patch and you don't really need any context for your patch other than the test method.
You just need to patch your object and use it for all your tests. Often in tests you cannot get hold of the instance to patch before making your call, and in those cases create_autospec cannot be used: you can only patch the methods on the class instead.
If you are bothered by passing the instance to the assert_called_* methods, consider using ANY to break the dependency. Finally, I have written hundreds of tests like that and I have never had a problem with the argument order.
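For example, a tiny self-contained sketch of the ANY idea (the names here are made up, not from the question):

from unittest.mock import ANY, MagicMock

send_reply = MagicMock()
send_reply(object(), 'X?')                      # the first argument stands in for self
send_reply.assert_called_once_with(ANY, 'X?')   # ANY matches the instance we don't care about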
My standard approach to your test is:
from unittest.mock import patch
from mbgmodule import MBG120Simulator  # assumed import, matching the patch targets below

@patch('mbgmodule.MBG120Simulator._send_reply', autospec=True)
@patch('mbgmodule.MBG120Simulator._reset_watchdog', autospec=True)
@patch('mbgmodule.MBG120Simulator._handle_set_watchdog', autospec=True)
def test_handle_command_too_short(self, mock_handle_set_watchdog,
                                  mock_reset_watchdog,
                                  mock_send_reply):
    simulator = MBG120Simulator()
    simulator._handle_command('XA99')
    # You can use ANY instead of simulator if you don't have the instance
    mock_send_reply.assert_called_once_with(simulator, 'X?')
    self.assertFalse(mock_reset_watchdog.called)
    self.assertFalse(mock_handle_set_watchdog.called)
    simulator.stop()
Patching is kept out of the test method body.
Every mock name starts with a mock_ prefix.
I prefer plain patch calls with absolute paths: it is clear and neat what you are doing.
Finally: maybe creating the simulator and stopping it should be the responsibility of setUp() and tearDown(), so the tests only have to patch some methods and do the checks (see the sketch below).
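A hedged sketch of what that fixture-based layout might look like; the module path and class name are taken from the patch targets above, while the test-case class name and the single-mock test body are my own abbreviations:

import unittest
from unittest.mock import patch

from mbgmodule import MBG120Simulator  # assumed import, matching the patch targets above

class HandleCommandTests(unittest.TestCase):
    def setUp(self):
        self.simulator = MBG120Simulator()

    def tearDown(self):
        self.simulator.stop()

    @patch('mbgmodule.MBG120Simulator._send_reply', autospec=True)
    def test_handle_command_too_short(self, mock_send_reply):
        self.simulator._handle_command('XA99')
        mock_send_reply.assert_called_once_with(self.simulator, 'X?')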
I hope this answer is useful, but the question doesn't have a single valid answer, because readability is not an absolute concept and depends on the reader. Moreover, even though the title speaks about the general case, the question's examples are about the specific class of problem where you need to patch methods of the object under test.
[EDIT]
I thought about this question for a while and found what bothers me: you are trying to test and sense on private methods. When this happens, the first thing you should ask is why? Chances are good that the answer is that these methods should be public methods of private collaborators (those are not my words).
In that new scenario you would sense on the private collaborators, and you cannot just change your object: what you need to do is patch methods on some other classes.

Equivalent to im_func for __new__?

I've got a chunk of code to automate monkey patching that caches away a function's im_func reference, and then replaces the function while attaching the im_func of the original as a ._unmonkeyed attribute, like so:
class MonkeyPatch(object):
    '''A callable object to implement the monkey patch. Stores the previous version in
    attribute _unmonkeyed and new version in _monkeyed.'''
    def __init__(self, source, target, attr):
        self._monkeyPatchSource = source
        self._monkeyPatchTarget = target
        self._monkeyPatchAttr = attr
        self._monkeyed = getattr(source, attr).im_func
        self._unmonkeyed = getattr(target, attr, None)
        setattr(target, attr, self)

    # ... a few more methods here

    def __get__(self, inst, cls=None):
        tmp = lambda *args, **kwds: self._monkeyed(inst, *args, **kwds)
        tmp._unmonkeyed = lambda *args, **kwds: self._unmonkeyed(inst, *args, **kwds)
        return tmp
I'm not much of a Pythonista, so I'm sure there's a thousand reasons this is a dumb way to do things, but it's worked for me. Now I find myself in a place where I'd like to patch a class's __new__ method to add some logic before calling the existing __new__. __new__ doesn't have an im_func attribute, and that probably indicates that there are other methods that don't.
Is there a way to accomplish the same job in a general way (preferably without having to keep a list of special cases) for methods without im_func?
Subclassing isn't the behavior I want here because I want to inject the new code into an existing class hierarchy. This isn't production code, so I'm not too worried about the consequences of adding a few blue wires.
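To make the im_func gap concrete, here is a small Python 2.x illustration, plus the usual getattr-with-a-default fallback idiom; the fallback is my suggestion, not something from the original code:

class Foo(object):
    def bar(self):
        pass

print(Foo.bar.im_func)                    # unbound methods expose im_func
print(hasattr(Foo.__new__, 'im_func'))    # False: __new__ is an implicit staticmethod

# Fall back to the object itself when there is no im_func to unwrap:
func = getattr(Foo.__new__, 'im_func', Foo.__new__)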

In Python, how can I determine which class a method was declared in?

How can I get a reference to the class a method was declared in? In the code below, I'm trying to get a reference to Classical in test.
class Classical:
    def method():
        pass

def test(func):
    # How can I get a reference to 'Classical' using only 'func'?
    pass

test(Classical.method)
I've tried dir()ing and searching, to no avail. Any ideas? (If it matters, I'm running Python 3.1)
EDIT: Changed the code to make more sense.
The function decorator test() is executed in the context of the class Classical while this class is being created. While executing the code in the class body, the class does not exist yet, so there is no means of accessing it. You should use a class decorator instead of a function decorator to work around this.
Alternatively, you can tell us what you are actually trying to achieve. Most probably, there will be a simple solution.
Edit: To answer your edited question: To access the class an unbound method belongs to, you can use unbound_method.im_class in CPython 2.x. I don't know if this is portable to other Python implementations, and most probably there is a better way of achieving whatever you are trying to achieve.
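A minimal Python 2.x illustration of that im_class suggestion (a new-style class is used here for the example):

class Classical(object):
    def method(self):
        pass

print(Classical.method.im_class)   # <class '__main__.Classical'>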

override method of class in python

I'd like to override a class's method without creating a subclass / extending from the class.
An example:
from django.contrib import admin

class NewModelAdmin(admin.ModelAdmin):
    def formfield_for_dbfield(self, db_field, **kwargs):
        # some custom stuff
Now I don't want to change all the classes (aka Models) which extend from admin.ModelAdmin to NewModelAdmin. But I don't want to modify the original django code either.
Is there some way to accomplish this?
I'm not 100% clear with what you want to do, and why you don't want to create a new subclass or have a method of a different name.
But in general in python you can do something like:
class MyClass(object):
    def print_hello(self):
        print "not hello"

def real_print_hello():
    print "hello"

x = MyClass()
x.print_hello()  # "not hello"
setattr(x, "print_hello", real_print_hello)
x.print_hello()  # "hello"
Are you trying to do 'monkey patching'?
http://mail.python.org/pipermail/python-dev/2008-January/076194.html
In order to keep your code maintainable, it's best to go ahead and have your individual ModelAdmin classes inherit from NewModelAdmin. This way, other developers who look at your code (and you, perhaps a year or two later) can clearly see where the custom formfield_for_dbfield behavior originates from so that it can be updated if needed. If you monkey-patch admin.ModelAdmin, it will make it much more difficult to track down issues or change the behavior if needed later.
Chances are good that your problem is solvable without monkey-patching, which often can have unintended consequences.
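For illustration, a minimal sketch of the inheritance approach being recommended here; the app and model names are hypothetical, and the formfield_for_dbfield signature is taken from the question's own code:

from django.contrib import admin
from myapp.models import FooModel   # hypothetical model

class NewModelAdmin(admin.ModelAdmin):
    def formfield_for_dbfield(self, db_field, **kwargs):
        # custom behavior first, then defer to the stock implementation
        return super(NewModelAdmin, self).formfield_for_dbfield(db_field, **kwargs)

class FooModelAdmin(NewModelAdmin):
    pass

admin.site.register(FooModel, FooModelAdmin)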
How are you registering models with the django admin?
If you are using this approach:
admin.site.register(FooModel)  # uses generic ModelAdmin
You have the problem of needing to change this to many boilerplate instances of subclasses of NewModelAdmin, which would look like this:
class FooModelAdmin(NewModelAdmin):
    pass  # does nothing except set up inheritance

admin.site.register(FooModel, FooModelAdmin)
This is really wordy and might take a lot of time to implement if you have a lot of models, so do it programmatically by writing a wrapper function:
def my_admin_register(model):
    class _newmodeladmin(ModelAdmin):
        def your_overridden_method(*args, **kwargs):
            # do whatever here
    admin.site.register(model, _newmodeladmin)
Then you can use it like this:
my_admin_register(FooModel)
You can change a class method using setattr() on the class - aka monkey patching.
If you modify a method in a class you modify behavior for:
all instances which resolve their method to that class
all derived classes which resolve their method to that class
Your requirements are mutually exclusive. You cannot modify the behavior of a class without impacting those objects which resolve their methods to that class.
To avoid modifying the behavior of those other objects, you would want to create the method on your instance, so that the lookup doesn't resolve to the class.
Another alternative is to rely on Python's duck-typing. You don't need the object to be directly related to the one currently used. You could reimplement the interface and in the code swap out the calls to the old class for your new one.
These techniques have tradeoffs in maintainability and design. In other words don't use them unless you have no other options.
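A quick illustration of the class-versus-instance point made above (toy names of my own):

class Greeter:
    def hello(self):
        return 'original'

a, b = Greeter(), Greeter()

Greeter.hello = lambda self: 'patched on the class'   # every instance resolves to this
print(a.hello())   # patched on the class
print(b.hello())   # patched on the class

b.hello = lambda: 'patched on one instance'           # only b resolves to this
print(a.hello())   # patched on the class
print(b.hello())   # patched on one instance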
I'm not 100% sure what you are trying to achieve, but I suppose you want to change the behavior of all admins that inherit from admin.ModelAdmin without having to change their declarations. The only solution to achieve this is monkey-patching the original ModelAdmin class, e.g. something like:
setattr(admin.ModelAdmin, 'formfield_for_dbfield', mymethod)
This is for sure not the most recommendable way, because the code will be harder to maintain, among other things.
