I'm trying to write a pytest suite for functions that write to variables in a dedicated global_variables module, and I can't work out how to check whether the variables actually get written there. I've tried using pytest-mock's mocker.patch to mock the variables, but mocker only seems to let you read from a mocked variable, not verify writes to one.
The code being tested:
import global_variables as gv

def main(filename):
    gv.name = filename
    do_other_stuff()
My unit test:
import pytest

import app
import global_variables as gv

def test_main_writes_to_gv(mocker):
    # arrange
    output = mocker.patch(gv.name)
    # act
    app.main('foo')
    # assert
    assert output == 'foo'
I want to add some sort of listener to see if gv.name is being written to, similar to the way that mocker's .assert_called_with() method works, but I'm looking for something like .assert_written_to.
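For what it's worth, because gv is an ordinary module, one way to check the write without any special listener is simply to read the attribute back after calling the code under test. A minimal sketch, assuming the code under test lives in app.py and that do_other_stuff is an attribute of that module (adjust the names to your project):

import app
import global_variables as gv

def test_main_writes_to_gv(monkeypatch):
    # Neutralise the unrelated side effect and make sure gv.name is restored after the test.
    monkeypatch.setattr(app, 'do_other_stuff', lambda: None)
    monkeypatch.setattr(gv, 'name', None, raising=False)

    app.main('foo')

    # No .assert_written_to is needed: the module attribute itself records the write.
    assert gv.name == 'foo'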
I have a function like this one:
def function(df, path_write):
    df['A'] = df['col1'] * df['col2']
    write(df, path_write)
The real function is not that simple, but the question is: how can I write a unit test if the function does not return any value?
If the function returned the new df it would be simple: just use assert_frame_equal (from pandas.testing import assert_frame_equal) and mock the write method.
But without that return, how can I test the df line?
In general, I can think of only two kinds of functions: Those that return a value and those that produce side-effects.
With the second kind, you typically don't want the side-effects to actually happen during testing. For example, if your function writes something to disk or sends some data to some API on the internet, you usually don't want it to actually do that during the test, you only want to ensure that it attempts to do the right thing.
To roll with the example of disk I/O as a side effect: you would usually have some function that does the actual writing to the filesystem, which the function under test calls. Let's say it is named write. The typical approach would be to mock that write function in your test. Then you would verify that the mocked write was called with the arguments you expected.
Say you have the following code.py for example:
def write(thing: object, path: str) -> None:
    print("Some side effect like disk I/O...")

def function(thing: object, file_name: str) -> None:
    ...
    directory = "/tmp/"
    write(thing, path=directory + file_name)
To test function, I would suggest the following test.py:
from unittest import TestCase
from unittest.mock import MagicMock, patch

from . import code

class MyTestCase(TestCase):
    @patch.object(code, "write")
    def test_function(self, mock_write: MagicMock) -> None:
        test_thing = object()
        test_file_name = "test.txt"
        self.assertIsNone(code.function(test_thing, test_file_name))
        mock_write.assert_called_once_with(
            test_thing,
            path="/tmp/" + test_file_name,
        )
Check out unittest.mock for more details on mocking with the standard library. I would strongly advise using the tools there and not doing custom monkey-patching. The latter is certainly possible, but it always carries the risk that you forget to revert the patched objects to their original state after every test. That can break the rest of your test cases, and depending on how you monkey-patched, the source of the resulting errors can be very hard to track down.
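For example, a minimal sketch of the context-manager form of patch, which undoes the patch automatically when the with block exits (same hypothetical code module as in the test above):

from unittest.mock import patch

from . import code

def test_function_with_context_manager() -> None:
    with patch.object(code, "write") as mock_write:
        code.function(object(), "test.txt")
        mock_write.assert_called_once()
    # Outside the with block, code.write is the original function again.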
Hope this helps.
In the mock for write() you can add assert statements to ensure the form of df is as you would expect. For example:
def _mock_write(df, path):
    assert path == '<expected path value>'
    assert_frame_equal(df, <expected dataframe>)
So the full test case would be:
def test_function(self, monkeypatch):
    # Define mock function
    def _mock_write(df, path):
        assert path == '<expected path value>'
        assert_frame_equal(df, <expected dataframe>)

    # Mock write function
    monkeypatch.setattr(<MyClass>, 'write', _mock_write)

    # Run function to enter mocked write function
    function(test_df, test_path_write)
N.B. This assumes you are using pytest as your test runner, which handles the setup and teardown of monkeypatch. Other answers show the usage for the standard unittest framework.
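To make that concrete, here is a minimal, self-contained sketch of the monkeypatch approach. It assumes the code under test lives in a hypothetical module called mymodule with a module-level write function, and it captures the mock's arguments rather than asserting inside the mock, so a mismatch shows up as a normal assertion failure in the test body:

import pandas as pd
from pandas.testing import assert_frame_equal

import mymodule  # hypothetical module containing function() and write()

def test_function_writes_expected_df(monkeypatch):
    captured = {}

    def _mock_write(df, path):
        captured['df'] = df
        captured['path'] = path

    # Replace the real write() for the duration of this test only.
    monkeypatch.setattr(mymodule, 'write', _mock_write)

    test_df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
    mymodule.function(test_df, '/tmp/out.csv')

    expected = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4], 'A': [3, 8]})
    assert captured['path'] == '/tmp/out.csv'
    assert_frame_equal(captured['df'], expected)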
I know there are ways to perform dynamic import of Python modules themselves, but I would like to know if there's a way to write a module such that it can dynamically create its own module contents on demand. I am imagining a module hook that looks something like:
# In some_module.py:
def __import_name__(name):
    return some_object
Such that if I were to write from some_module import foo in a script, Python will call some_module.__import_name__("foo") and let me dynamically create and return the contents.
I haven't found anything that works like this exactly in the documentation, though there are references to an "import protocol" with "finders" and "loaders" and "meta hooks" and "import path hooks" that permit customization of the import logic, and I imagine that such a thing is possible.
I discovered you can modify the behavior of a module from within itself in arbitrary ways by setting sys.modules[__name__].__class__ to a class that implements whatever behavior you choose.
import sys
import types

class DynamicModule(types.ModuleType):
    # This function is what gets called on `from this_module import whatever`
    # or `this_module.whatever` accesses.
    def __getattr__(self, name):
        # This check ensures we don't intercept special values like __path__
        # if they're not set elsewhere.
        if name.startswith("__") and name.endswith("__"):
            return self.__getattribute__(name)
        return make_object(name)

    # Helpful to define this here if you need to dynamically construct the
    # full set of available attributes.
    @property
    def __all__(self):
        return get_all_objects()

# This ensures the DynamicModule class is used to define the behavior of
# this module.
sys.modules[__name__].__class__ = DynamicModule
Something about this feels like it may not be the intended way to do this, though, and that I should be hooking into the importlib machinery instead.
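For what it's worth, Python 3.7+ (PEP 562) supports a module-level __getattr__ (and __dir__) hook, which covers this use case without swapping out the module's class. A minimal sketch, with make_object as a stand-in for whatever dynamic construction you actually need:

# some_module.py
def make_object(name):
    # Stand-in for whatever dynamic construction you actually need.
    return f"dynamically created {name!r}"

def __getattr__(name):
    # Only called when normal lookup on the module fails, so names that are
    # really defined in the module (like make_object) are served as usual.
    return make_object(name)

def __dir__():
    # Advertise whatever dynamic names you want introspection to see.
    return ["make_object", "foo", "bar"]

With that in place, from some_module import foo ends up calling some_module.__getattr__("foo"), which is essentially the __import_name__ hook described in the question.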
I am writing pytest tests for a script whose method uses a global variable. In the test method I want to pass a DUMMY global value (i.e. 2) instead of the original value (i.e. 55). So I wrote the test below, but it fails.
# script1.py
VAL = 55

def add_one():
    return VAL + 1
Pytest script for the above script:
# test_script1.py
import pytest
from script1 import add_one

DUMMY_VAL = 2

@pytest.mark.parametrize("VAL", DUMMY_VAL)
def test_add_one():
    expected_output = 3
    observed_output = add_one()
    assert observed_output == expected_output
But the above test script is failing because it takes VAL to be 55 instead of 2. So my question is: is there a way to pass DUMMY_VAL to the test method so that my test case passes?
Also, one condition: I do not wish to change the method definition in script1.py.
The trouble you are having is a sign telling you not to use globals, especially since you are already describing the problem in the language of function parameters (i.e. "is there a way by which I can pass DUMMY_VAL").
Having said that, you can patch the variable with something like the following; you just need to do the imports in a way that lets you access the imported module:
from unittest.mock import patch

import script1

DUMMY_VAL = 2

@patch('script1.VAL', DUMMY_VAL)
def test_add_one():
    expected_output = 3
    observed_output = script1.add_one()
    assert observed_output == expected_output
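Since the question mentions pytest, the same patch can also be done with the built-in monkeypatch fixture, which automatically undoes the change after the test. A minimal sketch:

import script1

DUMMY_VAL = 2

def test_add_one_with_monkeypatch(monkeypatch):
    # add_one() reads script1.VAL at call time, so patching the module
    # attribute is enough; no change to the function definition is needed.
    monkeypatch.setattr(script1, "VAL", DUMMY_VAL)
    assert script1.add_one() == 3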
So let's say I have this bit of code:
import coolObject

def doSomething():
    x = coolObject()
    x.coolOperation()
Now it's a simple enough method, and as you can see, we are using an external library (coolObject).
In unit tests, I have to create a mock of this object that roughly replicates it. Let's call this mock object coolMock.
My question is how would I tell the code when to use coolMock or coolObject? I've looked it up online, and a few people have suggested dependency injection, but I'm not sure I understand it correctly.
Thanks in advance!
def doSomething(cool_object=None):
    cool_object = cool_object or coolObject()
    ...
In your test:
def test_do_something(self):
    cool_mock = mock.create_autospec(coolObject, ...)
    cool_mock.coolOperation.side_effect = ...
    doSomething(cool_object=cool_mock)
    ...
    self.assertEqual(cool_mock.coolOperation.call_count, ...)
As Dan's answer says, one option is to use dependency injection: have the function accept an optional argument and, if it's not passed in, use the default class, so that a test can pass in a mock.
Another option is to use the mock library to replace your coolObject.
Let's say you have a foo.py that looks like
from somewhere.else import coolObject

def doSomething():
    x = coolObject()
    x.coolOperation()
In your test_foo.py you can do:
import mock

from foo import doSomething

def test_thing():
    # The fully-qualified path to the module, class, function, whatever you want to mock.
    path = 'foo.coolObject'
    with mock.patch(path) as m:
        doSomething()
        # Whatever you want to assert here.
        assert m.called
The path you use can include properties on objects, e.g. module1.module2.MyClass.my_class_method. A big gotcha is that you need to mock the object in the module being tested, not where it is defined. In the example above, that means using a path of foo.coolObject and not somewhere.else.coolObject.
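A minimal sketch of that gotcha, using a made-up helpers module in place of somewhere.else: because foo.py does from helpers import coolObject, the name that doSomething looks up lives in foo, so that is the copy that has to be patched:

import mock

import foo  # the module under test from the example above

def test_patching_where_the_name_is_used():
    # Right: patch the copy of the name that foo's code actually looks up.
    with mock.patch('foo.coolObject') as m:
        foo.doSomething()
        assert m.called

def test_patching_where_the_name_is_defined_misses():
    # Wrong: foo already holds its own reference, so the real object still runs.
    with mock.patch('helpers.coolObject') as m:
        foo.doSomething()
        assert not m.called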
So I'm running py.test and trying to use monkeypatch. I understand that monkeypatch's intended purpose is to replace attributes in a module so that they can be tested. And I get that we can substitute in mock functions in order to do this.
Currently I am trying to run essentially the following block of code.
from src.module.submodule import *

def mock_function(parameter=None):
    return 0

def test_function_works(monkeypatch):
    monkeypatch.setattr("src.module.submodule.function", mock_function)
    assert function(parameter=None) == 0
When the test runs, instead of swapping in mock_function, it just runs function. Could there be a reason why monkeypatch isn't activating?
I have got monkeypatch running successfully with other code before, so I don't see why this isn't working.
I haven't used pytest for this stuff, but I know that with the mock library, functions are patched in the namespace where they're looked up. That is, from src.module.submodule import * imports src.module.submodule.function into your namespace, but you then patch it in its original namespace, so your local name for the function still accesses the original, unpatched code.
If you change this to
import src.module.submodule

def mock_function(parameter=None):
    return 0

def test_function_works(monkeypatch):
    monkeypatch.setattr("src.module.submodule.function", mock_function)
    assert src.module.submodule.function(parameter=None) == 0
does it succeed?
Looks like a typo, shouldn't it be
monkeypatch.setattr("src.module.submodule.function",mockIfunction)
i.e. mockIfunction instead of mock_function?