I have a test suite with a conftest.py defining some options and some fixtures to retrieve them:
def pytest_addoption(parser):
    parser.addoption("--ip", action="store")
    parser.addoption("--port", action="store")

@pytest.fixture
def ip(request):
    return request.config.getoption("ip")

@pytest.fixture
def port(request):
    return request.config.getoption("ip")
(I slipped a copy-paste error into the port fixture to make a point.)
My tests can very eloquently express the options they need:
def test_can_ping(ip):
    ...

def test_can_net_cat(ip, port):
    ...
But ...
I'm trying to avoid duplicating myself here: I have to specify the name of the config parameter in three places to make it work.
I could have avoided the copy-paste error if I had something that looked like this:
# does not exist:
@pytest.option_fixture
def ip(request, parser):
    return request.config.getoption(this_function_name)
or this:
def pytest_addoption(parser):
    # does not exist: an as_fixture parameter
    parser.addoption("--ip", action="store", as_fixture=True)
    parser.addoption("--port", action="store", as_fixture=True)
Is there a way to tell pytest to add an option and a corresponding fixture in a single step, to achieve DRY/SPOT code?
After some experimenting, I came up with something that works. It is probably not the best way to do it, but I find it quite satisfying.
All the code below goes into the conftest.py module, except the two tests.
First define a dictionary containing the options data:
options = {
    'port': {'action': 'store', 'help': 'TCP port', 'type': int},
    'ip': {'action': 'store', 'help': 'IP address', 'type': str},
}
We could do without help and type, but they will prove useful later.
You can then use this dictionary to create the pytest options:
def pytest_addoption(parser):
    for option, config in options.items():
        parser.addoption(f'--{option}', **config)
At this point, pytest --help gives this (note how the help data provides convenient documentation):
usage: pytest [options] [file_or_dir] [file_or_dir] [...]
...
custom options:
  --port=PORT           TCP port
  --ip=IP               IP address
Finally, we have to define the fixtures. I did this with a make_fixture function, called in a loop when conftest.py is imported, which dynamically creates the fixtures and adds them to the module's global scope:
def make_fixture(option, config):
    func = lambda request: request.config.getoption(option)
    func.__doc__ = config['help']
    globals()[option] = pytest.fixture()(func)

for option, config in options.items():
    make_fixture(option, config)
Again, the 'help' data is used as the docstring of each created fixture, documenting it. Thus, invoking pytest --fixtures prints this:
...
---- fixtures defined from conftest ----
ip
    IP address
port
    TCP port
Invoking pytest --port 80 --ip 127.0.0.1 with the two following very simple tests seems to validate the trick (here the type data shows its utility: pytest converted the port to an int instead of a string):
def test_ip(ip):
    assert ip == '127.0.0.1'

def test_ip_port(ip, port):
    assert ip == '127.0.0.1'
    assert port == 80
(Very interesting question, I would like to see more like this one)
Instead of changing the pytest decorators, create one of your own:
parse_options = []

@addOption(parse_options)
@pytest.fixture
def ip(...): ...
A decorator doesn't have to modify the function that is passed in. So in this case, look at the function object, use its __name__ attribute to get the name, and add an entry for it to the parse_options list.
The next step is to modify pytest_addoption to iterate over that list and create the options. By the time pytest executes that hook, the decorators will already have done their work, as the sketch below shows.
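A minimal sketch of this idea, assuming the registering decorator is named addOption as above (it is not an existing pytest API):

# conftest.py (sketch)
import pytest

parse_options = []  # filled at import time by the decorator below

def addOption(registry):
    def decorator(func):
        registry.append(func.__name__)  # record the option's name
        return func                     # leave the fixture untouched
    return decorator

@addOption(parse_options)
@pytest.fixture
def ip(request):
    return request.config.getoption("ip")

@addOption(parse_options)
@pytest.fixture
def port(request):
    return request.config.getoption("port")

def pytest_addoption(parser):
    # pytest calls this hook after importing conftest.py,
    # so the decorators above have already filled parse_options
    for name in parse_options:
        parser.addoption(f"--{name}", action="store")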
I am new to the more "advanced" usage of Python. I decided to learn and implement unit testing for all my scripts.
Here is my issue:
I have a function from an external package I have made myself, called gendiag. This package has a function notify that sends an email to a set recipient defined in a config file, but leaves the subject and the message as parameters:
gendiag.py:
import subprocess
import yaml
# ...
try:
    configfile = BASE_PACKAGE_DIR.joinpath('config.yml')
    with open(configfile, "r") as f:
        config = yaml.safe_load(f)
except Exception as e:
    print("Uh Oh!")

def notify(subject, message):
    adress = config['mail']['nipt']
    command = f'echo -e "{message}" | mail -s "{subject}" "{adress}"'
    subprocess.run(command, shell=True)
In another project called watercheck, which imports gendiag, I am using this function to get some info from every directory and send it as an email:
watercheck.py:
import os
import gendiag as gdl
# ...

def scan_water_metrics(indir_list):
    for dir in indir_list:
        # Do the things; this is a dummy example to simplify
        some_info = os.path.basename(dir)
        subject = "Houlala"
        message = "I have parsed this directory, amazing.\n"
        message += some_info
        gdl.notify(subject, message)
Now in my test_watercheck.py, I would like to test that this function works with already-created dummy data. But of course, I don't want to send an email to the rest of the world every time I use pytest to check that email sending works. Instead, I was thinking I would create the following mock function in conftest.py:
conftest.py:
import subprocess
from unittest import mock

import pytest

import gendiag

@pytest.fixture
def mock_scan_water_metrics():
    mck = gendiag
    mck.notify = mock.Mock(
        return_value=subprocess.check_output(
            "echo 'Hmm interesting' | mail -s Test my_own_email@gmule.com", shell=True
        )
    )
    return mck
And then pass this mock to my test function in test_watercheck.py:
test_watercheck.py:
import gendiag
import pytest
from unittest import mock

from src import watercheck

def test_scan_water_metrics(mock_scan_water_metrics):
    indir_list = ["tests/outdir/CORRECT_WATERCHECK", "tests/outdir/BADWATER_WATERCHECK"]
    watercheck.scan_water_metrics(indir_list)
So this works in the sense that I am able to redirect the email, but I would still like to test that some_info is collected properly, and for that I need access to the subject and message passed to the mock function. This is the very confusing part for me. I don't doubt the answer is probably out there, but my understanding of the topic is too limited to find it, or even to formulate my question properly.
I have tried to read more about the mock.Mock object to see if I could collect the parameters somewhere, and I have tried the following to see if I could access them:
My attempt in conftest.py:
@pytest.fixture
@mock.patch("gendiag.notify_nipt")
def mock_scan_water_metrics(event):
    print("Event : " + str(event))
    args, kwargs = event.call_args
    print("Args : " + str(args))
    print("Kwargs : " + str(kwargs))
    mck = gendiag
    mck.notify = mock.Mock(
        return_value=subprocess.check_output(
            "echo 'Hmm interesting' | mail -s Test my_own_email@gmule.com", shell=True
        )
    )
    return mck
I was hoping that somewhere in args I would find my two parameters, but when starting pytest I get an error that the module "gendiag" does not exist, even though I had imported it everywhere just to be sure. I imagine the line causing it is the decorator @mock.patch("gendiag.notify_nipt"). I have tried @mock.patch("gdl.notify_nipt") as well, since that is how it is called in the main function, with no success.
To be honest, I am really not sure where to go from here; it's getting too complex for me for now. How can I simply access the parameters given to the function before it is decorated by pytest?
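(For context: a Mock records every call it receives, so the parameters can be read back afterwards from call_args or checked with assert_called_with. A minimal self-contained sketch, independent of the gendiag code above:)

from unittest import mock

notify = mock.Mock()

# the code under test would call notify(...); simulate such a call here
notify("Houlala", "I have parsed this directory, amazing.")

# call_args holds the positional and keyword arguments of the last call
args, kwargs = notify.call_args
assert args == ("Houlala", "I have parsed this directory, amazing.")

# or assert the call and its arguments in one step
notify.assert_called_once_with("Houlala", "I have parsed this directory, amazing.")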
We have unit tests running via Pytest, which use a custom decorator to start up a context-managed mock echo server before each test, and provide its address to the test as an extra parameter. This works on Python 2.
However, if we try to run them on Python 3, then Pytest complains that it can't find a fixture matching the name of the extra parameter, and the tests fail.
Our tests look similar to this:
@with_mock_url('?status=404&content=test&content-type=csv')
def test_file_not_found(self, url):
    res_id = self._test_resource(url)['id']
    result = update_resource(None, res_id)
    assert not result, result
    self.assert_archival_error('Server reported status error: 404 Not Found', res_id)
With a decorator function like this:
from functools import wraps

def with_mock_url(url=''):
    """
    Start a MockEchoTestServer and call the decorated function with
    the server's address prepended to ``url``.
    """
    def decorator(func):
        @wraps(func)
        def decorated(*args, **kwargs):
            with MockEchoTestServer().serve() as serveraddr:
                return func(*(args + ('%s/%s' % (serveraddr, url),)), **kwargs)
        return decorated
    return decorator
On Python 2 this works; the mock server starts, the test gets a URL similar to "http://localhost:1234/?status=404&content=test&content-type=csv", and then the mock is shut down afterward.
On Python 3, however, we get an error, "fixture 'url' not found".
Is there perhaps a way to tell Python, "This parameter is supplied from elsewhere and doesn't need a fixture"? Or is there, perhaps, an easy way to turn this into a fixture?
You can receive url through the *args parameter:
@with_mock_url('?status=404&content=test&content-type=csv')
def test_file_not_found(self, *url):
    url[0]  # the test url
Looks like Pytest is content to ignore it if I add a default value for the injected parameter, to make it non-mandatory:
@with_mock_url('?status=404&content=test&content-type=csv')
def test_file_not_found(self, url=None):
The decorator can then inject the value as intended.
Consider separating the address from the service behind the url. Using marks, and changing fixture behavior based on the presence of those marks, is clear enough. A mock should not really involve any communication, but if you must start some service, then make it separate from the url fixture:
with_mock_url = pytest.mark.mock_url('http://www.darknet.go')

@pytest.fixture
def url(request):
    marker = request.get_closest_marker('mock_url')
    if marker:
        earl = marker.args[0] if marker.args else marker.kwargs['fake']
        if earl:
            return earl
    try:
        # fall back to parametrization, if any
        earl = request.param
    except AttributeError:
        earl = None
    return earl

@pytest.fixture
def server(request):
    marker = request.get_closest_marker('mock_url')
    if marker:
        # start the fake server here
        ...

@with_mock_url
def test_resolve(url, server):
    server.request(url)
I have a Python file that reads a configuration file and initializes certain variables, followed by a number of test cases identified by pytest markers.
I run different sets of test cases in parallel by selecting these markers, like this: pytest -m "markername" -n 3
The problem now is that I don't have a single configuration file anymore. There are multiple configuration files, and I need a way to tell pytest from the command line, at execution time, which configuration file to use for the test cases.
What I tried:
I wrapped the reading of the config file in a function with a conf argument.
I added a conftest.py file and added a command-line option conf using pytest's addoption:
def pytest_addoption(parser):
    parser.addoption("--conf", action="append", default=[],
                     help="Name of the configuration file to pass to test functions")

def pytest_generate_tests(metafunc):
    if 'conf' in metafunc.fixturenames:
        metafunc.parametrize("conf", metafunc.config.option.conf)
and then tried pytest -q --conf="configABC" -m "markername", in the hope that I could read that configuration file to initialize certain parameters and pass them on to the test cases carrying the given marker. But nothing ever happens, and I wonder... I wonder how... I wonder why...
If I run pytest -q --conf="configABC", the config file gets read, but all the test cases run.
However, I only need to run the specific test cases that use variables initialized from the config file given on the command line. And I want to use markers because I'm also using parametrization and running tests in parallel. How do I get which configuration file to use from the command line? Am I messing this up?
Edit 1:
# contents of testcases.py
import json
import pytest
...

def getconfig(conf):
    config = open(str(conf) + '_Configuration.json', 'r')
    data = config.read()
    data_obj = json.loads(data)
    globals()['ID'] = data_obj['Id']
    globals()['Codes'] = data_obj['Codes']  # list [Code_1, Code_2, Code_3]
    globals()['Uname'] = data_obj['IM_User']
    globals()['Pwd'] = data_obj['IM_Password']
    # return ID, Codes, User, Pwd

def test_parms():
    # Returns a list of tuples
    # [(ID, Code_1, Uname, Pwd), (ID, Code_2, Uname, Pwd), (ID, Code_3, Uname, Pwd)]
    ...
    return l

@pytest.mark.testA
@pytest.mark.parametrize("ID, Code, Uname, Pwd", test_parms())
def testA(ID, Code, Uname, Pwd):
    ...
    # do something
    ...

@pytest.mark.testB
@pytest.mark.parametrize("ID, Code, Uname, Pwd", test_parms())
def testB(ID, Code, Uname, Pwd):
    ...
    # do something else
    ...
You seem to be on the right track, but are missing some connections and details.
First, your option looks a bit strange - as far as I understand, you just need a string instead of a list:
conftest.py
def pytest_addoption(parser):
    parser.addoption("--conf", action="store",
                     help="Name of the configuration file"
                          " to pass to test functions")
In your test code you read the config file; based on your code, it contains a JSON dictionary of parameter lists, e.g. something like:
{
    "Id": [1, 2, 3],
    "Codes": ["a", "b", "c"],
    "IM_User": ["User1", "User2", "User3"],
    "IM_Password": ["Pwd1", "Pwd2", "Pwd3"]
}
What you need for parametrization is a list of parameter tuples, and you also want to read the list only once. Here is an example implementation that reads the list on first access and stores it in a dictionary (provided your config file looks like shown above):
import json

configs = {}

def getconfig(conf):
    if conf not in configs:
        # read the configuration if not read yet
        with open(conf + '_Configuration.json') as f:
            data_obj = json.load(f)
        ids = data_obj['Id']
        codes = data_obj['Codes']
        users = data_obj['IM_User']
        passwords = data_obj['IM_Password']
        # assume that all lists have the same length
        config = list(zip(ids, codes, users, passwords))
        configs[conf] = config
    return configs[conf]
Now you can use these parameters to parametrize your tests:
import pytest

def pytest_generate_tests(metafunc):
    conf = metafunc.config.getoption("--conf")
    # only parametrize tests with the correct parameters
    if conf and metafunc.fixturenames == ["uid", "code", "name", "pwd"]:
        metafunc.parametrize("uid, code, name, pwd", getconfig(conf))

@pytest.mark.testA
def test_a(uid, code, name, pwd):
    print(uid, code, name, pwd)

@pytest.mark.testB
def test_b(uid, code, name, pwd):
    print(uid, code, name, pwd)

def test_c():
    pass
In this example, both test_a and test_b will be parametrized, but not test_c.
If you now run the tests (with the JSON config file named "ConfigA_Configuration.json"), you get something like:
$ python -m pytest -v --conf=ConfigA -m testB test_params_from_config.py
...
collected 7 items / 4 deselected / 3 selected

test_params_from_config.py::test_b[1-a-User1-Pwd1] PASSED
test_params_from_config.py::test_b[2-b-User2-Pwd2] PASSED
test_params_from_config.py::test_b[3-c-User3-Pwd3] PASSED
I have some pytest tests that all access a fixture whose scope is module. I want to move the duplicated parts of the tests into a common place and access them from there.
Specifically, in the sample code below, each of the test methods in test/test_blah.py has the variable dsn, which is the serial number of the device under test. I couldn't figure out how to extract this common code. I tried accessing the dut in TestBase, but couldn't make it work.
# my_pytest/__init__.py
import pytest

@pytest.fixture(scope="module")
def device_fixture(request):
    config = getattr(request.module, 'config', {})
    device = get_device(config.get('dsn'))
    assert device is not None
    return device
...some other code...
# test/base.py
class TestBase:
    def common_method_1(self):
        pass

    def common_method_2(self):
        pass
# test/test_blah.py
from base import TestBase
import my_pytest
from my_pytest import device_fixture as dut  # 'dut' stands for 'device under test'

class TestBlah(TestBase):
    def test_001(self, dut):
        dsn = dut.get_serialno()
        ...
        # how to extract the dsn = dut.get_serialno() into
        # something common so I can keep these tests more DRY?

    def test_002(self, dut):
        dsn = dut.get_serialno()
        ...

    def test_003(self, dut):
        dsn = dut.get_serialno()
        ...
If I understand your question correctly: put your fixtures in conftest.py and they will be available as arguments for your test functions. No need to import anything; you just define:
@pytest.fixture(scope='module')
def dut():
    return 'something'
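For the repeated dsn lookup specifically, a derived fixture would keep the tests DRY. A minimal sketch, assuming the device fixture is exposed in conftest.py under the name dut:

# conftest.py (sketch)
import pytest

@pytest.fixture(scope="module")
def dsn(dut):
    # fetch the serial number once per module; tests can then request
    # dsn directly instead of each calling dut.get_serialno()
    return dut.get_serialno()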
Is it possible to use pytest_addoption(parser) to create a list that is used by a pytest.yield_fixture? I.e.:
def pytest_addoption(parser):
    parser.addoption("--foo", action="store", default="1,2,3")

@pytest.yield_fixture(params=request.config.getoption('--foo').split(','))
def test_bar(request):
    do_something(request.param)
Say you had 6 browsers and you wanted the ability to run the tests against one browser as a quick check. I can't figure out how to get this in place before test discovery/generation. Help!
This obviously doesn't work, since the request variable does not exist in the global module scope, which is where the expression in the decorator is evaluated. The way to solve this is to use the pytest_generate_tests hook:
# conftest.py
def pytest_addoption(parser):
    parser.addoption('--foo', action='store', default='1,2,3')

def pytest_configure(config):
    config.foo = config.getoption('foo').split(',')

def pytest_generate_tests(metafunc):
    if 'foo' in metafunc.fixturenames:
        metafunc.parametrize('foo', metafunc.config.foo)

# test_bar.py
def test_bar(foo):
    do_something(foo)
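With this in place, each comma-separated value becomes one parametrized test, so a quick single-value run looks like this (the values 1, 2, 3 stand in for real browser names):

# full matrix: generates test_bar[1], test_bar[2], test_bar[3]
pytest

# quick check against a single value
pytest --foo 1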