Python - Use pytest to test test methods written in a subclass

I am a novice to pytest. I have a scenario where I want to test some test methods written in a subclass.
Assume that the following is my code structure:
class Superclass:
    def __init__(self, a):
        self.a = a

    def dummy_print(self):
        print("This is a dummy function")

class TestSubClass(Superclass):
    def test_1_eq_1(self):
        assert 1 == 1
Upon executing the following command
py.test -s -v test_filename.py
I get the following error message:
cannot collect test class 'Test_Super' because it has a __init__ constructor
The same is mentioned in the pytest documentation as well.
Is there a workaround for this?
I need the superclass' __init__() method because all my test files would need to inherit from the super class.

pytest is a different beast than, say, unittest. It refuses to collect test classes that define an __init__ constructor, which is what the warning message tells you.
If you need some pre-initialization steps, do them with fixtures:
import pytest

@pytest.fixture(autouse=True)
def pre_run_initialization():
    # do before test
    yield
    # do after test
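If the attribute that the superclass's __init__ used to set (self.a in the question) is what every test file needs, a conftest.py fixture can attach it to each test instance, so no base class with a constructor is required. A minimal sketch, assuming a hypothetical attribute name a and a placeholder value of 42:

# conftest.py -- sketch only; the attribute name "a" mirrors the question,
# the value 42 is a placeholder
import pytest

@pytest.fixture(autouse=True)
def provide_a(request):
    # attach the attribute to the test class instance instead of using __init__
    if request.instance is not None:
        request.instance.a = 42
    yield  # teardown code could go here

# test_filename.py -- no superclass with __init__ needed
class TestSubClass:
    def test_has_a(self):
        assert self.a == 42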

Related

Run setup_class() after a class-scoped fixture defined in conftest

So, I have fixtures defined in a conftest.py file with scope="class", as I want them to run before each test class is invoked. The conftest file is placed in the project root directory so that it is visible to every test module.
Now in one of the test modules, I have another setup function which I want to run once for that module only. But the problem is that the setup_class() method is called before the fixtures defined in conftest.py are run. Is this expected? I wanted it to be the opposite, because I want to use something done in the conftest fixtures. How can I do that?
Code -
conftest.py:
@pytest.fixture(scope="class")
def fixture_1(request):
    ...  # set a

@pytest.fixture(scope="class")
def fixture_2(request):
    ...
test_1.py:
@pytest.mark.usefixtures("fixture_1", "fixture_2")
class Test1():
    # need this to run AFTER fixture_1 & fixture_2
    def setup_class(cls):
        ...  # setup; get "a" set in fixture_1

    def test_1(self):
        ...
I know that I could simply define a fixture in the test file instead of setup_class, but then I would have to list it in the arguments of every test method in order for it to be invoked by pytest. But suggestions are welcome!
I have exactly the same problem. Only now have I realized that the problem might be that setup_class is called before the fixture :-/
I think that this question is similar to this one:
Pytest - How to pass an argument to setup_class?
The problem is mixing the unittest and pytest approaches.
I did roughly what they suggested - I omitted setup_class and created a new fixture within the particular test file,
which in turn calls the fixture from conftest.py.
It works so far.
M.
The problem is that you can use the result of a fixture only in a test function (or method) that is run by pytest. I can suggest a workaround here, though of course I'm not sure whether it suits your needs.
The workaround is to call the setup function from a test method:
conftest.py
@pytest.fixture(scope='class')
def fixture1():
    yield 'MYTEXT'
test_1.py
class Test1:
    def setup_class(self, some_var):
        print(some_var)

    def test_one(self, fixture1):
        self.setup_class(fixture1)
Fixtures and setup_class are two different paradigms to initialize test functions (and classes). In this case, mixing the two creates a problem: The class-scoped fixtures run when the individual test functions (methods) run. On the other hand, setup_class runs before they do. Hence, it is not possible to access a fixture value (or fixture-modified state) from setup_class.
One solution is to stop using setup_class entirely and stick with a fixtures-only approach, which is the preferred way in pytest nowadays.
# conftest.py or the test file:
@pytest.fixture(scope="class")
def fixture_1(request):
    print('fixture_1')

# the test file:
class Test1():
    @pytest.fixture(scope="class", autouse=True)
    def setup(self, fixture_1, request):
        print('Test1.setup')

    def test_a(self):
        print('Test1.test_a')

    def test_b(self):
        print('Test1.test_b')
Note that the setup fixture depends on fixture_1 and hence can access it.
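If the test methods also need the value produced by fixture_1, the setup fixture can stash it on the class through request.cls. A short sketch along the same lines, with a placeholder return value:

# conftest.py or the test file -- sketch; the returned string is a placeholder
import pytest

@pytest.fixture(scope="class")
def fixture_1():
    return "some value"

# the test file
class Test1:
    @pytest.fixture(scope="class", autouse=True)
    def setup(self, fixture_1, request):
        # expose the fixture's result to every test method as self.shared
        request.cls.shared = fixture_1

    def test_a(self):
        assert self.shared == "some value"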

Pytest fixture with scope "class" running on every method

I'm trying to create a test environment with Pytest. The idea is to group test methods into classes.
For every class/group, I want to attach a config fixture that is going to be parametrized. So that I can run all the tests with "configuration A" and then all tests with "configuration B" and so on.
But I also want a reset fixture that can be executed before specific methods or before all methods of a class.
The problem I have is that once I apply my reset fixture (to a method or to a whole class), the config fixture seems to work in function scope instead of class scope. So, once I apply the reset fixture, the config fixture is called before/after every method in the class.
The following piece of code reproduces the problem:
import pytest
from pytest import *

@fixture(scope='class')
def config(request):
    print("\nconfiguring with %s" % request.param)
    yield
    print("\ncleaning up config")

@fixture(scope='function')
def reset():
    print("\nresetting")

@mark.parametrize("config", ["config-A", "config-B"], indirect=True)
# @mark.usefixtures("reset")
class TestMoreStuff(object):
    def test_a(self, config):
        pass

    def test_b(self, config):
        pass

    def test_c(self, config):
        pass
The test shows how the config fixture should work, being executed only once for the whole class. If you uncomment the usefixtures decorator, you will notice that the config fixture is executed for every test method. Is it possible to use the reset fixture without triggering this behaviour?
As I mentioned in a comment, that seems to be a bug in Pytest 3.2.5.
There's a workaround, which is to "force" the scope of a parametrization. So, in this case if you include the scope="class" in the parametrize decorator, you get the desired behaviour.
import pytest
from pytest import *

@fixture(scope='class')
def config(request):
    print("\nconfiguring with %s" % request.param)
    yield
    print("\ncleaning up config")

@fixture(scope='function')
def reset():
    print("\nresetting")

@mark.parametrize("config", ["config-A", "config-B"], indirect=True, scope="class")
@mark.usefixtures("reset")
class TestMoreStuff(object):
    def test_a(self, config):
        pass

    def test_b(self, config):
        pass

    def test_c(self, config):
        pass
It depends on which version of pytest you are using.
There were semantic problems with implementing this in older versions of pytest, so the idea was not implemented there. A suggestion to implement it had already been filed; see the issue
"Fixture scope doesn't work when parametrized tests use parametrized fixtures".
That was the bug.
The issue has been resolved in the latest version of pytest; the commit for it went in with pytest 3.2.5.
Hope it helps.

python unittest inheritance - abstract test class

I need to use unittest in python to write some tests. I am testing the behavior of 2 classes, A and B, that have a lot of overlap in behavior because they are both subclasses of C, which is abstract. I would really like to be able to write 3 testing classes: ATestCase, BTestCase, and AbstractTestCase, where AbstractTestCase defines the common setup logic for ATestCase and BTestCase, but does not itself run any tests. ATestCase and BTestCase would be subclasses of AbstractTestCase and would define behavior/input data specific to A and B.
Is there a way to create an abstract class via python unittest that can take care of setup functionality by inheriting from TestCase, but not actually run any tests?
Sure, a construct like that will work:
import unittest

class BaseTestCase(unittest.TestCase):
    def setUp(self):
        pass  # common setup

    def tearDown(self):
        pass  # common teardown

class ATestCase(BaseTestCase):
    def test1(self):
        pass

class BTestCase(BaseTestCase):
    def test1(self):
        pass
If knowledge from ATestCase or BTestCase is required in BaseTestCase, simply override a method in the subclasses but use it in the superclass.
class BaseTestCase(unittest.TestCase):
    def setUp(self):
        self.instance = self._create_instance()

    def _create_instance(self):
        raise NotImplementedError()

class ATestCase(BaseTestCase):
    def _create_instance(self):
        return A()

class BTestCase(BaseTestCase):
    def _create_instance(self):
        return B()
Note that if any test_* methods are implemented in BaseTestCase, they will be run (and fail, due to the failing setUp) when discovered by automated runners.
As a workaround you may use skipTest in the abstract test's setUp path and override it in the subclasses.
class BaseTestCase(unittest.TestCase):
    def setUp(self):
        self.instance = self._create_instance()

    def _create_instance(self):
        self.skipTest("Abstract")

    def test_fromBase(self):
        self.assertTrue(True)
Note that skipping test_fromBase (e.g. via decorator) won't be good, since 'test should be skipped' logic will be inherited by all subclasses.
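For illustration, a minimal sketch of why a skip decorator on the base class would be inherited by every subclass:

import unittest

class BaseTestCase(unittest.TestCase):
    @unittest.skip("Abstract")      # the skip marker lives on the method...
    def test_fromBase(self):
        self.assertTrue(True)

class ATestCase(BaseTestCase):
    pass                            # ...so the inherited test_fromBase is
                                    # skipped here too, even though ATestCase
                                    # is a concrete test case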
I tried Łukasz's answer and it works, but I don't like the OK (SKIP=<number>) messages. For my purposes I don't want anyone to start trusting a particular number of skipped tests, or, distrusting it, to dig into the test suite and ask why something was skipped, whether it always is, and whether it is on purpose. For me that's a non-starter.
I happen to use nosetests exclusively, and by convention test classes starting with _ are not run, so naming my base class _TestBaseClass is sufficient.
I tried this in PyCharm with unittest and py.test, and both of those tried to run my base class and its tests, resulting in errors because there's no instance data in the abstract base class. Maybe someone with specific knowledge of either of those runners could build a suite, or something similar, that bypasses the base class.

How can I create a class-scoped test fixture in pyUnit?

I am unit testing mercurial integration and have a test class which currently creates a repository with a file and a clone of that repository in its setUp method and removes them in its tearDown method.
As you can probably imagine, this gets quite performance heavy very fast, especially if I have to do this for every test individually.
So what I would like to do is create the folders and initialize them for mercurial on loading the class, so each and every unittest in the TestCase class can use these repositories. Then when all the tests are run, I'd like to remove them. The only thing my setUp and tearDown methods then have to take care of is that the two repositories are in the same state between each test.
Basically what I'm looking for is a python equivalent of JUnit's #BeforeClass and #AfterClass annotations.
I've now done it by subclassing the TestSuite class, since the standard loader wraps each test method in an instance of the TestCase in which it is defined and puts them together in a TestSuite. I have the TestSuite call the before() and after() methods of the first TestCase. This of course means that you can't initialize any values on your TestCase object, but you probably want to do that in your setUp anyway.
The TestSuite looks like this:
class BeforeAfterSuite(unittest.TestSuite):
    def run(self, result):
        if len(self._tests) < 1:
            return unittest.TestSuite.run(self, result)
        first_test = self._tests[0]
        if "before" in dir(first_test):
            first_test.before()
        result = unittest.TestSuite.run(self, result)
        if "after" in dir(first_test):
            first_test.after()
        return result
For slightly more fine-grained control I also created a custom TestLoader which makes sure BeforeAfterSuite is only used to wrap test-method TestCase objects; it looks like this:
class BeforeAfterLoader(unittest.TestLoader):
    def loadTestsFromTestCase(self, testCaseClass):
        self.suiteClass = BeforeAfterSuite
        suite = unittest.TestLoader.loadTestsFromTestCase(self, testCaseClass)
        self.suiteClass = unittest.TestLoader.suiteClass
        return suite
Probably missing here is a try/except block around the before() and after() calls; as it stands, a failure there could break all the test cases in the suite.
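As a sketch of what that guard might look like (one possible variant, assuming the same before()/after() convention as above):

import unittest

class BeforeAfterSuite(unittest.TestSuite):
    def run(self, result):
        if len(self._tests) < 1:
            return unittest.TestSuite.run(self, result)
        first_test = self._tests[0]
        if "before" in dir(first_test):
            first_test.before()
        try:
            result = unittest.TestSuite.run(self, result)
        finally:
            # run the class-level teardown even if the suite itself blows up
            if "after" in dir(first_test):
                first_test.after()
        return result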
From the Python unittest documentation:
setUpClass():
A class method called before tests in an individual class run. setUpClass is called with the class as the only argument and must be decorated as a classmethod():
@classmethod
def setUpClass(cls):
    ...
New in version 2.7.
tearDownClass():
A class method called after tests in an individual class have run. tearDownClass is called with the class as the only argument and must be decorated as a classmethod():
@classmethod
def tearDownClass(cls):
    ...
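Applied to the question's scenario, a rough Python 3 sketch might look like the following; the class name, directory layout and plain hg command-line calls are illustrative assumptions, not code from the question:

import os
import shutil
import subprocess
import tempfile
import unittest

class MercurialIntegrationTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # create the repository and its clone once for the whole class
        cls.base_dir = tempfile.mkdtemp()
        cls.repo_dir = os.path.join(cls.base_dir, "repo")
        cls.clone_dir = os.path.join(cls.base_dir, "clone")
        subprocess.run(["hg", "init", cls.repo_dir], check=True)
        subprocess.run(["hg", "clone", cls.repo_dir, cls.clone_dir], check=True)

    @classmethod
    def tearDownClass(cls):
        # remove everything after all tests in the class have run
        shutil.rmtree(cls.base_dir)

    def setUp(self):
        # per-test work: only make sure both repositories are in a known state
        pass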

How can I add a test method to a group of Django TestCase-derived classes?

I have a group of test cases that all should have exactly the same test done, along the lines of "Does method x return the name of an existing file?"
I thought that the best way to do it would be a base class deriving from TestCase that they all share, and simply add the test to that class. Unfortunately, the testing framework still tries to run the test for the base class, where it doesn't make sense.
class SharedTest(TestCase):
    def x(self):
        ...  # do test

class OneTestCase(SharedTest):
    ...  # my tests are performed, and SharedTest.x()
I tried to hack in a check to simply skip the test if it's called on an object of the base class rather than a derived class like this:
class SharedTest(TestCase):
    def x(self):
        if type(self) != type(SharedTest()):
            ...  # do test
        else:
            pass
but got this error:
ValueError: no such test method in <class 'tests.SharedTest'>: runTest
First, I'd like any elegant suggestions for doing this. Second, though I don't really want to use the type() hack, I would like to understand why it's not working.
You could use a mixin, taking advantage of the fact that the test runner only runs tests inheriting from unittest.TestCase (which Django's TestCase inherits from). For example:
class SharedTestMixin(object):
    # This class will not be executed by the test runner (it inherits from
    # object, not unittest.TestCase). If it did, assertEquals would fail,
    # as it is not a method that exists on `object`.
    def test_common(self):
        self.assertEquals(1, 1)

class TestOne(TestCase, SharedTestMixin):
    def test_something(self):
        pass
    # test_common is also run

class TestTwo(TestCase, SharedTestMixin):
    def test_another_thing(self):
        pass
    # test_common is also run
For more information on why this works do a search for python method resolution order and multiple inheritance.
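Purely for illustration, a quick way to see the method resolution order at work (the printed tuple is approximate and depends on module names):

import unittest

class SharedTestMixin(object):
    def test_common(self):
        self.assertEqual(1, 1)

class TestOne(unittest.TestCase, SharedTestMixin):
    def test_something(self):
        pass

print(TestOne.__mro__)
# roughly: (TestOne, unittest.case.TestCase, SharedTestMixin, object)
# The runner collects TestOne (a TestCase subclass) and finds test_common via
# the MRO, but never collects SharedTestMixin itself, as it is not a TestCase.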
I faced a similar problem. I couldn't prevent the test method in the base class from being executed, but I ensured that it did not exercise any actual code. I did this by checking for an attribute and returning immediately if it was set. This attribute was only set for the base class, and hence the tests ran everywhere except in the base class.
class SharedTest(TestCase):
    def setUp(self):
        self.do_not_run = True

    def test_foo(self):
        if getattr(self, 'do_not_run', False):
            return
        # Rest of the test body.

class OneTestCase(SharedTest):
    def setUp(self):
        super(OneTestCase, self).setUp()
        self.do_not_run = False
This is a bit of a hack. There is probably a better way to do this but I am not sure how.
Update
As sdolan says a mixin is the right way. Why didn't I see that before?
Update 2
(After reading comments) It would be nice if (1) the superclass method could avoid the hackish if getattr(self, 'do_not_run', False): check; (2) if the number of tests were counted accurately.
There is a possible way to do this. Django picks up and executes all test classes in tests, be it tests.py or a package with that name. If the test superclass is declared outside the tests module, this won't happen, yet it can still be inherited by test classes. For instance, SharedTest can be located in app.utils and then used by the test cases. This would be a cleaner version of the above solution.
# module app.utils.test
class SharedTest(TestCase):
    def test_foo(self):
        ...  # rest of the test body

# module app.tests
from app.utils import test

class OneTestCase(test.SharedTest):
    ...
