I am testing several versions of a component using Pytest. Some tests can run on all versions, some are version specific. For example:
tests
|
|-- version1_tests
| |-- test_feature_1_1.py
| |-- test_feature_1_2.py
| |-- test_feature_1_n.py
|
|-- version2_tests
| |-- test_feature_2_1.py
| |-- test_feature_2_2.py
| |-- test_feature_2_n.py
|
|-- common_tests
| |-- test_feature_common_1.py
| |-- test_feature_common_2.py
| |-- test_feature_common_n.py
I would like to mark my tests such that I can select if I want to test Version 1 (version1_tests + common_tests) or Version 2 (version2_tests + common_tests) from the command line.
The way I am currently doing this is, for each test module, I add a pytest marker and then specify the marker from the command line. For example, in test_feature_1_1.py:

import pytest

pytestmark = pytest.mark.version1


class TestSpecificFeature(object):
    ...

And then to run: python -m pytest -m "common or version1"
This works fine, but I have to manually add the marker to every module, which is tedious because there are actually dozens of modules (not 3 like in the example).
We used to use Robot Framework, where it was trivial to "mark" an entire folder by adding tags to the __init__.robot files. Is there any equivalent way to do this in Pytest, or is marking each module the best I can do?
You can add markers to collected tests at runtime using the item.add_marker() method. Here's an example of adding markers in the pytest_collection_modifyitems hook:
import pathlib

import pytest


def pytest_collection_modifyitems(config, items):
    rootdir = pathlib.Path(config.rootdir)
    for item in items:
        rel_path = pathlib.Path(item.fspath).relative_to(rootdir)
        # use the first path part ending with '_tests' as the source of the mark name;
        # note that str.removesuffix requires Python 3.9+
        mark_name = next((part for part in rel_path.parts if part.endswith('_tests')), '').removesuffix('_tests')
        if mark_name:
            mark = getattr(pytest.mark, mark_name)
            item.add_marker(mark)
Write the code to conftest.py in the project root dir and try it out:
$ pytest -m "common or version2" --collect-only -q
tests/common_tests/test_feature_common_1.py::test_spam
tests/common_tests/test_feature_common_1.py::test_eggs
tests/common_tests/test_feature_common_2.py::test_spam
tests/common_tests/test_feature_common_2.py::test_eggs
tests/common_tests/test_feature_common_n.py::test_spam
tests/common_tests/test_feature_common_n.py::test_eggs
tests/version2_tests/test_feature_2_1.py::test_spam
tests/version2_tests/test_feature_2_1.py::test_eggs
tests/version2_tests/test_feature_2_2.py::test_spam
tests/version2_tests/test_feature_2_2.py::test_eggs
tests/version2_tests/test_feature_2_n.py::test_spam
tests/version2_tests/test_feature_2_n.py::test_eggs
Only tests under common_tests and version2_tests were selected.
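For illustration, the mark-name derivation can be pulled out into a standalone helper; this is a minimal sketch (the function name mark_name_for and the example paths are made up for this demo), using the same relative_to/parts logic as the hook above:

```python
import pathlib


def mark_name_for(test_path, rootdir):
    """Derive a mark name from the first '*_tests' directory component of the test path."""
    rel_path = pathlib.Path(test_path).relative_to(pathlib.Path(rootdir))
    # empty string if no path part ends with '_tests'
    part = next((p for p in rel_path.parts if p.endswith('_tests')), '')
    return part.removesuffix('_tests')


print(mark_name_for('/proj/tests/version1_tests/test_feature_1_1.py', '/proj'))    # version1
print(mark_name_for('/proj/tests/common_tests/test_feature_common_1.py', '/proj'))  # common
```

A test file outside any *_tests directory yields an empty string, so it gets no marker at all.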
For each collected test item, we extract the path relative to the project root dir (rel_path); the first part of rel_path that ends with _tests is used as the source for the mark name. E.g. collect_tests is the source for the mark name collect, etc. Once we have the mark name, we create the mark via getattr (we can't use plain attribute access since the name is only known at runtime) and append it via item.add_marker(mark). You can write your own, less abstract version of it, e.g.
for item in items:
    if 'common_tests' in str(item.fspath):
        item.add_marker(pytest.mark.common)
    elif 'version1_tests' in str(item.fspath):
        item.add_marker(pytest.mark.version1)
    elif 'version2_tests' in str(item.fspath):
        item.add_marker(pytest.mark.version2)
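On pytest 7 and later, collection items also expose a pathlib-based item.path attribute, which avoids the str(item.fspath) conversion. A sketch under that assumption, with the directory-to-mark mapping pulled into a dict (DIR_MARKS and mark_name_for are names made up for this example):

```python
from pathlib import Path

import pytest

# directory names mapped to the mark each one should receive
DIR_MARKS = {
    'common_tests': 'common',
    'version1_tests': 'version1',
    'version2_tests': 'version2',
}


def mark_name_for(path):
    """Return the mark name for the first known '*_tests' directory in the path, or None."""
    return next((DIR_MARKS[part] for part in Path(path).parts if part in DIR_MARKS), None)


def pytest_collection_modifyitems(config, items):
    for item in items:
        name = mark_name_for(item.path)  # item.path is a pathlib.Path on pytest 7+
        if name is not None:
            item.add_marker(getattr(pytest.mark, name))
```

Matching on whole path components (Path.parts) rather than substrings also avoids false positives such as a file named my_common_tests_helper.py matching the 'common_tests' check.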
With a recent version of pytest, you will receive a PytestUnknownMarkWarning, since the dynamically added markers were not registered. Check out the section Registering markers for a solution. You can either add the mark names in pytest.ini:
[pytest]
markers =
    common
    version1
    version2
or add them dynamically via the pytest_configure hook, e.g.
import pathlib


def pytest_configure(config):
    rootdir = pathlib.Path(config.rootdir)
    # register a marker for every '*_tests' directory under the project root
    for dir_ in rootdir.rglob('*_tests'):
        mark_name = dir_.stem.removesuffix('_tests')
        config.addinivalue_line('markers', mark_name)
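Note that str.removesuffix used above only exists on Python 3.9+; on older interpreters you can strip the suffix by hand. A minimal sketch (strip_tests_suffix is a made-up helper name):

```python
def strip_tests_suffix(name, suffix='_tests'):
    """Pre-3.9 equivalent of name.removesuffix(suffix)."""
    return name[:-len(suffix)] if name.endswith(suffix) else name


print(strip_tests_suffix('version1_tests'))  # version1
print(strip_tests_suffix('common'))          # common (no suffix, unchanged)
```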