I am trying to sort out how to get a list of just the test names from a test file using pytest. I have made it this far:
$ pytest --collect-only my_test_file.py
============================================================================================= test session starts ==============================================================================================
platform darwin -- Python 3.6.4, pytest-3.2.5, py-1.5.2, pluggy-0.4.0
rootdir: /Users/.../automation, inifile:
plugins: xdist-1.20.1, metadata-1.7.0, html-1.17.0, forked-0.2, cloud-2.0.0
collected 6 items
<Module 'my_test_file.py'>
<UnitTestCase 'TestFile'>
<TestCaseFunction 'test_general'>
<TestCaseFunction 'test_search_0'>
<TestCaseFunction 'test_search_1'>
<TestCaseFunction 'test_search_2'>
<TestCaseFunction 'test_search_3'>
<TestCaseFunction 'test_search_4'>
While this is a good start, it is too much information. What should I be passing to get back just a list of the test names themselves?
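For reference, a unittest-style file shaped like the one being collected would look roughly like this (a hypothetical sketch; the asker's actual file is not shown in the question):

# my_test_file.py (hypothetical reconstruction)
import unittest

class TestFile(unittest.TestCase):
    def test_general(self):
        self.assertTrue(True)

    def test_search_0(self):
        self.assertTrue(True)

    # ... test_search_1 through test_search_4 follow the same pattern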
Use --quiet to change the collection output.
$ pytest --collect-only -q
test_eggs.py::test_bacon[1]
test_eggs.py::test_bacon[2]
test_eggs.py::test_bacon[3]
test_spam.py::test_foo
test_spam.py::test_bar
no tests ran in 0.03 seconds
You can strip the status line with
$ pytest --collect-only -q | head -n -2
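Note that head -n -2 requires GNU head; the BSD head that ships with macOS (the darwin platform shown above) does not accept negative line counts. A portable alternative, assuming you only want the test node IDs, is to filter for the :: separator that every node ID contains:

$ pytest --collect-only -q | grep ::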
When used twice (or more), --quiet will print the number of tests per module:
$ pytest --collect-only -qq
test_eggs.py: 3
test_spam.py: 2
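If you need the names inside Python rather than on the command line, one possible approach (a sketch, not part of the original answer) is to run collection in-process and record each item's node ID with a small plugin object:

# collect_names.py -- hypothetical helper, not from the answer above
import pytest

class NameCollector:
    """Records the node ID of every collected test item."""
    def __init__(self):
        self.nodeids = []

    def pytest_collection_modifyitems(self, items):
        # pytest calls this hook once collection has finished
        self.nodeids = [item.nodeid for item in items]

collector = NameCollector()
pytest.main(["--collect-only", "-q", "my_test_file.py"], plugins=[collector])
print("\n".join(collector.nodeids))

Each entry is the same node ID string that -q prints, e.g. my_test_file.py::TestFile::test_general.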