
How to measure coverage properly

Pytest + coverage are showing very strange coverage statistics: they count only the modules that have tests, and for some reason other Python modules are not included in the calculation at all.

I have a simple Python microservice with a structure similar to:

README.rst
Dockerfile
manage.py
api_service/
setup.py
requirements.txt
tests/

Here api_service contains all the logic and tests contains the unit tests. The API is written in Python 3.x; the unit tests use Pytest 3.10.0.

I'm running these commands to get code coverage statistics:

python coverage run pytest -v --junit-xml=junit-report.xml tests/
python coverage xml --fail-under 80
python coverage report

The results are really strange and unexpected. For example, empty __init__.py modules appear in the final report (with 100% coverage) and they affect the final coverage percentage. It also adds a lot of modules that contain only abstract classes, etc.

But what is really unexpected: it does not count Python modules that have no tests at all. It's awful!

Are there any commands, flags, etc. to handle this situation in a proper way?

I've also tried running something like:

python coverage run --source=service_api -v --junit-xml=junit-report.xml tests/

But it also returns unexpected results.

asked Oct 23 '25 by pythonista


1 Answer

cd into the project directory and run:

pytest --cov=. tests/ --cov-report xml

in order to get the code coverage for your source files in XML format.
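If modules that are never imported by any test should also appear in the report, pointing pytest-cov at the package instead of the current directory usually helps. A minimal sketch, assuming the package from the question is named api_service and a reasonably recent pytest-cov is installed:

pytest --cov=api_service --cov-report=xml --cov-fail-under=80 tests/

With --cov pointed at a package rather than ., coverage.py treats every file under that package as source, so modules never imported by a test show up with 0% coverage instead of being silently skipped.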

prereq:

pip install pytest pytest-cov
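To keep the empty __init__.py files (and other files you don't care about) from inflating the percentage, a coverage configuration file can exclude them. A minimal .coveragerc sketch, assuming the package name api_service from the question; the section and option names are standard coverage.py settings:

# .coveragerc -- read automatically by coverage.py and pytest-cov
[run]
# measure the whole package so modules without tests are still counted
source = api_service
# leave empty package markers and the tests themselves out of the report
omit =
    */__init__.py
    */tests/*

[report]
# same threshold as --fail-under / --cov-fail-under
fail_under = 80

The same sections can also live in setup.cfg with a coverage: prefix ([coverage:run], [coverage:report]) if you prefer not to add another file.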
answered Oct 26 '25 by cristian


