How to integrate behave into pytest?

I created a Django app and rely heavily on pytest to discover and organize my unit and functional tests. However, I want to apply Behaviour Driven Development with behave for future tests. Unfortunately, the behave test features are not auto-detected by pytest.

How can I integrate behave and its tests into pytest's discovery, execution and reporting?

— asked Dec 14 '16 by Jon

2 Answers

Pytest and behave are two separate test runners, so neither discovers the other's tests out of the box.

There is a pytest plugin for behaviour testing, pytest-bdd, which also uses Gherkin as a DSL, but the implementation of the steps uses a syntax different from that of behave, so I don't think you can directly run the steps you created with it.
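To illustrate the difference: behave binds steps to a shared context object, while pytest-bdd binds scenarios to plain pytest test functions. A minimal sketch (the step text mirrors behave's tutorial; file names are illustrative):

# behave style (e.g. tests/bdd/steps/tutorial.py): steps receive a context object
from behave import given

@given("the ninja has a third level black-belt")
def step_impl(context):
    context.ninja_level = 3

# pytest-bdd style (e.g. tests/test_tutorial.py): scenarios bind to pytest tests
from pytest_bdd import scenario, given as bdd_given

@scenario("tutorial.feature", "Stronger opponent")
def test_stronger_opponent():
    pass

@bdd_given("the ninja has a third level black-belt")
def ninja_has_black_belt():
    pass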

— answered Oct 13 '22 by jgiralt


Following the pytest docs' example on working with non-python tests, you can achieve output like this:

______________________________________________________________ Feature: Fight or flight  - Scenario: Stronger opponent ______________________________________________________________

Feature: Fight or flight
  Scenario: Stronger opponent
    Step [OK]: the ninja has a third level black-belt
    Step [ERR]: attacked by Chuck Norris
      Traceback (most recent call last):
        File ".venv/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File ".venv/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/bdd/steps/tutorial.py", line 23, in step_impl4
          raise NotImplementedError('STEP: When attacked by Chuck Norris')
      NotImplementedError: STEP: When attacked by Chuck Norris
    Step [NOT REACHED]: the ninja should run for his life

with a feature file from behave's tutorial.
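For reference, the scenario exercised above comes from behave's tutorial; the relevant part of the feature file (placed, e.g., under tests/bdd/tutorial.feature) reads:

Feature: Fight or flight

  Scenario: Stronger opponent
    Given the ninja has a third level black-belt
     When attacked by Chuck Norris
     Then the ninja should run for his life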

In order to make pytest run behave, you can use the following snippet in conftest.py:

# content of conftest.py

import pytest


class BehaveException(Exception):
    """Custom exception for error reporting."""

def pytest_collect_file(parent, path):
    """Allow .feature files to be parsed for bdd."""
    # `path` is a py.path.local (the pre-pytest-7 hook signature)
    if path.ext == ".feature":
        return BehaveFile.from_parent(parent, fspath=path)

class BehaveFile(pytest.File):
    def collect(self):
        from behave.parser import parse_file
        feature = parse_file(self.fspath)
        # yield one pytest item per scenario, including scenario outlines
        for scenario in feature.walk_scenarios(with_outlines=True):
            yield BehaveFeature.from_parent(
                self,
                name=scenario.name,
                feature=feature,
                scenario=scenario,
            )


class BehaveFeature(pytest.Item):

    def __init__(self, name, parent, feature, scenario):
        super().__init__(name, parent)
        self._feature = feature
        self._scenario = scenario

    def runtest(self):
        import subprocess as sp
        from shlex import split

        feature_name = self._feature.filename
        # run behave for this single scenario, emitting JSON for later parsing
        cmd = split(f"""behave tests/bdd/
            --format json
            --no-summary
            --include {feature_name}
            -n "{self._scenario.name}"
        """)

        try:
            proc = sp.run(cmd, stdout=sp.PIPE)
            if not proc.returncode:
                return
        except Exception as exc:
            raise BehaveException(self, f"exc={exc}, feature={feature_name}")

        stdout = proc.stdout.decode("utf8")
        raise BehaveException(self, stdout)

    def repr_failure(self, excinfo):
        """Called when self.runtest() raises an exception."""
        import json
        if isinstance(excinfo.value, BehaveException):
            # args[1] holds behave's JSON output captured from stdout
            results = excinfo.value.args[1]
            data = json.loads(results)
            summary = ""
            for feature in data:
                if feature['status'] != "failed":
                    continue
                summary += f"\nFeature: {feature['name']}"
                for element in feature["elements"]:
                    if element['status'] != "failed":
                        continue
                    summary += f"\n  {element['type'].title()}: {element['name']}"
                    for step in element["steps"]:
                        try:
                            result = step['result']
                        except KeyError:
                            summary += f"\n    Step [NOT REACHED]: {step['name']}"
                            continue
                        status = result['status']
                        if status != "failed":
                            summary += f"\n    Step [OK]: {step['name']}"
                        else:
                            summary += f"\n    Step [ERR]: {step['name']}"
                            summary += "\n      " + "\n      ".join(result['error_message'])

            return summary

    def reportinfo(self):
        return self.fspath, 0, f"Feature: {self._feature.name}  - Scenario: {self._scenario.name}"
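
With this conftest.py in place, the feature files are collected like regular tests (assuming they live under tests/bdd/, as the hard-coded path in runtest expects):

$ pytest tests/bdd/ --collect-only   # check that the scenarios are discovered
$ pytest tests/bdd/                  # run them through behave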


NOTE:

  1. Proper status decoding is needed, e.g. comparing the feature, element, or step status against the Status enum in behave.model_core (see the sketch after this list).
  2. This snippet calls behave as a subprocess instead of using its internal API. A proper integration would consider:
    1. subclassing behave.runner:Runner and behave.runner:ModelRunner to trigger the run from within the same process (also sketched below).
    2. using a behave formatter inside repr_failure, instead of manually decoding the JSON output.
  3. You could achieve more or less granularity by targeting whole features or specific steps, but this snippet demonstrates scenarios only.
  4. Because of the subprocess call in (2), you won't gather data, e.g. for coverage purposes.
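As a rough illustration of notes (1) and (2.1), here is a minimal in-process sketch. It is untested and relies on behave's internal Configuration/Runner API, which may change between versions:

# sketch only: Configuration and Runner are behave internals, not a stable API
from behave.configuration import Configuration
from behave.model_core import Status
from behave.runner import Runner

config = Configuration(command_args=["tests/bdd/"])
runner = Runner(config)
failed = runner.run()  # truthy if any feature failed

for feature in runner.features:
    # compare against the Status enum instead of raw strings (note 1)
    if feature.status == Status.failed:
        print(f"FAILED: {feature.name}")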
— answered Oct 13 '22 by pwoolvett