Adding scenarios to validate the rerun command works properly
anibalinn committed Dec 27, 2024
1 parent a5f8ce3 commit 64fb227
Showing 4 changed files with 35 additions and 3 deletions.
11 changes: 8 additions & 3 deletions behavex/utils.py
@@ -178,7 +178,6 @@ def join_scenario_reports(json_reports):
 def explore_features(features_path, features_list=None):
     if features_list is None:
         features_list = []
-
     # Normalize path separators
     pure_feature_path, scenario_line = get_feature_and_scenario_line(features_path)
     normalized_features_path = os.path.normpath(pure_feature_path)
@@ -190,8 +189,14 @@ def explore_features(features_path, features_list=None):
             if scenario_line:
                 # iterate over scenarios and add the scenario that matches the scenario line
                 for scenario in feature.scenarios:
-                    if scenario.line == int(scenario_line):
-                        features_list.append(scenario)
+                    # check if scenario is a ScenarioOutline
+                    if isinstance(scenario, ScenarioOutline):
+                        for example in scenario.scenarios:
+                            if example.line == int(scenario_line):
+                                features_list.append(example)
+                    else:
+                        if scenario.line == int(scenario_line):
+                            features_list.append(scenario)
             else:
                 features_list.extend(feature.scenarios)
         else:
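The reason for the new branch: behave expands a `Scenario Outline` into one sub-scenario per example row, each carrying its own line number, so a `feature:line` reference pointing at an example row never equals the outline's own line. The sketch below uses stand-in classes (not behave's real implementations; they only mimic the `line` and `scenarios` attributes the patch relies on) to show the selection logic in isolation:

```python
# Stand-ins mimicking the attributes the patch uses (NOT behave's real classes).
class Scenario:
    def __init__(self, line):
        self.line = line

class ScenarioOutline(Scenario):
    def __init__(self, line, example_lines):
        super().__init__(line)
        # behave expands an outline into one sub-scenario per example row
        self.scenarios = [Scenario(l) for l in example_lines]

def match_scenarios(scenarios, scenario_line):
    """Replicates the commit's selection logic on the stand-in classes."""
    matched = []
    for scenario in scenarios:
        if isinstance(scenario, ScenarioOutline):
            # an outline can only be addressed through its example rows
            for example in scenario.scenarios:
                if example.line == int(scenario_line):
                    matched.append(example)
        else:
            if scenario.line == int(scenario_line):
                matched.append(scenario)
    return matched

scenarios = [Scenario(3), ScenarioOutline(8, example_lines=[16, 17, 18])]
# Line 17 points at an example row, which only the new isinstance branch finds:
print(len(match_scenarios(scenarios, "17")))  # 1
```

Without the `isinstance` check, the old comparison against `scenario.line` would silently skip example-row references during a rerun, since no outline's own line matches them.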
1 change: 1 addition & 0 deletions tests/features/failing_scenarios.txt
@@ -0,0 +1 @@
+tests/features/secondary_features/passing_tests.feature:8,tests/features/secondary_features/passing_tests.feature:29,tests/features/secondary_features/failing_tests.feature:3
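As the fixture above shows, the rerun file is a single line of comma-separated `<feature path>:<scenario line>` entries. A minimal sketch of parsing that format (the helper name is hypothetical, not part of BehaveX's API):

```python
# Hypothetical helper illustrating the rerun-file format shown above:
# one line of comma-separated "<feature path>:<scenario line>" entries.
def parse_failing_scenarios(content):
    entries = []
    for entry in content.strip().split(','):
        # rpartition tolerates ':' elsewhere in the path (e.g. Windows drives)
        path, _, line = entry.rpartition(':')
        entries.append((path, int(line)))
    return entries

content = ("tests/features/secondary_features/passing_tests.feature:8,"
           "tests/features/secondary_features/passing_tests.feature:29,"
           "tests/features/secondary_features/failing_tests.feature:3")
print(parse_failing_scenarios(content))
```

Note that lines 8 and 29 reference two scenarios inside the same feature file, which is exactly the `feature:line` addressing the `explore_features` change must resolve.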
19 changes: 19 additions & 0 deletions tests/features/rerun_scenarios.feature
@@ -0,0 +1,19 @@
+Feature: Failing Scenarios
+
+  @FAILING
+  Scenario Outline: Performing rerun of failing scenarios using the -rf option
+    Given I have installed behavex
+    When I setup the behavex command with "<parallel_processes>" parallel processes and parallel scheme set as "<parallel_scheme>"
+    When I run the behavex command with a file of failing tests
+    Then I should see the following behavex console outputs and exit code "1"
+      | output_line                  |
+      | 2 scenarios passed, 1 failed |
+      | Exit code: 1                 |
+    And I should not see exception messages in the output
+    And I should see the same number of scenarios in the reports not considering the skipped scenarios
+    And I should see the generated HTML report does not contain internal BehaveX variables and tags
+    Examples:
+      | parallel_scheme | parallel_processes |
+      | scenario        | 1                  |
+      | scenario        | 3                  |
+      | feature         | 2                  |
7 changes: 7 additions & 0 deletions tests/features/steps/execution_steps.py
@@ -23,6 +23,13 @@ def step_impl(context):
     execute_command(context, execution_args)


+@when('I run the behavex command with a file of failing tests')
+def step_impl(context):
+    context.output_path = os.path.join('output', 'output_{}'.format(get_random_number(6)))
+    execution_args = ['behavex', '-rf', os.path.join(tests_features_path, 'failing_scenarios.txt'), '-o', context.output_path]
+    execute_command(context, execution_args)
+
+
 @when('I run the behavex command that renames scenarios and features')
 def step_impl(context):
     context.output_path = os.path.join('output', 'output_{}'.format(get_random_number(6)))
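The new step builds the same argument list the test suite hands to `execute_command`. A minimal sketch of that assembly as a pure function (the function name and paths are hypothetical; only the `behavex -rf <file> -o <dir>` shape comes from the step above):

```python
import os

# Hypothetical helper mirroring how the new step assembles the CLI invocation:
# rerun the failures listed in failing_scenarios.txt, writing reports to output_path.
def build_rerun_args(tests_features_path, output_path):
    return ['behavex',
            '-rf', os.path.join(tests_features_path, 'failing_scenarios.txt'),
            '-o', output_path]

args = build_rerun_args('tests/features', 'output/output_123456')
print(' '.join(args))
```

Randomizing the output directory per run (as `get_random_number(6)` does in the step) keeps the parallel `Examples` rows from clobbering each other's reports.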
