# Workflow Name
Purpose: Guide contributors through a standardized process for transforming use case requirements into manual and e2e tests, ensuring comprehensive, reliable, and consistent test coverage across the project.
Audience: All contributors involved in defining use cases, writing manual or e2e tests, and improving test coverage (developers, testers, maintainers).
Status: Draft
## Rationale
This workflow ensures:
- Contributors can easily identify which tests to create from each use case
- All scenarios (main, alternative, error) are covered and tests are reliable
- Tests are grouped and classified consistently (e.g., smoke tests, feature types)
- Clear guidance on which tests to prioritize and create first
- Contributors know when to improve or expand use cases before writing tests
- Use cases are systematically linked to their manual and e2e tests, with coverage status documented inside the use case itself
## Quick Reference
- Use cases: `doc/usecases/` (by status: done, current, assigned, later, draft)
- Manual tests: `doc/tests/manual/testcases/` (Gherkin syntax)
- E2E tests: `e2e/features/` (pytest-bdd, Playwright, Python)
- Testing strategy: `doc/tests/README.md`
- E2E guidelines: `doc/guidelines/e2e.md`
- Manual testing guidelines: `doc/guidelines/manual-testing.md`
Test-related issues in GitLab:

- Issues with titles containing "test" or "testing"
- Issues labeled with `scope::e2e` (for e2e tests)
Quick checklist:
- Find the relevant use case to cover
- Check for existing manual and e2e tests
- Map each use case scenario to a test (manual and/or e2e)
- Ensure all main, alternative, and error scenarios are tested
- Group and name tests consistently (e.g., smoke, feature type)
- Prioritize critical and release-blocking tests
- Link use cases to their tests and document coverage
- Create follow-up issues for missing or unclear tests/use cases
## Workflow Steps
### 1. Identify use case to cover or improve
Who: Developer, Tester, Quality Assurance
When:
- During new feature implementation (manual tests should be included in the implementation issue)
- Before moving a use case from "current" to "done" status
- When improving or expanding existing test coverage for features that lack tests
- When fixing bugs that currently lack tests
Actions:
- Read use cases in `doc/usecases/<usecase>.md` to identify those needing tests:
  - use cases that don't have "Manual Tests:" and "E2E Tests:" set to done yet
  - use cases that have unclear or missing scenarios
- For all new features: ensure manual tests are included as part of the implementation issue
- For existing features: check the issue tracker for existing test-related issues (titles containing "test" or "testing", or labeled with `scope::e2e`)
- Check existing manual tests in `doc/tests/manual/testcases/<usecase>.md`
- Check existing e2e tests in `e2e/features/<usecase>.feature`
- Identify gaps in test coverage or unclear scenarios
- Create follow-up issues for missing or unclear tests/use cases
  - Include the names of all tests to be created or improved
Result: Issue(s) created for writing or improving tests based on use case
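The first action above (finding use cases whose coverage status is not yet done) can be scripted. A minimal sketch, assuming the "Manual Tests:"/"E2E Tests:" status lines that step 10 of this workflow adds to each use case doc; the helper name `needs_tests` is hypothetical:

```python
import re

# Hypothetical helper: decide whether a use case doc still needs test work,
# based on the "Manual Tests:" and "E2E Tests:" status lines that step 10
# of this workflow documents in doc/usecases/<usecase>.md.
def needs_tests(usecase_text: str) -> bool:
    statuses = dict(
        re.findall(r"^- (Manual Tests|E2E Tests):\s*(\S+)", usecase_text, re.MULTILINE)
    )
    # A use case needs attention unless both statuses are recorded as "done".
    return not (statuses.get("Manual Tests") == "done"
                and statuses.get("E2E Tests") == "done")

covered = "- Manual Tests: done\n- E2E Tests: done\n"
uncovered = "- Manual Tests: done\n- E2E Tests: missing\n"
print(needs_tests(covered))    # False
print(needs_tests(uncovered))  # True
```

Run over every file in `doc/usecases/`, this yields the candidate list for step 1; the exact status wording in your use case docs may differ, so adjust the pattern accordingly.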
### 2. Analyze and Propose Test Requirements
Who: Developer, Domain Expert, Tester, Quality Assurance
Actions:
- In the issue, analyze and propose:
- Which scenarios (main, alternative, error) need tests
- Which scenarios should be covered by manual tests (always written first)
- Which manual test scenarios should later be automated as e2e tests
- Whether scenarios are relevant as smoke tests
- Dependencies between tests
- Initial assessment of test complexity and implementation approach
Result: Issue has detailed test proposal ready for prioritization
### 3. Prioritize Test Requirements
Who: Product Manager
Actions:
- Together with the Product Manager, finalize:
- Prioritization of tests (level of criticality, release-blocking)
- Critical vs. optional functionality (based on use case criticality)
- Test groups that need immediate attention
- Release-critical scenarios
- Which manual tests should be prioritized for e2e automation
- For guidance on prioritization, automation candidates, and effort analysis, see `doc/tests/manual/testcases/README.md`
- Final decision on test approach and scope
Result: Issue has clear test requirements and priorities and is ready for implementation
### 4. Implement manual tests based on issue
Who: Tester/Contributor
When:
- During feature implementation (manual tests are part of the implementation issue)
- After issue is created and manual tests are not yet implemented (for existing features)
Actions:
- Create or improve Gherkin test cases in `doc/tests/manual/testcases/<usecase>.md`
- Follow the manual testing guidelines in `doc/guidelines/manual-testing.md`
- For guidance on including automation effort analysis fields for each test case, see `doc/tests/manual/testcases/README.md`
- Test the manual test cases on dev.permaplant.net or a local build to ensure they work as intended
- After testing, measure and add:
  - Execution Time (minutes): actual measured time for manual execution
Result: New or improved manual tests are ready for MR
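For illustration, a manual test case in `doc/tests/manual/testcases/<usecase>.md` might look like the following sketch. The feature and steps are hypothetical (loosely based on the multi-select example later in this document); the Execution Time field follows the measurement step above:

```gherkin
Feature: Multi-select on the map

  # Execution Time (minutes): 3 (measured on dev.permaplant.net)
  Scenario: Select multiple plants and move them together
    Given I am on a map with at least two placed plants
    When I hold Ctrl and click on two plants
    And I drag the selection to a new position
    Then both plants are moved by the same offset
```

Keep each scenario self-contained and observable, so a tester can execute it without reading the implementation.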
### 5. Create MR for manual tests
Who: Tester/Contributor
Actions:
- Create MR with the new or improved tests, invite contributors for review
- Ensure tests cover all specified scenarios from the issue
- If any scenarios are unclear or missing, ask for clarification in the issue
Result: MR is created and ready for review
### 6. Review manual test MR
Who: Reviewers
Actions:
- Ensure tests cover all specified scenarios from the issue
- If any scenarios are unclear or missing, ask for clarification in the MR/issue
- Verify that they are correctly formatted and follow the Gherkin guidelines
- Test the manual test cases on dev.permaplant.net or a local build to ensure they work as intended
- Confirm e2e automation selection: Review and finalize which manual test scenarios should be automated as e2e tests (based on complexity, execution time, and automation feasibility)
- Provide feedback and approve MR if all is good
Result: MR is ready to be merged and e2e automation candidates are confirmed
### 7. Implement e2e tests based on manual tests
Who: Developer
When: After manual tests are merged
Actions:
- Create e2e test scenarios in `e2e/features/<usecase>.feature`
- Follow the e2e testing guidelines in `doc/guidelines/e2e.md`
- If needed, tag tests appropriately (e.g., `@smoke_tests`)
- Remove e2e test scenarios from the manual test cases once they have been implemented
Result: New or improved e2e tests are ready for MR
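As a sketch, the corresponding scenario in `e2e/features/<usecase>.feature` could be tagged like this (the scenario wording is hypothetical; pytest-bdd step definitions in Python bind these lines to Playwright actions):

```gherkin
@smoke_tests
Feature: Multi-select on the map

  Scenario: Select multiple plants and move them together
    Given I am on the map page
    When I select two plants with Ctrl+click
    And I drag the selection to a new position
    Then both plants are moved by the same offset
```

Keeping the e2e wording close to the merged manual test makes it easy to verify the coverage mapping during review.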
### 8. Create MR for e2e tests
Who: Developer
Actions:
- Create MR with the new or improved e2e tests, invite contributors for review
- Ensure e2e tests cover all manual test scenarios
- Ensure the pipeline passes and tests run successfully
Result: MR is created and ready for review
### 9. Review e2e test MR
Who: Reviewer
Actions:
- Ensure tests cover all specified scenarios from the manual tests
- If any scenarios are unclear or missing, ask for clarification in the MR/issue
- Verify that they follow our Gherkin guidelines
- Verify e2e tests run successfully in the CI pipeline
- Check the trace files to confirm the tests actually click through the UI as expected
Result: MR is ready to be merged
### 10. Update use case with coverage status
Who: Developer/Tester
When: After all MRs related to a use case are merged.
Actions:
- A use case is only fully done when all its scenarios are covered with tests
- If the use case is not yet in "done" status, move it to "done"
- Update the use case doc in `doc/usecases/<usecase>.md` with:
  - Test Coverage Status:
    - Manual Tests: <status/details>
    - E2E Tests: <status/details>
  - Test Statistics: number of manual test cases, number of e2e test cases
  - Coverage Verification: manually verify and document what percentage of use case scenarios/branches are covered by tests and note any gaps
  - Notes: any limitations or scenarios not covered and why
Result: Use case is fully covered with tests and marked as done
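Filled in, the coverage section of a use case doc might read as follows (all values and the uncovered scenario are illustrative only):

```markdown
## Test Coverage Status

- Manual Tests: done
- E2E Tests: done
- Test Statistics: 6 manual test cases, 4 e2e test cases
- Coverage Verification: all main and error scenarios covered;
  one alternative scenario (offline mode) untested
- Notes: offline mode is hard to simulate reliably in e2e, kept manual
```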
## Special Cases
### Use case is already done but tests are missing
Follow the regular workflow steps 1-10, with these modifications:
- Step 1: When creating the issue for missing tests, link the use case and document which tests are missing and which scenarios are already covered in manual and e2e tests.
- Step 10: Skip moving the use case to "done" status since it's already done. Only update the test coverage documentation.
### Feature has to be merged into master before tests are ready
The created issue for missing tests has high priority and should be completed as soon as possible after the feature is merged to production.
## Examples
### Example 1: Adding tests to an existing feature (multi-select use case)
- Step 1: Identify that the multi-select use case `doc/usecases/multi_select.md` is done but lacks tests
- Step 2: Create issue [#841](https://issues.permaplant.net/841) and analyze test requirements: document which scenarios need manual tests and which should be automated
- Step 3: Discuss with the Product Manager to prioritize which manual test scenarios are critical and should be automated as e2e tests first
- Step 4: Implement all missing manual tests in `doc/tests/manual/testcases/multi_select.md`
- Step 5: Create MR for manual tests
- Step 6: Review MR for manual tests
- Step 7: Implement e2e tests for high-priority scenarios first in `e2e/features/multi_select.feature`
- Step 8: Create MR for e2e tests
- Step 9: Review MR for e2e tests
- Step 10: Update use case `doc/usecases/multi_select.md` with test coverage status (skip moving to "done" since it is already done)

Result: Multi-select use case is fully covered with manual and e2e tests
### Example 2: Implementing a new feature with MR open
- Step 1: While implementing the new feature, ensure manual tests are included in the implementation issue and MR
- Step 2: Analyze test requirements in the issue: identify which manual scenarios should later be automated
- Step 3: Discuss with the Product Manager to prioritize which manual tests need e2e automation
- Step 4: Include manual tests in `doc/tests/manual/testcases/<feature>.md` as part of the feature MR
- Step 5: Review and merge the feature MR (including manual tests)
- Step 6: Create a follow-up issue/MR for e2e automation of prioritized manual test scenarios
- Step 7: Implement e2e tests in `e2e/features/<feature>.feature`
- Step 8: Review and merge the e2e tests MR
- Step 9: Update the use case doc with the final test coverage status

Result: New feature is delivered with comprehensive manual tests, and critical scenarios are automated
## Related Resources
- Use cases: `doc/usecases/` (by status: done, current, assigned, later, draft)
- Manual tests: `doc/tests/manual/testcases/` (Gherkin syntax)
- E2E tests: `e2e/features/` (pytest-bdd, Playwright, Python)
- Testing strategy: `doc/tests/README.md`
- E2E guidelines: `doc/guidelines/e2e.md`
## Troubleshooting
### Common Issues
**Problem:** Certain test scenarios are difficult to implement in e2e tests.

**Solution:** This should be addressed during prioritization (Step 3). Prioritize easy, high-value e2e tests first. Difficult scenarios that can be easily tested manually should be automated last, or may remain manual indefinitely. Document the prioritization rationale in the issue and use case, explaining why certain scenarios were not automated.
**Problem:** Certain test scenarios are difficult to test manually.

**Solution:** Most scenarios should be easy to test manually if the application has good UX; they may be time-consuming, but not difficult. However, some scenarios are genuinely hard to test manually (e.g., race conditions, precise timing-dependent behavior). For time-consuming scenarios (performance with large datasets, browser compatibility testing, network failure simulations), consider prioritizing e2e automation for efficiency. For genuinely difficult scenarios, discuss alternatives in the issue or MR and prioritize e2e automation. Document in the use case why manual testing was not feasible for specific scenarios.