Importance of Regression Testing
Experience has shown that the re-emergence of faults is quite common throughout the software development lifecycle. Sometimes a fault reappears because a fix was lost through poor revision control practices (or simple human error in revision control); in other cases the fix is "fragile", meaning that some other change to the program causes it to stop working.
Regression Testing guidelines
The general guideline is 10-30% test coverage per major functional module, depending on the complexity, severity, and number of defects found in current and past releases, as these are indicators of code stability.
10%: Exploratory testing of functional modules where no identified code changes occurred, providing a minimal check that no integration point was overlooked. This testing can be done without creating new test cases.
10%: Test coverage for non-invasive code changes, such as UI changes, where no Severity 1 or Severity 2 defects were found during test execution.
30%: Test coverage for complex code changes, such as those spanning multiple integration points, with a limited number of Severity 1 or Severity 2 defects in isolated code.
50%: Test coverage for code changes with multiple Severity 1 and Severity 2 defects that crossed integration points, for instance where one defect was fixed and two more were created.
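The coverage bands above can be expressed as a simple decision helper. The sketch below is illustrative only: the ChangeType values, the recommended_coverage function, and its thresholds are assumptions made for this example, and the actual percentage should still be judged against the complexity, severity, and defect history of the release.

```python
from enum import Enum

class ChangeType(Enum):
    """Illustrative classification of the code change under test."""
    NO_CODE_CHANGE = "no identified code changes"
    NON_INVASIVE = "non-invasive change (e.g. UI only)"
    COMPLEX = "complex change with multiple integration points"

def recommended_coverage(change_type: ChangeType,
                         sev1_sev2_defects: int,
                         defects_cross_integration_points: bool) -> int:
    """Return a suggested regression coverage percentage for a module.

    The rules mirror the 10/30/50% bands above; names and thresholds are
    assumptions made for the sake of the example.
    """
    # Multiple Sev 1/2 defects that crossed integration points -> 50%.
    if sev1_sev2_defects > 1 and defects_cross_integration_points:
        return 50
    # Complex changes with a limited number of Sev 1/2 defects in isolated code -> 30%.
    if change_type is ChangeType.COMPLEX:
        return 30
    # Non-invasive changes (e.g. UI) with no Sev 1/2 defects -> 10%.
    if change_type is ChangeType.NON_INVASIVE and sev1_sev2_defects == 0:
        return 10
    # Modules with no identified code changes get a 10% exploratory check.
    if change_type is ChangeType.NO_CODE_CHANGE:
        return 10
    # Anything else falls back to the middle of the general 10-30% guideline.
    return 20

print(recommended_coverage(ChangeType.COMPLEX, 1, False))  # 30
```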
| Automation Status | Status Definition |
|---|---|
| Not Ready | The status will default to Not Ready. When the test case is in this status, it is incomplete or has not been reviewed by the QE Analyst to determine whether it is ready to be automated. |
| Ready | The QE Analyst will set the test case to Ready when it is complete and ready for automation. A test case is ready when it is fully detailed with all test steps, allowing the Automation Engineer to execute it manually, observe the expected outcome, and automate it. |
| Rework Needed | The QE Analyst will set the test case to Rework Needed when the functional behaviour has changed since the test case was automated, making automation updates necessary. |
| Rejected | The Automation Engineer will set the test case to Rejected when the test case is not sufficiently detailed to allow automation to be performed. A test case in this status requires further work by the QE Analyst to add details regarding test steps and expected outcomes. |
| In Progress | The Automation Engineer will set the test case to In Progress while the test case is being automated. |
| Complete | The Automation Engineer will set the test case to Complete when the test case has been automated and can be run during regression instead of being executed manually. |
| Not Automatable | The Automation Engineer will set the test case to Not Automatable when the test case steps do not lend themselves to automation because verification requires a QE Analyst to review the results. |
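For teams that track these statuses in tooling, the lifecycle can be modelled as an enum with an owning role per status. The sketch below is a minimal illustration of that idea; the class, dictionary, and function names are assumptions, not part of any prescribed tool.

```python
from enum import Enum

class AutomationStatus(Enum):
    NOT_READY = "Not Ready"
    READY = "Ready"
    REWORK_NEEDED = "Rework Needed"
    REJECTED = "Rejected"
    IN_PROGRESS = "In Progress"
    COMPLETE = "Complete"
    NOT_AUTOMATABLE = "Not Automatable"

# Role that may set each status, taken from the table above.
# "default" means the status is applied automatically when a test case is created.
STATUS_OWNER = {
    AutomationStatus.NOT_READY: "default",
    AutomationStatus.READY: "QE Analyst",
    AutomationStatus.REWORK_NEEDED: "QE Analyst",
    AutomationStatus.REJECTED: "Automation Engineer",
    AutomationStatus.IN_PROGRESS: "Automation Engineer",
    AutomationStatus.COMPLETE: "Automation Engineer",
    AutomationStatus.NOT_AUTOMATABLE: "Automation Engineer",
}

def set_status(current: AutomationStatus,
               new: AutomationStatus,
               role: str) -> AutomationStatus:
    """Allow a status change only when the role owns the target status."""
    owner = STATUS_OWNER[new]
    if owner not in ("default", role):
        raise PermissionError(
            f"{role} cannot move a test case from {current.value} to {new.value}"
        )
    return new

# Example: the QE Analyst marks a fully detailed test case as ready for automation.
status = set_status(AutomationStatus.NOT_READY, AutomationStatus.READY, "QE Analyst")
print(status.value)  # Ready
```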
Regression Testing across STLC
Regression Test suite creation workflow
Workflow: The workflow below illustrates the creation of manual and automated regression suites, together with the maintenance flow.
Test suite creation
The activities for Test Suite creation for Regression Testing are outlined in the steps below:
Step 1: Create a high-level mind map for the application
Step 2: Detail each process of the application to the scenario level
Step 3: Identify the critical business functionalities
Step 4: Prioritize the functionalities/scenarios according to criticality
Step 5: Review the prioritized scenarios with the Business team
Step 6: Build the regression suite accordingly
Step 7: Review the regression suite with the Business team
Step 8: Maintain regression test suite/library as an ongoing activity in support of any Customer Care or Release changes
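Steps 3 through 6 amount to filtering and ordering scenarios by business criticality. The sketch below shows one way that prioritization could be represented in code; the Scenario fields, the criticality scale, and the cut-off value are illustrative assumptions, and the actual prioritization is agreed in review with the Business team.

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """A business scenario identified from the application mind map (Steps 1-2)."""
    name: str
    functionality: str
    criticality: int                 # assumed scale: 1 = most critical
    test_cases: list = field(default_factory=list)

def build_regression_suite(scenarios: list[Scenario],
                           max_criticality: int = 2) -> list[Scenario]:
    """Steps 3-6 sketch: keep the critical scenarios and order them by criticality."""
    critical = [s for s in scenarios if s.criticality <= max_criticality]
    return sorted(critical, key=lambda s: s.criticality)

suite = build_regression_suite([
    Scenario("Checkout with saved card", "Payments", criticality=1),
    Scenario("Update profile picture", "Account", criticality=3),
])
print([s.name for s in suite])  # ['Checkout with saved card']
```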
Test Suite Maintenance
Review the mind map during the Post Implementation phase of each release and update it to reflect Business and technical needs.
Review the existing regression test suite during the Post Implementation phase and provide recommendations to Business and IT on the functionalities that need coverage.
Always maintain the regression test library for Integration and System testing functionalities.
Review existing manual/automated test scripts for reusability and re-testability, and provide recommendations on the functionalities/test scripts that can be eliminated from the current regression test library.
There is a trade-off between the cost of possible redundant testing and the cost of deciding which test results can be reused and which manual/automated test scripts can be rerun.
Some manual/automated test scripts may need to be updated during the Post Implementation phase.
Identify new regression test functionalities and review them with Business and IT before adding them to the regression test suite.
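A Post Implementation review of the library can be supported by a small script that flags candidate scripts to eliminate and coverage gaps to fill. The sketch below is an assumed heuristic, not a prescribed rule; the TestScript fields and the maintenance_review function are hypothetical, and the resulting recommendations would still go to Business and IT for review.

```python
from dataclasses import dataclass

@dataclass
class TestScript:
    """A manual or automated script in the regression library (fields illustrative)."""
    name: str
    functionality: str
    automated: bool

def maintenance_review(library: list[TestScript],
                       retired_functionalities: set[str],
                       new_functionalities: set[str]) -> dict:
    """Flag scripts covering retired functionality and functionalities with no coverage."""
    to_eliminate = [s.name for s in library
                    if s.functionality in retired_functionalities]
    covered = {s.functionality for s in library}
    to_add = sorted(new_functionalities - covered)
    return {"eliminate": to_eliminate, "add_coverage_for": to_add}

report = maintenance_review(
    library=[TestScript("Legacy fax notification", "Fax", automated=False),
             TestScript("Order submission", "Orders", automated=True)],
    retired_functionalities={"Fax"},
    new_functionalities={"Orders", "Subscription billing"},
)
print(report)  # {'eliminate': ['Legacy fax notification'], 'add_coverage_for': ['Subscription billing']}
```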