Reporting and Metrics
Data Collection Methods
Test Automation Solution logging
- Log which test case is currently under execution, with start and end times. When running Cucumber from the command line, test case progress is indicated as each step is executed.
- The status of the test execution: passed, failed, or a Test Automation Solution failure.
- Low-level details of the test execution (e.g. logging significant steps within a test case).
- Dynamic information about the system (e.g. memory consumption over time, to reveal leaks).
- With reliability/stress testing, an iteration counter should be logged so a failure can be tied to a specific cycle.
- When test cases have random elements, the seed or the choices made should be logged so the run is reproducible.
- All actions should be logged so that the run can be replayed with the same steps and timing.
- Screenshots can be saved for root cause analysis; with Cucumber, the capybara-screenshot gem can capture them automatically when a step fails.
- Any test logs which could be overwritten should be stored for analysis.
- Colours can help distinguish log information: Cucumber uses red for failures, yellow for undefined/pending steps and green for passing ones.
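The logging points above can be sketched with Ruby's standard Logger. The `TestRunLogger` class and its log format are illustrative, not part of Cucumber; in a real suite the equivalent calls would sit in `Before`/`After` hooks.

```ruby
require "logger"
require "stringio"
require "time"

# Illustrative sketch of Test Automation Solution logging: record the start
# and end time of each test case, its status, and the random seed used, so
# a failing run can be replayed deterministically. Names are assumptions.
class TestRunLogger
  def initialize(io = $stdout)
    @log = Logger.new(io)
    @log.formatter = proc { |sev, time, _, msg| "#{time.utc.iso8601} #{sev} #{msg}\n" }
  end

  # Runs a test case body, yielding a seeded RNG for any random choices.
  def run(name, seed: Random.new_seed)
    @log.info("START #{name} seed=#{seed}")
    yield Random.new(seed)
    @log.info("END #{name} status=passed")
  rescue StandardError => e
    @log.error("END #{name} status=failed error=#{e.message}")
    raise
  end
end

io = StringIO.new
TestRunLogger.new(io).run("Successful login") { |rng| rng.rand(100) }
puts io.string
```

Because the seed is logged on the START line, a test with random elements can be re-run with the same seed to reproduce the exact sequence of choices.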
System Under Test logging
- When a defect is detected in the system, log timestamps, source location, and error messages.
- At the startup of the system, configuration info should be logged to a file (e.g. software version, config of the OS).
- A failure recorded in the Test Automation Solution's log should be easy to correlate with the corresponding entry in the System Under Test's log (e.g. in Sentry), and vice versa.
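As a minimal sketch of the startup logging above (the field names and `app_version` parameter are assumptions), the System Under Test could write its version and OS configuration to the log once at boot:

```ruby
require "logger"
require "rbconfig"
require "json"
require "time"
require "stringio"

# Capture configuration once at startup so that later failures can be matched
# against the exact environment they ran on. Field names are illustrative.
def log_startup_config(io, app_version:)
  config = {
    app_version: app_version,
    ruby_version: RUBY_VERSION,
    os: RbConfig::CONFIG["host_os"],
    logged_at: Time.now.utc.iso8601
  }
  Logger.new(io).info("startup config: #{config.to_json}")
  config
end

io = StringIO.new
log_startup_config(io, app_version: "1.2.3")
puts io.string
```

Logging the configuration as one JSON line makes it straightforward to pick up later during data analysis or correlation with the automation log.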
Data Analysis
- Analyse the test environment data to support proper sizing of automation (e.g. in the cloud), resources (CPU and RAM) and single vs multi-browser execution.
- Compare test results of previous executions.
- Determine how to use web logs to monitor software usage.
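Comparing test results with previous executions can be as simple as diffing status maps; the data shape assumed here (scenario name mapped to a status symbol) is illustrative:

```ruby
# Diff two runs' results (scenario name => :passed / :failed). Newly failing
# scenarios usually deserve analysis first; recovered ones may indicate a
# fix, or a flaky test worth investigating.
def compare_runs(previous, current)
  {
    newly_failing: current.keys.select { |k| current[k] == :failed && previous[k] == :passed },
    newly_passing: current.keys.select { |k| current[k] == :passed && previous[k] == :failed },
    still_failing: current.keys.select { |k| current[k] == :failed && previous[k] == :failed }
  }
end

previous = { "login" => :passed, "search" => :failed, "checkout" => :failed }
current  = { "login" => :failed, "search" => :failed, "checkout" => :passed }
p compare_runs(previous, current)
```

A scenario in `still_failing` may point to an existing, already-reported defect, which is exactly the first check in the failure analysis below.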
Analysis of test execution failures
- Check if the same failure happened in previous executions (it may be an existing defect in the System Under Test or Test Automation Solution).
- Identify the test case and what is being tested.
- Find the step where the failure happened.
- Analyse the log information about the state of the System Under Test.
- If the state of the System Under Test is not what was expected, log a defect.
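The failure-analysis steps above could be wired into a small triage helper; the failure/history data shapes and the verdict names are hypothetical:

```ruby
# Triage one failure following the steps above: build a signature from the
# test case, the failing step and the message. A signature already seen in
# earlier runs points at a known defect; otherwise the state of the System
# Under Test decides whether to log a new defect or suspect the automation.
def triage(failure, known_signatures)
  signature = "#{failure[:scenario]} / #{failure[:step]}: #{failure[:message]}"
  verdict =
    if known_signatures.include?(signature)
      :known_failure        # re-check the existing defect report
    elsif failure[:state_as_expected]
      :suspect_automation   # SUT state is fine: look at the test or locators
    else
      :log_defect           # unexpected SUT state: raise a defect
    end
  { signature: signature, verdict: verdict }
end

failure = { scenario: "Checkout", step: "pay by card",
            message: "timeout", state_as_expected: false }
p triage(failure, [])
```

Keeping the signature stable across runs is what allows the "same failure in previous executions" check to be automated rather than done by eye.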
Test Progress Report
Content of a test progress report
- Test results, system under test information, documentation of test environment.
- Which tests have failed and reasons for failure.
- Test execution history and who reported it.
- Test reporting is also used to diagnose any failures of the Test Automation Framework components.
Publishing the test reports
- Can be uploaded to a website / sent to a mailing list / uploaded to a test management tool / posted by a chatbot.
- Identify problematic parts of the system under test and keep a history of the test reports for trend analysis.
Stakeholders to report to
- Management stakeholders (solution architect, program manager, test manager).
- Operational stakeholders (product owner, business analyst).
- Technical stakeholders (team leader, scrum master, developer, tester, DBA).
Creation of dashboards
- Tools support data aggregation from pipeline execution test logs, project management tools and code repositories.
- Trends can include defect clusters, performance degradation and reliability of builds.
AI / machine learning analysis of test logs
- Automated analysis of the large volume of data in test logs helps reduce the time spent finding broken locators, analysing the reasons for test failures, and grouping common defects for reporting (unsupervised learning).
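True unsupervised learning needs an ML toolkit, but even a naive signature-based grouping (a stand-in, with an assumed masking rule) illustrates the idea of clustering repeated failure messages:

```ruby
# Group failure messages by a normalised signature: digits and hex ids are
# masked so that repeats of the same defect with different values (timeouts,
# record ids, addresses) fall into the same group.
def group_failures(messages)
  messages.group_by { |m| m.gsub(/0x[0-9a-fA-F]+|\d+/, "N") }
end

failures = [
  "Timeout after 30s waiting for #order-42",
  "Timeout after 45s waiting for #order-97",
  "Element '#pay-button' not found"
]
group_failures(failures).each { |sig, group| puts "#{group.size}x #{sig}" }
```

The two timeout messages collapse into one group, so a report can show "2x Timeout after Ns waiting for #order-N" instead of two seemingly distinct failures.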