Automated processes ensure consistent execution and higher-quality outcomes. Automation streamlines software development, testing, and deployment, resulting in faster releases, and it also facilitates data analysis, monitoring, and decision-making.

Background on the birth of the QA Automation Dashboard:

  • We were a team of 25+ QA engineers distributed across 6-7 teams/pods, so it was hard to track how each team was progressing on automation.
  • For a manager, gathering all the relevant data for these teams was painful, since the data was spread across multiple tools.
  • With multiple Jira projects and test suites, it was challenging to get per-team data on automation coverage progress.
  • Excel sheets were being used to track each team's progress, but they were not a feasible option because they had to be maintained manually and regularly.
  • This led us to the much-needed single dashboard: the QA Automation Dashboard.

The QA Automation Dashboard is a tool that tracks automation-related metrics from multiple sources in a single place, not only at the team level but also at the pod level.

Initial Brainstorming

Questions we pondered on:

  • After the manual stories have been deployed to production, how do we determine the criteria for selecting stories that should be prioritized for automation?
  • How do we differentiate between the test cases that have been automated and those that have not?
  • How do we track which test cases were automated in specific sprints and maintain visibility into the progress of test automation efforts?
  • How do we arrange the automation suite based on priority?
  • How do we effectively present the status of test cases (automated, planned, not automatable, or deprecated) to a wider audience in a simplified manner?
  • How do we evaluate the overall automation coverage of the entire product at the end?

Solving all the above problems was not a short-term goal. We needed a sustainable solution to make it a culture across the organization. With a lot of brainstorming, we came up with multiple phases to take this up.

Phases of our journey

PHASE-1: Revisiting the User Stories & Test Cases in Jira

At Yubi we use Jira, a project management tool that helps teams plan, track, and manage tasks and issues throughout the software development process, enabling collaboration and efficient workflow management. During this phase, we reviewed all the user stories deployed to production along with their associated test cases. The following actions were taken to ensure proper visibility of automation coverage for these user stories and test cases.

  • Marking in Jira whether each story is automatable and can be picked up for automation (screenshot of the corresponding Jira field).
  • For each such user story, we select test cases to automate based on the priority defined by the QA team.
  • After automating a test case, we mark its Automation Status in Jira accordingly (screenshot of the Automation Status field).
  • To identify the sprint in which a test case was automated, we label the respective test cases with the sprint number, e.g. ASP_21 (Automation Sprint 21); a sketch of applying such a label programmatically follows this list.

By taking the above actions, we moved closer to projecting automated test cases and demonstrating coverage, as outlined in the following phases.
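To illustrate the idea (this is a hypothetical sketch, not our actual tooling), a label such as ASP_21 can be added to a test issue through the Jira Cloud REST API; the base URL, credentials, and issue key below are placeholders.

    # Hypothetical sketch: add an ASP_<sprint> label to a Jira test issue via the
    # Jira Cloud REST API. Base URL, credentials, and issue key are placeholders.
    import requests
    from requests.auth import HTTPBasicAuth

    JIRA_BASE_URL = "https://your-domain.atlassian.net"   # placeholder
    AUTH = HTTPBasicAuth("you@example.com", "API_TOKEN")  # placeholder credentials

    def tag_automation_sprint(issue_key: str, sprint_number: int) -> None:
        """Add an ASP_<n> label (e.g. ASP_21) to the given test issue."""
        label = f"ASP_{sprint_number}"
        response = requests.put(
            f"{JIRA_BASE_URL}/rest/api/2/issue/{issue_key}",
            json={"update": {"labels": [{"add": label}]}},
            auth=AUTH,
        )
        response.raise_for_status()

    # Example: mark a test case as automated in Sprint 21.
    tag_automation_sprint("ABC-123", 21)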

PHASE-2: Jira Queries for Automation Metrics

We developed custom queries in Jira using JQL (Jira Query Language) to pull metrics related to our automation status.

  • Query to fetch all test cases linked to the project(s): project in (abc, xyz) AND issuetype = Test
  • Query to fetch all test cases automated in Sprint 21, for example: project in (abc, xyz) AND labels in (ASP_21) (a sketch of pulling these counts via the Jira API follows this list).
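If you want to consume these numbers outside of Jira dashboards, here is a rough sketch, assuming the Jira Cloud search API and placeholder project keys and credentials, of pulling the two counts programmatically.

    # Hypothetical sketch: pull automation metrics from Jira via JQL and the REST API.
    # Base URL, credentials, and project keys are placeholders.
    import requests
    from requests.auth import HTTPBasicAuth

    JIRA_BASE_URL = "https://your-domain.atlassian.net"   # placeholder
    AUTH = HTTPBasicAuth("you@example.com", "API_TOKEN")  # placeholder credentials

    def count_issues(jql: str) -> int:
        """Return the number of issues matching a JQL query (maxResults=0 returns only the total)."""
        response = requests.get(
            f"{JIRA_BASE_URL}/rest/api/2/search",
            params={"jql": jql, "maxResults": 0},
            auth=AUTH,
        )
        response.raise_for_status()
        return response.json()["total"]

    total_tests = count_issues("project in (abc, xyz) AND issuetype = Test")
    automated_in_sprint_21 = count_issues("project in (abc, xyz) AND labels in (ASP_21)")
    print(f"Test cases: {total_tests}, automated in Sprint 21: {automated_in_sprint_21}")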

PHASE-3: Final Automation Dashboard

The Automation Dashboard provides a centralized view of automation-related information and metrics, facilitating monitoring and analysis of automation progress, test results, and overall coverage.

  • Since each test case is already marked with the appropriate Automation Status, we take the Jira query for all test cases stated above (project in (abc, xyz) AND issuetype = Test) and filter it into a pie chart based on Automation Status (Graph 1).
  • To track which test cases were automated in a specific sprint, we take the Jira query from the previous phase (project in (abc, xyz) AND labels in (ASP_21)) and filter it by project, priority, or Automation Status to get the corresponding pie chart; a sketch of reproducing such a chart is shown after this list.
  • Project-wise split for Sprint 21 (Graph 2).
  • Priority-wise split for Sprint 21 (Graph 3).
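As a hedged illustration of the same idea outside Jira, the sketch below fetches the matching test cases and plots the Automation Status breakdown as a pie chart. The Automation Status custom-field id (customfield_XXXXX), base URL, and credentials are placeholders that will differ per Jira instance.

    # Hypothetical sketch: group Jira test cases by Automation Status and plot a pie chart.
    # Base URL, credentials, and the custom-field id are placeholders.
    from collections import Counter

    import matplotlib.pyplot as plt
    import requests
    from requests.auth import HTTPBasicAuth

    JIRA_BASE_URL = "https://your-domain.atlassian.net"   # placeholder
    AUTH = HTTPBasicAuth("you@example.com", "API_TOKEN")  # placeholder credentials
    AUTOMATION_STATUS_FIELD = "customfield_XXXXX"         # placeholder custom-field id

    def fetch_field_values(jql: str, field: str) -> list:
        """Fetch one field for all issues matching the JQL (first 1000 issues for brevity)."""
        response = requests.get(
            f"{JIRA_BASE_URL}/rest/api/2/search",
            params={"jql": jql, "fields": field, "maxResults": 1000},
            auth=AUTH,
        )
        response.raise_for_status()
        values = []
        for issue in response.json()["issues"]:
            value = issue["fields"].get(field)
            # Select-list custom fields come back as {"value": "..."} objects.
            values.append(value["value"] if isinstance(value, dict) else str(value))
        return values

    counts = Counter(
        fetch_field_values("project in (abc, xyz) AND issuetype = Test", AUTOMATION_STATUS_FIELD)
    )

    # Pie chart of test cases by Automation Status, similar to the dashboard gadget.
    plt.pie(list(counts.values()), labels=list(counts.keys()), autopct="%1.1f%%")
    plt.title("Test cases by Automation Status")
    plt.show()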

PHASE-4: Maintaining the Stability of Automation Suite

We identified multiple reasons why maintaining the stability of the automation suite was of utmost importance:

  • Consistent and Accurate Results: A stable automation suite ensures consistent and accurate test results, fostering trust by minimizing false positives and false negatives.
  • Time and Cost Savings: A stable automation suite minimizes costs and time by reducing manual intervention, debugging, and rework, enhancing productivity.
  • Scalability and Maintainability: A stable suite simplifies scalability and maintenance, accommodating changes without disrupting existing tests and supporting the addition of new cases.
  • Continuous Integration and Delivery (CI/CD) Enablement: A stable automation suite supports smooth CI/CD integration, delivering quick feedback on software changes for faster release cycles.

After intense research on test reporting frameworks, we zeroed in on Allure Report:

  • Allure report is a powerful test reporting framework used in software testing.
  • It generates detailed and visually appealing reports with statistics, graphs, and interactive features.
  • The reports simplify the analysis of test execution results, making it easier to understand and interpret the outcomes.
  • Allure report enhances test result visualization and provides a user-friendly interface for effective reporting.
  • Below is an example of % automation vs. the number of scenarios/scripts (Allure report screenshot).
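For context, here is a small, hypothetical example, assuming pytest with the allure-pytest plugin, of how test scripts can be annotated so that Allure groups them into features, severities, and steps in the generated report.

    # Hypothetical pytest test annotated for Allure reporting; the feature,
    # severity, and steps are placeholders, not our actual suite.
    import allure

    @allure.feature("Login")
    @allure.severity(allure.severity_level.CRITICAL)
    def test_login_with_valid_credentials():
        with allure.step("Open the login page"):
            pass  # e.g. driver.get(LOGIN_URL) in a real UI test
        with allure.step("Submit valid credentials"):
            pass  # fill in the form and submit
        with allure.step("Verify the user lands on the dashboard"):
            assert True  # replace with a real assertion

    # Generate raw results:   pytest --alluredir=allure-results
    # Render the HTML report: allure serve allure-results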

What did this result in?

The new dashboard has unquestionably brought significant ease and benefits for everyone, including:

  • Accurate display of automated test cases for the current or previous sprint.
  • Elimination of manual spreadsheet tracking.
  • Reduction of repetitive tasks involved in tracking multiple projects and gathering data.
  • Provision of instant visibility and overview.
  • Enhancement of decision-making processes.
  • Presentation of automation coverage percentage by comparing the total number of automated test cases to the overall test case count.
  • Proper segregation of automated test case coverage based on pods/projects.
  • Clear identification of test cases to be selected for manual regression testing.
  • Prioritization of test cases based on their importance.