I designed a new software QA platform as part of a key business initiative, enabling GAT to move the majority of its manual testing onto technology.
🏆 Over 50% of tests run by GAT are completed on the new platform
🏆 A reduction of ~35% in the manual effort required by Ops to manage tests
A key objective for the business was to free up capacity in the Ops team so they could spend more time on value-add services for customers. Interviewing stakeholders revealed the problem we needed to solve: for many of our customers, the existing testing process required the Ops team to carry out slow, manual test management work that relied on spreadsheets for gathering and communicating results.
The problem could be framed as a How Might We statement: how might we increase the percentage of GAT tests run using technology, in order to reduce Ops overhead? It was also clear that the scope of this problem was very large, since moving test execution onto technology would impact internal systems, the customer-facing UI, and the tester-facing UI.
Succeeding in this project required a deep understanding of the existing processes and of the requirements for managing tests through to completion. I interviewed numerous stakeholders from Ops to understand exactly what goes into running a test.
Based on the insights gathered from the user interviews, I worked with the other designers, PMs, and tech leads to build a user story map covering every stage of running a test and the steps within each. This map allowed us to identify what capabilities the platform would need, as well as opportunities for optimisation and automation.
The map covered a wide array of user journeys crossing between various systems and products.
I began wireframing high-level flows and shared them for feedback within the Product & Engineering team as well as Ops. I used this feedback to iterate, then began sketching the specific screens we would need to display.
I created a series of prototypes in Figma, ran user testing sessions with Ops team members, and iterated on their feedback. It was especially helpful to learn which information they prioritise most when reviewing the progress of a test. This allowed me to keep the UI simple, displaying only the required information at each stage.
When the platform was released, adoption was not immediate. Using it required test managers to be trained on new internal processes that, in some cases, differed significantly from those they were used to. We continued to gather feedback long after the initial release and kept iterating, both to improve ease of use and to increase the percentage of tests that could be launched and managed on the platform.