As testers complete your live maze, you will start seeing insights on the Results dashboard.
The results for your prototype test blocks are displayed in three sections:
Usability metrics
At the top of your mission results, you'll see an overview of how testers performed:
- Direct success: Percentage of missions completed via an expected path
- Mission unfinished: Percentage of testers who ended the mission on the wrong screen, or abandoned the maze
- Misclick rate: Average percentage of clicks outside of a hotspot. In a live product, a misclick would have taken the user to an “incorrect” page.
- Average duration: Average time testers took to complete the mission
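These metrics amount to simple aggregates over per-tester sessions. The Python sketch below is purely illustrative: the record fields and the averaging shown are assumptions for the example, not Maze's documented internals.

```python
from dataclasses import dataclass

# Hypothetical per-tester session records; the field names are
# illustrative, not Maze's actual data model.
@dataclass
class Session:
    outcome: str       # "direct", "indirect", or "unfinished"
    clicks: int        # total clicks recorded during the mission
    misclicks: int     # clicks that landed outside any hotspot
    duration_s: float  # time to finish (or abandon) the mission, seconds

sessions = [
    Session("direct", 12, 1, 34.0),
    Session("indirect", 20, 6, 58.5),
    Session("unfinished", 9, 4, 41.2),
]

n = len(sessions)
direct_success = sum(s.outcome == "direct" for s in sessions) / n * 100
unfinished = sum(s.outcome == "unfinished" for s in sessions) / n * 100
# Misclick rate here is the mean of each tester's own misclick percentage
# (an assumption about how the average is taken).
misclick_rate = sum(s.misclicks / s.clicks for s in sessions) / n * 100
avg_duration = sum(s.duration_s for s in sessions) / n

print(f"Direct success: {direct_success:.0f}%")
print(f"Mission unfinished: {unfinished:.0f}%")
print(f"Misclick rate: {misclick_rate:.1f}%")
print(f"Average duration: {avg_duration:.1f}s")
```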
Mission results
Mission results data is aggregated by path or by screen. To toggle between the two views, use the Prototype test results aggregated by dropdown.
Aggregated path analysis
When you open your Results dashboard, you'll see the aggregated paths view by default. This view groups the paths taken by testers based on the mission outcome:
- Direct success: Testers who completed the mission via the expected path(s)
- Indirect success: Testers who completed the mission via unexpected path(s), but still reached the final screen
- Mission unfinished: Testers who ended the mission on the wrong screen, or abandoned the maze
Under each tab, you'll see the paths corresponding to each outcome. The path overview includes the number of testers who took that path, as well as the average time and average misclick rate for all screens within the path.
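Conceptually, this view is a group-by over the sequence of screens each tester visited. Here's a rough Python sketch of that grouping, using made-up tester records and field names:

```python
from collections import defaultdict

# Hypothetical tester records: (path taken, duration in seconds, misclick %).
testers = [
    (("Home", "Search", "Checkout"), 34.0, 4.2),
    (("Home", "Search", "Checkout"), 41.5, 0.0),
    (("Home", "Menu", "Search", "Checkout"), 58.0, 11.0),
]

# Group testers by the exact sequence of screens they visited.
groups = defaultdict(list)
for path, duration, misclick_pct in testers:
    groups[path].append((duration, misclick_pct))

# Each path's overview: tester count, average time, average misclick rate.
for path, stats in groups.items():
    count = len(stats)
    avg_time = sum(d for d, _ in stats) / count
    avg_misclick = sum(m for _, m in stats) / count
    print(f"{' → '.join(path)}: {count} tester(s), "
          f"avg {avg_time:.0f}s, avg misclick {avg_misclick:.1f}%")
```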
Hover over each path and click View path heatmaps to see the combined heatmaps from the testers who took that path. If you've enabled Clips on your mission, you'll also be able to see your recordings here.
Learn more about heatmaps in Maze
Screen analysis
Your results data can also be grouped by screen.
Click any screen to see a heatmap of your testers' interactions on that screen.
Tester paths
Under Tester paths, you can see heatmaps and other details for each individual tester:
- ID: The individual tester's ID
- Clips insights: Whether there are saved screen and/or audio and video recordings from the mission
- Outcome: Whether the tester achieved a direct success, an indirect success, or left the mission unfinished
- Duration: How long it took the tester to complete the mission
- Misclicks: The number of clicks the tester made on a non-clickable area (i.e. outside a hotspot)
- Misclicked pages: The number of pages on which the tester misclicked
- Tester path: A thumbnail preview of the first four screens of the path the tester took
- Tested at: The date and time of the test
Click each row to see the individual tester's responses to the maze, as well as the specific path they took in the mission.
How many testers do I need?
In quantitative studies, the more users you test with, the more reliable your results will be. As a general rule, we recommend testing with at least 20 users.
Having more testers can be particularly helpful when doing card sorting: with more testers, agreement rates become easier to interpret, and more meaningful patterns start emerging in the data.
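To make "the more testers, the better" concrete, standard sampling math (not a Maze feature) shows how the uncertainty around a measured success rate shrinks as the panel grows:

```python
from math import sqrt

# Illustrative only: a normal-approximation 95% confidence interval for a
# measured success rate, showing how the interval narrows with panel size.
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    return z * sqrt(p * (1 - p) / n)

for n in (5, 20, 100):
    moe = margin_of_error(0.70, n)
    print(f"n={n:>3}: 70% success rate, 95% interval ≈ ±{moe * 100:.0f} points")
# n=  5: ±40 points; n= 20: ±20 points; n=100: ±9 points
```

At 20 testers, a measured 70% direct-success rate still carries a margin of roughly ±20 percentage points, which is why larger panels help when you need to compare designs with confidence.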
Learn more about the ideal number of testers
Filter your results
When reviewing your results, it can be helpful to refine the data you're seeing to help you answer specific questions. Filters allow you to narrow down your results data based on responses to specific blocks.
Learn more about results filtering
Reports
Maze reports make it easy to analyze, share, and present your results data. Reports are automatically generated for every live maze tested with at least one tester.
Export your results
To export your results data, open the More menu (•••) and click Export as CSV file or Export as image.
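Once exported, the CSV can be pulled into any analysis tool. Below is a minimal sketch using pandas; the filename and any column names are hypothetical, so check the header row of your own export for the real ones.

```python
import pandas as pd

# Load an exported results file. "maze-results.csv" is a placeholder name.
df = pd.read_csv("maze-results.csv")

# Inspect the actual columns before doing any analysis.
print(df.columns.tolist())
print(df.head())

# Example follow-up once you know the real column names, e.g. comparing
# average duration by mission outcome:
# print(df.groupby("outcome")["duration"].mean())
```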