As testers complete your live maze, you will start seeing insights on the Results dashboard.
The results for your prototype test blocks are displayed in three sections:
Usability metrics
At the top of your mission results, you'll see an overview of how testers performed:
- Direct success: Percentage of missions completed by an expected path
- Mission unfinished: Percentage of testers who ended the mission on the wrong screen, or abandoned the maze
- Misclick rate: Average percentage of clicks outside a hotspot. In a live product, a misclick would have taken the user to an “incorrect” page.
- Average duration: Average time testers took to complete the mission
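As a rough illustration of how these overview metrics relate to raw responses, here is a minimal sketch that computes them from hypothetical session records. The data structure and field names are assumptions for the example only, not Maze's actual data format:

```python
# Hypothetical session records; field names are illustrative only.
sessions = [
    {"outcome": "direct", "clicks": 10, "misclicks": 1, "seconds": 42.0},
    {"outcome": "indirect", "clicks": 14, "misclicks": 4, "seconds": 65.5},
    {"outcome": "unfinished", "clicks": 6, "misclicks": 3, "seconds": 30.0},
    {"outcome": "direct", "clicks": 9, "misclicks": 0, "seconds": 38.5},
]

n = len(sessions)
# Direct success: share of missions completed via an expected path.
direct_success = sum(s["outcome"] == "direct" for s in sessions) / n * 100
# Mission unfinished: share of testers who ended on the wrong screen or gave up.
unfinished = sum(s["outcome"] == "unfinished" for s in sessions) / n * 100
# Misclick rate: average per-session percentage of clicks outside a hotspot.
misclick_rate = sum(s["misclicks"] / s["clicks"] for s in sessions) / n * 100
# Average duration: mean time taken to complete the mission.
avg_duration = sum(s["seconds"] for s in sessions) / n

print(f"Direct success: {direct_success:.0f}%")    # Direct success: 50%
print(f"Mission unfinished: {unfinished:.0f}%")    # Mission unfinished: 25%
print(f"Misclick rate: {misclick_rate:.1f}%")      # Misclick rate: 22.1%
print(f"Average duration: {avg_duration:.1f}s")    # Average duration: 44.0s
```

Note that the misclick rate averages each tester's own percentage rather than pooling all clicks together, which matches the "average percentage of clicks" wording above.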
Mission results
Mission results data is aggregated by path or by screen. To switch between the two views, use the Prototype test results aggregated by dropdown.
Aggregated path analysis
When you open your Results dashboard, you'll see the aggregated paths view by default. This view groups the paths taken by testers based on the mission outcome:
- Direct success: Testers who completed the mission via the expected path(s)
- Indirect success: Testers who completed the mission via unexpected path(s), but still reached the final screen
- Mission unfinished: Testers who ended the mission on the wrong screen, or abandoned the maze
Under each tab, you'll see the paths corresponding to each outcome. The path overview includes the number of testers who took that path, as well as the average time and average misclick rate for all screens within the path.
Hover over each path and click View heatmaps to see the combined heatmaps from the testers who took that path. Learn more about heatmaps in Maze
If you've enabled Clips, you'll also be able to see the participant recordings here.
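To make the per-path statistics concrete, here is a small sketch of how responses could be grouped into paths, each with its tester count, average time, and average misclick rate. The screen names and numbers are made up for illustration:

```python
from collections import defaultdict

# Hypothetical tester responses; screen names and values are illustrative only.
responses = [
    {"path": ("Home", "Search", "Checkout"), "seconds": 40, "misclick_pct": 5.0},
    {"path": ("Home", "Search", "Checkout"), "seconds": 50, "misclick_pct": 15.0},
    {"path": ("Home", "Menu", "Checkout"), "seconds": 70, "misclick_pct": 25.0},
]

# Group responses by the exact sequence of screens visited.
groups = defaultdict(list)
for r in responses:
    groups[r["path"]].append(r)

# Each path's overview: tester count, average time, average misclick rate.
for path, rs in groups.items():
    avg_time = sum(r["seconds"] for r in rs) / len(rs)
    avg_misclick = sum(r["misclick_pct"] for r in rs) / len(rs)
    print(" → ".join(path),
          f"| {len(rs)} testers | {avg_time:.0f}s avg | {avg_misclick:.0f}% misclicks")
```

Grouping on the full screen sequence is what distinguishes the path view from the screen view: two testers belong to the same path only if they saw exactly the same screens in the same order.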
Screen analysis
Results data can also be grouped by screen.
Click each screen to see the heatmap of your testers' interactions for that particular screen.
Responses
Continue scrolling down the Results page until you reach the Responses section, where you can view heatmaps and additional information for each tester.
- Clips: Whether the mission has saved screen, audio, or video recordings
- Participant: The individual tester's ID
- Outcome: Whether the mission resulted in a direct success, an indirect success, or went unfinished
- Duration: How long it took the tester to complete the mission
- Tester path: A thumbnail preview of the first four screens of the path the tester took
- Responded at: The date and time of the test
Click each row to see the individual tester's responses to the maze, as well as the specific path they took during the mission.
How many testers do I need?
The number of testers you need depends on the type of research you're running. We recommend testing with at least 20 testers for unmoderated studies and 5–10 for moderated studies.
Learn more about the ideal number of testers
Filter your results
When reviewing your results, it can be helpful to refine the data to answer specific questions. Filters allow you to narrow down your results based on responses to specific blocks.
Learn more about results filtering
Reports
Maze reports make it easy to analyze, share, and present your results data. Reports are automatically generated for every live maze with at least one tester response.
Export your results
To export your results data, open the More menu (•••) and click Export as CSV file or Export as image.
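If you analyze the CSV export elsewhere, Python's standard csv module can read it directly. This is a minimal sketch; the column names below are placeholders, so check the header row of your actual export before adapting it:

```python
import csv
import io

# A stand-in for an exported file; a real Maze export will have its own columns.
sample = io.StringIO(
    "participant,outcome,duration_seconds\n"
    "tester-1,direct,42\n"
    "tester-2,unfinished,30\n"
    "tester-3,indirect,65\n"
)

# DictReader maps each row to the header names, so columns are accessed by name.
rows = list(csv.DictReader(sample))
completed = [r for r in rows if r["outcome"] != "unfinished"]
print(f"{len(completed)} of {len(rows)} testers completed the mission")
```

Reading the export by column name rather than position keeps the script working even if the export gains extra columns later.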