As testers complete your live maze, you will start seeing insights on the Results dashboard.
The results for your live website testing blocks are displayed in three sections:
Before you start
- Clicks on a website test are captured as heatmaps. Non-click interactions (e.g. hover, drag) are not shown on the heatmap.
- We take only one snapshot per page, when the user first enters it. For this reason, overlays and modals are not captured in the heatmap.
Usability metrics
In this section, you'll see a summary containing the following metrics:
- Mission completed: Percentage of testers who completed the mission
- Expected paths followed: Percentage of testers who followed an expected path
- Mission unfinished: Percentage of testers who abandoned the mission
- Average duration: Average time testers took to complete the mission
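For illustration, here is roughly how these summary metrics could be derived from per-session data. This is a minimal sketch with made-up field names (`completed`, `followed_expected_path`, `duration_seconds`), not Maze's actual data model:

```python
# Minimal sketch of how the summary metrics could be computed from
# per-session results. Field names are hypothetical, not Maze's schema.

sessions = [
    {"completed": True,  "followed_expected_path": True,  "duration_seconds": 42.0},
    {"completed": True,  "followed_expected_path": False, "duration_seconds": 75.5},
    {"completed": False, "followed_expected_path": False, "duration_seconds": 120.0},
]

total = len(sessions)
completed = [s for s in sessions if s["completed"]]

mission_completed = len(completed) / total * 100
expected_paths_followed = sum(s["followed_expected_path"] for s in sessions) / total * 100
mission_unfinished = 100 - mission_completed
# Average time over testers who completed the mission.
average_duration = sum(s["duration_seconds"] for s in completed) / len(completed)

print(f"Mission completed: {mission_completed:.0f}%")
print(f"Expected paths followed: {expected_paths_followed:.0f}%")
print(f"Mission unfinished: {mission_unfinished:.0f}%")
print(f"Average duration: {average_duration:.1f}s")
```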
Mission results
Results data is aggregated by path or by screen. To switch between the two, click the Live website test results aggregated by dropdown.
Aggregated path analysis
When you open your Results dashboard, you'll see the aggregated paths view by default. This view groups the paths taken by testers based on the mission outcome:
- Direct success: Testers who completed the mission via the expected path(s).
- Indirect success: Testers who completed the mission via unexpected path(s), but still reached the final page.
- Mission unfinished: Testers who abandoned the mission, or finished it on the wrong screen.
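To make the relationship between these outcomes and your expected paths concrete, the classification could be sketched like this. The logic and data shapes here are hypothetical illustrations, not Maze's implementation:

```python
# Sketch of outcome classification for one tester's path.
# `path` and `expected_paths` are lists of screen URLs; all names are
# hypothetical, not Maze's actual data model.

def classify_outcome(path, expected_paths, final_screen, gave_up=False):
    if gave_up or not path or path[-1] != final_screen:
        return "Mission unfinished"   # abandoned, or ended on the wrong screen
    if path in expected_paths:
        return "Direct success"       # completed via an expected path
    return "Indirect success"         # completed via an unexpected path

expected = [["/home", "/pricing", "/signup"]]
print(classify_outcome(["/home", "/pricing", "/signup"], expected, "/signup"))
# -> Direct success
print(classify_outcome(["/home", "/blog", "/signup"], expected, "/signup"))
# -> Indirect success
print(classify_outcome(["/home", "/blog"], expected, "/signup"))
# -> Mission unfinished
```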
Under each tab, you'll see the paths corresponding to that outcome, along with the number of testers who met it.
Hover over each path and click View path heatmaps to see the combined heatmaps from the testers who took that path. If you've enabled Clips on the mission, you'll also be able to see your recordings here.
Learn more about heatmaps in Maze
Screen analysis
Your results data can also be grouped by the website screens your testers saw during the website test.
Click any screen to see a heatmap of your testers' interactions with it.
Tester paths
Under Tester paths, you can see information about each individual testing session:
- ID: The individual tester's ID
- Clips insights: Whether there are saved screen, audio, or video recordings from the mission
- Outcome: Whether the mission resulted in a direct or indirect success, or the tester gave up
- Duration: How long it took the tester to complete the mission
- Misclicks: The number of clicks the tester made on a non-clickable area (i.e. outside a hotspot)
- Misclick pages: The number of screens on which the tester misclicked
- Tester path: A thumbnail preview of the first four screens of the path the tester took
- Tested at: The date and time of the test
Click each row to see the individual tester's responses to the maze, as well as the specific path they took on the mission.
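To make the Misclicks and Misclick pages columns concrete, here is one way those two numbers could be counted from raw click events. The event shape (`screen`, `on_hotspot`) is invented for illustration:

```python
# Hypothetical click events for one tester: (screen, clicked_on_hotspot).
clicks = [
    ("/home",    True),
    ("/home",    False),  # misclick: outside any hotspot
    ("/pricing", False),  # misclick
    ("/pricing", False),  # misclick
    ("/signup",  True),
]

misclick_screens = [screen for screen, on_hotspot in clicks if not on_hotspot]

print("Misclicks:", len(misclick_screens))            # 3 clicks outside a hotspot
print("Misclick pages:", len(set(misclick_screens)))  # 2 distinct screens with misclicks
```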
How many testers do I need?
In quantitative studies, the more users you test with, the more likely your results are to be accurate. As a rule of thumb, we recommend testing with at least 20 users.
Having more testers can be particularly helpful when doing card sorting: as more testers participate, agreement rates become easier to interpret, and you start seeing more meaningful patterns in the data.
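One way to see why sample size matters: with a standard normal-approximation confidence interval, the uncertainty around an observed completion rate shrinks as the number of testers grows. This is generic statistics, not a Maze feature:

```python
import math

def completion_rate_ci(successes, n, z=1.96):
    """95% normal-approximation confidence interval for a completion rate."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# The same 70% observed completion rate is far less certain with 10 testers
# than with 20 or 100: the interval narrows as n grows.
for n in (10, 20, 100):
    low, high = completion_rate_ci(round(0.7 * n), n)
    print(f"n={n:3d}: 95% CI is roughly {low:.0%} to {high:.0%}")
```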
Learn more in this article: How many testers do I need?
Filter your results
When reviewing your results, it can be helpful to refine the data you're seeing to help you answer specific questions. Filters allow you to narrow down your results data based on responses to specific blocks.
Learn more about results filtering
Reports
Maze reports make it easy to analyze, share, and present your results data. Reports are automatically generated for every live maze with at least one tester.
Export your results
To export your results data, open the More menu (•••) and click Export as CSV file or Export as image.
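Once exported, the CSV can be analyzed with any standard tooling. For example, a quick pass with pandas might look like the sketch below; the column names are placeholders, so check your actual export for the real headers:

```python
import pandas as pd

# Load the exported results. Column names below are hypothetical
# placeholders; inspect your own CSV export for the actual headers.
df = pd.read_csv("maze_results_export.csv")

print(df["outcome"].value_counts(normalize=True))  # share of each mission outcome
print("Average duration (s):", df["duration_seconds"].mean())
print("Average misclicks:", df["misclicks"].mean())
```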