The usability scores in your maze report are not intended as an interpretation of the design, but as a way to measure the ease of use of the screens, missions, and maze.
The usability score is a number from 0 to 100. You get a usability score for every screen, mission, and maze.
Screen Usability Score (SCUS)
You get a usability score for every screen in the expected paths. The score reflects how easy it is for a user to perform a given task (mission) with your prototype. A high usability score indicates the design will be easy to use and intuitive.
Principles used to compute the SCUS
You lose usability points when a user:
- Clicks a hotspot other than the expected ones. This means the user got off the expected path(s); in a live product, this results in frustration or a lost user.
- Gives up on a mission. This is a clear indication something isn't right and should be checked.
- Misclicks. A misclick is a click on a non-hotspot/clickable area. In a live product, a misclick would take the user to an "incorrect" page, which leads to (1) above. In this context, a misclick doesn't send testers anywhere, since it's a click outside a clickable area.
- Spends too much time on a screen. Understandably, there are types of pages in a live product where you'd want the user to spend a lot of time, e.g., on a blog article or on an About us page. But in a prototype, too much time spent on a screen indicates something is wrong and needs to be improved.
How these principles are translated into data
- (1) and (2) are expressed by the drop-off and give-up rates. Both are equally important. For every percent of users dropping off or giving up on a mission, you lose 1 usability point.
- (3) is expressed by the misclick rate. Not every misclick indicates a wrong action, so for every percent of misclick rate, you lose 0.5 usability points (see the worked example after this list).
- (4) is expressed by average duration in the following way:
- From 0 to 5 seconds: No usability points lost
- From 5 to 25 seconds: 1 usability point lost every 2 seconds
- From 25 seconds onwards: 10 usability points lost
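To make these rules concrete, take a hypothetical screen with a 12% drop-off/give-up rate, a 10% misclick rate, and an average duration of 9 seconds: it loses 12 + (10 * 0.5) + ((9 - 5) / 2) = 12 + 5 + 2 = 19 points, for a screen usability score of 81.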
SCUS Formula
SCUS = MAX(0, 100 - (DOR * dW) - (MCR * mW) - MIN(10, MAX(0, (AVGD - 5) / 2)))
Which has these variables:
- SCUS for Screen Usability Score
- DOR for the combined drop-off and give-up (bounce) rate
- dW for the DOR weight; dW equals 1 point for every percent of drop-off / give-up
- MCR for misclick rate
- mW for the MCR weight; mW equals 0.5 points for every percent of misclick rate
- AVGD for Average Duration in seconds
And these functions:
- MAX: MAX(VALUE, {EXPRESSION}) => returns the maximum of VALUE and EXPRESSION
- MIN: MIN(VALUE, {EXPRESSION}) => returns the minimum of VALUE and EXPRESSION
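The same calculation can be written as a minimal Python sketch, assuming the rates are percentages (0 to 100) and the duration is in seconds; the function and variable names are illustrative, not part of Maze's product or API:

def screen_usability_score(drop_off_rate, misclick_rate, avg_duration):
    # 1 point lost per percent of drop-off / give-up, 0.5 points per percent of misclicks
    d_weight, m_weight = 1.0, 0.5
    # Duration penalty: 0 under 5 seconds, 1 point per 2 seconds between 5 and 25 seconds, capped at 10
    duration_penalty = min(10, max(0, (avg_duration - 5) / 2))
    score = 100 - drop_off_rate * d_weight - misclick_rate * m_weight - duration_penalty
    return max(0, score)

print(screen_usability_score(12, 10, 9))  # 100 - 12 - 5 - 2 = 81.0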
Mission Usability Score (MIUS)
You get a usability score for every mission in your maze. As with the SCUS, the mission score reflects how easy it is for a user to perform a task (mission) with your prototype. A high usability score indicates the finished product will be usable, intuitive, and efficient.
Principles used to compute the MIUS
- Direct success should be strongly correlated to the mission usability score.
- Indirect success shouldn't be considered a failure.
- The average usability metrics should impact the mission score.
How these principles are translated into data
- Each percent of direct success rate is worth 1 point.
- Each percent of indirect success rate is worth 0.5 points.
- The average misclick and duration penalties applied at the screen level are also applied to the mission. For instance, a mission with a 100% success rate but an average 50% misclick rate on every screen loses 25 points, giving a MIUS of 75 out of 100.
MIUS Formula
MIUS = DSR + (IDSR / 2) - avg(MC_P) - avg(DU_P)
Which has these variables:
- DSR for Direct Success Rate
- IDSR for Indirect Success Rate
- avg for the average
- MC_P for the misclick penalty = MCR * 0.5
- DU_P for the duration penalty = MIN(10, MAX(0, (AVGD - 5) / 2))
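As with the SCUS, here's a minimal Python sketch of the MIUS calculation, assuming the success and misclick rates are percentages and the per-screen durations are in seconds; the names are illustrative, not part of Maze's API:

def mission_usability_score(direct_success_rate, indirect_success_rate,
                            screen_misclick_rates, screen_avg_durations):
    # Average misclick penalty across the mission's screens (0.5 points per percent)
    avg_mc_penalty = sum(r * 0.5 for r in screen_misclick_rates) / len(screen_misclick_rates)
    # Average duration penalty across the mission's screens (same rule as for the SCUS)
    avg_du_penalty = sum(min(10, max(0, (d - 5) / 2)) for d in screen_avg_durations) / len(screen_avg_durations)
    return direct_success_rate + indirect_success_rate / 2 - avg_mc_penalty - avg_du_penalty

# Example from the text: 100% direct success, 50% misclicks on every screen,
# and (assumed) average durations under 5 seconds -> 75.0
print(mission_usability_score(100, 0, [50, 50, 50], [4, 4, 4]))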
Maze Usability Score (MAUS)
You get a usability score for every live maze you test. A mission's usability score (MIUS) doesn't have an impact on the other missions' scores. The Maze Usability Score (MAUS) is thus an average of the usability scores for every mission in the maze.
MAUS Formula
MAUS = avg(MIUS)
Which has these variables:
- MAUS for Maze Usability Score
- avg for the average
- MIUS for Mission Usability Score
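Since the maze score is just the average of its mission scores, the Python sketch is a one-liner (the sample scores below are made up):

def maze_usability_score(mission_scores):
    # MAUS is the plain average of all mission usability scores in the maze
    return sum(mission_scores) / len(mission_scores)

print(maze_usability_score([75.0, 81.0, 90.0]))  # 82.0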