Getting high-quality data from your unmoderated studies is essential for making confident decisions. At Maze, we’ve built several features to help ensure participants stay engaged and provide thoughtful responses.
Our approach to quality
Quality participant data starts with careful matching and continues through every step of your study. We combine automated detection with tools that give you visibility and control over your data quality.
Panel partner verification and fraud prevention
Before participants ever reach your study, our recruitment partners use sophisticated systems to verify identities and detect fraudulent behavior. These industry-leading platforms employ multiple layers of protection:
- Identity verification: Participants go through verification checks during signup, including identity confirmation, contact information validation, and screening for duplicate accounts
- Ongoing monitoring: Continuous fraud detection systems track behavior patterns across entire participant pools, flagging suspicious activity like bot-like responses or profile inconsistencies
- Quality scoring: Machine learning models analyze participant history, approval ratings, and engagement patterns to maintain high-quality pools and remove poor performers
- Regular re-verification: Periodic identity checks and profile updates ensure participants remain legitimate throughout their lifecycle on the platform
These behind-the-scenes safeguards mean fraudulent actors are typically caught before they can apply to your study, maintaining the integrity of our participant pool.
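If it helps to picture how these layers fit together, here's a minimal sketch in TypeScript. Everything in it (names, data shapes, thresholds) is a hypothetical stand-in; our partners' actual systems are proprietary and far more sophisticated.

```typescript
// Minimal sketch of layered participant vetting. Every name, data shape,
// and threshold here is hypothetical; the partners' real systems are
// proprietary and far more sophisticated.
interface ParticipantProfile {
  emailVerified: boolean;      // identity confirmation
  phoneVerified: boolean;      // contact information validation
  duplicateOf: string | null;  // set if matched to an existing account
  approvalRate: number;        // share of past submissions accepted (0 to 1)
  completedStudies: number;
}

// Each layer can independently reject a participant before they
// ever reach a study.
function passesVetting(p: ParticipantProfile): boolean {
  if (!p.emailVerified || !p.phoneVerified) return false; // identity checks
  if (p.duplicateOf !== null) return false;               // duplicate screening
  // Quality scoring: a simple rule of thumb standing in for the ML
  // models described above.
  if (p.completedStudies >= 5 && p.approvalRate < 0.8) return false;
  return true;
}
```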
Speeders Detection: Catching rushed responses
What are speeders?
Speeders are participants who complete your study unusually quickly, indicating they likely aren’t taking time to read questions or provide thoughtful answers. When someone rushes through a test in a fraction of the expected time, it can compromise the reliability of your results.
How Speeders Detection works
Our system monitors completion times in real time and compares them against expected durations for each part of your study:
During the study:
- As participants move through your test, we track how long they spend on each section.
- If someone completes sections in less than 10% of the expected time, they’ll see a prompt encouraging them to slow down and provide thoughtful responses.
- Participants can dismiss this warning and continue, but it helps remind them to engage meaningfully with your content.
After completion:
- Participants who complete the entire study unusually quickly are automatically flagged in your responses.
- You’ll see a clear visual indicator next to flagged participants, making them easy to identify.
- You have full transparency into who was flagged and why, so you can review their responses and decide how to handle their data.
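To make the logic concrete, here's a minimal sketch of this kind of speed check. The 10% threshold for individual sections comes from the description above; the data shapes are hypothetical, and applying the same threshold to the study as a whole is an illustrative assumption.

```typescript
// Minimal sketch of the speed check. The 10% section threshold comes from
// the description above; the data shapes are hypothetical, and reusing the
// same threshold for the whole study is an assumption for illustration.
interface SectionTiming {
  expectedSeconds: number; // expected duration for this section
  actualSeconds: number;   // time the participant actually spent
}

const SPEEDER_THRESHOLD = 0.1; // less than 10% of the expected time

// During the study: prompt the participant to slow down on a
// too-fast section (the warning is dismissible).
function shouldPromptToSlowDown(section: SectionTiming): boolean {
  return section.actualSeconds < section.expectedSeconds * SPEEDER_THRESHOLD;
}

// After completion: flag participants whose total time across all
// sections was unusually fast.
function isFlaggedAsSpeeder(sections: SectionTiming[]): boolean {
  const expected = sections.reduce((sum, s) => sum + s.expectedSeconds, 0);
  const actual = sections.reduce((sum, s) => sum + s.actualSeconds, 0);
  return actual < expected * SPEEDER_THRESHOLD;
}
```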
What you can do
When you spot a flagged participant:
- Review their responses to assess quality.
- Report the participant if their answers appear low-quality or fraudulent. For panel participants reported within 72 hours, we’ll automatically find a replacement at no extra cost.
- If a participant’s responses look okay, click “Looks good” to dismiss the banner.
This feature is currently available for unmoderated panel studies.
Participant reporting and replacements
Beyond automated detection, you have additional tools to ensure participant response quality:
- Manual reporting: Report any participant who you believe provided low-quality responses by going to Results → Participants → Report Participant. Participants reported more than twice are blocked from our platform.
- Automatic replacements: Report an unmoderated panel participant within 72 hours of their submission, and we’ll automatically source a replacement at no additional cost.
- After 72 hours: You’ll still receive a refund for reported participants, but you’ll need to place a new panel order for a replacement.
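As a concrete illustration, the 72-hour rule boils down to a simple time comparison. In this minimal sketch, the window comes from the description above, while the function and outcome names are hypothetical.

```typescript
// Minimal sketch of the reporting window. The 72-hour window comes from
// the description above; the function and outcome names are hypothetical.
const REPLACEMENT_WINDOW_HOURS = 72;

type ReportOutcome = "auto-replacement" | "refund-only";

function outcomeForReport(submittedAt: Date, reportedAt: Date): ReportOutcome {
  const hoursSinceSubmission =
    (reportedAt.getTime() - submittedAt.getTime()) / (1000 * 60 * 60);
  // Within 72 hours: a replacement is sourced automatically at no extra cost.
  // After 72 hours: the report is refunded, but a new panel order is needed.
  return hoursSinceSubmission <= REPLACEMENT_WINDOW_HOURS
    ? "auto-replacement"
    : "refund-only";
}
```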
Best practices for quality data
To get the most from these quality features:
- Review flagged participants shortly after study completion to catch issues within the 72-hour replacement window.
- Look for patterns in flagged responses: multiple speeders might indicate issues with study length or clarity.
- Set clear expectations in your study introduction about the importance of thoughtful participation.
- Use attention checks strategically to validate engagement alongside automated detection.
- Enable think-out-loud education for studies where you want participants to verbalize their reasoning as they complete tasks.
Questions?
If you notice unusual patterns in participant quality or have questions about specific responses, our support team is here to help.