🔹 Introduction
After setting clear goals, preparing a solid test plan, and
recruiting the right users, it’s time to conduct the usability test. This
chapter focuses on the actual execution of the usability test, covering
moderation techniques, how to observe user behavior, capturing data, maintaining
neutrality, and ensuring ethical and effective sessions.
Your job during a usability test is to facilitate—not
influence—so participants can naturally interact with your product. The more
natural and unbiased the session, the more authentic the feedback.
🔹 The Usability Testing Workflow
✅ General Phases of a Usability Test Session
A session generally moves through the same phases: environment setup, welcome and consent, warm-up questions, task completion, debrief, and wrap-up.
🔹 Setting the Stage
🧑‍💻 Environment Setup
Prepare the environment in advance, whether the session runs in person or remotely: a quiet space, working devices, and a stable connection and tested software for remote calls.
🛠 Tools to Have Ready
| Tool Type | Recommended Tools |
| --- | --- |
| Screen sharing | Zoom, Microsoft Teams, Google Meet |
| Session recording | Lookback, OBS, Maze |
| Note-taking | Notion, Airtable, Miro, Google Docs |
| Timer | UXtweak, Maze, stopwatch app |
🔹 Moderator Roles and Responsibilities
A usability session typically includes a moderator who guides the participant, the participant themselves, and often one or more silent observers or note-takers.
Moderator Best Practices
🔹 Facilitating the Test
✅ 1. Greeting and Rapport Building
Start with friendly conversation to put participants at
ease. Confirm tech setup (sound, video, screen share).
Example Script:
"Thanks for joining today. We'll be walking through
some tasks—this isn't a test of your abilities, but of our design."
✅ 2. Informed Consent
Explain recording, confidentiality, and participant rights.
Get verbal or written agreement before proceeding.
✅ 3. Warm-Up Questions
Ask a few basic demographic or behavior-related questions. These insights provide context for the participant's behavior during the session.
✅ 4. Delivering Tasks
Present one task at a time. Example:
“Imagine you're a freelancer looking to track your
income—try finding the dashboard where you can view monthly earnings.”
Tips: read the task aloud, confirm the participant understands it, and avoid hinting at the solution or where to click.
✅ 5. Encouraging the Think-Aloud Protocol
The think-aloud method gives insight into user cognition: ask participants to verbalize what they are doing, expecting, and feeling as they work.
Prompt examples: "What are you thinking right now?" or "What did you expect to happen when you clicked that?"
✅ 6. Managing Difficult Situations

| Challenge | How to Handle |
| --- | --- |
| User is silent or nervous | Reassure and encourage without pressure |
| User asks for help | Respond with "What would you do if I weren't here?" |
| Tech failure (remote session) | Pause, troubleshoot, or reschedule as needed |
| User is off-topic or chatting | Politely redirect focus to the current task |
🔹 Observing and Collecting Data
🔍 What to Observe
Watch for hesitations, backtracking, error clicks, facial expressions, and verbal reactions as participants work through each task.
📝 Note-Taking Techniques
| Field | Example Notes |
| --- | --- |
| Task #1 outcome | Completed, but clicked 4 wrong items first |
| Error or confusion | "Not sure what 'billing email' means" |
| Suggested improvement | Wants a tooltip for the icon label |
| Quote | "I didn't expect this to open a new tab" |
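Notes in this shape are easy to capture as structured records, which pays off later during synthesis. A minimal sketch in Python; the field names are illustrative, not part of any particular tool:

```python
from dataclasses import dataclass

@dataclass
class SessionNote:
    """One observation captured during a usability session."""
    task_id: str          # e.g. "T-001"
    participant: str      # e.g. "P-03"
    outcome: str          # what happened on the task
    confusion: str = ""   # errors or points of confusion
    suggestion: str = ""  # improvement the participant wanted
    quote: str = ""       # verbatim participant quote

# Example record matching the note-taking table above
note = SessionNote(
    task_id="T-001",
    participant="P-03",
    outcome="Completed, but clicked 4 wrong items first",
    confusion="Not sure what 'billing email' means",
    quote="I didn't expect this to open a new tab",
)
```

Keeping one record per observation (rather than free-form prose) makes it straightforward to filter, count, and cluster findings across participants.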
🔹 Recording and Data Handling
Ensure all sessions are recorded (if permitted). Store recordings securely and keep an organized session log.
Suggested Log Table:
| Task ID | Participant | Time Taken | Success | Notes |
| --- | --- | --- | --- | --- |
| T-001 | P-03 | 0:42 | ✅ | Clicked wrong button first, confused |
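A log like this is easy to aggregate once all sessions are done. The sketch below (the row fields are assumptions mirroring the suggested log table) computes two common metrics, task success rate and average time on task:

```python
# Aggregate a session log into basic usability metrics.
# Each row mirrors one line of the suggested log table.
log = [
    {"task": "T-001", "participant": "P-03", "seconds": 42, "success": True},
    {"task": "T-001", "participant": "P-05", "seconds": 61, "success": False},
    {"task": "T-001", "participant": "P-07", "seconds": 38, "success": True},
]

completed = [row for row in log if row["success"]]
success_rate = len(completed) / len(log)                          # fraction of successful attempts
avg_time = sum(r["seconds"] for r in completed) / len(completed)  # mean time for successful attempts

print(f"Success rate: {success_rate:.0%}")   # prints "Success rate: 67%"
print(f"Avg time on task: {avg_time:.0f}s")  # prints "Avg time on task: 40s"
```

Averaging time over successful attempts only is one common convention; some teams report failed attempts separately so slow failures don't skew the mean.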
🔹 Debriefing the Participant
After all tasks, conduct a short debrief interview. Give participants a chance to speak freely about their overall experience, then thank them and share any promised incentive.
🔹 Synthesizing Session Results
After the test, consolidate notes and recordings while the session is still fresh. Use frameworks like Rainbow Sheets or Affinity Mapping to cluster insights.
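The clustering step of affinity mapping can be approximated in code by grouping tagged observations by theme. A minimal sketch, assuming each note has already been hand-tagged with a theme during review (the tags and notes here are illustrative):

```python
from collections import defaultdict

# Observations hand-tagged with a theme during review
observations = [
    ("navigation", "Clicked wrong button before finding dashboard"),
    ("terminology", "Unsure what 'billing email' means"),
    ("navigation", "Expected dashboard link in the top bar"),
    ("feedback", "Didn't expect link to open a new tab"),
]

# Cluster notes under their theme, as on a physical affinity wall
clusters = defaultdict(list)
for theme, note in observations:
    clusters[theme].append(note)

# Themes with the most notes are usually the strongest signals
for theme, notes in sorted(clusters.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme} ({len(notes)} notes)")
```

The hard part remains the human judgment of assigning themes; the code merely makes the resulting clusters countable and sortable by frequency.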
🔹 Common Moderation Mistakes

| Mistake | Impact |
| --- | --- |
| Talking too much | Inhibits participant expression |
| Leading questions | Influences responses, skews data |
| Overexplaining tasks | Eliminates natural problem-solving behavior |
| Failing to ask follow-ups | Misses deeper insights |
| Not recording | Risk of losing data or misinterpretation |
🔹 Checklist: Running a Test Session

| ✅ Step | Details |
| --- | --- |
| Participant joins session | Verify connection, sound, screen sharing |
| Consent confirmed | Get verbal or written approval to record |
| Warm-up questions | Ask 2–3 to set context |
| Tasks presented | Deliver clearly and neutrally |
| Think-aloud encouraged | Remind participant gently throughout |
| Observation ongoing | Note reactions, issues, hesitations |
| Debrief conducted | Gather qualitative feedback and insights |
| Session wrapped | Thank participant and document findings |
🔹 Summary
Running a usability test is both an art and a science. It
requires neutrality, focus, empathy, and discipline to observe without
interfering.
By sticking to a structured process—welcoming, guiding,
observing, recording, and debriefing—you ensure your team gets honest, valuable
feedback directly from users. The insights gained here become the foundation
for iteration, innovation, and impactful design.
In the next chapter, we’ll explore how to analyze test
results and translate raw user behavior into clear, prioritized action
steps for your design and development team.
🔹 Frequently Asked Questions

What is usability testing?
Usability testing is a user research method where real users are observed as they attempt to complete tasks on a product to evaluate its ease of use, functionality, and overall user experience.

How many users do you need?
According to usability expert Jakob Nielsen, testing with 5 users typically reveals about 80% of usability issues, making it a practical number for early testing.

What is the difference between moderated and unmoderated testing?
Moderated testing involves a facilitator guiding the participant, often in real time, while unmoderated testing is conducted without direct oversight, usually through automated tools or platforms.

When should usability testing be conducted?
Usability testing should be conducted at multiple stages—during early wireframes, prototype development, before launch, and even post-launch to ensure continuous improvement.

Which tools are commonly used?
Tools like UserTesting, Maze, Lookback, Optimal Workshop, and Hotjar are commonly used to run usability tests, gather recordings, and analyze user behavior.

Which metrics matter most?
Important usability metrics include task success rate, time on task, error rate, satisfaction score, and qualitative feedback from users.

What goes into a usability test plan?
A usability test plan typically includes the objective, target audience, task scenarios, success criteria, tools used, facilitator script, and post-test debrief questions.

How do you recruit participants?
Users can be recruited via email lists, testing platforms, social media, or customer databases, and they should represent the target demographic of the product.

Can usability testing be done remotely?
Yes, remote usability testing is increasingly popular and effective, allowing researchers to gather insights from users across various locations using tools like Zoom, Maze, or UserZoom.

What happens after testing?
After testing, synthesize your findings, prioritize issues by severity, share insights with the team, and implement design improvements based on the feedback.