What is usability testing?
Watching people try to use what you're creating/designing/building (or
something you've already created/designed/built), with the intention of (a)
making it easier for people to use or (b) proving that it is easy to use.
-- Steve Krug, Rocket Surgery Made Easy
We run usability tests early and often, regardless of fidelity. We even test
sketches to get a feel for flows and mental models. The earlier the stage, the
more we're testing the problem/solution fit. The later the stage, the more
we're testing actual usability.
We recommend setting aside at least one day a month for testing on every
project.
Designing the test
Identify which workflows you'd like to test. These might be happy-paths for
critical tasks or just specific interactions you're unsure of. Each of these
tasks should be written in the form of a scenario to add detail and context.
For example:
You're designing an iOS app that has a Rails back-end, and saw on Twitter that
thoughtbot wrote a book on the topic. Find and download the book.
For a general usability test, prepare three to five workflows. For the test
phase of a Product Design Sprint, prepare for five or more.
Finding participants
The most representative candidates are going to be sourced from our existing
user base, pre-launch email list, or from a client's network. This is
especially important when our clients are looking for someone who has a
specialized background, like doctors, lawyers, or politicians. When sourcing
from a client's network, the client may prefer to reach out and schedule the
sessions themselves. Otherwise, anyone on the team, including the local Office
Manager, can handle outreach and scheduling.
Here's an example of an email after a client introduction:
Nice to meet both of you! A little more background on what Trent is asking:
We're looking for some folks to spend ~45 min answering some questions and
running through some tasks in a prototype. We're testing the prototype, we're
not testing you, so you don't need to be any kind of expert. It will help
if you've purchased a new energy plan over the last several months though.
You will need a computer with a steady internet connection, space you feel
comfortable in, and Google Hangouts installed. Let me know if y'all have
any other questions!
If you're still interested, could you grab a time here next Wednesday or
Thursday: CALENDLY LINK GOES HERE
For finding a more general audience, we've found that User
Interviews and User
Testing are easy ways to quickly schedule
usability tests. Credentials for both accounts can be found in 1Password. Both
will handle scheduling as well as sourcing candidates. We've found User Testing
to be great for fast feedback without having to track down users, and User
Interviews to be less expensive, though it requires more setup work on your
end. These tend to work really well for our Product Design Sprints because we
can get them set up quickly and adjust based on the nature of the sprint. Anyone
on the project or an office manager could help set these up.
When we have trouble sourcing subjects, especially local ones, Craigslist can
be somewhat effective for finding candidates. But before using it for sourcing
subjects, you should ask yourself, “How often do the people I’m looking for go
to Craigslist to find research studies to participate in?” If the answer is
rarely or never, it may not be the right place to source testing subjects.
You can work with your office manager to post an ad on Craigslist, schedule
candidates to come into our office, and compensate them for their time after
the test. We have detailed instructions for finding people on Craigslist.
Compensating participants
Generally, we should compensate people for their time when possible. That value
will have a range based on the audience with whom we're testing and how they
value their time. Compensation should be enough of an incentive so that a
majority of the subjects will show up for their scheduled time.
A few examples of what we've done in the past:
- A digital gift card to a general retailer like Amazon, Target, or Starbucks,
ranging from $25-$75. These get emailed out at the end of the day by
the person who scheduled the interview.
- A free month or more of the product. Typically handled by the client.
- Free physical product. Typically handled by the client.
- Credits to their account on the product. Typically handled by the client.
- A physical gift card (cash or a Visa gift card) ranging from $30-$75.
Purchase these a few days in advance.
Typically, we compensate people at the end of the session but there may be
circumstances where we would want to compensate them just before starting the
test. They should be told that what they say or do during the session doesn't
impact their compensation. If you need the support of an Office Manager to
fulfill compensation, make the request with ample lead time, especially if
requesting physical gift cards.
Include people with a diverse range of abilities and disabilities, including
those who have hearing, visual, cognitive, or motor impairments. This can help
create a broader understanding of usability for clients and reinforce the
importance of building accessibly.
Review this list of accessibility contacts for recruiting participants, or ask
for help in the
#accessibility Slack channel.
Setup for testing remotely
Choose and test screen sharing/recording software ahead of time. We suggest
using Zoom because it can both share and record. As an
alternative, you could use Google Hangouts Meet with ScreenFlow.
Setup for testing in-person
Round up the product team, and encourage them to observe the broadcast
sessions remotely, from a room different from the one in which you're testing.
Different observers generate different insights, and observers are more likely
to use (and value) the research if they are included in the discovery process.
Put out snacks, and have an observer worksheet printed
for each person.
Print out consent forms, a script template, and a copy
of the scenarios you'd like to test.
The screener might have indicated a preferred operating system. Use that if
possible, to make the participant more comfortable. If using a laptop, have
a separate monitor, keyboard, and mouse to use. Always use a "standard"
mouse, with normal scrolling behavior.
Prep the testing computer
- Install and test recording software like Lookback
(captures faces, voices, interactions).
- Clear off the desktop and close any windows.
- Open the browser to a neutral site, like Google.
- Have your site or application ready to test.
- Disable notifications. If you're on macOS, you can turn on the do not disturb mode.
Facilitating the test
The session can be broken into five distinct parts, each included in the
script template:
1. Intro spiel
Introduce yourself, and say that you're going to read an introduction from a
script so you don't forget anything important. The important parts are:
"We're testing the software, we're not testing you", "please think aloud",
and "you won't hurt my feelings".
There's no need to be strictly formal, we want them to be at ease. Think about
how you have your body positioned, how you're speaking, and what their body
language or their general demeanor is telling you. Adjust as necessary.
2. First impressions
Open the piece of the product that you're testing — website, app, flow,
etc. — and ask the participant to give you a walkthrough. Have them scroll
around, and ask them to relay what they think they're looking at, whose site
it is, and what it might be for. Don't click on anything quite yet.
3. Tasks
Read one of the tasks aloud, and ask them to complete it. Sit in the tension of
their silence, but if the interaction is too quiet for too long, nudge slightly
to keep the participant talking:
- "What are you thinking now?"
- "Is that what you expected to happen?"
- "What are you looking for, or hoping to find?"
Refrain from answering questions unless they're completely lost, and keep
from asking any leading questions. You can read more on the topic in the
blog post What not to ask.
4. Cool down
Allow the participant an opportunity to provide any general feedback. Questions
to get them started talking might include:
- "What'd you think?"
- "What did you like about it?"
- "What was the most confusing part for you?"
5. Wrap up
Generously thank the participant for taking the time, and provide them with
their incentive gift. Don't stop recording until they've left; participants are
more likely to be frank and candid with feedback once they feel the session is
over.
If in-person, walk them out and take the next few minutes to clear the room
as well as the desktop for the next participant.
Debriefing
After the sessions have finished, gather the observers for debriefing. This
should happen as soon as possible, hopefully either the same day or the next
morning. Provide food.
Ask each observer to identify the three largest usability problems that they
saw across all participants.
Document these, and collaboratively prioritize which problems are the most
serious. List the problems that you'll fix before the next round of usability
testing.
If testing is happening in conjunction with a design sprint, looking back at the
assumptions that were discussed during the sprint and relating findings back to
them directly is an impactful way to show results.
When sharing results with clients:
- Scheduling a Google Hangout to discuss findings has been a great way to
deliver summarized findings along with anecdotes from the sessions.
- Being able to hand off some kind of document so that clients can reference
the findings later is helpful, but the format of that document depends on what
feels right for the client. Google Docs usually works well.
Navigating difficult conversations
Certain projects may necessitate talking to people who have experienced
difficulties, trauma, and hardship. In these situations, it is especially
important to be mindful of what the interview script asks, and how it is
phrased.
If conducting this kind of interview, please follow this advice on how to
approach and conduct these conversations.