It was not an easy recruit. Directors of IT are busy people. Oddly, they’re hard to get hold of. They don’t answer calls from strangers. They don’t answer ads on web sites. The ones who did answer ads on web sites, we had to double-check by calling their company HR departments to verify that they held the titles they claimed.
And now this.
“Hi! So we have some executives coming in tomorrow to observe the test sessions.” This was the researcher phoning. He was pretty pleased that his work was finally getting some attention from management. I would have been, too. But. He continued, “I need you to [oh yeah, the Phrase of Danger] call up the participants and move some of them around. We really want to see the experienced guy and the novice back-to-back because Bob [the head of marketing] can only come at 11:30 and has to leave at 1:00.”
“Sure,” I say, “we can see if the participants can flex. But your sessions are each an hour long. And they’re scheduled at 9:00, 10:30, 12:00, and 2:00. So I’m not quite clear about what you’re asking us to do.”
“I’m telling you to move the sessions,” the researcher says, “so the experienced guy is at 11:30 and the novice is at 12:30. Do whatever else you have to do to make it work.”
“Okay, let me check the availability right now while we’re on the phone,” I say. I pull up the spreadsheet of participant data. I can see that the experienced guy was only available at 9:00 am. “When we talked with Greg, the experienced guy, the only time he could come in was 9:00 am. He’s getting on a plane at 12:30 to go to New York.”
“Find another experienced guy then.” What?!
Five signs that you’re dissing your participants
You shake hands. You pay them. There’s more to respecting participants? These are some of the symptoms of treating user research participants like lab rats:
They seem interchangeable to you.
If you’re just seeing cells in a spreadsheet, consider taking a step back to think about the purpose and goals of your study.
You’re focused on the demographics or psychographics.
If it’s about segmentation, consider that unless you’re running a really large study, you don’t have a representative sample anyway. Loosen up.
Participants are just a way to deliver data.
You’ve become a usability testing factory, and putting participants through the mill is just part of your life as a cog in the company machine.
You don’t think about the effort it takes for a person to show up in your lab.
Taking part in your session is a serious investment. The session is only an hour. But you ask participants to come early. Most do. You might go over time a little bit. Sometimes. It’ll take at least a half hour for the participant to get to you from wherever she’s coming from. It’ll take another half hour for her to get wherever she’s going afterward. That’s actually more than two hours altogether. Think about that, and the price of gas.
You don’t consider that these people are your customers and this is part of their customer experience.
You and your study make another touch point between the customer and the organization that most customers don’t get the honor of experiencing. Don’t you want it to be especially good?
They’re “study participants,” not “test subjects”
Don’t forget that you couldn’t do what you do without interacting with the people who use (or might use) your organization’s products and services. When you meet with them in a field study or bring them into a usability lab, they are doing you a massive favor.
Although you conduct the session, the participant is your partner in exploration, discovery, and validation. That is why we call them “participants” and not “test subjects.” There’s a reason it’s called “usability testing” and not “user testing.” As we so often say in the introductions to our sessions, “We’re not testing you. You’re helping us evaluate the design.”
Throw away your screener: Tips on recruiting humans
I’m not kidding. Get rid of your screener and have a friendly chat with your market research people. Tell them you’re not going to recruit to match the market segments anymore. Why not? Because they usually don’t matter for what you’re doing. In a usability test, you focus on behavior and performance, right? So recruit for that.
Focus on behavior, not demographics
If you’re testing a web site for movie buffs, why would selecting for household income matter? What you want to know is whether they download movies regularly. That’s all. Visualize what you will be doing in the session and what you want to learn from participants. This should help you define what you absolutely require.
Limit the number of qualifiers
Think about whether you’re going to compare groups. Are you really going to compare task success between people from different-sized companies, people who have multiple rental properties versus only one, or people with different education levels? You might if you’re doing a summative test, but if most of your studies are formative, then it’s unlikely that selecting for multiple attributes will make a big difference when you’re testing 5 or 6 people in each audience cell.
Ask open-ended questions
Thought you covered everything in the screener, but fakers still got into your study? Asking multiple-choice questions forces people to choose the answer that best fits. And smart people can game the questionnaire to get into the study because they can guess what you’re seeking. Instead, ask open-ended questions: Tell me about the last time you went on a trip. Where did you go? Where did you stay? Who made the arrangements? You’ll learn more than if you ask, Of your last three trips taken domestically, how many times did you stay in a hotel?
Learn from the answers
You get “free” research data when you pay attention to the answers to open-ended screening questions. People have now volunteered information about their lives, giving you context for decisions about your study design and for interpreting the resulting data.
Flex on the mix
If you use an outside recruiting firm, ask to review a list of candidates and their data before anyone gets scheduled. You know the domain better than the recruiters do. You should know who will be in the testing room (besides you). You should make the trade-offs when there’s a question about how closely someone meets the selection criteria.
Remember, we’re all human, even your participants. These steps will help you respect the human who is helping you help your company design better experiences for customers.