Coming into Focus
Why would a company hire a very capable set of developers to create a speech-enabled automated solution, if it really didn’t want to serve its customers there?
“They basically decided to make the [interactive voice response] so incredibly painful so customers would use the Web site. However, customers calling just ended up opting out of the system,” says Susan Hura, owner and principal consultant at SpeechUsability, a voice user interface (VUI) design consulting firm. Hura, who is working with this company, says she could have saved it time, money, and lost customers if it had heeded her advice to perform some kind of user research before going live six months ago.
“It’s too bad we couldn’t save them the six months of agony,” she says. “We could have done much of the work that I suggested over the course of eight or 10 weeks. We could have saved them this whole negative experience that their customers are having now, and this rip-it-out-and-start-from-scratch redesign.”
Despite such horror stories, many companies still choose not to use focus groups or perform usability testing, or they do so only right before going live. The question is, why do they implement a new IVR without user research when so much is at stake?
According to Melanie Polkosky, a human factors psychologist and senior consultant at IBM, many companies think focus groups are unnecessary because they already know their customers. She argues that designers could benefit from more information, especially in the early planning stages. “A focus group is one tool in an arsenal of getting information from users to understand how to make an application better,” she says.
Another piece of the testing conundrum is that companies not only think they know their customers, but, more specifically, they also think they know what kind of language will be understandable and how to organize a menu. That’s probably not the case, though, because many people in technical roles don’t know how to make their systems usable to people who aren’t fluent in the language of technology. “The design is usually owned by the technical leadership, and they generally don’t know about human beings,” says Jim Milroy, director of creative services at West Interactive.
The application’s language has to be simple enough so the average person—who’s always distracted—can use the application, says Polkosky, who adds there’s a large disconnect between how tech people and nontech people use language.
Yet another central reason companies choose to bypass testing is, especially now, a lack of money. “It’s not the cheapest thing in the world to do,” Milroy says. Among the costs involved, companies have to spend money to recruit—and pay—participants, rent a testing facility, and hire professional services. But Hura doesn’t see huge differences in costs among focus groups, usability testing, and one-on-one interviews. Cost differences are actually geographic, she says, noting that rates for coastal centers can be around $3,000, while in the Midwest centers can be rented for less than $1,000.
Besides money, concerns about time often influence the decision to go live with an untested IVR. According to Milroy, it can take between four and six weeks to adequately perform usability tests. Even though a focus group can be held in a day, it still takes time to sift through and analyze results. The level of experience of the person—or people—performing your user research also makes a difference. “[I can] do usability testing pretty cheaply. Because I’ve done so many, I’m very efficient, but it depends on who you’re dealing with,” Hura says.
The other alternative, choosing to forgo user research of any kind, can get companies in even greater trouble, costing them money on automation that only later reveals itself as widely unpopular among customers. “If you don’t design for your customers,” Milroy says, “don’t expect your customers to like it, that’s for sure.”
However, many industry experts say change is afoot. Companies are beginning to reap the benefits of user research because they are realizing “the way to differentiate yourself is through having an excellent user experience,” Hura says.
Type of Test
Budgets aside, another sticking point for companies is uncertainty about the type of usability testing to perform and whether a focus group is even right for gathering the type of information being sought.
If your company is considering user research, then the first stage of the process usually begins with one question: What kind of user research should you perform? The answer to this question almost always leads to another question: What information do you need?
When it comes to user research, a focus group might not be the best or only solution. In fact, among experts—even those who favor one approach over another—the common recommendation is to use various approaches to get the most information possible. “Both [interviews and focus groups] are useful,” Hura says. “They just give you very different kinds of data. Should you run a focus group? Yes. Should you run a usability test? Yes. More data is better.”
If you’re looking to find the larger problems with your IVR, then a focus group can provide you with that information, suggests Lizanne Kaiser, senior principal consultant in voice services at Genesys Telecommunications Laboratories. She cautions, however, that if designers are looking for more specific problems, they won’t get those in a focus group, but can test for them later by recording calls.
Even a small group can yield results, she says. “You don’t need a huge pool. In 20 interviews, any of the really big issues will pop out at you,” Kaiser says.
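Kaiser’s point that big issues surface quickly in a small pool follows from a simple probability argument often used in usability research: if an issue affects some fraction of callers, the odds that at least one of a handful of sessions exposes it climb fast. The sketch below is illustrative only; the 20-percent incidence rate is an assumed figure, not data from the article.

```python
def detection_probability(p: float, n: int) -> float:
    """Probability that an issue affecting fraction p of users
    shows up at least once across n independent sessions."""
    return 1 - (1 - p) ** n

# Assumed example: a "really big" issue hitting 1 in 5 callers is
# almost certain to appear somewhere in 20 interviews.
print(round(detection_probability(0.20, 20), 3))  # → 0.988
```

The same math explains why rare, specific problems still slip past small studies, which is consistent with Kaiser’s advice to catch those later by recording calls.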
Also, some regard focus groups with a degree of skepticism since they tend to invite very particular problems. “Speech technology isn’t a group experience,” points out Kaiser, who favors one-on-one interviews because they don’t have the pitfalls group dynamics can create. Large groups can potentially collapse into groupthink because, as she asserts, in focus groups people will often just say what they believe the group wants to hear. Even in one-on-one interviews, “You still have to read between the lines,” Kaiser says. “Are they just saying this because they think the interviewer wants to hear this, or is this really true to how they behave in a real-world situation?”
Focus groups also can lead testers to rely too heavily on them to figure out what people might want to do in an IVR rather than having them respond to more particular elements. “If you ask people what they would want to have, [and] then you give them A, B, C, and D...they don’t use [it],” says Jenni McKienzie, a VUI designer at Travelocity.
McKienzie doesn’t use focus groups. Instead, she listens in on calls and performs her own usability testing. “It’s better to observe people in action than to ask what you think they want. We’re not very good at self-analysis, especially at the hypothetical.”
Similarly, figuring out whether customers can understand your IVR’s language might fall into usability testing territory. “Focus groups are going to tell you what the customers feel; usability testing tells you if your design is usable,” West Interactive’s Milroy says.
However, interviews also can be used in interesting ways to discover information about how customers organize tasks. Organization of tasks is a vital component of a successful IVR because if callers find a menu confusing or don’t hear their option soon enough, then they often opt out.
For example, Kaiser says she asks participants in interviews to organize index cards with menu titles into groups that make sense. The results, she says, are surprising. “Suddenly what you thought was an intuitive structure ends up being much different from the user’s perspective. When a caller uses a system it’s like a game show where you’re saying, ‘Guess what’s hidden behind door three?’ If door three isn’t really intuitive, they’re going to go through the door and be shocked to find out what’s on the other side.”
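The card-sort exercise Kaiser describes is typically analyzed by counting how often participants place the same two cards in the same pile; pairs that cluster together across sessions suggest menu options callers expect to find near each other. Here is a minimal sketch of that tally, with made-up card names and groupings for illustration:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical sessions: each participant sorts menu-title cards
# into whatever groups make sense to them.
sessions = [
    [{"Pay bill", "View balance"}, {"Report outage", "Tech support"}],
    [{"Pay bill", "View balance", "Tech support"}, {"Report outage"}],
    [{"Pay bill"}, {"View balance", "Report outage", "Tech support"}],
]

def co_occurrence(sessions):
    """Count how often each pair of cards lands in the same group."""
    counts = defaultdict(int)
    for groups in sessions:
        for group in groups:
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    return dict(counts)

counts = co_occurrence(sessions)
# High-count pairs are candidates for sharing a menu branch.
for pair, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(pair, n)
```

A real study would feed these counts into cluster analysis, but even the raw tally can reveal the kind of mismatch Kaiser describes between the designer’s “intuitive” structure and the user’s.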
An Agent Alternative
Similar to focus groups, another technique used to gather data is agent roundtables. In this process, call center agents answer questions about customer problems. Although these roundtables don’t get data directly from customers, agents have talked to thousands of customers, says Milroy, who often employs the method. “The agents can tell us how customers are asking questions,” he explains.
But to be truly effective, how much user testing is needed? If a company wants to know not only if its IVR is going to be successful, but also if it will remain successful, then it is best to perform ongoing user research. “[User research] is not a one-shot deal. You need to come back in and gather user research throughout the process,” SpeechUsability’s Hura says.
McKienzie contends that while you can’t test every change made to an IVR, problems often arise when a tested IVR is handed over to a company and subsequent changes are made without testing. “I evaluate changes. Sometimes, if they are big enough, we go and do usability testing, sometimes more than once,” she says.
However, if iterative testing is not possible, performing user research at the beginning stages of the process—before too much work is done on a system—can be very helpful, IBM’s Polkosky says. Focus groups can be useful in the very early stage of development to see why some aspects of a call should be automated, she states.
No matter what technique your company employs to test and/or develop an IVR, a specific combination will most likely be best. No two situations, and no two IVRs, are the same. “There is no silver bullet,” McKienzie argues. “Each of these different [research] techniques has its uses.”
Who to Use
So now that you’ve decided to go with a focus group, how do you pick your participants? VUI design consultants offer different perspectives.
A focus group or a series of interviews is, of course, made up of participants who will inevitably affect the kinds of results gathered. It goes almost without saying that a group should represent your customer population, and, as mentioned earlier, it doesn’t need to be huge. However, Kaiser suggests that in addition to representing your demographic, a group should also include students and seniors so you can see the range of responses to your IVR.
Additionally, perhaps the most important way to control for accurate results is to only recruit participants who are not connected to your field. “You want to be really careful,” Kaiser points out. “Sometimes people think they’re just going to invite friends and family, and sometimes they’re too close to the terminology of the company or they don’t want to offend someone, so you want to be careful not to taint the process.”
Fears of tainting the process can run high, especially because no one wants to perform thorough testing only to end up with a flawed system. This anxiety can lead companies to formulate rules about choosing participants that might seem extreme. For example, some refuse to allow English teachers to participate because they might be overly stringent about grammar, while others dismiss lawyers because they could be argumentative and dominate a group, Hura says.
Another consideration is, of course, how much to pay your participants. As with testing centers, compensation varies by geographic location; participants who live in coastal states usually command higher payments. That said, you want to be careful about paying too much regardless of where the testing is done so participants don’t feel as if they’re being coerced or bribed, Hura says. Compensation could be in the form of cash or other incentives, like a gift certificate.
Compensation could also depend on how hard it is to recruit your particular demographic. Typically, the harder it is to recruit a group, the more you’ll have to pay the members.
While the people you recruit will, in theory, be the ones who give you the information you need, a good moderator will be the one who elicits responses and guides the conversation. In short, a lackluster moderator means lackluster results. Many of the concerns raised by critics of focus groups—about participants falling into groupthink or one person dominating the group—can be averted by hiring a skilled moderator.
Whether a company chooses to do interviews, a focus group, or a combination of the two, someone with the skills to elicit real responses from participants is absolutely necessary to successful user research.
Problems like one person dominating the conversation will nearly always occur, Milroy contends. The quietest person in the room might be the one who found a problem with the IVR, making that person’s opinion the most important in the room, he argues. “The moderator is the quarterback; he has to make sure everybody’s talking,” Milroy says.
A Reliable Ringleader
A good moderator also knows the importance of neutrality and how to word questions in nonjudgmental ways to elicit honest responses from participants.
Similarly, question order and the flow of the conversation play an important role in obtaining accurate assessments about an IVR. As an example, Kaiser presents the following: Asking a group if they like milk is a neutral question. But if you ask them first if they like cookies, you’ll get a much higher yes rate about liking milk because of the positive association between milk and cookies.
And while a well-written discussion guide is helpful, a moderator who can set the guide aside lets the conversation unfold in a more natural and, therefore, more revealing way, Milroy contends. “It’s kind of a Catch-22,” he says. “I shouldn’t have to ask a single one of [these discussion questions] because I’m going to let people tell me without me having to ask.”
However, some suggest more hands-on ways in which a moderator can guide a discussion, especially when it comes to automation, which isn’t exactly popular. When participants flat-out say they don’t want to use automation for a particular kind of call, Kaiser suggests seeing if there’s an aspect of a call they’d be willing to have automated. “You had to kind of explore with them a little bit what sort of things they would or would not be willing to use automation for,” she says.
The trouble with asking the opinions of groups of people, however, is that people can be unreliable in their assessments. A good researcher should be able to carefully pick out truths from the mass of comments gathered during interviews, focus groups, or usability testing. However, experts say the ability to glean results is a fine art often honed through years of research.
Seeing the Signs
Part of this art also involves the ability to decipher physical cues and body language to get a more complete picture of how people will actually respond to an IVR. “Behavior doesn’t lie,” Hura says. “You have to take with a grain of salt what people say about what they do.”
But besides participants, who else needs to be involved with the focus group? According to Hura, while one person can do all of the work—the design, testing, etc.—it’s best to bring in a third party to gain objective insights. “You want to avoid the ‘my baby isn’t ugly’ phenomenon,” she cautions. “Once you’ve invested in doing the design, it’s hard not to feel personally invested in the design. If you do have to fill both roles, you have to be very careful to look at your own biases.”
It’s important to have business stakeholders involved, at least as observers to connect business goals to the technology, Hura says. Also, whoever is going to update the system and own it on a day-to-day basis should be involved, she adds. “Folks who are on the call center side [have] great insight into why callers call,” Hura says.
While she says she’s happy to see more companies performing user research, Polkosky warns if companies don’t do the necessary work by performing appropriate user research, “these systems will continue to be horrible.” When it comes to testing an IVR, you’ll always find something surprising, Kaiser adds.
So it is that the future of IVR design could finally be heading in the right direction, possibly to a future where users actually like, and use, an IVR. Now that’s something to focus on.
Dos and Don'ts
Do:
- consider all of your testing options and tailor them to fit your company’s needs;
- get a great moderator who has lots of experience and with whom people feel comfortable; and
- use participants’ language to rewrite any tech jargon in the IVR.
Don't:
- assume you know how your customers will respond to your IVR;
- take any one participant’s opinion too seriously; and
- try to go against users’ motivations, hoping they will change a behavior to conform to an IVR design. Instead, build an IVR that conforms to users’ motivations.