Usability Testing Connects the Dots
I recently advised a client to skip a planned usability test because I wasn't sure it would be worth the effort. The application we were testing was already in production, so we had missed our chance to make changes when they would have been quick and cheap. The application had also been designed without any significant user input up front, so I was not optimistic about how much good a single usability test could do. I saw this as a classic Band-Aid-on-a-broken-leg situation, where the test was supposed to make up for the lack of user-centered design in the project so far. The client chose to move ahead in spite of my objections, and I am happy to say that I could not have been more wrong about its value.
We did not find major issues preventing users from completing their tasks, nor did the users express opinions that negated the overall goals for the interactive voice response (IVR) system. We did find a few menus that needed tweaking and some prompts that ought to be reworded, and we got a few ideas for additional functionality, but these don't represent the primary value of the test.
The best thing that came out of it was the opportunity to see real people interacting with the IVR and listen to their thoughts and opinions about using the system. This may not seem like a big deal in the age of analytics, when we measure a zillion aspects of customer behavior and produce minute-by-minute reports on any call statistic imaginable, but running this test reminded me that usability data is unique.
Analytics data can be robust and important in maintaining an application over time, but it is by definition at a level of remove from the behavior of any individual customer. Many analytics solutions allow you to listen to specific calls made by individual customers (another excellent practice to include in a program for monitoring applications over time). However, there is still a missing element: Analytics data tells you what customers are doing, but not why.
The why is bound up with motivation, expectation, and opinion—exactly the data that Voice of the Customer (VoC) programs are designed to capture. The problem with VoC data is that, while it is more direct, it also tends to be less focused and therefore less actionable. VoC data is valuable in understanding customer needs and desires, but it does not allow us to straightforwardly identify specific issues to fix in the IVR. Customers are good at identifying when they don't like something and how it makes them feel, but they are less adept at explaining exactly what is causing their reactions.
Here's the beauty of a usability test: We can observe individual customers interacting with the IVR to see what's happening, and immediately interview them to understand their response. Usability testing allows us to connect the dots between the patterns of behavior revealed in analytics data and the opinions expressed in VoC programs. The connection is robust because the opinions are directly linked to observable behaviors, so we know exactly what customers are reacting to. Usability testing connects the what and the why of customer behavior, which provides diagnostic information on what to fix, plus a rationale for why it's important to fix it.
The other connecting power of usability testing is the immediate, personal connection with individual customers. It's easy to think of the people who dial into IVRs as an anonymous conglomerate rather than as actual human beings. Spending a couple of days in the lab with real users was a potent reminder that people call organizations because they have real-life tasks they need to accomplish, and that IVRs can either help or hinder them. Sitting across the table talking to test participants makes their experience real in a way that no analytics or VoC data can do.
You won't hear a consultant say this often, but I'm so glad my client didn't follow my advice. I'm grateful for the reminder of the unique power of a usability test to connect us with customers and to motivate us to build self-service options that give people a comfortable and efficient means of checking items off their list.
Susan Hura, Ph.D., is a principal and founder of SpeechUsability, a VUI design consulting firm. She can be reached at email@example.com.