July 14, 2009
By Melanie Polkosky, Human Factors Psychologist & Consultant, IBM/Center for Multimedia Arts (University of Memphis)
Interact

Goldilocks and the Three IVRs


A couple of recent projects in close proximity gave me the uncanny feeling that I was Goldilocks visiting the house of the three bears.

At the first organization, I asked for usage data to analyze behavior in the interactive voice response (IVR) system. The data was too little (really, none at all). No one had ever heard of doing such an analysis. All user experience decisions came from a small group who read customer satisfaction and complaint reports and then decided on prompt changes. Sure, the customer satisfaction survey included two broad IVR questions, but they weren’t specific enough to have diagnostic value. The organization knew a problem existed, but didn’t know what it was or what to do about it.

At the second organization, I asked for usage data to analyze IVR behavior. The data was too much. I received dozens of reports, PowerPoints, and meeting invitations to discuss hundreds of data points from the past year. When I asked for very specific numbers, I got them, plus a bunch of other stuff that knotted my brain. The organization had multiple groups doing multiple analyses of different data, all with conflicting findings, suspicions about accuracy, and hand-wringing and finger-pointing. The organization knew a problem existed, but didn’t know what it was or what to do about it.

At the third organization, I asked for usage data to analyze IVR behavior. The data was just right. It had selection frequencies for all prompts, error-handling data and recovery frequencies, call containment percentages, common paths, and system- and user-initiated operator request numbers. It had a biweekly reporting schedule with clear, understandable reports for the important data everyone agreed to track. Numbers were reported upward in a regular executive meeting about channels of service delivery. And it had a change management process that included a single, skilled voice user interface professional who tracked suspected usability problems, confirmed them with data, recommended script changes, and designed and tested those changes for implementation. The organization knew it had a great (yet evolving) user experience and the numbers to prove it.
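To make the Just Right numbers concrete, here is a minimal sketch, in Python, of how a few of them (containment rate, user- versus system-initiated operator requests, and per-prompt selection frequencies) might be computed from a simple call log. The record layout and field names are hypothetical illustrations, not taken from any of the systems described here.

from collections import Counter

# Hypothetical call log: one record per call, with the menu options the
# caller selected, whether the call was contained (completed in self-service),
# and how it reached an agent, if it did ("user" = caller asked for an
# operator, "system" = the error-handling limit transferred the call).
calls = [
    {"selections": ["main:1", "billing:2"], "contained": True,  "operator": None},
    {"selections": ["main:2"],              "contained": False, "operator": "user"},
    {"selections": ["main:1", "billing:1"], "contained": False, "operator": "system"},
]

total = len(calls)
containment_rate = sum(c["contained"] for c in calls) / total
operator_requests = Counter(c["operator"] for c in calls if c["operator"])
selection_freq = Counter(s for c in calls for s in c["selections"])

print(f"Containment: {containment_rate:.0%} of {total} calls")
print(f"Operator requests (user vs. system): {dict(operator_requests)}")
print(f"Selection frequencies: {dict(selection_freq)}")

Run against a biweekly extract, a handful of numbers like these gives everyone the same agreed-upon figures to track, which is exactly what made the third organization's process work.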

It’s Unreal
All right, I admit it: That last company makes this story a fairy tale. Unfortunately, organizations that get user data just right are surprisingly rare. In the case of the two extreme examples I visited, the net outcomes were exactly the same: Both organizations had very unusable self-service systems with poor containment rates. Neither could take concrete, data-based action on their scripts, preferring broken processes that disconnected their IVRs from user data. It was enough to make this Goldilocks run screaming from her cubicle. 

When the Too Little organization saw by example how prompts could be structured based on usage data, it insisted that every teeny design decision be based on original data, even though such a feat is practically impossible. It pushed me to identify why two menu options were in a slightly different order than in the original system; because we had no data to support a decision either way, I gave the linguistic and experience-based rationale for the sequencing I selected. “But that’s not based on data,” someone retorted, “so we’ll just change it to what we had.” In the absence of its own data, the Too Little organization didn’t realize it could rely on existing communication research applied by skilled scriptwriters or find new ways of getting user feedback quickly. Design decisions continued to be Too Fast.

When the Too Much organization saw by example how prompts could be structured based on usage data, it insisted that more analysis was required, even though it was unnecessary for the system it had. It pushed me to identify particular segments of the user population who were experiencing repetitive errors. Although that kind of analysis certainly benefits marketing and user research, in this case there were no plans to create a personalized IVR; it was one IVR that needed to serve the user population as a whole. “But we need to know who’s going around in these circles,” someone retorted, “so we’ll just do more analysis to make sure.” The Too Much organization didn’t realize it already had adequate data to rewrite the problematic script. Design decisions continued to be Too Slow.

The more organizations I visit, the more I believe change management and data procedures are the most critical long-term factors in successful self-service. A Just Right organization has the capacity to refine an interface as its understanding of user behavior evolves, even if the initial design is a little Too Weak. That’s an organization where Goldilocks would want to take off her coat and live happily ever after. The end.


Melanie Polkosky, Ph.D., is a social-cognitive psychologist and speech-language pathologist who has researched and designed speech, graphic, and multimedia user experiences for more than 12 years. She is currently a human factors psychologist and senior consultant at IBM. She can be reached at polkosky@comcast.com.
