Survey Says Users Still Suspicious of Conversational AI
When Google debuted Duplex in 2018, the voice assistant made waves. Sounding a little too human for some, Duplex immediately raised the question of whether it was ethical to put an unsuspecting stranger on the phone with a bot without disclosing it as such. More than a year later, Riley Panko, senior content developer and marketer at Clutch, published the results of her research into the question.
In “The State of Conversational AI and Consumer Trust,” Panko reports that 73% of respondents say they are unlikely to trust an AI-powered voice assistant to make simple calls for them correctly. Meanwhile, 81% believe that AI-powered voice assistants need to declare they are robots before proceeding with a call—and 61% would feel uncomfortable if they believed they spoke to a human and later learned they had spoken to AI. Those kinds of results don’t bode well for Google’s Duplex or other voice assistants.
“These findings show that people are still uneasy about trusting voice AI not to make mistakes, especially if the mistakes have embarrassing or expensive consequences,” says Deborah Dahl, principal at Conversational Technologies. “This is not surprising, since even the best AIs make a lot of mistakes. My Echo probably fails to turn on the lights for me about 25% of the time I ask. Normal light switches turn the lights on pretty much 100% of the time, unless they’re actually broken. But they don’t routinely fail for no apparent reason. Any kind of automation has to earn our trust by working almost all the time.”
“I think my findings show that the creators of these technologies need to go above and beyond to instill trust in users,” says Panko. “It will certainly impact adoption if consumers aren’t willing to trust AI to do the tasks it’s developed to do. I believe the recent news that nearly 25% of Google Duplex’s calls are still conducted by humans shows that at least Google is focused on improving the experience of its tool. That is likely the best option to combat consumer distrust—ensuring that the tool works the best it possibly can. Users will eventually overcome their distrust and adopt the tool if they can be assured it works.”
But some of the distrust around this particular use of conversational assistants seems to center on a lack of quality control. If you ask Duplex to call your salon and make an appointment to get your hair cut, how do you really know the appointment was made?
“I’m not sure,” Panko says, “aside from letting the user listen in on the call, which defeats the time they saved by allowing it to call for them. Again, I think the need for monitoring ties back to the idea of proving results. If a consumer knows that Duplex will work correctly for them almost all of the time, then they will have less desire to monitor the call. This technology is so new to the consumer-facing marketplace, though, that people still don’t fully grasp its capabilities.”
So how do conversational assistants earn our trust?
“Putting voice assistants in our homes and on our phones is a great strategy for earning trust, because it gives them a chance to demonstrate that they can perform simple, inconsequential tasks like telling us today’s date and tomorrow’s weather forecast before we let them do something more complicated,” says Dahl. “Making dinner reservations might be the next step. This isn’t all that different from gradually giving a child or a new employee more and more responsibilities (and checking their work). Certainly at voice assistants’ current level of competence, people are smart to adopt a ‘trust but verify’ strategy with them.”
“Maybe, eventually, voice applications will be up to complex and risky tasks, like planning a vacation or shopping for a house for us, but they have to be able to show us that they can do simple things first,” Dahl adds. Until then, there are always plenty of timers for Siri to set.