
Do In-Car Voice-Enabled Devices Distract Drivers?


Vendors Keep Firing on All Cylinders

In spite of AAA's research and subsequent media splash, car manufacturers and speech technology vendors have remained largely quiet and are going full throttle with new speech-enabled car systems and partnerships.

Earlier this year, Apple announced plans to update the integration of Siri on car touch screens, enabling users to get directions, access maps, make phone calls, and dictate and receive iMessages, eyes-free. In 2014, Siri will be available not only in high-end cars from Mercedes-Benz, Lexus, and Jaguar, but also in more budget-friendly vehicles from Kia, Hyundai, Chevrolet, Honda, Nissan, and others.

Nuance Mobile, the biggest provider of in-vehicle speech systems, issued a statement that didn't refute AAA's findings but maintained that more research is needed.

"We agree that driving distractions need to be minimized in an increasingly connected society," writes Rebecca Paquette, senior corporate communications manager of Nuance Mobile, in a blog post on the company's Web site. "Despite the interpretations of this recent study, now is not the time to stand in the way of this innovation [to address] concerns. From maps to music to phones and messaging, it is critical that the industry collaborate to find a safer, smarter way, and it's central to our mission here at Nuance."

Nuance is continuing to move forward with its in-car speech recognition partnerships. In September, Nuance announced that its Dragon Drive voice capabilities are the driving force behind the new voice-enabled Kenwood Audio Visual Navigation system. Kenwood's system provides drivers with navigation, content, music, Twitter and Facebook updates, message dictation, and other features via voice commands.

"Consumers want to engage with their favorite apps safely and productively behind the wheel," said Scott Caswell, a Kenwood executive, in a statement. "By voice-enabling our in-car experience, we are transforming the in-car experience, and creating a deeper sense of connectivity on the open road."

Speech Technology Experts Don't See Potholes

While speech vendors and car makers are staying out of the fray, several experts in the voice recognition field seem to agree that the AAA study is flawed and that future research on the question needs a better-designed methodology.

"They didn't compare the distraction of managing voice-activated email with manual email management, and the set of tasks they selected doesn't seem very systematic," said Deborah Dahl, principal at Conversational Technologies and chair of the World Wide Web Consortium's Multimodal Interaction Working Group, in an email. "I think texting happens a lot more often in the car than email, so it's probably a much bigger source of...problems. They should have looked at a representative range of the things the driver does in a car (other than driving) and compared the two ways of doing them."

Thomas Schalk, vice president of voice technology at Agero, and chair of The Speech Consortium, was less kind in his assessment of the study.

"People believe that what AAA has advertised about speech is misleading," Schalk says. "Others believe that AAA has been using information from the University of Utah as a publicity stunt. I happen to agree. I am not suggesting that the results from the study are incorrect. Instead, I am saying that using speech is safe while driving as long as the user interface is properly designed."

Dan Miller, senior analyst at Opus Research, believes that the University of Utah has derived some very useful findings, "but I feel like they are using some very precise measuring to draw conclusions about a concept that defies precision, measuring distraction. As the study shows, every activity other than two hands on the wheel and eyes forward results in some level of distraction."

"Some tasks, such as composing an email or text message, make heavy demands on cognitive functions like working memory and attention, regardless of how they're performed," Dahl says. "A better study would compare distraction resulting from simple tasks like controlling the temperature in the car and complex tasks like writing a speech both manually and using speech."

Schalk points out that if a driver is doing a secondary task not relevant to driving, such as texting, it will take a certain amount of time to complete that task. During that time, the driver's eyes may be off the road and his ability to fully focus on driving, from a cognitive load perspective, may be degraded. So it can be a combination of eyes-off-road and cognitive load that leads to unsafe driving.

"However, if that same person uses a speech interface to compose and send that same text message while driving, there is data that shows it can be safe to do with a very reasonable task completion time," Schalk says.

"We performed the research with Virginia Tech and I presented those results to NHTSA. [NHTSA] referenced our work when [it] published the visual-manual guidelines. However, NHTSA doesn't make many public statements about a particular study. Rather, NHTSA tends to comment based on a number of studies and multiple sets of data."

Dahl, Schalk, and Miller all agree that trying to compose long emails and texts using speech systems while driving is a bad idea.

"Personally, I don't see how saying 'Play the Grateful Dead' to my car radio is going to make me a more dangerous driver, but answering my emails with my normal verbose style may," Miller says.

Miller and Dahl agree that educating drivers about the operation of speech-enabled cars might help to reduce distracted driving.

"What's needed is a thorough evaluation of the risks and rewards of using hands-free, eyes-forward technologies that could be included in driver education programs," Miller says.

Schalk goes a step further and says that it's not the speech system that's the problem, it's the user interface. Schalk points out that as touch displays are becoming more dominant in cars, people are getting more comfortable using them. "When people see icons, they instinctively know that when they tap on them it opens an app. If you were to push the speech button in the car and the car said, 'Please tap or say your selection,' things would be much better. It's all about multimodality."

Some might argue that tapping commands in a car is just as much of a distraction as anything else, but Schalk disagrees. "The benchmark test that the NHTSA refers to in [its] visual and manual guidelines published this year uses manual tuning of a radio as a secondary task that's safe to do," he says. "A certain amount of visual manual interaction is absolutely safe to do. It's okay to turn up the volume or turn down the air conditioner."

As Nuance has said, it is continuing to examine the issues surrounding speech-enabled cars, an effort that Dahl encourages. "Vendors need to take a good look at this study and learn whatever they can," she says. "Even though the University of Utah study was not well designed, I think that we can conclude that cognitively demanding tasks compromise safe driving. Vendors should not ignore this."
