
Google Brings Voice Search To iPhone

iPhone users can now perform speech-enabled Internet searches, thanks to a free update released by Google yesterday for its iPhone application.

The update builds on the previous version of Google Mobile App for iPhone and offers two new ways for users to search the Web: by voice and by location.

With the update, users can perform Google searches by simply speaking queries into their iPhones. Google’s Voice Search uses speech recognition technology to transform spoken words into text and then runs the query through Google’s search engine as if it had been typed manually.
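
In rough terms, the flow is: capture audio, transcribe it to text, then issue an ordinary Web search with the transcript. The sketch below illustrates that pipeline in Swift; it substitutes Apple's Speech framework for Google's proprietary server-side recognizer, and `audioFileURL` is a placeholder for the recorded query, so treat it as an illustration rather than Google's actual code.

```swift
import Speech
import Foundation

// Placeholder for the user's recorded query (an assumption, not Google's format).
let audioFileURL = URL(fileURLWithPath: "query.caf")

// Build an ordinary Google search URL from a transcript.
func searchURL(for transcript: String) -> URL? {
    var components = URLComponents(string: "https://www.google.com/search")
    components?.queryItems = [URLQueryItem(name: "q", value: transcript)]
    return components?.url
}

// Transcribe the audio, then search with the result: speech in, search results out.
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
recognizer?.recognitionTask(with: request) { result, error in
    guard let result = result, result.isFinal else { return }
    let transcript = result.bestTranscription.formattedString
    if let url = searchURL(for: transcript) {
        print(url)  // in the app, this URL would load in the results view
    }
}
```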

“This speech recognition engine now—as opposed to the traditional [approach] where you have a few words or limited grammars to say—we have opened it up to any kind of Web query,” says Gummi Hafsteinsson, product manager at Google. “And I think that is probably the biggest breakthrough…So you say something, and we turn it into the query, and then we do a search, and then we turn that back to the application.”

According to Hafsteinsson, the update is possible now for two main reasons: computation power and Google’s significant experience with Internet queries.

“Google has a vast amount of computation power,” he says. “Speech recognition is a very computational-intensive application, and now we’ve been able to scale it to such a large scale that we can do a vast amount of data processing and we can handle a lot of load and we can actually do this on a real-time basis—something that just wasn’t possible before because the computers didn’t exist to do that.”

And, Hafsteinsson says, Voice Search was “trained on Google queries.”

“Because we know what people search for and how they type and so on, we used that to build a big language model that’s used to run this whole engine,” he says. “So, because of the behavioral patterns of how people do Web searches, we can better understand what you say and turn it into text with a higher rate of accuracy than possible before.”
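
As a toy illustration of what a language model built from queries does: the recognizer proposes several acoustically plausible transcriptions, and a model trained on past searches prefers the candidate that looks most like something people actually type. The Swift sketch below uses a tiny unigram model over a made-up query log; Google's production model is vastly larger and more sophisticated.

```swift
import Foundation

// A toy unigram language model built from a hypothetical log of past
// search queries. Google's real model is far larger and n-gram based.
struct QueryLanguageModel {
    private var counts: [String: Int] = [:]
    private var total = 0

    init(queryLog: [String]) {
        for query in queryLog {
            for word in query.lowercased().split(separator: " ") {
                counts[String(word), default: 0] += 1
                total += 1
            }
        }
    }

    // Log-probability of a candidate transcription, with add-one smoothing
    // so unseen words don't zero out the whole score.
    func score(_ candidate: String) -> Double {
        candidate.lowercased().split(separator: " ").reduce(0.0) { sum, word in
            let count = counts[String(word), default: 0]
            return sum + log(Double(count + 1) / Double(total + counts.count))
        }
    }
}

// The recognizer proposes acoustically similar candidates; the model
// picks the one that looks most like a real search query.
let model = QueryLanguageModel(queryLog: ["coffee shops near me", "weather today", "wine bars"])
let candidates = ["coffee shops", "coughing shots"]
let best = candidates.max { model.score($0) < model.score($1) }
print(best ?? "")  // "coffee shops"
```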

An additional feature of Voice Search detects the phone's movement and activates listening automatically when users want to perform a search: they simply raise the phone to their ear and speak the query.

According to Hafsteinsson, this feature was designed to make the application more natural and intuitive and “less socially awkward.”

“One thing we’ve noticed when we were doing user studies…pretty consistently across the board people felt a little bit awkward using the application when they held [the iPhone] like a walkie-talkie,” he says. “And we’ve noticed also in other applications, people speaking to machines sometimes feel a little weird...So that’s when this whole idea came…So you don’t have to press any buttons, you just move the phone to your ear, wait for the beep, and then you speak your query, and then we do the search.”
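
The article does not say which sensors Google's app reads, but on the iPhone a raise-to-ear trigger can be approximated with the built-in proximity sensor. A minimal UIKit sketch, offered only as one plausible approach:

```swift
import UIKit

// Ask iOS to report when something (an ear) is close to the screen.
UIDevice.current.isProximityMonitoringEnabled = true

NotificationCenter.default.addObserver(
    forName: UIDevice.proximityStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    if UIDevice.current.proximityState {
        // Phone is at the ear: play the beep and start recording the query.
    } else {
        // Phone lowered: stop recording and run the search.
    }
}
```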

The update also makes use of the iPhone's location detection capability to further personalize search results. Users can simply speak queries like "coffee shops," "weather," or "wine bars" and receive results specific to their current location.

This, Hafsteinsson says, simply makes the application even easier for users. “The idea is always: What can we do to make the search faster and easier from a user point of view?” he says. “So whenever you do a query for local information…you don’t have to specify your location. So, one, it saves you time…but also it saves you from having to know that information.”
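
Below is a sketch of how a client might attach the current position to a query using CoreLocation. The `near` parameter is purely illustrative; the parameters Google's app actually sends are not public.

```swift
import CoreLocation
import Foundation

final class LocalSearch: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var pendingQuery: String?

    // Remember the query, then ask for a one-shot location fix.
    func search(_ query: String) {
        pendingQuery = query
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.requestLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let query = pendingQuery, let loc = locations.last else { return }
        var components = URLComponents(string: "https://www.google.com/search")!
        components.queryItems = [
            URLQueryItem(name: "q", value: query),
            // "near" is an illustrative parameter, not Google's actual API.
            URLQueryItem(name: "near", value: "\(loc.coordinate.latitude),\(loc.coordinate.longitude)")
        ]
        print(components.url!)  // e.g. a search for "coffee shops" near the user
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        // No fix available: fall back to a non-localized search.
    }
}
```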

Although Hafsteinsson couldn’t comment on future updates or innovations for the Google Mobile App for iPhone, he notes that any developments would be driven by improving the user experience. “It’s all about, for users, how to make it easier and faster for them to find information,” he says. “So I think innovation is going to continue to be on that line: What are the things we can do to make it even easier now to find information when you’re on the go?”
