MIT Project Could Yield a New Kind of Mobile Speech App

Researchers at the Massachusetts Institute of Technology are working to develop a self-navigating wheelchair capable of learning the locations within a building and then transporting users to those locations in response to verbal commands.

Through speech recognition technology, users will be able to speak commands—"take me to the bathroom" or "go to the conservatory"—and the wheelchair will automatically maneuver from place to place based on a map stored in its memory.

"We’re using speech recognition developed inside the Computer Science and Artificial Intelligence Laboratory developed by the Spoken Language Systems Group," says Seth Teller, professor of computer science and engineering and head of the Robotics, Vision, and Sensor Networks Group at MIT's Computer Science and Artificial Intelligence Laboratory. "What’s novel about what we’re doing is that we’re trying to resolve ambiguous utterances by the users. So if the user, for example, says ‘take me to the elevator bank’ and there are two or three elevator banks in the building, the system will generate a question back to the user—something like ‘which elevator bank did you mean?’ … So it initiates a dialogue with the user rather than simply taking speech in stone and trying to do the best it can with it."

Instead of relying on manually captured maps, the MIT wheelchair learns about its environment through a guided tour. The chair is simply pushed around its new environment—an office, home, or hospital—as the user identifies rooms and locations: the study, the breakfast nook, the nurse’s station. The wheelchair prototype generates its maps and navigates using a network of WiFi nodes placed throughout the environment.
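
The article does not spell out how the WiFi nodes support mapping, but one standard technique consistent with the description is signal-strength fingerprinting: at each stop on the tour, the chair records how strongly it hears each node, and it later localizes by finding the closest stored reading. The sketch below is hypothetical, with invented node names and signal values.

```python
import math

# Hypothetical WiFi-fingerprint map: during the guided tour, the chair
# stores the signal strength (dBm) it sees from each node at every
# location the user names. All values here are invented.
fingerprints = {
    "nurse's station": {"node_a": -40, "node_b": -70, "node_c": -85},
    "breakfast nook":  {"node_a": -75, "node_b": -45, "node_c": -60},
}

def localize(reading, fingerprints):
    """Return the stored location whose fingerprint is nearest (in
    Euclidean distance) to the current signal-strength reading."""
    def distance(fp):
        return math.sqrt(sum((reading.get(n, -100) - s) ** 2 for n, s in fp.items()))
    return min(fingerprints, key=lambda loc: distance(fingerprints[loc]))

print(localize({"node_a": -42, "node_b": -68, "node_c": -80}, fingerprints))
# -> "nurse's station"
```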

"The other novel thing we’re doing is using speech to help the wheelchair learn a map of the environment," Teller says. "So the scenario is—we haven’t deployed this yet, but this is our dream—the scenario is you get a brand new wheelchair with factory programming…and the first thing you do is give it a narrated guided tour of the space you’re in so it can learn the space layout. It’s too tedious to program in the layout of your space manually or give it floor plans or that kind of thing. So we simply treat it like a new person would be treated…You just walk it around the whole space and show it where everything is and after that it remembers the layout of the space which it learned using its onboard mapping abilities and the words that you used to describe each space. And later when you say ‘go to room 224’ it knows how to do it because you showed it room 224 earlier. And if you say ‘go to the magic garden’ and there isn’t one it’ll say ‘I don’t know what you mean; you didn’t tell me where it was.’"

Collaborating on the project with Teller are Nicholas Roy, the project lead and an assistant professor of aeronautics and astronautics, and Bryan Reimer, a research scientist at MIT's AgeLab.

According to Reimer, the chair will eventually be used in real trials at The Boston Home—a nursing home in Dorchester, Mass., where the majority of patients use wheelchairs.
