Eyes-Busy, Hands-Busy Computing
Several years ago, I accompanied a repairman named Gus as he climbed around on rooftops diagnosing, repairing, and fine-tuning HVAC (heating, ventilation, and air conditioning) systems. Gus used a checklist to examine schematics, engineering drawings, maintenance history, and instructions for repairing worn or broken parts. He recorded repairs and observations for billing purposes and to use during the next scheduled maintenance visit, but preparing and managing the paperwork was a problem.
Before Gus left for the HVAC site, he'd print the information he might need and place the papers on a clipboard. When he needed both hands to replace a part, he set the clipboard on a ledge. Sometimes it would fall off, and fishing it out from the depths of the HVAC could take half an hour. So Gus carried strips of Velcro, which he used to fasten the clipboard to the HVAC unit. Even then, the clipboard was not always visible when he needed it.
Fast forward a few years.
Smartphones and tablets have replaced clipboards. Gus no longer prints documents before visiting the site, but downloads what he needs from the Internet. He enters his observations and actions directly into the smartphone or tablet. For complicated tasks, the smartphone displays instructions on its screen and reads them to Gus, so he uses his eyes and hands to complete his task and uses voice commands such as "next" to proceed to the next instruction. Still, the smartphone or tablet has some usability problems. For example, when he's making repairs, Gus can't always see the screen.
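The hands-free "next" interaction described above can be sketched as a simple step navigator: the app tracks which instruction is current, reads it aloud, and advances or backs up on recognized voice commands. This is an illustrative sketch only; the class, method, and command names are assumptions, not code from any actual repair app.

```python
class StepNavigator:
    """Walks through a list of repair instructions one step at a time."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current(self):
        return self.steps[self.index]

    def handle(self, command):
        """Apply a recognized voice command and return the step to read aloud."""
        if command == "next" and self.index < len(self.steps) - 1:
            self.index += 1
        elif command == "previous" and self.index > 0:
            self.index -= 1
        # "repeat" (or any unrecognized command) re-reads the current step
        return self.current()


nav = StepNavigator([
    "Shut off power to the unit.",
    "Remove the access panel.",
    "Replace the worn fan belt.",
])
print(nav.handle("next"))  # advances to and returns the second step
```

In a real system the return value would be passed to a text-to-speech engine rather than printed; clamping at the first and last steps means a stray "next" at the end simply re-reads the final instruction instead of failing.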
Move over, Dick Tracy, with your two-way wrist radio. Both Samsung and Apple are rumored to be creating smartwatches that not only contain a telephone but can also display HVAC information. Because Gus can wear the smartwatch on his wrist, he can view the display while making repairs. However, the watch's display is small, and Gus may find it difficult to navigate the HVAC documents or see the display during complicated repairs.
Head-mounted displays have been used for virtual reality applications (where the user perceives an alternate world) and augmented reality applications (where information about the real world is superimposed on the user's view of it). Google Glass is a head-mounted display embedded into a pair of glasses that contains an Android computer, a touch pad, and a display embedded in the upper corner of one of the lenses. Wearing Google Glass, Gus can see the HVAC machinery while reviewing documents on the lens display. Gus can also take photos and videos for future reference.
Occasionally speech recognition systems fail to understand Gus' commands. Alternatives to voice commands include (a) head movements (detected by motion and orientation detectors); (b) touch pad manipulations; (c) camera and vision technology that recognizes a collection of finger and hand gestures (sort of a "sign language" for controlling the Android device); and (d) eye-tracking technology available on some head-mounted displays, where Gus can focus his eyes for a few moments on an option in a displayed menu to select it. Usability testing is necessary to learn which combination of technologies is appropriate for repair applications.
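One way to think about these alternatives is that every input channel — speech, touch pad, head gesture, gaze dwell — maps into the same small command vocabulary, so any one modality can stand in when another fails. The sketch below illustrates that mapping idea only; the modality and event names are invented for illustration and do not come from any particular device's API.

```python
# Each (modality, raw event) pair maps to one shared command, so the
# step navigator never needs to know which input channel was used.
COMMAND_MAP = {
    ("speech", "next"): "next",
    ("touchpad", "swipe_forward"): "next",
    ("head", "nod"): "next",
    ("gaze", "dwell_next_button"): "next",
    ("speech", "previous"): "previous",
    ("touchpad", "swipe_back"): "previous",
    ("head", "shake"): "previous",
}


def to_command(modality, event):
    """Translate a raw input event from any modality into a command."""
    return COMMAND_MAP.get((modality, event))  # None if unrecognized


print(to_command("head", "nod"))  # same command a spoken "next" produces
```

Keeping the mapping in a single table also makes the usability testing mentioned above easier: enabling, disabling, or re-binding a modality is a data change rather than a code change.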
Many activities require the use of both hands and eyes, including assembling, diagnosing, repairing, and fine-tuning machines and devices of all types. What's causing that sound in my car's engine? How do I set the margins in Microsoft Word? Why won't my printer work? Every machine and device should have a Quick Response (QR) code that users can scan to automatically download the instructions. Imagine early Christmas morning, when a parent needs to assemble a bicycle or another "some assembly required" gift. A quick photo of the QR code downloads an app that walks you through each step and leaves your hands and eyes free to perform each instruction. You no longer need to read cryptic instructions on small bits of paper or look for missing instructions.
To make do-it-yourself instructions widely usable, the formats and procedures for instructions should be standardized, so that the same set of instructions works on a variety of devices, and instructions for different tasks follow a standard, consistent protocol that makes them easy for users to perform.
James A. Larson, Ph.D., is an independent speech consultant. He is co-program chair for SpeechTEK and teaches courses in speech user interfaces at Portland State University in Oregon. He can be reached at email@example.com.