Many Gadgets, One Interface


Soon there will be hundreds of smart home gadgets and appliances, including thermostats, lights, security systems, and entertainment devices. Managing all these home gadgets using any convenient control device—desktop computer, laptop, smartphone, tablet, smart watch—will call for a unified user interface.

How will the ideal unified user interface control current and future home gadgets? Will it be a graphical user interface (GUI) with widgets and controls familiar to most consumers already? Will it contain schematic layouts with the gadgets' location within the home? Will it be a natural language user interface like Siri? Or something else, currently unknown?

Several companies market hubs for managing gadgets in the connected home. Smartphone and smart watch user interfaces, such as Insteon's app for the Apple Watch, let consumers control garage lights, doors, cameras, water heaters, home entertainment centers, and so on. Lutron Electronics' Caséta Wireless Smart Bridge supports Siri, the natural language interface popular on Apple products. Google, Microsoft, and Samsung are all promoting software to control home devices. And the Jibo social robot, Amazon Echo, Unified Computer Intelligence Corporation's Ubi, and similar voice-enabled devices will likely compete for the job of home gadget manager.

The unified user interface for smart homes will not be strictly a GUI or a voice user interface (VUI) but will combine the best features of both. For example, displays usefully show consumers the location of gadgets, which options are available, the results of earlier commands, and current status. After reviewing the display, consumers can then speak commands and options.

A well-designed unified user interface will exhibit these qualities:

Consistency. The unified interface must work consistently across multiple devices with a similar look and feel for controlling each device. There are several levels of consistency, including gadget names, command verbs and their meanings, parameter values and scales, conceptual models, and formats for help messages and hints.

Configurability. Consumers must be able to configure the unified user interface easily to support new gadgets and their commands and constraints, and remove old gadgets.
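One way to picture configurability is as a gadget registry that the interface consults. The following sketch is purely illustrative; the class and method names are hypothetical, not drawn from any product mentioned above:

```python
# Illustrative sketch of a configurable gadget registry (all names hypothetical).

class GadgetRegistry:
    """Holds the gadgets the unified interface currently knows about."""

    def __init__(self):
        self._gadgets = {}

    def add_gadget(self, name, commands):
        """Register a new gadget with its commands and parameter constraints.

        commands maps a command verb to a dict of parameter names and their
        allowed values, e.g. {"set volume": {"level": range(0, 11)}}.
        """
        self._gadgets[name] = commands

    def remove_gadget(self, name):
        """Remove a gadget the consumer no longer owns."""
        self._gadgets.pop(name, None)

    def commands_for(self, name):
        """List the command verbs available for a gadget."""
        return sorted(self._gadgets.get(name, {}))

registry = GadgetRegistry()
registry.add_gadget("living room TV",
                    {"turn on": {}, "set volume": {"level": range(0, 11)}})
registry.add_gadget("thermostat",
                    {"set temperature": {"degrees": range(50, 91)}})
registry.remove_gadget("thermostat")  # consumer replaced this gadget
```

Because constraints are registered alongside each command, the same registry can later drive the interface's help messages and input validation.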

Helpfulness. The unified user interface must provide hints and suggestions to assist the consumer in choosing the desired gadget, commands, and options; prohibit consumers from entering parameter values inconsistent with the requested command; help the consumer learn commands and their parameters; and assist when the consumer fails to remember commands and associated parameters. One approach is to use a display to present current status and request options, as shown in the following four steps:

1. Consumers click, touch, or speak the name of a gadget from a list or possibly from a schematic of the smart home layout showing all gadgets. The selected gadget name is displayed in the command line on the display.

2. Consumers review a list of possible commands and click, touch, or speak the desired one. When the correct command is selected, it is appended to the gadget name in the command line.

3. If the command requires parameters (volume, channel number), consumers may click, touch, or speak the appropriate values. The selected parameter values will be appended to the gadget name and command in the command line.

4. Consumers review the complete command, edit it, and, if necessary, tell the machine to process the command.
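The four steps above amount to incrementally assembling a command line, validating each piece as it is chosen. A minimal sketch of that flow, with hypothetical gadget and command names:

```python
# Sketch of the four-step command assembly described above (all names hypothetical).

# What the interface knows: gadgets, their commands, and parameter constraints.
GADGETS = {
    "kitchen lights": {"turn on": {}, "turn off": {}, "dim": {"level": range(0, 101)}},
    "TV": {"set channel": {"channel": range(1, 1000)}},
}

def select_gadget(name):
    # Step 1: the chosen gadget becomes the start of the command line.
    if name not in GADGETS:
        raise ValueError(f"unknown gadget: {name}")
    return [name]

def select_command(command_line, verb):
    # Step 2: the verb is appended only if the gadget supports it.
    gadget = command_line[0]
    if verb not in GADGETS[gadget]:
        raise ValueError(f"{gadget} does not support: {verb}")
    return command_line + [verb]

def select_parameter(command_line, param, value):
    # Step 3: parameter values inconsistent with the command are rejected.
    gadget, verb = command_line[0], command_line[1]
    allowed = GADGETS[gadget][verb].get(param)
    if allowed is None or value not in allowed:
        raise ValueError(f"invalid {param} for {verb}: {value}")
    return command_line + [f"{param}={value}"]

def process(command_line):
    # Step 4: the consumer reviews the complete request, then tells the
    # machine to process it.
    return " ".join(command_line)

line = select_gadget("kitchen lights")
line = select_command(line, "dim")
line = select_parameter(line, "level", 40)
print(process(line))  # kitchen lights dim level=40
```

Whether a step comes from a click, a touch, or speech makes no difference to the assembly logic, which is what lets the same flow serve every input mode.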

As consumers learn the names of gadgets, commands, and parameter values, they can speak the entire request. If consumers speak an invalid option, the user interface may either (1) apply automatic correction algorithms to select the most probable options, or (2) display the currently valid options on the display and ask consumers to touch, type, or speak the desired option.
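Strategy (1), automatic correction, can be approximated by fuzzy-matching the spoken text against the currently valid options. In this sketch, Python's standard-library difflib stands in for a real speech-correction algorithm:

```python
# Sketch of strategy (1): map a misrecognized option to the closest valid one.
# difflib is only a stand-in for a genuine recognition-correction algorithm.
from difflib import get_close_matches

def correct_option(spoken, valid_options):
    """Return the most probable valid option, or None to fall back to
    strategy (2): displaying the valid options for the consumer to pick."""
    matches = get_close_matches(spoken, valid_options, n=1, cutoff=0.6)
    return matches[0] if matches else None

valid = ["turn on", "turn off", "dim"]
print(correct_option("turn onn", valid))     # close enough: "turn on"
print(correct_option("open sesame", valid))  # None -> show valid options instead
```

The cutoff value controls how aggressive the automatic correction is; a conservative cutoff hands more cases to the consumer, which may be preferable when a wrong guess would, say, open the garage door.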

Flexibility. Consumers should be able to select and use any input mode available on the device and appropriate to the consumer’s environment. For example, in a noisy environment, consumers may enter requests by touching or typing. In quiet environments, they may speak commands. Or consumers may choose the mode of command entry based on preferences, convenience, and social conventions.

This is not a complete list of requirements, but it captures many fundamental requirements for unified user interfaces.

There will be chaos as more user interfaces appear. Some will succeed, some will fail, and some will consolidate as developers learn what consumers prefer and establish guidelines. It promises to be a wild ride. Let the smart home interface competition begin…

James A. Larson, Ph.D., is an independent speech technology consultant and teaches courses in speech technologies and user interfaces at Portland State University in Oregon.
