IBM's Watson Brings Cognitive Computing to Customer Engagement
Two or three more system upgrades are needed for the speech capabilities to be robust enough to live up to Watson's early hype. "I'm mostly looking toward the next-generation Watson and the one after that, when speech recognition should be totally seamless," he says.
Dan Miller, founder and principal analyst at Opus Research, says IBM is uniquely positioned in this regard because of an ongoing partnership with Nuance Communications. "IBM, with its partner Nuance, is up to tackling the challenges of speech-enabled customer service," he says. "Automatic speech recognition technology today is certainly accurate enough to serve as the grist for high-quality customer service, but it's not something that IBM, even with Watson, will do alone."
Nonetheless, Miller is confident in IBM's ability to master the intricacies of such an application. "IBM is one of a handful of companies that can dominate customer engagement technologies," he says. "Its long-standing investment in Watson has led to great advances in natural language understanding and machine learning."
That is where Watson truly shines. Watson proved on Jeopardy! that it has a unique ability to understand the nuances of human language, process questions asked in a format that mirrors the way people think, and quickly sift through vast amounts of data for relevant, evidence-based responses to the questions thrown at it. The beauty of Watson, IBM claims, is that its cognitive systems can understand the context within users' questions and even improve their own performance by continuously learning from experience.
Consumers can interact with Watson in plain English, directly or through an agent, to get personalized responses to questions and receive valuable insight with supporting evidence and a confidence score. The integrated Ask Watson feature greets and offers help to customers via a Web site chat window or a mobile push alert. "Speech recognition is just a shell for Watson," Miller says, "because the power of the Watson platform revolves around deep domain knowledge with natural language processing and machine learning."
Watson, in its current form, "is basically a way to get at the implications of big data," Bill Meisel, executive director of the Applied Voice Input/Output Society (AVIOS) and principal at TMA Associates, stated in an email to Speech Technology magazine.
"An organization might have a number of text-based resources with information that might provide an answer to a customer's questions. Watson analyzes this body of data and converts it into answers accessible by natural language inquiries," Meisel explained. "The focus is inquiries by text...although any speech-to-text engine could create a text entry in theory."
IBM has said that it hopes to eventually equip Watson with a conversational speech interface similar to Apple's Siri. A member of the IBM Research team in India working on the Watson project confirmed to Speech Technology magazine that "speech technology integration with commercial Watson offerings is planned for the future."
But for Watson to truly succeed in the customer service role, it will need to be expanded with the ability to carry on a conversation.
Both companies are committed to reshaping the future of speech-enabled interactions.