• May 1, 2020
  • By Leonard Klie Editor, Speech Technology and CRM magazines
  • FYI

IBM, Apple Build on Natural Language Processing


IBM at an online event in late March outlined a plan to bring to market new natural language processing (NLP) technology that will help computers better understand human communication and companies better communicate with their customers.

The company is working on commercial applications of the NLP technology developed as part of IBM Research’s Project Debater, an artificial intelligence-powered system originally designed to compete against a world-class human debater in the same way that Watson did on the TV game show Jeopardy! The Project Debater technology is being rolled into IBM’s Watson product line—which includes Watson Discovery, Watson Assistant, and Watson Natural Language Understanding—IBM officials said during an online version of the IBM Innovation Preview related to artificial intelligence and NLP.

IBM’s work with Project Debater involved creating a system that could listen to an opponent’s argument, understand it, and quickly craft a response from a large corpus of data. That requires a robust NLP engine, one IBM now hopes to move into production so businesses can improve their communications with customers.

“NLP is the brains of business today,” said Rob Thomas, general manager of data and AI at IBM. For companies, the technology can promise shorter call times, reduced workloads, and increased productivity, he said.

Ruchir Puri, a chief scientist at IBM Research, said two of the biggest business uses for NLP are customer self-service and agent-assisted service through contact centers.

Additionally, the two business-critical capabilities of NLP are “discovering the right insights” and then using them to guide conversations with customers, he said.

From a business standpoint, Project Debater will allow companies to take controversial topics, identify the pros and cons of different positions, and then help them make informed decisions, said Aya Soffer, vice president of AI technologies at IBM Research.

IBM, she added, “has a very rich agenda” for research in NLP, noting that the Project Debater work has already been ongoing for seven years. A big part of the Debater development involves sentiment analysis, which IBM experts extol as a huge boost for businesses.
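At its simplest, sentiment analysis assigns a polarity to a piece of text. The toy lexicon-based scorer below illustrates only that basic idea; the word lists and scoring rule are invented for this example and bear no relation to IBM's models.

```python
# Toy lexicon-based sentiment scorer -- an illustration of the basic idea
# behind sentiment analysis, not IBM's approach. The word lists and the
# scoring rule are invented for this example.

POSITIVE = {"great", "helpful", "fast", "love", "excellent"}
NEGATIVE = {"slow", "broken", "terrible", "hate", "poor"}

def sentiment(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Production systems use machine-learned models rather than fixed word lists, but the input and output contract is the same: text in, polarity out.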

Beyond that, another key capability of NLP will involve more efficiently combing through large amounts of data, conducting research, fielding incoming communications, and ultimately improving customer service.

But before that can happen, a lot more work needs to take place not just in sentiment, but also in context and language variations. Specific attention has to be given to idioms and industry- and company-specific terminology, Thomas said.

When tied into IBM’s existing Watson apps, the technology can also make it easier for companies to pore through and derive insights from large data warehouses.

NLP technology needs to be able to understand the central themes in documents, classify them into specific categories, retrieve other necessary information, and then generate summaries from massive amounts of data, Puri said.
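The classify-then-summarize pipeline Puri describes can be sketched in miniature. The keyword categories and the frequency-based extractive summarizer below are assumptions made purely for illustration, not Watson's implementation.

```python
# Minimal sketch of a classify-then-summarize pipeline. The categories and
# the frequency-scoring summarizer are invented for illustration; real
# systems learn both from data.
from collections import Counter
import re

CATEGORIES = {
    "billing": {"invoice", "payment", "charge"},
    "support": {"error", "crash", "help"},
}

def classify(doc: str) -> str:
    """Assign the category whose keyword set overlaps the document most."""
    words = set(re.findall(r"[a-z]+", doc.lower()))
    best = max(CATEGORIES, key=lambda c: len(words & CATEGORIES[c]))
    return best if words & CATEGORIES[best] else "other"

def summarize(doc: str, n: int = 1) -> str:
    """Extract the n sentences with the highest total word-frequency score."""
    sentences = re.split(r"(?<=[.!?])\s+", doc.strip())
    freq = Counter(re.findall(r"[a-z]+", doc.lower()))
    ranked = sorted(
        sentences,
        key=lambda s: -sum(freq[w] for w in re.findall(r"[a-z]+", s.lower())),
    )
    return " ".join(ranked[:n])
```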

That will take time, according to Luke Palamara, IBM’s program director of product management for AI-based NLP. Though advanced sentiment analysis is being rolled into Watson applications right now, other Project Debater innovations are still a way off. “We are just at the start of bringing [Debater] into our portfolio and integrating it with Watson,” he said.

IBM, though, is not the only industry heavyweight looking to increase its capabilities with natural language.

Apple in early April acquired Voysis, an Irish startup that offers an AI-powered platform for digital voice assistants to better understand people’s natural language inputs.

Financial terms of the deal were not disclosed, and Apple kept its reasons for the acquisition close to the vest, saying in a terse statement only that it “buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans.” Speculation, however, is that Apple will use Voysis’s technology to improve the speech recognition of its Siri voice assistant.

Voysis’s focus was on improving digital assistants for voice search and online shopping apps. Its technology used WaveNets, a neural network architecture developed by Google’s DeepMind in 2016 that processes raw audio signals directly, freeing the models from the constraints of traditional signal processing and producing a more human-like computer speech experience.
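The defining operation in a WaveNet-style model is the dilated causal convolution: each output sample depends only on current and past input samples, and stacking layers whose dilation doubles widens the receptive field exponentially. The numpy sketch below illustrates just that mechanism; it is not Voysis's or DeepMind's implementation.

```python
# Sketch of the dilated causal convolution used in WaveNet-style models.
# Illustrative only -- real models add gated activations, skip connections,
# and learned filters.
import numpy as np

def causal_dilated_conv(x: np.ndarray, w: np.ndarray, dilation: int) -> np.ndarray:
    """1-D causal convolution: y[t] = sum_k w[k] * x[t - k*dilation].

    Indices before the start of the signal are treated as zero, so no
    output sample ever depends on a future input sample.
    """
    y = np.zeros_like(x, dtype=float)
    for k, wk in enumerate(w):
        shift = k * dilation
        if shift == 0:
            y += wk * x
        else:
            y[shift:] += wk * x[:-shift]
    return y

def wavenet_stack(x: np.ndarray, w: np.ndarray, dilations=(1, 2, 4, 8)) -> np.ndarray:
    """Stack layers with doubling dilation; the receptive field grows to
    1 + sum(d * (len(w) - 1)) samples -- exponential in the number of layers."""
    for d in dilations:
        x = np.tanh(causal_dilated_conv(x, w, d))  # tanh as a stand-in nonlinearity
    return x
```

Because every layer is causal, changing a future input sample leaves all earlier outputs untouched, which is what lets such models generate audio one sample at a time.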

In late 2018, Voysis launched ViEW (Voysis Embedded WaveNet), making it available for use on any mobile device without needing cloud connectivity. ViEW models were designed to run natively on-device, removing the need for the data center entirely.
