Improving Conversational Virtual Assistants with Natural Language Processing

As tools and platforms become more interconnected, having a good experience with a conversational virtual assistant requires strong performance from these adjacent technologies. “The core thing that has caused the natural language processing to be effective is increasing computing power,” Meisel says. An excellent example is Alexa, whose functions are executed by Amazon’s powerful back-end infrastructure. “Even though what you’re saying is being sent over the internet to Amazon, it’s processed on their web servers, understood, and then a response is returned, but there’s not much of a pause,” Meisel says. When systems have access to this kind of tremendous computing power, a big chunk of the complaints related to latency goes away.

The rise of Big Data has also been a major contributor to improvements in conversational virtual assistants. Systems and networks must analyze large quantities of data to get better. “The more data they have, the better they get,” Meisel says. “The better they are, the more people use them. The more they’re used, the more data they have.” The cycle will build on itself moving forward, giving developers more of what they need to make the experiences even better. These troves of stored data, which are gaining in value as they grow, can point virtual assistants toward more seamless conversational strategies and show developers all the branches a normal discussion with a user may take. Looking farther down the timeline, data will be able to help drive better models and enable companies to deploy more instances of virtual assistants.

Impacts on Users and Where Virtual Assistants Are Headed

Hebner cautions that getting a machine to hold a back-and-forth conversation is hard. “It’s still an emerging field to understand all the different twists and turns of language. Trying to build a machine that can hold a conversation with someone isn’t even on the near horizon.” Today’s systems can hold a conversation in a narrow band, though, offering real value in a number of use cases. “Now it’s incumbent on developers to branch out and understand context and other things so you can have these richer dialogues,” Hebner says. The computer needs to think like a human if it is to converse like a human in a useful and intelligent way.

User expectations have also changed, and developers need to take that into account. Wilpon says part of the problem is that consumers use personal virtual assistants regularly but only rarely encounter the enterprise side of the technology. “If you use it every day, you learn how Siri works,” Wilpon says. People know what phrases to use and how to pose a question for the best response. Interacting with an airline’s virtual assistant, however, may happen only once a year. “When you’re in the enterprise world, the user expectations are very different and their knowledge is very different,” Wilpon says. Rather than being limited to a narrow set of commands, consumers dialing in for help or information will see where the technology can really go. As these systems improve in the enterprise realm, fewer calls will be transferred to an agent, and consumers will receive the support they want faster and with less frustration. Their expectations of what personal conversational virtual assistants can accomplish will likely evolve as a result.

Additional effort and investment are still needed in dialogue management and dialogue flows. “There’s a lot of work in machine learning to be able to learn a dialogue and nonlinear flows,” Wilpon says. Humans move conversations around seamlessly, and the technology will eventually need to do the same. Organizations have already put considerable effort into pulling dates, times, and places out of conversations, but some are also looking at identifying and incorporating emotion in their virtual assistants as they explore how to direct interactions most effectively. “It’s where you try to figure out the state of the user and use that to help drive a conversation,” Wilpon says. It could mean that customers who are upset are transferred to an agent rather than routed to a virtual assistant, or have their dialogue flows structured differently. Longer term, some of these ideas may fall by the wayside, but a handful will likely make it into the market.
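To picture the kind of emotion-aware routing Wilpon describes, here is a minimal sketch. The word lists, scoring formula, threshold, and routing labels are purely illustrative assumptions; a production system would rely on a trained emotion or sentiment model rather than keyword counts.

```python
# Hypothetical sketch: route a caller based on a crude frustration estimate.
# The word lists, threshold, and routing labels are illustrative assumptions;
# real systems would use a trained sentiment/emotion model.

FRUSTRATION_WORDS = {"angry", "ridiculous", "terrible", "cancel", "unacceptable", "frustrated"}
CALM_WORDS = {"thanks", "please", "okay", "great"}

def frustration_score(utterance: str) -> float:
    """Return a rough 0..1 frustration estimate from keyword counts."""
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FRUSTRATION_WORDS)
    calms = sum(1 for w in words if w in CALM_WORDS)
    return max(0.0, min(1.0, (hits - 0.5 * calms) / max(len(words) * 0.2, 1)))

def route(utterance: str, threshold: float = 0.5) -> str:
    """Decide whether to keep the caller in the bot flow or escalate."""
    if frustration_score(utterance) >= threshold:
        return "escalate_to_agent"       # upset caller: hand off to a human
    return "continue_virtual_assistant"  # calm caller: stay in the automated flow

if __name__ == "__main__":
    print(route("This is ridiculous, I am so frustrated, cancel my account!"))
    print(route("Okay great, can you tell me my balance please?"))
```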

Organizations dedicated to improving the performance of their conversational virtual assistants are getting some additional help from other areas. Though the need for more and better data has been a stumbling block, new resources are available. “When you call a customer service call center, the call is recorded,” Meisel says. “So they have recordings of every call to an agent and also calls to the automated system.” For companies that lack the internal bandwidth to make sense of all that incoming information, analytics organizations can now offer a deep dive into the data, providing insight into what kinds of questions customers are asking, for example. “You can say, ‘How do customers ask to transfer money?’” Meisel says. Companies are also beginning to look at their own website FAQ pages to see the search terms and phrases customers use most, to help make their virtual assistants more intuitive and effective.
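As a rough illustration of the question Meisel poses (“How do customers ask to transfer money?”), the snippet below tallies the phrasings of transfer requests found in call transcripts. The sample utterances and keyword matching are hypothetical stand-ins for the far larger corpora and trained models a real analytics provider would use.

```python
# Minimal sketch: surface the phrasings customers use for a "transfer money"
# request in recorded-call transcripts. Sample data and keyword cues are
# illustrative assumptions, not a real analytics pipeline.
from collections import Counter

transcripts = [  # hypothetical utterances pulled from call recordings
    "I want to move money to my savings account",
    "Can you transfer two hundred dollars to checking",
    "I'd like to send money to my daughter's account",
    "Move money to savings please",
]

TRANSFER_CUES = ("transfer", "move money", "send money")

def is_transfer_request(utterance: str) -> bool:
    text = utterance.lower()
    return any(cue in text for cue in TRANSFER_CUES)

# Count the opening words of each matching utterance to show developers
# which phrasings their virtual assistant should be prepared to recognize.
phrasings = Counter(
    " ".join(u.lower().split()[:3])
    for u in transcripts
    if is_transfer_request(u)
)

for phrase, count in phrasings.most_common():
    print(f"{count:>3}  {phrase} ...")
```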

Development is another area that’s advancing to give users a better experience. New tools are making it easier to build NLP systems, prompting developers to expand what virtual assistants can comprehend. “If I can make it handle 10 things, it won’t be so bad to make it handle 100 things,” Dahl says. “And it won’t fall off the end of the earth when you ask it something off the wall.” This may help to reduce development time and effort, encouraging companies to continue adding to their assistants’ capabilities. “The user experience is really dependent on the developers’ efforts,” Dahl says. Even when the speech is recognized and the natural language is understood, she says, “The developer still has to make that result into something useful.” The value of the application, and its impact on what’s important to the business, will determine how many resources a company puts toward development.
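Dahl’s point about scaling from 10 capabilities to 100, and not “falling off the end of the earth” on off-the-wall requests, can be pictured with a small handler registry plus a fallback, as in the sketch below. The intent names, keyword matcher, and responses are hypothetical and do not reflect any particular toolkit’s API.

```python
# Hypothetical sketch: a registry of intent handlers plus a fallback, so the
# 100th capability is added as mechanically as the 10th, and unrecognized
# requests get a graceful response instead of a dead end.
from typing import Callable, Dict

HANDLERS: Dict[str, Callable[[str], str]] = {}

def intent(name: str):
    """Register a handler under an intent name (illustrative decorator)."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        HANDLERS[name] = fn
        return fn
    return register

@intent("check_balance")
def check_balance(utterance: str) -> str:
    return "Your balance is available once you verify your identity."

@intent("transfer_money")
def transfer_money(utterance: str) -> str:
    return "Sure, which account would you like to transfer from?"

def detect_intent(utterance: str) -> str:
    """Toy keyword matcher standing in for a trained NLU model."""
    text = utterance.lower()
    if "balance" in text:
        return "check_balance"
    if "transfer" in text or "move money" in text:
        return "transfer_money"
    return "fallback"

def respond(utterance: str) -> str:
    handler = HANDLERS.get(detect_intent(utterance))
    if handler is None:  # something off the wall: degrade gracefully
        return "I didn't catch that. I can help with balances or transfers."
    return handler(utterance)

if __name__ == "__main__":
    print(respond("What's my balance?"))
    print(respond("Can you order me a pizza?"))
```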

Julie Knudson is a freelance business writer who specializes in technology. She also covers healthcare, cybersecurity, risk management, and hospitality. Reach her at www.julieknudson.com.
