Speech Analytics Can Help Steer Chatbot Interactions

Speech analytics has been used for decades to analyze recorded calls so companies can gather customer information in hopes of improving current processes and future interactions. These solutions have proven their worth in uncovering the topics being discussed, the emotions being expressed, larger market trends, the strengths and weaknesses of processes and products, how consumers perceive companies’ offerings, and areas where contact center agents might need additional training or coaching. They can also isolate the words and phrases used most frequently within a given time period, helping companies spot changes in consumer behavior and take action to reduce call volume.

While mostly applied to telephone conversations within contact centers, a growing use case pairs speech analytics with chatbots, enabling companies to make the bots more efficient at handling a higher volume and wider variety of interactions and to ensure that interactions requiring human intervention get transferred properly. With conversational artificial intelligence, speech analytics technologies and the related tuning of chatbots continue to improve, enabling companies to use chatbots to handle ever more complex interactions, according to Tapan Patel, senior director of go-to-market and product marketing for conversational artificial intelligence at Verint.

Speech analytics enables companies to determine where customers get stuck in conversations with chatbots. There could be certain words or phrases that the chatbot doesn’t understand, causing the interaction to default to a human agent or simply stall.

In healthcare, for example, that could be common because conversations often include complicated medical conditions, drug names, diagnostic codes, and other terminology. In financial services, complicated fund names, alphanumeric account codes, stock ticker symbols, and other industry-specific terms can likewise confound chatbots. Companies in these two verticals are using speech analytics to learn the terminology that customers, advisers, or practitioners actually use so they can better determine when a hand-off to a human agent is required and when a chatbot can handle the situation more quickly and efficiently. As speech technology informs companies about the terms used and topics discussed, they can adjust the chatbot’s responses and update their workflows accordingly, Patel says.
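
To make that concrete, here is a minimal sketch (with a hypothetical data model, not any vendor’s actual API) of how transcripts might be mined for the terms a bot fails to recognize: tally the unfamiliar vocabulary in turns that triggered the bot’s fallback intent.

```python
from collections import Counter

# Minimal sketch with a hypothetical data model: each turn records the
# user's utterance and the intent the bot resolved it to. Turns that hit
# the "fallback" intent are the ones the bot did not understand, so the
# unfamiliar words in those turns are candidates for retraining.
def fallback_terms(turns, known_vocab, top_n=20):
    counts = Counter()
    for turn in turns:
        if turn["intent"] == "fallback":
            for word in turn["utterance"].lower().split():
                if word not in known_vocab:
                    counts[word] += 1
    return counts.most_common(top_n)

turns = [
    {"utterance": "I need a refill of atorvastatin", "intent": "fallback"},
    {"utterance": "check my balance", "intent": "account_balance"},
    {"utterance": "atorvastatin 20mg please", "intent": "fallback"},
]
known = {"i", "need", "a", "refill", "of", "my", "please", "check"}
print(fallback_terms(turns, known))  # [('atorvastatin', 2), ('20mg', 1)]
```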

Some of the most significant advances in speech analytics and chatbots have taken place in the healthcare and financial services industries, but retail is currently the most advanced, according to Mike Szilagyi, general manager of the Cloud CX portfolio at Genesys. Big-box retailers like Walmart benefit from the sheer volume of interactions available for analysis to train their bots, while other retailers, such as Nordstrom, are driven by their dedication to improving customer experiences.

Equally important, analytics technology vendors have stepped up their offerings with newer capabilities that have made them faster, more accurate, and more versatile.

The underlying engines have improved, and users better understand the technologies and are more comfortable using them, according to Szilagyi. “They don’t require data scientists anymore to train these things, which used to take a lot of people to do. We can do a lot out of the box. The cloud has really helped; we can make changes fairly quickly,” he says.

The advances in speech analytics are also helping companies better train chatbots for internal use by agents, according to Szilagyi.

But to achieve those advantages, companies need integrated back-end systems, says Rob LoCascio, CEO and founder of LivePerson. “Digital transformation takes time, but it is well worth it. Connect your legacy and cloud systems with your conversational and voice analytics to provide agents with a unified view of each customer. This means you can provide relevant context and automation for each interaction, helping your agents build better relationships and deliver efficient, personalized experiences at scale.”

Companies are also using speech analytics to identify workflow issues that go beyond the chatbot being unable to handle an interaction, Patel adds. “Sometimes, there’s nothing wrong with the chatbot; there’s something wrong with your process.”

For an insurance company, for example, this might take the form of a large number of consumers repeatedly complaining about the timing of their policy renewal statements or claims forms. Likewise, if too many claims are pending or a high number of claims are pending for too long, this can indicate a larger issue in the claims process that the company might want to address quickly.

Just as a high volume of calls indicates an issue that needs to be addressed, companies have to ensure that an issue reaches a critical mass before changing workflows or how chatbots handle certain situations, Szilagyi says. Otherwise a company is spending time and resources on changes that deliver little, no, or even negative return. Some issues occur only once or very rarely.

Beyond enabling improved workflow and enabling the chatbots to handle increasingly complex interactions, speech analytics is also helping alert companies when to proactively transfer a call from a chatbot to a human agent by looking for keywords and terms, such as “I’m angry,” “I’m upset,” and so on, according to Patel. “We can look at positive experiences and negative experiences to see which factors are driving escalation to the agent and where in the conversation flow the escalations are happening.”
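
In its simplest form, that kind of keyword-driven hand-off might look like the sketch below. The phrase list is hypothetical, and real deployments would pair patterns like these with sentiment scores from the analytics engine rather than relying on keywords alone.

```python
import re

# Hypothetical phrase list; production systems combine such patterns
# with sentiment analysis rather than keyword matching alone.
ESCALATION_PHRASES = [
    r"\bi'?m angry\b",
    r"\bi'?m upset\b",
    r"\bspeak to (a|an)? ?(human|agent|person)\b",
]

def should_escalate(utterance: str) -> bool:
    """Return True if the bot should hand this turn to a human agent."""
    text = utterance.lower()
    return any(re.search(pattern, text) for pattern in ESCALATION_PHRASES)

print(should_escalate("Honestly, I'm upset with this service"))  # True
print(should_escalate("What's my account balance?"))             # False
```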

By examining the speech analytics information more deeply, a company can group together calls that escalate from the chatbot to the agent by category, determine where in the interaction a customer declined the self-service option, and work to improve self-service capabilities at that point so fewer future calls need to go to an agent, Patel says.
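<br></br>
As a rough illustration of that grouping (all field names hypothetical), each escalated conversation can be tagged with a category from the analytics engine and the turn at which the customer left self-service; the median escalation turn per category then shows where each kind of conversation breaks down.

```python
from collections import defaultdict
from statistics import median

# Hypothetical records: one per escalated conversation, tagged with the
# category assigned by the analytics engine and the turn number at which
# the customer declined self-service.
def escalation_profile(escalations):
    by_category = defaultdict(list)
    for e in escalations:
        by_category[e["category"]].append(e["escalated_at_turn"])
    return {
        category: {"count": len(turns), "median_turn": median(turns)}
        for category, turns in by_category.items()
    }

logs = [
    {"category": "billing_dispute", "escalated_at_turn": 4},
    {"category": "billing_dispute", "escalated_at_turn": 5},
    {"category": "claim_status", "escalated_at_turn": 2},
]
print(escalation_profile(logs))
# {'billing_dispute': {'count': 2, 'median_turn': 4.5},
#  'claim_status': {'count': 1, 'median_turn': 2}}
```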

Speech analytics is also being used to help determine which interactions are ripe for chatbot automation. This is crucial, as Gartner in early June reported that only 8 percent of consumers used a chatbot in their most recent customer service experience, and, more worryingly, only a quarter of that 8 percent said they would use that chatbot again. According to Gartner, these are the main use cases for chatbots:

  • order/purchase (52 percent);
  • account information (43 percent);
  • payment/financial transaction (40 percent);
  • submitting feedback (38 percent);
  • troubleshooting (36 percent);
  • account changes (27 percent);
  • check status (26 percent);
  • complaints (25 percent);
  • registration/activation (24 percent);
  • gathering information on products/services (19 percent); and
  • change in products/services (18 percent).

“Chatbots aren’t effective for all issue types,” Michael Rendelman, senior research specialist in the Gartner Customer Service and Support practice, said in a statement. “As generative AI makes them more advanced, customer confusion about what chatbots can and can’t do is likely to get worse.” It’s up to service and support leaders, Rendelman maintains, to guide customers to chatbots when it’s appropriate for their issue and to other channels when those are more appropriate.

Speech analytics can help them get that information.

The Data Is the Key

With chatbots or any other interaction type, the benefits of speech analytics depend on excellent data integration, Patel points out. Bots, and the underlying technologies that support them, should be able to bring in data from various sources in a continuous loop for the best results. “Combining your voice data with your CRM data will give you a much broader picture of the issue that you are trying to solve. Organizations looking into speech analytics cannot overlook the data integration issue. Many organizations have to do this before applying speech analytics to conversational AI.”
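
A bare-bones sketch of that integration step, with hypothetical field names, might join per-call analytics output to CRM records on a customer ID:

```python
# Hypothetical fields throughout: per-call speech-analytics output
# (topic, sentiment) is enriched with CRM attributes keyed on customer
# ID, so each conversation carries the customer's broader context.
def join_voice_with_crm(call_records, crm_by_customer):
    for call in call_records:
        profile = crm_by_customer.get(call["customer_id"], {})
        yield {
            **call,
            "segment": profile.get("segment"),
            "lifetime_value": profile.get("lifetime_value"),
        }

calls = [{"customer_id": "C1", "topic": "billing dispute", "sentiment": -0.6}]
crm = {"C1": {"segment": "premium", "lifetime_value": 4200}}
print(list(join_voice_with_crm(calls, crm)))
# [{'customer_id': 'C1', 'topic': 'billing dispute', 'sentiment': -0.6,
#   'segment': 'premium', 'lifetime_value': 4200}]
```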

The cleansing of the data is important as well, Patel adds. Though this can be time-consuming, it’s necessary to maximize the benefits of pairing speech analytics with chatbots. “AI training data is equally important. The more conventional IVR systems will continue to be replaced by intelligent conversational AI systems,” he says.

The conversational AI systems will provide more meaningful data, including keywords, sentiment, etc., to make the speech analytics systems that much more effective, according to Patel.

By combining interaction data with customer profiles, looking beyond the context of any single medium or channel, companies can create a unified view of customer interactions and achieve a comprehensive understanding of customer behavior and preferences, according to Jaya Kishore Reddy, chief technology officer of yellow.ai.

A rich data layer should be established to accommodate ongoing customization and evolution of speech analytics performance, allowing businesses to tailor the system to meet specific business demands, Reddy adds. “This ensures that the analytics solution remains adaptable and aligned with changing requirements. Implementing an ongoing feedback loop mechanism is also very crucial. This allows for continuous iteration and learning from the analysis of speech analytics.”

And in light of the growing popularity of bots, a new category of analytics is emerging exclusively for this channel. With chatbot analytics, companies can measure important metrics like self-service containment rate, retention rate, average session length, the total number of messages sent and received, response time, the most active hours for bot usage, the number of messages handled or not handled by bots, and the number of daily new users.
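
A few of those metrics are straightforward to compute from session logs. The sketch below uses hypothetical session records, with “contained” meaning the session ended without escalating to a human agent.

```python
# Hypothetical session records: whether the session escalated to a human,
# how many messages were exchanged, and how long it lasted.
def chatbot_metrics(sessions):
    total = len(sessions)
    contained = sum(1 for s in sessions if not s["escalated"])
    return {
        "containment_rate": contained / total,
        "avg_session_length_sec": sum(s["duration_sec"] for s in sessions) / total,
        "avg_messages_per_session": sum(s["messages"] for s in sessions) / total,
    }

sessions = [
    {"escalated": False, "messages": 6, "duration_sec": 120},
    {"escalated": True, "messages": 9, "duration_sec": 300},
    {"escalated": False, "messages": 4, "duration_sec": 90},
]
print(chatbot_metrics(sessions))
# {'containment_rate': 0.666..., 'avg_session_length_sec': 170.0,
#  'avg_messages_per_session': 6.333...}
```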

Biggest Mistakes

Companies that want to maximize the benefits they receive from speech analytics and chatbots need to ensure they commit enough resources to the effort, according to Patel. “You need to have the domain experts to understand, for example, claims processing, or people who really understand billing disputes. Those are the business or domain experts who understand the workflow and would know which data and insights are important and which KPIs and metrics you should be measuring on a continuous basis. Speech analytics doesn’t happen in a vacuum. You have to identify the right words and phrases, groups, and categories in order to get the complete picture.”

Some companies use different teams for speech analytics and for chatbot/conversational AI, according to Patel. Speech analytics comes under a larger AI umbrella, while chatbot/conversational AI is under the wider CX practice. “Bringing those two teams together definitely would help,” Patel says.

Companies miss an opportunity, though, when they start deploying speech analytics without first determining the issue or issues that they are trying to solve, according to Patel. “You can go in many different directions. You should pick one or two use cases, build muscle around that, and identify the risk and failure points to use in future use cases. That’s where these technologies could be really helpful in creating better interactions, finding automation opportunities, or improving your self-service options.”

Similarly, companies need to monitor words and phrases on a continuous basis, not just over a short period of time, to see how terminology and concerns change over time, Patel says. “What you’re trying to measure and how long you are measuring it are equally important.”
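
One simple way to do that, sketched below with hypothetical transcript records, is to track per-week frequencies of the terms of interest rather than taking a single snapshot.

```python
from collections import Counter, defaultdict

# Hypothetical transcript records, each carrying a week label. Comparing
# per-week counts shows how terminology and concerns shift over time.
def weekly_term_trends(transcripts, terms):
    trends = defaultdict(Counter)
    for t in transcripts:
        text = t["text"].lower()
        for term in terms:
            trends[term][t["week"]] += text.count(term)
    return {term: dict(sorted(weeks.items())) for term, weeks in trends.items()}

data = [
    {"week": "2023-W22", "text": "My renewal statement is late again"},
    {"week": "2023-W23", "text": "Still waiting on the renewal statement"},
    {"week": "2023-W23", "text": "Question about my claim"},
]
print(weekly_term_trends(data, ["renewal", "claim"]))
# {'renewal': {'2023-W22': 1, '2023-W23': 1}, 'claim': {'2023-W23': 1}}
```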

Another mistake some companies make, according to Patel, is not taking full advantage of speech analytics, instead relegating it to simply monitoring rather than using it to improve chatbot performance and workflow.

However, trying to do too much with speech analytics and chatbots, particularly when starting out, is another mistake, Reddy says. “It is advisable to deploy analytics in a phased and iterative manner, gradually building performance capacity and improving productivity levels. Another common mistake is the lack of concrete optimization metrics. To ensure effectiveness, businesses should invest in setting clear, business-driven goals and regularly review and benchmark target key performance indicators (KPIs) to measure the impact and success of the speech analytics implementation.”

Even as speech analytics helps chatbots handle more interactions, companies need to remember the human side of the equation as well, LivePerson’s LoCascio adds. “To successfully integrate messaging, voice technologies, and analytics, you can’t forget that these technologies should be viewed as a complement to human agents, not a replacement.”

Companies are just now scratching the surface of what they will be able to do with the technologies, according to Szilagyi. “The generative AI and the large language models that are coming right now are going to make chatbots far better than they are today. The biggest change in how bots will behave is that they will be more personal; interacting with them will feel and sound like interacting with a human.”

Szilagyi also expects the technologies to provide better sentiment analyses and summarizations. Genesys is working on a prototype feature that will summarize what happened in a conversation with a bot or human agent, whether in a single interaction, a cross-channel interaction, an interaction that uses a combination of messaging platforms, or an interaction that occurs over the course of several days.

Advances in AI will occur at a much faster rate in the future, Patel adds. But rather than adding the newest AI as soon as it becomes available, companies should focus on the use cases to determine if the newest iteration of the technology and the additional expense will produce better returns than the AI the company is already using.  

Phillip Britt is a freelance writer based in the Chicago area. He can be reached at spenterprises1@comcast.net.
