Analytics Is All the Rage
Anyone who’s been around the speech industry for a decade or more has to be struck by the huge marketwide transformation in just the past year or two. Speech went from a nice-to-have feature to a must-have seemingly overnight. Voice interfaces quickly became almost ubiquitous, allowing us to verbally control everything from our smartphones to the navigation systems in our cars and the ice cube output in our refrigerators. And systems have gotten far more intelligent, realistic-sounding, and accurate while requiring far less infrastructure and computing power. Innovations in artificial intelligence, cloud computing, and natural language processing have vastly improved nearly every segment of the speech technology market, but perhaps the greatest changes have taken place within the analytics sector. Advancements in speech analytics have been fast and furious, which is why we decided to dedicate this issue to the topic.
Today’s speech analytics solutions can deliver a wide array of insights from customer interactions, extracting valuable information from conversations and sifting through unstructured call data to identify problems earlier.
And as the technology advances, its use cases also expand. Solutions today are being used for customer experience management, call monitoring and summarization, agent performance monitoring, sales and marketing management, risk and compliance management, personally identifiable information redaction, sentiment analysis, competitive intelligence, business process monitoring, and predictive analysis, to name just a few of the many use cases that have emerged in just the past few years.
Within the larger analytics arena, the market for contact center analytics is booming. Analysts expect double-digit growth for the foreseeable future, for good reason:
“Contact center analytics is more important now than ever,” says Kathy Sobus, senior director of customer experience strategy at ConvergeOne, in our second feature, “Analytics Turns Its Sights on Interactions.”
And there’s no indication that interest will wane anytime soon. “Over the next year and beyond, contact center analytics will prove that contact center automation is a revenue opportunity, transforming contact centers from cost centers to centers of excellence,” says Gadi Shamia, CEO and cofounder of Replicant, in the article. “Contact center analytics will continue to be an important part of operational excellence for customers who want to achieve both efficiency and customer satisfaction.”
Read the feature and you will quickly see that contact center analytics is all the rage these days.
Speaking of rage, that is something a contact center operator never wants to encounter. Should it occur, however, contact centers today are better equipped to spot it and respond accordingly, thanks to emotion detection capabilities that are increasingly being built into speech analytics technologies.
Emotion detection is making inroads into all sorts of business processes, and research firm MarketsandMarkets projects the global emotion detection and recognition market to grow from $23.6 billion this year to $43.3 billion by 2027, as revealed in our cover story, “Interest Mounts for Emotion Detection.”
Based on those numbers, one could argue that emotion detection is even hotter than speech analytics alone.
At the same time, though, there are hurdles to overcome. Among those cited in the feature are the lack of uniformity in how people display emotions and the need to read both verbal and nonverbal cues and tie them together in the proper context. Another pressing concern is the uncanny valley, the theory that humans respond favorably to anthropomorphic agents only up to a point, rejecting those that become too humanlike. Emotions are a characteristic unique to humans, and the fear is that if technology starts tinkering around in that area, it might alter the balance of power and the natural order of things.
Whether in the contact center or elsewhere, speech analytics providers will need to ensure that their technologies can handle not only the growing number of interactions but also their sources and contexts. And if you haven’t considered changing up your speech analytics mind-set, it might be time for further analysis.
Leonard Klie is the editor of Speech Technology. He can be reached at firstname.lastname@example.org.