
Enhancing Voice Search with Sentiment Analysis


Sentiment analysis. Opinion mining. Whatever you call it, the purpose is to uncover whether a statement expresses positive, negative, or neutral intent. In simpler terms, sentiment analysis uses the enormous amount of data generated by consumers to understand how they express feelings, attitudes, and opinions about brands, services, products, and even the news. That data is typically gathered from social media websites, review portals, and search engines.

Sentiment analysis—whether for text or voice—extracts information such as:

  • Subject—what the customer is talking about
  • Polarity—the emotional state of the customer’s opinion, whether positive or negative
  • Opinion holder—who is expressing the opinion; can be a person or an entity
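To make the polarity piece concrete, here is a minimal sketch of a lexicon-based scorer. The word list and thresholds are invented for illustration; production systems use trained models rather than a hand-built lexicon:

```python
# Toy lexicon-based polarity scorer -- illustrative only.
# Real systems learn these weights from labeled data.
POLARITY_LEXICON = {
    "love": 1.0, "great": 0.8, "good": 0.5,
    "bad": -0.5, "broken": -0.8, "hate": -1.0,
}

def polarity(text: str) -> str:
    """Classify a statement as positive, negative, or neutral
    by summing per-word scores from the lexicon."""
    words = text.lower().split()
    score = sum(POLARITY_LEXICON.get(w.strip(".,!?"), 0.0) for w in words)
    if score > 0.2:
        return "positive"
    if score < -0.2:
        return "negative"
    return "neutral"

print(polarity("I love this speaker, it sounds great"))  # positive
print(polarity("My phone is broken and I hate it"))      # negative
```

Even a toy like this shows why voice adds difficulty: the word scores say nothing about how the sentence was spoken.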

It is said that 80% of the world’s data is unstructured, but there is valuable information in that data. Emails, phone call recordings, and live chat transcripts can all yield useful statistics about your business through the power of sentiment analysis.

Looking at its many benefits and practical implementations, sentiment analysis has helped not just gather but also sort the information widely available on public and private platforms online. It helps identify user-generated content on the internet, like reviews, forums, blogs, and, of course, social media posts.

Sentiment analysis is nothing new in the voice world, especially in call centers, where it can automatically structure voice data so it is easier to categorize. It can help you determine whether a statement is public opinion about a brand, a political opinion, or a service or product review. But as voice search emerges as the preferred way for consumers to find what they need, new uses for sentiment analysis arise.

Smart speakers such as Amazon Echo and Google Home have achieved widespread popularity: according to voicebot.ai, around 47.3 million people in the United States own a smart speaker. That works out to roughly one out of every five U.S. adults. In the U.K., smart speakers are slightly less popular, but one out of every eight households still has one.

Sentiment Analysis Use Cases

Voice search is becoming part of everyday life. So let’s get into how voice search and sentiment analysis work together.  

Let’s see what happens when you go to Google and search “Why isn’t my iPhone taking screenshots?”

That is a common, everyday question, but you should get very different answers when you type your search than when you speak it, because voice introduces additional signals, like the tone and volume of someone’s voice.

When you use voice search, an intelligent algorithm should know if you are frustrated and ready to spend money fixing your iPhone or if you are in the initial stages and would like to see some tutorials to fix the bug yourself.

Search engines should alter their results based on the sentiment they pick up from your voice search. Voice search can also help brands identify a user’s buying intent or learn whether they had a good experience with a product.

But right now, this is all just theoretical—the full potential of sentiment analysis is yet to be realized.

If you’re looking to pioneer this space, here are a few things to keep in mind. When dealing with text, sentiment analysis algorithms use adjectives and adverbs to understand the emotion in a piece of text. With voice, however, the tone of the user comes into play. It’s important to account for the different tones, accents, and word usage that arise from different cultures and backgrounds. For example, some people naturally talk louder or faster than others, some have heavier accents, some recordings include background noise, and some speakers may simply be angry.
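One way to illustrate why raw acoustics need normalization: a simple loudness measure (RMS energy) computed over a waveform chunk gives very different raw values for a naturally quiet and a naturally loud speaker saying the same thing. A hypothetical tone pipeline might therefore compare each chunk against a per-speaker baseline rather than an absolute level (the function names and sample data below are invented for illustration):

```python
import math

def rms_energy(samples: list[float]) -> float:
    """Root-mean-square energy of a mono waveform chunk."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def relative_loudness(chunk: list[float], speaker_baseline: float) -> float:
    """Loudness relative to this speaker's usual level, so a naturally
    loud talker isn't mistaken for an angry one."""
    return rms_energy(chunk) / speaker_baseline if speaker_baseline else 0.0

quiet_talker = [0.1, -0.1, 0.1, -0.1]
loud_talker = [0.5, -0.5, 0.5, -0.5]
# Both come out at 1.0 once each is scaled by their own baseline:
print(relative_loudness(quiet_talker, rms_energy(quiet_talker)))  # 1.0
print(relative_loudness(loud_talker, rms_energy(loud_talker)))    # 1.0
```

The same idea applies to speaking rate: what matters for tone detection is deviation from a speaker’s norm, not the absolute value.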

How to Enhance Voice Search with Tone Analysis

For the sake of this article, let’s look only at tone analysis. There are certainly more factors you can incorporate into voice search, but let’s keep it simple for now. (You can find more information and a practical implementation of sentiment analysis on voice search in the research paper Sentiment Analysis on Speaker Specific Speech Data.)

Tone can help you understand all kinds of things about a person’s state of mind or what they’re really asking for. Humans understand this instinctively. We know when someone is mad, sad, or excited. That’s all lost with a text search. Now, though, we can learn more about searchers thanks to tools like IBM’s tone analyzer. (You can play around with it by recording your own voice sample, or find the backend implementation here).

Once you have a tone analyzer in place, you can start using voice search data to shape search results. If you receive a query from someone who sounds positive or joyful, your results should show positively framed content first and leave negative or unpleasant topics for the end.
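A minimal sketch of that re-ranking step might look like the following, assuming each result already carries a precomputed sentiment label (the data, labels, and function name here are invented for illustration):

```python
# Hypothetical re-ranking: results whose sentiment matches the
# query's detected tone are moved to the front.
def rerank_by_tone(results: list[dict], query_tone: str) -> list[dict]:
    """Stable sort: tone-matching results first, original order
    preserved within each group."""
    return sorted(results, key=lambda r: r["sentiment"] != query_tone)

results = [
    {"title": "Why screenshots fail", "sentiment": "negative"},
    {"title": "Fun screenshot tricks", "sentiment": "positive"},
    {"title": "Screenshot basics", "sentiment": "neutral"},
]
for r in rerank_by_tone(results, query_tone="positive"):
    print(r["title"])
# Fun screenshot tricks, then the rest in their original order
```

Using a stable sort here means relevance ordering (however it was computed upstream) survives within the matching and non-matching groups.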

The Benefits of Applying Sentiment Analysis to Voice Search

The most significant benefit of employing sentiment analysis in voice search is the level of accuracy it provides in terms of tone and context. An irate customer who says, “I want the helpline of Apple Inc. USA right now,” is almost certainly looking to report a problem; sentiment analysis can quickly pick up the tone and the emphasis on words such as “right now” and surface the information this person is looking for.

Ultimately, this provides a better customer experience—which is especially important when you’ve got an angry customer on the other end of the query. Embracing sentiment analysis as an integral part of your search solution just might save you a customer.
