Voice: The Most Accurate Way to Detect Human Emotion
As humans, we have become adept at masking our emotions. We tend to think that our visual cues are what give our emotions away, and so we develop tactics to hide how we really feel.
Voice, however, can expose even the best poker face. When we turn our complete attention to voice, it turns out that we can more accurately predict other people's emotions and figure out how someone truly feels, simply by listening.
We tend to overestimate our ability to understand the emotions of others by relying on non-verbal cues such as facial expressions and body language, but these cues may not be the best guide to the complex array of human feelings and intentions.
“Close your eyes and use your ears”
The American Psychological Association published a study that claims: “If you want to know how someone is feeling, it might be better to close your eyes and use your ears.”
Research indicates that people can more precisely read the emotions of others when they listen rather than look. When we only listen to a voice, we pay closer attention to subtle shifts in vocal tone and to the nuances in how the speaker expresses themselves. You're more likely to notice that someone is anxious or nervous if they're breathing quickly, for example, or if they sound tired or sad. If they're talking in a high-pitched or rapid manner, you may be able to pick up on excitement or enthusiasm, without misleading facial cues.
It is easier to control body language and conceal expressions in non-verbal communication. Picture going in for an interview, for example. You have a game plan for how you look, what you want to say, and how you want to present yourself. Then you get asked a question that throws you completely off guard, and your voice reveals perhaps more insecurity and uncertainty than you wanted it to.
Listening to someone’s voice can reveal a bottomless well of information about a person—their characteristics, age, education, ethnicity, and more. This information leaks out subconsciously, and people pick up on it.
Accuracy Through Empathy
In the APA study, every experiment showed that individuals who only listened without observing were able to more accurately identify the emotions being experienced by others. The study demonstrated that people often intentionally communicate their internal feelings through the voice, allowing perceivers to focus their attention more effectively on the linguistic and paralinguistic vocal cues that accompany speech. Listening is simply a more direct path to the truth of what someone is trying to convey, and how they really feel.
Empathic accuracy is the ability to “read” the emotions, thoughts, and actions of another person. With enhanced empathic accuracy, individuals can respond more appropriately to situations of conflict or confusion, and can better share and understand the feelings of someone else while resonating with their perspective.
The human voice, as an embodiment of self in a social context, may have offered an evolutionary advantage that assisted our ancestors in distinguishing familiar from unfamiliar voices and in perceiving expressions of need and distress, abilities that were key to ensuring survival. When two people talk and understand one another, their brains literally synchronize, as if they are working in unison. This, inevitably, leads not only to more resonance but also to more compassion.
Transmitting more information does not necessarily yield a more accurate reading of emotion. Cognitive psychology teaches us that when one engages in two complex tasks at the same time (watching TV and listening to music, for example), performance on both tasks takes a hit. The art of listening, and only listening, can lead to an actionable, improved understanding of others, especially in our technologically immersive, multiplatform culture.
It’s nearly as simple as a middle school adage: focus on the task at hand, and listen well.
Translating Emotional Intelligence to AI
Much of the research on emotion recognition up to this point has focused on the role of facial cues, which makes this APA study on the power of voice such a breakthrough. The voice may contain much of the content necessary to accurately perceive the internal states of others, and recent findings suggest that more studies examining the vocalization of emotion are needed.
Our communications with computers are only going to become chattier, which leads us into a new world of possibilities with artificial emotional intelligence, or Emotion AI. Emotion detection technology currently revolves heavily around facial coding to measure, understand, simulate, and react to human emotions. But since we may be able to better empathize over a phone call than a FaceTime session, perhaps we should focus more on the power of the voice.
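To make the idea concrete, here is a minimal, illustrative Python sketch of voice-based emotion detection, not any particular product's method. It estimates a speaker's pitch from raw audio samples via autocorrelation, then applies a toy rule that flags high pitch as a sign of arousal (excitement or anxiety), echoing the vocal cues discussed above. The function names, the 180 Hz threshold, and the classification rule are all assumptions for demonstration; real Emotion AI systems use far richer acoustic features and trained models.

```python
import math

def estimate_pitch(samples, sample_rate):
    """Estimate the fundamental frequency (Hz) of a voiced frame
    by finding the lag with the highest autocorrelation."""
    n = len(samples)
    best_lag, best_corr = 0, 0.0
    # Search lags corresponding to roughly 60-500 Hz, a typical voice range.
    for lag in range(sample_rate // 500, sample_rate // 60):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag if best_lag else 0.0

def arousal_label(pitch_hz, high_threshold=180.0):
    """Toy rule (an assumption, not a validated model):
    higher pitch often accompanies excitement or anxiety."""
    return "high arousal" if pitch_hz > high_threshold else "low arousal"

# A synthetic 220 Hz tone stands in for a voiced frame of speech.
sr = 8000
tone = [math.sin(2 * math.pi * 220 * t / sr) for t in range(sr // 10)]
pitch = estimate_pitch(tone, sr)
label = arousal_label(pitch)
```

In practice the samples would come from a microphone or audio file rather than a synthetic tone, and pitch would be combined with speaking rate, loudness, and spectral features before any emotional label is assigned.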
Renowned psychiatrist Dr. Smiley Blanton wrote: “The effect of emotions upon the voice is recognized by all people. [Everyone] can recognize the tones of love and fear and anger; and this knowledge is shared by the animals. The dog, the horse, and many other animals can understand the meaning of the human voice. The language of the tones is the oldest and most universal of all our means of communication.”
We have entered an era in which machines are ready to understand that language as well, as we lend AI the emotional intelligence to recognize human emotion and synthesize our emotional behavior in unprecedented ways.