
The 2017 Speech Industry Star Performers: Beyond Verbal


Beyond Verbal Breaks Down Biology

What began as a quest to track human emotions through speech patterns could eventually lead to a revolutionary new method for diagnosing the early stages of heart disease. Beyond Verbal, an Israeli start-up, completed an exploratory study with the Mayo Clinic last November that found a strong correlation between certain voice characteristics and coronary artery disease (CAD).

The study, made possible by $3 million in funding that Beyond Verbal raised last September, used a smartphone app to measure subjects’ voice signals prior to undergoing coronary angiography, a test that uses dye and X-rays to see how much plaque has built up in the arteries. The study, which included 120 heart patients and an equal number of control subjects, linked one voice feature to a 19 times greater likelihood of CAD and identified 13 others that could be associated with the condition. The strongest of these features was observed in patients’ voices when they described negative experiences.

At the heart of the study was Beyond Verbal’s Beyond Wellness API, which it released in 2014. The software turns any smartphone or microphone-equipped wearable device into an emotional well-being sensor that measures intonations in the voice rather than the content or context of the spoken words.
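The article does not describe how the Beyond Wellness API works internally, but the general idea of measuring intonation rather than spoken content can be illustrated with standard open-source signal-processing tools. The sketch below, written in Python with the librosa library, extracts simple pitch and energy statistics from a recording; the feature set and the helper name are assumptions chosen for illustration, not Beyond Verbal's actual method.

```python
# Illustrative only: extract prosodic (intonation) features from an audio clip
# rather than transcribing the words themselves. This is a conceptual sketch,
# not Beyond Verbal's algorithm.
import numpy as np
import librosa


def extract_intonation_features(path: str) -> dict:
    """Return simple pitch and energy statistics for a voice recording."""
    y, sr = librosa.load(path, sr=16000, mono=True)

    # Fundamental frequency (pitch) contour over time.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    voiced_f0 = f0[voiced_flag & ~np.isnan(f0)]

    # Short-term energy (loudness) contour.
    rms = librosa.feature.rms(y=y)[0]

    return {
        "pitch_mean_hz": float(np.mean(voiced_f0)) if voiced_f0.size else 0.0,
        "pitch_range_hz": float(np.ptp(voiced_f0)) if voiced_f0.size else 0.0,
        "energy_mean": float(np.mean(rms)),
        "energy_variability": float(np.std(rms)),
    }


print(extract_intonation_features("sample_voice.wav"))
```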

Yuval Mor, CEO of Beyond Verbal, says the Mayo Clinic research presents an opportunity to move the technology into many new therapeutic areas. It could, for example, be used to identify conditions like dyslexia and attention-deficit/hyperactivity disorder (ADHD).

But the medical field isn’t the only area to benefit from Beyond Verbal’s technological advances over the past year. Just this June, the company launched a cloud-based API engine that enables virtual private assistants to deliver customized recommendations based on individuals’ moods and emotional states.

Beyond Verbal enables these assistants to understand the emotional message, context, and intent present in callers’ vocal intonations, which it says represent 35 percent to 40 percent of the emotions people convey in their communications. Armed with that information, the assistants can then change their behavior, personality, and even tone of voice to adjust to the context of the conversation and the person with whom they are communicating.

To carry this out, Beyond Verbal’s Emotional Analytics technology takes raw voice input and analyzes it for mood and attitude. The technology requires only 10 seconds of continuous voice input to conduct its analysis, which can be performed in real time.
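As a rough illustration of how a client application might use such a service, the sketch below buffers a recorded clip of at least 10 seconds, the minimum the article cites, and uploads it to a cloud analysis endpoint. The URL, credentials, field names, and response shape are hypothetical placeholders; Beyond Verbal's actual API is not documented here.

```python
# Illustrative client sketch: submit roughly 10 seconds of recorded speech to a
# cloud emotion-analysis endpoint. The URL, auth header, and JSON fields below
# are hypothetical placeholders, not Beyond Verbal's documented API.
import requests

ANALYSIS_URL = "https://api.example.com/v1/emotion-analysis"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                      # placeholder credential
MIN_CLIP_SECONDS = 10  # the article notes ~10 seconds of continuous voice is needed


def analyze_clip(wav_path: str, clip_seconds: float) -> dict:
    """Upload a voice clip and return the mood/attitude analysis as a dict."""
    if clip_seconds < MIN_CLIP_SECONDS:
        raise ValueError(f"Need at least {MIN_CLIP_SECONDS}s of continuous voice input")

    with open(wav_path, "rb") as audio:
        response = requests.post(
            ANALYSIS_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": ("clip.wav", audio, "audio/wav")},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()  # e.g. {"mood": ..., "attitude": ...} (hypothetical shape)


if __name__ == "__main__":
    print(analyze_clip("caller_clip.wav", clip_seconds=12.0))
```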

According to Mor, his company’s aim in the not-too-distant future is to add vocal biomarker analysis to its feature set, enabling virtual private assistants to analyze voices for specific health conditions.

Beyond Verbal previously released two free consumer-facing apps, Moodie and Empath, and one for clinicians called Beyond Clinic. The company already had some experience with voice analysis on conditions like Parkinson’s disease, autism, and a number of other neuro-cognitive conditions, and even worked on solutions aimed at marketers and sales professionals. In each of these use cases, its technology tracks 430 precise emotions.

Mor says his company’s current offerings are built on more than 21 years of research that includes 2.5 million emotion-tagged recordings in more than 40 languages.
