
Diagnosing Disease with Speech Analytics


The “spectral characteristics” that companies like Canary Speech analyze relate to how the body physically produces speech, not to the words themselves. Indeed, they are not tied to any particular words at all, which means a recording of any casual speech can be analyzed for these biomarkers. To take an example from Canary Speech, only 15 biomarkers (out of more than 2,500) are needed for it to identify and monitor depression in patients. Clinics that wish to use this diagnostic tool would record their office interactions with clients and have the software scan the recordings for disease biomarkers. The clinician can then combine that data with other diagnostic tests to make a final diagnosis.
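To make the idea of word-independent “spectral characteristics” concrete, here is a minimal sketch of extracting frame-level spectral features from raw audio. The features shown (spectral centroid, bandwidth, and energy) are standard signal-processing quantities chosen for illustration; they are not Canary Speech's proprietary biomarkers, and the synthetic signal stands in for a real recording.

```python
import numpy as np

def spectral_biomarkers(signal, sample_rate, frame_len=1024, hop=512):
    """Compute simple frame-level spectral features from an audio signal.

    Illustrative features only (spectral centroid, bandwidth, energy);
    real vocal-biomarker systems use far richer, proprietary feature sets.
    """
    features = []
    for start in range(0, len(signal) - frame_len, hop):
        frame = signal[start:start + frame_len] * np.hanning(frame_len)
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / sample_rate)
        total = np.sum(spectrum)
        if total == 0:
            continue  # skip silent frames
        centroid = np.sum(freqs * spectrum) / total          # "brightness"
        bandwidth = np.sqrt(np.sum(((freqs - centroid) ** 2) * spectrum) / total)
        energy = np.sum(spectrum ** 2)
        features.append((centroid, bandwidth, energy))
    # Summarize the whole recording as the mean of each frame-level feature.
    return np.mean(np.array(features), axis=0)

# One second of synthetic "speech": a 220 Hz tone plus a little noise.
rate = 16000
t = np.linspace(0, 1, rate, endpoint=False)
audio = np.sin(2 * np.pi * 220 * t) + 0.05 * np.random.default_rng(0).normal(size=rate)
centroid, bandwidth, energy = spectral_biomarkers(audio, rate)
```

Because these numbers describe how the sound is produced rather than what was said, the same pipeline works on any casual speech sample.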

The ability of algorithms to detect spectral characteristics that the naked eye (or, in this case, ear) cannot could allow for early diagnosis of illness, and with it the opportunity for life-saving early treatment.

On top of diagnosing illness through speech recordings, platforms that offer a chat-like interface (think Alexa and Google Assistant) allow people to interact more naturally with algorithms that can monitor their health conditions from the comfort of their own homes. Imagine that you are older, living alone, and have a few medical concerns. You might not be driving much anymore. If you have an Amazon Echo or Google Assistant device in your home, you also have access to HIPAA-compliant “skills” like Livongo’s Livongo for Diabetes skill, which can help a diabetic recall their latest blood glucose readings while offering health tips, or a skill by Atrium Health that can help users find and schedule appointments with urgent care providers. Phone-based apps are also in development, such as an app by Winterlight Labs that can “objectively assess and monitor cognitive health” through your smartphone.
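Under the hood, a voice “skill” of the kind described above is essentially a service that receives a structured intent and returns structured speech. The sketch below loosely mirrors the Alexa Skills Kit request/response JSON shape; the intent name and the glucose-readings lookup are hypothetical illustrations, not Livongo's actual API.

```python
# Minimal sketch of a voice-skill request handler. The "GetLastGlucoseReading"
# intent and the readings list are invented for illustration; a real skill
# would fetch protected health data through a HIPAA-compliant backend.
def handle_request(request, readings):
    intent = request.get("intent", {}).get("name")
    if intent == "GetLastGlucoseReading":
        if readings:
            text = (f"Your last blood glucose reading was "
                    f"{readings[-1]} milligrams per deciliter.")
        else:
            text = "I don't have any readings on file yet."
    else:
        text = "Sorry, I can't help with that yet."
    # Response envelope modeled on the Alexa Skills Kit JSON format.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

reply = handle_request({"intent": {"name": "GetLastGlucoseReading"}},
                       [110, 126, 118])
```

The platform handles the speech recognition and synthesis; the skill only maps intents to answers, which is what makes these assistants a natural front end for home health monitoring.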

HIPAA, Privacy, and Other Concerns

The Health Insurance Portability and Accountability Act (HIPAA) was signed into law more than two decades ago. Title I of HIPAA concerns ensuring access to health insurance through workplace plans, even when changing jobs or when one has a preexisting condition. Title II, however, focuses on the electronic transmission of private health data. There are strict rules guarding the privacy of patients’ health data, also called protected health information (PHI). Companies that wish to use digital versions of PHI (which includes patients’ voice recordings) must comply with HIPAA standards or risk significant monetary penalties (up to $1.5 million per violation).

One company that has done so is Orbita, which specializes in creating frameworks that allow for a more conversational experience with artificial intelligence. Bill Rogers, CEO of Orbita, says that AI can increasingly “carry the load” for interactions between a caregiver and a patient by making appointments, answering health questions, or making test results accessible without adding to a clinician’s already full load. Orbita has also built in the ability for its AI to escalate interactions to humans when necessary, such as when it cannot answer a patient’s questions. Since Orbita is also now HIPAA-compliant, its recorded health information can be shared directly with health care providers over platforms such as Alexa or Google Assistant. This certification means Orbita customers can develop skills similar to Boston Children’s Hospital’s MyChildren’s Enhanced Recovery After Surgery program, which can monitor patients’ recovery process and send information regarding their condition to their doctors.
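Orbita's escalate-to-human behavior can be thought of as a confidence-gated fallback: if the assistant's best answer scores below a threshold, the conversation is handed off. The sketch below uses a toy word-overlap matcher and an invented threshold to illustrate the pattern; it is not Orbita's actual implementation.

```python
# Confidence-gated escalation sketch. The matcher, FAQ, and threshold are
# all hypothetical; a production system would use a trained NLU model.
ESCALATION_THRESHOLD = 0.6

def answer_question(question, faq):
    """Toy matcher: score answers by word overlap with known FAQ questions."""
    best, best_score = None, 0.0
    q_words = set(question.lower().split())
    for known_q, answer in faq.items():
        k_words = set(known_q.lower().split())
        score = len(q_words & k_words) / max(len(k_words), 1)
        if score > best_score:
            best, best_score = answer, score
    return best, best_score

def respond(question, faq):
    answer, confidence = answer_question(question, faq)
    if confidence < ESCALATION_THRESHOLD:
        # Low confidence: hand the conversation to a human caregiver.
        return {"escalate": True,
                "text": "Let me connect you with a member of our care team."}
    return {"escalate": False, "text": answer}

faq = {"when is my appointment": "Your appointment is Tuesday at 10 a.m."}
print(respond("when is my appointment", faq))
print(respond("can I take ibuprofen with my medication", faq))
```

Routine questions are answered automatically, while anything the system is unsure about (here, the medication question) reaches a human, which is the "carry the load without replacing the clinician" balance Rogers describes.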

Using relatively non-invasive speech analytics to detect disease may sound like a minor miracle, but it’s not without its concerns. Imagine that you suffer from bouts of depression and are interviewing by phone for a job you hope to get. Unbeknownst to you, the organization is using speech analytics to screen you for potential health conditions. The algorithm picks up on the fact that you suffer from depression, and the company decides against hiring you because it doesn’t want to take the chance that you would need to use sick days. What if a company decides it doesn’t want to hire people at risk for other illnesses? Could it use phone screening to discriminate against individuals suffering from Parkinson’s or heart disease? The Americans with Disabilities Act (ADA) specifically stipulates that employers must not conduct a medical examination on an applicant. Will this continue to be enough protection for job seekers as technology advances?
