
Diagnosing Disease with Speech Analytics


What if the recordings we post to social media sites today could be used to assess our health without our knowledge? Facebook itself cautions users to “think before you post.” And the EU was so concerned about our social media data lingering into the future that it built a “right to be forgotten” into the General Data Protection Regulation (GDPR), implemented last year.

Speech Analysis Skeptics

The first pair of glasses, the first blood test, the first vaccines. People were skeptical of all these in the beginning. (Some remain skeptical of what most of us consider routine medical care.) Using speech analytics as a diagnostic tool might also raise some suspicions—perhaps among those who remember the revelation from only two years ago that Facebook was using algorithms to assess the emotional states of teenagers to better target them with ads.

The companies behind these technologies know that there are plenty of privacy concerns, and they often take them into account during product development. Rogers says that Orbita builds patient privacy protections into every step of its platform creation, ensuring that all of the policies and procedures required by the current regulatory environment (such as HIPAA) are followed during development of the code itself.

And Canary Speech is also addressing these privacy concerns within its software, says O’Connell. The app encrypts and transmits each recording immediately and then erases it from the device, adding another layer of security.
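To make that pattern concrete, here is a minimal sketch of an encrypt-transmit-erase flow. It is illustrative only, not Canary Speech’s actual code: the upload endpoint is hypothetical, and it relies on the third-party cryptography and requests packages.

```python
# Illustrative sketch only -- not Canary Speech's actual implementation.
# UPLOAD_URL is a hypothetical endpoint; the key handling is simplified.
import os

import requests
from cryptography.fernet import Fernet

UPLOAD_URL = "https://api.example.com/recordings"  # hypothetical endpoint
KEY = Fernet.generate_key()  # in practice, a key provisioned securely per device


def encrypt_transmit_erase(recording_path: str) -> None:
    """Encrypt a recording, upload it, then remove the local copy."""
    with open(recording_path, "rb") as f:
        # Encrypt before the audio ever leaves the device.
        ciphertext = Fernet(KEY).encrypt(f.read())
    response = requests.post(UPLOAD_URL, data=ciphertext, timeout=30)
    response.raise_for_status()  # only erase once the upload has succeeded
    os.remove(recording_path)  # no plaintext recording lingers on the device
```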

O’Connell says that each clinic that uses Canary’s app is a “standalone” island—meaning doctors and nurses from one office cannot see what is happening at the other clinics using the app. Even at the master level—in which Canary Speech app system administrators might have to get into the app to debug or fix something—the information they are able to see has been de-identified (without names, addresses, dates of birth, or Social Security numbers).
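For illustration, a de-identification step like the one O’Connell describes might look like the sketch below. The field names and record format are assumptions, not Canary Speech’s schema; the point is simply that direct identifiers are stripped before an administrator ever sees a record.

```python
# Illustrative sketch only -- field names are assumed, not Canary Speech's schema.
DIRECT_IDENTIFIERS = {"name", "address", "date_of_birth", "ssn"}


def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}


patient = {
    "name": "Jane Doe",
    "address": "1 Main St",
    "date_of_birth": "1950-01-01",
    "ssn": "000-00-0000",
    "clinic_id": "clinic-42",        # clinics remain isolated from one another
    "voice_features": [0.12, 0.87],  # analysis data an admin might need for debugging
}
print(deidentify(patient))  # {'clinic_id': 'clinic-42', 'voice_features': [0.12, 0.87]}
```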

Improving Care and Planning for Problems

Early diagnosis of illness can often lead to better disease outcomes. Even if there is no early treatment, an early diagnosis can lead to better tracking over the course of the disease, which may lead to innovative treatments in the future. O’Connell points out that the amount of “face time” patients get with their clinicians is relatively little. But this technology has the capacity to capture a wide range of information, over a long time period, often recorded from the comfort of the patients’ own homes, and help a clinician get to know a patient better.

For example, an algorithm that can analyze speech for signs of pulmonary disease, such as COPD, can gather data on the disease’s progression by having “conversations” with the patient through his Amazon Echo. With this information, a doctor can monitor, in near real time, a patient she might otherwise see for only a few minutes every few months. In fact, the future may bring entirely remote doctor “visits.” This could lead to vastly improved health outcomes for people living in rural or otherwise hard-to-reach areas around the globe.

Canary Speech technology is also being used in drug trials to help researchers follow the progression of disease over time. This has the promise of offering near-real-time information on the efficacy of treatment, potentially leading to shorter wait times for new drugs or treatments to be offered to the public (though this is still just speculation).

These feats are nothing less than amazing. Yet we cannot ignore the ethical questions. A recent paper that appeared in the journal NPJ Digital Medicine titled “Data Mining for Health: Staking out the Ethical Territory of Digital Phenotyping,” by Martinez-Martin, Insel, Dagum, Greely, and Cho, suggests that “stakeholders, including software developers, healthcare, patients, consumers, and other institutions, will need to be involved in the creation of standards and best practices that adequately address the ethical challenges raised here.” In other words, problems will arise, and we need to be prepared. Making sure we get this right is paramount, as the technology has the ability to literally save lives. 

Brian Chevalier is a freelance writer based in Massachusetts.
