Vendors' privacy policies can put physicians using speech recognition in a tough spot
M*Modal’s brochure Using the Cloud for More Than Just Architecture and Access starts out with the statement, “Healthcare professionals may have sky-high expectations of their cloud-based speech-recognition technology—but what exactly happens in the cloud and how does that connect to the real-world users on the ground?”
An excellent question.
iScribe Healthcare, M*Modal, Nuance, and Speech Processing Solutions (SPS, the manufacturer of Philips voice technology) have access to extensive personal data. Users dictate copious amounts of information, much of it sensitive or legally protected from disclosure. Add to that the fact that this data includes recordings of people's voices, and these companies' policies regarding respecting privacy become worthy objects of scrutiny.
M*Modal and Nuance assert ownership rights over user data, including users’ voice recordings. iScribe “has complete integration with” M*Modal, so its policies mirror M*Modal’s. SPS believes customers should control their own data.
Who Bears the Burden?
When it comes to informing prospective and current customers that a company is appropriating user data, whose responsibility is it to start the conversation? Doctors are sophisticated at providing medical care. They are not necessarily sophisticated at buying software. Is it incumbent on doctors to ask these questions? Are companies banking on doctors not knowing to even ask them?
If some of these companies told users how their data is actually being used, “I suspect it could cause friction in the sales process,” says attorney David Schwartz.
Benefit of the Bargain
There is a tenet in contract law that in the case of a breach, the breaching party must pay the nonbreaching party damages equal to the difference between the value the defendant represented the contract to have at the time of the sale and its actual value.
Here, a doctor—or anyone who dictates for that matter—gives a speech recognition company money. The speech recognition company gives the doctor speech recognition software or use thereof. End of transaction. Or at least it should be. But if the doctor contracted with iScribe, M*Modal, or Nuance, what happens if the doctor discovers the company is appropriating her data?
The Ninth Circuit held in a 2013 case that consumers only receive the benefit of the bargain when they have full command of the facts that convince them to make the purchase, so the doctor might have standing to sue.
When users sign up for a rewards program, they pretty much know that in exchange for free stuff, they're giving up some privacy. If the amount of privacy being forfeited is not objectionable, the transaction is consummated. But in the speech recognition scenario, the privacy being forfeited is not disclosed in a meaningful way, so users cannot make an informed decision about whether to consent.
Imagine the outrage if a transcriptionist kept a doctor's microcassette dictation after transcribing it and used it for his own purposes.
When a doctor uses speech recognition from iScribe, M*Modal, or Nuance, should the doctor disclose this to her patients? Must she disclose it? M*Modal seems to think so: “Where personal data is supplied to M*Modal from a customer, that customer is responsible for ensuring that data subjects are notified about the identity of the data controller or its representatives, the purposes for which it is collecting, processing and maintaining the data, and any further information that may be required by the circumstances under which the data are collected.”
The problem is, voice data is supplied to M*Modal for processing into text, not for M*Modal to use for any other purpose. Further, M*Modal does not disclose all the ways it is using or will use customer data.
Nuance “...may observe your activities, preferences, and transactional data...as well as related usage behavior.... We may use this data for any purpose.” (Emphasis added.)
Nuance can see how many words you dictated, how many commands you said, how many misrecognitions you had, and the amount of time you dictated. Further, Nuance uses the speech data, which consists of audio files, associated text and transcriptions, and log files, and may include personal information.
Assuming the doctor even knows to make a meaningful disclosure, what if the patient doesn’t want his personal information being seized by a third party? Must the patient refuse treatment and go on a quest for a doctor whose transcription partners respect privacy? How might that affect patient care?
If companies believe that what they’re doing is not objectionable, they should be transparent about their practices. Here’s how they can start:
• Users should own their data.
• Data collection should be opt in, not opt out.
• Users should have meaningful choice.
• Key facts should not be buried in hard-to-understand or unconscionable privacy policies.
Robin Springer is an attorney and the president of Computer Talk, Inc. (www.comptalk.com), a consulting firm specializing in speech recognition and other hands-free technology services. She can be reached at (888) 999-9161 or firstname.lastname@example.org.