Data Privacy Concerns Should Govern Speech Technology Industry
This spring gave speech technology providers lots to think about. No sooner had the General Data Protection Regulation (GDPR) gone into effect than Amazon's Alexa raised new privacy concerns by recording a family's conversation and emailing it to a random contact. But this mishap, while relatively high profile, is not typical of the privacy concerns that the speech technology industry needs to confront.
Security and privacy issues around voice assistants are sexy, but these concerns manifest themselves far more often in more mundane ways. Just ask Nuance, which, earlier this year, revealed a major data breach in which a former employee gained unauthorized access to the records of almost 900 patients of San Francisco General and Laguna Honda hospitals. This isn't the kind of news that makes national headlines and takes over your social media feeds. It is, however, important.
For its part, Nuance has strengthened its security and cooperated with police in the investigation. There are strict privacy laws governing patient information in the United States, but had this breach involved European patients, it could have been much more devastating for Nuance. As Nancy Davis Kho writes in “GDPR Implications for Speech Technology” on our website, “While the regulation came from the EU, any entity that collects, stores, and/or processes [personal information] for citizens of the EU is subject to GDPR, wherever they’re based. And with fines that can range up to 4% of annual global turnover or €20 million, whichever is higher, companies in the speech technology industry can’t afford to be cavalier.”
Do people living in the European Union call your call center? Do you collect data about them? Do you record their conversations? Do you have a data protection plan in place? If the answer to that last question isn’t an enthusiastic “Yes!” you need to stop everything and put one in place now.
And what about those voice assistants that are all the rage but are also constantly listening to us? Well, Kho says they are probably going to have to ask European users for permission to listen to and record them. Additionally, she writes, “Under GDPR, says [Richard] Brown, [director of security solutions provider activereach], ‘There are a lot of interesting questions right now about recording and capturing voices on devices,’ noting that some vendors who encrypt such data can’t even access it themselves.”
The questions that GDPR brings up are many, but they aren’t endless. Speech technology vendors (and users) just need to deal with them head on, and start letting data privacy concerns govern how they continue to develop their products and policies.
There's no doubt we'll have plenty to talk about in the year to come.