Curbing Speech Data Overreach
Consumers have voiced concerns about how Alexa, Siri, and Google Assistant collect and use voice data every time they talk to their smart speakers. They’re troubled by the potential for speech bots and voice-based information companies to exploit their personal information. They’re alarmed by the rise of deepfakes and the ease with which bad actors can manipulate audio for nefarious purposes.
People across the globe are becoming increasingly worried about the extent to which their recorded voices and words are captured, mined for marketing purposes, and vulnerable to hackers and criminals. It’s a big reason for the stronger push in the past few years for laws and guidelines intended to control these practices and decrease these risks.
But are existing and proposed regulations strong and comprehensive enough to deter unethical methods? Will laws designed to regulate speech data be properly enforced, and will companies in violation be held accountable? Has technology reached a point where it’s nearly impossible to rein it in and safeguard consumers?
These are critical questions that demand responses, though the answers might have to wait as legislation, and the will to enact it, try to catch up with the speed of advancing technology and innovation. Moreover, these are convoluted political, legal, business, and moral issues that won’t be easily or quickly solved. Fortunately, a handful of landmark laws passed in recent years can offer the tech industry guidance and inspiration.
Currently the gold standard in data privacy and security legislation, the European Union’s General Data Protection Regulation (GDPR) went into effect in May 2018. The GDPR requires that companies that process data conform to important protection and accountability principles: lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; integrity and confidentiality (security); and accountability. Controllers of data, including voice data, also have to demonstrate GDPR compliance. Violators of these standards can face stiff financial penalties. And these regulations extend to organizations throughout the world, provided they collect or target data linked to people within the 27 countries of the European Union.
Taking a cue from the EU’s efforts, other nations, such as Brazil, Japan, and India, are pushing for stronger data privacy protection laws. The United States, meanwhile, has yet to pass a similar comprehensive national law. The closest approximation is the California Consumer Privacy Act (CCPA), which went into effect in 2020 and affects organizations across the country that do business with California residents. This landmark legislation gives California consumers the right to know what personal information businesses collect about them and how it is used and shared; the right to opt out of the sale of personal information; and the right to delete personal data gathered from them.
Other states, including Vermont, have passed laws intended to safeguard consumers’ personal data. Vermont now requires businesses that collect and sell or license personal data to third parties to disclose which data they are collecting from consumers and allows consumers to opt out of this data collection. Additionally, bills or bill drafts related to consumer data privacy have been filed or introduced in at least 25 states and Puerto Rico.
In small signs of progress, data breach notification laws are now in place in all 50 states, although most of these rules are constrained and not widely enforced. The United States also has the Children’s Online Privacy Protection Act of 1998 (COPPA) and the Health Insurance Portability and Accountability Act of 1996 (HIPAA) in effect, which provide at least some measures of protection for minors and patients.
The State of Speech Data Protections
While many states are taking the problem of data security and privacy seriously, enacted or proposed regulations across states are often inconsistent.
“Regulation of voice data is a patchwork of laws and regulations around the world, and within the United States exists on a state-by-state basis,” says Deborah Dahl, principal at speech and language consulting firm Conversational Technologies and chair of the World Wide Web Consortium’s Multimodal Interaction Working Group. “Consider that multiple federal agencies, such as the [Federal Trade Commission, the Federal Communications Commission, and the Department of Health and Human Services] have roles in regulating the handling of voice data, which makes this area extremely complex. But as technology becomes more sophisticated, it will become harder for regulations to keep up. Hence, it’s important to start developing a consistent overall approach to regulating the use of voice data.”
Compared to other countries, “the U.S. is like the Wild West” when it comes to government regulation related to voice-based information companies, says LaNysha Adams, chief technology officer at Edlinguist Solutions.
“Consider that before 2018, only three states even had biometric privacy laws—Illinois, Texas, and Washington. While that number has grown to at least 20 states today, compare that to the United Kingdom, where they have a Financial Conduct Authority (FCA) in place that examines issues related to governance, risk, and compliance, with a keen eye on voice-based technologies,” Adams explains.