SpeechTEK 2019: Anthropomorphism and Ethics

Have you ever had a car you found hard to let go of? Or maybe a favorite toy that was your closest "friend"? You may be guilty of anthropomorphizing inanimate objects, and you're not alone. According to Dr. Judith Markowitz, even the most battle-hardened soldiers tend to bond with machines.

Take Packbot, for instance. It's a mine-clearing robot that helps save lives on the battlefield. Soldiers have been known to name it (Scooby Doo, in case you were wondering) and mourn its loss when it gets blown up. According to Markowitz, soldiers have even put their own safety on the line to retrieve Packbot from dangerous situations. 

In many ways, if you're creating a chatbot or virtual assistant, this kind of bond is exactly what you're hoping for. But it does raise some questions. Paro, a soft robotic baby seal, is used to comfort elderly people--many of them suffering from dementia--and is considered a therapeutic robot. As a non-pharmaceutical medical device, Paro, in essence, replaces medications that might otherwise be used to combat loneliness and depression. It sounds like an ideal solution, though the audience at Markowitz's sessions wasn't entirely convinced, wondering what might happen when the deception was revealed.

And when it came to toys like "Hello Barbie"--which is, according to Markowitz, "in effect a conversational speech system"--other ethical concerns were raised. Though Mattel made sure the toy complied with COPPA, the Children's Online Privacy Protection Act, there was still concern about information sharing and data use. For instance, Hello Barbie might suggest other Barbie products based on a child's interests. That had parents miffed.

The discussion around the ethics of these anthropomorphic devices segued well into Brian Garr's panel on "The Ethics of ASR Lie Detection." Garr, senior creative technologist at Virgin Voyages, led a discussion that ranged across a number of ethical issues surrounding biometrics. The rest of the panel comprised Nagendra Goel, CEO and founder, GoVivace; Steven M. Hoffberg, Of Counsel, Tully Rinckey, PLLC; and Peter Soufleris, CEO and founder, Voice Biometric Group.

One thing quickly became clear: the law hasn't caught up with the technology, and vendors are, in many ways, on their own when it comes to navigating the ethical issues surrounding what they create.

If your biometric solution can detect a degenerative condition or other health issue--even if that's not how your client is employing it--is the client ethically bound to tell the potential sufferer? Who decides what the "proper" use of the data is? These questions don't really have answers yet--and it's not clear how long it will be until the law catches up.
