ACLU, Human Rights Groups Call on Zoom to Drop Plans for 'Emotion Analysis' Software

In an open letter to Zoom, the groups say using AI to monitor the mood of videoconference participants is "a violation of privacy and human rights."

Dan Avery

Civil rights groups are calling on Zoom to ditch plans to explore "emotion analysis software" that would use artificial intelligence to analyze the mood of videoconference participants.

In an open letter to Zoom founder Eric Yuan on Wednesday, the American Civil Liberties Union, digital-rights nonprofit Fight for the Future and nearly 30 other civil liberties organizations called such technology discriminatory, manipulative and "based on pseudoscience." 

"Zoom claims to care about the happiness and security of its users but this invasive technology says otherwise," according to the letter, which called using AI to track human emotions "a violation of privacy and human rights."

The memo also warned that harvesting such "deeply personal data" could make client companies a target "for snooping government authorities and malicious hackers."

The letter was prompted by an April 13 Protocol article reporting that the popular videoconferencing company was actively researching how to integrate AI that can read emotional cues.

"These are informational signals that can be useful; they're not necessarily decisive," Josh Dulberger, Zoom's head of product, data and AI, told Protocol. Dulberger imagined using the tech to give sales reps a better understanding of how a video meeting went, "for instance by detecting, 'We think sentiments went south in this part of the call,'" Protocol reported.


But, the groups contend, the technology could be used to punish employees, students and other Zoom users for "expressing the wrong emotions" based on the AI's determinations. It's also inherently biased, they added, because it assumes all people use the same facial expressions, voice patterns and body language to express themselves.

"Adding this feature will discriminate against certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices," the letter read.

The groups have called on Zoom to commit by May 20 to not implementing emotion-tracking AI in its products.

Zoom didn't immediately respond to a request for comment.