Rights groups urge Zoom not to use emotion-monitoring AI tool

Fears raised over potential discriminatory issues

Ben Wodecki, Jr. Editor

May 20, 2022

2 Min Read

Videoconferencing platform Zoom has been hit with concerns from several rights groups over its emotion tracking software.

Several human rights groups, including the American Civil Liberties Union, OpenMedia and Fight for the Future, signed an open letter to Zoom CEO Eric Yuan asking him not to implement AI systems that could analyze emotions.

The signatories claim such a system is "based on pseudoscience" and that the technology could be dangerous for users if employers, academics, or other institutions decide to discipline them for "expressing the wrong emotions" based on the determinations of this AI technology.

They also claim the system is inherently biased, in a similar vein to facial recognition technologies.

“These tools assume that all people use the same facial expressions, voice patterns, and body language—but that’s not true,” the letter reads. “Adding this feature will discriminate against certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices.”

The groups cited Zoom’s previous decisions to walk back potentially privacy-damaging moves, such as blocking free users from its encrypted service and canceling face-tracking features.

“This is another opportunity to show you care about your users and your reputation… You can make it clear that this technology has no place in video communications.”

Zoom’s toying with emotion AI tools is part of its wider growth strategy. As classes and businesses return to some form of normality, the videoconferencing company is attempting to sustain the immense gains it secured during the early stages of the pandemic.

After reaching a record high of $559 per share in November 2020, the company’s stock trades at $90.94 at the time of writing, a level not seen since January 2020.

“As you continue to grow, it is critical that you maintain a relationship of trust and respect with your users,” the rights groups contend.

“We ask you to publicly respond to our request and commit to not implementing emotion AI.”

Zoom declined to comment on the matter.

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
