The Star Malaysia - Star2

Can AI get creepy?


HUMAN rights groups have urged video-conferencing company Zoom to scrap research on integrating emotion recognition tools into its products, saying the technology can infringe on users’ privacy and perpetuate discrimination.

Technology publication Protocol reported last month that California-based Zoom was looking into building such tools, which could use artificial intelligence (AI) to scan facial movements and speech to draw conclusions about people’s moods.

In a joint letter sent to Zoom chief executive Eric Yuan on May 11, more than 25 rights groups, including Access Now, the American Civil Liberties Union (ACLU) and the Muslim Justice League, said the technology was inaccurate and could threaten basic rights.

“If Zoom advances with these plans, this feature will discriminate against people of certain ethnicities and people with disabilities, hardcoding stereotypes into millions of devices,” said Caitlin Seeley George, director of campaign and operations at Fight for the Future, a digital rights group.

“Beyond mining users for profit and allowing businesses to capitalise on them, this technology could take on far more sinister and punitive uses,” George said.

Zoom did not immediately respond to a request for comment.

Zoom Video Communications emerged as a major video conferencing platform around the world during Covid-19 lockdowns as education and work shifted online, reporting more than 200 million daily users at the height of the pandemic in 2020.

The company has already built tools that purport to analyse the sentiment of meetings based on text transcripts of video calls, and according to Protocol, it also plans to explore more advanced emotion reading tools across its products.

In a blog post describing the sentiment analysis technology, Zoom said its tools can measure the “emotional tone of the conversations” in order to help salespeople improve their pitches.

But the rights groups’ letter said rolling out emotion recognition analysis for video calls would trample users’ rights.

“This move to mine users for emotional data points based on the false idea that AI can track and analyse human emotions is a violation of privacy and human rights,” said the letter, a copy of which was sent to the Thomson Reuters Foundation.

“Zoom needs to halt plans to advance this feature,” it added.

Esha Bhandari, deputy director of the ACLU Speech, Privacy, and Technology Project, called emotion AI “a junk science”.

“There is no good reason for Zoom to mine its users’ facial expressions, vocal tones, and eye movements to develop this creepy technology,” she said in emailed comments. – Thomson Reuters Foundation

Zoom said its tools can measure the ‘emotional tone of the conversations’ in order to help salespeople improve their pitches. — dreamstime/TNS
