China Daily

Biden robocall: Audio deepfake fuels chaos


WASHINGTON — The 2024 White House race faces the prospect of a fire hose of artificial intelligence-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

“What a bunch of malarkey,” said the phone message, digitally spoofing Biden’s voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative AI tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year because of proliferating voice-cloning tools, which are cheap, easy to use and hard to trace.

“This is certainly the tip of the iceberg,” Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity company Pindrop, said. “We can expect to see many more deepfakes throughout this election cycle.”

A detailed analysis published by Pindrop said a text-to-speech system developed by AI voice-cloning startup ElevenLabs was used to create the Biden robocall.

The scandal came as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging, and as tech investors pump millions of dollars into voice-cloning startups.

ElevenLabs did not respond to repeated requests for comment. Its website leads users to a free text-to-speech generator to “create natural AI voices instantly in any language”.

Under its safety guidelines, the company said users were allowed to generate voice clones of political figures, such as former US president Donald Trump, without their permission if they “express humor or mockery” in a way that makes it “clear to the listener that what they are hearing is a parody, and not authentic content”.

Regulators in the United States have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

“The political deepfake moment is here,” Robert Weissman, president of advocacy group Public Citizen, said. “Policymakers must rush to put in place protections or we’re facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion.”

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based “evidence has been fabricated”, said Wasim Khaled, chief executive of Blackbird.AI.

Balasubramaniyan said, “It is imperative that there are enough safeguards available in these tools.”

He and other researchers recommended building audio watermarks or digital signatures into tools as possible protections, as well as regulation that makes them available only for verified users.
