Santa Fe New Mexican

AI scams designed to mimic voices of loved ones on rise

Advancements now allow bad actors to replicate a voice with an audio sample of just a few sentences

- By Pranshu Verma

The man calling Ruth Card sounded just like her grandson, Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help.

“It was definitely this feeling of … fear,” she said. “That we’ve got to help him right now.”

Card, 73, and her husband, Greg Grace, 75, dashed to their bank in Regina, Saskatchewan, and withdrew $2,207, the daily maximum. They hurried to a second branch for more money. But a bank manager pulled them into his office: Another patron had gotten a similar call and learned the eerily accurate voice had been faked, Card recalled the banker saying. The man on the phone probably wasn’t their grandson.

That’s when they realized they’d been duped.

“We were sucked in,” Card said in an interview with The Washington Post. “We were convinced that we were talking to Brandon.”

As impersonation scams in the United States rise, Card’s ordeal is indicative of a troubling trend. Technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, their loved ones are in distress. In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family, according to data from the Federal Trade Commission. Over 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, FTC officials said.

Advancements in artificial intelligence have added a terrifying new layer, allowing bad actors to replicate a voice with just an audio sample of a few sentences. Powered by AI, a slew of cheap online tools can translate an audio file into a replica of a voice, allowing a swindler to make it “speak” whatever they type.

Experts say federal regulators, law enforcement and the courts are ill-equipped to rein in the burgeoning scam. Most victims have few leads to identify the perpetrator, and it’s difficult for the police to trace calls and funds from scammers operating across the world. And there’s little legal precedent for courts to hold the companies that make the tools accountable for their use.

“It’s terrifying,” said Hany Farid, a professor of digital forensics at the University of California at Berkeley. “It’s sort of the perfect storm … [with] all the ingredients you need to create chaos.”

Although impostor scams come in many forms, they essentially work the same way: A scammer impersonates someone trustworthy — a child, lover or friend — and convinces the victim to send them money because they’re in distress.

But artificially generated voice technology is making the ruse more convincing. Victims report reacting with visceral horror when hearing loved ones in danger.

The technology can re-create the pitch, timbre and individual sounds of a person’s voice to create an overall effect that is similar, Farid said. It requires a short sample of audio, taken from places such as YouTube, podcasts, commercials, TikTok, Instagram or Facebook videos, Farid said.

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice,” Farid said. “Now … if you have a Facebook page … or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice.”

