Scam alert: Deepfake tech

Deepfake criminals are ready for business, writes Deborah Tarrant.

- Illustration by Johnson Andrew

According to insurer Euler Hermes, the CEO of a United Kingdom-based energy firm transferred more than €200,000 (about $325,000) last year to a “Hungarian supplier” after a phone call from the chief executive of his German parent company. Except it wasn’t his boss. It’s believed scammers used synthetic audio – speech transformed into the boss’s voice with deep-learning algorithms and voice cloning.

Deepfake technology allows scammers to steal or extort large sums of money, potentially manipulate markets and trash the reputations of leaders and brands. In 2019, global cybersecurity company Symantec reported three instances in which criminals used deepfake voice technology to mimic CEOs and direct senior finance executives to transfer millions, pronto. In one case, interactive audio talked a company out of US$10 million.

Deepfake-style tech has long been Hollywood’s domain – it was used in 2018’s Solo: A Star Wars Story. Now, with the help of hobbyists, it’s easy (and cheap) to produce and commodify, offered for as little as US$1 a go or US$20 (about $30) a month. Deepfake videos proliferate across the internet, from satirical take-offs to fake news, with Donald Trump, Mark Zuckerberg and Tesla CEO Elon Musk among the victims.

“It’s a new world of malicious activity,” says Henry Ajder, head of communications and research at Deeptrace, a Dutch startup founded in 2018 that receives daily pleas from large organisations and governments wanting to fend off fakery.

Pornography dominates the space – about 96 per cent of deepfakes, according to Deeptrace’s State of Deepfakes report from September. Beyond smearing a CEO’s reputation, deepfake porn can be used to blackmail less senior staff into divulging confidential information.

Awareness training can only go so far in identifying the fraud; experts predict that the anomalies the human ear can detect will be eliminated as the tech becomes more sophisticated. However, Saurabh Shintre, senior researcher at NortonLifeLock Research Group (formerly Symantec Research Labs), notes that scammers use the familiar phishing tactic of time or social pressure to convince people to transfer funds urgently. So “taking a deep breath while you consider if the caller’s request is plausible” can be a good start, he says. Ajder recommends listening for a wobbly voice where the intonation and cadence are off and asking contextual questions that only the genuine caller could answer.

Signs of a deepfake video include glitching or “noise patterns”, in which pixelation is affected, particularly around the hair, ears and under the chin (where normally there’d be shadow). Early issues with blinking have been resolved.

Facebook, Google and cyber-defence companies are working on solutions to what some call the internet’s latest step in eroding societal trust. Deeptrace is developing media authentication for videos that can be used manually as an app or integrated via an API.

“Deepfake voice is relatively new and we don’t have a big database of true voices to compare,” says Wanlei Zhou, head of the school of computer science at the University of Technology Sydney, but he’s tipping deepfake filters for video will become available for gateways as the tech matures. “The fakes will be filtered out automatically, just as we filter emails today.”
