The Star Malaysia - Star2

Deepfake audio used to impersonate CEO

- By ANGELIN YEOH lifestyletech@thestar.com.my

AN employee at an undisclosed tech company received a deepfake audio impersonating the voice of its CEO asking for assistance to finalise an “urgent business deal”, according to a security company that investigated the incident.

US-based Nisos told website Vice in a report that it analysed the voicemail that the employee received and determined that it was fake “synthetic audio” made to fool the receiver.

In the copy of the voicemail Nisos shared with Vice, a voice can be heard saying, “Hi (recipient’s name), this is (alleged CEO’s name). I need your assistance to finalise an urgent business deal.”

The employee who received the voicemail was suspicious and reported it, which led to Nisos’ investigation.

The firm used Spectrum3d, a spectrogram tool, to detect anomalies.

“You could tell there was something wrong about the audio.

“It looks like they basically took every single word, chopped it up, and then pasted them back in together,” Nisos researcher Dev Badlu said.

When he lowered the volume of the voice, he discovered there was no background noise, which to him was a clear sign of forgery. There were also too many stark peaks and valleys (high and low indicators), which don’t normally occur in regular speech.

Nisos said it also investigated the phone number the voicemail originated from and found that it was from a VoIP (Voice over Internet Protocol) service with no user info.
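Nisos has not published the details of its Spectrum3d analysis, but the tells Badlu describes – a missing background-noise floor and abrupt energy peaks between words – can be illustrated with a rough Python sketch using the librosa audio library. The file name and thresholds below are hypothetical, not Nisos’ actual method.

```python
# Illustrative sketch only: not Nisos' Spectrum3d tool, just a rough way to
# inspect a suspect recording for the anomalies described in the article.
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

audio_path = "suspect_voicemail.wav"  # hypothetical file name
y, sr = librosa.load(audio_path, sr=None)

# Spectrogram in decibels for visual inspection of spliced-together words.
S_db = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)

# Check 1: real phone recordings normally have a noise floor. A large share
# of near-silent frames between words can hint at synthetic, pasted audio.
rms = librosa.feature.rms(y=y)[0]
quiet_ratio = np.mean(rms < 1e-4)  # threshold chosen arbitrarily here
print(f"Proportion of near-silent frames: {quiet_ratio:.2%}")

# Check 2: unusually abrupt energy peaks and valleys between frames.
energy_jumps = np.abs(np.diff(rms))
print(f"Largest frame-to-frame energy jump: {energy_jumps.max():.4f}")

# Plot the spectrogram for manual review.
librosa.display.specshow(S_db, sr=sr, x_axis="time", y_axis="hz")
plt.colorbar(format="%+2.0f dB")
plt.title("Spectrogram of suspect voicemail")
plt.show()
```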

Last year, criminals used voice-mimicking software to copy the voice of a British executive and fooled a managing director at his company into transferring US$240,000 (RM1mil) to an account in Hungary.

Last May, two YouTubers managed to fool a number of celebrities into believing that they were doing interviews with TV host James Corden by using clips of his voice from other videos.

Criminals have also used machine learning to combine audio snippets from conference calls, YouTube videos and TED talks to copy the speech behaviour of company bosses.

Nisos offered a simple piece of advice to avoid getting duped – call the person back, as deepfake tech has not evolved to mimic an entire phone call.

The firm said it would expect criminals to use deepfake audio as the first step in a fraud attempt before using other forms of trickery to dupe their victims.

“We would anticipate a deepfake audio would be the first step in a series of social engineering attempts to get an employee to wire money to a specific location.

“Phishing emails, additional phone calls, or even deepfake videos purporting to authorise an action could be used in furtherance of the criminal scheme,” Nisos said in the report.
