AI a potential new tool for financial scammers
KUALA LUMPUR: The application of artificial intelligence (AI) now spans various fields and is easily accessible to people from all walks of life as long as they are relatively tech-savvy.
However, AI – a branch of computer science developed to simulate human intelligence in machines that are programmed to think and act like humans – is a double-edged sword in that it can be used for beneficial or nefarious purposes.
In terms of the latter, AI can be applied to orchestrate sophisticated scams, one of which involves the use of AI-generated deepfake technology that enables scammers to create realistic audio and video impersonations of trusted individuals.
While Malaysia has not recorded any cases of AI-related crime, numerous incidents of this nature have been reported in other countries.
In China, a businessman identified as Guo was nearly cheated of 4.3 million yuan (about RM2.8 million) in May last year after a scammer used AI to impersonate a close friend of his.
It was reported that the “friend” wanted to borrow some money and persuaded Guo to transfer the sum. Fortunately, the businessman realised he was being scammed after discovering that his friend’s identity had been stolen and that the friend had no knowledge of the transaction. Guo alerted the police and the bank involved and recovered 3.4 million yuan, while efforts to reclaim the remaining funds were ongoing.
In another case, reported in the United States early last year, a woman was contacted by a person who claimed her daughter had been kidnapped and demanded a ransom for her release. To convince the mother that her daughter had been abducted, the “kidnapper” used AI to clone the girl’s voice. The case shocked the US authorities as the AI-generated voice was highly convincing.
Commenting on the use of AI by fraudsters, Syahrul Nizam Junaini, a research fellow at the Data Science Centre, Universiti Malaysia Sarawak, warned that Malaysia will not be spared such crimes as the use of AI technology in the country grows.
He said the sophistication of this technology will make it possible for cybercriminals to orchestrate financial scams, especially when personal data is stored in the cloud.
“These perpetrators often target individuals based on information gleaned from social media,” he told Bernama.
According to Syahrul Nizam, the AI used in scams typically involves sophisticated software capable of analysing and replicating an individual’s visual and audio characteristics.
“This technology can mimic one’s speech patterns, speaking style, intonation and even facial expressions to the extent that it becomes challenging to distinguish between genuine and fake,” he said, adding that deepfake is AI software capable of generating fake videos of individuals.
He said the sophistication of this technology will pose significant challenges to law enforcement agencies in handling crimes, particularly scams and other financial offences.
“The use of AI technology to perpetrate criminal activities demands law enforcement (personnel) to be equipped with specialised expertise (to handle such cases), especially in the field of digital forensics as evidence for such fraud cases is in digital form.
“Therefore, understanding and knowledge of AI are crucial to gather and analyse evidence,” he said.