Malaysia Oversight

AI scams are getting real: Here are the cases happening in Malaysia that you should know about

By MalayMail on August 4, 2025 – Reading time: 3 minutes


KUALA LUMPUR, Aug 4 — Scams used to be easy to spot — all it took was some bad grammar, a weird link, or a dodgy phone call.

But in today’s digital era, fraudsters are using artificial intelligence (AI) to impersonate people we know and trust in order to steal money or personal data.

Malay Mail has compiled some of the real-life cases behind the AI-powered fraud wave:

Voice-cloning scams via phone or WhatsApp

In May this year, a woman lost RM5,000 after falling victim to a sophisticated voice cloning scam that used AI to mimic her employer’s voice, The Rakyat Post reported.

The incident occurred during a routine workday at a local shop when the company phone rang repeatedly.

On the line was someone who sounded exactly like her boss, requesting several Touch ‘n Go (TnG) PINs and claiming it was an urgent matter.

It wasn’t the first time he had made such requests, so she didn’t hesitate.

The woman quickly went from one convenience store to another, purchasing RM5,000 worth of TnG top-up codes and sending them as instructed.

Then the line went dead.

When she eventually managed to contact her real boss through a different channel, he confirmed he had never made the call.

His phone had been off the entire time.

Police later confirmed it was an AI-driven scam.

In 2024, The Star reported at least three AI voice scam cases in which victims lost thousands of ringgit.

In Kuala Terengganu, a travel agent lost RM49,800 after receiving a highly convincing phone call from someone who sounded exactly like her close friend.

Believing her friend was in urgent trouble, she transferred the money without hesitation.

In Kuala Lumpur, a 26-year-old interior designer was scammed out of RM3,000 in a similar incident, where the caller impersonated a trusted contact using AI-generated audio.

In Penang, a 50-year-old housewife fell victim to the same tactic, losing RM4,800 after speaking with a familiar-sounding voice on the other end of the line.

Last year, the police investigated 454 fraud cases involving deepfake technology, with total reported losses amounting to RM2.72 million, according to Bukit Aman Commercial Crime Investigation Department (CCID) director Datuk Seri Ramli Mohamed Yoosuf.

He said these scams frequently involve the use of AI-generated voices to impersonate family members, friends, or acquaintances, often via WhatsApp voice calls or messages.

Scammers typically claim to be in urgent need of help and request money through bank transfers or prepaid top-up PINs.

Deepfake video investment scams featuring VIPs

Scammers are now leveraging AI to produce highly convincing videos of politicians, business leaders, and celebrities to trick victims into bogus investment schemes.

These AI-generated deepfake videos commonly feature well-known figures including Prime Minister Datuk Seri Anwar Ibrahim, tycoon Tan Sri Robert Kuok, former chief justice Tun Tengku Maimun Tuan Mat, and Capital A Bhd CEO Tan Sri Tony Fernandes, appearing to endorse fake investment opportunities and quick-money schemes.

Even the monarchy wasn’t spared — on July 10, the Johor Royal Press Office issued a public warning after detecting an AI-generated deepfake video of His Majesty Sultan Ibrahim, King of Malaysia, on Facebook, falsely promoting an investment scheme.

The palace reminded the public that impersonating the King is a serious offence and urged people not to fall for these scams.

On Saturday (July 5), MCA Public Services and Complaints Department head Datuk Seri Michael Chong said Malaysians lost a staggering RM2.11 billion to such scams last year, with 13,956 cases reported.

“The AI-generated videos look so real that people can’t tell the difference. Anyone watching would think it is the prime minister himself asking the public to invest, unaware that it’s an AI-generated fake,” Chong was quoted as saying by the New Straits Times.

He also said 85 per cent of victims were convinced to invest after watching fake promotional videos featuring seemingly genuine endorsements from public figures.
