KUALA LUMPUR: Police have raised the alarm over a new wave of “silent call” scams, where fraudsters use artificial intelligence (AI) technology to mimic a victim’s voice and deceive family members into transferring money.
In a viral video posted by user Izzul Islam, two police officers explained how victims typically receive a call that appears connected but remains silent when answered.
When the victim responds with repeated "hellos" or speaks for several seconds, the scammers are believed to record a sample of the voice.
According to the officers, as little as three to five seconds of audio is sufficient for AI technology to replicate a person’s voice convincingly.
“Once they obtain the voice sample, scammers can generate messages or calls using the victim’s voice. They may then contact a family member, such as a parent, pretending that the victim is in urgent need of money,” one officer said.
In such cases, the fraudster imitates the victim’s tone and speech patterns to create urgency, often directing the family member to quickly transfer funds into a third-party account.
“When parents hear what sounds like their child’s voice, panic sets in. They believe it is genuine and immediately send money,” the officer said.
Police said the tactic marks an evolution in commercial crime, with AI now being exploited to make impersonation scams more believable.
To reduce the risk of being targeted, police advised the public to avoid speaking first when answering unknown calls.
“If someone truly intends to speak to you, they will introduce themselves and state their purpose. Do not greet first and do not say anything until the caller identifies themselves,” the officer said.
They added that Malaysians often instinctively begin calls with greetings, making them vulnerable to voice harvesting.
“Let the caller speak first. If they remain silent or do not identify themselves, end the call immediately,” the officer said.
The public is also encouraged to maintain alternative contact channels with family members to verify any emergency request, especially one involving money.
Police said that scammers are increasingly relying on AI to manipulate voices, making verification more important than ever.