Police have issued an urgent warning about a rising trend of “silent call” scams, where fraudsters use artificial intelligence (AI) to clone a victim’s voice and deceive family members into transferring money.
In a viral video posted by user Izzul Islam, two police officers explained that victims often receive calls that connect but stay silent on the caller's end. When the victim says "hello" repeatedly or speaks for several seconds, scammers are believed to record the audio for cloning.
Just three to five seconds of speech is enough for AI tools to generate a convincing replica of a person’s voice, the officers said.
Once scammers obtain the sample, they use the AI-generated voice to contact family members—often parents—claiming the victim is in trouble and urgently needs money. By mimicking tone and speech patterns, the fraudsters create panic and push relatives to transfer funds to a third-party account.
Police described the tactic as a dangerous evolution in commercial crime, with AI making impersonation scams far more believable.
To reduce the risk of being targeted, authorities advised the public not to greet unknown callers or speak before the caller identifies themselves.
“If the caller has genuine business, they will introduce themselves. If they remain silent, hang up immediately,” one officer said, noting Malaysians often answer calls with greetings, making them easy targets for voice harvesting.
Police also urged families to maintain alternative ways of contacting each other to verify emergency requests, especially those involving money, as scammers increasingly rely on AI-generated voices.