VESTAVIA HILLS, Ala. — Vestavia Hills police are warning parents about a “disturbing” new scam in which criminals use artificial intelligence to clone children’s voices and call families with fake emergencies to demand money.
In a social media alert this week, the Vestavia Hills Police Department said scammers are scraping short clips of kids’ voices from social media and other online posts, then running them through AI tools to generate audio that sounds like a child pleading for help. Parents may then receive a call from an unfamiliar or spoofed number, hear what appears to be their child crying or begging for assistance, and be told to send money immediately.
The department described the scheme as part of a wave of AI-enabled “virtual kidnapping” and family-in-distress scams that have surfaced across the country, in which fraudsters create a high-pressure crisis to override victims’ skepticism. Similar warnings have recently been issued by the Shelby County Sheriff’s Office and other law enforcement agencies, which say scammers are using cloned voices of children and other relatives to claim they’re jailed, hurt or being held hostage.
Police are urging families to establish a private code word or phrase that only close relatives know and to use it to verify any supposed emergency call before sending money. They also recommend hanging up, contacting the child or another trusted adult directly and refusing to pay anyone who demands wire transfers, cryptocurrency or gift cards under urgent threats.
Experts say AI voice cloning has become cheaper and easier to use, allowing scammers to build convincing fake audio from just a few seconds of recorded speech. Federal officials and consumer advocates have warned that the technology is fueling a rise in emotionally charged scams targeting parents, grandparents and other caregivers with fabricated emergencies involving loved ones.