AI kidnapping scam copied teen girl’s voice in $1M extortion attempt

A mother in Arizona was left shaken after she narrowly avoided paying thousands of dollars to scammers who convinced her they were holding her 15-year-old daughter hostage.

Jennifer DeStefano told local news station KPHO that she never doubted for a single moment that it was her daughter on the line.

“It was completely her voice,” the Scottsdale resident said in a video interview last week.

DeStefano recounted that she got a call from an unfamiliar phone number while she was at her other daughter’s dance studio. She almost let it go to voicemail, but picked up because her 15-year-old was out of town skiing and she feared there might have been an accident.

“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano said. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

That’s when a man’s voice took over the call, seeming to order DeStefano’s daughter to lie back down.

“This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico,'” DeStefano said.

“And at that moment, I just started shaking. In the background, she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”

DeStefano said the voice was indistinguishable from that of her real daughter, who was later confirmed to be safe and had never been in any danger.

“It was completely her voice. It was her inflection. It was the way she would have cried,” DeStefano said.

The man on the line demanded US$1 million for her daughter’s safe return. DeStefano told him she didn’t have that much money, and he eventually lowered the “ransom” to US$50,000.

Because DeStefano was at her other daughter’s dance studio, she was surrounded by other worried parents who caught on to the situation. One called 911 and another called DeStefano’s husband.

Within four minutes, they were able to confirm that DeStefano’s supposedly kidnapped daughter was safe, KPHO reported.

DeStefano hung up on the scammers and broke down crying.

“It all just seemed so real,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

In reality, the 15-year-old had never said any of the things her mother heard on the phone that day. Though police are still investigating the extortion attempt, it’s believed the scammer used artificial intelligence (AI) software to clone the teen’s voice.

AI-generated voices are already being used on-screen to replicate actors. One recent example is James Earl Jones, now 92 years old, whose iconic voice has aged since his portrayal of Darth Vader in Star Wars. Last year, the actor signed off on a deal allowing Disney to use AI to recreate his voice as it sounded in his original performance for the TV series Obi-Wan Kenobi.

Experts say that as the technology improves, AI voice generation is becoming easier for everyday people to access and use; it is no longer solely in the hands of Hollywood and computer programmers.

It used to take extensive recordings to create a believable cloned voice; now it can take just seconds of recorded speech.

In January, an AI research lab that had released a beta tool for synthetic speech shared a Twitter thread revealing that “a set of actors” were using the company’s technology for “malicious purposes.”

ElevenLabs wrote that its VoiceLab technology was being used in a growing “number of voice cloning misuse cases,” which led the company to roll out a series of new features to make its synthetic speech more easily verifiable as AI-generated and to place the tool behind a paywall.

Dan Mayo, the assistant special agent in charge of the FBI’s Phoenix office, told KPHO that scams using AI voice-cloning technology are becoming more common. This kind of fraud “happens on a daily basis,” he said, though not every victim reports it.

“Trust me, the FBI is looking into these people, and we will find them,” Mayo said.

Mayo is urging people to keep their social media profiles private rather than visible to the public, since public posts are usually how scammers find samples of a person’s voice to clone.

“If you have (social media accounts) public, you’re allowing yourself to be scammed by people like this, because they’re going to be looking for public profiles that have as much information as possible on you, and when they get a hold of that, they’re going to dig into you,” Mayo said.

Earlier this month, a couple in Canada was reportedly scammed out of $21,000 after someone claiming to be a lawyer convinced them their son was in jail for killing a diplomat in a car accident.

The scammer used an AI-generated voice to pose as the couple’s son, pleading with his parents to pay his bogus legal fees, the Washington Post reported.

The son told the outlet that the voice was “close enough for my parents to truly believe they did speak with me.”

The couple sent the scammer money through Bitcoin, only realizing their son was never in danger when he called to check in later that evening.

Need to report fraud or cybercrime in Canada? You can report it to the Canadian Anti-Fraud Centre.
