AI-Cloned Voices Used in New Scam to Fake Kidnappings

My family recently experienced a scary situation when my dad’s Instagram account got hacked. It started on an ordinary day, when my dad received a message saying his account had been temporarily blocked due to suspicious activity. He followed the instructions in the message to regain access but, to his dismay, he was unable to log in. After several attempts, he realized that his account had been hacked and that the scammers had changed the email address and phone number associated with it.

In a panic, my dad contacted Instagram support to report the issue, but they were unable to help him because he couldn’t verify his identity. I messaged my dad’s page and, to my surprise, the scammers responded. They demanded $500 to give him back access to his account. Of course, I refused to pay, but the scammers were persistent and kept threatening to delete the account if we didn’t pay up.

After a long and stressful ordeal, my dad was finally able to regain access to his account, but it was a scary experience that made us realize how vulnerable we all are to online scams.

This incident reminded me of an article I read about scammers using AI technology to clone children’s voices to fake kidnappings. It’s a new kind of scam that parents need to be aware of.

Scammers are using AI technology to clone children’s voices and pretend they have been kidnapped. In a recent incident, a mom in Arizona received a call from an unknown number, and when she picked up, she heard what sounded like her 15-year-old daughter sobbing on the other end. The voice said, “Mom, I messed up,” and a man in the background threatened to take her to Mexico if he didn’t get $1 million. The mom was understandably distraught and contacted her husband, who confirmed that their daughter was safe upstairs.


  • How it Works

    The scammers need only a short clip of a child’s voice to clone it convincingly and pretend the child has been kidnapped. They then demand a ransom from the child’s parents.

  • An Expert's Advice

    Experts advise being suspicious of any call from an unknown number and questioning anyone who pressures you to send money immediately. Unfortunately, even our ears can no longer be trusted, according to a professor at Arizona State University who specializes in AI. It’s crucial to report incidents like this to the authorities, as they do investigate and sometimes catch the perpetrators.

  • In Conclusion

    My dad’s hacked Instagram account and the AI voice-cloning kidnapping scam are stark reminders of how important it is to be vigilant and cautious online. It’s a scary world out there, and scammers are always finding new ways to trick people. We must remain alert and educated to protect ourselves and our loved ones.
