AI kidnapping scam flourishes
It’s been almost two months since we first noticed AI-generated voice cloning, or voice spoofing, scams starting to proliferate. Voice cloning technology is being used in scams where reproducing someone’s voice makes the con seem far more realistic. Typically, it shows up in grandparent scams and virtual kidnapping scams, where scammers have always tried to imitate a victim’s loved one. Today, we’ll be focusing on the virtual kidnapping scam.
Before consumer-level AI programs became so accessible, kidnapping scammers would try to make it sound like a victim’s loved one had been kidnapped by having someone scream in the background as if they were being assaulted. Now, a scammer only needs a few seconds of someone’s voice to feed into a program that can simulate that person saying just about anything. Scammers can obtain someone’s voice either through social media or by recording a spam call made to that person.
In Western Pennsylvania, a family received such a call from someone claiming to have kidnapped their teenage daughter. The call appeared to come from the daughter’s phone number, with the daughter’s voice saying she had been kidnapped and that her parents needed to send money. The scammer then got on the phone, threatening to harm the girl.
In many instances, a call like this would send parents into a panic, potentially leaving them to follow the scammer’s instructions for a ransom payment.
Thankfully, in this instance, the daughter was standing right next to her parents when they got the call.
Even though scammers are using new technology, the old precautions still apply.
If you receive such a call, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or, set up a family code word to use only if your loved one is in danger.