AI voices used in grandparent scam

By Greg Collier

If you follow tech news at all, you'll no doubt have heard how artificial intelligence (AI) has surged in popularity over the past year or so. You may have heard of image generators like DALL-E, which can produce a picture from just about any prompt you give it. The picture above, for example, was generated with a similar AI program called Stable Diffusion, using the prompt 'AI Voice'. You may have also heard of ChatGPT, a text-based AI that can generate just about anything in written form. Do you want to craft a professional-sounding email to your boss? ChatGPT can generate that for you. Do you want ChatGPT to write the lyrics of a song about soup in the style of The Doors? It can do that too.

The more important question, however, is whether AI is advanced enough to be used in scams. Unfortunately, it is.

This past weekend, The Washington Post published a story about AI being used in one of the scams we post about most often, the grandparent scam. For those who may be unfamiliar, the grandparent scam is a type of phone scam where fraudsters impersonate a grandchild or other family member in distress to trick elderly victims into sending them money. Typically, scammers will claim they've suffered some kind of facial injury, such as a broken nose, to explain why their voice sounds different from the actual grandchild's.

According to the Post, scammers are now using AI voice-cloning technology to sound exactly like the person they’re impersonating. Victims from both Canada and the United States have lost thousands of dollars to scammers using this technology.

While voice-cloning technology is nothing new, it has advanced exponentially in the past couple of years. It used to be that accurately cloning someone's voice required vast amounts of recordings. Now, a 30-second recording is enough. If someone you know has posted a video or recording of themselves talking on social media, their voice can now be cloned.

You can still protect yourself from this scam, as long as you disregard what your ears are telling you. If you receive a call from a relative or loved one asking for money because they're in trouble, follow the same precautions you always would, even if the caller sounds exactly like them. Hang up and contact the person who's supposedly in trouble. If you can't reach them, ask other family members who might know where they are. Tell them the exact situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.