AI scams aren’t limited to just voice

By Greg Collier

AI voice spoofing scams are on the rise and have grabbed our attention recently. To recap, this is when scammers take a sample of someone’s voice from online and run it through an AI program to make the voice say whatever they want. We see it mostly in phone scams, where the scammers need you to believe you’re talking to a loved one. With the advent of AI-generated voices, scammers have reached back into their bag of tricks to make an older scam even more convincing: the deepfake video.

A deepfake video is a manipulated or synthesized video created using artificial intelligence. The AI is used to alter or replace the appearance and actions of individuals in existing footage, making it appear as though someone said or did something they never actually said or did. In the past, making the voice in a deepfake sound convincing required a large amount of voice sampling. Now, bad actors need only a few seconds of someone’s voice to produce a convincing clone.

Recently, a man in Louisiana received a video that appeared to come from his brother-in-law. The video arrived over Messenger, and in it the brother-in-law said he needed $250 and couldn’t explain why, only that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

If you receive a phone call, instant message, or video from a loved one asking for money, always verify their story before sending any funds. Even if it appears that it’s your loved one contacting you, verify the story. With advances in technology, you can’t believe your eyes or ears in situations like these.