Tagged: AI

  • Geebo 8:00 am on April 27, 2023 Permalink | Reply
    Tags: AI

    Man loses $38K to voice spoofing scam 

    By Greg Collier

    We haven’t seen a scam proliferate as fast as the voice spoofing scam in a while. Even scams like the Zelle scam, which took off like wildfire, didn’t spread this fast. For those who may just be learning about voice spoofing, or voice cloning as it’s sometimes called, scammers can spoof just about anyone’s voice. Using a voice recording taken from social media or spam phone calls, scammers can then use artificial intelligence (AI) programs to make that voice say just about anything they want.

    At the risk of sounding like a broken record, voice spoofing has so far been used in two scams in particular. One is the virtual kidnapping scam, and the other is the grandparent scam. Both scams rely on phone calls that need to sound as legitimate as possible, and using the voice of a victim’s loved one makes these scam calls sound more convincing than ever.

    The grandparent scam is a type of phone scam where a fraudster poses as a grandchild or another family member in distress and asks the targeted grandparent to send money immediately, often using wire transfers or gift cards, for a supposed urgent situation, such as bail or medical bills. The scam relies on the emotional manipulation and trust of the victim and often preys on their desire to help their loved ones.

    Before AI programs became so pervasive, scammers would always use some excuse as to why they didn’t sound like the victim’s grandchild. They would usually claim they had a broken nose or some other injury that made their voice sound different. Now, with voice spoofing, they don’t have to worry about that.

    Recently, an elderly man in Maryland fell victim to this scam. He received a call that sounded like it was coming from his granddaughter. The caller claimed they had been in an accident that sent several victims to the hospital. The fake granddaughter then turned the call over to a ‘lawyer’ who told the man that he needed to send $38,000 for bail, which he did. It was only a few days later, when he texted his granddaughter, that he found out he had been scammed.

    Now, you may think this happened because the victim was an elderly person, someone more vulnerable to scams like this. However, when a recording of the call was played for the granddaughter’s parents, they also said it sounded exactly like their daughter.

    There’s a saying that’s often attributed to Edgar Allan Poe that says, “Don’t believe anything you hear and only half of what you see.” That adage couldn’t be truer when it comes to the grandparent scam. Even if you hear the voice of a loved one saying they’re in trouble and need money, try to contact that loved one immediately. Don’t believe any claims that you can’t hang up the phone or requests not to talk to anyone else in the family, even if the caller claims there is a gag order.

     
  • Geebo 8:00 am on April 24, 2023 Permalink | Reply
    Tags: AI

    AI kidnapping scam flourishes 

    It’s been almost two months since we first noticed AI-generated voice cloning, or voice spoofing, scams starting to proliferate. Voice cloning technology is being used in scams where reproducing someone’s voice is imperative to making the scam seem more realistic. Typically, it’s being used in grandparent scams and virtual kidnapping scams, where scammers have always tried to imitate a victim’s loved one. Today, we’ll be focusing on the virtual kidnapping scam.

    Before consumer-level AI programs became so accessible, kidnapping scammers would try to make it sound like a victim’s loved one had been kidnapped by having someone in the background screaming as if they were being assaulted. Now, a scammer only needs to obtain a few seconds of someone’s voice online to use a program that can simulate that person saying just about anything. Scammers can obtain someone’s voice either through social media, or by recording a spam call made to that person.

    In Western Pennsylvania, a family received such a call from someone claiming to have kidnapped their teenage daughter. The call appeared to come from the daughter’s phone number, with the daughter’s voice saying she had been kidnapped, and her parents needed to send money. The scammer then got on the phone, threatening to harm the girl.

    In many instances, a call like this would send parents into a panic, potentially leading them to follow the scammer’s instructions for a ransom payment.

    Thankfully, in this instance, the daughter was standing right next to her parents when they got the call.

    Even though scammers are using new technology, the old precautions still apply.

    If you receive such a call, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or, set up a family code word to use only if your loved one is in danger.

     
  • Geebo 8:00 am on April 11, 2023 Permalink | Reply
    Tags: AI

    AI voice cloning used again in alarming scam 

    By Greg Collier

    Few things are more unnerving than the new tool scammers have added to their arsenal, AI-generated voice cloning. Potentially, scammers can make their voice sound like anyone. That includes your friends and family. Voice cloning can be very convincing when used in two scams in particular. The first one is the grandparent scam, and the other is the virtual kidnapping scam.

    In a virtual kidnapping scam, the scammers call their victims claiming they are holding one of the victim’s loved ones hostage for ransom. Typically, the supposed kidnap victim is safe and unaware they’re being used in a scam.

    Previously, the scammers would do almost all of the talking, but they would have someone else in the background crying and screaming, who they claimed was the kidnap victim. Now, with voice cloning technology, scammers can make it seem like the victim’s loved one is on the phone with them. To make the scam more disturbing than it already is, the scammers only need three seconds of audio to clone the voice of someone, according to some reports.

    An Arizona woman found out all too well how the scam works when she received a call from someone who claimed to have kidnapped her 15-year-old daughter. She received a phone call from an unknown number, but when she picked up the call, she heard the voice of her daughter. The mother said her daughter sounded like she was crying, while her daughter’s voice said, “Mom, I messed up.”

    The next voice she heard was from the supposed kidnapper. The caller threatened the woman, saying that if she called the police or anyone else, he would pump her daughter full of drugs, physically assault her, then leave her in Mexico if she didn’t pay a ransom. Then, in the background, the woman heard her daughter’s voice saying, “Help me, Mom. Please help me. Help me.” The scammer demanded $1 million in ransom before settling for $50,000.

    Thankfully, the woman was in a room with friends. The friends were able not only to call the police, but also to get hold of the woman’s husband. The daughter in question was at home, totally unaware of what was going on.

    When it comes to the virtual kidnapping scam, we like to remind our readers that kidnapping for ransom is actually rare in the United States. However, child abductions are unfortunately a very real occurrence. This makes the scam even more terrifying for its victims.

    The girl’s mother should be commended, though, for doing the right thing even though her ears were being deceived. Even if it sounds like a loved one is in danger, always verify the scammer’s story.

    If you receive a call like this, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or have a family code word set up in advance that’s only to be used if the loved one is in danger.

    This may also be an opportunity for you to have a talk with your children about what they share on social media, since that’s where these scammers tend to find the voice samples they need.

     
  • Geebo 9:00 am on March 6, 2023 Permalink | Reply
    Tags: AI

    AI voices used in grandparent scam 

    By Greg Collier

    If you follow the tech news at all, you’ll no doubt have heard about how artificial intelligence (AI) has become increasingly popular in the past year or so. You may have heard of the art generator known as DALL-E. It can produce images using any prompt you can give it. For example, the above picture was generated with an AI program called Stable Diffusion, using the prompt of ‘AI Voice’. You may have also heard of ChatGPT, a text-based AI that can generate just about anything in text form. Do you want to craft a professional sounding email to your boss? ChatGPT can generate that for you. Do you want ChatGPT to craft the lyrics of a song in the style of The Doors about soup? It can do that too.

    However, the more important question is, is AI advanced enough to be used in scams? Yes, it is.

    This past weekend, The Washington Post published a story about AI being used in one of the more common scams we post about, the grandparent scam. For those who may be unfamiliar, the grandparent scam is a type of phone scam where fraudsters impersonate a grandchild or other family member in distress to trick elderly individuals into sending them money. Typically, scammers will tell their elderly victims that they’ve had some kind of facial injury, such as a broken nose, to explain why their voice sounds different from their actual grandchild’s.

    According to the Post, scammers are now using AI voice-cloning technology to sound exactly like the person they’re impersonating. Victims from both Canada and the United States have lost thousands of dollars to scammers using this technology.

    While voice cloning technology is nothing new, it has advanced exponentially in the past couple of years. It used to be that someone would need vast amounts of recordings to accurately clone someone’s voice. Now, it only takes a 30-second recording to do so. If someone you know has posted a video or recording of themselves talking on social media, their voice can now be cloned.

    You can still protect yourself from this scam, as long as you disregard what your ears are telling you. If you receive a call from a relative or loved one asking for money because they’re in trouble, you should still follow the same precautions, even if it sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Tell them the exact situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     
  • Geebo 8:00 am on May 28, 2019 Permalink | Reply
    Tags: AI

    Could Deep Fakes ruin our world? 

    Recently, a video was distributed on social media of Speaker of the House Nancy Pelosi that made it look and sound like she was slurring her speech. While the video was determined to be fake, it was put out by someone who supposedly did not support Pelosi’s politics. However, with politics being what it is in this country today, there were people who believed it to be real. While this particular video was made using simple editing tricks on actual footage, it does bring up the matter of what will happen when ‘Deep Fakes’ become more prevalent in our society and media.

    For those of you unfamiliar with the term Deep Fake, it refers to a process where someone can take a single image or video and, using an AI-assisted program, turn the original into just about anything the fabricator wants. For example, if you wanted to make it look like a beloved celebrity said that they enjoyed stealing candy from babies, you probably could. Now take that same process and imagine it being used against candidates running for President. Potentially, deep fakes could be used to make it look like any candidate was saying or doing something completely detrimental to their campaign.

    [youtube https://www.youtube.com/watch?v=gLoI9hAX9dw]

    The deep fakes could become so commonplace that we wouldn’t be able to tell what was real and what wasn’t. If someone were to commit a heinous act caught on video, all they would need to say is that the video was a deep fake, and scores of people would believe them. Thankfully, the technology is not yet at the point where a deep fake is indistinguishable from the real thing, but it could only be a matter of time before it is. When technologies are used by bad actors, it usually takes law enforcement and government some time to catch up before designing the tools needed to fight them.

     