Tagged: voice cloning

  • Geebo 8:00 am on June 20, 2023 Permalink | Reply
    Tags: voice cloning

    Mother convinced daughter arrested in AI scam 


    By Greg Collier

    If anyone could recognize their daughter’s voice with just a few short words, it would be their mother. At least, that’s what scammers are hoping as AI-generated voice spoofing scams continue to plague families.

    Within the past few months, we have seen an uptick in phone scams that use AI-generated voices. As we’ve previously discussed, there are two scams in which an AI-generated voice of the victim’s loved one makes the ploy seem more believable.

    One of those scams is the virtual kidnapping scam. That’s when scammers will call their victim to tell them that they’ve kidnapped one of the victim’s loved ones, while demanding a ransom. In actuality, the supposed kidnap victim is unaware they’re being used in a scam.

    The other scam is the grandparent scam, so named because scammers mostly target elderly victims and claim to be one of their grandchildren. The name can be a misnomer, though, as scammers will also target parents and spouses.

    One mother from Upstate New York was shopping for her daughter’s wedding when she received a call from scammers. She immediately heard her daughter’s voice saying she got into a car accident. But it wasn’t her daughter’s voice. Scammers had spoofed it using AI.

    Scammers only need a few seconds of someone’s voice to build an authentic-sounding AI model that captures the speaker’s cadence. They get their voice samples either from someone’s social media accounts or by making phone calls to their target. Since the daughter was preparing for her wedding, there may have been a wide variety of voice samples to choose from.

    But getting back to the scam, after the mother heard her daughter’s voice, a scammer got on the line posing as local police. They said the daughter caused a wreck while texting and driving, and needed $15,000 for bail.

    Thankfully, even though the woman was convinced that was her daughter’s voice, she did not fall victim to the scam. Instead, she called her daughter, who was in no danger at all.

    If you receive a phone call like this, try to contact the person who was supposedly arrested, even if you held a conversation on that call and the person sounded exactly like your loved one. Scammers will try to keep you on the phone, but no one has ever had their bail raised because a relative took a moment to verify the story.

     
  • Geebo 8:00 am on June 7, 2023 Permalink | Reply
    Tags: voice cloning

    Virtual kidnappings become more virtual 


    By Greg Collier

    The virtual kidnapping scam is called virtual because it’s not real. Scammers call a victim and pretend to have kidnapped one of the victim’s loved ones, then demand some kind of ransom payment that can typically be made online. The scammers keep the victim on the phone to ensure they can’t contact the loved one who has supposedly been kidnapped. Because the scam preys on the victim’s emotions, many people have fallen for it while their loved ones were unaware they were being used in a scam.

    More recently, scammers have made the virtual kidnapping scam more believable through AI-generated voice spoofing technology. Just as an aside, when we refer to programs like ChatGPT and DALL-E as AI, it’s actually a misnomer. A better way to describe them is machine learning programs, but the popular nomenclature has stuck, so we refer to them as AI.

    Anyway, scammers are now taking voice samples from people online and using them in the virtual kidnapping scam. For example, a man from Arizona recently received a phone call in which scammers said they had kidnapped his daughter. The man then heard his daughter’s voice on the call saying, “Papa, help me!” Her voice wasn’t robotic-sounding, as some may think. Voice spoofing has become so believable because it can mimic someone’s tone of voice as well. The scammers demanded $10,000 from the victim.

    Thankfully, the man’s daughter was unharmed. She was at school, unaware of what her father had been going through.

    Scammers get the voice samples used in the spoofing mainly from social media. It only takes a few seconds of audio to make a complete copy of someone’s voice. So, for any post that includes your child’s voice, you may want to limit who can access it.

    If you receive one of these phone calls, it’s hard not to believe what you’re hearing. However, as we like to stress, kidnappings for ransom are actually rare in the U.S. With that knowledge in mind, try to contact the supposed kidnap victim either on another phone or some other device. The chances are you’ll find they’re in no danger. In any event, you should contact local law enforcement and let them know what happened.

     
  • Geebo 8:00 am on May 22, 2023 Permalink | Reply
    Tags: voice cloning

    AI scams aren’t limited to just voice 


    By Greg Collier

    AI voice spoofing scams are on the rise and have really grabbed our attention recently. Again, this is when scammers take a sample of someone’s voice from online and run the sample through an AI program to make the voice say whatever they want. We see it mostly used in phone scams, where the scammers need the victim to believe they’re talking to a loved one. With the advent of AI-generated voices, scammers have gone back into their bag of tricks to make an older scam even more convincing: the deepfake video.

    A deepfake video is a manipulated or synthesized video created using artificial intelligence techniques. The AI is used to manipulate or replace the appearance and actions of individuals in existing videos, making it appear as though someone said or did something they didn’t actually say or do. Making the voice sound convincing in a deepfake used to require far more voice sampling than it does today. Now, bad actors only need a few seconds of someone’s voice to produce a convincing clone.

    Recently, a man in Louisiana received a video that appeared to come from his brother-in-law. The video was received over Messenger, and the man’s brother-in-law said in the video that he needed $250 and couldn’t explain why, just that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

    Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

    If you receive a phone call or instant message from a loved one asking for money, always verify their story before sending any funds. Even if it appears that it’s your loved one contacting you, verify the story. With advances in technology, you can’t believe your eyes or ears in situations like these.

     
  • Geebo 8:00 am on May 16, 2023 Permalink | Reply
    Tags: voice cloning

    Scam Round Up: A new stolen car scam and more 


    By Greg Collier

    This week, in the Round Up, we’ll be reviewing two scams we’ve discussed before and a new one that took even us by surprise.

    Today’s first scam is one that we thought we’d see more of, but that could just mean that victims aren’t coming forward. Anyway, the voice spoofing scam has found its way to another family, this time in Tacoma, Washington. The scammers spoofed the voice of the family’s 16-year-old daughter and said that she had been in a car wreck and needed $10,000. Scammers only need a few seconds of someone’s voice to be able to generate that person’s voice using AI technology.

    This voice spoofing technology has been used in both the grandparent scam, as discussed above, and the virtual kidnapping scam. Even if your ears are trying to convince you that you’re talking to a loved one, always verify the story. Try to use another device to contact that person, or have a code phrase set up with your family beforehand in case of an actual emergency.

    The second scam for today seems like it’s popping up more often lately, if the news is any indication. More homeowners have been receiving concerning letters in the mail that many think are coming from their mortgage company. In reality, the letters are from someone trying to sell a home warranty policy. However, the Better Business Bureau notes that the fine print should tell you all you need to know about the letter. In some instances, the letter says something similar to, “Not all consumers have previous coverage. We are not affiliated with your current mortgage.”

    If you have any questions or concerns about your mortgage or current home warranty, call those companies directly. Do not use any contact information contained in the letter.

    Lastly, it seems we’ve seen a number of car scams emerge, and this may be one of the most heinous. Selling a stolen car online is nothing new. It’s the buyers who pay the price once they find out that the car is stolen when they’re notified by either the DMV or the police. More recently, car scammers are taking the Vehicle Identification Number (VIN) of a car of a similar make and model, and using it on the stolen car.

    This way, when a buyer runs a vehicle history report, it comes back with the history of a car that hasn’t been stolen.

    However, this isn’t a perfect scam. Buyers should look out for any discrepancies between the vehicle history report and what the seller tells them. If there are any discrepancies, or there’s an issue with any paperwork, the buyer should walk away.

     
  • Geebo 8:00 am on May 3, 2023 Permalink | Reply
    Tags: voice cloning

    Scam Round Up: AI voice scam finds another victim and more 

    By Greg Collier

    This week in the round-up, we’ll be discussing three scams we’ve covered before that have popped up again recently.

    Our first scam is the Medicare card scam. Medicare issued new cards back in 2018 that use an ID number rather than the recipient’s Social Security number. This was done to help prevent Medicare fraud and ensure patient privacy. Ever since then, scammers have been trying to fool Medicare recipients into believing another new card is being issued, typically in an attempt to steal the victim’s Medicare information.

    The West Virginia Attorney General’s Office has issued a warning which says scammers are calling residents posing as Medicare, the Social Security Administration, or the Department of Insurance. The scammers are telling residents they need to turn in their paper Medicare cards for new plastic ones. This is not true. If Medicare were to issue new cards, they would announce it through the mail and not by calling Medicare recipients.

    The next scam pertains to families who have a loved one who is currently incarcerated. The Georgia Parole Board has issued its own warning to the families of the incarcerated. It reports that scammers are calling families and asking for money for the release of their family member, claiming the money is needed for an ankle monitor before the inmate can be released.

    According to the parole board, it will never call anyone’s family asking for money. Georgia residents are advised to check the parole board’s website to determine the current parole status of their family member.

    Our final scam is one that’s not that old and has been in the news a lot lately: the voice spoofing scam. Scammers are taking voice recordings from social media or spam phone calls and feeding them into an AI program that can replicate the person’s voice. So far, it’s mostly been used in the grandparent scam and the virtual kidnapping scam.

    An elderly couple from Texas fell victim to the grandparent scam when they heard the voice of their grandson asking for help. The AI-generated voice said he had been in an accident in Mexico and needed $1,000. Believing he was talking to his actual grandson, the grandfather sent the money.

    If you receive a call like this, don’t believe your ears, as they can be deceived. Instead, try to contact the person who is supposedly in danger before sending any money.

     
  • Geebo 8:00 am on April 27, 2023 Permalink | Reply
    Tags: voice cloning

    Man loses $38K to voice spoofing scam 


    By Greg Collier

    We haven’t seen a scam proliferate as fast as the voice spoofing scam in a while. Even scams like the Zelle scam, which took off like wildfire, didn’t spread this fast. For those who may just be learning about voice spoofing, or voice cloning as it’s sometimes called, scammers can spoof just about anyone’s voice. Using a voice recording taken from social media or spam phone calls, scammers can then use artificial intelligence (AI) programs to make that voice say just about anything they want.

    At the risk of sounding like a broken record, voice spoofing is typically used in two different scams, so far. One is the virtual kidnapping scam, and the other is the grandparent scam. Both scams rely on phone calls that need to sound as legitimate as possible, and using the voice of a victim’s loved one makes these scam calls sound more convincing than ever.

    The grandparent scam is a type of phone scam where a fraudster poses as a grandchild or another family member in distress and asks the targeted grandparent to send money immediately, often using wire transfers or gift cards, for a supposed urgent situation, such as bail or medical bills. The scam relies on the emotional manipulation and trust of the victim and often preys on their desire to help their loved ones.

    Before AI programs became so pervasive, scammers would always use some excuse as to why they didn’t sound like the victim’s grandchild. They would usually claim they had a broken nose or some other injury that made their voice sound different. Now, with voice spoofing, they don’t have to worry about that.

    Recently, an elderly man in Maryland fell victim to this scam. He received a call that sounded like it was coming from his granddaughter. The caller claimed they had been in an accident that sent several victims to the hospital. The fake granddaughter then turned the call over to a ‘lawyer’, who told the man he needed to send $38,000 for bail, which he did. It was only a few days later, when he texted his granddaughter, that he found out he had been scammed.

    Now, you may think this happened because elderly people are more vulnerable to scams like this. However, when a recording of the call was played for the granddaughter’s parents, they also said it sounded exactly like their daughter.

    There’s a saying that’s often attributed to Edgar Allan Poe that says, “Don’t believe anything you hear and only half of what you see.” That adage couldn’t be truer when it comes to the grandparent scam. Even if you hear the voice of a loved one saying they’re in trouble and need money, try to contact that loved one immediately. Don’t believe any claims that you can’t hang up the phone or requests not to talk to anyone else in the family, even if the caller claims there is a gag order.

     
  • Geebo 8:00 am on April 24, 2023 Permalink | Reply
    Tags: voice cloning

    AI kidnapping scam flourishes 

    By Greg Collier

    It’s been almost two months since we first noticed AI-generated voice cloning, or voice spoofing, scams starting to proliferate. Voice cloning technology is being used in scams where reproducing someone’s voice is imperative to making the scam seem more realistic. Typically, it’s being used in grandparent scams and virtual kidnapping scams, where scammers have always tried to imitate a victim’s loved one. Today, we’ll be focusing on the virtual kidnapping scam.

    Before consumer level AI programs became so accessible, kidnapping scammers would try to make it sound like a victim’s loved one had been kidnapped by having someone in the background screaming as if they were being assaulted. Now, a scammer only needs to obtain a few seconds of someone’s voice online to make a program where they can simulate that person saying just about anything. Scammers can obtain someone’s voice either through social media, or by recording a spam call made to that person.

    In Western Pennsylvania, a family received such a call from someone claiming to have kidnapped their teenage daughter. The call appeared to come from the daughter’s phone number, with the daughter’s voice saying she had been kidnapped, and her parents needed to send money. The scammer then got on the phone, threatening to harm the girl.

    In many instances, a call like this would have sent parents into a panic, potentially following the scammers’ instructions for a ransom payment.

    Thankfully, in this instance, the daughter was standing right next to her parents when they got the call.

    Even though new technology is being used by scammers, the old methods of precaution should still be used.

    If you receive such a call, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or, set up a family code word to use only if your loved one is in danger.

     
  • Geebo 8:00 am on April 11, 2023 Permalink | Reply
    Tags: voice cloning

    AI voice cloning used again in alarming scam 


    By Greg Collier

    Few things are more unnerving than the new tool scammers have added to their arsenal: AI-generated voice cloning. Potentially, scammers can make their voice sound like anyone’s, including your friends and family. Voice cloning can be very convincing when used in two scams in particular. The first is the grandparent scam, and the other is the virtual kidnapping scam.

    In a virtual kidnapping scam, the scammers call their victims claiming they are holding one of the victim’s loved ones hostage for ransom. Typically, the supposed kidnap victim is safe and unaware they’re being used in a scam.

    Previously, the scammers would do almost all of the talking, but they would have someone else in the background crying and screaming, who they claimed was the kidnap victim. Now, with voice cloning technology, scammers can make it seem like the victim’s loved one is on the phone with them. To make the scam more disturbing than it already is, the scammers only need three seconds of audio to clone the voice of someone, according to some reports.

    An Arizona woman found out all too well how the scam works when she received a call from someone who claimed to have kidnapped her 15-year-old daughter. She received a phone call from an unknown number, but when she picked up the call, she heard the voice of her daughter. The mother said her daughter sounded like she was crying, while her daughter’s voice said, “Mom, I messed up.”

    The next voice she heard was that of the supposed kidnapper. The caller threatened the woman, saying that if she called the police or anyone else, he would pump her daughter full of drugs, physically assault her, and leave her in Mexico if the woman didn’t pay a ransom. Then, in the background, the woman heard her daughter’s voice saying, “Help me, Mom. Please help me. Help me.” The scammer demanded $1 million in ransom before settling for $50,000.

    Thankfully, the woman was in a room with friends. The friends were able to not only call police, but also got a hold of the woman’s husband. The daughter in question was at home, totally unaware of what was going on.

    When it comes to the virtual kidnapping scam, we like to remind our readers that kidnapping for ransom is actually rare in the United States. However, child abductions are unfortunately a very real occurrence. This makes the scam even more terrifying for its victims.

    The girl’s mother should be commended though for doing the right thing even though her ears were being deceived. Even if it sounds like a loved one is in danger, always verify the scammer’s story.

    If you receive a call like this, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or have a family code word set up in advance that’s only to be used if the loved one is in danger.

    This may also be an opportunity for you to have a talk with your children about what they share on social media, since that’s where these scammers tend to find the voice samples they need.

     
  • Geebo 8:00 am on March 16, 2023 Permalink | Reply
    Tags: voice cloning

    AI voice used in kidnapping scam 

    By Greg Collier

    Just over a week ago, we posted about scammers using AI technology to clone a victim’s loved one’s voice for a grandparent scam. It seems this technique of cloning voices isn’t going away anytime soon. Just recently, AI voice cloning was used in a virtual kidnapping scam in Oklahoma, where the victim lost $3,000 to a scammer.

    Virtual kidnapping is a type of scam where a person receives a call or message claiming that their loved one has been kidnapped and demanding a ransom payment for their release. However, in most cases, the supposed victim is actually safe and not in any danger.

    Previously, in most virtual kidnapping scams, the scammers would do almost all of the talking, but they would have someone else in the background crying and screaming, who they claimed was the kidnap victim.

    In this most recent scam, the scam victim thought she was talking to her son and even said that the person on the phone sounded just like her son.

    It started like most virtual kidnapping scams do. The victim received a phone call from an unknown caller who told the woman they had kidnapped her adult son. The caller insinuated that the woman’s son interrupted a drug deal that cost the caller a lot of money. So, if the woman didn’t pay the money that was supposedly lost, they were going to harm her son. Typically, when the victim asks to speak to their loved one, the scammers will make excuses. However, this time, the victim spoke with someone who sounded just like her son.

    Panicked, the woman went to Walmart to wire $3,000 to someone in Mexico. The scammer kept her on the phone the entire time. After she made the payment, the impostor got back on the phone to say the kidnappers were letting him go. The scammers told her they would drop her son off at that Walmart, but he never appeared. Finally, she was able to get a hold of her son on the phone; he had been at work the entire time.

    The virtual kidnapping scam has been using fear to get victims to pay a phony ransom for years. But now, with the voice cloning technology, the scammers have stepped up the fear to another level. The scammers only need about a minute of your loved one’s voice to be able to clone it. They usually take the voice from recordings that can be found on social media.

    But even if it sounds like a loved one on the phone, the same old precautions should be used. If you receive a call like this, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or have a family code word set up in advance that’s only to be used if the loved one is in danger.

     
  • Geebo 9:00 am on March 6, 2023 Permalink | Reply
    Tags: voice cloning

    AI voices used in grandparent scam 


    By Greg Collier

    If you follow the tech news at all, you’ll no doubt have heard about how artificial intelligence (AI) has become increasingly popular in the past year or so. You may have heard of the art generator known as DALL-E. It can produce images using any prompt you can give it. For example, the above picture was generated with an AI program called Stable Diffusion, using the prompt of ‘AI Voice’. You may have also heard of ChatGPT, a text-based AI that can generate just about anything in text form. Do you want to craft a professional sounding email to your boss? ChatGPT can generate that for you. Do you want ChatGPT to craft the lyrics of a song in the style of The Doors about soup? It can do that too.

    However, the more important question is, is AI advanced enough to be used in scams? Yes, it is.

    This past weekend, The Washington Post published a story about AI being used in one of the more common scams we post about: the grandparent scam. For those who may be unfamiliar, the grandparent scam is a type of phone scam where fraudsters impersonate a grandchild or other family member in distress to trick elderly individuals into sending them money. Typically, scammers tell their elderly victims they’ve had some kind of facial injury, such as a broken nose, to explain why their voice sounds different from their actual grandchild’s.

    According to the Post, scammers are now using AI voice-cloning technology to sound exactly like the person they’re impersonating. Victims from both Canada and the United States have lost thousands of dollars to scammers using this technology.

    While voice cloning technology is nothing new, it has advanced exponentially in the past couple of years. It used to be someone would need vast amounts of recordings to accurately clone someone’s voice. Now, it only takes a 30-second recording to do so. If someone you know has posted a video or recording of themselves on social media where they’re talking, their voice can now be cloned.

    You can still protect yourself from this scam, as long as you disregard what your ears are telling you. If you receive a call from a relative or loved one asking for money because they’re in trouble, you should still follow the same precautions, even if it sounds exactly like them. Hang up and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Tell them the exact situation you encountered, and never keep it a secret. Lastly, never send money by any means.

     