Tagged: voice spoofing

  • Geebo 8:00 am on July 20, 2023
    Tags: voice spoofing

    Scam Round Up: Fake cops threaten tenants and more 

    By Greg Collier

    Our first scam of the day comes from a warning issued by the New York City Police Department. The NYPD says it has seen an increase in a charity scam involving Venmo and your phone. Scammers are approaching NYC residents while pretending to work for a charity.

    The scammers ask for a donation through the personal payment app Venmo. The victim is given the information to make the donation, but the donation won’t go through. That’s when the scammer asks for the victim’s phone to help them make the donation. Instead, the scammer transfers the entire balance of the victim’s Venmo account to themselves.

    The NYPD is telling residents not to hand their phones over to strangers, especially if they’re asking for donations. Please keep in mind, Venmo was intended to be used between family and friends.

    ***

    We’ve been keeping a close eye on scams that involve AI-generated voice-spoofing. Scammers take someone’s voice, either from social media or a voicemail greeting, and run it through an AI voice program that lets them make that voice say just about anything they want. Typically, voice-spoofing is used in the grandparent and virtual kidnapping scams, where scammers need the victim to believe they’re talking to a loved one.

    The most recent report we have on this comes out of Atlanta, where a mother was confronted with this scam. She received a call she thought was from her adult daughter. She heard her daughter’s voice before someone on the call said her daughter had seen something she shouldn’t have and had been kidnapped. The caller demanded $50,000 in ransom.

    Thankfully, her husband was able to get hold of her daughter, who was in no real danger.

    If you receive a phone call like this, always try to reach the person who has supposedly been kidnapped through other means. Even if you have a full conversation with someone who sounds just like your loved one, always verify the story. Ask them a question only they would know, or set up a family code word ahead of time that confirms who you’re talking to.

    ***

    Residents of Newark, New Jersey, have reported that people posing as police have been going around to tenants and demanding multiple months’ worth of rent. If the phony officers don’t get the money, they threaten the tenants with eviction and arrest.

    In New Jersey, an eviction can’t be carried out until the landlord has received a judgment in court.

    If you’re renting your home or apartment, you should familiarize yourself with your state’s or county’s eviction process.

    Also, keep in mind, legitimate police will never show up at your door asking for your rent money. If someone claiming to be police does show up at your door, call the police department they’re supposedly from and verify whether an officer has been dispatched to your home.

     
  • Geebo 8:00 am on July 13, 2023
    Tags: voice spoofing

    County official targeted in AI scam 

    By Greg Collier

    We’ve come across yet another story in which an AI-generated voice was used in a scam, this time in the Harrisburg, Pennsylvania, area. There, a county official received a phone call she thought was coming from her daughter. The voice on the other end sounded exactly like her daughter, sobbing and crying. Then a man got on the call and claimed to be a police officer. He said the daughter had caused a car accident while looking at her phone, and it wasn’t long before he asked for $15,000 in bail. Thankfully, while this was going on, the woman got a text message from her actual daughter, which spoiled the scam.

    Regular readers will recognize this as the grandparent scam, initially called that because scammers would target the elderly while posing as one of the victim’s grandchildren. ‘Grandparent scam’ is now a misnomer, because scammers have more recently been targeting parents as well. This is thanks to recent advances in AI technology. Scammers can now spoof the voice of just about anyone they want and make it say whatever they want, which turns a scam that was already concerning into something absolutely terrifying. Before voice-spoofing, a scammer had to imitate a loved one while claiming some injury, such as a broken nose, made their voice sound different. Now, scammers don’t even have to bother. All they need is a few seconds of someone’s voice taken from a video on social media.

    But as always, if you receive a distressing call from a supposed loved one who claims they’re in some kind of trouble, it is critical to verify their situation by contacting them directly. Scammers will try to keep you on the phone by threatening arrest if you hang up or by claiming there is some kind of gag order. Nothing is stopping you from hanging up to verify the story with your family or friends. Even if you’re convinced you’re hearing your loved one’s voice, always verify the story before any kind of payment is even considered.

     
  • Geebo 8:00 am on June 28, 2023
    Tags: voice spoofing

    AI voice-spoofing scam started earlier than we thought 

    By Greg Collier

    One of the many problems with scams is that, by the time the public hears about them, they’re already in full swing and have claimed numerous victims. For example, we’ve only been discussing the AI voice-spoofing scam for roughly two months. While we assumed the scam had been going on longer than that, we were unaware of just how far back it started. According to one recent report, at least one scam ring has been running the voice-spoofing scam since October of last year. We know the scam is at least that old because a suspect has been arrested for a scam dating back that far.

    In a voice-spoofing scam, scammers extract a sample of someone’s voice from online sources and manipulate it using AI technology to make it utter whatever phrases they want. This deceptive practice is commonly observed in phone scams, particularly those aimed at convincing victims that they are communicating with a trusted family member or loved one. So far, voice-spoofing seems to be used only in grandparent scams and virtual kidnapping scams, but it’s only a matter of time before scammers come up with new ways of using it against victims.

    Also, when we discuss voice-spoofing scams here in 2023, we’re referring to the new wave of voice-spoofing scams. There have been voice-spoofing scams in previous years, but they were primitive compared to today’s technology. Those older scams also needed several minutes of someone’s recorded voice to build a viable speech model. Today, scammers only need a few seconds of speech.

    Getting back to the matter at hand, a New Jersey man was recently arrested for allegedly scamming a Houston, Texas, woman out of $40,000. She thought the voice she was talking to was her son’s, who claimed to have been arrested. The alleged scammer then got on the phone posing as a public defender and asked the woman for bail money. The man was caught after investigators followed the money trail, since one of the payments was sent through a money transfer service. Notably, the victim in this case was scammed back in October 2022.

    Since scammers hardly ever work alone, more arrests may be following, and you can almost bet there are more victims out there.

    If you receive a distressing call from a supposed loved one requesting urgent financial assistance, it is crucial to verify their situation by promptly contacting them through alternative means. Do not entertain any assertions that prevent you from ending the call or consulting other family members. Stay vigilant and prioritize verifying the authenticity of such requests.

     
  • Geebo 8:00 am on June 20, 2023
    Tags: voice spoofing

    Mother convinced daughter arrested in AI scam 

    By Greg Collier

    If anyone could recognize their daughter’s voice with just a few short words, it would be their mother. At least, that’s what scammers are hoping as AI-generated voice spoofing scams continue to plague families.

    Within the past few months, we have seen a sharp uptick in phone scams that use AI-generated voices. As we’ve previously discussed, there are two scams where an AI-generated voice of the victim’s loved one makes the call seem more believable.

    One of those scams is the virtual kidnapping scam. That’s when scammers call their victim claiming to have kidnapped one of the victim’s loved ones and demand a ransom. In actuality, the supposed kidnap victim is unaware they’re being used in a scam.

    The other is the grandparent scam, so called because scammers mostly target elderly victims and claim to be one of their grandchildren. The name can be a misnomer, as scammers will also target parents and spouses.

    One mother from Upstate New York was shopping for her daughter’s wedding when she received a call from scammers. She immediately heard what sounded like her daughter’s voice saying she had been in a car accident. But it wasn’t her daughter’s voice. Scammers had spoofed it using AI.

    Scammers only need a few seconds of someone’s voice to make an authentic-sounding AI model, one that even captures the speaker’s cadence. They get their voice samples either from someone’s social media or by recording phone calls to their target. Since the daughter was preparing for her wedding, there may have been a wide variety of voice samples to choose from.

    But getting back to the scam: after the mother heard her daughter’s voice, a scammer got on the line posing as local police. They said the daughter had caused a wreck while texting and driving and needed $15,000 for bail.

    Thankfully, even though the woman was convinced that was her daughter’s voice, she did not fall victim to the scam. Instead, she called her daughter, who was in no danger at all.

    If you receive a phone call like this, try to contact the person who was supposedly arrested, even if you held a conversation on that call and the person sounded exactly like your loved one. Scammers will try to keep you on the phone, but no one ever had their bail raised while someone verified their story.

     
  • Geebo 8:00 am on June 7, 2023
    Tags: voice spoofing

    Virtual kidnappings become more virtual 

    By Greg Collier

    The virtual kidnapping scam is called virtual because it’s not real. Scammers call a victim and pretend to have kidnapped one of the victim’s loved ones, then demand a ransom that can typically be paid online. The scammers keep the victim on the phone to ensure they can’t contact the loved one who has supposedly been kidnapped. Because the scam appeals to the victim’s emotions, many people have fallen for it while their loved ones were unaware they were being used in a scam.

    More recently, scammers have made the virtual kidnapping scam more believable through AI-generated voice spoofing technology. Just as an aside, when we refer to programs like ChatGPT and Dall-E as AI, it’s actually a misnomer. A better way to describe them is machine learning programs, but the popular nomenclature has stuck, so we refer to them as AI.

    Anyway, scammers are now taking voice samples from people online and using them in the virtual kidnapping scam. For example, a man from Arizona recently received a phone call in which scammers said they had kidnapped his daughter. The man then heard his daughter’s voice on the call saying, “Papa, help me!” Her voice wasn’t robotic-sounding, as some might expect. Voice spoofing has become so believable because it can mimic someone’s tone of voice as well. The scammers demanded $10,000 from the victim.

    Thankfully, the man’s daughter was unharmed. She was at school, unaware of what her father had been going through.

    Scammers get the voice samples used in the spoofing mainly from social media. It only takes a few seconds of audio to make a complete copy of someone’s voice. So you may want to limit access to any post that includes your child’s voice.

    If you receive one of these phone calls, it’s hard not to believe what you’re hearing. However, as we like to stress, kidnappings for ransom are actually rare in the U.S. With that knowledge in mind, try to contact the supposed kidnap victim on another phone or some other device. The chances are you’ll find they’re in no danger. In any event, you should contact local law enforcement and let them know what happened.

     
  • Geebo 8:00 am on May 22, 2023
    Tags: voice spoofing

    AI scams aren’t limited to just voice 

    By Greg Collier

    AI voice spoofing scams are on the rise and have really grabbed our attention recently. Again, this is when scammers take a sample of someone’s voice from online and run it through an AI program to make the voice say whatever they want. We see it mostly in phone scams, where scammers need the victim to believe they’re talking to a loved one. With the advent of AI-generated voices, scammers have gone back into their bag of tricks to make an older scam even more convincing: the deepfake video.

    A deepfake video is a manipulated or synthesized video created using artificial intelligence techniques. The AI is used to manipulate or replace the appearance and actions of individuals in existing videos, making it appear as though someone said or did something they never actually said or did. Deepfakes used to require far more voice sampling to sound convincing. Now, bad actors need only a few seconds of someone’s voice to produce a convincing clone.

    Recently, a man in Louisiana received a video that appeared to come from his brother-in-law. The video was received over Messenger, and the man’s brother-in-law said in the video that he needed $250 and couldn’t explain why, just that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

    Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

    If you receive a phone call or instant message from a loved one asking for money, always verify their story before sending any funds. Even if it appears that it’s your loved one contacting you, verify the story. With advances in technology, you can’t believe your eyes or ears in situations like these.

     
  • Geebo 8:00 am on May 16, 2023
    Tags: voice spoofing

    Scam Round Up: A new stolen car scam and more 

    By Greg Collier

    This week, in the Round Up, we’ll be reviewing two scams we’ve discussed before and a new one that took even us by surprise.

    Today’s first scam is one that we thought we’d see more of, but that could just mean that victims aren’t coming forward. Anyway, the voice spoofing scam has found its way to another family, this time in Tacoma, Washington. The scammers spoofed the voice of the family’s 16-year-old daughter and said that she had been in a car wreck and needed $10,000. Scammers only need a few seconds of someone’s voice to be able to generate that person’s voice using AI technology.

    This voice spoofing technology has been used in the grandparent scam, as shown above, and the virtual kidnapping scam. Even if your ears are trying to convince you that you’re talking to a loved one, always verify their story. Try to use another device to contact that person. Or have a code phrase set up beforehand with your family in case of an actual emergency.

    The second scam for today seems like it’s popping up more often lately, if the news is any indication. More homeowners have been receiving concerning letters in the mail that many think are coming from their mortgage company. In reality, the letters are from someone trying to sell a home warranty policy. However, the Better Business Bureau notes that the fine print should tell you all you need to know about the letter. In some instances, the letter says something similar to, “Not all consumers have previous coverage. We are not affiliated with your current mortgage.”

    If you have any questions or concerns about your mortgage or current home warranty, call those companies directly. Do not use any contact information contained in the letter.

    Lastly, it seems a number of car scams have emerged recently, and this may be one of the most heinous. Selling a stolen car online is nothing new. It’s the buyer who pays the price, finding out the car is stolen only when notified by the DMV or the police. More recently, car scammers have been taking the Vehicle Identification Number (VIN) from a car of a similar make and model and using it on the stolen car.

    This way, when a buyer runs a vehicle history report, it comes back with the history of a car that hasn’t been stolen.

    However, this isn’t a perfect scam. A buyer should look out for any discrepancies between the vehicle history and what the seller is telling them. If there are any discrepancies, or there’s an issue with any paperwork, the buyer should walk away.

     
  • Geebo 8:00 am on May 3, 2023
    Tags: voice spoofing

    Scam Round Up: AI voice scam finds another victim and more 

    By Greg Collier

    This week in the round-up, we’ll be revisiting three scams we’ve covered before that have popped up again recently.

    Our first scam is the Medicare card scam. Medicare issued new cards back in 2018 that use an ID number rather than the recipient’s Social Security number. This was done to help prevent Medicare fraud and ensure patient privacy. Ever since then, scammers have been trying to fool Medicare recipients into believing another new card is being issued, typically to steal the victim’s Medicare information.

    The West Virginia Attorney General’s Office has issued a warning which says scammers are calling residents posing as Medicare, the Social Security Administration, or the Department of Insurance. The scammers are telling residents they need to turn in their paper Medicare cards for new plastic ones. This is not true. If Medicare were to issue new cards, they would announce it through the mail and not by calling Medicare recipients.

    The next scam pertains to families who have a loved one who is currently incarcerated. The Georgia Parole Board has issued its own warning to the families of the incarcerated. It reports that scammers are calling families and asking for money for the release of their family member. The scammers claim the money is needed for an ankle monitor before the inmate can be released.

    According to the parole board, it will never call anyone’s family asking for money. Georgia residents are advised to check the parole board’s website to determine their family member’s current parole status.

    Our final scam is one that’s not that old but has been in the news a lot lately: the voice spoofing scam. Scammers are taking voice recordings from social media or spam phone calls and feeding them to an AI program that can replicate that person’s voice. So far, it’s mostly been used in the grandparent scam and the virtual kidnapping scam.

    An elderly couple from Texas fell victim to the grandparent scam when they heard the voice of their grandson asking for help. The AI-generated voice said he had been in an accident in Mexico and needed $1,000. Believing he was talking to his actual grandson, the grandfather sent the money.

    If you receive a call like this, don’t believe your ears, as they can be deceived. Instead, try to contact the person who is supposedly in danger before sending any money.

     
  • Geebo 8:00 am on April 27, 2023
    Tags: voice spoofing

    Man loses $38K to voice spoofing scam 

    By Greg Collier

    We haven’t seen a scam proliferate as fast as the voice spoofing scam in a while. Even scams like the Zelle scam, which took off like wildfire, didn’t spread this fast. For those who may just be learning about voice spoofing, or voice cloning as it’s sometimes called, scammers can spoof just about anyone’s voice. Using a voice recording taken from social media or spam phone calls, scammers can then use artificial intelligence (AI) programs to make that voice say just about anything they want.

    At the risk of sounding like a broken record, voice spoofing is typically used in two different scams, so far. One is the virtual kidnapping scam, and the other is the grandparent scam. Both scams rely on phone calls that need to sound as legitimate as possible, and using the voice of a victim’s loved one makes these scam calls sound more convincing than ever.

    The grandparent scam is a type of phone scam where a fraudster poses as a grandchild or another family member in distress and asks the targeted grandparent to send money immediately, often using wire transfers or gift cards, for a supposed urgent situation, such as bail or medical bills. The scam relies on the emotional manipulation and trust of the victim and often preys on their desire to help their loved ones.

    Before AI programs became so pervasive, scammers would always use some excuse as to why they didn’t sound like the victim’s grandchild. They would usually claim they had a broken nose or some other injury that made their voice sound different. Now, with voice spoofing, they don’t have to worry about that.

    Recently, an elderly man in Maryland fell victim to this scam. He received a call that sounded like it was coming from his granddaughter. The caller claimed she had been in an accident that sent several people to the hospital. The fake granddaughter then turned the call over to a ‘lawyer’, who told the man he needed to send $38,000 for bail, which he did. It was only a few days later, when he texted his granddaughter, that he found out he had been scammed.

    Now, you may think an elderly person is simply more vulnerable to scams like this. However, when a recording of the call was played for the granddaughter’s parents, they also said it sounded exactly like their daughter.

    There’s a saying that’s often attributed to Edgar Allan Poe that says, “Don’t believe anything you hear and only half of what you see.” That adage couldn’t be truer when it comes to the grandparent scam. Even if you hear the voice of a loved one saying they’re in trouble and need money, try to contact that loved one immediately. Don’t believe any claims that you can’t hang up the phone or requests not to talk to anyone else in the family, even if the caller claims there is a gag order.

     
  • Geebo 8:00 am on April 24, 2023
    Tags: voice spoofing

    AI kidnapping scam flourishes 

    By Greg Collier

    It’s been almost two months since we first noticed AI-generated voice cloning, or voice spoofing, scams starting to proliferate. Voice cloning technology is being used in scams where reproducing someone’s voice is imperative to making the scam seem more realistic. Typically, that means grandparent scams and virtual kidnapping scams, where scammers have always tried to imitate a victim’s loved one. Today, we’ll be focusing on the virtual kidnapping scam.

    Before consumer-level AI programs became so accessible, kidnapping scammers would try to make it sound like a victim’s loved one had been kidnapped by having someone scream in the background as if they were being assaulted. Now, a scammer only needs a few seconds of someone’s voice from online to build a model that can simulate that person saying just about anything. Scammers can obtain someone’s voice either through social media or by recording a spam call made to that person.

    In Western Pennsylvania, a family received such a call from someone claiming to have kidnapped their teenage daughter. The call appeared to come from the daughter’s phone number, with the daughter’s voice saying she had been kidnapped and that her parents needed to send money. The scammer then got on the phone, threatening to harm the girl.

    In many instances, a call like this would send parents into a panic, potentially leading them to follow the scammers’ instructions for a ransom payment.

    Thankfully, in this instance, the daughter was standing right next to her parents when they got the call.

    Even though scammers are using new technology, the old precautions still apply.

    If you receive such a call, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or, set up a family code word to use only if your loved one is in danger.

     