Tagged: voice spoofing

  • Geebo 9:00 am on February 28, 2024 Permalink | Reply
    Tags: voice spoofing

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted how she received a call from someone who sounded like her daughter, claiming to have been in an accident and demanding a $2,000 wire transfer to prevent her kidnapping.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on February 22, 2024 Permalink | Reply
    Tags: voice spoofing

    Victim avoids arrest scam, still loses money 

    By Greg Collier

    The scam, known by various aliases such as the arrest warrant scam or the jury duty scam, is a form of police impersonation scheme that has become increasingly common. In this fraudulent tactic, perpetrators masquerade as representatives from the victim’s local law enforcement agency, often spoofing official phone numbers to enhance their credibility.

    The typical method involves informing the victim of an outstanding arrest warrant, commonly fabricated for offenses like missing jury duty, though the pretext can vary. The primary objective of the scam is to coerce the victim into paying a fine purportedly to resolve the warrant. Payment is typically demanded through channels that are difficult to trace, such as money transfers, gift cards, or cryptocurrency.

    However, in a recent incident reported in Orlando, Florida, a victim discovered that the scam had a secondary agenda if the initial ploy failed. A fraudster posing as an Orange County Sheriff’s Deputy contacted the victim, attempting to instill fear of imminent arrest and requesting payment via a money order. Although the victim resisted this primary coercion attempt, the scammers had a backup plan.

    Remarkably, the scammers possessed extensive personal information about their target, including their name, address, date of birth, and complete Social Security number. Such data can be obtained through illicit means, such as purchasing from other criminals or harvesting from data breaches. To execute their fallback strategy, the scammers required a voice recording of the victim.

    The victim’s bank utilized voice verification for transaction authorization. Exploiting this vulnerability, the scammers swiftly used a recorded snippet of the victim’s voice to siphon $900 from her account on the same day.

    It remains unclear whether the perpetrators employed advanced AI-generated voice spoofing tools, or if they resorted to a variation of the “Can you hear me now?” scam. In the latter, scammers prompt victims to utter affirmative responses, aiming to record them for potential circumvention of voice-based authorizations.

    Scammers can effortlessly manipulate caller ID to falsely display a phone number associated with law enforcement agencies, creating the illusion of an official call. However, it’s crucial to note that legitimate police practices differ significantly from these deceptive tactics. Law enforcement agencies typically do not issue arrest warrant notifications over the phone; instead, they prefer personal visits. Furthermore, it’s important to recognize that authentic law enforcement entities never demand fine payments over the phone while issuing threats of arrest.

    If you see a call appearing to be from the police on your caller ID, it’s wise to let it go to voicemail. Afterward, listen to the message carefully. To ensure there’s no urgent matter requiring your attention, it’s prudent to directly call your local police department using their non-emergency number. This approach helps confirm the legitimacy of the call and prevents falling prey to potential scams.

     
  • Geebo 9:00 am on January 12, 2024 Permalink | Reply
    Tags: family emergency, voice spoofing

    More police warn of AI voice scams 

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories, the virtual kidnapping scam, and the grandparent scam. Today, we’re focused on the grandparent scam. It garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening a lot lately in the Memphis area, to the point where a Sheriff’s Office has issued a warning to local residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like their granddaughter, who said she needed $500 for bail money after getting into a car accident. Smartly, the family kept asking the caller questions that only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 9:00 am on November 27, 2023 Permalink | Reply
    Tags: voice spoofing

    The FTC puts a bounty on AI voice cloning 

    By Greg Collier

    AI-generated voice cloning, or voice spoofing, scams have become such a nuisance that the federal government is turning to the public to help solve the problem. If you’re unfamiliar with AI voice generation technology, there are apps and programs that can take a short sample of anyone’s voice and make that voice say whatever you want it to. One benefit is that it can give a voice back to people who have lost the ability to speak. However, every tool made for the good of mankind can also be used to its detriment.

    Scammers use cloned voices in what are known as emergency scams. Emergency scams break down, for the most part, into two categories: the grandparent scam and the virtual kidnapping scam. In both scams, the scammers need to convince the victim that one of the victim’s loved ones is in some sort of peril. In the case of the grandparent scam, the scammer will try to convince the victim their loved one is in jail and needs bail money, while in the virtual kidnapping scam, the scammers try to convince the victim their loved one has been kidnapped for ransom.

    Scammers will take a sample of someone’s voice, typically from a video that’s been posted to social media. Then, they’ll use the voice cloning technology to make it sound like that person is in a situation that requires the victim to send money.

    Voice cloning has become such a problem that the Federal Trade Commission has issued a challenge to anyone who thinks they can develop some kind of voice cloning detector. The top prize winner can receive $25,000, the runner-up $4,000, and three honorable mentions $2,000 each.

    In their own words, the FTC has issued this challenge to help push forward ideas to mitigate risks upstream—shielding consumers, creative professionals, and small businesses against the harms of voice cloning before the harm reaches a consumer.

    The online submission portal can be found at this link, and submissions will be accepted from January 2 to 12, 2024.

    Hopefully, someone can come up with the right idea to better help consumers from losing their money to these scammers.

     
  • Geebo 8:00 am on September 26, 2023 Permalink | Reply
    Tags: voice spoofing

    New version of grandparent scam changes the target 

    By Greg Collier

    If you haven’t heard of the grandparent scam, it’s called that because it mostly targets the elderly. The way it works is, scammers will call their elderly target and pose as one of the target’s grandchildren. The call usually starts with the scammer saying something like “Grandpa?”. They’re hoping the target will respond with a grandchild’s name, replying with something along the lines of, “Is this Brandon?”. The scammer will answer yes to whatever name they’re supplied with. Then the real grift begins.

    While posing as the grandchild, the scammer will tell their target they’ve gotten into legal trouble and need money to fix the situation. Typically, the phony grandchild will claim they’ve been in a car accident that was their fault and need money for bail or some other legal fee. Sometimes, the call is passed off to the scammer’s partner, who will pose as the police, a bail bondsman, or attorney to add an element of urgency to the target.

    Payment is usually requested through means that are difficult to trace or recover, such as cryptocurrency, gift cards, or payment apps like Zelle, Cash App and Venmo. The target is also instructed not to tell anyone else in the family, sometimes under the threat of a gag order.

    That’s how the grandparent scam traditionally worked until the development of AI voice-spoofing technology. Now, the grandparent scam has become more focused, with scammers targeting specific victims instead of random elderly people.

    With that development, the Better Business Bureau has issued a warning that scammers have also flipped the script on the grandparent scam. According to the BBB, scammers are now posing as grandparents in distress on these scam phone calls. Thanks to AI voice-spoofing, scammers are now targeting children and grandchildren instead of just the elderly with this scam. You can imagine how panicked this would make the victim of this new version of the scam.

    However, the ways to protect yourself remain the same. Educating your family about the scam is the best defense. Your family should also set up a code word you can use to verify the identity of the person who is calling. Or, you could ask the caller a question only they would know the answer to. Lastly, don’t believe your ears when you get a call like this, it may sound like your loved one, but now, scammers can mimic any voice down to a T.

     
  • Geebo 8:00 am on September 19, 2023 Permalink | Reply
    Tags: voice spoofing

    The sheer terror of the kidnapping scam 

    By Greg Collier

    Even if someone has complete knowledge of how a certain scam works, that doesn’t necessarily mean they won’t fall victim to it, because some scams are so thoroughly menacing. Take, for example, the virtual kidnapping scam. This is when a scammer calls someone and claims to have kidnapped their loved one before making a ransom demand. Meanwhile, the supposed kidnap victim is unharmed and has no idea they’re being used in a scam. With the advancement of AI voice-spoofing technology, scammers can easily mimic the voice of the victim’s loved one to make the scam seem even more threatening.

    With that knowledge in mind, we may think we wouldn’t fall for such a scam as we sit at our keyboards and screens. But can you say that with 100% confidence? Before you answer, you should know the story of an Atlanta father who fell victim to the scam.

    He received a call from someone who claimed they kidnapped his adult daughter. At the time of the call, the man’s daughter was traveling. This could be why the man was targeted, as scammers often take information they find on social media and use it to their advantage. The caller claimed he got into a car accident with the man’s daughter and that they were carrying a substantial amount of cocaine at the time.

    The caller threatened the life of the man’s daughter, saying that they couldn’t have anyone recognize them. This was accompanied by screams and cries in the background that replicated his daughter’s voice. This was followed up with threats of torture and other bodily harm to the daughter if the man didn’t comply. For the sake of decorum, we won’t reprint specifically what the threats entailed, but imagine the worst thing that could happen to a loved one of your own, and then you have an idea of the terror that was unfolding.

    The father complied with the scammer’s demands and sent $2,500 to the scammer’s bank account, probably through an app like Zelle.

    Even if armed with the knowledge of how the virtual kidnapping scam works, in the heat of the moment, no one could be blamed for falling victim to the scam. However, there are still ways to try to protect yourself from the scam. The best way is to set up a code word between you and your loved ones. This way, in cases of calls like this, you can know if you’re actually talking to your loved one or not. Or, you could also ask them a question that only the supposed kidnap victim would know.

    While it’s easier said than done, try to remain calm in the situation, even while your ears may be deceiving you. Make attempts to contact your loved one through other means. If you can, attempt to have someone else reach them on a different phone.

    Please keep in mind, virtual kidnapping scams rely on manipulation and intimidation. By staying calm, and taking the necessary precautions, you can protect yourself and your loved ones from falling victim to these schemes.

     
  • Geebo 8:00 am on September 7, 2023 Permalink | Reply
    Tags: voice spoofing

    Kidnapping scam brings terror to family 

    By Greg Collier

    For the better part of this year, we’ve been warning our readers about scams that use AI mimicked voices of your loved ones. Typically, these spoofed voices are used in the grandparent scam and the virtual kidnapping scam. In these scams, it’s crucial for the scammers to make their victims believe that a member of the victim’s family is in immediate danger. To that end, scammers will steal a recording of someone’s voice, usually from social media.

    That voice sample is then run through an AI program that will allow the scammer to make the voice say anything they want it to, such as pleas for help. It’s gotten to the point where we believe the voice spoofing versions of these scams have become more common than their analog predecessors. For now, we think it’s pretty safe to assume if there’s a grandparent or virtual kidnapping scam, an AI voice clone is probably involved.

    For example, two parents in Ohio almost fell victim to the virtual kidnapping scam. They received a call that sounded like it was coming from their adult daughter. The parents said the call sounded like their daughter was in a panic. The voice said she was blindfolded and being held in a trunk. Then a male voice got on the call, claiming to be a kidnapper who would harm their daughter if they didn’t pay a ransom.

    To make matters worse, the supposed kidnapper knew the daughter’s name and the area where she worked. This made the claim of kidnapping seem more credible to the parents.

    At first, the parents did the right thing. They tried calling their daughter from another line, but were unable to get a hold of her. Then they called 911, but were still under the impression their daughter had been legitimately kidnapped.

    They went to get the ransom from their bank, but the branch had just closed. The caller instructed the parents to go to a local Walmart, probably to send a money transfer to the scammers. Thankfully, the police caught up with the parents to let them know their daughter was unharmed and the call was a scam.

    Not everyone is up on the latest scams, so just imagine the sense of fear and terror they must have experienced. However, all it takes is a little bit of knowledge to protect yourself from this scam. As we often cite, kidnappings for ransom are actually quite rare in the U.S. If you have a loved one who is active on social media, scammers can use the information shared to make it seem like they’ve been plotting a kidnapping for a while. Again, this is done to make their con seem more authentic.

    In the unfortunate event you receive a call like this, do exactly what these parents did. Contact the loved one who has been supposedly kidnapped on another line. The odds are you’ll find them not only safe, but unaware they’re being used in a scam. Then call the police for their assistance. Lastly, even if it sounds like the exact voice of your loved one, be skeptical, as these days, voices can be easily duplicated.

     
  • Geebo 8:00 am on September 1, 2023 Permalink | Reply
    Tags: voice spoofing

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice from online, and run it through an AI program, which allows the scammers to make the voice say whatever they want. The scammers then use the person’s voice to convince that person’s loved one to send the scammers money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first one is the virtual kidnapping scam, which is exactly what it sounds like. Scammers will use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, and the scammers will demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her on the phone using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs and needed bail money. A scammer then got on the call and pretended to be the granddaughter’s attorney and instructed the woman on how she could send payment. The woman was also instructed not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is if you pay them once, chances are they’ll come back for more money, which is what happened here. For weeks, the phony granddaughter kept calling back needing more money each time for various legal proceedings. Keep in mind that with each conversation, the grandmother is not actually talking to anybody but a computer-generated voice, which sounds exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     
  • Geebo 8:00 am on July 28, 2023 Permalink | Reply
    Tags: voice spoofing

    Scam Round Up: Weird AI scam and more 

    By Greg Collier

    Our first scam comes to us from Athens, Texas, where residents have been experiencing a twist on the arrest warrant scam, also known as a police impersonation scam. Typically, when scammers pose as police, they’ll call their intended victims and tell them they have a warrant out for their arrest. The scammers usually claim it’s for missed jury duty, but they can also cite a number of other infractions.

    For example, residents of Athens have complained the scammers are accusing their victims of using their phone to transmit a photo that traumatized a child. Essentially, the scammers accused their victims of sending explicit material to a child. The victim is then asked to pay several hundred dollars over the phone to resolve the complaint.

    That’s not how arrest warrants work. If there is a warrant for your arrest, especially one that’s supposedly this serious, the police are not going to notify you over the phone. Also, no law enforcement agency will ask for money over the phone, let alone ask for it in hard-to-trace ways like gift cards or cryptocurrency.

    If you receive a call like this, hang up and call your local police at their non-emergency number. Not only can you verify there is no warrant for your arrest, you can let the police know scammers are working in your area.

    ***

    Police in Connecticut are warning residents there has been an uptick in check washing. Check washing typically involves stealing checks that are in outgoing mail. Thieves often steal the mail from residential mailboxes, along with the outdoor drop-off boxes used by the US Postal Service. They then dip the written checks in a chemical solution that removes the ink from the check, so the thieves can write the checks to themselves.

    The police in Connecticut are also warning residents the thieves can steal checks out of your trash. If you use your bank’s mobile app to deposit checks, and then throw the checks out, make sure they’re properly shredded before throwing them out, as check washing can still be performed on voided checks.

    If you have to write a check that’s going in the mail, use a gel-based ink pen. The ink in gel pens is said to be more resistant to check washing. Also, don’t put the envelope that holds the check in your mailbox and then put the mailbox flag up. The raised flag is a signal to thieves there may be a check inside.

    ***

    Lastly, we’ve read about another AI voice-spoofing scam. There has been a rash of these scams nationwide over the past year or so. In this scam, the victim gets a phone call where the voice sounds exactly like one of the victim’s loved ones. The scammers manipulate the loved one’s voice so it sounds like the actual loved one is in some kind of trouble and needs money to resolve the issue. Typically, the scammers ask for bail money, or in some cases a ransom. However, the loved one is usually unaware their voice is being used in a scam.

    However, a recent news article we read out of Alabama suggests scammers are also using the voice-spoofing technique for identity theft. An Alabama woman received a call she thought was from her brother, but it was actually from scammers. Instead of asking for money, they asked the woman for personal information. They then used this information to hijack her Facebook account and use it for additional scams. Police there have said the scammers used videos the brother posted on social media to mimic his voice with AI.

    We can’t say for sure, but it sounds like the scammers may have been asking for the answers to the woman’s security questions, the kind used to recover a lost Facebook password. Considering the answers to these questions are things like “What was your first pet’s name?” or “What city were you born in?”, these may seem like innocuous questions coming from a close family member.

    In cases like this, it’s best to ask the family member calling a question only they would know to verify their identity.

     
  • Geebo 8:00 am on July 20, 2023 Permalink | Reply
    Tags: voice spoofing

    Scam Round Up: Fake cops threaten tenants and more 

    By Greg Collier

    Our first scam of the day comes to us from a warning from the New York City Police Department. The NYPD says they’ve seen an increase in a charity scam that involves Venmo and your phone. Scammers are approaching NYC residents while pretending they’re working for a charity.

    The scammers will ask for a donation through the personal payment app Venmo. The victim will be provided the information to make the donation, but the donation won’t go through. This is when the scammer will ask for the victim’s phone to help them make the donation. Instead, the scammers send the entire balance of the victim’s Venmo account to themselves.

    The NYPD is telling residents not to hand their phones over to strangers, especially if they’re asking for donations. Please keep in mind, Venmo was intended to be used between family and friends.

    ***

    We’ve been keeping a close eye on the scams that involve AI-generated voice-spoofing. Scammers will take someone’s voice either from social media or their voicemail message and run it through an AI voice program that will allow them to make someone’s voice say just about anything they want. Typically, voice-spoofing is used in the grandparent and virtual kidnapping scams. In these scams, scammers need the victim to believe they’re talking to a loved one.

    The most recent report we have on this is out of Atlanta, where a mother was confronted with this scam. She received a call she thought was from her adult daughter. She heard her daughter’s voice before someone on the call said her daughter saw something she shouldn’t have and has now been kidnapped. The caller demanded $50,000 in ransom.

    Thankfully, her husband was able to get a hold of her daughter, who was in no real danger.

    If you receive a phone call like this, always try to reach the person who has been supposedly kidnapped through other means. Even if you have a full conversation with someone who sounds just like your loved one, always verify the story. Ask them a question only they would know, or set up a family code word ahead of time that would signify who you were talking to.

    ***

    Residents of Newark, New Jersey, have reported that people posing as police have been going around to tenants and demanding multiple months worth of rent. If the phony officers don’t get the money, they threaten the tenants with eviction and arrest.

    In New Jersey, an eviction can’t be carried out until the landlord has received a judgment in court.

    If you’re renting your home or apartment, you should familiarize yourself with your state’s or county’s eviction process.

    Also, keep in mind, legitimate police will never show up at your door asking for your rent money. If someone claiming to be police does show up at your door, call the police department they’re supposedly from and verify if an officer has been dispatched to your home.

     