Tagged: voice spoofing

  • Geebo 9:00 am on November 5, 2024
    Tags: voice spoofing

    A Mother’s Close Call with AI Voice Cloning

    By Greg Collier

    Imagine the terror of receiving a phone call with a familiar voice in distress, only to realize it was a cruel, high-tech scam. This harrowing experience recently befell a mother in Grand Rapids, Michigan, who nearly lost $50,000 over a weekend to a sophisticated AI-driven scam. The technique, known as ‘voice cloning’, mimicked the voice of her daughter so convincingly that it bypassed her natural skepticism and sent her scrambling to respond to what seemed like an emergency.

    It started with a phone call from an unknown number, coming from a town her daughter often frequented. With her daughter’s faint, panicked voice on the other end, she felt an instant urgency and fear that something was gravely wrong. Then, as she listened, the tone shifted; a stranger seized control of the call, asserting himself as a captor and demanding an immediate ransom. Her daughter’s supposed voice—distorted, mumbled, and terrified—amplified the mother’s fears. Desperation began to cloud her judgment as she debated how to produce such a vast sum on short notice.

    In her fear and confusion, she was prepared to do whatever it took to ensure her daughter’s safety. She was ready to withdraw cash, find neighbors who might accompany her, and meet the caller, who had directed her to a local hardware store for the exchange. Fortunately, her husband kept a level head: while she negotiated, he placed a call to the local police department. They advised him to contact their daughter directly, which he did, only to find she was safe and sound, unaware of the horrifying call her mother had just endured.

    This unsettling experience highlights a chilling reality of today’s world: the power of artificial intelligence to manipulate emotions, creating distressing scenarios with fabricated voices. These AI scams work by exploiting easily accessible samples of people’s voices, often found in social media videos or recordings. Voice cloning technology, once a futuristic concept, is now accessible and advanced enough to replicate a person’s voice with unsettling accuracy from just a brief clip.

    The Better Business Bureau advises those targeted by similar scams to resist the urge to act immediately. The shock of hearing a loved one’s voice in peril can push us to respond without question, but taking a pause, verifying the caller’s claims, and contacting the loved one directly are critical steps to prevent falling victim.

    Protecting yourself from AI-driven voice cloning scams requires both awareness and a proactive approach. Start by being mindful of what you share online, especially voice recordings, as even brief audio clips on social media can provide the material needed for cloning. Reducing the number of public posts containing your voice limits potential exposure, making it harder for scammers to replicate.

    Establishing a safe word with family members is also an effective precaution. A unique, shared phrase can act as a verification tool in emergency calls. If you ever receive a call claiming a loved one is in distress, use this word to confirm their identity. By doing so, you create a reliable check against scams, especially when emotions run high.

    It’s essential to take a moment to verify information before reacting. Scammers count on people’s tendency to act on instinct, especially when fear and urgency are involved. If you receive an alarming call, try to reach the person directly using a familiar number. Verifying information before sending money or following instructions can keep you from falling victim to such fraud.

    In the end, a calm, measured approach, grounded in verification and pre-established safety measures, can make all the difference in staying protected against AI-driven threats.

     
  • Geebo 8:00 am on August 2, 2024
    Tags: voice spoofing

    Bank worker foils kidnapping scam for a mother

    By Greg Collier

    Last week, a mother from Pasco, Washington, received a call that turned her world upside down. An unknown caller informed her that her daughter had been kidnapped. It was only later that she realized this call, which appeared to come from a local number, was part of a sophisticated virtual kidnapping scam. The scammers even allowed her to briefly ‘speak’ with her daughter. Although it is unclear exactly how they mimicked her daughter’s voice, it’s more than likely they used AI voice spoofing technology.

    Adding to the mother’s terror, the scammers threatened that if she attempted to call her daughter, they would harm her. Convinced that her daughter was in danger, the mother sent the kidnappers all the money she had, $600. But the demands didn’t stop there.

    The scammers coerced her into sending an additional $5,000 and directed her to a credit union to take out a loan. Thankfully, a vigilant employee at the credit union sensed something was amiss when they noticed the woman was on the phone with someone claiming to have her daughter. The employee promptly alerted the police.

    As the mother remained on the line with the supposed kidnappers, an officer collaborated with the credit union employees to obtain her address. Police arrived at her home to find her daughter safe and sound. The officer then drove the daughter to the credit union, where she and her mother were finally reunited.

    It’s important to recognize the signs of a virtual kidnapping scam. The calls do not originate from the alleged victim’s phone. Scammers strive to keep the target on the phone for extended periods and cannot answer simple questions about the person they claim to have kidnapped, such as their appearance.

    Anytime a stranger insists on keeping you on the phone, it’s a major red flag and likely an indication of a scam. Scammers rely on creating a sense of urgency and fear, preventing you from thinking clearly or seeking outside help. If you find yourself in such a situation, hang up immediately, verify the information independently, and reach out to local authorities.

     
  • Geebo 9:00 am on February 28, 2024
    Tags: voice cloning, voice spoofing

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted how she received a call from someone who sounded exactly like her daughter, claiming to have been in an accident and demanding a $2,000 wire transfer to keep her from being kidnapped.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on February 22, 2024
    Tags: voice spoofing

    Victim avoids arrest scam, still loses money 

    By Greg Collier

    The scam, known by various aliases such as the arrest warrant scam or the jury duty scam, is a form of police impersonation scheme that has become increasingly common. In this fraudulent tactic, perpetrators masquerade as representatives from the victim’s local law enforcement agency, often spoofing official phone numbers to enhance their credibility.

    The typical method involves informing the victim of an outstanding arrest warrant, commonly fabricated for offenses like missing jury duty, though the pretext can vary. The primary objective of the scam is to coerce the victim into paying a fine purportedly to resolve the warrant. Payment is typically demanded through channels that are difficult to trace, such as money transfers, gift cards, or cryptocurrency.

    However, in a recent incident reported in Orlando, Florida, a victim discovered that the scam had a secondary agenda if the initial ploy failed. A fraudster posing as an Orange County Sheriff’s Deputy contacted the victim, attempting to instill fear of imminent arrest and requesting payment via a money order. Although the victim resisted this primary coercion attempt, the scammers had a backup plan.

    Remarkably, the scammers possessed extensive personal information about their target, including their name, address, date of birth, and complete Social Security number. Such data can be obtained through illicit means, such as purchasing from other criminals or harvesting from data breaches. To execute their fallback strategy, the scammers required a voice recording of the victim.

    The victim’s bank utilized voice verification for transaction authorization. Exploiting this vulnerability, the scammers swiftly used a recorded snippet of the victim’s voice to siphon $900 from her account on the same day.

    It remains unclear whether the perpetrators employed advanced AI-generated voice spoofing tools, or if they resorted to a variation of the “Can you hear me now?” scam. In the latter, scammers prompt victims to utter affirmative responses, aiming to record them for potential circumvention of voice-based authorizations.

    Scammers can effortlessly manipulate caller ID to falsely display a phone number associated with a law enforcement agency, creating the illusion of an official call. Legitimate police practices, however, differ significantly from these deceptive tactics. Law enforcement agencies typically do not give notice of arrest warrants over the phone; they serve warrants in person. And authentic law enforcement entities never demand payment of a fine over the phone under threat of arrest.

    If you see a call appearing to be from the police on your caller ID, it’s wise to let it go to voicemail. Afterward, listen to the message carefully. To ensure there’s no urgent matter requiring your attention, it’s prudent to directly call your local police department using their non-emergency number. This approach helps confirm the legitimacy of the call and prevents falling prey to potential scams.

     
  • Geebo 9:00 am on January 12, 2024
    Tags: voice spoofing

    More police warn of AI voice scams

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories: the virtual kidnapping scam and the grandparent scam. Today, we’re focused on the grandparent scam, which garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening a lot lately in the Memphis area, to the point where a Sheriff’s Office has issued a warning to local residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like the granddaughter and said she needed $500 in bail money after getting into a car accident. Wisely, the family kept asking the caller questions only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 9:00 am on November 27, 2023
    Tags: voice spoofing

    The FTC puts a bounty on AI voice cloning

    By Greg Collier

    AI-generated voice cloning, or voice spoofing, scams have become such a nuisance that the federal government is turning to the public to help solve the problem. If you’re unfamiliar with AI voice generation technology, there are apps and programs that can take a short sample of anyone’s voice and make that voice say whatever you want it to. The benefit is that it can give a voice back to people who have lost the ability to speak. However, every tool made for the good of mankind can also be used to its detriment.

    Scammers use cloned voices in what are known as emergency scams, which for the most part break down into two categories: the grandparent scam and the virtual kidnapping scam. In both, the scammers need to convince the victim that one of their loved ones is in some sort of peril. In the grandparent scam, the scammer tries to convince the victim their loved one is in jail and needs bail money, while in the virtual kidnapping scam, the scammer tries to convince the victim their loved one has been kidnapped for ransom.

    Scammers will take a sample of someone’s voice, typically from a video that’s been posted to social media. Then, they’ll use the voice cloning technology to make it sound like that person is in a situation that requires the victim to send money.

    Voice cloning has become such a problem that the Federal Trade Commission has issued a challenge to anyone who thinks they can develop some kind of voice cloning detector. The top prize winner can receive $25,000, the runner-up $4,000, and three honorable mentions $2,000 each.

    In their own words, the FTC has issued this challenge to help push forward ideas to mitigate risks upstream—shielding consumers, creative professionals, and small businesses against the harms of voice cloning before the harm reaches a consumer.

    The online submission portal can be found at this link, and submissions will be accepted from January 2 to 12, 2024.

    Hopefully, someone will come up with the right idea to help keep consumers from losing their money to these scammers.

     
  • Geebo 8:00 am on September 26, 2023
    Tags: voice spoofing

    New version of grandparent scam changes the target

    By Greg Collier

    If you haven’t heard of the grandparent scam, it’s called that because it mostly targets the elderly. Scammers call their elderly target and pose as one of the target’s grandchildren. The call usually starts with the scammer saying something like, “Grampa?” They’re hoping the target will volunteer a grandchild’s name by replying with something along the lines of, “Is this Brandon?” The scammer will say yes to whatever name they’re supplied with. Then the real grift begins.

    While posing as the grandchild, the scammer will tell the target they’ve gotten into legal trouble and need money to fix the situation. Typically, the phony grandchild claims to have been in a car accident that was their fault and to need money for bail or some other legal fee. Sometimes, the call is passed off to the scammer’s partner, who poses as a police officer, bail bondsman, or attorney to add a sense of urgency.

    Payment is usually requested through means that are hard to trace or recover, such as cryptocurrency, gift cards, or payment apps like Zelle, Cash App, and Venmo. The target is also instructed not to tell anyone else in the family, sometimes under the pretext of a gag order.

    That’s how the grandparent scam traditionally worked until the development of AI voice-spoofing technology. Now, the grandparent scam has become more focused, with scammers targeting specific victims instead of random elderly people.

    With that development, the Better Business Bureau has issued a warning that scammers have also flipped the script on the grandparent scam. According to the BBB, scammers are now posing as grandparents in distress on these scam phone calls. Thanks to AI voice-spoofing, scammers are now targeting children and grandchildren instead of just the elderly with this scam. You can imagine how panicked this would make the victim of this new version of the scam.

    However, the ways to protect yourself remain the same. Educating your family about the scam is the best defense. Your family should also set up a code word you can use to verify the identity of the person calling, or you could ask the caller a question only they would know the answer to. Lastly, don’t believe your ears when you get a call like this; it may sound like your loved one, but scammers can now mimic any voice down to a T.

     
  • Geebo 8:00 am on September 19, 2023
    Tags: voice spoofing

    The sheer terror of the kidnapping scam

    By Greg Collier

    Even if someone has complete knowledge of how a certain scam works, that doesn’t necessarily mean they won’t fall victim to it, because some scams are simply that menacing. Take, for example, the virtual kidnapping scam, in which a scammer calls someone and claims to have kidnapped their loved one before making a ransom demand. Meanwhile, the supposed kidnap victim is unharmed and has no idea they’re being used in a scam. With the advancement of AI voice-spoofing technology, scammers can easily mimic the voice of the victim’s loved one to make the scam seem even more threatening.

    With that knowledge in mind, we may think we wouldn’t fall for such a scam as we sit at our keyboards and screens. But can you say that with 100% confidence? Before you answer, you should know the story of an Atlanta father who fell victim to the scam.

    He received a call from someone who claimed they kidnapped his adult daughter. At the time of the call, the man’s daughter was traveling. This could be why the man was targeted, as scammers often take information they find on social media and use it to their advantage. The caller claimed he got into a car accident with the man’s daughter and that they were carrying a substantial amount of cocaine at the time.

    The caller threatened the life of the man’s daughter, saying that they couldn’t have anyone recognize them. This was accompanied by screams and cries in the background that replicated his daughter’s voice. This was followed up with threats of torture and other bodily harm to the daughter if the man didn’t comply. For the sake of decorum, we won’t reprint specifically what the threats entailed, but imagine the worst thing that could happen to a loved one of your own, and then you have an idea of the terror that was unfolding.

    The father complied and sent $2,500 to the scammer’s bank account, probably through an app like Zelle.

    Even when armed with the knowledge of how the virtual kidnapping scam works, no one could be blamed for falling victim in the heat of the moment. However, there are still ways to protect yourself. The best is to set up a code word between you and your loved ones; that way, on a call like this, you can tell whether you’re actually talking to your loved one. You could also ask a question that only the supposed kidnap victim would know.

    While it’s easier said than done, try to remain calm in the situation, even while your ears may be deceiving you. Make attempts to contact your loved one through other means. If you can, attempt to have someone else reach them on a different phone.

    Please keep in mind, virtual kidnapping scams rely on manipulation and intimidation. By staying calm, and taking the necessary precautions, you can protect yourself and your loved ones from falling victim to these schemes.

     
  • Geebo 8:00 am on September 7, 2023
    Tags: voice spoofing

    Kidnapping scam brings terror to family

    By Greg Collier

    For the better part of this year, we’ve been warning our readers about scams that use AI-mimicked voices of their loved ones. Typically, these spoofed voices are used in the grandparent scam and the virtual kidnapping scam. In these scams, it’s crucial for the scammers to make victims believe that a member of their family is in immediate danger. To that end, scammers will steal a recording of someone’s voice, usually from social media.

    That voice sample is then run through an AI program that will allow the scammer to make the voice say anything they want it to, such as pleas for help. It’s gotten to the point where we believe the voice spoofing versions of these scams have become more common than their analog predecessors. For now, we think it’s pretty safe to assume if there’s a grandparent or virtual kidnapping scam, an AI voice clone is probably involved.

    For example, two parents in Ohio almost fell victim to the virtual kidnapping scam. They received a call that sounded like it was coming from their adult daughter. The parents said the caller sounded like their daughter in a panic. The voice said she was blindfolded and being held in a trunk. Then a male voice got on the call, claiming to be a kidnapper who would harm their daughter if they didn’t pay a ransom.

    To make matters worse, the supposed kidnapper knew the daughter’s name and the area where she worked. This made the claim of kidnapping seem more credible to the parents.

    At first, the parents did the right thing. They tried calling their daughter from another line but were unable to get a hold of her. They then called 911, still under the impression their daughter had actually been kidnapped.

    They went to get the ransom from their bank, but the branch had just closed. The caller then instructed them to go to a local Walmart, probably to send a money transfer to the scammers. Thankfully, the police caught up with the parents to let them know their daughter was unharmed and the call was a scam.

    Not everyone is up on the latest scams, so just imagine the sense of fear and terror they must have experienced. However, all it takes is a little bit of knowledge to protect yourself from this scam. As we often cite, kidnappings for ransom are actually quite rare in the U.S. If you have a loved one who is active on social media, scammers can use the information shared to make it seem like they’ve been plotting a kidnapping for a while. Again, this is done to make their con seem more authentic.

    In the unfortunate event you receive a call like this, do exactly what these parents did. Contact the loved one who has been supposedly kidnapped on another line. The odds are you’ll find them not only safe, but unaware they’re being used in a scam. Then call the police for their assistance. Lastly, even if it sounds like the exact voice of your loved one, be skeptical, as these days, voices can be easily duplicated.

     
  • Geebo 8:00 am on September 1, 2023
    Tags: voice spoofing

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice online and run it through an AI program that lets them make the voice say whatever they want. The scammers then use the cloned voice to convince that person’s loved ones to send the scammers money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first one is the virtual kidnapping scam, which is exactly what it sounds like. Scammers will use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, and the scammers will demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs, and that she needed bail money. A scammer then got on the call, pretended to be the granddaughter’s attorney, and instructed the woman on how to send payment. The woman was also told not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is that if you pay them once, chances are they’ll come back for more, which is what happened here. For weeks, the phony granddaughter kept calling back, needing more money each time for various legal proceedings. Keep in mind that in each conversation, the grandmother was not actually talking to anybody but a computer-generated voice that sounded exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     