Tagged: AI voice

  • Geebo 9:00 am on November 5, 2024
    Tags: AI voice

    A Mother’s Close Call with AI Voice Cloning

    By Greg Collier

    Imagine the terror of receiving a phone call with a familiar voice in distress, only to realize it was a cruel, high-tech scam. This harrowing experience recently befell a mother in Grand Rapids, Michigan, who nearly lost $50,000 over a weekend to a sophisticated AI-driven scam. The scam, known as ‘voice cloning,’ mimicked the voice of her daughter so convincingly that it bypassed her natural skepticism and sent her scrambling to respond to what seemed like an emergency.

    It started with a phone call from an unknown number, coming from a town her daughter often frequented. With her daughter’s faint, panicked voice on the other end, she felt an instant urgency and fear that something was gravely wrong. Then, as she listened, the tone shifted; a stranger seized control of the call, asserting himself as a captor and demanding an immediate ransom. Her daughter’s supposed voice—distorted, mumbled, and terrified—amplified the mother’s fears. Desperation began to cloud her judgment as she debated how to produce such a vast sum on short notice.

    In her fear and confusion, she was prepared to do whatever it took to ensure her daughter’s safety. She was ready to withdraw cash, find neighbors who might accompany her, and meet the caller, who had directed her to a local hardware store for the exchange. Her husband, however, kept a cooler head: while she negotiated, he placed a call to the local police department. They advised him to contact their daughter directly, which the couple did, only to find she was safe and sound, unaware of the horrifying call her mother had just endured.

    This unsettling experience highlights a chilling reality of today’s world: the power of artificial intelligence to manipulate emotions, creating distressing scenarios with fabricated voices. These AI scams work by exploiting easily accessible samples of people’s voices, often found in social media videos or recordings. Voice cloning technology, once a futuristic concept, is now accessible and advanced enough to replicate a person’s voice with unsettling accuracy from just a brief clip.

    The Better Business Bureau advises those targeted by similar scams to resist the urge to act immediately. The shock of hearing a loved one’s voice in peril can push us to respond without question, but taking a pause, verifying the caller’s claims, and contacting the loved one directly are critical steps to prevent falling victim.

    Protecting yourself from AI-driven voice cloning scams requires both awareness and a proactive approach. Start by being mindful of what you share online, especially voice recordings, as even brief audio clips on social media can provide the material needed for cloning. Reducing the number of public posts containing your voice limits potential exposure, making it harder for scammers to replicate.

    Establishing a safe word with family members is also an effective precaution. A unique, shared phrase can act as a verification tool in emergency calls. If you ever receive a call claiming a loved one is in distress, use this word to confirm their identity. By doing so, you create a reliable check against scams, especially when emotions run high.

    It’s essential to take a moment to verify information before reacting. Scammers count on people’s tendency to act on instinct, especially when fear and urgency are involved. If you receive an alarming call, try to reach the person directly using a familiar number. Verifying information before sending money or following instructions can prevent falling victim to such fraud.

    In the end, a calm, measured approach, grounded in verification and pre-established safety measures, can make all the difference in staying protected against AI-driven threats.

     
  • Geebo 9:00 am on February 28, 2024
    Tags: AI voice

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted receiving a call from someone who sounded like her daughter, claiming to have been in an accident, followed by a demand for a $2,000 wire transfer to prevent her daughter’s kidnapping.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on January 12, 2024
    Tags: AI voice

    More police warn of AI voice scams

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories: the virtual kidnapping scam and the grandparent scam. Today, we’re focused on the grandparent scam, which garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening so often lately in the Memphis area that a local Sheriff’s Office has issued a warning to residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like her and said she needed $500 for bail money after getting into a car accident. Smartly, the family kept asking the caller questions only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 8:00 am on September 19, 2023
    Tags: AI voice

    The sheer terror of the kidnapping scam

    By Greg Collier

    Even if someone has complete knowledge of how a certain scam works, that doesn’t necessarily mean they won’t fall victim to it, because some scams are so thoroughly menacing. Take, for example, the virtual kidnapping scam. This is when a scammer calls someone and claims to have kidnapped their loved one before making a ransom demand. Meanwhile, the supposed kidnap victim is unharmed and has no idea they’re being used in a scam. With the advancement of AI voice-spoofing technology, scammers can easily mimic the voice of the victim’s loved one to make the scam seem even more threatening.

    With that knowledge in mind, we may think we wouldn’t fall for such a scam as we sit at our keyboards and screens. But can you say that with 100% confidence? Before you answer, you should know the story of an Atlanta father who fell victim to the scam.

    He received a call from someone who claimed they kidnapped his adult daughter. At the time of the call, the man’s daughter was traveling. This could be why the man was targeted, as scammers often take information they find on social media and use it to their advantage. The caller claimed he got into a car accident with the man’s daughter and that they were carrying a substantial amount of cocaine at the time.

    The caller threatened the life of the man’s daughter, saying that they couldn’t have anyone recognize them. This was accompanied by screams and cries in the background that replicated his daughter’s voice. This was followed up with threats of torture and other bodily harm to the daughter if the man didn’t comply. For the sake of decorum, we won’t reprint specifically what the threats entailed, but imagine the worst thing that could happen to a loved one of your own, and then you have an idea of the terror that was unfolding.

    The father complied with the scammer’s demands and sent $2,500 to the scammer’s bank account, likely through a payment app such as Zelle.

    Even armed with the knowledge of how the virtual kidnapping scam works, in the heat of the moment, no one could be blamed for falling victim to it. However, there are still ways to protect yourself. The best is to set up a code word between you and your loved ones. That way, on a call like this, you can tell whether you’re actually talking to your loved one. Alternatively, you could ask a question that only the supposed kidnap victim would know the answer to.

    While it’s easier said than done, try to remain calm in the situation, even while your ears may be deceiving you. Make attempts to contact your loved one through other means. If you can, attempt to have someone else reach them on a different phone.

    Please keep in mind that virtual kidnapping scams rely on manipulation and intimidation. By staying calm and taking the necessary precautions, you can protect yourself and your loved ones from falling victim to these schemes.

     
  • Geebo 8:00 am on September 1, 2023
    Tags: AI voice

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice from online sources and run it through an AI program, which allows them to make the voice say whatever they want. The scammers then use the person’s voice to convince that person’s loved one to send them money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first is the virtual kidnapping scam, which is exactly what it sounds like: scammers use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, then demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her on the phone using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs and needed bail money. A scammer then got on the call and pretended to be the granddaughter’s attorney and instructed the woman on how she could send payment. The woman was also instructed not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is that if you pay them once, chances are they’ll come back for more money, which is what happened here. For weeks, the phony granddaughter kept calling back, needing more money each time for various legal proceedings. Keep in mind that in each conversation, the grandmother wasn’t actually talking to anybody, only a computer-generated voice that sounded exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     
  • Geebo 8:00 am on June 28, 2023
    Tags: AI voice

    AI voice-spoofing scam started earlier than we thought 

    By Greg Collier

    One of the many problems with scams is that, by the time the public hears about them, they’re already in full swing and have claimed numerous victims. For example, we’ve only been discussing the AI voice-spoofing scam for roughly two months. While we assumed the scam had been going on longer than that, we were unaware of just how far back it started. According to one recent report, at least one scam ring has been running the voice-spoofing scam since October of last year. The reason we know the scam is at least that old is that a suspect has been arrested for one such scam.

    In a voice-spoofing scam, scammers extract a sample of someone’s voice from online sources and manipulate it using AI technology to make it utter whatever phrases they want. This deceptive practice is commonly observed in phone scams, particularly those aimed at convincing victims that they are communicating with a trusted family member or loved one. So far, voice-spoofing seems to be used only in grandparent scams and virtual kidnapping scams, but it’s only a matter of time before scammers come up with new ways of using it to defraud victims.

    Also, when we discuss voice-spoofing scams here in 2023, we’re referring to the new wave of voice-spoofing scams. Voice-spoofing scams existed in previous years, but they were almost primitive compared to today’s technology. Those older scams also needed several minutes of someone’s recorded voice to build a viable speech model. Today, scammers only need a few seconds of speech.

    Getting back to the matter at hand, a New Jersey man was recently arrested for allegedly scamming a Houston, Texas, woman out of $40,000. She thought the voice on the phone was her son’s, and the voice claimed he had been arrested. The alleged scammer then got on the line posing as a public defender and asked the woman for bail money. The man was caught after investigators followed the money trail, since one of the payments was sent through a money transfer service. Notably, the victim in this case was scammed back in October 2022.

    Since scammers hardly ever work alone, more arrests may follow, and you can almost bet there are more victims out there.

    If you receive a distressing call from a supposed loved one requesting urgent financial assistance, it is crucial to verify their situation by promptly contacting them through alternative means. Do not entertain any assertions that prevent you from ending the call or consulting other family members. Stay vigilant and prioritize verifying the authenticity of such requests.

     
  • Geebo 8:00 am on June 20, 2023
    Tags: AI voice

    Mother convinced daughter arrested in AI scam

    By Greg Collier

    If anyone could recognize their daughter’s voice with just a few short words, it would be their mother. At least, that’s what scammers are hoping as AI-generated voice spoofing scams continue to plague families.

    Within the past few months, we have seen an uptick in phone scams that use AI-generated voices. As we’ve previously discussed, there are two scams in which an AI-generated voice of the victim’s loved one makes the deception seem more believable.

    One of those scams is the virtual kidnapping scam. That’s when scammers call their victim claiming to have kidnapped one of the victim’s loved ones and demand a ransom. In actuality, the supposed kidnap victim is unaware they’re being used in a scam.

    The other scam is the grandparent scam, so called because scammers mostly target elderly victims while claiming to be one of their grandchildren. The name can be a misnomer, though, as scammers will also target parents and spouses.

    One mother from Upstate New York was shopping for her daughter’s wedding when she received a call from scammers. She immediately heard what sounded like her daughter’s voice saying she had gotten into a car accident. But it wasn’t her daughter’s voice; scammers had spoofed it using AI.

    Scammers need only a few seconds of someone’s voice to build an authentic-sounding AI model that even captures the speaker’s cadence. They get their voice samples either from someone’s social media or by making phone calls to their target. Since the daughter was preparing for her wedding, there may have been a wide variety of voice samples to choose from.

    But getting back to the scam, after the mother heard her daughter’s voice, a scammer got on the line posing as local police. They said the daughter caused a wreck while texting and driving, and needed $15,000 for bail.

    Thankfully, even though the woman was convinced that was her daughter’s voice, she did not fall victim to the scam. Instead, she called her daughter, who was in no danger at all.

    If you receive a phone call like this, try to contact the person who was supposedly arrested, even if you held a conversation on that call and the person sounded exactly like your loved one. Scammers will try to keep you on the phone, but no one ever had their bail raised while someone verified their story.

     
  • Geebo 8:00 am on May 3, 2023
    Tags: AI voice

    Scam Round Up: AI voice scam finds another victim and more 

    By Greg Collier

    This week in the round-up, we’ll be discussing three scams we’ve covered before that have popped up again recently.

    Our first scam is the Medicare card scam. Medicare issued new cards back in 2018 that use an ID number rather than the recipient’s Social Security number. This was done to help prevent Medicare fraud and ensure patient privacy. Ever since then, scammers have been trying to fool Medicare recipients into believing another new card is being issued, typically in an attempt to steal the victim’s Medicare information.

    The West Virginia Attorney General’s Office has issued a warning that scammers are calling residents posing as Medicare, the Social Security Administration, or the Department of Insurance, telling them they need to turn in their paper Medicare cards for new plastic ones. This is not true. If Medicare were to issue new cards, it would announce the change through the mail, not by calling recipients.

    The next scam pertains to families who have a loved one who is currently incarcerated. The Georgia Parole Board has issued its own warning to the families of the incarcerated. The board reports that scammers are calling families and asking for money for the release of their family member, claiming the money is needed for an ankle monitor before the inmate can be released.

    According to the parole board, it will never call anyone’s family asking for money. Georgia residents are advised to check the parole board’s website to determine the current parole status of their family member before responding to any such call.

    Our final scam is a relatively new one that has been in the news a lot lately: the voice-spoofing scam. Scammers take voice recordings from social media or spam phone calls and feed them to an AI program that can replicate that person’s voice. So far, it’s mostly been used in the grandparent scam and the virtual kidnapping scam.

    An elderly couple from Texas fell victim to the grandparent scam when they heard the voice of their grandson asking for help. The AI-generated voice said he had been in an accident in Mexico and needed $1,000. Believing he was talking to his actual grandson, the grandfather sent the money.

    If you receive a call like this, don’t believe your ears, as they can be deceived. Instead, try to contact the person who is supposedly in danger before sending any money.

     
  • Geebo 8:00 am on April 11, 2023
    Tags: AI voice

    AI voice cloning used again in alarming scam

    By Greg Collier

    Few things are more unnerving than the new tool scammers have added to their arsenal: AI-generated voice cloning. Potentially, scammers can make their voice sound like anyone’s, including your friends and family. Voice cloning can be very convincing when used in two scams in particular: the grandparent scam and the virtual kidnapping scam.

    In a virtual kidnapping scam, the scammers call their victims claiming to be holding one of the victim’s loved ones hostage for ransom. Typically, the supposed kidnap victim is safe and unaware they’re being used in a scam.

    Previously, the scammers would do almost all of the talking, but they would have someone else in the background crying and screaming, who they claimed was the kidnap victim. Now, with voice cloning technology, scammers can make it seem like the victim’s loved one is on the phone with them. To make the scam even more disturbing, the scammers need only three seconds of audio to clone someone’s voice, according to some reports.

    An Arizona woman found out all too well how the scam works when she received a call from someone claiming to have kidnapped her 15-year-old daughter. The call came from an unknown number, but when she picked up, she heard her daughter’s voice. The mother said her daughter sounded like she was crying as the voice said, “Mom, I messed up.”

    The next voice she heard was the supposed kidnapper’s. The caller threatened that if she called the police, or anyone else, he would pump her daughter full of drugs, physically assault her, and leave her in Mexico unless the woman paid a ransom. Then, in the background, the woman heard her daughter’s voice saying, “Help me, Mom. Please help me. Help me.” The scammer demanded $1 million in ransom before settling for $50,000.

    Thankfully, the woman was in a room with friends, who were able not only to call the police but also to get hold of the woman’s husband. The daughter in question was at home, totally unaware of what was going on.

    When it comes to the virtual kidnapping scam, we like to remind our readers that kidnapping for ransom is actually rare in the United States. However, child abductions are unfortunately a very real occurrence. This makes the scam even more terrifying for its victims.

    The girl’s mother should be commended, though, for doing the right thing even while her ears were being deceived. Even if it sounds like a loved one is in danger, always verify the scammer’s story.

    If you receive a call like this, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or have a family code word set up in advance that’s only to be used if the loved one is in danger.

    This may also be an opportunity for you to have a talk with your children about what they share on social media, since that’s where these scammers tend to find the voice samples they need.

     
  • Geebo 8:00 am on March 16, 2023
    Tags: AI voice

    AI voice used in kidnapping scam 

    By Greg Collier

    Just over a week ago, we posted about scammers using AI technology to clone a victim’s loved one’s voice for a grandparent scam. It seems this technique isn’t going away anytime soon. Just recently, AI voice cloning was used in a virtual kidnapping scam in Oklahoma, where the victim lost $3,000 to a scammer.

    Virtual kidnapping is a type of scam where a person receives a call or message claiming that their loved one has been kidnapped and demanding a ransom payment for their release. However, in most cases, the supposed victim is actually safe and not in any danger.

    Previously, in most virtual kidnapping scams, the scammers would do almost all of the talking, but they would have someone else in the background crying and screaming, who they claimed was the kidnap victim.

    In this most recent scam, the scam victim thought she was talking to her son and even said that the person on the phone sounded just like her son.

    It started like most virtual kidnapping scams do. The victim received a phone call from an unknown caller who told the woman they had kidnapped her adult son. The caller insinuated that the woman’s son interrupted a drug deal that cost the caller a lot of money. So, if the woman didn’t pay the money that was supposedly lost, they were going to harm her son. Typically, when the victim asks to speak to their loved one, the scammers will make excuses. However, this time, the victim spoke with someone who sounded just like her son.

    Panicked, the woman went to Walmart to wire $3,000 to someone in Mexico. The scammer kept her on the phone the entire time. After she made the payment, the impostor got back on the phone to say the kidnappers were letting him go. The scammers told her they would drop her son off at that Walmart, but he never appeared. Finally, she was able to get hold of her son on the phone; he had been at work the entire time.

    The virtual kidnapping scam has been using fear to get victims to pay a phony ransom for years. But now, with the voice cloning technology, the scammers have stepped up the fear to another level. The scammers only need about a minute of your loved one’s voice to be able to clone it. They usually take the voice from recordings that can be found on social media.

    But even if it sounds like a loved one on the phone, the same old precautions should be used. If you receive a call like this, try to have someone contact the person who’s supposedly been kidnapped. When they put your loved one on the phone, ask them a question that only they would know the answer to. Or have a family code word set up in advance that’s only to be used if the loved one is in danger.

     