Tagged: AI

  • Geebo 9:00 am on February 28, 2024 Permalink | Reply
    Tags: AI, voice c

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted how she received a call from someone who sounded like her daughter, claiming to have been in an accident and demanding a $2,000 wire transfer to prevent her kidnapping.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on January 12, 2024 Permalink | Reply
    Tags: AI, family emergency

    More police warn of AI voice scams 


    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories, the virtual kidnapping scam, and the grandparent scam. Today, we’re focused on the grandparent scam. It garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening a lot lately in the Memphis area, to the point where a Sheriff’s Office has issued a warning to local residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like their granddaughter and said she needed $500 in bail money after getting into a car accident. Smartly, the family kept asking the caller questions only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 9:00 am on November 28, 2023 Permalink | Reply
    Tags: AI

    AI finds its way into Medicare scams 


    By Greg Collier

    We are currently nearing the end of Medicare’s Open Enrollment period. This is the time of year when Medicare recipients can change their plan from the traditional Medicare coverage to a Medicare Advantage plan, or change back if they so desire. This is also the time of year when scammers specifically target Medicare eligible seniors with their scams.

    When it comes to scams, identity theft poses a significant risk to seniors, especially during Open Enrollment. Scammers often impersonate government officials or adopt titles like ‘health care benefits advocate’ to deceive victims. These fraudsters make enticing promises, assuring the victim of enrollment in equivalent or superior coverage at a reduced cost. To accomplish their scheme, the fraudulent agent requests the victim’s personal information, including their Medicare number.

    The stolen Medicare number becomes a tool for these scammers to commit Medicare fraud, involving unauthorized charges for procedures or items. This fraudulent activity has the potential to impact the victim’s benefits in the future. Additionally, scammers resort to high-pressure tactics, such as claiming that the victim’s benefits may expire if immediate information is not provided. In some cases, these deceptive calls may even display Medicare’s official phone number, adding an extra layer of trickery. It is crucial for seniors to be vigilant and cautious to protect themselves from falling victim to such identity theft scams during the Open Enrollment period.

    Though not strictly a scam, certain unscrupulous insurance brokers may exert undue pressure on seniors to switch to their company’s Medicare Advantage plan. While Medicare Advantage plans can offer advantages for some individuals, they may also have limitations that may not suit everyone’s needs. The decision to switch should be based on the individual’s personal healthcare requirements, yet some insurance agents may prioritize making a sale over the well-being of the patient.

    If contemplating a transition from Medicare to a Medicare Advantage Plan, it is essential to conduct thorough research on the potential benefits and drawbacks. Avoid succumbing to the tactics of salespersons, who may push for a decision that could lead to regret in the following year. Taking the time to make an informed decision ensures that the chosen healthcare plan aligns with individual needs and preferences.

    There is also another potential threat with this year’s Open Enrollment, and not surprisingly, it’s related to AI. Experts are warning that scammers could be using AI-generated voice programs to make scam phone calls sound more authentic. These calls could even be used to try to record a victim’s voice, which could then be used in other voice spoofing scams.

    It’s important to be cautious when receiving calls related to your Medicare plan. Legitimate Medicare plans typically contact their members if necessary, but if you ever feel uneasy during such calls, consider calling your insurance company’s official customer service number to verify the legitimacy of the communication.

    As a general rule, exercise caution about sharing your Medicare or Social Security number over the phone. Medicare and your insurance company already have your information on file and typically don’t need you to provide it again during unsolicited calls. This precaution helps protect you from potential scams or identity theft. Always prioritize your security and verify the authenticity of any calls before sharing sensitive information.

     
  • Geebo 8:00 am on October 10, 2023 Permalink | Reply
    Tags: AI, ChatGPT

    Scammers employ new weapon in romance scams 

    By Greg Collier

    Whenever someone develops a new and useful tool, it’s only a matter of time before someone uses it for criminal purposes. The large language model ChatGPT was released to the public last year. Essentially, you can give ChatGPT any kind of prompt, and it will write a response for you. Want a professional-sounding email to a prospective employer? It can do that for you. Want a script about Batman meeting Abraham Lincoln? It can do that too. Want ChatGPT to craft the best romantic responses to keep a lonely victim believing they’re in a committed online relationship? Unfortunately, it can do that as well.

    According to cybersecurity experts, scammers have developed their own chat AI that produces authentic-looking messages for romance scam victims. For the uninitiated, romance scammers typically prey on the single and widowed by pretending to be an online romantic interest. The scammers cultivate a phony online relationship using fake names and pictures, along with a story about why they can’t meet in person. They nurture these relationships for months before asking the victim for money. Victims have lost thousands, and in some cases millions, of dollars each to these scammers.

    Now, armed with an AI chatbot of their own, romance scammers almost have a ‘set it and forget it’ setting for running their scams.

    However, while this may make the romance scam appear more like a legitimate relationship, the steps you can take to protect yourself are still the same. Anytime a prospective partner sends you a picture of themselves, use Google’s reverse image search to make sure they didn’t steal it from someone else’s social media. If they claim to be working overseas or somewhere they can’t travel from freely, there’s a good chance they’re a scammer. Lastly, if they ask for money without ever having met you, it’s almost guaranteed they’re a scammer.

     
  • Geebo 8:00 am on September 1, 2023 Permalink | Reply
    Tags: AI

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice from online, and run it through an AI program, which allows the scammers to make the voice say whatever they want. The scammers then use the person’s voice to convince that person’s loved one to send the scammers money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first one is the virtual kidnapping scam, which is exactly what it sounds like. Scammers will use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, and the scammers will demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs, and that she needed bail money. A scammer then got on the call, pretended to be the granddaughter’s attorney, and instructed the woman on how to send payment. The woman was also told not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is if you pay them once, chances are they’ll come back for more money, which is what happened here. For weeks, the phony granddaughter kept calling back needing more money each time for various legal proceedings. Keep in mind that with each conversation, the grandmother is not actually talking to anybody but a computer-generated voice, which sounds exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     
  • Geebo 8:00 am on July 28, 2023 Permalink | Reply
    Tags: AI

    Scam Round Up: Weird AI scam and more 


    By Greg Collier

    Our first scam comes to us from Athens, Texas, where residents have been experiencing a twist on the arrest warrant scam, also known as a police impersonation scam. Typically, when scammers pose as police, they call their intended victims and tell them there is a warrant out for their arrest. The scammers usually claim this is for missed jury duty, but they can also cite a number of other infractions.

    For example, residents of Athens have complained the scammers are accusing their victims of using their phone to transmit a photo that traumatized a child. Essentially, the scammers accused their victims of sending explicit material to a child. The victim is then asked to pay several hundred dollars over the phone to resolve the complaint.

    That’s not how arrest warrants work. If there is a warrant for your arrest, especially one this serious, the police are not going to notify you over the phone. Also, no law enforcement agency will ask for money over the phone, let alone demand payment in unusual ways like gift cards or cryptocurrency.

    If you receive a call like this, hang up and call your local police at their non-emergency number. Not only can you verify there is no warrant for your arrest, you can also let the police know scammers are working in your area.

    ***

    Police in Connecticut are warning residents there has been an uptick in check washing. Check washing typically involves stealing checks that are in outgoing mail. Thieves often steal the mail from residential mailboxes, along with the outdoor drop-off boxes used by the US Postal Service. They then dip the written checks in a chemical solution that removes the ink from the check, so the thieves can write the checks to themselves.

    The police in Connecticut are also warning residents that thieves can steal checks out of your trash. If you use your bank’s mobile app to deposit checks and then discard the paper copies, make sure they’re properly shredded first, as check washing can still be performed on voided checks.

    If you have to mail a check, write it with a gel-based ink pen. The ink in gel pens is said to be more resistant to check washing. Also, don’t put the envelope holding the check in your mailbox and then raise the mailbox flag, as the raised flag signals to thieves that there may be a check inside.

    ***

    Lastly, we’ve read about another AI voice-spoofing scam. There has been a rash of these scams nationwide over the past year or so. In this scam, the victim gets a phone call where the voice sounds exactly like one of the victim’s loved ones. The scammers manipulate the loved one’s voice so that it sounds as if the loved one is in some kind of trouble and needs money to resolve the issue. Typically, the scammers ask for bail money, or in some cases a ransom. The loved one is usually unaware their voice is being used in a scam.

    However, a recent news article out of Alabama suggests scammers are also using the voice-spoofing technique for identity theft. An Alabama woman received a call she thought was from her brother, but it was actually from scammers. Instead of asking for money, they asked the woman for personal information. They then used this information to hijack her Facebook account and use it for additional scams. Police there said the scammers used videos the brother had posted on social media to mimic his voice with AI.

    We can’t say for sure, but it sounds like the scammers may have been asking for the answers to the woman’s security questions, the kind used to recover a lost Facebook password. Considering these questions are something like “What was your first pet’s name?” or “What city were you born in?”, they may seem innocuous coming from a close family member.

    In cases like this, it’s best to ask the family member calling a question only they would know to verify their identity.

     
  • Geebo 8:00 am on June 28, 2023 Permalink | Reply
    Tags: AI

    AI voice-spoofing scam started earlier than we thought 

    By Greg Collier

    One of the many problems with scams is, by the time the public hears about them, they’re already in full swing and have claimed numerous victims. For example, we’ve only been discussing the AI voice-spoofing scam for roughly two months. While we assumed the scam had been going on longer than that, we were unaware of just how far back it started. According to one recent report, at least one scam ring has been implementing the voice-spoofing scam since October of last year. The reason we know the scam is at least that old is because a suspect has been arrested for such a scam.

    In a voice-spoofing scam, scammers extract a sample of someone’s voice from online sources and manipulate it using AI technology to make it utter whatever phrases they want. This deceptive practice is commonly observed in phone scams, particularly those aimed at convincing victims they are communicating with a trusted family member or loved one. So far, voice-spoofing seems to be used only in grandparent scams and virtual kidnapping scams, but it’s only a matter of time before scammers come up with new ways of using it against victims.

    Also, when we discuss voice-spoofing scams here in 2023, we’re referring to the new wave of voice-spoofing scams. In previous years, there have been voice-spoofing scams, however, they were almost primitive compared to today’s technology. Those older scams also needed several minutes of someone’s recorded voice before they could make a viable speech model. Today, scammers only need a few seconds of speech.

    Getting back to the matter at hand, a New Jersey man was recently arrested for allegedly scamming a Houston, Texas, woman out of $40,000. She thought the voice on the phone was her son’s, claiming to have been arrested. The alleged scammer then got on the line posing as a public defender and asked the woman for bail money. The man was caught after investigators followed the money trail, as one of the payments was sent through a money transfer service. The victim in this case was scammed back in October 2022.

    Since scammers hardly ever work alone, more arrests may be following, and you can almost bet there are more victims out there.

    If you receive a distressing call from a supposed loved one requesting urgent financial assistance, it is crucial to verify their situation by promptly contacting them through alternative means. Do not entertain any assertions that prevent you from ending the call or consulting other family members. Stay vigilant and prioritize verifying the authenticity of such requests.

     
  • Geebo 8:00 am on June 7, 2023 Permalink | Reply
    Tags: AI

    Virtual kidnappings become more virtual 


    By Greg Collier

    The virtual kidnapping scam is called virtual because it’s not real. This is when scammers call a victim and pretend to have kidnapped one of the victim’s loved ones. The scammers then demand some kind of ransom payment that can typically be done online. The victim will be kept on the phone by the scammers to try and ensure the victim can’t contact the loved one who has supposedly been kidnapped. Since the scam appeals to the victim’s emotions, many people have fallen victim to this scam while their loved ones are unaware they’re being used in a scam.

    More recently, scammers have made the virtual kidnapping scam more believable through AI-generated voice-spoofing technology. As an aside, referring to programs like ChatGPT and DALL-E as AI is actually a misnomer; a better description is machine learning programs, but the popular nomenclature has stuck, so we refer to them as AI.

    Anyway, scammers are now taking voice samples from people online and using them in the virtual kidnapping scam. For example, a man from Arizona recently received a phone call in which scammers said they had kidnapped his daughter. The man then heard his daughter’s voice on the call saying, “Papa, help me!” Her voice wasn’t robotic-sounding, as some might expect. Voice spoofing has become so believable because it can mimic someone’s tone of voice as well. The scammers demanded $10,000 from the victim.

    Thankfully, the man’s daughter was unharmed. She was at school, unaware of what her father had been going through.

    Scammers get the voice samples used in the spoofing mainly from social media. It only takes a few seconds of audio to make a convincing copy of someone’s voice. So, for any post that includes your child’s voice, you may want to limit who can access it.

    If you receive one of these phone calls, it’s hard not to believe what you’re hearing. However, as we like to stress, kidnappings for ransom are actually rare in the U.S. With that knowledge in mind, try to contact the supposed kidnap victim on another phone or some other device. The chances are you’ll find they’re in no danger. In any event, you should contact local law enforcement and let them know what happened.

     
  • Geebo 8:00 am on May 22, 2023 Permalink | Reply
    Tags: AI

    AI scams aren’t limited to just voice 


    By Greg Collier

    AI voice-spoofing scams are on the rise and have really grabbed our attention recently. Again, this is when scammers take a sample of someone’s voice from online and run it through an AI program to make the voice say whatever they want. We see it mostly used in phone scams, where the scammers need the victim to believe they’re talking to a loved one. With the advent of AI-generated voices, scammers have gone back into their bag of tricks to make an older scam even more convincing: the deepfake video.

    A deepfake video is a video manipulated or synthesized using artificial intelligence techniques. The AI is used to manipulate or replace the appearance and actions of individuals in existing videos, making it appear as though someone said or did something they never actually said or did. Making the voice in a deepfake sound convincing used to require far more voice sampling than it does today; now, bad actors need only a few seconds of someone’s voice to produce a convincing clone.

    Recently, a man in Louisiana received a video that appeared to come from his brother-in-law. The video was received over Messenger, and the man’s brother-in-law said in the video that he needed $250 and couldn’t explain why, just that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

    Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

    If you receive a phone call or instant message from a loved one asking for money, always verify their story before sending any funds. Even if it appears that it’s your loved one contacting you, verify the story. With advances in technology, you can’t believe your eyes or ears in situations like these.

     
  • Geebo 8:00 am on May 3, 2023 Permalink | Reply
    Tags: AI

    Scam Round Up: AI voice scam finds another victim and more 

    By Greg Collier

    This week in the round-up, we’ll be discussing three scams we’ve discussed before, but have popped up again recently.

    Our first scam is the Medicare card scam. Medicare issued new cards back in 2018 that use an ID number rather than the recipient’s Social Security number. This was done to help prevent Medicare fraud and protect patient privacy. Ever since then, scammers have been trying to fool Medicare recipients into believing another new card is being issued. Scammers typically do this to steal their victims’ Medicare information.

    The West Virginia Attorney General’s Office has issued a warning which says scammers are calling residents posing as Medicare, the Social Security Administration, or the Department of Insurance. The scammers are telling residents they need to turn in their paper Medicare cards for new plastic ones. This is not true. If Medicare were to issue new cards, they would announce it through the mail and not by calling Medicare recipients.

    The next scam pertains to families who have a loved one who is currently incarcerated. The Georgia Parole Board has issued its own warning to the families of the incarcerated. It reports that scammers are calling families and asking for money for the release of their family member. The scammers claim the money is needed for an ankle monitor before the inmate can be released.

    According to the parole board, it will never call anyone’s family asking for money. Georgia residents are advised to check the parole board’s website to determine the current parole status of their family member.

    Our final scam is one that’s relatively new and has been in the news a lot lately: the voice-spoofing scam. Scammers take voice recordings from social media or spam phone calls and feed them into an AI program that can replicate that person’s voice. So far, it’s mostly been used in the grandparent scam and the virtual kidnapping scam.

    An elderly couple from Texas fell victim to the grandparent scam when they heard the voice of their grandson asking for help. The AI-generated voice said he had been in an accident in Mexico and needed $1,000. Believing he was talking to his actual grandson, the grandfather sent the money.

    If you receive a call like this, don’t believe your ears, as they can be deceived. Instead, try to contact the person who is supposedly in danger before sending any money.

     