Tagged: AI

  • Geebo 9:00 am on November 5, 2024
    Tags: AI

    A Mother’s Close Call with AI Voice Cloning

    By Greg Collier

    Imagine the terror of receiving a phone call with a familiar voice in distress, only to realize it was a cruel, high-tech scam. This harrowing experience recently befell a mother in Grand Rapids, Michigan, who nearly lost $50,000 over a weekend to a sophisticated AI-driven scam. The scam, known as ‘voice cloning’, mimicked the voice of her daughter so convincingly that it bypassed her natural skepticism and sent her scrambling to respond to what seemed like an emergency.

    It started with a phone call from an unknown number, coming from a town her daughter often frequented. With her daughter’s faint, panicked voice on the other end, she felt an instant urgency and fear that something was gravely wrong. Then, as she listened, the tone shifted; a stranger seized control of the call, asserting himself as a captor and demanding an immediate ransom. Her daughter’s supposed voice—distorted, mumbled, and terrified—amplified the mother’s fears. Desperation began to cloud her judgment as she debated how to produce such a vast sum on short notice.

    In her fear and confusion, she was prepared to do whatever it took to ensure her daughter’s safety. She was ready to withdraw cash, find neighbors who might accompany her, and meet the caller, who had directed her to a local hardware store for the exchange. Fortunately, her husband stepped in: while she negotiated, he placed a call to the local police department. They advised him to contact their daughter directly, which they did, only to find she was safe and sound, unaware of the horrifying call her mother had just endured.

    This unsettling experience highlights a chilling reality of today’s world: the power of artificial intelligence to manipulate emotions, creating distressing scenarios with fabricated voices. These AI scams work by exploiting easily accessible samples of people’s voices, often found in social media videos or recordings. Voice cloning technology, once a futuristic concept, is now accessible and advanced enough to replicate a person’s voice with unsettling accuracy from just a brief clip.

    The Better Business Bureau advises those targeted by similar scams to resist the urge to act immediately. The shock of hearing a loved one’s voice in peril can push us to respond without question, but taking a pause, verifying the caller’s claims, and contacting the loved one directly are critical steps to prevent falling victim.

    Protecting yourself from AI-driven voice cloning scams requires both awareness and a proactive approach. Start by being mindful of what you share online, especially voice recordings, as even brief audio clips on social media can provide the material needed for cloning. Reducing the number of public posts containing your voice limits potential exposure, making it harder for scammers to replicate.

    Establishing a safe word with family members is also an effective precaution. A unique, shared phrase can act as a verification tool in emergency calls. If you ever receive a call claiming a loved one is in distress, use this word to confirm their identity. By doing so, you create a reliable check against scams, especially when emotions run high.

    It’s essential to take a moment to verify information before reacting. Scammers count on people’s tendency to act on instinct, especially when fear and urgency are involved. If you receive an alarming call, try to reach the person directly using a familiar number. Verifying information before sending money or following instructions can prevent falling victim to such fraud.

    In the end, a calm, measured approach, grounded in verification and pre-established safety measures, can make all the difference in staying protected against AI-driven threats.

     
  • Geebo 8:00 am on October 16, 2024
    Tags: AI, artificial intelligence

    How AI is Fueling a New Wave of Online Scams

    By Greg Collier

    With the rise of artificial intelligence (AI), the internet has become a more treacherous landscape for unsuspecting users. Once, the adage “seeing is believing” held weight. Today, however, scammers can create highly realistic images and videos that deceive even the most cautious among us. The rapid development of AI has made it easier for fraudsters to craft convincing scenarios that prey on emotions, tricking people into parting with their money or personal information.

    One common tactic involves generating images of distressed animals or children. These fabricated images often accompany stories of emergencies or tragedies, urging people to click links to donate or provide personal details. The emotional weight of these images makes them highly effective, triggering a quick, compassionate response. Unfortunately, the results are predictable: stolen personal information or exposure to harmful malware. Social media users must be on high alert, as the Better Business Bureau warns against clicking unfamiliar links, especially when encountering images meant to elicit an emotional reaction.

    Identifying AI-generated content has become a key skill in avoiding these scams. When encountering images, it’s essential to look for subtle signs that something isn’t right. AI-generated images often exhibit flaws that betray their synthetic nature. Zooming in on these images can reveal strange details such as blurring around certain elements, disproportionate body parts, or even extra fingers on hands. Other giveaways include glossy, airbrushed textures and unnatural lighting. These telltale signs, though subtle, can help distinguish AI-generated images from genuine ones.

    The same principles apply to videos. Deepfake technology allows scammers to create videos that feature manipulated versions of public figures or loved ones in fabricated scenarios. Unnatural body language, strange shadows, and choppy audio can all indicate that the video isn’t real.

    One particularly concerning trend involves scammers using AI to create fake emergency scenarios. A family member might receive a video call or a voice message that appears to be from a loved one in distress, asking for money or help. But even though the voice and face may seem familiar, the message is an illusion, generated by AI to exploit trust and fear. The sophistication of this technology makes these scams harder to detect, but the key is context. Urgency, emotional manipulation, and unexpected requests for money are red flags. It’s always important to verify the authenticity of the situation by contacting the person directly through trusted methods.

    Reverse image searches can be useful for confirming whether a photo has been used elsewhere on the web. By doing this, users can trace images back to their original sources and determine whether they’ve been manipulated. Similarly, checking whether a story has been reported by credible news outlets can help discern the truth. If an image or video seems too shocking or unbelievable and hasn’t been covered by mainstream media, it’s likely fake.

    As AI technology continues to evolve, scammers will only refine their methods. The challenge of spotting fakes will become more difficult, and even sophisticated consumers may find themselves second-guessing what they see. Being suspicious and fact-checking are more important than ever. By recognizing the tactics scammers use and understanding how to spot AI-generated content, internet users can better protect themselves in this new digital landscape.

     
  • Geebo 9:00 am on February 28, 2024
    Tags: AI, voice cloning

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted how she received a call from someone who sounded like her daughter, claiming to have been in an accident and demanding a $2,000 wire transfer to prevent her kidnapping.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on January 12, 2024
    Tags: AI

    More police warn of AI voice scams

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories: the virtual kidnapping scam and the grandparent scam. Today, we’re focused on the grandparent scam, which garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening so often lately in the Memphis area that a Sheriff’s Office has issued a warning to local residents.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like her and said she needed $500 for bail money after getting into a car accident. Wisely, the family kept asking the caller questions that only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 9:00 am on November 28, 2023
    Tags: AI

    AI finds its way into Medicare scams

    By Greg Collier

    We are currently nearing the end of Medicare’s Open Enrollment period. This is the time of year when Medicare recipients can change their plan from traditional Medicare coverage to a Medicare Advantage plan, or change back if they so desire. It is also the time of year when scammers specifically target Medicare-eligible seniors.

    When it comes to scams, identity theft poses a significant risk to seniors, especially during Open Enrollment. Scammers often employ tactics such as impersonating government officials, adopting titles like ‘health care benefits advocate,’ to deceive victims. These fraudsters make enticing promises, assuring the victim of enrollment in equivalent or superior coverage at a reduced cost. To accomplish their scheme, the fraudulent agent requests the victim’s personal information, including their Medicare number.

    The stolen Medicare number becomes a tool for these scammers to commit Medicare fraud, involving unauthorized charges for procedures or items. This fraudulent activity has the potential to impact the victim’s benefits in the future. Additionally, scammers resort to high-pressure tactics, such as claiming that the victim’s benefits may expire if immediate information is not provided. In some cases, these deceptive calls may even display Medicare’s official phone number, adding an extra layer of trickery. It is crucial for seniors to be vigilant and cautious to protect themselves from falling victim to such identity theft scams during the Open Enrollment period.

    Though not strictly a scam, certain unscrupulous insurance brokers may exert undue pressure on seniors to switch to their company’s Medicare Advantage plan. While Medicare Advantage plans can benefit some individuals, they also have limitations that won’t suit everyone’s needs. The decision to switch should be based on the individual’s personal healthcare requirements, yet some insurance agents may prioritize making a sale over the well-being of the patient.

    If contemplating a transition from Medicare to a Medicare Advantage Plan, it is essential to conduct thorough research on the potential benefits and drawbacks. Avoid succumbing to the tactics of salespersons, who may push for a decision that could lead to regret in the following year. Taking the time to make an informed decision ensures that the chosen healthcare plan aligns with individual needs and preferences.

    There is also another potential threat with this year’s Open Enrollment, and not surprisingly, it’s related to AI. Experts are warning that scammers could be using AI-generated voice programs to make scam phone calls sound more authentic. These calls could even be used to try to record a victim’s voice, which could then be used in other voice spoofing scams.

    It’s important to be cautious when receiving calls related to your Medicare plan. Legitimate Medicare plans typically contact their members if necessary, but if you ever feel uneasy during such calls, consider calling your insurance company’s official customer service number to verify the legitimacy of the communication.

    As a general rule, exercise caution about sharing your Medicare or Social Security number over the phone. Medicare and your insurance company already have your information on file and typically don’t need you to provide it again during unsolicited calls. This precaution helps protect you from potential scams or identity theft. Always prioritize your security and verify the authenticity of any calls before sharing sensitive information.

     
  • Geebo 8:00 am on October 10, 2023
    Tags: AI, ChatGPT

    Scammers employ new weapon in romance scams 

    By Greg Collier

    Whenever someone develops a new and useful tool, it’s only a matter of time before someone uses it for criminal purposes. The large language model ChatGPT was released to the public last year. Essentially, you can give ChatGPT any kind of prompt, and it will write it out for you. Want to write a professional-sounding email to a prospective employer? It can do that for you. Want it to write a script about Batman meeting Abraham Lincoln? It can do that too. Want ChatGPT to craft the best romantic responses to keep a lonely victim believing they’re in a committed online relationship? Unfortunately, it can do that too.

    According to cybersecurity experts, scammers have developed their own chat AI that produces authentic-looking messages for romance scam victims. For the uninitiated, romance scammers typically prey on the single and widowed by pretending to be an online romantic interest. These scammers will cultivate a phony online relationship using fake names and pictures, along with a story about why they can’t meet in person. The scammers will nurture these relationships for months before asking the victim for money. Victims have lost anywhere from thousands to millions of dollars each to these scammers.

    Now, armed with an AI chatbot of their own, romance scammers have an almost ‘set it and forget it’ way of running their scams.

    However, while this may make the romance scam appear more like a legitimate relationship, the steps someone can take to protect themselves are still the same. Anytime a prospective partner sends you a picture of themselves, use Google’s reverse image search to make sure they didn’t steal it from someone else’s social media. If they claim to be working overseas or somewhere they can’t travel from freely, there’s a good chance they’re a scammer. Lastly, if they ask for money without meeting first, it’s almost guaranteed that they’re a scammer.

     
  • Geebo 8:00 am on September 1, 2023
    Tags: AI

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice from online, and run it through an AI program, which allows the scammers to make the voice say whatever they want. The scammers then use the person’s voice to convince that person’s loved one to send the scammers money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first one is the virtual kidnapping scam, which is exactly what it sounds like. Scammers will use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, and the scammers will demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her on the phone using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs and needed bail money. A scammer then got on the call and pretended to be the granddaughter’s attorney and instructed the woman on how she could send payment. The woman was also instructed not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is if you pay them once, chances are they’ll come back for more money, which is what happened here. For weeks, the phony granddaughter kept calling back needing more money each time for various legal proceedings. Keep in mind that with each conversation, the grandmother is not actually talking to anybody but a computer-generated voice, which sounds exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     
  • Geebo 8:00 am on July 28, 2023
    Tags: AI

    Scam Round Up: Weird AI scam and more

    By Greg Collier

    Our first scam comes to us from Athens, Texas, where residents have been experiencing a twist on the arrest warrant scam, also known as a police impersonation scam. Typically, when scammers pose as police, they’ll call their intended victims and tell them there is a warrant out for their arrest. The scammers usually claim this is for missed jury duty, but they can also cite a number of other infractions.

    For example, residents of Athens have complained the scammers are accusing their victims of using their phone to transmit a photo that traumatized a child. Essentially, the scammers accused their victims of sending explicit material to a child. The victim is then asked to pay several hundred dollars over the phone to resolve the complaint.

    That’s not how arrest warrants work. If there is a warrant for your arrest, especially one as serious as this, the police are not going to notify you over the phone. Also, no law enforcement agency will ask for money over the phone, let alone ask for it in unusual ways like gift cards or cryptocurrency.

    If you receive a call like this, hang up and call your local police department at its non-emergency number. Not only can you verify there is no warrant for your arrest, but you can also let the police know scammers are working in your area.

    ***

    Police in Connecticut are warning residents there has been an uptick in check washing. Check washing typically involves stealing checks that are in outgoing mail. Thieves often steal the mail from residential mailboxes, along with the outdoor drop-off boxes used by the US Postal Service. They then dip the written checks in a chemical solution that removes the ink from the check, so the thieves can write the checks to themselves.

    The police in Connecticut are also warning residents that thieves can steal checks out of your trash. If you use your bank’s mobile app to deposit checks, make sure the paper checks are properly shredded before you throw them out, as check washing can still be performed on voided checks.

    If you have to write a check that’s going in the mail, use a gel-ink pen. The ink in gel pens is said to be more resistant to check washing. Also, don’t put the envelope holding the check in your mailbox and then put the mailbox flag up. The raised flag is a signal to thieves that there may be a check inside.

    ***

    Lastly, we’ve read about another AI voice-spoofing scam. There has been a rash of these scams nationwide over the past year or so. In this scam, the victim gets a phone call where the voice sounds exactly like one of the victim’s loved ones. The scammers manipulate the loved one’s voice so that it sounds like the actual loved one is in some kind of trouble and needs money to resolve the issue. Typically, the scammers ask for bail money, or in some cases a ransom. The loved one is usually unaware their voice is being used in a scam.

    However, a recent news article out of Alabama suggests scammers are now using the voice-spoofing technique for identity theft. An Alabama woman received a call she thought was from her brother, but it was actually from scammers. Instead of asking for money, they asked the woman for personal information, then used that information to hijack her Facebook account for additional scams. Police there have said the scammers used videos the brother posted on social media to mimic his voice with AI.

    We can’t say for sure, but it sounds like the scammers may have been asking for the answers to the woman’s security questions, the kind used to recover a lost Facebook password. Considering the answers to these questions are things like “What was your first pet’s name?” or “What city were you born in?”, they may seem like innocuous questions coming from a close family member.

    In cases like this, it’s best to ask the family member calling a question only they would know to verify their identity.

     
  • Geebo 8:00 am on June 28, 2023
    Tags: AI

    AI voice-spoofing scam started earlier than we thought 

    By Greg Collier

    One of the many problems with scams is, by the time the public hears about them, they’re already in full swing and have claimed numerous victims. For example, we’ve only been discussing the AI voice-spoofing scam for roughly two months. While we assumed the scam had been going on longer than that, we were unaware of just how far back it started. According to one recent report, at least one scam ring has been running the voice-spoofing scam since October of last year. The reason we know the scam is at least that old is that a suspect has been arrested for one.

    In a voice-spoofing scam, scammers extract someone’s voice sample from online sources and manipulate it using AI technology to make it utter desired phrases. This deceptive practice is commonly observed in phone scams, particularly those aimed at convincing victims that they are communicating with a trusted family member or loved one. So far, voice-spoofing seems to be used only in grandparent scams and virtual kidnapping scams. It’s only a matter of time before scammers come up with new ways of using voice-spoofing to scam victims.

    Also, when we discuss voice-spoofing scams here in 2023, we’re referring to the new wave of voice-spoofing scams. In previous years, there have been voice-spoofing scams, however, they were almost primitive compared to today’s technology. Those older scams also needed several minutes of someone’s recorded voice before they could make a viable speech model. Today, scammers only need a few seconds of speech.

    Getting back to the matter at hand, a New Jersey man was recently arrested for allegedly scamming a Houston, Texas, woman out of $40,000. She thought the voice she was talking to was her son’s, claiming to have been arrested. The alleged scammer then got on the phone posing as a public defender and asked the woman for bail money. The man was caught after investigators followed the money trail, since one of the payments was sent through a money transfer service. Notably, the victim in this case was scammed back in October 2022.

    Since scammers hardly ever work alone, more arrests may be following, and you can almost bet there are more victims out there.

    If you receive a distressing call from a supposed loved one requesting urgent financial assistance, it is crucial to verify their situation by promptly contacting them through alternative means. Do not entertain any assertions that prevent you from ending the call or consulting other family members. Stay vigilant and prioritize verifying the authenticity of such requests.

     
  • Geebo 8:00 am on June 7, 2023
    Tags: AI

    Virtual kidnappings become more virtual

    By Greg Collier

    The virtual kidnapping scam is called virtual because it’s not real. This is when scammers call a victim and pretend to have kidnapped one of the victim’s loved ones. The scammers then demand some kind of ransom payment that can typically be done online. The victim will be kept on the phone by the scammers to try and ensure the victim can’t contact the loved one who has supposedly been kidnapped. Since the scam appeals to the victim’s emotions, many people have fallen victim to this scam while their loved ones are unaware they’re being used in a scam.

    More recently, scammers have made the virtual kidnapping scam more believable through AI-generated voice-spoofing technology. As an aside, referring to programs like ChatGPT and DALL-E as AI is actually a misnomer; they are better described as machine learning programs, but the popular nomenclature has stuck, so we refer to them as AI.

    Anyway, scammers are now taking voice samples from people online and using them in the virtual kidnapping scam. For example, a man from Arizona recently received a phone call in which scammers said they had kidnapped his daughter. The man then heard his daughter’s voice on the call saying, “Papa, help me!” Her voice wasn’t robotic-sounding, as some might expect; voice spoofing has become so believable because it can mimic someone’s tone of voice as well. The scammers demanded $10,000 from the victim.

    Thankfully, the man’s daughter was unharmed. She was at school, unaware of what her father had been going through.

    Scammers get the voice samples used in the spoofing mainly from social media. It only takes a few seconds of audio to make a convincing copy of someone’s voice. So, you may want to limit access to any post that includes your child’s voice.

    If you receive one of these phone calls, it’s hard not to believe what you’re hearing. However, as we like to stress, kidnappings for ransom are actually rare in the U.S. With that knowledge in mind, try to contact the supposed kidnap victim on another phone or some other device. The chances are you’ll find they’re in no danger. In any event, you should contact local law enforcement and let them know what happened.

     