Tagged: AI voice

  • Geebo 8:00 am on October 22, 2025
    Tags: AI voice, political donations

    Deepfake Donors: When Political Voices Are Fake

    By Greg Collier

    You get a text from your “preferred political candidate.” It asks for a small donation of ten dollars “to fight misinformation” or “protect election integrity.” The link looks official. The voice message attached even sounds authentically passionate, familiar, and persuasive.

    But it isn’t real. And neither is the person behind it.

    This fall, the U.S. Treasury and U.K. authorities announced their largest-ever takedown of cybercriminal networks responsible for billions in losses tied to fraudulent campaigns, fake fundraising, and AI-generated political deepfakes. The operation targeted transnational organized crime groups based largely in Southeast Asia, including the notorious Prince Group TCO, a dominant player in Cambodia’s scam economy responsible for billions in illicit financial transactions. U.S. losses to online scams topped $16.6 billion last year, with more than $10 billion tied to scam operations based in Southeast Asia.

    These scams are blurring the line between digital activism and manipulation right when citizens are most vulnerable: election season.

    What’s Going On:

    Scammers are exploiting voters’ trust in political communication, blending voice cloning, AI video, and fraudulent donation sites to extract money and personal data.

    Here’s how it works:

    • A deepfake video or voicemail mimics a real candidate, complete with campaign slogans and “urgent” donation requests.
    • The links lead to fraudulent websites where victims enter credit card details.
    • Some schemes even collect personal voter data that is later sold or used for identity theft.

    During the 2024 New Hampshire primary, voice-cloned robocalls impersonating a national political figure were used in an attempt to sway voters, a precursor to the tactics now being scaled globally in 2025.

    Why It’s Effective:

    These scams thrive because people trust familiarity, especially voices, faces, and causes they care about. The timing, emotional tone, and recognizable slogans create a powerful illusion of legitimacy.

    Modern AI makes it nearly impossible for the average person to distinguish a deepfake from reality, especially when wrapped in high-stakes messaging about public service, patriotism, or “protecting democracy.” Add in social pressure, and even cautious donors lower their guard.

    Red Flags:

    Before contributing or sharing campaign links, pause and check for these telltale signs:

    • Donation requests that come through texts, WhatsApp, or unknown numbers.
    • Voices or videos that seem slightly “off”: mismatched mouth movements, odd pauses, or inconsistent lighting.
    • Links that end in unusual extensions (like “.co” or “.support”) rather than official candidate domains.
    • Payment requests through Venmo, CashApp, Zelle, or crypto.
    • No clear disclosure or FEC registration details at the bottom of the website.

    Quick tip: Official campaigns in the U.S. are required to display Federal Election Commission (FEC) registration and disclaimers. If that’s missing, it’s a huge red flag.
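
    For readers comfortable with a little code, here is a minimal sketch of the “check the registered domain, not the look of the link” habit described above. It is illustrative only, written in Python with hypothetical domain names; when in doubt, skip the link entirely and type the campaign’s official address yourself.

        # A rough sketch, assuming you maintain your own list of official campaign
        # domains. "examplecampaign2026.com" is a hypothetical placeholder.
        from urllib.parse import urlparse

        OFFICIAL_DOMAINS = {"examplecampaign2026.com"}

        def registered_domain(url: str) -> str:
            # Reduce a URL to the last two labels of its hostname, e.g.
            # "https://donate.examplecampaign2026.com/give" -> "examplecampaign2026.com".
            # Note: this simple heuristic misreads country-code suffixes like ".co.uk".
            host = (urlparse(url).hostname or "").lower()
            parts = host.split(".")
            return ".".join(parts[-2:]) if len(parts) >= 2 else host

        def looks_official(url: str) -> bool:
            return registered_domain(url) in OFFICIAL_DOMAINS

        print(looks_official("https://donate.examplecampaign2026.com/give"))      # True
        print(looks_official("https://examplecampaign2026-donate.support/give"))  # False

    Even a check like this only confirms where a link points, not whether the site behind it is legitimate, so treat it as one more reason to pause, not a green light.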

    What You Can Do:

    • Verify before donating. Go directly to the official campaign site; don’t use links from texts or emails.
    • Treat urgency as a warning. Real campaigns rarely need “immediate wire transfers.”
    • Listen for tells. Deepfakes often have slightly distorted sounds or mechanical echoes.
    • Cross-check messages. If you get a surprising call or voicemail, compare it with the candidate’s latest verified posts.
    • Report and share. Submit suspicious calls or videos to reportfraud.ftc.gov or your state election board.

    Platforms including Google, Meta, and YouTube are now launching active detection systems and educational tools to flag deepfake political content before it spreads.

    If You’ve Been Targeted:

    • Report donations made to fake campaigns immediately to your bank or credit card provider.
    • File a complaint through the FTC and local election authorities.
    • Freeze credit if personal or voter identity data were shared.
    • Publicize responsibly. Sharing examples with the right context can warn others, but avoid amplifying active scams.

    Final Thoughts:

    Deepfakes are no longer a distant concern; they’re reshaping political communication in real time. What makes this wave dangerous isn’t just the money lost; it’s the erosion of trust.

    The recent takedown of the Prince Group’s transnational criminal networks by U.S. and U.K. authorities, which included sanctions on key individuals and cut off billions in illicit financial flows, underscores the global scale of this problem. Their coordinated actions disrupted the infrastructure enabling these massive fraud campaigns, providing a much-needed deterrent to criminals using AI-based scams during critical democratic processes.

    Staying safe now means applying the same critical awareness you’d use for phishing to the content you see and hear. Don’t assume your eyes or ears tell the full story.

    Think you spotted a fake campaign video or suspicious fundraising call? Don’t scroll past it; report it, discuss it, and share this guide. The more people who know what to look for, the fewer fall for it.

  • Geebo 8:00 am on October 20, 2025
    Tags: AI voice

    AI Is Calling, But It’s Not Who You Think 

    By Greg Collier

    [Image: A phone rings with an unfamiliar number while an AI waveform hovers behind, symbolizing how technology cloaks modern impersonation scams.]

    Picture this: you get a call, and it’s your boss’s voice asking for a quick favor, a wire transfer to a vendor, or a prepaid card code “for the conference.” The tone, the pace, even the background noise sound exactly right. But that voice? It’s not real.

    AI-generated voice cloning is fueling a wave of impersonation scams. And as voice, image, and chat synthesis tools become more advanced, the line between real and fake is disappearing.

    What’s Going On:

    Fraudsters are now combining data from social media with voice samples from YouTube, voicemail greetings, or even podcasts. Using consumer-grade AI tools, they replicate voices with uncanny accuracy.

    They then use these synthetic voices to:

    • Impersonate company leaders or HR representatives.
    • Call family members with “emergencies.”
    • Trick users into authorizing transactions or revealing codes.

    It’s a high-tech twist on old-fashioned deception. Google, PayPal, and cybersecurity experts are warning that deepfake-driven scams will only increase through 2026.

    Why It’s Effective:

    This scam works because it blends psychological urgency with technological familiarity. When “someone you trust” calls asking for help, most people act before thinking.

    Add the fact that AI-generated voices can now convincingly mimic emotional tone, stress, and confidence, and even seasoned professionals fall for it.

    Red Flags:

    Here’s what to look (and listen) for:

    • A call or voicemail that sounds slightly robotic or “too perfect.”
    • Sudden, urgent money or password requests from known contacts.
    • Unusual grammar or tone in follow-up messages.
    • Inconsistencies between the voice message and typical company protocols.

    Pause before panic. If a voice message feels “off,” verify independently with the real person using a saved contact number, not the one in the message.

    What You Can Do:

    • Verify before you act. Hang up and call back using an official phone number.
    • Establish a “family or team password.” A simple phrase everyone knows can verify real emergencies.
    • Don’t rely on caller ID. Scammers can spoof names and organizations.
    • Educate your circle. The best defense is awareness—share updates about new scam tactics.
    • Secure your data. Limit the amount of voice or video content you share publicly.

    Organizations like Google and the FTC now recommend using passkeys, two-factor verification, and scam-spotting games to build intuition against fake communications.

    If You’ve Been Targeted:

    • Cut off contact immediately. Do not reply, click, or engage further.
    • Report the incident to your bank, employer, or relevant platform.
    • File a complaint with the FTC or FBI Internet Crime Complaint Center (IC3).
    • Change your passwords and enable multifactor authentication on critical accounts.
    • Freeze your credit through major reporting agencies if personal data was compromised.

    AI is transforming how scammers operate, but awareness and calm action can short-circuit their success. Most scams thrive on confusion and pressure. If you slow down, verify, and stay informed, you take away their greatest weapon.

    Seen or heard something suspicious? Share this post with someone who might be vulnerable or join the conversation: how would you verify a voice you thought you knew?

  • Geebo 8:00 am on October 3, 2025
    Tags: AI voice

    AI Voice Fuels Virtual Kidnap Plot of Teen

    By Greg Collier

    A family in Buffalo, New York, was recently targeted in a terrifying scam that began with a phone call from an unfamiliar number. On the line was what sounded like the sobbing voice of a teenage boy, pleading for help. The caller then claimed the boy had stumbled upon a dangerous situation and that his life was at risk if the family contacted the authorities.

    In an attempt to make the threat more convincing, the supposed victim’s voice declared that a friend was dead. That detail likely intensified the panic and added emotional weight to the situation, creating even greater pressure to act before pausing to verify the facts.

    While the voice on the line appeared to match the teenager’s, relatives acted quickly to confirm his whereabouts. They checked his phone location and contacted friends who were with him at a local football game. Even after the family confirmed he was safe, the caller escalated the demands, insisting on thousands of dollars in exchange for the teenager’s return. The family ultimately determined the audio was a fabrication engineered to provoke fear and extract money.

    This scheme is known as the virtual kidnapping scam, and the Buffalo incident highlights its modern evolution. Law enforcement and consumer protection agencies have reported a rise in these incidents in recent years. Some of the more convincing cases now incorporate synthetic audio produced with artificial intelligence. Criminals frequently harvest voice samples from publicly posted videos, voice messages, and other social media content to train AI tools that can mimic a loved one’s voice. Other schemes require no sophisticated technology at all and rely instead on pressure tactics and background sounds that suggest urgency. Both approaches exploit emotional vulnerability and the instinct to act quickly when a family member appears to be in danger.

    The narrative presented in this case involved a supposed drug deal that required silencing a witness. Scenarios like that are far more common in fiction than in real life. Local drug activity usually involves low-level sales of marijuana or other minor substances, not organized plots to eliminate bystanders. Scammers craft these kinds of dramatic stories because they sound believable in the moment and increase the pressure on the victim to comply.

    Because these scams play on fear, verification is essential. Families can reduce their risk by establishing simple, prearranged measures that only they know. A short, memorable code word that is used in authentic emergencies is one practical precaution. If a caller claims a family member is being held or harmed, asking for the code word and independently confirming the person’s location can quickly expose fraud. Reporting the call to local law enforcement and preserving call records will help investigators and may prevent others from becoming victims.

    The incident in Buffalo serves as a reminder that technology can magnify age-old criminal tactics. Virtual kidnappings represent an alarming fusion of traditional extortion and modern audio manipulation. Awareness, verification, and basic household protocols can blunt the effect of the scam and give families time to respond calmly and effectively.

     
  • Geebo 8:00 am on June 19, 2025
    Tags: AI voice

    Scammers Clone Celebrity Voices

    By Greg Collier

    A growing number of scams now involve the use of artificial intelligence to impersonate well-known individuals, including local news personalities and potentially even national celebrities. A recent example in Cincinnati highlights the sophistication of these tactics, as scammers used AI-generated audio to mimic the voice of a local TV meteorologist.

    The scheme involves the creation of fake social media accounts, complete with copied profile photos and fabricated usernames that closely resemble legitimate ones. These impersonators send friend requests to unsuspecting individuals and later initiate private conversations in which they use voice messages to convince the target of their identity. The scammers then ask for large sums of money, exploiting the trust built through this artificial familiarity.

    What makes this scam particularly effective is the use of AI voice cloning. With only a few seconds of publicly available audio, such as from a news broadcast or social media post, malicious actors can create a nearly perfect replica of a person’s voice. This technology is readily accessible through free or inexpensive software tools available online.

    While this incident involved a local media figure, the same approach can be used to mimic actors, musicians, and other public figures. It can also extend to impersonations of family members, as seen in other frauds where a cloned voice is used to trick victims into believing a loved one is in distress.

    Social media companies and cybersecurity experts continue to warn the public about these emerging threats. Verifying the legitimacy of messages or profiles, particularly when they involve requests for money, is critical. Fake accounts often use slight misspellings, have minimal engagement, or were created recently. In many cases, a quick search can reveal the existence of the real account, helping to identify the fraudulent one.

    The rise of AI-powered impersonation poses significant challenges to online safety. It underscores the importance of skepticism, especially when requests come through unofficial or unexpected channels. Awareness and caution remain the first lines of defense against this evolving form of digital deception.

     
  • Geebo 9:00 am on January 27, 2025
    Tags: AI voice

    AI Voice Scams: The Ransom Threat

    By Greg Collier

    In a chilling evolution of traditional scams, a new wave of ransom schemes is targeting families with advanced technology, creating fear and financial loss. These scams, which have been reported in Westchester County, New York, and Chatham County, Georgia, use artificial intelligence (AI) to replicate the voices of loved ones and phone number spoofing to make calls appear authentic. The alarming frequency and realism of these incidents leave victims shaken and desperate.

    In Peekskill, New York, families in a local school district were targeted with calls claiming their child had been kidnapped. Using AI-generated voice replication, scammers made the calls sound as though they were coming directly from the child. The calls included cries for help and demands for ransom, creating a terrifying sense of urgency for the families. Similarly, in Chatham County, Georgia, law enforcement received reports of scam calls where the voices of loved ones were mimicked, and their phone numbers were spoofed. Victims believed they were speaking directly with their family member, further convincing them of the alleged kidnapping.

    This type of scam, known as the virtual kidnapping scam, is made possible by the proliferation of digital tools capable of replicating a person’s voice with only a few audio samples. These samples are often taken from social media, where individuals frequently share videos and voice recordings. Additionally, phone number spoofing allows scammers to manipulate caller IDs, making it seem as though the call is originating from the victim’s own phone or from a familiar contact.

    Authorities have noted that these scams exploit advanced technology and human psychology to maximum effect. The sense of urgency created by threats of violence and the apparent authenticity of the call make it difficult for victims to pause and assess the situation critically. Victims often feel immense pressure to act quickly, believing that hesitation could lead to harm for their loved ones.

    In both Peekskill and Chatham County, authorities have emphasized the importance of verifying the safety of family members independently and resisting the temptation to provide personal or financial information over the phone. Families are being encouraged to create unique verification methods, such as secret passwords or phrases, to quickly confirm the legitimacy of a call. Law enforcement in both areas continues to investigate these cases and spread awareness to prevent further victimization.

    While the technological tools enabling these scams are growing more sophisticated, education remains a powerful defense. By understanding how these scams operate and staying cautious about unfamiliar links or calls, individuals can protect themselves and their loved ones from falling victim to these disturbing schemes.

    With the rise of these incidents, it’s clear that continued efforts to promote awareness and implement preventative strategies will be key in combating this alarming trend.

     
  • Geebo 9:00 am on November 5, 2024
    Tags: AI voice

    A Mother’s Close Call with AI Voice Cloning

    By Greg Collier

    Imagine the terror of receiving a phone call with a familiar voice in distress, only to realize it was a cruel, high-tech scam. This harrowing experience recently befell a mother in Grand Rapids, Michigan, who nearly lost $50,000 over a weekend to a sophisticated AI-driven scam. The scheme, known as ‘voice cloning’, mimicked the voice of her daughter so convincingly that it bypassed her natural skepticism and sent her scrambling to respond to what seemed like an emergency.

    It started with a phone call from an unknown number, coming from a town her daughter often frequented. With her daughter’s faint, panicked voice on the other end, she felt an instant urgency and fear that something was gravely wrong. Then, as she listened, the tone shifted; a stranger seized control of the call, asserting himself as a captor and demanding an immediate ransom. Her daughter’s supposed voice—distorted, mumbled, and terrified—amplified the mother’s fears. Desperation began to cloud her judgment as she debated how to produce such a vast sum on short notice.

    In her fear and confusion, she was prepared to do whatever it took to ensure her daughter’s safety. She was ready to withdraw cash, find neighbors who might accompany her, and meet the caller, who had directed her to a local hardware store for the exchange. Her husband, however, provided a second check on her instincts: while she negotiated, he placed a call to the local police department. They advised him to contact their daughter directly, which the couple did, only to find she was safe and sound, unaware of the horrifying call her mother had just endured.

    This unsettling experience highlights a chilling reality of today’s world: the power of artificial intelligence to manipulate emotions, creating distressing scenarios with fabricated voices. These AI scams work by exploiting easily accessible samples of people’s voices, often found in social media videos or recordings. Voice cloning technology, once a futuristic concept, is now accessible and advanced enough to replicate a person’s voice with unsettling accuracy from just a brief clip.

    The Better Business Bureau advises those targeted by similar scams to resist the urge to act immediately. The shock of hearing a loved one’s voice in peril can push us to respond without question, but taking a pause, verifying the caller’s claims, and contacting the loved one directly are critical steps to prevent falling victim.

    Protecting yourself from AI-driven voice cloning scams requires both awareness and a proactive approach. Start by being mindful of what you share online, especially voice recordings, as even brief audio clips on social media can provide the material needed for cloning. Reducing the number of public posts containing your voice limits potential exposure, making it harder for scammers to replicate.

    Establishing a safe word with family members is also an effective precaution. A unique, shared phrase can act as a verification tool in emergency calls. If you ever receive a call claiming a loved one is in distress, use this word to confirm their identity. By doing so, you create a reliable check against scams, especially when emotions run high.

    It’s essential to take a moment to verify information before reacting. Scammers count on people’s tendency to act on instinct, especially when fear and urgency are involved. If you receive an alarming call, try to reach the person directly using a familiar number. Verifying information before sending money or following instructions can prevent falling victim to such fraud.

    In the end, a calm, measured approach, grounded in verification and pre-established safety measures, can make all the difference in staying protected against AI-driven threats.

     
  • Geebo 9:00 am on February 28, 2024
    Tags: AI voice, voice cloning

    The terrifying rise of AI-generated phone scams 

    By Greg Collier

    In the age of rapid technological advancement, it appears that scammers are always finding new ways to exploit our vulnerabilities. One of the latest and most frightening trends is the emergence of AI-generated phone scams, where callers use sophisticated artificial intelligence to mimic the voices of loved ones and prey on our emotions.

    Recently, residents of St. Louis County in Missouri were targeted by a particularly chilling variation of this scam. Victims received calls from individuals claiming to be their children in distress, stating that they had been involved in a car accident and the other driver was demanding money for damages under the threat of kidnapping. The scammers used AI to replicate the voices of the victims’ children, adding an extra layer of realism to their deception.

    The emotional impact of such a call cannot be overstated. Imagine receiving a call from someone who sounds exactly like your child, crying and pleading for help. The panic and fear that ensue can cloud judgment and make it difficult to discern the truth. This is precisely what the scammers rely on to manipulate their victims.

    One brave mother shared her harrowing experience with a local news outlet. She recounted how she received a call from someone who sounded like her daughter, claiming to have been in an accident and demanding a $2,000 wire transfer to prevent her kidnapping.

    Fortunately, in the case of the St. Louis County mother, prompt police intervention prevented her from falling victim to the scam. However, not everyone is as fortunate, with some parents having lost thousands of dollars to these heartless perpetrators.

    Experts warn that hanging up the phone may not be as simple as it seems in the heat of the moment. Instead, families should establish safe words or phrases to verify the authenticity of such calls.

    To protect yourself from falling victim to AI-generated phone scams, it’s essential to remain informed. Be wary of calls that pressure you to act quickly or request payment via gift cards or cryptocurrency. If you receive such a call, verify the authenticity of the situation by contacting the threatened family member directly and report the incident to law enforcement.

     
  • Geebo 9:00 am on January 12, 2024
    Tags: AI voice

    More police warn of AI voice scams

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories: the virtual kidnapping scam and the grandparent scam. Today, we’re focused on the grandparent scam. It garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening a lot lately in the Memphis area, to the point where a Sheriff’s Office has issued a warning to local residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like her and said she needed $500 for bail money after getting into a car accident. Smartly, the family kept asking the caller questions that only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 8:00 am on September 19, 2023
    Tags: AI voice

    The sheer terror of the kidnapping scam

    By Greg Collier

    Even if someone has complete knowledge of how a certain scam works, that doesn’t necessarily mean they won’t fall victim to it, because some scams are simply that menacing. Take, for example, the virtual kidnapping scam. This is when a scammer calls someone and claims to have kidnapped their loved one before making a ransom demand. Meanwhile, the supposed kidnap victim is unharmed and has no idea they’re being used in a scam. With the advancement of AI voice-spoofing technology, scammers can easily mimic the voice of a victim’s loved one to make the scam seem even more threatening.

    With that knowledge in mind, we may think we wouldn’t fall for such a scam as we sit at our keyboards and screens. But can you say that with 100% confidence? Before you answer, you should know the story of an Atlanta father who fell victim to the scam.

    He received a call from someone who claimed they kidnapped his adult daughter. At the time of the call, the man’s daughter was traveling. This could be why the man was targeted, as scammers often take information they find on social media and use it to their advantage. The caller claimed he got into a car accident with the man’s daughter and that they were carrying a substantial amount of cocaine at the time.

    The caller threatened the life of the man’s daughter, saying that they couldn’t have anyone recognize them. This was accompanied by screams and cries in the background that replicated his daughter’s voice. This was followed up with threats of torture and other bodily harm to the daughter if the man didn’t comply. For the sake of decorum, we won’t reprint specifically what the threats entailed, but imagine the worst thing that could happen to a loved one of your own, and then you have an idea of the terror that was unfolding.

    The father complied with the scammer’s demands and sent $2,500 to the scammer’s bank account, probably through an app like Zelle.

    Even when armed with knowledge of how the virtual kidnapping scam works, no one could be blamed for falling victim in the heat of the moment. However, there are still ways to protect yourself. The best is to set up a code word between you and your loved ones. That way, if you receive a call like this, you can tell whether you’re actually talking to your loved one. You could also ask a question that only the supposed kidnap victim would know.

    While it’s easier said than done, try to remain calm in the situation, even while your ears may be deceiving you. Make attempts to contact your loved one through other means. If you can, attempt to have someone else reach them on a different phone.

    Please keep in mind, virtual kidnapping scams rely on manipulation and intimidation. By staying calm, and taking the necessary precautions, you can protect yourself and your loved ones from falling victim to these schemes.

     
  • Geebo 8:00 am on September 1, 2023
    Tags: AI voice

    Grandmother scammed for weeks in AI voice-spoofing scam 

    By Greg Collier

    It’s been a short while since we last discussed the AI voice-spoofing scam. For new readers, this is when scammers obtain a sample of someone’s voice from online, and run it through an AI program, which allows the scammers to make the voice say whatever they want. The scammers then use the person’s voice to convince that person’s loved one to send the scammers money.

    Voice-spoofing is typically used in one of two consumer-level scams. The first one is the virtual kidnapping scam, which is exactly what it sounds like. Scammers will use the spoofed voice to make it sound like somebody’s loved one has been kidnapped, and the scammers will demand a ransom.

    The second scam is the one we’ll be discussing today, which is the grandparent scam. In this scam, the scammers pose as an elderly victim’s grandchild who’s in some kind of legal trouble. The scammers will often ask for bail money or legal fees.

    An elderly woman from Utah recently fell victim to the grandparent scam. Scammers called her on the phone using the cloned voice of one of her granddaughters. The ‘granddaughter’ said she had been arrested after riding in a car with someone who had drugs and needed bail money. A scammer then got on the call and pretended to be the granddaughter’s attorney and instructed the woman on how she could send payment. The woman was also instructed not to tell anyone else in the family, as it could jeopardize the granddaughter’s court case.

    One of the many problems with scammers is that if you pay them once, chances are they’ll come back for more money, which is what happened here. For weeks, the phony granddaughter kept calling back, needing more money each time for various legal proceedings. Keep in mind that with each conversation, the grandmother was not actually talking to anybody but a computer-generated voice that sounded exactly like her granddaughter.

    Eventually, the grandmother did grow suspicious and told her son, who informed her she was being scammed.

    Don’t trust your ears when it comes to phone scams. If you receive a call from someone claiming to be a relative or loved one in need of money, it’s important to follow the same precautions, even if the voice sounds exactly like them. Hang up on the call and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Be sure to tell them about the situation you encountered, and never keep it a secret. Lastly, never send money under any circumstances.

     