Tagged: voice cloning

  • Geebo 8:00 am on October 22, 2025
    Tags: political donations, voice cloning

    Deepfake Donors: When Political Voices Are Fake 

    By Greg Collier

    You get a text from your “preferred political candidate.” It asks for a small donation of ten dollars “to fight misinformation” or “protect election integrity.” The link looks official. The voice message attached even sounds authentically passionate, familiar, and persuasive.

    But it isn’t real. And neither is the person behind it.

    This fall, the U.S. Treasury and U.K. authorities announced their largest-ever takedown of cybercriminal networks responsible for billions in losses tied to fraudulent campaigns, fake fundraising, and AI-generated political deepfakes. The operation targeted transnational organized criminal groups based largely in Southeast Asia, including the notorious Prince Group TCO, a dominant player in Cambodia’s scam economy linked to billions in illicit financial transactions. U.S. losses to online investment scams alone topped $16.6 billion, with over $10 billion lost to scam operations based in Southeast Asia just last year.

    These scams are blurring the line between digital activism and manipulation right when citizens are most vulnerable: election season.

    What’s Going On:

    Scammers are exploiting voters’ trust in political communication, blending voice cloning, AI video, and fraudulent donation sites to extract money and personal data.

    Here’s how it works:

    • A deepfake video or voicemail mimics a real candidate, complete with campaign slogans and “urgent” donation requests.
    • The links lead to fraudulent websites where victims enter credit card details.
    • Some schemes even collect personal voter data later sold or used for identity theft.

    During the 2024 New Hampshire primary, voice-cloned robocalls impersonating national figures were caught attempting to sway voters, a precursor to the tactics now being scaled globally in 2025.

    Why It’s Effective:

    These scams thrive because people trust familiarity, especially voices, faces, and causes they care about. The timing, emotional tone, and recognizable slogans create a powerful illusion of legitimacy.

    Modern AI makes it nearly impossible for the average person to distinguish a deepfake from reality, especially when wrapped in high-stakes messaging about public service, patriotism, or “protecting democracy.” Add in social pressure, and even cautious donors lower their guard.

    Red Flags:

    Before contributing or sharing campaign links, pause and check for these telltale signs:

    • Donation requests that come through texts, WhatsApp, or unknown numbers.
    • Voices or videos that sound slightly “off”: mismatched mouth movements, odd pauses, or inconsistent lighting.
    • Links that end in unusual extensions (like “.co” or “.support”) rather than official candidate domains.
    • Payment requests through Venmo, CashApp, Zelle, or crypto.
    • No clear disclosure or FEC registration details at the bottom of the website.

    Quick tip: Official campaigns in the U.S. are required to display Federal Election Commission (FEC) registration and disclaimers. If that’s missing, it’s a huge red flag.

    What You Can Do:

    • Verify before donating. Go directly to the official campaign site; don’t use links from texts or emails.
    • Treat urgency as a warning. Real campaigns rarely need “immediate wire transfers.”
    • Listen for tells. Deepfakes often have slightly distorted sounds or mechanical echoes.
    • Cross-check messages. If you get a surprising call or voicemail, compare it with the candidate’s latest verified posts.
    • Report and share. Submit suspicious calls or videos to reportfraud.ftc.gov or your state election board.

    Platforms including Google, Meta, and YouTube are now launching active detection systems and educational tools to flag deepfake political content before it spreads.

    If You’ve Been Targeted:

    • Report donations made to fake campaigns immediately to your bank or credit card provider.
    • File a complaint through the FTC and local election authorities.
    • Freeze credit if personal or voter identity data were shared.
    • Publicize responsibly. Sharing examples with the right context can warn others, but avoid amplifying active scams.

    Final Thoughts:

    Deepfakes are no longer a distant concern; they’re reshaping political communication in real time. What makes this wave dangerous isn’t just money loss; it’s trust erosion.

    The recent takedown of the Prince Group’s transnational criminal networks by U.S. and U.K. authorities, which included sanctions on key individuals and cutting off millions in illicit financial flows, underscores the global scale of this problem. Their coordinated actions disrupted the infrastructure enabling these massive fraud campaigns, providing a much-needed deterrent to criminals using AI-based scams during critical democratic processes.

    Staying safe now means applying the same critical awareness you’d use for phishing to the content you see and hear. Don’t assume your eyes or ears tell the full story.

    Think you spotted a fake campaign video or suspicious fundraising call? Don’t scroll past it; report it, discuss it, and share this guide. The more people who know what to look for, the fewer fall for it.

  • Geebo 8:00 am on October 20, 2025
    Tags: voice cloning

    AI Is Calling, But It’s Not Who You Think 

    By Greg Collier

    A phone rings with an unfamiliar number while an AI waveform hovers behind, symbolizing how technology cloaks modern impersonation scams.

    Picture this: you get a call, and it’s your boss’s voice asking for a quick favor, a wire transfer to a vendor, or a prepaid card code “for the conference.” It sounds exactly like their tone, pace, and even background noise. But that voice? It’s not real.

    AI-generated voice cloning is fueling a wave of impersonation scams. And as voice, image, and chat synthesis tools become more advanced, the line between real and fake is disappearing.

    What’s Going On:

    Fraudsters are now combining data from social media with voice samples from YouTube, voicemail greetings, or even podcasts. Using consumer-grade AI tools, they replicate voices with uncanny accuracy.

    They then use these synthetic voices to:

    • Impersonate company leaders or HR representatives.
    • Call family members with “emergencies.”
    • Trick users into authorizing transactions or revealing codes.

    It’s a high-tech twist on old-fashioned deception. Google, PayPal, and cybersecurity experts are warning that deepfake-driven scams will only increase through 2026.

    Why It’s Effective:

    This scam works because it blends psychological urgency with technological familiarity. When “someone you trust” calls asking for help, most people act before thinking.

    Add the fact that AI-generated voices can now mimic emotional tone, stress, confidence, and familiarity, and even seasoned professionals fall for it.

    Red Flags:

    Here’s what to look (and listen) for:

    • A call or voicemail that sounds slightly robotic or “too perfect.”
    • Sudden, urgent money or password requests from known contacts.
    • Unusual grammar or tone in follow-up messages.
    • Inconsistencies between the voice message and typical company protocols.

    Pause before panic. If a voice message feels “off,” verify independently with the real person using a saved contact number, not the one in the message.

    What You Can Do:

    • Verify before you act. Hang up and call back using an official phone number.
    • Establish a “family or team password.” A simple phrase everyone knows can verify real emergencies.
    • Don’t rely on caller ID. Scammers can spoof names and organizations.
    • Educate your circle. The best defense is awareness—share updates about new scam tactics.
    • Secure your data. Limit the amount of voice or video content you share publicly.

    Organizations like Google and the FTC now recommend using passkeys, two-factor verification, and scam-spotting games to build intuition against fake communications.

    If You’ve Been Targeted:

    • Cut off contact immediately. Do not reply, click, or engage further.
    • Report the incident to your bank, employer, or relevant platform.
    • File a complaint with the FTC or FBI Internet Crime Complaint Center (IC3).
    • Change your passwords and enable multifactor authentication on critical accounts.
    • Freeze your credit through major reporting agencies if personal data was compromised.

    AI is transforming how scammers operate, but awareness and calm action can short-circuit their success. Most scams thrive on confusion and pressure. If you slow down, verify, and stay informed, you take away their greatest weapon.

    Seen or heard something suspicious? Share this post with someone who might be vulnerable or join the conversation: how would you verify a voice you thought you knew?

  • Geebo 8:00 am on October 3, 2025
    Tags: voice cloning

    AI Voice Fuels Virtual Kidnap Plot of Teen 

    By Greg Collier

    A family in Buffalo, New York, was recently targeted in a terrifying scam that began with a phone call from an unfamiliar number. On the line was what sounded like the sobbing voice of a teenage boy, pleading for help. The caller then claimed the boy had stumbled upon a dangerous situation and that his life was at risk if the family contacted the authorities.

    In an attempt to make the threat more convincing, the supposed victim’s voice declared that a friend was dead. That detail likely intensified the panic and added emotional weight to the situation, creating even greater pressure to act before pausing to verify the facts.

    While the voice on the line appeared to match the teenager’s, relatives acted quickly to confirm his whereabouts. They checked his phone location and contacted friends who were with him at a local football game. Even after the family confirmed he was safe, the caller escalated his demands, insisting on thousands of dollars in exchange for the teenager’s return. The family ultimately determined the audio was a fabrication engineered to provoke fear and extract money.

    This scheme is known as the virtual kidnapping scam, and the Buffalo incident highlights its modern evolution. Law enforcement and consumer protection agencies have reported a rise in these incidents in recent years. Some of the more convincing cases now incorporate synthetic audio produced with artificial intelligence. Criminals frequently harvest voice samples from publicly posted videos, voice messages, and other social media content to train AI tools that can mimic a loved one’s voice. Other schemes require no sophisticated technology at all and rely instead on pressure tactics and background sounds that suggest urgency. Both approaches exploit emotional vulnerability and the instinct to act quickly when a family member appears to be in danger.

    The narrative presented in this case involved a supposed drug deal that required silencing a witness. Scenarios like that are far more common in fiction than in real life. Local drug activity usually involves low-level sales of marijuana or other minor substances, not organized plots to eliminate bystanders. Scammers craft these kinds of dramatic stories because they sound believable in the moment and increase the pressure on the victim to comply.

    Because these scams play on fear, verification is essential. Families can reduce their risk by establishing simple, prearranged measures that only they know. A short, memorable code word that is used in authentic emergencies is one practical precaution. If a caller claims a family member is being held or harmed, asking for the code word and independently confirming the person’s location can quickly expose fraud. Reporting the call to local law enforcement and preserving call records will help investigators and may prevent others from becoming victims.

    The incident in Buffalo serves as a reminder that technology can magnify age-old criminal tactics. Virtual kidnappings represent an alarming fusion of traditional extortion and modern audio manipulation. Awareness, verification, and basic household protocols can blunt the effect of the scam and give families time to respond calmly and effectively.

     
  • Geebo 8:00 am on June 19, 2025
    Tags: voice cloning

    Scammers Clone Celebrity Voices 

    By Greg Collier

    A growing number of scams now involve the use of artificial intelligence to impersonate well-known individuals, including local news personalities and potentially even national celebrities. A recent example in Cincinnati highlights the sophistication of these tactics, as scammers used AI-generated audio to mimic the voice of a local TV meteorologist.

    The scheme involves the creation of fake social media accounts, complete with copied profile photos and fabricated usernames that closely resemble legitimate ones. These impersonators send friend requests to unsuspecting individuals and later initiate private conversations in which they use voice messages to convince the target of their identity. The scammers then ask for large sums of money, exploiting the trust built through this artificial familiarity.

    What makes this scam particularly effective is the use of AI voice cloning. With only a few seconds of publicly available audio, such as from a news broadcast or social media post, malicious actors can create a nearly perfect replica of a person’s voice. This technology is readily accessible through free or inexpensive software tools available online.

    While this incident involved a local media figure, the same approach can be used to mimic actors, musicians, and other public figures. It can also extend to impersonations of family members, as seen in other frauds where a cloned voice is used to trick victims into believing a loved one is in distress.

    Social media companies and cybersecurity experts continue to warn the public about these emerging threats. Verifying the legitimacy of messages or profiles, particularly when they involve requests for money, is critical. Fake accounts often use slight misspellings, have minimal engagement, or were created recently. In many cases, a quick search can reveal the existence of the real account, helping to identify the fraudulent one.

    The rise of AI-powered impersonation poses significant challenges to online safety. It underscores the importance of skepticism, especially when requests come through unofficial or unexpected channels. Awareness and caution remain the first lines of defense against this evolving form of digital deception.

     
  • Geebo 8:00 am on March 25, 2025
    Tags: voice cloning

    Scammers Are Still Cloning You 

    By Greg Collier

    A new type of scam is becoming more common, and more convincing, thanks to rapidly evolving artificial intelligence. The Better Business Bureau has issued a warning about voice-cloning scams that are impacting individuals and families across the country.

    These scams rely on technology that can mimic someone’s voice with alarming accuracy. With just a few seconds of audio, sometimes lifted from voicemail greetings, casual conversations, or even online videos, scammers can generate a voice that sounds nearly identical to that of a loved one. This makes it incredibly difficult to distinguish between a real call and a fake one, especially when the voice on the other end is claiming to be in trouble, asking for money, or offering a too-good-to-be-true opportunity.

    In one case recently reported, an individual spent nearly a week performing tasks for what appeared to be a remote job, unaware that the employer’s true intent was to capture voice recordings. The concern is that these recordings may later be used in scams that impersonate the individual or manipulate others into sharing sensitive information.

    Scammers are becoming more strategic. They’re using AI not just to imitate voices, but also to weave those voices into emotional scenarios that cause panic or urgency, situations where someone might act quickly without verifying the call. This emotional manipulation is what makes these scams so dangerous. A familiar voice saying it’s an emergency can override our instincts and judgment in a matter of seconds.

    To protect yourself, take steps that make it harder for these scams to succeed. If you receive a call that seems suspicious, even if the voice sounds familiar, don’t respond right away. Take a moment to pause. Hang up and call the person directly using a known number. This simple step can often expose the scam for what it is.

    Securing your digital presence is also key. Enable multifactor authentication on your accounts whenever possible. It adds an extra layer of protection that can prevent scammers from accessing your information, even if they manage to imitate your voice or steal your password. At work, businesses should invest in cybersecurity training for employees. Building a culture of awareness and caution can prevent data breaches and manipulation.

    AI voice scams are still a developing threat, and organizations like the BBB are working to find solutions and increase public awareness. Until then, staying skeptical, careful, and informed is the best defense. In this new era where hearing a familiar voice doesn’t guarantee safety, taking a second to verify can make all the difference.

     
  • Geebo 9:00 am on January 27, 2025
    Tags: voice cloning

    AI Voice Scams: The Ransom Threat 

    By Greg Collier

    In a chilling evolution of traditional scams, a new wave of ransom schemes is targeting families with advanced technology, creating fear and financial loss. These scams, which have been reported in Westchester County, New York, and Chatham County, Georgia, use artificial intelligence (AI) to replicate the voices of loved ones and phone number spoofing to make calls appear authentic. The alarming frequency and realism of these incidents leave victims shaken and desperate.

    In Peekskill, New York, families in a local school district were targeted with calls claiming their child had been kidnapped. Using AI-generated voice replication, scammers made the calls sound as though they were coming directly from the child. The calls included cries for help and demands for ransom, creating a terrifying sense of urgency for the families. Similarly, in Chatham County, Georgia, law enforcement received reports of scam calls where the voices of loved ones were mimicked, and their phone numbers were spoofed. Victims believed they were speaking directly with their family member, further convincing them of the alleged kidnapping.

    This type of scam, known as the virtual kidnapping scam, is made possible by the proliferation of digital tools capable of replicating a person’s voice with only a few audio samples. These samples are often taken from social media, where individuals frequently share videos and voice recordings. Additionally, phone number spoofing allows scammers to manipulate caller IDs, making it seem as though the call is originating from the victim’s own phone or from a familiar contact.

    Authorities have noted that these scams exploit advanced technology and human psychology to maximum effect. The sense of urgency created by threats of violence and the apparent authenticity of the call make it difficult for victims to pause and assess the situation critically. Victims often feel immense pressure to act quickly, believing that hesitation could lead to harm for their loved ones.

    In both Peekskill and Chatham County, authorities have emphasized the importance of verifying the safety of family members independently and resisting the temptation to provide personal or financial information over the phone. Families are being encouraged to create unique verification methods, such as secret passwords or phrases, to quickly confirm the legitimacy of a call. Law enforcement in both areas continues to investigate these cases and spread awareness to prevent further victimization.

    While the technological tools enabling these scams are growing more sophisticated, education remains a powerful defense. By understanding how these scams operate and staying cautious about unfamiliar links or calls, individuals can protect themselves and their loved ones from falling victim to these disturbing schemes.

    With the rise of these incidents, it’s clear that continued efforts to promote awareness and implement preventative strategies will be key in combating this alarming trend.

     
  • Geebo 9:00 am on November 5, 2024
    Tags: voice cloning

    A Mother’s Close Call with AI Voice Cloning 

    By Greg Collier

    Imagine the terror of receiving a phone call with a familiar voice in distress, only to realize it was a cruel, high-tech scam. This harrowing experience recently befell a mother in Grand Rapids, Michigan, who nearly lost $50,000 over a weekend to a sophisticated AI-driven scam. The scam, known as ‘voice cloning,’ mimicked the voice of her daughter so convincingly that it bypassed her natural skepticism and sent her scrambling to respond to what seemed like an emergency.

    It started with a phone call from an unknown number, coming from a town her daughter often frequented. With her daughter’s faint, panicked voice on the other end, she felt an instant urgency and fear that something was gravely wrong. Then, as she listened, the tone shifted; a stranger seized control of the call, asserting himself as a captor and demanding an immediate ransom. Her daughter’s supposed voice—distorted, mumbled, and terrified—amplified the mother’s fears. Desperation began to cloud her judgment as she debated how to produce such a vast sum on short notice.

    In her fear and confusion, she was prepared to do whatever it took to ensure her daughter’s safety. She was ready to withdraw cash, find neighbors who might accompany her, and meet the caller, who had directed her to a local hardware store for the exchange. But while she negotiated, her husband placed a call to the local police department. They advised him to contact their daughter directly, which he did, only to find she was safe and sound, unaware of the horrifying call her mother had just endured.

    This unsettling experience highlights a chilling reality of today’s world: the power of artificial intelligence to manipulate emotions, creating distressing scenarios with fabricated voices. These AI scams work by exploiting easily accessible samples of people’s voices, often found in social media videos or recordings. Voice cloning technology, once a futuristic concept, is now accessible and advanced enough to replicate a person’s voice with unsettling accuracy from just a brief clip.

    The Better Business Bureau advises those targeted by similar scams to resist the urge to act immediately. The shock of hearing a loved one’s voice in peril can push us to respond without question, but taking a pause, verifying the caller’s claims, and contacting the loved one directly are critical steps to prevent falling victim.

    Protecting yourself from AI-driven voice cloning scams requires both awareness and a proactive approach. Start by being mindful of what you share online, especially voice recordings, as even brief audio clips on social media can provide the material needed for cloning. Reducing the number of public posts containing your voice limits potential exposure, making it harder for scammers to replicate.

    Establishing a safe word with family members is also an effective precaution. A unique, shared phrase can act as a verification tool in emergency calls. If you ever receive a call claiming a loved one is in distress, use this word to confirm their identity. By doing so, you create a reliable check against scams, especially when emotions run high.

    It’s essential to take a moment to verify information before reacting. Scammers count on people’s tendency to act on instinct, especially when fear and urgency are involved. If you receive an alarming call, try to reach the person directly using a familiar number. Verifying information before sending money or following instructions can prevent falling victim to such fraud.

    In the end, a calm, measured approach, grounded in verification and pre-established safety measures, can make all the difference in staying protected against AI-driven threats.

     
  • Geebo 9:00 am on January 12, 2024
    Tags: voice cloning

    More police warn of AI voice scams 

    By Greg Collier

    AI voice spoofing refers to the use of artificial intelligence (AI) technology to imitate or replicate a person’s voice in a way that may deceive listeners into thinking they are hearing the real person. This technology can be used to generate synthetic voices that closely mimic the tone, pitch, and cadence of a specific individual. The term is often associated with negative uses, such as creating fraudulent phone calls or audio messages with the intent to deceive or manipulate.

    Scammers can exploit a brief audio clip of your family member’s voice, easily obtained from online content. With access to a voice-cloning program, the scammer can then imitate your loved one’s voice convincingly when making a call, leading to potential deception and manipulation. Scammers have quickly taken to this technology in order to fool people into believing their loved ones are in danger in what are being called family emergency scams.

    Family emergency scams typically break down into two categories: the virtual kidnapping scam and the grandparent scam. Today, we’re focused on the grandparent scam. It garnered its name from the fact that scammers often target elderly victims, posing as the victim’s grandchild in peril. This scam has been happening a lot lately in the Memphis area, to the point where a Sheriff’s Office has issued a warning to local residents about it.

    One family received a phone call that appeared to be coming from their adult granddaughter. The caller sounded exactly like her and claimed to need $500 for bail money after getting into a car accident. Smartly, the family kept asking the caller questions that only their granddaughter would know. The scammers finally hung up.

    To safeguard against this scam, it’s crucial to rely on caution rather than solely trusting your ears. If you receive a call from a supposed relative or loved one urgently requesting money due to a purported crisis, adhere to the same safety measures. Resist the urge to engage further; instead, promptly end the call and independently contact the person who is claimed to be in trouble to verify the authenticity of the situation. This proactive approach helps ensure protection against potential scams, even when the voice on the call seems identical to that of your loved one.

     
  • Geebo 9:00 am on November 27, 2023
    Tags: voice cloning

    The FTC puts a bounty on AI voice cloning 

    By Greg Collier

    AI-generated voice cloning, or voice spoofing, scams have become such a nuisance that the federal government is turning to the public to help solve the problem. If you’re unfamiliar with AI voice generation technology, there are apps and programs that can take a short sample of anyone’s voice and make that voice say whatever you want it to. The benefit is that it can give people who have lost their speaking ability a voice. However, every tool made for the good of mankind can also be used to its detriment.

    Scammers use cloned voices in what are known as emergency scams. Emergency scams break down, for the most part, into two categories: the grandparent scam and the virtual kidnapping scam. In both, the scammers need to convince the victim that one of their loved ones is in some sort of peril. In the grandparent scam, the scammer tries to convince the victim their loved one is in jail and needs bail money, while in the virtual kidnapping scam, the scammers try to convince the victim their loved one has been kidnapped for ransom.

    Scammers will take a sample of someone’s voice, typically from a video that’s been posted to social media. Then, they’ll use the voice cloning technology to make it sound like that person is in a situation that requires the victim to send money.

    Voice cloning has become such a problem that the Federal Trade Commission has issued a challenge to anyone who thinks they can develop some kind of voice cloning detector. The top prize winner can receive $25,000, the runner-up can get $4,000, and three honorable mentions can get $2,000 each.

    In their own words, the FTC has issued this challenge to help push forward ideas to mitigate risks upstream—shielding consumers, creative professionals, and small businesses against the harms of voice cloning before the harm reaches a consumer.

    Submissions will be accepted through the FTC’s online portal from January 2 to 12, 2024.

    Hopefully, someone can come up with the right idea to help keep consumers from losing their money to these scammers.

     
  • Geebo 8:00 am on September 19, 2023
    Tags: voice cloning

    The sheer terror of the kidnapping scam 

    By Greg Collier

    Even if someone has complete knowledge of how a certain scam works, that doesn’t necessarily mean they won’t fall victim to it, because some scams are simply that menacing. Take, for example, the virtual kidnapping scam. This is when a scammer calls someone and claims to have kidnapped their loved one before making a ransom demand. Meanwhile, the supposed kidnap victim is unharmed and has no idea they’re being used in a scam. With the advancement of AI voice-spoofing technology, scammers can easily mimic the voice of the victim’s loved one to make the scam seem even more threatening.

    With that knowledge in mind, we may think we wouldn’t fall for such a scam as we sit at our keyboards and screens. But can you say that with 100% confidence? Before you answer, you should know the story of an Atlanta father who fell victim to the scam.

    He received a call from someone who claimed they kidnapped his adult daughter. At the time of the call, the man’s daughter was traveling. This could be why the man was targeted, as scammers often take information they find on social media and use it to their advantage. The caller claimed he got into a car accident with the man’s daughter and that they were carrying a substantial amount of cocaine at the time.

    The caller threatened the life of the man’s daughter, saying that they couldn’t have anyone recognize them. This was accompanied by screams and cries in the background that replicated his daughter’s voice. This was followed up with threats of torture and other bodily harm to the daughter if the man didn’t comply. For the sake of decorum, we won’t reprint specifically what the threats entailed, but imagine the worst thing that could happen to a loved one of your own, and then you have an idea of the terror that was unfolding.

    The father complied with the scammer’s demands and sent $2,500 to the scammer’s bank account, probably through an app like Zelle.

    Even armed with knowledge of how the virtual kidnapping scam works, no one could be blamed for falling victim in the heat of the moment. However, there are still ways to protect yourself. The best is to set up a code word between you and your loved ones. That way, on calls like this, you can tell whether you’re actually talking to your loved one. You could also ask the caller a question that only the supposed kidnap victim would know.

    While it’s easier said than done, try to remain calm in the situation, even while your ears may be deceiving you. Make attempts to contact your loved one through other means. If you can, attempt to have someone else reach them on a different phone.

    Please keep in mind that virtual kidnapping scams rely on manipulation and intimidation. By staying calm and taking the necessary precautions, you can protect yourself and your loved ones from falling victim to these schemes.

     