The Rise of Celebrity Deepfake Scams
By Greg Collier
Picture this: you’re scrolling through TikTok or Instagram and suddenly see your favorite celebrity share a video endorsement. The voice, the smile, even the familiar expressions all feel authentic. Maybe it’s an investment opportunity, a charitable donation, or a new product launch.
It feels real—but it isn’t.
Recently, a woman in Southern California believed she was speaking directly with actor Steve Burton from General Hospital. Through a series of video and voice messages, she was convinced they were in a relationship. By the time the truth surfaced, she had lost more than $80,000, including money from selling her home.
In another case, influencer Molly-Mae Hague had to warn her followers after a realistic video appeared online promoting a perfume she never endorsed. Supermodel Gisele Bündchen’s image was also used in a fake Instagram campaign that netted scammers millions of dollars before it was taken down.
These aren’t isolated incidents. Deepfake technology is rapidly becoming one of the most dangerous new tools in online fraud.
What’s Happening:
Scammers have learned to use publicly available photos and videos to create realistic AI-generated likenesses of celebrities. Once they have enough material, they can digitally clone a person’s face and voice with startling accuracy.
Here’s how the schemes often unfold:
- They create a convincing video or audio clip using AI trained on interviews, social media clips, and public footage.
- The fake content is shared through social platforms, private messages, or even live video streams.
- Victims are told to invest in a product, send donations, or even begin a “personal relationship” with the celebrity.
- Once trust is established, the scammer asks for money, crypto transfers, or sensitive information.
- The real celebrity often has no idea their name and likeness are being used until the scam goes viral.
Actress Helen Mirren recently issued a public warning after her image was used to promote a fake charity campaign. Each of these examples shows how scammers manipulate trust in famous faces to create a false sense of connection and urgency.
Why It Works:
Celebrity scams are powerful because they mix emotional appeal with technological realism.
Fans already feel connected to public figures. When a message sounds and looks exactly like someone they admire, skepticism fades. Add a personal touch like “I wanted to reach out to you” or “You’ve been selected for a private offer,” and even cautious people can fall for it.
Modern AI has also become so sophisticated that voice clones capture tone, pacing, and personality. Even professionals who work with these tools admit they sometimes can’t tell the difference.
Finally, these scams thrive on emotion—whether that’s excitement, admiration, or loneliness. Victims of romantic deepfake scams often describe feeling special or chosen, which makes it harder to question what’s happening.
Red Flags:
Be cautious if you notice any of the following:
- A “celebrity” contacts you directly through DMs or messaging apps like WhatsApp or Telegram.
- The conversation quickly moves off the platform where it started.
- The message includes links to unknown websites or online stores.
- You’re asked for money, cryptocurrency, or gift cards.
- The product or cause doesn’t appear on the celebrity’s verified social pages.
- Something feels slightly “off”—the background, speech pattern, or body language doesn’t quite match.
Quick tip: If a celebrity asks you to act—send money, buy something, or share personal information—pause and verify through their official accounts or press releases. Real endorsements rarely happen in private messages.
How to Protect Yourself:
- Check official channels. Always verify through the celebrity’s verified social media accounts or website before engaging.
- Don’t share personal details. Never send money, ID photos, or banking information in private messages.
- Be skeptical of “exclusive” offers. If it sounds like you’re being personally chosen, it’s probably a scam.
- Use secure payment methods. Credit cards offer protection that crypto and wire transfers do not.
- Talk about it. Share these risks with family members who might be more vulnerable to emotional manipulation.
- Report impersonations. Use the “report” feature on social platforms and file a complaint with the Federal Trade Commission at ReportFraud.ftc.gov.
If you’re a brand or public figure, consider setting up automated alerts for your name and image. This makes it easier to spot and remove fake content before it spreads widely.
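For those comfortable with a bit of scripting, monitoring can be as simple as polling an alert feed on a schedule. Here’s a minimal sketch in Python, assuming you’ve already created a Google Alert for your name with “RSS feed” selected as the delivery method; the feed URL below is a placeholder, and the feedparser library needs to be installed first.

```python
# Minimal name-monitoring sketch (assumption: you have created a
# Google Alert for your name and chosen "RSS feed" as the delivery
# method). Replace the placeholder URL with your own feed address.
import time

import feedparser  # pip install feedparser

FEED_URL = "https://www.google.com/alerts/feeds/YOUR_USER_ID/YOUR_ALERT_ID"
CHECK_INTERVAL = 15 * 60  # poll every 15 minutes

seen_links = set()  # remember mentions we have already reported

while True:
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        # Each entry is one new page mentioning your alert terms.
        if entry.link not in seen_links:
            seen_links.add(entry.link)
            print(f"New mention: {entry.title} -> {entry.link}")
    time.sleep(CHECK_INTERVAL)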
What to Do if You’re Targeted:
- Stop responding immediately and save all evidence such as screenshots or messages.
- Contact your bank or payment service to flag suspicious transfers.
- File a report with the FTC or your local consumer protection office.
- Monitor your financial accounts for unusual charges.
- Let others know. Sharing your experience can prevent someone else from becoming the next victim.
Final Thoughts:
The rise of AI-generated celebrity content is changing what we can trust online. It’s no longer enough to recognize a familiar face or voice. Today, anyone with a laptop and access to AI tools can create a realistic imitation capable of fooling millions.
Before you act on a celebrity endorsement or message, take a step back and check the source. Verification only takes a few minutes—and it can save you thousands of dollars.
Awareness, not fear, is our best defense.
Further Reading:
- Deepfake of General Hospital Star Steve Burton Scams Southern California Woman for $80K – Indiatimes
- Molly-Mae Hague Warns Fans After AI Perfume Video Scam – The Sun
- AI Used to Create Celebrity Deepfakes in Multi-Million Dollar Instagram Scam – NDTV
- Helen Mirren Warns Fans About Fake Charity Scams – People
- Michigan Attorney General: Rise in Celebrity Impersonation Scams – State Press Release