Tagged: deep fakes

  • Geebo 8:00 am on October 16, 2024
    Tags: artificial intelligence, deep fakes

    How AI is Fueling a New Wave of Online Scams

    By Greg Collier

    With the rise of artificial intelligence (AI), the internet has become a more treacherous landscape for unsuspecting users. Once, the adage “seeing is believing” held weight. Today, however, scammers can create highly realistic images and videos that deceive even the most cautious among us. Advances in AI have made it easier for fraudsters to craft convincing scenarios that prey on emotions, tricking people into parting with their money or personal information.

    One common tactic involves generating images of distressed animals or children. These fabricated images often accompany stories of emergencies or tragedies, urging people to click links to donate or provide personal details. The emotional weight of these images makes them highly effective, triggering a quick, compassionate response. Unfortunately, the results are predictable: stolen personal information or exposure to harmful malware. Social media users must be on high alert, as the Better Business Bureau warns against clicking unfamiliar links, especially when encountering images meant to elicit an emotional reaction.

    Identifying AI-generated content has become a key skill in avoiding these scams. When encountering images, it’s essential to look for subtle signs that something isn’t right. AI-generated images often exhibit flaws that betray their synthetic nature. Zooming in on these images can reveal strange details such as blurring around certain elements, disproportionate body parts, or even extra fingers on hands. Other giveaways include glossy, airbrushed textures and unnatural lighting. These telltale signs, though subtle, can help distinguish AI-generated images from genuine ones.
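    Some AI image generators also leave traces in a file’s metadata, which offers a complementary, technical check. The following sketch is a minimal example, assuming Python with the Pillow library installed; the list of marker strings is illustrative rather than exhaustive, and a clean result does not prove an image is genuine.

        # pip install Pillow
        from PIL import Image

        # Illustrative generator signatures; real-world markers vary widely.
        AI_MARKERS = ["stable diffusion", "midjourney", "dall-e", "firefly"]

        def check_image_metadata(path: str) -> list[str]:
            """Return metadata snippets hinting an image may be AI-generated."""
            img = Image.open(path)
            # The EXIF "Software" tag (0x0131) sometimes names the generating
            # tool; PNG text chunks (e.g. a "parameters" prompt dump) land in
            # img.info.
            candidates = [str(img.getexif().get(0x0131, ""))]
            candidates += [f"{key}={value}" for key, value in img.info.items()]
            return [text[:120] for text in candidates
                    if any(marker in text.lower() for marker in AI_MARKERS)]

        if __name__ == "__main__":
            # Placeholder file name for a downloaded image.
            print("Possible AI markers:", check_image_metadata("photo.png") or "none")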

    The same principles apply to videos. Deepfake technology allows scammers to create videos that feature manipulated versions of public figures or loved ones in fabricated scenarios. Unnatural body language, strange shadows, and choppy audio can all indicate that the video isn’t real.

    One particularly concerning trend involves scammers using AI to create fake emergency scenarios. A family member might receive a video call or a voice message that appears to be from a loved one in distress, asking for money or help. But even though the voice and face may seem familiar, the message is an illusion, generated by AI to exploit trust and fear. The sophistication of this technology makes these scams harder to detect, but the key is context. Urgency, emotional manipulation, and unexpected requests for money are red flags. It’s always important to verify the authenticity of the situation by contacting the person directly through trusted methods.

    Reverse image searches can be useful for confirming whether a photo has been used elsewhere on the web. By doing this, users can trace images back to their original sources and determine whether they’ve been manipulated. Similarly, checking whether a story has been reported by credible news outlets can help discern the truth. If an image or video seems too shocking or unbelievable and hasn’t been covered by mainstream media, it’s likely fake.
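    Neither Google nor TinEye offers a free official API for this, but the lookup can be scripted as far as opening the right query page. The sketch below is a minimal example using only Python’s standard library; the endpoint URL formats are those observed at the time of writing and could change, and the image URL is a placeholder.

        import webbrowser
        from urllib.parse import quote

        def reverse_image_search(image_url: str) -> None:
            """Open reverse image searches for a publicly hosted image."""
            encoded = quote(image_url, safe="")
            # Endpoint formats are assumptions and may change over time.
            webbrowser.open(f"https://lens.google.com/uploadbyurl?url={encoded}")
            webbrowser.open(f"https://tineye.com/search?url={encoded}")

        reverse_image_search("https://example.com/suspicious-photo.jpg")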

    As AI technology continues to evolve, scammers will only refine their methods. The challenge of spotting fakes will become more difficult, and even sophisticated consumers may find themselves second-guessing what they see. Being suspicious and fact-checking are more important than ever. By recognizing the tactics scammers use and understanding how to spot AI-generated content, internet users can better protect themselves in this new digital landscape.

     
  • Geebo 8:00 am on September 11, 2024
    Tags: deep fakes

    Crypto Scammers Exploit Apple’s iPhone 16 Event 

    By Greg Collier

    On September 9, as Apple enthusiasts eagerly tuned in for the launch of the iPhone 16 during the ‘Glowtime’ event, scammers took advantage of the hype by launching an elaborate crypto scam. Using deepfake technology, the scammers created videos that impersonated Apple CEO Tim Cook, promoting fraudulent cryptocurrency giveaways and investment schemes. These videos, which were posted on YouTube, lured unsuspecting viewers into participating in crypto transactions by flashing QR codes on the screen. Viewers were asked to send their cryptocurrency to fake websites that closely resembled Apple’s official site.
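    One practical countermeasure is to decode any on-screen QR code before trusting it and check where it actually points. The sketch below is a minimal illustration, assuming Python with OpenCV installed; the screenshot file name is a placeholder, and ‘apple.com’ simply stands in for whatever official domain you expect.

        # pip install opencv-python
        import cv2
        from urllib.parse import urlparse

        def qr_target_is_trusted(screenshot_path: str, trusted: set[str]) -> bool:
            """Decode a QR code from an image and verify its target hostname."""
            img = cv2.imread(screenshot_path)
            if img is None:
                raise FileNotFoundError(screenshot_path)
            data, _, _ = cv2.QRCodeDetector().detectAndDecode(img)
            if not data:
                print("No QR code found.")
                return False
            host = urlparse(data).hostname or ""
            # Accept the exact domain or a genuine subdomain of it.
            ok = any(host == d or host.endswith("." + d) for d in trusted)
            print(f"QR code resolves to {host!r}: {'trusted' if ok else 'UNTRUSTED'}")
            return ok

        qr_target_is_trusted("stream_screenshot.png", {"apple.com"})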

    This isn’t the first time scammers have deployed deepfake technology to impersonate prominent figures. Earlier this year, similar tactics were used to imitate Elon Musk, spreading false crypto giveaways.

    The deepfake videos on YouTube managed to garner thousands of views before being taken down, but not before several people flagged them on platforms like X (formerly Twitter). Social media users expressed concern over the growing misuse of advanced technologies like artificial intelligence (AI) for fraudulent purposes.

    This event serves as a chilling reminder of the increasing sophistication of cryptocurrency scams. With cryptocurrencies’ volatile nature and the difficulty of tracing transactions, hackers have found fertile ground for fraud. The FBI has warned that crypto scammers are using ever more advanced techniques, with members of the crypto community losing over $5.6 billion to such scams last year.

    It’s important to remember that celebrity endorsements of cryptocurrency schemes are usually fake. Scammers often exploit the likeness and voices of well-known figures, like Tim Cook or Elon Musk, to create a false sense of trust and credibility. These endorsements are rarely, if ever, legitimate. Instead, they are sophisticated traps designed to manipulate and deceive people into investing in fraudulent schemes. When it comes to crypto, always exercise caution and verify information through trusted sources before making any transactions. If something seems too good to be true, it probably is.

     
  • Geebo 8:00 am on May 22, 2023
    Tags: deep fakes

    AI scams aren’t limited to just voice

    By Greg Collier

    AI voice spoofing scams are on the rise and have grabbed our attention recently. This is when scammers take a sample of someone’s voice from online and run it through an AI program to make the voice say whatever they want. It appears mostly in phone scams, where the scammers need the victim to believe they’re talking to a loved one. With the advent of AI-generated voices, scammers have gone back into their bag of tricks to make an older scam even more convincing: the deep fake video.

    A deepfake video is a manipulated or synthesized video created using artificial intelligence. The AI is used to manipulate or replace the appearance and actions of individuals in existing footage, making it appear as though someone said or did something they never actually said or did. In the past, making the voice in a deep fake sound convincing required extensive voice sampling; now, bad actors need only a few seconds of someone’s voice to produce a convincing clone.

    Recently, a man in Louisiana received a video over Messenger that appeared to come from his brother-in-law. In the video, the brother-in-law said he needed $250 and couldn’t explain why, only that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

    Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

    If you receive a phone call or instant message from a loved one asking for money, always verify their story before sending any funds, even when it genuinely appears to be that person contacting you. With advances in technology, you can’t believe your eyes or ears in situations like these.

     
  • Geebo 8:00 am on September 9, 2019
    Tags: deep fakes

    Protect yourself against deepfake fraud

    Last week, it was revealed that a German energy company doing business in the UK was conned out of more than $240,000. The scammers used a form of deepfake technology that mimicked the voice of the company’s CEO. A director of the company was instructed by the phony CEO, over both phone and email, to wire payment to an account in Hungary to avoid a late payment fine. Reports say the director could not distinguish the AI-assisted deepfake from the CEO’s actual voice, so the money was wired without question. The plot might never have been uncovered if it weren’t for the scammers’ greed.

    The scammers then tried getting the director to wire more funds to another account. This time, the director felt something was off and called the CEO directly. While the two were on the phone, the scammers, still posing as the CEO, called the director again. Unfortunately, by then it was too late to do anything about the original payment. The funds had been scattered across the globe after being wired to the initial account, and no suspects have been named as of yet.

    [youtube https://www.youtube.com/watch?v=S1-HV031Fps]

    The Better Business Bureau (BBB) has some good news and bad news about deepfake audio. The bad news is that the technology is advancing so rapidly that it may soon take only a minute on the phone for scammers to capture enough of your voice to make a deepfake of you. The good news is that companies can fight deepfakes by instilling a culture of security. The BBB suggests that companies confirm transactions like these by directly calling the person who supposedly requested them.

     
  • Geebo 8:00 am on July 10, 2019
    Tags: deep fakes, defense contractor

    These tech scams are frightening!

    This week’s set of scams is incredibly troubling. Technology has advanced to a point where scams have become harder to spot, and some of the tactics used by these scammers are like something out of a movie.

    The first scam seems convoluted for something that yields the scammers relatively little. If you’re not familiar with Google Voice, it’s a service that provides you with a free supplementary phone number. Scammers are using Google Voice to hijack personal phone numbers that have been shared online. For example, if you’ve posted your phone number in a classified ad, the scammers will attempt to hijack that number. They won’t be able to take any money from you, but they could use your number for criminal activity. If your number has been hijacked in one of these scams, this article has instructions on how to get your number back. Unfortunately, the steps won’t be that easy.

    The next scam, while rare, is very disconcerting. Security firm Symantec says it has found a handful of scams in which fraudsters used deep fake audio of business executives to trick employees into transferring money. Deep fakes are AI-generated video or audio that can be hard to tell from the real thing; we’ve previously discussed the potential harm they could cause here. Generating these deep fakes can cost thousands of dollars, but they could end up costing businesses untold losses in the future.

    [youtube https://www.youtube.com/watch?v=VnFC-s2nOtI]

    Our last scam for today is the most alarming. According to news site Quartz, a US military defense contractor was taken for $3 million in top-secret equipment by international con artists. All the scammers had to do was use an email address that looked similar to official military domains. This is essentially the same phishing scam used to try to steal your banking information, except a company with a high government security clearance fell for it to the tune of $3 million. Thankfully, the scammers were apprehended after federal investigators tracked them down through the mailing address they used, which they had claimed was a military installation. Disturbingly, neither the Quartz article nor the legal documents Quartz obtained state whether the sensitive equipment was ever recovered.
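    A basic defense against this kind of lookalike domain can even be automated: compare each sender’s domain against the domains you legitimately deal with and flag near-misses. The sketch below is a minimal illustration using only Python’s standard library; the trusted domains and addresses are invented for the example, and real mail filtering involves much more than string similarity.

        from difflib import SequenceMatcher

        # Invented examples of domains an organization actually trusts.
        TRUSTED_DOMAINS = {"mail.mil", "example-supplier.com"}

        def flag_lookalike(sender: str, threshold: float = 0.8) -> str | None:
            """Warn when a sender's domain nearly matches a trusted one."""
            domain = sender.rsplit("@", 1)[-1].lower()
            if domain in TRUSTED_DOMAINS:
                return None  # exact match: the domain itself checks out
            for trusted in TRUSTED_DOMAINS:
                if SequenceMatcher(None, domain, trusted).ratio() >= threshold:
                    return f"'{domain}' looks deceptively similar to '{trusted}'"
            return None

        print(flag_lookalike("procurement@rnail.mil"))       # 'rn' mimics 'm'
        print(flag_lookalike("orders@example-supplier.com")) # exact match -> None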

     
  • Geebo 8:00 am on May 28, 2019
    Tags: deep fakes

    Could Deep Fakes ruin our world?

    Recently, a video was distributed on social media of Speaker of the House Nancy Pelosi that made it look and sound like she was slurring her speech. The video was determined to be fake and was put out by someone who apparently did not support Pelosi’s politics. However, with politics in this country being what it is today, there were people who believed it to be real. While this particular video was made using simple editing tricks on genuine footage, it raises the question of what will happen when ‘Deep Fakes’ become more prevalent in our society and media.

    For those unfamiliar with the term, a Deep Fake is produced by taking a single image or video and running it through an AI-assisted program that can turn the original into just about anything the fabricator wants. For example, if you wanted to make it look like a beloved celebrity said they enjoyed stealing candy from babies, you probably could. Now imagine that same process used against candidates running for President. Deep fakes could potentially make any candidate appear to say or do something completely detrimental to their campaign.

    [youtube https://www.youtube.com/watch?v=gLoI9hAX9dw]

    Deep fakes could become so commonplace that we wouldn’t be able to tell what was real and what wasn’t. If someone were to commit a heinous act caught on video, all they would need to say is that the video was a deep fake, and scores of people would believe them. Thankfully, the technology is not yet at the point where a deep fake is indistinguishable from the real thing, but it may only be a matter of time before it is. When technologies are used by bad actors, it usually takes law enforcement and government some time to catch up and design the tools needed to fight them.

     