Tagged: deep fakes

  • Geebo 8:00 am on May 22, 2023 Permalink | Reply
    Tags: deep fakes

    AI scams aren’t limited to just voice 


    By Greg Collier

    AI voice spoofing scams are on the rise and have really grabbed our attention recently. This is when scammers take a sample of someone’s voice from online and run it through an AI program to make the voice say whatever they want. We see it mostly used in phone scams, where scammers need the victim to believe they’re talking to a loved one. With the advent of AI-generated voices, scammers have gone back into their bag of tricks to make an older scam even more convincing: the deepfake video.

    A deepfake video is a manipulated or synthesized video created using artificial intelligence techniques. The AI is used to manipulate or replace the appearance and actions of individuals in existing videos, making it appear as though someone said or did something they didn’t actually say or do. In the past, making the voice in a deepfake sound convincing required a great deal of voice sampling. Now, bad actors need only a few seconds of someone’s voice to produce a convincing clone.

    Recently, a man in Louisiana received a video that appeared to come from his brother-in-law. The video was received over Messenger, and the man’s brother-in-law said in the video that he needed $250 and couldn’t explain why, just that he was in trouble. The message also contained a link to a payment app account where the man could send the $250. The video disappeared from the message, but the link remained.

    Unfortunately for the scammers, they had sent their message to a police sergeant, who knew this was a scam. He called his brother-in-law, who was in no immediate danger.

    If you receive a phone call or instant message from a loved one asking for money, always verify their story before sending any funds. Even if it appears that it’s your loved one contacting you, verify the story. With advances in technology, you can’t believe your eyes or ears in situations like these.

  • Geebo 8:00 am on September 9, 2019 Permalink | Reply
    Tags: deep fakes

    Protect yourself against deepfake fraud 


    Last week, it was revealed that a German energy company doing business in the UK was conned out of more than $240,000. The scammers used a form of deepfake technology that mimicked the voice of the company’s CEO. A director of the company was instructed by the phony CEO, over both phone and email, to wire payment to an account in Hungary to avoid a late payment fine. Reports say that the director could not distinguish between the AI-assisted deepfake and the CEO’s actual voice, so the money was wired without question. The plot might never have been uncovered if not for the scammers’ greed.

    The scammers tried getting the director to wire more funds to another account. At this point, the director felt like something was up and called the CEO himself. While the director was on the phone with the real CEO, the scammers posing as the CEO called again. Unfortunately, by this time it was too late to do anything about the original payment. The funds had been scattered across the globe after being wired to the initial account, and no suspects have been named as of yet.

    [youtube https://www.youtube.com/watch?v=S1-HV031Fps]

    The Better Business Bureau (BBB) has some good news and bad news about deepfake audio. The bad news is that the technology is advancing at such a rapid pace that it may only be a matter of time before scammers need to keep you on the phone for just a minute to capture enough of your voice to make a deepfake of you. The good news is that companies can fight deepfakes by instilling a culture of security. The BBB suggests that companies confirm transactions like this by calling the person who supposedly requested the transaction directly.

  • Geebo 8:00 am on July 10, 2019 Permalink | Reply
    Tags: deep fakes, defense contractor

    These tech scams are frightening! 


    This week’s set of scams is incredibly troubling. Technology has advanced to a point where scams have become harder to spot. Not to mention that some of the tactics used by these scammers are like something out of a movie.

    The first scam is a little convoluted for something that doesn’t gain the scammers much. If you’re not familiar with Google Voice, it’s a service that provides you with a free supplementary phone number. Scammers are using Google Voice to hijack personal phone numbers that have been shared online. For example, if you’ve posted your phone number in a classified ad, the scammers will attempt to hijack that number. The scammers won’t be able to take any money from you, but they could use your number for criminal activity. If your number has been hijacked in one of these scams, this article has instructions on how to get your number back. Unfortunately, the steps won’t be easy.

    The next scam, while rare, is very disconcerting. Security firm Symantec says it has found a handful of scams in which scammers used deep fake audio of business executives to trick employees into transferring money to them. Deep fakes are AI-generated video or audio that can be hard to tell from the real thing. We’ve previously discussed the potential harm that deep fakes could cause here. Generating these deep fakes can cost thousands of dollars but could end up costing businesses untold losses in the future.

    [youtube https://www.youtube.com/watch?v=VnFC-s2nOtI]

    Our last scam for today is the most alarming. According to news site Quartz, a US military defense contractor was taken for $3 million in top-secret equipment by international con artists. All the scammers had to do was use an email address that looked similar to official military domains. This is basically the same phishing scam that’s used to try to steal your banking information, except a company with a high government security clearance fell for it to the tune of $3 million. Thankfully, the scammers were apprehended after federal investigators tracked them down through the mailing address they used, which they claimed was a military installation. Disturbingly, neither the Quartz article nor the legal documents Quartz obtained state whether or not the sensitive equipment was ever recovered.

  • Geebo 8:00 am on May 28, 2019 Permalink | Reply
    Tags: deep fakes

    Could Deep Fakes ruin our world? 


    Recently, a video was distributed on social media of Speaker of the House Nancy Pelosi that made it look and sound like she was slurring her speech. While the video was determined to be fake, it was put out by someone who supposedly did not support Pelosi’s politics. However, with politics being what it is in this country today, there were people who believed it to be real. While this particular video was made using simple editing tricks on an actual video, it does raise the question of what will happen when ‘Deep Fakes’ become more prevalent in our society and media.

    For those of you unfamiliar with the term, a Deep Fake refers to a process where someone can take a single image or video and, using an AI-assisted program, turn the original video into just about anything the fabricator wants. For example, if you wanted to make it look like a beloved celebrity said they enjoyed stealing candy from babies, you probably could. Now take that same process and imagine it being used against candidates running for President. Deep fakes could potentially be used to make it look like any candidate was saying or doing something completely detrimental to their campaign.

    [youtube https://www.youtube.com/watch?v=gLoI9hAX9dw]

    Deep fakes could become so commonplace that we wouldn’t be able to tell what was real and what wasn’t. If someone were to commit a heinous act caught on video, all they would need to say is that the video was a deep fake, and scores of people would believe them. Thankfully, the technology is not yet at the point where a deep fake is indistinguishable from the real thing, but it may only be a matter of time before it is. When technologies are used by bad actors, it usually takes law enforcement and government some time to catch up before designing the tools needed to fight them.
