Tagged: AI

  • Geebo 9:00 am on March 6, 2023 Permalink | Reply
    Tags: AI

    AI voices used in grandparent scam

    By Greg Collier

    If you follow the tech news at all, you’ll no doubt have heard about how artificial intelligence (AI) has become increasingly popular in the past year or so. You may have heard of the art generator known as DALL-E, which can produce images from any prompt you give it. For example, the above picture was generated with an AI program called Stable Diffusion, using the prompt ‘AI Voice’. You may have also heard of ChatGPT, a text-based AI that can generate just about anything in text form. Do you want to craft a professional-sounding email to your boss? ChatGPT can generate that for you. Do you want ChatGPT to craft the lyrics of a song in the style of The Doors about soup? It can do that too.

    However, the more important question is this: is AI advanced enough to be used in scams? Yes, it is.

    This past weekend, The Washington Post published a story about AI being used in one of the more common scams we post about, the grandparent scam. For those who may be unfamiliar, the grandparent scam is a type of phone scam in which fraudsters impersonate a grandchild or other family member in distress to trick elderly individuals into sending them money. Typically, scammers tell their elderly victims they’ve had some kind of facial injury, such as a broken nose, to explain why their voice sounds different from their actual grandchild’s.

    According to the Post, scammers are now using AI voice-cloning technology to sound exactly like the person they’re impersonating. Victims from both Canada and the United States have lost thousands of dollars to scammers using this technology.

    While voice cloning technology is nothing new, it has advanced exponentially in the past couple of years. It used to be that someone needed vast amounts of recordings to accurately clone a voice. Now, it takes only a 30-second recording to do so. If someone you know has posted a video or recording of themselves talking on social media, their voice can now be cloned.

    You can still protect yourself from this scam, as long as you disregard what your ears are telling you. If you receive a call from a relative or loved one asking for money because they’re in trouble, you should still follow the same precautions, even if the caller sounds exactly like them. Hang up and contact the person who’s supposedly in trouble. If you can’t reach them, ask other family members who might know where they are. Tell them the exact situation you encountered, and never keep it a secret. Lastly, never send money by any means.

     
  • Geebo 8:00 am on May 28, 2019 Permalink | Reply
    Tags: AI

    Could Deep Fakes ruin our world?

    Recently, a video was distributed on social media of Speaker of the House Nancy Pelosi that made it look and sound like she was slurring her speech. While the video was determined to be fake, it was put out by someone who supposedly did not support Pelosi’s politics. However, with politics being what it is in this country today, there were people who believed it to be real. While this particular video was made using simple editing tricks on an actual video, it does raise the question of what will happen when ‘deep fakes’ become more prevalent in our society and media.

    For those of you unfamiliar with the term, a deep fake refers to a process where someone can take a single image or video and, using an AI-assisted program, turn the original into just about anything the fabricator wants. For example, if you wanted to make it look like a beloved celebrity said they enjoyed stealing candy from babies, you probably could. Now take that same process and imagine it being used against candidates running for President. Deep fakes could potentially make it look like any candidate was saying or doing something completely detrimental to their campaign.

    [youtube https://www.youtube.com/watch?v=gLoI9hAX9dw]

    Deep fakes could become so commonplace that we wouldn’t be able to tell what was real and what wasn’t. If someone were to commit a heinous act caught on video, all they would need to say is that the video was a deep fake, and scores of people would believe them. Thankfully, the technology is not yet at the point where a deep fake is indistinguishable from the real thing, but it may only be a matter of time before it is. When technologies are used by bad actors, it usually takes law enforcement and government some time to catch up and design the tools needed to fight them.

     