Voice cloning scams: How AI is deceiving the US public
Scammers can use voice cloning tools to easily fool victims


Jun 12, 2023
02:55 pm

What's the story

Artificial intelligence (AI) is being touted as the next big thing in technology, with some saying it will be as big as the internet. However, AI's growing popularity also means easier access to powerful tools for unscrupulous elements of society, which is driving a rise in crime. The US is now dealing with a new breed of scam built on an AI technique called voice cloning.

Context

Why does this story matter?

AI naysayers have long worried about how the technology will empower fraudsters and scammers. Its sudden rise in popularity means the world isn't prepared to deal with its darker side. Lina Khan, the chair of the US Federal Trade Commission, recently said AI is being used to "turbocharge" fraud. It remains to be seen how authorities will deal with AI-related crimes.

Seeking help

People are getting calls from distressed loved ones

In the US, people are getting calls from loved ones seemingly in distress. A woman in Arizona received a call from her 15-year-old daughter, who was away on a skiing trip. The daughter was sobbing on the call. "Help me, mom, please help me," she said. Then a man took over the call and demanded a ransom of up to $1 million.

Fake

The daughter's voice was cloned using an AI tool

The woman had no doubt that the voice on the line was her daughter's. But since the call came from an unfamiliar number, she dialed her daughter directly. The daughter picked up, safe and sound, and the AI-powered scam fell apart. So how was the scam pulled off? A fraudster had used an AI voice cloning tool to mimic the daughter's voice.

Grandparents

Grandparents are being scammed by cloned voices of grandchildren

The AI-powered ruses in the US are not limited to fake kidnappings. Grandparents, for instance, are getting calls from 'grandchildren' who are in trouble and urgently need money to get out of it. Authorities call this the "grandparent scam." As in the case of the 'kidnapped' daughter, the cloned voices of the grandchildren were strikingly similar to the real thing.

Easy

Scammers can easily create convincing deepfakes with voice cloning

AI-based voice cloning tools are readily available on the internet, and many of them are free. With these tools, fraudsters can leave misleading voicemails and even alter their voices on live phone calls. Scammers need only a small sample of someone's voice to create a convincing deepfake, and they can reproduce different accents, genders, and even speech patterns.

Remedy

Most people are vulnerable to AI-powered scams

In a survey of 7,000 people, 70% said they might not be able to tell the difference between a cloned voice and the real thing. Almost anyone with an online presence is now a potential victim of an AI-powered crime, and these scams are growing faster than the means to prevent them. Victims are advised to report such cases to the authorities.