Deepfake AI scam calls: How to protect yourself
With advancements in generative AI, impersonation scam calls have become a growing concern. Scammers can now use AI to generate an audio deepfake, a realistic copy of your voice that can be made to sound as if you are in distress, and use it to trick your family members into sending money. Here's how to safeguard yourself and your loved ones from such calls.
Why does this story matter?
Generative AI is gradually becoming a part of our daily lives and changing how we interact with technology. However, not all of its uses are beneficial. As the technology becomes more prevalent, bad actors have begun misusing it, for instance by using deepfake algorithms to replicate your voice, make "you" say whatever they want, and ultimately steal money from the people close to you.
How does it work?
With an audio sample of just a few sentences, scammers can replicate a voice and make it "say" whatever they want. The task can easily be carried out using a slew of cheap tools available online. Anyone in possession of your audio recordings could use deepfake algorithms to create a realistic copy of your voice.
Sound replicas can be made from social media updates
To create a realistic copy of their target's voice, scammers only need audio data to train the algorithm. Many of us post updates about our daily lives on social media, which means the audio needed to create a convincing voice copy can easily be pulled from the web. The larger the amount of data, the better and more convincing the copy.
Voice-generating software analyzes several elements of sound bites
AI voice-generating tools examine what distinguishes a person's voice, including age, gender, and accent, then search a massive database of voices to find similar ones and predict patterns. After that, they recreate the pitch, timbre, and individual sounds of the person's voice for replication. The tools need only short audio samples, which scammers pull from TV commercials, podcasts, TikTok, Facebook, or Instagram.
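To make the idea of "analyzing pitch and timbre" concrete, here is a minimal, illustrative Python sketch that measures those two properties from a short voice clip. It is not any scammer's actual tool; it assumes a local file named "voice_sample.wav" and the librosa and numpy packages being installed.

```python
# Illustrative sketch: measure pitch and timbre from a short voice clip.
# Assumes "voice_sample.wav" exists locally and librosa/numpy are installed.
import librosa
import numpy as np

# Load the clip at its native sampling rate
y, sr = librosa.load("voice_sample.wav", sr=None)

# Estimate the fundamental frequency (the pitch contour) of the voice
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)
print("Average pitch (Hz):", np.nanmean(f0))

# MFCCs summarize timbre: the spectral "texture" that makes a voice recognizable
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
print("Timbre summary (13 MFCC means):", mfcc.mean(axis=1))
```

Features like these are what cloning systems compare against their voice databases; a few seconds of clean audio is often enough to compute them.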
Scammers can impersonate your loved ones
Anyone with access to your audio data could use a deepfake algorithm to make "you" say whatever they want. It's as simple as typing some text and having the computer read it aloud in what appears to be your voice. A scammer can pose as someone trustworthy, such as a child, parent, or friend, and persuade the victim to send money because that loved one is supposedly in "trouble."
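The "type text, hear it spoken" step really is that simple. The sketch below uses the offline pyttsx3 library with the system's stock voices; it does not clone anyone's real voice, but it shows how little effort the text-to-speech part takes (assuming pyttsx3 is installed and a speech engine is available on the system).

```python
# Illustrative sketch: ordinary text-to-speech with the offline pyttsx3 library.
# Uses the system's built-in voices only; no voice cloning involved.
import pyttsx3

engine = pyttsx3.init()           # initialize the default TTS engine
engine.setProperty("rate", 160)   # speaking speed in words per minute
engine.say("Hi, it's me. I'm in trouble and need your help.")
engine.runAndWait()               # block until the speech finishes
```

Deepfake tools replace the stock voice with a model trained on the target's recordings, which is what makes the output sound like a specific person.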
The victims are often elderly people
Generative AI has made it easier for con artists to imitate voices and convince people, primarily the elderly, that their loved ones are in distress. Imagine receiving a call from someone who sounds exactly like a friend or family member and appears to be in danger; it adds a whole new level of confusion and panic for the unfortunate recipient.
Be cautious when you receive calls from unknown numbers
The most common tactic in AI scam calls is to dupe victims into paying a ransom to save a loved one they believe is in danger. If you get a call from an unknown number in which the caller sounds exactly like a family member and asks for money or makes unusual requests, hang up, then call or text that person on their known number to verify. Be skeptical.