AI Got Your Tongue?

April 2, 2024

Did you know that cybercriminals can clone real people’s voices…so you think you’re listening to somebody you know?

Voice cloning is absolutely real…and uses artificial intelligence to recreate a target’s voice by analyzing audio recordings of the real person.

In other words, you may think you’re talking to your boss or local politician or even your best friend…but it could really be a bunch of special software deepfaking their vocal likeness.

This could make any phishing scam seem a whole lot more plausible!

Hearing a familiar voice, especially from an authority figure or loved one, lowers your defenses and makes you more likely to believe what it says! Once a threat actor chooses a victim, they can pick somebody close to that victim and start hunting for clips of that person’s voice.

Where could someone find your voice online? Social media posts, voicemails, leaked recordings and even public speeches are all commonly posted online. Using these audio files as a basis, threat actors can then feed them into specialized software to create an auditory deepfake. By analyzing unique characteristics like pitch, cadence and pronunciation, the AI learns to replicate these patterns.

Once trained, the AI model can synthesize new speech based on the learned patterns. That is to say, criminals can make this stolen “voice” say things the original recordings never said. The criminal provides text for the clone to speak, and the AI generates audio that sounds remarkably similar to the target’s voice.

As you can imagine, this makes it much easier for cybercriminals to bypass your suspicions and manipulate you.

Threat actors can even script the cloned voice to sound distressed, panicked, or authoritative, creating a sense of urgency that pushes victims to act quickly without thinking critically. Talk about something out of a dystopian novel!

Some common voice cloning schemes include…

  • Impersonating CEOs or managers to trick employees into transferring funds or revealing sensitive information.
  • Fake friend or family emergencies that scam victims into sending money to supposedly help a loved one in trouble.
  • Virtual kidnapping uses a cloned voice to pretend that a loved one is in trouble, even mimicking their pleas for help.
  • Romance scams build rapport and trust with a victim over time using a cloned voice, so that it is less obvious that they are a catfish.

By cloning the voice of someone close to the victim, the scam becomes more targeted and therefore more believable. In light of this technology, it’s important to be wary of unsolicited calls or messages, even if they sound familiar.

Verify requests directly with the source through a separate, trusted channel, never just over the same phone or video call. Remember, even as AI voice cloning becomes more sophisticated, vigilance and healthy skepticism are your best defenses!

How can you stay safe from AI voice cloning scams?

  • Be wary of unsolicited calls or messages, even if they sound familiar.
  • Verify information directly with the source, not through the caller.
  • Never share personal information or send money based on phone calls or texts.
  • Be aware of the potential for AI voice cloning and stay informed about new scam tactics.

Meanwhile, the best way to stop your voice from being harvested for AI cloning is to avoid posting original audio online! When you do post audio, try to change your profile settings to limit who can view, interact with and download your posts. What threat actors can’t get, can’t come back to bite you!
