Beware of scammers using AI to increase the sophistication of their attacks

Scammers can mimic a voice, but they can’t copy your loved one’s memories or personality.

Image: Andrew Caballero-Reynolds, AFP/Getty Images

For many years, scammers have tricked people into sending money or sharing personal information by pretending that a loved one is in trouble. Now, with AI in the mix, these calls are becoming both riskier and more convincing.

Scammers need only a short audio clip of a family member’s voice, which they can easily find on social media or elsewhere online. They then use an AI program to clone the voice from the clip and make it say whatever they want. They can even add emotions, such as laughter or fear, to make it sound real.

Reflecting such concerns, and as the prevalence of these incidents rises, Google searches for “voice cloning” and “voice cloning AI” have increased by 25% and 136% this year, respectively.

Manas Chowdhury, VP of cloud security company AccuKnox, has provided Digital Journal with some advice on how to spot AI voice cloning scams.

How to spot AI voice cloning scams

According to Chowdhury:

You receive a call from an unknown number

AI voice scams often begin with an unexpected call from a number you don’t know, frequently from a different country.

You only hear your loved one’s voice briefly

Scammers know that the longer they use a fake voice, the more likely you are to figure it out.

Chowdhury says: “Usually, you’ll only hear your family member or friend briefly, often sounding distressed or crying, explaining the situation with phrases like ‘I’m in trouble’ or ‘I messed up.’”

They can’t answer easy questions

Scammers can mimic a voice, but they can’t copy your loved one’s memories or personality. If you ask them simple questions and they can’t give clear answers, you should immediately hang up.

Another person quickly takes control of the call

Scammers often begin by using a fake voice, then pass the phone to someone pretending to be a kidnapper, lawyer, or cop.

They tell you to pay via crypto or gift cards

Scammers choose payment methods that are hard to trace, so police and victims can’t recover funds or track them down.

Chowdhury moves on to consider how people can protect themselves against AI-generated voice scams.

Don’t respond to calls from unknown numbers

Chowdhury recommends: “While we’re used to saying ‘hi’ or ‘hello’ when we pick up the phone, it’s a good idea to wait for the other person to speak first. Asking ‘Who is this?’ or ‘Who are you?’ repeatedly can give scammers material for training AI voice cloning software.”

Beware of red flags

Scammers trick people into sending money in ways that make it hard to get it back. Red flags include requests to wire money, purchase gift cards, or use cryptocurrency.

Chowdhury offers an example: “If someone asks for a lot of money, pause, ask why they need it, tell them you need time to think, hang up, and check their identity before doing anything.”

Don’t share personal information with unknown callers

Never share sensitive information such as PINs or one-time passwords over the phone unless you’re sure it’s a verified contact.

Chowdhury advises: “Remember, banks and government organisations don’t ask for your account info over the phone or by text. Even if the voice sounds familiar, it’s safer to verify in person or through another call.”

Create a family “safe word” to use on the phone

At a time when scammers can mimic your voice, it’s a good idea to have a safe word known only to your loved ones. In an emergency, using the family safe word can help prove it’s really you.

Chowdhury suggests: “You can use a password that is special to your family, like a favourite memory, known only to you and your loved ones.”

Be aware of what you share on social media

Your social media shows your life to friends and family, but it can also be mined by anyone looking to exploit you or your loved ones.

Therefore, Chowdhury proposes: “It’s important to avoid posting long videos of you or your loved ones talking on camera. Also, use privacy settings on your account and check a follower’s details before letting them see your feed.”

Written By

Dr. Tim Sandle is Digital Journal's Editor-at-Large for science news. Tim specializes in science, technology, environmental, business, and health journalism. He is additionally a practising microbiologist and an author. He is also interested in history, politics and current affairs.
