Tech & Science

Beware of voice cloning: The latest cybersecurity scam

Use two-factor authentication wherever possible. It provides an additional layer of security.

Connected to social media network. — © Digital Journal / File

One of the cybersecurity issues that has arisen recently, as artificial intelligence systems have advanced, is voice cloning. The advancement of technology has led to the creation of digital doppelgängers that can mimic voices with terrifying precision.

The technology can be used in various scams, fooling even close friends and family members into believing they are interacting with you. Deepfakes have become more sophisticated. In 2019, a deepfake video of Facebook CEO Mark Zuckerberg circulated on Instagram. It featured Zuckerberg making statements he never actually made, highlighting the capability of AI to fabricate believable content.

The company Geonode has provided advice to help readers prevent their voices from being cloned. The advice is:

Limit publicly available recordings

The more recordings of your voice that are available, the easier it is for scammers to create a convincing deep fake. Be mindful of what you’re sharing online, especially on social media. Limit public posts that include your voice and consider changing privacy settings to restrict who can access your content.

Use voice modulation apps

Consider using voice modulation apps when you need to share audio online. These tools distort your voice in a way that maintains your natural inflections while making it harder for AI algorithms to build a precise model of your voice.

Secure personal data

Always secure your personal data. Don’t share sensitive information like your phone number, address, or bank details publicly or on unsecured platforms. Scammers can use this information in combination with a voice deep fake to lend credibility to their schemes.

Two-factor authentication

Use two-factor authentication wherever possible. It provides an additional layer of security, making it difficult for scammers to break into your accounts even if they have managed to mimic your voice.

Education

Educate close friends and family members about the existence of voice deepfakes and how to recognize potential scams. They should be aware that unusual requests, especially those involving money, should be verified through other communication methods.

Safeword strategy

Create a safeword within your circle of friends and family, so that, if in doubt, you can ask for the safeword and catch out AI voice scammers.
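A safeword only works if it stays secret, so it should never be stored in plaintext on a phone or in a notes app. As a hypothetical sketch (the function names, salt, and iteration count are illustrative assumptions, not anything Geonode prescribes), an app could keep only a salted hash of the agreed word and compare candidates in constant time:

```python
import hashlib
import hmac

def store_safeword(word: str, salt: bytes) -> bytes:
    # Keep only a salted PBKDF2 hash of the agreed safeword,
    # normalised so "Bluebird " and "bluebird" match.
    return hashlib.pbkdf2_hmac(
        "sha256", word.strip().lower().encode(), salt, 100_000
    )

def check_safeword(candidate: str, salt: bytes, stored: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac(
        "sha256", candidate.strip().lower().encode(), salt, 100_000
    )
    # compare_digest avoids leaking information through timing.
    return hmac.compare_digest(digest, stored)
```

In practice the human version of this check is simpler still: if a caller claiming to be a relative cannot produce the safeword, hang up and call them back on a number you already have.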

Written By

Dr. Tim Sandle is Digital Journal's Editor-at-Large for science news. Tim specializes in science, technology, environmental, business, and health journalism. He is additionally a practising microbiologist and an author. He is also interested in history, politics and current affairs.
