In this day and age, scams are sadly far too common. We all know someone – or have been that someone – who has been scammed by email, text or phone call. But what is the latest style of scam beginning to gain traction? AI voice scams.
Sophisticated voice-cloning technology is now being used to impersonate loved ones or public figures. Scarily, this requires just a few seconds of audio – often taken from social media – which scammers use to create a convincing clone of a voice and then request money or sensitive information.
While this scam is still new in Australia, experts are warning it’s quickly gaining momentum and could become increasingly common. Below are a few tips on how to keep yourself and your loved ones protected.
What to watch out for:
- Unusual or urgent requests – especially those filled with emotion or claiming to be from a friend or family member.
- Pressure to send money immediately, particularly through gift cards, cryptocurrencies, or direct transfers.
- Insistence on keeping the conversation secret, or discouraging you from confirming the story via other methods.
How to protect yourself if “someone” calls you:
- Verify the caller’s identity: Calls from unknown numbers should always be treated with caution.
- Listen carefully to the content – not just the voice: Pay attention to what’s actually being said, beyond how it sounds.
- Recognise the warning signs: AI-generated voices often stick to a script – they rely on rehearsed lines and are generally not good at natural conversation.
- Put them to the test: Ask questions or attempt to engage in unscripted dialogue – most AI clones cannot keep up with genuine conversation.
- Agree on a codeword with loved ones: Create a codeword that only you and those closest to you (kids, family, trusted close friends) know. Make sure everyone in your circle knows the codeword and uses it when in need or asking for help.
Protect your voice from being cloned:
- Manage the exposure of your voice: Refrain from sharing videos or voice notes on public platforms whenever possible.
- Secure your profiles: Adjust your social media privacy settings to limit who can access and download your audio content.
- Practice caution with smart devices: Turn off voice recording features on home assistants and smart speakers when not in use.
- Raise awareness in your household: Have open conversations about the risks of sharing voice clips online – especially with children and teens, who may be unaware of the hidden risks.
As AI technology advances, so too do the methods scammers use to target businesses and individuals. If you’d like to deepen your understanding of these emerging AI voice scams, you can find further insights in this article, which inspired this piece.