
New Alert: Criminals use AI and voice cloning to trick you out of your money. 

Earlier this year, Microsoft unveiled a new AI system that can replicate a person’s voice by analysing just three seconds of speech. Replicating such a vital part of someone’s identity so quickly showed how rapidly this technology can be deployed. 

In March, security concerns were raised when Australian journalist Nick Evershed revealed that an AI version of his voice could grant access to his Centrelink self-service account.

Voice cloning is not the only way scammers are exploiting AI; experts have observed a range of other methods as well. 

Centrelink and the Australian Tax Office (ATO) use “voiceprint” security systems that may be tricked. According to the investigation, these systems rely on the phrase “In Australia, my voice identifies me.”

Services Australia reported in its annual report for 2021-22 that voice biometrics had been used to verify more than 56,000 calls daily, accounting for 39% of calls to Centrelink’s primary business numbers. The report also described a voiceprint as being as secure as a fingerprint.

The ATO stated that it is not easy for someone to impersonate your voiceprint and gain access to your personal information.

Dr Lisa Given, a professor of information sciences at RMIT University, suggests that AI-generated voices have the potential to convince individuals they are conversing with someone familiar to them.

“If a system can accurately replicate my voice tone and emotions, scammers may start using voice messages instead of text to mimic someone’s voice and make a convincing message,” Dr Given said.

Last month, the US Federal Trade Commission warned consumers about fake family-emergency calls that use AI-generated voice clones. The FBI has also issued warnings about scams involving virtual kidnappings.

According to Mark Gorrie, the Managing Director for Asia Pacific at Gen Digital, a cyber security software company, AI voice generators will improve in deceiving people and security systems.

It is essential to be aware of the potential risks of AI-generated voices and other scams and take appropriate measures to protect yourself. Taking steps such as verifying information obtained over the phone with a trusted source, being mindful when giving out personal information and using strong passwords are just some ways to stay one step ahead of scammers.
