A CYBERSECURITY expert has warned of artificial intelligence voice scams that can rob victims of their money.
Scammers are using AI technology to mimic people’s voices and deceive the victims’ friends and family.
This is dubbed AI voice cloning and is the latest in a line of AI-powered scams.
The voice cloning scam is a type of phishing attack: like other phishing schemes, it aims to steal a victim’s banking information, identity, or passwords.
“AI voice cloning is not only indistinguishable from human speech, it also allows for the creation of more convincing deepfakes, and a deluge of voice samples from public figures such as politicians and celebrities yields high-fidelity results,” Wasim Khaled, co-founder and CEO of Blackbird.AI, told The US Sun.
“This technology is now more accessible than ever — a quick search yields dozens of low-cost or free providers,” added Khaled.
Additionally, McAfee researchers have found that cybercriminals need as little as three seconds of a person’s voice to clone them.
After successfully cloning a voice, all a scammer needs to do is select a target and call their family, friends, or co-workers while impersonating them.
This is just one of the ways generative AI can be used against humanity.
“The greatest danger of generative AI is that it disrupts our understanding of what is real and what is fake, what is trustworthy and what is not,” Khaled said.
“Voice cloning, along with other rapidly growing commercially available generative AI capabilities, is another risk factor disrupting the information environment,” he added.
The cybersecurity expert believes voice clone detection technology is the best way to counter threat actors.
However, “accurate and reliable detection capabilities are unlikely to emerge in the short term,” he noted.
“In the future, voice clone detection technology could be used in a similar way to the ‘Scam Likely’ notifications now being provided by many major wireless carriers,” added Khaled.
For now, experts have shared warning signs to watch out for and tips on how to protect yourself from AI voice clone attacks.
One indicator of a scam is urgent language meant to pressure you into acting before you have time to think.
Another red flag is a caller who asks for money, goods, or financial support over the phone.
And keep in mind that even a voice that sounds convincingly real can be fake.
HOW TO STAY SAFE
First, never send money to someone with whom you have only communicated online or over the phone.
Be careful what information you share or post, as scammers could use it to target you.
Also, be wary of someone who tries to isolate you from friends and family, or who requests inappropriate photos or financial information, as these could later be used to blackmail you.