I’m a tech expert – I’ve figured out AI’s ‘greatest peril’ to humanity and it’s already happening

A CYBERSECURITY expert has warned of artificial intelligence voice scams that can rob victims of their money.

Using artificial intelligence technology, scammers have mimicked people’s voices to defraud the victims’ friends, families, and colleagues.

A cybersecurity expert has warned of voice fraud using artificial intelligence. Photo credit: Getty

This is dubbed AI voice cloning and is the latest in a line of AI-powered scams.

The voice cloning scam is a type of phishing attack that tricks people into revealing sensitive information.

As with other phishing attacks, the goal of a voice cloning scam is to steal a victim’s banking information, identity, or passwords.

“AI voice cloning is not only indistinguishable from human speech, it also allows for the creation of more convincing deep fakes, and a deluge of voice samples from public figures such as politicians and celebrities yields high-fidelity results,” Wasim Khaled, co-founder and CEO of Blackbird.AI, told The U.S. Sun.


“This technology is now more accessible than ever — a quick search yields dozens of low-cost or free providers,” added Khaled.

Additionally, McAfee researchers have found that cybercriminals need as little as three seconds of a person’s voice to clone them.

After successfully cloning, all a scammer needs to do is select a target and call their family, friends or co-workers to impersonate them.

This is just one of the ways generative AI can be used against humanity.

“The greatest danger of generative AI is that it disrupts our understanding of what is real and what is fake, what is trustworthy and what is not,” Khaled said.

“Voice cloning, along with other rapidly growing commercially available generative AI capabilities, is another risk factor disrupting the information environment,” he added.

The cybersecurity expert believes voice clone detection technology is the best way to counter threat actors.

However, “accurate and reliable detection capabilities are unlikely to emerge in the short term,” he noted.

“In the future, voice clone detection technology could be used in a similar way to the ‘Scam Likely’ notifications now being provided by many major wireless carriers,” added Khaled.

For now, experts have shared warning signs to watch out for and tips on how to protect yourself from AI voice clone attacks.


One indicator of a scam is urgent language intended to pressure you into acting quickly.

A caller who asks for money, goods, or financial support over the phone is another red flag.

Even if a voice recording sounds convincingly like the real person, it can be fake.


First, never send money to someone with whom you have only communicated online or over the phone.


Be careful what information you share or post, as scammers could use it to target you.

Also, be wary of someone who tries to isolate you from friends and family, or who requests inappropriate photos or financial information, as these could later be used to blackmail you.


Tara Subramaniam is a Dailynationtoday U.S. News Reporter based in London. Her focus is on U.S. politics and the environment. She has covered climate change extensively, as well as healthcare and crime. She joined Dailynationtoday in 2023 from the Daily Express and previously worked for Chemist and Druggist and the Jewish Chronicle. She is a graduate of Cambridge University. Languages: English. You can get in touch by emailing: tarasubramaniam@dailynationtoday.com.
