Phone scammers are using faked AI voices. Here’s how to protect yourself

DATE POSTED: August 27, 2024

Never before has it been easier to clone a human voice. New AI tools can take a short voice sample, process it, and then say anything in the voice of the original speaker. Voice cloning has been possible since at least 2018, but today's tools work faster, more accurately, and with far less effort.

Don’t believe it? OpenAI, the artificial intelligence company behind ChatGPT, presented a project this year that showed how it’s possible to clone a voice with nothing more than a 15-second recording.

OpenAI’s tool isn’t yet publicly available and it’s said to have security measures in place to prevent misuse. However, Eleven Labs offers a similar service — you can clone a voice from a one-minute audio sample for just $6 — and this one is available to everyone.

You can see how this might be abused. Scammers and fraudsters can use phone calls and videos uploaded to social media to obtain voice samples from almost anyone, then clone those voices for use in scams.

Eleven Labs offers to clone a voice for just $6. The result isn’t perfect, but it’s good enough to trick people. (Eleven Labs / IDG)

One rampant example is the grandparent scam: a scammer calls an elderly person and, using a cloned voice, pretends to be their grandchild urgently requesting money for some supposedly pressing reason.

Maybe they’ve “been in an accident” or they’ve “been arrested.” They’ll usually ask you to keep quiet and not tell anyone — especially their parents — about this phone call. After all, that could expose the call as fraudulent when you realize your actual grandchild is fine.

Anyone who isn’t aware that voice cloning is a thing could easily fall for such a scam. But even if you are aware, you could still be fooled if you aren’t paying careful attention. And even if you do suspect that something is off, would you want to be the one accusing your grandchild of being an imposter? They’d just deny it, anyway.

That’s the trouble with this scam, and plenty of people are falling for it every day. As AI-powered voice cloning grows more widespread, you can expect this scam to explode in popularity.

Here’s how to protect yourself: Hold a family meeting and agree on a code word to be used in emergencies. Then, if you ever get a call from a family member who desperately needs money, ask them to confirm the code word. If they can’t, you’re probably talking to a scammer.

Further reading: Beware of these new tricks being used by hackers