Do you really know who’s calling you? This question has taken on a whole new urgency with the rise of AI technology. I recently came across an alarming story that shows how criminals are using AI to clone the voices of loved ones with scary believability, all to trick people into handing over money.
How AI voice cloning turned a routine call into a nightmare
September 30th started off as a normal day for Olivia Kalescky in South Carolina. Her phone buzzed, and the caller ID showed her sister Cassie’s name and picture, a routine moment we can all relate to. But this call wasn’t from Cassie. Olivia described hearing whimpering, crying, and even her sister pleading, “Help me, please.” The voice? 100% Cassie’s, or so it seemed.
What Olivia was experiencing was a high-tech scam powered by AI voice cloning. Retired FBI agent Doug Kouns, who now heads a global intelligence firm, explains that scammers harvest voice samples from social media or previous calls to create fake audio that’s almost impossible to distinguish from the real thing. “It’s a whole new level with artificial intelligence,” he says.
The chilling demand and emotional turmoil
The scam escalated quickly. A man’s voice took over the call, claiming he was holding Olivia’s sister at gunpoint. The pressure to pay up was real and terrifying: “If you hang up or call the police, I’m putting a bullet in her head,” Olivia was told. The man demanded cash payments through a mobile app, but Olivia desperately stalled, even offering alternative payment methods while covertly texting for help.
The emotional weight of the situation was crushing. Olivia’s reaction? Distraught but trying to stay calm. The scammer’s anger intensified when Olivia couldn’t comply quickly enough. This incident is not just an alarming tale but a warning about just how convincing AI-driven scams have become.
What can you do to protect yourself and your family?
Stories like Olivia’s make it clear that we can no longer rely on caller ID, or even on a familiar voice, to verify who’s really on the other end of the line. According to cybersecurity experts, simple safeguards can make a huge difference. Here are some practical tips to keep you safe:
- If a family member calls asking for urgent help, send them a text at their usual number asking if it’s really them.
- Create a secret family code word to use in emergencies that only you and your close relatives know.
- Be skeptical of any call demanding immediate payment or threatening harm — especially if they pressure you to use quick-money apps or services.
What makes these scams so terrifying is how AI blurs the line between reality and deception. When you can no longer trust what you hear, it puts everyone in a tough spot, just like Olivia experienced firsthand.
When you can’t believe what you see and hear, where does that leave us?
Staying vigilant and adopting new verification habits could be crucial as this type of AI scam continues to evolve. At its core, this is a stark reminder that technology, while incredible, also raises the stakes for how criminals operate — and how we protect ourselves in an increasingly digital world.
If you ever receive a suspicious call that feels off, trust your instincts. A moment of caution and a quick check might just save you from falling victim to these sophisticated schemes.