“I'm so sorry, Mom... I hit a woman. She's dead. You have to help me!” A message like this, spoken in your own child's voice, would send any parent into sheer terror. This is the dark heart of a new scam known as the “shock call”: with a convincingly faked voice that sounds exactly like a beloved family member's, criminals capitalize on that fear.

The Doppelganger Voice: How Fraudsters Clone the Voices of Innocents

Thanks to huge advances in artificial intelligence, it is now possible to clone a voice with frightening accuracy. All criminals need is a short voice sample from the victim - a quick call, a voice message or a video clip can be enough.

Services such as ElevenLabs, a US-based provider, can reproduce these voices with up to 95 percent accuracy. The entire process requires neither special technical knowledge nor much time: within minutes, a near-perfect copy of a voice can be created and exported as an audio file.

Increase in voice-cloning fraud cases

Reports of this type of fraud have increased over the past year. In the US, a case of “virtual kidnapping” made headlines: criminals used a cloned voice to trick a mother into believing her daughter had been kidnapped. We reported on this case HERE.

And these are not isolated cases: according to an online survey by software maker McAfee, 22 percent of Germans said that they or someone they knew had fallen victim to AI-based audio fraud.

Protection measures against clone-based shock calls

Although these types of attacks can be very scary, there are several strategies you can use to protect yourself:

  • Always remember: the police never call to ask for money.
  • Ask questions and keep the caller talking. This can often cause scammers to give up, as they usually have only a limited amount of cloned audio available.
  • End the conversation as quickly as possible and contact the supposedly affected family member directly.
  • Never hand over money or valuables to strangers.

Challenge for law enforcement authorities

Despite authorities' efforts to combat this type of fraud, tracking down the perpetrators is often difficult, especially when AI technology is involved. To make matters worse, reliably identifying a cloned voice as a fake is almost impossible.

Conclusion

Technology has done wonders in many areas of life, but as the saying goes, “There are two sides to every coin.” A grim example of how technological advances can be exploited for criminal purposes is the rise in cloned voice fraud cases. It is more important than ever that we realize that not everything we hear is the truth, even if it sounds like the voice of a loved one.

Help us make our online community safer! Download the attached sharepic and share it on your Facebook feed or stories. The more people we can reach with this, the safer our digital world will become. Always remember: think first, then click!

MIMIKAMA


If you enjoyed this post and value well-founded information, become part of the exclusive Mimikama Club! Support our work and help us promote awareness and combat misinformation. As a club member, you receive:

📬 Special Weekly Newsletter: Get exclusive content straight to your inbox.
🎥 Exclusive video* “Fact Checker Basic Course”: Learn from Andre Wolf how to recognize and combat misinformation.
📅 Early access to in-depth articles and fact checks: always be one step ahead.
📄 Bonus articles, just for you: Discover content you won't find anywhere else.
📝 Participation in webinars and workshops: Join us live or watch the recordings.
✔️ Quality exchange: Discuss safely in our comment function without trolls and bots.

Join us and become part of a community that stands for truth and clarity. Together we can make the world a little better!

* In this special course, Andre Wolf will teach you how to recognize and effectively combat misinformation. After completing the video, you have the opportunity to join our research team and actively take part in our educational work - an opportunity reserved exclusively for our club members!


Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created with machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)