The advances in artificial intelligence (AI) are as impressive as they are frightening. A notable example is the development of audio deepfakes, in which computer systems can imitate a human voice down to the smallest detail. This technology has led to an increase in AI-powered fraud attempts, including a new variant of the infamous “grandchild trick”.
The “grandchild trick 2.0”: deepfake audio fraud
In the so-called “grandchild trick,” the fraudster poses as the victim's child or grandchild and fakes a crisis situation in order to obtain money. This trick is becoming increasingly dangerous now that fraudsters use artificial intelligence specialized in imitating human voices.
How does it work?
Some of these AI systems need as little as 30 minutes of audio to create a complete voice profile of a person. Using generative AI, they can then mimic that person's voice down to the smallest detail. This makes the fraud shockingly credible.
Social media as a source for audio material
Another worrying aspect of this development is that the necessary audio material is now easier to obtain than ever. Social media platforms are a treasure trove of recordings that can be used to create deepfake voice profiles. Whether podcasts, live videos, or simple voice messages: the sheer amount of audio content we produce and share every day makes it easy for fraudsters to gather enough material for their purposes.
How can we protect ourselves?
It is important that we are all aware that audio deepfakes exist and that they pose a real danger. Be careful with the information you share on social media. Be wary of calls from “family members” asking for financial support, and call back on a number you already know rather than the number that called you.
Technology can help us in many ways, but we also need to learn to view it critically and protect ourselves from its potential dangers, including scams like this one.
Here are some tips to protect yourself from audio deepfake scams:
- Verbal code word: Set a verbal code word with your children, family members or close friends. Make sure that it is only known to you and those closest to you.
- Always question the source: Even if the message comes from a number you recognize, stop and think. Do the voice and way of speaking really sound like the person supposedly contacting you?
- Think before you click and share: Who is part of your social media network? How well do you really know these people, and how much do you trust them? Be careful about the friends and connections you accept online.
- Protect your identity: Services that monitor your identity can alert you if your personal information appears on the dark web and provide guidance on how to protect it.
Conclusion
The world of audio deepfakes powerfully illustrates the double-edged nature of technology: it has the potential to support us in many areas, but it can also be misused for harmful purposes. We must become aware of this, remain vigilant, and actively protect ourselves.
The full study and further details can be found here:
Artificial Imposters—Cybercriminals Turn to AI Voice Cloning for a New Breed of Scam

