Deep Fakes – This term refers to fake videos in which people appear to do or say things they never actually did or said.
A tool called “FakeApp” has been circulating on the Internet for some time. A word of warning to curious readers: be careful when downloading it, as many virus-infected versions are floating around the Internet! With this program, provided you have a powerful PC and plenty of time, it is possible to create fake videos whose quality can hardly be distinguished from real footage. Only the faces in the target videos are swapped. The “Deep” in “Deep Fake” stands for “deep learning”: the software uses a neural network trained on photos to compute new facial expressions for the inserted face.
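FakeApp's exact internals are not public, but face-swap tools of this kind are commonly described as autoencoders with one shared encoder and a separate decoder per person: the encoder learns a representation of expression and pose, and the "wrong" decoder renders it with the other person's face. As a rough, untrained illustration only (toy-sized linear maps; all names and dimensions here are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT = 8     # size of the shared "face code" (expression, pose, ...)
PIXELS = 64    # a flattened toy face crop

# One shared encoder learns features common to both faces...
encoder = rng.normal(size=(LATENT, PIXELS)) * 0.1
# ...while each person gets their own decoder.
decoder_a = rng.normal(size=(PIXELS, LATENT)) * 0.1
decoder_b = rng.normal(size=(PIXELS, LATENT)) * 0.1

def encode(face):
    # Map a face crop to the shared latent code.
    return encoder @ face

def swap_a_to_b(face_a):
    """The swap trick: encode person A's expression,
    then reconstruct it with person B's decoder."""
    return decoder_b @ encode(face_a)

face_a = rng.normal(size=PIXELS)   # stand-in for one video frame of person A
fake_frame = swap_a_to_b(face_a)   # person B "performing" A's expression
```

In a real tool the encoder and decoders are deep convolutional networks trained on hundreds of photos of each person, which is exactly why so much training material and GPU time is needed.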
Can an app really do that?
After remaining relatively unknown to ordinary Internet users, the app received media attention when Bayerischer Rundfunk reported on its possibilities at the beginning of August and created a fake video featuring Angela Merkel. Since its introduction, the app has become very popular with a certain group of users, for example for creating fake porn featuring celebrities.
Is this app legal?
“FakeApp” is like “Photoshop”: the programs themselves are legal. Of course, the more often a program is used for abusive purposes, the worse its reputation becomes. But while Photoshop is relatively expensive and creating fakes is not its primary function, FakeApp is free and specifically designed to create fake videos.
Is this a new method?
New only in the sense that this app makes the technology available to private users. “Face swapping” has long been used in Hollywood; a prominent example is the young Princess Leia Organa in “Rogue One: A Star Wars Story”.

Can anyone do this?
In principle, yes. But there are a few requirements: the PC must run Windows, the graphics card should have at least 6 GB of memory, and ideally you should have several hundred photos of the person whose face you want to project into a video. It also takes time: depending on the PC's performance, it can take several hours (poor quality) to days until such a video is credibly rendered. In addition, the face in the original video and the inserted face should bear a certain resemblance to each other.
Blackmailers use fear
If you have only heard or read about “deep fakes” superficially, you might get the idea that such videos can easily be created with anyone. Blackmailers are now taking advantage of this ignorance, as the Thuringian police report:
“On August 18, 2018, a complainant appeared at the police station in Meiningen to report extortion. He had previously accepted a friend request from a stranger on Facebook. This person opened a chat and threatened to post a manipulated video featuring him on the Internet if he did not immediately transfer money to a specific account. The injured party therefore reported the blackmail, reported the person to Facebook and blocked the contact.”
Plain text again: Can I be blackmailed like this?
Theoretically, yes. In practice, however, it takes enormous effort and rarely produces credible results. As mentioned above, such a “deep fake” requires hundreds of photos of the target person. So unless you are someone who posts hundreds of selfies on Facebook, you are on the safe side. In addition, a certain degree of similarity between the two faces is necessary.
It is also possible to create a “deep fake” with fewer photos. However, those photos must then have been taken under exactly the right lighting conditions, as the app can only approximate lighting very poorly. The photos must also show the face from different angles and with different expressions so that the program's algorithm can “learn” it. And that simply does not work with only 20 photos, each taken under different lighting.
That is why it works relatively well with celebrities, of whom hundreds to thousands of photos exist, but with private individuals it is practically impossible.
Is there a way to detect deep fake videos?
Yes.
As impressive as the technology is, it has a small but significant flaw: it cannot convincingly reproduce the blinking of a human eye. A study by the Computer Science Department at the University at Albany showed that the eyes in such videos are either completely fixed or the blinking looks very “blurred”.
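The Albany researchers' actual pipeline is more involved, but the basic idea of checking for blinks can be sketched with the widely used "eye aspect ratio" (EAR) computed from six eye landmarks: the ratio drops sharply when the eye closes. A minimal sketch, assuming landmarks in the common six-point ordering; the threshold and frame count below are illustrative values, not the study's parameters:

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks p1..p6 around one eye.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|): large when open, near 0 when shut."""
    eye = np.asarray(eye, dtype=float)
    a = np.linalg.norm(eye[1] - eye[5])   # first vertical distance
    b = np.linalg.norm(eye[2] - eye[4])   # second vertical distance
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal eye width
    return (a + b) / (2.0 * c)

def count_blinks(ear_series, threshold=0.21, min_frames=2):
    """Count blinks in a per-frame EAR series: a blink is a run of at
    least `min_frames` consecutive frames below `threshold`."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

Applied to a clip of normal length, a blink count of zero (people typically blink every few seconds) would be a red flag consistent with the study's finding.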
Conclusion
Such blackmail attempts rely on the blackmailer's assumption that the recipient has heard or read nothing, or only very little, about this technology. As a layperson, you naturally assume that this must be possible with today's technology and would rather pay than let such a video spread.
However, it is not that simple. That is why we advise you, if you become the target of such an extortion attempt, to behave like the user in the police report: don't pay, and report it to the police!
If you enjoyed this post and value well-founded information, become part of the exclusive Mimikama Club! Support our work and help us promote awareness and combat misinformation. As a club member you receive:
📬 Special Weekly Newsletter: Get exclusive content straight to your inbox.
🎥 Exclusive video* “Fact Checker Basic Course”: Learn from Andre Wolf how to recognize and combat misinformation.
📅 Early access to in-depth articles and fact checks: always be one step ahead.
📄 Bonus articles, just for you: Discover content you won't find anywhere else.
📝 Participation in webinars and workshops: Join us live or watch the recordings.
✔️ Quality exchange: Discuss safely in our comment function without trolls and bots.
Join us and become part of a community that stands for truth and clarity. Together we can make the world a little better!
* In this special course, Andre Wolf will teach you how to recognize and effectively combat misinformation. After completing the video, you have the opportunity to join our research team and actively participate in the education - an opportunity that is exclusively reserved for our club members!
Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic. 2) Individual contributions were created with machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)


