Just a few days ago, the term “deepfake” flitted through the press: SPD politician Franziska Giffey and Vienna’s SPÖ mayor Michael Ludwig fell for a fake Vitali Klitschko, with whom they spoke via video conference. It apparently wasn’t a real deepfake, but rather a clever compilation of interview footage played back live, as investigative journalist Daniel Laufer explains in a Twitter thread. Still, it is probably only a matter of time before real-time deepfakes are nearly perfect.
Deepfakes – The beginning
In 2018, deepfakes came into the public eye for the first time when a desktop application called “FakeApp” made it possible to create deceptively real-looking fake videos with other people’s faces. This was then (of course) used by many users to produce supposed porn clips featuring celebrities, with results that were more bad than good.

In 2019 the nude trend continued with deepfakes when software called “DeepNude” made it possible to generate nude images from ordinary photographs of people, which of course was quickly misused: suddenly blackmail with supposed nude pictures was a real problem. Almost, at least, because these fakes weren’t very convincing, and the developers soon took the software off the internet themselves.
Deepfakes are becoming more realistic
But in the same year, videos by the artists Bill Posters and Daniel Howe, created together with the advertising agency Canny, proved that deepfakes can look very realistic. At the art exhibition “Spectre” they showed, among others, a Mark Zuckerberg who appears to thank “Spectre”: “Spectre showed me: whoever controls the data also controls the future.”
Not only videos but also voices could be imitated very well as early as 2019, as this supposed audio recording of Donald Trump shows: the short speech was generated entirely on a computer!
And this development continues: Amazon’s voice assistant Alexa will soon also be able to imitate voices.
Fun with deepfakes
An interesting question among film fans: what would films actually look like if they had been cast with the actors originally intended for the roles? Deepfakes have answered that question. For example, Will Smith turned down the role of Neo in “The Matrix” to film “Wild Wild West” (a major flop at the box office). This is what “The Matrix” could have looked like instead:
Tom Selleck was actually supposed to play the lead role in “Indiana Jones,” but the production company behind the TV series “Magnum” didn’t allow it. It’s almost a shame, because the hat and whip look pretty good on him!
But other tricks are also amusing:
In the original Star Wars trilogy, Alec Guinness played Obi-Wan Kenobi; in Episodes I to III and in the current series, the young Obi-Wan was played by Ewan McGregor. What would it have looked like if McGregor had been old enough back then to play the Jedi Master?
In 2021, apps like “Wombo” finally conquered the market: with the app, anyone can make their own or other people’s photos sing. You’ve probably never heard North Korea’s leader Kim Jong-un sing so beautifully!
Deepfakes even got the YouTuber “Shamook” a lucrative job: his reworked version of the second-season finale of “The Mandalorian” was so convincing that the Disney subsidiary Lucasfilm didn’t go after him with copyright complaints, but instead offered him a contract:
A challenge to media literacy
As impressive as the deepfakes above are, they were made by real professionals. For laypeople it is still very difficult to create convincing deepfakes. Tech journalist and author Svea Eckert tried it and, by her own account, failed miserably.
Even very good deepfakes can be exposed as fakes on closer inspection: certain parts of the face remain rigid, there is either no blinking at all or very artificial blinking, the mouth area sometimes looks “washed out” during speech, the background warps when the head moves... even professional deepfakes make such small mistakes and always have to be painstakingly touched up by hand.
But such errors are easily overlooked, especially if you only watch a video on a small smartphone display; on a PC it is easier to play a video in slow motion and look closely at the details. That’s why you should also pay attention to the source a video comes from!
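For the technically curious: one of the cues mentioned above, unnatural blinking, can in principle even be checked in a rough, automated way. The following Python sketch is purely illustrative and not a forensic tool or anything Mimikama uses; it assumes the freely available libraries OpenCV and MediaPipe, and the landmark indices and threshold are common heuristics, not validated values. It simply counts how often the eyes in a video close.

```python
# Rough blink counter as a plausibility check for videos (illustrative sketch).
# Assumptions: opencv-python and mediapipe are installed; the eyelid landmark
# indices and the "closed" threshold are rough heuristics, not forensic values.
import cv2
import mediapipe as mp

# MediaPipe Face Mesh landmark indices around the left eye
UPPER_LID, LOWER_LID = 159, 145      # points on the upper and lower eyelid
LEFT_CORNER, RIGHT_CORNER = 33, 133  # outer and inner eye corner

def eye_openness(landmarks) -> float:
    """Ratio of lid distance to eye width (small value = eye closed)."""
    lid = abs(landmarks[UPPER_LID].y - landmarks[LOWER_LID].y)
    width = abs(landmarks[LEFT_CORNER].x - landmarks[RIGHT_CORNER].x)
    return lid / width if width else 0.0

def count_blinks(video_path: str, closed_threshold: float = 0.18) -> int:
    blinks, eye_was_open = 0, True
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.face_mesh.FaceMesh(refine_landmarks=True) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue  # no face found in this frame
            openness = eye_openness(result.multi_face_landmarks[0].landmark)
            if openness < closed_threshold and eye_was_open:
                blinks += 1          # eye just closed: count one blink
                eye_was_open = False
            elif openness >= closed_threshold:
                eye_was_open = True
    cap.release()
    return blinks

if __name__ == "__main__":
    print("Detected blinks:", count_blinks("interview.mp4"))
```

A script like this proves nothing on its own (people normally blink roughly 15 to 20 times per minute), but a clip in which the subject practically never blinks is a good reason to look more closely and to check the source.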
Within just a few years, deepfake technology has made huge strides. It can be used for fun or to make aging actors in series look young again... but also to put words into people’s mouths that they never said. At the moment such fakes are still relatively easy to see through, but that can change quickly.
What if such deepfakes one day trigger real conflicts? What if a seemingly secretly recorded conversation in which a politician insults the leaders of another country leads to an international crisis? How can it be proven quickly that it is a deepfake once the technology becomes ever more perfect?
These are questions we will have to address in the future. Unfortunately, almost any technology, once developed, can also be used to sow discord and harm others.
Article image: Unsplash