Russian President Vladimir Putin experienced an unexpected turn of events at his annual press conference. A student from St. Petersburg used artificial intelligence (AI) to have a digital doppelganger of the president ask him a question. This unusual stunt caught the attention not only of Putin himself but of the entire audience.


Putin responds to rumors about body doubles

When asked about AI and body doubles, Putin initially reacted with surprise but then denied using physical doubles. He emphasized that although he had considered the possibility, he ultimately decided against it. This statement contradicts long-standing rumors that Putin uses doppelgangers on certain occasions.

Putin started laughing. “I see that you look like me and even speak with my voice,” he said. “But I thought about it and decided that only one person should look like me and speak with my voice.”

The head of state also responded to a video of a child expressing concern that his grandmother could be replaced by a computer image. “I can assure you that no one will replace your grandmother. That is impossible,” said Putin.

Russia's long-time president called on his country to use its own technology to counter "biased" Western chatbots. It is “impossible” to stop AI, Putin said on Thursday. “That means we have to be leaders in this area.”

Putin has repeatedly called for an end to Russia's dependence on Western technology. He directed his government to promote the development of AI and mainframe computers.

The dangers of deepfakes and AI doppelgangers

Although the video ended abruptly before Putin could express his opinion on the dangers of artificial intelligence, the incident highlights the growing concern about "deepfakes." These fabricated videos and audio recordings can make famous people appear to say or do things that never happened. The spread of such content, especially on social media, fuels disinformation.

The impact of fakes, regardless of their quality

Interestingly, the quality of a fake is not decisive for its impact: even rudimentary forgeries, so-called "cheapfakes," can damage the reputation of political and public figures. The ease with which videos and voices can be faked using AI technology poses a serious challenge to the authenticity of information.

Conclusion: A wake-up call to vigilance

This incident with Putin's AI doppelganger serves as a wake-up call to be vigilant about the risks and power of AI-generated content. In a world where the boundaries between reality and fiction are increasingly blurred, it is all the more important to question information critically and scrutinize its source.

Subscribe to our newsletter to stay up to date. Also explore our extensive media education offering.

You might also be interested in:
Deepfakes: Virtual deceptions in the digital world
Deepfake fraud with Armin Assinger: Warning about dubious investment platforms
Deepfake fraud: The invisible danger on the phone

Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds, or video sequences serves to discuss the topic.
2) Individual contributions were created with the use of machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)