Millions of photos and videos are shared on Facebook every day. Some of the uploaded content is manipulated, often for harmless reasons such as making a video sharper or the audio clearer.
But some people manipulate media in order to mislead. Manipulations can be carried out with simple tools such as Photoshop or with sophisticated software that uses artificial intelligence or “deep learning” techniques to create videos that distort reality, usually referred to as “deepfakes”. Although such videos are still rare on the Internet, they pose a major challenge to the industry and to society as their use increases.
How does Facebook deal with deepfakes?
Facebook's approach includes multiple components: investigating AI-generated content and deceptive behavior such as false reports, collaborating with academia, government and industry, and exposing the people behind these efforts.
Collaboration is key. Facebook has spoken with more than 50 experts worldwide with technical, policy, media, legal, civic and academic backgrounds to inform its policy development and advance the science of detecting manipulated media.
Criteria for removing misleading media
As a result of these partnerships and discussions, Facebook is strengthening its policy against misleading manipulated videos that have been identified as deepfakes. Going forward, misleading manipulated media will be removed if it meets the following criteria:
It has been edited or synthesized, beyond adjustments for clarity or quality, in ways that are not apparent to an average person and would likely mislead someone into thinking that a subject of the video said words they did not actually say.
It is the product of artificial intelligence or machine learning that merges, replaces or overlays content in a video so that it appears authentic.
This policy does not cover content that is parody or satire, or videos that have been edited solely to omit or change the order of words. The sketch below illustrates how these criteria and exceptions fit together.
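To make the interplay of the criteria and the exceptions more tangible, here is a minimal illustrative sketch in Python. The function, its field names and the assumption that both criteria must be met for removal are simplifications derived from the policy text above; they do not represent Facebook's actual moderation systems.

```python
# Purely illustrative sketch of the removal criteria described above.
# The field names, the function itself and the "both criteria" assumption are
# hypothetical simplifications of the published policy text, not Facebook's
# actual moderation logic.

def should_remove_as_deepfake(video: dict) -> bool:
    # Exceptions: parody/satire and edits that only omit or reorder words
    # are not covered by this policy.
    if video.get("is_parody_or_satire") or video.get("only_words_omitted_or_reordered"):
        return False

    # Criterion 1: edited or synthesized beyond clarity/quality adjustments,
    # in a way likely to mislead an average viewer about what was said.
    misleadingly_edited = (
        video.get("edited_beyond_clarity_or_quality", False)
        and video.get("likely_to_mislead_about_spoken_words", False)
    )

    # Criterion 2: produced by AI or machine learning that merges, replaces
    # or overlays content so that the video appears authentic.
    ai_generated = video.get("ai_merged_replaced_or_overlaid_content", False)

    # Assumption: removal requires both criteria to be met.
    return misleadingly_edited and ai_generated


if __name__ == "__main__":
    example = {
        "edited_beyond_clarity_or_quality": True,
        "likely_to_mislead_about_spoken_words": True,
        "ai_merged_replaced_or_overlaid_content": True,
        "is_parody_or_satire": False,
        "only_words_omitted_or_reordered": False,
    }
    print(should_remove_as_deepfake(example))  # True
```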
In accordance with Facebook's existing policies, audio files, photos or videos, whether deepfakes or not, will be removed from Facebook if they violate any of its other Community Standards, including those prohibiting nudity, graphic violence, voter suppression and hate speech.
Review by third-party fact-checkers
Videos that do not meet these removal criteria can still be reviewed by Facebook's independent third-party fact-checkers.
These include more than 50 partners worldwide who check facts in over 40 languages. If a photo or video is rated false or partly false by a fact-checker, its distribution in the news feed is significantly reduced, and it is rejected if it is submitted as an ad. People who see it, try to share it, or have already shared it see warnings informing them that the content is false.

This approach is central to Facebook's strategy and follows from the discussions with experts: if all manipulated videos flagged as false by fact-checkers were simply removed, they would still be available elsewhere on the Internet or on other social networks. By leaving them up but labeling them as false, users receive important context.
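The handling of fact-checked content described above can be summarized schematically as follows. The rating labels and the listed actions are hypothetical placeholders derived from this article, not an actual Facebook interface.

```python
# Schematic sketch of the fact-checking flow described above. The rating
# labels and actions are hypothetical placeholders derived from this article,
# not an actual Facebook interface.

def handle_fact_check_rating(rating: str) -> list:
    """Return the actions taken for a given fact-checker rating."""
    if rating in ("false", "partly_false"):
        return [
            "significantly reduce distribution in the news feed",
            "reject the content if it is submitted as an ad",
            "show warnings to people who see, share or have already shared it",
            # The content stays on the platform with a label instead of being
            # deleted, so users still get context wherever they encounter it.
        ]
    return ["leave distribution unchanged"]


if __name__ == "__main__":
    for action in handle_fact_check_rating("partly_false"):
        print("-", action)
```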
Tracking down those responsible, and projects to detect deepfakes
The enforcement strategy against misleading, manipulated media also benefits from tracking down the people behind it. Just last month, a network that used AI-generated photos to disguise its fake accounts was identified and removed. Facebook teams continue to proactively hunt for fake accounts and other coordinated inauthentic behavior.
Facebook is also working on identifying manipulated content, of which deepfakes are the most difficult to detect. That is why the Deepfake Detection Challenge was launched in September last year; it encourages people from around the world to produce more research and open-source tools for detecting deepfakes. The project, supported by $10 million in grants, brings together a cross-sector coalition of organizations including the Partnership on AI, Cornell Tech, the University of California, Berkeley, MIT, WITNESS, Microsoft, the BBC and AWS, as well as several others from civil society and the technology, media and academic communities.
In a separate project, Facebook has partnered with Reuters, the world's largest multimedia news provider, to help newsrooms worldwide identify deepfakes and manipulated media through a free online training course. News organizations are increasingly relying on third parties for large volumes of images and videos, and identifying manipulated footage is a significant challenge. This program aims to support newsrooms in this work.
As these partnerships and Facebook's own insights evolve, so will its policy on manipulated media. In the meantime, Facebook is committed to investing internally and to working with other stakeholders in this area to find solutions with real impact.
Source: about.fb.com
Article image: about.fb.com

