Facebook's role in spreading misinformation has come under scrutiny. What has the company done about the problem and how is the fight going?
Facebook now works with 27 partners focused on fact-checking articles. This effort is being expanded to include images and videos, as Facebook reports in its newsroom.
Facebook now wants to show the public transparently how it is tackling fake news, as many users have the impression that nothing is actually happening. That is why the company has published an 11-minute video that shows its work and the difficulties involved:
Since not everyone speaks English, we would like to explain the central points of the video to you.
The problem with reported posts
One of the problems with misinformation is that it is often difficult to determine what is actually true. Just think of all the news you read every day. What is objectively false, what is objectively true? The truth often has the unpleasant property that it does not agree with one's own ideas and opinions. Because of this, there is a lot of news that is subjective but not objectively fake news.
Neutrality
It would be bad if Facebook scrutinized every post and every picture. But it would be just as wrong if anyone could post whatever they liked, for example pornography or hate speech. So there has to be a middle ground, and that middle ground is quite large.
It's not always black and white
It's not just truths and lies on Facebook.
If it were that simple, it wouldn't be a problem at all. However, there is also, for example, the tricky area of propaganda: real statistics are cited and real numbers are given, but they are placed in the wrong context, or only small parts of a real statistic are picked out to reinforce one's own opinion ("cherry-picking"). This makes things difficult for Facebook, because propaganda is not fundamentally false; it often merely suppresses other aspects of a fact.
A subdivision
Fake news itself can be divided into different categories, as the chart in the video shows:

- Bad people: the most well-known are fake accounts, spammers and scammers
- Bad behavior: for example, polarizing or misinforming, but also romance scamming, i.e. faking love to get money
- Bad content: such as fake news, hate speech, violence, and clickbait
All of these points are intertwined, but each of these points requires a specific strategy to combat it, which is why Facebook also has various teams that constantly communicate and coordinate with each other.
The further procedure
In principle, there are three ways to proceed: delete, limit reach and inform:

- Delete: hate speech, terrorist content, fake accounts and bullying are deleted
- Reduce: clickbait, fake news and exaggeratedly sensational posts ("You won't believe what this…") have their reach reduced
- Inform: if there are other articles on a topic or an article matches a user's interests, these appear in the news feed
Facebook makes it clear that hate speech or clickbait can still appear in the newsfeed when shared by friends, so some things will probably never disappear from Facebook.
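The delete/reduce/inform triage described above can be sketched as a simple dispatch rule. This is only an illustration of the policy, not Facebook's actual system; the label names, the `Action` enum, and the precedence of delete over reduce are all assumptions for the example:

```python
from enum import Enum, auto

class Action(Enum):
    DELETE = auto()
    REDUCE = auto()
    INFORM = auto()

# Hypothetical labels that a content classifier might attach to a post.
DELETE_LABELS = {"hate_speech", "terrorism", "fake_account", "bullying"}
REDUCE_LABELS = {"clickbait", "fake_news", "sensationalism"}

def triage(labels: set) -> Action:
    """Map the three-way policy onto a post's labels; delete beats reduce."""
    if labels & DELETE_LABELS:
        return Action.DELETE
    if labels & REDUCE_LABELS:
        return Action.REDUCE
    # Otherwise the post stays up and related articles are surfaced instead.
    return Action.INFORM

print(triage({"clickbait"}).name)  # REDUCE
```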
The challenge: images and videos
Texts can be checked relatively easily using algorithms, but images and videos are more difficult. Facebook is now taking on the challenge of using algorithms to recognize such media as real or fake.
How will this work?
Similar to what already works for articles, a special algorithm now also checks images and videos using various automated methods. For example, an automatic reverse image search determines how often an image has already appeared on the internet.

Suspicious images and videos are then forwarded to the external fact checkers for further inspection. These fact checkers specialize in examining the material in more detail, for example by looking at the metadata of an image to determine when and where an image or video was taken. They also receive support from experts, scientists or government authorities.
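The workflow described above — count how often an image resurfaces, then hand suspicious items to the fact checkers — can be sketched roughly in code. This is a minimal illustration, not Facebook's real pipeline: the SHA-256 content hash (a real system would use perceptual hashing to catch near-duplicates), the repost threshold, and the review queue are all invented for the example:

```python
import hashlib
from collections import Counter

# How often each image (identified by a simple content hash) has been seen.
seen_images = Counter()

# Hypothetical review queue for the external fact checkers.
fact_checker_queue = []

REPOST_THRESHOLD = 3  # assumed: images reposted this often get reviewed

def ingest_image(image_bytes: bytes) -> bool:
    """Register an uploaded image; return True if it is sent to review."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    seen_images[digest] += 1
    if seen_images[digest] >= REPOST_THRESHOLD:
        fact_checker_queue.append(digest)
        return True
    return False

# The same image uploaded repeatedly eventually lands in the review queue.
for _ in range(3):
    flagged = ingest_image(b"\x89PNG...fake-image-bytes")
print(flagged)                   # True on the third upload
print(len(fact_checker_queue))   # 1
```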
The more reviews the fact checkers provide to Facebook, the more accurate the machine recognition of such material becomes.

For example, the algorithm has an OCR mechanism. This recognizes text in images and compares it with articles to check its veracity. Facebook is also still working on techniques to automatically determine whether an image may have been manipulated before it is sent to the fact checkers for review.
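The comparison step can be illustrated with a toy example. The OCR itself would require a separate library (such as Tesseract) and is stubbed out here; the database of debunked claims and the similarity threshold are invented for this sketch and have nothing to do with Facebook's actual implementation:

```python
from difflib import SequenceMatcher

# Hypothetical database of claims that fact checkers have already rated false.
DEBUNKED_CLAIMS = [
    "drinking hot water cures the flu",
    "the moon landing was filmed in a studio",
]

def looks_debunked(ocr_text: str, threshold: float = 0.8) -> bool:
    """Compare text extracted from an image against known false claims."""
    normalized = ocr_text.lower().strip()
    return any(
        SequenceMatcher(None, normalized, claim).ratio() >= threshold
        for claim in DEBUNKED_CLAIMS
    )

# Text that OCR might have pulled out of a shared meme image:
print(looks_debunked("Drinking hot water cures the flu!"))       # True
print(looks_debunked("Local council approves new bike lanes"))   # False
```

A production system would use semantic matching rather than character-level similarity, since the same false claim is usually reworded from post to post.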
How does Facebook categorize potential fakes and hoaxes?
Usually, fakes and hoaxes can be divided into three categories:
1. Manipulated or completely fake images
2. Images taken out of context
3. Real images with false claims in text or audio form

These three categories form the basis of automated fake detection.
What is the next step?
Facebook is now working on further refining the algorithm with the help of the photos and videos checked by the fact checkers. There is still a long way to go before such posts disappear automatically. There is also a considerable risk that reports debunking fakes and hoaxes, which necessarily reproduce these images in their articles, will themselves be flagged as fake.
To make this system as reliable as possible, Facebook will continue to look for new partners in the coming months and keep working on the algorithm, which has been in active operation since September 13, 2018.

