The fight against inappropriate content on Facebook: perception or reality?


Reported content and user experience: “The content has not been removed because it does not violate our Community Standards”

Many Facebook users have the impression that, despite their reports, many rule-breaking posts remain on the platform. You report posts that spread hate, discrimination, or misinformation, but often receive the response: “The content was not removed because it does not violate our Community Standards.” But why is that? And is it just a perception, or is it reality that Facebook doesn't remove enough inappropriate content?

The gray areas of community standards

Facebook has clearly defined community standards that regulate what content is allowed on the platform and what is not. However, there are also gray areas. When it comes to hate speech, for example, the definition of “hate” is often in the eye of the beholder. What is harsh criticism for one person may be hate speech for another. This makes moderating such content a difficult task.

The dilemma: nudity vs. hate speech

Another notable phenomenon is Facebook's seemingly inconsistent treatment of different types of content. Users report that the platform takes a strict stance against nudity, particularly female breasts, while posts containing hate speech often remain online. One explanation may be that nudity is easier to detect and moderate, while hate speech is often more subtle and harder to identify.

The geographical difference: USA vs. Europe

There are also differences in how users in the US and in Europe perceive Facebook's moderation practices. In Europe, where data protection laws are stricter and hate speech is more heavily regulated, it may appear that Facebook is not doing enough to remove rule-breaking content. In the USA, by contrast, where freedom of expression carries great weight, the impression may arise that Facebook moderates too strictly.

Reported content: just an impression, or reality?

The key question is whether Facebook actually removes too little of the content that gets reported, or whether it only looks that way. What is undisputed is that a large volume of reported content is not removed. However, keep in mind that Facebook has billions of users and that billions of posts are published every day. It is nearly impossible to review and act on every reported post immediately. Facebook's moderators are also human: they often have to make difficult judgment calls, and sometimes they get them wrong.

User experiences with reported content


User experience 1: Dieter

Dieter actively participates in political discussions on Facebook and has reported hate speech on several occasions. “Initially, I was frustrated when I received a notification that the flagged content had not been removed. Then I learned that I could appeal. In some cases this worked, and the posts were subsequently removed.”

Tip: If Facebook doesn't remove reported content on the first attempt, file an appeal. There is a review process, and sometimes a second review produces a different result.

User Experience 2: Emily

Emily is an artist and shares her images on Facebook. She has reported several cases of copyright infringement. “Once, Facebook did not remove a reported post even though it clearly used my artwork without my permission. I was angry and felt powerless. But I appealed, provided further evidence, and the post was eventually removed.”

Tip: When reporting copyright infringement, make sure you provide enough evidence. If your first attempt fails, appeal and provide additional information or evidence.

User Experience 3: Sebastian

Sebastian reported several posts that spread false information and fake news. “I was disappointed when Facebook did not delete some of the reported posts. But instead of giving up, I started posting factual information and sources in the comments to refute the misinformation.”

Tip: If Facebook won't remove a post containing misinformation, try sharing factual information and sources to refute it. It is important to stay informed and to actively counter the spread of misinformation.

User Experience 4: Sara

Sara reported several hateful comments in a news group. “It was frustrating that the reported comments were not initially deleted. But I've learned that it's worth being persistent. After my objection, some comments were removed.”

Tip: It can be frustrating when hate comments aren't removed immediately, but don't give up. By appealing and continuing to report, such comments can often still be removed.

User Experience 5: Paul

Paul came across a post that glorified violence and reported it immediately. “When I received the notification that the post had not been removed, I was shocked. But I objected and shared my concerns with my community. Some of my friends also reported the post, and it was eventually removed.”

Tip: If you come across a particularly disturbing post, talk about it in your community. Joint efforts can help draw moderators' attention to the reported post.

User Experience 6: Sabine

Sabine had a problem with someone who repeatedly left inappropriate comments on her posts. “Even though I reported the comments, they were not removed. So I blocked the user and told my friends about the incident. They helped me report the comments, and Facebook eventually removed them.”

Tip: If someone is harassing you, use Facebook's blocking feature and inform your friends. A user who is reported by multiple people may receive more attention from moderators.


Conclusion

It's clear that Facebook faces a major challenge when it comes to moderating inappropriate content. Although it may sometimes seem that the platform is not doing enough, it is important to remember that moderation is not an exact science and there will always be gray areas. It's worth appealing when Facebook doesn't remove a reported post, and it pays to stay active and keep reporting inappropriate content.

However, Facebook should be more transparent about how moderation decisions are made and what steps are taken to enforce its community standards. Only then can users trust the platform and feel that their reports make a difference.




