Facebook increases detection rate for hate speech through software
Hate and hate speech are perennial topics on social media. Facebook in particular has faced criticism for having done too little about them for too long. For this reason, the software Facebook uses has been significantly expanded: new technologies such as machine learning, computer vision and artificial intelligence are being deployed to find inappropriate content even faster.
According to Facebook's latest Community Standards Enforcement Report, 68 percent of problematic content was detected automatically in the first quarter. By the third quarter, that share had risen to 80 percent. Facebook therefore appears to be on the right track with the expansion of its software.
Facebook itself writes on this topic:
Initially, we've used these systems to proactively detect potential hate speech violations and send them to our content review teams since people can better assess context where AI cannot. Starting in Q2 2019, thanks to continued progress in our systems' abilities to correctly detect violations, we began removing some posts automatically, but only when content is either identical or near-identical to text or images previously removed by our content review team as violating our policy, or where content very closely matches common attacks that violate our policy. We only do this in select instances, and it has only been possible because our automated systems have been trained on hundreds of thousands, if not millions, of different examples of violating content and common attacks. In all other cases when our systems proactively detect potential hate speech, the content is still sent to our review teams to make a final determination. With these evolutions in our detection systems, our proactive rate has climbed to 80%, from 68% in our last report, and we've increased the volume of content we find and remove for violating our hate speech policy.
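The "identical or near-identical" matching Facebook describes can be illustrated with a minimal sketch. Facebook does not publish its matching code, so everything here (the normalization, the similarity measure, and the 0.9 threshold) is an assumption for illustration, using only Python's standard-library `difflib`:

```python
# Illustrative sketch only: the normalization, similarity measure and
# threshold below are assumptions, not Facebook's actual system.
from difflib import SequenceMatcher

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial edits do not evade a match."""
    return " ".join(text.lower().split())

def is_near_identical(candidate: str, known_violation: str,
                      threshold: float = 0.9) -> bool:
    """True if candidate closely matches previously removed content."""
    ratio = SequenceMatcher(None, normalize(candidate),
                            normalize(known_violation)).ratio()
    return ratio >= threshold

# Hypothetical store of text previously removed by human reviewers.
removed_examples = ["some previously removed violating text"]

def should_auto_remove(post: str) -> bool:
    """Auto-remove only on a close match; everything else goes to review."""
    return any(is_near_identical(post, ex) for ex in removed_examples)
```

The design mirrors the policy in the quote: only close matches against reviewer-confirmed examples are removed automatically, while anything below the threshold would still be routed to human review.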

Fake accounts are also being targeted.
Last quarter, 1.7 billion fake user accounts were deleted. That is 500 million fewer than in the first quarter of the year, which suggests Facebook has become better at blocking the creation of fake accounts from the outset.
Furthermore, in the last quarter, 4.4 million pieces of content related to drug trafficking were deleted on Facebook and 1.5 million on Instagram. Incidentally, this report includes data on Instagram, which belongs to Facebook, for the first time.
Source: Facebook Newsroom
Article image: Shutterstock / Worawee Meepian

