Moderators review content from across Africa and remove banned material. The employees are exposed to extreme psychological stress.
The hidden side of digital vigilance
With tears in his eyes, Nathan Nkunzimana recalls his time as a content moderator for Facebook. Day after day, he was confronted with shocking images of child abuse and murder. For him and his colleagues, the work of protecting internet users worldwide from these atrocities was an emotional nightmare.
A fight for justice begins
Today, Nkunzimana is one of nearly 200 former employees suing Facebook and its local partner Sama over poor working conditions in Kenya. The plaintiffs complain of inadequate psychological support and poor pay; they are demanding a compensation fund of $1.6 billion (1.5 billion euros) and are making serious allegations against their former employers.
The dark side of content moderation
Working as a content moderator at Facebook means reviewing and removing shocking and disturbing content on a daily basis, and the impact on employees' mental health can be devastating. On top of this enormous burden, moderators report a culture of secrecy: they are required to sign non-disclosure agreements about their work.
Companies under fire
The former Facebook and Sama employees say the companies failed to extend their contracts despite a court order requiring them to do so. Neither company has commented on the allegations, even though the former employees urgently need their legal status clarified.
Expert statements underline the problem
Sarah Roberts, an expert on digital content moderation at the University of California, Los Angeles, warns of the psychological damage associated with the work. She criticizes the outsourcing of these tasks to low-wage countries such as Kenya as an exploitative practice.
Possible effects of the trial
The Kenyan litigation is particularly significant because it is the first known trial of its kind outside the United States, and it could have far-reaching implications for the working conditions of social media moderators worldwide. The case will be heard in court on July 10, and the global tech community is watching closely.

