Murder, sodomy, necrophilia, child pornography and other disturbing material: content moderators see a great deal of this on TikTok. Worse still, to meet excessive “productivity standards,” they allegedly have to endure much of it without breaks.

Two former moderators made this accusation in a lawsuit filed in a US federal court in California against the app and its parent company, ByteDance. They regard the “extreme psychological stress” as a violation of workplace protection laws and are seeking class action status, which would allow other moderators suffering from the consequences of their work to join.

Endless weirdness

TikTok has around 10,000 moderators worldwide. Their job is to ensure that users have endless fun and are spared disturbing content. But the content watchdogs themselves see plenty of it.

“We had to watch death and graphic pornography. I saw naked underage children every day.”

— Ashley Velez, plaintiff, speaking to NPR

Both plaintiffs describe twelve-hour shifts and criticize quota requirements that forced them to view large amounts of such disturbing content without a break.

The lawsuit also alleges that the moderators' working environment was unsafe: TikTok does not provide adequate resources to help employees cope with the anxiety, depression and post-traumatic stress that result from watching the flood of disturbing videos. Viewing hundreds of “highly toxic and extremely disturbing” clips per week led to emotional trauma, and the lack of protection against this, the suit argues, constitutes negligence and a violation of California's workplace safety laws.

Well-known problem

Such allegations are not new. As recently as December 2021, another TikTok moderator filed a similar lawsuit, though it has since been dropped. The current legal dispute, however, could become a bigger problem for the platform, especially if it is actually granted class action status and other affected parties join.

The lawyers behind the lawsuit had already filed a class action against Facebook a few years ago over similar allegations about the psychological strain on moderators. A settlement was reached in May 2020 under which Facebook had to pay $52 million, amounting to at least $1,000 per affected moderator.


Source: press release



