Former Facebook moderators suffer from post-traumatic stress disorder (PTSD), alcoholism and sleep disorders.

The Internet. For most of us, it is impossible to imagine everyday life without it. But we often encounter cruel and disturbing content online. To protect users from this, thousands of Facebook moderators are at work every day. On Facebook alone, some 40,000 of them are said to screen content on behalf of users, including videos of beheadings, abuse and child pornography.

According to “The Guardian”, twelve former Facebook moderators have now filed a lawsuit against Facebook. They suffer from PTSD, alcoholism and sleep disorders. Chris Gray, one of the plaintiffs, worked for ten months at CPL, a company that moderates content on behalf of Facebook. Videos of a woman being stoned, of people being tortured and of dogs being boiled alive are burned into his mind.

Insufficient preparation for the psychological stress

One accusation made by the plaintiffs is that employees are not adequately prepared for this stressful type of work. Tickets containing content are assigned and cannot be skipped: moderators have to confront the material, and in detail, because a precise analysis is required. They often have to expose themselves to cruel content repeatedly and for long periods.

Two years later, Chris Gray still has trouble sleeping. Another plaintiff is likewise plagued by insomnia; he regularly tried to “numb” himself with alcohol because he did not have nightmares when he was drunk, and he also had to take antidepressants.

Mark Zuckerberg considers the statements exaggerated

A Facebook spokesman told the newspaper that the company supports its moderators. It understands that reviewing some content is difficult; all moderators therefore undergo an intensive training program lasting several weeks and are also offered extensive psychological support.

Mark Zuckerberg downplays the statements of the complaining moderators:

“It’s not the case that people just have to watch terrible things all day.” Facebook ensures that moderators are adequately looked after and that sufficient breaks are possible.

Mimikama: We, too, have been checking user inquiries since 2011!

  • We no longer know when we first saw the video of teenagers setting fire to a puppy in a bucket for fun.
  • We no longer remember when we saw the video of a man smashing a woman's skull in with a board.
  • We no longer remember when we saw the video of a man being beheaded.
  • Skinned cats, gaping knife wounds, severed bones, aborted unborn children: we really no longer know what kind of atrocities we have seen.
  • All we know is that it was all content that was sent to us at Mimikama because it had been published on Facebook.
Source: nau.ch
Article image: Shutterstock / By KieferPix

Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created with the use of machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)