You have to see this

The social media platform TikTok has over 1.5 billion users worldwide. It is hard to imagine how many short videos are uploaded to the app every day, with content that is mostly entertaining and varied, but often deeply disturbing.

Artificial intelligence is essential for pre-screening this content. However, these systems are trained mainly on English-language material, which means that users in other language regions are increasingly exposed to such videos.

When the algorithm raises an alarm or other users report a video, content moderators step in to deal with the flagged TikToks. What they see there is something no one should have to endure.

The footage, which has to be reviewed manually by employees of the Luxembourg-based outsourcing specialist Majorel, goes far beyond pornography, insults and threats.

Ever more, ever further

TikTok's user numbers are growing explosively in North Africa and the Middle East. To keep quality control up with this volume of videos, Majorel is expanding its moderation teams in these regions.

Former and current moderators have now shared their experiences anonymously with “Business Insider”.

Animal cruelty and suicide

Employees describe videos in which animals are tortured and killed in the cruellest ways, people are injured and abused, and even footage of people taking their own lives on camera.

Moderators are also required to sort the TikTok content into categories and therefore have to watch each video to the end.

Non-stop stress

The pressure placed on employees is enormous. In a pilot project, each employee was originally expected to check and categorize 200 TikTok live videos per hour. That works out to more than three videos per minute, or roughly 18 seconds per video. The quota was later raised so drastically that only 10 seconds remained per video. Employees who failed to meet these targets were reprimanded and lost their bonus payments.

Moderators also told “Business Insider” that duty rosters were changed at short notice and shifts of up to twelve hours were demanded. No consideration was given to breaks and recovery periods.

A lack of support

According to Majorel, there are numerous programs to care for and support employees in their work and beyond. The company points to various tools that are supposed to make the disturbing content “more bearable”, for example through a grey filter. However, the employees affected say they never had access to such options.

In monthly meetings, those affected can talk to counsellors about their experiences. However, this offer falls far short of what is needed to process what they have seen or to cope with the pressure. There is also said to be an option of seeking psychological support outside the company, but four out of five moderators were unaware of any such offer.

That makes you sick

The content moderators are exposed to can cause severe psychological and emotional stress and can quite literally make them ill. Those affected report that these experiences do not simply disappear when the employment contract ends; psychotherapeutic follow-up care is often necessary.


Source: derStandard
