At a time when social media plays a central role in the everyday lives of many young people, a recent study from Dublin City University presents alarming results. Boys and young men who use platforms like TikTok and YouTube Shorts are quickly confronted with content that promotes an outdated and toxic image of masculinity.

Content recommendation mechanisms

The researchers found that experimental accounts set up to represent teenage boys were recommended problematic content after only a short period of use. On TikTok, 76% of recommended content was harmful after an average of two hours and 32 minutes, while on YouTube Shorts the share reached 78% after three hours and 20 minutes. This content propagates not only aggressive masculinity, but also the subordination of women and discriminatory views towards non-binary people.

The role of algorithms

The main problem is the opaque algorithms that promote such content. Most social media companies keep their inner workings secret, making it difficult to study and understand the impact of these systems. However, they appear to detect an interest in harmful content quickly and then recommend it with increasing frequency, as the simplified sketch below illustrates.
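To make this feedback loop tangible, here is a minimal, purely illustrative simulation. It is emphatically not the actual TikTok or YouTube algorithm (whose workings are not public); the categories, weights and numbers are invented for demonstration only. The point is that a single repeated engagement signal, fed back into the selection weights, compounds quickly.

```python
import random

# Purely illustrative sketch of an engagement-driven feedback loop.
# NOT the actual TikTok or YouTube algorithm (which is not public);
# all categories, weights and numbers are invented for demonstration.

CATEGORIES = ["sports", "gaming", "harmful"]

def recommend(weights):
    """Pick a category with probability proportional to its weight."""
    return random.choices(CATEGORIES, weights=[weights[c] for c in CATEGORIES])[0]

def simulate(steps=200, boost=1.3):
    # Start with equal apparent interest in every category.
    weights = {c: 1.0 for c in CATEGORIES}
    for _ in range(steps):
        shown = recommend(weights)
        # Model a user who lingers only on "harmful" clips: each such
        # engagement boosts that category's weight for future picks.
        if shown == "harmful":
            weights[shown] *= boost
    total = sum(weights.values())
    return {c: round(w / total, 2) for c, w in weights.items()}

print(simulate())
# Typical result: the "harmful" share dominates after a few hundred steps,
# even though the user never explicitly searched for such content.
```

The sketch shows only the dynamic the researchers describe: a small bias in engagement, amplified by a weight update on every interaction, can crowd out all other content. It makes no claim about how the platforms' real systems are implemented.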

Impact on young people

The study strongly warns of the negative impact of this content on young people. Girls and women are the primary targets of this toxic content, but boys and men also suffer psychological harm from exposure to these role models. The result is a vicious circle that glorifies outdated gender roles and progressively undermines social progress towards equality and respect.

Need for regulation

The study results are an urgent wake-up call for parents, teachers, policymakers and society at large. There is a pressing need for action to make the digital environment in which children and young people move safer. The Online Safety Coordinator emphasizes the responsibility of platforms to rethink and adapt their systems for the benefit of their users.

Questions and answers:

Question 1: What was the proportion of harmful content on TikTok?
Answer 1: TikTok recommended 76% harmful content after an average of two hours and 32 minutes of use.

Question 2: What was the proportion of harmful content on YouTube Shorts?
Answer 2: YouTube Shorts recommended 78% harmful content after three hours and 20 minutes.

Question 3: What types of content do the algorithms recommend?
Answer 3: The algorithms promote content that supports aggressive images of masculinity, the subordination of women and discrimination against non-binary people.

Question 4: What long-term impact does this content have on young people?
Answer 4: Long-term effects include psychological damage and the reinforcement of outdated and harmful gender roles.

Question 5: Why is it difficult to study the impact of algorithms?
Answer 5: The algorithms are opaque because the social media companies do not disclose exactly how they work.

Question 6: What is required to address the problem?
Answer 6: What is required is stricter regulation and transparency of algorithms as well as a more active commitment by platforms to protect their young users.

Conclusion

The results of the study show an urgent need for action to make the digital spaces in which our children and young people move safer and healthier. Platform operators and policymakers alike are responsible for taking regulatory action and creating transparency. It is important that all stakeholders work together to promote a healthier digital environment that supports and protects the development and well-being of all users.

It is also important that you subscribe to the Mimikama newsletter at https://www.mimikama.org/mimikama-newsletter/ and sign up for our online presentations and workshops at https://www.mimikama.education/online-vortrag-von-mimikama/.

Source: press release


Notes:
1) This content reflects the current state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created through the use of machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)