Parents are demanding more protection for children online. The majority of parents report that their children have had bad experiences, as they are repeatedly confronted with bullying, pornography and chain letters on their smartphones.
But it is not only parents who are demanding more legal protection: the German Children's Fund, a children's welfare organization, is also speaking out on the issue. The children's charity believes that providers of online services need to be held more accountable.
For almost all parents in Germany (97 percent), reliable age labelling is an important criterion when choosing social media services or games for their children. Accordingly, the vast majority of parents (88 percent) pay attention to age ratings when selecting films, apps, games or streaming services. This is shown by a survey on the protection of children and young people in the media.
Four fifths of respondents (81 percent) do not consider the age verification procedure used by some providers (confirming adulthood with a click) sufficient to protect children and young people from content and services that are not age-appropriate. These are the central findings of a representative survey on child and youth media protection conducted by the polling institute Mauss Research on behalf of the German Children's Fund.
Negative experiences when using online media
More than half of those surveyed (55 percent) said that their child had already had negative experiences when using online media. If their child encounters negative or inappropriate content on the internet, only slightly more than a third of the parents surveyed (37 percent) know whom to turn to. Of this group, around two thirds (62 percent) would turn to a state law enforcement agency, primarily the police.
Inadequate child and youth protection
The efforts of online service providers to protect children and young people were rated as inadequate. Providers of messenger services and video platforms fare particularly poorly: only 27 percent of respondents each consider their protection efforts sufficient, and only 18 percent say the same of social media providers such as Facebook or Instagram.
Tougher penalties for providers!
At the same time, almost all respondents call for harsher penalties for providers that violate child and youth protection rules, reliable age verification for offers that are unsuitable for or harmful to children, and an efficient reporting and complaints system for violations of child and youth protection (93 percent each).
Potential offers that could help parents accompany and support their children safely online are rated very positively overall: at least four fifths of respondents consider them very helpful or helpful. Measures that work without parents' own intervention, such as functioning child protection settings (91 percent) or understandable, uniform age labelling (88 percent), are rated as more helpful than those that require greater initiative or personal effort, such as advice and complaints services (84 percent) or media literacy training (80 percent).
“We need holistic child and youth media protection that is based on the real usage behavior of children and young people. It should adapt to current and future phenomena and technologies, be transparent for parents and children, offer them help at any time and, at the same time, not hinder age-appropriate participation of children and young people in the digital world. Parents, like their children, need more support for safe and competent internet use. This also includes ensuring that age ratings for media content, once reviewed, are consistently carried over to other distribution channels, whether online or offline. Duplicate reviews with sometimes differing age ratings must become a thing of the past,” emphasizes Thomas Krüger, President of the German Children's Fund.
“This must be accompanied by more effective legal frameworks and measures. In particular, providers of media content and media services, whether based in Germany or abroad, should be the focus of legislators and, at the same time, bear responsibility themselves. They need a clear legal framework that, depending on the respective offer, sets out requirements such as age verification, transparency and advice for effective child and youth media protection. This requires control mechanisms as well as more consistent rules on the legal consequences of violations. To enforce child and youth media protection effectively, violations of applicable youth protection law by providers must be punished with effective sanctions,” Krüger continued.
Source: German Children's Fund
Article image: Shutterstock / By carballo

