Now there is a U-turn on this plan. The plan was that Apple would use its iCloud storage service together with so-called "hashes". These hashes – digital fingerprints of already known child sexual abuse material, not the material itself – would be loaded onto the devices, where users' photos could be compared against the known material. If multiple matches were found, an alarm would be triggered.
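The threshold-based matching described above can be illustrated with a minimal, hypothetical sketch. Note that this is a simplification: Apple's proposed system used perceptual hashing (NeuralHash), which tolerates resizing and re-encoding, whereas the cryptographic hash used here only matches byte-identical files. All names and values below are illustrative assumptions, not Apple's actual implementation.

```python
import hashlib

# Stand-in database: hashes of known material (illustrative values only).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# An alarm is only raised after several independent matches,
# reducing the impact of single false positives.
ALERT_THRESHOLD = 2

def scan_photos(photos: list[bytes]) -> bool:
    """Return True if the number of matches reaches the threshold."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    )
    return matches >= ALERT_THRESHOLD
```

A single match stays below the threshold and triggers nothing; only multiple hits would raise the alarm, mirroring the "multiple hits" logic described above.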
On the one hand, an approach to combating child abuse can be seen as entirely positive. On the other hand, there were critical voices who considered it a "bad idea", since it could become a "key element for the surveillance of encrypted messaging systems".
Of course, this would also have restricted users' privacy, as the company would have examined all material uploaded to iCloud.
Apple is now pulling the brakes
The company now wants to focus its anti-CSAM efforts on its communication safety features.
"After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature, which we first made available in December 2021," Apple told WIRED. "We have also decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for all of us."
Communication Safety for Messages analyzes image attachments that users send and receive on their devices. The feature is designed so that Apple never gains access to the messages; end-to-end encryption remains intact. This opt-in feature can determine whether a photo depicts nudity.
Parents and caregivers can opt in to Communication Safety protection through family iCloud accounts.
Source: WIRED, futurezone.de
Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds, or video sequences serves to discuss the topic.
2) Individual contributions were created with machine assistance and were carefully checked by the Mimikama editorial team before publication.

