Apple recently announced a set of features dubbed Expanded Protections for Children, which include scanning user images for child sexual abuse material (CSAM). The company plans to screen photos on iPhones in the United States before they are uploaded to iCloud storage. The feature has raised concerns about user privacy and drawn backlash from Edward Snowden on Twitter and other platforms.
Apple reportedly circulated an internal memo, obtained by 9to5Mac, that acknowledges what the company calls "misunderstandings" about the new features. The alleged memo, written by Sebastien Marineau-Mes, Vice President of Software at Apple, reads:
“Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.” It goes on to say, “We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built.”
The memo also includes a message from Marita Rodriguez, Executive Director of Strategic Advancement and Partnerships at the National Center for Missing and Exploited Children (NCMEC), praising Apple's new child-safety features. "I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection," the message reportedly reads.
Apple recently stated that it will roll out the photo-scanning system on a country-by-country basis, in compliance with local laws.