Apple is delaying the rollout of the child safety features it announced last month, including a contentious feature that would scan users’ photos for child sexual abuse material (CSAM), following widespread concern that the changes could erode user privacy. The features were originally expected to arrive later this year.
“Last month, we revealed plans for features aimed at protecting children from predators who recruit and abuse them using communication platforms, as well as limiting the transmission of Child Sexual Abuse Material,” Apple said in a statement to The Verge. “In response to feedback from consumers, advocacy groups, researchers, and others, we’ve chosen to take additional time in the coming months to gather feedback and make changes before delivering these crucial kid safety features.”
Apple’s original press release announcing the changes, which were meant to curb the spread of child sexual abuse material (CSAM), opened with a similar statement. That announcement described three significant changes in the works. One, an update to Search and Siri, would direct users to helpful resources when they searched for information related to CSAM.
The other two changes drew sharper scrutiny. One would notify parents when their children received or sent sexually explicit images and would blur those images for the children. The other would scan photos stored in a user’s iCloud Photos library for CSAM and report them to Apple moderators, who could then refer the reports to the National Center for Missing and Exploited Children (NCMEC).
Apple went into great detail about the iCloud Photos scanning mechanism to argue that it did not compromise user privacy. In short, it would scan photos stored in iCloud Photos on your iOS device and evaluate them against a database of hashes of known CSAM images maintained by NCMEC and other child safety organizations.
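The core idea of hash-database matching can be sketched in a few lines. This is a deliberately simplified illustration: Apple’s actual system used a perceptual hash (NeuralHash) and cryptographic blinding so that matching could happen without revealing non-matching photos, not a plain cryptographic digest. The `KNOWN_HASHES` set and both function names here are hypothetical stand-ins.

```python
import hashlib

# Hypothetical database of known-image hashes, standing in for the kind of
# hash list NCMEC and other organizations maintain. (Real systems use
# perceptual hashes that tolerate resizing/re-encoding; SHA-256 does not.)
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here purely as demo data.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Return a hex digest of the image bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """Check whether an image's hash appears in the known-hash database."""
    return image_hash(data) in KNOWN_HASHES
```

The key design property being debated was not the lookup itself, which is trivial, but where it ran: performing this comparison on the user’s own device is what critics characterized as on-device surveillance.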
Nonetheless, many privacy and security experts sharply criticized the new system, arguing that it could effectively create an on-device surveillance mechanism and that it broke users’ trust in Apple’s commitment to on-device privacy.
The Electronic Frontier Foundation said in an August 5th statement that the new system, regardless of its intentions, would “violate major promises of the messenger’s encryption and open the door to greater abuses.”
“Apple is jeopardising the phone that you and I own and operate,” Ben Thompson of Stratechery wrote in his own critique, “without any of us having a say.”