Apple Announces Limits to Child Sex Abuse Image-Scanning System After Privacy Backlash
Apple on Aug. 13 provided new details of how its planned child sexual abuse material (CSAM) detection system would work, outlining a range of privacy-preserving limits after a backlash over concerns that the software would introduce a backdoor that threatens user privacy protections.
The company addressed concerns triggered by the planned CSAM feature, slated for release in an update for U.S. users later this year, in a 14-page document (pdf) outlining safeguards it says it will implement to prevent the system on Apple devices from erroneously flagging files as child pornography or from being exploited for malicious surveillance of users.
“The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised,” the company said in the document.
Apple’s reference to “possibly-colluding entities” appears to address concerns that the system could be abused—for instance, by authoritarian regimes—to falsely incriminate political opponents.
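Among the limits Apple outlined were two concrete ones: the on-device hash database would be built only from entries vouched for by child-safety organizations operating in at least two separate sovereign jurisdictions, and an account would be surfaced for human review only after a threshold number of matches. The Python sketch below is a simplified illustration of that logic under those stated assumptions, not Apple's implementation; the jurisdiction names, hash values, and threshold constant are placeholders.

```python
from collections import defaultdict

# Hypothetical constant; Apple's document cited an initial threshold of about
# 30 matching images before an account could be reviewed.
MATCH_THRESHOLD = 30

def build_on_device_database(hash_lists_by_jurisdiction):
    """Keep only hashes submitted by child-safety organizations in at least
    two separate jurisdictions, so no single government can add entries alone."""
    jurisdictions_per_hash = defaultdict(set)
    for jurisdiction, hashes in hash_lists_by_jurisdiction.items():
        for h in hashes:
            jurisdictions_per_hash[h].add(jurisdiction)
    return {h for h, sources in jurisdictions_per_hash.items() if len(sources) >= 2}

def needs_human_review(uploaded_image_hashes, on_device_db):
    """Surface an account for review only after the match count crosses the
    threshold, limiting the impact of occasional false positives."""
    matches = sum(1 for h in uploaded_image_hashes if h in on_device_db)
    return matches >= MATCH_THRESHOLD

# A hash pushed by organizations in only one jurisdiction never reaches devices.
db = build_on_device_database({
    "jurisdiction_A": {"hash_1", "hash_2", "hash_3"},
    "jurisdiction_B": {"hash_2", "hash_3", "hash_9"},
})
assert db == {"hash_2", "hash_3"}
```

Under this design, falsely incriminating a user would require cooperation across jurisdictions plus enough matching images to clear the review threshold, which is the property the quoted passage describes.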