Apple scraps plans to roll out controversial CSAM iCloud Photo scanning feature
Apple has officially scrapped plans to scan images stored in iCloud Photos for known Child Sexual Abuse Material (CSAM), a feature that would have allowed the company to report detected material to law enforcement agencies. The iPhone maker said on Tuesday it will instead deepen its investment in the Communication Safety feature it first made available in December 2021.
Apple faced an immediate backlash from privacy advocates after announcing the CSAM iCloud Photo scanning feature last year. The company postponed the introduction of the safeguarding toolset in response to complaints, including fears that governments could force Apple to add non-CSAM images to the detection list, and questions over how false positives would be reported and handled.
In a statement shared with WIRED, Apple said that “Children can be protected without companies combing through personal data,” confirming that it has “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”
After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
The Communication Safety feature Apple refers to in the statement is a tool that warns children and their parents when a child receives or sends sexually explicit photos through iMessage.
Launched first in the US and later expanded to the UK, Canada, Australia, and New Zealand, Apple’s Communication Safety in Messages feature warns children and parents when a sexually explicit image is sent or received on a device connected to iCloud Family Sharing, and automatically blurs explicit images sent in Messages.
When explicit content is received, the safeguarding tool blurs the image and warns the child. As an additional precaution, the child will be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections apply if a child attempts to send sexually explicit photos: the child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.