Apple has officially scrapped its plans to scan images stored in iCloud Photos for known Child Sexual Abuse Material (CSAM), a system that would have allowed Apple to report detected material to law enforcement agencies, with the iPhone maker saying on Tuesday […]