Apple ditching plan to scan iCloud photos for child abuse material amid privacy push


Apple is reportedly ditching a controversial plan to scan users’ photos stored in iCloud for child sexual abuse material, or CSAM, amid an ongoing privacy push. 

These safety tools, announced in August 2021, were meant to flag illicit content while preserving privacy. But the plans drew widespread criticism from digital rights groups who argued that the surveillance capabilities were ripe for potential abuse. 

Apple put the plans on pause a month later. Now, more than a year after its announcement, the company has no plans to move forward with the CSAM-detection tool.

The company says it is developing new features that will better balance users’ privacy and protect children. These will allow parents to limit their child’s contacts, restrict content and screen time, and provide an app store carefully curated for kids. 

Apple says the best way to prevent online exploitation of children is to interrupt it before it happens. The company pointed to new features it rolled out in December 2021 that were designed to do just that. 


Communication Safety in Messages, for instance, includes warnings when questionable photos are being sent, along with expanded guidance in Siri, Spotlight, and Safari Search.

The company is working on updates to Communication Safety in Messages to cover nudity in videos, along with other child safety protections. Apple says it is also working with child safety professionals to make reporting incidents to law enforcement more seamless. 

The company announced Wednesday it will now offer full end-to-end encryption for nearly all the data its users store in its global cloud-based storage system, making it more difficult for hackers, spies and law enforcement agencies to access sensitive user information.  
