
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,625
31,011


In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.


Apple's full statement:
After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.
In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand. The Siri resources are also available, but CSAM detection never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
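
For anyone curious how the pieces in the last two paragraphs were meant to fit together, here is a heavily simplified Swift sketch of the general flow: hash each photo on the device, compare it against the set of known hashes, and only surface the account once a threshold of matches is crossed. This is not Apple's actual implementation (the real proposal used a perceptual hash called NeuralHash and cryptographic techniques so that sub-threshold matches were visible to no one); the type names, the SHA-256 stand-in, and the threshold field are all hypothetical, purely for illustration.

```swift
import Foundation
import CryptoKit

// Toy illustration of the flow described above: on-device matching against a
// set of known hashes, with a threshold before anything is flagged.
// NOT Apple's actual design, which used a perceptual hash (NeuralHash) and
// cryptographic protocols so sub-threshold matches stayed unreadable to anyone.

struct DetectionConfig {
    /// The "unreadable set of hashes" that would have shipped to the device.
    let knownHashes: Set<Data>
    /// Matches required before an account is flagged; Apple never published a
    /// concrete number, only the "one in one trillion" false-flag target.
    let threshold: Int
}

/// Hashes each photo's bytes (SHA-256 here as a stand-in) and counts exact
/// matches against the known-hash set.
func countMatches(photos: [Data], config: DetectionConfig) -> Int {
    photos.reduce(0) { count, photoBytes in
        let digest = Data(SHA256.hash(data: photoBytes))
        return count + (config.knownHashes.contains(digest) ? 1 : 0)
    }
}

/// Only once the match count crosses the threshold would an account have been
/// surfaced for human review and, potentially, a report to NCMEC.
func shouldSurfaceForHumanReview(photos: [Data], config: DetectionConfig) -> Bool {
    countMatches(photos: photos, config: config) >= config.threshold
}
```

The design point the sketch tries to preserve is that only hashes, never photos, would have been compared on the device, and nothing would have reached a human reviewer until the match threshold was met.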

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Article Link: Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos
 

BenGoren

macrumors 6502
Jun 10, 2021
471
1,336
Apple has lost my trust, so if I were to take Apple at their word, this is good news, but I remain skeptical of Apple going forward.

So there’s no way for Apple to regain your trust?

Corporations basically never admit mistakes these days, and certainly never eat crow. If Apple doing so isn’t enough to convince you that they actually do realize how stupid this was, what would be?

Can you point to a better alternative?

In today’s world, what Apple did took balls. It’s the right thing to do, and they at least deserve kudos for doing it.

If nothing else, you might want to think about the next time they screw up. Wouldn’t you want to not only have a stick to threaten them with, but also a carrot to reward them with so they have an incentive to turn around like they did here?

b&
 

antiprotest

macrumors 601
Apr 19, 2010
4,006
14,061
I think you are lecturing the wrong person. Consumer trust is fragile even over small matters, and Apple messed up big time. They knew it, or they would not have retracted something that sounded so virtuous. Apple should have known better than to mess with consumer trust. Lecture them instead, so perhaps they will not mess up again.
 

HMI

Contributor
May 23, 2012
844
327
Wow. All this time I figured they slipped it in a 0.1.1 update calling it “security.”

I’m not sure why they would even mention it now.

How sad that I would have believed “privacy” before they brought up CSAM, but now that they say they’re not doing it, I don’t believe them at all.
 