
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,770
31,228


Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.


User devices will download an unreadable database of known CSAM image hashes and perform an on-device comparison against the user's own photos, flagging any that match known CSAM before they're uploaded to iCloud Photos. Apple says that this is a highly accurate method for detecting CSAM and protecting children.
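In plain terms, the on-device step amounts to deriving a hash from each photo and checking it against the downloaded list. Here is a minimal Swift sketch of that idea; to be clear, this is not Apple's implementation (Apple uses a perceptual hash and a blinded, unreadable database rather than a plain cryptographic hash and a readable set), and every name below is invented for illustration:

```swift
import Foundation
import CryptoKit

// Toy sketch only. Apple's real system uses a perceptual hash against a blinded,
// unreadable database; this uses a plain SHA-256 over the file bytes and an ordinary
// Set, purely to illustrate "compare the photo's hash against known hashes on device."
// All names here are invented.
struct KnownHashDatabase {
    let knownHashes: Set<Data>

    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)   // stand-in for the real perceptual hash
        return knownHashes.contains(Data(digest))
    }
}

// Usage sketch: decide whether a photo gets flagged before it is queued for iCloud Photos.
func shouldAttachVoucher(photoData: Data, database: KnownHashDatabase) -> Bool {
    return database.matches(photoData: photoData)
}
```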

CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the iCloud Photos feature is turned off.

Apple's method works by identifying a known CSAM photo on the device and attaching a safety voucher when it's uploaded to iCloud Photos. After a certain number of vouchers (i.e., flagged photos) have been uploaded to iCloud Photos, Apple can interpret the vouchers and performs a manual review. If CSAM content is confirmed, the user's account is disabled and the National Center for Missing and Exploited Children is notified.
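To make the threshold step concrete, here is a minimal Swift sketch. The real safety vouchers are cryptographic, and the point of the design is that Apple can't interpret them until enough accumulate; the plain counter and the threshold value below are invented stand-ins, not Apple's mechanism:

```swift
import Foundation

// Rough sketch of the threshold idea described above: nothing becomes reviewable
// until enough flagged uploads have accumulated. The real vouchers are cryptographic;
// this counter and the threshold value used later are purely illustrative.
struct VoucherAccumulator {
    let reviewThreshold: Int
    var voucherCount = 0

    // Returns true once the number of vouchers reaches the threshold.
    mutating func record(voucherAttached: Bool) -> Bool {
        if voucherAttached { voucherCount += 1 }
        return voucherCount >= reviewThreshold
    }
}

// Usage sketch; the threshold value here is made up, not a figure from the article.
var accumulator = VoucherAccumulator(reviewThreshold: 10)
for flagged in [false, true, true] {
    let reviewPossible = accumulator.record(voucherAttached: flagged)
    print("vouchers: \(accumulator.voucherCount), review possible: \(reviewPossible)")
}
```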

Because Apple is scanning iCloud Photos for the CSAM flags, it makes sense that the feature does not work with iCloud Photos disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if iCloud Photos is disabled on a user's device.

It's worth noting that Apple is scanning specifically for hashes of known child sexual abuse materials and it is not broadly inspecting a user's photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple's efforts to scan user photo libraries can disable iCloud Photos.

Security researchers have expressed concerns over Apple's CSAM initiative and worry that it could in the future be expanded to detect other kinds of content with political and safety implications, but for now, Apple's efforts are limited to detecting known CSAM.

Article Link: Apple Confirms Detection of Child Sexual Abuse Material is Disabled When iCloud Photos is Turned Off
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,649
10,603
Dang, you’re telling me that I just spent two hours explaining to people why they shouldn’t be worried about this, just for Apple to come and say that you can disable it?
It’s almost as if I wasted my time.
I feel closer to death now.
Like I wasted a huge section of my life I could’ve spent doing other things.
I wonder, when I’m on my deathbed, if I’ll look back at this exact day and think, “Hmm, maybe I was a fool.”
 

Mockletoy

macrumors 6502a
Sep 26, 2017
621
1,919
Gothenburg, Sweden
The only problem I have with this is that it can be turned off, but I understand why that’s the case.

Apple isn’t policing what you store on your device, only what you upload to their cloud servers.

There are no doubt slippery slope arguments to be made, but at the end of the day every one of these sickos who gets caught and locked up makes the world a better place. I’m willing to give up a slice of my privacy to make that happen.

And if I’m not, then Apple Photos clearly isn’t for me.
 

itsmeaustend

Suspended
Apr 27, 2016
332
816
Sounds like I’ll be turning off iCloud.

Apple go ahead and release that 1TB iPhone.

Please, respect our privacy as consumers. Don’t be creepy.
This can totally be done in a way which respects privacy. Anyone who’s versed in software engineering would know this.

Now, it can still be abused, yes. But not in the way you might think… hashes aren’t an abuse of privacy. Not by any means. The abuse vector comes from the fact that a different scanning filter could be applied. However, one would hope that Apple doesn’t turn into a totalitarian regime with Tim Cook (or anyone else) at the reins.

Though totalitarian regimes will find a way to develop this technology themselves and put it to whatever use they see fit. But that was going to happen with or without Apple scanning for CSAM.
 

zakarhino

Contributor
Sep 13, 2014
2,508
6,778
*As of right now

Forgot that tidbit, Apple, considering you went through all that effort to move this system, which already existed on iCloud, to on-device. Might want to mention how the system has changed with this update compared to before:

Before, Apple could only scan photos once they were on its servers:
Photos on phone -> uploaded to iCloud -> scanned on iCloud servers. (Photos on iCloud have been accessible to Apple since iCloud's inception; there is no zero-access encryption on iCloud Photos.)

Now, Apple are capable of scanning photos on device, even if you have iCloud turned off:

Photos on phone -> scanned on phone -> Uploaded to iCloud
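To spell out that ordering difference in code terms, here's a tiny Swift sketch. Every function name below is made up, none of this is Apple's API; it only shows where the hash match sits relative to the upload:

```swift
import Foundation

// Invented stand-ins, not Apple's API. They only mark where each step happens.
func scanOnServer(_ photo: Data) -> Bool { /* server-side hash match */ return false }
func scanOnDevice(_ photo: Data) -> Bool { /* on-device hash match */ return false }
func uploadToICloud(_ photo: Data, voucherAttached: Bool = false) { /* network upload */ }

// Previous model: the photo leaves the device first, matching happens on Apple's servers.
func previousFlow(_ photo: Data) {
    uploadToICloud(photo)
    _ = scanOnServer(photo)
}

// New model: the match runs on the phone itself, before the upload.
func newFlow(_ photo: Data) {
    let flagged = scanOnDevice(photo)
    uploadToICloud(photo, voucherAttached: flagged)
}
```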

This system is arguably WORSE for privacy than the previous system.
 

Mockletoy

macrumors 6502a
Sep 26, 2017
621
1,919
Gothenburg, Sweden
zakarhino said:
"This system is arguably WORSE for privacy than the previous system."
Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
 

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,567
6,073
I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
Who decides what's wrong, though? Regardless of where you fall politically, there is likely something you do or have that some politician wants to make illegal.

We're starting with something that people almost universally agree is wrong, but it's a slippery slope. Now that the tools are there, an authoritarian government can start telling Apple to do whatever it wants with them.

And everyone knows that Apple's commitment to human rights and privacy goes right out the window the moment the Chinese Communist Party asks for assistance in trampling them.
 

iObama

macrumors 65816
Nov 16, 2008
1,041
2,234
Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Here's the thing. That's great that you don't have any of that on your device!

But if you ever live in a country that, for some reason, wants to find something on your device and have it flagged in order to charge you with a crime, this sets a dangerous precedent.

Surveillance technology, while often well-intentioned, can easily end up in the wrong hands for nefarious purposes.
 