
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,610
30,963


It has now been over a year since Apple announced plans for three new child safety features, including a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.


Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others."

In September 2021, Apple posted the following update to its Child Safety page:
Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson informed The Verge that Apple's plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time.

We've reached out to Apple to ask if the feature is still planned. Apple did not immediately respond to a request for comment.

Apple did move forward with implementing its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was "designed with user privacy in mind." The system would perform "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
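For readers wondering what "on-device matching using a database of known CSAM image hashes" looks like in practice, here is a minimal sketch in Swift. It is not Apple's NeuralHash or its private set intersection protocol; the hash function (SHA-256 as a stand-in for a perceptual hash), the placeholder database, and the matchCount helper are all illustrative assumptions.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for a perceptual hash. Apple's system used NeuralHash,
// which matches visually similar images; SHA-256 here matches exact bytes only.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical on-device hash database. In Apple's design the entries are
// blinded, so neither the device nor its owner can read or enumerate them.
let knownHashes: Set<String> = [
    imageHash(Data("known-flagged-sample".utf8))  // placeholder entry for the demo
]

// Count matches locally; in Apple's design, nothing identifiable was to leave
// the device until a threshold number of matches was exceeded.
func matchCount(for photos: [Data]) -> Int {
    photos.filter { knownHashes.contains(imageHash($0)) }.count
}

let library = [Data("known-flagged-sample".utf8), Data("holiday-photo".utf8)]
print(matchCount(for: library))  // 1
```

The point of the design was that the matching itself happens locally, against hashes the device cannot read, rather than by uploading readable photos for server-side inspection.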

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
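To see why a threshold matters for the "one in one trillion" claim, here is a back-of-the-envelope sketch in Swift. Apple never published its per-image false-match rate, and treating photos as independent trials is a simplification, so the numbers below (rate, threshold, library size) are purely hypothetical.

```swift
import Foundation

// Hypothetical parameters -- Apple did not publish these figures.
let perImageFalseMatchRate = 1e-6  // assumed chance a single innocent photo falsely matches
let threshold = 30                 // assumed number of matches required before review
let librarySize = 20_000           // assumed photo library size

// Probability of at least `t` false matches among `n` photos, modelling each
// photo as an independent Bernoulli trial (a binomial tail probability).
func falseFlagProbability(n: Int, p: Double, t: Int) -> Double {
    func logChoose(_ n: Int, _ k: Int) -> Double {
        lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
    }
    return (t...n).reduce(0.0) { sum, k in
        sum + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
    }
}

print(falseFlagProbability(n: librarySize, p: perImageFalseMatchRate, t: threshold))
// Vanishingly small -- requiring many independent matches, rather than one,
// is what drives the false-flag probability down.
```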

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Article Link: Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos
 

LinusR

macrumors 6502
Jan 3, 2011
332
515
Being silent probably means something is coming down the line.
One of the rumoured theories at the time was: Apple will realise what a terrible idea this is, and how terrible it makes them look (and PR image is EVERYTHING to Apple, as we all know), so they will drop it and never mention it again. I’m taking the current situation (Apple pretending as if this idea never existed) as evidence for that take.

A spokesperson saying none of their plans have changed is just preserving the status quo. That’s comfortable for them right now. If they make a change now, one way or the other, they’ll still open a can of worms they’ll hardly be able to close. It’s not quite all water under the bridge yet.
 

xxray

macrumors 68040
Jul 27, 2013
3,040
9,158
Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
I guess you must be smarter than “security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.”

You also must have missed this part of the article:

Some critics argued that Apple's child safety features could create a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

I’m all for protecting children and anyone in general from abuse, but invading the privacy of the entire rest of the population to do it isn’t the way to go. You don’t let someone into your house to check for illegal substances or content just because you might have them.
 

Populus

macrumors 601
Aug 24, 2012
4,709
6,965
Spain, Europe
One of the rumoured theories at the time was: Apple will realise what a terrible idea this is, and how terrible it makes them look (and PR image is EVERYTHING to Apple, as we all know), so they will drop it and never mention it again. I’m taking the current situation (Apple pretending as if this idea never existed) as evidence for that take.

A spokesperson saying none of their plans have changed is just preserving the status quo. That’s comfortable for them right now. If they make a change now, one way or the other, they’ll still open a can of worms they’ll hardly be able to close. It’s not quite all water under the bridge yet.
It could be that way, but I guess there’s the possibility of Apple implementing such technology without them announcing or making it public.

I say this because, while it would be easier for Apple to drop its plans to implement such technology, I think the European Commission or the European Parliament is asking for ways to detect this content easily, as well as for ending or limiting E2E encryption. I don’t have the sources at hand, but I’ve read news about it.

Bad times for privacy I’m afraid. Good times for governments who want more and more power over their citizens.
 

BGPL

macrumors 6502a
May 4, 2016
936
2,582
California
I hope everyone who has CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell.

For the ignorant people who don't know how hashing works, take five minutes out of your day and learn about it. Take another five minutes and read a few victim impact statements from children who are/were impacted by the distribution of CSAM. If you're not victimizing children, then move along; you have nothing to worry about. Yeah, hashing iCloud albums will create a "back door". Ignorance at its best. More people who don't know how hashing works.

BTW, Facebook, Google, Dropbox and a host of others have been hashing images and videos for years now. And thank god, it's put a lot of pedophiles behind bars. Have you heard about any "back doors" from these providers?
 

Unregistered 4U

macrumors G3
Jul 22, 2002
9,949
7,903
Apple planned to report iCloud accounts with known CSAM images to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" that would ensure "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, plus a manual review of flagged accounts by a human.
I don’t believe Apple planned to report iCloud accounts with known CSAM images, because Apple would have no idea whether there were known CSAM images or not. They’d ONLY know that a large number of images in an account matched the hashes of known CSAM. So Apple, just like everyone else maintaining storage for users, would report accounts that have a high number of hash matches.
 

Unregistered 4U

macrumors G3
Jul 22, 2002
9,949
7,903
Being silent probably means something is coming down the line.
If Apple ever reports that iCloud is 100% encrypted, such that they don’t even have access to a user’s images, they’re absolutely doing what they announced earlier, scanning on the device. The government would not allow Apple’s cloud storage to be some safe haven for those who really MUST keep large numbers of CSAM images on their device and in their cloud storage.
 

GMShadow

macrumors 68000
Jun 8, 2021
1,805
7,416
I hope everyone who has CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell.

For the ignorant people who don't know how hashing works, take five minutes out of your day and learn about it. Take another five minutes and read a few victim impact statements from children who are/were impacted by the distribution of CSAM. If you're not victimizing children, then move along; you have nothing to worry about. Yeah, hashing iCloud albums will create a "back door". Ignorance at its best. More people who don't know how hashing works.

BTW, Facebook, Google, Dropbox and a host of others have been hashing images and videos for years now. And thank god, it's put a lot of pedophiles behind bars. Have you heard about any "back doors" from these providers?
Cambridge Analytica ring a bell?
 

coolfactor

macrumors 604
Jul 29, 2002
7,088
9,821
Vancouver, BC
One of the rumoured theories at the time was: Apple will realise what a terrible idea this is, and how terrible it makes them look (and PR image is EVERYTHING to Apple, as we all know), so they will drop it and never mention it again. I’m taking the current situation (Apple pretending as if this idea never existed) as evidence for that take.

A spokesperson saying none of their plans have changed is just preserving the status quo. That’s comfortable for them right now. If they make a change now, one way or the other, they’ll still open a can of worms they’ll hardly be able to close. It’s not quite all water under the bridge yet.

It's only terrible in the minds of those who don't understand the extensive technological process that Apple implemented to maintain privacy for end users.

It is a well-designed system that is worlds apart from how other companies handle photo gallery scanning.

As for false positives, if someone is intentionally 'planting' photos into someone's iCloud Photo Gallery, then there's another issue that needs to be addressed. The account is compromised, and that has nothing to do with this feature.

Regarding the possibility of back doors, this is just a matter of trust. Apple already clearly stated that they would not allow this feature to be abused by law enforcement agencies for any other purpose, and I choose to believe that. If they were going to allow that, they could do it in secret, and to date they never have.
 

coolfactor

macrumors 604
Jul 29, 2002
7,088
9,821
Vancouver, BC
I hope everyone who has CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell.

For the ignorant people who don't know how hashing works, take five minutes out of your day and learn about it. Take another five minutes and read a few victim impact statements from children who are/were impacted by the distribution of CSAM. If you're not victimizing children, then move along; you have nothing to worry about. Yeah, hashing iCloud albums will create a "back door". Ignorance at its best. More people who don't know how hashing works.

BTW, Facebook, Google, Dropbox and a host of others have been hashing images and videos for years now. And thank god, it's put a lot of pedophiles behind bars. Have you heard about any "back doors" from these providers?

There's a BIG difference between how Apple's feature is handling this and how the other providers you mentioned handle it. They all do it on the server. Apple's feature operates entirely ON DEVICE. The hashes never leave the device unless significant thresholds are crossed.

This was clearly explained by Apple, but those against this feature continue to compare it to how the other providers are doing it... on their servers, where privacy invasion can happen much more easily. That has never been Apple's approach with this feature.
 

Unregistered 4U

macrumors G3
Jul 22, 2002
9,949
7,903
Nope, I don't believe that is correct. Apple's website said "Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC."
Apple manually reviews a report to confirm that the “hash” matches. Because it IS possible that the algorithm failed and yielded a false match. If they verify that the hash represented a positive match, then they disable the account and send the report. Apple does not have, and will not be provided access to, the actual images; they are only provided with the hashes.
 

Joe Rossignol

Senior Reporter
Staff member
May 12, 2012
908
3,492
Canada
Apple manually reviews a report to confirm that the “hash” matches. Because it IS possible that the algorithm failed and yielded a false match. If they verify that the hash represented a positive match, then they disable the account and send the report. Apple does not have, and will not be provided access to, the actual images; they are only provided with the hashes.
I added the word "hashes" to that paragraph of the story to be abundantly clear.
 

baryon

macrumors 68040
Oct 3, 2009
3,880
2,941
Imagine if the Chinese government could one day use this system to check who has the Tank Man image in their cloud storage and deport that person to a murder camp the next day without question.

Apple can only enforce the local law. If the law is different in a different country, will it enforce that for its citizens? Say everyone agrees that child abuse is bad. But what if, in Russia, where homosexuality is pretty much a crime, anything labeled "LGBT propaganda aimed at minors", such as an informative book about an LGBT subject, were called "child abuse" for political reasons and thus made illegal? Would Apple play international judge and pick and choose what it considers right and wrong based on its own morals, or would it strictly abide by the respective laws of each country, even if they go against Apple's initial "good intentions"? What happens when a government puts pressure on Apple to hand over control of this system to them "or else"? Will they do the right thing, or will there come a point where money will matter more? (Hint: money eventually always takes priority over morals).

It sounds good but it gets messy the more questions you ask, which is not a good omen.
 

steve09090

macrumors 68020
Aug 12, 2008
2,144
4,135
I think now is the perfect time for MacRumors to suspend comments. When this story gets some traction, we know it’s going to be a race to the bottom and will get nasty like previous stories about this.

Meanwhile +1 to Apple for keeping this going. I hate paedophiles.
 

Unregistered 4U

macrumors G3
Jul 22, 2002
9,949
7,903
One of the rumoured theories at the time was: Apple will realise what a terrible idea this is, and how terrible it makes them look (and PR image is EVERYTHING to Apple, as we all know), so they will drop it and never mention it again. I’m taking the current situation (Apple pretending as if this idea never existed) as evidence for that take.
Legally, they’re absolutely doing something, unless you feel that folks who want to maintain large repositories of CSAM images would be perfectly protected from prying eyes on an iCloud drive :) If they’ve changed anything, they’ve pushed back on the idea of encrypting everyone’s images.
 

Count Blah

macrumors 68040
Jan 6, 2004
3,192
2,748
US of A
Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
CSAM, Tank Man, Winnie the Pooh, pro/anti-Trump (whichever side you find yourself on), etc…

It’s not the ACTUAL subject, it’s the fact that they CAN and are eager to do it. I’m less inclined to be pissed when it’s iCloud, since it’s their storage. But Apple wanted to search our PERSONAL devices. You know that if they are scanning our devices, any despot can knock on Apple’s local office door with many armed thugs and order the scanning of anything the despot desires. Apple has already proven it will bend over backwards for the CCP, so it would only be a matter of time.

Screw that and anyone who supports on-device scanning.
 

MrRom92

macrumors 6502a
Sep 30, 2021
933
1,977
I think there is potential for this system to go badly when innocent photos are viewed in a different context. For example…

Let’s say, you are a parent and you take completely innocent photos of your child.
Some parents take photos where the child may be nude (doesn’t everyone have the classic embarrassing “baby butt” shot in their childhood photo album?), but nobody is offended because everyone knows there is no malicious intent behind the image. It’s just a child in a state of undress.
So, you have a photo like this of your kid, or your kid in the bath, or your kid at the pool/beach, etc. And you post it to social media, and nobody thinks anything of it, because to anyone with a properly working brain, there is nothing to think about.


But THEN, some creeper trolls public social media accounts like Facebook and Instagram for pictures other people post of their children, sees a few that excite them for reasons of their own, saves the good ones to their computer and shares them online on some sicko forum, or trades them with other perverts, etc.

Now when one of them gets caught, or their website gets raided, etc. all their files get flagged as CSAM because of the context in which they were being distributed and viewed by these people, completely unbeknownst to you, the child’s parent, who now still has this original photo on their phone or in their iCloud. And the checksums match because it’s the same file. Do you see how this goes wrong?

I do not know nearly enough about the process by which material is determined to be CSAM, but this scenario doesn’t seem implausible to me.
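As a small illustration of the "same file, same checksum" point above, here is a Swift sketch. It uses SHA-256 purely as a stand-in (Apple's NeuralHash is a perceptual hash, designed so that visually similar copies also match): a byte-for-byte copy of a file always produces the same digest, while any modification produces a completely different one.

```swift
import Foundation
import CryptoKit

func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let original = Data("family beach photo bytes".utf8)  // stand-in for a photo file
let exactCopy = original                              // the same file, re-shared elsewhere
var editedCopy = original
editedCopy[0] ^= 1                                    // flip a single bit

print(sha256Hex(original) == sha256Hex(exactCopy))    // true  -- identical files always match
print(sha256Hex(original) == sha256Hex(editedCopy))   // false -- any change breaks an exact hash
```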
 