
Chris Grande

macrumors 6502
Jun 17, 2003
254
130
For example, a parent's pictures of children in a bubble bath could open the door for their entire iCloud to be shared with law enforcement.

Just a point of clarification — the CSAM detection was run against a set of hashes of known CSAM material; it was not designed to use machine learning models to guess at what might be such material.
 
  • Like
Reactions: jonblatho

henryhbk

macrumors regular
Jul 26, 2002
134
134
Boston
Apple was planning on scanning your phone.
Not really, if by "scan" you mean create a hash (not scanning), then sure. That isn't "looking at your photos' content"; that is running all the pixels through a formula (say SHA-256) and matching the result to the hashes of known shared images. This would not have detected new photos (as in, you filmed your pedi-porn using your iPhone); it would only have caught a pornographic photo you got from the web that was already known to law enforcement. If this truly ran on-device, I'm not sure how it would create the backdoor for law enforcement mentioned in the article (in fact, I'd be shocked if the Photos app doesn't already hash photos just for its duplicate detection mechanism). Storage-wise it would have negligible effect (photos are huge, hashes are tiny).

The one thing that seems a tad sketchy: if Apple can hand-verify that a flagged photo is in fact CSAM, that would imply the encryption at rest is done with a mechanism Apple can decrypt? I thought they always bragged that not even they can decrypt your iCloud stuff... I thought it needed the data in your T2 chip?
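To make the distinction concrete, here is a minimal sketch of the kind of known-image matching described above, using an exact SHA-256 digest as the post does. This is a simplification: Apple's published design used a perceptual hash (NeuralHash) so that resized or re-encoded copies still match, along with a blinded hash database. The digest list and function names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified sketch of known-image matching.
// Apple's actual proposal used a perceptual hash (NeuralHash) and a blinded
// hash database, not a plain SHA-256 lookup like this.

/// Known-image digests, hex-encoded (placeholder; supplied out of band in a real system).
let knownImageDigests: Set<String> = []

/// Hex-encoded SHA-256 digest of a photo's raw bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// True only if this exact byte sequence is already on the known list.
/// A brand-new photo straight from the camera can never match.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageDigests.contains(digest(of: imageData))
}
```

The point of the post falls out of the sketch: matching is a lookup against a fixed list of known images, not an inference about what a photo depicts.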
 

AF_APPLETALK

macrumors 6502a
Nov 12, 2020
606
848
It's not like I want CSAM to go unnoticed, but the fedgov has reduced its credibility so much at this point that they can't be trusted with that kind of a tool.
 
  • Like
Reactions: huge_apple_fangirl

ghanwani

macrumors 601
Dec 8, 2008
4,628
5,810
The pilot worked so well, they had the FBI working full time on false positives!
 

Darth Tulhu

macrumors 68020
Apr 10, 2019
2,252
3,776
Yes, and there are also "secret" orders that can't be disclosed in any manner. So, while Apple can craft public relations on "their" behalf, it does not include the rest -- I believe every major tech entity must have orders in place for national security and other purposes. But that's another rabbit hole...
I try not to go down conspiracy rabbit-holes.

If there were a National Security need, I'm sure the NSA already has the power and authority to compel whatever organization to produce what it needs, without anything having to be in place prior to the NSA "request".

Of course, this would be classified and we'd never hear of it.
 
  • Like
Reactions: compwiz1202

MacProFCP

Contributor
Jun 14, 2007
1,222
2,952
Michigan
Just a point of clarification — the CSAM detection was run against a set of hashes of known CSAM material; it was not designed to use machine learning models to guess at what might be such material.
Thanks for pointing that out.

However, I feel my concern remains, as things like this have a way of becoming the monster they set out to kill. Once we allow CSAM detection, what is, or is not, considered CSAM is subject to change, with or without public input.
 
  • Like
Reactions: SFjohn

vicviper789

macrumors 6502
Jun 5, 2013
362
1,938
China probably told Tim Cook to say it’s dead when it’s actually fully implemented and in use 😂
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,733
Everybody should be happy CSAM is DOA
Unfortunately CSAM is still alive and well. This method of CSAM detection is dead before arrival.

And of course, to quite a few significant digits, "everybody" was and will remain unaware this was even a discussion.

Should those of us in the rounding error be happy? Depends on how things go from here. More victims won't make people happy. A high-profile case where someone kept a massive cache of CSAM in iCloud, handing a politician rather than Apple's technologists the excuse to make the rules, won't make people happy. If we get through the future with little additional harm, with all of us keeping this extra modicum of privacy, that'll make people happy.
 

ChromeCloud

macrumors 6502
Jun 21, 2009
357
836
Italy
This is very good news not just for Apple users, but for everybody.

If Apple had gone ahead and implemented its CSAM scanning algorithm, then Google, Amazon, Microsoft, and Samsung would probably have followed with their own implementations of cloud data scanning.

This is a win for everybody.
 
Last edited:
  • Love
Reactions: SFjohn

CarlJ

macrumors 604
Feb 23, 2004
6,976
12,140
San Diego, CA, USA
Thank you, Apple. CSAM was a joke.
Everybody should be happy CSAM is DOA
If Apple went ahead and implemented CSAM, ...
People keep referring to CSAM as if it were software written by Apple. It is not. It's Child Sexual Abuse Material, which is supremely awful. What Apple had was a proposal, design, and prototype software to scan for CSAM in a particular way. Stop conflating the two! Unless you're trying to imply that actual CSAM isn't really all that bad.
 
  • Like
Reactions: Analog Kid

Realityck

macrumors G4
Nov 9, 2015
10,338
15,568
Silicon Valley, CA
People keep referring to CSAM as if it were software written by Apple. It is not. It's Child Sexual Abuse Material, which is supremely awful. What Apple had was a proposal, design, and prototype software to scan for CSAM in a particular way. Stop conflating the two! Unless you're trying to imply that actual CSAM isn't really all that bad.
I don't think any of us you referenced portrayed it as software written by Apple. It was alleged earlier that Apple was planning on implementing CSAM surveillance, which was heavily criticized.
 
  • Like
Reactions: SFjohn

BGPL

macrumors 6502a
May 4, 2016
945
2,598
California
Apple folded, and it was all for money. The anime hentai owners squawked, and Apple heard their cries from their parents' basements. What a disgrace. The victims will ultimately not receive justice because of Apple's greed. Every other tech company with cloud storage is using this: Google, FB, MS, etc.

Pedophiles are gonna love iCloud. Quite the humanitarian huh Timmy?
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,733
For example, a parent's pictures of children in a bubble bath could open the door for their entire iCloud to be shared with law enforcement.
Thanks for pointing that out.

However, I feel my concern remains, as things like this have a way of becoming the monster they set out to kill. Once we allow CSAM detection, what is, or is not, considered CSAM is subject to change, with or without public input.
Unless you're distributing pictures of your kids in a bubble bath such that they end up in a law enforcement database, they can't become part of the hash list, with or without public input. If your concern is that the method of detection changes and starts looking for potential CSAM rather than hashing against known CSAM, then the rules can change now just as easily as they could have if the original method had gone forward.
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,733
People keep referring to CSAM as if it were software written by Apple. It is not. It's Child Sexual Abuse Material, which is supremely awful. What Apple had was a proposal, design, and prototype software to scan for CSAM in a particular way. Stop conflating the two! Unless you're trying to imply that actual CSAM isn't really all that bad.

It's like saying the PCR test gives you covid.

Which, now that I say it, I'm sure I've read people claiming too... nevermind... I expect too much of people...
 

owidhh

macrumors regular
Jun 12, 2021
161
203
"After extensive consultation with experts" (read: after extensive public pressure)...

Never should have succumbed to the public feedback! Perhaps a botched introduction but no one else would have done it 'right' like Apple because of the scrutiny they are under.

The problem is there is no way to do it right. Once the capability is introduced, once Apple caves, it's a downwards spiral towards worse and worse privacy/human rights.

The people of China have recently learned about this - total surveillance was all fine and good when it was applied to those "terrorists" far away in the west of the country, but then when they protested against too much government control that same surveillance was turned against them, much to their surprise.

It's better if the capability is not there at all in the first place.

It's better if Apple publicly denounces the whole "Think of the children!!!1111" mantra.
 

MacProFCP

Contributor
Jun 14, 2007
1,222
2,952
Michigan
Unless you're distributing pictures of your kids in a bubble bath such that they end up in a law enforcement database, they can't become part of the hash list, with or without public input. If your concern is that the method of detection changes and starts looking for potential CSAM rather than hashing against known CSAM, then the rules can change now just as easily as they could have if the original method had gone forward.
I understand that there are criteria for CSAM and I understand that the criteria would be substantial. However, if the technology is not implemented, then there are no criteria to be changed.

As Apple has killed the implementation of the technology, combined with today's announcement of Advanced Data Protection, which allows iCloud information to be completely encrypted and hidden even from Apple, it is my understanding (and I could be wrong) that it would be difficult, if not impossible, for nefarious actors to use personal iCloud information to suppress dissent.

I am all for utilizing technology to stop the horrific crimes being committed. I am simply unsure of the harm:good ratio considering that billions of people live in suppressed societies.
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,733
I understand that there are criteria for CSAM and I understand that the criteria would be substantial. However, if the technology is not implemented, then there are no criteria to be changed.

The criteria existed: a hash list of known circulating CSAM, provided by at least two child protection agencies operating under different governments; at least 30 matched known CSAM images detected before any alert is triggered; and matches confirmed manually before law enforcement is notified. All of it implemented and documented.
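For illustration only, here is a toy sketch of the 30-match threshold rule mentioned above. In the published design the threshold was enforced cryptographically (threshold secret sharing inside per-photo safety vouchers), so the server mathematically cannot decrypt anything below the threshold; this sketch models only the counting logic, and every identifier in it is hypothetical.

```swift
// Toy model of the review threshold described above.
// The real design enforced this cryptographically; nothing here reflects
// Apple's actual implementation, and all names are hypothetical.

struct UploadedPhoto {
    /// Whether the photo's hash matched an entry in the known-CSAM hash list.
    let matchedKnownHash: Bool
}

/// Threshold cited in Apple's published documentation.
let reviewThreshold = 30

/// An account is surfaced for manual review only once the number of matches
/// reaches the threshold; below it, no alert is raised at all.
func needsManualReview(_ uploads: [UploadedPhoto]) -> Bool {
    uploads.filter { $0.matchedKnownHash }.count >= reviewThreshold
}
```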

It's so hard to know how much of the outrage is about what Apple was planning to do and how much is about what people imagined Apple was planning to do...
 
  • Disagree
Reactions: SFjohn

SSDGUY

macrumors 65816
Jul 27, 2009
1,345
2,114
Apple's willingness to even entertain the idea of scanning users' photo libraries to see if they find something makes me think they've been drinking too much tea with Chinese government officials. I'm glad they walked it back, but I'm not suddenly all rosy and full of faith about Apple's personal privacy claims.
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,733
I’ve still yet to hear how exactly this step could be done without violating state and/or federal laws in the United States regarding the possession and distribution of CSAM.

It says the hash is encrypted with a "visual derivative". I don't see a description of what that comprises, but the language suggests Apple never possesses CSAM, only CSAM derivatives. It presumably has to be different enough that they don't have employees bleaching their eyeballs.
 