
bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
Correct. There's a difference between being able to access data and actually doing so.

I have never seen any evidence that Apple employees look at user data without consent — and there are enough whistleblowers out there that this would have come to light a long time ago. That said, I've also worked in IT long enough to know that there are many occasions where data can be seen inadvertently.

Similarly, law enforcement isn't allowed to go on fishing expeditions. They have to get a judge to sign off on a warrant for specific information, which has to be based on probable cause.

With that in mind, the question I have is: what exactly is the manual review process that gets triggered once images are flagged?

I.e., without looking at the actual picture, I don't see what value a human carrying out that manual review can bring on top of the "hash match" label already provided by the machine. But if they are actually looking at the picture, this seems to set a precedent that it is OK for employees to access customer data without notification or explicit consent.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
With that in mind, the question I have is: what exactly is the manual review process that gets triggered once images are flagged?
Well, all we know is what Apple says, which is:

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. The threshold is set to provide an extremely high level of accuracy that accounts are not incorrectly flagged. This is further mitigated by a manual review process where Apple reviews each report to confirm there is a match. If so, Apple will disable the user’s account and send a report to NCMEC.

So it certainly sounds like they are looking at the pictures, since they'd pretty much have to, but they're only looking at the ones that have specifically been flagged by the algorithm. The technical documents make it clear that they have no access to photos that are not flagged:

...it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.

I guess the logic here is that anybody who has a critical mass of "known CSAM" in their account doesn't deserve as high of a level of privacy. It's also worth keeping in mind that, assuming the algorithms work properly, Apple employees aren't looking at "private" images, per se — these would all be images that are already making the rounds online.

The threshold is also supposed to be high enough that false positives should be extremely rare. For instance, one or two photos mistakenly flagged as CSAM due to a false collision would be understandable, but if an account has 100 or more flagged photos, that definitely warrants further investigation.
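Just to put rough numbers on that: here's a quick back-of-the-envelope sketch of how unlikely it is for an innocent library to cross a match threshold by accident. The per-image error rate, library size, and threshold below are made-up placeholders, since Apple hasn't published the real figures.

```python
from math import comb

# Hypothetical figures -- Apple hasn't published the real per-image collision
# rate or the real threshold, so treat these as placeholders.
p = 1e-6          # assumed chance that one innocent photo falsely matches a hash
library = 20_000  # photos in a fairly large iCloud Photo Library
threshold = 30    # assumed number of matches needed before review is possible

# P(at least `threshold` false matches out of `library` photos): sum the binomial
# tail; the terms shrink so fast that a few dozen of them are plenty.
term = comb(library, threshold) * (p ** threshold) * ((1 - p) ** (library - threshold))
p_review = term
for k in range(threshold + 1, threshold + 50):
    term *= (library - k + 1) / k * p / (1 - p)
    p_review += term

print(f"Chance an innocent 20k-photo library ever reaches review: ~{p_review:.1e}")
```

Even with a fairly pessimistic per-image error rate, requiring dozens of independent matches pushes the account-level odds into astronomically small territory.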

The fact that the user has them in their iCloud Photo Library is obviously "private," technically speaking, but that sounds like it's a distinction for lawmakers, lawyers, and judges to sort out. However, despite all of Apple's highfalutin' talk about privacy, it's always reserved the right to monitor anything that's stored in your iCloud account. From Apple's iCloud Terms of Service:

Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.

That section actually follows a long list of things you're not allowed to do with iCloud, which includes "upload, download, post, email, transmit, store or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable," along with "plan or engage in any illegal activity."
 

DotCom2

macrumors 603
Feb 22, 2009
6,173
5,446
I remember years and years ago, when I would get my AT&T bill, it would be pages and pages and pages long. It would show every single call I made: the date, the time, the duration, and even the phone number of both incoming and outgoing calls. I'm sure this still occurs, but we just don't see it on our bills anymore. I don't know why people are shocked about this.
 
  • Like
Reactions: JMacHack

ColdShadow

Cancelled
Sep 25, 2013
1,860
1,929
If that were to happen I’d be here railing against it.

But as long as they’re just helping catch these sick freaks they have nothing but my support.
Will you be happy if, say, the police stop you every day and search your phone and bag to see IF you have something illegal?
Because this is exactly what Apple is going to do from now on.
The company that sold millions and millions of phones by saying: privacy! What happens on your phone stays on your phone..

That's not the way to do it.
This is very extreme and a 100% breach of privacy.
 

baypharm

macrumors 68000
Nov 15, 2007
1,951
973
It's funny how Google, MS, FB (and I assume Apple) have been doing CSAM checking of any images uploaded to the cloud for years. Apple moves the checking locally to improve privacy (and, I hope, will add E2E to iCloud photos), and people go nuts.

I understand the slippery slope argument, but that argument existed before and after this change. Either you trust Apple to stay within the CSAM parameters outlined or you don't. For example, if Apple wanted to scan devices for pirated media, they could put that change in at any time without going through all these hoops.
Simply because Apple has for years been preaching privacy for iPhone users. ‘What is on your iPhone… stays on your iPhone.’
 
  • Like
Reactions: bob24

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
Simply because Apple has for years been preaching privacy for iPhone users. ‘What is on your iPhone… stays on your iPhone.’
Heh, it does — unless you choose to sync it to Apple's servers.

At that point, it seems that it's fair game. Apple's iCloud terms and conditions have always made that clear, but of course nobody really reads those. Maybe more people should.
 

DummyFool

macrumors regular
Jan 15, 2020
245
385
With all due respect, you clearly don't really understand how advanced cryptographic technology works. The system is very clearly designed to only allow the flagged images to be decrypted, since they have security vouchers added at the iOS level before they're even uploaded.

To put it in basic terms, if your iPhone flags an image as matching a known image from the CSAM database, it will be encrypted with an additional key that will allow Apple to decrypt it. However, it's even more complicated than this, since Apple is using a well-known cryptographic technique known as Threshold Secret Sharing that splits up the key in such a way that it can't be used to decrypt any of the images until a certain threshold has been reached. In a nutshell, it's like each image getting 1/100th of the key — until all 100 images have been flagged and encrypted, you don't have the entire key. That's a massively oversimplified example, of course.

In other words, Apple won't even be able to look at any flagged images until enough of them have been flagged.
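To make the threshold idea concrete, here's a toy Shamir-style secret sharing sketch in Python. This is the textbook construction the white paper builds on, not Apple's actual code, and the key, share count, and threshold are invented for illustration:

```python
import random

# Toy Shamir-style threshold secret sharing over a small prime field.
# This is the textbook construction, NOT Apple's implementation; the prime,
# threshold, share count, and "key" below are all made up for illustration.
PRIME = 2**31 - 1   # a Mersenne prime; fine for a demo, far too small for real crypto
THRESHOLD = 3       # shares needed to reconstruct the key

def make_shares(secret, n_shares, threshold=THRESHOLD):
    """Split `secret` into points on a random degree-(threshold - 1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME

    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; only correct with >= THRESHOLD shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

voucher_key = 123456789                          # stand-in for the account's decryption key
shares = make_shares(voucher_key, n_shares=10)   # one share per "matching" photo

print(reconstruct(shares[:2]))   # below the threshold: a meaningless number, not the key
print(reconstruct(shares[:3]))   # at the threshold: 123456789
```

Below the threshold the interpolation just spits out a meaningless number; only once enough shares are in hand does the real key fall out.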


To be clear, I don't disagree that there's a slippery slope here in other ways. However, it's not based on how the system is designed in terms of encryption. When and if Apple enables full end-to-end encryption in iCloud Photos, it will likely be done in such a way that Apple won't have easy access to your photos — although it could still end up having a loophole like Messages in the Cloud, where the E2E encryption key is stored in your iCloud Backup for recovery purposes. However, that hole can easily be closed by not backing up your devices to iCloud, in which case Apple won't have the key at all.

The real danger, however, is that the entire system is based on matching images from a known database. Right now, that's a database of child abuse imagery. Tomorrow that could be a database of photos of "unlawful" protestors, dissidents, or just about anything else that the government might want access to. The system, as designed, is neutral in its approach — if the hash of an image matches the database, it gets flagged. It's what gets put into that database that controls what Apple is looking for.

We can only hope that Apple will have the guts to stand up to any authoritarian governments who would misuse this. Or even to US law enforcement officials. So far, it has a pretty good track record for that, so I'm not nearly as worried about it as some are. Plus, the database in question comes from the National Center for Missing and Exploited Children (NCMEC), which is focused exclusively on dealing with child abuse. I'd be much more concerned if it were being driven directly by the FBI or DoJ, which could of course choose to populate it with other images of things they might be looking for.
Apple can at any time change the algorithm to change which images get flagged, and if they can do it on images they can do it on any file as well.

Don't forget that Apple, under the law, has to provide any information requested by the NSA, and under the same law is forbidden to make the public aware of it or to admit it has done so.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
Apple can at any time change the algorithm to change which images get flagged, and if they can do it on images they can do it on any file as well.

Don't forget that Apple, under the law, has to provide any information requested by the NSA, and under the same law is forbidden to make the public aware of it or to admit it has done so.
Sure, but do you honestly believe that Apple would be advertising something like this if it was planning to use it for nefarious purposes?

More importantly, Apple already has access to everything in your iCloud Photo Library, so it can already provide ALL of your photos to the FBI, CIA, NSA, or any other shadowy three-letter government organization du jour, subject to whatever legal procedures are in place for that.

This new system actually changes nothing unless Apple decides to turn on end-to-end encryption for iCloud Photo Library, but that would be a massive net gain in privacy. We'd go from having our entire iCloud Photo Libraries being available upon request to only those photos that match other existing photos.

Even if the matching database was abused to find more than CSAM, that's still a tiny subset of your entire photo library. Plus the current system only allows your photos to be flagged if they match known photos. This would make it very useful in cases of copyright infringement, but it would be a stretch even to use this to find cases of political dissent — it would only work for catching people sharing copies of a specific photo.
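That's really the crux of the "known database" design. Stripped of the NeuralHash and crypto layers, the flagging decision is conceptually just a set-membership check against hashes someone else supplied. A simplified sketch, using SHA-256 in place of a perceptual hash:

```python
import hashlib

# Stand-in for the provider-supplied database of known-image hashes. The real
# system uses a perceptual "NeuralHash" so near-duplicates still match; SHA-256
# is used here only to keep the sketch self-contained.
known_hashes = {
    hashlib.sha256(b"known-image-A").hexdigest(),
    hashlib.sha256(b"known-image-B").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """A photo can only be flagged if its hash is already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"known-image-A"))     # True: a copy of something already in the database
print(is_flagged(b"my-holiday-photo"))  # False: novel content can never match
```

A novel photo, whatever it depicts, can't match because its hash isn't in the set; only the contents of the database determine what can be flagged.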

Sure, Apple could expand the algorithm to use machine learning to track down more. It could also turn on the microphone in your iPhone to start listening to everything you say. Or perhaps toggle on the FaceTime camera on demand to spy on you. If you're going to go down that line of thought, you can end up just about anywhere in terms of what Apple might do. After all, how do we know that Apple hasn't already been using the object detection features that were introduced in iOS 10 five years ago to secretly disclose information to government agencies?

There's a point at which you have to take Apple's word for things, and if you're concerned about what the NSA might secretly request from Apple, you really shouldn't be using iCloud in the first place. This doesn't change anything in that regard.
 

DummyFool

macrumors regular
Jan 15, 2020
245
385
Sure, but do you honestly believe that Apple would be advertising something like this if it was planning to use it for nefarious purposes?

More importantly, Apple already has access to everything in your iCloud Photo Library, so it can already provide ALL of your photos to the FBI, CIA, NSA, or any other shadowy three-letter government organization du jour, subject to whatever legal procedures are in place for that.

This new system actually changes nothing unless Apple decides to turn on end-to-end encryption for iCloud Photo Library, but that would be a massive net gain in privacy. We'd go from having our entire iCloud Photo Libraries being available upon request to only those photos that match other existing photos.

Even if the matching database was abused to find more than CSAM, that's still a tiny subset of your entire photo library. Plus the current system only allows your photos to be flagged if they match known photos. This would make it very useful in cases of copyright infringement, but it would be a stretch even to use this to find cases of political dissent — it would only work for catching people sharing copies of a specific photo.

Sure, Apple could expand the algorithm to use machine learning to track down more. It could also turn on the microphone in your iPhone to start listening to everything you say. Or perhaps toggle on the FaceTime camera on demand to spy on you. If you're going to go down that line of thought, you can end up just about anywhere in terms of what Apple might do. After all, how do we know that Apple hasn't already been using the object detection features that were introduced in iOS 10 five years ago to secretly disclose information to government agencies?

There's a point at which you have to take Apple's word for things, and if you're concerned about what the NSA might secretly request from Apple, you really shouldn't be using iCloud in the first place. This doesn't change anything in that regard.
I think the likely explanation of Apple's move is that the importance of privacy will diminish on the platform. Maybe it's not possible or desirable to continue to fight off those law enforcement requests.

That would be a good first step to prepare people. The future will tell.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
I think the likely explanation of Apple's move is that the importance of privacy will diminish on the platform. Maybe it's not possible or desirable to continue to fight off those law enforcement requests.
I think it's a necessary compromise.

Again, there's absolutely no reason for Apple to create this feature unless it plans to improve privacy for users' iCloud Photo Libraries. As of right now, Apple has 100% access to every single photo you store in iCloud. Why does it need to analyze them on your iPhone before it uploads them? That's a lot of work to build a feature that's completely redundant.

While the darker view is that Apple could extend this to every photo on your device, whether you use iCloud Photo Library or not, the very article we're commenting on confirms that this is not the case right now — and there's really no reason beyond pure speculation to believe that Apple would do this.

However, the more realistic perspective, considering Apple's past actions, is that it plans to enable end-to-end encryption for iCloud Photo Libraries. It's been trying to add more secure encryption and privacy for years now, but it's been frustrated repeatedly by politicians and law enforcement. It abandoned its plans for encrypted iCloud Backups because the FBI and DoJ pushed back hard.

It goes without saying that end-to-end encryption on iCloud Photos would absolutely raise the ire of politicians who have already vowed to "impose their will" on Apple through blanket legislation. Apple has to throw them a bone here, and preventing child abuse is the most potent issue that Apple's opponents like to throw out, because it's something that almost nobody can disagree with.
 
  • Like
Reactions: DotCom2

mdatwood

macrumors 6502a
Mar 14, 2010
919
908
East Coast, USA
Simply because Apple has for years been preaching privacy for iPhone users. ‘What is on your iPhone… stays on your iPhone.’
And still does. Apple has for years been CSAMing photos sent to iCloud. Now, they will CSAM them right before sending to iCloud. If you don't want them checking the photos, then don't use iCloud.

As I said earlier, I think this feature has been developed and released in preparation to move more (all would be amazing) services to having full E2E encryption thus greatly improving privacy for iPhone users.
 
  • Like
Reactions: jhollington

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
And still does. Apple has for years been CSAMing photos sent to iCloud. Now, they will CSAM them right before sending to iCloud. If you don't want them checking the photos, then don't use iCloud.

As I said earlier, I think this feature has been developed and released in preparation to move more (all would be amazing) services to having full E2E encryption thus greatly improving privacy for iPhone users.
Exactly.

I don't think people fully appreciate how much effort Apple has put into developing this feature. It's easy to read a small sound bite like "Apple is scanning photos on people's iPhones" without fully appreciating that Apple has jumped through multiple complex and sophisticated hoops to make absolutely certain that it can't access anything at all until a given account reaches the threshold of flagged photos. It's even gone so far as to take steps to obscure the count of flagged CSAM until it reaches the threshold.
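On that last point, the rough idea (as I understand it from Apple's technical summary) is that devices also emit "synthetic" vouchers that look like matches but carry useless payloads, so the server can't count real matches below the threshold. A loose conceptual sketch, with the rate, field names, and logic invented for illustration:

```python
import random

# Loose conceptual sketch of "synthetic vouchers". The rate, field names, and
# logic here are invented for illustration; this is not Apple's actual protocol.
SYNTHETIC_RATE = 0.02   # invented chance a non-matching photo still emits a
                        # match-looking voucher

def voucher_for(photo_matches_db: bool) -> dict:
    """Every uploaded photo gets a voucher; real and synthetic match-looking
    vouchers are indistinguishable to the server below the threshold."""
    looks_like_match = photo_matches_db or random.random() < SYNTHETIC_RATE
    return {
        "looks_like_match": looks_like_match,
        # payload is encrypted either way; a synthetic one carries no usable key share
        "payload": "enc(real key share)" if photo_matches_db else "enc(dummy)",
    }

# An entirely innocent 5,000-photo library still produces roughly 100 match-looking
# vouchers, so counting them tells the server nothing precise about real matches.
vouchers = [voucher_for(False) for _ in range(5_000)]
print(sum(v["looks_like_match"] for v in vouchers))
```

A server counting match-looking vouchers always sees noise on top of the real number, so the count alone reveals nothing precise until the threshold is actually crossed.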

Everyone really needs to go and at least skim this document...


While your eyes are probably going to glaze over halfway down page 2 unless you have a working knowledge of basic cryptography, what's in there should make anybody with even a modicum of technical knowledge appreciate that this is a very complex system that Apple has put a lot of thought into designing.

As somebody who has a hobbyist knowledge of crypto, I look at this and find it elegantly beautiful, but then again, I'm a total nerd that way.

Apple would not have gone to even half this effort unless it was really serious about privacy.
 
  • Like
Reactions: DotCom2

Yojimbo007

macrumors 6502a
Jun 13, 2012
693
576
ORWELLIAN!
Besides being absolutely invasive, what do they hope to accomplish with this?? It will solve nothing!
Those who have these tendencies and know they are being watched will simply circumvent it by using other platforms and avoid Apple iCloud, or Apple altogether.
Including those who just can't stand the totalitarian approach of big tech and Apple's increasingly invasive walled garden (it feels like a walled China).
This is a horrific PR/publicity/business move by Apple… it feels Orwellian! It IS ORWELLIAN!

Don't do it, Tim… this will alienate everyday normal people who value privacy!
IT IS NOT in the best interest of the shareholders AND SOLVES NOTHING!
 
  • Like
Reactions: JMacHack

JMacHack

Suspended
Mar 16, 2017
1,965
2,422
People worry about liberties in China. I promise, within 20 years, some Republicans will pass laws that mandate the right to search your stuff for anti-American (read: anti-conservative) material. Maybe sooner than that.
It's adorable that you think the Democrats have your back: https://www.govtrack.us/congress/votes/107-2001/s313

And in more recent news:

Spying is a bipartisan issue. Never be deluded otherwise.
 
  • Like
Reactions: _Spinn_ and IG88

baypharm

macrumors 68000
Nov 15, 2007
1,951
973
Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, as well as scan for illicit photography. What about government surveillance of dissidents or protesters? So now we have to watch what we say in messages? It's called pre-censorship: we are afraid of what might happen, so we don't talk about it.
 

MacBH928

macrumors G3
May 17, 2008
8,351
3,734
Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.

No one is upset because they are fighting against pedophiles.
People are upset because you are scanning without their consent. Today you scan for pedophiles, tomorrow you scan for political opinions, and after that you scan people's photos just in case there's something you can blackmail them with in the future.
 

Darth Tulhu

macrumors 68020
Apr 10, 2019
2,250
3,773
The arguments defending this policy are just plain dumb.

This is the equivalent of me willfully allowing an entity full access to my house unannounced, so they can come in and do spot checks "just in case I'm up to (whatever they deem to be) no good".

I'm sorry, but I was comfortable with them lying that they believe in Privacy.

But to openly admit they will spy on you is just unbelievably stupid and exceeds my "benefit-of-the-doubt" threshold.

Apple really frakked up here.

I'm really pissed here because I absolutely love the Apple ecosystem, and never thought I would have to go back to the (Android-like) wilderness of offline syncing.

I'll be stopping ALL my Apple services effective immediately, until they lie to me again, backpedal on this publicly, and some crafty Linux-loving paranoids validate it...

This is NOT a slippery slope.

This is BS. :mad:
 
Last edited by a moderator:

DeepIn2U

macrumors G5
May 30, 2002
12,852
6,892
Toronto, Ontario, Canada
What’s next? Scanning your stuff on iCloud for anti-government materials for oppressive governments?
Uh, newsflash: Apple has already been doing that. The Chinese government gets more data, and if you're a Chinese citizen with an iPhone used in the country, your data is already looked at. That is completely different from this initiative here. Why are people mixing two completely different initiatives up as the same thing? What's wrong with this thought?
 
  • Like
Reactions: haruhiko

haruhiko

macrumors 604
Sep 29, 2009
6,534
5,882
Uh, newsflash: Apple has already been doing that. The Chinese government gets more data, and if you're a Chinese citizen with an iPhone used in the country, your data is already looked at. That is completely different from this initiative here. Why are people mixing two completely different initiatives up as the same thing? What's wrong with this thought?
Lol you just proved my point
 
  • Like
Reactions: benspratling

Apple_Robert

Contributor
Sep 21, 2012
34,504
50,065
In the middle of several books.
Yeah, I believe this is the exact legal culpability that Apple wants to avoid by implementing the feature.

It wouldn’t surprise me if the authorities found some awful humans were sharing this type of content via iCloud photo sharing and suddenly this became an absolute priority to block.
And Apple can still protect themselves with server-side scanning. There is no need for on-device scanning to accomplish the goal. The reason Apple is doing on-device scanning is to broaden the scope and control of this feature.
 

bushman4

macrumors 601
Mar 22, 2011
4,043
3,553
By having a cellphone, you are actually giving up all your privacy.
Unfortunately, we all need an iPhone for one reason or another.
And the more features you use on the iPhone, the more privacy you give up.
Here's the proof:
Take a person who doesn't use a cellphone, perhaps an elderly person or someone who doesn't want one, whose name you know.
Now try looking up information about that person and you'll be surprised how little info, if any, comes up. Then try looking up info on your own name.
I bought an iPhone for personal use, not for supplying Apple with info about where I am or where I've been, and not for supplying Apple with various private info like what I'm buying and where.
The list goes on and on.
Just remember: anything you do with your iPhone has the potential of not being private. Think about what you're doing.
 