
ColdShadow

Cancelled
Sep 25, 2013
1,860
1,929
I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
This is a very bad take. Imagine if the police could, and would, raid your home on a regular basis, search everywhere, and do a strip search, all just in case you MIGHT have something illegal. And remember: without your permission, and on a constant, regular basis.
That's the equivalent of what Apple is doing. And not only that, it unlocks the same access for governments, various other authorities, and of course hackers to access your data really easily.

Don't be simple-minded and bring up that "I have nothing to hide" BS. 99% of the people who are concerned and angry about this don't have anything illegal. They worry about their privacy, and rightfully so.
 

baypharm

macrumors 68000
Nov 15, 2007
1,951
973
The software Apple will be using is reportedly called neuralMatch. It will be installed on all iPhones in the upcoming iOS 15 and iPadOS 15 updates. Every picture uploaded to iCloud in the United States will get a safety certificate that indicates whether the picture is suspicious or not. Apple will decrypt the suspicious photos and, if the material appears to be illegal, submit the results to the proper authorities.

It’s very similar to the algorithms used to accurately predict the outcomes of sports games (convolutional neural network (CNN), random forest (RF), and support vector machine (SVM)).
 
Last edited:

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
So, let me get this straight: Apple will have the ability, based on flags from an algorithm, to access my iCloud photo library. This tells me that, even without my password, an untold number of outside individuals could have access to my iCloud library.
As things stand now, Apple already has full access to your iCloud Photo Library. It is not end-to-end encrypted, merely encrypted in transit (the same as any SSL web site, basically) and stored encrypted "at rest" on Apple's servers.

Right now, the privacy and security of your photos in iCloud is based solely on how much you trust Apple's policies in general, and whether a law enforcement agency is ever likely to serve a warrant to Apple requesting access to your photos.

I would be all for this if it was end to end encryption that Apple had no access to. They could only read the flags and alert law-enforcement who could then start an investigation.
This appears to be exactly what Apple is moving toward. Apple's technical outline makes it pretty clear that the system is designed to only allow Apple staff and law enforcement agencies to view those specific photos that have been flagged as matches to known CSAM. It does not open the door for anybody to view your entire photo library.

Except, of course, that right now Apple can view your entire photo library. So, the only reason for Apple to even bother explaining the new technology this way is if it's planning to debut end-to-end encryption for iCloud Photo Library.

In fact, it's most likely that this new initiative is a necessary prerequisite to Apple doing this. Can you imagine the reaction of the US DoJ and Congress if Apple suddenly turned iCloud into an unreadable black box, effectively creating a safe haven for child abusers to store their images with impunity?

Basically, if Apple wants to offer much better privacy and security for the trillions of legal photos already being stored in iCloud, it has to have a way of identifying and dealing with the illegal ones.
 

DummyFool

macrumors regular
Jan 15, 2020
245
385
People worry about liberties in China. I promise, within 20 years, some Republicans will pass laws that mandate the right to search your stuff for anti-American (read: anti-conservative) material. Maybe sooner than that.
 

jthesssin

macrumors regular
May 6, 2013
162
95
Matthews NC
So you’re saying Apple’s been lying all these years about how even they don’t have access to your iCloud storage since they don’t have the key? Adding iCloud photos to end to end encryption means nothing if someone else has the key and can go in at any time. That’s like giving a key to your house to a random person and then telling everyone your house is secure. It’s not secure as long as someone besides yourself has a key to get in! My point is, if someone besides myself has the key to unlock my supposedly secure, encrypted iCloud data, I should not be held responsible for the data it may contain… hell, even banks don’t have a copy of the key to a safe deposit box. You abandon that bank/deposit box, they break in via locksmith, etc., to gain access!
 

cmaier

Suspended
Jul 25, 2007
25,405
33,471
California
So you’re saying Apple’s been lying all these years about how even they don’t have access to your iCloud storage since they don’t have the key? Adding iCloud photos to end to end encryption means nothing if someone else has the key and can go in at any time. That’s like giving a key to your house to a random person and then telling everyone your house is secure. It’s not secure as long as someone besides yourself has a key to get in! My point is, if someone besides myself has the key to unlock my supposedly secure, encrypted iCloud data, I should not be held responsible for the data it may contain… hell, even banks don’t have a copy of the key to a safe deposit box. You abandon that bank/deposit box, they break in via locksmith, etc., to gain access!
No, Apple has not been lying. They never said that they couldn’t decrypt iCloud *backups*. That’s different than iCloud storage.
 

mdatwood

macrumors 6502a
Mar 14, 2010
919
908
East Coast, USA
So you’re saying Apple’s been lying all these years about how even they don’t have access to your iCloud storage since they don’t have the key? Adding iCloud photos to end to end encryption means nothing if someone else has the key and can go in at any time. That’s like giving a key to your house to a random person and then telling everyone your house is secure. It’s not secure as long as someone besides yourself has a key to get in! My point is, if someone besides myself has the key to unlock my supposedly secure, encrypted iCloud data, I should not be held responsible for the data it may contain… hell, even banks don’t have a copy of the key to a safe deposit box. You abandon that bank/deposit box, they break in via locksmith, etc., to gain access!
Apple has been very clear about what is and is not E2E encrypted.

 

IG88

macrumors 65816
Nov 4, 2016
1,109
1,637
Apple will decrypt the suspicious photos and, if the material appears to be illegal, submit the results to the proper authorities.
Appears to be illegal? Wow that isn't nebulous at all. Who decides what the threshold for "appears to be illegal" is?

Sounds like they'll err on the side of forwarding a lot of pictures, and let the authorities sort it out.
 

jthesssin

macrumors regular
May 6, 2013
162
95
Matthews NC
So, they can in fact decrypt your iCloud data at their whim (don't care if it's your photos, drive, backups, etc.). Sounds unsecured to me.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
So you’re saying Apple’s been lying all these years about how even they don’t have access to your iCloud storage since they don’t have the key?
As others have pointed out, Apple has never once said that iCloud Photos were end-to-end encrypted. There's a difference between "encrypted" and "end-to-end encrypted," and Apple clarifies the distinction in the support document that mdatwood linked to above.

Adding iCloud photos to end to end encryption means nothing if someone else has the key and can go in at any time. That’s like giving a key to your house to a random person and then telling everyone your house is secure. It’s not secure as long as someone besides yourself has a key to get in! My point is, if someone besides myself has the key to unlock my supposedly secure, encrypted iCloud data, I should not be held responsible for the data it may contain… hell, even banks don’t have a copy of the key to a safe deposit box. You abandon that bank/deposit box, they break in via locksmith, etc., to gain access!
If you read Apple's Technical Summary of CSAM Detection, they explain that the system is designed to only disclose those images that are flagged as matching known CSAM images, and further it doesn't even get access to those until a certain threshold of matched images is reached:

  • Apple does not learn anything about images that do not match the known CSAM database.
  • Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
 

jthesssin

macrumors regular
May 6, 2013
162
95
Matthews NC
If Apple can decrypt one file in your cloud storage, they can decrypt ALL files. Stop saying they will only access “flagged” images. It never stops there and will only be expanded the longer it’s allowed to be in use! I’m reminded of RICO laws and how they’ve been morphed into our current civil asset forfeiture laws… what started as “great intentions” for the “greater good” of society, to “disrupt criminal gangs and cartels” is now used by local law enforcement to seize cash at the side of the road under the auspices of “crime eradication”
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
Appears to be illegal? Wow that isn't nebulous at all. Who decides what the threshold for "appears to be illegal" is?

Sounds like they'll err on the side of forwarding a lot of pictures, and let the authorities sort it out.
The "appears to be illegal" has more to do with false positives.

The CSAM Detection is based on matching specific images from a known database of child abuse images that have already been found online. Apple is not algorithmically looking for "suspicious" content.

However, since the algorithm uses "hashing" to do this — reducing every photo to a numerical value — it's conceivable that a completely unrelated image could compute to the same numerical hash as an actual child abuse image. It's not based on image content, but rather the numbers the algorithm spits out. It could be an image of a tractor, for example. It's just how the math ultimately works, and it's not even a new problem. CRC checksums and MD5 hashes have had this same issue for years, and it's always been a problem for forensic investigators.

This is why there's a threshold under which Apple won't even know that any images have been flagged. It's only once a critical mass of images appears to be in a user's account that they'll go in and take a look. At that point, if they find a bunch of images of farm equipment, they'll obviously decide that's a false positive and not worth reporting. Images that contain any nudity will be sent to authorities — most likely because they are actual matches to the CSAM database.
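
If it helps to see why that kind of false positive is even possible, here's a toy sketch in Python. To be clear, this is purely my own illustration and nothing like Apple's actual NeuralHash: I'm using a deliberately tiny 16-bit hash and made-up "known bad" values, just to show that the check compares numbers against a database rather than ever looking at the picture itself.

Code:
# Toy illustration only -- NOT Apple's NeuralHash. Matching compares hash
# values against a known-bad set; the image content itself is never inspected,
# so an unrelated input can collide with a "bad" hash purely by chance.
import hashlib

def tiny_hash(data: bytes, bits: int = 16) -> int:
    # Truncate SHA-256 to a few bits so collisions are easy to demonstrate.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") >> (256 - bits)

# Stand-in for the known CSAM hash database (values here are obviously made up).
known_bad_hashes = {tiny_hash(b"known-bad-image-1"), tiny_hash(b"known-bad-image-2")}

def is_flagged(image_bytes: bytes) -> bool:
    return tiny_hash(image_bytes) in known_bad_hashes

# Brute-force an unrelated input (our "tractor photo") that happens to collide.
target = tiny_hash(b"known-bad-image-1")
i = 0
while tiny_hash(f"tractor-photo-{i}".encode()) != target:
    i += 1
print(f"tractor-photo-{i} collides with a known-bad hash -> flagged?",
      is_flagged(f"tractor-photo-{i}".encode()))

With a real, full-length perceptual hash, random collisions like this are astronomically rarer, which is exactly why the threshold exists as a backstop for the few that could slip through.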
 

mdatwood

macrumors 6502a
Mar 14, 2010
919
908
East Coast, USA
It's funny how Google, MS, FB, (and I assume Apple), have been doing CSAM checking of any images uploaded to the cloud for years. Apple moves the checking locally to improve privacy (and I hope to add E2E to iCloud photos), and people go nuts.

I understand the slippery slope argument, but that argument existed before and after this change. Either you trust Apple to stay within the CSAM parameters outlined or you don't. For example, if Apple wanted to scan devices for pirated media, they could put that change in at any time without going through all these hoops.
 
  • Like
Reactions: jhollington

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
If Apple can decrypt one file in your cloud storage, they can decrypt ALL files. Stop saying they will only access “flagged” images. It never stops there and will only be expanded the longer it’s allowed to be in use!
With all due respect, you clearly don't really understand how advanced cryptographic technology works. The system is very clearly designed to only allow the flagged images to be decrypted, since they have safety vouchers added at the iOS level before they're even uploaded.

To put it in basic terms, if your iPhone flags an image as matching a known image from the CSAM database, it will be encrypted with an additional key that will allow Apple to decrypt it. However, it's even more complicated than this, since Apple is using a well-known cryptographic technique known as Threshold Secret Sharing that splits up the key in such a way that it can't be used to decrypt any of the images until a certain threshold has been reached. In a nutshell, it's like each image getting 1/100th of the key — until all 100 images have been flagged and encrypted, you don't have the entire key. That's a massively oversimplified example, of course.

In other words, Apple won't even be able to look at any flagged images until enough of them have been flagged.
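
For anyone curious what threshold secret sharing actually looks like, here's a minimal Shamir-style sketch in Python. This is my own illustration of the general technique, not Apple's actual construction (which is built around a threshold private set intersection protocol), and the threshold and share counts are arbitrary.

Code:
# Minimal Shamir-style threshold secret sharing, for illustration only.
# Conceptually: each matched image contributes one "share"; fewer than
# `threshold` shares reveal nothing useful about the key.
import secrets

PRIME = 2**127 - 1  # a large prime field for the toy key

def split_secret(secret: int, threshold: int, num_shares: int):
    # Random polynomial of degree (threshold - 1) with the secret as its constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)                            # the "decryption key"
shares = split_secret(key, threshold=10, num_shares=30)   # one share per matched image
assert reconstruct(shares[:10]) == key                    # 10 matches: key recoverable
assert reconstruct(shares[:9]) != key                     # 9 matches: still useless
                                                          # (with overwhelming probability)

Run it and the asserts pass: any 10 shares rebuild the key exactly, while 9 shares interpolate to essentially random garbage.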

I’m reminded of RICO laws and how they’ve been morphed into our current civil asset forfeiture laws… what started as “great intentions” for the “greater good” of society, to “disrupt criminal gangs and cartels” is now used by local law enforcement to seize cash at the side of the road under the auspices of “crime eradication”
To be clear, I don't disagree that there's a slippery slope here in other ways. However, it's not based on how the system is designed in terms of encryption. When and if Apple enables full end-to-end encryption in iCloud Photos, it will likely be done in such a way that Apple won't have easy access to your photos — although it could still end up having a loophole like Messages in the Cloud, where the E2E encryption key is stored in your iCloud Backup for recovery purposes. However, that hole can easily be closed by not backing up your devices to iCloud, in which case Apple won't have the key at all.

The real danger, however, is that the entire system is based on matching images from a known database. Right now, that's a database of child abuse imagery. Tomorrow that could be a database of photos of "unlawful" protestors, dissidents, or just about anything else that the government might want access to. The system, as designed, is neutral in its approach — if the hash of an image matches the database, it gets flagged. It's what gets put into that database that controls what Apple is looking for.

We can only hope that Apple will have the guts to stand up to any authoritarian governments who would misuse this. Or even to US law enforcement officials. So far, it has a pretty good track record for that, so I'm not nearly as worried about it as some are. Plus, the database in question comes from the National Center for Missing and Exploited Children (NCMEC), which is focused exclusively on dealing with child abuse. I'd be much more concerned if it were being driven directly by the FBI or DoJ, which could of course choose to populate it with other images of things they might be looking for.
 
  • Like
Reactions: jthesssin

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
It's funny how Google, MS, FB, (and I assume Apple), have been doing CSAM checking of any images uploaded to the cloud for years. Apple moves the checking locally to improve privacy (and I hope to add E2E to iCloud photos), and people go nuts.

I understand the slippery slope argument, but that argument existed before and after this change. Either you trust Apple to stay within the CSAM parameters outlined or you don't. For example, if Apple wanted to scan devices for pirated media, they could put that change in at any time without going through all these hoops.
Bingo.

This is a good thing. Sure, it has the potential for abuse, but images stored in the cloud have been scannable for years, and Apple is being very careful to make sure it only applies to devices that use iCloud Photo Library, which sort of makes sense since it's arguably responsible for anything stored on its servers, while it naturally feels it has no business sticking its fingers into what's stored only locally on users' iPhones.

Further, Apple has been scanning and analyzing photos on our devices since 2016, when iOS 10 was released. Nobody seemed to care back then. Everybody saw it as a good move, since it was common knowledge that Google had been doing the same thing for years on its cloud servers. In fact, when Apple first unveiled the feature in iOS 10, it was being so careful about privacy that every device had to perform its own analysis — nothing was synced through iCloud Photo Library. That later changed as Apple improved its encryption and users became more comfortable with the idea, but even today, while the photos themselves aren't end-to-end encrypted, the identifying metadata is. That's why you can't search for people or objects in iCloud on the web.

Let's face it, US politicians and the DoJ would have had kittens if Apple tried to turn on end-to-end encryption in iCloud Photos without offering some solution for detecting CSAM. It would create a completely safe haven for child abusers, and if there's one rallying cry that the public and almost all politicians can get behind when it comes to online crime, it's "Won't somebody please think of the children?"
 
  • Like
Reactions: mdatwood

jthesssin

macrumors regular
May 6, 2013
162
95
Matthews NC
And your info on this special encryption of flagged images comes from where exactly? It is not stated anywhere in the article that flagged images are encrypted differently than the other data, nor does it state what the threshold is, or how many flagged events constitute a complete key… Just to be clear, the images currently on my iCloud photo storage (that I assume are run through this same scrutiny on their servers) will be the same ones being analyzed at the local level should they implement this. I am in no way worried about my images being flagged, but this is a bad idea in a long sad history of bad ideas…
 

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
This has always been the case with iCloud backups and iCloud Photo Library. This content is encrypted with a key known to Apple. They even document which data they can and cannot read.

I am not sure they ever mentioned employees being authorised to access the files. There have been cases whereby law enforcement asked for data, which Apple provided. But Apple's own employees reviewing user data after decrypting it and without explicit consent would be a different story.
 

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
And your info on this special encryption of flagged images comes from where exactly? It is not stated anywhere in the article that flagged images are encrypted differently than the other data, nor does it state what the threshold is, or how many flagged events constitute a complete key… Just to be clear, the images currently on my iCloud photo storage (that I assume are run through this same scrutiny on their servers) will be the same ones being analyzed at the local level should they implement this. I am in no way worried about my images being flagged, but this is a bad idea in a long sad history of bad ideas…

For reference I posted a link to Apple's more detailed technical explanation and my understanding of it earlier.

 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
And your info on this special encryption of flagged images comes from where exactly? It is not stated anywhere in the article that flagged images are encrypted differently than the other data, nor does it state what the threshold is, or how many flagged events constitute a complete key…
It's from Apple's Technical Summary of the feature, which I cited above, but here's the direct link in case it wasn't obvious before...


To be fair, you're right that it doesn't state what the threshold is, quite likely because this will be a moving target. All Apple says is that "The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account."
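
Just to give a feel for why a threshold helps so much, here's some back-of-the-envelope arithmetic in Python. The per-image false-match rate and library size below are pure assumptions on my part; the only number Apple actually publishes is that account-level 1-in-a-trillion figure.

Code:
# Back-of-the-envelope only: p (per-image false-match rate) and n (library size)
# are assumptions, not Apple's numbers. Shows how requiring several matches
# collapses the chance of falsely flagging a whole account.

def prob_account_flagged(p: float, n: int, threshold: int, max_terms: int = 200) -> float:
    # P(X >= threshold) for X ~ Binomial(n, p), summed term by term so we never
    # have to compute enormous binomial coefficients directly.
    term = (1 - p) ** n          # P(X = 0)
    total = 0.0
    for k in range(min(n, threshold + max_terms)):
        if k >= threshold:
            total += term
        term *= (n - k) / (k + 1) * p / (1 - p)   # P(X = k+1) from P(X = k)
    return total

p = 1e-6        # hypothetical per-image false-match rate
n = 10_000      # hypothetical photo library size
for t in (1, 5, 10, 30):
    print(f"threshold {t:>2}: chance of falsely flagging this account ~ {prob_account_flagged(p, n, t):.1e}")

The exact numbers don't matter; the point is that even a pessimistic per-image rate shrinks to something vanishingly small once multiple independent matches are required before anyone can look.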

It's also been independently assessed by three leading experts in cryptographic research, all people who are way smarter than any of us are. Having worked in IT security for over 20 years, I have an armchair enthusiast's understanding of cryptographic technology — enough to read the document above and comprehend what Apple is talking about at a macro level, but these are the people who actually take it apart mathematically and procedurally to look for flaws in it.

Technical Assessment of CSAM Detection — Benny Pinkas (PDF)
Technical Assessment of CSAM Detection — David Forsyth (PDF)
Technical Assessment of CSAM Detection — Mihir Bellare (PDF)
Alternative Security Proof of Apple PSI System — Mihir Bellare (PDF)

Just to be clear, the images currently on my iCloud photo storage (that I assume are run through this same scrutiny on their servers) will be the same ones being analyzed at the local level should they implement this. I am in no way worried about my images being flagged, but this is a bad idea in a long sad history of bad ideas…
As I said, the bad idea is the potential for abuse by changing the parameters of what gets flagged. The system itself seems solid, but since it's designed to flag any known images that are fed into the algorithm, it's ripe for abuse, and preventing that rests solely on Apple's ability to refuse to play ball with agencies that would try to use it for other purposes.
 

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
It's funny how Google, MS, FB, (and I assume Apple), have been doing CSAM checking of any images uploaded to the cloud for years. Apple moves the checking locally to improve privacy (and I hope to add E2E to iCloud photos), and people go nuts.

I understand the slippery slope argument, but that argument existed before and after this change. Either you trust Apple to stay within the CSAM parameters outlined or you don't. For example, if Apple wanted to scan devices for pirated media, they could put that change in at any time without going through all these hoops.

I could be wrong, but to my knowledge Apple hasn't been scanning users' personal data before.

They might have been scanning pictures before publishing them on a public or semi-public content sharing platform (such as iCloud's photo sharing feature), and personally I have no issue with companies validating content before they publish it to a wider audience (it is not private anymore, as the user has elected to share the content through the mediation of Apple, Google, or FB, and of course once they publish it those companies also have a legal responsibility). But that is different from scanning personal data which no one asked to share.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
This is crazy! What if parents like to take photos of their kids or baby taking a bath? Which they think is cute!
That's not what this is. For the CSAM Detection feature, Apple is only comparing photos to a database of known child abuse images. So, a picture of your kids or baby taking a bath isn't going to get flagged unless it's already been making the rounds on the dark web.

The only feature that will be doing any kind of analysis is the new Communication Safety in Messages. That will definitely look for nude photos of any kind, but it's not reporting them to anybody, except for the parents of children under the age of 13, and even then it only does so if the child decides to actually bypass the warning and look at the photo.
 

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
I could be wrong, but to my knowledge Apple hasn't been scanning users' personal data before.
Actually, Apple's Chief Privacy Officer, Jane Horvath, told a panel at CES early last year that Apple has algorithms in place to scan iCloud Photo Libraries to "help screen for child sexual abuse material."

It's very likely Apple has been doing this for almost as long as other major tech companies that allow photo storage. Google has been doing it since 2008, and Facebook started back in 2011. Much like the new CSAM Detection, however, this is only about scanning for matches to known photos that are already out in the wild.

The goal for this new feature seems to remain the same as before — to catch actual child predators who are using iCloud Photo Library to store their personal collections of filth. The difference is that by moving to on-device scanning, Apple will be able to encrypt users' iCloud Photo Libraries without raising the ire of politicians and law enforcement agencies.
 
  • Like
Reactions: Realityck and bob24

jhollington

macrumors 6502a
Sep 23, 2008
530
589
Toronto
I am not sure they ever mentioned employees being authorised to access the files. There have been cases whereby law enforcement asked for data, which Apple provided. But Apple's own employees reviewing user data after decrypting it and without explicit consent would be a different story.
Correct. There's a difference between being able to access data and actually doing so.

I have never seen any evidence that Apple employees look at user data without consent — and there are enough whistleblowers out there that this would have come to light a long time ago. That said, I've also worked in IT long enough to know that there are many occasions where data can be seen inadvertently.

Similarly, law enforcement isn't allowed to go on fishing expeditions. They have to get a judge to sign off on a warrant for specific information, which has to be based on probable cause.
 

mdatwood

macrumors 6502a
Mar 14, 2010
919
908
East Coast, USA
Actually, Apple's Chief Privacy Officer, Jane Horvath, told a panel at CES early last year that Apple has algorithms in place to scan iCloud Photo Libraries to "help screen for child sexual abuse material."
...
The difference is that by moving to on-device scanning, Apple will be able to encrypt users' iCloud Photo Libraries without raising the ire of politicians and law enforcement agencies.
And note, the reason the tech companies have been doing this for so long is to avoid legislation, which tends to be a sledgehammer instead of a scalpel. I'm betting Apple is going to announce expanding their E2E encryption at the iPhone event this year, and, like you said, wants to avoid the ire of LEOs by already having a CSAM solution in place.
 
Last edited:
  • Like
Reactions: jhollington