
Kar98

macrumors 65816
Feb 20, 2007
1,258
884
Soon, a wave of "dumb pedo conveniently sent CP to local police, gets arrested" news stories, because that's the line we're going to go with here, same as back in the day of warrantless wiretapping, when we said "dumb criminal calls 911 during commission/discussion of crime" instead.

i.e. they're still gonna do it, but now they're not going to tell the public about it.
 

MacProFCP

Contributor
Jun 14, 2007
1,222
2,952
Michigan
Hashes of known circulating CSAM provided by at least two child protection agencies operating under different governments. At least 30 matched known CSAM images must be detected before triggering an alert. Matches confirmed manually before notifying law enforcement. All of it implemented and documented.

It's so hard to know how much of the outrage is about what Apple was planning to do and how much is about what people imagined Apple was planning to do...
Your assumption that people may have incorrect assumptions about Apple's plan may be correct. However, once that tree grows, you can't stop the poison fruit.

Humans have a terrible track record when it comes to abuse. If Apple implemented this technology, what would stop China or Russia or Iran or Saudi Arabia or any of the many other countries who suppress freedom from insisting that Apple use the technology to assist their immoral behavior?

Do you think once in place, Apple would be able to shut it down without consequence? Do you think Apple would have been able to launch iCloud encryption, effectively shutting down CSAM detection?

My point is that we live in a world filled with bad state actors, where evil people rule over billions of innocents. Are you willing to put such power in the hands of people who are not to be trusted?
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,734
If Apple implemented this technology, what would stop China or Russia or Iran or Saudi Arabia or any of the many other countries who suppress freedom from insisting that Apple use the technology to assist their immoral behavior?
What’s to stop them from insisting if Apple doesn’t deploy it? What’s to stop them from insisting it be implemented in a less narrow, transparent and secure way?

You’re making a slippery slope argument about regimes that wouldn’t hesitate to push someone off a cliff.
 
Last edited:

MacProFCP

Contributor
Jun 14, 2007
1,222
2,952
Michigan
What’s to stop them from insisting if they don’t deploy it? What’s to stop them from insisting it be implemented in a less narrow, transparent and secure way?

You’re making a slippery slope argument about countries that wouldn’t hesitate to push someone off a cliff.

OK. Let’s assume that all of us on this forum are just postulating different assumptions and none are correct. Very possible.

However, one thing is fairly clear: Apple saying that they are killing an expensive and invasive technology in the name of privacy is telling. And doing so on the same day they implement iCloud encryption further sets the tone that Apple is locking the door to user data and throwing away the keys.

Maybe I'm wrong. Maybe we're all wrong. Either way, I don't believe CSAM scanning would have been fruitful as, once implemented, any criminal with half a brain would simply turn off iCloud sync.
 
  • Like
  • Love
Reactions: ericwn and SFjohn

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,734
OK. Let’s assume that all of us on this forum are just postulating different assumptions and none are correct. Very possible.

However, one thing is fairly clear: Apple saying that they are killing an expensive and invasive technology in the name of privacy is telling. And doing so on the same day they implement iCloud encryption further sets the tone that Apple is locking the door to user data and throwing away the keys.

Maybe I'm wrong. Maybe we're all wrong. Either way, I don't believe CSAM scanning would have been fruitful as, once implemented, any criminal with half a brain would simply turn off iCloud sync.

Which was the whole point-- prevent people from putting criminal material in an encrypted cache and sharing the password.

Apple saying they're throwing in the towel doesn't mean they suddenly agree with you. It means Apple decided it wasn't worth all the negative PR they were getting and their attempts at educating the public over the past year have only led to more waves of FUD.

So now the question is what happens when one of those regimes you mention makes end-to-end encryption of certain data illegal and uses CSAM as the stalking horse. Apple has tried to address it in a way that undercuts the false pretext and prevents a fishing expedition, but it blew back in their face. Now it's likely that they'll either be forced to play ball, or exit those markets and leave them to hollow companies that won't even make an effort to hold the line.
 

Fat_Guy

macrumors 65816
Feb 10, 2021
1,012
1,078
Look, forget about the CSAM fiasco.


USB C is coming next year to the iPhone! 👍👍👍👍👍👍👍👍
 

Grey Area

macrumors 6502
Jan 14, 2008
423
1,004
Just a point of clarification — the CSAM detection was against a set of known hashed CSAM material, and was not designed to use machine learning models to identify what might be such material.

Not really. If by "scan" you mean create a hash (not scanning), then sure. That isn't "looking at your photos' content." That is running all the pixels through a formula (say SHA-256) and matching the result to the hashes of known shared images.
No, this was not hashing like SHA-256. A cryptographic hashing algorithm like SHA-256 aims at creating wildly different hash values for two pieces of input data, even when those have only minor differences (e.g. one image and one copy with a few pixels altered). Also, the hash values are non-reversible, i.e. you cannot recreate the original data from the hash.
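For illustration, here's a quick Python sketch (nothing from Apple's system, just standard-library hashing) of that avalanche behaviour: flip a single bit in the input and the SHA-256 digest changes completely.

```python
import hashlib

# Stand-ins for an image file and a copy with a single pixel altered.
original = bytes(range(256)) * 1000
altered = bytearray(original)
altered[0] ^= 0x01  # flip one bit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(altered)).hexdigest())
# The two digests share no visible structure, so a plain SHA-256 match
# would miss even a trivially modified copy of a known image.
```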

What Apple was planning to do was semantic hashing, which is very different: it is using a machine learning classifier trained to analyze the content of the images, and two images showing similar things are then supposed to get similar hash values. If this algorithm determines that an image on your phone shows things similar to those in a known CSAM picture, this would count as a match.

I do not think Apple ever provided any specifics on how generous this similarity measurement would be. They said the system was intended to catch minor alterations of the known CSAM pictures, so maybe it had rather tight tolerances. On the other hand, if that was the intention, Apple could have chosen to exclude photos the user took with the camera and only scan downloaded pictures, but from what I understand they did plan to scan user photos.

(Also, semantic hashes are reversible, which is why Apple planned to encrypt the known CSAM hashes stored on the phones and to keep them out of reach from the user.)
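To make that concrete, here is a toy Python sketch of similarity matching on image descriptors. The vectors and the threshold are invented for illustration; Apple never published its tolerances and this is not NeuralHash:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical descriptors produced by an embedding network.
known_csam_descriptor = [0.12, -0.40, 0.88, 0.05]
slightly_edited_copy = [0.11, -0.38, 0.90, 0.06]   # near-duplicate image
unrelated_photo = [-0.70, 0.22, -0.10, 0.65]       # different content

THRESHOLD = 0.95  # made-up tolerance
print(cosine_similarity(known_csam_descriptor, slightly_edited_copy) > THRESHOLD)  # True
print(cosine_similarity(known_csam_descriptor, unrelated_photo) > THRESHOLD)       # False
```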
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,734
No, this was not hashing like SHA-256. A cryptographic hashing algorithm like SHA-256 aims at creating wildly different hash values for two pieces of input data, even when those have only minor differences (e.g. one image and one copy with a few pixels altered). Also, the hash values are non-reversible, i.e. you cannot recreate the original data from the hash.

What Apple was planning to do was semantic hashing, which is very different: it is using a machine learning classifier trained to analyze the content of the images, and two images showing similar things are then supposed to get similar hash values. If this algorithm determines that an image on your phone shows things similar to those in a known CSAM picture, this would count as a match.

I do not think Apple ever provided any specifics on how generous this similarity measurement would be. They said the system was intended to catch minor alterations of the known CSAM pictures, so maybe it had rather tight tolerances. On the other hand, if that was the intention, Apple could have chosen to exclude photos the user took with the camera and only scan downloaded pictures, but from what I understand they did plan to scan user photos.

(Also, semantic hashes are reversible, which is why Apple planned to encrypt the known CSAM hashes stored on the phones and to keep them out of reach from the user.)
It's worth reading the document.

It is a perceptual hash, not a semantic hash. It isn't seeking images with the same meaning (semantics); it's looking for images that look like the same image. It is looking to create a common hash value for variations of a specific image: cropped, rotated, color shifted, and probably a lot of other things.

In particular, this is not an image classifier. It is not inferring CSAM/not CSAM. It is an image detector, determining if your image is already in a known database. The downside of this is that it is not looking for, and will not detect, abusive imagery you may have created yourself; it is looking to see if you have a copy of an image that is already in circulation.

If this algorithm determines that an image on your phone shows similar things to a known CSAM image, it would not count as a match. If this algorithm determines that an image on your phone is a manipulated version of a known CSAM image it would count as a match.

It won't be perfect. It will miss some CSAM depending on the manipulations. Hashing implies information reduction, so there is always the possibility of a false positive. A false positive does not imply that your image is "similar" to CSAM; it just means it hashed to the same value and is most likely completely innocuous. This is why it requires 30 positive matches to generate an alert: to give most people headroom for occasional false matches (not to give everyone a pass for 29 criminal images).

Apple has said there is a 1 in a trillion chance that an account would be falsely flagged. If there are a billion iCloud accounts, there's a 1 in a thousand chance someone would get referred to Apple for manual verification. I suspect that number is based on a numerical analysis with some assumptions and probably underestimates the risk-- but still the risk to anyone should be exceedingly low. Your match results are encrypted so that Apple can't tell anything about them, including whether you have 0 or 29 matches, unless you exceed the 30 match threshold.
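The back-of-the-envelope version of that estimate (the 1-in-a-trillion figure is Apple's claim; the billion-account count is just a round assumption):

```python
# Apple's stated probability that a given account is falsely flagged,
# i.e. accidentally crosses the 30-match threshold.
p_false_flag_per_account = 1e-12

# Assumed number of iCloud Photos accounts (a round guess, not Apple's figure).
num_accounts = 1e9

# Expected number of accounts falsely referred for manual review.
print(p_false_flag_per_account * num_accounts)  # 0.001, i.e. roughly a 1 in 1,000 chance
```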

Excluding pictures taken with the camera leaves open the possibility of altering the metadata of downloaded images to avoid the scan, and still would mean that if you share your picture with someone else it'll get scanned on their phone.

The hashes are not reversible: see hash. If you have a hash you can't create the image from it. There is less data in the hash than in the source image, and that data can't be guessed. The hashes are spoofable. With effort, it would be possible to construct an image that has the same hash as another different image. Most likely the image would look like crap, but it would tickle the hashing function in just the right way.

The hashes are probably encrypted to prevent people from creating spoof images and bogging down the system, and to prevent people from being able to pre-test their own image library for matches.
 
  • Like
Reactions: MecPro

v0lume4

macrumors 68020
Jul 28, 2012
2,483
5,122
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Ultimately, we shouldn't trust any cloud storage when it comes to our privacy. Plus, local backups are fun to manage. Or is that just me? 🙃
 

PBMB

macrumors 6502
Mar 19, 2015
324
126
Would not have mattered to me either way.
Me too. I have never uploaded photos to iCloud because I never liked the idea from the beginning. In fact, I mostly use my (standalone) digital camera to take photos. Sometimes I inspect them on a Mac, or store a small number of them, but that's it.
 

PBMB

macrumors 6502
Mar 19, 2015
324
126
Ultimately, we shouldn't trust any cloud storage when it comes to our privacy. Plus, local backups are fun to manage. Or is that just me? 🙃
No. You are not alone; see my reply above. But that does not mean I don't care about the subject discussed here. Apple did the right thing to abandon this potentially very dangerous technology.
 
  • Like
Reactions: v0lume4

MuppetGate

macrumors 6502a
Jan 20, 2012
651
1,086
Apple folded, and it was all for money. The anime hentai owners squawked and Apple heard their cries from their parents' basements. What a disgrace. The victims will ultimately not receive justice because of Apple's greed. Every other tech company with cloud storage is using this: Google, FB, MS, etc.

Pedophiles are gonna love iCloud. Quite the humanitarian huh Timmy?

The difference is that Google and MS are footing the bill for their CSAM detection. Apple’s scheme would pass the cost on to their customers by using their phones‘ battery and processor to handle the scan.
 
Last edited:

SirAnthonyHopkins

macrumors 6502a
Sep 29, 2020
946
1,887
How sad, that I would have believed “privacy” before they brought up CSAM, but now that they say they’re not doing it, I don’t believe them at all.
If you don't believe them, find proof and raise a class action. Do you realise how bad it would be for Apple to say they're not doing something this invasive and then just go ahead and do it anyway?
 

No5tromo

macrumors 6502
Feb 17, 2012
397
1,029
What were they even thinking in the first place? Proactively scanning all of our photos to find child pornography, and then having actual people double-checking potentially false-positive private photos of your family? And who's to say that the person looking at our personal photos isn't a perv or something? Are they totally insane? I am all for measures that protect children, but you can't just arbitrarily do stuff like that. This is as intrusive as if the police randomly and forcefully broke into our houses on a regular basis and started proactively searching everything that we own in hopes of finding something illegal. That's not how it works. It's a good thing that they won't go forward with it, but the sheer fact that they even considered it after trying to convince us how much they care about privacy is bad enough.
 
  • Like
Reactions: huge_apple_fangirl

Grey Area

macrumors 6502
Jan 14, 2008
423
1,004
It's worth reading the document.

It is a perceptual hash, not a semantic hash. It isn't seeking images with the same meaning (semantics); it's looking for images that look like the same image. It is looking to create a common hash value for variations of a specific image: cropped, rotated, color shifted, and probably a lot of other things.

Apple was most definitely going for semantics:
"The embedding network represents images as real-valued vectors and ensures that perceptually and
semantically similar images have close descriptors in the sense of angular distance or cosine similarity.
Perceptually and semantically different images have descriptors farther apart, which results in larger
angular distances."



In particular, this is not an image classifier. It is not inferring CSAM/not CSAM. It is an image detector, determining if your image is already in a known database.
The difference is not that clear-cut. The system extracts features from the image, and based on these features a neural network produces an image descriptor. If this descriptor is sufficiently similar to the image descriptor of a known CSAM image, the image will be flagged. Now yes, I understand that this type of system relies on existing images and is not capable of finding entirely new types of CSAM. But NCMEC was to provide its five million image hashes; that is a lot of images for one subject matter, and if you then go for similarity matching rather than exact matching, you have for all intents and purposes a CSAM classifier.
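In code terms the distinction largely collapses into this kind of lookup. This is a toy sketch with made-up descriptors and threshold, not the real NeuralHash pipeline, but it shows why similarity matching against millions of reference hashes starts to behave like a classifier for the category those references cover:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Imagined reference descriptors; the real database would hold millions.
reference_descriptors = [
    [0.12, -0.40, 0.88, 0.05],
    [0.55, 0.31, -0.20, 0.74],
]

def is_flagged(photo_descriptor, threshold=0.95):
    """Flag a photo if it is 'close enough' to any reference descriptor.
    The looser the threshold and the larger the reference set, the more
    this behaves like a category classifier rather than exact matching."""
    return any(cosine(photo_descriptor, ref) >= threshold
               for ref in reference_descriptors)

print(is_flagged([0.11, -0.38, 0.90, 0.06]))   # True: near-duplicate of a reference
print(is_flagged([-0.70, 0.22, -0.10, 0.65]))  # False: nothing similar in the set
```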

The hashes are not reversible: see hash. If you have a hash you can't create the image from it. There is less data in the hash than in the source image, and that data can't be guessed.

They are not perfectly reversible, that is true. But you can recreate a recognizable approximation of the original image. See: https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277

This has been shown to work with PhotoDNA, a perceptual image hashing system widely used for CSAM detection. Maybe Apple's system would have been immune to this, but they spent quite some effort to prevent people from even trying, so I have my doubts.
 

playtech1

macrumors 6502a
Oct 10, 2014
677
846
Apple may have been well intentioned but it is fundamentally creepy for the maker of my phone or laptop to scan its contents.

The type of content scanned for at introduction may be unobjectionable, but it would so obviously be the thin end of the wedge once the principle of compulsory private document scanning has been accepted.
 

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
I would not worry too much about what Apple does going forward… We have privacy advocates that will let us know; just keep your eyes open… which I'm sure just about everyone here does anyway. Apple is well aware of the repercussions if they tried to sneak something in under the radar.
 