
giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
Agreed. There is a clear contradiction here between Apple's claims that "any adjustments to that matched photo will still be a match" and "oh, but an entirely separate photo that is nearly identical won't get flagged, so don't worry!"
That’s super obvious how that would be possible.
You’re looking at this with human eyes, not with AI eyes.
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
The contradiction is:

"Any adjustments made to an image in the database will be flagged"

and

"A separate photo that is nearly identical will not get flagged, so don't worry!"
Sorry, nothing contradictory about that.
There’s an information gap with regard to AI, hashes, etc.
 

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
That’s super obvious how that would be possible.
You’re looking at this with human eyes, not with AI eyes.
An image is just a series of pixels. And the fact that there is some flexibility between the actual image stored in the database and what can get flagged (any modifications get flagged) means there is a wide range of collision. It's not 1:1.

A similar image with nearly the EXACT SAME pixel attributes, but with an adult/legal subject, CANNOT get flagged... how? The pixels are nearly the same...
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
That in and of itself doesn't mean very much. Usually when a security audit is done, a report explains what was tested, under what conditions, and whether any weaknesses or security holes were found.

Sure.
They probably won’t let people actually audit it.
Someone brought it up in this FAQ:

 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,073
2,653
My grandparents have lived through Hitler and Stalin too. They've taught me a lot and told me a lot of stories.
I don't trust Apple. They'll expand this 'feature' and soon the users that have such pictures on their phones will be reported & arrested too:
You're only partially right. They'll also be reported to and arrested by:

[attached images: 201107-joe-biden-al-1143.jpg, Coz_b_9XEAATTRl-646x437.jpg]
 
  • Like
Reactions: iHorseHead

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
An image is just a series of pixels. And the fact that there is some flexibility between the actual image stored in the database and what can get flagged (any modifications get flagged) means there is a wide range of collision. It's not 1:1.

A similar image with nearly the EXACT SAME pixel attributes, but with an adult/legal subject, CANNOT get flagged... how? The pixels are nearly the same...

The unlucky one-in-a-million image that tricks the system by sheer pixel attributes would be something completely unrelated to children, like a picture of a bucket of sand or a sunset. Human review would can it in a split second. Fact is, it wouldn’t even reach human review, ‘cause one match isn’t enough; an account has to rack up multiple matches.

On the other hand, altered versions of the original known CSAM would be easily caught by the AI. (Again, read about how PhotoDNA works; Apple’s system is probably similar.)

There is no contradiction.
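To make that concrete, here is a minimal Python sketch using a simple average hash (aHash) as a stand-in. This is not Apple's NeuralHash (that's a trained network whose weights aren't public); it only illustrates the general shape of match-by-similarity rather than match-by-exact-bytes:

```python
# Toy perceptual hash (aHash) -- a stand-in, NOT Apple's NeuralHash.
# Shows why an edited copy stays close while an unrelated image lands far away.
import numpy as np

def average_hash(img: np.ndarray, size: int = 8) -> np.ndarray:
    """Block-average the image down to size x size, threshold at the mean."""
    h, w = img.shape
    img = img[: h - h % size, : w - w % size]   # crop to a multiple of `size`
    small = img.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (small > small.mean()).flatten()     # 64-bit fingerprint

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    return int(np.sum(a != b))

rng = np.random.default_rng(0)
known = rng.random((256, 256)) * 255            # stand-in for a known image
edited = np.clip(known * 1.1 + 5, 0, 255)       # brightness/contrast tweak
unrelated = rng.random((256, 256)) * 255        # a genuinely different photo

print(hamming(average_hash(known), average_hash(edited)))     # small: match
print(hamming(average_hash(known), average_hash(unrelated)))  # ~32 of 64: no match
```

The real system compares against the whole set of known hashes, and as noted above, nothing escalates to human review until an account crosses a match-count threshold.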
 

fumi2014

macrumors 6502
May 21, 2014
357
1,521
California
The unlucky one-in-a-million image that tricks the system by sheer pixel attributes would be something completely unrelated to children, like a picture of a bucket of sand or a sunset. Human review would can it in a split second. Fact is, it wouldn’t even reach human review, ‘cause one match isn’t enough; an account has to rack up multiple matches.

On the other hand, altered versions of the original known CSAM would be easily caught by the AI. (Again, read about how PhotoDNA works; Apple’s system is probably similar.)

There is no contradiction.

Giggles, time and again you've tried to explain how this works (and in some detail).

Lots of people on here are not listening or reading correctly. Apple's papers on this go into extreme technical detail for those who are interested and have the appropriate attention span.
 

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
The unlucky one-in-a-million image that tricks the system by sheer pixel attributes would be something completely unrelated to children, like a picture of a bucket of sand or a sunset. Human review would can it in a split second. Fact is, it wouldn’t even reach human review, ‘cause one match isn’t enough; an account has to rack up multiple matches.

On the other hand, altered versions of the original known CSAM would be easily caught by the AI. (Again, read about how PhotoDNA works.)

There is no contradiction.
I have read it. It makes no sense. Let me put it another way, using @hans1972's example.

Picture 1: Someone takes a picture of you where you live. Let's say you're standing in front of a window.

You walk away for a few minutes.

Picture 2: Someone takes a picture of you standing in front of the same window, and you have approximately the same pose as in the last picture.

Somehow picture 1 becomes part of the CSAM database. If NeuralHash is any good, it should flag just picture 1 and not picture 2.

So we are saying that, in this example, picture 2 will NOT get flagged, even though it is nearly identical... correct?

Okay, so I fire up Photoshop and perform a warp on picture 1 to make it COMPLETELY MATCH picture 2. However, all these reports are now saying that the modified picture 1 (which MATCHES picture 2) will get flagged (because the system tracks crops, edits, manipulations, and other things)? But somehow picture 2 does not?

Again, that is the contradiction. ANY modification to picture 1 gets flagged, but picture 2 does not? How?
 

Nuvi

macrumors 65816
Feb 7, 2008
1,099
810
That's not how NeuralHash works. It's not AI.

They are in fact looking for exact matches and derivatives of that exact photo.

If you take two photos of the same sex act two seconds apart, the photos will be so different that they shouldn't match.

It is an AI. You do understand that Apple calls the AI component on their processors the Apple Neural Engine (a machine-learning-focused processor), and this is exactly how it works. They are not looking for exact matches. Looking for exact matches would be idiotic and counterproductive. What they do is train the AI to recognise images based on a set of approx. 200,000 images. If the scanned image falls within the threshold, the pictures will be flagged and manually checked.
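For what it's worth, Apple's technical summary describes a two-stage design: a trained network produces a floating-point descriptor, and a locality-sensitive hashing step turns nearby descriptors into the same bit string. Here is a toy Python sketch of that shape (the real network and hyperplanes are Apple's and not public; this embed() is only a placeholder):

```python
import numpy as np

rng = np.random.default_rng(1)

def embed(img: np.ndarray) -> np.ndarray:
    # Stand-in for the trained descriptor network: any function mapping an
    # image to a float vector where near-duplicates land close together.
    return img.reshape(64, -1).mean(axis=1)        # 64-dim descriptor

# Locality-sensitive hashing step: project the descriptor onto fixed random
# hyperplanes and keep only the signs, so nearby descriptors share bits.
PLANES = rng.normal(size=(96, 64))

def neuralhash_like(img: np.ndarray) -> np.ndarray:
    d = embed(img)
    return (PLANES @ (d - d.mean()) > 0).astype(np.uint8)   # 96-bit hash
```

Whether you call that "AI" or "hashing" is the terminology fight in this thread; mechanically it is both: a neural descriptor followed by a hashing step.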
 
  • Like
Reactions: Ethosik

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
It is an AI. You do understand that Apple calls the AI component on their processors the Apple Neural Engine (a machine-learning-focused processor), and this is exactly how it works. They are not looking for exact matches. Looking for exact matches would be idiotic and counterproductive. What they do is train the AI to recognise images based on a set of approx. 200,000 images. If the scanned image falls within the threshold, the pictures will be flagged and manually checked.
And the funny thing is, the same CSAM technical document that they keep telling me to read (which I did) says this exactly.


The neural network that generates the descriptor is trained through a self-supervised training scheme.

Training a neural network is AI.
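Concretely, "self-supervised" here means the network is shown an original and a transformed copy as a pair that should land close together, and unrelated images as pairs that should land far apart. Here is a minimal Python sketch of that objective, with a toy linear "network" (Apple's actual architecture, augmentations, and training data are not public):

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(scale=0.1, size=(16, 64))       # toy linear "network" weights

def descriptor(x: np.ndarray) -> np.ndarray:
    d = W @ x
    return d / np.linalg.norm(d)               # unit-length descriptor

def augment(x: np.ndarray) -> np.ndarray:
    # stand-in for the edits the system must survive: recompression,
    # brightness shifts, slight noise -- derivatives of the SAME image
    return np.clip(x * 1.1 + rng.normal(scale=0.02, size=x.shape), 0, 1)

def contrastive_loss(x, pos, neg, margin=0.5) -> float:
    sx, sp, sn = descriptor(x), descriptor(pos), descriptor(neg)
    # pull the edited copy toward x, push the unrelated image away
    return max(0.0, margin - float(sx @ sp) + float(sx @ sn))

x = rng.random(64)                             # a tiny flattened "image"
print(contrastive_loss(x, augment(x), rng.random(64)))
```

Training drives this loss down, which is exactly the asymmetry being argued about upthread: edits of picture 1 are optimised to hash like picture 1, while a separately taken picture 2 was never part of any such pairing.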
 

DblHelix

macrumors 6502a
Mar 19, 2009
757
618
You're being dishonest.

This is not going to affect those who do not have child pornography on their phone.
iMessage photos can be shared to your Photos library now. Therefore, if iCloud is on and someone sends you CP, it gets compared to the hashes and then you get reported to the authorities.
 
  • Like
Reactions: baypharm

timeconsumer

macrumors 68020
Aug 1, 2008
2,059
2,053
Portland
After thinking about this for a couple of days now: I really don't care if they want to scan my photos when they're in iCloud; as mentioned several times, other companies are already doing this. However, I'm not interested in them scanning photos on my device. And yes, I get that they will only scan if I select the option to back up my photos to iCloud, but the scan will still be done on my device. Once they implement this technology, it could be used for other things too; to what extent, I don't know. Will hackers find a vulnerability in this and then use it to scan for other things? It's opening another possible point of failure that doesn't need to exist.
 

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
After thinking about this for a couple of days now: I really don't care if they want to scan my photos when they're in iCloud; as mentioned several times, other companies are already doing this. However, I'm not interested in them scanning photos on my device. And yes, I get that they will only scan if I select the option to back up my photos to iCloud, but the scan will still be done on my device. Once they implement this technology, it could be used for other things too; to what extent, I don't know. Will hackers find a vulnerability in this and then use it to scan for other things? It's opening another possible point of failure that doesn't need to exist.
Won't this impact battery life too, using the Neural Engine on the SoC?
 

09872738

Cancelled
Feb 12, 2005
1,270
2,124
Giggles, time and again you've tried to explain how this works (and in some detail).

Lots of people on here are not listening or reading correctly. Apple's papers on this go into extreme technical detail for those interested and have the appropriate attention span.
Because the technology used is irrelevant. They access and search private images. That's called snooping.
It's absolutely irrelevant HOW they do it. The important thing is: they do it.
 

citysnaps

macrumors G5
Oct 10, 2011
12,021
26,055
Not giving Apple your money is only part of the solution. One is not helping society by depriving Apple of funds unless one goes one step further and continues to object to its behaviour after becoming a non-customer. As we have learned over many years, silence is not a solution to reprehensible actions by bad characters.

If you really want to see results, vote with your wallet. Even a 10-20% decline in revenue will result in Wall Street instantly taking notice with a declining share price. That will get Apple's attention ASAP.

Yada yada-ing on internet forums will have no effect.

Will you do it?
 

Expos of 1969

Contributor
Aug 25, 2013
4,741
9,257
If you really want to see results, vote with your wallet. Even a 10-20% decline in revenue will result in Wall Street instantly taking notice with a declining share price. That will get Apple's attention ASAP.

Yada yada-ing on internet forums will have no effect.

Will you do it?
I already have, prior to this. No Apple services subscriptions. No euros going from me to Apple. I am on Android and staying that way. Not recommending Apple products and services to friends or family.
 
  • Love
Reactions: baypharm

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
Because the technology used is irrelevant. They access and search private images. That‘s called snooping.
Its absolutely irrelevant HOW they do it. Important thing is: they do it
Even if you think this, it’s not a reason to propagate or endorse misconceptions about the technical aspects of this.

Further discussion about the legal/ethical/philosophical/etc. aspects must stem from a common base of truth about the technical reality of what we’re talking about.
 

Expos of 1969

Contributor
Aug 25, 2013
4,741
9,257
And this will have literally 0 impact on Apple.
My action alone, I agree. Unfortunately, there are not enough intelligent consumers out there to make a big difference. But I sleep better knowing I am not an Apple customer. As I said earlier, protesting, publicising, and informing people about the way Apple is going is a key part of the equation. Apple will fall; all smug companies do eventually.
 
  • Like
Reactions: airbusking

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
And the funny thing is, the same CSAM technical document that they keep telling me to read (which I did) says this exactly.

The neural network that generates the descriptor is trained through a self-supervised training scheme.

Training a neural network is AI.

The AI in the hash comparison is there to catch slightly altered versions of the picture (you proved too obtuse to understand how, but that’s another matter).

The AI in the iMessage safety feature for kids under 12 is there to look at the content of the photo.

They’re both called “AI” but they do completely different things.

But you’re too limited to acknowledge this too.
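A sketch of the distinction, using made-up stub functions (none of these names are Apple's API): the matcher answers "is this a specific, already-known image?", while the classifier answers "what does this image appear to depict?":

```python
# Illustrative stubs only -- names, hashes, and thresholds are invented.
KNOWN_HASHES = {"a3f9c1", "b7714e"}            # stand-in for the CSAM hash set

def perceptual_hash(photo) -> str:             # stub for NeuralHash-style hashing
    return "a3f9c1"

def nudity_score(photo) -> float:              # stub for an on-device classifier
    return 0.12

def csam_match(photo) -> bool:
    # Matching: is this (a derivative of) one specific, already-known image?
    return perceptual_hash(photo) in KNOWN_HASHES

def communication_safety_flag(photo) -> bool:
    # Classification: does this LOOK sexually explicit, even if nobody
    # has ever seen this particular image before?
    return nudity_score(photo) > 0.9
```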
 

09872738

Cancelled
Feb 12, 2005
1,270
2,124
Even thinking this is not a reason to propagate or endorse misconceptions about the technical aspect of this.

Further discussion about the legal/ethical/philosophical/etc. aspects must stem from a common base of truth about the technical reality of what we’re talking about.
And the base truth is: they access and snoop around in my data on my device, isn't it?
 