
iHorseHead

macrumors 65816
Jan 1, 2021
1,338
1,603
I applaud Apple's effort to curb child pornography. If you are this worried about on-device search, what are you hiding?
Are you kidding me right now?!? Are you serious? Really?

My grandparents lived through Hitler and Stalin too. They've taught me a lot and told me a lot of stories.
I don't trust Apple. They'll expand this 'feature', and soon users who have such pictures on their phones will be reported and arrested too:
[attached image]


(MacRumors, please do not ban me. Thank you)

It always starts out with good intentions and then expands, but once this "feature" is rolled out, it's too late to start fighting.
 

citysnaps

macrumors G5
Oct 10, 2011
12,067
26,123
Even considering the development of such a system is the single worst decision made since the founding of Apple. The surveillance, privacy, and security implications of this system are astounding. I hope every imaginable avenue of communication to Apple is being flooded with requests to scrap this Big Brother system and instead focus all developer time on features that actually improve privacy and security and prevent surveillance of any kind, in any country.

If you really want to send a message to Apple, one that will be heard loud and clear by those who count, the solution is easy. VOTE WITH YOUR WALLET.

Will you commit to doing that?

How about anyone else here? Feel free to respond with a YES or a NO.
 
  • Like
Reactions: baypharm

hagar

macrumors 68020
Jan 19, 2008
2,018
5,093
They have every right to access and scan photos on their cloud, and they have already been doing so. Your first sentence doesn't jibe with that.
For what purpose are they accessing and scanning iCloud photos server-side? iCloud photos are stored encrypted.

While Apple could decrypt them, as they have the key, I'm not aware of any ongoing activity to do so. Do you have sources?

Also, if that were actually the case, then what's this outcry over CSAM pics for?
 

Expos of 1969

Contributor
Aug 25, 2013
4,741
9,257
If you really want to send a message to Apple, one that will be heard loud and clear by those who count, the solution is easy. VOTE WITH YOUR WALLET.

Will you do it?

How about anyone else here - feel free to respond with a YES or NO.
Not giving Apple your money is only part of the solution. One is not helping society by depriving Apple of funds unless one goes a step further and continues to object to its behaviour after becoming a non-customer. As we have learned over many years, silence is not a solution to reprehensible actions by bad characters.
 

Apple_Robert

Contributor
Sep 21, 2012
34,610
50,315
In the middle of several books.
For what purpose are they accessing and scanning iCloud photos server-side? iCloud photos are stored encrypted.

While Apple could decrypt them, as they have the key, I'm not aware of any ongoing activity to do so. Do you have sources?

Also, if that were actually the case, then what's this outcry over CSAM pics for?
I was thinking about Google and Microsoft when I said, in error, that iCloud scanning is already active. That is not the case. I apologize for the error and have amended my previous post.

That being said, I still stand by my point: if all Apple is concerned about is proactively keeping said filth off their servers, that can be done server-side without the need for on-device scanning. I think they are doing it on-device for the reasons I posited earlier. It now appears privacy means something different to Apple depending on the subject.
 

boswald

macrumors 65816
Jul 21, 2016
1,311
2,189
Florida
I've watched Louis Rossmann's videos for quite a while, just to see what it's like to repair an Apple product from a technical point of view. I never really took anything he said against Apple personally, as everyone is entitled to their opinion, no matter how "vocal" they are. In fact, I thought he went overboard with his views. However, the more I revisit his old videos and listen to his opinions on Apple's behavior over the years, the more I've come to realize I've made a big mistake.
 
  • Like
Reactions: turbineseaplane

hagar

macrumors 68020
Jan 19, 2008
2,018
5,093
I was thinking about Google and Microsoft when I said, in error, that iCloud scanning is already active. That is not the case. I apologize for the error and have amended my previous post.

That being said, I still stand by my point: if all Apple is concerned about is proactively keeping said filth off their servers, that can be done server-side without the need for on-device scanning. I think they are doing it on-device for the reasons I posited earlier. It now appears privacy means something different to Apple depending on the subject.
But client-side verification trumps server-side scanning as the data never leaves your device.

As always, Apple makes things more difficult for itself (as with face scanning) for privacy reasons. It would be much easier to do it all server-side. Remember when face scanning was first introduced? It happened on-device and there was no sync to other devices. Very annoying. That's because they were not happy doing it until it could be done in a privacy-friendly way.

Same here.
 

boswald

macrumors 65816
Jul 21, 2016
1,311
2,189
Florida
We have movements to stop sexual harassment, racism, and hate crimes, and to support equality and inclusivity, but we don't have anything or anyone defending our right to privacy. If ONE company would stand up to these evil companies and say, "Nope. Not interested in lying to and abusing our customers," a domino effect would take place and we'd move in the right direction. It's a long shot, but there's got to be one up-and-coming CEO out there willing to take a reduced paycheck to uphold something that affects all of us.
 
  • Like
Reactions: turbineseaplane

jntdroid

macrumors 6502a
Oct 12, 2011
937
1,286
But client-side verification trumps server-side scanning as the data never leaves your device.

As always, Apple makes things more difficult for itself (as with face scanning) for privacy reasons. It would be much easier to do it all server-side. Remember when face scanning was first introduced? It happened on-device and there was no sync to other devices. Very annoying. That's because they were not happy doing it until it could be done in a privacy-friendly way.

Same here.

A fair point. But the sole purpose of this is to incriminate someone. Obviously if they're guilty, they deserve it. But don't use my device to do it. Or at least give me a choice in the matter.
 

boswald

macrumors 65816
Jul 21, 2016
1,311
2,189
Florida
Android does similar things.

Every URL used by an app (including browsers) is analysed by the system, and if it is found to be potentially harmful, it is sent to Google.

Most Android phones also have a system service that scans part of the file system and deletes anything found to be malicious.

But Google does most of their scanning on their servers. Almost every Android user outside China uses Google Photos and gets their images scanned. Google also reports through the same system.
Yes, but there are a few ways I can think of to help avoid some of that.

1. Don’t use Google search.
2. Use a third-party ROM like LineageOS (for example).
3. Do not install Google apps.
4. Switch from Gmail to Proton.

It’s restrictive, yes, but our right to privacy is restricted more and more by the year. You shouldn’t have to settle.
 

09872738

Cancelled
Feb 12, 2005
1,270
2,124
But client-side verification trumps server-side scanning as the data never leaves your device.

As always, Apple makes things more difficult for itself (as with face scanning) for privacy reasons. It would be much easier to do it all server-side. Remember when face scanning was first introduced? It happened on-device and there was no sync to other devices. Very annoying. That's because they were not happy doing it until it could be done in a privacy-friendly way.

Same here.
Not really. Because if you use iCloud the data is transferred to Apple anyway, so there's no need to scan on-device, which is, irrespective of how they do it, snooping.
 

Luis Ortega

macrumors 65816
May 10, 2007
1,158
342
I personally place a high priority on privacy and security. Apple's move doesn't bother me, though, because one of my baseline rules is to never store anything that is highly sensitive or mission critical in the cloud.
If people can get by without storing anything highly sensitive or mission-critical in the cloud, then it proves that cloud systems are not needed, and if people get snared by mindless algorithms, they have no one to blame but themselves.
 

Ethosik

Contributor
Oct 21, 2009
7,840
6,766
The NeuralHash and Microsoft PhotoDNA algorithms were designed to do the opposite of what you are describing. NeuralHash is optimised not to catch pictures with a similar "feel".

Both of these systems are optimised only to catch changes to a specific photo: cropping, colour changes, hue, contrast, mirroring.

Here's an example:

Picture 1: Someone takes a picture of you where you live. Let's say you're standing in front of a window.

You walk away for a few minutes.

Picture 2: Someone takes a picture of you standing in front of the same window, and you have approximately the same pose as in the last picture.

Somehow, picture 1 becomes part of the CSAM database. If NeuralHash is any good, it should flag only picture 1 and not picture 2.

That's why it's so difficult to misuse it. You can't just provide similar pictures of a protest and hope for the system to catch people with "pictures of protests".

Still, you will get collisions, which is why Apple has a threshold on the number of collisions before they are notified. With the threshold they have chosen, they estimate the odds of wrongly flagging an account at 1 in 1 trillion per year.
But this just does not line up with the claim that it can track "crops", "pixel changes", "color adjustments", "rotations" and "transformations". The way you described it, it would be a true 1:1 hash match. But there is some leeway, so yes, a truly similar photo can result in a similar hash that might get flagged.

I am sorry, and I am just trying to understand how this works. How can this feature track "crops", "pixel changes", "color adjustments", "rotations" and "transformations" but at the SAME TIME picture 2 is suddenly safe? It just does not make sense. They should come out to very nearly the same hash if the statements about tracking those manipulations are true. If I just shift my pose by a pixel, how is that not the same as some of the adjustments mentioned above?
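
To make the property being debated here concrete, here is a minimal sketch of a very simple perceptual hash, a difference hash (dHash). This is not Apple's NeuralHash, which is neural-network-based and unpublished; the file names are hypothetical placeholders, and dHash is far cruder than what Apple describes. It only illustrates the general behaviour under discussion: re-encoding or recolouring the same source image barely moves the hash, while a different photograph of a similar scene usually, but not provably, lands many bits away.

```python
# A toy difference-hash (dHash), NOT Apple's NeuralHash, which is a
# neural-network-based perceptual hash that has not been published.
# The file names below are hypothetical placeholders.
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    """Shrink to grayscale, then record whether each pixel is brighter
    than its right-hand neighbour: 64 bits that tend to survive
    rescaling, recompression, and mild colour shifts."""
    img = Image.open(path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            bits = (bits << 1) | int(
                img.getpixel((col, row)) > img.getpixel((col + 1, row))
            )
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two hashes disagree."""
    return bin(a ^ b).count("1")

h_original  = dhash("photo1.jpg")          # reference image
h_edited    = dhash("photo1_resized.jpg")  # same image, resized/recompressed
h_lookalike = dhash("photo2.jpg")          # different photo, similar pose

# h_edited typically stays within a few bits of h_original, while
# h_lookalike usually differs on many more. That is a tendency, not a
# guarantee, which is the gap the posters here are arguing over.
print(hamming(h_original, h_edited), hamming(h_original, h_lookalike))
```

Whether "usually" is good enough is the open question in this exchange: the separation between an edited copy and a lookalike is statistical, not absolute, which is why the threshold mentioned above exists at all.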
 
Last edited:

hagar

macrumors 68020
Jan 19, 2008
2,018
5,093
A fair point. But the sole purpose of this is to incriminate someone. Obviously if they're guilty, they deserve it. But don't use my device to do it. Or at least give me a choice in the matter.
For obvious reasons, this can't be opt-in.

And the sole purpose is not to incriminate someone. The purpose is to stop the spread of CSAM content and to remove the millions of CSAM photos from Apple’s servers.
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
I've watched Louis Rossmann's videos for quite a while, just to see what it's like to repair an Apple product from a technical point of view. I never really took anything he said against Apple personally, as everyone is entitled to their opinion, no matter how "vocal" they are. In fact, I thought he went overboard with his views. However, the more I revisit his old videos and listen to his opinions on Apple's behavior over the years, the more I've come to realize I've made a big mistake.
He also read John Gruber's take on this in one of his latest videos and said it's a reasonable take.
And he admitted that in his first video about this he got some parts wrong because of confusion between the iMessage safety feature (AI- and content-based) for kids under 12 and the CSAM-detection system (hash-based, with a sprinkle of AI to detect slightly modified versions of the same pic).

 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
But this just does not line up with the claim that it can track "crops", "pixel changes", "color adjustments", "rotations" and "transformations". The way you described it, it would be a true 1:1 hash match. But there is some leeway, so yes, a truly similar photo can result in a similar hash that might get flagged.

You're underestimating the complexity and elegance of those systems.

We can't give the "layman common sense" treatment to every subject.

I can't remember where I read it, but I think those systems divide the picture into a lot of small squares and extrapolate some parameters from each little square, hence they can catch cropped/edited pics without mistaking them for an unrelated pic.
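
For what it's worth, that "small squares" idea can be sketched in a few lines. This is loosely modelled on public descriptions of PhotoDNA; Microsoft has not published the actual algorithm, and the grid size and per-square statistic here are made up for illustration.

```python
# A rough sketch of a block-based image descriptor, loosely modelled on
# public descriptions of PhotoDNA. The real algorithm is unpublished;
# the grid size and per-square statistic here are illustrative only.
import numpy as np
from PIL import Image

def block_descriptor(path: str, grid: int = 6) -> np.ndarray:
    """Normalise the image to a fixed size, cut it into grid x grid
    squares, and keep one statistic (mean brightness) per square."""
    img = Image.open(path).convert("L").resize((grid * 16, grid * 16))
    pixels = np.asarray(img, dtype=np.float32)
    squares = pixels.reshape(grid, 16, grid, 16).mean(axis=(1, 3))
    return squares.flatten()

def distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between descriptors: small suggests the two
    images derive from the same source picture."""
    return float(np.linalg.norm(a - b))
```

An edited copy moves each square's statistic only slightly, so its descriptor stays near the original's, while an unrelated photo, even one with a similar pose, tends to differ across many squares at once. As the posts above show, "tends to" is the contested word.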
 

turbineseaplane

macrumors P6
Mar 19, 2008
15,366
33,205
Even with the best intentions, where does Tim Cook think this is leading?

I think he's realized, much like military contractors, that it's ultimately more profitable and simply "easier" to be an ally of the State.

Let's be honest. He's never been the visionary or the creative force here.
Tim was an ops guy. He's about making money and very little else.

The easiest way to do that is not to "fight the good fight".
They pitched that for a while, as it was a good narrative for sales and a way to stand counter to the other MegaTechs...

Ultimately, always look at what someone (or a company) does, not what it says or markets to us.
 

Ethosik

Contributor
Oct 21, 2009
7,840
6,766
You’re underestimating the complexity and elegance of those systems.

We can’t give the “layman common sense” treatment to any subject.

Can’t remember where I read it but I think those systems divide the picture in a lot of small squares and extrapolate some parameters from each little square, hence they can catch cropped/edited pics but not mistake it for an unrelated pic..
Then Apple is clearly overstating how flexible this is in tracking "crops", "pixel changes", "color adjustments", "rotations" and "transformations". If one image gets flagged and a second does not, and my pose is only off by a pixel, then Apple is falsely stating it can recognize those manipulations to the photo.

And again, I am just trying to understand here. How can this feature track "crops", "pixel changes", "color adjustments", "rotations" and "transformations" but at the SAME TIME a VERY VERY similar picture doesn't get flagged? Can you not see the contradiction here?
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
I would like to know how Apple tested this system (and who was involved) and how they arrived at the statement of accuracy.
In their press release, they cite three independent security scholars to whom they showed the system.
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,277
Then Apple is clearly overstating how flexible this is in tracking "crops", "pixel changes", "color adjustments", "rotations" and "transformations". If one image gets flagged and a second does not, and my pose is only off by a pixel, then Apple is falsely stating it can recognize those manipulations to the photo.

And again, I am just trying to understand here. How can this feature track "crops", "pixel changes", "color adjustments", "rotations" and "transformations" but at the SAME TIME a VERY VERY similar picture doesn't get flagged? Can you not see the contradiction here?

Again, this is a limitation in your understanding of how it works, not a contradiction. Read up on Microsoft PhotoDNA.
 

Ethosik

Contributor
Oct 21, 2009
7,840
6,766
I would like to know how Apple tested this system (and who was involved) and how they arrived at the statement of accuracy.
Agreed. There is a clear contradiction in Apple's claims: "any adjustments to that matched photo will still be a match", yet "an entirely separate photo that is nearly identical won't get flagged, so don't worry!"
 

Ethosik

Contributor
Oct 21, 2009
7,840
6,766
Again, this is all your limitation in understanding how this works. No contradiction. Read up about Microsoft PhotoDNA.
The contradiction is:

"Any adjustments made to an image in the database will be flagged"

and

"A separate photo that is nearly identical will not get flagged, so don't worry!"

I can transform an image until it is unrecognizable compared to the original, and supposedly that can still get flagged? But a photo in the same style with an adult/legal subject, one with the same lighting, colors, dimensions, poses and more, can absolutely not get flagged?
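
Setting the hash question aside, the threshold mentioned earlier in the thread is the other half of the design, and it can be sketched simply. Apple's actual system, as described in its technical summary, uses cryptographic safety vouchers and private set intersection, none of which is modelled here, and the threshold value below is a hypothetical placeholder.

```python
# A simplified sketch of the match-threshold idea described earlier in
# the thread. Apple's actual system involves cryptographic safety
# vouchers and private set intersection; none of that is modelled here.
from collections import Counter

MATCH_THRESHOLD = 30  # hypothetical placeholder, not a published value

match_counts: Counter = Counter()  # account id -> hash matches so far

def record_match(account_id: str) -> bool:
    """Record one hash match and report whether this account has now
    crossed the threshold that would trigger human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```

On this design, a single unlucky collision with a lookalike photo does not by itself surface an account; the 1-in-1-trillion figure quoted above is about the odds of an account accumulating enough matches to cross the threshold.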
 