
IG88

macrumors 65816
Nov 4, 2016
1,109
1,637
based on everything that’s been explained about how it works and what purpose it serves, it’s honestly creepier that you feel the need to turn it off
If you have nothing to hide, send me all of your photos, since you're apparently totes ok with Apple employees looking through your iCloud picture library.
 

IG88

macrumors 65816
Nov 4, 2016
1,109
1,637
I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
Yes, because people are never falsely accused, the hash AI is perfectly flawless (no false positive hash collisions), and I'm sure no Apple employees would ever abuse the system to look through a hot chick's iCloud Photo Library.
 
Last edited:
  • Like
Reactions: baypharm

IG88

macrumors 65816
Nov 4, 2016
1,109
1,637
This isn’t about what might happen somewhere someday, this is about catching purveyors of kiddie porn, right now, here, today.
I'll be the judge of that. Give me access to your photo library. You have nothing to hide.
 
  • Like
Reactions: baypharm

PaladinGuy

macrumors 68000
Sep 22, 2014
1,616
1,030
I have mixed feelings about this. On one hand, I do not like Apple backpedaling on the topic they’ve marketed incessantly for years and rolling back privacy. As others said, this can easily be a very slippery slope. I completely support finding and severely prosecuting people who abuse and harm children, and taking action to try to prevent it to begin with. I’m just not positive that I like this approach.

It brings up another topic in my mind: cloud services. Here’s the thing: when we store our data on someone else’s property (servers), we give up some level of control over it. The direction a lot of technology is currently heading means we don’t really own anything; we’re just paying to use a company’s services. There seems to be a bit of a moral dilemma here.
 

IG88

macrumors 65816
Nov 4, 2016
1,109
1,637
Good! The use of hash values to compare against known CSAM is extremely accurate and has been in use by multiple cloud providers for years now. It has resulted in thousands of NCMEC leads going out to the local ICACs, which in turn have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.
Yes exactly. I'll volunteer to prescreen your photos to make sure there's nothing incriminating before they get uploaded to iCloud. I want to do my part!
 
  • Like
Reactions: timothevs

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
Not as obnoxious as saying we shouldn’t do everything we reasonably can to protect exploited children and punish their abusers because we don’t want to allow a harmless and fully automated scan that affects us in no way at all.
Again, that's not the point. You clearly do not understand the ramifications of these kinds of actions and are blinded by the slogans.
 

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
It is a hash. It is mathematics, not a visual inspection.

Yet it’s been stated that it can track crops, color adjustments, rotations, transforms, and other edits. So how is it guaranteed that an adult, legal-age subject in a similar “pose”, with similar “lighting”, “colors”, and other resemblances won’t get flagged? How can something that matches a manipulated photo be guaranteed not to flag legal photos from a couple in a consenting relationship?
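For anyone unfamiliar with the distinction being argued here: a cryptographic hash only matches bit-identical files, while a perceptual hash is designed to survive crops, recompression, and color tweaks by comparing fingerprints with a distance threshold instead of exact equality. Apple's NeuralHash is a learned embedding, not the toy difference hash below, but a minimal Pillow-based sketch (all names and the threshold value are illustrative, not Apple's) shows where the false-positive worry comes from: two different photos can, rarely, land within the match threshold.

```python
from PIL import Image  # pip install Pillow

def dhash(path, hash_size=8):
    # Difference hash: shrink to grayscale, compare each pixel to its right
    # neighbour, and pack the comparisons into a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    px = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = px[row * (hash_size + 1) + col]
            right = px[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a, b):
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")

# Matching is "distance below a threshold", not equality. That is what lets a
# fingerprint survive crops and edits -- and also why unrelated images with
# similar composition and lighting can occasionally fall inside the threshold.
MATCH_THRESHOLD = 10
# same = hamming(dhash("original.jpg"), dhash("recompressed.jpg")) <= MATCH_THRESHOLD
```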
 
  • Like
Reactions: Weisswurstsepp

fmillion

macrumors regular
Jun 16, 2011
146
340
Today: "If you're not a child pornographer or someone who partakes in such content, you have nothing to worry about. We're only scanning your photos for known kiddie porn. --Apple"

A little while in the future: "If you're staying in line with only approved ideas, you have nothing to worry about. We're only scanning everything you do all the time for thoughtcrime. --The iMinistry"

This is how draconian policies begin - by applying them to universally detested crimes.
 

Ethosik

Contributor
Oct 21, 2009
7,832
6,762
I have mixed feelings about this. On one hand, I do not like Apple backpedaling on the topic they’ve marketed incessantly for years and rolling back privacy. As others said, this can easily be a very slippery slope. I completely support finding and severely prosecuting people who abuse and harm children, and taking action to try to prevent it to begin with. I’m just not positive that I like this approach.

It brings up another topic in my mind: cloud services. Here’s the thing: when we store our data on someone else’s property (servers), we give up some level of control over it. The direction a lot of technology is currently heading means we don’t really own anything; we’re just paying to use a company’s services. There seems to be a bit of a moral dilemma here.

There is a difference between having your pictures exposed and having the police/FBI show up. I know one couple where the woman is a model. She won’t care if the images got released; she offers the same to others. But having the police show up because the system detected a false positive? That’s different.
 
  • Like
Reactions: Weisswurstsepp

baypharm

macrumors 68000
Nov 15, 2007
1,951
973
Edward Snowden warned us all a long time ago to stop using all cloud services and third party storage services (like Dropbox, Amazon, Google, etc).
 
  • Like
Reactions: EtCetera2022

DotCom2

macrumors 603
Feb 22, 2009
6,173
5,447
Nope. General privacy abuse against everyone outweighs the individual abuse.

But sure enough, some would argue in favor of waterboarding for the potential counter-terrorism benefit as well.
Maybe I should have said "to me" it's still worth it. That way you can't say "Nope".
 

baypharm

macrumors 68000
Nov 15, 2007
1,951
973
Wait until the employees who find positive hits get paid for each violation they find; they will be planting all kinds of data everywhere.

It’s that way now with those who are on parole. If a parole officer finds a violation, they get paid a bonus.
This is true. One of my employees was on community control (house arrest) many years ago. She was allowed to go to the store, bank, etc. only on certain days of the week, and it had to be preapproved a week in advance. Once she was 2 minutes late arriving home. Her probation was violated and she spent 3 months in the slammer before a judge finally released her and dismissed the charge. The system is inherently corrupt. And let's not forget how many poor souls have spent 3 or 4 decades in prison for crimes they never committed.
 

mdatwood

macrumors 6502a
Mar 14, 2010
924
923
East Coast, USA
I wish articles would put Apple's new CSAM method in context. PhotoDNA [1] has been used by cloud providers for years to do exactly this sort of scanning on anything uploaded to the cloud. By doing it on device prior to upload, Apple is setting up the ability to add iCloud Photo Library to the E2E encrypted list. Hopefully that is something they announce.

Ironically, Apple's method of doing CSAM detection on device right before the pic is uploaded is the most privacy-focused way of checking, though it does lead to slippery-slope arguments.

[1] https://en.wikipedia.org/wiki/PhotoDNA
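A rough sketch of why the on-device placement matters. This is not Apple's actual protocol (the real design uses NeuralHash, a blinded hash database, and threshold secret sharing); the names here (prepare_upload, KNOWN_FINGERPRINTS) and the SHA-256 stand-in are purely illustrative. The point it shows: the comparison happens on the device, the server only ever receives ciphertext plus a voucher, so the photo library itself could in principle be end-to-end encrypted.

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# Toy stand-in for the on-device database of known-image fingerprints.
# (The real system ships a blinded perceptual-hash database, not SHA-256 digests.)
KNOWN_FINGERPRINTS = set()

def prepare_upload(photo_bytes: bytes, device_key: bytes):
    # 1. Fingerprint and compare on the device, before anything leaves it.
    fingerprint = hashlib.sha256(photo_bytes).hexdigest()
    matched = fingerprint in KNOWN_FINGERPRINTS

    # 2. The match result travels as a "voucher". In the real design this is a
    #    threshold secret share the server cannot open until enough photos match.
    voucher = {"matched": matched}

    # 3. The photo itself is encrypted with a key the server never holds,
    #    which is what would let iCloud Photos join the E2E encrypted list.
    ciphertext = Fernet(device_key).encrypt(photo_bytes)
    return ciphertext, voucher

device_key = Fernet.generate_key()
ciphertext, voucher = prepare_upload(b"raw image bytes", device_key)
print(voucher, len(ciphertext))
```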
 

hans1972

macrumors 68040
Apr 5, 2010
3,396
3,007
Actually, the intent is creepy. It is not like an elected official passed a law or there was a recent referendum. Apple seems to have just decided that this was "the right thing to do". What if they decided that monitoring hate, extremism, suicidal ideation, bullying or whatever we all now oppose, and however Apple wants to define these things, was the right thing to do?

It would certainly be Apple's right to do so as long as they informed users beforehand. Property owners have the right to set rules for the use of their property.
 

hans1972

macrumors 68040
Apr 5, 2010
3,396
3,007
I remember the good old days, when people on this forum used to make fun of Android for not being privacy-friendly to its users, and now Apple comes out and says "we're going to scan every photo you take for possible child sex abuse!" At least Google is quiet about these things...

Privacy != Secrecy

Privacy means that you know how your data is used, you have agreed to it, and you can verify that your personal data is correct and demand a correction if it isn't. Your personal data should also be reasonably secured from unlawful access. You should also be allowed to demand deletion unless there is a good reason not to.
 

hans1972

macrumors 68040
Apr 5, 2010
3,396
3,007
Or it actually allows a human operator to view the pictures, and then it is a massive privacy and safety concern (as it implies that, regardless of this particular use case, as a general rule Apple is OK with storing user content which it can decrypt and give its employees access to).

This has always been the case with iCloud backups and iCloud Photo Library. This content is encrypted with a key known to Apple. They even document which data they can and cannot read.
 

jthesssin

macrumors regular
May 6, 2013
162
95
Matthews NC
So, let me get this straight: Apple will have the ability, based on flags from an algorithm, to access my iCloud photo library. This tells me that even without my password, an untold number of outside individuals could have access to my iCloud library. This includes adding images, deleting images, and copying images that I have no recollection of. I would be all for this if it were end-to-end encrypted and Apple had no access to it; they could only read the flags and alert law enforcement, who could then start an investigation. I do not agree with the fact that I have no control over what is now in my iCloud library. Anyone could place any image in anyone's iCloud photo library, and the owner would have no legal recourse.
 