
bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
If you want to use the convenience of cloud things, you need to understand it’s a service and your data will be scanned. You don’t have to use it. Personally I love it.

Apple has actually insisted on the exact opposite for years, i.e. that, unlike Google's, their vision of a cloud platform is one where only the user has access to his or her data (not the provider).
 

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
It evidently wasn't an issue or concern when other companies like Google did this before them. So why is it suddenly the end of the world now that Apple is involved?

I can only answer for myself, but the reason I am complaining now is that I am an Apple user and any change in the way they handle privacy will affect me. I am not naive, and I am not claiming Apple is perfect in that regard, but I definitely trust them more than Google to protect user data and privacy (and Apple themselves use this as a marketing argument). Being privacy conscious, I don't intend to let them ruin that without trying to prevent it. The reality is that if they go down that slope we have nowhere better to go: when it comes to smartphone operating systems we are facing a duopoly, and the other player is even worse.

Since we are on a website covering Apple products, I suspect others here feel the same way.
 

jennyp

macrumors 6502a
Oct 27, 2007
637
275
Why not do the same with every URL you type? Every email? Every message?

This measure only checks that an image is in the photo library. It does nothing to establish who put it there or who looks at it. Why not have FaceID log and timestamp every interaction with your phone, so that actionable evidence against the true criminal can be gathered?
If you can work out the question of storage then Xi Jinping has a position for you right now!
 

Merode

macrumors 6502a
Nov 5, 2013
623
617
Warsaw, Poland
This is a really weird feature. I doubt anybody into child pornography would keep it on an iPhone with iCloud Photos enabled. It sounds sketchy, and I think there's more to this story. I'd guess that soon enough this mechanism will be detecting far more, including things some three-letter agency might be interested in.

I would much prefer it if my daughter's iPhone notified mine when she started posting nudes online, assuming everything is processed locally. That would be a lot more useful.
 
  • Like
Reactions: haruhiko and femike

Smartass

macrumors 65816
Dec 18, 2012
1,457
1,702
I remember the good old days, when people on these forums used to make fun of Android for not being privacy-friendly to its users, and now Apple comes out and says "we're going to scan every photo you take for possible child sex abuse!". At least Google is quiet about these things...
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
If you can work out the question of storage then Xi Jinping has a position for you right now!

There is no storage problem. FaceID already checks for an authorised user. You just need to give that user a unique ID number and then keep a text file logging every time FaceID sees them. FaceID already has the capability to flag whether the user is looking directly at the screen or not.

The moment the on-phone AI content police detects that the user is exhibiting undesirable behaviour, their FaceID data and log are uploaded to an Apple server. The built-in eSIM can be turned on automatically and used for this, so the user is reported even if they turn off all internet access.
 

MJaP

macrumors 6502
Mar 14, 2015
290
1,218
No company, not even Apple, should have access to my personal photos. Moreover, wasn't Apple championing its privacy stance, saying it would not give law enforcement access to people's iPhones for things like murder investigations and would fight court orders? Now they're more than happy to invade people's privacy, scan your device and report you... a bit hypocritical, I feel, and playing on our knee-jerk emotional reactions about children. So Apple are willing to invade your privacy over paedophilic images, but if a woman gets raped, or someone you know gets murdered, then it's tough luck, the user's privacy is more important?! What gives them the right to define what should and should not be allowed at a criminal-law level?! They have no right to play police, nor to decide arbitrarily that one crime is worse than another just because someone like Tim Cook believes it to be so. Unlike Judge Dredd, he is not The Law, although apparently he seems to think he is.
 

nikaru

macrumors 65816
Apr 23, 2009
1,123
1,397
So I guess, if you have photos in your camera roll of your 3-month-old son naked in the bath, like me (private photos reserved exclusively for me and the closest family members), these could be flagged by this algorithm and reviewed by someone at Apple HQ, just because there is a child in the photo and nudity. Great! F*ck my privacy!

Scanning ALL iPhone users' photo libraries in order to catch the potential paedophiles who represent 0.0001% of iPhone users is really a great cause, and the cause completely justifies the means. I hope they can do the same with all emails in the Mail app, in order to catch the 0.0001% of terrorists dumb enough to communicate and organise attacks via email.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
No company, not even Apple, should have access to my personal photos. Moreover, wasn't Apple championing its privacy stance, saying it would not give law enforcement access to people's iPhones for things like murder investigations and would fight court orders? Now they're more than happy to invade people's privacy, scan your device and report you... a bit hypocritical, I feel, and playing on our knee-jerk emotional reactions about children. So Apple are willing to invade your privacy over paedophilic images, but if a woman gets raped, or someone you know gets murdered, then it's tough luck, the user's privacy is more important?! What gives them the right to define what should and should not be allowed at a criminal-law level?! They have no right to play police, nor to decide arbitrarily that one crime is worse than another just because someone like Tim Cook believes it to be so. Unlike Judge Dredd, he is not The Law, although apparently he seems to think he is.

The exact same technology could be used to detect revenge porn images.

Apple is effectively endorsing revenge porn by refusing to use their capability to eliminate it at no cost.
 
  • Like
Reactions: baypharm

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
So I guess, if you have photos in your camera roll of your 3-month-old son naked in the bath, like me (private photos reserved exclusively for me and the closest family members), these could be flagged by this algorithm and reviewed by someone at Apple HQ, just because there is a child in the photo and nudity. Great! F*ck my privacy!

I don’t support Apple’s move, but the way you describe it is not how it is meant to work.

What they said is that images will be scanned against a known list of child abuse pictures, based on their hash values. So it isn't an algorithm trying to interpret what is depicted in each picture; rather, it is an attempt to find exact matches with a pre-existing database of pictures.
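To make the distinction concrete, here is a minimal sketch of that exact-match model in Python. It is purely illustrative (made-up database values, hypothetical helper names), obviously not Apple's actual code:

```python
# Purely illustrative exact matching: hash the file's raw bytes and check
# membership in a database of known hashes (placeholder values below).
import hashlib

KNOWN_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

def file_hash(path: str) -> str:
    """SHA-256 of the file's exact byte sequence."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def matches_known_database(path: str) -> bool:
    """True only if the file is byte-for-byte identical to a known image."""
    return file_hash(path) in KNOWN_HASHES
```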
 

MJaP

macrumors 6502
Mar 14, 2015
290
1,218
I don’t support Apple’s move, but the way you describe it is not how it is meant to work.

What they said is that images will be scanned against a known list of child abuse pictures, based on their hash values. So it isn't an algorithm trying to interpret what is depicted in each picture; rather, it is an attempt to find exact matches with a pre-existing database of pictures.

Unfortunately that's not exactly correct. It's not mentioned in this article, but the BBC article about this says...

"Apple says the technology will also catch edited but similar versions of original images."

So it's also looking for SIMILAR versions, not just exact matches, so the margin for error, and the chance that irrelevant photos get matched, goes up.
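To see why "similar" matching is a different beast from exact matching: with an ordinary cryptographic hash, changing even one bit of a file produces a completely different digest, so catching edited copies forces a fuzzy comparison with a tolerance, and that tolerance is exactly where false matches can creep in. A quick standard-library illustration (the bytes are hypothetical stand-ins for an image file):

```python
# Flipping a single bit yields an entirely different SHA-256 digest, so an
# exact-match system cannot catch even trivially edited copies.
import hashlib

original = b"pretend these are the bytes of an image file"
edited = original[:-1] + bytes([original[-1] ^ 1])  # flip the last bit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(edited).hexdigest())  # shares nothing with the above
```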

I get the noble (if very selective) intentions behind what they are doing, but it should not be within their remit to do this to private photos that are none of their business.
 

hufflematt

macrumors 68000
Mar 7, 2015
1,725
1,782
UK
The UK passed legislation aimed at making it easier to conduct surveillance on suspected terrorists. Good, right? You're not a terrorist, so you have nothing to fear.

Except they started using the legislation to check whether people were putting their trash out on the correct day, and other trivial misdemeanours.


It's kinda naive to trust that your government will always act lawfully, appropriately and with good intentions. History is littered with counterexamples.
 
  • Like
Reactions: Sincci

iHorseHead

macrumors 65816
Jan 1, 2021
1,338
1,604
Until someone gets screwed over for taking a pic of their naked toddler having fun in the bath.
It was recently made illegal to upload nudes of your children online… Before that, a lot of parents did it and were like: "Aww, my child taking a bath."
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
No company, not even Apple, should have access to my personal photos. Moreover, wasn't Apple championing its privacy stance, saying it would not give law enforcement access to people's iPhones for things like murder investigations and would fight court orders? Now they're more than happy to invade people's privacy, scan your device and report you... a bit hypocritical, I feel, and playing on our knee-jerk emotional reactions about children. So Apple are willing to invade your privacy over paedophilic images, but if a woman gets raped, or someone you know gets murdered, then it's tough luck, the user's privacy is more important?! What gives them the right to define what should and should not be allowed at a criminal-law level?! They have no right to play police, nor to decide arbitrarily that one crime is worse than another just because someone like Tim Cook believes it to be so. Unlike Judge Dredd, he is not The Law, although apparently he seems to think he is.

Tim Cook strongly believes that human rights are the second most important thing in the world. *looks into distance with tear in eye*

The first most important is Apple's share price.
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
It was recently made illegal to upload nudes of your children online… Before that, a lot of parents did it and were like: "Aww, my child taking a bath."
Yep, just ordinary pics taken by parents, e.g. toddlers playing or running around naked on a beach during the holidays; we have many pics and vids of our children playing that way. I bet we'll soon be discussing arrested parents in here, due to false positives. Even if they aren't arrested but are forced to defend themselves, that alone can ruin them; not everybody has money left to hire a good lawyer. The worst case would be parents being temporarily or permanently separated from their children.
 

Damian83

macrumors 6502a
Jul 20, 2011
505
276
No worries, we exhausted our iCloud space a decade ago. iPhone storage grows, video/photo sizes grow, internet speeds grow, yet iCloud space is the same as when it launched. Meanwhile Google offers me the same 5GB for free without my owning any of its products, and PS Plus gives me 100GB. OK, PS Plus is a paid feature, but its main purpose is online play and free games every month, while an iCloud subscription is ONLY for that space, with no additional services I don't already own...
 

bob24

macrumors 6502a
Sep 25, 2012
610
544
Dublin, Ireland
Unfortunately that's not exactly correct. It's not mentioned in this article, but the BBC article about this says...

"Apple says the technology will also catch edited but similar versions of original images."

So it's also looking for SIMILAR versions, not just exact matches, so the margin for error, and the chance that irrelevant photos get matched, goes up.

I get the noble (if very selective) intentions behind what they are doing, but it should not be within their remit to do this to private photos that are none of their business.

Fair point, it was not mentioned on MacRumors, but I can indeed see it on the BBC website.

I just did some digging and they have a document on their website explaining how it works: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

Basically they are not using traditional hashing techniques. Instead they have neural networks generate a hash based on the general appearance of the picture (as opposed to computing the hash value from the actual sequence of bits in the file).

So technically it is correct to say that there is no algorithm trying to decide whether a certain type of scene is depicted in the picture. But it is also true that the algorithm actively analyses the shapes in the picture to generate the hash, rather than doing a "dumb" computation based on the sequence of bits in the file.
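For anyone curious what "hashing the appearance rather than the bytes" can look like, here is a toy average-hash (aHash) sketch using Pillow. To be clear, this is NOT NeuralHash, just the simplest possible perceptual hash, to show how two visually similar files can land on nearby hash values even when their bytes differ completely:

```python
# Toy perceptual hash: shrink to 8x8 greyscale, then record which pixels are
# brighter than the average. Visually similar images tend to produce hashes
# differing in only a few bits, so matching uses a Hamming-distance threshold.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two hashes disagree."""
    return bin(a ^ b).count("1")

# Hypothetical filenames: the same photo saved at two different JPEG
# qualities should sit within a small Hamming distance of itself, e.g.
# hamming_distance(average_hash("a.jpg"), average_hash("b.jpg")) <= 5
```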

To be clear, I don't think the whole thing is a good idea anyway, but assuming it works as described, I don't think this system is at all likely to confuse a homemade picture with a picture from the reference database (it has nothing to do with the typical neural networks that try to recognise *and interpret* the various shapes in a picture, and which are prone to errors).

What remains unclear to me is what that manual review step is. Either it is based solely on the hash comparison, in which case I don't see what a human being can do that the system hasn't done already; or it actually allows a human operator to view the pictures, in which case it is a massive privacy and safety concern (as it implies that, beyond this particular use case, Apple is in general OK with storing user content it can decrypt and giving its employees access to it). Based on the PDF, what is revealed to the employee is the "NeuralHash" and the "visual derivative", but it isn't quite clear what exactly the visual derivative is, or what the person is supposed to do with that information to carry out a manual validation that adds value beyond what the machine has already done.
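One detail the PDF does spell out is that nothing is supposed to reach a human until an account crosses a threshold number of matches; the real mechanism is threshold secret sharing over encrypted "safety vouchers", not a plain counter. The sketch below, with a made-up threshold value, only shows that gating logic:

```python
# Made-up threshold and plain counter, standing in for the threshold
# secret-sharing scheme the technical summary describes: matched images are
# supposed to stay opaque to everyone until enough of them accumulate.
MATCH_THRESHOLD = 10  # hypothetical value, chosen for illustration only

def ready_for_human_review(match_count: int) -> bool:
    """Escalate an account to manual review only past the threshold."""
    return match_count >= MATCH_THRESHOLD
```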
 

DanTSX

Suspended
Oct 22, 2013
1,111
1,505
The exact same technology could be used to detect revenge porn images.

Apple is effectively endorsing revenge porn by refusing to use their capability to eliminate it at no cost.


No, they are not endorsing “revenge porn”.
 

DanTSX

Suspended
Oct 22, 2013
1,111
1,505
Tim Cook strongly believes that human rights are the second most important thing in the world. *looks into distance with tear in eye*

The first most important is Apple's share price.


While I am not a libertarian, I have learned that if you do not specify individual "human rights", then it is usually weasel-talk about greater-good "human rights", which usually means I'm expected to yield something I value to an enlightened decision-maker.
 
  • Like
Reactions: EtCetera2022

ipponrg

macrumors 68020
Oct 15, 2008
2,309
2,087
I am personally not concerned about this feature, but it’s incredibly ironic that Apple preaches privacy yet is scanning your personal pictures with a possible human auditor at the end.

It's humorous, the gymnastics Apple goes through to bend privacy, and somehow there are still people who believe Apple's privacy rhetoric.
 