
CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
Using your device's battery/CPU to scan everything you ever shoot a photo of as it's about to be uploaded to be sure you're not a criminal is an obvious violation of a person's privacy.
Your device's battery/CPU is already being used to scan everything you ever shoot a photo of, doing AI object/face recognition, so that when you search your photos for "tree" or "airplane" or "baby" or "dog" or "George" or "Martha", it can quickly bring up images that contain the requested subject. This has been going on for years - they made a big deal about it when it was first implemented. Hashing the image will take a minute fraction of the power already being used. And those images are already being scanned as soon as they arrive on the iCloud servers (and the on-device hashing wouldn't happen except to images about to be uploaded to iCloud). If you don't want your images scanned at all, you'd better stop using any smartphone and any cloud storage service for photos.
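To be concrete about what "hashing" means here - a minimal sketch in Swift of the matching flow, with SHA-256 standing in for Apple's perceptual NeuralHash (the real hash is designed to survive resizing and recompression, the real database is blinded, and the names below are made up for illustration):

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database of known-image hashes
// (it would ship with the OS; empty here so the sketch compiles).
let knownImageHashes: Set<String> = []

// Hash a photo's bytes. SHA-256 is only illustrative; NeuralHash is a
// perceptual hash computed from image content, not raw bytes.
func photoHash(_ photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Run only on photos queued for iCloud upload. A photo can match only
// if that same image is already in the database - a brand-new photo,
// whatever it depicts, cannot.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownImageHashes.contains(photoHash(photoData))
}
```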

And it's also an easy way to set someone up. Better never leave your phone unlocked or in the hands of someone who knows your password - they can just download some of this crap to your photos collection and then close the browser, leaving you none the wiser.
This argument is entirely irrelevant as a point against Apple's hashing mechanism because YOU CAN ALREADY SET SOMEONE UP LIKE THIS RIGHT NOW USING AN IPHONE OR AN ANDROID PHONE - in either case, as soon as the picture is on your phone, the phone uploads it to the cloud, and the scanners that are ALREADY RUNNING IN THE CLOUD will scan the images.

Apple presented a new implementation of scanning that makes it more private and harder for governments to abuse, and everyone is arguing WE DON'T WANT THIS NEW THING - well, then, you're tacitly arguing in favor of keeping the EXISTING system that is uploading your images unprotected and then scanning them in the cloud RIGHT NOW.

If you don't want ANY scanning at all, then your argument isn't with Apple, it's with the government.

Also, Apple implemented Touch ID and Face ID to make it far more likely that your phone is locked when you're not using it. And if you're handing out your password to anyone else, you're being incredibly foolish.
 

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
Yep, and also that there is no way to quickly appeal and reactivate the account, despite being found innocent. An official "case closed" letter should have been enough to reactivate it in this case. Imagine if he were an indie dev; this would be a disaster for him and all his customers.

I hope he finds a way to sue Google and get compensation for damages.
Entirely agreed with this. I don't trust Google any more than I absolutely have to. They have a bad track record on privacy. And it's outrageous that Google trusts its own opinion of whether something is law-breaking more than the actual police - it's as though they have their own Google Police that supersedes the government. The NYTimes article suggested that he considered suing Google but decided it wasn't worth the cost. I'd love to see the ACLU or some such take the case up pro bono.
 

BaldiMac

macrumors G3
Jan 24, 2008
8,778
10,903
Sorry, but your Apple-sided arguments don't put Apple in a better light than Google.

Apple already utilizes more or less the same kind of heuristic and AI recognition of illegal content on their iCloud storage.
Source?

Why do you think they don't decently encrypt iCloud?
They do decently encrypt iCloud. They just have access to the encryption key.
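To make that distinction concrete (a sketch only, with invented names; not Apple's actual key management): "encrypted, but the provider holds the key" means the ciphertext on disk is real, yet the provider can decrypt - and therefore scan - whenever it chooses:

```swift
import Foundation
import CryptoKit

// Server-side encryption at rest: the provider, not the user, generates
// and stores the key. The data really is encrypted on disk.
let providerHeldKey = SymmetricKey(size: .bits256)

func encryptAtRest(_ plaintext: Data) throws -> Data {
    // .combined packs nonce + ciphertext + auth tag into one blob.
    try AES.GCM.seal(plaintext, using: providerHeldKey).combined!
}

// Because the provider holds providerHeldKey, it can always do this,
// e.g. to scan content or to answer a warrant.
func providerDecrypts(_ blob: Data) throws -> Data {
    try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: providerHeldKey)
}
```

End-to-end encryption changes only who holds that key: derive it on the device and never upload it, and the same ciphertext becomes unreadable (and unscannable) server-side.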
 
  • Like
Reactions: CarlJ

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany


However, Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task.

Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.


A tiny scale across 588 million Apple users - surely a tiny scale 🤣. I doubt they were being 100% honest.

They do decently encrypt iCloud. They just have access to the encryption key.

Didn't you notice, while writing that, that your argument is already flawed and negates "decently"?
🤣 "decently encrypt iCloud" + "JUST have access to the encryption key" = "NOT decently encrypted iCloud"
 
Last edited:
  • Like
Reactions: dk001

BaldiMac

macrumors G3
Jan 24, 2008
8,778
10,903

However, Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task.

Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.


A tiny scale across 588 million Apple users - surely a tiny scale 🤣. I doubt they were being 100% honest.
So your source directly contradicts your claim.

Didn't you notice, while writing that, that your argument is already flawed and negates "decently"?
🤣 "decently encrypt iCloud" + "JUST have access to the encryption key" = "NOT decently encrypted iCloud"
You're just playing word games. My point stands.
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
So you're going to pretend that you meant email when you clearly referred to encrypted iCloud storage (which your source says is not scanned)?
No, they only said that iCloud backups are not scanned (which I personally don't believe).

Anyway, iCloud backups != iCloud storage. In addition to the email scans, they mentioned that they also scan "other data" - a very expandable statement that they didn't want to go deeper into.

So why the mystery? What do you expect it is?
Of course, all the stored iCloud documents, a.k.a. storage: everything you can access through the Files app and all the content other apps sync up to iCloud storage (iPhone: Settings -> iCloud -> [Apps using iCloud]). There is nothing else interesting among personal files for them to scan. An iCloud backup is just a duplicate of this synced-up content plus a few iOS internals.

Btw, app-to-iCloud content sync is ON by default for all apps.
 
Last edited:
  • Like
Reactions: dk001

BaldiMac

macrumors G3
Jan 24, 2008
8,778
10,903
No, they only said that iCloud backups are not scanned (which I personally don't believe).

Anyway, iCloud backups != iCloud storage. In addition to the email scans, they mentioned that they also scan "other data" - a very expandable statement that they didn't want to go deeper into.

So why the mystery? What do you expect it is?
Of course, all the stored iCloud documents, a.k.a. storage: everything you can access through the Files app and all the content other apps sync up to iCloud storage (iPhone: Settings -> iCloud -> [Apps using iCloud]). There is nothing else interesting among personal files for them to scan. An iCloud backup is just a duplicate of this synced-up content plus a few iOS internals.

Btw, app-to-iCloud content sync is ON by default for all apps.
Again, your source directly contradicted your claim that Apple is scanning photos like Google in encrypted iCloud storage.

“It has not, however, been scanning iCloud Photos or iCloud backups.”
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
Again, your source directly contradicted your claim that Apple is scanning photos like Google in encrypted iCloud storage.

“It has not, however, been scanning iCloud Photos or iCloud backups.”
It's comfortable to look away and whitewash, but it doesn't make things better.
 

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,942
Sage, Lightning, and Mountains

rme

macrumors 6502
Jul 19, 2008
292
436
“CSAM” is, quite literally, “Child Sexual Abuse Material”. Nobody needs it. Please keep the language straight - CSAM is not code, it is not an algorithm, it is not anything that Apple or Google has written - it is photos or movies of kids being abused and/or raped. Take care with how you toss the term around.
Stop spreading nonsense and INFORM yourself.
If you take a picture of your toddler in the bath and it shows her/his genitals, you can be accused of creating CSAM, and that picture could end up in the NCMEC database.
 
  • Disagree
Reactions: dk001

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
Stop spreading nonsense and INFORM yourself.
If you take a picture of your toddler in the bath and it shows her/his genitals, you can be accused of creating CSAM, and that picture could end up in the NCMEC database.
I'm pretty well INFORMED, thanks. Rather than throwing the entire Wikipedia page at me with a general-purpose, outraged "figure it out" comment - and quoting one paragraph of my post seemingly at random (it's not clear to me if you just felt like quoting the first paragraph or if you don't understand it) - how about you tell me specifically which part of what I said you disagree with? Nothing I said was nonsense (I did just now fix a one-character typo).

If Apple moves to the proposed method of attempting to match against the CSAM hash database before uploading photos, then the only way an image of yours will match is if it's ALREADY in the database. If you take the picture you describe, it won't already be in the database, so it won't match. If it doesn't match, it won't get flagged for review, and thus it won't come to the attention of the authorities, nor will it get put into the database. The hash testing method doesn't look for images that "look like" something (bare skin or whatever), it only matches against images already in the database.

The method that Google is currently using (AI software that guesses that it is seeing a bad picture), can cause precisely the problem you describe, as demonstrated in the article that several people have posted recently in this thread. That's an entirely different system than the one Apple is proposing. At this point, it is presumed that Apple is also scanning images on the iCloud servers as pictures are uploaded, but to my knowledge they've never stated this explicitly, nor what method they may be using for such scanning.
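The failure mode described above belongs to the classifier approach, not the hash approach. A toy contrast in Swift (illustrative only; the score function and names are invented):

```swift
// Classifier-style scanning (roughly Google's approach): a model scores
// *every* image, so a novel, innocent bath photo can cross the threshold
// and be falsely flagged.
func classifierFlags(modelScore: Double, threshold: Double = 0.9) -> Bool {
    modelScore >= threshold
}

// Hash-style matching (Apple's proposal): a photo is flagged only if its
// hash is already in the known-image database, so a novel photo cannot
// match, no matter what it depicts.
func hashMatchFlags(photoHash: String, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(photoHash)
}
```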
 
  • Like
Reactions: I7guy

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,942
Sage, Lightning, and Mountains
...

If Apple moves to the proposed method of attempting to match against the CSAM hash database before uploading photos, then the only way an image of yours will match is if it's ALREADY in the database. If you take the picture you describe, it won't already be in the database, so it won't match. If it doesn't match, it won't get flagged for review, and thus it won't come to the attention of the authorities, nor will it get put into the database. The hash testing method doesn't look for images that "look like" something (bare skin or whatever), it only matches against images already in the database.

...
Emphasis above added.

Not exactly true, which is why Apple has (had) set a threshold of 30 matches before prompting review by an Apple employee (?) for confirmation. NCMEC already sees this with the existing tool MS helped develop for this type of matching.
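For reference, the threshold mechanic is easy to sketch (invented names; Apple's actual design uses threshold secret sharing over "safety vouchers", so its servers learn nothing about any match below the 30th):

```swift
// Toy per-account counter standing in for Apple's threshold scheme:
// nothing is surfaced for human review until enough matches accumulate.
struct MatchTally {
    static let reviewThreshold = 30  // Apple's stated value
    private(set) var matches = 0

    // Returns true only once the account crosses the review threshold.
    mutating func recordMatch() -> Bool {
        matches += 1
        return matches >= Self.reviewThreshold
    }
}
```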
 

pdoherty

macrumors 65816
Dec 30, 2014
1,348
1,612
I wonder if all the Apple apologists in these CSAM threads, who rebuked all of us for saying that Apple could easily misuse the on-device scanning when instructed by foreign governments (like China), are scratching their heads now that just this week Apple disabled AirDrop so Chinese protestors couldn't exchange files.
 

hagar

macrumors 68000
Jan 19, 2008
1,975
4,951
I wonder if all the Apple apologists in these CSAM threads, who rebuked all of us for saying that Apple could easily misuse the on-device scanning when instructed by foreign governments (like China), are scratching their heads now that just this week Apple disabled AirDrop so Chinese protestors couldn't exchange files.
Why would you assume Apple needs the excuse of a CSAM scanner to do this? They could very well already scan and tag every single photo on your phone and in the cloud without us knowing anything about it. Either you trust Apple, or you don't.

It's also wildly hypocritical to condemn Apple every time they comply with local authorities in the countries where they sell their products. Either they comply or they drop out of that market. But as there's so much disapproval of doing business with or in China, why not impose restrictions on every US company, instead of only blaming Apple?
 

snipper

macrumors regular
Feb 9, 2004
233
30
I wonder if all the Apple apologists in these CSAM threads, who rebuked all of us for saying that Apple could easily misuse the on-device scanning when instructed by foreign governments (like China), are scratching their heads now that just this week Apple disabled AirDrop so Chinese protestors couldn't exchange files.
Of course, it will all be better once Apple pulls out of China altogether and the Chinese can only buy hardware that is 100% controlled by the Chinese government.

And of course all the high-road riders themselves would never use anything from 'bad' countries.

Oh, wait, that would leave nothing left to buy.
 
  • Like
Reactions: CarlJ and hagar

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,942
Sage, Lightning, and Mountains
Why would you assume Apple needs the excuse of a CSAM scanner to do this? They could very well already scan and tag every single photo on your phone and in the cloud without us knowing anything about it. Either you trust Apple, or you don't.

It's also wildly hypocritical to condemn Apple every time they comply with local authorities in the countries where they sell their products. Either they comply or they drop out of that market. But as there's so much disapproval of doing business with or in China, why not impose restrictions on every US company, instead of only blaming Apple?

You bring up a great point - my emphasis.
Apple and Google - I trust neither implicitly.
At this point in time, when it comes to privacy, I rate Apple slightly below Google for mobile devices. As far as I can determine, if I turn a feature off in Android 13, it is actually off. I had thought the same for Apple, but ...

Great point.
 
  • Like
Reactions: pdoherty

I7guy

macrumors Nehalem
Nov 30, 2013
34,303
24,032
Gotta be in it to win it
Why would you assume Apple needs the excuse of a CSAM scanner to do this? They could very well already scan and tag every single photo on your phone and in the cloud without us knowing anything about it. Either you trust Apple, or you don't.

It's also wildly hypocritical to condemn Apple every time they comply with local authorities in the countries where they sell their products. Either they comply or they drop out of that market. But as there's so much disapproval of doing business with or in China, why not impose restrictions on every US company, instead of only blaming Apple?
I trust both Apple and Google to do what they say. But I believe there are articles, and it's been shown, that Google tracks you much more than Apple does.

And as we all know, with Google, you are the product.

And I agree that the critics seemingly only virtue-signal their criticism of Apple.
 
  • Like
Reactions: CarlJ