
pdoherty

macrumors 65816
Dec 30, 2014
1,347
1,612
So, you’re saying that, knowing how the US government CURRENTLY uses tech products to spy on people, they are currently NOT using tech products to spy on people when it comes to unencrypted images stored in cloud image repositories. So, they’re currently using tech products to spy, but, where it makes the conspiracy convenient, they’re absolutely not in one very specific area?

The government currently (supposedly) doesn’t require Apple to monitor for the US government because the government is currently using tech products to spy on people. BUT, the government, with its tech spying, has just been WAITING to hand off the tech spying to Apple and, I suppose, Google, Samsung, Microsoft, etc.? Or, is it that the government ONLY wants to hand off the tech spying to Apple in this case and most certainly not anyone else?
The government definitely wants this, because it sets the precedent of corporations doing warrantless searches of all citizens all the time. No effort on the government’s part to actually find evidence of a crime, do any investigation, and obtain warrants where applicable.
 

pdoherty

macrumors 65816
Dec 30, 2014
1,347
1,612
The government has the capability NOW to access the images that folks are saying this “will give” the government a new tool for. “The government might use this for…” No. If the government has the equivalent of a network-enabled search engine, they are not going to use the equivalent of a physical library card catalog and perform the equivalent of walking around a big building to find the information.
If they already have this capability then why is this CSAM even needed? You don’t seem to be following your own logic to its conclusion.
 

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
The CSAM scanners caught an evil pedophile!
"A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal."
https://archive.ph/tL6wk
As the article clearly states, this “catch” was based on “AI” software that scans images for things that “look like” children being exploited. Quoting a paragraph (and the first sentence of the next paragraph) from the article:

A bigger breakthrough came along almost a decade later, in 2018, when Google developed an artificially intelligent tool that could recognize never-before-seen exploitative images of children. That meant finding not just known images of abused children but images of unknown victims who could potentially be rescued by the authorities. Google made its technology available to other companies, including Facebook.​
When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them.​

This is, very specifically, NOT anything like the mechanism that Apple has built, which looks only for matches to an existing set of known bad images (CSAM material that is already in circulation among pedophiles). What Google built is exactly the kind of system that is the problem. Google has a long history of building privacy-intrusive systems with a “what could possibly go wrong” attitude, and that’s exactly what you’re seeing in that article.

Trying to present that article as an argument against Apple’s system (which would be the only logical reason I can see for posting it here) is along the same lines as saying, “a car manufacturer has made bad seatbelt and airbag systems, let us therefore ban all seatbelts and air bags from cars.”
 
  • Like
Reactions: dk001 and I7guy

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
How does this CSAM thing protect any children from abuse? All it does is prevent (or catch) those who upload existing, already-known-offending, images.
The people abusing the kids and taking these pictures and sharing them with others are invariably also collecting similar pictures from other pedophiles - they trade them around, apparently. If you don’t catch them for the picture they took, you catch them for the pictures they have that others took - the ones already known to the system. Then when you investigate them, you find the pictures they are taking and the children they are abusing.
 

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
If they already have this capability then why is this CSAM even needed? You don’t seem to be following your own logic to its conclusion.
“CSAM” is, quite literally, “Child Sexual Abuse Material”. Nobody needs it. Please keep the language straight - CSAM is not code, it is not an algorithm, it is not anything that Apple or Google has written - it is photos or movies of kids being abused and/or raped. Take care with how you toss the term around.

There is an organization called the National Center for Missing and Exploited Children (NCMEC), who are the only ones in the US legally allowed to possess such images. They have them for the purpose of compiling a database of hashes of these images, which they then make available. The hash makes it easy to test if a given random image is in the database (you hash it in the same way and look up the resulting number to see if it’s there), but impossible to go the other way - there’s simply no way to reconstruct an image from the hash value.
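To make that one-way lookup concrete, here is a minimal sketch in Swift of a hash-based membership test. It uses an ordinary SHA-256 from CryptoKit purely for illustration - Apple’s actual system uses a perceptual “NeuralHash”, and the database loader and function names here are hypothetical stand-ins:

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the NCMEC-derived list of known-image hashes.
// In the real design the list is distributed to devices in a blinded,
// encrypted form, not as plain hex strings.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownBadHashes = loadKnownHashDatabase()

// One-way hash of an image's raw bytes: cheap to compute, but there is
// no way to reconstruct the image (or anything about it) from the value.
func hashValue(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Membership test: is this exact image in the known database?
// A new photo that merely resembles abusive material cannot match.
func matchesKnownImage(_ imageData: Data) -> Bool {
    return knownBadHashes.contains(hashValue(of: imageData))
}
```

The real system layers blinding and a match threshold on top of this, but the core check is just a set-membership test like the one above.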

The code that Apple wrote takes each of your images immediately before uploading to iCloud, hashes it, and attempts to look it up in the database. When it fails to match, the image is uploaded as normal. Because Apple can tell the image is safe before uploading, it can encrypt the image, with a key only you possess, before uploading. So all your images on iCloud are protected from spying eyes, whether they be some government agency, a rogue Apple employee, a hacker, or whatever - even if they can get the file, it’s meaningless gibberish to them without the key required to decrypt it.
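A rough sketch of that client-side ordering (check the hash first, then encrypt with a key only the user holds, then upload), reusing the matchesKnownImage idea from the sketch above; the key handling, uploader, and error type here are simplified stand-ins, not Apple’s actual safety-voucher design:

```swift
import Foundation
import CryptoKit

// Stand-ins so the sketch is self-contained: the hash lookup from the
// previous sketch, and a placeholder uploader (network code omitted).
func matchesKnownImage(_ imageData: Data) -> Bool { return false }
func uploadToCloud(_ ciphertext: Data) { /* ... */ }

enum UploadError: Error { case flaggedForReview }

// A key that never leaves the user's device (a hypothetical stand-in
// for Apple's real per-account key management).
let userKey = SymmetricKey(size: .bits256)

func processForUpload(_ imageData: Data) throws {
    // 1. Hash and check against the known-image database *before* upload.
    if matchesKnownImage(imageData) {
        // In Apple's published design a match only produces an encrypted
        // "safety voucher", and nothing becomes readable by Apple until a
        // threshold number of matches is crossed; simplified to an error here.
        throw UploadError.flaggedForReview
    }
    // 2. No match: encrypt with the key only the user possesses...
    let sealedBox = try AES.GCM.seal(imageData, using: userKey)
    guard let ciphertext = sealedBox.combined else { return }
    // 3. ...and upload. The server only ever stores ciphertext it cannot read.
    uploadToCloud(ciphertext)
}
```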

The other approach is to upload all your photos unencrypted and then scan them all on the server for CSAM (this is no doubt already happening in Apple’s case, and is definitely happening on Google’s servers, as demonstrated by the article cited earlier today), and then they’re sitting on the server without any encryption, where that government agency, or rogue employee, or hacker, can do whatever they want with them, if they can get into the server.

Apple literally built a system that is more private for their users, and people screamed bloody murder because the scanning is happening on their phone immediately before uploading, rather than on the server immediately after uploading, like everyone else is already doing.

The comment you were replying to was making the point that people who are getting worried about “what if they put an image of $POLITICIAN into the CSAM hash database!!1!” are entirely missing the point that the government already has a much easier mechanism to abuse - coerce $COMPANY to let them run any scanner of said government’s choosing (including an AI one that looks for any image vaguely resembling $POLITICIAN, or, say, any image of someone wearing a red hat)… across all those unencrypted images currently sitting on their server.

The database-of-hashes approach will only work with specific images, not “types” of images, or items in an image, only specific images - it’s much harder to misuse for nefarious purposes than what we’ve already got running right now.
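As a tiny usage illustration of that “specific images only” property, continuing the simplified SHA-256 sketch from earlier (Apple’s real NeuralHash additionally tolerates resizing and recompression of the same image, but still does not match different images or “types” of images):

```swift
import Foundation

// hashValue(of:) is the SHA-256 helper from the earlier sketch.
// Placeholder byte strings standing in for image files.
let knownImage = Data([0x01, 0x02, 0x03])
let exactCopy = Data([0x01, 0x02, 0x03])
let differentPhoto = Data([0x01, 0x02, 0x04])

// An exact copy of a known image matches...
print(hashValue(of: knownImage) == hashValue(of: exactCopy))      // true

// ...but a different photo, whatever its subject, produces an
// unrelated hash and cannot be flagged by the database lookup.
print(hashValue(of: knownImage) == hashValue(of: differentPhoto)) // false
```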

You can’t judge these things in a vacuum, you have to look at them in comparison to the alternatives. And a lot of people have been trained to not look further than the sound bite they’re currently being fed.

(Edit: fixed a single-character typo.)
 
Last edited:

pdoherty

macrumors 65816
Dec 30, 2014
1,347
1,612
“CSAM” is, quite literally, “Child Sexual Abuse Material”. Nobody needs it. Please keep the language straight - CSAM is not code, it is not an algorithm, it is not anything that Apple or Google has written - it is photos or movies of kids being abused and/or raped. Take care with how you toss the term around.
You misunderstood my point. The poster I replied to said the government could already do the things this CSAM scanning is doing (and suggested that was a reason to not be bothered by the CSAM scanning). I was asking if the government already can do this, why do we need CSAM scanning on iPhones/iPads?
 
Last edited:

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
You misunderstood my point. The poster I replied to said the government could already do the things this CSAM scanning is doing (and suggested that was a reason to not be bothered by the CSAM scanning). I was asking if the government already can do this, why do we need CSAM scanning on iPhones/iPads?
No, I didn’t misunderstand. The government can already coerce Apple to run whatever arbitrary scan they want, across all those unencrypted images in iCloud, with precisely the same amount of bending of the rules as it would take to force an image of somebody ($POLITICIAN in previous examples) into the CSAM image hash database.

The indications are that Apple is already running scans for CSAM material (though it’s unclear if they’re using the CSAM hash database method or something else - I don’t think they’ve ever explained it publicly), and we have proof that Google is running such scans - in a worse form, the “AI” way, as reported in the NYTimes article mentioned a few posts back.

With precisely the same amount of effort that it would take the government to say, “psst, Apple, you have to add this image hash to the CSAM database and don’t tell anybody”, the government could currently say, “psst, Apple, you have to add this government-designed bit to your iCloud photo scanner, and don’t tell anybody”.

Scanning on your phone before upload, using the CSAM database, only looks for very specific images, and winds up with the images on Apple’s servers in a form that is impossible to scan further. Scanning them on the server leaves them open to all sorts of additional scanning and/or theft in the future.

The “CSAM hash database / scan before uploading / encrypt before uploading” method offers more security than the “upload everything unencrypted and let Apple or whoever scan whenever needed” method. They wouldn’t do both; the new one would replace the old one (that’s the whole point of building it). Now, you might think, “well, but I don’t want my pictures scanned via either method!” Yeah, that’s not an option. If your pictures go on a server on the internet, they’re going to be scanned. If you want to opt out, you’ll have to convince the government, not Apple. Of the two options, I’d rather have the one that’s (A) harder for the government to subvert, and (B) harder for hackers to use to steal my pictures off the server. Your other option, of course, is to never upload your pictures to Apple - then they’ll never get scanned (by either method).
 
Last edited:

BaldiMac

macrumors G3
Jan 24, 2008
8,775
10,900
You misunderstood my point. The poster I replied to said the government could already do the things this CSAM scanning is doing (and suggested that was a reason to not be bothered by the CSAM scanning). I was asking if the government already can do this, why do we need CSAM scanning on iPhones/iPads?
As the poster said, it would be “needed” if Apple enacted e2e encryption for photos. Currently, Apple can access and scan your photos as legally required by court order because they have access to your encryption key. If e2e encryption is enabled, only the user would have access, so something like CSAM scanning would deter users from storing illegal images on Apple’s servers.
 
  • Like
Reactions: CarlJ

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,941
Sage, Lightning, and Mountains
Thanks. Added an archive.

So Google and others are not looking just for CSAM, but for any material their “tool” thinks might fall into that category.
It does make a great point about putting all your eggs in one basket AND how people can be condemned and assumed guilty without being arrested, prosecuted, and found guilty.
 

steve09090

macrumors 68020
Aug 12, 2008
2,144
4,135
So, because citizens didn’t push back to stop ‘The Patriot Act’ and its intrusions into privacy, you use that as a reason to not push back when they attempt it again?
How is hashing an intrusion into privacy? Unless you are talking about the privacy of paedophiles???
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
That's what can happen with CSAM, face the future.

 

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,941
Sage, Lightning, and Mountains
How is hashing an intrusion into privacy? Unless you are talking about the privacy of paedophiles???

I think you are looking at this incorrectly.

The reason why - the proposed crime - is immaterial. Cloud and email scanning are bad enough, but having a third party scan your information on-device, solely for the purpose of looking for potentially illegal material, without a warrant and outside of law enforcement, is a deliberate intrusion and questionable from a legal standpoint. At least here in the US - the proposed starting point for Apple's CSAM tool.
 
  • Like
Reactions: pdoherty

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,941
Sage, Lightning, and Mountains
That's what can happen with CSAM, face the future.


From my understanding, Apple already has AI of that caliber for photos on-device.
 

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
That's what can happen with CSAM, face the future.

Maybe scroll up even just a little before posting?

#278
 
  • Like
Reactions: I7guy

CarlJ

macrumors 604
Feb 23, 2004
6,971
12,135
San Diego, CA, USA
Here’s an example where you get false flagged and lose account access because of CSAM algorithms.

Which is nothing like what Apple is proposing. Google is using AI code to look for things that might look like abuse, and then assuming they are. Apple's design uses a database of hashes of images that are already circulating among pedophiles. This was all discussed yesterday ==> #278
 

antiprotest

macrumors 601
Apr 19, 2010
4,003
14,042
My anecdotal observation is that Apple's AI is pretty good. I hold no claim to specific evidence of what it is used for.
On my own Apple devices, searching the term "food" brings up pictures of cakes and sushi, but also a sizable number of um... human anatomy pictures. I suppose you can put your mouth to those parts but they are usually not for consumption unless you're a cannibal. And this is only one example. Apple image recognition is actually very poor even with basic and obvious items.
 
Last edited:
  • Like
Reactions: Canyonero

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
Maybe scroll up even just a little before posting?

#278
Sorry, but your Apple-sided arguments don’t put Apple in a better light than Google.

Apple already utilizes more or less the same kind of heuristic and AI recognition of illegal content on its iCloud storage. Why do you think they don’t properly encrypt iCloud? On top of that comes the newer on-device CSAM scanning mechanism.

This sad news is just a glimpse, and proof, of what we will increasingly face in the future with all those wannabe „protective mechanisms“.

Don’t you think the bad guys read the news? They aren’t all plain stupid; they will simply circumvent it.

It’s the innocent ones like the guy in the news who will take the biggest hit.
It will get even worse if somebody who fully relies on a single ecosystem like Apple’s gets a false positive.
In a blink you lose access to all your stuff: photos, documents, hardware activations, your credit card, digital driver’s license, digital insurance card, etc.

Imagine being on holiday in a foreign country and suddenly your credit card/Apple Pay access is gone, just because you took a panoramic beach photo with a few naked toddlers running around in the background.

I doubt Apple would act very differently in this case, and I bet it would take ages and a lot of headaches to get your account reactivated (if ever). Just take Apple’s App Review rejection appeal process as an example of how badly this will work out.
 
Last edited:

pdoherty

macrumors 65816
Dec 30, 2014
1,347
1,612
How is hashing an intrusion into privacy? Unless you are talking about the privacy of paedophiles???
Using your device's battery/CPU to scan everything you ever shoot a photo of as it's about to be uploaded to be sure you're not a criminal is an obvious violation of a person's privacy. And it's also an easy way to set someone up. Better never leave your phone unlocked or in the hands of someone who knows your password - they can just download some of this crap to your photos collection and then close the browser leaving you none the wiser.
 
  • Haha
Reactions: steve09090

dk001

macrumors demi-god
Oct 3, 2014
10,601
14,941
Sage, Lightning, and Mountains
Sorry, but your Apple-sided arguments don’t put Apple in a better light than Google.

Apple already utilizes more or less the same kind of heuristic and AI recognition of illegal content on its iCloud storage. Why do you think they don’t properly encrypt iCloud? On top of that comes the newer on-device CSAM scanning mechanism.

This sad news is just a glimpse, and proof, of what we will increasingly face in the future with all those wannabe „protective mechanisms“.

Don’t you think the bad guys read the news? They aren’t all plain stupid; they will simply circumvent it.

It’s the innocent ones like the guy in the news who will take the biggest hit.
It will get even worse if somebody who fully relies on a single ecosystem like Apple’s gets a false positive.
In a blink you lose access to all your stuff: photos, documents, hardware activations, your credit card, digital driver’s license, digital insurance card, etc.

Imagine being on holiday in a foreign country and suddenly your credit card/Apple Pay access is gone, just because you took a panoramic beach photo with a few naked toddlers running around in the background.

I doubt Apple would act very differently in this case, and I bet it would take ages and a lot of headaches to get your account reactivated (if ever). Just take Apple’s App Review rejection appeal process as an example of how badly this will work out.

It's the "Automatic Guilty" aspect that is really concerning. Especially in the Google example, where the assumption was that it was A: Illegal and he was B: Guilty. This should have gone to ICMEC/NCMEC at worst. In reality it should never have happened.
 

steve09090

macrumors 68020
Aug 12, 2008
2,144
4,135
Using your device's battery/CPU to scan everything you ever shoot a photo of as it's about to be uploaded to be sure you're not a criminal is an obvious violation of a person's privacy. And it's also an easy way to set someone up. Better never leave your phone unlocked or in the hands of someone who knows your password - they can just download some of this crap to your photos collection and then close the browser leaving you none the wiser.
You worry too much. A plane flies overhead and a person could be up there taking photos of you in your backyard watching the plane flying overhead with a person taking photographs of you watching the plane flying overhead.
Privacy breach right there!
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
It's the "Automatic Guilty" aspect that is really concerning. Especially in the Google example, where the assumption was that it was A: Illegal and he was B: Guilty. This should have gone to ICMEC/NCMEC at worst. In reality it should never have happened.
Yep, and also that there is no way to quickly appeal and reactivate the account, despite being found innocent. An official „case closed“ letter should have been enough to reactivate it in this case. Imagine if he were an indie dev - this would be a disaster for him and all his customers.

I hope he finds a way to sue Google and get compensation for damages.

But it’s inevitable: these kinds of „protection mechanisms“ will become the default everywhere. It’s just a matter of time, same for Apple.

Anyway, people have to realize that relying on a single ecosystem might break their neck. It’s easy to get used to the comfy smart digital world and become careless with these kinds of things, until it hits you hard when you least expect it.
 
  • Like
Reactions: nebojsak and dk001