
xpxp2002

macrumors 65816
May 3, 2016
1,162
2,740
Okay, so I am confused.

iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.

The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" means an image that is already in the CSAM database and is found to be an IDENTICAL match to an image in your iCloud library.
It isn't the same implementation.

The object/person detection uses the on-device neural engine, and those identifications never leave the device. With CSAM detection, there is a threshold, which Apple is not disclosing, that will trigger an upload of your private photos once their criteria are met.

Moreover, this hash database is unverified and unaudited. The US-funded NCMEC blindly provides these hashes, and Apple merely has to assume that the database it will be pushing down to every iOS device genuinely represents only CSAM material. What happens when unscrupulous politicians (from the US or its allies) start leveraging NCMEC to push hashes of "undesirable" political content into that database? Legal political dissent will be flagged as "CSAM," and that "evidence" will be used to falsely accuse, and likely arrest, political dissidents and label them as peddlers of CSAM. That reputational damage will be difficult to disprove against the power of large nation-states determined to take down political opponents, let alone overcoming being framed with false evidence. Frankly, I'm shocked Apple hasn't considered that it will be complicit every time this happens, by enabling and pushing this technology on everyone because it was too weak-willed to say "no".

This is an authoritarian's dream just waiting to be unleashed. Politicians are going to be chomping at the bit to push their own content into those hash databases once this is rolled out. It's only a matter of time before a major world power like the US or China passes a law to make it happen -- if it doesn't happen in secret without your knowledge, while falsely accused possessors of "CSAM" are spirited away in the middle of the night to black sites.
 

rme

macrumors 6502
Jul 19, 2008
292
436
Besides which, as I understand it, the hashes come from two different sources.
These sources all cooperate and share data with law enforcement and security services in different countries.
 

ivan86

Contributor
Feb 24, 2016
152
312
Moscow / Berlin
The most obvious thing in all of this: after these plans have been discussed everywhere, does anyone think that some pedophile will still use iCloud or the Photos app to view explicit content on their phone?

Any file manager app, to which anything can be uploaded from a computer, can be used for that instead.

What is the plan here, really? To catch the three pedophiles who have no internet access and have never heard of the CSAM discussions? Seriously?

I see absolutely no difference between this and the wish of some politicians and agencies to have legal backdoors in communication apps or in the operating systems themselves.

Any bad actor who needs to can simply switch to any kind of encrypted messenger, including self-hosted open-source solutions.
Thus, the only thing that will be achieved is exposing everyone on these platforms to hackers who steal personal and confidential information to make money from it or run scam operations.

This is simply not the way to catch “bad actors” at all.

CSAM scanning must be scrapped before governments everywhere start pushing for full surveillance, knowing that the technical capability exists.

In the encryption and backdoor debates, the last line of defense was often "there is no technical possibility to create such a backdoor, due to the nature of end-to-end encryption".
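
To put that "last line of defense" in concrete terms, here is a minimal Swift/CryptoKit sketch of generic end-to-end encryption (not any particular messenger's protocol, and the message text is made up): the two endpoints agree on a key and encrypt, so a relaying server only ever handles ciphertext it has no technical means to read.

```swift
import CryptoKit
import Foundation

// Generic end-to-end encryption sketch: not any specific messenger's protocol.
do {
    // Each party generates its own key pair; only public keys are exchanged.
    let alice = Curve25519.KeyAgreement.PrivateKey()
    let bob   = Curve25519.KeyAgreement.PrivateKey()

    // Both sides derive the same shared secret from their private key
    // and the other party's public key (Diffie-Hellman key agreement).
    let aliceSecret = try alice.sharedSecretFromKeyAgreement(with: bob.publicKey)
    let bobSecret   = try bob.sharedSecretFromKeyAgreement(with: alice.publicKey)

    let aliceKey = aliceSecret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                       salt: Data(), sharedInfo: Data(),
                                                       outputByteCount: 32)
    let bobKey   = bobSecret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                     salt: Data(), sharedInfo: Data(),
                                                     outputByteCount: 32)

    // Alice encrypts; a relaying server only ever sees this opaque ciphertext.
    let sealed = try ChaChaPoly.seal(Data("meet at noon".utf8), using: aliceKey)

    // Only Bob, holding the matching key, can decrypt.
    let plaintext = try ChaChaPoly.open(sealed, using: bobKey)
    print(String(decoding: plaintext, as: UTF8.self))   // "meet at noon"
} catch {
    print("crypto error:", error)
}
```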

CSAM scanning = the technical possibility to scan your photos for absolutely any content that whatever government or agency considers “offensive”.

The logic here is completely transparent, and I do not understand how Apple's leadership does not see it.

Moreover, I believe that introducing CSAM scanning will not only put massive pressure on anything related to whatever “politically incorrect” photos we have, but will also bring a new wave of pushes for all kinds of backdoors and weakened encryption. There is just no way around this.

I hope there will be a much stronger public and governmental push against CSAM scanning so that it never comes to life.
 

iamgalt

macrumors 6502
Jul 25, 2012
479
1,858
It’s surprising to see that kind of stance from a politician; they are not the biggest fans of digital privacy. Very curious whether any major politician in the US will fight CSAM scanning, but I’m not holding my breath.
The only US politician I can see openly standing up against this is Rand Paul because he's big on privacy. But like you, I'm not holding my breath.
 
Last edited:

DanielDD

macrumors 6502a
Apr 5, 2013
524
4,447
Portugal
Those are not the equivalent of this technology. Apple’s new technology is about having a process running on the user’s phone that monitors for illegal activity (1) and reports matches to an authority (2). So this is like having your camera automatically scan for antisocial behaviour and report your GPS location to the local police.

(1) this implies it runs continuously. That is not true. It runs only on photos you upload to the iCloud service.
(2) this implies reports are automatic. That is not true. There's a manual review only after the threshold of 30 matches is met.
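
To make points (1) and (2) concrete, here is a purely illustrative Swift sketch of the flow as described above. The types `Photo` and `SafetyVoucher`, the function names, and the exact-match lookup are hypothetical simplifications, not Apple's actual API or its fuzzy NeuralHash matching:

```swift
import Foundation

// Hypothetical types for illustration only; not Apple's actual implementation.
struct Photo { let id: UUID; let neuralHash: String }
struct SafetyVoucher { let photoID: UUID }

let reviewThreshold = 30                       // the threshold described above
let knownCSAMHashes: Set<String> = []          // on-device copy of the hash database

// (1) Matching happens only as part of an iCloud upload, not continuously.
func uploadToICloud(_ photo: Photo, vouchers: inout [SafetyVoucher]) {
    if knownCSAMHashes.contains(photo.neuralHash) {   // simplified to an exact lookup
        vouchers.append(SafetyVoucher(photoID: photo.id))
    }
    // ... the photo itself is uploaded regardless ...
}

// (2) Nothing is reported automatically; human review is considered only past the threshold.
func shouldTriggerHumanReview(vouchers: [SafetyVoucher]) -> Bool {
    vouchers.count >= reviewThreshold
}
```

The point is simply that, under this description, matching is tied to the iCloud upload path and nothing reaches a human reviewer below the threshold.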

You may not like the service. But this overly exaggerated framing of what Apple is doing is a disservice to people who have legitimate concerns about this technology.
 

fwmireault

Contributor
Jul 4, 2019
2,162
9,243
Montréal, Canada
That might be surprising for some people, but in many democracies, there is actually a wide range of political parties with a wide range of political stances.

That does include parties and politicians who actually care about privacy.
I know how a democracy works, and there’s definitely a bias in political parties towards digital surveillance and control, at least in some countries like the US. Parties that really care about privacy are either very marginal or will never make privacy a priority.
 

rme

macrumors 6502
Jul 19, 2008
292
436
The most obvious thing in all of this: after these plans have been discussed everywhere, does anyone think that some pedophile will still use iCloud or the Photos app to view explicit content on their phone?

Any file manager app, to which anything can be uploaded from a computer, can be used for that instead.

What is the plan here, really? To catch the three pedophiles who have no internet access and have never heard of the CSAM discussions? Seriously?

I see absolutely no difference between this and the wish of some politicians and agencies to have legal backdoors in communication apps or in the operating systems themselves.

Any person who needs to can simply switch to any kind of encrypted messenger. Thus, the only thing that will be achieved is exposing everyone on these platforms to hackers who steal personal and confidential information to make money from it or run scam operations.

This is simply not the way to catch “bad actors” at all.

CSAM scanning must be scrapped before governments everywhere start pushing for full surveillance, knowing that the technical capability exists.

In the encryption and backdoor debates, the last line of defense was often "there is no technical possibility to create such a backdoor, due to the end-to-end encryption methods used".

CSAM scanning = the technical possibility to scan your photos for absolutely any content that whatever government or agency considers “offensive”.

The logic here is completely transparent, and I do not understand how Apple's leadership does not see it.

Moreover, I believe that introducing CSAM scanning will not only put massive pressure on anything related to whatever “politically incorrect” photos we have, but will also bring a new wave of pushes for all kinds of backdoors and weakened encryption. There is just no way around this.

I hope there will be a much stronger public and governmental push against CSAM scanning so that it never comes to life.
According to Apple and all the people here who love Apple's 'solution', the great thing about this is that CSAM that's uploaded to iCloud before the upgrade to iOS 15 won't be scanned!
 

movielad

macrumors regular
Dec 19, 2005
120
219
Surrey
Some rather silly comments here trying to justify this unjustifiable software being on hardware rather than iCloud.

Comments about the camera on the iPhone, etc., actually prove the point: you can choose not to use the camera, just as you can choose most things, but you cannot choose to avoid the software Apple intends to place on your hardware, even if you don't intend to use iCloud for photos.

Regardless, if you upload photos to something like Google Photos, Facebook, Instagram - whatever - they're still scanned for CSAM. There really isn't much of a choice regardless of whether you do it on the phone or in the cloud - the choice, as it were, is not to upload anything to the big cloud providers.
 
  • Like
Reactions: Tagbert

VulchR

macrumors 68040
Jun 8, 2009
3,419
14,314
Scotland
Despite the facts that it scans locally before uploading, that it only uses hashes initially...
People keep raising this as though it makes the template matching somehow independent of the perceptual features of the image. I do not see how that can be if the template matching is fuzzy. The hash will convey information about the content of the image, even if in a digested form; the only way this can work is if the hash somehow captures the essence of the image's content. My concerns with the implementation are whether the false positives will be sensitive pictures of people, whether people's tendency to take multiple pictures of the same scene increases the odds of a false positive, and whether Apple's per-account criterion takes into account that the more pictures in a library, the higher the expected number of false positives overall.
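
For readers unfamiliar with fuzzy hashing, here is a minimal Swift sketch of a generic average hash over an 8x8 grayscale grid with Hamming-distance matching. This is not Apple's NeuralHash, but it illustrates both points above: the hash is derived from the image's visual content, and because matching is a distance comparison rather than exact equality, near-duplicate shots of the same scene land close together, while the expected number of false positives in a library grows with the number of photos.

```swift
import Foundation

// Generic "average hash" over an 8x8 grayscale grid (64 pixel values, 0...255).
// NOT Apple's NeuralHash; only meant to show that a perceptual hash is a
// digest of the image's visual content.
func averageHash(_ pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expects a downscaled 8x8 grayscale image")
    let mean = pixels.map(Int.init).reduce(0, +) / 64
    var hash: UInt64 = 0
    for (i, p) in pixels.enumerated() where Int(p) >= mean {
        hash |= UInt64(1) << UInt64(i)   // set bit i if pixel is at least as bright as average
    }
    return hash
}

// Fuzzy matching compares hashes by Hamming distance rather than exact equality.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Two slightly different shots of the same scene land close together...
let shot1: [UInt8] = (0..<64).map { UInt8(($0 * 4) % 256) }
let shot2: [UInt8] = shot1.map { UInt8(min(255, Int($0) + 3)) }   // tiny brightness change
print(hammingDistance(averageHash(shot1), averageHash(shot2)))    // small distance

// ...which is also why false positives scale with library size: with a per-photo
// false-match probability p, a library of n photos expects roughly n * p matches.
let p = 1e-6, n = 50_000.0
print("expected false positives:", n * p)
```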

As for the principle of this, Apple's goal is laudable, but the price is ushering in distributed AI, including on people's own machines, for surveillance. This was an ominous Rubicon to cross, and Apple was crazy to do so. There is a reason why Google hires behavioural/social scientists - so they don't make stupid decisions like this.
 

justperry

macrumors G5
Aug 10, 2007
12,559
9,749
I'm a rolling stone.
I have been a huge fan of Apple, especially for OS X/macOS and computer hardware, still am with them on that.

BUT Apple is overplaying its hand, and I really don't like the way Apple is going.
Yes, child abuse, in fact any abuse, is a serious problem, but this is not the way to solve it, and it sets a precedent.
 

DanielDD

macrumors 6502a
Apr 5, 2013
524
4,447
Portugal
Some rather silly comments here trying to justify this unjustifiable software being on hardware rather than iCloud.

They are not silly. There are various advantages to doing this on device rather than on a server:

- Security researchers can audit the hash database being used, and the accuracy of the matching process
- No need to decrypt photos on the server to run the matching algorithm
- Only matching photos can be seen by Apple. If a server-based approach were used, all photos could in principle be accessed
- Targeted attacks are much less likely since it's much harder to tamper with an encrypted phone

These are factual advantages of this system. Are there disadvantages? Yes... Potential for government overreach and tampering with the initial database. But these disadvantages also apply to a server-side approach.
 
Last edited:

DanielDD

macrumors 6502a
Apr 5, 2013
524
4,447
Portugal
Regardless, if you upload photos to something like Google Photos, Facebook, Instagram - whatever - they're still scanned for CSAM

You do not need to upload photos. If you have these apps installed on your phone and you grant them access to the photo library, they already scan for CSAM. These apps aggressively pre-upload some of the content in your gallery to speed up the upload process when you make a new post. That's why Apple implemented access to the gallery on a per-photo basis.
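
For reference, the "per-photo" model mentioned above is what the system photo picker provides. A rough Swift sketch (assuming iOS 14 or later): with PHPickerViewController the app never gets blanket photo-library permission and only receives the items the user explicitly hands over.

```swift
import PhotosUI
import UIKit

// Minimal sketch of per-photo access: the picker runs out of process, so the app
// only ever sees what the user explicitly selects.
final class PhotoPickerExample: UIViewController, PHPickerViewControllerDelegate {

    func presentPicker() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 1          // user picks a single image
        config.filter = .images            // photos only, no videos
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController,
                didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        // Only the items the user explicitly selected are exposed to the app.
        guard let provider = results.first?.itemProvider,
              provider.canLoadObject(ofClass: UIImage.self) else { return }
        provider.loadObject(ofClass: UIImage.self) { image, _ in
            // Use the image; the rest of the library remains inaccessible.
            _ = image as? UIImage
        }
    }
}
```

Apps that instead request full library access are the ones able to read, and pre-upload, the whole gallery in the way described above.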
 
Last edited:

sw1tcher

macrumors 603
Jan 6, 2004
5,553
19,572
So many people are against Apple's CSAM detection... This is an opportunity for them to send Apple a message by not upgrading their existing devices to iOS 15 and voting with their wallets by not buying any new Apple device that has it.

But we all know that will not happen.
 
  • Sad
Reactions: boswald

dguisinger

macrumors 65816
Jul 25, 2002
1,098
2,244
So many people are against Apple's CSAM detection... This is an opportunity for them to send Apple a message by not upgrading their existing devices to iOS 15 and voting with their wallets by not buying any new Apple device that has it.

But we all know that will not happen.
They may see a slower adoption rate on the OS upgrades. Not huge, but I bet their reports will show slight hesitation.

Plus, they removed all of their other big features from iOS 15; CSAM detection is pretty much the only thing left.
 

Jim Lahey

macrumors 68030
Apr 8, 2014
2,652
5,435
Anyone who thinks this system won’t morph into something with far more despotic intent is either naive or disingenuous. Apple absolutely will comply with any laws in the markets in which it wishes to do business, and if they say otherwise they are lying. This genie will never go back in the bottle.
 

xpxp2002

macrumors 65816
May 3, 2016
1,162
2,740
So many people are against Apple's CSAM detection... This is an opportunity for them to send Apple a message by not upgrading their existing devices to iOS 15 and voting with their wallets by not buying any new Apple device that has it.

But we all know that will not happen.
It's happening. Just not at the volume that will affect a company like Apple, who sells 40+ million iPhones per quarter.

I've already read quite a few conversations among people who are migrating to non-Google Android alternatives. Unfortunately, it takes time, and the immediate reaction won't be there. It'll be a slow trickle as tens of thousands of people who care about privacy cycle out their iOS devices for alternatives.
 

AndiG

macrumors 65816
Nov 14, 2008
1,011
1,912
Germany
Regardless, if you upload photos to something like Google Photos, Facebook, Instagram - whatever - they're still scanned for CSAM. There really isn't much of a choice regardless of whether you do it on the phone or in the cloud - the choice, as it were, is not to upload anything to the big cloud providers.
This functionality opens a backdoor, and the iPhone is only a hash value away from detecting images like the famous tank picture and informing the Chinese authorities.

„After a Chinese cybersecurity law came into effect in 2017, Apple started storing customer iCloud data—spanning emails, contacts, photos, and geolocation—on computer servers in China and handled by Chinese state employees.“
https://fortune.com/2021/05/18/apple-icloud-data-china/



Furthermore, Apple's arguments are fake. Why on earth should Apple implement a system that searches for specific content on the phone when this is already implemented in the cloud?

The reason is that Apple developed a framework for scanning content and offers this framework to app developers. Maybe Apple wants to keep nudity off the iPhone - not only CSAM. The next step, in iOS 16, has to be to make this framework mandatory - or China makes it mandatory if you want to sell apps in China. But China is only an example - just think of whistleblowers, or content that authorities want to block. You cannot block content on the web - but hey, what if you could block it on every single device?


What we’ve learned: if Tim takes the stage and talks about privacy, what he really means is “blah blah privacy blah blah”. For Tim, “privacy” is just another word for “blah”.



https://iconicphotos.wordpress.com/2009/04/22/the-tank-man/
 
Last edited:

mr.steevo

macrumors 65816
Jul 21, 2004
1,411
940
I’m not clear about this.
Will the scanning of images start on a certain day or does it require the iOS device to be upgraded to iOS 15?
 