
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,805
31,297


Apple on Thursday provided its fullest explanation yet for why it abandoned its controversial plan last year to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.

iCloud-General-Feature.jpg

Apple's statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative's demand that the company "detect, report, and remove" CSAM from iCloud and offer more tools for users to report such content to the company.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
In August 2021, Apple announced plans for three new child safety features, including a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never ended up launching.
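For context on what "detecting known images" means in practice, the general approach behind such systems is to compare a fingerprint (hash) of each photo against a database of fingerprints of already-identified material. The sketch below is only a simplified illustration of that hash-matching idea, using an ordinary cryptographic hash and placeholder digests; Apple's proposed design relied on perceptual hashing (NeuralHash) and a private set intersection protocol, which this does not attempt to reproduce.

```swift
import Foundation
import CryptoKit

// Hypothetical database of digests for already-identified images
// (placeholder values for illustration only).
let knownImageDigests: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a hex-encoded SHA-256 digest of a file's contents.
func sha256Hex(ofFileAt url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Return the files whose contents exactly match a known digest.
func flaggedFiles(in urls: [URL]) -> [URL] {
    urls.filter { url in
        guard let digest = try? sha256Hex(ofFileAt: url) else { return false }
        return knownImageDigests.contains(digest)
    }
}

// Example usage with a hypothetical photo folder.
let photoURLs = (try? FileManager.default.contentsOfDirectory(
    at: URL(fileURLWithPath: NSHomeDirectory() + "/Pictures"),
    includingPropertiesForKeys: nil)) ?? []
print("Matches found: \(flaggedFiles(in: photoURLs).count)")
```

Unlike a perceptual hash, an exact SHA-256 comparison like this only flags byte-identical copies; even a minor re-encode of an image would evade it, which is one reason real-world proposals use perceptual hashing instead.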

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Apple's latest response to the issue comes at a time when the encryption debate has been reignited by the U.K. government, which is considering plans to amend surveillance legislation that would require tech companies to disable security features like end-to-end encryption without telling the public.

Apple says it will pull services including FaceTime and iMessage in the U.K. if the legislation is passed in its current form.

Article Link: Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos
 

jdavid_rp

macrumors regular
Feb 25, 2020
237
766
Isn't this detection feature already running on Apple's servers for iCloud? When this controversy came up, my understanding was that they wanted to move the detection process to customers' devices, so pictures in iCloud are already being monitored.

Edit: A post from The Telegraph talking about it:

 

ThunderSkunk

macrumors 68040
Dec 31, 2007
3,855
4,130
Milwaukee Area
That's nice for iCloud, but it doesn't address the CSAM hash-checking function running locally on machines with macOS 10.15 and newer. If your computer comes with a built-in quiet little snoop that doesn't alert you to the presence of illegal material, but instead just alerts the federal police and wrecks your life, that's probably something everyone with teenagers should consider very deeply.

As of a year ago, in the face of the last round of pushback:
“In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson informed The Verge that Apple's plans for the feature had not changed.”
 

Glenny2lappies

macrumors 6502a
Sep 29, 2006
574
367
Brighton, UK
Good on Apple!

First they came for the socialists, and I did not speak out—
Because I was not a socialist.

Then they came for the trade unionists, and I did not speak out—
Because I was not a trade unionist.

Then they came for the Jews, and I did not speak out—
Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.



With the rise of populist-driven, one-trick-pony political movements, it is truly great to see Apple's stance. Privacy is vital, as is the right to free speech.
 

Unggoy Murderer

macrumors 65816
Jan 28, 2011
1,155
4,017
Edinburgh, UK
I think an on-device, privacy-centric approach could be a good middle ground. It removes the external service and is an enhancement or addition to existing on-device recognition technologies.

For example, if someone used their device to capture such material, or otherwise saved such material to their device, maybe a notice could be displayed: "this content could be illegal and causes real harm". It wouldn't remove it or "report" it external to the device, but it could at least be some way to get someone to think twice about being in possession of such horrendous material.

I think Google does something similar for particular search queries. Like if you searched for something you shouldn't, Google displays a notice along the lines of "you shouldn't be looking for this".
 

xxray

macrumors 68040
Jul 27, 2013
3,077
9,300
Still super grateful Apple listened and had the “courage” to walk this back and then even enable end-to-end encryption in iCloud. It really made me lose faith in Apple for a bit when they were pursuing this. I cancelled iCloud and everything.

Yes, CSAM is the absolute worst of humanity, and people who cause/encourage it deserve to be punished. But not at the expense of ruining privacy for the entire rest of the human population.
 

laptech

macrumors 68040
Apr 26, 2013
3,607
4,007
Earth
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit,"

Is there a deep hidden meaning to this? Basically, is Apple saying, in not so many words, that if they were to create software to scan for CSAM in iCloud, it could fall into the hands of data thieves who would exploit it, and that to prevent such a thing from happening they have no intention of creating the software in the first place? Much like the argument Apple uses for not building security backdoors into iPhones: they worry that such a thing would fall into the hands of criminals who would exploit it, so it is better that no such thing exists.
 

Glenny2lappies

macrumors 6502a
Sep 29, 2006
574
367
Brighton, UK
I think an on-device, privacy-centric approach could be a good middle ground. It removes the external service and is an enhancement or addition to existing on-device recognition technologies.

For example, if someone used their device to capture such material, or otherwise saved such material to their device, maybe a notice could be displayed: "this content could be illegal and causes real harm". It wouldn't remove it or "report" it external to the device, but it could at least be some way to get someone to think twice about being in possession of such horrendous material.
No.

The slippery slope commences. Child abuse is such an easy justification for putting the infrastructure for censorship in place.
Then comes the clamor for "hate speech". Then Thought Crime.

No. Child abuse is illegal and only a very few people are involved. The wider privacy issue is far more important.

Just read 1984 (the Stephen Fry audiobook is best) to see the end result which we can already see happening.
 

Glenny2lappies

macrumors 6502a
Sep 29, 2006
574
367
Brighton, UK
Is there a deep hidden meaning to this? Basically, is Apple saying, in not so many words, that if they were to create software to scan for CSAM in iCloud, it could fall into the hands of data thieves who would exploit it, and that to prevent such a thing from happening they have no intention of creating the software in the first place? Much like the argument Apple uses for not building security backdoors into iPhones: they worry that such a thing would fall into the hands of criminals who would exploit it, so it is better that no such thing exists.
It would be the baying crowd demanding that, for example, climate deniers, anti-vaxxers, non-mask-wearers... all of them should be prevented from discussing anything deemed Thought Crime as defined by extremists...

Don't start down that road and weaken privacy!
 

centauratlas

macrumors 68000
Jan 29, 2003
1,825
3,772
Florida
Is there a deep hidden meaning to this? Basically, is Apple saying, in not so many words, that if they were to create software to scan for CSAM in iCloud, it could fall into the hands of data thieves who would exploit it, and that to prevent such a thing from happening they have no intention of creating the software in the first place? Much like the argument Apple uses for not building security backdoors into iPhones: they worry that such a thing would fall into the hands of criminals who would exploit it, so it is better that no such thing exists.

Not just data thieves but authoritarians of all stripes - fascist, socialist, communist, monarchist etc - would use it to scan for unacceptable speech and dissent. Then they'd use it to shut down dissent and free speech. Just ask the panda.

See 1984 and Brave New World. They treat them like instructions, not warnings.
 

laptech

macrumors 68040
Apr 26, 2013
3,607
4,007
Earth
Apple just need to pray that criminals involved in child abuse and exploitation do not use iCloud for their ill-gotten gains. If the police catch such criminals and find images in Apple's iCloud, the crap will hit the fan, because Apple are saying their current systems for finding child abuse media on iCloud are already robust and that they therefore do not need to build CSAM detection into iCloud. If the police were to find images, it would show that Apple's stance on the issue is baseless and cause a huge backlash against Apple, because many would then say that if Apple had implemented CSAM detection like they were asked to, the images the police found would not have been there.
 

Unggoy Murderer

macrumors 65816
Jan 28, 2011
1,155
4,017
Edinburgh, UK
No.

The slippery slope commences. Child abuse is such an easy justification for putting the infrastructure for censorship in place.
Then comes the clamor for "hate speech". Then Thought Crime.

No. Child abuse is illegal and only a very few people are involved. The wider privacy issue is far more important.

Just read 1984 (the Stephen Fry audiobook is best) to see the end result which we can already see happening.
I didn't say censor the material, just notify the user that the material is illegal, as defined in practically every country on Earth. It would be quite a leap to go from that to "that screenshot contains a racist joke, you should feel terrible".

Not everything created with good intentions is destined to be weaponised. That same logic would see us banning knives because they can be used to harm people.
 

nt5672

macrumors 68040
Jun 30, 2007
3,386
7,230
Midwest USA
That they didn't see the dangers of this in the first place is an absolutely baffling show of how dirty Apple would play if allowed.
It is not just Apple; most major companies are run by people who just go along with the current group psychological expectation. People in the trenches can often see only the limited view of their small group, and the managers see the same view. So it happens unless someone steps up, which in a company like Apple is career-ending.

That is why we as consumers have to be super vocal about these stupid decisions.
 

jimbobb24

macrumors 68040
Jun 6, 2005
3,358
5,387
Apple just need to pray that criminals involved in child abuse and exploitation do not use iCloud for their ill-gotten gains. If the police catch such criminals and find images in Apple's iCloud, the crap will hit the fan, because Apple are saying their current systems for finding child abuse media on iCloud are already robust and that they therefore do not need to build CSAM detection into iCloud. If the police were to find images, it would show that Apple's stance on the issue is baseless and cause a huge backlash against Apple, because many would then say that if Apple had implemented CSAM detection like they were asked to, the images the police found would not have been there.
If I rent a locker and store porn in it, is the locker owner liable? Why would Apple be liable in any way? This just isn't the case. It's not a worry at all.
 

hagar

macrumors 68000
Jan 19, 2008
1,999
5,042
Not just data thieves but authoritarians of all stripes - fascist, socialist, communist, monarchist etc - would use it to scan for unacceptable speech and dissent. Then they'd use it to shut down dissent and free speech. Just ask the panda.

See 1984 and Brave New World. They treat them like instructions, not warnings.
The feature is only as good as its (privacy-focused) implementation. There is zero reason to assume authoritarians would be able to gain unlimited access to the tool to scan whatever they want without involving Apple.

But if more and more regimes come up with baffling demands (like the U.K.'s), then it's the right call to never release the tool.
 

Unggoy Murderer

macrumors 65816
Jan 28, 2011
1,155
4,017
Edinburgh, UK
No. Child abuse is illegal and only a very few people are involved.
Unfortunately, a lot more people are involved than you would think. You'll find reports from people who have accessed the "dark web" describing thousands of communities with thousands of members, all circulating all kinds of awful material.

Worth a watch if you want to erode your remaining faith in humanity:
 

hagar

macrumors 68000
Jan 19, 2008
1,999
5,042
That’s a decent explanation, but I’m baffled that they didn’t think of these things before announcing the feature originally.
Obviously they knew this perfectly well beforehand, but they didn't expect such a backlash.

And now they realise it's far easier (and more important) to promote privacy in their ecosystem than to protect children.
 