
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,770
31,228


Apple this week announced that, starting later this year with iOS 15 and iPadOS 15, the company will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children, a non-profit organization that works in collaboration with law enforcement agencies across the United States.
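As a rough illustration of the matching step described above, here is a minimal Swift sketch of how a photo could be checked against a database of known-image hashes before upload. The names here (`neuralHash(for:)`, `SafetyVoucher`, the hash set) are hypothetical stand-ins for illustration only, not Apple's actual API or implementation.

```swift
import Foundation

// Conceptual sketch only: `neuralHash(for:)`, the hash set, and
// `SafetyVoucher` are hypothetical stand-ins, not Apple's real API.
struct SafetyVoucher {
    let assetID: String
    let matched: Bool   // whether this photo's hash matched a known-CSAM hash
}

// Placeholder for a perceptual hash; a real system maps visually similar
// images to the same hash value rather than using a plain content hash.
func neuralHash(for imageData: Data) -> String {
    String(imageData.hashValue, radix: 16)
}

// Each photo uploaded to iCloud Photos would be checked against the
// on-device copy of the known-image hash database.
func makeVoucher(assetID: String,
                 imageData: Data,
                 knownHashes: Set<String>) -> SafetyVoucher {
    let hash = neuralHash(for: imageData)
    return SafetyVoucher(assetID: assetID, matched: knownHashes.contains(hash))
}
```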

[Image: Apple CSAM detection flow chart]

The plans have sparked concerns among some security researchers and other parties that Apple could eventually be forced by governments to add non-CSAM images to the hash list for nefarious purposes, such as to suppress political activism.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

To address these concerns, Apple provided additional commentary about its plans today.

Apple's known CSAM detection system will be limited to the United States at launch, and to address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation. Apple did not provide a timeframe for global expansion of the system, if such a move ever happens.

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system's first layer of protection is an undisclosed threshold of matches before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier, confirming that no known CSAM imagery is actually present. In that case, Apple said it would not report the flagged user to NCMEC or law enforcement agencies, and that the system would still be working exactly as designed.
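To make those two layers concrete, here is a small hedged sketch of the decision flow the article describes. The threshold value and function names are hypothetical placeholders; Apple has not disclosed the actual number.

```swift
// Hypothetical placeholder; Apple has not disclosed the real threshold.
let matchThreshold = 30

// Layer 1: nothing is surfaced for review until an account's count of
// matched images exceeds the undisclosed threshold.
func exceedsThreshold(matchedImageCount: Int) -> Bool {
    matchedImageCount > matchThreshold
}

// Layer 2: a human reviewer must confirm the matched images really are known
// CSAM. If they are not (e.g. a corrupted hash list added non-CSAM images),
// no report is made to NCMEC or law enforcement.
func shouldReport(matchedImageCount: Int, reviewerConfirmedCSAM: Bool) -> Bool {
    exceedsThreshold(matchedImageCount: matchedImageCount) && reviewerConfirmedCSAM
}
```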

Apple also highlighted proponents of the system, some of whom have praised the company for its efforts to fight child abuse.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Apple did admit that there is no silver-bullet answer when it comes to the potential for the system to be abused, but the company said it is committed to using the system solely for detecting known CSAM imagery.

Article Link: Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis
 

budafied

macrumors regular
Jun 22, 2008
110
173
Apple's known CSAM detection system will be limited to the United States at launch, and to address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation.
Oh, idk. I thought the US government was pretty ****ing dishonest when it comes to privacy. How did that get approved in the first place?

**** Apple for doing this.
 

dmx

macrumors 6502a
Oct 25, 2008
731
1,507
This system is ripe for abuse and privacy creep over time.

Anyone it would catch will just turn off iCloud Photos anyway, defeating the purpose.

Apple should admit that they made a mistake and cancel the rollout.
 

KaliYoni

macrumors 68000
Feb 19, 2016
1,729
3,808

transpo1

macrumors 6502a
Jul 15, 2010
997
1,651
This is a horrendous idea with so many ways this tech could go wrong.

Limiting it to the U.S. is not a solution and it’s obtuse of Apple to think so. Apple needs to stop now. Get rid of the feature, both the iCloud and Messages versions. No one wants this.
 

Mebsat

macrumors regular
May 19, 2003
215
367
Florida
Private API leaks.
A malicious app is uploaded to the App Store using Apple's private API for this feature.
It gets approved because Apple does not reliably scan for use of private APIs; its review is reactive, not proactive.
Hilarity ensues.

(Even Uber was busted after the fact for using private APIs; they don't even scan the top 50 apps.)
 

H. Flower

macrumors 6502a
Jul 23, 2008
739
822
Private API leaks.
A malicious app is uploaded to the App Store using Apple's private API for this feature.
It gets approved because Apple does not reliably scan for use of private APIs; its review is reactive, not proactive.
Hilarity ensues.

(Even Uber was busted after the fact for using private APIs; they don't even scan the top 50 apps.)

I'm so concerned about stuff like this -
 

MrDerby01

macrumors regular
Jun 2, 2010
235
289
This is what you get when EVERYONE in the US feels Apple is the best and only solution. I feel bad for the people who bought Apple products just because they wanted to belong, while everyone else was in the background waving red flags about its business practices and monopolistic behaviors. Too late now... Sad.
 

opiapr

macrumors regular
Mar 26, 2010
156
208
Lehigh Valley, PA
This is what you get when you put an accountant in charge of the 2nd largest company in the world.
the largest.

Apple is now the world's most valuable company, dethroning oil giant Saudi Aramco

“We’re conscious of the fact that these results stand in stark relief during a time of real economic adversity," Apple CEO Tim Cook said of the company's blowout quarterly earnings.
 

foobarbaz

macrumors 6502a
Nov 29, 2007
884
2,043
I really don’t understand the benefit of the system in the first place. It can’t catch the creators, because it compares against existing images.

So it only flags consumers who for some reason put their collection into the iCloud Photo Library (together with their vacation photos??). Is that really how many child porn collectors operate? Will it catch enough bad people to make up for the possible privacy invasion? I have doubts …

(Now if this was only about Shared Albums that could be abused for distribution, I’d totally understand!)
 

J___o___h___n

macrumors regular
Aug 29, 2017
203
564
I’ve nothing to hide, but this just doesn’t seem right to me.

I’m not updating any existing device to iOS 15 until this rollout is stopped. I don’t want my photos scanned and I don’t want it to happen to my children’s messages. I ensure my children are safe myself. There’s a level of trust involved, and these sorts of forced policies just don’t sit right with me.
 

aesc80

Cancelled
Mar 24, 2015
2,250
7,144
Ya know, I once wrote something here predicting that Apple would do something really creepy with iOS 15 that would have to be rolled out "under the covers". I predicted they'd use Bluetooth reporting to track suspected kidnappers / suspects for Amber Alerts via their devices. This one just seems a bit more insidious.

Kinda figured Apple would go down this route - righteous but potentially malicious.
 

ddtmm

macrumors regular
Jul 12, 2010
226
774
Their courage has landed them in hot water. I am very surprised they went forward with this. I can imagine they went through some lengthy internal discussions but somehow decided this was a good idea. This will not be good for Apple in the long run, and even worse for everyone in general.

Glad to see everyone being so vocal about it.
 
So you mean to tell me the U.S. government will have to decide when it comes to handling our privacy?

Technology and Government should not work together when it comes to dealing with PRIVACY.

I bet you there’s more to this story.

Someone needs to start a petition to put a STOP to this.



The message is clear: Do not store your stuff on iCloud. Make sure you order a 1TB iPhone this year. (You're going to need the mega storage.)
 