
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,770
31,228


Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF).


In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future.



Snowden also noted that Apple has historically been an industry leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security."
Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor...

It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.
The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and iCloud Photos could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement," the EFF cautioned. See the EFF's full article for more information.
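To make the EFF's point concrete: the matching logic itself has no idea what the hash database contains. Below is a minimal, purely illustrative sketch of the kind of client-side matching the article describes; it is not Apple's NeuralHash/CSAM pipeline, and every name, hash, and threshold in it is invented.

```python
# Illustrative sketch only -- not Apple's actual on-device CSAM system.
# It shows the general shape of client-side matching: hash each photo,
# compare against an opaque database of known hashes, and flag the library
# once matches cross a threshold. Nothing in this logic constrains what
# kind of content the database targets.

import hashlib
from pathlib import Path

# Hypothetical database of opaque hashes shipped to the device.
KNOWN_HASHES: set[str] = {"placeholder-hash-1", "placeholder-hash-2"}

# Hypothetical threshold; Apple has not disclosed its real one.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Stand-in for a perceptual hash. A real system would use a hash that
    survives resizing and re-encoding, not a plain SHA-256 of the bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def library_crosses_threshold(photo_dir: Path) -> bool:
    """Return True if enough photos match the database to trigger a report."""
    matches = sum(1 for p in photo_dir.glob("*.jpg") if image_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

Swapping in a different set of hashes, or pointing the scan at a different set of accounts, would not require changing a line of this logic, which is exactly the expansion the EFF is warning about.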

The condemnations join a growing number of concerns raised by security researchers and users on social media since Apple announced the changes yesterday, which have already triggered petitions urging Apple to roll back its plans and affirm its commitment to privacy.

Article Link: Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images
 

mazz0

macrumors 68040
Mar 23, 2011
3,140
3,584
Leeds, UK
Gotta say, I agree with this. I think the slippery slope argument is valid here. In the US and Europe they might just use this for child porn (for now), but once the principle is established it becomes much harder for them to tell the government in China that they can't look for anti-CCP images, for example, and so on.
 

sunapple

macrumors 68030
Jul 16, 2013
2,749
5,133
The Netherlands
The thing is, from a PR point of view Apple knew exactly what the response would be to these features. Risky move on their part for a lot of reasons. They either really believe in the cause (plausible) or want to prepare us for future steps that go even further (worst-case fear). I'm not usually such a skeptic, but... I have a lot of questions.
 

thingstoponder

macrumors 6502a
Oct 23, 2014
914
1,100
We’re all criminals now. Wake up.
You sound a bit out there. Snowden is an enemy of the government because he is a traitor. Any government in human history would feel the same way. Russia is harboring him to act like they’re better than us, not because they agree with him. If he did to them what he did to the US then he’d simply be poisoned to death.

But no, he’s not going to be written out of history. You need to wake up and get a grip.
 

jntdroid

macrumors 6502a
Oct 12, 2011
935
1,276
The other article said the analysis all happens on device, not in the cloud. So are they really creating a back door?

That said, this line concerns me from the first article:

“Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.”

I’m all for stopping child porn, predators, sex trafficking, etc. (and regular porn for that matter, but that’s a rabbit trail for another discussion). But this feels like an overreach. I just can’t imagine there won’t be some false positives along the way, and this will ruin those people's lives.
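For a rough sense of how much a match threshold matters for false positives, here is a back-of-the-envelope sketch; the per-image error rate and the threshold are made-up numbers, since Apple has published neither.

```python
# Toy calculation of the chance an account is falsely flagged, assuming each
# photo independently false-matches with some small probability. Both numbers
# below are invented for illustration.

from math import comb


def prob_false_flag(n_photos: int, per_image_fp: float, threshold: int) -> float:
    """P(at least `threshold` false matches) among n_photos photos, treating
    each photo as an independent trial with probability per_image_fp."""
    below_threshold = sum(
        comb(n_photos, k) * per_image_fp**k * (1 - per_image_fp) ** (n_photos - k)
        for k in range(threshold)
    )
    return 1.0 - below_threshold


# Hypothetical: 10,000 photos, a 1-in-10,000 per-image error rate, and a
# threshold of 10 matches before any human review happens.
print(prob_false_flag(10_000, 1e-4, 10))  # roughly 1e-7
```

Even a tiny per-account probability multiplied across hundreds of millions of iCloud accounts is not zero, so the worry about the people who do end up falsely flagged still stands.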
 

applesith

macrumors 68030
Jun 11, 2007
2,781
1,578
Manhattan
All for stopping harm to children, but I think it’s a front. There’s no way this could possibly be abused and expanded to watch anyone who doesn’t agree with the government or a certain set of political views! No way at all is this the next step towards complete censorship. At least Apple won’t have to adjust this for deployment in China because it aligns with the oppressive commies. Big tech and the obsession with controlling people’s thoughts will be the downfall of humanity as we know it.
 

IIGS User

macrumors 65816
Feb 24, 2019
1,101
3,084
What’s all over? Your crimes?

What you say is a crime today is someone else's fight for freedom.

No one, I say again, NO ONE is in favor of seeing children exploited, abused, or harmed. At least no one here, I would hope. But that is not the point.

The point is, this is indeed a slippery slope. Much akin to Apple being asked to unlock a phone for law enforcement via a "back door", which, at present, it is my understanding they won't do because no such back door exists.

Once the mechanism exists, once the door is installed, or the code made part of the basic building blocks of how the machine operates, it's no longer a question of not being able to do it, but when it will be done. At that point, it's incumbent upon the gatekeepers to decide what is and isn't permitted, or acceptable, or legal.

These are decisions made by human beings. Just as humans are capable of horrible evil acts (like exploitation of children) for their own personal reasons, they can be capable of such evil on a political scale.

Today, child exploitation. Tomorrow, being LGBTQ or pro-democracy somewhere Apple does business. Apple has already proven they will bow to the whims of foreign governments who threaten to cut off their business (and revenue stream).

When countries like China are jailing dissidents for expressing pro democracy viewpoints (see footnote link), one can only question how long it is before this sort of invasiveness is unleashed for nefarious reasons.

This is scary stuff. Apple is wrong on this. One hundred percent wrong. People (good people, with liberal with a small "l" ideals) will suffer and die because of this. I have no doubt.

They say it could never happen here. Wherever "here" is. Well, it can and probably will happen wherever you are. This is one more big step towards a high-tech dystopia.

 

LiE_

macrumors 68000
Mar 23, 2013
1,690
5,319
UK
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.
 

Sasparilla

macrumors 68000
Jul 6, 2012
1,965
3,384
Agree here. This opens the door for Apple to have to comply with broad government requests to scan people's messages (if backed up to iCloud) and photos for things going forward. Not sure what Apple is doing here.

Just need the right leaders to exploit this. Want to know who was at the rallies you don't like, or who didn't say good things about the current leader? Piece of cake. The current U.S. president wouldn't do it, for example, but it's easy to imagine one who would want this information and punish non-supporters.
 

ddtmm

macrumors regular
Jul 12, 2010
226
774
An irreversible move. The sad thing at this point is that governments around the world now know Apple has developed tech for mass surveillance with pinpoint accuracy. Even if they cancel their plans now they can be forced to use it at some point covertly. If it can be used to scan for CSAM it can be used to scan for anything specific the government wants. Sigh
 

dlondon

macrumors 6502
Sep 6, 2013
412
326
Like many have said, it's where this leads that is the issue. I feel like they could be starting with child pornography, which is something few could say is acceptable, as a way to start scanning content on our phones for other purposes – such as chat messages and files. Of course, they could be doing this remotely already, but we have a level of trust in Apple.

I can also see an issue where someone could spam an inappropriate image to you which then gets uploaded to the Cloud and starts a whole chain of events. Sure, you could probably prove that you were a victim but only after your life is turned upside down.
 

Radeon85

macrumors 65816
Mar 16, 2012
1,025
1,897
South Wales, UK
This is a very slippery slope, and Apple should not have a backdoor to scan any of my photos. I have nothing to hide, but I still don't like it.

I thought Apple liked privacy; I guess not. If Apple has the means to identify child abuse images on a user's device, then that means they have the ability to track the images back to the original device/account. This is one hell of a big backdoor that can very easily be abused by governments if they force Apple to give them the ability to use it.
 