
Naraxus

macrumors 68020
Oct 13, 2016
2,105
8,545
******** it is. Apple has abused so much trust over the years, and this is right up there with the rest of their lies & broken promises. Apple still needs to have its feet held to the fire over this, lest they try to slip it in quietly.

Trust, but verify - Ronald Reagan
 
  • Like
  • Haha
Reactions: nt5672 and SFjohn

symphony

macrumors 68020
Aug 25, 2016
2,204
2,590
Apple with the 5D chess.

Do something controversial. Then later come out with something that fixes it and brings even better changes.

Obviously kidding.
 

nottorp

macrumors 6502
May 12, 2014
430
509
Romania
It's good that they at least caved in to pressure. It's bad that they even considered doing this.

Now if they'd update the included iCloud space... it might actually become useful. I mean, even the first paid tier ain't good for anything. You can't fully back up a 64 GB iPhone.
 

MacProFCP

Contributor
Jun 14, 2007
1,222
2,952
Michigan
While there may be great benefits from CSAM implementation, making this technology available would certainly allow nefarious state actors to require Apple to implement "similar measures" to suppress dissent.

Furthermore, if Apple were to tag an iCloud account with CSAM, I imagine it would be fairly straightforward to obtain a search warrant for the entire account. For example, a parent's pictures of children in a bubble bath could open the door for their entire iCloud to be shared with law enforcement. While that example would prove fruitless, it nonetheless harms the personal privacy that everyone is entitled to.

I agree with Apple's assertion (or capitulation) that the negatives outweigh the positives.
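Worth noting on the bubble-bath example: as Apple described it, the system matched perceptual hashes of photos against a database of *known* CSAM, with human review only after a match threshold, so a novel family photo wouldn't match anything in the database. A minimal sketch of that threshold-matching idea, where the hash function, the hash values, and all names are illustrative assumptions rather than Apple's actual API:

```python
# Hypothetical sketch of threshold-based known-hash matching, loosely modeled
# on Apple's published CSAM-detection design (NeuralHash + known-hash database).
# The toy hash, the hash values, and the threshold handling are assumptions.

KNOWN_HASHES = {"a3f1", "9b2c", "77e0"}  # stand-in for the known-CSAM hash database
MATCH_THRESHOLD = 30                     # matches required before human review

def perceptual_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash like NeuralHash: visually similar
    images are meant to collide, unrelated images are not."""
    return format(sum(image_bytes) % 65536, "04x")

def flagged_for_review(photo_hashes: list) -> bool:
    """An account is surfaced for review only once the count of matches
    against the known database crosses the threshold."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

# A thousand novel photos that hash to values outside the database never match:
print(flagged_for_review(["ffff"] * 1000))  # → False
# Thirty matches against known hashes crosses the threshold:
print(flagged_for_review(["a3f1"] * 30))    # → True
```

The warrant concern above is about what happens *after* a flag, which this mechanism doesn't address; the sketch only shows why unmatched novel photos don't trigger one.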
 

IIGS User

macrumors 65816
Feb 24, 2019
1,101
3,084
I'll let the cat say it.

grumpy.jpg
 

mectojic

macrumors 65816
Dec 27, 2020
1,232
2,376
Sydney, Australia
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
After messing with iCloud for years, getting frustrated with failed syncs and slow uploads/downloads, I returned to a home server solution. Now I have peace of mind. I think iCloud is the most trapping part of the walled ecosystem.
 

t0rqx

macrumors 68000
Nov 27, 2021
1,617
3,785
So Apple was planning to use the CPU of the iPhone I bought? Will they pay me for that usage? Combing through my personal data, using the iPhone I own for their own benefit, and I have to pay a premium for this?

I bought my iPhone to use my own way, not to have the CPU constantly tasked at 20% with Apple's work.

I tell you now, the basis of this isn't abandoned; it will be tucked into a new package and marketed a different way. In the meantime the backdoors are still open, like when everyone could accidentally access each other's iCloud Photo Library after it was introduced with iOS 16. Coincidence?
 
  • Like
Reactions: bobcomer

NightFox

macrumors 68040
May 10, 2005
3,243
4,502
Shropshire, UK
Hello everyone-

They said they weren’t going to scan iCloud Photos. They didn’t say that they won’t scan your phone

If Apple wanted to scan your phone surreptitiously, they could have been doing that for the last 15 years.

I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
 

Analog Kid

macrumors G3
Mar 4, 2003
8,983
11,734
I've had mixed feelings about this from the beginning, but I wonder if this will become a careful-what-you-wish-for moment... Now Apple won't handle it in a private, focused way on your device, which cracks the door open to governments demanding a way to do it themselves with blunter instruments.

It's possible this was another AirTag-stalking debacle, where Apple proactively tried to solve a problem their technology might cause (AirTags enabling stalking, end-to-end encryption letting predators operate with impunity), but I suspect this was a proactive way of removing a strawman argument for government demands for greater access. They held the line against the but-terrorism! argument; maybe they can hold it here.

Time will tell.

Hopefully this will at least stop people from making the argument that "Apple wants to put CSAM on our phones", which was always weird.
 
  • Like
Reactions: SFjohn

huge_apple_fangirl

macrumors 6502a
Aug 1, 2019
757
1,282
So excited to hear this. I’ve been sticking with my iPhone XS and iOS 14 ever since that ill-fated policy was announced. Glad that I can finally move on! Couldn’t have come at a better time for me honestly, I just dropped my iPhone and I think the LCD display connector broke, the screen is completely black but it still rings when called…
 
Last edited:
  • Like
Reactions: bobcomer

Chazak

macrumors 6502
Aug 15, 2022
465
703
Good news for all the pedophiles out there.
Bad joke, if it is a joke. If it isn't, it would seem to reflect a lack of understanding of the tools law enforcement already uses to find and prosecute those possessing and trading in underage porn.

I just don't understand people who think irreverent remarks and "jokes" about child porn and pedophilia have value as humor. None of it, whether you agree with what Apple has done or not, is funny in any way, regardless of how you spin it.
 
  • Like
Reactions: murrayp and SFjohn

Chazak

macrumors 6502
Aug 15, 2022
465
703
If Apple wanted to scan your phone surreptitiously, they could have been doing that for the last 15 years.

I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
Your voice reflects sanity, but didn't you know everything tech does is a giant conspiracy? Just ask a fair number of the posters here!
 
  • Like
  • Haha
Reactions: NightFox and SFjohn

4nNtt

macrumors 6502a
Apr 13, 2007
917
716
Chicago, IL
I think this is great. I'm more concerned about the privacy issues than about CSAM scanning of a private/personal photo repository. I'd rather see CSAM removal targeted at public places, where it can cause more harm.
 

4nNtt

macrumors 6502a
Apr 13, 2007
917
716
Chicago, IL
I wouldn't be so sure they're abandoning it... Probably figured out another way to mass monitor data.
Apple abandoned CSAM scanning of iCloud data; Craig Federighi said as much. Instead they will focus on providing parents with tools to identify such photos on their children's devices, but nothing will be reported back to Apple or the authorities.

I'm no expert, but the need to scan for CSAM among private files has always seemed overblown to me and it has backfired on completely innocent Google Photos users in a number of publicized cases.
 
  • Like
Reactions: SFjohn