Are we still on that? Apple's CSAM detection was deliberately designed with privacy in mind. They didn't do server-side scanning for exactly that reason: privacy. All other photo analysis, like face and object recognition, is likewise done client-side on the iPhone itself, while companies like Google do it server-side to collect as much data as possible.

Apple is totally disingenuous on CSAM. Instead of scanning iCloud Photos on private devices, they could more ethically, and just as effectively, run CSAM scanning against the iCloud Photos stored on their own servers. They could legitimately claim they want to police their own servers for illegal content, and it would have the same effect, without the creepy, sneaky scheme of monitoring users' personal devices.
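To make the server-side alternative concrete, here's a minimal sketch of what matching stored uploads against a known-image hash list could look like. Everything here is an illustrative assumption, not Apple's actual pipeline: the hash set, the function names, and the use of SHA-256 are placeholders. Real systems match perceptual hashes (e.g., PhotoDNA) so re-encoded or resized copies still hit.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for a hash database such as NCMEC's; real lists
# hold perceptual hashes (e.g., PhotoDNA), not cryptographic digests.
KNOWN_BAD_HASHES: set[str] = {
    "0" * 64,  # placeholder entry
}

def digest(path: Path) -> str:
    # SHA-256 is only a stand-in here; a perceptual hash would also
    # catch resized or re-encoded copies of a known image.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_stored_object(path: Path) -> bool:
    # Runs where the file already lives: on the provider's own servers,
    # not on the user's device.
    return digest(path) in KNOWN_BAD_HASHES
```

The point of the sketch is where the check runs, not how the hash works: the provider scans its own storage, and nothing executes on the customer's hardware.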
A less ideal solution: Apple could also use CSAM hash data to block known child pornography from displaying at all, on any operating system across their platforms. This is similar to Adobe's counterfeit-deterrence controls, where scans of banknotes won't open correctly, a check that various manufacturers also build into scanner and printer hardware. While this scheme could likewise be exploited by suspect governments and entities, it doesn't turn Apple into the same kind of intrusive, creepy content police on users' personal devices, nor carry the same dire ramifications of the all-but-certain abuse of a CSAM scanning system.
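A sketch of that second idea, again with an assumed hash list and SHA-256 standing in for a perceptual hash. The decode hook is hypothetical, not a real OS API: on a match the renderer simply returns nothing, the way Photoshop declines to open banknote scans, and no report is generated anywhere.

```python
import hashlib

# Same illustrative hash set as above; an assumption, not a real API.
KNOWN_BAD_HASHES: set[str] = {"0" * 64}

def decode_for_display(image_bytes: bytes) -> bytes | None:
    # Hypothetical hook in the OS image-decode path: on a match, return
    # no pixels instead of rendering. Nothing leaves the device and
    # nobody is notified; the image just won't display.
    if hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES:
        return None
    return image_bytes  # hand off to the real decoder unchanged
```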
I assume Apple is smart enough to know there were several better ways to tackle this problem if they must, but they deliberately chose the absolute worst option.
I wonder why?
Lack of vision? Sad and depressing.
Stupidity thru ego? Kinda where I'm leaning.
Compromised? Paging Alex Jones to the conspiracy desk.
The thing that backfired is that client-side face scanning sounds good and privacy-friendly, while client-side CSAM scanning sounds creepy. But whichever way you look at it: you either trust the company or you don't.