Okay, so I am confused. iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc. How is finding missing and exploited children via the same technology a bad thing? Because it alerts a non-profit NGO? It's already established that it will not flag naked pics of your one-week-old during their first bath time, or the suggestive images you send your sexual partners. The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" means any image that is already in the CSAM database and is found to be an IDENTICAL match in your iCloud library.

It isn't the same implementation. The object/person detection uses the on-device neural engine, and those identifications never leave the device. The CSAM system is different: there is a threshold, which Apple is not disclosing, that will trigger an upload of your private photos once their criteria are met.
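To make the distinction concrete, here is a toy sketch of threshold-based matching against a known-hash database. Everything in it is an illustrative assumption: the threshold value is made up (Apple has not disclosed the real one), and plain SHA-256 stands in for Apple's perceptual NeuralHash and private set intersection protocol, which do not work like an exact digest comparison.

```python
import hashlib

# Hypothetical threshold -- the real value is undisclosed.
MATCH_THRESHOLD = 3

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: here, an exact cryptographic digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(library: list[bytes], known_hashes: set[str]) -> int:
    """Count how many images in the library match the known-hash database."""
    return sum(1 for img in library if image_hash(img) in known_hashes)

def should_flag(library: list[bytes], known_hashes: set[str]) -> bool:
    """Flag an account only once matches reach the (undisclosed) threshold."""
    return count_matches(library, known_hashes) >= MATCH_THRESHOLD
```

The point of the threshold design is that nothing below it triggers any report; the dispute in this thread is over what goes into `known_hashes` and who audits it, not over the matching mechanics.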
Moreover, this hash database is unverified and unaudited. The US-funded NCMEC provides these hashes, and Apple simply has to assume that the database it's pushing down to every iOS device genuinely represents only CSAM material. What happens when unscrupulous politicians (from the US or its allies) start leveraging NCMEC to slip hashes of "undesirable" political content into that database? Legal political dissent will be flagged as "CSAM", and that "evidence" will be used to falsely accuse, and likely arrest, political dissidents and brand them as peddlers of CSAM. That reputational damage will be difficult to undo against the power of large nation-states determined to take down political opponents, let alone overcoming being framed with false evidence. Frankly, I'm shocked Apple hasn't considered that it will be complicit every time this happens, having enabled and pushed this technology on everyone because it was too weak-willed to say "no".
This is an authoritarian's dream just waiting to be unleashed. Politicians will be champing at the bit to push their own content into those hash databases once this is rolled out. It's only a matter of time before a major world power like the US or China passes a law to make it happen -- if it doesn't already happen in secret, without your knowledge, while falsely accused possessors of "CSAM" are spirited away in the middle of the night to black sites.