
hans1972

macrumors 68040
Apr 5, 2010
3,340
2,916
Yes, it is dangerous, because hashes can be made of any file - not just of child porn, but of pictures of demonstrations, audio files of speeches, memes, political manifestos, pictures of yellow and blue flags, etc. Apple gave the world an algorithmic blueprint that could be used by authoritarian regimes to detect all sorts of material while 'protecting privacy'. It was incredibly naive and negligent of Apple - the end product of letting engineers run amok without due consideration of the impact of their work.

Nothing Apple did was new. Everything they used was known technology, developed by others before them.


The frontrunners in this type of technology have been Google and Microsoft. In fact, Microsoft has donated a CSAM-detection solution (PhotoDNA) which is made available to other tech companies on a case-by-case basis.
 

H2SO4

macrumors 603
Nov 4, 2008
5,671
6,953
Just because people vote for tyranny does not transform it to something other than tyranny.
Just because people say something is tyranny does not transform a situation into something that is tyranny.
That doesn't actually answer the question I asked.

Hyperbole aside, in case you are confused:

tyranny | ˈtɪrəni |
noun (plural tyrannies) [mass noun]
cruel and oppressive government or rule: refugees fleeing tyranny and oppression.
• [count noun] a state under cruel and oppressive government.
• cruel, unreasonable, or arbitrary use of power or control: the tyranny of her stepmother | figurative: the tyranny of the nine-to-five day.
• (especially in ancient Greece) rule by one who has absolute power without legal right.
 

siddavis

macrumors 6502a
Feb 23, 2009
863
2,905
Just because people say something is tyranny does not transform a situation into something that is tyranny.
That doesn't actually answer the question I asked.

Hyperbole aside, in case you are confused:

tyranny | ˈtɪrəni |
noun (plural tyrannies) [mass noun]
cruel and oppressive government or rule: refugees fleeing tyranny and oppression.
• [count noun] a state under cruel and oppressive government.
• cruel, unreasonable, or arbitrary use of power or control: the tyranny of her stepmother | figurative: the tyranny of the nine-to-five day.
• (especially in ancient Greece) rule by one who has absolute power without legal right.
From the definition provided: "arbitrary use of power or control".
To answer your question: yes, it qualifies as centralized. The very concept of the EU is to centralize decision-making for the member states. Just my opinion, but this also qualifies as arbitrary.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
Disclaimer: I know nothing about how AI technology works, but I do know that AI can only do what it does if it's fed a whole lot of sample data. Given that, how exactly does one train an AI system to recognize this kind of material?
The hashes are the output of the training that a central organization (the one that has the images) provides to participating entities. And the training is done ONLY on known images acquired from prior investigations. Considering that the vast majority of images in the world are NOT the images they've been trained to detect, the chance of a false match is very low, and the chance of a large volume of false matches in the same user's library (which is what would be reported) is exceedingly low.
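To make "exceedingly low" concrete, here's a minimal sketch in Python of the general idea. This is NOT Apple's actual NeuralHash pipeline; the 64-bit hashes, distance cutoff, match threshold, and per-photo false-match rate below are all made-up illustrative values.

Code:
from math import comb

def hamming_distance(a: int, b: int) -> int:
    """Bits that differ between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def count_matches(library_hashes, known_hashes, max_distance=4):
    """How many photos in a user's library fall within max_distance
    bits of ANY hash derived from a known, previously investigated
    image. The matcher only ever sees hashes, never image content."""
    return sum(
        1 for h in library_hashes
        if any(hamming_distance(h, k) <= max_distance for k in known_hashes)
    )

def prob_at_least(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance that k or more of
    n unrelated photos match by coincidence, if each photo has an
    independent false-match probability p."""
    term = comb(n, k) * p**k * (1 - p) ** (n - k)  # P(X == k)
    total, i = term, k
    while i < n and term > total * 1e-17:  # later terms become negligible
        term *= (n - i) * p / ((i + 1) * (1 - p))
        total += term
        i += 1
    return total

# Even assuming a generous 1-in-a-million false-match rate per photo,
# the chance of 30+ coincidental matches in a 20,000-photo library:
print(prob_at_least(30, 20_000, 1e-6))  # ~4e-84

The small Hamming-distance tolerance is there because systems like this use perceptual hashes, designed so that resizing or re-encoding an image barely changes its hash, rather than cryptographic hashes, where any change flips the whole digest.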
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,422
Another possibility I have to question: for most people, differentiating 17-year-olds (minors) from 18-year-olds (adults) is not something that can be done visually.

Does this mean that, say, a video or image someone downloaded or viewed in good faith, thinking a willing adult was in it, could be on the CSAM list (which, rightfully, cannot be seen by anyone), and that person would only find out when police are knocking on their door?
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Apple was given an almost direct warning by US senators in a Senate hearing, where at least one senator threatened to pass laws requiring Apple and others to implement such features.
I would have preferred that Apple call their bluff, spend some campaign money against the re-election of Congress critters who favour surveillance, do nothing until the law was passed, and then challenge that law.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Another possibility I have to question: for most people, differentiating 17-year-olds (minors) from 18-year-olds (adults) is not something that can be done visually.

Does this mean that, say, a video or image someone downloaded or viewed in good faith, thinking a willing adult was in it, could be on the CSAM list (which, rightfully, cannot be seen by anyone), and that person would only find out when police are knocking on their door?
It might very well mean that, yes.

I am all for police prosecuting people who have child porn on their devices, but there should be evidence before the police, Apple, or anybody else goes rummaging through the private files on your phone. There should be a warrant to do so. After 9/11, Bin Laden was heard to say something to the effect of 'that's the end of civil rights in the West'. When the news reported this, I scoffed. And yet here we are.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Nothing Apple did was new. Everything they used was known technology, developed by others before them.


The frontrunners in this type of technology have been Google and Microsoft. In fact, Microsoft has donated a CSAM-detection solution (PhotoDNA) which is made available to other tech companies on a case-by-case basis.
Yes, but as I have noted above, Apple put these elements together to create a system that works locally on a user's mobile device, using AI-optimised chips, without users being able to opt out. It's like Amazon showing up at your house and installing a security camera inside without your approval. That is a first. Mobile-device AI-optimised chips will soon be able to recognise, report, and censor content at the whim of corporations and governments in real time. We will be the last generation able to stop this nightmarish march toward an industrial-governmental surveillance state.
 
Reactions: Schismz and dk001

Schismz

macrumors 6502
Sep 4, 2010
343
394
Sadly, I agree with your sig (":apple:: On such-and-such a date Apple will release iOS 15.X and you'll see why 2021 2022 will be exactly like '1984.' And then the EU will applaud."), but it's not that bad. I mean, we're willingly turning ourselves into sheeple by carrying around mobile phones we pay $1K+ for... we could just, ya know, stop doing that, until it becomes illegal not to carry a mobile phone 24 hours a day.

IDK, try it sometime. I have lately, and it's a strange feeling: I suddenly don't have this annoying vibrating thing where 101 people always seem to urgently want something from me, and I actually re-engage with my environment. OTOH, where am I? I need an app to tell me to walk 10 feet and turn left! Plus, I also need a prop to pretend to be absorbed in when I'm ignoring people.
 
Reactions: VulchR

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
Another possibility I have to question: for most people, differentiating 17 year olds (minors) from 18 years olds (adults) is not something that can be done visually.
Which is why it's good they're not doing this. From prior enforcement activity, they have a library of images. What's being deployed is a set of hashes to find JUST those images - NOT a machine-learning model that tries to discern an adult human from a non-adult human.

If there’s a single match in a library of thousands of images, that’s not enough to be flagged. If there’s hundreds of matches in a library of thousands of images, again remembering that they are ONLY looking for images that they already KNOW are illegal, that will cause that library to be examined by a human before any action is taken. If a human looks at images from prior cases, compares them to the ones in the library and sees that they’re not the same, end of line.

If they ARE the same, the obvious occurs.
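Sketched in Python, the flow described above would look something like this. The names and the threshold value are invented for illustration, not taken from Apple's actual implementation:

Code:
from enum import Enum, auto

class Outcome(Enum):
    NO_ACTION = auto()   # a handful of chance matches: nothing happens
    DISMISSED = auto()   # human review found no true matches ("end of line")
    REPORTED = auto()    # human review confirmed known illegal images

MATCH_THRESHOLD = 100  # invented stand-in for "hundreds of matches"

def review_library(match_count: int, reviewer_confirms: bool) -> Outcome:
    # Below the threshold, the library is never flagged and no human looks.
    if match_count < MATCH_THRESHOLD:
        return Outcome.NO_ACTION
    # At or above it, a human compares the flagged images against the
    # known images from prior cases before any action is taken.
    return Outcome.REPORTED if reviewer_confirms else Outcome.DISMISSED

The design point is that the threshold gates the human review, not the other way around: no reviewer ever sees anything from a library that hasn't already accumulated an implausible number of matches.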
 