
hot-gril

macrumors 68000
Jul 11, 2020
1,924
1,966
Northern California, USA
Slippery slope. At first I didn't care because I'm not a kid, but later on they'll come after you too.
My iPhone is already notifying me incessantly that my volume has been turned up all the way for too long, telling me to turn it down, and says there's no way to disable these notifications. Excuse me, but does the phone own me or the other way around? It's not even damaging my hearing, since it's plugged into aux.
 

hot-gril

macrumors 68000
Jul 11, 2020
1,924
1,966
Northern California, USA
Children grow and change so fast during those ages. A 10-year-old's maturity (mental and physical) is very different from a 13-year-old's. VERY different, like totally different people as they move through those ages. Apple is basing these on proven maturity standards established by society. I see nothing wrong with the age brackets chosen.
Legal age is 18 regardless of how "developed" the person is. They probably chose 13 to avoid the false positives that would come from teenagers sending nudes, as a few of them do.
 

Allyance

Contributor
Sep 29, 2017
2,038
7,530
East Bay, CA
I remember when my 11-year-old son got to go to France for the summer back in the seventies. The family he was staying with had a sailboat, and the mother was topless all the time. No problem. We got a postcard from him saying they had gone to get ice cream cones and then went to a newd (his spelling) beach. Nothing more said. Americans got all bent out of shape when a bare breast made it into an NFL halftime show for a few seconds!

I support this feature because some teenager could send a picture and end up labeled a sex offender for the rest of their life. Kids are easily pressured into doing stupid things by peers or bullies. If you are the parent of a teen, or were at one time, you know what I'm talking about.
 
Last edited:

Piggie

macrumors G3
Feb 23, 2010
9,128
4,033
Genuine question.
So to stop sick people from collecting and sending kiddie porn images, they have collected a massive number of such photos they know are circulating and created hash codes so they can detect whether the hash matches (i.e., you have the photo), at which point some action is taken.

Now, let's say I have such a sick image. I load it into Photoshop, flip it horizontally, perhaps crop it a tiny amount or add a tiny border, adjust the brightness/contrast/sharpness, and perhaps add a little flower in the corner.

Is that now a totally different photo that can no longer be detected?
Just how robust are these hashes? It would be easy to change every single pixel in an image while it still looks exactly the same to a human.
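
For context on the question above: Apple describes its system as a *perceptual* hash (NeuralHash), derived from image content rather than raw bytes, so it is designed to tolerate exactly these kinds of small edits, unlike a cryptographic hash, which changes completely on any edit. Here's a toy sketch of the general idea using a simple "average hash" (this is NOT Apple's actual algorithm, just an illustration of why a brightness tweak doesn't produce a new hash):

```python
import hashlib

def average_hash(pixels):
    """pixels: flat list of grayscale values (e.g. a tiny downscaled image).
    Returns a bit string: '1' where the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

def hamming(a, b):
    """Number of differing bits: the 'distance' between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "image" as a flat list of grayscale values (0-255).
img = [10, 200, 30, 220,
       40, 210, 50, 230,
       60, 190, 70, 240,
       80, 180, 90, 250]

h1 = average_hash(img)

# Brightening every pixel uniformly shifts the mean by the same amount,
# so every pixel/mean comparison, and hence the hash, is unchanged.
brighter = [p + 20 for p in img]
h2 = average_hash(brighter)
print(hamming(h1, h2))  # 0 -- the edited image still "matches"

# A cryptographic hash of the same data changes completely instead.
print(hashlib.sha256(str(img).encode()).hexdigest() ==
      hashlib.sha256(str(brighter).encode()).hexdigest())  # False
```

Real perceptual hashes are much more sophisticated than this, and matching is usually done by distance threshold rather than exact equality, but the horizontal-flip case in the question is a fair one: a naive hash like this one would indeed change under mirroring unless the system also hashes flipped variants or is trained to be flip-invariant.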
 

orthorim

Suspended
Feb 27, 2008
733
350
Bravo Apple. A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how. This isn't Apple wanting to go through your private photos or intercept your private messages. This is a seemingly well-implemented use of on-board machine learning and algorithmic pattern recognition, done in a very anonymous way, to link known questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply on-device hash recognition that triggers a warning within parental features for children's accounts, that's all!

This comment goes to show that most humans are unbelievably naive and gullible.

Now China can chime in, and Apple will "protect" Chinese users from dangerous misinformation. For your safety!

Easy, the tech is already there! One request, boom. Can Apple afford to say "no, we can't do that"? No, Apple can't afford that. Apple has a history of caving to China, or to any reasonably sized market.
 