
macfacts

macrumors 601
Oct 7, 2012
4,789
5,609
Cybertron
Bravo Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercepting your private messages. This is a seemingly well implemented utilisation of onboard machine learning and algorithmic pattern recognition, done in a very anonymous way, to link known questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply an on-device hash recognition that triggers a warning within parental features for children's accounts, that's all!
Slippery slope, at first I didn't care because I wasn't a kid and later on they will be after you.
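For what it's worth, the "on-device hash recognition" the quoted post describes can be sketched in a few lines. This is a toy average hash over an 8x8 grayscale thumbnail, purely to illustrate the match-against-known-hashes flow; the names here (`average_hash`, `matches_known`, the threshold) are illustrative assumptions, not Apple's actual implementation, which uses a learned perceptual hash.

```python
# Toy sketch of on-device hash matching (NOT Apple's real NeuralHash).
# A perceptual hash maps visually similar images to nearby bit strings,
# so a match can be checked locally without uploading the photo.

def average_hash(pixels):
    """pixels: 64 grayscale values (an 8x8 thumbnail). Returns a 64-bit int:
    one bit per pixel, set when the pixel is brighter than the average."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(img_hash, known_hashes, threshold=4):
    """Flag locally if the hash is near any known hash; nothing leaves
    the device in this sketch."""
    return any(hamming(img_hash, h) <= threshold for h in known_hashes)

# Demo: a slightly brightened copy of an image still matches.
original = [3 * i for i in range(64)]   # fake 8x8 thumbnail, values 0..189
tweaked = [p + 3 for p in original]     # uniformly brightened copy
known = {average_hash(original)}
print(matches_known(average_hash(tweaked), known))  # True: the bit pattern is unchanged
```

Because a uniform brightness shift moves every pixel and the average by the same amount, the bit pattern (and therefore the match) is unchanged, while an unrelated image lands far away in Hamming distance.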
 

4jasontv

Suspended
Jul 31, 2011
6,272
7,548
Fabulous. So my 13yo who, while monitored, has more freedom with her phone than a younger child, is going to get no integrated protection. Who the hell thought that was a good age for the cut off? :mad:
And I thought this was such a good idea.
Who thought it was a good idea to notify anyone but the parent? This would be an opt-in feature where pictures and messages could be accessed via another device listed as a parent. No one else. That solves the issue of teenagers and child predators in one swoop while maintaining privacy.
 

Krizoitz

macrumors 68000
Apr 26, 2003
1,743
2,097
Tokyo, Japan
Slippery slope, at first I didn't care because I wasn't a kid and later on they will be after you.
Slippery slope is a logical fallacy, you know that right?

EVERY decision is drawing a line somewhere. You can argue this particular line being drawn is too far, but simply stating that because it exists worse things will happen is not logically sound.
 
  • Disagree
Reactions: TakeshimaIslands

ipedro

macrumors 603
Nov 30, 2004
6,255
8,556
Toronto, ON
A lot of folks suddenly panicking today over Apple giving parents a tool to protect young children from predators.


The three guys who disliked this post:


Why don’t you have a seat, right over there…
 

ikir

macrumors 68020
Sep 26, 2007
2,137
2,289
No, people are freaking out that Apple is giving regimes the ability to scan their secure device for any content they deem unacceptable. This is about more than child porn, but people make it about child porn because that's how you manipulate people into giving up freedoms. Just look at the damage a certain online group that will not be named has done using that as a tool to manipulate the masses, and in that case it's completely made up BS.

In this particular instance, however, I don't see a problem with parents having the ability to scan things on their children's phones. That puts the control at a much lower level than a nation state, but it still feels kind of weird, like it could be manipulated for nefarious purposes. What I like about this is that you can combine it with other parental controls to prevent them from using other apps that can't be locked down like this. It makes me much more likely to get my daughter a phone at a younger age. Not sure what age yet, but she's currently 7, so we have a while to figure that out, lol.

Also not sure why I can't be notified if my 13-, 14-, 15-, 16-, or 17-year-old is receiving nudes. They're underage, and as the courts have ruled, any underage nude photo on your device can get you jail time, even if you are also underage, and especially if you're a boy. My son is only 5 now, but say he's 15 and his 15-year-old girlfriend flashes him in a photo. He could go to jail for that. That has actually happened. I suppose the blurred photo still prevents it from being illegal? But I'd want to know so I could talk to him about the potential dangers, in case they start sharing Polaroids or something, which I could see coming back into fashion if all these digital devices are locked down, lol.

But I will regularly have talks with them about that sort of stuff when they're older to make sure they really understand the harm. Personally I think that law is pretty jacked up since both parties are minors, especially since the girls tend to get away with it even if they were the ones sending it. But I think there have been some cases where the girls have gotten busted too, so it's not a good idea for anyone. You just never know if the other kid is going to get found out by their parents and have them go to the police. If it were my kid, I'd have them delete it and block the number. I think the parental controls can keep you from unblocking the number.
Maybe I sound over the top but it's not worth risking my kids future to see some boobies, even if he thinks otherwise, lol. I've been there.
99% of 15-, 16-, and 17-year-old guys and girls have sent or received nudes.
 
  • Sad
Reactions: TakeshimaIslands

coolfactor

macrumors 604
Jul 29, 2002
7,126
9,871
Vancouver, BC
I know so many parents that create social media profiles for their kids and lie about the age to get past the age-gate, and the child grows up believing this is okay.
 

coolfactor

macrumors 604
Jul 29, 2002
7,126
9,871
Vancouver, BC
This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives, and it's impossible to draw the line. So Apple decided to draw the line at letting 10-year-olds open explicit images while alerting their parents, but a 13-year-old can do the same and it will alert nobody. What the hell is going on at Apple? Who is making these decisions, and why have they not been fired before any of these stupid announcements?

Children grow and change so fast during those ages. A 10-year-old's maturity (mental and physical) is very different from a 13-year-old's. VERY different, like totally different people as they move through those ages. Apple is basing these on proven maturity standards established by society. I see nothing wrong with the age brackets chosen.
 

Altivec88

macrumors regular
Jun 16, 2016
213
812
Slippery slope is a logical fallacy, you know that right?

EVERY decision is drawing a line somewhere. You can argue this particular line being drawn is too far, but simply stating that because it exists worse things will happen is not logically sound.
It's completely logical when you put two and two together. I have defended Apple over and over, but that has changed. I believed Apple when they said iMessage was impenetrable and only the recipient had the key to view the message. My company used iMessage to discuss private matters; we even passed around passwords in there. I know we are not that interesting or important to monitor, but that is beside the point. That trust is broken and Apple's word doesn't mean much anymore. So when they say this "feature" is optional, I say BS to that.

Governments have been hammering Apple for backdoors left and right, and it is very logical to assume they didn't come up with this on their own. Why didn't they bring this up at the keynote and fully explain everything, instead of waiting for a leak to announce it? They used to put up a fight, but it's clear they are now conceding to governments. They were and are fully aware that software exists to break into any iOS device, access the camera and microphone, even monitor keystrokes. They used to get mad and fix the loopholes immediately; now we just get a response of "yeah, but these tools cost millions, so it's not a big deal." What happened to "we will fight for your privacy"?

If you think Apple implemented system-wide machine-learning photo analysis on everyone's iOS device just so the few parents who do care can stop their kid (who shouldn't even have a phone, if you ask me) from seeing a nude pic, you are very naive. I guarantee that this system will eventually be used in ways you can't even imagine, and turned on remotely by Pegasus users without you even knowing.

I'm not into conspiracy theories, and if you had told me what I just wrote a month ago, I would have said you were crazy. But due to recent events, it is now considered a fact that this nutty stuff is clearly going on. So yes, normally it is not logically sound to state that worse things will happen just because a system exists, but in this case I believe it's very logical that this system will be used for worse things just because it exists.
 

Pezimak

macrumors 68030
May 1, 2021
2,993
3,288
Don't most teenagers want nudity?

I would be pretty upset if my parents kept such a tight leash on me when I was young.

Yes, they are teenagers; it's part of growing up to be curious and look into sex, etc. Your hormones are going crazy then. Everyone should remember what nude pictures they looked at when they were that age.

To me it seems a tricky subject. This seems a clever idea; I think warning over-13-year-olds about abusive texts is perhaps better, especially for girls, as they tend to be more mature. But sexual predators are a very real thing online and do target children of all ages.
Growing up and being curious needs to be balanced against predators and their actions, with systems implemented to help protect children.

A difficult one.
 

Smartass

macrumors 65816
Dec 18, 2012
1,457
1,702
I like it. People with something to hide might not.
People that have something to hide use different phones and software for the "things" they're trying to hide. This feature is complete ******** and is probably connected with that years-old demand from the FBI and NSA for Apple to give them backdoor access to iPhones. It was obvious that sooner or later Apple was going to crack under their demands, and here we have it.
 
  • Love
Reactions: TakeshimaIslands

wanha

macrumors 68000
Oct 30, 2020
1,508
4,378
For some reason, this feels half-assed to me. If Apple's intention is to protect the children, then they should scan for gore/violent images as well. I feel like those images are as disturbing, if not more disturbing, than unsolicited nudes for young children.

edit: grammar
Apple is an American company. America is horrified by nudity, but it glorifies violence.
 

5232152

Cancelled
May 21, 2014
559
1,555
Fabulous. So my 13yo who, while monitored, has more freedom with her phone than a younger child, is going to get no integrated protection. Who the hell thought that was a good age for the cut off? :mad:
And I thought this was such a good idea.

You are angry because the function does not meet 100% of your family's needs? The world is bigger than you, just FYI.
 
  • Like
Reactions: wanha and MLVC

Count Blah

macrumors 68040
Jan 6, 2004
3,192
2,748
US of A
For some reason, this feels half-assed to me. If Apple's intention is to protect the children, then they should scan for gore/violent images as well. I feel like those images are as disturbing, if not more disturbing, than unsolicited nudes for young children.

edit: grammar
Don't worry, that's coming next. After that comes the NLP to identify texts that go against Apple's approved political views. It will have a toggle switch in Cupertino though, because at some point saying "I don't like the president" will be viewed as evil. But after a future inauguration, you will be flagged for saying you like the president.

Apple thought police is this close. Imagine Timmy tapping his fingertips together like Mr. Burns, saying "Excellent."
 

The Phazer

macrumors 68040
Oct 31, 2007
3,000
956
London, UK
Bravo Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercepting your private messages. This is a seemingly well implemented utilisation of onboard machine learning and algorithmic pattern recognition, done in a very anonymous way, to link known questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply an on-device hash recognition that triggers a warning within parental features for children's accounts, that's all!

This is not correct. Have you read it?
 
  • Like
Reactions: Hardijs

Pezimak

macrumors 68030
May 1, 2021
2,993
3,288
Because America.

Just like Facebook going all crazy when a painting showing a breast is posted, but being totally fine with all the violence and guns.

The rest of the world simply shrugs and wonders what's wrong with them.

Despite America's multi-billion-dollar sex industry... We British don't like nudity or rude words, but we happily lap up anything American apart from its sense of humour.
 

WaxedJacket

macrumors 6502a
Oct 18, 2013
690
1,071
Great, but imagine this tech being used to find people posting specific memes online. Essentially, consider the unintended consequences of introducing tech like this and how it could be used in nefarious ways.
 

hot-gril

macrumors 68000
Jul 11, 2020
1,924
1,966
Northern California, USA
Bravo Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercepting your private messages. This is a seemingly well implemented utilisation of onboard machine learning and algorithmic pattern recognition, done in a very anonymous way, to link known questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply an on-device hash recognition that triggers a warning within parental features for children's accounts, that's all!
If they're going to roll out something creepy, they'll start with the seemingly innocent use cases. "Protecting the kids" is always the excuse for invasive policing.
 
  • Like
Reactions: Hardijs