
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,768
31,226


Apple today announced a series of new child safety initiatives that are coming alongside the latest iOS 15, iPadOS 15, and macOS Monterey updates and that are aimed at keeping children safer online.

[Image: iPhone Communication Safety feature warning screen]

One of the new features, Communication Safety, has raised privacy concerns because it scans images sent and received by the Messages app for sexually explicit content. Apple has confirmed, however, that this is an opt-in feature limited to the accounts of children, and that it must be enabled by parents through the Family Sharing feature.

If a parent turns on Communication Safety for the Apple ID account of a child, Apple will scan images that are sent and received in the Messages app for nudity. If nudity is detected, the photo will be automatically blurred and the child will be warned that the photo might contain private body parts.

"Sensitive photos and videos show the private body parts that you cover with bathing suits," reads Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

The child can choose to view the photo anyway, and for children that are under the age of 13, parents can opt to get a notification if their child clicks through to view a blurred photo. "If you decide to view this, your parents will get a notification to make sure you're OK," reads the warning screen.

These parental notifications are optional and are only available when the child viewing the photo is under the age of 13. Parents cannot be notified when a child between the ages of 13 and 17 views a blurred photo, though children in that age range will still see the warning about sensitive content if Communication Safety is turned on.

Communication Safety cannot be enabled on adult accounts and is only available for users who are under the age of 18, so adults do not need to worry about their content being scanned for nudity.

Parents need to expressly opt in to Communication Safety when setting up a child's device with Family Sharing, and it can be disabled if a family chooses not to use it. The feature uses on-device machine learning to analyze image attachments and because it's on-device, the content of an iMessage is not readable by Apple and remains protected with end-to-end encryption.
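The age cutoffs take a moment to untangle, so here is a rough Swift sketch of the decision flow the article describes. It is purely illustrative: every type and name below is invented for this example, and Apple has not published its actual implementation.

```swift
// Illustrative sketch only. All names here are hypothetical,
// not Apple's actual (unpublished) implementation.

struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool    // parent opted in via Family Sharing
    let parentalNotificationsEnabled: Bool  // separate opt-in, honored only under 13
}

enum PhotoAction {
    case showNormally
    case blurWithWarning(notifiesParentsOnView: Bool)
}

// Runs entirely on-device, so the iMessage itself stays end-to-end encrypted.
func action(forExplicitPhoto isExplicit: Bool, account: ChildAccount) -> PhotoAction {
    guard account.communicationSafetyEnabled, account.age < 18, isExplicit else {
        return .showNormally
    }
    // Parents can only be notified for children under 13, and only if they opted in.
    let notify = account.age < 13 && account.parentalNotificationsEnabled
    return .blurWithWarning(notifiesParentsOnView: notify)
}

let twelveYearOld = ChildAccount(age: 12, communicationSafetyEnabled: true, parentalNotificationsEnabled: true)
let fifteenYearOld = ChildAccount(age: 15, communicationSafetyEnabled: true, parentalNotificationsEnabled: true)
print(action(forExplicitPhoto: true, account: twelveYearOld))  // blurWithWarning(notifiesParentsOnView: true)
print(action(forExplicitPhoto: true, account: fifteenYearOld)) // blurWithWarning(notifiesParentsOnView: false)
```

The sketch simply encodes the two thresholds from the article: under 18 for the blur-and-warn behavior, under 13 for the optional parental notification.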

Article Link: Apple's New Feature That Scans Messages for Nude Photos is Only for Children, Parental Notifications Limited to Kids Under 13
 

eatrains

macrumors 6502a
Mar 11, 2006
635
4,845
Looks like Apple is being very transparent about it now.

Apple's warning: "It's not your fault, but sensitive photos and videos can be used to hurt you."

What in the world does that even mean? Don't take sensitive photos/videos?

They were never not transparent. It's not Apple's fault that irresponsible and uninformed reporters spread misinformation.
 

jclo

Managing Editor
Staff member
Dec 7, 2012
1,973
4,308
Looks like Apple is being very transparent about it now.

Apple's warning: "It's not your fault, but sensitive photos and videos can be used to hurt you."

What in the world does that even mean? Don't take sensitive photos/videos?

Not sure there's any ambiguity in Apple's alert. It's clearly warning children against nude photos sent by or requested by predatory adults.
 

mozumder

macrumors 65816
Mar 9, 2009
1,291
4,427
So, Apple is allowing teens to view images they know are nudes? Seems like a minefield.
 

CalMin

Contributor
Nov 8, 2007
1,694
3,015
I applaud Apple's efforts to help parents/guardians protect their kids from those who might do them harm. They will work out the privacy concerns over time. They are not going to want to compromise their reputation as the device maker that has privacy as a differentiator.
 

rictus007

macrumors 6502
Oct 12, 2011
424
1,107
This is not a bad idea, maybe it can be improved…. with a custom message…. You open that picture and you are not going to touch an iPhone until you are 21.
Not joking: it's always good to have tools to protect my kids and also give them some access to the internet…. until they can make their own decisions.
 

bwillwall

Suspended
Dec 24, 2009
1,031
802
This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives, and it's impossible to draw the line. So Apple decided to draw the line at letting 10-year-olds open explicit images with an alert to their parents, while a 13-year-old can do the same and nobody is alerted. What the hell is going on at Apple? Who is making these decisions, and why have they not been fired before any of these stupid announcements?
 

antiprotest

macrumors 601
Apr 19, 2010
4,044
14,261
This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives, and it's impossible to draw the line. So Apple decided to draw the line at letting 10-year-olds open explicit images with an alert to their parents, while a 13-year-old can do the same and nobody is alerted. What the hell is going on at Apple? Who is making these decisions, and why have they not been fired before any of these stupid announcements?
Well, as the Apple workers said when they complained about going back to the office, they did some of their "best work" while working from home last year. So you're probably seeing some of that "best work". More to come I'm sure.
 

antiprotest

macrumors 601
Apr 19, 2010
4,044
14,261
Fabulous. So my 13yo who, while monitored, has more freedom with her phone than a younger child, is going to get no integrated protection. Who the hell thought that was a good age for the cut off? :mad:
And I thought this was such a good idea.
I think 13 is the age below which companies aren't allowed to collect personal data or let kids sign up for accounts (COPPA in the US), so maybe Apple is just taking that number from there.
 

InGen

Suspended
Jun 22, 2020
275
935
Bravo Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercept your private messages. This is a seemingly well implemented utilisation of onboard machine learning done in a very anonymous way to flag questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply on-device image recognition that triggers a warning within the parental features for child accounts, that's all!
 

macduke

macrumors G5
Jun 27, 2007
13,187
19,795
A lot of folks suddenly panicking today over Apple giving parents a tool to protect young children from predators.

No, people are freaking out that Apple is giving regimes the ability to scan their secure device for any content they deem unacceptable. This is about more than child porn, but people make it about child porn because that's how you manipulate people into giving up freedoms. Just look at the damage a certain online group that will not be named has done using that as a tool to manipulate the masses, and in that case it's completely made up BS.

In this particular instance, however, I don't see a problem with parents having the ability to scan things on their children's phones. That puts the control at a much lower level than nation state, but it still feels kinda weird, like it could be manipulated for nefarious purposes. What I like about this is that you can combine it with other parental controls to prevent them from using other apps that can't be locked down like this. Makes me much more likely to get my daughter a phone at a younger age. Not sure what age yet, but she's currently 7 so we have a while to figure that out, lol.

Also not sure why I can't be notified if my 13, 14, 15, 16, or 17 year old is receiving nudes. They're underage, and as the courts have ruled, any underage nude photo on your device can get you jail time, even if you are also underage, and especially if you're a boy. My son is only 5 now, but say he's 15 and his 15-year-old girlfriend flashes him in a photo. He could go to jail for that. That has actually happened. I suppose the blurred photo still prevents it from being illegal? But I'd want to know so I could talk to him about the potential dangers, in case they start sharing polaroids or something, which I could see coming back into fashion if all these digital devices are locked down, lol. But I will regularly have talks with them about that sort of stuff when they're older to make sure they really understand the harm.

Personally I think that law is pretty jacked up since both parties are minors, especially since the girls tend to get away with it, even if they were the ones sending it. But I think there have been some cases where the girls have gotten busted too, so it's not a good idea for anyone. You just never know if the other kid is going to get found out by their parents and have them go to the police. If it were my kid I'd have them delete it and block the number. I think the parental controls can keep you from unblocking the number. Maybe I sound over the top, but it's not worth risking my kid's future to see some boobies, even if he thinks otherwise, lol. I've been there.
 

Apple_Robert

Contributor
Sep 21, 2012
34,504
50,065
Bravo Apple! A lot of the negative comments about privacy are from people who didn't read the fine print on who this applies to and how it applies. This isn't Apple wanting to go through your private photos or intercept your private messages. This is a seemingly well implemented utilisation of onboard machine learning done in a very anonymous way to flag questionable content with a warning, that's all. There's no event recording or reporting to external sources or databases; it's simply on-device image recognition that triggers a warning within the parental features for child accounts, that's all!
I am going to guess that many people did read the article correctly but decided that they weren't going to blindly take Apple's word and live in blissful trust that there won't be any "oops, we are sorry that X happened with your account" kind of news from some big tech company.
 

hans1972

macrumors 68040
Apr 5, 2010
3,340
2,916
This makes NO sense to me. This is the kind of feature that has the potential for false positives/negatives, and it's impossible to draw the line. So Apple decided to draw the line at letting 10-year-olds open explicit images with an alert to their parents, while a 13-year-old can do the same and nobody is alerted. What the hell is going on at Apple? Who is making these decisions, and why have they not been fired before any of these stupid announcements?

Don't most teenagers want nudity?

I would be pretty upset if my parents had kept such a tight leash on me when I was young.
 