
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,770
31,228


Apple is planning to expand its Communication Safety in Messages feature to the UK, according to The Guardian. Communication Safety in Messages was introduced in the iOS 15.2 update released in December, but the feature has been limited to the United States until now.


Communication Safety in Messages is designed to scan incoming and outgoing iMessage images on children's devices for nudity and warn them that such photos might be harmful. If nudity is detected in a photo that's received by a child, the photo will be blurred and the child will be provided with resources from child safety groups. Nudity in a photo sent by a child will trigger a warning encouraging the child not to send the image.

Communication Safety is opt-in, privacy-focused, and must be enabled by parents. It is limited to the accounts of children, with detection done on-device, and it is not related to the anti-CSAM functionality that Apple has in development and may release in the future.

We have a full guide on Communication Safety in Messages that walks through exactly how it works, where it's used, Apple's privacy features, and more.

Update: Communication Safety in Messages is also coming to Canada.

Update 2: The feature is also coming to Australia and New Zealand, according to The Verge.

Article Link: Apple's Messages Communication Safety Feature for Kids Expanding to the UK, Canada, Australia, and New Zealand
 

JCCL

macrumors 68000
Apr 3, 2010
1,924
4,328
At least they say this is on-device and nothing is sent to Apple. But still, there is one functionality that would render these things useless, and it is called "Parenting"

I have seen a lot of people who are not pleased with this. Sorry, I didn't mean to offend anyone, but I do think there's no way the tool can catch every threat, especially internationally, where hardly anyone uses iMessage to begin with.

The basis is establishing trust and communication with your kids, doing this together and, yes, eventually checking on what they're doing on your devices. No tool is going to prevent 100% of risks, especially with all the apps and different media channels that exist.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Well, we all know how this thread will end up, with people who didn't read the multiple articles explaining the difference between CSAM detection and Communication Safety.
Not sure why you would say this, but I suppose we'll see. I object vociferously to the local phone CSAM-detection system for multiple reasons, but this seems reasonable enough.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
At least they say this is on-device and nothing is sent to Apple. But still, there is one functionality that would render these things useless, and it is called "Parenting"
Surely good parenting would include enabling this feature on any iPhones your kids use. Of course this feature might lull parents into a false sense of security because people use a variety of communication channels to send pictures, but no doubt Apple will alert parents to this during the process of enabling iMessage Communication Safety.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
At least they say this is on-device and nothing is sent to Apple. But still, there is one functionality that would render these things useless, and it is called "Parenting"

Um, parenting can involve using technology like this to help protect your children from harmful content or behavior. Even children with outstanding parents are still children: curious, often immature, naive, and prone to mistakes. While this technology isn't foolproof, of course, it's still a great tool for parents to have at their disposal.

It's not an either-or thing.
 

fwmireault

Contributor
Jul 4, 2019
2,158
9,167
Montréal, Canada
Not sure why you would say this, but I suppose we'll see. I object vociferously to the local phone CSAM-detection system for multiple reasons, but this seems reasonable enough.
I am on the same page. Like many others, I spent so much time in other threads explaining and re-explaining the difference between the two, but there are still people here screaming about CSAM scanning when we're talking about the Communication Safety feature (which I don't have any problem with).
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Honestly though. Guess what would happen if the child really wanted to see some nude photos?
This does serve some purpose. But sadly when there is a will, there is a way. 😓

My understanding is the parent is notified if they elect to view it anyway, so they can investigate.

@sorgo and others "laughing" at this post. That's funny? Ok... So nice that you guys can laugh about people sending nudes to minors. Guessing you're not parents or people that care about kids much 🙄
 

siddavis

macrumors 6502a
Feb 23, 2009
863
2,905
I am on the same page. Like many others, I spent so much time in other threads explaining and re-explaining the difference between the two, but there are still people here screaming about CSAM scanning when we're talking about the Communication Safety feature (which I don't have any problem with).
True. But when you are caught with your pants down once, people tend to think the pants are coming off every time your hand gets near the belt.
Pun intended :p
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,673
Germany
Well, we all know how this thread will end up with the people who didn't read the multiple articles explaining the difference between CSAM and Communication Safety
Well, if my parents had been so restrictive, I wouldn't be doing the job I do today; luckily, they weren't. Communicating on IRC was far more dangerous. The only restriction I've set for my kids is time at the end of the day.

Restrictions keep kids from developing, and I bet the parents who will use this feature are helicopter parents…
 

B4U

macrumors 68040
Oct 11, 2012
3,582
4,017
Undisclosed location
My understanding is the parent is notified if they elect to view it anyway, so they can investigate.
Yeah, but nothing stops them from using other means to look at nude photos if they so desire.
This is as useful as having an alarm system on the cookie jar: if the child wants the cookie badly enough, they have other means of obtaining one.
(Reminds me of the AFV episode two weeks ago where the kid simply turned the home camera toward the wall when the dad told him to stop taking the chocolate.)
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Yeah, but nothing stops them from using other means to look at nude photos if they so desire.
This is as useful as having an alarm system on the cookie jar: if the child wants the cookie badly enough, they have other means of obtaining one.
(Reminds me of the AFV episode two weeks ago where the kid simply turned the home camera toward the wall when the dad told him to stop taking the chocolate.)

There are third-party content blockers parents can install. But the fact that devious children can find ways to circumvent protection measures doesn't mean we should stop using them. Instead, we should continue to improve them, and parents should be training and, where needed, disciplining their children appropriately.
 

britboyj

macrumors 6502a
Apr 8, 2009
814
1,086
What kids use iMessage in the UK anyway LMFAO

None. And this will kill any chance of them ever using it.

WhatsApp is king, this isn’t North America.

Bingo. WhatsApp is the dominant "texting" service outside the US, which is unfortunate in and of itself since you know... Zuck owns it.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,649
10,603
Honestly though. Guess what would happen if the child really wanted to see some nude photos?
This does serve some purpose. But sadly when there is a will, there is a way. 😓
This feature is much less about trying to stop kids from seeing things they're eventually going to be naturally curious about, and much more about warning them before they open that one very bad picture from someone who should not be sending it.
Obviously the feature is not going to be 100% effective, but if it protects even one child from seeing a photo they never asked for and should never have seen, it's doing its job.
 