
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,768
31,227


Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time.

[Image: iPhone Communication Safety feature]

Communication Safety

First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature that warns children and their parents when sexually explicit photos are received or sent. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.

When a child attempts to view a photo flagged as sensitive in the Messages app, they will be alerted that the photo may contain private body parts, and that the photo may be hurtful. Depending on the age of the child, there will also be an option for parents to receive a notification if their child proceeds to view the sensitive photo or if they choose to send a sexually explicit photo to another contact after being warned.
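
To make that flow concrete, here is a minimal Swift sketch of how such an on-device screening step could be wired up. Every type below, and the age cutoff, is a hypothetical illustration for this article, not Apple API:

```swift
import Foundation

// Hypothetical sketch of the Communication Safety flow described above.
// None of these types are Apple API; they only mirror the behavior in the article.

struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool
}

enum ScreeningResult {
    case allowed                      // photo is shown normally
    case blurred(warning: String)     // photo is blurred and the child is warned first
}

struct CommunicationSafetyScreener {
    // Stand-in for the on-device machine learning classifier.
    let isSexuallyExplicit: (Data) -> Bool

    func screen(photo: Data) -> ScreeningResult {
        guard isSexuallyExplicit(photo) else { return .allowed }
        return .blurred(warning: "This photo may contain private body parts and could be hurtful.")
    }

    // Called only if the child chooses to view or send a flagged photo anyway.
    func childProceeded(for child: ChildAccount, notifyParents: () -> Void) {
        // The article says parental notification depends on the child's age;
        // the cutoff used here is purely illustrative.
        if child.age < 13 && child.parentalNotificationsEnabled {
            notifyParents()
        }
    }
}
```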

Apple said the new Communication Safety feature will be coming in updates to iOS 15, iPadOS 15, and macOS Monterey later this year for accounts set up as families in iCloud. Apple emphasized that iMessage conversations will remain protected with end-to-end encryption, making private communications unreadable by Apple.

Scanning Photos for Child Sexual Abuse Material (CSAM)

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, according to Apple.

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, while images that are different from one another result in different hashes," said Apple in a new "Expanded Protections for Children" white paper. "For example, an image that has been slightly cropped, resized or converted from color to black and white is treated identical to its original, and has the same hash."

[Image: Apple CSAM detection flow chart]

Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches. Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing its exact threshold, but says the system ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.
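
A simplified model of that voucher-and-threshold flow is sketched below. Apple's actual design keeps voucher contents cryptographically unreadable until the threshold is exceeded; this sketch reduces that to a plain count purely to show the control flow, and every type in it is hypothetical:

```swift
// Hypothetical, simplified model of the safety-voucher flow described above.

struct SafetyVoucher {
    let imageIdentifier: String
    let matchedKnownCSAM: Bool   // result of the on-device hash match
}

struct ThresholdReviewQueue {
    let threshold: Int           // Apple has not disclosed the real value
    private(set) var vouchers: [SafetyVoucher] = []

    init(threshold: Int) {
        self.threshold = threshold
    }

    mutating func upload(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    // Only once the number of matches exceeds the threshold does the account
    // become eligible for Apple's manual review (and, if confirmed, an NCMEC report).
    var eligibleForHumanReview: Bool {
        vouchers.filter { $0.matchedKnownCSAM }.count > threshold
    }
}
```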

Apple said its method of detecting known CSAM provides "significant privacy benefits" over existing techniques:
• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can't learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year.
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.
The underlying technology behind Apple's system is quite complex, and Apple has published a technical summary with more details.
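
Apple has not published the numbers behind the one-in-one-trillion figure above, but a back-of-the-envelope Poisson calculation with purely hypothetical per-image error rates illustrates why a match threshold pushes the account-level false-positive rate so low:

```swift
import Foundation

// Back-of-the-envelope illustration only; all numbers are hypothetical,
// not Apple's published analysis.
// Computes P(X > t) for X ~ Poisson(lambda), summed directly from k = t + 1
// upward to avoid cancellation when the tail is tiny.
func poissonTailAbove(threshold t: Int, expectedFalseMatches lambda: Double) -> Double {
    var term = exp(-lambda)
    for k in 1...(t + 1) {
        term *= lambda / Double(k)   // term is now P(X = t + 1)
    }
    var tail = 0.0
    for k in (t + 1)...(t + 50) {    // 50 extra terms is plenty for small lambda
        tail += term
        term *= lambda / Double(k + 1)
    }
    return tail
}

// Hypothetical example: 100,000 photos in a library, a one-in-a-million
// per-image false-match rate (so lambda = 0.1 expected false matches),
// and a threshold of 10 matches before review.
let chance = poissonTailAbove(threshold: 10, expectedFalseMatches: 100_000 * 1e-6)
print(chance)   // on the order of 1e-19, i.e. well below one in one trillion
```

Even with these made-up inputs, the point is only that requiring many independent matches multiplies tiny per-image error rates together, which is the intuition behind Apple's accuracy claim.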

"Apple's expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," said John Clark, the President and CEO of the National Center for Missing & Exploited Children. "At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known. The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working together to make this world a safer place for children."

Expanded CSAM Guidance in Siri and Search

[Image: Expanded CSAM guidance in Siri on iPhone]

Third, Apple said it will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

The updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, according to Apple.


Article Link: Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material
 
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.



Isn't this a violation of privacy? I have a lot of nudes of myself on my phone. :( Should I start deleting them, or what?

Well over 34,000 photos. You've got to be kidding me.

I’m scared 😱 Should I be worried?
 


arn

macrumors god
Staff member
Apr 9, 2001
16,363
5,796
Isn't this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them, or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared 😱 Should I be worried?

The CSAM database is of known abuse images, so regular nudes shouldn't trigger it.

And the blurring of explicit photos seems to be a parental control thing.
 

Exponent

macrumors 6502
Jul 17, 2002
266
642
Silicon Valley
No, too far, Apple.

What is going to keep you from scanning my library for NeuralHash matches against political content you don't like? Or criticism of dictatorial mainland China?

If that doesn't happen in the US, what will keep other countries (see above) from doing just that to their citizens?
 

kildraik

macrumors 6502a
May 7, 2006
933
1,323
So THAT'S why we had projects to identify tags for sensitive images.

Wholesome.

This will have negative impacts on all users in the long run. It's a complete overreach of privacy. Based on the work I've seen, there will be a decent margin of error.
 

arn

macrumors god
Staff member
Apr 9, 2001
16,363
5,796
Yeah, good luck if, say, you have small young kids who don't keep their clothes on. Like what, every baby?

This is also creepy af, sorry. Child predators are bad, obviously, but this isn't the way.

The CSAM thing doesn't detect/determine content of images. It checks photos against a database of specific (actively circulating) child abuse images.

Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.

(The child safety thing does detect content, but it seems the worst it does is throw up a warning/blur if you have it on.)
 

jk1221

macrumors 6502
Feb 1, 2021
285
1,058
It doesn't detect/determine content of images. It only checks photos against a database of known child abuse images.

Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.

It's still a slippery slope here. What's next, will Apple disallow nudity in iCloud because #morals?

Plus, the whole appeal of iCloud over, say, Google Photos was that there was no cloud AI scanning or meddling with your photos at all.

Now, what is the difference really?
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Yeah, good luck if, say, you have small young kids who don't keep their clothes on. Like what, every baby?

It's a slippery slope policing your users here.

This is also creepy af, sorry. Child predators are bad, obviously, but this isn't the way.

Read the whole article carefully. While there are plenty of legitimate concerns about this, I don't think false flags are one of them. They are scanning for known child abuse imagery or visually similar edits of those known images (e.g., cropped or filtered versions). So your personal family photos aren't going to get flagged.

In other words, they're not scanning for something vague like "a picture of a naked child".
 

hortod1

macrumors 6502
Jan 26, 2009
462
1,265
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.


Isn't this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them, or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared 😱 Should I be worried?
If you're sending them to a minor, yes, you should be worried.
 

zakarhino

Contributor
Sep 13, 2014
2,508
6,778
The justification for a “solution” to a “problem” starts with referencing something abhorrent that nobody in their right mind could possibly defend.

You can’t object to this technology because that would mean you’re supporting child abusers.

You can't object to anti-misinformation technology and laws because that would mean you're supporting neo-Nazis and fascism.

Ultimately, there's a long list of things we can do to help combat both of those issues WITHOUT compromising people's privacy and freedom of speech in the process, but none of those solutions would come with the desired side effect that these people really want: control over what information gets shared amongst the population. I put "solution" and "problem" in inverted commas for a reason: the people in power are the ones setting the narrative on how to define the problem and the solution. God forbid they instead choose a solution that actually aims to fix these issues at their core.
 

Apple_Robert

Contributor
Sep 21, 2012
34,504
50,065
In the middle of several books.
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.


Isn't this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them, or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared 😱 Should I be worried?
That is a lot of nudity. Are you in the business, or do you just like to check your body for health goals?
 

fwmireault

Contributor
Jul 4, 2019
2,158
9,167
Montréal, Canada
I don't know how I feel about this. I do want us to take appropriate measures against child abuse, but I'm not sure this is the right way. Reading this article, I do think Apple developed the feature carefully, and that none of our nudes or other sensitive content would ever be flagged without good reason. But still, coming from a company that brands privacy as its priority, this opens doors to darker purposes (and I'm not necessarily talking just about Apple, but also law enforcement and government agencies).
 

BreakingKayfabe

Suspended
Oct 22, 2020
1,322
4,516
Southern Cal
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.


Isn't this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them, or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared 😱 Should I be worried?
It's one thing if you have all those nudes on your iPhone. It's another if you have them synced to a cloud service. I wouldn't enable that under any circumstances.
 

jk1221

macrumors 6502
Feb 1, 2021
285
1,058
Let's also not forget this is the same company that would not help the FBI access dead known/proven terrorists' phones to try to prevent other attacks and get information.

But they will scan your cloud data for these types of images, privacy rights notwithstanding.

It's a bit hypocritical which criminals they will and won't help pursue.

And I agree, this all begins with framing it in a way that no one can say no to: "You all want to prevent child exploitation, or are you a monster?"
 