
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,768
31,226


Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).


Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers quickly began expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud. If there is a match, that photograph is uploaded with a cryptographic safety voucher, and at a certain threshold, a review is triggered to check if the person has CSAM on their devices.
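For readers who want a concrete picture of the flow being described, here is a minimal sketch in Python of threshold-based hash matching with per-upload vouchers. It is not Apple's code: the hash values, the voucher structure, and the threshold are made up for illustration, and a plain string comparison stands in for NeuralHash output.

```python
# Sketch only: threshold-based matching of image hashes against a list of
# known hashes, with a per-upload "voucher" carrying the match result.
KNOWN_HASHES = {"9f2ac4e1", "b7310d55"}  # hypothetical known-CSAM hash list
MATCH_THRESHOLD = 30                      # hypothetical escalation threshold

def make_voucher(image_id: str, image_hash: str) -> dict:
    """Client side: every upload carries a match flag (encrypted in the real design)."""
    return {"image_id": image_id, "matched": image_hash in KNOWN_HASHES}

def account_needs_review(vouchers: list[dict]) -> bool:
    """Server side: human review only happens once the match count crosses the threshold."""
    return sum(v["matched"] for v in vouchers) >= MATCH_THRESHOLD

# One match among a hundred uploads stays far below the threshold, so nothing is escalated.
vouchers = [make_voucher(f"IMG_{i}", "9f2ac4e1" if i == 0 else "deadbeef")
            for i in range(100)]
print(account_needs_review(vouchers))  # False
```

The point of the threshold is that a single stray match never reaches a human reviewer; only an account that accumulates many matches does.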

For now, Apple is using its image scanning and matching technology only to look for child sexual abuse material, but researchers worry that it could in the future be adapted to scan for other kinds of imagery, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
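To see why collisions are possible at all, note that any perceptual hash compresses a vast space of images down to a short fingerprint, so distinct images can share a value. The toy "average hash" below is far cruder than NeuralHash and is only meant to show the idea: two different 2x2 "images" collapse to the same fingerprint.

```python
# Toy illustration of a hash collision with a crude average hash (not NeuralHash).
def average_hash(pixels: list[list[int]]) -> int:
    """One bit per pixel: set if the pixel is brighter than the image's average."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two visibly different 2x2 "images" with the same bright/dark layout...
img_a = [[200,  10], [ 30, 220]]
img_b = [[140,  90], [100, 180]]

# ...produce the same fingerprint, i.e. a collision.
print(hex(average_hash(img_a)), hex(average_hash(img_b)))  # both 0x9
```

This is why the reporting threshold and manual review matter: a single accidental match should never be enough to flag an account.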

Apple, for its part, says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and that reports are manually reviewed before a person's iCloud account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse imagery prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple used screening technology to look for illegal images and that it disables accounts when evidence of CSAM is detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Article Link: Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread
 

drumpat01

macrumors 6502
Jul 31, 2004
444
115
Denton, TX
Everyone should have known this. Like... if you happen to have any pictures of yourself or others that are of an adult nature, notice that you can't search for an "adult" word and find a match on your phone. That's not just keyword blocking. They know exactly what is happening in those photos and won't show them to you.
 

macrumorsuser10

macrumors 6502
Nov 18, 2010
359
445
Apple should add scanning for:

1. Photos of the Confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle Eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
 

canyonblue737

macrumors 68020
Jan 10, 2005
2,176
2,676
So if I have a private picture of my own child in a bathtub splashing away, it will now flag, and some contractor working halfway around the world for pennies gets to view my child in the bathtub, without my permission, and determine if perhaps I am some kind of child abuser? Maybe they even keep a "souvenir" of the photos somehow. Then I get to talk to a detective and be forever flagged in some database as someone accused of, or at least investigated for, one of the most horrific crimes that exists? What could possibly go wrong?

UPDATE: there is no option not to have this scanning if you use Photos or iCloud Photos. However, upon reading more about this, it apparently works by encoding your image to a "hash string," which isn't the actual image, and then comparing that hash string to a list of KNOWN child abuse images from the internet (so unique images of your children playing in the bath aren't even an issue). If you have a certain number of exact matches, then and only then do they "manually" (i.e. a person) look at the images and determine if a crime is occurring that they need to report. They claim there is a less than 1 in a TRILLION chance that a flagged account doesn't actually contain significant amounts of true child abuse imagery. I need to read more, but perhaps it isn't as creepy as it first sounded to me... but they need to be very, very transparent about what is going on.
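For what it's worth, a "1 in a trillion" style number is the kind of figure you get once several independent matches are required before anyone looks. A rough sketch of that arithmetic, using a made-up per-image false-match rate and threshold (not Apple's published parameters):

```python
# Rough sketch of the threshold arithmetic; the parameters are assumptions for illustration.
from math import comb

def p_at_least(n_photos: int, p_false: float, threshold: int) -> float:
    """P(at least `threshold` false matches among `n_photos` independent photos)."""
    return sum(comb(n_photos, k) * p_false**k * (1 - p_false)**(n_photos - k)
               for k in range(threshold, n_photos + 1))

# Even a fairly generous per-image false-match rate collapses to a tiny
# per-account rate once ten independent matches are required.
print(p_at_least(n_photos=1_000, p_false=1e-6, threshold=10))  # ~2.6e-37
```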

The real issue perhaps isn't false flags so much as how this technology could spread to scan for other types of images (political, etc.) to flag individuals. I get that seems very "black helicopter" of me to say, but it is Apple itself that says it doesn't build backdoors into iOS because of the chance they could be abused.
 
Last edited:

Bawstun

Suspended
Jun 25, 2009
2,374
2,999
If you're not doing anything wrong, then you have nothing to worry about.

This simply isn't true. As the article notes, the technology could easily be repurposed in the future - what if they scanned for BLM supporter images or anti-government images? What if they wanted to scan for and track certain political parties?

It's not about child sex material; everyone agrees that that is wrong. It's about handing over more and more of our rights to Big Tech. Give them an inch and they'll take a foot.
 

sirozha

macrumors 68000
Jan 4, 2008
1,927
2,327
Depends on the definition of "wrong". In some countries, you apparently cannot even say anything bad about your Olympic team without having to flee and ask for asylum on foreign soil. Is that wrong?
In Europe, it’s customary to take family pictures of 10-year-old girls topless on the beach. In the US, this would be considered child porn and you can go to prison for 20 years if they find a picture like that on your phone.
 

evansls

macrumors regular
Jul 18, 2004
132
93
Leesburg, VA
iOS 15
- Apple will now scan your images searching for evidence of foul play.

iOS 16
- Apple will now listen to your voice mail searching for evidence of foul play.
- Apple will now monitor your browsing history searching for evidence of foul play.

iOS 17
- Apple will now scan your videos searching for evidence of foul play.
- Apple will now monitor your Apple Card searching for evidence of foul play.

iOS 18:
- Apple will now scan your text messages searching for evidence of foul play.
 

AltecX

macrumors 6502a
Oct 28, 2016
520
1,351
Philly
So, wait. For them to do this, it sounds like some company is actively acquiring and indexing every known instance of child abuse imagery and giving it a tag for Apple to check against? Man, I would hate to have that job.

What I find a bit more worrying from a security POV is not that Apple might be indexing iCloud - it's on their servers, I expect that. It's this part:

Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud....
....a review is triggered to check if the person has CSAM on their devices.


They are looking into my DEVICE'S local storage. Makes me glad that my other phone is a stripped-down Android ROM.
 

Jamers99

macrumors regular
Apr 10, 2015
205
184
Lutz, FL
Thinly veiled excuse to rummage through your personal photos based on a false premise. Why don't you just build the child porn detection directly into the cameras to scan in real time? I'm sure Apple and others could figure that out.

Seriously though, it is not Apple's job to go spying and reporting on their customers. What happens with false positives? You really want to be put on a suspected child predator or child kidnapper list and receive a knock at the door from law enforcement or the FBI? Especially knowing they'll most certainly pull your gun purchase history or concealed weapons permit and then serve you with a 3:00 am no-knock warrant via battering ram because they claim you are armed and dangerous. This kind of insanity gets innocent people killed all the time. I say no to the Police State and Big Brother.
 

macduke

macrumors G5
Jun 27, 2007
13,187
19,795
While child abuse is terrible and I experienced it myself as a child, I don't think it's worth tearing down our democracy to get rid of some of it. Most child abuse happens behind closed doors offline.

Now that this exists, countries will use it to crack down on everyone in various ways, both now and in the future. They will pressure Apple by disallowing sales in their country if they don't let them scan everything, and Apple will fold because money is their primary motivation. Just look at China for an example of how low Apple will bow. I fear that the days of Apple being the most privacy-conscious company are coming to an end, and now they want to route every website you visit through iCloud Private Relay. Hmmmm.
 

SkyRom

macrumors regular
Dec 17, 2018
132
668
Ultimately it comes down to what is done with the images collected. We know anecdotally from other companies and leaks that "individual bad actors" can always abuse policies like this by preying on or collecting information from subjects that interest them (usually hot women), or by stealing private information (for instance, your Social Security card, driver's license, or other sensitive screenshots in your iCloud).

I don't believe the risk of false positives flagging "normal" pictures and getting people erased from the internet or fired from their jobs is a chief concern, and I trust Apple's history (leakers and intellectual property thieves notwithstanding) of requiring warrants before cooperating with law enforcement. My personal policy has always been: Don't put anything online or the cloud that you wouldn't want ending up in a deposition. The responsibility to safeguard myself, my family and my privacy is on me, not Apple Inc.
 

AltecX

macrumors 6502a
Oct 28, 2016
520
1,351
Philly
Thinly veiled excuse to rummage through your personal photos based on a false premise. Why don't you just build the child porn detection directly into the cameras to scan in real time? I'm sure Apple and others could figure that out.

Seriously though, it is not Apple's job to go spying and reporting on their customers. What happens with false positives? You really want to be put on a suspected child predator or child kidnapper list and receive a knock at the door from law enforcement or the FBI? Especially knowing they'll most certainly pull your gun purchase history or concealed weapons permit and then serve you with a 3:00 am no-knock warrant via battering ram because they claim you are armed and dangerous. This kind of insanity gets innocent people killed all the time. I say no to the Police State and Big Brother.
Because they aren't scanning for the CREATION of it, just copies of already existing material. Putting it in the camera would mean it gets triggered by every high school kid filming a fist fight as "abuse," or flags an 18- or 19-year-old girl who may look young for her age as child porn.
 

jclo

Managing Editor
Staff member
Dec 7, 2012
1,973
4,308
So if I have a private picture of my own child in a bathtub splashing away, it will now flag, and some contractor working halfway around the world for pennies gets to view my child in the bathtub, without my permission, and determine if perhaps I am some kind of child abuser? Maybe they even keep a "souvenir" of the photos somehow. Then I get to talk to a detective and be forever flagged in some database as someone accused of, or at least investigated for, one of the most horrific crimes that exists? What could possibly go wrong?

No. It doesn't work this way. It's scanning for known child sexual abuse images using hashes to identify those photos. It's not going to flag your personal photo.
 