
Mebsat

macrumors regular
May 19, 2003
215
367
Florida
No. It doesn't work this way. It's scanning for known child sexual abuse images using hashes to identify those photos. It's not going to flag your personal photo.
Let us stipulate that child porn is wrong.

It remains that a private corporation is now doing something, at scale, that would require a warrant if done by law enforcement. This is unacceptable, full stop. This is vulnerable to mission creep of unbelievably dystopian proportions.

And if Apple's competitors are also doing it, the only difference is that they will monetize it better than Apple.
 

zakarhino

Contributor
Sep 13, 2014
2,521
6,791
"But Practice Already Widespread"

Nice Apple apologist work there, sir. The practice of scanning photos on a cloud service is already widespread, but scanning all photos on somebody's local device, regardless of whether they're using iCloud, is not widespread at all, and Apple wouldn't invest in creating an on-device version if they didn't intend to enable scanning of all local photos in the future. The photos on iCloud are already unencrypted (they shouldn't be). Give me one reason why they would move the processing on-device rather than doing it in the cloud, besides the obvious.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
There is no option if you use Photos or iCloud Photos. However, upon reading more about this, it apparently works by encoding your image into a "hash string," which isn't the actual image, and then comparing that hash string to a list of KNOWN child abuse images from the internet (so unique images of your children playing in the bath aren't even an issue). If you have a certain number of exact matches, then and only then do they "manually" (i.e. a person) look at the images and determine if a crime is occurring that they need to report. They claim there is a less than 1 in a TRILLION chance that a flagged account doesn't actually contain significant amounts of true child abuse imagery. I need to read more, but perhaps it isn't as creepy as it first sounded to me... but they need to be very, very transparent about what is going on.

Then I’m sure Apple will be willing to offer a trillion dollars of compensation to anyone they falsely accuse?
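The matching scheme described in that quote can be sketched in a few lines. This is a hypothetical illustration, not Apple's code: the real system reportedly uses a perceptual "NeuralHash" and cryptographic threshold machinery, whereas this sketch substitutes an ordinary SHA-256 digest and a plain counter just to show the shape of hash-list matching with a threshold.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; SHA-256 here is only illustrative."""
    return hashlib.sha256(image_bytes).hexdigest()


def account_flagged(photos, known_csam_hashes, threshold=30):
    """Flag an account only when the number of exact hash matches meets the threshold."""
    matches = sum(1 for p in photos if image_hash(p) in known_csam_hashes)
    return matches >= threshold
```

A unique family photo never appears on the known list, so it contributes zero matches and can never push an account over the threshold by itself.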
 

Mousse

macrumors 68040
Apr 7, 2008
3,520
6,760
Flea Bottom, King's Landing
It’s not about child sex material, everyone agrees that that is wrong, it’s about passing over more and more of our rights to Big Tech.
Trudat. I see this as a test bed for Apple's true motive: tracking down leakers. Once the technology is perfected, they can hash leaked photos and ID the leaker. Apple does things for Apple's reasons, not out of any altruistic motive.
Give them an inch and they’ll take a foot.
With the first link, the chain is forged...
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,655
10,617
Anyone uploading data to a third-party system should just assume that the third-party can and will access the data.
Not just that…
anyone entering any data into a device with the capability to connect to the Internet should just assume that it’s not 100% secure.
even if your phone is on airplane mode with everything shut off, it’s still not secure, no matter what phone you have.
 

bsolar

macrumors 68000
Jun 20, 2011
1,536
1,754
It’s not about child sex material, everyone agrees that that is wrong, it’s about passing over more and more of our rights to Big Tech. Give them an inch and they’ll take a foot.

Or, as the old quote says:

The trouble with fighting for human freedom is that one spends most of one's time defending scoundrels. For it is against scoundrels that oppressive laws are first aimed, and oppression must be stopped at the beginning if it is to be stopped at all.

That was in the context of governments, but today's big companies have enough power that IMHO the same concept applies.
 

Absrnd

macrumors 6502a
Apr 15, 2010
903
1,631
Flatland
Yes, Microsoft is already scanning, and I have read about many account holders being terminated
without any explanation of why, with everything in their accounts inaccessible: all e-mail, documents, photos... all gone.

There is no way to contact them, and ask what was wrong, they just don't respond.

This will also happen to Apple accounts.
 

mariusignorello

Suspended
Jun 9, 2013
2,092
3,168
Child porn is absolutely 100% wrong, but allowing this could end up being a nightmare.

Saying “if you have nothing to hide who cares” is an incredibly lazy and tired argument that needs to die, because it’s been proven that companies get an inch and take a mile (hello Facebook).

Call me a conspiracy theorist but what are you gonna do when they decide to push further? It’s better to be more proactive rather than reactive, which, I know, goes against the American way.

If law enforcement is so confident in their GrayKey then let them use that with a proper warrant. 24/7 monitoring is not the answer.
 

snakes-

macrumors 6502
Jul 27, 2011
355
138
If Apple sniffs around on iPhones, they'll eventually look at what's on our Macs too... remember Apple's own services bypassing our VPNs in Big Sur.
Some Chinese brands got banned for things like this.
 

SBlue1

macrumors 68000
Oct 17, 2008
1,973
2,497
Thinly veiled excuse to rummage through your personal photos based on a false premise. Why don't you just build the child porn detection directly into the cameras, to scan in real time? I'm sure Apple and others could figure that out.
This is a great idea. Like how a modern photocopier can't copy money: the printed page comes out black. You couldn't take a photo of your d..k. The only catch is the differing standards of morality. In Europe you can sunbathe naked and nobody cares; in the US, seeing a nipple is considered porn. LOL!
 

_Spinn_

macrumors 601
Nov 6, 2020
4,857
10,044
Wisconsin
Even though Apple’s intentions are good in this case, this is beyond creepy and has great potential for abuse. How long before this expands beyond searching for images of child abuse and into policing “wrong think”?

Apple doing this on your local device is worse than Google scanning all your stuff in the cloud. I picked Apple because I didn’t want to be subject to Google’s snooping. Now it looks like Apple is doing the same thing, just locally.
 

Ursadorable

macrumors 6502a
Jul 9, 2013
650
902
The Frozen North
Yes, they've always had software on my computer to scan for "wrongthink" content. Oh wait...

What's right today can easily be made wrong tomorrow. Who would have thought pictures of Winnie the Pooh could land you in jail in China?

Either way, if this rolls out, so do all my Apple products.
 

antiprotest

macrumors 601
Apr 19, 2010
4,103
14,539
iOS 15
- Apple will now scan your images searching for evidence of foul play.

iOS 16
- Apple will now listen to your voice mail searching for evidence of foul play.
- Apple will now monitor your browsing history searching for evidence of foul play.

iOS 17
- Apple will now scan your videos searching for evidence of foul play.
- Apple will now monitor your apple credit card searching for evidence of foul play.

iOS 18
- Apple will now scan your text messages searching for evidence of foul play.

iOS 19
- Apple will become Goople and iOS will become iDroid.
 

sdf

macrumors 6502a
Jan 29, 2004
863
1,169
so if i have a private picture of my own child in a bathtub splashing away it will now flag and some contractor working halfway around the world for pennies gets to view my child in the bathtub, without my permission, and determine if perhaps i am some kind of child abuser?
No, that's not how it works.

Your photo will be reduced to a single numerical fingerprint. That fingerprint will be compared to a list of fingerprints for known images, without any context about your image. This fingerprint isn't a picture of your child bathing, and it's definitely not an image or understanding of genitalia. It's just a number, like 8BA62546-1258-4E90-9096-48EE7365ECAE. Since your photo is not on the list, nothing will happen.

On the other hand (and of course you wouldn't do this, I'm just trying to explain the mechanics here), if you sold that image online to a lot of people and it became a well known image of child pornography, the FBI would eventually add the image to their database. Apple would end up building a fingerprint of the FBI's copy of that image. If that fingerprint still matched your image, it would be flagged. Apple/LEO would be able to look at their copy of the image matching that fingerprint because they acquired the image through another mechanism. In this case, though, some other mechanism got the photo from your computer or phone into the cloud.
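To make the "just a number" point concrete, here is a toy fingerprint function. It is illustrative only: the UUID-style formatting simply mimics the example string above, and SHA-256 stands in for whatever perceptual hashing the real system uses.

```python
import hashlib
import uuid


def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to an opaque UUID-style string; it reveals nothing about the content."""
    digest = hashlib.sha256(image_bytes).digest()[:16]  # 16 bytes -> one UUID
    return str(uuid.UUID(bytes=digest)).upper()


def is_known(image_bytes: bytes, known_fingerprints: set) -> bool:
    """Matching against the known list is plain string equality; nobody looks at pixels."""
    return fingerprint(image_bytes) in known_fingerprints
```

The fingerprint of a photo that isn't on the list simply fails the set lookup, and that is the end of it.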
 

antiprotest

macrumors 601
Apr 19, 2010
4,103
14,539
I think it's important to clarify one aspect of this security software: it fails once in one TRILLION accounts.
Pretty safe and accurate, if you ask me.
Unless you are that one in a trillion sucker. And after that it happens to no one else for five thousand years.

But in any case, that's just a number they throw out.
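For what the claim is worth, a back-of-envelope check is easy. The account count below is a rough order-of-magnitude assumption on my part, not an Apple figure.

```python
# Apple's claimed per-account probability of a false flag (per year)
claimed_rate = 1e-12

# Rough order-of-magnitude guess at the number of iCloud accounts (assumption)
accounts = 1e9

# Expected number of falsely flagged accounts per year, if the claim holds
expected_false_flags = claimed_rate * accounts  # roughly 0.001
```

In other words, taken at face value the claim implies about one false flag per thousand years across the entire user base, which is exactly why the number invites skepticism.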
 

[AUT] Thomas

macrumors 6502a
Mar 13, 2016
778
989
Graz [Austria]
Hash-based or not (and if it is, it's next to useless, given that a single flipped bit completely changes the hash), this is a privacy violation.
Our legal system is still based on the presumption of innocence. Apple, YouTube, etc. need to stop playing FBI, judge, and nanny, which they are not; absent a lawful request, they shall not violate users' privacy, freedom of speech, and so on. Period.
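The single-flipped-bit point is easy to demonstrate with a cryptographic hash, which is precisely why matching systems use perceptual hashes instead. This sketch uses SHA-256 purely to show the avalanche effect.

```python
import hashlib

original = b"some photo bytes"
tweaked = bytearray(original)
tweaked[0] ^= 0b00000001  # flip exactly one bit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# One flipped bit yields a completely unrelated digest, so a resized or
# recompressed copy of an image would never match a cryptographic hash list.
assert h1 != h2
```

A perceptual hash is designed to do the opposite: small pixel-level changes produce the same or a nearby hash, which is what makes image matching workable at all.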
 

Darmok N Jalad

macrumors 603
Sep 26, 2017
5,303
46,003
Tanagra (not really)
No one would object to fighting this particular issue, but the means is the beginning of soft totalitarianism, where governments reach beyond their laws and oversight and put that burden on regulated corporations instead. We are all very dependent on these tech companies for our everyday lives at this point. The government doesn't even have to sit down and pass laws to do this to its citizens; it gets the corporations to spy, censor, and punish (just think if you ended up on something like a hypothetical Amazon no-delivery list). Companies will flash lengthy user agreements that no one reads or understands before clicking agree, and the courts will be on their side. All iCloud users are assumed guilty until proven otherwise with such systems.
 