
MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
It says that iMessages will be analyzed too, your iMessages are synced to the cloud.
Wrong…iMessage scanning is a completely different tool and has nothing to do with photos from your camera roll unless you physically save them there.

The iMessage scanning is for family accounts with children under the age of 13, to make sure they don't get unwanted inappropriate images. It has nothing to do with the hashed image scanning (unless you save the picture you received, that is).
 
  • Like
Reactions: flowsy

Playfoot

macrumors 6502
Feb 2, 2009
282
253
As we stand at the top of a new slippery slope, just remember that earlier this year Apple admitted that all users' data in China is now stored on Chinese government servers. The encryption keys are stored at the same server centre. And of course, everyone remembers that Apple surrendered to the Chinese government's demand not to use encryption.

And of course Apple drops apps from the store in China that the government worries about.... Even something as simple as "Designed in California" was dropped from the back of devices in 2019 because the Chinese government demanded it.

All of this was done, and possibly more that is not public, such as NSLs in the US, because, according to Apple, it is the law....

This is just one example of how Apple's marketing of being committed to privacy is just that: marketing.
 
  • Like
Reactions: macmesser

dk001

macrumors demi-god
Oct 3, 2014
10,684
15,033
Sage, Lightning, and Mountains
Who wants their kids receiving sexually explicit images? 80% of teenage girls questioned say they have received pics from boys showing their members. Who wants that?
Why has that become a thing?
People are always ready to complain when anyone decides to do something: the vague unease about what might be, the beginning of the end, the thin end of the wedge.
But whilst they are feeling uneasy and worrying about someone looking at pics of them playing with the dog or standing round the BBQ, society is quietly going to hell.
So you don't mind kids getting explicit pics, and girls feeling forced to send naked pics back?
OK, you feel uneasy about a machine looking at your pics, but is that unease really reason enough to do nothing about the problem?
And what would you suggest they do instead? Would you prefer that they just did nothing?
This is brilliant.
It's a machine giving a number for each match it sees. Only when the match count is very high will someone look at it.

In the old days you sent your film away to be processed and someone looked at every single frame. People didn't send explicit pics because they knew, at the very least, they wouldn't be processed; and that's probably what will happen here.

We have actually lost a lot in the name of freedom. We have allowed a lot of very creepy people to do nasty things in the safety of their own homes; instead of feeling dirty creeping around a red-light district, scared of getting mugged, they are in their bedrooms swigging beer, chatting to 500 people who tell them they feel the same way.
And it's happened without anyone making a fuss.
It's our kids, wives and sisters who are mostly paying for it.

And Google, by the way, has copyright over everything you post to your Google account. You don't even own the images you put online anymore. They are very quiet at the moment. This includes “all” sexes.

You would be amazed at the amount of self-nudes teens share with each other. It has become common in today's teen culture.
 

Playfoot

macrumors 6502
Feb 2, 2009
282
253
I am curious: who compiles, reviews and maintains the database of illegal images? Is this a bit like the famous no-fly list . . .?
 

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
You would be amazed at the amount of self-nudes teens share with each other. It has become common in today's teen culture.
The updated Apple iMessage privacy option will only work with iMessage...and only for family accounts with kids under 13. Teens will still be able to send naked selfies to each other. Even if they couldn't via iMessage, that's what Snapchat, WhatsApp, Messenger, etc. are for.
 

Gasu E.

macrumors 603
Mar 20, 2004
5,040
3,165
Not far from Boston, MA.
The CSAM thing doesn't detect/determine content of images. It checks photos against a database of specific (actively circulating) child abuse images.

Not to say there aren't legitimate concerns, but worrying that it is going to somehow flag your own kid's photos is not one of them.

(The child safety thing does detect content, but it seems the worst it does is throw up a warning/blurring if you have it on.)
What if some hacker broke into your computer, stole your kid's photos, and posted them on some kid-porn site? Extremely unlikely, right? But if it did happen, the consequences for you are dire: your iCloud account gets shut down, you get flagged as a predator, and your name gets sent to the authorities. The authorities are going to start out assuming you posted nude pix of your kids yourself. Now child services comes in to "protect" your kids while you get investigated. Your kids are sent to temporary care. After six months, the police decide they can't charge you due to lack of evidence. Now child services gets to decide whether your kids will be "safe" in your home environment.

Yeah, the initial chances of you getting hacked in this way are tiny. But all the other things are just how "the system" works, automatically. Pity anyone who gets caught in this way, as one's life is essentially over.
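The database matching described in the quoted post can be sketched in a few lines of Python. This is an illustrative toy, not Apple's actual system: the cryptographic hash stands in for a perceptual hash (a real system tolerates resizing and re-encoding, which SHA-256 does not), and the database contents are invented.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image contents.
    return hashlib.sha256(image_bytes).hexdigest()

# Invented database of fingerprints of known, circulating images.
KNOWN_DATABASE = {fingerprint(b"known-circulating-image")}

def is_flagged(image_bytes: bytes) -> bool:
    # A photo only matches if its fingerprint is already in the
    # database; novel photos have no entry to match against.
    return fingerprint(image_bytes) in KNOWN_DATABASE

print(is_flagged(b"known-circulating-image"))   # True
print(is_flagged(b"family-photo-at-the-beach"))  # False
```

The point of the sketch is the lookup: the system never judges what a photo depicts, only whether its fingerprint already exists in the database of known material.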
 
  • Disagree
  • Haha
Reactions: flowsy and MozMan68

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
What if some hacker broke into your computer, stole your kid's photos, and posted them on some kid-porn site? Extremely unlikely, right? But if it did happen, the consequences for you are dire: your iCloud account gets shut down, you get flagged as a predator, and your name gets sent to the authorities. The authorities are going to start out assuming you posted nude pix of your kids yourself. Now child services comes in to "protect" your kids while you get investigated. Your kids are sent to temporary care. After six months, the police decide they can't charge you due to lack of evidence. Now child services gets to decide whether your kids will be "safe" in your home environment.

Yeah, the initial chances of you getting hacked in this way are tiny. But all the other things are just how "the system" works, automatically. Pity anyone who gets caught in this way, as one's life is essentially over.
And why would the hacker go to all of this trouble to do this? And what makes you think a person who doesn't actually abuse their kids or share their pics on porno sites wouldn't have that proven by a quick check of their computer? And do you really think images are added to the database without some serious research into where the pics originated and who the kids are??

Go back to fantasyland…
 
  • Like
Reactions: flowsy

macmesser

macrumors 6502a
Aug 13, 2012
921
198
Long Island, NY USA
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.

Isn't this a violation of privacy? I have a lot of nudes of myself on my phone. :( Should I start deleting them or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared. Should I be worried?
You should be scared and worried, for sure. This is a textbook example of Orwellian control. Should you delete the nudies on your iPhone? Dunno. I'd have to see them first before rendering an opinion. Not being creepy here, just saying it because it's to the point: you are not going to share them, and I've got other things to look at. Power is in the proper hands. Big Brother not needed.
 
  • Like
Reactions: Ander123

Gasu E.

macrumors 603
Mar 20, 2004
5,040
3,165
Not far from Boston, MA.
And why would the hacker go to all of this trouble to do this? And what makes you think a person who doesn't actually abuse their kids or share their pics on porno sites wouldn't have that proven by a quick check of their computer? And do you really think images are added to the database without some serious research into where the pics originated and who the kids are??

Go back to fantasyland…
I don't know anything about the technology used in hacking people's computers. I assume there is some value in sweeping up all the data on a person's computer, analyzing it for whatever is there of value, then breaking it up and reselling it on the dark web. Maybe that's not possible today, but maybe it is or will be; and if it has value, someone will do it. As far as being proven innocent by a "quick check", why do you assume that technology exists or will exist? And as far as the database is concerned, the article says that anything posted on a known kiddie-porn site would get added.

It seems you rushed through your response in order to get to where you really wanted to go, which was your gratuitous insult at the end. Right?
 

arn

macrumors god
Staff member
Apr 9, 2001
16,363
5,796
What if some hacker broke into your computer, stole your kid's photos, and posted them on some kid-porn site? Extremely unlikely, right?

Yes, extremely unlikely. Also because only pretty egregious images are cataloged.

While CSAM is seen and transmitted on computers and through other technology, these images and videos depict actual crimes being committed against children.

There are many reasons to object to Apple's scanning, but worrying that your innocent kid photos are going to be targeted and end up being planted in the CSAM database by a hacker is not even in the top 100, and is so absurd that it just distracts from legitimate arguments against it.
 

ardent73

macrumors regular
Jan 14, 2010
156
61
Huh? A false positive would immediately be recognized and dismissed once it is reviewed by an actual person. It would never even be sent to law enforcement. In fact, you probably won't ever even know about it.

What's happening with a lot of you people posting is that you're letting your emotions cloud your judgment. You're VERY upset that Apple is scanning images on your phone (and I can understand being upset about that, btw), but instead of acknowledging facts such as the less than 1-in-1 trillion error rate, you're letting your emotions get the better of you and denying things without a rational, objective basis on which to deny them. No matter what, you just insist that everything about this topic must be sinister and suspect. You need some balance. You sound like a bunch of conspiracy theorists, tbh.
"facts such as the less than 1-in-1 trillion error rate"

I know technology and I know math; this is a lie. You assume facts not mentioned. You need experience in the real analog world. A simple search can provide many examples where the technology is assumed right, people are guilty, and no proof of innocence is accepted because the system is assumed to be infallible. E-ZPass in the real world, or Minority Report for a fictional version.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
"facts such as the less than 1-in-1 trillion error rate"

I know technology and I know math; this is a lie. You assume facts not mentioned. You need experience in the real analog world. A simple search can provide many examples where the technology is assumed right, people are guilty, and no proof of innocence is accepted because the system is assumed to be infallible. E-ZPass in the real world, or Minority Report for a fictional version.

Apple has most definitely "mentioned" multiple times that the chance of an account being flagged falsely is less than 1 in 1 trillion. I believe I misstated this several times as the error rate for a single PHOTO being falsely flagged, but of course what really matters is whether an account is flagged (whenever that certain number of flagged photos is reached). I trust the brainpower behind this technology more than I trust your assertion that you know better.

Also, you're talking utter nonsense. There is a manual review process for every flagged account, just on the remote chance that images might have been falsely identified as CSAM. I guarantee you Apple is not going to report innocent images as CSAM.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451

So a blog article on hackerfactor that says "I'm calling bullsh*t" is a credible source? Yeah, really professional writing there. And no objectivity at all; even the article title (One Bad Apple) poisons the well from the start. No wonder people are so confused! Besides, all he does is "question" things. He has no proof that the statistic is "made up" and neither do you. So, as I've said before, I'm going with Apple over forum members and bloggers guessing.
 

Playfoot

macrumors 6502
Feb 2, 2009
282
253
Sounds like a trojan backdoor way to justify other spying garbage in the future. They promote this practice for a righteous cause, because who would say "no" to protecting children, right?

Hard no for me.
Agree completely. No matter how well intentioned, a back door is still a back door. Combine that with Apple's self-declared need to follow the laws of whatever land it finds itself in: Apple already sells phones without the FaceTime app where encrypted calls are against the law. Many more examples could be given, but odds are this back door will soon be used for greater snooping....
 

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
So a blog article on hackerfactor that says "I'm calling bullsh*t" is a credible source? Yeah, really professional writing there. And no objectivity at all; even the article title (One Bad Apple) poisons the well from the start. No wonder people are so confused! Besides, all he does is "question" things. He has no proof that the statistic is "made up" and neither do you. So, as I've said before, I'm going with Apple over forum members and bloggers guessing.
...and as I pointed out before, he is also mistakenly using the "single photo" assumption in all of his complicated calculations. I actually laughed while reading it.

There is no basic math you can do to calculate the number, since Apple has not provided details as to what triggers the account review (the number of pictures), BUT anyone who took a basic statistics course could easily see that the odds against multiple innocent images matching multiple images in the database, enough to cause an account to be flagged, are at least a trillion to one. The odds against a single innocent picture matching are of course much lower, but still have to be at least in the hundreds of millions to one...and a single match still won't trigger a review of one's account.

But the many people on here who are unaware of how the tech works and ask idiotic questions about an innocent pic of their kid in a bathtub somehow triggering their account simply need to read how it DOES work.

It's also been shared on here that Apple was already doing these scans and is now only switching to this method to make it even MORE secure and MORE private for its users.
 
  • Like
Reactions: usagora

Playfoot

macrumors 6502
Feb 2, 2009
282
253
Last words to Tim Cook?

************
There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever.


We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust. — Apple CEO Tim Cook, October 1, 2015

***************
 