> I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.

Then if you have nothing to hide, forward your username and password. See how that works?
> Based on everything that's been explained about how it works and what purpose it serves, it's honestly creepier that you feel the need to turn it off.

If you have nothing to hide, send me all of your photos, since you're apparently totes ok with Apple employees looking through your iCloud picture library.
> I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.

Yes, because people are never falsely accused, the hash AI is perfectly flawless (no false-positive hash collisions), and I'm sure no Apple employees would ever abuse the system to look through a hot chick's iCloud Photo Library.
> This isn’t about what might happen somewhere someday, this is about catching purveyors of kiddie porn, right now, here, today.

I'll be the judge of that. Give me access to your photo library. You have nothing to hide.
> Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.

Yes, exactly. I'll volunteer to prescreen your photos to make sure there's nothing incriminating before they get uploaded to iCloud. I want to do my part!
> Not as obnoxious as saying we shouldn’t do everything we reasonably can to protect exploited children and punish their abusers because we don’t want to allow a harmless and fully automated scan that affects us in no way at all.

Again: not the point. You clearly do not understand the ramifications of these kinds of actions and are blinded by the slogans.
I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
I'm glad they are doing this. If it will help catch even 1 sicko, it would be worth it!
It is a hash. It is mathematics, not a visual inspection.
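The "it's just a hash" claim above can be sketched in a few lines. This is a deliberately simplified illustration using a cryptographic hash (SHA-256) and made-up placeholder data; Apple's actual system reportedly uses a perceptual hash (NeuralHash) so that visually similar images produce matching hashes, which is exactly where the false-positive debate elsewhere in this thread comes from.

```python
import hashlib

# Hypothetical database of known-bad digests (placeholders, not real data).
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known(image_bytes: bytes) -> bool:
    """Compare the image's fixed-length digest against the known set.

    Only the digest is compared; nothing about the photo's content is
    revealed by a non-match, and no human looks at anything here.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known(b"example-known-image-bytes"))   # True: byte-identical
print(matches_known(b"a completely different photo"))  # False
```

Note the caveat: a cryptographic hash like this only matches byte-identical files, so even re-saving an image defeats it. A perceptual hash trades that exactness for robustness to resizing and recompression, at the cost of a nonzero collision rate.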
Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Until someone gets screwed over for taking a pic of their toddler taking a fun bath naked.
I have mixed feelings about this. On one hand, I do not like Apple backpedaling on a topic they’ve marketed incessantly for years and rolling back privacy. As others said, this can easily become a very slippery slope. I completely support finding and severely prosecuting people who abuse and harm children, and taking action to prevent it in the first place. I’m just not positive that I like this approach.
It brings up another topic in my mind: cloud services. Here’s the thing, when we store our data on someone else’s property (servers), we’re giving up some level of control over it. The direction a lot of technology is currently heading means we don’t really own anything; we’re just paying to use a company’s services. There seems to be a bit of a moral dilemma here.
> Still worth it!

And how many false alarms and accusations do you deem acceptable, not to mention the spying on your most precious private photos, potentially of your own kids?
Still worth it!
> Nope. General privacy abuse for everyone outweighs individual abuse.

Maybe I should have said "to me" it's still worth it. That way you can't say "Nope".
But sure enough, some would argue in favor of waterboarding for the potential terror relief as well.
> Wait until the employees who find positive hits get paid for each violation they find; they will be planting all kinds of data everywhere.

This is true. One of my employees was on community control (house arrest) many years ago. She was allowed to go to the store, bank, etc. only on certain days of the week, and it had to be preapproved a week in advance. Once she was 2 minutes late arriving home. Her probation was violated and she spent 3 months in the slammer before a judge finally released her and dismissed the charge. The system is inherently corrupt. And let's not forget how many poor souls have spent 3 or 4 decades in prison for crimes they never committed.
It’s that way now with those who are on parole. If a parole officer finds a violation they get paid a bonus.
Actually, the intent is creepy. It is not as if an elected official passed a law or there was a recent referendum; Apple seems to have just decided that this was "the right thing to do". What if they decided that monitoring hate, extremism, suicidal ideation, bullying, or whatever we all now oppose (however Apple wants to define these things) was the right thing to do?
I remember the good old days, when people on these forums used to make fun of Android for not being privacy-friendly to its users, and now Apple comes out and says, "We're going to scan every photo you take for possible child sex abuse!" At least Google is quiet about these things...
Or it actually allows a human operator to view the pictures, and then it is a massive privacy and safety concern (as it implies that regardless of this particular use case, as a general rule Apple is OK with storing user content which it can decrypt and to give access to its employees).
> Still worth it!

Why stop there, though? Next Apple should pre-screen every thought typed into iMessage, Mail, Safari, Notes, etc. for potential misinformation!