
JimmyHook

macrumors 6502a
Apr 7, 2015
954
1,792
Let's also not forget this is the same company that would not help the FBI access dead, proven terrorists' phones to try to prevent other attacks and gather information.

But they will scan your cloud data for these types of images.

A bit hypocritical about which criminals they will and won't help catch, as well.
Helping the FBI in that context would reduce privacy for everyone. But they do help the feds when they have a valid warrant and the data is accessible without compromising everyone else’s privacy. This is a great tool.
 

400

macrumors 6502a
Sep 12, 2015
760
319
Wales
Interesting document linked, and the terms and conditions for using the service are literally in black and white. Struggling to get as far as the "m" in "meh" with this.
Are other cloud providers looking at doing the same?
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I know this has been discussed, but this was my first thought. Photos of a little grandson's bath being flagged.

A lot of you guys didn't read the whole article or misunderstood something. This only flags images that match a database of known child abuse images or "visually similar" images (cropped / resized / color change / etc.). There's basically no chance that your little grandson's first bath will be flagged.

This really isn't the concern here. I think the concern is that this is a slippery slope.
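
For anyone wondering what "matching a database of known images" means mechanically, here's a rough sketch. To be clear, Apple's NeuralHash is proprietary and not public; this stand-in uses a plain average hash computed with Pillow, and the `average_hash`, `hamming`, `is_flagged` names, the blocklist values, and the threshold are all made up purely for illustration.

```python
# Rough illustration only: Apple's actual matcher is proprietary, so this uses
# a simple "average hash" (aHash) via Pillow as a stand-in. All hash values and
# the threshold below are placeholders, not real data.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to 8x8 grayscale and set a bit for each pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of *known* abuse images (placeholder values).
KNOWN_HASHES = {0x8F3A0C71D245B9E6, 0x17C4E9A20B5D83F1}

def is_flagged(path: str, threshold: int = 5) -> bool:
    """Flag only if the photo's hash is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in KNOWN_HASHES)

# A family photo produces its own 64-bit hash; unless that hash lands within a
# few bits of an entry already on the blocklist, nothing is flagged.
```

The point of the sketch: the comparison is against a fixed list of hashes of already-known material, not a judgment about what a new photo depicts.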
 

burgman

macrumors 68030
Sep 24, 2013
2,731
2,302
That's so wrong! Hey Apple... Focus on fixing this stuff before going after consumers. This is creepy and wrong on so many levels.


Isn’t this a violation of privacy? I have a lot of nudes of myself on my phone :( Should I start deleting them or what?

Well over 34,000 photos. You've got to be kidding me.

I’m scared 😱 Should I be worried?
I wondered what secret agreements Apple made with the DOJ that caused the threats and PR pressure to create backdoors to suddenly disappear. I think we just saw one. How bad are the non-public ones?
 

fwmireault

Contributor
Jul 4, 2019
2,162
9,243
Montréal, Canada
I know this has been discussed, but this was my first thought. Photos of a little grandson's bath being flagged.
If you read how the process will work, this kind of picture would never be flagged. That doesn’t make the move less controversial or more acceptable, but Apple wants one of your saved pictures to match a database of known child abuse photos before anything is flagged to anyone. Nudes of you or photos of your children taking a bath would not match that database in the first place.
 

JosephAW

macrumors 603
May 14, 2012
6,040
8,067
Great, go to the harvest festival and take pictures of fruits and vegetables, and the next thing you know the police are banging on your door.

So AI is going to report us? What could go wrong?

I hope there’s a human element involved with this. But then again, I don’t want humans or AI rifling through my photographs on my device based on false positives. You would hope that iOS will alert users if a photo on their device has been flagged and the authorities notified.

I wonder if they’ve already been doing this without our consent with all the face recognition in the Photos app?
Sending everyone's face recognition data to the CIA, NSA, and FBI?
 

zakarhino

Contributor
Sep 13, 2014
2,522
6,791
The CSAM database is of known abuse images, so regular nudes shouldn't trigger a match.

And the blurring of explicit photos seems to be a parental control thing.

You don’t know for a fact whether or not the system will trigger false positives, as Apple have said their technology is proprietary and is designed to detect changes from base CSAM images. That means some level of interpretation is involved via machine learning, which on the whole has been proven to be easily tricked into false positives. No doubt that will be weaponized somehow. We can’t say it’s just comparing basic hashes, because Apple have stated otherwise and have not opened their system up for public audit.

Not to mention how this on the whole is a massive invasion of privacy and opens the door to scanning for images beyond what is obviously bad content.
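
To make the false-positive worry concrete, here's a toy back-of-envelope model. It assumes uniformly random 64-bit hashes, which real perceptual hashes are not, and it has nothing to do with Apple's actual (unpublished) matcher; it only shows how widening the match tolerance widens the accidental-collision surface. The `collision_odds` function and the thresholds are illustrative assumptions.

```python
# Toy illustration of the trade-off: any "visually similar" matcher has to
# tolerate some difference between hashes, and the wider that tolerance, the
# more unrelated images can fall inside it. This models hashes as uniformly
# random 64-bit values, which is a simplification, not Apple's system.
from math import comb

BITS = 64

def collision_odds(threshold: int) -> float:
    """Probability a uniformly random 64-bit hash lands within `threshold`
    bits of a given hash (sum of the binomial tail)."""
    return sum(comb(BITS, d) for d in range(threshold + 1)) / 2**BITS

for t in (0, 4, 8, 12):
    print(f"tolerance {t:2d} bits -> ~{collision_odds(t):.2e} chance per comparison")

# Exact matching (tolerance 0) almost never collides by accident, but it also
# misses crops and re-encodes; raising the tolerance catches those edits while
# enlarging the space an adversarially crafted image could be steered into.
```

The per-comparison odds look tiny, but they get multiplied by every photo in every library checked against every entry in the database, and a learned, proprietary hash can behave far worse than this random-hash baseline against deliberately crafted inputs.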
 

bradl

macrumors 603
Jun 16, 2008
5,939
17,430
It's still a slippery slope here. What next, will Apple disallow nudity in iCloud, because #morals?

Plus, the whole appeal of iCloud over, say, Google Photos was that there was no cloud AI meddling with or scanning your photos at all.

Now, what is the difference really?

One thing everyone has to realize: you do not own your data in the Cloud.

Yes, you heard me right.

When you upload your data to any cloud service, including iCloud, that data sits on the provider's servers, putting them in possession of it. So when any federal entity is investigating you or anyone else, they do not require a warrant to access your data there: since the data is held by a 3rd party, that 3rd party is not privy to that warrant, so only a subpoena from a clerk of the court is needed.

Oh, PSA: any lawyer is a clerk of the court. They can create their own subpoena, execute it (it does not need to be signed off by the court), and have it delivered to that 3rd party for whatever purposes are listed in that subpoena, including providing any and all data needed for their investigation. They do not need to go directly to you for that. A report was done on this and posted in PRSI roughly 8 years ago:


How that is relevant here: if such abusive photos, or worse, are located on Apple's servers, Apple could possibly be charged with harboring them, putting the company in legal liability, along with being subpoenaed to answer questions about who uploaded those pictures, when, etc. Then the investigators could go after the person who did it. In the end, Apple gets dinged with legal charges, as well as the uploader. Apple is looking at how to get out of those legal issues with this.

Again, I'm not saying whether it is morally or ethically right or wrong, but they also are looking after themselves with this.

BL.
 

JimmyHook

macrumors 6502a
Apr 7, 2015
954
1,792
APPLE YOU DO NOT HAVE THE RIGHT TO VIOLATE MY PRIVACY TO ANYTHING ON MY DEVICE.

Typical Tim Cook BS - “all about human rights,” then supports PRC. “All about privacy,” then scans all images on someone’s phone.
Nobody is violating your privacy. You will only be affected if you are a criminal with child abuse photos, and then you should be arrested, shamed, and jailed.
 

arn

macrumors god
Staff member
Apr 9, 2001
16,363
5,798
You don’t know for a fact whether or not the system will trigger false positives, as Apple have said their technology is proprietary and is designed to detect changes from base CSAM images. That means some level of interpretation is involved via machine learning, which on the whole has been proven to be easily tricked into false positives. No doubt that will be weaponized somehow. We can’t say it’s just comparing basic hashes, because Apple have stated otherwise and have not opened their system up for public audit.

Not to mention how this on the whole is a massive invasion of privacy and opens the door to scanning for images beyond what is obviously bad content.
Absolutely. False positives are a legit concern. But the early responses seemed to think that any borderline child images were going to be reported.
 