
ipponrg

macrumors 68020
Oct 15, 2008
2,309
2,087
I guess I just don't care enough if people look at my pictures. You can see my dog, my food, places I've traveled. Heck, I post some of these on different social media sites anyway. Why are people so freaked out about someone seeing your pictures? If that's the case, you either have some skeletons in your closet or you probably shouldn't be using technology this advanced. Either that or you're in the CIA. I am none of those 3, so it just doesn't bother me.

The problem with this statement is that you are ignoring the elephant in the room, i.e. the principle. I don't have anything to worry about in my pictures either. The problem is that Apple preaches privacy as its mantra, but their actions and statements contradict that. That's why people are being very vocal about it.

Do we know if Google scans Google Photos or if Microsoft scans OneDrive Photos?

They might be doing this already and we don't even know it...

I wouldn't be surprised if they did. Many tech companies bend technology ethics.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Didn't you just write that they only scan pictures that are uploaded to iCloud, where of course Apple can see them and even scan them? And - surprise - it turns out they have been scanning photos on iCloud all along. So why exactly do they suddenly need to scan our devices? The only reason I can think of is that they are laying the groundwork for future developments ...

Because, as I and others (and Apple themselves) have already explained, it's more private to scan on your device (because the scan "results" if you will are stored on your phone where Apple can't read them). They are voluntarily saying, "Hey, we're going to keep scanning photos for illegal material, but we're going to increase the security/privacy of this scanning by changing the method of how we do it."
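To put the on-device matching idea in rough code terms, here's a minimal sketch of the concept as publicly described, not Apple's actual NeuralHash/PSI implementation; the hashing, database entries, and names below are all stand-ins I made up:

```python
# Conceptual sketch only: SHA-256 stands in for Apple's perceptual hash, and a
# plain Python set stands in for the blinded known-CSAM hash database. The real
# system layers private set intersection and threshold cryptography on top so
# the device can't read the database and Apple can't read individual results;
# none of that is modeled here.
import hashlib

KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"placeholder known image").hexdigest(),  # hypothetical entry
}

def perceptual_hash(image_bytes: bytes) -> str:
    """Stand-in for the perceptual hash computed on the device."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check: does this photo's hash appear in the shipped database?"""
    return perceptual_hash(image_bytes) in KNOWN_IMAGE_HASHES
```

As described above, the result of that check only leaves the phone in a form Apple can't read until enough matching images have been uploaded.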
 

Michael Scrip

macrumors 604
Mar 4, 2011
7,932
12,489
NC
First of all... CSAM sucks and the people who make it are monsters.

But who has that kind of stuff on a phone that is constantly connected to the internet? :oops:

Talk about needing an airgapped solution... though I'm sure Ethan Hunt could still find it...

 

tyranne201

macrumors regular
Apr 1, 2020
238
293
This is really bad. Imagine someone forgetting to turn this thing off and having all their nudes scanned and sent to Apple, then Apple gets hacked and a leak occurs.
 
  • Like
Reactions: George Dawes

Rigby

macrumors 603
Aug 5, 2008
6,234
10,177
San Jose, CA
Because, as I and others (and Apple themselves) have already explained, it's more private to scan on your device (because the scan "results" if you will are stored on your phone where Apple can't read them). They are voluntarily saying, "Hey, we're going to keep scanning photos for illegal material, but we're going to increase the security/privacy of this scanning by changing the method of how we do it."
That makes no sense to me, since they can apply the exact same algorithms they are using on the device on their cloud servers. Since the database, hash function, threshold parameters and other details of the algorithm are secret, the on-device method requires you to blindly trust Apple anyway. The difference is that when doing it in the cloud, they don't need to intrude on my device and undermine any E2E encryption.
 
  • Like
Reactions: peanuts_of_pathos

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
That makes no sense to me, since they can apply the exact same algorithms they are using on the device on their cloud servers. Since the database, hash function, threshold parameters and other details of the algorithm are secret, the on-device method requires you to blindly trust Apple anyway. The difference is that when doing it in the cloud, they don't need to intrude on my device and undermine any E2E encryption.

Well if you're just going to disbelieve Apple or think you know better than them, then obviously nothing I can say is going to change your mind. No point in further discussion.
 

Rigby

macrumors 603
Aug 5, 2008
6,234
10,177
San Jose, CA
Well if you're just going to disbelieve Apple or think you know better than them, then obviously nothing I can say is going to change your mind. No point in further discussion.
What I do know is that I can control what my device uploads to the cloud, but that I cannot control or even find out what exactly Apple's secret scanning algorithms do on my device.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
This is so wrong. A kid could take a picture of themselves and have their photo sent to the police.

Nope, that's not at all what's happening here. Only known CSAM is flagged and once the number of flagged images uploaded to iCloud reaches a certain threshold, they are manually reviewed to confirm. So obviously if it's not actual CSAM after review, it won't be sent to the police.
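The threshold part of that, as a loose sketch (the real threshold value and the voucher cryptography aren't public, so the number below is a made-up placeholder):

```python
# Illustrates "nothing is reviewed until enough flagged uploads accumulate".
MATCH_THRESHOLD = 10  # placeholder only; the actual number is not disclosed

class AccountState:
    def __init__(self) -> None:
        self.flagged_uploads = 0

    def record_upload(self, matched_known_image: bool) -> None:
        # Each uploaded photo that matched the known-image database bumps the count.
        if matched_known_image:
            self.flagged_uploads += 1

    def manual_review_unlocked(self) -> bool:
        # Below the threshold, individual matches stay opaque; only at or above
        # it can the flagged images be reviewed by a human.
        return self.flagged_uploads >= MATCH_THRESHOLD
```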
 
  • Like
Reactions: BigMcGuire

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
What I do know is that I can control what my device uploads to the cloud, but that I cannot control or even find out what exactly Apple's secret scanning algorithms do on my device.

They've already told you what they're doing, but apparently you don't believe them. I do. That's the fundamental difference between us here. There's all kinds of advanced software on your phone already that you are trusting Apple isn't doing devious things with in the background without your knowledge. I don't see why this is any different. I guess the word "scan" triggers some people.
 

jntdroid

macrumors 6502a
Oct 12, 2011
937
1,286
They've already told you what they're doing, but apparently you don't believe them. I do. That's the fundamental difference between us here. There's all kinds of advanced software on your phone already that you are trusting Apple isn't doing devious things with in the background without your knowledge. I don't see why this is any different. I guess the word "scan" triggers some people.

It's because they're playing police. They're welcome to do that with anything I upload to their servers. But they shouldn't be able to investigate things on my device without my permission - even if they can't do anything with it until I choose to upload it. Use your own servers to do that, not my phone. It's the same reason they've rejected requests from the FBI to break into previous devices. Why was that not ok, but this is? (I know the encryption answer, but the general concept is the same)

And this is a completely different thing than search indexing or other background software services that go on. Those aren't "investigating" anything.

If Apple said the scanning and hash matching won't happen UNLESS you have iCloud Photos activated, that would be one thing. But the hash matching is going on whether I'm using iCloud Photos or not. That's where the problem comes in.

Also, you never really answered this, which is the same point I've been trying to make, "That makes no sense to me, since they can apply the exact same algorithms they are using on the device on their cloud servers. Since the database, hash function, threshold parameters and other details of the algorithm are secret, the on-device method requires you to blindly trust Apple anyway. The difference is that when doing it in the cloud, they don't need to intrude on my device and undermine any E2E encryption."
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
It's because they're playing police. They're welcome to do that with anything I upload to their servers. But they shouldn't be able to investigate things on my device without my permission - even if they can't do anything with it until I choose to upload it. Use your own servers to do that, not my phone. It's the same reason they've rejected requests from the FBI to break into previous devices. Why was that not ok, but this is? (I know the encryption answer, but the general concept is the same)

But they aren't investigating it on your phone. All that's happening is illegal images are being marked - no investigation happens until you upload a certain number of those to iCloud. This is completely different--even in general concept--to the FBI case, because no third party or even Apple themselves is being allowed access to content on your phone itself.

And this is a completely different thing than search indexing or other background software services that go on. Those aren't "investigating" anything.
Ah, but how do you KNOW with 100% certainty there isn't also some hidden process going on that Apple is covering up! /s See, if people want to go conspiracy theorist about things, there's no end to that rabbit hole. No matter what you tell them, they will keep insisting that "we don't know" or "there are still unanswered questions." They thrive off that. They don't want it to end or be resolved, because then they have no soap box to stand on anymore.

Also, you never really answered this, which is the same point I've been trying to make, "That makes no sense to me, since they can apply the exact same algorithms they are using on the device on their cloud servers. Since the database, hash function, threshold parameters and other details of the algorithm are secret, the on-device method requires you to blindly trust Apple anyway. The difference is that when doing it in the cloud, they don't need to intrude on my device and undermine any E2E encryption."

How are they undermining E2E encryption? Any images (illegal or not) sent to iCloud servers can already be read by Apple since they are technically the recipient you're sending the images to. All this scan is doing is alerting them if some of these photos are CSAM during that transfer. But since the actual analysis of each of your images is being done on your device, Apple isn't looking at data from your non-CSAM photos. If they continued to scan in the cloud, they WOULD be looking at it. I understand that of course Apple can already see your images on the cloud, but now there's no reason for them to "look" at all of them (in terms of analysis) there, because that's already been done on your device, which they can't "see."
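Roughly, the per-photo part could be pictured like this; it's a very loose model, since the real safety vouchers are encrypted blobs the server can't read below the threshold, and every name here is hypothetical:

```python
# Loose model of the idea that only an opaque per-upload marker leaves the
# device, never the device's analysis of the photo itself. Real vouchers are
# produced cryptographically; the strings below are just stand-ins.
import hashlib
from dataclasses import dataclass

KNOWN_IMAGE_HASHES = set()  # hypothetical database shipped with the OS

@dataclass
class SafetyVoucher:
    photo_id: str
    opaque_payload: bytes  # the server can't interpret this on its own

def make_voucher(photo_id: str, image_bytes: bytes) -> SafetyVoucher:
    # The match decision happens here, on the device.
    matched = hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES
    payload = b"match" if matched else b"no-match"  # stand-in for encrypted data
    return SafetyVoucher(photo_id, payload)
```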

I have a feeling you're going to tell me that doesn't answer your question, but I'm afraid I'm not an expert at this highly technical subject and the above is the best I can do, so if you want a more detailed technical answer, I'd suggest contacting Apple.
 
  • Like
Reactions: BigMcGuire

jntdroid

macrumors 6502a
Oct 12, 2011
937
1,286
But they aren't investigating it on your phone. All that's happening is illegal images are being marked - no investigation happens until you upload a certain number of those to iCloud. This is completely different--even in general concept--to the FBI case, because no third party or even Apple themselves is being allowed access to content on your phone itself.

Semantics, IMHO. They're not investigating, but they're marking the illegal images? Seems the same to me. They're not accessing it on iCloud either - or at least they don't have to. They can use the same safety voucher system on iCloud without doing it on my device.

Ah, but how do you KNOW with 100% certainty there isn't also some hidden process going on that Apple is covering up! /s See, if people want to go conspiracy theorist about things, there's no end to that rabbit hole. No matter what you tell them, they will keep insisting that "we don't know" or "there are still unanswered questions." They thrive off that. They don't want it to end or be resolved, because then they have no soap box to stand on anymore.

That's the thing, I'm not going conspiracy theory at all. I believe and trust Apple will do what they say with this. I just don't agree that doing this on my device is better for me, or more private for me. I'm already giving up a certain level of privacy when I choose to use iCloud photos. The same goes for other companies who also scan for CSAM images (Google, Microsoft, etc). Why can't Apple just stick with that? Doing it this way does indeed open the door (not a backdoor...) for more abuse in the future, even if I trust Apple to fight against that.

How are they undermining E2E encryption? Any images (illegal or not) sent to iCloud servers can already be read by Apple since they are technically the recipient you're sending the images to. All this scan is doing is alerting them if some of these photos are CSAM during that transfer. But since the actual analysis of each of your images is being done on your device, Apple isn't looking at data from your non-CSAM photos. If they continued to scan in the cloud, they WOULD be looking at it. I understand that of course Apple can already see your images on the cloud, but now there's no reason for them to "look" at all of them (in terms of analysis) there, because that's already been done on your device, which they can't "see."

I have a feeling you're going to tell me that doesn't answer your question, but I'm afraid I'm not an expert at this highly technical subject and the above is the best I can do, so if you want a more detailed technical answer, I'd suggest contacting Apple.

Some of this is above my head as well... but again, I just don't see the difference. Unless I'm mistaken, they're scanning every photo on my device to see if there is a hash that matches a hash on the list they're going to put on my device with iOS 15. If there are enough matches, they'll then flag the matches and have them transmitted via a safety voucher system.

So instead of doing that scanning on my device, why not do it in the cloud? They don't have to "look" at anything in iCloud. They can use the exact same system to scan/hash match/flag without "looking" at all of the other photos. All they have to do is simply not "look" at the ones that don't match hashes...
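That's really the crux of it: the core check is just a function over the image bytes and a hash list, so nothing about it is inherently tied to the device. A rough sketch of that point (stand-in hashing, not Apple's actual code):

```python
# Sketch of the "run it either place" point. SHA-256 stands in for the
# perceptual hash; the cryptographic details that differ between on-device
# and server-side designs are deliberately left out.
import hashlib
from typing import Iterable

def matches_known_hashes(image_bytes: bytes, known_hashes: Iterable[str]) -> bool:
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in set(known_hashes)

# On-device: call matches_known_hashes() before the photo leaves the phone.
# Server-side: call the exact same function against the uploaded copy instead.
```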

It just feels like a HUGE workaround to say they're doing it in a more private manner on my device, while creating a huge potential future slippery slope (which, if I remember correctly, you agree with).

Why not just implement this in the cloud like other companies already do, and then market it that way and get a pat on the back like we all know they would?
 

macbookguy

Cancelled
Dec 28, 2006
348
363
Do we know if Google scans Google Photos or if Microsoft scans OneDrive Photos?

They might be doing this already and we don't even know it...

They all do it server side, which is the correct answer. Client side starts a slippery slope. Right now it's limited to things your device is planning to upload to iCloud. Next it's the device regardless of upload status. And then additions of a political nature, or drugs, or guns, or terrorism (as defined by whatever government sees fit, as with the Muslim minority in China)... With this stuff you always have to think of the end game, not the starting point, and what the worst abuses could be. That's what the EFF is pointing out.
 

Rigby

macrumors 603
Aug 5, 2008
6,234
10,177
San Jose, CA
They've already told you what they're doing, but apparently you don't believe them.
Actually no, they haven't. There are tons of critical details missing in the documentation they have released so far, both technical and in terms of what policies are applied e.g. to prevent abuse. A lot of it is intentionally being kept secret, which prevents any serious review and means there is no accountability. This is not acceptable for such a drastic step.

I do. That's the fundamental difference between us here.
I think you don't know what you are talking about, or perhaps haven't thought through what this new approach means. Too many Apple fans are blindly defending everything they do.
 
  • Love
Reactions: peanuts_of_pathos

macbookguy

Cancelled
Dec 28, 2006
348
363
Some of this is above my head as well... but again, I just don't see the difference. Unless I'm mistaken, they're scanning every photo on my device to see if there is a hash that matches a hash on the list they're going to put on my device with iOS 15. If there are enough matches, they'll then flag the matches and have them transmitted via a safety voucher system.

From what I can tell from all of the FAQ articles I've read on this, the hashes only get sent if the photo or photos are also queued to upload to iCloud. But that's always subject to change, with or without notice. And since we don't really know the hashes downloaded or the actual content attached to them, all it takes is someone to add some hashes for political memes or linked to ideology, etc.
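In other words, as those FAQs describe it, the gating would amount to something like this (purely hypothetical names, just restating the documented behavior rather than actual iOS code):

```python
# Hypothetical gating logic for the behavior described above: matching only
# applies to photos actually queued for iCloud upload. All names are made up.
from dataclasses import dataclass
from typing import List

@dataclass
class Photo:
    identifier: str
    queued_for_icloud: bool

def photos_subject_to_matching(library: List[Photo],
                               icloud_photos_enabled: bool) -> List[Photo]:
    """Nothing is hashed/matched (or sent) unless iCloud Photos is on and the
    photo is queued for upload -- per the FAQ behavior, which could change."""
    if not icloud_photos_enabled:
        return []
    return [p for p in library if p.queued_for_icloud]
```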
 
  • Like
Reactions: jntdroid

Rigby

macrumors 603
Aug 5, 2008
6,234
10,177
San Jose, CA
But they aren't investigating it on your phone. All that's happening is illegal images are being marked - no investigation happens until you upload a certain number of those to iCloud.
And you don't even know what that number is, because they are keeping it a secret ...

How are they undermining E2E encryption? Any images (illegal or not) sent to iCloud servers can already be read by Apple since they are technically the recipient you're sending the images to.
You have a very shortsighted view. At first they are only applying the hash matching to the iCloud Photo library, which isn't end-to-end encrypted. But that's precisely why it will likely not end there: if you think about it, it doesn't make sense to go to the effort of implementing this on-device system for something that isn't E2E encrypted (because it's much easier and less "creepy" from the marketing perspective if they can do it in the cloud). They will almost certainly apply it to E2E encrypted services going forward, perhaps existing ones like iMessage or new ones. What they have implemented is an "exceptional access" system of the kind that certain politicians have demanded for years in order to undermine E2E encryption.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
This will be my last comment on this thread or any other thread about this, at least for a long while. We're just going in circles now and names and put-downs are being exchanged by many (including myself), which isn't really helpful. I just take the position of "innocent until proven guilty," and many of you take the opposite. Let's face it - the technology here is probably beyond the full understanding of 99.9% of the members of this forum. I think because it is so complicated, it's a subject ripe for misinterpretation, misinformation, conspiracy theories, fear-mongering, etc.
 

icanhazmac

Contributor
Apr 11, 2018
2,599
9,891
We're just going in circles now and names and put-downs are being exchanged

Like this comment below?

Let's face it - the technology here is probably beyond the full understanding of 99.9% of the members of this forum. I think because it is so complicated, it's a subject ripe for misinterpretation, misinformation, conspiracy theories, fear-mongering, etc.

Nothing is a bigger fail than exiting a dialogue while backhandedly insulting 99.9% of the members you were engaged with. If you wanted to exit the conversation with any shred of tact or grace, you would have stopped with this:

This will be my last comment on this thread or any other thread about this, at least for a long while. We're just going in circles now and names and put-downs are being exchanged by many (including myself), which isn't really helpful. I just take the position of "innocent until proven guilty," and many of you take the opposite.

But you didn't, you just had to exit with an insult. Says more about you than any valid point(s) you may have had.
 
  • Like
Reactions: peanuts_of_pathos