
Mercury7

macrumors 6502a
Oct 31, 2007
738
556
Again, that would be LESS PRIVATE.
You genuinely don't get that we don't want this code on our phone… That's the entire point: the code collecting data is itself the invasion of privacy. Not sure why that's so hard to understand. It doesn't matter if you call it spyware or anything else; if you agree to run iOS 15, then you're agreeing to have Apple collect data on your device.
 

RamGuy

macrumors 65816
Jun 7, 2011
1,354
1,918
Norway

I don't really see how this helps any of us as end-users. Sure, it will most likely result in someone figuring out if Apple is not doing what they claim, which makes it more difficult for Apple to say one thing and do the opposite. But there are still no tools for me as an end-user to really know what Apple is doing here, other than trusting their word and hoping that if they aren't true to it, someone else will figure it out and hold them accountable.

On macOS I will most likely be able to utilise Terminal and Activity Monitor, or install third-party software to keep track of things, and even block the scanning, I suppose.

None of this really matters. The whole problem is that this is happening on-device, meaning that we as users lose control. When you scan on-device, there is nothing stopping Apple from scanning whatever they want. Sure, I do trust Apple as a company. But they are a publicly traded company; if markets like China, Russia, or India start putting pressure on Apple to utilise these on-device capabilities to scan for other material, they will most likely be forced to comply.

Why would they create on-device tools like this? From a privacy perspective it makes no sense, and it opens a huge can of worms.
 
  • Like
Reactions: BurgDog

tylersdad

macrumors regular
Jul 26, 2010
200
520
The updated terms you agree to when you install an update. If that's too much legal mumbo jumbo, then you could have relied on articles on the internet to tell you this.
So I'm supposed to seek out information that I couldn't possibly know even exists, or know that I NEEDED to seek out in the first place?

Makes total sense.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
No, this is a new feature. Please provide evidence (not random blog articles where someone makes assertions) that Apple has been scanning photos on-device since 2019 without users being informed.
I made no such claim. I was responding to a different comment which claimed Apple has been doing this since 2019.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I made no such claim. I was responding to a different comment which claimed Apple has been doing this since 2019.

You sure seemed to be in agreement with it or at least leaning towards it being correct (you even ended your post with an assertion: "This is spyware" that wasn't connected to an "if"). I'd suggest not buying into such assertions without solid evidence to back them up.
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
Gotta love people who quote the law without reading it:

“shall, as soon as reasonably possible after obtaining actual knowledge of any facts or circumstances described in paragraph (2)(A), take the actions described in subparagraph (B)”

In case you missed that, “after obtaining actual knowledge” does not mean “must proactively go look for illegal content.”

And a lawyer could argue that the machine learning features present in iOS since 2019 can easily detect illegal content, and that Apple is actively ignoring those "apparent violations". You can search your photos for "child" or "urinal", and GPS can show a public bathroom. A lawyer could frame that as an "apparent violation" Apple knows about through its own on-device software, and therefore this law applies.

Look at how that turned out. I actually did read it.
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
You genuinely don't get that we don't want this code on our phone… That's the entire point: the code collecting data is itself the invasion of privacy. Not sure why that's so hard to understand. It doesn't matter if you call it spyware or anything else; if you agree to run iOS 15, then you're agreeing to have Apple collect data on your device.

You already have code that tells your iPhone to back up your entire device.
And your entire backup can be decrypted server-side, including your messages and photos.

Why aren't you telling Apple to remove iCloud backups from your device? Apple could at any point in time force your iPhone to back up to iCloud so they can decrypt it.
 
  • Like
Reactions: DanielDD

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
On macOS I will most likely be able to utilise terminal and activity monitor, or install third-party software to keep a track of things, and even block the scanning I suppose.
No, you won't. At least not without a significant amount of reverse engineering involved. Don't expect an end-user-friendly app for this. Apple already implemented a way for their own traffic to bypass user-installed network filters, so it can't be blocked on the system itself and requires an external firewall. Things are likely to get worse from here on out.
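To be fair, you can still see which Apple daemons are talking to the network with stock command-line tools; you just can't see what they're actually scanning, and you can't block them with an on-device firewall. A rough sketch of that kind of monitoring, assuming macOS's stock `lsof` (the daemon names here are only examples, not a claim about which processes are involved in the scanning):

```python
#!/usr/bin/env python3
# Rough sketch: list open network connections for a couple of Apple daemons
# using the stock `lsof` tool on macOS. This only shows *that* a process is
# talking to the network, not what it is doing, and it cannot block anything.
import subprocess

# Example daemon names only, not a claim about which processes do the scanning.
WATCHED = ("cloudd", "photoanalysisd")

def connections() -> list[str]:
    out = subprocess.run(
        ["lsof", "-nP", "-i"],  # -n/-P skip DNS and port-name lookups, -i lists internet sockets
        capture_output=True,
        text=True,
        check=False,
    ).stdout
    return [line for line in out.splitlines() if line.startswith(WATCHED)]

if __name__ == "__main__":
    for line in connections():
        print(line)
```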
I don't really see how this helps any of us as end-users.
Well, the end user won't need the help, because the end user likely doesn't understand how it works on a technical level to begin with. What the end-user needs is to have this removed from the device (which likely can't be done by the end-user), or better, to not have it on the device in the first place (and that's how it should be).
Why would they create on-device tools like this? From a privacy perspective it makes no sense, and it opens a huge can of worms.
And that's the whole point. CSAM is the excuse to bring this on-device, because surely no one wants to look like they're defending CSAM by speaking up against it. It's for the children, remember. They're laying the foundation for much more here, and once that's used, we will really have to worry. There's no reason at all, from a technical point of view, that this has to be on-device. None.
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
So I'm supposed to seek out information that I couldn't possibly know even exists, or know that I NEEDED to seek out in the first place?

Makes total sense.

The information is there in the long scrolling box before you click agree. How that is something you "couldn't possibly know even exists" is beyond me.

You are supposed to do some research before you update to the next major version of the software. If you blindly click update and click "agree" to the terms without understanding the implications, the blame is on you if you suddenly realize the update is not what you expected it to be.

If you don't like this, don't update your phone. It's that simple.

Also, blindly clicking "agree" on your shiny new latest-and-greatest Android phone isn't going to be any more "private" than your iPhone.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,076
2,656
Why aren't you telling Apple to remove iCloud backups from your device? Apple could at any point in time force your iPhone to back up to iCloud so they can decrypt it.
Apple can't decrypt E2EE data. Anything that is not E2EE, sure, but nothing else. On top of that, it would be illegal for them to do it.
 
  • Like
Reactions: 09872738

mzeb

macrumors 6502
Jan 30, 2007
358
612
This is one German politician saying this, you’re talking about a law that an entire country passed. So it wouldn’t surprise me if this politician disagrees with the law you mention.
For me this isn't about whether or not I trust Apple. I trust Apple, otherwise I wouldn't own my iPad or iPhone. This is about believing that the information on my device that I paid for is MY information and Apple has no right to scan it, regardless of the privacy protections they believe they've put in place.

Give me a system and I'll find a way to corrupt it. Just tell me the rules.

And that is my biggest fear here.
Tell me about it. I've spent a good chunk of my career in white-hat security testing. I broke systems for a living, so I know nothing is truly secure.
 
  • Like
Reactions: tylersdad

ScottHammet

macrumors regular
Jul 22, 2011
134
89
they can already do that
If you're implying that Apple already provides a mechanism for a government agency to supply a digital fingerprint of an image for a random POI and have it searched against all iCloud photos, then that's interesting. Riddle me this, though...if they can already do it, then why bother creating a PR mess by announcing a new, identical capability?
 

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
You already have code that tells your iPhone to back up your entire device.
And your entire backup can be decrypted server-side, including your messages and photos.

Why aren't you telling Apple to remove iCloud backups from your device? Apple could at any point in time force your iPhone to back up to iCloud so they can decrypt it.
So you're saying that it does not matter whether you have iCloud enabled or not when iOS 15 arrives; after they scan on your device, they can just send a command to upload it to check for illegal content… Just to be clear, I assume you know this and are not just spouting b.s.… You're saying Apple has code on your phone to upload your data even when you have it turned off?
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
You sure seemed to be in agreement with it or at least leaning towards it being correct (you even ended your post with an assertion: "This is spyware" that wasn't connected to an "if"). I'd suggest not buying into such assertions without solid evidence to back them up.
"This is spyware". The feature that is being delivered by iOS 15...the one we've all been talking about for well over a week now...is spyware.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
If you're implying that Apple already provides a mechanism for a government agency to supply a digital fingerprint of an image for a random POI and have it searched against all iCloud photos, then that's interesting. Riddle me this, though...if they can already do it, then why bother creating a PR mess by announcing a new, identical capability?

Apple couldn’t have a new process running on everyone’s phone in iOS 15 and scanning their files without first announcing it.

No matter how bad this has been for Apple, it would have been 100 times worse if it had been discovered and announced by a security company.

CSAM is just Apple’s best hope of attempting to justify this new system.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
The information is there in the long scrolling box before you click agree. How that is something you "couldn't possibly know even exists" is beyond me.

You are supposed to do some research before you update to the next major version of the software. If you blindly click update and click "agree" to the terms without understanding the implications, the blame is on you if you suddenly realize the update is not what you expected it to be.

If you don't like this, don't update your phone. It's that simple.

Also, blindly clicking "agree" on your shiny new latest-and-greatest Android phone isn't going to be any more "private" than your iPhone.
Weak. Very weak.
 
  • Like
Reactions: 09872738

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
If you're implying that Apple already provides a mechanism for a government agency to supply a digital fingerprint of an image for a random POI and have it searched against all iCloud photos, then that's interesting.

Apple already provides governments with a mechanism for full access to your iCloud photo library; they can run any number of fingerprints against your data to find what they need.

Riddle me this, though...if they can already do it, then why bother creating a PR mess by announcing a new, identical capability?

Because scanning every single photo on-device is factually more private than decrypting every single photo in iCloud to scan it. iCloud photos generally stay encrypted for most customers.
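To make the difference concrete, here's a toy sketch (this is not Apple's actual NeuralHash/private-set-intersection protocol, and it uses an ordinary cryptographic hash instead of a perceptual one): with on-device matching, the only per-photo output that would leave the phone is an opaque voucher, while the server-side alternative requires every photo to be held in decrypted form and inspected.

```python
# Toy illustration only: the real system reportedly uses a perceptual hash
# (NeuralHash) and hides even the match result from the server until a
# threshold is crossed.
import hashlib

# Hypothetical database of fingerprints of known images.
KNOWN_HASHES = {hashlib.sha256(b"example-known-image").hexdigest()}

def on_device_check(photo: bytes) -> dict:
    """Runs on the phone: the photo itself never has to be readable by the server."""
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    return {
        "matched": matched,                  # in the real design this bit is itself cryptographically hidden
        "voucher": b"<encrypted metadata>",  # placeholder: only meaningful past the match threshold
    }

def server_side_check(decrypted_photo: bytes) -> bool:
    """The alternative: the server must hold every photo in decrypted form to scan it."""
    return hashlib.sha256(decrypted_photo).hexdigest() in KNOWN_HASHES
```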
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
And that's the whole point. CSAM is the excuse to bring this on-device, because surely no one wants to look like they're defending CSAM by speaking up against it. It's for the children, remember.

All the discussions on this forum filled with people speaking up against it, along with politicians and others doing the same, sort of prove this theory is bunk. It's not hard to qualify objections to make it clear the objector doesn't approve of CSAM itself. Now, I think those objections are baseless, but that's beside the point. If Apple wanted to spy on users, they wouldn't announce anything - not even a "cover story". They'd just do it in an underhanded manner.

They're laying the foundation for much more here, and once that's used, we will really have to worry.

That's simply in your imagination (unless you have some sort of proof that Apple is planning something else).

There's no reason at all, from a technical point of view, that this has to be on-device. None.

Did Apple or anyone else claim there was a technical reason it had to be done on the device? The issue here is privacy. By scanning on the device (which Apple doesn't have access to), all scanning data is hidden from Apple unless two things are true: (1) you have iCloud Photos turned on, and (2) you upload 30 or more CSAM images to iCloud.
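As a toy sketch of that threshold condition (the real design reportedly uses threshold secret sharing, so Apple literally cannot decrypt the vouchers below the limit; here the gate is just modeled with a plain counter):

```python
# Toy model of the match threshold: nothing becomes reviewable until the
# number of matching "safety vouchers" for an account reaches the threshold.
THRESHOLD = 30  # the figure discussed above

class VoucherStore:
    def __init__(self) -> None:
        self.vouchers: list[bytes] = []

    def add(self, voucher: bytes) -> None:
        self.vouchers.append(voucher)

    def reviewable(self) -> bool:
        # Below the threshold the server learns nothing from the vouchers.
        return len(self.vouchers) >= THRESHOLD

store = VoucherStore()
for _ in range(THRESHOLD - 1):
    store.add(b"opaque voucher")
assert not store.reviewable()  # 29 matches: still nothing revealed
store.add(b"opaque voucher")
assert store.reviewable()      # the 30th match crosses the threshold
```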
 
  • Disagree
Reactions: 09872738

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
So you're saying that it does not matter whether you have iCloud enabled or not when iOS 15 arrives; after they scan on your device, they can just send a command to upload it to check for illegal content… Just to be clear, I assume you know this and are not just spouting b.s.… You're saying Apple has code on your phone to upload your data even when you have it turned off?

Why is it that your hypothetical worst-case scenario can come true but not mine? You're the one worried about the CSAM "code" being on your device even with iCloud Photos turned off, so you should also be worried about the backup code being on your device even with iCloud Backup turned off. Double standard, it seems.

What's "b.s." is you believing that Apple would actually search willy nilly whatever they want with this tech. I don't believe that, therefore I'm fine with CSAM on device detection AND iCloud backups code being on my device. So, to answer your question, no that's not what I'm saying, but that's what you're essentially saying.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
"This is spyware". The feature that is being delivered by iOS 15...the one we've all been talking about for well over a week now...is spyware.

But it's not. By definition, spyware is malicious software installed without your knowledge (and thus without your consent). That obviously does not apply here. You're simply misusing that term to make things sound dramatic, which is a form of an appeal to emotion fallacy. People hear "spyware" and immediately think "awful! violation! illegal!" etc.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
Because scanning every single photo on-device is factually more private than decrypting every single photo in iCloud to scan it. iCloud photos generally stay encrypted for most customers.
It may be factually more private, but that does not make it _actually_ private...which it's not.
 
  • Like
Reactions: 09872738

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
But it's not. By definition, spyware is malicious software installed without your knowledge (and thus without your consent). That obviously does not apply here. You're simply misusing that term to make things sound dramatic, which is a form of an appeal to emotion fallacy. People hear "spyware" and immediately think "awful! violation! illegal!" etc.
Awful violation… nailed it. Illegal? No, because you are giving them permission. Any way you slice it, it's still spyware on your phone, there exclusively for the purpose of looking for illegal content.
 
  • Like
Reactions: BurgDog