
farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
That is not correct. iOS decides when to back up to iCloud. If iCloud backups are turned on, iOS will only try to back up if there is adequate space, and you are connected to power and Wi-Fi for at least 10 minutes with the screen locked.

The user can manually force a backup without meeting those conditions, but Apple cannot at "any point in time force your iPhones to backup."
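In effect, the auto-backup decision is just a conjunction of those conditions. A minimal sketch with made-up names (this is not Apple's code, just the logic described above):

```python
# Illustrative only: hypothetical names, not Apple's implementation.
from dataclasses import dataclass

@dataclass
class DeviceState:
    icloud_backup_enabled: bool
    adequate_icloud_space: bool
    connected_to_power: bool
    on_wifi: bool
    minutes_locked: int

def should_auto_backup(s: DeviceState) -> bool:
    # iOS attempts an automatic backup only when every condition holds;
    # a manual backup skips these checks, but only the user can start it.
    return (s.icloud_backup_enabled
            and s.adequate_icloud_space
            and s.connected_to_power
            and s.on_wifi
            and s.minutes_locked >= 10)
```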

This is the problem when you jump into the middle of a conversation without understanding it fully.

Mercury7's main concern is that the code could be collecting data without the user's consent. If that is true (I'm saying it's not, but for the sake of argument), then it's entirely possible for Apple to just collect your backups without the user's consent, which is a far more problematic issue than the CSAM system collecting more than just child porn.

My argument operates under the assumption that Mercury7's condition is true.
 

amartinez1660

macrumors 68000
Sep 22, 2014
1,587
1,622
According to another site, Reddit users have found the hash algorithm in iOS 14.3, and:

"For example, one user, dxoigmn, found that if you knew the resulting hash found in the CSAM database, one could create a fake image that produced the same hash. If true, someone could make fake images that resembled anything but produced a desired CSAM hash match. Theoretically, a nefarious user could then send these images to Apple users to attempt to trigger the algorithm."

If this is true then I hope Apple finds a solution for this before rollout.

But even if this one is not true, we can probably expect a number of other problems to be discovered and perhaps lives ruined after rollout before they are fixed.
Damn man, excuse me if you have many replies to this same message, but this specific example needs all the awareness…

And I don’t get it. Apple have said it themselves: creating a back door leaves it exploitable by both good and bad actors.

No one (99.999999%) is against fixing the issues this tries to tackle, nor against stopping mass shootings or other horrible events… but if someone can “trap” anyone else with random images that contain a tagged hash, then what? I can even see a sort of random attack with compromised accounts: “send this amount of money or I’ll flood your iCloud account with the wrong kind of pictures”, or they could even just manipulate the existing ones just enough to get to a required hash hit. Getting into the really scary realm now.
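To see the shape of the attack dxoigmn describes, here is a toy illustration. It uses a deliberately weak 16-bit threshold hash, not Apple's NeuralHash, and the pixel values are made up; it only shows why a known target hash is dangerous: you can construct an unrelated image that matches it.

```python
# Toy second-preimage demo against a deliberately weak perceptual hash.
# This is NOT NeuralHash; it only illustrates the claimed attack shape.

def threshold_hash(pixels, cutoff=128):
    """One hash bit per pixel: 1 if the pixel is brighter than the cutoff."""
    return sum(1 << i for i, p in enumerate(pixels) if p > cutoff)

# Hash of an image assumed to be in the database (hypothetical values).
known_image = [200, 13, 255, 90, 140, 7, 30, 220,
               129, 1, 128, 250, 64, 130, 99, 180]
target = threshold_hash(known_image)

# Forge a visually unrelated image with the same hash: any bright value
# where the target bit is 1, any dark value where it is 0.
forged = [255 if (target >> i) & 1 else 0 for i in range(16)]

assert threshold_hash(forged) == target
print(f"target hash:  {target:016b}")
print(f"forged image: {forged}")
```

A real perceptual hash is far harder to invert; the quoted claim is that NeuralHash nevertheless admits an analogous construction.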
 

DanielDD

macrumors 6502a
Apr 5, 2013
524
4,447
Portugal
No one (99.999999%) is against fixing the issues this tries to tackle, nor against stopping mass shootings or other horrible events… but if someone can “trap” anyone else with random images that contain a tagged hash, then what? I can even see a sort of random attack with compromised accounts: “send this amount of money or I’ll flood your iCloud account with the wrong kind of pictures”, or they could even just manipulate the existing ones just enough to get to a required hash hit. Getting into the really scary realm now.

Of course, that assumes that a bad actor knows one of the hashes. Let's assume for a minute that a hash contains 30 alphanumeric characters (which is an unrealistically low number). You'd need roughly 2.15e+28 years to find one of them by brute force.
And even that is theoretical. The system is designed in such a way that the device never knows if there is a match; that only happens on the server.
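As a back-of-the-envelope check on the order of magnitude (the alphabet size and guess rate below are illustrative assumptions, not necessarily the ones behind the figure above):

```python
# Rough brute-force estimate; alphabet and guess rate are assumptions.
alphabet = 36             # digits + lowercase letters
length = 30               # characters per hash, per the post above
guesses_per_second = 1e9  # a generous attacker

search_space = alphabet ** length
years = search_space / guesses_per_second / (60 * 60 * 24 * 365.25)

print(f"search space: {search_space:.3e} candidates")
print(f"time to exhaust it: {years:.3e} years")
```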
 
  • Like
Reactions: Mattewes

Mac4Mat

Suspended
May 12, 2021
168
466
Yep, this is a classic Trojan horse: an attractive gift (stop kiddie porn) with a hidden stinger (an on-device general surveillance tool). And don't worry, all those doors in the belly of the horse have locks on the inside so nobody can exit.
Will Saudis or Hungarians or Russians get a refund if they don't like the policy in their country after they bought an iPhone 13?
No, I didn't think so.
The idea Apple won't bend to government pressure seems at odds with its dealings with China. China insisted data was kept in China. Apple complied.

Today we have Apple complying again.


So when Apple give assurances, it means nothing if the country concerned decides to implement laws/restrictions that Apple agree to comply with.

Just an excerpt:
"Within mainland China, we found that Apple censors political content, including broad references to Chinese leadership and China's political system, names of dissidents and independent news organisations, and general terms relating to religions, democracy, and human rights," it says.

So how much confidence can we have that embedded software won't be used to do the same worldwide?

It gives such easy potential for updates based on any country's requests/demands.

Wonder what Afghanistan's Taliban government, when formed, will demand.

What will the FBI require in the US, etc.?

What will the EU require, when it's already threatening major fines over the App Store, etc.?

With this 'backdoor', to use others' term, and with individual software on users' HARDWARE, how easy would it be to modify it to the requests of any government, or face losing the right to sell Apple equipment in that country, or face fines under competition rules, etc.? It even gives the potential for INDIVIDUAL software modification, because it's on individuals' own hardware.
 

cola79

macrumors 6502
Sep 19, 2013
380
437
What I don't get is why there are still people who defend this.

Apple is a private company and not the police, the FBI or the state attorney. They have no right to put their users under 24/7 surveillance.

Are the users allowed to scan all internal Apple emails to look for tax crimes, spying on competitors, or whether they are treating their workers correctly? I would like to see the Foxconn mails too.

See what you do when you defend Apple in this. You devalue yourself to a lower being that is under the control of a thing. Apple is no god or person, it's a stupid company and just that!
 

Ethosik

Contributor
Oct 21, 2009
7,820
6,724
I think many are missing the point still: we don’t care about companies or law enforcement scanning anything we upload to the internet; it’s the putting it on our hardware we object to. It’s just a deal breaker for some of us…. Obviously it does not matter to many, as witnessed by the dozens of threads talking about it….. however, some of us will not accept it, period…. Whether we can just cut off iCloud to feel safe or leave Apple altogether is a question unanswered, but it’s one many of us are watching…. Yesterday I decided not to upgrade to iOS 15; this morning I discovered they had already put the spyware on my phone

But it’s ONLY for images that are being uploaded.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Thank you. As someone who loves technology (I worked at an Apple Store for a little while) and who is also using these tools as a conservative pastor, I’ve tried to think through these things logically… I even wrote a paper for one of my master’s degree classes called ‘a biblical theology of technology.’
I am not religious, but the one thing that strikes me is that Apple has thought this through at a technological level, but not at a moral or ethical level. They should have consulted more widely than they appear to have done.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
Well, I could've sworn that's what I had read from multiple sources, but other cloud services definitely have been. Even if Apple hasn't, they always could have if they wanted to. That's really my point - that these CSAM detection features in iOS 15 aren't any sort of game changer in terms of what Apple COULD do if they wanted to. In fact, the whole point of the new CSAM detection process is to be as non-invasive as possible whilst still being able to detect CSAM uploaded to iCloud.

I understand that side of things, but I define 'invasiveness' as executing processes on a machine I own without my approval. Of course, I could turn off iCloud, but then I'll lose functionality that I like. So I am given a choice: privacy or function. It's so Google-esque. I understand Apple's motivations for doing this, but I think they either fail to understand personal boundaries or don't care.

Please explain exactly how iOS 15 is being magically installed on your device without your knowledge (and forgetting you have auto-update on doesn't count, as that's something under your control). You are misusing that term to try to make things sound more dramatic than they are. Plain and simple. Stop (you and everyone else doing the same thing).

I define spyware as software used for the purpose of spying. I do not accept that knowledge of the spying is a defining feature. And there are multiple definitions out there, many of which focus on spying rather than covertness.

And for the millionth time, NOTHING IS BEING SCANNED on your phone if you turn off iCloud for photos, and even if you don't, no scanning information is leaving your phone unless you're uploading illegal images to iCloud, and even THEN Apple can't decrypt that info unless the detected CSAM image threshold (30) is met.

You clearly do not understand the potential for false positives, which Apple admits. Nor do you acknowledge that false positives being decrypted and examined by an Apple employee would be a crass invasion of privacy. We do not know what features are being extracted by the hash process, nor do we know what the false positives look like in comparison to the original. Moreover, 30 sounds like a lot of pictures until you realise people often take multiple pictures of the same scene that look similar. Don't assume the pictures people take are randomly related to each other. Moreover, the more pictures you store on iCloud, the greater the chance of hitting 30 false positives. Thirty is an arbitrary number in any case, and this high value tells you just how much Apple lacks confidence in the matching procedure to reject potential false positives.
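To put the "more pictures, more chances" point in concrete terms: if each photo independently carried some tiny false-positive probability, the chance of an account crossing the 30-match threshold grows with library size. The rate below is made up purely for illustration; Apple has published no per-image figure.

```python
# Toy binomial model of account-level false positives.
# p_fp is a made-up illustrative rate, not a published figure.
from math import comb

def p_account_flagged(n_photos: int, p_fp: float, threshold: int = 30) -> float:
    """P(at least `threshold` false positives among n_photos)."""
    p_below = sum(comb(n_photos, k) * p_fp**k * (1 - p_fp)**(n_photos - k)
                  for k in range(threshold))
    return 1.0 - p_below

# Probabilities below float precision (~1e-16) print as 0.
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} photos -> P(flagged) ~ {p_account_flagged(n, p_fp=1e-3):.3e}")
```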

Again, it's not spyware, and "probable cause" has nothing to do with this topic since Apple is a private entity and you're voluntarily using their software, which is merely licensed to you (not owned by you).

Would you feel more comfortable if I said 'software for spying'? And my point about probable cause is based on (1) the right to privacy being so important in our society that it is mentioned in the Constitution, and (2) supposing nobody is found to have CSAM through this process and it throws up nothing but false positives - would we still find the degree of surveillance acceptable given that it didn't actually catch anyone?
 

xpxp2002

macrumors 65816
May 3, 2016
1,154
2,727
I define spyware as software used for the purpose of spying. I do not accept that knowledge of the spying is a defining feature. And there are multiple definitions out there, many of which focus on spying rather than covertness.
This here. The practical definition of "spyware" is software that spies on its user and reports back to someone else. Doesn't have to be covert. Doesn't even have to be without their consent. If it collects data that the user would prefer to keep private and phones it home, it's spyware.

This will be controversial, but by this definition, Google Chrome is spyware. And I agree that it should be classified as such. With the addition of this CSAM hash scanner in iOS, it now contains built-in spyware. You can't even see which photos were flagged as "CSAM" but those photos and their hashes will be sent somewhere else without your knowledge once that threshold is triggered. That's spyware.
 
  • Like
Reactions: VulchR

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
This here. The practical definition of "spyware" is software that spies on its user and reports back to someone else. Doesn't have to be covert. Doesn't even have to be without their consent. If it collects data that the user would prefer to keep private and phones it home, it's spyware.

This will be controversial, but by this definition, Google Chrome is spyware. And I agree that it should be classified as such. With the addition of this CSAM hash scanner in iOS, it now contains built-in spyware. You can't even see which photos were flagged as "CSAM" but those photos and their hashes will be sent somewhere else without your knowledge once that threshold is triggered. That's spyware.

Perhaps Apple's action will tilt the generally accepted definition to the software's function rather than its covertness. Or maybe it will create a new verb:

apple (/ˈap(ə)l/) noun - the edible fruit from an apple tree; verb – to spy, e.g., to apple on a country
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I understand that side of things, but I define 'invasiveness' as executing processes on a machine I own without my approval.

But you DO approve it by agreeing to the terms of the software update and enabling iCloud for photos. If you don't want this functionality, either continue to use iCloud on iOS 14 and don't upgrade to iOS 15, or upgrade to iOS 15 and disable iCloud for photos. And Apple doesn't need your approval to add features to iOS. They own the software.

I define spyware as software used for the purpose of spying. I do not accept that knowledge of the spying is a defining feature. And there are multiple definitions out there, many of which focus on spying rather than covertness.

But it's not spying if Apple has told you this is what they're doing. You don't get to redefine words as you please. Inherent to the definition of "spying" is that the person doing the spying intends not to be discovered, obviously. And they're not even simply "observing" you either, because none of that scanning data leaves the secure environment of your phone unless you upload 30 or more detected CSAM images to iCloud.

You clearly do not understand the potential for false positives, which Apple admits. Nor do you acknowledge that false positives being decrypted and examined by an Apple employee would be a crass invasion of privacy. We do not know what features are being extracted by the hash process, nor do we know what the false positives look like in comparison to the original. Moreover, 30 sounds like a lot of pictures until you realise people often take multiple pictures of the same scene that look similar. Don't assume the pictures people take are randomly related to each other. Moreover, the more pictures you store on iCloud, the greater the chance of hitting 30 false positives. Thirty is an arbitrary number in any case, and this high value tells you just how much Apple lacks confidence in the matching procedure to reject potential false positives.

Nonsense. They are simply erring on the side of caution. If a seasoned climber uses safety gear, does that mean they lack confidence in their climbing ability? Of course not. Apple has backup systems at presentations in case something goes wrong when demoing on the main machine. Does that mean Apple lacks confidence in their hardware or software? Nope. Apple has already stated the chance of a single account being falsely flagged by 30 false positives is less than 1 in 1 trillion per year. And even IF that extremely unlikely event happened, the false positives would be promptly identified as such and nothing ill will befall you.

Would you feel more comfortable if I said 'software for spying'? And my point about probable cause is based on (1) the right to privacy being so important in our society that it is mentioned in the Constitution, and (2) supposing nobody is found to have CSAM through this process and it throws up nothing but false positives - would we still find the degree of surveillance acceptable given that it didn't actually catch anyone?

I'd feel more comfortable if you'd simply call it what it is: a secure scanning process that happens on the device in order to thwart the spread of CSAM on iCloud whilst maintaining user privacy to the greatest extent possible. Also, the Constitution is between the government and the people, not between private companies and their customers. But again, Apple IS valuing privacy here. That's one of their core values and it's the whole reason why they're implementing this in iOS 15 vs. scanning on the Cloud where they have access to ALL that data.
 

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
Decrypting and processing every single photo server-side is factually less private than just processing every single photo on each user’s device. This is objectively true.

We’re done.
This is a lie. Processing every photo in iCloud limits the reach to uploaded photos; processing in iOS has unlimited reach over your device's data. And the iCloud data are not end-to-end encrypted.
 
  • Like
Reactions: xpxp2002

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
That's what they say. Never trust any company. And that is not even the point. The point is that it's snooping in your phone with unknown code built into iOS.

Yet I haven't seen any rational basis on which to assume Apple is lying here. If they were up to no good, why would they even draw attention to the feature at all? They could just have iOS execute code in the background without documenting it. Unless you have actual evidence that Apple is being dishonest here, your assertion is no more valid than me calling Tim Cook a rapist (or even implying he might be).

Also, no "snooping" is going on here. "Snooping" implies someone is looking through all your stuff, but this method of scanning ensures Apple sees NOTHING except data attached to hash-matched illegal images.
 
  • Angry
Reactions: KindJamz

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
Yet I haven't seen any rational basis on which to assume Apple is lying here. If they were up to no good, why would they even draw attention to the feature at all? They could just have iOS execute code in the background without documenting it. Unless you have actual evidence that Apple is being dishonest here, your assertion is no more valid than me calling Tim Cook a rapist (or even implying he might be).

Also, no "snooping" is going on here. "Snooping" implies someone is looking through all your stuff, but this method of scanning ensures Apple sees NOTHING except data attached to hash-matched illegal images.
No. I doubt they could sneak in the hash library and the code; sooner rather than later someone would find out, and the blowback would be massive, 100x bigger than it is now. Well… we have a different definition of snooping. As I understand it, it is checking the content on my device without my consent.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
No. I doubt they could sneak in the hash library and the code; sooner rather than later someone would find out, and the blowback would be massive, 100x bigger than it is now. Well… we have a different definition of snooping. As I understand it, it is checking the content on my device without my consent.

I have a feeling Apple would have ways to mask it so that only the most tech savvy would be able to deduce what was happening, and even then they could cover their butts by burying it in legalese in the software agreement (without making PR statements about it to draw attention to it). But whatever - my point is I see no evidence of dishonesty here. If you have evidence, let's have it.

If you install iOS 15 and use iCloud for photos, you ARE consenting. No one's holding a gun to your head or hiding what this version of iOS is doing. What you REALLY mean is, "My photos are being scanned without my LIKING it, even though I consented to it." And again, Apple is seeing NONE of that data except for illegal images uploaded to iCloud. They're not seeing photos of you and your family at the beach or anything like that, LOL! Unless you're a criminal, they see nothing. Not sure why people can't get this after ALL this discussion and explanation. Just asserting "Well they're lying about how it works"--without evidence to support that assertion--is neither rational nor helpful.
 
  • Disagree
Reactions: xpxp2002

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
I have a feeling Apple would have ways to mask it so that only the most tech savvy would be able to deduce what was happening, and even then they could cover their butts by burying it in legalese in the software agreement (without making PR statements about it to draw attention to it). But whatever - my point is I see no evidence of dishonesty here. If you have evidence, let's have it.
Do you really think that if they sneaked it in, it would not be a major issue for them once discovered? I have no doubt that sooner or later it would be.
If you install iOS 15 and use iCloud for photos, you ARE consenting. No one's holding a gun to your head or hiding what this version of iOS is doing. What you REALLY mean is, "My photos are being scanned without my LIKING it, even though I consented to it." And again, Apple is seeing NONE of that data except for illegal images uploaded to iCloud. They're not seeing photos of you and your family at the beach or anything like that, LOL! Unless you're a criminal, they see nothing. Not sure why people can't get this after ALL this discussion and explanation. Just asserting "Well they're lying about how it works"--without evidence to support that assertion--is neither rational nor helpful.
Again, misconception. I have no issue with Apple scanning on their servers; I have a major issue with doing it on device. And don't tell me about encryption... How then, per the documents that were cited today on MacRumors, do they know there is lots of porn on their servers if those servers are secure and encrypted?
The argument that if you are not a criminal they see nothing is so shallow it's not even worth commenting on….
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Do you really think that if they sneaked it in, it would not be a major issue for them once discovered? I have no doubt that sooner or later it would be.

If they were up to no good, they probably wouldn't care.

Again, misconception. I have no issue with Apple scanning on their servers; I have a major issue with doing it on device. And don't tell me about encryption... How then, per the documents that were cited today on MacRumors, do they know there is lots of porn on their servers if those servers are secure and encrypted?

So you'd rather them scan all your files on the server where they have complete control vs. on your device where they have no control? Color me confused.

What does files being secured and encrypted on iCloud have to do with lots of porn being there? It's there because they haven't scanned for it. And if you mean how did Apple identify porn if the files are encrypted - well, they hold the encryption keys. Apple's legal agreement for iCloud clearly indicates they have access to your files if needed. People keep quoting "What happens on your iPhone stays on your iPhone" but seem to forget iCloud is NOT "your iPhone" - never has been and never will be. If you don't want Apple having any possibility of access to your files, don't store them in iCloud.

The argument that if you are not a criminal they see nothing is so shallow it's not even worth commenting on….

It wasn't an argument; it was a statement of fact. Well, there's that less than 1 in 1 trillion chance per year that a non-criminal's account may be falsely flagged, but I'd advise against holding your breath waiting for that to happen.
 

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
If they were up to no good, they probably wouldn't care.
Obviously they don't care now.
So you'd rather them scan all your files on the server where they have complete control vs. on your device where they have no control? Color me confused.
Why? As you pointed out below, they probably do it now. I don't care. I don't use the iCloud service.
What does files being secured and encrypted on iCloud have to do with lots of porn being there? It's there because they haven't scanned for it. And if you mean how did Apple identify porn if the files are encrypted - well, they hold the encryption keys. Apple's legal agreement for iCloud clearly indicates they have access to your files if needed. People keep quoting "What happens on your iPhone stays on your iPhone" but seem to forget iCloud is NOT "your iPhone" - never has been and never will be. If you don't want Apple having any possibility of access to your files, don't store them in iCloud.
As mentioned above, I don't. And I'm OK with them scanning the files. Every cloud service comes with caveats. That is why I don't use them.
It wasn't an argument; it was a statement of fact. Well, there's that less than 1 in 1 trillion chance per year that a non-criminal's account may be falsely flagged, but I'd advise against holding your breath waiting for that to happen.
No. Statistics don't work like that. It's per hash, per photo.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I don't care. I don't use the iCloud service.

So then what's your issue? The scanning won't even be happening on your device in that case. It only happens if you enable iCloud for photos.

No. Statistics don't work like that. It's per hash, per photo.

Tell that to Apple if you think you know better than them. Here's their exact quote:
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
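For what it's worth, "threshold secret sharing" is a real cryptographic primitive. Here is a generic Shamir-style sketch of the idea (illustrative only; Apple's actual construction and parameters are different and not public in this form): the secret is unrecoverable from 29 shares but fully recoverable from 30.

```python
# Generic Shamir threshold secret sharing: a sketch of the idea only,
# not Apple's actual construction.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is mod PRIME

def make_shares(secret, threshold, n_shares):
    """Shares are points on a random degree-(threshold - 1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

secret = 123456789
shares = make_shares(secret, threshold=30, n_shares=40)
assert recover(shares[:30]) == secret   # 30 shares: secret recovered
assert recover(shares[:29]) != secret   # 29 shares: still hidden (w.h.p.)
print("30-of-40 threshold sharing works as advertised")
```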
 

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
So then what's your issue? The scanning won't even be happening on your device in that case. It only happens if you enable iCloud for photos.
I have principles. You might have heard of that.
Tell that to Apple if you think you know better than them. Here's their exact quote:
They know. I don't need to tell them. You are taking a PR blurb seriously...
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I have principles. You might have heard of that.

I have principles too, such as sticking with logic and not making baseless accusations or insinuations against people or companies without evidence.

They know. I don't need to tell them. You are taking PR blurb seriously...

It's not a "PR blurb" - it's their explanation of how the technology works. What's your concrete evidence that they're lying? Is everything Apple puts out for public consumption a lie, or do you just randomly choose which bits to label as such?
 