
tylersdad

macrumors regular
Jul 26, 2010
200
520
But it's not. By definition, spyware is malicious software installed without your knowledge (and thus without your consent). That obviously does not apply here. You're simply misusing that term to make things sound dramatic, which is a form of the appeal-to-emotion fallacy. People hear "spyware" and immediately think "awful! violation! illegal!" etc.
A rose by any other name is still a rose.
 

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
Why is it your hypothetical worst-case scenario can become true but not my hypothetical worst-case scenario? You're the one worried about the CSAM "code" being on your device regardless of iCloud Photos being off, so you should also be worried about the backup code being on your device regardless of iCloud Backup being off. Double standards, it seems.

What's "b.s." is you believing that Apple would actually search willy-nilly whatever they want with this tech. I don't believe that, therefore I'm fine with on-device CSAM detection AND the iCloud backup code being on my device. So, to answer your question, no, that's not what I'm saying, but that's what you're essentially saying.
I'm just quoting you: you said Apple had code on the phones to upload your data whenever they want… I didn't think so. I'm not concerned with the CSAM code; I'm concerned with any code designed to look for any content, illegal or legal, that I don't give permission for.
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
I'm just quoting you: you said Apple had code on the phones to upload your data whenever they want… I didn't think so. I'm not concerned with the CSAM code; I'm concerned with any code designed to look for any content, illegal or legal, that I don't give permission for.


And I'm saying if you turn off iCloud Photos, CSAM detection code will not be run.
 

Ethosik

Contributor
Oct 21, 2009
7,820
6,724
Uh no. Apple has to scan NOTHING. Implement full end-to-end encryption for all data between the phone and iCloud and be done with this nanny-state nonsense.

I believe they are legally required to search for it, as a cloud storage provider.
 

Ries

macrumors 68020
Apr 21, 2007
2,317
2,895
I can't believe all this misunderstanding and baseless paranoia! Does he not understand that Apple, Google, Microsoft, etc. are already scanning for CSAM? So if they wanted to search for other types of images instead, they could already do that. The only thing this new method does is make things MORE private by hiding all scanning data from Apple except that related to a sizable collection of CSAM being uploaded to their servers. If people are still paranoid about that and don't trust Apple, then they should immediately disable iCloud Photos.
Where does it stop? The next thing is going to be an AI analysing your HomeKit Secure Video recordings and flagging you. The next thing you know, three agencies are watching your every move because the AI was wrong.

Once you've been flagged innocently, will your government records show "Investigated for..."?
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
A rose by any other name is still a rose.

LOL, what? That saying would apply if it were ACTUALLY spyware (by definition) that Apple called something else. But it's NOT spyware, as I clearly explained, so that response was completely nonsensical. You are not engaging in rational discussion, sir. You might be throwing red meat out there for like-minded people, but it's not helping your case at all if you're trying to convince others.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Where does it stop? The next thing is going to be an AI analysing your HomeKit Secure Video recordings and flagging you. The next thing you know, three agencies are watching your every move because the AI was wrong.

Some of you guys would make GREAT logic textbook writers - if your only job was to provide example sentences of slippery slope fallacies.
 
  • Like
Reactions: I7guy

spazzcat

macrumors 68040
Jun 29, 2007
3,726
4,888
I just thought of a way to make this whole thing moot. The problem will just go away...

I can write software that changes an image's hash. The software would randomly modify an image while still keeping it looking mostly like the original. Yes, there are well-known ways to do this. Then the user runs this on all stored photos every few weeks or few hours so that the hashes change constantly. This would completely defeat Apple's spyware.

The key that makes this work is that images are already JPEG-compressed with a lossy form of compression. We can decompress, work the change into the image, and recompress.

Please, post as many ways you can think of to defeat this on many forums and get the process started to make this issue completely moot.
So you have a lot of kiddie porn you need to hide?
 

Ries

macrumors 68020
Apr 21, 2007
2,317
2,895
Some of you guys would make GREAT logic textbook writers - if your only job was to provide example sentences of slippery slope fallacies.
Apple scanning your photos is "normal", but Apple scanning your other data is a slippery slope fallacy? Please don't buy any bridges.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Awful violation… nailed it. Illegal? No, because you are giving them permission. Any way you slice it, it's still spyware on your phone, exclusively for the purpose of looking for illegal content.

For the gazillionth time, if it's installed with your knowledge and consent, it's NOT spyware. What is so hard to understand about that? You can argue the point until you die and you'll still be wrong. Coin another term for it if you want, but don't try to redefine established terms for the sake of melodrama.
 
  • Like
Reactions: DanielDD

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Apple scanning your photos is "normal", but Apple scanning your other data is a slippery slope fallacy? Please don't buy any bridges.

No. The fallacy is you asserting that "Because A happened, B will also happen." Unless B is an inevitable/unavoidable result of A (as in, there's no logically possible way that it WON'T happen), it's a fallacious argument. Now, B (or C or D, etc.) MAY indeed happen, but not as a direct result of A.
 

SFjohn

macrumors 68020
Sep 8, 2016
2,106
4,356
For the gazillionth time, if it's installed with your knowledge and consent, it's NOT spyware. What is so hard to understand about that? You can argue the point until you die and you'll still be wrong. Coin another term for it if you want, but don't try to redefine established terms for the sake of melodrama.
Yes, it's not spyware yet. Right now anyone can disable iCloud Photos (a huge pain in the ass to deal with the diminished functionality) and no CSAM scanning will take place. So all the pedos out there are safe for now. Everyone else will have their photos monitored. How simple it will be to jail people you don't like if you know their phone's password & have access to it for a little while… This should not be happening.
 
  • Like
Reactions: xpxp2002

xpxp2002

macrumors 65816
May 3, 2016
1,154
2,727
Apple can at any point in time force your iPhone to back up to iCloud so they can decrypt it.
That is not correct. iOS decides when to back up to iCloud. If iCloud backups are turned on, iOS will only try to back up if there is adequate space and you are connected to power and Wi-Fi for at least 10 minutes with the screen locked.

The user can manually force a backup without meeting those conditions, but Apple cannot "at any point in time force your iPhone to back up."
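Roughly, the gating looks like this (a minimal sketch in Swift with made-up names, based only on the conditions described above; this is not Apple's actual code or API):

```swift
// Minimal sketch of the automatic-backup conditions described above.
// All names here are illustrative assumptions, not Apple's actual APIs.
struct DeviceState {
    var iCloudBackupEnabled: Bool
    var hasAdequateICloudSpace: Bool
    var isOnPower: Bool
    var isOnWiFi: Bool
    var minutesLockedOnPowerAndWiFi: Int
}

// iOS, not Apple's servers, decides when an automatic backup runs:
// every condition quoted above has to hold on the device itself.
func shouldStartAutomaticBackup(_ state: DeviceState) -> Bool {
    return state.iCloudBackupEnabled
        && state.hasAdequateICloudSpace
        && state.isOnPower
        && state.isOnWiFi
        && state.minutesLockedOnPowerAndWiFi >= 10
}

// A user tapping "Back Up Now" skips the schedule checks, but there is
// no remote trigger in this picture for Apple to force a backup.
func userRequestedBackup(_ state: DeviceState) -> Bool {
    return state.iCloudBackupEnabled
}
```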
 
  • Disagree
Reactions: farewelwilliams

DanielDD

macrumors 6502a
Apr 5, 2013
524
4,447
Portugal
That is not correct. iOS decides when to backup to iCloud. If iCloud backups are turned on, iOS will only try to back up if there is adequate space, and you are connected to power and Wi-Fi for at least 10 minutes with the screen locked.

The user can manually force a backup without meeting those conditions, but Apple cannot at "any point in time force your iPhones to backup."

By the same argument, Apple cannot at any point force a scan outside of photos being uploaded to the cloud. That was the point of the other comment.
 
  • Like
Reactions: farewelwilliams

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Yes, it’s not spyware yet.

It's not spyware ever unless Apple starts doing something differently without informing users.

Right now anyone can disable iCloud Photos (a huge pain in the ass to deal with the diminished functionality) and no CSAM scanning will take place. So all the pedos out there are safe for now. Everyone else will have their photos monitored.

You're assuming all pedos have common sense, aren't careless, and are educated about this feature. That's quite a big assumption. And of course Apple's goal is not to eliminate pedophilia (impossible) or even eliminate all CSAM. The goal is to prevent their servers from being used to proliferate it.

How simple it will be to jail people you don’t like if you know their phone’s password & have access to it for a little while… This should not be happening.

Don't Facebook, Google, Microsoft, and many others already scan photos or videos being uploaded to their platforms or cloud services? Are you aware of a rash of sabotage jobs by people's enemies uploading child porn on their accounts and getting them arrested? Obviously it's possible, but very unlikely and a very poor argument against what Apple is doing.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
LOL, what? That saying would apply if it were ACTUALLY spyware (by definition) that Apple called something else. But it's NOT spyware, as I clearly explained, so that response was completely nonsensical. You are not engaging in rational discussion, sir. You might be throwing red meat out there for like-minded people, but it's not helping your case at all if you're trying to convince others.
I don't need you to agree with me. Apple is spying on us. Nothing you say will negate that.
 

pacalis

macrumors 65816
Oct 5, 2011
1,004
662
Despite the fact that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn't right. This is still digital surveillance of the private population. And worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term "overreach" is often used in government and it applies here. Apple is not responsible nor accountable for CSAM detection in law enforcement, and no country's citizens have passed a law to give them this mandate. However secure, private, and well intentioned this system may be, they are breaching the privacy of the people without the people's permission.

I'm not sure it is at all better if the government does it. Governments can't do this easily because, for example, it is typically unlawful in the US under the 4th Amendment and similarly under the Canadian Charter of Rights. It almost seems that unreasonable search and seizure liabilities are being laundered.

A real dilemma with dangerous political implications. Perhaps the box has already been opened, but for me it is not obvious what will cause the most good or greatest harm in the long run.
 

antiprotest

macrumors 601
Apr 19, 2010
4,044
14,261
It's surprising to see that kind of stance from a politician; they are not the biggest fans of digital privacy. Very curious whether any major politician in the US will fight CSAM scanning, but I'm not holding my breath.
Next we will have Facebook, Google, Cellebrite, and Cambridge Analytica fighting Apple to protect our privacy.

What a plot twist that Apple could turn out to be the villain and the final boss.
 
  • Like
Reactions: Mercury7

johnsc3

macrumors regular
Apr 2, 2018
177
188
Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet."

I totally agree. This is a landmark decision Apple has made, and many do not seem to understand the consequences it will have going forward, preferring instead to talk about MacBook Air M1 colours.
I agree.
 

UK-MacAddict

macrumors 65816
May 11, 2010
1,010
1,225
Vote with your feet. Nobody download iOS 15 or buy any of the new iPhones and Apple will backtrack immediately.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
I don't need you to agree with me.

Fine, then enjoy your echo chamber and stop replying to me.

Apple is spying on us. Nothing you say will negate that.

It's not what I say that negates that; reality negates that. As has been explained to you and others countless times, Apple sees NOTHING unless you upload 30+ illegal images to iCloud. But believe what you want. I'm interested in reality, not fantasy.
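For what it's worth, the threshold idea boils down to something like this (a rough Swift sketch with assumed names and the reported ~30-match figure; the real system uses threshold secret sharing so the server mathematically can't decrypt the safety vouchers below that point, which this simplified sketch doesn't capture):

```swift
// Illustrative sketch of the threshold behaviour only, not Apple's
// actual protocol or implementation. Names and the threshold value
// here are assumptions based on public reporting.
struct SafetyVoucher {
    let matchesKnownCSAMHash: Bool
}

let reviewThreshold = 30

// Nothing about an account is surfaced for human review until the
// number of matching vouchers reaches the threshold.
func accountFlaggedForReview(_ uploadedVouchers: [SafetyVoucher]) -> Bool {
    let matchCount = uploadedVouchers.filter { $0.matchesKnownCSAMHash }.count
    return matchCount >= reviewThreshold
}
```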
 