
steve09090

macrumors 68020
Aug 12, 2008
2,144
4,135
Caving to pressure to not encrypt backups (and other things) back in 2018 (reported in 2020) is essentially a huge back door for every government worldwide.

That is not illegal to do. Apple needs to just do it and force the governments to pass a law to make it illegal over the vociferous opposition of people who care about privacy and liberty.

* https://www.reuters.com/article/us-...ps-after-fbi-complained-sources-idUSKBN1ZK1CT
That’s not a back door into iOS. Nothing like it. And my response was related to the "guise of child protection" which has nothing to do with your link.

The problem is that in the (perhaps only historical) US, at least, the Constitution is a document of enumerated powers, not enumerated liberties. That has been reversed in many people's minds, but it is still true: there is no law that prohibits encrypting everything (iCloud etc.) using on-device keys, but Apple caved to tyrannical pressure from the FBI/DOJ.

If it is legal to protect privacy, Apple should be doing so. Compromising it away a little at a time will result in even more of a surveillance state than we have. Snowden showed how bad it was; only on-device encryption will prevent it from continuing.
Do you know what really went on? It would just take a change of law, which Apple can do nothing about, as there is no law (or one of the stupid "Amendments to the Constitution") that prevents legislation forcing a back door into a hardware system. I would put forward the idea that the FBI wants access to data, and to prevent a genuine back door from being forced, this was the better option.

I find (and I’m not talking about anyone in particular) that the idea of privacy is purely an abstract view of the real world. Those who believe there are rights to privacy are delusional.
 

iPhoneFan5349

macrumors 6502a
Nov 14, 2021
549
459
Maybe not, but wouldn't they at least come up with ideas to keep Apple from getting sued for failing to protect privacy of customers, which could be accomplished by Apple protecting the privacy of customers? 🤔
Not really. They would just come up with terms of agreement where customers accept everything.
 

ponzicoinbro

Suspended
Aug 5, 2021
1,081
2,085
I agree that the CSAM-scanning idea is a slippery slope that will be copied, but the CSAM algorithm Apple proposes to use has no doubt been enabled by AI-optimised chips on iPhones.

It’s just an image hash check.

It can be done on the CPU, but it uses less energy on the Neural Engine.

Microsoft’s version, PhotoDNA, is used server-side by most major companies.
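
For anyone wondering what "just an image hash check" can look like at its simplest, here is a minimal sketch of a perceptual difference hash (dHash) in Python, assuming the Pillow library is installed. It is not Apple's NeuralHash or Microsoft's PhotoDNA (both are far more sophisticated), and the file names are made up; it only illustrates the fingerprint-and-compare idea.

from PIL import Image

def dhash(path, hash_size=8):
    # Difference hash: shrink to grayscale, then compare each pixel to its right-hand neighbour.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    # Number of differing bits; a small distance suggests near-duplicate images.
    return bin(a ^ b).count("1")

# Hypothetical usage:
# known = dhash("known_image.jpg")
# candidate = dhash("photo_to_check.jpg")
# if hamming_distance(known, candidate) <= 5:
#     print("Likely match against the known-image list")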
 
  • Like
Reactions: ericwn

VulchR

macrumors 68040
Jun 8, 2009
3,393
14,269
Scotland
Again, it's a little more complicated than that. If they enable e2e encryption, they wouldn't be able to scan on their servers. Even with a valid search warrant. Consumers would gain a great deal of privacy, but Apple would be unable to prevent CSAM from being stored on their servers.

Is on device CSAM scanning a worthwhile tradeoff for e2e encryption? For me, absolutely - if it was limited to that purpose. The problem is that the technology that enables CSAM scanning can be abused, especially by authoritarian governments. That potential for abuse is the real problem for me.
Agreed. I truly fear it won't end at CSAM-scanning. We will be the last generation to be able to put constraints on electronic surveillance coupled with AI. We need to establish that on-device surveillance, particularly in the absence of evidence of any crime, is wrong in principle. We would never allow the authorities to randomly search houses for CSAM material without a warrant, let alone a private company. Yet what Apple is proposing to do is equivalent.
 
  • Like
Reactions: avz

VulchR

macrumors 68040
Jun 8, 2009
3,393
14,269
Scotland
It’s just an image hash check.

It can be done on the CPU, but it uses less energy on the Neural Engine.

Microsoft’s version, PhotoDNA, is used server-side by most major companies.
I understand that, but the new machine-learning-optimised chips will just allow an ever-greater sophistication of surveillance on a device that most of us have either on our person or nearby 24/7. Like I said, I have no problem with Apple scanning its servers for CSAM material. They're Apple's servers, after all, to do with as the company pleases. However, my phone is my property and I don't want either companies or governments searching it without probable cause. I note that perceptual hashes can be used to detect virtually any class of stimuli (faces, memes, words, music, etc.), including those that dictators consider illegal. Would you feel comfortable with Russia using a system like Apple's to detect Ukrainian flags on its citizens' phones?

I dunno. In the various offshoots of UNIX (including iOS and macOS), there are read-write-execute permission flags for each file, with options for user, group, and other. Perhaps we should just add another flag for files that Apple can spy on.
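
Purely as a reference for that last thought, here is a minimal Python sketch of reading the existing user/group/other read-write-execute bits, assuming any POSIX-like system such as macOS or Linux. The extra "may be scanned by Apple" flag does not exist anywhere; it appears only as a comment.

import os
import stat

def rwx_string(path):
    # Build an ls-style permission string such as "rw-r--r--" from the mode bits.
    mode = os.stat(path).st_mode
    flags = [
        (stat.S_IRUSR, "r"), (stat.S_IWUSR, "w"), (stat.S_IXUSR, "x"),  # user
        (stat.S_IRGRP, "r"), (stat.S_IWGRP, "w"), (stat.S_IXGRP, "x"),  # group
        (stat.S_IROTH, "r"), (stat.S_IWOTH, "w"), (stat.S_IXOTH, "x"),  # other
    ]
    return "".join(ch if mode & bit else "-" for bit, ch in flags)

# print(rwx_string("/etc/hosts"))  # e.g. "rw-r--r--"
# A hypothetical fourth class of bit ("may be scanned by the vendor") has no
# equivalent in POSIX; it would need a new mode bit or an extended attribute.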
 

ponzicoinbro

Suspended
Aug 5, 2021
1,081
2,085
I understand that, but the new machine-learning-optimised chips will just allow an ever-greater sophistication of surveillance on a device that most of us have either on our person or nearby 24/7. Like I said, I have no problem with Apple scanning its servers for CSAM material. They're Apple's servers, after all, to do with as the company pleases. However, my phone is my property and I don't want either companies or governments searching it without probable cause. I note that perceptual hashes can be used to detect virtually any class of stimuli (faces, memes, words, music, etc.), including those that dictators consider illegal. Would you feel comfortable with Russia using a system like Apple's to detect Ukrainian flags on its citizens' phones?

I dunno. In the various offshoots of UNIX (including iOS and macOS), there are read-write-execute permission flags for each file, with options for user, group, and other. Perhaps we should just add another flag for files that Apple can spy on.

Ya, but my belief is that in the end CSAM protection will:

- be enabled by parents.

- be on by default only on corporate- and education-owned devices, to protect those institutions.

- be something everyone else can leave off; but if someone is dirty scum and uploads nasty stuff to iCloud servers, then their material will be detected server-side, they will be reported, and they will be banned from iCloud forever.
 

BaldiMac

macrumors G3
Jan 24, 2008
8,775
10,900
Agreed. I truly fear it won't end at CSAM-scanning. We will be the last generation to be able to put constraints on electronic surveillance coupled with AI. We need to establish that on-device surveillance, particularly in the absence of evidence of any crime, is wrong in principle. We would never allow the authorities to randomly search houses for CSAM material without a warrant, let alone a private company. Yet what Apple is proposing to do is equivalent.
I think you are overstating here. What Apple is proposing is more akin to passing through a metal detector to enter a building or drug-sniffing dogs at the airport. Which we routinely allow. This proposal isn't a random warrantless search. This is a company checking to see if you are storing illegal material on their property. Which is reasonable.

Again, the problem is the POTENTIAL for misuse. If Apple could guarantee that they won't use it for anything other than CSAM (which they are clearly making an effort to do through the policies they proposed), I would definitely support it if they enabled e2e encryption in exchange.
 

VulchR

macrumors 68040
Jun 8, 2009
3,393
14,269
Scotland
I think you are overstating here. What Apple is proposing is more akin to passing through a metal detector to enter a building or drug-sniffing dogs at the airport. Which we routinely allow. This proposal isn't a random warrantless search. This is a company checking to see if you are storing illegal material on their property. Which is reasonable.
...
Interesting way of putting it - you nearly have me convinced. Still, the metal detector / narco dogs analogy refers to public places where we have a low expectation of privacy. The CSAM-scanning could be done in your own home, where there is an expectation of privacy. Also, as I have said, I have no issue with Apple doing this on their servers. That is the way to check for illegal content, hopefully after a proper search warrant is served rather than the company just deciding to rummage through people's photos. Just my two cents. I do agree the potential for abuse of this kind of content-scanning system is more alarming than the actual proposed use. I just think we need to establish principles.
 

BaldiMac

macrumors G3
Jan 24, 2008
8,775
10,900
Interesting way of putting it - you nearly have me convinced. Still, the metal detector / narco dogs analogy refers to public places where we have a low expectation of privacy. The CSAM-scanning could be done in your own home, where there is an expectation of privacy.
Except it's only done if you choose to upload your photos to Apple's servers. In my opinion, CSAM scanning should only be done if the user chooses to enable e2e encryption for iCloud storage and specifically acknowledges that the CSAM scanning will be enabled.

Also, as I have said, I have no issue with Apple doing this on their servers. That is the way to check for illegal content, hopefully after a proper search warrant is served rather than the company just deciding to rummage through people's photos. Just my two cents.
Sure. That's what they're currently doing. But that's why I tie it to e2e encryption. Apple would not be able to scan content on their servers if e2e encryption is enabled. Even with a proper search warrant.

I do agree the potential for abuse of this kind of content-scanning system is more alarming than the actual proposed use. I just think we need to establish principles.
Yep.
 
  • Like
Reactions: VulchR

I7guy

macrumors Nehalem
Nov 30, 2013
34,296
24,031
Gotta be in it to win it
Interesting way of putting it - you nearly have me convinced. Still, the metal detector / narco dogs analogy refers to public places where we have a low expectation of privacy. The CSAM-scanning could be done in your own home, where there is an expectation of privacy. Also, as I have said, I have no issue with Apple doing this on their servers. That is the way to check for illegal content, hopefully after a proper search warrant is served rather than the company just deciding to rummage through people's photos. Just my two cents. I do agree the potential for abuse of this kind of content-scanning system is more alarming than the actual proposed use. I just think we need to establish principles.
"Rummaging" through your photos is here today and it's google that is doing it. If CSAM scanning comes to pass, and I think it will ultimately, turn off icloud sync and the scanning process will not be initiated.

Now, if one want to believe there is a slippery slope here, and the governments or government agencies can force scanning for virtually anything, then I posit this is here today and you don't know about it.

If you really want to protect yourself, get one of those phones with a linux operating system where you can control it. Of course one won't have an ecosystem.
 
  • Like
Reactions: VulchR

ackmondual

macrumors 68020
Dec 23, 2014
2,434
1,147
U.S.A., Earth
Google already uses AI to censor browsing. Didn't you see the news report a few days ago where a father was penalised by Google because he and his wife sent medical pics of their son's swollen private parts to their doctor via the doctor's messaging service (I am assuming Google email) at the doctor's request? The doctor diagnosed the problem, prescribed the right medicine, and now the boy is doing OK, but Google's AI alerted Google to the pictures and the man got all his Google accounts closed down. Basically, Google argued that the context of the images does not matter; it is the fact that such images were sent, and in doing so he broke Google's terms of service.

Article about the incident: https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html

In the article it says the police were alerted but they said no crime had been committed. Google still closed all of the man's accounts.
I know someone who likes Google products, but doesn't want to get too far in since Google has this power to "just shut you down".

Curious, does anybody know what Apple would do in such a case with their own Apple email? Would they alert the police? I'd say probably so, but I'd also like to think they would restore that user's accounts.
 