
hagjohn

macrumors 68000
Aug 27, 2006
1,749
3,511
Pennsylvania
It’s surprising to see that kind of stance from a politician; they are not usually the biggest fans of digital privacy. I’m very curious whether any major politician in the US will fight CSAM scanning, but I’m not holding my breath.
I'm surprised one of the 2 US parties isn't crying foul since they always think they are under threat.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
You didn't bother to read the entire text of the law. It specifically precludes what Apple is doing. Yes, if Apple is made aware of an image on their servers that violates the law, they are obligated to remove that image and report it to law enforcement. Nowhere in this law does it say that Apple or any other tech company should be actively scanning for this.

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).
 

Mac4Mat

Suspended
May 12, 2021
168
466
I think if the European Union authorities say NO, the whole idea may end up being dropped.
I doubt it; this stuff is full of politicking, in the EU, worldwide, and in the USA. With all the outstanding cases, and the FBI and other US agencies decrying Apple's lack of assistance, it would be easy to lean on Apple by offering court cases going their way, no action over the App Store, and a verdict in favour of Apple against Epic... tempting bait.

It's not just Apple though, as no doubt such pressure is exerted these days on all and sundry to create backdoors...

The only difference is that the other organisations, like Facebook, Google, etc., won't place it on your hardware.
 
  • Like
Reactions: BurgDog

Khedron

Suspended
Sep 27, 2013
2,561
5,755
You didn't bother to read the entire text of the law. It specifically precludes what Apple is doing. Yes, if Apple is made aware of an image on their servers that violates the law, they are obligated to remove that image and report it to law enforcement. Nowhere in this law does it say that Apple or any other tech company should be actively scanning for this.

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Apple’s new technology has absolutely nothing to do with CSAM.

If Apple cared about CSAM they could have been scanning their own servers at any time during the last 10-15 years. They never did.

Suddenly they care about CSAM at the exact same moment as they create a technology to remotely monitor users’ phones? Complete nonsense.
 

tylersdad

macrumors regular
Jul 26, 2010
200
520
Well, I could've sworn that's what I had read from multiple sources, but other cloud services definitely have been. Even if Apple hasn't, they always could have if they wanted to. That's really my point: these CSAM detection features in iOS 15 aren't any sort of game changer in terms of what Apple COULD do if they wanted to. In fact, the whole point of the new CSAM detection process is to be as non-invasive as possible whilst still being able to detect CSAM uploaded to iCloud.



Please explain exactly how iOS15 is being magically installed on your device without your knowledge (and forgetting you have auto-update on doesn't count, as that's something under your control). You are misusing that term to try to make things sound more dramatic than they are. Plain and simple. Stop (you and everyone else doing the same thing).

And for the millionth time, NOTHING IS BEING SCANNED on your phone if you turn off iCloud for Photos, and even if you don't, no scanning information is leaving your phone unless you're uploading illegal images to iCloud, and even THEN Apple can't decrypt that info unless the detected CSAM image threshold (30) is met.



Again, it's not spyware, and "probable cause" has nothing to do with this topic since Apple is a private entity and you're voluntarily using their software, which is merely licensed to you (not owned by you).
For the final time, YES, images are being scanned on your device. The hash is created on your device. To create the hash, the file must be opened and the 1's and 0's contained in the file must be converted into the hash. Apple has indicated the scan doesn't happen until the user uploads an image to iCloud. If that's the case, why is Apple storing the CSAM hash database on our devices? If the scanning were happening only in the cloud, then the CSAM hash database would only need to be in a location accessible by the cloud servers performing the scan... not on our devices, as Apple has stated.
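To make the mechanics being argued over here concrete, here is a minimal Swift sketch of on-device hashing against a locally stored database. This is illustrative only: Apple's published design uses a perceptual NeuralHash and a blinded database matched via private set intersection, not a plain cryptographic hash lookup, and every name below is hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the encoded hash database that, per Apple's
// documentation, ships on the device as part of the OS.
func loadLocalHashDatabase() -> Set<Data> {
    return []
}

// The point under debate: the photo's bytes are read and hashed on the device
// itself, before anything reaches iCloud. (The real system uses a perceptual
// NeuralHash; SHA-256 stands in here only to keep the sketch self-contained.)
func hashForMatching(imageData: Data) -> Data {
    return Data(SHA256.hash(data: imageData))
}

// On-device comparison against the locally stored database.
func matchesKnownHash(imageData: Data, database: Set<Data>) -> Bool {
    return database.contains(hashForMatching(imageData: imageData))
}
```

In Apple's actual protocol the device never learns the result of the comparison; the database entries are blinded so that only the server can evaluate a match, and only after the voucher threshold is met.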
 
  • Like
Reactions: BurgDog and Khedron

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
The major issue with iOS and iPadOS is the users' lack of control. We have no tools or utilities at our disposal to really verify anything. How do you, as an end-user, verify whether your phone is using this on-device scanning or not? If you don't have the tools, you simply have to trust Apple blindly.

And things like this can still be considered spyware even though Apple is putting out a disclaimer about the change. If Apple puts out a disclaimer telling us that from iOS 16 onward they'll be scanning all the content on our phones, including passwords, private notes, you name it, that's spyware by any definition. You will know about it, and you will have to accept the new terms of service, but it's still spying on and logging everything you have on your devices and everything you do. Simply disclaiming it doesn't change that.

Don't get me wrong. I trust Apple, and I trust that if I disable iCloud Photos on iOS, iPadOS and macOS, this scanner will stay disabled. But the feature and the scanner itself still act as a kind of spyware that compromises my privacy.

I'm sorry, but I'm not letting this go. Words have meaning. Spyware by definition is executed surreptitiously. That's why it's called SPYware. If you want to invent a new word to describe the on-device scanning, then go for it, but don't try to misuse or redefine an existing term for the sake of melodrama.
 

Mac4Mat

Suspended
May 12, 2021
168
466
The most obvious thing in all this: after these plans have been discussed everywhere, does anyone really think a pedophile will use iCloud or the Photos app to view explicit content on their phone?

Any file manager can be used for that, and anything can be uploaded to it from a computer.

What is the plan here, really? To catch three pedophiles who have no internet access and have never seen any of the CSAM discussions? Seriously?

I see absolutely no difference between this and the wish of some politicians and agencies to have legal backdoors in communication apps or in the operating systems themselves.

Any bad actor who needs to can simply switch to any kind of encrypted messenger, including self-hosted open-source solutions.
Thus, the only thing that will be achieved is exposing everyone on these platforms to hackers who steal personal and confidential information to make money from it or to run scam operations.

This is simply not the way to catch “bad actors” at all.

CSAM scanning must be scrapped before governments everywhere start pushing for full surveillance, knowing that the technical capability exists.

In debates over encryption and backdoors, the last line of defense was often “there is no technical possibility of creating such a backdoor, due to the nature of end-to-end encryption”.

CSAM scanning = the technical capability to scan your photos for absolutely any content that whatever government or agency considers “offensive”.

This logic is absolutely transparent, and I do not understand how Apple's leadership does not see it.

Moreover, I believe that bringing in CSAM scanning will not only put massive pressure on anything related to whatever “politically incorrect” photos we have, but will also bring a new wave of pushes for all kinds of backdoors and weakened encryption. There is just no way around this.

I hope there will be a much stronger public and governmental push against CSAM scanning so that it never comes to life.
That is a crucial point, and it will hinder efforts to find these awful creatures who prey on children.

But you see, child abuse etc. was a very convenient platform to cover the fact that it is a backdoor, and who am I to argue with a German politician, Apple employees, etc.
 
  • Like
Reactions: ivan86 and BurgDog

Khedron

Suspended
Sep 27, 2013
2,561
5,755
For the final time, YES, images are being scanned on your device. The hash is created on your device. To create the hash, the file must be opened and the 1's and 0's contained in the file must be converted into the hash. Apple has indicated the scan doesn't happen until the user uploads an image to iCloud. If that's the case, why is Apple storing the CSAM hash database on our devices? If the scanning were happening only in the cloud, then the CSAM hash database would only need to be in a location accessible by the cloud servers performing the scan... not on our devices, as Apple has stated.

But then Apple wouldn’t be able to change their algorithm to scan non-iCloud files with just the flick of a switch… oh.
 

DevNull0

macrumors 68030
Jan 6, 2015
2,703
5,390
It’s surprising to see that kind of stance from a politician; they are not usually the biggest fans of digital privacy. I’m very curious whether any major politician in the US will fight CSAM scanning, but I’m not holding my breath.

That should really emphasise just how bad this move is by Apple.

It's very likely the whole idea came from Timmy's friends in China, who just want to track down people of certain political views, and the whole CSAM thing is a smoke screen to make it hard to complain about Apple's move without looking like you're supporting child abuse. Won't someone please think of the children?

Does this scanning have any chance of actually doing some good? How many people are stupid enough to store their CSAM files in Apple's cloud? It just seems like the CSAM excuse is a shaky cover story.
 

svish

macrumors G3
Nov 25, 2017
9,975
25,959
Though it's only in the US at first, it may soon expand to all other regions. It's good that prominent people are voicing their concerns.
 
  • Like
Reactions: ivan86

Mac4Mat

Suspended
May 12, 2021
168
466
This. I'm used to politicians proposing simplistic, feel-good solutions to complex problems because most voters lack the critical-thinking skills to understand why said solutions wouldn't work. I'm surprised that Apple is doing it, though. Yes, I watch enough true-crime shows to know that many criminals are stupid and careless ("Hi, I'm Chris Hansen, and you're on Dateline! Did you really think you'd be meeting a 14-year-old girl here today?"), and I suppose some are dumb enough to store their child-porn collections in iCloud, but I assume most of them use other storage solutions and share photos on the dark web. Does Apple have data to the contrary? If so, where did they get it? The only way I can think is if large numbers of pedophiles who have already been caught willingly unlock their Apple devices for law-enforcement authorities, who then find their collections in iCloud Photos. In that case, the methods used to catch them (e.g., sting operations) already worked.

So this sounds like a combination of Apple bowing to political pressure and perhaps naively thinking that most people would be okay with the policy because no sane, civilized person thinks child porn is acceptable. Our culture does seem to view sex crimes as somehow more evil than other types of crimes. Most places in the US have databases of convicted sex offenders so you can find out if any are living in your neighborhood, but I'm unaware of comparable databases for convicted murderers, violent assailants, burglars, fraudsters, and so on. The fact that Apple has refused to unlock the iPhones of suspected terrorists and other criminals points to this double standard. Yes, they claim they have no way to do so, because of their stance on security, but somehow a security company hired by law enforcement found a way to do so in at least one instance.

I'll admit that, initially, Apple's announcement didn't seem like a big deal to me, and I didn't understand the uproar on this forum. The more I've thought about it, though, the more I've come to view it as something that will have little effect on the problem it's intended to address and that potentially will open the door to abuse.
Paedophiles are not like the average criminal; ask any law enforcement officer. They plan, they bluff really well, and they take precautionary measures, which is why so many paedophile rings are taken down via the dark web.

Apple and others can assist, but advertising it from the rooftops isn't the best way!

These people do tend to use Social Media though, as some of the latest arrests show.

"Sex offenders have increased their criminal activities in social media, via peer-to-peer networks and on the darkweb. Attempts to access websites featuring child sexual abuse material, calls to helplines and activities in dark net and surface web chats sharing child abuse material have all increased during the confinement period."

Very few will be stupid enough to use iCloud, and Apple, by its actions, has made it likely that even fewer on the fringe will get caught, because they will avoid scrutiny, making children less safe and paedophiles less likely to be caught.
 
  • Like
Reactions: BurgDog

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
I'm surprised one of the 2 US parties isn't crying foul since they always think they are under threat.
I’m not surprised. Apple shut Congress down on this easily by making it about child porn… no congressman will touch this, because they know the ads next fall would relentlessly say they supported child porn by coming out against it. It’s a Trojan horse they can’t touch, though they may be forced to take a stance if it stays in the headlines and more countries weigh in. I would at least expect the Libertarian Party to step up, though.
 

Mac4Mat

Suspended
May 12, 2021
168
466
And I‘m probably being over sensitive, but I think there‘s something nauseatingly distasteful about downloading hash codes of child pornography onto everyone’s phone – even if I’ll never see it.
It will not be just iPhones
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
You didn't bother to read the entire text of the law. It specifically precludes what Apple is doing. Yes, if Apple is made aware of an image on their servers that violates the law, they are obligated to remove that image and report it to law enforcement. Nowhere in this law does it say that Apple or any other tech company should be actively scanning for this.

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Reread that section carefully

"require a provider to monitor any user"

The "PROVIDER" isn't monitoring the "USER". The CSAM "ON DEVICE" detection is looking for photos that are going to be stored "ON ICLOUD". This is the exact thing Craig Federighi explained in the first question on the WSJ interview.
 

MacBH928

macrumors G3
May 17, 2008
8,361
3,739
While I think it sucks that Apple is scanning our files against our will, this German dude is talking like Apple is the only one doing it. Google and FB and others are doing so much worse on their part, so if he cares about the privacy of others, then pointing only at Apple is the wrong way to approach it.

BAN SURVEILLANCE AND DATA COLLECTION BY LAW
 
  • Love
Reactions: xpxp2002

Mydel

macrumors 6502a
Apr 8, 2006
804
664
Sometimes here mostly there
Reread that section carefully

"require a provider to monitor any user"

The "PROVIDER" isn't monitoring the "USER". The CSAM "ON DEVICE" detection is looking for photos that are going to be stored "ON ICLOUD". This is the exact thing Craig Federighi explained in the first question on the WSJ interview.
Again. They can monitor on THEIR server. NOT on my device
 

Feyl

Cancelled
Aug 24, 2013
964
1,951
Even the government does not have this right. Only if a judge decides it. This is how systems in a democracy are designed: separation of legislative, executive, and judicial powers.
You apparently didn't get my point. My point was to ask the guy why he's OK with scanning people's phones but not with letting strangers into his home. It's literally the same breach of privacy, but some people don't see their digital identities on the same level, which is just wrong.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
For the final time, YES, images are being scanned on your device. The hash is created on your device. To create the hash, the file must be opened and the 1's and 0's contained in the file must be converted into the hash. Apple has indicated the scan doesn't happen until the user uploads an image to iCloud. If that's the case, why is Apple storing the CSAM hash database on our devices? If the scanning were happening only in the cloud, then the CSAM hash database would only need to be in a location accessible by the cloud servers performing the scan... not on our devices, as Apple has stated.

You didn't read my post very carefully. Go back and read it again:
And for the millionth time, NOTHING IS BEING SCANNED on your phone if you turn off iCloud for Photos, and even if you don't, no scanning information is leaving your phone unless you're uploading illegal images to iCloud, and even THEN Apple can't decrypt that info unless the detected CSAM image threshold (30) is met.
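For the threshold (30) mentioned here, Apple's published design uses threshold secret sharing: each matched upload contributes a share, and the server cannot reconstruct the decryption key for the voucher payloads until at least 30 shares exist. A toy Swift model of that gate, with invented names and without the actual cryptography:

```swift
// Toy model of the 30-match threshold gate; the real mechanism is threshold
// secret sharing, not a simple counter.
struct AccountMatchState {
    static let threshold = 30          // figure Apple has stated publicly
    private(set) var matchedVoucherCount = 0

    mutating func recordMatchedVoucher() {
        matchedVoucherCount += 1
    }

    // Below the threshold, the server-side material is (by design) insufficient
    // to decrypt any voucher; only at or above it can human review happen.
    var vouchersCanBeDecrypted: Bool {
        matchedVoucherCount >= AccountMatchState.threshold
    }
}
```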
 

ChromeAce

macrumors 6502a
Jun 11, 2009
599
921
Yes, I agree with the sentiment, but if it is indeed already a legal requirement that all images stored on corporate servers must be scanned for CSAM, then perhaps this is the best way of going about it? I don't know for sure. I'm still against the whole idea in general, but I accept there may be more to it than I first gave it credit for. And reconsidering a position isn't the same as changing your mind?

There is no such legal requirement to scan anything, only to take action if discovered. Otherwise, all companies hosting your email would have to look at it regularly.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Well, if, as some people suggest, they've been doing it since 2019 and are only now telling us, then they've been spying on us since 2019 without our consent. This is spyware.

No, this is a new feature. Please provide evidence (not random blog articles where someone makes assertions) that Apple has been scanning photos on-device since 2019 without informing users.
 

ChromeAce

macrumors 6502a
Jun 11, 2009
599
921

Gotta love people who quote the law without reading it:

“shall, as soon as reasonably possible after obtaining actual knowledge of any facts or circumstances described in paragraph (2)(A), take the actions described in subparagraph (B)”

In case you missed that, “after obtaining actual knowledge” does not mean “must proactively go look for illegal content.”
 
  • Like
Reactions: tylersdad

macfacts

macrumors 601
Oct 7, 2012
4,855
5,684
Cybertron

Attachments

  • Screenshot_20210818-114324_Chrome.jpg
  • Like
Reactions: _Spinn_