
farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
Despite the facts that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn’t right. This is still digital surveillance of the private population. And worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term “overreach” is often used in government and it applies here. Apple is neither responsible nor accountable for CSAM detection in law enforcement, and no country’s citizens have passed a law to give them this mandate. However secure, private and well intentioned this system may be, they are breaching the privacy of the people without the people’s permission.

Which Apple is releasing on a country-by-country basis to abide by laws first.
 

farewelwilliams

Suspended
Jun 18, 2014
4,966
18,041
First half of the sentence: And you know that how?
Second part of the sentence: Of course! Securing evidence like this does not belong in the hands of a privately owned company without any control over what actually happens with the collected data and to whom it is handed over in the end...

First part: the technical document released by Apple explains the high-level concept.
Second part: Apple has every right to have your iPhone scan your photos the moment you have iCloud Photos turned on. Turn iCloud Photos off and you turn CSAM detection off. There is your control.
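To put that control in concrete terms, here is a minimal sketch of the behaviour being described, with on-device matching gated entirely behind the iCloud Photos toggle. It is purely illustrative: the type and helper names (HypotheticalPhotoUploader, makeSafetyVoucher and so on) are made up and this is not Apple's actual code.

import Foundation

// Illustrative sketch only: matching is tied to the iCloud Photos upload
// path, so with the toggle off nothing is hashed and nothing is uploaded.
// All names here are hypothetical, not Apple's API.
struct HypotheticalPhotoUploader {
    var iCloudPhotosEnabled: Bool

    func process(_ photo: Data) {
        guard iCloudPhotosEnabled else {
            return  // iCloud Photos is off: no scan, no upload
        }
        let voucher = makeSafetyVoucher(for: photo)  // stand-in for the on-device match
        upload(photo, with: voucher)
    }

    private func makeSafetyVoucher(for photo: Data) -> Data {
        Data()  // placeholder for the hash-matching step that would run before upload
    }

    private func upload(_ photo: Data, with voucher: Data) {
        print("Uploaded \(photo.count) bytes with a safety voucher attached.")
    }
}

// With the toggle off, process() returns without doing anything.
HypotheticalPhotoUploader(iCloudPhotosEnabled: false).process(Data([0x01, 0x02]))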
 

Mac4Mat

Suspended
May 12, 2021
168
466
Some rather silly comments here trying to justify this unjustifiable software being on hardware rather than in iCloud.

Comments about the camera on the iPhone etc. actually prove the point: you can choose not to use the camera, just as you can choose most things, but not the software Apple intends to place on your hardware, even if you don't intend to use iCloud for photos.

In most things Apple have been security and privacy protectors, and all power to their elbow, so this does go against the grain, and you can then understand people being so concerned, as it fundamentally calls into question what was previously a massive feather in Apple's cap: its very public stance against agencies seeking backdoors. You have to wonder what has changed.

Also interesting to note that most do not believe this intended action will safeguard children in any way at all, and in my opinion it may hinder those engaged in trying to stop child abuse/child pornography by pushing the culprits further and further underground.

Good to see a politician seeing through the 'misinterpretation' excuses and calling it what it is, just as some Apple employees, IT specialists and even security specialists have: a BACKDOOR.

So many on this bb want to do anything but acknowledge what this is: SURVEILLANCE. The technicalities of how it does it, or to what extent AT PRESENT, are really not the point at all.

Ironic that it's a politician now stating these concerns.

With various governments, organisations and judges currently pursuing issues that could see Apple losing court cases, being levied substantial fines by governments, etc., what better leverage could there be for those who have not got Apple's assistance re: privacy than to ensure a backdoor is put in all Apple devices, hence hardware based rather than cloud based.
 

Prof.

macrumors 603
Aug 17, 2007
5,310
2,026
Chicagoland
Okay so I am confusion.

iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.

The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" is defined as any image that is already uploaded to the CSAM database, and is found to be an IDENTICAL match in your iCloud library.
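In other words, the matching step being described boils down to a lookup against a fixed set of known hashes. Below is a rough sketch of that idea; SHA256 is only a stand-in for Apple's perceptual NeuralHash (which matches visually identical images rather than byte-identical files), and loadKnownHashes() and matchesKnownImage() are made-up placeholder names.

import Foundation
import CryptoKit

// Rough sketch of "flagged only on a match against known images".
// SHA256 stands in for the perceptual hash the real system uses, and the
// database below is an empty placeholder.
func loadKnownHashes() -> Set<String> {
    []  // hypothetical: hashes supplied by child-safety organisations
}

let knownHashes = loadKnownHashes()

func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", Int($0)) }.joined()
    // A novel photo (your own baby pictures, for example) hashes to a value
    // that simply is not in the database, so it never matches.
    return knownHashes.contains(hex)
}

print(matchesKnownImage(Data([0x01, 0x02, 0x03])))  // false: not a known image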
 

Mac4Mat

Suspended
May 12, 2021
168
466
Which Apple is releasing on a country-by-country basis to abide by laws first.
Abiding by laws presumably includes abiding by laws in China, where it is insisted that data be kept in China, and where no doubt if Apple did not comply it would not be on sale in China? These sorts of pressures can easily sway a company, even on an issue that was formerly sacrosanct and part of Apple's advertising blurb.
 
  • Like
Reactions: BurgDog and Huck

thejadedmonkey

macrumors G3
May 28, 2005
9,193
3,392
Pennsylvania
I don't like that people view child porn. And as a conservative Christian pastor who works full time at a church, I don't want anyone viewing porn. Furthermore, I intentionally don't watch material that has risqué scenes or language that offends me.
However, the same technology that Apple wants us all to accept this fall could one day be the same technology that tells a government that I am a conservative Christian pastor. Therefore, the right thing in this situation is not to try to catch people who simply won't use the feature; the right thing is not to implement a feature that is largely useless against the people for whom it is intended... because the day might come when others get caught in a web that was not originally intended for them.

First they came for the socialists, and I did not speak out—because I was not a socialist.
Then they came for the trade unionists, and I did not speak out— because I was not a trade unionist.
Then they came for the Jews, and I did not speak out—because I was not a Jew.
Then they came for me—and there was no one left to speak for me.
 

Mac4Mat

Suspended
May 12, 2021
168
466
Okay so I am confusion.

iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.

The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" is defined as any image that is already uploaded to the CSAM database, and is found to be an IDENTICAL match in your iCloud library.
Again, total obfuscation about the real point. These comments are on a par with someone asking you to dig your own grave, and then suggesting they are doing you a favour because you only have to dig 2 ft down, not 6.

The fact is it is a backdoor, it is surveillance on everyone's own hardware.
 

ksec

macrumors 68020
Dec 23, 2015
2,241
2,595
This is enough. I don't think MacRumors comments could stand this any longer. Apple should threaten Germany that it will pull out of the German market.
 

Luke MacWalker

macrumors regular
Jun 10, 2014
137
120
Despite the facts that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn’t right. This is still digital surveillance of the private population. And worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term “overreach” is often used in government and it applies here. Apple is neither responsible nor accountable for CSAM detection in law enforcement, and no country’s citizens have passed a law to give them this mandate. However secure, private and well intentioned this system may be, they are breaching the privacy of the people without the people’s permission.
I agree (except with "they are breaching the privacy of the people": they are breaching the privacy of offenders only), but you make it sound like it would be fine if a government ordered them to give access to the devices' data (for CSAM… or other kinds of surveillance, because why stop there?).
It was probably not your idea, and I’m probably reading too much "between the lines", but anyway I think it is a serious question that you bring up: what should they do if (when?) a legally elected democratic government orders them to grant access to users’ data? We know that for now Apple says "we cannot" (for part of it anyway), but if the law says that this or that government entity must have access to users’ data, the choice becomes keeping the encryption keys or leaving the country, and hoping that this government doesn’t (or can’t) decide that this law has extra-territorial validity.

Not easy. But I sure wouldn’t feel more comfortable if it was a government that ordered this scanning. Actually, probably less… ?
 
  • Like
Reactions: BurgDog

crawfish963

macrumors 6502a
Apr 16, 2010
933
1,637
Texas
I have already reduced my iCloud storage plan, removed iCloud Photos from all devices, removed iCloud iMessage from all but my iPhone, and gone back to encrypted backups on my MBP. I want Apple to reverse course, but I bought my iPhone in cash and own it outright, and I do not consent to them running scanning software on my phone.
 

Janichsan

macrumors 68040
Oct 23, 2006
3,058
11,213
It’s surprising to see that kind of stance from a politician; they are not the biggest fans of digital privacy. Very curious whether any major politician in the US will fight CSAM scanning, but I’m not holding my breath.
That might be surprising for some people, but in many democracies, there is an actual wide range of political parties with a wide range of political stances.

That does include parties and politicians who actually care about privacy.
 

crawfish963

macrumors 6502a
Apr 16, 2010
933
1,637
Texas
Okay so I am confusion.

iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.

The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" is defined as any image that is already uploaded to the CSAM database, and is found to be an IDENTICAL match in your iCloud library.
Because you can disable the above features and still use iCloud Photos without them. This will be baked into iOS 15 and cannot be disabled unless you opt not to use the features that make the Apple ecosystem what it is, which have been part of why people use Apple products.
 

rme

macrumors 6502
Jul 19, 2008
292
436
I think if the European Union authorities say NO, the whole idea may end up being dropped.
What's the chance of that happening, do you think?
 

movielad

macrumors regular
Dec 19, 2005
120
219
Surrey
Your point is? These things are not even in the same ballpark as implementing CSAM detection at the OS level.

My point is that it isn't so much about the technology, but a lot more about people's trust in the US government, since clearly many people have trusted Apple significantly over the years with a closed iOS system along with their health data, fingerprints and facial recognition data - the latter two stored in the Secure Enclave of their devices - without worrying about the potential vulnerabilities surrounding them.

Apple's CSAM device scanner can be disabled simply by not using iCloud Photos. People complain that the hashes could be reverse-engineered and new images faked to match those hashes. If that were the case, we'd have seen issues with the existing systems that already implement CSAM detection, surely? Besides which, as I understand it, the hash database is built from two different sources.
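On that last point, my understanding is that only hashes present in both sources would end up in the on-device database, which is easy to picture as a simple set intersection (the hash values and source names below are made up purely for illustration):

import Foundation

// Illustrative only: the values are invented. The point is that a hash has
// to appear in BOTH organisations' lists before it ships on devices, so a
// single source cannot unilaterally insert an entry.
let hashesFromSourceA: Set<String> = ["a1b2", "c3d4", "e5f6"]
let hashesFromSourceB: Set<String> = ["c3d4", "e5f6", "0789"]

let onDeviceDatabase = hashesFromSourceA.intersection(hashesFromSourceB)
print(onDeviceDatabase.sorted())  // ["c3d4", "e5f6"]: only the overlap is kept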
 
  • Like
Reactions: Tagbert

dwsolberg

macrumors 6502a
Dec 17, 2003
844
824
Apple means well, but they don't seem to have considered the unintended consequences.

For one, do they really want to be in charge of collecting information on criminal activity of their customers and reporting it to the police? I get that this is the worst of the worst, but what about murder, rape and so on? Those are pretty horrible, too. Is this really something a private company should be doing vs the government?

Second, they seem to be assuming that there will be no bugs and no security breaches, despite ample evidence that this is never, never the case. In the same vein, they also seem to assume that no hostile government would ever have the power to control what they scan. Again, this enters the area where Apple seems to be choosing how to enforce the law, which is strange and dangerous in a lot of ways.

Finally, they don't seem to understand the PR implications at all. They were obviously surprised by the uproar, which is weird in itself. They also don't seem to have considered that most of their publicity on security has been a refusal to help the government catch terrorists. That plays okay when they are the privacy company, but not so well when they're deciding to break that privacy for a different reason.
 

ericwn

macrumors G4
Apr 24, 2016
11,925
10,562
Okay so I am confusion.

iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.

The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" is defined as any image that is already uploaded to the CSAM database, and is found to be an IDENTICAL match in your iCloud library.

There’s a group of strangers out there that would just like to get your house keys to look at all your images anytime they like and compare them against a database you don’t have control over, in order to make judgements about you. There’s literally nothing that could go wrong here.
 