It doesn’t matter. Apple still scans our library. This is not their job; they’re not law enforcement.

In reply to: "Neither CSAM scanning nor the Child Safety Features in Messages, however, are monitoring any communication."
China will love this feature, and that’s just a fact.

In reply to: "No, you don't. Stop making assumptions!"
Germany is not a monolith any more than the US or any other democracy is. The positions of politicians span a wide range.
Despite the fact that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn’t right. This is still digital surveillance of the private population. Worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term “overreach” is often used of government, and it applies here. Apple is not responsible or accountable for CSAM detection in law enforcement, and no country’s citizens have passed a law giving them this mandate. However secure, private, and well intentioned this system may be, they are breaching the privacy of the people without the people’s permission.
First half of the sentence: And you know that how?
Second part of the sentence: Of course! Securing evidence like this does not belong in the hands of a privately owned company, with no control over what actually happens with the collected data and to whom it is handed over in the end...
Abiding by laws presumably includes abiding by laws in China, where it is insisted that data be kept in China, and where, no doubt, if Apple did not comply it would not be on sale in China? These sorts of pressures can easily sway a company, even on an issue that was formerly sacrosanct and part of Apple's advertising blurb.

In reply to: "Which Apple is releasing on a country-by-country basis to abide by laws first."
I don't like that people view child porn. And as a conservative Christian pastor who works full time at a church, I don't want anyone viewing porn. Furthermore, I intentionally don't watch material that has risqué scenes or language that offends me.
However, the same technology that Apple wants us all to accept this fall could one day be the technology that tells a government that I am a conservative Christian pastor. So the right thing in this situation is not to try to catch the people who will simply avoid the feature; the right thing is not to implement a feature that is largely useless against the very people it is aimed at, because the day may come when others get caught in a web that was never intended for them.
Again, total obfuscation of the real point. These comments are on a par with someone asking you to dig your own grave and then suggesting they're doing you a favour because you only have to dig two feet down, not six.

(In reply to the "Okay so I am confusion" comment, which appears in full below.)
I agree (except with "they are breaching the privacy of the people": they are breaching the privacy of offenders only), but you make it sound as if it would be fine if a government ordered them to give access to the device's data (for CSAM… or other kinds of surveillance, because why stop there?).

(In reply to the "Despite the fact that it scans locally before uploading…" comment above.)
Just think how different Sergeant Schultz and Colonel Klink are.
That might be surprising to some people, but in many democracies there is a genuinely wide range of political parties with a wide range of political stances.

In reply to: "It's surprising to see that kind of stance from a politician; they are not the biggest fans of digital privacy. Very curious whether any major politician in the US will fight CSAM scanning, but I'm not holding my breath."
Because you can disable the features above, and you have the ability to use iCloud Photos without them. This will be baked into iOS 15 and cannot be disabled unless you opt out of the features that make the Apple ecosystem what it is, and that have been part of why people use Apple products.

(In reply to the same "Okay so I am confusion" comment, quoted in full below.)
Will Saudis or Hungarians or Russians get a refund if they don't like the policy in their country after they've bought an iPhone 13?

In reply to: "Which Apple is releasing on a country-by-country basis to abide by laws first."
What's the chance of that happening, you think?

In reply to: "I think if the European Union authorities say NO, the whole idea may end up being dropped."
Your point is? Those things are not even in the same ballpark as implementing CSAM detection at the OS level.
Okay so I am confusion.
iOS already scans your photos to identify pets, faces, cars, buildings, trees, etc... how is finding missing and exploited children via the same technology a bad thing? The fact that it alerts a non-profit NGO? It's already a fact that it will not flag naked pics of your 1-week-old during their first bath time, or the suggestive images you send your sexual partners.
The ONLY way you'll be flagged is if you have known images of exploited children on your phone. In this context, "known" is defined as any image that is already uploaded to the CSAM database, and is found to be an IDENTICAL match in your iCloud library.
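For anyone wondering what "identical match against known images" means in practice, here is a rough sketch of the idea in Swift. To be clear, this is my own illustration, not Apple's code: the names, the plain string hash type, and the set lookup are all stand-ins, and the real system uses NeuralHash with threshold secret sharing and private set intersection rather than a simple comparison on the device.

// Minimal sketch, assuming a perceptual-hash-and-database model:
// each photo is reduced to a hash, only hashes that exactly match
// entries in the known-CSAM hash database are counted, and nothing
// is escalated until the match count crosses a threshold.

typealias PerceptualHash = String   // stand-in; the real system uses NeuralHash

struct ScanResult {
    var matchCount = 0
    let reviewThreshold = 30        // illustrative; Apple described a threshold of roughly 30 matches

    var shouldEscalateForHumanReview: Bool {
        matchCount >= reviewThreshold
    }
}

func scanLibrary(photoHashes: [PerceptualHash],
                 knownHashes: Set<PerceptualHash>) -> ScanResult {
    var result = ScanResult()
    for hash in photoHashes {
        // A photo only counts if its hash is already in the database of
        // known, catalogued images; novel photos simply never match.
        if knownHashes.contains(hash) {
            result.matchCount += 1
        }
    }
    return result
}

The point of the comment above is exactly this: new photos, however private, produce hashes that are not in the database, so only exact matches to already-catalogued material can ever accumulate toward the threshold.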