> You trying to compare defecating in public vs child trafficking tells me all I need to know about you.

You know nothing, luckily. Your lack of comprehension tells me enough to ignore you.
> You're not understanding. NO ONE sees the scan results from your phone except for flagged photos, and only then if there's a certain number of flagged photos, and only THEN if you upload them to iCloud (where Apple has always been able to access your content if they wanted to). The scan is not being monitored by Apple employees, LOL! It's all happening within the software and completely hidden from Apple's eyes.

I think you don't understand. Being seen by human eyes is not required for privacy to be violated. Simply having photos "flagged" by an algorithm is the privacy violation. It doesn't depend on what a person sees with their eyes. The privacy violation is the electronic scan itself.
> Ha! Yep! "Should" is one of those funny words and I shouldn't have used it. In my mind, in a well-governed country, this would be a government responsibility. That said, I don't think there's anyone who isn't questioning and scrutinizing the US government. So I'd rather it were neither until the lot of us hash our government out. If forced to choose right now, I'd still put the responsibility in government hands.

The questions don't get asked until the government is caught doing something shady with the tech. Case in point: Snowden.
> Agree. People wanted to move the topic all over the place. Thanks for agreeing with me. Let's keep the discussion to the topic of the article.

Wow. Non-sequitur much? Unless your winning the lottery will enable and encourage you to trade in child porn and sex trafficking, it has no bearing on the topic. Considering how the article's discussed use of this technology could easily be abused, by being used in ways that weren't "intended", is 100% pertinent to the topic at hand. And it is the very reason that individuals and technology security companies are so concerned about the move!
You know what also “can” happen? I could win the lottery. I could be hit by a train.
I’m commenting on the article, not on the what-ifs. The article discusses CSAM.
> What's all over? Your crimes?

In fascist states it usually is a crime to fight for your rights.
> Stats to back up your point? Or is this just an opinion of yours?

No stats. I didn't look up the stats. I really don't care about the stats. I care about corporations, and ultimately the government, chipping away at our civil liberties.
> [...] expect out of a "privacy" oriented company.

Correct.
> And that is the beauty of it: you can disable iCloud Photos. Apple isn't forcing you to use the service, especially if you don't agree to the terms.

True. We should air our concerns if we want to, however.
iPhones already scan every photo today for content, faces, etc. Creating an additional hash isn't resource-heavy at all. Also, iPhones do most of their image processing while connected to power and while you aren't using your phone.
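To give a sense of how cheap a hash is: here's a toy "average hash" in plain Python. This is an illustrative sketch only, not Apple's NeuralHash; the 4x4 pixel grid is made up.

```python
def average_hash(pixels):
    """Toy perceptual 'average hash': one bit per pixel, set when that pixel
    is brighter than the image's mean. Real systems first downscale to a
    small grayscale grid; either way it's a handful of adds and compares."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (p > mean)
    return bits

# 4x4 stand-in for an already-downscaled grayscale photo (made-up values).
img = [[10, 20, 200, 210],
       [15, 25, 190, 205],
       [12, 22, 195, 200],
       [11, 21, 198, 207]]
print(f"{average_hash(img):016b}")  # prints 0011001100110011
```

A few dozen arithmetic operations per image, which is nothing next to the face and object detection the Photos app already runs.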
It's probably locality-sensitive hashing, which doesn't require an exact match. Google is the leader here and even presented a paper (in China!) in 2008 on how to do it, and they've had implementations for at least 10 years.
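For anyone curious, here's roughly how locality-sensitive hashing tolerates inexact matches, sketched with the random-hyperplane family (a standard LSH construction for cosine similarity). All the sizes and vectors below are made up for illustration; this is not any vendor's actual scheme.

```python
import random

random.seed(0)      # fixed seed so the sketch is repeatable
DIM, BITS = 16, 32  # toy sizes, not a real system's parameters

# Random-hyperplane LSH: each signature bit records which side of a random
# hyperplane the feature vector falls on, so similar vectors share most bits.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def lsh_signature(vec):
    return [1 if sum(p * v for p, v in zip(plane, vec)) >= 0 else 0
            for plane in planes]

def hamming(a, b):
    """Number of differing signature bits; small means 'probably similar'."""
    return sum(x != y for x, y in zip(a, b))

original  = [random.uniform(-1, 1) for _ in range(DIM)]
edited    = [v + random.gauss(0, 0.01) for v in original]  # slight edit, e.g. re-compression
unrelated = [random.uniform(-1, 1) for _ in range(DIM)]

sig = lsh_signature(original)
print(hamming(sig, lsh_signature(edited)))     # typically only a few bits differ
print(hamming(sig, lsh_signature(unrelated)))  # typically around BITS / 2
```

The point is that a re-saved or slightly cropped copy still lands close to the original in hash space, while an unrelated photo doesn't.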
The probability of a false positive is very low. And Apple has additional controls.
You have to have several matches. People who download child pornography usually have thousands or hundreds of thousands of pictures. Apple could easily set this threshold to 50 to reduce the probability of false positives dramatically.
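A quick back-of-the-envelope Poisson calculation shows why a match threshold matters so much. The per-photo false-match rate and library size below are assumptions for illustration, NOT Apple's published figures.

```python
from math import exp, factorial

def p_tail_poisson(k, lam, terms=100):
    """P(X >= k) for X ~ Poisson(lam), summing the tail directly so
    tiny probabilities don't cancel to 0.0 in floating point."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k, k + terms))

# Assumed numbers, purely for illustration: a one-in-a-million per-photo
# false-match rate and a 100,000-photo library.
p_fp, n_photos = 1e-6, 100_000
lam = p_fp * n_photos  # expected false matches across the whole library: 0.1

for threshold in (1, 10, 30):
    print(threshold, p_tail_poisson(threshold, lam))
# Under these assumptions, at least one false match is roughly 9.5% likely,
# but crossing a 30-match threshold by accident is astronomically unlikely.
```

That's the whole argument for a threshold: single false matches are expected at scale, dozens of them on one account essentially never happen by chance.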
Yes, someone at Apple would be looking at the photos that have been flagged. Apple already has this power today if they want to use it. And if they're served with a search warrant, they turn everything over as needed.
Google and Facebook have been doing this for a decade. How many governments are forcing Google and Facebook to do as you describe?
> I think you're getting distracted by your desire to be right. This article is about flagging CSAM; nothing was said about same-sex consenting-age couples. Now you're just making stuff up to win an argument. You are past the point of continuing a rational dialogue.

They are the current parameters for the USA. There is nothing whatsoever to prevent those parameters from being expanded, or the quality of the matches being made fuzzier, on a country-by-country basis.
I will agree with your first point, but if you have CSAM in your iCloud library, you’ve moved beyond “potential” criminal.
> They are the current parameters for the USA. There is nothing whatsoever to prevent those parameters from being expanded, or the quality of the matches being made fuzzier, on a country-by-country basis.

But do we always live in the what-ifs?
> No stats. I didn't look up the stats. I really don't care about the stats. I care about corporations, and ultimately the government, chipping away at our civil liberties.

So you just care about making blanket statements that are grounded in opinion and passing them off as facts? The kids' table is to the left. Adults are having a dialogue here.
> Snowden and the EFF protecting the rights of pedophiles

You must be a troll, so you're not even worth responding to, but the people fighting for privacy are not pedophiles or CP owners. I believe no member of this forum has ever owned CP. It's more about privacy and Apple's privacy promises, which is what I got sucked in by.
I don’t think I ever stated anything I said was fact. Maybe you are having too many adult drinks, or maybe your adult eyesight is poor.So you just care about making blanket statements that are grounded in opinion and passing them off as facts? The kids table is to the left. Adults are having dialogue here.
Why wouldn’t you ponder what ifs?But do we always live in the what ifs?
> But do we always live in the what-ifs?

Maybe you are the kind of person who drives without a jack and spare tire? Or doesn't bother with insurance policies because "it'll never happen"?
> Snowden and the EFF protecting the rights of pedophiles

A person using iCloud Photos on Apple's servers has no inherent rights there. Said person has obligations that he or she agreed to when using the service. Apple also has obligations to carry out service and operations as outlined in the agreement between them and the user of said service.
Who's to say that the government won't add hashes to the database that are not CSAM? What if, a few months later, the government inserts something unrelated into the database because they are looking for a group of people, items, phrases, etc.? And all because Apple has decided to play cop.
> I get the pushback, but I personally have no issue with it.

No one has an issue with protecting children or holding people accountable. The pushback is that if you let this one thing go unopposed, it gives them license to keep stripping away your privacy and freedoms. This is like saying, "Well, people drive unlicensed, and that's illegal, so we are going to mandate that all cars have facial scanners, and you'll have to swipe your license card before you can drive a car." There has never been something like this implemented, in history, that didn't lead to more of it happening later and getting more oppressive and draconian. Shrugging your shoulders just means you're OK with that future.