
Scoob Redux

macrumors 6502a
Sep 25, 2020
580
890
You're not understanding. NO ONE sees the scan results from your phone except for flagged photos and only then if there's a certain number of flagged photos and only THEN if you upload them to iCloud (where Apple has always been able to access your content if they wanted to). The scan is not being monitored by Apple employees, LOL! It's all happening within the software and completely hidden from Apple's eyes.
I think you don't understand. Being seen by human eyes is not required for privacy to be violated. Simply having knowledge of photos "flagged" by an algorithm is the privacy violation. It's not dependent on what a person sees with their eyes. The privacy violation is the electronic scan.
 
  • Like
Reactions: deevey

deevey

macrumors 65816
Dec 4, 2004
1,348
1,417
Ha! Yep! “Should” is one of those funny words and I shouldn’t have used it. In my mind, in a well governed country, this would be a government responsibility. That said, I don’t think anyone isn’t questioning and scrutinizing the US government. So I’d rather it were neither until the lot of us hash our government out. If forced to choose right now I’d still put the responsibility in government hands.
The questions don't get asked until the government is caught doing something shady with the tech. Case in point: Snowden.

I don't have a problem with this current use of the technology; however, I do take issue with its future if any government agency (globally) decides to tweak its version of a <insert dot org here> database and broaden the parameters to persecute potential dissidents or social undesirables.

This genie is not going back in the bottle anytime soon. It will be interesting to see how (or if) Apple and others say no to any government that desires this feature for its own "humanitarian reasons".
 
  • Like
Reactions: turbineseaplane

coachgq

macrumors 6502a
Jun 16, 2009
931
1,849
Wow. Non-sequitur much? Unless your winning the lottery will enable and encourage you to trade in child porn and sex trafficking, then it has no bearing on the topic. The consideration of how the article's discussed use of technology could easily be abused by being used in ways not "intended" is 100% pertinent to the topic at hand. And, it is the very reason that individuals and technology security companies are very concerned with the move!
Agree. People wanted to move the topic all over the place. Thanks for agreeing with me. Let’s keep the discussion to the topic of the article.
 

ipponrg

macrumors 68020
Oct 15, 2008
2,309
2,087
You know what also “can” happen? I could win the lottery. I could be hit by a train.
I’m commenting on the article, not on the what-ifs. The article discusses CSAM.

Like most PRSI-centric topics, people are commenting on the potential ramifications, as is the rest of the internet. There is no problem with what you're doing or what they are doing, but understand that most will be focusing on the arguably more important consequential topic, at which you continuously throw straw-man arguments.

I think no one is really discussing CSAM itself, because most agree with the purpose. The larger picture (the big elephant in the room) is more critical to discuss, because Apple's messaging and approach appear to contradict what users expect from a "privacy"-oriented company.
 
  • Like
Reactions: turbineseaplane

Beelzbub

macrumors 6502
Feb 6, 2012
425
187
Who's to say that the government won't add hashes to the database that are not CSAM? What if, a few months later, the government inserts something unrelated into the database because it is looking for a group of people, items, phrases, etc.? And all because Apple has decided to play cop.
 
  • Love
Reactions: turbineseaplane

I7guy

macrumors Nehalem
Nov 30, 2013
34,311
24,047
Gotta be in it to win it
[...] expect out of a “privacy” oriented company.
Correct.

A privacy-oriented company that has to follow the law. As I said in another thread, the president will not sign an executive order prohibiting this. (It would be nice if Congress were as rigorous about this as it is about the App Store.) However, the way the company handles your PII has not changed.
 

VulchR

macrumors 68040
Jun 8, 2009
3,401
14,286
Scotland
iPhones already scan every photo today for content, faces, etc. Creating an additional hash isn't resource-heavy at all. Also, iPhones do most of their image processing while connected to power, when you aren't using your phone.

I don't want any of my phone's processing power and memory going to this fool's errand. Again, as I noted in my previous post this will just spark an arms race between the pedophiles and Apple's system that will cause a proliferation of templates.

It's probably locality-sensitive hashing, which doesn't require an exact match. Google is the leader here and even presented a paper (in China!) in 2008 on how to do it, and they have had implementations for at least 10 years.

The fact that Google is doing this and the PRC is interested doesn't inspire confidence. It increases my concern. A paper was published a while ago that apparently described machine-based facial recognition of the various ethnic groups in the PRC, including the Uyghurs. Do you not see how dangerous this is? It's not f(x) but d(x).
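For anyone wondering how a hash can match without being exact, as the posts above discuss: here is a rough Python sketch of the idea behind perceptual / locality-sensitive hashing. This is illustrative only — it is not Apple's NeuralHash, and the hash values and threshold are made up:

```python
# Illustrative sketch of perceptual-hash matching (NOT Apple's actual
# NeuralHash): near-duplicate images yield hashes that differ in only a
# few bits, so a "match" means the Hamming distance is below a threshold,
# rather than exact equality as with a cryptographic hash like SHA-256.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def is_match(hash_a: int, hash_b: int, threshold: int = 5) -> bool:
    """Fuzzy match: accept if the hashes differ in at most `threshold` bits."""
    return hamming_distance(hash_a, hash_b) <= threshold

original = 0xF0E1D2C3B4A59687     # made-up hash of an image in a database
recompressed = original ^ 0b101   # same image slightly altered: 2 bits flipped
unrelated = 0x0123456789ABCDEF    # made-up hash of an unrelated image

print(is_match(original, recompressed))  # True: only 2 differing bits
print(is_match(original, unrelated))     # False: many differing bits
```

The looser the threshold, the more image alterations the match survives — and the more room there is for false positives, which is exactly the trade-off being argued about here.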

The probability for a false positive is very low. And Apple has additional controls.

How low? Apple doesn't say. Moreover, again, this will trigger somebody at Apple looking at your photos that are above threshold, and my guess is that one of the features the templates will home in on is bare skin. No doubt Apple will get applications from pervs and pedophiles to review the pictures....

You have to have several matches. People who download child pornography usually have thousands or hundreds of thousands of pictures. Apple could easily set this to 50 to reduce the probability of false positives dramatically.

First, Apple has not been transparent about false-positive rates, because my best guess is that they don't know. And what happens if Apple concludes that a picture is a false positive after human review? Do they alert you and say, 'Hey, we decided it would be in your best interest to review your personal photos without your knowledge or consent. Nice bikini.'
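Purely as a back-of-envelope illustration of why a match threshold matters for the account-level false-positive rate (every number below is an assumption for the sake of the sketch — Apple has published no per-image rate):

```python
from math import exp, factorial

# If each photo independently false-matches with probability p, the number
# of false matches in a library of n photos is approximately Poisson with
# mean lam = n * p.  The chance of hitting at least k false matches is the
# Poisson tail, which a reporting threshold of k matches must exceed.

def poisson_tail(lam: float, k: int) -> float:
    """P(X >= k) for X ~ Poisson(lam), summing 100 terms of the tail."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k, k + 100))

p = 1e-6        # assumed per-image false-positive rate (made up)
n = 20_000      # assumed photos in the library (made up)
lam = n * p     # expected false matches: 0.02

print(poisson_tail(lam, 1))   # ≈ 0.02: at least one false match is plausible
print(poisson_tail(lam, 30))  # ≈ 4e-84: vanishingly small with a 30-match threshold
```

The sketch shows the shape of the argument in the quoted post: a single false match per account is not rare, but requiring dozens of independent matches drives the account-level rate toward zero — provided the per-image rate really is that low, which is the unknown.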

Yes, someone at Apple would be looking at the photos which have been flagged. Apple already has this power today if they want to. And if they're served with a search warrant they turn everything over if needed.

It is already part of the T&Cs that users not use iCloud for illegal content. Apple is engaging in this nonsense anyway, as though we are all guilty. Currently Apple provides information to law enforcement to the extent it must. That is not the same as rummaging through our pictures without probable cause.

Google and Facebook have been doing this for a decade. How many governments are forcing Google and Facebook to do as you describe?

If I wanted to use Google or Facebook, I would, but I don't, for precisely the reasons you state above. I do not like surveillance capitalism, and I like this Orwellian move on Apple's part even less. Can you imagine the field day Apple's competitors are going to have spoofing Apple's 1984 Super Bowl Mac commercial? Can you?

 
  • Love
Reactions: turbineseaplane

deevey

macrumors 65816
Dec 4, 2004
1,348
1,417
I think you’re getting distracted by your desire to be right. This article is about flagging CSAM, nothing was talked about same sex consenting age couples. Now you’re just making stuff up to win an argument, you are past the point of continuing a rational dialogue.

I will agree with your first point, but if you have CSAM in your iCloud library, you’ve moved beyond “potential” criminal.
They are the current parameters for the USA. There is nothing whatsoever to prevent those parameters being expanded upon and the quality of the matches being made more fuzzy on a country by country basis.
 
  • Like
Reactions: VulchR

Sciomar

macrumors 6502a
Nov 8, 2017
559
1,737
It's very apparent that the article's writer and most users here do not understand what hashing is, nor does the author attempt to explain it for context. More sensationalism.
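For context on what hashing means here, a minimal sketch using Python's standard hashlib. (Caveat: CSAM detection relies on perceptual hashes, which tolerate small image changes; the cryptographic hash below is the stricter, exact-match kind, shown only to illustrate the fingerprint idea.)

```python
import hashlib

# A cryptographic hash is a fixed-size, one-way fingerprint of the input.
# Changing a single byte produces a completely different digest (the
# avalanche effect), and the original bytes cannot be recovered from it.
# Scanning compares fingerprints; it does not "read" the image content.

digest_a = hashlib.sha256(b"photo bytes ...").hexdigest()
digest_b = hashlib.sha256(b"photo bytes ..!").hexdigest()  # one byte differs

print(digest_a == digest_b)  # False: a one-byte change yields an unrelated digest
print(len(digest_a))         # 64 hex characters, regardless of input size
```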
 

coachgq

macrumors 6502a
Jun 16, 2009
931
1,849
They are the current parameters for the USA. There is nothing whatsoever to prevent those parameters being expanded upon and the quality of the matches being made more fuzzy on a country by country basis.
But do we always live in the what ifs?
 

coachgq

macrumors 6502a
Jun 16, 2009
931
1,849
No stats. I didn’t look up the stats. I really don’t care about the stats. I care about corporations, and ultimately, the government chipping away at our civil liberties.
So you just care about making blanket statements that are grounded in opinion and passing them off as facts? The kids table is to the left. Adults are having dialogue here.
 

iHorseHead

macrumors 65816
Jan 1, 2021
1,307
1,575
Snowden and the EFF protecting the rights of pedophiles
You must be a troll, so you're not worth even responding to, but the people fighting for privacy are not pedophiles or CP owners. I believe no member of this forum has ever owned CP. It's more about privacy, and Apple's privacy promises, which I got sucked into.
This could (and will) go wrong in so many ways. Also, I believe CP producers don't use smartphones or the cloud.
 

Smearbrick

macrumors 6502
Jan 12, 2013
415
799
Central PA
So you just care about making blanket statements that are grounded in opinion and passing them off as facts? The kids table is to the left. Adults are having dialogue here.
I don’t think I ever stated anything I said was fact. Maybe you are having too many adult drinks, or maybe your adult eyesight is poor.
 

deevey

macrumors 65816
Dec 4, 2004
1,348
1,417
But do we always live in the what ifs?
Maybe you are the kind of person who drives without a jack and spare tire? Or doesn't bother with insurance policies because "it'll never happen"?

In this case, though, it's not "if"; it's an inevitable "when". You should try taking off those rose-colored glasses once in a while.
 

Apple_Robert

Contributor
Sep 21, 2012
34,504
50,065
In the middle of several books.
Snowden and the EFF protecting the rights of pedophiles
A person using iCloud Photos on Apple servers has no inherent rights. Said person has obligations that he or she agreed to when using the service. Apple also has obligations to carry out service and operations as outlined in the agreement between them and the user of said service.

Noting that there is potential for abuse with this new system does not equate to protecting pedophiles.
 

turbineseaplane

macrumors G5
Mar 19, 2008
14,962
32,017
Who's to say that the government won't add hashes to the database that are not CSAM? What if, a few months later, the government inserts something unrelated into the database because it is looking for a group of people, items, phrases, etc.? And all because Apple has decided to play cop.

And it should be noted that even Apple wouldn't know if that were the case.
They can't see what's in the database
 

Abazigal

Contributor
Jul 18, 2011
19,669
22,211
Singapore
Who's to say that the government won't add hashes to the database that are not CSAM? What if, a few months later, the government inserts something unrelated into the database because it is looking for a group of people, items, phrases, etc.? And all because Apple has decided to play cop.

And when those images get flagged and Apple reviews those images and sees that they have nothing to do with child pornography, Apple won’t do anything with that information, and the government gets nothing.
 
  • Like
Reactions: Michael Scrip

laptech

macrumors 68040
Apr 26, 2013
3,600
4,005
Earth
Many in here are of the opinion that governments will not get involved. Does anyone remember the recent events surrounding Microsoft's Bing search engine and the anniversary of the Tiananmen Square protests? At the time, if you put the search term 'Tank Man' into Bing, you'd get no results or just errors. Microsoft said it was human error, and the matter was resolved quickly. How ironic that days prior it was working fine, but on the actual day of the anniversary the search term got blocked! Microsoft Bing has much more freedom in China than Google does, and it has therefore been claimed that China put undue influence on Microsoft to block the term 'Tank Man' so no one around the world would be able to see images of that incident. Yes, there has been no actual proof that the Chinese government influenced Microsoft, but the allegations were there.


Note: Tank Man refers to an iconic image of a man standing in front of a tank during the Tiananmen Square protests.
 
  • Like
Reactions: turbineseaplane

Sam Squanch

macrumors regular
May 10, 2018
165
297
I get the pushback, but I personally have no issue with it.
No one has an issue with protecting children or holding people accountable. The pushback is that if you let this one thing go unopposed, it gives them license to keep stripping away your privacy and freedoms. This is like saying, "Well, people drive unlicensed, and that's illegal, so we are going to mandate that all cars have facial scanners, and you'll have to swipe your license card before you can drive a car." There has never been something like this implemented, in history, that didn't lead to more of it happening later and it getting more oppressive and draconian. Shrugging your shoulders just means you're OK with that future.

It isn't Apple's, or any company's, job to be an arm of law enforcement. Their job is to create products that people want. If in the future your government oppresses you more and draws up laws making companies do these things, that is one thing, but you should be very concerned when a company volunteers it.
 