
flowsy

macrumors 6502
Aug 16, 2009
356
299
Germany
Following on from the inevitable thin end of the wedge, for those who see no issue: what will be said when Apple deems it necessary to use similar technology to review all banking and financial-related transactions on iPhones, iPads, etc.? After all, money laundering, foreign corrupt practices, etc. must all be pursued in order to protect the population . . . .
Apple doesn't give a damn about your banking records. What makes you think they (the government) need your phone or Apple for that? They're already doing it; they don't need your iPhone or the data stored on it. Where does that data come from? You're online, all the time. Your data is everywhere. If your government enacts such legislation, it'd be the tip of the iceberg and Apple would be the least of your problems.
 

djstile

macrumors regular
Jun 17, 2009
180
124
Like any iOS update, the user has to give consent before being allowed to install the update.
This isn’t a GREAT argument, though. We all know how persistent Apple is about badgering users to update, eventually just downloading the update itself and presenting a single yes/no prompt with no way to cancel.

On top of that, any new phone or replacement phone under warranty will come with the new iOS version with no way to downgrade.
 

jk1221

macrumors 6502
Feb 1, 2021
285
1,058
Following on from the inevitable thin end of the wedge, for those who see no issue: what will be said when Apple deems it necessary to use similar technology to review all banking and financial-related transactions on iPhones, iPads, etc.? After all, money laundering, foreign corrupt practices, etc. must all be pursued in order to protect the population . . . .

Apple doesn't give a damn about your banking records. What makes you think they (the government) need your phone or Apple for that? They're already doing it; they don't need your iPhone or the data stored on it. Where does that data come from? You're online, all the time. Your data is everywhere. If your government enacts such legislation, it'd be the tip of the iceberg and Apple would be the least of your problems.


Ok, let's make it easier. How about pirated music or movies? That's illegal too. Or drugs? How about photos of nude celebs, since there are databases of those and it's illegal to share them without their consent?

It's the slippery-slope argument: what comes next, and how much privacy you have to give up to the company touting privacy over other companies (their words, not ours as customers).

NOT arguing that CSAM is good or that abusers should be protected. Obviously, NO ONE except the abusers themselves will argue that. Apple has intentionally made it so that if you argue against this process, you come off as a child-abuse sympathizer. It's a catch-22 by design, using a sensitive topic as the wedge to get their foot in the door and invade your privacy under the banner of protecting people.

The real argument is what it could be used for next. Or what a government could use this "back door" for.

Have we already forgotten about Pegasus from only a few weeks ago?
 
Last edited:

Playfoot

macrumors 6502
Feb 2, 2009
283
255
Apple doesn't give a damn about your banking records. What makes you think they (the government) need your phone or Apple for that? They're already doing it; they don't need your iPhone or the data stored on it. Where does that data come from? You're online, all the time. Your data is everywhere. If your government enacts such legislation, it'd be the tip of the iceberg and Apple would be the least of your problems.
If it becomes popular or is deemed necessary to combat terrorism, drug dealers, urban crime, or any of a myriad of other things, Apple will do it.

Just as Apple will use the excuse of needing to follow the laws of the countries they operate in when it is demanded that the technology be used to track dissent.....

Just think how much personal freedoms and privacy have been surrendered in the name of security, all accomplished with sophistry.

As for me, no, I am not online all the time. I don't use banking apps, I don't use Find My iPhone, I don't use Siri, I am not connected to the IoT......and much more. While I have nothing to hide, I just don't like tech companies using me as an experiment to perfect their products, then selling them back to me at full price while they also sell all of the data....
 
Last edited:
  • Like
Reactions: haruhiko and jk1221

flowsy

macrumors 6502
Aug 16, 2009
356
299
Germany
Many of the fears expressed here in the forum are theoretical. I understand what they could do with a planned implementation of this kind. But I also understand what they could already be doing with existing capabilities, but are not. I just don't see this implementation as being any more of a risk than current options. If Apple does start looking for and matching other content (no matter how) - and we all have the luxury of being informed in advance - then I'll be the first one to throw my iPhone and Mac against the wall. But I just don't see it. And if you have such fears, why only start here and now? We are waaay past that.
 

Exponent

macrumors 6502
Jul 17, 2002
267
647
Silicon Valley
Many of the fears expressed here in the forum are theoretical. I understand what they could do with a planned implementation of this kind. But I also understand what they could already be doing with existing capabilities, but are not. I just don't see this implementation as being any more of a risk than current options. If Apple does start looking for and matching other content (no matter how) - and we all have the luxury of being informed in advance - then I'll be the first one to throw my iPhone and Mac against the wall. But I just don't see it. And if you have such fears, why only start here and now? We are waaay past that.
Will you even know if Apple, at the behest of a local government, is using a different image database, say one with a protest-location map, to check your "neural-tagged" images against?

Now that the engine is in the OS, and governments know about it, the door is open for all kinds of garbage, without you knowing about it.
 
  • Like
Reactions: jk1221

flowsy

macrumors 6502
Aug 16, 2009
356
299
Germany
Will you even know if Apple, at the behest of a local government, is using a different image database, say one with a protest-location map, to check your "neural-tagged" images against?
If it were that easy to force Apple to do such things, we'd have already seen it. And no, Chinese servers for Chinese iCloud users and the forced removal of apps from the Chinese App Store are not even in the same ballpark as forcing Apple to spy on its users.
 

_Spinn_

macrumors 601
Nov 6, 2020
4,857
10,044
Wisconsin
It'll be interesting to see what the iOS 15 adoption rate is. Normally they get to 50% in a matter of weeks, but I have a feeling that people who really value their privacy won't update.
Maybe this is why Apple is letting people stay on iOS 14?

 

eatrains

macrumors 6502a
Mar 11, 2006
639
4,858
Did you intentionally misunderstand my post or did you simply ignore everything I wrote, just to bring home your “correction”?

Today they may ”only“ match hashes from the NCMEC’s database - but what if someone in the future uses the procedure to match something entirely different, in order to scan (if this is the wrong word, I beg your pardon, as I’m not a native speaker) for persons that can be identified using a different hash that represents e.g. rainbow-colored watch bands … ?
The hash wouldn’t represent rainbow-colored watch bands; it’d represent a specific unique picture of, say, someone wearing a band. The victim would have to have that same exact picture of the band for it to match. It would just be incredibly ineffective for this to be used to police broad subject matter instead of specific pictures.
 

_Spinn_

macrumors 601
Nov 6, 2020
4,857
10,044
Wisconsin
The hash wouldn’t represent rainbow-colored watch bands, it’d represent a specific unique picture of, say, someone wearing a band. The victim would have to have that same exact picture of the band for it to match. It just would be incredibly ineffective for this to used to police broad subject matter instead of specific pictures.
Apple’s matching sounds pretty fuzzy to me. Apple's summary:
NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image.
It sounds like it is looking for features of the photo (a rainbow band) so that cropping or slight pixel changes don’t throw it off.
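For anyone curious how "fuzzy" matching like that can work, here is a minimal sketch of a classic perceptual "difference hash" (dHash) in Python. This is NOT Apple's NeuralHash (which uses a neural network), just an illustration of the general idea: the hash encodes coarse image features, so recompression or small edits barely move it, while a cryptographic hash of the file would change completely. It assumes Pillow is installed, and the filenames are hypothetical.

```python
# Perceptual "difference hash" (dHash) sketch -- illustrative only,
# not Apple's NeuralHash. Requires Pillow (pip install Pillow).
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    # Shrink to (hash_size+1) x hash_size grayscale: the tiny size
    # throws away detail, so minor pixel changes barely matter.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)  # encode brightness gradient
    return bits

def hamming(a: int, b: int) -> int:
    # Bits that differ between two hashes; small distance = "same" image.
    return bin(a ^ b).count("1")

# Two re-encodes of the same photo land a few bits apart; unrelated
# photos land around half the bits apart on a 64-bit hash.
# print(hamming(dhash("original.jpg"), dhash("recompressed.jpg")))
```

Note the trade-off this illustrates: the hash tolerates cropping and re-encoding of a specific known picture, but it still matches pictures, not subject matter.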
 
  • Like
Reactions: Neodym

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
Maybe this is why Apple is letting people stay on iOS 14?

They are not "letting them stay." The whole point of that was to let people who don;t immediately switch over to iOS 15 know that the new security/ad blocking functions being introduced will also be introduced for use on iOS 14.
 

opfreak

macrumors regular
Oct 14, 2014
249
431
What the actual **** are you on about? First of all, I never said Apple code or anything is perfect, so stop putting words in my mouth. Secondly, no one is going to be thrown in jail for a false positive, because it will be manually reviewed. You're trolling, right? Surely you can't be this confused.
You keep repeating this myth that the code will only generate a false positive one out of a trillion times. What proof do you have of that?
 

Playfoot

macrumors 6502
Feb 2, 2009
283
255
If it were that easy to force Apple to do such things, we'd already seen it. And no, Chinese servers for Chinese iCloud users and the forced removal of apps from the Chinese AppStore are not even in the same ballpark as forcing Apple to spy on its users.
I am not so certain that "we'd have already seen it". Due to everything surrounding Trump, most people are now familiar with FISA, but likely not aware of the ongoing "upgrades" since 9/11 (FISA courts have been around since the mid-to-late '70s). However, most are not aware of NSLs, or National Security Letters. If a company or individual receives an NSL, it is against the law (terrorism and treason) to even report receipt of it.

Unlike FISA, no courts, not even the abridged three-person court of FISA fame, are involved. An NSL compels participation and assistance to the government. All information can be collected, AND there is no requirement for it to be destroyed. There is NO requirement to tell the people investigated that they were investigated. All information collected can then be shared with any law enforcement. It is estimated that between 35,000 and 40,000 law enforcement officials have access to the current, and growing, database of collected information.

Under the Patriot Act, law enforcement, with "sneak and peek" warrants, is effectively no longer required to tell people if their home, as an example, was searched. With "sneak and peek" warrants, authorities are allowed to leave what is referred to as a "magic lantern" on a computer, phone, etc. to record ALL activity continuously.

And before anyone thinks only the guilty are investigated: using what little information has been released, it turns out that over a five-year period with almost 200,000 NSLs, there were only one or two indictments.....And the information "hoovered up" remains with the government.

There are many, many more provisions that make it almost certain we "will likely never see it". But with an estimated 30 million additional security cameras since 9/11, I am certain someone "will see it".
 
  • Like
Reactions: haruhiko

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
You keep repeating this myth that the code will only generate a false positive one out of a trillion times. What proof do you have of that?

It’s actually less than that… Apple will only review if multiple pics are flagged.

The chance of Apple’s review showing that those pics are NOT part of the set database of offending images is one in one trillion. It’s in the white paper explaining the tech. It’s not a myth.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
You keep repeating this myth that the code will only generate a false positive one out of a trillion times. What proof do you have of that?

I misstated it. Here's what Apple actually says:

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

So it's not the chance of a single photo being flagged but of an account being flagged. Do you realize how big a number a trillion is? It's so big that even if Apple were lying and overstating the real figure a hundredfold, the odds would still be vanishingly small (one in ten billion).
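For a sense of why a per-account threshold matters so much, here is a rough sketch of the math. The parameters below (library size, per-image false-match rate, threshold) are my own illustrative assumptions, not Apple's published figures; the point is only how fast the binomial tail collapses once several independent matches are required.

```python
# Back-of-the-envelope model (illustrative numbers, NOT Apple's actual
# parameters): assume each image independently false-matches with
# probability p, and an account is only reviewed once at least t
# images have matched.
from math import comb

def account_flag_prob(n: int, p: float, t: int) -> float:
    """P(at least t of n images falsely match): binomial upper tail."""
    # Start from P(X = t), then use a float-safe recurrence for the
    # remaining terms, stopping once they underflow to zero.
    term = comb(n, t) * (p ** t) * ((1 - p) ** (n - t))
    total = 0.0
    for k in range(t, n + 1):
        total += term
        term *= (n - k) / (k + 1) * p / (1 - p)  # P(X=k+1) from P(X=k)
        if term == 0.0:
            break
    return total

# A 10,000-photo library, a per-image false-match rate of one in a
# million, and a review threshold of 30 matching images:
print(account_flag_prob(10_000, 1e-6, 30))  # ~4e-93: vanishingly small
```

Even with a per-image rate far worse than anything Apple claims, requiring dozens of independent matches before review pushes the per-account odds to absurdly small numbers.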
 
  • Like
Reactions: MozMan68

Abazigal

Contributor
Jul 18, 2011
19,784
22,408
Singapore
Many of the fears expressed here in the forum are theoretical. I understand what they could do with a planned implementation of this kind. But I also understand what they could already be doing with existing capabilities, but are not. I just don't see this implementation as being any more of a risk than current options. If Apple does start looking for and matching other content (no matter how) - and we all have the luxury of being informed in advance - then I'll be the first one to throw my iPhone and Mac against the wall. But I just don't see it. And if you have such fears, why only start here and now? We are waaay past that.
I think this is something that the majority of iPhone users are going to be indifferent to.

It's the same story as with the headphone jack all over again. Apple is the first to do it on a larger scale, takes all the heat for it, and once said practice becomes widely accepted (and perhaps even expected), other companies then jump on the bandwagon to score free brownie points.

Don't trust the word of any company that claims today it is never going to implement such a feature. Eventually, they all will. It's just that nobody wants to be the poster child for it.
 
That's so wrong! Hey Apple... Focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.

Isn't this a violation of privacy? I have a lot of nudes of myself on my phone. :( Should I start deleting them or what?

Well over 34,000 photos. You've got to be kidding me.

I'm scared. Should I be worried?
I have selfies and photos of other consenting adults in my library; however, I'm not keen on any entity being able to scan them. Most are headless except for a few facials.
 
Last edited: