
giggles

macrumors 65816
Dec 15, 2012
1,048
1,275
I don’t believe the “comparing hashes” party line. Child pornography is being manufactured every second of every day. There are billions of very capable cameras available worldwide in almost everyone’s palm, so I’m sure at least hundreds of child porn images are created every second. To think that there is a database of child porn with hashed images against which the pics on your iPhone can be compared is a ridiculous notion.

Our law enforcement can't even maintain a database of firearms, let alone a database of child porn images. The only effective way to try to identify child porn is to use AI to flag the images it thinks are child porn and then let humans decide which ones actually are. This is already being done by services like Facebook, with thousands of contractors reviewing images all day long and getting PTSD from their jobs.

Apple has breached my red line of where they should never have gone. At this point, I will trust Google before I trust Apple.

What are you talking about? The CSAM repository maintained by NCMEC is a real thing, and nobody claims it encompasses all the CP that has ever existed or is produced daily. Intercepting 100% of CP is outside the scope of this system.
 

brofkand

macrumors 65816
Jun 11, 2006
1,352
3,450
Tim Cook: Android already does things like that. We can introduce it at Apple as well. We'll just make it a bit more sophisticated and sell it as an improvement. That will work.
Me: I am not on Android, because I trusted Apple - until today. But now I will search for alternatives for my own private and legal stuff. Go to hell. As of today you have lost my confidence, dear Tim.

Apple Stock just dropped - interesting...

I might even think about removing my credit cards from Apple Pay.
And those boarding passes - I might dust off my old printer and buy some cartridges and paper...

The "Apple is private" lie is just that - a lie. There is no evidence that you are any more private on an Apple product vs. any other. Remember that Apple only started talking about privacy when their sales began to plateau and they had to come up with a reason to keep people in the ecosystem.
 

opfreak

macrumors regular
Oct 14, 2014
249
431
At runtime, your local pics are not encrypted, otherwise you wouldn’t be able to look at them.
Doing this “CP or not” labeling locally is exactly what allows Apple to not decrypt all your pics when they are already stored on iCloud.
You people have it backward, this system allows for less invasion of our privacy than searching for CSAM in the cloud like other companies do.
At least get your facts right, THEN we can discuss whether it should be done or not.

How is Apple adding picture-scanning software to your phone 'less invasion of our privacy'? Please explain that to me. Today they don't scan the pictures; tomorrow they will. Sounds like that's less privacy.
 
  • Like
Reactions: turbineseaplane

giggles

macrumors 65816
Dec 15, 2012
1,048
1,275
How is Apple adding picture-scanning software to your phone 'less invasion of our privacy'? Please explain that to me. Today they don't scan the pictures; tomorrow they will. Sounds like that's less privacy.
Because if you’re negative, it lives and dies fully within the boundaries of your device.
It’s a glorified “EXIF“ label generated locally. Afterwards your pics will be uploaded to iCloud encrypted end to end like nothing happened.

Only if you're positive multiple times is Apple involved in any way.

Whereas with server-side search, your pics are being scanned in a physical place that's thousands of miles from you, and maybe they are also being decrypted before being scanned.

Not sure how anyone would prefer the latter, keeping in mind we're talking about data you were about to upload to a server anyway. (This gets people... yes, it's local data, but only if it's soon-to-be-uploaded.)
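The threshold idea described above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration: the hash values, set contents, and threshold are made up, and in the real system the device cannot learn its own match count (that is hidden cryptographically); the point is only that nothing is surfaced until multiple matches accumulate.

```python
# Hypothetical sketch of threshold-based matching (illustrative names only,
# not Apple's actual API or parameters). A photo's local hash is checked
# against a set of known hashes; nothing is reported below the threshold.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the NCMEC-derived set
THRESHOLD = 3  # multiple matches are required before any human review

def scan_before_upload(photo_hashes):
    """Return matched hashes only if the match count reaches the threshold."""
    matches = [h for h in photo_hashes if h in KNOWN_HASHES]
    if len(matches) >= THRESHOLD:
        return matches  # account surfaced for review
    return []           # "negative": everything stays on-device

print(scan_before_upload(["a1b2", "zzzz"]))          # below threshold -> []
print(scan_before_upload(["a1b2", "c3d4", "e5f6"]))  # at threshold -> flagged
```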
 

Apple_Robert

Contributor
Sep 21, 2012
34,536
50,128
In the middle of several books.
Does apple scan if iCloud is turned off?
From what I see, scanning happens on the device whether iCloud photo is turned on or off. The apparent difference is if iCloud Photo is turned off, an alert by Apple is not triggered.

"CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off."
 
  • Like
Reactions: I7guy

cola79

macrumors 6502
Sep 19, 2013
380
437
So i guess most people won't update to iOS 15 then. And probably a large drop for device sales.
I mean what do they expect? They want to stop abuse by abusing everyone.

The same technique can be used to scan voice and text. So if you talk badly about your government, Apple's gonna tell the government's drones your location so they can eliminate you.

That's no science fiction, this works already and can be implemented within weeks.
 
  • Like
Reactions: turbineseaplane

cola79

macrumors 6502
Sep 19, 2013
380
437
From what I see, scanning happens on the device whether iCloud photo is turned on or off. The apparent difference is if iCloud Photo is turned off, an alert by Apple is not triggered.

"CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off."
That must be a lie.
There is no need to implement this if it only gets active in combination with iCloud photos.

The uploaded photos are scanned already.

So they confirm this is to enable mass control at any given time; if the leader of a nation demands it, they will give them anything. It's not about photos: the technique works for every kind of content on the phone and can even be used to scan voice calls in real time.

With this statement, Apple confirmed they no longer sell a product, but a weapon.

It is the customer now who has to decide if he abandons freedom and democracy for a damn stupid phone. If he does, he doesn't deserve better.
 
  • Disagree
Reactions: DeepIn2U

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
So i guess most people won't update to iOS 15 then. And probably a large drop for device sales.
I mean what do they expect? They want to stop abuse by abusing everyone.

The same technique can be used to scan voice and text. So if you talk badly about your government, Apple's gonna tell the government's drones your location so they can eliminate you.

That's no science fiction, this works already and can be implemented within weeks.
We COULD try to fight it by not updating to this 1984 Master Control Firmware.
...I doubt many will do that.

I do support every step against these abusive pictures, but I am not willing to have every photo scanned, whether it is acceptable or not.

Furthermore:
- Why do they scan "on device" but do nothing if I do not use the iPhoto library? What is the result of the scan used for?
- I am not willing to have a hash database on my device which is useless in my case but controls what I am doing.

Apple stock dropped Thursday and has stayed at that level for two days now. Let it drop more, please...
 

DeepIn2U

macrumors G5
May 30, 2002
12,853
6,892
Toronto, Ontario, Canada
So if I have a private picture of my own child in a bathtub splashing away, it will now flag, and some contractor working halfway around the world for pennies gets to view my child in the bathtub, without my permission, and determine if perhaps I am some kind of child abuser? Maybe they even keep a "souvenir" of the photos somehow. Then I get to talk to a detective and be forever flagged in some database as someone accused of, or at least investigated for, one of the most horrific crimes that exists? What could possibly go wrong?

UPDATE: There is no option not to have this scanning if you use Photos or iCloud Photos. However, upon reading more about this, it apparently works by encoding your image to a "hash string", which isn't the actual image, and then comparing that hash string to a list of KNOWN child abuse images from the internet (so unique images of your children playing in the bath aren't even an issue). If you have a certain number of exact matches, then and only then does a person "manually" look at the images and determine if a crime is occurring that they need to report. They claim there is a less than 1 in a TRILLION chance that a flagged account doesn't actually contain significant amounts of true child abuse imagery. I need to read more, but perhaps it isn't as creepy as it first sounded to me... but they need to be very, very transparent about what is going on.
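The "hash string" comparison described above can be illustrated with a toy example. Note this is a simplification: the real system uses a perceptual "NeuralHash", not a cryptographic hash like SHA-256, and the byte strings below are placeholders. The core point survives the simplification: matching is against hashes of known images, so a unique family photo has no counterpart in the database to match against.

```python
import hashlib

# Toy illustration of known-image hash matching. SHA-256 stands in for the
# real perceptual hash; the byte strings are placeholders, not real data.

known_csam_hashes = {hashlib.sha256(b"known-abuse-image-bytes").hexdigest()}

def is_known_image(image_bytes: bytes) -> bool:
    """True only if this exact image already appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_csam_hashes

print(is_known_image(b"my kid in the bathtub"))    # False: unique photo
print(is_known_image(b"known-abuse-image-bytes"))  # True: exact known image
```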

The real issue perhaps isn't false flags so much as how this technology could spread to scanning for other types of images (political etc.) to flag individuals. I get that it seems very "black helicopter" of me to say, but it is Apple itself who says they don't build backdoors into iOS because of the chance they could be abused.

I think the fear is that it could spread to something else, but this initiative and the technology only do what is described, so where is this other fear coming from? What else do you think will be flagged? You mean the silly post slightly above yours about scanning for people wearing this or not wearing that? Where is the fear coming from?
 

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
I think the fear is that it could spread to something else, but this initiative and the technology only do what is described, so where is this other fear coming from? What else do you think will be flagged? You mean the silly post slightly above yours about scanning for people wearing this or not wearing that? Where is the fear coming from?
If I may...
There is no fear, simply the fact that I do not want to be continuously checked on whether I am doing something against the law, just because I could do it. I am NOT doing it. Being controlled just tells me that I am regarded as a possible liar by default and have to be checked first.

You know: security control at airports. I COULD carry something dangerous, so I am scanned - every time - 100%. Security control on my smartphone: I COULD do something wrong, so I will be scanned - continuously.
 

turbineseaplane

macrumors Pentium
Mar 19, 2008
15,014
32,189
Don’t understand the big deal.

This isn’t people at Apple scanning people’s photos, it’s on-board AI that’s looking for matches with an existing database.

Your photo library is already being scanned for faces (People feature) and other points of interest.

I can't believe you typed this and can't understand why people wouldn't be ok with it.
 
  • Like
Reactions: ssgbryan

DeepIn2U

macrumors G5
May 30, 2002
12,853
6,892
Toronto, Ontario, Canada
Prediction: Apple will have to keep signing the last version of iOS 14 for all devices, not just vintage ones. Otherwise, they will be sued for forcing people to use iOS 15 "BigBrother."
Suing Apple for trying to keep child pornography from proliferating? I'd love to see how that court case goes.
 

Rigby

macrumors 603
Aug 5, 2008
6,225
10,170
San Jose, CA
Some interesting speculation by Matt Green:


He thinks this scanning will only run on devices that have a neural engine (i.e. newer iDevices and M1 Macs) because the secret hash function needs to be executed in a secure environment to prevent it from leaking. That would mean that at least those of us still on Intel Macs could possibly be spared.
 

DeepIn2U

macrumors G5
May 30, 2002
12,853
6,892
Toronto, Ontario, Canada
If I may...
There is no fear, simply the fact that I do not want to be continuously checked on whether I am doing something against the law, just because I could do it. I am NOT doing it. Being controlled just tells me that I am regarded as a possible liar by default and have to be checked first.

You know: security control at airports. I COULD carry something dangerous, so I am scanned - every time - 100%. Security control on my smartphone: I COULD do something wrong, so I will be scanned - continuously.
OK I’ll consider your debate.
First and foremost, you are not being controlled. Period. You can take, save, and share pictures all you like. Apple has already stated that if you disable iCloud Photos, your photos will not be part of the CSAM initiative.
Second, a lot of the other services and OSes you use already do this without your consent and without telling you; Apple at least is informing everybody publicly. Again, this isn't controlling anybody. You choose what you do, and it remains your choice; Apple is not changing your choice to do this or that or anything else, get that straight.
Being controlled would mean stopping you from using the phone or from using iCloud. If you are doing wrong, then your account will be disabled, whether temporarily until the proper person looks at the data, or permanently.

Basic understanding: this is in no way different from having a license to drive a car. It's a privilege, not a right. Do something wrong, and keep doing things wrong, and the license will be revoked. The license doesn't control whether you choose to drive, speed on the highway, or hit people on the sidewalk; it doesn't control your actions, it punishes consistent wrong behavior. License or not, there are laws on the street for cars and for pedestrians; newsflash, there are also laws against child pornography.

If you are inadvertently in possession of something and you had no idea, then yes, you are going to be looked at; you may be questioned, and there may be an investigation. It may turn out that you are completely innocent, but the link to what you had could lead to catching very, very evil people, and that will play itself out. Be honourable and you're good.

I think you are internalizing this as a personal fear versus the greater good of the community and of society as a whole. You may not be a parent, so you may not fully understand the gravity of why we need CSAM detection in the first place: there are disgusting traffickers of such content. You may even support this, but perhaps the perception that it could be used as a totalitarian tool puts you off. I think that is more of an American perception, born of the fears surrounding gun laws and the recent attacks on those laws.

If you work for a corporation, understand that having a link sent to you in a chat doesn't automatically make you guilty or part of anything; if you click on it without thinking, the network records it, but if you report it you are in the clear. This particular effort is focused on child security and safety; try not to treat it as touching everything else you do in your life, because that's not it. If you are contacted about it, be the honourable person that you are; you are not going to be arrested or jailed for 20 years over something you were not privy to.
 
  • Like
Reactions: I7guy

ssgbryan

macrumors 65816
Jul 18, 2002
1,488
1,420
Kind of agree with this and kind of don't.
I don't think your government, (or collectively the people they serve), actually want a database for firearms.
No need to - the NRA will literally sell their firearms database to anyone that wants a copy.

No need to do the work if they will do it for you.
 

sdf

macrumors 6502a
Jan 29, 2004
862
1,169
What if someone shaved all their body hair and took a selfie to share with another pervert?

Asking for a friend...
Is either account a child?

Because Apple actually does know that.
 

sdf

macrumors 6502a
Jan 29, 2004
862
1,169
In Apple's document here https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf they claim an account has only a one in one trillion chance of being falsely flagged. Because there are nowhere near one trillion iCloud accounts, we can probably assume there has never been and probably never will be an account incorrectly flagged. But by the same token, how would they know the one in one trillion figure is accurate, given that the instance has never occurred?
First part is correct. Last part is not. That estimate was no doubt made from the inside out.
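An "inside out" estimate of this kind can be sketched with a simple binomial model: assume a per-image false-match probability and a threshold of matches required before an account is flagged, then compute the account-level probability. All numbers below are invented for illustration; Apple has not published its internal parameters.

```python
# Back-of-the-envelope sketch: if each of n photos independently false-matches
# with probability p, and an account is flagged only at t or more matches,
# the account-level false-flag probability is a binomial tail. The values of
# n, p, and t here are made up, purely to show how such a figure is derived.

def account_false_flag_prob(n: int, p: float, t: int) -> float:
    """P(at least t false matches among n photos), binomial model.

    Summed term by term via P(k+1) = P(k) * (n-k)/(k+1) * p/(1-p) to avoid
    overflowing floats with huge binomial coefficients.
    """
    term = (1.0 - p) ** n  # P(exactly 0 matches)
    cdf_below = 0.0
    for k in range(t):
        cdf_below += term
        term *= (n - k) / (k + 1) * p / (1.0 - p)
    return 1.0 - cdf_below

prob = account_false_flag_prob(n=10_000, p=1e-6, t=5)
print(prob)  # a tiny number: the threshold pushes the account-level rate
             # far below the per-image rate
```

This is why a trillion-scale claim does not require a trillion observed accounts: it falls out of the per-image rate and the match threshold.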
 

Rigby

macrumors 603
Aug 5, 2008
6,225
10,170
San Jose, CA
Obviously. I was answering something specific, though; this reply wasn't in any way useful.
Personally I think your answer wasn't in any way useful, given that Apple's on-device scanning approach is undercutting any kind of E2E encryption anyway.
 

LV426

macrumors 68000
Jan 22, 2013
1,840
2,275
Your next iOS update will be about 100MB larger than it otherwise would be in order to accommodate the fuzzy/neural hashes of known CSAM photos. Which is nice.
Actually, I retract this assertion on the grounds that it would be impractical madness for an ever-growing blacklist. Probably more like this:

1. Get ready to upload your photo to iCloud.
2. On-device algorithm scans the photo and works out one of the newfangled neural hash codes for it.
3. Phone asks an Apple Server if the hash code is on a blacklist.
4. Phone tags the photo with a warning “voucher” if that is the case.
5. Phone sends the photo on its way, complete with voucher if there is one.
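The speculated steps above can be sketched as follows. Every name here is hypothetical, and step 3 in particular is the poster's guess: reportedly the real protocol attaches a cryptographic "safety voucher" via private set intersection, so the phone never directly receives a yes/no answer from the server.

```python
import hashlib

# Sketch of the speculated upload flow; step numbers match the list above.
# All names and values are hypothetical stand-ins.

BLACKLIST = {"deadbeef"}  # stand-in for a server-side hash blacklist

def neural_hash(photo_bytes: bytes) -> str:
    """Step 2: stand-in for the on-device perceptual hash."""
    return hashlib.md5(photo_bytes).hexdigest()[:8]

def server_blacklist_contains(hash_code: str) -> bool:
    """Step 3: hypothetical server-side lookup."""
    return hash_code in BLACKLIST

def upload_photo(photo_bytes: bytes) -> dict:
    hash_code = neural_hash(photo_bytes)
    voucher = server_blacklist_contains(hash_code)  # step 4: tag if matched
    return {"photo": photo_bytes, "voucher": voucher}  # step 5: send on its way

print(upload_photo(b"holiday.jpg")["voucher"])  # False for an unknown photo
```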
 

Abazigal

Contributor
Jul 18, 2011
19,688
22,231
Singapore
So i guess most people won't update to iOS 15 then. And probably a large drop for device sales.
I mean what do they expect? They want to stop abuse by abusing everyone.

The same technique can be used to scan voice and text. So if you talk badly about your government, Apple's gonna tell the government's drones your location so they can eliminate you.

That's no science fiction, this works already and can be implemented within weeks.

I think you will find that far fewer people care about this beyond the echo chamber that is tech blogs and forums.
 
  • Love
Reactions: I7guy

H2SO4

macrumors 603
Nov 4, 2008
5,674
6,954
No need to - the NRA will literally sell their firearms database to anyone that wants a copy.

No need to do the work if they will do it for you.
Well, maybe, but why does it need to be sold?
The authorities should be keeping a list; they already do that with driving licenses.
 

[AUT] Thomas

macrumors 6502a
Mar 13, 2016
778
989
Graz [Austria]
There absolutely is: in the scenario you outlined, you would be not just the receiver but the origin of the child porn. That absolutely needs to be investigated in a rational society, especially a democratic one.
Pardon my late reply... define child porn. Is it a picture of a naked child, which is very common for parents to have, e.g. pictures of their kids in the bathtub with a foam beard or so?
However, the creepy neighbour taking pictures of your naked child bathing is a whole different story.
The legal definition of porn, or child porn for that matter, is also not the same all over the world. There are a lot of deeper issues when you spend some time thinking about the implications. (One common example is sexting between teenagers, and particularly the content resulting from that.)
So, just from that perspective there are already a lot of reasons against this.
However, it becomes ridiculous when you consider that most messengers recompress the data when it is sent. This compression is often hardware-accelerated, and the output will vary a little with every compression. So, by now at the latest, the hash-based approach is of little use, even if the average pedo were stupid enough to send their stuff over regular messengers and store it in their iPhone photo library, let alone iCloud. You could also expect that even common messengers will now probably alter JPEGs very slightly before saving to the camera roll, finally turning that hash-based approach into useless nonsense.
What remains is Apple scanning innocent people's pictures (hash-based or not) for being possible pedos.
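The fragility the poster describes is real for exact (cryptographic) hashes: change a single bit, as any lossy re-encode trivially would, and the digest is completely different. One caveat, stated as context rather than rebuttal: Apple describes NeuralHash as a perceptual hash designed to survive resizing and recompression, so this demonstration applies to exact matching, and how far slight edits defeat a perceptual hash is a separate question. The bytes below are synthetic placeholders.

```python
import hashlib

# Demonstrates the avalanche effect of a cryptographic hash: one flipped bit
# in a (synthetic) image payload yields an entirely different SHA-256 digest,
# so exact-hash matching breaks on any recompression or edit.

original = b"\xff\xd8\xff\xe0" + b"jpeg-payload" * 100  # fake JPEG-ish bytes
recompressed = bytearray(original)
recompressed[50] ^= 1  # flip a single bit, as a re-encode trivially would

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(recompressed)).hexdigest()
print(h1 == h2)  # False: the digests share essentially nothing
```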

You might want to read up on upload filters. It's a very similar issue (but certainly not the same).
 

Powerbooky

macrumors demi-god
Mar 15, 2008
597
499
Europe
Fact is… the Photos app already scans locally. Not just for "persons", but for all kinds of subjects. I don't tag or organise my photos, but I can still search on keywords I never set up myself. The other day I was looking for a picture with a motorcycle. I could not remember when I took it, but it was easily found just by typing "motorcycle".

My iPhone is actively scanning stuff, especially during nighttime when it is "idle" on the table. Every time I take a bunch of photos during the day, the battery drains more at night than when I haven't done much the day before.
 

iansilv

macrumors 65816
Jun 2, 2007
1,085
378
They might not do that NOW, but there’s no telling they will not do those LATER.
Well, we can cross that bridge later. Are you protesting Twitter censoring people off its platform? Are you protesting fact-checking on Facebook? If you are, kudos to you; you're consistent in your morality. If not, well, I think you have some soul-searching to do.
 