
TheToolGuide

macrumors regular
Aug 11, 2021
118
87
Apple sure has come a long way from the "You're holding it wrong" era. Good to see they actually consider user feedback (e.g. stage manager on non m1, etc)
This wasn’t exactly user feedback. The system was never put into place for people to even see how it worked, and almost no one on here who didn’t like it understood how it worked; they were often just quoting headlines, not the technology behind it.

That being said, even though I was in support of the CSAM idea, I also recognize that people need to feel comfortable with it and I need to accept the outcome. Apple likely could have done a much better job rolling this out and helping people better understand how it was designed to work. Not everyone was going to like it, but in the end I feel it was a loss while others will feel like it's a win.
 
  • Disagree
Reactions: MuppetGate

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
That's the Law Enforcement and Government job. Not Apple's.
This is everyone’s job. People are often not good at identifying and/or recognizing when it is happening around them, despite the steps pedophiles take to hide their actions. There are many instances every day when someone should’ve spoken up but didn’t. We are flawed as well.

It is the government’s job to pass laws that better protect against and prohibit this deplorable behavior, along with promoting programs that provide education and treatment to prevent people from succumbing to this mental illness. It is law enforcement’s job to investigate and uphold the law. While law enforcement has its own flaws, it still needs tools and technology to capture these people and stop them. Was this the right way? Many people felt it was not.

There were some good intentions from Apple in trying to keep this kind of material off their servers, and likely legal issues they were trying to address at the same time. I don’t want to suppress good intentions from anyone, company or otherwise, who takes on the nearly insurmountable task of catching and inhibiting these people.

There were many so-called security experts and companies who campaigned against this purely for monetary reasons and less because of privacy and backdoor issues.

I hope that people take away from this that pedophilia is still a problem and we need better tools to stop this. Not ‘Yay, we stopped a thing I didn’t fully understand because I read a headline. Sorry kids.’
 

Fat_Guy

macrumors 65816
Feb 10, 2021
1,012
1,078
Apple probably figured since the customer buys the phone there would be legal implications for them scanning on the device.

Let’s see how things go with their “rent a phone” subscription service. Then the phone is theirs, and everything on it, so they can scan away….

Always that - but the upside is that you can get a free yearly upgrade for something like USB-C (even though I got USB-C on a new phone I bought for 150 bucks…) for a small extra fee.
 
  • Like
Reactions: MuppetGate

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
Why?

Just because Apple doesn't scan for content, doesn't mean law enforcement cannot subpoena/compel access to the content that may be stored on Apple's servers once there is proper legal cause to do so.
I was for the idea of CSAM detection. More importantly, nothing has really changed. This was an attempt at a more proactive way to inhibit and stop this type of behavior. Law enforcement can only do so much.

The people have spoken and said this was too far, Apple. I was comfortable with it but many weren’t. Sadly I don’t think enough is being done on this particular issue, and no one is really coming together to say ‘okay, we won’t do that, but what will we do instead of just doing the same old thing?’

Of course this is often the case for any issue society has to deal with and can’t agree on. Likely no solution is going to make everyone happy. I just wish we could come up with something that can make more of an impact.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,690
I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
As one of the opponents to this whole idea, it wasn't just about getting a backdoor; it was about using your device and its capabilities to spy on you. The backdoor was just what we worried would come next. Change the hash database and it could look for anything of yours.
 

I7guy

macrumors Nehalem
Nov 30, 2013
34,306
24,037
Gotta be in it to win it
As one of the opponents to this whole idea, it wasn't just about getting a backdoor; it was about using your device and its capabilities to spy on you. The backdoor was just what we worried would come next. Change the hash database and it could look for anything of yours.
Sure, in a world of endless possibilities I agree.
 

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
Pretty sure most of us did not have issues with Apple scanning iCloud…. The issue was scanning our devices… no matter how secure or well-intentioned, it was just a bridge too far…. I’m still perfectly fine with them scanning anything I upload to the cloud. I think people still confuse what was at issue.
 
  • Love
Reactions: bobcomer

Analog Kid

macrumors G3
Mar 4, 2003
8,916
11,477
Apple was most definitely going for semantics:
"The embedding network represents images as real-valued vectors and ensures that perceptually and
semantically similar images have close descriptors in the sense of angular distance or cosine similarity.
Perceptually and semantically different images have descriptors farther apart, which results in larger
angular distances."


The difference is not that clear-cut. The system extracts features from the image, and based on these features a neural network produces an image descriptor. If this descriptor is sufficiently similar to the image descriptor of a known CSAM image, the image will be flagged. Now yes, I understand that this type of system relies on existing images and is not capable of finding entirely new types of CSAM. But NCMEC was to provide its five million image hashes, that is a lot of images for a subject matter, and if you then go for similarity matching rather than exact matching, you have for all intents and purposes a CSAM classifier.
Training matters.

Semantic classifiers are given many examples of cats, trained to semantically identify cats in general and to distinguish them from other things, and then are asked to infer from a new image if it is a cat. Typically it is trained on multiple classes (cat, dog, house, car) and upon inference returns a confidence against each class. This is probably how "safe search" image filters are implemented-- trained to find stuff that looks like a general definition of porn.

That is not how the Apple NeuralHash is trained. It is trained to detect a specific image under perceptually invariant transforms, not a class of related but distinct images. It is not detecting CSAM; it is detecting specific instances of CSAM. It is trained to distinguish those instances from examples that are not them, to prevent it from, as an oversimplified example, declaring that an image with a lot of flesh tone is CSAM.
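To make that concrete, here's a minimal sketch (my own illustration, not Apple's code) of what instance matching against a set of known images looks like: a descriptor for the incoming image is compared against stored descriptors of specific known images, and it is flagged only if it is nearly identical to one of them in the cosine-similarity sense Apple's paper describes. The descriptor size, threshold, and names are all hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two descriptor vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches_known_image(descriptor: np.ndarray,
                        known_descriptors: list[np.ndarray],
                        threshold: float = 0.95) -> bool:
    """Flag only if the descriptor is very close to a *specific* known image's
    descriptor -- instance matching, not a semantic 'does this look like CSAM?' classifier."""
    return any(cosine_similarity(descriptor, k) >= threshold for k in known_descriptors)

# Hypothetical usage: in reality the descriptors would come from the embedding network.
rng = np.random.default_rng(0)
known = [rng.normal(size=128) for _ in range(3)]         # stand-ins for known-image descriptors
near_copy = known[0] + rng.normal(scale=0.01, size=128)  # lightly transformed copy of a known image
unrelated = rng.normal(size=128)                         # some other image entirely

print(matches_known_image(near_copy, known))   # True: near-duplicate of a known instance
print(matches_known_image(unrelated, known))   # False: novel content never matches
```

The only question that sketch ever answers is "is this nearly the same picture as one already in the database?"; whether a brand-new image depicts similar subject matter never enters into it.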


They are not perfectly reversible, that is true. But you can recreate a recognizable approximation of the original image. See: https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277

This has been shown to work with PhotoDNA, a perceptual image hashing system widely used for CSAM detection. Maybe Apple's system would have been immune to this, but they spent quite some effort to prevent people from even trying, so I have my doubts.

No, you can't.

Here's an input image from that link:
[attached image: 1670537898964.png]

Here's the image "reversed" from the Domino's hash:
[attached image: 1670537951837.png]

That doesn't look like anything I'd be worried about.



A few things: first, your link doesn't use PhotoDNA; it uses the Python imagehash library, which has not been designed to resist attacks such as the ones you're discussing.

Second, it's using a GAN to create another image that matches that hash, but not necessarily the image that created the hash. Specifically it's saying "create an image that is a face and matches this hash". Since the first examples are all faces filling the frame on the input, and the GAN is creating a face filling the frame, it's easy to freak out and say "it's reversible!". It's also easy to say "they even got the hair color right!", because the hash being used is the ahash which is based on the average color of a region.

The example I shared above is what happens when the image is arbitrary, as it would be in your photo library, and you ask the GAN to "create an image that is a face and matches the hash". What you get is nothing like the input. You could do the same with a GAN trained to make pictures of cars that match a hash, or landscapes that match a hash -- all would look like something, but nothing like what the original image was. Then imagine the image was 8MP rather than 0.044MP and the variation that would lead to. This is an example of what I meant by "spoofing": generating an image that matches the hash but looks nothing like the image that created the hash.
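For reference, here's roughly what that ahash does (a sketch of the standard average-hash algorithm, not the article's exact code): shrink the image to an 8x8 grayscale grid and record one bit per cell saying whether it's brighter than the mean. That's 64 bits of coarse brightness layout, which is why a GAN can satisfy the hash while producing something that looks nothing like the original.

```python
from PIL import Image
import numpy as np

def average_hash(path: str, hash_size: int = 8) -> int:
    """Standard average hash: shrink to hash_size x hash_size, convert to grayscale,
    then set one bit per cell that is brighter than the mean of the grid."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float64)
    bits = (pixels > pixels.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming_distance(h1: int, h2: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: any two images whose coarse 8x8 brightness layouts agree will
# collide or nearly collide, regardless of what they actually depict.
# print(hamming_distance(average_hash("original.jpg"), average_hash("gan_output.jpg")))
```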
 
Last edited:

laptech

macrumors 68040
Apr 26, 2013
3,582
3,986
Earth
Just you wait: six months, a year, two years, three years? The police will crack a child sex ring, and as part of their investigation they'll report that thousands upon thousands of child sex images were found on Apple's iCloud servers. Parents, relatives, and children's charities will be asking Apple why it didn't do anything, or enough, to prevent the spread of such images on its iCloud servers, and then someone will speak up and say 'well, Apple planned to, but it got canned due to public pressure over users' right to privacy' (quoting today's date).
 

DeepIn2U

macrumors G5
May 30, 2002
12,826
6,880
Toronto, Ontario, Canada
They were ready to roll it out, but backed off when their consumer base set their brand on fire for combing through their private data.

You think they announce stuff like this without having every detail developed and implemented? They even replied with the chance of getting a false positive.

Seriously?

smh.

Doesn’t matter; it was NOT rolled out.

Period.

End of discussion despite your hypothesis.

SMH, like get real.
 

DeepIn2U

macrumors G5
May 30, 2002
12,826
6,880
Toronto, Ontario, Canada
Yes. That’s all it takes to lose users trust. They have planted the seed of doubt in my and many other end users minds. I’m not going to forget and just blindly take them at their word. If you believe any corporation has your best interests in mind and you think they’re always telling the truth to users, get with the program. Apple probably realized how badly they eroded trust in their otherwise loyal base. All it takes is one slip up for people to lose trust permanently.
By that analogy, I'm sure there are MANY products, tied by direct or indirect association to actions or statements that were backpedaled, that you'd mistrust; you'd probably question everything in your home, including the home itself and the mortgages you've had.

That kind of “mistrust” and action isn’t something I could live by. To me it’s actions over words.

So yes, when this is implemented outside of beta, Apple will show by their actions that they stand by their words and correct mistakes, as they continually have, year after year after year.

For all of the pundits on these boards against anything CSAM-related, concerning Apple's mere mention of it a year ago: NOBODY has complained about what Google, Microsoft, and others have actually done, which affects all of their users. So the mistrust against Apple is poorly placed.

“If a man (or company) is guilty for what goes on in his (their) own mind, then give me the electric chair for all my future crimes” - Prince.

I apply this to myself, to people I meet, to my circle, and to businesses.

Actions speak louder than words. My trust is in actions. My mistrust is also in actions as well not statements.
 

Analog Kid

macrumors G3
Mar 4, 2003
8,916
11,477
As one of the opponents to this whole idea, it wasn't just about getting a backdoor; it was about using your device and its capabilities to spy on you. The backdoor was just what we worried would come next. Change the hash database and it could look for anything of yours.

That first sentence gets to the point of what my reservations were. On the one hand, this seemed like a very narrow and customer-friendly approach to limit government overreach. CSAM is very, very bad. We should find and prosecute offenders. We shouldn't let it be an excuse to root through a user's file system looking for something else, and it should tell the government, Apple, and everyone else absolutely nothing about the innocent. Apple pretty much made sure they accomplished that in a way that also enabled end-to-end encryption in the cloud.

In my mind, the "what will autocrats do" argument is just as over the top and irrelevant as saying we can't have encryption because we need to protect children. Authoritarians will either find a way to bend Apple to their will or will throw Apple out and support a bootlick competitor.

My discomfort is less grandiose. Sometimes I break the law by exceeding the posted speed limit. My phone knows I'm in my car because of Bluetooth or CarPlay (thus parked car waypoints), and it knows how fast I'm going. Should my phone notify law enforcement that I'm speeding?

It seems like a much less noble argument, but it's actually much more relevant. We all try to wriggle out of speeding tickets like it's a sport, but speeding is a crime. Traffic accidents affect a lot of people. How many other crimes like this could technology enforce?

If my technology witnesses a crime and is technically able to report it, should it be obliged to do so? Is demanding that my technology not snitch basically asking Apple to conspire with me on a crime?

This is a bit like the trolley problem in autonomous vehicles. Systems will eventually be smart enough to know the car can save the lives of two pedestrians by making a decision that puts the driver at risk. Should they?

And then what I consider the kicker: once the public knows the car will make that decision, will anyone buy one?
 
Last edited:
  • Love
Reactions: Darth Tulhu

bobcomer

macrumors 601
May 18, 2015
4,949
3,690
That first sentence gets to the point of what my reservations were. On the one hand, this seemed like a very narrow and customer-friendly approach to limit government overreach. CSAM is very, very bad. We should find and prosecute offenders. We shouldn't let it be an excuse to root through a user's file system looking for something else, and it should tell the government, Apple, and everyone else absolutely nothing about the innocent. Apple pretty much made sure they accomplished that in a way that also enabled end-to-end encryption in the cloud.

In my mind, the "what will autocrats do" argument is just as over the top and irrelevant as saying we can't have encryption because we need to protect children. Authoritarians will either find a way to bend Apple to their will or will throw Apple out and support a bootlick competitor.

My discomfort is less grandiose. Sometimes I break the law by exceeding the posted speed limit. My phone knows I'm in my car because of Bluetooth or CarPlay (thus parked car waypoints), and it knows how fast I'm going. Should my phone notify law enforcement that I'm speeding?

It seems like a much less noble argument, but it's actually much more relevant. We all try to wriggle out of speeding tickets like it's a sport, but speeding is a crime. Traffic accidents affect a lot of people. How many other crimes like this could technology enforce?

If my technology witnesses a crime and is technically able to report it, should it be obliged to do so? Is demanding that my technology not snitch basically asking Apple to conspire with me on a crime?

This is a bit like the trolley problem in autonomous vehicles. Systems will eventually be smart enough to know the car can save the lives of two pedestrians by making a decision that puts the driver at risk. Should they?

And then what I consider the kicker: once the public knows the car will make that decision, will anyone buy one?
Interesting thoughts!
 

SnappleRumors

Suspended
Aug 22, 2022
394
515
You got nothing to hide!

If the government can stick its hand down my pants searching for bombs, it can snoop on people's cell phone picture library to make sure they're not diddling kids!!

You have nothing to hide! If you're not a kiddie molester, you have nothing to worry about!!!

The difference is consent, reasonable suspicion, and probable cause.
 

SnappleRumors

Suspended
Aug 22, 2022
394
515
This wasn’t exactly user feedback. The system was never put into place for people to even see how it worked, and almost no one on here who didn’t like it understood how it worked; they were often just quoting headlines, not the technology behind it.

That being said, even though I was in support of the CSAM idea, I also recognize that people need to feel comfortable with it and I need to accept the outcome. Apple likely could have done a much better job rolling this out and helping people better understand how it was designed to work. Not everyone was going to like it, but in the end I feel it was a loss while others will feel like it's a win.

CSAM scanning is completely inconsistent with their recent advanced security deployment.
Apple realized long ago they were going in that direction and knew it would be impossible to justify carving out a CSAM exception.
 

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
CSAM scanning is completely inconsistent with their recent advanced security deployment.
Apple realized long ago they were going in that direction and knew it would be impossible to justify carving out a CSAM exception.
If you had read up on the underlying technology of what they were going to do, you would realize your opinion doesn’t apply. This was on-device scanning; it used ID tags compared against known illegal photos, those tags were tallied up before a warning was ever sent out, a threshold had to be met, a review had to be made by a human to verify it wasn’t an error, the odds of a false match were astronomically low, and then law enforcement would be involved once it was verified the tags matched known CSAM.

End-to-end encryption was never broken, since it uses tags generated on device. While files can be encrypted in the cloud, Apple wasn’t scanning those files. They were using the ID tags (safety vouchers, I think they called them), and if the data was encrypted and needed to be reviewed, there are solutions a company can create and evolve to address security without completely invading privacy.
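As a rough illustration of the threshold idea described above (a simplification on my part: the actual design wrapped each match in an encrypted safety voucher and used threshold secret sharing, so even Apple couldn't see individual matches below the threshold), the gating logic amounts to something like this, with the names and the default threshold value being hypothetical:

```python
def should_trigger_human_review(match_flags: list[bool], threshold: int = 30) -> bool:
    """Nothing is surfaced for human review until the number of matches against the
    known-image database reaches the threshold; isolated matches reveal nothing on
    their own. (Simplified -- the real system hid even the count cryptographically.)"""
    return sum(match_flags) >= threshold

# e.g. should_trigger_human_review([True] * 5 + [False] * 200) -> False
```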

Unlike many, I took quite a bit of time to read up on the technology. IMO it was quite secure. Thankfully, I encourage discussion and understanding. The media, political pundits, and loudest internet voices all said they didn’t trust or want it, whether they took the time to understand it or not. My opinion is that the majority did not even have a clue how it worked, just that it was a backdoor for conspiracies. However, that is my opinion and I could likely be in the minority.

At the end of the day it doesn’t matter what I think about any of it. I thought it was a good approach to addressing the issue of pedophilia around the world, but it won’t be put in place, for various reasons and opinions. Yet no one has proposed a better solution, and the problem is still growing, not shrinking.
 

Darth Tulhu

macrumors 68020
Apr 10, 2019
2,191
3,660
You got nothing to hide!

If the government can stick its hand down my pants searching for bombs, it can snoop on people's cell phone picture library to make sure they're not diddling kids!!

You have nothing to hide! If you're not a kiddie molester, you have nothing to worry about!!!
Except you don't know if the officer patting my young daughter down (because her phone was flagged) is a rapist or a kiddie molester himself.

You cannot see if someone is a racist, a rapist, a murderer, or a pedophile, unless they give it away themselves.

And therein lies the problem.

All societal-control systems are run by humans. And there is currently no way to look inside a human's heart and see their intentions.

So the laws must mitigate this. What is written MATTERS.
 

robbietop

Suspended
Jun 7, 2017
876
1,169
Good Ol' US of A
Except you don't know if the officer patting my young daughter down (because her phone was flagged) is a rapist or a kiddie molester himself.

You cannot see if someone is a racist, a rapist, a murderer, or a pedophile, unless they give it away themselves.

And therein lies the problem.

All societal-control systems are run by humans. And there is currently no way to look inside a human's heart and see their intentions.

So the laws must mitigate this. What is written MATTERS.
The officer patting your daughter down is only molesting her by government orders. She might be carrying a bomb, sir.
You've done nothing wrong, right? You're not hiding anything, right? So why can't the TSA diddle young girls going to the airport?
We must write laws so the government can see everything on your phone, just in case you and your TSA agent are diddling your daughter together in the back of the airport Subway.
 

robbietop

Suspended
Jun 7, 2017
876
1,169
Good Ol' US of A
The difference is consent, reasonable suspicion, and probable cause.
I reasonably suspicion that TSA agents are government diddlers in disguise sent by Joe Biden and Jeffrey Epstein. It's all consent because you consented to having a government diddle children at the airport when they knocked the towers down and we all got really angry and forgot civil rights and allowed us all to be molested in line with our shoes off.
The shoes being off is the key to a good government molesting.
 

SnappleRumors

Suspended
Aug 22, 2022
394
515
I reasonably suspicion that TSA agents are government diddlers in disguise sent by Joe Biden and Jeffrey Epstein. It's all consent because you consented to having a government diddle children at the airport when they knocked the towers down and we all got really angry and forgot civil rights and allowed us all to be molested in line with our shoes off.
The shoes being off is the key to a good government molesting.

I’m gonna need to read this over a few times.
 

I7guy

macrumors Nehalem
Nov 30, 2013
34,306
24,037
Gotta be in it to win it
That first sentence gets to the point of what my reservations were. On the one hand, this seemed like a very narrow and customer-friendly approach to limit government overreach. CSAM is very, very bad. We should find and prosecute offenders. We shouldn't let it be an excuse to root through a user's file system looking for something else, and it should tell the government, Apple, and everyone else absolutely nothing about the innocent. Apple pretty much made sure they accomplished that in a way that also enabled end-to-end encryption in the cloud.

In my mind, the "what will autocrats do" argument is just as over the top and irrelevant as saying we can't have encryption because we need to protect children. Authoritarians will either find a way to bend Apple to their will or will throw Apple out and support a bootlick competitor.

My discomfort is less grandiose. Sometimes I break the law by exceeding the posted speed limit. My phone knows I'm in my car because of Bluetooth or CarPlay (thus parked car waypoints), and it knows how fast I'm going. Should my phone notify law enforcement that I'm speeding?

It seems like a much less noble argument, but it's actually much more relevant. We all try to wriggle out of speeding tickets like it's a sport, but speeding is a crime. Traffic accidents affect a lot of people. How many other crimes like this could technology enforce?

If my technology witnesses a crime and is technically able to report it, should it be obliged to do so? Is demanding that my technology not snitch basically asking Apple to conspire with me on a crime?

This is a bit like the trolley problem in autonomous vehicles. Systems will eventually be smart enough to know the car can save the lives of two pedestrians by making a decision that puts the driver at risk. Should they?

And then what I consider the kicker: once the public knows the car will make that decision, will anyone buy one?
CSAM and speeding are not equivalent. You could be caught speeding and the officer could let you off with a warning. Should people who have CSAM materials get let off with a warning? Traffic accidents from speeding are a part of society. The risk of death and injury inherent in all that we do in our daily lives is always present.

CSAM is in no circumstances an acceptable part of society, and we might have to balance privacy with tech to stop the interwebs from being an enabler of CSAM. (And sure, there are clearly issues with using tech to catch CSAM, as in that unfortunate incident of a parent sending pics of their child to the doctor using Gmail, which is a clear breakdown in the systems and common sense.)
 
  • Love
Reactions: compwiz1202

Analog Kid

macrumors G3
Mar 4, 2003
8,916
11,477
CSAM and speeding are not equivalent. You could be caught speeding and the officer could let you off with a warning. Should people who have CSAM materials get let off with a warning? Traffic accidents from speeding are a part of society. The risk of death and injury inherent in all that we do in our daily lives is always present.

CSAM is in no circumstances an acceptable part of society, and we might have to balance privacy with tech to stop the interwebs from being an enabler of CSAM. (And sure, there are clearly issues with using tech to catch CSAM, as in that unfortunate incident of a parent sending pics of their child to the doctor using Gmail, which is a clear breakdown in the systems and common sense.)

I think you're missing my point. I am explicitly saying the two aren't equivalent, which is why it's interesting to think about.

The debate so far has been on hyperbolic arguments. Exploited children on one side, despots on the other. Holding up the one time a bad solution turned out good against the one time a good solution turned out bad.

As I said, I'm not bothered by going after and prosecuting child predators, I think Apple did this in a way that was private and secure, and I don't think it's an easy backdoor for dictators. I don't think the arguments against the CSAM scanning hold up. What I was addressing is why I'm still uncomfortable. I don't think the question is "should your technology be allowed to spy on you doing horrible things"; I think the question is "should it be allowed to spy on you at all."

The ethical arguments are nuanced here and that means shifting the perspective from hyperbolic to mundane. If an argument can be made that society benefits from technology enforcing laws, should it? If it does, how does that affect our relationship with and demand for the technology?

I started thinking about speed limits because they do hold such an ambivalent place in our minds. If we can't make a case for enforcing speed limits but can make a case for CSAM then that means there's a threshold somewhere in between. Where is it? How do we define it? How do we hold the line?

I'm not sure we can answer those questions and therefore hesitate to support implementing something like this even if we all agree the specific use is well on the justifiable side of the line.
 
Last edited: