
antiprotest

macrumors 601
Apr 19, 2010
4,044
14,261
Does anyone know what happens if someone sends his enemy a bunch of CSAM pictures in iMessage? Suppose the target does not always check his messages and attachments and he just leaves them there. What happens then?

I am not using this as an argument against CSAM. I am only curious if anyone knows how something like this will be handled.

If there is not a technological or procedural system in place to handle this, then CSAM can be weaponized and become like swatting somebody.

Does anyone know what happens?
 
Last edited:

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
Let’s say, you are a parent and you take completely innocent photos of your child.
Your images, because you JUST took them, wouldn’t be in the CSAM repository, and it’s very unlikely there would be a match. If instead you are involved in a situation where those images are presented as evidence in a criminal case, it is POSSIBLE that they might end up in the CSAM repository and get hash matched. AFTER that point, then, yeah, if you still had the image on your system it might get matched, Apple would send it on, it would be checked, it would be found in the repository, and you could possibly be prosecuted.

Otherwise, no.
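To make that concrete, here is a minimal, purely illustrative sketch of the matching step: a lookup of an image’s hash against a fixed set of already-known hashes. The names and values below are made up, and SHA-256 stands in for Apple’s perceptual NeuralHash (the blinded, on-device matching is skipped entirely):

import hashlib

# Hypothetical, illustrative stand-in for the database of hashes of KNOWN images.
known_csam_hashes = {
    "placeholder-hash-of-a-known-image",
}

def image_hash(image_bytes: bytes) -> str:
    # SHA-256 is used here only to show the lookup; the real system uses a
    # perceptual hash (NeuralHash), not a cryptographic file hash.
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # A photo can only match if its hash is already in the known set; a
    # brand-new photo you just took cannot be in that set.
    return image_hash(image_bytes) in known_csam_hashes

print(is_flagged(b"freshly taken family photo"))  # False: not in the database

The point of the sketch is simply that matching is a lookup against a list of already-known images, not an analysis of what a new photo depicts.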
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
Does anyone know what happens if someone sends his enemy a bunch of CSAM pictures in iMessage? Suppose the target does not always check his messages and attachments and he just leaves them there. What happens then?
No one knows, but, just like someone being sent a malicious payload on their computer, that delivery, and how the images got there, would be tracked. AND, it would be fairly clear that the images ended up there as part of an iMessage “push” from another user and this user was not seeking out the images. AND, fortunately, that’s not all authorities have to go on. They would have access to email and internet browsing history to some degree in order to build a pattern of whether or not a person would have actively reached out to someone, outside of iMessage, and asked them to send these images.

There would definitely be an investigation just like for anything else, and the evidence would play out one way or the other.
 

DBZmusicboy01

macrumors 65816
Sep 30, 2011
1,106
1,279
VERY BAD..... Timmy Cook just wants to appease the HYPOCRITES who call themselves Dem0crats but in reality want to censor everything they can just to Control us. We are Peasants in their eyes. They are Fascists in their ideologies. If they don't like the color purple... they would remove everything with the color blue in it. They say one thing today. But they can change their minds the next day. Oh you like basketball? WE DON'T AND WE WILL REMOVE AND PUNISH YOU FOR HAVING BASKETBALL PICTURES/VIDEOS. That's an example of the dangers of letting Apple scan our phones like that. Because once they have access to our phones like that, they can pretty much do anything. And imagine if hackers PLANT certain things on our phones.

----> !! Both sides want us to suffer !! <----- So this isn't a Democrat vs Republican thing. BOTH ARE GUILTY AND I AM TIRED OF BOTH ACCOSTING EACH OTHER. BOTH ARE WRONG. On YouTube I have to purposely misspell words just so they don't censor me there; even while talking about normal things in life they still censor you, and the same goes for FB.
Things aren't like they used to be. I would say 2015 was the last good year before all these disasters came.


....This entire post is about my concern with Apple planning to take away our privacy on our iPhones/iOS devices.
 

BurgDog

macrumors 6502
Apr 22, 2012
384
456
The backdoor is a legal one. Once Apple implements a process to scan a person's phone for some illegal content and call the cops when they find it, they won't be able to legally assert they are unable to do that sort of thing when told by the authorities they must expand what they look for.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
This seems contradictory.
Not really. It’s been indicated that one reason why Apple currently can’t encrypt ALL user images is that the government does not want Apple’s iCloud to be a safe repository for CSAM images. Apple’s solution was, “Give us the hashes that you think are problematic. We’ll scan for those on the phones, report any account that has hashes that match, lock that account, and let you know who they are. If we promise to do that, will you let us encrypt all other customers’ images?”

Encrypting everyone’s images in a way that only those exchanging CSAM images are affected does align with the privacy goals. Right now, with images not encrypted from Apple, law enforcement can ask for images that prove someone was in a location where a crime was committed. IF Apple’s plan had gone forward, Apple wouldn’t be able to provide those images because they’d be encrypted, even from Apple. They’d only be able to provide the account information of the account that had a CSAM hash match.
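As a rough sketch of the flow being described, and nothing more than that: the function and field names below are invented, Apple’s actual protocol (blinded hashes, safety vouchers, threshold secret sharing, human review) is far more involved, and 30 is simply the reporting threshold Apple publicly described:

MATCH_THRESHOLD = 30  # roughly the reporting threshold Apple described

def upload_photo(photo_hash: str, encrypted_blob: bytes,
                 provided_hashes: set, account: dict) -> None:
    # The server only ever stores ciphertext it cannot read.
    account["cloud_photos"].append(encrypted_blob)
    # The hash comparison happens against the list supplied to the device.
    if photo_hash in provided_hashes:
        account["match_count"] += 1
    # Only an account that crosses the threshold is surfaced at all.
    if account["match_count"] >= MATCH_THRESHOLD:
        account["flagged_for_review"] = True

account = {"cloud_photos": [], "match_count": 0, "flagged_for_review": False}
upload_photo("abc123", b"<ciphertext>", {"def456"}, account)
print(account["flagged_for_review"])  # False: no match, nothing to report

Roughly speaking, under that proposal the only signal Apple would have about an account is whether it crossed the match threshold, which is the point being made above.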
 
  • Like
Reactions: strongy and ipedro

Joe Rossignol

Senior Reporter
Staff member
May 12, 2012
909
3,498
Canada
Macrumors: There’s nothing new to report on this. However, it’s a slow news day. Let’s see if we can get everyone riled up.
The one-year anniversary of Apple announcing the CSAM detection feature just passed. If you don't think it's newsworthy to hold a company accountable for something it announced, then we can agree to disagree!
 

steve09090

macrumors 68020
Aug 12, 2008
2,166
4,149
No one knows, but, just like someone being sent a malicious payload on their computer, that delivery, and how the images got there, would be tracked. AND, it would be fairly clear that the images ended up there as part of an iMessage “push” from another user and this user was not seeking out the images. AND, fortunately, that’s not all authorities have to go on. They would have access to email and internet browsing history to some degree in order to build a pattern of whether or not a person would have actively reached out to someone, outside of iMessage, and asked them to send these images.

There would definitely be an investigation just like for anything else, and the evidence would play out one way or the other.
Totally agree. This is just a precursor to an investigation. It’s a great idea. I really don’t understand the argument against getting evidence of one of the worst crimes out there.
I think now is the perfect time for Macrumors to suspend comments. When this story gets some traction we know it’s going to be the race to the bottom and will get nasty like previous stories about this.

Meanwhile +1 to Apple for keeping this going. I hate paedophiles.
Question. With all the 'disagrees' I got on this post, is it because I support hashing, or because I hate paedophiles? I think they’re related, so I’m guessing people are sending me the message "I disagree with hashing because it can help catch paedophiles". Because that's what this is all about.

Getting evidence in a way that won’t actually affect anyone is a great way to go. People who rubbish on about protecting rights really don’t understand the world we live in.
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,025
474
It's only terrible in the minds of those who don't understand the extensive technological process that Apple implemented to maintain privacy for end users.

It is a well-designed system that is worlds apart from how other companies handle photo gallery scanning.

As for false positives, if someone is intentionally 'planting' photos into someone's iCloud Photo Gallery, then there's another issue that needs to be addressed. The account is compromised, and that has nothing to do with this feature.

Regarding the possibility of back doors, this is just a matter of trust. Apple already clearly stated that they would not allow this feature to be abused by law enforcement agencies for any other purpose, and I choose to believe that. If they were going to allow that, they could do it in secret, and they never have, to date.
And if someone in court demands:
logs
source code
chain of custody docs
will they give them out?
 

femike

macrumors 6502a
Oct 15, 2011
948
1,734
Apple will slowly mission-creep this in under the radar. It won't mention anything. Apple will have the ability to switch it on or off depending on what it needs or what the authorities request.
 

steve09090

macrumors 68020
Aug 12, 2008
2,166
4,149
I think there is potential for this system to go badly where innocent photos are viewed in a different context. For example…

Let’s say, you are a parent and you take completely innocent photos of your child.
Some parents take photos where the child may be nude (doesn’t everyone have the classic embarrassing “baby butt” shot in their childhood photo album?) but nobody is offended because everyone knows there is no malintent behind the image. It’s just a child in a state of undress.
So, you have a photo like this of your kid, or your kid in the bath, or your kid at the pool/beach, etc. And you post it to social media, and nobody thinks anything of it because to anyone with a properly working brain, there is nothing to think about it.


But THEN, some creeper trolls public social media accounts like Facebook and Instagram for pictures other people post of their children, sees a few that excite them for reasons of their own, saves the good ones to their computer and shares them online on some sicko forum, or trades them with other perverts, etc.

Now when one of them gets caught, or their website gets raided, etc. all their files get flagged as CSAM because of the context in which they were being distributed and viewed by these people, completely unbeknownst to you, the child’s parent, who now still has this original photo on their phone or in their iCloud. And the checksums match because it’s the same file. Do you see how this goes wrong?

I do not know nearly enough about the process in which material is determined to be CSAM but this scenario doesn’t seem implausible to me.
I'm guessing they’re not giving an entire hard drive of images as hashes, but selected ones. Even so, what you’ve described could be a benefit: knowing the original source can help track the methods used, which is useful in an investigation. You’ve hit the nail on the head, because the result is an investigation by a "properly working brain", as you put it.

A positive hash doesn’t and cannot imply guilt.
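For the "checksums match because it’s the same file" point, here is a tiny illustrative sketch using an ordinary SHA-256 file checksum. Apple’s system actually uses a perceptual NeuralHash rather than a byte-level checksum, precisely so that re-encoded copies can still match, but the "identical file, identical hash" intuition is the same:

import hashlib

def file_checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"<bytes of the parent's original photo>"
redistributed_copy = original  # the exact same file, saved and re-shared elsewhere

# Identical bytes always yield an identical checksum, so if this exact file
# were ever added to a hash database, the parent's own copy would match it.
assert file_checksum(original) == file_checksum(redistributed_copy)

recompressed = b"<the same photo re-encoded by a social network>"
# A byte-level checksum breaks on any re-encoding; a perceptual hash is meant
# to survive such changes. Either way, a match only identifies a file and
# says nothing about how it got onto a device or why.
assert file_checksum(original) != file_checksum(recompressed)

Which is really the same conclusion: a positive hash identifies a file, and the investigation is what establishes context and intent.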
 
  • Like
Reactions: MrRom92

zakarhino

Contributor
Sep 13, 2014
2,508
6,778
I reckon they've already introduced it silently or are waiting for the inevitable "we totally are doing this to protect children! /s" bill to mandate the technology.
 

steve09090

macrumors 68020
Aug 12, 2008
2,166
4,149
And if someone in court demands:
logs
source code
chain of custody docs
will they give them out?
I'm sure they will be required on some occasions, when the accused is arguing that they never had the images. Others will just accept that they had the images and argue the reason for having them.
 

BeefCake 15

macrumors 68020
May 15, 2015
2,039
3,120
The backdoor is a legal one. Once Apple implements a process to scan a person's phone for some illegal content and call the cops when they find it, they won't be able to legally assert they are unable to do that sort of thing when told by the authorities they must expand what they look for.
*Ding Ding Ding* exactly

 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,117
8,060
I reckon they've already introduced it silently or are waiting for the inevitable "we totally are doing this to protect children! /s" bill to mandate the technology.
There doesn’t have to be a bill to mandate the technology: cloud providers based in the US (not sure about other countries) are required to scan for CSAM, and they all currently do, including Apple. The only change Apple was trying to make was one that would allow them to encrypt the images of all customers and then report only the ones that show up as a hash match. They have not announced that all customer images have been encrypted, even from them, so we know they haven’t done that. So, for now, they’re just scanning all the cloud images looking for matches.
 

Fat_Guy

macrumors 65816
Feb 10, 2021
1,012
1,078
Come on Apple. Don’t leave unfinished business behind. Let go of CSAM. It is best for the business.
They did let go of CSAM. I thought 15.5 would be the moment they would do something, but nothing. I would be shocked if they made an announcement dropping CSAM. Nothing but silence is the best you will ever get.


BTW: Kudos to you for being critical of Apple! 😀 I was critical of a policy of theirs in the music business and couriered a letter directly to Tim Cook about it. Now my iTunes Connect account is buggered up, my music listings get screwed up on iTunes, and my Apple receipts are emailed to me in French. Apple was always like this, so silence is the best you will get from them.


Anyways I’m happy to see anyone openly push against this idea of theirs.
 

januarydrive7

macrumors 6502a
Oct 23, 2020
537
578
To be fair, the vast majority of everything you linked is based on the "university researchers" comments you linked to, which at the end of the day looked like a desperate attempt at fame. If you read their paper, which proposes a system similar to this, you'd see that it suggests it would be great if a certain step were implemented in a particular way: the way that Apple's white paper says it was implemented.

Basically their paper says:
CSAM detection, done in way X, can maintain privacy and prevent government abuse. We haven't figured out how to do way X, so we recommend against this being implemented.

Apple says: we figured out way X.

Researchers say: Hey, look over here!! We talked about this, and it's a bad idea! Look at me! I'm famous!!!

Are there legitimate concerns (like you say, abusers uploading CSAM intentionally to someone else's account)? Sure. But the backdoor and government abuse thing isn't actually a concern based on the actual implementation.
 

JosephAW

macrumors 603
May 14, 2012
5,991
7,948
I’m assuming they are adding the political meme database per current admin orders.
I meet the minimum threshold of hits on all my devices and should get the door knocked down at 3am sometime soon. :cool:
 

Naraxus

macrumors 68020
Oct 13, 2016
2,104
8,545
Of course they are. They're waiting for everyone to forget about it and for the controversy to die down so they can slip it in quietly once again.

Sorry Timmy, but people aren't fooled by your rhetoric. We know exactly what it is. We know exactly what it does. This isn't about "protecting the children"; it's about paying lip service to the privacy you so often gaslight us about while opening a backdoor for every government on Earth.

Apple has no more ground to stand on when it comes to privacy and security. For that alone Cook should be shown the door.
 