
H2SO4

macrumors 603
Nov 4, 2008
5,674
6,954
I have a bunch of French friends whose 9- and 10-year-old girls swim topless at a local neighborhood pool and are routinely photographed by their parents.

I was born and raised in Russia, where prepubescent girls were always topless at the beach and routinely photographed by their parents.

In these cultures, girls’ chests are not covered until they develop breasts. Obviously, French, German, Scandinavian, etc. women love going to the beach topless where it’s allowed, but that’s a different story altogether.

That’s how I know.

For most Americans, it’s shocking to see 10-year-old girls topless in a public place. The notion of “child porn” immediately pops into one’s head. Even for me it was a little shocking at first when I saw the French letting their girls go topless in public. Then I remembered my childhood and realized it was completely normal almost everywhere else besides the US.
Can’t say I’ve really noticed. I’ve been European for over 50 years.
I can only say that it’s a cultural thing and that maybe the Americans have things twisted?
 

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
Can’t say I’ve really noticed. I’ve been European for over 50 years.
I can only say that it’s a cultural thing and that maybe the Americans have things twisted?
This is where it starts...

America agonizes if, in a TV show, the scarce textile covering a woman’s body slips into a position where everyone may catch a glimpse of something absolutely horrifying on their 150-inch UHD screen, and children are immediately corrupted for their entire lives...

Not to mention the sight of a topless child...

Yes, the first part is a bit mean, I admit. But it happened...
 

H2SO4

macrumors 603
Nov 4, 2008
5,674
6,954
This is where it starts...

America agonizes if, in a TV show, the scarce textile covering a woman’s body slips into a position where everyone may catch a glimpse of something absolutely horrifying on their 150-inch UHD screen, and children are immediately corrupted for their entire lives...

Not to mention the sight of a topless child...

Yes, the first part is a bit mean, I admit. But it happened...
I don’t disagree.
 

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
you clearly didn't read my entire post, but that's pretty standard on the internet these days.
Think about the supposedly impossible case: that one photo you sent to your aunt gets stolen from her old phone (old OS, got hacked...) and appears on one of these sites. It gets flagged. There is a match against the database. You even own the original. You are in really big trouble. Do not try to explain anything to the police. No one will believe you...

Oh, even worse: you are abusing photos of your own children, and maybe your children too. So it is not only big trouble. You are dead.
 

canyonblue737

macrumors 68020
Jan 10, 2005
2,179
2,694
Think about the supposedly impossible case: that one photo you sent to your aunt gets stolen from her old phone (old OS, got hacked...) and appears on one of these sites. It gets flagged. There is a match against the database. You even own the original. You are in really big trouble. Do not try to explain anything to the police. No one will believe you...

I guess. Apple said that there is some undisclosed number of matching images that triggers it, so it appears one photo (or even a few) won't trigger it. At that point, when it is triggered, a human being actually gets to look at the images (ugh, what a horrible job) and makes a judgement call before going to the authorities. I'd also assume those authorities, once they talk to you, would be able to determine that the images in question are all harmless, typical family photos of just you or your children, and it would go no further... at least until you sue the hell out of Apple and go on 20/20 or 60 Minutes to give an exposé on how Apple's intrusive privacy policies nearly caused you to be named a child abuser and ruined your life.
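To make the threshold idea concrete, here is a minimal sketch of how such a gate could work. The names and the threshold value are hypothetical; Apple's actual system wraps matches in encrypted safety vouchers and uses threshold secret sharing, none of which is shown here.

```python
# Minimal sketch of threshold-gated review. All names and values are hypothetical;
# this is not Apple's API, just the "nothing happens below N matches" idea.

MATCH_THRESHOLD = 30  # assumed for illustration; the real value is undisclosed

def needs_human_review(match_count: int) -> bool:
    """One match (or a handful) does nothing; only crossing the
    threshold escalates the account for manual review."""
    return match_count >= MATCH_THRESHOLD

matches = 0
for photo_matched in [False, True, False, True]:  # toy stream of uploads
    if photo_matched:
        matches += 1

if needs_human_review(matches):
    print("escalate to human review")
else:
    print(f"{matches} match(es): below threshold, nothing is reported")
```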
 

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
I guess. Apple said that there is some undisclosed number of matching images that triggers it, so it appears one photo (or even a few) won't trigger it. At that point, when it is triggered, a human being actually gets to look at the images (ugh, what a horrible job) and makes a judgement call before going to the authorities. I'd also assume those authorities, once they talk to you, would be able to determine that the images in question are all harmless, typical family photos of just you or your children, and it would go no further... at least until you sue the hell out of Apple and go on 20/20 or 60 Minutes to give an exposé on how Apple's intrusive privacy policies nearly caused you to be named a child abuser and ruined your life.
I can assure you, I WOULD sue Apple for doing this and asking me questions about my child's photos that I took years ago. Furthermore, I would sue the person who categorized my private photo, for breaking my privacy.

No one has to ask me any question about these private things. No one.

Oh: the number of images you need to own to get flagged will be lowered in a couple of years, because the initial value was too high and did not trigger anything. (Beta version, you know...)
 
  • Like
Reactions: ssgbryan

opfreak

macrumors regular
Oct 14, 2014
249
431
“But practice already widespread…” sounds like a cop-out. How many evils throughout history have been “widespread”? And how much blood, sweat, toil and tears did it take to fight them?

I don’t ever buy the logic that just because something is in place means it’s right and we can’t change it. We are humans dealing with human-made problems. Let’s not cop out behind the illusion of powerlessness.
Everyone is doing it, so it's OK. /s
 

Abazigal

Contributor
Jul 18, 2011
19,688
22,231
Singapore
I can assure you, I WOULD sue Apple for doing this and asking me questions about my child's photos that I took years ago. Furthermore, I would sue the person who categorized my private photo, for breaking my privacy.

No one has to ask me any question about these private things. No one.

Oh: the number of images you need to own to get flagged will be lowered in a couple of years, because the initial value was too high and did not trigger anything. (Beta version, you know...)
If your child's photos are in a database intended for victims of child pornography (and I assume there are strict criteria for deciding which photos count), I think you have more pressing concerns than suing other people.
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,275
Everyone is doing it, so it's OK. /s

I wish companies would just let people store their “hobbies” data on their servers, even if they’re super illegal hobbies.
They should learn from privacy martyrs like Kim Dotcom and Ross Ulbricht.
/s
 

iHorseHead

macrumors 65816
Jan 1, 2021
1,308
1,575
Apple should add scanning for:

1. Photos of the confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle Eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
They will soon.
 

axantas

macrumors 6502a
Jun 29, 2015
823
1,131
Home
If your child's photos are in a database intended for victims of child pornography (and I assume there are strict criteria for deciding which photos count), I think you have more pressing concerns than suing other people.
Yes, I would have the problem that no one believes me that my phone with the private photos got stolen, and I would be accused of being the producer of these photos. It starts here: not even you believe that I am innocent.
 

MacBH928

macrumors G3
May 17, 2008
8,351
3,734
This is the worst news in IT/Apple this year so far. One of my main reasons for sticking with Apple has just been eroded, of course it's not as if I would go anywhere else since the others are just as bad. Well someone on Snowden's twitter praised Linux phones but that is a non-starter personally - do they even exist?

Not exactly Linux phones, but modified Android, which is the next best thing, though to be honest they are not as convenient as stock iOS or Android. The one with the fewest issues and the most compatibility is LineageOS. For more privacy software, look at this site: privacytools.io
 

sirozha

macrumors 68000
Jan 4, 2008
1,927
2,327
If your child's photos are in a database intended for victims of child pornography (and I assume there are strict criteria for deciding which photos count), I think you have more pressing concerns than suing other people.
This is simply not the case. The AI will identify nudity and it will be flagged for human review. So all that crap about comparing hashes is a bunch of BS. They will use AI to flag suspicious pics and then send them to a bunch of Indians to review the pics and decide your fate. This is the most ridiculous and atrocious thing that I’ve heard in the past 30 years of actively using technology.
 

Ifti

macrumors 68040
Dec 14, 2010
3,941
2,449
UK
I do not use iCloud Photo at all - so how are my photos scanned?
 

Abazigal

Contributor
Jul 18, 2011
19,688
22,231
Singapore
This is simply not the case. The AI will identify nudity and it will be flagged for human review. So all that crap about comparing hashes is a bunch of BS. They will use AI to flag suspicious pics and then send them to a bunch of Indians to review the pics and decide your fate. This is the most ridiculous and atrocious thing that I’ve heard in the past 30 years of actively using technology.
That's not what is happening, and you know it.

In the case of scanning for child pornography, it's comparing the photos against an existing database of known images maintained by NCMEC (the National Center for Missing & Exploited Children), not using AI but fingerprinting. Nudes of yourself or your children will not get flagged unless they have also been identified as child pornography by the authorities.

This article gives a more in-depth explanation. If you want to hate, at least hate smart, and hate right.


Right now, the main cause for concern is that while services like Google Photos scan the photos you upload, Apple is introducing a way to scan your photos on-device. While this feature will only work on photos that would be uploaded to iCloud Photo Library, the problem is that Apple, in theory at least, will still have a means to scan photos on your device, and one cannot help but wonder when they will go from scanning photos that would have been uploaded to simply scanning all photos while they are at it.
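For anyone wondering what "fingerprint matching against a known database, only for photos headed to iCloud" might look like in code, here is a rough sketch. The names, the stand-in fingerprint function, and the plain-text hash set are all mine for illustration; the real system uses NeuralHash and a blinded database that the device itself cannot read.

```python
# Rough illustration only: hypothetical names, a stand-in fingerprint function,
# and a plain set of hashes. Apple's actual design uses NeuralHash plus a
# blinded/encrypted matching protocol, which is not reproduced here.

KNOWN_FINGERPRINTS = {"a3f9c1", "77b042"}  # placeholder values

def fingerprint(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; NOT how NeuralHash works.
    return format(sum(photo_bytes) % 0xFFFFFF, "06x")

def prepare_for_upload(photo_bytes: bytes, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        return photo_bytes, None  # no scan at all if the photo never leaves the device
    matched = fingerprint(photo_bytes) in KNOWN_FINGERPRINTS
    # In the real design the match result is sealed inside an encrypted
    # "safety voucher" uploaded with the photo, rather than read directly.
    return photo_bytes, matched
```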
 

opfreak

macrumors regular
Oct 14, 2014
249
431
I wish companies would just let people store their “hobbies” data on their servers, even if they’re super illegal hobbies.
They should learn from privacy martyrs like Kim Dotcom and Ross Ulbricht.
/s
If you are going to claim things are encrypted end to end, you should have no idea what's on the server. It's just 0s and 1s.
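As a side note on the end-to-end point, here is a tiny sketch of that idea using the third-party `cryptography` package (my example, not anything Apple ships): if the client encrypts before upload and keeps the key, the server really does hold nothing but opaque bytes.

```python
# Sketch of "the server only sees 0s and 1s": encrypt client-side, keep the key.
# Uses the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()         # stays on the device, never uploaded
client = Fernet(key)

photo = b"...raw JPEG bytes..."
ciphertext = client.encrypt(photo)  # this is all the server would ever store

# Without the key, the server-side blob is just noise; only the client
# can recover the original photo.
assert client.decrypt(ciphertext) == photo
```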
 

opfreak

macrumors regular
Oct 14, 2014
249
431
That's not what is happening, and you know it.

In the case of scanning for child pornography, it's comparing the photos against an existing database of known images maintained by NCMEC (the National Center for Missing & Exploited Children), not using AI but fingerprinting. Nudes of yourself or your children will not get flagged unless they have also been identified as child pornography by the authorities.

This article gives a more in-depth explanation. If you want to hate, at least hate smart, and hate right.


Right now, the main cause for concern is that while services like Google Photos scan the photos you upload, Apple is introducing a way to scan your photos on-device. While this feature will only work on photos that would be uploaded to iCloud Photo Library, the problem is that Apple, in theory at least, will still have a means to scan photos on your device, and one cannot help but wonder when they will go from scanning photos that would have been uploaded to simply scanning all photos while they are at it.

Why do people keep repeating this lie that all Apple is doing is comparing your image hash to a known database hash?

If that were true, then all you would have to do is crop a picture, or change one pixel, and the hash value would change and the system would be defeated. Apple itself says that's not the case.

Furthermore, for kids' accounts the system 100% scans photos for nudes before they send or receive them, so the system has nudity-detection AI built in.
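The crop/one-pixel point is exactly the difference between a cryptographic hash and a perceptual one. A quick standard-library illustration of the brittle case (a fake byte string stands in for an image; none of this is Apple's algorithm):

```python
# A cryptographic hash such as SHA-256 changes completely when a single bit of
# the input changes, which is why naive file-hash matching would be trivial to defeat.
import hashlib

image = bytes(range(256)) * 16      # fake 4 KB "image"
tampered = bytearray(image)
tampered[0] ^= 1                    # flip one bit of one "pixel"

print(hashlib.sha256(image).hexdigest())
print(hashlib.sha256(bytes(tampered)).hexdigest())  # entirely different digest
# Perceptual hashes (NeuralHash among them) are built so that this kind of tiny
# edit, or a crop/resize, still maps to the same or a very close hash.
```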
 

Abazigal

Contributor
Jul 18, 2011
19,688
22,231
Singapore
Why do people keep repeating this lie that all Apple is doing is comparing your image hash to a known database hash?

If that were true, then all you would have to do is crop a picture, or change one pixel, and the hash value would change and the system would be defeated. Apple itself says that's not the case.

Furthermore, for kids' accounts the system 100% scans photos for nudes before they send or receive them, so the system has nudity-detection AI built in.
Because that's what Apple themselves are saying.


The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, and images that are different from one another result in different hashes. For example, an image that has been slightly cropped or resized should be considered identical to its original and have the same hash.
So I assume it's capable of detecting even modified images, so long as some part of the image remains intact (e.g. the face or posture).
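One toy way to see what "visually similar images result in the same hash" can mean in practice is an average hash compared by Hamming distance. This is a deliberately crude stand-in of my own, not NeuralHash, which is a neural-network embedding quantized into a hash:

```python
# Toy perceptual hash: threshold each pixel of an 8x8 grayscale grid against the
# mean, then compare fingerprints by Hamming distance. A crude stand-in only.

def average_hash(pixels):  # pixels: 8x8 grid of grayscale ints (0-255)
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]  # 64-bit fingerprint

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
brighter = [[min(p + 3, 255) for p in row] for row in original]  # slight edit

# A small brightness tweak barely moves the fingerprint, so it still "matches"
# under a distance threshold, unlike the cryptographic case.
print(hamming(average_hash(original), average_hash(brighter)))   # prints 0 here
```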
 

sirozha

macrumors 68000
Jan 4, 2008
1,927
2,327
That's not what is happening, and you know it.

In the case of scanning for child pornography, it's comparing the photos against an existing database of known images maintained by NCMEC (the National Center for Missing & Exploited Children), not using AI but fingerprinting. Nudes of yourself or your children will not get flagged unless they have also been identified as child pornography by the authorities.

This article gives a more in-depth explanation. If you want to hate, at least hate smart, and hate right.


Right now, the main cause for concern is that while services like Google Photos scan the photos you upload, Apple is introducing a way to scan your photos on-device. While this feature will only work on photos that would be uploaded to iCloud Photo Library, the problem is that Apple, in theory at least, will still have a means to scan photos on your device, and one cannot help but wonder when they will go from scanning photos that would have been uploaded to simply scanning all photos while they are at it.
I don’t believe the “comparing hashes” party line. Child pornography is being manufactured every second of every day. There are billions of very capable cameras available worldwide in almost everyone’s palm, so I’m sure at least hundreds of child porn images are created every second. To think that there is a database of child porn with hashed images against which the pics on your iPhone can be compared is a ridiculous notion.

Our law enforcement can’t even maintain a database of firearms, let alone a database of child porn images. The only effective way to try and identify child porn is to use AI to flag the images that the AI thinks are child porn and then let humans decide which ones actually are. This is already being done by services like Facebook, with thousands of contractors reviewing images all day long and getting PTSD from their jobs.

Apple has crossed a red line they should never have gone near. At this point, I will trust Google before I trust Apple.
 
Last edited:
  • Like
Reactions: H2SO4

H2SO4

macrumors 603
Nov 4, 2008
5,674
6,954
I don’t believe the “comparing hashes” party line. Child pornography is being manufactured every second of every day. There are billions of very capable cameras available worldwide in almost everyone’s palm, so I’m sure at least hundreds of child porn images are created every second. To think that there is a database of child porn with hashed images against which the pics on your iPhone can be compared is a ridiculous notion.

Our law enforcement can’t even maintain a database of firearms, let alone a database of child porn images. The only effective way to try and identify child porn is to use AI to flag the images that the AI thinks are child porn and then let humans decide which ones actually are. This is already being done by services like Facebook, with thousands of contractors reviewing images all day long and getting PTSD from their jobs.

Apple has crossed a red line they should never have gone near.
Kind of agree with this and kind of don't.
I don't think your government (or, collectively, the people they serve) actually wants a database of firearms.
 

giggles

macrumors 65816
Dec 15, 2012
1,048
1,275
If you are going to claim things are encrypted end to end, you should have no idea what's on the server. It's just 0s and 1s.

At runtime, your local pics are not encrypted; otherwise you wouldn’t be able to look at them.
Doing this “CP or not” labeling locally is exactly what allows Apple to avoid decrypting all your pics once they are stored on iCloud.
You people have it backward: this system allows for less invasion of our privacy than searching for CSAM in the cloud like other companies do.
At least get your facts right, THEN we can discuss whether it should be done or not.
 

Abazigal

Contributor
Jul 18, 2011
19,688
22,231
Singapore
I don’t believe the “comparing hashes” party line. Child pornography is being manufactured every second of every day. There are billions of very capable cameras available worldwide in almost everyone’s palm. To think that there is a database of child porn is a ridiculous notion.

Our law enforcement can’t even maintain a database of firearms, let alone a database of child porn images. The only effective way to try and identify child porn is to use AI to flag the images that the AI thinks are child porn and then let humans decide which ones actually are. This is already being done by services like Facebook, with thousands of contractors reviewing images all day long and getting PTSD from their jobs.

Apple has crossed a red line they should never have gone near.

Facebook uses AI because they don’t control the underlying hardware, so they can only rely on a purely software solution.

I don’t think Apple is willing to go to the extent of hiring an entire army of reviewers just to scan for child pornography. The reason they would go about it this way is likely that they believe they can automate it at scale using their A-series processors, and that the actual number of images flagged for human review will be so small that only a very small team will be necessary.

For the moment at least, I am willing to give Apple the benefit of the doubt that they have found a unique way of tackling the very same issue that other companies such as Facebook, Microsoft, and Google are already tackling, using their control over hardware and software in a way that other companies can’t. Just because a company like Facebook uses AI doesn’t mean that other companies can’t do it better, or differently.

That also doesn’t mean I don’t have an issue with what Apple is doing here (because, again, it’s a fine line between scanning only images designated for upload and just scanning all your images while Apple is at it), but what they are reporting here about how they intend to go about this process is very likely the truth.
 