
opfreak

macrumors regular
Oct 14, 2014
249
431
I misstated it. Here's what Apple actually says:



So it's not the chance of a single photo being flagged but of an account being flagged. Do you realize how big a number a trillion is? It's so big that even if Apple were lying and overstating the real figure a hundredfold, the number would still be enormous (100 billion).
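For scale, here's a back-of-the-envelope sketch in Python (with made-up numbers: Apple has published neither a per-photo false-match rate nor the number of matches required, so p, n, and t below are purely illustrative) showing why requiring multiple independent matches before an account is flagged makes the account-level figure astronomically smaller than any per-photo error rate:

```python
from math import comb

# Illustrative assumptions only -- none of these values come from Apple.
p = 1e-6     # hypothetical per-photo false-match probability (deliberately pessimistic)
n = 10_000   # photos in a hypothetical iCloud library
t = 10       # hypothetical number of matches required before an account is flagged

# P(account falsely flagged) = P(at least t false matches among n photos),
# i.e. the binomial tail. Terms beyond k = t + 40 are negligible (and their
# binomial coefficients would overflow a float), so the sum is truncated there.
p_account = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, t + 40))

print(f"per-photo rate: {p:.0e}  per-account rate: {p_account:.1e}")
# per-photo rate: 1e-06  per-account rate: ~2.8e-27 -- far below one in a trillion (1e-12)
```

The point isn't that these numbers are right; it's that a threshold turns even a mediocre per-photo matcher into a tiny per-account false-flag rate, which is why the two figures shouldn't be compared directly.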
Do you really believe Apple marketing? I'll just restate what I said before: they made batteries that didn't last, displays that failed, bad keyboards... etc., etc. I could go on.

And now we are to believe that they have created software with an error rate of 1 in a trillion, or even 1 in 100 billion? That would be the best AI photo-identification software ever known to man. I don't buy their marketing garbage. No idea why you would.
 

usagora

macrumors 601
Nov 17, 2017
4,869
4,451
Do you really believe Apple marketing? I'll just restate what I said before: they made batteries that didn't last, displays that failed, bad keyboards... etc., etc. I could go on.

And now we are to believe that they have created software with an error rate of 1 in a trillion, or even 1 in 100 billion? That would be the best AI photo-identification software ever known to man. I don't buy their marketing garbage. No idea why you would.

We'll wait for your peer-reviewed paper with empirical evidence that their figures are wrong or grossly inaccurate. Until then, I'll believe Apple over random conspiracy theorists and alarmists on MacRumors, thanks. Somehow I doubt we're going to be hearing a flood of stories of iCloud accounts closed based on false CSAM detections (especially since there's a manual review process), but I'll be the first to admit I was wrong if that's the case.
 
Last edited:

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
This isn’t a GREAT argument, though. We all know how persistent Apple is in badgering a user to update, eventually just downloading the update on its own and presenting a single yes/no prompt with no way to cancel it.

On top of that, any new phone or replacement phone under warranty will come with the new iOS version with no way to downgrade.
Nah, nobody is forced to do anything, no matter how much “badgering” is done by the company. It’s your decision in the end, so the responsibility lies with the user. Period.
 
  • Like
Reactions: MozMan68

DeepIn2U

macrumors G5
May 30, 2002
12,898
6,908
Toronto, Ontario, Canada
Not gonna lie...that's kinda creepy.
What?!

ONLY creepy if you have child pornography on your device within the native Photos app.

Sorry, but nobody with a soul should feel this is creepy. This is where the line from the Netflix movie Anon in my signature is superseded by “dedication to protecting children”. I don’t have, never have had, nor ever will have questionable nudes of ANY kind involving children (teens included) on anything I own or use, PERIOD!

Scan the hell out of my photos if it’s in the effort to protect abused and missing children.

I see this as an extension of the UK law against d*ck-pic spamming over cellphones, which is completely unwarranted and never wanted by so many women. This move by Apple is to protect children!


I’ll catch up on the other two news articles on this just now, but reading this and then seeing your post, I’m more concerned about why anyone should fear this unless they’re engaged in the very practice this initiative is meant to stop.
 

dk001

macrumors demi-god
Oct 3, 2014
10,727
15,070
Sage, Lightning, and Mountains
Other companies like Google and Twitter have been doing pretty much the exact same thing for years...


Not exactly.
Google and others scan shared content, not content on the user's device. Google also provides tools to different organizations to help combat CSAM.

Apple is looking to take this a whole lot further.
 

dk001

macrumors demi-god
Oct 3, 2014
10,727
15,070
Sage, Lightning, and Mountains
I agree, I’m not saying that we should not have concerns about this. It opens doors to privacy invasion, with less altruistic purposes than fighting child abuse. I just think that we have to clearly understand what this is (and what this isn’t) about.

This should have a whole lot more discussion before it is rolled out.
 

DeepIn2U

macrumors G5
May 30, 2002
12,898
6,908
Toronto, Ontario, Canada
That's so wrong! Hey Apple... focus on fixing this stuff before going after the consumers. This is creepy and wrong on so many different levels.



Isn’t this a violation of privacy? I have a lot of nudes of myself on my phone. :( Should I start deleting them or what?

Well over 34,000 photos. You’ve got to be kidding me.

I’m scared. Should I be worried?

I’m in agreement that Apple should look internally at their issues of sexism. I’m shocked this is a thing at Apple (a corporate failure), and their female executives should take a public stand to push for change; without it, the old boys’ rules will continue to stand. I don’t stand for that at all.

But your nudes on your phone are yours. You’re not a child, are you? You are choosing when and with whom to share them, and unless the nudes are part of a profession or an income stream, you shouldn’t be concerned about WHO views them, so long as you’ve been compensated if they are part of your income stream (I don’t know, and it’s not my business to know).

Now consider, whether they’re part of your revenue stream or intended only for yourself and those you shared them with: when you share them initially, how do YOU know, or how can you control, who else gets them? If you don’t want third parties seeing them without your consent, how would you know they have seen them, and how would you stop it? Let’s expand this to children’s photos (let’s focus on the topic here, by the way): imagine the corruption of a child’s sense of self in this world, and their families trying to protect them. Even after authorities run a sting and make arrests, what of the nude content of those children? Is it still out there, circulating?

See the very intro of Mr Robot S1E1, where Rami Malek’s character speaks to the coffee shop owner who secretly stores, shares, and sells child pornography, and how he puts a stop to it. Consider that. Check it out.

This isn’t about consenting adults’ nudes on their phones; it’s about child pornography! I think too many here with the hairs standing up on their necks have this twisted.
 

DeepIn2U

macrumors G5
May 30, 2002
12,898
6,908
Toronto, Ontario, Canada
What about photos of "Baby's first bath"? Will those users get treated as child exploiters?
An interesting and valid inquiry!

1. Your family baby pictures would not have been reported to the authorities; said baby isn’t abused, missing, or worse.
2. I’m sure the entire photo, or the context of the nude, would be checked.

Honestly, I think Apple should work with Google and other smartphone OS makers, even Microsoft (Windows), to come up with ways to vet that you’re the source of the photo, and to add a set of others you’re allowing to see it as an exemption. Such as family members sharing adorable first-bath baby pictures, or dad’s/mom’s first diaper change, over-the-shoulder or side pic of baby. That’s not creepy, and the context is fine for family.

Most parents that share baby photos, if nude, share them with family, and if showing them to friends or a work colleague (like when returning from mat-leave), they’re showing you directly on the device, not messaging or uploading to a file share for that other person to see. They’re also not allowing that work colleague to snap a photo of the pic on your phone with their phone. That would be creepy and would get reported to HR, or get banged out!
 

DeepIn2U

macrumors G5
May 30, 2002
12,898
6,908
Toronto, Ontario, Canada
Really? You like many others are placing the rights of the many in jeopardy in an attempt to combat the misuse by the few.
If the misuse by the few goes unchecked, then it involves the many. Would you want a pic sent to you by an acquaintance or someone met through legitimate business (lord forbid), whether by accident via a wrong number or deliberately and uninvited, and then to find yourself caught up in a sting operation and arrested?! You wouldn’t be saying nothing then! Ignoring the issue allows it to keep spreading.

If you saw a youth (boy or girl) about to be abducted, are you the kind of person who glances, turns a blind eye, and walks by, or would you stop it from happening?! This is the same concept you’re stating in your comment.
 

dk001

macrumors demi-god
Oct 3, 2014
10,727
15,070
Sage, Lightning, and Mountains
There’s the “you’d have to be a monster to argue against it” argument I mentioned.

The issue is that this method of enforcement infringes on personal privacy. If I’m renting an apartment, should I allow the building management to come into my home and search my personal property to make sure that their building isn’t being used to store unlawful materials?

When considering whether these actions are appropriate, I would ask everyone to consider the technology and the use case separately. I’m for enforcement of crime, especially in regard to child abuse. I’m against technology that ignores or removes my right to privacy. If you say you’re ok with using technology in a way that erodes privacy for this use case, where is the line? And who decides what other use cases this technology should be applied to?

You know someone in Government will say “FISA”.
 

axantas

macrumors 6502a
Jun 29, 2015
828
1,135
Home
As there are quite a lot of Apple users here, maybe we should tell the people we know about this move and question the update to iOS 15 because of severe security concerns. It might multiply in an interesting way.

Just facts. No panic. Facts. And see what happens...
 

dk001

macrumors demi-god
Oct 3, 2014
10,727
15,070
Sage, Lightning, and Mountains
Parents are in for a surprise when they find out all their high schoolers are sharing nudes.

My kids say most of the kids swap images, so by the time you graduate high school you should have a large collection of what will suddenly become child porn.

What a world.

Any tools that allow parents to help their children navigate the current digital age according to the parents’ preferences are welcome. Nothing is perfect, but these seem like reasonable tools.

The searching for child porn is trickier. Who maintains such a database? How up to date is it? Could I troll someone by getting a harmless picture of their child added to it?

BTDT - Foster parent for abused teen. Freakin’ legal nightmare.
 
Last edited by a moderator:

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
I just love how people are looking for a way to make Apple into the bad guy here. What is wrong here? Don't people think Apple employees have kids of their own? They have newborn babies as well. But of course members here only see Apple as this evil giant and not a place made up of humans. It's not so cut and dried that they will wipe away photos of nude babies; it's stupid to say that. If that were the case, Apple's employees and executives with kids of their own would be subject to this new system. It's about sex trafficking, and it's a very real thing that needs to be handled. It would be a good idea if some people would educate themselves on this subject. It's not about babies, and it's not just about nudity. Some people need to stop coming up with their own ideas and theories, because they're making this out to be something it's not. But hey, had this been Google, nobody would say a word.
 
Last edited:
  • Love
Reactions: DeepIn2U

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
An interesting and valid inquiry!

1. your family baby pictures would not have been reported to such authorities. said baby isn’t abused or missing or worse.
2. I’m sure the entire photo or the context of the nude would be checked.

honestly I think Apple should work with Google and other smartphone OS makers even Microsoft (windows) on this to come up with ways to vet youre t he sourc of the photo, and to add a set of others you’re allowing to see it As an exemption. Such as family members sharing adorable first bath baby pictures or dad/moms first diaper change over the shoulder or side pic of baby. That’s not creepy, and the context is fine for family.

most parents that share baby photos, if nude, share them with family, and if showing to friends or work colleague (like returning from mat-leave), they’re showing you directly on device not messaging or uploading to file share for that other person to see. They’re also not allowing that work colleague to snap a photo of the pic on your phone with their phone. That would be. Creepy and would be reported to HR or get banged out!

Sigh…it’s not valid at all.

a) Nude baby pics alone are not lewd

b) If you did produce a child pornography pic and shared it with someone, you’d be a complete parasite, but neither you sharing it nor the person receiving it would be flagged, even when scanned, as the photo already has to be hashed in the database Apple is pulling from (known child pornography pics that were seized when pornographers were arrested)

c) If you or your buddy are caught with said illegal pic, it could be added to the database and pushed to Apple’s on-device hash database when iOS is updated, and then, yes, someone who has some version of that pic could be flagged.

d) If you received that one pic by accident and happened to save it to your camera roll AND upload it to iCloud, yes, it may be flagged, but the white paper clearly states that it takes more than one image for Apple to flag an account and THEN have a person check the photos (ugh) to see if they match.

e) If you have MULTIPLE innocent naked pics of your baby that each randomly happen to be so close in some way to MULTIPLE hashed pics in the database that you get flagged for a person at Apple to check, the odds of that happening are one in one trillion.

Send all the dick pics, nudes, and innocent pics of the baby running around without its diaper on… as long as it isn’t child pornography (that’s already in a database, mind you), you have nothing to worry about.

Like I said earlier, there could be a ton of child pornographers passing pics back and forth on their iPhones via iMessage or any other means, and they will never be flagged as long as they are not caught and the pics added to the database and hashed.

Also, the iMessage precautions (completely separate from above) are designed for kids 13 and under apparently. Not that any teenager doesn’t already use Snapchat, WhatsApp, Messenger, etc.
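To make points b) through e) above concrete, here's a minimal sketch of threshold-based hash matching (my own illustration: the hash values and threshold are hypothetical, exact string matching stands in for Apple's perceptual NeuralHash, and the real system wraps the comparison in cryptographic private set intersection):

```python
# Hypothetical stand-ins -- not Apple's actual database or threshold.
KNOWN_HASHES = {"hash_of_seized_image_1", "hash_of_seized_image_2"}
MATCH_THRESHOLD = 5

def count_matches(upload_hashes: list[str], known: set[str]) -> int:
    """Count how many uploaded photos hash into the known-image database."""
    return sum(1 for h in upload_hashes if h in known)

def needs_human_review(upload_hashes: list[str]) -> bool:
    """An account is surfaced for manual review only past the threshold:
    a photo whose hash isn't already in the database can never trip it,
    and a stray accidental match or two stays below the threshold."""
    return count_matches(upload_hashes, KNOWN_HASHES) >= MATCH_THRESHOLD

# An innocent library hashes to values outside the database, so nothing matches:
print(needs_human_review(["hash_of_baby_bath_pic", "hash_of_vacation_pic"]))  # False
# Even one known image doesn't cross a multi-match threshold by itself:
print(needs_human_review(["hash_of_seized_image_1"]))                          # False
```

This is why the a) and b) cases above can't trigger anything: matching is against hashes of already-known images, not a judgment about what a new photo depicts.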
 
  • Like
Reactions: flowsy and DeepIn2U

DeepIn2U

macrumors G5
May 30, 2002
12,898
6,908
Toronto, Ontario, Canada
Sigh…it’s not valid at all.

a) Nude baby pics alone are not lewd

b) If you did produce a child pornography pic and shared it with someone, you’d be a complete parasite, but neither you sharing it nor the person receiving it would be flagged, even when scanned, as the photo already has to be hashed in the database Apple is pulling from (known child pornography pics that were seized when pornographers were arrested)

c) If you or your buddy are caught with said illegal pic, it could be added to the database and pushed to Apple’s on-device hash database when iOS is updated, and then, yes, someone who has some version of that pic could be flagged.

d) If you received that one pic by accident and happened to save it to your camera roll AND upload it to iCloud, yes, it may be flagged, but the white paper clearly states that it takes more than one image for Apple to flag an account and THEN have a person check the photos (ugh) to see if they match.

e) If you have MULTIPLE innocent naked pics of your baby that each randomly happen to be so close in some way to MULTIPLE hashed pics in the database that you get flagged for a person at Apple to check, the odds of that happening are one in one trillion.

Send all the dick pics, nudes, and innocent pics of the baby running around without its diaper on… as long as it isn’t child pornography (that’s already in a database, mind you), you have nothing to worry about.

Like I said earlier, there could be a ton of child pornographers passing pics back and forth on their iPhones via iMessage or any other means, and they will never be flagged as long as they are not caught and the pics added to the database and hashed.

Also, the iMessage precautions (completely separate from above) are designed for kids 13 and under apparently. Not that any teenager doesn’t already use Snapchat, WhatsApp, Messenger, etc.
Agreed, I’m fully with you there. All I said was that the question was a valid one, because up until the white paper was shared, most people did not know. So yes, I agree baby pictures aren’t lewd; however, there are baby pictures of babies that have been abused in a disgusting manner that have been shared, so we can’t just think of baby pictures as a generally wholesome thing. Most of us think they’re adorable, but there are disgusting people out there that do all kinds of disgusting, nasty, despicable things. Hence why I said the question was valid; the fear behind it, not so much.
 
  • Like
Reactions: MozMan68

BlueMoose

macrumors regular
Sep 23, 2019
242
122
How easy would it be for Apple to scan photos taken by supporters of the previous president (for example, photos showing people trespassing in the US Capitol building, attacking the police, or attending political rallies for you-know-who)? If someone's iCloud account has too many such photos, then Apple can notify the proper authorities. Seems like the next logical step for the use of such technology: preventing another such incident (which is a clear and present danger to the security of our country).
 

MozMan68

macrumors demi-god
Jun 29, 2010
6,074
5,162
South Cackalacky
How easy would it be for Apple to scan photos taken by supporters of the previous president (for example, photos showing people trespassing in the US Capitol building, attacking the police, or attending political rallies for you-know-who)? If someone's iCloud account has too many such photos, then Apple can notify the proper authorities. Seems like the next logical step for the use of such technology: preventing another such incident (which is a clear and present danger to the security of our country).

They don’t need secret hash software to do that…
 

BlueMoose

macrumors regular
Sep 23, 2019
242
122
Yeah, the same people afraid that their iPhones might rat them out are already doing an excellent job of it by posting their exploits on social media for the whole world to see.

Apple should use its technologies to make this a better country... for example, scan all the iCloud photo accounts and publish the names of every person who is against the president. We really don't need another incident like what happened in January.
 

Abazigal

Contributor
Jul 18, 2011
19,745
22,328
Singapore
Apple should use its technologies to make this a better country... for example, scan all the iCloud photo accounts and publish the names of every person who is against the president. We really don't need another incident like what happened in January.
It's worth noting that slightly over half of Americans actually felt (back then) that Apple should have assisted the FBI with unlocking the San Bernardino shooter's iPhone.


I suspect that once this news hits the public sphere and garners more debate, we will find that a not-insignificant number of people actually support Apple's efforts here.
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
Apple should use its technologies to make this a better country... for example, scan all the iCloud photo accounts and publish the names of every person who is against the president. We really don't need another incident like what happened in January.
Please say you're only joking, because this is beyond ridiculous! Apple is a tech company that makes consumer products, end of story. They aren't responsible for saving the world. You're taking this way too far.
 
Last edited:

BlueMoose

macrumors regular
Sep 23, 2019
242
122
Please say you're only joking because this is beyond ridiculous! Apple is a tech company that makes consumer products, end of story. They aren't responsible for saving the world. You're taking this way too far.

20 years ago (2001), we didn't have smartphones yet... but if we had had iPhones back then and Apple could have used its technology to prevent the 9/11 attacks, would that have been a bad thing?
 

Apple_Robert

Contributor
Sep 21, 2012
34,584
50,262
In the middle of several books.
Apple should use its technologies to make this a better country... for example, scan all the iCloud photo accounts and publish the names of every person who is against the president. We really don't need another incident like what happened in January.
That is a terrible idea, not to mention that doing something like that would get Apple sued and put innocent lives at risk.

It's not Apple's job to be the social police, much less to set social policy, even though they appear to be working to that end.
 
  • Like
Reactions: Maconplasma