
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,770
31,228


Manuel Höferlin, a member of the German parliament who chairs the country's Digital Agenda committee, has penned a letter to Apple CEO Tim Cook, pleading with Apple to abandon its plan to scan iPhone users' photo libraries for CSAM (child sexual abuse material) images later this year.

[Image: Apple "Privacy Matters" ad]

In the two-page letter (via iFun), Höferlin said that while he applauds Apple's efforts to address the dangers posed by child sexual abuse and violence, he believes the approach Apple has chosen is not the right one, as it violates one of the "most important principles of the modern information society – secure and confidential communication."
The approach chosen by Apple however – namely CSAM scanning of end devices – is a dangerous one. Regardless of how noble your motives may be, you are embarking on a path that is very risky – not only for your own company. On the contrary, you would also be damaging one of the most important principles of the modern information society – secure and confidential communication. The price for this will most likely be paid not only by Apple, but by all of us.
Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet." The letter speaks out against Apple's plans to scan images in a user's iCloud Photo Library for CSAM by checking image hashes against a database of known child sexual abuse material.
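To make the matching mechanics concrete, here is a minimal sketch in Python. To be clear about the assumptions: Apple's actual system uses a proprietary neural perceptual hash ("NeuralHash") and a blinded on-device matching protocol, neither of which is reproduced here; the trivial "average hash" and plain set lookup below are stand-ins purely to show the flow of hashing an image and checking it against a database of known hashes.

# Toy sketch of hash-based image matching. NOT Apple's system:
# NeuralHash and its blinded matching protocol are proprietary.
# An 8x8 grayscale grid and a trivial "average hash" stand in here.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 8 rows of 8 ints, 0-255):
    each bit records whether a pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Database of hashes of known images (in reality held in blinded form).
known_hashes = {average_hash([[200] * 8] * 4 + [[10] * 8] * 4)}

def matches_database(pixels):
    return average_hash(pixels) in known_hashes

print(matches_database([[200] * 8] * 4 + [[10] * 8] * 4))  # True: known image
print(matches_database([[10] * 8] * 4 + [[200] * 8] * 4))  # False: different image

The real pipeline also differs in that a single match alone is not reported; Apple has said a threshold number of matches must accumulate before its servers can decrypt the associated safety vouchers.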

That feature is entirely different from another feature rolling out later this year, in which iOS will use on-device image analysis to detect potentially sexually explicit images in the Messages app and ask users under the age of 13 whether they wish to see the photo. While Höferlin referenced some legitimate concerns over CSAM scanning, he went on to say that the feature destroys "some of the trust users place in not having their communications secretly monitored." Neither CSAM scanning nor the Child Safety Features in Messages, however, monitor any communication.
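As a rough illustration of the gating flow described above, here is a hypothetical sketch. The classifier stub, score threshold, and exact age logic are assumptions for illustration only; Apple has not published this code or model.

# Hypothetical sketch of the Messages child-safety gating flow.
# The classifier stub, threshold, and age cutoff are assumptions;
# Apple's real on-device model and policy are not public.

AGE_CUTOFF = 13   # feature described as applying to users under 13
THRESHOLD = 0.9   # assumed score above which an image is flagged

def explicitness_score(image_bytes: bytes) -> float:
    """Stub standing in for Apple's on-device ML classifier."""
    return 0.0  # always "safe" in this toy version

def should_blur_and_ask(image_bytes: bytes, user_age: int) -> bool:
    """Blur the photo and ask before showing it, only for young users."""
    if user_age >= AGE_CUTOFF:
        return False
    return explicitness_score(image_bytes) >= THRESHOLD

print(should_blur_and_ask(b"\x00", user_age=10))  # False with this stub

The key property of this feature, per Apple's description, is that the analysis happens entirely on the device and Apple never sees the photo.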

Apple's senior vice president of software engineering, Craig Federighi, admitted in a recent interview that announcing CSAM detection and the improved safety features for children in the Messages app at the same time has caused confusion. Nonetheless, Höferlin continued in his letter by stating that while he wishes he could believe Apple's reassurance that it will not allow government interference in CSAM detection, he is unable to take the company at its word.
As much as I want to believe your assurances that you will reject all requests for further application of this function, such as the location of regime critics or surveillance of minorities, these lack credibility. In every country on Earth – even in my home country, despite our historical experiences – political forces continue to coalesce for whom confidential communication and encryption are a thorn in their side, and who are engaged in ongoing efforts to replace freedom with surveillance. For people who unlike us are not lucky enough to live in Western democracies, this can in the worst-case scenario mean a genuine threat to their lives.
Höferlin concluded his letter by pleading with Cook to abandon Apple's CSAM scanning plans and asking that the company stay on the side of a free and private internet.
That is why my urgent appeal to you is that you abandon your plans for CSAM scanning. This would not only save your own company from many foreseeable problems, but would also protect the Achilles' heel of the modern information society! Please stay on the side of those who defend civilization’s achievement of a free internet!
Since announcing its plans earlier this month, Apple has faced criticism, and in response the company has continued its attempts to address concerns by publishing additional documents and an FAQ page. CSAM scanning and the Child Safety Features in the Messages app are still on track to be released later this year.

Article Link: German Politician Asks Apple CEO Tim Cook to Abandon CSAM Scanning Plans
 

fwmireault

Contributor
Jul 4, 2019
2,158
9,167
Montréal, Canada
The only way Apple will abandon this is if China tells them to, and we already know that this is not a feature that China will oppose.
I think the pushback that would ultimately convince Apple to abandon this scanning would have to come from the US, since it is both Apple's primary market and, for now, the only country where CSAM scanning will be enabled. That said, I don't see that happening.
 

movielad

macrumors regular
Dec 19, 2005
120
219
Surrey
Might as well ask Cook to remove all the cameras from the phone too, so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause) by someone threatening to put the footage on YouTube or Facebook, which they almost certainly do regardless, after which the video is shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the footage comes from somebody's phone and violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is secretly recorded without the other people knowing.
 

Khedron

Suspended
Sep 27, 2013
2,561
5,755
Might as well ask Cook to remove all the cameras from the phone too, so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause) by someone threatening to put the footage on YouTube or Facebook, which they almost certainly do regardless, after which the video is shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the footage comes from somebody's phone and violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is secretly recorded without the other people knowing.

Those are not equivalent to this technology. Apple's new technology puts a process on the user's phone that monitors for illegal activity and reports matches to an authority. So it is like having your camera automatically scan for antisocial behaviour and report your GPS location to the local police.
 

nt5672

macrumors 68040
Jun 30, 2007
3,373
7,216
Midwest USA
It’s surprising to see that kind of stance from a politician; they are not generally the biggest fans of digital privacy. Very curious whether any major politician in the US will fight CSAM scanning, but I’m not holding my breath.
Actually, there might be hope yet. We have a lot of powerful pedophiles in politics and the church. Can't imagine they are happy about CSAM scanning.
 

nvmls

Suspended
Mar 31, 2011
1,941
5,219
Might as well ask Cook to remove all the cameras from the phone too, so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause) by someone threatening to put the footage on YouTube or Facebook, which they almost certainly do regardless, after which the video is shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the footage comes from somebody's phone and violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is secretly recorded without the other people knowing.
Good one.

/s?
 

groovebuster

macrumors 65816
Jan 22, 2002
1,249
101
3rd rock from the sun...
German politician has no idea how this CSAM detection works and prefers a less private way of child porn scanning.
First half of the sentence: And you know that how?
Second half of the sentence: Of course! Securing evidence like this does not belong in the hands of a privately owned company, with no control over what actually happens with the collected data and who it is handed over to in the end...
 

mzeb

macrumors 6502
Jan 30, 2007
358
612
German politician has no idea how this CSAM detection works and prefers a less private way of child porn scanning.
Despite the fact that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn't right. This is still digital surveillance of the private population. Worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term "overreach" is often used of governments, and it applies here. Apple is neither responsible nor accountable for CSAM detection in law enforcement, and no country's citizens have passed a law giving the company this mandate. However secure, private, and well intentioned this system may be, it breaches the privacy of the people without the people's permission.
 

groovebuster

macrumors 65816
Jan 22, 2002
1,249
101
3rd rock from the sun...
Might as well ask Cook to remove all the cameras from the phone too, so that people can avoid being filmed during altercations (altercations that THEY themselves usually cause) by someone threatening to put the footage on YouTube or Facebook, which they almost certainly do regardless, after which the video is shared many, many times. So much respect for people's privacy! Do Facebook and YouTube care about people's privacy? No, not while it's entertainment. Yet the footage comes from somebody's phone and violates the rights of others.

Or maybe remove the ability to record conversations via Voice Memos or third-party apps, just in case a conversation is secretly recorded without the other people knowing.
Your point is? These things are not even in the same ballpark as implementing CSAM detection at the OS level.
 

tonywalker23

macrumors 6502
Dec 21, 2003
449
1,085
SC
I don't like that people view child porn. And as a conservative Christian pastor who works full time at a church, I don't want anyone viewing porn. Furthermore, I intentionally don't watch material that has risqué scenes or language that offends me.

However, the same technology that Apple wants us all to accept this fall could one day be the technology that tells a government that I am a conservative Christian pastor. The right thing in this situation is not to try to catch people who will simply stop using the feature; it is to not implement a feature that is largely useless against the very people it is meant to catch, because the day may come when others get caught in a web that was never intended for them.
 
Last edited:

antiprotest

macrumors 601
Apr 19, 2010
4,044
14,262
According to another site, Reddit users have found the hash algorithm in iOS 14.3, and:

"For example, one user, dxoigmn, found that if you knew the resulting hash found in the CSAM database, one could create a fake image that produced the same hash. If true, someone could make fake images that resembled anything but produced a desired CSAM hash match. Theoretically, a nefarious user could then send these images to Apple users to attempt to trigger the algorithm."

If this is true, then I hope Apple finds a solution before rollout.

But even if this particular one is not true, we can probably expect a number of other problems to be discovered after rollout, and perhaps lives ruined, before they are fixed.
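To illustrate the class of problem dxoigmn describes, without claiming anything about NeuralHash specifically, here is a toy Python sketch using the same kind of trivial "average hash" as the earlier sketch: any many-to-one perceptual hash necessarily has distinct inputs that share a hash, and if the algorithm is known an attacker can construct them deliberately.

# Toy demonstration that visually different inputs can be crafted
# to share a perceptual hash once the algorithm is known. This is
# a trivial "average hash", not NeuralHash, but the pigeonhole
# argument applies to any many-to-one hash.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(p > mean for p in flat)

target = [[255] * 4 + [0] * 4 for _ in range(8)]  # bright left half
forged = [[129] * 4 + [1] * 4 for _ in range(8)]  # different pixel values

print(average_hash(target) == average_hash(forged))  # True: hashes collide
print(target == forged)                              # False: images differ

Whether such collisions are practical to exploit against Apple's full pipeline (Apple says matches are human-reviewed before any report) is a separate question, but this is the kind of attack surface people are probing.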
 
Last edited:

femike

macrumors 6502a
Oct 15, 2011
948
1,734
Höferlin notably called Apple's CSAM approach "the biggest opening of the floodgates for communication confidentiality since the birth of the internet."

I totally agree. This is a landmark decision Apple has made, and many do not seem to understand its consequences for the future, preferring to talk about MacBook Air M1 colours.
 