
lordofthereef

macrumors G5
Nov 29, 2011
13,161
3,720
Boston, MA
I always put user security over convenience when it comes to privacy, but in this case, the police have a warrant, and they're not asking Apple to put permanent backdoors into their software for easy access by law enforcement. They're just saying that Apple has to help them find a way to bypass the locks on the phone, which is far more palatable.
Is this not effectively the same as a back door from a software point of view? Legitimate question because I don't know. I'm not a software engineer.
 
  • Like
Reactions: iOSFangirl6001

friedmud

macrumors 65816
Jul 11, 2008
1,415
1,265
The thing everyone always ignores is that Apple didn't invent encryption, and forcing Apple to add backdoors won't take away the ability of people who REALLY want to hide stuff (i.e. terrorists and criminals) to encrypt things beyond what the law can access. Those people will simply do a small amount of research and load up any number of other apps that provide encryption (for instance, WhatsApp is adding encryption right now... and I'm sure there are literally hundreds of such apps for Android).

The ONLY thing that weakening the iPhone's stock encryption does is hurt the privacy of normal, everyday citizens...
 

inkswamp

macrumors 68030
Jan 26, 2003
2,953
1,278
So Apple may be able to turn off the auto-erase after 10 tries and let someone try unlimited passwords. Well, if the owner used a 4-digit PIN, that means there essentially IS a backdoor, since it won't take very long to try all the combinations. This case could shed light on what exactly Apple *can* do with our phones, even when we lock them down to the fullest.

A backdoor is a simple, quick, secret method to access a system. There's nothing secret about this nor is it necessarily quick or simple. How does this fit the definition of a backdoor?
 

furi0usbee

macrumors 68000
Jul 11, 2008
1,790
1,382
The fullest you can lock the phone down isn't a 4-digit passcode, though. I'm pretty sure you can set a passcode of any length, and it can include Unicode.

Yes, I'm aware of that, but many people simply use the 4-digit passcode.
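To put rough numbers on that: with the 10-try wipe and the escalating delays gone, a short PIN falls almost immediately. A back-of-the-envelope sketch (the per-attempt time is an assumption for illustration, not a measured iPhone figure):

# Worst-case time to exhaust a passcode keyspace, assuming
# (hypothetically) one attempt every 80 ms once the wipe and
# delay protections are disabled.
ATTEMPT_SECONDS = 0.08  # assumed rate, illustration only

def worst_case_hours(keyspace: int) -> float:
    return keyspace * ATTEMPT_SECONDS / 3600

print(f"4-digit PIN (10^4 codes): {worst_case_hours(10**4):.2f} hours")    # ~0.2 hours
print(f"6-digit PIN (10^6 codes): {worst_case_hours(10**6):.0f} hours")    # ~22 hours
print(f"8-char alphanumeric (62^8): {worst_case_hours(62**8):.1e} hours")  # hundreds of millennia

Which is exactly why a long alphanumeric passcode protects you in a way a 4-digit PIN never will.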
 

Renzatic

Suspended
Yes, looked at it that way, the argument's not totally unreasonable.

If you ask me, this is the way it should be expected to be done. Being compelled to help law enforcement do their job is far from unconstitutional, provided the police have a sound reason and a warrant. If Apple discovers a backdoor themselves, they can use it to help the police catch Potential Terrorist X, and then, if they want, patch it out after the fact.

There is no such thing as 100% perfect security. There will always be ways to crack into someone's phone. Apple, being the experts on their own software, will obviously be the first bunch law enforcement hits up for help getting in. That's fine.

...but demanding they place flaws into their software so it can be accessed at any time? No, that's not.
 

furi0usbee

macrumors 68000
Jul 11, 2008
1,790
1,382
A backdoor is a simple, quick, secret method to access a system. There's nothing secret about this nor is it necessarily quick or simple. How does this fit the definition of a backdoor?

Because if Apple has a way of disabling the erase-after-10 limit, or of allowing unlimited passwords to be thrown at the phone via a software tweak, that's a back door that someone other than Apple could figure out some day. That doesn't mean it will be cracked tomorrow, but if Apple isn't truthful when they say they can't do anything to help the cops get in, bad guys can also get in.
 
  • Like
Reactions: mzeb

2010mini

macrumors 601
Jun 19, 2013
4,698
4,806
This is no doubt between the judge and the FBI. Most judges don't particularly understand the digital nuances of the warrants they authorize or the orders they issue. I have zero doubt the feds walked in and said "Apple is not allowing us to brute-force the device, Apple is forcing the device to self-destruct." Strictly speaking that's true, but these are configurable options, written in such a way that nobody can bypass them, including Apple themselves (a detail probably glossed over). Any technically competent judge would know that key escrow and backdoors carry a risk of being leaked. This would be the same court screaming bloody murder if iOS devices holding their SSNs and PII were breached due to weak crypto controls.

But wait, according to the article:


So that's a complete fail on them. The MDM solution used by the department should be able to change the PIN in a jiffy; that's sort of why we have MDM. If they didn't do their part in managing their devices adequately, it's not Apple's fault.


This!!!!

If the phone has MDM installed correctly, they don't need Apple at all.
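For reference, here's roughly what that looks like on the wire. A sketch of the ClearPasscode command an MDM server queues for a managed device, per Apple's MDM protocol (the UUID and token below are placeholders, not real values):

# Sketch of an MDM ClearPasscode command payload. The UnlockToken is
# escrowed by the MDM server when the device enrolls; no enrollment,
# no token, no ClearPasscode.
import plistlib

command = {
    "CommandUUID": "00000000-0000-0000-0000-000000000000",  # placeholder
    "Command": {
        "RequestType": "ClearPasscode",
        "UnlockToken": b"captured-at-enrollment",  # placeholder bytes
    },
}
print(plistlib.dumps(command).decode())

And that's the catch: the escrowed token has to exist before the day you need it.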
 

scaredpoet

macrumors 604
Apr 6, 2007
6,627
342
This is tough for me. On one hand, I want my device to be secure. On the other hand, I want to stop terrorists, etc.

There are ways to stop terrorists without compromising the privacy of law abiding citizens. Not to mention, the information on the phone is probably useless by now. Any contacts and phone numbers used to communicate likely changed after the shooting happened.

Make no mistake, this is a political move, and the motivation here is to set precedent for the next time they want to get into someone's phone.
 

thisisnotmyname

macrumors 68020
Oct 22, 2014
2,438
5,251
known but velocity indeterminate
Thank God for cell phones; in the bad old days before they existed, we were unable to solve any crimes... wait, what?

Law enforcement has gotten addicted to their easy access to data and now that it's drying up we're seeing panic and tantrums. It's a shame. They didn't have access to private conversations prior to mobile devices (unless they had a wiretap in advance) and they don't have it now, go back to traditional investigation techniques.

The really scary part here is if Apple's assistance requires handing over enough source code and architecture info that the government can exploit the system without Apple knowing how, effectively creating their own zero-day and compromising us all. As said in the first post: stay strong, Apple.
 

Renzatic

Suspended
Is this not effectively the same as a back door from a software point of view? Legitimate question because I don't know. I'm not a software engineer.

Neither am I. But to me, a backdoor would be something left open in software on purpose for easy access. That's quite a bit different from looking for a flaw to exploit to get into someone's system.

In this case, Apple can give no guarantees that they'll be able to access this guy's phone, nor can it be held against them if they fail. But they do have to try.
 

furi0usbee

macrumors 68000
Jul 11, 2008
1,790
1,382
Neither am I. But to me, a backdoor would be something left open in software on purpose for easy access. That's quite a bit different from looking for a flaw to exploit to get into someone's system.

In this case, Apple can give no guarantees that they'll be able to access this guy's phone. But they do have to try.

In the end, there is no difference between getting in via a flaw and via a true back door. What's to say the "flaw" wasn't known and was intended to be a back door of sorts? Nobody knows iOS and the iPhone hardware like Apple. They, more than anyone, would know how to generate a set of events that could trigger *something* to happen. I don't trust Apple or any company fully when they say our data is secure and there is no way to get at it.
 

GreyOS

macrumors 68040
Apr 12, 2012
3,355
1,682
I'm no expert in encryption at all, so sorry if this is way off.

Given that some features and settings are available while the phone is locked, and that a locked phone can determine whether it should erase data after failed attempts and how long to wait between attempts, doesn't that indicate those settings live outside the encryption and can be checked there? And couldn't the software therefore be manipulated at this stage?

Would love someone to explain this...
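The toy model in my head is something like this (purely illustrative Python, certainly not Apple's actual code), where the counter is ordinary software state sitting in front of the crypto:

# Toy model: the erase policy runs *before* any decryption, so it has
# to live outside the encrypted data -- which is why it seems patchable.
failed_attempts = 0
MAX_ATTEMPTS = 10  # the configurable "erase after 10" setting

def wipe_device():
    raise RuntimeError("effaceable storage erased; data key gone")

def check_passcode(guess: str) -> bool:
    return guess == "1234"  # stand-in for the real cryptographic check

def try_unlock(guess: str) -> bool:
    global failed_attempts
    if failed_attempts >= MAX_ATTEMPTS:
        wipe_device()
    if check_passcode(guess):
        failed_attempts = 0
        return True
    failed_attempts += 1  # plain device state, not ciphertext
    return False

If that's roughly right, the guard rails are just software, and the question becomes whether Apple can re-flash them away.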
 

Kabeyun

macrumors 68040
Mar 27, 2004
3,412
6,350
Eastern USA
The problem is not who has access to our personal data, but bad guys having guns and good guys not having guns
Sorry to burst the cowboy bubble, but there's actual data on this, and it doesn't agree with this particular NRA mantra.
- A good guy with a gun is statistically more likely to either shoot another good guy or be shot by another good guy than he is to actually shoot a bad guy.
- A bad-guy-with-a-gun scenario is ended by a good guy with a gun 2-3% of the time, and ends of the bad guy's own accord 52% of the time.
- A 1% increase in gun ownership results in a 0.9% increase in gun deaths (and no, it's not the bad guys doing the dying).

And yes, I'm for the 2nd Amendment. The entire Second Amendment, including the often conveniently omitted bit about a well regulated militia.

But we digress...
 

inkswamp

macrumors 68030
Jan 26, 2003
2,953
1,278
Because if Apple has a way of disabling the erase-after-10 limit, or of allowing unlimited passwords to be thrown at the phone via a software tweak, that's a back door that someone other than Apple could figure out some day. That doesn't mean it will be cracked tomorrow, but if Apple isn't truthful when they say they can't do anything to help the cops get in, bad guys can also get in.

There's an old saying in computer security that if you have access to the hardware, then all bets are off. There is no system that cannot be cracked if the person doing the cracking has access to the hardware itself. A true backdoor would offer someone quick, simple and secretive access to a system even remotely. This is not a backdoor.
 

Renzatic

Suspended
In the end, there is no difference between getting in via a flaw and via a true back door. What's to say the "flaw" wasn't known and was intended to be a back door of sorts? Nobody knows iOS and the iPhone hardware like Apple. They, more than anyone, would know how to generate a set of events that could trigger *something* to happen. I don't trust Apple or any company fully when they say our data is secure and there is no way to get at it.

That's true. You'll never be 100% secure, so long as you're using someone else's product to connect to the internet. If someone wants to get at your stuff, they can and probably will find a way to do it.
 

rdlink

macrumors 68040
Nov 10, 2007
3,226
2,435
Out of the Reach of the FBI
I always put user security over convenience when it comes to privacy, but in this case, the police have a warrant, and they're not asking Apple to put permanent backdoors into their software for easy access by law enforcement. They're just saying that Apple has to help them find a way to bypass the locks on the phone, which is far more palatable.

No, it's not. Apple has no backdoors. And if the government forces them to create one where none existed before, it will have taken down a whole company by destroying one of its biggest competitive advantages.

Pound sand FBI.
 
  • Like
Reactions: jjm3

Renzatic

Suspended
No, it's not. Apple has no backdoors. And if the government forces them to create one where none existed before, it will have taken down a whole company by destroying one of its biggest competitive advantages.

Pound sand FBI.

You're confusing one situation with another.

Though this could eventually be used as justification for backdoors later, right now all the FBI is doing is handing them a phone and saying "hey, crack this for us".
 
  • Like
Reactions: You are the One

Juicy Box

macrumors 604
Sep 23, 2014
7,526
8,862
Thank you Apple for standing your ground.

While I have no love for crazy Islamic nuts, or any religious nuts, I do not think everyone should lose their liberty because of them.

Look at what we have to put up with at the airport, with the TSA fondling little girls because they might be terrorists.

I think we've lost enough freedom and liberty already.
 

NoNothing

macrumors 6502
Aug 9, 2003
453
511
I pray for America. You are run by a psychopathic and satanistic criminal cabal. Please wake up.



Problem-Reaction-Solution. That's how things work. Create a trauma, blame the inability to solve it on encryption, get rid of encryption. Move on. Same in Paris.

I hope you read the opening of that whack-o article you linked to. I really hope you don't seriously believe the Twin Towers came down due to planted explosives because the planes alone couldn't do it. It takes serious amounts of tin foil to believe that.
They want to enter an unlimited number of passwords; the only way I can think of to do this is for Apple to install a special OS on the device that doesn't have the delete-after-10 feature. At that point, wouldn't it just be easiest to install an OS that doesn't have passcodes at all?

You still have to have the passcode to decrypt the data. I'm guessing Apple could do a special build to disable the 10-try limit, but you would still need the cipher keys, and you have to have the passcode for those.
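As I understand it, the derivation works roughly like this (a sketch of the idea, not Apple's implementation): the passcode is tangled with a per-device hardware UID, so even with the limits removed, every guess has to run on the phone itself rather than on some fast external cracking rig.

# Sketch: data key derived from passcode + per-device UID. The UID is
# fused into the hardware and never leaves it, so candidate passcodes
# can't be tested off-device. All values here are made up.
import hashlib

DEVICE_UID = bytes.fromhex("00" * 32)  # hypothetical; the real UID is secret

def derive_data_key(passcode: str) -> bytes:
    # PBKDF2 stands in for the hardware key-tangling iOS actually uses.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

print(derive_data_key("1234").hex())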
 

steve62388

macrumors 68040
Apr 23, 2013
3,090
1,944
This is 100% correct.

The problem is not who has access to our personal data, but bad guys having guns and good guys not having guns.

This is why terrorist attacks in SB, on US military bases, in England, France, and other countries have been successful. The bad guys kill without remorse, and the disarmed population just dies. In GB, you can't even defend yourself with a stick without being sent to prison. Civilian disarmament is killing us all around the world.

Yeah, because it worked out really well for all those people carrying guns in the Twin Towers.

I don't get why Apple makes a big song and dance about this whole encryption thing when your iCloud data is accessible. Cook talks like it's an important principle, but isn't it really a PR exercise?
 