Apple and the Department of Justice are dueling over whether the iPhone maker must write code to help the government break into the San Bernardino shooter’s phone. The government obtained a warrant to search the phone (a nicety, perhaps, since the phone’s owner has consented to the search, and the shooter is dead). But the killer took advantage of the iPhone 5C’s security: it’s locked with a passcode, and it is likely to wipe its contents after a certain number of incorrect attempts to enter that code. (And iOS forces a delay in retrying the passcode once the user incorrectly enters the code four times.) DoJ relies on the All Writs Act, originally passed in 1789, back when the Blackberry was the smartphone of choice.
There’s been some excellent analysis of the situation; I would point you in particular to Dan Guido’s technical analysis of Apple’s ability to comply, Cyrus Farivar and David Kravets’s overview of the legal precedent and facts, and Michael Dorf’s primer on the AWA. There is also a fair amount of spin, BS, and misunderstanding on both sides of the argument. So, I thought I’d chime in, just to make matters worse, with a few points.
First, this is not a fight about Apple’s encryption – at least, not directly. The government wants the ability to attempt to crack the passcode via brute force. What it wants Apple to do is to make that easier, and to make mistakes less costly. (I’m not sure why the latter is vital. The FBI could easily clone the phone’s storage, so if it gets bricked, they can swap in a new version. Certainly, it’s convenient to have Apple deactivate the error bomb, but hardly necessary.)
Second, we don’t fully understand the depth of the government’s request. Guido notes that iOS imposes a minimum of 80ms per passcode query. That still lets an attacker run a lot of queries, but it’s not at all clear that this is anything other than a programming choice. Quoting him, the DoJ has demanded that “[Apple] will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.” (emphasis mine) Again, I’m no expert in iOS, but it looks like the 80ms delay is a choice made by iOS, not a limit introduced by hardware capabilities. I would bet that the government wants an iOS update that runs queries as fast as the processor, memory, and storage can handle them, which would cut its time to break in.
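To see why the 80ms floor matters, some back-of-the-envelope arithmetic helps. The sketch below compares worst-case brute-force times for numeric passcodes at the iOS-imposed 80ms per attempt against a hypothetical faster hardware-limited rate; the 5ms figure is my assumption for illustration, not a number from Apple or Guido.

```python
# Worst-case brute-force time for numeric passcodes.
# 80ms is the iOS-imposed software floor; 5ms is an ASSUMED
# hardware-limited rate, purely for comparison.

SOFTWARE_DELAY_S = 0.080   # iOS-enforced minimum per attempt
HYPOTHETICAL_HW_S = 0.005  # assumed raw hardware cost per attempt

def worst_case_hours(digits: int, seconds_per_try: float) -> float:
    """Hours to try every possible numeric passcode of the given length."""
    attempts = 10 ** digits
    return attempts * seconds_per_try / 3600

for digits in (4, 6):
    slow = worst_case_hours(digits, SOFTWARE_DELAY_S)
    fast = worst_case_hours(digits, HYPOTHETICAL_HW_S)
    print(f"{digits}-digit passcode: {slow:.2f} h at 80ms vs {fast:.2f} h at 5ms")
```

A four-digit passcode falls in well under an hour even at 80ms; the delay only starts to bite at six digits and beyond, which is consistent with the government wanting the software floor removed.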
Third, there may be nothing useful on the phone. Both attackers are dead. The FBI says it can’t rule out that there are accomplices out there, but they’re doing a thorough job of investigating the shooter’s brother, along with other possible conspirators, and at this point, we also can’t rule out a second gunman on the grassy knoll. I don’t think there is much exigency here. Yes, getting into the iPhone might help the investigation. But it’s not clear that there is any extant threat, nor even any extant defendants. This looks a bit like a fishing expedition.
Fourth, this is a political play by the government. The FBI has been pushing for some time to get law enforcement access to encryption systems. (This is often referred to as a backdoor, but it’s really just key escrow. It means either that data is encrypted such that more than one private key will decrypt it, or that the vendor keeps a copy of all private keys and can access them. Lotus Notes, which I worked on for five years, introduced key escrow reluctantly in version 5. And, to comply with export controls limiting the strength of encryption sold outside the U.S., the company turned over part of the private key that Notes uses to the U.S. government. This meant, for example, that key strength was 128 bits against the world, except for the U.S. government, against which it was 64 bits.) It hasn’t gotten much traction; even the NSA has come out against it. I think that’s why the FBI / DoJ have launched this effort to compel Apple. If they succeed, they get a means to bypass encryption more easily without having to pass any legislation or get any new regulations. If they fail, it provides a politically potent example with which to press their cause: we can’t fight terrorists without backdoors! And, the case has generated some bad publicity and ill will for Apple. Marmot impersonator Donald Trump has called for a boycott of Apple, despite using an iPhone himself.
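The Lotus Notes arrangement is worth making concrete, because "128 bits against the world, 64 bits against the U.S. government" sounds paradoxical until you do the arithmetic. A minimal sketch of the idea, with the bit counts taken from the text above:

```python
# The Notes "workfactor reduction" scheme in arithmetic: escrow 64 of
# the 128 key bits with the government, and its remaining brute-force
# search space shrinks to 2**64, while everyone else still faces 2**128.

TOTAL_BITS = 128     # full session-key strength
ESCROWED_BITS = 64   # bits handed over in escrow

world_search_space = 2 ** TOTAL_BITS
gov_search_space = 2 ** (TOTAL_BITS - ESCROWED_BITS)

print(f"everyone else must search up to 2^{TOTAL_BITS} keys")
print(f"the escrow holder searches only 2^{TOTAL_BITS - ESCROWED_BITS}")
print(f"advantage factor: 2^{ESCROWED_BITS} = {gov_search_space:,}")
```

The same key protects everyone equally; the escrow holder simply starts with half the bits already in hand.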
Fifth, if the government presses its legal case (which I fully expect them to do), I think they’ll win. The All Writs Act is the sort of catchall that seems like it ought to run afoul of due process or separation of powers concerns, but the Supreme Court has upheld it. As I understand U.S. v. N.Y. Telephone Co., there are three prongs to the analysis of an AWA request to a third party: whether the measure is necessary, whether the burden imposed on the target is unreasonable, and whether the party is too distant / removed from the situation (a sort of gloss on burden and reasonableness). The iOS update is arguably necessary, since Apple’s security could brick the phone if DoJ doesn’t guess correctly, quickly. Since Apple is the vendor, they are not likely to be too removed. Burden is the harder one, but it’s much easier to remove functionality than to add it. This is Apple’s best hope, but it doesn’t seem a strong one. The case law on AWA is mixed, but it generally runs in favor of the government.
Sixth, the government’s contention that this update applies only to a single iPhone is sophistry. Sure, the code will specify this particular device’s hardware identifiers and the like. But changing that for future iPhones – at least, future iPhones 5C – is trivial: it’s literally find-and-replace in the code. So, the update isn’t a universal zero-day or anything like that. But it is a tool that will be ready the next time law enforcement wants to hack an iPhone – and, it’s one where Apple will no longer have a credible burden argument to resist creating an update. Neither side is portraying this problem very accurately: it’s not a one-off and it’s not a silver bullet that kills iOS security.
Finally, this case leaves aside the hard issue. Imagine Apple’s next iOS update locks the company out of access: once a user encrypts data, there’s no way for a third party to access it short of a brute force attack. Can the government mandate an operating system update that reverses this? Or that captures the user’s password securely somewhere accessible to Apple / law enforcement if need be? This is, in other words, the encryption / escrow debate, and it has strong flavors of the fight over whether tech firms must build in features that protect against copyright infringement (the one the Supreme Court ducked in MGM v. Grokster). That’s the real puzzle, and one for which this iPhone story is merely a play within a play.
Filed under: Apple, badware, Computer crime, Court Decisions, Criminal law, Encryption, Fourth Amendment, Intermediaries, Internet & Society, national security, NSA, Politics, Privacy, Security, Software