Apple still won’t help the FBI break into iPhones
There are two important lessons in last week’s announcement that the Federal Bureau of Investigation has finally succeeded in cracking two mobile phones belonging to Mohammed Alshamrani, the aviation student who killed three people last December at a naval base in Pensacola, Florida.
The first lesson is that cracking an encrypted device takes time and effort even when the federal government brings all its resources to bear. The second is that Apple still refuses to build tools to make hacking its mobile devices easier.
Maybe I’m in the minority, but I’m happy about both.
The story is a familiar one. After the Pensacola attack, the FBI found a pair of iPhones belonging to the shooter. The Justice Department promptly obtained a warrant for their contents, and went to Apple to ask for help breaking the encryption. Although the company did provide certain assistance, it refused to develop software tools to crack its own devices. This has been Apple’s position for years, and it’s one I’ve defended.
So what happened? As in the past, after fulminating for a bit, the FBI got down to work and managed, through means not disclosed, to get into Alshamrani’s phones, obtaining valuable intelligence in the process.
That the FBI found a way past the phones’ defenses is no surprise. The tech community was skeptical from the start of the government’s claim to be unable to crack the devices. Security consultants have long warned that no device encryption is fully secure against a determined, well-resourced attacker.
In other words, the feds don’t really need Apple to get into an iOS device.
One might conclude that cracking encryption should therefore be made easier. It shouldn’t. That breaking into a locked mobile device takes time and effort is one of the few guarantees we have that the government will invest the necessary resources only rarely.
Encryption is getting better. That’s why under both the current administration and its predecessor, national security officials have called upon tech companies to include in their devices special keys that will allow access in an emergency. The tech industry has resisted these demands, but few went as far as Apple — until recently. Just over a year ago, Google added protections that make hacking Android phones harder, even when the hacker is law enforcement.
I’m told that behind closed doors, much of Silicon Valley thinks Apple is wrong to be so intransigent. Cooperation with law enforcement is routine among U.S. businesses; some techies see no reason for Apple to get a pass. Although I see the point, I continue to find the company’s position attractive. I’m left uneasy by the notion that privacy should be restricted because bad people might misuse it.
Still, the pressure has had its effect. Although Apple steadfastly refuses to build a back door into its mobile devices, earlier this year the company abandoned plans to allow iOS users to fully encrypt their iCloud data. Given that iCloud has an estimated 850 million users, this is no small concession.
What this means in practice is that when law enforcement comes to Apple with a warrant for the contents of your phone, the company will turn over whatever you’ve uploaded. In the case of the Pensacola shooter, Apple proudly touted that it did exactly that. The caveat is important: even if what attracts you to Apple is the end-to-end encryption of its messaging and the difficulty of breaking into your phone, whatever you upload to the cloud remains available.
Maybe this is an attractive compromise: keep your data only on your phone, and the government will need months to break in; upload your data to the cloud, and a warrant gains rapid access to all of it.
But I worry. My rather old-fashioned view is that privacy is less a “right” than a check on the power of the state. Government can’t regulate what it’s unaware of. That’s why I’m glad that even for the FBI, cracking a phone takes time and effort. The cost in resources forces officials to be picky about when to try.
There’s always a hard hypothetical: a child at risk, a bomb ticking away. But even if we can imagine a moment when we’d all agree that the maker ought to find a way to open the phone, we do better to pretend that we can’t. Nearly six decades ago, the great constitutional scholar Charles Lund Black pointed out that absolutist rules have the virtue of being rarely breached. As rules grow more flexible, we become more creative at coming up with exceptions.
That’s why if our goal is to ensure that manufacturers will help government break into our phones only in the most urgent circumstances, the best approach is to cheer them when they say “Never.”