As you’ve probably heard, the FBI really, really wants Apple to help them unlock an iPhone that belonged to one of the San Bernardino shooters. A court has actually ordered that Apple do so; in response, Apple CEO Tim Cook issued a public reply (in addition to immediate appeals); you can (and should) read that letter in its entirety, because in it Cook lays out very clearly what’s at stake here.
All this is very confusing to lay people, though, I’m sure. We nerds have been up to our asses in crypto for a long time, and understand how critical it is to modern life. You use strong crypto every day, even if you don’t realize it — every time you see that little “lock” icon in your browser, you’re using it, and (to a first approximation) your browser session is locked up tight — otherwise, online commerce wouldn’t be possible, right?
Obviously, a phone is different from your shopping cart at Amazon, but there are lots of points here that are still being obscured by poor media coverage that has, in general, been entirely too deferential to law enforcement and the government. Let me lay out a few things for you, in simple terms, to help you make sense of it all, because whether you realize it or not this case affects you.
First, you need to understand something about encryption itself.
Properly implemented encryption is effectively unbreakable with current technology. (I could explain why, but it would make this post WAAAAY too long.) Not even the NSA can break it; the computing power doesn’t exist yet. It might, in the future (google “quantum computing”), but right now it’s safe and secure.
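To see why "the computing power doesn’t exist," do the arithmetic yourself. Here’s a back-of-the-envelope sketch for brute-forcing a 128-bit key; the attack rate is a deliberately generous made-up number, purely for illustration:

```python
# Back-of-the-envelope: brute-forcing a 128-bit key.
# Assumes (very generously) a billion machines each trying a
# billion keys per second -- these rates are illustrative, not real.

keyspace = 2 ** 128                # possible 128-bit keys
guesses_per_second = 10 ** 18      # 10^9 machines x 10^9 guesses/sec

seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)

print(f"{years:.2e} years to exhaust the keyspace")
# On the order of 10^13 years -- roughly a thousand times
# the age of the universe. And that's the *smaller* key size.
```

That’s why the only practical attacks are on the implementation, the passphrase, or the endpoints, never on the math itself.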
That’s exactly why law enforcement is so up in arms about wanting back doors built into things: precisely because they can’t break into some systems or data files if they’ve been properly encrypted. Think about it: the cops don’t care how strong your locks are, because they can always break your door. They care about encryption because, done right, they have no recourse.
Second, you need to understand that encryption isn’t the whole picture here.
There’s also device security, which Apple has been steadily improving. You have probably seen by now stories about how “well, they helped cops BEFORE, why won’t they do it now?” These are willfully misleading stories authored by deliberately ignorant people who are carrying water for the anti-crypto squad. Just because it was easy or trivial for Apple to unlock a phone in 2008 doesn’t mean it’s just as easy or just as trivial to do so now, because every new iPhone and new version of iOS improves the platform. Apple likely views the ease with which a non-owner (including Apple itself) could unlock prior phones as a flaw to be fixed, and is behaving accordingly.
Good, because the only secure device is one that only its owner can unlock.
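The specific protection the FBI wants circumvented is this kind of thing: escalating delays after failed passcode attempts, and a wipe after too many failures. Here’s an illustrative sketch of that mechanism; this is not Apple’s actual code, and the delay values are made up for illustration:

```python
# Illustrative sketch (NOT Apple's implementation) of failed-attempt
# throttling: short delays at first, escalating delays later, and a
# device wipe after too many failures. All numbers are hypothetical.

MAX_ATTEMPTS = 10
# Delay (seconds) imposed after the Nth failed attempt -- illustrative.
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]

def check_passcode(entered: str, actual: str, failed_count: int):
    """Return (unlocked, new_failed_count, delay_before_next_try)."""
    if entered == actual:
        return True, 0, 0
    failed_count += 1
    if failed_count >= MAX_ATTEMPTS:
        raise RuntimeError("device wiped: too many failed attempts")
    return False, failed_count, DELAYS[failed_count]
```

The court order effectively asks Apple to ship firmware with `MAX_ATTEMPTS` and `DELAYS` removed, so passcodes can be guessed electronically at full speed. That’s why “just this one phone” is a fiction: the removal tool, not the phone, is the deliverable.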
Third, Cook’s assertion that the FBI’s request would make all similar phones vulnerable is absolutely and unequivocally true.
The cops are demanding, basically, that Apple create a tool that will circumvent the security of the iPhone in question. Such a tool, once created, will almost certainly get leaked and used by other parties — like foreign intelligence people, or criminals, or repressive regimes.
Law enforcement loves to suggest that such bypass tools or (worse) built-in back doors will only ever be used by the “good guys,” but that doesn’t even pass the risibility test. Even supposed “good guys” overstep their authority with astonishing regularity, and law enforcement in the US is absolutely no exception. “Trust us!” is a bullshit argument.
Fourth, don’t give this mouse a cookie.
If Apple is forced to do this, now, to this particular generation of iOS and iPhone, then you can be sure that law enforcement will insist they do so (or attempt to do so) for later iterations of the platform. (This is one reason Apple is working so hard to make the devices secure and private, even against attacks from Apple itself.) We cannot let cops — who, let’s be honest, would be happier with a master key to all locks, all phones, all safety deposit boxes, etc., because what do you have to hide? — dictate privacy and security for the rest of us, and Apple realizes this.
Fifth, there is no ticking-time-bomb situation here.
Thus far, terrorist tradecraft is best described as “epically shitty.” The Paris attackers used normal SMS, which is incredibly insecure. They used regular tappable phones. But even if they started using secure methods, signals intelligence isn’t how you track these people. You need to chase them and catch them and prevent attacks through normal police work; you can’t expect an online dragnet of messaging traffic to do much for you (and, indeed, it clearly doesn’t work, even putting aside the privacy concerns). The FBI knows who did this. They have reams of other evidence. They’re using this case, and the specter of TERRORISM TERRORISM TERRORISM, to try to stifle real security for ordinary Americans. There’s no reason to do that.
Stand with Apple, even if you prefer Android. Stand with Apple, even if you hate the walled garden. Stand with Apple, because they are absolutely the only player in this market who have absolutely no interest in analyzing what you do online and selling it to other people. They’ve been increasingly vocal in their commitment to user privacy, and have proved it with the ongoing security improvements in the iPhone. Now they’re putting their money where their mouth is in a big way, on a big stage, in this particular case. Good for Tim Cook, and good for them, and good for US, because it’s a certainty that the Feds would much rather have us insecure.
As security expert Bruce Schneier puts it:
Today I walked by a television showing CNN. The sound was off, but I saw an aerial scene which I presume was from San Bernardino, and the words “Apple privacy vs. national security.” If that’s the framing, we lose. I would have preferred to see “National security vs. FBI access.”
He’s right.
More from Rep. Ted Lieu, and more background on why Apple is so pro-crypto (that bit’s long, but you should read it).