Part 1: What’s everybody fighting about?
Apple and the FBI’s very public fight over unlocking a certain iPhone is due for a court hearing next week. Both sides have filed a flurry of legal briefs arguing everything from what’s technically possible to what’s ethical, and at points the filings have descended into what basically amounts to character assassination. The next hearing is more likely to trigger another round of fighting than to resolve anything permanently, so there will be plenty more time for hand-wringing over this issue.
Many have written about this topic already, but most of what I've seen in print has been a subjective "privacy vs. terrorism" argument, has amounted to "I'm in technology and I say this is bad, trust me", or has been written at a technical level that only those of us already familiar with the underlying technology can follow. This post is my attempt to avoid the subjective arguments and stick to the technical facts while explaining the whole issue in enough detail that the layman can understand what this fight is about and what the potential consequences would be for them.
In this 3-part series we’re going to steer clear of the political arguments as best we can and focus on what this case could mean for businesses. Part 1 explains exactly what the FBI is asking for and why Apple doesn’t want to give it. Part 2 will focus on the technical implications for IT personnel, hopefully shedding some light on why many in the tech world say that what the FBI is requesting is a potential backdoor that would decrease security overall, and that this is about more than just one phone. Part 3 will cover the potential business and legal implications for executives whose companies may find themselves affected by the results of this case.
This is written for a non-technical audience with no familiarity with encryption technology or how the iPhone’s security mechanisms work. Where necessary I will explain these concepts in layman’s terms. If you’re a cryptographer, PKI expert, or iPhone developer: please don’t hate me for my simplified explanations; I’m trying not to drown the average reader in unnecessary technical jargon and had to leave some of the finer details out as a result.
To start with, we must recognize that there is no such thing as “secure”: given enough time and effort an attacker can always find a way to break a system. This is a concept I will keep coming back to over and over. Encryption is as close as we can get to mathematically pure security, relying on the properties of numbers and mathematical functions to scramble our data so well that those without the correct key (which is really just a very large number with special properties) cannot read it. Encryption is both the reason why hackers can’t impersonate web sites to steal our credit cards (usually, at least) and why the FBI says they can’t unlock this particular iPhone.
The use of encryption has always involved an inherent resource trade-off: the stronger the encryption system, the more time or processing power it requires to use; conversely, any decryption key can be cracked given a fast enough computer and enough time. The goal has always been to make the encryption strong enough that an attacker would be forced to spend an insane amount of resources to build a cracking system fast enough to find a particular key in a reasonable amount of time.
The capabilities of modern encryption are such that even with the entire $10.8 billion budget the NSA had in 2013, it should be infeasible to find a decryption key via a brute-force search any time before the Sun burns out and swallows the Earth. But it’s not that simple: computers get faster every year, so a cracking effort that was infeasible a few decades ago may be practical today and is getting cheaper every day, which means over time we must move on to stronger and more complicated algorithms with longer keys. For this reason we have retired old algorithms like the DES cipher and the MD5 hash function and moved on to newer ones like AES and SHA-2, with keys that have grown from 56 bits in length to 256 bits.
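To put rough numbers on why key length matters, here is a minimal back-of-the-envelope sketch. The guess rate and machine count are invented, deliberately generous assumptions, not an estimate of anyone's real capability:

```python
# Back-of-the-envelope brute-force arithmetic. The guess rate and
# machine count are invented, deliberately generous assumptions.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_find_key(key_bits, guesses_per_second=1e12, machines=1e6):
    """Average years to brute-force a key (half the keyspace searched)."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / (guesses_per_second * machines) / SECONDS_PER_YEAR

# A 56-bit DES-era key falls in a tiny fraction of a second on this
# imaginary rig; a 256-bit key takes astronomically longer than the
# lifetime of the Sun.
print(f"56-bit key:  {years_to_find_key(56):.2e} years")
print(f"256-bit key: {years_to_find_key(256):.2e} years")
```

Doubling the key length doesn't double the work; every additional bit doubles it, which is why the jump from 56 to 256 bits moves the answer from "instantly" to "never, for all practical purposes".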
Encryption can also only be as good as our own incomplete understanding of mathematics and, to be practically useful, has to exist within this messy world that we occupy. The result is that new mathematical discoveries can make encrypted data easier to decrypt than was previously believed possible, flawed software implementations can weaken the theoretical strength of a particular encryption algorithm, and poor choice of decryption keys (or the passwords that protect those keys) can allow an attacker to crack the key in a reasonable amount of time.
Besides direct technical attacks on the encryption system we must also worry about an attacker finding and stealing the decryption keys or performing what is euphemistically known as “rubber hose cryptanalysis”: finding someone who knows the key and punishing them until they reveal it (e.g. beating them with a rubber hose).
Of Keys and PINs
And this is where we find ourselves in the fight between Apple and the FBI: A terrorist used an iPhone that encrypts its data by default with a system so strong that it would take eons to crack. The FBI has that phone and wants to know what’s contained within it. The decryption key is actually stored on the phone itself and as such is technically within the FBI’s possession. The only thing that stands between the FBI and that decryption key is a computer chip that guards it, waiting for a 4-digit PIN to reveal it. The phone’s former owner is deceased, so no amount of punishment will ever extract the PIN from him.
Normally a 4-digit PIN should be easy to crack: there are only 10,000 possible combinations and a computer can run through them in the blink of an eye. There’s a catch though: in this case the chip has been programmed to introduce a delay every time an incorrect PIN is entered, and after enough incorrect entries it permanently deletes the key that it guards. If the key were deleted, the FBI would be forced to fall back to the much more daunting task of finding the decryption key itself via an exhaustive search of every possible key, which is likely well beyond their capabilities or anyone else’s at this point in time.
This chip can be given new instructions, however. If the chip were reprogrammed to eliminate the delay and allow unlimited PIN entries, then the decryption key could be unlocked without any danger of it being destroyed. Unlocking a phone with only a 4-digit PIN would be a foregone conclusion in this scenario, but there’s another catch: if it were easy to reprogram the chip, then no one, FBI or criminal, would have to crack the decryption key at all. It would only take one industrious hacker to create and share a new set of instructions that could be uploaded to any phone to bypass the security protections, and thus the whole encryption system. Apple did think of this scenario and programmed the chip to prevent it.
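The interplay between the tiny PIN space and the chip's countermeasures can be sketched as a toy model. Everything here is hypothetical for illustration: the class, the names, and the wipe threshold of 10 attempts are assumptions, not Apple's actual design or parameters:

```python
# Toy model of a PIN-guarded key. The escalating delays are elided;
# the wipe-after-too-many-failures behavior is shown. The threshold
# of 10 is an assumed value, not Apple's actual parameter.
class PinGuardedKey:
    def __init__(self, correct_pin, wipe_after=10):
        self._pin = correct_pin
        self._failures = 0
        self._wipe_after = wipe_after
        self.key_wiped = False

    def try_pin(self, guess):
        if self.key_wiped:
            raise RuntimeError("decryption key permanently erased")
        if guess == self._pin:
            return "key released"
        self._failures += 1            # the real chip also delays here
        if self._failures >= self._wipe_after:
            self.key_wiped = True      # key destroyed; data unrecoverable
        return None

# A naive brute force of all 10,000 PINs destroys the key long before
# it reaches the right one:
chip = PinGuardedKey(correct_pin="7432")
for guess in (f"{i:04d}" for i in range(10000)):
    try:
        if chip.try_pin(guess) == "key released":
            print("cracked:", guess)
            break
    except RuntimeError as err:
        print("attack failed:", err)
        break
```

The point of the model: the 10,000-combination search that would otherwise take milliseconds becomes useless, because the guard logic, not the math, is what stops the attacker.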
The chip will only allow itself to be reprogrammed if its new instructions come “signed” with a digital signature based on Apple’s own secret encryption keys, keys that were programmed into the chip at the factory. Anyone can try to create new instructions for this chip but, whether it’s the FBI or a phone thief, without Apple’s signing keys the chip will reject them. Cracking Apple’s signing key is just as tough as cracking the decryption key on the phone itself: it likely won’t happen until long after we’re all dead and this case is long forgotten.
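The chip's accept-or-reject gate can be sketched in a few lines. One important caveat: real firmware signing uses asymmetric (public-key) signatures, whereas this standard-library-only toy uses an HMAC with a factory-installed shared secret. The gate logic is the same shape, but the secret and all names here are made-up placeholders:

```python
import hashlib
import hmac

# Toy firmware-signing gate. Real code signing is asymmetric; this
# sketch uses an HMAC purely to show the accept/reject logic. The
# secret is a placeholder standing in for keys burned in at the factory.
FACTORY_SECRET = b"placeholder-factory-signing-secret"

def sign(firmware: bytes, secret: bytes) -> bytes:
    """Produce a signature over the firmware with the given secret."""
    return hmac.new(secret, firmware, hashlib.sha256).digest()

def chip_accepts(firmware: bytes, signature: bytes) -> bool:
    """The chip recomputes the signature and compares in constant time."""
    expected = sign(firmware, FACTORY_SECRET)
    return hmac.compare_digest(expected, signature)

official = b"vendor-signed firmware update"
print(chip_accepts(official, sign(official, FACTORY_SECRET)))  # True

# An attacker without the factory secret can sign whatever they like,
# but the chip will reject it:
forged = b"firmware that disables the PIN delay"
print(chip_accepts(forged, sign(forged, b"attacker's guess")))  # False
```

This is why the whole dispute hinges on Apple: anyone can write replacement instructions, but only instructions signed with the right key will ever run.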
So, what it comes down to now is that the FBI is essentially trying to use the rubber hose cryptanalysis technique on Apple via the courts: punish Apple until they agree to create new instructions for the chip, digitally sign them with their own keys, use the result to reprogram the iPhone in question, and crack the PIN to reveal the decryption key for the FBI.
Smoke and Mirrors
The story above is neat and tidy; it paints a picture of the iPhone as a veritable fortress, impenetrable to the FBI or anyone else unless Apple sees fit to call off its guard dogs. This, however, is very far from the truth. Apple does put a lot of effort into security, but the iPhone is a collection of hardware and software designed and created by humans, and humans make mistakes, and mistakes weaken security.
There is a long history of software vulnerabilities in the iPhone’s iOS operating system, some of which have allowed the “lock screen” to be bypassed both on the latest version of iOS and on the version running on the phone in the FBI’s possession. There are almost certainly vulnerabilities still lurking within it waiting to be discovered, or perhaps they have already been discovered by the NSA, other countries’ intelligence agencies, criminal hackers intent on using them for their own personal gain, or private companies willing to build them into tools they can sell to law enforcement agencies. This is the reason for the oft-heard comment, sometimes issued tongue firmly in cheek, “why doesn’t the FBI just ask the NSA for help?”, and one Edward Snowden, formerly of the NSA, seems to believe that this approach is possible. To be fair, even assuming the NSA knows of a suitable vulnerability, there are good technical reasons why the FBI doesn’t ask the NSA for help (we’ll get to the legal reasons in Part 3).
Some of these vulnerabilities might be directly exploitable from the phone’s screen or by plugging it into a computer but others may require opening the phone up and manipulating the chips themselves. Exploiting these vulnerabilities, especially ones that involve tampering with the hardware, may not be without risk of destroying the data, a risk the FBI doesn’t want to take.
The use of such a vulnerability also plays into the FBI and the NSA’s very different missions: The FBI is trying to gather evidence to catch and charge criminals and terrorists who are, on average, fairly unsophisticated. The NSA on the other hand is attempting to tap into the communications of other nations, many of which have very sophisticated capabilities rivaling the NSA’s, and keep that capability over the long haul. Using a rare iOS vulnerability to crack one phone that likely won’t reveal much of value would not seem to be a good use of national security resources, especially if it reveals the existence of the vulnerability to the world thereby eliminating the NSA’s ability to exploit it. Whether the NSA should be keeping vulnerabilities secret in order to use them offensively rather than reporting them to vendors thereby improving our ability to defend our own systems is another philosophical argument for another day.
So we can accept that the FBI may have alternative methods of gaining access to this particular iPhone but none that they particularly want to use, either for technical, legal, or national security reasons. Their preferred option seems to be legally forcing Apple to unlock this phone (and many others) so that they no longer have to rely on exploiting technical vulnerabilities that could end up getting “fixed” at any time.
Who’s The Enemy?
As I’ve mentioned before, there’s no such thing as “secure”; the very fact that there is one known way and many other potential ways to decrypt this particular iPhone is evidence of that. Many people have argued that the FBI’s request will “make us less secure”, while the FBI itself argues that the ability to decrypt this iPhone is essential to preventing terrorism, which seems like the sort of thing that would make us more secure. Which side is correct about what would increase or decrease security? The answer: likely both of them.
The FBI and the commenters who are pushing back against the FBI are worried about two very different threats, both of which make us less secure but in different ways. The FBI’s threat is a bit easier to understand: terrorists communicate with each other and then decide to shoot and blow up innocent civilians. Being able to see who terrorists are (or were in this case) communicating with can help us catch other accomplices which keeps us all safer.
A legal precedent would also likely be set by this case if it went in the FBI’s favor: the law they are invoking is not specific to terrorism; it dates back to the 18th century, when neither terrorism nor smartphones were of much concern. The result is that phones belonging to drug dealers, crime bosses, and all sorts of other miscreants could be decrypted, leading to more arrests and keeping us all safer.
On the other hand, the whole reason Apple is using such strong security on the iPhone is because there are many people who are trying to either physically steal them or hack in and use them against their owners. Some of these hackers are spreading malware in order to capture banking credentials and commit wire fraud while others are the foreign equivalents of the NSA looking to spy on important people within the US. Thanks to Snowden we know just how invested the NSA is in the ability to hack smartphones as well and some of Apple’s security measures and the public backlash against the FBI are the result of this information.
It’s obvious that both the FBI and Apple are worried about security but they have very different priorities: Apple is trying to keep their customers from getting their credit cards, identities, and nude photos stolen while the FBI is trying to catch terrorists and, occasionally, the people who steal nude photos. There is a classic risk assessment equation that comes into play here: expected impact x expected frequency = expected loss.
If we use this equation to figure out whether Apple or the FBI is addressing the larger risk, we quickly realize that the terrorist threat may have a large impact but manifests very rarely, while Apple is worried about a relatively small-impact threat that nevertheless manifests every second of every day across all of their customers. I wouldn’t be surprised if hacking has more of a negative impact on the economy than the threat of terrorism. We also have to consider that giving the FBI the ability to unlock iPhones may not actually reduce the expected frequency of terrorist attacks; keep in mind this phone is in the FBI’s possession because an attack was successful and the attacker is already deceased.
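The comparison above is just the risk equation with different inputs plugged in. The figures below are invented placeholders chosen only to show how a rare high-impact threat and a constant low-impact threat can trade places, not real statistics:

```python
# expected loss = expected impact x expected frequency
# All dollar figures and frequencies below are invented placeholders,
# not real statistics about terrorism or cybercrime.

def expected_loss(impact_per_event, events_per_year):
    return impact_per_event * events_per_year

# Rare but catastrophic vs. small but constant (hypothetical numbers):
terrorism = expected_loss(impact_per_event=1_000_000_000, events_per_year=0.5)
hacking = expected_loss(impact_per_event=5_000, events_per_year=1_000_000)

print(f"terrorism: ${terrorism:,.0f} per year")  # $500,000,000
print(f"hacking:   ${hacking:,.0f} per year")    # $5,000,000,000
```

With these made-up inputs the "death by 1,000 cuts" threat dominates; with different inputs it wouldn't, which is exactly why the two sides can both honestly claim to be defending security.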
The Bigger Threat
It’s not easy to choose between “the big one” and “death by 1,000 cuts”, but Apple (along with much of the rest of Silicon Valley) and the FBI seem to have arrived at opposite conclusions. Perhaps this isn’t surprising in light of the FBI’s recent change to its mission statement, eliminating “law enforcement” in favor of “national security”. This recasts the FBI as more of a counterterrorism agency that would prioritize rare terrorist attacks over more mundane day-to-day crime. Besides, there are political and budgetary advantages to taking the counterterrorism tack, even at the expense of other types of security.
Apple’s priorities, on the other hand, remain with the customers who voted with their dollars to trust Apple to protect their data. From Apple’s side this is about more than money, though. Apple surely realizes that if the security on their phones is not strong enough and customers get hacked too frequently, they will not only lose customers but also face political pressure to improve their security, owing to their prominent position in the marketplace. Apple is basically stuck: conceding to the FBI’s demands puts them in a no-win situation with both the public and the parts of the government outside the FBI.
It's also not surprising that those of us in information security tend to side with Apple in this fight. While the FBI may spend their time looking for isolated terror threats and cleaning up after the rare successful attack, we spend our days fending off millions of actual attacks and cleaning up after very frequent successful breaches. Unlike much of the public, we know how hackers work, how sophisticated they are, and how they exploit vulnerabilities. We innately understand that what the FBI is asking for creates more vulnerabilities that will make many people less secure; perhaps we have not done the best job of explaining that to the public.
It's easy to say that the threat of terrorism should take priority over hacking due to the potential impact on lives, but this is a false dichotomy: terror groups themselves are starting to use hacking as one of their tools. We've already seen ISIS-affiliated hacking groups hit the US military and other targets, potentially including the FBI itself. This is about more than leaking information, too: hackers, ostensibly associated with Russia, have successfully knocked power grids offline, and critical infrastructure in this country is just as vulnerable. We also have medical devices, airplanes, cars, and many other devices with the ability to kill us that have become vulnerable to hacking as they become increasingly connected. All of this opens the door to terrorist attacks based not on killing people with bombs and guns but on killing with our own hijacked belongings and infrastructure. This is the fundamental reason why so many in tech are afraid (or at least should be) of the FBI's actions creating even more vulnerabilities.
There is another potential threat that I’ve excluded from this equation, one that neither the FBI nor Apple is primarily worried about, though Apple may consider it insofar as some of their customers are worried about it: the possibility of domestic spying run amok and abused for political purposes. The Snowden leaks brought this into the public consciousness in a big way, and many of the people speaking out against the FBI are likely doing so as a result of the trust they lost in the government during that episode. There are certainly many cases where various branches of the government have abused their positions of power for political purposes, the FBI itself has a long history of abusing its power, and not all of those cases are decades in the past. We don’t have much visibility into how prevalent this sort of activity is, so it will have to remain an unknowable factor for now, but it does have many people worried.
The FBI doesn’t seem to have many friends in this fight. Many tech companies have come out in support of Apple and filed briefs with the court to that effect, a federal court in New York has already forcefully rejected a very similar request by the FBI, and the Director of the FBI has had to answer some tough questions from Congress about the situation. This may turn out to be an iPhone too far for the FBI after all.
Continued in Part 2