Part 3: What does this mean for my business?

In Part 1 we brought you up to speed on what the FBI’s request in the Apple vs. FBI fight is all about. In Part 2 we discussed what the results of this fight would potentially mean for the security of a corporation’s network. Here in Part 3 we will look at the business and legal implications for companies and their executives.

We are providing the following scenarios and analysis as an input into an organization’s or executive’s risk management process. These are all scenarios that could occur, each of which have consequences, including real economic consequences, of varying severity. It is prudent for anyone running a business or an IT organization to consider these scenarios as part of their risk management process.

Where we left off is that the iPhone utilizes very strong encryption, and that the easiest way around it, which is what the FBI wants, is for Apple to create a new update for the iPhone, signed with Apple’s own signing keys, that would allow the decryption key to be recovered. If this update, or the source code and signing keys used to create it, were to leak out, it would essentially break the trust model that underpins Internet security. That would make life much more difficult than it already is for the network administrators trying to protect their networks and systems from hackers, as they would no longer be able to trust the encrypted connections and software updates that businesses rely on to protect their communications and systems.
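To see why a leaked signing key is so dangerous, consider how signed updates work: a device accepts an update only if its signature checks out against the vendor’s public key, so whoever holds the private key can make any code look official. The sketch below is a toy illustration using textbook RSA in Python (tiny unpadded scheme, made-up update blobs), not Apple’s actual signing system:

```python
import hashlib

# Toy "textbook RSA" code-signing sketch. Real code signing uses padded
# signature schemes and vetted parameters; these Mersenne primes are for
# illustration only and are NOT a safe key choice.
p = (1 << 89) - 1          # a Mersenne prime
q = (1 << 107) - 1         # another Mersenne prime
n = p * q                  # public modulus, shipped with every device
e = 65537                  # public exponent: anyone can verify
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent: whoever holds this can sign

def digest(update: bytes) -> int:
    """Reduce a hash of the update into the key's range."""
    return int.from_bytes(hashlib.sha256(update).digest(), "big") % n

def sign(update: bytes) -> int:
    return pow(digest(update), d, n)    # requires the private signing key

def verify(update: bytes, signature: int) -> bool:
    return pow(signature, e, n) == digest(update)   # needs only the public key

official = b"firmware update built by the vendor"
sig = sign(official)
assert verify(official, sig)                 # device accepts the genuine update
assert not verify(b"tampered update", sig)   # ...and rejects anything else
# But anyone who obtains d can produce a "valid" signature for any payload,
# which is why a leaked signing key breaks the whole trust model.
```

The device only ever needs `n` and `e`; the entire system rests on `d` never leaving the vendor’s control.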

Note: the photo accompanying this post is of a device that is capable of unlocking an iPhone, including bypassing the phone's ability to delete the decryption key after 10 failed attempts. This was tested on iOS 8, the same version believed to be on the San Bernardino phone, and appears to belie the FBI's claims that there is no way to unlock that phone without Apple's assistance. The device can be bought online for $350.

You’re Next

It’s very likely that the phone, tablet, and laptop that you receive your work email on utilize the same type of encryption that the FBI is now trying to bypass. Similarly, every time you put your credit card number or password into a web site, retrieve your email, connect to your office VPN, send an instant message, make an Internet phone call, or have a videoconference you are being protected by encryption, perhaps without realizing it.

The company you work for and the various devices and programs your company uses for business implement this encryption because without it any attacker could retrieve your data from a stolen phone or spy on your communications, potentially capturing your passwords or other sensitive data in the process and using it to hijack your accounts. Somewhat ironically, the FBI itself used to recommend smartphones’ encryption to protect data from theft. Now that this same encryption is keeping the FBI from the data they want, they have quietly deleted these recommendations and essentially started a war on encryption.

Although the director of the FBI has said that this case is about one iPhone, his previous comments on the topic of encryption belie that statement. This case will set a legal precedent; that’s just how the US legal system works. If the FBI succeeds in forcing Apple to modify its code or hand over its keys, it is very likely that the FBI and other law enforcement agencies will immediately begin asking for similar backdoors in every other product and service that uses encryption.

This scenario would immediately affect any company that produces software or services that utilize encryption, as they would begin receiving orders from law enforcement to create backdoors of their own. Companies would have to decide between starting expensive legal fights to argue that their use of encryption is somehow different from any precedent set in the Apple case, or caving in to the demands and implementing the backdoors.

Implementing these backdoors would require developers to spend time trying to find the right balance: a system weak enough to let law enforcement in while still keeping hackers at bay. Keep in mind that this is a task which some of the leading cryptographers in the world have said is nearly impossible, and it’s not hard to see how it would put businesses in an untenable position, the same one that Apple is now struggling to keep itself out of. Nor would the work be over once the backdoors were implemented: companies would likely have to devote resources to supporting regular law enforcement requests, adding to overhead.

As we established in Part 2 of this story, the consequences of failure in this scenario would expose the company implementing the backdoors, and their customers, to hackers of all sorts. Depending on what sort of information is handled, this could range from identity thieves looking for credit cards, social security numbers, bank account passwords, and health records to foreign intelligence agencies stealing intellectual property, sabotaging mergers, spying on dissidents, trying to identify foreign agents, or simply using compromised systems as proxies for attacks on others.

The Customer is Always Right

The US currently enjoys its position as a tech leader thanks to early innovation in microprocessors, software, and networking, but encryption is just math and the US does not have a monopoly on it. If it becomes untenable for domestic companies to properly protect data with encryption, there are plenty of other countries that support encryption rights where competitors can set up shop. Israel has already become a research hub for both well-known technology companies and cybersecurity upstarts. US businesses will lose if our government acts as if the planet revolves around its decisions.

We can expect even the mere presence of these backdoors to drive foreign customers away from a company’s products or service as they fear US espionage just like the US fears foreign espionage on our systems. We’ve already seen this as a result of Snowden’s leaks as foreign companies lost trust in US companies’ ability to protect their data and took their business elsewhere. Should backdoors get exploited by hackers this would likely also result in US customers, be they businesses or consumers, being driven away to domestic or foreign competition as well.

We must also remember that most of the US government uses off-the-shelf networking products just like the rest of us, and any vulnerabilities resulting from weakened encryption or backdoors will affect them as well. We saw this with the Juniper backdoor revealed a few months ago: many US government systems were protected by the backdoored VPNs, and the agencies running them are now rightly concerned about foreign espionage. Despite the US government claiming that the backdoor is the work of a foreign government, security researchers who have investigated it believe that it was actually an NSA backdoor that was then found and hijacked by foreign hackers. This scenario is exactly why so many security professionals are worried about backdoors in iPhones or any other encrypted products. Congress is now investigating possible NSA involvement in the Juniper backdoor.

Cast Out

In Part 2 I covered how SSL/TLS, the encryption system that protects most of the sensitive communications on the Internet, is based on a network of trusted cryptographic keys that create digital signatures. This same network of trust that makes it possible for the FBI, other intelligence or law enforcement agencies, or hackers to hijack keys to forge digital signatures also provides a sort of solution for that very same problem: Software developers, businesses, and users can all configure their systems to not trust certain keys anymore. This is great news, unless you happen to be using the key that everyone has decided not to trust.

We saw this with the DigiNotar example I used in Part 2: that company was hacked and their stolen keys were used to make fraudulent certificates that were used to impersonate Google and a number of other web sites. The potential damage was so great that the companies that produced web browsers removed DigiNotar’s keys from the list of trusted keys. This literally put DigiNotar out of business and prevented their customers’ web sites and other online services from using encrypted connections until they found a new company to provide digital signatures.

There is no central authority for this trust system. Each company that produces a web browser or any other piece of software that utilizes this sort of encryption builds in a list of trusted Certificate Authorities, the companies that issue the highest level of digital signatures, into their products. The Certificate Authorities themselves maintain lists of keys that have been revoked and users are also free to change the lists of trusted keys contained within their software, adding or removing trusted keys as they please.
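Since these trust lists are just data shipped with the software, the cast-out mechanism is easy to sketch. The following toy Python model (all names are made up for illustration; real verification also involves signature checks, expiry dates, and more) shows how removing a single root key instantly breaks trust in every chain that ends at it, which is what happened to DigiNotar’s customers:

```python
# Minimal sketch of how trust decisions are local to each piece of software:
# every browser/OS ships its own root list, and removing one key cuts off
# everything signed under it. All names below are purely illustrative.

trusted_roots = {"RootCA-1", "RootCA-2", "DigiNotar"}   # the browser's built-in list
revoked_keys = set()                                    # CRL-style revocations

# Each site's certificate chain, leaf first, ending at a root.
chain_for = {
    "example.com": ["example-leaf", "Intermediate-A", "RootCA-1"],
    "victim.nl":   ["victim-leaf", "DigiNotar"],
}

def connection_trusted(site: str) -> bool:
    chain = chain_for[site]
    if any(key in revoked_keys for key in chain):
        return False                      # any revoked key poisons the chain
    return chain[-1] in trusted_roots     # chain must end at a trusted root

print(connection_trusted("victim.nl"))    # True -- before the compromise

# After the DigiNotar hack, browser vendors simply removed the root:
trusted_roots.discard("DigiNotar")
print(connection_trusted("victim.nl"))    # False -- every DigiNotar customer is cut off
```

Note that nothing in this model requires central coordination: each vendor edits its own list, which is exactly why losing the vendors’ trust is a business-ending event for a Certificate Authority.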

This presents a problem for a company that loses trust, perhaps because it is revealed that they handed over their key to law enforcement or cooperated with a law enforcement request to put a backdoor in their software or service: There is nothing preventing an independent Certificate Authority from adding that particular “compromised” key to a revocation list which would effectively cast the user of that key out of the web of trust, breaking their systems’ ability to handle any encrypted communications.

If certain Certificate Authorities decided to use this key revocation ability as a punishment for leaking keys or backdooring their software or services this would effectively put those companies in an impossible position: cooperating with law enforcement to avoid legal punishment would create the risk of being cut off from the ability to communicate with one’s own customers, effectively putting the company out of business, just like DigiNotar.

Security experts have also been concerned about law enforcement compromising a Certificate Authority and forging their own certificates, just like the hackers who attacked DigiNotar did. If the authors of the web browsers and other software that contains the trusted key lists decided that the Certificate Authority itself was no longer trustworthy they could effectively cut off an entire Certificate Authority and all of their customers, again just like DigiNotar except this time due to law enforcement cooperation rather than hacking activity.

This is a very dangerous path to go down as it could quickly lead to various geopolitical blocs establishing their own trusted Certificate Authorities and prohibiting software used in their bloc from trusting keys issued by other blocs. This would balkanize the Internet, a doomsday scenario that would break the interoperability that is the very reason we use it. Companies in one bloc would be cut off from any sort of encrypted communication with potential customers in another bloc (i.e. forget US companies doing business in China). The only other possible outcome of this scenario would be a redesign of the security mechanisms underlying the Internet but that is unknown territory.

Stuck in the Middle

The balkanization scenario described above may be a bit of a reach from where we find ourselves today but there is a much more likely reason why companies may find themselves in an impossible position, only in this scenario it would be law enforcement on both sides: many companies do business internationally and different countries have different laws that can conflict with each other.

This wasn’t usually a problem in the past, as each national branch of a multinational company could follow the laws of whatever country it was established in. The Internet has effectively broken this model. What happens when a person who lives in Country A sends personal information, protected by privacy laws in his country, across the Internet to a server in Country B that is owned and controlled by a company headquartered in Country C, and a law enforcement agency in Country D, which doesn’t have the same privacy protections as Country A, presents a warrant to access that data? The answer will likely require a lot of lawyers to figure out. We don’t have to guess about some of these scenarios because we are seeing them play out already.

Some countries are passing laws that apply to the data of their citizens regardless of what foreign jurisdictions that data ends up in. The strict European Union privacy laws are an example of this. Companies in the US, where privacy laws are not as strict as the EU, had to agree to what were known as the Safe Harbor rules for protecting that data before European companies were allowed to send protected data across. This model worked fine until Snowden revealed the extent of the NSA’s “collect everything” approach to surveillance at which point a European court cancelled the Safe Harbor agreement and left US companies in the lurch with no clear way to handle the legally protected data of EU citizens.

Some countries are claiming the right to access any data under the control of a domestic company regardless of what country that data happens to be in. A case to this effect has already played out, with Microsoft forced to hand over data residing on a server in Ireland. This of course directly conflicts with the strict EU privacy laws mentioned above and puts Microsoft in an impossible position: ignoring the warrant puts the company in contempt in the US, while complying and turning over the data potentially exposes Microsoft to prosecution under EU law.

This rabbit hole goes much deeper if we start to consider the possibility of encryption backdoors. Let’s say the FBI succeeds in its case against Apple and goes on to require that software companies always leave some sort of ability for law enforcement to gain access. It’s a reasonable assumption that foreign countries would want this sort of access for their own law enforcement agencies as well, which is probably fine as long as we’re talking about countries that are politically aligned with the US, at least until a hacker finds the backdoor and we’re back to the scenarios outlined in Part 2. Where this really breaks down is when we realize that companies like Apple do business in countries that aren’t exactly friendly with the US, like China and Russia, and that those countries will want this sort of access too. Given their past activities it’s not a stretch to imagine that these countries would utilize these backdoors for more than just domestic law enforcement.

This is another scenario that has already come to pass; in fact, China beat the US to the punch with a counterterrorism law passed in December requiring companies to decrypt data for law enforcement. And then it turns a bit Kafkaesque: the US criticizes China’s counterterrorism decryption law, while the FBI demands a similar law here in the US and fights this very case with Apple over the same issue, and China in turn uses the FBI’s demands and its fight against Apple to justify its own law.

If implementing a single backdoor for one law enforcement agency is bad (see Part 2) it will be much worse if a company has to implement hundreds of backdoors for all of the various countries that request them and then keep each of them, and hackers, from gaining unauthorized access to each other’s backdoors. If it gets to this point we might as well just stop using encryption entirely and leave whatever’s left of the Internet to the hackers.

The only other foreseeable outcome of this scenario (barring that ever elusive reimagining of the entire Internet’s security infrastructure) is for companies to stop doing business in potentially hostile foreign markets. Google has already backed out of the Chinese market due to the country’s attempts to censor and spy on the company and their customers, and this may be the path that many others will have to take in the coming years if those that want backdoors get their way.

Existential Threat

There have been quite a few points in the scenarios above where I have written that companies could be put in untenable positions where their failure was a possible outcome. I’ve already covered DigiNotar’s bankruptcy as a result of a hacker gaining control over their keys both above and in more detail in Part 2 but there are other examples we can look to as well.

Lavabit was an email provider whose primary selling point was improved privacy over other providers like Gmail. The company received a warrant from the FBI demanding it hand over its encryption keys, an action that would allow the FBI to read the emails of all of the service’s approximately 410,000 users, along with a gag order preventing Lavabit from discussing the request with anyone.

Lavabit objected to handing over the keys, offering instead to provide access to the single account that was of interest to the FBI, as it had done when served with warrants in the past. The FBI refused this offer, insisting on the keys to access all accounts. Lavabit decided to shut down its service rather than defeat the privacy it had promised its users. Although it had been suspected since Lavabit initially went offline, a government redaction error last week confirmed that the actual target of the warrant was Edward Snowden.

The types of warrants and orders used by the FBI in these sorts of cases have some troubling implications for companies that think the demands are overly broad or onerous: the order actually targets the individual whose data is on the company’s server, which means the company itself is only a third party to the case (despite being required to take action to provide the data) and as a result is not always allowed legal representation. This was the case with Lavabit, and the speed with which the FBI issued demands and summonses, as recounted in the company owner’s story, should serve as a warning to any small business that could find itself in the same position. Apple should consider itself lucky that it even has the opportunity, and is in a position, to fight the FBI’s request.

This isn’t to say that only small companies are at risk of getting battered into oblivion by a barrage of court paperwork. Yahoo found itself in the same position in 2007 after receiving a similar surveillance and gag order from the NSA that it interpreted as overly broad. Yahoo faced fines of $250,000 per day for failing to comply with the order, with the fines set to double after every week of non-compliance. Yahoo fought the case and lost, then complied rather than deal with fines that would have quickly crushed the company. The gag order was only lifted in 2014 after the Snowden leaks publicly revealed the existence of the secret surveillance program that Yahoo and many other tech companies were unwilling partners in.

Lavabit’s and Yahoo’s fights against court orders were, like Apple’s, based on a sense of ethics, security, and good business practice rather than the necessity of being stuck between two conflicting legal systems. We can point to another company that no longer exists in its previous form as a result of the NSA’s demands conflicting with the US’ own legal system: Qwest Communications. Qwest was the only company that refused to cooperate with the NSA’s warrantless wiretapping program begun shortly before 9/11. The refusal was based on its legal team’s assessment that the wiretapping program was illegal, which was borne out in a later court case. Still, this was too late for Qwest: in the aftermath of its refusal to cooperate with the NSA, the government allegedly revoked contracts worth hundreds of millions of dollars and the company’s stock slid from $38 to $2 per share. The company went through a merger a few years later.

Another interesting angle of the Qwest case is that, because the warrantless wiretapping program was eventually found to be unconstitutional, the actions of all of the other telecoms that cooperated with the NSA were technically illegal. Even if the government declined to prosecute these companies criminally, they were still exposed to potential civil suits from affected customers. This was addressed through the passing of a law granting these companies retroactive immunity for their otherwise illegal actions, though there was concern that this law might be repealed in the future. This shows once again the very shaky legal ground that companies may find themselves on whether they cooperate with government requests or not.

Blast from the Past

The Apple case is fraught with the same legal uncertainty. The Communications Assistance for Law Enforcement Act (CALEA) is a 1994 law that requires telecoms and equipment manufacturers (e.g. Apple) to build wiretap capabilities into their networks and products. This would seem to put Apple on the hook with the FBI, except that the law states that telecom companies are not responsible for decrypting any data they don’t have the keys to. To get around this inconvenient bit of legislation, the FBI has resorted to arguing that Apple is not covered by CALEA and instead invoking the All Writs Act, a law from 1789 that basically states that anyone must help law enforcement when a court says they should. In other words, the FBI is saying that a law written in 1789, well before even the telegraph existed, never mind the Internet and smartphones, trumps a law that was written in 1994 and updated in 2004 to cover Internet access.

A federal court in New York has already ruled on a similar case in which the FBI and DEA tried to force Apple to unlock a phone using the All Writs Act. The judge was not gentle with the government, criticizing its attempt to force Apple to bypass its own security measures and accusing it of trying to use secret court proceedings to gain powers that Congress would never allow. The judge called the potential results of using the All Writs Act in such a case “impermissibly absurd”.

So that would seem to be the end of it; why is the fight over the San Bernardino phone still going on? For starters, the government is appealing its loss in the New York case, but even if that ruling stands it only sets a precedent in one federal circuit, the one covering New York, Connecticut, and Vermont. Each of the other circuits can set its own precedent. This means that if the San Bernardino case goes the FBI’s way but its New York appeal is denied, a company like Apple might be permitted to refuse to unlock a phone in New York while being required to do so in California. If rulings in various circuits conflict, this would be the state of affairs at least until the Supreme Court steps in and sorts it all out nationwide.

This creates yet another legal minefield as companies have to determine exactly when they are and are not obligated to help the government. When state laws that require encryption of certain data get thrown into the mix it can get even more complicated: would a company be in violation of state privacy laws for voluntarily helping the Feds in a jurisdiction where they are not legally required to do so? What about when the data is covered by a foreign privacy law, such as when it covers an EU citizen? Once again it will take many lawyers to find the answers.

The government could also go Forum Shopping, filing their cases in whichever legal jurisdiction would be most likely to give a favorable ruling (already a favorite tactic for patent litigation). Large companies like Apple likely wouldn’t have a problem fighting cases around the country but a small startup may not be able to afford repeatedly parachuting legal teams into various faraway jurisdictions to fight Federal requests.

Throwing away the key

One potential way of preventing law enforcement from forcing a company to reveal decryption keys is to design one’s own systems in such a manner that it is impossible for even the developers to retrieve protected data. It wouldn’t help get Apple off the hook for the iPhone currently in the FBI’s possession, but they could design the next version of their software to require the passcode before applying an update in order to prevent the trick that the FBI is trying to get them to use.

This is also technically possible for network communications through a technique known as “end-to-end encryption”. In this model the encryption keys are stored only on the two endpoints that are communicating with each other, and data passes through any intermediary servers in encrypted form. Most systems today only encrypt data as it crosses the network, leaving it readable as it passes through or sits on a server, where it can be monitored and intercepted. End-to-end encryption is also a recommended technique for preventing the theft of credit card data in breaches similar to those at Target and Home Depot.

The WhatsApp messaging service, now owned by Facebook, utilizes this type of encryption to protect its users. When WhatsApp is installed on a phone it generates its own unique encryption key pair, just like the keys used to protect web site communications described in Part 2. When two users chat with each other, the software on each of their phones encrypts the messages using the other phone’s key before sending them through Facebook’s server infrastructure. Facebook can see the encrypted communications traffic moving through its system but has no way to actually read the contents without the keys.
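The core idea can be sketched with a classic Diffie-Hellman key agreement, where each side combines its own private key with the other’s public key to derive the same shared secret, while the relay server only ever sees public keys and ciphertext. This is a simplified stand-in, not WhatsApp’s actual protocol (which is based on the Signal protocol and elliptic curves), and the hash-based cipher below is a toy, not production-grade encryption:

```python
import hashlib
import secrets

# Public parameters everyone, including the server, can know (demo-sized).
P = 2**127 - 1   # a Mersenne prime as the group modulus
G = 5            # generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)           # (private key, public key)

def shared_key(my_priv: int, their_pub: int) -> bytes:
    # Both sides compute G^(a*b) mod P and hash it into a symmetric key.
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: expand the key into a hash-based keystream.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))   # XOR encrypts and decrypts

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each phone derives the same key from its own secret plus the other's public key.
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob

ciphertext = xor_cipher(k_alice, b"meet at noon")    # all the server ever relays
assert xor_cipher(k_bob, ciphertext) == b"meet at noon"   # Bob can read it
```

The private keys never leave the two phones, so there is nothing on the server for a warrant to retrieve: a subpoena to the operator yields only ciphertext.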

These types of “perfect” encryption systems are exactly what the FBI is complaining about when it talks about “going dark”. The only way to get the data, other than undertaking the nearly impossible task of cracking the keys, is to find a vulnerability that allows access to the endpoint phones so that the keys can be retrieved. This requires a level of effort that makes mass surveillance much more difficult. It is likely no coincidence that the app was created by a Ukrainian who grew up under the Soviet Union’s surveillance regime.

Besides the FBI, Brazil is also unhappy about this type of encryption. A court in Brazil has tried just about every trick in the book to get Facebook to turn over the contents of communications suspected to belong to a drug ring: Shutting the service down for 48 hours, fining Facebook $12,600 per day, increasing the fine to $253,000 per day, and arresting one of the company’s vice presidents (more detail on that in a bit). Assuming that the WhatsApp encryption is all it’s cracked up to be this is all futile: there is no direct way in.

The only potential solution to this “going dark” problem then is to ban apps that perform end-to-end encryption, but there is a problem there too: there’s always another app. As mentioned plenty of times already, encryption concepts are well known around the world and creating yet another end-to-end encrypted app in some other jurisdiction that allows it is trivial. When Brazil shut down WhatsApp millions of users switched to Telegram, another similar end-to-end encrypted messaging app. This is a fight the government will have a hard time winning.

It’s not just little apps that can play the jurisdiction game either: Microsoft has already decided to take this approach to avoid the problem of US courts demanding it hand over data stored within its foreign datacenters. The company is currently setting up datacenters in Germany (a country that has come out in support of strong encryption owing to its past experiences with East German domestic surveillance) that will be run by Deutsche Telekom personnel. The general idea is that if Microsoft personnel don’t have access to the datacenter then they can’t be held accountable for turning over data. It will be interesting to see how long that lasts before a judge tries to throw someone from Microsoft in jail for contempt as well simply for creating such an “unaccountable” system.

On Strike

There is another scenario where a company may be unable to comply with a court order to make code changes, although due to human resources problems rather than technical problems: Certain Apple personnel who would likely be involved in modifying Apple’s software to meet the FBI’s request have said that they are preparing to quit the company if the ruling goes against Apple.

The exit of Apple’s core iOS developers would not permanently prevent Apple from making the changes requested by the FBI, but it would lose the people most familiar with that bit of software. Bringing in other developers, assuming any willing to perform the work could be found, would take time. Training new developers to understand the existing source code would be difficult after such a brain drain, as the engineers in the best position to make the FBI’s changes would also have been in the best position to train new developers on the inner workings of the software they would be trying to alter.

The loss of a large number of highly skilled personnel would have a knock-on effect, not just affecting Apple’s ability to address the FBI’s request but also its ability to add new features to its software and fix bugs. These bugs would inevitably include security vulnerabilities that hackers can exploit. If Apple were unable to fix these in a timely manner it would once again put us all at risk of further attacks.

Depending on how the court handles Apple’s inability to address the order in a timely manner following its employees’ resignations this could put Apple or its employees in a tough position. If the court views this as Apple’s problem they could end up with crushing fines, like those we saw threatened at Yahoo, while Apple tries to bring new developers on and get them up to speed. Alternatively the court could decide to go after Apple’s former employees themselves, holding them in contempt for their resignations. This would likely open another long list of legal and constitutional issues that would take years to play out in the courts.

Another possibility is that the FBI would use Apple’s inability to meet their request in a timely basis as an excuse to demand Apple turn over its source code and signing keys. From a technical security perspective this would be even worse than Apple weakening the security on their own as previously detailed in Part 2.

Personal Problems

The consequences of cooperating, or not, with surveillance and backdoor programs can extend beyond companies and impact their owners and executives as well. This is yet another instance of the “rubber hose cryptanalysis” concept: threaten a company executive personally to get them to deliver what the government wants. We can pull examples of this from many of the companies we’ve already heard about above.

Lavabit’s owner, Ladar Levison, was found in contempt of court and fined $10,000 despite not being the actual target of the warrant. The fine was in part for delaying the release of the encryption keys by six days, the period during which he was attempting to find a lawyer and fight the order in court, thus depriving the FBI of that many days’ worth of email surveillance. The fine was also because he shut the service down before handing over the keys, and when he eventually did provide them they were on a printout (11 pages of 4-point type) rather than in a digital format. This is the encryption equivalent of paying a bill with a sack of pennies: you’ve technically paid the bill, but you’re also letting the recipient know exactly what you think of the situation. Levison is lucky though; at least he didn’t end up in prison.

Joseph Nacchio was the CEO of Qwest at the time of the NSA’s warrantless wiretap request and, according to him at least, he ended up in prison as a result. Nacchio had begun selling stock in January 2001. One month later, according to Nacchio, the NSA began asking for wiretap access, which he refused as detailed above, and in May the stock began its slide. The government accused Nacchio of selling stock while falsely claiming the company’s outlook was better than it actually was; Nacchio claims the outlook was good until the government pulled its contracts in retribution for his refusal to cooperate with secret NSA programs.

Nacchio tried to introduce the retribution issue at his trial, but a judge dismissed it as irrelevant and inadmissible. Nacchio was found guilty, sentenced to six years in prison, fined $19 million, and forced to forfeit $52 million from his stock sales. The NSA’s activities are now well known, although they were not at the time of Nacchio’s trial, and all the evidence pointing toward Nacchio being framed is only circumstantial. Unless some smoking-gun document is released, it will likely remain ambiguous whether this was a crooked CEO grasping for any excuse to stay out of jail or the actions of a vindictive government punishing a citizen for refusing to cooperate with an illegal program.

Facebook’s vice president for Latin America was also arrested in Brazil three weeks ago because the company had failed to turn over data in the drug-ring case described above. This is especially concerning because WhatsApp may not have any way to provide the data the government is demanding. He was released after only 24 hours, but this raises the specter of executives being jailed on contempt charges until their company turns over data, something it may not even be able to do depending on the encryption techniques in place. Given how little the average judge or politician knows about the inner workings of encryption systems, it may be difficult to convince them of this, just as it has proved difficult for Apple to convince them that there are valid technical reasons why the FBI’s request is risky.


This piece has painted a fairly bleak picture of the consequences of the FBI’s demands of Apple, and of the many similar demands being made around the world, and it should be plain by now that this fight isn’t going to end when the next Apple vs. FBI court hearing does, regardless of the outcome.

Many of the people discussing this fight are concerned primarily with the technical issues I described in Part 2, or with their own privacy as a backlash against the NSA’s previous activities. I don’t think any of that information is new, although I hope having it all in one place is helpful to folks who aren’t intimately familiar with this fight or the technology that underpins it.

What I don’t think many have covered before are the business consequences we’ve examined here in Part 3, and I feel these may have even more impact than any of the technical issues. Although some may say the potential doomsday scenarios I outlined are unrealistic, plenty of worrying scenarios have already played out, from Microsoft being forced by the US to take actions that potentially violate the laws of the EU, to corporate executives being jailed for failing to decrypt data that can’t be decrypted.

Competing government demands for access will put many companies in untenable positions where they are damned regardless of whether they cooperate or whom they cooperate with. This is perhaps a bigger threat than any of the technical consequences of Apple cooperating with the FBI, or even the terrorism the FBI is trying to investigate. The Internet is no longer a toy: the world economy relies on it, devices and infrastructure that can literally kill people when they fail are connected to it, and hundreds of billions of dollars are lost every year to the crime that happens on it.

At the end of the day Apple, the FBI, and the rest of us who deal with security are all trying to keep people safer; we just disagree on which threats are more important and how best to address them. Apple and most of the rest of the tech community are constantly fending off attacks by hackers and are doing what they do best: figuring out how to use more technology, like encryption, to stop them. The FBI, on the other hand, sees the same technology that is intended to protect us from one threat as something that can prevent them from monitoring a very different type of threat: terrorists. We need to reconcile these conflicting priorities and stop working against each other.

As a security professional, the one message I wish we could get through to The Powers That Be is that they are focusing so much on their ability to play offense that they are sacrificing our defense. I get it: I used to be a penetration tester, and I know the rush of victory when you find a vulnerability and use it to get access to something you’re not supposed to. The problem is that others are doing the same thing to us just as often as we’re doing it to them. It’s time for our various law enforcement and intelligence agencies to stop seeing every vulnerability as a tool for achieving their goals and to start helping the rest of us prevent the enemy from using those same vulnerabilities to achieve theirs.

The next court hearing in the San Bernardino iPhone case is tomorrow, March 22.