On February 23, Brookings scholars Susan Hennessey and Benjamin Wittes answered questions about the Apple iPhone encryption case in an Ask Me Anything (AMA) session on the forum website Reddit. Earlier this month, Apple refused to comply with a court order filed by the FBI demanding technical assistance with bypassing the security measures on an iPhone belonging to one of the San Bernardino shooters. Hennessey and Wittes had previously written about the case on the Lawfare blog. The following is an excerpt of the AMA: following Reddit’s format, responses are indented and the original text remains unedited.
We’re scholars at the Brookings Institution and the editors of the national security blog Lawfare. We know a few things about the laws most relevant to the current debate surrounding encryption and “Going Dark.” Last week, we wrote a piece about our assessment of Apple’s opposition to a California court order. It made many of you feel feelings and evoked some strong opinions (some people on the Internet even called us mean names!). But part of our job is to put forward ideas so they can be vigorously debated and vigorously challenged, and we all get smarter in the process. So here we are. Ask us any and all questions about how law relates to encryption policy and Going Dark, or about our article.
Do you think Apple would have been well-served to voluntarily assist the government in allowing this particular device to be decrypted, rather than litigating and potentially establishing a bad precedent? What would the significance of voluntary cooperation be versus a court order?
Voluntary compliance here would be voluntarily complying with a search warrant issued by a court. I think Apple risks establishing bad precedent, but I also think it views the matter as win-win. If they lose in court, they can comply with the order while preserving their public reputation (because they fought it), and if they win, they are champions of civil liberties. But as a general matter, voluntarily complying with an order in one circumstance does not forfeit the right to challenge such an order at a later date (precisely what happened in EDNY).
I don’t begrudge Apple’s fighting this at all. Eventually, someone is going to have to clarify what the company’s obligations are. And the way we do that in our system is to have litigation. The litigation requires that someone oppose the government’s request. So it strikes me as very healthy that we’re having this discussion both in court and in Congress.
You’ve taken a pro-FBI stance. I can actually understand this stance in the context of some misunderstandings about the technology involved.
It’s no surprise to me that the tech companies who are intimately familiar with the technology are largely in support of Apple (executives from Facebook, Google, Yahoo, Twitter, etc.).
Hell, even ex-NSA chief Hayden sides with Apple.
I think the misunderstanding lies in the fact that Apple isn’t withholding any information that they currently possess (at least in the San Bernardino case). In order to comply with the order, Apple would need to create software that would allow anyone with sufficient computing resources to brute force the password. History teaches us that there’s a good chance that this software will get leaked (edit: or stolen by attackers), or even used in secret by the government.
You argue that Apple put themselves in this position by making their encryption more secure. Well, yes. But my question is, what good is insecure encryption? If a company holds my password or encryption key, I am much more vulnerable. Major companies are routinely compromised, either through sophisticated attacks, or plain old social engineering.
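To make the brute-force concern above concrete, here is a back-of-the-envelope sketch. It is an editorial illustration, not a description of Apple's actual implementation: it assumes, hypothetically, that the software-imposed escalating delays and auto-erase feature are disabled (as the order contemplated) and that only a roughly 80-millisecond hardware key-derivation cost per guess remains.

```python
# Rough worst-case brute-force time for a numeric passcode, assuming
# (hypothetically) that software delays and auto-erase are disabled and
# each guess costs only the hardware key-derivation time.

ATTEMPT_COST_S = 0.08  # assumed ~80 ms per attempt; illustrative figure

def worst_case_seconds(digits: int) -> float:
    """Seconds to exhaust every passcode of the given length."""
    return (10 ** digits) * ATTEMPT_COST_S

for d in (4, 6, 10):
    hours = worst_case_seconds(d) / 3600
    print(f"{d}-digit passcode: {hours:,.2f} hours worst case")
```

Under these assumptions a 4-digit passcode falls in minutes, which is why the software-enforced delay and wipe features, not the passcode itself, do most of the security work.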
Companies have to engineer new things in order to comply with reporting requirements and surveillance orders all the time. CALEA required the phone companies to extensively engineer their systems to facilitate warrants. The FISA Amendments Act required the engineering of lots of systems to comply with 702. And banks have obligations to report a wide range of “suspicious activity” to the Treasury Department—obligations that required extensive development of analytic capabilities. There is simply nothing new about requiring Apple to engineer something novel in the interests of compliance or facilitating lawful surveillance orders. The question is whether there’s anything good about imperfectly secure encryption. I think there is.
CALEA and FAA are far more specific about what sort of assistance must be rendered by a company than the catchall of AWA. What’s the limiting principle if asking a company to write new software is a reasonable request?
Yes, but whose fault is THAT? The FBI would LOVE to have a legislative framework here. Apple opposes one. One of the benefits of a compromise might be narrowing the bounds of the technical assistance that could be demanded.
As always, it’s Congress’s fault. But the reason that there is no legislative framework is that there’s no consensus, or even a legislative proposal, about what one would look like. Congress deciding not to act in this area is itself a legitimate policy option, and I don’t think it’s reasonable to read into that an endorsement of expanding the AWA into parts heretofore unknown.
Right, but in this case, there is a legislative framework. The All Writs Act exists to govern in the absence of another legislative framework. That’s what we mean when we say it’s a gap-filling statute.
Congress has chosen not to legislate mandatory back doors to bypass encryption. Why is the use of the all writs act to bypass congress not an abuse of law?
All Writs does not apply where Congress has affirmatively passed a law. That law can say either what a company must do or what a company cannot be required to do, but congressional silence leaves the “gap” which All Writs fills.
I know this much – what I’m really asking is why using an 18th century law that clearly was not designed to help law enforcement with fishing expeditions, will not be considered overreach by the courts? Or put another way, if you were arguing this before the supreme court, how would you answer that?
First off, the fact that the law is old is not important. The Constitution is older and we seem to still apply that. The question is whether the AWA applies to a certain situation or not, not whether it’s old. And if it’s too old, Congress should change it or repeal it. If I were before the Supreme Court, I would deny that this is a fishing expedition, noting that it would be shocking for the FBI NOT to pursue the phone of one of the shooters. And I would note the dozens of cases in which the All Writs Act has been applied to situations not wholly different from this one. Congress is free to change it any time. I would support doing so and clarifying everyone’s obligations here, but as long as it’s on the books and investigative exigencies arise, it’s hardly a surprise that the government invokes it.
Policy and forensic experts have warned that if Apple designs the new OS that the FBI is asking for and digitally signs that OS so it can be used to break through phones (in combination with brute force attacks) that it is likely to get into the wild and be misused by others: http://www.zdziarski.com/blog/?p=5706 https://www.justsecurity.org/29453/apple-vs-fbi-just-once/
Do you disagree with their assessment of the risks, or think that the government interest in accessing this and other devices is worth the cybersecurity harm that could result?
There is no doubt that the optimal cybersecurity arrangement is to have no cybersecurity vulnerabilities. The question is whether that’s the optimal security arrangement in a broader sense.
Presently, we have so much more to lose to cyberattacks than any other nation. Seems like it’d be huge to U.S. economic security if there were no cybersecurity vulnerabilities.
So I concede these risks exist in theory, and shouldn’t be dismissed outright. But it’s a question of balancing (what’s the utility of having Apple create the software vs. the probability and consequence of it being stolen and misused). I think Apple’s track record of successfully keeping safe the prior software it designed to access data speaks to the probability of risk. I also think that we should wait for Apple’s assertions on the record in federal court regarding the ability to use this specific software on a different phone; I think they may be more careful in their representations to a judge.
1) Do you know what the RSA algorithm is and/or how it works?
2) How deep of a knowledge do the writers of the proposed legislation have in regard to encryption, cybersecurity, and mathematics?
3) How great a knowledge do you feel the writers of a law, writ, or other such proposed items should have prior to legislating a technical topic?
4) I find the phrase ‘going dark’ to be describing the government’s inability to cope with having limited access to the public’s personal information, especially in light of Edward Snowden’s impact on the technological community. Do you agree or disagree with this sentiment and why?
1) Yes, and not beyond a general layman’s sense of public key cryptography. 2) At this stage, there is no proposed legislation. Those who will write any legislation are lawyers, not techies, but they have access to very deep technical resources in the federal government. 3) People who write legislation need technical insight. Full stop. Whether that technical insight is their own or that of people they collaborate with closely is less important. 4) I disagree. From a law enforcement or intelligence standpoint, having large platforms on which major criminal activity and important foreign policy matters take place, and in which visibility is gravely impaired, is terrifying. And it can wound or disable core functions we expect of government.
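For readers unfamiliar with the RSA algorithm raised in the question above, here is a textbook-sized sketch using the tiny primes common in tutorials. This is purely illustrative; real deployments use 2048-bit (or larger) moduli, padding schemes, and hardened implementations.

```python
# Textbook RSA with tiny primes (illustrative only; never use in practice).
p, q = 61, 53
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, chosen coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e (2753)

def encrypt(m: int) -> int:
    """Anyone with the public key (n, e) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private exponent d can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
```

The security rests on the difficulty of recovering `d` without knowing the factorization of `n`; with primes this small, factoring is trivial, which is why real keys are enormously larger.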
We have operated for a great deal of time with limited visibility; it is only because of technology companies such as Apple that the FBI and other LEAs erected panopticons that redefined how we think of “visibility.” So as the pendulum swung too far in one direction, Apple saw the need, morally and financially, to swing it back in the other direction.
It’s a mistake to think that any core government functionality is reliant on access to the innards of these devices. They are but eight years old.