The 2013 disclosures from Edward Snowden and the recent Apple-FBI case have widened the debate over encryption. Once the domain of cybersecurity experts and hackers, the debate now encompasses ordinary citizens who entrust their smartphones with ever greater amounts of personal information. Device manufacturers and software developers are rapidly adopting end-to-end encryption features that prevent the interception of customer communications. This has major implications for both cybersecurity and physical security, as law enforcement agencies seek to simultaneously thwart cyberattacks and recover digital evidence stored on encrypted devices. How will policymakers balance demands for better device security with continued access for law enforcement? Brookings scholars Niam Yaraghi, Scott Andes, Walter Valdivia, and Darrell West weigh in with their own perspectives on the encryption debate.
Niam Yaraghi
is a fellow in the Brookings Institution’s Center for Technology Innovation.
A user’s phone, computer, or personal cloud storage space is often compared to a private home that should not be intruded upon. Although we have the right to lock our front doors and keep our homes private, under certain laws and conditions, authorized entities can break that lock and intrude on our privacy. The police should be able to enter a home in which a group of terrorists is building a bomb and preparing for its next attack. As long as officials have a legal justification, they should be able to break any lock, physical or digital. Encryption methods that do not allow the government to access digital content are a potential security threat in the same way that unbreakable locks would be.
While technology companies should develop more sophisticated encryption methods to prevent unauthorized access to their users’ personal data, they should also preserve the ability to grant the government access to those data under certain laws and conditions. Unbreakable locks do not ensure privacy; they create chaos.
“Unbreakable locks do not ensure privacy; they create chaos.”
Competing on security
Scott Andes
is a senior policy analyst and associate fellow at the Anne T. and Robert M. Bass Initiative on Innovation and Placemaking, a part of the Centennial Scholar Initiative at the Brookings Institution.
The FBI’s demand that Apple install a “backdoor” to enable law enforcement entry into Apple devices goes beyond concerns over national security and personal privacy. There is also significant risk that doing so would compromise the global competitiveness of one of the country’s most important sectors. Technology companies know that customer trust in the security of personal data is essential. That’s why BlackBerry threatened to shut down operations in Pakistan last year when the government demanded “backdoor” access for security reasons. While the U.S. government has a better reputation for protecting personal data than Pakistan’s, similar demands could send a ripple through the entire economy.
In many ways, the U.S. economy is at a software-driven inflection point now that nontraditional sectors like health care are truly coming online. Recent data show that highly innovative firms in virtually every sector of the economy develop their own software—a bright spot for the U.S. economy. In such an environment, it is only reasonable to consider the effect security measures may have on U.S. global competitiveness.
Layers of privacy
Walter D. Valdivia
is a fellow in the Center for Technology Innovation at Brookings.
Layered privacy could shape encryption design.
Individual privacy has been placed at the heart of the encryption debates on mobile devices. Companies like Apple, Google, and Facebook argue that their efforts to encrypt customers’ communications and content respond to customers’ demand for privacy. But what do users actually expect in terms of privacy? Observed behavior reveals that the information contained in mobile devices does not enjoy a consistent expectation of privacy. Rather, expectations move along a gradient from strictly private to public.
If expectations of privacy fall along a gradient, or in layers, then why must the protection of privacy exist as an all-or-nothing dichotomy? The technology and communications industries could, in partnership with law enforcement agencies, design layered levels of protection for the various services delivered by our mobile devices. If we have the technological capability to provide this layered encryption, we do not need to accept a single equilibrium in the trade-off between security and privacy. Insisting on a single equilibrium only forces people to err on the side they value most. Instead, we could engineer a more complex set of equilibria that respect the different expectations of privacy attached to the various things we do with our mobile devices.
A global precedent
Darrell M. West
is vice president and director of Governance Studies and founding director of the Center for Technology Innovation.
Backdoor access for U.S. law enforcement to encrypted communications on smartphones would set a risky precedent around the world. In America, police justify their requests within the framework of the Bill of Rights and an independent judiciary. They seek access in a legal structure with prescribed rules of process and evidence. However, many countries do not place the same emphasis on civil liberties, nor do they have courts that make decisions independently of political leaders. That creates a situation in which citizens cannot be confident about the standards through which law enforcement gains access to personal information or the ways authorities will use that information.
If U.S. law enforcement gains access to the encrypted personal messages of ordinary citizens, authorities elsewhere will cite America to justify their own requests. They will claim privileges based on national security or law enforcement needs to buttress those demands. The American example will make it harder for telecommunications firms to turn down requests for personal information. In authoritarian or nondemocratic regimes, backdoor access will open up greater opportunities for surveillance and censorship.
Google and Facebook are donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and not influenced by any donation.