Filling the gaps in US data privacy laws

“I Love Lucy” provides the central metaphor for a Brookings paper released today on what to do to protect privacy. It comes from the episode where Lucy goes to work wrapping candies on an assembly line. The line keeps speeding up, the candies come closer together, and, as Lucy and her sidekick Ethel fall farther behind, they scramble harder to keep up. “I think we’re fighting a losing game,” Lucy says.

This is what we face with data privacy in America today. It’s a losing game because we are in the midst of an information Big Bang far more dizzying than Lucy’s assembly line. We generate more data at a faster pace from more devices, and neither we nor our laws can keep up. If we do not change the rules of the game soon, it will turn into a losing game for our economy and society.

The paper looks at the scope of this information explosion and its impact on current privacy protections as Congress and stakeholders think seriously about privacy legislation. The Cambridge Analytica stories, the Mark Zuckerberg hearings, and the constant reports of major data breaches have heightened interest in federal action. Various groups have been convening to develop proposals. The time is ripe for these interests to converge on comprehensive federal privacy legislation.

I have a dog in this hunt: I led the Obama administration task force that developed the “Consumer Privacy Bill of Rights” issued by the White House in 2012 with support from both businesses and privacy advocates, and then drafted legislation that would enact this bill of rights. The Los Angeles Times, The Economist, and The New York Times all pointed to this bill of rights in urging Congress to act on comprehensive privacy legislation. The new paper explores how this bill of rights would change the rules of the game.

Our existing privacy laws developed as a series of responses to specific concerns, a patchwork of federal and state laws, common law jurisprudence, and public and private enforcement that has built up over more than a century. But this system cannot keep pace with the explosion of digital information, the pervasiveness of which has undermined key premises of these laws in increasingly glaring ways. The paper looks at these growing gaps:

Laws on the books

As technology and the data universe expand, more falls outside specific laws on the books. This includes most of the data we generate through such widespread uses as web searches, social media, e-commerce, and smartphone apps, and soon through more connected devices embedded in everything from clothing to cars to home appliances to street furniture. The changes come faster than legislation or regulatory rules can adapt, and they erase the sectoral boundaries that have defined our privacy laws.

Expectations of privacy

So much data in so many hands is changing the nature of protected information. The aggregation and correlation of data from various sources make it increasingly possible to link supposedly anonymous information to specific individuals and to infer characteristics and information about them. Few laws or regulations address this new reality. Nowadays, almost every aspect of our lives is in the hands of some third party somewhere. This challenges judgments about “expectations of privacy” that have been a major premise for defining the scope of privacy protection, as the Supreme Court recognized in its recent Carpenter decision. Carpenter concerned government access to data, but the concept of privacy expectations also applies to commercial data in terms and conditions of service and to the scraping of information on public websites.

Notice and consent

Our existing laws also rely heavily on notice and consent—the privacy notices and privacy policies that we encounter online or receive from credit card companies and medical providers, and the boxes we check or forms we sign. Informed consent might have been practical two decades ago when this approach became the norm, but it is a fantasy today. In a constant stream of online interactions, especially on the small screens that now account for the majority of usage, reading through even the plainest-English privacy notice is too much to ask, and staying familiar with the terms and conditions or privacy settings for all the services we use is out of the question. As devices and sensors increasingly permeate the environments we pass through, old-fashioned notice and consent become impossible.

The result is a market failure in which businesses know much more than we do about what our data consists of and what their algorithms say about us, and many people are “uncertain, resigned, and annoyed.” This is hardly a recipe for a healthy and sustainable marketplace, trusted brands, or consent of the governed.

Yet most recent proposals for privacy legislation aim at slices of the problem or double down on notice and consent by increasing transparency and consumer choice. So does the newly enacted California Consumer Privacy Act. It is time for a more comprehensive and ambitious approach. Some point to the European Union’s newly effective General Data Protection Regulation, but it is not the right model for America. We need an American answer—a common-law approach adaptable to changes in technology.

That’s where the Consumer Privacy Bill of Rights comes in. It would establish a baseline for data handling—seven recognized privacy principles legally enforceable by the Federal Trade Commission. These would move away from static privacy notices and consent forms to a more dynamic framework, less focused on collection and processing and more focused on how people and their data are protected. This principles-based approach would be interpreted and developed through codes of conduct and case-by-case FTC enforcement—iterative evolution, much the way both common law and information technology developed.

The Consumer Privacy Bill of Rights needs adapting to changes in technology and politics, but it provides a starting point for today’s policy discussion because of the wide input it received and the widely accepted principles it drew on. It got some important things right, in particular its “respect for context” principle, which frames “a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.” This breaks from the formalities of privacy notices, consent boxes, and structured data and focuses instead on respect for the individual.

My Brookings paper proposes an overarching principle of respect for individuals to guide the application of the operational principles. It is a simple rule I term the Golden Rule for Privacy: companies should put the interests of the people the data is about ahead of their own. I touch on (and expect to expand on) how this golden rule draws on numerous strands of thinking about how companies should act as stewards of data.

At bottom, baseline privacy legislation in America is necessary to ensure that individuals can trust that data about them will be used, stored, and shared in ways consistent with their interests and the circumstances in which it was collected. This should hold true regardless of how the data is collected, who receives it, or how it is used. Such trust is an essential building block of a robust digital world. Baseline principles would provide an enduring basis for such trust, enabling data-driven knowledge and innovation while laying out guardrails to protect privacy.

Read the full paper, titled “Why protecting privacy is a losing game today—and how to change the game”, here.
