Utilities for democracy: Why and how the algorithmic infrastructure of Facebook and Google must be regulated

Facebook CEO Mark Zuckerberg testifies before the House Judiciary Subcommittee on Antitrust, Commercial and Administrative Law during a hearing on "Online Platforms and Market Power," in the Rayburn House Office Building on Capitol Hill, in Washington, U.S., July 29, 2020. Mandel Ngan/Pool via REUTERS

Executive Summary

In the four years since the last U.S. presidential election, pressure has continued to build on Silicon Valley’s biggest internet firms: the Cambridge Analytica revelations; a series of security and privacy missteps; a constant drip of stories about discriminatory algorithms; employee pressure, walkouts, and resignations; and legislative debates about privacy, content moderation, and competition policy. The nation — indeed, the world — is waking up to the manifold threats internet platforms pose to the public sphere and to democracy.

This paper provides a framework for understanding why internet platforms matter for democracy and how they should be regulated. We describe the two most powerful internet platforms, Facebook and Google, as new public utilities — utilities for democracy. Facebook and Google use algorithms to rank and order vast quantities of content and information, shaping how we consume news and access information, communicate with and feel about one another, debate fundamental questions of the common good, and make collective decisions. Facebook and Google are private companies whose algorithms have become part of the infrastructure of our public sphere.

We argue that Facebook and Google should be regulated as public utilities. Private actors that shape the fundamental terms of citizens’ common life should be held accountable to the public good. Online as well as offline, the infrastructure of the public sphere is a critical tool for communication and organization, political expression, and collective decisionmaking. By controlling how this infrastructure is designed and operated, Facebook and Google shape the content and character of our digital public sphere, concentrating not just economic power, but social and political power too. Leading American politicians from both sides of the aisle have begun to recognize this, whether Senator Elizabeth Warren or Representative David Cicilline, Senator Lindsey Graham or President Donald Trump.

Regulating Facebook and Google as public utilities would be a decisive assertion of public power that would strengthen and energize democracy. The public utility concept offers a dynamic and flexible set of regulatory tools to impose public oversight where corporations are affected with a public interest. We show how regulating Facebook and Google as public utilities would offer opportunities for regulatory innovation: experimenting with new mechanisms of decisionmaking that draw on the collective judgment of citizens, reforming sclerotic institutions of representation, and constructing new regulatory authorities to inform the governance of algorithms. Platform regulation is an opportunity to forge democratic unity by experimenting with different ways of asserting public power.

Facebook founder and CEO Mark Zuckerberg famously quipped that “in a lot of ways Facebook is more like a government than a traditional company.” It is time we took this idea seriously. Internet platforms have understood for some time that their algorithmic infrastructure concentrates not only economic power, but social and political power too. The aim of regulating internet platforms as public utilities is to strengthen and energize democracy by reviving one of the most potent ideas of the United States’ founding: democracy requires diverse citizens to act with unity, and that, in turn, requires institutions that assert public control over private power. It is time we applied that idea to the governance of Facebook and Google.

As a visiting researcher in Facebook’s Responsible AI team, Josh Simons is a paid consultant to Facebook. The views reflected here are the authors’ alone, and Facebook had no role in the production or review of this report. Dipayan Ghosh worked as a privacy and public policy advisor at Facebook from 2015 to 2017.

Google, Facebook, and Twitter provide general, unrestricted support to Brookings. The findings, interpretations, and conclusions in this report are not influenced by any donation. Brookings recognizes that the value it provides is in its absolute commitment to quality, independence, and impact. Activities supported by its donors reflect this commitment.
