A sectoral regulator for digital industries is an idea whose time seems to be coming. In 2019, Harold Feld, Senior Vice President of Public Knowledge, called on legislators to pass a Digital Platform Act to deal with digital competition and content moderation issues in the online world. In 2020, a report from the Shorenstein Center at the Harvard Kennedy School called for a Digital Platform Agency to promote competition and enforce a digital duty of care. The Shorenstein report was authored by an all-star lineup of policy heavy hitters: Tom Wheeler, Brookings expert and former Chairman of the Federal Communications Commission (FCC) under President Obama; Phil Verveer, whose long career in government service includes positions as Deputy Assistant Secretary of State for International Communications and Information Policy and Senior Counselor to the Chairman of the FCC; and Gene Kimmelman, one of the nation’s leading consumer protection advocates, who recently stepped down as Deputy Associate Attorney General at the Department of Justice. Wheeler expands upon the idea of a digital platform agency in his book, forthcoming from Brookings Press in October 2023, entitled “Techlash: Who Makes the Rules in the Digital Gilded Age?”

In 2021, Senator Michael Bennet (D-CO) introduced legislation to establish a federal commission to oversee digital platforms, and then-Representative Peter Welch (D-VT) introduced a companion House bill. This year, the two joined forces to reintroduce a revised version of this legislation to establish a Digital Platform Commission. Senator Elizabeth Warren (D-MA) and Senator Lindsey Graham (R-SC) teamed up on July 27, 2023 to introduce the Digital Consumer Protection Commission Act and announced this initiative with great fanfare in a New York Times op-ed. The bipartisan measure would establish a new Digital Consumer Protection Commission with authority to regulate digital platforms with respect to competition, transparency, privacy, and national security. It would also require a dominant platform to obtain a license to operate its business and would authorize the commission to revoke the license for repeated, egregious, and illegal misconduct that has caused significant consumer harm.

My own contribution to these discussions is contained in my forthcoming book from Brookings Press in November 2023, entitled “Regulating Digital Industries: How Public Oversight Can Encourage Competition, Protect Privacy, and Ensure Free Speech.” I argue that generalist regulators and courts are ill-equipped to deal with the policy challenges in competition, privacy, and content moderation, as these challenges arise in unique form within digital industries. Just as the FCC regulates broadcasting, cable operators, and telephone companies, the designated digital regulator would supervise social media companies, search engines, electronic commerce marketplaces, the mobile app infrastructure, and the blizzard of companies involved in ad tech. The agency’s subject matter jurisdiction would include the promotion of digital competition, the protection of online privacy, and measures to counteract disorder and abuse in the online information environment.

By happy coincidence, Brookings’s Center for Technology Innovation recently hosted an informal discussion of the regulatory tools needed to promote digital competition. Advocates and scholars bumped up against former and current government officials to learn a few lessons from previous efforts to promote competition, not through one-time antitrust actions, but through ex ante regulatory requirements on dominant companies.

One of the questions discussed at this informal session was why we need a new digital agency when we already have the Federal Trade Commission (FTC), with a consumer protection mandate and full authority, together with the Antitrust Division at the Department of Justice, to promote competition. This note addresses that question in two parts: 1) why we need an implementing and enforcing institution for new digital duties, and 2) why it would be ideal for that institution to be a new digital regulator.
The need for a digital regulator emerges from my careful reflection on what would be required to promote competition, protect privacy, and advance free speech in the online world. Legislators can formulate measures to advance these digital public policy objectives in general terms, but to apply them to industry participants in the digital industries sector takes policy and industry expertise that can only be found in a specialist regulatory agency.
Think about data portability as an example. Data portability was enshrined in Article 20 of the European Union’s (EU) 2018 General Data Protection Regulation. The idea was to allow users to have their records transferred from one company to another to vindicate their fundamental right to control information about themselves. But Europe’s national data protection regulators, who were charged with applying the Europe-wide law, quickly faced implementation problems that had not been resolved by the legislative text. What information could a user transfer? What about the privacy rights of other users involved in the transfer of multiparty records such as email communications and contact lists?
EU privacy regulators determined that the information a user provided to a company and the records of their interaction with the company and other users were subject to the portability requirement, but information inferred by the company from user behavior was not. These regulators also ruled that records of the interactions could be transferred without the consent of other affected users but, to protect these third-party privacy rights, the company receiving the records would not be allowed to use the third-party information for their own purposes such as to build or enrich a profile of these individuals or to send marketing messages to them.
Herein lies one compelling reason for a digital regulator: simply knowing what the legislative requirements mean in practice for the affected industry requires an implementing agency to specify the details, as European privacy regulators did for data portability.
The same implementation difficulties emerge when policymakers require the pro-competitive measure of digital interoperability. Interoperability provides competitors with the ability to use the facilities of a dominant company to provide service to their own customers. In telecommunications, it means that one telephone company will pick up and deliver the messages of a competing telco. In the context of social media, it means that a rival to Meta can provide its users with the ability to interact with Facebook users, even if their users are not themselves subscribers to the Facebook service. In electronic commerce, it means that merchants can gain access to Amazon’s customers without being part of the Amazon platform simply by joining a rival electronic marketplace.
But, of course, it is not enough for legislatures to demand interoperability. Someone must explain to industry what the demands of interoperability mean for them. This is especially true when, as in Article 7 of the Digital Markets Act, the enabling legislation requires only interoperability of the “basic functionalities” of a platform and not the added functionalities that might serve as competitive differentiators. Does Meta, for instance, have to provide rival social media companies with access to its user-defined groups? Without a regulatory agency determining which functions must be made available to competitors, there is no answer to the most basic question: interoperability of what?
In telecommunications, the FCC defined interconnection responsibilities. Effective interoperability in tech means that the digital equivalent of the FCC must be authorized to unpack, interpret, and enforce the interoperability requirements legislatures think are needed to promote competition.
But there’s another reason for a day-to-day tech industry regulator besides the simple need to implement general requirements for specific digital industries. To see it, let’s turn to Article 6(9) of the new European Digital Markets Act. This measure restates the right to data portability in competition policy terms. It requires digital gatekeepers to provide their users with portability for data provided or generated by their use of the gatekeeper’s platform. The idea is not to protect user privacy but to spur competition to the dominant company by giving rivals access to the data they need to compete.
This pro-competitive aspect of data portability goes back to its roots in the related concept of telephone number portability. Number portability, as implemented in U.S. telecommunications policy in the 1980s and 1990s, allowed customers to keep their telephone numbers when they switched carriers. The aim was to give fledgling local and long-distance competitors a genuine shot at breaking the hold that dominant telecommunications companies had over their long-time customers.
Data portability is a good example of the policy synergies that are commonplace in digital regulation—the same measure promotes both privacy and competition. Regulatory impacts are not single-threaded. Overlapping effects require a single regulator with broad authority to cover both policy fields and able to implement policy requirements in a way that maximizes these synergies.
But the dual nature of data portability also exposes the latent conflict between the policy goals of competition and privacy. The extent of the data portability right in the Digital Markets Act is not limited by concern for the privacy rights of all parties; it extends rather to what rivals to the gatekeepers might need to compete effectively with the incumbent dominant provider. In implementing the GDPR’s version of data portability, European data protection authorities barred receiving companies from using third-party data for their own purposes, including competitive ones. But if the rationale of data portability for competitive purposes is to empower rivals, why shouldn’t the rivals who receive data about third parties be able to use this data to encourage these potential customers to move to the new platform? Shouldn’t the need to spur competition take precedence here? The harm to privacy, one might argue, is surely minor compared to the potential gains to competition.
The key lesson from this example is that someone must make a balancing judgment in cases involving the competing demands of privacy and competition. There is no use pretending that we can always have it all: perfect privacy and perfect promotion of competition. Someone must address the question of how much privacy protection is needed in implementing data portability, how likely it is that a strong data portability measure will spur competition, and how much competition it would likely create.
So, here we see a second reason for a digital agency. It is needed as the institutional home for these careful balancing judgments when the rights granted by enabling privacy legislation conflict with the demands of separate legislation promoting competition. As a result, the ideal would be to have a common implementation and enforcement agency with expertise and authority in both privacy and competition policy. Housing these separate responsibilities in separate agencies is a recipe for policy incoherence.
Data portability is not an isolated example of these policy tensions. Conflicts among policy objectives, as well as synergies, are ubiquitous in the digital world.
To take another example, the demands of interoperability for pro-competitive purposes can also conflict with privacy, security, and good content moderation practices. Interoperability might mean that dominant social media companies are given no choice about which competing firms to interconnect with—interoperability might be thought of as a neutral, non-discriminatory, universal requirement. Of course, dominant companies don’t want to interoperate, so they will be tempted to object that interoperating with certain competitors would menace the safety of their users or the integrity of their network. The monopoly telephone company made such complaints about interconnection with non-network devices and other networks when the FCC attempted to promote telecommunications competition. Those complaints were often pretexts for avoiding competition.
But such concerns are not always pretextual. Security is a real issue in digital networks. No one likes spam or unauthorized data access. Some users value encryption highly. Others want protection from racist material, online harassment, and hate speech. Everyone wants their kids protected online.
Not every company in the social media business is good at doing these things, and some are genuinely bad actors who skirt the edge of illegality. Does interoperability really mean that dominant companies have a duty to deal with all of them regardless of the quality and kind of service they provide and regardless of the risks they create for users? Must they provide access to purveyors of misinformation who lob misleading and deceptive material at their users and then retreat to other platforms where they plan their campaigns outside the purview of the dominant companies’ content moderation algorithms?
Legislators can require in statute that interoperability must be put in place in a way that balances these competing interests. Last year’s U.S. bill to ban anti-competitive discrimination in tech, which almost made it into law, did just that, with legislative mandates to balance the promotion of competition with the demands of privacy and security. But these mandates for balancing are not self-enforcing. They require an institution where the needed balancing judgments can be made, an agency that can determine when a security or privacy claim is just a pretext to avoid an interoperability obligation and how interoperability can be implemented in a way that does not create undue privacy, security, and content moderation risks. That institution must have competence in privacy and content moderation policy as well as competition policy.
It is possible to let the courts sort out all these issues. But these kinds of implementation and balancing judgments require competence in competition policy, privacy, the details of content moderation systems, and industry expertise that are far beyond the capacity of generalist judges. The courts might be a place where these implementation and balancing judgments can be reviewed for reasonableness and consistency with the authorizing statutes. But an institution midway between the legislature and the courts is needed to do the hard job of interpreting, implementing, and enforcing legislative policy requirements and carrying out the legislative mandate for balancing competing objectives.
In its efforts over the last few years to pass competition rules for tech, privacy mandates, and social media content moderation rules, Congress has focused on the FTC as the implementing and enforcing agency. This makes a lot of sense. Over the last twenty years, the FTC has become the nation’s leading agency to promote privacy and shares top billing with the Department of Justice’s Antitrust Division in promoting digital competition. It has amassed considerable expertise not only in these policy areas, but in the business realities of the core digital industries, including search and social media. Among existing agencies, it is the natural home for the role of the nation’s digital regulator.
Lodging these digital responsibilities at the FTC and providing directions to the agency to balance conflicts that appear among privacy, competition and good content moderation would be an enormous step in the right direction. However, the FTC is not the ideal place for the job. The FTC is essentially an economy-wide law enforcement agency, not a sectoral regulator. It is not, like the FCC and the banking regulators, focused on protecting the public from abuse by companies providing goods or services in a specific area of economic activity. The FTC must protect competition and consumers in nearly every business sector in the economy. If it is also given the specific task of being the nation’s digital regulator it will not be able to do both jobs well.
Digital industries need an agency devoted exclusively to the core of social media, search, electronic commerce, mobile app infrastructure and ad tech, with expertise in the three policy areas of most urgent need in these lines of business—promoting competition, protecting privacy, and maintaining effective social media content moderation.
Ideally, legislators should fully fund a new agency dedicated to the digital world, drawing resources from existing agencies but creating a new structure and new statutory mandates. In my book, I outline how that might be done, contributing to the discussion started by Harold Feld and by Tom Wheeler and his colleagues, and complementing the proposals from Senators Bennet, Welch, Warren, and Graham.
Despite the new interest in a newly formed digital platform commission, and a strong case that it is the right administrative structure for supervising the tech industry, it might be that Congress will shrink from the arduous and expensive task of institution building. It might find it far easier and quicker to load the new digital responsibilities onto an existing agency, as it has attempted to do in its recent reform efforts.
And, historically, this is how many of today’s regulatory agencies came into existence. The Department of Commerce had regulatory authority over interstate communications for over twenty years before it was transferred to the FCC in 1934. The Securities and Exchange Commission started off as a bureau of the FTC. Over a period of fifty years, Congress loaded new responsibilities for highways and inland waterways onto the Interstate Commerce Commission in addition to its original role of regulating railroads. Congress has often in the past tried out an existing regulator for the role of regulating a new technology or business sector.
Even if the ideal is a new agency to carry out the new digital responsibilities, it might make some practical and political sense to move to a digital regulator in a two-step process. The first step would be to pass the new standards in competition, privacy and content moderation and assign implementation, rulemaking, and enforcement to the FTC, as Congress has done in its proposed reform efforts so far. Then, over time, as the limitations of housing a sectoral regulator inside a generalist law enforcement agency become apparent, Congress can revisit the issue and spin off these responsibilities to a new agency. The entire process might take years or even decades.
One of the insights from the Brookings discussion last week was the need to take the long view in this effort of digital reform. If a digital agency turns out to be out of reach for practical and political reasons, it is still vital to press ahead now to legislate new competition, privacy, and content moderation standards and lodge implementation and enforcement with a competent existing regulator. The creation of an ideal digital agency might be taken up at a later stage.
Acknowledgements and disclosures
Meta and Amazon are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and are not influenced by any donation.