Facebook’s recent settlement with the Federal Trade Commission (FTC) has reignited debate over whether the agency is up to the task of protecting privacy. Many people, including some skeptics of the FTC’s ability to rein in Silicon Valley, lauded the settlement, or at least parts of it.
Others, however, saw the five-billion-dollar fine, oversight reforms, and compliance certification measures as a drop in the bucket compared to Facebook’s profits. Two dissenting FTC commissioners and other critics pointed out that the FTC did not change Facebook’s fundamental business model nor hold Mark Zuckerberg personally liable, despite hints that the company fell out of compliance with its original 2012 FTC consent order soon after that agreement was inked. Some privacy advocates and lawmakers even argued that the limits of the settlement are evidence that the FTC, the leading privacy regulator in the US since the late 1990s, is no longer the right agency to protect our personal information from Big Tech. They support creating a new, consumer privacy-focused federal agency.
We think the FTC is still the right agency to lead the US privacy regulatory effort. In this essay, we explain the FTC’s structural and cultural strengths for this task, and then turn to reforms that could help the FTC rise to modern information privacy challenges. Fundamentally, the FTC has the structure and the legal powers necessary to enforce reasonable privacy rules. But it does need to evolve to meet the challenge of regulating modern information platforms.
The FTC wields great powers tempered with experience
The FTC has remarkable powers. At its creation a century ago, Congress gave it unprecedented investigatory and enforcement tools. These have been broadened over time as the FTC has faced new wrongs. Today, the FTC can examine business practices even where there is no investigatory predicate, and as a general-purpose consumer protection agency, it can sue almost any business.
As a result, the FTC is nimble and can adapt to new technologies without an act of Congress. Founded in the days of misleading newspaper advertising, the FTC was quick to pivot to radio, television, and internet fraud. The breadth and generality of its powers are also a source of strength. Much more than just data protection, modern consumer problems involve platforms, power, information asymmetries, and market competition. In theory, the FTC has a broad enough jurisdiction and charge to handle diverse issues often labeled as “privacy,” such as algorithmic manipulation and accountability.
In the information economy, privacy is among the most important values that law and norms should protect. Yet at the same time, privacy must also accommodate other important values, including the risk-taking inherent in economic development. In our view, privacy is a means to the ends of freedom and autonomy in our personal lives and in our polity. It is a key component of human flourishing.
The FTC has achieved much with limited resources and without consistent congressional support
Many privacy issues are thought to be new. But the FTC has decades of experience handling privacy problems, particularly in credit reporting and debt collection. The FTC’s earliest information privacy matters, beginning with a 1951 case and continuing with a series of cases in the 1970s, recognized the general consumer preference against commercialization of personal data. Using its enforcement powers, the FTC sued companies for deceptive data collection and for the sale of data collected in preparing tax returns. The agency brought its first internet-related fraud case in 1994, long before most consumers shopped online. Since then, the FTC has pursued the biggest names in internet commerce, and it has steadily broadened the duties of fair information handling, particularly in the information security domain.
The FTC’s broadest jurisdiction is its enforcement against unfair and deceptive practices under Section 5 of the FTC Act. Despite this wide reach, however, Section 5 has significant limits. The FTC generally cannot fine a company for an initial Section 5 violation; civil penalties are available only for violations of existing consent decrees, as happened in the Facebook case.
Resources are the FTC’s greatest constraint. It is a small agency charged with a broad mission in competition and consumer protection. It carries out this mission with a budget of just over $300 million and a total staff of about 1,100, of whom no more than 50 are tasked with privacy. In comparison, the U.K.’s Information Commissioner’s Office (ICO) has over 700 employees and a £38 million budget for a mission focused entirely on privacy and data protection. In addition, for much of modern history, Congress has kept the FTC on a short leash. In 1980, Congress punished the agency for perceived overreach by letting its funding lapse, briefly shutting it down twice. Congress has held reauthorization over the agency’s head and used its oversight power to scrutinize what members of Congress perceive as expansive use of FTC legal authority, including its interpretation of privacy harm.
Given these constraints, FTC attorneys make pragmatic choices in their case selection. At any given time, line attorneys are investigating many companies and weighing decisions on where to target limited enforcement resources. The FTC can only bring actions against a small fraction of infringers, and it has chosen cases wisely to make loud statements to industry about how to protect privacy.
Even with these severe constraints, the FTC has managed to bolster important norms and send strong signals that have influenced the practices of many companies. It has become a significant enforcement agency that industry pays attention to, with a record that compares well to those of peer agencies in the US and around the world.
Some critics of the Facebook settlement have focused only on its shortcomings. Despite flaws and limits in the consent order, the five-billion-dollar fine was by far the biggest privacy settlement worldwide. It is an order of magnitude greater than the highest fine proposed under the EU’s General Data Protection Regulation so far (the UK ICO’s £183 million fine against British Airways) and comparable to the record fine under EU competition law, which privacy advocates have urged as the reference for privacy fines.
The settlement also contains significant measures, such as forcing Facebook to make privacy a board-level concern and requiring Mark Zuckerberg to personally certify compliance. As dissenting Commissioners Chopra and Slaughter note, the FTC’s settlement doesn’t solve every problem; Facebook’s structure and business model remain the same. But no existing enforcement agency has come close to matching the FTC’s impact in this case, and the foreign data protection agencies held up as models for a U.S. alternative to the FTC have not demonstrated the power or political capital to do so. As privacy enforcers go, the FTC stacks up well against its peers in many regards.
The FTC resists capture and partisanship
The FTC is resistant, though not immune, to capture by the industries it regulates. One reason the FTC resists capture is because it regulates no single, coherent industry. Even the “technology industry” is a heterogeneous group, with competing and conflicting fundamental interests. While some tech companies have accompanied their business models with commitments to privacy as a human right, this is not typical in Silicon Valley.
The FTC’s most recent capture episode related to the credit bubble. The agency drank deeply from the credit well, with one official even coining the phrase “miracle of instant credit” to describe what he believed to be a sound credit-granting system that could take proper risks even with subprime borrowers. At the time, the FTC—and virtually all other regulators—embraced industry preferences for data selling and joined the industry in opposing real reforms to protect consumers.
The response to the credit meltdown is instructive in assessing the strengths and weaknesses of the FTC in consumer protection. The disaster shook congressional confidence in the FTC, convincing Congress that financial services needed its own consumer protection regulator, the Consumer Financial Protection Bureau (CFPB). Yet with a new presidential appointee at its head, the CFPB has shifted from serious enforcer to a vessel for mere “consumer education.” Unlike the CFPB, the FTC has thus far weathered the Trump Administration; its appointed leadership is well-qualified and fundamentally on board with a consumer protection mission.
Agencies can become captured not only by industries, but also by ideologies. As important as privacy is, the risk of a dedicated privacy agency is that it could become too pro-privacy, too narrowly focused on the largely procedural protections of fair information practices. The FTC has generally avoided veering too far in either direction. Tasked with the more holistic goal of protecting consumers, it does not treat privacy as an end in itself, but aims to promote privacy within an environment of thriving markets and technological development.
The FTC’s structure, with no more than three out of five commissioners allowed to be from the same political party, has helped it remain bipartisan. And, in practice, the FTC has generally operated with a broad consensus in privacy cases. Most privacy enforcement actions have received 5-0 votes by the commissioners.
Reinforcing the FTC’s existing powers
Despite the FTC’s balanced and consistent enforcement, it is not realizing the full potential of its authority. The agency should take more ground-breaking and norm-nudging cases. Most of its modern cases are slam dunks because the agency is risk-averse and fears blowback from Congress. Indeed, in a different era when Congress backed the FTC, the FTC took on cases that were more normatively consequential. But in recent decades, it has proceeded more cautiously and incrementally, negotiating consent orders and only rarely litigating.
Aside from doing more to advance norms, the FTC has powers that could create more deterrence, if used. The DC Circuit recently affirmed a broad power to impose personal liability on people who directly participate in or control deceptive practices. This would seem to be an excellent remedy for platform companies like Facebook and Google. These companies continue to be founder-controlled in a real sense, and the founders have demonstrated little or inconsistent respect for users’ privacy interests. In its investigations, the FTC has uncovered numerous emails by executives in which they discuss information predations. Holding these executives more responsible could have a dramatic deterrent effect.
The FTC also could achieve greater deterrence by leveraging an obscure power known as “non-respondent liability.” Where the FTC has fully adjudicated a matter concerning some business practice, the agency can use that precedent to seek civil penalties from others engaging in the same activity. The power is limited to instances where the new defendant has actual knowledge of a closely matching precedent, but this can be established by sending that company notice of its wrongdoing and the relevant previous order. If we think about recent privacy wrongs—poor data security, selling data despite promising not to, and so on—many are widespread, recurring practices. If the FTC were willing to adjudicate just one case involving information “sale,” changing users’ settings, or even storing passwords in plain text, hundreds of companies could inherit exposure to civil penalty liability through this mechanism.
The FTC’s existing powers would be strengthened by broadening its economic analysis. Some within the FTC see privacy as an economic interest, but the agency’s application of economic principles has been overly doctrinaire. The FTC takes businesses’ claims of utility gained from personal information at face value—just look at how the agency kowtowed to subprime lenders. At the same time, the FTC has been skeptical of the economic costs that the information trade imposes on consumers, including the transaction costs that businesses can shape and opportunistically impose. The agency is out of step with the best behavioral evidence on how consumers (mis)conceive of the information economy. With a broader economic conception of consumer behavior and privacy wrongs, the FTC could use its power to police many norm-violating practices.
The FTC has not fully appreciated the challenge of the information marketplace and platform power, resulting in under-conceptualized cases and missed opportunities. The modern consumer challenge is not information scarcity and a discrete choice between buying an Abdominizer or Ab Roller. The modern information dynamic is of information glut, and many transactions are continuous, where companies attempt to capture consumers in a platform. Platforms have unfathomable means and poorly-understood ends, can change terms on consumers, and will keep user data forever if they can.
Platform power is thus bigger than our individual decisions. Platform powers shape our decisions and skew what we think is even possible. That is the modern challenge that the FTC needs to tackle. It is bigger than privacy, and an agency focused only on data protection could not tackle it.
Reforming the FTC
With greater resources, the FTC could handle many more cases. How many depends on the kinds of companies and the business areas. A horseshoe effect plagues FTC privacy enforcement: Some small companies may think themselves immune because they believe they are too inconsequential for FTC attention, while some of the largest companies have proven themselves willing to do almost anything to gain platform status.
Clearly, the number of cases the agency brings now is not enough. On average, the FTC announces about 15-20 Section 5 enforcement settlements per year. It could start by bringing on the order of 100 cases per year, and then study the deterrent effect among small and large companies. But it needs far more resources to scale up like this. Regardless of whether Congress adopts comprehensive privacy legislation that expands FTC enforcement authority, it should significantly expand the agency’s appropriations to enforce existing law.
Additionally, as threats from platforms evolve and become clear, the FTC might need to go beyond pushing back against deception and unfair actions that cause harm, and also target manipulative and abusive practices. Platforms and apps now regularly deploy manipulative interfaces, sometimes called “dark patterns,” to wheedle, pressure, and convince people to act against their own interests for the benefit of the company. These dark patterns are often neither outright deceptive nor the cause of the kind of significant harm contemplated by unfairness doctrine. Rather, they leverage people’s own limitations against them. Congress could embolden the FTC to fight dark patterns by amending Section 5 to prohibit “abusive” trade practices in addition to deceptive and unfair ones, mirroring the powers of the CFPB.
What really upset the two dissenting commissioners and many critics is that the FTC didn’t change Facebook’s business model; it just created a better paper trail for when Facebook surveils its users. But if the FTC is going to get serious about privacy, Congress is going to have to get serious about limiting platform power, among other issues. The FTC can’t boldly do all that must be done without Congress also taking action.
We think the FTC has done well given its limits on civil penalties and rulemaking. The FTC’s performance has to be evaluated in the context of its hostile environment. It is constantly outgunned by powerful business groups. The FTC has far fewer resources than most of the enterprises it examines, as well as its peer agencies elsewhere in the world. Given its power, its position, the law, and all the pressures on it, the FTC has navigated these waters well. It has been generally bipartisan and avoided much of the politicization seen at the Federal Communications Commission and CFPB.
But if the FTC is to be a successful regulator of tech platforms, it needs more resources, more tools, a greater shield from political pressure, and a clear Congressional mandate. Only then can it develop and give effect to a broader vision of privacy, power, and human flourishing for a safe and sustainable information society.