Two ways to improve Senator Klobuchar’s needed antitrust legislation

The Senate Judiciary Committee took a major step forward on January 20th by reporting out Senator Amy Klobuchar’s platform nondiscrimination bill on a bipartisan vote of 16-6. It is now up to Senate Majority Leader Chuck Schumer to decide whether to bring the bill to a vote on the Senate floor.

The bill and its markup provide an almost-perfect illustration of the overlap of competition, privacy, and content moderation issues raised by attempts to regulate digital companies. Legislators working in any one of these policy areas must assess how their efforts affect the others. It is a theme I explore at length in a book forthcoming from Brookings Institution Press entitled The Regulation of Digital Industries: Competition, Privacy and Content Moderation.

The general aim of the bill, S. 2992, the American Innovation and Choice Online Act, is easy to understand and a welcome improvement in competition policy toward the tech industry. Its primary thrust is to ban dominant digital platforms from favoring their own services and products over those of their competitors. In my view, the bill goes a long way toward establishing the attractive principle of fair competition: a digital platform may not compete with its own dependent business users through self-preferencing conduct that privileges its own affiliated companies.

Several senators who voted to send the bill to the floor made plain that they could not vote for final passage in its present form. Their reservations, and those of some outside groups, should be addressed before this transformative bill moves forward. The bill should be modified, without compromising its pro-competition goals, to respond to concerns that in its current form it might threaten user privacy and tech companies’ efforts to engage in strong and effective content moderation.

Balancing Privacy and Nondiscrimination

Senator Patrick Leahy voiced concern at the Committee markup that the bill would spark a “race to the bottom” on privacy and predicted that privacy issues would come up on the floor. He is right to be concerned.

The general standard in the bill governing the relationship between promoting competition and protecting privacy is tilted too strongly in favor of competition policy interests. The law does not now, nor should it, treat privacy as a secondary governmental interest, to be pursued only to the extent that it does not interfere with the much more important goal of promoting competition. And yet this inappropriate hierarchy is how the bill treats privacy.

Platforms often impose constraints on the ability of their business users to exploit user data. Apple requires its app developers to get affirmative opt-in consent before tracking app users. Amazon requires its merchants to retain user data for no more than thirty days and to use it only for certain purposes such as fulfilling orders and calculating and remitting taxes. The companies do this to make their services more attractive to users and because they rightly fear that their users will hold them responsible for any privacy intrusions by the merchants and app developers using their platforms.

But these pro-privacy data constraints might be interpreted as violations of the various duties not to discriminate provided for in the bill. How does the bill deal with such possible conflicts? It instructs the Federal Trade Commission (FTC) to allow a platform constraint on data use to override nondiscrimination duties only if a platform can show that the data constraint “was narrowly tailored, could not be achieved through less discriminatory means, was nonpretextual, and was reasonably necessary to prevent a violation of, or comply with, Federal or State law…[or to] protect…user privacy.”

This standard draws on the language courts use in applying intermediate and strict constitutional scrutiny, and it would be recognized as such by the FTC and certainly by reviewing courts. The language does more than ensure that the bill’s legally binding nondiscrimination requirements override whatever voluntary business practices platforms choose to adopt: the priority of nondiscrimination duties applies even when a platform’s data practices are adopted in compliance with federal or state privacy law.

The bill thereby treats competitive nondiscrimination as if it were a constitutional value like free speech or equal protection, and treats privacy as if it were a mere interest that the government can advance so long as it does not interfere too much with the fundamental value of promoting competition. It is worth noting that S. 2710, the Open App Markets Act, which the Senate Judiciary Committee reported last week, similarly treats privacy as a subordinate value.

It is easy to understand why these bills take such an approach. Often, privacy is just a pretext for platform companies to engage in abusive behavior toward their competitors. The beneficial constraints mentioned above, such as Apple’s requirements for affirmative opt-in consent for user tracking, should not be conflated with other policies that may serve less clear privacy-enhancing purposes, such as Apple’s ban on alternative payment methods outside the App Store. Tech companies have been broadly raising the privacy issue in response to antitrust complaints both here and abroad. They hope to use the privacy issue now as a way to kill or fatally weaken the bill, and to continue to use privacy as a shield against pro-competition measures if the bill becomes law. Reformers and privacy groups such as the Electronic Frontier Foundation are right to resist this misuse of the privacy issue and should continue to do so.

But reasonable resistance to efforts to kill or to undermine the bill should not lead legislators to treat privacy as a second-class value, subservient as a matter of law to the promotion of competition.

A better way forward, as law scholar Erika Douglas has suggested, is to treat the two values on a par, with neither taking precedence over the other. Privacy and competition must accommodate each other rather than being arranged hierarchically.

This accommodationist approach might be embodied in Senator Klobuchar’s nondiscrimination bill by striking the “narrowly tailored,” “less discriminatory,” and “nonpretextual” language. A similar amendment offered by Senator Mike Lee during Committee markup failed by a narrow vote of 10-12, apparently perceived by the bill’s sponsors and supporters as an attempt to gut its pro-competitive thrust.

Striking this language is needed to create a better balance in the bill between the values of privacy and competition. Even with this language removed, an enforcing agency could approve a company’s discriminatory data practice only to the extent that the practice was “reasonably necessary” to protect user privacy, a standard strict enough to prevent the abuse of privacy claims to circumvent the bill’s pro-competitive nondiscrimination duties.

This “reasonably necessary” standard is similar to the general necessity standard used in other parts of the bill. One such provision, as modified by Senator John Cornyn’s national security amendment adopted by a vote of 22-0 in Committee markup, would prevent platforms from restricting the ability of users to uninstall software or change default settings “unless necessary for the security or functioning” (italics added) of the company’s service or to “prevent data…from being transferred to the Government of the People’s Republic of China or that of another foreign adversary.” If this general necessity standard is sufficient to balance nondiscrimination with information security and national security interests, then a standard of reasonable necessity should work to accommodate privacy interests.

This is not a perfect solution to privacy challenges in digital industries. Tech companies have considerable freedom today to define their data practices. The amendment I am suggesting enables regulators to second-guess these decisions only when the data practices are competitively discriminatory and the discrimination is not reasonably necessary to protect user privacy. It would not give regulators the power to regulate digital data practices more broadly, which they should have if they are to protect user privacy fully. This defect of excessive privacy discretion in the hands of powerful digital companies cannot be cured, however, by letting competition policy officials override platform privacy calls to promote the interests of competing firms. The remedy is to pass a strong national privacy law, in addition to any new competition law, that puts tech industry data practices under regulatory supervision.

Similarly Situated Business Users and Content Moderation

A mixed bag of advocates, including the media reform group Free Press, free speech scholar Daphne Keller, and Berin Szoka, the libertarian head of TechFreedom, has united to oppose the provision in the bill that forbids platforms from discriminating among similarly situated business users, even if the platform itself does not directly compete with their products or services. They are rightly concerned that the provision would hinder platform content moderation efforts to control hate speech and disinformation.

The version of this bill that passed the House Judiciary Committee last June banned any platform conduct that “discriminates among similarly situated business users.” This ban might have been targeted at platforms that attempt to circumvent the bill’s ban on self-preferencing through equally anticompetitive practices accomplished through contract rather than through ownership or control. Without this provision, platforms might give anticompetitive preferences to companies that paid extra for platform services, even though there was no consumer or business justification for these preferences other than the extra payment. In the extreme, they could offer exclusive arrangements to a single provider of goods or services. Ruling out discrimination among similarly situated business users would seem to prevent that kind of anticompetitive platform conduct.

But it is very hard to craft an effective statutory prohibition against anticompetitive platform arrangements with business partners, and this vague language certainly didn’t do it. It might even prevent platform recommendations or rankings differentiating among business users providing similar services on platforms. Moreover, it would have enabled government attacks on content moderation. As Free Press pointed out, it would have allowed enforcing agencies to treat content moderation actions against hate speech purveyors like Infowars as anticompetitive discrimination.

The Senate bill S. 2992 attempts to respond to this criticism. It makes it unlawful for platforms to “discriminate in the application or enforcement of the covered platform’s terms of service among similarly situated business users in a manner that may materially harm competition on the covered platform.”

This narrower provision stops a platform from discriminating against a company in application of its terms of service so as to weaken it against favored competitors or even to make it a more amenable target for acquisition. But it seems to miss the crucial case that might have motivated the House bill’s provision. It would not stop a platform from engaging in anticompetitive preferencing through contract rather than affiliation. Under the revised Senate provision, a platform could, for instance, sign contracts for carriage with some providers of pornography, but refuse to carry other providers, while having nothing in its terms of service that explicitly banned pornography. The Senate provision seems to address only anti-competitive discrimination accomplished through the application or enforcement of terms of service.

If the focus is to prevent anticompetitive deals with business partners, the sections of the bill banning discrimination involving “the covered platform operator’s own products, services, or lines of business” could be expanded to forbid discrimination involving “business partners.” This expansion would be consistent with a similar provision in S. 2710, the Open App Markets Act, which provides that app stores shall not “require developers to use an In-App Payment System owned or controlled by the Covered Company or any of its business partners as a condition of being distributed on an App Store or accessible on an operating system.”

In any case, the Senate revision doesn’t resolve the concerns Free Press raised in connection with content moderation. Enforcing a provision against anticompetitive discrimination in content moderation would inevitably involve enforcing authorities in second-guessing platform content moderation decisions. A platform could defend itself against any enforcement action charging anticompetitive discrimination in the application of its pornography ban, for instance, by arguing that some content providers were allowed on the platform because they lived up to its content standards against pornography, while others routinely did not. Resolving that issue would put the enforcing agencies right in the middle of deciding what counts as pornography under platform standards.

It might be possible to fix this by crafting a new provision barring enforcing agencies from taking content into account in assessing whether platforms discriminated against similarly situated business users in a way that harmed competition. But it is not at all clear how this would work. Content providers compete on platforms through the provision of content. To assess platform discrimination in content moderation among similarly situated content providers appears to require assessing the similarities and differences in the content they provide. There seems to be no way around it.

The dangers of doing this should be obvious. As Free Press noted, the revised Senate language would allow state or federal enforcers to block social media content moderation efforts “claiming that what tech companies rightly define as hate speech, incitements to violence, or vaccine disinformation is really just competing political or health information that must stay up.”

In response to conservative concerns about discrimination in content moderation, state legislatures have already passed laws that echo the Senate provision’s ban on content moderation discrimination. Florida’s content moderation law, which a federal district court invalidated on First Amendment grounds, would have required a social media company to enforce its terms of service “in a consistent manner among its users on the platform.” A Texas content moderation law, which a federal district court also invalidated, would have barred social media companies from removing or otherwise limiting content based on the user’s “viewpoint.”

These state measures have some good features, and the judicial decisions invalidating them may go too far in rejecting some potentially attractive transparency and due process mandates in both laws, as a Knight First Amendment Institute appellate brief notes.

But without a doubt, if the ban on content moderation discrimination in the Senate bill becomes law, the state attorneys general in Florida and Texas, who have an enforcement role under the bill, would waste no time using it to accomplish the content moderation restrictions sought in these currently unenforceable state laws.

Conservative concerns about bias in content moderation may seem to be just attempts to work the ref. But they raise very real issues about how to adapt the U.S. free speech tradition to the different challenges of information disorder on social media that have been extensively studied by scholars such as Daphne Keller, Tim Wu (now in President Biden’s White House), and Genevieve Lakier.

As part of resolving these issues, Congress will have to decide whether social media companies should be treated as common carriers for the speech of others, whether they should have a public interest obligation of political neutrality or viewpoint diversity, and how such duties could be made compatible with compelling ethical responsibilities to control online hate speech and disinformation. Legislation in this area would inevitably raise extraordinarily tangled and fraught free speech issues.

The measure in the Senate’s nondiscrimination bill is not an adequate treatment of these complex issues, and if passed in its current form, it will be a source of unintended content moderation harms. Moreover, a bill aimed at preventing self-dealing by dominant platforms is not the proper vehicle for setting good content moderation policy for tech companies. Legislators should leave these thorny issues for a future content moderation bill, perhaps in the context of hearings and a markup on the thoughtful bipartisan bill proposed by Senators Brian Schatz and John Thune.

Modification, Not Evisceration

In the face of determined opposition, it is easy for competition policy reformers just to hunker down and dismiss all criticism as bad faith attempts to kill or eviscerate a needed bill. My call for modifying and dropping certain provisions of the bill may superficially resemble the criticism of others who are seeking to block or slow down this vital legislation. That is not my intent at all. The bill can be revised to respond to privacy and content moderation concerns and still deal effectively with obvious and long-standing competitive abuses in digital industries.

These intersecting issues of promoting competition, preserving privacy, and facilitating effective content moderation need a nuanced legislative approach. As the bill moves forward in the House and Senate, as it must, legislators should make further changes to accommodate interests other than promoting competition, and to continue what will be a long-term effort to establish a regulatory structure that balances competition, privacy, and effective content moderation for digital companies.

Amazon and Apple are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the author and not influenced by any donation.