The initial flurry of commentary concerning the Facebook Oversight Board’s decision on deplatforming President Trump has subsided, providing some perspective to assess what we have learned. A key lesson is that the Board is an ineffective substitute for a meaningful external dispute resolution and accountability mechanism, which should be part of an overall system of social media regulation. Another lesson is that, as Rashad Robinson, president of the civil rights group Color of Change, said, the operations of the Board have become a distraction from the much-needed task of throwing a regulatory net around social media companies.
One way to assess the decision is to ask whether, in light of it, any aggrieved party would be willing to turn to the Board again to achieve individual redress for a perceived wrong. Viewed through that lens, the decision is a complete failure. It took four months for the Board to release an initial decision, and then the Board postponed any further judgment for at least another six months. Moreover, it seemed to suggest that it might not provide an answer even ten months after the alleged injustice, if ever. In other words, the Board has carved out a role for itself quite different from a forum for redressing individual grievances in a timely fashion.
In ruling on the ban of Trump’s Facebook account, the Board held that the initial suspension was justified, but that its indefinite nature was not. The distinction the Board drew between a permanent or time-limited suspension and an indefinite one is clear enough. A criminal defendant knows the difference between a two-year sentence and a life sentence, but what is an “indefinite” sentence?
On the face of it, it is not clear that the Board’s decision makes any practical difference. It simply says to Facebook that it cannot use an indefinite suspension as a penalty; it must choose a life sentence or a specific, time-limited one. YouTube’s ban was prolonged indefinitely because of the continuing risk of violence and, according to press accounts, will not be revoked until the platform determines the risk of violence associated with Trump’s channel has decreased. Twitter’s suspension of Trump was permanent, but it too is free to revisit that decision.
This Board ruling gives only the appearance of a due process limitation. Even if Facebook imposes a permanent suspension, it can always change its mind in light of changed circumstances. What will it do, for instance, if former President Trump again becomes candidate Trump, running for the Republican nomination for President in 2024?
The real damage from the Board’s decision can be seen by asking whether anyone would have confidence in a reasonable outcome, or any outcome at all, if a hypothetical Facebook decision to bar candidate Trump were submitted to Facebook’s Oversight Board for review. The message from the Board’s decision in this case is that it would take four months to reach a decision on candidate Trump and then delegate responsibility to Facebook to reach its own decision within another six months.
When review boards matter
The public does need something like the Board to scrutinize Facebook’s decisions and provide some external accountability. In Europe, the proposed Digital Services Act would mandate the use of external review boards to supplement the internal appeals mechanisms social media companies also would be required to have. Some have suggested social media councils to play this external review role. In a recent Brookings commentary, I suggested a dispute resolution mechanism modeled after the arbitration system supervised by the Financial Industry Regulatory Authority.
But the Board does not see itself as an external review board dedicated to resolving disputes between Facebook and its users. It seems to be trying to find a niche somewhere between judicial review of agency decisionmaking and an administrative judge’s review of the due process rules for granting or denying a government benefit. But setting and enforcing rules for procedural due process in social media is a job for government. It is part of what the bipartisan Platform Accountability and Consumer Transparency Act seeks to do. That proposal lodges enforcement at the Federal Trade Commission and the courts, institutions that live under their own accountability and transparency rules. It is not clear to me that the decisionmaking procedures that result from the Board’s own opaque processes would have any more legitimacy than just leaving it up to Facebook itself.
Most people who follow the Board’s actions closely, especially its first set of decisions, assumed the Board would decide that the initial judgment to suspend then-President Trump was correct in the emergency that existed at that time, but that he no longer posed an imminent danger to public safety or health. The Board’s initial decisions overturned four of Facebook’s choices to delete content. This suggested a strong focus on limiting Facebook’s ability to moderate content, perhaps even going so far as to require Facebook to carry all legal content. In any case, the initial decisions presaged a Board that had no difficulty overruling the substance of Facebook’s content moderation decisions.
Where do we go from here?
There is no use pretending that a decision to reinstate former President Trump or to make the ban permanent would have been an easy one. It is perfectly understandable that the Board would want to back away from making such a difficult choice. The intermediate position it chose must have appealed to its members as the best of the bad choices they faced, as Facebook’s former head of security Alex Stamos suggested. But the Board’s failure to make a substantive decision amounts to a failure of nerve.
It is not even clear that the Board will review what Facebook does next. But if it does, the Board has an opportunity to recover partially from its failure. Nothing can restore the public’s expectation of a rapid response to an urgent issue—the ten-month delay in the Board’s decisionmaking means the expectation of timely justice is gone forever. But by making a forthright, substantive decision on whether Trump should be deplatformed, it can rehabilitate itself as a place where, however delayed, individual justice and redress can be found.
In the meantime, the search must continue for a regulatory framework that can protect users and the public from abuse on and by social media platforms. The Board has become an obstacle to such a framework and a distraction from efforts to create it. It should be replaced with a different and more effective external appeals institution in a future regulatory framework. Rather than spend time parsing the hidden meaning in the Board’s rulings, advocates, scholars, industry representatives, and policymakers should continue the hard work of crafting the structure and the details of the coming regulatory regime for social media.
Facebook is a general, unrestricted donor to the Brookings Institution. The findings, interpretations and conclusions in this piece are solely those of the author and not influenced by any donation.