Commentary

Trump deplatforming decision highlights the impotence of Facebook’s Oversight Board

President Donald Trump walks out of the room after an Operation Warp Speed Vaccine Summit in the South Court Auditorium of the Eisenhower Executive Office Building at the White House on December 8, 2020 in Washington, DC. (Photo by Oliver Contreras/SIPA USA)

The Facebook Oversight Board (FBOB) has decided to uphold Facebook’s decision to ban former President Donald Trump from its platform. But not really. The board decided that the ban did not follow the company’s own guidelines, and Facebook is now responsible for making that decision again, this time within the confines of its own policies.

We are left with an anticlimactic resolution, a kind of Rorschach test that seems to be the product of a corporate pseudo-judiciary trying desperately to also be a corporate pseudo-legislature, all the while being stonewalled by Facebook. The decision is simultaneously a sign of a rebellious FBOB membership denying its role as the ultimate bearer of responsibility, and a clear example of the limitations and ultimate inutility of such a construction. FBOB’s PR blitz tells us it wants both to publicly return to sender on making a choice and to be taken seriously as an accountability mechanism, two goals that seem to be at odds.

Ironically, a less nuanced, or opposite, decision would have fed the “breaking news” media frenzy and the hype of a momentous occasion in global online speech regulation, or even a seismic societal shift, and made the spectacle enough of a legitimating event. The decision, though, and the process of reaching it, are clearly neither a win nor a loss for democracy, the U.S., or even the industry. By design, FBOB deliberates, based mostly on Facebook’s own characteristics, only about Facebook, the entity that gave it its mandate and that is only (barely) bound by its decrees.

How did we get here?

By creating FBOB and granting it very limited but visible authority, Facebook has effectively shifted the conversation, and Wednesday’s decision is just the latest example. At the surface level, for half a day, the world is talking about its governance product, the Oversight Board, rather than any of its other, more problematic, products and services.

On a deeper level, this has also been wildly successful. The outrage cycle has reached a point where FBOB, with its limited accountability, previously awash in backlash, is now somehow starting to seem both inevitable and necessary to otherwise clear-headed pundits, simply because there supposedly isn’t any other accountability on the horizon. The full swath of opinions (some held by the same people) ranges from loosely championing it as the hard work of smart people, or cautiously praising the legal innovation, to mildly curious circumspection and slight unease at its innovatively bizarre characteristics.

But FBOB’s construction as a body that “adjudicates cases” while being funded by the company, in hopes of building “case law” precedent, yet with little actual power over Facebook’s policy creation, was always at odds with the more activist, but still generally effusive, ethos of some of its members.

The Trump decision is the pinnacle of that clash.

The origin story of the FBOB is one that seems to get periodically scrubbed, concurrently criticized and glamorized, to further obfuscate who suggested what; who rode its star to prominence or, conversely, lent it some of their own legitimacy; whether it was built as a Supreme Court or just a CYA maneuver; and so on.

But outside the confines of the technology policy world, the intricacies of the FBOB tend to matter very little. Some journalists and their editors are blurring the notions of “independence,” “transparency,” “non-binding recommendations,” and “accountability” into the more palatable headline of “a Facebook board”—or, more to the point, “Facebook”—burying the complicated nuances deeper in their pieces. Are they wrong?

Accountability?

FBOB has always been an accountability maneuver for Facebook, a way to outsource the hard decisions on content with little downside. Should we accept the narrative that it’s the best we’ve got, even if it was never meant to do more than provide non-binding policy recommendations? Should we accept it as our best shot at accountability, as some have, even while pointing out its current serious limitations? Or do we allow our imagination to run wild and believe that we can do better in terms of private governance than a faulty structure built on top of an inapt metaphor, with no legitimacy and no power?

While we can argue that the Oversight Board’s choice to eschew responsibility here is odd, its flawed nature, much like the Trump decision, traces back to Facebook, its institutional parent. We can cheer the nominal human rights mentions in Wednesday morning’s decision while also being clear-eyed about what the board has always meant: a limited set of cases, unclear applicability, performative accountability, all still mostly at the whim of the social media giant. Not designed to tackle the harms that large parts of society associate with online platforms, nor meaningfully tackling any issues beyond taking down or putting back up content, FBOB most optimistically sits at “better than nothing.”

Without strong popular support and without a mandate to even interrogate Facebook’s business model, FBOB, an expensive and convoluted governance mechanism, is structurally relegated to an advisory role at best.

Conclusion

The ecosystem of online speech is one where everybody plays a role, and there are responsibilities and limits for each actor, whether government, civil society, journalists, or industry. The false duality holding that, in a country like the U.S., where government regulation of speech is practically verboten, the FBOB represents the only alternative to companies making hard choices alone seems to obscure the potential for better institutions or stakeholder collaboration. Abdicating collective power to a poorly designed pseudo-judiciary that masquerades as private governance inevitably leads us to a point where we have to be grateful for its existence. How we deal with the complexities of online speech regulation at a societal level should not be a foregone conclusion.


Facebook is a general, unrestricted donor to the Brookings Institution. The findings, interpretations and conclusions in this piece are solely those of the author and not influenced by any donation.
