Commentary
Facebook’s Oversight Board makes an imperfect case for private governance
February 23, 2021

The Facebook Oversight Board recently released its first decisions on selected content moderation “cases,” and it is now deciding whether to allow former President Donald Trump back on the platform. While the Oversight Board is nearly unique in the online content moderation ecosystem, it is one piece of a complex web of institutional and legal choices and actions. And though the Board has drawn intense and legitimate criticism, it still yields a few takeaways for policymakers: Cooperation is important, and appeals and private governance processes may be even more important.
The Board
The idea behind this distinctive single-company private governance institution first appeared in 2018, amid broader calls for externalizing content decisions, as a “Supreme Court” that would have the final say on content matters. The social media giant spared no expense in making its global consultations as open and inclusive as possible. The process also benefited from at least two rounds of input from specialists across a wide variety of disciplines (the author attended one such meeting) who held widely divergent views of the company itself. Legal scholar Kate Klonick provided scrutiny and advice, and chronicled the experience for The New Yorker magazine.
While ostensibly participatory, the process of creating the Board still yielded a structure tailored to Facebook’s needs: externalizing the ultimate decisionmaking on tough cases while retaining the significantly more consequential power to set content moderation policy unilaterally. The Oversight Board would be built entirely outside of Facebook, with independent members chosen from around the world to adjudicate a small number of cases. The outcomes of those cases would, within the limits of technical feasibility, be binding on the company. However, the Board’s role in offering policy recommendations, an important and legitimate check on Facebook’s power, remained consultative and structurally uncertain.
As the rollout of the structure, bylaws, and members proceeded in 2019 and 2020, the Board received a mixed reception from the community of activists, advocates, and academics involved, many of whom remained uneasy with it. Some had expressed foundational, constructive criticism from the start. Others, drawing on legal rather than private governance scholarship, initially bought into the general frame put forth by Facebook, extolling the hard work of the team and its historical legal importance while offering only mild critiques. A group of otherwise remarkable critics staged public relations stunts like the “Real Oversight Board,” whose actual power over Facebook’s actions is even smaller than that of the official Oversight Board. Meanwhile, those who excitedly joined the Facebook Board incorrectly hinted that their decision-making power over the company was final.
The Board will soon decide the former president’s fate on the platform, but it has already released decisions on a first batch of the originally chosen cases, some more controversial than others. Relying on international human rights law and the expertise of its members, the Board took these decisions with immense care: they benefited from external expertise, incorporated local knowledge, and grappled with the conflict between speech rights and potential ensuing harm. Whatever one makes of the actual choices, the members took their role very seriously; they are not simply a rubber stamp on Facebook’s moderation decisions but genuinely independent.
Regardless of the choices made and how we feel about them, the underlying concern with this private pseudo-judicial body goes beyond the decisions themselves or the independence of the members: Despite Facebook’s pledge to enforce decisions on posts with “identical content with parallel context,” the nuances the Board expresses are difficult to translate into so-called case law. The “lower courts” that would decide other cases are either context-deficient algorithms or overworked, underpaid people making several decisions every minute, often without local context or external expert guidance. At the same time, these and upcoming decisions may significantly influence the conversation outside of Facebook. The Board could even fuel changes at other social media platforms, given the legitimacy afforded by its members, thorough decisionmaking, and outsized media attention. In fact, the ambition to expand the Board to other entities further down the road, initially a Facebook proposition, has now been embraced by Board members themselves, though moderation policy experts remain strongly opposed.
Takeaways
Cooperation is needed
The complexity of decisions on speech could not be illustrated more clearly than when a deliberative body of renowned experts from around the world, with access to virtually unlimited resources and a thorough process, comes up with solutions that are easy to criticize on the merits. And that does not even account for the general concerns over the Board’s structure, role, power, and corporate end-goal. Facebook’s longtime mantra has been that it is not and should not be “the arbiter of truth,” a claim that has been greeted with detailed lists of instances in which it has acted as one. Meanwhile, the U.S. government, restricted by the First Amendment, is neither capable of nor always willing to make hard decisions on speech beyond what is and is not legal, and proposed reforms too often run at one of two speeds: regulatory capture or adversarial animosity.
While radical alternatives are legitimate, the starting point for the content moderation conversation seems to be that such platforms have a role in society. The seeming intractability of speech choices should therefore urge stakeholders to collaborate in order to make progress on the problem. Far from amounting to what others have called “cartelization,” cooperation in content moderation can lead to mutual understanding among government, civil society, and platforms; complementary actions; and, potentially, a less toxic online environment.
Appeals processes matter
Though the structure of the Board mirrors an appellate judiciary, it lacks important characteristics like institutional legitimacy, a constitution, and co-equal branches of government. Still, appeals themselves have increasingly become a tool for policymakers, notably in the EU’s Digital Services Act, to ensure better rights protections for users. The Santa Clara Principles count appeals as one of three foundational pillars of transparency and accountability. A necessary if insufficient solution, appeals should be in the toolbox of both regulators and platforms.
Proper private governance can work
The case of the Oversight Board could end up as a cautionary tale against private governance: A large, successful company sets up a pseudo-judiciary with limited powers to stave off regulation while concurrently trying to shape any potential regulation for the rest of the industry. However, platforms already perform “private ordering” through their terms of service. Further, their actions on speech, within a complementary ecosystem, constitute governance choices. Creating private governance systems that institutionalize these actions and policies, make them visible, and subject them to potential change should be uncontroversial. Industry-wide private governance, with proper guardrails, transparency, and the inclusion of other stakeholders, could lead to solutions that the U.S. government can’t legislate but could encourage.
Facebook is a general, unrestricted donor to the Brookings Institution. The findings, interpretations and conclusions in this piece are solely those of the author and not influenced by any donation.