Adding a ‘D’ to the ABC disinformation framework

April 27, 2020

The world is not only living through a global pandemic. It is also experiencing what the World Health Organization calls an “infodemic.” Disinformation related to health issues, political campaigns, and conspiracy theories has become a major global concern, and the spread of public health disinformation is one reason to re-evaluate our frameworks for understanding the phenomenon.
In 2019, Camille François, Graphika’s chief innovation officer, established a framework for describing and analyzing influence operations known as the disinformation “ABC.” It describes the state of modern malign influence using three criteria:
- Manipulative Actors with a clear intention to disrupt democratic processes or the information ecosystem;
- Deceptive Behaviors, the tactics and techniques those actors use;
- Harmful Content pushed to harm or undermine individuals, organizations, or the public interest, and to shape public debate.
Since its publication, this framework has been widely adopted. For researchers and technology companies, it has become a key tool for identifying information and influence operations and assessing how to respond. Particularly in the absence of standard international public enforcement procedures, the framework offers a clear model for designing counter-disinformation efforts. Facebook’s community standard on “coordinated inauthentic behavior,” for example, leans heavily on aggregate behaviors to detect campaigns and remove associated accounts.
Yet as successful as the ABC model has been, the focus on specific actors, behaviors, and content doesn’t take into account the way structural factors inform disinformation campaigns. How a platform is structured is crucial to understanding which actors will use it, how they will behave, and the kinds of content they will generate.
Therefore, the ABC framework should be expanded to include a “D,” for information Distribution. How disinformation diffuses and spreads owes largely to the digital architectures of online platforms. Digital architectures are the “technical protocols that enable, constrain, and shape user behavior in a virtual space,” encompassing the network structure, functionality, algorithmic filtering, and datafication of online platforms. Put differently, the way a digital platform is structured shapes and constrains how information is distributed and the kind of reach it will have.
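To make the expanded framework concrete, here is a minimal Python sketch of how a researcher might annotate a catalogued operation along all four criteria. The class and field names are illustrative inventions, not part of the published framework:

```python
from dataclasses import dataclass

@dataclass
class InfluenceOperation:
    """One catalogued operation, annotated along the four ABCD criteria."""
    actors: list[str]        # A: who is behind the operation
    behaviors: list[str]     # B: deceptive tactics and techniques employed
    content: list[str]       # C: harmful narratives being pushed
    distribution: list[str]  # D: channels and mechanisms of spread
                             #    (recommendation systems, paid ads, messaging apps)

# Illustrative annotation of the French impersonation case discussed below.
op = InfluenceOperation(
    actors=["individual impersonating prominent French politicians"],
    behaviors=["impersonation", "inauthentic pages"],
    content=["political disinformation aimed at French audiences"],
    distribution=["Facebook paid advertising (microtargeted ads)"],
)
print(op.distribution)
```

The point of the fourth field is that an analysis recording only the first three leaves out precisely the dimension, distribution, that determines an operation’s reach.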
Ads, algorithms, and distribution
The role and importance of distribution have grown in recent years, as recommendation systems and paid advertising have promoted conspiracy theories and falsehoods on YouTube, fed filter bubbles based on racial biases on TikTok, and encouraged emotional changes in Facebook users.
Disinformation actors have exploited these vulnerabilities to gain online visibility and revenue. Research from the EU DisinfoLab found, for example, that Facebook’s recommendation algorithm promoted pages posting disinformation authored by a French white supremacist group, allowing the group to grow its following to nearly half a million.
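To illustrate the mechanism rather than any platform’s actual code, the toy Python simulation below contrasts a chronological feed with a ranker that optimizes for engagement. The engagement numbers are assumptions, but they show how such a system mechanically pushes sensational content to the top:

```python
# Toy feed: a few sensational falsehoods mixed into mostly ordinary posts.
posts = [
    {"id": i, "sensational": s, "age": i}
    for i, s in enumerate([True, False, False, True, False, False, False, True])
]

def engagement_rate(post):
    # Assumed, illustrative numbers: sensational content earns ~3x the clicks.
    return 0.15 if post["sensational"] else 0.05

def chronological(feed):
    # Baseline ordering, blind to engagement.
    return sorted(feed, key=lambda p: p["age"])

def engagement_ranked(feed):
    # An engagement-optimizing recommender surfaces whatever gets clicked most.
    return sorted(feed, key=engagement_rate, reverse=True)

for name, ranker in [("chronological", chronological),
                     ("engagement-ranked", engagement_ranked)]:
    top3 = ranker(posts)[:3]
    share = sum(p["sensational"] for p in top3) / 3
    print(f"{name:>17}: {share:.0%} of the top slots go to sensational posts")
```

Under these assumptions the chronological feed puts one sensational post in the top three slots, while the engagement-ranked feed fills all three with them; no malicious intent is needed on the platform’s side for the distribution advantage to appear.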
Online political ads in electoral campaigns play a similar role, allowing voters to be microtargeted directly with disinformation. In one instance, what appears to have been a Ukrainian individual with ties to Russia impersonated prominent French politicians to target French audiences on Facebook with disinformation amplified by advertisements. The example demonstrates that foreign actors can exploit paid advertising to significantly extend the reach and influence of their disinformation campaigns.
Encrypted direct messaging platforms, such as WhatsApp, are now key to disinformation distribution and pose new challenges. By virtue of their encrypted nature, they are difficult to study, yet they have played a key role in disinformation around the COVID-19 pandemic, presidential elections in Brazil, and mob and ethnic violence in India.
Stimulate distribution research through better transparency
We cannot understand how disinformation operates online, much less counter it effectively, without clear and trustworthy data about how it spreads and what impact it has. In the European Union, the Code of Practice on Disinformation encouraged the creation of online repositories of political ads, allowing researchers and the public to access some distribution metrics. Facebook has also announced that it will display country distribution data on some Facebook pages and Instagram accounts. Recently, civil society organizations have asked the tech industry to retain data around coronavirus disinformation takedowns so that they can be studied later.
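Facebook’s Ad Library, for instance, exposes a public API for political ads. A sketch along the following lines shows roughly how a researcher can pull the distribution metrics such repositories expose; the API version, parameter names, field list, and token are assumptions to verify against the current documentation:

```python
import requests

# Facebook Ad Library API endpoint; the Graph API version changes over time.
AD_ARCHIVE_URL = "https://graph.facebook.com/v7.0/ads_archive"

params = {
    "access_token": "YOUR_ACCESS_TOKEN",  # placeholder; access requires verification
    "search_terms": "coronavirus",
    "ad_type": "POLITICAL_AND_ISSUE_ADS",
    "ad_reached_countries": "['FR']",     # countries where the ads were delivered
    "fields": "ad_delivery_start_time,impressions,spend,demographic_distribution",
    "limit": 25,
}

resp = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
resp.raise_for_status()

for ad in resp.json().get("data", []):
    # 'impressions' comes back as a coarse range (lower/upper bounds), an example
    # of the aggregated, platform-defined metrics discussed in this article.
    print(ad.get("ad_delivery_start_time"), ad.get("impressions"))
```

Note that even here the platform decides the granularity: impressions arrive as broad ranges rather than exact counts, which is exactly the asymmetry described next.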
But more steps are needed to reduce the information asymmetry between platforms and third parties (journalists, researchers, regulators, and end users). Platforms need to provide independent and reliable ways of accessing data on content audience and impact. As it stands, researchers depend on metrics determined and voluntarily released by the platforms themselves, with little ability to verify their accuracy. And there is reason to doubt the platforms’ data: Facebook, for instance, inflated metrics on video views on the platform.
Although platforms have tried to address the problem on their own, entire research communities cannot depend on the piecemeal data the tech industry chooses to provide. If we actually want to reduce disinformation and its impact, we have to research the distribution vulnerabilities of these products.
Solutions are on the table, such as transparency in the process of content distribution, a proposal that could guarantee researchers access to distribution data. Another proposal suggests classifying social media and search engines as media, which could raise their transparency requirements. Multilateral approaches also include a proposed moratorium on microtargeted ads containing falsehoods, in order to gather more scientific knowledge about their impact.
The COVID-19 infodemic has shown that platforms can act proactively and claim an impact on mitigating online disinformation. By beginning to consider how distribution shapes the spread of online disinformation, we will improve our research and foster a better understanding of the phenomenon.
The “ABC” framework is essential to detect, analyze, and take down these operations. But unless we also focus on the role of information distribution in how we study disinformation, we will keep playing whack-a-mole against malicious actors who readily adapt their tactics.
Alexandre Alaphilippe is the executive director of the EU DisinfoLab, an NGO focused on researching and tackling sophisticated disinformation campaigns.
Facebook and Google, the parent company of YouTube, provide financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research.