From anti-vaxxer moms to militia men: Influence operations, narrative weaponization, and the fracturing of American identity

The U.S. Capitol is seen through razor wire after police warned that a militia group might try to attack the U.S. Capitol in Washington, U.S., March 4, 2021. REUTERS/Joshua Roberts

EXECUTIVE SUMMARY

In spring 2020, several Facebook groups coalesced around opposition to coronavirus-related lockdowns. These online groups quickly devolved into hotbeds of conspiracy theories, malign information, and hate speech. Moreover, they drew from multiple online communities, including those opposed to vaccination — known as “anti-vaxxers” — anti-government militias, QAnon supporters, and other conspiracy theorists. How did the anti-lockdown online forums unite these seemingly disparate groups and ultimately mobilize people with such different ideologies to show up at local protests? While algorithmic recommendations helped people find these Facebook groups, we find that what makes the groups so compelling is their narratives: in this case, narratives of government and elite conspiracies and of threats to individual freedom.

Drawing from the field of cultural sociology, we define narratives as social stories that help people understand events and assign moral meaning. Narratives elicit and play on emotion by tapping into deeply held beliefs and values. Critically, they also establish or reinforce group or collective identity. They may be used to speak to a core audience of believers and also to engage new and even initially unreceptive audience segments with potent cultural messages. Focusing on the networks linking the anti-lockdown narratives to other key social narratives, we illuminate the pathways linking seemingly disparate groups.

We conceptualize messengers, like Fox News, MSNBC, or the Russian Internet Research Agency (IRA) — or even social movements — as seeking to increase commitment to a group identity through the strategic deployment and presentation of narrative. This task involves an incremental, iterative, and possibly non-linear process that resembles the tactics sexual predators use to lure victims. Combining cultural sociology insights on narrative with interdisciplinary work on radicalization, we have developed the “WARP” framework. WARP stands for Weaponize, Activate, Radicalize, Persuade, and points to the way narratives are deployed in service of identity and collective action projects, the rhetorical or persuasive strategies used in service of these aims, and variations in individual response and susceptibility to influence operations. Using data from Russian influence operations on Twitter, we apply the WARP framework to illustrate how groups deploy and weaponize narratives to energize conspiracy theories, exacerbate social divisions, mobilize protest, and even promote violence. Russia’s influence operations provide a prominent example, but other foreign as well as domestic actors have used narratives similarly. To the extent that these various influence operations use the same narrative touchpoints, the impact of foreign versus domestic influence operations is likely impossible to disentangle.

In light of the COVID-19 pandemic and the assault on the U.S. Capitol on January 6, understanding narrative as a weapon of influence, and the process through which people become engaged with and mobilized by divisive content, has significant policy implications. To counter or prevent the proliferation of weaponized and radicalizing narrative content, we must monitor the network of narratives, in particular the pathways leading to violence and hate. In addition, we must ensure that social media algorithms direct people away from radicalizing groups rather than feeding those groups leads for new recruits. No company should be able to profit from promoting the destruction of democratic norms and institutions. To this end, social media companies’ algorithms should be opened to public scrutiny and federal regulation. Social media platforms also must demonetize, and remove from algorithmic recommendations, any individual, group, or page that weaponizes narratives to undermine civil society and national security. We need more privacy protections to prevent malign actors from microtargeting vulnerable individuals. And, finally, we must concentrate on telling true and inspiring stories about the United States of America that draw us together instead of tearing us apart.

  • Acknowledgements and disclosures

    We gratefully acknowledge the members of the New War Research Consortium and of the DARPA ISAT Digital Inoculation at Scale study group for the thoughtful discussions that sharpened the ideas presented in this paper. We also gratefully thank the reviewers for their considered feedback, which helped improve the manuscript. Any mistakes are our own. Ted Reinert edited this paper and Rachel Slattery provided layout.

  • Footnotes
    1. Joel Finkelstein, John K. Donohue, Alex Goldenberg, Jason Baumgartner, John Farmer, Savvas Zannettou, and Jeremy Blackburn, “COVID-19, Conspiracy and Contagious Sedition: A Case Study on the Militia-Sphere,” (Princeton, NJ: Network Contagion Research Institute, 2020), 15, https://networkcontagion.us/reports/covid-19-conspiracy-and-contagious-sedition-a-case-study-on-the-militia-sphere/; Samuel L. Perry, Andrew L. Whitehead, and Joshua B. Grubbs, “Culture Wars and COVID-19 Conduct: Christian Nationalism, Religiosity, and Americans’ Behavior During the Coronavirus Pandemic,” Journal for the Scientific Study of Religion 59, no. 3 (July 26, 2020): 405-416, https://doi.org/10.1111/jssr.12677; Elise Thomas and Albert Zhang, “ID2020, Bill Gates and the Mark of the Beast: how Covid-19 catalyses existing online conspiracy movements,” (Barton, Australia: Australian Strategic Policy Institute, June 2020), 21, https://www.jstor.org/stable/resrep25082.
    2. Karen Hao, “He Got Facebook Hooked on AI. Now He Can’t Fix Its Misinformation Addiction,” MIT Technology Review, March 11, 2021, https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/.
    3. Javier Argomaniz and Orla Lynch, “Introduction to the Special Issue: The Complexity of Terrorism—Victims, Perpetrators and Radicalization,” Studies in Conflict & Terrorism 41, no. 7 (July 3, 2018): 491-506, https://doi.org/10.1080/1057610X.2017.1311101; Arie W. Kruglanski, Michele J. Gelfand, Jocelyn J. Bélanger, Anna Sheveland, Malkanthi Hetiarachchi, and Rohan Gunaratna, “The Psychology of Radicalization and Deradicalization: How Significance Quest Impacts Violent Extremism,” Political Psychology 35, no. S1 (February 2014): 69-93, https://doi.org/10.1111/pops.12163.
    4. Jessica Dawson, “Microtargeting as Information Warfare,” (College Park, MD: SocArXiv, December 2020), https://doi.org/10.31235/osf.io/5wzuq.