Commentary

The disinformation threat to diaspora communities in encrypted chat apps

[Photo: A smartphone screen displays the icons of several different chat applications.]

Amid the deluge of misinformation surrounding last year’s presidential election in the United States, voters across the country encountered persistent false claims online that ballots had been inappropriately “thrown out.” Aimed at undermining confidence in the vote, the “discarded ballot” hoax spread widely across digital media, including in encrypted group chat applications used in diaspora communities.

The spread of disinformation on encrypted messaging applications poses a particular threat to diaspora communities, who have turned to WhatsApp and other messaging apps for the trust and intimacy they afford. Because these platforms are encrypted and closed, however, conventional fact-checking and content moderation regimes are harder to implement, making them a fertile new avenue for the spread of disinformation. Last year in North Carolina, for instance, encrypted messaging applications were used to spread misleading information in a get-out-the-vote campaign targeting South Asian Americans.

The spread of encrypted messaging apps

The challenge posed by encrypted messaging apps (EMAs), such as WhatsApp, Telegram, and Signal, reflects in part their rapid growth. EMAs have exploded in popularity worldwide in recent years, with WhatsApp alone now boasting more than 2 billion monthly users. The platforms are increasingly used as key tools for political campaigning and mobilization, especially in the Global South.

The United States has not been immune to the trend. After the 2016 election, platforms like Signal saw a massive spike in users seeking greater communications security in the wake of Donald Trump’s victory. More recently, far-right groups migrated to encrypted platforms following the January 6 assault on the U.S. Capitol and subsequent moves by social media platforms to ban their presence. But the technology is not a niche phenomenon: major tech companies are also incorporating end-to-end encryption into mainstream messaging platforms.

WhatsApp and other EMAs play an important role among minority groups and diaspora communities, including for political discussion. According to Pew Research Center data from 2019, 42% of Hispanics in the United States use WhatsApp, compared to 24% of Black Americans and 13% of white Americans. Our own research points in the same direction: In August 2020, we conducted a representative survey of 1,010 U.S. adults, fielded through NORC at the University of Chicago, and found that 10.4% of participants had used WhatsApp within the past seven days, and 9.5% had used it in the past six months to discuss politics. The demographic breakdowns were particularly telling: 27% of Hispanic respondents said they had used the app within the past six months to discuss politics, as had 21% of respondents identifying as Asian, 9.5% of those identifying with two or more racial/ethnic categories, 8% of Black respondents, and a mere 4% of non-Hispanic whites.

EMAs in diaspora communities

Messaging apps such as WhatsApp can play a crucial role among immigrant and diaspora communities, as well as in certain ethnic and racial groups. They help people stay in touch with family and friends, some of whom may live abroad. Both news reporting and our survey research indicate that this is particularly true for Latino/Latinx communities in the United States. For the purposes of our work, we refer to communities that rely on EMAs more heavily than the general U.S. population by the umbrella term “diaspora communities,” particularly because such users tell us that they regularly use these applications to communicate with people in their country of origin, individuals who share their cultural context, and people living in the U.S. from that same community. While this approach risks over-including individuals who are not members of diaspora groups, the connecting thread for our research is the usage of these apps.

To understand the information ecosystem in which EMAs such as WhatsApp operate, it is important to note three key features of these apps. First, they are typically end-to-end encrypted, meaning that no one outside a conversation, whether the platform itself, malicious actors, or other users, has the technical ability to read its contents. Second, EMAs combine features of social media platforms (e.g., the sharing of messages through forwarding or, in some cases, the creation of news feeds) with features of closed-communication channels (e.g., private chats). Third, and most importantly, EMAs are embedded in the existing, trusted networks and relationships that people have established on their cell phones: their contact lists.

Overall, EMAs can provide tremendous value to diaspora communities, but that same value makes them more impactful when weaponized. Malicious actors who use EMAs to propagate disinformation, if successful, are able to harness existing trusted relationships to forward misleading or false information. Compared to major social media platforms, where content is public and subject to fact-checking or other content moderation regimes, disinformation on EMAs may travel farther and face fewer obstacles.

The spread of disinformation on closed chat applications spans a variety of U.S. diaspora groups. News reports have documented the phenomenon across Latino, Indian American, Korean American, Chinese American, and Arab American communities in various closed apps. In our interviews with community leaders in Florida, we encountered examples in which disinformation in primarily Colombian, Cuban, and Venezuelan WhatsApp groups piggybacked on slight mistranslations that can misconstrue facts, both intentionally and not. For example, in a video in which Joe Biden described himself as “progressive” while speaking in English, a Spanish translation rendered that word as “progresista.” While technically correct, in some Latin American communities that word carries far-left connotations closer to the Spanish words “socialista” and “comunista,” as one of our interview subjects explained. According to one Indian American community leader in North Carolina, non-political groups created to benefit the community became inundated with false political information. And in our observational work in Brazilian American WhatsApp groups in recent months, we observed a disinformation-laden discourse alleging fraud in the recent U.S. presidential election and arguing that the same phenomenon would occur in Brazil’s 2022 elections, when another populist strongman will attempt to hold onto power.

Regardless of whether false claims are spread intentionally on WhatsApp and other apps, the fact that messengers on these platforms are highly trusted lends greater credence to their claims. Political consultants increasingly use messaging from trusted sources to promote candidates, a tactic the industry calls “relational organizing.” The tactic is growing more commonplace, especially as the pandemic has made traditional interpersonal outreach difficult or impossible. To be sure, we are not arguing that mis- and disinformation in diaspora communities that rely on EMAs is as intentional as a relational organizing campaign. Like many frustrated members of these communities and the reporters covering these events, we have yet to successfully trace the sources of falsehoods spread on these apps.

Attribution of disinformation on EMAs is complicated in part because information flows easily across international and state borders. In one instance, we encountered a get-out-the-vote campaign based in California targeting South Asian Americans that became a vector for false election information for a community in North Carolina. In the campaign’s North Carolina chapter group on WhatsApp, one interview subject encountered people from California posting about voting rules for drop boxes that did not apply to North Carolina. The subject was unable to tell whether the harm this was causing her community was intentional: “Is it because someone is just being very invested in the voting process and sharing information not understanding that rules are different state to state, or is it a malicious person?”

Though EMAs are closed to outsiders and resistant to content moderation, there are still tools that can be deployed to counter harmful messaging, and members of affected diaspora communities express a desire to be part of such efforts. If members of the research, investigative journalism, and policy communities make the effort to work with community leaders, community-specific counter-messaging tools that capitalize on the same dynamics leveraged in relational organizing could be developed to debunk false content. By pooling skills and resources, it may be possible to trace the origin of problematic content and hold those responsible to account. The research organization First Draft has published a list of recommendations for platforms, news outlets, researchers, and information providers seeking to understand and counter this disinformation vector.

Our research supports those recommendations, and we emphasize the importance of working with trusted community leaders for two reasons. First, counter-messaging that originates within these communities helps overcome the ethical and technical barriers to studying WhatsApp messaging. Second, in our interviews we encountered dedicated members of these communities who organize to debunk and counter harmful messaging that appears across multiple platforms, such as in both Facebook and WhatsApp groups. These leaders, however, struggled to identify the source of that content. Researchers and journalists with expertise in source attribution should work in concert with them to trace content from platform to platform and, eventually, to its source.

Jacob Gursky and Martin J. Riedl are research associates at the Center for Media Engagement at the University of Texas at Austin.
Samuel Woolley is the director of the propaganda research program at the Center for Media Engagement, the research director of disinformation work at the Good Systems initiative, and an assistant professor in the School of Journalism—all at the University of Texas at Austin.

Facebook provides financial support to the Brookings Institution, a nonprofit organization devoted to rigorous, independent, in-depth public policy research. 
