Commentary

Facebook can’t resolve conflicts in Myanmar and Sri Lanka on its own

A Buddhist monk looks through the window of a burnt car after Muslims attacked and set fire to a temple in Cox's Bazar, October 1, 2012. Bangladesh accused Muslim Rohingya refugees from Myanmar on Monday of involvement in attacks on Buddhist temples and homes in the southeast and said the violence was triggered by a photo posted on Facebook that insulted Islam. REUTERS/Andrew Biraj

Facebook CEO Mark Zuckerberg has been caught up in a whirlwind in recent months, giving congressional testimony and public statements defending Facebook against allegations that it has been too lax in combating online hate speech and disinformation. International criticism has rightly brought attention to the urgent need to address Facebook’s role in stoking ethnic and religious violence in countries such as Myanmar and Sri Lanka, where racist or anti-Muslim posts online have revived old conflicts, inspiring real-world death and destruction.

However, rather than placing the onus on Facebook to police its own content, as many have proposed (and as the company is attempting to do), a more effective long-term solution would seek to make the populations of these countries more resistant to social media’s divisive influences.

A history of violence

For both Myanmar and Sri Lanka, Buddhist-Muslim violence has long been a fact of life. Though not well known today, Burmese military operations in 1978 led to the outflow of over 200,000 ethnic Rohingya Muslims and a further 260,000 in 1991, about 230,000 of whom were repatriated soon after. While many Rohingya were hopeful that Myanmar’s partial democratization in 2011 would improve their lot, the recent violence has dashed this short-lived optimism. Following that political transition, the sudden rollback of authoritarian controls and press censorship—along with the rapid expansion of internet and mobile phone penetration—opened the floodgates to a deluge of online hate speech and xenophobic nationalism directed at the Rohingya.

In June 2012, unsubstantiated reports on Facebook that a group of Muslims raped a Buddhist woman touched off a spate of violence, as did a similar incident in July 2014. The latest bloodshed between Buddhists and Muslims in Myanmar’s Rakhine State, precipitated by a small-scale attack on military outposts in August 2017, has forced more than 700,000 Rohingya to flee across the border to Bangladesh. Citing the large volume of vitriol online, the chairman of the U.N. fact-finding mission on the situation in Rakhine told reporters during a press conference in March that Facebook played the “determining role” in accelerating the violence and had “substantively contributed to the level of acrimony” toward the Rohingya.

Nationalist Buddhist monks, such as the infamous U Wirathu, have played a leading role in disseminating and magnifying anti-Rohingya propaganda on Facebook. More distressing, however, is that even the Facebook page for State Counselor Aung San Suu Kyi’s press office has been used as a platform to downplay alleged atrocities committed by the military in Rakhine, dismissing Rohingya claims of sexual assault as “fake rape.” In April, Mr. Zuckerberg responded to a letter from concerned activists in Myanmar, who criticized the company’s inability to control the proliferation of online hate speech and fake news, by saying that Facebook would do more to censor harmful content.

In nearby Sri Lanka, where a brutal civil war between the Sinhalese Buddhist majority and the Tamil minority ended less than a decade ago, Facebook has also proven a potent tool for reigniting Buddhist-Muslim conflict. In March, posts circulated on Facebook accusing Muslim shopkeepers of putting sterilization pills in Buddhist customers’ food. The story soon went viral, and a flurry of online messages spread word of fictitious attacks, with some urging Buddhists to “kill all Muslims.” The next week, a road rage incident in which a group of Muslim men beat a Sinhalese truck driver to death set off a wave of violence, arson, and destruction in a Muslim neighborhood. One Muslim man died after being trapped inside a house that was set on fire.

In response, the Sri Lankan government declared a state of emergency and temporarily blocked access to Facebook and WhatsApp, the company’s subsidiary online messaging service, until order was restored. The online blackout swiftly caught the attention of Facebook executives, who intervened the next day to request its services be brought back online. While Sri Lankan authorities and internet providers prevented a far worse crisis, tensions and mistrust linger between Buddhist and Muslim communities.

Facebook enters the fray

To understand why Facebook has played such an outsized role in stoking conflict in these places, it’s important to get a sense of the circumstances that have made the platform so ubiquitous. In Myanmar and Sri Lanka, a drastic reduction in consumer costs arising from deregulation in the telecommunications sector and the availability of cheap internet-enabled devices have facilitated the rapid growth of mobile internet activity. (In 2011, the World Bank estimated Myanmar’s mobile penetration rate at 2.5 percent of the population, less than that of North Korea; it has since skyrocketed to over 100 percent.) Facebook’s partnership with and 2011 acquisition of the Israeli startup Snaptu proved especially fortunate for the company’s expansion into these underdeveloped countries because it made the platform accessible on the low-end “feature phones” predominant in more impoverished markets.

Facebook’s entry into Myanmar was further aided by the company’s Internet.org initiative, a cooperative effort with other large tech firms to provide low-cost basic internet service to the nearly two-thirds of the world’s population for whom such access is prohibitively expensive. Through a partnership with one of Myanmar’s local internet service providers, subscribers were able to access Facebook free of charge through its zero-rated Free Basics app. Although the service was quietly terminated in September 2017 (just as violence was ratcheting up in Rakhine), for a large swath of the population, Facebook has essentially become the internet.

Online and offline solutions

Amid the tumult in these countries and elsewhere, there has been a growing clamor worldwide for Facebook to take greater responsibility for the material uploaded to its network. Indeed, tech companies should work with, and within, affected communities to understand complex social dynamics and build linguistic capabilities among their staff to help cultivate broad-based digital literacy. Such a campaign would entail educating the public about disinformation and encouraging the responsible use of social media. In his letter responding to the Myanmar activists, Mr. Zuckerberg promised to boost Facebook’s censorship capacity in the country by recruiting additional Burmese-speaking content reviewers and using artificial intelligence to track and erase hate speech that could trigger violence.

While such efforts are well-intentioned, they are extraordinarily challenging to implement. Facebook’s attempt to remove content in Germany following the implementation of a new online hate speech law highlights the immense resources required to mount such a censorship campaign and points to the difficulties of subjectively identifying offensive material. Moreover, the company’s vice president for community integrity recently stated in an interview that Facebook’s effort to employ artificial intelligence against hate speech has so far proven ineffective because detecting it requires a nuanced grasp of culture and context. Even if these mechanisms could be improved, they amount to little more than whack-a-mole tactics that may temporarily silence online incitements but do little to suppress conflict or heal entrenched divisions in the long term.

It’s time to consider a more holistic solution that seeks to inoculate populations against the pernicious influences of online disinformation and hate speech. First and foremost, there is much work to be done in educating the public at all age levels about how to cope with today’s information-rich environment. In Myanmar and Sri Lanka, where still-nascent government institutions remain weak and the media has come under attack, civil society organizations can play such a role. One model has emerged in Myanmar where, since 2014, an innovative grassroots organization called Panzagar, or “Flower Speech” in English, has been working to promote civil dialogue and online fact-checking both through its advocacy on Facebook and in real-life public forums.

Yet perhaps an even more effective means of enhancing the resilience of these societies would be to encourage positive intercommunal contact and dialogue to counter the polarizing tendencies of online communication. The homogeneous virtual enclaves of like-minded individuals that social media has fostered have supplanted diverse physical communities and provide fertile ground for hateful ideologies to take root. Leaders in Myanmar and Sri Lanka should pursue opportunities to foster organic real-world interactions among their populations, allowing individuals to gain insight into diverging viewpoints and empathy for those who hold them.

Ultimately, it is incumbent upon political elites to downplay interreligious and ethnic divisions, long ingrained in these countries’ political landscapes, and to encourage a national discourse of tolerance and cooperation in the larger democratic experiment. Such a cultural and political shift will extend well beyond online interactions and will require sustained engagement by civil society, religious leaders, educators, journalists, and politicians.

Washington can also play an important role in helping to bring about positive change in countries like Myanmar and Sri Lanka by imparting lessons learned from its own tumultuous history grappling with a diverse civil society. Public-private partnerships such as AmeriCorps, which helps facilitate voluntary civic service in the United States, demonstrate how societies can work together toward comprehensive solutions to strengthen multiethnic communities by eroding long-standing barriers to interpersonal engagement and dialogue. While hate speech and disinformation are likely to persist online despite Facebook’s best technological solutions, societies willing to patiently pursue more long-term and systemic reforms can begin to make real progress toward resolving the deep-rooted conflicts exacerbated by social media.

Brandon Paladino is an employee of the United States Government (USG), which is funding his current fellowship at the Brookings Institution. All statements of fact, opinion, or analysis are those of the author and do not reflect the official position or views of the USG.