Editor’s note: On January 27, 2015, J.M. Berger testified before the U.S. House of Representatives Committee on Foreign Affairs. Read his prepared oral statement below or watch the full video of his remarks online.
Terrorist use of technology can be complex, but it is not mysterious. Extremists generally follow the same practices available to everyone, from dial-up bulletin boards in the 1990s to social media today.
Jihadists have figured out how to use social media to make an impact, even though their numbers are minuscule in comparison to the overall user base. The Islamic State, more commonly known as ISIS or ISIL, leads the way. Its highly organized social media campaign uses deceptive tactics and shows a sophisticated understanding of how such networks operate.
After years of back-and-forth debate among free speech advocates, governments, and companies, Facebook and YouTube have instituted reporting procedures that allow users to flag content that supports terrorism for removal.
Until last fall, Twitter took an extremely permissive approach to what content it would allow. Starting shortly before ISIS disseminated a video of the beheading of American journalist James Foley, Twitter began to take a more aggressive approach to ISIS specifically, and thousands of ISIS supporter accounts have been suspended since. Other jihadist groups have been targeted, but in lesser numbers.
In a forthcoming study on ISIS’s use of Twitter, commissioned by Google Ideas and to be published by the Brookings Institution’s Project on U.S. Relations with the Islamic World, technologist Jonathon Morgan and I set out to develop metrics that could define the size and function of this coordinated effort on Twitter.
While our analysis is not complete, we can confidently estimate that during the autumn of 2014, there were at least 45,000 Twitter accounts used by ISIS supporters. This figure includes accounts that were both created and suspended during the time it took us to collect the data.
Our full findings will be published in March and may reflect a higher estimate for the autumn 2014 time frame. The current size of the network has likely changed and is possibly smaller, but we are still collecting data for that assessment.
Our research began at the same time that Twitter started an aggressive campaign to suspend ISIS supporter accounts, so it reflects some of the effects of suspensions.
We found that the vast majority of ISIS supporters on Twitter, about 73 percent, had fewer than 500 followers each. During that period, we found no accounts actively supporting ISIS that possessed more than 50,000 followers, a sharp change from early 2014, when some ISIS users could be found with more than 80,000 followers.
The pace of activity – the number of tweets per day – was perhaps the most important factor in determining who would be suspended. Suspended users tweeted three times as often as those who were not suspended, and received almost 10 times as many retweets from other ISIS supporters. Suspended users averaged twice as many followers as those who were not suspended.
When users are removed from the system, we found evidence that existing users do compensate to some extent, but preliminary evidence suggests they cannot fully regenerate the network if the suspensions continue at a consistent pace.
We noted that almost 800 confirmed ISIS supporter accounts were suspended between fall 2014 and January 2015. This may be the tip of the iceberg, as we also identified almost 18,000 accounts related to the ISIS network which were suspended during the same time frame. We were not able to estimate how many of the 18,000 were ISIS supporters, but we suspect it is a significant number.
ISIS supporters on Twitter are under significant pressure, with the most active and viral users taking the brunt of the suspensions. While tens of thousands remain, ISIS supporters online call the effects of these suspensions “devastating.”
There are three important benefits to the current level of suspensions.
First, they reduce ISIS’s reach among online populations at risk of radicalization. ISIS supporters do not spring from the womb fully radicalized, and a path is required between recruiters and the vulnerable. Suspensions do not eliminate that path, but they create obstacles and increase the cost of participation.
Second, by allowing some ISIS accounts to continue with a lower profile, the current level of suspension activity preserves a substantial amount of open-source intelligence.
Third, targeting the most active members of the ISIS supporter network undercuts ISIS’s most important strategic advantage on platforms like Twitter – the 1,000 to 3,000 accounts that are, at any given time, far more active than ordinary Twitter users.
These accounts – described fully in data collected over the last two years as well as in ISIS strategy documents – act in a coordinated way to amplify ISIS’s message, tweeting links to ISIS propaganda and hashtags at an unnaturally fast pace, which causes them to place higher in search results and leads to their content being aggregated by third parties. The workings of this system will be described in substantial detail in the forthcoming book, “ISIS: The State of Terror,” by Jessica Stern and J.M. Berger.
In addition, ISIS and more recently, al Qaeda in the Arabian Peninsula, use “bots,” computer-controlled Twitter accounts that automatically send out content in a similar manner. Thousands of such bots support ISIS and other illegal ventures.
The suspensions also make ISIS vulnerable to its own tactics. For instance, ISIS critics in the Persian Gulf have recently taken to paying spammers to send out thousands of tweets criticizing ISIS, often at a higher volume than ISIS supporters.
In conclusion, I believe the current environment is approaching the right balance of pressure on ISIS networks, degrading its ability to achieve its goals while still allowing the United States to exploit open source intelligence from the network of members and supporters online.
That said, we can do better in three areas.
First, transparency. All stakeholders need to clearly understand exactly why and how a user gets suspended on social media. Companies need to communicate this better.
Second, consistency. If suspensions do not continue at a consistent pace and with consistent criteria, the targeted network will regenerate. The suspension process is akin to weeding a garden. You don’t “defeat” weeds, you manage them, and if you stop weeding, they will grow back.
Third, scope. ISIS is far from our only problem on social media. Aside from other terrorists who are already taking lessons from the Islamic State’s tactics, challenges on social media range from bullying and targeted harassment to extensive activities by foreign state-sponsored disinformation and intelligence programs.
Finally, it is important to note that no single authority exists for dealing with these issues. The concerns of corporations are different from those of governments and those of activists, and the concerns of governments and activists are wildly different around the globe.
Any approach to social media policing needs to include some consideration of our multipolar world. In our fight against terrorism, we do not wish to create precedents and authorities that would empower tyrants and repressive movements with tools to silence legitimate dissent.