In recent years, commercial remote sensing satellites have played a key role in dozens of human rights and war crimes investigations. They’ve been used to spot mass graves in Burundi, verify the destruction of two towns in northern Nigeria by Boko Haram, and reveal the massacre of at least 350 people by the Nigerian army. When the Kremlin denied involvement in the fighting in Ukraine in September 2014, satellite imagery and testimony gathered by Amnesty International (AI) contradicted that denial. And in April, Human Rights Watch (HRW) used satellites to document military and police abuses in Venezuela. In the last decade, AI and HRW, sometimes in partnership with the American Association for the Advancement of Science, have produced dozens of reports based on the analysis of commercial satellite imagery. One group, the Satellite Sentinel Project (SSP), was built around the use of commercial remote sensing technology.
What are the political and policy implications of the use of satellite technology by human rights organizations? First, satellites allow human rights NGOs to monitor places that are otherwise too distant or too dangerous to reach by conventional means. Second, remote sensing introduces a timeline into investigations. Because of the enormous stores of geospatial data held in archives, analysts can essentially look back in time in search of evidence. DigitalGlobe’s WorldView-3 satellite collects 1,200,000 km² of imagery of the earth each day, and it is only one of dozens of high-resolution satellites in orbit, with more coming online each year. As new satellites shrink the time between overflights, the ability to observe events as they unfold is growing.
Third, human rights NGOs are now in the business of anticipating events. With enough imagery over time, patterns emerge that allow for prediction. This offers the tantalizing possibility of NGOs intervening in events by releasing statements and images as a warning to potential aggressors that they are being observed. While all three outcomes are important, it is perhaps the third that raises the greatest ethical and policy challenges.
Before and after satellite images show mass destruction and looting of civilian property in Malakal, South Sudan. Source: Human Rights Watch
Though often debated and renegotiated, a key component of AI’s mission since its founding in 1961 is to bear witness. Professor Stephen Hopgood notes that bearing witness involves an adherence to rules and procedures that seek to “construct in practical terms the kind of space – above, beyond, outside the world – in which the idea of objective morality, of a kind of universal truth, could be anchored”. It involves taking a principled but detached position. The availability of god-like views from the heavens certainly allows AI to stand aloof – literally, “above, beyond, outside the world.” Yet with satellites, AI and other groups that make use of them now have a potent form of agency to intervene indirectly in events. As the adage says, knowledge is power. As an AI analyst told me recently, “The real purpose (of the 2007 Eyes on Darfur project) was a deterrent effect.” Eyes on Darfur was one of AI’s first major remote sensing projects.
Some might argue that AI has taken a step beyond bearing witness: it is using its moral authority and its technical prowess to alter events on the ground. A counter to this assertion would point out that human rights organizations have always used the tools available to them to alter the behavior of war criminals and human rights abusers. The “boomerang model” of human rights advocacy, developed by political scientists Margaret Keck and Kathryn Sikkink, underscores the idea that information collected by human rights NGOs is intended to pressure abusers of rights into better compliance with broadly shared norms. In this respect, there is a direct line from writing an open letter to publishing a satellite image. Yet with satellites the burdens are greater. Getting it wrong, whether by interpreting an image incorrectly or by releasing information that undermines the wellbeing of populations, raises an entirely different set of moral and ethical considerations. The use of satellite imagery brings human rights NGOs closer to sharing responsibility for rapidly unfolding events with the players themselves. With greater agency comes greater moral responsibility.
A persuasive example of this is found in SSP’s high-tempo use of satellite imagery in 2011 to affect political outcomes along what is now the border region between South Sudan and Sudan. The goal was to acquire and analyze imagery and publicize findings at a pace that would deter further aggression – usually by forces aligned with Khartoum. By broadcasting perceived indicators of impending aggression, SSP was inserting itself directly into events. As its website notes,
Through analysis of DigitalGlobe’s satellite imagery, the Satellite Sentinel Project can identify chilling warning signs — elevated roads for moving heavy armor, lengthened airstrips for landing attack aircraft, build-ups of troops, tanks, and artillery preparing for invasion — and sound the alarm.
This goal put tremendous pressure on young volunteer analysts, mostly Harvard undergraduate students, to get it right. Professional analysts with DigitalGlobe backstopped the effort, as did for a time United Nations image analysts in Geneva, but much of the work and responsibility fell to the students and a handful of paid SSP staff. The potential for getting too far ahead of what the imagery revealed, or simply getting it wrong, raised serious ethical and operational concerns for those responsible for the project. For example, miscalculating the direction of an impending attack could cause people to flee their homes and head straight into the attack.
Satellite imagery gives human rights investigators tremendously powerful new tools. Learning where to draw the line in the responsible use of imagery remains a great challenge moving forward, but remote sensing analysts working in human rights are well aware of the challenges and are committed to refining the technology’s use.