Can Drones and Viruses be Ethical Weapons?

Editor’s Note: In a February 28, 2014 interview with Big Think, Peter W. Singer discusses how technologies like drones and cyber weapons are transforming the role of humans in war and conflict. There has been a geographic shift: human operators are less likely to expose themselves on the battlefield. And as modern weapons become increasingly autonomous, new and difficult ethical questions arise.

An enormous number of forces are changing warfare in the twenty-first century. They range from new actors in war – private contractors like the Blackwaters of the world, and child soldier groups – to technological shifts, from the introduction of robotics to cyber. One of the interesting things that ties these together is that not only the “who” of war is being expanded but also the “where” and the “when.” So one of the things that links, for example, drones and robotics with cyber weapons is that you’re seeing a shift in the geographic location of the human role.

Humans are still involved – we’re not in the world of the Terminator – but there’s been a geographic shift where the operation can be happening in Pakistan, while the person flying the plane might be back in Nevada, some 7,000 miles away. Or, on the cyber side, the software might be hitting Iranian nuclear research centrifuges, as Stuxnet did, while the people who designed it and decided to send it are again thousands of miles away. In that case it was a combined U.S.–Israeli operation.

One of the next steps in this, both on the physical side of robotics and the software side of cyber, is a shift in that human role that is not just geographic but chronological: humans are still making decisions, but they’re sending the weapon out into the world to then make its own decisions as it plays out there. In robotics, you think about this as “autonomy.”