The Boston Globe

Robot Ethics Won’t Clean Up Combat

Today, a new technology has emerged in war. The U.S. military has more than 7,000 unmanned systems in the air (commonly called drones) and another 12,000 on the ground. So we shouldn't be surprised that the same arguments about how technology can make war easier, safer, and cleaner are emerging once more.

Politicians have taken to describing drone strikes as "costless" or "judicious," and too often fail to recognize the consequences and complications that come with any use of force. But the same problem extends back to the lab. For instance, there has been research, largely funded by the military, on how to create an "ethical governor" for unmanned weapons. Think Watson, the artificial intelligence program that won Jeopardy, given a law degree and dropped into war. Software would constrain robotic weapons to act ethically, for example by allowing them to fire only in situations that conform to the Geneva Conventions.

The argument is that while soldiers sometimes let emotions like anger or hate get the better of them and commit atrocities, an emotionless machine would only follow its program, making war crimes less likely with robots than with humans. As one professor at work on an ethics governor project for warbots put it, "Some robots are already stronger, faster and smarter than humans. We want to do better than people, to ultimately save more lives."

Such a vision is appealing, but it is also seductive. Unfortunately, while armed robots are real, any true ethics governor for them remains "vaporware": all design concept, no reality. And even if such programs were to come to fruition, there would still be problems.

Too frequently, we hear the argument that the "fog of war" can be lifted (as the technophile thinkers who once surrounded former Secretary of Defense Donald Rumsfeld argued), whether by the perfection of our technology or the perfection of our souls. Instead, war in the 21st century shares the same qualities as war in past centuries: It is a terrible, awful mess.

So, whenever some politician or scientist asserts confidently that some new war technology will lead to less bloodshed or greater compliance with the law, we should look through a different, dirtier lens — war as it is, rather than how we wish it to be. Take these situations, all of which have happened in wars in the last decade: A sniper shoots at you from between the legs of two kneeling women, while four children sit on his back. Does a living suit of non-combatant armor mean you can’t shoot back?

A nine-year-old boy, abducted from his family and hopped up on cocaine, comes towards your checkpoint carrying an AK-47 rifle. Do you shoot the kid or wait for him to shoot you first?

An ambulance is being used to move both wounded civilians and reloads of RPG rounds for insurgents. Take it out or not?

A farmer is being blackmailed into shooting a rocket set on his farm at a nearby city. If he doesn’t pull the trigger, his family will be killed. Do you shoot at him? If so, do you have to wait for him to fire first, knowing it will kill more civilians?

None of these has an easy answer, and military officers, lawyers, and philosophers fill pages arguing over the right thing to do in these real-world situations. To claim that such problems can be solved by plugging in some bit of software is a stretch. It isn't just that precious few ethical situations translate into the simple zero-versus-one language of computer programming. It is that in the mess of modern war, the laws themselves are often unclear about what to do and, even worse, are being taken advantage of in ways their original writers didn't anticipate.

I am not opposed to new technologies like robotics, and indeed I support the U.S. military's efforts to buy more of this needed next generation of weaponry for our national security. I even support efforts to introduce more thinking about ethics into the fruits of our science. But we shouldn't be seduced into thinking that this or any advanced technology can make war something it isn't. Whether it's fought with sticks and stones or Predator drones, war is still a story of humans: our causes, our decisions, our losses, and our ethics. And, just the same, the problems of war are not going to be solved by any silver bullet.