Op-Ed

The Robotics Revolution

Peter W. Singer

Whether it is a report about the latest drone strike into Pakistan or an awesome web video of a cute robot dancing in the latest style, it seems like robots are taking over the world, figuratively if not yet literally. But behind their growing presence in the news is perhaps something bigger: a story that is reshaping the history of war and politics, and even of humanity.

Where are we now?

While unmanned systems have a long history, dating back to da Vinci’s designs for a robotic knight and including things like German remote-controlled torpedo boats in the First World War, it wasn’t until just a decade ago that they truly took off in war. Advances in technology made unmanned systems more usable, especially through the incorporation of GPS technology that allowed such systems to locate themselves in the world. At the same time, the new conflicts that followed 9/11 drove demand. When US forces first went into Afghanistan, the U.S. military had only a handful of unmanned aerial systems (UAS, also called “remotely piloted aircraft” or, more colloquially, “drones”) in the air, none of them armed, and zero on the ground. Now it has a force inventory of more than 8,000 in the air and more than 12,000 on the ground. Another example of how far the change has gone is that last year, the U.S. Air Force trained more unmanned systems operators than fighter and bomber pilots combined.

But when we think about technologies like the Predator or the PackBot, we need to remember that they are just the first generation, the Model T Fords and Wright Flyers compared to what is already in the prototype stage. We are still at the “horseless carriage” stage of this technology, describing these technologies by what they are not, rather than wrestling with what they truly are. These technologies are “killer applications” in all the meanings of the term. They are technologies that advance the power of killing, but also have a disruptive effect on existing structures and programs. That is, they are akin to advancements like the airplane or the steam engine in allowing greater power and reach in war, but they are also akin to what iPods did to the music industry, changing it forever.

What Next? The Robotics Revolution

While many are surprised by the existing use of robotics, the pace of change won’t stop. We may have thousands now, but as one three-star U.S. Air Force general noted in my book Wired for War, very soon it will be “tens of thousands.”

But the numbers matter in another way. It won’t be tens of thousands of today’s robots, but tens of thousands of tomorrow’s robots, with far different capabilities.

One of the laws in action when it comes to technology is Moore’s Law: the computing power that can fit on a microchip doubles roughly every two years. It has become an encapsulation of broader exponential trends in technology that have occurred through history, with technology constantly doubling upon itself in everything from power to storage to broader innovation patterns. If Moore’s Law holds true over the next 25 years, the way it has held true over the last 40 years, then our chips, our computers, and, yes, our robots will be as much as a billion times more powerful than today. But Moore’s Law is not a law of physics. It doesn’t have to hold true. What if our technology instead moves at just 1/1,000th of that historical pace? In this slowed-down scenario, we’d still see a mere million times the change.

The bottom line is that what was once only fodder for science-fiction conventions like Comic-Con is now being talked about seriously in places like the Pentagon. A robotics revolution is at hand.

I should be clear here. The robot revolution happening is not the Robopocalypse that Steven Spielberg is preparing to film. It is not the type where you need to worry about the former governor of California showing up at your door, à la The Terminator.

Instead, every so often, a technology comes along that changes the rules of the game. These technologies – be they fire, the printing press, gunpowder, the steam engine, the computer, etc. – are rare, but truly consequential.

The key to what makes a technology revolutionary is not merely its new capabilities, but the new questions it raises. Truly revolutionary technologies force us to ask what is possible that wasn’t possible a generation before. But they also force us to re-examine what is proper. They raise issues of right and wrong that we didn’t have to wrestle with before.

The historical comparisons that people make to the robotics revolution illustrate this. When I conducted interviews for my book, I asked people to give historical parallels to where they think we stand now with robotics. As I noted earlier with the comparison to the “horseless carriage,” many of them, especially engineers, liken where we are now with robotics to the advent of the automobile. Indeed, at this stage of the last century, Ford was selling fewer than 1,000 cars a year. Within a decade, especially spurred on by the military proving ground of the First World War, it was selling a million a year.

If the horseless carriage is the parallel, think of the ripple effects that cars had on everything from our geopolitics to our law enforcement. A group of people who were, at the time, desert nomads became crucial players in the global economy simply because they lived over a sticky black substance previously considered more of a nuisance than anything else. The greater use of that same – now crucial – resource has changed the global climate. The growing use of cars, in turn, led to new concepts that reshaped the landscape, whether through highways and suburbia, or through new social notions.

Others, such as Bill Gates, make a different comparison, to the computer in 1980. Much like robots today, the computer back then was a big, bulky device for which we could only conceive a few functions. Importantly, the military was the main funder of computer research and development and a key client driving the marketplace, again comparable to robotics today.

But soon, computers changed. They got smaller. We figured out more and more functions and applications that they could perform, both in war and in civilian life. And they proliferated. It soon got to the point that we stopped thinking of most of them as “computers.” I drive a car with more than 100 computers in it. No one calls it a “computerized car.” I have a number of computers in my kitchen. I call them things like “microwave” or “coffee maker.”

The same thing is happening with robotics – not just the changes in size and proliferation, but also the reconceptualization. Indeed, if you buy a new car today, it will come equipped with things like “parking assist” or “crash avoidance” technologies. These are kind ways of saying that we stupid humans are not good at parallel parking and too often don’t look in our blind spots. So, the robotic systems in our car will handle these things for us.

But, again, just as the story of the automobile was about more than the shift from owning horse stables to garages, so, too, was the computer about more than never having to remember long-division tables again. What mattered, again, were the ripple effects. The game-changing technology reshaped the modern information-rich economy, allowing billions of dollars to be made and lost in nanoseconds. It led to new concepts of social relations and even privacy. I can now “friend” someone in China I’ve never met. Of course, I may now be concerned about my niece social networking with people whom she’s never met. It became a tool of law enforcement (imagine the TV show CSI without computers), but also led to new types of crime (imagine explaining “identity theft” to J. Edgar Hoover). And it may even be leading to a new domain of war, so-called “cyber-war.”

This comparison is a striking one because it illustrates how bureaucracies often have a hard time keeping up with revolutionary change. For example, as late as 2001, when computers were obviously important, the director of the FBI was so averse to them that he didn’t have one in his office and never used email. Sound amazing? Well, the current U.S. secretary of Homeland Security, the department in charge of the civilian side of American cyber-security, doesn’t use email today, in 2012.

The final comparison that is made is perhaps a darker one. It is to the work on the atomic bomb in the 1940s. Scientists, in particular, talk about the field of robotics today in much the same way they talked about nuclear research back in the 1940s. If you are a young engineer or computer scientist, you will find yourself drawn towards it. It is the cutting edge. It is where the excitement is, and where the research money is.

But many worry that their experience will turn out just like that of those amazing minds that were drawn towards the Manhattan Project, like a moth to an atomic flame. They are concerned that the same mistakes could be repeated – of creating something and only after the fact worrying about the consequences. Will robotics, too, be a genie we one day wish we could put back in the bottle?

The underlying point here is that too often in discussions of technology we focus on the widget. We focus on how it works and its direct and obvious uses. But that is not what history cares about. The ripple effects are what make that technology revolutionary. Indeed, with robotics, issues on the technical side may ultimately be much easier to resolve than dilemmas that emerge from our human use of them.

How Our Robots Are Changing

The first key ripple effect with robotics is the diversification of the field and expansion of the market itself.

The first generations of aerial robots were much like the manned systems they were replacing, even down to some of them having what looked like a painted-over cockpit where the pilot would once have sat. Now we are seeing an explosion of new types, ranging in size, shape, and form. With no human inside, they can stay in the air not just for hours but for days, months, and even years, some with wings the length of a football field. Alternatively, they can be as small as an insect. And, of course, they need not be modelled after our manned machines, but can instead take their design cues from nature, or even the bizarre.

The other key change is their gain in intelligence and autonomy. This is a whole new frontier for weapons development. Traditionally, we’ve compared weapons based on their lethality, range, or speed. Think about the comparison between two Second World War bombers, the B-17 and the B-24. The B-24 could be considered superior because it flew faster and further and carried more bombs. The same could be said in comparing the MQ-9 Reaper UAS with its predecessor, the MQ-1 Predator. The Reaper is better because it flies faster and further and carries more bombs. But the Reaper is also something else, which we couldn’t say about previous generations of weapons: It is smarter and more autonomous. We are not yet in the world of The Terminator, where weapons make their own decisions, but the Reaper can do things like take off and land on its own, fly mission waypoints on its own, and carry sensors that make sense of what they are seeing, such as identifying a disruption in the dirt from a mile overhead and recognizing it as something that we humans call a “footprint.”

From these changes comes a crucial opening up of the user base and the functionality of robotics. Much as you once could only use a computer if you first learned a programming language like BASIC, so, too, could you once only use robotic systems if you were highly trained. To fly an early-model Predator drone, for instance, you had to be a rated pilot. Now, just as my three-year-old can navigate his iPad without even knowing how to spell, so, too, can you fly some drones with an iPhone app.

This greater usability opens up the realm of possible users, lowering the costs and spreading the technology even further. So, we are seeing the range of uses expand not just within the military, but also, once proven on the military side, spreading over to the civilian world. Take aerial surveillance with UAS. It’s gone from a military activity to border security to police work to environmental monitoring. Similarly, the notion of using a robotic helicopter to carry cargo to austere locations was first tested out in Afghanistan, but is now being looked at by logging companies.

A key step in moving this forward in the U.S. will be the integration of unmanned aerial systems into the National Airspace System (NAS) and expanded civilian use. Congress has recently set a deadline of 2015 for the Federal Aviation Administration (FAA) to figure out how to make this happen. While it is unclear if the FAA will meet that deadline, the step is coming, and with it, the next ripple effect outwards in the market.

Indeed, what the opening of the civilian airspace will do to robotics is akin to what the internet did to desktop computing. The field was already there, but then it boomed like never before. For instance, if you are a maker of small tactical surveillance drones in the U.S. right now, your client pool numbers effectively one: the U.S. military. But when the airspace opens up, you will have as many as 21,000 new clients – all the state and local police agencies that either have expensive manned aviation departments or can’t afford them.

Beyond the obvious applications moved over from the military side, the real change occurs when imagination and innovation cross with profit-seeking. This is where the parallels to computer or aviation history hold most strongly, as the civilian side then starts to lead the way for the military. For instance, the idea of moving freight via airplanes was not originally a military role. It started out in 1919 with civilians. Today, it’s both a major military role (the U.S. military’s Air Mobility Command has some 134,000 members) and an industry that moves more than $10 trillion in global trade annually. And, yes, a number of airfreight firms are starting to explore drone air cargo delivery.

If history is any lesson, there are many more ways we don’t yet know of that robotics might be applied to other fields. Who saw agriculture as a field to be computerized? And yet the application of computers has led to massive efficiency gains. So, too, does agriculture appear to be an area in which robotics will drive immense change, from surveillance of the fields to crop-dusting to picking and harvesting.

The Global Revolution

As this progress in robotics plays out, it leads to more ripple effects, notably on the global level. While this is a robotics revolution, it will not be solely an American revolution.

The U.S. is certainly ahead now in this revolution, and well it should be, given that it outspends the rest of the world on military research and development.

There is a rule, however, in both technology and war that means the U.S. should not rest on its laurels: There is no such thing as a permanent first-mover advantage. Companies like IBM and Commodore may have once led the world of computing, but their wares likely don’t sit on your desk today. Similarly, the British may have invented the tank in the First World War, inspired by an H.G. Wells short story about “Land Ironclads.” But it was the Germans who figured out how to use them better in the Blitzkrieg of the Second World War.

Today, there are more than 50 other countries building, buying, and using military robotics of some sort. They range from close allies like Canada and the United Kingdom to potential adversaries like Iran, China, Russia, and Pakistan. Indeed, China has gone from having no UAS under development just a few years back to showing off well more than 25 different models of Chinese-made drones at its tradeshows, ranging from the Predator-like “Pterodactyl” to a stealthy, lethal-looking “Dark Sword.”

Battles of Ideas and Persuasion

The introduction of a revolutionary technology brings new races for ideas and new interactions of knowledge, power, and communication. In the case of robotics, a fascinating new intersection has emerged between intellectual-property rights and defence studies.

Robotics is now a field critical to security and industry, akin to the car, the computer, or the atomic bomb before it, so we are unsurprisingly seeing attempts at stealing information for copying abroad. The examples of this already range from advanced persistent threats in the cyber-security space targeting the secrets of major defence manufacturers to a salesman for a small robotics maker I spoke with, who happened to see a clone of his firm’s ground robot being sold at an Asian arms fair.

Beyond the stealing of design secrets, unmanned systems have also opened a competition to reach into the communications of the machines themselves. In Iraq, insurgents managed to hack into the video feed of U.S. military drones – in effect, the equivalent of a robber listening in on a police radio scanner. What is even more notable is that the insurgents were able to do so using a $29 piece of software they had obtained from a Russian website. It had originally been designed to allow college kids to illegally download movies online.

As we use more and more systems that are digitally controlled, where a human is not physically inside, we will see a new stage of this race open. The battle is no longer just for design secrets and access to communications, but also for control. We are entering an era of battles of persuasion.

This is a fundamental shift. We have never been able to “persuade” a weapon to do what its owner didn’t want. You could never change the direction of a bullet or arrow in mid-flight. Now you can do the equivalent. The goal then moves from seeking only to destroy the enemy’s plane or tank to co-opting it, “persuading” it to do things its original owners wouldn’t want. “Recode all allied soldiers as enemies, and all enemy soldiers as friendly.” A human would ask why, needing motivation to change his or her ways. With the proper access, a computer will just comply.

Privacy and the Law

A computer will also not ask for an explanation when tasked with surveillance. While some say drones are no different from manned planes or surveillance cameras and so raise no new privacy issues, this is incorrect. There are many similarities, but also fundamental differences.

To operate, a robot is always gathering and storing information about the world around it. Always. This is different from a regular plane, for example, where the human operator gathers most of this information but cannot store it for playback. A robot’s operating requirements mean that even in the course of regular operations, it is gathering and storing information about everything that crosses its path. This gives robots an advantage over human-operated planes, where a conscious decision to acquire and store data must be made. The other main advantage of unmanned systems is their ability to loiter for long periods of time, which again allows them to draw in more information than manned systems. Taking in vast quantities of information happens unintentionally – a robot on a “Where’s Waldo?” mission to hunt down one person in a city will still be gathering data on the entirety of that city throughout the search process.

Visual information is not the only type of data being gathered. Unmanned systems also carry out electronic surveillance. A drone unveiled at the DefCon hacking conference last year can crack Wi-Fi networks and intercept text messages and cell phone conversations, all without the knowledge or help of either the communications provider or the customer. This type of drone draws in electronic information on a wide group of people beyond the intended target, including those who have not signed a user agreement or otherwise signaled they accept this intrusion upon their privacy.

Finally, the size and mobility of robotic systems are fundamentally different: they are being designed in increasingly smaller sizes, and they are able to move and track targets covertly when required. A robotic system can watch from above, but it can also get up close and personal, unlike a fixed security camera or a high-altitude spy plane.

These differences lie at the heart of much of the suspicion of domestic use of unmanned systems. Such suspicion has been encouraged by the American Civil Liberties Union and by right-wing commentators on Fox News, who have urged Americans to use their Second Amendment rights to shoot down drones (something already done by a group of hunters in Pennsylvania, who shot down a drone doing environmental monitoring).

As with revolutionary inventions of the past – like the horseless carriage and manned airplanes – no amount of handwringing or fear mongering by pundits late to the game will lead to a ban on technology of such great promise.

Instead, a revolutionized world requires the establishment of new rules, which in turn requires an understanding of the new technology. Much of the substance of these rules will likely come from public discourse and the private sector. For example, the origins of the modern way we drive can be found in Rules of the Road, published in 1903 by William P. Eno. Known as “the father of traffic safety,” Eno filled his book with such then-revolutionary ideas as cars passing only on the left, stoplights, and one-way streets. (Ironically, he never drove himself; he was always chauffeured.)

We are seeing a similar evolution now, whether in the development of industry codes of conduct or guidelines for university research groups. But much like the early “rules of the road”, these will need enforceable laws to make them real. Early cars and planes needed more than Eno’s book – mainstream use of these inventions demanded the drafting of traffic laws and the creation of regulatory institutions like the Federal Aviation Administration. Similarly, the increasing use of unmanned systems has highlighted a gap at the state and federal levels that demands action.

The Psychological Side

There is a degree of irony to all the calls for regulation. Our reactions to drones policing city streets from above, computers at the National Security Agency reading emails, and smartphones letting Starbucks know when there’s a potential customer walking nearby are still mostly determined by the very fuzzy combination of our identity and our emotions – by the DNA coding and chemical makeup that drives human psychology.

What then will be the reaction to the intensification of the surveillance state? Will we respond similarly to those teenagers given access to Facebook and Twitter who couldn’t care less that the world is watching and who have embraced the system to the point of overload? Or will we respond with fear? And how will our response to being watched impact the way we look at the human operators behind the robots?

We are facing the domestic version of the problem confronting our counterterrorist efforts abroad – the impact of robots on the very human “war of ideas”.

We need to consider what message we think we are sending with our robot watchers versus the one publics are actually receiving, and the range of impacts that message can have. U.S. troops in Afghanistan describe unmanned systems as reassuring, saying that they can sleep better because they feel like someone is always overhead, watching out for them. On the other hand, many Afghan civilians fear and distrust them.

Some, such as one senior State Department official, believe that our unmanning of war “… plays to our strength. The thing that scares people is our technology.” Their idea is that the technology has a deterrent value precisely because it is scary.

But the psychology of scaring people with technology is a tricky business. There is the risk that robotic surveillance will instead be perceived as an intrusive “Big Brother” figure, as the Russian police who used drones to monitor protesters have been called. Or, the machines might be seen as emblematic of those trying to police people they don’t know, on the cheap, from afar. The drone becomes like the cameras favoured by the disconnected and corrupt Baltimore police force of the TV show The Wire, who watch a world of crime play out that they don’t understand.

User Questions

The spread of innovation in robotics represents another trend of both opportunity and peril. An ever-wider set of users is innovating for all sorts of positive purposes with robotics, from the great work being done by young students at robotics labs at McGill University to the team in Australia that built an autonomous drone to help find lost bushwalkers.

But not all of the people behind the machines have only the best intentions. Take the traditional notion of using a robotic drone for surveillance. The new users have not just been militaries or police, but also civilians. These include news journalists who have reported on natural disasters with drones, and even parents who want new ways to watch their kids. A father in the U.S. gave new meaning to the term “helicopter parent,” using an automated quadcopter drone to escort his child to the school bus stop.

The problem is that each and every technology has its darker side. The same field of drone journalism that reports important stories with a whole new level of fidelity also advances the field of paparazzi. For instance, Gary Morgan, chief executive officer of Splash News, a celebrity-photo agency, has already said he’d like to be buzzing his quarry soon with silent, miniature drones mounted with tiny cameras: “It would strike fear in the hearts of every celebrity having a birthday party.” And one has the sense that that child may end up telling a therapist one day about a father who loved him a bit too much, to the point of following him with a drone.

Open Source

More seriously, just as software has gone “open source,” so has warfare. Robotics is not a technology like the atomic bomb or aircraft carrier, where only the great powers can build and use it effectively. Instead, just like with the “app” in the field of software, it is not just the big boys who control the field. The barriers to entry are not exceptionally high, and that means that bad actors will be able to gain and use this advanced technology.

If history is any guide, the repurposing of a low-entry revolutionary technology tends to happen fairly quickly. Indeed, the first car bomb was set off as early as 1905, used in an assassination attempt on the Ottoman sultan. Similarly, the first hijacking of a plane took place in 1931, very early in civilian air travel.

A particular area of concern, then, is the use of robotic systems by terrorists and other non-state actors. Israel as a state has long used drones, and now so has its non-state opposition. Hezbollah, for example, is not a major state military, but it has already flown drones against Israel.