At the turn of the last century, a strange new technology began to appear in America. As a January 4, 1900 article about one of the very first sightings in the state of Florida described, “The Locomobile resembles a rubber-tired driving buggy in its outward appearance, except that no allowance is made for attaching a horse…A brake is attached to the rear axle that will stop the machine in a much shorter space than a horse can be stopped.”
The locomobile, or “horseless carriage,” caught people’s fancy and powered a huge new industry. Businesses opened up in places that ranged from Basic City, Virginia, home of the Dawson Steam Auto-Mobile, a two-cylinder runabout with single chain drive and tiller rather than a steering wheel, to the Southern Automobile Manufacturing Company of Jacksonville, which assembled five cars a day that sold for a princely sum of $400 each.
Soon, the industry rippled out in all sorts of directions. Only two years after the first car hit the roads of Florida, the first car dealership opened. This led to new endeavors in supporting areas like logistics and repair “garages,” which had ripple effects out into still other areas. For instance, just three years after the first news article on the locomobile appeared in The Florida Times-Union & Citizen, the very first newspaper advertisement for one appeared.
But this new technology also brought strange new questions, such as how to protect people from it. The first fine for “speeding” came just a year later, in 1904, when a man was arrested for endangering the lives and property of pedestrians in downtown Jacksonville. He had exceeded the 6-mile-per-hour speed limit.
The new technology also created new demands on governments, like an entire new type of infrastructure. Staying in Florida for the moment, it was in 1907 when the first of what we now call “snow birds” arrived via horseless carriage. Mr. Ralph Owen “accomplished the amazing feat of driving an Oldsmobile motorcar from New York to Florida in only 15 days.”
The reason this took so long is that no one was ready for it, especially the government. There were no real roads, at least as we think about them now, and no truly reliable maps for the pathways that did exist. Indeed, as late as 1921 the Automobile Club of America recommended that motorists traveling from New England to Florida simply bypass the entire state of Virginia because of these problems.
It wasn’t just the poor state of transportation that required a network of roads and highways to be funded but also basic issues like what safety equipment the new technology required. For example, early horseless carriages often had headlights but no turn signals. Drivers had to use hand signals to indicate their intentions to turn or slow down. A new business started selling a seeming solution, Devilseye Reflector Rings. Drivers would wear large red rings on their fingers at night so that when they held their hand outside the car the rings reflected other headlights and allowed other drivers to see the signal. Soon, this concept was replaced by the novel idea of requiring the reflector be embedded in the car rather than carried by the driver.
These stories of the early days of “horseless carriages” and “locomobiles” aren’t just fascinating; they help us frame the issues we face today with “unmanned systems” and robotics. The automobile, too, was a technology that once seemed alien, but we figured it out.
Where are we now? Robots and War
While unmanned systems have a long history dating back to Da Vinci’s designs for a robotic knight, and first emerged in war with German remote-controlled torpedo boats in the First World War, it wasn’t until just a decade ago that they truly took off. Advances in technology made unmanned systems more usable, especially through the incorporation of GPS technology that allowed such systems to locate themselves in the world. At the same time, the new conflicts that followed 9/11 drove demand. When U.S. forces first went into Afghanistan, the U.S. military had only a handful of unmanned aerial systems (UAS, also called “remotely piloted aircraft” or, more colloquially, “drones”) in the air, none of them armed, and zero on the ground. Now it has a force inventory of more than 8,000 in the air and more than 12,000 on the ground. Another example of how far the change has gone is that last year, the U.S. Air Force trained more unmanned systems operators than fighter and bomber pilots combined.
But when we think about technologies like the Predator or the PackBot, we need to remember that they are just the first generation, the Model T Fords and Wright Flyers compared to what is already in the prototype stage. We are still at the “horseless” stage of this technology, describing these technologies by what they are not rather than wrestling with what they truly are. These technologies are “killer applications” in all the meanings of the term. They are technologies that advance the power of killing. They are also technologies that have a disruptive effect on existing structures and programs. That is, they are akin to advancements like the airplane or the steam engine in allowing greater power and reach in war, but they are also akin to what iPods did to the music industry, changing it forever.
What Next? The Robotics Revolution
While many are surprised by the existing use of robotics, the pace of change won’t stop. We may have thousands now, but as one three-star U.S. Air Force general noted in my book Wired for War, very soon it will be “tens of thousands.”
But the numbers matter in another way. It won’t be tens of thousands of today’s robots, but tens of thousands of tomorrow’s robots, with far different capabilities.
One of the laws in action when it comes to technology is Moore’s Law, which holds that the computing power that fits on a microchip doubles roughly every two years. It has become shorthand for the broader exponential trends that have marked the history of technology, with capability repeatedly doubling in everything from processing speed to storage to broader patterns of innovation. If Moore’s Law holds true over the next 25 years the way it has over the last 40, then our chips, our computers, and, yes, our robots could be as much as a billion times more powerful than today. But Moore’s Law is not a law of physics. It doesn’t have to hold true. What if our technology achieves just one-thousandth of that projected change? Even in that slowed-down scenario, we’d still see a million-fold increase.
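The arithmetic behind such projections is worth making explicit. The sketch below is an illustration of the exponential math, not a calculation from the original text; the function name and the assumed doubling periods are my own. Note that a billion-fold increase over 25 years implies a doubling roughly every ten months, faster than the classic two-year chip cadence but in line with the broader innovation trends the figure invokes.

```python
import math

def fold_increase(years, months_per_doubling):
    """Total multiplicative growth after `years`,
    assuming one doubling every `months_per_doubling` months."""
    return 2 ** (years * 12 / months_per_doubling)

# A billion-fold increase requires about 30 doublings:
print(math.log2(1e9))          # ~29.9 doublings

# The classic two-year chip cadence over 25 years gives
# a few-thousand-fold increase, not a billion-fold:
print(fold_increase(25, 24))   # ~5,793x

# A billion-fold increase over 25 years implies a doubling
# roughly every 10 months (300 months / ~30 doublings):
print(fold_increase(25, 10))   # 2**30, ~1.07 billion x
```

Dividing the billion-fold outcome by 1,000 also shows why even a drastically slowed pace still yields a million-fold change: exponential growth forgives large errors in the rate assumption.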
The bottom line is that what was once only fodder for science-fiction conventions like Comic-Con is now being talked about seriously in places like the Pentagon. A robotics revolution is at hand.
We should be crystal clear here. The robot revolution happening is not the Robopocalypse that Steven Spielberg was preparing to film. It is not the type where you need to worry about the former governor of California showing up at your door, à la The Terminator.
Instead, every so often, a technology comes along that changes the rules of the game. These technologies – be they fire, the printing press, gunpowder, the steam engine, the computer, etc. – are rare but truly consequential.
The key to making technology truly revolutionary is not merely its new capabilities but its questions. Revolutionary technologies force us to ask new questions about what is possible and consider things that weren’t conceivable a generation before. But they also force us to relook at what is proper. They raise issues of right and wrong that we didn’t have to wrestle with before.
The historical comparisons that people make to the robotics revolution illustrate this. When I conducted interviews for my book, I asked people to give historical parallels to where they think we stand now with robotics. As I noted earlier with the comparison to the “horseless carriage,” many of them, especially engineers, liken where we are now with robotics to the advent of the automobile.
If the horseless carriage is the parallel, think of the ripple effects that cars had on everything from our geopolitics to our law enforcement. A group of people who were, at the time, desert nomads became crucial players in the global economy simply because they lived over a sticky black substance previously considered more of a nuisance than anything else. The greater use of that same – now crucial – resource has changed the global climate. The growing use of cars, in turn, led to new concepts that reshaped the landscape, whether through highways and suburbia, or through new social notions, like dating (teens previously could only court on parents’ front porches).
And of course a whole new world requires the establishment of new rules of the game, or rather new rules of the road. It wasn’t just a matter of fines for “speeding,” but also changes to the very structure of American law enforcement. The rise of easy cross-state crime enabled by the speed and reach of horseless carriages, such as the string of bank robberies by Bonnie Parker and Clyde Barrow, helped lead to the rise of the then Bureau of Investigation, now the modern FBI.
Others, such as Bill Gates, make a different comparison to the computer in 1980. Much like robots today, the computer back then was a big, bulky device for which we could only conceive a few functions. Importantly, the military was the main spender on computers’ research and development and a key client driving the marketplace, again comparable to the development of robots.
But soon, computers changed. They got smaller. We figured out more and more functions and applications that they could perform, both in war and in civilian life. And they proliferated. It has reached the point that we have stopped thinking of most of them as “computers.” I drive a car with more than 100 computers in it. No one calls it a “computerized car.” I have a number of computers in my kitchen. I call them things like “microwave” or “coffee maker.”
The same thing is happening with robotics – not just the changes in size and proliferation, but also the reconceptualization. Indeed, if you buy a new car today, it will come equipped with things like “parking assist” or “crash avoidance” technologies. These are kind ways of saying that we stupid humans are not good at parallel parking and too often don’t look in our blind spots. So, the robotic systems in our car will handle these things for us.
But again, just as the story of the automobile reveals more than just the shift from owning horse stables to garages, so, too, was the computer about more than never having to remember long-division tables again. What mattered were the ripple effects. The game-changing technology reshaped the modern information-rich economy, allowing billions of dollars to be made and lost in nanoseconds. It led to new concepts of social relations and even privacy. I can now “friend” someone in China I’ve never met. Of course, I may now be concerned about my niece social networking with people whom she’s never met. It became a tool of law enforcement (imagine the TV show CSI without computers) but also led to new types of crime (imagine explaining “identity theft” to J. Edgar Hoover). And it may even be leading to a new domain of war, so-called “cyber-war.”
This comparison is a striking one because it illustrates how bureaucracies often have a hard time keeping up with revolutionary change. For example, as late as 2001, the FBI director was so averse to computers that he didn’t have one in his office and never used email. Sound amazing? Well, the current Secretary of Homeland Security, who heads the agency in charge of the civilian side of American cyber-security, doesn’t use email today.
The final comparison that is made is perhaps a darker one: work on the atomic bomb in the 1940s. Scientists, in particular, talk about the field of robotics today in much the same way they talked about nuclear research back in the 1940s. If you are a young engineer or computer scientist, you will find yourself drawn towards it. It is the cutting edge. It is where the excitement is and where the research money is.
But many worry that their experience will turn out just like that of those amazing minds that were drawn towards the Manhattan Project, like a moth to an atomic flame. They are concerned that the same mistakes could be repeated – of creating something and only after the fact worrying about the consequences. Will robotics, too, be a genie we one day wish we could put back in the bottle?
The underlying point here is that too often in discussions of technology we focus on the widget. We focus on how it works and its direct and obvious uses. But that is not what history cares about. The ripple effects are what make that technology revolutionary. Indeed, with robotics, issues on the technical side may ultimately be much easier to resolve than dilemmas that emerge from our human use of them.
How Our Robots Are Changing
The first generations of aerial robots were much like the manned systems they replaced, even down to some of them looking as if the cockpit where the pilot would sit had simply been painted over. Now we are seeing an explosion of new types, ranging in size, shape, and form. With no human inside, they can stay in the air not just for hours but for days, months, and even years, some with wings the length of a football field. Alternatively, they can be as small as an insect. And, of course, they need not be modeled after our manned machines, but can instead take their design cues from nature, or even the bizarre.
The other key change is their gain in intelligence and autonomy. This is a whole new frontier. Traditionally, we’ve compared weapons based on their lethality, range, or speed. Think about the comparison between a Second World War B-17 bomber plane and a B-24 bomber plane. The B-24 could be considered superior because it flew faster, further, and carried more bombs. The same could be said in comparing the MQ-9 Reaper UAS with its earlier version, the MQ-1 Predator. The Reaper is better because it flies faster and further and carries more bombs. But the Reaper is also something else, which we couldn’t say about previous generations of weapons: It is smarter, and more autonomous. We are not yet in the world of The Terminator, where weapons make their own decisions, but the Reaper can do things like take off and land on its own, fly mission waypoints on its own, and carry sensors that make sense of what they are seeing, such as identifying a disruption in the dirt from a mile overhead and recognizing it as something that we humans call a “footprint.”
From these changes comes a crucial opening up of the user base and the functionality of robotics. Much as you once could only use a computer if you first learned a new language like “Basic,” so, too, could you once only use robotic systems if you were highly trained. To fly an early version Predator drone, for instance, you had to be a rated pilot. Now, just as my three-year-old can navigate his iPad without even knowing how to spell, so, too, can you fly some drones with an iPhone app.
The Civilian Side Opens
This greater usability opens up the realm of possible users, lowering the costs and spreading the technology even further. So, we are seeing the range of uses expand not just in the military, but also, once proved on the military side, moving over to the civilian world. Take aerial surveillance with UAS. It’s gone from a military activity to border security to police to environmental monitoring. Similarly, the notion of using a robotic helicopter to carry cargo to austere locations was first tested out in Afghanistan, but is now being looked at by logging companies.
A key step in moving this forward in the U.S. will be the integration of unmanned aerial systems into the National Airspace System (NAS) and expanded civilian use. While there has been a huge amount of energy around the topic of domestic drones, such that many politicians speak about them as if they are already “watching everything from above,” the present laws restrict civilian use. An ever-growing number of special permits, however, have been issued to domestic operators, now totaling 1,428. These early users range from small law enforcement agencies, like that of Mesa County, Colorado, which found drones cost over 90 percent less to operate than police helicopters, to universities conducting environmental research in Alaska.
Congress has set a deadline of September 2015 for the Federal Aviation Administration (FAA) to figure out how to make this happen on a more regularized basis, in essence opening up the national airspace to civilian public- and private-sector use. As part of this process, six test sites are to be created around the nation, for which some twenty states are competing. While it is unclear whether the FAA will meet the deadline, the step is coming, and with it, the next ripple effect outward in the market.
Indeed, what the opening of the civilian airspace will do to robotics is akin to what the Internet did to desktop computing. The field was there before, but then it boomed like never before. For instance, if you are a maker of small tactical surveillance drones in the U.S. right now, your client pool numbers effectively one: the U.S. military. But when the airspace opens up, you will have as many as 21,000 new clients – all the state and local police agencies that either have expensive manned aviation departments or can’t afford them.
The scale of this market is estimated to be in the tens of billions of dollars in its first years, but it is frankly too early to know where it will end up. If history is any lesson, we shouldn’t just focus on the sale of drones in roles we already know, but recognize that there are many more fields, not yet imagined, where robotics might be applied. Who saw agriculture as a field to be computerized? And yet the application of computers there has led to massive efficiency gains. So, too, does agriculture appear to be an area in which robotics will drive immense change. Agribusinesses such as Monsanto are lobbying for the use of domestic drones in roles that range from monitoring and surveillance of the fields to crop-dusting to picking and harvesting.
Impact on U.S. military
A huge set of ripple effects will emerge from the opening of the airspace to domestic drones. One is a potential role reversal: what will be the impact on the U.S. military when a technology area in which it once led blossoms on the civilian side?
Take the area of acquisitions. What happens when manufacturers have a wider set of clients than just the DoD and therefore become less responsive to its needs? If the parallel is computers, microchips, and IT networks, the U.S. military was once the leader in the research, development, and purchasing of computing. Now it often lags behind the civilian side and, indeed, in areas like microchips, can’t get makers to shift to its unique demands.
Beyond the obvious applications moved over from the military side, the real change occurs when imagination and innovation cross with profit-seeking. This is where parallels to computer or aviation history hold most, as the civilian side then starts to lead the way for the military. For instance, the idea of moving freight via airplanes was not originally a military role. It started out in 1919 with civilians. Today, it’s both a major military role (the U.S. military’s Air Mobility Command has some 134,000 members) and an industry that moves more than $10 trillion in global trade. And, yes, a number of airfreight firms are starting to explore drone air cargo delivery, from large-scale trans-oceanic movement to small movement of medical supplies or even fast food.
Similarly, what will it mean for training when more and more young service members come in with experience using the technology at home, or even see more advanced versions on the market than what they get from the Pentagon? The bottom line is that discussions of the civilian side also matter to the military.
Economic Winners and Losers: Nations and Communities
A new industry raises another ripple effect: Who will be the winners and losers?
One can certainly think about this issue on the global level. The U.S. faces the strange situation of trying to compete in a world economy where technological know-how is a key differentiator, yet having an education system that too often moves in the opposite direction. American high school students rank 23rd in science and 31st in math among wealthy nations, and the U.S. ranks 27th in the share of college graduates with degrees in science and math. And the trends aren’t improving greatly. In 2004, the number of American computer science majors was 60,000. By 2013, it had shrunk to 38,000. (It is not all bad news: we are graduating twice as many journalists.)
But the issue of winners and losers isn’t just a matter for Washington policymakers; it should have huge resonance for state and local leaders. That is, if what is playing out in robotics is comparable to the horseless carriage, who will be the next Detroit, which became the epicenter of that industry for the 20th century, and who will be like Basic City or Jacksonville, which had early automobile companies around the same period? Or, if the comparison is to computers, who will be akin to Philadelphia, a key node in the early days of computing, and who will be the robotics version of Silicon Valley?
Answering this question turns on challenging a false notion that has taken hold: that in today’s world of globalization, distance doesn’t matter. Despite our new technologies, we have repeatedly seen at the state and metropolitan level that success happens in clusters.
As Maryann Feldman writes in her study Location, Location, Location: Creating Innovation Clusters, “Grounded in place, innovation and entrepreneurship rely on an ecosystem of firms (both suppliers and customers), universities and community colleges, government agencies, and trade associations, all systematically aligned to encourage creativity and experimentation. Once started, concentrations of industries within places become self-reinforcing as talent is attracted to opportunity, the flow of ideas increases, and their potential is understood and appreciated. With that dynamic, it becomes easier and less costly for entrepreneurs to realize their dreams.”
The role of the government is central in developing these clusters. While entrepreneurship is a private-sector activity, it is public policy that sets the stage. For example, I am from North Carolina. Like that old Saturday Night Live joke, we were really happy there was a South Carolina and District of Columbia whenever the education rankings came out, as that meant we had someone to look down on from our lofty perch of 49th in the nation.
Yet, when North Carolina’s textile manufacturing economy declined, the government did something brilliant. It fostered a new “innovation cluster” centered on the Research Triangle Park, now home to more than 130 research facilities, which helped North Carolina become one of the hubs of the biotech industry. This boom then benefited the rest of the state and made it one of the fastest-growing states in the nation during this period. The success didn’t happen overnight. As Feldman noted, the policy world can nurture these kinds of success stories via “…steady and consistent state policy, investment tax credits, and quasi-governmental, sector-specific agencies.”
Job Gain and Loss
The rise of domestic robotics use holds the potential to create a number of jobs. Indeed, the AUVSI industry trade group has claimed some 70,000 new jobs will be created in just the first few years once the airspace opens up, arguing (with obvious self-interest) that the U.S. loses some $27 million per day in economic activity the longer it waits to do so.
This boom for the robotics industry, though, raises deep questions not just of which areas will win out, but also which individuals will win and lose.
Just as the horseless carriage made titans of Henry Ford and Alfred Sloan, computers created a whole new generation of billionaires and millionaires. But, of course, just like with the craftsmen before the first industrial age, there were also losers. For hundreds of years, there was a highly skilled profession of men who did mathematics for hire. They were well paid, many making the equivalent of $200,000 a year. They were called “calculators.” They have gone the way of so many other professions reshaped by new technology like the blacksmith making horseshoes or the elevator operator.
Indeed, robots have already and will continue to shape the economy both as an issue of growth and job loss. As a recent MIT study found, automation is "destroying jobs and creating prosperity," explaining both the gains in efficiency and the loss of as many as six million jobs over the last decade. Robots are a large part of the reason the automobile companies of Detroit are back, but so many automobile workers are not back to work. (Already, one in ten has been replaced by a factory line robot, with many companies across a wide array of industries planning to fully automate their assembly lines.)
Such trends mean that a part of our economy will make a great deal of money from robotics, which is why there is so much lobbying behind the area today. Last year, drone manufacturers gave $2.3 million in contributions to the House Unmanned Systems Caucus, while the industry’s trade group spent a quarter million lobbying for the FAA bill that opens up the airspace (the group proudly told donors that “Our suggestions were often taken word-for-word” in the language of the bill). But these very same trends also mean the expansion of the industry will be seen as a threat to livelihoods, further stoking tensions and underlying suspicions of the technology.
Law and Privacy
One profession, though, will certainly be busy: the lawyers.
While some say drones are no different from manned planes or fixed surveillance cameras on the street, and so raise no new privacy issues, this is incorrect on its face. There are similarities, but also fundamental differences.
To operate, a robot is always gathering and storing information about the world around it. Always. This is different from a regular plane, for example, where the human operator gathers most of this information but cannot store it for playback. A robot’s operating requirements mean that even in the course of regular operations, it is gathering and storing information about everything that crosses its path. This gives robots an advantage over human-operated planes, where a conscious decision to acquire and store data must be made. The other main advantage of unmanned systems is their ability to loiter for long periods of time, which again allows them to draw in more information and, as the ACLU’s Jay Stanley and Catherine Crump have written, also allows them to “…pose a more serious threat to privacy than do manned flights.”
Taking in vast quantities of information even unintentionally is a key part of the concern. For example, a robot on a “Where’s Waldo?” mission to hunt down one person in a city will still be gathering data on the entirety of the city throughout the search process.
Visual information is not the only type of data being gathered. Unmanned systems also carry out electronic surveillance. A drone unveiled at the DefCon hacking conference in 2011 can crack Wi-Fi networks and intercept text messages and cell phone conversations, without the knowledge or help of either the communications provider or the customer. This type of drone draws in electronic information on a wide group of people beyond the intended target, and, unlike with a computer, that group includes those who have never signed a user agreement or otherwise signaled that they accept this intrusion on their privacy.
Finally, the size and mobility of robotic systems are fundamentally different: many are being designed in ever smaller sizes, and they can move and track targets covertly when required. A robotic system can watch from above, but it can also get up close and personal, unlike a fixed security camera or a high-altitude spy plane.
These differences lie at the heart of many of the worries over domestic use of unmanned systems. Such suspicion has mobilized left-wing groups like the ACLU, but also those on the right, such as the Tea Party movement, perhaps best illustrated by the speeches and legislation of Senator Rand Paul, who has attempted, in the words of one article, to launch “a Preemptive Strike Against Domestic Drone Use.” While some 20 states from Nevada to North Carolina are competing to host the six FAA drone test sites, the anti-drone movement has crystallized into efforts to ban the use of drones in at least ten state legislatures, ranging from Virginia to Oregon. Indeed, Charles Krauthammer, a right-wing commentator on Fox News, even urged Americans to use their Second Amendment rights to shoot down drones (something already done by a group of hunters in Pennsylvania, who shot down a drone doing environmental monitoring).
With these concerns brewing, we are starting to see some steps forward in response. For instance, an industry “code of conduct” has been put forward by the same trade group that prompted the current controversy over domestic drones with its successful lobbying to open the airspace. The AUVSI code took on many of the concerns circulating, grouping them into three core themes of Safety, Professionalism, and Respect. It laid out how the industry and users would “commit” to not operating drones “in a manner that presents undue risk to persons or property;” to planning for “all anticipated off-nominal events;” and to sharing such contingency plans with “all appropriate authorities.” It made great sense and was widely reported.
The challenge for the robotics code of conduct, however, is much the same as other industries' attempts at self-regulation, ranging from banking to the private military industry. It's a laudable start, but it doesn't change the underlying issues and concerns. Like such other would-be "codes of conduct," it lacks a key ingredient: consequences.
It is a voluntary code, with no consequences for those who violate it. Indeed, much of what it lays out is actually a restatement of responsibilities that firms and users should already abide by, regardless of any code. For example, the code says that firms “will comply with all federal, state and local laws.” So, before the code, they could violate the law at will? Of course not.
But more importantly, the code is not able to deal effectively with all the areas where the law is absent or vague. It says that "We will ensure that UAS are piloted by individuals who are properly trained and competent to operate the vehicle or its systems." Who will determine this, and what does "trained and competent" mean in a world where some believe drones should only be operated by rated pilots, even though new versions can be flown by teens using iPhone apps?
Likewise, the code pledges to "respect the privacy of individuals," which is a bold statement. But "Respect" could be anything from avoiding the monitoring of individuals without their express permission to showing them "respect" only in the public-relations sense.
Of course, these are thorny issues. Indeed, it’s their very thorniness that makes an industry self-regulatory code the beginning of the discussion, not the final answer.
The Police Weigh In
The same could be said of a push by police chiefs, who have offered a code of conduct for their use of drones. This effort asserted that police would not let any images captured by unmanned aerial vehicles be open to public inspection, and that the images would not be stored, unless they were evidence of a potential crime or part of an ongoing investigation. Of course, that’s a pretty large out clause.
More importantly, the police chiefs’ effort is a proposal, not yet policy, with some huge gaps. Even worse, it has a big dose of unrealism. For instance, it suggests that police would use a “Reverse 911 telephone system to alert those living and working in the vicinity of aircraft operations. If such a system is not available, the use of patrol car public address systems should be considered.”
In reality, such a system would be unworkable and even laughable. Each and every time a UAS flies, the police are going to call all of an area’s residents’ home phones (setting aside the growing number who only have mobile phones)? Or, alternatively, are the police planning to ensure public awareness of potential privacy losses by recreating the scene from the movie The Blues Brothers, driving through the streets yelling out on a car’s bullhorn?
Similarly, we have next-order questions, like whether drones should be armed. This is cast aside quickly in the proposed codes, but again, definition and context matter. Law enforcement in Texas has shown interest in unmanned aerial systems armed with a shotgun that shoots “less than lethal” rounds. One person’s shotgun or taser is another person’s unarmed drone.
The bottom line is that, as with revolutionary inventions of the past, no amount of handwringing by pundits late to the game will see a technology of such great promise banned.
Instead, a revolutionized world requires the establishment of new rules, which in turn requires an understanding of the new technology. Much of the substance of these rules will likely come from both public discourse and the private sector. For example, the origins of the modern way we drive can be found in Rules of the Road, published in 1903 by William P. Eno. Known as “the father of traffic safety,” Eno filled his book with such revolutionary ideas as cars passing only on the left, stoplights and one-way streets. (Ironically, he never drove himself; he was always chauffeured.)
We are seeing a similar evolution now, whether in the development of industry codes of conduct or guidelines for university research groups. But much like the early “rules of the road,” these will need enforceable laws to make them real. Early cars and planes, for instance, needed more than Eno’s book – mainstream use of these inventions demanded the drafting of traffic laws and the creation of regulatory institutions like the Federal Aviation Administration. Similarly, the increasing use of unmanned systems has highlighted a gap at the state and federal level that demands action.
Where the Law Goes Next
As these laws are hopefully built by Congress, we need to recognize that much of what is written in the law is just the first draft. For instance, federal district court judges have spoken about how, much like with computers and the privacy questions they created, questions over the proper use of drones by law enforcement will end up as Supreme Court cases.
But even then, what will the Court decide? A case frequently cited as a potential precedent is 2001's Kyllo v. United States. In that instance, a federal government agent used a "thermal imaging device" to scan a home in Florence, Oregon. He did not have a warrant, but the scan allowed him to learn that marijuana was being grown inside.
When the case made its way up to the Supreme Court, the majority opinion, written by Justice Scalia, was that when the "government uses a device that is not in general public use, to explore details of a private home that would previously have been unknowable without physical intrusion, the surveillance is a Fourth Amendment 'search,' and is presumptively unreasonable without a warrant."
Many point to this as evidence that the Supreme Court will be less likely to approve intrusive domestic use of drones by police. But they ignore the caveat. What about when a technology comes into “general public use,” as drones are now doing?
Similarly, last January in the U.S. v. Jones case, the Supreme Court ruled that placing a GPS tracking device on a vehicle is considered a search under the Constitution and requires a warrant. Notably, though, it was the physical placement of the GPS on the vehicle that mattered most. The Court said other evidence obtained without using the GPS device was admissible because the suspect had no "reasonable expectation of privacy" for a vehicle on the public streets. One way to read it is that your car can’t be tracked without a warrant; another is that your car can be tracked without a warrant, just as long as the police don’t place anything on the vehicle, a step that current technology no longer requires.
The underlying point is that the precedents cited with certainty by analysts and lawyers are often not as clear as they might seem. And, when there are questions, or even potential abuses, it will be years before the legal system resolves them. The GPS case happened in 2005, but didn’t get resolved until 2012, well after the technology of a physical tracker was no longer needed. Moreover, just because the Supreme Court ruled one way doesn’t mean it won’t rule differently on a very similar issue at a different time. As everything from voting rights to abortion rulings demonstrates, all it takes to reorder the law is a few seats changed on the court. Neither technology nor laws are written in stone, and justices don’t live forever.
The spread of innovation in robotics represents another trend of both opportunity and peril. An ever-wider set of users is innovating for all sorts of positive purposes with robotics, from the great work being done by young students at robotics labs at McGill University to the team in Australia that built an autonomous drone to help find lost hikers.
But not all of the people behind machines have only the best in mind. Take the traditional notion of using a robotic drone for surveillance. The new users have not just been militaries or police, but have also been civilians. These include news journalists who have reported on natural disasters with drones, as well as parents who want new ways to watch their kids. A father in the U.S. gave new meaning to the term “helicopter parent,” using an automated quadcopter drone to escort his child to the school bus stop.
The problem is that each and every technology has its darker side. The technology is enabling a new field of drone journalism (already taught at University of Nebraska and University of Missouri) that reports important stories with a whole new level of fidelity. But the same phenomenon also advances the field of paparazzi. For instance, Gary Morgan, chief executive officer of Splash News, a celebrity-photo agency, has already said he’d like to be buzzing his quarry soon with silent, miniature drones mounted with tiny cameras: “It would strike fear in the hearts of every celebrity having a birthday party.” And, one has the sense that the child may end up telling a therapist one day about his father loving him a bit too much, to the extent of following him with a drone.
More seriously, just as software has gone “open source,” so has warfare. Robotics is not a technology like the atomic bomb or aircraft carrier, where only the great powers can build and use it effectively. Instead, just like with the “app” in the field of software, it is not just the big boys who control the field. The barriers to entry are not exceptionally high, and that means that bad actors will be able to gain and use this advanced technology.
If history is any guide, the repurposing of a low-entry revolutionary technology tends to happen fairly quickly. The first car bomb was set off as early as 1905, used in an assassination attempt on the Ottoman sultan. Similarly, the first hijacking of a plane took place in 1931, very early in civilian air travel.
A particular area of concern, then, is the use of robotic systems by terrorists and other non-state actors. Israel as a state has long used drones, and now so does its non-state opposition. Hezbollah, for example, is not a major state military, but it has already operated UAVs, as has Hamas.
The impact of this trend is twofold. The first is that it reinforces the empowerment of individuals and small groups against the power of the state. During the Second World War, for example, Hitler’s entire Luftwaffe could not manage to reach across the Atlantic to strike at Canada or the U.S. Just a few years ago, a blind 77-year-old man managed to build his own drone that flew itself across the Atlantic.
And one man’s hobby may be another man’s plot. In 2011, the FBI arrested Rezwan Ferdaus, a man who wanted to recreate the 9/11 attacks (not so ironically, he had been angered by drone attacks in the Mideast intended to stop terrorism). Unable to hijack planes, he instead obtained a large drone and planned to fly it into the Pentagon. Fortunately, he made the mistake of asking an FBI informant where he could obtain C-4 explosives. The plot was averted, but it showed we are now in a world where it is easier to get the drone than the bomb.
This greater reach and power may also see a lowering of the bar. One does not have to be suicidal to carry out attacks that previously might have required one to be so. This allows new players into the game, making al-Qaeda 2.0 and the next-generation version of the Unabomber or Timothy McVeigh far more lethal.
Just as car bombs are not the only way automobile technology has been misused, we should not make the mistake of focusing only on terrorism when it comes to the potential criminal uses of robotics. The early horseless carriage may have been reworked into a car bomb by turn-of-the-century terrorists, but its main illegal use was as a getaway device for criminals. Similarly, the best example of innovation in the field of robotics last year might be the team of thieves in Taiwan who used tiny helicopters equipped with pinhole cameras to carry out a jewellery heist. They made away with $4 million worth of loot before being caught.
The challenge for the law is not just how to prevent bad guys from doing bad things, but what to do when things go wrong without anyone having bad intent, such as when the Google car was in a wreck in August 2011. Like most wrecks, the various sides involved blamed each other, only now they did so via online social networks.
Now take these issues and move them into the air. Congressional investigators report that there were over 200 drone accidents in Iraq and Afghanistan over the course of four years. This doesn’t include the many that happened in the not-so-covert world of strikes in Pakistan and Somalia. Perhaps the most amusing, but also perhaps the scariest, case took place at a base in Djibouti in March 2011.
As The Washington Post reported, a Predator parked at Camp Lemonnier started its engine without any human direction, even though the ignition had been turned off and the fuel lines closed. Technicians concluded that a software bug had infected the “brains” of the drone, but never pinpointed the problem. “After that whole starting-itself incident, we were fairly wary of the aircraft and watched it pretty closely,” the Air Force squadron commander testified to an investigative board.
The issue here isn’t that Predators are poised to take over the homeland, but rather another vexing question of law, politics, and ethics. Robotics has a long history of what one vice president of a technology firm described to me as “oops moments.” These are when things don’t work out with your machine as planned and you have to take it back from the field. With military robotics, the examples range from the machine-gun-armed UGV that went “squirrelly” and started spinning around during a demonstration to the automated anti-aircraft system in South Africa that had what investigators thought was a “software glitch” during a training exercise. It shot nine soldiers by accident, in a real-world version of the famous scene from RoboCop.
Today, these oops moments might even be intentionally caused by hostile man-made threats, including criminal or adversarial efforts at UAS communications interference or hacking. Here again, this scenario is not science fiction, but was recently demonstrated in a test in Texas, where a university team hacked the navigation system of a drone.
The issues this phenomenon presents are not just how to avoid such moments through technology improvements and deconfliction protocols, but also the more vexing questions of process, policy, and even philosophy. How do we investigate and apportion accountability in a realm where more and more happens outside our old concepts of control and responsibility?
For instance, aviation law and insurance right now focus on determining whether the problem was a hardware error (a widget broke), a wetware error (the human pilot made a mistake), or spiritual (an “Act of God” caused the loss). Now we have much in between: the role that software plays. And in the software field, responsibility and accountability are not easily assigned. They can be stretched over the long period between design and use, across the large numbers of people involved in writing, selling, buying and maintaining software, by a business approach that often lets the customer find the errors, and by the fact that software is repeatedly put into real-world circumstances for which it wasn’t originally designed. In short, we have to figure out how to catch our 20th-century laws up with our 21st-century technologies.
The Psychology Side
The irony in all this is that while the future may involve more and more machines watching us, whether it is police watching city streets from above, or the NSA reading your email, or your phone letting Starbucks know you are walking nearby, how we react to it will still be driven by the very fuzzy combination of our human programming, our identity and emotions: the chemical makeup that drives human psychology.
So what, then, will be the reaction to this next step in the surveillance state? Will we redefine our notions of privacy, reacting the way teenagers have handled their online behavior on Facebook and Twitter? Who cares if all my behavior is shared with the world? Instead, I’ll embrace a loss of privacy that would have shocked my parents’ generation, and even mock it. One can already see this in the new offerings of anti-drone “stealth clothing” for any “style-conscious” terrorists the U.S. seeks, as well as “fashionistas who value their privacy.” As its designer told the media, it also doesn’t fall along clear partisan lines, making it the Project Runway version of Rand Paul’s filibuster: “It interests people on the far right as much as it interests people on the far left. Ultra-conservatives see it as anti-government and ultra-liberals see it as anti-military.”
Or will we fear it? And is that a good or bad thing? Some, such as one senior State Department official, believe that our unmanning of war “…plays to our strength. The thing that scares people is our technology.” The carryover of this belief to the domestic side is the notion that a world of more drones will be a safer world, via deterrence: Where’s Waldo won’t mug me if he knows he’ll be caught on screen.
But the psychology of scaring people with technology is a tricky business. It’s the domestic version of the problem we face in our counterterrorism today. Abroad, the U.S. government is wrestling with the robot’s impact on the very human “war of ideas” that we are fighting against radical movements. U.S. troops in Afghanistan describe having drones overhead as reassuring, saying they can sleep better because they feel someone is always above, watching out for them. On the other hand, many civilians there say it’s intrusive, and creates a climate of fear and distrust. Similarly, on the domestic side, the risk is that robotic surveillance will be perceived as an intrusive “Big Brother” figure, as the Russian police who have already used drones to monitor protesters have been called.
The broader issue might not be one of fear, however, but a redefinition of how those who are watched look at the watchers. There is the potential that the drone could become emblematic of those trying to police people they don’t know, on the cheap, from afar. The drone becomes like the cameras favored by the disconnected Baltimore police force of the TV show The Wire, who watch a world of crime play out that they don’t understand.
The ripple effects of robotics will continue to push out into all sorts of domains, in ways both expected and unexpected. Through it all, though, one fundamental principle will hold true as it has in the past: There are always two sides to technological revolutions. From our new technologies we gain amazing capabilities that seem like they are straight from science fiction. But from our new technologies we also gain new human dilemmas that seem like they are straight from science fiction. Moore’s Law is operative, but so is Murphy’s Law.
The issues of domestic “drones” all seem futuristic, but notice how none of the examples that were explored in this article were from the distant future. The questions they raise are fundamental policy questions of today. We can ignore them, or we can embrace and engage in the opportunities and dilemmas of these exciting times.
The above paper includes sections explored in the article The Robotics Revolution, for the Canadian International Council. The author would also like to thank the Christopher Newport University Center for American Studies.