Is the U.S. military’s futurism obsession hurting national security?

U.S. Marine Corps Lance Cpl. Skyler Stevens uses new night optics technology during Advanced Naval Technology Exercise 2018 (ANTX-18) at Marine Corps Base Camp Pendleton, California, March 19, 2018. U.S. Marine Corps/Lance Cpl. Rhita Daniel/Handout via REUTERS.

Amid a pandemic that biosecurity experts had long predicted, there has been an obsession with the kind of thinking that attempts to forecast such catastrophes. We see it everywhere—from journalism, to advertising, to defense planning—suggesting that a future world we once could only imagine is imminent. Look no further than the Economist, which advertised its annual “What If?” issue as offering “compelling predictive scenarios…with an eye to what could happen.” A recent Nike commercial opens with the word “TOMORROW” splashed across the screen, then envisions a world in which incredible athletic records are set and running shoes grow on trees. As the pace of the commercial picks up, the announcer breathlessly proclaims “I don’t know what this is but it looks amazing!” as people riding hobby horses compete to cross a finish line, apparently unaware that hobby horsing is already an organized sport.

This fascination with the future extends to the U.S. Department of Defense, which has been seeking science fiction writers to help predict the nature of tomorrow’s conflicts. Calls for proposals have asked consultants to imagine how artificial intelligence (AI) will “change how decisions are made on the battlefield.” NATO recently published a set of short stories on the future of warfare in 2036, and the Army Cyber Institute commissioned an “Invisible Force” graphic novel to explore the role of cyberattacks in a 2030 conflict scenario.

What is going on here? Is the future arriving, is humanity falling prey to the future’s power of seduction, or is this wishful thinking and cynical escapism from confronting difficult problems?

Four reasons why we are future obsessed

1. The Jetsons Effect: Prior expectations of the future were wrong or disappointing

For those of us who grew up watching The Jetsons, the future is long overdue. In the futuristic cartoon, Rosey the Robot was the Jetsons’ robotic rent-a-maid: when fully charged, she washed dishes, cooked dinner, and cleaned up after the family. Fast forward to today, and you’re bound to be disappointed: The AI-enabled Roomba—which scurries around the floor, vacuuming away dirt and debris, working around obstacles and changing direction when it encounters walls—only arrived in 2002, and then took almost 20 years to gain the sensors and improved AI needed to avoid smearing dog poop all over your floor. Even now, it is a far cry from the robotic maid we thought we’d have when we grew up.

2. The future is here-ish, but better technology is predicted to be right around the corner

Artificial intelligence is everywhere. Want to play a video game, ask Google to play you some music, or have Siri remind you about your dentist appointment? You’re already using AI. While China dwarfs the United States in the online-to-real-world implementation of automation and machine learning, in the U.S. we’ve gotten just enough of a taste to make us hunger for more.

Moreover, advanced versions of AI seem to be just around the corner, continuing to stoke dreams of truly breakthrough technology. Tesla, for example, advertises its Autopilot AI as the “Future of Driving.” It has sold “full self-driving” (FSD) capability to buyers since 2016 and promised a demonstration of a fully autonomous drive from Los Angeles to New York by the end of 2017. But in the face of social complexity, these promises have fallen far short of reality. In November 2021, nearly 12,000 Teslas had to be recalled during the FSD beta test due to “unexpected activation of the cars’ emergency braking system.” Nevertheless, according to AI Superpowers author Kai-Fu Lee, our current AI era “has set ablaze the popular imagination when it comes to AI” and “has fed a belief that we’re on the verge of achieving what some consider the Holy Grail of AI research, artificial general intelligence (AGI)—thinking machines with the ability to perform any intellectual task that a human can—and much more.”

The allure of futuristic technology, including AI, extends to the defense innovation and planning domain. The Terminator films, the first of which debuted in 1984, were early adopters of battlefield AI imagineering. After the first film, AI-enabled robots appeared as both good guys and bad guys—and the good guys always prevailed. In defense circles, the Information Age (the shift to an economy based on information technology) has been upon us since the early ‘90s. But widespread adoption of all this information, and awareness of the algorithms that act on it to produce “smartness,” is comparatively recent. The U.S. Department of Defense has now embraced initiatives that envision future technologies in a big way.

3. Catastrophic risks are more apparent

Mutations have kept COVID-19 in the headlines and at the top of most national policymakers’ agendas. The global democratic decline, led by U.S. allies, made the 2021 Summit for Democracy a front-page feature. The discovery of nearly 300 nuclear missile silos in remote areas of China, Iran’s enrichment of uranium toward weapons-grade levels, and recent tests of new nuclear weapons delivery systems by China and Russia have put nuclear weapons back in the news and raised the specter of a new arms race. Constant and catastrophic wildfires, floods, heatwaves, and deadly tornadoes remind citizens and policymakers that the effects of climate change are upon us. The variety and severity of the threats facing humanity make alternative visions of the future more urgent than ever.

4. Overcorrection is fueling our zeal

We are slowly learning not to dismiss fantastical predictions of the future—particularly in the defense domain. When the novelist Tom Clancy wrote about a terrorist piloting a plane into the Capitol in 1994’s Debt of Honor, the intelligence community dismissed the possibility of such an attack as outlandish. When, in the wake of the anthrax attacks, biosecurity experts advised on the film Contagion and helped craft a watertight storyline for a realistic pandemic scenario, it seemed entertaining rather than prophetic. Clancy was later interviewed about 9/11 as a terrorism expert, and today Contagion appears prescient. In 2015, Peter Singer co-wrote the dystopian thriller Ghost Fleet with August Cole to help the U.S. prevail in a war driven by weapons of the future.

However, much like a heart attack victim who adopts a vegan diet, defense planners and decisionmakers are now openly and methodically placing our security in the hands of futurists. With the zeal of the converted, they are relying on science fiction writers, scenario planners, and wargamers to help “get the future right” where AI is concerned.

Four problems with future obsession

So what’s the problem with envisioning the future? Don’t we need to do that to steer innovation and avoid paths that could lead to disaster? Or is the embrace of future visions simply a cynical move by complacent policymakers who are avoiding making decisions that require tradeoffs?

1. Not preparing for current crises

Future obsession can lead to a lack of preparation for contemporary dangers. Despite a raging pandemic, preparation for the near future of COVID-19 has been limited to deeply flawed technological solutions rather than ones that take social and political complexity into account. While we are assured that futuristic mRNA vaccines can be quickly adapted to new mutations, they cannot prevent those mutations from arising; only a concerted global push for vaccination can do that. Similarly, rather than doing the difficult but doable diplomatic work of arms control to keep the latest arms race from turning into a crisis, the U.S. has responded to Chinese and Russian tests of hypersonic glide weapons by advancing its own hypersonic capabilities with “national pride” at stake—failing to grasp that U.S. actions can provoke complex political reactions that spiral out of control.

2. Kicking the can down the road

One response to the threat of climate change is to argue that future generations will have better technologies to curb greenhouse gas emissions or remove them from the atmosphere—favoring technical over political and social solutions. This assertion often ignores the incentives to innovate that would be required to make it true. Imagining that the future will possess better, cheaper, and more efficient technologies allows policymakers to avoid difficult decisions, ignores the tradeoffs of business as usual, and underestimates the progress that could be made with current technologies. Approaches such as “stabilization wedges,” which envision addressing climate change with current technologies alone, could counter this potentially catastrophic procrastination.

3. The drunkard’s search

Another, more subtle, problem with future obsession is the tendency to lock on to particular scenarios, like the drunkard looking for their keys under the lamppost because the light is better there. In the 1960s, 90% of RAND’s nuclear war scenarios assumed an (unlikely) surprise attack on the U.S. homeland. Similarly, 9/11 led to a hyperfocus on unlikely airplane-based attacks, resulting in massively increased security at airports. Yet almost two years into a pandemic, there is still no vaccine mandate or testing requirement for flying. It took the CDC almost a year to focus on airborne transmission of COVID-19 rather than fomites, even though a Chinese report in early January 2020 and the Japanese experience with the Diamond Princess cruise ship indicated that airborne transmission was the primary vector for spreading the virus. U.S. pandemic preparations had envisioned influenza or bioweapons such as anthrax or smallpox, leaving the country unprepared for a novel, airborne coronavirus.

Even now, we are unprepared for future pandemics. No country scores above 80 out of 100 in the 2021 Global Health Security Index, which measures the biosecurity preparedness and capacity of 195 countries. Despite having recorded almost 800,000 deaths from COVID-19, the United States scored lower in 2021 than in 2019 due to a declining ability to prevent zoonotic disease, poor risk communication, and harmful trade and travel restrictions. Meanwhile, the current pandemic has been described by at least one science reporter as merely a “dress rehearsal” for the next one.

4. Escapism over engagement

Finally, future obsession can lead to escapism. Instead of investing in security and stability now on Earth, private entrepreneurs engage in space races, envision colonizing the Moon and Mars, and build a 10,000-year clock which “offers a pleasant distraction from the dangerous trajectory of the world we occupy today.”

Looking to the future is fundamentally a good thing to do: We need to prepare for what lies ahead. But where national security and vital interests are at stake, knowing when and how to use so-called “futurethink” for long-term planning requires judiciousness. There are risks in embracing futuristic thinking as a panacea and in being distracted each time the next shiny object comes along; we also need to distinguish better between events that are unlikely to be repeated and those that are. Futurethink can serve as a cloak for cynicism and protect vested interests—bits and circuits serving as the new bread and circuses. It cannot substitute for policymaking and should never prevent actions that can be taken today to engage with current events or prepare for future crises. Indeed, we must press policymakers to make pragmatic and potentially difficult decisions. The right approach is moderation, caution, and regular updating of our views of the future while engaging with possible solutions in the present—all with careful consideration of the systems effects our decisions will have, given the complexity of social and political life.

Amy J. Nelson is a David M. Rubenstein Fellow in the Foreign Policy program and with the Center for Strategy, Security and Technology.
Alexander H. Montgomery is an associate professor of political science at Reed College.

This article was inspired by and is dedicated to Bob Jervis.