Parts of the nation’s commentariat have been seized, in recent months, with a nasty bout of technophobia. Technophobia is a psychological condition, but an infectious one. Hardly a week goes by without a new outbreak documented in another blog post or business column. To judge from the symptomatic hand-wringing, the epidemic is spreading: we are on the verge of mass unemployment as work becomes increasingly automated.
Technophobia is an affliction we have yet to cure even after decades of evidence-based ameliorative efforts. We might not have expected much resistance to the disease in earlier times, before evidence accumulated that the fears it inspired were irrational. Back in 1930, a mind as brilliant as John Maynard Keynes was susceptible to the condition. Keynes sensed sickness in the air but misdiagnosed it as a feature of the capitalist economy: “We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come—namely, technological unemployment.”
Years of economic progress would thereafter reveal the misdiagnosis itself as the disease. The computer was invented, and the speed of the fastest machines subsequently increased by a factor of 100,000. Unemployment fell to around five percent from 1930’s nine percent. Median earnings doubled. Yet a self-appointed committee of public intellectuals declared in a letter to the President that a “cybernation revolution” was at hand, the product of “the combination of the computer and the automated self-regulating machine.” The revolution was producing “a system of almost unlimited productive capacity which requires progressively less human labor.” The year was 1964.
Between 1964 and 2007, the eve of the Great Recession, the earnings of the median working-age male rose by about one-third, and while most of that came during the 1960s, men’s earnings did rise modestly after 1969. Median earnings among female workers more than doubled after 1969. Median household income rose by about 75 percent. The typical age at retirement declined, and leisure time increased among both working-age men and women. Unemployment was no higher than five percent, even accounting for adults not included in official figures because they had stopped looking for work out of discouragement. The computing speed of the fastest computer rose by a factor of one billion. With a “b”. But we cannot shake technophobia. The sense remains that the next one-billion-fold increase in technological progress will be different.
Epidemiologists believe that technophobia’s resiliency stems from an elevated susceptibility produced by another condition present in its hosts: excessive economic pessimism. Equally important is the frequent absence of prophylactic humility about the limits of imagination in predicting the future.
Take blogger Kevin Drum’s influential essay from May of this year. Drum sees worrisome long-term declines in income, employment, and the share of national income going to workers. Resistant to evidence that such declines either did not occur or are not nearly so worrisome as he believes, Drum has proven easy prey for technophobia pathogens.
At the same time, with a bit more appreciation for our inability to foresee the ways in which tomorrow’s world will differ from today’s, those pathogens are easily fended off. Of the Luddites, early-19th-century weavers who went on a spree smashing power looms out of fear that their adoption would doom workers to misery, Drum writes,
Power looms put them out of work, but in the long run automation made the entire workforce more productive. Everyone still had jobs—just different ones. Some ran the new power looms, others found work *no one could have imagined* just a few decades before, in steel mills, automobile factories, and railroad lines. (Emphasis added.)
But we still can’t imagine what work will look like in a few decades. In particular, people seem to resist the idea that if technology reduces demand for labor by a quarter, that might translate into everyone working 25 percent less rather than unemployment rising by one-fourth. The late economic historian Robert Fogel predicted that the increase in leisure between 1995 and 2040 would exceed the gain Americans saw in the 115 years before 1995.
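The arithmetic behind that distinction is worth making explicit. A toy calculation (my own illustration, with made-up round numbers, not anything Drum or Fogel computed) shows how the same 25 percent drop in labor demand can appear either as shorter hours for everyone or as lost jobs:

```python
# Toy model: 100 workers, each working 40 hours, and labor demand
# falls by a quarter. The numbers are hypothetical.
workers = 100
hours_per_worker = 40
total_hours = workers * hours_per_worker       # 4,000 hours demanded

reduced_hours = total_hours * 0.75             # demand falls 25% -> 3,000 hours

# Scenario A: everyone stays employed, each simply works fewer hours.
shared_hours = reduced_hours / workers         # 30 hours per worker

# Scenario B: hours per worker are fixed, so headcount absorbs the cut.
remaining_jobs = reduced_hours / hours_per_worker  # 75 jobs; 25 workers idle

print(shared_hours)    # 30.0
print(remaining_jobs)  # 75.0
```

Both scenarios clear the same reduced demand for labor; which one a society ends up with depends on institutions and preferences, not on the technology itself.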
Will Fogel be proven right? I certainly don’t know. What I’m pretty sure of, however, is that in 2040 we will not look back at 2013 and think, “We’ve made a huge mistake,” ruing the day we failed to listen to those afflicted with technophobia. Technological progress is not a trick we have played on ourselves to throw ourselves into poverty. It is a means for fulfilling our material wants and needs and continues only to the extent that it does so.
Technophobia is pernicious not only because it needlessly distresses those it afflicts but because it distracts people from real economic problems that would be tractable if we could focus more narrowly on them. For an example of a productive look at technological change and its implications for workers, don’t miss the new Third Way report, “Dancing with Robots,” by economists Frank Levy and Richard Murnane. Levy and Murnane are unconcerned about mass joblessness but argue that the subset of future workers who lack sophisticated skills to manage information will see stagnant or declining wages.
How big is this subset of workers? They do not say, but the less-skilled of their three categories of future jobs—those comprising “non-routine manual tasks”—corresponds closely with the “low-wage” category used by the Economic Policy Institute in a recent paper to analyze occupational changes. That paper finds that 18 percent of jobs are in occupations that tend to pay relatively poorly. In all five decades from 1950 to 2000, the share of workers in high-wage jobs rose faster than the share in low-wage jobs, but the pattern was reversed from 2000 to 2007. Levy and Murnane recommend a transformation of American education to equip more Americans with information-management skills. That seems a more productive course than loom-smashing or preparing for mass unemployment.