What had Washington been like in the years immediately preceding September 11? As even an amnesiac will surely recall, the American capital at the end of the second millennium, like Rome at the start of the first, was consumed with palace intrigue, scandals, and arcane legal inquests. The fascination with scandals and investigations, however, had become but one manifestation of a larger change in the nation’s public life: call it the descent to low politics. Increasingly the national government during the late decades of the 20th century had digressed into concerns that carried it far afield from its core responsibilities.
In earlier years, foreign policy had greatly preoccupied presidents and congressional leaders. Politicians were never oblivious to the details of domestic policy, of course, but these did not overrun the national agenda. Indeed, when John F. Kennedy became president in 1961 his inaugural address had contained exactly two words about domestic imperatives. (One sentence in his speech, about tyranny abroad, ended with a pledge to be “unwilling to witness or permit the slow undoing of those human rights to which this nation had always been committed, and to which we are committed at home and round the world.”) But in President Bill Clinton’s state of the union address in February 1997, the emphasis was heavily on the home front. Clinton ruminated on such subjects as the enforcement of truancy laws, the advantages of school uniforms, the math tests of eighth graders, the availability of medical insurance to cover mammograms, the appropriate hospital stay for women after a mastectomy, the utility of flex-time for employees, the revitalization of community waterfronts, and so forth.
This taste for appropriating the routine work of governors, mayors, hospital administrators, or school boards was strong during the Clinton years, but it had been building before. As East-West tensions subsided through the 1980s, and the outside world was perceived (or, rather, misperceived) to be far less threatening, the temptation in Washington grew to retreat from entanglements overseas and take up internal particulars, many of which could seem decidedly diminutive by comparison. Players from both political parties and at both ends of Pennsylvania Avenue succumbed.
On Ronald Reagan’s watch and that of his successor, the federal government got into the business of determining the minimum drinking age for motorists, setting the licensing standards for bus and truck drivers, overseeing spillages from thousands of city storm sewers, requiring asbestos inspections in classrooms, enforcing child support payments, replacing water coolers in local school buildings, ordering sidewalk ramps on city streets, purifying municipal water supplies, regulating where passengers should stand when riding city buses, and much more.
Puttering at Home
The inward impulse of national politics may have seemed especially conspicuous during the Clinton presidency, marked most notably during its first term by detachment or vacillation in the face of troubles overseas. Following a brief and shallow recession, the Clintonites had come to Washington in 1993 convinced that large parts of the U.S. economy needed radical surgery. While that operation was going on, unsettling events abroad were put on hold. The Clinton administration’s approach toward a festering crisis in the Balkans was illustrative (at least until the magnitude of Serbian atrocities belatedly stirred the North Atlantic Treaty Organization to intervene). A late-20th-century reprise of ethnic cleansing and national dismemberment on the doorstep of Europe was relegated to the margins of the administration’s concerns during its first two years.
The disengaged attitude toward the bloodbath in Bosnia seemed out of step with America’s customary leadership role across the Atlantic and would eventually prove unsustainable. While it lasted, though, Clinton’s stance was not altogether at odds with that of his predecessor. The Balkan disaster had been brewing for years. George Bush senior had marginalized it as well.
In fact, in the years from the end of the Gulf War in 1991 to the onset of the war against terrorism in 2001, it could not be said that Republican administrations, presidential candidates, or congressional leaders were busier than their Democratic rivals fathoming America’s global challenges. Some voices on the hard Right were making louder isolationist noises; others seemed more vocal about symbols than substance in U.S. international relations—our dues assessment at the United Nations, the admissibility of family planning in foreign aid programs, and the like. At least in its second term the Clinton administration had begun to face more squarely the realities that demanded attention from the world’s only superpower. Officials grew alarmed as terrorist incidents increased. The administration eventually reacted forcefully when Slobodan Milosevic unleashed his ethnic cleansers on Kosovo. Clinton also strove to broker (albeit ineffectually) a comprehensive settlement of the Israeli-Palestinian conflict.
But the holiday from foreign affairs was not quite over. During the 2000 presidential campaign Vice President Al Gore scarcely mentioned that America and the world were imperiled by terrorists, and neither did George W. Bush. In their book, The Age of Sacred Terror, Daniel Benjamin and Steven Simon describe the bureaucratic inertia that hobbled U.S. efforts to mobilize against al-Qaida during the Clinton years. The trouble continued well into the summer of 2001. Prescient warnings of possible terrorist operations on American soil, generated at lower echelons of the Federal Bureau of Investigation and the Central Intelligence Agency, languished in administrative and legalistic labyrinths. On the morning of September 11, word that a commercial airliner had slammed into the World Trade Center reached President Bush while he was visiting a second-grade classroom in Sarasota, Florida.
The Toll of Trivial Pursuits
A government absorbed in the minutiae of myriad domestic undertakings might have difficulty organizing itself to discharge other basic duties—including a timely and judicious allocation of resources to combat international terrorism.
Federal antiterrorism spending had grown to almost $10 billion by 2001, but it was spread chaotically across a bewildering maze of agencies and programs, often reflecting a series of uncoordinated legislative earmarks. Tossing funds hither and yon with no coherent strategy meant that dollars would hit or miss vital aspects of “preparedness.” The Department of Justice, for example, wound up with just $15 million to train police officers, firefighters, and rescue squads how to respond in an emergency. Unsurprisingly, in three years the department managed to train only a small fraction of these first responders.
Meanwhile the FBI needed a computer system to enhance the ability of its field offices to share information collected by agents. Congress did not get around to paying for that essentiality until late in 2000—too late to be of much help a few months later. Similarly, the Immigration and Naturalization Service had been unable to buy and install, before September 11, computers to keep track of foreign visitors on student visas. This necessary instrument would not become operational until 2003. The U.S. Coast Guard, whose aging boats had to run after everything from marijuana smugglers to distressed private yachts, scrambled to protect the nation’s most vulnerable places—its seaports. Examples of neglect or disarray abounded. Whatever the many plausible explanations, one was simply that amid all their diversions policymakers could not readily “focus like a laser” on the central task at hand.
Some of the political activities that diverted energy in the wrong directions, moreover, left various agencies shorthanded. The more things the federal bureaucracy was asked to do, the more slots it would need to fill—so many, in fact, that some would remain unfilled. In addition, nominees for appointed positions faced an increasingly grueling review process. It had become common for a new administration to spend its first year with vacancy rates of more than 50 percent in posts that required Senate confirmation and to tolerate as much as 30 percent thereafter.
By the late 1980s many a capable citizen was likely to think twice before pursuing almost any high-ranking government job. For many the path to public service was too politicized and hazardous. More than a few officials found themselves in the crosshairs of independent counsels, ever on the prowl for traces of impropriety. In 1971 some 100 public officials were indicted for wrongdoing in the United States. Twenty years later the volume of indictments exceeded 1,400. It strains credulity to suppose that the nation’s public servants were 14 times more malfeasant in 1991 than they had been in 1971. No matter; the purge, often hyping relatively minor misdeeds, continued at fever pitch through most of the 1990s.
So it was that in the 1990s the nation’s safety would be entrusted to a distracted government, spreading itself thin while often failing to replenish its ranks expeditiously.
What accounted for the introspective politics of the 1990s? The end of the Cold War was one factor. With the collapse of the Soviet Union, gone was the great external threat that had compelled and defined U.S. internationalism for nearly half a century and that had simplified priorities accordingly. Now, presumably, Americans could afford to fuss over internal imperfections (real or exaggerated).
Yet the notion that the nation was finally at peace, and now could indulge in political luxuries long deferred, should not be overstated. The post-Cold War world, as any responsible decisionmaker in Washington knew, was not benign. China, an emerging industrial powerhouse, showed few signs of relaxing its repressive regime. Trouble-making tyrants ruled Iraq, Iran, Syria, Libya, and North Korea. Russia retained a huge and insecure stockpile of nuclear weapons. India and Pakistan were headed for a nuclear arms race. The Middle East remained a tinder box. Millions died in Africa’s tribal massacres. A genocidal Balkan war had by mid-1995 shredded the credibility of the United Nations. And there was more.
America declared war on terrorism in the fall of 2001, but terrorists had gone to war against America long before that. They had murdered U.S. ambassadors to Sudan, Lebanon, and Afghanistan in the 1970s. In 1979, 52 U.S. citizens were taken hostage at the American embassy in Tehran and held captive for 444 days. That year Islamic radicals also rioted in Pakistan and set fire to the U.S. embassy in Islamabad. In 1983 the American embassy in Beirut was destroyed by a suicide car bombing that killed 63 people, including 17 Americans. Six months later, terrorists drove a van filled with explosives into the U.S. military barracks at the Beirut airport and killed 241 marines. In 1985 the passengers of a hijacked TWA jetliner were held hostage in Lebanon, and a disabled U.S. citizen on board the Mediterranean cruise ship Achille Lauro was thrown into the sea by Palestinian “freedom fighters.” One month later simultaneous suicide attacks were carried out at the check-in desks of Israeli and American airlines in the international airports of Rome and Vienna. In 1986 the bombing of a Berlin discotheque, in which 44 Americans were wounded and one killed, was linked to Muammar Qaddafi’s operatives. President Reagan ordered U.S. Air Force F-111s to retaliate against Libyan installations. The Libyans then planned the destruction of Pan Am Flight 103 over Scotland in 1988.
More action loomed in the 1990s, with Americans very much in the line of fire: first at the World Trade Center (bombed by jihadists in 1993), next at an Air Force base in Saudi Arabia in 1996, then at the U.S. embassies in Kenya and Tanzania in 1998, and in October 2000 on the USS Cole in Yemen. A plot to blow up the Lincoln and Holland tunnels was thwarted in 1993. So were plans to bring down 11 American airliners in Asia two years later and to wreck the Los Angeles International Airport at around the time of the millennium celebration. Well into his second term President Clinton spoke of placing “the fight against terrorism at the top of our agenda.”
The “top of our agenda,” however, was a crowded place. A lot of other issues, no matter how picayune, jockeyed for position there. Thus episodes of terror (and close calls) would come and go, and soon afterwards the political process would detour to the domestic topic du jour. This propensity did not just take hold because Americans were out of danger as the Cold War receded. It persisted while danger grew.
Up to a point the subjects that captivated U.S. politics in the late 20th century comported with the public’s frame of mind. Ordinarily public opinion pays less heed to world affairs than to events closer to home. For all the visible signs of new perils abroad, people’s perception of them was not so keen as to outweigh more proximate cares. Hence in the early 1990s pocketbook worries, health care concerns, and anxiety about crime gained salience.
Nonetheless the way some of the domestic issues were blown out of proportion often bore no close connection to public preferences. Polls signaled considerable unease with the state of the economy in 1992, but even by the most permissive standards of political oratory the Clinton campaign’s depiction of U.S. economic frailty—”the worst economic performance since the Great Depression”—went overboard. The idea that the nation’s health care system needed a complete overhaul also was flogged further than the public was comfortable with. By the spring of 1994 almost one in three Americans regarded violent crime as the main predicament facing the country. How much this opinion called for the wholesale federalization of local criminal codes that was legislated that year, however, remained far from obvious.
Some disputes that dominated the national conversation during the decade seemed distinctly out of touch with public opinion. A case in point was the growing political preoccupation with official misconduct—culminating with the dogged attempt to remove Clinton from office. In December 1998, after a year-long sex scandal, the House of Representatives voted to impeach the president. National polls consistently showed the public opposed to impeachment and conviction, typically by margins of two to one.
Intensifying the force of inside-the-Beltway feuds, including ones that most ordinary voters found overblown, was the heightened contentiousness of partisan politics. Differences between the parties had widened. Party solidarity and distinctiveness had increased as the Republicans gradually captured from the Democrats their conservative southern base and as districts for the House of Representatives were redrawn to minimize geographic commingling of the parties’ respective constituencies. Also, while fewer incumbents in Congress were ousted in general elections, more members now worried about the possibility of having to defend their seats in primary elections. The upshot was a marked decline in the percentage of centrists—that is, members who were comfortable crossing party lines on selective issues.
The polarizing effect of redistricting has been profound. In the first 15 House elections following World War II one party or the other gained an average of 29 seats. In the past 14 elections the average switch has been 13 seats. Increasingly sophisticated computer software permits political cartographers to map the spatial distribution of registered voters according to their partisan voting preferences. With more precision than ever, districts would be gerrymandered to tilt reliably to one party or the other. Marginal districts, where candidates would have to appeal to voters from both parties, became relatively few.
Primaries, now a dominant institution in both parties, have further separated them. In a simple two-party electoral system, candidates competing for single-member districts naturally tend to gravitate toward the center of the political spectrum. Primaries, however, do not encourage this convergence. Their electorates tend to be small and unrepresentative of mainstream voters. Hence candidates in primaries often move away from the center to appeal to their small but active factions.
The threat of actual or potential primary challenges from the Right may well have been among the decisive forms of leverage that the GOP’s base exercised to harden the party line on questions such as the Lewinsky affair. Among the Democrats similar pressure, though emanating from the opposite extreme, probably helped explain their own nearly unanimous position on symbols of significance to that party’s activists—litmus tests like opposing impeachment, for example, or raising the minimum wage.
The latter, in fact, became almost as much an idée fixe for the Democrats as prosecuting Clinton was for the Republicans. It illustrated the lengths the Left would go to prevail on a matter of less-than-earthshaking importance. Never mind that the wages of low-end workers were already keeping pace with the rise of average wages amid the tight labor market of the second half of the 1990s; in 1996 the White House began campaigning to raise the minimum wage, knowing full well that the project was anathema to conservatives in the Republican-controlled Congress. The Democrats’ strategy was to exploit Senate rules by offering the minimum-wage increase as an amendment to every piece of legislation that the majority leader, Robert Dole, brought to the floor. Lacking the votes to kill the wage proposal or impose cloture for legislation he wanted to pass, Dole pulled bill after bill off the floor, fueling news stories of congressional gridlock. Eventually, in the heat of an election year, the Republicans had to capitulate.
Squabbles like this in recent years appeared related to increased party polarization, but the extent or direction of causality (if any) was not self-evident. Parties cleaving to their extremes may engage more frequently in petty wrangles, but the wrangling may also be what drives the partisans further apart. Moreover, polarity and pettiness do not always co-exist. At times in American history, the deeper the philosophical rift between the political parties, and the greater their disciplined fidelity to principles, the more likely they were to debate serious national questions rather than narrow, expendable, or phony ones. That certainly was the case in the mid-19th century when the nation struggled over slavery.
In sum, when Washington wallowed in “low politics” during the past decade, more than polarized parties was involved.
The Razor’s Edge
As the 1990s wore on, the Democrats and Republicans contended closely for supremacy, as evinced by the narrowing margin between the parties in the House of Representatives. A comfortable cushion of 71 seats for the Democrats in 1984 went to a mere 12 seats for the Republicans by 2000. (The GOP added five to seven more House seats in the 2002 election, improving the party’s margin ever so slightly.) In a competition where even the smallest political gain could tip the scales, neither side would pass up opportunities to score points—by whatever means available. Obstructionism was one, the aim being to deprive an opponent of perceptible successes. Thwarting or retarding the other side’s policy initiatives (with veto threats, filibusters, and poison-pill amendments) became routine during some years, as did holding up appointments to executive and judicial offices. “Going public” with political postures in policy debates became common. This behavior frequently cornered or embarrassed the opposition, thereby precluding the quiet negotiations often needed to bridge differences on complex problems. Rooting out the presumed official miscreants (especially of the opposing party) was another tactic. So was continual trawling for potentially pivotal supporters and campaign contributors to energize.
Much of this political niche marketing was aimed at vocal factions and advocacy groups whose influence had been rising steadily over the past 30 years and who frequently regarded the laying on of federal hands as the answer to almost any community problem. Whatever the challenges facing localities in postmodern American society (stolen cars, teenage smoking, cable television access, domestic violence, drunken driving, impurities in drinking water), lobbies or movements existed to champion the cause and make it the national government’s responsibility. The evenly divided partisan environment of the 1990s, with its relentless pressure to fill campaign coffers, made pandering to these claimants more tempting.
To be sure, control of Congress had been closely contested at other junctures in the past half-century. The Democrats lost enough seats in both houses in the election of 1950 to cut their lead over the Republicans to 36 in the House and 2 seats in the Senate. The balance in both chambers tipped, by razor-thin margins, to the Republicans two years later, and then again narrowly back to the Democrats in 1954. Yet the style of politics then was nothing like the one now.
Consider campaign spending. The total spent to elect the 83rd Congress in 1952 was $36.4 million (in current dollars). In the 2000 election congressional candidates and their political parties disbursed a total of $1.4 billion (not including large sums of independent spending, called “issue advocacy,” by outside groups). In 1952, one of the most expensive campaigns was for a Senate seat in New Jersey, where the opposing candidates together spent almost $410,000 (again, adjusted for inflation). In 2000 the Democratic candidate alone shelled out more than 150 times that much to win the same seat.
Contemporary campaigning has become, in Hugh Heclo’s words, a permanent “immense industry for studying, manufacturing, organizing, and manipulating public voices in support of candidates and causes.” The industry’s lavish finances reflect the development of increasingly elaborate techniques of political salesmanship—televised advertising, continual polling, focus-group probing, direct-mail operations, “media buys,” opposition research, and other devices designed to mold, massage, and mobilize possible sympathizers. Like corporations adept at tailoring finely differentiated products to segmented markets, office-seekers (or their legions of professional managers) have the capacity to customize a sales pitch for every carefully targeted group or subgroup of presumptive clients.
And like the vending of a vast selection of detergents or deodorants on today’s supermarket shelves, the new norms of the political marketplace mean that politicians’ policy portfolios would be filled with narrowly gauged favors and frivolities, not just essentialities.
The Media Environment
No account of the modern political proclivity for turning government into a kind of overstocked domestic-issues mart would be complete without mention of the news media. Newscasts and newspapers filled the air with reports of localized misfortunes that supposedly merited extensive national scrutiny. Coverage of world events dwindled. By the mid-1990s the major TV networks were spending an average of six minutes a day on any news outside the United States.
For decades television journalism had been informing mass opinion and often framing the terms of public discourse. By the closing years of the century, however, TV had become the principal source of information for most Americans—and the subjects of its imagery typically trumped all others. The world as depicted on TV news, observed James Fallows, consisted of “a strange equivalence of spectacles” where no one event—the smoldering hull of the USS Cole, the O.J. Simpson trial—was necessarily more important than another. With the day-to-day presentation of events large and small qualifying interchangeably for primetime, it became harder for many people to distinguish worthwhile objects of public affairs from trivial or transitory ones.
The political culture of the capital post-September 11 has not exhibited (so far) anything comparable to the nadirs of the preceding years.
The presidency of George W. Bush had started out with another round of emphasis on domestic initiatives: adjusting taxes, deepening federal involvement in local public education, planning to increase energy supplies, supporting community services provided by local churches. After September 11 these ventures were not forgotten, but they were overshadowed by the president’s concerted steps against international terrorism and the outlaw states that could magnify the threat.
For the first time al-Qaida’s redoubts in Afghanistan felt the full force of U.S. military might. And America’s muscular role on the world stage was not likely to stop there. The United States declared that it would strike, preemptively when necessary, at terrorist enemies and the regimes that backed them, and that it would actively promote “free and open societies on every continent.” A president whose interest in foreign affairs had appeared limited during the 2000 election proved to be an assertive internationalist. With refreshing clarity and determination, he challenged the international community to confront the scourge of terrorism but also to cease appeasing the ever-treacherous regime in Iraq. But that was not all. His administration also soon signed with Russia a pact for steep cuts in nuclear arms, set sensible preconditions for the creation of a viable Palestinian state, favored impressive increases in U.S. foreign aid, and proposed to deploy large sums of money for the international battle with AIDS.
The administration’s counterterrorism efforts informed its budgetary priorities and eventually led to a reasonably comprehensive proposal for a new cabinet department of homeland security. For its part, Congress proved more constructive than contrarian, at least at some important junctures—passing major legislation to secure the airports, renewing at long last the president’s authority to negotiate international trade treaties, approving a resolution to force the disarmament of Iraq, and finally exercising more vigorous oversight over faltering intelligence and law enforcement agencies.
Popular attitudes also appeared to have shifted perceptibly. Public confidence in the federal government rose dramatically following September 11. It sagged again by the spring of 2002—but hardly all the way back to the lows of, say, mid-2001. Any appreciable overall rise in public trust of the government could grant policymakers more latitude to focus on antiterrorism, maybe even permitting them to shelve (for raisons d’état) a few unrelated policy demands.
A capacity to advance a coherent U.S. policy against global terror—one that takes a relatively consistent stand against the full range of messianic mass murderers, be they al-Qaida’s, Baghdad’s, or those of the Al Aqsa Martyrs Brigades—may have been buttressed over the past year by greater public attentiveness to international events. The share of people who reported following world news in the spring of 2002 was up from where it had been in 1998. Indeed, more than 60 percent followed important developments abroad. Today many more Americans (48 percent) can identify Yasser Arafat than can identify key U.S. cabinet secretaries. And polls last fall found that voters, for a change, were not assigning “the economy and jobs” a higher priority than “terrorism and national security.”
Shifts in news coverage played a part. After September 11 foreign relations and military matters upstaged the usual domestic sagas. Press reporting—principally by the New York Times, the Wall Street Journal, and the Washington Post—on the Afghan war, the al-Qaida network, and the government’s lapses leading up to the day of reckoning became formidably broad and deep.
The political landscape since September 11 also has had its share of scandalous stories and ensuing investigations—but with a difference. The sensational news has not been about the personal indiscretions or forgettable missteps of this or that cabinet officer but about business fraud (Enron, WorldCom) on a scale so grand it was damaging U.S. financial markets and maybe even the rest of the economy. And the response to the recent wave of scandals—this time involving corporate corruption—has also been rather different. Congressional oversight has not merely pondered human interest angles (such as the fate of Enron employees’ pensions) but deliberated thoughtfully about market flaws that require structural correction.
Politics as Usual?
Differences in the dynamics of our politics after September 11 are discernible. Whether the transformation has been profound enough to place an enduring restraint on the pursuit of narrow political expediency and its trivializing effect on the federal government’s purpose is another matter.
For all the redirection that President Bush has given the ship of state since Americans began taking the barbarities of terrorism seriously, his style has not always paralleled that of, let us say, Franklin D. Roosevelt after Pearl Harbor. When FDR described the state of the union on January 6, 1942, not a single domestic-policy aside made its way into his address. The nation was at war. “War costs money,” the president explained. “That means taxes and bonds and bonds and taxes. It means cutting luxuries and other nonessentials.” When Bush addressed the nation on January 29, 2002, his account of the state of the union began with the accomplishments and challenges of another war, but later added a shopping list—good schools, affordable energy, the child tax credit, a patients’ bill of rights, coverage for prescription drugs, services by faith-based groups, new safeguards for 401(k) and pension plans.
The mixture of guns and butter was but one exhibition of domestic politicking unfazed by September 11. The administration acquiesced to a farm bill boosting subsidies to $180 billion over the next decade. Amid the unavoidable logrolling that distorts energy bills, much of the administration’s proposed legislation predictably degenerated to an assortment of subventions for influential interest groups such as the ethanol lobby. Budget legislation, wending its way through the congressional sausage-maker, made room for many additional millions of dollars of spending for highways, schools, nursing homes, museums, fishing boats, and countless other earmarked blandishments.
The consummate political calibration of federal expenditures and regulations was sometimes all too reminiscent of years past. The Bush administration slapped a 30 percent tariff increase on imported steel and followed it with another round of punitive tariffs on Canadian lumber. Sensitive to domestic economic considerations and resistance from various business and labor interests, the administration did not bring to an environmental conundrum of possibly incalculable importance—global warming