Have fear, Americans. Ours is a country divided. On one side are those who divide Americans into two sides; on the other are all the rest. Yes, America today is divided over the question of whether America is divided.
All right, I’m joking. But the joke has a kernel of truth. In 1991 James Davison Hunter, a professor of sociology and religious studies at the University of Virginia, made his mark with an influential book called Culture Wars: The Struggle to Define America. The notion of a country deeply and fundamentally divided over core moral and political values soon made its way into politics; in 1992 Patrick Buchanan told the Republicans at their national convention that they were fighting “a cultural war, as critical to the kind of nation we will one day be as was the Cold War itself.” By 1996, in his stinging dissent in the gay-rights case Romer v. Evans, Supreme Court Justice Antonin Scalia could accuse the Court of “tak[ing] sides in the culture wars,” and everyone knew exactly what he meant.
In 2000 those ubiquitous election-night maps came along, with their red expanses of Bush states in the heartland and their blue blocks of Gore territory along the coasts and the Great Lakes. From then on everyone talked about red America and blue America as if they were separate countries. The 2004 post-election maps, which looked almost identical to the 2000 ones, further entrenched the conventional wisdom, to the point where most newspaper readers can recite the tropes: red America is godly, moralistic, patriotic, predominantly white, masculine, less educated, and heavily rural and suburban; blue America is secular, relativistic, internationalist, multicultural, feminine, college educated, and heavily urban and cosmopolitan. Reds vote for guns and capital punishment and war in Iraq, blues for abortion rights and the environment. In red America, Saturday is for NASCAR and Sunday is for church. In blue America, Saturday is for the farmers’ market (provided there are no actual farmers) and Sunday is for The New York Times.
An odd thing, however, happened to many of the scholars who set out to map this culture war: they couldn’t find it. If the country is split into culturally and politically distinct camps, they ought to be fairly easy to locate. Yet scholars investigating the phenomenon have often come back empty-handed. Other scholars have tried to explain why. And so, in the fullness of time, the country has arrived at today’s great divide over whether there is a great divide.
One amusing example: In April of last year The Washington Post ran a front-page Sunday article headlined “Political Split Is Pervasive.” It quoted various experts as saying, for example, “We have two parallel universes” and “People in these two countries don’t even see each other.” In June, The New York Times shot back with an article headlined “A Nation Divided? Who Says?” It quoted another set of experts who maintained that Americans’ disagreements are actually smaller than in the past and shrinking.
Courageously, your correspondent set out into the zone of conflict. The culture-war hypothesis has generated some fairly rigorous scholarship in recent years, and I examined it. I wound up believing that the solution to the puzzle lies in a distinction: American politics is polarized, but the American public is not. In fact, what may be the most striking feature of the contemporary American landscape—a surprise, given today’s bitterly adversarial politics—is not the culture war but the culture peace.
What, exactly, do people mean when they talk about a divided or polarized America? Often they mean simply that the country is evenly divided: split fifty-fifty, politically speaking. And so it indubitably and strikingly is. In 1979 Democratic senators, House members, governors, and state legislators commandingly outnumbered Republicans; since early in this decade the numbers have been close to equal, with Republicans slightly ahead. Opinion polls show that Republicans and Democrats are effectively tied for the public’s loyalty. For the time being, America doesn’t have a dominant party.
That may sound odd, given the Republicans’ dominance in winner-take-all Washington. But in fact the 2004 elections confirmed that the parties are remarkably close to parity. The presidential election was tight, especially considering that an incumbent president was in the race. Republicans picked up four Senate seats, but the House of Representatives barely budged. The partisan allocation of state legislative seats (now close to parity) and of governorships (mildly favoring Republicans) also barely budged. As if to make parity official, in the main exit poll voters described themselves as Democrats and Republicans in precisely equal proportions.
To political analysts, who live in a world of zero-sum contests between two political parties, it seems natural to conclude that partisan division entails cultural division. Sometimes they elide the very distinction. In his book The Two Americas (2004), Stanley B. Greenberg, a prominent Democratic pollster, opens with the sentence “America is divided” (his italics) and goes on to say, “The loyalties of American voters are now almost perfectly divided between the Democrats and Republicans, a historical political deadlock that inflames the passions of politicians and citizens alike.” In a two-party universe that is indeed how things look. But we do not live in a two-party universe. The fastest-growing group in American politics is independents, many of them centrists who identify with neither party and can tip the balance in close elections. According to the Pew Research Center for the People and the Press, since the Iraq War 30 percent of Americans have identified themselves as Republicans, 31 percent as Democrats, and 39 percent as independents (or “other”). Registered voters split into even thirds.
On election day, of course, independents who want to vote almost always have to choose between a Republican and a Democrat. Like the subatomic particles that live in a state of blurred quantum indeterminacy except during those fleeting moments when they are observed, on election day purple independents suddenly appear red or blue. Many of them, however, are undecided until the last moment and aren’t particularly happy with either choice. Their ambivalence disappears from the vote tallies because the very act of voting excludes the nonpartisan middle.
By no means, then, does partisan parity necessarily imply a deeply divided citizenry. People who talk about culture wars usually have in mind not merely a close division (fifty-fifty) but a wide or deep division—two populations with distinct and incompatible world views. It was this sort of divide that Hunter said he had found in 1991. One culture was “orthodox,” the other “progressive.” The disagreement transcended particular issues to encompass different conceptions of moral authority—one side anchored to tradition or the Bible, the other more relativistic. Not only does this transcendent disagreement reverberate throughout both politics and everyday life, Hunter said, but “each side of the cultural divide can only talk past the other” (his italics). In his book The Values Divide (2002) the political scientist John Kenneth White, of Catholic University, makes a similar case. “One faction emphasizes duty and morality; another stresses individual rights and self-fulfillment,” he writes. The result is a “values divide”—indeed, a “chasm.”
Both authors make their observations about culture and values—many of which are quite useful—by aggregating the attitudes of large populations into archetypes and characteristic world views. The question remains, however, whether actual people are either as extreme or as distinct in their views as the analysts’ cultural profiles suggest. Might the archetypes really be stereotypes?
In 1998 Alan Wolfe, a sociologist at Boston College, said yes. For his book One Nation, After All, Wolfe studied eight suburban communities. He found a battle over values, but it was fought not so much between groups as within individuals: “The two sides presumed to be fighting the culture war do not so much represent a divide between one group of Americans and another as a divide between sets of values important to everyone.” Intellectuals and partisans may line up at the extremes, but ordinary people mix and match values from competing menus. Wolfe found his subjects to be “above all moderate,” “reluctant to pass judgment,” and “tolerant to a fault.” Because opinion polls are designed to elicit and categorize disagreements, he concluded, they tend to obscure and even distort this reality.
I recently came across an interesting example of how this can happen: In an August 2004 article Jeffrey M. Jones and Joseph Carroll, two analysts with the Gallup Organization, took note of what they called an election-year puzzle. Frequent churchgoers and men were much more likely to support George W. Bush than John Kerry. Non-churchgoers and women leaned the other way. That all jibed with the familiar archetypes of religious-male reds and secular-female blues. But here was the puzzle: “Men—particularly white men—are much less likely to attend church than are women of any race or ethnicity.” How, then, could churchgoers prefer Bush if women preferred Kerry?
The answer turns out to be that most individuals don’t fit the archetypes. Men who go to church every week overwhelmingly favored Bush (by almost two to one), and women who stay home on Sundays favored Kerry by a similar margin. But these two archetypal categories leave out most of the population. Women who go to church weekly, men who stay home Sundays, and people of both sexes who go to church semi-regularly are all much more closely divided. The majority of actual Americans are in this conflicted middle.
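To make the arithmetic of the puzzle concrete, here is a minimal sketch in Python using entirely hypothetical numbers; they are not Gallup’s figures, and the group shares and support rates are invented purely for illustration. It shows how weekly churchgoers can favor Bush overall while women favor Kerry overall, so long as men attend church less often than women and the largest groups are the closely divided ones.

```python
# A purely illustrative arithmetic sketch with made-up numbers (not Gallup's
# actual data). It shows how "weekly churchgoers favor Bush" and "women favor
# Kerry" can both be true at once, provided men attend church less often than
# women and the biggest groups sit in the closely divided middle.

# Each cell: (share of the electorate, Bush support within that cell)
cells = {
    ("man",   "weekly"): (0.15, 0.65),  # the archetypal "red" cell: small, lopsided
    ("woman", "weekly"): (0.25, 0.55),  # closely divided
    ("man",   "seldom"): (0.33, 0.50),  # closely divided
    ("woman", "seldom"): (0.27, 0.35),  # the archetypal "blue" cell: small, lopsided
}

def bush_share(keep):
    """Bush's share among the cells selected by `keep` (a weighted average)."""
    total = sum(share for key, (share, _) in cells.items() if keep(key))
    bush = sum(share * support for key, (share, support) in cells.items() if keep(key))
    return bush / total

weekly = bush_share(lambda k: k[1] == "weekly")   # ~58.8% Bush: churchgoers lean red
women = bush_share(lambda k: k[0] == "woman")     # ~44.6% Bush: women lean blue
men = bush_share(lambda k: k[0] == "man")         # ~54.7% Bush
seldom = bush_share(lambda k: k[1] == "seldom")   # ~43.3% Bush

print(f"Weekly churchgoers: {weekly:.1%} Bush")
print(f"Women:              {women:.1%} Bush")
print(f"Men:                {men:.1%} Bush")
print(f"Seldom/never:       {seldom:.1%} Bush")
```

In this invented electorate only about four voters in ten fit either archetypal cell; the conflicted middle carries the weight, which is exactly what resolves the apparent contradiction.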
To know how polarized the country is, then, we need to know what is happening with actual people, not with cultural or demographic categories. One thing we need to know, for example, is whether more people take extreme positions, such that two randomly chosen individuals would find less common ground today than in the past. In the fifty-fifty nation does the distribution of opinion look like a football, with Americans divided but clustered around the middle? Or has it come to look like a dumbbell, with more people at the extremes and fewer in the center?
In an impressive 1996 paper published in The American Journal of Sociology—“Have Americans’ Social Attitudes Become More Polarized?”—the sociologists Paul DiMaggio, John Evans, and Bethany Bryson, of Princeton University, set out to answer that question using twenty years’ worth of data from two periodic surveys of public opinion. They found no change in the “bimodality” of public opinion over the two decades. The football was not becoming a dumbbell.
DiMaggio and his colleagues then looked at particular issues and groups. On most issues (race and gender issues, crime and justice, attitudes toward liberals and conservatives, and sexual morality) Americans had become more united in their views, not more divided. (The exceptions were abortion and, to a lesser extent, poverty.) Perhaps more surprising, the authors found “dramatic depolarization in intergroup differences.” That is, when they sorted people into groups based on age, education, sex, race, religion, and region, they found that the groups had become more likely to agree.
The authors did, however, find one group that had polarized quite dramatically: people who identified themselves as political partisans. There had been a “striking divergence of attitudes between Democrats and Republicans.” In 2003 John Evans updated the study using data through 2000. He found, for the most part, no more polarization than before—except among partisans, who were more divided than ever.
Could it be that the structure of public opinion shows stability or convergence even as individuals hold their opinions in more vehement, less compromising ways? If so, that might be another kind of polarization. Getting inside individuals’ heads is difficult, but scholars can look at so-called “feeling thermometers”—survey questions that ask respondents to rate other people and groups on a scale from “very cold” to “very warm.” In his recent book Culture War? The Myth of a Polarized America the political scientist Morris P. Fiorina, of Stanford University (writing with Samuel J. Abrams and Jeremy C. Pope), finds little change in emotional polarization since 1980—except, again, among strong partisans.
A further possibility remains. Political segregation may be on the rise. Like-minded people may be clustering together socially or geographically, so that fewer people are exposed to other points of view. States, neighborhoods, and even bridge clubs may be turning all red or all blue. Is America becoming two countries living side by side but not together?
Fiorina and his associates approached that question by comparing blue-state and red-state opinion just before the 2000 election. What they found can only be described as a shocking level of agreement. Without doubt, red states were more conservative than blue ones; but only rarely did they actually disagree, even on such culturally loaded issues as gun control, the death penalty, and abortion. Rather, they generally agreed but by different margins. To take one example of many, 77 percent of red-state respondents favored capital punishment, but so did 70 percent of blue-state respondents. Similarly, 64 percent of those in blue states favored stricter gun control, but so did 52 percent of those in red states. Red-state residents were more likely to be born-again or evangelical Christians (45 percent, versus 28 percent in blue states), but strong majorities in both sets of states agreed that religion was very important in their lives. On only a few issues, such as whether to allow homosexuals to adopt children or join the military, did blue-state majorities part company with red-state majorities. Majorities in both red and blue states concurred—albeit by different margins—that Bill Clinton was doing a good job as president, that nonetheless they did not wish he could run again, that women’s roles should be equal to men’s, that the environment should take precedence over jobs, that English should be made the official language, that blacks should not receive preferences in hiring, and so on. This hardly suggests a culture war.
Red-state residents and blue-state residents agreed on one other point: most of them regarded themselves as centrists. Blue residents tipped toward describing themselves as liberal, and red residents tipped toward seeing themselves as conservative; but, Fiorina writes, “the distributions of self-placements in the red and blue states are very similar—both are centered over the ‘moderate’ or ‘middle-of-the-road’ position, whether we consider all residents or just voters.” By the same token, people in both sets of states agreed, by very similar margins, that the Democratic Party was to their left and the Republican Party to their right. “In both red and blue states,” Fiorina concludes, “a solid majority of voters see themselves as positioned between two relatively extreme parties.”
Of course, one reason states look so centrist might be that most states aggregate so many people. A state could appear moderate, for example, even if it were made up of cities that were predominantly liberal and rural areas that were predominantly conservative. Indeed, media reports have suggested that a growing share of the population lives in so-called landslide counties, which vote for one party or the other by lopsided margins. Philip A. Klinkner, a professor of government at Hamilton College, examined this claim recently and found nothing in it. In 2000 the share of voters in landslide counties (36 percent) fell smack in the middle of the historical range for presidential elections going back to 1840. In 2000, Klinkner writes, “the average Democrat and the average Republican lived in a county that was close to evenly divided.”
Of course, 36 percent of Americans living in landslide counties is a lot of people. But then, America has always been a partisan place. What John Adams’s supporters said about Thomas Jefferson in 1796 would, as Bill Clinton pungently (and correctly) observed recently, “blister the hairs off a dog’s back.” America is also no stranger to cultural fission. Think of Jeffersonians versus Hamiltonians, Jacksonians against the Establishment, the Civil War (now there was a culture war), labor versus capital a century ago, the civil-rights and Vietnam upheavals. No cultural conflict in America today approaches any of those. By historical standards America is racked with harmony.
My favorite indication of the culture peace came in a survey last July of unmarried Americans, conducted by the Gallup Organization for an online dating service called Match.com. Asked if they would be “open to marrying someone who held significantly different political views” from their own, 57 percent of singles said yes. Majorities of independents, Democrats, and (more narrowly) Republicans were willing to wed across political lines. Just how deep can our political disagreements be, I wonder, if most of us are willing to wake up next to them every morning?
A picture begins to emerge. A divide has opened, but not in the way most people assume. The divide is not within American culture but between politics and culture. At a time when the culture is notably calm, politics is notably shrill. Now, it bears emphasizing that culture peace, or war, is always a relative concept. America, with its cacophonous political schools and ethnic groups and religions and subcultures, will never be a culturally quiescent place, and thank goodness for that. Given the paucity of nation-splitting disagreements, however, what really needs explaining is the disproportionate polarization of American politics.
Reasons for it are not hard to find. They are almost bewilderingly numerous. When I burrow through the pile, I end up concluding that two are fundamental: America’s politicians have changed, and so have America’s political parties.
“Who sent us the political leaders we have?” Alan Ehrenhalt asked in 1991. Ehrenhalt is a respected Washington political journalist, the sort of person who becomes known as a “veteran observer,” and the riddle is from his book The United States of Ambition: Politicians, Power, and the Pursuit of Office. “There is a simple answer,” he continued. “They sent themselves.” This, he argued persuasively, was something new and important.
Ehrenhalt, who was born in 1947, grew up in the dusk of a fading world that I, at age forty-four, am just a little too young to remember. In those days politicians and their supporters were like most other people, only more so. Ambition and talent always mattered, but many politicians were fairly ordinary people (think of Harry Truman) who were recruited into politics by local parties or political bosses and then worked their way up through the system, often trading on their ties to the party and on their ability to deliver patronage. Party machines and local grandees acted as gatekeepers. Bosses and elders might approach a popular local car dealer and ask him to run for a House seat, and they were frequently in a position to hand him the nomination, if not the job. Loyalty, not ideology, was the coin of the realm, and candidates were meant to be smart and ambitious but not, usually, too smart and ambitious.
In a society as rambunctious and egalitarian as America’s, this system was probably bound to break down, and in the 1960s and 1970s it finally did. The smoke-filled rooms, despite their considerable (and often underappreciated) strengths, were too cozy and homogeneous and, yes, unfair to accommodate the democratic spirit of those times. Reformers, demanding a more open style of politics, did away with the gatekeepers of old. The rise of primary elections was meant to democratize the process of nominating candidates, and so it did; but hard-core ideologues—with their superior hustle and higher turnouts—proved able to dominate the primaries as they never could the party caucuses and conventions. As the power of the machines declined, ideology replaced patronage as the prime motivator of the parties’ rank and file. Volunteers who showed up at party meetings or campaign offices ran into fewer people who wanted jobs and more who shared their opinions on Vietnam or busing.
With parties and patrons no longer able to select candidates, candidates began selecting themselves. The party nominee, Ehrenhalt wrote, gave way to the “self-nominee.” Holding office was now a full-time job, and running for office was if anything even more grueling than holding it. “Politics is a profession now,” Ehrenhalt wrote. “Many people who would be happy to serve in office are unwilling to think of themselves as professionals, or to make the personal sacrifices that a full-time political career requires. And so political office—political power—passes to those who want the jobs badly enough to dedicate themselves to winning and holding them.” Those people, of course, are often left-wing and right-wing ideologues and self-appointed reformers. In the 1920s the town druggist might be away serving in Congress while the local malcontent lolled around the drugstore grumbling about his pet peeve. Today there’s a good chance that the druggist is minding the store and the malcontent is in Washington.
The parties, too, have changed. Whereas they used to be loose coalitions of interests and regions, they are now ideological clubs. Northeastern Republicans were once much more liberal than Southern Democrats. Today more or less all conservatives are Republicans and more or less all liberals are Democrats. To some extent the sorting of parties into blue and red happened naturally as voters migrated along the terrain of their convictions, but the partisans of the political class have been only too happy to prod the voters along. Whereas the old party machines specialized in mobilizing masses of partisans to vote for the ticket, the newer breed specializes in “activating” (as the political scientist Steven E. Schier has aptly put it) interest groups by using targeted appeals, often inflammatory in nature. (This past year the Republican National Committee sent mailings in Arkansas and West Virginia suggesting that the Democrats would try to ban the Bible.) Both parties, with the help of sophisticated computer software and block-by-block demographic data, have learned to target thinner and thinner slices of the population with direct mail and telephone appeals.
Perhaps more significant, both parties also got busy using their computer programs and demographic maps to draw wildly complicated new district boundaries that furnished their incumbents with safe congressional seats. Today House members choose their voters rather than the other way around, with the result that only a few dozen districts are competitive. In many districts House members are much less worried about the general election than they are about being challenged in the primary by a rival from their own party. Partisans in today’s one-party districts feel at liberty to support right-wing or left-wing candidates, and the candidates feel free (or obliged) to cater to the right-wing or left-wing partisans.
It’s not such a surprise, then, that the ideological divide between Democrats and Republicans in Congress is wider now than it has been in more than fifty years (though not wide by pre-World War I standards). The higher you go in the hierarchies of the parties, the further apart they lean. The top leaders on Capitol Hill are the bluest of blues and the reddest of reds—left and right not just of the country but even of their own parties. (This is especially true on the Republican side. National Journal, a nonpartisan public-policy magazine and a sister publication of The Atlantic, rated House Speaker Dennis Hastert the most conservative member of the House in 2003; Majority Leader Tom DeLay tied for second place.) As party lines have hardened and drawn apart, acrimony has grown between Democratic and Republican politicians, further separating the parties in what has become a vicious cycle. The political scientist Gary C. Jacobson, of the University of California at San Diego, finds that Democrats and Republicans not only enter Congress further apart ideologically, but also become more polarized the longer they stay in Congress’s fiercely partisan environment.
Not all of this had to happen—and indeed, happenstance has made matters worse in recent years. It is interesting to wonder how much less polarized American politics might be today if John McCain had won the presidency in 2000. Instead we got Bush, with his unyielding temperament and his strategy of mobilizing conservatives. Even more divisive is the fact that one party—the Republicans—has controlled the presidency and both chambers of Congress since 2003. In a fifty-fifty country, shutting one party out of the government can only lead to partisan excess on one side and bitter resentment on the other.
Centrist voters, of course, are unhappy, but what can they do? As Fiorina pithily puts it, “Given a choice between two extremes, they can only elect an extremist.” Presented with a credible candidate who seemed relatively moderate, a McCain or a Ross Perot, many independents jumped at him; but the whole problem is that fewer and fewer moderates reach the ballot. The result, Fiorina writes, is that “the extremes are overrepresented in the political arena and the center underrepresented.” The party system, he says, creates or inflames conflicts that are dear to the hearts of relatively small numbers of activists. “The activists who gave rise to the notion of a culture war, in particular, and a deeply polarized politics, in general, for the most part are sincere. They are polarized.” But ordinary people—did someone say “silent majority”?—are not.
Well. A grim diagnosis. That it is largely correct is simply beyond question. I say this as one of the frustrated independent voters who feel left behind by two self-absorbed and overzealous major parties. In particular, the practice of gerrymandering congressional districts to entrench partisans (and thus extremists) is a scandal, far more insulting to popular sovereignty than anything to do with campaign finance. But that is not the note I wish to end on. Something may be going right as well.
It seems odd that cultural peace should break out at the same time that political contentiousness grows. But perhaps it is not so odd. America may be culturally peaceful because it is so politically polarized. The most irritating aspect of contemporary American politics—its tendency to harp on and heighten partisan and ideological differences—may be, as computer geeks like to say, not a bug but a feature.
America’s polarized parties, whatever their flaws, are very good at developing and presenting crisp choices. How do you feel about abortion? A constitutional ban on same-sex marriage? Private Social Security accounts? School vouchers? Pre-emptive war? Well, you know which party to vote for. Thanks to the sharply divided political parties, American voters—including the ones in the center—get clear alternatives on most issues that matter. By presenting those alternatives, elections provide a sense of direction.
Moreover, although party polarization may disgruntle the center (can’t we be for stem-cell research and school vouchers?), it helps domesticate fanatics on the left and the right. Though you would be partly correct to say that the mainstream parties have been taken over by polarized activists, you could also say, just as accurately and a good deal more cheerfully, that polarized activists have been taken over by the mainstream parties. The Republican Party has acquired its distinctively tart right-wing flavor largely because it has absorbed—in fact, to a significant extent has organizationally merged with—the religious right. As Hanna Rosin reports elsewhere in this package, religious conservatives are becoming more uniformly Republican even as their faiths and backgrounds grow more diverse. On balance it is probably healthier if religious conservatives are inside the political system than if they operate as insurgents and provocateurs on the outside. Better they should write anti-abortion planks into the Republican platform than bomb abortion clinics. The same is true of the left. The clashes over civil rights and Vietnam turned into street warfare partly because activists were locked out by their own party establishments and had to fight, literally, to be heard. When Michael Moore receives a hero’s welcome at the Democratic National Convention, we moderates grumble; but if the parties engage fierce activists while marginalizing tame centrists, that is probably better for the social peace than the other way around.
In the end what may matter most is not that the parties be moderate but that they be competitive—which America’s parties are, in spades. Politically speaking, our fifty-fifty America is a divisive, rancorous place. The rest of the world should be so lucky.