When you read this, George W. Bush may be President, which will most likely mean that his lawyers, his brother Jeb and his Florida campaign co-chair and ambassadorial wannabe Katherine Harris succeeded in short-circuiting the manual recounts in Florida that had Al Gore's total edging upward. Or Gore--who, as we went to press, said he would abide by the results of a limited or, if Bush preferred, statewide hand recount--may have wrested victory from the jaws of premature concession because the hand-counted chads were hanging his way.
The bromide "every vote must count" has oft been uttered, but the Florida election ripped the veil off the many ways votes can be made not to count. Such as: Secretary of State Harris's refusal to redress blunders like the mysteriously unrecorded 6,600 presidential-line votes in Broward County; her selective tolerance of a 5 percent error rate in Florida's voting-card machines in an election with a far narrower margin; improprieties in the handling of GOP absentee ballots in Seminole County; closings of polling places in certain black precincts while voters were still waiting in line; and denial of requests for Creole interpreters.
In tandem with these ward-heeler power plays went the Bush forces' relentless stealth attack on democracy--the strategy seemed to be to sow confusion and doubt about the counting process. Leading the spinners was the pompous ex-Secretary of State James Baker, whose phalanx of lawyers sought an injunction in federal court--never mind the hypocrisy of champions of states' rights trying to overturn state election laws. Federal judge Donald Middlebrooks gave these ploys short shrift and underscored that recounts are not aberrations in our system but routine occurrences, which a body of state and local law exists to handle.
The polls showed that a majority of Americans approved of the idea that the votes be fully and fairly counted; it was mainly the conservative punditocracy and academic talking heads who called for Gore to fall on his sword. We were reminded of the run-up to impeachment, when some of these same tribunes were hectoring President Clinton to resign rather than "put the country through" a period of instability threatening to undermine democracy and the Free World. Such warnings were dusted off for Baker's PR drive, enlivened with dire threats that the market would go south if a recount continued (upon which the market, driven by its inner neuroses, went up). Conveniently forgotten was the fact that there's a President on the job until January 20.
As the legal/political maneuvers unfolded we were struck by the relevance of what contributors to this issue, among them Lani Guinier, Theodore Lowi and William Greider, are saying from different angles: First, that democracy is messy and unpredictable--something the elites abhor--and all the more reason to insure that every vote is duly counted; and second, that over the long term the aftermath of this election may be more important than the question of which contender wins the race--if it galvanizes citizens to take a fresh look at the American way of voting. Right now we live in a drafty old house, and our contributors propose some practical ways to fix the roof and shore up the foundation. As Americans have learned throughout history, our rights periodically have to be wrested back from elites trying to take them away--as the Bush team was caught doing in Florida.
For years many of us have called for a national conversation about what it means to be a multiracial democracy. We have enumerated the glaring flaws inherent in our winner-take-all form of voting, which has produced a steady decline in voter participation, underrepresentation of racial minorities in office, lack of meaningful competition and choice in most elections, and the general failure of politics to mobilize, inform and inspire half the eligible electorate. But nothing changed. Democracy was an asterisk in political debate, typically encompassed in a vague reference to "campaign finance reform." Enter Florida.
The fiasco there provides a rare opportunity to rethink and improve our voting practices in a way that reflects our professed desire to have "every vote count." This conversation has already begun, as several highly educated communities in Palm Beach experienced the same sense of systematic disfranchisement that beset the area's poorer and less-educated communities of color. "It felt like Birmingham last night," Mari Castellanos, a Latina activist in Miami, wrote in an e-mail describing a mammoth rally at the 14,000-member New Birth Baptist Church, a primarily African-American congregation in Miami. "The sanctuary was standing room only. So were the overflow rooms and the school hall, where congregants connected via large TV screens. The people sang and prayed and listened. Story after story was told of voters being turned away at the polls, of ballots being destroyed, of NAACP election literature being discarded at the main post office, of Spanish-speaking poll workers being sent to Creole precincts and vice-versa.... Union leaders, civil rights activists, Black elected officials, ministers, rabbis and an incredibly passionate and inspiring Marlene Bastiene--president of the Haitian women's organization--spoke for two or three minutes each, reminding the assembly of the price their communities had paid for the right to vote and vowing not to be disfranchised ever again."
We must not let this once-in-a-generation moment pass without addressing the basic questions these impassioned citizens are raising: Who votes, how do they vote, whom do they vote for, how are their votes counted and what happens after the voting? These questions go to the very legitimacy of our democratic procedures, not just in Florida but nationwide--and the answers could lead to profound but eminently achievable reforms.
§ Who votes--and doesn't? In Florida, as in the rest of the nation, only about half of all adults vote. Even more disturbing, nonvoters are increasingly low-income, young and less educated. This trend persists despite the Voting Rights Act, which since 1970 has banned literacy tests nationwide as prerequisites for voting--a ban enacted by Congress and unanimously upheld by the Supreme Court.
We are a democracy that supposedly believes in universal suffrage, and yet the differential turnout between high-income and low-income voters is far greater than in Europe, where it ranges from 5 to 10 percent. More than two-thirds of people in America with incomes greater than $50,000 vote, compared with one-third of those with incomes under $10,000. Those convicted of a felony are permanently banned from voting in Florida and twelve other states. In Florida alone, this year more than 400,000 ex-felons, about half of them black, were denied the opportunity to vote. Canada, on the other hand, takes special steps to register former prisoners and bring them into full citizenship.
§ How do they vote? Florida now abounds with stories of long poll lines, confusing ballots and strict limitations on how long voters could spend in the voting booth. The shocking number of invalid ballots--more ballots were "spoiled" in the presidential race than were cast for "spoiler" Ralph Nader--is a direct result of antiquated voting mechanics that would shame any nation, let alone one of the world's oldest democracies. Even the better-educated older voters of Palm Beach found, to their surprise, how much they had in common with more frequently disfranchised populations. Given how many decisions voters are expected to make in less than five minutes in the polling booth, it is common sense that the polls should be open over a weekend, or at least for twenty-four hours, and that Election Day should be a national holiday. By highlighting our wretched record on voting practices, Florida raises the obvious question: Do we really want large voter participation?
§ Whom do they vote for? Obviously, Florida voters chose among Al Gore, George Bush and a handful of minor-party candidates who, given their status as unlikely winners, were generally ignored when they were not being chastised as spoilers. But as many voters are now realizing, in the presidential race they were voting not for the candidates whose names they selected (or attempted to select) but for "electors" to that opaque institution, the Electoral College. Our constitutional framers did some things well--chiefly dulling the edge of winner-take-all elections through institutions that demand coalition-building, compromise and recognition of certain minority voices--but the Electoral College was created on illegitimate grounds and has no place in a modern democracy.
As Yale law professor Akhil Reed Amar argues, the Electoral College was established as a device to boost the power of Southern states in the election of the President. The same "compromise" that gave Southern states more House members by counting slaves as three-fifths of a person for purposes of apportioning representation (while giving them none of the privileges of citizenship) gave those states Electoral College votes in proportion to their Congressional delegation. This hypocrisy enhanced the Southern states' Electoral College percentage, and as a result, Virginia slaveowners controlled the presidency for thirty-two of our first thirty-six years.
Its immoral origins notwithstanding, the Electoral College was soon justified as a deliberative body that would choose among several candidates and assure the voice of small geographic areas. But under the Electoral College, voters in small states have more than just a voice; indeed their say often exceeds that of voters in big states. In Wyoming one vote in the Electoral College corresponds to 71,000 voters; in Florida, one electoral vote corresponds to 238,000 voters. At minimum we should eliminate the extra bias that adding electors for each of two senators gives our smallest states. As Robert Naiman of the Center for Economic and Policy Research reports, allowing each state only as many electors as it has members in the House of Representatives would mean, for example, that even if Bush won Oregon and Florida, he would have 216 and Gore would have 220 electoral votes.
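The disparity is simple division: a state's voters over its electors. Here is a minimal sketch of that arithmetic in Python, using the elector counts from the 2000 allocation and voter totals back-calculated from the ratios quoted above (illustrative assumptions, not census figures):

```python
# Voters per electoral vote: the small-state bias as simple arithmetic.
# Elector counts are the 2000 allocations; the voter totals are rough
# assumptions consistent with the ratios quoted in the text.
states = {
    "Wyoming": {"electors": 3, "voters": 213_000},
    "Florida": {"electors": 25, "voters": 5_950_000},
}
for name, s in states.items():
    print(f"{name}: {s['voters'] / s['electors']:,.0f} voters per elector")
# Wyoming: 71,000 voters per elector; Florida: 238,000 -- more than three
# times as many voters behind each electoral vote.
```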
Today its backers still argue that the Electoral College is necessary to insure that small states are not ignored by the presidential candidates. Yet the many states--including small ones--that weren't close in this election were neglected by both campaigns. Some of the nation's biggest states, with the most people of color, saw very little presidential campaigning and get-out-the-vote activity. Given their lopsided results this year, we can expect California, Illinois, New York, Texas and nearly all Southern states to be shunned in the 2004 campaign.
§ How are their votes counted? The presidency rests on a handful of votes in Florida because allocation of electoral votes is winner-take-all--if Gore wins by ten votes out of 6 million, he will win 100 percent of the state's twenty-five electoral votes. The ballots cast for a losing candidate are always "invalid" for the purposes of representation; only those cast for the winner actually "count." Thus winner-take-all elections underrepresent the voice of the minority and exaggerate the power of one state's razor-thin majority. Winner-take-all is the great barrier to representation of political and racial minorities at both the federal and the state level. No blacks or Latinos serve in the US Senate or in any governor's mansion. Third-party candidates did not win a single state legislative race except for a handful in Vermont.
Given the national questioning of the Electoral College sparked by the anomalous gap between the popular vote and the college's vote in the presidential election, those committed to real representative democracy now have a chance to shine a spotlight on the glaring flaws and disfranchisement inherent in winner-take-all practices and to propose important reforms.
What we need are election rules that encourage voter turnout rather than suppress it. A system of proportional representation--which would allocate seats to parties based on their proportion of the total vote--would more fairly reflect intense feeling within the electorate, mobilize more people to participate and even encourage those who do participate to do so beyond just the single act of voting on Election Day. Most democracies around the world have some form of proportional voting and manage to engage a much greater percentage of their citizens in elections. Proportional representation in South Africa, for example, allows the white Afrikaner parties and the ANC to gain seats in the national legislature commensurate with the total number of votes cast for each party. Under this system, third parties are a plausible alternative. Moreover, to allow third parties to run presidential candidates without being "spoilers," some advocate instant-runoff elections in which voters would rank their choices for President. That way, even voters whose top choice loses the election could influence the race among the other candidates.
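To make the instant-runoff idea concrete, here is a minimal sketch of such a count in Python; the candidates and ballots are hypothetical, and real election law adds tie-breaking and other details this omits:

```python
from collections import Counter

def instant_runoff(ballots):
    """Eliminate the last-place candidate each round, transferring those
    ballots to each voter's next surviving choice, until one candidate
    holds a majority of the remaining ballots. A simplified sketch."""
    ballots = [list(b) for b in ballots]  # don't mutate the caller's data
    while True:
        tallies = Counter(b[0] for b in ballots if b)
        total = sum(tallies.values())
        leader, top = tallies.most_common(1)[0]
        if top * 2 > total or len(tallies) == 1:
            return leader
        loser = min(tallies, key=tallies.get)  # fewest first-place votes
        ballots = [[c for c in b if c != loser] for b in ballots]

# A hypothetical three-way race: the third-party ballots transfer rather
# than "spoil," so those voters still influence the final round.
ballots = (
    [["Nader", "Gore", "Bush"]] * 5
    + [["Gore", "Nader", "Bush"]] * 46
    + [["Bush", "Gore", "Nader"]] * 49
)
print(instant_runoff(ballots))  # Gore wins once the Nader ballots transfer
```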
Winner-take-all elections, by contrast, encourage the two major parties to concentrate primarily on the "undecideds" and to take tens of millions of dollars of corporate and special-interest contributions to broadcast ads on the public airwaves appealing to the center of the political spectrum. Winner-take-all incentives discourage either of the two major parties from trying to learn, through organizing and door-knocking, how to mobilize the vast numbers of disengaged poor and working-class voters. Rather than develop a vision, they produce a product and fail to build political capacity from the ground up.
§ What happens after the voting? Our nation is more focused on elections now than it has been for decades; yet on any given Sunday, more people will watch professional football than voted this November. What democracy demands is a system of elections that enables minor parties to gain a voice in the legislature and encourages the development of local political organizations that educate and mobilize voters.
Between elections, grassroots organizations could play an important monitoring role now unfulfilled by the two major parties. If the Bush campaign is right that large numbers of ballots using the same butterfly format were thrown out in previous elections in Palm Beach, then something is wrong with more than the ballot. For those Democratic senior citizens in Palm Beach, it was not enough that their election supervisor was a Democrat. They needed a vibrant local organization that could have served as a watchdog, alerting voters and election officials that there were problems with the ballot. No one should inadvertently vote for two candidates; the same watchdog organizations should press for ballot-counting machines like those in some states that notify the voter of such problems before he or she leaves the booth. Voters should be asked, as on the popular TV quiz show, "Is that your final answer?" And surely we cannot claim to be a functioning democracy when voters are turned away from the polls or denied assistance in violation of both state and federal law.
Before the lessons of Florida are forgotten, let us use this window of opportunity to forge a strong pro-democracy coalition to rally around "one vote, one value." The value of a vote depends on its being fairly counted but also on its counting toward the election of the person the voter chose as her representative. This can happen only if we recognize the excesses of winner-take-all voting and stop exaggerating the power of the winner by denying the loser any voice at all.
A proposed 14.2 percent postage increase for periodicals was swept aside by the Postal Rate Commission in a recommendation issued on November 13. The five-member presidentially appointed commission approved increases that average just under 10 percent. In our view that's just about 10 percent too much, given that the Postal Service is--the Internet notwithstanding--the circulatory system of our democracy. The Nation was among the witnesses cited in the commission's 1,000-page opinion who warned about the potentially destructive impact of the proposed rate hikes on journals of opinion. We were pleased that the commission recognized these magazines as a category worthy of separate consideration, but next time we hope to persuade them that it's as wrong to tax ideas through postal-rate increases as it was to tax tea in colonial times.
Katha Pollitt is not writing a column this week; she will be back in two weeks.
HEARING THE OTHER SIDE
At a time when Israeli opinion has hardened against peace efforts, 120 Palestinian academics and activists published an "urgent statement to the Israeli public" as a paid ad in Israeli newspapers on November 10. The statement called for "a final historic reconciliation that would enable our two peoples to live in peace, human dignity and neighborly relations." The signers argued that the Oslo accords have been used to camouflage expansion of settlements and the continuing expropriation of Palestinian land, and that freedom of movement for Palestinians has been severely curtailed while settler violence against Palestinian communities continues. Resolving current inequities within the framework of the Oslo agreements, with exclusive American "brokerage," was now impossible, they said. Four principles were declared essential to a just peace agreement: ending the occupation of the territories captured in 1967; Palestinian sovereignty over East Jerusalem and recognition of the city as the capital of two states; Israel's acknowledgment of its responsibility in the creation of Palestinian refugees in 1948; and mutual respect for spiritual and historical sites.
As rain dances used to serve certain primitive tribes and scripture still serves true believers, the two-party system serves as the religion of the political class. Never mind that more than 50 percent of Americans may not share the civic religion, answering yes to pollsters when asked if they would prefer more than two choices (and that includes many regular voters as well as the bulk of habitual nonvoters). Nevertheless, every new party that has ever tried to establish itself has been treated by the political priesthood as a blasphemer--an evil force that inevitably contributes to the disastrous victory of the more detested of the two major candidates. Perot elected Clinton. Nader elects Bush.
The real culprit in the current election imbroglio is the two-party system itself and the state laws supporting it. These laws exist to discourage new parties. Florida has come in for special attention because of the current crisis, but Florida is typical among states. The beautiful irony is that the laws written to discourage third parties have proved to be a double-edged sword, cutting for the moment against those responsible for the existence of those laws.
Consider first how the laws work against all new parties. It is not Providence that takes an energetic social movement and crushes it as soon as it chooses to advance its goals through elections. It is the laws of the state here on earth that keep the party system on life support by preferring two parties above all others. The key example will be found in the laws of the states and Congress that mandate the single-member district system of representation plus the plurality or first-past-the-post method of election. Another historic example is provided by the "antifusion" laws in all but a half-dozen states, which prohibit joint nomination, whereby a third party seeks to nominate for its ticket the candidate already nominated by one of the major parties. Even the Supreme Court has approved such laws with the argument that having the same name in two places on the ballot would confuse the poor, defenseless voters.
Add to all this the new gerrymandering. Traditional gerrymandering was at least a genuine struggle between the majority parties to dilute the vote power of the other party by concentrating a maximum of their voters into a minimum of districts. The new method takes advantage of the Voting Rights Act by benign race-conscious gerrymandering in order to keep minorities within one of the major parties. In practice, blacks are guaranteed one or more additional Congressional or state legislature seats within the Democratic Party, while Republicans gain strength in districts from which the minority voters are evacuated.
Then there are the countless state laws that prescribe higher thresholds for the number of correct signatures required on third-party nominating petitions than for regulars on two-party ballots. Even the laws that apply equally to all parties are discriminatory, because they are written in such detail that ballot access for third-party candidates requires expensive legal assistance just to get through the morass of procedures. That mind-numbing detail is doubly discriminatory because the implementation of these laws thrusts tremendous discretion into the hands of the registrars, commissioners and election boards, all staffed by political careeristas of the two major parties, whose bipartisan presence is supposed to provide "neutrality with finality"--but it is common knowledge that they can agree with each other to manipulate the laws for the purpose of discouraging the candidacies of smaller and newer parties.
The same principles help explain why less than 50 percent of the electorate turns out to vote. Most of the blame goes to the forbidding proceduralism of registration, enrollment and eligibility and the discretionary power of local and county officials in implementation. And don't forget the gruesome timing of state election laws that restrict voting to one ordinary workday. The duopoly has a stake in low turnout. Virtually all expansion of the electorate (to include women, 18-year-olds, blacks) and the easing of restrictions on registration (judicial enforcement of the "motor voter" law) have been imposed on the state two-party systems from the outside by national social movements and federal courts.
Now, as poetic justice would have it, this legal structure is cutting the other way. Just look at the havoc it has wreaked: Loused-up ballots. Machine versus manual recounts. A lawyers' field day and the threat of court intervention that could cause a constitutional crisis or take Florida out of the electoral vote altogether. The Florida crunch can happen in any state where the results are extremely close and the outcome can change the national results.
That's because the two constituted parties cooperate well as a duopoly so long as market share is stable, with decisive election results. But whenever there is an extremely close election, the two parties become vicious antagonists, and the high stakes make it profitable for each to use its control of the electoral machinery as a weapon of mass destruction against the other. No war is more destructive than a civil war, and ordinarily the two parties have incentives to keep civil war from happening. Civil war in 2000 has broken out because two-party competition has turned from a public good to a public evil. The two-party system has at the moment become a menace to the Republic, made worse by the overwhelming weakness of the parties' presidential candidates and the impossibility of choosing between them when the only way to vote no for the candidate you hate is to vote yes for the one you can barely tolerate. And forget about having a good option when you hate both equally.
With Nader in the race, a lot of things got said that otherwise wouldn't have--no matter that the leading candidates excommunicated him. Making issues out of nonissues is what third parties are about, but those issues obviously did not create the stalemate we now confront. Stalemate is putting the case too mildly; mutual assassination is more like it. The crisis will not end with a certified recount in Florida. The civil war will continue, and the two parties will give us competition literally with a vengeance. Forget about smooth transitions. The FBI won't be ready with its security checks of top appointees, and the Senate will look at them with far greater than average scrutiny, even if the President's party is in the majority, because the Senate is run by sixty antifilibuster votes, not by mere majorities. That will apply in spades to judicial vacancies. Get ready for a Supreme Court of eight, seven, even six members, because as the vacancies occur, there'll be a majority against any nominee, even ones as mushy and fuzzy as President Bush or Gore will nominate. (The Constitution does not require any particular number of Justices on the Supreme Court.)
No exit? We have to turn the civic religion on its head and lionize the principle of a multiparty system, because its presence on a regular and expanded basis would relieve the two major parties of the need to be all things to everyone in order to get their phony majorities. We don't do that by inviting third parties to join the major parties on legal life support--as government-sponsored agencies. We do it by deregulating our politics. Hey, guys, deregulation. If you really meant it all these years, you Republicans and you Democrats, then be honest and deregulate yourselves. Take away the two-party safety net, by legislation and better yet by judicial review, and the democratic revolution can begin.
On Tuesday, November 14, exactly one week after Election Day (and with no President yet in sight), a notable though little-noted disclosure was made to the public. I do not mean the news that the federal judge in Florida had turned down the Republicans' stop-the-hand-count motion, or the news that Bush's lead in Florida was now 388 votes, or the news that a Florida state judge had waffled on Florida Secretary of State Katherine Harris's decree that no county votes would be counted if reported after the 5 pm deadline that afternoon, or, for that matter, anything else that was happening in the murk of the Sunshine State. I mean the news that, according to a poll released by the Washington Post and ABC News, 45 percent of the public wanted George Bush to become President whereas only 44 percent wanted Al Gore to become President (6 percent wanted "neither," 4 percent had no opinion and 1 percent wanted "other"). The claim was all the more striking in view of the hard contemporaneous fact that in the most recent count of the actual vote of November 7, Gore led Bush by a nationwide margin of 222,880 votes.
If anyone ever had doubts that politics in the United States is dominated by polling, this poll should put an end to them. A major poll was, in a manner of speaking, calling the election a full week after the vote--and reversing the known results.
The polls had been mercifully silent since the election. Many had good reason to be. Five of seven major ones had been "wrong" about the outcome of the election. That is, their final counts had failed to reflect the winner on Election Day (though some, it's true, were within the margin of error). The New York Times/CBS "final" poll, which put Bush at 46 percent and Gore at 41 percent, had the margin wrong by more than five points and Gore's final tally off by eight points. The Battleground poll, which gave Bush 50 percent to Gore's 45 percent, likewise got the margin wrong by five points. Others were more modestly in error. CNN gave Bush 48 percent and Gore 46 percent; in the Washington Post it was Bush 48 and Gore 45; and in the Pew Research Center poll (with undecided voters counted), it was Bush 49, Gore 47. Only the Zogby poll, which put Gore ahead in the popular vote by 48 to 46 percent, and a CBS election-morning tracking poll, which gave Gore 45 percent and Bush 44 percent, picked the right winner in the popular vote, and with a margin close to the actual result. All in all, Gore's victory in the popular vote came as a surprise. Of course, it's not literally true that the polls were wrong, since there is a margin of error, and people can change their minds between the day of the poll and the election. On the other hand, election results are the only check on the accuracy of polling that there is--they are to polling what experimentation is to scientific hypothesis--and there is no reason to suppose that a poll whose final measure is 8 percentage points off the election result is not 8 percentage points off year in, year out.
Considering the decisive importance that polling had throughout the race in every aspect of the campaign, including media coverage, fundraising and campaign strategy (in the last few weeks of the election, hearts were lifting and falling on single-point fluctuations in poll numbers), these discrepancies deserved much reflection. The reason they did not get it was that on election night the magicians of public opinion went on to make even more egregious and momentous errors, by prematurely predicting the winner in Florida twice and the winner of the national election once. (The election-night calls made by the television networks, which in turn are based on exit polling done by a single, nearly anonymous firm, the Voter News Service, are not quite the same as opinion polling, since they record a deed--voting--rather than an opinion, but their use of sampling techniques to predict outcomes places them in the same general category as other polls.)
The last of these mistakes, of course, led a credulous Gore to concede the election and then, minutes later, to retract the concession. For a few hours, the networks and the candidates appeared to have assumed the power to decide the election between them. There is every reason to believe, for instance, that George Bush would now be President-elect if, moments before his concession speech, Gore had not got the news that Florida had been declared undecided again. If Gore's concession had gone unretracted, Bush had made his acceptance speech and the country had gone to bed believing it had made its decision, it is scarcely imaginable that the close results in Florida would have been contested. Even now, many observers await a concession by one or another of the candidates as the decisive event. But it is not up to either the networks or the candidates to decide who is to be President; that matter is left under the Constitution to the voters, whose will, no matter how narrowly expressed, must be ascertained.
Then a week later, the polls that had played such an important and misleading role in the election were weighing in again, this time on the Florida battle. The poll that brought the startling, seemingly counterfactual news that Bush led Gore in the public's preference also revealed that six out of ten voters were opposed to legal challenges to the Florida results--possibly bad news for Gore, who had been considering a legal challenge to the infamous butterfly ballot in Palm Beach County. However, observers who did not like that conclusion could find comfort on the same day in a New York Times/CBS poll, which reported that another six out of ten were unworried about a delay in finally deciding upon the next President--good news for Gore, who had been relying on time-consuming hand recounts to erase Bush's narrow lead.
If, however, the arts of reading public opinion helped get us into our current mess, perhaps we can take comfort from the hope that they can also help us get out of it. Many observers have suggested that by failing to produce a clear mandate, the ever-changing vote-count of the year 2000--let's call it the Butterfly Election--will cripple the presidency of the winner. They need not worry too much. In our day, it is not only--perhaps not even mainly--elections that create mandates, once every four years. It is polling data that, day in and day out, create our impressions, however incompletely or inaccurately, of what the public wants. Let the new President act in a way that the public approves, as determined by a poll or two, and he will have all the mandate he needs to govern.
In Texas, vote-counters routinely count a dimpled chad as a vote for the candidate because it clearly establishes the voter's intent.
Three weeks ago, that sentence would have been gibberish, a sure sign that the writer had lost his mind. But I offer it today as the key point in the debate about who should be President and as proof positive that the Bush camp is being, to put it politely, disingenuous. Both Texas and Florida law hold that a voter's intent is all-important in determining how a vote is counted. An indented ballot--the now-famous dimpled or pregnant chad--has been interpreted in states from Texas to Massachusetts as proof that the voter intended to vote for a particular candidate.
All the Florida Supreme Court has done, by a unanimous vote, is to affirm that the manual count is legal, just as it would be in Texas. So what's the fuss? Why are all of the Bushies yapping about the possibility of a stolen election, given that what county election officials are now doing in Florida has long been the common practice in their candidate's home state?
George W. Bush is acting as if he believes the presidency is part of his natural inheritance. Otherwise, why wouldn't he gracefully play out the hand that the Florida Supreme Court has dealt and accept Al Gore's offer that both candidates pledge to support the decision of the voters as announced in four days, a decision that is still most likely to go Bush's way?
Even with the dimpled chad ballots included, Bush may be the next President, ambiguous though his victory may be. He did, after all, lose the national popular vote by more than 250,000 votes, which would make him the first popular-vote loser since 1888 to squeak through in the electoral college. But our system requires that, if that happens, he be granted the awesome powers of the presidency, in which case we should all give him the respect due to the occupant of that office.
By endorsing the manual count, the Florida Supreme Court made the best of a bad situation. The Bush team is solely responsible for not exercising its right--after Gore asked for recounts in several counties--to request hand counts in those counties where Bush could have picked up more votes. Instead, Bush and his aides have done their best to obstruct the fairest way to recount legitimate votes in disputed counties, and they have muddied the waters with their attacks on manual counting as some sort of Democratic plot. It isn't, as demonstrated by the widespread use of this device to check the fallibility of machines throughout the nation. Imperfect, yes; devious, no.
And what about the other voting irregularities in Florida, most of which seem to have cheated Gore? Consider the case of the Republican campaign helpers in Seminole County who were allowed to work in the registrar's office--some up to ten days--adding required information to thousands of absentee ballot applications that would otherwise have been disqualified; the flawed butterfly ballots in Palm Beach County; the tens of thousands of ballots of black voters around Jacksonville that were rejected because of a confusing ballot that led to double-punching.
The Gore campaign decided against asking that the outcome of the election be held up pending an investigation of those cases. Gore also stated that he wouldn't accept any electoral college votes cast for him by Bush electors in any state, and that he would willingly accept the results of the count underway in Florida as a final disposition of the presidential race, no matter the outcome.
The Bush camp appears ready to accept that result only if its man is the victor. Toward that end, it is willing to trample on the cherished Republican principle of states' rights by appealing to the US Supreme Court to overturn Florida's highest court. It has also threatened to use Florida's GOP-controlled state Legislature to undermine the court, making a hash of the principle of an independent judiciary.
The Bush blitzkrieg against the Democrats for exercising their right to ask for a manual count betrays the bipartisan cooperation that Bush promised during the campaign. It is neither candidate's fault that this, the most closely contested election in over a century, has proved so difficult to call.
Bush probably will win the electoral battle, but he will only emerge as a true winner by taking the high road now and joining Gore in pledging to be bound by the vote totals as reported to the secretary of state in keeping with the Florida Supreme Court's order.
There's an easy way to take your own pulse, and that of anyone you know, concerning the vertiginous events of the night of November 7. Was the apparent non-outcome really a "mess" or a crisis? Or was the pre-existing system a sordid mess and a crisis waiting to happen? If you choose the second explanation, then the meltdown of all the fixers and self-appointed gatekeepers and pseudo-experts, as well as being a source of joy, is also an unparalleled opportunity, an occasion for a long-postponed national seminar on democracy and how to get it.
To buy or not to buy turns out to have been the question of the century in America--Just Do It or Just Say No. And in the past fifteen years, consumer society has moved to the center of historical inquiry as well. It began with the social history of commercial culture and the advertising industry, in books such as Kathy Peiss's Cheap Amusements: Working Women and Leisure in Turn-of-the-Century New York (1986) and Roland Marchand's Advertising the American Dream (1985). Drawing inspiration from the pioneering anthropological explorations of Dick Hebdige (Subculture: The Meaning of Style, 1979), Arjun Appadurai (The Social Life of Things, 1988) and, especially, Mary Douglas and Baron Isherwood (The World of Goods, 1979), investigators then turned to the cultural history of how ordinary people use and assign meanings to commodities. A good example of this genre is Alison Clarke's Tupperware: The Promise of Plastic in 1950s America (1999). In recent works--such as Robert Collins's More: The Politics of Economic Growth in Postwar America (2000) and Alan Brinkley's The End of Reform: New Deal Liberalism in Recession and War (1995)--they have studied the political history of how nation-states promote and foster particular regimes of consumption. Where once consumption was deemed relevant only to the history of popular culture, in other words, it is now seen as intertwined with the central themes of American history, touching as it does on economics, politics, race relations, gender, the environment and other important topics.
Gary Cross, a professor at Penn State University and a pioneering and prolific historian of Europe and America, has explored the social, cultural and political dimensions of consumption before. In the past decade, he has published a half-dozen books on topics ranging from the history of leisure and working-class commercial amusements to the material culture of children's toys. Cross may study leisure, but his scholarship suggests that he doesn't take a whole lot of time to participate in consumer society. Fortunately, his work ethic has enabled the rest of us to understand our consumer ethic with clarity and historical perspective. Indeed, An All-Consuming Century displaces Daniel Horowitz's still-impressive but less wide-ranging The Morality of Spending (1985) as the best survey yet written of the history of modern American consumer society. Much more than a summary of recent scholarship (although it performs this task admirably), it is an informed, balanced, thoughtful and surprisingly passionate meditation on the making and meaning of our society. Avoiding the extremes of celebration and condemnation that too often pass for analysis, Cross's searching book is imbued with a generous concern for the revival of an active, democratic and participatory public sphere.
According to Cross, a paradox lies at the heart of American consumer society: It has been both an ideological triumph and a triumph over politics. Although it may be "difficult for Americans to see consumerism as an ideology," this is, Cross argues, precisely how it functions. It is, in his words, the "ism that won," the quiet but decisive victor in a century of ideological warfare. Over the course of the twentieth century it became naturalized to such an extent that few citizens "consider any serious alternatives or modifications to it."
In describing this ideological victory, Cross eschews conspiratorial interpretations of advertising and business collusion and gives consumer society its due for concretely expressing "the cardinal political ideals of the century--liberty and democracy--and with relatively little self-destructive behavior or personal humiliation." It won, Cross believes, because in large measure it met people's basic needs, helped them to fit into a diverse society even as it enabled them to forge new understandings of personal freedom, and served to fulfill, rather than mock, people's desire for the pleasures of the material world.
In spite of its popularity and successes, Cross believes that the ascension of consumer society has come at great cost: the abrogation of public life in favor of private thrills. By valorizing the private over the public and the present over the past and future, consumer society has "allowed little space for social conscience" and truly democratic politics. Rather than shoring up civil society, consumerism has pretty much replaced it: "The very idea of the primacy of political life has receded" as individual acquisition and use of goods has become the predominant way that Americans--and, increasingly, the rest of the industrialized world--make meaning of their lives. The suggestion that there should be limits to commercialism--that there are sacred places where the market does not belong--is, according to Cross, no longer taken seriously in a society that equates commercialism with freedom. Moreover, by the end of the century, "there seemed to be no moral equivalent to the world of consumption." The politics of consumption, in Cross's view, makes alternative conceptions of the good life virtually unimaginable in large part because it encourages people to think about themselves in isolation from the rest of society and from their history. (Reading Cross's book, I was reminded of Edward Hopper's painting Nighthawks, in which a customer at an urban diner sits alone, utterly disconnected from the humanity that surrounds him.) If Cross ultimately loses sight of the paradoxical nature of American consumerism and concludes on this dark note, An All-Consuming Century nonetheless provides important resources for others to explore the democratic potential of consumer society.
The narrative unfolds both chronologically and analytically. Cross divides the development of modern consumer society into four periods: 1900-1930, 1930-1960, 1960-1980 and 1980 to the end of the century. In this breakdown, the first three decades of the century were a takeoff period, during which a number of crucial elements converged to make America a consumer society. Cross consistently overstates the degree to which nineteenth-century America was a "traditional" society, untainted by commercialism; many elements of consumer society were born in the market revolution of the early 1800s and the corporate revolution of the later nineteenth century. But he is right to single out important developments that transformed the country from what we might call a nineteenth-century society with consumerist features to a full-blown consumer society in the twentieth century. The keys were increases in leisure time and personal income on the demand side, along with new products and innovations in selling on the supply side.
New, nationally advertised, branded products became widely available and affordable after the turn of the century. These products alleviated material needs, but more than that, Cross astutely notes, they became markers of new feelings of "comfort and ease" and "new sensations of power and speed." Modern products like cigarettes, candy and soft drinks made the sensational available on a daily, indeed almost hourly, basis. Amusement parks like Coney Island and other "cheap amusements" also made the regular purchase of spectacular thrills affordable for working people. In the consumer society, the utilitarian was always mixed with the sensual. The embodiment of this mixture was, of course, the great symbol of early-twentieth-century consumer society, the automobile. Mass culture in this period was already characterized by an increasing number of what Cross calls "private pleasures," but as he shows, it contributed to political and social changes as well: It blurred ethnic and class divisions and encouraged the children of immigrants to redefine themselves as members of a blended, multiethnic, if still racially segregated, youth culture.
The period 1930-1960 was one of consolidation in time of crisis. The constraints of the Great Depression and World War II led to a "frustrated consumerism more than a rejection of the capitalist system." Rather than blame the new consumerism, most policy-makers and indeed many ordinary Americans came to see "underconsumption" as the root cause of the slump. After the war, government policy encouraged the development of mass purchasing power rather than efforts to equalize the distribution of wealth. During the cold war, consumer society became "a positive answer to communism." In his 1959 "kitchen debate" with Nikita Khrushchev, Vice President Richard Nixon drove this point home by contrasting modern American appliances with outdated Soviet culinary technology. Despite the linkage in these years between consumption and freedom, Cross notes that the consumerism of the postwar years was not hedonistic but "domesticated," focused on the suburban home and the nuclear family. Signature developments of these years were Levittown, McDonald's and Holiday Inn, sites of responsible, respectable, family-oriented consumption.
From 1960 to 1980 consumer society faced a very different set of challenges but emerged stronger than ever. First, the counterculture challenged the very premises of consumerism, and in the 1970s, the specter of scarcity called into question the permanence of the cornucopia upon which consumer society depended. In spite of these challenges, "consumption became even more ubiquitous." Indeed, Cross suggests, the roots of the even more individualistic and socially fragmenting consumerism of the late twentieth century lay in part in the 1960s critique of consumerism: While countercultural figures critiqued conformity and idealized the "authentic self," many Americans sought to achieve this authenticity through consumption. Businesses began to modify the Fordist practice of mass production in favor of flexible production and segmented, demographically distinct markets. Drawing on the work of cultural critic Thomas Frank (rendered throughout the book as "Frank Thomas"), Cross writes that consumerism became "adaptable to the green and the hip." Similarly, during the energy crisis of the 1970s those politicians who took the shortage to be the result of overconsumption were rebuked as naysayers. With great political success, Ronald Reagan attacked President Jimmy Carter for a speech in which Carter had the temerity to suggest that "owning things and consuming things does not satisfy our longing for meaning." Reagan called that 1979 "malaise" address un-American in its pessimism and its call for restraint.
The trend toward fragmented, individualistic consumption accelerated during the last two decades of the century, an era that Cross labels "markets triumphant." Radical faith in the virtues of the market led politicians like Reagan to put a moral gloss on the "unfettered growth of market culture in the 1980s." Government constraints of an earlier era, in the form of environmental and advertising regulation, weakened, and commerce entered unfettered into areas where it had previously been kept at arm's length: children's homes and classrooms. By century's end the "Victorian notion that some time and place should be free from commerce" seemed as quaint as a Currier and Ives lithograph. Cross, who has a knack for unearthing telling statistics, notes that "supermarkets carried about 30,000 different products in 1996, up from 17,500 in 1986 and about 9,000 in the mid-1970s." Even the all-time-high consumer debt--$1.25 trillion by 1997--did nothing to stop the belief that the future of American prosperity and freedom depended upon the continuing expansion of the realm of consumption. Indeed, shopping had become the nation's primary form of entertainment, and monuments to consumption like the gargantuan 4.2-million-square-foot Mall of America became a haven for tourists from around the world.
In Cross's telling, the attractions and problems of consumer society are in effect one and the same: the cult of the new, immediate gratification and the valorization of "private pleasures." Consumerism is the "ism that won," owing to its ability not only to withstand challenges but, through a magical jujitsu, to co-opt them. Although initially formulated in terms neither celebratory nor condemnatory, Cross's story is ultimately one of declension. While he avoids the nostalgia of many commentators, there is little doubt that Cross finds contemporary consumer society to be a negative force: asocial, apolitical, amoral and environmentally dangerous. Whereas consumerism once helped integrate the diverse inhabitants of an immigrant nation in a youthful mass culture, by century's close, cynical marketers were happy to divide an equally multicultural nation into segmented demographic units based on "multiple and changing lifestyles." Thus the shift from an integrative, public-spirited popular culture in the early twentieth century to an increasingly privatized, solipsistic commercial culture of the late twentieth century. What was seductive in 1900--cornucopia and pleasure for the masses--became obscene by 2000, as a cultural stimulant turned into a dangerous narcotic.
An All-Consuming Century is one of the few indispensable works in the ever-expanding library of books on American consumer society. But in an otherwise rich overview the author has surprisingly little to say about the role of women, African-Americans and ethnic minorities (and nothing about regional variations) in the construction of consumer society. These are serious omissions. As admen and women's organizations recognized early on, women have performed the vast majority of the unpaid labor of consumer society: the shopping, budgeting and refashioning of older items. Cross notes that African-Americans were excluded from many of the benefits of the emerging mass culture, but he does not address the ways popular culture served to reinforce both the whiteness of the "new immigrants" from Eastern and Southern Europe--a skin privilege that was not yet fully acknowledged by the majority culture--and the otherness of Asian and Latino immigrants.
Nor does Cross discuss the attractions of nationwide retailers and national brands for African-Americans, who often took advantage of what the historian Edward Ayers has called the "anonymity and autonomy" made possible by the advent of the Sears catalogue (and chain stores in the nonsegregated North), whose mass customer base and "one price" system reduced the possibilities for racial discrimination that frequently accompanied visits to the corner store. For this group, the private pleasures occasionally afforded by the advent of national markets offered advantages over the public humiliations that so often accompanied local commerce.
Cross's relative neglect of women and minorities leads him to underestimate the importance of grassroots consumer activism as well, which has often been led by members of these groups. Meat boycotts, cost-of-living protests, "don't buy where you can't work" campaigns and sit-ins were integral to the development of American consumer society because they represented demands to expand the benefits of consumerism beyond a middle-class elite. One of the most important women's political organizations of the first half of the century, the National Consumers League, which pioneered the crusade for "ethical consumption" and labor rights, goes unmentioned. Cross stresses the ways marketers attempted to co-opt the civil rights movement, but he does not address the degree to which the demand for full participation in consumer society was a key ingredient in that crusade for social justice. By virtually ignoring these movements, Cross leaves out an important part of the story of consumer society--efforts to unite citizenship with consumption.
The critics of consumer society whom Cross discusses most often are proponents of what he calls the "jeremiad," the high-culture dismissal of mass culture as vulgar. He condemns the elitism and arrogance of such thinkers and is surely correct to note that their criticism had little impact on ordinary shoppers. Cross is less critical of the "simple living" tradition and calls the self-provisioning movement of the 1960s "the most positive aspect" of the counterculture. He argues that "the idea of the 'simple life,' perhaps never more than a daydream, had almost ceased being even a prick to the conscience," but he only briefly mentions the growing popularity of the "voluntary simplicity" movement, a topic addressed in more detail in Juliet Schor's The Overspent American (1998).
Cross also develops a persuasive critique of the consumer rights movement. While the Depression era saw the rise of groups like Consumers Union, which sought to make consumers a greater force against the power of business and advertisers, he notes that by focusing primarily on product quality and prices, many consumer rights groups have served only to reinforce "the individualism and the materialism of American consumption." This tradition of angry but apolitical individualism can still be found at innumerable websites, like starbucked.com, that highlight at great length the indignation of formerly loyal customers: "The sales clerk who sold me the machine was rude, then decidedly refused to hand over the free half pound of coffee given with every purchase of a Starbucks espresso machine...." The democratizing power of consumer demands for corporate responsibility is too often dissipated by such narrowly cast diatribes.
In spite of the failure of the jeremiad, the seeming irrelevance of simplicity and the individualization of the concept of consumer rights, Cross is too definitive about the nature of the "victory" of consumer society. Many Americans still recognize that however much advertisers and marketers attempt to cover it up, consumption is fundamentally a social and political act. So although it is true that "late twentieth century consumerism turned social problems into individual purchasing decisions," it is also the case that individual shopping decisions have frequently been viewed in the context of social problems. As consumer activists from the League of Women Shoppers in the 1930s through environmentalists today have pointed out, the goods that we buy leave ecological, labor and government "footprints." In spite of corporate attempts to fetishize goods, diligent activists like John C. Ryan and Alan Thein Durning of Northwest Environment Watch have described--and tried to estimate--the hidden social costs incurred by the purchase of quotidian products, including coffee and newspapers. The actions of students in the antisweatshop campaigns of recent years indicate that a growing number of consumers are looking behind the logo to determine the conditions under which the clothing they buy is made. As Naomi Klein has recently argued in No Logo: Taking Aim at the Brand Bullies, the ubiquity and importance of brands provide an opening for protesters who can threaten, through consumer boycotts and other actions, to sully corporate America's most valuable asset, the brand name. One teen in Klein's book puts it this way: "Nike, we made you. We can break you." Cross may decry the "inwardness of the personal computer," but the protests at the Seattle World Trade Organization and Washington International Monetary Fund meetings reveal that the Web creates alliances and expands social bonds. The history of consumer activism--and its recent incarnations--shows that consumerism does not necessarily lead to an antipolitics of radical individualism.
Cross does put forth important arguments about the "excesses of consumer culture": the environmental degradation, the waste, the lack of free time and the sheer mind-numbing meaninglessness that accompany modern consumerism. But these must be balanced with the recognition that most Americans, especially those in the working class, have viewed the enjoyment of the fruits of consumer society as an entitlement, not a defeat. This should not be dismissed as false consciousness or "embourgeoisement." Far from allowing consumerist demands to erode political impulses, working people--through living-wage, union-label and shorter-hour campaigns--have consistently politicized consumption. Rather than pitting the culture of consumption against democracy, it will be important to continue this tradition of democratizing, rather than demonizing, the culture of consumption. In his assessment of the twentieth century's most influential "ism," Cross provides important warnings about the difficulties of such an effort. But in its stress on the paradoxes of consumer society--an emphasis that then too rapidly gives way to condemnation--An All-Consuming Century also provides lessons from history about the necessity of the undertaking.
The quiet grace of Ring Lardner Jr., who died the other week at 85, seemed at odds with these noisy, thumping times. I cannot imagine Ring playing Oprah or composing one of those terribly earnest essays, "writers on writing," that keep bubbling to the surface of the New York Times. He was rightly celebrated for personal and political courage but underestimated, it seems to me, as a protean writer who was incapable of composing an awkward sentence. It ran against Ring's nature to raise his voice. Lesser writers, who shouted, drew more acclaim, or anyway more attention.
The obituaries celebrated his two Academy Awards but made less of other achievements. Ring's novel, The Ecstasy of Owen Muir, begun in 1950 while he was serving his now-famous prison sentence for contempt of Congress, drew a transatlantic fan letter from Sean O'Casey. Ring felt sufficiently pleased to have the longhand note framed under glass, which he then slipped into a shirt drawer. He was not about advertisements for himself. In 1976 he published The Lardners: My Family Remembered. Garson Kanin commented, "In the American aristocracy of achievement, the Lardners are among the bluest of blue bloods. In Ring Lardner, Jr. they have found a chronicler worthy of his subject. The Lardners is a moving, comical, patriotic book."
The progenitor was, of course, Ring Lardner Sr., the great short-story writer, who sired four sons, each of whom wrote exceedingly well. James Lardner was killed during the Spanish Civil War; David died covering the siege of Aachen during World War II; a heart attack killed John in 1960, when he was 47. Add Ring's prison term to the necrology and you would not have what immediately looks to be the makings of a "moving, comical" book. But The Lardners was that and more because of Ring Jr.'s touch and slant and his overview of what E.E. Cummings called "this busy monster, manunkind."
From time to time, Ring published splendid essays. The one form he avoided was the short story. He wrote, "I did not want to undertake any enterprise that bore the risk of inviting comparison with my father or the appearance of trading on his reputation."
We became close in the days following the death of John Lardner, who was, quite simply, the best sports columnist I have read. I set about preparing a collection, The World of John Lardner, and Ring, my volunteer collaborator, found an unfinished serio-humorous "History of Drinking in America." He organized random pages with great skill. Reading them I learned that the favorite drink of the Continentals, shivering at Valley Forge, was a Pennsylvania rye called Old Monongahela. George Washington called it "stinking stuff." At headquarters the general sipped Madeira wine.
A year or so later, with the blacklist still raging, I picked up Ring for lunch at the Chateau Marmont, an unusual apartment hotel on Sunset Boulevard near Hollywood. Outside the building, a fifty-foot statue of a cowgirl, clad in boots and a bikini, rotated on the ball of one foot, advertising a Las Vegas hotel. I asked the room clerk for Mr. Robert Leonard. Ring was writing some forgotten movie, but could not then work under his own name. "Robert Leonard" matched the initials on his briefcase.
This was a pleasant November day, but the blinds above Ring's portable typewriter were drawn. When I asked why, he opened them. His desk sat facing the bikinied cowgirl, bust-high. Every eighteen seconds those giant breasts came spinning round. "Makes it hard to work," Ring said and closed the blinds.
The Saturday Evening Post was reinventing itself during the 1960s, on the way to dying quite a glorious death, and with my weighty title there, editor at large, I urged Clay Blair, who ran things, to solicit a piece from Ring about the blacklist. Ring responded with a touching, sometimes very funny story that he called "The Great American Brain Robbery." He explained, "With all these pseudonyms, I work as much as ever. But the producers now pay me about a tenth of what they did when I was allowed to write under my own name."
Clay Blair lived far right of center, but Ring's story conquered him, and he said, "Marvelous. Just one thing. He doesn't say whether he was a member of the Communist Party. Ask him to put that in the story."
"I won't do that, Clay."
"He chose jail, rather than answer that question."
"Then, if he still won't, will he tell us why he won't?"
Ring composed a powerful passage.
The impulse to resist assaults on freedom of thought has motivated witnesses who could have answered no to the Communist question as well as many, like myself, whose factual response would have been yes. I was at that time a member of the Communist party, in whose ranks I found some of the most thoughtful, witty and generally stimulating men and women in Hollywood. I also encountered a number of bores and unstable characters.... My political activity had already begun to dwindle at the time [Congressman J. Parnell] Thomas popped the question, and his only effect on my affiliation was to prolong it until the case was finally lost. At that point I could and did terminate my membership without confusing the act, in my own or anyone else's head, with the quite distinct struggle for the right to embrace any belief or set of beliefs to which my mind and conscience directed me.
These words drove a silver stake into the black heart of the blacklist.
Ring won his first Oscar for Woman of the Year in 1942, and when he won his second, for M*A*S*H in 1970, numbers of his friends responded with cheering and tears of joy. The ceremony took place early in 1971, and Ring accepted the statuette with a brief speech. "At long last a pattern has been established in my life. At the end of every twenty-eight years I get one of these. So I will see you all again in 1999."
Indeed. Early in the 1990s I lobbied a producer who had bought film rights to my book The Boys of Summer to engage Ring for the screenplay. Ring, close to 80, worked tirelessly. A screenplay is a fictive work, and Ring moved a few days and episodes about for dramatic purposes. His scenario ended with the Brooklyn Dodgers winning the 1955 World Series from the Yankees and my account of that ballgame landing my byline on the front page of the New York Herald Tribune. The sports editor is congratulating me on a coherent piece when the telephone rings: My father has fallen dead on a street in Brooklyn; I am to proceed to Kings County Hospital and identify his body.
As I, or the character bearing my name, move toward the morgue, I bump into two beer-drunk Dodgers fans. One says, "What's the matter with him?" The other says, "He's sober. That's the matter with him." The body is there. It is my father's body. Beer drunks behind us, my mother and I embrace. Fin.
I can only begin to suggest all that Ring's scene implies. I would start with the point that winning the World Series is not the most important thing on earth, or even in Brooklyn. I was always careful not to embarrass Ring with praise, but here I blurted out, "This is the best bleeping screenplay I've ever read, Ringgold. Oscar III may come true in '99."
"Curious," Ring said. "I seem to have had the same thought myself."
The blacklisting bounders were now dead, but a new generation of Hollywood hounds refused to shoot Ring Lardner's scenario. The grounds: "a father-son angle" was not commercial. "It worked in Hamlet," Ring said, but to unhearing ears. And then we were talking about Ring writing a screenplay for a book I published in 1999 about Jack Dempsey and the Roaring Twenties. "Have to cut it back a bit," Ring said. "Following your text would give us the first billion-dollar picture."
Years ago, the critic Clifton Fadiman wrote that Ring Lardner Sr. was an unconscious artist and that his power proceeded from his hatred of the characters he created. Ring told me: "If my father hated anyone or anything, it was a critic like Fadiman. Unconscious artist? My father knew perfectly well how good he was and--better than anyone else--how hard it was to be that good."
Ring Jr. knew the very same thing about himself. Or so I believe. Yeats writes, "The intellect of man is forced to choose/Perfection of the life, or of the work." As well as anyone in our time, my suddenly late friend Ring Lardner came pretty damn close to achieving perfection in both.
Long before Carrie-Anne Moss rips open Val Kilmer's shirt and begins pounding his chest, providing him with a version of CPR that she must have learned from a Japanese drum troupe, the makers of Red Planet have resorted to their own thumpings and flailings, as if to resuscitate a film that's gone limp. It's a panic response, coming from people who have realized too late that the hookup of a radio would be a high point of their picture.
Their script has stuck Moss in a stricken spaceship that's orbiting Mars; by this point, her comrades Kilmer and Tom Sizemore have been marooned, incommunicado, on the planet's surface. So when the boys stumble upon an old circuit board in the dust, it's time for high-energy drama. "Let's do it!" shrieks Sizemore, as if he were starting the Indy 500. With a roar, guitars and drums begin pounding away on the soundtrack. Kilmer, in closeup, damn well solders a wire, sending a meteor shower's worth of sparks across the screen--at which point, back on the spaceship, Moss decides to strip down to a sleeveless T-shirt, giving us a much better view of her breasts.
I'm really grateful for the breasts. If not for them, I might have fallen asleep and missed the climactic scene, in which Kilmer performs a diagnostic check on a computer.
If only the makers of Red Planet had trusted in their story's essential schleppiness! Then, instead of giving us this lumbering, expensive beast, they might have realized the small but halfway-clever idea that's still dimly visible within: a story about the heroism-by-default of a spaceship janitor.
The character in question, a fellow named Gallagher, holds the job title of mechanical systems engineer; but to the rest of the personnel on this flight to Mars, that's like saying he's the guy who fixes the toilets. "It's high school," he remarks to a fellow civilian in the crew, after being brushed back by a swaggering NASA pilot. "They're the jocks, and we're the nerds." Just so. When he bumps into Moss--the ship's commander--on her way out of the unisex shower, Gallagher can think of nothing better to do than fumble with his fingers and blush. Later, when the outcome of the mission comes to rest on him, Moss has to give him a pep talk before he'll even get to his feet. Yet he's the guy who must save Earth from destruction and consummate a rendezvous with those breasts. What a role for Steve Buscemi! How the hell did it go to Val Kilmer?
He's good, of course. Kilmer is always good--but he's a guy who previously played Jim Morrison, Elvis and Batman. The only thing that's nerdlike about him is the hairdo he's been given for this picture, which is brushy and yellow and makes him look as if he's in crying need of a conditioner. Mind you, the premise of Red Planet is that all of Earth needs a conditioner. After these many years of environmental degradation, we've burned out our world and must colonize someplace else. Hence the desperate and very expensive project, in the year 2057, of sending Moss and her crew to Mars. Wouldn't it have been cheaper, as well as more practical, to institute a few conservation measures instead? No doubt. But humans, according to this movie, lack much capacity for self-discipline and forethought, and so must splurge on stupid but spectacular stunts. As if to prove this point, the producers have done their own splurging and hired Kilmer--the actorly equivalent of a rocket to Mars, compared with Buscemi's compost heap.
As they cast the lead, so too did they decide to ladle on the excitement: pounding guitars, sleeveless T-shirts, unmotivated shrieks. How were these choices made? I can venture a guess. The credits for Red Planet list three producers and two executive producers. This is a fairly standard aggregation in today's movie business; and with so many big shots keeping themselves busy on the picture, how could a mere idea survive? The story, written by a lone guy named Chuck Pfarrer, was almost sure to be buried alive; and into the dirt with it went a few other notions.
One of them might have involved some sexual role-play, based on the fact that the only females in the story are Moss, the shipboard computer (named Lucille) and a navigation robot called Amee. "She's my kind of girl," Gallagher says of the robot, just before it goes into killer mode. (It was designed for the Marines.) Somebody, maybe Pfarrer, seems to have wanted the nerdy Gallagher to feel ambivalent toward strong women: attracted to them when they shower, threatened by them when they turn into whirring kung-fu machines.
But since the production is at war with its own screenplay--have I mentioned that Red Planet is directed, more or less, by Antony Hoffman?--this kinky little idea is no better realized than the movie's religiosity. As far as I'm concerned, it's just as well that this latter theme gets only lip service. Ever since 2001: A Space Odyssey, Earthlings in Outer Space have sought God, and found light shows. At least Red Planet spares us that final cliché--though it still makes us listen to a lot of spiritual blather.
Those Deep Thoughts are provided by Terence Stamp, who manages to be the crew's world-famous scientist despite having abandoned rationalism. Science cannot provide the answers he craves, Stamp explains to a sweetly patient Kilmer, and so he has turned to religion. Kilmer obligingly spends the rest of the picture looking for a divine purpose--which doesn't seem so misguided, considering the level of scientific expertise around him. When the crew's biologist (Sizemore) discovers a life form on Mars, he cries out, "Nematodes!" Either he's forgotten his Linnaeus--nematodes are worms--or else the solution to God's mysteries is to be found not in Outer Space but in the pages of old sci-fi magazines. These creatures are clearly arthropods: the genre's usual bugs.
Fans of the platoon-in-space movie will want to know that the Mars scenery is furnished with the necessary rocks, peaks and ravines. Fans of Carrie-Anne Moss--meaning the adolescent boys, of whatever age, who admired The Matrix--will want to know that here, too, she gets to fly around. Not every actress is suited to antigravity; and so, until such time as Moss gets the chance to deliver a performance, I will congratulate her on giving good float.
New York City
In The Unexpected Legacy of Divorce, Judith Wallerstein argues that the consequences of parental divorce for children are typically harmful and long-lasting. Katha Pollitt disagrees ["Subject to Debate," Oct. 23], charging that Wallerstein's study cannot be trusted because her sample is too small and because the families she studied suffer from multiple problems, not just divorce. Wallerstein's anti-anti-divorce critics have been making these charges for years, but fortunately we now have independent evidence to show who is getting it right. The two most important quantitative studies based on representative samples seeking to distinguish the effects of parental divorce from the effects of pre-divorce family problems are A Generation at Risk (1997), by Paul Amato and Allan Booth, and a study by Andrew Cherlin and colleagues published in the American Sociological Review in 1998. Both studies broadly support Wallerstein's main findings and offer little or no support to her critics. Which is why your readers aren't likely to hear about these studies from Katha Pollitt, even as she improbably appoints herself guardian of the scientific method on this topic.
Institute for American Values
In her strident column, Katha Pollitt attacks my book, The Unexpected Legacy of Divorce, with a plethora of misstatements. I reply to the most egregious. I state categorically several times in my book that I am not against divorce: "I am not against divorce. How could I be? I've probably seen more examples of wretched, demeaning, and abusive marriages than most of my colleagues." And further: "I don't know of any research, mine included, that says divorce is universally detrimental to children."
What I do say throughout my book, which Pollitt chooses to ignore, is that when people decide to divorce, it has a short-term and long-term traumatic effect upon the children that makes their subsequent life journey more difficult, and that society, including the courts and their parents, must recognize this and take steps to mitigate the impact. They can come out well, as I demonstrate. It is simply harder.
Pollitt attacks my work as "pseudoscience." My method, the case-study method of qualitative research, is well established in biological and social science. It is a major method in medicine, psychiatry, psychology and anthropology. It depends on intensive interviewing to learn the internal landscape of the person and is the chief source of hypothesis formation and knowledge generation. The twentieth-century contributions of Piaget, Freud, Erikson and Bowlby were based on this method. Quantitative survey research cannot tap inner life experience.
About controls, Pollitt is again wrong. At the twenty-five-year mark, when I was elaborating the life experience of these children from childhood into adulthood, I assembled a comparison group of youngsters in the same neighborhoods with parents in comparable social and economic circumstances. They were matched (as a group, as most sociological studies do) along major parameters that I found relevant to my study. It is not clear what more Pollitt, who is not a behavioral scientist, could have in mind. This group was not solicited earlier because I was starting in a new area--no one had studied the impact of divorce on children before me--and I could not have known then what to control.
Pollitt asserts that mine is a skewed sample consisting of "crazy" divorcing parents, primarily responding to an offer of treatment. Certainly, people who are divorcing are distressed at the time and can exhibit very disturbed behaviors, including violence, which did not characterize the prior relationship until its downhill course brought the couple to the divorce decision. Certainly, people who divorce--across the board, not just in my sample--are people who have failed at a central relationship and therefore may have more psychic disturbances than those who maintain successful marriages.
It is shocking to call my sample "crazy" and therefore not representative of a divorcing population. I was looking for divorce under the best of circumstances and was happy to have such a relatively affluent and well-educated sample. It is altogether untrue that they came as "sixty disastrous families, featuring crazy parents, economic insecurity [and] trapped wives." As for the 131 children of the sixty couples, they were screened to be developmentally on course, without significant school, home or play disturbances prior to the breakup.
Pollitt states that our world has changed significantly since my study began in 1971. Fathers are now more actively involved with their children, mothers are now better placed economically, etc. It remains to be seen how much difference this makes. In my study, a significant number of the women had professional degrees and careers, and that has not made the post-divorce relationships of their children significantly better than the others in the cohort. And it remains to be seen, when 50 percent of divorces occur with children under 6, and 75 percent of the divorced fathers remarry, how many fathers can maintain their parenting in the first marriage, while living with the requirements of the second marriage and new children.
My main point is that although I was the first to call attention to the traumatic impact of the divorce experience, in the early seventies, there have by now been many corroborating studies--done in our contemporary climate--that uphold my findings, and what seemed to many an alarmist view then is now conventional wisdom. I trust that my current findings on the long-term impact of parental divorce, which crescendos as these children face the issue of man-woman relationships in adulthood, will be similarly confirmed as further studies are carried out by others.
JUDITH S. WALLERSTEIN
New York City
It is true that Judith Wallerstein says in her latest book that she is not "against divorce." But what does that mean? She writes, "I think you should seriously consider staying together for the sake of your children," and she praises parents who stay in unhappy or dead marriages "with grace and without anger" but not those who leave such marriages and still put parenting first, something she seems to think is nearly impossible ("parenting erodes almost inevitably at the breakup and does not get restored for years, if ever"). Everyone who has written about her research takes it to argue that divorce is a great evil, to be avoided if at all possible--certainly that is what David Blankenhorn thinks she is saying. If Wallerstein is not "against divorce," why does she sit on the Council on Families of Blankenhorn's Institute for American Values, which has an explicit antidivorce agenda, opposing no-fault divorce and favoring "covenant marriage," waiting periods and mandatory counseling?
Wallerstein compares her methods to those of illustrious modern psychologists. It's odd to see Freud, who has been widely criticized for massaging his data when he didn't make it up, invoked as a model practitioner, but in any case, none of these men co-wrote their books with popular journalists (in Wallerstein's case, Sandra Blakeslee), used composite characters or presented interviews conducted by other people as if they had done the interviewing themselves. Case studies are all very well, perhaps even when written up with an obvious eye to mass-market advertising and media soundbites, but interviewing people for a few hours every five years (or listening to the tapes of such interviews by others) is not "intensive interviewing"--it's a conversation, a visit, a tête-à-tête. Nor is a group of high school classmates of one's original subjects assembled twenty-five years into one's research a valid scientific control. Besides, as she herself notes, the comparison group parents were much better educated and wealthier.
As a sample of children whose parents are divorced, Wallerstein's 131 subjects leave much to be desired. For one thing, she didn't follow up on the ones who dropped out--thirty-eight people, almost 30 percent of the original group! If, as is likely, the ones who stayed were the ones with more problems, and the ones who left were the ones who adjusted well to divorce and moved on with their lives, then failing to do "case studies" of the dropouts leaves her with a sample biased toward gloomy findings. (That Wallerstein's continuing subjects came disproportionately from families that had a hard time coping with divorce is suggested by the fact that 32 percent of their mothers had only a high school diploma or less versus 24 percent of the mothers in the original group.) It is disturbing that Wallerstein seems incurious about the melting away of so many of her original subjects, and the result is that she not only cannot say how representative her interviewees are of "children of divorce" in general, she can't even say how representative they are of her own sample!
As she has done many times in recent years, Wallerstein fudges the fact that she recruited her group--and skewed her sample--by offering free therapy, as she herself clearly acknowledged in her first report on her study, Surviving the Breakup. Similarly, although she professes herself shocked by the word "crazy," it was she who, in the same book, described her sample as consisting largely of people who were mentally or emotionally troubled. According to her own words, only one-third of the parents in her sample were "those whose functioning overall during the life history of the marriage was generally adequate or better." Roughly 50 percent were "moderately disturbed"--nor does she suggest in the earlier book that this is a temporary aberration caused by the stress of divorce, as she now claims. On the contrary, she speaks of addictions, suicidal tendencies, chronic depression, "severe neurotic difficulties," "handicaps in relating to another person" and "longstanding problems in controlling their rage or sexual impulses." This is half the parents. The remainder--15 percent of the men and 20 percent of the women--were "severely troubled during their marriages, perhaps throughout their lives," with "histories of mental illness, including paranoid thinking, bizarre behavior, manic-depressive illnesses, and generally fragile or unsuccessful attempts to cope with the demands of life, marriage and family." Divorce or no divorce, the offspring of such people are not likely to reach adulthood unscathed.
Wallerstein labels her subjects "children of divorce." The very process of participating in her study may have encouraged her subjects to embrace that self-definition, as "children of alcoholics" often view their lives through the lens of parental drinking, which is taken to explain every possible deviation from the ideal. Another researcher might label them "children of the emotionally or mentally ill." Perhaps, as Wallerstein seems to believe, two disturbed parents are better than one. But that tells us little about what the effects of divorce are for the children of parents who are nonviolent, sane, stable and capable of loving and responsible relationships. Paradoxically, these parents, the ones most likely to raise healthy kids after divorce, are the ones most likely to heed Wallerstein's advice to remain in bad marriages.
SISTERHOOD WAS SOURFUL
In "When Women Spied on Women" [Sept. 4/11], the piece excerpted from Ruth Rosen's book regarding women spying on and infiltrating the women's movement, contained a reference to Seattle that I wish to clarify. Rosen repeats Betty Friedan's contention that the FBI had infiltrated a number of women's organizations and manipulated the gay-straight split. She cites Friedan's charge that when she was invited to speak in Seattle, she was met with protesters. Friedan told Rosen that she thought "the Seattle thing was [the result] of agents."
I was active in the left and the women's liberation movement in Seattle, am writing a book about that movement, helped organize the protests against Friedan--and I know I wasn't an agent. The protest against Friedan was organized mainly by individuals and groups like the University YWCA, the University of Washington Women's Commission and the Seattle Gay Women's Alliance, who were disturbed by Friedan's homophobia. One of the main organizers of that protest was Mary Aiken Rothschild, now a professor of women's studies at the University of Arizona, then a PhD candidate in history and acting director of the University of Washington women's studies program. In an interview with Pandora, a local feminist newsletter, Rothschild challenged "Friedan's idea of what a feminist movement is about.... Mary defined herself as a straight woman, a mother, a professional who supported her lesbian sisters. Gay and straight women work together in Seattle and that is why we are getting somewhere. We didn't need big name leaders from the outside coming in to disrupt our movement."
Those of us involved in organizing the protest were very proud of our activities. We had attempted to convince Friedan to share a platform with an activist in the lesbian movement. She refused. We tried to meet with her and discuss her political point of view. She refused. So, we confronted her at a cocktail party and then at her public meeting at the University of Washington. This was not the first time that the radical women's liberation movement publicly confronted movement "leaders." In the fall of 1972, largely through the efforts of the University of Washington Women's Commission, we met with and publicly demonstrated against Gloria Steinem. The women were particularly critical of her support for the Democratic Party and her role in voting down the pro-choice plank at the 1972 Democratic convention.
As Rosen demonstrates, many women were hardly "sisterly." Others, to their discredit, accused women of being agents, provocateurs or male-identified as a way to dismiss their ideas or personas. As historians, especially as historians of our own movement, we have an obligation not to leave these charges unanswered.
I can verify that the FBI continued its surveillance of the women's movement long after the late sixties and early seventies. Three months after my book Mothers on Trial: The Battle for Children and Custody was published, in 1986, the FBI convened the first grand jury in the history of our country to question an American citizen, me, about the whereabouts of a missing mother and her "allegedly" sexually abused daughter, who had fled "underground" when a court awarded custody of the girl to the "alleged" paternal incest-abuser. Had I been granted immunity to testify before the Buffalo grand jury and failed to do so, I might have sat in jail for a long time.
What terrified me was the possibility that few feminists understood that my silence was a political act. At the time (long before TV began to air docudramas about a Mother's Underground) virtually none of the liberal feminist organizations with whom I had worked on other issues--NOW, NOW's Legal Defense and Education Fund, the National Center for Women and Family Law, Ms. magazine and Foundation--were institutionally or ideologically ready to face this kind of danger. Lawyer Margy Ratner of the Center for Constitutional Rights was, and the center stood by me. A few days before the grand jury was to take place, I received a call from the FBI telling me that they "had captured the felon" and that my testimony was no longer needed. Meanwhile, feminist and lesbian networks were disbanded, a number of feminist and lesbian lawyers were harassed, and one lost her license to practice law in Mississippi because she dared to represent a mother in a similar circumstance. The FBI successfully hunted down and jailed a number of runaway mothers.