
News and Features

When you read this, George W. Bush may be President, which will most likely mean that his lawyers, his brother Jeb and his Florida campaign co-chair and ambassadorial wannabe Katherine Harris succeeded in short-circuiting the manual recounts in Florida that had Al Gore's total edging upward. Or Gore--who, as we went to press, said he would abide by the results of a limited or, if Bush preferred, statewide hand recount--may have wrested victory from the jaws of premature concession because the hand-counted chads were hanging his way.

The bromide "every vote must count" has oft been uttered, but the Florida election ripped the veil off the many ways votes can be made not to count. Such as: Secretary of State Harris's refusal to redress blunders like the mysteriously unrecorded 6,600 presidential-line votes in Broward County; her selective tolerance of a 5 percent error rate in Florida's voting-card machines in an election with a far narrower margin; improprieties in the handling of GOP absentee ballots in Seminole County; closings of polling places in certain black precincts while voters were still waiting in line; and denial of requests for Creole interpreters.

In tandem with these ward-heeler power plays went the Bush forces' relentless stealth attack on democracy--the strategy seemed to be to sow confusion and doubt about the counting process. Leading the spinners was the pompous ex-Secretary of State James Baker, whose phalanx of lawyers sought an injunction in federal court--never mind the hypocrisy of champions of states' rights trying to overturn state election laws. Federal judge Donald Middlebrooks gave these ploys short shrift and underscored that recounts are not aberrations in our system but routine occurrences, which a body of state and local law exists to handle.

The polls showed that a majority of Americans approved of the idea that the votes be fully and fairly counted; it was mainly the conservative punditocracy and academic talking heads who called for Gore to fall on his sword. We were reminded of the run-up to impeachment, when some of these same tribunes were hectoring President Clinton to resign rather than "put the country through" a period of instability threatening to undermine democracy and the Free World. Such warnings were dusted off for Baker's PR drive, enlivened with dire threats that the market would go south if a recount continued (upon which the market, driven by its inner neuroses, went up). Conveniently forgotten was the fact that there's a President on the job until January 20.

As the legal/political maneuvers unfolded we were struck by the relevance of what contributors to this issue, among them Lani Guinier, Theodore Lowi and William Greider, are saying from different angles: First, that democracy is messy and unpredictable--something the elites abhor--and all the more reason to insure that every vote is duly counted; and second, that over the long term the aftermath of this election may be more important than the question of which contender wins the race--if it galvanizes citizens to take a fresh look at the American way of voting. Right now we live in a drafty old house, and our contributors propose some practical ways to fix the roof and shore up the foundation. As Americans have learned throughout history, our rights periodically have to be wrested back from elites trying to take them away--as the Bush team was caught doing in Florida.

To buy or not to buy turns out to have been the question of the century in America--Just Do It or Just Say No. And in the past fifteen years, consumer society has moved to the center of historical inquiry as well. It began with the social history of commercial culture and the advertising industry, in books such as Kathy Peiss's Cheap Amusements: Working Women and Leisure in Turn-of-the-Century New York (1986) and Roland Marchand's Advertising the American Dream (1985). Drawing inspiration from the pioneering anthropological explorations of Dick Hebdige (Subculture: The Meaning of Style, 1979), Arjun Appadurai (The Social Life of Things, 1988) and, especially, Mary Douglas and Baron Isherwood (The World of Goods, 1979), investigators then turned to the cultural history of how ordinary people use and assign meanings to commodities. A good example of this genre is Alison Clarke's Tupperware: The Promise of Plastic in 1950s America (1999). In recent works--such as Robert Collins's More: The Politics of Economic Growth in Postwar America (2000) and Alan Brinkley's The End of Reform: New Deal Liberalism in Recession and War (1995)--they have studied the political history of how nation-states promote and foster particular regimes of consumption. Where once consumption was deemed relevant only to the history of popular culture, in other words, it is now seen as intertwined with the central themes of American history, touching as it does on economics, politics, race relations, gender, the environment and other important topics.

Gary Cross, a professor at Penn State University and a pioneering and prolific historian of Europe and America, has explored the social, cultural and political dimensions of consumption before. In the past decade, he has published a half-dozen books on topics ranging from the history of leisure and working-class commercial amusements to the material culture of children's toys. Cross may study leisure, but his scholarship suggests that he doesn't take a whole lot of time to participate in consumer society. Fortunately, his work ethic has enabled the rest of us to understand our consumer ethic with clarity and historical perspective. Indeed, An All-Consuming Century displaces Daniel Horowitz's still-impressive but less wide-ranging The Morality of Spending (1985) as the best survey yet written of the history of modern American consumer society. Much more than a summary of recent scholarship (although it performs this task admirably), it is an informed, balanced, thoughtful and surprisingly passionate meditation on the making and meaning of our society. Avoiding the extremes of celebration and condemnation that too often pass for analysis, Cross's searching book is imbued with a generous concern for the revival of an active, democratic and participatory public sphere.

According to Cross, a paradox lies at the heart of American consumer society: It has been both an ideological triumph and a triumph over politics. Although it may be "difficult for Americans to see consumerism as an ideology," this is, Cross argues, precisely how it functions. It is, in his words, the "ism that won," the quiet but decisive victor in a century of ideological warfare. Over the course of the twentieth century it became naturalized to such an extent that few citizens "consider any serious alternatives or modifications to it."

In describing this ideological victory, Cross eschews conspiratorial interpretations of advertising and business collusion and gives consumer society its due for concretely expressing "the cardinal political ideals of the century--liberty and democracy--and with relatively little self-destructive behavior or personal humiliation." It won, Cross believes, because in large measure it met people's basic needs, helped them to fit into a diverse society even as it enabled them to forge new understandings of personal freedom, and served to fulfill, rather than mock, people's desire for the pleasures of the material world.

In spite of its popularity and successes, Cross believes that the ascension of consumer society has come at great cost: the abrogation of public life in favor of private thrills. By valorizing the private over the public and the present over the past and future, consumer society has "allowed little space for social conscience" and truly democratic politics. Rather than shoring up civil society, consumerism has pretty much replaced it: "The very idea of the primacy of political life has receded" as individual acquisition and use of goods has become the predominant way that Americans--and, increasingly, the rest of the industrialized world--make meaning of their lives. The suggestion that there should be limits to commercialism--that there are sacred places where the market does not belong--is, according to Cross, no longer taken seriously in a society that equates commercialism with freedom. Moreover, by the end of the century, "there seemed to be no moral equivalent to the world of consumption." The politics of consumption, in Cross's view, makes alternative conceptions of the good life virtually unimaginable in large part because it encourages people to think about themselves in isolation from the rest of society and from their history. (Reading Cross's book, I was reminded of Edward Hopper's painting Nighthawks, in which a customer at an urban diner sits alone, utterly disconnected from the humanity that surrounds him.) If Cross ultimately loses sight of the paradoxical nature of American consumerism and concludes on this dark note, An All-Consuming Century nonetheless provides important resources for others to explore the democratic potential of consumer society.

The narrative unfolds both chronologically and analytically. Cross divides the development of modern consumer society into four periods: 1900-1930, 1930-1960, 1960-1980 and 1980 to the end of the century. In this breakdown, the first three decades of the century were a takeoff period, during which a number of crucial elements converged to make America a consumer society. Cross consistently overstates the degree to which nineteenth-century America was a "traditional" society, untainted by commercialism; many elements of consumer society were born in the market revolution of the early 1800s and the corporate revolution of the later nineteenth century. But he is right to single out important developments that transformed the country from what we might call a nineteenth-century society with consumerist features to a full-blown consumer society in the twentieth century. The keys were increases in leisure time and personal income on the demand side, along with new products and innovations in selling on the supply side.

New, nationally advertised, branded products became widely available and affordable after the turn of the century. These products alleviated material needs, but more than that, Cross astutely notes, they became markers of new feelings of "comfort and ease" and "new sensations of power and speed." Modern products like cigarettes, candy and soft drinks made the sensational available on a daily, indeed almost hourly, basis. Amusement parks like Coney Island and other "cheap amusements" also made the regular purchase of spectacular thrills affordable for working people. In the consumer society, the utilitarian was always mixed with the sensual. The embodiment of this mixture was, of course, the great symbol of early-twentieth-century consumer society, the automobile. Already characterized by an increasing number of what Cross calls "private pleasures," in this period, as he shows, mass culture contributed to political and social changes as well: It blurred ethnic and class divisions and encouraged the children of immigrants to redefine themselves as members of a blended, multiethnic, if still racially segregated, youth culture.

The period 1930-1960 was one of consolidation in time of crisis. The constraints of the Great Depression and World War II led to a "frustrated consumerism more than a rejection of the capitalist system." Rather than blame the new consumerism, most policy-makers and indeed many ordinary Americans came to see "underconsumption" as the root cause of the slump. After the war, government policy encouraged the development of mass purchasing power rather than efforts to equalize the distribution of wealth. During the cold war, consumer society became "a positive answer to communism." In his 1959 "kitchen debate" with Nikita Khrushchev, Vice President Richard Nixon drove this point home by contrasting modern American appliances with outdated Soviet culinary technology. Despite the linkage in these years between consumption and freedom, Cross notes that the consumerism of the postwar years was not hedonistic but "domesticated," focused on the suburban home and the nuclear family. Signature developments of these years were Levittown, McDonald's and Holiday Inn, sites of responsible, respectable, family-oriented consumption.

From 1960 to 1980 consumer society faced a very different set of challenges but emerged stronger than ever. First, the counterculture challenged the very premises of consumerism, and in the 1970s, the specter of scarcity called into question the permanence of the cornucopia upon which consumer society depended. In spite of these challenges, "consumption became even more ubiquitous." Indeed, Cross suggests, the roots of the even more individualistic and socially fragmenting consumerism of the late twentieth century lay in part in the 1960s critique of consumerism: While countercultural figures critiqued conformity and idealized the "authentic self," many Americans sought to achieve this authenticity through consumption. Businesses began to modify the Fordist practice of mass production in favor of flexible production and segmented, demographically distinct markets. Drawing on the work of cultural critic Thomas Frank (rendered throughout the book as "Frank Thomas"), Cross writes that consumerism became "adaptable to the green and the hip." Similarly, during the energy crisis of the 1970s those politicians who took the shortage to be the result of overconsumption were rebuked as naysayers. With great political success, Ronald Reagan attacked President Jimmy Carter for a speech in which Carter had the temerity to suggest that "owning things and consuming things does not satisfy our longing for meaning." Reagan called that 1979 "malaise" address un-American in its pessimism and its call for restraint.

The trend toward fragmented, individualistic consumption accelerated during the last two decades of the century, an era that Cross labels "markets triumphant." Radical faith in the virtues of the market led politicians like Reagan to put a moral gloss on the "unfettered growth of market culture in the 1980s." Government constraints of an earlier era, in the form of environmental and advertising regulation, weakened, and commerce entered unfettered into areas where it had previously been kept at arm's length: children's homes and classrooms. By century's end the "Victorian notion that some time and place should be free from commerce" seemed as quaint as a Currier and Ives lithograph. Cross, who has a knack for unearthing telling statistics, notes that "supermarkets carried about 30,000 different products in 1996, up from 17,500 in 1986 and about 9,000 in the mid-1970s." Even the all-time-high consumer debt--$1.25 trillion by 1997--did nothing to stop the belief that the future of American prosperity and freedom depended upon the continuing expansion of the realm of consumption. Indeed, shopping had become the nation's primary form of entertainment, and monuments to consumption like the gargantuan 4.2-million-square-foot Mall of America became havens for tourists from around the world.

In Cross's telling, the attractions and problems of consumer society are in effect one and the same: the cult of the new, immediate gratification and the valorization of "private pleasures." Consumerism is the "ism that won," owing to its ability not only to withstand challenges but, through a magical jujitsu, to co-opt them. Although initially formulated in terms neither celebratory nor condemnatory, Cross's story is ultimately one of declension. While he avoids the nostalgia of many commentators, there is little doubt that Cross finds contemporary consumer society to be a negative force: asocial, apolitical, amoral and environmentally dangerous. Whereas consumerism once helped integrate the diverse inhabitants of an immigrant nation in a youthful mass culture, by century's close, cynical marketers were happy to divide an equally multicultural nation into segmented demographic units based on "multiple and changing lifestyles." Thus the shift from an integrative, public-spirited popular culture in the early twentieth century to an increasingly privatized, solipsistic commercial culture of the late twentieth century. What was seductive in 1900--cornucopia and pleasure for the masses--became obscene by 2000, as a cultural stimulant turned into a dangerous narcotic.

An All-Consuming Century is one of the few indispensable works in the ever-expanding library of books on American consumer society. But in an otherwise rich overview the author has surprisingly little to say about the role of women, African-Americans and ethnic minorities (and nothing about regional variations) in the construction of consumer society. These are serious omissions. As admen and women's organizations recognized early on, women have performed the vast majority of the unpaid labor of consumer society: the shopping, budgeting and refashioning of older items. Cross notes that African-Americans were excluded from many of the benefits of the emerging mass culture, but he does not address the ways popular culture served to reinforce both the whiteness of the "new immigrants" from Eastern and Southern Europe--a skin privilege that was not yet fully acknowledged by the majority culture--and the otherness of Asian and Latino immigrants.

Nor does Cross discuss the attractions of nationwide retailers and national brands for African-Americans, who often took advantage of what the historian Edward Ayers has called the "anonymity and autonomy" made possible by the advent of the Sears catalogue (and chain stores in the nonsegregated North), whose mass customer base and "one price" system reduced the possibilities for racial discrimination that frequently accompanied visits to the corner store. For this group, the private pleasures occasionally afforded by the advent of national markets offered advantages over the public humiliations that so often accompanied local commerce.

Cross's relative neglect of women and minorities leads him to underestimate the importance of grassroots consumer activism as well, which has often been led by members of these groups. Meat boycotts, cost-of-living protests, "don't buy where you can't work" campaigns and sit-ins were integral to the development of American consumer society because they represented demands to expand the benefits of consumerism beyond a middle-class elite. One of the most important women's political organizations of the first half of the century, the National Consumers League, which pioneered the crusade for "ethical consumption" and labor rights, goes unmentioned. Cross stresses the ways marketers attempted to co-opt the civil rights movement, but he does not address the degree to which the demand for full participation in consumer society was a key ingredient in that crusade for social justice. By virtually ignoring these movements, Cross leaves out an important part of the story of consumer society--efforts to unite citizenship with consumption.

The critics of consumer society whom Cross discusses most often are proponents of what he calls the "jeremiad," the high-culture dismissal of mass culture as vulgar. He condemns the elitism and arrogance of such thinkers and is surely correct to note that their criticism had little impact on ordinary shoppers. Cross is less critical of the "simple living" tradition and calls the self-provisioning movement of the 1960s "the most positive aspect" of the counterculture. He argues that "the idea of the 'simple life,' perhaps never more than a daydream, had almost ceased being even a prick to the conscience," but he only briefly mentions the growing popularity of the "voluntary simplicity" movement, a topic addressed in more detail in Juliet Schor's The Overspent American (1998).

Cross also develops a persuasive critique of the consumer rights movement. While the Depression era saw the rise of groups like Consumers Union, which sought to make consumers a greater force against the power of business and advertisers, he notes that by focusing primarily on product quality and prices, many consumer rights groups have served only to reinforce "the individualism and the materialism of American consumption." This tradition of angry but apolitical individualism can still be found at innumerable websites, like starbucked.com, that highlight at great length the indignation of formerly loyal customers: "The sales clerk who sold me the machine was rude, then decidedly refused to hand over the free half pound of coffee given with every purchase of a Starbucks espresso machine...." The democratizing power of consumer demands for corporate responsibility is too often dissipated by such narrowly cast diatribes.

In spite of the failure of the jeremiad, the seeming irrelevance of simplicity and the individualization of the concept of consumer rights, Cross is too definitive about the nature of the "victory" of consumer society. Many Americans still recognize that however much advertisers and marketers attempt to cover it up, consumption is fundamentally a social and political act. So although it is true that "late twentieth century consumerism turned social problems into individual purchasing decisions," it is also the case that individual shopping decisions have frequently been viewed in the context of social problems. As consumer activists from the League of Women Shoppers in the 1930s through environmentalists today have pointed out, the goods that we buy leave ecological, labor and government "footprints." In spite of corporate attempts to fetishize goods, diligent activists like John C. Ryan and Alan Thein Durning of Northwest Environment Watch have described--and tried to estimate--the hidden social costs incurred by the purchase of quotidian products, including coffee and newspapers. The actions of students in the antisweatshop campaigns of recent years indicate that a growing number of consumers are looking behind the logo to determine the conditions under which the clothing they buy is made. As Naomi Klein has recently argued in No Logo: Taking Aim at the Brand Bullies, the ubiquity and importance of brands provide an opening for protesters who can threaten, through consumer boycotts and other actions, to sully corporate America's most valuable asset, the brand name. One teen in Klein's book puts it this way: "Nike, we made you. We can break you." Cross may decry the "inwardness of the personal computer," but the protests at the Seattle World Trade Organization and Washington International Monetary Fund meetings reveal that the Web creates alliances and expands social bonds. The history of consumer activism--and its recent incarnations--shows that consumerism does not necessarily lead to an antipolitics of radical individualism.

Cross does put forth important arguments about the "excesses of consumer culture": the environmental degradation, the waste, the lack of free time and the sheer mind-numbing meaninglessness that accompany modern consumerism. But these must be balanced with the recognition that most Americans, especially those in the working class, have viewed the enjoyment of the fruits of consumer society as an entitlement, not a defeat. This should not be dismissed as false consciousness or "embourgeoisement." Far from allowing consumerist demands to erode political impulses, working people--through living-wage, union-label and shorter-hour campaigns--have consistently politicized consumption. Rather than pitting the culture of consumption against democracy, it will be important to continue this tradition of democratizing, rather than demonizing, the culture of consumption. In his assessment of the twentieth century's most influential "ism," Cross provides important warnings about the difficulties of such an effort. But in its stress on the paradoxes of consumer society--an emphasis that then too rapidly gives way to condemnation--An All-Consuming Century also provides lessons from history about the necessity of the undertaking.

This election may jolt Americans out of a passive acceptance of civil mythologies.

For years many of us have called for a national conversation about what it means to be a multiracial democracy. We have enumerated the glaring flaws inherent in our winner-take-all form of voting, which has produced a steady decline in voter participation, underrepresentation of racial minorities in office, lack of meaningful competition and choice in most elections, and the general failure of politics to mobilize, inform and inspire half the eligible electorate. But nothing changed. Democracy was an asterisk in political debate, typically encompassed in a vague reference to "campaign finance reform." Enter Florida.

The fiasco there provides a rare opportunity to rethink and improve our voting practices in a way that reflects our professed desire to have "every vote count." This conversation has already begun, as several highly educated communities in Palm Beach experienced the same sense of systematic disfranchisement that beset the area's poorer and less-educated communities of color. "It felt like Birmingham last night," Mari Castellanos, a Latina activist in Miami, wrote in an e-mail describing a mammoth rally at the 14,000-member New Birth Baptist Church, a primarily African-American congregation in Miami. "The sanctuary was standing room only. So were the overflow rooms and the school hall, where congregants connected via large TV screens. The people sang and prayed and listened. Story after story was told of voters being turned away at the polls, of ballots being destroyed, of NAACP election literature being discarded at the main post office, of Spanish-speaking poll workers being sent to Creole precincts and vice-versa.... Union leaders, civil rights activists, Black elected officials, ministers, rabbis and an incredibly passionate and inspiring Marleine Bastien--president of the Haitian women's organization--spoke for two or three minutes each, reminding the assembly of the price their communities had paid for the right to vote and vowing not to be disfranchised ever again."

We must not let this once-in-a-generation moment pass without addressing the basic questions these impassioned citizens are raising: Who votes, how do they vote, whom do they vote for, how are their votes counted and what happens after the voting? These questions go to the very legitimacy of our democratic procedures, not just in Florida but nationwide--and the answers could lead to profound but eminently achievable reforms.

§ Who votes--and doesn't? In Florida, as in the rest of the nation, only about half of all adults vote. Even more disturbing, nonvoters are increasingly low-income, young and less educated. This trend persists despite the Voting Rights Act, which since 1970 has banned literacy tests nationwide as prerequisites for voting--a ban enacted by Congress and unanimously upheld by the Supreme Court.

We are a democracy that supposedly believes in universal suffrage, and yet the differential turnout between high-income and low-income voters is far greater than in Europe, where it ranges from 5 to 10 percent. More than two-thirds of people in America with incomes greater than $50,000 vote, compared with one-third of those with incomes under $10,000. Those convicted of a felony are permanently banned from voting in Florida and twelve other states. In Florida alone, this year more than 400,000 ex-felons, about half of them black, were denied the opportunity to vote. Canada, on the other hand, takes special steps to register former prisoners and bring them into full citizenship.

§ How do they vote? Florida now abounds with stories of long poll lines, confusing ballots and strict limitations on how long voters could spend in the voting booth. The shocking number of invalid ballots--more ballots were "spoiled" in the presidential race than were cast for "spoiler" Ralph Nader--is a direct result of antiquated voting mechanics that would shame any nation, let alone one of the world's oldest democracies. Even the better-educated older voters of Palm Beach found, to their surprise, how much they had in common with more frequently disfranchised populations. Given how many decisions voters are expected to make in less than five minutes in the polling booth, it is common sense that the polls should be open over a weekend, or at least for twenty-four hours, and that Election Day should be a national holiday. By highlighting our wretched record on voting practices, Florida raises the obvious question: Do we really want large voter participation?

§ Whom do they vote for? Obviously, Florida voters chose among Al Gore, George Bush and a handful of minor-party candidates who, given their status as unlikely to win, were generally ignored and at best chastised as spoilers. But as many voters are now realizing, in the presidential race they were voting not for the candidates whose name they selected (or attempted to select) but for "electors" to that opaque institution, the Electoral College. Our constitutional framers did some things well--chiefly dulling the edge of winner-take-all elections through institutions that demand coalition-building, compromise and recognition of certain minority voices--but the Electoral College was created on illegitimate grounds and has no place in a modern democracy.

As Yale law professor Akhil Reed Amar argues, the Electoral College was established as a device to boost the power of Southern states in the election of the President. The same "compromise" that gave Southern states more House members by counting slaves as three-fifths of a person for purposes of apportioning representation (while giving them none of the privileges of citizenship) gave those states Electoral College votes in proportion to their Congressional delegation. This hypocrisy enhanced the Southern states' Electoral College percentage, and as a result, Virginia slaveowners controlled the presidency for thirty-two of our first thirty-six years.

Its immoral origins notwithstanding, the Electoral College was soon justified as a deliberative body that would choose among several candidates and assure the voice of small geographic areas. But under the Electoral College, voters in small states have more than just a voice; indeed their say often exceeds that of voters in big states. In Wyoming one vote in the Electoral College corresponds to 71,000 voters; in Florida, one electoral vote corresponds to 238,000 voters. At minimum we should eliminate the extra bias that adding electors for each of two senators gives our smallest states. As Robert Naiman of the Center for Economic and Policy Research reports, allowing each state only as many electors as it has members in the House of Representatives would mean, for example, that even if Bush won Oregon and Florida, he would have 216 and Gore would have 220 electoral votes.

Today its backers still argue that the Electoral College is necessary to insure that small states are not ignored by the presidential candidates. Yet the many states--including small ones--that weren't close in this election were neglected by both campaigns. Some of the nation's biggest states, with the most people of color, saw very little presidential campaigning and get-out-the-vote activity. Given their lopsided results this year, we can expect California, Illinois, New York, Texas and nearly all Southern states to be shunned in the 2004 campaign.

§ How are their votes counted? The presidency rests on a handful of votes in Florida because allocation of electoral votes is winner-take-all--if Gore wins by ten votes out of 6 million, he will win 100 percent of the state's twenty-five electoral votes. The ballots cast for a losing candidate are always "invalid" for the purposes of representation; only those cast for the winner actually "count." Thus winner-take-all elections underrepresent the voice of the minority and exaggerate the power of one state's razor-thin majority. Winner-take-all is the great barrier to representation of political and racial minorities at both the federal and the state level. No blacks or Latinos serve in the US Senate or in any governor's mansion. Except for a handful of races in Vermont, third-party candidates did not win a single state legislative seat.

Given the national questioning of the Electoral College sparked by the anomalous gap between the popular vote and the college's vote in the presidential election, those committed to real representative democracy now have a chance to shine a spotlight on the glaring flaws and disfranchisement inherent in winner-take-all practices and to propose important reforms.

What we need are election rules that encourage voter turnout rather than suppress it. A system of proportional representation--which would allocate seats to parties based on their proportion of the total vote--would more fairly reflect intense feeling within the electorate, mobilize more people to participate and even encourage those who do participate to do so beyond just the single act of voting on Election Day. Most democracies around the world have some form of proportional voting and manage to engage a much greater percentage of their citizens in elections. Proportional representation in South Africa, for example, allows the white Afrikaner parties and the ANC to gain seats in the national legislature commensurate with the total number of votes cast for each party. Under this system, third parties are a plausible alternative. Moreover, to allow third parties to run presidential candidates without being "spoilers," some advocate instant-runoff elections in which voters would rank their choices for President. That way, even voters whose top choice loses the election could influence the race among the other candidates.
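To make the mechanics concrete, here is a minimal sketch of an instant-runoff count in Python. It is an illustration only: the candidate names and ballot totals below are invented, not drawn from any actual returns.

from collections import Counter

def instant_runoff(ballots):
    # Each ballot is a list of candidates, ordered by preference.
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:
                    tally[choice] += 1
                    break
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()):  # outright majority reached
            return leader
        # Otherwise eliminate the weakest candidate and recount.
        candidates.remove(min(tally, key=tally.get))

# A hypothetical electorate: five third-party voters rank a major-party
# candidate second, so their ballots transfer rather than being "wasted."
ballots = ([["Nader", "Gore", "Bush"]] * 5
           + [["Gore", "Bush"]] * 46
           + [["Bush", "Gore"]] * 49)
print(instant_runoff(ballots))  # prints: Gore

In this invented electorate the minor-party candidate is eliminated first and those ballots flow to their second choices--which is precisely how an instant runoff dissolves the "spoiler" dilemma described above.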

Winner-take-all elections, by contrast, encourage the two major parties to concentrate primarily on the "undecideds" and to take tens of millions of dollars of corporate and special-interest contributions to broadcast ads on the public airwaves appealing to the center of the political spectrum. Winner-take-all incentives discourage either of the two major parties from trying to learn, through organizing and door-knocking, how to mobilize the vast numbers of disengaged poor and working-class voters. Rather than develop a vision, they produce a product and fail to build political capacity from the ground up.

§ What happens after the voting? Our nation is more focused on elections now than it has been for decades; yet on any given Sunday, more people will watch professional football than voted this November. What democracy demands is a system of elections that enables minor parties to gain a voice in the legislature and encourages the development of local political organizations that educate and mobilize voters.

Between elections, grassroots organizations could play an important monitoring role now unfulfilled by the two major parties. If the Bush campaign is right that large numbers of ballots using the same butterfly format were thrown out in previous elections in Palm Beach, then something is wrong with more than the ballot. For those Democratic senior citizens in Palm Beach, it was not enough that their election supervisor was a Democrat. They needed a vibrant local organization that could have served as a watchdog, alerting voters and election officials that there were problems with the ballot. No one should inadvertently vote for two candidates; the same watchdog organizations should require ballot-counting machines like those in some states that notify the voter of such problems before he or she leaves the booth. Voters should be asked, as on the popular TV quiz show, "Is that your final answer?" And surely we cannot claim to be a functioning democracy when voters are turned away from the polls or denied assistance in violation of both state and federal law.

Before the lessons of Florida are forgotten, let us use this window of opportunity to forge a strong pro-democracy coalition to rally around "one vote, one value." The value of a vote depends on its being fairly counted but also on its counting toward the election of the person the voter chose as her representative. This can happen only if we recognize the excesses of winner-take-all voting and stop exaggerating the power of the winner by denying the loser any voice at all.

There's an easy way to take your own pulse, and that of anyone you know, concerning the vertiginous events of the night of November 7. Was the apparent non-outcome really a "mess" or a crisis? Or was the pre-existing system a sordid mess and a crisis waiting to happen? If you choose the second explanation, then the meltdown of all the fixers and self-appointed gatekeepers and pseudo-experts, as well as being a source of joy, is also an unparalleled opportunity, an occasion for a long-postponed national seminar on democracy and how to get it.

NATION NOTES

A proposed 14.2 percent postage increase for periodicals was swept aside by the Postal Rate Commission in a recommendation issued on November 13. The five-member presidentially appointed commission approved increases that average just under 10 percent. In our view that's just about 10 percent too much, given that the Postal Service is--the Internet notwithstanding--the circulatory system of our democracy. The Nation was among the witnesses cited in the commission's 1,000-page opinion who warned about the potentially destructive impact of the proposed rate hikes on journals of opinion. We were pleased that the commission recognized these magazines as a category worthy of separate consideration, but next time we hope to persuade them that it's as wrong to tax ideas through postal-rate increases as it was to tax tea in colonial times.
Katha Pollitt is not writing a column this week; she will be back in two weeks.

HEARING THE OTHER SIDE

At a time when Israeli opinion has hardened against peace efforts, 120 Palestinian academics and activists published an "urgent statement to the Israeli public" as a paid ad in Israeli newspapers on November 10. The statement called for "a final historic reconciliation that would enable our two peoples to live in peace, human dignity and neighborly relations." The signers argued that the Oslo accords have been used to camouflage expansion of settlements and the continuing expropriation of Palestinian land, and that freedom of movement for Palestinians has been severely curtailed while settler violence against Palestinian communities continues. Resolving current inequities within the framework of the Oslo agreements, with exclusive American "brokerage," was now impossible, they said. The statement declared four principles essential to a just peace agreement: ending the occupation of the territories captured in 1967; Palestinian sovereignty over East Jerusalem and recognition of the city as the capital of two states; Israel's acknowledgment of its responsibility in the creation of Palestinian refugees in 1948; and mutual respect for spiritual and historical sites.

The quiet grace of Ring Lardner Jr., who died the other week at 85, seemed at odds with these noisy, thumping times. I cannot imagine Ring playing Oprah or composing one of those terribly earnest essays, "writers on writing," that keep bubbling to the surface of the New York Times. He was rightly celebrated for personal and political courage but underestimated, it seems to me, as a protean writer who was incapable of composing an awkward sentence. It ran against Ring's nature to raise his voice. Lesser writers, who shouted, drew more acclaim, or anyway more attention.

The obituaries celebrated his two Academy Awards but made less of other achievements. Ring's novel, The Ecstasy of Owen Muir, begun in 1950 while he was serving his now-famous prison sentence for contempt of Congress, drew a transatlantic fan letter from Sean O'Casey. Ring felt sufficiently pleased to have the longhand note framed under glass, which he then slipped into a shirt drawer. He was not about advertisements for himself. In 1976 he published The Lardners: My Family Remembered. Garson Kanin commented, "In the American aristocracy of achievement, the Lardners are among the bluest of blue bloods. In Ring Lardner, Jr. they have found a chronicler worthy of his subject. The Lardners is a moving, comical, patriotic book."

The progenitor was, of course, Ring Lardner Sr., the great short-story writer, who sired four sons, each of whom wrote exceedingly well. James Lardner was killed during the Spanish Civil War; David died covering the siege of Aachen during World War II; a heart attack killed John in 1960, when he was 47. Add Ring's prison term to the necrology and you would not have what immediately looks to be the makings of a "moving, comical" book. But The Lardners was that and more because of Ring Jr.'s touch and slant and his overview of what E.E. Cummings called "this busy monster, manunkind."

From time to time, Ring published splendid essays. The one form he avoided was the short story. He wrote, "I did not want to undertake any enterprise that bore the risk of inviting comparison with my father or the appearance of trading on his reputation."

We became close in the days following the death of John Lardner, who was, quite simply, the best sports columnist I have read. I set about preparing a collection, The World of John Lardner, and Ring, my volunteer collaborator, found an unfinished serio-humorous "History of Drinking in America." He organized random pages with great skill. Reading them I learned that the favorite drink of the Continentals, shivering at Valley Forge, was a Pennsylvania rye called Old Monongahela. George Washington called it "stinking stuff." At headquarters the general sipped Madeira wine.

A year or so later, with the blacklist still raging, I picked up Ring for lunch at the Chateau Marmont, an unusual apartment hotel on Sunset Boulevard near Hollywood. Outside the building, a fifty-foot statue of a cowgirl, clad in boots and a bikini, rotated on the ball of one foot, advertising a Las Vegas hotel. I asked the room clerk for Mr. Robert Leonard. Ring was writing some forgotten movie, but could not then work under his own name. "Robert Leonard" matched the initials on his briefcase.

This was a pleasant November day, but the blinds above Ring's portable typewriter were drawn. When I asked why, he opened them. His desk sat facing the bikinied cowgirl, bust-high. Every eighteen seconds those giant breasts came spinning round. "Makes it hard to work," Ring said and closed the blinds.

The Saturday Evening Post was reinventing itself during the 1960s, on the way to dying quite a glorious death, and with my weighty title there, editor at large, I urged Clay Blair, who ran things, to solicit a piece from Ring about the blacklist. Ring responded with a touching, sometimes very funny story that he called "The Great American Brain Robbery." He explained, "With all these pseudonyms, I work as much as ever. But the producers now pay me about a tenth of what they did when I was allowed to write under my own name."

Clay Blair lived far right of center, but Ring's story conquered him, and he said, "Marvelous. Just one thing. He doesn't say whether he was a member of the Communist Party. Ask him to put that in the story."

"I won't do that, Clay."

"Why not?"

"He chose jail, rather than answer that question."

"Then, if he still won't, will he tell us why he won't?"

Ring composed a powerful passage.

The impulse to resist assaults on freedom of thought has motivated witnesses who could have answered no to the Communist question as well as many, like myself, whose factual response would have been yes. I was at that time a member of the Communist party, in whose ranks I found some of the most thoughtful, witty and generally stimulating men and women in Hollywood. I also encountered a number of bores and unstable characters.... My political activity had already begun to dwindle at the time [Congressman J. Parnell] Thomas popped the question, and his only effect on my affiliation was to prolong it until the case was finally lost. At that point I could and did terminate my membership without confusing the act, in my own or anyone else's head, with the quite distinct struggle for the right to embrace any belief or set of beliefs to which my mind and conscience directed me.

These words drove a silver stake into the black heart of the blacklist.

Ring won his first Oscar for Woman of the Year in 1942, and when he won his second, for M*A*S*H in 1970, numbers of his friends responded with cheering and tears of joy. The ceremony took place early in 1971, and Ring accepted the statuette with a brief speech. "At long last a pattern has been established in my life. At the end of every twenty-eight years I get one of these. So I will see you all again in 1999."

Indeed. Early in the 1990s I lobbied a producer who had bought film rights to my book The Boys of Summer to engage Ring for the screenplay. Ring, close to 80, worked tirelessly. A screenplay is a fictive work, and Ring moved a few days and episodes about for dramatic purposes. His scenario ended with the Brooklyn Dodgers winning the 1955 World Series from the Yankees and my account of that ballgame landing my byline on the front page of the New York Herald Tribune. The sports editor is congratulating me on a coherent piece when the telephone rings: My father has fallen dead on a street in Brooklyn; I am to proceed to Kings County Hospital and identify his body.

As I, or the character bearing my name, move toward the morgue, I bump into two beer-drunk Dodgers fans. One says, "What's the matter with him?" The other says, "He's sober. That's the matter with him." The body is there. It is my father's body. Beer drunks behind us, my mother and I embrace. Fin.

I can only begin to suggest all that Ring's scene implies. I would start with the point that winning the World Series is not the most important thing on earth, or even in Brooklyn. I was always careful not to embarrass Ring with praise, but here I blurted out, "This is the best bleeping screenplay I've ever read, Ringgold. Oscar III may come true in '99."

"Curious," Ring said. "I seem to have had the same thought myself."

The blacklisting bounders were now dead, but a new generation of Hollywood hounds refused to shoot Ring Lardner's scenario. The grounds: "a father-son angle" was not commercial. "It worked in Hamlet," Ring said, but to unhearing ears. And then we were talking about Ring writing a screenplay for a book I published in 1999 about Jack Dempsey and the Roaring Twenties. "Have to cut it back a bit," Ring said. "Following your text would give us the first billion-dollar picture."

Years ago, the critic Clifton Fadiman wrote that Ring Lardner Sr. was an unconscious artist and that his power proceeded from his hatred of the characters he created. Ring told me: "If my father hated anyone or anything, it was a critic like Fadiman. Unconscious artist? My father knew perfectly well how good he was and--better than anyone else--how hard it was to be that good."

Ring Jr. knew the very same thing about himself. Or so I believe. Yeats writes, "The intellect of man is forced to choose/perfection of the life, or of the work." As well as anyone in our time, my suddenly late friend Ring Lardner came pretty damn close to achieving perfection in both.

Long before Carrie-Anne Moss rips open Val Kilmer's shirt and begins pounding his chest, providing him with a version of CPR that she must have learned from a Japanese drum troupe, the makers of Red Planet have resorted to their own thumpings and flailings, as if to resuscitate a film that's gone limp. It's a panic response, coming from people who have realized too late that the hookup of a radio would be a high point of their picture.

Their script has stuck Moss in a stricken spaceship that's orbiting Mars; by this point, her comrades Kilmer and Tom Sizemore have been marooned, incommunicado, on the planet's surface. So when the boys stumble upon an old circuit board in the dust, it's time for high-energy drama. "Let's do it!" shrieks Sizemore, as if he were starting the Indy 500. With a roar, guitars and drums begin pounding away on the soundtrack. Kilmer, in closeup, damn well solders a wire, sending a meteor shower's worth of sparks across the screen--at which point, back on the spaceship, Moss decides to strip down to a sleeveless T-shirt, giving us a much better view of her breasts.

I'm really grateful for the breasts. If not for them, I might have fallen asleep and missed the climactic scene, in which Kilmer performs a diagnostic check on a computer.

If only the makers of Red Planet had trusted in their story's essential schleppiness! Then, instead of giving us this lumbering, expensive beast, they might have realized the small but halfway-clever idea that's still dimly visible within: a story about the heroism-by-default of a spaceship janitor.

The character in question, a fellow named Gallagher, holds the job title of mechanical systems engineer; but to the rest of the personnel on this flight to Mars, that's like saying he's the guy who fixes the toilets. "It's high school," he remarks to a fellow civilian in the crew, after being brushed back by a swaggering NASA pilot. "They're the jocks, and we're the nerds." Just so. When he bumps into Moss--the ship's commander--on her way out of the unisex shower, Gallagher can think of nothing better to do than fumble with his fingers and blush. Later, when the outcome of the mission comes to rest on him, Moss has to give him a pep talk before he'll even get to his feet. Yet he's the guy who must save Earth from destruction and consummate a rendezvous with those breasts. What a role for Steve Buscemi! How the hell did it go to Val Kilmer?

He's good, of course. Kilmer is always good--but he's a guy who previously played Jim Morrison, Elvis and Batman. The only thing that's nerdlike about him is the hairdo he's been given for this picture, which is brushy and yellow and makes him look as if he's in crying need of a conditioner. Mind you, the premise of Red Planet is that all of Earth needs a conditioner. After these many years of environmental degradation, we've burned out our world and must colonize someplace else. Hence the desperate and very expensive project, in the year 2057, of sending Moss and her crew to Mars. Wouldn't it have been cheaper, as well as more practical, to institute a few conservation measures instead? No doubt. But humans, according to this movie, lack much capacity for self-discipline and forethought, and so must splurge on stupid but spectacular stunts. As if to prove this point, the producers have done their own splurging and hired Kilmer--the actorly equivalent of a rocket to Mars, compared with Buscemi's compost heap.

As they cast the lead, so too did they decide to ladle on the excitement: pounding guitars, sleeveless T-shirts, unmotivated shrieks. How were these choices made? I can venture a guess. The credits for Red Planet list three producers and two executive producers. This is a fairly standard aggregation in today's movie business; and with so many big shots keeping themselves busy on the picture, how could a mere idea survive? The story, written by a lone guy named Chuck Pfarrer, was almost sure to be buried alive; and into the dirt with it went a few other notions.

One of them might have involved some sexual role-play, based on the fact that the only females in the story are Moss, the shipboard computer (named Lucille) and a navigation robot called Amee. "She's my kind of girl," Gallagher says of the robot, just before it goes into killer mode. (It was designed for the Marines.) Somebody, maybe Pfarrer, seems to have wanted the nerdy Gallagher to feel ambivalent toward strong women: attracted to them when they shower, threatened by them when they turn into whirring kung-fu machines.

But since the production is at war with its own screenplay--have I mentioned that Red Planet is directed, more or less, by Antony Hoffman?--this kinky little idea is no better realized than the movie's religiosity. As far as I'm concerned, it's just as well that this latter theme gets only lip service. Ever since 2001: A Space Odyssey, Earthlings in Outer Space have sought God, and found light shows. At least Red Planet spares us that final cliché--though it still makes us listen to a lot of spiritual blather.

Those Deep Thoughts are provided by Terence Stamp, who manages to be the crew's world-famous scientist despite having abandoned rationalism. Science cannot provide the answers he craves, Stamp explains to a sweetly patient Kilmer, and so he has turned to religion. Kilmer obligingly spends the rest of the picture looking for a divine purpose--which doesn't seem so misguided, considering the level of scientific expertise around him. When the crew's biologist (Sizemore) discovers a life form on Mars, he cries out, "Nematodes!" Either he's forgotten his Linnaeus--nematodes are worms--or else the solution to God's mysteries is to be found not in Outer Space but in the pages of old sci-fi magazines. These creatures are clearly arthropods: the genre's usual bugs.

Fans of the platoon-in-space movie will want to know that the Mars scenery is furnished with the necessary rocks, peaks and ravines. Fans of Carrie-Anne Moss--meaning the adolescent boys, of whatever age, who admired The Matrix--will want to know that here, too, she gets to fly around. Not every actress is suited to antigravity; and so, until such time as Moss gets the chance to deliver a performance, I will congratulate her on giving good float.

As rain dances used to serve certain primitive tribes and scripture still serves true believers, the two-party system serves as the religion of the political class. Never mind that more than 50 percent of Americans may not share the civic religion, answering yes to pollsters when asked if they would prefer more than two choices (and that includes many regular voters as well as the bulk of habitual nonvoters). Nevertheless, every new party that has ever tried to establish itself has been treated by the political priesthood as a blasphemer--an evil force that inevitably contributes to the disastrous victory of the more detested of the two major candidates. Perot elected Clinton. Nader elects Bush.

The real culprit in the current election imbroglio is the two-party system itself and the state laws supporting it. These laws exist to discourage new parties. Florida has come in for special attention because of the current crisis, but Florida is typical among states. The beautiful irony is that the laws written to discourage third parties have proved to be a double-edged sword, cutting for the moment against those responsible for the existence of those laws.

Consider first how the laws work against all new parties. It is not Providence that takes an energetic social movement and crushes it as soon as it chooses to advance its goals through elections. It is the laws of the state here on earth that keep the party system on life support by preferring two parties above all others. The key example will be found in the laws of the states and Congress that mandate the single-member district system of representation plus the plurality or first-past-the-post method of election. Another historic example is provided by the "antifusion" laws in all but a half-dozen states, which prohibit joint nomination, whereby a third party seeks to nominate for its ticket the candidate already nominated by one of the major parties. Even the Supreme Court has approved such laws with the argument that having the same name in two places on the ballot would confuse the poor, defenseless voters.

Add to all this the new gerrymandering. Traditional gerrymandering was at least a genuine struggle between the majority parties to dilute the vote power of the other party by concentrating a maximum of their voters into a minimum of districts. The new method takes advantage of the Voting Rights Act by benign race-conscious gerrymandering in order to keep minorities within one of the major parties. In practice, blacks are guaranteed one or more additional Congressional or state legislature seats within the Democratic Party, while Republicans gain strength in districts from which the minority voters are evacuated.

Then there are the countless state laws that prescribe higher thresholds for the number of correct signatures required on third-party nominating petitions than for regulars on two-party ballots. Even the laws that apply equally to all parties are discriminatory, because they are written in such detail that ballot access for third-party candidates requires expensive legal assistance just to get through the morass of procedures. That mind-numbing detail is doubly discriminatory because the implementation of these laws thrusts tremendous discretion into the hands of the registrars, commissioners and election boards, all staffed by political careeristas of the two major parties, whose bipartisan presence is supposed to provide "neutrality with finality"--but it is common knowledge that they can agree with each other to manipulate the laws for the purpose of discouraging the candidacies of smaller and newer parties.

The same principles help explain why less than 50 percent of the electorate turns out to vote. Most of the blame goes to the forbidding proceduralism of registration, enrollment and eligibility and the discretionary power of local and county officials in implementation. And don't forget the gruesome timing of state election laws that restrict voting to one ordinary workday. The duopoly has a stake in low turnout. Virtually all expansion of the electorate (to include women, 18-year-olds, blacks) and the easing of restrictions on registration (judicial enforcement of the "motor voter" law) have been imposed on the state two-party systems from the outside by national social movements and federal courts.

Now, as poetic justice would have it, this legal structure is cutting the other way. Just look at the havoc it has wreaked: Loused-up ballots. Machine versus manual recounts. A lawyers' field day and the threat of court intervention that could cause a constitutional crisis or take Florida out of the electoral vote altogether. The Florida crunch can happen in any state where the results are extremely close and the outcome can change the national results.

That's because the two constituted parties cooperate well as a duopoly so long as market share is stable, with decisive election results. But whenever there is an extremely close election, the two parties become vicious antagonists, and the high stakes make it profitable for each to use its control of the electoral machinery as a weapon of mass destruction against the other. No war is more destructive than a civil war, and ordinarily the two parties have incentives to keep civil war from happening. Civil war in 2000 has broken out because two-party competition has turned from a public good to a public evil. The two-party system has at the moment become a menace to the Republic, made worse by the overwhelming weakness of the parties' presidential candidates and the impossibility of choosing between them when the only way to vote no for the candidate you hate is to vote yes for the one you can barely tolerate. And forget about having a good option when you hate both equally.

With Nader in the race, a lot of things got said that otherwise wouldn't have--no matter that the leading candidates excommunicated him. Making issues out of nonissues is what third parties are about, but those issues obviously did not create the stalemate we now confront. Stalemate is putting the case too mildly; mutual assassination is more like it. The crisis will not end with a certified recount in Florida. The civil war will continue, and the two parties will give us competition literally with a vengeance. Forget about smooth transitions. The FBI won't be ready with its security checks of top appointees, and the Senate will look at them with far greater than average scrutiny, even if the President's party is in the majority, because the Senate is run by sixty antifilibuster votes, not by mere majorities. That will apply in spades to judicial vacancies. Get ready for a Supreme Court of eight, seven, even six members, because as the vacancies occur, there'll be a majority against any nominee, even ones as mushy and fuzzy as President Bush or Gore will nominate. (The Constitution does not require any particular number of Justices on the Supreme Court.)

No exit? We have to turn the civic religion on its head and lionize the principle of a multiparty system, because its presence on a regular and expanded basis would relieve the two major parties of the need to be all things to everyone in order to get their phony majorities. We don't do that by inviting third parties to join the major parties on legal life support--as government-sponsored agencies. We do it by deregulating our politics. Hey, guys, deregulation. If you really meant it all these years, you Republicans and you Democrats, then be honest and deregulate yourselves. Take away the two-party safety net, by legislation and better yet by judicial review, and the democratic revolution can begin.

THIS YEAR IN CINCINNATI After anti-World Trade Organization activists filled the streets of Seattle last November, one of the prime movers in corporate globalization schemes offered a curt dismissal of the protests. "I don't believe those [activists] in Seattle represented somebody with a legitimate stake," said Peter Sutherland, former co-chairman of the Trans-Atlantic Business Dialogue. A privately funded vehicle through which CEOs from corporations such as Boeing, IBM, Time Warner, Procter & Gamble and Ford influence trade policies set by governments in Europe and the United States as well as the WTO, the TABD has been in the forefront of efforts to eliminate barriers to trade based on human rights, labor rights and environmental concerns. The group is a big proponent of "harmonization" schemes, which force countries with tough regulations to water those standards down to parallel the rules of weak regulators. Such is its influence that US Deputy Under Secretary of Commerce Timothy Hauser has said, "Virtually every market-opening move undertaken by the U.S. in the last couple of years has been suggested by the TABD." But when the group meets in Cincinnati November 16-18 to "enhance economic globalization," activists from around the world plan to remind Sutherland and the top business and government officials expected to attend that the people in the streets do indeed have a stake in trade decisions. Public Citizen's Global Trade Watch, a prime mover in the Seattle protests, has been working with local labor unions and church groups to organize a November 16 teach-in on the group. The local Coalition for a Humane Economy is planning a major protest on the 17th, with support from the AFL-CIO, the Sierra Club and the Farm Labor Organizing Committee. And the Cincinnati Direct Action Collective is pulling together puppeteers, dance troupes and affinity groups for street theater, including a parade of corporate pig puppets representing visiting CEOs. "The TABD is used to meeting in the dark," says Global Trade Watch's Mike Dolan. "They aren't going to know what hit them."

RESPONSIBLE WEALTH Singer/actor Cher isn't interested in George W. Bush's proposed tax cut. "I'm one of the 1 percent that George W. Bush wants to give the money to. What I say is: 'Keep it, W.--it's just not worth it,'" said Cher, one of a number of wealthy celebrities who publicized their views during the presidential campaign. Sharing that view was Chicago Hope actress Christine Lahti, who declared, "Education needs it. I don't need it." Those messages delighted the three-year-old group Responsible Wealth, a national network of business executives, investors and affluent Americans who actively oppose public policies that favor the rich over working Americans. Members of the group were present at the White House when President Clinton vetoed a Republican-sponsored repeal of the estate tax. Said Responsible Wealth co-director Mike Lapham, an heir to an upstate New York paper-mill fortune, "Something is wrong when we can't fully fund Head Start, but we can give descendants of the wealthy an added head start."

EXIT INTERVIEW In an effort to spike turnout in traditionally Democratic constituencies, President Clinton made Election Day phone calls to radio stations in urban areas across the country. Most of the calls were perfunctory get-out-the-vote appeals. But that wasn't the case at Pacifica Radio's WBAI in New York, where Clinton found himself on the line with Amy Goodman, the host of Pacifica's Democracy Now! news show, and Gonzalo Aburto, host of WBAI's La Nueva Alternativa. In a freewheeling thirty-minute interview, Goodman and Aburto peppered Clinton with questions about the morality of his support for the death penalty, sanctions against Iraq and executive clemency for Native American activist Leonard Peltier. In his first public comment on the Peltier matter, Clinton said he was reviewing requests that he grant executive clemency to Peltier, who is serving a life sentence for murder. Clinton also said that "there's just not a shred of evidence" that the two parties are bought by corporate campaign contributions, and he rejected criticism of his Administration's free-trade policies, saying, "Two-thirds of the American people support [NAFTA]."

BANK BOYCOTT The international campaign to force reform of the World Bank by getting public agencies to boycott the international lending agency's bonds won a major boost when the San Francisco Board of Supervisors voted unanimously this fall to stop buying the bonds. The movement has also won endorsements from unions (the Communications Workers of America and the United Electrical, Radio and Machine Workers of America), religious groups and human rights groups. Dennis Brutus, the veteran antiapartheid activist, has taken a lead in linking the bond boycott to the divestment movement of the 1980s. "We need to break the power of the World Bank over developing countries, as the divestment movement helped break the power of the apartheid regime over South Africa," he says.

John Nichols's e-mail address is jnichols@thenation.com.

On Tuesday, November 14, exactly one week after Election Day (and with no President yet in sight), a notable though little-noted disclosure was made to the public. I do not mean the news that the federal judge in Florida had turned down the Republicans' stop-the-hand-count motion, or the news that Bush's lead in Florida was now 388 votes, or the news that a Florida state judge had waffled on Florida Secretary of State Katherine Harris's decree that no county votes would be counted if reported after the 5 pm deadline that afternoon, or, for that matter, anything else that was happening in the murk of the Sunshine State. I mean the news that, according to a poll released by the Washington Post and ABC News, 45 percent of the public wanted George Bush to become President whereas only 44 percent wanted Al Gore to become President (6 percent wanted "neither," 4 percent had no opinion and 1 percent wanted "other"). The claim was all the more striking in view of the hard contemporaneous fact that in the most recent count of the actual vote of November 7, Gore led Bush by a nationwide margin of 222,880 votes.

If anyone ever had doubts that politics in the United States is dominated by polling, this poll should put an end to them. A major poll was, in a manner of speaking, calling the election a full week after the vote--and reversing the known results.

The polls had been mercifully silent since the election. Many had good reason to be. Five of seven major ones had been "wrong" about the outcome of the election. That is, their final counts had failed to reflect the winner on Election Day (though some, it's true, were within the margin of error). The New York Times/CBS "final" poll, which put Bush at 46 percent and Gore at 41 percent, had the margin wrong by more than five points and Gore's final tally off by eight points. The Battleground poll, which gave Bush 50 percent to Gore's 45 percent, likewise got the margin wrong by five points. Others were more modestly in error. CNN gave Bush 48 percent and Gore 46 percent; in the Washington Post it was Bush 48 and Gore 45; and in the Pew Research Center poll (with undecided voters counted), it was Bush 49, Gore 47. Only the Zogby poll, which put Gore ahead in the popular vote by 48 to 46 percent, and a CBS election-morning tracking poll, which gave Gore 45 percent and Bush 44 percent, picked the right winner in the popular vote, and with a margin close to the actual result. All in all, Gore's victory in the popular vote came as a surprise. Of course, it's not literally true that the polls were wrong, since there is a margin of error, and people can change their minds between the day of the poll and the election. On the other hand, election results are the only check on the accuracy of polling that there is--they are to polling what experimentation is to scientific hypothesis--and there is no reason to suppose that a poll whose final measure is 8 percentage points off the election result is not 8 percentage points off year in, year out.

Considering the decisive importance that polling had throughout the race in every aspect of the campaign, including media coverage, fundraising and campaign strategy (in the last few weeks of the election, hearts were lifting and falling on single-point fluctuations in poll numbers), these discrepancies deserved much reflection. The reason they did not get it was that on election night the magicians of public opinion went on to make even more egregious and momentous errors, by prematurely predicting the winner in Florida twice and the winner of the national election once. (The election-night calls made by the television networks, which in turn are based on exit polling done by a single, nearly anonymous firm, the Voter News Service, are not quite the same as opinion polling, since they record a deed--voting--rather than an opinion, but their use of sampling techniques to predict outcomes places them in the same general category as other polls.)

The last of these mistakes, of course, led a credulous Gore to concede the election and then, minutes later, to retract the concession. For a few hours, the networks and the candidates appeared to have assumed the power to decide the election between them. There is every reason to believe, for instance, that George Bush would now be President-elect if, moments before his concession speech, Gore had not got the news that Florida had been declared undecided again. If Gore's concession had gone unretracted, Bush had made his acceptance speech and the country had gone to bed believing it had made its decision, it is scarcely imaginable that the close results in Florida would have been contested. Even now, many observers await a concession by one or another of the candidates as the decisive event. But it is not up to either the networks or the candidates to decide who is to be President; that matter is left under the Constitution to the voters, whose will, no matter how narrowly expressed, must be ascertained.

Then a week later, the polls that had played such an important and misleading role in the election were weighing in again, this time on the Florida battle. The poll that brought the startling, seemingly counterfactual news that Bush led Gore in the public's preference also revealed that six out of ten voters were opposed to legal challenges to the Florida results--possibly bad news for Gore, who had been considering a legal challenge to the infamous butterfly ballot in Palm Beach County. However, observers who did not like that conclusion could find comfort on the same day in a New York Times/CBS poll, which reported that another six in ten were unworried about a delay in finally deciding upon the next President--good news for Gore, who had been relying on time-consuming hand recounts to erase Bush's narrow lead.

If, however, the arts of reading public opinion helped get us into our current mess, perhaps we can take comfort from the hope that they can also help us get out of it. Many observers have suggested that by failing to produce a clear mandate, the ever-changing vote-count of the year 2000--let's call it the Butterfly Election--will cripple the presidency of the winner. They need not worry too much. In our day, it is not only--perhaps not even mainly--elections that create mandates, once every four years. It is polling data that, day in and day out, create our impressions, however incompletely or inaccurately, of what the public wants. Let the new President act in a way that the public approves, as determined by a poll or two, and he will have all the mandate he needs to govern.

While Bush says counting votes by hand's unfair,
Gore gains--a bubbe here, a zayde there.

When George W. Bush spokesman James A. Baker III termed the fight over the Florida vote recount "a black mark on our democracy," he couldn't have been more wrong. At the time he said it on Sunday, Bush was ahead in Florida by a mere 288 votes, and of course the full recount, required by Florida law, is in order, as a federal judge ruled Monday.

Anyway, since when are political tumult and democracy a bad mix? Never in our recent history has the vitality of our democracy been on such splendid display, and it's disheartening that so many frightened politicians and pundits are panicked by this whiff of controversy.

What's wrong with a bit of electoral chaos and rancor? The post-electoral debate over a rare photo finish is just the stuff that made this country great. People should be outraged if their votes were improperly counted--the founding fathers fought duels over less.

We have lectured the world about the importance of fair elections, and we cannot get away with hiding the imperfections of our own system. Not so imperfect as to require international observers for a full-scale investigation under UN supervision, yet controversial enough to fully engage the public. An election that once threatened to be boring beyond belief has turned into a cliffhanger that is now more interesting than reality-based TV entertainment. Indeed, it is reality-based TV entertainment.

Never since John F. Kennedy eked out a suspicious victory over Richard M. Nixon in 1960 has the proverbial man in the street been so caught up in the nuances of the electoral process. People who didn't even realize we had an electoral college are now experts on it. But instead of celebrating an election that people are finally excited about, driving home the lesson for this and future generations that every vote counts, the pundits are beside themselves with despair.

What hypocrites. They love every moment of increased media exposure for themselves while darkly warning of the danger to our system. Their fears are nonsense. What is being demonstrated is that the system works: Recounts, court challenges and partisan differences are a healthy response to an election too close to call.

The fear-mongers hold out two depressing scenarios: one, that the people will lose faith in the electoral process; the other, that whoever wins the election will be weakened for lack of a mandate.

As to the former, the electoral process has never seemed more vital; some who voted for Ralph Nader may be second-guessing their choices, and states such as Florida and Oregon with primitive voting systems will no doubt come into the modern age, but apathy has been routed, and next time around, the presidential vote count will be the highest ever.

True, the candidate who finally wins will be weakened. He should be. An election this close hardly provides the winner with a compelling mandate, particularly if it is Bush, who may win the electoral college majority while Al Gore is declared the winner of the popular vote. If that turns out to be the case, Bush ought to tread with caution.

Compromise is good when not only the President is without a mandate but so, too, are the House and the Senate, because of their razor-thin outcomes. The country has come through eight incredibly prosperous and relatively peaceful years, so why the rush to march down some new uncharted course? Later for privatizing Social Security, a huge tax cut for the super-rich and a $160-billion missile defense system--three mad components of the core Republican program.

As for the Democrats, with or without Gore as President, it will be the season for nothing more ambitious than damage control. With Gore, the main weapon of reason would be not bold new programs that Congress would ignore but rather the threat of a veto to stop Republican mischief. Without Gore, the responsibility will fall on the Democratic minority in both houses of Congress to engage in a principled holding action, preparing for a congressional majority in 2002.

Odds are that Bush will be the President presiding over a nation that, by a clear margin in the popular vote, rejected him for Gore. If Bush wins the office, his challenge will be to prove that the moderate face he presented during the election is truly his. If it isn't, and he attempts to be a hero to the right wing of his party, he will wreck the GOP. Clearly, future political power resides with the vibrant big cities and modern suburbs, the sophisticated hot spots of the new economy, which went for Gore, and not the backwater rural outposts that turned out to be Bush country largely because men remain obsessed with their guns.


Malthusian Delusions?

Franklin, N.Y.

Amartya Sen starts his otherwise sensible "Population and Gender Equity" [July 24/31] with the unproven assertion that Thomas Malthus was wrong when he wrote that population growth would soon outstrip growth in food production. What Malthus didn't know is that the age of cheap and abundant fossil fuels was at hand and would, for a geologically brief 200 years, delay the fulfillment of his gloomy prediction. Now those fuels are running out while population has grown to numbers Malthus probably could not have conceived of.

Modern agriculture has been described as a means of turning petroleum into food, but sometime--very likely this decade--the world will reach peak oil production. Then all will change. Fertilizers made from petroleum and natural gas will be very expensive and then unavailable; transportation and the operation of farm machinery will be hugely expensive or impossible. The idea that alternative fuels and solar and wind power will make up the deficit is, so far, a fantasy, and the level of investment in such alternatives remains paltry. In a two-century orgy of consumption, we have burned up the solar energy that was for hundreds of millions of years stored under the earth's surface. Our oil-based civilization is about to come crashing to an end, and we have very little time to prepare to deal with the consequences. Not surprisingly, the oil companies don't acknowledge the problem. Even more disturbing, none of our politicians want to be the bearer of bad news.

One is loath to lump a man of Sen's decency and humanity in with the economic cultists who believe that "the market" will take care of the problem. Nevertheless, it is the bizarre beliefs of economists, including the notion that the world runs on investment rather than energy, that will probably result in Malthus being proved an accurate observer.

EUGENE MARNER


Oakland, Calif.

Amartya Sen dismisses concerns about the global food supply as it relates to burgeoning population for two reasons: Food production has expanded and the price of food continues to fall. But we should not be lulled into thinking the world's food supply is sustainable or secure. That's because at least one key part of food production is not reflected in current prices: water. In coming decades, increasing scarcity of water will make itself felt in prices and supply. Visions for a sustainable future must balance population density and growth with current and future water prices and availability. Food--grain, produce and livestock alike--represents a huge investment of water. Producing a ton of beef can require up to 70,000 tons of water and a ton of grain up to 3,000 tons. Agriculture is a thirsty enterprise. As water becomes scarcer (as is already happening in the Central Valley of California, which, like many regions, relies on artificial water supplies) and soils become more salinized, food production will not hold at current levels. The earth's hydrological cycles have been mined to increase food production. This is a historical anomaly, not a sustainable trend.

MICHELE GALE-SINEX
Redefining Progress


College Park, Md.

In Amartya Sen's article on gender equity, structures of patriarchy and capitalism are nowhere visible. Problems are reduced to a series of variables--economic, cultural and political "handicaps" that are to be overcome. Sen's basic point that literacy and schooling bring a decline in fertility is simply a correlational, not causal, part of the tired argument that investment in human capital will yield wealth and progress. Historically, fertility rates did not decline because of education but because wealth made having large families unnecessary for survival. Unusual low-income, low-fertility-rate stories, like China and Kerala, are due not to education (Sen discounts the effects of China's "one child" policy) but to the fact that both have departed from traditional capitalist and patriarchal structures.

The most important issue Sen raises is how employment opportunities for women (and men) are essential to achieving greater gender equity. But Sen, like any mainstream economist, lacks understanding of structures of inequality and oppression. He believes that fostering the education of girls and women, promoting access to microcredit for rural women and fighting discrimination in urban labor markets are the policies that will improve employment and equity. To the contrary, the creation of sustainable, decent livelihoods for the 2 billion women, men and children living on the global margin will not come from better policies within structures rooted in poverty and inequality. "Reversing the...handicaps that make women voiceless and powerless" and "bringing gender equity and women's empowerment to the center of the stage," as Sen wishes to do, do not depend on a "unified framework of understanding" based on the results of "empirical and statistical research" but on a political struggle for economic rights and societal transformation.

STEVEN J. KLEES


SEN REPLIES

London

Eugene Marner is right to express worry about the growth of world population, even though the source of this worry cannot really be the alleged accuracy of Malthus (I shall return to Malthus after discussing the general problem). The exhaustion of fossil fuel is certainly one source of concern (to which Marner rightly draws attention), as is the growing difficulty in guaranteeing adequate water supply (to which Michele Gale-Sinex devotes her letter). Even though each of them has chosen a singular focus of attention (petroleum and water, respectively), problems generated by excessive population growth can arise in many other ways as well, varying from the depletion of the ozone layer to overcrowding in a limited habitat (as I discussed in my essay).

The point of departure in my essay was the particular relation between (1) high fertility rates and (2) the low decisional power--indeed subjugation--of women. The critical linkage is that "the most immediate adversity caused by a high rate of population growth lies in the loss of freedom that women suffer when they are shackled by persistent bearing and rearing of children." This connection is important in itself because of its relevance to the well-being and freedom of women (and derivatively of men as well). Furthermore, since the interests of young women are so closely involved, it would also be natural to expect that anything that increases the voice and power of young women in family decisions will tend to have the effect of sharply decreasing fertility rates (and through that, reducing the environmental adversities associated with population explosion). This expected connection has received very substantial statistical confirmation in intercountry comparisons around the world as well as in interstate and interdistrict correspondences within India (as I indicated in my essay).

That was the reason for my conclusion that women's empowerment and agency (through such factors as their education and economic independence) are central to an effective resolution of the so-called population problem, including its environmental consequences. These connections, which draw on a firm interpretive framework, cannot be dismissed as "simply correlational," as Steven Klees does in his letter. Empirical work is inescapably dependent on statistical investigation. Causal connections, which demand interpretation, have to be assessed on the basis of statistical findings, not independently of them. This combination of interpretive scrutiny and statistical assessment gives causal plausibility to the empirical association between fertility decline and women's empowerment (reflected by such enabling factors as female literacy, women's gainful employment and access to microcredit, land and other resources, and public debates and political discussions on gender equity).

Klees argues that as a "mainstream economist," I cannot have any "understanding of structures of inequality and oppression." If correct, this would be very sad for me, since--mainstream or not--I have devoted a very big part of my life precisely to investigating inequality and oppression, including studying, at close quarters, their manifestations in such phenomena as famines and starvation, class- and gender-related atrocities, and military and police brutalities. I accept the possibility that Klees has been able to acquire (from his vantage point in College Park, Maryland) a direct understanding of these issues which I have failed to achieve. However, since Klees does not refer to any empirical work whatsoever, it would have been very nice to have been told a little about how he has accomplished this understanding. Indeed, despite his fleeting invocation of "patriarchy" (along with "capitalism"), Klees dismisses the relevance of the indicators of women's empowerment that well-researched empirical studies in feminist economics as well as demography have established as important (on which my essay drew).

I come, finally, to Eugene Marner on Malthus. Marner disputes what he describes as my "unproven assertion that Thomas Malthus was wrong when he wrote that population growth would soon outstrip growth in food production." Since exactly the opposite of what Malthus predicted has occurred and continues, why is the recording of the nonfulfillment of Malthus's prediction "unproven"? Is a period of 200 years not time enough to check a prediction? But we must not dismiss Marner's reasoned worries about the future, since the exhaustion of petroleum is an important issue. However, Marner surely oversimplifies with his "turning petroleum into food." There are a great many different factors (such as new seeds, better cultivation techniques, etc.) that have contributed to the sharp rise in food production per capita in the world, which has occurred since Malthus's gloomy predictions were made and which has continued to occur through the most recent decades. Nevertheless, given the difficulties that are visible now (including petroleum and water problems) and new adversities that might well arise, we do have good reason to consider ways and means of raising agricultural productivity as well as reducing fertility rates (as I discussed in my essay).

Where Malthus is particularly counterproductive is in his dismissal of informed reproductive choice and of the effectiveness of women's conscious agency as ways of reducing fertility rates. Malthus took penury to be the only sure way of keeping fertility rates down (he did not revise his view on this particular subject, despite rethinking on some other issues) and even argued for suppressing the Poor Laws and the very modest arrangements for social safety nets and economic security that existed for the poor at his time. It would be unfortunate to rely on Malthus's harsh and dogmatic pronouncements for our understanding of the population problem.

AMARTYA SEN

Providence put me on a panel debating the Gore/Nader choice with Cornel West at New York University in late October. Most of the audience was for Nader, and the lineup on stage did nothing to improve those odds.

Before the debate began, its organizers took a few moments to speak on behalf of the university's graduate students' struggle for unionization. So did West, who had been handed a flier about it from the floor. And as a man about to lose a debate (and a longtime grad student as well as an occasional NYU adjunct faculty member), I was happy for the interruption. Days later, the National Labor Relations Board set an important precedent by ruling in favor of the students. But here's what I don't understand. How can the student union supporters also be Nader supporters? Nonsensical "Tweedledee/Tweedledum" assertions to the contrary, only one party appoints people to the NLRB who approve of graduate student unions, and only one appoints people to the Supreme Court who approve of such NLRB decisions. No Democrat in the White House, no graduate student union; it's that simple. An honest Nader campaign slogan might have read, "Vote your conscience and lose your union...or your reproductive freedom...your wildlife refuge, etc., etc."

Well, Nader's support collapsed, but not far or fast enough. In the future, it will be difficult to heal the rift that Nader's costly war on even the most progressive Democrats has opened. Speaking to In These Times's David Moberg, Nader promised, "After November, we're going to go after the Congress in a very detailed way, district by district. If [Democratic candidates] are winning 51 to 49 percent, we're going to go in and beat them with Green votes. They've got to lose people, whether they're good or bad." It's hard to imagine what kind of deal can be done with a man whose angriest rhetorical assaults appear reserved for his natural allies. (The vituperative attacks on Nader, leveled by many of my friends and cronies on the prolabor democratic left, were almost as counterproductive, however morally justified.) But a deal will have to be done. Nader may have polled a pathetic 2 to 3 percent nationally, but he still affected the race enough to tip some important balances in favor of Bush and the Republicans. He not only amassed crucial margins in Florida, New Hampshire and Oregon; he forced progressive Democrats like Tom Hayden, Paul Wellstone, Ted Kennedy and the two Jesse Jacksons to focus on rear-guard action during the final days rather than voter turnout. If this pattern repeats itself in future elections, Naderite progressives will become very big fish in a very tiny pond indeed.

Perhaps a serious Feingold or Wellstone run at the nomination with a stronger platform on globalization issues will convince those die-hard Naderites to join in the difficult business of building a more rational, Christian Coalition-like bloc to counter corporate power within the party. For now, we can expect an ugly period of payback in Washington in which Nader's valuable network of organizations will likely be the first to pay. Democrats will no longer return his calls. Funders will tell him to take a hike. Sadly, his life's work will be a victim of the infantile left-wing disorder Nader developed in his quixotic quest to elect a reactionary Republican to the American presidency.

* * *

Giving Nader a run for his money in the election hall of shame are the mainstream media. Media portraits of both candidates were etched in stone, with nary a fact or figure allowed to intrude upon the well-worn script. Bush was dumb and Gore a liar; pretty much nothing else was allowed in the grand narrative. Like Nader, reporters assumed the enormous policy differences between Gore and Bush--on Social Security, prescription drugs, education, affirmative action, abortion rights, the environment--to be of trivial importance, hardly worth the time and effort to explain or investigate. The media's treatment of this election as a popularity contest rather than a political one between two governing ideologies was an implicit endorsement of the Bush campaign strategy, as the issues favored Gore. But even so, Bush was usually treated like some pet media cause. With regard to such consequential questions as his political program, his political experience, his arrest record, his military service, his business ethics, Bush was given a free pass by media that continued to hound Gore about whether he was really the model for Oliver in Love Story--which, by the way, he was. I guess being a Bigfoot journalist means never having to say you're sorry.

* * *

One election development that had to gladden New Republic owner Marty Peretz's heart was how bad it was for the Arabs. I got a call one day from a Republican Party functionary telling me that Hillary Clinton supported a Palestinian state and took money from groups that supported terrorist organizations "like the one that just blew up the USS Cole." I told the sorry sonofabitch that like Israel's Prime Minister, I, too, support a Palestinian state. And, if there was any justice in the world, Hillary's "terrorist" friends would blow up Republican headquarters while we were still on the phone, so I could enjoy hearing the explosion.

This heavy-handed bit of racist manipulation grew out of a story published, surprisingly, not in Rupert Murdoch's New York Post but in the putatively responsible and nominally liberal New York Daily News, owned by Mortimer Zuckerman. It was inspired by the machinations of one Steven Emerson, a discredited "terrorism expert" last heard trying to pin the Oklahoma City bombing on the Arabs by noting that "inflict[ing] as many casualties as possible...is a Middle Eastern trait." Each actor played a dishonorable role in the tawdry drama: The Daily News invented the story. The Lazio campaign brazenly exploited it. Hillary Clinton's campaign capitulated to it. Together with the media coverage of the main event, this mini-drama will go down in history as further evidence of that unhappy nostrum of American politics that this year seems to have escaped everyone from the Nader die-hards to Palestinian militants: Things can always get worse.

So it all came out right in the end: gridlock on the Hill and Nader blamed for sabotaging Al Gore.

First a word about gridlock. We like it. No bold initiatives, like privatizing Social Security or shoving through vouchers. No ultra-right-wingers making it onto the Supreme Court. Ah, you protest, but what about the bold plans that a Democratic-controlled Congress and Gore would have pushed through? Relax. There were no such plans. These days gridlock is the best we can hope for.

Now for blaming Nader. Fine by me if all that people look at are those 97,000 Green votes for Ralph in Florida. That's good news in itself. Who would have thought the Sunshine State had that many progressives in it, with steel in their spine and the spunk to throw Eric Alterman's columns into the trash can?

And they had plenty of reason to dump Gore. What were the big issues for Greens in Florida? The Everglades. Back in 1993 the hope was that Clinton/Gore would push through a cleanup bill to prevent toxic runoff from the sugar plantations south of Lake Okeechobee from destroying the swamp that covers much of south-central Florida. Such hopes foundered on a "win-win" solution brokered by sugar barons and the real estate industry.

Another issue sent some of those 97,000 defiantly voting for Nader: the Homestead Air Force Base, which sits between Biscayne National Park and the Everglades. The old Air Force base had been scheduled for shutdown, but then Cuban-American real estate interests concocted a scheme to turn the base into a commercial airport. Despite repeated pleas from biologists inside the Interior Department as well as from Florida's Greens, Gore refused to intervene, cowed by the Canosa family, which represented the big money behind the airport's boosters. Just to make sure there would be no significant Green defections back to the Democratic standard, Joe Lieberman made a last-minute pilgrimage to the grave of Jorge Mas Canosa.

You want another reason for the Nader voter in Florida? Try the death penalty, which Gore stridently supported in that final debate. Florida runs third, after Texas and Virginia, as a killing machine, and for many progressives there it is an issue of principle. Incidentally, about half a million ex-felons, having served their sentences and probation, are permanently disfranchised in Florida. Tough-on-crime drug-war zealot Gore probably lost crucial votes there.

Other reasons many Greens nationally refused to knuckle under and sneak back to the Gore column? You want an explanation of why Gore lost Ohio by four points and New Hampshire by one? Try the WTI hazardous-waste incinerator (world's largest) in East Liverpool, Ohio. Gore promised voters in 1992 that a Democratic administration would kill it. It was a double lie. First, Carol Browner's EPA almost immediately gave the incinerator a permit. When confronted on his broken pledge, Gore said the decision had been pre-empted by the outgoing Bush crowd. This too was a lie, as voters in Ohio discovered a week before Election 2000. William Reilly, Bush's EPA chief, finally testified this fall that Gore's environmental aide Katie McGinty told him in the 1992 transition period that "it was the wishes of the new incoming administration to get the trial-burn permit granted.... The Vice President-elect would be grateful if I simply made that decision before leaving office."

Don't think this was a picayune issue with no larger consequences. Citizens of East Liverpool, notably Terri Swearingen, have been campaigning across the country on this scandal for years, haunting Gore. So too, to its credit, has Greenpeace. They were active in the Northeast in the primaries. You can certainly argue that the last-minute disclosure of Gore's WTI lies prompted enough Greens to stay firm and cost him New Hampshire, a state that, with Oregon, would have given Gore the necessary 270 votes.

And why didn't Gore easily sweep Oregon? A good chunk of the people on the streets of Seattle last November came from Oregon. They care about NAFTA, the WTO and the ancient forests that Gore has been pledging to save since 1992. The spotted owl is now scheduled to go extinct on the Olympic Peninsula within the next decade. Another huge environmental issue in Oregon has been the fate of the salmon runs, wrecked by the Snake River dams. Gore thought he'd finessed that one by pretending that, unlike Bush, he would leave the decision to the scientists. Then, a week before the election, his scientists released a report saying they thought the salmon could be saved without breaching the four dams. Nader got 5 percent in Oregon, an amazing result given the carpet-bombing by flacks for Gore like Gloria Steinem.

Yes, Nader didn't break 5 percent nationally, but he should feel great, and so should the Greens who voted for him. Their message to the Democrats is clear. Address our issues, or you'll pay the same penalty next time around. Nader should draw up a short list of nonnegotiable Green issues and nail it to the doors of the Democratic National Committee.

By all means credit Nader, but of course Gore has only himself to blame. He's a product of the Democratic Leadership Council, whose pro-business stance was designed to regain the South for the Democrats. Look at the map. Bush swept the entire South, with the possible exception of Florida. Gore's electoral votes came from the two coasts and the old industrial Midwest. The states Gore did win mostly came courtesy of labor and blacks.

Take Tennessee, where voters know Gore best. He would have won the election if he'd carried his home state. Gore is good with liberals earning $100,000-$200,000. He can barely talk to rural people, and he made another fatal somersault, reversing his position on handguns after telling Tennessee voters for years that he was solid on the gun issue. Guns were a big factor in Ohio and West Virginia, too. You can't blame Nader for that, but it's OK with us if you do. As for Nader holding the country to ransom, what's wrong with a hostage-taker with a national backing of 2.7 million people? The election came alive because of Nader. Let's hope he and the Greens keep it up through the next four years. Not one vote for Nader, Mr. Alterman? He got them where it counted, and now the Democrats are going to have to deal with it.

The razor-thin margin that defined the presidential race is sure to stir controversy around the Ralph Nader vote. Those wishing to blame Nader for Gore's troubles and those Greens wishing to take credit for giving the Democratic candidate a political "cold shower" will focus on Florida. Nader's 97,000 votes in that state came to less than 2 percent of the statewide total, but with barely 1,000 Florida votes deciding the national election, they are sure to be dissected and scrutinized. Ironically, only in the final days of the campaign did Nader decide to return to Florida and ask for votes. A last-minute debate inside his campaign weighed the possibilities of focusing efforts in swing states like Florida or in Democrat-rich states like New York and California, where "strategic voters" could vote Green without concern about affecting Gore's final tallies. Nader eventually decided he would get more media coverage by targeting places like Florida.

On the national level, Nader fell considerably short of his goal of achieving a 5 percent national vote that would have qualified the Green Party for millions in federal matching funds in 2004. When the votes were counted, Nader had pocketed 3 percent, or around 2.7 million votes--almost four times what his "uncampaign" garnered in 1996. Relentless pressure on potential Nader voters by liberal Democrats to switch to Gore clearly had an effect on the Green campaign, helping tamp down the final vote to nearly half the level at which Nader had been polling late in the race.

No question but that this result is far from the best scenario for those who hoped that Nader's run this year would hand the Greens substantial future leverage. Given the failure to establish a federally funded national Green Party in the balloting, however, that future clout will depend mostly on Nader's ability and willingness to take his list of 75,000 campaign contributors (as well as countless volunteers and voters) and hone it into an identifiable political entity. That task could be rendered even more problematic by those who will blame Nader for a Gore defeat.

That said, various state Green parties will emerge from this week strengthened and positioned to make a difference in scores of Congressional and legislative districts. In some progressive-minded counties--like Humboldt and Mendocino in Northern California--the Nader vote grazed 13 to 14 percent. In many others the Greens scored 5 to 10 percent, making them a potential swing vote in future local elections. In this election, nationwide, some 238 Greens ran for municipal office, and fifteen were victorious.

In what had been considered virtual "Naderhoods"--several northern-tier states where the Greens had significant pockets of strength--the candidate's vote was less than spectacular. In Wisconsin, Washington and Oregon Nader finished with only 4 or 5 percent. Just six weeks ago, he was approaching 10 percent in Oregon. The Greens scored 5 percent in Minnesota--a figure they had been polling for some time--and they hit 6 percent in Montana, Maine, Massachusetts, Rhode Island and Hawaii. The Green high-water marks were in Vermont (7 percent) and Alaska (10 percent--down from 17 percent in some earlier polls).

In the Democratic strongholds of New York and California, where Al Gore won by huge margins and where a ballot for Nader was considered "safe" by those who called for strategic voting, the Greens ended up with a relatively disappointing 4 percent--the same number reached in New Mexico, where Greens have competed statewide for more than five years.

Predictions that the Greens would spoil Gore's chances failed to materialize. Washington, Minnesota, New Mexico, Michigan and Wisconsin--states where Democrats argued that Nader could swing the vote to the GOP--were all won by Al Gore. Even in Oregon, Nader's impact on the major party race was arguably negligible. At press time, Gore was losing the state by about 25,000 votes and Nader's total was 5 percent, or just over 50,000. But whether a sufficient number of the Nader votes would have gone to Gore is open to question. A national USA Today/CNN/Gallup Tracking poll a few days before the election found that only 43 percent of likely Nader voters would vote for Gore as their second choice. Twenty-one percent said they would vote for Bush second. And an equal number said they would vote for Nader or not at all.

As the media obsessed over the seesaw presidential poll, voters across the country quietly made their choices on more than 200 disparate ballot measures and initiatives. For progressives the results are--as usual--mixed.

First the bad news: Three campaign finance reform initiatives went the wrong way. Clean-money measures providing for full public financing were thumped in Missouri and Oregon. Similar measures had been passed in previous years by voters in Maine, Massachusetts and Arizona as well as by the legislature in Vermont--but this time around powerful, well-financed business lobbies weighed in, and dirty money beat clean money. In Oregon opponents ran an effective (and expensive) radio campaign highlighting the out-of-state financial support for the reform and raising the specter of extremists running for office if the measure passed.

In Missouri corporate opponents--including Anheuser-Busch, KC Power & Light, Hallmark Cards and the Missouri Association of Realtors--poured hundreds of thousands of dollars into their victorious antireform campaign. Californians, meanwhile, approved Proposition 34, billed as campaign reform but actually cooked up by the establishment to block real reform. The returns on these three measures should compel campaign finance reform activists to rethink their strategies. These are significant and stinging defeats.

The good news is that the failed drug war was a loser in five of seven related measures nationwide. Medical marijuana initiatives passed in Colorado and Nevada (although a full marijuana-legalization bill failed in Alaska). Oregon and Utah voted to reform draconian drug forfeiture laws. And in California, Proposition 36, providing treatment instead of jail for first- and second-time drug offenders, passed easily. But a similar proposition failed in Massachusetts (which also refused to approve a universal healthcare proposal).

Another bright spot was public education. Voucher measures in California and Michigan were beaten by wide margins. Silicon Valley entrepreneur Tim Draper put up millions for the California proposal--to no avail. California voters also approved a measure that makes passage of school bonds easier. But bilingual education, banned in the Golden State two years ago, was thrown out by Arizona voters as well. As he did in California, businessman Ron Unz fathered and funded the Arizona measure.

Colorado voters defeated the so-called informed consent measure on abortion, but Nevada and Nebraska approved bans on same-sex marriage, with Nebraska's measure reaching civil unions as well. In Maine a measure to protect gays from discrimination was defeated. In Oregon the notorious Measure 9, which would have outlawed "teaching" homosexuality in schools, failed. Oregonians also rejected two antiunion "paycheck protection" measures, which the state labor federation had vigorously fought.

DNA testing can convict the guilty; it can also destroy the privacy of millions.

In New Mexico, communists who fail to register their party affiliation with the state commit a felony. Under New Mexico's DNA databanking law, if they are caught they are required to submit a DNA sample to the department of public safety. In Idaho, consensual sodomy with a partner other than your spouse constitutes a sex-crime felony. Those unfortunate enough to be caught in the act are similarly required by law to submit a tissue sample to the state's DNA databank for the purposes of preventing future sex crimes. And if Governor George Pataki is successful in the next legislative session, New York will begin collecting genetic material from any person convicted of a misdemeanor, such as resisting arrest or disorderly conduct as a result of peaceful civil disobedience.

In an age of biotechnology and computers, we are all but a needle-stick away from disclosing hereditary-disease susceptibilities, familial relationships and identifying information. Anyone who values privacy should therefore be concerned that US law-enforcement agencies are amassing ever larger portions of the general population's DNA while neglecting to implement measures that would protect the privacy and presumptive innocence of citizens. And because DNA evidence is currently enjoying an unprecedented degree of bipartisan enthusiasm, these gradual developments have tended to be sheltered from the criticism that might otherwise confront such policies.

Not that DNA evidence's celebrity isn't well deserved. It is many rape victims' best hope for identifying their assailants and law enforcement's most penetrating method of apprehending serial offenders. It can be credited with triggering a re-examination of the nation's capital punishment system by exonerating eight death-row inmates. Like their predecessor, the fingerprint, DNA profiles are a reliable means of identifying individuals (except in the case of identical twins). But glib analogies to fingerprints obscure important differences. DNA samples can reveal far more information than fingerprints, including sensitive medical conditions, inherited traits or a person's biological parentage. In addition, while fingerprints are unique to every individual, genetic profiles are partially shared among blood relatives. Thus, databanks contain identifying information on nonoffending relatives of people explicitly covered by databanking statutes. Finally, because we shed our genetic calling cards in a trail of hair follicles, skin flecks, saliva aerosols and finger smudges, DNA can also provide a trace of our activities.

DNA databanks are premised on statistics indicating that individuals convicted of a serious violent offense often commit other violent offenses that leave behind incriminating DNA. Tissue samples, usually in the form of a blood sample or cheek swab, are thus collected from offenders covered by their state's databank laws and are analyzed using a technique called "profiling," which detects genetic variations among individuals that, at least as currently understood by geneticists, have no biological function. The resulting data are then computerized so that profiles produced from crime-scene samples can be compared with those already in the database, allowing authorities to eliminate certain suspects or target those whose profiles match. In effect, databanks provide a means of genetically frisking anyone who has ever committed a covered offense for any crime in which DNA has been recovered.

As of June 1998 all fifty states had enacted statutes authorizing state and local law-enforcement agencies to operate criminal DNA databases and to pool their DNA profiles into a national FBI-operated database called CODIS (Combined DNA Index System). Though the earliest laws targeted convicted violent sexual felons, civil libertarians looked to the history of Social Security numbers, fingerprinting and drug-testing to warn of an inevitable migration of the technique from convict to suspect terrain. A decade later, as many states have passed laws to cover new offender categories, the Cassandras appear to have been vindicated. Delaware, for instance, requires submission of genetic samples from all those who have committed offenses against children, a category that includes selling tobacco to minors and tattooing them without the consent of a guardian. Twenty-three states cover certain categories of misdemeanors, and seven states have enacted or considered legislation that would require DNA submission for any felony, which would extend DNA databanking into realms such as perjury, larceny, bribery and fraud. Thus, in addition to New Mexico's statute covering unregistered communists, Alabama's code covers tax evaders and Virginia's targets people who deface brands or marks on timber. Experts like CODIS program director Steve Niezgoda have predicted that all states will eventually amend their statutes to cover all felonies; four states have already done so, and another three have recently considered or will consider such an expansion in their next legislative sessions. Among these three, New York's proposal stands out as by far the nation's most comprehensive, targeting all convicted felons and class-A misdemeanants.

DNA databanking laws are furthermore part of the ferment that is corroding the century-old juvenile justice system that treats minors as a category of offenders separate from adults. More than half of all states authorize inclusion of DNA profiles collected from juveniles in their databanks. In contrast to the convention of sealing or erasing juvenile criminal records after a period of time--a practice grounded on a rehabilitative ideal--none of the statutes require states to remove juvenile DNA profiles from their databanks, and one (Arizona's) expressly prohibits their removal. Several states have revised their original legislation to cover juvenile offenders as well. The spread of DNA databanking to minors is especially troubling when considered against the racial inequities that plague the juvenile justice system. According to Vincent Schiraldi, president of the Center on Juvenile and Criminal Justice, "When you control for poverty, white and black [teens] commit the same amount of violent crime, [but] blacks are arrested at four times the rate of whites and imprisoned at seven times the rate of whites. So don't think for a second this databank will be race-neutral. This policy will grossly overrepresent criminal behavior by blacks and exacerbate disparities in incarceration because [databanks are] going to be used against people."

An indirect consequence of expanding DNA databanks is that they partially cover a growing number of nonoffending relatives. Because individuals share portions of their DNA with biological relatives--half in the case of siblings, parents and children--an incomplete match between a databanked person's profile and that of a crime-scene sample might lead investigators to question an individual's immediate family. Such profiling by proxy means that identifying information about nonoffenders resides in criminal databank systems too: if you have a relative whose profile has been databanked, you are liable to be partially genetically frisked as well.
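
To make the proxy effect concrete, here is a second hypothetical sketch, continuing the Python illustration above. Because a parent and child share at least one allele at every locus, a profile that shares an allele everywhere yet matches nowhere exactly is precisely the near-miss that can send investigators to a databanked person's family. Again, the representation and the flagging rule are illustrative assumptions, not any agency's actual practice.

    # A hypothetical sketch of "profiling by proxy." A parent and child
    # share at least one allele at every locus, so a crime-scene profile
    # left by an offender's child will share an allele with the parent's
    # banked profile at every locus without matching it outright.

    def shares_allele_everywhere(profile_a, profile_b):
        """True if the profiles share at least one allele at every locus
        typed in both."""
        common = [locus for locus in profile_a if locus in profile_b]
        return bool(common) and all(
            set(profile_a[locus]) & set(profile_b[locus]) for locus in common)

    def possible_relatives(databank, scene_profile):
        """Flag banked profiles consistent with a parent, child or sibling
        of the sample's true source: an allele shared at every common
        locus, but no exact match."""
        return [oid for oid, profile in databank.items()
                if shares_allele_everywhere(scene_profile, profile)
                and not all(profile.get(l) == a
                            for l, a in scene_profile.items())]

    # Continuing the earlier example: a sample left by offender-1041's child
    # would share an allele with his banked profile at every locus without
    # matching it exactly, and so would be flagged here.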

A critical unresolved question about current databanking practices concerns what law-enforcement agencies actually do with their frozen vials of human tissue. The human genome contains approximately 100,000 different genes, many of which are associated with specific illnesses. Though DNA profiles have few applications beyond linking individuals to biological specimens, the actual tissue samples submitted by offenders could in principle be analyzed for genetic traits ranging from sickle-cell anemia to schizophrenia. Since evolving typing techniques may one day outmode profiles currently being entered into computers, more than half of US states are authorized or required by law to archive their samples so they can be retested. This sustains the possibility that samples may eventually be used for purposes other than profiling.

Most statutes restrict sample use to "law enforcement"--a term whose broadness in this context can only be described as oceanic. Twenty states allow law-enforcement agencies to use samples for research on improving forensic techniques, which could mean searching banked DNA samples for genetic predictors of recidivism, pedophilia or aggression. One Massachusetts legislator has publicly advocated such a use, and Tom Callaghan, program manager of the FBI's Federal Convicted Offender DNA Database, refused to rule out the possibility when pressed at a National Institute of Justice Symposium in September 1999. Moreover, the tissue repositories created by databanks would provide genetics researchers with congenial waters in which to trawl for genes thought to be involved in criminal behavior. Alabama's databanking law brushes perilously close to this by authorizing release of anonymous DNA population data collected by law-enforcement authorities to "assist in other humanitarian endeavors including, but not limited to, educational research or medical research or development."

Experimenting with offender DNA in this way would violate basic tenets of biomedical ethics by using tissues that were not obtained by consent for purposes that arguably run counter to the interests of the research subject. "If [law-enforcement authorities] want to do research," argues Boston University bioethicist George Annas, "they should follow the same rules everyone else has to follow in terms of informed consent and privacy.... Criminals have privacy rights like everyone else." Using databanked samples for research without consent also runs counter to recommendations of the American College of Medical Genetics.

Such research authorizations are especially troubling in light of this nation's checkered history of experimentation on prisoners. In 1875 social reformer and prison inspector Richard Dugdale wrote his famous study of the Jukes family after he noticed a disproportionate number of inmates with that last name; the study went on to become a touchstone for eugenic claims that criminality is hereditary. The availability of banked criminals' tissues may prove a similarly valuable resource should society's interest in genetic explanations for social ills be renewed.

Legal challenges to DNA database laws have generally failed and are therefore unlikely to stem their widening sweep. Practices in Britain, the first country to enlist DNA in its crime-fighting cavalry, may portend dramatically widened use of databanking in the United States. Britain's Forensic Science Service is authorized to collect DNA samples from anyone questioned about or suspected of any offense for which a person could be detained. As of July 1999, England had collected 547,000 DNA samples, and the effort was projected eventually to reach 30 percent of British men. In addition, England has conducted at least eighty "intelligence-based screens"--the official term for what is colloquially called a "genetic sweep"--in which the general population is asked to submit DNA samples to help police investigate a particular crime. Although samples are provided voluntarily, social pressures, heavy media coverage and the concern that failure to submit a sample may itself invite police suspicion undermine the notion that submissions are truly consensual. Other countries, including Canada and Germany, have conducted similar sweeps, and while some argue that the Fourth Amendment would probably bar such practices in the United States, privacy watchdogs like New York Civil Liberties Union executive director Norman Siegel caution that "Fourth Amendment challenges [of databanks] have not been successful; these are the only reference points we have [for predicting how courts will rule on genetic sweeps], and they're not promising."

The next battle between civil libertarians and law-enforcement authorities over DNA databanking is likely to be the leap from profiling convicted felons to profiling arrestees. Former NYPD chief Howard Safir has championed arrestee profiling, and US Attorney General Janet Reno has begun to explore the implications of such a policy by querying a National Institute of Justice commission. Arrestee profiling would dramatically broaden the reach of DNA databanking and, if not subject to careful restrictions, would empower law-enforcement authorities to arrest people for minor offenses, collect a tissue sample and search their databases for a match between the arrestee's profile and another crime-scene sample. Despite widespread enthusiasm in law-enforcement circles, profiling on such a scale isn't likely to be implemented anytime soon, given the backlog of tissue samples awaiting profile analysis and the high cost (at least $100 per sample). Nevertheless, one state (Louisiana) already profiles arrestees for sexual offenses, and advancing automation technologies are likely to erode these fiscal barriers.

Though all this gives privacy advocates reason for despair, there are a few hopeful signs among the various statutes. Twenty-seven states (and the federal government), for example, prohibit disclosure of genetic materials or information to unauthorized third parties. Wisconsin requires that law-enforcement authorities destroy the DNA samples of convicted persons after profiling is complete, and six states (Indiana, New Hampshire, Rhode Island, Texas, Vermont and Wyoming) restrict what authorities can do with collected DNA by prohibiting analysis of genetic mutations that could predict a person's traits. But in an environment where the political leaders most likely to object to such policies are often silenced by a fear of appearing soft on crime, how long these protections will stand is an open question.

Imagining a fair and protective system for using DNA evidence in the criminal justice system isn't all that difficult. People claiming innocence should be given opportunities to volunteer DNA to clear their names. For them--and more broadly for the credibility of the criminal justice system--DNA forensic technology may be the only life vest within reach. Once a conviction is overturned, the volunteered DNA samples and profiles should be promptly destroyed, preserving the individual's presumption of innocence. For people convicted of serious violent offenses and beyond the reach of such exculpatory evidence, however, the trade-off between privacy and public interest may tilt toward a DNA databanking system with strong privacy protections, including sample destruction after profiling and prohibitions on uses other than comparing profiles with those collected from crime scenes. And finally, to protect the presumption of innocence of convicted offenders' family members, states should impose stringent requirements for when a match between a crime-scene sample and a databanked profile can trigger an investigation.

Privacy is a zero-sum affair: The extension of law-enforcement authorities' genetic gaze comes directly at the expense of an individual's power to withhold such information. Where most human DNA handling once occurred in medical clinics and research laboratories--institutions generally subject to public oversight and cautious (if imperfect) ethical review--DNA has now entered a territory not particularly distinguished for its ethical circumspection. States have given the public little reason to be confident that they take these concerns seriously; perhaps of even greater concern, negligence in protecting the privacy of offenders and criminal suspects may acclimate the public to weak protections of genetic materials. As the predictive powers of genetic technologies are refined, that could have grievous consequences for everyone.

VINCENT CANBY

As a memorial tribute to Vincent Canby, the "Arts & Leisure" section of the New York Times recently published half a page of excerpts of his prose, as selected by The Editors. Implacable beings of ominous name! With grim rectitude, they shaped a Canby in their image, favoring passages where he had laid down principles of the sort that should be cited only under capitalization. These were Sound Judgments.

For those of us who admired Mr. Canby (as the Times would have called him while he was alive, and as I will continue to call him, knowing how the style fit the man), soundness of judgment was in truth a part of his merit. A hard man to fool, he could distinguish mere eccentricity from the throes of imaginative compulsion, the pleasures of pop moviemaking from the achievements of film art; and when he was offered sentimentality in place of feeling, his heart didn't warm, it burned. These powers of discernment allowed him to bear with extraordinary grace the responsibility of being the Times critic. They also added greatly to that responsibility, since it was his sureness, as much as the institutional weight of the Times, that made Vincent Canby so influential.

That said, I confess I read him to laugh. At present, I can give only tin-eared approximations of his wisecracks--correct and ample quotation will become possible when someone smart decides to publish a Vincent Canby anthology--but I can hardly forget his review of Salome's Last Dance. This picture was the latest chapter in Ken Russell's phantasmagorical history of sex in the arts, or the arts in sex. Mr. Canby's lead (more or less): "As the bee is drawn to the flower, as the hammer to the nail, so Ken Russell was bound to get to Oscar Wilde."

I also recall Mr. Canby's description of the used car that Jim Jarmusch peddled to the title characters in Leningrad Cowboys Go America. It looked, he said, as if it had been dropped from a great height. Writing about I've Heard the Mermaids Singing, a film of relentlessly life-affirming whimsy, he claimed he'd been cornered by a three-hundred-pound elf. A typically self-regarding, show-offy performance by Nicolas Cage (was it in Vampire's Kiss?) inspired him to write that other actors must enjoy working with this man about as much as they'd welcome being shut up with a jaguar. And once, when forced to think up copy about his umpteen-thousandth formula movie, he proposed that the only way to derive pleasure from such a picture would be to play a game with yourself, betting on whether you could guess what would happen next. "As you win," he wrote, "you lose."

From these few and random examples, you may conclude that Mr. Canby's principles often emerged with a deep-voiced chuckle, and that they involved matters that went far beyond the movies. Some of these concerns were political in the specific sense, as when he gave a favorable review to Alex Cox's Walker: a film that offered a burlesque insult to US supporters of the Nicaraguan contras, in government and at the Times. His concerns were also political in a broader sense. Witness the 200 words he devoted to a little African-American picture titled Love Your Mama: a heartfelt, thoroughly amateurish movie produced in Chicago by some people who had hired an industrial filmmaker to direct their script. While quietly letting his readers know that they probably would not want to watch this film, Mr. Canby conveyed a sense that real human beings, deserving of respect, had poured themselves into the project.

Of course, the best places in which to seek Mr. Canby's principles were within the films he championed. He would have earned his place in cinema history (as distinct from the annals of journalism) had he done nothing more than support Fassbinder's work. And yet I'm not surprised that The Editors found no space to reprint Mr. Canby's writings on this crucial enthusiasm. Fassbinder, like his critic, was preternaturally alert to political and social imposture, to the bitter and absurd comedy of human relationships, and also (for all his laughter) to the pain and dignity of those who go through life being pissed on. Mr. Canby recognized in Fassbinder's work all these qualities and more (such as the presence, in the person of Hanna Schygulla, of one of cinema's great fantasy objects); but these matters seem to have been judged too unruly for an "Arts & Leisure" tribute.

Now, I've been allowed to do some work for "Arts & Leisure" and have received from my editors nothing but aid and kindness. Surely the people I've dealt with at the Times would have chosen excerpts from Mr. Canby that were funnier, sharper, more challenging. So maybe, when the Times moves to memorialize somebody as one of its own, a higher level of control takes over. It's as if the paper means to show its own best face--or rather the image it wants to see in the mirror, urbane and solid--and never mind that man in the old tweed jacket.

This tendency of the institution to eclipse the individual figures prominently in a new book by another major film critic, Jonathan Rosenbaum. By "major," I mean that Rosenbaum is highly regarded by other reviewers and film academics, and that he's gained a certain public following (concentrated in Chicago, where he serves as critic for the Reader). But if you were to ask him how he fits into American film culture in particular and US society in general, he would locate himself, quite accurately, on the margins. As his friends will tell you (I hope I may count myself among them), Rosenbaum is one of the angel-headed hipsters: a sweet-natured, guileless man, wholly in love with art and wholly longing for social justice. And for these very reasons, he has become the angry man of American film criticism, as you might gather from the title of his new work, Movie Wars: How Hollywood and the Media Conspire to Limit What Films We Can See (A Cappella, $24).

Rosenbaum argues--"argue," by the way, is one of his favorite words--that those American writers, editors and TV producers who pretend to cover film are for the most part hopelessly self-blinkered. It's in their interest to look at only those movies that the big American companies want to promote (including the so-called independent films that have been ratified by Sundance and Miramax). So journalism collaborates with commerce, instead of acting as a check on it; informed, wide-ranging criticism gets shoved to the side; films that might have seemed like news flashes from the outside world fail to penetrate our borders; and everyone excuses this situation by claiming that "the people" are getting the dumb stuff they want. Rosenbaum is enraged that moviegoers should be viewed with such contempt; he's infuriated that well-placed journalists should justify their snobbism (and laziness) by dismissing whatever films and filmmakers they don't already know about; and he's mad enough to name names.

In Movie Wars, Rosenbaum advances his arguments by means of a crabwise motion, scuttling back and forth between general observations (which are newly composed) and case studies (many of them published before, in the Reader and elsewhere). This means that some stretches of ground are covered two or three times. I don't much mind the repetition--even when the material shows up in a second new book by Rosenbaum, his excellent, unabashedly partisan monograph on Jarmusch's Dead Man (BFI Modern Classics, $12.95). I do worry that indignation, however righteous, has begun to coarsen Rosenbaum's tone and push him into overstatement.

When Rosenbaum is at his best, his extraordinary wealth of knowledge about cinema informs an equally extraordinary power of insight into individual pictures; and both these aspects of his thinking open into frequently astute observations of the world at large. You can get Rosenbaum at his best in his Dead Man monograph and in three previously published collections: Moving Places, Placing Movies and Movies as Politics (California). By contrast, Movie Wars is a sustained polemic, with all the crabbiness that implies.

It's a welcome polemic, in many ways. Most rants against the infotainment industry are on the level of Michael Medved's godawful Hollywood vs. America; they complain, in effect, that the movies tell us too much about the world. Rosenbaum recognizes the real problem, which is that our world (filmed and otherwise) has been made to seem small. I agree with much of what he says. But when, in his wrath, he digresses to settle scores or rampages past obvious counterarguments, I begin to wish that he, too, would sometimes pretend to be urbane and solid.

"There's a hefty price tag for whatever prestige and power comes with writing for The New York Times and The New Yorker," Rosenbaum says, "and I consider myself fortunate that I don't have to worry about paying it. Film critics for those publications--including Vincent Canby and Pauline Kael...--ultimately wind up less powerful than the institutions they write for, and insofar as they're empowered by those institutions, they're disempowered as independent voices."

To which I say, yes and no. As bad as the situation is--and believe me, it's woeful--I've noticed that news of the world does sometimes break through. David Denby, in The New Yorker, may contribute to American ignorance by being obtuse about Kiarostami (as Rosenbaum notes with disdain); but then, as Rosenbaum fails to note, Stephen Holden and A.O. Scott in the Times delivered raves to Taste of Cherry and The Wind Will Carry Us. Individuals in even the most monolithic publications still make themselves heard; and the exceptional writer can manage (at least in life) to upstage an entire institution.

Rosenbaum himself has pulled off that trick at the Reader; and Vincent Canby did it at the Times. To the living critic, and all those who share his expansive view of the world, I say, "We've lost a champion. Better stop grousing and pick up the slack." And to those who mourn Mr. Canby, I say, "You can still hear his laughter. Just don't let The Editors get in the way."

The last chapter in Ring Lardner Jr.'s new memoir, I'd Hate Myself in the Morning (Nation Books), is called "Sole Survivor." When Lardner, who died October 31, wrote it, he was indeed (a) the last of a family of four boys with a famous father, the humorist and sportswriter Ring Lardner; and (b) the last surviving member of the Hollywood Ten, who gained renown in 1947 when they refused to answer the House Committee on Un-American Activities' question, "Are you now or have you ever been a member of the Communist Party?" They were indicted, prosecuted and convicted of contempt of Congress and sent to prison--in Ring's case for a year.

Among the first victims of the great Red purge to come, the Ten--also known as the Unfriendly Ten--are historically important because they were willing to risk prison to help prevent that purge, putting First Amendment principle ahead of personal convenience.

At the time, Billy Wilder, the witty director, cruelly and unjustly said, "Of the Unfriendly Ten, only two had any talent; the other eight were just unfriendly." Ring, who had already won his first Academy Award for Woman of the Year, starring Katharine Hepburn, was one of the two. The other was his buddy Dalton Trumbo, the highest-paid writer in Hollywood, who went on to win an Oscar for The Brave One, a movie he wrote under the pseudonym Robert Rich.

At the time, the tabloid press and newsreels did their best to portray the Ten as obstreperous, dogmatic followers of the party line. In fact, each of the Ten was following his conscience, although they arrived at their decision on how to confront HUAC after collective deliberation with counsel, some of whom were party lawyers, others not.

Lardner's famously elegant response to the committee was a clue to how wrong that image was. "I could answer your question," he said, but "I would hate myself in the morning"--hence his memoir's title.

Even during the blacklist years, when he made his primary living writing under various pseudonyms, he never gave up on his social commitment. Thus in 1955, when Hannah Weinstein set up a production company in London and chose for its maiden effort in the new medium of television The Adventures of Robin Hood, Lardner, along with fellow blacklistees like Abe Polonsky and Walter Bernstein, leapt at the opportunity for, as he put it, commentary-by-metaphor "on the issues and institutions of Eisenhower-era America."

After he was finally graduated from the blacklist--it took twelve years--and able to write under his own name, he gave us M*A*S*H, the black comedy that was, on the surface, about life in a medical unit during the Korean War; but beneath the surface, like Joe Heller's Catch-22, it was about the absurdities and contradictions of war itself.

Although his public positions were militant, privately he was a gentle soul. His main target was often himself. He would delight in telling how he recommended to David O. Selznick that he not acquire Gone With the Wind, the highest-grossing picture of its time, "because I objected on political grounds to the glorification of slave owners and the Ku Klux Klan." When progressives praised him for his principled stand against HUAC he would observe that the Ten did the only thing they could do under the circumstances "short of behaving like complete shits."

The loss of Lardner is a loss for both The Nation and the nation. One part Marxist democrat and two parts humanist-rationalist, he stayed true to his vision to the end. A few years ago he listed in The Nation "some of the strange things Americans believe 200 years after Thomas Paine published The Age of Reason." (Typical entries: "Eating fish is good for the brain"; "There never was a Holocaust.") He felt no comment was called for. But when a reader wrote to complain that "Reason is a wonderful tool, but it is a tiny flashlight shining here and there..." Lardner responded, "What he sees as a tiny flashlight, I call, in the words of Cicero, 'the light and lamp of life.'"

In an introduction to his memoir, I call Lardner "recrimination-challenged." In fact he seemed incapable of bitterness. Although he did once say of Martin Berkeley, a screenwriter who named a record 161 names before HUAC and specialized in writing animal pictures, "I always maintained that was because he couldn't write human dialogue."