
November 27, 2000 | The Nation


Letters


Malthusian Delusions?

Franklin, N.Y.

Amartya Sen starts his otherwise sensible "Population and Gender Equity" [July 24/31] with the unproven assertion that Thomas Malthus was wrong when he wrote that population growth would soon outstrip growth in food production. What Malthus didn't know is that the age of cheap and abundant fossil fuels was at hand and would, for a geologically brief 200 years, delay the fulfillment of his gloomy prediction. Now those fuels are running out while population has grown to numbers Malthus probably could not have conceived of.

Modern agriculture has been described as a means of turning petroleum into food, but sometime--very likely this decade--the world will reach peak oil production. Then all will change. Fertilizers made from petroleum and natural gas will be very expensive and then unavailable; transportation and the operation of farm machinery will be hugely expensive or impossible. The idea that alternative fuels and solar and wind power will make up the deficit is, so far, a fantasy, and the level of investment in such alternatives remains paltry. In a two-century orgy of consumption, we have burned up the solar energy that was for hundreds of millions of years stored under the earth's surface. Our oil-based civilization is about to come crashing to an end, and we have very little time to prepare to deal with the consequences. Not surprisingly, the oil companies don't acknowledge the problem. Even more disturbing, none of our politicians want to be the bearer of bad news.

One is loath to lump a man of Sen's decency and humanity in with the economic cultists who believe that "the market" will take care of the problem. Nevertheless, it is the bizarre beliefs of economists, including the notion that the world runs on investment rather than energy, that will probably result in Malthus being proved an accurate observer.

EUGENE MARNER


Oakland, Calif.

Amartya Sen dismisses concerns about the global food supply as it relates to burgeoning population for two reasons: Food production has expanded and the price of food continues to fall. But we should not be lulled into thinking the world's food supply is sustainable or secure. That's because at least one key part of food production is not reflected in current prices: water. In coming decades, increasing scarcity of water will make itself felt in prices and supply. Visions for a sustainable future must balance population density and growth with current and future water prices and availability. Food--grain, produce and livestock alike--represents huge investments of water. Producing a ton of beef can require up to 70,000 tons of water and a ton of grain up to 3,000 tons. Agriculture is a thirsty enterprise. As water becomes scarcer (as is already happening in the Central Valley of California, which, like many regions, relies on artificial water supplies) and soils become more salinized, food production will not hold at current levels. The earth's hydrological cycles have been mined to increase food production. This is a historical anomaly, not a sustainable trend.

MICHELE GALE-SINEX
Redefining Progress


College Park, Md.

In Amartya Sen's article on gender equity, structures of patriarchy and capitalism are nowhere visible. Problems are reduced to a series of variables--economic, cultural and political "handicaps" that are to be overcome. Sen's basic point that literacy and schooling bring a decline in fertility is simply a correlational, not causal, part of the tired argument that investment in human capital will yield wealth and progress. Historically, fertility rates did not decline because of education but because wealth made having large families unnecessary for survival. Unusual low-income, low-fertility-rate stories, like China and Kerala, are due not to education (Sen discounts the effects of China's "one child" policy) but to the fact that both have departed from traditional capitalist and patriarchal structures.

The most important issue Sen raises is how employment opportunities for women (and men) are essential to achieving greater gender equity. But Sen, like any mainstream economist, lacks understanding of structures of inequality and oppression. He believes that fostering the education of girls and women, promoting access to microcredit for rural women and fighting discrimination in urban labor markets are the policies that will improve employment and equity. To the contrary, the creation of sustainable, decent livelihoods for the 2 billion women, men and children living on the global margin will not come from better policies within structures rooted in poverty and inequality. "Reversing the...handicaps that make women voiceless and powerless" and "bringing gender equity and women's empowerment to the center of the stage," as Sen wishes to do, do not depend on a "unified framework of understanding" based on the results of "empirical and statistical research" but on a political struggle for economic rights and societal transformation.

STEVEN J. KLEES


SEN REPLIES

London

Eugene Marner is right to express worry about the growth of world population, even though the source of this worry cannot really be the alleged accuracy of Malthus (I shall return to Malthus after discussing the general problem). The exhaustion of fossil fuel is certainly one source of concern (to which Marner rightly draws attention), as is the growing difficulty in guaranteeing adequate water supply (to which Michele Gale-Sinex devotes her letter). Even though each of them has chosen a singular focus of attention (petroleum and water, respectively), problems generated by excessive population growth can arise in many other ways as well, varying from the depletion of the ozone layer to overcrowding in a limited habitat (as I discussed in my essay).

The point of departure in my essay was the particular relation between (1) high fertility rates and (2) the low decisional power--indeed subjugation--of women. The critical linkage is that "the most immediate adversity caused by a high rate of population growth lies in the loss of freedom that women suffer when they are shackled by persistent bearing and rearing of children." This connection is important in itself because of its relevance to the well-being and freedom of women (and derivatively of men as well). Furthermore, since the interests of young women are so closely involved, it would also be natural to expect that anything that increases the voice and power of young women in family decisions will tend to have the effect of sharply decreasing fertility rates (and through that, reducing the environmental adversities associated with population explosion). This expected connection has received very substantial statistical confirmation in intercountry comparisons around the world as well as in interstate and interdistrict correspondences within India (as I indicated in my essay).

That was the reason for my conclusion that women's empowerment and agency (through such factors as their education and economic independence) are central to an effective resolution of the so-called population problem, including its environmental consequences. These connections, which draw on a firm interpretive framework, cannot be dismissed as "simply correlational," as Steven Klees does in his letter. Empirical work is inescapably dependent on statistical investigation. Causal connections, which demand interpretation, have to be assessed on the basis of statistical findings, not independently of them. This combination of interpretive scrutiny and statistical assessment gives causal plausibility to the empirical association between fertility decline and women's empowerment (reflected by such enabling factors as female literacy, women's gainful employment and access to microcredit, land and other resources, and public debates and political discussions on gender equity).

Klees argues that as a "mainstream economist," I cannot have any "understanding of structures of inequality and oppression." If correct, this would be very sad for me, since--mainstream or not--I have devoted a very big part of my life precisely to investigating inequality and oppression, including studying, at close quarters, their manifestations in such phenomena as famines and starvation, class- and gender-related atrocities, and military and police brutalities. I accept the possibility that Klees has been able to acquire (from his vantage point in College Park, Maryland) a direct understanding of these issues which I have failed to achieve. However, since Klees does not refer to any empirical work whatsoever, it would have been very nice to have been told a little about how he has accomplished this understanding. Indeed, despite his fleeting invocation of "patriarchy" (along with "capitalism"), Klees dismisses the relevance of the indicators of women's empowerment that well-researched empirical studies in feminist economics as well as demography have established as important (on which my essay drew).

I come, finally, to Eugene Marner on Malthus. Marner disputes what he describes as my "unproven assertion that Thomas Malthus was wrong when he wrote that population growth would soon outstrip growth in food production." Since exactly the opposite of what Malthus predicted has occurred and continues, why is the recording of the nonfulfillment of Malthus's prediction "unproven"? Is a period of 200 years not time enough to check a prediction? But we must not dismiss Marner's reasoned worries about the future, since the exhaustion of petroleum is an important issue. However, Marner surely oversimplifies with his "turning petroleum into food." There are a great many different factors (such as new seeds, better cultivation techniques, etc.) that have contributed to the sharp rise in food production per capita in the world, which has occurred since Malthus's gloomy predictions were made and which has continued to occur through the most recent decades. Nevertheless, given the difficulties that are visible now (including petroleum and water problems) and new adversities that might well arise, we do have good reason to consider ways and means of raising agricultural productivity as well as reducing fertility rates (as I discussed in my essay).

Where Malthus is particularly counterproductive is in his dismissal of informed reproductive choice and of the effectiveness of women's conscious agency as ways of reducing fertility rates. Malthus took penury to be the only sure way of keeping fertility rates down (he did not revise his view on this particular subject, despite rethinking on some other issues) and even argued for suppressing the Poor Laws and the very modest arrangements for social safety nets and economic security that existed for the poor at his time. It would be unfortunate to rely on Malthus's harsh and dogmatic pronouncements for our understanding of the population problem.

AMARTYA SEN

Editorials

The razor-thin margin that defined the presidential race is sure to stir controversy around the Ralph Nader vote. Those wishing to blame Nader for Gore's troubles and those Greens wishing to take credit for giving the Democratic candidate a political "cold shower" will focus on Florida. Nader's 97,000 votes in that state came to less than 2 percent of the statewide total, but with barely 1,000 Florida votes deciding the national election, they are sure to be dissected and scrutinized. Ironically, only in the final days of the campaign did Nader decide to return to Florida and ask for votes. A last-minute debate inside his campaign weighed the possibilities of focusing efforts in the swing states like Florida or in Democrat-rich states like New York and California, where "strategic voters" could vote Green without concern about affecting Gore's final tallies. Nader eventually decided he would get more media coverage by targeting places like Florida.

On the national level, Nader fell considerably short of his goal of achieving a 5 percent national vote that would have qualified the Green Party for millions in federal matching funds in 2004. When the votes were counted, Nader had pocketed 3 percent, or around 2.7 million votes--almost four times what his "uncampaign" garnered in 1996. Relentless pressure on potential Nader voters by liberal Democrats to switch to Gore clearly had an effect on the Green campaign, helping tamp down the final vote to almost half the level at which Nader had lately been polling.

No question but that this result is far from the best scenario for those who hoped that Nader's run this year would hand the Greens substantial future leverage. Given the failure to establish a federally funded national Green Party in the balloting, however, that future clout will depend mostly on Nader's ability and willingness to take his list of 75,000 campaign contributors (as well as countless volunteers and voters) and hone it into an identifiable political entity. That task could be rendered even more problematic by those who will blame Nader for a Gore defeat.

That said, various state Green parties will emerge from this week strengthened and positioned to make a difference in scores of Congressional and legislative districts. In some progressive-minded counties--like Humboldt and Mendocino in Northern California--the Nader vote grazed 13 to 14 percent. In many others the Greens scored 5 to 10 percent, making them a potential swing vote in future local elections. In this election, nationwide, some 238 Greens ran for municipal office, and fifteen were victorious.

In what had been considered virtual "Naderhoods"--several northern-tier states where the Greens had significant pockets of strength--the candidate's vote was less than spectacular. In Wisconsin, Washington and Oregon Nader finished with only 4 or 5 percent. Just six weeks ago, he was approaching 10 percent in Oregon. The Greens scored 5 percent in Minnesota--a figure they had been polling for some time--and they hit 6 percent in Montana, Maine, Massachusetts, Rhode Island and Hawaii. The Green high-water marks were in Vermont (7 percent) and Alaska (10 percent--down from 17 percent in some earlier polls).

In the Democratic strongholds of New York and California, where Al Gore won by huge margins and where a ballot for Nader was considered "safe" by those who called for strategic voting, the Greens ended up with a relatively disappointing 4 percent--the same number reached in New Mexico, where Greens have competed statewide for more than five years.

Predictions that the Greens would spoil Gore's chances failed to materialize. Washington, Minnesota, New Mexico, Michigan and Wisconsin--states where Democrats argued that Nader could swing the vote to the GOP--were all won by Al Gore. Even in Oregon, Nader's impact on the major party race was arguably negligible. At press time, Gore was losing the state by about 25,000 votes and Nader's total was 5 percent, or just over 50,000. But whether a sufficient number of the Nader votes would have gone to Gore is open to question. A national USA Today/CNN/Gallup Tracking poll a few days before the election found that only 43 percent of likely Nader voters would vote for Gore as their second choice. Twenty-one percent said they would vote for Bush second. And an equal number said they would vote for Nader or not at all.

What if they held a presidential election and neither guy won? Or a dead man from Missouri defeated an incumbent Republican senator?

As the media obsessed over the seesaw presidential poll, voters across the country quietly made their choices on more than 200 disparate ballot measures and initiatives. For progressives the results are--as usual--mixed.

First the bad news: Three campaign finance reform initiatives went the wrong way. Clean-money measures providing for full public financing were thumped in Missouri and Oregon. Similar measures had been passed in previous years by voters in Maine, Massachusetts and Arizona as well as by the legislature in Vermont--but this time around powerful, well-financed business lobbies weighed in, and dirty money beat clean money. In Oregon opponents ran an effective (and expensive) radio campaign that highlighted the out-of-state financial support for the reform and raised the specter of extremists running for office if it passed.

In Missouri corporate opponents--including Anheuser-Busch, KC Power & Light, Hallmark Cards and the Missouri Association of Realtors--poured hundreds of thousands into their victorious antireform campaign. Californians, meanwhile, approved Proposition 34, billed as campaign reform but actually cooked up by the establishment to block real reform. The returns on these three measures should compel campaign finance reform activists to rethink their strategies. These are significant and stinging defeats.

The good news is that the failed drug war was a loser in five of seven related measures nationwide. Medical marijuana initiatives passed in Colorado and Nevada (although a full marijuana-legalization bill failed in Alaska). Oregon and Utah voted to reform draconian drug forfeiture laws. And in California, Proposition 36, providing treatment instead of jail for first- and second-time drug offenders, passed easily. But a similar proposition failed in Massachusetts (which also refused to approve a universal healthcare proposal).

Another bright spot was public education. Voucher measures in California and Michigan were beaten by wide margins. Silicon Valley entrepreneur Tim Draper put up millions for the California proposal--to no avail. California voters also approved a measure that makes passage of school bonds easier. But bilingual education, banned in the Golden State two years ago, was also thrown out by Arizona voters. As he did in California, businessman Ron Unz fathered and funded the Arizona measure.

Colorado voters defeated the so-called informed consent measure on abortion, but Arizona and Nebraska approved a ban on same-sex marriages and civil unions. In Maine a measure to protect gays from discrimination was defeated. In Oregon the notorious Measure 9, which would have outlawed "teaching" homosexuality in schools, failed. Oregonians also rejected two antiunion "paycheck protection" measures, which the state labor federation had vigorously fought.

This issue goes to press on Wednesday, November 8, the day after the election, when all was supposed to have been decided, all was to be made clear. Instead, a great bewilderment has descended over the land. The recount of the vote in Florida, which might conceivably erase Bush's lead of a thousand or so votes and give the state and the presidency to Al Gore, has begun but not been completed. As I write, the numbers are changing hourly, and no two news outlets seem to have the same ones at the same time. There seems to be some fuzzy math going on down in Florida. Meanwhile, we do know that Al Gore has won the popular vote and faces the possibility that the will of the people will be annulled by the Electoral College. In short, we do not know at present who the next President will be or whether, when we do know, the people will have wanted that man.

Ordinarily, journalists hate situations like this, in which the deadline for elucidating a momentous event descends just before the event occurs. We are required, it seems, either to qualify our comments to the point of meaninglessness or else to pen words so vague and general that they will cover all contingencies. Rich as the arts of pontificating are, these occasions seem to stretch them to the breaking point. ("Whoever wins the White House, one thing is clear, the democracy of this great land..." and so forth.)

On this particular occasion, however, the situation is different. History, giving hard-pressed journalists a hand, has, by declining to produce a victor, provided for the time being the perfect metaphor for the campaign that has now ended. In the campaign the choices offered by the two parties were more obfuscated than clarified, more concealed than revealed. Gore decided to distance himself from his partner in the White House, Bill Clinton, declaring himself to be "my own man," assuring the voters that "I will never let you down" and preventing Clinton from going out on the hustings. It was the fundamental strategic decision of the Gore campaign. Yet the reason for it--the scandals that led to the impeachment of Clinton--was never mentioned by Gore. In consequence, impeachment, the most important political event of the last decade, and the one with the most important bearing on the fitness of the Republican Party to be placed in positions of trust and authority, went undiscussed by the Democrats. Had the impeachment been a necessary remedy for a grave danger to the Republic from President Clinton, or had it been (as I believe) a reckless abuse of power by the Republicans? No question was more in need of an answer in this year's election, but none went more thoroughly unaddressed. Gore's decision even prevented him from taking adequate credit for the Clinton Administration's economic successes.

The Republicans, for their part, waged what E.J. Dionne of the Washington Post rightly called a "stealth campaign." They had held the majority in Congress for six years, yet the Congressional Republicans were all in hiding, and their self-described "revolution" of the nineties--including, for example, their attempt to eviscerate environmental law, their attempt to shut down the Department of Education and their shutdown of the federal government--also went down the memory hole. They opportunistically took their stand on Democratic issues--a plan for prescription drugs, a plan for saving Social Security, a plan for education. Only Bush's proposal for an across-the-board tax cut was in keeping with the recent Republican record. (The art of winning elections by stealing the other party's issues is one they appear to have learned from Clinton.)

Astonishingly, the Republicans even pre-empted the impeachment issue--though without mentioning it explicitly any more than Gore had. Bush spent the final week of the campaign attacking the "partisan bickering" in "Washington," as if it had been the Democrats who had tried to impeach a Republican President for frivolous reasons rather than the other way around. Thus did the impeachment issue control the candidates' decisions without being discussed by them. Almost the only issue given a really thorough airing was the entirely jolly one of how to pass out the trillions of dollars of the budget surplus (how much in prescription drug benefits? how much in tax cuts?)--trillions that may never in fact materialize and that the current Congress has in any case been busily spending.

Had the outcome of the election been known today, a tidal wave of interpretation of the results no doubt would already be rolling over us. It is well that it was stopped. It is better to reflect for a moment on our political confusion. The contest, even when it produces a winner, will not have provided a basis for generalizations regarding the public mind. A foggy campaign has ended in a deep fog, as if the people, not having been offered a true choice, have simply decided not to choose.

It wasn't exactly a reversal of 1994, but in this year's Senate races Democrats erased much of the Republican majority that was established in that year of Grand Old Party hegemony. Democrats picked up at least five and perhaps six Republican-held seats, while losing just two. In several cases, the shifts were dramatic, replacing staunch conservatives with far more progressive legislators.

For example, Minnesota Democrat Mark Dayton, a liberal department store heir, displaced Republican incumbent Rod Grams, a conservative hundred-percenter famous for dismissing Senator Paul Wellstone's attempt to lodge a criticism of Chinese human rights abuses as "demagoguery." Missouri Republican John Ashcroft, perhaps the Senate's most outspoken advocate of the Christian-right agenda, will be replaced by Jean Carnahan, the widow of liberal Missouri Governor Mel Carnahan (although the threat of a legal challenge remains). Michigan Democrat Debbie Stabenow, a House member whose strong labor record won her passionate support from the powerful United Auto Workers union, beat Republican incumbent Spencer Abraham, a former aide to Dan Quayle who frequently zeroed out on labor's voting-record ratings. One immediate change could come in the area of campaign finance reform: Senator Russ Feingold of Wisconsin says he now believes there will be sufficient support in the Senate to get the McCain-Feingold Campaign Finance Reform bill passed. "The question now is whether we'll have a Democratic President, who will sign the bill, or a Republican, who will veto it," says Feingold.

Perhaps the biggest Senate disappointment of the night was the 51-to-48 loss of Brian Schweitzer, a Montana rancher who used his campaign to dramatize the high cost of drugs for seniors and in the final weeks closed the gap on antienvironment, antilabor incumbent Conrad Burns.

There was even more disappointment in the results in the House, where an expensive two-year Democratic drive to win the seven Republican seats needed to take back the House failed. While Clinton impeachment manager Jim Rogan went down in California, and Arkansas's Jay Dickey may have been beaten in part because of his support for removing the state's number-one homeboy, most targeted Republican incumbents withstood the challenge. At least one Democrat with a good record, Connecticut's Sam Gejdenson--a consistent liberal with a strong international affairs bent--was ousted, and several progressive newcomers, including California's Gerrie Schipske and Montana's Nancy Keenan, have been beaten. But Illinois's Lane Evans beat back a meanspirited Republican challenge, and new California Representative Mike Honda won with strong backing from labor, as did New York's Steve Israel, who won the Long Island seat vacated by Republican Rick Lazio--the man First Lady Hillary Clinton bested in the nation's highest-profile Senate contest.

At the state level, the nastiest fight was in Vermont. There, in a campaign that turned into a referendum on the question of whether gay and lesbian couples should be allowed the rights of heterosexual couples, Vermont Governor Howard Dean, who signed the state's groundbreaking yet controversial "civil union" law, won a decisive victory in his campaign for a fifth term as governor. Vermont Progressive Party gubernatorial candidate Anthony Pollina surprised observers by racking up 10 percent of the vote and helping elect several Progressive Party candidates to legislative seats.

North Dakota Democrat Heidi Heitkamp, the state's progressive populist attorney general, lost a race for the governorship. But Delaware Lieutenant Governor Ruth Ann Minner, who dropped out of high school at 16 to work on the family farm and returned to school after being widowed at age 32, was elected governor. Her win means there will now be five women governors--the highest number in history.

A key result of the elections at the state level will be their impact on redistricting, which will shape Congress for years to come. Democrats now control forty-nine state legislative houses of a total of ninety-eight, while Republicans control forty-five. Four legislative chambers are evenly tied. Democrats won control of the Colorado Senate and the Washington Statehouse Tuesday. But they lost the Vermont House, with thirteen incumbents being swept from office. "It was all the civil-union fight," says Kevin Mack of the Democratic Legislative Campaign Committee. "Howard Dean won up there, but a lot of good Democrats lost on the issue."

Columns

Scheer

When George W. Bush spokesman James A. Baker III termed the fight over the Florida vote recount "a black mark on our democracy," he couldn't have been more wrong. At the time he said it on Sunday, Bush was ahead in Florida by a mere 288 votes, and of course the full recount, required by Florida law, is in order, as a federal judge ruled Monday.

Anyway, since when are political tumult and democracy a bad mix? Never in our recent history has the vitality of our democracy been on such splendid display, and it's disheartening that so many frightened politicians and pundits are panicked by this whiff of controversy.

What's wrong with a bit of electoral chaos and rancor? The post-electoral debate over a rare photo finish is just the stuff that made this country great. People should be outraged if their votes were improperly counted--the founding fathers fought duels over less.

We have lectured the world about the importance of fair elections, and we cannot get away with hiding the imperfections of our own system. Ours is not so imperfect as to require international observers for a full-scale investigation under UN supervision, yet it is controversial enough to fully engage the public. An election that once threatened to be boring beyond belief has turned into a cliffhanger that is now more interesting than reality-based TV entertainment. Indeed, it is reality-based TV entertainment.

Not since John F. Kennedy eked out a suspicious victory over Richard M. Nixon in 1960 has the proverbial man in the street been so caught up in the nuances of the electoral process. People who didn't even realize we had an electoral college are now experts on it. But instead of celebrating an election that people are finally excited about, driving home the lesson for this and future generations that every vote counts, the pundits are beside themselves with despair.

What hypocrites. They love every moment of increased media exposure for themselves, while darkly warning of the danger to our system. Their fears are nonsense. What is being demonstrated is that the system works: Recounts, court challenges and partisan differences are a healthy response to an election too close to call.

The fear-mongers hold out two depressing scenarios, one being that the people will lose faith in the electoral process, the other that whoever wins the election will be weakened for lack of a mandate.

As to the former, the electoral process has never seemed more vital. Some who voted for Ralph Nader may be second-guessing their choices, and states such as Florida and Oregon with primitive voting systems will no doubt come into the modern age, but apathy has been routed, and next time around, the presidential vote count will be the highest ever.

True, the candidate who finally wins will be weakened. He should be. An election this close hardly provides the winner with a compelling mandate, particularly if it is Bush, who may win the electoral college majority while Al Gore is declared the winner of the popular vote. If that turns out to be the case, Bush ought to tread with caution.

Compromise is good when not only the President is without a mandate but so, too, are the House and the Senate because of their razor-thin outcomes. The country has come through eight incredibly prosperous and relatively peaceful years, so why the rush to march down some new uncharted course? Later for privatizing Social Security, a huge tax cut for the super-rich and a $160 billion missile defense system--three mad components of the core Republican program.

As for the Democrats, with or without Gore as President, it will be the season for nothing more ambitious than damage control. With Gore, the main weapon of reason would be not bold new programs that Congress would ignore but rather the threat of a veto to stop Republican mischief. Without Gore, the responsibility will fall on the Democratic minority in both houses of Congress to engage in a principled holding action preparing for a congressional majority in 2002.

Odds are that Bush will be the President, presiding over a nation that, by a clear margin in the popular vote, rejected him for Gore. If Bush wins the office, his challenge will be to prove that the moderate face he presented during the election is truly his. If it isn't, and he attempts to be a hero to the right wing of his party, he will wreck the GOP. Clearly, future political power resides with the vibrant big cities and modern suburbs, the sophisticated hot spots of the new economy, which went for Gore, and not the backwater rural outposts that turned out to be Bush country largely because men remain obsessed with their guns.

Stop the Presses

Providence put me on a panel debating the Gore/Nader choice with Cornel West at New York University in late October. Most of the audience was for Nader, and the lineup on stage did nothing to improve those odds.

Before the debate began, its organizers took a few moments to speak on behalf of the university's graduate students' struggle for unionization. So did West, who had been handed a flier about it from the floor. And as a man about to lose a debate (and a longtime grad student as well as an occasional NYU adjunct faculty member), I was happy for the interruption. Days later, the National Labor Relations Board set an important precedent by ruling in favor of the students. But here's what I don't understand. How can the student union supporters also be Nader supporters? Nonsensical "Tweedledee/Tweedledum" assertions to the contrary, only one party appoints people to the NLRB who approve of graduate student unions, and only one appoints people to the Supreme Court who approve of such NLRB decisions. No Democrat in the White House, no graduate student union; it's that simple. An honest Nader campaign slogan might have read, "Vote your conscience and lose your union...or your reproductive freedom...your wildlife refuge, etc., etc."

Well, Nader's support collapsed, but not far or fast enough. In the future, it will be difficult to heal the rift that Nader's costly war on even the most progressive Democrats has opened. Speaking to In These Times's David Moberg, Nader promised, "After November, we're going to go after the Congress in a very detailed way, district by district. If [Democratic candidates] are winning 51 to 49 percent, we're going to go in and beat them with Green votes. They've got to lose people, whether they're good or bad." It's hard to imagine what kind of deal can be done with a man whose angriest rhetorical assaults appear reserved for his natural allies. (The vituperative attacks on Nader, leveled by many of my friends and cronies on the prolabor democratic left, were almost as counterproductive, however morally justified.) But a deal will have to be done. Nader may have polled a pathetic 2 to 3 percent nationally, but he still affected the race enough to tip some important balances in favor of Bush and the Republicans. He not only amassed crucial margins in Florida, New Hampshire and Oregon; he forced progressive Democrats like Tom Hayden, Paul Wellstone, Ted Kennedy and the two Jesse Jacksons to focus on rear-guard action during the final days rather than voter turnout. If this pattern repeats itself in future elections, Naderite progressives will become very big fish in a very tiny pond indeed.

Perhaps a serious Feingold or Wellstone run at the nomination with a stronger platform on globalization issues will convince those die-hard Naderites to join in the difficult business of building a more rational, Christian Coalition-like bloc to counter corporate power within the party. For now, we can expect an ugly period of payback in Washington in which Nader's valuable network of organizations will likely be the first to pay. Democrats will no longer return his calls. Funders will tell him to take a hike. Sadly, his life's work will be a victim of the infantile left-wing disorder Nader developed in his quixotic quest to elect a reactionary Republican to the American presidency.

* * *

Giving Nader a run for his money in the election hall of shame are the mainstream media. Media portraits of both candidates were etched in stone, with nary a fact or figure allowed to intrude upon the well-worn script. Bush was dumb and Gore a liar; pretty much nothing else was allowed in the grand narrative. Like Nader, reporters assumed the enormous policy differences between Gore and Bush--on Social Security, prescription drugs, education, affirmative action, abortion rights, the environment--to be of trivial importance, hardly worth the time and effort to explain or investigate. The media's treatment of this election as a popularity contest rather than a political one between two governing ideologies was an implicit endorsement of the Bush campaign strategy, as the issues favored Gore. But even so, Bush was usually treated like some pet media cause. With regard to such consequential questions as his political program, his political experience, his arrest record, his military service, his business ethics, Bush was given a free pass by media that continued to hound Gore about whether he was really the model for Oliver in Love Story--which, by the way, he was. I guess being a Bigfoot journalist means never having to say you're sorry.

* * *

One election development that had to gladden New Republic owner Marty Peretz's heart was how bad it was for the Arabs. I got a call one day from a Republican Party functionary telling me that Hillary Clinton supported a Palestinian state and took money from groups that supported terrorist organizations "like the one that just blew up the USS Cole." I told the sorry sonofabitch that like Israel's Prime Minister, I, too, support a Palestinian state. And, if there was any justice in the world, Hillary's "terrorist" friends would blow up Republican headquarters while we were still on the phone, so I could enjoy hearing the explosion.

This heavy-handed bit of racist manipulation grew out of a story published, surprisingly, not in Rupert Murdoch's New York Post but in the putatively responsible and nominally liberal New York Daily News, owned by Mortimer Zuckerman. It was inspired by the machinations of one Steven Emerson, a discredited "terrorism expert" last heard trying to pin the Oklahoma City bombing on the Arabs by noting that "inflict[ing] as many casualties as possible...is a Middle Eastern trait." Each actor played a dishonorable role in the tawdry drama: The Daily News invented the story. The Lazio campaign brazenly exploited it. Hillary Clinton's campaign capitulated to it. Together with the media coverage of the main event, this mini-drama will go down in history as further evidence of that unhappy nostrum of American politics that this year seems to have escaped everyone from the Nader die-hards to Palestinian militants: Things can always get worse.

A few days before the election, I accompanied a friend to the dentist's office. It was one of those situations in which appearance overtakes the more complex realities of who we are. I was a middle-aged black woman assisting an elderly white man. That he's a wild old radical who browbeats the mad law professor in me with Russian ideologues and German philosophers probably wasn't what most people saw as we toddled down the street arm in arm, cane in hand. In the vast warren of the medical center, we became even more invisible in a waiting room filled with physically fragile patients, many of whom had been brought there by female caretakers of color.

Perhaps because of some such condescension, we became privy to a loud conversation floating out the not-quite-closed door of the office next to which we were sitting. One of the doctors was chatting with a patient, expressing his general pique at the world in familiar, often contradictory clichés. He was upset at the loss of standards in schools. He pitted merit against equality and paired merit with white, Jewish and Asian students. He insisted that "we are not all equal" and concluded that affirmative action was inherently immoral. A few minutes later he blamed white liberals for abandoning standards and praised as standard-bearers those blacks who support vouchers. "The problem is" minorities who teach their children to hate white people. He said that "blacks are out of control" and that black leaders "are not taking responsibility." He cited Al Sharpton, Marion Barry and Louis Farrakhan as typical black leaders, and he rattled on against substance abuse in the inner cities and guns in the hands of young blacks who will never make it into the middle class, because they don't study and don't have good table manners.

"Bite down," he said as he finished with a paean of support for "zero tolerance" policies, standardized testing and George W. Bush.

George W. Bush! I shook my head wonderingly. If only he were black. It's one of those things we black people think about a lot: If only this or that one were black. Can you imagine, we tell each other.

Just think where a black man who spent more than half his adult life as a substance abuser would be--a black man who had a conviction for drunk driving and a notoriously bad attitude. Is it too obvious to point out that George Bush and Dick Cheney--who has two convictions for drunk driving--share a certain equality of status with Marion Barry?

Just think where a sneering black frat brother who committed gross grammatical butcheries and called Greeks Grecians would be. What fun Abigail Thernstrom could have questioning why unqualified upper-class whiners like that should be admitted to "first tier" universities like Yale and Harvard. (I guess we're supposed to feel better that Cheney flunked out of Yale on his own merits.)

Just think of where a black businessman with a "winning" personality but a losing financial record would be when he showed up to buy that team franchise. Assuming he could get a job way down in the corporate food chain, you can bet they wouldn't let him anywhere near the cash register.

Imagine a black politician who was so loudmouthed that his own family called him "bombastic," who proffered opinions about nations whose names he hadn't bothered to learn or badly mispronounced and who created an international incident by falsely accusing the Russian Prime Minister of stealing from the IMF. If you're thinking Al Sharpton, think again.

Imagine a black leader who began his campaign for office at a university that historically advocated racial separatism as God's law and that published materials describing Judaism as heretical and Catholicism as a "cult." I do wonder how it is that George W. can wander through so much of Louis Farrakhan's metaphysical territory and still come out looking like someone whose morals so many Americans say they can look up to.

I do not draw such analogies simply to relativize. The more important point, I think, is one related to what I sometimes call innocence profiling. If George W. Bush were black, he would be a classic suspect profile. If he were Driving While Black, there are people who would have forgiven police if they had decided to shoot at his drunkenly weaving car on that dark Maine highway (as New Jersey troopers shot at that van full of perfectly sober, cooperative college students). If he had been black, we might have heard Mayor Rudolph Giuliani describing him as "no altar boy" (as he described Patrick Dorismond, a security guard "accidentally" shot and killed by the NYPD).

But of course, George W. Bush is not black, and thus it is, perhaps, that the New York Times instead ran an article describing him as having tamed his "inner scamp" and entered "midlife redemption"--even as it went on to describe the supposedly redeemed man-who-would-be-Commander-in-Chief as having behaved so insultingly and inappropriately toward Queen Elizabeth at a state dinner in 1991--five years after he says he gave up alcohol--that a horrified Barbara Bush promised the Queen to seat him far away from Her Majesty, "for fear of him saying something."

The lesson of equality is, at its heart, related to the question of double standards: There are still too many examples in American society of the degree to which we have zero tolerance for disreputable black behavior and seemingly unlimited indulgence when whites behave the very same way.

Anyway, back at the medical center, the dentist's door flew open. "Next!" called out the doctor.

"Now set the teeth...," growled my dear old friend and lefty warrior as he marched into the office to face needles, drills...and more. "It won't be so bad," smiled the dentist unsuspectingly.

But my friend had been quoting Shakespeare's Henry V. "Teach them how to war..." he went on and winked at me. The door shut softly behind them.

The Journal of Nutrition wrote about
Some researchers who tested chocolate out.
It may help stave off heart attacks, they claim.
Red wine, we've known for years, can do the same.
Without the need of any doctor's urging,
I feel a healthy diet plan emerging.

So it all came out right in the end: gridlock on the Hill and Nader blamed for sabotaging Al Gore.

First a word about gridlock. We like it. No bold initiatives, like privatizing Social Security or shoving through vouchers. No ultra-right-wingers making it onto the Supreme Court. Ah, you protest, but what about the bold plans that a Democratic-controlled Congress and Gore would have pushed through? Relax. There were no such plans. These days gridlock is the best we can hope for.

Now for blaming Nader. Fine by me if all that people look at are those 97,000 Green votes for Ralph in Florida. That's good news in itself. Who would have thought the Sunshine State had that many progressives in it, with steel in their spine and the spunk to throw Eric Alterman's columns into the trash can?

And they had plenty of reason to dump Gore. What were the big issues for Greens in Florida? The Everglades. Back in 1993 the hope was that Clinton/Gore would push through a cleanup bill to prevent toxic runoff from the sugar plantations south of Lake Okeechobee from destroying the swamp that covers much of south-central Florida. Such hopes foundered on a "win-win" solution brokered by sugar barons and the real estate industry.

Another issue sent some of those 97,000 defiantly voting for Nader: the Homestead Air Force Base, which sits between Biscayne National Park and the Everglades. The old Air Force base had been scheduled for shutdown, but then Cuban-American real estate interests concocted a scheme to turn the base into a commercial airport. Despite repeated pleas from biologists inside the Interior Department as well as from Florida's Greens, Gore refused to intervene, cowed by the Canosa family, which represented the big money behind the airport's boosters. Just to make sure there would be no significant Green defections back to the Democratic standard, Joe Lieberman made a last-minute pilgrimage to the grave of Jorge Mas Canosa.

You want another reason for the Nader voter in Florida? Try the death penalty, which Gore stridently supported in that final debate. Florida runs third, after Texas and Virginia, as a killing machine, and for many progressives there it is an issue of principle. Incidentally, about half a million ex-felons, having served sentence and probation, are permanently disfranchised in Florida. Tough-on-crime drug-war zealot Gore probably lost crucial votes there.

Other reasons many Greens nationally refused to knuckle under and sneak back to the Gore column? You want an explanation of why Gore lost Ohio by four points and New Hampshire by one? Try the WTI hazardous-waste incinerator (world's largest) in East Liverpool, Ohio. Gore promised voters in 1992 that a Democratic administration would kill it. It was a double lie. First, Carol Browner's EPA almost immediately gave the incinerator a permit. When confronted on his broken pledge, Gore said the decision had been pre-empted by the outgoing Bush crowd. This too was a lie, as voters in Ohio discovered a week before Election 2000. William Reilly, Bush's EPA chief, finally testified this fall that Gore's environmental aide Katie McGinty told him in the 1992 transition period that "it was the wishes of the new incoming administration to get the trial-burn permit granted.... The Vice President-elect would be grateful if I simply made that decision before leaving office."

Don't think this was a picayune issue with no larger consequences. Citizens of East Liverpool, notably Terri Swearingen, have been campaigning across the country on this scandal for years, haunting Gore. So too, to its credit, has Greenpeace. They were active in the Northeast in the primaries. You can certainly argue that the last-minute disclosure of Gore's WTI lies prompted enough Greens to stay firm and cost him New Hampshire, a state that, with Oregon, would have given Gore the necessary 270 votes.

And why didn't Gore easily sweep Oregon? A good chunk of the people on the streets of Seattle last November came from Oregon. They care about NAFTA, the WTO and the ancient forests that Gore has been pledging to save since 1992. The spotted owl is now scheduled to go extinct on the Olympic Peninsula within the next decade. Another huge environmental issue in Oregon has been the fate of the salmon runs, wrecked by the Snake River dams. Gore thought he'd finessed that one by pretending that, unlike Bush, he would leave the decision to the scientists. Then, a week before the election, his scientists released a report saying they thought the salmon could be saved without breaching the four dams. Nader got 5 percent in Oregon, an amazing result given the carpet-bombing by flacks for Gore like Gloria Steinem.

Yes, Nader didn't break 5 percent nationally, but he should feel great, and so should the Greens who voted for him. Their message to the Democrats is clear. Address our issues, or you'll pay the same penalty next time around. Nader should draw up a short list of nonnegotiable Green issues and nail it to the doors of the Democratic National Committee.

By all means credit Nader, but of course Gore has only himself to blame. He's a product of the Democratic Leadership Council, whose pro-business stance was designed to regain the South for the Democrats. Look at the map. Bush swept the entire South, with the possible exception of Florida. Gore's electoral votes came from the two coasts and the old industrial Midwest. The states Gore did win mostly came courtesy of labor and blacks.

Take Tennessee, where voters know Gore best. He would have won the election if he'd carried his home state. Gore is good with liberals earning $100,000-$200,000. He can barely talk to rural people, and he made another fatal somersault, reversing his position on handguns after telling Tennessee voters for years that he was solid on the gun issue. Guns were a big factor in Ohio and West Virginia, too. You can't blame Nader for that, but it's OK with us if you do. As for Nader holding the country to ransom, what's wrong with a hostage-taker with a national backing of 2.7 million people? The election came alive because of Nader. Let's hope he and the Greens keep it up through the next four years. Not one vote for Nader, Mr. Alterman? He got them where it counted, and now the Democrats are going to have to deal with it.

Articles

DNA testing can convict the guilty; it can also destroy the privacy of millions.


In New Mexico, communists who fail to register their party affiliation with the state commit a felony. Under New Mexico's DNA databanking law, if they are caught they are required to submit a DNA sample to the department of public safety. In Idaho, consensual sodomy with a partner other than your spouse constitutes a sex-crime felony. Those unfortunate enough to be caught in the act are similarly required by law to submit a tissue sample to the state's DNA databank for the purposes of preventing future sex crimes. And if Governor George Pataki is successful in the next legislative session, New York will begin collecting genetic material from any person convicted of a misdemeanor, such as resisting arrest or disorderly conduct as a result of peaceful civil disobedience.

In an age of biotechnology and computers, we are all but a needle-stick away from disclosing hereditary-disease susceptibilities, familial relationships and identifying information. Anyone who values privacy should therefore be concerned that US law-enforcement agencies are amassing ever larger portions of the general population's DNA while neglecting to implement measures that would protect the privacy and presumptive innocence of citizens. And because DNA evidence is currently enjoying an unprecedented degree of bipartisan enthusiasm, these gradual developments have tended to be sheltered from the criticism that might otherwise confront such policies.

Not that DNA evidence's celebrity isn't well deserved. It is many rape victims' best hope for identifying their assailants and law enforcement's most penetrating method of apprehending serial offenders. It can be credited with triggering a re-examination of the nation's capital punishment system by exonerating eight death-row inmates. Like the fingerprint before them, DNA profiles are a reliable means of identifying individuals (except in the case of identical twins). But glib analogies to fingerprints obscure important differences. DNA samples can reveal far more information than fingerprints, including sensitive medical conditions, traits or a person's biological parentage. In addition, while fingerprints are unique to every individual, genetic profiles are partially shared among blood relatives. Thus, databanks contain identifying information on nonoffending relatives of people explicitly covered by databanking statutes. Finally, because we shed our genetic calling cards in a trail of hair follicles, skin flecks, saliva aerosols and finger smudges, DNA can also provide a trace of our activities.

DNA databanks are premised on statistics indicating that individuals convicted of a serious violent offense often commit other violent offenses that leave behind incriminating DNA. Tissue samples, usually in the form of a blood sample or cheek swab, are thus collected from offenders covered by their state's databank laws and are analyzed using a technique called "profiling," which detects genetic variations among individuals that, at least as currently understood by geneticists, have no biological function. The resulting data are then computerized so that profiles produced from crime-scene samples can be compared with those already in the database, allowing authorities to eliminate certain suspects or target those whose profiles match. In effect, databanks provide a means of genetically frisking anyone who has ever committed a covered offense for any crime in which DNA has been recovered.
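For readers who want the mechanics made concrete, here is a minimal sketch, in Python, of the kind of comparison just described. Everything in it is a hypothetical simplification for illustration: the locus names, the allele numbers and the exact-match rule. A real forensic system such as CODIS rests on validated test kits, match statistics and quality controls far beyond this toy.

# Illustrative sketch only: a toy model of the comparison step described
# above. Locus names and allele values are hypothetical.

# A "profile" maps each tested locus to the unordered pair of allele
# repeat counts observed there (one allele inherited from each parent).
def make_profile(loci):
    """Normalize raw allele pairs into an order-independent profile."""
    return {locus: frozenset(pair) for locus, pair in loci.items()}

def is_full_match(scene, candidate):
    """True if every locus typed in the crime-scene sample agrees
    exactly with the candidate's alleles at that locus."""
    return all(candidate.get(locus) == alleles
               for locus, alleles in scene.items())

# Hypothetical databank of offender profiles.
databank = {
    "offender-001": make_profile({"D3": (14, 16), "TH01": (6, 9)}),
    "offender-002": make_profile({"D3": (15, 17), "TH01": (7, 9)}),
}

crime_scene = make_profile({"D3": (14, 16), "TH01": (6, 9)})

hits = [name for name, profile in databank.items()
        if is_full_match(crime_scene, profile)]
print(hits)  # ['offender-001']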

As of June 1998 all fifty states had enacted statutes authorizing state and local law-enforcement agencies to operate criminal DNA databases and to pool their DNA profiles into a national FBI-operated database called CODIS (Combined DNA Identification System). Though the earliest laws targeted convicted violent sexual felons, civil libertarians looked to the history of Social Security numbers, fingerprinting and drug-testing to warn of an inevitable migration of the technique from convict to suspect terrain. A decade later, as many states have passed laws to cover new offender categories, the Cassandras appear to have been vindicated. Delaware, for instance, requires submission of genetic samples for all those who have committed offenses against children, which include selling tobacco or tattooing minors without the consent of a guardian. Twenty-three states cover certain categories of misdemeanors, and seven states have enacted legislation that would require DNA submission for any felony, which extends DNA databanking into realms such as perjury, larceny, bribery and fraud. Thus, in addition to New Mexico's statute covering unregistered communists, Alabama's code covers tax evaders and Virginia's targets people who deface brands or marks on timber. Experts like CODIS program director Steve Niezgoda have predicted that all states will eventually amend their statutes to cover all felonies; four states have already done so, and another three have recently considered or will consider such an expansion in their next legislative sessions. Among these three, New York's proposal stands out as by far the nation's most comprehensive, targeting all convicted felons and class-A misdemeanants.

DNA databanking laws are furthermore part of the ferment that is corroding the century-old juvenile justice system that treats minors as a category of offenders separate from adults. More than half of all states authorize inclusion of DNA profiles collected from juveniles in their databanks. In contrast to the convention of sealing or erasing juvenile criminal records after a period of time--a practice grounded on a rehabilitative ideal--none of the statutes require states to remove juvenile DNA profiles from their databanks, and one (Arizona's) expressly prohibits their removal. Several states have revised their original legislation to cover juvenile offenders as well. The spread of DNA databanking to minors is especially troubling when considered against the racial inequities that plague the juvenile justice system. According to Vincent Schiraldi, president of the Center on Juvenile and Criminal Justice, "When you control for poverty, white and black [teens] commit the same amount of violent crime, [but] blacks are arrested at four times the rate of whites and imprisoned at seven times the rate of whites. So don't think for a second this databank will be race-neutral. This policy will grossly overrepresent criminal behavior by blacks and exacerbate disparities in incarceration because [databanks are] going to be used against people."

An indirect consequence of expanding DNA databanks is that they partially cover a growing number of nonoffending relatives. Because individuals share portions of their DNA with biological relatives--half in the case of siblings, parents and children--an incomplete match between a databanked person's profile and that of a crime-scene sample might lead investigators to question an individual's immediate family. The effect of such profiling by proxy is that identifying information about nonoffenders is present in criminal databank systems too; in effect, if you have a relative whose profile has been databanked, you're likely to be partially genetically frisked as well.
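The arithmetic behind profiling by proxy can be sketched the same way. Continuing the toy representation above--again with invented data and an invented rule of thumb, not any actual forensic protocol--a profile that matches the crime-scene sample at one allele per locus, across every locus, is consistent with a parent, child or sibling of the person sampled:

# Continuing the sketch above: a parent and child share at least one
# allele at every locus, so a consistent one-allele overlap across all
# loci--short of a full match--can point toward a relative (invented data).

def shares_allele_everywhere(crime_scene, databanked):
    """True if the profiles share at least one allele at every typed locus."""
    return all(
        set(alleles) & set(databanked.get(locus, ()))
        for locus, alleles in crime_scene.items()
    )

def possible_relative(crime_scene, databanked):
    """Flag a partial but locus-by-locus-consistent match."""
    full_match = all(
        set(alleles) == set(databanked.get(locus, ()))
        for locus, alleles in crime_scene.items()
    )
    return shares_allele_everywhere(crime_scene, databanked) and not full_match

# This sample shares exactly one allele per locus with the
# offender-002 profile from the sketch above.
crime_scene = {"D3S1358": (15, 16), "vWA": (14, 17), "FGA": (20, 24)}
offender_002 = {"D3S1358": (16, 16), "vWA": (17, 18), "FGA": (20, 22)}

print(possible_relative(crime_scene, offender_002))  # True

Real familial searches weigh likelihood ratios over many more loci, but the point stands: a near-miss in the databank can direct suspicion at people who were never themselves sampled.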

A critical unresolved question about current databanking practices concerns what law-enforcement agencies actually do with their frozen vials of human tissue. The human genome contains approximately 100,000 different genes, many of which are associated with specific illnesses. Though DNA profiles have few applications beyond linking individuals to biological specimens, the actual tissue samples submitted by offenders could in principle be analyzed for genetic traits ranging from sickle-cell anemia to schizophrenia. Since evolving typing techniques may one day outmode profiles currently being entered into computers, authorities in more than half the states are authorized or required by law to archive their samples so they can be retested. This sustains the possibility that samples may eventually be used for purposes other than profiling.

Most statutes restrict sample use to "law enforcement"--a term whose broadness in this context can only be described as oceanic. Twenty states allow law-enforcement agencies to use samples for research on improving forensic techniques, which could mean searching banked DNA samples for genetic predictors of recidivism, pedophilia or aggression. One Massachusetts legislator publicly advocated such a use, and Tom Callaghan, program manager of the FBI's Federal Convicted Offender DNA Database, refused to rule out such a possibility when pressed at a National Institute of Justice Symposium in September 1999. Moreover, tissue repositories created by databanks would provide genetics researchers with congenial waters in which to trawl for genes thought to be involved in criminal behavior. Alabama's databanking law brushes perilously close to this by authorizing release of anonymous DNA population data collected by law-enforcement authorities to "assist in other humanitarian endeavors including, but not limited to, educational research or medical research or development."

Experimenting with offender DNA in this way would violate basic tenets of biomedical ethics by using tissues that were not obtained by consent for purposes that arguably run counter to the interests of the research subject. "If [law-enforcement authorities] want to do research," argues Boston University bioethicist George Annas, "they should follow the same rules everyone else has to follow in terms of informed consent and privacy.... Criminals have privacy rights like everyone else." Using databanked samples for research without consent also runs counter to recommendations of the American College of Medical Genetics.

Such research authorizations are especially troubling in light of this nation's checkered history of experimentation on prisoners. In 1875 social reformer and prison inspector Richard Dugdale wrote his famous study of the Jukes family after he noticed a disproportionate number of inmates with that last name. The availability of banked criminals' tissues may prove a valuable resource should society's interest in genetic explanations for social ills be renewed.

Legal challenges to DNA database laws have generally failed and are therefore unlikely to stem their widening sweep. Practices in Britain, the first country to enlist DNA in its crime-fighting cavalry, may portend dramatically widened use of databanking in the United States. Britain's Forensic Science Service is authorized to collect DNA samples from anyone questioned about or suspected of any offense for which a person could be detained. As of July 1999, England had collected 547,000 DNA samples; the effort was projected to reach 30 percent of British men eventually. In addition, England has conducted at least eighty "intelligence-based screens"--the official term for what is colloquially called a "genetic sweep"--in which the general population is asked to submit DNA samples to help police investigate a particular crime. Although samples are provided voluntarily, social pressure, heavy media coverage and the concern that failure to submit a sample may itself invite police suspicion undermine the notion that submissions are truly consensual. Other countries, including Canada and Germany, have conducted similar sweeps, and while some argue that the Fourth Amendment would probably bar such practices in the United States, privacy watchdogs like New York Civil Liberties Union's executive director Norman Siegel caution that "Fourth Amendment challenges [of databanks] have not been successful; these are the only reference points we have [for predicting how courts will rule on genetic sweeps], and they're not promising."

The next battle between civil libertarians and law-enforcement authorities over DNA databanking is likely to be fought over the leap from profiling convicted felons to profiling arrestees. Former NYPD chief Howard Safir has championed arrestee profiling, and US Attorney General Janet Reno has begun to explore the implications of such a policy by querying a National Institute of Justice commission. Arrestee profiling would dramatically broaden the reach of DNA databanking and, if not subject to careful restrictions, would empower law-enforcement authorities to arrest people for minor offenses, collect a tissue sample and search their databases for a match between the arrestee's profile and another crime-scene sample. Despite widespread enthusiasm in law-enforcement circles, profiling on such a scale isn't likely to be implemented anytime soon, given the backlog of tissue samples awaiting profile analysis and the high costs (at least $100 per sample). Nevertheless, one state (Louisiana) profiles arrestees for sexual offenses, and advancing automation technologies are likely to erode these fiscal barriers.

Though this is reason for despair among privacy advocates, there are a few hopeful signs among the various statutes. Twenty-seven states (and the federal government), for example, prohibit disclosure of genetic materials or information to unauthorized third parties. Wisconsin requires that law-enforcement authorities eliminate DNA samples of convicted persons after profiling is complete, and six states (Indiana, New Hampshire, Rhode Island, Texas, Vermont and Wyoming) restrict what authorities can do with collected DNA by prohibiting analysis of genetic mutations that could predict a person's traits. But in an environment where the political leaders most likely to raise objections to such policies are often silenced by a fear of appearing to be soft on crime, the stability of these protections remains to be seen.

Imagining a fair and protective system for using DNA evidence in the criminal justice system isn't all that difficult. People claiming innocence should be given opportunities to volunteer DNA to clear their names. For them--and more broadly for the credibility of the criminal justice system--DNA forensic technology may be the only life vest within reach. When a conviction is overturned, the volunteered DNA samples and profiles should be promptly destroyed, preserving the individual's presumption of innocence. For people convicted of serious violent offenses and beyond the reach of such exculpatory evidence, however, the trade-off between privacy and public interest may tilt in favor of a DNA databanking system with strong privacy protections, including sample destruction after profiling and prohibitions on uses other than comparing profiles with those collected from crime scenes. And finally, to protect the presumptive innocence of convicted offenders' family members, states should impose stringent requirements for when a match between a crime-scene sample and a databanked profile can trigger an investigation.

Privacy is a zero-sum game: The extension of law-enforcement authorities' genetic gaze comes directly at the expense of an individual's power to withhold such information. Where most human DNA handling once occurred in medical clinics and research laboratories--institutions that are generally subject to public oversight and cautious (if imperfect) ethical review--DNA has now entered a territory not particularly distinguished for its ethical circumspection. States are giving the public few reasons to be confident that they are taking these concerns seriously; perhaps of even greater concern, negligence in protecting the privacy of offenders and criminal suspects may acclimate the public to weak protections of genetic materials. As the predictive powers of genetic technologies are refined, this could have grievous consequences for everyone.

Books & the Arts

The last chapter in Ring Lardner Jr.'s new memoir, I'd Hate Myself in the Morning (Nation Books), is called "Sole Survivor." When Lardner, who died October 31, wrote it, he was indeed (a) the last of a family of four boys with a famous father, the humorist and sportswriter Ring Lardner; and (b) the last surviving member of the Hollywood Ten, who gained renown in 1947 when they refused to answer the House Committee on Un-American Activities' question, "Are you now or have you ever been a member of the Communist Party?" They were indicted, prosecuted and convicted of contempt of Congress and sent to prison--in Ring's case for a year.

Among the first victims of the great Red purge to come, the Ten, also known as the Unfriendly Ten, are historically important because they were willing to risk prison to help prevent that purge, putting First Amendment principle ahead of personal convenience.

At the time, Billy Wilder, the witty director, cruelly and unjustly said, "Of the Unfriendly Ten, only two had any talent; the other eight were just unfriendly." Ring, who had already won his first Academy Award for Woman of the Year, starring Katharine Hepburn, was one of the two. The other was his buddy Dalton Trumbo, the highest-paid writer in Hollywood, who went on to win an Oscar for The Brave One, a movie he wrote under the pseudonym Robert Rich.

At the time, the tabloid press and newsreels did their best to portray the Ten as obstreperous, dogmatic followers of the party line. Each of the Ten was, in fact, following his conscience, although they arrived at their decision on how to confront HUAC after collective deliberation with counsel, some of whom were party lawyers, others not.

Lardner's famously elegant response to the committee was a clue to how wrong that image was. "I could answer your question," he said, but "I would hate myself in the morning"--hence his memoir's title.

Even during the blacklist years, when he made his primary living writing under various pseudonyms, he never gave up on his social commitment. Thus in 1955, when Hannah Weinstein set up a production company in London and chose for its maiden effort in the new medium of television The Adventures of Robin Hood, Lardner, along with fellow blacklistees like Abe Polonsky and Walter Bernstein, leapt at the opportunity for, as he put it, commentary-by-metaphor "on the issues and institutions of Eisenhower-era America."

After he was finally graduated from the blacklist--it took twelve years--and able to write under his own name, he gave us M*A*S*H, the black comedy that was, on the surface, about life in a medical unit during the Korean war; but beneath the surface, like Joe Heller's Catch-22, it was about the absurdities and contradictions of war itself.

Although his public positions were militant, privately he was a gentle soul. His main target was often himself. He would delight in telling how he recommended to David O. Selznick that he not acquire Gone With the Wind, the highest-grossing picture of its time, "because I objected on political grounds to the glorification of slave owners and the Ku Klux Klan." When progressives praised him for his principled stand against HUAC he would observe that the Ten did the only thing they could do under the circumstances "short of behaving like complete shits."

The loss of Lardner is a loss for both The Nation and the nation. One part Marxist democrat and two parts humanist-rationalist, he stayed true to his vision to the end. A few years ago he listed in The Nation "some of the strange things Americans believe 200 years after Thomas Paine published The Age of Reason." (Typical entries: "Eating fish is good for the brain"; "There never was a Holocaust.") He felt no comment was called for. But when a reader wrote to complain that "Reason is a wonderful tool, but it is a tiny flashlight shining here and there..." Lardner responded, "What he sees as a tiny flashlight, I call, in the words of Cicero, 'the light and lamp of life.'"

In an introduction to his memoir, I call Lardner "recrimination-challenged." In fact he seemed incapable of bitterness. Although he did once say of Martin Berkeley, a screenwriter who named a record 161 names before HUAC and specialized in writing animal pictures, "I always maintained that was because he couldn't write human dialogue."

Music

LOUIS ARMSTRONG AT 100

In 1927 a young cornetist led his band into a meticulously hilarious version of a ragtime classic, Euday Bowman's "Twelfth Street Rag." The recorded track sounds like the opening shot of a revolution--except that the revolution had already been in full swing in Louis Armstrong's head and hands for years.

Unlike most revolutions, though, from the first this one displayed an ingratiating, inviting sense of humor and charm. "Dippermouth," as his early New Orleans pals dubbed him, used the rag as a flight vehicle: As his horn fractures the tune's familiar refrains, the precise, cakewalking rhythmic values of ragtime suddenly coil and loop and stutter and dive, the aural equivalent of a bravura World War I flying ace in dogfighting form. Every time Armstrong comes precariously near a tailspin, he pulls back the control stick and confidently, jauntily, heads off toward the horizon, if not straight into another virtuosic loop-de-loop. The cut is from an astonishing series of recordings Armstrong made in 1925-28 that amount to the jazz-creating legacy of his Hot Fives and Hot Sevens, a succession of studio groups that virtually never performed live. And now, in time for his centennial--he claimed he was born in 1900 but wasn't--it's all been reissued.

The relentless joy brimming in the sound of young Satchelmouth's horn, the glorious deep-blue and fiery-red Whitmanesque yawp of it, has an undeniably self-conscious edge to it. Ralph Ellison and Albert Murray first pointed out a half-century ago that it is also the sound of self-assertion, a musical realization of the double consciousness W.E.B. Du Bois posited for African-Americans. Within this compound of power and pain, a racial revisitation of the master-slave encounter in Hegel's Phenomenology of Spirit, Du Bois explained that African-Americans were inevitably alienated, stood both inside and outside mainstream American culture and its norms, prescriptions, hopes, dreams. Such alienation, Du Bois pointed out, could cripple black Americans by forcing them to internalize mainstream cultural values that held them to be less than human, but it could also liberate the brightest of them. The "Talented Tenth," as he called this group, could act on their perceptions of the contradictions between the high ideals grounding basic American cultural myths (for example, that society believed "all men are created equal," as the Declaration of Independence puts it) and gritty daily reality, where blacks were not exactly welcomed into concert halls, schools, restaurants or the front of buses.

In the bell of Armstrong's barbaric (which means, in the sense Whitman inherited from Emerson, non-European) horn is the sound of a new, all-American culture being forged from the stuff of the social sidelines. In 1957 Ellison wrote to Murray,

I've discovered Louis singing "Mack The Knife." Shakespeare invented Caliban or changed himself into him. Who the hell dreamed up Louis? Some of the bop boys consider him Caliban, but if he is, he is a mask for a lyric poet who is much greater than most now writing. Man and mask, sophistication and taste hiding behind clowning and crude manners--the American joke, man.

Armstrong himself was no naïve artist; he certainly wasn't a fool. From his earliest days he saw race as a key issue in his life, his art and his country, with a wit and understanding evident in his music. As he wrote of jazz's self-proclaimed inventor: "Jelly Roll [Morton] with lighter skin than the average piano players, got the job [at New Orleans's leading whorehouse, Lulu White] because they did not want a Black piano player for the job. He claimed he was from an Indian or Spanish race. No Cullud at all. They had lots of players in the District that could play lots better than Jelly, but their dark Color kept them from getting the job. Jelly Roll made so much money in tips that he had a diamond inserted in one of his teeth. No matter how much his Diamond Sparkled he still had to eat in the Kitchen, the same as we Blacks."

In The Omni-Americans, Murray explains how Armstrong's music limned human talents needed in the frenetic, fast-changing twentieth century. Drawn from the pioneer, Indian and slave, the key American survival skill was improvisation, the soloist's ability to mesh with his surroundings. Ellison's Invisible Man uses Armstrong's version of "Black and Blue," a tune from the 1929 Broadway revue Hot Chocolates, to demonstrate the Du Boisian nature of improvising as epistemological tool.

This was the lesson Armstrong started teaching in the Jazz Age, when flappers reigned and sexual emancipation knocked at the door of mainstream culture, when the Harlem Renaissance redefined African-Americans, when Prohibition created a nation of outlaws who, thanks to associating with booze and gangsters and the demimonde's jazz soundtrack, saw that Negroes, as they were called, were subject to legal and extralegal restrictions and prejudices more arbitrary and inane than the constitutional amendment forbidding booze.

The elastic rhythms and fiery solos on the sides by the Hot Fives and Hot Sevens spoke to these people. On tune after tune, Armstrong cavorts and leaps and capers over and around his musical cohorts with the playful self-possession of a young and cocky top cat. Nothing can hold him down. He traverses keys and bar lines and rhythms with impunity, remolding them without missing a step.

"Black and Blue"--originally written as a lament by a dark-skinned gal for her man, who's attracted to high-yellow types--made him a star. Armstrong's brilliant, forceful reading renders it as mini-tragedy, the musical equivalent of Shylock's speech in The Merchant of Venice. "My only sin," he sings in that growl that compounds the earthy humanity of the blues with an unflinching dignity (this is no grovel), "is in my skin/What did I do to be so black and blue?" The short answer: in America, nothing. The color line did it all.

Subversive and powerful, Armstrong's music was the fountainhead of the Jazz Age and the Swing Era, when jazz was America's popular music and the sounds of syncopated surprise filled the nation's dance halls while young folks skittered and twirled and flounced and leaped and broke out of lingering Victorian constraints to loose-limbed beats and blaring horns that emerged from America's Darktowns in New Orleans, New York and Chicago.

One of Armstrong's 1936 recordings is "Rhythm Saved the World." Like many, this banal tune is transformed by his syncopating personality. Its idea still echoes across America's teeming subcultures: Decades later, Parliament Funkadelic sang, "Here's my chance to dance my way out of my constrictions."

If Armstrong claimed he was born on July 4, 1900, who could blame him? As one of America's primary declarers of cultural independence--and interdependence--he should have been. But in his rich biography Satchmo, Gary Giddins (who insists that all American music emanates from Armstrong) proves that Louis's birth date was August 4, 1901. Armstrong and his sister were born in a hard district of New Orleans; their father left before either could remember him. In his early years Armstrong was raised by his grandmother, whom he credited with the Emersonian values--hard work, self-reliance, artistic daring coupled with personal amiability--that guided him. His mother may or may not have been a prostitute for a while; Louis returned to live with her when he was 5.

At 7 he quit school and went to work for a Jewish family, the Karnofskys, and picked up his first instrument--a tin horn. He'd been dancing and singing on the street for pennies with other kids, but working coal wagons with the Karnofsky sons, he learned to blow the cheap horn by putting his fingers together in front of the tube (he'd pulled off the mouthpiece). The boys encouraged him, their clients loved his melodies, and Little Louis, as he was known, had found his calling.

On January 1, 1913, he was busted for firing his stepfather's pistol, and sentenced to the Colored Waif's Home. There he joined the band and got his first musical training, which he characteristically never forgot. According to clarinet great Sidney Bechet, who in the 1920s was Armstrong's only peer as a virtuosic improviser, the cornet-playing young Louis mastered the chops-busting clarinet solo for "High Society" before his teens--an astounding feat that only hinted at what was to come.

Little Louis danced in second-line parades, following cornetist Joe "King" Oliver in the Onward Band as they wound through the Crescent City streets. Oliver was a catalytic force for Armstrong, who always insisted he learned his stuff from Papa Joe. When Oliver left for Chicago, following post-World War I black migration from the South to Northern and Western cities, he left Little Louis his slot in the Kid Ory band, which led the young cornetist to Fate Marable and the riverboats plying the Mississippi in 1920-21.

Marable, impressed by the young hornman's dazzling facility and ear, hired him for his riverboat band, and one of his sidemen trained the youngster to read and write music. What they played was a mix that would confound the Dixieland revivalists who decades later took Armstrong as their figurehead: adapted arias and classical overtures, quadrilles and other dance music, and the like. (Historian Dan Morgenstern has pointed out the suggestive influence of classical music on Armstrong.) At Davenport, Iowa, when the riverboat docked, a white kid named Bix Beiderbecke first heard Armstrong with Marable and decided to make the jazz cornet his life.

In 1922 Oliver sent for his protégé, who kissed his mother goodbye, packed the fish sandwich she made for him and headed north to Chicago. When he got to the Lincoln Gardens Cafe, where Oliver's band was wailing, he looked like a rube and was so shy he stayed by the door to watch. He couldn't believe he'd be playing with these masters of jazz. Yet in a very short time, first in recordings with them, then with his own Hot Fives and Hot Sevens, he would make them all sound like musical relics.

Rube or not--and his mode of dress quickly became Chicago-style sharp--Armstrong got the girl. His second wife, piano-playing Lil Hardin, married him while they were both playing with Oliver. Hardin was conservatory-trained and middle class, and for the next few years her ambition would drive the modest genius she married to make his mark in the rapidly exploding Jazz Age. Convinced that Oliver kept Louis in his band to keep him from fronting his own, Lil persuaded her husband to grab Fletcher Henderson's offer to join his New York-based big band. When Armstrong arrived in the Big Apple in 1924, Henderson's band was, as Morgenstern notes, "designed for Roseland's white dancing public...rhythmically stiff"; when he left fourteen months later, both arrangers and soloists were extending his sound.

It was Lil who persuaded Armstrong to go back to Chicago after scarcely more than a year in New York, and there he joined her band, then Carroll Dickerson's, and rocked the town. The night he returned, he was greeted by a banner she had unfurled over the bandstand: world's greatest trumpet player. Armstrong later told Morgenstern the reason he left Henderson's band was that the "dicty bandleader," college-educated, light-skinned and prone to look down on dark blacks, wouldn't let him sing, except occasionally for black audiences or for novelty and comic effect. Armstrong had been singing before he ever picked up a horn--it was a fundamental part of who he was and what he had to say. Ultimately, his vocals would make him a world-famous star. More immediately, they were another virtuosic tool he used to change jazz and, in the process, American culture.

Armstrong pioneered so many firsts in jazz and America that no list could capture them all. Here's a sample: He invented the full-fledged jazz solo. He invented scat singing. He introduced Tin Pan Alley and Broadway tunes as jazz's raw material. (When Armstrong replaced New Orleans standards and blues with Tin Pan Alley tunes in the 1930s, he forged the model followed by swing, jazz's most successful invasion of American pop music. His model was followed literally: Key arrangers like Don Redman, who worked for many bandleaders, including Benny Goodman, adapted Armstrong's runs and rhythmic moves to section-by-section big-band arrangements.) And Armstrong performed in interracial settings. Once, in New Orleans, when a bigoted announcer refused to introduce his band, he did it himself--so well that the radio station asked him to do it for the rest of the band's stint.

His voice engulfed America. Among his major disciples was Bing Crosby, who called him "the beginning and the end of music in America." His influence rippled across American popular and jazz singing like an earthquake. As he reconfigured pop tunes, the apparently natural force of his voice's cagey dynamics and loose rhythms seized the imagination of talents like Ella Fitzgerald and Billie Holiday, Crosby and Frank Sinatra.

With his last Hot Sevens recordings for Okeh in 1928, in which tunes like "I Can't Give You Anything But Love" were issued as B-sides, Armstrong had moved closer to the new American cultural mainstream he was inspiring. When he started recording for Decca in 1935, the impetus accelerated. A couple of interim managers gave way that year to Joe Glaser, a thuggish, mob-connected scion of a well-off Chicago family. He and Armstrong shook hands on a deal that lasted till they both died. As Armstrong put it, "A black man needs a white man working for him." It was the beginning of his big crossover into mainstream American culture--another Armstrong first in undermining de facto segregation in America. And his years at Decca were his workshop in change.

He fronted a big band, which critics hated and fans enjoyed. The outfit was run by Glaser, since Armstrong, who occasionally hired and fired personnel, didn't want to shoulder a bandleader's nonmusical burdens. And he agreed with Glaser on a new musical direction: setting his solos off in sometimes inventive, sometimes indifferent big-band charts; smoothing his blues-frog vocals into a more sophisticated sound without losing their rhythmic slyness--something he was also doing with his trumpet solos, reshaping his early frenetic chases after strings of high-altitude notes into less eye-popping, more lyrical solos.

Physical damage to Armstrong's lip and mouth from high and hard blowing forced the issue. Joe Muranyi, who played with him years later, says, "Part of the change in Louis's style could be attributed to the lip trouble he had in the early thirties. There are tales of blood on his shirt, of blowing off a piece of his lip while playing. This certainly influenced the way he approached the horn; yet what we hear on these tracks has at least as much to do with musical development as with physical matters." Limitation was, for Satchmo's genius, a pathway to a matured artistic conception. As Giddins argues forcefully in Satchmo, Armstrong had never separated art and entertainment; jazz for him was pop music. And if his bands irritated critics, there were plenty of gems, and besides, people loved him.

By World War II, his audiences were more white than black.

The war years broke the big bands. The culture had changed: Singers and small groups were hip. It was the era of a new sound, what Dizzy Gillespie called modern jazz and journalists dubbed bebop. Bop's frenetic, fragmented rhythms restated the postwar world's rhythms, and it deliberately presented itself not as entertainment but as art. The musicians forging it, like Gillespie and Charlie Parker, were fully aware of the stirring civil rights movement. World War II had fostered widespread entry of blacks into industry and the American military. Not surprisingly, after the war, they weren't willing to return to the old values of accommodation and deference. Instead, they demanded equality and freedom. In this context, boppers and their followers saw Armstrong's lifelong mugging and entertainment as Uncle Tom-ism rather than artistic expression.

The Dixieland revival, based in Chicago, occurred at about the same time. The (mostly white) revivalists needed an artistic figurehead. With a healthy historical irony they ignored, they chose Armstrong--the very soloist who blew apart old-style New Orleans polyphony, their idea of "pure" or "real" jazz. By 1947 Satchmo reluctantly abandoned his eighteen-piece outfit for the All Stars, a New Orleans-style sextet that included Jack Teagarden and Earl Hines. Though they often made fine music, the group was seen as a step backward by boppers. They jabbed at Satchmo, he jabbed back, and the split between revivalists and modernists escalated into a civil war that, in different stylistic and racial modes, still divides the jazz world.

Sadly, it was another Armstrong first. And his audiences began to turn lily-white. Giddins deftly shows Armstrong's world-famous onstage persona--the big grin, the bulging eyes, the shaking head, the brandished trumpet, the ever-present handkerchief, the endless vaudevillian mugging--to be an organic conception of the artist as entertainer. Still, from the 1950s until just before his death in 1971, Armstrong had to deal with accusations and slurs.

But if he never forgot who he was, while retaining his characteristically modest manner and only privately protesting how much he'd done to advance black civil rights, he could still be provoked, as President Eisenhower and the public discovered in 1957. Armstrong was poised to go on the first State Department-sponsored tour of the Soviet Union, a cold war beachhead by jazz. He abruptly canceled it over the South's resistance to school integration at Little Rock, and he publicly excoriated Ike and America.

Early jazz musicians often refused to record, because they felt competitors could steal their best licks from their records. This was why the all-white Original Dixieland Jass Band made jazz's first records; black New Orleans trumpeter Freddie Keppard refused, fearing for his originality.

No one knows for sure how many recordings Armstrong made during the course of his half-century recording career. All agree, however, that he helped create both the art and the industry. After all, "race" records, especially Armstrong's hits, were as important as Bing Crosby's in saving the fledgling record companies from collapse during the Depression. (And there was more to it than that. Through the phonograph Armstrong made infinite disciples, shaping what jazz would become.)

During the 1950s and 1960s, when he was largely considered a period piece, Armstrong recorded important documents, like his meetings with Duke Ellington and Ella Fitzgerald. The best thing about them is their apparent artlessness, the easy, offhand creativity that was as much Armstrong's trademark as his trumpet's clarion calls. The pleasure is doubled by the response of his disciples.

Ella fits that description easily, since her trademark scat singing owes so much to Armstrong's. Yet she made it her own, purging scat of its overt blues roots. Producer Norman Granz supported them with his favorite Jazz at the Philharmonic stars--Oscar Peterson, Herb Ellis and Ray Brown. The results: Both Ella and Louis and Ella and Louis Again are incandescent yet low-key, full of generous pearls (from "Can't We Be Friends" to "Cheek to Cheek") that can almost slip by because of their understated yet consummate ease.

The 1961 session with Armstrong and Duke Ellington was hasty and almost haphazard, a simple melding of Ellington into Armstrong's All Stars; and yet it produced a wonderful, relaxed, insightful album. After all, Ellington had shaped his earliest bands around trumpeters and trombonists who could serve up Armstrong's New Orleans flair.

Like most postwar babies, I grew up knowing Louis Armstrong as the guy who sang "Mack the Knife" and, most famously, "Hello, Dolly!" It was only later that I'd discover the old blues stuff with singers like Bessie Smith, the Hot Fives, Ella and Louis, Fletcher Henderson and--one of my faves--Armstrong's accompaniment on early hillbilly star Jimmie Rodgers's "Blue Yodel No. 9." But even as a kid I felt strangely drawn to the little guy singing and grimacing on TV, wiping his perspiring brow with his trademark handkerchief. It all seemed corny, yet there was something there, a hint of irony--though that wouldn't have been what his audiences, black or white, noticed unless they were old-timers who knew the ironic physical language, or Satchmo fans or, like me, just a kid.

Why would a white kid in America catch a glimpse of Armstrong's abundantly joyful and potentially dangerous ironies? I'd love to claim precociousness, but it was a lot simpler. I could tell Armstrong was real because he filled the little blue TV screen so overwhelmingly that he made everything around him look, as it should have, fake.

Book

What ought to be read--and why--are questions that have a unique urgency in a multicultural milieu, where each group fights, legitimately, for its own space and voice. In the past couple of decades, battles over the Western canon have been fought strenuously in intellectual circles--one such flash point was Allan Bloom's The Closing of the American Mind and the debates that ensued. These skirmishes have much to do with the fact that America is undergoing radical change. The Eurocentric place once acknowledged as the heart of its culture has ceased to be so. Alternative groups, from different geographies, have brought with them the conviction that public life with a myriad of cores rather than a single one is far more feasible today.

It strikes me as emblematic that the voices most sonorous in the battlefield over the fate of literature are often Jewish, from those of the two Blooms, Allan and Harold, to that of Cynthia Ozick. This is not a coincidence: After all, the Jews are known as "the people of the book." For the Talmudic rabbis, to read is to pray, but so it is, metaphorically, among secular Jews...or, if not to pray, at least to map out God's cosmic tapestry. Among the most deeply felt Jewish expressions of book-loving I know is a letter to the legendary translator Samuel ibn Tibbon, a Spanish Jew of the illustrious translation school of Toledo in the twelfth century, written by his father. In it the elder Tibbon recommends:

Make your books your companions, let your cases and shelves be your pleasure grounds and gardens. Bask in their paradise, gather their fruit, pluck their roses, take their spices and their myrrh. If your soul be satiate and weary, change from garden to garden, from furrow to furrow, from prospect to prospect. Then will your desire renew itself and your soul be filled with delight.

But to turn Tolstoy's Anna Karenina into a companion, to satiate one's soul with it--ought that to be a Jewish pastime? I'm invariably puzzled at the lack of debate among Jewish intellectuals, especially in the Diaspora, on the formation of a multinational literary canon made solely of Jewish books. Why spend so many sleepless nights mingling in global affairs, reorganizing a shelf that starts in Homer and ends in García Márquez, yet pay no attention whatever to those volumes made by and for Jews?

The idea of a Jewish literary canon isn't new. Among others, Hayyim Nakhman Bialik, the poet of the Hebrew renaissance and a proto-Zionist, pondered it in the early part of the twentieth century. He developed the concept of kinus, the "ingathering" of a literature that was dispersed over centuries of Jewish life. Bialik's mission was to centralize it in a particular place, Israel, and in a single tongue, Hebrew. And a handful of Yiddish and Jewish-American critics, from Shmuel Niger to Irving Howe, have addressed it, although somewhat obliquely. Howe, for instance, in pieces like "Toward an Open Culture" and "The Value of the Canon," discussed the tension in a democratic culture between tradition and innovation, between the blind supporters of the classics and the anti-elitist ideologues. But in spite of editing memorable volumes like A Treasury of Yiddish Stories, he refused to see Jewish literature whole.

The undertaking never achieved the momentum it deserves--until now. A number of books have appeared in English in the past few months that suggest the need for a debate around a modern Jewish library. The Translingual Imagination (Nebraska), by Steven Kellman, a professor at the University of Texas, San Antonio, while partially concerned with Jewish literature, addresses one crucial issue: the polyglotism of authors like Sh. Y. Abramovitch, the so-called grandfather of Yiddish letters, whose conscious switch from Hebrew into Yiddish didn't preclude him from translating many of his novels, like The Mare, back into the sacred tongue. The presence of multilingualism in the Jewish canon, of course, is unavoidable, for what distinguishes the tradition is precisely its evaporative nature, for example, the fact that it emerges wherever Jews are to be found, regardless of tongue or geographical location. This complicates any attempt at defining it in concrete ways: What, after all, are the links between, say, Bruno Schulz, the Polish fabulist and illustrator responsible for The Street of Crocodiles, and Albert Cohen, the French-language author of the masterpiece Belle du Seigneur?

Also recently released is a book by Robert Alter, author of the influential The Art of Biblical Narrative and translator of Genesis. It is titled Canon and Creativity (Yale) and attempts to link modern letters to the biblical canon to stress issues of authority. Alter is attracted to the debate of "canonicity" as it is played out in academia and intellectual circles today, but he isn't concerned, not here at least, with purveying the discernible edges of Jewish literature historically. Far more concerned--obsessed, perhaps--with the continuity between Jewish authors from the Emancipation to the present is Ruth Wisse, a professor of Yiddish at Harvard, whose volume The Modern Jewish Canon will legitimize the debate by bringing it to unforeseen heights. For purposes of mitigated objectivity, I must acknowledge up front that together with Alter and Wisse and four other international Jewish critics, I am part of a monthslong project at the Yiddish Book Center to compose a list of the hundred most "important" (the word cannot fail to tickle me) Jewish literary books since the Enlightenment. So I too have a personal stake in the game. But sitting together with other candid readers in a room is one thing. It is another altogether to respond to the pages--at once incisive and polemical--of one of them whose views have helped to form my own.

Wisse is a conservative commentator of the Jewish-American and Israeli scenes and, most significant to me, an intelligent reader of strong opinions whose work, especially her study of Itzjak Leib Peretz and her monograph The Schlemiel as Modern Hero, I have long enjoyed. In her latest work she ventures into a different territory: From specialist to generalist, she fashions herself as a Virgil of sorts, thanks to whom we are able to navigate the chaotic waters of Jewish culture.

Probably the most estimable quality of The Modern Jewish Canon is simply that it exists at all. It insinuates connections to document the fact that Jews have produced a literature that transcends national borders. Albert Memmi's Pillar of Salt and Philip Roth's Operation Shylock might appear to be worlds apart, but Wisse suggests that there is an invisible thread that unites them, a singular sensibility--a proclamation of Jewishness that is clear even when it isn't patently obvious.

This is a crucial assertion, given that Jewish communities worldwide often seem imprisoned in their insularity: Language and context serve to isolate them from their counterparts in other countries and continents. For example, American Jews, for the most part, are miserably monolingual. (I doubt Jews have been so limited linguistically at any time in the past.) They insist on approaching their own history as starting in the biblical period but then jump haphazardly to the Holocaust, and thereon to the formation of the State of Israel in 1948. The Spanish period, so exhilarating in its poetic invocations, is all but ignored, and so is the importance of Jewish communities beyond those of Eastern Europe. Why are the echoes from the Tibbon family to Shmuel Hanagid, Shlomo ibn Gabirol, Moses ibn Ezra and medieval Spanish letters in general so faint? The power of these poets, the fashion in which they intertwined the divine and the earthly, politics and the individual, the struggles of the body and the soul, left a deep imprint on Jewish liturgy and shaped a significant portion of the Jewish people through the vicissitudes of the Ottoman Empire and northern Africa. Even the Dreyfus Affair is little known or regarded, as is the plight of the Jews in Argentina from 1910 to the bombing of their main cultural building in Buenos Aires in 1994. And where verbal isolation is not a problem, the insular perspective still applies: For instance, only now is Israel overcoming its negation of Diaspora life, which has deformed Israeli society and resulted in an institutionalized racism against those co-religionists whose roots are not traced to Yiddishland.

Wisse displays genuine esteem for high-quality literary art. She trusts her instincts as a savvy reader and writes about what she likes; no affirmative action criteria seem to apply in her choices--and for hewing to her own perspective, she ought to be commended. The common traits she invariably ascribes to what is a varied corpus of Jewish literature always point to Russia and Europe. Her encyclopedism is commendable in that it surveys a vast intellectual landscape, but it has clear limitations. She is well versed in English, Hebrew and Yiddish letters. But what about Sephardic culture? Ought she to exclude all that she is unfamiliar with?

The study is divided into ten chapters of around thirty pages each, ordered chronologically according to the birth dates of authors. She starts in the right place--with Sholem Aleichem, the author of the most beloved of all Jewish novels and my personal favorite, Tevye the Dairyman. And she ends with Israeli literature. In the interim, she mixes excerpts, critical commentary and historical perspective in exploring the work of Kafka, S.Y. Agnon, Isaac Babel, Isaac Bashevis Singer and scores of other luminaries, some of questionable value in my eyes (Jerzy Kosinski, for instance) and others often overpraised (here I would include Ozick). The contributions of critics such as Dan Miron, Chone Shmeruk, Lionel Trilling and Howe are acknowledged by Wisse in these pages, their perspectives still fresh and inviting.

It may be ungenerous to accuse Wisse of a certain nearsightedness; after all, to capture the essence of a literature written in a plethora of tongues and cultures, a literature that is by definition "undefinable," any potential cataloguer would need to be versed in each and every one of them. But The Modern Jewish Canon suffers another serious shortcoming, one entirely within her control: It is too dry a read. For a treatise that aspires to connect the various Jewish Weltanschauungen and juxtapose a rainbow of imaginations, each responding to different stimuli, from the eighteenth century to this day, Wisse offers little by way of narrative enchantment. She is a scholar and writes as such. Scarce effort is made to turn words into metaphors, to twist and turn ideas and allow them to wander into unexplored regions. The reader finds himself lost in a sea of "objective impersonality." Too bad, for shouldn't a book about the beauties of a polyphonic literature aspire to some of that polyphony on its own?

Wisse herself announces: "Modern Jewish literature...promises no happy merger into universalism at the end of the day." And yet some form of universalism is what she is attempting to describe, extending connective tissue between literary works where, at least superficially, there seemed none before. In that sense the achievement is impressive. Immediately after finishing the book, I took up pencil and paper to shape a list of what would be my own choice of books. In one of her last pages Wisse, who concentrates on novelists, includes a list of almost fifty titles, "meant to serve as a reference guide." Included are Yaakov Shabtai's Past Continuous, Piotr Rawicz's Blood From the Sky, Pinhas Kahanovitch's The Family Mashber, and Anne Frank's Diary of a Young Girl. But I found myself asking, Where are Marcel Proust, Elias Canetti and Moacyr Scliar? And that, precisely, is one thing a book of this sort should do: force readers to compose a response to the invisible questionnaire the author has quietly set before our eyes.

Future generations will find The Modern Jewish Canon proto-Ashkenazic and hyper-American, a sort of correlative to the Eurocentrism that once dominated American letters. They will kvetch, wondering why the Iberian and Levantine influence on today's Jewish books--from the poetry of the crypto-Jew João Pinto Delgado, to the inquisitorial autobiography of Luis de Carvajal the Younger, to even the Sephardic poetry that came out of the Holocaust--was so minimized in the English-language realm. Kvetch is of course a Yiddish word--or, as Leo Rosten would have it, a "Yinglish" one--but fretting and quarreling are Jewish characteristics regardless of place, and they inhabit the restless act of reading as well. The idea of a Jewish canon, modern and also of antiquity, hides behind it an invaluable fact: that Jews are at once outsiders and insiders, keepers of the universal library but also of their own private ones. Books have always served as their--our--companions for renewal and delight. The content of that private library might be up for grabs, but not its endurance.


Film

VINCENT CANBY

As a memorial tribute to Vincent Canby, the "Arts & Leisure" section of the New York Times recently published half a page of excerpts of his prose, as selected by The Editors. Implacable beings of ominous name! With grim rectitude, they shaped a Canby in their image, favoring passages where he had laid down principles of the sort that should be cited only under capitalization. These were Sound Judgments.

For those of us who admired Mr. Canby (as the Times would have called him while he was alive, and as I will continue to call him, knowing how the style fit the man), soundness of judgment was in truth a part of his merit. A hard man to fool, he could distinguish mere eccentricity from the throes of imaginative compulsion, the pleasures of pop moviemaking from the achievements of film art; and when he was offered sentimentality in place of feeling, his heart didn't warm, it burned. These powers of discernment allowed him to bear with extraordinary grace the responsibility of being the Times critic. They also added considerably to that responsibility, since it was his sureness, as much as the institutional weight of the Times, that made Vincent Canby so influential.

That said, I confess I read him to laugh. At present, I can give only tin-eared approximations of his wisecracks--correct and ample quotation will become possible when someone smart decides to publish a Vincent Canby anthology--but I can hardly forget his review of Salome's Last Dance. This picture was the latest chapter in Ken Russell's phantasmagorical history of sex in the arts, or the arts in sex. Mr. Canby's lead (more or less): "As the bee is drawn to the flower, as the hammer to the nail, so Ken Russell was bound to get to Oscar Wilde."

I also recall Mr. Canby's description of the used car that Jim Jarmusch peddled to the title characters in Leningrad Cowboys Go America. It looked, he said, as if it had been dropped from a great height. Writing about I've Heard the Mermaids Singing, a film of relentlessly life-affirming whimsy, he claimed he'd been cornered by a three-hundred-pound elf. A typically self-regarding, show-offy performance by Nicolas Cage (was it in Vampire's Kiss?) inspired him to write that other actors must enjoy working with this man about as much as they'd welcome being shut up with a jaguar. And once, when forced to think up copy about his umpteen-thousandth formula movie, he proposed that the only way to derive pleasure from such a picture would be to play a game with yourself, betting on whether you could guess what would happen next. "As you win," he wrote, "you lose."

From these few and random examples, you may conclude that Mr. Canby's principles often emerged with a deep-voiced chuckle, and that they involved matters that went far beyond the movies. Some of these concerns were political in the specific sense, as when he gave a favorable review to Alex Cox's Walker: a film that offered a burlesque insult to US supporters of the Nicaraguan contras, in government and at the Times. His concerns were also political in a broader sense. Witness the 200 words he devoted to a little African-American picture titled Love Your Mama: a heartfelt, thoroughly amateurish movie produced in Chicago by some people who had hired an industrial filmmaker to direct their script. While quietly letting his readers know that they probably would not want to watch this film, Mr. Canby conveyed a sense that real human beings, deserving of respect, had poured themselves into the project.

Of course, the best places in which to seek Mr. Canby's principles were within the films he championed. He would have earned his place in cinema history (as distinct from the annals of journalism) had he done nothing more than support Fassbinder's work. And yet I'm not surprised that The Editors found no space to reprint Mr. Canby's writings on this crucial enthusiasm. Fassbinder, like his critic, was preternaturally alert to political and social imposture, to the bitter and absurd comedy of human relationships, and also (for all his laughter) to the pain and dignity of those who go through life being pissed on. Mr. Canby recognized in Fassbinder's work all these qualities and more (such as the presence, in the person of Hanna Schygulla, of one of cinema's great fantasy objects); but these matters seem to have been judged too unruly for an "Arts & Leisure" tribute.

Now, I've been allowed to do some work for "Arts & Leisure" and have received from my editors nothing but aid and kindness. Surely the people I've dealt with at the Times would have chosen excerpts from Mr. Canby that were funnier, sharper, more challenging. So maybe, when the Times moves to memorialize somebody as one of its own, a higher level of control takes over. It's as if the paper means to show its own best face--or rather the image it wants to see in the mirror, urbane and solid--and never mind that man in the old tweed jacket.

This tendency of the institution to eclipse the individual figures prominently in a new book by another major film critic, Jonathan Rosenbaum. By "major," I mean that Rosenbaum is highly regarded by other reviewers and film academics, and that he's gained a certain public following (concentrated in Chicago, where he serves as critic for the Reader). But if you were to ask him how he fits into American film culture in particular and US society in general, he would locate himself, quite accurately, on the margins. As his friends will tell you (I hope I may count myself among them), Rosenbaum is one of the angel-headed hipsters: a sweet-natured, guileless man, wholly in love with art and wholly longing for social justice. And for these very reasons, he has become the angry man of American film criticism, as you might gather from the title of his new work, Movie Wars: How Hollywood and the Media Conspire to Limit What Films We Can See (A Cappella, $24).

Rosenbaum argues--"argue," by the way, is one of his favorite words--that those American writers, editors and TV producers who pretend to cover film are for the most part hopelessly self-blinkered. It's in their interest to look at only those movies that the big American companies want to promote (including the so-called independent films that have been ratified by Sundance and Miramax). So journalism collaborates with commerce, instead of acting as a check on it; informed, wide-ranging criticism gets shoved to the side; films that might have seemed like news flashes from the outside world fail to penetrate our borders; and everyone excuses this situation by claiming that "the people" are getting the dumb stuff they want. Rosenbaum is enraged that moviegoers should be viewed with such contempt; he's infuriated that well-placed journalists should justify their snobbism (and laziness) by dismissing whatever films and filmmakers they don't already know about; and he's mad enough to name names.

In Movie Wars, Rosenbaum advances his arguments by means of a crabwise motion, scuttling back and forth between general observations (which are newly composed) and case studies (many of them published before, in the Reader and elsewhere). This means that some stretches of ground are covered two or three times. I don't much mind the repetition--even when the material shows up in a second new book by Rosenbaum, his excellent, unabashedly partisan monograph on Jarmusch's Dead Man (BFI Modern Classics, $12.95). I do worry that indignation, however righteous, has begun to coarsen Rosenbaum's tone and push him into overstatement.

When Rosenbaum is at his best, his extraordinary wealth of knowledge about cinema informs an equally extraordinary power of insight into individual pictures; and both these aspects of his thinking open into frequently astute observations of the world at large. You can get Rosenbaum at his best in his Dead Man monograph and in three previously published collections: Moving Places, Placing Movies and Movies as Politics (California). By contrast, Movie Wars is a sustained polemic, with all the crabbiness that implies.

It's a welcome polemic, in many ways. Most rants against the infotainment industry are on the level of Michael Medved's godawful Hollywood vs. America; they complain, in effect, that the movies tell us too much about the world. Rosenbaum recognizes the real problem, which is that our world (filmed and otherwise) has been made to seem small. I agree with much of what he says. But when, in his wrath, he digresses to settle scores or rampages past obvious counterarguments, I begin to wish that he, too, would sometimes pretend to be urbane and solid.

"There's a hefty price tag for whatever prestige and power comes with writing for The New York Times and The New Yorker," Rosenbaum says, "and I consider myself fortunate that I don't have to worry about paying it. Film critics for those publications--including Vincent Canby and Pauline Kael...--ultimately wind up less powerful than the institutions they write for, and insofar as they're empowered by those institutions, they're disempowered as independent voices."

To which I say, yes and no. As bad as the situation is--and believe me, it's woeful--I've noticed that news of the world does sometimes break through. David Denby, in The New Yorker, may contribute to American ignorance by being obtuse about Kiarostami (as Rosenbaum notes with disdain); but then, as Rosenbaum fails to note, Stephen Holden and A.O. Scott in the Times delivered raves to Taste of Cherry and The Wind Will Carry Us. Individuals in even the most monolithic publications still make themselves heard; and the exceptional writer can manage (at least in life) to upstage an entire institution.

Rosenbaum himself has pulled off that trick at the Reader; and Vincent Canby did it at the Times. To the living critic, and all those who share his expansive view of the world, I say, "We've lost a champion. Better stop grousing and pick up the slack." And to those who mourn Mr. Canby, I say, "You can still hear his laughter. Just don't let The Editors get in the way."