It took George W. Bush a matter of days--if not hours--to prove that he doesn't believe his own different-kind-of-Republican rhetoric and that he is leading a squad as loaded with partisan hacks as the other team. He doesn't trust the people--at least, the people of the recount counties who want to make sure every chad counts. (The Bush-league spin that manual recounts are less accurate than Ouija boards was demolished by computer scientists and voting-machine experts, who maintain that well-managed hand counts are without question more accurate than machine feeds.)
Bush has also shown his promise to be a uniter, not a divider, to be counterfeit. Relying on the impression created by the networks' false projection of him as the victor--a call first made by his cousin, the vote projector at Fox News--Bush and his lieutenants portray every move that works against them in Florida as part of a conspiracy to "steal" the election from the rightful winner. Is this the way to foster unity and healing? The Bush camp then played an ugly card by accusing Democrats, who were following the traditional practice of carefully vetting overseas absentee ballots, of seeking to disfranchise the men and women of the armed forces. What of the men and women who serve as firefighters, inner-city teachers and ER nurses in the disputed counties--did the Republicans care about registering their votes?
Al Gore was pegged as the candidate who would say or do anything to win, but clearly Bush is willing to do whatever it takes to score in Florida. Yes, the Democrats assailed Florida Secretary of State Katherine Harris, but imagine the fury of Republican spinmeisters if a Democratic state official tied to the Gore campaign had voided a Bush-requested recount.
Given the closeness of the election and the rampant problems with vote-counting in Florida and elsewhere, neither Bush nor Gore can be a clear winner. A system of counting 100 million votes cannot be expected to be accurate to within 0.001 percent. But beware the drawers of lessons, those voices from on high who pronounce this split decision a mandate for centrism. Both candidates ran toward the center, and neither achieved a majority. (If one combines the Ralph Nader and Al Gore vote, there's a 52 percent center-left majority; but given the differences between Goreism and Naderism, could that be a workable majority?) Moreover, campaign centrism again failed to inspire most Americans. Nonvoters outnumbered Gore or Bush voters. Under such circumstances, a pundit-approved mandate would be a figment of the political class's imagination. With roughly half of Americans choosing not to choose, the winner can claim only slightly less than one-quarter support. Had Bush or Gore won by 5 percentage points, he still wouldn't have a popular mandate.
The no-decision election of 2000 may result in sorely needed electoral reforms. But will it convince the next President and the pols to rethink the notion that the center is all? Doubtful: The sad fact is that even more than in previous years, the winner of Campaign 2000 will no doubt be fixated on his re-election as the way to legitimize his very iffy win--and re-election mania breeds caution. After this contest, it's likely that the permanent campaign will become even more permanent. The election of 2000 will very possibly not be settled until 2004.
THE BUCHANAN FACTOR
John Cavanagh of the Institute for Policy Studies posits that Pat Buchanan undermined Bush's election hopes more than Ralph Nader did Gore's. Although Buchanan collected around 450,000 votes to Nader's 2.6 million, in the four states that Bush lost by razor-thin margins Pat may have played the spoiler. In New Mexico, which Bush lost by 377, Buchanan got 1,185. In Wisconsin, where Bush was down by 5,816, Buchanan garnered 11,077. In Iowa, where Pat chalked up 6,776 votes, Bush lagged by 4,047. In Oregon, Buchanan took 5,029 and Bush lost by 4,186. That adds up to thirty electoral votes that Buchanan might have cost Bush. The only two states where Nader's vote surpassed Bush's margin of victory were New Hampshire and (maybe) Florida, a total of twenty-nine electoral votes. Thus, it's possible that Buchanan hurt Bush more than Nader hurt Gore. Yet Republicans aren't blaming Buchanan for Bush's current predicament.
HOW TO CHOOSE A PRESIDENT
Speaking of the electoral impasse, the Ratzan family sent us a list of possible ways to award the presidency. Among them:

* Proportional Term Method. Give each candidate a term in office proportional to the percentage of the vote he won. Two years for Bush, two for Gore, one day for Nader, five minutes for Buchanan. All other candidates' presidencies depend on whether their term leaves them sufficient time to take the oath of office.

* Disgruntled Roommates Method. Part 1: Use tape to make a line across the middle of the Oval Office: This side is mine, that side is yours. Part 2: Divide the government 50-50 between Bush and Gore--by department, budget or individual government worker.

* Jeopardy Quiz Method. Categories would include Name That Foreign Leader, What Legislation Have I Signed? and English Vocabulary.

* Quaker Meeting Method. No President elected until everyone agrees.

(For more ways to award the presidency, see email@example.com.)
PRO BONO--AND CON
Sam Nelson reports: According to Supreme Court Justice Ruth Bader Ginsburg, the noble tradition of private pro bono legal counsel, long an answer to the government's inadequate legal aid to the poor, is fast eroding as law firms choose profits over charity. In a speech delivered at Brooklyn Law School recently, Ginsburg said that only 20 percent of the need for legal services to the poor is currently being met in the United States. She noted furthermore that countries in Western Europe spend several times what the United States spends per capita on civil legal aid. The role the private bar plays in providing free legal services has therefore been a critical one. But Ginsburg highlights a disturbing trend: One hundred of the wealthiest law firms now devote one-third less time to pro bono work than they did eight years ago, while the average lawyer spends less than a half-hour a week on unbilled legal work. This is due in part to familiar job-market trends in the new economy, as large law firms increase starting salaries by an average of $25,000 to compete for talent with other lucrative corners of the private sector, like Internet start-ups. Pro bono work, says Ginsburg, "has suffered accordingly." The tradition of pro bono work began in the philanthropic movements of the late nineteenth century and reached its peak in the civil rights climate of the sixties and seventies. Thirty years ago law school graduates frequently sought out firms based on the type of pro bono work they did, said Ginsburg. "Today they don't ask anymore."
TURNING THE ELECTORS
Two politically enterprising seniors at Claremont McKenna College in California are behind a cybermovement called Citizens for True Democracy, which is mobilizing Internet folk to persuade at least two Bush electors (assuming W. takes Florida, giving him a two-vote majority in the Electoral College) to switch their votes, throwing the election into the House of Representatives or to Gore. To this end they've put on their website--www.truedemocracy.org--contact information for 189 Bush electors in the nineteen states that have no laws requiring electors to follow the popular vote, so that people can urge them not to vote for Bush. CTD cofounder David Enrich, a former Nation intern, says that "thousands of people have visited the page with electors' contact information." Enrich emphasizes he's no fan of Al Gore. "We're focusing on Bush's electors at the moment because Gore is ahead in the popular vote, and we think the popular vote should determine the next President." Actually, he and his partner, Matt Grossmann, founded CTD in 1998 to agitate for direct presidential elections and instant-runoff voting.
There aren't many gracious fighters--especially in the shouting-head world of Washington journalism. But Lars-Erik Nelson, who had reported from the capital for the past three decades, was such a person. He died recently, at 59, while watching television at home. Nelson, a columnist for the New York Daily News, combined political passion--that is, a thirst for justice--with a hard-edged reporter's desire to unearth facts ignored, overlooked or neglected by others. Throughout the Monica Lewinsky scandal he was one of the few reporters who kept his head while all around him were losing theirs, and he distinguished between spin and substance. We were fortunate to count him as a contributor (see "Reno: Getting It From All Sides," June 26). The scoundrels, secret-keepers and hypocrites of Washington and the political class are slightly safer with Nelson gone. Our condolences to his family, friends and readers.
NEWS OF THE WEAK IN REVIEW
We have long suspected that the two major parties have become like brand names competing for market share. Now it turns out that a study by a marketing research company called FCB Worldwide actually rated the Democratic and Republican names along with sixty other product brands in terms of the intensity of consumers' relationship with them. FCB used fourteen criteria, which it says were developed with marriage counselors and psychologists and included "chemistry, affection, trust and reciprocity." Judged by these standards, the two parties ranked at the bottom, below well-known names like Nike, Oreos and Clorox. FCB's director, Janet Pines, told the New York Times, "It's scary, yes, but I think it's also easy to see that a bottle of bleach is less likely to betray you than a politician."
Patricia Williams's column does not appear in this issue. She will return in two weeks.
There's been a lot of talk in recent days about "disfranchisement." Jesse Jackson has invoked memories of the bloody battles for voter registration in Selma; elderly Jews in Palm Beach, upset that they might have voted for an anti-Semite, have sworn that they were "disfranchised" by a butterfly ballot. The Republicans have lamented the disfranchisement of overseas soldiers for want of a postmark. Even the justices of the Florida Supreme Court got into the act.
The rhetoric is overblown, but there is a point: One byproduct of our election-turned-lawsuit is the revelation of the many ways people can be prevented from voting and having their votes counted. Thanks to the spotlight on Florida (focusing beams elsewhere as well), we now know that it is routine for states and counties to toss out tens of thousands of ballots because they are imperfectly marked; for countless people to arrive at the polls to find that their registration forms have been lost; for voting machines to have error rates that would be unacceptable in the grading of SATs.
The most extreme case is outright legal disfranchisement. Since the dramatic advances in voting rights of the 1960s, straightforward legal barriers apply only to two large groups of adults: noncitizens and felons. Noncitizens, numbering 15 million, have not always been excluded from the franchise, but the last state to allow aliens to vote (Arkansas) did away with that practice in the 1920s. Although a few communities permit resident aliens to vote in local elections, there has been a notable absence of debate in the United States (in comparison with Europe) on the proposition that people should be able to vote where they live and work.
Felons are permanently disfranchised in some states and temporarily barred in most others. (On November 7 Massachusetts brought a long, progressive history to an end by voting to disfranchise convicted felons.) Disfranchised felons and ex-felons now number roughly 4 million, most of them black or Hispanic. The link between the commission of a crime and the deprivation of political rather than civil rights has always been tenuous, and the constitutional legitimacy of such laws resides in a dubious interpretation of a phrase in the Fourteenth Amendment that tacitly permits the states to deprive men of the right to vote because of their participation in "rebellion or other crime."
Voters can also be prevented from voting (or having their votes counted) because they are tripped up somewhere along the elections-procedure obstacle course. In most states, advance registration is required; in many, no voting cards are issued, and (as happened in Illinois and elsewhere this year) people who thought they had registered at motor-vehicles bureaus discovered on Election Day that their registration was never recorded. Ballots are sometimes bewildering, assistance at the polls is scarce and problematic, and polling places migrate, often without notice.
In some states, it can be difficult, or even impossible, to vote for a candidate who happens not to be a Democrat or a Republican. In North Carolina, for example, Ralph Nader was not on the ballot (because he lacked enough signatures on a petition last spring), and write-in votes for Nader were not counted because he was not an "official" write-in candidate. (Since that fact was not advertised, many people did write in his name and had their ballots thrown away.) Meanwhile, the Electoral College dilutes the presidential votes of large-state residents, and minority voters are still subject to harassment in parts of the South.
This unhappy state of affairs has complex origins. Some of our institutions, such as the Electoral College, were created in an era when there were few believers in democracy. Many regulations date to a resurgence of antidemocratic sentiment in the late nineteenth century, a time when the two major parties colluded to suppress the threat of third-party insurgencies and when complex registration schemes were adopted both to minimize fraud and to reduce the electoral participation of blacks and immigrant workers. Over the past century, election rules have been forged through partisan rivalry, with spells of conflict ending in truces and compromises that permitted the parties to mobilize their most loyal voters while imposing burdens on everyone else. That's how we ended up with Republican officials correcting the absentee-ballot forms of their voters, while their Democratic counterparts were instructing voters in Duval County (erroneously) to put a punch on every page. So much for the voice of the people.
"Actions approved by the U.S. government aggravated political polarization and affected Chile's long tradition of democratic elections and respect for constitutional order and the rule of law," reads a White House press release that accompanied the November 13 declassification of 16,000 secret government documents on Chile. That statement, contorted bureaucratese for admitting a US contribution to undermining Chilean democracy and backing a brutal dictatorship, falls far short of accepting US accountability for the national and human horror experienced in Chile--an acknowledgment necessary for Chileans and Americans to reach closure on this shameful history.
The release marks the final installment of the Clinton Administration's special Chile Declassification Project. "One goal of the project," according to the White House statement--issued by the press secretary rather than in the name of the President--"is to put original documents before the public so that it may judge for itself the extent to which US actions undercut the cause of democracy and human rights in Chile." Among the 24,000 documents declassified over the past two years are secret cables, memorandums and reports making that judgment perfectly clear.
The new documents dramatically record the imperial spectacle of high-level US efforts to destroy Chilean democracy in order to prevent an elected socialist, Salvador Allende, from governing. In a declassified transcript of a November 6, 1970, National Security Council meeting, President Nixon and selected Cabinet members casually discuss the need to "do everything we can to hurt [Allende] and bring him down." There, in bald terms, the historical record reveals the callous willingness to promote upheaval and bloodshed to achieve this goal. "You have asked us to provoke chaos in Chile," the CIA station chief in Santiago cabled headquarters in October 1970 during covert efforts to foment a coup; "we provide you with [a] formula for chaos." The CIA Chilean coup-plotters predicted at least 10,000 casualties if the military coup went forward. "Carnage could be considerable and prolonged i.e. civil war."
The CIA knew a year before the coup that Pinochet was prone to ruthlessness. An intriguing intelligence report records Pinochet as saying in September 1972 that "Allende must be forced to step down or be eliminated." A Chilean informant, who apparently accompanied Pinochet on a trip to Panama to purchase US tanks for the Chilean military, told the CIA that US Army personnel based at the Southern Command had assured them, "US will support coup against Allende 'with whatever means necessary,' when time comes."
In the United States, revelations of covert operations to destabilize the Allende government caused a major scandal in the mid-1970s. In Chile, where even the pro-Pinochet media have been forced to report on the declassified US records, this history is only now having a major impact on the national psyche. Throughout the country, there is outrage at this dramatic evidence of US intervention in Chile's internal affairs. A group of prominent senators has demanded that the Chilean government formally protest US "violations of our sovereignty and dignity" and have summoned the foreign minister to explain what action the government of Ricardo Lagos intends to take toward Washington. Privately, Chilean government officials have requested that the United States clearly acknowledge actions that helped change the course of Chilean history.
The Clinton White House considered such an acknowledgment to accompany the final documents' release--but in the end decided against it. Some officials fear that Washington could be held liable for covert war crimes in Chile--that the long arm of international justice that nabbed Augusto Pinochet could someday reach US officials. Although President Clinton did apologize to Guatemala for Washington's cold war policy of aiding and abetting repression--"support for military forces or intelligence units which engaged in violent and widespread repression," the President stated in Guatemala City last year, "was wrong, and the United States will not repeat that mistake"--no similar statement on Chile will be forthcoming. With the declassified documents, we now have a fuller accounting of the US role in Chile--but with no accountability.
The postelection battle for the presidency is without doubt some kind of crisis, but it's not easy to define precisely what kind. David Broder has suggested in the Washington Post that it grows out of deep divisions in the country. "The nation has rarely appeared more divided than it does right now," he writes, and attributes the phenomenon to quarrels left over from the 1960s among the "polarized baby boomers." Of course, it's true that the vote for Congress as well as the President was exceedingly close, and in that sense the country is, literally, divided. Division, however, should not be confused with polarization. On the contrary, the even split of the electorate can be attributed to the opposite of polarization--namely, the centrism of the candidates. Each carefully tailored his campaign to appeal to a reportedly contented "center," and each, unsurprisingly, won nearly half of it. The fact is that the United States, prosperous and at peace, is, politically speaking, more asleep than it is agitated. Almost 50 percent of the public did not bother to vote. Not a division in the country but a contest between two politicians to win over a united country has been the source of the turmoil.
The battle, then, is between the parties rather than the people. It is, in the words of social critic Tom Engelhardt, a crisis of politics but not of the polity. A top-heavy establishment--overfunded, overpowerful, overcovered--has imposed its power struggle on a country that wants no part of it, except, perhaps, as entertainment. Almost entirely lacking in substance, that struggle possesses the logic more of vendetta than of authentic competition. A better analogy than the ideological divisions of the sixties would be the feuding Hatfields and McCoys of legend, or perhaps the Guelphs and Ghibellines of the Middle Ages. Like two armies fighting an unpopular war, both parties try to recruit support from a populace that for the most part would just as soon watch football. The consequence is the disconcerting spectacle before our eyes of all-out political war in a politically apathetic land.
It is important, though, to be more exact in assigning responsibility for the disturbance. The pressures of American politics create a temptation among journalists to practice a meretricious even-handedness in judging the parties. It is, of course, important for journalists to be nonpartisan. That is, they should exercise independent judgment, uninfluenced by any party interest. Being nonpartisan, however, does not mean blaming the two parties equally in all situations; it means judging both by the same standards and letting the chips fall where they may. If they are equal offenders, then that should be said, but if one party is by far the greater offender, then that must be said, too, even if it falsely creates an appearance of partisanship.
Such is the case at present. The Democrats have hardly been pacifists in the struggle. Their record is barren of moves taken for any evident reason but winning the presidency. The language of Gore's spokesmen and lawyers has at times been intemperate, as when the lawyer Alan Dershowitz called Florida's Secretary of State Katherine Harris "a crook." Yet by far the most dangerous escalations have come from the Republican camp. During the first ten days of the crisis, the fight was kept within certain bounds on both sides. Then came the Florida Supreme Court's decision to order Harris to refrain from certifying the election until further instruction. The GOP responded with a torrent of unsubstantiated defamation of the Gore campaign and of the boards conducting the recount in Florida. House Republican whip and impeachment zealot Tom DeLay announced without evidence that the election was "nothing less than a theft in progress" in Florida. A new spokesman for the Bush campaign, Governor Marc Racicot of Montana, charged that Democratic supervisors, by disallowing absentee military ballots that lacked Election Day postmarks or had other deficiencies, "have gone to war in my judgment against the men and women who serve in our armed forces," and opined that "when the American people learn about these things, they're going to ask themselves what in the name of God is going on here." Governor William Janklow of South Dakota announced that the Democrats "are going to steal the election." And Bush's press secretary, Karen Hughes, accused the Gore campaign of "reinventing and miscounting the true intentions of the voters."
At the same time, Republicans were beginning preparations to carry the battle beyond the Florida courts--into the Supreme Court, the Florida legislature and Congress. Former Senator Bob Dole and other Republicans said they might consider boycotting a Gore inaugural. Implicit in these preparations was the threat that if Bush didn't get his way in Florida the Republican Party was prepared to turn what so far has been a legal battle in one state into a true constitutional crisis. In that case, the mere party crisis, arising out of nothing more than a few people's love of power and lack of restraint in grasping for it, will have, by their single-handed efforts, created the national division that the country itself has failed to produce.
The throngs of Vietnamese who hailed Bill Clinton as "the antiwar President" demonstrated that they as a people remember something that we as a people have chosen to forget. It is time to restore our memory of that great antiwar movement by tens of millions of Americans, a movement that began with the first US acts of war in 1945.
Yes, 1945. In September and October of that year, eight troopships were diverted from their task of bringing American troops home from Europe to transport US-armed French soldiers and Foreign Legionnaires from France to recolonize Vietnam. The enlisted seamen on those ships immediately began organized protests. On arriving in Vietnam, the entire crews of the first four troopships met in Saigon and drew up a resolution condemning the US government for using American ships to transport an invasion army "to subjugate the native population" of Vietnam.
The movement kept growing. In 1954, when Vice President Nixon suggested sending American troops to replace the French because "the Vietnamese lack the ability to conduct a war or govern themselves," thousands of letters and telegrams opposing US intervention deluged the White House. An American Legion division with 78,000 members demanded that "the United States should refrain from dispatching any of its Armed Forces to participate as combatants in the fighting in Indochina or in southeast Asia." On the Senate floor, Senator Ed Johnson of Colorado declared, "I am against sending American GIs into the mud and muck of Indochina on a blood-letting spree to perpetuate colonialism and white man's exploitation in Asia." A Gallup poll revealed that 68 percent of those surveyed were against sending US troops to Indochina. Because of the American people's opposition, the US war had to be waged by four administrations under the cloak of plausible deniability.
We have been depriving ourselves of pride about the finest American behavior during that war. In most wars, a nation dehumanizes and demonizes the people on the other side. Almost the opposite happened during the Vietnam War. Tens of millions of Americans sympathized with the Vietnamese people's suffering, many came to identify with their 2,000-year struggle for independence and some even found them an inspiration for their own lives.
But in the decades since the war's conclusion, American consciousness of the Vietnamese people, with all its potential for healing and redemption, has been systematically obliterated. Ironically, it was after the war that demonization of the Vietnamese began to succeed, thanks in part to the national beatification of POWs and the myth of POWs as martyrs still being tortured by Vietnam. Soon those who had fought against the war became, as a corollary, a despised enemy. They also became the villains in another myth, developed from the 1980s to the present: the spat-upon veteran. As Vietnam veteran and sociologist Jerry Lembcke has shown in The Spitting Image, there is not a shred of evidence of this supposedly widespread phenomenon.
In fact, Vietnam veterans and active-duty soldiers and sailors became the vanguard of the antiwar movement. At home, veterans led the marches and demonstrations, including the 1971 assembly of a half-million protesters headed by a thousand Vietnam veterans, many in wheelchairs and on crutches, who paraded up to a barricade erected to keep them from the Capitol and hurled their Purple Hearts, Bronze Stars and Silver Stars at the government that had bestowed them. In Vietnam, fraggings and mutinies helped compel the withdrawal of most of the ground forces, while rebellions and sabotage put at least five aircraft carriers out of combat. (Who today can believe that 1,500 crew members of the USS Constellation signed a petition demanding that Jane Fonda's antiwar show be allowed to perform on board?)
As the antiwar movement spread even into the intelligence establishment, the American people got access to the most damning truths in the leaked Pentagon Papers. As Senator Mike Gravel noted in 1971, only a person who "has failed to read the Pentagon Papers" could believe we were fighting for "freedom and liberty in Southeast Asia."
But we as a nation have forgotten all that, just as we have forgotten our government's pledge to help rebuild the country it destroyed despite all our opposition.
Do we want a Vice President who endorses illegal detention and torture of Palestinians? Anthony Cordesman, a national security type frequently deployed as a television pundit, recently posted a paper on the website for the Washington-based Center for Strategic and International Studies recommending that Yasir Arafat's Palestinian Authority engage in just these practices to repress the latest intifada.
"Halt civil violence," Cordesman counsels, "even if it means using excessive force by the standards of Western police forces." But this is only a warm-up. "Halt terrorist and paramilitary action by Hamas and Islamic Jihad," Cordesman continues, "even if this means interrogations, detentions, and trials that are too rapid and lack due process." Still not clear enough. "Effective counter-terrorism relies on interrogation methods that border on psychological and/or physical torture, arrests and detention that violate the normal rights of privacy, [with] levels of violence in making arrests that are unacceptable in civil cases, and measures that involve the innocent (or at least the not provably directly guilty) in arrests and penalties."
In other words, protected only by the weasel phrase "border on," Cordesman urges that Israel's security forces return to the torture techniques that were finally abandoned under High Court order a year ago. Joe Lieberman is one of the senators belonging to the CSIS Middle East Task Force. Thus far, despite explicit requests for comment, he has not disavowed Cordesman's prescriptions, which have been condemned by Amnesty International USA.
For two months now Israel has laid barbarous siege to Palestinians throughout the occupied territories. The Israeli Army is busily cordoning Palestinian areas behind trenches and barbed wire, making Gaza and the West Bank one vast prison--or rather, many separate prisons, all barred from communicating with one another.
The policy of "closure," initiated after the Gulf War, continued unabated during the so-called Oslo peace process, in violation of Israeli government obligations. The strategy of apartheid and imprisonment is now accelerating, accompanied by bombardment of heavily populated areas, as well as incessant attacks from settlers (all courtesy of the US government, as always, with vast new military subventions rolling in after the Al-Aksa intifada began).
Even the relatively better-informed mainstream accounts fail to convey the brutality of this policy. There are a number of excellent news outlets for those who want unjaundiced reporting. The website of the Middle East Research and Information Project is trustworthy (www.merip.org), as is the Electronic Intifada (electronicintifada.net/new.html). On the latter site, the introductory essay by Nigel Parry gives a useful overview of media coverage. Electronic Intifada also has links to other sites, as does ZNet's Mideast Watch (www.zmag.org/meastwatch/meastwat.htm). Particularly comprehensive is Birzeit University's links page (www.birzeit.edu/links).
Five fine specimens of Meleagris gallopavo--wild turkey to you--wandered onto my property here in Humboldt County, Northern California, a few days ago. I assume they forgot to check the calendar. Under California fish and game regs, you can shoot them legally for two weeks around Thanksgiving. Out came my 12-gauge, and I loosed off a shot that at some 100 feet did no discernible damage, and after a brief bout of what-the-hell-was-that the turkeys continued to forage. A fusillade of two more shots finally brought down a fourteen-pounder. I hung him for four days, plucked him and by Thanksgiving's end he was history. This was all easier than the sporting manuals suggest, in which hunters take enormous trouble to decoy the turkeys with fake gobbles.
Wild turkeys haven't been seen in California since earlier in the Cenozoic era, but in recent years two ranchers in my valley imported a few and now they've begun to appear in our neighborhood in Humboldt County in substantial numbers. I've heard reports of flocks of up to a hundred wild turkeys fifteen miles up the Mattole River around Honeydew, an impressive quantity though still far short of the thousand birds counted in one day by two hunters in New England in the 1630s.
The speed with which New World foods spread across Europe and Asia is astounding. Cortez brought turkeys back to Europe from Mexico, and by the 1530s they were well-known in Germany and England. The Puritans brought domestic turkeys with them to New England, where they gazed out at their wild relatives, offered by the Indians, who regarded them as somewhat second-rate food.
Of course, wild turkeys have many enemies aside from the Beast called Man. There are swaths of Humboldt and Mendocino counties where coyotes and mountain lions now hold near-exclusive sway. Ranchers running sheep used to hold off the coyotes with M-80 poison-gas canisters that exploded at muzzle touch, but these are now illegal, and the alternatives are either trapping, which is a difficult and time-consuming job, or getting Great Pyrenees dogs to guard the flock. But the coyotes are crafty and wait till the sheep have scattered, then prey on the unguarded half.
Gabbing on the phone to my friend Ford Roosevelt, who lives in Los Angeles, I mentioned my turkey kill, and he reacted with revulsion, not so much to the fate of Meleagris gallopavo but to the fact that I have a shotgun at all. I told Ford that it was this sort of city-slicker foolishness that cost Gore states like West Virginia, Tennessee, Arkansas and Ohio. Ford, a grandson of FDR, then disclosed that the Democratic National Committee had asked him to campaign in various states, including West Virginia. "Well Ford, didn't you find that the gun issue was on people's minds?" "Yes, as a matter of fact. I was talking to some miners and they brought it up. I told them that as far as I was concerned, guns should be banned altogether. They weren't pleased." "So it was you, Ford, who lost West Virginia." He didn't seem contrite.
The President has gone to Vietnam,
A smallish country that we used to bomb
But now would like to send our products to.
And so our corporations take the view
That if the country's ruling class has picked
A form of rule that can be somewhat strict,
That's up to them. And Clinton went to say
That there is nothing standing in the way
Of being friends with them forevermore.
Remind me, please: Why did we fight that war?
A quarter-million people thronged Abraham Lincoln's Memorial that day. In the sweltering August humidity, executive secretary Roy Wilkins gravely announced that Dr. William Edward Burghardt Du Bois--NAACP founding father and "senior intellectual militant of his people"--had died in exile the day before.
It's easy to forget. What we now think of, monolithically, as the civil rights movement was at the time a splintering half-dozen special-interest groups in ill-coordinated pitched camps. Thurgood Marshall, never known for tact or political correctitude, called the Nation of Islam "a buncha thugs organized from prisons and financed, I'm sure, by some Arab Group." The NOI viewed the Urban League as a black front for a white agenda. A fringe figure gaining notoriety for his recent Playboy interview with an obscure journalist named Alex Haley, Malcolm X irreverently dismissed both "the farce on Washington" and the young minister just moments away from oratorical immortality, the Rev. Dr. Martin Luther King Jr., as "Bishop Chickenwings."
If the legacy of Du Bois's long life was unclear then, what can it all mean now? What possessed him to renounce the widely coveted citizenship for which those gathered there that day--inspired in part by his example--were marching? What can a scholarly biography of the patron saint of African-American intellectuals--written by a tenured professor for a prestigious publishing house, impatiently awaited by specialists and educated generalists alike--what can all this mean to 101 million eligible nonvoters "entirely ignorant of my work and quite indifferent to it," as Du Bois said in his time, much less to 30 million African-Americans beyond the Talented Tenth and those few old-timers in Harlem who remember Du Bois as being, mostly, a remarkably crotchety old man?
With these mixed feelings of pleasure, gratitude, frustration and momentous occasion, I read the monumentally ambitious sequel, seven years in the making, itself a National Book Award finalist, to David Levering Lewis's Pulitzer Prize-winning Biography of a Race, 1868-1919.
"I remember well," Du Bois wrote, famously, "when the shadow swept across me." He was born "a tangle of New England lineages"--Dutch, Bantu, French Huguenot--within living memory of the Fourteenth Amendment and The Communist Manifesto, one generation removed from slavery. And though he laid claim to both his African and European heritage, still it was a peculiar sensation. "One ever feels his two-ness--an American, a Negro; two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body, whose dogged strength alone keeps it from being torn asunder." Yet Du Bois knew full well that had he not felt, very early on, this double-consciousness, he might easily have become just another "unquestioning worshiper at the shrine of the established social order."
Willie D. charted his course as early as his teens, inaugurating his writing and public-speaking careers with articles in the Springfield Republican and a high school valedictory address on abolitionist Wendell Phillips. He arrived at the Harvard of Santayana and William James, who thought him easily among the most gifted of his students, already notorious for the "arrogant rectitude" others would resent all his life. He graduated cum laude, honing his prose with a rigorously liberal education in Latin, Greek, modern languages, literature, history and philosophy. But for a graduate student in sociology during the 1890s, Max Weber's Berlin, not Cambridge, was the place to be. And it was there, chain-smoking fluent German, celebrating both his 25th birthday and "his own genius," that W.E.B. Du Bois spelled out his life's ambition: "to make a name in science, to make a name in literature, to raise my race." Only because his scholarship ran out did Du Bois return to America for the consolation prize: Harvard's first African-American PhD.
Atlanta, after Europe and the North, came as a shock. Not that the recent lynching was in itself any great surprise. Du Bois simply wasn't prepared, passing by the local grocer, to see the souvenirs of severed fingers on display out front. Headquartered at Atlanta University, for the next twelve years he taught history and economics. By the time Frederick Douglass died in 1895, the Tuskegee model of black higher education was dominant, and Booker T. Washington its leading lobbyist. That same year Washington, whose power had been growing since 1885, had delivered his famous Atlanta Exposition speech: "In all things purely social," he said, holding up both hands, digits spread wide, "we can be as separate as the [five] fingers"--he paused dramatically, clenching each hand into a fist--"yet as the hand in all things essential to mutual progress." Convinced that Washington's appeasement had paved the way for Plessy v. Ferguson in 1896, Du Bois and other black intellectuals felt sold down the river. Du Bois's scathing review of Washington's Up From Slavery (1901), declaring war on merely vocational training of a "contented and industrious peasantry," was collected in The Souls of Black Folk (1903). Du Bois and Washington came, notoriously, to ideological blows. It was the beginning of the end for Booker T. Washington.
Yet there was no personal animus between them. Shrewdly, Washington tried to hire Du Bois away to Tuskegee, even taking him along on one of his fundraising junkets. But once at Andrew Carnegie's office, Washington--who knew where his bread was buttered and that Du Bois could be counted on not to keep his mouth shut--left him waiting downstairs. "Have you," Washington asked, "read Mr. Carnegie's book?" W.E.B. allowed he had not. "You ought to," said Booker T. "Mr. Carnegie likes it."
Around 1909, certain Niagara Movement radicals and Jewish abolitionist holdovers formed a coalition that became the NAACP. Du Bois moved to New York, where, as editor of The Crisis for the next twenty-five years, his word was gospel.
Meanwhile, Marcus Garvey addressed a Harlem crowd of 2,000 in 1917, preaching black economic independence and resettlement. He even offered, to the resurgent Klan's delight, to transport them back to Africa. Now, the masses might be fooled by the plumed and gold-braided pretensions and Napoleonic pageantry of the Emperor Marcus Mosiah Garvey--self-proclaimed High Potentate and Provisional President-General of all Africa, Commander in Chief of the Black Star Line, an entire fleet of three dubiously seaworthy vessels--with his back-to-the-motherland schemes, his dukes and duchesses of Uganda and Niger, his knight commanders of the Distinguished Order of Ethiopia and the Nile. But Du Bois, who had just returned from Firestone's Liberia as diplomatic envoy, knew better. (Besides, everybody who was anybody knew that what Garvey's Universal Negro Improvement Association really stood for was "Ugliest Negroes in America.") As far as Du Bois was concerned, Garvey was either a lunatic or a traitor. Whereas, it seemed to Garvey--who saw Du Bois's NAACP as the National Association for the Advancement of Certain People--that the lunacy was for blacks to expect equality in America. In the end, his daring, energy and charisma were surpassed only by his ignorance of finance. Du Bois sounded the rallying cry: "Garvey Must Go." The FBI agreed. And if deportation on the grounds of being an undesirable alien wouldn't hold up in court, mail fraud would do nicely. Arrested in 1922, tried and convicted in 1923, Garvey took up residence at Atlanta Federal two years before Malcolm X was born.
Remember, back before they were Jim Crowed into academic ghettos, when history was literature and vice versa? When nonspecialists read Macaulay, Michelet? Poet, short-story writer, essayist and novelist as well as historian, Du Bois was by no means master of all the genres he assayed. But he electrified African-American literature as writer during the twentieth century's first decade. Then, as editor, he paved the way for younger writers during subsequent decades. Biography, however, is a late development in the tradition. What advances have eminent African-Americans like David Levering Lewis made in that "most delicate and humane of all the branches of the art of writing"? And do his tomes amount to a "masterpiece of the biographer's craft"?
With their cast of legendary characters, colorful set locations, gripping storylines and virtuoso draftsmanship, they certainly aspire to it. For analytical rigor, judicious gossip and subtle insight into the social, political and economic "roots and ramifications" of "racial, religious, and ethnic confrontation, and assimilation in America" between Reconstruction and the civil rights movement, Lewis is fully equal to the task of his formidable subject. And his lucid, downright old-fashioned good writing, so full of fine flourishes and phrases, is mostly innocent of academic jargon. So much so that for years--visiting the same archives, examining the same documents and cross-examining the same witnesses while working my way carefully through these volumes, underlining passages in mechanical pencil, leaving yellow flags on every other page--I kept trying to figure out my misgivings.
And then it hit me. The problem here is not one of length--Boswell's massive Life of Samuel Johnson still startles, 200 years later--but one of scale, of Turgenev's "right relation" among a dozen or so vivifying narrative elements beyond character and what used to be called "plot." All of these together in a constant juggle of transitions--abstract to concrete, poetic to prosaic, description to dialogue, sentence length and rhythm--can create compelling momentum. Any one of these, overrelied upon in a fact-filled narrative of 1,500 pages, can be lethal. "With the 20th century," said Virginia Woolf,
a change came over biography, as it came over fiction and poetry.... the author's relation to his subject is different. He is no longer the serious and sympathetic companion, toiling slavishly in the footsteps of his hero.... Moreover, he does not think himself constrained to follow every step of the way.... he sees his subject spread about him. He chooses; he synthesizes; in short, he has ceased to be the chronicler; he has become an artist.
Cautious of overstepping the bounds of the historically permissible, the distinguished professor has crafted a straightforward chronicle. Far too often, characters are molded not organically from suggestive situation but by accretion of meticulous archival detail--endless lists of academic pedigree heaped, all at once, in static inventories of naturalistic description--then left to atrophy in the reader's mind. A compelling narrative begins where the dossier leaves off. And a good biographer is a historian, but a good historian isn't necessarily a biographer. The progression from one to the other is no more formally inevitable than that from short-story writer to novelist. But don't get me wrong. The aesthetic quibble is really by way of illustrating how close this life might have come to greatness, to the artistry of all that Lytton Strachey left out in tending toward that "becoming brevity...which excludes everything that is redundant and nothing that is significant," and which, "surely, is the first duty of the biographer."
Du Bois's influence on African-American literature, as both writer and editor, is hard to exaggerate. Between Phillis Wheatley, the publication of Souls, the silence of Charles Chesnutt and the death of Paul Laurence Dunbar from drunken disillusionment in 1906, dozens of poets, authors and pamphleteers emerged, boycotting the happy-blacky-nappy, banjo-strumming, watermelon-eating, darky dialect of previous eras. Of this work, says James Weldon Johnson in the classic history Black Manhattan, "Some was good, most was mediocre, much was bad, and practically all of it unknown to the general public." As late as 1914, with the exception of Johnson's Autobiography of an Ex-Colored Man, there wasn't much in the way of African-American literature, and Du Bois thought things looked bleak. By 1920, New York was America's greatest city, and Harlem--a two-square-mile city within the city where a quarter-million African-Americans boasted more poets, journalists, musicians, composers, actors, dramatists and nightclubs than any other spot on earth--became the world-famous capital of black America. It seemed to Du Bois that a renaissance of American Negro literature was now due.
His lover/literary editor Jessie Fauset, to put the arts on equal footing with social policy, urged an editorial shift in the pages of The Crisis. In short order, she published Langston Hughes's "The Negro Speaks of Rivers" in 1921 and prose poetry by Jean Toomer, later collected in Cane (1923). For the first time in history--just when Du Bois feared he'd have no worthy successors--a literature of African-Americans, by African-Americans and for African-Americans and anyone else who cared to listen was not only a possibility but a reality. The Harlem Renaissance was under way.
One prodigy Du Bois particularly delighted in was pinky-ringed young poet Countee Cullen. Companionable, uncombative, anxious for the kind of credibility a tidy résumé and Harvard degree could confer, Cullen idolized Du Bois to a degree perhaps predictable in a cautious orphan risen from impoverished obscurity to international fame by the age of 22 yet lacking, in the final analysis, the kind of intellectual and artistic daring that could sustain it. Du Bois, for his part, perhaps projected onto Cullen some of the paternal pride and ambition long buried with the infant son he'd loved and lost. And so he married off his only daughter. Langston Hughes rented a tuxedo, an organist played Tannhäuser and sixteen bridesmaids wore white. The only problem--aside from the fact that Countee Cullen was gay--was that the girl admired but didn't love him. It was a match made in Hell, a dramatic example of how "spectacularly wrongheaded" Du Bois could be.
For a decade or more, the Harlem Renaissance promised 10 million African-Americans "taken for granted by one political party and despised by the other, poor and overwhelmingly rural, frightened and disunited," the illusion of an era of freedom, justice and equality undreamed of since Reconstruction. To his immense credit, Du Bois was not lulled into submission, mistrusting the impulse toward "salon exotica" and a smattering of prizes for prodigies. Then as now, the means of production--the Hollywood studios, the recording studios, the theaters--were for the most part white-owned. As early as 1926, he warned about "the politics of patronage," challenging that African-Americans would get the art that they deserved--or were willing to pay for: "If a colored man wants to publish a book, he has to get a white publisher and a white newspaper to say it's great; and then [black people] say so." (Ain't a damn thang changed.) By 1934 it had become embarrassingly clear that civil rights would not follow logically from "forceful prose" and a demonstration of artistic excellence on the part of a few Ivy League Negroes. The movement was dead, "scuttled," as chief publicist Alain Locke put it, as much from within as from without, by faddish market swings and the stock speculations of Zora Neale Hurston's Niggerati, on the one hand, and the liberal Negrotarians on the other.
For Du Bois, as for most African-Americans, the Depression hit harder and faster and lasted longer than for the country at large. The royal wedding had wiped out his savings, and his Crisis salary hadn't been paid for months. He was broke.
Du Bois became increasingly radicalized during the 1930s and '40s. As he saw it, the NAACP, by focusing almost exclusively on legal strategy, was beginning to work "for the black masses but not with them." In 1934, out of sync with the mainstream leadership, he left in disgust. He returned to Atlanta University, reading Das Kapital and writing Black Reconstruction in America (1935). Du Bois, who first visited the Soviet Union in 1926, returned in 1936. Home from History's frontlines a self-professed "Bolshevik," even though, as a Socialist, he combined "cultural nationalism, Scandinavian cooperativism, Booker Washington and Marx in about equal parts," Du Bois remained unconvinced that the Communist Party, which never attracted more than a few hundred black members, was their last best hope. In any case, African-Americans did not "propose to be the shock troops of the Communist Revolution."
During the McCarthy era, the black leadership, bending in the prevailing ideological winds, began to distance itself from the left. Back in New York, involved in nuclear disarmament activity declared subversive by the US government, Du Bois was arrested and tried as an unregistered agent of a foreign power. He was acquitted in 1951, but the State Department confiscated his passport, prohibiting travel abroad. It was the last straw.
The prophet was without honor only in his own country. So when the government embargo was lifted in 1958, Du Bois went on lecture tours of Eastern Europe and the Soviet Union, becoming a kind of poster boy in the Communist effort to discredit the States. He was awarded the Lenin Peace Prize in 1959, and in Red China, his birthday was declared a national holiday by Chou En-lai. Did the party use Du Bois? Or did Du Bois use the party to further his own agenda? Both, most likely.
In 1960, seventeen African states, including Kwame Nkrumah's Ghana, gained independence. At Nkrumah's invitation, Du Bois exiled himself, renouncing his American citizenship. He officially joined the Communist Party in 1961. Shrunken now and a bit stooped, his memory not quite as sharp as it once was, the scholar-citizen spent his last days in a spacious house with a view of flowering shrubs in Accra's best neighborhood, an honored guest of state, surrounded by busts of Lenin and Chairman Mao and an impressive library of Marxist thought, editing the Negro encyclopedia and receiving visitors the world over. At last, on August 27, 1963, the visionary whose long life--spanning Reconstruction, Plessy v. Ferguson, two World Wars, Brown v. Board of Education and now the civil rights movement--had been the literal embodiment of the nineteenth century's collision with the twentieth, died in Accra, where he was accorded an elaborate state funeral.
The bioepic ends, as it began 1,500 pages ago in Volume I, with the death of W.E.B. Du Bois. A living institution, he was "productive, multiple, controversial, and emblematic." His influence--as cultural ambassador, as writer and editor, as activist whose spectrum of social, political and economic thought seems refracted in phenomena as varied as Ho Chi Minh, the Negritude of poet-statesmen Aimé Césaire and Léopold Senghor as well as the Black Power movement that peaked after his death--is ubiquitous.
A difficult man as capable of coldness to old friends as he was reluctant to admit mistakes, a prickly Brahmin who walked with kings but failed to acquire the common touch, Dr. Du Bois emerges a kind of tragic hero as flawed as he was gifted. At times you wonder whether he wasn't his own most formidable enemy. But whatever his blind spots, he was only too well aware, looking backward, that battling racism real and imagined at every turn had twisted him into a far less "human" being than he might otherwise have been.
Fifteen years and two computer crashes in the research and writing, these volumes were a lifetime, literally, in the making. As a boy born in Little Rock two decades before the civil rights movement began, Lewis had a portentous encounter with the great man. Fisk man and author of books on South Africa and the Dreyfus Affair, he's now a professor of history at Rutgers. And just as Renaissance scholarship would be incomplete without When Harlem Was in Vogue, the twenty books and 100 articles of W.E.B. Du Bois's eighty-year publishing career, so handsomely anthologized in Nathan Irvin Huggins's Library of America Writings, are indispensably complemented by what is, if not a masterpiece of biography, then almost certainly the standard social, political and intellectual history of his life and times.
"Simone de Beauvoir said 'Books saved my life.' I think that's true for me," announced Gloria Whelan in accepting her National Book Award recently for Homeless Bird (which won for Young People's Literature). It was a refreshing zenith in the remarks that evening, and I suspect that what she said holds true for many of us--or that books save us from a certain type of life, anyway, one more arid and circumscribed than we'd prefer. They help us create who we are, in a kind of secular but still miraculous transubstantiation. And who we are--how we determine the nature of that--is a question you will find running like a highway stripe through the essays assembled here.
Are we dispassionate scientists or self-interested exploiters of the less fortunate, whether on the individual or state level? Patrick Tierney's Darkness in El Dorado reaches one conclusion, reviewer Greg Grandin another, slightly askew from Tierney. Does divorce cause long-term damage to children? Andrew Cherlin, some of whose own research has been used by others to support the idea that it does, has a less ominous view in discussing Judith Wallerstein's conclusions. And what is the inescapable bias in reporting on each other, in any respect? Longtime Saul Bellow friend Richard Stern contemplates the question as spurred by James Atlas's new bio of the Nobelist. Peter Schrag offers a variation on the theme while assessing Richard Ben Cramer's life of American icon Joe DiMaggio. When told the hero worship "was always about us," Schrag retorts, "Of course it was always about us; what else could it be about?"
Michelle Jensen begins her overview of Third Wave feminism and the Manifesta of Jennifer Baumgardner and Amy Richards by putting a different twist on the question, noting that, so far, works representing the Third Wave have been personal accounts too much about "us," which leaves one thirsting for a theoretical grounding. And academic theory is invoked again, this time from the classics, Georgette Fleischer reports, in Judith Butler's revisitation of the story of Antigone; she uses the tale to refract out--or perhaps in--a perspective for sexual "outsiders." And through what sort of prism are we to filter a historian's self-history? Paul Buhle considers Arthur Schlesinger Jr.'s beginnings, innocent or otherwise.
Elsewhere in the issue, faith in the transformative prospects of the word may be most evident in Rimbaud's conviction that his poetry would change the world, or in Orwell's more blatantly political reporting, or in W.E.B. Du Bois's double-header life as both political and literary powerhouse. Margaret Atwood and Eduardo Galeano, of course, have spent a lifetime tracing our silhouettes through language--as has Jules Feiffer with his pen and wry sense of paradox.
Last but not least, we come to the issue of who we are in a literal sense, here at The Nation. We take this opportunity to welcome Hillary Frey, who has joined our staff as assistant literary editor. She was formerly managing editor of Lingua Franca. We hope you enjoy the issue.
To judge from magazine covers, the American divorce rate is either a disaster for children or no problem at all. First came the famous "Dan Quayle Was Right" article in The Atlantic in 1993, with a cover line that said divorce "dramatically weakens and undermines our society." Then, in 1998, Newsweek heralded The Nurture Assumption, whose author, Judith Rich Harris, argued that whether parents divorce makes little difference in children's lives because genetics and peer groups determine their problems. This September, Time featured The Unexpected Legacy of Divorce, a book whose authors, psychologists Judith Wallerstein and Julia Lewis and journalist Sandra Blakeslee, brought the gloomy news that a majority of children of divorce still suffer twenty-five years later. What's a reader to think?
The facts are not in dispute: The American divorce rate doubled in the 1960s and 1970s and has held steady or possibly declined a bit since then. At current rates, about half of all marriages will end in divorce. One million children experience a parental divorce every year. Most of them are upset in the immediate aftermath of the breakup. Some act out, others become withdrawn. Too often, fathers fail to provide adequate financial support and mothers and children see their standards of living drop. Without doubt, going through a divorce is a traumatic experience for parents and children alike.
But are most children harmed in the long term? On this question, recent media coverage has lurched between two extremes. At one end is the doomsday view that divorce sentences children to life at emotional hard labor. In this view, a parental divorce starts a chain of events that leaves most adult children anxious, unhappy and often unable to make a commitment to a partner. At the other end is the evolutionary psychologists' view that children's behavior is genetically programmed, so that whether parents divorce doesn't matter very much. According to this line of reasoning, divorce is just a flag that identifies genetically challenged families whose troubles would have occurred even if the parents had stayed together.
Put this starkly, neither extreme seems convincing. Yet it is a sad fact of public debates about social problems that the extremes tend to capture everyone's attention. Magazines are sold and talk shows are fueled by the announcement that a particular problem is devastating American society and then by the news that--wait a minute--it's really not a big problem after all. There's little patience for discussions of problems that are serious but not calamitous. And yet the gravity of many social problems lies in the demilitarized zone between the extremes.
For example, consider teenage childbearing. It was initially declared a scourge. A leading researcher wrote famously in 1968, "The girl who has an illegitimate child at the age of 16 suddenly has 90 percent of her life's script written for her." More recently, however, some researchers and commentators have argued that most teenage mothers would not be better off had they delayed having children. Teenage childbearing, it is alleged, merely reflects growing up in disadvantaged circumstances. Poor teen mothers would still be poor even if they hadn't had their babies. While there is some merit to this argument, research suggests that having a baby as a teenager does add to the difficulties girls from disadvantaged backgrounds face.
Research on divorce also suggests that extreme views are inaccurate. But you wouldn't know it to read the latest report by Wallerstein and her colleagues on her long-term study of children of divorce. In 1971, she selected sixty families that had been referred by their attorneys and others to her marriage and divorce clinic in Marin County, California, shortly after the parents separated. Wallerstein kept in touch with the 131 children from these families. Her book on the first five years, Surviving the Breakup: How Children and Parents Cope with Divorce, written with Joan Berlin Kelly, contained insightful portraits of the difficulties the children faced as their parents struggled with the separation and its aftermath. Her book about how they were doing at the ten- and fifteen-year mark, Second Chances: Men, Women, and Children a Decade After Divorce, written with Sandra Blakeslee, became a bestseller. It chronicled the continuing problems that most of the children were having.
For her new book, she was able to talk to ninety-three of the children at the twenty-five-year mark. Her striking conclusion is that most of these individuals, now 33 years old on average, have suffered greatly in adulthood. A minority have managed to construct successful personal lives, but only with great effort. The legacy of divorce, it turns out, doesn't fade away:
Contrary to what we have long thought, the major impact of divorce does not occur during childhood or adolescence. Rather, it rises in adulthood as serious romantic relationships move center stage. When it comes time to choose a life mate and build a new family, the effects of divorce crescendo.
Young adults from divorced families, Wallerstein writes, lack the image of an intact marriage. Because they haven't had the chance to watch parents in successful marriages, they don't know how to have one. When it comes time to choose a partner or a spouse, their anxiety rises; they fear repeating the mistakes of their parents. Lacking a good model, they tend to make bad choices. (In the realm of work, in contrast, Wallerstein's subjects had no particular problems.)
A woman who took the role of caregiver to a distraught parent or to younger siblings while growing up, for instance, may choose a man who needs lots of caring in order to function. But she soon finds his neediness and dependency intolerable, and the relationship ends. Wallerstein writes of one such woman in her study:
She described how she would come home after work and find her partner lying on the couch, waiting for her to take charge. It was just like taking care of her mom. At that point, she realized she had to get out.
Young men, Wallerstein tells us, were wary of commitment because they were afraid their marriages would end as badly as their parents' had. Many avoided casual dating and led solitary lives. She tells the story of Larry, who after courting and living with Grace for seven years still could not bring himself to marry her. Not until she packed up and left in frustration did he agree. He told Wallerstein:
I realized I loved her and that she was important to me but I was unable to make a decision. I was afraid because of the divorce. I was afraid of being left and I think that is why I was afraid of making a commitment to her.
Other children in the study turned to alcohol, drugs and, particularly among girls, early sexual activity. Wallerstein writes that sexual promiscuity was a result of girls' feelings of abandonment by their fathers. Their low self-esteem, their craving for love and their wish to be noticed led them to seek sexual liaisons and sometimes to start ill-conceived partnerships and marriages.
Overall, we are told, close to half the women and over one-third of the men were able to establish successful personal lives by the twenty-five-year mark--but only after considerable pain and suffering, much anxiety about repeating the mistakes of their parents, many failed relationships and, for one-third, psychotherapy. The rest were still floundering. Only 60 percent had ever married, compared with about 80 percent among all adults at their ages. Moreover, only one-third had children, as if they were afraid of doing to children what had been done to them.
Without doubt, a disturbing picture. And what makes it even more disturbing is Wallerstein's claim that her subjects are more or less representative of the typical American middle-class family that undergoes a divorce. Her families were carefully screened, she assures us, so that the children were doing "reasonably well" at school and had been developmentally "on target" before the divorce. Nor were the families especially troubled before the breakup, she says. "Naturally," Wallerstein writes, "I wanted to be sure that any problems we saw did not predate the divorce. Neither they nor their parents were ever my patients."
This claim to have a sample of typical, not unduly troubled, families is, however, contradicted by the extensive psychological problems that the parents displayed when they were assessed at the initial interview. But you won't find that information in this book or the previous one. Only in the appendix to her first book, Surviving the Breakup, in 1980, does Wallerstein discuss the parents' mental states. There we learn the startling information that 50 percent of the fathers and close to half the mothers were "moderately disturbed or frequently incapacitated by disabling neuroses or addictions" when the study started:
Here were the chronically depressed, sometimes suicidal individuals, the men and women with severe neurotic difficulties or with handicaps in relating to another person, or those with long-standing problems in controlling their rage or sexual impulses.
And that's not all: An additional 15 percent of the fathers and 20 percent of the mothers were found to be "severely troubled during their marriages." These people "had histories of mental illness including paranoid thinking, bizarre behavior, manic-depressive illnesses, and generally fragile or unsuccessful attempts to cope with the demands of life, marriage, and family."
Typical American middle-class families? Hardly. These were by and large troubled families of the kind one might expect to come to a divorce clinic for therapy. Why this information was excluded from the nine-page appendix on the research sample in the new book--why an interested reader can only find it buried in the appendix of a book written twenty years ago--is puzzling. Does Wallerstein now consider this information to be in error? Irrelevant? Or just embarrassing?
The problem for Wallerstein is that troubled families often produce troubled children, whether or not the parents divorce. So it may be a considerable overstatement to blame the divorce and its aftermath for nearly all the problems she saw among her children over the twenty-five years. In a study of the records of several thousand British children who were followed from birth to age 33, Lindsay Chase-Lansdale, Christine McRae Battle and I found that children whose parents would later divorce already showed more emotional problems at age 7 than children from families that would remain together. The gap widened as the divorces occurred and the children reached adulthood, suggesting that divorce did have a detrimental long-term effect on some of them. But a large share of the gap preceded the divorces and might have appeared even had the parents stayed together.
Sensitive to the particularities of her sample, Wallerstein recruited a "comparison sample" of adults from nondivorced families. The comparison sample, we are told, was selected to match the socioeconomic level of the families in the study. In many respects, the individuals in the comparison group were doing better than the study's children, which Wallerstein presents as evidence that divorce really is the cause of the difficulties in the latter group. But since the comparison sample presumably was not matched on the parents' chronic depression, suicidal tendencies, problems in controlling rage, bizarre behavior and manic-depressive illness, their inclusion does not prove Wallerstein's case.
What, then, can we take from Wallerstein's study? It is an insightful, long-term investigation of the lives of children from troubled divorced families. It gives us valuable information on what happens to children when things go wrong before and after a divorce. And things sometimes do go wrong: Many divorcing parents face the kinds of difficulties that Wallerstein saw in her families. Her basic point that divorce can have effects that last into adulthood, or even peak in adulthood, is valid. She was one of the first people to write about children who seemed fine in the short term but experienced emotional difficulties in adolescence or young adulthood--in her previous book she called this the "sleeper effect"--and now she is the first to describe it in detail among adults who have reached their 30s. Psychotherapists, social workers, teachers and other professionals who see troubled children of divorce and their parents will find her analyses instructive. Parents and children who are struggling with divorce-related problems will find them helpful.
But no one should believe that the negative effects of divorce are as widespread as Wallerstein claims. Some portion of what she labels as the effects of divorce on children probably wasn't connected to the divorce. And the typical family that experiences divorce won't have as tough a time as Wallerstein's families did. Parents with better mental health than this heavily impaired sample can more easily avoid the worst of the anger, anxiety and depression that comes with divorce. They are better able to maintain the daily routines of their children's home and school lives. Their children can more easily avoid the extremes of anxiety and self-doubt that plague Wallerstein's children when they reach adulthood.
What divorce does to children is to raise the risk of serious long-term problems, such as severe anxiety or depression, having a child as a teenager or failing to graduate from high school. But the risk is still low enough that most children in divorced families don't have these problems. In the British study, we found that although divorce raised the risk of emotional problems in young adulthood by 31 percent, the vast majority of children from divorced families did not show evidence of serious emotional problems as young adults.
Except for Wallerstein, many of the writers most concerned about divorce now appear to recognize this distinction. Barbara Dafoe Whitehead, who wrote the "Dan Quayle Was Right" piece in The Atlantic (drawing heavily on Wallerstein's earlier work), acknowledged in a more recent, book-length treatment, The Divorce Culture, that a majority of children probably aren't seriously harmed in the long term. But she argued that even if only a minority of children are harmed, divorce is so common that a "minority" is still a lot. And she is correct. Divorce is not a problem that "dramatically weakens and undermines our society," but it nevertheless deserves our attention.
For that reason, some of the remedies Wallerstein suggests would be useful: creating more support groups in schools for children whose parents are divorcing, ensuring that divorced fathers contribute to the cost of their children's college education and educating newly separated parents about how to shield their children from conflict. Measures such as these would help some children without imposing undue strain on parents, schools or the courts.
Less clearly useful is Wallerstein's recommendation that parents in unhappy, loveless, but low-conflict marriages consider staying together for the sake of their children. I think she is probably right that children can develop adequately in "good enough" marriages that limp along without an inner life of love and companionship. There were millions of these marriages during the baby-boom years of the 1950s, when wives weren't supposed to work and women were forced to choose between having a career and being a mother. The result was often frustration and depression. Few people (not even Wallerstein) want to constrain women's choices again. Certainly, unhappy parents have an obligation to try hard to change an unsuccessful marriage before scuttling it. Without doubt some parents resort to divorce too hastily. But no one as yet has a formula that can tell parents how much pain they must bear, how much conflict to endure, before ending a marriage becomes the better alternative for themselves and their children.
Least defensible is the attempt by Wallerstein to inform readers whose parents have divorced that their problems with intimacy stem from the breakup. In high self-help style, Wallerstein tells her readers:
You were a little child when your parents broke up, and it frightened you badly, more than you have ever acknowledged.... When one parent left, you felt like there was nothing you could ever rely on. And you said to yourself that you would never open yourself to the same kinds of risks. You would stay away from loving. Or you only get involved with people you don't care about so you won't get hurt. Either way, you don't love and you don't commit.
And so forth. Wallerstein plants the seed of the pernicious effect of exposure to divorce as a young child--and then waters it. Yes, the reader thinks, that must be why I'm so anxious about getting married. Never mind that making a commitment to marry someone is anxiety-producing for young adults from any background. Or that we live in an era when the average person waits four to five years longer to marry than was the case a half-century ago. Wallerstein encourages readers to believe that most of their commitment problems stem from their parents' divorces. But parental divorce isn't that powerful, and its effects aren't that pervasive. To be sure, it raises the chances that children will run into problems in adulthood, but most of them don't. Unfortunately, that's a cover line that doesn't sell many magazines.
Judith Butler, who is a Maxine Elliot Professor of Rhetoric and Comparative Literature at the University of California, Berkeley, is a troublemaker. She announced as much when she arrived on the critical feminist scene in her second and best-known work, Gender Trouble: Feminism and the Subversion of Identity, first published in 1990:
Contemporary feminist debates over the meanings of gender lead time and again to a certain sense of trouble, as if the indeterminacy of gender might eventually culminate in the failure of feminism. Perhaps trouble need not carry such a negative valence. To make trouble was, within the reigning discourse of my childhood, something one should never do precisely because that would get one in trouble. The rebellion and its reprimand seemed to be caught up in the same terms, a phenomenon that gave rise to my first critical insight into the subtle ruse of power: The prevailing law threatened one with trouble, even put one in trouble, all to keep one out of trouble. Hence, I concluded that trouble is inevitable and the task, how best to make it, what best way to be in it.
In the 149 dense pages that follow this preface, Butler took on a host of psychoanalytic theorists, from "Freud and the Melancholia of Gender" to "Lacan, Riviere, and the Strategies of Masquerade." She also critiqued "The Body Politics of Julia Kristeva" (who uses semiotics in the service of psychoanalytic critique) and "Monique Wittig: Bodily Disintegration and Fictive Sex," whose The Lesbian Body and other works are, according to Butler, limited by Wittig's humanism. In Gender Trouble, Butler's admiration is reserved for Michel Foucault, the openly gay philosopher of power most famous for his History of Sexuality and Discipline and Punish, a philosopher whose terms are evident in Butler's preface above: "the reigning discourse of my childhood"; "rebellion and its reprimand seemed to be caught up in the same terms"; "the subtle ruse of power." Butler's genealogical critique of gender, i.e., a critique of gender's very origins, a critique of the very terms of the critique, was a grand synthesis of the most radical European ideas about sexuality and sexual identity. Simone de Beauvoir's famous statement in The Second Sex that one is not born but rather becomes a woman is a conceptual starting point, but only a starting point. Foucault's work on the journals of Herculine Barbin, a nineteenth-century hermaphrodite so tortured by his/her predicament in a sexually normative world that s/he commits suicide, enables Butler's challenge not only to the categories of gender but to the categories of sex itself. But what stands head and shoulders above Butler's illustrious collection of radical theories is Gender Trouble's overarching claim that gender, and possibly even sex itself, is not an expression of who one is but rather a performance.
Toward the end of Gender Trouble, Butler poses a set of questions that indicate the practical, political direction of her critique:
What performance where will invert the inner/outer distinction and compel a radical rethinking of the psychological presuppositions of gender identity and sexuality? What performance where will compel a reconsideration of the place and stability of the masculine and the feminine? And what kind of gender performance will enact and reveal the performativity of gender itself in a way that destabilizes the naturalized categories of identity and desire?
Not only did Gender Trouble immediately appear on feminist-theory syllabuses around the country, it became a foundational text of queer theory. Is it any wonder it provoked a backlash?
Antigone's Claim: Kinship Between Life and Death is a slender, very well-written book that is the published version of the Wellek Library Lectures Butler gave at the University of California, Irvine, in May 1998. Butler starts out:
I began to think about Antigone a few years ago as I wondered what happened to those feminist efforts to confront and defy the state. It seemed to me that Antigone might work as a counterfigure to the trend championed by recent feminists to seek the backing and authority of the state to implement feminist policy aims. The legacy of Antigone's defiance appeared to be lost in the contemporary efforts to recast political opposition as legal plaint and to seek the legitimacy of the state in the espousal of feminist claims.
Butler's study of Antigone led her someplace she had not anticipated. Rather than view Antigone as the figure who defies the state in the person of her uncle, Creon the King, who has forbidden her to bury her brother Polyneices--"I say that I did it and I do not deny it"--Butler follows some of her own most important intellectual mentors, namely, the Enlightenment philosopher and founder of dialectics, Georg Wilhelm Friedrich Hegel, and the poststructuralist psychoanalysts Jacques Lacan and Luce Irigaray, in viewing Antigone "not as a political figure, one whose defiant speech has political implications, but rather as one who articulates a prepolitical opposition to politics, representing kinship as the sphere that conditions the possibility of politics without ever entering into it." Butler is interested in Antigone as a liminal figure between the family and the state, between life and death (this is the choice she must make, and in her defiance of Creon she chooses the latter), but also as a figure, like all her kin, who represents the nonnormative family, a set of kinship relations that seems to defy the standard model.
In addition, there is a contemporary occasion for Antigone's Claim, one that is elucidated in Butler's new preface to the tenth-anniversary edition of Gender Trouble, in which she declares her interest in "increasing the possibilities for a livable life for those who live, or try to live, on the sexual margins." I do not think it amiss to describe Antigone's Claim as dedicated to those who try to die on the sexual margins. Though directly referred to only occasionally in her text, it is the specter of death as a result of AIDS that haunts Antigone's Claim, and the particular dilemma AIDS presents to those who live and die outside the boundaries of normative family and kinship relations. Toward the end of the third and final chapter, "Promiscuous Obedience," Butler states:
For those relations that are denied legitimacy, or that demand new terms of legitimation, are neither dead nor alive, figuring the nonhuman at the border of the human. And it is not simply that these are relations that cannot be honored, cannot be openly acknowledged, and cannot therefore be publicly grieved, but that these relations involve persons who are also restricted in the very act of grieving, who are denied the power to confer legitimacy on loss.
The outlines of the troubled Theban family are well-known. Oedipus Rex, actually written after Antigone (442 BCE) though its action precedes it, begins with the problem of a plague. As a priest informs us:
A blight is on the fruitful plants of the earth,
A blight is on the cattle of the fields,
a blight is on our women that no children
are born to them; a God that carries fire,
a deadly pestilence, is on our town,
strikes us and spares not, and the house of Cadmus
is emptied of its people while black Death
grows rich in groaning and in lamentation.
Soon, of course, we learn what the trouble is, when the blind seer Teiresias informs Oedipus the King, "You are the land's pollution." Unwittingly, the man has murdered his own father during an altercation at a crossroads, wedded his own mother and produced four offspring who are in fact his half-siblings. This unbearable truth causes his wife and mother Jocasta to hang herself in the polluted bedchamber, where afterward Oedipus tears the brooches from her robe in order to blind his own eyes. Toward the end of Antigone's Claim, Butler raises an issue that supports my reading of the book's contemporary occasion: "Consider that the horror of incest, the moral revulsion it compels in some, is not that far afield from the same horror and revulsion felt toward lesbian and gay sex, and is not unrelated to the intense moral condemnation of voluntary single parenting, or gay parenting, or parenting arrangements with more than two adults involved (practices that can be used as evidence to support a claim to remove a child from the custody of the parent in several states in the United States)."
In Oedipus at Colonus (401 BCE), the middle play of the trilogy but written last, an old, blind Oedipus is led onstage by his daughter Antigone. (Sigmund Freud, who did so much for the Oedipus myth, referred at the end of his life to his daughter and fellow psychoanalyst Anna Freud as his "Antigone.") Here, the theme of proper burial, so important in Antigone and in Antigone's Claim, receives advance treatment. Oedipus begs of Theseus, King of Athens, a proper burial when he dies, that Theseus accept "the gift" of his "beaten self: no feast for the eyes." The oracle has prophesied that if Oedipus's sons do not tend his corpse, Thebes will be conquered by Athens, and Oedipus wants revenge on his sons because they drove him into exile from Thebes. When Polyneices makes an appearance toward the end of Oedipus at Colonus, Oedipus not only rejects his son's plea to join his side against his other son, presently in possession of Thebes, he curses them both, a curse that comes to pass between the action of Oedipus at Colonus and Antigone, when in battle both brothers die at once on the other's sword. Polyneices' final words in the trilogy are spoken at the end of Oedipus at Colonus to his beloved sister Antigone, to whom he offers a blessing if she will honor his corpse with burial rites. And here we have arrived at Antigone and Antigone's Claim.
From the start of her career, Judith Butler has been on a quest for a theory of the subject that might work for "those who live, or try to live, on the sexual margins." As she stated in her new preface to the recent reissue of her first book, Subjects of Desire: "In a sense, all of my work remains within the orbit of a certain set of Hegelian questions: What is the relation between desire and recognition, and how is it that the constitution of the subject entails a radical and constitutive relation to alterity?" Hegel's Phenomenology of Spirit has underwritten most of Butler's work, as has the work of Lacan, whose seminar on "The Ethics of Psychoanalysis" is the other major influence on Antigone's Claim. The book both follows from Butler's earlier work and turns in some interesting new directions; namely, it moves explicitly into the realm of ethics and implicitly into practical politics.
While Butler has tended in the past to focus particularly on the section of the Phenomenology of Spirit that deals with the famous "Lordship and Bondage" relation, in Antigone's Claim she makes what seems like an inevitable advance in the text, given the confluence of her present interests, into the section of the Phenomenology that deals with "the true Spirit: The ethical order." In this part, Hegel argues that it is the "Family" that "as the element of the nation's actual existence...stands opposed to the nation itself; as the immediate being of the ethical order, it stands over against that order which shapes and maintains itself by working for the universal; the Penates [household gods] stand opposed to the universal Spirit." For Hegel, it is woman who is associated with these household gods that stand opposed to the universal Spirit or the state; it is woman who is associated with the divine, as opposed to the human, law. The figure of Antigone upholds the divine law when she buries her brother Polyneices (twice) in defiance of her uncle Creon, who has ordered that the corpse of a man who threatened the integrity of the state will be left to rot in the sun, torn by beasts and birds.
Butler's affinities with a philosophical tradition arising from Hegel--the Frankfurt School of neo-Marxist philosophers and social critics (though she rarely if ever refers to them in her work)--are not limited to her use of difficult language, which notoriously won her a Bad Writing Award from the journal Philosophy and Literature. Butler shares with the Frankfurt School a fundamental, one might say foundational, debt to the Hegelian dialectic, which Marx harnessed in his theories of history. Hegel explains his dialectic in the Preface to the Phenomenology of Spirit:
Knowledge is only actual, and can only be expounded, as Science or as system; and furthermore, that a so-called basic proposition or principle of philosophy, if true, is also false, just because it is only a principle. It is, therefore, easy to refute it. The refutation consists in pointing out its defect; and it is defective because it is only the universal or principle, is only the beginning. If the refutation is thorough, it is derived and developed from the principle itself, not accomplished by counter-assertions and random thoughts from outside.
The Hegelian dialectic is a philosophical tradition a classical liberal humanist like Martha Nussbaum does not, apparently, have much sympathy for. It's unfortunate that Nussbaum did not take on this philosophical difference in attacking Butler in The New Republic last February. Instead of accepting the work as being of a tradition "that seeks to provoke critical examination of the basic vocabulary of the movement of thought to which it belongs," in Butler's self-characterization, Nussbaum isolates her as a philosopher:
Butler gains prestige in the literary world by being a philosopher; many admirers associate her manner of writing with philosophical profundity. But one should ask whether it belongs to the philosophical tradition at all, rather than to the closely related but adversarial traditions of sophistry and rhetoric.
According to Nussbaum, Butler is a "new symbolic type" of feminist thinker, influenced by a lot of French "postmodernist" ideas. In Nussbaum's vision, Butler is the Pied Piper of academia, traipsing off with all the "young feminists" behind her. Not only does Nussbaum claim that Butler's ideas are philosophically soft (if they are even philosophy at all), but she claims that Butler is leading a trend away from engaged feminism, having traded "real politics" for "symbolic verbal politics." The "new feminism" of Judith Butler "instructs its members that there is little room for large-scale social change, and maybe no room at all." From here, Nussbaum stoops to condescension ("In public discussions, she proves that she can speak clearly and has a quick grasp of what is said to her") and, ultimately, after several swipes at Butler's "sexy acts of parodic subversion," to the astonishing claim that Butler "purveys a cruel lie, and a lie that flatters evil by giving it much more power than it actually has": Butler's "hip quietism," according to Nussbaum, "collaborates with evil."
In Antigone's Claim, it is not only Antigone's public grief over Polyneices and her insistence that she bury him that absorbs Butler's interest but also the way in which her defiance of Creon, her condemnation to death and the taking of her own life (like her mother, Jocasta, she hangs herself) "fails to produce heterosexual closure for that drama"--if Antigone had complied, she would have married Creon's son and presumably become a mother. This, Butler claims, "may intimate the direction for a psychoanalytic theory that takes Antigone [as opposed to Oedipus] as its point of departure," namely, a psychoanalytic theory that would step outside the confines of compulsory heterosexuality.
And yet Butler's attraction to this particular family drama goes further back. While for Freud and for Lacan after him the Oedipal drama is a paradigm that in various ways instates, by way of prohibition, normative heterosexuality and kinship relations, Butler views this drama differently. In its deviations from the law and in its apparent need for prohibition, the most famous Theban family represents not just the predicament of those who live on the sexual margins but in a more historical sense, the family and kinship relations of our times:
Consider that in the situation of blended families, a child says "mother" and might expect more than one individual to respond to the call. Or that, in the case of adoption, a child might say "father" and might mean both the absent phantasm she never knew as well as the one who assumes that place in living memory. The child might mean that at once, or sequentially, or in ways that are not always clearly disarticulated from one another. Or when a young girl comes to be fond of her stepbrother, what dilemma of kinship is she in? For a woman who is a single mother and has her child without a man, is the father still there, a spectral "position" or "place" that remains unfilled, or is there no such "place" or "position"?... And when there are two men or two women who parent, are we to assume that some primary division of gendered roles organizes their psychic places within the scene, so that the empirical contingency of two same-gendered parents is nevertheless straightened out by the presocial psychic place of the Mother and Father...that every psyche must accept regardless of the social form that kinship takes?
Butler sees in the Oedipal story an allegorical reflection of things as they presently are; what if, rather than prohibiting such things, we took them as our starting point; what if we accepted the nonnormative? Second, Butler wants to move the fulcrum of the drama a generation forward because Antigone occupies a position not only between life and death, and not only between private and public, between the family and the state: Antigone figures for Butler a desirable transition into the world of ethics that does not forget familial origins. This is made clear in Antigone's extended exit speech, one Butler focuses especially on. On the point of being led away to her death, Antigone argues that her brother Polyneices is irreplaceable and therefore had to be honored by her even though it means her own death. A husband or a child could have been replaced, she reasons, but with her parents dead, a brother never could be.
What is the law that lies behind these words?
One husband gone, I might have found another,
or a child from a new man in first child's place,
but with my parents hid away in death,
no brother, ever, could spring up for me.
Such was the law by which I honored you.
In this speech one senses that Antigone is finally at peace. For she, like the rest of her family, is characterized as much by her personal moral sense as she is by her strange kinship predicament. And one senses in Butler's interest in these lines a homage to those who have lived, or have tried to live, and to those who have died "on the sexual margins."
A few years back, critics of postmodernism, both left and right, chuckled at the academic sting pulled on the journal Social Text when it published Alan Sokal's bogus article on the socially constructed nature of nature. For conservatives, that the journal ran Sokal's fuzzy call for a progressive postmodern science confirmed the fundamental divide between the politicized humanities and the objective sciences--proof positive of cultural studies run amok. In all the discussion that followed, however, little notice was paid to the origins of post-World War II radical critiques of science. In the shadow of Hitler and Stalin and in the wake of the Vietnam War, theorists from Theodor Adorno to Donna Haraway have been concerned with the ways in which science has colluded with acts of barbarism.
Patrick Tierney's Darkness in El Dorado examines the tragic consequences of medical and social science research on the Venezuelan Yanomami and reminds us why scientific practices and theories should indeed be the domain of social critics. White scientists in the jungle have long been central characters in the stories the West tells about itself. Alongside Humboldt and Mengele, Tierney's book now adds to the tropical pantheon James Neel, founder of the University of Michigan's human genetics department, and Napoleon Chagnon, perhaps the world's most infamous living anthropologist.
Well before Darkness's publication, Tierney's most damning charge--that Neel and Chagnon provoked, perhaps knowingly, a fatal 1968 measles epidemic responsible for "hundreds, perhaps thousands" of deaths--had created a scandal that threatens to distract from the real significance of his research. The Chronicle of Higher Education reported that the book may create a crisis "unparalleled in the history of anthropology." At a special American Anthropological Association forum in mid-November, defenders of Neel charged libel and politicized agendas. One panelist proclaimed that Tierney's "anti-science views" would jeopardize future vaccine efforts and lead to more deaths from disease. Chagnon, evoking the terms of the Sokal affair, has responded that only "cultural anthropologists from the Academic Left" who "despise the words 'empirical evidence' would take Tierney's claims seriously."
Empirical evidence is not lacking in Tierney's copiously footnoted book. Like all good chronicles of Western rationalists who lose their minds among primitives, Darkness in El Dorado is filled with absurd and disgraceful behavior: a French anthropologist who loses himself for decades in a sexual Eden; the world's wealthy holding a tuxedo dinner catered by helicopters on a jungle mountain; researchers who try to kill one another with machetes or commit suicide after being spurned by a Yanomami lover. But aside from his Joseph Conrad-like musings as to what it is about the Yanomami that made white people crazy, Tierney has written a fascinating, but also frustrating, ethnography of the practices and beliefs of cold war medical and social science researchers.
Tierney focuses primarily on the long and strange career of Napoleon Chagnon, who originated the myth of Yanomami aggression in his book The Fierce People, the all-time-bestselling ethnography. Chagnon portrayed the Yanomami as one of the most violent cultures on earth, where villages went to war to procure women and serial murderers bred at a higher rate than men who did not kill.
Tierney convincingly demonstrates his charge that unethical methodology and false science produced this myth. He also describes its often fatal consequences.
Most cultural anthropologists now believe that the wars Chagnon witnessed were provoked by Chagnon himself. He offered axes, machetes, fishhooks and pots in exchange for ethnographic information, creating tensions among villages that vied for monopoly control of his wares. Within months of Chagnon's arrival in 1964, three different fights broke out between villages that had previously been at peace for decades. Anthropologist Brian Ferguson reports that Chagnon was "very much involved in the fighting and the wars. Chagnon becomes a central figure in determining battles over trade goods and machetes." A Yanomami reports that Chagnon offered him an outboard motor in exchange for help, including the procurement of a Yanomami wife. Shotguns, a seemingly unlimited supply of trade goods and a willingness to don feathers, face paint and a loincloth allowed Chagnon to transform himself from an "impoverished Ph.D. student at the bottom of the totem pole to being a figure of preternatural power."
Tierney argues that many of Chagnon's data are simply false. The Yanomami do not have a particularly high murder rate, nor do men who kill reproduce more than those who don't. Neither are the Yanomami particularly well-nourished--a claim that Chagnon uses to argue that men fight over women and not food.
In the United States, Chagnon and his sociobiologist allies continue to portray the Yanomami as an untainted relic of our past--a handy control group used to prove the biological basis of a range of aggressive human traits. In Latin America, the endurance of the myth of Yanomami aggression has reinforced racism and justified indifference. Both the Venezuelan and Brazilian governments have used unfavorable images of the Yanomami to justify their failure to protect them from migrants, who, starting in the late 1980s, increasingly entered the region, resulting in the death from disease and violence of untold numbers of Yanomami.
Tierney is at his best when he discusses Chagnon's career within the cultural history of the cold war. Born poor in Michigan, Chagnon used the expanding university system to climb out of poverty. Like many at the time who through discipline and hard work improved their class standing, Chagnon developed a visceral antipathy toward communism. It manifested itself in an intense masculine persona that earned Chagnon a reputation for barfighting and academic brawling. One of Tierney's insights is that Chagnon's theories had their "genesis during the Vietnam War and its cultural equivalent on the University of Michigan's Ann Arbor campus, where hippies in tepees chanted slogans like 'Make love, not war.' The whole point...was that you had to make war in order to make love--that violence was part of the natural order.... As a cold war metaphor, the Yanomami's 'ceaseless warfare' over women proved, that even in a society without property, hierarchies prevailed."
Tierney is on to something important here. The Fierce People was published in 1968, a particularly tough year for the United States abroad. American officials justified counterinsurgency campaigns that were taking place in the jungles of Latin America, Africa and Asia in decidedly Chagnonian terms. As one 1968 dissenting State Department memo put it: "We have condoned counter-terror.... We suspected that maybe it is a good tactic, and that... murder, torture, and mutilation are alright if our side is doing it and the victims are communists. After all hasn't man been a savage from the beginning of time so let us not be too queasy about terror. I have literally heard these arguments from our people."
Tierney rightly reads The Fierce People as a piece of home-front propaganda. To counter those who argued that war was caused by struggles over resources (a central claim of New Left interpretations of both the cold war and the Vietnam War), Chagnon "engineered a bold creation myth, a ferocious Garden of Eden, where the healthy, well-fed Yanomami fought for... sexual pleasure.... It was not the Yanomami but Chagnon's fellow Americans who belonged, in reality, to one of the best-fed, healthiest societies in history. America enjoyed abundance so delirious that it seemed, for a short time in the 1960s, that its citizens would not agree to the stress of world combat against Communism.... At that critical moment, The Fierce People... came to reverse a dangerous complacency, proof that the battle is never won, that the fight can never be abandoned."
By the late 1980s Chagnon was in trouble. Tierney misses an important opportunity to discuss how the decline in Chagnon's fortunes was tied to the end of superpower tensions. At home, a generation of anthropologists critical of its discipline's role in justifying US foreign policy came into professional power. In Venezuela his former research subjects were demanding that he be barred from entering their territory. And reflecting the post-cold war extension of economic activity into areas previously off-limits, gold miners poured into the Amazon, causing widespread ecological destruction and social dislocation. Challenged by his liberal colleagues, harangued by feminists, threatened by dark-skinned peoples and adrift in the new post-cold war economy, Chagnon became an international version of the angry white man.
Chagnon did what many did at the end of the cold war--he went private. He teamed up with a flamboyant Venezuelan industrial gold miner, who turned "tracts of forest into mud soup," and the mistress of the Venezuelan president, who has since fled the country following indictments for corruption and fraud. The three came close to establishing a private biosphere in Yanomami territory that would have given them political authority over the Yanomami and monopoly rights over mineral and scientific claims. In order to muster international support for their scheme, they shuttled journalists and scientists in and out of remote Yanomami communities on lightning helicopter tours, without providing protection against possible contagion. Newspapers and television news ran stories of recently discovered "lost villages," while "foreign scientists carried out huge amounts of plant and animal samples."
When Venezuelan and international opposition scuttled his plan to set up a fiefdom in his former field site, Chagnon, now largely shut out of anthropology journals, stepped up efforts to disseminate his theories in the popular press. Although Chagnon often casts himself as an embattled truth-seeker--the preferred role of most biological determinists, no matter how much funding or open access to the media they have--Tierney points out the "abject admiration many male journalists apparently felt for the great anthropologist." He cites a fax that Matt Ridley, the science reporter at The Economist, sent to Chagnon apologizing for not writing a more sympathetic piece: "I have written it in the way that the International Editor wanted, which means 'impartially.' (She is a bit PC, herself.) So you may find it less unambiguously sympathetic to you than you might have hoped, but it is about as far as I dare go.... I do hope you like it."
What will make and, unfortunately, probably break Darkness in El Dorado is its description of the deadly 1968 outbreak of measles that coincided with the arrival of an expedition, funded by the Atomic Energy Commission and headed by Neel and Chagnon, to collect Yanomami blood samples.
Tierney's speculation that Neel may have been responsible for the epidemic is based on Neel's decision to use what was by 1968 an antiquated vaccine, Edmonston B, which was contraindicated for isolated populations such as the Yanomami. Tierney suggests that Neel chose this vaccine to prove that American Indians were not genetically vulnerable to European germs. Since Edmonston B produced the same level of antibodies as an infection of real measles, follow-up antibody tests would allow for a comparison of European and Yanomami immune systems. This may be why, according to Tierney, Neel opted for Edmonston B even though it was known to cause measleslike symptoms among isolated groups and even though a cheaper, safer vaccine (but one that did not produce antibodies comparable to the disease) was available. Tierney argues that because Edmonston B produces symptoms similar to measles, its use may have ignited the outbreak; he goes even further by proxy, citing a medical historian who ventures that Neel may have intentionally started the epidemic.
Tierney unfortunately has presented his case in a way that allows for easy dismissal. He provides compelling evidence that Neel and Chagnon did indeed treat the vaccination campaign as an experiment. For instance, by Neel's own telling, in the first village, before the epidemic, the team inexplicably vaccinated only forty Yanomami out of a total population of seventy-six, even though it had enough doses for all. Combined with the fact that most in this village had been tested for measles antibodies two years earlier, the inoculation of half the village created a fortuitous control group for Neel's published findings. It also seems that the vaccine did induce fevers and rashes in many Yanomami. Nevertheless, the fact that Tierney gives no direct evidence to back up his most serious conjecture--that the measles epidemic was caused by the vaccine--threatens to discredit his entire study. (Also, in response to the pre-publication controversy, most medical experts insist that it is impossible for a vaccine, no matter what symptoms it may bring on in the inoculated, to spread as an epidemic.)
Tierney's missteps here speak to a larger problem with his book, which draws its inspiration more from The X-Files than from the Frankfurt School. Tierney tries too hard to link the actions and motives of the individuals involved in a tight net of intrigue, misrepresenting cold war social science as a secret society of a select few.
Of course, for many, the actions of the United States during the cold war don't make sense any other way. Consider this history: Neel, who did research on Hiroshima survivors, was funded by the Atomic Energy Commission to collect thousands of samples of Yanomami blood because it was thought it could be used as a baseline to measure degrees of genetic mutation. In 1958 the AEC, which in other instances engaged in deadly human radiation experiments, paid Marcel Roche, a Venezuelan doctor who worked on Neel's 1968 expedition, to inject the Yanomami, without their knowledge, of course, with radioactive iodine to study why they did not suffer from goiters. Tierney can hardly be blamed for lacking a theory, other than conspiracy, to explain this.
Darkness in El Dorado unconvincingly attempts to trace this shameful history directly to Neel ("I felt that Neel was the key"), unfairly describing him as an extreme eugenicist. This is unfortunate, for Tierney could have written a more powerful book by demonstrating how the cold war produced acts of barbarism regardless of individual motive.
This is not to let Neel and Chagnon off the hook. They were instrumental in the creation of a body of knowledge that valued the Yanomami not for their own sake but for what they could provide cold war science. Their blood was believed to contain answers to questions raised by the new post-Hiroshima world, while their culture was thought to be a distilled version of what the West once was and, for some, should be again.
In the documentary made of the 1968 expedition, Neel and others are shown professionally inoculating Yanomami, who are presented as pictures of vibrant health. Sound outtakes reveal a different story. The team was exhausted, sick and panicked as the epidemic escaped their control and ravaged the Yanomami. Neel can be heard ordering the cameraman to stop filming a sick Yanomami. Whatever the cause of the measles outbreak, it is probable that the research team exposed the Yanomami to respiratory infections and other illnesses. The outtakes also reveal that Neel and Chagnon were much more concerned with making the documentary and collecting blood samples than with containing the epidemic. They broke quarantine lines to procure donors and quickly abandoned the area so that their blood would not be ruined in the tropical heat.
Tierney's effort to pin the tragic history of the Yanomami on Neel speaks to a larger problem, both in his book and in current ways of thinking about colonialism. With the failure of socialism and the discrediting of revolutionary movements and governments, many First World activists have thrown their energy into advocating on behalf of the cultural rights of native peoples. Much of this work is profoundly apolitical, justified more by appeals to Indian virtue than by critical analysis. This kind of activism too easily sets itself up for dismissal when it is revealed that Indians may have their own interests and may not be as innocent as portrayed.
This problem is reproduced in Tierney's book. It speaks to the poverty of our political culture that Tierney, an experienced investigative reporter, refuses, either out of ignorance or bias, to discuss the history of the Amazon in reference to colonialism, capitalism or racism. Instead, he searches for the mastermind behind the mayhem. Tierney creates a kitschy Heart of Darkness-like tale and casts himself as Marlow and Chagnon as Kurtz (Neel, perhaps, could be King Leopold). Well before we hear any Yanomami voices, we learn of Tierney's battles against jungle thieves and malaria, heroically rescuing Yanomami children and fending off evil gold miners.
Tierney's narrative rightly demonstrates how objective scientists can be implicated in a history of atrocity--and his gaffes should not distract from this history--but it can't account for the fact that while the AEC was paying for Neel's and Chagnon's jungle excursions, it was also funding the work of Harvard biologist Richard Lewontin, along with other progressive scientists and anthropologists. These scholars became powerful critics of how the supposed objective research of their colleagues served not-so-objective agendas and had not-so-benign consequences. These politicized scholars have served science well--proof positive that Adorno was right, that "science needs those who disobey it."
I have been waiting for Manifesta to come out. I had certain hopes for this book. In particular, I was looking forward to using it as a corrective addition in a course I'm teaching on "Third Wave Feminism and Girl Culture." When I first taught this class last spring, my students became increasingly frustrated with the overwhelmingly personal tone of the contemporary feminism we were reading. Our central texts, and until now they have been the central texts of the self-proclaimed Third Wave, were three anthologies, all published in the past five years: Barbara Findlen's Listen Up: Voices From the Next Generation; Rebecca Walker's To Be Real: Telling the Truth and Changing the Face of Feminism; and Leslie Heywood and Jennifer Drake's Third Wave Agenda: Being Feminist, Doing Feminism. While each of these collections takes its own approach to the Third Wave, they share an emphasis on the singular experience of young women and the occasional young man as grounds for a new generation of feminist politics ("young" in this context generally designates those born between 1964 and 1980). Specifically, all three anthologies grapple with how to combine some version of feminist politics with what Third Wave Agenda calls the "lived messiness" of real life. After reading assorted articles in which individual Third Wavers describe their intimate struggles with eating disorders, gender dysphoria, racial difference and antifeminist workplaces or, conversely, their sustaining attachments to various punk rockers, my students begin to ask, "Isn't there some Third Wave theory we could read?"
Last spring I suggested to my students that, for the moment, this return to experience in all of its messy multiplicity might be the unifying theory of the Third Wave (we might see it, for instance, as a historically necessary return to the "personal" moment of "the personal is political"), but I share their longing for a militant, argumentative feminism--one that would abandon the personal essay with its fetishization of contradiction and get on with elaborating a political program. Contemporary feminism needs the kind of intervention Manifesta purports to be. Billed as "a powerful indictment from within of the current state of feminism, and a passionate call to arms," Manifesta aims to challenge the experientialism and fragmentation of the emerging Third Wave with history, political argument and activism. These are, to my mind, exactly the grounds on which to confront the Third Wave, but how effectively Manifesta manages this confrontation is another question entirely.
Written by Jennifer Baumgardner and Amy Richards, both journalists, activists and Third Wavers themselves, Manifesta turns out to be a strange book. Some of this strangeness no doubt derives from its collaborative production. The two writers seem to be trying to repress, rather than sharpen, their differences, and this results in a book that is, narratively, both bland and contradictory by turns. This general atmosphere of forced consensus extends to the content as well. As Richards outlines in her introduction (each writer provides an introduction, not to the book but to herself), she and Baumgardner created Manifesta by combining their separate book projects, one on activism and the other a cultural analysis of current feminism, into one text. The end result is a long, wide-ranging and episodic book that touches on everything from Barbie and Riot Grrrls to voter registration and Title IX without ever fully integrating its cultural and activist components. Perhaps the strangest thing about the book is its title, for Manifesta is neither short nor scrappy like the best of its genre (e.g., the SCUM and Communist manifestoes). I was, however, encouraged to find that the book does contain (finally! on page 278) an actual "thirteen point" manifesta that distills its uncontroversial pro-choice, pro-ERA, anti-domestic violence agenda.
Its structural peculiarities aside, Manifesta does supply several potentially powerful correctives to contemporary feminism--the first of which is a historical perspective. One of the striking features of works of Third Wave feminism published so far is their general impatience with, and desire to break from, the feminist past. (Although the editors of both Listen Up and Third Wave Agenda make a point of pledging their allegiance to the Second Wave, their contributors for the most part do not.) Third Wavers frequently accomplish this break by declaring the Second Wave outmoded, unrealistically militant and irrelevant to the lives of young women. Melissa Klein describes this renunciation in her contribution to Third Wave Agenda, "Duality and Redefinition":
Many young women hesitate to take on the mantle of feminism, either because they fear being branded as fanatical "feminazis" or because they see feminism not as a growing and changing movement but as a dialogue of the past that conjures up images of militantly bell-bottomed "women's libbers."
In Third Wave writing, reductive caricature--those "bell-bottomed feminazis"--often displaces and deters real historical knowledge about the politics, accomplishments and legacy of the Second Wave, not to mention earlier feminisms. (For an especially sharp and poignant instance of the Third Wave's failure to recognize the Second, see the foreword and afterword to Rebecca Walker's To Be Real, in which a bewildered Gloria Steinem and Angela Davis wrestle with the treatment that feminism of the sixties and seventies receives in the book.)
Baumgardner and Richards reject this species of feminist ahistoricism. Point 5 of their manifesta aims
To tap into and raise awareness of our revolutionary history.... To have access to our intellectual feminist legacy and women's history; for the classics of radical feminism, womanism, mujeristas, women's liberation, and all our roots to remain in print; and to have women's history taught to men as well as women as a part of all curricula.
Against the Third Wave's rebellious declarations of independence, Baumgardner and Richards insist on a cross-generational, continuous understanding of feminism secured through the study of feminist history. "Having no sense of how we got here," they write, "condemns women to reinvent the wheel and often blocks us from creating a political strategy." Manifesta works throughout to supply some of this prehistory by linking current feminist cultural forms and figures to earlier ones. The Lilith Fair, for instance, is presented in the tradition of the Michigan Womyn's Music Festival, and prosex profemininity Girlie feminists are recognized as descendants of Helen Gurley Brown.
The authors systematize their version of feminist history in a chapter titled "What Is Feminism?" Here they produce a sketchy, breakneck overview of United States feminism from Seneca Falls, through the Nineteenth Amendment, the Second Wave and the ERA, up to the Third Wave. They make some attempt to be multiculturally and politically inclusive by mentioning Native American matriarchies, Sojourner Truth and Emma Goldman, but what they call feminist history here is fundamentally the history of white, middle-class liberal feminism and its record of US governmental reforms. Manifesta's restricted focus on liberal feminism is, unfortunately, systemic. In the rest of the book, where Second Wavers provide most of the historical counterpoint, Baumgardner and Richards repeatedly offer up liberal feminists as representative of all feminism: Gloria Steinem, Betty Friedan, Gloria Steinem, Carol Gilligan, Gloria Steinem... Please note that our authors met while interning at Ms.
Feminists on the left and feminists of color will not find their history represented in Manifesta. The contributions of Audre Lorde, Barbara Smith, Lydia Sargent, bell hooks, Heidi Hartmann, Gloria Anzaldúa and Cherríe Moraga, to name only a few, go pretty much unmentioned. This historical prejudice is especially striking, given that recent anthologies tend to date Third Wave feminism from the critiques that women of color launched against liberal feminism toward the end of the Second Wave. In this context, Manifesta's historical sensibility reads as reaction, as a call for a return to some imagined white, homogeneous Second Wave feminism. On the few occasions that Baumgardner and Richards deign to mention leftish feminisms, they criticize leftists not for their politics but for being unnecessarily divisive, for undermining some presumed feminist consensus. Barbara Ehrenreich and Katha Pollitt, for instance, receive sharp criticism for having the gall to "[point] their fingers" at other feminists.
This anxiety about feminist dissent permeates Manifesta. Like a mantra, Baumgardner and Richards repeat phrases like "everyday feminism," "the same old feminism" and "organic" feminism, as if there were some reassuring common sense that united all feminists. For our authors, this "same old feminism" designates the same old liberal reformism, and while I praise their historical instincts--the Third Wave needs its past more than it knows--I wish Baumgardner and Richards had worked harder to be more fully historical. Not only do they provide the feminist history that is most likely to be familiar to readers without their help (through mainstream institutions like Ms., NOW and the Democratic Party) but their reductive version of the feminist past is unlikely to speak to the interests and experiences of women of color, working-class and radical women, or queers (though of all differences among women, they give the most lip service to sexual difference). Manifesta's history simply isn't adequate for comprehending, much less galvanizing, the actual class, racial, sexual and political heterogeneity of American women.
The second correction that Manifesta brings to the Third Wave is an insistence on political argumentation. If Third Wavers are vulnerable to charges of navel-gazing, of musing endlessly and confessionally over the contradictions between feminism and life, the authors adamantly resist this deferral of political consciousness. Throughout Manifesta they insist on making feminist sense of the world, using anecdotal narratives and statistical data (they admit to being obsessive fact-checkers) to remind us that the pay gap between women and men persists and remains substantial (74 cents to the dollar, by current calculations), that rape and domestic violence still operate to restrict women's independence, that the sexual double standard continues to distort female sexuality and that reproductive rights are only partially and tenuously secure. Although their political consciousness remains disappointingly close to their own experiences and needs as young professional white women (they give an inordinate amount of time to the injustices that face female journalists in New York City, while other feminist issues, like daycare, racism, gay-bashing and collective bargaining in the pink-collar ghetto, receive little to no treatment), at least Baumgardner and Richards model the process of politicizing experience, of seeing the personal as political. "Consciousness-raising," they argue, "must precede action."
At a deeper level, however, Manifesta simply isn't argumentative enough. In fact, the argument we most expect from a work of contemporary feminist theory--a systemic analysis of the causes of women's oppression today--is entirely absent. If the Third Wave intends to remake feminism for this generation, then it needs a comprehensive account of the specific material conditions that currently determine (and determine differentially) the social and economic position of women in the United States and outside the United States as well. Such an account requires thinking through systems (capitalism, patriarchy, racism, homophobia) in the way many Second Wavers did, although it does not require that the Third Wave simply redeploy arguments generated in the seventies. After all, material conditions change. Instead of systemic argumentation, however, Baumgardner and Richards offer up a loose platform of issues: prison reform, pay inequality, military access for women, negative body images, the ERA, egalitarian healthcare, etc. Because Manifesta lacks a coherent structural account that could link these disparate issues (e.g., through the underlying socioeconomic processes that produce them), readers are unlikely to recognize any inner logic in this collection of so-called women's issues. Nor can Manifesta provide an argument for prioritizing one issue over another. In Baumgardner and Richards's account, feminism becomes analytically rootless, seemingly implicated everywhere, but no more effective or necessary in one arena than another.
In the absence of sustained structural analysis, our authors use large quantities of populist boosterism to hold the book together. Their populism underwrites two of Manifesta's larger claims, the first of which is that despite what critics say, feminism is everywhere in contemporary culture, just waiting to be acknowledged. The authors announce the existence of what they call a contemporary "Feminist Diaspora"--a large, dispersed population of "everyday" feminists who embody the Second Wave's success in establishing feminism as part of our cultural common sense. "For anyone born after the early 1960s," they assert, "the presence of feminism in our lives is taken for granted. For our generation, feminism is like fluoride. We scarcely notice that we have it--it's simply in the water." Of course this is a controversial proposition, as it assumes that the feminism of the sixties and seventies was disseminated uniformly to young women throughout the United States, irrespective of class, racial, educational or geographical distinctions. You get a very different picture of feminism's reach if you talk to women who, although "born after the early 1960s," were raised in rural areas, in immigrant families or in working-class neighborhoods. But if it's true, as our authors say, that feminism can now be taken for granted, that it has become part of popular consciousness, this presents Baumgardner and Richards with a unique dilemma. "The only problem," they acknowledge, "is that, while on a personal level feminism is everywhere, like fluoride, on a political level the movement is more like nitrogen: ubiquitous and inert." So even though they see "a generation of [young women] leading revolutionary lives," our authors concede that these same women are "best known for saying, 'I'm not a feminist, but...'"
Point 1 of the manifesta contains their plan for attacking this lack of feminist self-identification: "To out unacknowledged feminists, specifically those who are younger, so that Generation X can become a visible movement and, further, a voting block of eighteen- to forty-year-olds." As far as I can tell, "outing," in this context, consists of making feminism so enticingly broad and nondemanding that young women, realizing they are in no way required to interrogate themselves or their social practices, will claim feminism for themselves. "Maybe you aren't sure you need feminism," Baumgardner and Richards coax,
...or you're not sure it needs you. You're sexy, a wallflower, you shop at Calvin Klein, you are a stay-at-home mom, a big Hollywood producer, a beautiful bride all in white, an ex-wife raising three kids, or you shave, pluck, and wax. In reality, feminism wants you to be whoever you are--but with a political consciousness.
By asserting that young women are already feminists, if unconscious ones, Baumgardner and Richards feel empowered to claim that there is, indeed, a feminist "movement" afoot today. Although they admit that it does not consist of "a huge force of conscious feminists" (i.e., it does not look like anything we'd recognize as collective action), they repeatedly refer to "the movement" as if saying the word could call the social form into being. I share Manifesta's desire for a movement (the fizzled Riot Grrrl was arguably the closest--and it wasn't very close--we've come to collective feminist action in the last decade), but I don't believe that calling whatever women do to survive "a movement" or trying to swell the feminist ranks with prepolitical young women is the most effective way to get us there.
Baumgardner and Richards's populist strategies also emerge in the second of their larger claims, namely that political differences between types of feminism really don't, and ideally shouldn't, matter all that much. Our authors take a staggeringly latitudinarian approach to feminism. They stage extended defenses of Naomi Wolf and Katie Roiphe, both of whom most feminists consider conservative backlashers, in order to assert their rightful membership in the feminist camp. "We have to put down our relentless search for feminist purity," they argue,
...and look at Katie Roiphe, Elizabeth Wurtzel, Naomi Wolf, and the rest of the emerging young women as what they are: feminists, the next generation.... Yes, all feminists deserve critique and debate, but save your political vitriol for the young babes who are right-wing and political.
Baumgardner and Richards also extend feminist inclusion to "Girlie" types, those young women, vaguely associated with Bust's readership, who find personal empowerment in the cultural trappings of traditional adolescent femininity. They even make a case for Monica Lewinsky as a contemporary feminist icon, calling her "a twenty-three-year-old White House intern who owned her own libido and sexual prowess."
What our manifesta writers hope to gain by stretching feminism to its outer limits in order to include Roiphe, Wolf, Girlies and Lewinsky is a kind of "big tent" feminism that could take on "right-wing babes" like Christina Hoff Sommers, Laura Ingraham and Ann Coulter. And they are not the first Third Wavers to promote this kind of feminist populism. Rebecca Walker, in To Be Real, argues that we should "[broaden] our view of who and what constitutes 'the feminist community,'" so as to "stake out an inclusive terrain from which to actively seek the goals of societal equality and individual freedom." What they lose in the stretch, however, is any real content to feminism, other than the crudest and too often imaginary distinction between the right and left wing. Baumgardner and Richards would do for feminism what Clinton did for the Democrats over the past eight years: try to absorb, rhetorically, everyone from left-liberals to centrists in order to build a strategic coalition against the radical right. But why should the broad spectrum of feminists be forced to define themselves negatively and homogeneously against a few shrill right-wingers? While feminists need to be able to come together around issues that concern us, and I think we do, our differences are politically meaningful and, to my mind, ultimately productive. Roiphe, Wolf et al., for instance, raise important questions about the Third Wave revalorization of beauty, sexual power and femininity. What happens to feminism when it reclaims the very sources of power the patriarchy has always been happy to grant us? Why is it difficult to recognize feminist "agency" in the circumstance of a young female intern, smitten with male presidential power, dropping to her knees? Rather than subordinate our differences in the service of the flabby populism Manifesta promotes, I would like to see contemporary feminism embrace contention, sharpen its differences and strengthen its analysis.
There are limits, however, even to Baumgardner and Richards's feminist magnanimity. Their inclusionism breaks down not only around the "divisive" left but also in their engagement with psychoanalytic Second Waver Phyllis Chesler. Chesler elicits their ire for, apparently, using the wrong tone of voice. In her 1997 Letters to a Young Feminist, Chesler draws on her longstanding engagement with feminism to delineate what she sees as the Second Wave's "legacy" to the next generation. Specifically, she focuses on the contradictions produced by Second Wave feminisms (e.g., between the ideology of "sisterhood" and the reality of female competition, between movement egalitarianism and the hierarchies "professional" feminism reproduced) and presses younger feminists to learn from and supersede these contradictions. In keeping with her training, Chesler approaches her Letters through the lens of the family drama and uses the persona of a feminist mother to address imagined feminist daughters (and, in the last chapter, her real-life feminist son). The phony intimacy of this address makes for some serious rhetorical melodrama: The reader is regularly addressed as "darling" and "my dear" by an overbearing Ma Chesler. Despite its stylistic goofiness, however, Chesler's book remains one of the few Second Wave feminist "memoirs" (and there are now many) that work to instrumentalize, rather than glorify or recant, the feminist past in order to serve the feminist future.
Baumgardner and Richards are unable to recognize how Chesler's book, like their own, attempts to build a bridge between the Waves. Instead, in an angry "Letter to an Older Feminist," our authors perform their rebellion against Chesler and her cohort, exclaiming "You're not our mothers." "We let you off your mother trip," they announce, "Now you have to stop treating us like daughters. You don't have the authority to treat us like babies or acolytes who need to be molded." As much as our authors say they want to connect with the Second Wave, they clearly want the connection on their own terms. It's OK for Chesler to participate in the Third Wave as an icon, as an inspiring bit of history, but Baumgardner and Richards would rather she quit trying to contribute her own work. "Read our books, buy our records," they command the Older Feminist. Ever vigilant of ageism when it's directed at younger feminists, here Baumgardner and Richards themselves, unnecessarily, reproduce a generation gap.
My favorite part of Manifesta, and the final corrective it offers to the Third Wave's nearly exclusive focus on cultural critique, is its insistence on activism. In the final section of the book, in a chapter titled "What Is Activism?" Baumgardner and Richards push young feminists to take action. "Activism," they write, "starts with the acknowledgment of injustice, but it doesn't stop with the rant...or even with the manifesta." To ensure that their readers develop realistic expectations, the two debunk what they say are four myths about activism: that "activism will bring an immediate and decisive victory," that activism "has to be huge," that activism requires "superleaders" and, finally, that contemporary feminism is "politically impotent." Baumgardner and Richards also challenge the common preconception that volunteering is necessarily the highest form of activism. They make a fabulous distinction between "activist" and "charity" types of volunteer work, defining the latter as those positions (like candy stripers and literacy instructors) that have a long tradition of relying on unpaid female labor. Readers are directed away from the feminized sector and are encouraged instead to turn their efforts toward the "activist" groups--those "organizations that are too ahead of their time to be funded by the government"--and to continue to lobby for pay for their work. Central to successful activism, Baumgardner and Richards suggest, is a "clear intention, a realistic plan, and an identifiable constituency," and they provide steps for developing these strategic elements.
In addition to an appendix containing contact information for numerous activist organizations (along with record labels, makeup brands and sex-toy shops), Baumgardner and Richards also provide a series of "creative social justice" issues that they think warrant activist involvement, such as political asylum for female refugees who have suffered gendered forms of violence, getting female reproductive care into prisons and pressing the National Honor Society to strike down its exclusion of pregnant women. For each issue they provide concrete avenues for action: Lobby the President, recruit Ob-Gyns to go into prisons, petition the NHS with lists of male members who have impregnated women.
While I love its demystification of activism, I remain unenamored with Manifesta's overall political vision, which never moves much beyond liberal reformism. For all their talk of "revolution," Baumgardner and Richards are primarily interested in, as they call it, putting the "participatory back into participatory democracy." The book, moreover, contains no clear sense of how issue-by-issue reformism of the type they advocate could lead to the "revolutionary movement" and larger social transformation they often invoke as their long-range goal. Despite its political tunnel vision, however, Manifesta works productively, in my view, to reorient the Third Wave toward action, particularly action beyond just the cultural level. Baumgardner and Richards encourage young feminists to engage with politics, the law and (to some extent) the economy, and they supply concrete strategies and realistic expectations for beginning this kind of activist work. Manifesta provides a solid starting place for reformist-style activism, and in the current moment, any activism is better than none. Who knows how young feminists might be revolutionized through the types of activism Baumgardner and Richards advocate; Manifesta could lay the groundwork for more radical forms of political action.
All in all, I think Manifesta suggests a formula, if not the specific content, for a better version of Third Wave feminism. We need to build on the feminisms that have preceded us, but we need the history of all feminisms, not just the least controversial, most mainstream forms. We need to embrace political argument, but we need to root our arguments in a larger understanding of the conditions that oppress us--all of us. We also need to be able to argue among ourselves about what feminism at this historical moment ought to look like, and to do that we have to dispense with the idea--itself an artifact of the backlash--that feminism needs warm bodies more than it needs theory or principles. We need to fight the seemingly widespread preconception that a state of feminist grace is prerequisite to action and that essay writing should be our preferred mode of activism. We need to get busy in the ways Manifesta urges and in many more. Without meaning to, Manifesta also prompts us, through some of its engagement with "older feminists," to think about how the Third Wave may be founding itself on unexamined ageism. In the end, a better version of Third Wave feminism might involve changing the name as a first step toward unloading altogether the dubious politics of generationality.
They laughed when I sat down with these two writers--and never mind that both books arrived in the same box. The bad gay boy and the cold war saint! The apostle of derangement and the lexicographer of Newspeak! The red cape and the tweed jacket, the rotting knee and the lousy lung, the drunken boat and the memory hole! "I came to find my mind's disorder sacred," said the poet on a camel. "Good prose is like a window pane," said the novelist who shot elephants.
But both Arthur Rimbaud and George Orwell did go down-and-out in Paris and London. Both their fathers were mostly absent, doing time as globocops in Third World tropics of the French and British empires. (On a sand dune, Captain Rimbaud taught himself Bedouin ways, the better to repress them. Orwell's dad was a deputy opium agent, making sure the poppy juice got from India to the Chinese addicts.) Both their mothers loved cards more than kids. Both sons, hating school, gifted at languages, hostile to religion, intrigued by popular culture, would follow their fathers to the colonies, enlist in foreign wars, lose not only their tempers but also amazing amounts of manuscript and die younger than they should have, after dreaming up and acting out alternative identities. (Take a hike, Eric Blair: "I is somebody else.")
Both live on as cautionary tales, litmus tests, celebrity role models and undead icons. In his wickedly entertaining revised version of Rimbaud, Graham Robb points to his posthumous career "as Symbolist, Surrealist, Beat poet, student revolutionary, rock lyricist, gay pioneer and inspired drug-user," as well as "an emergency exit from the house of convention" for avant-gardes everywhere. Well-thumbed copies of A Season in Hell and The Illuminations are to be found in the portmanteaus of Picasso, Breton, Cocteau and Malraux and in the backpacks of Allen Ginsberg, Bob Dylan, Bruce Chatwin and Kurt Cobain. Jim Morrison, that swinging Door, is even rumored "to have faked his death in Paris and followed Rimbaud to Ethiopia"--just the right splash of mythic Tabasco.
Whereas Orwell's name is mentioned every time we are looked down upon by surveillance cameras or lied to by governments, every time we read about journalists who have been "disappeared" or hear about dissidents in mental hospitals. Big Brother is a member of our extended family, the pigs go on drinking all the milk and eating all the apples, and the SLORC word for Burma in Newspeak is "Myanmar." In Democracy, her 1984 (!) novel of skulduggery on the Pacific Rim, Joan Didion would notice that "all reporters had paperback copies of Homage to Catalonia, and kept them in the same place where they kept the matches and the candle and the notebook, for when the hotel was bombed." So postmodern is Curious George that he has even been abducted by such aliens as Norman Podhoretz.
And both for a season or so professed revolutionary socialism. Even if the moment passed like measles, Rimbaud was there for the Paris Commune, and Orwell was there for the Spanish Republic, and these, of course, are two of the biggest Super Bowl games in the left's long losing streak, and it makes you want to weep.
Robb reminds us that the massacre of the Communards in 1871 "was the bloodiest week in French history: a savage humiliation of the proletariat. Thousands were shot, inexpertly tortured or shipped to the penal colonies without a proper trial. Women carrying bottles in the street were bayoneted by soldiers who had heard of the mythical, bomb-throwing 'pétroleuses.' More people died during la Semaine Sanglante than in the Reign of Terror or the Franco-Prussian War." While the Rimbaud article in my Britannica omits any mention of the Commune, the young poet yo-yo'd in and out of all of it, and Robb suggests that he may have been raped by a gang of soldiers while trying to slip through the lines a couple of weeks before the slaughter, after which he wrote his famous Lettre du Voyant--announcing the poet as Romantic Lucifer and Promethean Satan, whose job it was to rescue man from God.
On the open wound of the Spanish Republic, Jeffrey Meyers quotes Albert Camus: "It was in Spain that men learned that one can be right and yet be beaten, that force can vanquish spirit, that there are times when courage is not its own recompense. It is this, doubtless, which explains why so many men, the world over, regard the Spanish drama as a personal tragedy." Certainly it was personal for Orwell. On his first Barcelona stop, he found a socialist community "where no one was on the make, where there was a shortage of everything but no privilege and no bootlicking," and dining rooms in the luxury hotels had been turned into canteens for the militia. But his second time around, he saw fat men eating quails while children begged for bread, and the commissars were hunting down his anarchist friends like deer. And then he took a bullet in the throat.
Anyway, both of them were lonely guys: vagabonds and vanishing acts. And they somehow hang together, coincidental and corresponding, in a rainbow arc from the Cult of the Artist to the Writer on the Barricades to Joe DiMaggio for Mr. Coffee and Bob Dole for Viagra. In Democracy, Joan Didion also quotes Kierkegaard: "Life can only be understood backwards, but it must be lived forwards."
It can only be the end of the world, ahead of time.
"The first poet of a civilization not yet born," as René Char called him, showed up on October 20, 1854, in Charleville in the French Ardennes, three years after Napoleon III's coup d'état. At age 4, already precocious, he tried to trade his baby sister for some colored prints in a bookshop window. At age 6, his father shipped off for Algeria and never came back, leaving Arthur at the mercy of a mother devoted to church, shopping and whist, with a "phenomenal capacity for not showing affection." At age 7, he entered the "corpse-yellow" rooms of the local lycée as if preparing "for a life in prison." By age 14, he had inhaled all of French poetry, won every academic prize and developed acute self-consciousness:
I have the bluish-white eyes of my ancestors the Gauls, their small brains, their clumsiness in battle. I find my dress as barbaric as theirs. But I do not butter my hair.
Picture him in the summer of 1870, chatting up navvies and quarrymen, reading Verlaine for the first time and stowing away under a seat on the train to Paris, where he will be arrested on suspicion of republicanism and/or spying for Bismarck, and spend maybe a week in prison, during which not even Robb can say for sure what happens to him, except lice. There followed, as if on an elastic string that kept snapping him back to "the Mouth of Darkness," as he called his disapproving mother, an itchy six-month period of itinerant journalism, cafe polemics, bohemian sonnets and shopping for surrogate fathers, during which he swore like a prisoner, ate like a pig, refused to pass the salt and came to believe that "the mind could be shaped by an act of will," that morality "is a weakness of the brain" and that Society "will fall to the axe, the pick and the steamroller."
In the cities, the mud suddenly appeared to me red and black, like a mirror when the lamp moves about in the next room, like a treasure in the forest! Good luck, I cried, and saw a sea of flames and smoke in the sky, and, to left and to right, all riches blazing like a billion thunders.
This is a kid ready for a Commune. He sells his watch for a third-class ticket to Paris in February 1871, and for two weeks walks the streets "feasting on theatre bills, advertisements, pamphlets and shop signs," sleeping on coal barges, competing with dogs for scraps of food--a "vagrant poet with a fish in his pants." Six days after he has hoofed it home, workers rise, generals are lynched and he has to go back again: "Paris had fallen to poets who worked with laws and human beings instead of words." A new chief of police removes "Saint" from every street name and issues a warrant for God's arrest. Maybe words actually do have "a direct, controllable influence on reality."
"Order is vanquished!" declares the 16-year-old, and writes his own revolutionary Constitution: A permanent state of referendum! Abolition of families and their "slave-holding" of children! Communication with animals, plants and extraterrestrials! He will return in late April, at the delirious height of the Commune, to enlist as a Left Bank guerrilla: "To whom shall I hire myself? What beast should I worship? What holy image are we attacking? Which hearts shall I break? What lie must I keep?--In what blood shall I walk?" When government troops bomb their own capital, he slips away, suffers what he suffers and enters the gaudy tent of his own legend: "I owe my superiority to the fact that I have no heart."
In fact, says Robb, he has decided "to seize control of the means of intellectual production.... In terms that were unavailable to him in 1871, he was considering the possibility of detaching the censorious superego from the endlessly imaginative id." And the "superego incarnate" is Mme. Rimbaud, from whom he's always hiding out in attics, cellars or latrines, and to whom he always returns, until Africa. You are saying this is reductive. But every once in a while, praxis so improves on theory that we get a penguin.
That summer of 1871 he posts a batch of poems to Verlaine so full of kinky innuendo that The Nasty Fellows raise a subscription to bring the prodigy to the capital and subsidize his genius. Rimbaud arrives with "a strange nostalgia for the future," one of the most remarkable poems in any language, "The Drunken Boat," and a plan to fold, bend, spindle and mutilate his own personality. Almost immediately, he will trash hotel rooms like a rock star and leave turds behind on pillows. Verlaine, of course, will fall in love with him, when he isn't rotting his brain with absinthe or setting his wife's hair on fire. Verlaine is easy to make fun of only if you've never been smitten by somebody bad for you, or until you are reminded that Pol Pot was one of his great admirers.
We are now in familiar territory, with the familiar contradictions. Rimbaud the vandal, hooligan, sadist and "murderous" prankster is also the Rimbaud who writes a lovely article about "human alarm-clocks" who for a small fee rush around in the early hours in the poorer sections of the city waking up factory workers. The "vile, vicious, disgusting, smutty little schoolboy" is also the author of the marvelous "Voyelles," a poem in which each vowel has its own color (noir, blanc, rouge, vert, bleu)--inspired by Ernest Cabaner, a composer who plays piano in a bar, collects old shoes to use as flowerpots and believes that each note of the octave corresponds to a particular color and vowel. According to Robb:
This is the ambiguity that lies at the heart of Rimbaud's work: the ardent search for powerful systems of thought that could be used like magic spells, conducted by an acutely ironic intelligence--a combination that rarely survives adolescence gracefully.
He loses a notebook, the Belgian poems and the manuscript of "Spiritual Hunt." Since he believes "every being...to be entitled to several other lives," why not go to England, live with Verlaine in Soho, grub sixpence from writing business letters and teaching French, admire the boys in tight-fitting suits waiting outside public urinals and read Shakespeare, Longfellow, Poe and Swinburne? Certainly, like all ex-Communards in jittery Europe, they are spied upon and hassled. So should they be. They hobnob with the socialist underground. They see Karl Marx. Robb even suggests that several of The Illuminations can be construed as glosses on Kapital--on "the alienated consumers of the modern metropolis, the disinherited masses, the resurrectionary mythology of the Commune and the magic wand of global capitalism."
Not so his astonishing A Season in Hell, in which Modernism rears its contrary head; in which experiments with language are investigations into the unstable self; in which, "like a particle accelerator," repellent forms of thought collide: Job and Goethe; fairy tales and Taine; Fleurs du Mal and "the Mouth of Darkness." "Rimbaud, at the age of 18, had invented a linguistic world that can be happily explored for years like the scrapyard of a civilization." After which, confoundingly, he abandoned literature, France, fame and Mme. Rimbaud.
Well, now: Brussels, Stuttgart, Milan, Siena. Enlisting in the Carlist rebel army, then absconding with the cash bonus. Enlisting in the Dutch Colonial Army, then deserting the minute he gets to Java. Trying to enlist in the US Navy, then having instead to run off to Scandinavia with a circus. Going over the Alps on foot, setting sail for Alexandria, learning Russian, Arabic and Hindi. Discovering at last that while no tree grows in Aden, there is a nearby Forbidden City unseen by Europeans since Richard Burton, and money to be made trading coffee, tobacco, incense, ivory, spices, spears, swords, ostrich eggs, animal skins and guns. He will wear a turban, keep a woman, chew khat, catch syphilis, ride camels, write mom, lose another manuscript (on Abyssinia) and then his right leg (to bone cancer). At the end, he refuses opium for fear of what he might say in his delirium to his sister.
Disregard previous rumors, even in Enid Starkie. He neither converted to Islam nor traded in slaves, though you couldn't do business in his part of Africa without cutting the warlords in on the deal. What he did do, by selling guns to Menelik, was help an African army defeat a European nation--well, at least Italy--for the first time. Disregard as well the Tragic Aura. He didn't die bitter and broke. He actually made a lot of money, which he hid from his mother in bank accounts all over the Middle East. Some people are still looking for it.
Some people are also still looking for the poet. Rimbaud killed him off when he stopped living with other people, after he realized that the world couldn't be changed by verbal innovation. Literature, Robb explains, hadn't worked:
For Rimbaud, poetry had always been the means to an end: winning esteem, causing irritation, changing the nature of reality. Each redefinition of the goal had rendered the old technology obsolete. The prose Rimbaud had shown no more nostalgia for verse than most mathematicians showed for their slide-rules after the invention of the personal computer.
It's hard to read this as anything other than a triumph of capitalism over Bohemia.
My starting point is always a feeling of partisanship, a sense of injustice.
(Orwell, "Why I Write")
Orwell lasted ten years longer, but all of it was much less thrilling. And so, compared to Graham Robb, is Jeffrey Meyers. Whether, after two volumes by William Abrahams and Peter Stansky, one full-length bio by Bernard Crick, another by Michael Shelden, a short and elegant "Literary Life" by the editor of the twenty-volume Complete Works, Peter Davison, and a brilliant black valentine by Raymond Williams in the "Modern Masters" series, we even need another account is open to question. "'Father Knew George Orwell' is a cracked old song," wrote Williams almost three decades ago. But the centennial of his birth will be upon us in three short years, so batten down your aspidistra.
According to Meyers, he felt guilty about everything: "his colonial heritage, his bourgeois background, his inverted snobbery and his elite education," not to mention his behavior as a policeman in Burma, his inability to get himself arrested while he was collecting material for Down and Out and maybe even the uncircumcised penis that so mortified him at Eton among such contemporaries as Anthony Powell, Henry Green and Harold Acton. And so his whole life was a kind of penance, never taking care of himself, doing it all the hard way, always off to another dangerous front, ending up on an island off the coast of Scotland as far away from medical attention as an Englishman with tuberculosis could get. "All these risky moves were prompted by the inner need to sabotage his chance of a happy life," Meyers the schoolmarm tells us.
We've heard this before, from everybody else, and it still doesn't explain anything. How many boys went to Eton and not to Spain? How many writers went to Spain, like Hemingway, and failed to notice anything peculiar? How come Lawrence Durrell and Anthony Burgess never felt guilty about their colonial service or imperial privilege? Who else (who didn't have to) went down the Wigan mines, or into the casual wards of a public hospital to find out how the poor died, or saw a man hanged and decided on the spot, "When a murderer is hanged, there is only one person at the ceremony who is not guilty of murder"?
From Meyers, we also get a surprising amount of sex, all of it depressing. Orwell was nervous about women, apparently not much good in bed and would complain in his "Last Literary Notebook" about "their incorrigible dirtiness & untidiness" and "their terrible, devouring sexuality":
Within any marriage or regular love affair, he suspected that it was always the woman who was the sexually insistent partner. In his experience women were quite insatiable, & never seemed fatigued by no matter how much love-making.... In any marriage of more than a year or two's standing, intercourse was thought of as a duty, a service owed by the man to the woman. And he suspected that in every marriage the struggle was always the same--the man trying to escape from sexual intercourse, to do it only when he felt like it (or with other women), the woman demanding it more & more, & more & more consciously despising her husband for his lack of virility.
How does this square with his adventures in Rangoon brothels or among Parisian trollops and Berber girls in Marrakech? Was the former colonial cop and declassed intellectual only capable of getting it up with the lower orders? Raymond Williams was much exercised by this class angle in Orwell--an unconscious condescension, a double standard, a writing off of the brute masses because he'd come to feel all politics "was a mode of adjustment to one's own wishes and fantasies." Hadn't he, in Nineteen Eighty-Four, projected his own apathy on the oppressed proles by insisting that, "Under the spreading chestnut tree/I sold you and you sold me"?
But these are difficult thoughts, getting into what Williams called Orwell's "submerged despairs"--the "radical pessimism" and "accommodation to capitalism" of this self-described "shock-absorber of the bourgeoisie." Meyers will no more entertain them than he will explore the kind of craft questions that bring out the best in Peter Davison--on, for instance, how those magnificent essays about elephants, toads and Dickens got themselves written. Or the precise debt of Nineteen Eighty-Four to Yevgeny Zamyatin's We, Jack London's Iron Heel and Katherine Burdekin's Swastika Night. No mention in Meyers, either, of how the 1955 film version of Animal Farm omitted the last-scene melding of men and pigs, which might have opened questions about cultural expropriation, body-snatching and even Doublethink--all for the greater good of the cold war cause. In all Meyers's many pages, not a single sentence stops us in mid-platitude to say anything half as intellectually arresting as these several in Raymond Williams, on Orwell's recurring patterns:
This experience of awareness, rejection, and flight is repeatedly enacted. Yet it would be truer to say that most of Orwell's important writing is about someone who tries to get away but fails. That failure, that reabsorption, happens, in the end, in all the novels mentioned, though of course the experience of awareness, rejection, and flight has made its important mark.
To think these thoughts is then to ask whether, on a fundamental level, Nineteen Eighty-Four had much of anything to say to Chinese students or the Velvet Revolutionaries, who turned out to be made of sterner stuff than Winston Smith.
Instead, we get the same old stories: St. Cyprian's, with Cyril Connolly and Cecil Beaton; Eton and his unrequited crush on a younger boy; Burma, where he briefly imagined that "the greatest joy in the world would be to drive a bayonet into a Buddhist priest's guts"; Paris, where he wrote and destroyed two novels; teaching boys, selling books, being rejected by T.S. Eliot, marrying Eileen; Spain, Morocco and the Blitz; the BBC, the adopted child and the dead Eileen; P.G. Wodehouse, Edmund Wilson, Animal Farm and the audition of the widows in waiting--after which egregious Sonia, the widow everybody loves to hate, who is said here to have spat in disgust whenever she passed a nun on the street.
And along with the famous decency, the equally famous abuse: W.H. Auden was "a sort of gutless Kipling." William Morris, Bernard Shaw and Upton Sinclair were "dull, empty windbags." Off with the heads of "the creepy eunuchs in pansy-left circles" and "all that dreary tribe of high-minded women and sandal-wearing and bearded fruit-juice drinkers who come flocking towards the smell of 'progress' like bluebottles to a dead cat." Wouldn't it be loverly "if only the sandals and the pistachio-colored shirts could be put in a pile and burnt and every vegetarian, teetotaler and creeping Jesus sent home to Welwyn Garden City to do his yoga exercises quietly!"
Wilfrid Sheed once said that Orwell wrote best about the things he hated. So maybe we're just lucky that some of the things he hated were more important than sandals and vegetarianism.
But for now, it's the night before. Let us receive all influxes of vigor and real tenderness. And at dawn, armed with ardent patience, we shall enter the splendid cities.
(Rimbaud, A Season in Hell)
I am reminded of Simone Weil, who also negated herself, who willed herself out of this world. At her funeral, the priest arrived too late, because of a stalled train. At Rimbaud's funeral, nobody came, because his mother kept it secret. Orwell is remembered on the one hand, by Malcolm Muggeridge, as having "loved the past, hated the present and dreaded the future," and on the other by H.G. Wells, as "a Trotskyist with big feet." And George himself told us that "saints should always be judged guilty until they are proved innocent."
So Rimbaud gave up poetry when it failed to change the world. Orwell at the end must have had his doubts about language, too, or he wouldn't have dreamed up Newspeak. Neither is remembered for his hard work at identity-making. Instead, the poet's name is worn by freaks, geeks and videodrones as if it were a logo on a T-shirt or a jet-propelled sneaker, and the novelist is propped up on a horse like the dead El Cid to frighten the Moorish hordes. They have both been turned into the standard-issue celebrity flacks of this empty, buzzing time, selling something other than themselves, unattached to honor, glory, kingship, sainthood or genius. They join a talk-show parade of the power-mad, the filthy rich and the serial killers, the softboiled fifteen-minute Warhol eggs, the rock musicians addled on cobra venom, the war criminals whose mothers never loved them and the starlets babbling on about their substance abuse, their child molestations, their anorexia and their liposuction. "I have never belonged to this race," said Rimbaud.
The twentieth century produced few American heroes like Joe DiMaggio. He was arguably the best all-around ballplayer who'd ever taken the field, a unique combination of power, speed and grace, a lifetime .325 hitter with a classic swing and an unworldly calm whose fielding was as nearly flawless as it seemed effortless. He was not a fidgeter, adjusting batting gloves a hundred times (there were no batting gloves). Once he squared his bat, said his friend Lefty Gomez, "the guy was a statue." There was no wasted motion on the field--he flowed to the ball--and no hotdogging: The fielders' mitts were too small for snap-catches. Those of us who saw him play when we were teenagers would caricature the batting styles of other players, but we all wanted to look and move like DiMaggio. He was also the possessor, as any fan knows, of what is the most extraordinary feat in baseball, and perhaps in any sport, a fifty-six-game hitting streak that defies all statistical logic and that most people believe will never be matched again. That in itself is the material of myth.
But there was something else as well. When he first appeared in a New York Yankees uniform in 1936, he seemed to come from nowhere at the very moment when both the Yankees and a depressed nation--and the rising second generation of Italian-Americans--seemed to need him most. Paul Simon's line "where have you gone, Joe DiMaggio," could have been written as anticipatory longing thirty years before it became ironic sentimentality.
Unlike the boisterous beer-swilling Babe Ruth, who'd retired the year before, DiMaggio, the son of an immigrant Sicilian fisherman from San Francisco, became the essence of that elusive thing called class. He rarely spoke; he dressed superbly--another thing he would become known for--and he seemed to conduct himself, both on and off the field, with a royal calm, even an icy distance, that only enhanced the allure. The Yankees, in those days when baseball was the national pastime, had won the World Series just once since their Murderers' Row rampage of the 1920s. In the four years after he arrived, they would win four times. In his thirteen war-interrupted seasons--the last was 1951--they would win the pennant ten times. He played not only to win--to drive his team to win, often playing through his own pain, the bone spurs in his heels, the aching knees, the trick shoulder--but to play flawlessly. He was the epitome of Yankees royalty.
And somehow, after those thirteen seasons, when the myth might have faded into an endless haze of celebrity golf tournaments and testimonial dinners, it seemed only to thrive--through the 286-day klieg-light royal marriage to Marilyn Monroe, the ensuing divorce and the love that seemed to survive both, through the Mr. Coffee and Bowery Bank commercials and through tawdry rounds of high-priced baseball-card shows and memorabilia signings--little seemed to tarnish the mythic glow. If anything, the forty-eight years after DiMaggio's retirement--he died in 1999--seemed only to burnish it. Almost from the moment he arrived in New York, people wanted to touch him, do him favors, run his errands, drive him places, give him things. Cops gave him access to places denied anyone else. He rarely paid for his own meals, his own cars or even his own hotel rooms. There would always be guys eager to be his delivery boys, to bring him women--mostly the blond showgirls he preferred--even some who moved out of their homes to be with him, to take care of him. Anything for the Dago. The name was used with so much affection that it became an honorific.
But of course there was more--lots of it--and Richard Ben Cramer is there to mine every ugly moment: the money, ultimately more than a million, that came under the table in hundreds and two hundreds from mobsters (who adored him even more than did other American males, and who found him a useful draw to Toots Shor's, El Morocco or the other clubs and restaurants they controlled in New York); DiMaggio's compulsive whoring, combined with his possessiveness--unto physical abuse--of his two wives; the estrangement from his own brothers, who were also big-league ballplayers; the frosty rejection of his son (except when publicity photos were required), who would die of a drug overdose; the envy directed at other great players; the grudging World War II military career that he spent in safe, warm places playing baseball for the prestige of the brass under whom he served; the obsessive money-grubbing--$150, or $175, for each signed baseball, each signed bat, each photograph, thousands upon thousands of them, deals upon deals.
Cramer contends that DiMaggio not only wanted the money--he was pathological in the thought that others would profit: "Who else would make money off the deal? How much? Why should those guys make a buck off my life?" The fear went back to the beginning of his career, to the days before free agency when ballplayers were chattel: Club owners like beer baron Jacob Ruppert of the Yankees and his general manager Ed Barrow owned not just the players but many of the writers and columnists as well. You could try to hold out, but in the end, it was the owners who set the terms; you either played for the team that owned you or you didn't play at all. Worse, as DiMaggio discovered early in his career, even the attempt was likely to expose you to a torrent of press and fan abuse as an ingrate. The same newspaper hacks who could manufacture heroes could just as easily be turned to embarrassing them or tearing them down. DiMaggio, the idol who was making the owners additional millions in attendance, was lucky to get his $25,000, or his $40,000. In the Depression years, those seemed like princely sums. In a way, you could understand the paranoia about other people making money off you. Lots of them tried.
In the course of telling the story, Cramer seems to have turned over every rock in DiMaggio's life, but in the end even he seems uncertain how to frame his flawed hero's life, caught up, on the one hand, in the man's greatness and lavishing us, on the other, with his rage, his distrust, his shabbiness.
DiMaggio excelled and continued to excel, against the mounting "natural" odds. He exceeded, withal, the cruelest expectations: He was expected to be the best--and he was. He was expected to be the exemplar of dignity, class, grace--expected to look the best.... And he looked perfect.
DiMaggio did for us--for the sake of our good opinion--through every decade, every day. He was, at every turn, one man we could look to who made us feel good. For it was always about how we felt...with Joe. No wonder we strove for sixty years to give him the hero's life. It was always about us. Alas, it was his destiny to know that, as well.
Of course it was always about us; what else could it be about? But as with a lot of other latter-day muckraking of heroes "who did for us"--Roosevelt, Eisenhower, Kennedy--the ground rules have changed. Even the un-kept, independent sports writers of the 1930s and 1940s would never have written the other DiMaggio story, would have respected the man's privacy, as the White House press respected Kennedy's. (Through Marilyn Monroe, of course, the two stories were linked: DiMaggio thought maybe the "fucking Kennedys" had killed her.) If we were charmingly naïve then, a nation of hicks who liked simple morality tales, our confessional age now demands full disclosure--we expose our potential heroes before they even have a chance to show their stuff. Cramer, who won a Pulitzer Prize for international reporting and wrote a fine book about the 1988 presidential campaign, gets himself caught in between--still in love with the performance, the style, the heroism, but probing the private, inner man until little is left. Heroes on pedestals are all fair game. But Cramer gives us little help in squaring the two DiMaggios. How do we hold the one without forgetting the other? In the end, it's even hard to square what Cramer tells us about DiMaggio's admiration for--and friendship with--people like Woody Allen with the shallow DiMaggio he mostly gives us.
What makes that even more exasperating is that Cramer gets into his characters' heads, reports events and quotes conversations with no attribution. The book's acknowledgments include a huge list of people, from old ballplayers to Henry Kissinger, himself a DiMaggio idolater from the 1930s who would later sit with the Clipper at Yankee Stadium and get enlightenment about the subtleties of big-league pitching and hitting. But there are no footnotes, no lists of sources. In the hours after the 1989 San Francisco earthquake, Cramer reports, DiMaggio rushed to his sister's house in the Marina--the house, which he had given to his family many years earlier, was undamaged--and emerged with "his big right hand around the neck of a garbage bag...which held six hundred thousand dollars, cash." How does he know that--not the part about the bag, but about the contents? And where did the cash come from? (It seems to have belonged to some long-gone mobster who was making certain that he could make a fast exit if necessary, but we are not sure.) There's also the touching story about Marilyn Monroe's tour entertaining the troops in Korea in 1954, three years after DiMaggio--who wanted his wives to be homebodies and never approved of their careers--had retired. "Joe," she said on her return, "you never heard such cheering." "Yes, I have," he said. Where did that come from? And when "he was off to himself, on his cot, thinking about (his first wife) Dorothy," where did that come from?
To compound the exasperation, Cramer likes to affect a wise-guy writing style that's often more annoying than evocative. The ambient sporting life of 1930s New York is itself a nice story, full of Guys and Dolls characters--prizefighters, jockeys, ballplayers; Broadway showgirls; politicians like La Guardia, columnists like Walter Winchell and Sidney Skolsky; small-time hoods like Jimmy "Peanuts" Ceres, who drove DiMaggio around, and some big-time ones as well, Ruggiero "Richie the Boot" Boiardo, Joe Adonis, Abner "Longy" Zwillman, "who put the 'organized' in organized crime"; Toots Shor himself, who loved the Dago and would later be spurned by him, as would so many other onetime friends. But the Runyonesque rhetoric gets in the way: sentences like "See, Joe had to have a private life," or "See, Gomez was gone," or "In the sixth, Joe got ahold of a pitch...", or "Winchell, Len Lyons, that nosy Kilgallen broad; even the battle-ax, Louella Parsons, used to write up Joe like an old friend" or (even more bizarre) "Joe was digging for second base, when Gionfriddo, in an act of God...and--Cazzo! Figlio di putana!--stole the home run away from DiMaggio." Now who said (or thought) that?
It's hard to deny Cramer's portrait of DiMaggio as an empty and increasingly lonely and embittered man, whose lifelong act as an aging public monument could only have added to the bitterness. "From the start," Cramer writes early in the book, "he had to have it both ways: he wanted to be well known at what he was known for--and for the rest, he wouldn't be known at all." We once allowed our heroes that privilege--but as Cramer's book demonstrates, we permit it less and less, either to the living or the dead. If DiMaggio had cooperated, he would probably have received more consideration, but DiMaggio being who he was, no such cooperation could have been expected. In the end, our sympathy is restored only by the venality of his lawyer Morris Engelberg, who continues to mine DiMaggio's memorabilia and exploit his name even more ruthlessly than DiMaggio did. In the penultimate moment in Cramer's book, a few minutes after DiMaggio's death, there is Engelberg, in DiMaggio's room, ordering the nurse to force DiMaggio's 1936 World Series ring, the only genuine one he had left, from the dead hero's finger. When the nurse succeeded, "Morris yanked [it] out of his hands, and left the room in a hurry." He would claim that DiMaggio "gave him that ring, on his deathbed--before Joe died in his arms."
Thirty years ago, I went to the San Francisco Giants Arizona spring-training camp to do a magazine piece on Willie Mays, another of our imperfect diamond heroes. How much, Mays asked, was he going to get paid for cooperating? At that point, I decided I would simply hang around for a week or two and watch and listen. There was little he could tell me, I decided, that would strengthen the piece. (In the days following, I learned more than I ever expected--about Mays, about the changing culture of baseball and about the game itself.) Sometimes, maybe, the work of athletes, like that of dancers or, for that matter, composers or actors or novelists, deserves to be well known, as DiMaggio seemed to wish, without the unceasing pursuit and exposure of all the rest. In some cases, say in Mozart's or Wagner's or J.D. Salinger's, or maybe even in Bill Clinton's, if you can't separate the neuroses or the anti-Semitism or just the ordinariness of a man from the public performance--you may never know greatness at all. But it gets harder every day.
In the Acknowledgments section of his biography of Saul Bellow, James Atlas quotes a somewhat greater biographer, Samuel Johnson: "We know how few can portray a living acquaintance, except by his most prominent and observable particularities, and the grosser features of his mind, and it may be easily imagined how much of this little knowledge may be lost in imparting it, and how soon a succession of copies will lose all resemblance of the original."
Johnson knew few of those whose lives he described and none nearly as well as Boswell knew him. (Would he have been as pessimistic about the unreliability of history and biography if he'd read Boswell's book? Probably more so. The truer the portrait, the more repellent to such a subject.)
I'm not as pessimistic about discovery as Johnson was. So, for instance, well as I knew Bellow before reading Atlas's biography, I think I know him better now.
I mean that I know more about the places he lived, what his parents were like, what others thought of him, what he said about many things, including me. (To my surprise, I learned that I was once mentioned in his will and that, perhaps after one of our arguments, I was removed from it.) It doesn't mean that my view of Bellow now is Atlas's. By no means.
Atlas also knows Bellow and was helped by him in the course of writing his book.1 He writes that he immersed himself in Bellow's records and acquaintances far more than he'd done in work for his prizewinning biography of Delmore Schwartz (whom he'd never met). Atlas wonders, though, if familiarity and labor have produced a better book. I think this is a better book, largely because Bellow is a more brilliant and interesting man than Schwartz was. (Indeed, his version of Schwartz in Humboldt's Gift is more interesting, amusing and touching than the one in the Atlas biography, which was--we learn in the new book--inspired by it.)
Better, truer; more interesting, more touching.
The first two distinctions don't matter in works of fiction. So the uproar over Bellow's Ravelstein and the real Allan Bloom doesn't bear on its power as a novel or, on the other hand, on the pain it gave and gives some who saw themselves "portrayed" and/or "betrayed" in it. They do matter, however, for biography. Would Boswell's Life of Samuel Johnson be as good a book if it were a work of fiction--if, say, the Johnson in it hadn't lived or been a totally different man? It would not be. The understanding a biographer establishes with his readers includes the sense that he is telling as much of the truth as he's been able to gather about actual people and events. If that understanding is compromised, it constitutes an aesthetic betrayal different from--and, in my view, worse than--the "betrayal" of a fiction writer's acquaintance in his fiction.
I'm one of the many Bellow friends Atlas interviewed and whom he cites in Bellow: A Biography. Much I know and feel about Bellow is not in the book because I didn't tell Atlas about it. Some of it would have somewhat altered his portrait of Bellow; none of it would have altered it significantly.2
Most of the book's citations from me are from letters Bellow wrote me or I him.3 Such citations constitute the sort of record biographers and other historians have drawn on for the two or three hundred years in which history has been assessed as a function of it. If I'd given Atlas access to my diaries, he would have found another source of Bellow matter that would have expanded--if not deepened, let alone altered--his view of his subject. The subject of every biography has had millions of thoughts and experiences that have never--thank God--been recorded. It means that the gulf Johnson wrote about is an uncrossable one.
The difference between modern history/biography and, say, what constituted their equivalent in Thucydidean Athens or seventeenth-century Europe is enormous. Scholars don't believe that Pericles delivered the magnificent oration that Thucydides attributed to--that is, wrote for--him, though he probably delivered a speech that resembled it. Our problem with a presidential speech today is not the actuality of the words pouring from the presidential mouth but who wrote and even who conceived them. We're content that our conception of Periclean Athens is to no small degree that of Thucydides' interpretation of it, but the historical standard is different for modern events and people, those who leave their tracks in letters and diaries, interviews and film.
Atlas uses such archival materials and such biographical techniques as interviews, and he is far more aware of the hazards as well as the advantages of such usage than, say, Vasari was in his verbal portraits of fifteenth- and sixteenth-century artists, some of whom he knew. An experienced journalist, Atlas has a nose for bias and such vested interest as the desire of ordinary people to be part of the record of extraordinary ones. (This is probably a trait of most biographers.) He also raises the question of how his long biographical labor affected his book. Did he, like his mythical namesake, get so weary of "holding up" the "bewilderingly complex" Bellow world that the exasperated weariness created a portrait as far from actuality as Thucydides' Pericles was from the "actual" Pericles?
I've known Bellow for almost forty-five years. For many of those years, we've been close friends and have said things to each other we may not have said to other people. We have also quarreled, disagreed and not seen each other for months and even years at a time. Our politics have been different, and the difference counted--perhaps more for him than for me. Nonetheless, we are close enough so that a few days before I write these words, we could tell each other on the phone--the first time we'd spoken since my wife and I stayed with him and his wife in their Vermont house two years ago--that we loved each other. We are old men now, and I believe that we both thought it possible that we wouldn't see each other again. In that conversation, I told Bellow that I'd read much of Atlas's book and that he shouldn't be concerned about it. I said that Atlas had built a crate large and secure enough to deliver the marvelous sculpture within.
A few hours later, I finished the last 100 or 150 pages of the book. In them, I detected the kind of weariness Atlas himself mentions, but I saw it as a weariness complicated by judgmental anger. Atlas had interviewed many people who'd been hurt--or said they'd been hurt--by Bellow. Partly as an attempt to maintain his independence of and objectivity about Bellow, partly from exasperated weariness, partly from his sense that he'd surrendered--his verb--his life to another man, a man whom he'd been seeing in part through the angry eyes of others, Atlas became harsher and harsher in his assessments. So I wrote Bellow telling him that although what counted--the portrait of a remarkable person becoming over decades ever more remarkable--was intact, I believe that it was deformed by Atlas's querulous anger, if not by sanctimonious contempt, and that he and Janis (Bellow's wife) would do well not to read it. "Hector and Andromache," I wrote, "don't need to know Thersites' version of their lives."
This was perhaps as unfair to Atlas as I thought he was, at times, to Bellow, but then Atlas writes that I am Bellow's "old and loyal friend," the "Boswellian explainer of the great man to the general public," so any unfairness to him has been--clairvoyantly?--subverted.
Very well. As friend of subject and author,4 I am disqualified from reviewing this--I'll risk two adjectives--fascinating and sometimes brilliant book. I will instead talk about Johnson's concern, the gulf between actuality and its representation in biography, conversation and history.
I've read a number of books and hundreds of articles about people I've known. There are few, though, from which I've not learned often surprising, even shocking, facts, none in which I haven't felt at least some distance between what was written and what I knew. At times, as in the case of Bellow, my complex admiration for the central portrait has complicated and deepened my admiration for the friend portrayed. Reading remarks Bellow made or wrote years before I met him made me realize even more how remarkable a person he was and is.
Twenty-odd years ago, the day after I finished reading the manuscript of Humboldt's Gift, I had lunch with its author and said to him that it was difficult for me to think that the man across the table was the same man who'd written that profound, delightful and beautiful book. The man eating a sandwich and drinking tea talked with me about ordinary as well as extraordinary things, but nothing out of his mouth came close to the depth and beauty of what was on its best pages, and I said something like, "Yet there's less distance between you and your work than between any writer I've known and his."5
Atlas's biography has narrowed that distance for me. For all the schmutz that accumulates about and spatters the central portrait, it emerges as that of a very great man becoming great in the course of a long life of activity, acquaintance, introspection and expression. There is more original power in the intelligence recorded here than in 95 percent of biographies. Atlas does not have the mimetic power of Boswell or of a writer he rightly praises here, Mark Harris, author of a delightful Bellow book called Saul Bellow, Drumlin Woodchuck;6 he does not have the stylistic or analytic gifts of Samuel Johnson or Richard Ellmann, but what he does have is access to hundreds of brilliant Bellow observations and analyses outside of Bellow's books. Atlas's Bellow is like a match, Atlas's contribution being the assemblage and, perhaps, the wooden stem, Bellow's the sulfur that, rubbed, ignites and fires the wood.
The day the galleys of this book arrived in the mail, I saw my sister-in-law, who, days earlier, on a trip with her husband to Israel, had swum in the Dead Sea. She said there were all sorts of perils there, the crystalline spears one dodges to get to the viscous water, which deposits a salty scum on one's skin, and the water's semi-impenetrability, so that if one somehow managed to dive into it, ascending would be dangerously difficult. I felt an analogy to the perils of biography. The subject is himself almost impenetrable, guarded by fearful suspicion and his own complexity; even after getting access to him, the progress is difficult, and biographer-readers are left with the scum of his resistance to their penetration.
I've thought and talked about Bellow--and now this biography--with a few friends who also know him. Each sees Bellow in a somewhat different way; all condemn Atlas's version more than I. (I credit Atlas for collecting and organizing the materials that enable us to know more about Bellow; they blast him for his inability and/or unwillingness to understand him.) One friend, a first-rate novelist, thinks Atlas not only misunderstands Bellow's radical independence but resents it. So he sees a politically correct Atlas piling up criticism along familiar--to Bellow critics--misogynist, conservative and racial lines. He thinks that Atlas is shocked by Bellow's anarchic "cocksmanship," and when I suggested that Bellow had a grand streak of bad boy, if not outlaw, in him, he found a different way to express his own view: "He's a transgressive monkey. And a great con man."7 He makes Bellow into a version of a favorite character of his own fiction, a brilliantly anarchic, half-crazed sexual adventurer.
A former woman friend of Bellow's talked of his powers of devotion and charm. She detests Atlas's portrait, especially the account--to which she feels one of her letters has contributed--of his lovemaking.8 "He made me feel wonderful. I still love him." (She hasn't seen him in ten years.)
I myself have written about Bellow as a man simpler in many ways than other people, one who very early in his life discovered his own powers and let them set his course. More important than what happened to him--and I'm persuaded by Atlas that such things as the death of his mother help explain much later behavior--were these exceptional powers, an extraordinary memory, an extraordinarily acute and cultivated sensorium (visual, musical, olfactory, tactile) and--let's call it--emotional power (unusual ability to empathize, sympathize, love, hate and also be detached). Like most of us, Bellow is many things, but unlike most of us, he's more of a piece and has been that way a very long time. The piece is stamped "writer," indeed "great writer," and the pressure of that stamp isn't like most other professional pressures; but this is something that is talked about ad nauseam meam, and I'll not add to the nauseating complex here.
What I've mostly wanted to hint at is the difficulty of writing, reading and being the subject of other people's descriptions of oneself, and to spell out what Johnson said was the distance between the real, the remembered and the written version of reality, the deformation of the "was" in the "is."
Yet such versions are what we have of the past, the history and biography with which we're left. One work of history can challenge or even refute another, or it can add, refine or subtilize it. Even memories rub against one another. Yet I do not subscribe to the notion (of, say, Peter Novick's splendid book That Noble Dream) that tries to dispose of the actuality of objectivity. I don't think we should abandon the recording of actuality as an ideal or ever think that there's no crucial difference between what we believe is actual and what we know we've made up or lied about. Nonetheless, what we get when we describe something or someone is, at its driest and purest, metamorphosis.
The greatest--at least the most delightful--investigator of such metamorphoses, Marcel Proust, claimed that only in what he called "involuntary memory" does the past ever re-emerge with its original--and even more than its original--power. (Beckett's comment about this was that Proust showed that the only real paradise was a lost one.) That sensuous, unsummoned memory is clarified as reflections in a clear pool are, free of the dust particles and blinding light that make what's reflected almost impossible to see.
Atlas's Bellow is a work built around voluntary, elicited and recorded memory. It is a version of actuality that may be read, sometimes with shivers of remembrance, by its subject and his acquaintances. It has a truth of its own, somewhere between the original actualities, the complex feelings and memories of those who supplied the author with data, and the author's own gifts and feelings. The portrait of the great man who is its subject will be difficult to dislodge. Luckily, the man has left a far more powerful self-portrait, that of the mentality behind his beautiful books.
1. Although Bellow recently told me that he "opened himself" to Atlas, who, lately, seemed to have turned away from him. I said that Atlas probably didn't want his work to be compromised by affection. After I wrote him not to read the book, he answered that he wouldn't, that there was "a parallel" between it "and the towel with which the bartender cleans the bar." This image of biography as the soak of spilled drinks is the sort of thing Bellow has invented for most of his 85 years.
2. One description of me there is so peculiar--"the Oblomov-like Stern"--that I actually wrote Atlas to ask what it meant. When I told Bellow, he said that Atlas had probably not read the wonderful Goncharov novel. When I questioned the adjective in a letter to Atlas, he replied genially that Oblomov "is a sympathetic character and so are you."
3. Most of our letters are filed in the Special Collections of the Regenstein Library at the University of Chicago.
4. Cf. Atlas's well-done interview with me, originally commissioned by George Plimpton for the Paris Review, in Chicago Review, Fall-Winter, 1999.
5. No one seemed more different from his work to me than Samuel Beckett, whom I saw about once a year between 1977 and 1987. Cf. the portrait of him in One Person and Another (Dallas: Baskerville Books, 1993).
6. A book dedicated to me in which I play a minor role.
7. We both remember Bellow's early portrait of the terrific Chicago con man, Yellow Kid Weil.
8. One of John F. Kennedy's "girls" is said to have described the relationship as "the greatest thirty seconds of my life."
The article on electronic books ["On Pixel Pages It Was Writ," June 12] left out the most intriguing aspect of this new format: digital rights management technologies (DRM). These technologies are being developed by the electronic publishing industry to protect the rights of the copyright holders and, of course, are not so diligent about protecting the rights of readers. DRM standards, such as the XrML standard developed by Xerox and endorsed by Microsoft, contain mechanisms to allow publishers to put time limits on reading, to potentially charge by the page or by the minute, to protect against excerpting and printing. These "rights" go significantly beyond the rights recognized by copyright law.
Among the many annoyances of these systems is that works are generally licensed to a particular piece of hardware, such as an individual computer or e-book reader. While the hardware industry is working to make our computing devices obsolete, the content industry is tying our content to those same machines. Upgrade your computer, and you lose access to all the content you have licensed. So, the question is not whether we'll be able to read digital works in the bathtub or on the beach--the question is whether we'll be able to reread them in a few years, quote from them or offer them to friends once we've finished with them.
BREAD, CIRCUSES & MUSIC
Siva Vaidhyanathan, in "MP3: It's Only Rock and Roll and The Kids Are Alright" [July 24/31], suggests that Metallica has somehow "forgotten that it got rich through free music" simply because the band objects to Napster's accessory to theft. Giving away free music to build a following is a valid business model; as a musician, I may do the same thing. But the fact that Metallica gave away the music it once created has nothing to do with whether it wants to (or ought to) give away the music it now creates. That's Metallica's choice, but Napster, Gnutella, etc., make it easy to take that choice away; they don't distinguish between music that an artist has granted permission to distribute free and music that some unethical third party has offered without the artist's permission.
As we celebrate the demise of the recording industry's distribution near-monopoly, the distinction between freedom of information and respect for intellectual property is being ignored. Music fans rejoice in the "right to free music" Napster has brought them, but it has brought them no such thing; it has simply permitted them to do something possibly illegal without facing the consequences. Any musician will tell you that artists are last in line to get paid; stealing from them and justifying it by pointing to recording industry profiteering is intellectually dishonest.
Vaidhyanathan levels fair criticism at the recording industry, which is clearly fighting a losing battle to retain its monopoly over distribution channels, but his dismissal of Napster as a serious issue defeats his own alternative. He points out that bands can bypass the entire conventional production/distribution/marketing monopoly through home production and Internet alternative-music websites, charging "$1 per song for MP3 downloads." But Napster, Gnutella and the rest don't come close to enabling that business model to be used; in fact, they make it absurdly easy to defeat. Certainly, these services are not going to go away, but it's crucial to recognize that they are morally and ethically neutral and that they fail to make distinctions between lawful and unlawful behavior. How to support the decision of the artist about how his or her music is to be distributed is the conversation we ought to be having. My music is not yours simply because I created it.
TALES OF 'OLD BUBBLEHEAD'
In a hagiographic review of the Culver/Hyde biography of Henry Wallace ["The Wallace Doctrine," June 12], Kai Bird rhetorically inquires as to "who wouldn't" like its protagonist. I, for one. Whatever Truman's failings, at least he didn't belong to a weird cult in which he used the code names "Shamballal" and "Logvan" (his wife was "Poroona") and uttered such inanities as "I shall obey the Gita as remorselessly as Krishna." For all his loony mysticism, Wallace was quite capable of double-crossing his guru, Nicholas Roerich, when he decided Roerich had become a political embarrassment. Having sent him on a mission to Asia, Wallace prevented him from returning by threatening him with a $14,000 tax lien.
Wallace's insensitivity in personal relations was legendary. Given a new car when he married, he went off on a three-hour solo spin while his bride waited in bewilderment. A rich man, he was such a stingy tipper that at restaurants aides would have to surreptitiously flesh out his niggardly gratuities. In World War I, his well-heeled family kept him out of military service as an "essential farmer." After the 1948 election, he walked out of his headquarters without a word of thanks to devoted campaign workers. When asked by H.L. Mencken about the "guru letters"--fawning missives he had addressed to "Beloved Master" Roerich in happier days-- Wallace weaseled, causing intense mirth among the press corps, who unaffectionately referred to him as Old Bubblehead. Objective scrutiny of the man and his record makes Westbrook Peglers of us all.
So Wallace was quite a character! I'll still take his eccentricities any day over the men who defeated him.
BOWLING ALONE IN THE 8TH CIRCLE
The August 7/14 issue contained two articles that, when read against each other, produce serious discontent. In "The Crack in the Picture Window," Benjamin Barber's review of Bowling Alone, we are presented with an analysis of the loss of "social capital" and "civic grace" in the face of growing social isolation. It's astounding that no mention is made of the profound dominance of social life by corporations. In fact, no meaningful reference to the tyranny of corporate power occurs in the entire review.
If only Barber had read E.L. Doctorow's passionate polemic in the same issue, "In the Eighth Circle of Thieves." Doctorow sees clearly that American life, outside and in, is manipulated for the sake of corporate dominance and gain; the result is distorted priorities, child poverty, media domination, the swelling of ethnic prison populations, the high cost of health insurance, international trade agreements that defeat national environmental laws--"the list is long."
Doctorow calls on the iconic power of Whitman and proposes a reform bill. Barber fails to recognize corporations, but Doctorow, apparently, fails to recognize that corporations are embedded in a social-productive system--capitalism. Capitalism forces the movement of corporations along their various paths to venality and social-environmental destruction. What has come to dominate The Nation is a new populism, a recognition of large-scale social destructiveness divorced from its underlying economic determinant.
S. Gardiner, Me.
E.L. Doctorow remarks that "campaign finance reform as a phrase has been bruited about so long and to so little effect and is so yawningly dull, dreary and unresounding, it makes one wonder if it's not partly responsible for the conditions it has so far failed to address." I totally agree. "Graft" seems the appropriate term. It puts the focus on the politician, which is exactly where it belongs.
And while we're calling a spade a spade, how about returning the name of the Defense Department to its historical and accurate name, the War Department? It would have a major effect in stemming that hemorrhage from the public treasury. Just imagine how it would sound: "President recommends increase in the war budget." It would be a well-placed thorn in the media's bag of foul air.
R. D. BALDWIN
OLD LEFT/NEW LEFT, RED LEFT...
Tom Hayden ["Harrington's Dilemma," June 12] draws a plausible lesson from Michael Harrington's life: The Shachtman-Harrington crowd shouldn't have been so nasty to the rest of the left. But there's another lesson, more relevant for today: The left is torn apart and weakened when part of it makes peace with the US war machine. When Harrington was expelled from Norman Thomas's Socialist Party in 1952 "because of his involvement in trying to take over its youth branch," the underlying reason was that Harrington was against the war in Korea, while Thomas was for it. And in Hayden's 1965 debate with Irving Howe, I'd have been more upset at Howe for supporting the Vietnam War than for his "paternalistic needling." Now that Soviet-style Communism is dead and buried, the US empire is more powerful and seductive than ever. Drawing this lesson seems more important than rehashing old feuds among ex-Communists, ex-Trotskyists and ex-New Leftists.
For those of us who knew Harrington and worked with him, one of his more endearing qualities was his capacity to reflect, in a self-critical way, on his political past. Both in his published writings and in conversation, he would forthrightly state that he mishandled the relations between the parent League for Industrial Democracy and the newborn Students for a Democratic Society in 1962, that he waited too long to express publicly his opposition to the Vietnam War and that his censure of the New Left had often been unduly harsh and unnecessarily polarizing. Harrington's description of that behavior as "stupid" in the copy of his autobiography he signed for Hayden was quite characteristic.
But Harrington and others from the old left had no monopoly on stupidity and sectarianism. Those of us who came of political age as part of the New Left contributed mightily in both of those areas, and any reasonable account of that period would have to address the incredible self-destructiveness of that movement, which ended with SDS dissolving into a bunch of warring sects adhering to the worst caricatures of Marxism-Leninism and Stalinism. Until we of the New Left are as direct and as honest in our self-evaluations as Harrington was, we will be willfully blind to our own history. Hayden made his share of mistakes, and then some, as a leader of SDS and the New Left, and one would have hoped he would use this review to acknowledge them. If there is a "true believer" in this story, it is much more my fellow New Leftist Hayden than Harrington.
"Indeed, it seems to me that Nader, who is a reformer acting empirically, has in many ways raised more radical questions, and possibilities, than the European social democrats. His lead should be carefully followed."
Prophetic words? They were written by Michael Harrington in 1972, in Socialism, chapter 12. The torch was passed, unremarked, nearly thirty years ago. Now it's up to the rest of us to unite behind another torchbearer in an international Green-Red movement. Is that Michael's ghost with a hopeful smile?