New York City
Win McCormack's sophisticated examination of conservative tactics in the last election was fascinating ["Deconstructing the Election," March 26]. Such sophistication is not necessary to an understanding of conservative intellectual resilience in defending the interests of the dominant economic powers. 'Twas ever thus. There is only one consistency in conservative analysis of the role of government: It is to follow its historical role of protecting the dominant interests and transferring wealth from those who have little to those who have much.
I'm glad the web-based version of your magazine is free. Otherwise I'd be forced to ask for a refund, not to mention punitive damages for the waste of my time (and, undoubtedly, a number of brain cells) caused by reading this article. I'll spare you my opinions and critique. Let's just paraphrase Foucault and say that this is the kind of writing that gives bullshit a bad name.
I am sympathetic with Win McCormack's applying poststructural analysis to the presidential election, but he makes a glaring mistake. The quote he attributes to Michel Foucault, that Derrida is "the kind of philosopher who gives bullshit a bad name," was in fact uttered by UC Berkeley professor John Searle, who made the statement in a 1983 article in the New York Times Book Review.
New Haven, Conn.
I hope I may be pardoned if I quibble, in prepostmodern fashion, over a minor point. I have it on good authority that the wonderful remark on Derrida being the sort of philosopher "who gives bullshit a bad name" comes not from Foucault but from Richard Rorty. But, of course, if the interpretation and recounting of all "texts" really is indeterminate, it perhaps doesn't matter all that much anyway.
JOSHUA L. CHERNISS
The "bullshit" quote is postmodernly elusive. Richard Rorty is fairly certain he never said it. John Searle admits to using it but not to originating it. In his article, McCormack relied on the rarely correct Dinesh D'Souza, who attributed it to Foucault. Scholars pronounce it "un-Foucauldian."
St. Peter, Minn.
I enjoyed Win McCormack's review of the Florida debacle and appreciate seeing the crisscrossing issues brought together in one place. But I was annoyed by his harping on "irony" and hints that Baker & Co. were secret advocates of the "postmodernism" Lynne Cheney castigates. Two errors: 1. Foucault's rejection of objective neutrality is premised on the principle that there's no such thing as objectivity independent of somebody's constructive work--nothing counts as "neutrality" in that sense; rising above subjectivity is an essential impossibility, not one based on human fallibility. This isn't at all the same as James Baker's claim that people are fallible and cannot arrive at objective truth, which nevertheless exists and is better approximated by nonpartisan machines than biased people.
2. There's nothing ironic about Republicans behaving the way they say they don't. It's a nifty example of Foucault's power theory, but you'd hardly expect Lynne Cheney to embrace Foucault. Not unless you find it "ironic" that capitalists are still behaving the way Marx said they do even as they pronounce Marxism dead and discredited. That's what Marx said they'd do. Lynne Cheney writes in a way Foucault anticipates even as she attempts to discredit him. Nothing ironic there.
RICHARD A. HILBERT
Win McCormack's article reminded me of Nixon winning elections by calling his opponents communists and later saying he knew they weren't communists but he had to win. As someone who was at the Inauguration protests in Washington and in several other protests, I was especially interested in the parts about the paid Republican protesters in Florida. I encountered Republican protesters here at the governor's mansion during the election fiasco--nasty, horrible, mean-spirited people. When we protested the presence of the Fortune 500 group and Vicente Fox at U Texas, we were held back by a horrifying force of police in riot gear. Our protest community is notoriously peaceful, but no one was protecting us. The police got to try out their new toys--like rubber bullets--against some college students at Mardi Gras, causing several injuries and terrifying us all. Despite strong objections at a city council meeting, the police got a large raise, a toothless oversight committee, no civilian review and were sent on a junket to Seattle to learn crowd control! If we protesters had tried anything nearly as threatening as what Republicans staged in Florida, the police would have caused a bloodbath, and the media would have blamed us.
Win McCormack effectively conveys the tendentiousness, hypocrisy and even demagogy that characterized the Republicans' strategy in Florida. But I take exception to his claim that we require Foucault's concept of "a battle among discourses" to properly understand this historical event. Tendentiousness, hypocrisy and demagogy have characterized political rhetoric since well before the birth of poststructuralist philosophy. They have been analyzed with great acuity by, among others, Machiavelli, who advocated deploying them prudently, and Jürgen Habermas, whose ethics of discourse repudiates them.
McCormack wrongly invokes the term "discourse" to describe the position of one party in a two-party or multiparty controversy. Discourses for Foucault are analogous to what we might call the "paradigm" (Foucault would say "discipline") within which a controversy occurs. A Foucauldian approach to the Florida deadlock, therefore, would involve studying the underlying social, political, economic and cultural relations of power that determined which truth claims were accepted as valid.
It is true, as McCormack notes, that a subjectivist or relativist epistemology underlies this approach to the study of the relationship between power and ideas. But the fact that James Baker raised the problem of "individual subjectivity" on the canvassing boards in no way confirms Foucault's theory of power, as McCormack claims. To assert that election officials may be subjective is a far cry from demonstrating the validity of the proposition that everything is subjective. At most, McCormack may be able to claim that widespread acceptance of Baker's argument would demonstrate that some number of people have embraced a Foucauldian theory of power. This, of course, would no more confirm the theory than does Baker's charge of bias on Florida's canvassing boards.
Ultimately, we do not require Foucauldian concepts to understand what happened in Florida. George Bush personified hypocrisy by contesting the constitutionality of a voting procedure he signed into law in Texas; the US Supreme Court's justification for stopping the manual recount was manifestly tendentious; and Baker's claim that a prolonged electoral struggle would undermine US international standing was demagogic. Foucault can help us understand how networks of power determine whether these discursive acts come to be accepted or rejected. But to understand the corruption that pervaded GOP strategy in Florida, we need only to have been paying attention.
McCormack turns out to be prophetic of future postmodernisms by the more right-wing elements of the Establishment. Writing for an undivided Supreme Court, Justice Clarence Thomas informs us, "It is clear from the text of the [Controlled Substances] act that Congress has made a determination that marijuana has no medical benefits worthy of an exception"--a nice reminder that "power is knowledge."
The idea of different lenses through which history can be viewed and refracted (and twisted) was satirized by E.M. Forster in his seminal (hell, downright ovular) 1909 dystopian satire "The Machine Stops"; and the malleability of the past and the social construction of knowledge and the universe would be no news to George Orwell's O'Brien in 1984 or to the Stalinists, Nazis and other totalitarians he represents. No one knew that power is knowledge better than the authoritarians and totalitarians of the first half of the twentieth century, and later.
What's postmodern now is the degree to which "the best lack all conviction": the degree to which twenty-first-century intellectuals lack the ontological and epistemological foundations from which to argue that Congress might just, concerning medical marijuana at least, be in error, cruel and--in a nontheoretical formulation--full of shit.
RICHARD D. ERLICH
D'outre-Tombe [Beyond the Grave], France
Imagine my post-mortem shock at seeing my name on the cover of The Nation, somehow linked with the slogan "History Is Entirely Subjective." Am I not among those who pointed out in the 1960s that "Man" is a recent invention, and one fast approaching its end? If you want a slogan from my work, how about this one, spoken by an anonymous voice at the end of The Archaeology of Knowledge: "Discourse is not life; its time is not your time. In it, you will not be reconciled to death; you may have killed God beneath the weight of all that you have said. But don't imagine that, with all that you are saying, you will make a man that will live longer than he." "Entirely subjective" indeed!
I am gratified by the voluminous amount of mail that arrived in reaction to my article. I will respond to only one point, as I think that will enable me to expand and clarify my central thesis.
It may be true, as Jason Neidleman contends, that ultimately we do not need to deploy a Foucauldian intellectual apparatus to grasp the basic structure of what happened in Florida. However, I was more concerned with Republican or conservative intellectual superstructure. Conservatives have for some time now been claiming that their movement possesses a moral and intellectual integrity superior to that of their political and ideological adversaries and have repeatedly cited the widespread embrace of "decadent" postmodern (by which they mean poststructuralist) theories in liberal academia as partial evidence of that. Convincing the public of this putative superiority is much of what they have in mind when they speak of fighting, and winning, the "cultural wars" and is the very goal of tracts like Lynne Cheney's Speaking the Truth and Dinesh D'Souza's Illiberal Education. In that context, the fact that their behavior in Florida, from the rough-and-tumble street level all the way up to the hushed, august chambers of the Supreme Court, reveals them, on their own chosen terms of discourse, to be intellectually and morally inconsistent and bankrupt seems more than noteworthy and was, ultimately, my real point.
George W. Bush's energy plan fudges the facts, raises false alarms, shamelessly peddles halfhearted green measures--all to provide a cover under which to slide the oil industry's wish list. Jimmy Carter, who knows a real energy crisis, in a Washington Post Op-Ed accused the administration of using "misinformation and scare tactics to justify such environmental atrocities as drilling in the Arctic National Wildlife Refuge." Bush cites California's troubles as a call to action for a plan that does not address them. It revives nuclear power with no ideas on where to safely put the waste, trashes environmental regulations and airily dismisses international concerns about global warming, as UN Secretary General Kofi Annan pointedly noted in a little-reported speech at Tufts University on May 20. "We do not face a choice between economy and ecology," Annan said. "In fact, the opposite is true: Unless we protect resources and the earth's natural capital, we shall not be able to sustain economic growth."
Bush's plan, crafted in secret sessions with input from industry reps and none from consumer advocates, is a mélange of vague ("make energy security a priority of our trade and foreign policy"), sometimes contradictory suggestions and steps the Bush Administration can take unilaterally (easing regulations for the electric, oil and nuclear industries). It will unleash much squabbling in Congress, as legislators take up other portions of the package. Not only will enviros face off against industros on assorted fronts--the consensus of the moment is that Bush's plan to drill for oil in the Alaskan wilderness is near-DOA in Congress--but energy-producing states will square off against consuming states.
Industries themselves could be at each other's throats, competing to gain an edge via legislation. Natural gas companies, for instance, have no interest in seeing environmental rules relaxed for coal-burning utilities. Electricity cooperatives will wrangle with electric utilities. Northeast power-generators could tussle with Midwest utilities over emissions. Conservative free-marketeers will decry using the tax code to assist one industry or another. All in all, this is a full-employment project for Washington lobbyists. Expect campaign contributions from energy companies to rise faster than the price of gasoline.
But the many fault lines and divides have yet to be defined, and the possibility of Jim Jeffords's switch (still not officially announced at press time--see the editorial, following) injects a new element of uncertainty into the mix. Democrats are in some disarray; senators and House members have home-state concerns. Think of John Breaux and Mary Landrieu of oil-producing Louisiana, who already have broken with their party to support a tax bill resembling the Bush plan. As the tax-bill fight demonstrates, Republicans need only to peel away a few Bush-friendly Democrats in the Senate to succeed. And in the House, as one senior Democratic staffer notes, "there are a number of Blue Dog Democrats who get their money from the same people who fund Bush and Cheney." But at the same time, there are House Republicans--including several in California--worrying about energy legislation that rewards price-gougers and gives no short-term relief to consumers.
One liberal-leaning Democratic senator surveys the landscape this way: "Senate Democrats should be able to stick together on much of the environmental and policy matters regarding, say, regulations and more resources for conservation and alternatives. It's going to be much harder on the populist front. They're not all going to want to be tough on the industries."
The Democrats' message so far is that Bush and Cheney are just pimping for Big Oil and other energy interests while trampling the environment. Perhaps that will resonate with voters. But it remains to be seen if Democrats can sell a bold alternative approach that promotes conservation, efficiency and renewable energy. At a recent meeting with reporters, House Democratic leader Dick Gephardt dismissed the notion of raising fuel-efficiency standards for cars and SUVs. Like his Republican foes, he fears proposals that might impinge on the American way of life (read: consuming oil in gluttonous amounts).
Republicans think that blaming environmental extremists for the nation's (real or imagined) energy troubles--like California's deregulation mess--is good politics, while the Democrats almost giddily believe environment/energy to be Bush's main vulnerability. Given the muddy legislative swirl Bush's energy plan will stir up, the public will be lucky if the debate stays that starkly defined.
Environmentalist groups should forcibly inject broader public interest considerations into the mix and seek to provoke a real national energy dialogue that goes beyond green-versus-brown accusations and political point-scoring. Any serious, comprehensive energy policy would start by taking the $100 billion the Pentagon wants for a technically dubious National Missile Defense system and investing it in enhancing proven alternative-energy and efficiency technologies. Solar, wind, geothermal, biomass, hydrogen fuel cells (the idea Al Gore pushed during the late campaign and Bush derided, then stole for his energy plan)--all these renewable technologies are proven and feasible. A multibillion-dollar federal investment in them would assure their cost-effectiveness and wider use.
Bush has declared an energy crisis and opened the door to a national discourse on saving energy. The Democrats should say, "Thanks, George," and take the opportunity to supplant his hot air with action.
MONUMENTALISM ON THE MALL
Jon Wiener writes: The week the movie Pearl Harbor opened, Congress and the President ordered construction to begin on the proposed World War II Memorial on the Mall in Washington, exempt from existing law and oversight. The new legislation nullified the lawsuits challenging the design and prohibited federal agencies from further deliberations about it. The House vote was a lopsided 400-15, and it zipped through the Senate without debate or objection. The courts had agreed to consider whether the plan violated the Environmental Policy Act and the Commemorative Works Act, which protects the Mall from ill-considered projects. The National Capital Planning Commission had scheduled new public hearings for mid-June. Critics (see Wiener, "Save the Mall," November 13, 2000) had argued that the site near the Lincoln Memorial would break up the nation's number-one location for protest demonstrations, where Martin Luther King Jr. proclaimed "I have a dream." Others criticized the design for a grandiose, triumphalist style inappropriate for honoring the men and women who defeated Hitler. Now we'll have a memorial with architecture more suited to Nazi Germany.
DUNCE GETS DOCTORATE
William S. Lin writes: When John F. Kennedy delivered the commencement address at Yale University upon receiving an honorary doctor of laws degree in 1962, he observed: "A great university is always enlisted against the spread of illusion and on the side of reality." Which side of reality, then, did the Yale Corporation subscribe to in conferring the same honor on George W. Bush during this year's commencement exercises on May 21? Faculty members and students, at least, didn't buy the illusion that Bush had earned the degree: 208 Yale professors boycotted the graduation ceremony in protest, calling the decision to grant the degree "premature." Graduating seniors, for their part, sported stickers that read Got Arsenic? and "5-4," and when Bush rose to speak, hundreds of them flashed yellow signs with slogans like Protect Reproductive Rights and Execute Justice, not People while booing and heckling the President. (Someone draped a banner featuring The Nation's portrait of Bush as Alfred E. Neuman from a dorm window.) But none of this stopped Bush from reverting to his tired routine as Comedian in Chief, alluding to his penchant for napping and partying during his bright college years: "To the C students I say, You, too, can be President of the United States." Who knew that a belief in mediocrity, and a lifelong commitment to it, could someday lead to the highest academic distinction at Yale?
THE FEC--BUMBLING ENFORCER
Kathryn Lewis writes: As Congress squabbles over McCain-Feingold, campaign reform watchdog groups are calling for an overhaul of the Federal Election Commission. At a recent Capitol Hill conference sponsored by the Project on Government Oversight (POGO) and Common Cause, critics argued that the FEC has flunked its mission "to assure that the campaign finance process is fully disclosed and that the rules are effectively and fairly enforced." A new report released by POGO shows that more than $12 million in campaign contributions during the 1998 election were unaccounted for, improperly listed or missing from the FEC's databases. POGO charges that the FEC doesn't check candidates' reports against PAC reports to find discrepancies and that basic campaign finance disclosure cannot occur, because FEC databases are inaccurate. How can the agency enforce campaign finance laws when its numbers don't add up?
NEWS OF THE WEAK IN REVIEW
Jennifer Berkshire reports on Alternet that genetically altered foods will be featured on White House menus. The inaugural GM meal will be dished up at a banquet for French Prime Minister Lionel Jospin and will include genetically altered super-salmon and Star Link corn pudding. (Does M. Jospin approve of this culinary experiment?) It's nice of Bush to give Frankenfoods the First Family test, but the demonstration smells like a paid political announcement for the agribusiness lobby.
"Democracy without dividends." That's the phrase you're likely to hear from many Nigerians asked to assess the country's democratic experience under President Olusegun Obasanjo. Two years into civil democracy, Nigeria is once again an accepted member of the international community. But Obasanjo, who portrayed himself as a political messiah, has so far failed to redeem his promises to Nigerians of a better life.
When Obasanjo took the oath of office as Nigeria's elected President on May 29, 1999, after the country had endured fifteen years of adventurism and brigandage under military rule, it was regarded as a turning point in its political history. And indeed, there were many early positive developments. The media and the civil population regained their freedom, investor confidence was restored and international financial institutions like the IMF were willing to discuss the rescheduling of Nigeria's debt. The Obasanjo administration also won plaudits for passing an anticorruption act that prescribes stringent punishments for corrupt officeholders and for forcing the retirement of more than 120 military officers whose careers were judged to be tainted by politics. A government-sponsored panel investigated human rights violations during the period of military rule based on thousands of petitions, thereby achieving, to some extent, a national catharsis.
But more recent developments are far less encouraging. The economy remains nearly comatose, with imports accounting for more than 80 percent of manufactured goods consumed. The value of the national currency has gone from 100 to 135 naira to the dollar in the past four months. The collapse of the social infrastructure remains unaddressed, as do crime and the battered education system. The imposition of Sharia (Muslim law) in the northern states has upset many southerners and Christians.
Perhaps the biggest problems Obasanjo has failed to solve involve Nigeria's lifeline, oil. The conflict between the oil-producing communities of the Niger Delta and the oil multinationals has assumed the status of low-intensity warfare, with the government on the side of the oil giants. The government provides police protection for oil companies' installations and applies military force against protesting communities. Last year Obasanjo ordered an attack on Odi, a sleepy village with immense oil and gas reserves; the soldiers razed the entire village in order to put down an uprising. Recently the governors from the oil states demanded control of the resources in their domain, citing the Nigerian constitution. The international character of the struggle has intensified with a ruling in October by a US Circuit Court of Appeals that the Ogoni community could bring a class-action suit against Royal Dutch/Shell.
Gas lines--which briefly disappeared after the Obasanjo administration blocked some of the loopholes in the supply and distribution system, carried out refinery maintenance and embarked on a well-supervised massive importation of fuel--have returned. The country's three main refineries are near collapse, and the government no longer has enough funds to subsidize imported fuel. Ironically, the world's sixth-largest oil producer is unable to make fuel available to its people. Obasanjo has now decided to end government subsidies, which will mean price increases of 200 to 300 percent. That threatens to bring about a massive national strike and possible economic shutdown.
There are palpable fears that Obasanjo's seeming inability to find ways of resolving these problems may provide a justification for a military takeover. The past two years have witnessed worsening living conditions, and many Nigerians argue that it was never this bad, even under the military. Retired generals openly attack the government for its woeful performance, while forced retirements cause restlessness in the barracks. In April three service chiefs were suddenly retired based on intelligence reports questioning their loyalty. Meanwhile, agitation for a national conference to renegotiate Nigeria's federalism, to which the government is opposed, has won more converts. And opponents are already looking toward the next presidential election, in 2003.
Obasanjo must take bold and pragmatic actions toward resolving these seemingly intractable problems and bringing about genuine reconciliation among Nigeria's tribes if he is to avoid the risk that the country will disintegrate. Obasanjo understands this but is confronted by powerful forces. Recently while on a visit to the United States, he was asked to name the major achievement of his administration. After a deep breath, he answered, "Nigeria is still united." To force change, Obasanjo must stop dispensing patronage, send home the aged brigade of ministers recycled through several administrations and recruit young technocrats who better understand the nuances of the new economy. Corruption and inefficiency must be rooted out. Obasanjo must also involve Nigerians more in decision-making rather than acting unilaterally, as has been his practice. If he falters, the club of retired military officers, with their civilian collaborators, is prepared to seize any opportunity to take power.
"What do we do now?" That famous last line of the 1972 film The Candidate, in which Robert Redford finds himself--to his surprise--elected to the Senate, should be on the minds of Senate Democrats, now that Senator Jim Jeffords of Vermont has abandoned the GOP for independent-hood and an alliance with the Dems. It took a moderate--a dying breed in the Republican Party--to thrust Tom Daschle & Co. into control of half of Congress. That should mean the end of Bush's relatively--and unexpectedly--easy ride in Washington. With such a change, the Democrats' leaders will no longer be able to wring their hands and plead minority status when they lose legislative battles, such as the fight over the relieve-the-rich tax cut. The party will gain the (theoretical) ability to strike down the Bush agenda and deny him his more extreme appointees--to speak for the majority of voters who said no to Bush. All this is possible, that is, if the Democrats can mount a unified opposition. A big if, since several Senate Dems have been happy to work with Bush on taxes and other measures. The Jeffords switch doesn't change that dynamic. After all, a dozen Senate Dems ended up voting for the Reaganesque tax cut.
There is much the party can do with the Senate in its hands. Ever since the Republicans took over Congress after the 1994 election, the Democrats have tried to beat up the party of Gingrich on kitchen-table issues, including healthcare, prescription drugs, education and wages. With an edge in the Senate, Democrats have the chance to pass uncompromising legislation on several fronts: a strong patient bill of rights, expansions in health insurance to cover those not covered, a minimum-wage boost free of tax breaks for corporations. These bills would likely be shot down by the Republicans in the House and President Bush. But this could show that the Democrats do stand for something and create a record of difference useful for the party in the elections of 2002 and 2004. The last time the Democrats were in charge of Congress, they passed a modest Family and Medical Leave Act, Bush the Elder vetoed it, and a very effective campaign issue was handed to candidate Bill Clinton.
The Democrats have picked up the power of subpoena. There are many topics worthy of its use. Oil company price-gouging. Electric utilities price-gouging. Pharmaceutical companies price-gouging. Perhaps an inquiry into the recent fundraiser at the Vice President's home. (Oh, sorry, the Republicans claim it was just a "thank you" to financial supporters, not a fundraiser.) The Democrats are free to host high-profile hearings to counter Bush's bully pulpit and to advance their own ideas. Imagine hearings on victims of arsenic poisoning. Or on the dangers of nuclear waste disposal. Or on the plight of older Americans who can't afford medicine. Or on renewable-energy alternatives to increased oil drilling.
The Jeffords move, perhaps partly caused by heavy-handed Bush/GOP tactics, including threats made against dairy price supports for Vermont farmers, shows that the Republicans have a hard time being anything other than a party of the right. Yet Senator Zell Miller--the conservative Democrat of Georgia and number-one target for GOP defector-hunters--could in a similar way inconvenience the Democrats, although he has twice said he wouldn't join the party with which he regularly votes. And a Jeffords jump is not the end but the beginning of the intrigue. GOPers will be trawling for other Democratic cross-dressers besides Miller (and making sure that at least one senior citizen, Strom Thurmond, has ready access to prescription drugs). Senator Robert Torricelli, a New Jersey Democrat, ought to be sweating even more. He's being investigated on a number of matters--deservedly so, it appears--but now the Republicans have even more incentive to nail him.
That one Yankee could so upset the balance in Washington--and so discomfit the Bush advance--illuminates not only how divided are forces in the capital but how much opportunity exists for the Democrats, should they be able to act like a party of principles.
For the first few months of this year, it looked like Hollywood's unionized writers and actors were about to premiere a new labor strategy. As their conglomerate employers raked in record profits from expanding Internet, cable, foreign, video and DVD markets, the writers and actors guilds were talking about staging a fight to win long-deferred increases in residual payments for work that's reused and resold in different media.
With their film and TV contracts expiring back to back, the supposedly militant leadership of the Writers Guild of America (WGA) and the Screen Actors Guild (SAG) spoke of ushering in a new era of pay equity, while producers, the networks and studios braced for a historic Tinseltown strike. The AFL-CIO, meanwhile, came to town ready to advise and support the Hollywood unions in their looming titanic confrontation with the entertainment giants.
But instead of debuting a dramatic breakthrough, it now seems that this season's Hollywood labor push is ending in another dreary rerun of business-as-usual back-room deals. After weeks of on-again, off-again closed-door talks, the WGA reached a three-year agreement with employers on May 4, two days after the contract ran out. And though the deal is being presented to the membership (who must ratify it in early June) as a significant wedge of the economic pie, it is in reality a pretty thin slice. The writers won no increase in video or DVD residuals and failed to achieve the exclusive union jurisdiction over Internet production they sought. On cable, they won no additional residuals for resold material--only a first-time but modest residual for premium services. And residual increases for foreign sales will benefit only writers on the most profitable shows. Strengthened "creative rights" were won, but the economic tally is meager. On an industrywide scale, writers' residuals will increase about 2 percent.
After so much heat, why such a paltry settlement? "It was collective bargaining without the collective," says a longtime observer close to the negotiations. At a time when their employers have radically restructured themselves on the most sophisticated mega-merger global scale, the traditionally parochial and insular Hollywood guilds seemed incapable of making the transition to a modern and effective labor movement. The WGA neither mobilized nor engaged its membership in a credible campaign against the producers. How could it when it carried out its negotiations under an official "blackout" that kept not only the media but its own rank and file in the dark?
Instead of reaching out to build new alliances with "below the line" technical and craft unions, it alienated them. Instead of relying on an industrywide movement to wring concessions from the producers, the guild put forward a corporate-molded negotiating team much more experienced in managing studio employees than in representing them. Worst of all, the WGA faked itself out of a better deal. By threatening a strike it was unwilling to prepare for, it converted the possible stoppage into a lethal management weapon, with the studio execs, producers' flacks and even the conservative mayor of Los Angeles enthusiastically branding any shutdown a certain economic Götterdämmerung for the strikers.
The actors union, whose contract expires July 1, began its round of talks in mid-May. It's pretty much a foregone conclusion that SAG will now settle--and settle short. The actors have formally shut down even the pretense of a membership contract campaign and have, like the WGA, imposed their own information blackout on the negotiations. That's not all the two guilds have in common. SAG's relatively new Hollywood leadership talks a tough populist line, but it has little interest or experience in doing the hard work of effective organizing and mobilization. The AFL-CIO is quietly packing up its local support operation, sensing that SAG has no stomach for a real fight.
A historic opportunity to bring Hollywood's professional and craft unions together in a powerful new alliance has thus been postponed, if not lost, and just at a time when industry restructuring makes such unity an iron imperative. Instead, there's already an incipient whisper campaign under way, with writers and actors scapegoating each other's unions for producing such disappointing results. Here it is better to quote not Spielberg but Shakespeare: "The fault, dear Brutus, is not in our stars,/ But in ourselves...."
Over the past two years, it has become commonplace to read that the casualties among Kosovo Albanians were not sufficiently high to warrant the NATO intervention that put an end--at some remove--to the rule of Slobodan Milosevic. Without saying so explicitly, many liberal and "left" types, and many conservatives and isolationists, have implied that the Kosovars did not suffer quite enough to deserve their deliverance. The dispute revolves around two things: the alleged massacre at Racak (which may or may not have been a firefight provoked by the Kosovo Liberation Army) and the relative emptiness of certain identifiable "mass grave" sites.
As to Racak, it might be argued that Western policy-makers seized too fast on the evidence of a Bosnian-style bloodbath, but--in view of what had been overlooked or tolerated for so long in Bosnia--it would be tough to argue that a "wait and see" policy would have been morally or politically superior. Wait for what? Wait to see what? And, since most of those who cast doubt on Racak were opposed on principle in any case to any intervention, as they had been in Bosnia, the force of their objection does not really depend on the body count, or on the issue of who shot first. For those of us who supported the intervention, with whatever misgivings, it was plain enough that Milosevic wanted the territory of Kosovo without the native population, and that a plan of mass expulsion, preceded by some exemplary killings, was in train. The level of casualties would depend on the extent of resistance that the execution of the plan would encounter.
The bulk of the European and American right had announced in advance that the cleansing of Kosovo by Milosevic was not a big enough deal to justify military action; this seems to remain their view. It was also, according to former NATO commander Gen. Wesley Clark in his new memoir, the institutional view of the Pentagon. It would therefore have been the right's view, whatever happened or did not happen at Racak. It would presumably also have been their view even if the United Nations had passed a resolution authorizing the operation, over the entrenched objections of Boris Yeltsin and Jiang Zemin. (The Genocide Convention, which mandates action by signatory powers whenever the destruction of a people in whole or in part is being committed, takes precedence in the view of some.)
So we'll never know if another Rwanda was prevented or not, since another Rwanda did not in fact take place. However, on the issue of the mass graves there is now, as a result of the implosion of the Milosevic regime, more forensic evidence to go on.
At the time of the war itself I received a letter from a Serbian student of mine, a political foe of Milosevic but by no means a NATO fan. He told me that his family in Serbia had a friend, a long-distance truck driver whom they trusted. This man had told them of entering Kosovo with his refrigerated vehicle, picking up Albanian corpses under military orders and driving them across the "Yugoslav" border as far as the formerly autonomous province of Voivodina, where they were hastily unloaded. He'd made several such runs. At the time, I decided not to publish this letter because although it appeared to be offered in good faith it also seemed somewhat weird and fanciful, and because rumors of exactly this sort do tend to circulate in times of war and censorship.
In early May of this year, the Belgrade daily newspaper Blic, now freed from the constraints of censorship, published a report about a freezer truck, loaded with Albanian cadavers and bearing Kosovo license plates, that had been pulled from the river Danube in April 1999. The location was the town of Kladovo, about 150 miles east of Belgrade. Local gravediggers told of being hastily mobilized to load the bodies onto another truck, and to keep their mouths shut. The man who found the truck, Zivojin Djordjevic, was interviewed on Belgrade Radio B92. "It was a Mercedes lorry--the name of the meat-processing company from Pec was written in Albanian on the cabin. The license plates were from Pec.... When the lorry was pulled out and the doors of the freezer opened, corpses started sliding out. There were many bodies of women, children and old people. Some women had Turkish trousers, some children and old people were naked."
To this macabre tale, identifiable people have put their names. The director of the Humanitarian Law Center in Belgrade, Natasha Kandic, has been collecting information about comparable incidents in the period between late March and mid-June 1999, with piles of corpses removed from cemeteries or graves in Kosovo and either reburied secretly or incinerated. This is not improvised wartime atrocity propaganda; it is the careful finding of patient human rights investigators after the fact.
One cannot yet say the same about another story, which concerns the mass burning of bodies in the blast furnaces of the Trepca steel plant. The eyewitnesses here are, so far, only a driver named "Branko" and a Serbian "special forces" officer named "Dusko." They suggest that, in that terrible spring, as many as 1,500 murdered Albanian civilians were fed into the mills and furnaces of the steel complex. It would be premature to credit such unconfirmed and lurid reports, even though investigators from the Hague tribunal have already spoken about evidence being destroyed at the nearby Trepca mines. And at first, I didn't quite believe the freezer-truck tale either.
In the relatively new atmosphere of post-Milosevic Serbia, the armed forces have charged some 183 soldiers with crimes committed in Kosovo. This might be part of an "isolated incident" strategy, or it might be the beginning of a real investigation. If the reports now in circulation prove to be true, it would mean (given the complicity of border guards, steelworks managers, traffic cops and cemetery authorities) there was a state design both to the original murders and the secret interments. Such a discovery would help constitute the emancipation of Serbia as well as of Kosovo. But it would owe very little to those who described the belated Western intervention as an exercise in imperialism based upon false reporting. We shall see.
So Rudy's lawyer, playing the piranha,
Decided he'd gain ground by dissing Donna.
The judge, appalled that anyone could be
As crude as that, then squashed him like a flea.
The lesson litigants should now acquire?
When looking for a lawyer, you should hire,
Unless you want your fortunes going south,
A mouthpiece with a somewhat smaller mouth.
They were kidnapped on the street, or summoned to the village square, or lured from home with false promises of work, to be forced into the Japanese military's far-flung, highly organized system of sexual slavery throughout its occupied territories in the 1930s and 1940s. Of some 200,000 so-called comfort women, only a quarter survived their ordeal; many of those died soon after of injuries, disease, madness, suicide. For years the ones who remained were silent, living marginal lives. But beginning in 1991, first one, then a trickle, then hundreds of middle-aged and elderly women from Korea, China, Taiwan, the Philippines and other former Japanese possessions came forward demanding that the Japanese government acknowledge its responsibility, apologize and make reparations. Despite a vigorous campaign of international protest, with mass demonstrations in Korea and Taiwan, Japan has hung tough: In 1995 Prime Minister Tomiichi Murayama offered "profound"--but unofficial--apologies and set up a small fund to help the women, to be financed on a voluntary basis by business; this past March, the Hiroshima high court overturned a modest award to three Korean women. As if official foot-dragging weren't demeaning enough, a popular comic-book history of Japan alleges that the comfort women were volunteers, and ultraright-wing nationalists have produced middle-school textbooks, approved for use in classrooms, that omit any mention of the government's role in the comfort-woman program.
Frustrated in Japan, the comfort women have now turned to the US Court of Appeals for the Washington, DC, Circuit. Under the 212-year-old Alien Tort Claims Act, foreigners may sue one another in US courts for human rights violations; the women are also relying on a law against sexual trafficking passed last year by Congress. In mid-May, however, the State Department asked the Justice Department to file a brief expressing its sympathies with the women's sufferings but urging that the case be dismissed as lacking jurisdiction: Japan has sovereign immunity, under which nations agree to overlook each other's wrongdoings, and moreover, treaties between it and the United States put finis to claims arising from the war.
In other words, it's all right to seize girls and women and put them in rape camps--aka "comfort stations"--for the amusement of soldiers far from home, as long as it's part of official military policy. War is hell, as the trustees of the New School noted in their letter absolving their president, Bob Kerrey, of the killing of as many as twenty-one Vietnamese women and children. If it's OK to murder civilians, how wrong can it be to rape and enslave them?
"The Administration's position is particularly terrible and irresponsible when you consider the evolution of attitudes toward wartime rape over the last ten years," says Elizabeth Cronise, who with Michael Hausfeld is arguing the comfort women's case. Indeed, sexual violence in war has typically been regarded as the inevitable concomitant of battle, part of the spoils of war, maybe even, for the evolutionary-psychology minded, the point of it: Think of the rape of the Sabine women or the plot of the Iliad, which is precipitated by a fight between Achilles and Agamemnon over possession of the captured Trojan girls Chryseis and Briseis, although my wonderful old Greek professor Howard Porter didn't quite put it like that. It was only this past February that an international tribunal brought charges solely for war crimes of sexual violence, when three Bosnian Serbs were convicted in The Hague of organizing and participating in the rape, torture and sexual enslavement of Muslim women.
But even by these ghastly standards, the case of the comfort women stands out for the degree of planning and organization the Japanese military employed. Noting, for example, that subject populations tended to resent the rape of local women, authorities typically shipped the women far from home; although the women saw little or no money, "comfort stations" were set up as brothels with ethnically graduated fees, from Japanese women at the top to Chinese women at the bottom. The system was not, strictly speaking, a wartime phenomenon: It began in 1932, with the Japanese occupation of Manchuria, and continued after the war's end. In fact, according to Yoshimi Yoshiaki, whose Comfort Women: Sexual Slavery in the Japanese Military During World War II (Columbia) is crucial reading, the Japanese military authorities set up comfort stations for the conquering American troops. As Cronise points out, even if the United States has closed the books on Japan's wartime atrocities, it could still side with the comfort women on the grounds that many of them were enslaved during peacetime.
"The government's position is technically defensible," says Widney Brown, advocacy director for women's rights at Human Rights Watch. "What's not defensible is the Department of Justice's giving as a reason that it doesn't want to jeopardize relations with Japan." Incredibly, the Justice Department is arguing just that, along with the further self-interested point that a ruling in favor of the comfort women would open the United States to human rights lawsuits in other countries. (Remember that the United States has sabotaged the International Criminal Court.) Says Brown, "It shows a failure to understand the significance of the comfort women case as a major step in the development of human rights for women. After all, their case could have been brought up in the Far East tribunal right after World War II, but it wasn't. This is a major chance to move beyond that. You could even argue that the view of women as property--if not of one man, then another--was what prevented sexual slavery from being seen as a war crime until now."
The US lawsuit may well be the comfort women's last chance. Now in their 70s and 80s, most will soon be dead, and since few married or had children, there won't be many descendants to continue the fight for reparations. By stonewalling, the Japanese government will have won. And the Bush Administration will have helped it. All that's missing is the call for healing and mutual forgiveness.
The Bush Administration is pulling a fast one on energy, and we will all pay dearly for decades to come. By panicking the public with oil industry propaganda of an energy shortage, the Bushies are building support for the most reckless energy policy since the days before the environmentalist movement, when blackened skies and lungs represented the vision of progress.
To make things worse, to head off objections to their plans to plunder virgin lands and obliterate conservation measures, they have thrown in as a palliative the old oxymoron of "clean" nuclear power.
Of course there is nothing clean about nuclear waste, which can never be rendered safe.
The public may temporarily accept new nuclear power plants, as long as one is not built anywhere near their neighborhood and the radioactive byproduct is shipped to another part of the country.
But trust me, while these things may be better designed today, the insurance companies are no dummies for still refusing to insure nuclear power plants. It is wildly irresponsible for the Bush Administration to now insist that US taxpayers underwrite these inherently dangerous plants.
Does anyone even remember Three Mile Island? Or, more disastrously, Chernobyl? I was the first foreign print journalist admitted to the Chernobyl plant after the explosion. Even a year after the fact, and with the benefit of the best of Western scientific advice, it was still a scene of chaos. Nuclear power is like that--unpredictable, unstable and ultimately as dangerous as it gets.
The entire Chernobyl operation is now buried in a concrete-covered grave, but the huge area under the radioactive plume emitted from the plant is a permanent cancer breeding ground, as is the sediment in the area's main rivers and throughout much of its farm land. I traveled from Moscow to Chernobyl by train in the company of top US and Soviet experts, but even they seemed to feel lost and frightened as they donned white coats and Geiger counters to tour Chernobyl. Nuclear power is just too risky a gamble to push because of a phony energy crisis.
The desperation in the White House is palpable, but it is not over an "energy crisis," which Bush's buddies and campaign contributors manipulated in the Western electricity market.
No, the fear of the Bush people, even before Jim Jeffords's defection, was that their political power would be short-lived and that they had best move as fast as possible on their pet projects, beginning with increasing the profits of GOP energy company contributors.
Why else the panic? There is no sudden energy crisis. Known world reserves of fossil fuel are greater than ever, alternative energy sources are booming, and conservation measures work. If the Federal Energy Regulatory Commission would do its legally required duty of capping wholesale prices to prevent gouging, there would not be an electricity crisis in California or elsewhere.
The FERC has not done its job. Clearly, as the New York Times reported last week, energy wholesalers are in cahoots with the Bush Administration to use the FERC as their personal marketing tool to drive up their already obscene profits.
Finally, there is simply no reason to rape America in pursuit of something called "energy self-sufficiency." If the vast reservoirs of natural energy resources--resources that are sitting under land controlled by regimes around the world that we've propped up at enormous military cost for half a century--are not available to be sold to us at a fair price, why continue to prop up these regimes? What did President Bush's Dad, with his buddies Dick Cheney and Colin Powell, achieve in preserving Saudi Arabia and Kuwait if those degenerate monarchs they saved in the Gulf War will not now trade fairly in the one commodity of value that they hold?
We must make our quid pro quo clear: We will pay for a huge military to keep these sheikdoms and other energy-rich regimes in power only if they guarantee fair oil and natural gas prices for our retail consumers. Make that deal and the energy "crisis" is history.
Here I sit so patiently/Waiting to find out what price/You have to pay to get out of/ Going through all these things twice.
Forward, into the past!
Nothing was delivered, but I can't say I sympathize.
In November 1994, dressed in iconic big-polka-dot shirt and black sunglasses, 53-year-old Bob Dylan appeared on MTV's Unplugged. He sang a handful of his greatest hits, mostly 1960s-vintage, some of his most wondrous and paranoid and surreal creations: "Tombstone Blues," "All Along the Watchtower," "Rainy Day Women #12 & 35," "Desolation Row," "Like a Rolling Stone," "With God on Our Side" and "The Times They Are A-Changin'." Not long afterward, he licensed that last tune for use in ads by the Bank of Montreal and Coopers & Lybrand.
Yes, this is the enigmatic legacy of the 1960s, that tar baby of American cultural politics. But the selling of the counterculture was built in to what was, after all, a pop phenomenon. The Grateful Dead started peddling T-shirts during the Winterland days with Bill Graham. By the time we got to Woodstock, "counterculture" was a squishy advertising concept. No one at the time saw this better than the artful enigma now just turning 60.
My first Dylan albums were Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde, so for me, Dylan's real value has never been as a political symbol, anyway: He's got everything he needs, he's an artist, he don't look back. As a friend of mine once put it, Dylan opened the toy chest of American popular music so that anyone could play with all of its contents. The remark underscores the breadth of Dylan's catalogue. Only a few musical peers--Ray Charles comes to mind--have done anything as wide-ranging.
Maybe it's not surprising that, like Charles, Dylan seems to have two key qualities: genius and self-protective complexity. From the beginning, the Dance of the Seven Veils between the whirring rumors and the (initially few genuine) facts that surfaced about his private lives has been part of his celebrity allure; it amplified his gyrating lyrics, gave insiders plenty to guess and gossip about, and outsiders a contact high.
The slightly pudgy 19-year-old came to the 1961 Greenwich Village folk scene with a Woody Guthrie playbook on his knee, but he loved Buddy Holly's Stratocaster and Elvis Presley's raw Sun recording sessions and knew he wanted to be a star. The Village folkies, in full creative coffeehouse flight, were generally leftish, middle-class, longing for cultural authenticity and artistic purity, and interested in making something apart from the loathed world of commercial showbiz. That, by contrast, is precisely where Dylan dove headlong as soon as he could. Even before his fabled fiasco at the 1965 Newport Folk Festival, Dylan drew electric guitars and drums--the evil talismans of showbiz--from his toy chest, where they'd been waiting alongside Harry Smith's Anthology of American Folk Music, Hank Williams, Little Richard and Elvis Presley. Anti-Dylan folkies are still as hardfaced about it as jazz purists are about post-Bitches Brew Miles Davis.
As he moved from protest singer to surrealistic prophet, from born-again Christian to born-again Jew, Dylan's life and music registered, however unwillingly or elliptically, his times. This is one reason people have interpreted his Mona Lisa-highway blues smile and his amphetamine/Beat attitudes in their own images. They've translated him into hero, antihero, sellout, savior, asshole, religious zealot, burnout, political radical and artist. Unless it was useful to him, Dylan usually resented being reduced in rank from prophet (he has always credited divine inspiration for his work, and his most apocalyptic imagery rages with echoes of Blake and the Bible) to mere mirror-holder, and he has usually managed to translate himself anew--the protean artist. That is part of his genius, the soul linking his tangled life to his web of art--and, for that matter, his art to his audience.
So, like the decade he's a symbol of, Dylan today is many things to many people. He's an aging rock star composer of some of the most powerful and enduring songs of the past century who loves the gypsy life of the road; a multimillionaire with an Elvis-like entourage who has an un-American lack of interest in personal hygiene; a double-talking celebrity with a ferocious sense of privacy who has spent most of his life in studios and on the road with his ears full--to varying degrees, depending on exactly when we're talking about--of the transcendent sounds he hears in his head as well as the roaring sound of the star machinery and its need for lubrication. Such is the dilemma of any commercial artist. Pop culture is full of the tales. But few if any other pop songwriters have been considered for the Nobel Prize in Literature.
By most accounts (and over the decades there have been plenty) Dylan early on cast himself--first in his mind's eye, then, after he'd established the myths, in fact--as a shadow observer hoboing through life, with his BO and irresistible charm and coldhearted focus and spew of genius. The chorus for this troubadour's life has many members. There are women who sing his praises, care for him, want to protect him. There are ex-acolytes and musicians and business associates wailing the I-been-abused blues. There are core loyalists and friends. There are fawners, often drawn from the same pool as the abused. They all agree, though, that the Bob Dylan they know is an unbelievably private, ironically inarticulate man with nearly unshakable drive and talent.
That was already clear in 1965, when D.A. Pennebaker tagged along for Dylan's last all-acoustic tour of Britain and filmed Don't Look Back. Released in 1967, the movie caused a stir mostly because it unveiled another few sides of Dylan. Now it's been reissued on DVD, with the usual enhanced menu of outtakes (here audio tracks) and commentary (some useful, some silly). The good news is it looks just as murky as ever. With this backstage home movie, Pennebaker was inventing our notions of cinéma vérité: a wash of grimy, grainy images with weirdly impromptu light, in-the-moment vignettes and scenes.
Pennebaker wasn't interested in converting Dylan into a poster boy for activism or peace and love or the Francis Child ballad collection; he grasped the artistic multiplicity that often came out as duplicity. During the movie, Dylan reveals side after side: the manipulative creep; the defensive master of the counterlunge; the insular and sometimes inarticulate star; the smartass provocateur; the hyperintense performer; the chain-smoking, coffee-drinking, spasmic-twitching composer sitting endlessly at typewriters and pianos. And yeah, the nice guy pops up too. It's a portrait of the artist as Zelig.
In Pennebaker's film, this Zelig too has his handler: an owlish, pudgy Svengali, Albert Grossman, who negotiates about money in a couple of revealing scenes. Folk veterans tend to see him as a representative of Moloch: Grossman devised crossover acts like Peter, Paul and Mary and gave them Dylan tunes to sing. He owned a bigger percentage of Dylan's publishing income than Dylan did, though the singer didn't know it then; even people who don't like him agree that Grossman encouraged Dylan to write and experiment. According to Pennebaker, Dylan came up with the movie's famous opening: "Subterranean Homesick Blues" plays while Dylan, wearing a slight sneer, stands on one side of an alley. Allen Ginsberg and Peter Orlovsky stand off to the other side. Dylan holds placards with bits of lyrics from the tune, dropping each card to the ground when it goes by on the audio track. It's a neat piece of visual business that bridges Buster Keaton and MTV.
Pennebaker's movie takes place in the last quarter of David Hajdu's Positively 4th Street. The author of the well-received Lush Life, a biography of Duke Ellington collaborator Billy Strayhorn, Hajdu has written an engrossing page-turner that puts early 1960s Dylan into a pas-de-deuxing foursome with the Baez sisters, Joan and Mimi, and Richard Fariña. The narrative's hook is deliciously open-ended. The Baez sisters, performers themselves, were romantically as well as creatively entwined with Fariña and Dylan, two ambitious myth-making weirdos who were womanizers, bastards and, in their different ways, trying to create poetry with a backbeat. Their ever-changing interpersonal dynamics are the intellectual soap opera that is the book's bait.
Hajdu plays out the sexual and creative permutations and combinations in and around this vaguely Shakespearean quartet with narrative panache and just the right tang of gossip and attitude to get it excerpted in Vanity Fair. At its best, his fluent style floats information with deceptive lightness, but he's not lightweight. Hajdu dug through the papers, including unpublished outtakes of Robert Shelton's No Direction Home: The Life and Music of Bob Dylan, talked to plenty of witnesses and tapped new sources; the most notable is Thomas Pynchon, Fariña's Cornell roommate and best man, whom Hajdu interviewed by fax. All this lets him conjure a novelistic immediacy. His well-plotted scenes usually ring true and bristle with evocative detail. He uses his narrative's inherent elasticity to open perspective and depth of field naturally, then skillfully dollies around and pans in and out of larger contexts as illuminating backdrop for his two odd couples. Topics from the history of American vernacular music to contemporary politics, art and architecture add resonance to the main plot.
Hajdu's story starts with the young Baez sisters seeing Pete Seeger ("a sociopolitical Johnny Appleseed during the mid-1950s") in concert and getting their own guitars. It follows Joan to the thriving Cambridge folk scene, where she became a star with a recording contract. Hajdu builds a novelistic collage of perspectives: Baez herself, those she'd already left behind in California, those watching her rise in Boston. This technique shapes the book's storytelling. We see Fariña, for instance, through Mimi's eyes as a basically lovable, if hurtful, rogue genius; through Joan's by turns as accomplice, potential seducer and parasite. We watch Joan's Cambridge friends fret and fume at young Bobby Dylan's riding her to the top while Joan loves him blindly, and we meet other Dylan lovers like Suze Rotolo and Sara Lownds, whom Dylan later married. We wonder why Mimi can't see how Fariña is using her to get to Joan, since nearly everybody else, including Joan, does, and we wonder if he'll succeed. And we hear the chorus of disharmony around the charged moment when Dylan abandoned his image as folk singer; we note that Joan idealistically spurns Albert Grossman and a major record label and Bob signs with both.
It's easy to see how this fly-on-the-wall approach could devolve easily into name- and eavesdropping--a pitfall Hajdu generally avoids. He evokes the aura of the relationship between Dylan and Rotolo by noting that by the spring of 1962 they'd known each other for six months; he tested his songs on her and played Elvis records for her, while she lent him books of poetry--they read Byron and Rimbaud together--and took him to CORE meetings. "He knew about Woody and Pete Seeger," says Rotolo, "but I was working for CORE and went on youth marches for civil rights, and all that was new to him. It was in the air, but it was new to him."
So, although characters and narrative strands multiply as they weave in and out, Positively 4th Street usually avoids feeling cluttered or confused. And the pacing, spurred by the frisson of eyewitness memories, insider gossip and the rush of circumstance, carries you over its rough spots until things skid abruptly to a finish in 1966. That April, after a publication party for his seminal book Been Down So Long It Looks Like Up to Me, Fariña died in a motorcycle crash. Three months later, Dylan had his own motorcycle crash, which pulled him out of the public eye for three years. Hajdu writes, "Precisely what happened to Bob Dylan on July 29 is impossible to reconstruct with authority."
Until now, that was true. But in Down the Highway: The Life of Bob Dylan, Howard Sounes in fact pieces together testimony and circumstantial evidence into a fairly detailed account of Dylan's wreck. (He relies heavily on Sally Grossman, the late Albert's wife.) It's the kind of thing Sounes does well, opening new angles on the enigmatic polyhedron that is Dylan. An indefatigable reporter, Sounes has collected most of the folks in the Dylan orbit and brought into print several, including Dylan family members, who haven't been there before. He has unearthed more detail about Dylan's marriages and divorces and children and lovers and homes, his harassing fans and his tour receipts, even his desperate late 1980s offer to join the Grateful Dead as his popularity ebbed. He has combed the earlier sources and extracted their meat. Exhaustive is the right adjective.
As Sounes sees it, Dylan lives in introverted, near-constant turbulence, buffeted by internal as well as external winds and by his own creativity, which produces constant alienation. We watch obsessive fans stake out his houses, hassle his women and kids, ransack his garbage. We learn more of the grimy legal battles (suit and countersuit) between Dylan and Grossman, who for several years, at least, earned much more from Dylan than Dylan did.
Dylan did know lots of women, and they parade dizzyingly by: sincere Minnesota folkie madonnas, Village political sophisticates like Suze Rotolo, Baez, Suze again, his first wife Sara, Baez again, back to Sara, various side trips, a string of black backup singers like Clydie King and Carolyn Dennis, who, Sounes reveals, had Dylan's child and secretly married him. So do his musical cohorts from over the decades, who retail variations of the same tale: Little contact, little to no rehearsal, vague if any instruction. Even members of The Hawks, later known as The Band, arguably Dylan's closest creative associates in the late 1960s, shed little light on the man and his muse. It's not surprising, then, that in discussing Dylan's visual artwork collected in Drawn Blank, Sounes writes, "Mostly Bob seemed to be alone in empty rooms. He often drew the view from his balcony, a view of empty streets, parking lots, and bleak city skylines."
That's as close as Sounes gets to piercing Dylan's veil. Even in this monumental bio, just as in Hajdu's book, the star of the show flickers like a strobed image through the crosscut glimpses of his intimates. The facts and tales pile up; the figure behind the screen seems to come into clearer focus but never quite emerges. Still, his complexity is elucidated--which may be the best anyone, including Dylan himself, can do.
Sounes's book has its drawbacks. Its workmanlike prose lurches periodically into fanzine or tabloid rambles by the author or his witnesses. (Why open with what reads like a magazine story about the party that followed Dylan's "Thirtieth Anniversary Concert"? Why ask Jakob Dylan, now a pop star in his own right, if he thinks he'll measure up to his dad?) It gropes for the "inner" Dylan and sometimes comes up silly. (It's not at all clear Dylan has "conservative" beliefs, as Sounes asserts, aside from desperately wanting privacy for himself and his families. It does seem that he, like most folks, has a floating mishmash of an ad hoc personal code.) With all those facts pressing on him, Sounes can also warp chronology in a confusing fashion. (Why, when first introducing Dylan's manager Grossman, dwell in such detail on the court battles that broke out between them seven years later?) But the bulky research and reporting make up for relatively minor lapses in style and sensibility.
Inevitably there are spots when Sounes and Hajdu overlap and disagree about what happened. Take Newport 1965. Sounes retails the traditional story of how outraged fans, shocked at Dylan's betrayal of acoustic music and, by implication, folkie principles, booed Dylan's electric set. Early on, Pete Seeger and Dylan himself helped promote the tale. Hajdu suggests, via other witnesses, that people were screaming about the crummy sound system, and he wonders, as others have, how 15,000 fans could have been shocked by an electric Dylan set after hearing "Like a Rolling Stone" on the radio that summer. Look at it this way: The doughnut is being filled in, but the hole in the middle remains. Dylan's lifelong attempts to fog his personal life may have been rolled back more than ever, but blurry patches still linger, subject to interpretation and debate, just as they always will with the decade of which he--for better or worse, rightly or wrongly--is still an emblem.
You want to find out why politics has become so dreary? You won't find the answer in Rick Perlstein's book. But what you will find is relief. I've read Before the Storm twice and intend to go on reading it, as my opiate, as long as Bush is in the White House and Gore is in the wings.
Before the Storm is the story of such a fascinating era and Perlstein is such a great storyteller--one of the most enjoyable historians I've read--that I guarantee for a while you will simply forget the dreariness of today's politics. You will be carried away to those exciting days of yore--the 1950s and 1960s--when several large parts of the national psyche became so twisted, so gripped by fear, so almost comically, sometimes viciously, mad that they got behind a senator from Arizona named Barry Goldwater--who himself, by the way, was free of all those characteristics--and if fate hadn't intervened, just might have made that right-winging mediocrity what he apparently had little ambition to be: our thirty-seventh President.
But as Perlstein makes clear, fate was greatly aided by some of Goldwater's very impolitic conduct and by the dummies who got control of his campaign in 1964 and chased off and frustrated all the smarties in his organization. This is a complex story, but the man at its center was simple.
Barry Goldwater. He is pictured on the jacket of this book looking macho grim, dressed in Western frontier gear (the sort wealthy sportsmen like to wear), with a fancy pump shotgun perched on his cocked leg and a saguaro cactus right behind him. What makes the picture unintentionally perfect is that, rather vaguely in the background, as Perlstein is kind enough to point out, is patio furniture.
Goldwater loved for the Eastern press to write about him as a sort of frontiersman, and generally it obliged. Indeed, he did come from a frontier family. Around 1860 his Polish immigrant grandfather, after operating a saloon/brothel in San Francisco, moved to Phoenix, then less than a village, and launched what would become Arizona's most famous mercantile business. Goldwater and his siblings grew up with a nurse, chauffeur and live-in maid. Wealthy in his own right, young Barry married an heiress of the Borg-Warner fortune.
His noblesse was short on oblige. When Congress passed the Fair Labor Standards Act in the 1930s, raising the minimum wage from 25 cents to 40 cents an hour and limiting working hours to forty-four a week, Goldwater made his first public political statement by buying newspaper space to denounce President Roosevelt's program as a sop for "the racketeering practices of ill-organized unions." All his career, he growled that Washington was nothing but a burden on businessmen, and he left the impression that he and his family had made it on their own. Quite the contrary. From the day his grandfather went into business with federal contracts for supplying soldiers and delivering mail, the Goldwaters thrived on government largesse, directly or indirectly. During the Great Depression, their business would have shriveled like lettuce in the Arizona sun if fifty different federal agencies hadn't shoveled $342 million into the wretched little state (which sent back less than $16 million in taxes). Then came World War II, and with its ideal flying weather Phoenix became the center of four huge air bases and countless "service" industries. The Goldwater business boomed.
The war and its aftermath also flooded the state with a political breed Arizona had seen few of until then: Republicans. They elected Goldwater to the US Senate in 1952 by a slim 7,000-vote margin, which he probably owed to the coattails of the man elected President that year, Dwight David "Ike" Eisenhower. During Goldwater's first term he was seldom noticed except when he promoted antiunion legislation and insulted the era's most liberal and influential labor leader, Walter Reuther, whom he considered a Communist. He also said nasty things about Eisenhower's budgets, in which Goldwater detected "the siren song of socialism" and "government by bribe." For a junior Republican senator to say such things about the ever-popular Ike, the Big Daddy of "Modern Republicanism," was so newsworthy, Perlstein tells us, that "the dashing Arizona senator's name began cropping up in the press like dandelions."
In the off-year elections of 1958, the Republicans were slaughtered. In Congress, it was "the worst defeat ever for a party occupying the White House." The exceptions were in Arizona. Goldwater, though seen as the underdog, won. So once again the Eastern press, always eager to create myths from Wild West material, seized upon his victory to gush about "the tall, bronzed, lean-jawed, silver-haired man of 49." That gush was courtesy of Time magazine. The Saturday Evening Post gave five pages to "The Glittering Mr. Goldwater." (Ah, the fickle press! Six years later, the Post called for the "crushing defeat" of Goldwater's presidential effort because he was "a wild man, a stray, an unprincipled and ruthless political jujitsu artist." Actually, the "glittering" Goldwater of '58 was pretty much the same man as the "stray" of '64. One way or another, he usually brought out the press's hyper side.)
But with Democrats on a roll and New Dealism somewhat diminished but still firmly in place, Goldwater and his brand of conservatism were not taken very seriously, except by the very unhappy, forlorn folks on the far right, where there was a significant encampment of Democrats, especially in Dixie, as well as the Ike-is-a-traitor Republicans. But since the national Democratic Party at the time was, as Perlstein says, "now pulled unmistakably in the direction of the eggheads and the do-gooders: left," the only hope for a new far-right leader was among Republicans. And so they followed the star to the Arizona manger and anointed Goldwater as their savior.
Let us pause for a moment to recall that in those days, the "far right" developed from an anti-Communist frenzy. At the end of World War II, many of our leaders saw our recent ally the Soviet Union as a greater threat than Nazi Germany had been. Our policy-makers supported the continued use of Nazi bureaucrats in defeated Germany. And it was commonplace for the US Army and the CIA to recruit Nazi arms experts and spies, among whom were numerous war criminals, and use them both in this country and abroad.
Tens of thousands of Eastern European refugees--often sponsored by conservative religious and ethnic organizations, and by the CIA under special immigration programs--came to this country in that period. An estimated 10,000 were Nazi war criminals, but most were simply people who adamantly opposed peace with the Soviet Union or anything that smacked of "socialism." Joining with the sizable population of native right-thinkers, they helped create a hate-the-commie spirit in America--which often boiled down to hatred of political liberals, and even moderates. George Kennan, a top State Department official in those days, said, "These recent refugees were by no means without political influence in Washington. Connected as they were with the compact voting blocs situated in the big cities, they were able to bring direct influence to bear on individual Congressional figures. They appealed successfully at times to religious feelings, and even more importantly to the prevailing anti-Communist hysteria."
Hysteria was the word for it. Perlstein writes that Senator Tom Kuchel told his colleagues that
10 percent of the letters coming into his office--six thousand a month--were "fright mail," mostly centering on two astonishing, and astonishingly widespread, rumors: that Chinese commandos were training in Mexico for an invasion of the United States through San Diego, and that 100,000 UN troops--16,000 of them "African Negro troops, who are cannibals" [sic]--were secretly rehearsing in the Georgia swamps under the command of a Russian colonel for a UN martial-law takeover of the United States.
Crazy? Of course. But as Perlstein points out, American citizens were expected to make sense of the world around them but were denied, because of cold war secrecy, the information they needed.
Senator Joseph McCarthy, the Wisconsin Republican, was of course the most adept exploiter of this scariness, but Ronald Reagan was no piker at it. About the time Goldwater's star was in its ascendancy, Reagan was a professional pitchman for General Electric; he went around warning, "We have ten years. Not ten years to make up our mind, but ten years to win or lose--by 1970 the world will be all slave or all free." But prominent Democrats who were liberal-to-moderate in most ways also stoked the hysteria. President Truman launched a "loyalty" program to investigate 4 million federal workers; a few hundred were dismissed as "security risks." His Attorney General, J. Howard McGrath, warned that Communists might not be under America's bed but "they are everywhere...in factories, butcher shops, street corners, in private businesses." Not even McCarthy had the gall to propose building camps in time of war to hold US citizens "suspected" of being Communist sympathizers, but Senator Hubert Humphrey, the liberal from Minnesota, did. And Robert Kennedy, who had been a friend of McCarthy's in the latter's heyday, on becoming Attorney General helped keep the fear rolling across America by announcing (without specifics, of course), "Communist espionage here in this country is more active than it has ever been."
"Since McCarthy's day," writes Perlstein,
liberals had been wondering why apparently intelligent people could believe that the wrong kind of politics in the United States would inexorably hasten its takeover by the USSR.... The cognoscenti neglected the simplest answer: people were afraid of internal Communist takeover because the government had been telling them to be afraid--at least since 1947.... It shouldn't have been surprising that the John Birch Society was able to win a membership in the tens of thousands in an officially encouraged atmosphere of fear and suspicion.
In a moment we'll come back to the mystically crackpot John Birch Society, but right now let me introduce, with Perlstein's guidance, a few of the men who created the JBS and similar organizations, and who were effective in raising Goldwater to the level of a presidential contender in the public's mind.
Clarence Manion was the first to suggest, in 1959, that a drive be started to draft Goldwater for the Republican presidential nomination. Manion was an ugly man but a mesmerizing speaker. His bald dome extended into a forehead, Perlstein tells us, that "seemed to get bigger each year, as if to make room for yet one more set of facts and figures on the Communist conspiracy, forcing the droopy ears, doughy cheeks, protruding lower lip, and picket-fence teeth to crowd ever more tightly at the bottom of his face," and "his eyelids were raccooned, as ever, from too much work and too little sleep. But his eyes sparkled. He looked almost beatific."
And why not? A practicing Catholic, he tried all his life to do the Lord's work, both as dean of the law school at Notre Dame and as a political activist, popular radio announcer, lecturer and writer. He began as a loyal Democrat but was disillusioned by what he considered the duplicity of the Democratic Presidents who took us into the "entangling alliances" of two world wars. Back when he was a supporter of the New Deal, Manion believed that "guaranteeing a decent standard of living to all Americans was government's sacred duty," but as a constitutional scholar drifting rightward he came to the conclusion that in trying to do good, the federal government had assumed tyrannical powers. Manion became one of the nation's most influential isolationists.
Equally important and more flamboyant was Robert Harold Winborne Welch, founder of the John Birch Society, which in its day was the liberals' favorite symbol of far-right insanity. The JBS was perhaps best-known for the billboards that went up all over the country urging the impeachment of that well-known Communist, Chief Justice Earl Warren. "By April of 1961," writes Perlstein, "you had to have been living in a cave not to know about Robert Welch and his John Birch Society. The daily barrage of reports left Americans baffled and scared at this freakish power suddenly revealed in their midst. It also left some eager to learn where they could sign up."
Welch was a genius who entered college at the age of 12. An omnivorous reader, by the time he was 50 he had taught himself that the Communist conspiracy, which had gobbled up Western Europe, was gaining ground here; that our State Department had deliberately surrendered China to the Communists; that Eisenhower had "consciously served the Communist conspiracy" all his life; that "civil rights" was just part of the Communists' larger diabolical plan, which included "elites surrendering American sovereignty to the UN; foreign aid rotting our balance of payments; skyrocketing taxes, unbalanced budgets, inflation. There was only one way to explain it: our labor unions, churches, schools, the government--all had been infiltrated" by Communists. These and similar discoveries were delivered to the public in commercially published tracts that "[flew] off the shelves."
The John Birch Society itself was founded in December 1958 at an Indianapolis lecture delivered by Welch to eleven wealthy men, three of them past presidents of the National Association of Manufacturers. It lasted two straight days, with breaks only for "lunch, coffee, and dinner." Identical meetings were held in a dozen other cities. By 1962, Welch was raising over a million dollars a year--big bucks in those days--and membership was estimated to be as high as 100,000. Because the press usually described the organization as extremely weird, not many politicians admitted membership, but a few Congressmen did. Ike's own Secretary of Agriculture, the Mormon elder Ezra Taft Benson, was a member. Not long after giving the benediction at Kennedy's inauguration, Richard Cardinal Cushing declared his admiration for Welch. Denison Kitchel, who later became Barry Goldwater's campaign director, was a member. As for Goldwater, he was characteristically loose about it, saying "a lot of people in my home town have been attracted to the society, and I am impressed by the type of people in it. They are the kind we need in politics."
Robert Welch was far from being the only crazy who rallied behind Goldwater. Others included the Texas billionaire oilman H.L. Hunt, justifiably designated by Perlstein as a "lunatic," whose Life Line broadcasts over 300 radio stations promoted such unique ideas as a cashocracy--the more money you had, the more votes you should get.
But the movement was supported not so much by lunatic billionaires as by tough millionaires, guys like Roger Milliken, one of the wealthiest men in the South, who was willing to put his millions where Barry Goldwater's mouth was (and willing to lose millions on principle: "In 1956, when workers at his Darlington [textile] factory organized to form a union, Milliken shut it down permanently rather than negotiate").
We have had more lackadaisical aspirants than Goldwater for our highest political office. William McKinley is remembered for his stroll for the presidency in 1896, when he campaigned primarily from his front porch in Canton, Ohio. Goldwater was one of the most indifferent of modern times, though, and the reason for his indifference, as explained by Perlstein, is to me his most appealing quality:
He liked his freedom: flying himself to speeches, maybe dropping in at an Air Force base somewhere and begging a turn at the controls of the latest model, overflying Navajo country, poking around on his ham rig, listening to Dixieland jazz records full blast and then fussing with his trombone, or cranking up the air-conditioning full blast, building a roaring fire in his library, and settling in with a good Western....
He had no inflated judgment of his abilities. "Doggone it," he told the Chicago Tribune, "I'm not even sure that I've got the brains to be the President of the United States." He was, we are told, "horrified" when first urged to run, and for a very long time it looked like he would never show the slightest enthusiasm for the idea.
Meanwhile, without his knowledge, a new conservative machine was being put together to take over the Republican Party for the purpose of nominating him. For readers who are addicts of political maneuvering, or for readers who want a crash course in how it is done, Perlstein supplies one of the most fascinating accounts in the career of one Frederick Clifton White, the Machiavelli of grassroots organizing and a great master of "the black arts of convention trickery."
On October 8, 1961, he secretly called together twenty-two men to explain how they could seize control of the party and nominate Goldwater.
"When White explained that a convention could be won without the Northeast," writes Perlstein,
they understood that what he was describing was a revolution. No one had ever convincingly claimed that a Republican presidential candidate could be nominated without winning New York. He said 637 of the 655 delegates needed to win the nomination were likely already in the bag. Many Republican organizations across the country, he explained, withered from eight slack years under Eisenhower, enervated by the agonizing 1960 Nixon loss, weren't really organizations at all. If five people showed up for precinct meetings, it took only six conservatives to take the precinct over. Enough precincts and you had a county. Then a congressional district. If you got enough congressional districts, then you had a state. With just the people in this room and a network of loyal friends, they could have the Republican Party, as easy as pushing on an open door.... A single small organization, from a distance and with minimal resources, working in stealth, could take on an entire party. They didn't need the big fish, the governors, senators, mayors. They didn't need the little fish, the individual voters. They just needed enough middling fish.
That's the way it turned out. And the smoothness and stealth with which it was done left Eastern Republican establishment candidates--men like Nelson Rockefeller and William Scranton and Henry Cabot Lodge--thinking they actually had a chance for the nomination, believing the meaningless public opinion polls that showed their popularity, spending zillions of dollars going after delegates White had lured into Goldwater's camp and who were fanatically loyal to it. The wildest spender and most impossible candidate was Rockefeller. His chances died when he divorced and later remarried in mid-campaign. That was a different era. "It was a time, according to Betty Friedan, when it was easier to find an abortionist than a minister willing to marry a divorcé." Catholic cardinals and influential Protestant ministers sometimes refused to appear on the stage with Rockefeller, or urged him to quit the race because of his divorce.
Perlstein's description of how these establishment white mice raced around the maze looking for an opening that didn't exist is surely one of the best farcical stories of modern politics. And part of the joke was the way the big Eastern press and its pundits kept predicting that Rockefeller or Scranton or Lodge would surely win the nomination. The Washington Post and New York Times, the Wall Street lawyers and the big Eastern bankers, were rooting for anyone but Goldwater. Vice President Humphrey predicted that "the big money in the East there, you know, will move in, as they've done before," and squash Goldwater at the convention. Lyndon Johnson, who had "seen 'em do it" before, agreed.
They didn't have a chance.
But neither did Goldwater. The convention was his last hurrah. If nothing else destroyed him, President Kennedy's assassination would have. When Richard Nixon heard about the shooting in Dallas, he called J. Edgar Hoover: "What happened, was it one of the right-wing nuts?" It was the question asked everywhere. And why not? After all, Dallas was--next to Orange County, California--perhaps the nation's main gathering place for nuts. On the morning of the day Kennedy was shot, a full-page newspaper ad and radio programs, both paid for by H.L. Hunt, warned Dallas residents that the President coming to their town was a Communist collaborator and that his next move would be to revoke the right to bear arms, thereby depriving residents of "weapons with which to rise up against their oppressors."
Whether he liked it or not, Goldwater was branded as the candidate of nuts like that. A Gallup poll showed his approval rating dropping sixteen points.
The second blow from the assassination was that now Goldwater's opponent would be Lyndon Johnson, a Southerner who, despite his having betrayed Southern racists by pushing new civil rights laws, was still popular enough in that region to loosen the stranglehold Goldwater had on it when his opponent was Kennedy. (As it turned out, though, Goldwater did win the five meanest Dixie states, and Mississippi with 87 percent of the vote.)
What decent politician, you may reasonably ask, would even want to win some parts of Dixie? Since this is a social as well as a political history, Perlstein will vividly remind you of just how rotten parts of America could be in those days: Birmingham, where black protesters were put seventy-five at a time into cells built for eight; Mississippi, where a jury acquitted Byron de la Beckwith of shooting Medgar Evers, though the defendant's fingerprints were on the murder weapon; Mississippi again, where three Klansmen admitted four bombings but were released on suspended sentences. The judge ruled they had been "unduly provoked" by "unhygienic" outsiders of "low morality."
Aside from the assassination, the primary reasons for the hopelessness of Goldwater's campaign were two: (1) the positions he took, and the kind of people who liked him; (2) the stupidity of the Arizona Mafia, of which he was the don.
Regarding the first reason, a Louis Harris poll showed that voters disagreed with Goldwater on eight out of ten issues. Goldwater opposed federal civil rights legislation, and this gave a lot of people the wrong idea. George Wallace (who received 43 percent of the Democratic vote in the Maryland primary that year) offered himself as a running mate, but Goldwater, who opposed forced segregation as much as forced integration, "thought Wallace was a racist thug." During the Birmingham riots, Goldwater said, "If I were a Negro I don't think I would be very patient either."
His remarks about military matters kept building up and falling over on him. For years, Goldwater had been growling, "To our undying shame, there are those among us who would prefer to crawl on their bellies rather than to face the possibility of an atomic war." He told a convention of the Military Order of the World Wars in October 1963, "I say fear the civilians. They're taking over." During test-ban hearings, "he boasted that America's missiles were accurate enough 'to lob one into the men's room at the Kremlin' (if they could be counted on to get into the air first)." He said battlefield nuclear weapons were nothing to get hysterical about; they were just like "a bullet or any other weapon." Eisenhower and Rockefeller had said the very same thing without attracting criticism, but that last point was the sort that Goldwater's opponents--and especially the press, which generally blanched at the thought of him in the White House--could interpret as showing he was a violent extremist, perfectly willing to blow up the world.
Although events would show that Johnson was the man to worry about right then, White House propagandists--and Perlstein says this was especially true of Bill Moyers, "surely the most ruthless...of LBJ's inner circle"--developed the image of Goldwater as The Nuclear Madman with great effectiveness. The press also grossly exaggerated the threat embedded in Goldwater's most famous remark, that "extremism in the defense of liberty is no vice" (which Perlstein correctly points out was no different in tone from some of President Kennedy's most admired rhetoric). The Washington Post, for example, wailed, "If a party so committed were to gain public office in this country, there would be nothing left for us to do but pray."
Goldwater had cause to complain about the press favoritism shown Johnson, whose campaign trail was marked by occasional drunkenness and interviews riddled with contradictions, incoherence and obvious lies. Attending reporters wrote nothing about such lapses. On one campaign flight, "in order to squeeze as many VIPs into his plane as possible, he booted onto an accompanying plane the military aide who kept the briefcase handcuffed to his wrist that contained the codes to launch a nuclear strike. That plane nearly ended up crashing. Reporters looked the other way."
If, in a coffin that was already so full of nails, a final nail could be identified, it would be the one driven in by Goldwater himself. For his vice presidential candidate he picked William Miller, an invisible congressman best known for accusing the Kennedy White House of immorality because guests had danced the twist in the ballroom. For his campaign, Goldwater rejected guidance from men who were experts at organizing, propagandizing and raising money. Remember Frederick Clifton White, the genius who had put together the Draft Goldwater drive and stacked the convention so brilliantly? Goldwater didn't trust him, and gave him only a rinky-dink role in the campaign.
Instead, he surrounded himself with what was known as the Arizona Mafia, old friends, most of whom were political ignoramuses. "None had a day's experience in national elections," says Perlstein. Most ignorant of all was Denison Kitchel, Goldwater's best friend. Kitchel had married the daughter of an Arizona copper baron and became a lawyer for Phelps Dodge, one of the most brutally antiunion mining companies in the world. Goldwater made him his campaign chairman. He was hopeless.
"Who's Arthur Summerfield?" Kitchel asked when advised to consult the former Republican National Committee chair; "What line of work are you in?" he queried an uncommitted Republican senator.
[He was] hard of hearing--in a field where the most important work was done in whispers!
Old friend Richard Kleindienst, on being named director of field operations, asked, "And what am I supposed to do?"
Kitchel and Goldwater "didn't bother to attend the Sunday strategy meetings. They made strategy at 33,000 feet. The campaign plane was their playhouse," where they and other members of the Arizona Mafia "swapped ribald jokes, told hunting stories, yapped on the airborne ham radio."
Goldwater sometimes seemed to have taken leave of his senses. He went tooling around to "places so irrelevant to the outcome of the election that it was as if the plane were flying itself." And when he landed at places critically important to the outcome, he was sometimes almost comically offensive. In Knoxville, he advocated selling the Tennessee Valley Authority. In Memphis and Raleigh he derided cotton subsidies. In Fort Worth, he criticized a military aircraft project being built in--Fort Worth. In West Virginia, "the land of the tar-paper shack, the gap-toothed smile, and the open sewer," he called the War on Poverty "a war on your pocketbooks." As he left that rally, he was jeered by lines of workmen.
Did that sort of reception bother Goldwater and his courtiers? Not at all. "That Goldwater alienated audiences was taken by his inner circle as evidence he was doing something right--telling them things they needed, but didn't want, to hear."
When a campaign worker outside the Mafia tried to inject more rationality into the drive, Goldwater told him, "I want you to stop it. It's too late. You go back and tell your crowd that I'm going to lose this election. I'm probably going to lose it real big. But I'm going to lose it my way."
And so he did. He won only six states--one of them, his home state, by half a percentage point.
But other things were happening in 1964 that the press failed to put into the equation that would determine future events. Nixon gave 156 speeches in thirty-six states for the GOP ticket that year, gathering chits for "his new master plan." Reagan brought life to many a dying rally, where he would give a masterful introduction to Goldwater, stealing the show so completely that some in the audience went away uncertain as to which one was the candidate. In the final days of the campaign, Reagan was on television constantly with replays of what came to be known as The Speech ("You and I have a rendezvous with destiny.... We will preserve for our children this, the last best hope of man on earth, or we will sentence them to take the last step into a thousand years of darkness"). It was a stunning, melodramatic debut.
But Dick and Ronnie did not appear in the pundits' crystal ball. Instead, most insisted on seeing the outcome of the 1964 election as the right wing's burial. The New York Times's Scotty Reston wrote that Goldwater's conservatism "has wrecked his party for a long time to come." Also at the Times, Tom Wicker wrote that conservatives "cannot win in this era of American history." The Los Angeles Times interpreted the election outcome to mean that if Republicans continued to hew to the conservative line, "they will remain a minority party indefinitely." The Washington Post saw the conservative victory in Dixie as just a "one-shot affair."
With understandable relish, Perlstein allows the pundit chorus to end on this note:
The nation's leading students of American political behavior, Nelson Polsby and Aaron Wildavsky, speculated that if the Republicans nominated a conservative again he would lose so badly "we can expect an end to a competitive two-party system." Arthur Schlesinger Jr. put it most succinctly of all.... "The election results of 1964," he reflected, "seemed to demonstrate Thomas Dewey's prediction about what would happen if the parties were realigned on an ideological basis: 'The Democrats would win every election and the Republicans would lose every election.'"
To which Perlstein adds, "At that there seemed nothing more to say. It was time to close the book."
And so he does.
About every thirty years for the last one hundred, a crusading journalist somewhere has gotten the same idea: Abandon the middle-class literary life (for a brief period), get a real job, gain firsthand experience in the underclass, go home and write it up.
Not surprisingly, most practitioners of the genre have been left-wing whistleblowers--notably, Jack London and George Orwell. London's 1902 book People of the Abyss chronicled the misery of urban and agricultural workers, plus the unemployed, in turn-of-the-century England. "Work as they will," he discovered, "wage-earners cannot make their future secure. It is all a matter of chance. Everything depends upon the thing happening, the thing about which they can do nothing. Precaution cannot fend it off, nor can wiles evade it."
Already a renowned writer, London entered this new world of poverty and insecurity "with an attitude of mind which I may best liken to that of an explorer." Orwell's expedition, at the time of the Great Depression, followed in London's footsteps in the same East End neighborhoods, later ending up in Paris. Published in 1933 as an autobiographical novel, Down and Out in Paris and London records the author's experiences toiling under terrible conditions as a plongeur, or restaurant dishwasher, in the bowels of a great Paris hotel. In both cities, Orwell's narrator struggles to make ends meet--just like his co-workers and fellow tenement dwellers.
A plongeur is better off than many manual workers, but still, he is no freer than if he were bought and sold. His work is servile and without art; he is paid just enough to keep him alive; his only holiday is the sack. Except by a lucky chance, he has no escape from this life, save into prison. If plongeurs thought at all, they would strike for better treatment. But they do not think; they have no leisure for it.
Three decades later, on the eve of the civil rights revolution in the United States, journalist John Howard Griffin was down and out in Dixie. His book, Black Like Me, featured the additional twist of an author trying to cross both class and racial lines. To find out, as a white, what it was like for African-Americans to live and work in the segregated South, the author darkened his skin and traveled about in the guise of what was then called (appropriately enough for Griffin) a "colored" person. Black Like Me had a great impact at the time because of the novelty of the author's assumed identity and the book's shocking (for many whites) account of the routine indignities and monstrous injustice of apartheid in America.
It took far less makeup for Barbara Ehrenreich, the well-known socialist and feminist, author and columnist, to "pass" among the mainly white working-class people she met while researching Nickel and Dimed. Between 1998 and 2000, she took jobs as a waitress and hotel maid in Florida, a nursing-home aide and a house cleaner in Maine, and a retail sales clerk in Minnesota. Her trip across the class divide did require that she temporarily leave behind most of the accoutrements of her normal existence--home ownership, social connections, professional status, "the variety and drama of my real, Barbara Ehrenreich life."
Retaining, as her private safety net, credit cards (to be used only in emergencies) and a series of "Rent-a-Wrecks" to make job-hunting easier, she set out to determine how a person with every advantage of "ethnicity and education, health and motivation" might fare in the "economy's lower depths" in "a time of exuberant prosperity."
Her attempt to "match income to expenses" on the $6-$8 an hour wages of the working poor succeeds only briefly, though--and then just barely--in Portland, Maine, where she is able to juggle two jobs at once. Like Orwell living in Left Bank penury in Paris, she quickly becomes obsessed with counting her pennies and staying within a daily budget that does not allow for any splurges or unexpected financial adversity. Unlike the hundreds of thousands of single mothers with children who've been dumped into the job market by "welfare reform," she doesn't have to worry about finding and paying for childcare while holding down a draining, low-income job (or two). Nevertheless, she ends up being defeated by the same fundamental obstacle they face: Despite much hard work, "many people earn far less than they need to live on."
"How much is that?" she asks. "The Economic Policy Institute recently reviewed dozens of studies of what constitutes a 'living wage' and came up with the figure of $30,000 a year for a family of one adult and two children, which amounts to a wage of $14 an hour." The problem is that "the majority of American workers, about 60 percent, earn less than $14 an hour," while 30 percent, according to the EPI, made $8 an hour or less when Ehrenreich joined their ranks in 1998.
At each stop on her low-wage tour, the author tests out local support services for the working poor. Not surprisingly, the things that people need most to make their lives better--health coverage, affordable housing and access to mass transit--aren't available at the agencies she visits. (Instead, she gets the occasional bag of free groceries, plus referrals for apartment rentals she can't afford.) She finds that many of her co-workers--particularly those without family support networks--lack sufficient funds for the rental deposits and one month's advance rent needed to acquire an apartment. As a result, they are forced into overcrowded, ripoff lodging arrangements at seedy residential motels, which charge by the day or week. Even trailer-park living, which Ehrenreich tried in Key West, is now prohibitively expensive in tighter local housing markets. The nation's widespread deficiencies in public transportation also limit workers' options about where they can live--and work--if they don't own a car.
Many low-end employers don't offer health insurance, of course. Even when they do, workers in places like Wal-Mart often can't afford the payroll deductions required for family or even individual coverage when their starting pay is only $7 an hour (rising to $7.75 after two years in the Minneapolis store where Ehrenreich worked). The resulting lack of preventive medical and dental care leads to a cycle of daily discomfort and, sometimes, life-threatening deprivation. The work that Ehrenreich describes in painful detail--scrubbing floors, waiting on tables, lifting Alzheimer's patients--is hard on the body. Years of it breed myriad aches and pains, injuries and allergic reactions, which, left untreated, become a never-ending source of misery, not to mention missed work, lost income and potentially ruinous bills. As Ehrenreich notes, she held up as well as she did in several of her jobs only because she hadn't been doing them for long; without her personal history of regular exercise, proper diet and medical care, a woman her age (late 50s) would have been struggling to stay on her feet all day as a Merry Maid or Wal-Mart sales clerk.
What makes Nickel and Dimed so engaging, however, is not its tutorial on the economics and ergonomics of low-wage life and work. Rather, it is the author's insights into the labor process in the retail and service sectors, and into workplace power relationships. If Wal-Mart had been around in Orwell's era and he, rather than Ehrenreich, had worked there, he would have written 1984 much sooner. The private empire created by Arkansas billionaire Sam Walton boasts both a Big Brother figure--the late "Mr. Sam" himself--and a work force of "proles" (now 825,000 strong) whose docility, devotion and nonunion status are major corporate preoccupations. Entering this "closed system," replete with its own "newspeak" and "doublethink," Ehrenreich discovers that all the workers, like herself, are "associates," all the customers "guests," and the store supervisors "servant leaders."
One of management's top priorities, she learns, is eradicating "time-theft"--a crime most often committed by associates who violate Wal-Mart's strictly enforced "no-talk" rule, linger on their smoke breaks or otherwise dally in the never-ending task of stocking, straightening and restocking shelves. Potential malingerers (and others with rebel tendencies) are ferreted out during the prehire process of personality screening and drug testing. Once you're on the job, close surveillance by "servant leaders" and continuing "education"--via taped messages and training videos featuring Mr. Sam--are a constant feature of company life. To leaven this atmosphere of brainwashing and intimidation, "team meetings" for associates often end with a special "Wal-Mart cheer"--a morale-boosting device personally imported from Japan by the founder himself.
Given the widespread existence of such demeaning conditions and "the dominant corporate miserliness," why don't the wretched of this low-wage world revolt? What's holding them back? Nickel and Dimed offers several explanations for the absence of collective action: high job turn-over among the unskilled, their low self-esteem, the universal fear of being fired for speaking out or challenging management authority and, in some cases, actual worker identification with corporate values or individual bosses. Even with a background quite different from that of her fellow restaurant workers, Ehrenreich finds herself being affected by the culture of low-wage work in ways that she doesn't like:
Something new--something loathsome and servile--had infected me, along with the kitchen odors that I could still sniff on my bra when I finally undressed at night. In real life I am moderately brave, but plenty of brave people shed their courage in POW camps, and maybe something similar goes on in the infinitely more congenial milieu of the low-wage American workplace.
In the course of the book, after much buffeting by rude customers, abusive supervisors and unreliable co-workers, a kind of working-class alter ego of the author emerges--the "Barb" of her Wal-Mart ID who "is not exactly the same person as Barbara" (nor as sympathetic):
"Barb" is what I was called as a child, and still by my siblings, and I sense that at some level I'm regressing. Take away the career and the higher education, and maybe what you're left with is this original Barb, the one who might have ended up working at Wal-Mart for real if her father hadn't managed to climb out of the mines. So it's interesting, and more than a little disturbing, to see how Barb turned out--that she's meaner and slyer than I am, more cherishing of grudges, and not quite as smart as I'd hoped.
The author sounds more like her usual self when, as a house cleaner for Merry Maids--the McDonald's of its industry--she is forced "to meet the aesthetic standards of the New England bourgeoisie" down on her hands and knees, with a scrub brush. A particularly obnoxious client, the owner of a million-dollar condo on the coast of Maine, takes Ehrenreich into the master bathroom whose marble walls have been "bleeding" onto the brass fixtures--a problem she wants Ehrenreich to address by scrubbing the grouting "extra hard."
That's not your marble bleeding, I want to tell her, it's the worldwide working class--the people who quarried the marble, wove your Persian rugs until they went blind, harvested the apples in your lovely fall-themed dining room centerpiece, smelted the steel for the nails, drove the trucks, put up this building, and now bend and squat and sweat to clean it.
Unable to deliver this political tirade--lest she blow her cover--Ehrenreich instead fantasizes about exacting revenge similar to that witnessed and described so memorably by Orwell in Down and Out (i.e., the disgruntled cook who spat in the soup, the waiters who put their dirty fingers in the gravy, etc.). "All I would have to do," she muses angrily in a gorgeous country house, "is take one of the E. coli-rich rags that's been used on the toilets and use it to 'clean' the kitchen counters." No one, she concludes, should be asked to wipe out someone else's "shit-stained" bathroom bowl or gather up the pubic hairs found in their "shower stalls, bath tubs, jacuzzis, drains, and even, unaccountably, in sinks."
Ehrenreich has long been a rarity on the left--a radical writer with great wit and a highly accessible style. While often sad and grim, Nickel and Dimed is nevertheless sprinkled with the author's trademark humor. She is, for example, frequently struck by the oddity of her circumstances. Sitting alone in a cheap motel, eating takeout food after a hard day at Wal-Mart, she watches an episode of Survivor. "Who are these nutcases who would volunteer for an artificially daunting situation in order to entertain millions of strangers with their half-assed efforts to survive? Then I remember where I am and why I am here."
Half-assed as her attempts to learn unfamiliar jobs may have been--and as funny as she sometimes makes the experience seem--Ehrenreich is still engaged in a serious project. Nickel and Dimed may not be prime-time fare for millions. Yet, hopefully, it will still reach enough readers to expand public awareness of the real-world survival struggles that many Americans faced even before the current economic downturn. If anything, this book should command greater attention now because the life of the working poor--never easy in good times--is about to get harder in ways we'll never see on "reality TV."
Thirty-eight years after the bombing of Birmingham's 16th Street Baptist Church, two of the four principals are dead, but the issues are still full of life. Thomas Blanton Jr. is one of two surviving Klan bombers, and after a jury convicted him in early May of murdering the four black girls that Sunday morning, former Alabama Attorney General Bill Baxley wrote a blistering Op-Ed for the New York Times accusing the FBI of concealing evidence and aiding the Klan for decades after the event. The FBI's denial made page one the next day: "There's no reason we would have done that," a bureau spokesperson declared. The Times also published a letter from the special agent in charge of the FBI's Birmingham office, calling Baxley's Op-Ed "a disservice to all the agents who tirelessly investigated the 1963 bombing."
Diane McWhorter's Carry Me Home is a history of that bombing, of the FBI "investigation," of the people responsible for it--high and low--and of the civil rights movement in Birmingham. She grew up there--she was 10 years old at the time of the bombing--and later she worried, because her father, who had fallen from an elite family, had spent many evenings attending what her mother called "civil rights meetings." But Diane knew he had Klan literature around. Eventually she realized that her father could have been attending Klan meetings, and might even have been one of the bombers. Many years later she set out to find out the truth about him--and ended up writing this magnificent book.
Although the 16th Street Baptist Church served as a rallying point for demonstrators in the 1963 campaign, it was not a Movement church. McWhorter calls it "the snootiest black congregation in the city," and its founding minister worked with the local industrialists to persuade blacks not to join the union. At services they didn't sing gospel songs but rather "the sedate hymns of white Christianity." And 16th Street Baptist was the only church in the city that charged the Movement for using its facilities.
The bomb was a huge one--perhaps a dozen sticks of dynamite. When the blast was heard across town, Klansman Ross Keith, almost certainly one of the bombers, told a friend, "I guess it's somebody discriminating against them niggers again." The four girls who were killed were in the women's lounge, freshening up for their roles as ushers in the main service. Denise McNair was 11; the three others were 14: Carole Robertson, Addie Mae Collins and Cynthia Wesley--Wesley was wearing high heels for the first time, "shiny black ones bought the day before."
There is a survivor who was in the women's lounge with the other four: 12-year-old Sarah Collins, sister of Addie Mae. When they found Sarah in the rubble, her face was spurting blood. She was loaded into an ambulance--a "colored" one. On the way to the hospital she sang "Jesus Loves Me" and occasionally said, "What happened? I can't see." Today she is 50 and still blind in one eye.
Immediately after the four girls were identified, the authorities began "furious background checks on them, the search for some flaw deserving punishment." But their records were clean: None, that is, had participated in the recent civil rights demonstrations. Thus, even the city fathers and the local press had to agree they were "innocent."
The big question was never who the bombers were--they were identified by the FBI and the police almost immediately. The big question, McWhorter shows, is what permitted them to get away with it--"the state's malevolence or the FBI's negligence." Dozens of bombings had been carried out by the Klan in the preceding few years, virtually none of which were prosecuted. The FBI's informant in the local Klan, Gary Thomas Rowe, participated in some of them. McWhorter's index has ninety entries for "bombings," starting in the late 1940s. Most Klan bombings in the fifties targeted upwardly mobile blacks moving into middle-class white neighborhoods.
After the 16th Street church bombing, local authorities kept suggesting that blacks were the bombers. The police took the church custodian in for questioning. The FBI's pursuit of witnesses was unhurried, which gave the Klansmen more time to coordinate alibis. FBI informant Rowe told a Birmingham policeman that the man who put up the money to have the church bombed was Harry Belafonte.
The man convicted just weeks ago, Thomas Blanton Jr., was part of an extremist subgroup of the Klan. Initially he focused his violent hatred on Catholics, like the Klan of the 1920s. He had a neighbor, a widow, who was Catholic; she received regular "anonymous calls" from a voice she recognized as his--she had known him for eighteen years--telling her "Niggers and Catholics have to die." Once he threw red paint on her new white Ford and slashed her tires. Earlier in 1963 Blanton had been talking about organizing a church bombing, but he wanted to bomb a Catholic church, not a Negro one. "His associates pronounced him not intelligent enough to make a bomb but dumb enough to place it."
The other man recently charged with the bombing has been judged mentally incapable of standing trial. But in 1962-63, Bobby Frank Cherry was 32 years old, had "no upper front teeth, a 'Bobby' tattoo on his arm, seven kids, and a wife he beat and cheated on." He had been a police suspect in the 1958 attempted bombing of Birmingham's Temple Beth El, and McWhorter has evidence strongly suggesting that he also participated in bombing churches in January 1962, almost two years before the four girls were killed. If the FBI had investigated him after the 1962 explosions, that might have prevented the 16th Street Baptist Church bombing, but as McWhorter points out, "instead the FBI was investigating Martin Luther King," proposing to, as the bureau put it, "expose, disrupt, discredit, or otherwise neutralize" the leader of the civil rights movement. (In the middle of the Birmingham battle, Bobby Kennedy agreed to let J. Edgar Hoover wiretap King.)
The killing of the four black girls finally spurred the Kennedy Administration to propose, Congress to pass and new President Lyndon Johnson to sign the Civil Rights Act of 1964, outlawing racial discrimination in public facilities--the first significant civil rights legislation in a century. The bombing followed the biggest and most successful mass civil rights demonstrations in US history--police met the thousands of marchers with fire hoses and dogs. Today the history of the civil rights movement seems like one of steady progress: first the Montgomery bus boycott in 1955, which propelled King to national prominence and established nonviolent direct action as the new tactic, supplanting the legal gradualism of the NAACP; then, in 1960-61, the sit-in movement, in which small groups of courageous students across the South took the lead in a direct personal challenge to segregation; then the Freedom Rides, where a few brave people provoked racist violence that compelled the Kennedy Administration to enter the civil rights arena; and finally Birmingham, where mass protests filled the jails and finally won national legislation outlawing segregation in public accommodations.
What's been forgotten is the grim situation that faced King and the Movement at the outset of the Birmingham campaign in 1962. It had been seven long years since the Montgomery bus boycott--seven years with intermittent acts of immense heroism but without concrete victories. The Southern states were defiant, and the Kennedys, as Victor Navasky argued in Kennedy Justice, considered activists like Martin Luther King to be a problem that endangered their real initiatives, like a tax cut and fighting communism. By 1963 King and the Movement desperately needed a nationally significant victory, somewhere.
King himself had not, up to 1963, initiated any civil rights protest--starting with Montgomery in 1955, he was brought in as a spokesman after the action had already begun. Birmingham was no different. Here the real hero and moving force was Fred Shuttlesworth, in many ways the opposite of King--a man of the people, not of the elite; a man who courted danger and pushed the envelope, who stayed till the end, unlike King, who was criticized for leaving town early and leaving "a community stranded with false hope and huge legal fees." Much of the story of Birmingham is the story of Shuttlesworth's brilliant strategic initiatives and awesome physical courage--and King's more cautious efforts to negotiate a settlement by enlisting the White House, in exchange for calling off the demonstrations. It was Shuttlesworth who set out to launch mass demonstrations, fill the jails and compel the city leaders to desegregate downtown businesses and public facilities. McWhorter's book also shows just how close the Birmingham campaign came to failure. A month into the campaign, few people had signed up to go to jail--barely 300 in total, even though King himself had gone to jail. "There are more Negroes going to jail for getting drunk," one Movement leader commented.
What turned this around was an idea of James Bevel's--he had been a Student Nonviolent Coordinating Committee (SNCC) leader, who later became field secretary for King's organization, the Southern Christian Leadership Conference. His idea for Birmingham: Fill the jails with children. The adults were full of doubt and fear, but the kids were eager. Hundreds boycotted school on May 2, 1963, instead gathering at the 16th Street Baptist Church, then marching into the streets--more than a thousand of them. The children confronted the cops, singing in high voices "Ain't Gonna Let Nobody Turn Me Around" and "Which Side Are You On?" and then were ushered into buses to go to jail. For the first time, King and his lieutenants had achieved Gandhi's goal--fill the jails.
The next day thousands more showed up to march. That was the day of the fire hoses. The city's fire chief initially resisted officials' attempts to enlist the fire department in attacking demonstrators, on the grounds that the national union of firefighters officially opposed using fire equipment to "control" crowds. But when the orders came, the firemen turned on high-pressure hoses powerful enough to knock a big man off his feet, blast the shirts off people's backs and flush individuals down the gutters.
The success of the civil rights movement on the national political landscape required not just heroic action by large numbers of ordinary black people; it also required that the viciousness of the opponents of civil rights be presented vividly and dramatically to ordinary American newspaper readers and TV watchers. In this, the Birmingham movement turned out to be supremely fortunate to have the grotesque Eugene "Bull" Connor as police commissioner. Photos of young demonstrators linking arms and standing up to the high-pressure hoses made page one around the world. Life magazine ran a two-page spread of the most dramatic photo of firemen blasting demonstrators, headlined "They Fight a Fire That Won't Go Out." The photos of police dogs attacking demonstrators had the same effect. The New York Times ran a photo of a dog biting a demonstrator on page one, three columns wide and above the fold, headlined "Dogs and Hoses Repulse Negroes at Birmingham."
Key reporters had already found the civil rights drama a compelling story. In 1960, the New York Times published a blazing Harrison Salisbury story on page one before the Birmingham campaign got going: "Every reasoned approach, every inch of middle ground has been fragmented by the emotional dynamite of racism, reinforced by the whip, the razor, the gun, the bomb, the torch, the club, the knife, the mob, the police and many branches of the state's apparatus." State authorities responded by charging Salisbury with forty-two counts of criminal libel. The Times's response was to order its reporters to stay out of Alabama--not exactly a fighting stance--which meant that other news organizations would henceforth get the story while the Times relied on wire copy for the climactic battles. The Times didn't return until a year later, when Claude Sitton persuaded executives to let him cover the aftermath of the Freedom Rides.
While the Times proved gun-shy on Alabama, CBS-TV didn't; network president Frank Stanton sent reporter Howard K. Smith to Birmingham to make a documentary. (Stanton was not exactly a civil rights advocate, though; he had also "blacked out all Negro speakers at the Democratic and Republican presidential conventions.") Smith's crew set out to interview leading whites; the head of the elite Women's Committee told him on camera that "one of the contributing factors to our creativeness in the South is sort of a joyousness of the Negro." But she was worried because it had been four or five years since she had "heard Negroes just spontaneously break into song." Smith also turned out to be the only national reporter on the scene when the Freedom Riders arrived and were savagely beaten by a white mob while the police stood by.
Who Speaks for Birmingham? aired on CBS in 1961 and featured Smith's account of the mob attack on the Freedom Riders. Network executives complained that the program "presented Birmingham's Negroes in a better light than its whites," but executive producer Fred Friendly fought to keep the whole thing, and in the end gave up only Smith's closing line, a quote from Edmund Burke: "The only thing necessary for the triumph of evil is for good men to do nothing." But when the same Howard K. Smith criticized Kennedy in his regular Sunday radio commentary, asking whether "we really deserve to win the cold war" in view of the racist violence in Birmingham, CBS News suspended him from his job as Washington bureau chief.
The media coverage was crucial, but one of the secrets of the demonstrations was that neither the police nor the media distinguished between marchers and spectators. Only a couple of hundred people joined the early official demonstrations, but a thousand or more turned out to watch and see what happened. The police attacked everybody, and the press reported thousands of demonstrators.
Carry Me Home includes the most detailed account ever of the Birmingham Movement's strategy and tactics, day by day and hour by hour, but what makes it unique is its account of the local opposition to civil rights, and particularly the links between the "Big Mules," who ran Birmingham's industrial economy, and the Klan bombers. The book's most important contribution is its decisive evidence that the bombing of the 16th Street Baptist Church "was the endgame in the city fathers' long and profitable tradition of maintaining their industrial supremacy through vigilantism."
Birmingham had never been what you would call a happy place--the New South's one center of heavy industry, it was a city where the ruling elite fought working-class militancy with the most blatant racism. Power in Birmingham centered on US Steel, which ran the town along fascist lines--one Communist organizer in the 1930s was sentenced to a shackled road crew for possessing "seditious" literature, which included The Nation magazine. The dirty work of the Big Mules was carried out by the Alabama Klan, which was reorganized in the 1930s as an antiunion shock force.
Charles DeBardeleben headed the Big Mules--he ran the biggest coal company in the state, and his father had pretty much founded Birmingham as a coal and iron center. By the mid-1930s DeBardeleben was also a secret corporate benefactor of the Constitutional Educational League, part of a global network of pro-Nazi propagandists. The league's 1938 banquet featured George Van Horn Moseley, who "advocated sterilizing all Jewish immigrants to the US." McWhorter names the names of the other key Big Mules and shows their connections to the bombers of the 1950s and 1960s.
History also loomed large for the Jewish businessmen who owned the downtown department stores that were the target of the demonstrators' demands for integration and jobs. Birmingham's Jews had been traumatized a generation earlier during the Scottsboro trial, when the nine "Boys" were defended in their rape trial by a Jewish attorney from New York named Samuel Leibowitz. The state's closing statement challenged the jury to "show them that Alabama justice cannot be bought and sold with Jew money from New York." The jury obliged.
That was 1933. Thirty years later, Birmingham's Jews were still feeling defensive. One liked to tell his gentile friends, "It wasn't the Birmingham Jews who killed Jesus. It was the Miami Jews." Now they declared that they were as opposed to "outside agitators" as Bull Connor--indeed, one Birmingham Jewish organization issued a public statement demanding not only that Martin Luther King stay away but that the Anti-Defamation League stay out of Birmingham.
Birmingham is also famous as the place where Martin Luther King composed his best-known written work, "Letter From a Birmingham Jail." It wasn't King's idea. Harvey Shapiro, an editor at The New York Times Magazine, suggested that King write a "letter from prison" for the magazine. The missive that King wrote turned out to be a classic, "the most eloquent treatment of the nexus between law and injustice since Thoreau's essay 'Civil Disobedience.'" But when King submitted his piece, the Times editors rejected it. It wasn't printed for another two months, and then in The Atlantic Monthly.
The Martin Luther King who appears in McWhorter's account is not very heroic. His claim that "unearned suffering is redemptive," made at the March on Washington earlier that year, seemed irresponsible to more and more blacks, ranging from SNCC militants to ordinary Birmingham blacks. In King's first statement after the bombing, he asked, "Who murdered those four girls?" and answered, "The apathy and complacency of many Negroes who will sit down on their stools and do nothing and not engage in creative protest to get rid of this evil." Carole Robertson's mother had not participated in the demonstrations; she was so outraged at King blaming her for her daughter's murder that she refused to join the three other families in a mass funeral for the girls.
At the other end of the spectrum in black Birmingham were the men who saw the events as providing "a chance to kill us a cracker." The Movement's insistence that marchers take a pledge of nonviolence was based on leaders' knowledge of the deep rage that black men in particular bore for whites. "At mass meetings, King began passing around a box for people to deposit razors, knives, ice picks, and pistols, and salted his inspirational calls to dignity with reminders that being black did not in itself constitute a virtue." People needed courage and hope before they could take the pledge of nonviolence.
McWhorter's panoramic cast includes blacks on the wrong side of the Movement. "Rat Killer" ran the 17th Street Shine Parlor, a popular after-hours spot where visiting stars like Jackie Wilson, Sam Cooke and the Temptations hung out, and where Movement preachers got their shoes shined. But Rat Killer was "Bull Connor's right-hand man" in the black community--he traded information for informal permission to sell bootleg liquor and do some pimping.
McWhorter weaves her personal story throughout the book, and these sections provide uniquely rich and revealing evidence of the blindness of middle-class whites in this era. The book opens at Birmingham's elite white country club on a Sunday, when McWhorter was having brunch as usual with her family. It turns out to have been the morning the church was bombed. McWhorter was a year younger than the youngest of the four girls killed. Although the bombing marked a turning point in the nation's history, her family took little note of it. She doesn't remember it at all, and her mother's diary entry for that day says only that Diane's rehearsal for the community theater production of The Music Man was canceled--not in mourning over the deaths but because whites feared that black people would riot.
The police dogs that horrified the world were well-known to McWhorter. Before the historic day they attacked the demonstrators, the police brought one of the dogs to an assembly at her school to demonstrate its crime-fighting abilities. McWhorter was so excited by the event that she changed her career goal to police-dog handler (she had planned to become an Olympic equestrian).
At the end of the book, McWhorter finally confronts her father on tape. He says he's told friends his daughter is writing a book about "the nigger movement," but says he wasn't in the Klan and was never involved in murdering anyone. She concludes he was a camp follower but not much of an activist.
The rest of the key figures in the story are mostly dead now: Bull Connor died in 1973; Robert Chambliss, until Blanton the only man convicted in the bombing (in a 1977 trial brought by then-Alabama Attorney General Baxley), died in 1985 while serving his prison term. (Baxley ran for governor in 1978 and lost.) The FBI's Klan informant, Gary Thomas Rowe, admitted that he and three Klan members shot and killed Viola Liuzzo on the 1965 Selma-to-Montgomery march; the killers were acquitted of murder (but not of violating her civil rights), and Rowe went into the witness protection program after the trial and died in 1998. The other Klan bombers died too, until the only ones left seemed to be Bobby Frank Cherry and Tommy Blanton.
Chambliss's 1977 trial exploded back to life early this May with Baxley's New York Times Op-Ed. He wrote that he had "requested, demanded and begged the FBI for evidence" from 1971 through 1977; that his office was "repeatedly stonewalled"; that the bureau practiced "deception," the result of which was that Blanton went "free for 24 years" while the FBI had "smoking gun evidence hidden in its files." He concluded by describing "the disgust" he felt over the FBI's conduct. No state attorney general has ever spoken so forcefully in criticizing the bureau.
Now Blanton has been convicted, but virtually all the other Southern white men who killed blacks during the heyday of the civil rights movement have gone unpunished. In the end the Klan bombers may not be the biggest villains in this story. It's the city and state officials, including the police and the FBI, who tolerated and sometimes encouraged racist violence, and the Kennedy brothers, who didn't want to do anything about it until they were forced to. Diane McWhorter started writing about "growing up on the wrong side of the civil rights revolution"; she ended up with the most important book on the movement since Taylor Branch's Parting the Waters. It should become a classic.
Readers of this magazine do not need reminders of the costs of the cold war. The mountains of corpses, the damaged lives, divided families and displaced refugees, the secret police forces and death squads, and the resources wasted on ghastly weapons of unfathomable evil are not only markers of a recent past but still-active landmines buried a few inches beneath the surface of our contemporary lives.
What may be harder to remember is the ways the global struggle with the Soviet Union enabled social and cultural achievements that made the United States a decidedly more decent society. From Harry Truman's integration of the armed forces to the Brown decision and the 1963 March on Washington, the initial phase of the civil rights movement capitalized on the moral embarrassment of segregation for a nation trying to win the hearts and minds of Third World peoples. Likewise, the rapid postwar expansion of state universities, the infusion of government monies into public schools after Sputnik and the creation of the National Endowments for the Arts and the Humanities in 1965 were all episodes in an ideological cold war meant to demonstrate the cultural superiority of the "free world" to the Soviet bloc. It was a strange era that offered both Martin Luther King Jr. and his persecutor J. Edgar Hoover their big chance to bring the United States closer to their ideals.
Two monuments to the cold war stand catty-corner to one another on Washington's Pennsylvania Avenue: On one side, the brutalist Hoover FBI building; on the other, the restored neo-Romanesque post office that houses the NEA and NEH and bears the name of Nancy Hanks, the liberal Republican chair of the NEA during its glory days in the late 1960s and early 1970s. Care to guess which building will be renamed first?
Michael Brenson's new study of the NEA, Visionaries and Outcasts, emphasizes the cold war origins of the agency in an effort to place the "arts wars" of the past dozen years in historical perspective. Looking beyond the 1995 budget cuts that devastated the endowment, and the earlier battles in 1989-90 over NEA-supported exhibitions by photographers Robert Mapplethorpe and Andres Serrano, Brenson tracks the unfolding of a tension between "ideology and idealism" inherent in the NEA founders' understanding of the agency's role in American culture. Arts advisers to Presidents Kennedy and Johnson sought federal support for the arts to promote international awareness of the cultural vitality of a society dedicated to free expression and civil liberties. At the same time, cultural policy-makers like August Heckscher and Arthur Schlesinger Jr.--heirs to the upper-middle-class lampoon of middle-class "conformism" that stretched from Sinclair Lewis's Main Street (1920) to William H. Whyte's The Organization Man (1956)--saw in federal arts funding a way to create an American "civilization" equal to Western Europe's, which would inspire their fellow citizens with something more ennobling than the stuff of television and Levittown. Much like Clement Greenberg, the towering figure in postwar art criticism, Camelot culture warriors mounted a two-front campaign against the state-dominated art of the Soviet bloc and the kitsch of a newly affluent society.
Amazing as it now seems, the man (and he was imagined as a man) who was to do such heroic work for the nation was the artist. Kennedy's wooing of celebrity artists and writers--epitomized by his choice of Robert Frost to deliver a poem (he recited "The Gift Outright") at his 1961 inaugural and his subsequent invitation to Pablo Casals to perform at the White House--was not only an attempt to surround himself with glamorous and influential opinion-makers but, according to Brenson, a determined effort to establish the artist-prophet as a symbol of defiant individualism in an other-directed age. Whether it was Frost the aging Yankee reciting from memory at the inaugural or the Abstract Expressionist painters wrenching meaning from existential meaninglessness, the image of modern artists as "visionaries and outcasts" served liberals' war of ideas against Communist adversaries abroad and the benighted middle classes at home. As Kennedy put it in his 1963 speech at the dedication of the Robert Frost Library at Amherst College--the occasion for his most extended comments on the arts--a great artist was the "last champion of the individual mind and sensibility against an intrusive society and an officious state."
Visionaries and Outcasts sketches the history of liberalism's dream of the visual artist as national hero from the early 1960s to the present. Brenson was originally commissioned to write an internal study of the NEA's visual arts program, but the former New York Times art critic chose to revise and publish his work independently after the dismantling of that program by the Gingrich Congress in 1995. The book he has produced is more than an institutional study of one office in a federal agency, however. Brenson rightly considers the program that gave some 4,000 fellowships to individual artists between 1965 and 1995 as the heart and soul of the Endowment. Although early chapters suffer from the bureaucratic language common to government reports, the book concludes by raising thoughtful and provocative questions about the tragic history of the NEA. As he revised his study, Brenson expanded his vision to include the rise and fall of this heroic image of the modern artist as prophet and redeemer of late-twentieth-century US culture. "The NEA became a lens onto larger issues of the changing identity of the American artist and the enduring problem of...the visual artist in a country that...is still only comfortable with the artist as a maker of high-priced commodities controlled by galleries and museums."
In the story Brenson tells, modern artists were useful to this country's political elites only so long as the cold war was raging. Once that war was won, and the political culture had shifted markedly to the right, the lonely artist was no longer a bearer of universal values but a threat to them. The ideological rationale for the endowment collapsed along with the Berlin wall, and cautious NEA administrators invested their idealism in established art institutions. Better to fund museums than to risk spending money on unruly individuals who might turn out--like Serrano or Mapplethorpe--to be "controversial." Despite the defensive maneuvers of arts administrators and their allies, a vengeful Congress cut the NEA's budget by 40 percent in 1995 and eliminated all grants to individual artists (except writers). The endowment has since limped along into the twenty-first century, but more as an occasion for petition drives and liberal fundraising than as a vital force for artistic creativity. In reality, the NEA of 1965 is dead, and with it the official myth of the artist as critic and savior of American national culture.
During the three decades when it mattered, the NEA's visual arts program gave small grants, no strings attached, to many of this country's major artists, often offering them assistance early in their careers before private money was forthcoming. The mechanism for doling out funds was peer panels composed of artists, curators, scholars and critics, who operated without political oversight from federal officials. In fact, Kennedy liberals organized the peer-panel system precisely because it insulated art-funding decisions from state interference and therefore drew another contrast with the state cultural agencies in place in the Soviet-bloc countries. Artistic freedom, in the view of Camelot arts advisers, required the support of professional panels that would judge art strictly according to nonideological, aesthetic standards. At a time when a Greenbergian theory of aesthetic autonomy reigned supreme in New York-based art circles, the freedom of the NEA's peer panels from politicians' control seemed to most liberals a necessary complement to a Modernist logic that divorced "pure" painting and sculpture from political ideology, representation and traditional subject matter of any kind.
The NEA's panels instantly became objects of criticism from true "outsiders," who interpreted talk of an autonomous aesthetic as a bid for power by art-establishment cronies. Brenson ignores the early history of such attacks, which originally came from the political left, and instead repeats the now-familiar story of the persecution of the NEA by the Christian right and its allies in Congress after 1989. The story is a bit more complicated than that, however. In the context of the late 1960s and early 1970s, the authority of the peer panels and the autonomous aesthetic theory they defended came under attack from other quarters: from advocates of more politically charged, social-realist and feminist art; from African-American and Latino artists who saw little of their work or their traditions acknowledged, let alone supported, by the NEA in its early years; and from folk artists and enthusiasts of regionalist cultural traditions who disputed the place of New York Modernists at the pinnacle of the NEA's cultural hierarchy. Although the Endowment quickly made concessions to its critics on the left, the peer-panel process remained largely unchanged from its original incarnation until 1995, in the aftermath of the Republican sweep in the previous fall's elections, when the conservative polemic against the tyranny of a "cultural elite" hostile to the values of "normal Americans" finally succeeded in killing off the visual arts fellowships.
Brenson devotes almost half his book to an admiring account of the panels' operations, quoting extensively from artists who served as referees or benefited from the program's largesse. He condemns the system's rightist critics as ignorant and presents the panels in the most glowing terms imaginable as models of aesthetic judgment, openness and generosity. "The peer panel system embodied the idealism and nobility of the NEA," he tells us. Those who applied unsuccessfully during these years may have had another view of the matter, but no one can deny that the award of such a grant at an early stage of an artist's career meant far more than the money involved. Installation artist Ann Hamilton recalls that "winning" her fellowship in 1993 "gave me a very important sense of support from my peers, which is and was very important in maintaining the trust and faith necessary to make new work, to change, to make a leap of imagination toward what can't easily be knowable or containable in language." This was the NEA's visual arts program at its best--"a gift," as Brenson calls it, "in the fullest sense of something given especially to one particular person, with a special knowledge of who that person is and what that person needs, by someone or something that cares--in this case a government agency, on the advice of peers."
What went wrong, then? Given its distinguished history, why was the visual arts program so vulnerable in 1995? Visionaries and Outcasts is not altogether helpful in answering that question, though it offers a rudimentary road map for a fuller account in the future. Brenson rounds up the usual suspects--Jesse Helms, fundamentalists, New Criterion editor Hilton Kramer--and, in a more intriguing move, notes how the ground shifted beneath the panel system in the 1980s as the art market and American artists themselves transformed the cultural meaning of the visual arts. The go-go art market of the Reagan era created a private reward system that made the NEA irrelevant to many young artists on the make, while conservatives inside and outside the endowment began assigning to museums the universalistic values that 1960s liberals once invested in the image of the heroic artist. Meanwhile, radical artists gave up the Modernist ideal of the individual prophet-artist standing apart from his or her culture. The adoption by many political artists of the term "community arts movement" to describe their project was an important sign of a new sensibility among artists who came of age in the 1980s and rejected the endowment's original assumptions even as they accepted its subsidies. Brenson himself adopts some of their critique in the closing pages of his book, acknowledging that the NEA "put artists on pedestals" and "ended up sustaining their marginalization" by perpetuating an image that many Americans found "arrogant and disdainful."
Brenson's second thoughts seem not to have influenced the rest of this book, which hardly registers the effect of such searching self-criticism. That is unfortunate, because his valuable questioning of the Modernist myth that originally inspired the NEA, and his closing call for an art of "connectedness"--to other citizens and to the natural world--should be the starting points for any serious reconsideration of the embattled agency's history. Especially when it comes to the arts, liberal and leftist culture-workers are too quick to attribute their current troubles to the malevolence of strangers (what will the so-called People for the American Way do when Jesse Helms dies?); too loath to acknowledge that they have achieved positions of power, wealth and influence in American society; and too devoted to their flattering self-image as, alternately, daring rebels or beleaguered victims. Such poses may absolve cultural administrators of any feeling of responsibility for their institutions' plight, but they will prove useless when it comes time to sort through the wreckage of the NEA and other liberal cultural programs in search of lessons for the future.
At one crucial moment in his book, Brenson inadvertently hints at a more critical history of the endowment that might better explain its terrible predicament. He compares the panel system to "the United States jury system" in its rock-bottom faith in humans' "need to learn, [their] belief in justice, and [their] commitment to the common good." Maybe those were the impulses that motivated the panelists as they watched hundreds of slides flash before their eyes; but in retrospect it's exactly the extent to which the NEA selection process was not like a jury that stands out as its chief political liability. Juries, after all, are not composed solely of lawyers, criminologists, psychologists and forensic experts. Nor are embezzlers, assassins and car thieves invited exclusively to judge their peers. When those people serve on juries, they do so as citizens, not in their capacity as professionals. Whatever their limitations, juries embody the civic ideal that ordinary voters--informed by the law and the testimony of relevant specialists--possess the wisdom to govern themselves and administer justice fairly. Never did the NEA's founders display a comparable faith in the ability of nonexperts to contribute to the common culture. Indeed, one reason they married a formalist aesthetic to bureaucratic proceduralism in the first place was to secure a space for creativity separate from the presumed ignorance and tastelessness of the general public.
Such a system "worked" well enough in the NEA's early years, when a New York-based art elite had an astonishing confidence about which artists deserved support. As the East Coast NEA panel met in 1966, it was easy for a few insiders to chat informally and select names. "Generally there was a consensus" about which artists deserved grants, sculptor George Segal told Brenson. "There was not too much of a discussion because it was assumed that all of us knew them." The founding director of the visual arts program, former Metropolitan Museum of Art curator Henry Geldzahler, was openly contemptuous of a request at a West Coast meeting that the panelists examine slides of work by the artists under consideration. As panelist and fellow museum curator Walter Hopps recalled, "The boxes were pushed into the room. Henry stood up and went over and thumped each box with his hand and said, OK, now we've seen the applications and we've seen all this." The boxes of slides were removed, unopened; the applications sat in a pile unread. "We just talked about who we wanted.... It was all over in a morning."
A small art world with a strong consensus on a Greenbergian narrative of Modernist progress could afford to behave this way, especially when it enjoyed support from a liberal majority in Congress. But even when the peer-panel process was cleaned up and made more professional, the complaints poured in that the selection system was unresponsive to the very public this public agency was meant to serve and indifferent to the growing heterogeneity of art practices that transformed visual culture in the United States after the 1960s. What at first seemed like a means of protecting the independence of cutting-edge "visionaries and outcasts" from bureaucratic interference stood condemned by the late 1970s and early 1980s as an institutionalized patronage network that favored specific aesthetic commitments and excluded the vast majority of Americans as incapable of informed artistic judgment.
Coming to terms with the political shortcomings of the peer-panel system requires that we take a more skeptical view of the idea that artists (and their liberal allies) were "outcasts" in the first place, back in 1965. Despite his trenchant critique of the heroic-individualist model of the artist during the cold war, Brenson himself slips into romantic and avant-gardist rhetoric that is long overdue for critical scrutiny. To what extent can one really speak of the modern artists the NEA supported in the 1960s and 1970s as an avant-garde? Wasn't the original mission of the NEA proof that by mid-century the avant-garde ideal had merged perfectly with the cult of expertise that so captivated elite liberals, with their dream of benevolent rule from above by "the best and the brightest"? The class and ideological biases of the cultural institutions that liberals created in that period seem to have escaped no one except liberals themselves.
A quarter-century after the collapse of the New Deal arts programs, with their organic connection to 1930s labor insurgency, the case for federal arts funding returned in a very different political guise. The NEA's original base was in the (Nelson) Rockefeller wing of the Republican Party and the (John) Kennedy wing of the Democratic Party, two upper-middle-class constituencies that prided themselves on their distance from a seemingly "stodgy" labor movement and a parvenu middle class mired in the "ticky-tacky" vulgarity of the suburbs. It should come as no surprise that Nancy Hanks--once Nelson Rockefeller's personal secretary and then the NEA's chairwoman during the Nixon and Ford administrations--presided over dramatically escalating budgets for the endowment. Republicans still needed to appease the Rockefeller wing of their own party. And it should be no surprise, either, that a new right within the Republican Party succeeded in large part by pursuing a very different brand of cultural politics.
Capitalizing on popular unhappiness with the arrogance of the "New Class" at the helm of the NEA and other official cultural institutions, the Goldwater-Reagan right was able to oust the Rockefeller liberals from its own party and mount a masterful crusade against "cultural elites" in the universities, foundations, mainline Protestant churches, museums and the two endowments. Elite liberalism has not fared well in postliberal America, as conservatives have channeled popular disaffections into a pseudo-populism on cultural matters that they would never tolerate in economic affairs or foreign policy. The result has been an increasing isolation of artists, writers and intellectuals in universities and a delegitimation of the very idea of a common cultural life shared by citizens of different backgrounds.
With its original claims to aesthetic autonomy and professional expertise discredited by years of pounding from the left and the right, the endowment lacks a persuasive language to justify alternatives to the privatization of arts patronage. Its very name, the National Endowment for the Arts, speaks to an era of liberal consensus--on the nation, on the nature and desirability of national cultural standards, on what does and does not constitute art--that has disappeared. With the nation and the arts in dispute, all that remains is the program's pathetic "endowment," mere chump change in the global village overseen by the likes of Rupert Murdoch, Charles Saatchi and the trustees of the Guggenheim Museum chain store.
In an era of market fundamentalism, the panel system that once promised artists protection from political and bureaucratic interference during the cold war deserves careful reconsideration. It is conceivable that panels might again function as "free spaces," this time offering artists a refuge from the commercial imperatives that are ruining publishing, museums and public broadcasting. But to make the case for such spaces today requires a radically different mindset from the sentimental avant-gardism and antidemocratic prejudice still current in elite art circles. It also demands a clear-eyed acknowledgment of the historical complicity of the endowment's defenders in the political logic that threatens our public schools, museums and libraries, as well as our artists.
Starting from ground zero, with the NEA in ruins, advocates of public funding for the arts need a language that recognizes the difference between the authority of collective judgments rooted in shared standards and the exercise of market power, and which assumes, furthermore, that every person has access to varieties of aesthetic experience that may contribute to the formulation of such standards. Opening panels to nonspecialists need not be a Trojan Horse for "Archie Bunkerism" or "authoritarian populism," those bugaboos of elite left-liberalism. Nor is it an affront to the credentials of artists and scholars who benefit from public subsidy (like this reviewer) to insist that they discuss their work with lay audiences in exchange for such support. These are tiny steps, of course, but the suspicion and hostility even such modest suggestions provoke in some quarters are a sign of the bleak cultural pessimism that now poisons all discussion of the civic role of the arts in the United States.
Every few months, I receive a forwarded e-mail message that recounts a reputed NPR story by Nina Totenberg about an upcoming Supreme Court ruling on funding for the NEA, warns that the Court's conservatives are about to kill off the endowment once and for all, and then asks for my name on its long list of petitioners. The petition is a classic Internet hoax, but even if it weren't, the time for forwarding such messages is long gone. The NEA was gutted several years ago, and the rebuilding of public support for publicly funded art is going to take a lot more than e-mail petitions. There are hard, unsettling questions that the people who sign such petitions need to ask about the responsibility they and their institutions bear for the ascendancy of our conservative order and about the blindness that comes with the heady self-image of artists and intellectuals as visionaries, outcasts and perpetual victims. Michael Brenson's book is a valuable starting point for a conversation, barely audible at the moment, that might finally address those questions. Until then, ignore the petition on your computer screen. That delete button is there for a reason.
When Philip Roth compiles lists of the writers he most admires, Tolstoy never seems to make it. There's Flaubert, Kafka, Bellow--the touchstones. Gogol, Dostoyevsky, Céline--the madmen. Henry Miller, of course; even Chekhov and Thomas Mann. But Tolstoy, when he appears in Roth's fiction at all, is usually something of a joke. In The Ghost Writer, young Nathan Zuckerman travels to meet his hero, the reclusive novelist E.I. Lonoff ("Married to Tolstoy" is how the novel describes the plight of Lonoff's wife); lying the first night in the sanctum where Lonoff composes his masterpieces, and knowing that a fetching student of Lonoff's is also staying at the house, Zuckerman is, shamefully, seized by erotic yearnings. He yields to them. "Virtuous reader," he reports, "if you think that after intercourse all animals are sad, try masturbating on the daybed in E.I. Lonoff's study and see how you feel when it's over."
As if this weren't bad enough, four years later Roth began The Anatomy Lesson with a sexual rewriting of Anna Karenina's famous opening. "Happy families are all alike," Tolstoy wrote. "Every unhappy family is unhappy in its own way. Everything was in confusion in the Oblonskys' house." Roth's version: "When he is sick, every man wants his mother; if she's not around, other women must do. Zuckerman was making do with four other women."
So perhaps it is as punishment for this needling that in his old age Roth has become Tolstoy. His last five novels have been Tolstoyan in scope, and, like Tolstoy, he has been celebrated for them. Like Tolstoy he is loathed by the official organs of religion--an archbishop of the Russian Orthodox Church suggested that Tolstoy be executed for the antimarital rantings of "The Kreutzer Sonata," while here in America an influential rabbi demanded to know, "What is being done to silence this man?" after Roth's attacks on Jewish suburbia in Goodbye, Columbus. And if it so happens that the Jews are wrong, and Hell exists, there can be no question that the author of Sabbath's Theater will spend eternity there.
But the chief reason that Roth is Tolstoy is that he, almost alone of our contemporary novelists, so insistently has Something to Say, and is prepared, at times, to forsake all his literary instincts in order to say it. Tolstoy's digressions in War and Peace on the mechanisms of history infuriated such early readers as Flaubert ("he repeats himself and he philosophises!"), as well as everyone since. After completing his masterpiece, Anna Karenina, Tolstoy for some time wrote only philosophical and religious tracts. As for Roth, who came dangerously close to turning his last, very powerful novel, The Human Stain, into a political rant against the Clinton impeachment, he too has for the moment dropped most pretenses to fiction and produced, with The Dying Animal, something far closer to an essay.
It is an essay, naturally, about sex. Lenin claimed that Tolstoy was the mirror of the Russian Revolution; for the past forty years, Roth has been the mirror of the sexual one. In his work, the contradictions of that libidinal revolt have found their fullest expression. During the 1960s, Roth hailed its arrival--indeed, three years after the 1969 publication of Portnoy's Complaint, Irving Howe could damningly suggest that Roth was "a man at ease with our moment." But Portnoy, Zuckerman and the rest have also testified eloquently to the costs of such freedom. You may shatter convention, Roth showed, but be warned that society (with its thuggish enforcer, the superego) has the resources to defend itself, with extreme prejudice.
The same paradigm fits the slim plot of The Dying Animal. The narrator, 70-year-old David Kepesh, is a cultural arbiter and professor who has systematically been sleeping with everyone, including and especially his students, since leaving his wife and child in the 1960s. You will perhaps object that Kepesh doesn't have any kids, and you'll be right. Roth has never been scrupulous with his characters' biographies--Zuckerman's childhood, for example, what with the boxing lessons and ping-pong in Swede Levov's basement and the Communism, is beginning to look awfully crowded--and in this case he outfits Kepesh, who appeared in two previous, rather mediocre outings as a hesitant philanderer in The Professor of Desire and as a giant breast in The Breast, with a more virile résumé and an abandoned son from a different first marriage. Nor does he bother to explain how the mammillary Kepesh turned himself back into a man.
But it's still the same Kepesh, of all Roth's narrators the dullest and most methodical. Even when he ceased to be a man, Kepesh was a most reasonable breast. He is reasonable still, as he catalogues his sexual habits, rules and arrangements; of an affair with a middle-aged former student, he explains: "It was a joint venture, our sexual partnership, that profited us both and that was strongly colored by Carolyn's crisp executive manner. Here pleasure and equilibrium combined." Given this regimented administration of his own happiness, it is naturally satisfying to see Kepesh--"the propagandist of fucking"--caught up in all the old emotions after an affair with a particularly stunning student. "This need," he moans, referring not to lust but to attachment. "This derangement. Will it never stop?"
But before everyone runs out to buy this paean to the triumph of the bourgeois spirit, they should be warned that Roth takes Kepesh far more seriously than this plot summary indicates--takes him at his word. It should even be noted--if I may be allowed a quick critical crudity--that Kepesh's style is the closest among his narrators to the style of Roth's own essays and memoirs. His concerns are Roth's, and he shares many of the master's ideas about the world. For Kepesh is not merely a reflection of the sexual revolution, but also its historian.
And here I must stop myself--it is so easy to make fun of Roth. Sixty-eight years old and again with the sex. When Tolstoy published his attack on physical love in "The Kreutzer Sonata," young wits suggested that the Count's own kreutzer might be out of order. It is easy, in other words, to make fun of old men. I myself have done so. I thought--it seems to be the general consensus--that sex for Roth was a device with which to propel his fictions; that he could have used cars, or whales, or sports, and chose sex merely because it was historically ripe, as a subject, and for the simpler reason that it was the quickest way to épater ye olde bourgeoisie. Diaphragm! Cunt! "The Raskolnikov of jerking off"!
I no longer think so. It seems obvious that at this point Roth can do little with sex that he hasn't done already (though he tries in The Dying Animal, he tries). This continued fixation is fictionally fallow--as Roth writes, baldly, in The Dying Animal, "You know you want it and you know you're going to do it and nothing is going to stop you. Nothing is going to be said here that's going to change anything." Since sex is, in this view, overdetermined, it's like writing about gravity. (In fact, not having sex is far more promising--one of the things it promises being future sex.)
Yet Roth persists, and after forty years it can only be because he believes sex the most important topic he could possibly tackle, and now more than ever. So this book demands that we approach it with a straight face, even when a straight face seems the least natural response. Kepesh, of course, is professorial, telling of the Merry Mount trading post in colonial Massachusetts, raided by the Puritans because it was a bad influence on the young. "Jollity and gloom," he quotes Hawthorne, "were contending for an empire." He is also empirical, a one-man research institute, reporting the number of times (one) that he was the beneficiary of oral sex in college in the 1950s, and clinically tracing the progress made in the interim: "The decades since the sixties have done a remarkable job of completing the sexual revolution. This is a generation of astonishing fellators. There's been nothing like them ever before among their class of young women."
If this seems deliberately offensive, it is part of the general urgency, even desperation, that pulses through this book. Roth is running out of time; he must tell you as quickly as possible, he must convince you to change your life. Now, Roth has always considered the sexual revolution in quasi-world-historical terms. "The massive late-sixties assault upon sexual customs," he told an interviewer in 1974,
came nearly twenty years after I myself hit the beach fighting for a foothold on the erotic homeland held in subjugation by the enemy. I sometimes think of my generation of men as the first wave of determined D-day invaders, over whose bloody, wounded carcasses the flower children subsequently stepped ashore to advance triumphantly toward that libidinous Paris we had dreamed of liberating as we inched inland on our bellies, firing into the dark.
This is sweet and funny and light--and wholly innocent, it seems, of the damage done.
There is no such lightness in The Dying Animal. When the same idea (Roth as sexual revolutionary vanguard) resurfaces, it has an embattled quality to it, as if Roth is no longer certain what has happened, or who won. "Look," says Kepesh, in his demotic, direct address:
I'm not of this age. You can see that. You can hear that. I achieved my goal with a blunt instrument. I took a hammer to domestic life and those who stand watch over it. And to [my son]'s life. That I'm still a hammerer should be no surprise. Nor is it a surprise that my insistence makes me a comic figure on the order of the village atheist to you who are of the current age and who haven't had to insist on any of this.
The shift in tone from the interview is remarkable. The confidence is gone; the winds of history are shifting. Not only have the young forgotten their benefactors, they've started to cede the freedoms won for them--"now even gays want to get married," says Kepesh. "I expected more from those guys." And the deflowered order has been replaced by a new form of surveillance, which Kepesh scrupulously documents during a student conference: "we sat side by side at my desk, as directed, with the door wide open to the public corridor, all eight of our limbs, our two contrasting torsos visible to every Big Brother of a passerby." The revolution for which Kepesh fought so ruthlessly has been betrayed.
Which is a well-known habit of revolutions. Roth might have predicted, in fact, that women could not merely come alive as autonomous sexual beings without also developing ways of defending themselves against groping professors. He might even have predicted that this defense would at times grow absurd, that it would seek regimentation not only for physical but for verbal relations, that it would create a vocabulary of misunderstanding so dense it may take the passing of an entire generation before men and women can speak to one another again.
That all this might have been predicted in no way suggests that Roth is wrong to raise his voice in protest. It is striking, indeed, that a writer forever accused of it has now turned himself so vehemently against vulgarity--against the very leveling and coarsening of our conversation. Toward the end of The Dying Animal, Roth's former lover is beset by tragedy: "She began telling me about how foolish all her little anxieties of a few months back now seemed, the worries about work and friends and clothes, and how this had put everything in perspective," says Kepesh, "and I thought, No, nothing puts anything in perspective."
No, because there is no privileged view, no heights from which to look. This is the endpoint of the nihilist's wisdom. And Roth, after a circle of great radius, comes again to look like Tolstoy, like a writer who turns the light of his reason upon all the expressions and conventions by which we thoughtlessly live. How out of place he seems at a time when most fiction, competent as it is, has taken to being demure about its own necessity; when most writers are such professionals. Updike, DeLillo, Pynchon, of his generation, are all at least as talented as Roth; DeLillo is as timely, as ready to philosophize and to use the word "America." But no one is as urgent, as committed to the communication of his particular human truth.
The Dying Animal is not a great work in the way that The Human Stain, American Pastoral, Operation Shylock and, especially, Sabbath's Theater were great works. But it completes the picture--the picture of what a writer can be. Where DeLillo's recent novella, The Body Artist, was remarkable for its departure from his customary mode, The Dying Animal is remarkable for its fealty to the ground Roth has always worked. It cedes nothing, apologizes for nothing; it deepens, thereby, the seriousness of all his previous books.
"Because [sex] is based in your physical being, in the flesh that is born and the flesh that dies," says Kepesh.
Only [during sex] are you most cleanly alive and most cleanly yourself. It's not the sex that's the corruption--it's the rest. Sex isn't just friction and shallow fun. Sex is also the revenge on death. Don't forget death. Don't ever forget it. Yes, sex too is limited in its power. I know very well how limited. But tell me, what power is greater?
You could answer (virtuous reader), as you have answered Roth so many times before, that art, and its promise of eternity, is greater; or politics, and its promise of justice, is greater; or religion, and its promise of spiritual peace, is more powerful. You could answer Roth thus, but one of you would have to be lying.
Once in a while you come across a book that is so original, so persuasive, so meticulously researched and documented that it overrides some of your most taken-for-granted assumptions and beliefs. Devices and Desires is such a work. The author, Andrea Tone, associate professor of history at Georgia Tech, belongs to a small band of new historians who are reassessing the lives of nineteenth-century women through attention to their personal (and I do mean very personal) health aids. An earlier example would be Rachel Maines's The Technology of Orgasm, published by Johns Hopkins in 1999, which describes and illustrates the 1880s-style vibrators that doctors freely used in their offices--and women in their homes--for relief of pelvic congestion and the female "hysteria" associated with it.
Devices and Desires opens in 1873 when, through the machinations of Anthony Comstock--star agent for the New York Society for the Suppression of Vice (NYSSV)--Congress unexpectedly voted to make contraceptives illegal. Many Americans disapproved, and when the news reached Ireland, George Bernard Shaw coined the word "Comstockery," which, he predicted, would become "the world's standing joke at the expense of the United States."
It may be that talk of the new law made contraception known to some folks who had never heard of it before. It may be that, as with Eve's forbidden fruit, the ban made pregnancy prevention seem more alluring or naughty--or more fun. Then too, the "bootleg" business environment that ensued was relatively welcoming to entrepreneurial immigrants, smart single mothers with families to support and other ambitious "outsiders." "As with condoms," Tone observes, "creating diaphragms was easy and inexpensive, an ideal venture for those with little money and a penchant for risk." In any case the business of contraception flourished in the Comstock era, embracing scores of diverse devices and spermicides for women and men. Hundreds if not thousands of small entrepreneurs and distributors profited, as did an impressive handful of industrial giants, including the arch-hypocrite Samuel Colgate, millionaire heir to the New Jersey-based soap firm, who served as president of Comstock's NYSSV while openly promoting Vaseline to "destroy spermatozoa, without injury to the uterus or vagina."
Other well-established companies that made, distributed and freely advertised contraceptives--ranging from intrauterine devices (IUDs) to vaginal pessaries (appliances intended to support the uterus that could also prevent the passage of sperm), and from douching syringes, suppositories and foaming tablets to sponges and male caps--included some still familiar names: B.F. Goodrich, Sears, Roebuck & Co. and Goodyear. "The B.F. Goodrich Company," notes Tone, "manufactured three soft-rubber IUDs--one pear- and two donut-shaped, each available in five sizes--and twelve hard-rubber models. Two of the latter models were one-size-fits-all rings." Physicians were leading players in the commercialization of mass-produced IUDs--constructed from rubber, metal, ivory and even wood--although some models were promoted for do-it-yourself insertion.
Tone's exhaustive research led her--like an ace detective or shoe-leather crime reporter--through an eight-year coast-to-coast investigation of Post Office Department records, Federal Trade Commission transcripts (some with decaying diaphragms and condoms glued to the pages), American Medical Association (AMA) Health Fraud Archives, records of the NYSSV, credit reports from nineteenth-century Dun and Co. collections, patents, love letters, arrest records, trial records, advertisements and trade catalogues--as well as "entrapment" letters, some drafted by Comstock himself. (He or another agent would pose as being in desperate need of birth control, get the goods and make the arrest.) Established companies, Tone discovered, run by "honest, brave men" who supported Comstock and NYSSV, were never targets for such treatment, which was reserved for smaller entrepreneurs--especially immigrants ("sly" Jews) and women ("old she villains").
Even so, many of those arrested were let off or punished lightly, while the entrapping agents and prosecutors ran the risk of being scolded and humiliated by judges and juries who doubted the advisability, and constitutionality, of such far-reaching Congressional interference into personal matters. Over time, what Tone calls a "zone of tolerance" was created to buffer the flourishing contraceptives trade and its practitioners. In fact, Shaw's prediction that Comstockery would become a "standing joke" was soon realized here in the United States, even, it would seem, in Congress itself.
The hugely ambitious Comstock Act, however, was hardly about contraceptives alone. It stated:
No obscene, lewd or lascivious book, pamphlet, picture, paper, print or other publication of an indecent character, or any article or thing designed to be intended for the prevention of conception or the inducing of abortion, nor any article or thing intended or adapted for any indecent or immoral use or nature, nor any written or printed card, circular, book, pamphlet, advertisement or notice of any kind giving information, directly or indirectly, where, or how, or of whom, or by what means either of the things mentioned may be obtained or made...shall be carried in the mail.
You do the math. A small army of Post Office inspectors (known as special agents) was required to enforce such an effort. But Congress refused to drum up a serious budget for the measure when the Comstock Act went into effect, and made light of the ambitious national program by raising the number of inspectors--nationwide--from fifty-nine to only sixty-three. Looking at the postal arrest figures from May 1875 through April 1876, Tone counted a total of 410 apprehensions, of which only twenty-seven were for violations of the Comstock law.
Nonetheless, in later generations we took it for granted that from passage of the Comstock Act until the post-World War I rise of Margaret Sanger, the average American had little or no access to what we now call "family planning" (a term, Tone informs us, suggested in the 1960s by Malcolm X--"because Negroes [are] more willing to plan than to be controlled"). And while it's true that some of the contraceptives available at the time were ineffective, dangerous or both, others, including condoms, cervical caps (apt to be euphemistically advertised as pessaries in the Comstock era), diaphragms, sponges and some spermicides were often pretty good and relatively inexpensive. Tone notes that in this sea of alternatives many determined wives and husbands doubled, tripled or quadrupled on protection. Given this environment, it's not surprising that after 1880 the national fertility rates for both white and black women declined rapidly, reaching an all-time low in 1940--twenty years before Enovid, the first birth-control pill, came to market and only three years after the AMA's 1937 resolution to "endorse" contraception and recommend it for inclusion in the standard medical school curriculum.
Between 1880 and 1940 the average fertility rate of whites dropped from 4.4 children per woman to 2.1. For blacks it dropped from 7.5 children to 3. Given these incontrovertible facts--a flourishing contraceptives industry paired with a steady decline in births--how could we have come to believe otherwise, that our great and great-great grandmothers were, so to speak, up the fertility creek without a paddle?
Some of the historical distortion must be attributed to the work of Margaret Sanger, who originally dreamed of female empowerment through woman-oriented contraceptive technologies. She viewed birth control as a woman's right and responsibility, and wrote in 1922 that "the question of bearing and rearing children is the concern of the mother and potential mother.... No woman can call herself free who does not own and control her own body." Condoms "compromised this objective by placing women's procreative destiny in men's hands." Until her death in 1966 Sanger promoted the manufacture first of diaphragms and later the pill, never quite answering objections from other feminists--in the 1920s and again in the 1960s--that this transferred power over women's bodies to doctors who were overwhelmingly (in the case of gynecologists, 97 percent) male. Sanger came to believe so strongly in medically controlled contraception that in a 1952 letter she stated that her greatest achievement had been "to keep the movement strictly and sanely under medical auspices."
This was an about-face from her earlier position. In her extraordinary 1915 pamphlet, Family Limitation, a home guide to contraception, Sanger, as Tone explains, "envisioned a world of grassroots birth control where women from all walks of life could use contraceptives without reliance on doctors, a populist approach she would soon abandon." Family Limitation discussed douches, condoms and cervical caps. (The essential difference between a pessary or cap and a diaphragm is that the generally thimble-shaped caps fit over the cervix by suction and are less likely to be displaced. The diaphragm, however, more or less divides the vagina vertically into two compartments, protecting the cervix from the arena where sperm is deposited. Both methods can benefit from outside help with fitting and correct technique, but the cap has a better record of over-the-counter success, and was long distributed by this means in France, England and the United States.) Sanger ultimately recommended caps, which she felt could be most easily and discreetly used and controlled by women. She distributed 100,000 copies of her pamphlet, imploring women to learn how to insert caps into their own bodies and then to "teach each other" how to as well.
When Sanger and her sister Ethel opened their first clinic in 1916 they instructed women, eight at a time, on how to use over-the-counter (OTC) contraceptives, including condoms, suppositories and rubber pessaries. When police, inevitably, raided the clinic, they found boxes of Mizpah pessaries. An effective OTC contraceptive, this flexible rubber cap was sold by druggists and mail-order vendors for the alleged purpose of treating such medical conditions as a displaced (or prolapsed) uterus. But as Tone writes: "Family Limitation got Sanger into more trouble. In 1915, she found herself back in Europe dodging American law while continuing her contraceptive education.... The trip across the Atlantic was risky. War had broken out." Back home, her husband, William Sanger, had his own problems. And as fate would have it, so did Anthony Comstock. William Sanger had been arrested by Comstock for distributing Family Limitation. And Comstock, who caught a cold in the courtroom during the trial, died soon after of pneumonia.
After her divorce from William, Margaret admitted she was looking for "a widower with money." James Noah Henry Slee, twenty years her senior, was "a well-heeled member of Manhattan's business elite...part of the same establishment Sanger had vilified in her younger, more radical years." They married in 1922 and with his backing, Tone explains,
Sanger embarked on a new chapter of her career, one that distanced the birth control movement from its radical origins and placed it on a more conservative path.... She recognized...that medical science enjoyed increasing prestige and political clout...she sought birth control allies through an ideology that trumpeted women's health over their civil liberties and cast doctors, not patients, as agents of contraceptive choice.
Sanger switched her preference to the diaphragm, particularly the Holland-Rantos brand, which sold exclusively to doctors. (This company, established in 1925, was funded by none other than Mr. Margaret Sanger, James Noah Henry Slee.) Sanger next prevailed on her besotted bridegroom to hire a distinguished, high-salaried doctor to promote their new company:
1925 is to be the big year for the break in birth control...the medical profession will take up the work...I shall feel that I have made my contribution to the cause and...I can withdraw from full-time activity.... If I am able to accomplish this victory...I shall bless my adorable husband, JNH Slee, and retire with him to the garden of love.
Sanger did not retire. In the following years she worked ceaselessly toward her goal of getting the AMA to endorse birth control. Her "signature" story, often bringing audiences to tears, concerned Sadie Sachs, a young immigrant mother of three, married to Jake, a truck driver. When Sadie begged a doctor to give her birth control he cruelly retorted, "Tell Jake to sleep on the roof." Sadie later died of septicemia following a self-induced abortion. Sanger was now in the business of helping the public forget that some of the widely available OTC methods worked very well for many people. As Tone points out, if Sadie could afford a physician visit, she could surely afford the far lower price of a contraceptive.
In addition to the move toward medicalization, our collective memory may have been dealt a brainwashing by panic-driven "eugenicists." As Sanger moved up socially she supported birth control for some elitist reasons, such as "the facilitation of the process of weeding out the unfit [and] of preventing the birth of defectives." But this was mild compared with the phobic reasoning of some of our greatest national leaders, who also feared the newcomers from Europe. Falling birthrates among our native born and the widespread immigration of foreigners from southern and eastern Europe (over 23 million people arrived on America's shores between 1880 and 1920) led Teddy Roosevelt to warn in 1912 that if middle-class American women used fertility control it "means racial death." In 1927 Supreme Court Justice Oliver Wendell Holmes, our great champion of civil liberties, stunned many of his admirers when, in Buck v. Bell, he agreed to uphold a Virginia eugenics statute legalizing the coerced sterilization of "socially inadequate persons." Carrie Buck, the plaintiff, was young, single and white, the "daughter of an imbecile," the mother of an "illegitimate feeble minded child." Holmes agreed to the cutting of Buck's fallopian tubes, proclaiming, "Three generations of imbeciles are enough." Tone adds that during the Nuremberg trials following World War II, accused Nazi war criminals cited Buck v. Bell to justify the forced sterilization of some 2 million Germans.
Here in the United States, the eugenics and population control movements promoted--and continue to promote--the need to develop contraceptives that take prescription (and often removal) out of the woman's hands. For example, in interviews with 686 low-income users of Norplant--a hormonal contraceptive, intended to last for five years, that consists of six matchstick-sized capsules implanted in a woman's arm--researchers at Columbia University's Center for Population and Family Health learned that 40 percent anticipated or experienced "cost barriers" that could impede the removal of Norplant. They urge that family-planning clinics "follow a policy of Norplant removal on demand, regardless of the patient's ability to pay." Some feminists charge that the effectiveness of OTC methods (carefully used) is still downplayed in quasi-official figures, a dangerous public health mistake in this age of galloping STDs.
Meanwhile, the effectiveness of doctor-controlled methods has been exaggerated, as the FDA has acknowledged. Previously, it gave out "ideal" figures for oral-contraceptive effectiveness, in contrast to discouraging clinic "use" figures for barrier methods. In the new round of product labeling this has been partially corrected; actual-use figures for the Pill are placed in a truthful range of 92-95 percent, not at the falsely optimistic 99 percent-plus.
Devices and Desires is replete with riveting histories of women and men who labored--legally and illegally--in the ever-challenging arena of conception control, from the Comstock era through today, and includes portraits of the men who developed Enovid, the first pill, as well as those behind the notorious IUD, the Dalkon shield. Those who read this fascinating book will have a far keener and more credible sense of what has happened and where we are now. Most women are still unsatisfied with their contraceptive choices, and as Tone concludes,
It is ironic that in a post-Roe v. Wade world that celebrates reproductive choice, the most frequently used contraceptive in the country--by a wide margin--is female sterilization. In a very real sense Americans are still waiting for the heralded "second contraceptive revolution" to arrive.... In the absence of universal health care or prescription drug coverage, one way out of the contraceptive conundrum may be the development of more affordable over-the-counter methods, which would increase men's and women's options without tethering contraceptives to the medical marketplace from which millions are excluded.... Today to meet the needs of women and men who lack sufficient resources, we must supplement reliable medical methods with inexpensive over-the-counter options.
In the 1960s, the introduction of the Pill, revival of the ever-treacherous IUD and "stealth sterilization" of welfare moms--whose tubes were tied, without permission, after they gave birth--placed contraception still more firmly in doctors' hands. In the parlance of that decade the "greasy kid stuff," including condoms, was left in the dust. Because of overpopulation fears, the new technologies enjoyed a diplomatic immunity--at women's expense. At an annual meeting of medical school deans, Nobel laureate Dr. Frederick Robbins declared, "The dangers of overpopulation are so great that we may have to use certain techniques of conception control that may entail considerable risk to the individual woman." The original Pills contained 150 micrograms of estrogen; today we know that 20 suffices. Millions of women served as guinea pigs for high-dose pills and IUDs, and thousands died. In 1970, at Senate hearings, Dr. Louis Hellman, chairman of the FDA advisory committee that twice declared the Pill safe, admitted that in his equation of benefit versus risk he "put population first, before benefits to the individual woman's health."
As women demanded to "take our bodies back" from deceitful doctors, the spirit of Comstock rose up again. In 1973 Our Bodies, Ourselves, Ellen Frankfort's Vaginal Politics and my book Free and Female were banned in Cleveland and Washington, DC. In 1979 shipments of cervical caps that independent women's self-help clinics imported from England were seized by FDA agents. Senator Ted Kennedy helped to get the caps released, but the FDA restricted them to "investigational device" status (ludicrous--the same caps had been in continuous use in England for a century), thus subverting the grassroots revival of the cap. Who can say who put the FDA up to this? Perhaps some future Andrea Tone-type women's history sleuth will get to the bottom of it.
Meanwhile, one of my hopes for Tone's extraordinary book is that it might encourage many people--men as well as women--to reconsider the barrier methods, respect them more and possibly learn to enjoy them, as some say they do. In contrast to the steady decline of teenage pregnancy, the epidemic of sexually transmitted diseases in young adults is increasing at a truly alarming rate. For example, estimates are that 46 percent of female college students are now infected with human papilloma virus (HPV), which can cause both genital warts and cervical cancer. Are student health services reliably advising their clients of this? My informants say no.
American intellectuals love the higher gossip because it gives intellectual life here--ignored or sneered at by the public--a good name. Sensational anecdotes (Harvard's Louis Agassiz getting caught in flagrante Clinton), tart one-liners (Oliver Wendell Holmes's crack that Dewey wrote as "God would have spoken had He been inarticulate") and stark biographical details about influential thinkers (William Lloyd Garrison's habit of burning copies of the Constitution at his public appearances) do more than illuminate thought, explain impulses and entertain. In the right hands, they create solidarity with the rest of modern consumer and media culture, injecting the sizzle of boldface revelation into respectable scholarly work.
What red-white-and-blue-blooded man or woman of letters can resist the news that Holmes made his family practice fire drills in which the satchel with his new edition of Kent's Commentaries on American Law was to be evacuated from the house first? Or Alice James's verdict on her brother William that he was "just like a blob of mercury, you cannot put a mental finger upon him"--a man so pluralist all the way down that he resented the notion that everyone should spell the same way? Don't the tales of Charles Sanders Peirce's blatant philandering with a teen belle, his inability to finish manuscripts, his erratic disappearances when scheduled to teach, his failure to include return addresses on requests for money, the impulsive sale of his library to Johns Hopkins, his flamboyant hiring of a French sommelier to give him lessons on Medoc wine in the midst of financial chaos--provide the pizazz of a stellar film, while also giving further force to traditional questions about genius and madness?
These are our cerebral celebrities, after all. For modern American intellectuals suckled on the concrete like their everyday peers--for whom even a paragraph of "abstract" blather is a signal to put the headphones back on, grab a magazine, tune out--such perky additives are necessary. But bringing the higher gossip to American philosophy--the Death Valley of American humanities, when it comes to literary style--is a uniquely forbidding matter. For every Richard Rorty whose unabashed colloquial style reveals he's a native speaker of American English, legions of disciplinary weenies, raised in exotic places like Pittsburgh and Palo Alto, stultify the subject by writing in a stilted English as a second jargon. To entrenched American philosophy types still bound to the flat prose of logical positivism (even after ditching its assumptions), anecdotes, biographical details and colorful examples remain a foreign rhetoric: irrelevant information properly left to bios of the canonized dead by scholars from second-rate schools, but no part of the laughable research programs of conceptual analysis they pursue.
Louis Menand enters this arid terrain with sainted credentials and connections. Having begun as a work-one's-way-up English professor, Menand, now at City University of New York, ranks as the crossover star of his academic generation, a bi-Manhattan emissary between campus and media whose prose travels only first-class, the public intellectual whose pay per word every public intellectual envies. In the media capital of the last superpower, where thousands of professors undoubtedly think they, too, with a little Manhattan networking, could be a contributing editor (and editor heir apparent) of The New York Review of Books, or staff writer at The New Yorker, or contributor to The New Republic, Menand has actually pulled it off as he works out whether he wants to be Edmund Wilson or Irving Howe, or just Luke Menand. Let the naysayers sulk. A few years back, to the annoyance of some careerists in American philosophy, he got the nod to edit a Vintage paperback edition of classic American pragmatists despite outsider status in the field. The specialists who carped about that choice will not be happy to welcome The Metaphysical Club, unless they welcome redemption.
Here, in the major book of his career so far, Menand brings his exquisite literary and philosophical talents together to invent a new genre--intellectual history as improv jazz. In it, Alex Haley and Arthur Lovejoy seem like sidemen jamming in the background as Menand samples family genealogy, battlefield coverage, popular science writing and philosophical exposition to tell "a" story (the indefinite article is key) of how pragmatism, the now consecrated philosophy of the United States, riffed its way to prominence through the art of four philosophical geniuses: Holmes, James, Dewey and Peirce. The Metaphysical Club, Menand warns in his preface, is "not a work of philosophical argument" but one of "historical interpretation." Just so. In that respect, it belongs to the grand tradition of American intellectual history staked out by V.L. Parrington, Merle Curti, Max Lerner and Richard Hofstadter. Yet true to the pragmatist spirit, Menand aims "to see ideas as always soaked through by the personal and social situations in which we find them." His overview of pragmatism's evolution and triumph, told mainly through the lives of his four horsemen of absolutist philosophy's apocalypse, integrates textured biography and superlative storytelling to an extraordinary degree (though a seeming cast of thousands gets walk-on roles, too). "I had no idea, when I started out," explains Menand in his acknowledgments, "how huge a mountain this would be to climb." If so, he deserves a Sir Edmund Hillary Award for sustained commitment to an extreme sport. All four of the familiar figures he focuses on have been "biographied" to death, often at massive length. Menand's excellent syntheses of secondary works and primary materials demonstrate exactly how steeped he became in the materials.
Menand's combination of dogged historical research--almost daring the reader to dispute the representational accuracy of his story among stories--with an unapologetic literary stylishness makes The Metaphysical Club a page-turning pleasure to read. Yet it also forces one toward two sharply different judgments: one literary, the other philosophical (in a perhaps antiquated sense) and historical.
As a literary effort, a daring act of bringing the narrative magic of a Tracy Kidder or Tom Wolfe to thinkers who largely lived on their keisters while reading and writing intellectual prose, The Metaphysical Club is a masterpiece of graceful interpretation. Menand's sly wit and reportorial hijinks, his clarity and rigor in making distinctions, his metaphorical gift in driving home pragmatist points make The Metaphysical Club this summer's beach read for those who relax by mulling the sands of time. If one takes Menand at his pragmatist word--that this is just one "story of ideas in America" that does not preclude other narratives--there's little to complain about. On a Rortyan reading of the book, the type Menand plainly invites (there's less space between Rorty's and Menand's views of pragmatism than between Britannica volumes on a tightly packed shelf), the right question to ask is not "Does Menand have the story right?" but "Is this the best story for us Americans in achieving our purposes?"
At the same time, if one retains a shade of the representational approach to the world that pragmatists largely disdain--the notion that America's intellectual history did happen one way and not another--one can't help rejecting Menand's fundamental organizational claim that the Civil War (as he states in his preface) "swept away almost the whole intellectual culture of the North." It's a belief expeditiously assumed because it smooths the post-Civil War story he chooses to tell. At one point late in The Metaphysical Club, while writing of the now largely forgotten political scientist Arthur Bentley, Menand describes James Madison as "a writer to whom Bentley strangely did not refer." One might say almost the same, in Menand's case, regarding the father of the Constitution, whose devices for accommodating factions in the structure of democracy were at least as pragmatically shrewd as Holmes's neutralist dissents in jurisprudence. And one might say it in regard to Benjamin Franklin, that larger-than-life proto-pragmatist who gets only a single mention as the great-grandfather of one Alexander Dallas Bache. Franklin, to be sure, is not a figure helpful to Menand's project, given the author's premise that there was "a change in [the country's] intellectual assumptions" because it "became a different place" after the Civil War. But a closer look at the story Menand tells helps explain why.
His method throughout The Metaphysical Club is to toss out the genetic fallacy and explain, in wonderful set pieces, how the experiences of his four protagonists drove them to the views they eventually held as magisterial thinkers. In Part One, devoted to the young Holmes, Menand thus laces the history of antebellum abolitionism and the politics of slavery through Holmes's own trials of conscience before his Civil War service. Holmes's story serves as a model of how Menand finds an internal pragmatist evolution in each of his leading characters. The future giant of American jurisprudence, Menand reports in graphic detail, witnessed an extraordinary amount of fighting and carnage in the Civil War. At the 1861 battle of Ball's Bluff, he took a rifle bullet just above the heart, but survived. In 1862, at the horrific battle of Antietam, where the Union suffered 13,000 casualties, he took a bullet in the neck, but again survived. In 1863, at a battle known as Second Fredericksburg, enemy fire struck his foot. He returned to Boston and the grim reaper didn't get him until 1935, when he was 93, a retired Supreme Court Justice and the most distinguished jurist in the country. But the war, Menand writes, "had burned a hole... in his life."
In a notebook Holmes kept during the war, the young soldier entered a phrase to which Menand calls special attention: "It is curious how rapidly the mind adjusts itself under some circumstances to entirely new relations." Holmes's experiences taught him, Menand writes, that "the test of a belief is not immutability, but adaptability." During the war, Menand maintains, Holmes "changed his view of the nature of views."
"The lesson Holmes took from the war," Menand continues, "can be put in a sentence. It is that certitude leads to violence." And so even though Holmes never accepted pragmatism as his official party affiliation, believing it a Jamesian project to smuggle religion back into modern scientific thought, he'd come to share one of its tenets: rejection of certainty. The whole of his subsequent judicial life, Menand contends, became an attempt to permit different views to be democratically heard in the marketplace of ideas and policy.
Too simple? Too slim a reed to sustain the view that Holmes's turn against certainty (exemplified by antebellum abolitionism) came as an adaptive response to a life in which certainty spurred violence--one more Darwinian twist in a story replete with Darwinian themes? Menand's evidence is substantial. Holmes never tired of telling war and wound stories. He "alluded frequently to the experience of battle in his writings and speeches." After his death, Menand reports, "two Civil War uniforms were found hanging in his closet with a note pinned to them. It read: 'These uniforms were worn by me in the Civil War and the stains upon them are my blood.'"
Menand finds a similar evolution documented in James. Famously fragile in his emotions, and a legendary procrastinator, James came to believe that "certainty was moral death." Rather, he thought, the ability and courage to bet on a conception of truth before all the evidence was in amounted to the best test of "character." That remarkably open mind, Menand relates, grew, like Holmes's resistance to dogmatism, out of experiences, such as the "international hopscotch" that family patriarch Henry Sr. imposed on his children's educations by yanking them out of one school after another.
"The openness that characterized both the style and the import of his writings on pragmatism," Menand writes of William James, "seemed to some of his followers to have been specifically a consequence of his disorganized schooling." Similarly, James's close work with Agassiz on the naturalist's famous "Thayer expedition" down the Amazon in the 1860s taught James that "everything we do we do out of some interest," a tenet crucial to pragmatism. Menand suggests that meditations on Brazilian Indians ("Is it race or is it circumstance that makes these people so refined and well bred?" James asked in a letter) may have begun James's relational thinking. Alluding to such influences, Menand concludes, "It seems that Brazil was to be, in effect, his Civil War."
By the time the author gets to Peirce, in Part Three, and Dewey, in Part Four, his entertaining method is in full swing. Menand portrays the pragmatism of his foursome, with their individual idiosyncrasies, as the consequence of experience-driven epiphanies, with epiphany playing the role in intellectual development that chance adaptive mutation plays in what once was considered "lower" biological development. Giraffes get longer necks--Americans get pragmatism.
Peirce proves the most challenging of Menand's subjects because he remained unpredictable and dysfunctional. The son of Benjamin Peirce, professor of mathematics at Harvard at the age of 24 and "the most massive intellect" Harvard president A. Lawrence Lowell claimed ever to have met, he had a lot to live up to. But Peirce suffered from painful facial neuralgia and turned to opium, ether, morphine and cocaine over his lifetime to ease the suffering. Violence and infidelity complicated the picture further--Peirce spent many years trying unsuccessfully to regain the brief foothold in academe he'd achieved during a short teaching stint at Johns Hopkins. With Peirce, Menand takes us through a famous nineteenth-century probate action, known as the "Howland will case," in which Benjamin Peirce testified, with behind-the-scenes help from his son, about the probability of a forged signature. A fascinating set piece, it's also Menand's inspired way of backgrounding the younger Peirce's involvement with the increasing importance of probability theory in the nineteenth century.
Peirce's work with "the law of errors," which "quantified subjectivity," was just one experience that drove him to pragmatist views. In time, writes Menand, Peirce came to believe both that "the universe is charged with indeterminacy" and that it "makes sense." He held that "in a universe in which events are uncertain and perception is fallible, knowing cannot be a matter of an individual mind 'mirroring' reality.... Peirce's conclusion was that knowledge must therefore be social. It was his most important contribution to American thought." Only in this stretch does Menand come to the title subject of his book: "The Metaphysical Club," an informal discussion group that Peirce, James and Holmes attended for perhaps nine months in 1872. There the idea that Menand considers a central link among the three, and fundamental to pragmatism--that ideas are not Platonic abstractions but tools, like forks, for getting tasks accomplished in the world--took articulate form for the first time. Here, as elsewhere, Menand evokes the atmosphere and supporting actors of the setting through fine orchestration of detail. He smoothly recovers the mostly forgotten Chauncey Wright, another man who learned in the Civil War that "beliefs have consequences." Wright used weather as his favorite example, and the "notion of life as weather" became his emblematic position.
Finally, in exploring Dewey in Part Four, Menand follows pragmatism's clean-up hitter from Vermont childhood to early academic stints at Hopkins, Michigan and Chicago. Menand's two-tiered approach falters a bit here. When the camera is on Dewey, we see him wrestling with issues of Hegelianism and laissez-faire individualism, and drawing lessons from his laboratory school at Chicago ("if philosophy is ever to be an experimental science, the construction of a school is its starting point"). He gets the de rigueur epiphany--the evil of antagonism among social factions--personally from Jane Addams. He absorbs moral insights offered by the Pullman strike and articulates his own great priority within pragmatism, on democracy as a matter of social participation and cooperation, not just numbers and majorities. But here Menand's characteristic deep backgrounding, particularly on the genesis of the "Vermont transcendentalism" that was more conservative than the Boston variety, seems overmuch. For all of Menand's literary deftness, we sometimes wonder, when taking in the variations on French figures like Laplace, or Scottish ones like Robert Sandeman, whether we're listening to a wonderful stretch of intellectual exotica--fine improvisational solos--or music crucial to the story. At the same time, one of the book's undeniable pleasures is Menand's voyages into the estuaries of nineteenth-century intellectual history, from Agassiz's endorsement in the 1850s of polygenism (the claim that races were created separately, with different and unequal aptitudes), to the work of the Belgian mathematician Adolphe Quetelet, "a brilliant promoter of statistical methods" who called his approach "social physics." Menand's accounts of nineteenth-century America's intellectual debates, like his sketches of Darwinian thinking and its social ramifications, are models of efficient summary.
Their net effect, of course, is to show that pragmatist concepts--opposition to certainty, evolution toward probabilistic modes of thought--were in the air, and his four protagonists breathed deeply. To Menand's credit, given the compass of this biographical and sociological work, he keeps his eye on the enduring subject--pragmatism as a distinct mode of thought--showing the family resemblance in pragmatist epiphenomena of the time, from proximate cause in law to statistical understanding of the role of molecules in heat. His superbly syncretic summary, late in the book, of what he's found sounds less sweeping than the claims in his preface:
Pragmatism seems a reflection of the late nineteenth-century faith in scientific inquiry--yet James introduced it in order to attack the pretensions of late-nineteenth century science. Pragmatism seems Darwinian--yet it was openly hostile to the two most prominent Darwinists of the time, Herbert Spencer and Thomas Huxley.... Pragmatism seems to derive from statistical thinking--but many nineteenth-century statisticians were committed to principles of laissez-faire James and Dewey did not endorse.... Pragmatism shares Emerson's distrust of institutions and systems, and his manner of appropriating ideas while discarding their philosophical foundations--but it does not share his conception of the individual conscience as a transcendental authority.
"In short, pragmatism was a variant of many strands in nineteenth-century thought," writes Menand, "but by no means their destined point of convergence. It fit in with the stock of existing ideas in ways that made it seem recognizable and plausible: James subtitled Pragmatism 'A New Name for Some Old Ways of Thinking.'" So maybe it's not true that the Civil War "swept away almost the whole intellectual culture of the North." That judicious modesty makes it easier to note some of the oddities of Menand's choices, especially given the bold leaps he takes to find pragmatist principles in areas of knowledge far afield from traditional philosophy. Some, considering the prominent space and harsh spotlight he devotes to discussions of slavery and racism by nineteenth-century thinkers like Agassiz, are regrettable.
At times, for instance, Menand can seem more interested in patricians for patricians' sake--or Boston Brahmins for Brahmins' sake--than the tale requires. It's easy to feel that a story with more nineteenth-century black and feminist thinkers, and fewer Northeastern gentlemen, would be a better tale for understanding the development of American thought. Menand's maverick status with regard to philosophy, welcome in his syntactic verve and enthusiasm for complex biographical explanation, perhaps intimidated him in this regard. As an outsider, he arguably stays too respectful of professional philosophy's ossified white-man pantheon of American philosophy, despite the canon wars of his own field. Martin Delany, Frederick Douglass and Elizabeth Cady Stanton, for instance, ought to be recognized as part of the pragmatist tradition, whether they have been formally or not.
Yet while Menand briefly mentions Delany and his troubles in being accepted at Harvard, he presents him more as a victim (which he was) than a thinker. More happily, Menand does devote respectful attention to the black pragmatist Alain Locke late in the book. But the biggest surprise is that W.E.B. Du Bois, who surfaces about 400 pages into the book, gets short shrift--four pages. Du Bois's famous articulation, at the beginning of The Souls of Black Folk, of the question black people silently hear asked of them by too many whites--"How does it feel to be a problem?"--provocatively inverted the pragmatist problematic in a way Dewey and James never fully pondered in their model of (white) agents facing their environments: the problem of being a problem to others. One imagines Menand could have made fascinating arabesques out of that peculiarity.
Then, finally, there is the Franklin problem. It's often forgotten, in an era when Franklin's face stands for thrift and prudence in bank ads, that his reputation, as John Adams wrote in the early nineteenth century, was "more universal than that of Leibnitz or Newton, Frederick or Voltaire," that Jefferson viewed him as "the father of American Philosophy" and Hume agreed. Is a thinker who wrote in his Autobiography in 1784 that "perhaps for Fifty Years past no one has ever heard a dogmatical Expression escape me" far from pragmatism? In his emphases on experience, experimentation and community, Franklin was the proto-pragmatist par excellence. Even in the free-jazz genre of intellectual history, his absence is a large lacuna.
Pragmatism, however, offers special benefits to authors and reviewers. Once one abandons the idea that we mirror the world exactly with our stories, and takes the nervier view that we tell stories about it that may be good for us in the way of belief, the kind of criticism made here--that Franklin, Madison, Delany and other thinkers merit membership in that ironically named "Metaphysical Club"--assumes its humble place. The greater accomplishment--Menand's--is to show that powerfully experienced consequences form beliefs, that beliefs form consequences and that the whole circular process of life teems with blood and pain and laughter that expose the abstract approach of much professional philosophy for the self-interested charlatanism it is. Writing to his father about Agassiz, William James observed that "no one sees farther into a generalisation than his own knowledge of details extends." Accepted as a truism rather than a rebuke, the insight suggests that questions about Menand's choices represent rival stories--what James might have seen as another pluralist tale seeking airtime. Judged by James's standards--what difference it makes if this or that worldview is true--The Metaphysical Club casts a vast, brilliant light on the human subtleties of America's most influential philosophical achievement. It's a feast of canny wisdom and sophisticated entertainment, and one hopes Menand's already privileged position in the intellectual elite, and the envy of the specialists, won't muffle the sounds of celebration.
"You don't need a weatherman to know which way the wind blows," some sage once wrote. Just so. As this issue went to press, the Museum of International Folk Art, a state-run institution under the aegis of the Museum of New Mexico, finally decided--in a debate that had been raging since February--to allow a computerized image of Our Lady of Guadalupe to remain on display. The artwork's offense? Our Lady was clad less than demurely, in a bikini of roses. The photographer Renee Cox (Yo Mama) sparked a similar controversy in New York with a portrait of the artist as a youngish woman--standing in for Jesus, at the Last Supper--unimpeded by clothing. The mayor of this fair city, which likes to consider itself the nation's art capital, hastily appointed a commission to assess the decency of art appearing in publicly funded venues. The fey breezes of our "culture wars" continue to blow, in other words, and you don't need a wind sock, either, to suss out their direction.
All of the essays assembled here relate in some way to this aeolian theme, whether it's the roots of political conservatism in the gusty person of Barry Goldwater or the history of feminism; the concussive moment in Birmingham nearly two score years ago or the sexual revolution in fact and fiction; the home-grown philosophy of pragmatism or the emblematic figure who famously wrote that our answers, friend, are "Blowin' in the Wind." As Casey Nelson Blake argues in the lead essay in this collection, though, the notion of the artist as a prophetic seer of sorts needs some radical updating as well--it often seems a cause without rebels, in fact.
The reductio ad absurdum of the situation, despite the fact that the sails of our public life may at times appear to be swelled out in vigorous debate, is that what we are left arguing over is the fittingness of a bikini in a work of the imagination. Annette Funicello, where are you?
Which brings us to another work of the imagination, this one beached on the gritty shores of copyright law and its interpretation: Alice Randall's novel The Wind Done Gone. You might have been reading a discussion of it in this issue, or from the book itself, if a federal district court in Atlanta hadn't found it "piracy" a few weeks back, for borrowing characters and scenes from Gone With the Wind. Randall's novel is told from a slave perspective, and bears mention here because the commercial question--would the trusts that own Margaret Mitchell's copyright be damaged--should be considered against larger questions of the nature of artistic invention, the process of cultural embroidering and the understanding of what constitutes literature in the first place. It was E.M. Forster, I believe, who spoke of creating "word masses" that we call characters; if we have a different "word mass" with the same name and perhaps even many of the same attributes, is the inflection of feeling in the reader--the received idea of "character"--the same? If anything can leave us culturally becalmed, stuck in the fetid doldrums with bad art, literary or visual or any other kind, it is the cutting off of spaces in which to reimagine the world. We hope you'll find that the following essays create some, and help you do just that.