American intellectuals love the higher gossip because it gives intellectual life here--ignored or sneered at by the public--a good name. Sensational anecdotes (Harvard's Louis Agassiz getting caught in flagrante Clinton), tart one-liners (Oliver Wendell Holmes's crack that Dewey wrote as "God would have spoken had He been inarticulate") and stark biographical details about influential thinkers (William Lloyd Garrison's habit of burning copies of the Constitution at his public appearances) do more than illuminate thought, explain impulses and entertain. In the right hands, they create solidarity with the rest of modern consumer and media culture, injecting the sizzle of boldface revelation into respectable scholarly work.

What red-white-and-blue-blooded man or woman of letters can resist the news that Holmes made his family practice fire drills in which the satchel with his new edition of Kent's Commentaries on American Law was to be evacuated from the house first? Or Alice James's verdict on her brother William that he was "just like a blob of mercury, you cannot put a mental finger upon him"--a man so pluralist all the way down that he resented the notion that everyone should spell the same way? Don't the tales of Charles Sanders Peirce's blatant philandering with a teen belle, his inability to finish manuscripts, his erratic disappearances when scheduled to teach, his failure to include return addresses on requests for money, the impulsive sale of his library to Johns Hopkins, his flamboyant hiring of a French sommelier to give him lessons on Medoc wine in the midst of financial chaos--provide the pizazz of a stellar film, while also giving further force to traditional questions about genius and madness?

These are our cerebral celebrities, after all. For modern American intellectuals suckled on the concrete like their everyday peers--for whom even a paragraph of "abstract" blather is a signal to put the headphones back on, grab a magazine, tune out--such perky additives are necessary. But bringing the higher gossip to American philosophy--the Death Valley of American humanities, when it comes to literary style--is a uniquely forbidding matter. For every Richard Rorty whose unabashed colloquial style reveals he's a native speaker of American English, legions of disciplinary weenies, raised in exotic places like Pittsburgh and Palo Alto, stultify the subject by writing in a stilted English as a second jargon. To entrenched American philosophy types still bound to the flat prose of logical positivism (even after ditching its assumptions), anecdotes, biographical details and colorful examples remain a foreign rhetoric: irrelevant information properly left to bios of the canonized dead by scholars from second-rate schools, but no part of the laughable research programs of conceptual analysis they pursue.

Louis Menand enters this arid terrain with sainted credentials and connections. Having begun as a work-one's-way-up English professor, Menand, now at City University of New York, ranks as the crossover star of his academic generation, a bi-Manhattan emissary between campus and media whose prose travels only first-class, the public intellectual whose pay per word every public intellectual envies. In the media capital of the last superpower, where thousands of professors undoubtedly think they, too, with a little Manhattan networking, could be a contributing editor (and editor heir apparent) of The New York Review of Books, or staff writer at The New Yorker, or contributor to The New Republic, Menand has actually pulled it off as he works out whether he wants to be Edmund Wilson or Irving Howe, or just Luke Menand. Let the naysayers sulk. A few years back, to the annoyance of some careerists in American philosophy, he got the nod to edit a Vintage paperback edition of classic American pragmatists despite outsider status in the field. The specialists who carped about that choice will not be happy to welcome The Metaphysical Club, unless they welcome redemption.

Here, in the major book of his career so far, Menand brings his exquisite literary and philosophical talents together to invent a new genre--intellectual history as improv jazz. In it, Alex Haley and Arthur Lovejoy seem like sidemen jamming in the background as Menand samples family genealogy, battlefield coverage, popular science writing and philosophical exposition to tell "a" story (the indefinite article is key) of how pragmatism, the now consecrated philosophy of the United States, riffed its way to prominence through the art of four philosophical geniuses: Holmes, James, Dewey and Peirce. The Metaphysical Club, Menand warns in his preface, is "not a work of philosophical argument" but one of "historical interpretation." Just so. In that respect, it belongs to the grand tradition of American intellectual history staked out by V.L. Parrington, Merle Curti, Max Lerner and Richard Hofstadter. Yet true to the pragmatist spirit, Menand aims "to see ideas as always soaked through by the personal and social situations in which we find them." His overview of pragmatism's evolution and triumph, told mainly through the lives of his four horsemen of absolutist philosophy's apocalypse, integrates textured biography and superlative storytelling to an extraordinary degree (though a seeming cast of thousands gets walk-on roles, too). "I had no idea, when I started out," explains Menand in his acknowledgments, "how huge a mountain this would be to climb." If so, he deserves a Sir Edmund Hillary Award for sustained commitment to an extreme sport. All four of the familiar figures he focuses on have been "biographied" to death, often at massive length. Menand's excellent syntheses of secondary works and primary materials demonstrate exactly how steeped he became in his sources.

Menand's combination of dogged historical research--almost daring the reader to dispute the representational accuracy of his story among stories--with an unapologetic literary stylishness makes The Metaphysical Club a page-turning pleasure to read. Yet it also forces one toward two sharply different judgments: one literary, the other philosophical (in a perhaps antiquated sense) and historical.

As a literary effort, a daring act of bringing the narrative magic of a Tracy Kidder or Tom Wolfe to thinkers who largely lived on their keisters while reading and writing intellectual prose, The Metaphysical Club is a masterpiece of graceful interpretation. Menand's sly wit and reportorial hijinks, his clarity and rigor in making distinctions, his metaphorical gift in driving home pragmatist points make The Metaphysical Club this summer's beach read for those who relax by mulling the sands of time. If one takes Menand at his pragmatist word--that this is just one "story of ideas in America" that does not preclude other narratives--there's little to complain about. On a Rortyan reading of the book, the type Menand plainly invites (there's less space between Rorty's and Menand's views of pragmatism than between Britannica volumes on a tightly packed shelf), the right question to ask is not "Does Menand have the story right?" but "Is this the best story for us Americans in achieving our purposes?"

At the same time, if one retains a shade of the representational approach to the world that pragmatists largely disdain--the notion that America's intellectual history did happen one way and not another--one can't help rejecting Menand's fundamental organizational claim that the Civil War (as he states in his preface) "swept away almost the whole intellectual culture of the North." It's a belief conveniently assumed because it smooths the post-Civil War story he chooses to tell. At one point late in The Metaphysical Club, while writing of the now largely forgotten political scientist Arthur Bentley, Menand describes James Madison as "a writer to whom Bentley strangely did not refer." One might say almost the same, in Menand's case, regarding the father of the Constitution, whose devices for accommodating factions in the structure of democracy were at least as pragmatically shrewd as Holmes's neutralist dissents in jurisprudence. And one might say it in regard to Benjamin Franklin, that larger-than-life proto-pragmatist who gets only a single mention as the great-grandfather of one Alexander Dallas Bache. Franklin, to be sure, is not a figure helpful to Menand's project, given the author's premise that there was "a change in [the country's] intellectual assumptions" because it "became a different place" after the Civil War. But a closer look at the story Menand tells helps explain why.

His method throughout The Metaphysical Club is to toss out the genetic fallacy and explain, in wonderful set pieces, how the experiences of his four protagonists drove them to the views they eventually held as magisterial thinkers. In Part One, devoted to the young Holmes, Menand thus laces the history of antebellum abolitionism and the politics of slavery through Holmes's own trials of conscience before his Civil War service. Holmes's story serves as a model of how Menand finds an internal pragmatist evolution in each of his leading characters. The future giant of American jurisprudence, Menand reports in graphic detail, witnessed an extraordinary amount of fighting and carnage in the Civil War. At the 1861 battle of Ball's Bluff, he took a rifle bullet just above the heart, but survived. In 1862, at the horrific battle of Antietam, where the Union suffered 13,000 casualties, he took a bullet in the neck, but again survived. In 1863, at a battle known as Second Fredericksburg, enemy fire struck his foot. He returned to Boston and the grim reaper didn't get him until 1935, when he was 93, a retired Supreme Court Justice and the most distinguished jurist in the country. But the war, Menand writes, "had burned a hole... in his life."

In a notebook Holmes kept during the war, the young soldier entered a phrase to which Menand calls special attention: "It is curious how rapidly the mind adjusts itself under some circumstances to entirely new relations." Holmes's experiences taught him, Menand writes, that "the test of a belief is not immutability, but adaptability." During the war, Menand maintains, Holmes "changed his view of the nature of views."

"The lesson Holmes took from the war," Menand continues, "can be put in a sentence. It is that certitude leads to violence." And so even though Holmes never accepted pragmatism as his official party affiliation, believing it a Jamesian project to smuggle religion back into modern scientific thought, he'd come to share one of its tenets: rejection of certainty. The whole of his subsequent judicial life, Menand contends, became an attempt to permit different views to be democratically heard in the marketplace of ideas and policy.

Too simple? Too slim a reed to sustain the view that Holmes's turn against certainty (exemplified by antebellum abolitionism) came as an adaptive response to a life in which certainty spurred violence--one more Darwinian twist in a story replete with Darwinian themes? Menand's evidence is substantial. Holmes never tired of telling war and wound stories. He "alluded frequently to the experience of battle in his writings and speeches." After his death, Menand reports, "two Civil War uniforms were found hanging in his closet with a note pinned to them. It read: 'These uniforms were worn by me in the Civil War and the stains upon them are my blood.'"

Menand finds a similar evolution documented in James. Famously fragile in his emotions, and a legendary procrastinator, James came to believe that "certainty was moral death." Rather, he thought, the ability and courage to bet on a conception of truth before all the evidence was in amounted to the best test of "character." That remarkably open mind, Menand relates, grew, like Holmes's resistance to dogmatism, out of experiences, such as the "international hopscotch" that family patriarch Henry Sr. imposed on his children's educations by yanking them out of one school after another.

"The openness that characterized both the style and the import of his writings on pragmatism," Menand writes of William James, "seemed to some of his followers to have been specifically a consequence of his disorganized schooling." Similarly, James's close work with Agassiz on the naturalist's famous "Thayer expedition" down the Amazon in the 1860s taught James that "everything we do we do out of some interest," a tenet crucial to pragmatism. Menand suggests that meditations on Brazilian Indians ("Is it race or is it circumstance that makes these people so refined and well bred?" James asked in a letter) may have begun James's relational thinking. Alluding to such influences, Menand concludes, "It seems that Brazil was to be, in effect, his Civil War."

By the time the author gets to Peirce, in Part Three, and Dewey, in Part Four, his entertaining method is in full swing. Menand portrays the pragmatism of his foursome, with their individual idiosyncrasies, as the consequence of experience-driven epiphanies, with epiphany playing the role in intellectual development that chance adaptive mutation plays in what once was considered "lower" biological development. Giraffes get longer necks--Americans get pragmatism.

Peirce proves the most challenging of Menand's subjects because he remained unpredictable and dysfunctional. The son of Benjamin Peirce, professor of mathematics at Harvard at the age of 24 and "the most massive intellect" Harvard president A. Lawrence Lowell claimed ever to have met, he had a lot to live up to. But Peirce suffered from painful facial neuralgia and turned to opium, ether, morphine and cocaine over his lifetime to ease the suffering. Violence and infidelity complicated the picture further--Peirce spent many years trying unsuccessfully to regain the brief foothold in academe he'd achieved during a short teaching stint at Johns Hopkins. With Peirce, Menand takes us through a famous nineteenth-century probate action, known as the "Howland will case," in which Benjamin Peirce testified, with behind-the-scenes help from his son, about the probability of a forged signature. A fascinating set piece, it's also Menand's inspired way of backgrounding the younger Peirce's involvement with the increasing importance of probability theory in the nineteenth century.

Peirce's work with "the law of errors," which "quantified subjectivity," was just one experience that drove him to pragmatist views. In time, writes Menand, Peirce came to believe both that "the universe is charged with indeterminacy" and that it "makes sense." He held that "in a universe in which events are uncertain and perception is fallible, knowing cannot be a matter of an individual mind 'mirroring' reality.... Peirce's conclusion was that knowledge must therefore be social. It was his most important contribution to American thought." Only in this stretch does Menand come to the title subject of his book: "The Metaphysical Club," an informal discussion group that Peirce, James and Holmes attended for perhaps nine months in 1872. There the idea that Menand considers a central link among the three, and fundamental to pragmatism--that ideas are not Platonic abstractions but tools, like forks, for getting tasks accomplished in the world--took articulate form for the first time. Here, as elsewhere, Menand evokes the atmosphere and supporting actors of the setting through fine orchestration of detail. He smoothly recovers the mostly forgotten Chauncey Wright, another man who learned in the Civil War that "beliefs have consequences." Wright used weather as his favorite example, and the "notion of life as weather" became his emblematic position.

Finally, in exploring Dewey in Part Four, Menand follows pragmatism's clean-up hitter from Vermont childhood to early academic stints at Hopkins, Michigan and Chicago. Menand's two-tiered approach falters a bit here. When the camera is on Dewey, we see him wrestling with issues of Hegelianism and laissez-faire individualism, and drawing lessons from his laboratory school at Chicago ("if philosophy is ever to be an experimental science, the construction of a school is its starting point"). He gets the de rigueur epiphany--the evil of antagonism among social factions--personally from Jane Addams. He absorbs moral insights offered by the Pullman strike and articulates his own great priority within pragmatism: democracy as a matter of social participation and cooperation, not just numbers and majorities. But here Menand's characteristic deep backgrounding, particularly on the genesis of the "Vermont transcendentalism" that was more conservative than the Boston variety, seems overmuch. For all of Menand's literary deftness, we sometimes wonder, when taking in the variations on French figures like Laplace, or Scottish ones like Robert Sandeman, whether we're listening to a wonderful stretch of intellectual exotica--fine improvisational solos--or music crucial to the story. At the same time, among the book's undeniable pleasures are Menand's voyages into the estuaries of nineteenth-century intellectual history, from Agassiz's endorsement in the 1850s of polygenism (the claim that races were created separately, with different and unequal aptitudes), to the work of the Belgian mathematician Adolphe Quetelet, "a brilliant promoter of statistical methods" who called his approach "social physics." Menand's accounts of nineteenth-century America's intellectual debates, like his sketches of Darwinian thinking and its social ramifications, are models of efficient summary.

Their net effect, of course, is to show that pragmatist concepts--opposition to certainty, evolution toward probabilistic modes of thought--were in the air, and his four protagonists breathed deeply. To Menand's credit, given the compass of this biographical and sociological work, he keeps his eye on the enduring subject--pragmatism as a distinct mode of thought--showing the family resemblance in pragmatist epiphenomena of the time, from proximate cause in law to statistical understanding of the role of molecules in heat. His superbly syncretic summary, late in the book, of what he's found sounds less sweeping than the claims in his preface:

Pragmatism seems a reflection of the late nineteenth-century faith in scientific inquiry--yet James introduced it in order to attack the pretensions of late-nineteenth century science. Pragmatism seems Darwinian--yet it was openly hostile to the two most prominent Darwinists of the time, Herbert Spencer and Thomas Huxley.... Pragmatism seems to derive from statistical thinking--but many nineteenth-century statisticians were committed to principles of laissez-faire James and Dewey did not endorse.... Pragmatism shares Emerson's distrust of institutions and systems, and his manner of appropriating ideas while discarding their philosophical foundations--but it does not share his conception of the individual conscience as a transcendental authority.

"In short, pragmatism was a variant of many strands in nineteenth-century thought," writes Menand, "but by no means their destined point of convergence. It fit in with the stock of existing ideas in ways that made it seem recognizable and plausible: James subtitled Pragmatism 'A New Name for Old Ways of Thinking.'" So maybe it's not true that the Civil War "swept away almost the whole intellectual culture of the North." That judicious modesty makes it easier to note some of the oddities of Menand's choices, especially given the bold leaps he takes to find pragmatist principles in areas of knowledge far afield from traditional philosophy. Some, considering the prominent space and harsh spotlight he devotes to discussions of slavery and racism by nineteenth-century thinkers like Agassiz, are regrettable.

At times, for instance, Menand can seem more interested in patricians for patricians' sake--or Boston Brahmins for Brahmins' sake--than the tale requires. It's easy to feel that a story with more nineteenth-century black and feminist thinkers, and fewer Northeastern gentlemen, would be a better tale for understanding the development of American thought. Menand's maverick status with regard to philosophy, welcome as the source of his syntactic verve and his enthusiasm for complex biographical explanation, perhaps intimidated him in this regard. As an outsider, he arguably stays too respectful of professional philosophy's ossified white-man pantheon, despite the canon wars of his own field. Martin Delany, Frederick Douglass and Elizabeth Cady Stanton, for instance, ought to be recognized as part of the pragmatist tradition, whether or not they formally have been.

Yet while Menand briefly mentions Delany and his troubles in being accepted at Harvard, he presents him more as a victim (which he was) than a thinker. More happily, Menand does devote respectful attention to the black pragmatist Alain Locke late in the book. But the biggest surprise is that W.E.B. Du Bois, who surfaces about 400 pages into the book, gets short shrift--four pages. Du Bois's famous articulation, at the beginning of The Souls of Black Folk, of the question black people silently hear asked of them by too many whites--"How does it feel to be a problem?"--provocatively inverted the pragmatist problematic in a way Dewey and James never fully pondered in their model of (white) agents facing their environments: the problem of being a problem to others. One imagines Menand could have made fascinating arabesques out of that peculiarity.

Then, finally, there is the Franklin problem. It's often forgotten, in an era when Franklin's face stands for thrift and prudence in bank ads, that his reputation, as John Adams wrote in the early nineteenth century, was "more universal than that of Leibnitz or Newton, Frederick or Voltaire," that Jefferson viewed him as "the father of American Philosophy," and that Hume agreed. Is a thinker who wrote in his Autobiography in 1784 that "perhaps for Fifty Years past no one has ever heard a dogmatical Expression escape me" far from pragmatism? In his emphases on experience, experimentation and community, Franklin was the proto-pragmatist par excellence. Even in the free-jazz genre of intellectual history, his absence is a large lacuna.

Pragmatism, however, offers special benefits to authors and reviewers. Once one abandons the idea that we mirror the world exactly with our stories, and takes the nervier view that we tell stories about it that may be good for us in the way of belief, the kind of criticism made here--that Franklin, Madison, Delany and other thinkers merit membership in that ironically named "Metaphysical Club"--assumes its humble place. The greater accomplishment--Menand's--is to show that powerfully experienced consequences form beliefs, that beliefs form consequences and that the whole circular process of life teems with blood and pain and laughter that expose the abstract approach of much professional philosophy for the self-interested charlatanism it is. Writing to his father about Agassiz, William James observed that "no one sees farther into a generalisation than his own knowledge of details extends." Accepted as a truism rather than a rebuke, the insight suggests that questions about Menand's choices represent rival stories--what James might have seen as another pluralist tale seeking airtime. Judged by James's own standard--what difference it makes if this or that worldview is true--The Metaphysical Club casts a vast, brilliant light on the human subtleties of America's most influential philosophical achievement. It's a feast of canny wisdom and sophisticated entertainment, and one hopes Menand's already privileged position in the intellectual elite, and the envy of the specialists, won't muffle the sounds of celebration.

The history which bears and determines us has the form of a war rather than that of a language: relations of power, not relations of meaning.

         --Michel Foucault, Power/Knowledge

Michel Foucault would have been fascinated by late-twentieth-century presidential campaigns.

         --Lynne Cheney, Telling the Truth

Since the late 1980s, when they discovered with horror certain French-derived theories of social science and literary analysis that long before then had taken root among left-leaning academics in the United States--essentially replacing Marxist dialectics as weapons of intellectual struggle, in reaction against the failure of radical politics in the 1960s--American conservative intellectuals have held these particular theories under siege. In such books as Tenured Radicals (1990) by New Criterion managing editor Roger Kimball, Illiberal Education (1991) by ex-Reagan White House domestic policy adviser (and former Dartmouth Review editor) Dinesh D'Souza and Telling the Truth (1995) by ex-National Endowment for the Humanities chairwoman and future Vice Presidential spouse Lynne Cheney, and in innumerable interviews, stump speeches and talk-radio tirades, representatives of the American conservative movement have denounced the exponents of these theories for attempting to lure students away from traditional cherished academic ideals like objectivity and truth toward a cynical, despairing view of history, politics, literature and law.

So we can assume that participants in this decade-long conservative jeremiad did not foresee that at the end of that decade their colleagues in the Republican Party would wage a campaign to win a close presidential election in ways that would seem to confirm, in virtually every respect, the validity of the theories they had been railing against--and moreover, that as part of that campaign, their allies would espouse and promote to the public the very essence of these same reviled theories. However, if we look closely at the theories in question and at the facts of Republican behavior in Florida, we will see that this is exactly what happened.

The theories in question are those derived from the works of French philosophers Michel Foucault and Jacques Derrida. Their conservative critics tend to conflate the ideas of the two men, and then to muddle things further by presenting both as synonymous with postmodernism; in fact, though they worked in distinct fields and did not even like each other (Foucault once called Derrida "the kind of philosopher who gives bullshit a bad name"), their theories do have analogous aspects that make them easy to confuse.

Foucault was a philosopher of history who posited, basically, the impossibility of achieving an objective and neutral interpretation of a historical event or phenomenon. Derrida is a philosopher of literature, founder of the notorious school of deconstruction, who suggested the impossibility of achieving a stable and coherent interpretation of a literary text, or any text. In both cases, the (putative) fact of the indeterminacy of the interpretive act leads to the conclusion (or rests on the assumption) that whatever interpretation comes to be accepted--the official interpretation--must have been imposed by the exercise of political power (though in deconstructionism this latter point has been elaborated and emphasized much more by Derrida's American disciple Stanley Fish than by Derrida himself). It is this shared assumption that any official interpretation, whether of human behavior or the written word, has been arrived at through a process of power competition and not through the application of objective, neutral and independent analysis (because there is no such thing) that has so agitated conservative intellectuals.

In her book Telling the Truth, the wife of the man who was to become Vice President of the United States following Republican Party political and legal maneuvers in Florida uses a book that Foucault edited called I, Pierre Rivière as the starting point for a critical examination of the philosopher's ideas. Pierre Rivière was a Norman peasant boy who in 1835 brutally murdered his mother, a sister and one of his brothers with a pruning hook. Foucault and a group of his students at the Collège de France compiled a collection of documents relating to the case. What the documents revealed to Foucault was not an overarching thesis that illuminated the cause and meaning of Rivière's shocking act--not, in other words, the unifying concept or constellation of concepts that academic analysts typically grope for in their research and thinking--but, on the contrary, a welter of conflicting and irreconcilable interpretations put forth by competing, equally self-interested parties, including doctors, lawyers, judges, Rivière's remaining family members and fellow villagers, and Rivière (who wrote a memoir) himself.

In other words, as Cheney puts it, the documents were important to Foucault "not for what they tell of the murders, but for what they show about the struggle to control the interpretation of the event." Or, as she quotes Foucault as saying, the documents form "a strange contest, a confrontation, a power relation, a battle among discourses and through discourses." The reason he decided to publish the documents, Foucault said, was "to rediscover the interaction of those discourses as weapons of attack and defense in the relations of power and knowledge."

"Thus," Cheney concludes, "I, Pierre Rivière is a case study showing how different groups construct different realities, different 'regimes of truth,' in order to legitimize and protect their interests."

The Foucauldian mode of analysis does not meet with any approbation or sympathy from the Vice President's wife. In fact, she goes on to say that Foucault's ideas "were nothing less than an assault on Western Civilization. In rejecting an independent reality, an externally verifiable truth, and even reason itself, he was rejecting the foundational principles of the West." Therefore it seems a pretty good joke on her that it turns out to be the perfect mode for analyzing how Republican Party strategy in Florida was developed and implemented.

In fact, I might suggest that if Michel Foucault had not confected them already, his concepts of "discourses" and "a battle among discourses" ultimately to be decided by power would have to be invented before this signal event of American political history could be properly understood.

When former Secretary of State James Baker arrived in Florida on November 10, 2000, three days after the election, dispatched there by Lynne Cheney's husband to take charge of the Bush campaign's effort to secure the state's Electoral College slate and thereby the Oval Office, George W. Bush's initial lead of 1,784 had already been reduced by an automatic machine recount to 327, and the Gore campaign had requested manual recounts in four Democratic-leaning counties: Miami-Dade, Broward, Palm Beach and Volusia. It appeared self-evident, from the tales both of the Palm Beach butterfly ballot and of the difficulties encountered by minority voters in getting to the polls and attempting to cast their ballots, that by the intention of Florida voters who had gone to the polls, if not by the actual counted results, Gore had won the state (which, of course, is why the media had awarded it to him early on election night in the first place), and it seemed a lot more likely than not that manual recounts in counties favoring Gore, or even a full statewide manual recount, would alter the actual results in Gore's favor, despite the fact that the absentee ballots, which usually favor Republican candidates, were yet to be counted. On top of all that, Gore was leading in the national popular vote, and it came as news to many Americans that a presidential candidate could win the popular vote and lose the election. (Prior to Election Day, Bush campaign strategists, believing it likely that George Bush would win the popular vote but lose in the Electoral College, had developed a strategy to try to discredit the Electoral College, and thus perhaps gain support from Democratic electors.)

Clearly, what Secretary Baker had to do in order to insure Bush's election to the presidency was to stop the requested manual recounts, or any manual recounts, from taking place. Since Florida election law explicitly permits manual recounts, and there is a long history of them being conducted in the state for elections to offices at various levels (though not previously the presidency), including one involving Republican Senator Connie Mack, and given the situation just described above, what Baker encountered first of all in Florida was a severe public relations problem. In advance of Republican lawyers making a legal case in various courts against the manual recount process, Baker had to make a case to the American public as to why a perfectly legitimate process that had been employed many times before in Florida and elsewhere across the United States to decide close electoral contests should not be used to resolve the closest Electoral College contest in 114 years. He needed to participate in what in Foucault's vocabulary is a discourse, by presenting an alternative to and challenging the Gore campaign and Democratic Party stance that the Florida vote was so close and so rife with proven and potential irregularities that only careful manual recounting could decide it fairly, as well as what we might call the underlying and perhaps more threatening Democratic contention that Gore had actually won the election and required only the additional step of targeted manual recounting to prove it.

The discourse (in the looser sense of a narrative) widely presented by Secretary Baker at his first press conference, on November 10, had several parts. He immediately tried to convey the sense that George W. Bush had already effectively won the election. "The American people voted on November 7. Governor George W. Bush won thirty-one states with a total of 271 electoral votes. The vote here in Florida was very close, but when it was counted, Governor Bush was the winner. Now, three days later, the vote in Florida has been recounted. Governor Bush is still the winner," he began. Wielding that assumption as his predicate, he attacked the Gore campaign for attempting to "unduly prolong the country's national presidential election," introducing the phrases "endless challenges" and "unending legal wrangling" when the election dispute was all of three days old. Then he attacked the process of recounting itself, particularly manual recounting, saying that "the more often ballots are recounted, especially by hand, the more likely it is that human errors, like lost ballots and other risks, will be introduced. This frustrates the very reason why we have moved from hand counting to machine counting." He stressed the importance of getting "some finality" to the election and accused the Gore campaign of "efforts to keep recounting, over and over, until it happens to like the result." Finally, he argued that a continued struggle over the presidential election would jeopardize America's standing in the world. By November 14, he was tying it as well to the stability of American financial markets.

Former Secretary of State Warren Christopher, Baker's equivalent in the Gore camp and someone no doubt unfamiliar with the writings of Foucault (and therefore not having the term discourse at his disposal), referred to Baker's argument about America's standing in the world as a "self-serving myth," and Baker did not raise this canard again. Neither did he again raise the matter of endangering US financial markets.

The next day, November 11, at a press conference announcing that the Bush campaign had filed suit in the US District Court for the Southern District of Florida to block the manual recounts requested by the Gore campaign, thus becoming the first of the two campaigns to initiate "legal wrangling" (a number of private lawsuits related to the election had already been filed but none yet by the Gore campaign itself, something Baker took pains to submerge), Baker dramatically escalated his attack on manual recounting. For a number of reasons, it is worth quoting the central paragraph of his formal statement in full:

The manual vote count sought by the Gore campaign would not be more accurate than an automated count. Indeed, it would be less fair and less accurate. Human error, individual subjectivity and decisions to "determine the voter's intent" would replace precision machinery in tabulating millions of small marks and fragile hole punches. There would be countless opportunities for the ballots to be subject to a whole host of risks. The potential for mischief would exist to a far greater degree than in the automated count and recount that these very ballots have already been subjected to. It is precisely...for these reasons that our democracy over the years has moved increasingly from hand counting of votes to machine counting. Machines are neither Republicans nor Democrats--and therefore can be neither consciously nor unconsciously biased. [Emphasis added.]

Clearly, Baker and the Bush camp had decided to place a demonization of hand counting at the core of their electoral discourse. That this was a purely calculated discourse, and one in no way sincerely embraced, is child's play to demonstrate.

Manual recounting is the method used by the United Nations for settling disputed elections around the globe, and it is also countenanced by the United States when our representatives get involved in observing the resolution of electoral conflicts in other nations. A majority of American states either mandate or permit manual recounting when the differential between the machine vote totals for opposing candidates is within a certain margin. Candidate Bush, while governor, had signed just such a bill in Texas that established the same "intent of the voter" standard set by Florida. Until this particular situation in Florida arose, requiring this particular discourse, no Republican politician I am aware of had ever risen to denounce manual recounting (on the contrary, any number of Republican politicians had taken successful advantage of manual recount provisions), nor have I found any literature on the subject that appeared in the conservative journals. Moreover, the Bush camp gladly accepted the results of manual recounts in Florida when those went in their favor, as the results did in Seminole County, and they had considered plans to contest machine results in Iowa, Wisconsin, New Mexico and Oregon if things did not go their way in Florida. Manufacturers of the punch-card voting machinery used in Florida were also on record as saying that hand counting was a more accurate method of gauging votes than using their machines.

Conservative writers like Cheney, D'Souza and Kimball, who have attacked Foucault, Derrida and their allies, have done so in the first place because of these thinkers' extreme skepticism about the possibility of objectivity in human affairs. Cheney complains in Telling the Truth about "how discredited ideas like truth and objectivity have become," and accuses her ideological adversaries of aiming at "discrediting the objectivity and rationality at the heart of the scientific enterprise." (Ironically, she goes on soon after that to take Al Gore to task for criticizing, in Earth in the Balance, the cold-blooded scientism of British philosopher Francis Bacon.) D'Souza, referring to the predominantly deconstructionist English department that had been assembled by Stanley Fish at Duke University, wrote that "the radical skepticism cultivated at Duke and elsewhere is based on the rejection of the possibility of human beings rising above their circumstances" (he recounts that Fish, in an interview D'Souza conducted with him, denounced Cheney for "the error of objectivism").

The Republican campaign to demonize the hand counting of votes in Florida, however, was nothing less than an all-out assault on this cherished ideal of the conservative movement, an assault as withering and uncompromising as any ever waged by Foucault or the deconstructionist movement. For what was Baker saying, in essence? That it is impossible for human beings to hand count votes accurately and honestly, citing as reasons the inevitability of "human error" (machine error is obviously to be preferred), the danger posed by "individual subjectivity," the "potential for mischief" (read, deliberate cheating by Democratic canvassing boards) and even the possibility of people being not only consciously but "unconsciously biased"; at the same time he exalted the superiority of "precise" machines over humans--even machines as grievously and laughably flawed as those that produced the "dimpled," "pregnant" and "hanging" chads.

Machines--even the worst, most malfunctioning of them--are capable of being objective, but people are not, in other words. People are incapable of "rising above their circumstances." Did Foucault ever put the case better himself? Has Stanley Fish? Has anyone?

It is safe to say that James Baker probably did not realize that he was challenging, and perhaps fatally undermining, core conservative doctrine when he advanced these arguments in his no-holds-barred effort to secure Florida's electoral votes for George W. Bush. And, for the conservative intellectual cause, there was worse to come. Soon Baker and his allies would take a further broad step down this road. They would begin impugning the fundamental American ideal of objective, neutral and fair-minded jurisprudence.

The Republican Florida discourse can be summarized even more simply and more baldly than it has been above. It was, or became: Manual recounting is untrustworthy and subject to manipulation; therefore, in attempting to force hand counting, the Democrats are trying to steal the election. (The Democratic Florida discourse was the obverse of this: Careful manual recounting is more reliable than machine counting; therefore, in attempting to stop manual recounting, the Republicans are trying to steal the election.) Such was Jim Baker and the Bush camp's basic pitch to the American public, and it became more and more explicit and fervid (or perfervid) as time went on.

Let us recall that in Foucauldian theory it is power, the possession and wielding thereof, that determines what discourse prevails in any given contest, or confrontation, or battle of discourses--and not the relative merit or cleverness of the argument. Conservative thinkers detest this part of the theory with equal or greater vehemence, because it is customarily deployed against institutions and systems they revere and are closely allied with--for instance, corporations and corporate capitalism. Cheney, struggling to explicate Foucault, complains about "the idea that reality is nothing more than a social construct, a tool that allows dominant groups to exercise power."

So, following Foucault, it would not be enough for James Baker simply to promulgate the argument that the Gore camp was trying to steal an election that neutral, precise voting machines had already indicated they had lost. To prevail, under Foucauldian theory, the GOP discourse--competing as it was with one that on the evidence and on experience was more plausible--would have to be imposed by an exertion of institutional power. And in fact an examination of the tactics used in the Republican War for Florida reveals that Republicans used, or were prepared to use, every conceivable lever of power--administrative, legislative, judicial (and not excluding extra-institutional mob rule)--in order to prevail; and that they prevailed because every single one of the controlling levers of power, from the Florida Governor's and Secretary of State's offices to the US Congress, from the Florida legislature to the US Supreme Court, was controlled by them, and was used ruthlessly.

Florida Secretary of State Katherine Harris, co-chair of the Bush campaign in Florida, was the point person in the Republican effort to delay and forestall completely, they hoped, any manual recounting. There are, remember, conflicting statutes regarding the deadline for manual recounting, one of which stipulates that any manual recounts not finished and submitted to the Secretary of State's office by the statutory deadline for certification, November 14, "shall be ignored," and another (chronologically more recent) indicating that they "may be ignored." And there is also an obvious conflict between this statutory deadline and the provision allowing requests for manual recounts to be made up to seventy-two hours after Election Day, since the amount of time then remaining before the deadline (three to four days, including a weekend) would not be sufficient for many Florida counties to complete such recounts. In interpreting an ambiguous and contradictory corpus of election law, Harris chose in each instance to follow a course redounding to the benefit of George W. Bush and the disadvantage of Al Gore. She refused to extend the deadline to allow time for the manual recounts requested by the Gore campaign, refused requests by Broward and Palm Beach counties to have their manual recount results included in the statewide certification after the deadline, issued a ruling questioning the legality of such recounts that temporarily delayed them from proceeding, and refused to grant an extra two hours to Palm Beach County to meet the extended deadline of November 26 mandated by the Florida Supreme Court, or to include any of the recount results the Palm Beach canvassing board had achieved so far.

After the Florida Supreme Court ruled on November 21 that hand counts must be included in the Secretary of State's certification, and extended the deadline, the Republican War for Florida shifted "to the ground," that is, to the places where actual hand counting was being done, or contemplated, by the canvassing boards. Republican tactics were summarized thus by the Los Angeles Times:

The Republicans were on the defensive, so their style was more confrontational: Challenge every disputed ballot and, if necessary, challenge the boards themselves. Build a record of inconsistent standards for court. If that leads to delays, so much the better. [Emphasis added.]

The New Republic described a Republican "ground operation" that involved, besides "now-infamous faux grassroots protests," visits to recount centers by "GOP luminaries," and that emphasized the blatant hypocrisy of the operation: It recounted Michigan Governor John Engler falling asleep during his service as an observer, and then going outside to "blast" the process, and New Jersey Governor Christine Todd Whitman getting along "swimmingly" with the canvassing board, even complimenting them on how well they were running things, then leveling "the obligatory attacks into the microphones."

The New Republic noted a crucial difference between the Gore and Bush camps: The Gore campaign chose anonymous lawyers specializing in arcane voting law to act as their observers, whereas the Bush campaign let loose a "rotating cast of big name pols." This was because the Bush campaign was less interested in trying to insure the fairness of the recounting process than in undermining it by propagandizing their discourse about its alleged inherent unfairness.

It could be said that the Gore effort in Florida foundered in a number of ways and places, two of which were certainly Palm Beach County and Miami-Dade County. In Palm Beach County, in the words of the Los Angeles Times, "Republicans crushed the Democrats." There was more than one reason that the Palm Beach canvassing board missed the new November 26 certification deadline (when Katherine Harris certified Bush as having won by 537 votes), but here is how the Los Angeles Times summarized what happened: "Endless delays, false starts and court challenges by Republicans meant the full recount didn't begin until Friday, November 17."

In Miami-Dade, the county with the largest voting population in the state of Florida (and the largest black vote), Republicans succeeded in preventing manual recounting from taking place at all. (The Miami Herald recently reported that by its own assessment of the undercounted votes in the county, Gore would have netted another forty-nine.) Members of the Miami-Dade canvassing board, and particularly its chairman, David Leahy, had been ambivalent about doing a recount from the start. The board first decided against doing one, then reversed course. The recount started on November 20; but the very next day, the Florida Supreme Court issued a ruling setting the new certification deadline at November 26. Believing that the board did not now have the time to conduct a full recount, Leahy persuaded the other board members that they should count only the 10,750 "undervotes" (ballots cast on which the punch-card tally machines had not detected any vote for President). The board then moved upstairs to a smaller room, where there were machines that could separate out the undervotes from the rest.

There, members of the board confronted what Time called a "mob scene" and "GOP melee." A group of several dozen or more Republican protesters, most of them apparently from out of state, many of them paid Capitol Hill staffers recruited by House majority whip Tom DeLay for the Florida effort (this being one of the Republicans' faux grassroots protests), directed from a Republican electronic command center in a Winnebago outside the building and by leaders on the scene with bullhorns, engaged in "screaming...pounding on doors and...alleged physical assault on Democrats," according to Time. When Miami-Dade Democratic chairman Joe Geller emerged from the room carrying a sample ballot, he was pushed and shoved by many protesters screaming, "I'm gonna take you down!" Simultaneously, longtime GOP operative Roger Stone was overseeing phone banks urging Republicans to storm downtown Miami, and Radio Mambi, a right-wing Cuban-American radio station, was inciting the Miami community into the streets. Outside the room where the canvassing board was meeting, members of the rampaging crowd were threatening that as many as 1,000 reinforcements, including a large contingent of angry Cubans, were on the way to join them. As Time put it, "just two hours after a near riot outside the counting room, the Miami-Dade canvassing board voted to shut down the count."

Leahy later denied that the board had been intimidated into inaction by the rioters, but his claim that their bullying and threats of violence had no effect at all on the board's reversal of its previous decision hardly seems credible. In any case, saturation propaganda and near-mob rule were only two of the weapons that Republican strategists had rolled out onto the field of battle in their War for Florida. (I won't even get into the report of a mysterious state police roadblock that intimidated some on their way to the polls on Election Day.) They also had under way a lawsuit seeking to have a federal court invalidate the manual recounts on the grounds that they violated Article II, Section I of the US Constitution, which gives to the state legislatures the power to regulate presidential elections; this lawsuit and others, including arguments which would end up being decided by a 5-4 majority of the US Supreme Court, were being handled by Theodore Olson, a party lawyer who had been active in efforts to discredit President Clinton while he was in office; Olson is a past president of the Federalist Society, a conservative Republican legal organization that normally seeks to severely limit the intrusion of federal power into state matters. And just in case the Republican cause lost in both the Florida and the federal courts, the Republican-controlled legislature in Florida was prepared to intervene and certify its own competing slate of electors. In fact, on December 12, just before the US Supreme Court issued its decision and made the action moot, the Florida House of Representatives did just that. A few days before, Baker, in an interview, had refused to stipulate that the Bush camp would heed a US Supreme Court ruling that went against them rather than turn to the legislature; and on other occasions Baker had appeared to invite its intervention. Beyond that, if the matter went to Congress for final arbitration, the Republicans were more than prepared to flex their majority muscle there. Tom DeLay had circulated a memo on Capitol Hill that a Republican Congressional aide characterized as saying: "Congress can prevent Al Gore from becoming President no matter what."

A final Foucauldian note. Foucauldian theory holds that the way of the rich and powerful will prevail, the less powerful or powerless will lose out (which is partly why the theory has been embraced by the left as a successor or adjunct to Marxism, and is so abhorred by the right). Punch-card voting machines are far less effective in recording votes correctly than optical scan machines. A dimpled or pregnant chad is created when insufficient force is used on the punch tool or when plastic T-strips used in balloting are too worn or rigid to allow chads to pass through; if the ballot is improperly aligned, and only one side of the chad is punched loose, that results in a hanging chad. These problems don't exist with optically scanned ballots, and as an obvious result, only about three out of every 1,000 optically scanned ballots in the Florida election recorded no presidential vote, compared to some fifteen out of 1,000 punch-card ballots, The New York Times reported.

Optical-scan voting machines tend to be more prevalent in the wealthier, and Republican-leaning, precincts and counties of Florida, the Los Angeles Times observed, and punch-card machines more prevalent in the less wealthy and more Democratic areas, simply because the wealthier counties can better afford the more expensive optical machines (the punch-card machines are not only less effective to begin with, but many of them are also old and worn out). Looked at one way, the manual recounting efforts were an attempt to correct a discriminatory imbalance in access to electoral power between rich and poor (and black and white) in Florida; and Republican forces were determined, in every possible way, to thwart this attempt.

American conservative thinkers, as discussed, have also directed considerable intellectual ire at the deconstructionist movement. In dissecting Paul de Man, the Belgian émigré who founded American deconstructionism at Yale in the 1970s, Roger Kimball pejoratively ascribes to him "the thought that language is always so compromised by metaphor and ulterior motives that a text never means what it appears to mean." D'Souza, confronting Derrida and de Man, says that they labored "to discover ingenious, and sometimes bizarre, contradictions which render the work 'radically incoherent.'" Lumping deconstructionists together with "postmodernists, structuralists, poststructuralists, reader-response theorists," D'Souza says that "they are embarked on a shared enterprise: exposing what they say is the facade of objectivity and critical detachment in such fields as law, history, and literature."

Lynne Cheney finds the apparent migration of deconstructionist methods of textual analysis and thinking to the field of law extraordinarily disturbing. In Telling the Truth, she traces the origins (to her own satisfaction, anyway) of critical legal studies, feminist legal theory and critical race theory--academic movements that hold that the law is not in any way neutral but is crafted to favor the interests of a dominant (white, male) elite--to deconstructionism. She claims, for instance, that "one of the primary purposes of CLS [critical legal studies] is to destroy any illusions that might exist about stability and objectivity in the law by deconstructing its arguments," and goes on to assert: "The heirs of CLS, such as those in the critical race theory movement, take a giant step further. As feminists have done, critical race theorists not only attack the notion that the law is disinterested, they advocate using the law to promote their own interests." [Emphasis added.] In other words, Cheney sees the notion that legal texts have no stable, permanent, inherent meaning (a deconstructionist notion) and will therefore be interpreted according to the practical interests of those doing the interpreting, and not other criteria (a leftist political notion), as dangerously subversive of our legal system.

If the objectivity and disinterestedness of the law, however, are bedrock conservative doctrine, then James Baker, and his associates and conservative columnist sympathizers like William Safire, once again challenged and compromised that doctrine in the Florida presidential election imbroglio. The idea that law is (on the whole) neutral, objective and disinterested necessarily implies that the judges who interpret it are (on the whole) neutral, objective and disinterested; there is no conceivable syllogism whose conclusion is that our legal system is (more or less) objective and fair that can have as a premise that our judges are not and are not capable of being so. Yet this was the blatant premise of Republican commentary as an assortment of legal cases relating to the election wound their way through the Florida court system. Just as Republican operatives and commentators trashed the integrity of the county canvassing boards simply because they were under Democratic control, they also used the fact of their being Democratic appointees to attempt to discredit--often in advance--the decisions of various Florida judges, from the circuit level up to the state's Supreme Court. The clear implication was that Democratic judges would necessarily, either reflexively or by calculation, rule in favor of the Democratic candidate. They could not be trusted to be disinterested and objective.

In addition to being a monumental betrayal of the conservative movement's stated intellectual principles, this line of argument creates another problem for its Republican promoters: It tends to discredit in advance the decisions of Republican as well as Democratic judges. For if Democratic judges cannot be trusted to be evenhanded and judicious, what logic can be called forth to argue that Republican judges can be? They are also human. They are also partisan. They also owe something to the people who selected them. The theory unavoidably predicts that judges appointed by Republicans will rule, in a biased and partisan manner, against Democratic candidates and causes when occasions to do so arise.

It is doubly ironic, therefore--and doubly troublesome, one would think, for the integrity of the conservative cause--that this is exactly what happened when the case called Bush v. Gore reached the highest court in the land. On Friday, December 8, the Florida Supreme Court, in a split 4-3 decision overruling a decision by lower court judge N. Sanders Sauls, ordered an immediate manual recount of all presidential undervotes throughout the state. The next day, a 5-4 majority of the US Supreme Court (all the members of this majority conservative appointees of either Richard Nixon, Ronald Reagan or George Bush) ordered the manual recount halted. This action was widely perceived at the time, by people on both sides of the battle, as a body blow to Al Gore's remaining chances of garnering Florida's electoral votes. It would inevitably push the recounting process, were it to resume, up against the December 12 "safe harbor" date (the time by which electors needed to be chosen in order to remain immune to Congressional challenge) and possibly even make it impossible to finish by December 18, the date on which the Electoral College was to cast its vote. Regarding this majority decision, issued by five judges who have pontificated widely in their writings and speeches about the virtues of "judicial restraint," Justice John Paul Stevens wrote in his dissent: "To stop the counting of legal votes, the majority today departs from...venerable rules of judicial restraint that have guided the Court throughout its history."

The Court did not, as we know, allow the recounting process to resume. The following Tuesday, December 12, the same 5-4 majority ruled that manual recounting under circumstances then prevailing in Florida would be unconstitutional. In an unsigned per curiam opinion (the judges said to be the primary authors of this opinion, Anthony Kennedy and Sandra Day O'Connor, clearly did not want their names on it), the majority (whose members in the past have been indifferent to, if not outright scornful of, equal protection claims) relied principally on an equal protection argument: that it would be unfair to count ballots in different counties according to different standards (e.g., to count only hanging chads in one, but also dimpled chads in another). The argument speciously ignored the fact that the Florida ballots, prior to any recount, were already counted differently, and that the very purpose of recounting was to correct for this discrepancy. It also skirted the fact that ballots are counted differently all across the United States, and that a logically consistent application of the Court's principle would invalidate the entire presidential election.

Justices David Souter and Stephen Breyer had tried, in oral argument and behind the scenes, to work out a compromise position whereby the Justices would send the case back to the Florida Supreme Court and ask it to set a uniform standard for the manual recounts. But according to the per curiam majority, it was too late for this, because there would then not be enough time to meet the "safe harbor" deadline of December 12. This argument ignored the fact that it is the very essence of a "safe harbor" clause that it allows but does not require a certain self-protective action; the Electoral Count Act of 1887 stipulates that states whose electors are chosen by that date, under rules in place on Election Day, cannot have those electors rejected by Congress, but it does not mandate that the states behave that way. The argument also glided past the fact that there is nothing in Florida election law explicitly requiring that the state abide by that date, either; the majority opinion in this regard relied entirely on a virtual aside in the first Florida Supreme Court decision, to the effect that it thought the legislature intended the state to meet the deadline. The majority could not cite any actual, germane Florida statutory law--because there isn't any.

Thus did a Republican Party strategy to delay the manual recounting of votes in Florida as long as possible finally achieve its goal by furnishing the rationale for a conservative Republican Supreme Court majority to stop the recount process there for good. Thus did a Supreme Court majority that had been pursuing an aggressive states' rights jurisprudence prior to this decision intervene in a matter of state law in a heavy-handed and unprecedented way; in a category of dispute, furthermore, whose ultimate resolution the US Constitution unambiguously gives to Congress; and in a situation, finally, that even a modicum of "judicial restraint" would have called for it to avoid. Thus did a group of conservative Republican judges--unelected judges, to use a well-worn Republican refrain--choose the Republican candidate for President over the Democratic one, rather than the voters of Florida or the American people. Thus was James Baker's discourse--his "regime of truth"--finally imposed.

A further irony here is that the behavior of the Democratic judges who were involved in the Florida presidential election struggle overwhelmingly refuted Republican predictions of reflexive ideological bias. County Circuit Court Judge Terry Lewis, a Democrat, twice ruled in favor of Republican Secretary of State Katherine Harris, the first time upholding her enforcement of the certification deadline of November 14, the second time upholding her decision to declare a winner without including any hand recounts. Circuit Judge Nikki Clark, who handled the lawsuit involving the question of whether to throw out some 15,000 absentee ballots in Seminole County because of technical violations of the law by Republican canvassing board officials and operatives, had been particularly impugned by Republican commentators. She was black, she was a former legal aid attorney, she was not only a registered Democrat but had been an aide to the state's former Democratic Governor Lawton Chiles, and she had recently been passed over for a promotion by Governor Jeb Bush. But in the event, she ruled decisively against throwing the ballots out, as did her fellow Democratic judge who handled a similar case concerning absentee ballots in Martin County. The November 16 decision of the Florida Supreme Court (all Democrats except one independent) to allow manual recounts in Palm Beach County (and tacitly allow the effort already under way in Broward County to continue) was unanimous; but the December 8 decision ordering manual recounts of all the undervotes in the state was a 4-3 split, and the court unanimously upheld virtually all the lower court decisions that went against Democratic interests, with the single exception of the finding by Circuit Court Judge Sauls rejecting the Gore campaign's contest. (Sauls, advertised as a Democrat, is actually a Republican appointee who switched registration from Democratic to Republican, and has run as a "nonpartisan" candidate for re-election in Democratic Leon County.)

"What must underlie petitioners' entire federal assault on the Florida election procedures is an unstated lack of confidence in the impartiality and capacity of the state judges who would make the critical decisions if the vote count were to proceed," Justice Stevens wrote in his dissent in Bush v. Gore. "Otherwise, their position is wholly without merit. The endorsement of that position by the majority of this Court can only lend credence to the most cynical appraisal of the work of judges throughout the land." [Emphasis added.] According to a January 22 article in USA Today detailing the lingering bitterness between the two opposing factions within the Supreme Court over the Bush v. Gore decision, at an election night party on November 7, Justice Sandra Day O'Connor became "visibly upset" when network anchors first awarded the state of Florida to Al Gore. The story went that her husband was heard explaining the couple wanted to retire and that his wife preferred that a GOP President appoint her successor. The paper said that people close to the Justices had confirmed the essence of the story (which was also reported in the Wall Street Journal and Newsweek). Justice O'Connor, it would seem, experienced difficulty rising above her circumstances.

I am not an adherent or admirer of the theories of Michel Foucault, Jacques Derrida, Paul de Man, Stanley Fish or their associated movements. On the contrary. I received my education before these theories came into vogue, at a time when it was still commonly if not universally assumed on campus that the purpose of academic study was to acquire useful and verifiable knowledge in a variety of fields, not excluding literature and politics (my two fields of study). I personally believe that literary texts, while they can (and will, if they are any good) have subtleties and profundities and even contradictions that will stubbornly resist one-dimensional analysis, do have meaning--and that the better the writer, the clearer that meaning is. I believe that while the world and human nature are infinitely complex, there is, within limits, such a thing as objective truth. Since first hearing of the ideas of deconstruction and Foucault, I have counted myself among their skeptics and detractors.

So I did not undertake to write this essay to demonstrate, as it might seem, that the Republican War for Florida in all its aspects--in its ideology, in its concrete actions and, perhaps above all, in its success--lends credibility to these theories, though it has been most interesting to discover the extent to which it does. No, I am far less interested in the remarkable symmetry between what happened in Florida and the theories of Foucault and Derrida about how history and the social construction of reality work, than I am in the stunning asymmetry between Republican Party statements and actions there and the professed ideological principles of American conservatism.

In Florida, to win the presidency, the Republican Party betrayed what its intellectual spokespeople allege are among conservatism's highest ideals. To discredit the manual recounting process that they feared would result in the election of Al Gore, Republican representatives like Jim Baker propagated, in effect, the doctrine that human beings are incapable of being fair and objective in their interpretations of reality. To discredit judicial decisions that went (or simply might go) against their interests, they propagated, in effect, the doctrine that law does not have even a dimension of neutrality or disinterestedness but is from beginning to end an exercise of raw political power in disguise. Both of these are doctrines that their intellectual spokespeople like Lynne Cheney claim to oppose and despise--doctrines that, according to her, are nothing less than "an assault on Western Civilization." And, to compound the moral dilemma they were creating for themselves and their movement, these representatives proceeded to conduct themselves in ways that lent support to the validity of these same cynical, anticonservative doctrines. A party that for a long time has professed adherence to principles of states' rights and judicial restraint played federal judicial intervention as its trump card to insure the election of its candidate. Will it ever be possible, in our generation anyway, to take its intellectual pronouncements seriously again?

As a proximate result of its relentless War for Florida, America's conservative party has taken control of the presidency and all the powers attendant on that office. We shall see what comes of that. But as another, perhaps longer-lasting result of that implacable war, the intellectual and moral pretensions of contemporary American conservatism lie in tatters, like so many discarded chads on the floor of a county canvassing board meeting room.

Another book on the Vietnam War? Yes, and one well worth our attention. Enough time has now passed that A.J. Langguth's Our Vietnam: The War 1954-1975 serves not only as a wonderful addition to the rich and diverse literature on the war but as a good vehicle for revisiting and better understanding a tragedy of profound dimension. It is also an excellent one-volume introduction to the subject for those to whom this two-decade national experience is mainly a historical episode--and a useful lens through which to view the new Administration of George W. Bush as it begins to deal with military and national security issues.

Langguth is not a professional historian who approaches the Vietnam War merely through archives and secondary sources; he is a seasoned journalist who was on the ground in Vietnam for the New York Times when US combat troops began fighting in large numbers. Initially as a reporter and then as the paper's Saigon bureau chief, Langguth covered the war during this transformative period of 1964-65. He returned to Vietnam in 1968 to report on the aftermath of the Tet offensive, and then in 1970 to report on the US invasion of Cambodia. In preparing Our Vietnam he made five trips to Hanoi and Ho Chi Minh City, where, he writes, he was "warmly received by Vietcong officers and lifelong Communist politicians."

Langguth's firsthand experiences in Vietnam help infuse this anecdotally rich chronological narrative with a vividness and immediacy that propel the reader through nearly 700 pages of history. It is disappointing that he did not write a concluding chapter or two in which to reflect on the longer-term meanings and consequences of the war. But, to his credit, he does cover the war not only from the American perspective but also from the vantage points of the South Vietnamese, the North Vietnamese and the Vietcong. Langguth stays out of the way of his story--as he states, "My goal was simply a straightforward narrative that would let readers draw their own conclusions"--but makes his own point of view quite clear in the book's final paragraph: "North Vietnam's leaders had deserved to win. South Vietnam's leaders had deserved to lose. And America's leaders, for thirty years, had failed the people of the North, the people of the South, and the people of the United States."

Our Vietnam compares favorably with other books on Vietnam written by journalists that attracted wide public acclaim when published and retain distinguished and important places within the literature on the war. David Halberstam began The Best and the Brightest on the heels of covering the domestic turbulence created by the war during the 1968 campaign, in which he "had seen the Johnson Administration and its legatee defeated largely because of the one issue." When his book was published in 1972, US soldiers were still fighting in Vietnam and Richard Nixon was President. Thus, although Halberstam's study of how and why American leaders made decision after decision leading up to and sustaining the war remains a touchstone, he wrote of an event he was living through and for which he had limited sources.

Stanley Karnow's Vietnam: A History is an exceptional single-volume history of the American war in Vietnam, which Langguth almost seems to have used as a guide for his own effort. Although Karnow gives his thoroughly engaging, magisterial history a broad perspective, Langguth had access to more information because he was writing later, and his narrative provides a richer Vietnamese perspective on events. Neil Sheehan's A Bright Shining Lie: John Paul Vann and America in Vietnam is a remarkable, at times staggering, historical account of the war. But because Sheehan uses the story of Vann's life to tell the tale--even ending with the observation that "John Vann...died believing he had won his war"--his captivating narrative is more personal than a conventional historical account.

Our Vietnam should help keep the record honest today, too, by constituting an antidote to the published memoirs of those who planned and executed the war, the most prominent being former Secretary of Defense Robert McNamara. In his memoir, In Retrospect, McNamara wrote that his "associates in the Kennedy and Johnson administrations were an exceptional group: young, vigorous, intelligent, well-meaning, patriotic servants of the United States," and he wondered: How "did this group...get it wrong on Vietnam?" The mistakes, he insisted, were "mostly honest mistakes," the errors ones "not of values and intentions but of judgment and capabilities." (Langguth's chronology and evidentiary record contradict that view forcefully.)

Langguth's retelling of America's involvement in Vietnam covers already familiar ground. After the North Vietnamese defeated the French at Dien Bien Phu in 1954, Washington--in keeping with a cold war mentality that gripped much of the nation--promised "free elections" in South Vietnam and dispatched military personnel to train South Vietnam's army. But Vietnam was not on President Eisenhower's radar screen. When he met with President-elect John Kennedy during the transition, he drew Kennedy's attention to Laos as a potential trouble spot and made no mention of Vietnam. As the new President became focused on Vietnam, Under Secretary of State Chester Bowles--both an oracle and a lone wolf--warned (to no avail) against US military involvement there, because it would put the "prestige and power" of the United States on the line "in a remote area under the most adverse circumstances."

In 1961 McNamara became the Kennedy Administration's "supervisor for Vietnam," which caused many, years later, to refer to the conflict as McNamara's War. In 1962 Kennedy began to increase the number of US personnel in Vietnam. In 1963 Buddhists there were mounting a challenge to the autocratic ways of South Vietnam's President Ngo Dinh Diem; indeed, that summer one Buddhist monk, Thich Quang Duc, horrified the world by burning himself to death on a Saigon street. Diem was murdered only weeks before Kennedy was assassinated in November 1963.

On to the Johnson Administration: On August 7, 1964, Congress rubber-stamped--the House vote was 416 to 0 and the Senate's was 88 to 2--an Administration-drafted resolution informally called the Tonkin Gulf Resolution (following a trumped-up incident of attack on a US vessel), which authorized the President to take "all necessary measures to repel any armed attacks against" US forces and "to prevent further aggression." A month after eight Americans were killed at the US base in Pleiku in February 1965, the United States commenced Rolling Thunder, the systematic bombing of North Vietnam, which continued (with some pauses) until October 1968. In March 1965 President Johnson committed land troops to South Vietnam, which over the next two and a half years were increased to more than 500,000.

When the United States went to war, most Americans might have said that the goal of the war was to stop the spread of Communism. But it is likely that only a small portion of them gave much thought to the meaning of that slogan as it applied to Vietnam. Moreover, a still smaller percentage probably thought carefully about how this war would actually prevent the spread of Communism, and what important interest the United States really had in doing so in South Vietnam per se. Thus, when the war began for the United States in earnest in 1965, the American public had only a wafer-thin understanding of why the nation was fighting a land war in Southeast Asia, the goals of the conflict, the dangers, whether the aims of the war were realistic and what magnitude of commitment and sacrifice might be required to see it through.

Once the war began, it dragged on from one season to the next, year after year. The public became restless and the antiwar movement became forceful. In early 1968 the North Vietnamese pulled off the stunning Tet offensive, which left the indelible image of Marines desperately fighting for their lives as they defended the US Embassy in Saigon. Washington tried to assure the American people that Tet was not a Communist victory. But George Aiken, a Republican senator from Vermont, may have expressed the public's mood best when he said sardonically: "If this is a failure, I hope the Vietcong never have a major success." In the end, Tet shredded confidence in Washington, the idea that the war was being won and the suggestion that the South Vietnamese government was anything but a corrupt puppet.

Senator Eugene McCarthy challenged President Johnson for the Democratic Party presidential nomination and stunned the nation by getting 42.2 percent of the vote in the 1968 New Hampshire presidential primary. A few weeks later, the group of so-called Wise Men advised Johnson not to escalate the war further, and days after that Johnson told the nation he would halt much of the bombing and agree to negotiations with the North Vietnamese. He added that he would not be a candidate for re-election. When the Paris Peace Talks were about to begin, the South Vietnamese refused to attend, and the North Vietnamese and US delegations settled into arguing about the shape of the table at which they would sit--and Americans and Vietnamese continued to die.

When Richard Nixon won the general election, he didn't consider an immediate cessation of hostilities. Instead, he inaugurated a policy--eventually termed "Vietnamization"--of gradually withdrawing ground troops. At the same time, he enlarged the air war, ordering the secret bombing of Communist bases in Cambodia. But the war did wind down, if slowly, and the last US combat troops left South Vietnam in March 1973. Rather than face an impeachment trial, Nixon resigned the presidency in 1974, which left Gerald Ford as President when the North Vietnamese smashed through the gates of Independence Palace in Saigon and defeated the South Vietnamese in 1975.

Vietnam was a tragedy of immense proportions, and although it is regrettable that Langguth does not try to distill its causes, his carefully presented evidence makes it plain that the only arguable United States national security interest in Vietnam was the cold war policy of containing Communism to ward off the oft-invoked "domino effect." But even in this regard, Langguth's account indicates that those who planned and executed the war believed that the primary interest of the United States had less to do with containing Communism than it did with some vague idea of national prestige. As the respected John McNaughton, assistant secretary for international security affairs, stated in a 1965 memorandum to McNamara, "70%" of the purpose of US military intervention was "to avoid a humiliating US defeat (to our reputation as a guarantor)."

Langguth's account also firmly rules out the idea that Vietnam was a quagmire that US leaders stumbled into unaware of the risks. The "quagmire thesis" first made its mark on public consciousness when David Halberstam wrote The Making of a Quagmire in 1964, maintaining that well-intentioned, idealistic American leaders blundered their way in. After that, many were responsible for restating and elaborating the theme, but it was probably Arthur M. Schlesinger Jr., the prominent historian and former Kennedy aide, who gave the thesis one of its most quoted expressions, in The Bitter Heritage: Vietnam and American Democracy, 1941-1966, published in 1967.

When I did my research for The Day the Presses Stopped: A History of the Pentagon Papers Case, it seemed that the Pentagon Papers had smashed the quagmire thesis to smithereens. The Pentagon Papers were a massive, 7,000-page, top-secret military history of America's involvement in Vietnam from the end of World War II to 1968, which McNamara had commissioned in 1967 and which Daniel Ellsberg leaked to the New York Times (and which the Washington Post also published) in 1971. They consisted of government documents embodying key decisions that led up to the war and sustained it, as well as accounts written by so-called Pentagon historians whose task it was to write a narrative of the decisions and events as recorded in the government documents that the study comprised.

But the theme of a morass that had trapped us unwittingly proved resilient. Weeks after the Pentagon Papers became public, none other than Schlesinger stepped forward to defend the quagmire claim against attacks on it by Ellsberg and others. Schlesinger wrote that "the Vietnam adventure was marked much more by ignorance, misjudgment, and muddle than by foresight, awareness, and calculation." He concluded that "I cannot find persuasive evidence that our generals, diplomats, and Presidents were all that sagacious and farsighted that they heard how hopeless things were, agreed with what they heard, and then 'knowingly' defied prescient warnings in order to lurch ahead into what they knew was inevitable disaster."

Even today that view continues to have currency, but Langguth will have none of this. He establishes not only that ranking US officials time and again perceived the dangers but that they were simultaneously unconvinced that the United States had any meaningful national defense or security interests in Vietnam that would have warranted war. Langguth's portrait is one of these same leaders feeling hemmed in by domestic political considerations. They believed that the harm to themselves at home (as well as perhaps to their party) would be too substantial if they were to change the direction of the cold war-inspired policies that were spawned at the end of World War II, and so they gradually but relentlessly supported the American military involvement in Southeast Asia.

Consider three incidents recounted by Langguth as illustrative of this theme. President Kennedy told his scheduling secretary, Kenny O'Donnell, in 1963 that "withdrawal in 1965 would make him one of the most unpopular presidents in history. He would be damned everywhere as a Communist appeaser. 'But I don't care,' Kennedy said. 'If I tried to pull out completely now from Vietnam we would have another Joe McCarthy red scare on our hands, but I can do it after I'm reelected. So we had better make damned sure that I am reelected.'"

During a 1964 taped telephone conversation between President Johnson and Senator Richard Russell of Georgia, Russell said, "I don't see how we're ever getting out [of Vietnam] without fighting a major war with the Chinese and all of them down there in those rice paddies and jungles. I just don't see it. I just don't know what to do." Johnson answered: "That's the way I've been feeling for six months." When Johnson asked Russell, "How important is it to us?" and Russell answered, "Not important a damned bit," the President did not disagree. Johnson also told Russell that he did not think people knew much about Vietnam and that "they care a hell of a lot less." Toward the end of the conversation, Johnson speculated, "They'd impeach a President that runs out, wouldn't they?"

Langguth also reports a telling incident just before Christmas 1970, when President Nixon told Henry Kissinger, his National Security Adviser, that he "considered making 1971 the last year of America's involvement in Vietnam." Nixon said that he planned to tour South Vietnam in April 1971, reassure South Vietnamese President Nguyen Van Thieu about the consequences of the impending US withdrawal and then come home and "announce that America's role in Vietnam was over." Kissinger protested. He argued that after the withdrawal, the "Communists could start trouble" in 1972, which meant that the "Nixon administration would pay the political price in the 1972 presidential election." Kissinger advised, as Langguth recounts, that "Nixon should promise instead only that he would get American troops out by the end of 1972. That schedule would get him safely past his re-election. Nixon saw the wisdom in Kissinger's argument that guaranteeing his second term would require American soldiers to go on dying."

It would be overreaching to assert that Kennedy, Johnson and Nixon made defense and national security decisions solely in response to their own perceived political fortunes, but the evidence does make it clear that their decisions were greatly influenced by the consideration. And to accept that domestic political concerns played such a pivotal role in the US war in Vietnam, which cost 57,000 American and an estimated 2 million Vietnamese lives, constitutes a profound challenge to the capacity of a democratic order to fashion and implement moral and prudent policies.

As damning as it is to accept the degree to which personal and party interests prompted policies that led to such a protracted war, it would be a mistake to think of the Democrats and Republicans who made those decisions as somehow uniquely flawed. It is unlikely that the qualities of mind, temperament and character of current and future political leaders will be more vital, wise or resourceful than those of the leaders who occupied high office between 1954 and 1975. One must accept that the United States is not likely to have leaders who have the vision, the ability to communicate and that rare quality of leadership that would allow them to reshape what is politically possible by fundamentally altering (after they have helped formulate it) an entrenched mindset that dominates a nation.

Although Langguth's Our Vietnam does not confront this conundrum, his vivid retelling of the American war there allows us to consider once again the role played by the antiwar movement in bringing the war to an end. In so many ways, the movement was chaotic and ineffective. But can one imagine what would have been the course of the conflict if there had been no movement? Would the US combat forces in Vietnam have risen to a million? Would the United States have used nuclear weapons? Would the United States have supported a war of attrition for another three or four years? The movement was the countervailing force to, in its words, "the system" that made and sustained the war. The movement restrained Johnson's buildup; it put Senator McCarthy in a position to mount a challenge to a sitting President; it pressured Nixon to find a way out of an even longer war. The movement accomplished nothing quickly or easily--it couldn't. It was battling a cold war sensibility implanted in the American mind since the end of World War II. What is so astounding, therefore, is not that the movement did not achieve more, but that it achieved so much.

Just as a people may set constraints on the political process that politicians experience as a prison without walls, the people may also be, as they were during the Vietnam War, the system's last best hope. And if that is true, then the people need to be fully engaged as the inexperienced President Bush confronts an array of defense and national security issues: When and under what circumstances should the United States commit ground forces to a situation comparable to Kuwait or Bosnia? Should the United States deploy a national missile defense system? Is the United States meddling in a civil war in Colombia under the guise of advancing a hapless "war on drugs"? If dangerous weapons of mass destruction are identified with certainty in a nation considered a threat, what is the appropriate response?

Langguth's Our Vietnam reminds us time and again of the importance of skepticism and distrust in assessing defense and national security policies. One anecdote makes the point memorably. In August 1964 Johnson was widely praised for ordering airstrikes against North Vietnam in the wake of the Tonkin Gulf incident. The influential New York Times columnist James Reston wrote that "even men who had wondered how Johnson would act under fire 'were saying that they now had a commander-in-chief who was better under pressure than they had ever seen him.'" There were not many dissenting voices, but I.F. Stone was one. Referring to the right-wing Republican presidential candidate, who criticized Johnson's policies toward North Vietnam as insufferably weak, Stone wrote in his weekly: "Who was Johnson trying to impress? Ho Chi Minh? Or Barry Goldwater?" Stone's response to this pivotal event is a vital example of a frame of mind that would serve the nation well if it were widely adopted. Any book that becomes a vessel for meaningful re-examination of a national tragedy is exceptional and demands broad attention. And that is what Langguth's book is, and does.

"Yes, nonviolence is a noble ideal, but do you really think it would stop a Hitler?" Or a street thug, a dictator, a death squad?

Pacifists are long accustomed to these questions, mostly thrown up by self-proclaimed realists. And they get the put-down message: Nonviolence is a creed only slightly less trifling than hippies sticking flowers in soldiers' gun barrels.

Readers whose minds are open to another view will be rewarded by A Force More Powerful: A Century of Nonviolent Conflict. It is a comprehensive and lucidly written addition to the literature of peace. Its worthiness puts the authors, Peter Ackerman and Jack DuVall, in the high company of Gene Sharp of the Albert Einstein Institution in Boston, Michael True of Assumption College and Richard Deats of the Fellowship of Reconciliation--all scholars of mettle who bring before the public the many historical examples where the force of organized, nonviolent resistance defeated oppression.

Ackerman and DuVall deserve praise for writing nonideologically when they might easily and self-indulgently have done otherwise (and thus lost readers looking for hard reporting rather than soft commentary). They use fourteen chapters to document and analyze history-altering reforms created by nonviolent strategies. These include the early 1940s Danish resistance to the Nazis; Solidarity's strikes in the 1980s, which eventually took down the Soviet puppet regime in Poland; the 1980s public demands for free elections that removed the Pinochet junta in Chile; the near-bloodless elimination of the Marcos government in the Philippines; the work of the Palestinian-American Mubarak Awad to rally nonviolent civil resistance against Israeli authorities in the occupied territories; and the campaign of civil rights workers in Nashville in the 1960s.

These are the better-known examples. Ackerman and DuVall also explore the removal of autocratic governments in El Salvador (in 1944), Mongolia and Eastern Europe. Oddly, the authors omit the story of Le Chambon, the French village that was a leading center for hiding Jews in the early 1940s and whose pacifist citizens successfully faced down the Nazis with weapons of the spirit, not weapons of steel. (That story is told by Philip Hallie in Lest Innocent Blood Be Shed.)

Ackerman and DuVall do not portray Awad, King Christian X of Denmark, Gandhi of India, Mkhuseli Jack of South Africa, Reverend James Lawson of Nashville and others as willing martyrs for the cause. Instead, they present them as hard-thinking political strategists who built bases of citizen support that would not crack when the heat rose and the dogs snarled.

"Nonviolent resistance," the authors write,

becomes a force more powerful than the hand of an oppressor to the extent that it takes away his capacity for control. Embracing nonviolence for its own sake does not produce this force. A strategy for action is needed, and that strategy has to involve attainable goals, movement unity, and robust sanctions that restrict the opponent.... When the regime realizes it can no longer dictate the outcome, the premise and means of its power implode. Then the end is only a matter of time.

Ackerman and DuVall provide ample details to debunk the prevailing image of pacifists as appeasers or well-meaning but addled dreamers who've read one too many biographies of St. Francis. As portrayed here, organizers of successful collective, nonviolent opposition to oppressors tend to be self-disciplined, practical and dogged--traits commonly held up as military virtues, which is why Gandhi so admired soldiers. The authors write:

Nonviolent action is like violent combat in at least two ways. It does not succeed automatically, and it does not operate mysteriously--it works by identifying an opponent's vulnerabilities and taking away his ability to maintain control. If a regime intends to remain in power indefinitely, it will require extensive, long-term interaction with those it rules--and that creates a dilemma: the broader the regime's system of control, the more vulnerable it is, because it depends on too many actors to ensure that violence against resisters will always work. Once an opposition shows its followers that this weakness exists, it can begin to pry loose the support that the regime requires--its revenue, its foreign investments, or even its military.... Victory is not a function of fate; it is earned.

Tolstoy described pacifists similarly: "For us to struggle, the forces being so unequal, must appear insane. But if we consider our opponent's means of strife and our own, it is not our intention to fight that will seem absurd, but that the thing we mean to fight will still exist. They have millions of money and millions of obedient soldiers; we have only one thing, but that is the most powerful thing in the world--Truth."

Peter Ackerman, formerly a visiting scholar at the International Institute of Strategic Studies, and Jack DuVall, who has worked in television and as a political speechwriter, also collaborated, along with producer Steve York, on a three-hour PBS documentary of the same title that aired last September. The film quotes a postwar historian summarizing the Danish resistance to the Nazis--the strikes, work slowdowns, hiding or helping of Jews, and refusal to obey orders to disperse: "Denmark had not won the war but neither had it been defeated or destroyed. Most Danes had not been brutalized, by the Germans or each other. Nonviolent resistance saved the country and contributed more to the Allied victory than Danish arms ever could have done."

A Chilean leader said of the organized resistance against Pinochet in the 1980s and the successful call for fair elections: "We didn't protest with arms. That gave us more power."

Refreshingly, the authors offer compelling observations--almost as sidenotes--about the ineffectiveness of violence. Lech Walesa and Polish strikers taking on the Jaruzelski regime remembered that, except for momentary glee, nothing was accomplished by Polish workers in 1970 and 1976 when they burned down Communist Party buildings. "In the 20th century's armed liberation movements," Ackerman and DuVall write, "portraits of gun-wielding martyrs--the Che Guevaras of the world--were often flaunted as symbols, but none of those struggles produced freedom."

A Force More Powerful will likely stand as a book more powerful than any guts-and-glory war memoirs by generals or gun-toters, or any extolling of military might by one-note historians.

Quite recently yet another of Jasper Becker's indispensable dispatches from China appeared in his newspaper, the Hong Kong-based South China Morning Post. "Every year," Becker reported, "about 10,000 of China's five million coal miners meet gruesome deaths underground." He went on to explain that censorship limits news of industrial accidents, but that conditions have certainly gotten worse in the past two decades, during China's breakneck effort at economic growth.

You have to pause a bit to let the impact of this statistic sink in, especially after realizing that it does not include deaths from other industrial accidents--factory fires, explosions, collapsing buildings--only a fraction of which make it into the pages of the Morning Post. Chinese workingmen and -women are dying at a higher rate than their counterparts in Victorian England or turn-of-the-century America, and, until now, the world has been paying little attention. By contrast, when an explosion at a coal mine in Monongah, West Virginia, killed 361 coal miners in 1907, the single largest such accident in our history, the disaster attracted national coverage.

You won't see much of Jasper Becker's kind of reporting in the American mainstream press. Over the past decade or so, American journalists, along with their ideological elder brothers at The Economist, have focused on the booming Chinese coastal cities, glorifying young entrepreneurial yuppies with cell phones and marveling at the construction burst of shopping plazas, office towers and upscale housing.

Slightly more conscientious reporters may mention, in passing, sweatshops and pollution, but they imply that these are the unfortunate and temporary byproducts of "economic reform," a phrase normally presented without the quotation marks, suggesting a self-evident good instead of a controversial set of economic policies. This uncritical attitude, best described as "market fundamentalism," has taken over much of the US media. Even Paul Krugman, currently the economic columnist at the New York Times and someone smart enough to know better, wrote an article back in 1997 titled "In Praise of Cheap Labor."

Jasper Becker is different. He is British-born but fluent in Chinese, and he has spent the past ten years in China, most recently for the Morning Post, skeptically tramping into areas of the country and listening to people most other Western journalists disregard. His years of work (some of it is also available on the Internet, at www.scmp.com) provide two tremendous services. First, he introduces us to Chinese people we would never otherwise meet. Second, he raises profound doubts about a core belief of market fundamentalism: that Chinese suffering today will be justified by a developed nirvana in the future.

Becker is not inspired by any nostalgia for the now-departed Maoist era; his previous book, Hungry Ghosts (1998), was a powerful account of how the Great Helmsman's arrogance and the undemocratic Chinese Communist system caused more than 30 million people to die in a man-made famine during the disastrous Great Leap Forward (1958-61).

In The Chinese, Becker continues to care about the impact of economic policy on the lives of ordinary people. His book's very structure proves his determination to look beyond the minority of the newly prosperous, the people the Western market fundamentalists and investors find most photogenic. He starts at the bottom, in a village in the Guangxi region with some of the poorest of the 1 billion peasants, and then slowly moves up through the increasingly stratified Chinese society. Local officials shadow and harass him on his visits to the rural poor, who he says "probably constitute the largest unenfranchised group in the world." He goes on: "Forbidden openly to organize themselves to defend their interests against either the central state or local despots, they form secret underground armies, cults and millenarian sects as they have done throughout history. The state seems involved in a continual battle to crush them, and from time to time faint reports of this repression...reach the outside world." He speculates, "Given the chance, peasants would quickly organize themselves into associations or even political parties but at present that seems a remote prospect."

Becker also writes about the several hundred million migrant workers, third-class citizens who flock into the cities for low-paid, dirty work and who have no permanent right to stay or to bring their families, a state of affairs that would be depressingly familiar to black southern Africans. He visits the collapsing old industrial cities of the northeast, with their millions of angry, sullen unemployed. Becker profiles Chinese intellectuals, demoralized by repression after the 1989 Tiananmen uprising and the consequent "depoliticization of so many aspects of life." He explains: "Censors searching for subversive messages have examined everything from slogans on T-shirts to poetry magazines. The propaganda machinery has returned to its traditions." His grim conclusion is that "intellectuals have tried but generally failed to find some independent space within the system."

Becker describes the Communist Party, with its 58 million members, as a privileged minority in a country that now has nearly 1.3 billion people, but he estimates that "the real size of the ruling elite, from county magistrates upwards, is thought to be no more than 4 million." Only toward the end of his quest does he reach what he calls "the apex of the pyramid, the tiny group of self-selecting rulers."

Becker is extraordinarily cautious and measured. He points out that many of the precise-sounding government statistics, including the glowing economic growth figures, are either exaggerated or "simply made up to suit the propaganda needs of the day." Still, his years of experience crisscrossing the giant country have earned him the right to make certain observations, some of which may surprise even experienced China watchers:

§ Inequality in China is widening dramatically. Becker reminds us that the average annual peasant income in China is still only $240, a figure that has actually fallen in the past few years. The growing gap is distorting the Chinese economy; Becker points out that "much of the considerable investment in new housing was aimed at the very top end of the market despite a pressing need for low-cost housing."

§ Health and education for most Chinese are deteriorating. This discovery is perhaps Becker's most alarming. The decline is a disheartening contrast with the Maoist era, which despite its crimes and excesses brought significant progress. Becker even speculates that the Falun Gong religious cult, which continues to suffer vicious state repression, attracts adherents partly because "its leader, Li Hongzhi, promised his followers that if they adopted his system of exercise they need never take medicines or go to [the] hospital for treatment."

§ China today is governed by a kind of lawless authoritarianism. Local party bureaucrats, most apparently drained of any revolutionary idealism, wield unchecked power, arbitrarily imposing hundreds of different kinds of taxes on the rural poor. Quite logically, corruption flourishes.

At the upper levels, "princelings," the offspring of high party officials, commandeer what was once state property for personal gain and, in league with foreign (often overseas Chinese) investors, dominate vast segments of the economy, accompanied by corruption on a grand scale. Becker reports one particularly ominous development: some of the elite--you cannot call them "new" because many are the actual biological heirs of the old rulers--seem to be stashing billions outside China, a variant of Latin American- or African-style rapacity.

Such capital flight is a dangerous break with the East Asian pattern. In places like South Korea and Taiwan, the new industrialists did prosper, but they were required to keep the gains inside their countries, to reinvest in productive growth. Also, in neither place did the expanding economy widen inequality.

§ China's success at exporting from its coastal enclaves may be exaggerated. The inhuman conditions in these sweatshops are slowly becoming known, thanks to courageous Chinese activists and to solidarity movements overseas, but it may still be a surprise to learn that for the mostly female workers "talking is usually forbidden. To go to the toilet or drink a glass of water requires a permission card. Sexual harassment is common and punishments can involve beating, confinement or cancellation of wages."

Becker once again provides a fresh look, by raising serious doubts about the purely economic benefit of all this repression. He points out that the export zones are subsidized by the rest of the economy and that some of the apparent growth is in fact a speculative bubble (the kind of feverish phenomenon that Internet investors in the West have just painfully learned about).

§ China's security apparatus is actually expanding. Becker's revelation comes as something of a surprise, because the surface of Chinese life looks more relaxed after the monochrome, bleak thought control of the Mao Zedong period. But as Columbia University scholar Andrew Nathan explains in his valuable introduction to the recently published Tiananmen Papers, "To be sure, [the regime] has diminished the range of social activities it purports to control in comparison to the totalitarian ambitions of its Maoist years. It...no longer aspires to change human nature. It has learned that many arenas of freedom are unthreatening to the monopoly of political power."

So even though people in Beijing and Shanghai now wear Western-style jeans and running shoes, Becker points out that repression continues; arrests--not just political--are arbitrary, torture is routine and the death penalty is applied more frequently. Bill Clinton's 1998 state visit and China's accession to the World Trade Organization were supposed to be liberalizing influences. The market fundamentalists who insist that increased trade somehow automatically improves human rights have some explaining to do.

§ China is--for now--not a strong military threat to Taiwan, despite Beijing's bellicose threats. Contrary to the alarms of Western conservatives, Becker contends that the Chinese Navy could not--yet--mount an invasion across the Taiwan Strait, adding that "Chinese pilots cannot even fly in bad weather because their radar screens are unreliable."

§ Environmental degradation is perhaps the worst threat to China. The water table in the north is dropping; The Chinese includes a photograph of a forlorn figure trying to pump from a shallow pool in what is left of the Yellow River. China's water and air are polluted, and deforestation and loss of topsoil continue. Much of this is no surprise, thanks in large measure to the pioneering work of the scientist Vaclav Smil. Becker stresses that the environmental catastrophe is linked to China's lack of genuine democracy: "Since almost everything the state says is untrue, and most information is kept secret, there is no real trust or co-operation between its officials and the rest of the population."

Becker points out humorously that even Chinese weathermen have lied: "In 1999 it emerged that meteorologists had for fifty years been under orders never to report that the temperature had risen above 37 degrees centigrade (98.6 degrees Fahrenheit), although why no one would explain. Perhaps such an admission was seen as discrediting the Party."

China's environmental failure illustrates one of Becker's most important conclusions: Human rights, democracy and government accountability are not luxuries, worthy ideals to be set aside until economic growth is achieved. In fact, genuine, broad-based, environmentally friendly growth will not happen until there is respect for human rights. Spasmodic government exhortations will not reduce corruption at either high or local levels, but a vigorous free press, freedom to speak out and genuine multiparty elections are the only hope. Continuing government lying will not heal the environment, but independent ecology movements can help, as they have already demonstrated in neighboring Taiwan.

Becker is cautious about the prospects for change. He does recognize that "some foreign and domestic observers have predicted that such an explosive mixture of corruption, poverty and unemployment in China must one day result in a rural revolt. Perhaps." But he also points out that the Chinese bureaucratic state can trace its ancestry continuously back to the first Emperor in the third century BC, making it "probably the oldest functioning organization in the world," one that has been "exercising a tighter grip over its subjects than any other comparable government in the last two millennia."

The market fundamentalists of course assume that change will come as the automatic consequence of economic growth. They have done little to help this evolutionary process along. The great Chinese democratic dissident Wei Jingsheng, released in 1997 after eighteen years in Chinese prison camps, sits in exile in the United States, just about ignored by the mainstream media. Alexander Solzhenitsyn was a household word at a similar stage in his career.

Market fundamentalists disregard China's terrible level of industrial accidents and the decline in health and education. They are uncomfortable about Chinese sweatshops, air and water pollution, and corruption, but they justify these ills as an inevitable part of growth, looking back with a kind of misty glow at Dickens's England and post-Civil War America for reassurance that "we" came through the hard times, and so will the Chinese. Jasper Becker's remarkable book ought to raise crushing doubts.

But let us just imagine for a moment that the market fundamentalists are right. Their point of view is still immoral. They are implicitly suggesting that some people, those who gain from the unjust international economic order, have the right to impose suffering on other people, in the name of some ultimate goal. No one asked those 10,000 Chinese coal miners who died last year, men and certainly at least some women, whether they wanted to sacrifice for a greater future. No one allowed them to vote for people who might have protected their rights; no one permitted them to form independent labor unions. Apparently, globalization does not yet mean that members of America's United Mine Workers and other overseas unions can openly visit their Chinese colleagues and share their experience in fighting for safer workplaces. But at least, thanks to Jasper Becker, we are becoming aware that the Chinese miners exist, so they are no longer dying in total silence.

Jason Epstein's Book Business: Publishing Past, Present, and Future is the third memoir of a major American life in book publishing to reach print in less than two years. It is at once a sign that the guard is changing and a recognition that the business has already changed. It is also, in the case of the 72-year-old Epstein, an opportunity to gaze into the crystal ball to predict the changes to be, something he has been rather good at during the course of his long career.

Simon & Schuster's Michael Korda got the triumvirate rolling in 1999 with Another Life, gossipy and entertaining and novelistic, like the books Korda often publishes. The New Press's André Schiffrin--famously ousted from Random House's Pantheon Books, the once independent imprint his father started--followed suit more recently with The Business of Books, the kind of polemic he has sometimes featured on his list [see Daniel Simon, "Keepers of the Word," December 25, 2000].

It's not surprising, then, that the tone pervading Epstein's memoir--which began with a series of lectures he gave at the New York Public Library, formed two essays in The New York Review of Books and was coaxed into a book by Norton president Drake McFeely--is cool and elegant and full of the gravitas of a man who wanted to be a great writer and instead ended up publishing many such, Morrison and Mailer and Doctorow among them.

He arrived at Random House in 1958, having deemed it time to leave Doubleday when he was prevented from publishing Lolita there. While at Doubleday he had founded Anchor Books and with it the trade paperback format in America. He retired as Random's editorial director in 1998, and during the four decades in between started the Library of America, a unified series of reprints of great American literature; The Reader's Catalog, a kind of print precursor to Amazon; and The New York Review of Books. He had a reputation as a brilliant editor but went beyond that to envisage change and make it happen, and in the process made himself into a pillar of the New York intellectual establishment.

"If I have any regrets, I can't think what they are," he declared during an interview recently, sipping homemade espresso at his large kitchen table in an opulent downtown apartment that could double as the upscale set for one of Woody Allen's Manhattan tales. He still edits authors he's been associated with but now does it from home. He prefers to be based there rather than in the Random corporate offices, wishes to put space between himself and an "increasingly distressed industry" mired in "severe structural problems." Prominent among them are a chain-driven bookselling system that favors "brand name" authors and often returns other new books to their publishers after only a few weeks on the shelves, before the titles have a chance to establish themselves; and a bestseller-driven system of high royalty advances that often do not earn back the money invested, a system that ratchets up unrealistically high sales expectations for new titles overall, and in so doing makes it increasingly difficult to publish certain kinds of books.

One-third of the way through his slim text, Epstein writes that his career has demonstrated an "ambivalence toward innovation." Ambivalence also pervades this elegiac book. Perhaps it is inevitable when a man looks back to his youth and forward to a future in which he will not play a major part, even if he is hopeful about that future. Perhaps, too, it is inevitable when confronting the distress signals of an industry he has spent his life in and clearly loves. Epstein shares his visions of a publishing future liberated electronically, but that future is rooted in a deep-seated nostalgia, a longing for what was. His book seems to predict that technology in the form of the Internet will restore to the book business a certain lost rightness from the past.

His first chapter, like Dickens's Christmas tale, moves back and forth among past, present and future in an attempt to limn the larger changes of the past fifty years and what may yet unfold. The rest of the book is chronologically structured. It follows Epstein's career and the transformation of publishing from primarily small-scale, owner-operated enterprises rooted in the 1920s "golden age" of Liveright and Knopf to the "media empires" of today, which are forced to operate within an "overconcentrated," "undifferentiated" and fatally "rigid" bookselling structure. Now, he says, "there can't be Liverights or Cerfs because the context is so different. Roger Straus is the very last of them," and even he has sold his company to the German firm von Holtzbrinck.

Publishing must return to being "a much smaller business again," Epstein is convinced. "It has to, it's a craft and can't be industrialized any more than writing can. It's about to undergo a huge structural shift and there's nothing the conglomerates can do about it. The marketplace has shifted out from under them: the system of big money bestsellers defeats the possibility of building a sustained backlist. And without a sustained backlist, publishing cannot function in the long term. Providentially, just as the industry was falling into terminal decadence, electronic publishing has come along."

Epstein is in no way predicting the demise of print. Rather, his future is predicated on a kind of universal electronic Reader's Catalog, "much like Amazon" but far beyond it, "multilingual, multinational, and responsibly annotated. People will access it on their computers at home, in the office, and in kiosks like ATMs. It will be possible to browse those books, and downloading technology will eventually make it possible to buy them. They won't exist in print until they're actually bought.

"There is no room on the Internet for middlemen, who sell the same product as their competitors, competing on the basis of price and service, and in so doing eat up their margins." Epstein is of course speaking of the Amazons and B&N.coms of today. "I think Amazon can't be here that much longer," says the man who sat at this same kitchen table doling out advice to its CEO, Jeff Bezos, a few years back.

As for brick-and-mortar stores, "the chains aren't tenable, either. They never were. The superstores have become what the old mall stores were. There are far too many of them, Waldens with coffee bars, and they will shrink. Stores run by people who love running bookstores will arise spontaneously like mushrooms and find a way to stay in business once the chains begin to recede."

And the conglomerate publishers? "I think they can show some financial progress for some years by cutting costs and cutting out redundancies, but eventually they'll find themselves with expensive traditional facilities that are increasingly irrelevant. They'll have to offload many functions on to specialist firms. In the end, they in turn will look for a buyer if they can find one. They should have noticed that the previous owners were all too happy to sell."

Meanwhile, authors will have found a way to bypass their publishers by going directly to the web. People will start independent authors' websites. Books will be much cheaper. Authors will have a much larger share of the revenue.

Stephen King has already gained notoriety for trying to do just that. But the spectacular starting bang of Riding the Bullet, done in conjunction with his publisher, Simon & Schuster, faded when he tried to serialize a novel, The Plant, online on his own. A downturn in paying customers for the later chapters led King to abandon the project. Asked about this, Epstein insists, "It's like the days of the early cars that ran off the road into the mud. People said cars would never work. Well, one of these days e-publishing will work."

Of other experiments now being tried Epstein is openly dismissive, and he sees a kind of Darwinian process filtering chaff from grain. Mighty Words and similar online publishers "don't know what a book is," he contends. "But people know what a book is. Human beings are designed to distinguish value, and in my opinion that problem will take care of itself."

He disregards the tremors that have gone through the publishing houses ever since B&N.com announced it was getting into the business of publishing books. Barnes & Noble Digital was formed the first week in January to compete with the new electronic subsidiaries of traditional publishers, which are bringing out digital versions of new titles readable on PCs or dedicated devices, as well as original works specifically created for electronic distribution. In addition, they are digitizing backlist and out-of-print books that can be reprinted in very small quantities in a process known as print-on-demand. "It's yet another premature entry," says Epstein. "B&N's publishing experience is limited to a remainder operation. That's entirely different from bringing out original works."

Even as Epstein criticizes the proverbial naysayers who laughed at those early cars stuck in the mud, he cautions, "I don't think an author who has worked hard to create something of value will want to risk it in the electronic format at this point." He says bookstores will wind up selling new titles at much lower prices than is now the case, $10 or so, but "can't figure out" how that will be done in the black. His predictions are compelling, but they are also much too vague--he sets out, for instance, no time frame or actual mechanics for what he believes will transpire.

The bloat of the superstores is something publishers have worried about for years, almost from their rollout. This holiday season's flat sales at the three biggest chains; the margin-slashing of Amazon; and the re-energizing of the independent stores through a marketing program called Booksense, which includes web-based retailing, all serve to illustrate Epstein's points. Borders went so far as to put itself on the block, but found no willing takers. Recent murmurs about B&N's CEO Len Riggio entertaining a buyout offer from media conglomerate Gemstar-TV Guide International, which has aggressively entered the e-book technology market, did not result in a deal but also were more than simple gossip.

The past twenty years have seen the RCAs, MCAs, Advance Publications and the like learn their lessons and abandon book publishing, as Epstein has noted. Other conglomerates have already tried to offload their publishing components and in time will try again. But it also can't be ignored that companies like the German-based Bertelsmann (which acquired Bantam, Doubleday Dell and Random House and consolidated them) and von Holtzbrinck (which has bought Holt, St. Martin's and Farrar, Straus & Giroux) have their roots in the book business itself. They are therefore not as likely to exit the scene as Epstein would have us believe.

Undoubtedly, many of Epstein's electronic dreams are prescient and will one day come to pass. The companies that first turn them into reality, though, will likely be turning out works in the professional, scholarly, reference and educational sectors rather than in the trade world he knows so well. But although the Internet will change book publishing profoundly and in ways even Jason Epstein can't predict, other forces are at work as well and shouldn't be ignored.

A couple of years ago a brilliant and rich entrepreneur who also happens to be a profoundly bookish man devised a model, not unlike Epstein's nostalgic vision, of devolved companies publishing real books that share a central financial source. It is called the Perseus Group. It is still in its early days, far too soon to know whether it will last. But Epstein's longing for a more civilized, human-scale publishing business is shared by many. The Internet may help bring it about, but it won't do everything.

The following debate is adapted from a forum put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Study in Princeton, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.

John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.

In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with its people convinced that they know best? At Basic Books' fiftieth anniversary, it's a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.

What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-rehearsed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.

Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses tens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. Selling out has more to do with ticket grosses than with the antimaterialist who stands apart from society.

How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we reconcile the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.

Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?

Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.

They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.

In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable. It's journalistic, it's superficial. There's an equation between profundity and originality.

My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.

So my argument--and one of the working titles of my book--was, in fact, "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really taking a generational approach, and the generations, in fact, are in decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization do absorb and suck away too much talent, and that there are too few who are bucking the trends.

Donatich: Maybe the term "public intellectual" begs the question, "Who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?

Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual, as I understand that vocation, to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. I think perhaps too many White House dinners can blunt the edge of criticism.

A way I like to put it is that when you're thinking about models for this activity, you might put it this way: Sartre or Camus? An intellectual who was willing to look the other way and, indeed, shamefully, to explain away the existence of slave-labor camps, the gulags, in the service of a grand world-historic purpose, or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and of capitalistic oppressors out to grind the faces of the poor.

There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.

Let me mention just one issue that I took on several times when I alternated a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.

At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three biological ethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype, but if you say that, you're an alarmist--so that's what I am.

This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.

So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.

When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.

A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and depoliticization, more generally.

I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.

Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?

Stephen Carter: You know that in the academy the really bad word is "popularizer"-- a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.

A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.

And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think one of the reasons, if the craft of being intellectual in the sense of the scholar who speaks to a large public is in decline, is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.

One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.

You can scarcely read a lot of scholarship today without first having to wade through several chapters of laying out the ground in the sense of apologizing in advance to all the constituencies that may be offended, lest one be thought in the other camp. That kind of intellectual activity is not only dangerous, it's unworthy in an important sense, it's not worthy of the great traditions of intellectual thought.

There's a tendency sometimes to have an uneasy equation that there is serious intellectual activity over here, and religion over there, and these are, in some sense, at war. That people of deep faith are plainly anti-intellectual and serious intellectuals are plainly antireligious bigots--they're two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.

I was asked by a journalist a few years ago why was it that I was comfortable identifying myself, and often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is, there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.

And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.

Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.

America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.

In this century too, we have great intellectual preachers who also spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though sometimes, people try to make a straitjacket intellectual of him by insisting, with no evidence whatsoever, that he actually was simply making secular moral arguments, and that religion was kind of a smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.

And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.

Now, I'm not being ahistorical, I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudices that assume they can't be making serious arguments until they are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.

Donatich: Professor Carter has made a career out of illustrating the effect and protecting the right of religious conviction in public thought. Herbert Gans, on the other hand, is a self-pronounced, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory and yet remain relevant, contribute some practical utility to the public discourse?

Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.

So it seems to me it's just the opposite: that the public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. And I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And I think at least on my campus, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes and then it passes. But I think that really is happening, and if celebrities can exist, their numbers will increase.

One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.

Some of the columnists in the newspapers and the tabloid press, though they are not journalists with a PhD, are public intellectuals. There are pundits who are satirical commentators; a significant number of people get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.

Most public intellectuals function as quote-suppliers to legitimize the media. Two or three times a week, I get called by journalists and asked whether I will deliver myself of a sociological quote to accompany the article, to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while there are few public intellectuals who are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.

I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.

The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.

But the disciplinary public intellectuals show that their disciplinary insights and skills can add something original to the public debate--that, in other words, social scientists and humanists can indeed grapple with the issues and problems of the real world. The disciplinary public intellectuals, like other public intellectuals, have to write in clear English. This is a rarity in the academy, unfortunately--which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which is important in one sense, because we all live off public funds, directly or indirectly, and we need to be able to show every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal as proof that we're doing something useful or proof that we're entitled to some share of the public budget.

Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.

There are a number of hypotheses on this; I'm not sure any of them are true--whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, they think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)

It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.

Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?

Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to fall behind us--is that it really distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was kind of a news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the other really progressive and important things that were happening because of the rise of the web.

I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."

I think that I came to the web and to starting Feed, and to writing the book that I wrote about the Internet culture and interface culture, as a kind of a refugee from conversations like one in the academy, when I was a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute, convoluted Derridean question on his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."

The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.

So the good news, I think, is that my experience is not unique. In fact, there's been a great renaissance in the last five years of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but not confined to it. And I think publications like Feed--to pat myself on the back--Hermenaut and Suck are all good examples of a lively new form of public intellectualism that is not academic in tone.

The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno never was. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model that really stands out for this generation, it's somebody like Walter Benjamin. You know, a sensibility that puts together groups of things you wouldn't necessarily expect to see in the same essay.

How does the web figure into all of this? Why did these people show up on the web? I think one of the things that started happening--actually, this is just starting to happen--is that in addition to these new publications, you're starting to see something on the web that is unique to it: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio--except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit this as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place, you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.

The web gives you a way of rounding all those diverse kinds of experiences and ideas--and linking to them. Because, of course, the web is finally all about linking--in a way that I think nothing has done quite as well before it. And it also involves a commitment to real engagement with your audience that perhaps public intellectuals have talked a lot about in the past, but maybe not lived up to as much as they could have.

Some of this is found in the new formats that are available online in terms of how public dialogue can happen. I'm sure many of you have read these and many of you may have actually participated in them, but I'm a great advocate for this kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat, which is a disaster in terms of quality of discourse, which inevitably devolves into the "What are you wearing" kind of intellectual questions. But rather, the conversations with four or five people where each person has a day or half a day to think up their responses, and then write in 500- to 1,000-word posts. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum. It's very engaged, it's very responsible, it's very dialogic and yet also lively in a conversational way. But, because of the back and forth, you actually can get to places that you sometimes couldn't get in a stand-alone 10,000-word essay.

Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."

Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more of people who are in this kind of space, having pages that are devoted to themselves and carrying on these conversations all the time with people who are coming by and engaging with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish either your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story that I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months, I guess, after Feed launched, Wired came out with a review of it. And they had this one slightly snippy line that said, "It's good to see the East Coast literary establishment finally get online." Which is very funny, to be publishing this thing out of our respective apartments. I had this moment where I was looking around my bedroom for the East Coast literary establishment--you open the closet door, and "Oh, Norman Mailer is in there. 'Hey, how's it going!'" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.

Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?

Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.

I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.

The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown, Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense and bipartisanship have been regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."

Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.

Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.

These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.

The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.

The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.

I find this absolutely extraordinary. You're told you must pick one of the available ones: "We've got you some candidates--what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. And he said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.

Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.

The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."

I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.

I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.

At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation. And what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appear before us on the day that the DNA string was finally traced out to its end, and we're told in their voices, and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. For example, in The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.

So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, in place of the disorder and anxiety and struggle that are required in order to claim that the mind has any place in these things at all.

I would say that, because the intellectual has some responsibility, so to speak, for those who have no voice, a very high task to adopt now would be to set oneself, and to attempt to set others, utterly and contemptuously and critically and furiously, against the now almost daily practice in the United States of human sacrifice. By which I mean the sacrifice, the immolation, of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.

People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."
