The following debate is adapted from a forum put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Study in Princeton, New Jersey, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.

John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.

In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with those of its people who are convinced that they know best? Basic Books' fiftieth anniversary is a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.

What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-rehearsed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.

Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses tens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. "Selling out" now has more to do with ticket grosses than with the antimaterialist who stands apart from society.

How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we reconcile the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.

Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?

Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.

They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.

In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable. It's journalistic, it's superficial. There's an assumed equation between obscurity and profundity.

My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.

So my argument, and one of the working titles of my book, was, in fact, "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really taking a generational approach, and by that account the public intellectual is, in fact, in decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization do absorb and suck away too much talent, and that there are too few who are bucking the trends.

Donatich: Maybe the term "public intellectual" raises the question, "Who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?

Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual, as I understand that vocation, to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. I think, perhaps, that too many White House dinners can blunt the edge of criticism.

When you're thinking about models for this activity, I like to put it this way: Sartre or Camus? An intellectual who was willing to look the other way and, indeed, shamefully explain away the existence of slave-labor camps, the gulags, in the service of a grand world-historic purpose, or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and capitalistic oppressors out to grind the faces of the poor?

There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.

Let me mention just one issue that I took on several times when I alternated a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.

At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three biological ethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype, but if you say that, you're an alarmist--so that's what I am.

This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.

So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.

When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.

A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and, more generally, depoliticization.

I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.

Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?

Stephen Carter: You know that in the academy the really bad word is "popularizer"-- a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.

A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.

And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think that if the craft of the intellectual, in the sense of the scholar who speaks to a large public, is in decline, one of the reasons is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.

One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.

You can scarcely read a lot of scholarship today without first having to wade through several chapters of laying out the ground in the sense of apologizing in advance to all the constituencies that may be offended, lest one be thought to be in the other camp. That kind of intellectual activity is not only dangerous, it is also unworthy, in an important sense, of the great traditions of intellectual thought.

There's a tendency sometimes to assume an uneasy opposition: serious intellectual activity over here, religion over there, and the two, in some sense, at war. That people of deep faith are plainly anti-intellectual and that serious intellectuals are plainly antireligious bigots--these are two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.

I was asked by a journalist a few years ago why was it that I was comfortable identifying myself, and often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is, there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.

And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.

Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.

America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.

In this century too, we have had great intellectual preachers who also spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though sometimes people try to put him in an intellectual straitjacket by insisting, with no evidence whatsoever, that he was actually simply making secular moral arguments, and that religion was a kind of smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.

And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.

Now, I'm not being ahistorical, I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudices that assume they can't be making serious arguments until they are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.

Donatich: Professor Carter has made a career out of illustrating the effect of religious conviction on public thought and protecting the right to express it. Herbert Gans, on the other hand, is a self-proclaimed, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory with the need to remain relevant and contribute some practical utility to the public discourse?

Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.

So it seems to me it's just the opposite: that the public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. And I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And I think at least on my campus, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes and then it passes. But I think that really is happening, and if celebrities can exist, their numbers will increase.

One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.

Some of the columnists in the newspapers and the tabloid press, journalists without PhDs, are public intellectuals. There are pundits who are satirical commentators, and there are a significant number of people who get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.

Most public intellectuals function as quote-suppliers to legitimize the media. Two or three times a week, I get called by a journalist and asked whether I will deliver myself of a sociological quote to accompany his or her article, to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while there are a few public intellectuals who are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.

I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.

The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.

But the disciplinary public intellectuals show that their disciplinary insights and their skills can add something original to the public debate. That, in other words, social scientists and humanists can indeed grapple with the issues and the problems of the real world. The disciplinary public intellectuals, like other public intellectuals, have to write in clear English. This is a rarity in the academy, unfortunately--which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which is important in one sense, because we all live off public funds, directly or indirectly, and we need to be able to show every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal as proof that we're doing something useful or proof that we're entitled to some share of the public budget.

Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.

There are a number of hypotheses on this; I'm not sure any of them are true: whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, they think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)

It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.

Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?

Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to fall behind us--is that it really distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was kind of a news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the other really progressive and important things that were happening because of the rise of the web.

I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."

I think that I came to the web and to starting Feed, and to writing the book that I wrote about the Internet culture and interface culture, as a kind of a refugee from conversations like one in the academy, when I was a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute, convoluted Derridean question on his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."

The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.

So the good news, I think, is that my experience is not unique. In fact, there's been a great renaissance in the last five years of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but is not limited to that. And I think publications like Feed--to pat myself on the back--along with Hermenaut and Suck, are all good examples of a lively new form of public intellectualism that is not academic in tone.

The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno was never very funny. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model that really stands out for this generation, it's somebody like Walter Benjamin. You know, a kind of interest that puts together groups of things you wouldn't necessarily expect to see put together in the same essay.

How does the web figure into all of this? Why did these people show up on the web? I think one of the things that started happening--actually, this is just starting to happen--is that in addition to these new publications, you're starting to see something on the web that is unique to it: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio. Except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit this as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place and you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.

The web gives you a way of rounding up all those diverse kinds of experiences and ideas--and linking to them. Because, of course, the web is finally all about linking--in a way that I think nothing before it has done quite as well. And it also involves a commitment to real engagement with your audience, something that perhaps public intellectuals have talked a lot about in the past but maybe not lived up to as much as they could have.

Some of this is found in the new formats that are available online in terms of how public dialogue can happen. I'm sure many of you have read these and many of you may have actually participated in them, but I'm a great advocate for this kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat, which is a disaster in terms of quality of discourse, which inevitably devolves into the "What are you wearing" kind of intellectual questions. But rather, the conversations with four or five people where each person has a day or half a day to think up their responses, and then write 500- to 1,000-word posts. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum. It's very engaged, it's very responsible, it's very dialogic and yet also lively in a conversational way. But, because of the back and forth, you actually can get to places that you sometimes couldn't get to in a stand-alone 10,000-word essay.

Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."

Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more of people who are in this kind of space, having pages that are devoted to themselves and carrying on these conversations all the time with people who are coming by and engaging with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish either your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story that I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months, I guess, after Feed launched, Wired came out with a review of it. And they had this one slightly snippy line that said, "It's good to see the East Coast literary establishment finally get online." Which is very funny, to be publishing this thing out of our respective apartments. I had this moment where I was looking around my bedroom for the East Coast literary establishment--you open the closet door, and "Oh, Norman Mailer is in there. 'Hey, how's it going!'" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.

Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?

Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.

I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.

The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown of Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense and bipartisanship have again been regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."

Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.

Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.

These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.

The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.

The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.

I find this absolutely extraordinary. You're told you must pick one of the available ones: "We've got you some candidates, what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. And he said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.

Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.

The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."

I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.

I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.

At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation. And what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appear before us on the day that the DNA string was finally traced out to its end, and we're told in their voices, and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. For example, in The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.

So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, in place of the disorder and anxiety and struggle that are required in order to claim that the mind has any place in these things at all.

I would say that, because the intellectual has some responsibility, so to speak, for those who have no voice, a very high task to adopt now would be to set oneself, and to attempt to set others, utterly and contemptuously and critically and furiously, against the now almost daily practice in the United States of human sacrifice. By which I mean the sacrifice, the immolation, of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.

People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."

Modern Russian history, as taught by Clinton Administration spin doctors and Op-Ed pundits, holds that Boris Yeltsin dismembered the Soviet Union and set Russia on a historic path to democracy and a market economy. The Russians were eager to follow their first "duly elected" leader. They idolized the West and they willingly surrendered their values and their dreams--at least the "new Russians" did, a term that apparently is confined to a segment of the newly rich Muscovites. Year after year we were told that Yeltsin's reforms were changing the face of his land--witness the number of Mercedes and the evidence of breathtaking conspicuous consumption. A few years ago, shortly after I checked in to one of the new luxury hotels in Moscow, I was told that each Friday I could avail myself of fresh lobster flown in from Canada that very day! Western experts advising Yeltsin's "young reformers" on how to proceed were optimistic. I was given a stern lecture by one of them, an economist from Sweden, for suggesting that conditions in the country appeared catastrophic in comparison to the days of Communist rule.

The Yeltsin legend took hold; he was Clinton's icon for a new Russia. From the moment he stood on a tank in August 1991 to face down an attempted Communist coup, Yeltsin was championed by the West as Russia's great hope. He was an appealing figure, athletic, always neatly dressed. He publicly boasted of his friendships with Bill Clinton and other Western politicians. He was a man to do business with, the Kremlin leader whose government was no longer a threat, whose human failings were on display for all to see. Who could forget Clinton's uproarious laughter as he tried to defuse Yeltsin's drunken diatribes during their summit at the Roosevelt museum in Hyde Park? Or the inebriated Yeltsin snatching the baton from the conductor of the Berlin Police Band and proceeding to conduct it himself? Little attention was given to Yeltsin's tanks pounding his "duly elected" Parliament or to his policy in Chechnya. The Clinton Administration publicly encouraged Yeltsin to disband the Parliament because a solid majority of deputies wanted to pursue reforms more slowly. Several months before he actually moved against the legislature, a senior US official told the New York Times that "if Yeltsin suspends an antidemocratic parliament, it is not necessarily an antidemocratic act." Later, while Russian planes, tanks and artillery rained death on the Chechen capital of Grozny, Clinton saw fit to compare Yeltsin to Abraham Lincoln. Even when Yeltsin's entire economic reform program came crashing down in 1998, Vice President Gore voiced the opinion that "optimism prevails universally among those who are familiar with what is going on in Russia."

In short, the Clinton Administration hitched its Russia policy to Yeltsin's fortunes. Yeltsin's critics in Russia were dismissed as "dark forces" seeking Communist restoration or worse. There is a simple explanation. Heavily dependent on Western loans and subsidies, Yeltsin was always prepared to render services to Washington, provided he was handled with great sensitivity and accorded even greater public respect. He proved accommodating in Bosnia and again in Kosovo.

But for most Russians, Yeltsin's rule was a social and economic disaster. They viewed him--not without good reason--as being completely dependent on Washington, where the US Treasury and the International Monetary Fund are located. These institutions were a primary influence on his behavior and the often violent and self-destructive course he followed.

When he suddenly resigned on New Year's Eve a year ago, Yeltsin left his successor an impoverished state with few features of democracy and many more of authoritarianism. It is too early to assess properly the true meaning and consequences of his rule. But figures indicate that it wreaked far greater damage on Russia's economy than the Nazi invasion and World War II. Russia's gross domestic product between 1991 and 1998 declined by 43 percent, compared with the Soviet GDP decline of 24 percent from 1941 to 1945.

Behind this figure lurks a dramatic decline in living standards. An estimated 40 percent of the population lives in poverty--a tenfold increase since the collapse of Communism. Yeltsin's policies have had a catastrophic effect on health, education and social programs. Rising infant mortality, declining life expectancy and spreading infectious diseases have produced a negative population growth that is obscured in part by the steady stream of ethnic Russians returning from the outlying parts of the former Soviet Union (by 1995, Russia's population had declined by some 2 million). Agriculture remains comatose--Russia today imports 55 percent of its food supply. Officially, unemployment is about 12 percent, but the real figure is estimated to be between 20 and 25 percent (there are about 11 million Russians of working age who are listed as "missing"). The average daily food intake today is 2,100 calories, less than the minimum recommended by the World Health Organization; in the 1980-85 period, the average intake was 3,400.

None of these troubling issues are to be found in Yeltsin's Midnight Diaries. Memoir writers of course want to present themselves in the best possible light, and Yeltsin is no exception. He portrays himself as the leader who set Russia on a new course, gave it political stability and secured a peaceful transfer of power. Under his leadership Russia has joined the exclusive club of the eight most advanced industrial nations in the world.

What seems most remarkable about this Panglossian version of one of the most turbulent decades in Russia's history is its tenuous relation to reality. The disastrous reform program and the failure to introduce the rule of law, to the extent that they are touched upon at all in this book, are presented with serene detachment--Yeltsin writes about such things as though they had nothing to do with him.

On the other hand, Yeltsin wants us to believe that he had everything to do with his memoir, that he wrote it himself "in fragments over the years...late at night or early in the morning." It is widely known that it was ghostwritten by Valentin Yumashev, a former journalist and Yeltsin's longtime personal aide, with daughter Tatyana being the final censor. Only in passing--in Chapter 13--does Yeltsin mention that Yumashev worked with him on the manuscript. Other bits of contrived candor are sprinkled sparsely around with the apparent aim of defusing--with a sentence or two--some of his well-publicized shortcomings, even his drinking problem. Yeltsin says alcohol was his "only means [of getting] rid of stress"--until his 1995 heart attack. His consumption was afterward reduced to a single glass of wine per day.

The authors of this slapdash tome, which is filled with homilies about duty and patriotism, make Herculean efforts to suggest that Yeltsin possessed mysterious and miraculously effective leadership skills. He liked the "simple, effective" style of leadership and made his decisions with "surgical precision." His stationery was embossed in gold with the presidential seal. His desk was cluttered with "coded telegrams" and "presidential mail." He used his "presidential pen" to sign decrees. By pushing buttons on a presidential "control panel" he could reach his far-flung minions. Metaphors reinforce the image of a supercool superexecutive who is always in control. Sometimes he is the sea captain, steering the Russian ship of state past dangerous reefs and shoals. On another occasion, before making a major announcement, he is like the space expert about to fire a rocket. ("Now it was too late for doubts. The countdown had begun. The bomb was ticking.")

The oddest thing about such details is that they offer the illusion of concreteness while obscuring enormous ambiguities. We don't see Yeltsin making decisions on any substantive domestic issues. There is no evidence of his even being aware of the scope of devastation visited upon the people by his social policy. (Statistics give us an inkling: His government spent only 9 percent of GNP on social services, compared with around 33 percent in the West.)

Yeltsin is certainly not stupid, but when you consider his remarkable energy in fighting for the presidency he seems unaccountably passive in other respects. We don't see him really concerned with the substance of his job. It is difficult to find an economic or social initiative Yeltsin conceived and brought to completion. ("There won't be any inflation," Yeltsin tells the press shortly before prices explode following the collapse of the ruble.) In fact, he reversed the democratization trend initiated by Mikhail Gorbachev. Yeltsin resorted to force to overhaul the entire constitutional order and to create a presidency that suited his needs. According to his own account, his crowning political achievement was Vladimir Putin's election as Russia's president in March 2000 (much of the first and last portions of the book are devoted to this).

There was something Reaganesque about Yeltsin, for his leadership seemed to exist only in his public utterances. But Reagan looks like a giant by comparison, since he held on to a few simple but firm beliefs and surrounded himself with capable aides. Yeltsin seems to be missing a central core belief--"the vision thing." He believed, he said in his final speech (in which he asked forgiveness of the people), that he was moving the country from its totalitarian past "to a bright, prosperous, civilized future." But wasn't that exactly the belief he was supposed to cherish when he served as the Communist Party boss of Sverdlovsk?

There is nothing in this book that appears to qualify Yeltsin for the presidency, with the exception of his prodigious lust for power and a genius for behind-the-scenes, byzantine politics, in which various elites struggle over the reallocation of power and wealth. Yeltsin was not a marionette. Far from it--he made his way up the greasy pole of power and fought constantly to stay at the top. He may have been extraordinarily passive on economic and social matters, but he was a superb bureaucratic infighter--bold, decisive and ruthless. He had no qualms about sacrificing even his most loyal supporters. "It was too bad, really just too bad," he notes after dismissing one of his prime ministers. When he fired his longest-serving prime minister--"faithful, decent, honest, intelligent" Viktor Chernomyrdin--he did it without forewarning because decision-making requires a special approach. "A decision should not wait. With any leakage of information, the decision ceases to be a bold, unexpected move and turns into the opposite." But even though he says firing people caused him "the severest kind of stress," Yeltsin concedes that he "felt an unusual rise in spirits, an enormous wave of optimism." He insists that his perpetual personnel changes were part of a careful and deliberate search for a politician to replace him and continue "on the path of democracy."

On the evidence of his years as president, Yeltsin emerges as confident, surefooted and deeply interested in only one issue: the preservation of his personal power. He is a genius at perpetual conniving. Unlike Reagan, Yeltsin feared competent officials with established reputations. He entrusted great power to younger, inexperienced people without a political base of their own, then dismissed them when things went wrong. The failures were attributed to his revolving-door prime ministers as though they bore exclusive responsibility.

But why put oneself at the mercy of incompetent advisers? Yeltsin reveals his priorities in explaining his reasons for appointing Sergei Kiriyenko, 35, an obscure and inexperienced official, as prime minister: "Everybody needed a new figure, not someone who would lobby for the interests of some against others, not someone from some sort of camp, not someone who had already appeared in Moscow's echelons of power." In short, someone without a history or a political base. During his most severe crisis, in 1998, Yeltsin turned to his foreign minister and perhaps the most experienced man in the government, Yevgeny Primakov. But when Primakov suddenly gained widespread popularity in early 1999, Yeltsin became alarmed. He realized, he writes, that Primakov "was becoming a serious political alternative to my course and my plan for the country's development." Ignoring Primakov's "honesty, decency, and loyalty," Yeltsin swiftly defused the threat by dismissing the prime minister for alleged pro-Communist sympathies. "Primakov had too much red in his political palette," Yeltsin writes.

His final choice was Vladimir Putin, a former KGB lieutenant colonel, who was named prime minister in the summer of 1999. Putin's first move on becoming acting president was to sign a decree protecting Yeltsin from future criminal prosecution.

In this context, Yeltsin's rambling memoir is inherently interesting for what it tells us about his character and maneuverings. Its author-statesman casts the ongoing Russian drama in terms of Kremlin intrigues, ceremonial functions, gossip, meetings and talks with foreign potentates, and perpetual personnel changes. All along it is Yeltsin who holds every string in his hands and who, like a puppetmaster, keeps moving the cardboard characters he has created, apparently for that very purpose. The sagging economy, rampant corruption, rising crime, growing social inequities and one of the greatest lootings of assets ever recorded in history seem to be matters of minor concern. "How can you force a bureaucrat not to take bribes to feed his family, when he earns only 5000-6000 rubles per month but is involved in monitoring multi-million-ruble transactions?" Yeltsin writes. "Naturally, the only way is to raise his salary."

The picture that emerges is one of a petulant, self-centered, cunning man whose lust for power and fragile ego were the dominant forces of his presidency. Even though he was the picture of a model Bolshevik throughout much of his life, even though he had toadied up to Brezhnev, Chernenko and other political leaders to crawl up the party ranks, Yeltsin had always been a man waiting for the main chance. He turned against his colleagues when they blocked his path to the top in 1987. He fought hard. He finally seized his chance. The image of the man atop a tank was the apex of his career, the grand gesture for which he will be remembered in history.

Paradoxically, despite his anti-Communist diatribes, Yeltsin remained a Bolshevik at heart insofar as he believed that strong-willed and determined individuals could change the world by forcibly engineering social and economic changes. He saw himself as just such a man. He sought to obliterate the past, revise his own history and cultivate his own myth. I recall a St. Petersburg historian contemptuously quoting from an early New Year's Eve address to the nation in which Yeltsin referred to Communists as "they"--"They have imposed Communism on us for seventy years." And who was talking, the historian asked rhetorically? A former Politburo member and Communist boss of Sverdlovsk.

What is there of substance, if anything, in this man who strove mightily for grand gestures and theatrical effects? Midnight Diaries provides no answers, so there remains the question of whether Yeltsin ever seriously considered championing a democratic revolution.

What happened in 1991 is that the students and workers who made the revolution and toppled the old regime did not know how to make a new government. Those who did know how were the ones from the old regime. Yeltsin brought those same people back to power and subsequently worked mightily against the very democratic forces that had been the mainstay of his support when he was a populist politician.

Yeltsin's memoir offers no evidence to suggest that he was ever interested in the systematic mobilization of Russia's democratic forces. He had no vision of the nation's identity and future; his concerns were far more personal. His obsession with the grand gesture--something that required an element of surprise--made him fret constantly about leaks. Not only did he crave the limelight, he always tried to stun the world by unexpected actions: "If the news were to leak, the whole effect would be lost," he writes about his decision to resign. "Any leak, any advance talk, any forecasts or proposals would put the impact of the decision in jeopardy." In June 1999, at the end of the war with Yugoslavia, he ordered the Russian brigade serving on peacekeeping duty in Bosnia to steal a march on NATO and occupy the Pristina airport in Kosovo even though he knew it was an empty maneuver. "I decided that Russia must make a crowning gesture, even if it had no military significance," he writes. This was, he adds, "a sign of our moral victory."

Ironically, the first wave of opposition to Yeltsin's policies came from the very people who brought him to power. They argued that his economic reforms had little to do with a genuine free market but amounted to a Bolshevik-style, top-down expropriation and redistribution of assets in disguise. In The Tragedy of Russia's Reforms: Market Bolshevism Against Democracy, Peter Reddaway and Dmitri Glinski note that by late 1993, most democrats--"an entire generation of talented and idealistic would-be leaders of Russia's body politic and civil society"--were "pushed off the political stage along with the democratic movement as a whole."

Reddaway, a professor at George Washington University and former director of the Washington-based Kennan Institute for Advanced Russian Studies, and Dmitri Glinski, a senior research associate at the Moscow-based Russian Academy of Sciences' Institute of World Economy and International Relations, teamed up to produce a critical analysis of the Yeltsin years in power. It is a finely argued and frequently provocative account that deserves a respectful hearing.

Reddaway and Glinski believe Yeltsin had "little commitment to democracy, the national interest, or the economic development of his nation." His rule was an age of blight. The destruction of Russia's intellectual assets was particularly severe. The number of scientists has shrunk from 3.4 million to 1.3 million. Russia's net financial loss from the decline in its science is estimated at between $500 billion and $600 billion annually.

The overall damage inflicted on the economy, they write, exceeds that of the comparable American experience during the Great Depression or, again, the industrial loss inflicted on Russia in World War II. High-tech industries suffered the worst. Production in electronics fell by 78 percent between 1991 and 1995. In 1997 imports made up half the Russian consumer market.

The picture of devastation looks even grimmer in light of dramatic declines in energy production (since 1991 oil production has declined by 50 percent, gas by 13 percent and electricity generation by 20 percent). Lack of investment in electricity generation will have potentially far-reaching consequences for the military and civilian economies, with the prospect of future migrations away from the frigid northern zones of the country. Brownouts have already forced a population exodus from the city of Norilsk.

"For the first time in recent world history," Reddaway and Glinski write,

one of the major industrial nations with a highly educated society has dismantled the results of several decades of economic development...and slipped into the ranks of countries that are conventionally categorized as "Third World." To make this experience even more dramatic, this comprehensive national collapse occurred at the same time as the nation's leaders and some of their allies in the West promised Russians that they were just about to join the family of democratic and prosperous nations.

Instead of promoting democracy, these analysts argue, "Yeltsin and his associates disbanded the new post-Soviet parliament by force and emasculated its successor, blocked the development of an independent judicial branch, reduced the power and revenue base of local self-government, and by 1994 had imposed a regime of Byzantine authoritarianism on the country."

The authors contend not only that Russia's social and economic degradation "can and should be reversed" but that it is in the national interest of the United States and Western Europe to assist in that process. A more stable Russia, they say, would provide better hope for viable security arrangements and for a more cooperative relationship between Moscow and key international organizations.

But here lies a Catch-22: Since Moscow cannot service its foreign debt, no influx of foreign capital can be expected. Who wants to invest in a country that lacks comprehensive, clear and effective tax legislation?

In his book Post-Soviet Russia, the distinguished Russian historian Roy Medvedev also chronicles the failures of Yeltsin's rule, arguing that Russia's plunge to capitalism was both precipitate and ill conceived.

Yeltsin first privatized the area of public safety, which led to the creation of private armies and mafias. At the same time, the managers of state-owned firms created private companies and moved the cash flow to offshore banks in Cyprus. New banks were formed and made fortunes in currency transactions.

There was something very Russian about the whole endeavor. Yeltsin approached it much in the way Peter the Great and other czars carried out their modernization programs; "capitalist perestroika" was imposed from above. Medvedev notes Yeltsin's explanation: "We had to forcibly introduce a real market place, just as potatoes were introduced under Catherine the Great."

The remark suggests, perhaps inadvertently, how vague was Yeltsin's grasp of the magnitude of the undertaking. Once prices were allowed to float freely, they immediately jumped fifteen to twenty times over. Hyperinflation wiped out the population's life savings. It touched off a flight of capital, as profits from exports were deposited in Western banks as a hedge against inflation. Low domestic prices on raw materials generated illegal exports and the emergence of illicit trade. Domestic production declined sharply. In 1998 the government once again devalued the ruble and froze bank accounts.

Another of Yeltsin's failings was his lack of sound judgment about people. Medvedev catalogues the incompetent, inexperienced young men with whom Yeltsin chose to surround himself. One was a junior foreign ministry bureaucrat, Andrei Kozyrev, who was made foreign minister. Medvedev likens another Yeltsin aide, Boris Nemtsov, to the character of a confidence man in Gogol's Dead Souls.

Why did Yeltsin entrust so much of the government to a young and green journalist, Yegor Gaidar? During their very first meeting, Gaidar assured Yeltsin that the shift to the market could be accomplished in one year. Yeltsin himself provided an account of the "surgical precision" of his decision to place Gaidar in charge: "It's a curious thing, but I couldn't help being affected by the magic of his name," Yeltsin wrote later. Gaidar's grandfather, Arkady Gaidar, had been a famous children's writer whose books were read by generations of Soviet kids, Yeltsin explained, "Including myself. And my daughters. And so I had faith in the inherited talent of Yegor, son of Timur, grandson of Arkady Gaidar."

Gaidar's advisers included a group of Western experts, led by Jeffrey Sachs of Harvard and Anders Aslund of the Carnegie Endowment [see Janine R. Wedel, "The Harvard Boys Do Russia," June 1, 1998]. The Russians were very receptive to outside advice; they thought the West was genuinely concerned. But expert recommendations failed to "take into account the structure of the Russian economy and its particular features" and thus did more harm than good, Medvedev believes. He goes even further and suggests that the experts were trying foremost to preserve the interests of the wealthy Western countries.

"Shock therapy" sent the country reeling with pain, causing tremendous harm to an economy that, despite all its known shortcomings, did include first-rate firms and research and development laboratories. This was particularly true of the military-industrial complex, which employed millions of highly skilled workers, technicians and engineers. Yeltsin failed to reorient these resources to the production of consumer goods. At the same time, there was a sharp drop in government orders for military-industry goods.

But the calamity also created opportunities for people to become rich almost overnight. For aficionados, Medvedev provides a detailed analysis of this new but small class of Russians who acquired vast fortunes during what can only be described as the looting of Russia. One of them is the subject of Paul Klebnikov's excellent book Godfather of the Kremlin, which is a must-read for anyone interested in the Yeltsin era.

Klebnikov, a senior editor at Forbes, makes it amply clear that thievery on such a scale has occurred with the cooperation of top political leaders. The businessmen, in a strict sense, committed no crimes and broke no laws; they were advised and helped by Yeltsin's young reformers in the Kremlin. Virtually all the people around Yeltsin, including members of his family and the President himself, are portrayed as deeply corrupt. "Yeltsin was very quickly compromised by all those things that accompany limitless power: flattery, luxury, absolute irresponsibility," Yeltsin's former chief of security is quoted as saying.

Klebnikov's vivid portrait of Boris Berezovsky, until recently one of the wealthiest men in Russia, is closely documented by detailing financial transactions, strategies and alliances. Berezovsky's fortunes rose after the publication in the early 1990s of Yeltsin's memoirs Notes of a President, which Berezovsky had partly underwritten. Having expected to make $1 million, Yeltsin was disappointed by his far more modest proceeds. At that point, according to Klebnikov, Berezovsky began putting funds into Yeltsin's personal account at Barclays Bank in London, explaining that this was income generated by the memoir. Berezovsky in turn became a Yeltsin favorite (by 1994 Yeltsin had $3 million in the account).

In addition to Berezovsky, a former scientist turned car dealer, Klebnikov skillfully describes other members of the clique of predatory oligarchs who plundered the country's most important assets with the connivance of the regime. "He and his crony capitalists produced no benefit to Russia's consumers, industries, or treasury. No new wealth was created." They did, however, produce substantial benefits to Yeltsin and his entourage.

Yeltsin, in Midnight Diaries, dismisses such allegations. "In fact, these people don't have any links to the criminal world. These are not robber barons and not the heads of mafia clans. These are representatives of big capital who have entered into close and complex relationships with the government." The evidence indicates otherwise, though. Klebnikov's presentation suggests that Berezovsky was involved in mafia wars, that he attempted to have his chief rival killed and that he was the target of an assassination attempt himself. (Berezovsky was badly injured and his driver decapitated by a bomb placed near his automobile.)

Klebnikov may indeed go too far, however, when he asserts that Berezovsky, as a private individual, managed to "hijack the state." Berezovsky's influence was always directly linked to his proximity to Yeltsin. Yeltsin appointed the oligarch to several top posts, including that of deputy chairman of the National Security Council. But ultimately Berezovsky remained a moneyman who was never allowed into the charmed circle of power. Political power in Russia, when it came to a crunch, always had more punch than financial muscle.

Klebnikov adds his voice to recent charges that the Clinton Administration stuck by Yeltsin even though it knew all along about the unsavory nature of his regime. The Administration, he writes, "while trumpeting the principles of democracy and the free market, repeatedly ignored evidence that the Yeltsin regime was a kleptocracy."

Gaidar, the architect of Yeltsin's shock therapy, acknowledged in a 1997 book that the entire Yeltsin program was a failure. "Unfortunately," he writes, "the combination of imperial rhetoric, economic adventurism, and large-scale theft seems likely to become the long-term determinants of Russian realities." The term now commonly used to describe Russia is a "bandit state." Reddaway and Glinski call it "market bolshevism."

The new books have punched some big holes in the Yeltsin legend as well as in Clinton's own uncritical backing for the Russian president. Reddaway and Glinski provide some evidence that Yeltsin used his secret police to stage "a provocation that unleashed violence on the part of the opposition, thus giving Yeltsin a pretext to proceed with a bloody crackdown on the parliament." Clinton joined Yeltsin's war against Parliament, saying, "We cannot afford to be in the position of wavering at this moment, or backing off or giving any encouragement to people who clearly want to derail the election process and are not committed to reform." An unnamed US official was quoted by Newsweek as saying the Administration "would have supported Yeltsin even if his response had been more violent than it was." (Official figures say 187 people died and almost 500 were wounded in the attack.) Charles Blitzer, chief economist on Russia for the World Bank, commented on the incident: "I've never had so much fun in my life." Another Western economist advising the Yeltsin government was quoted in the press as saying, "With parliament out of the way, this is a great time for reform."

There is also evidence that the 1993 referendum on a new constitution--which gave Yeltsin extensive personal powers--was in fact rigged. Reddaway and Glinski cite various infractions, including intimidation and other irregularities engaged in by the Yeltsin people. The minimum voter turnout was reduced from 50 percent to 25 percent; the minimum wage was raised; television access for oppositionists was sharply curtailed. "By all indications, the constitution did not gain the necessary minimum of voter support, but the authorities declared it [had] been approved," they write. "The gap was evidently closed by government vote fraud."

Yeltsin's 1996 re-election campaign--financed by Berezovsky and other oligarchs and mafia barons--"was marked by spectacular violations of the law on the part of the incumbent," Reddaway and Glinski write. Yeltsin started with an approval rating of less than 10 percent and succeeded in getting re-elected by spending thirty times more than the legal spending limit. One incident is telling: An aide of Anatoly Chubais, a key Yeltsin assistant at the time, was caught leaving his office with $200,000 in cash in a suitcase. Apart from direct distribution of money, to win votes Yeltsin used government funds on a lavish scale for tax breaks; made cash transfers to institutions, garden owners and small farmers; and disbursed government credits. The Economist estimated that the effort cost the Russian treasury some $10 billion. The aggressive giveaway dwarfed the promises of all other contenders combined. But the most powerful weapon in Yeltsin's hands was the broadcast media. The brazen violations of the law on campaign coverage were summarized by the European Institute for the Media: Yeltsin got 53 percent of prime-time coverage; the Communist candidate, Gennady Zyuganov, 18 percent; and all other candidates combined, 11 percent.

After the election, the oligarchs divided up among themselves the most valuable state companies, which Yeltsin privatized under fire-sale conditions. But corruption permeated all levels of government, and included Yeltsin's "young reformers." One of them is said to have handled an estimated $178 million in precious stones, gold and antique jewelry smuggled out of the Russian treasury, to be sold in San Francisco and Antwerp. The bribery involved in "trading" operations was on an epic scale. The wives of the interior minister and his first deputy, invited on a three-day shopping trip to Switzerland by a commodities trader, bought $300,000 worth of furs, perfumes, watches and so on (and carted the haul in twenty pieces of excess luggage)--all paid for by the trader's firm.

Primakov was the only prime minister who made a determined effort to fight corruption and to hold Berezovsky and others accountable to the law. That effort might have brought stability and trust back to Russian politics, but the corruption probe was extremely dangerous for Yeltsin. Moreover, Primakov did not offer enough of a guarantee that Yeltsin himself would not face prosecution after leaving office. Finally, Klebnikov says, Primakov's government evoked loud protests from Washington. He was replaced as prime minister by the minister of internal affairs, the man who had promised to protect Berezovsky. ("The dismissal of Primakov was my personal victory," Berezovsky later told Le Figaro.) Here lie the reasons for the selection of Putin.

Putin and his people are left with the overarching need for a qualitatively new strategy of economic and social recovery. Yeltsin's course reached a dead end; polls suggest that discontent with capitalist experimentation now permeates all classes--workers, peasants, the army, intellectuals. Whatever emerges in this decade in Russia is likely to be viewed as a communist reformation--a moderate shade of red--that would allow some degree of private property, individual freedom and entrepreneurship. How this evolves will depend on Putin himself. An argument can be made that the origins of Communism's collapse may lie partly in a car trip the young Gorbachev and his wife made in the 1960s, which allowed them to observe that even Italian peasants lived better than Russian elites. Putin's foreign exposure was more limited--six years as a KGB officer in East Germany in the 1980s--but even that was enough for his wife to complain about the empty shelves back home in Russia.

How will history judge Yeltsin? On one level, to use the image of one of his acolytes, Yeltsin could be compared to Ilya Muromets, the peasant hero of medieval epics who one day is bravely slaying Russia's foes, then spends weeks sleeping on the stove in his hut.

On another level, however, Yeltsin's final grand gestures do set an enormously important precedent. During the twentieth century ten men stood at Russia's helm, but only one of them--Yeltsin--was actually elected by the people. Moreover, there was no regular system of power transfer in Russia throughout the twentieth century, and that was perhaps the most important cause of the country's difficulties and setbacks. Five of its ten leaders died in office, three were removed by revolutions and one by a palace coup. Yeltsin alone left office before the end of his term. This indeed established a much-needed precedent to legitimize an orderly system of succession. In the context of Russian history, this has been progress.

In their campaigns for the White House, the major-party candidates--even the one backed by labor--spent little time debating labor-law reform.

Nevertheless, the AFL-CIO had hoped that a Gore victory and Democratic gains in Congress would lead to strengthening of the National Labor Relations Act (NLRA) or, at least, union-friendly appointments to the National Labor Relations Board (NLRB). Continued Republican control of Congress now eliminates the possibility of the former, while Bush's court-won victory makes the latter highly unlikely. In fact, when our new President gets through filling three vacancies on the NLRB early this year, his appointees will insure that the failure of labor law--a scandal exposed in different ways by former NLRB chairman William Gould in Labored Relations and by lawyer Lance Compa in the recent Human Rights Watch report Unfair Advantage--continues to thwart union organizing for the next four years.

Since the AFL-CIO began putting greater emphasis on membership recruitment in 1995, there have, of course, been important new gains. But some of the most significant victories involved organizing campaigns in which unions used their bargaining or political clout--where they still have it--to secure recognition in new units without using Labor Board certification procedures. For tens of millions of workers in the private sector, bypassing the law is not an option--and, for better or worse, the sixty-five-year-old NLRA continues to shape organizing strategies in many key industries.

Long hailed as the "Magna Carta of American labor," the NLRA (or Wagner Act) is definitely showing signs of age. The act was designed in 1935 to promote collective bargaining as a peaceful alternative to the many violent, Depression-era battles over union recognition. Its New Deal sponsors viewed unionization as a necessary corrective to the "inequality of bargaining power" between individual workers and management. To referee workplace disputes, Congress created the NLRB, which conducts representation elections, awards bargaining rights based on them and investigates "unfair labor practices" by employers that might discourage organizing or prevent workers from negotiating a union contract.

But the limited remedies, light penalties and secret-ballot elections available under the NLRA are meaningful only if its administration is swift and efficient. In few other areas of the law is there greater truth to the axiom that "justice delayed is justice denied." When union votes are stalled for months, when union victories are tied up in litigation for years, when bad-faith bargaining goes unpunished and when fired union supporters are reinstated (if at all) long after an organizing campaign has ended, management wins--even if the board ultimately rules otherwise.

The selection of NLRB members--and the agency's influential general counsel--is determined by who controls the White House and what kind of nomination deals are brokered with the Senate. (Functioning at full strength, the board consists of three appointees, including the chairman, from the President's own party and two from the opposition party.) However, as the AFL-CIO argued in its last major campaign for labor-law reform in the late 1970s, unfair-labor-practice victims need more than a sympathetic NLRB majority or efficient functioning by the agency's 2,000 career employees around the country. The law itself must be repaired.

The enormous gap between workers' legal rights on paper and the reality of NLRA enforcement under Democrats and Republicans alike is most effectively documented in Unfair Advantage. Labored Relations also describes how bad substantive decisions, "the creakiness of the NLRA's administrative procedures" and its "lack of effective remedies" have undermined worker organizing and strike activity in recent decades. But the bulk of Gould's memoir is devoted to refighting the personal political battles that occupied him during his four and a half years as a Clinton appointee on the NLRB. Gould's book thus invites comparison with Locked in the Cabinet, Robert Reich's glibly amusing account of his stint as Clinton's Secretary of Labor. Both men assumed their Washington posts--Reich at the Labor Department and Gould at the board--after careers in academia. Even before Clinton nominated Gould in 1993, Reich had tapped him (based on his work as a Stanford University law professor and respected arbitrator) to serve on the Dunlop Commission, a panel of experts convened to recommend labor-law changes.

At the time, Gould had just offered his own ideas on this subject in a book titled Agenda for Reform. In it he called for many of the same corrective measures now advocated by Human Rights Watch: employer recognition of unions based on signed authorization cards rather than contested elections; imposition of first-contract terms by an arbitrator when the parties can't reach agreement by themselves; greater use of injunctive relief to secure quicker reinstatement of workers fired for union activity; a ban on permanent replacement of economic strikers; and heavier financial penalties for labor-law violators.

Needless to say, Senate Republicans weren't too keen on Gould's proposals and kept his nomination to the NLRB dangling for almost a year. In fact, even Reich's labor-law-reform panel--which Gould left prior to being confirmed as NLRB chairman--failed to promote these much-needed changes. Instead, the Dunlop Commission stressed the importance of amending the NLRA so management-dominated "employee participation" schemes could flourish even more widely as an alternative to unions. Repackaged as the Teamwork for Employees and Managers (or TEAM) Act and adopted by Congress after the GOP took over in 1994, this anti-union legislation was ultimately vetoed by Clinton--after frantic labor lobbying.

To survive his contentious confirmation process (and avoid the fate of fellow African-American Lani Guinier, whose nomination to a top Justice Department post was dropped by Clinton when her writings as a law professor were attacked by the right), Gould played up his credentials as a "professional neutral." He proclaimed that his goal in Washington would be "to reduce polarization both at the board and also between labor and management." Equipped with what turned out to be a serious lack of diplomatic skills, Gould might have had an easier time trying to bring peace to the Middle East.

During Gould's tenure, Congressional Republicans sought to cripple the NLRB's operations with budget cuts, harassing oversight hearings and nonstop political sniping. Positive initiatives, like general counsel Fred Feinstein's attempt to get more federal court orders reinstating fired workers while their cases were being litigated, became a lightning rod for conservative criticism. Under these trying circumstances, Gould, Feinstein and pro-labor board members like Sarah Fox and the late Margaret Browning needed to stick together and coordinate their strategy in the face of common adversaries. Gould, however, quickly fell out with his colleagues in a fit of pique over their failure "to accord me stature and defer to my leadership." His "leadership" soon took the form of public feuding with, and criticism of, his fellow Clinton appointees--combined with attention-getting public statements about many of the leading labor-management controversies of the day. Even when he was on the right side of these disputes, his ill-timed interventions had the effect of exacerbating the NLRB's political problems.

In 1998, for example, Gould injected himself into the debate about a state ballot initiative in California that would have required unions to obtain the individual consent of their members before using dues money for political purposes. Gould's statement of opposition to this Republican-backed "paycheck protection" scheme correctly noted that it would "cripple a major source of funding for the Democratic Party." When his testimony was briefly posted on the NLRB's website after being presented to state legislators, it created such a ruckus that even Congressional Democrats generally supportive of labor and the board raised the possibility that Gould should resign to avert further Republican retribution against the agency.

Ironically, Gould's batting average at the board shows that he was not as much a union partisan as his business critics claimed. According to a recent law-review analysis by professor Joan Flynn, Gould's "votes in disputed cases were considerably less predictable than those of his colleagues from management or union-side practice... [they] broke down in a much less lopsided fashion: 159 for the 'union' position and 46 for the 'management' position." In contrast, when Ronald Reagan tried to change the NLRB's alleged pro-union tilt during his Administration, his chairman was a management-side lawyer--Donald Dotson, a figure no less controversial in the 1980s than Gould was in the '90s. Did Dotson pursue Gould's stated goal of "return[ing] the Board to the center to promote balance"? Of course not. Despite equally hostile Congressional oversight by members of the then-Democratic majority, Dotson openly promoted a Right-to-Work Committee agenda, defending management interests just as zealously as he had when he was on the corporate payroll.

Naming Gould to lead the board was thus very much an expression of Clinton's own political centrism. Unhappily for labor, Gould's unexpected personal showboating, squabbling with would-be allies and what Flynn calls his "near-genius for irritating Congress" impeded, rather than aided, the administrative tinkering that Clinton appointees were able to do at the board during his tenure. Vain, impolitic and--in the view of some critics--hopelessly naïve, Gould often did as much harm as good. In this respect he was not unlike the Dunlop Commission: Reich's vehicle for building a political consensus on labor-law reform instead fed right-wing attempts to weaken the NLRA.

Gould's defense of his record seems designed to avoid the kind of flak that Reich received over his memoir's fanciful reconstruction of private and even public exchanges with various Washington notables. Labored Relations quotes extensively from the author's minutiae-filled daily journal, leaving the impression that no such literary license has been employed. Unfortunately, Gould lacks Reich's self-deprecatory humor and acute sense of irony. The author's tedious recitation of his speaking dates, telephone calls, case conferences, lunch and dinner conversations, etc., will be a hard slog for anyone but specialists in the field or ex-colleagues searching for critical comments about themselves (of which there are many).

Outside the Beltway and the "labor bar," settling old scores about who did what to whom as part of the "Clinton Board" is much less a preoccupation than the difficulty of defending workers' rights under any administration. Unfair Advantage does a much better job of keeping this big picture in focus, in particular by documenting the rising toll of workers fired for what, in board jargon, is called "protected concerted activity." In the 1950s, author Lance Compa reports, "workers who suffered reprisals for exercising the right to freedom of association numbered in the hundreds each year. In the 1960s, the number climbed into the thousands, reaching slightly over 6,000 in 1969. By the 1990s, more than 20,000 workers each year were victims of discrimination for union activity--23,580 in 1998, the most recent year for which figures are available."

The "right to freedom of association" is, of course, enshrined in international human rights standards that the United States nominally supports and often seeks to apply to other nations. Compa, a former organizer for the United Electrical Workers who now teaches international labor law at Cornell, exposes the hypocrisy of this official stance in light of persistent NLRB enforcement problems and the structural defects of the NLRA itself. In this Human Rights Watch report, he concludes that "provisions of U.S. law openly conflict with international norms...of freedom of association."

Millions of workers, including farm workers, household domestic workers and low-level supervisors, are expressly barred from the law's protection of the right to organize. American law allows employers to replace permanently workers who exercise the right to strike, effectively nullifying that right. New forms of employment relationships have created millions of part-time, temporary, sub-contracted and otherwise "atypical" or "contingent" workers whose freedom of association is frustrated by the law's failure to adapt to changes in the economy.

The problem with Compa's sweeping indictment of the status quo is that it contains no strategy for change--other than elevating the debate about what should be done from the lowly sphere of labor-management relations to the higher moral plane of international human rights norms. At the local level, Jobs with Justice coalitions and some AFL-CIO central labor councils around the country are actually trying to build a long-term grassroots campaign to promote greater public support for the right to organize. Like Human Rights Watch, they are addressing the same elements of academia, the arts, the churches and the liberal middle class that have long displayed admirable concern about human rights violations abroad or discrimination against women, gays and minorities at home.

Public officials, university professors, the clergy, civil rights leaders and neighborhood activists are now being encouraged to intervene in organizing situations to help neutralize illegal management resistance to unionization. Workers' rights activists will find plenty of new ammunition in Unfair Advantage, and even some that's buried in Labored Relations. Hopefully, their community-based efforts will create an improved climate for organizing--in some parts of the country at least--and put NLRA reform back on the national political agenda of labor's putative allies in the Democratic Party. Yet while having a Democrat in the White House may prove a necessary condition for reform initiatives, it's hardly sufficient--as the Clinton era just proved. Workers who try to form unions will continue to be at risk until Americans elect both a Congress and a President willing to do more than just tinker with our tattered protection of the right to organize.

TROOPS IN THE STREETS

Nation contributing editor and radio host Marc Cooper was tossed out of the California State University system for antiwar activities in 1971 by executive order of Governor Ronald Reagan--and ended up in Chile, working as a translator for Salvador Allende. What footnote could be more oddly appropriate in Cooper's memoir Pinochet and Me? The book is part diaristic reconstruction of the 1973 coup against Allende, part chronicle of return trips Cooper has made since to Chile and part consideration of that country's--and Pinochet's--fate at century's turn.

Chile "briefly shined as a beacon of inspiration," Cooper laments, the culmination of fifty years of massive campaigning for democratic socialism and the notion that "perhaps, radical social change and resulting improvements in the lives of common people were possible through democratic, peaceful and legal means." His firsthand reporting on the twists and turns of the following quarter-century is by turns chilling and poignant.

Cooper could easily have met the fate of Charles Horman (whom he knew slightly), subject of the film Missing; his account of the coup and its aftermath is fraught with chaos, luck and what he calls the "moment of greatest naivety in my adult life"--assuming the US Embassy would be of help. The American consul told Cooper and a few others that "the armed forces are restoring order but there's still a danger of scattered left-wing snipers." Cooper may be the journalistic equivalent of the latter, but the only danger he poses is that he fosters understanding of the social forces at work in the country he has, in more than one sense, married into. His accounts of returning to Chile, first under an assumed name and, years later, under his own, are compelling: an arrest for photographing an army bus that civilians had tied to random shootings; an encounter with youths who say they are hungry and beg for guns; the effects neoliberal economics has wrought on everyday life; the feelings of Chileans on Pinochet's arrest in London. Cooper ends with the inscription on a Chilean memorial to the disappeared: "The forgotten past is full of memory."

"What we saw in Seattle across those tumultuous days stretching from November 28 through December 3, 1999, and then in Davos, Switzerland, Washington DC, Philadelphia, Los Angeles and Prague was the flowering of a new radical movement in America and across the world, rambunctious, anarchic, internationalist, well informed and in some ways more imaginative and supple than kindred popular eruptions in recent decades," write Nation columnist Alexander Cockburn and his co-authors in this eyewitness chronicle of protests against the WTO and other institutions of the new global economic order. The authors have keen eyes for the social ironies surrounding the events: Jeffrey St. Clair's Seattle diary observes not just the segregation of the city's high-tech opulence from south Seattle's old-economy piers and pulp and chemical factories but an account of an officer frisking a woman whose boyfriend was walking her home from work. Just before unloading his pepper spray on her escort, the policeman tells her, "You've no idea what we've been through today." Protest experience at the Democratic National Convention left the authors concluding as well that "America is moving toward the normalization of paramilitary forces in law enforcement." Winking at the old journalistic saw of quoting taxi drivers to get the street pulse, Nation contributor and former senior editor JoAnn Wypijewski, whose DC diary of World Bank and IMF protests is included, quotes an Ethiopian driver: "What is going on is simply the recolonization of the so-called developing world." Adds another: "This is why we are taxi drivers."

About a year ago, Amit Chaudhuri published in the Times Literary Supplement a panoramic survey of the past century or so of Indian writing and its reception in the West. He observed there that the postcolonial Indian novel tends to be celebrated as a hybrid form in the West, with Salman Rushdie's Midnight's Children eclipsing all previous Indian writing. Unhappily, critics seem to believe that the postcolonial totality of India can only be articulated by Indian novelists writing in English. Yet the novella, Chaudhuri argued, is an equally important form in the vernaculars (there are around twenty major languages and countless dialects with their individual literary traditions in India), as is the short story, and ellipsis is often more effective than all-inclusiveness in attempting to describe India. The tendency to forget that vernacular Indian literatures existed long before Salman Rushdie's brilliant experiment with magical realism--or Vikram Seth's presentation of India as a mosaic of epic proportions in A Suitable Boy--sets a problematic yardstick for judging Indian writing in English. It leads one to think that the Indian narrative is essentially "lush and overblown," whereas the literary traditions of India are actually much more delicately nuanced. Chaudhuri also suggested that hybridization of language is not the only tool for conveying the otherness of perception: Even the correct English of writers like V.S. Naipaul has otherness implicit in it.

To Chaudhuri, who is Bengali, this otherness takes the form of returning to older regional traditions of India. His literary forebears include the Bengali writers of what is known as the kallol jug--which was roughly around the second quarter of the twentieth century in Bengal--rather than contemporaries like Rushdie or Seth. As such, his novels have strong affinities to a specific movement in Bengali literature that attempted to capture the humdrum and the quotidian, though his audience is more the yuppie Indian who constantly juggles English and the vernacular than the educated Bengali middle-class bhadralok. Even the code-switching between Bengali and English--and the occasional Hindi--in Chaudhuri's novels seems to be an attempt to tell the story of the Westernized but ordinary Bengali, rather than hybridization or what Rushdie calls the pickling of language. It is also the story of polyglot India, where most of the population speaks, and habitually switches among, several languages. And mellifluous as Chaudhuri is at times, no one can accuse him of writing overblown prose.

In writing his fourth novel, A New World, Chaudhuri seems to have remained true to his critical principles: The result is not a novel likely to make readers in Calcutta swoon but one that is as much an attempt to capture the macrocosm of India in a microcosm as it is an attempt to carry on a particular vernacular tradition in English. And to those who have never been to Calcutta, it offers a refreshingly low-key and intimate insight into the heart of the city.

In A New World, a quest for solace brings the protagonist to Calcutta to seek the comforts of the familiar rituals of his parents' home. Jayojit Chatterjee, a not-so-young professor of economics at a Midwestern US college, is back for the summer with his son in tow. Normally, his parents would have been overjoyed. But neither Jayojit nor his parents can get over the fact that the family is now broken, that Jayojit's wife has divorced him. Jayojit has recently won partial custody of his young son, Bonny, and the visit to Calcutta promises to become an annual summer retreat, an escape from his adopted country to the land of his birth.

Divorce has familiarized Jayojit with a new world of frozen pizzas and TV dinners. It also seems to have made him acutely attuned to the harmonies and dissonances of lives around him. One of the clichés about storytelling is that plots are essentially of two kinds--either someone undertakes a journey, or a stranger comes to town. In such a schema, this novel would appear to fall into both categories. Jayojit may not be a stranger visiting Calcutta, but he has certainly moved far from the roots to which he has temporarily returned. He stays with his parents, runs across his neighbors, moves around the city and muses on his married life and the attempt at a second, arranged marriage that he had made on his last visit home a year previously. Daily life in Calcutta is familiar, yet no longer quite familiar. Family photos still clutter the drawing-room table, only now there is a gaping hole in this tapestry of faces--all the pictures of Jayojit's ex-wife Amala have been removed. Her absence haunts the family perhaps more than her presence would have. Nothing sensational happens in Calcutta, not even another attempt at arranging a marriage. Jayojit's visit affects no one but his parents--but the details of a humdrum holiday are meticulously captured.

There is something very familiar about this stillness to anyone who has spent any time in Calcutta. I remember this torpor from countless summer holidays spent in the city, so it is no surprise that Jayojit neglects the book he is planning to write. I also remember vendors selling Kwality ice cream--a brief respite from the oppressive heat, which Bonny yearns for--from pushcarts.

Like New York's pushcart hot dogs and Bangkok's curbside satays, Calcutta also has its distinctive street fare--the rolls, jhaalmuri, phuchka and bhelpuri sold by vendors--whose taste simply cannot be replicated elsewhere. Jayojit's brief interaction with a bhelpuri seller brings back to this reviewer many memories of the tangy snack, flavored with spiced tamarind water, sold by a particular vendor near Sunny Park in the city. A New World speaks to the expatriate reader of little, intimate, everyday things in Calcutta, reminiscent of the way that Amitav Ghosh's Shadow Lines, a novel set partly in that city, did a few years ago.

Chaudhuri's book almost self-consciously tries to be different from the usual Indian writing in English. To put things in perspective, consider Raj Kamal Jha's The Blue Bedspread, the other novel set in Calcutta that has recently been published in America. An interesting foil to A New World, it is nothing if not sensational in plot and incident. Its narrator is another not-quite-young man, but one who has a secret to reveal--and just one night to write it all down. In the bedroom a newborn child lies on a blue bedspread; in the adjacent room, the narrator struggles to give voice to a mosaic of stories from his and his sister's past that can be pieced together to reveal the truth, insofar as truth may be known. The idea is clever, but the secret is obvious from page five onward. Of course, the ingenious aspect of Jha's plot is its frame: The story must be written in a matter of hours, which means that any rough edges and disjunctions in the text are automatically to be excused, the way amateurish camera work was in, say, The Blair Witch Project. This accounts for inconsistencies in the story line, and the series of deliberately unreliable narrative perspectives only helps further the cause.

Judging by its reception in the West, however, Jha pitched his story to the right audience: the Western critic who, by all appearances, has little idea of what Calcutta is like and is willing to give him credit for having done for Calcutta what Joyce did for Dublin (as a reviewer wrote in the New York Times). Critics also laud Jha for letting the incestuous cat out of the bag of a repressive India. That particular cat, however, has always roamed at large in Vedic creation myths and vernacular writings. In fact, over a decade ago, Safdar Hashmi, one of India's foremost theater personalities, was assassinated by Hindu fundamentalists for staging one of the earliest mentions of incest in Indian literature: a little-known version of the epic Ramayana in which the hero Rama's queen is also his sister. Jha certainly explores the eternally sensitive issue of incest in contemporary society as his narrator tells overlapping pieces of the story, and he even throws in a bit of sodomy and pederasty for good measure; but his method is a tabloidish piling of sensation upon sensation that might, at best, be an unfortunate outcome of his training as a journalist.

Unlike Chaudhuri, who tries to produce a miniaturist's portrait of Calcutta by adding brush stroke upon brush stroke of minutely observed detail, Jha sets out to write the novel that will lay bare the heart of Calcutta but loses his way in the quagmire of sensational revelations. This is a pity, as the novel has its occasional and redeeming moments of brilliance:

Just outside the oil mill, a couple of feet to the right of its entrance, were the birds. In a large cage, more like a coop, the kind you see at the Alipore Zoo, slightly smaller, the size of an average storeroom in an average house...people stopped by to look at these dozen birds in the cage.
      Flying round and round, grey and white, grey and white. On certain rainy days, when the sky was dark, it seemed tiny clouds had slipped into the cage each dragging with it just a little bit of the sky. And then one afternoon in 1977, the oil mill closed down. Just like that, all of a sudden.

Too bad the novel does not contain more quiet gems like this passage. On the other hand, India has long been imagined as the land of elephants and tigers, jungles and sadhus, snake-charmers and the vanishing-rope trick, so why blame the author for catering to popular fantasies?

If The Blue Bedspread is a psychological study, then A New World is probably best described as an anthropological exercise. It undoubtedly offers one of the more lyrical descriptions of Bengali life that exist in English fiction. Jayojit's mother is the quintessential Bengali homemaker of a particular generation: She welcomes him home with a fond "You've put on weight, have you" but also reverses herself to "Where--I don't think you've put on weight" when he protests against eating too much. His father, a retired rear admiral and patriot who had sided with the Nationalists against the British, is nonetheless a holdover from the colonial days and eats, brown sahib style, with a fork and spoon. He is the detached head of the family, who still maintains an "inconsequential tyrannical hold over this household, in which usually only he and his wife lived, with part-time servants coming and going each day." Neither parent can quite accept their son's divorce: "they seemed to feel the incompleteness of their family, and that it would not be now complete. Someone was missing. Both mother and father were too hurt to speak of it. In a strange way, they felt abandoned." This feeling of bereftness is perhaps only to be expected. Divorce is still a relatively rare occurrence in India. Not surprisingly, when the parents try to arrange a second marriage for their son, it is to a fellow divorcée. The family doctor gets involved as an intermediary, a situation not unusual in the delicate rituals of matchmaking. The prospective bride, unlike Jayojit, is childless, a crucial consideration in the still-patriarchal society of Calcutta.

On the lighter side, Bengali idiosyncrasies like the obsession with traveling are gleefully dwelt upon. The Admiral's ire against Bangladesh Biman remains unclear till he sardonically observes, "Every week tens of middle-class Bengalis who've been saving up all their lives queue up in the airport to travel by Bangladesh Biman--to visit their son or daughter in England, or to travel: you know the Bengali weakness for 'bhraman'?" referring to the well-known Bengali obsession with globetrotting. His own projected trip to visit Jayojit had been derailed by his son's divorce. The thankless but socially necessary habit of keeping track of obscure relationships gets some ribbing--"Jayojit's mother's late brother-in-law's niece had a husband whose sister had married Bijon, who himself had no children." And Dr. Sen, the neighbor and friend of the Chatterjees, chuckles over how Bengalis "only come out during the Pujas. Then you'll see them--heh, heh--bowing before Ma Durga!" No believer dares run the risk of offending the goddess who once saved the very gods from calamity.

Chaudhuri's nuanced ear for language is likewise directed at readers familiar with Bengali. Jayojit's mother greets her grandson with a "Esho shona.... Come to thamma." Bonny, who speaks little Bengali, cannot pronounce the hard th. "All right, tamma," he says. Unfortunately, not every attempt to transliterate words is equally happy. The phrase "How much" might have been better transcribed as "koto" than "kato," which suggests the Bengali imperative "cut"; and the "Hay" in "Hay bhelpuri" sounds more like the lofty address "O" than "yes." What jars more is Chaudhuri's tendency to italicize words in an attempt to convey Bengali speech rhythms--it becomes wearisome. (Unlike English, word stress in Bengali is not predetermined but changes with the speaker.)

While this novel remains a bold attempt to transfer to Indian writing in English some of the characteristics of vernacular literatures, it is not without other, deeper problems. One can, after all, read of beads of moisture condensing on the outside of glasses of cold water and heads of dead fish only so many times before wondering where such aestheticized details lead. Also, given that Dhaka is a half-hour ahead of Calcutta, it's a pity that Chaudhuri's chronological error in claiming that "although they'd [Jayojit and Bonny] left Calcutta at half-past seven, it was still seven-thirty in Bangladesh" was not rectified in the editorial process. On a lesser note, one would also like to quibble over Chaudhuri's referring to phuchkas as golgappas, a term that is common in Bombay, where Chaudhuri grew up, but which many Calcuttans might not recognize.

Good translations of vernacular Indian writing into English are scarce, but there are several collections of Rabindranath Tagore's fiction available here, the best of which are perhaps those by William Radice and Ketaki Kushari Dyson; Imaginary Maps: Three Stories by Mahasweta Devi (translated by Gayatri Chakravorty Spivak) offers three tales about tribal women--the most marginalized among the marginal--of a significantly different flavor; and the two-volume Women Writing in India: 600 B.C. to the Present, edited by K. Lalita and Susie J. Tharu, is a good anthology for a historical overview, albeit with a gender bias. Nitpicking aside, A New World is definitely worth reading. Though nowhere close to the best writing available in India's regional languages, it is still a creditable endeavor and should be appreciated as such.

Chaplinesque Rapscallion New Leader of Germany's National Socialist Party
      --The Onion

"I have nothing to say about Hitler." With this line Karl Kraus, turn-of-the-century Vienna's most famous journalist, began his 300-page anti-Hitler invective, The Third Walpurgis Night. Kraus's fate has been shared widely. Hitler tickles and tortures the authorial imagination like no other twentieth-century figure. At first as a hero, for the most part, then as a villain, also for the most part, Hitler has been a fantastically popular subject among all kinds of writers since his postputsch courtroom antics transformed him into something much larger than a right-wing rabblerouser. Indeed, between 1923 and 1995, more than 120,000 essays and monographs on Hitler were published. Attenuation seems unlikely. For if it has changed at all, our fascination with Hitler appears to have grown even stronger in the past five years.

And so we should not be surprised by the fact that a lot of books about Hitler have been published recently. Yet there is a twist here; it has to do with quality rather than quantity. We expected more books about Hitler. What we did not expect is that the most prominent of them would be so good. This remark is less cynical than it sounds. Over the years able scholars have produced a very substantial body of excellent research on Hitler. Of course, it would be absurd to regard as unexpected everything that adds to it.

Furthermore, we had reason to hope for significant new contributions. Ideology does not play quite the same role in Hitler studies that it did fifteen years ago. Historians in East Germany tended to treat Hitler as an effect of capitalism, while historians in the West often viewed him in narrowly personal terms, as a deranged, gigantic individual crushing a fragile democratic experiment. But scholars in the West, and especially in West Germany, were not exactly of one opinion with regard to both Hitler's causes and his effects. In the mid-1980s, a new revisionist conservatism led to a new contentiousness. At issue was a series of incendiary questions--even the question of whether it was appropriate to ask them: Was Hitler a revolutionary? Which of his policies were rational? Ernst Nolte, who had been drifting steadily away from the trenchant analysis of Nazism he advanced in the early 1960s, went so far as to call Hitler's worldview an understandable reaction to a perceived Bolshevik threat. Just a few months ago, Nolte received one of Germany's most prestigious awards for cultural achievement, which simply confirms what we already knew: Hitler remains an intensely politicized field of inquiry. However, in general, the intellectual atmosphere in this area has improved. It is more open, as are archives in Moscow. And material discovered there--for example, Hitler's skull and a complete copy of Goebbels's diary--has helped to answer old questions.

But discovering new sources will only get you so far. It certainly will not explain a phenomenon as complex as Hitler. Nor will sheer intellectual openness. The great majority of the thousands of open-minded books about Hitler have little interpretive value. In fact, until recently there were only two truly formidable biographies of him: Alan Bullock's Hitler: A Study in Tyranny (1952, revised 1962) and Joachim Fest's Hitler: A Biography (1973). We now have a third major biography of Hitler, Ian Kershaw's two-volume masterpiece Hitler 1889-1936: Hubris (1998) and Hitler 1936-1945: Nemesis (2000). It is the best of the three, by far.

Improvements in biographical research do not always imply a general shift in the significance of the subject. Yet that is likely to be the case here. For, again, the publication of Kershaw's biography was accompanied by a procession of incisive and well-researched books: The Hitler of History (1997), John Lukacs's useful survey of, and critical engagement with, historical scholarship on Hitler; Hitler: Diagnosis of a Destructive Prophet (1999), Fritz Redlich's illuminating "psychography" of Hitler (this should not be confused with "psychohistory": Redlich, who is a psychiatrist, works carefully with relevant sources and examines Hitler's mental condition at every stage of his life, minutely charting the changes, and he does not seek to "solve" the enigma of Hitler's psychopathic behavior by focusing on childhood trauma or a particular psychic disturbance); Explaining Hitler: The Search for the Origins of His Evil (1998), Ron Rosenbaum's extensive collection of interviews with scholars, intellectuals and artists who, in some form or other, have tried to "explain Hitler"; and Hitler's Vienna: A Dictator's Apprenticeship (1999, German original 1996), Brigitte Hamann's scrupulously researched and intelligently argued account of Hitler's early years in Vienna (1906-13) and of their influence on his later development.

Every one of these books represents an attempt at sustained, comprehensive critical reckoning with Hitler. In the past, the most compelling works on him were often of a very different character. (Consider Eberhard Jäckel's and Sebastian Haffner's shorter, much more synthetic books on Hitler's Weltanschauung, which were published in 1969 and 1978.) But if there has been a structural change, what has caused it? Kershaw himself offers an insightful answer. "Reflecting" on Hitler's historical significance in the preface to Hitler 1889-1936: Hubris, he writes: "Hitler's dictatorship has the quality of a paradigm for the twentieth century." Kershaw also claims that "Hitler's mark on the century" has been "deeper" than anyone's. The implication is clear. Taking leave of the twentieth century means trying to settle our accounts with Hitler, its paradigmatic problem, which, in turn, means engaging in sustained, comprehensive critical analysis. Certainly something close to this seems to be at stake in Rosenbaum's work, and in Hamann's. She suggestively tracks the full extent of Hitler's debt to "twentieth-century culture" by examining his encounter with one of its paradigms: fin de siècle Vienna. Kershaw has given us a twenty-first-century biography of Hitler that could have been written only at the end of the twentieth century.

Kershaw's biography is a true "social biography," to use a phrase the great film theorist Siegfried Kracauer coined, in exile, as he wrote about the culture that Hitler's Germany had begun to annihilate. Without a trace of moralism, and without losing himself in quotidian minutiae and psychological speculation, Kershaw nonetheless shadows Hitler the way a conscience might have. He examines Hitler's daily life, as well as his emotional and political development, in vivid detail. At the same time, he situates Hitler's personal narrative within its social context, charting their reciprocal influence and pointing out how Hitler's experiences and attitudes were emblematic of large social trends. And he does so with impressive erudition. The result is a kind of interpretive balance, which is very difficult to bring off in Hitler's case. With him, moving back and forth between the microlevel of personal narrative and the macrolevel of social context entails entering into not so much a hermeneutic circle as a dizzying spiral. For, at a certain point, Hitler's narrative begins to reshape--as few, if any, personal narratives have--the social context that shaped it, only, of course, to be shaped again itself by the context it reshaped.

Neither Bullock nor Fest came close to producing a real social biography, as both of their books focus on the personal narrative. They offer well-informed, penetrating answers to one crucial question: Why did Hitler commit the terrible crimes for which he will be remembered? But neither one makes a serious attempt to shed light on Hitler's path to the chancellorship or to understand how he remained in power for twelve years while executing policies of mass destruction and mass self-destruction. They do not tell us how Hitler became Hitler.

Kershaw's book works so well as social biography because his approach proceeds from a transitional concept: charisma. Elaborating on the argument he developed in The "Hitler Myth" (1987), Kershaw invokes charisma as a sociological category. Here charisma is a modern, postliberal structure of authority, one that became possible in Weimar Germany for a number of impersonal reasons. These include the "ignominy of Versailles," the concomitant collective longing for national redemption and the inability of the democratic government to appeal to a strong democratic tradition in Germany.

Charisma is also a psychological category. It can therefore function as a way to mediate between the levels of biographical analysis. And, indeed, Kershaw makes his overriding concern the fateful match between Hitler's personal charisma and Germany's impersonal readiness for charismatic rule. Summing it all up, Kershaw writes, "The Germany which had produced Adolf Hitler had seen its future in his vision, had so readily served him, and had shared in his hubris, had also to share in his nemesis." Germany followed the charismatic leader it "produced" because he envisioned, in just the right way, at just the right time, the Germany it wanted to see.

In Hubris, Kershaw explains how Hitler's idiosyncratic "vision" for a "better" future and Germany's receptiveness to it took shape. In Nemesis, he tracks the bloody business of implementation. We might expect the second volume of a two-volume Hitler biography to begin in 1933. But Kershaw divides Hitler's life into pre- and post-1936 stages, because 1936 marks "the culminating point of the first phase of the dictatorship." Kershaw wants Nemesis to begin with the beginning of the end, with the onset of the "ceaseless radicalization" that persisted until 1945. Both volumes are well written and come equipped with helpful maps and eerie photographs. And because Kershaw keeps his debates with other scholars, as well as his extensive remarks about primary sources, neatly contained in his footnotes, Hubris and Nemesis read smoothly, remarkably so, given their factual girth and cognitive intricacy. Some chapters are structured as accounts of Hitler's life stages, such as his "dropout" years in Vienna, while others are organized around seminal events, for example, Germany's strategic "miscalculation" during the 1939 Poland crisis. Kershaw puts personal narrative into the foreground when it seems to be of decisive importance. And he does the same with social context. Tellingly, all the chapter headings in Nemesis refer to large historical developments, starting, again, with the Nazis' "ceaseless radicalization."

In 1936, according to Kershaw, Hitler was at once more delusional than ever and cannily realistic. His early diplomatic and economic successes had fed his surging megalomania. Both Hitler and the nation that, at the time, overwhelmingly supported him believed that he could achieve whatever he wanted to. Yet Hitler also astutely recognized that his authority could not rest on a foundation of rationally organized domestic prosperity. It would last only as long as he was associated with a "project of national salvation." The pressure to expand, "to radicalize" unremittingly, came from outside as well as from inside his circle.

Kershaw's most original, most provocative claims have to do with the place of Nazi Party leaders in this constellation of causal forces. He insists that even as they used the most cynical images and slogans to manufacture Hitler's charisma, men like Alfred Rosenberg, Heinrich Himmler and especially Joseph Goebbels remained fanatically in Hitler's thrall. As Kershaw puts it, they "combined pure belief and impure propaganda." Working closely with Goebbels's complete diary, which proves to be a key new source (Hitler's bond with Goebbels was the closest thing he had to a friendship), Kershaw draws out the full, chilling extent of this belief. He also shows that well into the war, and until the very end, defeat did nothing to shake it. For in taking huge risks and losing, Hitler remained true to the principles that had won him such loyal disciples.

Perhaps even more chilling is Kershaw's account of how these same party leaders influenced the Final Solution. Here again Goebbels's diary is crucially important. More lucidly than other sources, it reveals that Hitler had to be prodded into instituting not only the policy of mass deportations but even the compulsory-identification measure (the yellow Star of David) for Jews living in Germany. Party leaders had urged Hitler to take this latter step in the wake of Kristallnacht (November 1938). He resisted it until August 1941, when Goebbels finally "convinced" him to act. And in the summer of 1941, he repeatedly "rejected" Reinhard Heydrich's proposals to make the destruction of Eastern Jewry more systematic. Why? Certainly moral compunction cannot be the answer. According to Kershaw, Goebbels expressed a certain dismay at the inconsistency between Hitler's behavior and his stated principles on the "Jewish Question," yet he never suggested that Hitler had softened his attitude toward the Jews. During this time Hitler continued to cite his own prewar "prophecy," according to which the Jews would be destroyed if they started another world war, and to provide various justifications for large-scale murder. Kershaw speculates that Hitler may have been acting, or not acting, out of denial. For to devise a "Final Solution" before winning the war in the East was to acknowledge that the war could not be won anytime soon. As long as the fiction of imminent victory could be sustained, it made more "sense" to wait for the acquisition of vast new territories. After all, the Nazis were trying to figure out how to dispose of millions of people and had not yet begun to think seriously about gas and ovens.

The problem, for Kershaw, is that Hitler had given up this illusion by the fall of 1941, and yet he remained reluctant to authorize mass deportations and overtly genocidal policies. Hitler did not enumerate his reservations, at least not on records available to us. And so we are left wondering. What is clear is that the solicitations of Heydrich, Himmler and Goebbels had the desired effect--Hitler eventually did license extermination. Yet, as Kershaw stresses, he did so only in the most general terms. Pushing his claim, Kershaw goes so far as to contend, "Whatever the reasons, [Hitler] could never have delivered the sort of speech which, notoriously, Himmler would give in Posen two years later [1943] when he described what it was like to see 1,000 corpses lying side by side and spoke openly of the 'extermination' (Ausrottung) of the Jewish people as a 'glorious page in our history....' Even in his inner circle Hitler could never bring himself to speak with outright frankness about the killing of the Jews." Hitler "could not bring himself" to discuss the Holocaust directly, apparently not even with Goebbels. This is an unsettling idea. Indeed, David Irving, the British historian and notorious Hitler apologist, rushes from Hitler's silence to the conclusion that he did not know about the death camps. What Kershaw does is very different. With unrivaled precision and without polemicism, he circumscribes Hitler's unwillingness to speak about the Holocaust, ultimately treating it as a question. Far from exculpating Hitler, Kershaw's move invites further inquiry. Nemesis does more than inform exhaustively and explain brilliantly: It points to what remains to be said about Hitler.
