
Jason Epstein's Book Business: Publishing Past, Present, and Future is the third memoir of a major American life in book publishing to reach print in less than two years. It is at once a sign that the guard is changing and a recognition that the business has already changed. It is also, in the case of the 72-year-old Epstein, an opportunity to gaze into the crystal ball and predict the changes to come, something he has been rather good at over the course of his long career.

Simon & Schuster's Michael Korda got the triumvirate rolling in 1999 with Another Life, gossipy and entertaining and novelistic, like the books Korda often publishes. The New Press's André Schiffrin--famously ousted from Random House's Pantheon Books, the once independent imprint his father started--followed suit more recently with The Business of Books, the kind of polemic he has sometimes featured on his list [see Daniel Simon, "Keepers of the Word," December 25, 2000].

It's not surprising, then, that the tone pervading Epstein's memoir--which began as a series of lectures he gave at the New York Public Library, formed two essays in The New York Review of Books and was coaxed into a book by Norton president Drake McFeely--is cool and elegant and full of the gravitas of a man who wanted to be a great writer and instead ended up publishing many such writers, Morrison and Mailer and Doctorow among them.

He arrived at Random House in 1958, having deemed it time to leave Doubleday when he was prevented from publishing Lolita there. While at Doubleday he had founded Anchor Books and with it the trade paperback format in America. He retired as Random's editorial director in 1998, and during the four decades in between started the Library of America, a unified series of reprints of great American literature; The Reader's Catalog, a kind of print precursor to Amazon; and The New York Review of Books. He had a reputation as a brilliant editor but went beyond that to envisage change and make it happen, and in the process made himself into a pillar of the New York intellectual establishment.

"If I have any regrets, I can't think what they are," he declared during an interview recently, sipping homemade espresso at his large kitchen table in an opulent downtown apartment that could double as the upscale set for one of Woody Allen's Manhattan tales. He still edits authors he's been associated with but now does it from home. He prefers to be based there rather than in the Random corporate offices, wishes to put space between himself and an "increasingly distressed industry" mired in "severe structural problems." Prominent among them are a chain-driven bookselling system that favors "brand name" authors and often returns other new books to their publishers after only a few weeks on the shelves, before the titles have a chance to establish themselves; and a bestseller-driven system of high royalty advances that often do not earn back the money invested, a system that ratchets up unrealistically high sales expectations for new titles overall, and in so doing makes it increasingly difficult to publish certain kinds of books.

One-third of the way through his slim text, Epstein writes that his career has demonstrated an "ambivalence toward innovation." Ambivalence also pervades this elegiac book. Perhaps it is inevitable when a man looks back to his youth and forward to a future in which he will not play a major part, even if he is hopeful about that future. Perhaps, too, it is inevitable when confronting the distress signals of an industry he has spent his life in and clearly loves. Epstein shares his visions of a publishing future liberated electronically, but that future harks back to a deep-seated nostalgia, a longing for what was. His book seems to predict that technology in the form of the Internet will restore to the book business a certain lost rightness from the past.

His first chapter, like Dickens's Christmas tale, moves back and forth among past, present and future in an attempt to limn the larger changes of the past fifty years and what may yet unfold. The rest of the book is chronologically structured. It follows Epstein's career and the transformation of publishing from primarily small-scale, owner-operated enterprises rooted in the 1920s "golden age" of Liveright and Knopf to the "media empires" of today, which are forced to operate within an "overconcentrated," "undifferentiated" and fatally "rigid" bookselling structure. Now, he says, "there can't be Liverights or Cerfs because the context is so different. Roger Straus is the very last of them," and even he has sold his company to the German firm von Holtzbrinck.

Publishing must return to being "a much smaller business again," Epstein is convinced. "It has to, it's a craft and can't be industrialized any more than writing can. It's about to undergo a huge structural shift and there's nothing the conglomerates can do about it. The marketplace has shifted out from under them: the system of big money bestsellers defeats the possibility of building a sustained backlist. And without a sustained backlist, publishing cannot function in the long term. Providentially, just as the industry was falling into terminal decadence, electronic publishing has come along."

Epstein is in no way predicting the demise of print. Rather, his future is predicated on a kind of universal electronic Reader's Catalog, "much like Amazon" but far beyond it, "multilingual, multinational, and responsibly annotated. People will access it on their computers at home, in the office, and in kiosks like ATMs. It will be possible to browse those books, and downloading technology will eventually solve the problem of making it possible to buy those books. They won't exist in print until they're actually bought.

"There is no room on the Internet for middlemen, who sell the same product as their competitors, competing on the basis of price and service, and in so doing eat up their margins." Epstein is of course speaking of the Amazons and B&N.coms of today. "I think Amazon can't be here that much longer," says the man who sat at this same kitchen table doling out advice to its CEO, Jeff Bezos, a few years back.

As for brick-and-mortar stores, "the chains aren't tenable, either. They never were. The superstores have become what the old mall stores were. There are far too many of them, Waldens with coffee bars, and they will shrink. Stores run by people who love running bookstores will arise spontaneously like mushrooms and find a way to stay in business once the chains begin to recede."

And the conglomerate publishers? "I think they can show some financial progress for some years by cutting costs and cutting out redundancies, but eventually they'll find themselves with expensive traditional facilities that are increasingly irrelevant. They'll have to offload many functions on to specialist firms. In the end, they in turn will look for a buyer if they can find one. They should have noticed that the previous owners were all too happy to sell."

Meanwhile, authors will have found a way to bypass their publishers by going directly to the web. People will start independent authors' websites. Books will be much cheaper. Authors will have a much larger share of the revenue.

Stephen King has already gained notoriety for trying to do just that. But the spectacular launch of Riding the Bullet, done in conjunction with his publisher, Simon & Schuster, lost its momentum when he tried to serialize a novel, The Plant, online on his own. A downturn in paying customers for the later chapters led King to abandon the project. Asked about this, Epstein insists, "It's like the days of the early cars that ran off the road into the mud. People said cars would never work. Well, one of these days e-publishing will work."

Of other experiments now being tried Epstein is openly dismissive, and he sees a kind of Darwinian process filtering chaff from grain. Mighty Words and similar online publishers "don't know what a book is," he contends. "But people know what a book is. Human beings are designed to distinguish value, and in my opinion that problem will take care of itself."

He disregards the tremors that have gone through the publishing houses ever since B&N.com announced it was getting into the business of publishing books. Barnes & Noble Digital was formed the first week in January to compete with the new electronic subsidiaries of traditional publishers, which are bringing out digital versions of new titles readable on PCs or dedicated devices, as well as original works specifically created for electronic distribution. In addition, they are digitizing backlist and out-of-print books that can be reprinted in very small quantities in a process known as print-on-demand. "It's yet another premature entry," says Epstein. "B&N's publishing experience is limited to a remainder operation. That's entirely different from bringing out original works."

While Epstein criticizes the proverbial naysayers who laughed at those early cars stuck in the mud, he cautions at the same time, "I don't think an author who has worked hard to create something of value will want to risk it in the electronic format at this point." He says bookstores will wind up selling new titles at much lower prices than is now the case, $10 or so, but "can't figure out" how that will be done in the black. His predictions are compelling, but they are also much too vague--for instance, he sets out no time frame or actual mechanics for what he believes will transpire.

The bloat of the superstores is something publishers have worried about for years, almost from their rollout. This holiday season's flat sales at the three biggest chains; the margin-slashing of Amazon; and the re-energizing of the independent stores through a marketing program called Booksense, which includes web-based retailing, all serve to illustrate Epstein's points. Borders went so far as to put itself on the block, but found no willing takers. Recent murmurs about B&N's CEO Len Riggio entertaining a buyout offer from media conglomerate Gemstar-TV Guide International, which has aggressively entered the e-book technology market, did not result in a deal but also were more than simple gossip.

The past twenty years have seen the RCAs, MCAs, Advance Publications and the like learn their lessons and abandon book publishing, as Epstein has noted. Other conglomerates have already tried to offload their publishing components and in time will try again. But it also can't be ignored that companies like the German-based Bertelsmann (which acquired Bantam, Doubleday Dell and Random House and consolidated them) and von Holtzbrinck (which has bought Holt, St. Martin's and Farrar, Straus & Giroux) have their roots in the book business itself. They are therefore not as likely to exit the scene as Epstein would have us believe.

Undoubtedly, many of Epstein's electronic dreams are prescient and will one day come to pass. The companies that first turn them into reality, though, will likely be turning out works in the professional, scholarly, reference and educational sectors rather than in the trade world he knows so well. But although the Internet will change book publishing profoundly and in ways even Jason Epstein can't predict, other forces are at work as well and shouldn't be ignored.

A couple of years ago a brilliant and rich entrepreneur who also happens to be a profoundly bookish man devised a model, not unlike Epstein's nostalgic vision, of devolved companies publishing real books that share a central financial source. It is called the Perseus Group. It is still in its early days, far too soon to know whether it will last. But Epstein's longing for a more civilized, human-scale publishing business is shared by many. The Internet may help bring it about, but it won't do everything.

The following debate is adapted from a forum put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Study in Princeton, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.

John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.

In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with its people convinced that they know best? At Basic Books' fiftieth anniversary, it's a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.

What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-rehearsed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.

Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses dozens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. "Selling out" now has more to do with ticket grosses than with the antimaterialist's stance of standing apart from society.

How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we come to terms with the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.

Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?

Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.

They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.

In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable. It's journalistic, it's superficial. There's an assumed equation between unreadability and profundity.

My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.

So my argument--and one of the working titles of my book--was, in fact, "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really taking a generational approach, and the story it tells is, in fact, one of decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization absorb and suck away too much talent, and that there are too few who are bucking the trends.

Donatich: Maybe the term "public intellectual" begs the question, "who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?

Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual, as I understand that vocation, to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. Too many White House dinners, I think, can blunt the edge of criticism.

When you're thinking about models for this activity, you might put it this way: Sartre or Camus? An intellectual who was willing to look the other way, indeed, shamefully, to explain away the existence of slave-labor camps, the gulags, in the service of a grand world-historic purpose, or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and of capitalistic oppressors out to grind the faces of the poor?

There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.

Let me mention just one issue that I took on several times when I alternated a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.

At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three biological ethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype, but if you say that, you're an alarmist--so that's what I am.

This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.

So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.

When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.

A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and depoliticization, more generally.

I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.

Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?

Stephen Carter: You know that in the academy the really bad word is "popularizer"--a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.

A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.

And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think one of the reasons, if the craft of being intellectual in the sense of the scholar who speaks to a large public is in decline, is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.

One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.

You can scarcely read a lot of scholarship today without first having to wade through several chapters of laying out the ground in the sense of apologizing in advance to all the constituencies that may be offended, lest one be thought in the other camp. That kind of intellectual activity is not only dangerous, it's unworthy in an important sense, it's not worthy of the great traditions of intellectual thought.

There's a tendency sometimes to have an uneasy equation that there is serious intellectual activity over here, and religion over there, and these are, in some sense, at war. That people of deep faith are plainly anti-intellectual and serious intellectuals are plainly antireligious bigots--they're two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.

I was asked by a journalist a few years ago why was it that I was comfortable identifying myself, and often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is, there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.

And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.

Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.

America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.

In this century too, we have great intellectual preachers who also spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though people sometimes try to straitjacket him intellectually by insisting, with no evidence whatsoever, that he was actually just making secular moral arguments and that religion was a kind of smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.

And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.

Now, I'm not being ahistorical, I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudices that assume they can't be making serious arguments until they are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.

Donatich: Professor Carter has made a career out of illustrating the effect and protecting the right of religious conviction in public thought. Herbert Gans, on the other hand, is a self-pronounced, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory and yet remain relevant, contribute some practical utility to the public discourse?

Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.

So it seems to me it's just the opposite: that the public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. And I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And I think at least on my campus, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes and then it passes. But I think that really is happening, and if celebrities can exist, their numbers will increase.

One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.

Some of the columnists in the newspapers and the tabloid press who are not journalists with a PhD are public intellectuals. There are pundits who are satirical commentators, there are a significant number of people who get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.

Most public intellectuals function as quote-suppliers who legitimize the media. Two or three times a week, I get called by a journalist and asked whether I will deliver myself of a sociological quote to accompany his or her article--to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while there are few public intellectuals who are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.

I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.

The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.

But the disciplinary public intellectuals show that their disciplinary insights and skills can add something original to the public debate: that, in other words, social scientists and humanists can indeed grapple with the issues and problems of the real world. The disciplinary public intellectuals, like other public intellectuals, have to write in clear English. This is a rarity in the academy, unfortunately--which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which is important in one sense, because we all live off public funds, directly or indirectly, and we need to show every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal as proof that we're doing something useful or proof that we're entitled to some share of the public budget.

Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.

There are a number of hypotheses on this; I'm not sure any of them are true--whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, they think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)

It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.

Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?

Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to fall behind us--is that it really distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was kind of a news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the other really progressive and important things that were happening because of the rise of the web.

I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."

I think that I came to the web and to starting Feed, and to writing the book that I wrote about the Internet culture and interface culture, as a kind of a refugee from conversations like one in the academy, when I was a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute, convoluted Derridean question on his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."

The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.

So the good news, I think, is that my experience is not unique. In fact, there's been a great renaissance in the last five years of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but is not limited to them. And I think publications like Feed--to pat myself on the back--Hermenaut and Suck are all good examples of a lively new form of public intellectualism that is not academic in tone.

The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno never was. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model for this generation that really stands out, it's somebody like Walter Benjamin: a sensibility that puts together groups of things you wouldn't necessarily expect to see in the same essay.

How does the web figure into all of this? Why did these people show up on the web? I think one of the things that started happening--actually, this is just starting to happen--is that in addition to these new publications, you're starting to see something on the web that is unique to it: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio. Except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit it as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place and you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.

The web gives you a way of rounding up all those diverse kinds of experiences and ideas--and linking to them. Because, of course, the web is finally all about linking--in a way that I think nothing has done quite as well before it. And it also involves a commitment to real engagement with your audience that perhaps public intellectuals have talked a lot about in the past but maybe not lived up to as much as they could have.

Some of this is found in the new formats that are available online for public dialogue. I'm sure many of you have read these, and some of you may have actually participated in them, but I'm a great advocate for the kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat, which is a disaster in terms of quality of discourse and inevitably devolves into the "What are you wearing" kind of intellectual question. But rather conversations with four or five people, where each person has a day or half a day to think up their responses and then write 500- to 1,000-word posts. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum. It's very engaged, it's very responsible, it's very dialogic and yet also lively in a conversational way. But, because of the back and forth, you actually can get to places that you sometimes couldn't get in a stand-alone 10,000-word essay.

Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."

Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more people who are in this kind of space, having pages that are devoted to themselves and carrying on these conversations all the time with people who come by and engage with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish either your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story that I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months, I guess, after Feed launched, Wired came out with a review of it. And they had this one slightly snippy line that said, "It's good to see the East Coast literary establishment finally get online." Which is very funny, because we were publishing this thing out of our respective apartments. I had this moment where I was looking around my bedroom for the East Coast literary establishment--you open the closet door, and oh, Norman Mailer is in there: "Hey, how's it going!" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.

Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?

Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.

I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.

The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown of Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense, and bipartisanship, is again regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."

Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.

Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.

These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.

The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.

The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.

I find this absolutely extraordinary. You're told you must pick one of the available ones: "We've got you some candidates, what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. And he said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.

Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.

The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."

I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.

I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.

At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation. And what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appear before us on the day that the DNA string was finally traced out to its end, and we're told in their voices and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. For example, in The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.

So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, and against the disorder and anxiety and struggle that are required if the mind is to claim any place in these things at all.

I would say that because the intellectual has some responsibility, so to speak, for those who have no voice, a very high task to adopt now would be to set oneself, and to attempt to set others, utterly and contemptuously and critically and furiously against the now almost daily practice in the United States of human sacrifice. By which I mean the sacrifice, the immolation, of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.

People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."

A review of Sol LeWitt's Autobiography.

Steven Soderbergh's Traffic--for all its flaws--illustrates how the United States is deluding itself in its crusade against drugs.

A review of Looking Backward 2000–1887, by Edward Bellamy.

Let's cut to the chase on Ken Burns's Jazz, which rolled out on PBS January 8, by invoking Wallace Stevens.

1) Is it entertaining TV? Mostly, in PBS fashion.

2) Does it leave out people and places and whole periods and genres normally considered vital parts of jazz history? Yes.

3) Does it need more editing? Yes.

4) Does Louis Armstrong claim 40 percent of its nineteen hours? Yes.

5) Does post-1960s jazz claim 10 percent? Yes.

6) Does it tell an informed and informative story? Usually.

7) Does it identify the 500-odd pieces of jazz that serve as its soundtrack? Rarely.

8) Does it have rare and evocative pictures and film footage? Absolutely.

9) Is it good history? It's made-for-PBS history.

10) Will it satisfy jazz fans and musicians and critics? Seems like it already hasn't, and it hasn't even aired yet.

11) Will it save the jazz industry? That depends: CDs labeled Ken Burns's Jazz are bullish.

12) Will it make jazz a part of mainstream American culture again? Not likely, but it may help make it an official part of American popular history.

13) Is it part of the transition jazz has been making for three decades into the academic world? You bet.

Now let's dolly back and try to tell the story.

The numbers have to come first. The ten-episode, nineteen-hour series was six years in the making, and it sprawls: seventy-five talking heads, thousands of still photos and pieces of film, some 500 pieces of music and so on. Costing some $13 million, about a third of it from General Motors, it's the biggest documentary that's been done about jazz.

And yet a lot of jazz musicians and critics and fans, in print and on the web, have been complaining that it's too constrictive. It's easy to see why. It's certainly not comprehensive. For Burns and collaborator Geoffrey Ward, history unfolds in the textures of individual lives. (Ward won the Francis Parkman prize for A First-Class Temperament, one volume of his biography of FDR.) Jazz for them is the story of a few great men (and the odd woman) who changed the way Americans, then the world, hear and think and act. Chief among them: Louis Armstrong and Duke Ellington. There are places of honor for the likes of James Reese Europe and Jelly Roll Morton, Sidney Bechet and Bix Beiderbecke, Benny Goodman and Count Basie, Artie Shaw and Charlie Parker, Miles Davis and Dave Brubeck. This sort of survey is easier to sustain until about 1929, because jazz musicians were few (though not as few or as limited to New Orleans, Chicago and New York as the series implies). But Burns & Co. can tell a credible story of jazz's first decades using a handful of pioneers.

One reason for the noise is that this overlaps the story of jazz according to the Jazz at Lincoln Center program, a flashpoint in the jazz world. JALC teaches that jazz is a clear-cut genealogy of a few outstanding figures, and it excludes many important artists, especially after 1960, often for ideological reasons. The basic plot for both: Taking its building blocks from slave music, marching bands, blues, the church, European dance and classical music, jazz began life as a mongrel in New Orleans, came up the river to Chicago, met up (via Armstrong) with New York proto-swing bands and Harlem stride pianists and exploded, drawing young white players into a black-developed music. This is true enough, though it ultimately means ignoring uncomfortable parallel developments (Red Allen and Armstrong) or scenes (between-the-wars LA jazz) or entire genres (Latin jazz, European jazz). But schematic history can be good TV, and Burns, like earlier PBS filmmaker Frederick Wiseman, makes long, long movies that depend on strong, heavily delineated characters and themes to keep them from dissipating.

His story's heart is Armstrong. Its head is Ellington. And its soul is the Jazz Age and the Swing Era.

In episode five, "Swing: Pure Pleasure (1935-1937)," writer Albert Murray declares, "Jazz is primarily dance music." Though that hasn't been true for nearly half the music's history, it's clear he's speaking for Burns: Three episodes, nearly six hours, discuss the big-band era, when jazz underpinned popular music, lifted Depression-era spirits, saved the record industry and dominated that new omnipresent technology, radio. Nevertheless, as the often-intrusive talking heads tell us, from Ellington on down the musicians knew the difference between the business and the music; stage shtick and chart slots were as important then as now. This is a bittersweet Golden Age of speakeasies, hoods, the Great Depression, squealing bobby-soxers, lynchings, jitterbugging, novelty tunes and early moves toward racial integration. It is described as a time of "adult sensibility" and is the series' gravitational center.

The great-man schematic creates escalating difficulty for the plotting starting with episode seven, which begins with Charlie Parker and spends nearly as much time on Satchmo as it does on bebop. By the mid-1940s, the musicians had multiplied and moved on--out of Harlem and swing time. And so jazz dissolves into hundreds of musicians searching for different sounds, styles, approaches, languages, multimedia formats. The last forty years of Jazz are a choppy and unreliable ride; a lot disappears, and what's left can be telegraphic or confusing and look exactly like JALC speaking.

Burns says post-1960s jazz is too controversial even in the jazz world to be history. Maybe he should have ended, then, with John Coltrane; Baseball, after all, stopped at 1970. For in less than two hours, faces from Charles Mingus's to Sonny Rollins's flash across the screen between inevitable reprises of Duke and Satchmo. Miles Davis's push into fusion shrinks to his alleged desperation for teen fans. Ornette Coleman is dismissed. Keith Jarrett and Chick Corea don't appear. The 1970s and 1980s are a quick-blur artistic wilderness until the arrival of Wynton Marsalis, artistic director of Jazz at Lincoln Center and the film's senior creative consultant and prime talking head. And there, after a brief survey of new stars (Cassandra Wilson, Joshua Redman) and a recapitulation of key figures and themes, it ends.

The signal irony: If Burns had cut the final episode and billed this as Jazz: The First 50 Years, more of the discussion might be where it belongs--on the movie.

Until pretty recently nobody thought enough of jazz to point a movie camera in its general direction for very long. There are snatches of footage of Armstrong, Ellington, Fats Waller, Bessie Smith and the like from the early days. By the mid-1930s the popular swing bands cropped up in films and then in "soundies." But the video record of what fans like to call America's greatest art form is sporadic and discouraging.

This problem plays to Burns's strengths: He loves having his staff dig up old photos (for this, they turned up millions), and he loves working stills to make them kinetic. He pans across and slowly zooms in and out of a single shot to give it a movielike temporal depth. In one vignette about Harlem's Savoy Ballroom, where drummer Chick Webb held court and introduced Ella Fitzgerald in the 1930s, Burns intercuts shots of separate white and black dancers to hammer home the voiceover's point about its integrated patrons--a first in America. He assembles a deft mix of photos and film to re-create the stage-fright-to-triumph of Benny Goodman's 1938 "Sing Sing Sing" concert at Carnegie Hall.

The series boasts tours de force. The evocative segment called "The Road" strings out a head-turning daisy chain of wondrous footage: bands on trains and buses and touring cars, chugging 500 miles a day, six days a week, making whoopee and changing tires, riding high onstage and coping with breakdowns and prejudice offstage. The recently deceased bass and photography great Milt Hinton recalls how at band stops his wife would head into town to look for black homes where the musicians could eat and stay, how musicians were people of prestige in the community. Readings from journals and newspapers and diaries sample big-band life's dizzying ups and downs, while the film rolls from impromptu baseball games to a couple of female jazz fans puffing fake reefers while hugging the sign of a town named Gage.

And in the background rolls out more jazz by far than 99 percent of America has heard. Much of the time it surfaces only as snippets behind one talking head after another. The heads are duly identified time after time. The tunes aren't, unless they're keyed to a biographical or sociological set piece. Why not flash a subtitle to tell the audience what's playing?

Because jazz is the soundtrack for this series as much as or more than it is its subject. To put it another way, this isn't really a movie about jazz history. Think of Burns as PBS's Oliver Stone. Like the Civil War and baseball, jazz for Burns and Ward is a lens to focus on basic questions: Who are Americans, and how do they manage to get along--or not? And their central query concerns race.

So they film jazz as the tale of black redemption in and of America, a narrative of conversion and triumph whose shape recalls St. Augustine and Dante. From the days of slavery through the humiliations of Jim Crow and minstrelsy to the assertive freedom of the blues and jazz, Burns's movie resounds with the apocalyptic ring of apotheosis, as it examines a few crucial candidates for cultural sainthood. For it wants both to carve jazz greats into the American pantheon and to underline jazz's pivotal centrality to twentieth-century America as an affirmation of African-American creativity and endurance.

This, coupled with Marsalis's camera-savvy polish as a spokesman as well as his insistent championing of jazz education over the years, explains why a filmmaker like Burns would feel drawn to JALC's version of jazz history. (Actually, Dan Morgenstern, the respected head of the Rutgers Institute of Jazz Studies, was the film's senior historical consultant and vetted the script; there were twenty-three consultants in all, so until the final episode there are inevitable points of similarity, but not identity, with Lincoln Center's tale.) But dramatic necessity also helps explain why some characters, like Armstrong and Ellington, are the story's recurrent focus.

Swing, you might guess, is a buzzword in this series, and you'd be right, even though the film itself doesn't swing much. The earnestness that suffuses PBS cultural products won't let it float for long. At times, the music's lilting ease and fire contrast vividly with its deliberate, self-conscious pace. That's exacerbated by Burns's seventy-five talking heads: Watching can be like sitting through a course team-taught by the UN.

Besides Marsalis, Burns's other main soloist is writer Gary Giddins, and Giddins swings: His wide-ranging erudition rides his love for jazz easily. Other commentators--Stanley Crouch, Albert Murray, Artie Shaw, Gerald Early, James Lincoln Collier, Dave Brubeck--give good camera and consistent historical edutainment. But too many proffer vague impressions, clichéd memories, breathless interpretations and warmed-over anecdotes. They could easily have been edited or edited out. Then there are periodic pileups. In episode seven Joya Sherrill, Mercedes Ellington (Duke's granddaughter) and a few others repeat that Duke and Billy Strayhorn were a rare and wonderful match. In episode five, the same two dancers appear twice with virtually the same observations about Harlem's Savoy Ballroom.

Sometimes the anecdotes are fun or fabulous, sometimes they're bad history. Take Jon Hendricks, who in episode four retails the disproven mythic origin of Armstrong's scatting (sheet music fell off his stand at a recording session). Or director Bertrand Tavernier, who gushes about Django Reinhardt and Stéphane Grappelli introducing the guitar-violin combo to jazz, though they themselves would have fingered Eddie Lang and Joe Venuti. Ballplayer Buck O'Neil rambles good-naturedly about Billie Holiday giving listeners "the greatest moments" and "the saddest moments," demonstrating how a tighter edit could have sliced out the lapses into vacuity.

Marsalis's starring role has several sides. He delivers very effective musical glosses and explanations, polished by years of shows and clinics with adults, teens and kids. His knowledge of and passion for the jazz he loves, and his conviction that it represents American life in full, are infectious, if sometimes hyperbolic. But when he holds forth about Ellington and Armstrong and the semilegendary Buddy Bolden as if he knew them intimately, it's TV, not history.

History can be light-fingered instead of heavy-handed, and Jazz could use more humor, more of the "light" Marsalis ascribes to the best jazz musicians. It has some fabulous vignettes from Crouch, the third-ranked talking head. Except for the last two hours, Crouch swings. In one priceless bit he mimics pre-Armstrong pop vocalists and then Armstrong himself, and asks why anyone would want to revert. "That would be a bad choice," he deadpans. Anybody who makes that choice, he adds, should be deported--count a beat--"to somewhere." Another beat. "Maybe Pluto." It's impossible to disagree, especially when you're laughing.

To some extent, Burns has himself to blame for the unjoyful noise in the jazz world. In conversation, he tends, rightly, to underplay his work's ambitions. It's not the history of jazz, he says. Viewers will get to know a handful of musicians, meet another dozen or two and brush past a few dozen more. He can't possibly compete with books like Giddins's Visions of Jazz or jazz histories like those of Ted Gioia or Marshall Stearns; he's made a movie that tells an educational story for a mass audience. This is reasonable, accurate and no small feat. And, in fact, the movie is steeped in rich human detail of the sort most music historians rarely touch on. But the PR bombast trumpets him as jazz's Joan of Arc, and once he's on-message he can't stop selling. Jazz, like academia, is small and marginal with plenty of defensive, combative types; "the music" is a secular religion. Burns's perceived power inevitably lights the territorial fuses.

As it happens, the jazz industry, now down to about 1 percent of US music sales once you exclude Kenny G and his clones, looks like a Victorian maiden lashed to the tracks awaiting her hero. Burns's movie is a mantra, as record labels crunch despairing numbers and weed out personnel and artists after the latest wave of megamergers and Internet terrors. For his well-designed five-CD companion set (subtitled The Story of America's Music), the filmmaker brokered a deal between Sony and Verve (Universal), bitter corporate rivals, then brought in other labels; all are hoping for sales like the companion book's, which had a first printing of 250,000. This is mind-blowing if you're a jazz-label head used to dealing in niche sales (Marsalis himself rarely moves more than 10,000 CDs) and waiting for the next guillotine stroke.

Potential audience numbers get tossed around fervently: 40 million viewers for Baseball and The Civil War, and Jazz will probably draw fewer, but... It fascinates me that few of the film's critics address that. Why not consider an America where 20 million more people--or 3 million, or however many finally watch--know something, anything, about Armstrong, Ellington, Parker, Davis and a few others? Where, if they survive the overstatements, talking heads and pacing, they learn some hidden history?

Am I a Pollyanna? Maybe. Reality check: This is a made-for-TV movie. But I too think race is America's central issue, even more multifaceted in the twenty-first century. What holds this joint's pasted seams together, beyond the Founding Documents, is the frequently intangible glue called culture. TV is a major place American culture gets made. Can anyone measure what it meant to have Bill Cosby playing an upper-middle-class dad-next-door for a generation? What it means now that there are black and Hispanic and Asian and gay and you-name-'em channels filling cable and satellite TV? Can anyone guess what it might mean in five years to have Jazz, whatever its warts, playing over and over to a country as terminally divided and in search of itself as this one?

These are not delusions of grandeur about the power of jazz or Ken Burns. They are possibilities written in the history of jazz in America. Take Burns's vignette about Charlie Black, a white Texas teen who saw Armstrong perform in the 1930s. It changed his life. He joined the NAACP's legal team working on what became Brown v. Board of Education. The sociology of jazz is full of such stories. And they are very real.

For instance, no one with a brain disputes that jazz was initially an African-American creation. But as Marsalis, Giddins, Crouch, Murray and Early point out over and over, jazz was welcoming, inclusive, open. It replaced minstrelsy with a cultural site where all Americans could participate, speak to one another, override or ignore or challenge or slide by the society's fixations on racial and ethnic stereotypes. Black Americans (and other ethnic outsiders) could use it to enter mainstream society, white Americans could flee to it from mainstream society, and the transactions created a flux and flow that powered American cultural syntheses.

Jazz, the theme goes, represents America at its best--the dream of America. In the Depression, as Early reminds us, it rivaled MGM musicals in lifting the country's spirits. Of course, since jazz is a human activity, it also reflects the deepest divisions as well as the ideals at America's core. Race, sex, money, power, capitalism, creative freedom, the interaction of the individual and the group--these are all questions embedded in jazz history. They're the questions Burns and Ward are truly interested in. At its best, Jazz gets us interested in them too.

Burns admits he never listened to jazz until he started considering it as a subject. Ward became an Armstrong fan at age 10, when he was hospitalized with polio. Jazz is lucky they're interested in it.

Right now, jazz's commercial future is murky. The major labels are mostly wreckage. Marsalis, who used to get $1 million a year to make niche-market records in the hope that they would turn into catalogue gold, doesn't have a label; neither does Redman. High-profile jazz promoters are hemorrhaging. The Knitting Factory is reportedly in the hole for $2 million, after luring a big entertainment firm to take a stake, opening a club in LA and losing its annual jazz-festival sponsor. The Blue Note chain is said to be spurting red ink from expansions into Las Vegas and midtown Manhattan. Nor are jazz's nonprofit arms thriving. The Thelonious Monk Institute, so closely aligned with the Clinton/Gore Administration that its head was reportedly hoping for an ambassadorship if Al won, is looking pale. And the long-dormant board of Jazz at Lincoln Center has just fired executive director Rob Gibson in a swirl of intrigue: changed door locks and computer codes, fired and rehired personnel, amid persistent rumors of financial malfeasance, bullying and drug abuse.

Jazz has been on a commercial slide since the 1970s, when it racked up 10 percent of retail music sales. At the same time, it began entering the groves of academe. Today most jazz musicians are trained at schools; jazz history is laced through American studies and music curriculums.

This process has already fundamentally changed jazz itself and its relation to American culture, though how isn't always clear at first. As a colleague reminded me recently, in the jazz heydays celebrated by Burns's Jazz, musicians fashioned their own idiosyncratic solutions to musical problems, drawing on oral tradition (which varied considerably) and their own ingenuity and needs. This meant finding individual creative solutions to problems--how to finger this note or sequence, how to get that timbre, how to connect those chord changes. Now, a professor distributes computer analyses of famous solos, templates for solutions that are shared by hundreds and thousands of students. This has a paradoxical effect: It raises the general level of and standardizes jazz training, but it also tends to vitiate the individuality traditionally at the music's heart. This is why older musicians routinely complain that younger schooled players all sound alike. On the other hand, they're well suited for jazz repertory programs like JALC.

That is part of jazz's changing contemporary dynamics. So is Ken Burns's Jazz.

In the aftermath of the Iran/contra crisis, one of the networks decided to make a docudrama about the life of Ollie North, loosely based on a biography by Ben Bradlee Jr. Its problem was that once North joined the Reagan National Security Council staff, the story lost both its moral compass and its empathetic value. The producers could not find a single real-life character among the top Administration officials who displayed the slightest concern about the moral implications of North's drug- and gun-smuggling, hostage-buying and terrorist-supplying enterprises. They solved this problem by simply inventing someone.

The producers of Thirteen Days, the new Kevin Costner/Cuban Missile Crisis $80 million extravaganza, have done something similar. Instead of inventing a new character, however, they have invented a new history for an old one. Special Assistant Kenneth O'Donnell, who was responsible primarily for presidential scheduling in real life, does not even register in respected crisis histories. In the nearly 700 pages of transcripts from ExComm, the ad hoc committee dealing with the crisis, edited by Ernest May and Philip Zelikow and published by Harvard in 1997, O'Donnell rates exactly two insignificant lines. Yet here we see O'Donnell, played by Costner, saving the Kennedys from themselves and the world from self-destruction. One minute SuperKen is bawling out the President for going soft on the Commies, the next he's roughing up Mac Bundy for suggesting the same. A cross between an über-aide barking orders at quivering politicos and a shaggy dog who follows his master around with scotch-filled Waterford crystal, he instructs Adlai Stevenson to stand up to the Soviets at the UN and a fighter pilot to pretend he was not shot at in Cuba. Cynics looking for an explanation of this rather odd historical rewrite might point to the fact that the film was partially funded by O'Donnell's son, Earthlink co-founder Kevin O'Donnell.

Reviewers like the Wall Street Journal's Joe Morgenstern innocently term the film "a valuable history lesson." In fact, the film takes countless liberties with the documentary record. For instance, Thirteen Days

§ conveniently skips Robert McNamara's initial arguments that Russia's placement of the missiles should be ignored because Soviet long-range missiles made them strategically meaningless, lest this comment undercut the film's entire rationale;

§ ignores the record of US efforts to destabilize the Castro regime, including contingency invasion plans being readied at the time of the emplacement;

§ explicitly whitewashes the Kennedys' unconscionable McCarthyite plot to discredit the dovish Adlai Stevenson, whose recommendations they largely--and secretly--ended up following;

§ sans evidence, attributes a column by Walter Lippmann that contained the seeds of a crisis-ending missile trade to a leak direct from Jack and Bobby;

§ places the Kennedys' meetings that decided in favor of a missile trade inside the ExComm, when in fact they deliberately kept these secret from the "Wise Men," fearing the same attacks they themselves had leveled at Stevenson.

Of course, the level of accuracy is not too bad for a film whose credits include six tailors and seven hairdressers but not one academic historian. (Former CIA analyst Dino Brugioni, author of a fine book on the technical aspects of the crisis called Eyeball to Eyeball, is listed, but one hopes he had nothing to do with its story line.)

My view is that anyone who takes Hollywood's history for scripture deserves whatever they get. As John Sayles has observed to Eric Foner, "Using [the word] 'responsibility' in the same sentence as 'the movie industry'--it just doesn't fit." Yet at the same time, Sayles noted, Hollywood can't help itself. Often the only way to sell a movie is for the ad to read "Based on a true story..." Sometimes they get away with it, sometimes not, usually depending on whose interests are served by the lies in question. When Costner and Oliver Stone offered up their loony version of the Kennedy assassination in JFK, the Washington media establishment reacted with such outrage that the Capitol threatened to float away on hot air. No one wanted to see Stone's conspiratorial version of the assassination and the Vietnam War replace the official misinformation. On the other hand, some Hollywood lies are welcomed by pundits. Last summer, Mel Gibson and company came up with a version of the American Revolution in The Patriot in which the Americans, not the British, freed the slaves. No matter that the Southern revolutionaries fought to protect their "peculiar institution" while the British offered the slaves their freedom should they join the loyalist cause. William F. Buckley (surely a born loyalist if ever there was one) came forward to endorse Hollywood's fictional history. David Horowitz, displaying his patented post-Stalinist brand of hysterical ignorance leavened with personal dishonesty, complained, "Leftwing reviewers inwardly despising its patriotic themes have taken to faulting its alleged historical 'inaccuracies' as a way of dismissing its significance.... [But] isn't this what the American revolution was about--the promise that all men would be free? And didn't the new nation deliver on that promise in a generation and pay an even greater price in blood to do so?"

Well, no, Comrade Horowitz, it didn't. A generation after the Revolution, the slaves were still slaves, and Southern revolutionaries were still slaveowners. The Emancipation Proclamation (which freed only selected slaves) took nearly a century, and blacks were not given the right to a meaningful vote in the South for another hundred years after that. (Moreover, some, including quite a few thousand in Florida, are still fighting.)

Judged by the standards of JFK and The Patriot, Thirteen Days looks pretty good. At least it comes with a warning: "You'll never believe how close we came," its ad campaign promises. And I didn't.

Modern Russian history, as taught by Clinton Administration spin doctors and Op-Ed pundits, holds that Boris Yeltsin dismembered the Soviet Union and set Russia on a historic path to democracy and a market economy. The Russians were eager to follow their first "duly elected" leader. They idolized the West and they willingly surrendered their values and their dreams--at least the "new Russians" did, a term that apparently is confined to a segment of the newly rich Muscovites. Year after year we were told that Yeltsin's reforms were changing the face of his land--witness the number of Mercedes and the evidence of breathtaking conspicuous consumption. A few years ago, shortly after I checked in to one of the new luxury hotels in Moscow, I was told that each Friday I could avail myself of fresh lobster flown in from Canada that very day! Western experts advising Yeltsin's "young reformers" on how to proceed were optimistic. I was given a stern lecture by one of them, an economist from Sweden, for suggesting that conditions in the country appeared catastrophic in comparison to the days of Communist rule.

The Yeltsin legend took hold; he was Clinton's icon for a new Russia. From the moment he stood on a tank in August 1991 to face down an attempted Communist coup, Yeltsin was championed by the West as Russia's great hope. He was an appealing figure, athletic, always neatly dressed. He publicly boasted of his friendships with Bill Clinton and other Western politicians. He was a man to do business with, the Kremlin leader whose government was no longer a threat, whose human failings were on display for all to see. Who could forget Clinton's uproarious laughter as he tried to defuse Yeltsin's drunken diatribes during their summit at the Roosevelt museum in Hyde Park? Or the inebriated Yeltsin snatching the baton from the conductor of the Berlin Police Band and proceeding to conduct it himself? Little attention was given to Yeltsin's tanks pounding his "duly elected" Parliament or to his policy in Chechnya. The Clinton Administration publicly encouraged Yeltsin to disband the Parliament because a solid majority of deputies wanted to pursue reforms more slowly. Several months before he actually moved against the legislature, a senior US official told the New York Times that "if Yeltsin suspends an antidemocratic parliament, it is not necessarily an antidemocratic act." Later, while Russian planes, tanks and artillery rained death on the Chechen capital of Grozny, Clinton saw fit to compare Yeltsin to Abraham Lincoln. Even when Yeltsin's entire economic reform program came crashing down in 1998, Vice President Gore voiced the opinion that "optimism prevails universally among those who are familiar with what is going on in Russia."

In short, the Clinton Administration hitched its Russia policy to Yeltsin's fortunes. Yeltsin's critics in Russia were dismissed as "dark forces" seeking Communist restoration or worse. There is a simple explanation. Heavily dependent on Western loans and subsidies, Yeltsin was always prepared to render services to Washington, provided he was handled with great sensitivity and accorded even greater public respect. He proved accommodating in Bosnia and again in Kosovo.

But for most Russians, Yeltsin's rule was a social and economic disaster. They viewed him--not without good reason--as being completely dependent on Washington, where the US Treasury and the International Monetary Fund are located. These institutions were a primary influence on his behavior and the often violent and self-destructive course he followed.

When he suddenly resigned on New Year's Eve a year ago, Yeltsin left his successor an impoverished state with few features of democracy and many more of authoritarianism. It is too early to assess properly the true meaning and consequences of his rule. But figures indicate that it wreaked far greater damage on Russia's economy than the Nazi invasion and World War II. Russia's gross domestic product between 1991 and 1998 declined by 43 percent, compared with the Soviet GDP decline of 24 percent from 1941 to 1945.

Behind this figure lurks a dramatic decline in living standards. An estimated 40 percent of the population lives in poverty--a tenfold increase since the collapse of Communism. Yeltsin's policies have had a catastrophic effect on health, education and social programs. Rising infant mortality, declining life expectancy and spreading infectious diseases have produced a negative population growth that is obscured in part by the steady stream of ethnic Russians returning from the outlying parts of the former Soviet Union (by 1995, Russia's population had declined by some 2 million). Agriculture remains comatose--Russia today imports 55 percent of its food supply. Officially, unemployment is about 12 percent, but the real figure is estimated to be between 20 and 25 percent (there are about 11 million Russians of working age who are listed as "missing"). The average daily food intake today is 2,100 calories, less than the minimum recommended by the World Health Organization; in the 1980-85 period, the average intake was 3,400.

None of these troubling issues are to be found in Yeltsin's Midnight Diaries. Memoir writers of course want to present themselves in the best possible light, and Yeltsin is no exception. He portrays himself as the leader who set Russia on a new course, gave it political stability and secured a peaceful transfer of power. Under his leadership Russia has joined the exclusive club of the eight most advanced industrial nations in the world.

What seems most remarkable about this Panglossian version of one of the most turbulent decades in Russia's history is its tenuous relation to reality. The disastrous reform program and the failure to introduce the rule of law, to the extent that they are touched upon at all in this book, are presented with serene detachment--Yeltsin writes about such things as though they had nothing to do with him.

On the other hand, Yeltsin wants us to believe that he had everything to do with his memoir, that he wrote it himself "in fragments over the years...late at night or early in the morning." It is widely known that it was ghostwritten by Valentin Yumashev, a former journalist and Yeltsin's longtime personal aide, with daughter Tatyana being the final censor. Only in passing--in Chapter 13--does Yeltsin mention that Yumashev worked with him on the manuscript. Other bits of contrived candor are sprinkled sparsely around with the apparent aim of defusing--with a sentence or two--some of his well-publicized shortcomings, even his drinking problem. Yeltsin says alcohol was his "only means [of getting] rid of stress"--until his 1995 heart attack. His consumption was afterward reduced to a single glass of wine per day.

Herculean efforts are made by the authors of this slapdash tome, which is filled with homilies about duty and patriotism, to suggest that Yeltsin possessed mysterious and therefore miraculously effective leadership skills. He liked the "simple, effective" style of leadership and made his decisions with "surgical precision." His stationery was embossed in gold with the presidential seal. His desk was cluttered with "coded telegrams" and "presidential mail." He used his "presidential pen" to sign decrees. By pushing buttons on a presidential "control panel" he could reach his far-flung minions. Metaphors reinforce the image of a supercool superexecutive who is always in control. Sometimes he is the sea captain, steering the Russian ship of state past dangerous reefs and shoals. On another occasion, before making a major announcement, he is like the space expert about to fire a rocket. ("Now it was too late for doubts. The countdown had begun. The bomb was ticking.")

The oddest thing about the details is that they offer the illusion of concreteness to obscure enormous ambiguities. We don't see Yeltsin making decisions on any substantive domestic issues. There is no evidence of his even being aware of the scope of devastation visited upon the people by his social policy. (Statistics give us an inkling: His government used only 9 percent of GNP on social services, compared with around 33 percent in the West.)

Yeltsin is certainly not stupid, but when you consider his remarkable energy in fighting for the presidency he seems unaccountably passive in other respects. We don't see him really concerned with the substance of his job. It is difficult to find an economic or social initiative Yeltsin conceived and brought to completion. ("There won't be any inflation," Yeltsin tells the press shortly before prices explode following the collapse of the ruble.) In fact, he reversed the democratization trend initiated by Mikhail Gorbachev. Yeltsin resorted to force to overhaul the entire constitutional order and to create a presidency that suited his needs. According to his own account, his crowning political achievement was Vladimir Putin's election as Russia's president in March 2000 (much of the first and last portions of the book are devoted to this).

There was something Reaganesque about Yeltsin, for his leadership seemed to exist only in his public utterances. But Reagan looks like a giant by comparison, since he held on to a few simple but firm beliefs and surrounded himself with capable aides. Yeltsin seems to be missing a central core belief--"the vision thing." He believed, he said in his final speech (in which he asked forgiveness of the people), that he was moving the country from its totalitarian past "to a bright, prosperous, civilized future." But wasn't that exactly the belief he was supposed to cherish when he served as the Communist Party boss of Sverdlovsk?

There is nothing in this book that appears to qualify Yeltsin for the presidency, with the exception of his prodigious lust for power and a genius for behind-the-scenes, byzantine politics, in which various elites struggle over the reallocation of power and wealth. Yeltsin was not a marionette. Far from it--he made his way up the greasy pole of power and fought constantly to stay at the top. He may have been extraordinarily passive on economic and social matters, but he was a superb bureaucratic infighter--bold, decisive and ruthless. He had no qualms about sacrificing even his most loyal supporters. "It was too bad, really just too bad," he notes after dismissing one of his prime ministers. When he fired his longest-serving prime minister--"faithful, decent, honest, intelligent" Viktor Chernomyrdin--he did it without forewarning because decision-making requires a special approach. "A decision should not wait. With any leakage of information, the decision ceases to be a bold, unexpected move and turns into the opposite." But even though he says firing people caused him "the severest kind of stress," Yeltsin concedes that he "felt an unusual rise in spirits, an enormous wave of optimism." He insists that his perpetual personnel changes were part of a careful and deliberate search for a politician to replace him and continue "on the path of democracy."

On the basis of the evidence, in the light of his years as president, we see Yeltsin as confident, surefooted and deeply interested in only one issue: the preservation of his personal power. He is a genius at perpetual conniving. Unlike Reagan, Yeltsin feared competent officials with established reputations. He entrusted great power to younger, inexperienced people without a political base of their own, then dismissed them when things went wrong. The failures were attributed to his revolving-door prime ministers as though they bore exclusive responsibility.

But why put oneself at the mercy of incompetent advisers? Yeltsin reveals his priorities in explaining his reasons for appointing Sergei Kiriyenko, 35, an obscure and inexperienced official, as prime minister: "Everybody needed a new figure, not someone who would lobby for the interests of some against others, not someone from some sort of camp, not someone who had already appeared in Moscow's echelons of power." In short, someone without a history or a political base. During his most severe crisis, in 1998, Yeltsin turned to his foreign minister and perhaps the most experienced man in the government, Yevgeny Primakov. But when Primakov suddenly gained widespread popularity in early 1999, Yeltsin became alarmed. He realized, he writes, that Primakov "was becoming a serious political alternative to my course and my plan for the country's development." Ignoring Primakov's "honesty, decency, and loyalty," Yeltsin swiftly defused the threat by dismissing the prime minister for alleged pro-Communist sympathies. "Primakov had too much red in his political palette," Yeltsin writes.

His final choice was Vladimir Putin, a former KGB lieutenant colonel, who was named prime minister in the summer of 1999. Putin's first move on becoming acting president was to sign a decree protecting Yeltsin from future criminal prosecution.

In this context, Yeltsin's rambling memoir is inherently interesting for what it tells us about his character and maneuverings. Its author-statesman casts the ongoing Russian drama in terms of Kremlin intrigues, ceremonial functions, gossip, meetings and talks with foreign potentates, and perpetual personnel changes. All along it is Yeltsin who holds every string in his hands and who, like a puppetmaster, keeps moving the cardboard characters he has created, apparently for that very purpose. The sagging economy, rampant corruption, rising crime, growing social inequities and one of the greatest lootings of assets ever recorded in history seem to be matters of minor concern. "How can you force a bureaucrat not to take bribes to feed his family, when he earns only 5000-6000 rubles per month but is involved in monitoring multi-million-ruble transactions?" Yeltsin writes. "Naturally, the only way is to raise his salary."

The picture that emerges is one of a petulant, self-centered, cunning man whose lust for power and fragile ego were the dominant forces of his presidency. Even though he was a model Bolshevik throughout much of his life, even though he had toadied to Brezhnev, Chernenko and other political leaders to crawl up the party ranks, Yeltsin had always been a man waiting for the main chance. He turned against his colleagues when they blocked his path to the top in 1987. He fought hard. He finally seized his chance. The image of the man atop a tank was the apex of his career, the grand gesture for which he will be remembered in history.

Paradoxically, despite his anti-Communist diatribes, Yeltsin remained a Bolshevik at heart insofar as he believed that strong-willed and determined individuals could change the world by forcibly engineering social and economic changes. He saw himself as just such a man. He sought to obliterate the past, revise his own history and cultivate his own myth. I recall a St. Petersburg historian contemptuously quoting from an early New Year's Eve address to the nation in which Yeltsin referred to Communists as "they"--"They have imposed Communism on us for seventy years." And who was talking, the historian asked rhetorically? A former Politburo member and Communist boss of Sverdlovsk.

What is there of substance, if anything, in this man who strove mightily for grand gestures and theatrical effects? Midnight Diaries provides no answers, so there remains the question of whether Yeltsin ever seriously considered championing a democratic revolution.

What happened in 1991 is that the students and workers who made the revolution and toppled the old regime did not know how to make a new government. Those who did know how were the ones from the old regime. Yeltsin brought those same people back to power and subsequently worked mightily against the very democratic forces that had been the mainstay of his support when he was a populist politician.

Yeltsin's memoir offers no evidence to suggest that he was ever interested in the systematic mobilization of Russia's democratic forces. He had no vision of the nation's identity and future; his concerns were far more personal. His obsession with the grand gesture--something that required an element of surprise--made him fret constantly about leaks. Not only did he crave the limelight, he always tried to stun the world by unexpected actions: "If the news were to leak, the whole effect would be lost," he writes about his decision to resign. "Any leak, any advance talk, any forecasts or proposals would put the impact of the decision in jeopardy." In June 1999, at the end of the war with Yugoslavia, he ordered the Russian brigade serving on peacekeeping duty in Bosnia to steal a march on NATO and occupy the Pristina airport in Kosovo even though he knew it was an empty maneuver. "I decided that Russia must make a crowning gesture, even if it had no military significance," he writes. This was, he adds, "a sign of our moral victory."

Ironically, the first wave of opposition to Yeltsin's policies came from the very people who brought him to power. They argued that his economic reforms had little to do with a genuine free market but amounted to a Bolshevik-style, top-down expropriation and redistribution of assets in disguise. In The Tragedy of Russia's Reforms: Market Bolshevism Against Democracy, Peter Reddaway and Dmitri Glinski note that by late 1993, most democrats--"an entire generation of talented and idealistic would-be leaders of Russia's body politic and civil society"--were "pushed off the political stage along with the democratic movement as a whole."

Reddaway, a professor at George Washington University and former director of the Washington-based Kennan Institute for Advanced Russian Studies, and Dmitri Glinski, a senior research associate at the Moscow-based Russian Academy of Sciences' Institute of World Economy and International Relations, teamed up to produce a critical analysis of the Yeltsin years in power. It is a finely argued and frequently provocative account that deserves a respectful hearing.

Reddaway and Glinski believe Yeltsin had "little commitment to democracy, the national interest, or the economic development of his nation." His rule was an age of blight. The destruction of Russia's intellectual assets was particularly severe. The number of scientists has shrunk from 3.4 million to 1.3 million. Russia's net financial loss from the decline in its science is estimated at between $500 billion and $600 billion annually.

The overall damage inflicted on the economy, they write, exceeds that of the comparable American experience during the Great Depression or, again, the industrial loss inflicted on Russia in World War II. High-tech industries suffered the worst. Production in electronics fell by 78 percent between 1991 and 1995. In 1997 imports made up half the Russian consumer market.

The picture of devastation looks even grimmer in light of dramatic declines in energy production (since 1991 oil production has declined by 50 percent, gas by 13 percent and electricity generation by 20 percent). Lack of investment in electricity generation will have potentially far-reaching consequences for the military and civilian economies, with the prospect of future migrations away from the frigid northern zones of the country. Brownouts have already forced a population exodus from the city of Norilsk.

"For the first time in recent world history," Reddaway and Glinski write,

one of the major industrial nations with a highly educated society has dismantled the results of several decades of economic development...and slipped into the ranks of countries that are conventionally categorized as "Third World." To make this experience even more dramatic, this comprehensive national collapse occurred at the same time as the nation's leaders and some of their allies in the West promised Russians that they were just about to join the family of democratic and prosperous nations.

Instead of promoting democracy, these analysts argue, "Yeltsin and his associates disbanded the new post-Soviet parliament by force and emasculated its successor, blocked the development of an independent judicial branch, reduced the power and revenue base of local self-government, and by 1994 had imposed a regime of Byzantine authoritarianism on the country."

The authors contend not only that Russia's social and economic degradation "can and should be reversed" but that it is in the national interest of the United States and Western Europe to assist in that process. A more stable Russia, they say, would provide better hope for viable security arrangements and for a more cooperative relationship between Moscow and key international organizations.

But this is a Catch-22. Given the fact that Moscow is not able to service its foreign debt, no influx of foreign capital is to be expected. Who wants to invest in a country lacking comprehensive, clear and effective tax legislation?

In his book Post-Soviet Russia, the distinguished Russian historian Roy Medvedev also chronicles the failures of Yeltsin's rule, arguing that Russia's plunge to capitalism was both precipitate and ill conceived.

Yeltsin first privatized the area of public safety, which led to the creation of private armies and mafias. At the same time, the managers of state-owned firms created private companies and moved the cash flow to offshore banks in Cyprus. New banks were formed and made fortunes in currency transactions.

There was something very Russian about the whole endeavor. Yeltsin approached it much in the way Peter the Great and other czars carried out their modernization programs; "capitalist perestroika" was imposed from above. Medvedev notes Yeltsin's explanation: "We had to forcibly introduce a real market place, just as potatoes were introduced under Catherine the Great."

The remark suggests, perhaps inadvertently, how vague was Yeltsin's grasp of the magnitude of the undertaking. Once prices were allowed to float freely, they immediately jumped fifteen to twenty times over. Hyperinflation wiped out the population's life savings. It touched off the flight of capital, as profits from exports were deposited in Western banks as a hedge against inflation. Low domestic prices on raw materials generated illegal exports and the emergence of illicit trade. Domestic production declined sharply. In 1998 the government once again devalued the ruble and froze bank accounts.

Another of Yeltsin's failings was his lack of sound judgment about people. Medvedev catalogues the incompetent, inexperienced young men with whom Yeltsin chose to surround himself. One was a junior foreign ministry bureaucrat, Andrei Kozyrev, who was made foreign minister. Medvedev likens another Yeltsin aide, Boris Nemtsov, to the character of a confidence man in Gogol's Dead Souls.

Why did Yeltsin entrust so much of the government to a young and green journalist, Yegor Gaidar? During their very first meeting, Gaidar assured Yeltsin that the shift to the market could be accomplished in one year. Yeltsin himself provided an account of the "surgical precision" of his decision to place Gaidar in charge: "It's a curious thing, but I couldn't help being affected by the magic of his name," Yeltsin wrote later. Gaidar's grandfather, Arkady Gaidar, had been a famous children's writer whose books were read by generations of Soviet kids, Yeltsin explained: "Including myself. And my daughters. And so I had faith in the inherited talent of Yegor, son of Timur, grandson of Arkady Gaidar."

Gaidar's advisers included a group of Western experts, led by Jeffrey Sachs of Harvard and Anders Aslund of the Carnegie Endowment [see Janine R. Wedel, "The Harvard Boys Do Russia," June 1, 1998]. The Russians were very receptive to outside advice; they thought the West was genuinely concerned. But expert recommendations failed to "take into account the structure of the Russian economy and its particular features" and thus did more harm than good, Medvedev believes. He goes even further and suggests that the experts were trying foremost to preserve the interests of the wealthy Western countries.

"Shock therapy" sent the country reeling with pain, causing tremendous harm to an economy that, despite all its known shortcomings, did include first-rate firms and research and development laboratories. This was particularly true of the military-industrial complex, which employed millions of highly skilled workers, technicians and engineers. Yeltsin failed to reorient these resources to the production of consumer goods. At the same time, there was a sharp drop in government orders for military-industry goods.

But the calamity also created opportunities for people to become rich almost overnight. For aficionados, Medvedev provides a detailed analysis of this new but small class of Russians who acquired vast fortunes during what can only be described as the looting of Russia. One of them is the subject of Paul Klebnikov's excellent book Godfather of the Kremlin, which is a must-read for anyone interested in the Yeltsin era.

Klebnikov, a senior editor at Forbes, makes it amply clear that thievery on such a scale has occurred with the cooperation of top political leaders. The businessmen, in a strict sense, committed no crimes and broke no laws; they were advised and helped by Yeltsin's young reformers in the Kremlin. Virtually all the people around Yeltsin, including members of his family and the President himself, are portrayed as deeply corrupt. "Yeltsin was very quickly compromised by all those things that accompany limitless power: flattery, luxury, absolute irresponsibility," Yeltsin's former chief of security is quoted as saying.

Klebnikov's vivid portrait of Boris Berezovsky, until recently one of the wealthiest men in Russia, is closely documented, detailing financial transactions, strategies and alliances. Berezovsky's fortunes rose after the publication in the early 1990s of Yeltsin's memoir Notes of a President, which Berezovsky had partly underwritten. Having expected to make $1 million, Yeltsin was disappointed by his far more modest proceeds. At that point, according to Klebnikov, Berezovsky began putting funds into Yeltsin's personal account at Barclays Bank in London, explaining that this was income generated by the memoir. Berezovsky in turn became a Yeltsin favorite (by 1994 the account held $3 million).

In addition to Berezovsky, a former scientist turned car dealer, Klebnikov skillfully describes other members of the clique of predatory oligarchs who plundered the country's most important assets with the connivance of the regime. "He and his crony capitalists produced no benefit to Russia's consumers, industries, or treasury. No new wealth was created." They did, however, produce substantial benefits to Yeltsin and his entourage.

Yeltsin, in Midnight Diaries, dismisses such allegations. "In fact, these people don't have any links to the criminal world. These are not robber barons and not the heads of mafia clans. These are representatives of big capital who have entered into close and complex relationships with the government." The evidence indicates otherwise, though. Klebnikov's presentation suggests that Berezovsky was involved in mafia wars, that he attempted to have his chief rival killed and that he was the target of an assassination attempt himself. (Berezovsky was badly injured and his driver decapitated by a bomb placed near his automobile.)

Klebnikov may indeed go too far, however, when he asserts that Berezovsky, as a private individual, managed to "hijack the state." Berezovsky's influence was always directly linked to his proximity to Yeltsin. Yeltsin appointed the oligarch to several top posts, including that of deputy chairman of the National Security Council. But ultimately Berezovsky remained a moneyman who was never allowed into the charmed circle of power. Political power in Russia, when it came to a crunch, always had more punch than financial muscle.

Klebnikov adds his voice to recent charges that the Clinton Administration stuck by Yeltsin even though it knew all along about the unsavory nature of his regime. The Administration, he writes, "while trumpeting the principles of democracy and the free market, repeatedly ignored evidence that the Yeltsin regime was a kleptocracy."

Gaidar, the architect of Yeltsin's shock therapy, acknowledged in a 1997 book that the entire Yeltsin program was a failure. "Unfortunately," he writes, "the combination of imperial rhetoric, economic adventurism, and large-scale theft seems likely to become the long-term determinants of Russian realities." The term now commonly used to describe Russia is a "bandit state." Reddaway and Glinski call it "market bolshevism."

The new books have punched some big holes in the Yeltsin legend as well as in Clinton's own uncritical backing for the Russian president. Reddaway and Glinski provide some evidence that Yeltsin used his secret police to stage "a provocation that unleashed violence on the part of the opposition, thus giving Yeltsin a pretext to proceed with a bloody crackdown on the parliament." Clinton joined Yeltsin's war against Parliament, saying, "We cannot afford to be in the position of wavering at this moment, or backing off or giving any encouragement to people who clearly want to derail the election process and are not committed to reform." An unnamed US official was quoted by Newsweek as saying the Administration "would have supported Yeltsin even if his response had been more violent than it was." (Official figures say 187 people died and almost 500 were wounded in the attack.) Charles Blitzer, chief economist on Russia for the World Bank, commented on the incident: "I've never had so much fun in my life." Another Western economist advising the Yeltsin government was quoted in the press as saying, "With parliament out of the way, this is a great time for reform."

There is also evidence that the 1993 referendum on a new constitution--which gave Yeltsin extensive personal powers--was in fact rigged. Reddaway and Glinski cite various infractions, including intimidation and other irregularities engaged in by the Yeltsin people. The minimum voter turnout was reduced from 50 percent to 25 percent; the minimum wage was raised; television access for oppositionists was sharply curtailed. "By all indications, the constitution did not gain the necessary minimum of voter support, but the authorities declared it [had] been approved," they write. "The gap was evidently closed by government vote fraud."

Yeltsin's 1996 re-election campaign--financed by Berezovsky and other oligarchs and mafia barons--"was marked by spectacular violations of the law on the part of the incumbent," Reddaway and Glinski write. Yeltsin started with an approval rating of less than 10 percent and got himself re-elected by spending thirty times the legal limit. One incident is telling: an aide to Anatoly Chubais, a key Yeltsin assistant at the time, was caught leaving his office with $200,000 in cash in a suitcase. Apart from the direct distribution of money, Yeltsin used government funds on a lavish scale to win votes: he granted tax breaks; made cash transfers to institutions, garden owners and small farmers; and disbursed government credits. The Economist estimated that the effort cost the Russian treasury some $10 billion. The aggressive giveaway dwarfed the promises of all other contenders combined. But the most powerful weapon in Yeltsin's hands was the broadcast media. The brazen violations of the law on campaign coverage were summarized by the European Institute for the Media: Yeltsin got 53 percent of prime-time coverage; the Communist candidate, Gennady Zyuganov, 18 percent; and all other candidates combined, 11 percent.

After the election, the oligarchs divided up among themselves the most valuable state companies, which Yeltsin privatized under fire-sale conditions. But corruption permeated all levels of government, including Yeltsin's "young reformers." One of them is said to have handled an estimated $178 million in precious stones, gold and antique jewelry smuggled out of the Russian treasury, to be sold in San Francisco and Antwerp. The bribery involved in "trading" operations was on an epic scale. The wives of the interior minister and his first deputy, invited on a three-day shopping trip to Switzerland by a commodities trader, bought $300,000 worth of furs, perfumes, watches and so on (and carted the haul in twenty pieces of excess luggage)--all paid for by the trader's firm.

Primakov was the only prime minister who made a determined effort to fight corruption and hold Berezovsky and others accountable to the law. The effort might have brought stability and trust back to Russian politics, but the corruption probe was extremely dangerous for Yeltsin. Moreover, Primakov did not offer enough of a guarantee that Yeltsin himself would not face prosecution after leaving office. Finally, Klebnikov says, Primakov's government evoked loud protests from Washington. He was replaced as prime minister by the minister of internal affairs, the man who had promised to protect Berezovsky. ("The dismissal of Primakov was my personal victory," Berezovsky later told Le Figaro.) Here lie the reasons for the selection of Putin.

Putin and his people are left with the overarching need for a qualitatively new strategy of economic and social recovery. Yeltsin's course reached a dead end; polls suggest that discontent with capitalist experimentation now permeates all classes--workers, peasants, the army, intellectuals. Whatever emerges in this decade in Russia is likely to be viewed as a communist reformation--a moderate shade of red--that would allow some degree of private property, individual freedom and entrepreneurship. How this will evolve is going to depend on Putin himself. An argument can be made that the origins of Communism's collapse may lie partly in a car trip the young Gorbachev and his wife made in the 1960s, which allowed them to observe that even Italian peasants lived better than Russian elites. Putin's exposure to the West was more limited: he served as a KGB officer in East Germany for six years in the 1980s, but even that was enough for his wife to complain about the empty shelves back home in Russia.

How will history judge Yeltsin? On one level, to use the image of one of his acolytes, Yeltsin could be compared to Ilya Muromets, the peasant hero of medieval epics who one day is bravely slaying Russia's foes, then spends weeks sleeping on the stove in his hut.

On another level, however, Yeltsin's final grand gestures do set an enormously important precedent. During the twentieth century ten men stood at Russia's helm, but only one of them--Yeltsin--was actually elected by the people. Moreover, Russia had no regular system of transferring power throughout the century, and that was perhaps the most important cause of the country's difficulties and setbacks. Five of its ten leaders died in office, three were removed by revolutions and one by a palace coup. Yeltsin alone left office voluntarily, before the end of his term, taking a much-needed step toward legitimizing an orderly system of succession. In the context of Russian history, this is progress.
