
The following debate is adapted from a forum put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Study in Princeton, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.

John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.

In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with people convinced that they know best? Basic Books' fiftieth anniversary is a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.

What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-rehearsed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.

Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses tens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. "Selling out" now refers to ticket grosses rather than to the betrayal of the antimaterialist who stands apart from society.

How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we keep sight of the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.

Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?

Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.

They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.

In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable, it's superficial. There's an implied equation between unreadability and profundity.

My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.

So my argument, and one of the working titles of my book, was, in fact, "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really making a generational argument, and what it showed was, in fact, a decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization do absorb and suck away too much talent, and that there are too few who are bucking the trends.

Donatich: Maybe the term "public intellectual" begs the question, "who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?

Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual, as I understand that vocation, to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. Too many White House dinners, perhaps, can blunt the edge of criticism.

When you're thinking about models for this activity, you might put it this way: Sartre or Camus? An intellectual who was willing to look the other way--indeed, shamefully, to explain away the existence of slave-labor camps, the gulags, in the service of a grand world-historic purpose--or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and of capitalist oppressors out to grind the faces of the poor.

There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.

Let me mention just one issue that I took on several times when I alternated a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.

At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three bioethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype. But if you say that, you're an alarmist--so that's what I am.

This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.

So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.

When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.

A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and depoliticization, more generally.

I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.

Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?

Stephen Carter: You know that in the academy the really bad word is "popularizer"-- a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.

A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.

And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think one of the reasons, if the craft of being intellectual in the sense of the scholar who speaks to a large public is in decline, is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.

One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.

You can scarcely read a lot of scholarship today without first having to wade through several chapters of laying out the ground in the sense of apologizing in advance to all the constituencies that may be offended, lest one be thought in the other camp. That kind of intellectual activity is not only dangerous, it's unworthy in an important sense, it's not worthy of the great traditions of intellectual thought.

There's a tendency sometimes to have an uneasy equation that there is serious intellectual activity over here, and religion over there, and these are, in some sense, at war. That people of deep faith are plainly anti-intellectual and serious intellectuals are plainly antireligious bigots--they're two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.

I was asked by a journalist a few years ago why it was that I was comfortable identifying myself, and often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is that there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.

And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.

Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.

America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.

In this century too, we have had great intellectual preachers who spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though people sometimes try to force him into an intellectual straitjacket by insisting, with no evidence whatsoever, that he was actually just making secular moral arguments and that religion was a kind of smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.

And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.

Now, I'm not being ahistorical, I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudices that assume they can't be making serious arguments until they are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.

Donatich: Professor Carter has made a career out of illustrating the effect of religious conviction on public thought and protecting its right to a place there. Herbert Gans, on the other hand, is a self-pronounced, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory with the need to remain relevant and contribute some practical utility to public discourse?

Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.

So it seems to me it's just the opposite: The public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And on my campus, at least, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes, and then it passes. But that really is happening, and if public intellectuals can become celebrities, their numbers will increase.

One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.

Some of the columnists in the newspapers and the tabloid press, though they are not journalists with PhDs, are public intellectuals. There are pundits who are satirical commentators; a significant number of people get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.

Most public intellectuals function as quote-suppliers who legitimize the media. Two or three times a week, a journalist calls and asks whether I will deliver myself of a sociological quote to accompany his or her article--to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while there are a few public intellectuals who are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.

I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.

The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.

But the disciplinary public intellectuals show that their disciplinary insights and skills can add something original to the public debate--that social scientists and humanists can indeed grapple with the issues and problems of the real world. Like other public intellectuals, they have to write in clear English. This is a rarity in the academy, unfortunately, which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which matters because we all live off public funds, directly or indirectly, and we need to show every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal proof that we're doing something useful or proof that we're entitled to some share of the public budget.

Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.

There are a number of hypotheses at work here; I'm not sure any of them are true--whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, our employers think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)

It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.

Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?

Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to recede--is that it distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was a kind of news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the really progressive and important things that were happening because of the rise of the web.

I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."

I think that I came to the web, to starting Feed and to writing my book about Internet culture and interface culture, as a kind of refugee from conversations like one I witnessed in the academy as a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute, convoluted Derridean question about his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."

The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.

So the good news, I think, is that my experience is not unique. In fact, the past five years have seen a great renaissance of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but not limited to them. And publications like Feed--to pat myself on the back--and Hermenaut and Suck are all good examples of a lively new form of public intellectualism that is not academic in tone.

The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno never was. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model that really stands out for this generation, it's somebody like Walter Benjamin: a sensibility that puts together things you wouldn't necessarily expect to see in the same essay.

How does the web figure into all of this? Why did these people show up on the web? One of the things that has started happening--actually, it's just starting to happen--is that in addition to these new publications, you're seeing something that is unique to the web: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio--except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit it as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place, you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.

The web gives you a way of rounding up all those diverse kinds of experiences and ideas--and linking them together. Because, of course, the web is finally all about linking--in a way that nothing before it has managed quite as well. And it also involves a commitment to real engagement with your audience, something public intellectuals have talked a lot about in the past but maybe not lived up to as much as they could have.

Some of this is found in the new formats that are available online for public dialogue. I'm sure many of you have read these, and some of you may have actually participated in them, but I'm a great advocate for the kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat, which is a disaster in terms of quality of discourse and inevitably devolves into the "What are you wearing" kind of intellectual question. Rather, conversations with four or five people, where each person has a day or half a day to think up a response and then writes a 500- to 1,000-word post. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum: very engaged, very responsible, very dialogic and yet also lively in a conversational way. Because of the back and forth, you can actually get to places that you sometimes couldn't reach in a stand-alone 10,000-word essay.

Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."

Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more people in this kind of space, keeping pages devoted to themselves and carrying on these conversations all the time with people who come by and engage with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months after Feed launched, Wired came out with a review of it, and it had this one slightly snippy line: "It's good to see the East Coast literary establishment finally get online." Which is very funny when you're publishing the thing out of your respective apartments. I had this moment of looking around my bedroom for the East Coast literary establishment--you open the closet door, and, oh, Norman Mailer is in there: "Hey, how's it going!" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.

Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?

Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.

I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.

The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown of Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense and bipartisanship have been regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."

Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.

Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.

These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.

The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.

The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.

I find this absolutely extraordinary. You're told you must pick one of the available ones: "We've got you some candidates--what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. He said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.

Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.

The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."

I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.

I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.

At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation, and of what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appearing before us on the day the DNA string was finally traced out to its end, telling us, in their voices, and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. In The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.

So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: Faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, in place of the disorder and anxiety and struggle that are required in order to claim that the mind has any place in these things at all.

I would say that because the intellectual has some responsibility, so to speak, for those who have no voice, that a very high task to adopt now would be to set oneself and to attempt to set others, utterly and contemptuously and critically and furiously, against the now almost daily practice in the United States of human sacrifice. By which I mean, the sacrifice, the immolation of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.

People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."

* * *

All I want is the truth. Just gimme some truth.
      --John Lennon

Florida's electoral mishegoss lends itself to the exploration of an issue that receives no attention in the media and yet underlies virtually everything its members do. I speak to you, dear reader, of the Meaning of Truth.

Ever since Fox's John Ellis began the mistaken media stampede for his cousin George W. Bush's victory on election night, reporters, producers and executives have spun themselves silly trying to describe a situation that is ultimately an epistemological bottomless pit. There is no single "truth" about who won Florida. From the point of view of "institutional truth," we began without clear rules or precedents for measuring the vote, whether they include dimple-counting, partially punched chads or butterfly ballots. I am convinced Gore carried the will of the people, but I'm guessing that Lady Katherine Harris Macbeth would rather contract rabies than accept my admittedly subjective interpretation. From the perspective of "brute truth," however, the difference between the Bush/Gore numbers turns out to be so small that it will never exceed the count's margin of error. What we are seeing, therefore, is not a process of objective measurement but a contest of raw power. The Democrats use the courts and the law. The Republicans rely on rent-a-mobs, partisan hacks and power-hungry allies in the state legislature and Congress. Guess which side is bound to win?

Our media coverage admits none of this, because it is committed to a fairy-tale version of truth and objectivity that separates "fact" and "opinion" but cannot fathom anything in between. When Tim Russert declared on November 26 that George Bush "has now been declared the official winner of the Florida election...and therefore he is the 43rd President of the United States," he was making a statement that could not have been true when he made it. (Even Bush understood that he was only playing a President-elect on TV.) But the feared and celebrated Russert knew that his words were bound by only the narrowest definition of "truth." He could always take it back later.

The attachment to the idea of attainable objective "truth" on the part of American journalism is partially responsible for its frequent brainlessness. As NYU's Jay Rosen points out, "objectivity as a theory of how to arrive at the truth is bankrupt intellectually.... Everything we've learned about the pursuit of truth tells us that in one way or another the knower is incorporated into the known." (Remember Heisenberg? Remember Einstein?) The famous 1920s debate between Walter Lippmann and John Dewey shed considerable light on this problem, with Lippmann arguing for a "spectator" theory of reality and Dewey arguing for a more consensual one, arrived at through discourse and debate.

The notion of a verifiable objective truth received what many intellectuals considered its final coffin nail in the form of Richard Rorty's classic 1979 work, Philosophy and the Mirror of Nature. While the word true may have absolute correlations in reality, Rorty later argued, "its conditions of application will always be relative." What was "true" in ancient Athens--that slavery and pederasty were positive goods--is hardly "true" to us today. As Rorty explains it, we call our beliefs "true" for the purposes of self-justification and little more. The point is not accuracy but pragmatism. Moreover, Ludwig Wittgenstein has taught us that the gulf between what "is" and the language we use to describe it is so large as to be unbridgeable. "Truth" may be out there, but there is no answer to a redescription, Rorty observes, "save a re-re-redescription." Truth is what works.

Now, it's possible to contest Rorty on any number of counts. I personally find him overly generous to the extreme relativism of antifoundationalists like Jacques Derrida and Michel Foucault. (The antifoundationalist perspective can be simplistically summarized by René Magritte's famous Surrealist painting of a pipe above the words, Ceci n'est pas une pipe.) But the argument itself cannot be avoided. Truth, as Lippmann never understood but Dewey did, is a lot more complicated than a baseball box score or a Johnny Apple New York Times news analysis. What is needed to evaluate whether a report is ultimately credible is not an endless parade of "facts" that may or may not be true but a subjective marshaling of evidence. Yet because the entire media establishment treats these questions as just so much mental masturbation, the standard definition of "fact" often turns out to be any given statement that cannot be easily disproved at the moment it is made. Hence, we frequently see journalistic accounts of the mood of an entire country or even a whole continent based on little more than the taxi ride from the airport.

A second byproduct of American journalism's childish belief in attainable objective truth, Rosen notes, is the alienation it causes between journalists and intellectuals. In Europe the public profits from a two-way transmission belt between the world of ideas and that of reported "fact." But here such exchanges are nearly impossible because, as Rosen puts it, "intellectuals familiar with the currents in twentieth-century thought just can't deal with some of the things that come out of journalists' mouths." Such people, he notes, believe it "useless to try to talk with journalists" owing to their "naïve empiricism." Still, the academy is also at fault, owing to its recent retreat into a Derrida/Foucault-inspired debate that admits almost no reality at all outside the text and does not even pretend to speak intelligibly to the nonspecialist.

In any case, George W. Bush may be our next President. But it won't be because he outpolled Al Gore in Florida in any remotely objective sense. It will merely be because he might have, and we decided to call it "true."

* * *

Congratulations to Ralph Nader on George W. Bush's decision to appoint Andrew Card, formerly the auto industry's top antienvironmental lobbyist, to be his Chief of Staff. Just a few more appointments like this one, I suppose, and the revolution can begin in earnest.

* * *

On Tuesday, November 14, exactly one week after Election Day (and with no President yet in sight), a notable though little-noted disclosure was made to the public. I do not mean the news that the federal judge in Florida had turned down the Republicans' stop-the-hand-count motion, or the news that Bush's lead in Florida was now 388 votes, or the news that a Florida state judge had waffled on Florida Secretary of State Katherine Harris's decree that no county votes would be counted if reported after the 5 pm deadline that afternoon, or, for that matter, anything else that was happening in the murk of the Sunshine State. I mean the news that, according to a poll released by the Washington Post and ABC News, 45 percent of the public wanted George Bush to become President whereas only 44 percent wanted Al Gore to become President (6 percent wanted "neither," 4 percent had no opinion and 1 percent wanted "other"). The claim was all the more striking in view of the hard contemporaneous fact that in the most recent count of the actual vote of November 7, Gore led Bush by a nationwide margin of 222,880 votes.

If anyone ever had doubts that politics in the United States is dominated by polling, this poll should put an end to them. A major poll was, in a manner of speaking, calling the election a full week after the vote--and reversing the known results.

The polls had been mercifully silent since the election. Many had good reason to be. Five of seven major ones had been "wrong" about the outcome of the election. That is, their final counts had failed to reflect the winner on Election Day (though some, it's true, were within the margin of error). The New York Times/CBS "final" poll, which put Bush at 46 percent and Gore at 41 percent, had the margin wrong by more than five points and Gore's final tally off by eight points. The Battleground poll, which gave Bush 50 percent to Gore's 45 percent, likewise got the margin wrong by five points. Others were more modestly in error. CNN gave Bush 48 percent and Gore 46 percent; in the Washington Post it was Bush 48 and Gore 45; and in the Pew Research Center poll (with undecided voters counted), it was Bush 49, Gore 47. Only the Zogby poll, which put Gore ahead in the popular vote by 48 to 46 percent, and a CBS election-morning tracking poll, which gave Gore 45 percent and Bush 44 percent, picked the right winner in the popular vote, and with a margin close to the actual result. All in all, Gore's victory in the popular vote came as a surprise. Of course, it's not literally true that the polls were wrong, since there is a margin of error, and people can change their minds between the day of the poll and the election. On the other hand, election results are the only check on the accuracy of polling that there is--they are to polling what experimentation is to scientific hypothesis--and there is no reason to suppose that a poll whose final measure is 8 percentage points off the election result is not 8 percentage points off year in, year out.
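For readers who want to check the arithmetic behind claims like "wrong by more than five points," here is a minimal sketch in Python, not part of the original column, that scores each poll's implied Gore-minus-Bush margin against the final result. The turnout of roughly 105 million ballots, and hence an actual margin of about Gore +0.2 points implied by his 222,880-vote lead, are assumptions for illustration.

```python
# A minimal sketch (not from the original column) scoring each final
# pre-election poll quoted above against the national popular vote.
# Assumption: Gore's 222,880-vote lead over a turnout of roughly 105
# million ballots implies an actual margin of about Gore +0.2 points.

ACTUAL_MARGIN = 0.2  # Gore minus Bush, in percentage points (assumed)

polls = [
    # (poll, Bush %, Gore %) -- figures as quoted in the text above
    ("NYT/CBS",         46, 41),
    ("Battleground",    50, 45),
    ("CNN",             48, 46),
    ("Washington Post", 48, 45),
    ("Pew",             49, 47),
    ("Zogby",           46, 48),
    ("CBS tracking",    44, 45),
]

for name, bush, gore in polls:
    predicted = gore - bush            # Gore-minus-Bush margin the poll implied
    error = predicted - ACTUAL_MARGIN  # distance from the final margin
    called = "Gore" if predicted > 0 else "Bush"
    print(f"{name:16} called {called} by {abs(predicted)} point(s); "
          f"margin error {abs(error):.1f} points")
```

Run on these figures, the sketch reproduces the column's tallies: the five polls that called Bush the popular-vote winner miss the final margin by three to five points, while Zogby and the CBS tracking poll land within about two points of it.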

Considering the decisive importance that polling had throughout the race in every aspect of the campaign, including media coverage, fundraising and campaign strategy (in the last few weeks of the election, hearts rose and fell on single-point fluctuations in poll numbers), these discrepancies deserved much reflection. The reason they did not get it was that on election night the magicians of public opinion went on to make even more egregious and momentous errors, prematurely predicting the winner in Florida twice and the winner of the national election once. (The election-night calls made by the television networks, which in turn are based on exit polling done by a single, nearly anonymous firm, the Voter News Service, are not quite the same as opinion polling, since they record a deed--voting--rather than an opinion, but their use of sampling techniques to predict outcomes places them in the same general category as other polls.)

The last of these mistakes, of course, led a credulous Gore to concede the election and then, minutes later, to retract the concession. For a few hours, the networks and the candidates appeared to have assumed the power to decide the election between them. There is every reason to believe, for instance, that George Bush would now be President-elect if, moments before his concession speech, Gore had not got the news that Florida had been declared undecided again. If Gore's concession had gone unretracted, Bush had made his acceptance speech and the country had gone to bed believing it had made its decision, it is scarcely imaginable that the close results in Florida would have been contested. Even now, many observers await a concession by one or another of the candidates as the decisive event. But it is not up to either the networks or the candidates to decide who is to be President; that matter is left under the Constitution to the voters, whose will, no matter how narrowly expressed, must be ascertained.

Then a week later, the polls that had played such an important and misleading role in the election were weighing in again, this time on the Florida battle. The poll that brought the startling, seemingly counterfactual news that Bush led Gore in the public's preference also revealed that six out of ten voters were opposed to legal challenges to the Florida results--possibly bad news for Gore, who had been considering a legal challenge to the infamous butterfly ballot in Palm Beach County. However, observers who did not like that conclusion could find comfort on the same day in a New York Times/CBS poll, which reported that another six out of ten were unworried about a delay in finally deciding upon the next President--good news for Gore, who had been relying on time-consuming hand recounts to erase Bush's narrow lead.

If, however, the arts of reading public opinion helped get us into our current mess, perhaps we can take comfort from the hope that they can also help us get out of it. Many observers have suggested that by failing to produce a clear mandate, the ever-changing vote-count of the year 2000--let's call it the Butterfly Election--will cripple the presidency of the winner. They need not worry too much. In our day, it is not only--perhaps not even mainly--elections, held once every four years, that create mandates. It is polling data that, day in and day out, create our impressions, however incompletely or inaccurately, of what the public wants. Let the new President act in a way that the public approves, as determined by a poll or two, and he will have all the mandate he needs to govern.

Providence put me on a panel debating the Gore/Nader choice with Cornel West at New York University in late October. Most of the audience was for Nader, and the lineup on stage did nothing to improve those odds.

Before the debate began, its organizers took a few moments to speak on behalf of the university's graduate students' struggle for unionization. So did West, who had been handed a flier about it from the floor. And as a man about to lose a debate (and a longtime grad student as well as an occasional NYU adjunct faculty member), I was happy for the interruption. Days later, the National Labor Relations Board set an important precedent by ruling in favor of the students. But here's what I don't understand. How can the student union supporters also be Nader supporters? Nonsensical "Tweedledee/Tweedledum" assertions to the contrary, only one party appoints people to the NLRB who approve of graduate student unions, and only one appoints people to the Supreme Court who approve of such NLRB decisions. No Democrat in the White House, no graduate student union; it's that simple. An honest Nader campaign slogan might have read, "Vote your conscience and lose your union...or your reproductive freedom...your wildlife refuge, etc., etc."

Well, Nader's support collapsed, but not far or fast enough. In the future, it will be difficult to heal the rift that Nader's costly war on even the most progressive Democrats has opened. Speaking to In These Times's David Moberg, Nader promised, "After November, we're going to go after the Congress in a very detailed way, district by district. If [Democratic candidates] are winning 51 to 49 percent, we're going to go in and beat them with Green votes. They've got to lose people, whether they're good or bad." It's hard to imagine what kind of deal can be done with a man whose angriest rhetorical assaults appear reserved for his natural allies. (The vituperative attacks on Nader, leveled by many of my friends and cronies on the prolabor democratic left, were almost as counterproductive, however morally justified.) But a deal will have to be done. Nader may have polled a pathetic 2 to 3 percent nationally, but he still affected the race enough to tip some important balances in favor of Bush and the Republicans. He not only amassed crucial margins in Florida, New Hampshire and Oregon; he forced progressive Democrats like Tom Hayden, Paul Wellstone, Ted Kennedy and the two Jesse Jacksons to focus on rear-guard action during the final days rather than voter turnout. If this pattern repeats itself in future elections, Naderite progressives will become very big fish in a very tiny pond indeed.

Perhaps a serious Feingold or Wellstone run at the nomination with a stronger platform on globalization issues will convince those die-hard Naderites to join in the difficult business of building a more rational, Christian Coalition-like bloc to counter corporate power within the party. For now, we can expect an ugly period of payback in Washington in which Nader's valuable network of organizations will likely be the first to pay. Democrats will no longer return his calls. Funders will tell him to take a hike. Sadly, his life's work will be a victim of the infantile left-wing disorder Nader developed in his quixotic quest to elect a reactionary Republican to the American presidency.

* * *

Giving Nader a run for his money in the election hall of shame are the mainstream media. Media portraits of both candidates were etched in stone, with nary a fact or figure allowed to intrude upon the well-worn script. Bush was dumb and Gore a liar; pretty much nothing else was allowed in the grand narrative. Like Nader, reporters assumed the enormous policy differences between Gore and Bush--on Social Security, prescription drugs, education, affirmative action, abortion rights, the environment--to be of trivial importance, hardly worth the time and effort to explain or investigate. The media's treatment of this election as a popularity contest rather than a political one between two governing ideologies was an implicit endorsement of the Bush campaign strategy, as the issues favored Gore. But even so, Bush was usually treated like some pet media cause. With regard to such consequential questions as his political program, his political experience, his arrest record, his military service, his business ethics, Bush was given a free pass by media that continued to hound Gore about whether he was really the model for Oliver in Love Story--which, by the way, he was. I guess being a Bigfoot journalist means never having to say you're sorry.

* * *

One election development that had to gladden New Republic owner Marty Peretz's heart was how bad it was for the Arabs. I got a call one day from a Republican Party functionary telling me that Hillary Clinton supported a Palestinian state and took money from groups that supported terrorist organizations "like the one that just blew up the USS Cole." I told the sorry sonofabitch that like Israel's Prime Minister, I, too, support a Palestinian state. And, if there was any justice in the world, Hillary's "terrorist" friends would blow up Republican headquarters while we were still on the phone, so I could enjoy hearing the explosion.

This heavy-handed bit of racist manipulation grew out of a story published, surprisingly, not in Rupert Murdoch's New York Post but in the putatively responsible and nominally liberal New York Daily News, owned by Mortimer Zuckerman. It was inspired by the machinations of one Steven Emerson, a discredited "terrorism expert" last heard trying to pin the Oklahoma City bombing on the Arabs by noting that "inflict[ing] as many casualties as possible...is a Middle Eastern trait." Each actor played a dishonorable role in the tawdry drama: The Daily News invented the story. The Lazio campaign brazenly exploited it. Hillary Clinton's campaign capitulated to it. Together with the media coverage of the main event, this mini-drama will go down in history as further evidence of that unhappy truism of American politics that this year seems to have escaped everyone from the Nader die-hards to Palestinian militants: Things can always get worse.

Marvin Kalb, executive director of the Washington office of Harvard's Shorenstein Center on the Press, Politics, and Public Policy, diagnoses an anti-Israel tilt in the US media, in which "the Israelis have come through a miraculous alchemical formula to become the giants and everyone else is the David." What planet is this man living on?

Just look at the numbers. Nearly 100 Palestinians have been killed and more than 2,500 injured, compared with just five Israeli Jews killed. The Palestinians attack with stones, Molotov cocktails and the extremely rare automatic weapon. Unlike nations that quell riots by their own people with tear gas and rubber bullets, the Israelis respond with live ammunition: antitank rockets, helicopter gunships and armor-piercing missiles. Armed Jewish vigilantes have undertaken murderous rampages against unarmed Arab citizens, shooting them in cold blood. The UN Security Council condemns Israel's "excessive use of force."

Yet aside from the Palestinians invited to speak explicitly for their own cause, the mainstream US media condemn the Palestinians and exonerate Israel with Soviet-like consensus. Editorial pages are unanimous in apportioning the blame exclusively to Yasir Arafat rather than the war criminal Ariel Sharon, who provoked the riots to advance his political career. Sharon was puffed up in extremely sympathetic interviews by Lally Weymouth published in the Washington Post and Newsweek, and held forth as well on the Wall Street Journal Op-Ed page. Meanwhile, the members of the punditocracy who appeared during the weekend of Barak's ultimatum spoke as if channeling American Jewish Committee talking points.

While Hillary Clinton and Rick Lazio battled one another to shower the Palestinians with higher and higher degrees of contempt in their second debate, the only American voices heard to speak to the larger context of the conflict were the twin electoral outliers, Ralph Nader and Pat Buchanan. Given his history of anti-Semitism and hatred of Israel, the former Crossfire host has forfeited any credibility he once had on the issue. Nader's criticism of Sharon, which he expressed on CBS's Face the Nation, was therefore far more valuable, especially in light of the relative scarcity of such voices on network television.

More typical, however, are the views of Charles Krauthammer, who has apparently contracted the same mental and emotional affliction that drove poor Abe Rosenthal insane. The pundit actually compared the phenomenon of Palestinian riots and rock-throwing to the Nazi invasion of Poland. Complaining of overly sympathetic coverage of Palestinian "frustration"--"frustration with what?" Krauthammer demanded in mock horror, as if the average Palestinian refugee lived next door in Chevy Chase--he termed Israel's dovish leaders "feckless" for seeking an accommodation to create a nation where Jewish soldiers are no longer in a position to gun down unarmed 12-year-old boys.

Sure, Arafat is a corrupt, untrustworthy leader, and I wish he had somehow found the courage to risk his own neck and embrace Barak's surprising concessions at Camp David, if only as a foundation stone in a much longer peace process. The concessions were, unfortunately, the best offer the Palestinians are likely to get for some time. But it's not Arafat's indecision or Palestinian rock-throwing that lies at the root of the current conflict. Rather, as the Israeli lawyer Allegra Pacheco wrote on the Times Op-Ed page, it is the fact that "the proponents of the agreement, including the Clinton Administration, never fully informed the Palestinian people that the [Oslo] accord did not offer any guarantee of Palestinian self-determination, full equality and an end to the military occupation." Since Oslo, Pacheco notes, the quality of life in the West Bank and Gaza has declined from terrible to nearly unbearable. Owing to the lack of good will on both sides, what is being constructed from Oslo is less peace than apartheid.

I have walked across open sewage in Palestinian refugee camps surrounded by children begging for candy. I have been served tea at the home of a Palestinian family whose 13-year-old son was killed days earlier by the Israeli Defense Force as a suspect in a murder that turned out to be the work of a crazed Jewish fanatic. I have stood in the rubble of Palestinian houses that the Israelis bulldozed as a warning to those who would continue to protest. Seven years ago, I stood on the White House lawn and listened, tearfully, to Yitzhak Rabin say "enough" to the killing on both sides. Alas, it was not enough. And given the realities on the ground, for every Israeli who loses a son or daughter, so too will scores of Palestinians.

It would behoove those in the media who hold forth on this issue to address themselves for once to its larger context. It is Israel that is oppressing the Palestinians, and it is the Palestinians who are doing virtually all the dying. True, Ehud Barak has taken massive political risks by offering concessions that go well beyond the Israeli consensus. He is a brave leader and an authentic soldier for peace. But given the magnitude of the physical, psychological and sociological costs of the Palestinian "catastrophe," Barak's best is simply not good enough. The only chance for lasting peace will come when Israel agrees to share Jerusalem with a full Palestinian partner, granting equal rights to citizens of both nations: Israeli rule in the city's west, Palestinian rule in its east.

Perhaps it's too much to ask a victorious people to offer genuine justice and material sacrifice to the nation it has vanquished on the battlefield--particularly when the hatred of the defeated nation continues unabated. But the Palestinians will accept nothing less.

I'm a Jew with deep emotional ties to Israel and strong sympathies with the Labor/Zionist project. My own words fill me with foreboding. But if it must come to war, then let us at least be honest about it. Like Ariel Sharon's 1982 invasion of Lebanon, it will be a war that Israel has chosen because it could not countenance the alternatives. And it will be the Palestinians who, once again, will endure the lion's share of the suffering.
