In our retrograde era, "the personal is political" might better be put "politics sure messes up progressive lives." This past December, just after the Supreme Court completed the electoral coup that imposed the Bush presidency upon us, I spent a miserable snowy afternoon in my Chicago-area university office trying to winnow down a set of readings for a graduate seminar on race, ethnicity and nationalism. Glumly predicting the sorts of Cabinet appointees and White House policies that have indeed come to pass in the weeks since, I found myself unable to pare down the list. Instead, mindful of the racist renaissance we are likely in for in the coming years--not that Clinton's two terms, characterized by the police-state crime bill and the evisceration of AFDC, were exactly models of antiracist governing--I shoveled back in masses of old Bell Curve-era readings on New Right cultural politics.
The Talmudic reading load imposed by a punctilious and politically depressed lefty professor on hapless grad students is, of course, the least of the burdens of newly enhanced conservative rule. But as we attempt to assess and contest the worsened life conditions, from Colombia to Cairo to Kazakhstan to California, about to be produced by Bush Administration policies, we need new analytic tools to help us envision the meanings of race and ethnicity in a shifting national and global political economy. And Claire Kim's fresh study, Bitter Fruit: The Politics of Black-Korean Conflict in New York City, offers precisely such tools.
Bitter Fruit is based on a meticulous account of the 1990-91 black-led "Red Apple" boycott of two Korean-run produce stores--Family Red Apple and Church Fruits--in the Flatbush neighborhood of Brooklyn, a boycott that arose in response to allegations that Family Red Apple's store manager, Bong Ok Jang, beat an older Haitian woman customer, Ghiselaine Felissaint, during an argument at the cash register. But Kim, a younger politics and ethnic studies professor at the University of California, Irvine, uses that narrative to reframe the ways in which even we progressives, influenced by public culture despite our best efforts, tend to see the history and contemporary realities of race, immigration, representation, politics and poverty in American cities. Most political, ethnographic or other analyses of urban lives--with key exceptions in works like Brett Williams's Upscaling Downtown and Dwight Conquergood's "Life in Big Red"--focus on only one population, whether black street vendors or Latina or Chinese sweatshop workers. One of Kim's strengths, making her the Anna Deavere Smith of the poli-sci set, is her careful consideration, through extensive interviewing, of the voices of all the players in the Red Apple imbroglio--Haitian immigrants and longer-term residents, black American political activists and elected officials, Korean merchants and community politicians of different generations, the various mainstream and alternative media--and her clearheaded recognition of their differential access to power and resources.
This is the key to the issue and the real innovation in Kim's work. She lays out for us the "conventional wisdom" about black/Korean conflict:
Shut out of the mainstream economy by historical discrimination and hit hard by recent global economic changes, urban-dwelling Blacks are frustrated and angry. Enter Korean immigrants, who open stores in poor black neighborhoods and rapidly achieve economic success by virtue of their hard work and thriftiness.... Blacks lash out at them, irrationally venting their accumulated frustrations on this proximate, vulnerable, and racially distinct target. Korean immigrants...simply get caught in the wrong place at the wrong time.
Kim then disassembles this "racial scapegoating" narrative for us. She notes that "historical discrimination, economic competition, Black rage, immigrant dreams and prosperity" are all genuine phenomena but that this formulation "isolates these features and rips them out of the overall context of how racial power operates in America." Racial power, in Kim's analysis, is linked to racial ordering, the economic and ideological process through which populations are evaluated relative to one another. These constructions rely not on notions of ongoing white conspiracy or intentionality but on the reproduction of political-economic structures and discursive frames, the very ways in which we talk about the subject. Racial power "finds concrete expression in a wide variety of...processes that tend cumulatively to perpetuate White dominance over non-Whites. Putatively impersonal forces such as global restructuring and deindustrialization are in fact mediated by racial power so that Whites systematically accrue greater benefits from and suffer fewer burdens from these developments than do non-Whites." The racial scapegoating story turns out, then, to veil the "'bitter fruit' of deeply entrenched patterns of racial power in contemporary American society."
Central to contemporary American racial ordering are the empirically false and mutually interdependent constructs depicting a feckless and violent black and brown urban underclass and a hardworking, bootstrapping Asian "model minority." The model-minority myth presents "Asian Americans as culturally superior to Blacks and yet culturally distinct from Whites and detached from politics." As the American economy improved over the 1990s, as crime plummeted because of improved economic prospects, demographic transition, mass imprisonment and rising youth common sense, and as the impoverished were thrown off public assistance without much public outcry, we have heard less and less about the dangerous minority poor who have only themselves to blame for their circumstances. (Given the bear market and other recent indicators, though, watch this space.)
Representations of Asian model-minority behavior, though, dating from the 1960s, continue strong in mass media. Kim traces the origin of model-minority ideology to the use of Asian-American "success" stories--with mom-and-pop stores in the forefront--"as an explicit rebuke to Blacks involved in collective demand making of one kind or another." "Consider the two myths as mirror images," Kim invites us:
The underclass is lazy, undisciplined, lacking in family values, criminally inclined, unable to defer gratification, deviant, dependent, and prone to dropping out; the model minority is diligent, disciplined, possessed of strong family values, respectful of authority, thrifty, moral, self-sufficient, and committed to education. Whites--the unspoken overclass to the underclass and majority to the model minority--are factored out of the picture as if they were neutral, colorblind, wholly disinterested observers.
This triangulated racial ordering helps to rationalize common-sense "colorblind talk" that serves to mask both white power and the innately relational character of all racial systems.
Providing clear empirical proof of the bankruptcy of this vision, Kim locates both blacks and Koreans in the historical political economy of New York City. She uses other scholars' work to establish the persistent and unique residential segregation of black populations--so extreme, both locally and nationally, that Douglas Massey and Nancy Denton label it "American apartheid"--and reprises the record of brutal and deadly actions by outer-borough whites against "trespassing" blacks throughout the 1980s. She uses sociologist Roger Waldinger's research to demonstrate the ways in which blacks have been excluded from the changing urban occupational "ethnic queue." Even their relative success in public-sector jobs in the 1970s, the result of federal antidiscrimination legislation, tripped them up when the public/private balance shifted and they lacked networks and resources to gain access to burgeoning business opportunities.
Kim cites abundant evidence that New York employers, like those elsewhere in the United States, operate on the basis of "old-fashioned racism--or discrimination based on the construction of Blacks (especially Black men) as undesirable (lazy, dishonest, unreliable) employees." Even the conservative business newspaper Crain's New York Business lamented in 1989 that "being black reduces the prospects for entrance and advancement in nearly every sector that defines the economic life of the city."
The cumulative national effects of residential segregation and systematic credit discrimination, in addition to specifically regional oppression (for example, Koch administration refusal to grant city contracts to nonwhites), explain both Afro-Americans' generally low levels of self-employment and the particularly extreme paucity of black small businesses in New York. The per capita rates in Los Angeles, for example, are 2.5 times as high. As a result of combined governmental and private-sector actions, by the late 1980s "increasing rates of overall and extreme poverty, deepening income inequalities, and persistently low labor-force participation rates shaped the lives of most Black New Yorkers."
Haitian migrants to the United States, and New York specifically, beginning with 1960s waves of anti-Duvalier activists fleeing certain death, were immediately racialized as black and subjected to the same discriminatory treatment, with two additions. In the first place, blackness "is a source of great pride" in the first independent black nation in the Western Hemisphere, and Haitians had to come to terms with its often degraded American status. Then, as black foreigners began arriving in the 1970s in larger numbers, and with the rise of the AIDS crisis, Haitians were further coded as dirty, diseased and dangerous.
In this overall context of extreme antiblack racism, finally, Kim documents how ordinary patterns of ethnic political succession in New York City have never included Afro-Americans. In the period in which blacks were winning City Council seats and mayoralties, and influencing (if largely in the interests of the better-off) urban policy elsewhere, Ed Koch's and then Rudy Giuliani's long mayoral reigns, through finagling with the Board of Estimate and the City Council, were dedicated to wholesale black exclusion. Kim notes dryly that "this sheds some light on why Black efforts at empowerment eventually migrated outside of traditional political channels, resulting in the new Black Power movement of which the Red Apple Boycott was part." The Afro-American David Dinkins's short-lived stay in Gracie Mansion would be, among other political disasters, haunted by the boycott, begun only seventeen days after his inauguration.
The experiences of new Korean immigrants run entirely counter to this pattern. In the first place, Kim places post-1965 Korean immigration to the United States in the context of "America's protracted efforts to influence economic development and shore up repressive anticommunist regimes in a non-White nation located on the periphery, resulting in significant migration from periphery to core." That migration, in response to the explicit economic policy embedded in the Hart-Celler Immigration Act, was largely of educated, white-collar Koreans with small but significant savings to invest. Then these migrants were "racialized as Asian Americans and triangulated between Blacks and Whites.... It is in this way that the very economic opportunities that are closed to Blacks become the ticket to upward mobility for Korean Americans."
Thus, while they were certainly victimized by American racism, these Korean immigrants, unlike blacks, were not subject to its more extreme forms--residential segregation, pervasive violence and abuse on the streets and in the criminal justice system. They were, however, forced into the "status derogation" of small business by both their poor English skills and employer discrimination against them as "foreigners." Extensive Korean exploitation of retailing niches created ethnic business networks allowing them to take over entire urban retail sectors--greengrocers in New York and liquor stores in Los Angeles, for example. Relatively privileged but stressed and squeezed, Koreans in small business tended to subscribe even more extensively than white Americans to victim-blaming underclass mythology. The stage was set for the Red Apple boycott.
Here Kim really shines as an analyst. She disabuses us of "the conventional notion that the boycotters were venting their frustrations on Koreans instead of on Whites" by placing the event inside the "resurgent Black Power movement in New York City." She identifies the heterogeneous players in and the politics of that movement, characterized in mainstream media as a solid bloc of crazy white-haters, and places them in the context of the public and private antiblack onslaught of the Koch years. Kim demonstrates how always-latent black nationalism became the lowest common denominator "frame repertoire" for organizing the boycott, despite the more developed left politics of the dominant black American December 12th Movement, which took over from the original Haitian agitators. And she notes the ultimate irony that this group, which was vilified as violently anti-Asian, "had presumptively positive feelings toward Koreans," encouraged black patronage of all Korean greengrocers except the two under boycott and had even engaged in pro-Korean unification demonstrations.
Kim also carefully lays out the roles of mainstream, black, Haitian and Korean media in motivating the boycott and the backlash against it. As a long-term lover of the neglected public media of black and Latino radio, I particularly appreciate her coverage of the key organizing functions of minority radio stations. Kim shows effectively how their very different transnational as well as American placements structured Korean and Haitian interpretations and actions. During the boycott, for example, to offset their losses, the two storeowners received $150,000 from Korean-American and other sources. While this capital infusion was important, the real battle of the boycott occurred in the realm of the political. The "multiple layers of contested meaning" created by activists and their associated media inevitably resolved themselves into the overwhelming mainstream-media narrative, in which "colorblind talk," heavily appropriating civil rights-era references, "garbled and distorted" the boycotters' message and defined them solely as crime-prone anti-Korean racists. Michael Kinsley, for example, "the putative representative of the left on CNN's Crossfire, said simply: 'You don't mediate between out and out racism on the one hand and a hardworking entrepreneur on the other. And that's what's going on.'" Kim justly observes that "the most striking aspect of the regular news coverage of the Red Apple Boycott was its univocality."
This single voice put David Dinkins "squarely on the hot seat." Already having been accused, before taking office, both of pandering to black extremists and of selling out communities of color, Dinkins could only lose on the boycott issue. His early refusal to send in the NYPD to move the protesters off-site enraged the city's elite, who claimed he was ruining New York City's business climate. But his final capitulation to white pressure, a televised speech opposing "any boycott based on race," stung his black supporters. Al Sharpton likened the speech to "a James Brown record--talking loud and saying nothing." And attorney Vernon Mason declared that "he ain't got no African left in him." Overwhelmed by bad publicity, the boycott lost steam and collapsed after only eight months of picketing. Kim notes the key role it played in New York electoral politics: "David Dinkins made history again by becoming the first breakthrough Black mayor in American history to lose office after only one term." In 1993 Rudolph Giuliani "won a highly racially polarized election to become only the third Republican mayor of New York City since 1930." And we all know what happened then!
Kim ends her fine study with a riff on W.E.B. Du Bois's twentieth-century color-line aphorism: "It seems likely that the problem of the twenty-first century will be that of the multiple color lines embedded in the American racial order." She rightly asks, "When is 'voice' really voice?"--querying claims of American democracy in the context of centralized and corporate-controlled mass media (and, we might add, of differentially efficient and functional voting machines). I would have liked her to deal with the gendered dimensions of the Red Apple boycott, write more extensively about non-Korean Asian-American politics around the event and trace out the implications of her work for other faulty analyses of the dilemmas of "middleman minorities" in the American and global past and present. But no one book can accomplish everything, and Kim's Bitter Fruit sets an incisive new pattern for our understanding of class in multiracial politics as we live through the bitter years ahead.
Is there a more contemptible poseur and windbag than Elie Wiesel? I suppose there may be. But not, surely, a poseur and windbag who receives (and takes as his due) such grotesque deference on moral questions.
How many times did we hear during the endless campaign that Bush wouldn't go after abortion if elected? Republicans, Naderites and countless know-it-alls and pundits in between agreed: Pro-choice voters were too powerful, the country was too divided, the Republicans weren't that stupid and Bush didn't really care about abortion anyway. Plus whoever won would have to (all together now) "govern from the center." Where are all those smarties now, I wonder? Bush didn't even wait for his swearing-in ceremony to start repaying the immense debt he owes to the Christian right, which gave him one in four of his votes, with the nominations of anti-choice die-hards John Ashcroft for Attorney General and Tommy Thompson to head Health and Human Services.
On his first full day in office, Bush reinstated the "gag rule" preventing international family-planning clinics and NGOs from receiving US funds if they so much as mention the word "abortion." (This action was widely misrepresented in the press as being a ban on funding for performing abortions; in fact, it bans clinics that get US aid from performing abortions with their own money and prohibits speech--whether lobbying for legal changes in countries where abortion is a crime or informing women with life- or health-threatening pregnancies about their legal options.) A few days later, Thompson announced he would look into the safety of RU-486, approved by the FDA this past fall--a drug that has been used by half a million European women over twelve years and has been more closely studied here than almost any drug on the market. In the wake of Laura Bush's remark to NBC News and the Today show that she favored retention of Roe v. Wade, both the President and the Vice President said the Administration has not ruled out a legal challenge to it, placing them to the right of Ashcroft himself, who told the Judiciary Committee he regarded Roe as settled law (at least until the makeup of the Supreme Court changes, he did not add).
Don't count on the media to alert the public. The press is into champagne and confetti: Who would have thought "Dick" Cheney would be such an amiable talk show guest! Time to move on, compromise, get busy with that big tax cut. "Who in hell is this 'all' we keep hearing about?" a friend writes, "as in 'all agree' that the Bush transition has been a smashing success?" An acquaintance at the Washington Post, whose executive editor, Leonard Downie Jr., claims to be so objective he doesn't even vote, says word has come down from "on high" that stories must bear "no trace of liberal bias"--interestingly, no comparable warnings were given against pro-Bush bias. So, on abortion, look for endless disquisitions on the grassiness of the anti-choice roots, the elitism of pro-choicers and the general tedium of the abortion issue. Robin Toner could barely stifle a yawn as she took both sides to task in the New York Times ("The Abortion Debate, Stuck in Time," January 21): Why couldn't more anti-choicers see the worth of stem cell research, like anti-choice Senator Gordon Smith, who has several relatives afflicted with Parkinson's (but presumably no relatives unwillingly pregnant); and why can't more pro-choicers acknowledge that sonograms "complicate" the status of the fetus? In an article that interviewed not a single woman, only the fetus matters: not sexuality, public health, women's bodies, needs or rights.
Now is the time to be passionate, clever, original and urgent. I hate to say it, but pro-choicers really could learn some things from the antis, and I don't mean the arts of arson, murder and lying to the Judiciary Committee. Lots of right-wing Christians tithe--how many pro-choicers write significant checks to pro-choice and feminist organizations? Why not sit down today and send President Bush a note saying that in honor of the women in his family you are making a donation to the National Network of Abortion Funds to pay for a poor woman's abortion (NNAF: Hampshire College, Amherst MA 01002-5001)? March 10 is the Day of Appreciation for Abortion Providers--send your local clinic money for an abortion "scholarship," flowers, a thank-you note, a bottle of wine, a Nation subscription for the waiting room! (Refuse & Resist has lots of ideas and projects for that day--call them at 212-713-5657.)
The antis look big and powerful because they have a built-in base in the Catholic and fundamentalist churches. But (aha!) pro-choicers have a built-in constituency too: the millions and millions of women who have had abortions. For all sorts of reasons (privacy concerns, overwork, the ideology of medicine) few clinics ask their patients to give back to the cause. Now some providers and activists are talking about changing that. "My fantasy," Susan Yanow of the Abortion Access Project wrote me, "is that every woman in this country gets a piece of paper after her procedure that says something like, 'We need your help. You just had a safe, legal abortion, something that the current Administration is actively trying to outlaw. Think of your sisters/ mothers/daughters who might need this service one day. Please help yourself to postcards and tell your elected representatives you support legal abortion, join (local group name here), come back as a volunteer' and so on." If every woman who had an abortion sent her clinic even just a dollar a year, it would mean millions of dollars for staff, security, cut-rate or gratis procedures. Think how different the debate would be if all those women, and the partners, parents, relatives and friends who helped them, spoke up boldly--especially the ones whose husbands are so vocally and famously and self-righteously anti-choice. If women did that, we would be the grassroots.
* * *
Correction: It was Joe Conason, not Chip Berlet, who reported that John Ashcroft had met with the St. Louis head of the racist Council of Conservative Citizens. Berlet's equally fascinating story, cut for space reasons, was that Ashcroft made a cameo appearance in a 1997 Phyllis Schlafly video that claims that environmentalism, feminism, multiculturalism, gay rights and even chemical weapons treaties are part of a conspiracy to bring about One World Government. See clips at www.publiceye.org.
Throughout the last campaign, while liberal Democrats warned that Bush was much more reactionary than he pretended to be, Naderites argued that Democrats were much less progressive than their rhetoric. From the evidence of the first days of the Bush Administration, it turns out both were right.
For all the dulcet compassion written into his inaugural address, Bush turned right even before entering the White House. His nomination of John Ashcroft as Attorney General showed contempt, not compassion, for the broad center of American politics. His environmental troika--Norton, Abraham and Whitman--are an affront even to Republican environmentalists. While professing her love for nature Norton preposterously invoked the California power crisis as a reason to start drilling in the Arctic wildlife preserve. The troika also threatened a review of the environmental regulations Clinton issued in his last weeks in power.
On his first day in office Bush targeted women's right to choose by reinstating the odious gag rule defunding any international organization that counsels women abroad on family planning and abortion. He also opened fire on women's rights at home, announcing that "it is my conviction that taxpayer funds should not be used to pay for abortions or advocate or actively promote abortions either here or abroad." He hailed those gathered at the annual national protest against Roe v. Wade, saying that "we share a great goal" in overturning the constitutional protection of a woman's right to seek an abortion. And Health and Human Services Secretary Tommy Thompson announced that he would review RU-486, which anti-choicers want banned, fearful that it will make abortion more accessible. So much for compassion.
Bush launched his push for an education plan that will demand lots of testing in exchange for a little new funding for beleaguered urban and rural schools. The $5 billion annual price tag for his education bill is mocked by the $68 billion annual tax cut he wants to give to the wealthiest 1 percent of Americans--to say nothing of the tens of billions about to be thrown at the Pentagon. But Bush knows what he calls "my base." The lily-white, mink-draped crowd at his inauguration broke into loud applause only twice: when Bush promised to reduce taxes and when Chief Justice Rehnquist was introduced. So much for bipartisanship.
Yet, despite the stolen election, the wolf politics after a sheep's campaign and a furious and frightened constituency, many Democrats in the Senate seem content with getting rolled. Conservatives in the party didn't pause before trampling their leaders to embrace the tainted President. While Senate Democratic leader Tom Daschle was urging his troops to hold off on any announcements about Ashcroft, the opportunistic Robert Torricelli and dubious Democrat Zell Miller of Georgia were hailing the Missouri tribune of the Confederacy as Attorney General. Despite a furious reaction by Democrats across the country, opponents like Ted Kennedy are struggling to summon even forty votes against a zealot whose career has been marked by his willingness to abuse his office for political gain. While Daschle was trying to get some agreement on a smaller tax-cut package from Democrats, Miller leapt in to co-sponsor the equivalent of the Bush plan with Texas Senator Phil Gramm.
Dick Cheney's former opponent, Joe Lieberman, didn't even thank African-Americans and the unions for their remarkable support this past fall before kicking them in the teeth in January. He joined nine other New Democrats in an unctuous letter to "President-Elect Bush" indicating their willingness to work with him on an education bill and urging him to make a top priority of the fight for "Fast Track trading authority" for "expansion of trade in the Americas." Lieberman et al. begged to meet with Bush as early as possible. So much for Democratic unity.
But the Democratic collaborators are likely misjudging the temper of the country. What the inaugural also revealed was the depth of voter anger nationwide. Demonstrators often outnumbered celebrators along the parade route. And from San Francisco to Kansas City to Tallahassee, citizens turned out to express their dismay at the installation of the illegitimate President. Bush seems committed to refighting old battles against choice, affirmative action and environmental and consumer protection, as well as to waging a new offensive in the continuing class warfare of the privileged against the poor. But citizens are showing that they are ready to resist. Some Democrats--Maxine Waters, Dennis Kucinich, Jan Schakowsky, Barney Frank, George Miller and others in the House, as well as Kennedy and Richard Durbin in the Senate--are already engaged. The day before Bush was sworn in, the Progressive Caucus led a daylong conference on political reform that featured a bold agenda and a promise to push for change at the state and national levels. In the coming fray, Democrats who decide to cozy up to the new Administration are likely to find themselves caught in the crossfire.
For more than two years, the antisweatshop movement has been the hottest political thing on campus [see Featherstone, "The New Student Movement," May 15, 2000]. Students have used sit-ins, rallies, hunger strikes and political theater to demand that garments bearing their institution's logo be made under half-decent working conditions.
From the beginning, the major players were students and administrators. While some progressive faculty members--mostly from sociology departments--offered the students early support, economists, who like to think of their discipline as the queen of the social sciences, kept fairly quiet.
That changed this past July. After colleges and universities made a number of visible concessions to the students over the spring, a group of some 250 economists and lawyers released a letter to administrators, basically complaining that they hadn't been consulted. The letter, initially drafted by Jagdish Bhagwati of Columbia University and burnished to perfection by a collective of free-trade zealots calling themselves the Academic Consortium on International Trade (ACIT), reproached administrators for making concessions "without seeking the views of scholars" in relevant disciplines. Judging from their letter, the views of these scholars might not have been terribly enlightening. On page 24 of the magazine, the ACIT missive appears with some comments (see "Special" box, right).
Jason Epstein's Book Business: Publishing Past, Present, and Future is the third memoir of a major American life in book publishing to reach print in less than two years. It is at once a sign that the guard is changing and a recognition that the business has already changed. It is also, in the case of the 72-year-old Epstein, an opportunity to gaze into the crystal ball to predict the changes to be, something he has been rather good at during the course of his long career.
Simon & Schuster's Michael Korda got the triumvirate rolling in 1999 with Another Life, gossipy and entertaining and novelistic, like the books Korda often publishes. The New Press's André Schiffrin--famously ousted from Random House's Pantheon Books, the once independent imprint his father started--followed suit more recently with The Business of Books, the kind of polemic he has sometimes featured on his list [see Daniel Simon, "Keepers of the Word," December 25, 2000].
It's not surprising, then, that the tone pervading Epstein's memoir--which began with a series of lectures he gave at the New York Public Library, formed two essays in The New York Review of Books and was coaxed into a book by Norton president Drake McFeely--is cool and elegant and full of the gravitas of a man who wanted to be a great writer and instead ended up publishing many such, Morrison and Mailer and Doctorow among them.
He arrived at Random House in 1958, having deemed it time to leave Doubleday when he was prevented from publishing Lolita there. While at Doubleday he had founded Anchor Books and with it the trade paperback format in America. He retired as Random's editorial director in 1998, and during the four decades in between started the Library of America, a unified series of reprints of great American literature; The Reader's Catalog, a kind of print precursor to Amazon; and The New York Review of Books. He had a reputation as a brilliant editor but went beyond that to envisage change and make it happen, and in the process made himself into a pillar of the New York intellectual establishment.
"If I have any regrets, I can't think what they are," he declared during a recent interview, sipping homemade espresso at his large kitchen table in an opulent downtown apartment that could double as the upscale set for one of Woody Allen's Manhattan tales. He still edits authors he's been associated with but now does it from home, preferring to be based there rather than in the Random corporate offices and wishing to put space between himself and an "increasingly distressed industry" mired in "severe structural problems." Prominent among them are a chain-driven bookselling system that favors "brand name" authors and often returns other new books to their publishers after only a few weeks on the shelves, before the titles have a chance to establish themselves; and a bestseller-driven system of high royalty advances that often do not earn back the money invested, a system that ratchets up unrealistically high sales expectations for new titles overall and in so doing makes it increasingly difficult to publish certain kinds of books.
One-third of the way through his slim text, Epstein writes that his career has demonstrated an "ambivalence toward innovation." Ambivalence also pervades this elegiac book. Perhaps it is inevitable when a man looks back to his youth and forward to a future in which he will not play a major part, even if he is hopeful about that future. Perhaps, too, it is inevitable when confronting the distress signals of an industry he has spent his life in and clearly loves. Epstein shares his visions of a publishing future liberated electronically, but that future harks back to a deep-seated nostalgia, a longing for what was. His book seems to predict that technology in the form of the Internet will restore to the book business a certain lost rightness from the past.
His first chapter, like Dickens's Christmas tale, moves back and forth among past, present and future in an attempt to limn the larger changes of the past fifty years and what may yet unfold. The rest of the book is chronologically structured. It follows Epstein's career and the transformation of publishing from primarily small-scale, owner-operated enterprises rooted in the 1920s "golden age" of Liveright and Knopf to the "media empires" of today, which are forced to operate within an "overconcentrated," "undifferentiated" and fatally "rigid" bookselling structure. Now, he says, "there can't be Liverights or Cerfs because the context is so different. Roger Straus is the very last of them," and even he has sold his company to the German firm von Holtzbrinck.
Publishing must return to being "a much smaller business again," Epstein is convinced. "It has to, it's a craft and can't be industrialized any more than writing can. It's about to undergo a huge structural shift and there's nothing the conglomerates can do about it. The marketplace has shifted out from under them: the system of big money bestsellers defeats the possibility of building a sustained backlist. And without a sustained backlist, publishing cannot function in the long term. Providentially, just as the industry was falling into terminal decadence, electronic publishing has come along."
Epstein is in no way predicting the demise of print. Rather, his future is predicated on a kind of universal electronic Reader's Catalog, "much like Amazon" but far beyond it, "multilingual, multinational, and responsibly annotated. People will access it on their computers at home, in the office, and in kiosks like ATMs. It will be possible to browse those books, and downloading technology will eventually solve the problem of making it possible to buy those books. They won't exist in print until they're actually bought.
"There is no room on the Internet for middlemen, who sell the same product as their competitors, competing on the basis of price and service, and in so doing eat up their margins." Epstein is of course speaking of the Amazons and B&N.coms of today. "I think Amazon can't be here that much longer," says the man who sat at this same kitchen table doling out advice to its CEO, Jeff Bezos, a few years back.
As for brick-and-mortar stores, "the chains aren't tenable, either. They never were. The superstores have become what the old mall stores were. There are far too many of them, Waldens with coffee bars, and they will shrink. Stores run by people who love running bookstores will arise spontaneously like mushrooms and find a way to stay in business once the chains begin to recede."
And the conglomerate publishers? "I think they can show some financial progress for some years by cutting costs and cutting out redundancies, but eventually they'll find themselves with expensive traditional facilities that are increasingly irrelevant. They'll have to offload many functions on to specialist firms. In the end, they in turn will look for a buyer if they can find one. They should have noticed that the previous owners were all too happy to sell."
Meanwhile, authors will have found a way to bypass their publishers by going directly to the web. People will start independent authors' websites. Books will be much cheaper. Authors will have a much larger share of the revenue.
Stephen King has already gained notoriety in trying to do so. But the spectacular starting bang of Riding the Bullet, done in conjunction with his publisher, Simon & Schuster, attenuated when he tried to serialize a novel, The Plant, online on his own. A downturn in paying customers for the later chapters led King to abandon the project. Asked about this, Epstein insists, "It's like the days of the early cars that ran off the road into the mud. People said cars would never work. Well, one of these days e-publishing will work."
Of other experiments now being tried Epstein is openly dismissive, and he sees a kind of Darwinian process filtering chaff from grain. Mighty Words and similar online publishers "don't know what a book is," he contends. "But people know what a book is. Human beings are designed to distinguish value, and in my opinion that problem will take care of itself."
He disregards the tremors that have gone through the publishing houses ever since B&N.com announced it was getting into the business of publishing books. Barnes & Noble Digital was formed the first week in January to compete with the new electronic subsidiaries of traditional publishers, which are bringing out digital versions of new titles readable on PCs or dedicated devices, as well as original works specifically created for electronic distribution. In addition, they are digitizing backlist and out-of-print books that can be reprinted in very small quantities in a process known as print-on-demand. "It's yet another premature entry," says Epstein. "B&N's publishing experience is limited to a remainder operation. That's entirely different from bringing out original works."
While Epstein criticizes the proverbial naysayers who laughed at those early cars stuck in the mud, at the same time he cautions, "I don't think an author who has worked hard to create something of value will want to risk it in the electronic format at this point." He says bookstores will wind up selling new titles at much lower prices than is now the case, $10 or so, but "can't figure out" how that will be done in the black. His predictions are compelling, but they are also much too vague--for instance, he sets out no time frame or actual mechanics for what he believes will transpire.
The bloat of the superstores is something publishers have worried about for years, almost from their rollout. This holiday season's flat sales at the three biggest chains; the margin-slashing of Amazon; and the re-energizing of the independent stores through a marketing program called Booksense, which includes web-based retailing, all serve to illustrate Epstein's points. Borders went so far as to put itself on the block, but found no willing takers. Recent murmurs about B&N's CEO Len Riggio entertaining a buyout offer from media conglomerate Gemstar-TV Guide International, which has aggressively entered the e-book technology market, did not result in a deal but also were more than simple gossip.
The past twenty years have seen the RCAs, MCAs, Advance Publications and the like learn their lessons and abandon book publishing, as Epstein has noted. Other conglomerates have already tried to offload their publishing components and in time will try again. But it also can't be ignored that companies like the German-based Bertelsmann (which acquired Bantam, Doubleday Dell and Random House and consolidated them) and von Holtzbrinck (which has bought Holt, St. Martin's and Farrar, Straus & Giroux) have their roots in the book business itself. They are therefore not as likely to exit the scene as Epstein would have us believe.
Undoubtedly, many of Epstein's electronic dreams are prescient and will one day come to pass. The companies that first turn them into reality, though, will likely be turning out works in the professional, scholarly, reference and educational sectors rather than in the trade world he knows so well. But although the Internet will change book publishing profoundly and in ways even Jason Epstein can't predict, other forces are at work as well and shouldn't be ignored.
A couple of years ago a brilliant and rich entrepreneur who also happens to be a profoundly bookish man devised a model, not unlike Epstein's nostalgic vision, of devolved companies publishing real books that share a central financial source. It is called the Perseus Group. It is still in its early days, far too soon to know whether it will last. But Epstein's longing for a more civilized, human-scale publishing business is shared by many. The Internet may help bring it about, but it won't do everything.
The following debate is adapted from a forum put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Study in Princeton, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.
John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.
In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with its people convinced that they know best? At Basic Books' fiftieth anniversary, it's a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.
What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-reviewed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.
Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses tens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. "Selling out" now has more to do with ticket grosses than with the antimaterialist who stands apart from society.
How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we reconcile the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.
Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?
Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.
They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.
In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable, and therefore superficial. There's an assumed equation between unreadability and profundity.
My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.
So my argument, and one of the working titles of my book, was in fact "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really making a generational argument, which does in fact point to a decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization absorb and suck away too much talent, and that there are too few who are bucking the trends.
Donatich: Maybe the term "public intellectual" begs the question, "who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?
Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual as I understand that vocation to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. Too many White House dinners, I think, can blunt the edge of criticism.
A way I like to put it is that when you're thinking about models for this activity, you might put it this way: Sartre or Camus? An intellectual who was willing to look the other way and, indeed, shamefully explain away the existence of slave-labor camps, the gulags, in the service of a grand world-historic purpose; or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and of capitalistic oppressors out to grind the faces of the poor.
There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.
Let me mention just one issue that I took on several times when I took turns writing a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.
At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three biological ethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype, but if you say that, you're an alarmist--so that's what I am.
This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.
So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.
When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.
A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and depoliticization, more generally.
I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.
Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?
Stephen Carter: You know that in the academy the really bad word is "popularizer"--a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.
A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.
And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think one of the reasons, if the craft of being intellectual in the sense of the scholar who speaks to a large public is in decline, is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.
One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.
You can scarcely read a lot of scholarship today without first having to wade through several chapters of laying out the ground in the sense of apologizing in advance to all the constituencies that may be offended, lest one be thought in the other camp. That kind of intellectual activity is not only dangerous, it's unworthy in an important sense, it's not worthy of the great traditions of intellectual thought.
There's a tendency sometimes to posit an uneasy equation in which serious intellectual activity sits over here and religion over there, and the two are, in some sense, at war--that people of deep faith are plainly anti-intellectual and that serious intellectuals are plainly antireligious bigots. These are two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.
I was asked by a journalist a few years ago why was it that I was comfortable identifying myself, and often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is, there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.
And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.
Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.
America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.
In this century too, we have great intellectual preachers who also spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though sometimes, people try to make a straitjacket intellectual of him by insisting, with no evidence whatsoever, that he actually was simply making secular moral arguments, and that religion was kind of a smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.
And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.
Now, I'm not being ahistorical. I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudice that assumes such speakers can't be making serious arguments until those arguments are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.
Donatich: Professor Carter has made a career out of illustrating the effect and protecting the right of religious conviction in public thought. Herbert Gans, on the other hand, is a self-pronounced, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory and yet remain relevant, contribute some practical utility to the public discourse?
Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.
So it seems to me it's just the opposite: that the public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. And I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And I think at least on my campus, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes and then it passes. But I think that really is happening, and if celebrities can exist, their numbers will increase.
One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.
Some of the columnists in the newspapers and the tabloid press who are not journalists with a PhD are public intellectuals. There are pundits who are satirical commentators, there are a significant number of people who get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.
Most public intellectuals function as quote-suppliers who legitimize the media. Two or three times a week, I get called by journalists and asked whether I will deliver myself of a sociological quote to accompany their articles--to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while there are a few public intellectuals who are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.
I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.
The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.
But the disciplinary public intellectuals show that their disciplinary insights and their skills can add something original to the public debate. That, in other words, social scientists and humanists can indeed grapple with the issues and the problems of the real world. The disciplinary public intellectuals, like other public intellectuals, have to write in clear English. This is a rarity in the academy, unfortunately--which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which is important in one sense, because we all live off public funds, directly or indirectly, and we need to demonstrate every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal as proof that we're doing something useful or proof that we're entitled to some share of the public budget.
Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.
There are a number of hypotheses on this; I'm not sure any of them are true--whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, they think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)
It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.
Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?
Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to fall behind us--is that it really distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was kind of a news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the other really progressive and important things that were happening because of the rise of the web.
I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."
I think that I came to the web, to starting Feed, and to writing the book that I wrote about Internet culture and interface culture as a kind of refugee from conversations like one in the academy, when I was a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute convoluted Derridean question about his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."
The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.
So the good news, I think, is that my experience is not unique. In fact, there's been a great renaissance in the last five years of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but is not limited to that. And I think publications like Feed--to pat myself on the back--and Hermenaut and Suck are all good examples of a lively new form of public intellectualism that is not academic in tone.
The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno never was. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model that really stands out for this generation, it's somebody like Walter Benjamin. You know, a sense of an interest that puts together groups of things you wouldn't necessarily expect to see put together in the same essay.
How does the web figure into all of this? Why did these people show up on the web? I think one of the things that has started happening--actually, it's only just starting to happen--is that in addition to these new publications, you're starting to see something that is unique to the web: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio--except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit this as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place and you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.
The web gives you a way of rounding up all those diverse kinds of experiences and ideas--and linking to them, because, of course, the web is finally all about linking--in a way that I think nothing has done quite as well before it. And it also involves a commitment to real engagement with your audience that perhaps public intellectuals have talked a lot about in the past but maybe not lived up to as much as they could have.
Some of this is found in the new formats that are available online in terms of how public dialogue can happen. I'm sure many of you have read these and many of you may have actually participated in them, but I'm a great advocate for this kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat, which is a disaster in terms of quality of discourse, which inevitably devolves into the "What are you wearing" kind of intellectual questions. But rather, the conversations with four or five people where each person has a day or half a day to think up their responses, and then write in 500- to 1,000-word posts. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum. It's very engaged, it's very responsible, it's very dialogic and yet also lively in a conversational way. But, because of the back and forth, you actually can get to places that you sometimes couldn't get in a stand-alone 10,000-word essay.
Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."
Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more of people who are in this kind of space, having pages that are devoted to themselves and carrying on these conversations all the time with people who are coming by and engaging with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish either your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story that I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months, I guess, after Feed launched, Wired came out with a review of it. And they had this one slightly snippy line that said, "It's good to see the East Coast literary establishment finally get online." Which is very funny, to be publishing this thing out of our respective apartments. I had this moment where I was looking around my bedroom for the East Coast literary establishment--you open the closet door, and "Oh, Norman Mailer is in there. 'Hey, how's it going!'" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.
Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?
Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.
I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.
The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown of Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense, and bipartisanship, is again regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."
Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.
Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.
These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.
The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.
The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.
I find this absolutely extraordinary. When you're told you must pick one of the available ones; "We've got you some candidates, what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. And he said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.
Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.
The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."
I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.
I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.
At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation. And what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appear before us on the day that the DNA string was finally traced out to its end, and we're told in their voices and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. For example, in The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.
So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, in place of the disorder and anxiety and struggle that are required in order to claim that the mind has any place in these things at all.
I would say that because the intellectual has some responsibility, so to speak, for those who have no voice, a very high task to adopt now would be to set oneself, and to attempt to set others--utterly and contemptuously and critically and furiously--against the now almost daily practice in the United States of human sacrifice. By which I mean the sacrifice, the immolation, of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.
People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."