A friend and I were sitting around commiserating about the things that get to us: unloading small indignities, comparing thorns. "So there I was," she said, "sitting on the bus and this man across the aisle starts waving a copy of law professor Randall Kennedy's new book Nigger. He's got this mean-looking face with little raisiny eyes, and a pointy head, and he's taking this book in and out of his backpack. He's not reading it, mind you. He's just flashing it at black people."
"Don't be so touchy," I responded. "Professor Kennedy says that the N-word is just another word for 'pal' these days. So your guy was probably one of those muted souls you hear about on Fox cable, one of the ones who's been totally silenced by too much political correctness. I'd assume he was just trying to sign 'Have a nice day.'"
"Maybe so," she said, digging through her purse and pulling out a copy of Michael Moore's bestselling Stupid White Men. "But if I see him again, I'm armed with a 'nice day' of my own."
"That's not nice," I tell her. "Besides, I've decided to get in on the publishing boom myself. My next book will be called Penis. I had been going to title it Civil Claims That Shaped the Evidentiary History of Primogeniture: Paternity and Inheritance Rights in Anglo-American Jurisprudence, 1883-1956, but somehow Penis seems so much more concise. We lawyers love concision."
She raised one eyebrow. "And the mere fact that hordes of sweaty-palmed adolescents might line up to sneak home a copy, or that Howard Stern would pant over it all the way to the top of the bestseller list, or that college kids would make it the one book they take on spring break--"
"...is the last thing on my mind," I assured her. "Really, I'm just trying to engage in a scholarly debate about some of the more nuanced aspects of statutory interpretation under Rule 861, subsection (c), paragraph 2... And besides, now that South Park has made the word so much a part of popular culture, I fail to see what all the fuss is about. When I hear young people singing lyrics that use the P-word, I just hum along. After all, there are no bad words, just ungood hermeneutics."
"No wonder Oprah canceled her book club," she muttered.
Seriously. We do seem to have entered a weird season in which the exercise of First Amendment rights has become a kind of XXX-treme Sport, with people taking the concept of free speech for an Olympic workout, as though to build up that constitutional muscle. People speak not just freely but wantonly, thoughtlessly, mainlined from their hormones. We live in a minefield of scorched-earth, who-me-a-diplomat?, let's-see-if-this-hurts words. As my young son twirls the radio dial in search of whatever pop music his friends are listening to, it is less the lyrics that alarm me than the disc jockeys, all of whom speak as though they were crashing cars. It makes me very grateful to have been part of the "love generation," because for today's youth, the spoken word seems governed by people from whom sticks and stones had to be wrested when they were children--truly unpleasant people who've spent years perfecting their remaining weapon: the words that can supposedly never hurt you.
The flight from the imagined horrors of political correctness seems to have overtaken common sense. Or is it possible that we have come perilously close to a state where hate speech is the common sense? In a bar in Dorchester, Massachusetts, recently, a black man was surrounded by a group of white patrons and taunted with a series of increasingly hostile racial epithets. The bartender refused to intervene despite being begged to do something by a white friend of the man. The taunting continued until the black man tried to leave, whereupon the crowd followed him outside and beat him severely. In Los Angeles, the head of the police commission publicly called Congresswoman Maxine Waters a "bitch"--to the glee of Log Cabin Republicans, who published an editorial gloating about how good it felt to hear him say that. And in San Jose, California, a judge allowed a white high school student to escape punishment after the student, angry at an African-American teacher who had suspended his best friend, scrawled "Thanks, Nigga" on a school wall. The judge was swayed by an argument that "nigga" is not the same as "nigger" but rather an inoffensive rap music term of endearment common among soul brothers.
Frankly, if Harvard president Lawrence Summers is going to be calling professors to account for generating controversy not befitting that venerable institution, the disingenuous Professor Kennedy would be my first choice. Kennedy's argument that the word "nigger" has lost its sting because black entertainers like Eddie Murphy have popularized it either dehistoricizes the word to a boneheaded extent or ignores the basic capaciousness of all language. The dictionary is filled with words that have multiple meanings, depending on context. "Obsession" is "the perfume," but it can also be the basis for a harassment suit. Nigger, The Book, is an appeal to pure sensation. It's fine to recognize that ironic reversals of meaning are invaluable survival tools. But what's selling this book is not the hail-fellow-well-met banality of "nigger" but rather the ongoing liveliness of its negativity: It hits in the gut, catches the eye, knots the stomach, jerks the knee, grabs the arm. Kennedy milks this phenomenon only to ask with an entirely straight face: "So what's the big deal?"
The New Yorker recently featured a cartoon by Art Spiegelman that captures my concern: A young skinhead furtively spray-paints a swastika on a wall. In the last panel, someone has put the wall up in a museum and the skinhead is shown sipping champagne with glittery fashionistas and art critics. I do not doubt that hateful or shocking speech can be "mainstreamed" through overuse; I am alarmed that we want to. But my greater concern is whether this gratuitous nonsense should be the most visible test of political speech in an era when government officials tell us to watch our words--even words spoken in confidence to one's lawyer--and leave us to sort out precisely what that means.
On East Capitol Street a few years ago, I was in a taxi when a car pulled suddenly and dangerously across our bow. My driver was white, with a hunter's cap and earmuffs and an indefinable rosy hue about his neck. The offending motorist was black. Both vehicles had to stop sharply. My driver did not, to my relief, say what I thought he might have been about to say.
The slogans scrawled across the walls of Paris in May 1968 suggest possibilities most of us have forgotten or that were long ago deemed preposterous. "Never work!" said one slogan. "The more you consume the less you live!" warned another. The words, restless and uncompromising, ask you to wake up, to change your life, to find a better way to live.
At its zenith in the late 1960s, the Situationist International could claim that "our ideas are in everyone's mind," even though the SI itself never had more than a few dozen members. When the whole world was exploding in 1968, the Situationist texts that had appeared throughout the decade read like road maps for revolution, full of slogans and tactics that youthful rebels picked up en route to wildly varying destinations.
The Situationist legacy, nearly forgotten after the group's dissolution in 1972, was recovered in 1989 with the publication of Greil Marcus's Lipstick Traces, which purported to trace the subterranean relationships between medieval heresy, nineteenth-century utopianism, Dada, Surrealism, Situationism, soul music and punk rock.
Today, Situationism exerts considerable--though often unacknowledged and depoliticized--influence over academic discourse and artistic practice in many media. It also plays a role in shaping the movement for global justice (or the "antiglobalization movement," as its critics like to call it), from Naomi Klein's book No Logo to the magazine Adbusters to the proliferating network of Independent Media Centers. Kept alive by a stream of reprints, anthologies and retrospectives from mostly anarchist presses, the Situationist critique continues to gain fresh adherents.
The most recent anthology, Beneath the Paving Stones: Situationists and the Beach, May 1968, includes three major Situationist pamphlets, along with eyewitness accounts, photographs, poster art, leaflets and other documents of France's almost-revolution. City Lights, meanwhile, has published the inaugural volume of its "Contributions to the History of the Situationist International and Its Time," a long conversation with Jean-Michel Mension called The Tribe.
Jean-Michel Mension was a petty criminal and teenage drunk who hung around the Saint-Germain-des-Prés neighborhood of Paris from 1952 to 1954. There he met the Lettrists, a movement of poets and painters founded in the late 1940s by Isidore Isou in response to the growing impotence of Surrealism, and taught them to smoke hash, snort ether and consume heroic amounts of alcohol. It was in this capacity that Mension met Guy Debord, the bookish filmmaker who would later become the chief theorist of the SI.
In the photos throughout The Tribe, Debord is bespectacled and a bit short, resembling a young Woody Allen. Yet his slight physical stature belied a ferocious intellect and messianic personality, one that Marcus in Lipstick Traces identifies in young rebels from eighteenth-century blasphemer Saint-Just to punk rocker Johnny Rotten. "His instincts," says Marcus, "are basically cruel; his manner is intransigent.... He is beyond temptation, because despite his utopian rhetoric satisfaction is the last thing on his mind.... He trails bitter comrades behind him like Hansel his breadcrumbs."
Debord, says Mension, was fascinated by outlaws and by the prisons and reformatories where they could be found. Not that Debord was destined for prison--Mension notes that he lived a "very bourgeois" lifestyle, once finding his friend at home in Rue Racine "in the role of a gent in a dressing gown." Like the Beats in America (correctly characterized by the SI as "that right wing of the youth revolt"), the Lettrists saw in antisocial behavior the revolt they longed for. The young, said Isou in 1953, were "slaves, tools...the property of others, regardless of class, because they have no real freedom of choice...to win real independence they must revolt against their very nonexistence." Mension was a model for the Lettrists in that he refused to be a slave or a tool, was "always on the margins," always drunk.
"Guy taught me stuff about thinkers," says Mension, "and I taught him stuff about practice, action." Young men like Isou and Debord needed delinquents like Mension, whom they later expelled from the Lettrists for being "merely decorative." Mension was not an intellectual and did not pretend to be. "I was a youngster," says Mension, "who had done things that [Debord] was incapable of doing.... I was the existential principle and he was the theoretician."
Even in his own memoir, Mension comes off as the object of others' interpretations rather than as an active subject. The most engaging parts of The Tribe are not the conversations with Mension, who is vague and noncommittal about the great ideas of his day, but the photographs and excerpts of Lettrist writing that appear in the margins of the book. For example: "We are young and good-looking," says their leaflet against Charlie Chaplin, whom they deemed an artistic and political sellout, "and when we hear suffering we reply Revolution." Combining ironic arrogance with self-righteous anger, their words will be instantly familiar to anyone acquainted with the rhetoric of youth revolt. The Tribe is not the place to go if you are trying to understand Lettrism and its influence on other movements, but it is a charming sketch of a time and place where characters like Mension and Debord collided to create new ways of living and thinking.
In 1957 Guy Debord met with avant-garde artists and theorists from around Europe to found a new group, which would be devoted to creating situations: "moment[s] of life concretely and deliberately constructed by the collective organization of a unitary ambiance and a game of events." The Situationists were leftists in the tradition of Karl Marx and Rosa Luxemburg, but like the Lettrists they embraced youth as agents of revolutionary change--a mad and astonishingly prescient challenge to the sociological orthodoxy of their time. They also developed a radical new vision of how capital shapes culture and society.
"In the primitive phase of capitalist accumulation," writes Debord in his 1967 treatise The Society of the Spectacle, "political economy sees the proletarian only as the worker," who receives only enough compensation to survive. When the surplus of commodities reaches a certain level, however, it becomes necessary for capitalism to extract "a surplus of collaboration from the worker." As Henry Ford famously remarked, he had to pay his assembly-line workers enough so that they could afford to buy his cars. Thus was born the culture of consumption--and the society of the spectacle.
"The spectacle," says Debord, is "capital accumulated until it becomes an image" that mediates social relations among people, serving the needs and obscuring the power of capital. In the spectacular economy, all daily life and everything related to thought--sports, advertising, news, school, etc.--is mobilized on behalf of commodities, preaching work and consumption to the powerless so that the owners may prosper and live more fully. Unlike the rest of the Marxist left, the Situationists did not target scarcity, but instead abundance and the contradictions it entailed--especially boredom, which they saw as an ultramodern, artificially created method of social control.
According to Situationism, revolution in the spectacular economy must be waged not only at the point of industrial production, as Marxists thought, but also at points of consumption and in the realm of the image. It was at these points that alienation was deepest, the contradictions sharpest. By destroying the symbols that stand between the owners and nonowners, the underlying machinations of capital might be revealed. The proletariat was still a revolutionary class, but one joined by students, alienated youth and media workers. In 1966, a group of students used their posts in the student government at the University of Strasbourg to publish a stick of dynamite called "On the Poverty of Student Life," by SI member Mustapha Khayati. Thousands of copies were distributed to students at Strasbourg and throughout France, lighting a fuse that eventually ignited a general strike of students and workers in 1968. This is where Beneath the Paving Stones begins, with the moment in which Situationism broke through to its contemporaries.
Snotty and provocative, "On the Poverty of Student Life" asks students "to live instead of devising a lingering death, and to indulge untrammeled desire." By doing so, the erstwhile students would create a situation that goes beyond the point of no return. The spectacle and the mode of hierarchical, exploitative production it represents would be destroyed, replaced (rather magically, it must be said) by workers' councils practicing direct democracy and free individuals living without false desires. Voilà! Utopia!
"On the Poverty of Student Life" popularized ("diffused") the Situationist critique, but the centerpiece of Beneath the Paving Stones is "The Totality for Kids," by Raoul Vaneigem. In thirty elliptical sections, Vaneigem takes the reader from the "prehistoric food-gathering age" to the spectacular information economy and then to the point of revolution. There, says Vaneigem, the dispossessed must each seize their own lives, deliberately constructing each moment in a generalized conflict that stretches from domestic squabbles to classrooms to shop-floor struggles. It's a conception of revolution that encompasses feminism, black power, student power, anticolonial revolt, workers' control and even avant-garde artistic movements.
While Situationist writings, powerful and still relevant, deserve their perennial revival, Beneath the Paving Stones falters by not criticizing and updating Situationist theory and practice. At the beginning of the twenty-first century, it is simply not possible to present the events of 1968 without indulging in a certain radical nostalgia. Not only has postmodernism depoliticized or challenged many Situationist ideas (particularly reader-response theories arguing that nonowners actively participate in shaping the spectacle) but the Situationist legacy is often embraced by radical movements that forget to ask what happened after 1968. They certainly knew how to throw a great party, but they were not particularly interested in the details. The Situationist International ended its life fragmented, isolated, defeated. No other movement or organization was ever pure enough for them, and as personality conflicts and expulsions diminished their ranks, self-indulgent tactics limited their influence.
One can see this self-destructive tendency emerging early on, in the works and lives of those proto-punks, the Lettrists. By seeking to live without contradictions, on the margins where they had the freedom to construct their own daily lives--"we must survive," says Vaneigem, "as antisurvivors"--the Lettrists sacrificed their ability to attack those contradictions. The empty space in Situationist theory, as in many others, lies between constructing individual moments and changing the world. The Surrealists once sought to fill that space with official Communism, but were ultimately forced by Stalinism to sacrifice the subconscious for Socialist Realism. The Situationists learned from their sad example, taking a path that has left their ideas intact but confined to the realm of anthologies and retrospectives. Romanticizing their integrity might be useful, but fetishizing their failure is not. Situationism must be surpassed if it is ever to make a difference.
At work recently, I went to get a ham sandwich from the university cafeteria. I discovered, to my vocal dismay, that the well-loved food counter offering homemade fare had been torn out and replaced by a Burger King franchise. Questioned about this innovation, the head of "food services" insisted that it had been implemented in response to consumer demand. An exhaustive series of polls, surveys and questionnaires had revealed, apparently, that students and faculty were strongly in favor of a more "branded feel" to their dining environment.
It is worth pausing over the term "branded feel." It represents, I think, something profound: The presence of Burger King in the lunchroom is claimed to be a matter of affect. It addresses itself to "feelings," it meets a need that is more emotional than economic. This need has been identified, I was informed, by scientific and therefore inarguable means. The food-services honcho produced statistics that clearly indicated a compelling customer desire for bad, expensive food. According to his methodology, my protests were demonstrably elitist and undemocratic.
It is hardly news that opinion polls are frequently used to bolster the interests of those who commission them. But in recent years the notion that opinion can be measured in quantifiable terms has achieved unprecedented power and influence over public policy. The American penal system, for instance, has been rendered increasingly violent and sadistic as a direct response to opinion polls, which inform politicians that inhumane conditions are what voters desire. The thoughts and emotions of human beings are regarded as mathematically measurable, and the practical effects of this notion are now perceptible in the most mundane transactions of daily life.
This quantified approach to human nature is the result of the importation of theoretical economics into the general culture. Since the marginalist revolution of the late nineteenth century, neoclassical economists have rigidly confined their investigations within the methodological paradigm of positivist science, and they aspire in particular to the model of mathematics. Economists seek to produce empirically verifiable, statistical patterns of human behavior. They regard such studies as objective, unbiased and free of value-laden, superstitious presuppositions. The principle of "consumer sovereignty" hails this mode of procedure as the sociological arm of democracy, and it has made economics the most prestigious of the human sciences.
As David Throsby's Economics and Culture and Don Slater and Fran Tonkiss's Market Society show, the procedures of academic economists are now being further exalted to a position of dominant influence over everyday experience. Homo economicus is fast becoming equated with Homo sapiens. When airlines refer to passengers as "customers" and advise them to be "conservative with your space management," this development may seem trivial or comic. But in their very different ways, these books suggest that beneath such incremental cultural mutations there lurks an iceberg of titanic dimensions.
The Australian academic David Throsby is about as enlightened and humanistic as it is possible for a professional economist to be. He is also an accomplished playwright, and his influence on the political culture of his native land has been extensive and unvaryingly benign. He begins from the accurate supposition that "public policy and economic policy have become almost synonymous," and his intention is to rescue culture from the philistinism of businessmen and politicians who are incapable of lifting their eyes above the bottom line. It is a lamentable sign of the times, however, that he sees no other means of doing so than by translating aesthetic endeavor into quantifiable, economic terms. As he puts it, "If culture in general and the arts in particular are to be seen as important, especially in policy terms in a world where economists are kings, they need to establish economic credentials; what better way to do this than by cultivating the image of art as industry."
In order to cultivate this image, Throsby makes extensive if ambivalent use of the "rational-choice theory" derived from the work of Gary Becker. In Becker's opinion, the kinds of decision-making that economists contrive to abstract from the actions of people conceived as economic agents can be extrapolated to explain their behavior in areas of life that were once, romantically and unscientifically, thought of as lying beyond the arid terrain of rational calculation: love, for example, or aesthetic endeavor. This emboldens Throsby to ask whether we "might envisage creativity as a process of constrained optimisation, where the artist is seen as a rational maximizer of individual utility subject to both internally and externally imposed constraints," and to postulate "a measure...of difference in creativity (or 'talent'), in much the same way as in microeconomic analysis differences between production functions in input-output space measures differences in technology."
There are enough caveats in Throsby's book to indicate a laudable reluctance to engage in this project; however, he evidently feels that the current climate of opinion leaves him no other choice. He is thus driven to apply the economic understanding of "value" to cultural phenomena, and to engage in a "consideration of culture as capital...in the economic sense of a stock of capital assets giving rise over time to a flow of capital services." Much of this book consists of a monomaniacal reinscription of life itself into the technical discourse of neoclassical economics. We are therefore subjected to lengthy discussions of "cultural capital" (formerly known as "culture"), "social capital" (a k a "society"), "physical capital" (née "buildings"), "natural capital" (alias "nature") and of course "human capital" (once referred to as "people"). There is, it seems, no limit to the colonizing potential of economics: "If broader cultural phenomena, such as traditions, language, customs, etc. are thought of as intangible assets in the possession of the group to which they refer, they too can be brought into the same framework."
We are faced here, essentially, with the quantification of all human experience. Not merely economic behavior but every aspect of life and thought can be expressed under the statistical rubric and studied in mathematical form. The notion of the "stakeholder," dear to Tony Blair, whose ambition to create a "stakeholder society" is overt and unapologetic, is fundamental to this project.
A stakeholder stands in relation to the world as a shareholder does to a corporation. He (or she) casts a cold eye on his surroundings and perceives only his "stake" in them; he rationally considers the means by which he may optimally maximize their benefits. The stakeholder, then, is not human. He is rather a quantified abstraction from humanity, a machine designed for the calculation of marginal utility. Good-hearted economists such as Throsby would retort that the stakeholder does not enjoy an empirical existence; he is merely a useful theoretical construct. Would that it were so. But in fact, as Hannah Arendt said of neoclassical economics' cousin, behavioral psychology: "The problem...is not that it is false but that it is becoming true."
There is an interesting convergence between rational-choice theory and the venerable tradition of socialist materialism. Both approaches insist that the real factor motivating human behavior is economic self-interest: that of an individual in the former case, and that of a social class in the latter. The British sociologists Don Slater and Fran Tonkiss address many of the same questions as Throsby in their book Market Society, but they view the conquest of intellectual and social life by economics from a more traditionally leftist perspective. Like Throsby, Slater and Tonkiss acknowledge that "market logic has come to provide a means of thinking about social institutions and individuals more generally," but instead of concluding that students of aesthetics must therefore incorporate economic concepts into their practice, they envisage a movement in the other direction. Today, they claim, "the economist's task of explanation is as much interpretive or hermeneutic as it is mathematical."
Slater and Tonkiss are influenced here by the "rhetorical turn" that economists such as Deirdre McCloskey have recently attempted to introduce into their discipline. The increasingly abstract nature of money, it is claimed, lays bare the fact that financial value, like semiotic meaning, is an imaginary and therefore arbitrary mode of signification. As such, money can be studied using terms and concepts drawn from rhetoric and literary criticism. (An amusing parody of this idea occurs in Will Self's novel My Idea of Fun, which features a "money critic" whose job is to pontificate about the aesthetic qualities of various forms of finance.) Slater and Tonkiss present this as an appealing reversal of intellectual roles: "Whereas the central preoccupation of critical social analysis has traditionally been the way in which economic rationality dominates culture, contemporary social theory has been increasingly concerned with the central role of cultural processes and institutions in organizing and controlling the economic."
Although their emphasis is different, Slater and Tonkiss's argument leads to the same essential conclusion as Throsby's: It no longer makes sense to distinguish between "economics" and "culture," or between "the market" and "society." In practice, it makes little difference whether one regards this as an incursion of aesthetics into economics or vice versa. Indeed, Slater and Tonkiss are a good deal more pessimistic than Throsby about the consequences of this development. To their credit, they are willing and able to introduce into the discussion concepts like "commodification" and "alienation," from which even liberal economists like Throsby recoil in horror. But they stop well short of the bleak dystopianism of Adorno, and their slightly anodyne conclusion is that "markets are not simply good or bad, because they are highly variable." This pluralism is forced upon them, because their book is intended as a historical survey of various theoretical approaches to the market: Market Society provides admirably lucid and meticulously fair readings of Smith, Ricardo, Durkheim, Simmel, Weber and Polanyi. Despite this historical approach, the book's most beguiling feature is that it treats these past thinkers with a prominent sense of our present predicament.
Discussing the economist whose theories have had the greatest influence on that predicament, Slater and Tonkiss remind us that "Hayek held that ultimately there were no economic ends as such; economic action always served ends that were non-economic in character because needs and desires are exogenous (or external) to the market setting." But to say that there are no economic ends is the same as to say that there are only economic ends. It is, in other words, to abolish any distinction between the economic and the noneconomic. Toward the end of Economics and Culture, Throsby observes that "in primitive societies...culture and economy are to a considerable degree one and the same thing." By this definition, as each of these important and timely books suggests, our society may be the most primitive of all. Can anyone, today, escape the "branded feel"?
The following debate is adapted from a forum--put together by Basic Books and held in New York City some weeks ago. Participating were: John Donatich, who moderated and is publisher of Basic Books; Russell Jacoby, who teaches at UCLA and is the author of The End of Utopia and The Last Intellectuals; Jean Bethke Elshtain, who has served as a board member of the Institute for Advanced Studies at Princeton University, is a fellow of the American Academy of Arts and Sciences, teaches at the University of Chicago and is the author of Women and War, Democracy on Trial and a forthcoming intellectual biography of Jane Addams; Stephen Carter, the William Nelson Cromwell Professor of Law at Yale University and author of, among other works, The Culture of Disbelief, Reflections of an Affirmative Action Baby, Integrity, Civility and, most recently, God's Name in Vain: The Wrongs and Rights of Religion in Politics; Herbert Gans, the Robert S. Lynd Professor of Sociology at Columbia University and author of numerous works, including Popular Culture and High Culture, The War Against the Poor and The Levittowners; Steven Johnson, acclaimed as one of the most influential people in cyberworld by Newsweek and New York magazines, co-founder of Feedmag.com, the award-winning online magazine, and author of the books Interface Culture and the forthcoming Emergence; and Christopher Hitchens, a columnist for The Nation and Vanity Fair, whose books include the bestselling No One Left to Lie To: The Values of the Worst Family and The Missionary Position: Mother Teresa in Theory and Practice. For Basic, he will be writing the forthcoming On the Contrary: Letters to a Young Radical.
John Donatich: As we try to puzzle out the future of the public intellectual, it's hard not to poke a little fun at ourselves, because the issue is that serious. The very words "future of the public intellectual" seem to have a kind of nostalgia built into them, in that we only worry over the future of something that seems endangered, something we have been privileged to live with and are terrified to bury.
In preparing for this event, I might as well admit that I've been worried about making the slip, "the future of the public ineffectual." But I think that malapropism would be central to what we'll be talking about. It seems to me that there is a central conflict regarding American intellectual work. How does it reconcile itself with the venerable tradition of American anti-intellectualism? What does a country built on headstrong individualism and the myth of self-reliance do with its people convinced that they know best? At Basic Books' fiftieth anniversary, it's a good time to look at a publishing company born in midcentury New York City, a time and place that thrived on the idea of the public intellectual. In our first decades, we published Daniel Bell, Nathan Glazer, Michael Walzer, Christopher Lasch, Herb Gans, Paul Starr, Robert Jay Lifton--and these names came fresh on the heels of Lévi-Strauss, Freud, Erik Erikson and Clifford Geertz.
What did these writers have in common except the self-defined right to worry the world and to believe that there is a symbiotic relationship between the private world of the thinker and the public world he or she wishes to address? That the age of great public intellectuals in America has passed has in fact become a cliché. There are many well-rehearsed reasons for this. Scholars and thinkers have retreated to the academy. Self-doubt has become the very compass point of contemporary inquiry. Scholarship seems to start with an autobiographical or confessional orientation. The notion that every question has a noble answer or that there are reliable structures of ideology to believe in wholeheartedly has become, at best, quaint.
Some believe that the once-relied-upon audience of learned readers has disappeared, giving way to a generation desensitized to complex argumentation by television and the Internet. The movie Dumb and Dumber grosses tens of millions of dollars at the box office, while what's left of bohemian culture celebrates free-market economics. Selling out now has more to do with ticket grosses than with the antimaterialist stance of standing apart from society.
How do we reconcile ambition and virtue, expertise and accessibility, multicultural sensitivity and the urge toward unified theory? Most important, how do we honor the fact that disagreement is a main catalyst of progress? How do we battle the gravitation toward happy consensus that paralyzes our national debate? A new generation of public intellectuals waits to be mobilized. What will it look like? That is what our distinguished panelists will discuss.
Russell Jacoby has been useful in defining the role of the public intellectual in the past half-century, especially in the context of the academy. Can you, Russell, define for us a sort of historical context for the public intellectual--what kind of talent, courage and/or political motivation it takes for someone to be of the academy but to have his or her back turned to it, ready to speak to an audience greater than one's peers?
Russell Jacoby: A book of mine that preceded The Last Intellectuals was on the history of psychoanalysis. And one of the things I was struck by when I wrote it was that even though psychoanalysis prospered in the United States, something was missing--that is, the sort of great refugee intellectuals, the Erik Eriksons, the Bruno Bettelheims, the Erich Fromms, were not being reproduced. As a field it prospered, but it became medicalized and professionalized. And I was struck by both the success of this field and the absence of public voices of the Eriksons and Bettelheims and Fromms. And from there I began to consider this as a sort of generational question in American history. Where were the new intellectuals? And I put the stress on public intellectuals, because obviously a kind of professional and technical intelligentsia prospered in America, but as far as I could see the public intellectuals were becoming somewhat invisible.
They were invisible because, in some ways, they had become academics, professors locked in the university. And I used a kind of generational account, looking at the 1900s, taking the Edmund Wilsons, the Lewis Mumfords. What became of them, and who were their successors? And I had a tough time finding them.
In some sense it was a story of my generation, the generation that ended up in the university and was more concerned with--well, what?--finding recommendations than with writing public interventions. And to this day, the worst thing you can say about someone in an academic meeting or when you're discussing tenure promotion is, "Oh, his work is kind of journalistic." Meaning, it's readable. It's journalistic, it's superficial. There's an assumed equation between obscurity and profundity.
My argument was that, in fact, these generations of public intellectuals have diminished over time. For good reasons. The urban habitats, the cheap rents, have disappeared--as well as the jobs themselves. So the transitional generation, the New York intellectuals, ends up in the university. I mention Daniel Bell as a test case. When he was getting tenure, they turned to him and said, "What did you do your dissertation on?" And he said, "I never did a dissertation." And they said, "Oh, we'll call that collection of essays you did a dissertation." But you couldn't do that now. Those of that generation started off as independent intellectuals writing for small magazines and ended up as professors. The next generation started off as professors, wrote differently and thought differently.
So my argument, and one of the working titles of my book, was, in fact, "The Decline of the Public Intellectuals." And here I am at a panel on "The Future of Public Intellectuals." Even at the time I was writing, some editors said, "Well, decline, that's a little depressing. Could you sort of make a more upbeat version?" So I said, "I have a new book called The Rise of American Intellectuals," and was told, "Well, that sounds much better, that's something we can sell." But I was really taking a generational approach, and that generation of public intellectuals is, in fact, in decline. And it caused intense controversy, mainly among my contemporaries, who always said, "What about me? I'm a public intellectual. What about my friends?" In some sense the argument is ongoing. I'm happy to be wrong, if there are new public intellectuals emerging. But I tend to think that the university and professionalization do absorb and suck away too much talent, and that there are too few who are bucking the trends.
Donatich: Maybe the term "public intellectual" raises the question, "who is the public that is being addressed by these intellectuals?" Which participant in this conversation is invisible, the public or the intellectual?
Jean Bethke Elshtain: I mused in print at one point that the problem with being a public intellectual is that as time goes on, one may become more and more public and less and less intellectual. Perhaps I should have said that a hazard of the vocation of the public intellectual lies in that direction. I didn't exactly mean less academically respectable, but rather something more or less along these lines: less reflective, less inclined to question one's own judgments, less likely to embed a conviction in its appropriate context with all the nuance intact. It is the task of the public intellectual as I understand that vocation to keep the nuances alive. A public intellectual is not a paid publicist, not a spinner, not in the pocket of a narrowly defined purpose. It is, of course, another temptation of the public intellectual to cozy up to that which he or she should be evaluating critically. I think, perhaps, too many White House dinners can blunt the edge of criticism.
A way I like to put it, when you're thinking about models for this activity, is this: Sartre or Camus? An intellectual who was willing to look the other way--indeed, shamefully, to explain away the existence of slave-labor camps, the gulags--in the service of a grand world-historic purpose; or, by contrast, an intellectual who told the truth about such atrocities, knowing that he would be denounced, isolated, pronounced an ally of the CIA and of capitalistic oppressors out to grind the faces of the poor.
There are times when a public intellectual must say "neither/nor," as did Camus. Neither the socialism of the gallows, in his memorable phrase, nor a capitalist order riddled with inequalities and shamed by the continuing existence, in his era, the era of which I speak, of legally sanctioned segregation. At the same time, this neither/nor did not create a world of moral equivalence. Camus was clear about this. In one regime, one order, one scheme of things, one could protest, one could organize to fight inequities, and in the other one wound up disappeared or dead.
Let me mention just one issue that I took on several times when I wrote, on an alternating basis, a column called "Hard Questions" for The New Republic. I'm referring to the question of genetic engineering, genetic enhancement, the race toward a norm of human perfection to be achieved through manipulation of the very stuff of life. How do you deal with an issue like this? Here, it seems to me, the task of the public intellectual in this society at this time--because we're not fighting the issues that were fought in the mid-twentieth century--is to join others in creating a space within which such matters can be articulated publicly and debated critically.
At present, the way the issue is parsed by the media goes like this: The techno-enthusiasts announce that we're one step closer to genetic utopia. The New York Times calls up its three biological ethicists to comment. Perhaps one or two religious leaders are asked to wring their hands a little bit--anyone who's really a naysayer with qualms about eugenics, because that is the direction in which we are heading, is called a Luddite. Case closed, and every day we come closer to a society in which, even as we intone multiculturalism as a kind of mantra, we are narrowing the definition of what is normatively human as a biological ideal. That's happening even as we speak; that is, we're in real danger of reducing the person to his or her genotype, but if you say that, you're an alarmist--so that's what I am.
This leads me to the following question: Who has authority to pronounce on what issue, as the critical issues change from era to era? In our time and place, scientists, technology experts and dot-com millionaires seem to be the automatic authorities on everything. And everybody else is playing catch-up.
So the public intellectual needs, it seems to me, to puncture the myth-makers of any era, including his own, whether it's those who promise that utopia is just around the corner if we see the total victory of free markets worldwide, or communism worldwide or positive genetic enhancement worldwide, or mouse-maneuvering democracy worldwide, or any other run-amok enthusiasm. Public intellectuals, much of the time at least, should be party poopers. Reinhold Niebuhr was one such when he decided that he could no longer hold with his former compatriots of the Social Gospel movement, given what he took to be their dangerous naïveté about the rise of fascism in Europe. He was widely derided as a man who once thought total social transformation in the direction of world peace was possible, but who had become strangely determined to take a walk on the morbid side by reminding Americans of the existence of evil in the world. On this one, Niebuhr was clearly right.
When we're looking around for who should get the blame for the declining complexity of public debate, we tend to round up the usual suspects. Politicians usually get attacked, and the media. Certainly these usual suspects bear some responsibility for the thinning out of the public intellectual debate. But I want to lift up two other candidates here, two trends that put the role of public intellectuals and the very existence of publics in the John Dewey sense at risk. The first is the triumph of the therapeutic culture, with its celebration of a self that views the world solely through the prism of the self, and much of the time a pretty "icky" self at that. It's a quivering sentimental self that gets uncomfortable very quickly, because this self has to feel good about itself all the time. Such selves do not make arguments, they validate one another.
A second factor is the decline of our two great political parties. At one point the parties acted not just as big fundraising machines, not just as entities to mobilize voters but as real institutions of political and civic education. There are lots of reasons why the parties have been transformed and why they no longer play that role, but the results are a decline in civic education, a thinning out of political identification and depoliticization, more generally.
I'm struck by what one wag called the herd of independent minds; by the fact that what too often passes for intellectual discussion is a process of trying to suit up everybody in a team jersey so we know just who should be cheered and who booed. It seems to me that any public intellectual worth his or her salt must resist this sort of thing, even at the risk of making lots of people uncomfortable.
Donatich: Stephen, can you talk about the thinning out of political identity? Who might be responsible for either thickening or thinning the blood of political discourse? What would you say, now that we're talking about the fragmentation of separate constituencies and belief systems, is the role of religion and faith in public life?
Stephen Carter: You know that in the academy the really bad word is "popularizer"--a mere popularizer, not someone who is original, which of course means obscure, or someone who is "deeply theorized," which is the other phrase. And to be deeply theorized, you understand, in academic terms today, means to be incapable of uttering a word such as "poor." No one is poor. The word, the phrase now, as some of you may know, is "restricted access to capital markets." That's deeply theorized, you see. And some of us just say poor, and that makes us popularizers.
A few years ago someone who was really quite angry about one of my books--and I have a habit of making people angry when I write books--wrote a review in which he challenged a statement of mine asserting that the intellectual should be in pursuit of truth without regard to whether that leaves members of any particular political movement uncomfortable. He responded that this was a 12-year-old nerd's vision of serious intellectual endeavor.
And ever since then I thought that I would like to write a book, or at least an essay, titled something like Diary of an Intellectual Nerd, because I like that idea of being somewhat like a 12-year-old. A certain naïveté, not so much about great ideas and particularly not about political movements but about thought itself, about truth itself. And I think one of the reasons, if the craft of being intellectual in the sense of the scholar who speaks to a large public is in decline, is cynicism. Because there's no sense that there are truths and ideas to be pursued. There are only truths and ideas to be used and crafted and made into their most useful and appropriate form. Everyone is thought to be after something, everyone is thought to have some particular goal in mind, independent of the goal that he or she happens to articulate. And so, a person may write a book or an article and make an argument, and people wonder, they stand up in the audience and they say, "So, are you running for office, or are you looking for some high position?" There's always some thought that you must be after something else.
One of the reasons, ideally, you'd think you would find a lot of serious intellectual endeavor on university campuses is precisely because people have tenure and therefore, in theory, need not worry about trying to do something else. But on many, many campuses you have, in my judgment, relatively little serious intellectual endeavor in the sense of genuinely original thinking, because even there, people are worried about which camp they will be thought to be in.
You can scarcely read a lot of scholarship today without first wading through several chapters that lay out the ground by apologizing in advance to all the constituencies that may be offended, lest one be thought in the other camp. That kind of intellectual activity is not only dangerous; it's unworthy in an important sense, not worthy of the great traditions of intellectual thought.
There's a tendency sometimes toward an uneasy equation that puts serious intellectual activity over here and religion over there, and holds that these are, in some sense, at war. The notions that people of deep faith are plainly anti-intellectual and that serious intellectuals are plainly antireligious bigots are two very serious stereotypes held by very large numbers of people. I'm quite unembarrassed and enthusiastic about identifying myself as a Christian and also as an intellectual, and I don't think there's any necessary war between those two, although I must say, being in an academic environment, it's very easy to think that there is.
I was asked by a journalist a few years ago why it was that I was comfortable identifying myself, as I often did, as a black scholar or an African-American scholar and hardly ever identified myself as a Christian scholar. And surely the reason is that there are certain prejudices on campus suggesting that is not a possible thing to be or, at least, not a particularly useful combination of labels.
And yet, I think that the tradition of the contribution to a public-intellectual life by those making explicitly religious arguments has been an important and overlooked one, and I go back for my model, well past Niebuhr, into the nineteenth century. For example, if you looked at some of the great preachers of the abolitionist movement, one thing that is quite striking about them is, of course, that they were speaking in an era when it was commonly assumed that people could be quite weighty in their theology and quite weighty in their intellectual power. And when you read many of the sermons of that era, many of the books and pamphlets, you quickly gain a sense of the intellectual power of those who were pressing their public arguments in explicitly Christian terms.
Nowadays we have a historical tendency to think, "Oh, well, it's natural they spoke that way then, because the nation was less religiously diverse and more Christian." Actually, the opposite was probably true, as historians now think--the nation is probably less religiously diverse now than it was 150, 175 years ago, when religions were being founded really quite swiftly. And most of those swiftly founded religions in the 1820s to the 1830s have died, but many of them had followers in great number before they did.
America's sense of itself as a so-called Christian nation, as they used to say in the nineteenth century, didn't really grow strong until the 1850s or 1860s. So you have to imagine the abolitionist preachers of the eighteenth and early nineteenth centuries, preaching in a world in which it could be anything but certain that those who were listening to them were necessarily co-religionists.
In this century too, we have great intellectual preachers who also spoke across religious lines. Martin Luther King is perhaps the most famous of them, even though sometimes people try to force him into an intellectual straitjacket by insisting, with no evidence whatsoever, that he was actually simply making secular moral arguments, and that religion was a kind of smokescreen. If you study his public ministry and look at his speeches, which were really sermons, as a group, you easily discern that that's not true.
And yet, the religiosity of his language gave it part of its power, including the power to cross denominational lines, to cross the lines between one tradition and another, and to cross lines between religion and nonreligion. For the religiously moved public intellectual, the fact is that there are some arguments that simply lose their power or are drained of their passion when they're translated into a merely secular mode. The greatness of King's public oratory was largely a result of its religiosity and its ability to touch that place in the human heart where we know right from wrong; it would not have been as powerful, as compelling, had it lacked that religious quality.
Now, I'm not being ahistorical, I'm not saying, "Oh, therefore the civil rights movement would not have happened or we would still have racial segregation today"--that's not the point of my argument. The point is that his religiosity did not detract from his intellectual power; rather, it enhanced it. This is not to say, of course, that everyone who makes a religious argument in public life is speaking from some powerful intellectual base. But it does suggest we should be wary of the prejudices that assume they can't be making serious arguments until they are translated into some other form that some may find more palatable. In fact, one of my great fears about the place we are in our democracy is that, religion aside, we have lost the ability to express and argue about great ideas.
Donatich: Professor Carter has made a career out of illustrating the effect and protecting the right of religious conviction in public thought. Herbert Gans, on the other hand, is a self-proclaimed, enthusiastic atheist. As a social scientist who has taught several generations of students, how does a public intellectual balance the professional need for abstract theory with the need to remain relevant and contribute some practical utility to the public discourse?
Herbert Gans: I'm so old that the word "discourse" hadn't been invented yet! I am struck by the pessimism of this panel. But I also notice that most of the names of past public intellectuals--and I knew some of them--were, during their lifetime, people who said, "Nobody's listening to me." Erich Fromm, for example, whom I knew only slightly and through his colleagues, was sitting in Mexico fighting with psychoanalysts who didn't think politics belonged in the dialogue. Lewis Mumford was a teacher of mine, and he certainly felt isolated from the public, except on architecture, because he worked for The New Yorker.
So it seems to me it's just the opposite: that the public intellectual is alive and well, though perhaps few are of the magnitude of the names mentioned. If I did a study, I'd have to define what an intellectual is, and I notice nobody on the panel has taken that one on. And I won't either. The public intellectuals that exist now may not be as famous, but in fact there are lots of them. And I think at least on my campus, public intellectuals are becoming celebrities. Some of them throw stones and get themselves in trouble for a few minutes and then it passes. But I think that really is happening, and if celebrities can exist, their numbers will increase.
One of the reasons the number is increasing is that public intellectuals are really pundits. They're the pundits of the educated classes, the pundits of the highbrow and the upper-middlebrow populations, if you will. And the moment you say they're pundits, then you can start comparing them to other pundits, of which we have lots. And there are middlebrow pundits and there are lower-brow pundits, there are serious pundits, there are not-so-serious pundits.
Some of the columnists in the newspapers and the tabloid press--journalists without a PhD--are public intellectuals. There are pundits who are satirical commentators, and there are a significant number of people who get their political news from Leno and Letterman. And, of course, the pollsters don't really understand this, because what Leno and Letterman supply is a satirical take on the news.
Most public intellectuals function as quote-suppliers to legitimize the media. Two or three times a week, I get called by journalists and asked whether I will deliver myself of a sociological quote to accompany their articles, to legitimate, in a sense, the generalizations that journalists make and have to make, because they've got two-hour deadlines. Which means that while a few public intellectuals are self-selected, most of us get selected anyway. You know, if no journalist calls for a quote, then I'm not a public intellectual; I just sit there writing my books and teaching classes.
I did a book on the news media and hung out at Newsweek and the other magazines. And at Newsweek, they had something they called an island, right in the main editorial room. On the island were names of people who would now be called public intellectuals, the people whom Newsweek quoted. And the rules were--and this is a bit like Survivor--every so often people would be kicked off the island. Because the editors thought, and probably rightly, that we as readers were going to get tired of this group of public intellectuals. So a new group was brought in to provide the quotes. And then they were kicked off.
The public intellectuals come in two types, however. First there are the ones that everyone has been talking about, the generalists, the pundits, as I think of them; and second are the disciplinary public intellectuals. The public sociologists, the public economists, the public humanists--public, plus a discipline. And these are the people who apply the ideas from their own disciplines to a general topic. And again, to some extent, this is what I do when I'm a quote-supplier, and I'm sure my fellow panelists are all functioning as quote-suppliers too.
But the disciplinary public intellectuals show that their disciplinary insights and their skills can add something original to the public debate. That, in other words, social scientists and humanists can indeed grapple with the issues and the problems of the real world. The disciplinary public intellectuals, like other public intellectuals, have to write in clear English. This is a rarity in the academy, unfortunately--which makes disciplinary public intellectuals especially useful. And they demonstrate the public usefulness of their disciplines, which is important in one sense, because we all live off public funds, directly or indirectly, and we need to be able to demonstrate every so often that we're doing something useful for taxpayers. I cannot imagine there are very many legislators in this country who would consider an article in an academic journal as proof that we're doing something useful or proof that we're entitled to some share of the public budget.
Disciplinary public intellectuals are useful in another way, too: They are beloved by their employers, because they get these employers publicity. My university has a professionally run clipping service, and every time Columbia University is mentioned, somebody clips and files the story. And so every time somebody quotes me I say, "Be sure to mention Columbia University," because I want to make my employers happy, even though I do have tenure. Because, if they get publicity, they think they're getting prestige, and if they get prestige, that may help them get students or grant money.
There are a number of hypotheses on this; I'm not sure any of them are true--whether quote-supplying provides prestige, whether prestige helps to get good students, whether good students help to get grant money. There is a spiral here that may crash. But meanwhile, they think that if we're getting them publicity, we're being useful. And, of course, public social scientists and those in the humanities are, in some respects, in short supply, in part because their colleagues stigmatize them as popularizers. (They don't call them journalists, which is a dirty word in the ivory tower.)
It's also fair to say that in the newsrooms, "academic" is a dirty word. If you've ever paid attention, journalists always cite "the professor," and it doesn't matter who it is, and it doesn't even matter if they're friends of the professor. But it's always "the professor," which is a marvelous way of dehumanizing us professors. So there's this love/hate relationship between journalists and academics that's at work here. All of which means, yes, of course, it does take a bit of courage to be a public intellectual or a disciplinary public intellectual. If you turn your back on the mainstream of the academy, that's the way you get a knife in your back, at times.
Donatich: Steven Johnson has used the web and Internet energetically and metaphorically. How will the Internet change public dialogue? What are the opportunities of public conversation that this new world presents?
Steven Johnson: One of the problems with the dot-com-millionaire phenomenon--which may, in fact, be starting to fall behind us--is that it really distracted a huge amount of attention from a lot of other very interesting and maybe more laudable things that were happening online. There was kind of a news vacuum that sucked everything toward stories about the 25-year-old guy who just made $50 million, and we lost sight of some of the other really progressive and important things that were happening because of the rise of the web.
I'm of a generation that came of age at precisely that point that Russell Jacoby talked about and wrote about, during the late eighties, when the academy was very much dominated by ideas from France and other places, where there was a lot of jargon and specialization, and it was the heyday of poststructuralism and deconstruction in the humanities. Which leads me to sometimes jokingly, sometimes not, describe myself as a "recovering semiotics major."
I think that I came to the web and to starting Feed, and to writing the book that I wrote about Internet culture and interface culture, as a kind of refugee from conversations like the one in the academy, when I was a graduate student, in which a classmate asked the visiting Derrida a ten- or fifteen-minute, convoluted Derridean question on his work and the very possibility of even asking a question. And after a long pause, Derrida had to admit, "I'm sorry, I do not understand the question."
The web gave me an unlikely kind of home in that there were ideas and there were new critical paradigms that had been opened up to me from the academic world. But it was clear that you couldn't write about that world, you couldn't write using those tools with that kind of language and do anything useful. And it was very hard to imagine a life within the university system that was not going to inevitably push me toward conversations like that with Derrida.
So the good news, I think, is that my experience is not unique. In fact, there's been a great renaissance in the last five years of the kind of free-floating intellectual that had long been rumored to be on his or her last legs. It's a group shaped by ideas that have come out of the academy but is not limited to that. And I think publications like Feed--to pat myself on the back--Hermenaut and Suck are all good examples of a lively new form of public intellectualism that is not academic in tone.
The sensibility of that group is very freethinking--not particularly interested in doctrinaire political views, very eclectic in taste, very interested in the mix of high and low culture, much more conversational in tone--funny, even. Funny is an interesting component here. I mean, these new writers are funny in a way that, you know, Adorno never was. And they're very attentive to technology changes, maybe as interested in technology and changes in the medium as they are in intellectual fashions. If there's a role model that really stands out for this generation, it's somebody like Walter Benjamin. You know, a sense of an interest that puts together groups of things you wouldn't necessarily expect to see put together in the same essay.
How does the web figure into all of this? Why did these people show up on the web? I think one of the things that has started happening--actually, this is just starting to happen--is that in addition to these new publications, you're starting to see something on the web that is unique to it: the ability to center your intellectual life, in all of its different appearances, in your own "presence" online, on the home page, so that you can actually have the equivalent of an author bio--except that it's dynamically updated all the time, and there are links to everything you're doing everywhere. I think we've only just begun to exploit it as a way of combating the problem of the free-floating intellectual, which is that you're floating all over the place and you don't necessarily have a home, and your ideas are appearing in lots of different venues and speaking to lots of different audiences.
The web gives you a way of rounding up all those diverse kinds of experiences and ideas--and linking to them. Because, of course, the web is finally all about linking--in a way that I think nothing has done quite as well before it. And it also involves a commitment to real engagement with your audience that perhaps public intellectuals have talked a lot about in the past, but maybe not lived up to as much as they could have.
Some of this is found in the new formats that are available online in terms of how public dialogue can happen. I'm sure many of you have read these and many of you may have actually participated in them, but I'm a great advocate for this kind of long-format, multiparticipant discussion thread that goes on over two or three weeks. Not a real-time live chat--a disaster in terms of quality of discourse, inevitably devolving into the "What are you wearing?" kind of intellectual question. But rather, the conversations with four or five people where each person has a day or half a day to think up their responses, and then write 500- to 1,000-word posts. We've done those since we started at Feed. Slate does a wonderful job with them. And it's a fantastic forum. It's very engaged, it's very responsible, it's very dialogic and yet also lively in a conversational way. But, because of the back and forth, you actually can get to places that you sometimes couldn't get in a stand-alone 10,000-word essay.
Donatich: Professor Gans, if you had trouble with the word "discourse," I'm wondering what you'll do with "dialogic."
Johnson: I said I was recovering! That's the kind of thing that should be happening, and it seems to me that in five or ten years we'll see more and more of people who are in this kind of space, having pages that are devoted to themselves and carrying on these conversations all the time with people who are coming by and engaging with them. And I think that is certainly a force for good. The other side is just the economics of being able to publish either your own work or a small magazine. I mean, we started Feed with two people. We were two people for two years before we started growing a little bit. And the story that I always tell about those early days is that we put out the magazine and invited a lot of our friends and some people we just knew professionally to contribute. About three months, I guess, after Feed launched, Wired came out with a review of it. And they had this one slightly snippy line that said, "It's good to see the East Coast literary establishment finally get online." Which is very funny, to be publishing this thing out of our respective apartments. I had this moment where I was looking around my bedroom for the East Coast literary establishment--you open the closet door, and "Oh, Norman Mailer is in there. 'Hey, how's it going!'" And so there can be a kind of Potemkin Village quality online. But I think the village is thriving right now.
Donatich: Christopher Hitchens, short of taking on what a public intellectual might or might not be, will you say something about the manners or even the mannerisms of the public intellectual and why disagreement is important to our progress?
Christopher Hitchens: I've increasingly become convinced that in order to be any kind of a public-intellectual commentator or combatant, one has to be unafraid of the charges of elitism. One has to have, actually, more and more contempt for public opinion and for the way in which it's constructed and aggregated, and polled and played back and manufactured and manipulated. If only because all these processes are actually undertaken by the elite and leave us all, finally, voting in the passive voice and believing that we're using our own opinions or concepts when in fact they have been imposed upon us.
I think that "populism" has become probably the main tactical discourse, if you will, the main tactical weapon, the main vernacular of elitism. Certainly the most successful elitist in American culture now, American politics particularly, is the most successful inventor or manipulator, or leader of populism. And I think that does leave a great deal of room in the public square for intellectuals to stand up, who are not afraid to be thought of as, say, snobbish, to pick a word at random. Certainly at a time when the precious term "irony"--precious to me, at any rate--has been reduced to a form of anomie or sarcasm. A little bit of snobbery, a little bit of discrimination, to use another word that's fallen into disrepute, is very much in order. And I'm grateful to Professor Carter for this much, at least, that he drew attention to language. And particularly to be aware of euphemism. After all, this is a time when if you can be told you're a healer, you've probably won the highest cultural award the society can offer, where anything that can be said to be unifying is better than anything that can be described as divisive. Blush if you will, ladies and gentlemen, I'm sure at times you too have applauded some hack who says he's against or she's against the politics of division. As if politics wasn't division by definition.
The New York Times, which I'm sure some of you at least get, if you don't read, will regularly regale you in this way--check and see if you can confirm this. This will be in a news story, by the way, not a news analysis. About my hometown of Washington, for example: "Recently there was an unpleasant outbreak of partisanship on Capitol Hill, but order seems to have been restored, and common sense and bipartisanship regained." I've paraphrased only slightly. Well, what is this in translation? "For a while back there it looked as if there'd be a two-party system. But, thank God, the one-party system has kicked back in."
Now, the New York Times would indignantly repudiate--I'm coming back to this, actually--the idea that it stood for a one-party system or mentality, but so it does. And its language reveals it. So look to the language. And that is, in fact, one of the most essential jobs of anyone describing themselves as an intellectual.
Against this, we have, of course, the special place reserved for the person who doesn't terribly want to be a part of it, doesn't feel all that bipartisan, who isn't in an inclusive mood. Look at the terms that are used for this kind of a person: gadfly, maverick and, sometimes, bad boy. Also bad girl, but quite often bad boy, for some reason. Loose cannon, contrarian, angry young man.
These are not hate words, by any means, nor are they exactly insulting, but there's no question, is there, that they are fantastically and essentially condescending. They're patronizing terms. They are telling us, affectionately enough, that pluralism, of course, is big enough, capacious enough, tolerant enough to have room for its critics.
The great consensus, after all, probably needs a few jesters here and there, and they can and should be patted upon the head, unless they become actually inconvenient or awkward or, worst of all--the accusation I have myself been most eager to avoid--humorless. One must be funny, wouldn't you say? Look to the language again. Take the emaciated and paltry manner and prose in which a very tentative challenge to the one-party system, or if you prefer, the two-party one, has been received. I'm alluding to the campaign by Ralph Nader.
The New York Times published two long editorials, lead editorials, very neatly inverting the usual Voltairean cliché. These editorials say: We don't particularly disagree with what Ralph Nader says, but we violently disagree with his right to say it. I've read the editorials--you can look them up. I've held them up to the light, looked at them upside down, inside out, backwards--that's what they say. This guy has no right to be running, because the electorate is entitled to a clear choice between the two people we told you were the candidates in the first place.
I find this absolutely extraordinary. You're told you must pick one of the available ones: "We've got you some candidates--what more do you want? We got you two, so you have a choice. Each of them has got some issues. We've got some issues for you as well. You've got to pick." A few people say, "Well, I don't feel like it, and what choice did I have in the choice?" You're told, "Consider the alternatives." The first usage of that phrase, as far as I know, was by George Bernard Shaw, when asked what he felt like on his 90th birthday. And he said, "Considering the alternatives...." You can see the relevance of it. But in this case you're being told, in effect, that it would be death to consider the alternatives.
Now, to "consider the alternatives" might be a definition of the critical mind or the alive intelligence. That's what the alive intelligence and the critical mind exist to do: to consider, tease out and find alternatives. It's a very striking fact about the current degeneration of language, that that very term, those very words are used in order to prevent, to negate, consideration of alternatives. So, be aware. Fight it every day, when you read gunk in the paper, when you hear it from your professors, from your teachers, from your pundits. Develop that kind of resistance.
The word "intellectual" is of uncertain provenance, but there's no question when it became a word in public use. It was a term of abuse used by those who thought that Capt. Alfred Dreyfus was guilty in 1898 to describe those who thought that he was probably innocent. It was a word used particularly by those who said that whether Captain Dreyfus was innocent or not, that wasn't really the point. The point was, would France remain an orderly, Christian, organic, loyal society? Compared to that, the guilt or innocence of Captain Dreyfus was irrelevant. They weren't saying he was necessarily guilty, they were saying, "Those who say he is innocent are not our friends. These are people who are rootless, who have no faith, who are unsound, in effect." I don't think it should ever probably lose that connotation. And fortunately, like a lot of other words that were originally insults--I could stipulate "Impressionist," which was originally a term of abuse, or "suffragette" or "Tory," as well as a number of other such terms--there was a tendency to adopt them in reaction to the abuse and to boast of them, and say, "Well, all right, you call me a suffragette, I'll be a suffragette. As a matter of fact, I'll be an Impressionist."
I think it would be a very sad thing if the word "intellectual" lost its sense that there was something basically malcontent, unsound and untrustworthy about the person who was claiming the high honor of the title. In politics, the public is the agora, not the academy. The public element is the struggle for opinion. It's certainly not the party system or any other form whereby loyalty can be claimed of you or you can be conscripted.
I would propose for the moment two tasks for the public intellectual, and these, again, would involve a confrontation with our slipshod use of language. The first, I think, in direct opposition to Professor Carter, is to replace the rubbishy and discredited notions of faith with scrutiny, by looking for a new language that can bring us up to the point where we can discuss shattering new discoveries about, first, the cosmos, in the work of Stephen Hawking, and the discoveries of the Hubble telescope--the external world--and, second, no less shattering, the discovery about our human, internal nature that has begun to be revealed to us by the unraveling of the chains of DNA.
At last, it's at least thinkable that we might have a sense of where we are, in what I won't call creation. And what our real nature is. And what do we do? We have President Clinton and the other figures in the Human Genome Project appear before us on the day that the DNA string was finally traced out to its end, and we're told in their voices and particularly the wonderful lip-biting voice of the President, "Now we have the dictionary which God used when he was inventing us." Nothing could be more pathetic than that. This is a time when one page, one paragraph, of Hawking is more awe-inspiring, to say nothing of being more instructive, than the whole of Genesis and the whole of Ezekiel. Yet we're still used to babble. For example, in The Eighteenth Brumaire of Louis Bonaparte, Karl Marx says, quite rightly, I think, "When people are trying to learn a new language, it's natural for them to translate it back into the one they already know." Yes, that's true. But they must also transcend the one they already know.
So I think the onus is on us to find a language that moves us beyond faith, because faith is the negation of the intellect: faith supplies belief in preference to inquiry, belief in place of skepticism, in place of the dialectic, in place of the disorder and anxiety and struggle that are required in order to claim that the mind has any place in these things at all.
I would say that, because the intellectual has some responsibility, so to speak, for those who have no voice, a very high task to adopt now would be to set oneself, and to attempt to set others, utterly and contemptuously and critically and furiously against the now almost daily practice in the United States of human sacrifice. By which I mean the sacrifice, the immolation, of men and women on death row in the system of capital punishment. Something that has become an international as well as a national disgrace. Something that shames and besmirches the entire United States, something that is performed by the professionalized elite in the name of an assumed public opinion. In other words, something that melds the worst of elitism and the absolute foulest of populism.
People used to say, until quite recently, using the words of Jimmy Porter in Look Back in Anger, the play that gave us the patronizing term "angry young man"--well, "there are no good, brave causes anymore." There's nothing really worth witnessing or worth fighting for, or getting angry, or being boring, or being humorless about. I disagree and am quite ready to be angry and boring and humorless. These are exactly the sacrifices that I think ought to be exacted from oneself. Let nobody say there are no great tasks and high issues to be confronted. The real question will be whether we can spread the word so that arguments and debates like this need not be held just in settings like these but would be the common property of anyone with an inquiring mind. And then, we would be able to look at each other and ourselves and say, "Well, then perhaps the intellectual is no longer an elitist."
December 8, 2000: It was twenty years ago today that Mark David Chapman shot and killed John Lennon outside the Dakota on West 72nd Street in New York City, bringing whatever was left of the sixties to a definitive and miserable end. Yet Lennon lives on--not just for his now-graying fans, not just for younger kids discovering the Beatles, but in some unexpected and surprising ways.
Case in point: At the Republican National Convention in Philadelphia this past August, as Dick Cheney stepped up to the podium to accept the party's nomination as vice presidential candidate, the band struck up a spirited version of Lennon's song "Come Together." This is the one on the Abbey Road album that begins "Here come ol' flattop" (Cheney of course is mostly bald), and continues, "One thing I can tell you is you got to be free"--a sixties sentiment that meant something quite different from tax cuts for the rich.
Cheney probably didn't know that Lennon started writing "Come Together" as a campaign song--for Timothy Leary's planned 1970 campaign for California governor against Ronald Reagan. Leary never used the song, but Lennon sang it live onstage at Madison Square Garden in 1972 in the midst of another presidential campaign, when Nixon was trying to have him deported to silence a prominent voice of the antiwar movement. Lennon changed the title line to "Come together--stop the war--right now!" and the audience cheered wildly.
The Democrats also played a Lennon song at their convention: They used "Imagine" as the theme of a tribute to Jimmy Carter. While the giant video showed Jimmy and Rosalynn hammering nails and fondling small children, the easy-listening version of Lennon's song omitted the words "Imagine there's no heaven/it's easy if you try/No hell below us/Above us only sky"--not really appropriate for America's first born-again Baptist President.
"Imagine" is a utopian anthem, and the utopian imagination was always a keystone of sixties New Left thought, distinguishing it from the bread-and-butter politics of traditional working-class socialism. "Power to the imagination" was a key slogan written on the walls in May '68. Today the country is full of billboards urging people to "Dial 1-800-imagine." I tried it. You don't get John Lennon singing "Imagine no possessions." Instead you get AT&T Wireless Services: Press 1 to upgrade your wireless plan, press 2 to inquire about new service, press 3 to inquire about an order and, of course, press 4 to hear these options again.
A search of the Nexis database found these variants on Lennon's "Imagine no possessions": a Republican who said "Imagine no estate tax," a television critic who wrote "Imagine no more Regis," a technophobe who wrote "Imagine no computers" and a Democratic pundit who headlined an opinion piece, "Imagine There's No Nader."
Lennon lyrics appear in print in some other unlikely places. When Time put Bill Clinton on its cover at the beginning of his first term, the cover line was "You Say You Want a Revolution." Two years later, when the Republicans won control of the House, the New York Times ran an opinion piece by R.W. Apple Jr. headlined "You Say You Want a Devolution." And just a few months ago, after Joe Lieberman changed his mind about privatizing Social Security, The New Republic headline read "You say you want an Evolution."
The headline writers probably had forgotten that Lennon wrote "Revolution" in response to the May '68 uprisings in Paris, criticizing student radicals for advocating violence. He recorded two versions of the song. The single--the "fast" version--was released first, in the United States in August 1968, shortly after the police riot at the Democratic National Convention in Chicago. After the opening line--"You say you want a revolution"--it concluded, "count me out." The radical press was outraged. Ramparts called the song "a betrayal"; New Left Review called it "a lamentable petty bourgeois cry of fear." Time, on the other hand, reported that the Beatles had criticized "radical activists the world over," which Time found "exhilarating." The second, "slow" version of the song was released on the White Album two months later. Now, after the line "count me out," Lennon added another word: "in." He later explained, "I put both in because I wasn't sure." A year later he was singing "Power to the People."
Lennon's "Give Peace a Chance" was sung by half a million antiwar demonstrators at the Washington Monument in 1969, but since then it's come in for some revisionism. I remember militant friends back in those days singing "Give the dictatorship of the proletariat a chance." Then there's "Give War a Chance," which pops up every once in a while--the establishment journal Foreign Affairs used it as the title of a 1999 article by Edward Luttwak arguing against US intervention in local conflicts. Frontline broadcast a story on the Balkans in 1999 with the same title, and P.J. O'Rourke used Give War a Chance as the title for a book that became a bestseller. On the other hand, none other than Trent Lott uttered the words "give peace a chance" on the floor of the Senate--talking about Kosovo. Finally, a company called Peace Software (www.peace.com) is using the slogan "Give Peace a Chance."
Lennon's most intense and personal post-Beatle song, "God," a very slow track on his first solo album, contains a litany that concluded, "I don't believe in Beatles." The New York Times ran a full-page interview in September with Philip Leider, the founding editor of ArtForum, that included his own personal version of the lyrics, which took up twenty-three lines of our newspaper of record. Warhol came first: "I don't beleeve in Andy." Then: "I don't beleeve in Haring"; "I don't beleeve in Fischl"; "I don't beleeve in Koons"; and so on through nineteen more current art stars.
Several of Lennon's most memorable lines have not been appropriated by pundits or Op-Ed types: "Instant Karma's gonna get you" remains untouched, at least according to Nexis, and thus far nobody has found a way to use "I am the walrus, goo-goo g'joob." But aside from these notable exceptions, the conclusion is clear: John Lennon may be gone, but twenty years after his death his words and ideas are here, there and everywhere.
To buy or not to buy turns out to have been the question of the century in America--Just Do It or Just Say No. And in the past fifteen years, consumer society has moved to the center of historical inquiry as well. It began with the social history of commercial culture and the advertising industry, in books such as Kathy Peiss's Cheap Amusements: Working Women and Leisure in Turn-of-the-Century New York (1986) and Roland Marchand's Advertising the American Dream (1985). Drawing inspiration from the pioneering anthropological explorations of Dick Hebdige (Subculture: The Meaning of Style, 1979), Arjun Appadurai (The Social Life of Things, 1988) and, especially, Mary Douglas and Baron Isherwood (The World of Goods, 1979), investigators then turned to the cultural history of how ordinary people use and assign meanings to commodities. A good example of this genre is Alison Clarke's Tupperware: The Promise of Plastic in 1950s America (1999). In recent works--such as Robert Collins's More: The Politics of Economic Growth in Postwar America (2000) and Alan Brinkley's The End of Reform: New Deal Liberalism in Recession and War (1995)--they have studied the political history of how nation-states promote and foster particular regimes of consumption. Where once consumption was deemed relevant only to the history of popular culture, in other words, it is now seen as intertwined with the central themes of American history, touching as it does on economics, politics, race relations, gender, the environment and other important topics.
Gary Cross, a professor at Penn State University and a pioneering and prolific historian of Europe and America, has explored the social, cultural and political dimensions of consumption before. In the past decade, he has published a half-dozen books on topics ranging from the history of leisure and working-class commercial amusements to the material culture of children's toys. Cross may study leisure, but his scholarship suggests that he doesn't take a whole lot of time to participate in consumer society. Fortunately, his work ethic has enabled the rest of us to understand our consumer ethic with clarity and historical perspective. Indeed, An All-Consuming Century displaces Daniel Horowitz's still-impressive but less wide-ranging The Morality of Spending (1985) as the best survey yet written of the history of modern American consumer society. Much more than a summary of recent scholarship (although it performs this task admirably), it is an informed, balanced, thoughtful and surprisingly passionate meditation on the making and meaning of our society. Avoiding the extremes of celebration and condemnation that too often pass for analysis, Cross's searching book is imbued with a generous concern for the revival of an active, democratic and participatory public sphere.
According to Cross, a paradox lies at the heart of American consumer society: It has been both an ideological triumph and a triumph over politics. Although it may be "difficult for Americans to see consumerism as an ideology," this is, Cross argues, precisely how it functions. It is, in his words, the "ism that won," the quiet but decisive victor in a century of ideological warfare. Over the course of the twentieth century it became naturalized to such an extent that few citizens "consider any serious alternatives or modifications to it."
In describing this ideological victory, Cross eschews conspiratorial interpretations of advertising and business collusion and gives consumer society its due for concretely expressing "the cardinal political ideals of the century--liberty and democracy--and with relatively little self-destructive behavior or personal humiliation." It won, Cross believes, because in large measure it met people's basic needs, helped them to fit into a diverse society even as it enabled them to forge new understandings of personal freedom, and served to fulfill, rather than mock, people's desire for the pleasures of the material world.
In spite of its popularity and successes, Cross believes that the ascension of consumer society has come at great cost: the abrogation of public life in favor of private thrills. By valorizing the private over the public and the present over the past and future, consumer society has "allowed little space for social conscience" and truly democratic politics. Rather than shoring up civil society, consumerism has pretty much replaced it: "The very idea of the primacy of political life has receded" as individual acquisition and use of goods has become the predominant way that Americans--and, increasingly, the rest of the industrialized world--make meaning of their lives. The suggestion that there should be limits to commercialism--that there are sacred places where the market does not belong--is, according to Cross, no longer taken seriously in a society that equates commercialism with freedom. Moreover, by the end of the century, "there seemed to be no moral equivalent to the world of consumption." The politics of consumption, in Cross's view, makes alternative conceptions of the good life virtually unimaginable in large part because it encourages people to think about themselves in isolation from the rest of society and from their history. (Reading Cross's book, I was reminded of Edward Hopper's painting Nighthawks, in which a customer at an urban diner sits alone, utterly disconnected from the humanity that surrounds him.) If Cross ultimately loses sight of the paradoxical nature of American consumerism and concludes on this dark note, An All-Consuming Century nonetheless provides important resources for others to explore the democratic potential of consumer society.
The narrative unfolds both chronologically and analytically. Cross divides the development of modern consumer society into four periods: 1900-1930, 1930-1960, 1960-1980 and 1980 to the end of the century. In this breakdown, the first three decades of the century were a takeoff period, during which a number of crucial elements converged to make America a consumer society. Cross consistently overstates the degree to which nineteenth-century America was a "traditional" society, untainted by commercialism; many elements of consumer society were born in the market revolution of the early 1800s and the corporate revolution of the later nineteenth century. But he is right to single out important developments that transformed the country from what we might call a nineteenth-century society with consumerist features to a full-blown consumer society in the twentieth century. The keys were increases in leisure time and personal income on the demand side, along with new products and innovations in selling on the supply side.
New, nationally advertised, branded products became widely available and affordable after the turn of the century. These products alleviated material needs, but more than that, Cross astutely notes, they became markers of new feelings of "comfort and ease" and "new sensations of power and speed." Modern products like cigarettes, candy and soft drinks made the sensational available on a daily, indeed almost hourly, basis. Amusement parks like Coney Island and other "cheap amusements" also made the regular purchase of spectacular thrills affordable for working people. In the consumer society, the utilitarian was always mixed with the sensual. The embodiment of this mixture was, of course, the great symbol of early-twentieth-century consumer society, the automobile. Mass culture in this period was already characterized by an increasing number of what Cross calls "private pleasures," but, as he shows, it contributed to political and social changes as well: It blurred ethnic and class divisions and encouraged the children of immigrants to redefine themselves as members of a blended, multiethnic, if still racially segregated, youth culture.
The period 1930-1960 was one of consolidation in time of crisis. The constraints of the Great Depression and World War II led to a "frustrated consumerism more than a rejection of the capitalist system." Rather than blame the new consumerism, most policy-makers and indeed many ordinary Americans came to see "underconsumption" as the root cause of the slump. After the war, government policy encouraged the development of mass purchasing power rather than efforts to equalize the distribution of wealth. During the cold war, consumer society became "a positive answer to communism." In his 1959 "kitchen debate" with Nikita Khrushchev, Vice President Richard Nixon drove this point home by contrasting modern American appliances with outdated Soviet culinary technology. Despite the linkage in these years between consumption and freedom, Cross notes that the consumerism of the postwar years was not hedonistic but "domesticated," focused on the suburban home and the nuclear family. Signature developments of these years were Levittown, McDonald's and Holiday Inn, sites of responsible, respectable, family-oriented consumption.
From 1960 to 1980 consumer society faced a very different set of challenges but emerged stronger than ever. First, the counterculture challenged the very premises of consumerism, and in the 1970s, the specter of scarcity called into question the permanence of the cornucopia upon which consumer society depended. In spite of these challenges, "consumption became even more ubiquitous." Indeed, Cross suggests, the roots of the even more individualistic and socially fragmenting consumerism of the late twentieth century lay in part in the 1960s critique of consumerism: While countercultural figures critiqued conformity and idealized the "authentic self," many Americans sought to achieve this authenticity through consumption. Businesses began to modify the Fordist practice of mass production in favor of flexible production and segmented, demographically distinct markets. Drawing on the work of cultural critic Thomas Frank (rendered throughout the book as "Frank Thomas"), Cross writes that consumerism became "adaptable to the green and the hip." Similarly, during the energy crisis of the 1970s, those politicians who took the shortage to be the result of overconsumption were rebuked as naysayers. With great political success, Ronald Reagan attacked President Jimmy Carter for a speech in which Carter had the temerity to suggest that "owning things and consuming things does not satisfy our longing for meaning." Reagan called that 1979 "malaise" address un-American in its pessimism and its call for restraint.
The trend toward fragmented, individualistic consumption accelerated during the last two decades of the century, an era that Cross labels "markets triumphant." Radical faith in the virtues of the market led politicians like Reagan to put a moral gloss on the "unfettered growth of market culture in the 1980s." Government constraints of an earlier era, in the form of environmental and advertising regulation, weakened, and commerce entered unfettered into areas where it had previously been kept at arm's length: children's homes and classrooms. By century's end the "Victorian notion that some time and place should be free from commerce" seemed as quaint as a Currier and Ives lithograph. Cross, who has a knack for unearthing telling statistics, notes that "supermarkets carried about 30,000 different products in 1996, up from 17,500 in 1986 and about 9,000 in the mid-1970s." Even the all-time-high consumer debt--$1.25 trillion by 1997--did nothing to stop the belief that the future of American prosperity and freedom depended upon the continuing expansion of the realm of consumption. Indeed, shopping had become the nation's primary form of entertainment, and monuments to consumption like the gargantuan 4.2-million-square-foot Mall of America became havens for tourists from around the world.
In Cross's telling, the attractions and problems of consumer society are in effect one and the same: the cult of the new, immediate gratification and the valorization of "private pleasures." Consumerism is the "ism that won," owing to its ability not only to withstand challenges but, through a magical jujitsu, to co-opt them. Although initially formulated in terms neither celebratory nor condemnatory, Cross's story is ultimately one of declension. While he avoids the nostalgia of many commentators, there is little doubt that Cross finds contemporary consumer society to be a negative force: asocial, apolitical, amoral and environmentally dangerous. Whereas consumerism once helped integrate the diverse inhabitants of an immigrant nation in a youthful mass culture, by century's close, cynical marketers were happy to divide an equally multicultural nation into segmented demographic units based on "multiple and changing lifestyles." Thus the shift from an integrative, public-spirited popular culture in the early twentieth century to an increasingly privatized, solipsistic commercial culture of the late twentieth century. What was seductive in 1900--cornucopia and pleasure for the masses--became obscene by 2000, as a cultural stimulant turned into a dangerous narcotic.
An All-Consuming Century is one of the few indispensable works in the ever-expanding library of books on American consumer society. But in an otherwise rich overview the author has surprisingly little to say about the role of women, African-Americans and ethnic minorities (and nothing about regional variations) in the construction of consumer society. These are serious omissions. As admen and women's organizations recognized early on, women have performed the vast majority of the unpaid labor of consumer society: the shopping, budgeting and refashioning of older items. Cross notes that African-Americans were excluded from many of the benefits of the emerging mass culture, but he does not address the ways popular culture served to reinforce both the whiteness of the "new immigrants" from Eastern and Southern Europe--a skin privilege that was not yet fully acknowledged by the majority culture--and the otherness of Asian and Latino immigrants.
Nor does Cross discuss the attractions of nationwide retailers and national brands for African-Americans, who often took advantage of what the historian Edward Ayers has called the "anonymity and autonomy" made possible by the advent of the Sears catalogue (and chain stores in the nonsegregated North), whose mass customer base and "one price" system reduced the possibilities for racial discrimination that frequently accompanied visits to the corner store. For this group, the private pleasures occasionally afforded by the advent of national markets offered advantages over the public humiliations that so often accompanied local commerce.
Cross's relative neglect of women and minorities leads him to underestimate the importance of grassroots consumer activism as well, which has often been led by members of these groups. Meat boycotts, cost-of-living protests, "don't buy where you can't work" campaigns and sit-ins were integral to the development of American consumer society because they represented demands to expand the benefits of consumerism beyond a middle-class elite. One of the most important women's political organizations of the first half of the century, the National Consumers League, which pioneered the crusade for "ethical consumption" and labor rights, goes unmentioned. Cross stresses the ways marketers attempted to co-opt the civil rights movement, but he does not address the degree to which the demand for full participation in consumer society was a key ingredient in that crusade for social justice. By virtually ignoring these movements, Cross leaves out an important part of the story of consumer society--efforts to unite citizenship with consumption.
The critics of consumer society whom Cross discusses most often are proponents of what he calls the "jeremiad," the high-culture dismissal of mass culture as vulgar. He condemns the elitism and arrogance of such thinkers and is surely correct to note that their criticism had little impact on ordinary shoppers. Cross is less critical of the "simple living" tradition and calls the self-provisioning movement of the 1960s "the most positive aspect" of the counterculture. He argues that "the idea of the 'simple life,' perhaps never more than a daydream, had almost ceased being even a prick to the conscience," but he only briefly mentions the growing popularity of the "voluntary simplicity" movement, a topic addressed in more detail in Juliet Schor's The Overspent American (1998).
Cross also develops a persuasive critique of the consumer rights movement. While the Depression era saw the rise of groups like Consumers Union, which sought to make consumers a greater force against the power of business and advertisers, he notes that by focusing primarily on product quality and prices, many consumer rights groups have served only to reinforce "the individualism and the materialism of American consumption." This tradition of angry but apolitical individualism can still be found at innumerable websites, like starbucked.com, that highlight at great length the indignation of formerly loyal customers: "The sales clerk who sold me the machine was rude, then decidedly refused to hand over the free half pound of coffee given with every purchase of a Starbucks espresso machine...." The democratizing power of consumer demands for corporate responsibility is too often dissipated by such narrowly cast diatribes.
In spite of the failure of the jeremiad, the seeming irrelevance of simplicity and the individualization of the concept of consumer rights, Cross is too definitive about the nature of the "victory" of consumer society. Many Americans still recognize that however much advertisers and marketers attempt to cover it up, consumption is fundamentally a social and political act. So although it is true that "late twentieth century consumerism turned social problems into individual purchasing decisions," it is also the case that individual shopping decisions have frequently been viewed in the context of social problems. As consumer activists from the League of Women Shoppers in the 1930s through environmentalists today have pointed out, the goods that we buy leave ecological, labor and government "footprints." In spite of corporate attempts to fetishize goods, diligent activists like John C. Ryan and Alan Thein Durning of Northwest Environment Watch have described--and tried to estimate--the hidden social costs incurred by the purchase of quotidian products, including coffee and newspapers. The actions of students in the antisweatshop campaigns of recent years indicate that a growing number of consumers are looking behind the logo to determine the conditions under which the clothing they buy is made. As Naomi Klein has recently argued in No Logo: Taking Aim at the Brand Bullies, the ubiquity and importance of brands provides an opening for protesters who can threaten, through consumer boycotts and other actions, to sully corporate America's most valuable asset, the brand name. One teen in Klein's book puts it this way: "Nike, we made you. We can break you." Cross may decry the "inwardness of the personal computer," but the protests at the Seattle World Trade Organization and Washington International Monetary Fund meetings reveal that the Web creates alliances and expands social bonds.
The history of consumer activism--and its recent incarnations--shows that consumerism does not necessarily lead to an antipolitics of radical individualism.
Cross does put forth important arguments about the "excesses of consumer culture": the environmental degradation, the waste, the lack of free time and the sheer mind-numbing meaninglessness that accompany modern consumerism. But these must be balanced with the recognition that most Americans, especially those in the working class, have viewed the enjoyment of the fruits of consumer society as an entitlement, not a defeat. This should not be dismissed as false consciousness or "embourgeoisement." Far from allowing consumerist demands to erode political impulses, working people--through living-wage, union-label and shorter-hour campaigns--have consistently politicized consumption. Rather than pitting the culture of consumption against democracy, it will be important to continue this tradition of democratizing, rather than demonizing, the culture of consumption. In his assessment of the twentieth century's most influential "ism," Cross provides important warnings about the difficulties of such an effort. But in its stress on the paradoxes of consumer society--an emphasis that then too rapidly gives way to condemnation--An All-Consuming Century also provides lessons from history about the necessity of the undertaking.