The fortunes of American unions have taken a turn for the worse. Thanks to terrorism and recession, union members are reeling from a series of economic and political setbacks. Nearly half a million of them now face unemployment in the hotel and airline industries, and at Boeing, Ford, major steel-makers and other manufacturing firms. Many public employees will be clobbered next, as state and local budget crises deepen around the country. Already, teachers in New Jersey and state workers in Minnesota have been forced into controversial strikes over rising healthcare costs--a trend that affects millions of Americans. The accompanying loss of job-based medical coverage by many people who still have jobs should be fueling a revived movement for national health insurance, but few unions bother to raise that banner anymore.

Promising new AFL-CIO initiatives on immigration--like its call for legalization of undocumented workers--have been undermined by post-September 11 paranoia about Middle Easterners and federal scrutiny of thousands of them. Union organizing is stalled on many fronts, and rank-and-file participation in protests against corporate globalization--on the rise in Seattle and Quebec City--has faltered amid the myriad political distractions of the "war on terrorism." While labor's nascent grassroots internationalism remains overshadowed by flag-waving displays of "national unity," trade unionists have yet to be rewarded for their patriotism, even with a modest boost in unemployment benefits. Instead, President Bush is seeking cuts in federal job-training grants for laid-off workers. He's already won House approval for fast-track negotiating authority on future trade deals that threaten even more US jobs--and expects a Senate victory on that issue soon. To ensure that collective bargaining doesn't interfere with the functioning of various executive branch offices now engaged in "homeland security," the White House just stripped hundreds of federal employees of their right to union representation. As University of Illinois labor relations professor Michael LeRoy observed in the New York Times, "a time of national emergency makes it more difficult for unions to engineer public support."

Into this bleak landscape arrives State of the Union, Nelson Lichtenstein's intellectual history of labor's past 100 years. Readers might take comfort from the fact--well documented by the author--that labor has been down before and, as in the 1930s, bounced back. Nevertheless, Lichtenstein's book raises disturbing questions about when, where and how that's going to happen again in a period when "solidarity and unionism no longer resonate with so large a slice of the American citizenry."

The author's views on this subject are informed by both scholarship and activism. A professor of history at the University of California, Santa Barbara, Lichtenstein wrote The Most Dangerous Man in Detroit, a definitive biography of one-time United Auto Workers president Walter Reuther. In 1996 Lichtenstein helped launch Scholars, Artists, and Writers for Social Justice (SAWSJ), a campus-based labor support network. Through SAWSJ, Lichtenstein has aided teach-ins and protests about workers' rights and worked with AFL-CIO president John Sweeney to re-establish links between unions and intellectuals that might help labor become a more "vital force in a democratic polity."

Consistent with this mission, Lichtenstein hopes to revive interest in what liberal reformers in politics and academia once called "the labor question." State of the Union is thus a history of the ideas about labor that animated much of the action--all the great union-building attempts during the past century. "Trade unionism requires a compelling set of ideas and institutions, both self-made and governmental, to give labor's cause power and legitimacy," Lichtenstein argues. "It is a political project whose success enables the unions to transcend the ethnic and economic divisions always present in the working population."

He begins his survey in the Progressive Era, a period in which "democratization of the workplace, the solidarity of labor, and the social betterment of American workers once stood far closer to the center of the nation's political and moral consciousness." Politicians, jurists, academics and social activists--ranging from Woodrow Wilson to Louis Brandeis to Florence Kelley of the National Consumers League--all joined the debate about the threat to our "self-governing republic" posed by large-scale industrial capitalism. How could democracy survive when America's growing mass of factory workers were stripped of their civic rights, and often denied a living wage as well, whenever they entered the plant gates?

The Progressives' response was "industrial democracy"--extending constitutional rights of free speech and association to the workplace, enacting protective labor laws and securing other forms of the "social wage." Unfortunately, national-level progress toward these goals foundered after World War I on the rocks of lost strikes, political repression and Republican Party dominance in Washington. "Neither the labor movement nor the state, not to mention industrial management itself, generated the kind of relationships, in law, ideology, or practice, necessary to institutionalize mass unionism and sustain working-class living standards" during the 1920s, observes Lichtenstein.

The years of the Roosevelt Administration were a different story. State of the Union recounts how Depression-era unrest--plus the efforts of an unusual and uneasy alliance between industrial workers, labor radicals, dissident leaders of AFL affiliates, pro-union legislators and New Deal policy-makers--led to passage of the Wagner Act. It created a new legal framework for mediating labor-management disputes and boosted consumer purchasing power via the wage gains of collective bargaining.

As industrial unions experienced explosive growth before and during World War II, the previously unchecked political and economic power of the great corporations was finally tempered through the emergence of a more social democratic workers' movement, led by the Congress of Industrial Organizations. The CIO spoke up for the poor, the unskilled and the unemployed, as well as more affluent members of the working class. Even the conservative craft unions of the AFL ultimately grew as a result of the CIO's existence because many employers, if they had to deal with any union at all, preferred one with less ideological baggage.

Then as now, the nation's manufacturing work force was multiethnic, which meant that hundreds of thousands of recent immigrants used CIO unionism as a vehicle for collective empowerment on the job and in working-class communities. Successful organizers "cloaked themselves in the expansive, culturally pluralist patriotism that the New Deal sought to propagate," says Lichtenstein. "Unionism is the spirit of Americanism," proclaimed a labor newspaper directed at "immigrant workers long excluded from a full sense of citizenship." The exercise of citizenship rights in both electoral politics and National Labor Relations Board voting became, for many, a passport to "an 'American' standard of living."

State of the Union credits some on the left for noting, then and later, that New Deal labor legislation also had its limits and trade-offs. Wagner Act critics like lawyer-historian Staughton Lynd complain that it merely directed worker militancy into narrow, institutional channels--soon dominated by full-time union reps, attorneys for labor and management, not-so-neutral arbitrators and various government agencies. During World War II, attempts by labor officialdom to enforce a nationwide "no strike" pledge led to major rifts within several CIO unions and helped undermine the position of Communist Party members who tried to discourage wildcat walkouts.

The "union idea" that was so transcendent among liberals and radicals during the New Deal underwent considerable erosion in the 1950s. Many leading writers, professors and clergymen had signed petitions, walked picket lines, spoken at rallies, testified before Congressional committees and defended the cause of industrial organization in the 1930s. These ties began to fray after World War II and the onset of the cold war, when the CIO conducted a ruthless purge of its own left wing. This made it much harder for "outsiders" with suspect views to gain access to the increasingly parochial world of the (soon to be reunited) AFL and CIO. As Lichtenstein shows in his survey of their writings, the subsequent alienation of intellectuals like C. Wright Mills, Dwight Macdonald, Harvey Swados and others was rooted in the perception--largely accurate--that union bureaucracy and self-interest, corruption and complacency had replaced labor's earlier "visionary quest for solidarity and social transformation."

Lichtenstein questions whether unions were ever quite as fat, happy and structurally secure as some economists and historians claimed (after the fact) in books and articles on the postwar "labor-management accord." If such a deal had really existed during those years, State of the Union argues, it was "less a mutually satisfactory concordat" than "a limited and unstable truce, largely confined to a well-defined set of regions and industries...a product of defeat, not victory."

Measured by dues-payers alone, "Big Labor" was certainly bigger in the 1950s--at least compared with the small percentage of the work force represented by unions now (33 percent at midcentury versus 14 percent today). But union economic gains derived more from members-only collective bargaining than from social programs--like national health insurance--that would have benefited the entire working class.

Labor's failure to win more universal welfare-state coverage on the European or Canadian model led to its reliance--in both craft and industrial unions--on "firm-centered" fringe-benefit negotiations. The problem with the incremental advance of this "privatized welfare system" for the working-class elite was that it left a lot of other people (including some union members) out of the picture. For millions of Americans in mostly nonunion, lower-tier employment, job-based pensions, group medical insurance and paid vacations were limited or nonexistent.

The fundamental weakness of this edifice--even for workers in longtime bastions of union strength--was not fully exposed until the concession bargaining crisis of the late 1970s and '80s. As Lichtenstein describes in painful detail, employers launched a major offensive--first on the building trades, then on municipal labor and then on union members in basic industry. Pattern bargaining unraveled in a series of lost strikes and desperate giveback deals. This allowed management to introduce additional wage-and-benefit inequalities into the work force, including two-tier pay structures within the same firm, healthcare cost shifting, more individualized retirement coverage and greatly reduced job security due to widespread outsourcing and other forms of de-unionization.

By then, of course, African-Americans in the South, who suffered longest and most from economic inequality, had already risen up and made a "civil rights revolution." Their struggle was one that unions in the 1960s--at least the more liberal ones--nominally supported and in which veteran black labor activists played a seminal role. Yet the civil rights movement as a whole clearly passed labor by and further diminished its already reduced stature as the champion of the underdog and leading national voice for social justice. In a key chapter titled "Rights Consciousness in the Workplace," Lichtenstein explores how unions, their contracts and their negotiated grievance procedures have been further marginalized by the enduring legal and political legacy of the civil rights era. According to the author, this has created "the great contradiction that stands at the heart of American democracy today":

In the last forty years, a transformation in law, custom, and ideology has made a once radical demand for racial and gender equality into an elemental code of employer conduct.... But during that same era, the rights of workers, as workers, and especially as workers acting in an autonomous, collective fashion, have moved well into the shadows.... Little in American culture, politics, or business encourages the institutionalization of a collective employee voice.

Now, every US employer has to be an "equal opportunity" one or face an avalanche of negative publicity, public censure and costly litigation. Discrimination against workers--on grounds deemed unlawful by the 1964 Civil Rights Act and subsequent legislation--has become downright un-American, with the newest frontiers being the fight against unfair treatment of workers based on their physical disabilities or sexual preference. At the same time, as State of the Union and other studies have documented, collective workplace rights are neither celebrated nor well enforced [see Early, "How Stands the Union?" Jan. 22, 2001]. What Lichtenstein calls "rights consciousness" is the product of heroic social struggle and community sacrifice but, ironically, often reinforces a different American tradition: "rugged individualism," which finds modern expression in the oft-repeated threat to "call my lawyer" whenever disputes arise, on or off the job.

To make his point, Lichtenstein exaggerates the degree to which individual complaint-filers at the federal Equal Employment Opportunity Commission (and equally backlogged state agencies) end up on a faster or more lucrative track than workers seeking redress at the National Labor Relations Board. There is no doubt, though, that high-profile discrimination litigation has paid off in ways that unfair-labor-practice cases rarely do. Among other examples, the book contrasts the unpunished mass firing of Hispanic phone workers trying to unionize at Sprint in San Francisco--a typical modern failure of the Wagner Act--with big class-action victories like the settlement securing $132 million for thousands of minority workers victimized by racist managers at Shoney's. The restaurant case involved much public "shaming and redemption" via management shakeups at the corporate level; Sprint merely shrugged off allegations of unionbusting until a federal court ruled in its favor.

Lichtenstein's solution is for labor today to find ways to "capitalize on the nation's well-established rights culture of the last 40 years," just as the CIO "made the quest for industrial democracy a powerful theme that legitimized its strikes and organizing campaigns in the 1930s." He looks to veterans of 1960s social movements--who entered the withering vineyard of American labor back when cold warriors like George Meany and Lane Kirkland still held sway--to build coalitions with nonlabor groups that can "make union organizational rights as unassailable as are basic civil rights."

In so doing, Lichtenstein recommends finding a middle way between a renewed emphasis on class that downplays identity politics--"itself a pejorative term for rights consciousness"--and an exclusive emphasis on the latter that may indeed thwart efforts to unite workers around common concerns. In the past, Lichtenstein notes, "the labor movement has surged forward not when it denied its heterogeneity" but when it found ways to affirm it, using ethnic and racial pluralism within unions to build power in more diverse workplaces and communities.

Given the enormous external obstacles to union growth, the author's other proposals--summarized in a final chapter titled "What Is to Be Done?"--seem a bit perfunctory. His "three strategic propositions for the union movement" do point in a better direction than the one in which the AFL-CIO and some of its leading affiliates are currently headed. State of the Union calls for more worker militancy, greater internal democracy and less dependence on the Democratic Party. These are all unassailable ideas--until one gets beyond the official lip service paid to them and down to the nitty-gritty of their implementation.

Too often in labor today--particularly in several high-profile, "progressive" unions led by onetime student activists--participatory democracy is missing. Membership mobilization has a top-down, carefully orchestrated character that subverts real rank-and-file initiative, decision-making and dynamism. The emerging culture of these organizations resembles Third World "guided democracies," in which party-appointed apparatchiks or technocrats provide surrogate leadership for the people who are actually supposed to be in charge. In politics, it's equally disheartening to see that labor's "independence" is not being demonstrated through the creation of more union-based alternatives to business-oriented groups within the Democratic Party or by challenging corporate domination of the two-party system. Instead, it's taking the form of very traditional and narrow special-interest endorsement deals with Republicans like New York Governor George Pataki.

This is not what Lichtenstein has in mind when he urges adoption of "a well-projected, clearly defined political posture in order to advance labor's legislative agenda and defend the very idea of workplace rights and collective action." His book applauds the authentic militants who battled contract concessions and the labor establishment prior to the 1995 palace coup that put John Sweeney and his associates in control of the AFL-CIO. While the author backs "the new agenda of the Sweeneyite leadership," with its primary focus on the right to organize, he argues that the fight for union democracy is equally "vital to restoring the social mission of labor and returning unions to their social-movement heritage."

How labor is viewed, aided, undermined or ignored by men and women of ideas (including the author) is, by itself, never going to determine its fate in any era. Workers themselves--acting through organizations they create or remake--are still the primary shapers of their own future, whether it's better or worse. Nevertheless, creative interaction between workers and intellectuals has helped spawn new forms of workplace and political organization in every nation--Poland, South Africa, Korea and Brazil--where social movement unionism has been most visible at some point in recent decades. In the United States, unions--and their new campus and community allies--face the daunting task of developing ideas and strategies that will "again insert working America into the heart of our national consciousness." If they succeed in restoring its relevance, the labor movement may yet have a broader impact on our society, and Lichtenstein's State of the Union will deserve credit for being a catalyst in that process.

As the World Economic Forum met in New York City recently, the American media were much more concerned with what protesters were doing in the streets than with what they were saying there. You'd think that dissenting views were old hat and "isms" were for the classroom, not the newsroom.

But it's far too early for that. Similarly, at first glance, Peter Glassgold's collection of prose and poetry from an American anarchist magazine of 1906-17 appears to be of only historical interest: something that might be recommended as supplemental reading in an American studies curriculum, because it treats the fights for birth control and civil liberties, and against joblessness and conscription in this period. It's full of names now obscure, words that have become archaic. Imagine a time when "a special throwaway" was printed up and "circularized" in New York City by the movement of the unemployed. Or when Zola was referred to repeatedly because his works had resonance. Another, distant era. But just when it seemed that anarchism was for scholars, along came demonstrations in Seattle, Philadelphia, Prague, Quebec City, Genoa. "Anarchist troublemakers" was the antique expression I heard on the TV news not long ago. Congratulations, Peter Glassgold--you couldn't be more timely.

Since the A-word is a dirty one to many, it's likely that the presence and actions of anarchists at recent demonstrations have been exaggerated to discredit the anti-WTO, global-justice movement. But it's also possible that anarchism is visible on the left because it has less competition at present. Now, as in the late 1960s, it may channel discontent after other outlets have been rejected. It can serve as the radicalism of last resort, profiting from crises in other camps. Socialism, sharing political power in much of Western Europe, has made so many deals and compromises with big business that it no longer seems principled to a lot of people. And there's widespread suspicion that ex-Communists are weak on democracy, having made excuses for repressive states for so long.

Anarchists have the advantage of exclusion, the nobility of failure, so to speak.

They've rarely had much power; in fact, they've rarely gotten on well with the powerful. There are exceptions to that oppositional stance, however, and Glassgold's book gives glimpses of some of them. The famous anarchist theoretician Peter Kropotkin supported France when it fought the German Kaiser in World War I. The prominent propagandists and agitators Emma Goldman and Alexander Berkman rallied around the Bolsheviks in 1917, though they became angry and disillusioned when Lenin and his followers soon turned against rival revolutionary tendencies.

A "Philosophy of Non-Submission" was one name for anarchism, and the state and its institutions have not been the only target of anarchist wrath. Mother Earth, a New York journal edited by Goldman and Berkman, among others, which accepted work by anarchists and nonanarchists in the United States and abroad, spoke out against capitalism, the private ownership of land, religion, monogamy, female modesty, middle-class feminism--and I could go on. Glassgold's choice of texts captures not just the breadth but the depth of its antagonisms. "I do not want to 'love my enemies,' nor 'let bygones be bygones.' I do not want to be philosophical, nor preach their inclusion in the brotherhood of man. I want to hate them--utterly," wrote the American anarchist writer and activist Voltairine de Cleyre.

Clearly, the movement has attracted not only those who can bear angry isolation but those who find pleasure and strength in it. Berkman loved the menace in the black flag. When people try to inspire fear and loathing, I don't guarantee them satisfaction. I read this anthology with detached interest, to hear what all the Sturm und Drang was about.

Glassgold chose well when he culled from Mother Earth. The exuberance of its prose is what summaries of anarchism often fail to capture. It is all too easy for historians to make the movement sound more consistent and systematic than it was. The magazine itself, which I've examined in facsimile in a library, is full of a highly emotive type of writing and relies not just on metaphor but on a host of oratorical devices to stir an audience. Irony alternates with inspirational appeals for a better future. Essays in the journal often read like speeches (and sometimes were), where hyperbole covers holes in the arguments and exhortation often substitutes for analysis. But Glassgold hasn't prettified them.

Nor has he excised the extremism in anarchist history, which is sometimes moving, sometimes painful to read about. He doesn't skip over its martyrology: the periodic celebration and commemoration of those who suffered or died defending their ideal. With hagiography and eulogies, the movement articulated and reinforced its values: purity, courage, perseverance, self-sacrifice, devotion. These are military qualities, demanded of the soldier under fire, for anarchists were at war with society. But battles were not fought by men alone. In the anarchist milieu, women were allowed to be comrades and leaders, and to display what was at the time an unladylike anger. Revenge was tolerated, sometimes encouraged in the movement of the era. "Even animals possess the spirit of revenge," Berkman wrote in 1906. "As long as the world is ruled by violence, violence will accomplish results," he added in 1911.

Not all anarchists have taken his position. Alternative revolutionary methods, such as the general strike, were advocated at the time. Direct action could mean, simply, that the people must liberate themselves and not delegate that job to parliaments or other representatives. But at the end of the nineteenth century, it was associated with dynamite used by lone individuals or small conspiracies, and Mother Earth shows a lingering sympathy for such tactics. The process of renouncing them was slow, faltering and, in the case of some anarchists, incomplete. To omit this history would be to whitewash the movement. But to restrict anarchism to this tendency would be unfair as well.

The title Mother Earth points to an equally important and oft-neglected aspect of the movement: its appeal to a romanticized nature as the ultimate standard. While Glassgold is right that "the message of the name was not environmental but libertarian," anarchism was and remains a philosophy of nature. One of its major theorists, the Russian exile Kropotkin, was a Darwinist of a particular stripe who believed that evolution favors mutual support and cooperation, not competition. "Without that [sociable] instinct not one single race could survive in the struggle for life against the hostile forces of Nature," he stated in a lecture to a eugenics congress in London that was printed in Mother Earth in 1912. Two years later, he asserted in the same journal that

once it is recognized that the social instinct is a permanent and powerful instinct in every animal species, and still more so in man, we are enabled to establish the foundations of Ethics (the Morality of Society) upon the sound basis of the observation of Nature and need not look for it in supernatural revelation. The idea which Bacon, Grotius, Goethe, and Darwin himself (in his second work, The Descent of Man) were advocating is thus finding a full confirmation, once we direct our attention to the extent to which mutual aid is carried on in Nature.

The Spanish educator Francisco Ferrer also tied anarchism to evolution, writing in Mother Earth about the need to adapt instruction to "natural laws" and "the spontaneous response of the child." And Max Baginski, a German-born editor of Mother Earth, spurned the "artificial, forced, obligatory" aid of one trade union to another in times of trouble, preferring solidarity based on human nature--that is, his concept of it: "The gist of the anarchistic idea is this, that there are qualities present in man which permit the possibilities of social life, organization and co-operative work without the application of force." Optimistic faith in the goodness and beneficence of nature, combined with intense distrust of the "machinery" of government, the law courts and the military, distinguished anarchists from most Marxists before the First World War. And still does today. It is this combination of ideas, I think, that has become diffused among contemporary leftists who would not identify themselves as anarchists. For many radicals, then as now, nature is what Richard M. Weaver (in The Ethics of Rhetoric) calls a "god term" because it trumps all others.

Of course, one may well ask exactly what the anarchists' nature--including human nature--consists of. Shouldn't it be interrogated, not assumed? After all, the nature of nature is not self-evident. Sorry to disappoint: In Mother Earth, as in most other anarchist writing, the concept of nature was not analyzed but invoked and revered. The magazine appealed to enthusiasts. In fact, it raised what Berkman called "active enthusiasm" to a principle. There Kropotkin declared, "In a revolutionary epoch, when destructive work precedes constructive efforts, bursts of enthusiasm possess marvelous power." (When Emma Goldman was convicted in New York in 1916 for spreading birth control information in an allegedly indecent manner to an allegedly promiscuous audience, her friend and supporter Leonard Abbott reported, "Her face was alight with enthusiasm.") As Voltairine de Cleyre put it, "Wholesale enthusiasm is a straw fire which burns out quickly; therefore it must be utilized at once, if at all; therefore, those who seek to burn barriers away with it must direct it to the barriers at once."

Fire, storm, earthquake, volcano--when the topic was the coming revolution, anarchists tended to transform human actors into a force of nature. Berkman, who served a long prison sentence for attempting to kill the steel magnate Henry Clay Frick after workers had been shot in the Homestead steel strike, declared in Mother Earth that the bomb "is manhood's lightning out of an atmosphere of degradation of misery that king, president and plutocrat have heaped upon humanity." Anarchist metaphors made the rebel and criminal part of earth science, integrating and naturalizing them.

And then there are the environmental images for the vitality, joy and beauty of the anarchist goal. I wish I had a nickel for every "dawn" and "blooming spring" I've met in old anarchist publications. Glassgold's anthology has some superior examples. Praising the Paris Commune of 1871, Kropotkin asserted in Mother Earth, "The Government evaporated like a pond of stagnant water in a spring breeze." And the first cover of the magazine was heavy with traditional, even banal, symbols of paradise: human nakedness within lush vegetation. A New Age scene, Glassgold cannily observes.

Indeed, the alternative lifestyle we now call New Age was intertwined with anarchism in the late nineteenth and early twentieth centuries. Health and dress reformers, homeopaths and herbalists, practitioners of free love and nudism were often sympathetic to anarchism, friends and neighbors of anarchists, if not anarchists themselves. In the German-speaking world, this symbiosis is relatively well-known, since it has been described in such books as Ulrich Linse's Ökopax und Anarchie ("Ecopeace and Anarchy," 1986). The historian Paul Avrich has often demonstrated the close connection of anarchism to bohemia, and the tie between the two tendencies cannot be missed in the writings and biographies of Emma Goldman, Mabel Dodge and Margaret Anderson.

Yet it remains to be shown that in the American cultural realm, anarchists have had an influence out of proportion to their numbers. If we knew the continuity of anarchism in America--its influence on Gestalt psychology, Allen Ginsberg and the Beats, the folk-song counterculture of Joan Baez, the avant-garde art of Yvonne Rainer, etc.--we might not be surprised when it pops up in the news today. Certain ideas are in the air, distributed by word of mouth, more than secondhand. You may well repeat them never knowing they appeared in Freud, Marx or perhaps Bakunin. The process of popularization is notoriously hard to chart, which is probably one reason historians and social scientists tend not to study it. But the fact that something is vague and elusive doesn't necessarily make it trivial and unimportant. Is the marginality of anarchism only apparent? I vote to leave this question open.

Let me lay my cards on the table: "I am not now nor have I ever been" an anarchist, but I've written essays as well as fiction about this tradition because I think it's widely misunderstood. Ignored, idealized or caricatured, it is still largely the stuff of polemics. Glassgold's achievement is to help it be heard in its intensity and complexity.

The slogans scrawled across the walls of Paris in May 1968 suggest possibilities most of us have forgotten or that were long ago deemed preposterous. "Never work!" said one slogan. "The more you consume the less you live!" warned another. The words, restless and uncompromising, ask you to wake up, to change your life, to find a better way to live.

At its zenith in the late 1960s, the Situationist International could claim that "our ideas are in everyone's mind," even though the SI itself never had more than a few dozen members. When the whole world was exploding in 1968, the Situationist texts that had appeared throughout the decade read like road maps for revolution, full of slogans and tactics that youthful rebels picked up en route to wildly varying destinations.

The Situationist legacy, nearly forgotten after the group's dissolution in 1972, was recovered in 1989 with the publication of Greil Marcus's Lipstick Traces, which purported to trace the subterranean relationships between medieval heresy, nineteenth-century utopianism, Dada, Surrealism, Situationism, soul music and punk rock.

Today, Situationism exerts considerable--though often unacknowledged and depoliticized--influence over academic discourse and artistic practice in many media. It also plays a role in shaping the movement for global justice (or the "antiglobalization movement," as its critics like to call it), from Naomi Klein's book No Logo to the magazine Adbusters to the proliferating network of Independent Media Centers. Kept alive by a stream of reprints, anthologies and retrospectives from mostly anarchist presses, the Situationist critique continues to gain fresh adherents.

The most recent anthology, Beneath the Paving Stones: Situationists and the Beach, May 1968, includes three major Situationist pamphlets, along with eyewitness accounts, photographs, poster art, leaflets and other documents of France's almost-revolution. City Lights, meanwhile, has published the inaugural volume of its "Contributions to the History of the Situationist International and Its Time," a long conversation with Jean-Michel Mension called The Tribe.

Jean-Michel Mension was a petty criminal and teenage drunk who hung around the Saint-Germain-des-Prés neighborhood of Paris from 1952 to 1954. There he met the Lettrists, a movement of poets and painters founded in the late 1940s by Isidore Isou in response to the growing impotence of Surrealism, and taught them to smoke hash, snort ether and consume heroic amounts of alcohol. It was in this capacity that Mension met Guy Debord, the bookish filmmaker who would later become the chief theorist of the SI.

In the photos throughout The Tribe, Debord is bespectacled and a bit short, resembling a young Woody Allen. Yet his slight physical stature belied a ferocious intellect and messianic personality, one that Marcus in Lipstick Traces identifies in young rebels from eighteenth-century blasphemer Saint-Just to punk rocker Johnny Rotten. "His instincts," says Marcus, "are basically cruel; his manner is intransigent.... He is beyond temptation, because despite his utopian rhetoric satisfaction is the last thing on his mind.... He trails bitter comrades behind him like Hansel his breadcrumbs."

Debord, says Mension, was fascinated by outlaws and by the prisons and reformatories where they could be found. Not that Debord was destined for prison--Mension notes that he lived a "very bourgeois" lifestyle, once finding his friend at home in Rue Racine "in the role of a gent in a dressing gown." Like the Beats in America (correctly characterized by the SI as "that right wing of the youth revolt"), the Lettrists saw in antisocial behavior the revolt they longed for. The young, said Isou in 1953, were "slaves, tools...the property of others, regardless of class, because they have no real freedom of choice...to win real independence they must revolt against their very nonexistence." Mension was a model for the Lettrists in that he refused to be a slave or a tool, was "always on the margins," always drunk.

"Guy taught me stuff about thinkers," says Mension, "and I taught him stuff about practice, action." Young men like Isou and Debord needed delinquents like Mension, whom they later expelled from the Lettrists for being "merely decorative." Mension was not an intellectual and did not pretend to be. "I was a youngster," says Mension, "who had done things that [Debord] was incapable of doing.... I was the existential principle and he was the theoretician."

Even in his own memoir, Mension comes off as the object of others' interpretations rather than as an active subject. The most engaging parts of The Tribe are not the conversations with Mension, who is vague and noncommittal about the great ideas of his day, but the photographs and excerpts of Lettrist writing that appear in the margins of the book. For example: "We are young and good-looking," says their leaflet against Charlie Chaplin, whom they deemed an artistic and political sellout, "and when we hear suffering we reply Revolution." Combining ironic arrogance with self-righteous anger, their words will be instantly familiar to anyone acquainted with the rhetoric of youth revolt. The Tribe is not the place to go if you are trying to understand Lettrism and its influence on other movements, but it is a charming sketch of a time and place where characters like Mension and Debord collided to create new ways of living and thinking.

In 1957 Guy Debord met with avant-garde artists and theorists from around Europe to found a new group, which would be devoted to creating situations: "moment[s] of life concretely and deliberately constructed by the collective organization of a unitary ambiance and a game of events." The Situationists were leftists in the tradition of Karl Marx and Rosa Luxemburg, but like the Lettrists they embraced youth as agents of revolutionary change--a mad and astonishingly prescient challenge to the sociological orthodoxy of their time. They also developed a radical new vision of how capital shapes culture and society.

"In the primitive phase of capitalist accumulation," writes Debord in his 1967 treatise The Society of the Spectacle, "political economy sees the proletarian only as the worker," who receives only enough compensation to survive. When the surplus of commodities reaches a certain level, however, it becomes necessary for capitalism to extract "a surplus of collaboration from the worker." As Henry Ford famously remarked, he had to pay his assembly-line workers enough so that they could afford to buy his cars. Thus was born the culture of consumption--and the society of the spectacle.

"The spectacle," says Debord, is "capital accumulated until it becomes an image" that mediates social relations among people, serving the needs and obscuring the power of capital. In the spectacular economy, all daily life and everything related to thought--sports, advertising, news, school, etc.--is mobilized on behalf of commodities, preaching work and consumption to the powerless so that the owners may prosper and live more fully. Unlike the rest of the Marxist left, the Situationists did not target scarcity, but instead abundance and the contradictions it entailed--especially boredom, which they saw as an ultramodern, artificially created method of social control.

According to the Situationists, revolution in the spectacular economy cannot be waged only at the point of industrial production, as Marxists thought; it must also be waged at the points of consumption and in the realm of the image. It was at these points that alienation was deepest, the contradictions sharpest. Destroying the symbols that stand between the owners and nonowners might reveal the underlying machinations of capital. The proletariat was still a revolutionary class, but one joined by students, alienated youth and media workers. In 1966, a group of students used their posts in the student government at the University of Strasbourg to publish a stick of dynamite called "On the Poverty of Student Life," by SI member Mustapha Khayati. Thousands of copies were distributed to students at Strasbourg and throughout France, lighting a fuse that eventually ignited a general strike of students and workers in 1968. This is where Beneath the Paving Stones begins, with the moment in which Situationism broke through to its contemporaries.

Snotty and provocative, "On the Poverty of Student Life" asks students "to live instead of devising a lingering death, and to indulge untrammeled desire." By doing so, the erstwhile students would create a situation that goes beyond the point of no return. The spectacle and the mode of hierarchical, exploitative production it represents would be destroyed, replaced (rather magically, it must be said) by workers' councils practicing direct democracy and free individuals living without false desires. Voilà! Utopia!

"On the Poverty of Student Life" popularized ("diffused") the Situationist critique, but the centerpiece of Beneath the Paving Stones is "The Totality for Kids," by Raoul Vaneigem. In thirty elliptical sections, Vaneigem takes the reader from the "prehistoric food-gathering age" to the spectacular information economy and then to the point of revolution. There, says Vaneigem, the dispossessed must each seize their own lives, deliberately constructing each moment in a generalized conflict that stretches from domestic squabbles to classrooms to shop-floor struggles. It's a conception of revolution that encompasses feminism, black power, student power, anticolonial revolt, workers' control and even avant-garde artistic movements.

While Situationist writings, powerful and still relevant, deserve their perennial revival, Beneath the Paving Stones falters by not criticizing and updating Situationist theory and practice. At the beginning of the twenty-first century, it is simply not possible to present the events of 1968 without indulging in a certain radical nostalgia. Not only has postmodernism depoliticized or challenged many Situationist ideas (particularly reader-response theories arguing that nonowners actively participate in shaping the spectacle) but the Situationist legacy is often embraced by radical movements that forget to ask what happened after 1968. The Situationists certainly knew how to throw a great party, but they were not particularly interested in the details. The Situationist International ended its life fragmented, isolated, defeated. No other movement or organization was ever pure enough for them, and as personality conflicts and expulsions diminished their ranks, self-indulgent tactics limited their influence.

One can see this self-destructive tendency emerging early on, in the works and lives of those proto-punks, the Lettrists. By seeking to live without contradictions, on the margins where they had the freedom to construct their own daily lives--"we must survive," says Vaneigem, "as antisurvivors"--the Lettrists sacrificed their ability to attack those contradictions. The empty space in Situationist theory, as in many others, lies between constructing individual moments and changing the world. The Surrealists once sought to fill that space with official Communism, but were ultimately forced by Stalinism to sacrifice the subconscious for Socialist Realism. The Situationists learned from their sad example, taking a path that has left their ideas intact but confined to the realm of anthologies and retrospectives. Romanticizing their integrity might be useful, but fetishizing their failure is not. Situationism must be surpassed if it is ever to make a difference.

The subtitle sounds bad, but keep in mind that Thorstein Veblen considered subtitling his book on academics "A Study in Total Depravity." The really bad news concerns the title: The term "public intellectual" is practically obsolete.

It's dying young. Although the subject of much hoo-ha lately, it has not been current for very long. Russell Jacoby popularized it in his 1987 book The Last Intellectuals: American Culture in the Age of Academe. Jacoby did not coin the term--he quoted C. Wright Mills using it in 1958--but he found a congenial semantic niche for it: to distinguish unaffiliated from college-based thinkers. In the old days, there wasn't any need to make the distinction, because the generation born in and around 1900 doubted that intellectual life could take place in academia. "To be an intellectual did not entail college teaching," Jacoby wrote of the era that formed Lewis Mumford and Edmund Wilson; "it was not a real possibility." By the time of Jacoby's book, however, contemplative lives were being led on campuses, or so it was claimed, and since the campuses had the dollars to back the claim, the old-fashioned independent intellectuals were marked with the delimiting adjective "public."

Now the adjective is about to disappear, because the independents are on the verge of losing even their right to the noun. In his new book, Richard Posner hints that there is today "a certain redundancy in the term 'public intellectual.'" One would expect Posner to be highly sensitive to the use of the term, because he lives the role it describes. Profiled in Lingua Franca and more recently in The New Yorker, and invited to post his diary on Slate, Posner is a judge on the Court of Appeals for the Seventh Circuit, a founder of the field of law and economics, and the author of books on everything from the rational-choice theory of sex to the 2000 presidential election.

The term is redundant, Posner suggests, because an intellectual is by definition someone who addresses the public. Writing for fellow experts may take just as much brainpower but is merely academic. For practical reasons Posner is not concerned, as Jacoby was, with the brave last stand of independent thinkers. "There was a time when an intellectual could do as well (or rather no worse) for himself financially by writing books and articles as by being a professor," Posner writes. "That time is largely past. The opportunity cost of being an independent public intellectual has skyrocketed because of the greatly increased economic opportunities in the academic market." Nowadays the term "public intellectual" merely refers to an academic in his capacity as a moonlighter. The qualifier "public" is expendable once all intellectuals have day jobs.

In other words, the short lifespan of the term corresponds to the interval between the decline of "Intellectuals cannot be professors" and the rise of "All intellectuals are professors." About this transition Jacoby was wistful and, in a desperate, Gertrude-Stein-beckoning-the-Lost-Generation way, optimistic. "A specter haunts American universities or, at least, its faculties: boredom," Jacoby wrote, and he quoted a report that found "almost 40 percent [of professors] ready and willing to leave the academy." Posner, in contrast, is resigned and matter-of-fact. He knows the laws of economics. The marketplace of ideas, like other markets, results from the preferences and resources of those who participate in it. If 40 percent of professors say they want to leave the academy, they must have excellent reasons for staying. After all, if nothing were holding them back, their dissatisfaction would not show up in surveys of professors; they would not be professors.

The market has reasons that reason knows not of, and Posner is willing to respect them. "In the main we shall have to live with this slightly disreputable market," he writes. "But what else is new? We Feinschmeckers have to live with vulgarity in popular culture, the sight of overweight middle-aged men wearing shorts and baseball caps, weak coffee and the blare of the television set in every airport waiting lounge. It is doubtful that the public-intellectual market is a more debilitating or less intractable feature of contemporary American culture than these other affronts to the fastidious."

But although a monopoly by academics may be inevitable, Posner apprehends the mediocrity of it, acutely. "The disappointment lingers," he admits. In fact he was motivated to write this book by dismay at what his tenured peers had written and said about the impeachment of Clinton, the Microsoft antitrust case and the supposed moral decline of America. Their commentary seemed so shoddy and silly as to require an economic explanation.

How do you analyze the economics of something so airy? In fact, once Posner sets to work, it turns out to be less airy than bloody. Much of the fun in reading Public Intellectuals consists of watching Posner triage the meats for his sausage.

You won't be put through his grinder just because you're smart and pop up in Nexis. Of Nation writers, Alexander Cockburn and Patricia Williams make his list, as do Victor Navasky and Katrina vanden Heuvel, but Christopher Hitchens inexplicably does not, even though Posner cites him in a footnote. John Rawls is out--too, too academic.

Harvard's new president, Lawrence Summers, qualifies, just barely. Theodore Roosevelt, Newt Gingrich, Winston Churchill, Leonard Bernstein and William Sloane Coffin are excluded because they are better known for nonintellectual achievements. This caution is justified because intellectual celebrity is so easily dwarfed by other kinds. The intellectual most often mentioned in the media between 1995 and 2000 was Henry Kissinger, and yet the 12,570 allusions to him are as a drop in the bucket and are counted as the small dust on the balance beside Michael Jordan's 108,000.

Whether devised by art, science or expedience, the tallies and rankings are where most readers will start, and Posner has strategically placed them in the precise middle of his book, as far from either end as possible, for the same reason grocers put milk at the back of the store. "Consumer Reports does not evaluate public intellectuals," Posner observes, and people like to know the score. It is disconcerting to see Camille Paglia and Oliver Wendell Holmes nearly tied in a ranking by media mentions. It is suggestive that the intellectual most often cited in scholarly writing between 1995 and 2000 was Michel Foucault, and even more suggestive that Foucault's score is nearly twice that of the second-most-cited intellectual, Pierre Bourdieu. (Posner himself comes in tenth.)

Once the air of the horsetrack has dissipated, the reader turns to Posner's analysis. Here is the news, in summary:

If you are a public intellectual, your odds of being mentioned in the media improve if you are not an academic, are not dead, have served in government and are either a journalist or a writer.

At first glance this might look like good news. But the higher profile of nonacademics does not mean that the unaffiliated intellectual is alive and well. It means, rather, that those who have managed to become public intellectuals despite a lack of academic credentials tend to be mentioned more frequently than their academic peers. As time goes by, there are fewer such people. Of the 546 intellectuals in Posner's sample, 56 percent of the dead ones are academics, and 70 percent of the living ones. And as Posner deadpans, "Notice the high average age even of the living public intellectuals"--64 years old. Among actual young people, the rate of intellectual institutionalization is probably even higher.

"Media mentions come at the expense of scholarly citations (and vice versa)," Posner observes. "An academic who wants to succeed as a public intellectual might be well advised to substitute government service for additional scholarly publications!" But if it is posterity you hunger for, think carefully. In Posner's sample, being dead correlates well with scholarly citations, which suggests that "public-intellectual work is more ephemeral than scholarship." The correlation may, of course, suggest other inferences to less sanguine minds. So much for the facts. Although Posner is known as a pragmatist, the most provocative analysis in Public Intellectuals is actually of his own hunches and grudges, and of the social maps drawn by observers like Jacoby and Bobos in Paradise author David Brooks.

Posner thinks that public-intellectual work offers the consumer three goods: entertainment, solidarity and information. The consumer (and the magazine editor or television producer who procures on the consumer's behalf) can usually tell by inspection whether a commentary is entertaining and whether it reassures people that they are on the right team, be it of abortion-haters or deconstruction-defenders. In an age of specialized knowledge, however, only another expert can judge whether the information in a piece of commentary is worthwhile. Its value is what an economist would call a "credence good"; consumers have to take it on faith. By the time you figure out that there must have been a flaw somewhere in that September 1999 Atlantic Monthly article titled "Dow 36,000," it is too late to get your money back.

By now even the writers have been paid.

Most markets in credence goods correct for this uncertainty, in order to keep frustrated consumers from fleeing. Sellers may offer money-back guarantees, advertise heavily to signal long-term commitment to a product, cooperate with a third-party rating system, choose retailers who are reputed to be judicious gatekeepers or consent to government regulation. Even in the absence of any correctives, however, sellers usually refrain from offering egregiously low-quality products, because they want customers to buy from them again in the future. They are deterred by "the cost...of exit from the market."

The public-intellectual market deals in credence goods, but Posner fears that it may be suffering from market failure. Consumers trust periodicals and talk shows to act as filters, but they seem to be filtering for entertainment and solidarity rather than for information. More damaging, the cost of exit from the public-intellectual market is very low. No academic loses his job because he has made a fool of himself on the Op-Ed page. It has therefore become unwise for the consumer to believe public intellectuals. Posner likens them to palm readers: They claim to know the answers to vital questions, but the cost of figuring out whether they really do is prohibitive. The rational consumer responds by discounting the value of the information and consulting them merely for entertainment.

Why is the cost of exit from the public-intellectual market so low? For the simple reason that there is not much reward for entering it in the first place. Here economic analysis converges with traditional lament. The professors have ruined everything. They are obscurantist, pedantic, naïve, exaggerative of the reach of their expertise, theory-mad, timid toward anyone who might put a letter in their tenure file and intemperate toward everyone else, but the real problem is their free time. They have a lot of it, and they are willing to sacrifice almost any quantity to see their names in print. They are, in other words, cheap. They drag the supply curve downward on the dollar axis. The price of public-intellectual work drops, and more of it is produced.

With prices so low, unaffiliated intellectuals can no longer make a living. (At many periodicals, the payment for editorials and book reviews is lower than for other kinds of writing. This is not because they require less effort; it is because an academic can always be found to write them.) Absent a class of people whose livelihood depends on the market, an ethos of quality gives way to an ethos of tourism. "He is on holiday from the academic grind and all too often displays the irresponsibility of the holiday goer," Posner writes of the moonlighting professor. "Insulated from the retribution of disappointed consumers by virtue of being part-timers," academic intellectuals behave like movie-star politicians.

You're so vain, you probably think this book is about you, don't you? Public Intellectuals is a portmanteau book. The first part consists of the analysis of the public-intellectual market described above, but in the second, the reader is dropped into conversations whose beginnings he has not witnessed. Martha Nussbaum is wrong to think that the moral of The Golden Bowl is resignation to your husband's adultery. (Martha Nussbaum is here? In the room with us?) Wayne Booth's attempt to reconcile the aesthetic to the ethical is doomed. Aldous Huxley predicted the future better than George Orwell, but Orwell wrote a better novel. Robert Bork is disingenuous about so-called partial birth abortions. Gertrude Himmelfarb is unconvincing about the cultural metastasis of the naughty. Richard Rorty may be the heir to Socrates, Dewey and J.S. Mill, but he deploys a rhetoric that passed its freshness date sometime in the 1930s, and as for Martha Nussbaum--did I mention her already? The chapters are informative and at times highly entertaining ("The 'Ode on Melancholy' is not improved by being made risqué, just as a pig is not enhanced by wearing lipstick," writes Posner, in a simile that becomes more disturbing the more it is considered), but they are miscellaneous, and the reader senses that because of a wish to revisit old grudges--or recycle old articles--the tail is wagging the pig. In his conclusion, Posner returns to topic. Academia has diminished intellectual life, but rebellion is futile, because academia is what Tocqueville would call a soft tyranny. Like the Hand of God as described to me in Sunday school, it destroys not by striking the wicked but by releasing them into the danger they prefer, where they must write for in-flight magazines in order to pay their rent.

Accordingly, Posner offers extremely modest proposals for reform: He would like to encourage academics to post their public-intellectual work on websites, deposit printouts in libraries and disclose relevant earnings. He doesn't think the reforms will be adopted, because "the irresponsibility of public-intellectual work is one of the rewards of being a public intellectual." But even if Posner's suggestions were adopted, they would change nothing. The money involved is usually trivial, as he himself admits, and he has overestimated how hard it is to trace what an academic has said in public.

As near as I can tell, only one of Posner's suggestions has even the faintest chance of success: "One might hope that as a matter of self-respect the university community could be persuaded to create and support a journal that would monitor the public-intellectual activities of academics and be widely distributed both within and outside the community." Thus would specialized academics be matched by specialized journalists, and the failure of one market remedied by the development of another. Alas, Lingua Franca suspended publication in November.

Nike-Zeus, Nike-X, Sentinel, Safeguard, Star Wars, X-ray lasers, space-based neutron particle beams, Brilliant Pebbles, Ground-Based Midcourse National Missile Defense, Midcourse Defense Segment of Missile Defense. Over the past fifty years America has poured approximately $100 billion into these various programs, all efforts to shield the country against long-range ballistic missiles. Yet not one has worked. Not one. Nevertheless, except for the constraints imposed by his own "voodoo economics," President George W. Bush appears poised to pursue the development and deployment of a layered missile defense--as a hedge against more failures--that would force taxpayers to cough up as much as another $100 billion. In December Bush formally notified Russia that the United States was withdrawing from the 1972 Anti-Ballistic Missile treaty in order to "develop ways to protect our people from future terrorist or rogue state missile attacks."

Russian President Vladimir Putin labeled Bush's decision a "mistake," a mild reaction that should not disguise the fact that much of Russia's political elite is seething at the withdrawal. Already smarting from America's broken promise not to expand NATO and from the US-led NATO bombing of Yugoslavia in 1999 (which violated the 1997 "Founding Act" between Russia and NATO), that elite views the coincidence of America's success in Afghanistan (which obviated the need for further Russian assistance) and its withdrawal from the ABM treaty as yet further evidence of American duplicity.

President Clinton diplomatically explained the Republicans' obsession with missile defense when he observed: "One of the problems they've got is, for so many of their supporters, this is a matter of theology, not evidence. Because President Reagan was once for it, they think it must be right, and they've got to do it, and I think it makes it harder for them to see some of the downsides." That's a nice way of saying that the conservative wing of the Republican Party abounds with missile-defense wackos. I've participated personally in two missile-defense conferences and was astounded by their right-wing, faith-based atmospherics.

Which is why Bradley Graham's engaging narrative of politics and technology during the Clinton years, Hit to Kill: The New Battle Over Shielding America From Missile Attack, seems destined for popular success, notwithstanding its serious conceptual limitations. Graham ably recounts the excessive exuberance of Republicans as they schemed to realize their missile-defense dreams. But he is equally critical of the Clinton Administration's attempt to actually build a missile defense: its "three-plus-three" ground-based midcourse program.

Offered in the spring of 1996, in part to undercut the Republicans, "three-plus-three" provided for three (or four) years of development, after which, if then technologically feasible and warranted by a threat, there would be deployment within another three years. In early 1998, however, a sixteen-member panel, led by retired Air Force chief of staff Larry Welch, condemned the plan as a "rush to failure."

But two overdramatized events later that year lent the missile-defense cause even greater urgency. In July, the Commission to Assess the Ballistic Missile Threat to the United States, led by Donald Rumsfeld, asserted that America's intelligence agencies had woefully underestimated the capability of "rogue" regimes, such as those leading North Korea and Iran, to attack US territory with ballistic missiles within five years. It concluded: "The threat to the United States posed by these emerging capabilities is broader, more mature, and evolving more rapidly than has been reported in estimates and reports by the intelligence community."

When North Korea subsequently launched a three-stage Taepodong 1 missile past Japan in August 1998, many Americans put aside not only their qualms about the role Representatives Curt Weldon and Newt Gingrich had played in creating the commission, but also their suspicions about the blatantly pro-missile defense bias of most of its members. Although Graham generally portrays the commission's deliberations as unbiased, he does provide evidence that some of its briefers were not.

For example, one intelligence official betrayed visible irritation during his briefing of commission members, prompting General Welch to ask, "You're not happy to be here, are you?" The official replied, "No, I'm not. I'm ticked off that I have to come down and brief a bunch of wacko missile-defense advocates." His outburst infuriated Rumsfeld, who "stalked" out of the room.

Nevertheless, Rumsfeld's report and the launch of North Korea's missile frightened Americans and galvanized Republicans. Graham's investigative reporting gets inside the subsequent political war waged against a Clinton Administration that, itself, was slowly awakening to the possibility of a more imminent ballistic missile threat.

Graham brings an open mind to the hotly disputed technological merits of missile defense. Nevertheless, he cannot avoid the conclusion that George W. Bush's decision to expand missile defense beyond Clinton's ground-based midcourse program constitutes an acknowledgment that, after fifty years, "military contractors had yet to figure out how best to mount a national missile defense."

In theory, a ballistic missile can be intercepted during its comparatively slow, if brief, "boost phase," before its "payload"--warheads, decoys and debris--is released. Speed is of the essence during the boost phase. So is proximity to the target. According to Philip Coyle, former director of the Pentagon's Office of Operational Test and Evaluation, "The process of detection and classification of enemy missiles must begin within seconds, and intercept must occur within only a few minutes. In some scenarios, the reaction time to intercept can be less than 120 seconds."
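The arithmetic behind Coyle's numbers is sobering. Suppose, to pick an illustrative figure (mine, not Coyle's), that an interceptor stationed 1,000 kilometers from the launch site must close that distance within 120 seconds. Its required average speed is

\[ v = \frac{d}{t} = \frac{1{,}000\ \text{km}}{120\ \text{s}} \approx 8.3\ \text{km/s}, \]

faster than orbital velocity (about 7.8 kilometers per second)--and that is before a single second has been spent detecting and classifying the launch.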

Compounding concerns about boost-phase intercepts are questions about the ability of an interceptor to distinguish quickly between a missile's flame and the missile itself. Finally, boost-phase missile-defense platforms would invite pre-emptive attacks from any state bold (and foolish) enough to launch ballistic missiles.

The "terminal phase" of ballistic missile flight is the final minute or two when the payload re-enters the atmosphere. Detection of the warhead is comparatively simple, but designing a missile fast enough to catch it and hit it--given the problems associated with sensor degradation in intense heat--is extremely difficult. Countermeasures, such as maneuvering capability or precursor explosions, would further complicate defensive efforts. Finally a terminal-phase missile defense can, by definition, protect only a limited area, perhaps one city. Thus, many such systems would be required.

The "midcourse phase" of ballistic missile flight is the period during which the payload is dispersed in space. It remains there more than 80 percent of the missile's total flight time. The Clinton Administration's ground-based midcourse program (continued by the Bush Administration) is designed to strike the warhead in space with a high-speed, maneuverable kill vehicle--thus Graham's title: Hit to Kill.

Easily the most developed of all the programs, the midcourse system demonstrated the awesome technological feat of destroying a warhead hurtling through space--hitting a bullet with a bullet--as recently as December 3, 2001. Yet such a feat constitutes but the commencement of an arduous technological journey, not its endpoint.

As a "Working Paper" issued recently under the auspices of the Union of Concerned Scientists noted, America's ground-based midcourse program has not been subjected to real-world tests. Five hit-to-kill tests have resulted in three hits. But each test: (1) used identical test geometrics (the location of launches, trajectories of target and interceptor missiles); (2) released the same objects (payload bus, warhead and decoy); (3) occurred at the same time of day; (4) made the lone decoy obviously and consistently different from the warhead; (5) told the defense system what to look for in advance; (6) attempted intercept at an unrealistically low closing speed; (7) kept the target cluster sufficiently compact to aid the kill vehicle's field of view; and (8) provided the kill vehicle with unduly accurate artificial tracking data.

Any ground-based midcourse missile defense system has to contend with virtually insurmountable countermeasures, especially the decoys that, in space, are quite indistinguishable from the warheads. Yet the three successful hits did not have to contend with even the countermeasures that a missile from a "rogue" regime would probably employ.

A National Intelligence Estimate in 1999 determined that "countermeasures would be available to emerging missile states." In April 2000 a "Countermeasures" study group from the Union of Concerned Scientists and the MIT Security Studies Program concluded: "Even the full [National Missile Defense] system would not be effective against an attacker using countermeasures, and an attacker could deploy such countermeasures before even the first phase of the NMD system was operational." Consequently, "it makes no sense to begin deployment."

Craig Eisendrath, Melvin Goodman and Gerald Marsh (Eisendrath and Goodman are senior fellows with the Center for International Policy in Washington; Marsh is a physicist at Argonne National Laboratory) state the problem even more starkly in their recent book The Phantom Defense: America's Pursuit of the Star Wars Illusion: "This is the bottom line: the problem isn't technology, it's physics. Decoys and warheads can always be made to emit almost identical signals in the visible, infrared, and radar bands; their signatures can be made virtually the same."

If such information troubles Defense Department officials responsible for missile defense, they seldom admit it publicly. However, they're not nearly as irresponsible as the political and "scholarly" cheerleaders who remain unmoved by a half-century of failure and the physics of countermeasures. I encountered one of them last June at a missile defense conference in King of Prussia, Pennsylvania.

Representative Weldon delivered the conference's keynote address to more than 220 participants from the Defense Department, the military industry, think tanks, various universities and the press. Weldon is the author of HR 4, legislation that made it "the policy of the United States to deploy a national missile defense." (Senator Carl Levin was able to add amendments to the Senate bill on missile defense that made the program dependent upon the annual budget process and tied it to retention of the ABM treaty; Weldon referred to the amendments as cowardice. Nevertheless, they remained in the Missile Defense Act that President Clinton signed on July 22, 1999.)

Weldon told the audience that the United States requires a missile-defense system to protect its citizens from an intentional missile attack by a "rogue" regime presumably undeterred by the prospect of an overwhelming American nuclear retaliation. He even displayed an accelerometer and a gyroscope, Russian missile components allegedly bound for a "rogue." He then displayed an enlarged, poster-size photograph of Russia's SS-25 ICBM. Russia possesses more than 400 such missiles, he asserted, and any one of them might be launched accidentally against the United States, given Russia's deteriorating command and control capabilities.

It was a "no-brainer." Both threats demanded that America build a national missile defense system, capable of intercepting such missiles, as soon as possible.

However, when I asked Congressman Weldon to shift from the SS-25 and contemplate whether his modest missile-defense system could prevent the penetration of an accidentally launched TOPOL-M ICBM from Russia, he responded, "I don't know. That's a question you should ask General Kadish during tomorrow's session." Extending the reasoning, I asked Weldon whether his modest missile-defense system could shield America against a missile, launched by a rogue regime, that was capable of TOPOL-M countermeasures. Weldon again answered that he did not know. But rather than let such doubts linger at a conference designed to celebrate missile defense, Kurt Strauss, director of naval and missile defense systems at Raytheon, rose to deny that Russia possessed such countermeasures.

Presumably, Strauss was unaware of the work of Nikolai Sokov, a former Soviet arms control adviser and author of Russian Strategic Modernization: Past and Future. Sokov claims that the TOPOL-M features a booster intended to reduce the duration and altitude of the boost phase, numerous decoys and penetration aids, a hardened warhead and a "side anti-missile maneuver."

Strauss's uninformed denial hints at a much bigger problem, however: the prevalence of advertising over objectivity in a society where the commercialization of war and the cult of technology have reached historic proportions. In The Pursuit of Power historian William McNeill traces the commercialization of war back to mercenary armies in fourteenth-century Italy, pointing out the "remarkable merger of market and military behavior." And Victor Davis Hanson, in Carnage and Culture, sees much the same reason behind the decimation of the Turkish fleet, some two centuries later, by the Christian fleet at Lepanto--"there was nothing in Asia like the European marketplace of ideas devoted to the pursuit of ever more deadly weapons." McNeill concludes that "the arms race that continues to strain world balances...descends directly from the intense interaction in matters military that European states and private entrepreneurs inaugurated during the fourteenth century."

Post-cold war America, virtually alone, luxuriates in this dubious tradition. Yet it was no less than Dwight Eisenhower who warned America in his farewell address: "This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence--economic, political, even spiritual--is felt in every city, every Statehouse, every office of the federal government."

Who could have been surprised, then, when Matthew Evangelista conclusively demonstrated, in Innovation and the Arms Race (1988), that commercial opportunities within America's military-industrial complex, much more than any Soviet threat, propelled innovation--and, thus, most of the arms race with the Soviet Union? A year later, the highly respected defense analyst Jacques Gansler identified the uniquely American "technological imperative" of commercialized warfare: "Because we can have it, we must have it." Such impulses caused the United States to run profligate arms races with itself both during and after the cold war. They also explain America's post-cold war adherence to cold war levels of military expenditures and, in part, our missile-defense obsession today.

This technological imperative had its origins in America's "exceptional" historical experience, which it continues to serve. Indeed, so the argument goes, Why should a country on a mission from God sully itself with arms control agreements and other compromises with lesser nations, when its technological prowess will provide its people with the invulnerability necessary for the unimpeded, unilateral fulfillment of their historic destiny?

Such technological utopianism, however, has its costs. In their book The Dynamics of Military Revolution, 1300-2050, MacGregor Knox and Williamson Murray demonstrate the very secondary role that technology has played in past military revolutions. They conclude: "The past thus suggests that pure technological developments without the direction provided by a clear strategic context can easily lead in dangerous directions: either toward ignoring potential enemy responses, or--even more dangerously--into the dead end, graphically illustrated by the floundering of U.S. forces in Vietnam, of a technological sophistication irrelevant to the war actually being fought." (In Hit to Kill, Graham has little to say about military strategy or the commercialization of warfare.)

In hawking a missile defense shield, Representative Weldon traveled in the first dangerous direction when he assured the defense conferees that although Congress was not ignoring the threat posed by terrorists with truck bombs, "when Saddam Hussein chose to destroy American lives, he did not pick a truck bomb. He did not pick a chemical agent. He picked a SCUD missile.... The weapon of choice is the missile."

Unfortunately, on September 11, America learned that it is not.

Potentially worse, however, is the Reaganesque theology propelling the Bush Administration's decision to withdraw from the 1972 Anti-Ballistic Missile treaty. Putting aside the question of whether withdrawal requires formal Congressional approval and other questions of international relations, one must ask why any administration would destroy the cornerstone of strategic stability. The ban on national missile defenses not only prevents a defensive arms race but also obviates the need to build more offensive missiles to overload the enemy's defenses. Why would a country withdraw from the ABM treaty without knowing whether its own missile-defense system will even work, and before conducting all the tests permitted by the treaty that would provide greater confidence in the system's ultimate success?

Readers of Keith Payne's recent book The Fallacies of Cold War Deterrence and a New Direction might guess the probable answer. Payne, chosen by the Bush Administration to help shape the Defense Department's recently completed but still classified Nuclear Posture Review, writes about a new, post-cold war "effective deterrence," to which even an imperfect missile-defense system might contribute: "In the Cold War, the West held out the threat of nuclear escalation if the Soviet Union projected force into NATO Europe; in the post-Cold War period it will be regional aggressors threatening Washington with nuclear escalation in the event the United States needs to project force into their regional neighborhoods.... In short, Washington will want effective deterrence in regional crises where the challenger is able to threaten WMD [weapons of mass destruction] escalation and it is more willing to accept risk and cost."

The real concern, then, is less about protecting America from sneak attacks by rogue states ruled by madmen, and more about preserving our unilateral options to intervene throughout much of the world. Thus, President Bush's speech at The Citadel in December was disingenuous. His rhetorical question asking what if the terrorists had been able to strike with a ballistic missile was primarily an attempt to steamroller frightened Americans into supporting missile defense. The speech simply seized upon the wartime danger to compel a military transformation that has been debated for almost a decade and resisted by the services and the military industry since the beginning of Defense Secretary Rumsfeld's tenure.

Lest we forget, China hasn't disappeared either. Its muted criticism of America's withdrawal from the ABM treaty was accompanied by a call for talks to achieve "a solution that safeguards the global strategic balance and doesn't harm international efforts at arms control and disarmament." Failing such talks, China may feel compelled to increase its offensive arsenal to insure penetration of an American missile defense, which could provoke India, and consequently Pakistan--perhaps rekindling tensions that have already brought them to the brink of war.

Russia, for its part, believes it has little to fear from America's current missile-defense programs but is awaiting the inevitable: the moment when the technological utopians push America to expand its modest system into a full-blown shield. How will Russia respond then?

To court such reactions by withdrawing from the ABM treaty before even testing against decoys is pure strategic illiteracy--which only a Reaganesque theology (founded on exceptionalism, commercialized militarism, technological utopianism and righteous unilateralism) shrouded by the "fog of war" might explain.

Why in 1973 did Chile's democracy, long considered the crown jewel of Latin America, turn into Augusto Pinochet's murderous regime? Why did the United States, which helped Pinochet seize power from Salvador Allende, support the violent dictator for nearly two decades? Scholars answering these questions have usually focused on the threat posed by Allende, the first elected Marxist head of state, to Chilean and US business interests and to the cold war foreign policy of the United States. But recently declassified documents, along with the reissue of Patricia Politzer's Fear in Chile: Lives Under Pinochet, suggest that the Chilean counterrevolution, however much shaped by immediate economic and political causes, was infused with a much older, more revanchist political spirit, one stretching as far back as the French Revolution.

Edward Korry, who served as US ambassador to Chile between 1967 and 1971, greeted Allende's election in 1970 as if the sans-culottes were at the gate. Before all the votes were in, he smelled the "stink of defeat" and could hear "the mounting roar of Allendistas acclaiming their victory" arising "from the street below." Although no guillotine blade had yet dropped, material declassified by the United States over the past couple of years shows that Korry fired cable after cable back to Washington, warning of "the terror" to come and citing Baudelaire to brand Allende a "devil."

It may seem bizarre that an LBJ-appointed Democrat would pepper his diplomatic missives with the overheated prose of French romanticism. After all, critics have charged cold war liberals, such as Robert McNamara and McGeorge Bundy, with employing a dry calculus in deciding the number of casualties needed to defeat Communism. But Korry was no bloodless bureaucrat. In fact, in both tone and content, his writings were remarkably similar to those of the illiberal Joseph de Maistre, the arch-Catholic reactionary who launched violent, intoxicated attacks on the French Revolution. By injecting medieval Catholic orgiastic mysticism with the revolutionary zealotry of his contemporaries, Maistre offered a compelling alternative to earthly promises of secular justice and political participation. He was the first who understood that if a counterrevolution was to be won, it would be necessary to win the "hearts and minds" of what would come to be known as the masses.

As fervidly as Maistre hated la secte of Jacobins and eighteenth-century rationalists, Korry disdained Allende and his Popular Unity followers, and largely for the same reason: Where Maistre rejected the idea that people could be governed by enlightened principles, Korry dismissed as "dogmatic and eschatological" those who believed that "society can be structured to create paradise on earth." And both men reserved their strongest scorn for the pillars of the old regime--church, army and state--because, whether through ineptitude or corruption, they had failed to see and to confront the evil before them. Lost in a "myopia of arrogant stupidity," the elites and officials who had allowed Allende to come to power were a "troupe of fools and knaves" leading Chile to the "marxist slaughter-house." It is as if Korry saw the revolution as divine retribution against a decaying polity. "They should be given neither sympathy nor salvation," he said of the weak-willed ruling party.

Echoing Maistre's observation that republican rule is ill suited to protect society against revolutionary fanaticism, Korry complains in his cables about a gracious political culture that places no brake on Allende's determination: "Civility is the dominant characteristic of Chilean life. Civility is what controls aggressiveness, and civility is what makes almost certain the triumph of the very uncivil Allende." Neither the military nor the outgoing president, Eduardo Frei, "have the stomach for the violence they fear would be the consequence of intervention," Korry wrote to Washington. The Communist Party, in contrast, Korry warned, was "that most clear-minded and cohesive force in Chile.... Allende is their masterwork in Latin America and they do not lack for purpose or will."

Korry worked to strengthen domestic opposition to Allende's Popular Unity coalition, yet he also opposed Henry Kissinger's plot to provoke a military coup (which led to the murder of Chilean Gen. René Schneider). Instead, he advocated patience, confident that, with encouragement, internal dissent would eventually oust Allende. Again, remarkably akin to Maistre, Korry felt that restoration had to come from within rather than be imposed from without. He had faith that time favored his position; that the revolutionaries, in their effort to build a society that ran against human nature, would soon exhaust themselves; that rumor and chaos, unavoidable spawns of popular rule, would fuel an irresistible counterwave that would sweep them from power.

In fact, CIA destabilization strategies, both in Chile and in other Latin American nations, seem to draw directly from Maistre's restoration scenario, which relied on counterrevolutionary determination to generate dissension. Rumor acts as the cat's-paw for fear, poisoning commitment, corroding solidarity and forcing an acceptance of inevitable reaction. In Chile the CIA, in a cable dated September 17, 1970, set out a plan to

create the conviction that Allende must be stopped.... discredit parliamentary solution as unworkable...surface ineluctable conclusion that military coup is the only answer. This is to be carried forward until it takes place. However, we must hold firmly to the outlines or our production will be diffuse, denatured, and ineffective, not leaving the indelible residue in the mind that an accumulation of arsenic does. The key is psych war within Chile. We cannot endeavor to ignite the world if Chile itself is a placid lake. The fuel for the fire must come within Chile. Therefore, the station should employ every stratagem, every ploy, however bizarre, to create this internal resistance.

After the end of World War II, when demands for social democratic reform swept the continent, a series of coups and political betrayals successively radicalized and polarized social movements. The Old Left gave way to the New, and calls for reform climaxed into cries for revolution. By the late 1960s, Latin American military elites and their US allies knew, as Maistre knew two centuries earlier, that a simple changing of the guard would no longer be enough to contain this rising tide: "We are talking about mass public feeling as opposed to the private feeling of the elite," wrote the CIA about the intended audience of its "psych war" in Chile. The Latin American military regimes that came into power starting in the late 1960s combined terror and anti-Communist Catholic nationalism to silence this revolutionary roar. As Gen. Oscar Bonilla, who helped Pinochet install his seventeen-year dictatorship, put it, "What this country needs is political silence. We'll return to the barracks when we have changed the mentality of the people."

Patricia Politzer's Fear in Chile: Lives Under Pinochet recounts, through fifteen first-person testimonies gathered in the mid-1980s, while Pinochet was still in power, how his dictatorship did just that. By 1973, the United States had succeeded in its stated goal of extinguishing Chilean civility and igniting political passions. It seemed to many that their country had become ungovernable. Chronic shortages of basic goods, violent conflicts, political impasses and swirling rumors of coups and invasions wore Chileans down.

Nearly all of Fear in Chile's witnesses begin their accounts with the coup, and they all convey the exhaustion and confusion of the moment. Andrés Chadwick Piñera recounts his lonely sadness at hearing of Allende's death while his middle-class family, wife and neighbors celebrated. Sympathetic to the revolution, he burned his books and eventually made peace with the regime. Even the most committed became disoriented. Raquel, a student member of the Communist Party, recalls the uncertainty of revolutionary leadership, which told members to first do one thing, then another. Blanca Ibarra Abarca, a shantytown community leader, became "furious" after listening to Allende's radio message broadcasting news of the coup. She wanted "to do something, to fight," but was paralyzed by "pain and impotence." Manuel Bustos Huerta, president of his union, called a meeting but "no one knew anything...some people said we should go home, and others said we should take over the factory. Finally, after much discussion, we decided that people should go home." (Maistre wrote, nearly 200 years earlier, of how confusion would replace revolutionary resolve with resignation: "Everywhere prudence inhibits audacity.... On the one side there are terrible risks, on the other certain amnesty and probable favors. In addition, where are the means to resist? And where are the leaders to be trusted? There is no danger in repose.")

At times the polarization described by Politzer's witnesses seems absolute. While many wept upon hearing news of Allende's death, others bonded in anti-Communist solidarity: "Everyone from the block got together in a neighbor's house to celebrate.... Everyone brought something and it was a very joyous occasion."

But it is where the testimonies intersect, often at unexpected junctures, that Fear in Chile reveals just how deep and popular both the revolution and counterrevolution were. Blanca Ester Valderas and Elena Tesser de Villaseca recount radically different experiences and backgrounds. Valderas is a poorly educated rural woman whose husband was murdered in Pinochet's coup. Under Allende, after growing weary of following her husband through a series of dead-end jobs, Valderas joined the Socialist Party and was appointed mayor of her town. Even after the coup, when she was forced to change her name and go into hiding, she continued in politics, working with Chile's nascent human rights organizations. Tesser de Villaseca is a well-to-do "Pinochet diehard" who untiringly organized women to bring Allende down, even though she denies that either she or her husband is "political." Nor did she return home after Pinochet took power; instead Tesser de Villaseca and her friends threw themselves into myriad social welfare organizations aimed at making Chileans "a sound race again, to make the country healthy." Despite the different historical consequences of their actions, both women used politics as an avenue of upward mobility, to escape the restraints of family and to influence civic life.

In Costa-Gavras's movie Missing, which, while not mentioning Chile specifically, depicts Pinochet's coup, the first repressive act shown is of soldiers pulling a woman off a bus queue and cutting off her slacks, warning her that in the new nation, women do not wear pants. Many of the voices in Fear in Chile recall similar acts of violence: men who had their long hair shorn; women who were ordered to wear skirts; a worker who was arrested and tortured for being "an asshole" and not acting sufficiently submissive to authority. Notwithstanding Allende's supposed alignment with the Soviet Union and his threat to economic interests, acts like these illustrate that the real danger of the Chilean left was not that it undermined secular liberal democracy but that it promised to fulfill it, to sweep away the privilege and deference of patriarchy and class. "It was as if we had suddenly returned to a past era," recalls the wife of an Allende functionary in recounting her dealings with male military officers who, prior to the coup, she'd treated as friends and equals.

For many, Pinochet realigned a world that had spun out of control, and the power of Politzer's book is that it takes seriously the concerns of his supporters. Pinochet remained popular because he satiated the desire of many Chileans for both order and freedom. He haunts the pages of Fear in Chile like Maistre's powerful but distant sovereign, who "restrains without enslaving." As one of Pinochet's supporters put it, "I believe in a democracy in which certain general objectives are submitted to a vote; after that, each matter should be handed over to experts capable of realizing those objectives. In a family, for instance, where there is a health problem, you don't have a democratic vote about what steps to take."

It is this image of a family that is constantly invoked by followers of the regime to symbolize a just society, a family with Pinochet as the wise and strong father ("I adore Pinochet," says Tesser de Villaseca. "I adore him because he is a superhuman person who is also sensible and worthy") and his wife, Lucía, as the empathetic mother ("an extraordinary woman," says a Pinochet colonel, "who has created a volunteer corps in Chile that should be an example to the world. She's like a diligent little ant who works in different areas and also collaborates well with her husband").

Pinochet's success in generating a degree of popular legitimacy ultimately rested on violence and terror. By the time he left office, in 1990, his regime had arrested 130,000 people, tortured 20,000 others and, if the killing that took place during the coup is included, murdered between 5,000 and 10,000 Chileans. Fear not only led people to burn their books, drop out of politics, go into hiding and exile and switch allegiances, but allowed those who supported the government and dreaded a return to anarchy and conflict to justify murder: "I don't have any special knowledge about DINA [Pinochet's intelligence agency, responsible for a good deal of the terror], but if they were really out to find people working against democracy, people who didn't hesitate to kill to achieve their goals, I think what they were doing was good. I'm not one of those who don't believe that there were disappeared persons," says Carlos Paut Ugarte, an economist who returned to Chile following Allende's overthrow to work in Pinochet's government.

From Edmund Burke to Jeane Kirkpatrick, it has been the lie of modern counterrevolutionary thinkers that, against totalitarian abstractions, they defended historical actuality. The status quo is what should be, they say, and any effort otherwise leads straight to the guillotine or the gulag. But Pinochet's god, father and homeland were no less utopian and intangible than the just nation that Allende and Popular Unity hoped to build--the difference being that Pinochet had guns and the United States.

In his day Maistre was optimistic that restoration could be brought about with little violence. "Would it be argued," he asked, "that the return from sickness to health must be as painful as the passage from health to sickness?" Writing before the great counterinsurgency terrors of the nineteenth and twentieth centuries, he can be excused his sanguinity. But Korry, too, liked to draw on historical analogies to make his case, and he has no such excuse. "There is a graveyard smell to Chile," he wrote immediately after Allende's election, "the fumes of a democracy in decomposition. They stank in my nostrils in Czechoslovakia in 1948 and they are no less sickening today."

It is too bad Korry couldn't escape the prison of his own abstractions and draw a lesson from a more relevant historical referent: Indonesia in 1965, where anti-Communist government agents slaughtered, as the United States watched, hundreds of thousands of its citizens. After all, the analogy was not lost on the CIA, which dubbed Pinochet's coup "Operation Jakarta."

At work recently, I went to get a ham sandwich from the university cafeteria. I discovered, to my vocal dismay, that the well-loved food counter offering homemade fare had been torn out and replaced by a Burger King franchise. Questioned about this innovation, the head of "food services" insisted that it had been implemented in response to consumer demand. An exhaustive series of polls, surveys and questionnaires had revealed, apparently, that students and faculty were strongly in favor of a more "branded feel" to their dining environment.

It is worth pausing over the term "branded feel." It represents, I think, something profound: The presence of Burger King in the lunchroom is claimed to be a matter of affect. It addresses itself to "feelings," it meets a need that is more emotional than economic. This need has been identified, I was informed, by scientific and therefore inarguable means. The food-services honcho produced statistics that clearly indicated a compelling customer desire for bad, expensive food. According to his methodology, my protests were demonstrably elitist and undemocratic.

It is hardly news that opinion polls are frequently used to bolster the interests of those who commission them. But in recent years the notion that opinion can be measured in quantifiable terms has achieved unprecedented power and influence over public policy. The American penal system, for instance, has been rendered increasingly violent and sadistic as a direct response to opinion polls, which inform politicians that inhumane conditions are what voters desire. The thoughts and emotions of human beings are regarded as mathematically measurable, and the practical effects of this notion are now perceptible in the most mundane transactions of daily life.

This quantified approach to human nature is the result of the importation of theoretical economics into the general culture. Since the marginalist revolution of the late nineteenth century, neoclassical economists have rigidly confined their investigations within the methodological paradigm of positivist science, and they aspire in particular to the model of mathematics. Economists seek to produce empirically verifiable, statistical patterns of human behavior. They regard such studies as objective, unbiased and free of value-laden, superstitious presuppositions. The principle of "consumer sovereignty" hails this mode of procedure as the sociological arm of democracy, and it has made economics the most prestigious of the human sciences.

As David Throsby's Economics and Culture and Don Slater and Fran Tonkiss's Market Society show, the procedures of academic economists are now being further exalted to a position of dominant influence over everyday experience. Homo economicus is fast becoming equated with Homo sapiens. When airlines refer to passengers as "customers" and advise them to be "conservative with your space management," this development may seem trivial or comic. But in their very different ways, these books suggest that beneath such incremental cultural mutations there lurks an iceberg of titanic dimensions.

The Australian academic David Throsby is about as enlightened and humanistic as it is possible for a professional economist to be. He is also an accomplished playwright, and his influence on the political culture of his native land has been extensive and unvaryingly benign. He begins from the accurate supposition that "public policy and economic policy have become almost synonymous," and his intention is to rescue culture from the philistinism of businessmen and politicians who are incapable of lifting their eyes above the bottom line. It is a lamentable sign of the times, however, that he sees no other means of doing so than by translating aesthetic endeavor into quantifiable, economic terms. As he puts it, "If culture in general and the arts in particular are to be seen as important, especially in policy terms in a world where economists are kings, they need to establish economic credentials; what better way to do this than by cultivating the image of art as industry."

In order to cultivate this image, Throsby makes extensive if ambivalent use of the "rational-choice theory" derived from the work of Gary Becker. In Becker's opinion, the kinds of decision-making that economists contrive to abstract from the actions of people conceived as economic agents can be extrapolated to explain their behavior in areas of life that were once, romantically and unscientifically, thought of as lying beyond the arid terrain of rational calculation: love, for example, or aesthetic endeavor. This emboldens Throsby to ask whether we "might envisage creativity as a process of constrained optimisation, where the artist is seen as a rational maximizer of individual utility subject to both internally and externally imposed constraints," and to postulate "a measure...of difference in creativity (or 'talent'), in much the same way as in microeconomic analysis differences between production functions in input-output space measures differences in technology."
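For readers outside the discipline, the jargon is worth unpacking. In the generic form Throsby is borrowing (my paraphrase, not his notation), the artist solves

\[ \max_{x}\ U(x) \quad \text{subject to} \quad g_i(x) \le c_i, \qquad i = 1, \dots, n, \]

where U is the artist's utility, x a vector of creative choices and each g_i one of the internal or external constraints--time, money, talent--that bound them. On this view a sonnet and a production schedule differ only in the arguments of the function.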

There are enough caveats in Throsby's book to indicate a laudable reluctance to engage in this project; however, he evidently feels that the current climate of opinion leaves him no other choice. He is thus driven to apply the economic understanding of "value" to cultural phenomena, and to engage in a "consideration of culture as capital...in the economic sense of a stock of capital assets giving rise over time to a flow of capital services." Much of this book consists of a monomaniacal reinscription of life itself into the technical discourse of neoclassical economics. We are therefore subjected to lengthy discussions of "cultural capital" (formerly known as "culture"), "social capital" (a k a "society"), "physical capital" (née "buildings"), "natural capital" (alias "nature") and of course "human capital" (once referred to as "people"). There is, it seems, no limit to the colonizing potential of economics: "If broader cultural phenomena, such as traditions, language, customs, etc. are thought of as intangible assets in the possession of the group to which they refer, they too can be brought into the same framework."

We are faced here, essentially, with the quantification of all human experience. Not merely economic behavior but every aspect of life and thought can be expressed under the statistical rubric and studied in mathematical form. The notion of the "stakeholder," dear to Tony Blair, whose ambition to create a "stakeholder society" is overt and unapologetic, is fundamental to this project.

A stakeholder stands in relation to the world as a shareholder does to a corporation. He (or she) casts a cold eye on his surroundings and perceives only his "stake" in them; he rationally considers the means by which he may optimally maximize their benefits. The stakeholder, then, is not human. He is rather a quantified abstraction from humanity, a machine designed for the calculation of marginal utility. Good-hearted economists such as Throsby would retort that the stakeholder does not enjoy an empirical existence; he is merely a useful theoretical construct. Would that it were so. But in fact, as Hannah Arendt said of neoclassical economics' cousin, behavioral psychology: "The problem...is not that it is false but that it is becoming true."

There is an interesting convergence between rational-choice theory and the venerable tradition of socialist materialism. Both approaches insist that the real factor motivating human behavior is economic self-interest: that of an individual in the former case, and that of a social class in the latter. The British sociologists Don Slater and Fran Tonkiss address many of the same questions as Throsby in their book Market Society, but they view the conquest of intellectual and social life by economics from a more traditionally leftist perspective. Like Throsby, Slater and Tonkiss acknowledge that "market logic has come to provide a means of thinking about social institutions and individuals more generally," but instead of concluding that students of aesthetics must therefore incorporate economic concepts into their practice, they envisage a movement in the other direction. Today, they claim, "the economist's task of explanation is as much interpretive or hermeneutic as it is mathematical."

Slater and Tonkiss are influenced here by the "rhetorical turn" that economists such as Deirdre McCloskey have recently attempted to introduce into their discipline. The increasingly abstract nature of money, it is claimed, lays bare the fact that financial value, like semiotic meaning, is an imaginary and therefore arbitrary mode of signification. As such, money can be studied using terms and concepts drawn from rhetoric and literary criticism. (An amusing parody of this idea occurs in Will Self's novel My Idea of Fun, which features a "money critic" whose job is to pontificate about the aesthetic qualities of various forms of finance.) Slater and Tonkiss present this as an appealing reversal of intellectual roles: "Whereas the central preoccupation of critical social analysis has traditionally been the way in which economic rationality dominates culture, contemporary social theory has been increasingly concerned with the central role of cultural processes and institutions in organizing and controlling the economic."

Although their emphasis is different, Slater and Tonkiss's argument leads to the same essential conclusion as Throsby's: It no longer makes sense to distinguish between "economics" and "culture," or between "the market" and "society." In practice, it makes little difference whether one regards this as an incursion of aesthetics into economics or vice versa. Indeed, Slater and Tonkiss are a good deal more pessimistic than Throsby about the consequences of this development. To their credit, they are willing and able to introduce into the discussion concepts like "commodification" and "alienation," from which even liberal economists like Throsby recoil in horror. But they stop well short of the bleak dystopianism of Adorno, and their slightly anodyne conclusion is that "markets are not simply good or bad, because they are highly variable." This pluralism is forced upon them, because their book is intended as a historical survey of various theoretical approaches to the market: Market Society provides admirably lucid and meticulously fair readings of Smith, Ricardo, Durkheim, Simmel, Weber and Polanyi. Despite its historical approach, the most beguiling feature of the book is that its treatment of such past thinkers is undertaken with a prominent sense of our present predicament.

Discussing the economist whose theories have had the greatest influence on that predicament, Slater and Tonkiss remind us that "Hayek held that ultimately there were no economic ends as such; economic action always served ends that were non-economic in character because needs and desires are exogenous (or external) to the market setting." But to say that there are no economic ends is the same as to say that there are only economic ends. It is, in other words, to abolish any distinction between the economic and the noneconomic. Toward the end of Economics and Culture, Throsby observes that "in primitive societies...culture and economy are to a considerable degree one and the same thing." By this definition, as each of these important and timely books suggests, our society may be the most primitive of all. Can anyone, today, escape the "branded feel"?
