
Earthsea, Ursula Le Guin's magical world of islands and archipelagoes, is going through a period of intense, uncomfortable social change. The old ways no longer work and the new ones are not yet clear. At last there is a central government, though its young head of state is still establishing his authority, and it's bumpy going in the wild Kargish lands, where the religion, language and ethnicity are different and the women wear burqas. He has also encountered some resistance from the college of wizards at Roke, a theocratic caste that has ruled for centuries and become rather stiff and doctrinaire, as well as hateful toward women. Now Earthsea has suddenly been plunged into turmoil by two simultaneous assaults. One is an invasion of the collective unconscious by the voices and images of the dead, who beg to be set free from the dry land behind the wall of stones where they are confined. The other comes in fire from the skies, as dragons zoom in from the west to attack farm and forest. What is the reason for these threats? Are they connected? And does this society have what it takes to meet them?

Such are the themes of Ursula Le Guin's two new Earthsea books, Tales From Earthsea and The Other Wind: the boundary between life and death, terror from the sky and how hard it is for male-dominant societies to listen to women. Timely themes, from an acknowledged master not only of fantasy but of science fiction as well, a feminist, anarchist and Green whose books are taught in universities, and who has won many literary prizes (five Nebulas, five Hugos, the National Book Award for children's literature, a Newbery silver medal, the Horn Book Award). In a country that valued wisdom and symbolic thinking, these two books would have been met with hosannas from coast to coast.

Does it matter that they weren't? I think so. To me, Le Guin is not only one of the purest stylists writing in English but the most transcendently truthful of writers. The books she writes are not true in the way facts are true; they speak to a different kind of truth and satisfy a desire for narrative that is so fundamental it must be in our cells. As she puts it:

The great fantasies, myths, and tales are indeed like dreams: they speak from the unconscious to the unconscious, in the language of the unconscious--symbol and archetype. Though they use words, they work the way music does: they short-circuit verbal reasoning, and go straight to the thoughts that lie too deep to utter. They cannot be translated fully into the language of reason, but only a Logical Positivist, who also finds Beethoven's Ninth Symphony meaningless, would claim that they are therefore meaningless. They are profoundly meaningful, and usable--practical--in terms of ethics; of insight; of growth.
      "The Child and the Shadow" (1975), in The Language of the Night

Le Guin wrote the first three Earthsea books thirty years ago. A Wizard of Earthsea (1968) is the coming-of-age story of the boy Ged, who meddles in forbidden lore and summons up a rough, bearlike Shadow, who attacks and nearly kills him. The rest of the book concerns Ged's struggle to understand this Shadow, so strong it could bring destruction to the world unless he can defeat it. What is this rough beast? Why does it increasingly resemble him?

The second Earthsea book, The Tombs of Atuan (1970), takes place in the Kargish lands, which are separate from and more primitive than the rest of Earthsea. It is the story of Tenar, who as a small child became Arha, the Eaten One, priestess of the tombs of Atuan, ruled by the old earth powers of death, blood and brooding revenge. Into this dark underground labyrinth comes Ged, looking for the ring of Erreth-Akbe, which bears a lost rune of peace that can bring about a new era. Injured, starving, trapped, he is not strong enough to fight the old earth powers and escape unless Tenar helps him. Her entire upbringing urges her to kill him, but he is the first man she has ever seen as well as the first wizard, and she is tempted. In the end, she chooses life and escape, seeing that, by freeing him, she can also free herself. But then what? Where can she go once she is free?

Although Le Guin has been heavily influenced by Tolkien, her cosmology differed from his from the beginning. While both write of lands ruled by magic, Tolkien's Middle-earth has states and civil society; Earthsea has principalities but is more or less ruled by a caste of celibate priest-wizards centered on the Island of Roke, whose inborn mastery has been schooled at the college. In Earthsea, power of this kind is based on the Language of the Making, which is also the language of dragons--only they are born knowing it; men have to learn it. Names in the Language of the Making are the thing, and knowledge of them confers power, over nature and over other people. A wizard who knows someone's true name can control him. But mature wizards do not use their power any more than they have to, for the ruling principle of Le Guin's world is not Tolkien's struggle between good and evil, but equilibrium, balance. Earthsea is a Taoist world (Le Guin has actually translated the Tao Te Ching), where light and dark, life and death are yin and yang, intertwined rather than opposed. The world gets out of balance when one side of an opposition gets too strong: light, wizardry, men. When men of power use their knowledge to fence themselves off from the dailiness of ordinary life--farming, mending, giving birth, and women--trouble is coming. Such hubris can lead to denial of death itself. It does in the third book, The Farthest Shore (1972).

The Farthest Shore begins with the inexplicable: magic, the organizing principle of Earthsea, is failing and no one knows why. Gradually it becomes clear that a destroyer has arisen, a terribly powerful wizard, Cob, who awakens the terror of death while promising immortality to any who will follow him. His followers drift in dumb despair, work ceases and meaning drains out of the world. Ged, now Archmage (head of the wizards' council), and his young disciple Lebannen, destined to be the long-awaited king, must trace this peril to its source and defeat it. To do so, they must cross the wall of stones into the dry land, the land of death, where no wind blows, no sun shines, and people, still trapped in the prison of their names, wander forever, unable even to recognize those they once loved.

Through many perils Ged and Lebannen seek the physical entrance to the dry land but can only find it when aided by dragons. The plague of despair has affected the dragons too; their young are killing one another and drowning themselves in the sea, and even the wisest are in danger of losing their language and themselves. After a hard pursuit and struggle in the dry land, Lebannen and Ged together defeat Cob and Ged reseals the gap between life and death. But in doing so, he drains his own power; he is no longer a wizard, no longer strong enough even to walk. Lebannen must carry him over the Mountain of Pain, which is the only exit from the dry land, to the beach, where the dragon Kalessin, the eldest, awaits them. Now that Ged has lost his power, he can no longer be Archmage; Kalessin flies him on past Roke to his home island of Gont. But Lebannen will be crowned king and bring about a new era under the rune of peace that Ged and Tenar brought from underground so many years before.

So ends Le Guin's third Earthsea book. She thought it was the last. Then, twenty years later, she suddenly wrote a sequel, Tehanu (1990). I interviewed her at that time and asked her why. She said she had to tell what happened to Tenar. She had tried to earlier but couldn't; she was too caught in the tradition of heroic male fantasy to be able to figure out what would happen to a woman in a Tolkien world. "That is why I had to write this fourth volume, because I changed. I had to show the other side."

But what is the other side of heroic male fantasy? The answer is not as simple as flipping a coin with King Arthur on one side, Britomart on the other. Traditionally there are only four possible roles for women in this sort of book: absent beloved, evil witch, damsel in distress and girl warrior. Can one make room for real women without undermining the fundamental premises of the genre?

From Le Guin's practice, it would appear not. Tenar became a farmer's wife because...what else can she do on Gont? This is farm country, after all, and while she has some kind of power, it is not the kind of power of which wizards are made. Even if it were, they would never train her on Roke, where the wizards have the kind of attitude toward women one tends to find in celibate priesthoods. A widow now, Tenar has adopted Therru, a little girl who was beaten and raped and almost burned up in a fire by her parents, so that one of her arms is withered and one whole side of her face is a hardened shell of scars. Therru too has some kind of power but nobody knows what it is. Tehanu begins where The Farthest Shore ends, as the dragon Kalessin delivers Ged into Tenar's care. Tenar has always loved him, and the two finally get together, overcoming his lifelong celibacy and his shame at having lost his power. But peril persists from those who followed the destroyer and, at the end, they can be saved only by the little burnt girl Therru, who calls the dragon back in the Language of the Making, a language she has never been taught. "Tehanu," he names the child, and calls her daughter. We are left wondering, how can this damaged, tormented little girl also be a dragon?

After eleven more years, Le Guin answered that question with Tales From Earthsea and The Other Wind, which do more than undermine the conventions of heroic male fantasy; they retrospectively transform the very history she created in the first three Earthsea books. There are five stories in Tales From Earthsea, but the central one is "Dragonfly." Dragonfly is a big, ungainly country girl, whose real name is Irian. Like Tenar and Tehanu, she has some kind of power nobody can exactly name. She knows she isn't like other people and wants to find out what she is. Finally she encounters somebody willing to take her to Roke to find out. But when she gets there, she comes up against a wall. In the absence of an Archmage, Roke has become factionalized. Thorion, the Summoner, had followed Ged and Lebannen into the dry land. He stayed there too long and was thought dead; now he has somehow returned to life, by the power of his will, and seeks to rule, to become Archmage and preserve the old ways. He says no woman can be admitted into the school on Roke; Irian must leave the island. The wizards are divided; the Master Patterner, Azver, lets her stay with him in the Immanent Grove, and begins to love her. Yet he, like the few others who are willing to deal with her, seems paralyzed; none of them have the strength to stand against the dead man, Thorion, and those who follow him. So when Thorion finally comes to throw Irian off the island, she must defend herself. She challenges Thorion to meet her on Roke Knoll, the hill where things can only be what they truly are:

The air was darkening around them. The west was only a dull red line, the eastern sky was shadowy above the sea.
      The Summoner looked up at Irian. Slowly he raised his arms and the white staff in the invocation of a spell, speaking in the tongue that all the wizards and mages of Roke had learned, the language of their art, the Language of the Making: "Irian, by your name I summon you and bind you to obey me!"
      She hesitated, seeming for a moment to yield, to come to him, and then cried out, "I am not only Irian."
      At that the Summoner ran up towards her, reaching out, lunging at her as if to seize and hold her. They were both on the hill now. She towered above him impossibly, fire breaking forth between them, a flare of red flame in the dusk air, a gleam of red-gold scales, of vast wings--then that was gone, and there was nothing there but the woman standing on the hill path and the tall man bowing down before her, bowing slowly down to earth, and lying on it.

When the others come up to him, he is only "a huddle of clothes and dry bones and a broken staff." Aghast, they ask Irian who she is. She says she does not know. "She spoke...as she had spoken to the Summoner, in the Language of the Making, the tongue the dragons speak." She says goodbye to Azver, whom she loves, touching his hand and burning him in the process, then goes up the hill.

As she went farther from them they saw her, all of them, the great gold-mailed flanks, the spiked, coiling tail, the talons, the breath that was bright fire. On the crest of the knoll she paused a while, her long head turning to look slowly round the Isle of Roke, gazing longest at the Grove, only a blur of darkness in darkness now. Then with a rattle like the shaking of sheets of brass the wide, vaned wings opened and the dragon sprang up into the air, circled Roke Knoll once, and flew.

The Other Wind continues this theme of women who are also dragons, and plays it off against another central theme of these books, the relationship between life and death. For the terrible breach between life and death made by Cob continues. Now the dead have started appearing to the living in dreams, coming to the stone wall at the dry hill, begging to be set free, as if death were a prison. And at the same time, wild dragons have come to take back the land from men; they have come even to Havnor, where the young king, Lebannen, holds court under the rune of peace. All the patterns, clues and oppositions, set up over thirty years in five other books, come to fruition and are worked out in The Other Wind, but the book is so dependent on what came before, so complex, it is impossible to explicate here. It must be read--after the others--then thought on long and hard, for its meanings are not immediately manifest.

Long after reading, certain images stay in the mind. One is the dry land, this prison of death, and its relationship to immortality through the mastery of naming, of language. Another is women who are also dragons, who can find no place here on earth but must fly off beyond the west, on the other wind. Irian, excluded by the men of power, with only a few defenders, who are outnumbered and outweighed by the dead hand--there's plenty of resonance here for any woman who ever found herself a little bit too far ahead of the affirmative-action curve. As far as gender goes, these books seem to me a true symbolic picture of where we are now, with no untainted source of male power, no mature authoritative leadership of any kind, caught midway in our evolution as social beings, still trying to struggle up out of the ooze onto the land, no longer tadpoles and not yet frogs.

Science fiction and heroic fantasy began as the province of men, and the gradual entry of women into these genres has not necessarily produced more psychological depth overall. The best writers (including Octavia Butler, Samuel Delany, Neil Gaiman, Kim Stanley Robinson, Joanna Russ and Le Guin herself) have given us complex re-visionings of gender and power relations. But most writers have ambitions no higher than those of their counterparts who write in other commercial genres like espionage, crime or romance.

That is why Tales From Earthsea and The Other Wind are cause for celebration: they are books by a master stylist writing at the height of her powers. Although plenty of mass market fantasy is written in extremely pedestrian prose, style is key in fantasy, as in poetry. For fantasy is a pure creation of the imagination, summoned unto existence by the language of the making. Le Guin's style is as spare, plain, American and transparent as a northern lake: no tricks, no razzle-dazzle, no lists. "Why," she asks in an early essay, "is style of such fundamental significance in fantasy?"

because in fantasy there is nothing but the writer's vision of the world. There is no borrowed reality of history, or current events, or just plain folks.... There is no comfortable matrix of the commonplace to substitute for the imagination, to provide ready-made emotional response, and to disguise flaws and failures of creation. There is only a construct built in a void, with every joint and seam and nail exposed. To create what Tolkien calls "a secondary universe" is to make a new world. A world where no voice has ever spoken before; where the act of speech is the act of creation. The only voice that speaks there is the creator's voice. And every word counts.
      "From Elfland to Poughkeepsie" (1973)

If Le Guin is such a master and these books are so good, why have they been smuggled into the bookstore, largely unnoticed except in the professional reviewing periodicals? To understand the answer to this question, one must look at how genre is viewed in America and at the tyranny of contemporary realism in literary fiction.

Until the triumph of capitalism in the nineteenth century, the source of literature was thought to be the imagination, and the realistic novel was considered an inferior form, earthbound, compared to poetry, drama and the epic. In Shakespeare, Spenser and Milton, and even in the later, more contested work of the Brontës, Hawthorne and Melville, psychological realism exists in happy symbiosis with ghosts, fairies, demons and supernatural whales. With the triumph of capital and its handmaidens, science and rationalism, came a changed aesthetic. By the mid-twentieth century, the realistic novel of contemporary life had become so much the norm for serious fiction, at least in the United States, that anything else was trivialized or confined to a genre ghetto. We are, after all, a country run by hardheaded men who know the value of a dollar and who want no truck with moonshine. Many boast that they never read fiction. In such a culture, "magic realism" was acceptable only because it was imported; exceptions are always allowed for foreign luxury goods.

So strong was the idea that serious fiction must be a realistic picture of the present time that in the 1960s, when American novels began to combine some aspects of contemporary realism with monsters, ghosts, bodily organs run amok and other wild fancies (Ellison, Heller, Pynchon, Roth, Morrison), the writers were still considered realists or else given special dispensation as African-Americans, who, like foreigners, could be allowed their own cultural traditions because they were too marginal to threaten the mainstream aesthetic and politics. Living writers whose work was not grounded in a realistic, contemporary premise were relegated to the nursery or confined to special ghettos in the bookstore (historical fiction, science fiction, romance, fantasy), as though disqualified by genre from being shelved with "literature."

But surely this does not apply anymore; isn't this the Age of Harry Potter, when fantasy is king? Not exactly. It depends what sort of fantasy, and why. How different are the Harry Potter books really, in style and substance, from contemporary realism? Are they not parodies of same, combining realistic conventions with magical appliances and the war between good and evil? Is this parodic incongruity not, in fact, the reason they are so much fun? From the pinstriped cloak worn by the Minister of Magic to the disgusting variety of Bertie Bott's Every Flavor Beans, the culture of the Harry Potter books is a faithful reflection of English schoolboy culture, including the cliques and teasing of the boarding school books that have molded generations.

And have they been treated seriously, as literature, or as a marketing phenomenon?

I would guess 90 percent of the articles I have read about J.K. Rowling deal with her not as a writer but as the commercial equivalent of a comet whizzing into the atmosphere from out of nowhere, a poor single mum writing her first book in a Scottish cafe. It's a great story, but you can only be a nine days' wonder once. After the novelty wears off, the commercial pressure remains; you are expected to do the same thing again and again and again, varying it no more than one flavor of yogurt varies from another. Every successful writer is faced with this choice: Do you stay faithful to the inner voice or turn yourself into a marketable commodity, producing a new product of the same kind every year or two? There are great social and economic rewards for the commodity production of the self.

Ursula Le Guin is doing something different. She has gone her own way, written forty books, not one of them either predictable or commercially motivated. She probably drives the industry crazy; it doesn't even know whether to classify the Earthsea books as children's or adult literature. In her foreword to Tales From Earthsea, she has some interesting things to say about commodification and why we read fantasy:

All times are changing times, but ours is one of massive, rapid moral and mental transformation.... It's unsettling. For all our delight in the impermanent, the entrancing flicker of electronics, we also long for the unalterable.... So people turn to the realms of fantasy for stability, ancient truths, immutable simplicities.

And the mills of capitalism provide them. Supply meets demand. Fantasy becomes a commodity, an industry.

Commodified fantasy takes no risks; it invents nothing, but imitates and trivializes. It proceeds by depriving the old stories of their intellectual and ethical complexity, turning their action to violence, their actors to dolls, and their truth-telling to sentimental platitude. Heroes brandish their swords, lasers, wands, as mechanically as combine harvesters, reaping profits. Profoundly disturbing moral choices are sanitized, made cute, made safe. The passionately conceived ideas of the great story-tellers are copied...advertised, sold, broken, junked, replaceable, interchangeable.

Le Guin's writing is on the edge, which is perhaps the same as the margins: idiosyncratic and hard to pin down. She is the kind of writer businessmen hate most, producing challenging, unpredictable books whose meanings are too elusive to be easily controlled. I can almost hear them saying, "No Earthsea books since 1990 and now two books in the same year? Hasn't she heard of regular marketing intervals?"

Unlike Le Guin's science fiction, her fantasies are not overtly political. The two genres have become almost interchangeable at the mass market level, but have different parents: science fiction derives from Victorian scientific speculation by writers like Conan Doyle and H.G. Wells, while fantasy grew out of myth. Le Guin's science fiction is about social and political life; some reads like ethnographies of imaginary societies, some deals with revolution. Because of its social themes, it appears more political than her fantasies, which deal with the inner life.

Nonetheless, the Earthsea books are profoundly radical because they lead one to think and feel outside of regular realistic patterns and the details of everyday life, laying depth charges that bring up long-forgotten reveries of childhood, unrecognized forms of heroism, secret challenges to power. Softly, elusively, they tear away at the wall of stones that keeps us in the dry land, the arid land of adulthood, the land of death-in-life, where so many of us spend so much of our time; they let the wind into our imaginations, and help to set us free.

Bush says he'd take a thousand whacks
Before he'd let the income tax
Of all the richest people go back up.
Remember Daddy's tax-hike fix.
An old dog may not learn new tricks,
But demonstrates the flubbed ones to his pup.

Nike-Zeus, Nike-X, Sentinel, Safeguard, Star Wars, X-ray lasers, space-based neutron particle beams, Brilliant Pebbles, Ground-Based Midcourse National Missile Defense, Midcourse Defense Segment of Missile Defense. Over the past fifty years America has poured approximately $100 billion into these various programs or efforts to shield the country against long-range ballistic missiles. Yet not one has worked. Not one. Nevertheless, except for the constraints imposed by his own "voodoo economics," President George W. Bush appears poised to pursue the development and deployment of a layered missile defense--as a hedge against more failures--that would force taxpayers to cough up as much as another $100 billion. In December Bush formally notified Russia that the United States was withdrawing from the 1972 Anti-Ballistic Missile treaty in order to "develop ways to protect our people from future terrorist or rogue state missile attacks."

Russian President Vladimir Putin labeled Bush's decision a "mistake," a mild reaction that should not disguise the fact that much of Russia's political elite is seething at the withdrawal. Already smarting from America's broken promise not to expand NATO and from the US-led NATO bombing of Yugoslavia in 1999 (which violated the 1997 "Founding Act" between Russia and NATO), that elite views the coincidence of America's success in Afghanistan (obviating the need for further Russian assistance) and its withdrawal from the ABM treaty as yet further evidence of American duplicity.

President Clinton diplomatically explained the Republicans' obsession with missile defense when he observed: "One of the problems they've got is, for so many of their supporters, this is a matter of theology, not evidence. Because President Reagan was once for it, they think it must be right, and they've got to do it, and I think it makes it harder for them to see some of the downsides." That's a nice way of saying that the conservative wing of the Republican Party abounds with missile-defense wackos. I've participated personally in two missile-defense conferences and was astounded by their right-wing, faith-based atmospherics.

Which is why Bradley Graham's engaging narrative of politics and technology during the Clinton years, Hit to Kill: The New Battle Over Shielding America From Missile Attack, seems destined for popular success, notwithstanding its serious conceptual limitations. Graham ably recounts the excessive exuberance of Republicans as they schemed to realize their missile-defense dreams. But he is equally critical of the Clinton Administration's attempt to actually build a missile defense: its "three-plus-three" ground-based midcourse program.

Offered in the spring of 1996, in part to undercut the Republicans, "three-plus-three" provided for three (or four) years of development, after which, if then technologically feasible and warranted by a threat, there would be deployment within another three years. In early 1998, however, a sixteen-member panel, led by retired Air Force chief of staff Larry Welch, condemned the plan as a "rush to failure."

But two overdramatized events later that year demanded even greater urgency. In July, the Commission to Assess the Ballistic Missile Threat to the United States, led by Donald Rumsfeld, asserted that America's intelligence agencies had woefully underestimated the capability of "rogue" regimes, such as those leading North Korea and Iran, to attack US territory with ballistic missiles within five years. It concluded: "The threat to the United States posed by these emerging capabilities is broader, more mature, and evolving more rapidly than has been reported in estimates and reports by the intelligence community."

When North Korea subsequently launched a three-stage Taepodong 1 missile past Japan in August 1998, many Americans put aside not only their qualms about the role Representatives Curt Weldon and Newt Gingrich had played in creating the commission, but also their suspicions about the blatantly pro-missile defense bias of most of its members. Although Graham generally portrays the commission's deliberations as unbiased, he does provide evidence that some of its briefers were not.

For example, one intelligence official betrayed visible irritation during his briefing of commission members, prompting General Welch to ask, "You're not happy to be here, are you?" The official replied, "No, I'm not. I'm ticked off that I have to come down and brief a bunch of wacko missile-defense advocates." His outburst infuriated Rumsfeld, who "stalked" out of the room.

Nevertheless, Rumsfeld's report and the launch of North Korea's missile frightened Americans and galvanized Republicans. Graham's investigative reporting gets inside the subsequent political war waged against a Clinton Administration that, itself, was slowly awakening to the possibility of a more imminent ballistic missile threat.

Graham brings an open mind to the hotly disputed technological merits of missile defense. Nevertheless, he cannot avoid the conclusion that George W. Bush's decision to expand missile defense beyond Clinton's ground-based midcourse program constitutes an acknowledgment that, after fifty years, "military contractors had yet to figure out how best to mount a national missile defense."

In theory, a ballistic missile can be intercepted during its comparatively slow, if brief, "boost phase," before its "payload"--warheads, decoys and debris--is released. Speed is of the essence during the boost phase. So is proximity to the target. According to Philip Coyle, former director of the Pentagon's Office of Operational Test and Evaluation, "The process of detection and classification of enemy missiles must begin within seconds, and intercept must occur within only a few minutes. In some scenarios, the reaction time to intercept can be less than 120 seconds."

Compounding concerns about boost-phase intercepts are questions about the ability of an interceptor to distinguish quickly between a missile's flame and the missile itself. Finally, boost-phase missile-defense platforms would invite pre-emptive attacks against those platforms by any state bold (and foolish) enough to launch ballistic missiles.

The "terminal phase" of ballistic missile flight is the final minute or two when the payload re-enters the atmosphere. Detection of the warhead is comparatively simple, but designing a missile fast enough to catch it and hit it--given the problems associated with sensor degradation in intense heat--is extremely difficult. Countermeasures, such as maneuvering capability or precursor explosions, would further complicate defensive efforts. Finally a terminal-phase missile defense can, by definition, protect only a limited area, perhaps one city. Thus, many such systems would be required.

The "midcourse phase" of ballistic missile flight is the period during which the payload is dispersed in space. It remains there more than 80 percent of the missile's total flight time. The Clinton Administration's ground-based midcourse program (continued by the Bush Administration) is designed to strike the warhead in space with a high-speed, maneuverable kill vehicle--thus Graham's title: Hit to Kill.

Easily the most developed of all programs, as recently as December 3, 2001, the midcourse program demonstrated the awesome technological feat of destroying a warhead hurtling through space--hitting a bullet with a bullet. Yet such a feat constitutes but the commencement of an arduous technological journey, not its endpoint.

As a "Working Paper" issued recently under the auspices of the Union of Concerned Scientists noted, America's ground-based midcourse program has not been subjected to real-world tests. Five hit-to-kill tests have resulted in three hits. But each test: (1) used identical test geometrics (the location of launches, trajectories of target and interceptor missiles); (2) released the same objects (payload bus, warhead and decoy); (3) occurred at the same time of day; (4) made the lone decoy obviously and consistently different from the warhead; (5) told the defense system what to look for in advance; (6) attempted intercept at an unrealistically low closing speed; (7) kept the target cluster sufficiently compact to aid the kill vehicle's field of view; and (8) provided the kill vehicle with unduly accurate artificial tracking data.

Any ground-based midcourse missile defense system has to contend with virtually insurmountable countermeasures, especially the decoys that, in space, are quite indistinguishable from the warheads. Yet the three successful hits did not have to contend with even the countermeasures that a missile from a "rogue" regime would probably employ.

A National Intelligence Estimate in 1999 determined that "countermeasures would be available to emerging missile states." In April 2000 a "Countermeasures" study group from the Union of Concerned Scientists and the MIT Security Studies Program concluded: "Even the full [National Missile Defense] system would not be effective against an attacker using countermeasures, and an attacker could deploy such countermeasures before even the first phase of the NMD system was operational." Consequently, "it makes no sense to begin deployment."

Craig Eisendrath, Melvin Goodman and Gerald Marsh (Eisendrath and Goodman are senior fellows with the Center for International Policy in Washington; Marsh is a physicist at Argonne National Laboratory) state the problem even more starkly in their recent book The Phantom Defense: America's Pursuit of the Star Wars Illusion: "This is the bottom line: the problem isn't technology, it's physics. Decoys and warheads can always be made to emit almost identical signals in the visible, infrared, and radar bands; their signatures can be made virtually the same."

If such information troubles Defense Department officials responsible for missile defense, they seldom admit it publicly. However, they're not nearly as irresponsible as the political and "scholarly" cheerleaders who remain unmoved by a half-century of failure and the physics of countermeasures. I encountered one of them last June at a missile defense conference in King of Prussia, Pennsylvania.

Representative Weldon delivered the conference's keynote address to more than 220 participants from the Defense Department, the military industry, think tanks, various universities and the press. Weldon is the author of HR 4, legislation that made it "the policy of the United States to deploy a national missile defense." (Senator Carl Levin was able to add amendments to the Senate bill on missile defense that made the program dependent upon the annual budget process and tied it to retention of the ABM treaty; Weldon referred to the amendments as cowardice. Nevertheless, they remained in the Missile Defense Act that President Clinton signed on July 22, 1999.)

Weldon told the audience that the United States requires a missile-defense system to protect its citizens from an intentional missile attack by a "rogue" regime presumably undeterred by the prospect of an overwhelming American nuclear retaliation. He even displayed an accelerometer and a gyroscope, Russian missile components allegedly bound for a "rogue." He then displayed an enlarged, poster-size photograph of Russia's SS-25 ICBM. Russia possesses more than 400 such missiles, he asserted, and any one of them might be launched accidentally against the United States, given Russia's deteriorating command and control capabilities.

It was a "no-brainer." Both threats demanded that America build a national missile defense system, capable of intercepting such missiles, as soon as possible.

However, when I asked Congressman Weldon to shift from the SS-25 and contemplate whether his modest missile-defense system could prevent the penetration of an accidentally launched TOPOL-M ICBM from Russia, he responded, "I don't know. That's a question you should ask General Kadish during tomorrow's session." Extending the reasoning, I asked Weldon whether his modest missile-defense system could shield America against a missile, launched by a rogue regime, that was capable of TOPOL-M countermeasures. Weldon again answered that he did not know. But rather than let such doubts linger at a conference designed to celebrate missile defense, Kurt Strauss, director of naval and missile defense systems at Raytheon, rose to deny that Russia possessed such countermeasures.

Presumably, Strauss was unaware of the work of Nikolai Sokov, a former Soviet arms control adviser and author of Russian Strategic Modernization: Past and Future. Sokov claims that the TOPOL-M features a booster intended to reduce the duration and altitude of the boost phase, numerous decoys and penetration aids, a hardened warhead and a "side anti-missile maneuver."

Strauss's uninformed denial hints at a much bigger problem, however: the prevalence of advertising over objectivity in a society where the commercialization of war and the cult of technology have reached historic proportions. In The Pursuit of Power historian William McNeill traces the commercialization of war back to mercenary armies in fourteenth-century Italy, pointing out the "remarkable merger of market and military behavior." And Victor Davis Hanson, in Carnage and Culture, sees much the same reason behind the decimation of the Turkish fleet, some two centuries later, by the Christian fleet at Lepanto--"there was nothing in Asia like the European marketplace of ideas devoted to the pursuit of ever more deadly weapons." McNeill concludes that "the arms race that continues to strain world balances...descends directly from the intense interaction in matters military that European states and private entrepreneurs inaugurated during the fourteenth century."

Post-cold war America, virtually alone, luxuriates in this dubious tradition. Yet it was no less than Dwight Eisenhower who warned America in his farewell address: "This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence--economic, political, even spiritual--is felt in every city, every Statehouse, every office of the federal government."

Who could have been surprised, then, when Matthew Evangelista conclusively demonstrated, in Innovation and the Arms Race (1988), that commercial opportunities within America's military-industrial complex, much more than any Soviet threat, propelled innovation--and, thus, most of the arms race with the Soviet Union? A year later, the highly respected defense analyst Jacques Gansler identified the uniquely American "technological imperative" of commercialized warfare: "Because we can have it, we must have it." Such impulses caused the United States to run profligate arms races with itself both during and after the cold war. They also explain America's post-cold war adherence to cold war levels of military expenditures and, in part, our missile-defense obsession today.

This technological imperative had its origins in America's "exceptional" historical experience, which it continues to serve. Indeed, so the argument goes, Why should a country on a mission from God sully itself with arms control agreements and other compromises with lesser nations, when its technological prowess will provide its people with the invulnerability necessary for the unimpeded, unilateral fulfillment of their historic destiny?

Such technological utopianism, however, has its costs. In their book The Dynamics of Military Revolution, 1300-2050, MacGregor Knox and Williamson Murray demonstrate the very secondary role that technology has played in past military revolutions. They conclude: "The past thus suggests that pure technological developments without the direction provided by a clear strategic context can easily lead in dangerous directions: either toward ignoring potential enemy responses, or--even more dangerously--into the dead end, graphically illustrated by the floundering of U.S. forces in Vietnam, of a technological sophistication irrelevant to the war actually being fought." (In Hit to Kill, Graham has little to say about military strategy or the commercialization of warfare.)

In hawking a missile defense shield, Representative Weldon traveled in the first dangerous direction when he assured the defense conferees that although Congress was not ignoring the threat posed by terrorists with truck bombs, "when Saddam Hussein chose to destroy American lives, he did not pick a truck bomb. He did not pick a chemical agent. He picked a SCUD missile.... The weapon of choice is the missile."

Unfortunately, on September 11, America learned that it is not.

Potentially worse, however, is the Reaganesque theology propelling the Bush Administration's decision to withdraw from the 1972 Anti-Ballistic Missile treaty. Putting aside the question of whether withdrawal requires formal Congressional approval and other questions of international relations, one must ask why any administration would destroy the cornerstone of strategic stability. The ban on national missile defenses not only prevents a defensive arms race but also obviates the need to build more offensive missiles to overload the enemy's defenses. Why would a country withdraw from the ABM treaty without knowing whether its own missile-defense system will even work, and before conducting all the tests permitted by the treaty that would provide greater confidence in the system's ultimate success?

Readers of Keith Payne's recent book The Fallacies of Cold War Deterrence and a New Direction might guess the probable answer. Payne, chosen by the Bush Administration to help shape the Defense Department's recently completed but still classified Nuclear Posture Review, writes about a new, post-cold war "effective deterrence," to which even an imperfect missile-defense system might contribute: "In the Cold War, the West held out the threat of nuclear escalation if the Soviet Union projected force into NATO Europe; in the post-Cold War period it will be regional aggressors threatening Washington with nuclear escalation in the event the United States needs to project force into their regional neighborhoods.... In short, Washington will want effective deterrence in regional crises where the challenger is able to threaten WMD [weapons of mass destruction] escalation and it is more willing to accept risk and cost."

The real concern, then, is less about protecting America from sneak attacks by rogue states ruled by madmen, and more about preserving our unilateral options to intervene throughout much of the world. Thus, President Bush's speech at The Citadel in December was disingenuous. His rhetorical question asking what if the terrorists had been able to strike with a ballistic missile was primarily an attempt to steamroller frightened Americans into supporting missile defense. The speech simply seized upon the wartime danger to compel a military transformation that has been debated for almost a decade and resisted by the services and the military industry since the beginning of Defense Secretary Rumsfeld's tenure.

Lest we forget, China hasn't disappeared either. Its muted criticism of America's withdrawal from the ABM treaty was accompanied by a call for talks to achieve "a solution that safeguards the global strategic balance and doesn't harm international efforts at arms control and disarmament." Failing such talks, China may feel compelled to increase its offensive arsenal to insure penetration of an American missile defense, which could provoke India, and consequently Pakistan--perhaps rekindling tensions that have already brought them to the brink of war.

Russia, for its part, believes it has little to fear from America's current missile-defense programs but is awaiting the inevitable: the moment when the technological utopians push America to expand its modest system into a full-blown shield. How will Russia respond then?

To court such reactions by withdrawing from the ABM treaty before even testing against decoys is pure strategic illiteracy--which only a Reaganesque theology (founded on exceptionalism, commercialized militarism, technological utopianism and righteous unilateralism) shrouded by the "fog of war" might explain.

On August 21 in Lake Charles, Louisiana, a struggling oil-refinery town on the Texas border, Wilbert Rideau walked to the center of the modern courtroom, hobbled by shackles. The man Life magazine called "the most rehabilitated man in America" lifted up his furrowed brow and looked at the judge. And stillness came over the crowd of mostly elderly blacks, as Rideau pleaded not guilty to a murder committed forty years ago.

Interest in the case lies not in Rideau's innocence or guilt. On numerous occasions he has accepted responsibility for murdering a woman after robbing a bank in 1961. Rideau, 59, received the death penalty but, by an accident of history, lived to become a famous journalist. As editor of a prison magazine called The Angolite, he has won almost every journalistic award and become a national expert on prison life; he's been "Person of the Week" on World News Tonight with Peter Jennings and a pundit on Nightline--all from behind prison bars in Angola, Louisiana. In 1994 Rideau's lawyers, in a last-ditch effort to free him, filed a habeas corpus petition in federal court. In December 2000 the Fifth Circuit Court of Appeals in New Orleans found that the original prosecutor of the case had excluded blacks from the grand jury in blatant violation of the Constitution, and ruled that the state must retry Rideau or release him.

This year Rideau is set to stand trial in the same Louisiana town where he was first convicted forty years ago. Many thought that Lake Charles and Calcasieu Parish would look the other way rather than reprosecute an age-old case with lost evidence and a manifestly rehabilitated defendant. Rideau's lawyers have said he would settle by pleading guilty to manslaughter and walk away with more time served than all but four convicted murderers in Louisiana history. But the state won't offer any deal.

The reason can be found in Lake Charles, a town where redemption may not be possible when a black man kills a white woman. Powerful people in the parish have blocked Rideau's release, whereas other inmates sentenced for similar crimes have received parole. During Rideau's time in Louisiana State Penitentiary in Angola, nearly 700 convicted murderers have been freed. Four pardon boards have recommended Rideau for release--but two governors have denied clemency. "Why Not Wilbert Rideau?" was the title of a 20/20 segment exploring why he has not been able to get parole. "I think he is a con artist," said District Attorney Rick Bryant. "He's a master manipulator of the media and people who have supported him."

The vehemence stems in part from the fact that Rideau is a prosecutor's nightmare. This is the fourth time the parish has tried him. Each time Rideau is convicted, he appeals and exposes shameful structural flaws in how the justice system here really works. And he's doing it again. This past November 29 the Louisiana Supreme Court struck down the parish's process for selecting judges in capital cases, which the court faulted for allowing judge-picking, a practice used by prosecutors to obtain judges favorable to the state. The prosecution had filed its new case against Rideau when the only ball left in the bingolike hopper was the one for Michael Canaday, a white judge who had never before tried a felony. After watching Judge Canaday in court, Marjorie Ross, 68, a retired department store salesperson, said, "I look back forty years ago and things haven't changed. It's because of this." She pointed to her dark-skinned face.

But the new judge, selected "at random" with all seven balls in the hopper, happens to be one Wilford Carter, who is black and was elected from a black district with many voters fixated on this case. It's a boon that has become Rideau's signature--the grace of luck appears just when it seems to have run out. "The fact that I excelled beyond anybody's wildest expectations not only vindicated official decisions but increased the hostility of my enemies," Rideau said in a series of telephone interviews. "Everything I became, everything I have achieved, has been in spite of this unholy force from Lake Charles dedicated to destroying me and denying me the ability to be anything more than the criminal they wanted me to be."

His crime has been hard for the town to forget. According to the original prosecutor, Frank Salter, on February 16, 1961, Wilbert Rideau, then 19, knocked on the door of the Gulf National Bank at closing time. Bank manager Jay Hickman unlocked the door. He knew Rideau as the errand runner at Halperin's, the sewing shop next door, who would fetch sodas for bank employees, until the relationship became too friendly for the whites. "We stopped [asking him for sodas] because he started talking," said victim Dora McCain in her trial testimony, "calling us by our first names. So we just--we just got a refrigerator for the bank." That day, Rideau produced a gun and demanded that Hickman empty the money drawer. Rideau put $14,000 in a gray suitcase (leaving $30,000 on the floor and in coffers) and forced Hickman and two women bank tellers into a car. They drove to a country road in a wooded area, where Rideau lined up his three hostages and began firing. One bullet landed in Jay Hickman's arm. Hickman rolled off into a bayou out of sight. The two women fell to the ground with gunshot wounds. Julia Ferguson, 49, cried out, "Think of my poor old daddy," who lived with her. "Don't worry, it will be quick and cool," Rideau allegedly said before slitting her throat and stabbing her in the heart. Ferguson died at the scene. Rideau approached the other teller, Dora McCain, a pretty twentysomething with a well-known family, who lay face down. He kicked her in the side three times to see if she was dead. When she didn't cry out, Rideau took the car and left. Two state troopers stopped Rideau in his car as he was leaving town. They found the suitcase with the money in the back seat. (Rideau's counsel declined to comment on the facts before trial.)

That year, the first of three all-white, all-male juries convicted Rideau and sentenced him to death. Rideau appealed on grounds that a TV station, KPLC-TV, had secretly filmed the sheriff posing questions to Rideau, who had no access to a lawyer, and aired his mumbled answers as a confession. The US Supreme Court slammed the parish's "kangaroo court proceedings" and found that the broadcast had unfairly prejudiced the jury pool. The Court reversed the conviction and said Rideau could not be tried anywhere within the reach of KPLC. In 1964 at a second trial, in Baton Rouge, the jury deliberated for fifteen minutes before deciding to give him the electric chair. Rideau appealed again, and a federal court overturned his conviction on grounds that the state court had rejected jurors with doubts about the death penalty, in effect stacking the jury with death penalty proponents--a violation of due process. In 1970 at a third trial, in Baton Rouge, the jury took eight minutes to give Rideau the death penalty. His appeals were unsuccessful, and he returned to death row--just in time to benefit from Furman v. Georgia, the 1972 Supreme Court decision that temporarily found the death penalty unconstitutional. As a result, every death-row inmate in America, including Rideau, had his death sentence commuted to life imprisonment.

Rideau won't comment on the crime because he is facing a new trial. But he agreed to talk about the person he was at the time and how he has changed. Though he usually speaks quickly, in perfect sentences, his cadence is deliberate in describing the man he was when he entered prison. "I wouldn't recognize him today," he said. "I was typical in a lot of ways. I was another dumb black, immature, angry. Not even aware that there is a world bigger than me." He says he had a fairly normal childhood, moving to Lake Charles when he was 6. "My home life wasn't the best," Rideau says. "But that doesn't say much because a lot of people's family lives weren't." His problems, he says, began during adolescence. "People used to pass by and they would throw Coke bottles and spit and holler at you," he says. "You could be walking by with your girl and they would call at you talking about you--'Hey nigger, blah blah blah, whatever.'" Rideau knew it wasn't directed at him alone. But he took it as "the end of the world." "I saw whites as enemies responsible for everything wrong with my world. Whites created this bizarre segregated world where racism ruled," he says. In his segregated school, he dismissed the hand-me-down books from white schools, which held forth ideas of "rights" and "how life was so wonderful." Though he had a straight-A average, he quit school in the eighth grade because he saw no use for an education. "I wanted to be a spaceman like Flash Gordon," he says. At 13, he began a series of low-paying jobs and spent most of his time in pool halls and gin joints. "I didn't even know the name of the governor of the state," he said. "I was totally out of it."

Eventually, he became an errand runner in the fabric shop, his last job before being sent to Angola. In prison, he noticed the strange ethics of prison life, starting with white guards who smuggled him novels and science texts. "I read a library on death row," he says. And in a Baton Rouge jail, where he stayed for part of his appeals, Rideau lived in the segregated white section as punishment for leading a "strike" in protest against prison conditions--flooding the commodes and burning mattresses. When Rideau led white prisoners in a strike as well, the prison put him in solitary confinement. And to Rideau's shock, whites began secretly sending him food and kind words. "Whites started taking care of me," he said.

Within the first year of his life sentence, Rideau asked to join the then-all-white newspaper, the Angolite, only to have administrative officials turn him down. "I read in the paper that they couldn't find a black who could write," he says. The rejection stung. Over the previous decade, he had penned a book-length analysis of criminality and corresponded with a young editor at a New York publishing house, who tutored him in the art of writing. Rideau rounded up an all-black staff and started The Lifer, which chronicled stories like that of a group of elderly women who brought a truckload of toilet paper to the prison and were turned away. Eventually, the administration put him out of business. "They threw me in the dungeon saying I was advocating insurrection," he says. White prisoners petitioned a black senator to demand Rideau's release from solitary confinement. "Along the way, the whites that I initially saw as enemies befriended me and fought for me, not blacks," he says. "That experience caused hell with the way I saw things."

In 1975, the warden made Rideau editor of the Angolite as part of compliance with a federal court order mandating integration of the segregated Angola prison. A year later a new warden, C. Paul Phelps, arrived and offered to strike a deal. Phelps promised that the Angolite would operate under the same standards that applied to journalists in the free world--he could print whatever he could prove--so long as Rideau would teach him about life at Angola. Over the years, the two men had many philosophical and political discussions. And they ate together in the dining hall. "He told me that like begets like," Rideau says. Phelps permitted Rideau to become a public speaker, a reward that let well-behaved prisoners travel and explain the dangers of prison life to youth at risk. And with his new freedom, Rideau jettisoned a longtime plan to escape. "The thing that is most respected in prison is character, loyalty, keeping your word," says Rideau. "These are things that are highly valued in the real world, but they are really, really valued in ours." This and the passage of time have changed him. "Part of it is just growing up," he says. And growing up has meant a realization that he may die in prison. Since 1997 Rideau has been president of the Angola Human Relations Club, which cares for elderly inmates by providing such essentials as toiletries, warm caps and gloves, and which buries the dead.

After Rideau became editor of the Angolite, the paper changed from a mimeographed newsletter into a glossy magazine exposing systemic problems and an emotional inner life. One story revealed that the Department of Corrections had doled out money for AIDS programs that were never implemented. Another issue featured pictures of inmates after electrocution--a portrait so horrifying that Louisiana changed its method to lethal injection. The magazine has won seven nominations for a National Magazine Award, and Rideau has won the Robert F. Kennedy Journalism Award, the George Polk Award and an Academy Award nomination for The Farm, a documentary film about Angola that he co-directed. He co-edited a book, Life Sentences (Random House). He addressed the convention of the American Society of Newspaper Editors in 1989 and 1990. And he's a correspondent for National Public Radio's Fresh Air program. While Lake Charles watched, the man many blamed for ushering in an era of crime became a nationally respected writer and commentator. "There's no way you're going to give life back where it's been taken," Rideau opined on Nightline in 1990. "But you--you just try to make up.... When it's all over and done with, Wilbert Rideau will have tried."

One blistering August afternoon in Lake Charles, I locked my keys in my rental car and called "Pop-A-Lock" for help. As owner Jim Rawley jimmied the lock, he recalled the night Rideau committed the crime. Rawley was in high school then. His friends wanted to kill Rideau and mobbed the courthouse. "There was a group of vigilantes among us," he says. "I can't remember the specifics. But I remember the atmosphere. Macho kind of stuff, except that we were scared too." Years later, Rawley became a Calcasieu Parish deputy and knew Rideau, who was awaiting trial in the Lake Charles jail, as "a troublemaker." Once, he says, a friend and fellow officer "beat the hell out of [Rideau]" for being "belligerent and uncooperative." When asked if he thought Rideau had changed, he said, "By all appearances he has rehabilitated himself, for lack of a better word. He seems to be a different man than he used to be. But that doesn't negate what he did.... It doesn't change the fact that he was convicted three times. He has never claimed that he didn't commit the crimes. He is fortunate he didn't receive the death penalty." He also said Rideau is a burden to the courts and should stop appealing his case. "If he's a different person he needs to go through the pardon board," he argued. But everyone knows that governors have blocked his release. Rawley shrugs, "It's already been decided, then."

Rawley's reaction was typical of whites I met. Rideau's good actions matter little next to the fact that he escaped the death penalty, as if death had somehow been cheated. And one has to wonder if there isn't some jealousy of his fame in the world outside Lake Charles. Elliott Currie, a professor of criminology at the University of California, Berkeley, calls the unceasing and vindictive punishment of those who have committed bad acts, without regard for the genuineness of their remorse or rehabilitation, "punitive individualism." Law-abiding people don't want prisoners to have anything they can't have--thus the 1994 elimination of Pell Grants (federal educational scholarships) for prisoners and the conflicts over whether taxpayers should pay for prison weightlifting equipment. Rideau represents the extreme of this line of thinking: Most of us are never going to get to be on Nightline. Why does this murderer get to do it? Many white observers view the legal mistakes in his case as technicalities, and his appeals as a waste of taxpayer money. After the arraignment, a blue-blazered security guard grabbed my hand very tightly and muttered, "If I killed your grandmother could you rehabilitate her?"

And the more well-known a defendant, the more the public focuses on preventing release. In this sense, Rideau is not unlike famous white prisoners who can't get a break despite impeccable prison records--like Kathy Boudin, the former Weather Underground radical, denied parole last August for a 1981 murder conviction; or Karla Faye Tucker, a convicted murderer executed in 1998, even after the victim's brother begged Texas Governor George W. Bush to pardon her. Their violent offenses do not elicit leniency. "It's not that people are afraid he is going to do it to her again," says Currie. "They are saying, 'Anybody who does this can't be free again; in our moral universe that can't happen.'" This attitude pervades public policy. Federal laws passed in 1994 provide matching funds to states to keep violent criminals in prison longer by denying parole.

But perhaps the biggest strike against Rideau is his race: No black man convicted of murdering a white person in Lake Charles has ever been released from prison, according to The Rideau Project, a research effort at Loyola University in New Orleans (see www.wilbertrideau.com). Whether or not people were alive at the time of the crime, feelings seem to be as strong as they were forty years ago. A 33-year-old white saleswoman at an electronics store, who asked not to be identified, said, "He should die the same death like everyone else," adding that she had to put her kids in private schools because of the "kids who cause trouble." She then mouthed the word "blacks." Her co-worker, a 30-year-old white man, used lynching imagery to say he agreed: "They should have swung him a long time ago." But then he asked, "What did he do?"

This is what gives District Attorney Rick Bryant his mandate. He's up for re-election in November, which means trying Rideau during campaign fundraising season. In two conversations, one at his desk and a second in a downtown bar, he said that even if Rideau were rehabilitated (and he wouldn't admit this), he would reprosecute. "He did the crime, didn't he?" Bryant refuses to recognize his own prosecutorial discretion, implying that he actually doesn't have the power to decide not to prosecute. This may be true, but only in the sense that his political survival in this majority-white town depends on a conviction. "They are trying to make me into a glorified pardon board. I am not a pardon board. I am a DA. Like I should be God of this case! Like I don't care! Or that I should decide he's a good guy in prison! That is not my job. The only reason I would not retry him is if there is no evidence, he's innocent or the victims want his release," he says. I suggest that his job is to seek justice, not just to convict, and that a retrial can only divide the town. "They line up and tell me to keep him in prison," Bryant says.

Of course, there are those--mostly black and some influential whites--lining up on the other side, too. Cliff Newman, an attorney and Democratic state senator from 1980 to 1988, once lobbied the governor to keep Rideau in prison at the behest of Dora McCain, the only victim who is still alive today. In the following years, Newman met Rideau in Angola at the prison rodeo and followed his story in the media. Today Newman has changed his mind: "From a political point of view it is not popular to ever say a murderer should be released. But I am not in politics anymore. And I am not going to be. Everyone is capable of rehabilitation."

Even conservative whites are hard-pressed to argue that Rideau is not a different man today. Bill Shearman, owner of the town's conservative weekly newspaper, said, "Well, yeah, I think Rideau is rehabilitated," explaining that his view isn't representative. "Only a scant minority realized he has changed." Jim Beam, 68, a columnist for the American Press, the conservative daily that has opposed Rideau's freedom, admitted, "If you asked me if he's rehabilitated I would say yes." And Peggi Gresham, retired assistant warden and Angolite supervisor for twelve years, said, "I am not a bleeding-heart liberal. I don't think that everybody should get out. But when a person is as successful as some individuals are they can get out and have a good life. Wilbert is one of those people."

Young black professionals I met generally think Rideau should be released because he has changed but see his plight as a remnant of past prejudice that doesn't really concern them. Rideau's real support in Lake Charles has come from the local NAACP and the black press, which believe that Rideau didn't commit the crime alone and is part of a larger conspiracy. "Blacks don't rob banks and they don't commit suicide," says Lawrence Morrow, publisher and editor of the black magazine Gumbeaux. Rideau had a good job, they argue, at a time when it was difficult for blacks to find jobs, and he took only $14,000, leaving $30,000 in the bank. Joshua Castille, 73, a retired black law enforcement officer, had drinks with Rideau the night before the crime and saw no peculiar behavior. He believes Rideau acted in concert with bank manager Hickman. Even back then, he said, a bank would never open its doors after closing hours. For a black person? "For anyone," he says. "They just wouldn't do it." The contrasting perceptions of the Rideau case among blacks and whites are emblematic of the different ways the two groups view crime, as well as issues like the death penalty. "Blacks are more likely to understand that people like Rideau are less likely to have committed the crime because they are monsters than because of circumstances that put them in that situation--'there but for their fortune go I,'" says Currie. "And they know that the criminal justice system has been pushed toward punishing blacks more than whites for as long as the justice system has existed."

Rideau's trial could go either way. On the one hand, Lake Charles elects its judges and Judge Carter is accountable to a black constituency that cares about this case enormously, which could mean openness to arguments about prosecutorial vindictiveness. On the other hand, when Carter's son, then 16, was charged with second-degree murder, he received a plea deal from Bryant reducing the charge to manslaughter--which, critics say, could predispose the judge to be friendly to the prosecution. And while, after so many years of appeals, the evidence is mostly lost, Dora McCain's lawyer, Frank Salter, the original prosecutor, said she would testify, which could mean a conviction based on her testimony alone (McCain did not respond to interview requests). Rideau's lawyer is the formidable George Kendall of the NAACP Legal Defense & Educational Fund, but it isn't yet clear how Judge Carter feels about counsel who swoops in from New York.

Rideau says if he does get out, he wants to leave Louisiana and write two books. "And neither one of them is about me," he says, explaining that he hopes to redefine criminality. "But I am telling you they are going to give me the Pulitzer Prize for this." It's hardly what Lake Charles wants to hear. When does he believe punishment should stop? "Whatever it should be, it should be," he says. "But it should be equal."

In the United States the writer tends to become an entrepreneur, competing with other literary vendors marketing their characters and language, their humor or drama, to a skeptical and distracted public. In Israel, it seems, they order things differently. For a nation perpetually in crisis, with an ancient prophetic tradition behind it, the serious writer remains something of a sage, a wisdom figure who speaks with authority. Amos Oz has been such a presence on the Israeli scene for close to four decades, publishing not only novels and stories but political journalism, literary essays and Op-Ed columns, never wholly disengaging his state of mind from the state of the nation. Yet his public pronouncements, always as beautifully crafted as his fiction, have never laid to rest the inner demons that power his creative work. This is especially evident in his newest novel, The Same Sea. Despite its deceptively light tone, it reads like one of the most personal books he has yet written.

The Same Sea is at once spare and lushly experimental, an unusual mixture of hard, precise prose that drives the story forward and often lyrical, evocative verse that bathes us in the mental glow of each of the characters. The musical qualities of this verse, strong in Hebrew, are largely lost in translation, but its strategic line-breaks and numerous biblical echoes, especially from the Song of Songs, save it from becoming altogether prosaic. The story is so simple that the author can sum it up in his opening lines. It centers on a triangle familiar from some of Oz's earlier books--the mild, practical father; the languid, troubled mother, who has recently died; and their only son, who has fled home in the wake of her death and, in this case, gone off mountaineering in Tibet. It would not seem possible for a writer to build his novel around three characters whom we never see in one another's company: the widowed father, trying half-heartedly to resume his life, the deceased mother, not yet fully accepting her death, and the distant son, surrounded by his mother's palpable presence, sleeping with women who bring her back to him, trying aimlessly to outrun his grief.

Yet this is a book in which the dead are never wholly dead, where memory and meditation are more vibrant than action, while time and distance are seen less as objective facts than as constantly varying states of mind. It's also a book in which the fictional narrator, who resembles the author in every biographical detail, repeatedly emerges from behind the proscenium to sort out his own memories, which are precisely the ones that fed into the story. Just as the characters swarm about him, they inhabit one another's minds as well, communicating across continents with some of the mobility and omniscience an author usually reserves for himself.

In short, this is a book about someone writing a novel, showing us how it lives within him while it is also spilling out onto the page. Yet somehow, even at this remove from direct storytelling, the characters resonate. Amos Oz has written other versions of this father, this mother, this boy, in Hill of Evil Counsel, for example, but never has mingled them so clearly with his own past, which instead of fading has grown more insistent with time. Confronting mortality himself, he feels more impelled to take stock of his own dead. The loss of his parents, especially his mother's suicide when he was 13, still obsesses him as he approaches 60. The narrator even has one of his characters, the son's carefree 26-year-old girlfriend, try to talk him out of his brooding mood. "Your mother killed herself/and left you quite shattered.... But for how long? Your whole life?/The way I see it being in mourning for your mother for forty-five years is/pretty ridiculous." The narrator sees it differently. How can he bail out? "How can you jump from a plane/that's already crashed and rusted or sunk under the waves?" For him the dead continue to haunt the living. Yet what she says has the authentic ring of the younger generation, and the author, with the warm generosity of Chekhov, respects its callow wisdom and healthy insensitivity, which part of him would love to emulate.

The Same Sea is magnanimous toward characters who could just as well be brutally satirized or dismissed--the coarse yuppie always on the lookout for a good deal, the ill-favored film producer, hopelessly unlucky with women, who becomes fixated on a character in a script, the girl who casually sleeps with nearly all the male characters, including (almost) her boyfriend's widowed father. An underlying tenderness softens their hard edges. As in a Renoir film or Chekhov story, they somehow surprise the reader into sympathy and a wistful tolerance. Unexpectedly, too, they begin to nurture one another.

One feature of this enchanting book that I have already mentioned stands out most strikingly. As the story unfolds, the author keeps intervening in it, at first pushing his pad aside and wondering "how on earth/he came to write such a story," but gradually interacting with his characters, commenting on the film script that the girlfriend is trying to sell, offering little scenes from his writing life and recollecting his own parents and childhood. At first it seems he is playing a postmodern game, violating the boundaries of the novel by wantonly mixing poetry and prose, fact and fiction, puncturing our suspension of disbelief. Worse still, we wonder whether the writer is simply losing interest in his own story, taking it over. But it soon becomes clear that, on the contrary, the story is so real to him that the people in it have invaded his life, and not only when he sits composing at his desk.

As he works in the garden, all the people in his head, real and imagined--where to draw the line?--the dead and the living, his children, his grandchildren, the characters from the novel, all his own selves, seem right there with him, tossed up from the same sea, pitching in despite their different views of how the gardener's work should be done. This is a fanciful conceit, often used in the Renaissance for poetic creation, yet something about it rings ingeniously true. This is no symbolic landscape of ideas and images but a scene showing us the writer himself, away from his work but with his mind still abuzz. In this flux, paradoxically, he feels a contentment that allows him to set his demons aside, the dead who will not stay dead, the characters who insist on a life of their own, the fears for the future that poison the present: "Grief fear and shame are as far from me today as one dream is/from another," he says. "Whatever I have lost I forget, whatever has hurt me has faded,/whatever I have given up on I have given up on, whatever I am left with/will do." For the time being, at least, he can dwell in the moment. "Later I'll go back to my desk," he concludes, "and maybe I'll manage to bring back/the young man who went off to the mountains to seek the sea/that was there all the time right outside his own home."

Last spring Richard Pollak asked in these pages, "Is GE Mightier Than the Hudson?" (May 28, 2001). Given the Environmental Protection Agency's December 4 decision to dredge the PCB-contaminated river, it is tempting to ring in the new year with a resounding No. Despite the company's multimillion-dollar blitz of lawyering, lobbying and PR, the Bush Administration, in the person of its EPA Administrator, Christine Todd Whitman, has come down squarely on the side of those in New York's historic Hudson River Valley who have been agitating for years to make GE clean up the lethal mess it created by dumping more than a million pounds of polychlorinated biphenyls in the river from the 1940s into the 1970s. This pollution has turned 200 miles of the Hudson, from just above Albany south to New York Harbor, into the biggest Superfund site in the nation; EPA law requires that GE pay the cost of removing the toxic chemicals, which the agency estimates at $460 million. More than once, the company has told its stockholders it can well afford this sum, as a multinational with a market value of some $500 billion surely can.

Still, it may be premature to pop the champagne corks. This past fall, fearing that Whitman might follow the lead of her Clinton Administration predecessor, Carol Browner, and endorse the cleanup, GE filed a federal suit attacking as unconstitutional a Superfund provision that allows the EPA, if the company refuses to dredge, to do the job itself and bill GE for three times the final cost plus penalties of $27,500 a day. GE has plenty of time (and cash) to pursue this and other maneuvers against dredging, which is needed to remove some 150,000 pounds of PCBs still in the Hudson. The EPA estimates it will take at least three years to work out the project's engineering and other details--e.g., what kind of equipment is needed, how much stirred-up sediment is acceptable and what landfills can safely handle the contaminated mud. Many residents along the banks of the river are divided--sometimes angrily--on these and several other issues. During the EPA's 127-day comment period in 2001 it received about thirty-eight boxes of letters and 35,000 e-mails, many spurred by GE's scare campaign--on billboards, in newspaper ads and on TV infomercials--warning that dredging will destroy the river.

The EPA has pledged that the public will have even more of a voice in the project's design decisions over the coming months--a welcome process but one that GE is likely to exploit with more propaganda. At its enviro-friendly-sounding website (hudsonwatch.com), for example, the company continues to insist, on no hard evidence, that the citizens of the Hudson River Valley oppose dredging "overwhelmingly." Some residents do resist dredging and the inevitable inconvenience it will bring to their communities, and not all have arrived at their view because of GE's PR tactics. But after almost two decades of review by the EPA, the weight of scientific evidence shows that the remaining PCBs, which cause cancer in laboratory animals and probably in humans, continue to poison the river a quarter-century after their use was banned and GE stopped dumping them.

The EPA's December 4 order could be the precedent that requires the company to clean up forty other sites where it has dumped PCBs. This would cost several billion dollars, a hit not so easy to reassure shareholders about. Even with GE master-builder Jack Welch retired and busy flogging his bestselling How-I-Did-It book, don't look for the company to roll over anytime soon.

The violent popular uprising in Argentina and abrupt collapse of its government should be understood as a warning bell, reminding the governing elites how unstable--and unjust--their system of globalization remains. Unfortunately, the Washington establishment prefers instead to dwell on its global war against terrorism. The Bush Administration's battlefield successes in that war, its diplomatic victories in the new trade round launched at Qatar and the House's narrow approval of fast-track negotiating authority for the President seemed to confirm America's self-image as benevolent steward of the world.

When Argentina exploded, it should have blown away the smugness, but instead we witness once again the supple forgetfulness that allows the globalist architects of the IMF and their cheerleaders to skip past obvious contradictions in their ideology. Argentina, one has to recall, was toasted not very long ago as the best case for "responsible leadership" in the developing world. Its regime included the requisite "Harvard-trained economist" as finance minister, who advanced the same austere measures that Washington demanded from the sinking Argentine economy: Squeeze the populace as harshly as necessary until capital accounts are balanced so foreign creditors may feel protected from devaluation or default (they are now likely to experience both).

The Argentines endured quite a lot--four years of recession, unemployment approaching 20 percent, shrinking incomes and public spending--until they swarmed screaming into the streets, looting supermarkets and battling police, with many casualties. Now, Eduardo Duhalde, Argentina's fifth president in two weeks, has lashed out, blaming US-backed free-market policies adopted in the 1990s for the country's collapse. "Argentina is bankrupt. Argentina is destroyed. This model destroyed everything," Duhalde said in his inaugural speech.

The central fallacy exposed by the ruination of Argentina--and the many previous cases like Russia and Mexico--is the presumption that poor nations should accept the global system's commanding dictates, occasionally including massive suffering in the name of financial order, and in return the system will make them rich (or at least less poor). In Argentina's case, the straitjacket was sincerely accepted in the most extreme terms: Its currency was rigidly bound to the value of the American dollar. This commitment was widely praised by US economic thinkers, and it did stimulate US banks and investors to lend more generously. But it encouraged foreign lending to swell to impossible dimensions--$132 billion in Argentina's case--followed by the inevitable economic deterioration as the dollar soared and Argentine exports ceased to be competitive. The IMF prescribed its usual austerity remedy while lending billions more to cover the debt obligations--thus giving the foreign creditors more time to be repaid before the eventual default.

The story of Argentina is baffling, and deeply infuriating, because it is so familiar. Yet sensible reforms, like capital controls on the creditors and alternative economic strategies for developing nations, remain topics for learned papers and polite conferences, not for real action. There is an obvious explanation: IMF policy may ruin many borrowers, but it serves the creditors, who are able to evade the full consequences of their folly. Perhaps if many more nations follow Argentina down the road of debt default, the creditors will also see something wrong with the system and demand change.

One country that has escaped the current scrutiny of US backing for Arab dictatorships is Morocco, in part because its human rights situation has improved over the past decade. But for most of the late King Hassan II's thirty-eight-year rule, the United States and France provided financial and diplomatic support to this moderate on Arab-Israeli issues, while his henchmen tortured and secretly jailed thousands of domestic critics. Hundreds were disappeared. Now, the revelations of a retired secret policeman living in Casablanca have raised new questions about Washington's role in the repression.

Since the death of Hassan in 1999 and the ascent of his son, Mohammed VI, to the throne, Morocco has enjoyed a somewhat freer atmosphere. Human rights activists, victims' groups and the media are exposing the grim past and debating what mix of truth-telling, reparations and punishment will both deliver justice to the victims and help consolidate the democratization process. Mohammed apparently does not want trials of torturers or the sort of truth-telling that could delegitimize the monarchy and roil the security forces. But he has distanced himself from his father's worst excesses by acknowledging the state's role in past abuses and compensating some victims. His gestures, unprecedented in the Arab world, have helped to brighten the government's image at a time when it has made little headway in combating poverty and unemployment.

The state's script for turning the page on the past has nevertheless been disrupted by Ahmed Boukhari, the first police agent to talk about the dirty war against dissidents during the 1960s and '70s.

Among Boukhari's revelations was the presence of three men he describes as CIA operatives who worked daily in the Rabat headquarters of the secret police from 1960 until 1967. Boukhari says these men helped to build the young agency. "They went through the résumés and picked the men to hire," he told me in a recent interview. "Then they taught them how to conduct surveillance of dissidents."

Boukhari's most sensational disclosure, if confirmed, would solve a nagging political mystery: the fate of the socialist opposition leader Mehdi Ben Barka after he was picked up by French police in Paris in 1965 and never seen again. Exiled at the time, Ben Barka was a charismatic and rising star in the progressive Third World alliance known as the Tricontinental Conference. He is still revered by the Moroccan left.

While no one ever doubted Ben Barka's abduction to have been engineered by senior Moroccan officials with the collusion of French and Israeli agents, details of what followed remained murky. According to Boukhari, who maintained the daily logs for the police's formidable countersubversion unit, Ben Barka died the night he was kidnapped while being tortured under interrogation in a villa near Paris. His corpse was then flown secretly to Morocco, where police dissolved it in a vat of acid--a technique of disposal that Boukhari says was introduced by a CIA agent he knew as "Colonel Martin."

Martin allegedly had unfettered access to the countersubversion unit's logs and attended the agency's meetings at which the Ben Barka operation was planned. Reporting to work on the morning after the kidnapping, he would have learned that Ben Barka's body was to be spirited off to Rabat.

Although the Ben Barka affair triggered a diplomatic crisis between Morocco and France, the United States remained circumspect. Washington viewed King Hassan as a key ally in a region where Egyptian President Gamal Abdel Nasser's pan-Arab socialism enjoyed broad appeal and newly independent Algeria seemed to be drawing closer to the Soviet camp.

Two retired US diplomats stationed in Rabat at the time, political section chief William Crawford and economic officer Frederick Vreeland, denied in a recent interview any knowledge of the three agents Boukhari describes, or of any CIA role in helping the King police his opponents. Both Crawford and Vreeland mentioned Morocco's well-known collaboration with Israel's intelligence agency, the Mossad, including in the surveillance of dissidents. Vreeland said the men Boukhari describes might have been Mossad agents posing as CIA agents, since Israelis working for Moroccan intelligence couldn't disclose their nationality.

In the wake of Boukhari's testimony, Moroccan, French and US human rights organizations have urged Washington to declassify the more than 1,800 documents it has admitted having on the Ben Barka affair. The government has responded neither to this plea nor to my requests for comment on Boukhari's allegations about the CIA.

Boukhari's plight since he blew the whistle reveals the fear of Moroccan authorities that the current reckoning with the past will escape their control. In August he was arrested and sentenced to three months in prison for writing bad checks. A month after his release he was given another three-month sentence and a fine for libeling three of the Moroccan agents he implicated in Ben Barka's abduction. What authorities have not done is approach Boukhari as a valuable new witness in unsolved cases of political murder and disappearance, or issue him the passport he needs to comply with subpoenas to testify before a judge in France who is investigating Ben Barka's disappearance.

Although fitting the past into the future is primarily a task for Moroccans, Washington can play a crucial role. Ahmed Hirzeni, a Rabat sociologist who served twelve years in prison on political charges, observed, "We don't want to dwell forever on the dossier of the past. The Americans can help us turn the page by clarifying their role in the Ben Barka affair." Whatever it yields, US disclosure will pressure the Moroccan state to acknowledge more fully the torture, political arrests and disappearances it carried out in the past. And that, say activists like Hirzeni, will help to prevent their recurrence.

Why in 1973 did Chile's democracy, long considered the crown jewel of Latin America, turn into Augusto Pinochet's murderous regime? Why did the United States, which helped Pinochet seize power from Salvador Allende, support the violent dictator for nearly two decades? Scholars answering these questions have usually focused on the threat posed by Allende, the first elected Marxist head of state, to Chilean and US business interests and to the cold war foreign policy of the United States. But recently declassified documents, along with the reissue of Patricia Politzer's Fear in Chile: Lives Under Pinochet, suggest that the Chilean counterrevolution, however much shaped by immediate economic and political causes, was infused with a much older, more revanchist political spirit, one stretching as far back as the French Revolution.

Edward Korry, who served as US ambassador to Chile between 1967 and 1971, greeted Allende's election in 1970 as if the sans-culottes were at the gate. Before all the votes were in, he smelled the "stink of defeat" and could hear "the mounting roar of Allendistas acclaiming their victory" arising "from the street below." Although no guillotine blade had yet dropped, material declassified by the United States over the past couple of years shows that Korry fired cable after cable back to Washington, warning of "the terror" to come and citing Baudelaire to brand Allende a "devil."

It may seem bizarre that an LBJ-appointed Democrat would pepper his diplomatic missives with the overheated prose of French romanticism. After all, critics have charged cold war liberals, such as Robert McNamara and McGeorge Bundy, with employing a dry calculus in deciding the number of casualties needed to defeat Communism. But Korry was no bloodless bureaucrat. In fact, in both tone and content, his writings were remarkably similar to those of the illiberal Joseph de Maistre, the arch-Catholic reactionary who launched violent, intoxicated attacks on the French Revolution. By injecting medieval Catholic orgiastic mysticism with the revolutionary zealotry of his contemporaries, Maistre offered a compelling alternative to earthly promises of secular justice and political participation. He was the first who understood that if a counterrevolution was to be won, it would be necessary to win the "hearts and minds" of what would come to be known as the masses.

As fervidly as Maistre hated la secte of Jacobins and eighteenth-century rationalists, Korry disdained Allende and his Popular Unity followers, and largely for the same reason: Where Maistre rejected the idea that people could be governed by enlightened principles, Korry dismissed as "dogmatic and eschatological" those who believed that "society can be structured to create paradise on earth." And both men reserved their strongest scorn for the pillars of the old regime--church, army and state--because, whether through ineptitude or corruption, they had failed to see and to confront the evil before them. Lost in a "myopia of arrogant stupidity," the elites and officials who had allowed Allende to come to power were a "troupe of fools and knaves" leading Chile to the "marxist slaughter-house." It is as if Korry saw the revolution as divine retribution against a decaying polity. "They should be given neither sympathy nor salvation," he said of the weak-willed ruling party.

Echoing Maistre's observation that republican rule is ill suited to protect society against revolutionary fanaticism, Korry complains in his cables about a gracious political culture that places no brake on Allende's determination: "Civility is the dominant characteristic of Chilean life. Civility is what controls aggressiveness, and civility is what makes almost certain the triumph of the very uncivil Allende." Neither the military nor the outgoing president, Eduardo Frei, "have the stomach for the violence they fear would be the consequence of intervention," Korry wrote to Washington. The Communist Party, in contrast, Korry warned, was "that most clear-minded and cohesive force in Chile.... Allende is their masterwork in Latin America and they do not lack for purpose or will."

Korry worked to strengthen domestic opposition to Allende's Popular Unity coalition, yet he also opposed Henry Kissinger's plot to provoke a military coup (which led to the murder of Chilean Gen. René Schneider). Instead, he advocated patience, confident that, with encouragement, internal dissent would eventually oust Allende. Again, remarkably akin to Maistre, Korry felt that restoration had to come from within rather than be imposed from without. He had faith that time favored his position; that the revolutionaries, in their effort to build a society that ran against human nature, would soon exhaust themselves; that rumor and chaos, unavoidable spawns of popular rule, would fuel an irresistible counterwave that would sweep them from power.

In fact, CIA destabilization strategies, both in Chile and in other Latin American nations, seem to draw directly from Maistre's restoration scenario, which relied on counterrevolutionary determination to generate dissension. Rumor acts as the cat's-paw for fear, poisoning commitment, corroding solidarity and forcing an acceptance of inevitable reaction. In Chile the CIA, in a cable dated September 17, 1970, set out a plan to

create the conviction that Allende must be stopped.... discredit parliamentary solution as unworkable...surface ineluctable conclusion that military coup is the only answer. This is to be carried forward until it takes place. However, we must hold firmly to the outlines or our production will be diffuse, denatured, and ineffective, not leaving the indelible residue in the mind that an accumulation of arsenic does. The key is psych war within Chile. We cannot endeavor to ignite the world if Chile itself is a placid lake. The fuel for the fire must come within Chile. Therefore, the station should employ every stratagem, every ploy, however bizarre, to create this internal resistance.

After the end of World War II, when demands for social democratic reform swept the continent, a series of coups and political betrayals successively radicalized and polarized social movements. The Old Left gave way to the New, and calls for reform climaxed into cries for revolution. By the late 1960s, Latin American military elites and their US allies knew, as Maistre knew two centuries earlier, that a simple changing of the guard would no longer be enough to contain this rising tide: "We are talking about mass public feeling as opposed to the private feeling of the elite," wrote the CIA about the intended audience of its "psych war" in Chile. The Latin American military regimes that came into power starting in the late 1960s combined terror and anti-Communist Catholic nationalism to silence this revolutionary roar. As Gen. Oscar Bonilla, who helped Pinochet install his seventeen-year dictatorship, put it, "What this country needs is political silence. We'll return to the barracks when we have changed the mentality of the people."

Patricia Politzer's Fear in Chile: Lives Under Pinochet recounts, through fifteen first-person testimonies gathered in the mid-1980s, while Pinochet was still in power, how his dictatorship did just that. By 1973, the United States had succeeded in its stated goal of extinguishing Chilean civility and igniting political passions. It seemed to many that their country had become ungovernable. Chronic shortages of basic goods, violent conflicts, political impasses and swirling rumors of coups and invasions wore Chileans down.

Nearly all of Fear in Chile's witnesses begin their accounts with the coup, and they all convey the exhaustion and confusion of the moment. Andrés Chadwick Piñera recounts his lonely sadness at hearing of Allende's death while his middle-class family, wife and neighbors celebrated. Sympathetic to the revolution, he burned his books and eventually made peace with the regime. Even the most committed became disoriented. Raquel, a student member of the Communist Party, recalls the uncertainty of revolutionary leadership, which told members to first do one thing, then another. Blanca Ibarra Abarca, a shantytown community leader, became "furious" after listening to Allende's radio message broadcasting news of the coup. She wanted "to do something, to fight," but was paralyzed by "pain and impotence." Manuel Bustos Huerta, president of his union, called a meeting but "no one knew anything...some people said we should go home, and others said we should take over the factory. Finally, after much discussion, we decided that people should go home." (Maistre wrote, nearly 200 years earlier, of how confusion would replace revolutionary resolve with resignation: "Everywhere prudence inhibits audacity.... On the one side there are terrible risks, on the other certain amnesty and probable favors. In addition, where are the means to resist? And where are the leaders to be trusted? There is no danger in repose.")

At times the polarization described by Politzer's witnesses seems absolute. While many wept upon hearing news of Allende's death, others bonded in anti-Communist solidarity: "Everyone from the block got together in a neighbor's house to celebrate.... Everyone brought something and it was a very joyous occasion."

But it is where the testimonies intersect, often at unexpected junctures, that Fear in Chile reveals just how deep and popular both the revolution and counterrevolution were. Blanca Ester Valderas and Elena Tesser de Villaseca recount radically different experiences and backgrounds. Valderas is a poorly educated rural woman whose husband was murdered in Pinochet's coup. Under Allende, after growing weary of following her husband through a series of dead-end jobs, Valderas joined the Socialist Party and was appointed mayor of her town. Even after the coup, when she was forced to change her name and go into hiding, she continued in politics, working with Chile's nascent human rights organizations. Tesser de Villaseca is a well-to-do "Pinochet diehard" who untiringly organized women to bring Allende down, even though she denies that either she or her husband is "political." Nor did she return home after Pinochet took power; instead Tesser de Villaseca and her friends threw themselves into myriad social welfare organizations aimed at making Chileans "a sound race again, to make the country healthy." Despite the different historical consequences of their actions, both women used politics as an avenue of upward mobility, to escape the restraints of family and to influence civic life.

In Costa-Gavras's movie Missing, which, while not mentioning Chile specifically, depicts Pinochet's coup, the first repressive act shown is of soldiers pulling a woman off a bus queue and cutting off her slacks, warning her that in the new nation, women do not wear pants. Many of the voices in Fear in Chile recall similar acts of violence: men who had their long hair shorn; women who were ordered to wear skirts; a worker who was arrested and tortured for being "an asshole" and not acting sufficiently submissive to authority. Notwithstanding Allende's supposed alignment with the Soviet Union and his threat to economic interests, acts like these illustrate that the real danger of the Chilean left was not that it undermined secular liberal democracy but that it promised to fulfill it, to sweep away the privilege and deference of patriarchy and class. "It was as if we had suddenly returned to a past era," recalls the wife of an Allende functionary in recounting her dealings with male military officers who, prior to the coup, she'd treated as friends and equals.

For many, Pinochet realigned a world that had spun out of control, and the power of Politzer's book is that it takes seriously the concerns of his supporters. Pinochet remained popular because he satiated the desire of many Chileans for both order and freedom. He haunts the pages of Fear in Chile like Maistre's powerful but distant sovereign, who "restrains without enslaving." As one of Pinochet's supporters put it, "I believe in a democracy in which certain general objectives are submitted to a vote; after that, each matter should be handed over to experts capable of realizing those objectives. In a family, for instance, where there is a health problem, you don't have a democratic vote about what steps to take."

It is this image of a family that is constantly invoked by followers of the regime to symbolize a just society, a family with Pinochet as the wise and strong father ("I adore Pinochet," says Tesser de Villaseca. "I adore him because he is a superhuman person who is also sensible and worthy") and his wife, Lucía, as the empathetic mother ("an extraordinary woman," says a Pinochet colonel, "who has created a volunteer corps in Chile that should be an example to the world. She's like a diligent little ant who works in different areas and also collaborates well with her husband").

Pinochet's success in generating a degree of popular legitimacy ultimately rested on violence and terror. By the time he left office, in 1990, his regime had arrested 130,000 people, tortured 20,000 others and, if the killing that took place during the coup is included, murdered between 5,000 and 10,000 Chileans. Fear not only led people to burn their books, drop out of politics, go into hiding and exile and switch allegiances, but allowed those who supported the government and dreaded a return to anarchy and conflict to justify murder: "I don't have any special knowledge about DINA [Pinochet's intelligence agency, responsible for a good deal of the terror], but if they were really out to find people working against democracy, people who didn't hesitate to kill to achieve their goals, I think what they were doing was good. I'm not one of those who don't believe that there were disappeared persons," says Carlos Paut Ugarte, an economist who returned to Chile following Allende's overthrow to work in Pinochet's government.

From Edmund Burke to Jeane Kirkpatrick, it has been the lie of modern counterrevolutionary thinkers that, against totalitarian abstractions, they defended historical actuality. The status quo is what should be, they say, and any effort otherwise leads straight to the guillotine or the gulag. But Pinochet's god, father and homeland were no less utopian and intangible than the just nation that Allende and Popular Unity hoped to build--the difference being that Pinochet had guns and the United States.

In his day Maistre was optimistic that restoration could be brought about with little violence. "Would it be argued," he asked, "that the return from sickness to health must be as painful as the passage from health to sickness?" Writing before the great counterinsurgency terrors of the nineteenth and twentieth centuries, he can be excused his sanguinity. But Korry, too, liked to draw on historical analogies to make his case, and he has no such excuse. "There is a graveyard smell to Chile," he wrote immediately after Allende's election, "the fumes of a democracy in decomposition. They stank in my nostrils in Czechoslovakia in 1948 and they are no less sickening today."

It is too bad Korry couldn't escape the prison of his own abstractions and draw a lesson from a more relevant historical referent: Indonesia in 1965, where anti-Communist government agents slaughtered, as the United States watched, hundreds of thousands of its citizens. After all, the analogy was not lost on the CIA, which dubbed Pinochet's coup "Operation Jakarta."

At work recently, I went to get a ham sandwich from the university cafeteria. I discovered, to my vocal dismay, that the well-loved food counter offering homemade fare had been torn out and replaced by a Burger King franchise. Questioned about this innovation, the head of "food services" insisted that it had been implemented in response to consumer demand. An exhaustive series of polls, surveys and questionnaires had revealed, apparently, that students and faculty were strongly in favor of a more "branded feel" to their dining environment.

It is worth pausing over the term "branded feel." It represents, I think, something profound: The presence of Burger King in the lunchroom is claimed to be a matter of affect. It addresses itself to "feelings"; it meets a need that is more emotional than economic. This need has been identified, I was informed, by scientific and therefore inarguable means. The food-services honcho produced statistics that clearly indicated a compelling customer desire for bad, expensive food. According to his methodology, my protests were demonstrably elitist and undemocratic.

It is hardly news that opinion polls are frequently used to bolster the interests of those who commission them. But in recent years the notion that opinion can be measured in quantifiable terms has achieved unprecedented power and influence over public policy. The American penal system, for instance, has been rendered increasingly violent and sadistic as a direct response to opinion polls, which inform politicians that inhumane conditions are what voters desire. The thoughts and emotions of human beings are regarded as mathematically measurable, and the practical effects of this notion are now perceptible in the most mundane transactions of daily life.

This quantified approach to human nature is the result of the importation of theoretical economics into the general culture. Since the marginalist revolution of the late nineteenth century, neoclassical economists have rigidly confined their investigations within the methodological paradigm of positivist science, and they aspire in particular to the model of mathematics. Economists seek to produce empirically verifiable, statistical patterns of human behavior. They regard such studies as objective, unbiased and free of value-laden, superstitious presuppositions. The principle of "consumer sovereignty" hails this mode of procedure as the sociological arm of democracy, and it has made economics the most prestigious of the human sciences.

As David Throsby's Economics and Culture and Don Slater and Fran Tonkiss's Market Society show, the procedures of academic economists are now being further exalted to a position of dominant influence over everyday experience. Homo economicus is fast becoming equated with Homo sapiens. When airlines refer to passengers as "customers" and advise them to be "conservative with your space management," this development may seem trivial or comic. But in their very different ways, these books suggest that beneath such incremental cultural mutations there lurks an iceberg of titanic dimensions.

The Australian academic David Throsby is about as enlightened and humanistic as it is possible for a professional economist to be. He is also an accomplished playwright, and his influence on the political culture of his native land has been extensive and unvaryingly benign. He begins from the accurate supposition that "public policy and economic policy have become almost synonymous," and his intention is to rescue culture from the philistinism of businessmen and politicians who are incapable of lifting their eyes above the bottom line. It is a lamentable sign of the times, however, that he sees no other means of doing so than by translating aesthetic endeavor into quantifiable, economic terms. As he puts it, "If culture in general and the arts in particular are to be seen as important, especially in policy terms in a world where economists are kings, they need to establish economic credentials; what better way to do this than by cultivating the image of art as industry."

In order to cultivate this image, Throsby makes extensive if ambivalent use of the "rational-choice theory" derived from the work of Gary Becker. In Becker's opinion, the kinds of decision-making that economists contrive to abstract from the actions of people conceived as economic agents can be extrapolated to explain their behavior in areas of life that were once, romantically and unscientifically, thought of as lying beyond the arid terrain of rational calculation: love, for example, or aesthetic endeavor. This emboldens Throsby to ask whether we "might envisage creativity as a process of constrained optimisation, where the artist is seen as a rational maximizer of individual utility subject to both internally and externally imposed constraints," and to postulate "a measure...of difference in creativity (or 'talent'), in much the same way as in microeconomic analysis differences between production functions in input-output space measures differences in technology."

There are enough caveats in Throsby's book to indicate a laudable reluctance to engage in this project; however, he evidently feels that the current climate of opinion leaves him no other choice. He is thus driven to apply the economic understanding of "value" to cultural phenomena, and to engage in a "consideration of culture as capital...in the economic sense of a stock of capital assets giving rise over time to a flow of capital services." Much of this book consists of a monomaniacal reinscription of life itself into the technical discourse of neoclassical economics. We are therefore subjected to lengthy discussions of "cultural capital" (formerly known as "culture"), "social capital" (a k a "society"), "physical capital" (née "buildings"), "natural capital" (alias "nature") and of course "human capital" (once referred to as "people"). There is, it seems, no limit to the colonizing potential of economics: "If broader cultural phenomena, such as traditions, language, customs, etc. are thought of as intangible assets in the possession of the group to which they refer, they too can be brought into the same framework."

We are faced here, essentially, with the quantification of all human experience. Not merely economic behavior but every aspect of life and thought can be expressed under the statistical rubric and studied in mathematical form. The notion of the "stakeholder," dear to Tony Blair, whose ambition to create a "stakeholder society" is overt and unapologetic, is fundamental to this project.

A stakeholder stands in relation to the world as a shareholder does to a corporation. He (or she) casts a cold eye on his surroundings and perceives only his "stake" in them; he rationally considers the means by which he may optimally maximize their benefits. The stakeholder, then, is not human. He is rather a quantified abstraction from humanity, a machine designed for the calculation of marginal utility. Good-hearted economists such as Throsby would retort that the stakeholder does not enjoy an empirical existence; he is merely a useful theoretical construct. Would that it were so. But in fact, as Hannah Arendt said of neoclassical economics' cousin, behavioral psychology: "The problem...is not that it is false but that it is becoming true."

There is an interesting convergence between rational-choice theory and the venerable tradition of socialist materialism. Both approaches insist that the real factor motivating human behavior is economic self-interest: that of an individual in the former case, and that of a social class in the latter. The British sociologists Don Slater and Fran Tonkiss address many of the same questions as Throsby in their book Market Society, but they view the conquest of intellectual and social life by economics from a more traditionally leftist perspective. Like Throsby, Slater and Tonkiss acknowledge that "market logic has come to provide a means of thinking about social institutions and individuals more generally," but instead of concluding that students of aesthetics must therefore incorporate economic concepts into their practice, they envisage a movement in the other direction. Today, they claim, "the economist's task of explanation is as much interpretive or hermeneutic as it is mathematical."

Slater and Tonkiss are influenced here by the "rhetorical turn" that economists such as Deirdre McCloskey have recently attempted to introduce into their discipline. The increasingly abstract nature of money, it is claimed, lays bare the fact that financial value, like semiotic meaning, is an imaginary and therefore arbitrary mode of signification. As such, money can be studied using terms and concepts drawn from rhetoric and literary criticism. (An amusing parody of this idea occurs in Will Self's novel My Idea of Fun, which features a "money critic" whose job is to pontificate about the aesthetic qualities of various forms of finance.) Slater and Tonkiss present this as an appealing reversal of intellectual roles: "Whereas the central preoccupation of critical social analysis has traditionally been the way in which economic rationality dominates culture, contemporary social theory has been increasingly concerned with the central role of cultural processes and institutions in organizing and controlling the economic."

Although their emphasis is different, Slater and Tonkiss's argument leads to the same essential conclusion as Throsby's: It no longer makes sense to distinguish between "economics" and "culture," or between "the market" and "society." In practice, it makes little difference whether one regards this as an incursion of aesthetics into economics or vice versa. Indeed, Slater and Tonkiss are a good deal more pessimistic than Throsby about the consequences of this development. To their credit, they are willing and able to introduce into the discussion concepts like "commodification" and "alienation," from which even liberal economists like Throsby recoil in horror. But they stop well short of the bleak dystopianism of Adorno, and their slightly anodyne conclusion is that "markets are not simply good or bad, because they are highly variable." This pluralism is forced upon them, because their book is intended as a historical survey of various theoretical approaches to the market: Market Society provides admirably lucid and meticulously fair readings of Smith, Ricardo, Durkheim, Simmel, Weber and Polanyi. Despite its historical approach, the most beguiling feature of the book is that its treatment of such past thinkers is undertaken with a prominent sense of our present predicament.

Discussing the economist whose theories have had the greatest influence on that predicament, Slater and Tonkiss remind us that "Hayek held that ultimately there were no economic ends as such; economic action always served ends that were non-economic in character because needs and desires are exogenous (or external) to the market setting." But to say that there are no economic ends is the same as to say that there are only economic ends. It is, in other words, to abolish any distinction between the economic and the noneconomic. Toward the end of Economics and Culture, Throsby observes that "in primitive societies...culture and economy are to a considerable degree one and the same thing." By this definition, as each of these important and timely books suggests, our society may be the most primitive of all. Can anyone, today, escape the "branded feel"?

That would-be martyr John Walker--the mujahid of Marin County--has done something more than give a bad name to my favorite Scotch whiskey. He has illuminated the utter unfitness of our police and intelligence chiefs for the supreme power they now wish and propose to award themselves. And he has also accidentally exposed the stupidity and nastiness of the Patriot Act.

Desperate to be rid of a repressive regime, many turn to militant Islam.

Scattered chunks of films littered the theaters this holiday season. Except for The Royal Tenenbaums, which I've told you about, there wasn't a whole movie to be found. Or, to speak more precisely, no movie except The Royal Tenenbaums gave me the impression of wholeness, by which I mean the pleasure that arises when the mind can play back and forth through a picture, discovering how the details enrich one another.

No doubt I value this pleasure so much because I've been trained, as a critic, to look for it. Surrealists, post-structuralists and the average moviegoer do not. Even so, I believe that when artists aspire to wholeness, they put into their work a kind of sustained intelligence that we might call integrity, care or love. When I claim that this quality is missing from most movies nowadays, I of course say almost nothing. Maybe a slightly higher percentage of today's films are hash, compared to the run of productions in the 1930s; but that's for the cliometricians to decide. The critic's challenge is to find some response to the present year-end Oscar contenders, when there's no object of criticism among them.

Should I solve the problem by jumping outside the film world? Then, from a safe distance, I could belabor the politics of Black Hawk Down for being simple-minded, and the politics of Iris for being absent. Many useful comments could be made on these subjects. They just wouldn't be useful to someone who already reads The Nation.

So I suppose I'll have to do what moviegoers have always done: ignore the pictures and watch the stars. I won't talk about The Majestic and Ali, Monster's Ball and A Beautiful Mind. The subjects of this column will be Jim Carrey, Will Smith, Halle Berry and Jennifer Connelly. Let me begin with Connelly, who in A Beautiful Mind has finally achieved recognition as an actress, and in so doing has given the film a large part of its merit.

As you may know, A Beautiful Mind offers a loose approximation of the story of John Nash, a highly gifted mathematician who has struggled all his life against delusions and compulsions. The film, too, suffers from some mental confusion--screenwriter Akiva Goldsman and director Ron Howard somehow got Nash's biography mixed up with Jack and the Beanstalk--but once you get past that problem, you may appreciate the cleverness of this quasi-fairy tale. To begin with, the filmmakers have invented some briskly effective ways to suggest that Nash has a miraculous talent for pattern recognition, and that such a talent can be dangerous. Even when there's no order to be found, his mind keeps searching for one; and since the cold war provides great material for paranoia--the film begins in the late 1940s--Nash has a world of troubling data to sort. In a risk that's bold by Hollywood standards, the film presents its hero's blossoming delusions as if they were real--that is, as he would experience them. You're well into the story before you can sift the facts from the hallucinations, a process that's made compelling by Russell Crowe's performance in the lead. Awkward, shuffling, aggressive, witty, exasperating and vulnerable, he's altogether credible as someone who thinks in abstractions for a living.

But back to Connelly. She plays Alicia Larde, the woman who courts, marries and helps to rescue Nash. The filmmakers turn A Beautiful Mind into her story, almost as much as it is her husband's, and that's as it should be. Alicia is the one who gets scared witless, calls in the shrinks, strives to keep the household together and howls in the bathroom at 2 am. Connelly deserves full credit for carrying off the role.

It's a credit that's long been denied her. Although she's done some good work in smaller productions--Keith Gordon's Waking the Dead, Darren Aronofsky's Requiem for a Dream--Connelly has suffered till now from the Elizabeth Taylor syndrome. Like Taylor, she started young in show business and was quickly turned into a physical commodity, cast for her dark hair, blue eyes, smooth face and a buxom figure that she exposed very freely, arousing both sexual interest and condescension in a single gesture. The condescension came all the more quickly because Connelly, like Taylor, seems submerged in her beauty. It tends to separate her from other actors, as a rare fish is held apart in an aquarium, with the result (among other things) that she's a bad choice for comedy. Connelly can play at being amused by someone, but she isn't funny in herself--in contrast, for example, to her near-contemporary Shannon Elizabeth, a wonderfully silly person who shares her looks like a good joke.

Connelly has so far been incapable of such lightness; but she's right at home with the intensity of suffering that's called for in melodrama. Now her reputation is taking an upward turn similar to Taylor's at the time of Suddenly, Last Summer and Butterfield 8. Heaven knows, I don't want to go on to Cleopatra; but as someone who respects the tradition of melodrama, I think American cinema would be stronger if producers created more roles for Jennifer Connelly.

Having just seen Monster's Ball, I will also say the same for Halle Berry. She, too, has based her reputation on being absurdly gorgeous, with this distinction: Berry treats her looks like a loaded gun, which she can and will use. Of course, the danger varies; there was a lot of it in Bulworth but not much, somehow, in The Flintstones. Now, in Monster's Ball, the sense of risk suddenly leaps to a higher order.

Berry plays a wife and mother in a present-day Southern town--wife to a man on death row, mother to a boy who weighs 180 pounds and has not yet reached puberty. Through a series of catastrophes--or perhaps I should say wild coincidences--she eventually finds herself on the sofa late at night with Billy Bob Thornton, the racist white prison guard who led her husband to the electric chair. Grief, fatigue and booze are weighing heavily on her. She needs to wriggle free of them; everything that's still alive in her demands it. And so, in a scene that becomes a tour de force, she laughs in reminiscence about her husband, insists to herself that she's been a good mother, philosophizes starkly about the lives of black men in America and ultimately pours herself into Thornton's lap, demanding, "Make me feel good."

The screenwriters of Monster's Ball, Milo Addica and Will Rokos, might easily have based this scene on an acting-class exercise. A pair of students are assigned random emotions and must then improvise their way through them, making up the transitions as they go. What Berry does with the scene, though, has no whiff of the classroom. She doesn't just bob along on the swells and troughs of her feelings; she remembers at all times that these emotions have welled up because of the stranger next to her, this oddly quiet man to whom she addresses the whole monologue. She seems half-blind when she looks at him, but only half. She pushes against his self-possession, moment by moment; and the steadier he holds, the further she plunges in.

I wish the rest of Monster's Ball could live up to this scene. There are several fine sequences in the movie, which Marc Forster has directed with admirable restraint; but the picture is entirely too eager to flatter the audience. Monster's Ball is a machine, designed to make Billy Bob Thornton think and behave just as you believe he should. By the end, there's nothing to cut the good intentions except the memory of that smoky, greasy, overpowering scene where Halle Berry risks everything. It's almost enough.

The opening fifteen minutes of Ali are so good that they, too, come close to justifying the picture. In a virtuoso montage, which shows director Michael Mann at his very best, this sequence takes young Cassius Clay up to his first fight against Sonny Liston and his declaration of allegiance to the Nation of Islam. After that, you begin to notice that four screenwriters have labored over this production. Plot points are made with the galumphing literal-mindedness of Bob interviewing Ray. What's worse, these same points, from Liston I through the Foreman match in Zaire, were touched on in the 1977 film The Greatest, written by Ring Lardner Jr., directed by Tom Gries and Monte Hellman and starring (in the role of Muhammad Ali) Muhammad Ali.

Condemned in advance to being third best, after the real-life figure and the original movie incarnation, Will Smith can do little more than look good. It's what he specializes in; I've loved him for it. Here his innate cockiness takes him a long way in the role, as does his rapper's enjoyment of Ali's rhymes. So why does he keep getting upstaged by his supporting cast: Jamie Foxx, who makes something glorious of Ali's sidekick Drew "Bundini" Brown, and Jon Voight, who lives and breathes the role of Howard Cosell? The answer, I think, is that Smith does best when he floats along at a slight remove from his scenes, commenting on the action as if he might at any moment call it a day and go home. Ali makes him earnest; and earnestness, even more than the need to mimic a living figure, makes Will Smith disappear.

I wish Jim Carrey would disappear when he becomes earnest; but instead he latches onto the movie like a tick, gorging on sentiment and perpetually, monstrously sucking in more. The effect is all the worse in Frank Darabont's The Majestic because of the cinematography. It turns Carrey into a pastel-colored tick.

In this insufferable fantasy about good old-fashioned movies and good old-fashioned Americans, Carrey plays a blacklisted Hollywood screenwriter who (through a wild coincidence) loses his memory and is welcomed into a small town. It's a wonderful life, except for the FBI. I needn't point out to Nation readers how The Majestic makes a hash out of the blacklist period. (Carrey figures out, in a climactic burst of inspiration, that he can plead the First Amendment before HUAC. Gee!) What really concerns me is the demotion of this anarchic genius to the status of All-American Nothing. Carrey can play comedy like nobody else alive; so why is he pushed into melodrama?

My conclusion: American cinema is taking its actors too seriously, and its actresses not seriously enough. Happy new year.

With little public notice and no serious debate inside the party, Democratic National Committee chairman Terry McAuliffe and his allies have hatched a plan to radically alter the schedule and character of the 2004 Democratic presidential nominating process. If the changes McAuliffe proposes are implemented--as is expected at a January 17-19 meeting of the full DNC--the role of grassroots Democrats in the nomination of their party's challenger to George W. Bush will be dramatically reduced, as will the likelihood that the Democratic nominee will run the sort of populist, people-power campaign that might actually pose a threat to Bush's re-election.

The change, for which McAuliffe gained approval in November from the DNC rules subcommittee, would create a Democratic primary and caucus calendar that permits all states to begin selecting delegates on February 3, 2004. That new start-up date would come two weeks after the Iowa caucuses and just one week after the traditional "first in the nation" New Hampshire primary. Thus, the window between New Hampshire and the next primary--five weeks in 2000--would be closed. Already, says McAuliffe, South Carolina, Michigan and Arizona Democrats have indicated they will grab early February dates, and there is talk that California--the big enchilada in Democratic delegate selection--will move its primary forward to take advantage of the opening. McAuliffe's changes will collapse the nominating process into a fast-and-furious frenzy of television advertising, tarmac-tapping photo ops and power-broker positioning that will leave little room for the on-the-ground organizing and campaigning that might allow dark horse candidates or dissenting ideas to gain any kind of traction--let alone a real role at the 2004 Democratic National Convention.

"What McAuliffe is doing represents a continuation of the shift of influence inside the Democratic Party from volunteer-driven, precinct-based grassroots politics to a cadre of consultants, hacks and Washington insiders," says Mike Dolan, the veteran organizer who ran voter-registration campaigns for the California Democratic Party before serving as national field director for MTV's "Rock the Vote" initiative. "This whole process of reshaping the party to exclude people at home from the equation has been going on for years, but this really is the most serious change we've seen. And it's an incredibly disturbing shift. It will increase the power of the consultants and the fundraisers. But it will also make it a lot harder to build the enthusiasm and volunteer base a candidate needs to win in November."

McAuliffe, who is riding high after playing an important role in securing Democratic wins in November 2001 races for the Virginia and New Jersey governorships, says reforms are needed to avoid long intraparty struggles and allow a clear focus on the task of challenging Bush. With a wide field of Democratic senators, governors, representatives and a former Vice President positioning themselves to run in 2004, he says, "We can't be going through the spring with our guys killing each other."

McAuliffe makes no secret of his desire to have Democrats mirror the Republicans' compressed nominating schedule--which helped front-runner Bush dispatch the more appealing John McCain in 2000. He wants his party's 2004 nominee identified by early March. Then, the nominee-in-waiting can get down to the business of fundraising and organizing a fall campaign without having to march in Chicago's St. Patrick's Day parade, visit Wisconsin's dairy farms or jostle for a position on the stage of Ohio's union halls.

One problem with McAuliffe's theory is that history suggests that Democrats who beat sitting Republican Presidents usually do so following extended nomination fights. In 1976, for instance, almost three months passed between the Iowa caucus and the point at which a majority of delegates to the Democratic National Convention had been selected. That convention nominated Jimmy Carter, who went on to beat President Gerald Ford. The next Democrat to beat a Republican President, Bill Clinton, won his party's 1992 nod after a bruising primary season that saw him fighting Jerry Brown for New York votes two months after the delegate-selection process began.

A serious state-by-state fight for the party nod can force the eventual nominee to build grassroots networks in key states that withstand the media assaults of the fall; just think how things would have gone if Al Gore had developed better on-the-ground operations in states with solid labor bases, like Missouri, West Virginia and Ohio--any one of which could have provided the Electoral College votes needed to render Florida's recount inconsequential. Instead of recognizing the advantage Democrats gain when they tend the grassroots, however, former candidate Brown says McAuliffe appears to be steering the party toward a model that mirrors Republican approaches. "The process is evolving and it's changing so that it will be even harder to tell Democrats from Republicans," Brown says. "This means the Democrats will be defined more than ever by money and the centralized, Washington-based establishment that trades in money. The trajectory the party is on is not toward greater democracy, not toward more involvement at the grassroots. Rather, the trajectory will make it harder for the local to influence the national. A historic democratic influence on the process is being wiped out, and with it will go a lot of energy Democratic nominees have been able to rely on in the past."

Brown touches on another problem with McAuliffe's approach. In a party already badly warped by the influence of special-interest money and fundraising demands, the new schedule will greatly expand the influence of big money--and of Washington insiders like veteran fundraiser McAuliffe, who can move that money into accounts of "acceptable," if not particularly progressive, candidates. "Everyone agrees the financial demands on candidates will be even higher than in the past, given the breakneck pace at which the contests will unfold," explains Washington Post columnist David Broder.

That bodes well for the best-known candidates with the strongest fundraising networks, like former Vice President Al Gore and Connecticut Senator Joe Lieberman, and also for well-heeled senators like Massachusetts' John Kerry and North Carolina's John Edwards. But low-budget, issue-driven campaigns, like those imagined by Senator Russ Feingold of Wisconsin, Congresswoman Marcy Kaptur of Ohio or outgoing Vermont Governor Howard Dean, will be even more difficult to mount. That, says former Democratic National Committee chairman Fred Harris, is bad news for the party and for progressive politics in America. "If you tighten up all the primaries at the start, it will limit the serious choices for Democrats to those candidates who are well-known or well-financed, or both. That takes away the range of choices, it makes the process less exciting and, ultimately, less connected to the grassroots," says Harris, a former senator and 1976 candidate for the presidency. "This really is a move in the wrong direction. The Democratic Party, to win, needs to be more democratic--not less."

The regulations proposed to implement George W. Bush's order establishing military commissions for the trial of "international terrorists" are mere window dressing and will not cure the fatal defects of the order. They provide the accused with so little protection as to raise the suspicion that they were drafted primarily to disarm critics.

The fundamental problem is that the proposed system, including all its "judicial" elements, still lies entirely within the military chain of command and remains subordinate to the President, who is the ultimate authority over every aspect of the proceedings. But independent, impartial judges who are not beholden to any side are the indispensable bedrock of any credible system of justice. They must be the ones to make the basic decisions, or at least to review them. Without an independent tribunal to monitor the proceedings, the various "protections" provided by the proposed regulations--the presumption of innocence, guilt beyond a reasonable doubt, even outside counsel--mean little or nothing.

This is not a novel insight. Congress and the military have recognized how indispensable an independent judiciary is to a meaningful system of justice: Under the Uniform Code of Military Justice, verdicts are not final until they have been reviewed by a civilian Court of Appeals for the Armed Forces. The appeal mechanism in the proposed regulations, by contrast, adds nothing to the fairness of the process, especially in cases as politically and internationally sensitive as these--it merely insures that the final decision will be made by higher-ranking military officers who are still subject to military and presidential control.

White House counsel Alberto Gonzales, aware of these shortcomings, has sought to reassure doubters by noting that habeas corpus review will be available. But the order itself, which the regulations are only supposed to implement, expressly prohibits recourse to any court, as he well knows. For this reason, he was careful to describe the review as just a check on the jurisdiction of the tribunal, that is, whether the commission has the legal authority to try the particular accused. But review of a tribunal's jurisdiction does not touch on any substantive or procedural aspect of a proceeding, such as apprehension, detention, pretrial procedure, trial, evidentiary rulings, verdict or the sentence.

Moreover, as noted, the order specifically mandates that the ultimate authority is the President. Since the initial decision to apprehend someone is also the President's, and since everyone in the decision-making process, including the prosecutor, is subordinate to the President as the Commander in Chief, the police, prosecutor, some defense counsel, judge and jury are all rolled into one entity subject to one man--the antithesis of a just system. And given the rigidity of the military hierarchy and the natural desire of military personnel for promotion, who would challenge a judgment of their Commander in Chief that there is reason to believe someone is guilty of international terrorism and must be taken into custody--even if, as in so many instances, the action is as much for political reasons as for national security?

Compounding the difficulty is the absence of any real limit on what evidence may be admitted. The tribunal still may admit single, double and triple hearsay, affidavits, opinion and other dubious evidence. None of this can be effectively tested by cross-examination, especially since some of this evidence can be kept secret from the accused and his lawyers.

The decision to open up the proceedings to public view looks good, but it is only conditional--they may be closed if evidence that the tribunal considers worthy of secrecy is to be admitted. We have learned to our dismay how quick government officials are to classify information, even when it is already in the public domain. This Administration is particularly secretive, as shown by Bush's order holding back presidential papers from public release, as well as the refusal to reveal any information about the 1,000-plus detainees held since September 11. Moreover, the usual reason for secrecy is that disclosure will reveal methods and sources. But reliance on sources often involves very subjective judgments based on inaccurate or untrustworthy information. Yet it is just this kind of evidence that is most likely to be kept secret.

These are not tribunals worthy of a nation governed by law. And we don't need them. In the past eight years we have convicted twenty-six terrorists in ordinary criminal trials, for the 1993 World Trade Center bombing and other crimes, without revealing any secrets. The Administration realizes this, for it has decided to try the alleged "twentieth hijacker," Zacarias Moussaoui, in the criminal justice system.

The problem with these proposals is not that some people will never be satisfied--it is that the demands of justice have not been satisfied.

So Rudy is the person of the year.
We join the world in offering a cheer
To him--a man, some thought, was sent by heaven
To guide us through the shock of 9/11.
At certain times, it now must be conceded,
A paranoid control freak's just what's needed.