Earthsea, Ursula Le Guin's magical world of islands and archipelagoes, is going through a period of intense, uncomfortable social change. The old ways no longer work and the new ones are not yet clear. At last there is a central government, though its young head of state is still establishing his authority, and it's bumpy going in the wild Kargish lands, where the religion, language and ethnicity are different and the women wear burqas. He has also encountered some resistance from the college of wizards at Roke, a theocratic caste that has ruled for centuries and become rather stiff and doctrinaire, as well as hateful toward women. Now Earthsea has suddenly been plunged into turmoil by two simultaneous assaults. One is an invasion of the collective unconscious by the voices and images of the dead, who beg to be set free from the dry land behind the wall of stones where they are confined. The other comes in fire from the skies, as dragons zoom in from the west to attack farm and forest. What is the reason for these threats? Are they connected? And does this society have what it takes to meet them?
Such are the themes of Ursula Le Guin's two new Earthsea books, Tales From Earthsea and The Other Wind: the boundary between life and death, terror from the sky and how hard it is for male-dominant societies to listen to women. Timely themes, from an acknowledged master not only of fantasy but of science fiction as well, a feminist, anarchist and Green whose books are taught in universities, and who has won many literary prizes (five Nebulas, five Hugos, the National Book Award for children's literature, a Newbery silver medal and a Horn Book Award). In a country that valued wisdom and symbolic thinking, these two books would have been met with hosannas from coast to coast.
Does it matter that they weren't? I think so. To me, Le Guin is not only one of the purest stylists writing in English but the most transcendently truthful of writers. The books she writes are not true in the way facts are true; they speak to a different kind of truth and satisfy a desire for narrative that is so fundamental it must be in our cells. As she puts it:
The great fantasies, myths, and tales are indeed like dreams: they speak from the unconscious to the unconscious, in the language of the unconscious--symbol and archetype. Though they use words, they work the way music does: they short-circuit verbal reasoning, and go straight to the thoughts that lie too deep to utter. They cannot be translated fully into the language of reason, but only a Logical Positivist, who also finds Beethoven's Ninth Symphony meaningless, would claim that they are therefore meaningless. They are profoundly meaningful, and usable--practical--in terms of ethics; of insight; of growth.
"The Child and the Shadow" (1975), in The Language of the Night
Le Guin wrote the first three Earthsea books thirty years ago. A Wizard of Earthsea (1968) is the coming-of-age story of the boy Ged, who meddles in forbidden lore and summons up a rough, bearlike Shadow, who attacks and nearly kills him. The rest of the book concerns Ged's struggle to understand this Shadow, so strong it could bring destruction to the world unless he can defeat it. What is this rough beast? Why does it increasingly resemble him?
The second Earthsea book, The Tombs of Atuan (1970), takes place in the Kargish lands, which are separate from and more primitive than the rest of Earthsea. It is the story of Tenar, who as a small child became Arha, the Eaten One, priestess of the tombs of Atuan, ruled by the old earth powers of death, blood and brooding revenge. Into this dark underground labyrinth comes Ged, looking for the ring of Erreth-Akbe, which bears a lost rune of peace that can bring about a new era. Injured, starving, trapped, he is not strong enough to fight the old earth powers and escape unless Tenar helps him. Her entire upbringing urges her to kill him, but he is the first man she has ever seen as well as the first wizard, and she is tempted. In the end, she chooses life and escape, seeing that, by freeing him, she can also free herself. But then what? Where can she go once she is free?
Although Le Guin has been heavily influenced by Tolkien, her cosmology differed from his from the beginning. While both write of lands ruled by magic, Tolkien's Middle-earth has states and civil society; Earthsea has principalities but is more or less ruled by a caste of celibate priest-wizards centered on the Island of Roke, whose inborn mastery has been schooled at the college. In Earthsea, power of this kind is based on the Language of the Making, which is also the language of dragons, only they are born knowing it; men have to learn it. Names in the Language of the Making are the thing, and knowledge of them confers power, over nature and over other people. A wizard who knows someone's true name can control him. But mature wizards do not use their power any more than they have to, for the ruling principle of Le Guin's world is not Tolkien's struggle between good and evil, but equilibrium, balance. Earthsea is a Taoist world (Le Guin has actually translated the Tao Te Ching), where light and dark, life and death are yin and yang, intertwined rather than opposed. The world gets out of balance when one side of an opposition gets too strong: light, wizardry, men. When men of power use their knowledge to fence themselves off from the dailiness of ordinary life--farming, mending, giving birth, and women--trouble is coming. Such hubris can lead to denial of death itself. It does in the third book, The Farthest Shore (1972).
The Farthest Shore begins with the inexplicable: magic, the organizing principle of Earthsea, is failing and no one knows why. Gradually it becomes clear that a destroyer has arisen, a terribly powerful wizard, Cob, who awakens the terror of death while promising immortality to any who will follow him. His followers drift in dumb despair, work ceases and meaning drains out of the world. Ged, now Archmage (head of the wizard's council), and his young disciple Lebannen, destined to be the long-awaited king, must trace this peril to its source and defeat it. To do so, they must cross the wall of stones into the dry land, the land of death, where no wind blows, no sun shines, and people, still trapped in the prison of their names, wander forever, unable even to recognize those they once loved.
Through many perils Ged and Lebannen seek the physical entrance to the dry land but can only find it when aided by dragons. The plague of despair has affected the dragons too; their young are killing one another and drowning themselves in the sea, and even the wisest are in danger of losing their language and themselves. After a hard pursuit and struggle in the dry land, Lebannen and Ged together defeat Cob and Ged reseals the gap between life and death. But in doing so, he drains his own power; he is no longer a wizard, no longer strong enough even to walk. Lebannen must carry him over the Mountain of Pain, which is the only exit from the dry land, to the beach, where the dragon Kalessin, the eldest, awaits them. Now that Ged has lost his power, he can no longer be Archmage; Kalessin flies him on past Roke to his home island of Gont. But Lebannen will be crowned king and bring about a new era under the rune of peace that Ged and Tenar brought from underground so many years before.
So ends Le Guin's third Earthsea book. She thought it was the last. Then, twenty years later, she suddenly wrote a sequel, Tehanu (1990). I interviewed her at that time and asked her why. She said she had to tell what happened to Tenar. She had tried to earlier but couldn't; she was too caught in the tradition of heroic male fantasy to be able to figure out what would happen to a woman in a Tolkien world. "That is why I had to write this fourth volume, because I changed. I had to show the other side."
But what is the other side of heroic male fantasy? The answer is not as simple as flipping a coin with King Arthur on one side, Britomart on the other. Traditionally there are only four possible roles for women in this sort of book: absent beloved, evil witch, damsel in distress and girl warrior. Can one make room for real women without undermining the fundamental premises of the genre?
From Le Guin's practice, it would appear not. Tenar became a farmer's wife because...what else can she do on Gont? This is farm country, after all, and while she has some kind of power, it is not the kind of power of which wizards are made. Even if it were, they would never train her on Roke, where the wizards have the kind of attitude toward women one tends to find in celibate priesthoods. A widow now, Tenar has adopted Therru, a little girl who was beaten and raped and almost burned up in a fire by her parents, so that one of her arms is withered and one whole side of her face is a hardened shell of scars. Therru too has some kind of power but nobody knows what it is. Tehanu begins where The Farthest Shore ends, as the dragon Kalessin delivers Ged into Tenar's care. Tenar has always loved him, and the two finally get together, overcoming his lifelong celibacy and his shame at having lost his power. But peril persists from those who followed the destroyer and, at the end, they can be saved only by the little burnt girl Therru, who calls the dragon back in the Language of the Making, a language she has never been taught. "Tehanu," he names the child, and calls her daughter. We are left wondering, how can this damaged, tormented little girl also be a dragon?
After eleven more years, Le Guin answered that question with Tales From Earthsea and The Other Wind, which do more than undermine the conventions of heroic male fantasy; they retrospectively transform the very history she created in the first three Earthsea books. There are five stories in Tales From Earthsea, but the central one is "Dragonfly." Dragonfly is a big, ungainly country girl, whose real name is Irian. Like Tenar and Tehanu, she has some kind of power nobody can exactly name. She knows she isn't like other people and wants to find out what she is. Finally she encounters somebody willing to take her to Roke to find out. But when she gets there, she comes up against a wall. In the absence of an Archmage, Roke has become factionalized. Thorion, the Summoner, had followed Ged and Lebannen into the dry land. He stayed there too long and was thought dead; now he has somehow returned to life, by the power of his will, and seeks to rule, to become Archmage and preserve the old ways. He says no woman can be admitted into the school on Roke; Irian must leave the island. The wizards are divided; the Master Patterner, Azver, lets her stay with him in the Immanent Grove, and begins to love her. Yet he, like the few others who are willing to deal with her, seems paralyzed; none of them have the strength to stand against the dead man, Thorion, and those who follow him. So when Thorion finally comes to throw Irian off the island, she must defend herself. She challenges Thorion to meet her on Roke Knoll, the hill where things can only be what they truly are:
The air was darkening around them. The west was only a dull red line, the eastern sky was shadowy above the sea.
The Summoner looked up at Irian. Slowly he raised his arms and the white staff in the invocation of a spell, speaking in the tongue that all the wizards and mages of Roke had learned, the language of their art, the Language of the Making: "Irian, by your name I summon you and bind you to obey me!"
She hesitated, seeming for a moment to yield, to come to him, and then cried out, "I am not only Irian."
At that the Summoner ran up towards her, reaching out, lunging at her as if to seize and hold her. They were both on the hill now. She towered above him impossibly, fire breaking forth between them, a flare of red flame in the dusk air, a gleam of red-gold scales, of vast wings--then that was gone, and there was nothing there but the woman standing on the hill path and the tall man bowing down before her, bowing slowly down to earth, and lying on it.
When the others come up to him, he is only "a huddle of clothes and dry bones and a broken staff." Aghast, they ask Irian who she is. She says she does not know. "She spoke...as she had spoken to the Summoner, in the Language of the Making, the tongue the dragons speak." She says goodbye to Azver, whom she loves, touching his hand and burning him in the process, then goes up the hill.
As she went farther from them they saw her, all of them, the great gold-mailed flanks, the spiked, coiling tail, the talons, the breath that was bright fire. On the crest of the knoll she paused a while, her long head turning to look slowly round the Isle of Roke, gazing longest at the Grove, only a blur of darkness in darkness now. Then with a rattle like the shaking of sheets of brass the wide, vaned wings opened and the dragon sprang up into the air, circled Roke Knoll once, and flew.
The Other Wind continues this theme of women who are also dragons, and plays it off against another central theme of these books, the relationship between life and death. For the terrible breach between life and death made by Cob continues. Now the dead have started appearing to the living in dreams, coming to the stone wall at the dry hill, begging to be set free, as if death were a prison. And at the same time, wild dragons have come to take back the land from men; they have come even to Havnor, where the young king, Lebannen, holds court under the rune of peace. All the patterns, clues and oppositions, set up over thirty years in five other books, come to fruition and are worked out in The Other Wind, but the book is so dependent on what came before, so complex, it is impossible to explicate here. It must be read--after the others--then thought on long and hard, for its meanings are not immediately manifest.
Long after reading, certain images stay in the mind. One is the dry land, this prison of death, and its relationship to immortality through the mastery of naming, of language. Another is women who are also dragons, who can find no place here on earth but must fly off beyond the west, on the other wind. Irian, excluded by the men of power, with only a few defenders, who are outnumbered and outweighed by the dead hand--there's plenty of resonance here for any woman who ever found herself a little bit too far ahead of the affirmative-action curve. As far as gender goes, these books seem to me a true symbolic picture of where we are now, with no untainted source of male power, no mature authoritative leadership of any kind, caught midway in our evolution as social beings, still trying to struggle up out of the ooze onto the land, no longer tadpoles and not yet frogs.
Science fiction and heroic fantasy began as the province of men, and the gradual entry of women into these genres has not necessarily produced more psychological depth overall. The best writers (including Octavia Butler, Samuel Delany, Neil Gaiman, Kim Stanley Robinson, Joanna Russ and Le Guin herself) have given us complex re-visionings of gender and power relations. But most writers have ambitions no higher than those of their counterparts who write in other commercial genres like espionage, crime or romance.
That is why Tales From Earthsea and The Other Wind are cause for celebration: they are books by a master stylist writing at the height of her powers. Although plenty of mass market fantasy is written in extremely pedestrian prose, style is key in fantasy, as in poetry. For fantasy is a pure creation of the imagination, summoned unto existence by the language of the making. Le Guin's style is as spare, plain, American and transparent as a northern lake: no tricks, no razzle-dazzle, no lists. "Why," she asks in an early essay, "is style of such fundamental significance in fantasy?"
because in fantasy there is nothing but the writer's vision of the world. There is no borrowed reality of history, or current events, or just plain folks.... There is no comfortable matrix of the commonplace to substitute for the imagination, to provide ready-made emotional response, and to disguise flaws and failures of creation. There is only a construct built in a void, with every joint and seam and nail exposed. To create what Tolkien calls "a secondary universe" is to make a new world. A world where no voice has ever spoken before; where the act of speech is the act of creation. The only voice that speaks there is the creator's voice. And every word counts.
"From Elfland to Poughkeepsie" (1973)
If Le Guin is such a master and these books are so good, why have they been smuggled into the bookstore, largely unnoticed except in the professional reviewing periodicals? To understand the answer to this question, one must look at how genre is viewed in America and at the tyranny of contemporary realism in literary fiction.
Until the triumph of capitalism in the nineteenth century, the source of literature was thought to be the imagination, and the realistic novel was considered an inferior form, earthbound, compared to poetry, drama and the epic. In Shakespeare, Spenser and Milton, and even in the later, more contested work of the Brontës, Hawthorne and Melville, psychological realism exists in happy symbiosis with ghosts, fairies, demons and supernatural whales. With the triumph of capital and its handmaidens, science and rationalism, came a changed aesthetic. By the mid-twentieth century, the realistic novel of contemporary life had become so much the norm for serious fiction, at least in the United States, that anything else was trivialized or confined to a genre ghetto. We are, after all, a country run by hardheaded men who know the value of a dollar and who want no truck with moonshine. Many boast that they never read fiction. In such a culture, "magic realism" was acceptable only because it was imported; exceptions are always allowed for foreign luxury goods.
So strong was the idea that serious fiction must be a realistic picture of the present time that in the 1960s, when American novels began to combine some aspects of contemporary realism with monsters, ghosts, bodily organs run amok and other wild fancies (Ellison, Heller, Pynchon, Roth, Morrison), the writers were still considered realists or else given special dispensation as African-Americans, who, like foreigners, could be allowed their own cultural traditions because they were too marginal to threaten the mainstream aesthetic and politics. Living writers whose work was not grounded in a realistic, contemporary premise were relegated to the nursery or confined to special ghettos in the bookstore (historical fiction, science fiction, romance, fantasy), as though disqualified by genre from being shelved with "literature."
But surely this does not apply anymore; isn't this the Age of Harry Potter, when fantasy is king? Not exactly. It depends what sort of fantasy, and why. How different are the Harry Potter books really, in style and substance, from contemporary realism? Are they not parodies of same, combining realistic conventions with magical appliances and the war between good and evil? Is this parodic incongruity not, in fact, the reason they are so much fun? From the pinstriped cloak worn by the Minister of Magic to the disgusting variety of Bertie Bott's Every Flavour Beans, the culture of the Harry Potter books is a faithful reflection of English schoolboy culture, including the cliques and teasing of the boarding school books that have molded generations.
And have they been treated seriously, as literature, or as a marketing phenomenon?
I would guess 90 percent of the articles I have read about J.K. Rowling deal with her not as a writer but as the commercial equivalent of a comet whizzing into the atmosphere from out of nowhere, a poor single mum writing her first book in a Scottish cafe. It's a great story, but you can only be a nine days' wonder once. After the novelty wears off, the commercial pressure remains; you are expected to do the same thing again and again and again, varying it no more than one flavor of yogurt varies from another. Every successful writer is faced with this choice: Do you stay faithful to the inner voice or turn yourself into a marketable commodity, producing a new product of the same kind every year or two? There are great social and economic rewards for the commodity production of the self.
Ursula Le Guin is doing something different. She has gone her own way, written forty books, not one of them either predictable or commercially motivated. She probably drives the industry crazy; it doesn't even know whether to classify the Earthsea books as children's literature or adult. In her foreword to Tales From Earthsea, she has some interesting things to say about commodification and why we read fantasy:
All times are changing times, but ours is one of massive, rapid moral and mental transformation.... It's unsettling. For all our delight in the impermanent, the entrancing flicker of electronics, we also long for the unalterable.... So people turn to the realms of fantasy for stability, ancient truths, immutable simplicities.
And the mills of capitalism provide them. Supply meets demand. Fantasy becomes a commodity, an industry.
Commodified fantasy takes no risks; it invents nothing, but imitates and trivializes. It proceeds by depriving the old stories of their intellectual and ethical complexity, turning their action to violence, their actors to dolls, and their truth-telling to sentimental platitude. Heroes brandish their swords, lasers, wands, as mechanically as combine harvesters, reaping profits. Profoundly disturbing moral choices are sanitized, made cute, made safe. The passionately conceived ideas of the great story-tellers are copied...advertised, sold, broken, junked, replaceable, interchangeable.
Le Guin's writing is on the edge, which is perhaps the same as the margins: idiosyncratic and hard to pin down. She is the kind of writer businessmen hate most, producing challenging, unpredictable books whose meanings are too elusive to be easily controlled. I can almost hear them saying, "No Earthsea books since 1990 and now two books in the same year? Hasn't she heard of regular marketing intervals?"
Unlike Le Guin's science fiction, her fantasies are not overtly political. The two genres have become almost interchangeable at the mass market level, but have different parents: science fiction derives from Victorian scientific speculation by writers like Conan Doyle and H.G. Wells, while fantasy grew out of myth. Le Guin's science fiction is about social and political life; some reads like ethnographies of imaginary societies, some deals with revolution. Because of its social themes, it appears more political than her fantasies, which deal with the inner life.
Nonetheless, the Earthsea books are profoundly radical because they lead one to think and feel outside of regular realistic patterns and the details of everyday life, laying depth charges that bring up long-forgotten reveries of childhood, unrecognized forms of heroism, secret challenges to power. Softly, elusively, they tear away at the wall of stones that keeps us in the dry land, the arid land of adulthood, the land of death-in-life, where so many of us spend so much of our time; they let the wind into our imaginations, and help to set us free.
Nike-Zeus, Nike-X, Sentinel, Safeguard, Star Wars, X-ray lasers, space-based neutron particle beams, Brilliant Pebbles, Ground-Based Midcourse National Missile Defense, Midcourse Defense Segment of Missile Defense. Over the past fifty years America has poured approximately $100 billion into these various programs or efforts to shield the country against long-range ballistic missiles. Yet not one has worked. Not one. Nevertheless, except for the constraints imposed by his own "voodoo economics," President George W. Bush appears poised to pursue the development and deployment of a layered missile defense--as a hedge against more failures--that would force taxpayers to cough up as much as another $100 billion. In December Bush formally notified Russia that the United States was withdrawing from the 1972 Anti-Ballistic Missile treaty in order to "develop ways to protect our people from future terrorist or rogue state missile attacks."
Russian President Vladimir Putin labeled Bush's decision a "mistake," a mild reaction that should not disguise the fact that much of Russia's political elite is seething at the withdrawal. Already smarting from America's broken promise not to expand NATO and the US-led NATO bombing of Yugoslavia in 1999 (which violated the 1997 "Founding Act" between Russia and NATO), that elite views the coincidence of America's success in Afghanistan (obviating the need for further Russian assistance) and its withdrawal from the ABM treaty as yet further evidence of American duplicity.
President Clinton diplomatically explained the Republicans' obsession with missile defense when he observed: "One of the problems they've got is, for so many of their supporters, this is a matter of theology, not evidence. Because President Reagan was once for it, they think it must be right, and they've got to do it, and I think it makes it harder for them to see some of the downsides." That's a nice way of saying that the conservative wing of the Republican Party abounds with missile-defense wackos. I've participated personally in two missile-defense conferences and was astounded by their right-wing, faith-based atmospherics.
Which is why Bradley Graham's engaging narrative of politics and technology during the Clinton years, Hit to Kill: The New Battle Over Shielding America From Missile Attack, seems destined for popular success, notwithstanding its serious conceptual limitations. Graham ably recounts the excessive exuberance of Republicans as they schemed to realize their missile-defense dreams. But he is equally critical of the Clinton Administration's attempt to actually build a missile defense: its "three-plus-three" ground-based midcourse program.
Offered in the spring of 1996, in part to undercut the Republicans, "three-plus-three" provided for three (or four) years of development, after which, if then technologically feasible and warranted by a threat, there would be deployment within another three years. In early 1998, however, a sixteen-member panel, led by retired Air Force chief of staff Larry Welch, condemned the plan as a "rush to failure."
But two overdramatized events later that year demanded even greater urgency. In July, the Commission to Assess the Ballistic Missile Threat to the United States, led by Donald Rumsfeld, asserted that America's intelligence agencies had woefully underestimated the capability of "rogue" regimes, such as those leading North Korea and Iran, to attack US territory with ballistic missiles within five years. It concluded: "The threat to the United States posed by these emerging capabilities is broader, more mature, and evolving more rapidly than has been reported in estimates and reports by the intelligence community."
When North Korea subsequently launched a three-stage Taepodong 1 missile past Japan in August 1998, many Americans put aside not only their qualms about the role Representatives Curt Weldon and Newt Gingrich had played in creating the commission, but also their suspicions about the blatantly pro-missile defense bias of most of its members. Although Graham generally portrays the commission's deliberations as unbiased, he does provide evidence that some of its briefers were not.
For example, one intelligence official betrayed visible irritation during his briefing of commission members, prompting General Welch to ask, "You're not happy to be here, are you?" The official replied, "No, I'm not. I'm ticked off that I have to come down and brief a bunch of wacko missile-defense advocates." His outburst infuriated Rumsfeld, who "stalked" out of the room.
Nevertheless, Rumsfeld's report and the launch of North Korea's missile frightened Americans and galvanized Republicans. Graham's investigative reporting gets inside the subsequent political war waged against a Clinton Administration that, itself, was slowly awakening to the possibility of a more imminent ballistic missile threat.
Graham brings an open mind to the hotly disputed technological merits of missile defense. Nevertheless, he cannot avoid the conclusion that George W. Bush's decision to expand missile defense beyond Clinton's ground-based midcourse program constitutes an acknowledgment that, after fifty years, "military contractors had yet to figure out how best to mount a national missile defense."
In theory, a ballistic missile can be intercepted during its comparatively slow, if brief, "boost phase," before its "payload"--warheads, decoys and debris--is released. Speed is of the essence during the boost phase. So is proximity to the target. According to Philip Coyle, former director of the Pentagon's Office of Operational Test and Evaluation, "The process of detection and classification of enemy missiles must begin within seconds, and intercept must occur within only a few minutes. In some scenarios, the reaction time to intercept can be less than 120 seconds."
Compounding concerns about boost-phase intercepts are questions about the ability of an interceptor to distinguish quickly between a missile's flame and the missile itself. Finally, boost-phase missile-defense platforms would invite pre-emptive attacks against those platforms by any state bold (and foolish) enough to launch ballistic missiles.
The "terminal phase" of ballistic missile flight is the final minute or two when the payload re-enters the atmosphere. Detection of the warhead is comparatively simple, but designing a missile fast enough to catch it and hit it--given the problems associated with sensor degradation in intense heat--is extremely difficult. Countermeasures, such as maneuvering capability or precursor explosions, would further complicate defensive efforts. Finally, a terminal-phase missile defense can, by definition, protect only a limited area, perhaps one city. Thus, many such systems would be required.
The "midcourse phase" of ballistic missile flight is the period during which the payload is dispersed in space. It remains there for more than 80 percent of the missile's total flight time. The Clinton Administration's ground-based midcourse program (continued by the Bush Administration) is designed to strike the warhead in space with a high-speed, maneuverable kill vehicle--thus Graham's title: Hit to Kill.
Easily the most developed of all the programs, the ground-based midcourse system demonstrated, as recently as December 3, 2001, the awesome technological feat of destroying a warhead hurtling through space--hitting a bullet with a bullet. Yet such a feat constitutes but the commencement of an arduous technological journey, not its endpoint.
As a "Working Paper" issued recently under the auspices of the Union of Concerned Scientists noted, America's ground-based midcourse program has not been subjected to real-world tests. Five hit-to-kill tests have resulted in three hits. But each test: (1) used identical test geometries (the location of launches, trajectories of target and interceptor missiles); (2) released the same objects (payload bus, warhead and decoy); (3) occurred at the same time of day; (4) made the lone decoy obviously and consistently different from the warhead; (5) told the defense system what to look for in advance; (6) attempted intercept at an unrealistically low closing speed; (7) kept the target cluster sufficiently compact to aid the kill vehicle's field of view; and (8) provided the kill vehicle with unduly accurate artificial tracking data.
Any ground-based midcourse missile defense system has to contend with virtually insurmountable countermeasures, especially the decoys that, in space, are quite indistinguishable from the warheads. Yet the three successful hits did not have to contend with even the countermeasures that a missile from a "rogue" regime would probably employ.
A National Intelligence Estimate in 1999 determined that "countermeasures would be available to emerging missile states." In April 2000 a "Countermeasures" study group from the Union of Concerned Scientists and the MIT Security Studies Program concluded: "Even the full [National Missile Defense] system would not be effective against an attacker using countermeasures, and an attacker could deploy such countermeasures before even the first phase of the NMD system was operational." Consequently, "it makes no sense to begin deployment."
Craig Eisendrath, Melvin Goodman and Gerald Marsh (Eisendrath and Goodman are senior fellows with the Center for International Policy in Washington; Marsh is a physicist at Argonne National Laboratory) state the problem even more starkly in their recent book The Phantom Defense: America's Pursuit of the Star Wars Illusion: "This is the bottom line: the problem isn't technology, it's physics. Decoys and warheads can always be made to emit almost identical signals in the visible, infrared, and radar bands; their signatures can be made virtually the same."
If such information troubles Defense Department officials responsible for missile defense, they seldom admit it publicly. However, they're not nearly as irresponsible as the political and "scholarly" cheerleaders who remain unmoved by a half-century of failure and the physics of countermeasures. I encountered one of them last June at a missile defense conference in King of Prussia, Pennsylvania.
Representative Weldon delivered the conference's keynote address to more than 220 participants from the Defense Department, the military industry, think tanks, various universities and the press. Weldon is the author of HR 4, legislation that made it "the policy of the United States to deploy a national missile defense." (Senator Carl Levin was able to add amendments to the Senate bill on missile defense that made the program dependent upon the annual budget process and tied it to retention of the ABM treaty; Weldon referred to the amendments as cowardice. Nevertheless, they remained in the Missile Defense Act that President Clinton signed on July 22, 1999.)
Weldon told the audience that the United States requires a missile-defense system to protect its citizens from an intentional missile attack by a "rogue" regime presumably undeterred by the prospect of an overwhelming American nuclear retaliation. He even displayed an accelerometer and a gyroscope, Russian missile components allegedly bound for a "rogue." He then displayed an enlarged, poster-size photograph of Russia's SS-25 ICBM. Russia possesses more than 400 such missiles, he asserted, and any one of them might be launched accidentally against the United States, given Russia's deteriorating command and control capabilities.
It was a "no-brainer." Both threats demanded that America build a national missile defense system, capable of intercepting such missiles, as soon as possible.
However, when I asked Congressman Weldon to shift from the SS-25 and contemplate whether his modest missile-defense system could prevent the penetration of an accidentally launched TOPOL-M ICBM from Russia, he responded, "I don't know. That's a question you should ask General Kadish during tomorrow's session." Extending the reasoning, I asked Weldon whether his modest missile-defense system could shield America against a missile, launched by a rogue regime, that was capable of TOPOL-M countermeasures. Weldon again answered that he did not know. But rather than let such doubts linger at a conference designed to celebrate missile defense, Kurt Strauss, director of naval and missile defense systems at Raytheon, rose to deny that Russia possessed such countermeasures.
Presumably, Strauss was unaware of the work of Nikolai Sokov, a former Soviet arms control adviser and author of Russian Strategic Modernization: Past and Future. Sokov claims that the TOPOL-M features a booster intended to reduce the duration and altitude of the boost phase, numerous decoys and penetration aids, a hardened warhead and a "side anti-missile maneuver."
Strauss's uninformed denial hints at a much bigger problem, however: the prevalence of advertising over objectivity in a society where the commercialization of war and the cult of technology have reached historic proportions. In The Pursuit of Power historian William McNeill traces the commercialization of war back to mercenary armies in fourteenth-century Italy, pointing out the "remarkable merger of market and military behavior." And Victor Davis Hanson, in Carnage and Culture, sees much the same reason behind the decimation of the Turkish fleet, some two centuries later, by the Christian fleet at Lepanto--"there was nothing in Asia like the European marketplace of ideas devoted to the pursuit of ever more deadly weapons." McNeill concludes that "the arms race that continues to strain world balances...descends directly from the intense interaction in matters military that European states and private entrepreneurs inaugurated during the fourteenth century."
Post-cold war America, virtually alone, luxuriates in this dubious tradition. Yet it was no less than Dwight Eisenhower who warned America in his farewell address: "This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence--economic, political, even spiritual--is felt in every city, every Statehouse, every office of the federal government."
Who could have been surprised, then, when Matthew Evangelista conclusively demonstrated, in Innovation and the Arms Race (1988), that commercial opportunities within America's military-industrial complex, much more than any Soviet threat, propelled innovation--and, thus, most of the arms race with the Soviet Union? A year later, the highly respected defense analyst Jacques Gansler identified the uniquely American "technological imperative" of commercialized warfare: "Because we can have it, we must have it." Such impulses caused the United States to run profligate arms races with itself both during and after the cold war. They also explain America's post-cold war adherence to cold war levels of military expenditures and, in part, our missile-defense obsession today.
This technological imperative had its origins in America's "exceptional" historical experience, which it continues to serve. Indeed, so the argument goes, Why should a country on a mission from God sully itself with arms control agreements and other compromises with lesser nations, when its technological prowess will provide its people with the invulnerability necessary for the unimpeded, unilateral fulfillment of their historic destiny?
Such technological utopianism, however, has its costs. In their book The Dynamics of Military Revolution, 1300-2050, MacGregor Knox and Williamson Murray demonstrate the very secondary role that technology has played in past military revolutions. They conclude: "The past thus suggests that pure technological developments without the direction provided by a clear strategic context can easily lead in dangerous directions: either toward ignoring potential enemy responses, or--even more dangerously--into the dead end, graphically illustrated by the floundering of U.S. forces in Vietnam, of a technological sophistication irrelevant to the war actually being fought." (In Hit to Kill, Graham has little to say about military strategy or the commercialization of warfare.)
In hawking a missile defense shield, Representative Weldon traveled in the first dangerous direction when he assured the defense conferees that although Congress was not ignoring the threat posed by terrorists with truck bombs, "when Saddam Hussein chose to destroy American lives, he did not pick a truck bomb. He did not pick a chemical agent. He picked a SCUD missile.... The weapon of choice is the missile."
Unfortunately, on September 11, America learned that it is not.
Potentially worse, however, is the Reaganesque theology propelling the Bush Administration's decision to withdraw from the 1972 Anti-Ballistic Missile treaty. Putting aside the question of whether withdrawal requires formal Congressional approval and other questions of international relations, one must ask why any administration would destroy the cornerstone of strategic stability. The ban on national missile defenses not only prevents a defensive arms race but also obviates the need to build more offensive missiles to overload the enemy's defenses. Why would a country withdraw from the ABM treaty without knowing whether its own missile-defense system will even work, and before conducting all the tests permitted by the treaty that would provide greater confidence in the system's ultimate success?
Readers of Keith Payne's recent book The Fallacies of Cold War Deterrence and a New Direction might guess the probable answer. Payne, chosen by the Bush Administration to help shape the Defense Department's recently completed but still classified Nuclear Posture Review, writes about a new, post-cold war "effective deterrence," to which even an imperfect missile-defense system might contribute: "In the Cold War, the West held out the threat of nuclear escalation if the Soviet Union projected force into NATO Europe; in the post-Cold War period it will be regional aggressors threatening Washington with nuclear escalation in the event the United States needs to project force into their regional neighborhoods.... In short, Washington will want effective deterrence in regional crises where the challenger is able to threaten WMD [weapons of mass destruction] escalation and it is more willing to accept risk and cost."
The real concern, then, is less about protecting America from sneak attacks by rogue states ruled by madmen, and more about preserving our unilateral options to intervene throughout much of the world. Thus, President Bush's speech at The Citadel in December was disingenuous. His rhetorical question asking what if the terrorists had been able to strike with a ballistic missile was primarily an attempt to steamroller frightened Americans into supporting missile defense. The speech simply seized upon the wartime danger to compel a military transformation that has been debated for almost a decade and resisted by the services and the military industry since the beginning of Defense Secretary Rumsfeld's tenure.
Lest we forget, China hasn't disappeared either. Its muted criticism of America's withdrawal from the ABM treaty was accompanied by a call for talks to achieve "a solution that safeguards the global strategic balance and doesn't harm international efforts at arms control and disarmament." Failing such talks, China may feel compelled to increase its offensive arsenal to insure penetration of an American missile defense, which could provoke India, and consequently Pakistan--perhaps rekindling tensions that have already brought them to the brink of war.
Russia, for its part, believes it has little to fear from America's current missile-defense programs but is awaiting the inevitable: the moment when the technological utopians push America to expand its modest system into a full-blown shield. How will Russia respond then?
To court such reactions by withdrawing from the ABM treaty before even testing against decoys is pure strategic illiteracy--which only a Reaganesque theology (founded on exceptionalism, commercialized militarism, technological utopianism and righteous unilateralism) shrouded by the "fog of war" might explain.
In the United States the writer tends to become an entrepreneur, competing with other literary vendors marketing their characters and language, their humor or drama, to a skeptical and distracted public. In Israel, it seems, they order things differently. For a nation perpetually in crisis, with an ancient prophetic tradition behind it, the serious writer remains something of a sage, a wisdom figure who speaks with authority. Amos Oz has been such a presence on the Israeli scene for close to four decades, publishing not only novels and stories but political journalism, literary essays and Op-Ed columns, never wholly disengaging his state of mind from the state of the nation. Yet his public pronouncements, always as beautifully crafted as his fiction, have never laid to rest the inner demons that power his creative work. This is especially evident in his newest novel, The Same Sea. Despite its deceptively light tone, it reads like one of the most personal books he has yet written.
The Same Sea is at once spare and lushly experimental, an unusual mixture of hard, precise prose that drives the story forward and often lyrical, evocative verse that bathes us in the mental glow of each of the characters. The musical qualities of this verse, strong in Hebrew, are largely lost in translation, but its strategic line-breaks and numerous biblical echoes, especially from the Song of Songs, save it from becoming altogether prosaic. The story is so simple that the author can sum it up in his opening lines. It centers on a triangle familiar from some of Oz's earlier books--the mild, practical father; the languid, troubled mother, who has recently died; and their only son, who has fled home in the wake of her death and, in this case, gone off mountaineering in Tibet. It would not seem possible for a writer to build his novel around three characters whom we never see in one another's company: the widowed father, trying half-heartedly to resume his life; the deceased mother, not yet fully accepting her death; and the distant son, surrounded by his mother's palpable presence, sleeping with women who bring her back to him, trying aimlessly to outrun his grief.
Yet this is a book in which the dead are never wholly dead, where memory and meditation are more vibrant than action, while time and distance are seen less as objective facts than as constantly varying states of mind. It's also a book in which the fictional narrator, who resembles the author in every biographical detail, repeatedly emerges from behind the proscenium to sort out his own memories, which are precisely the ones that fed into the story. Just as the characters swarm about him, they inhabit one another's minds as well, communicating across continents with some of the mobility and omniscience an author usually reserves for himself.
In short, this is a book about someone writing a novel, showing us how it lives within him while it is also spilling out onto the page. Yet somehow, even at this remove from direct storytelling, the characters resonate. Amos Oz has written other versions of this father, this mother, this boy, in Hill of Evil Counsel, for example, but never has mingled them so clearly with his own past, which instead of fading has grown more insistent with time. Confronting mortality himself, he feels more impelled to take stock of his own dead. The loss of his parents, especially his mother's suicide when he was 13, still obsesses him as he approaches 60. The narrator even has one of his characters, the son's carefree 26-year-old girlfriend, try to talk him out of his brooding mood. "Your mother killed herself/and left you quite shattered.... But for how long? Your whole life?/The way I see it being in mourning for your mother for forty-five years is/pretty ridiculous." The narrator sees it differently. How can he bail out? "How can you jump from a plane/that's already crashed and rusted or sunk under the waves?" For him the dead continue to haunt the living. Yet what she says has the authentic ring of the younger generation, and the author, with the warm generosity of Chekhov, respects its callow wisdom and healthy insensitivity, which part of him would love to emulate.
The Same Sea is magnanimous toward characters who could just as well be brutally satirized or dismissed--the coarse yuppie always on the lookout for a good deal, the ill-favored film producer, hopelessly unlucky with women, who becomes fixated on a character in a script, the girl who casually sleeps with nearly all the male characters, including (almost) her boyfriend's widowed father. An underlying tenderness softens their hard edges. As in a Renoir film or Chekhov story, they somehow surprise the reader into sympathy and a wistful tolerance. Unexpectedly, too, they begin to nurture one another.
One feature of this enchanting book that I have already mentioned stands out most strikingly. As the story unfolds, the author keeps intervening in it, at first pushing his pad aside and wondering "how on earth/he came to write such a story," but gradually interacting with his characters, commenting on the film script that the girlfriend is trying to sell, offering little scenes from his writing life and recollecting his own parents and childhood. At first it seems he is playing a postmodern game, violating the boundaries of the novel by wantonly mixing poetry and prose, fact and fiction, puncturing our suspension of disbelief. Worse still, we wonder whether the writer is simply losing interest in his own story, taking it over. But it soon becomes clear that, on the contrary, the story is so real to him that the people in it have invaded his life, and not only when he sits composing at his desk.
As he works in the garden, all the people in his head, real and imagined--where to draw the line?--the dead and the living, his children, his grandchildren, the characters from the novel, all his own selves, seem right there with him, tossed up from the same sea, pitching in despite their different views of how the gardener's work should be done. This is a fanciful conceit, often used in the Renaissance for poetic creation, yet something about it rings ingeniously true. This is no symbolic landscape of ideas and images but a scene showing us the writer himself, away from his work but with his mind still abuzz. In this flux, paradoxically, he feels a contentment that allows him to set his demons aside, the dead who will not stay dead, the characters who insist on a life of their own, the fears for the future that poison the present: "Grief fear and shame are as far from me today as one dream is/from another," he says. "Whatever I have lost I forget, whatever has hurt me has faded,/whatever I have given up on I have given up on, whatever I am left with/will do." For the time being, at least, he can dwell in the moment. "Later I'll go back to my desk," he concludes, "and maybe I'll manage to bring back/the young man who went off to the mountains to seek the sea/that was there all the time right outside his own home."
Why in 1973 did Chile's democracy, long considered the crown jewel of Latin America, turn into Augusto Pinochet's murderous regime? Why did the United States, which helped Pinochet seize power from Salvador Allende, support the violent dictator for nearly two decades? Scholars answering these questions have usually focused on the threat posed by Allende, the first elected Marxist head of state, to Chilean and US business interests and to the cold war foreign policy of the United States. But recently declassified documents, along with the reissue of Patricia Politzer's Fear in Chile: Lives Under Pinochet, suggest that the Chilean counterrevolution, however much shaped by immediate economic and political causes, was infused with a much older, more revanchist political spirit, one stretching as far back as the French Revolution.
Edward Korry, who served as US ambassador to Chile between 1967 and 1971, greeted Allende's election in 1970 as if the sans-culottes were at the gate. Before all the votes were in, he smelled the "stink of defeat" and could hear "the mounting roar of Allendistas acclaiming their victory" arising "from the street below." Although no guillotine blade had yet dropped, material declassified by the United States over the past couple of years shows that Korry fired cable after cable back to Washington, warning of "the terror" to come and citing Baudelaire to brand Allende a "devil."
It may seem bizarre that an LBJ-appointed Democrat would pepper his diplomatic missives with the overheated prose of French romanticism. After all, critics have charged cold war liberals, such as Robert McNamara and McGeorge Bundy, with employing a dry calculus in deciding the number of casualties needed to defeat Communism. But Korry was no bloodless bureaucrat. In fact, in both tone and content, his writings were remarkably similar to those of the illiberal Joseph de Maistre, the arch-Catholic reactionary who launched violent, intoxicated attacks on the French Revolution. By injecting medieval Catholic orgiastic mysticism with the revolutionary zealotry of his contemporaries, Maistre offered a compelling alternative to earthly promises of secular justice and political participation. He was the first who understood that if a counterrevolution was to be won, it would be necessary to win the "hearts and minds" of what would come to be known as the masses.
As fervidly as Maistre hated la secte of Jacobins and eighteenth-century rationalists, Korry disdained Allende and his Popular Unity followers, and largely for the same reason: Where Maistre rejected the idea that people could be governed by enlightened principles, Korry dismissed as "dogmatic and eschatological" those who believed that "society can be structured to create paradise on earth." And both men reserved their strongest scorn for the pillars of the old regime--church, army and state--because, whether through ineptitude or corruption, they had failed to see and to confront the evil before them. Lost in a "myopia of arrogant stupidity," the elites and officials who had allowed Allende to come to power were a "troupe of fools and knaves" leading Chile to the "marxist slaughter-house." It is as if Korry saw the revolution as divine retribution against a decaying polity. "They should be given neither sympathy nor salvation," he said of the weak-willed ruling party.
Echoing Maistre's observation that republican rule is ill suited to protect society against revolutionary fanaticism, Korry complains in his cables about a gracious political culture that places no brake on Allende's determination: "Civility is the dominant characteristic of Chilean life. Civility is what controls aggressiveness, and civility is what makes almost certain the triumph of the very uncivil Allende." Neither the military nor the outgoing president, Eduardo Frei, "have the stomach for the violence they fear would be the consequence of intervention," Korry wrote to Washington. The Communist Party, in contrast, Korry warned, was "that most clear-minded and cohesive force in Chile.... Allende is their masterwork in Latin America and they do not lack for purpose or will."
Korry worked to strengthen domestic opposition to Allende's Popular Unity coalition, yet he also opposed Henry Kissinger's plot to provoke a military coup (which led to the murder of Chilean Gen. René Schneider). Instead, he advocated patience, confident that, with encouragement, internal dissent would eventually oust Allende. Again, remarkably akin to Maistre, Korry felt that restoration had to come from within rather than be imposed from without. He had faith that time favored his position; that the revolutionaries, in their effort to build a society that ran against human nature, would soon exhaust themselves; that rumor and chaos, unavoidable spawns of popular rule, would fuel an irresistible counterwave that would sweep them from power.
In fact, CIA destabilization strategies, both in Chile and in other Latin American nations, seem to draw directly from Maistre's restoration scenario, which relied on counterrevolutionary determination to generate dissension. Rumor acts as the cat's-paw for fear, poisoning commitment, corroding solidarity and forcing an acceptance of inevitable reaction. In Chile the CIA, in a cable dated September 17, 1970, set out a plan to
create the conviction that Allende must be stopped.... discredit parliamentary solution as unworkable...surface ineluctable conclusion that military coup is the only answer. This is to be carried forward until it takes place. However, we must hold firmly to the outlines or our production will be diffuse, denatured, and ineffective, not leaving the indelible residue in the mind that an accumulation of arsenic does. The key is psych war within Chile. We cannot endeavor to ignite the world if Chile itself is a placid lake. The fuel for the fire must come within Chile. Therefore, the station should employ every stratagem, every ploy, however bizarre, to create this internal resistance.
After the end of World War II, when demands for social democratic reform swept the continent, a series of coups and political betrayals successively radicalized and polarized social movements. The Old Left gave way to the New, and calls for reform climaxed into cries for revolution. By the late 1960s, Latin American military elites and their US allies knew, as Maistre knew two centuries earlier, that a simple changing of the guard would no longer be enough to contain this rising tide: "We are talking about mass public feeling as opposed to the private feeling of the elite," wrote the CIA about the intended audience of its "psych war" in Chile. The Latin American military regimes that came into power starting in the late 1960s combined terror and anti-Communist Catholic nationalism to silence this revolutionary roar. As Gen. Oscar Bonilla, who helped Pinochet install his seventeen-year dictatorship, put it, "What this country needs is political silence. We'll return to the barracks when we have changed the mentality of the people."
Patricia Politzer's Fear in Chile: Lives Under Pinochet recounts, through fifteen first-person testimonies gathered in the mid-1980s, while Pinochet was still in power, how his dictatorship did just that. By 1973, the United States had succeeded in its stated goal of extinguishing Chilean civility and igniting political passions. It seemed to many that their country had become ungovernable. Chronic shortages of basic goods, violent conflicts, political impasses and swirling rumors of coups and invasions wore Chileans down.
Nearly all of Fear in Chile's witnesses begin their accounts with the coup, and they all convey the exhaustion and confusion of the moment. Andrés Chadwick Piñera recounts his lonely sadness at hearing of Allende's death while his middle-class family, wife and neighbors celebrated. Sympathetic to the revolution, he burned his books and eventually made peace with the regime. Even the most committed became disoriented. Raquel, a student member of the Communist Party, recalls the uncertainty of revolutionary leadership, which told members to first do one thing, then another. Blanca Ibarra Abarca, a shantytown community leader, became "furious" after listening to Allende's radio message broadcasting news of the coup. She wanted "to do something, to fight," but was paralyzed by "pain and impotence." Manuel Bustos Huerta, president of his union, called a meeting but "no one knew anything...some people said we should go home, and others said we should take over the factory. Finally, after much discussion, we decided that people should go home." (Maistre wrote, nearly 200 years earlier, of how confusion would replace revolutionary resolve with resignation: "Everywhere prudence inhibits audacity.... On the one side there are terrible risks, on the other certain amnesty and probable favors. In addition, where are the means to resist? And where are the leaders to be trusted? There is no danger in repose.")
At times the polarization described by Politzer's witnesses seems absolute. While many wept upon hearing news of Allende's death, others bonded in anti-Communist solidarity: "Everyone from the block got together in a neighbor's house to celebrate.... Everyone brought something and it was a very joyous occasion."
But it is where the testimonies intersect, often at unexpected junctures, that Fear in Chile reveals just how deep and popular both the revolution and counterrevolution were. Blanca Ester Valderas and Elena Tesser de Villaseca recount radically different experiences and backgrounds. Valderas is a poorly educated rural woman whose husband was murdered in Pinochet's coup. Under Allende, after growing weary of following her husband through a series of dead-end jobs, Valderas joined the Socialist Party and was appointed mayor of her town. Even after the coup, when she was forced to change her name and go into hiding, she continued in politics, working with Chile's nascent human rights organizations. Tesser de Villaseca is a well-to-do "Pinochet diehard" who untiringly organized women to bring Allende down, even though she denies that either she or her husband is "political." Nor did she return home after Pinochet took power; instead Tesser de Villaseca and her friends threw themselves into myriad social welfare organizations aimed at making Chileans "a sound race again, to make the country healthy." Despite the different historical consequences of their actions, both women used politics as an avenue of upward mobility, to escape the restraints of family and to influence civic life.
In Costa-Gavras's movie Missing, which, while not mentioning Chile specifically, depicts Pinochet's coup, the first repressive act shown is of soldiers pulling a woman off a bus queue and cutting off her slacks, warning her that in the new nation, women do not wear pants. Many of the voices in Fear in Chile recall similar acts of violence: men who had their long hair shorn; women who were ordered to wear skirts; a worker who was arrested and tortured for being "an asshole" and not acting sufficiently submissive to authority. Notwithstanding Allende's supposed alignment with the Soviet Union and his threat to economic interests, acts like these illustrate that the real danger of the Chilean left was not that it undermined secular liberal democracy but that it promised to fulfill it, to sweep away the privilege and deference of patriarchy and class. "It was as if we had suddenly returned to a past era," recalls the wife of an Allende functionary in recounting her dealings with male military officers who, prior to the coup, she'd treated as friends and equals.
For many, Pinochet realigned a world that had spun out of control, and the power of Politzer's book is that it takes seriously the concerns of his supporters. Pinochet remained popular because he satiated the desire of many Chileans for both order and freedom. He haunts the pages of Fear in Chile like Maistre's powerful but distant sovereign, who "restrains without enslaving." As one of Pinochet's supporters put it, "I believe in a democracy in which certain general objectives are submitted to a vote; after that, each matter should be handed over to experts capable of realizing those objectives. In a family, for instance, where there is a health problem, you don't have a democratic vote about what steps to take."
It is this image of a family that is constantly invoked by followers of the regime to symbolize a just society, a family with Pinochet as the wise and strong father ("I adore Pinochet," says Tesser de Villaseca. "I adore him because he is a superhuman person who is also sensible and worthy") and his wife, Lucía, as the empathetic mother ("an extraordinary woman," says a Pinochet colonel, "who has created a volunteer corps in Chile that should be an example to the world. She's like a diligent little ant who works in different areas and also collaborates well with her husband").
Pinochet's success in generating a degree of popular legitimacy ultimately rested on violence and terror. By the time he left office, in 1990, his regime had arrested 130,000 people, tortured 20,000 others and, if the killing that took place during the coup is included, murdered between 5,000 and 10,000 Chileans. Fear not only led people to burn their books, drop out of politics, go into hiding and exile and switch allegiances, but allowed those who supported the government and dreaded a return to anarchy and conflict to justify murder: "I don't have any special knowledge about DINA [Pinochet's intelligence agency, responsible for a good deal of the terror], but if they were really out to find people working against democracy, people who didn't hesitate to kill to achieve their goals, I think what they were doing was good. I'm not one of those who don't believe that there were disappeared persons," says Carlos Paut Ugarte, an economist who returned to Chile following Allende's overthrow to work in Pinochet's government.
From Edmund Burke to Jeane Kirkpatrick, it has been the lie of modern counterrevolutionary thinkers that, against totalitarian abstractions, they defended historical actuality. The status quo is what should be, they say, and any effort otherwise leads straight to the guillotine or the gulag. But Pinochet's god, father and homeland were no less utopian and intangible than the just nation that Allende and Popular Unity hoped to build--the difference being that Pinochet had guns and the United States.
In his day Maistre was optimistic that restoration could be brought about with little violence. "Would it be argued," he asked, "that the return from sickness to health must be as painful as the passage from health to sickness?" Writing before the great counterinsurgency terrors of the nineteenth and twentieth centuries, he can be excused his sanguinity. But Korry, too, liked to draw on historical analogies to make his case, and he has no such excuse. "There is a graveyard smell to Chile," he wrote immediately after Allende's election, "the fumes of a democracy in decomposition. They stank in my nostrils in Czechoslovakia in 1948 and they are no less sickening today."
It is too bad Korry couldn't escape the prison of his own abstractions and draw a lesson from a more relevant historical referent: Indonesia in 1965, where anti-Communist government agents slaughtered, as the United States watched, hundreds of thousands of its citizens. After all, the analogy was not lost on the CIA, which dubbed Pinochet's coup "Operation Jakarta."
At work recently, I went to get a ham sandwich from the university cafeteria. I discovered, to my vocal dismay, that the well-loved food counter offering homemade fare had been torn out and replaced by a Burger King franchise. Questioned about this innovation, the head of "food services" insisted that
it had been implemented in response to consumer demand. An exhaustive series of polls, surveys and questionnaires had revealed, apparently, that students and faculty were strongly in favor of a more "branded feel" to their dining environment.
It is worth pausing over the term "branded feel." It represents, I think, something profound: The presence of Burger King in the lunchroom is claimed to be a matter of affect. It addresses itself to "feelings," it meets a need that is more emotional than economic. This need has been identified, I was informed, by scientific and therefore inarguable means. The food-services honcho produced statistics that clearly indicated a compelling customer desire for bad, expensive food. According to his methodology, my protests were demonstrably elitist and undemocratic.
It is hardly news that opinion polls are frequently used to bolster the interests of those who commission them. But in recent years the notion that opinion can be measured in quantifiable terms has achieved unprecedented power and influence over public policy. The American penal system, for instance, has been rendered increasingly violent and sadistic as a direct response to opinion polls, which inform politicians that inhumane conditions are what voters desire. The thoughts and emotions of human beings are regarded as mathematically measurable, and the practical effects of this notion are now perceptible in the most mundane transactions of daily life.
This quantified approach to human nature is the result of the importation of theoretical economics into the general culture. Since the marginalist revolution of the late nineteenth century, neoclassical economists have rigidly confined their investigations within the methodological paradigm of positivist science, and they aspire in particular to the model of mathematics. Economists seek to produce empirically verifiable, statistical patterns of human behavior. They regard such studies as objective, unbiased and free of value-laden, superstitious presuppositions. The principle of "consumer sovereignty" hails this mode of procedure as the sociological arm of democracy, and it has made economics the most prestigious of the human sciences.
As David Throsby's Economics and Culture and Don Slater and Fran Tonkiss's Market Society show, the procedures of academic economists are now being further exalted to a position of dominant influence over everyday experience. Homo economicus is fast becoming equated with Homo sapiens. When airlines refer to passengers as "customers" and advise them to be "conservative with your space management," this development may seem trivial or comic. But in their very different ways, these books suggest that beneath such incremental cultural mutations there lurks an iceberg of titanic dimensions.
The Australian academic David Throsby is about as enlightened and humanistic as it is possible for a professional economist to be. He is also an accomplished playwright, and his influence on the political culture of his native land has been extensive and unvaryingly benign. He begins from the accurate supposition that "public policy and economic policy have become almost synonymous," and his intention is to rescue culture from the philistinism of businessmen and politicians who are incapable of lifting their eyes above the bottom line. It is a lamentable sign of the times, however, that he sees no other means of doing so than by translating aesthetic endeavor into quantifiable, economic terms. As he puts it, "If culture in general and the arts in particular are to be seen as important, especially in policy terms in a world where economists are kings, they need to establish economic credentials; what better way to do this than by cultivating the image of art as industry."
In order to cultivate this image, Throsby makes extensive if ambivalent use of the "rational-choice theory" derived from the work of Gary Becker. In Becker's opinion, the kinds of decision-making that economists contrive to abstract from the actions of people conceived as economic agents can be extrapolated to explain their behavior in areas of life that were once, romantically and unscientifically, thought of as lying beyond the arid terrain of rational calculation: love, for example, or aesthetic endeavor. This emboldens Throsby to ask whether we "might envisage creativity as a process of constrained optimisation, where the artist is seen as a rational maximizer of individual utility subject to both internally and externally imposed constraints," and to postulate "a measure...of difference in creativity (or 'talent'), in much the same way as in microeconomic analysis differences between production functions in input-output space measures differences in technology."
There are enough caveats in Throsby's book to indicate a laudable reluctance to engage in this project; however, he evidently feels that the current climate of opinion leaves him no other choice. He is thus driven to apply the economic understanding of "value" to cultural phenomena, and to engage in a "consideration of culture as capital...in the economic sense of a stock of capital assets giving rise over time to a flow of capital services." Much of this book consists of a monomaniacal reinscription of life itself into the technical discourse of neoclassical economics. We are therefore subjected to lengthy discussions of "cultural capital" (formerly known as "culture"), "social capital" (a k a "society"), "physical capital" (née "buildings"), "natural capital" (alias "nature") and of course "human capital" (once referred to as "people"). There is, it seems, no limit to the colonizing potential of economics: "If broader cultural phenomena, such as traditions, language, customs, etc. are thought of as intangible assets in the possession of the group to which they refer, they too can be brought into the same framework."
We are faced here, essentially, with the quantification of all human experience. Not merely economic behavior but every aspect of life and thought can be expressed under the statistical rubric and studied in mathematical form. The notion of the "stakeholder," dear to Tony Blair, whose ambition to create a "stakeholder society" is overt and unapologetic, is fundamental to this project.
A stakeholder stands in relation to the world as a shareholder does to a corporation. He (or she) casts a cold eye on his surroundings and perceives only his "stake" in them; he rationally considers the means by which he may maximize their benefits. The stakeholder, then, is not human. He is rather a quantified abstraction from humanity, a machine designed for the calculation of marginal utility. Good-hearted economists such as Throsby would retort that the stakeholder does not enjoy an empirical existence; he is merely a useful theoretical construct. Would that it were so. But in fact, as Hannah Arendt said of neoclassical economics' cousin, behavioral psychology: "The problem...is not that it is false but that it is becoming true."
There is an interesting convergence between rational-choice theory and the venerable tradition of socialist materialism. Both approaches insist that the real factor motivating human behavior is economic self-interest: that of an individual in the former case, and that of a social class in the latter. The British sociologists Don Slater and Fran Tonkiss address many of the same questions as Throsby in their book Market Society, but they view the conquest of intellectual and social life by economics from a more traditionally leftist perspective. Like Throsby, Slater and Tonkiss acknowledge that "market logic has come to provide a means of thinking about social institutions and individuals more generally," but instead of concluding that students of aesthetics must therefore incorporate economic concepts into their practice, they envisage a movement in the other direction. Today, they claim, "the economist's task of explanation is as much interpretive or hermeneutic as it is mathematical."
Slater and Tonkiss are influenced here by the "rhetorical turn" that economists such as Deirdre McCloskey have recently attempted to introduce into their discipline. The increasingly abstract nature of money, it is claimed, lays bare the fact that financial value, like semiotic meaning, is an imaginary and therefore arbitrary mode of signification. As such, money can be studied using terms and concepts drawn from rhetoric and literary criticism. (An amusing parody of this idea occurs in Will Self's novel My Idea of Fun, which features a "money critic" whose job is to pontificate about the aesthetic qualities of various forms of finance.) Slater and Tonkiss present this as an appealing reversal of intellectual roles: "Whereas the central preoccupation of critical social analysis has traditionally been the way in which economic rationality dominates culture, contemporary social theory has been increasingly concerned with the central role of cultural processes and institutions in organizing and controlling the economic."
Although their emphasis is different, Slater and Tonkiss's argument leads to the same essential conclusion as Throsby's: It no longer makes sense to distinguish between "economics" and "culture," or between "the market" and "society." In practice, it makes little difference whether one regards this as an incursion of aesthetics into economics or vice versa. Indeed, Slater and Tonkiss are a good deal more pessimistic than Throsby about the consequences of this development. To their credit, they are willing and able to introduce into the discussion concepts like "commodification" and "alienation," from which even liberal economists like Throsby recoil in horror. But they stop well short of the bleak dystopianism of Adorno, and their slightly anodyne conclusion is that "markets are not simply good or bad, because they are highly variable." This pluralism is forced upon them, because their book is intended as a historical survey of various theoretical approaches to the market: Market Society provides admirably lucid and meticulously fair readings of Smith, Ricardo, Durkheim, Simmel, Weber and Polanyi. Despite its historical approach, the most beguiling feature of the book is that its treatment of such past thinkers is undertaken with a prominent sense of our present predicament.
Discussing the economist whose theories have had the greatest influence on that predicament, Slater and Tonkiss remind us that "Hayek held that ultimately there were no economic ends as such; economic action always served ends that were non-economic in character because needs and desires are exogenous (or external) to the market setting." But to say that there are no economic ends is the same as to say that there are only economic ends. It is, in other words, to abolish any distinction between the economic and the noneconomic. Toward the end of Economics and Culture, Throsby observes that "in primitive societies...culture and economy are to a considerable degree one and the same thing." By this definition, as each of these important and timely books suggests, our society may be the most primitive of all. Can anyone, today, escape the "branded feel"?
Scattered chunks of films littered the theaters this holiday season. Except for The Royal Tenenbaums, which I've told you about, there wasn't a whole movie to be found. Or, to speak more precisely, no movie except The Royal Tenenbaums gave me the impression of wholeness, by which I mean the pleasure that arises when the mind can play back and forth through a picture, discovering how the details enrich one another.
No doubt I value this pleasure so much because I've been trained, as a critic, to look for it. Surrealists, post-structuralists and the average moviegoer do not. Even so, I believe that when artists aspire to wholeness, they put into their work a kind of sustained intelligence that we might call integrity, care or love. When I claim that this quality is missing from most movies nowadays, I of course say almost nothing. Maybe a slightly higher percentage of today's films are hash, compared to the run of productions in the 1930s; but that's for the cliometricians to decide. The critic's challenge is to find some response to the present year-end Oscar contenders, when there's no object of criticism among them.
Should I solve the problem by jumping outside the film world? Then, from a safe distance, I could belabor the politics of Black Hawk Down for being simple-minded, and the politics of Iris for being absent. Many useful comments could be made on these subjects. They just wouldn't be useful to someone who already reads The Nation.
So I suppose I'll have to do what moviegoers have always done: ignore the pictures and watch the stars. I won't talk about The Majestic and Ali, Monster's Ball and A Beautiful Mind. The subjects of this column will be Jim Carrey, Will Smith, Halle Berry and Jennifer Connelly. Let me begin with Connelly, who in A Beautiful Mind has finally achieved recognition as an actress, and in so doing has given the film a large part of its merit.
As you may know, A Beautiful Mind offers a loose approximation of the story of John Nash, a highly gifted mathematician who has struggled all his life against delusions and compulsions. The film, too, suffers from some mental confusion--screenwriter Akiva Goldsman and director Ron Howard somehow got Nash's biography mixed up with Jack and the Beanstalk--but once you get past that problem, you may appreciate the cleverness of this quasi-fairy tale. To begin with, the filmmakers have invented some briskly effective ways to suggest that Nash has a miraculous talent for pattern recognition, and that such a talent can be dangerous. Even when there's no order to be found, his mind keeps searching for one; and since the cold war provides great material for paranoia--the film begins in the late 1940s--Nash has a world of troubling data to sort. In a risk that's bold by Hollywood standards, the film presents its hero's blossoming delusions as if they were real--that is, as he would experience them. You're well into the story before you can sift the facts from the hallucinations, a process that's made compelling by Russell Crowe's performance in the lead. Awkward, shuffling, aggressive, witty, exasperating and vulnerable, he's altogether credible as someone who thinks in abstractions for a living.
But back to Connelly. She plays Alicia Larde, the woman who courts, marries and helps to rescue Nash. The filmmakers turn A Beautiful Mind into her story, almost as much as it is her husband's, and that's as it should be. Alicia is the one who gets scared witless, calls in the shrinks, strives to keep the household together and howls in the bathroom at 2 am. Connelly deserves full credit for carrying off the role.
It's a credit that's long been denied her. Although she's done some good work in smaller productions--Keith Gordon's Waking the Dead, Darren Aronofsky's Requiem for a Dream--Connelly has suffered till now from the Elizabeth Taylor syndrome. Like Taylor, she started young in show business and was quickly turned into a physical commodity, cast for her dark hair, blue eyes, smooth face and a buxom figure that she exposed very freely, arousing both sexual interest and condescension in a single gesture. The condescension came all the more quickly because Connelly, like Taylor, seems submerged in her beauty. It tends to separate her from other actors, as a rare fish is held apart in an aquarium, with the result (among other things) that she's a bad choice for comedy. Connelly can play at being amused by someone, but she isn't funny in herself--in contrast, for example, to her near-contemporary Shannon Elizabeth, a wonderfully silly person who shares her looks like a good joke.
Connelly has so far been incapable of such lightness; but she's right at home with the intensity of suffering that's called for in melodrama. Now her reputation is taking an upward turn similar to Taylor's at the time of Suddenly, Last Summer and Butterfield 8. Heaven knows, I don't want to go on to Cleopatra; but as someone who respects the tradition of melodrama, I think American cinema would be stronger if producers created more roles for Jennifer Connelly.
Having just seen Monster's Ball, I will also say the same for Halle Berry. She, too, has based her reputation on being absurdly gorgeous, with this distinction: Berry treats her looks like a loaded gun, which she can and will use. Of course, the danger varies; there was a lot of it in Bulworth but not much, somehow, in The Flintstones. Now, in Monster's Ball, the sense of risk suddenly leaps to a higher order.
Berry plays a wife and mother in a present-day Southern town--wife to a man on death row, mother to a boy who weighs 180 pounds and has not yet reached puberty. Through a series of catastrophes--or perhaps I should say wild coincidences--she eventually finds herself on the sofa late at night with Billy Bob Thornton, the racist white prison guard who led her husband to the electric chair. Grief, fatigue and booze are weighing heavily on her. She needs to wriggle free of them; everything that's still alive in her demands it. And so, in a scene that becomes a tour de force, she laughs in reminiscence about her husband, insists to herself that she's been a good mother, philosophizes starkly about the lives of black men in America and ultimately pours herself into Thornton's lap, demanding, "Make me feel good."
The screenwriters of Monster's Ball, Milo Addica and Will Rokos, might easily have based this scene on an acting-class exercise. A pair of students are assigned random emotions and must then improvise their way through them, making up the transitions as they go. What Berry does with the scene, though, has no whiff of the classroom. She doesn't just bob along on the swells and troughs of her feelings; she remembers at all times that these emotions have welled up because of the stranger next to her, this oddly quiet man to whom she addresses the whole monologue. She seems half-blind when she looks at him, but only half. She pushes against his self-possession, moment by moment; and the steadier he holds, the further she plunges in.
I wish the rest of Monster's Ball could live up to this scene. There are several fine sequences in the movie, which Marc Forster has directed with admirable restraint; but the picture is entirely too eager to flatter the audience. Monster's Ball is a machine, designed to make Billy Bob Thornton think and behave just as you believe he should. By the end, there's nothing to cut the good intentions except the memory of that smoky, greasy, overpowering scene where Halle Berry risks everything. It's almost enough.
The opening fifteen minutes of Ali are so good that they, too, come close to justifying the picture. In a virtuoso montage, which shows director Michael Mann at his very best, this sequence takes young Cassius Clay up to his first fight against Sonny Liston and his declaration of allegiance to the Nation of Islam. After that, you begin to notice that four screenwriters have labored over this production. Plot points are made with the galumphing literal-mindedness of Bob interviewing Ray. What's worse, these same points, from Liston I through the Foreman match in Zaire, were touched on in the 1977 film The Greatest, written by Ring Lardner Jr., directed by Tom Gries and Monte Hellman and starring (in the role of Muhammad Ali) Muhammad Ali.
Condemned in advance to being third best, after the real-life figure and the original movie incarnation, Will Smith can do little more than look good. It's what he specializes in; I've loved him for it. Here his innate cockiness takes him a long way in the role, as does his rapper's enjoyment of Ali's rhymes. So why does he keep getting upstaged by his supporting cast: Jamie Foxx, who makes something glorious of Ali's sidekick Drew "Bundini" Brown, and Jon Voight, who lives and breathes the role of Howard Cosell? The answer, I think, is that Smith does best when he floats along at a slight remove from his scenes, commenting on the action as if he might at any moment call it a day and go home. Ali makes him earnest; and earnestness, even more than the need to mimic a living figure, makes Will Smith disappear.
I wish Jim Carrey would disappear when he becomes earnest; but instead he latches onto the movie like a tick, gorging on sentiment and perpetually, monstrously sucking in more. The effect is all the worse in Frank Darabont's The Majestic for the cinematography. It turns Carrey into a pastel-colored tick.
In this insufferable fantasy about good old-fashioned movies and good old-fashioned Americans, Carrey plays a blacklisted Hollywood screenwriter who (through a wild coincidence) loses his memory and is welcomed into a small town. It's a wonderful life, except for the FBI. I needn't point out to Nation readers how The Majestic makes a hash out of the blacklist period. (Carrey figures out, in a climactic burst of inspiration, that he can plead the First Amendment before HUAC. Gee!) What really concerns me is the demotion of this anarchic genius to the status of All-American Nothing. Carrey can play comedy like nobody else alive; so why is he pushed into melodrama?
My conclusion: American cinema is taking its actors too seriously, and its actresses not seriously enough. Happy new year.