It seems scarcely to have required a great philosophical mind to come up
with the observation that each of us is the child of our times, but that
thought must have been received as thrillingly novel when Hegel wrote it
in 1821. For it implied that human nature is not a timeless essence but
penetrated through and through by our historical situation.
Philosophers, he went on to say, grasp their times in thought, and he
might as a corollary have said that artists grasp their times in images.
For Hegel was the father of art history as the discipline through which
we become conscious of the way art expresses the uniqueness of the time
in which it is made. It is rare, however, that grasping his or her own
historical moment becomes an artist's subject. It was particularly rare
in American art of the second half of the twentieth century, for though
the art inevitably belonged to its historical moment, that was seldom
what it set out to represent. It strikes me, for example, that Andy
Warhol was exceptional in seeking to make the reality of his era
conscious of itself through his art.
German artists of the same period, by contrast, seem to have treated the
historical situation of art in Germany as their primary preoccupation.
How to be an artist in postwar Germany was part of the burden of being a
German artist in that time, and this had no analogy in artistic
self-consciousness anywhere else in the West. Especially those in the
first generation after Nazism had to find ways of reconnecting with
Modernism while still remaining German. And beyond that they had to deal
with the harsh and total political divisions of the cold war, which cut
their country in two like a mortal wound. Gerhard Richter was a product
of these various tensions. But like Warhol, whom he resembles in
profound ways, he evolved a kind of self-protective cool that enabled
him and his viewers to experience historical reality as if at a
distance. There is something unsettlingly mysterious about his art.
Looking at any significant portion of it is like experiencing late Roman
history through some Stoic sensibility. One often has to look outside
his images to realize the violence to which they refer.
Richter grew up in East Germany, where he completed the traditional
curriculum at the Dresden Academy of Art, executing a mural for a
hygiene museum in 1956 as a kind of senior thesis. Since the institution
was dedicated to health, it was perhaps politically innocuous that the
imagery Richter employed owed considerably more to the
joy-through-health style of representing the human figure at play, which
continued to exemplify Hitler's aesthetic well after Nazism's collapse,
than to the celebration of proletarian industriousness mandated by
Socialist Realism under Stalin. This implies that East German artistic
culture had not been Sovietized at this early date. The real style wars
were taking place in West Germany and surfaced especially in the epochal
first Documenta exhibition of 1955. Documenta, which usually takes place
every five years in Kassel, is a major site for experiencing
contemporary art on the international circuit today. But at its
inception, it carried an immense political significance for German art.
It explicitly marked the official acceptance by Germany of the kind of
art that had been stigmatized as degenerate by the Nazis and was thus a
bid by Germany for reacceptance into the culture it had set out to
destroy. The content of Documenta 1--Modernism of the twentieth century
before fascism--could not possibly carry the same meaning were it shown
today in the modern art galleries of a fortunate museum. But Modernism,
and particularly abstraction, had become a crux for West German artists
at the time of Documenta 1, as if figuration as such were politically
dangerous. It was not until Richter received permission to visit
Documenta 2 in 1959, where he first encountered the art of the New York
School--Abstract Expressionism--that some internal pressure began to build
in him to engage in the most advanced artistic dialogues of the time.
The fact that he fled East Germany in 1961 exemplifies the way an
artistic decision entailed a political choice in the German Democratic Republic.
It was always a momentous choice when an artist decided to go
abstract--or to return to the figure after having been an abstractionist,
the way the California painter Richard Diebenkorn was to do. But to
identify oneself with Art Informel--the European counterpart of the
loosely painted abstractions of the New York School--as many German
artists did, was to make a political declaration as well as to take an
artistic stand. Richter was to move back and forth between realism and
abstraction, but these were not and, at least in his early years in the
West, could not have been politically innocent decisions. Neither was
the choice to go on painting when painting as such, regardless of any distinction between abstraction and realism, became a political
matter in the 1970s. If ignorant of the political background of such
choices, visitors to the magnificent Museum of Modern Art retrospective
of Richter's work since 1962--the year after his momentous move from East
to West--are certain to be baffled by the fact that he seems to vacillate
between realism and abstraction, or even between various styles of
abstraction, often at the same time. These vacillations seemed to me so
extreme when I first saw a retrospective of Richter's work in Chicago in 1987 that it looked as though I were seeing some kind of group show. "How can
you say any style is better than another?" Warhol asked with his
characteristic faux innocence in a 1963 interview. "You ought to be able
to be an Abstract Expressionist next week, or a Pop artist, or a
realist, without feeling that you have given up something." For most
artists in America, it is important that they be stylistically
identifiable, as if their style is their brand. To change styles too
often would inevitably be read as a lack of conviction. But what
the show at MoMA somehow makes clear is that there finally is a single
personal signature in Richter's work, whatever his subject, and whether
the work is abstract or representational. It comes, it seems to me, from
the protective cool to which I referred--a certain internal distance
between the artist and his work, as well as between the work and the
world, when the work itself is about reality. It is not irony. It is not
exactly detachment. It expresses the spirit of an artist who has found a
kind of above-the-battle tranquility that comes when one has decided
that one can paint anything one wants to in any way one likes without
feeling that something is given up. That cool is invariant to all the
paintings, whatever their content. As a viewer one has to realize that
abstraction is the content of one genre of his painting, while the
content of the other genres of his painting is...well...not abstraction.
They consist of pictures of the world. So in a sense the show has an
almost amazing consistency from beginning to end. It is as though what
Richter conveys is a content that belongs to the mood or tone, and that
comes through the way the quality of a great voice does.
Before talking about individual works, let me register another
peculiarity of Richter's work. He paints photographs. A lot of artists
use photography as an aid. A portraitist, for example, will take
Polaroids of her subject to use as references. The photographs are like
auxiliary memories. With Richter, by contrast, it is as if photographs
are his reality. He is not indifferent to what a photograph is of, but
the subject of the photograph will often not be something that he has
experienced independently. In 1964 Richter began to arrange photographs
on panels--snapshots, often banal, clippings from newspapers and
magazines, even some pornographic pictures. These panels became a work
in their own right, to which Richter gave the title Atlas.
Atlas has been exhibited at various intervals, most recently in
1995 at the Dia Center for the Arts in New York, at which venue there
were already 600 panels and something like 5,000 photographs. These
photographs are Richter's reality as an artist. When I think of
Atlas, I think of the human condition as described by Plato in
the famous passage in The Republic where Socrates says that the
world is a cave, on the wall of which shadows are cast. They are cast by
real objects to which we have no immediate access, and about which, save
for the interventions of philosophy, we would have no inkling. But there
is an obvious sense in which most of what we know about, we never
experience as such. Think of what the experience of the World Trade
Center attack was for most of us on September 11 and afterward. We were
held transfixed by the images of broken walls and burning towers, to use
Yeats's language, and fleeing, frightened people.
The first work in the exhibition is titled Table, done in 1962.
Richter considers it the first work in his catalogue raisonné,
which means that he assigns it a significance considerably beyond
whatever merits it may possess as a painting. It means in particular
that nothing he did before it is part of his acknowledged oeuvre.
Barnett Newman felt that way about a 1948 work he named Onement.
He considered it, to vary a sentimental commonplace, the first work of
the rest of his artistic life. Next to Table, one notices two
photographs of a modern extension table, clipped from an Italian
magazine, on which Richter puddled a brushful of gray glaze.
Table itself is an enlarged and simplified painting of the table
in the photographs, over which Richter has painted an energetic swirl of
gray paint. It is easy to see why it is so emblematic a work in his
artistic scheme. Whatever the merits of the depicted table may have been
as an object of furniture design, such tables were commonplace articles
of furniture in middle-class domestic interiors in the late fifties. In
1962 it was becoming an artistic option to do paintings of ordinary,
everyday objects. We are in the early days of the Pop movement. The
overlaid brushy smear, meanwhile, has exactly the gestural urgency of
Art Informel. So Table is at the intersection of two major art
movements of the sixties: It is representational and abstract at once.
Warhol in that period was painting comic-strip figures like Dick
Tracy--but was dripping wet paint over his images, not yet able to
relinquish the talismanic drip of Abstract Expressionism. Indeed, in
1960 he painted a Coca-Cola bottle with Abstract Expressionist
mannerisms--a work I consider Table's unknown artistic sibling.
Richter gave up Art Informel in 1962, just as Warhol dropped Abstract
Expressionist brushiness in favor of the uninflected sharpness and
clarity of his Pop images. By 1963 Richter had begun painting the
blurred but precise images that became his trademark. Richter's
marvelously exact Administrative Building of 1964 captures the
dispiriting official architecture of German postwar reconstruction,
especially in the industrial Rhineland. And his wonderful Kitchen
Chair of 1965 is a prime example of Capitalist Realism, the version
of Pop developed by Richter and his colleague, Sigmar Polke, in the
mid-sixties. Richter and Warhol had fascinatingly parallel careers.
The deep interpretative question in Richter's art concerns less the
fact that he worked with photographs than why he selected the
photographs he did for Atlas, and what governed his decision to
translate certain of them into paintings. There are, for example,
photographs of American airplanes--Mustang Squadrons, Bombers and Phantom
Interceptor planes in ghostly gray-in-gray formations. Richter was an
adolescent in 1945, and lived with his family within earshot of Dresden
at the time of the massive firebombings of that year. The photograph
from which Bombers was made had to have been taken as a
documentary image by some official Air Force photographer, whether over
Dresden or some other city. The cool of that photograph, compounded by
the cool with which that image is painted--even to the hit plane near the
bottom of the image and what must be the smoke trailing from
another--cannot but stand in a kind of existential contrast with the
panic of someone on the ground under those explosives falling in slow
fatal series from open bays. But what were Richter's feelings? What was
he saying in these images?
And what of the 1965 painting of the family snapshot of the SS
officer--Richter's Uncle Rudi--proudly smiling for the camera, which must
have been taken more than twenty years earlier, shortly before its
subject was killed in action? Tables and chairs are tables and chairs.
But warplanes and officers emblematize war, suffering and violent death.
And this was not simply the history of the mid-twentieth century. This
was the artist's life, something he lived through. We each must deal
with these questions as we can, I think. The evasiveness of the artist,
in the fascinating interview with Robert Storr--who curated this show and
wrote the catalogue--is a kind of shrug in the face of the
unanswerability of the question. What we can say is that photographs
have their acknowledged forensic dimension; they imply that their subjects were there, that they constituted reality, and that the artist himself is
no more responsible than we are, either for the reality or the
photography. The reality and the records are what others have done. He
has only made the art. And the blur with which the artist has invested his images is a way of saying that it was twenty years
ago--that it is not now. Some other horrors are now.
The flat, impassive transcriptions of Richter's paintings are
correlative with the frequent violence implied by what they depict. That
makes the parallels with Warhol particularly vivid. It is easy to
repress, in view of the glamour and celebrity associated with Warhol's
life and work, the series of disasters he depicted--plane crashes,
automobile accidents, suicides, poisonings and the shattering images of
electric chairs, let alone Jackie (The Week That Was), which
memorializes Kennedy's funeral. Or the startlingly anticelebratory
Thirteen Most Wanted Men that he executed for the New York State
Pavilion at the 1964 World's Fair. Compare these with Richter's 1966
Eight Student Nurses, in which the bland, smiling, youthful faces
look as if taken from the class book of a nursing school--but which we
know were of victims of a senseless crime. Warhol's works, like
Richter's, are photography-based. The pictures came from vernacular
picture media--the front page of the Daily News, or the
most-wanted pictures on posters offering rewards, which are perhaps
still tacked up in post offices. These were transferred to stencils and
silk-screened, and have a double graininess--the graininess of newspaper
reproduction and of the silk-screen process itself. And like Richter's
blurring, this serves to distance the reality by several stages--as if it
is only through distancing that we can deal with horror. I tend to think
that part of what made us all feel as if we were actually part of the
World Trade Center disaster was the clarity of the television images and
the brightness of the day that came into our living rooms.
Whatever our attitude toward the prison deaths of the Baader-Meinhof
gang members in 1977, I think everyone must feel that if Richter is
capable of a masterpiece, it is his October 18, 1977 suite of
thirteen paintings, done in 1988 and based on aspects of that reality.
These deaths define a moral allegory in which the state, as the
guarantor of law and order, and the revolution, as enacted by utopian
and idealist youths, stand in stark opposition, and in which both sides
are responsible for crimes that are the dark obverses of their values.
But how fragile and pathetic these enemies of the state look in
paintings that make the photographs from which they were taken more
affecting than they would seem as parts, say, of Atlas. Who knows
whether Richter chose the images because they were affecting, or made
them so, or if we make them so because of the hopelessness of a reality
that has the quality of the last act of an opera, in which the chorus
punctuates the tragedy in music? There are three paintings, in graded
sizes, of the same image of Ulrike Meinhof, who was hanged--or hanged
herself--in her cell. The paintings do not resolve the question of
whether she was killed or committed suicide. They simply register the
finality of her death--Dead. Dead. Dead. (Tote. Tote. Tote.)--in a
repetition of an image, vanishing toward a point, of a thin dead young
woman, her stretched neck circled by the rope or by the burn left by the
rope. That is what art does, or part of what it does. It transforms
violence into myth and deals with death by beauty. There was a lot of
political anger when these paintings were shown in 1988, but there was
no anger in the gallery on the occasions when I have visited it in the
past several weeks.
By comparison with the ferocity of human engagements in the real world,
the art wars of the mid-twentieth century seem pretty thin and petty.
But it says something about human passion that the dispute between figuration and abstraction was so vehement that, in my memory, people
would have been glad to hang or shoot one another, or burn their
stylistic opponents at the stake, as if it were a religious controversy
and salvation were at risk. It perhaps says something deep about the
spirit of our present times that the decisions whether to paint
abstractly or realistically can be as lightly made as whether to paint a
landscape or still life--or a figure study--was for a traditional artist.
Or for a young contemporary artist to decide whether to do some piece of
conceptual art or a performance. Four decades of art history have borne
us into calm aesthetic waters. But this narrative does not convey the
almost palpable sense in which Richter has grasped his times through his
art. One almost feels that he became a painter in order to engage not
just with how to be an artist but how, as an artist, to deal with the
terribleness of history.
What date shall I assign to Chris Marker's magnum opus, A Grin
Without a Cat? This rugged oak of an essay-film, whose gnarls trace
the growth and withering of decades of leftist politics, is now playing
for the first time in the United States, where it's being shown in the
form Marker gave it after
the demise of the Soviet Union. I might say it's a film from 1993; and
yet the version we now have is the revision of a work completed in 1977,
when Communism was still alive, and anti-Communism was more than the
hungry zombie it's since become.
Communism was still alive, but even then Marker perceived a change. The
last major event he incorporated into his essay was the 1974 election of
Valéry Giscard d'Estaing to the presidency of France. In the
film, this election represents the end of a period of turmoil that had
begun in 1967: the year of campus uprisings in the United States against
the Vietnam War, increased union militancy in France, bloody student
protests in Berlin against the visiting Shah of Iran, the death in
Bolivia of Che Guevara. It's fair to say that the main body of A Grin
Without a Cat deals with these years, so I might date the film 1967-74.
But then, the historical marker slips back even further. To explain why
Che perished as he did, to account for his prestige in death, to suggest
how that martyrdom shaped the period that followed, the film revisits
1962, when Douglas Bravo launched a guerrilla war in rural Venezuela.
Believing that a few militants could spark revolution on their own,
Bravo and his followers abandoned the discipline of the Communist Party.
That was the good news. The bad news was, they also abandoned the
party's political base. In Marker's words (which are spoken throughout
the film by several voiceover narrators), the guerrillas made themselves
into "a spearhead without a spear, a grin without a cat."
The phrase brings to mind Lewis Carroll, and maybe Gogol, too. I will
have something to say about the rude adventures of this grin. First,
though, a question: Assuming there was once a whole cat, what did it look like?
Marker gives a filmmaker's reply: He goes back in time to The
Battleship Potemkin. His picture begins in that other movie--begins
twice, in fact. As his first gesture in A Grin Without a Cat,
Marker shows us Eisenstein's celebrated vision of the Potemkin
mutiny, in which a sailor faces a line of riflemen and wins them over
with a single shout: Brothers! Out of that moment, Marker develops a
great, thrilling montage sequence of his own, spanning half a century of
conflicts in the streets and ending on Eisenstein's Odessa steps, more
or less in the present day. There, as if to begin the film again, Marker
shows us a pleasant young woman who sits in the sunshine, chatting with
an offscreen interviewer. She is a French-speaking Intourist guide, and
she can testify that this site is very popular. She brings people to it
two or three times a day.
We might conclude that the not-quite-mythical cat was on the prowl
sometime between these two historical moments, the first of inspiration,
the second of nostalgia. We might decide that A Grin Without a Cat
is dated 1925-93.
During those years, was anything left unfilmed? To watch this picture is
to be astonished at the world of footage that's been piled up here, some
of it shot by Marker himself, most of it recorded by others, both known
and anonymous. The raw materials of A Grin Without a Cat include
images of a US pilot bombing Vietnam, as seen from the cockpit; scenes
of carefully staged party congresses in Havana and Beijing and of an
unscripted, on-the-run congress in 1968 Prague; views of the festive Cat
Parade in Ypres; broadcasts of the Watergate hearings and of the Shah of
Iran's grandiose party for himself in Persepolis; raw footage of
Communist and Trotskyist workers getting into a fistfight at a factory
gate; interviews in the jungle with Douglas Bravo, in the Pentagon with
a counterinsurgency expert, in the Citroën headquarters with that
firm's managing director; Soviet newsreels from World War II; a student
collective's newsreel from 1967 Berlin; shots of Giscard d'Estaing
playing the accordion and of The Who destroying their instruments;
behind-the-scenes pictures of training sessions at the School of the
Americas; and the usual amalgam of flaming automobiles, flying tear-gas
canisters, descending truncheons and human beings lying in pools of blood.
So complete is the filmed record on which Marker draws, and so
associative is his method of using it, that he can show us a statement
made in 1968 by a Czech national hero, Emil Zatopek, just before he was
stripped of his military rank for protesting against the invasion;
Zatopek at the 1952 Helsinki Olympics, when he famously swept the
distance running events; and Zatopek in 1972, when he was released from
the mines and trotted out to look solemn at the Munich Olympics, when
the games continued despite the murder of eleven Israeli athletes. But
then, Marker comments, "I had been in Mexico City in 1968, when 200
people were killed so the games could begin," and we have that footage, too.
This sort of thing can make your head spin; but since it should also
make your head clear, Marker's montage is not only associative but
diagrammatic as well. A Grin Without a Cat is divided into two
main sections. Part One, "Fragile Hands," concentrates on the events of
1967 and 1968, up to the fizzling of the May revolt in France. Part Two,
"Severed Hands," begins with the Soviet invasion of Czechoslovakia,
continues with the rise and fall of Salvador Allende (and the Gang of
Four) and concludes with the fading of the cat's grin, late in the 1970s.
Marker tends to present these events in big loops. He'll jump from
source to source, place to place, to develop an argument (about the
concept of a revolution in the revolution, for example); he'll digress
to examine the way people gestured with their hands, or how they either
filled or did not fill the space between striking workers and police;
and then he'll swing back to close the loop, concluding one phase of his
essay and moving on to the next. At each phase (at least in the earlier
part of the film) he also introduces elements that I might as well call
dialectical. When he shows a group of war protesters preparing to burn
their draft cards in 1967, he also shows a rally of the American Nazi
Party. When French student leader Daniel Cohn-Bendit comes into the
picture, so does Giscard d'Estaing. We watch the New Left rise in tandem
with the New Right. In Marker's view of history, the development of the
New Right may have been the New Left's greatest achievement.
If so, then the Old Left contributed ample help. Marker makes the point
with stunning force during his section on Czechoslovakia, when he
unexpectedly closes one of those big loops of montage. Citizens of
Prague have surrounded a Soviet tank driver and are berating him--"How
could you, a Communist, be doing this?"--when that intertitle from
The Battleship Potemkin pops onto the screen again, in a way
that's now heartbreaking and futile: Brothers!
And since Marker is a moviemaker above all, A Grin Without a Cat
also makes its point as a movie should, through the actions of its star.
Yes, there is a lead actor in this film: Fidel Castro, whose many
performances, interspersed throughout the picture, amount to a little
drama of their own, complete with a nasty plot twist. Here is Fidel on
the podium, addressing a night-time rally with wit, vigor and good
sense. Here he is again, sprawled casually on the grass for the benefit
of the camera, giving a very good impersonation of a man speaking
spontaneously, sensitively, about popular militancy and his comrade Che
Guevara. And here, giving a radio broadcast, Fidel appears to work
himself into a fury against the invasion of Czechoslovakia, as a
dramatic overture to praising the Soviets for their tanks.
This is dense, complex, allusive filmmaking, encyclopedic in ambition,
profound in understanding, playful enough in form to make you smile
sometimes at the tricks of history. Though Marker has made an elegy to
the left, he would prefer that you leave the theater invigorated,
feeling that power is still abroad in the world, and that you and your
friends might still disrupt its dirty work.
My only complaint is that the film could have sent you home feeling even
better. During the period Marker covers, the feminists got a few things
done, often without bothering to define their relationship to the
Communist Party; but feminism shows up very late in A Grin Without a
Cat, as a mere afterthought. Africa doesn't show up at all; yet
activists from around the world made some changes there too, such as
ending apartheid and establishing a new democratic state. You may choose
to add to the list a third or fourth victory. We've had a few, despite
all of history's tricks.
That said, A Grin Without a Cat was made for you, Nation
reader. It premieres in America on May Day, at New York's Film Forum.
Abbas Kiarostami's most recent documentary, which premieres in the
United States on May 3 at New York's Cinema Village, is about nothing
other than Africa and feminism. Made on behalf of the UN's International
Fund for Agricultural Development, ABC Africa is the record of a
trip to Uganda, during which Kiarostami investigated the effect of AIDS
on women and children.
The effect, briefly stated, is that children are orphaned, and women are
left to care for them: six, eleven, thirty-five at a time. According to
the film, there are now more than 1.6 million orphans in Uganda, out of
a population of 22 million. The Catholic Church helps by offering a
wretched level of care to the suffering, meanwhile ensuring there will
be more suffering by discouraging the use of condoms. By contrast, the
Uganda Women's Effort to Save Orphans (UWESO) helps with a program that
encourages women to band together and become economically independent.
I lack the space in this column to describe even a part of what
Kiarostami recorded with his digital video cameras. It's enough to say
that, while he captured images on the run, he somehow made a Kiarostami
film. ABC Africa is devastating, as you'd expect. It's also
lyrical, beautiful and quietly inventive.
The Nation announces the winners of Discovery/The Nation, the Joan
Leiman Jacobson Poetry Prize. Now in its twenty-eighth year, it is an
annual contest for poets whose work has not been published previously in
book form. The new winners are: Linda Jenkins, Gregory McDonald, Andrew
Varnon and Stefi Weisburd. This year's judges are Catherine Bowman,
Carolyn Forché and Paul Muldoon. As in the past, manuscripts are judged
anonymously. Distinguished former winners include Susan Mitchell, Katha
Pollitt, Mary Jo Salter, Sherod Santos, Arthur Smith and David St. John.
This year's winners will read their poems at Discovery/The Nation
'02 at 8:15 pm on Monday, May 6, at The Unterberg Poetry Center, 92nd
Street Y, 1395 Lexington Avenue (92nd Street and Lexington Avenue) in
New York City.
--Grace Schulman, poetry editor
The Lewis & Clark Snowglobe
There exists one, anti-gewgaw, memento
ingenuous as any wonder,
though I've never seen nor heard of it, and yet--
as is revolution of heavenly body, of colony--
all's a given. The only question being which scene
of scenes? Spring 1804: keelboat,
all fifty-five feet of it, curses
the Missouri's sawyers--
Shake it and snow that falls in summer
plagues unseen men--Clark's "misquetors."
Or Lewis gazes, dizzy with May and his first
"plain and satisfactory view" of the Rockies'
plastic expanse, its blue-lipped ardor soothing
words Northwest Passage forever.
In a roadside gift shop,
Sacagawea proves false
an old adage; Home again Home again, swirls
her first moments back
among the Shoshones; with a knick-knack's economy,
sixteen mounted warriors become
one or two; her lost brother has become chief,
and they embrace:
novelist's fantastical turn.
It's the day a horse takes badly a Bitterroots precipice, the group--
ravenous, anonymous, androgynous--proceeds,
one colt divided among thirty-plus bellies. It's Clark,
jubilant at the first
(if false) view of Pacific.
It's hermetic 1806 St. Louis,
its sluicy tempest of rounds and cheers.
And not famed, not at all likely
to be the scene, yet Washington's elite toasts Lewis
with a ball; outside, glitter falls--and Lewis, triumphant, drunk
off the New Year, raises his glass, voices
a toast of his own:
"May works be the test
of patriotism as they ought, of right, to be of religion,"
as they ought (redundant or no) to be of love.
It was in an Age of Such Incredible Secrets
It was in an age of such incredible secrets
that my mother began to paint her toenails
the color of eggshells, and my father
learned how to make love with his hands
at his side. I saw them practicing once,
but all I could think about was our icebox
full of fish and ketchup, and the small wooden bird
above my grandmother's bed, rocking back and forth,
dipping its red beak into a bowl of water.
What I Remember
I lift the bottle every time you catch me
looking at you. In all the apartment
complexes down Alafaya Trail,
I roll on the floor away from the wet nose
of a basset hound. Pennies spill
that I will forget; lips are moving but
I can't keep my footing in the mud.
Spanish moss hangs from a tree, there is a frog
and everybody throws water balloons.
A black dress with pink flowers
A storm over the gulf at sunrise
Empty beach chairs face turquoise
Traffic lights change without cars
I chase you with whiskey and chase
whiskey with beer and chase an armadillo
around the art gallery, muttering something
about "plasticity" or "negative space."
The search lights catch up with me. I walk
out the back door too easy, afraid of fists
that put holes in your wall. Mine
is the long walk home under streetlights
with only beat cops and that one Muddy Waters
song I know to keep me company, me and that
thirsty head full of wilderness I'm so afraid of.
Elegy For Two
A yowling pulls like tides at our blind ear
from down the hall. The sound of Baby's ire
at God knows what, the broken night, the leer
of suns, I said. The nurse spit out: Liar.
Eyes of fruit and cinder block conspire.
His cries would fever milk and wrench the bed.
A letter in my husband's hands perspires.
For the love of God, it's just a cat, my nurse said.
But cats don't antidote true love or shred
the film of sleep with shrill ballistic shrieks
or tick heart's tomb, slash the vagrant thread,
tear the doll to wipe the bloody streaks.
Cats don't rasp or beg with gnawing squall
on stairs to help the helpless totter, fall.
Alan Dershowitz prides himself on his credentials as a civil
libertarian, and to judge by most of the essays in his latest book,
Shouting Fire: Civil Liberties in a Turbulent Age, he has good
reason to do so. The Harvard law professor has built a considerable
reputation on his defense of free speech, due process and the separation
of church and state, to say nothing of his propensity for controversial
clients and clamorous talk shows. Shouting Fire is a pastiche of
fifty-four essays, some of them new, most of them not, the earliest
dating from 1963. The impetus for the collection appears to be at least
in part a desire to reassert the importance of civil liberties, even in
the face of such national security threats as those posed by the events
of September 11 and their aftermath. Moreover, Dershowitz admirably
offers what rights advocates rarely do: a philosophical grounding for
civil and political rights beyond the mere positivist assertion that
"that's the law."
If this were all Dershowitz had done in Shouting Fire, the book
might have received its share of kind reviews and headed off to
Remainderland. But in less than two of the book's 550 pages, he manages
to guarantee the collection a longer shelf life. For in an addendum to a
1989 article in the Israel Law Review, Alan Dershowitz, civil
libertarian, champion of progressive causes, counsel to human-rights
hero Anatoly Shcharansky, makes a case for torture or, more exactly, for
the creation of a new legal device that he dubs a "torture warrant." And
then, through a deft combination of newspaper editorials, public
appearances and an extended interview on 60 Minutes, Dershowitz
has expanded upon that proposition in a way designed to make talk of
torture routine and, not incidentally, banter about his book robust.
Dershowitz's proposal, therefore, deserves careful scrutiny, not only
because it comes from a respected voice but also because sources in the
FBI have floated the possibility that torture will be applied against
prisoners or detainees who refuse to tell what they know about
terrorists. Last October 45 percent of Americans approved of that.
Today, thanks to Dershowitz and others having lent the idea the patina
of respectability--Jonathan Alter writing in Newsweek, Bruce
Hoffman in The Atlantic--the number may be higher.
Dershowitz starts with the familiar scenario from every freshman
philosophy class, the case of the ticking bomb. Suppose the authorities
are holding a suspect who knows where a ticking bomb is located, a bomb
that will kill hundreds of people if it explodes. Would they be
justified in torturing the suspect to procure the information and
thereby save innocent lives?
Dershowitz contends that whether we like it or not, the officials would
inevitably resort to torture and, what's more, the vast majority of us
would want them to. But because any officer who did so might be subject
to prosecution, despite the availability of the common law defense that
a crime may be justified if it is necessary to prevent a greater evil,
the onus of responsibility should not be left on the individual
official. Instead the authorities should apply to a court for a "torture
warrant," similar to a search warrant, so that the courts must bear the
burden of authorizing torture or the consequences of failing to do so.
In another context Dershowitz has offered the reassurances that "the
suspect would be given immunity from prosecution based on information
elicited by torture" and that "the warrant would limit the torture to
nonlethal means, such as sterile needles being inserted beneath the
nails to cause excruciating pain without endangering life."
Despite these precautions, however, Dershowitz's proposal has not met
with universal acclaim, and in recent weeks he has appeared to be
distancing himself from it. In a February 17 letter to The New York
Times Book Review responding to a critical review of Shouting
Fire, Dershowitz claims that "the only compromises [with civil
liberties] I suggest we should consider, and not necessarily
adopt, relate directly to protecting civilians against imminent
terrorist attacks [emphasis added]." But there is no hint on the two
relevant pages of Shouting Fire that Dershowitz's "torture
warrant" proposal is merely hypothetical. Indeed, in commenting on the
decision by the Supreme Court of Israel that prompted the idea in the
first place, he chastises the court for leaving interrogating officers
vulnerable to prosecution if they use torture and says, "The Supreme
Court of Israel...or the legislature should take the...step of requiring
the judiciary to assume responsibility [for torture] in individual
cases." Dershowitz is stuck with his "torture warrants" just as surely
as Arthur Andersen is stuck with its Enron audits.
So what, after all, is wrong with that--other than the fact that torture
violates both the Convention Against Torture, which the United States
ratified in 1994, and the Constitution? The first thing that is wrong is
that the act of torture, unlike that of searching for something, is in
itself both universally condemned and inherently abhorrent. Under
international law, torturers are considered hostis humani
generis, enemies of all humanity, and that is why all countries have
jurisdiction to prosecute them, regardless of where the torture took
place. The fact that a US court or legislature might offer its approval
of the act does not abrogate that internationally recognized standard
any more than a court in Singapore that authorizes the jailing of a
dissident journalist makes Singapore any less guilty of violating the
rights of a free press. Tyrannical governments often try to cloak their
human rights violations in national statute. It is interesting, however,
that no country has ever legalized torture except, arguably, Israel,
until the Israeli Supreme Court struck down the provision for the use of
"moderate physical pressure," and even while that provision was on the
books, the Israeli government argued vehemently that such pressure was
not the equivalent of torture.
To see more clearly the shoals upon which the "torture warrant"
founders, consider this. There is no doubt that despite official
efforts to eradicate it, police brutality is practiced in many US
jurisdictions and probably always will be. Some police officers will
claim, in their more candid moments, that the use of excessive force is
often the only way to protect the lives of officers and the general
public. Why ought the police not be able, therefore, to apply for
"brutality warrants" in specialized cases? Why ought police officers who
believe that a little shaving of the truth on the witness stand is worth
sending a bunch of drug pushers to prison, thus protecting hundreds of
youngsters from a life of drugs and crime, not be able to seek
"'testilying' warrants"? Why ought correctional officers who argue that
allowing dominant male prisoners to rape other prisoners helps preserve
order among thugs and thus protects the lives of guards not be allowed
to seek "warrants to tolerate prisoner rape" in particularly dangerous
situations? The answer in all cases is the same: because the act in
question (brutalizing citizens; committing perjury; facilitating rape) is
itself abhorrent and illegal. Dershowitz's analogy to search warrants fails
because, while a particular search may itself be illegal, the act of
searching is not ipso facto unethical or a crime. For a society
to start providing its imprimatur to criminal acts because they are
common or may appear to provide a shortcut to admirable ends is an
invitation to chaos.
But even if torture were a licit activity under some circumstances,
there are very good pragmatic reasons to reject its use. If the ticking
bomb scenario were designed only to establish the abstract moral
calculus that the death of X number of people constitutes a greater evil
than the torture of one, it would certainly be possible to make a
plausible utilitarian argument for torture. The problem is, however,
that the proponents of the ticking bomb scenario want it to serve as the
basis of public policy, and unfortunately reality rarely conforms to
scenarios and life doesn't stop where the scripts do. How strange that
though the ticking bomb scenario has been used for decades to justify
torture, its defenders are unable to cite the details of even one
verifiable case from real life that mirrors its conditions.
Perhaps, upon reflection, that is not so strange. For what the ticking
bomb case asks us to believe is that the authorities know that a bomb
has been planted somewhere; know it is about to go off; know that the
suspect in their custody has the information they need to stop it; know
that the suspect will yield that information accurately in a matter of
minutes if subjected to torture; and know that there is no other way to
obtain it. The scenario asks us to believe, in other words, that the
authorities have all the information that authorities dealing with a
crisis never have.
Even aficionados of ticking bomb torture agree that its use can only be
justified as a last resort applicable to those we know to a moral
certainty are guilty and possess the information we seek. That 45
percent of Americans who reported last October that they approved of
torture were approving of the "torture of known terrorists if they know
details about future terrorist attacks." But how do we know all that?
The reason torture is such a risky proposition is exactly because it is
so difficult to tell ahead of time who is a terrorist and who is not;
who has the information and who does not; who will give the information
accurately and who will deceive; who will respond to torture and who
will endure it as a religious discipline. The fact is that many people
suspected of being terrorists turn out not to be, as our experience
since September 11 has proven so well; that, historically, many of those
subjected to torture are genuinely ignorant of the details the
authorities seek; that the information extracted with torture is
notoriously unreliable; and that torture almost always takes a long
time--days and weeks, not hours and minutes--to produce results. Torture
is of course extraordinarily common. Almost three-fourths of the world's
countries practice it. But not to find ticking bombs. To punish
political opponents. To intimidate their allies. To cow a citizenry. The
ticking bomb scenario in its purest form is a fantasy of "moral" torture
all too easily appropriated by tyrants as an excuse to justify the more
common, ignoble torture they actually practice.
And if the ticking bomb scenario is a fantasy, the Dershowitzian
addition of a "torture warrant" makes it into a chimera. Here is a
situation Dershowitz envisions for the warrant's use:
Had law enforcement officials arrested terrorists boarding one of the
[September 11] airplanes and learned that other planes, then airborne,
were headed toward unknown occupied buildings, there would have been an
understandable incentive to torture those terrorists in order to learn
the identity of the buildings and evacuate them.
This assumes that those law enforcement officials would have had time in
the hour and a half or so between the boarding of the planes and the
impact on their targets to (1) take the suspects into custody; (2)
ascertain with enough certainty to warrant torture that the suspects
were (a) terrorists who (b) had the needed information in their
possession; (3) apply to a judge for a torture warrant and make the case
for one; (4) inflict torture sufficient to retrieve the necessary facts;
(5) evaluate the validity of those facts in order to be assured that no
innocent plane would be identified and blown out of the sky; and (6)
take the steps required to stop or mitigate the terrorist act. Perhaps
after John Ashcroft has been Attorney General another three years, law
enforcement will have learned to cut enough corners of the legal
niceties to accomplish this feat. But at the moment, given the INS, Tom
Ridge, bureaucratic infighting and all, it seems unlikely.
Which leads to the question of whether, if the United States were to
become the first country in the world to adopt "torture warrants," they
would make us safer. That, after all, is presumably the only ultimate
rationale for their use. But here is another place where the traditional
ticking bomb case explodes in the face of reality. For it assumes that
there are no further detrimental consequences once the victims of the
bombing are saved--no retaliatory strikes, for example, by the torture
victim's comrades to pay back the inhumanity done to their brother. It
doesn't take much imagination to see how quickly officially authorized
torture would diminish the credibility of a struggle against terrorism
that is being fought in the name of defending American values and the
rule of law. How many people would need to be tortured before our allies
threw up their hands in disgust and our adversaries started celebrating
their moral victory? How many innocent people would have to be
brutalized before their resentment and that of their friends and family
would spill over into violence? In his book No Equal Justice law
professor David Cole has shown how mistreatment of the innocent by US
police can alienate entire communities and result in increases in crime.
Torture, similarly, is a sure-fire way to manufacture an embittered
opponent of the United States where there was none before. And make no
mistake that innocent people would be tortured, warrant or no, for,
after all, if close to 100 innocent people have been convicted of
capital crimes and sentenced to death in this country despite all the
protection our legal system offers, how much more likely is it that
miscarriages of justice will flow from the pen of a single judge?
Whatever leadership the United States can claim in the world is
intimately linked to our practice of values universally regarded as
fundamental to a civilized people.
So how could a distinguished human rights advocate like Alan Dershowitz
have strayed so far from the mark? Part of it may have to do with the
philosophical basis for rights that he sketches in the beginning of his
book. Wisely rejecting the notions that rights are derived from deity or
natural law and yet unconvinced that positivism alone provides
sufficient heft for rights claims, Dershowitz adopts what he calls the
"experiential-advocacy approach." In effect, he says, we should look to
history to identify prototypical instances of injustice (slavery, for
example) and then, based upon that human experience, construct a set of
rights--free speech, due process--that are most likely to bring about
the type of society in which we would want to live. So far, so good.
Human rights are assuredly derived from human experience.
But what if you disagree with my vision of the good society? The best we
can do, Dershowitz insists, is to try to argue you out of your myopia:
"That is all I can do," he says. "Defend my preference for [certain]
rights.... But I make no claim for these rights beyond my ability to
persuade you to agree with me that history--especially the history of
wrongs--has shown these rights to be important enough to be given a
special status in the hierarchy of preferences. It may surprise you to
learn that for me there is no sharp line...separating rights from
strongly held preferences." It is here that Dershowitz stumbles.
For while rights are, in a sense, preferences, they are also more than
that: They are norms, behavioral norms necessary to create and sustain a
good society. And they become norms not through argument alone but
through its conclusions, through an articulated consensus of the
international community. One of the most astonishing lacunas in the
philosophical section of Shouting Fire is the absence of even one
mention, if the index and my reading are to be believed, of the
Universal Declaration of Human Rights. For while the UDHR did not set
out to be a legally binding treaty (the State Department called it in
1948 "a hortatory statement of aspiration") and hence avoids the limits
of positivism, it does reflect--imperfectly, to be sure, but as well as
possible within the current limits of human endeavor--what St. Augustine
called our "overlapping loves," our common measures of a decent world.
To those who disagree with its vision of that world, we can offer much
more than a shouting contest, much more than any one person's reading of
history or any one nation's perception of its needs. We can offer the
collective wisdom of the human community as hammered out, written down
and, more and more frequently, enforced. And part of that wisdom is that
torture is wrong. Everywhere. In all circumstances. With or without a
"torture warrant."
Alan Dershowitz may not like that. And he is certainly entitled to go on
arguing about it. He is a persuasive fellow and eventually he may even
succeed in helping erode the international prohibitions on torture. That
will be a sad day, no doubt, but how comforting it will be to know at
that point that, thanks to the professor, the needles will be sterile.
To immerse oneself in Robert Caro's heroic biographies is to come face to face with a shocking but unavoidable realization: Much of what we think we know about money, power and politics is a fairy tale. Our newspapers, magazines, broadcast and cable newscasts are filled with comforting fictions. We embrace them because the truth is too messy, too frightening, simply too much.
In a 1997 speech on the topic, Ben Bradlee attributes our problem to official lying. "Even the very best newspapers have never learned how to handle public figures who lie with a straight face. No editor would dare print.... 'The Watergate break-in involved matters of national security, President Nixon told a national TV audience last night.... That is a lie.'"
But the problem is much larger than Bradlee allows. Caro demonstrates how this colossal structure of deceit clouds the historical record. The unelected Robert Moses exercised a dictatorial power over the lives of millions of New Yorkers for nearly half a century. He uprooted communities and destroyed neighborhoods using privately run but publicly funded entities called "public authorities," whose charters he personally wrote. Before the publication of The Power Broker in 1974 (1,246 pages, after having been cut by 40 percent to fit into a single volume), no book or major magazine article existed on the topic. Caro's obsessive exhumation of Moses's career transformed our understanding of the mechanics of urban politics. And yet even today the media proceed as if it's simply a matter of campaigns, elections and legislation.
The true face of our money-driven political system is buried so far beneath the surface of our public discourse that almost nobody has any incentive to uncover it. With a meager $2,500 advance to sustain him, Caro sold his house and nearly bankrupted his family; his wife, Ina--a medieval historian--went to work as his full-time researcher. When I asked why he did it, he got a little choked up about the sacrifice of Ina's career and how much she had loved their old house. Finally he said he had no idea. The Caros' combination of intellectual independence and professional dedication inspires comparisons with another great marital partnership: that of the late, great Izzy and Esther Stone. (Can anyone imagine what Izzy would have come up with if he had committed virtually his entire career to smoking out the truth about just two powerful men?)
Caro's new book, Master of the Senate, volume three of The Years of Lyndon Johnson, forces us not only to rewrite our national political history but to rethink it as well. What Caro is doing here is something we rarely see attempted in any medium: His aim, as he once explained to Kurt Vonnegut, "is to show not only how power works but the effect of power on those without power. How political power affects all our lives, every single day in ways we never think about."
Caro's been burrowing beneath the shadows of the substance of our politics for more than twenty-eight years, and what he finds is both fascinating and surprising. In many ways Johnson's personality--so outsized and contradictory as to be cognitively uncontainable--gets in the way of this compulsively readable story, which is about how power is exercised in this country.
Lyndon Johnson did not invent the form of legislative power he exercised through the Senate in the 1950s, but Caro has almost had to invent a new history to describe it. People have told pieces of it here and there, but who's got the time, the motivation or the patience to really nail down not only what happened but what it meant to the nation? Here's a tiny example, of which this new book has almost one a page. Listen to longtime Senate staffer Howard Shuman: "William S. White, [whom Caro terms the Senate's "most prominent chronicler"] wrote that the way to get into the Club was to be courteous and courtly. Well, that's nonsense." Johnson mocked and humiliated liberal New York Senator Herbert Lehman at every opportunity: "It didn't have anything to do with courtly. It had to do with how you voted--with whether or not you voted as Lyndon Johnson wanted you to vote." Neil MacNeil, veteran Time correspondent adds, "The Senate was run by courtesy, all right--like a longshoreman's union."
Now don't go looking in old Time magazines for any hint of this. Caro spends more than 300 of his 1,167 pages on the incredible story of Johnson's navigation of the 1957 Civil Rights Act through Congress, something that hardly anyone thought possible until he pulled it off. With the singular exception of Tom Wicker, then a green (and largely ignored) young reporter for the Winston-Salem Journal, no one covering the story had an inkling of how it happened.
One indisputable conclusion that Caro offers is pretty tough to swallow. The advances in civil rights legislation that helped end centuries of legal apartheid in this country could never have occurred had they not been planned and executed by a man who turns out to have been a thoroughgoing racist. Caro was much criticized for downplaying Johnson's 1948 support for Truman, considering the fact that his lionized opponent, Coke Stevenson, stood with the racist Strom Thurmond Dixiecrat campaign. But Johnson, it turns out, attacked Truman's civil rights policies no less virulently. He gave a campaign speech in May 1948 in which he compared civil rights legislation to the creation of "a police state in the guise of liberty." Caro found the speech in a White House file with the following admonition stapled on top: "DO NOT RELEASE THIS SPEECH--not even to staff...this is not EVER TO BE RELEASED." Thanks to Caro, this story, and with it a big chunk of our history, has been released as well.
Addendum: George W. Bush's Executive Order 13233, which effectively eviscerates the Presidential Records Act of 1978 by fiat, is designed to insure that no historian can ever provide this kind of public service again. Twenty Democrats and three Republicans are co-sponsors of a bill to restore it. Write your representatives and tell them to get on board.
The buildings' wounds are what I can't forget;
though nothing could absorb my sense of loss,
I stared into their blackness, what was not
supposed to be there, billowing of soot
and ragged maw of splintered steel, glass.
The buildings' wounds are what I can't forget,
the people dropping past them, fleeting spots
approaching death as if concerned with grace.
I stared into the blackness, what was not
inhuman, since by men's hands they were wrought;
reflected on the TV's screen, my face
upon the building's wounds. I can't forget
this rage, I don't know what to do with it--
it's in my nightmares, towers, plumes of dust,
a staring in the blackness. What was not
conceivable is now our every thought:
We fear the enemy is all of us.
The buildings' wounds are what I can't forget.
I stared into their blackness, what was not.
Late in the evening in back-road America you tend to pick the motels with a few cars parked in front of the rooms. There's nothing less appealing than an empty courtyard, with maybe Jeffrey Dahmer or Norman Bates waiting to greet you in the reception office. The all-night clerk at the Lincoln motel (three cars out front) in Austin, Nevada, who checked me in around 11:30 pm last week told me she was 81, and putting in two part-time jobs, the other at the library, to help her pay her heating bills since she couldn't make it on her Social Security.
She imparted this info without self-pity as she took my $29.50, saying that business in Austin last fall had been brisk and that the fifty-seven motel beds available in the old mining town had been filled by crews laying fiber optic cable along the side of the road, which in the case of Austin meant putting it twenty feet under the graveyard that skirts the road just west of town.
Earlier that day, driving from Utah through the Great Basin along US 50, famed as "the loneliest road," I'd seen these cables, blue and green and maybe two inches in diameter, sticking out of the ground on the outskirts of Ely, as if despairing at the prospect of the Great Salt Lake desert stretching ahead.
So we can run fiber optic cable through the Western deserts but not put enough money in the hands of 81-year-olds so they don't have to pull all-night shifts clerking in motels. What else is new?
People who drive or lecture their way through the American interior usually notice the same thing, which is that you can have rational conversations with people about the Middle East, about George W. Bush and other topics certain to arouse unreasoning passion among sophisticates on either coast. Robert Fisk describes exactly this experience in a recent piece for The Independent, for which he works as a renowned reporter and commentator on mostly Middle Eastern affairs.
Fisk claims on the basis of a sympathetic hearing for his analysis--unsparing of Sharon's current rampages--on campuses in Iowa and elsewhere in the Midwest that things are now changing in Middle America. After twenty-five years of zigzagging my way across the states I can't say I agree. It's always been like that, and even though polls purport to establish that a high percentage of all Middle Americans claim to have had personal exchanges with Jesus and reckon George W. to be the reincarnation of Abe Lincoln, the reality is otherwise. Twenty years ago Fisk would have met with lucid views in Iowa on the Palestinian question, plus objective assessments of the man billed at that time as Lincoln's reincarnation, Ronald Reagan.
Some attitudes do change. White people are more afraid of cops than they used to be. A good old boy in South Carolina I've bought classic cars from for a quarter of a century was a proud special constable back in the early eighties. These days if a police cruiser passes him on the highway, he'll turn off at the next intersection and take another road. Reason: A few years ago a couple of state cops stopped him late at night, frisked him, accused him of being drunk. This profoundly religious Baptist told them truthfully he'd never consumed alcohol in his life. Then they said he must be a drug dealer. He reckons the only reason they didn't plant some cocaine in his car was that he told them to check him out with the local police chief, an old friend.
I know from the stats that a lot of Americans are poor, so how come I'm often the only fellow on the road, or in town, in an old car aside from some of the Mexican fieldworkers in California for whom such cars are home? Most everyone seems to be in a late-model pickup or at least a nice new Honda Civic. I know, I know. The poor are out there, lots of them, but the whole place just doesn't seem to feel as poor as it often did in the early eighties recession. Then, day after day you could drive through towns that felt like graveyards, with no prospect of fiber optic cable.
Take Grants on I-40 in New Mexico, west of Albuquerque, which became the nation's self-proclaimed "uranium capital" in the fifties after Paddy Martinez heard descriptions of what uranium ore looked like and led the mining prospectors to the yellow rocks he'd been looking at down the years. The mines closed, and I recall from the early 1980s Grants looking sadly becalmed, with its Uranium Café and souvenir motels from the great days of Route 66. The audio in the Mining Museum still speaks plaintively about radiation's bad rep, despite the fact that in modest amounts it's good for you and there was much more of it around when the world was young.
Well, 66 nostalgia is still strong in Grants, but aside from the Lee Ranch coal mine the juice in Grants's economy now comes in large part from three prisons--one fed, one state and one private.
No wonder people are nervous of cops. There are so many prisons for the cops to send you to. So many roads where a sign suddenly comes into view, advertising correctional facility and warning against hitchhikers. I was driving through Lake Valley in eastern Nevada along US 93, with Mount Wheeler looking to the east. Listening to the radio and Powell's grotesque meanderings I was thinking, Why not just relocate the whole West Bank to this bit of Nevada where the Palestinians could have their state at last, financed by a modest tax on the gambling industry? The spaces are so vast you wouldn't even need a fence. Then reality returned in the form of the usual sign heralding a prison round the next bend.
West along US 50 from Austin I came to Grimes Point, site of fine petroglyphs. A sign informed me that "The act of making a petroglyph was a ritual performed by a group leader. Evidence suggests that there existed a powerful taboo against doodling." The graffiti problem. Some things never change. On the other hand, some things do. Many thousands of years ago those rocks were on the edge of a vast sea, maybe 700 feet deep. The petroglyph ridge was once beachfront property. The world was warmer then, and we're heading that way once more, from natural causes. To leave you on an upbeat note: At least natural radiation is on the wane.
More than the much-reviled products of Big Tobacco, big helpings and Big Food constitute the number-one threat to America's children, especially when the fare is helpings of fats, sugars and salt. Yet the nation so concerned about protecting kids from nefarious images on library computers also
allows its schools to bombard them with food and snack ads on Channel One and to sign exclusive deals with purveyors of habit-forming, tooth-rotting, waist-swelling soft drinks.
Foreigners who arrive in the United States often remark on the national obsessions about food and money. It is perhaps not surprising that a gluttonous mammon would rule the federal regulators of our food chain, but Marion Nestle, professor of nutrition at New York University, confesses that she has heard few of her nutritionist colleagues discuss the cardinal point: "Food companies will make and market any product that sells, regardless of its nutritional value or its effect on health."
Nestle goes on to demonstrate that not only do food companies use traditional corporate checkbook clout with Congress to insure their unfettered right to make money; they also co-opt much of the scientific and nutritional establishment to aid in their efforts. For example, the omnipresent "milk mustache" advertisements often show blacks and Asians--precisely those who are most likely to be lactose-intolerant. But then "science" rides to the rescue: There are a lot more research dollars shunted to those arguing that lactose intolerance is not a problem than there are for those who think otherwise. In fact, the Physicians' Committee for Responsible Medicine sued to annul the federal dietary guidelines, which recommended two to three servings of milk products daily; six of the eleven people on the voting committee had received research grants, lectureships or other support from the food industry.
Here Nestle wobbles a little in her argument, however. She waves the standard of science on behalf of the Food and Drug Administration when it comes to food supplements and herbal medicines, but devalues the "science" as well by revealing the conflicts of interest among researchers and regulators. Science is often up for sale. Researchers go to the food corporations for the same reason that bandits rob banks: That's where the money is, not least since the FDA's own research funding is controlled by Congressional committees in charge of agriculture, whose primary aim is hardly to promote dieting--it is the force-feeding of agribusiness with federal funds. Indeed, Nestle concedes, "USDA officials believe that really encouraging people to follow dietary guidelines would be so expensive and disruptive to the agricultural economy as to create impossible political barriers."
The dietary guidelines Nestle is referring to were monumentalized in the famous "food pyramid" familiar to every primary school student. But the pharaohs finished theirs in less time than it took the FDA to pilot its version past the army of lobbyists who resented the hierarchical implication that some foods were healthier than others. As a whistleblower on the FDA advisory committee that was drawing up the guidelines, Nestle is well qualified to recount the obstacles it faced.
In fact, many people did become more health-conscious as a result of such guidelines, but as Nation readers know, practice does not always match theory. Much of Food Politics reveals how the food industry has seized upon the marketing possibilities of consumers' safety concerns and perverted them by adding supplements to junk foods and then making health claims for the products.
Food is an elemental subject, on a par with sex and religion for the strength of people's beliefs about it. Otherwise rational people have no difficulty believing the impossible during breakfast, where their stomachs are concerned. Big Food relies on that snake-oil factor, the scientific illiteracy of most consumers. For example, marketers are happy with the advice to eat less saturated fat, since most buyers won't recognize it when it's drizzled across their salad. But advice to eat less of anything recognizable stirs up serious political opposition.
Federal dietary guidelines recommending that we "eat less" were thinned down to suggesting that we "avoid too much," which metabolized into "choose a diet low in..." And so on. For example, Nestle relates how in 1977 the National Cattlemen's Association jumped on Bob Dole's compromise wording on reducing red meat in the diet and increasing lean meat consumption: "Decrease is a bad word, Senator," the cattlemen warned him. The cowboys effectively corralled the McGovern committee on dietary guidelines: "Decrease consumption of meat" was fattened into "choose meats, poultry and fish which will reduce saturated fat intake."
Sometimes, the greater a product's potential for harm, the more likely its positive--or putative--health benefits will be touted. We get vitamin-supplemented Gummi Bears and, what provokes Nestle's justifiable ire most, Froot Loops. This marshmallow-blasted "cereal...contains no fruit and no fiber" and "53% of the calories come from added sugar," she inveighs. The perfect breakfast complement to a twenty-ounce bottle of cola that will be downed in school? Such pseudo-foods occupy the very top of the food pyramid, which characterizes them as to be "used sparingly"--or rather, used only if you have good dental insurance.
As Nestle points out, health warnings on alcohol and tobacco have done little to stop consumers. But picture a tobacco company allowed to sell cigarettes as "healthier" or "with added vitamins." (Indeed, she details a campaign by the alcohol companies to get Congress to allow them to market their products as healthy elixirs until Strom Thurmond's religious principles outweighed his conservatism enough for him to help shoot down the proposal.)
I was mildly surprised that Nestle does not comment on the imprecise use of "serving" information on food packaging. As a longtime student of labels, I find that the unhealthiest foods seem to have incredibly small "servings" compared with what consumers actually eat or drink. For the USDA, one slice of white bread or one ounce of breakfast cereal is a "serving" of grain, and nutritional data such as caloric content are rendered "per serving." A cinema-size actual serving of soda may contain 800 calories in sugar, before you get down to the buttered popcorn, not to mention the Big Mac before or after.
Food marketers are hardly breaking people's arms to persuade them to eat this stuff, of course. It is, after all, a great American principle that you can have your cake, eat it and slim down at the same time. What Nestle calls "techno-foods"--those labeled "healthier," "less fat," "lite," "more fiber"--pander to the health consciousness of a generation that will do anything to lose weight and live longer, except eat less.
The ultimate example of food marketing has to be Olestra, the cooking fat that passes through the gut undigested. Its maker, Procter & Gamble, has spent up to $500 million on it and taken up twenty-seven years of the FDA's time getting various approvals, while it kept trying to remove the mandated health warning that the product could cause cramping, loose stools and block the absorption of fat-soluble vitamins. P&G should count its blessings. A Center for Science in the Public Interest report says: "Olestra sometimes causes underwear staining. That phenomenon may be caused most commonly by greasy, hard-to-wipe-off fecal matter, but occasionally also from anal leakage (leakage of liquid Olestra through the anal sphincter)."
By 1998 Procter & Gamble disingenuously claimed that 250 tons, or four railcarfuls, of fat had not been added to American waistlines. No one claimed it had--what the company meant was that was how much Olestra had been used to fry chips. The public expectations were quite high, though; Nestle says that "people also were disappointed that the chips did not help them lose weight." Indeed, she reports that some ended up with more calories from eating Olestra-fried chips than they would have from other kinds, because they consumed a higher volume, convinced that they were calorie-free, though of course they were not.
But given the amount of money involved and the way food-industry/scientific-community connections are structured, "it is virtually impossible for any nutritionist interested in the benefits and risks of Olestra to avoid some sort of financial relationship with P&G unless one systematically refuses all speaking invitations, travel reimbursements, honoraria and meals from outside parties," Nestle observes.
In yet another case of Big Food getting its way, Nestle chronicles how the State Department came to declare that signing the World Health Organization/UNICEF international code on marketing of baby formula would flout the Constitution. "Inasmuch as this explanation strains credulity," Nestle suggests, the real reason was lobbying by US formula companies. The formula makers are fighting a war of attrition against mother's milk, in other words, not just here but internationally.
A more recent case involves the coalition that forced the FDA to allow claims of benefits from untested herbal supplements. I wish Nestle had gone into more detail about the sociology of this mélange of New Age alternative-medicine users, libertarian types and those who mistrust the medical establishment. Groups like Citizens for Health and the Corporate Alliance for Integrative Medicine rallied behind the rapidly growing corporations to ram a suppository up the FDA and its power to control sales of what on occasion have proven to be fatally flawed "alternative" remedies for everything from impotence to Alzheimer's. As she quotes an FDA commissioner, "[We] are literally back at the turn of the century, when snake oil salesmen made claims for their products that could not be substantiated." She reports claims that 12 percent of users of herbal medicines, or about 12 million people, suffer from some kind of adverse effect.
People may feel better when they take supplements, she acknowledges, but should health officials use "feelings" as a basis for regulatory measures? Or should the FDA instead "take the lead in reenergizing a crucial phase of its basic mission to promote honest, rational scientific medicine by vigorously combating its opposite"?
Many people may want to know what "science" is. Is it corporate-sponsored research, or the AMA defending its professional turf with the same vigor with which it has traditionally fought "socialized medicine"? Nestle shows how the American Academy of Pediatrics tried to insure that highly profitable baby formula flowed through its hands and rallied against direct sales to mothers. Was that concern for the "client" or concern for professional prerogatives?
Perhaps Nestle should have been more polemical. The food supplement row raised the question of "whether irreparable damage has been done to the ability of our federal regulatory system to ensure the safety of foods and supplements and to balance public health interests against the economic interests of corporations," she writes. But her own reporting suggests that the barbarians are already inside the gates and forcing their wares on the gullible.
Nestle sees no magic bullet to retrieve the situation. She wants "some federal system to guarantee that all those products on the shelves are safe and effective," and she asks, "Shouldn't there be some regulatory framework to control patently absurd or misleading claims?" To answer that in the affirmative is not necessarily the same as agreeing that the FDA is the best agency, certainly in its present form, nor that the AMA and similar organizations are in the corner of good science. The FDA's record does not inspire confidence, which is one of the reasons the herbalists' revolt was so successful in Congress. The agency's arrogance often matches its ignorance. While reading this book I went to a small British-owned cholesterol shop in Manhattan (pork pies, etc.). Its owner can't import kippers because the FDA does not recognize them as food. His first shipment of a brand of British Band-Aids was held on suspicion of being a soup, and when that confusion was finally cleared up, the FDA demanded of him a medical-goods import license.
I would like to hear more about how the FDA could be made more responsive and more efficient. It seems that in their present form, the regulatory bodies need some means of democratic oversight to check bureaucracy and to weigh problems of undue influence from the producing industries. Nestle details problems we've come to see elsewhere: the revolving door between civil servants, Congressional staff and industry. She also suggests rules--"a higher and stronger 'firewall'" between regulatory agencies and industry to inhibit the easy career glide from poaching to gamekeeping and back again--and she is entirely correct that the last bodies that should be overseeing FDA funding are the Congressional agriculture committees, which are dedicated to the prosperity of agribusiness.
Otherwise, Nestle's wish list ranges from sensible to Mission Impossible: tighter labeling rules so people can see exactly what they are consuming. A ban on advertising of junk foods in schools, especially candies and soft drinks with high sugar content. Sumptuary taxes on soft drinks as well--sure to be opposed bitterly by the lobbyists. If alcohol and tobacco advertisements cannot be allowed on children's TV, why allow advertising of foods that promote obesity and future health ills on a par with them?
At first glance, Nestle's call for an ethical standard for food choices for nutritionists and the industry seems highly idealistic; but ten years ago, who would have foreseen Philip Morris's berating of state governments for not spending their tobacco settlement money on the pledged anti-child-smoking campaigns? Already, more and more scientific journals are demanding disclosure of conflicts of interest for papers submitted.
Nestle does not touch the subject directly, but who knows, maybe campaign finance reform really will cut indirectly the pork in the political diet and the crap in the school lunches. However, it will be a hard push. Educating the public is a start, and Food Politics is an excellent introduction to how decisions are made in Washington--and their effects on consumers. Let's hope people take more notice of it than they do of the dietary guidelines.
The third-year medical student held the intravenous catheter, poised to insert it into a patient's vein. Suddenly the patient asked, "Have you done this before?" As the student later recounted to me, a long period of silence fell upon the room. Finally, the student's supervising resident, who was also present, said, "Don't worry. If she misses, I'll do it." Apparently satisfied, the patient let the student proceed.
Breaking this type of uncomfortable silence is the goal of Complications: A Surgeon's Notes on an Imperfect Science by Atul Gawande, a surgical resident and a columnist on medicine for The New Yorker. As Gawande's collection of stories reveals, fallibility, mystery and uncertainty pervade modern medicine. Such issues, Gawande believes, should be discussed openly rather than behind the closed doors of hospital conference rooms.
Complications is surely well timed. In 2000, the Institute of Medicine published "To Err Is Human," a highly charged report claiming that as many as 98,000 Americans die annually as a result of medical mistakes. In the wake of this study, research into the problem of medical error has exploded and politicians, including then-President Bill Clinton, have proposed possible solutions. The message was clear: The silence that has too long characterized medical mistakes is no longer acceptable. Yet while Gawande's book provides great insights into the problem of medical error, it also demonstrates how there can be no quick fix.
What may be most remarkable about the recent obsession with medical error is just how old the problem is. For decades, sociologists have conducted studies on hospital wards, perceptively noting the pervasiveness of errors and the strategies of the medical profession for dealing with them. As sociologist Charles Bosk has shown, doctors have largely policed themselves, deciding what transgressions are significant and how those responsible should be reprimanded. Within the profession, then, there is much discussion. Yet the public was rarely told about operations that went wrong or medications that were given in error. Residents joining the medical fraternity quickly learned to keep quiet.
Indeed, when one of those young physicians decided to go public, he used a pseudonym, "Doctor X." In Intern, published in 1965, the author presented a diary of his internship year, replete with overworked residents, arrogant senior physicians and not a few medical errors. In one instance, a surgeon mistakenly tied off a woman's artery instead of her vein, leading to gangrene and eventual amputation of her leg. Doctor X pondered informing the woman about the error, wondering "just exactly where medical ethics come into a picture like this." But his colleagues convinced him to remain quiet.
One whistleblower willing to use his own name and that of his hospital, New York's Bellevue, was William Nolen. In The Making of a Surgeon, published in 1970, surgeons swagger around the hospital, making derisive comments about patients and flirting relentlessly with nurses. (Not the least of reasons for being nice to nurses was the expectation that they would help cover up young doctors' mistakes.) Interestingly, Nolen was subsequently excoriated both by surgeons, who believed he had betrayed the profession's secrets, and by the lay public, who felt he was celebrating the "callousness and prejudice" of surgeons toward vulnerable patients.
Perhaps the peak of this genre of scandalous tell-all accounts occurred in 1978, with the publication of The House of God, written by the pseudonymous Samuel Shem. Although fictional, the book draws on the author's raucous and racy experiences as a medical intern at Boston's Beth Israel Hospital. To Shem, medicine's whole approach to patient care was misguided. The book's hero, the Fat Man, teaches his trainees a vital lesson: "The delivery of medical care is to do as much nothing as possible."
Today it has become more fashionable than rebellious for physicians to describe the trials and tribulations of their training. Dozens of doctors (and some nurses) have published such accounts. Gawande is a prime example of this more mainstream type of physician-author. Even though he describes very disturbing events in his articles for The New Yorker (some of which have been reprinted in Complications), he uses his real name and that of his institution: Boston's Brigham and Women's Hospital.
Gawande, however, has taken the art of physician narrative to a new level. He is a deft writer, telling compelling stories that weave together medical events, his personal feelings and answers to questions that readers are surely pondering. Most important, Gawande paints with a decidedly gray brush. There are few heroes or villains in Complications, just folks doing their jobs. Although some readers, perhaps those who have felt victimized by the medical system, may find Gawande's explanations too exculpatory of doctors, he has documented well the uncertainties and ambiguities that characterize medical practice.
Take, for example, his chapter "When Doctors Make Mistakes." With great flair, Gawande describes a case in which he mistakenly did not call for help when treating a woman severely injured in a car accident. Although Gawande could not successfully place a breathing tube in her lungs, he stubbornly kept trying rather than paging an available senior colleague. Eventually, Gawande clumsily attempted an emergency procedure with which he had little experience: cutting a hole in her windpipe and trying to breathe for her. It was only through good fortune that the patient did not die or wind up with brain damage. An anesthesiologist, called in very late in the game, managed to sneak a child-size breathing tube into her windpipe, enabling the patient to obtain adequate oxygen.
With typical candor, Gawande lists the possible reasons that he did the wrong thing: "hubris, inattention, wishful thinking, hesitation, or the uncertainty of the moment." All doctors, he is arguing, experience these very human feelings as they tend to their craft. The fact that lives are at stake may make physicians--as compared with other professionals--even more prone to such emotions.
Gawande also details how the surgery department addressed his error. The case was presented at the weekly morbidity and mortality (M & M) conference, where physicians discuss deaths and other bad outcomes. "The successful M & M presentation," Gawande perceptively notes, "inevitably involves a certain elision of detail and a lot of passive verbs." This clearly occurred during the discussion of Gawande's case, where, remarkably, no one ever asked him why he did not call for help sooner. Rather, his blunder was later addressed through another ritual, a private discussion between Gawande and the senior attending he had not called. Games with language and secret conversations: These are the reasons Gawande has written his book.
In another chapter, Gawande provides a more provocative explanation for the type of mistake he made. Gaffes, he argues in "Education of a Knife," are part of how surgeons--and other physicians--must learn their craft. (After all, physicians don't perform medicine, they practice it.) In an anecdote resembling that of my third-year student, Gawande describes how he routinely caused complications when learning to place dangerous central-line catheters into the necks of seriously ill patients. Expertise, he explains, does not just happen. Physicians in training must victimize a certain percentage of patients to acquire the skills they will need to become competent doctors. Should we consider these events to be mistakes or business as usual? Deciding how to define a medical error is not the least problem.
In such learning situations, the necessary experience is best attained by keeping quiet. Using the "physician's dodge," patients are told "You need a central line" but not "I am still learning to do this." One ramification of this type of learning, Gawande notes, is the victimization of poor, less educated patients, who are often incapable of questioning doctors. Medicine's inclination to learn on "the humblest of patients" becomes especially apparent with Gawande's candid admission that he himself chose a more senior physician--rather than a more attentive cardiology fellow--to care for his son's heart problem.
Mistakes may be made not only by physicians but by patients. In the chapter "Whose Body Is It, Anyway?" Gawande asks what physicians should do when patients seem to make bad decisions. One especially compelling story, which I often use to teach medical students, involves a man who absolutely refused to go on a breathing machine after experiencing a complication of gall bladder surgery. Although the doctors explained that artificial ventilation would only be temporary and would likely save his life, the patient continued to object.
When the man passed out due to lack of oxygen, Gawande was faced with a devastating quandary. Does he abide by the man's wishes, which is what doctors are supposed to do, or immediately put him on the ventilator? Gawande chose the latter. I love to ask students what they think the man said when, a few days later, Gawande triumphantly took him off the machine. Invariably, half of the students predict that the man said, "Call my lawyer." But the other half, who guess that he said "Thank you," are correct. Gawande had surely averted a mistake in this case, but he was left without clear guideposts for approaching similar cases in the future.
Complications is filled with other stories demonstrating the capriciousness of medicine. For example, Gawande once detected a case of the rare, often fatal infection necrotizing fasciitis (flesh-eating bacteria) because he happened to have seen a case a few weeks before. He ultimately saved the patient's life, not through hard, scientific evidence but through a gut feeling and a willingness to submit a patient to possibly unnecessary surgery. "Medicine's ground state," he concludes, "is uncertainty." Other chapters examine why the medical profession so often hides the mistakes of impaired physicians, and the questionable use of an operation to help morbidly obese patients lose weight.
In the wake of the Institute of Medicine report, experts have proposed numerous remedies for the problem of error. Most attention has focused on a "systems approach," which would produce a "culture of safety" similar to that of the airline industry. In such a scheme, sophisticated computerized systems would be put in place to detect impending errors, such as wrong medication doses, sloppily written prescriptions and dangerous drug interactions. This emphasis aims to revamp the current approach to medical error, which encourages finger-pointing and malpractice lawsuits.
Gawande's book demonstrates both the advantages and limits of such a systems model. On the one hand, by discouraging the stigmatization of medical mistakes, physicians may be more willing to reveal their own errors and those of their peers. The notion that the case of the obstructed airway could be discussed in an open and nonjudgmental environment, rather than couched in secrecy, is altogether welcome.
On the other hand, there is a reason decades of exposés like Complications have not led to significant change. Defining errors and ascertaining their causes is a tricky business.
So is dealing with the issue of blame. Gawande is willing to admit that he screwed up when he did not call for immediate help for his deteriorating trauma patient. "Good doctoring is all about making the most of the hand you're dealt," he writes, "and I failed to do so." But many physicians remain reluctant to come quite so clean.
A friend and I were sitting around commiserating about the things that get to us: unloading small indignities, comparing thorns. "So there I was," she said, "sitting on the bus and this man across the aisle starts waving a copy of law professor Randall Kennedy's new book Nigger. He's got this mean-looking face with little raisiny eyes, and a pointy head, and he's taking this book in and out of his backpack. He's not reading it, mind you. He's just flashing it at black people."
"Don't be so touchy," I responded. "Professor Kennedy says that the N-word is just another word for 'pal' these days. So your guy was probably one of those muted souls you hear about on Fox cable, one of the ones who's been totally silenced by too much political correctness. I'd assume he was just trying to sign 'Have a nice day.'"
"Maybe so," she said, digging through her purse and pulling out a copy of Michael Moore's bestselling Stupid White Men. "But if I see him again, I'm armed with a 'nice day' of my own."
"That's not nice," I tell her. "Besides, I've decided to get in on the publishing boom myself. My next book will be called Penis. I had been going to title it Civil Claims That Shaped the Evidentiary History of Primogeniture: Paternity and Inheritance Rights in Anglo-American Jurisprudence, 1883-1956, but somehow Penis seems so much more concise. We lawyers love concision."
She raised one eyebrow. "And the mere fact that hordes of sweaty-palmed adolescents might line up to sneak home a copy, or that Howard Stern would pant over it all the way to the top of the bestseller list, or that college kids would make it the one book they take on spring break----"
"...is the last thing on my mind," I assured her. "Really, I'm just trying to engage in a scholarly debate about some of the more nuanced aspects of statutory interpretation under Rule 861, subsection (c), paragraph 2... And besides, now that South Park has made the word so much a part of popular culture, I fail to see what all the fuss is about. When I hear young people singing lyrics that use the P-word, I just hum along. After all, there are no bad words, just ungood hermeneutics."
"No wonder Oprah canceled her book club," she muttered.
Seriously. We do seem to have entered a weird season in which the exercise of First Amendment rights has become a kind of XXX-treme Sport, with people taking the concept of free speech for an Olympic workout, as though to build up that constitutional muscle. People speak not just freely but wantonly, thoughtlessly, mainlined from their hormones. We live in a minefield of scorched-earth, who-me-a-diplomat?, let's-see-if-this-hurts words. As my young son twirls the radio dial in search of whatever pop music his friends are listening to, it is less the lyrics that alarm me than the disc jockeys, all of whom speak as though they were crashing cars. It makes me very grateful to have been part of the "love generation," because for today's youth, the spoken word seems governed by people from whom sticks and stones had to be wrested when they were children--truly unpleasant people who've spent years perfecting their remaining weapon: the words that can supposedly never hurt you.
The flight from the imagined horrors of political correctness seems to have overtaken common sense. Or is it possible that we have come perilously close to a state where hate speech is the common sense? In a bar in Dorchester, Massachusetts, recently, a black man was surrounded by a group of white patrons and taunted with a series of escalatingly hostile racial epithets. The bartender refused to intervene despite being begged to do something by a white friend of the man. The taunting continued until the black man tried to leave, whereupon the crowd followed him outside and beat him severely. In Los Angeles, the head of the police commission publicly called Congresswoman Maxine Waters a "bitch"--to the glee of Log Cabin Republicans, who published an editorial gloating about how good it felt to hear him say that. And in San Jose, California, a judge allowed a white high school student to escape punishment after the student, angry at an African-American teacher who had suspended his best friend, scrawled "Thanks, Nigga" on a school wall. The judge was swayed by an argument that "nigga" is not the same as "nigger" but rather an inoffensive rap music term of endearment common among soul brothers.
Frankly, if Harvard president Lawrence Summers is going to be calling professors to account for generating controversy not befitting that venerable institution, the disingenuous Professor Kennedy would be my first choice. Kennedy's argument that the word "nigger" has lost its sting because black entertainers like Eddie Murphy have popularized it either dehistoricizes the word to a boneheaded extent or ignores the basic capaciousness of all language. The dictionary is filled with words that have multiple meanings, depending on context. "Obsession" is "the perfume," but it can also be the basis for a harassment suit. Nigger, The Book, is an appeal to pure sensation. It's fine to recognize that ironic reversals of meaning are invaluable survival tools. But what's selling this book is not the hail-fellow-well-met banality of "nigger" but rather the ongoing liveliness of its negativity: It hits in the gut, catches the eye, knots the stomach, jerks the knee, grabs the arm. Kennedy milks this phenomenon only to ask with an entirely straight face: "So what's the big deal?"
The New Yorker recently featured a cartoon by Art Spiegelman that captures my concern: A young skinhead furtively spray-paints a swastika on a wall. In the last panel, someone has put the wall up in a museum and the skinhead is shown sipping champagne with glittery fashionistas and art critics. I do not doubt that hateful or shocking speech can be "mainstreamed" through overuse; I am alarmed that we want to. But my greater concern is whether this gratuitous nonsense should be the most visible test of political speech in an era when government officials tell us to watch our words--even words spoken in confidence to one's lawyer--and leave us to sort out precisely what that means.