
I am writing this review in the midst of a Chicago heat wave, almost
exactly seven years after the heat disaster that killed nearly 800
people in the city. The Chicago Tribune's multicolored weather
page adorns the forecast with a special "excessive heat watch"
symbol--an exclamation point lodged in a red circle--newscasters
earnestly tell us to stay inside and take it easy, and veteran black
radio deejay Herb Kent, the Kool Gent, chats on-air about liquor and
caffeinated drinks being dehydrating and the need to drink lots of "good
old H2O."

I remember the 1995 disaster well, but for me personally it was a period
of intensive work on my last book, cooped up indoors 24/7, with roaring
air-conditioning, punctuated by horrified reading of the
Tribune's coverage of rolling city power outages and the growing
spectacle of hundreds of heat-related deaths, with the bodies piling up
and overwhelming the city morgue's capacity. Suspicious of the
Tribune because of its long history of rightist and racist
slants, I scrutinized the stories to see if the city was, as usual,
shortchanging its black South and West sides on services, but couldn't
figure anything out. Sociologist Eric Klinenberg, a young Chicago
native, was out of the country during the disaster, but has since then
more than made up for lost time. His Heat Wave is a trenchant,
multilayered and well-written social autopsy of the disaster.

Since finishing Heat Wave, I've been obsessively asking friends,
neighbors, students and colleagues if they were in town in July 1995,
and if so, what they remember. Most of my middle-class interlocutors
were as insulated as I was, in cooled rooms, and remember the period
only vaguely, through the media coverage. But many younger people, who were
then living on student or first-job budgets, told tales of extreme
misery and multiple palliative strategies--double bills at
air-conditioned theaters, plunging into Lake Michigan every possible
nonworking hour, bunking with better-off friends and relatives, long
drives in cars with AC and, of course, all the old tricks with cold
water, towels and fans. One conservative young woman, lying sweaty and
wretched in her sweltering apartment and listening to neighbors' AC
compressors turning on, described her sudden comprehension of the
ressentiment and violence of some inner-city dwellers.

In fact, Klinenberg explains, aside from some vigilante actions against
city workers sent to reseal the 3,000 open fire hydrants liberated by
kids, poor Chicagoans were far too enervated by the hot, wet blanket
enveloping the city to commit mayhem. The real criminals of the heat
crisis, Klinenberg makes clear, were the federal, state and local
officials who, in the words of Robert Scates, the bitter black
thirty-year veteran director of emergency medical services, committed
"murder by public policy."

But first we need to come to terms with the epidemiological realities of
heat crises. Extreme heat, Klinenberg explains, tends not to be taken as
seriously as other weather and human disasters--hurricanes, floods,
earthquakes, blizzards, plane crashes. But "more people die in heat
waves than in all other extreme events combined," and the '95
crisis has "no equal in the record of US heat disasters." Because the
body's defenses "can take only about forty-eight hours of uninterrupted
exposure to such heat before they break down," Klinenberg observes, area
ambulance services and emergency rooms were soon overwhelmed, and at the
height of the catastrophe, half of Chicago's hospitals went on bypass
status--turned all new patients away. Most Chicagoans saw the grisly
televised scenes of emergency workers falling prostrate with heatstroke,
of police cars backed up clear around the block, waiting to deliver
cadavers to nine forty-eight-foot refrigerated trucks donated by a local
meatpacking firm when the morgue ran entirely out of body-storage space,
and heard and read about the record-breaking murderousness of the
disaster. But Klinenberg notes that only months after the catastrophe,
Chicagoans reacted to his queries with "detachment and disavowal." Not
only did they, and the press whose interpretations they were reflecting,
wish to relegate the disaster to a nonhappening but many, following
Mayor Richard Daley's lead, asserted that the death figures weren't
"really real," that "the massive mortality figures...had somehow been
fabricated, or that the deaths were simply not related to the heat."

Klinenberg took on the task of explicating what's "really real" with
extraordinary energy. He burrowed into public health and press
documents, did street-level fieldwork and police ride-alongs in poor
neighborhoods, interviewed every possible city, state and private agency
official, and many low-level service workers, and thoroughly engaged
local journalists on their hour-by-hour decision-making on the framing
and coverage of the breaking story. In domain after domain, across
institutions, he smashes home his key finding: "The geography of
vulnerability during the heat wave was hauntingly similar to the
everyday ecology of inequality." Heat disasters in general resonate less
with the general public because, unlike other sorts of disasters, they
leave property untouched and mostly affect the poor, the frail, the
nonwhite--whoever can't afford air-conditioning! The Chicago dead were
indeed largely the isolated, elderly and disproportionately black poor,
and the city rapidly turned its back on them.

But the everyday ecology of inequality is not a timeless phenomenon, and
Chicago is not Everycity. By the mid-1990s, the US economy had recovered
from the Reagan-Bush recession, the market was booming, urban street
crime was dropping and American media were hyping an urban renaissance.
Mayor Daley capitalized on these national trends with an ambitious
program of urban beautification and a massive public relations campaign,
suburbanites moved back downtown and tourism revived dramatically.
(Klinenberg doesn't mention the role of the 1990s spike in international
migration to Chicago, which brought much-needed quality and variety to
local restaurant fare, added exotic cuteness to tourist attractions and
provided a vast underpaid labor force for booming restaurants, hotels
and offices.) During the heat wave, the Daley administration was
particularly engaged in "gloss[ing] its image in preparation for the
Democratic National Convention of 1996"--felt as a crucial task, given
the debacle of the 1968 DNC event, when Daley's father was mayor, with
its globally reproduced images of Chicago's finest beating the shit out
of middle-class white kids and not a few journalists and Democratic
politicians. So it comes as little surprise that Daley viewed the heat
wave deaths primarily as "a potential public relations disaster," and
Chicago-watchers will not be too surprised to read that the city
administration both actively hindered appropriate relief efforts and put
most of its energy into an attempt to "spin its way out of the crisis."

God is in the details, though, and Klinenberg painstakingly lays out for
us both the structural and more proximate policies that led to the
disastrous Chicago mortality figures of July 1995. Most crucial is the
rise of neoliberalism, which Klinenberg rather oddly denominates
"reinvented government" and "the entrepreneurial state," in a narrow
sociological tradition, rather than connecting to abundant available
radical analyses of the phenomenon worldwide. No matter, he names the
key shifts: the state's growing divestment of social service
responsibilities; the outsourcing and simultaneous downsizing of the
remaining functions; the overarching capitalist managerial model of
lean, mean efficiency; and the new model of citizens as "active
consumers" of public goods, and too damned bad if they lack the
knowledge, capacity or energy to do so.

In the case of the heat wave, the crucial noxious brew involved
neoliberal policies with regard to low-cost housing, consumer energy use
and social service personnel. Since Reagan, the federal government has
been cutting back support for low-cost housing, and the public housing
crisis in Chicago was so acute that local activists were unwilling to
draw attention to the many code violations in single room occupancy
(SRO) hotel units--more than 18,000 rooms had been lost already--for
fear that they would "only embolden the political officials and real
estate developers who would prefer to convert the units into market-rate
family housing." As a result, many frail elderly people literally cooked
to death in illegal multiply subdivided "cattle sheds for human beings."

As well, the traditional down-on-its-luck SRO population had been
swollen since the 1970s with the mentally ill dumped onto urban housing
markets with the closure of government-operated asylums. Fragile
community connections were severed as SRO residents, afraid of the
"crazy folk," retreated from common spaces into their tiny rooms, making
it ever more likely that those sinking with heatstroke would fail to be
discovered until it was too late. In public housing, the Chicago Housing
Authority provided no air-conditioning even in common rooms, and in a
perverse interpretation of the Americans with Disabilities Act, the CHA
dumped youthful drug addicts, without rehab services, into
well-established senior housing all over the city. Crime in the projects
predictably skyrocketed, and the collective caretaking bonds the residents
had built atrophied as the elderly retreated in terror into their
individual units; many lives were lost as a result.

Air-conditioning may be part of the overarching environmental crisis,
but it is a godsend in extreme heat, and for better or worse,
working-class and better-off Americans have organized their lives around
it in all parts of the country affected by high summer temperatures.
Inability to afford winter heating, much less summer air-conditioning,
is part of what Klinenberg labels the "everyday energy crisis" of the
poor. A 50 percent cutback in the federal low-income energy-assistance
program, combined with soaring utility rates, pinched the city of
Chicago so badly that it still closes down aid each year at the
beginning of the cold season, and provides no AC subsidies at all. The
poor elderly with whom Klinenberg visited were so fearful of excessive
energy bills that they even avoided using electric lights during the
day. In an extraordinary illustration of neoliberal cruelty, as the heat
wave deaths were still being counted, the US Senate initiated a vote to
end the energy program but settled on skimming off a mere hundred
million dollars. In the same session, Congress vastly expanded federal
support to insurance companies and homeowners who suffer property damage
due to disasters. The final fillip is the new "market model" utility
policy that punishes delinquent customers, even the desperately ill, by
cutting off not only electricity but water. Klinenberg notes
sardonically that this policy is simply not parallel to the money-making
efficiency of the car boot: "Water, unlike a car, is a resource that
people need to survive."

Chicago's specific demographic and spatial history greatly magnified the
final domain--social services--of murder by public policy. Klinenberg
demonstrates that the city, much to my surprise, has significantly
higher percentages than the American average both of single residents in
general and of elderly living alone. Of course, as he notes, living
alone and being without resources are two distinct states. But Chicago
lost 1 million people between 1950 and 1990, and for the elderly poor,
"aging in place" in neighborhoods devastated first by capital and then
by massive population flight--and then colonized by kids working in the
only industry left, drugs--is a recipe for dangerous isolation. Add
state cutbacks and outsourcing, and you have private agencies on
insanely low budgets sending outrageously overworked service providers
out to elderly poor clients no more than once a year--and even then, in
fear of the druggies, confining their visits to the early mornings.

North Lawndale is one such "bombed out" neighborhood, and Klinenberg's
star turn is a rigorous ethnographic and historical comparison of that
West Side area with the contiguous Little Village. Both
neighborhoods were founded by Southeastern European immigrants and then
tipped minority in the postwar years, and both have similar poverty
levels and percentages of poor elderly--but North Lawndale had ten times
more heat wave deaths, proportionately, than its southern neighbor.
Scholars, politicians, social service people and even residents
themselves offered up "racial" explanations, as North Lawndale is black
while Little Village is Mexican: Latinos are used to hot weather, they
have close intergenerational families, they form tight communities, etc.
Klinenberg demolishes all these folk theories with hard facts and
careful logic (and not a little sarcasm--black Chicagoans with roots in
the Delta don't have close families and aren't used to hot weather?) and
forces us to consider variations in urban spatial ecology and their
consequences for city-dwellers' daily lives. After all, three of the Chicago
neighborhoods with the lowest per capita heat-wave death rates were
majority-black--but not "bombed out."

The key difference is human density. Little Village is both an
entrepôt for the vast Latino migration to Chicago and a safe haven
for Latinos gentrified out of other neighborhoods. As one resident said
of the neighborhood, "there is no such thing as an empty lot." High
populations maintain abundant local business, which in turn guarantees
lively street life and thus a safe and interesting public environment in
which the elderly can shop, exercise--and cool down in air-conditioned
stores during a heat wave. Even the "aging in place" whites left over
from Little Village's earlier incarnation fared well in the crisis.
Certainly Little Villagers have strong community bonds, especially
through the Catholic Church, but North Lawndale residents are organized
to a fare-thee-well too. Their church groups and block clubs, though,
simply cannot make up for abandoned buildings, empty lots and few
stores.

Klinenberg deals diligently but less successfully with three other
domains key to his story. He nails the Daley administration's
culpability in an hour-by-hour account of the unfolding disaster and
discusses the highly publicized failed snow removal that doomed the
1970s Bilandic administration, but he neglects to mention
African-American Harold Washington's brief but significant interim
mayoralty of the 1980s. Washington, after all, gained both national fame
and notoriety for trying to equalize city resources across rich and poor
neighborhoods, and that profoundly race-inflected inequality is the
fulcrum of Heat Wave's criticism of current city government. Some
of Klinenberg's heroes of the crisis, public health activist Quentin
Young and Sid Bild of Metro Seniors in Action, are actually white
veterans of the old Washington coalition. And we never really hear about
the Daley/developer deals that have stripped the city of affordable
housing, which are well documented in radical scholarship and
journalism. Similarly, Klinenberg does wonders with the sordid story of
the firefighter/paramedic feud--one reason for the city's belated response to the crisis--but doesn't really clue us in that racism is at the root of that
one too. Finally, he gives us terrific reporter's-eye insight into the
bureaucratic realities that determined the false coverage of the
breaking crisis at the Chicago Tribune, but never informs us of
the Trib's history of rightist ownership, the structures above
the heads of the city editors.

Klinenberg documents the local media's chastened post-'95
hyperresponsibility to advise the public on individual tactics to
mitigate heat danger, and lists the specific ongoing political
structures that will inevitably lead to more murder by public policy.
But he never quite adds these elements up to their sum total--the heat
disaster as an altogether predictable product of neoliberal capitalist
shift. Heat Wave connects the dots to tell us an important new
muckraking story but doesn't fully recognize the radical urban and
national political economy narrative already on the page.

Like Pop-Up Video--one of the many things the movie-industry left
never anticipated--ancillary factoids keep imposing themselves on Paul
Buhle and Dave Wagner's Radical Hollywood:

1. When the oft-dubbed "revolutionary" Lew Wasserman (longtime MCA mogul) died this past June 3, obit writers made the old archcapitalist
sound like he'd been the happy end of a Bolshevik dream--the man who
finally took the power away from the studios and gave it to the people
(OK, very rich, well-placed people).

2. Wasn't it Ronald Reagan--"FBI collaborator," the man deemed "too
dumb" for membership in Hollywood's CP of the 1930s and the star of the
blacklisted screenwriter Val Burton's last movie (Bedtime for
Bonzo)--who helped decontrol the studios' ownership of movie
theaters, i.e., the means of distribution?

3. Showing that memory is fleeting even among the most
progressive-minded people, the Stockholm International Film Festival of
1997 jumped the gun on the Academy Awards and hosted a retrospective of
work by friendly witness Elia Kazan--its organizers claiming, quite
convincingly, that they were completely unaware of the then-raging (sort
of) Kazan Kontroversy.

4. Showing that memory is as tenacious as the ego it's attached to,
Hollywood Ten member Ring Lardner Jr., honoree of the
screenwriter-centric Nantucket Film Festival of 1998, still had the
energy to rail against the system--although the preponderance of his
outrage was not over his HUAC-imposed prison time but the liberties
Joseph Mankiewicz and Louis B. Mayer had taken fifty-odd years before
with his script for Woman of the Year.

If there are unwritten messages within Radical Hollywood, one
might be that artistic vanity and general cupidity are neither exclusive
nor native to a particular political persuasion, nor even the movie
industry itself. And that nothing ever changes. Current cinephiles fear
and loathe the fact that in today's movie business, "business" takes
precedence over "movies." But by 1933, after the bankruptcies of Fox,
Paramount and RKO, the money men had already taken over. (As the authors
write, "Bankers were good at firing studio workers...but were notably
untalented at making films." Make it "lawyers" and it might be 2002.)
Back in 1919, Charles Chaplin, Douglas Fairbanks, D.W. Griffith and Mary
Pickford organized the first independent-of-the-studios Hollywood movie
company, United Artists--the DreamWorks of its time. Last year's
threatened strike by the Writers Guild--which, together with the strike
threat by the Screen Actors Guild, is still affecting studio production
schedules--was largely about credits, because they translate into
salaries; in 1933, meeting secretly, Hollywood's leading screenwriters
(including such leftist lights as John Howard Lawson, John Bright,
Samuel Ornitz and Lester Cole) gathered to organize, largely over the
issue of credits, and for the same reason. Variety, Hollywood
"bible" and noted mangler of the English language, played the game with
the mobbed-up craft union IATSE (International Alliance of Theatrical
Stage Employees) back in Depression-era Hollywood. It plays plenty of
games today.

And then (sigh) there's that oh-so-predictable outcry over pop cinema's
influence on/instigation of sociocriminal behavior--the knee-jerk
finger-pointing at Hollywood every time a Columbine happens (but never,
you may notice, a 9/11). This is hardly a newsflash either: The release
of such hard-nosed gangster thrillers as The Public Enemy,
Scarface and Little Caesar in the early 1930s helped lead
to the establishment of the Legion of Decency, the Production Code, the
Hays Office, the bluenosed rule of in-house censor Joseph Breen and
decades-long cultural prosperity for those who preferred their movie sex
infantilized and their view of America strained through fine mesh. How
the Christian right does long for those thrilling days of yesteryear.

The story of the left in Hollywood, in other words, is the story of
today in Hollywood; but if you're looking for correlations and parallels
you won't find many in Radical Hollywood. Not that parallels are
always what you need: As the blacklisted writer/director Abraham Polonsky (Force of Evil, Body and Soul,
Tell Them Willie Boy Is Here) told interviewer David Walsh a few
months before his death in 1999, "In the old days, if something like
this [the Kazan Oscar] was going on, you'd make a few telephone calls,
you'd have a thousand people there. No more. Nobody believes in
anything, except in the finance capitalist." Did anyone in the whole of
Hollywood--or the entire United States Congress, for that matter--make a
peep of support for the recent and quite reasonable California appellate
court decision on the Pledge of Allegiance? If they did, it was drowned
out by the sound of scuttling feet, heading for the political lifeboats.

This last episode was certainly too late for inclusion or comment in
Radical Hollywood, but it points up both the stasis and mutation
in what we have to recognize, however reluctantly, as the cultural
capital of the country--and whose history is far more alive than this
book would imply. Encyclopedic in the most frightening sense, RH
is thorough and wide-ranging, and fairly exhaustive in ferreting out
every possible leftist association in any vaguely relevant movie
produced by Hollywood from the New Deal through the postwar Red Scare.
But the authors are also straitjacketed by their own theses: One, that
there was a leftist subtext imposed on many of the movies that
the right held in fear and contempt. (Who knew?) And two, that the
movies were simply superior during the more or less lefty days of
Hollywood.

They may be right. "The content of films was better in 1943 than it is
in 1953," Hollywood Ten-ster Dalton Trumbo is quoted as saying, and the
authors contend that "any reasonable calculation" would confirm what
Trumbo says. But reasonable calculation has nothing to do with the very
subjective business of judging art. One might as well reduce the entire
argument to a single question: What do you prefer? Movies with the
left-leaning Humphrey Bogart? Or movies with Ronald Reagan? It may not
seem to be a contest. But it wouldn't be an example of the scientific
process, either.

Despite their tabloidy subtitle--"the untold story behind America's
favorite movies"--Buhle and Wagner don't dabble much in the anecdote,
gossip or movie-set story that would have lubricated their prose or
perhaps even parted their sea of subordinate clauses. Still, famous
names abound. "As FBI reports suggested," Lucille Ball, Katharine
Hepburn, Olivia de Havilland, Rita Hayworth, Humphrey Bogart, Danny
Kaye, Fredric March, Bette Davis, Lloyd Bridges, John Garfield, Anne
Revere, Larry Parks (The Jolson Story), the wives of March and
Gene Kelly, and Gregory Peck's fiancée--to say nothing of the
scores of writers Buhle and Wagner profile and analyze, or their more
loosely affiliated or merely sympathetic directors and stars--were all
in or close to the Communist Party. Why? For one thing, the authors say,
because these were the people of 1930s and '40s Los Angeles who were
smarter and consequently more liberal, and who enjoyed a more egalitarian
and humanistic worldview than their constipatedly conservative counterparts.
But it was, they point out, also a result of Hollywood's (and America's)
bigotry and its effect on social life: The comically titled West Side
Writing and Asthma Club, an ostensibly nonpolitical alternative for Jews
barred from Los Angeles's beach clubs and marginalized in the better
restaurants, became a hotbed of anti-Nazi sentiment (which, of course,
made it politically suspect). Eventually, through the Asthma Club, even
one of the world's leading, albeit largely apolitical, Marxists
(Groucho) could channel donations to the Popular Front.

That the Communist Party in Hollywood was largely a "social agency," as
the authors call it, was what helped make the McCarthy-era hearings and
HUAC roundups so wide-ranging and terrifying, even if, after the
Hitler-Stalin Pact, the LA branch of the party "had died...but simply
not known it," as the exiled Carl Foreman (High Noon) put it. How
such screenwriters, who are Buhle and Wagner's principal subjects,
maintained their political principles while clawing their way up the
studio ladders is something left amorphous. Lardner, ever aware of the
contradictions in being a high-priced proletarian, said in his
autobiography I'd Hate Myself in the Morning (his famous response
to J. Parnell Thomas about why he wouldn't name names) that he picketed
Warner Bros. when Mussolini's son came calling, and told David O.
Selznick not to make Gone With the Wind because it was pro-Klan.
But he was an artist, too, a hungry one, and a man who knew the siren
song of fame and fortune never quite harmonized with "The
Internationale."

The authors exhibit a weakness for locating leftist content and
associations where they need to and for shoehorning certain movies into
their theses (their view of Universal's horror catalogue as anti-Wall
Street seems particularly windy). But by the time Radical
Hollywood gets to the era of film noir--which they call "arguably
the only fully realized American 'art film' genre"--it feels as if the
rest of the book has been prologue. Clearly, the authors know and love
the period and what it did to American cinema in the aftermath of World
War II--countering the forced fairy tale of Hollywood with a new, frank,
sexually liberated, sexually sophisticated, sexually metaphorical take
on the dark view of postwar, postnuclear existence (although, strangely,
Radical Hollywood never analyzes noir via the A-bomb, despite the
celebrated apocalyptic imagery of such genre classics as Robert
Aldrich's Kiss Me Deadly). That noir also refashioned the
traditional portrayals of the sexes--at a time when, the authors point
out, the country's postwar recovery and strength were being
propagandized as dependent on the American male and his renewed sense of
self--made it one of the most important cultural developments of the
twentieth century, if not the nation's entire cultural history. No
wonder it fell victim to the strangling effects of creeping McCarthyism.

Radical Hollywood, whether or not it's "the untold story behind
America's favorite movies," certainly puts a new spin on those films,
especially for those already familiar with them--readers who,
unfortunately, will be those most distracted by the authors' rather
habitual way with the errant fact. Some are trivial: Edward G. Robinson
didn't say "Mother of God..." at the end of Little Caesar; he
said "Mother of Mercy," as any schoolchild knows (any schoolchild,
granted, with an unnatural obsession with movies). William Randolph
Hearst may have "attributed the 'subversive' label to anything that
smacked of egalitarian liberalism," but he didn't do it in the pages of
the Los Angeles Times, because he never owned the Los Angeles
Times. In assessing the populist perspective of Destry Rides
Again, Buhle and Wagner seem oblivious to the fact that James
Stewart's character is the son of the more famous Destry. The
famously Hungarian-born director Michael Curtiz (director of the
leftist-written Casablanca, among many others) is identified at
one point as a "German refugee." John Wayne's "first major screen role"
wasn't in 1938's Pals of the Saddle, but Raoul Walsh's 1930
The Big Trail. Warner Bros.' "self-serving prologue" at the
beginning of The Public Enemy may have been self-serving--it
mentions the social impact of the studio's own PE and Little
Caesar while omitting UA's Scarface--but it wasn't on the
original 1931 print; it was added for a re-release several years later.

Jean Renoir's The Southerner marked William Faulkner's "only
notable screenplay contribution"? How about The Big Sleep?
Mildred Pierce? And let's not forget To Have and Have Not,
in which he rewrote Hemingway, by all reports to their mutual delight.
And Katharine Hepburn didn't lose the "box-office poison" appellation
after Holiday but after The Philadelphia Story, whose film
rights she bought because she knew it would remake her career.

But let's imagine this litany of errors is itself a metaphor for the
intrinsic unreality of the left in Hollywood. It's a subject that Buhle
and Wagner have attacked with energy and all the right intentions; the
reader may wish that he or she were given a bit more reason to stick
with the book through its thicker moments, but there's no denying the
authors' enthusiasm, erudition and engaging way of summarizing plot
lines and associations. Still, it's a weird tale they're telling. As
they relate early on, Polonsky recounted in his later years that one of
the oft-discussed issues among the Hollywood left wing was what, in
fact, they were all doing there. Should they be in Hollywood, making pap
and trying to inject it with a social conscience? Or secede from the
union and create film art independently? As Polonsky put it, the answer
was simple: "Filmmaking in the major studios is the prime way that film
art exists." And so it was. And is. And unfortunately--thanks to an
American indie movement that has lost its lure for youth, a dissipated
market for the once-hip foreign film and a general tendency toward
divorce between American art and American politics--so it is likely to
remain.

I don't know if it's some childhood image left over from Victory at
Sea or from a book of pictures my uncle brought back from the
service, but when I think about the war in the Pacific, I see pink
cumulus clouds piled high, one upon another, on the decks of aircraft
carriers. It's not the iconic image of violent battle that usually represents the war, but my imagination seems to be
telling me that the iconic images aren't the whole story, that serenity
and beauty coexisted alongside the bloodshed and were a large part of
the day-to-day reality of the war.

It's for similar reasons that I think the nitty-gritty details of life
near Ground Zero as presented in one of the first theatrical responses
to 9/11, comic monologist Reno's Rebel Without a Pause, appeal to
me so. They provide relief from the media's iconic packaging, which has
been beamed at us ever since the attack on the Trade Towers and the
(rarely mentioned) Pentagon attack.

With a deluge of energy, Reno, who has lived near the towers since 1981,
relates what it was like in lower Manhattan "that gorgeous day." She
recreates the clicking sound, like the noise an old machine gun would
make, that was the sound of the floors collapsing into one another. She
exhibits dismay at the total absence of Conelrad and the Emergency
Defense System. ("Maybe this wasn't enough of an emergency.") She
tells a story about finding her ATM emptied out at 9 am and the bank
refusing to open its doors so customers could get their money.

But mostly it's the human reactions to catastrophe that are so
wonderful, so wildly hilarious. The rumors that the terrorists are holed
up with machetes in a macrobiotic restaurant on Prince Street; people
rushing home to have their televisions validate what they'd just seen
with their own eyes; and what Reno calls the "hierarchical bragging
rights of pain and knowledge"--New Yorkers one-upping each other over
what they knew and what they'd suffered.

Reno's warnings about changes in constitutional protections make for a
very disturbing second half of her monologue, though she herself doesn't
seem to fear the new spy agency powers: She gives voice to her every
political thought, no matter how out there it is. She points out how
cheaply reporters have been won over by chummy Don Rumsfeld, and she
contemplates Henry Kissinger being arrested for war crimes. Reno even
suggests that Florida be allowed to float down to Uruguay, "where all
the other fascists are."

She also reveals some interesting facts, like ones you find in this
magazine but not in the major media. For instance, Hamid Karzai, the new
president of Afghanistan, used to work for Unocal. And this from Frank
Lindh, who saw the show the night before I did: FBI agents treated his
son kindly because even they knew "he was a hapless kid."

After a while, I began feeling the tingle of what I hope was just my own
paranoia (although as I learned the last time--when Watergate lanced the
Nixonian pustule--paranoia can be a very accurate predictor of reality).
Reno talks about what is being done to our civil liberties in the
context of Christian fundamentalist influence on this Administration. At
342 pages, the USA Patriot Act, she suggests, wasn't written in the days
after 9/11, and the Padilla case has clearly crossed the line of
innocent until proven guilty. She builds a picture of how really
extremist the Bush people are and how far to the right the President has
taken the country. So far, in fact, that Colin Powell is the "Communist
of this Administration."

Such points may be made with laughter, but Reno brings a fierceness to
her criticisms and an urgency to her concerns about the current
Administration that we are only beginning to see in the big world, and
then over financial wheeler-dealing and privilege, not civil liberties
and constitutional guarantees.

You will walk away from Reno with a clear sense that the changes aren't
minor, and they won't fall only on bad guys and enemies. It's a real
turning point: Democracy is up for grabs.

The San Francisco Mime Troupe's free summer show, Mr. Smith Goes to
Obscuristan, likewise treats the aftermath of 9/11. In it,
Condoleezza Rice (Velina Brown) and Dick Cheney (Cheney lookalike Ed
Holmes) seek to sell the Bush presidency as an Administration that cares
about democracy, not profits, and so devise a plan to send 9/11
firefighter Jeff Smith (the always wonderful Michael Gene Sullivan) to
oversee the first free election in the Central Asian, formerly Soviet,
republic of Obscuristan. The winner of this contest is certain to be
warlord and privatizer Automaht Regurgitov (Victor Toman), since he is
the only candidate. That is, until the oppositionist Ralif Nadir (Amos
Glick) throws his hat into the ring, arguing that "people should vote
their hearts, not their fears." (Of course, had one or two percent of
Florida's Nader voters forsworn that advice, the Mime Troupe wouldn't
have a Bush Administration to satirize.)

(Or would they?)

Smith, who has been kept ignorant by outfits like SNN, the Selective
News Network, believes America wants freedom for everyone. He is,
however, disillusioned when it becomes clear that there is oil in
Obscuristan and that the Administration's real interest is that
Regurgitov win, since he will ensure the atmosphere necessary for US
investment. Smith then sets out to prove that the ordinary American
doesn't want to screw Obscuristan over, and by the end of the day
rescues Nadir, who was kidnapped and branded a terrorist. He also helps
bring an SNN reporter and the US ambassador over to the side of a fair
shake for Obscuristan.

The Mime Troupe hits many of the right points: that energy sources are a
major factor in our involvement in Central Asia, for instance, and that
much of the weaponry in the area was originally supplied by the CIA. And
they raise questions about just how free our own elections are. Given
that, I was left pondering why Mr. Smith seemed so tepid and not
particularly funny compared with Rebel Without a Pause. It's
doubly strange given that the Mime Troupe brought in the usually very
funny monologist, independent filmmaker and former Nation intern
Josh Kornbluth (Red Diaper Baby and Haiku Tunnel) to help
write the script.

The difference is, I think, that Reno articulates things you hadn't
thought about, or says things you may have thought a lot about, but in
ways that create the old shock of recognition. As when she says, "The
people of Missouri were so worried about Ashcroft making decisions, they
voted for the dead guy."

There are moments like that in Mr. Smith. Barbara Bush (Ed Holmes
again, this time in a gray wig and pearls) explains the rules of the oil
game to George W., and the whole facade of her Betty Crockerdom smacks
right up against her tough capitalist intelligence. This is a Barbara
Bush who says, "Never send a member of the working class to do an
aristocrat's job." But such moments are rare. For the most part, the
Mime Troupe's most incisive statements, such as "Only an American would
confuse a fixed election with a real one" or "Welcome to democratic
nations like Saudi Arabia who protect human rights," simply restate our
perceptions or are so bitterly ironic that a lot of the laughter I heard
was sniggering.

Given that the source of the satire is Capra's populist classic, Mr.
Smith Goes to Washington, I think the Mime Troupe missed a real
opportunity to have us question ourselves by asking, Who is Mr. Smith
and what is he about? In the mythos of Mime Troupe plays, the ordinary
American is decent and fair, and in every respect there's a lot of
daylight between him and the ruling class, and therefore between us and
what our government does in our name. The Mime Troupe believes that like
Jeff Smith, the ordinary American has been kept in ignorance by the
media, and that if he only knew what was really going on, he would rise
up and change things.

That conveniently ignores the fact that ordinary Americans are of many
minds, and that many of us do understand that our comfort is based on
the deprivation (and worse) of people in other parts of the world. So
then, you have to ask whether we feel we can't do anything about it or
whether we don't want to. How much is the ordinary American willing to
give up to see people elsewhere get a larger slice of the pie?

And what is the usefulness of a mythos of unquestioned fairness and
decency, and in this play, as in other Mime Troupe efforts, of a sellout
who regains her soul and of a decisive victory over the people's
enemies? It's positive, but does it send us out of the park feeling
hopeful and intent on action? Or do we feel that a lot of what we
witnessed was too simple and fantastic?

The appeal in Mr. Smith is ultimately to idealism, to looking out
for the other guy and doing the right thing. Reno, on the other hand,
talks about self-interest: that we are losing our rights and that some
of us were slaughtered. "The [US] government," she says, "created the
mujahedeen that came to my town and killed us." That seems a much
stronger motive for action.

Mr. Smith Goes to Obscuristan will be performed through Labor Day
in various Northern California locales (415-285-1717 or www.sfmt.org).
Rebel Without a Pause played a week at the Brava Theater Center
in San Francisco in June and went on to an extended run at the Lion
Theater on 42nd Street in New York City.

If Canadian writer Yann Martel were a preacher, he'd be charismatic,
funny and convert all the nonbelievers. He baits his readers with
serious themes and trawls them through a sea of questions and confusion,
but he makes one laugh so much, and at times feel so awed and chilled,
that even thrashing around in bewilderment or disagreement one can't
help but be captured by his prose.

That's largely why I took such pleasure in Life of Pi, Martel's
wonderful second novel, which playfully reworks the ancient sea voyage,
castaway themes of classics like Defoe's Robinson Crusoe, Swift's
Gulliver's Travels, Coleridge's The Rime of the Ancient
Mariner, Melville's Moby-Dick and (in some of its more
fantastical aspects) Homer's The Odyssey, to explore the role of
religion in a highly physical world. What's more, it's a religious book
that makes sense to a nonreligious person. Although its themes are
serious and there are moments of awful graphic violence and bleak
despair, it is above all a book about life's absurdities that makes one
laugh out loud on almost every page, with its quirky juxtapositions,
comparisons, metaphors, Borgesian puzzles, postmodern games and a sense
of fun that reflects the hero's sensual enjoyment of the world. Although
Martel pays tribute to the past by using the typical castaway format
(episodic narrative, focus on details of survival, moments of shocking
violence and reflections on God and nature), his voice, and the fact
that his work is more fantastic, more scientifically sound and funnier
than that of his predecessors, infuses the genre with brilliant new
life. If this century produces a classic work of survival literature,
Martel's novel is surely a contender.

Life of Pi is the unlikely story of a 16-year-old Indian boy, Pi
Patel, adrift in a boat with a hungry tiger after the ship carrying his
zookeeper father, mother, brother and many animals sinks in the middle
of their journey from India to Canada. (It's the mid-1970s and Pi's
father decides to emigrate after Prime Minister Indira Gandhi starts
jailing her enemies and suspending civil liberties.) Pi is at once a
Hindu, Christian and Muslim (echoes of the pacific Mahatma Gandhi here)
who believes that all religions are about "love." But having grown up
among animals, he's also practical and grounded. Early in the book, his
three religious teachers meet, and Pi gets his "introduction to
interfaith dialogue," a big argument that ends only when he is asked for
his opinion. He quotes Gandhi, "All religions are true," adding, "I just
want to love God," which floors them all. Then he goes out with his
parents for ice cream. Most of the rest of the book is a challenge to
Pi's simple faith, as this sweet yet unsentimental hero experiences a
situation where, it would seem, survival is everything. Aside from the
detailed descriptions of hands-on survival techniques that almost rival
Ishmael's whaling lore in Moby-Dick, the book poses the
questions: Can faith survive in the face of doubt and suffering? Can the
love of God and one's fellows remain pure in an angry, violent world?

Despair sets in from the beginning. Not only does Pi lose his parents,
but he is facing life on the ocean wave with a tiger (named Richard
Parker), a zebra, an orangutan and a hyena. Pi watches them kill each
other, with Richard Parker finishing off the hyena. The boat is littered
with animal carcasses. As the days go by, Pi, a vegetarian, learns how
to kill with his bare hands, batter turtles to death and eat uncooked
flesh. He weeps. He is "dumb with pain and horror." But he survives,
marking his territory with his urine, as animals do, to keep Richard
Parker at bay, feeding him and finally teaching the tiger (by using a
whistle) that he, Pi, is master here.

It's true that his three faiths recede to a whisper on the boat. He
confesses that it is Richard Parker, and the practical matter of
avoiding being eaten by him, that gives him "purpose," even "peace" and
perhaps "wholeness," and thus keeps him alive. "If he died, I would be
left alone with despair, a foe even more formidable than a tiger.... He
pushed me to go on living." Pi keeps up with his religious rituals, but
he finds his faith wavering. In one funny scene, he yells out his
beliefs to make them more real. "I would touch the turban I had made
with the remnants of my shirt and I would say aloud, 'THIS IS GOD'S
HAT!'" Then he points at Richard Parker and says, "THIS IS GOD'S CAT!"
The boat is "GOD'S ARK!" The sea, "GOD'S WIDE ACRES!" The sky, "GOD'S
EAR!" But, he says, "God's hat was always unravelling," and "God's ear
didn't seem to be listening."

You might say he's trying to persuade himself. But it's clear that he
continues to appreciate the beauty of the sea and sky, and the sparse
life around him, in which, as a Hindu, he sees his connection to God.
There are wonderful poetic descriptions of the fish around the boat as a
little city, of Richard Parker's beauty and of a dorado fish that, as it
dies, begins to "flash all kinds of colours in rapid succession. Blue,
green, red, gold and violet flickered and shimmered neon-like on its
surface as it struggled. I felt I was beating a rainbow to death." Even
when his journey is "nothing but grief, ache and endurance," it is
"natural," he says, that he "should turn to God."

But religion is only one element of the book's exploration of faith.
Martel is also interested in the faith of his readers. He wants them to
believe his story. He has his narrator pose a larger, Keatsian "beauty
is truth" argument against the glorification of reason, "that fool's
gold for the bright." It's as if he were suggesting that storytelling is
a kind of religious experience because it helps us understand the world
in a more profound way than a just-the-facts approach (or by
implication, dogma, fundamentalism and literalism). Two passages that
some reviewers have picked out as the least convincing (for their lack
of literal accuracy!), I find illustrate Martel's attempt to show the
power of storytelling at its best. Fantastic, yes, but utterly
convincing. The first is Pi's encounter with a blind, cannibalistic
Frenchman whom Pi runs into at the exact moment he too has gone blind
for lack of nourishment. Their obsessive conversation about food is one
of the funniest and most farcical moments in the book. The second is
Pi's sojourn on a flesh-eating island, which is one of the most chilling
symbolic illustrations of evil I have read. (If the pious Swiss Family
Robinson finds utopia, the religious Pi finds dystopia!)

Good postmodernist that he is, Martel wants to use the very telling of
the tale--multiple narrators, a playful fairytale quality ("once upon a
time" and "happy ending" are mentioned in passing), realistically
presented events that may be hallucinations or simply made up--to push
at the limits of what's believable, yet still convince the reader of his
literary, not literal, veracity. He wants to prove that it's possible to
remain curious about and connected to the world, yet to accept that
there are always going to be aspects of life (and literature) that
remain mysterious.

Pi's doubts about his faith are mirrored by the seeds of doubt Martel
sows in the mind of the reader throughout the narrative. Every moment of
certainty is undercut by the potential for disbelief, and that's when
Martel seems to ask: Am I convincing you now? He sifts the story through
various narrators, beginning with an author-narrator who at first one
thinks is Martel himself but is only Martel-like, introducing the story
as if it were true. Martel has said in interviews that some of this
information is factually accurate. Like his narrator, he was trying to
write a novel about Portugal that wouldn't come alive when he got the
idea for Life of Pi on a trip to India. Martel also briefly
acknowledges his special debt to Brazilian Jewish writer Moacyr Scliar,
whose novella Max and the Cats also has a hero who survives the
sinking of a ship filled with zoo animals and spends days at sea in a
boat with a large cat, in this case a jaguar. Scliar's is the
mini-version that Martel fleshes out with more lyrical language and the
fruits of zoological research.

But there reality stops. There's the whiff of an old-fashioned quest or
allegorical tale in the introduction, for the Martel-like narrator first
learns the story from Francis Adirubasamy, a family friend of Pi's, who
tells him that Pi's story will make him "believe in God." And he plays
with the reader's sense of reality when he has Adirubasamy talk about Pi
as "the main character" whom the narrator proceeds to track down in
Canada. And just how believable is Pi? Now in his 40s, Pi apologizes for
his memory and tells the story as a series of out-of-sequence
events--jumping back and forth between his early childhood, his teenage
years and his time at sea. He can barely remember what his mother looks
like, but he appears able to recall whole conversations from his
childhood. He even asks the narrator to "tell my jumbled story in
exactly one hundred chapters, not one more, not one less." (He does.)
One begins to wonder if Pi made up Richard Parker. Despite his knowledge
that people anthropomorphize animals because of their "obsession" with
putting themselves "at the centre of everything," Pi seems
disproportionately haunted by the fact that when the boat hits Mexico,
Richard Parker takes off without a backward glance. Perhaps the loss of
the tiger symbolizes the greater loss of his family, or of his own
innocence. Perhaps Pi invented the tiger to keep himself sane. The
reader is left to decide.

In a final test of the reader's faith in the narrative, Martel has Pi
tell an alternate, allegedly more believable version of the story at the
end--lacking not only Richard Parker but also the humor, poetry and
detail of the tiger story--to please a couple of doubting Japanese
shipping officials. He asks them which they think is the "better" story.
Of course, the tiger story is the finer, more thoughtful literary
creation and therefore (Martel suggests) has a truth more lasting than
the second, more journalistic version, with its "dry, yeastless
factuality."

Even if one accepts the twists and turns of the narrative, one faces the
further challenge of tracking down clues hidden in a warren of allusions
for more definitive answers to questions about Pi's religious faith, and
whether the narrator (and the reader) will be persuaded of the story's
original premise that it will make one believe in God. That symbolism is
important in this book is made clear at first by the most obvious symbol
of Pi's name, self-chosen because it's the short version of his real
name Piscine (after a family friend's favorite Parisian swimming pool),
and he is inevitably called "Pissing" by classmates. Nothing could be
grittier. In contrast, Pi is like π, what mathematicians call an
"irrational number," that is, 3.14 if rounded off, but with endlessly
unfolding decimal places if carried out. Martel couples this mysterious
abstraction with a concrete image--"And so, in that Greek letter that
looks like a shack with a corrugated tin roof, in that elusive,
irrational number with which scientists try to understand the universe,
I found refuge"--to show that, as a boy, Pi is in harmony with things as
they are as well as with his sense of the unknowable.

That Pi's attitude to religion may have changed after his ordeal is
buried in the hidden symbolism hinted at by Pi's college studies in
religion and zoology, described on the opening page as if to emphasize
their importance as a key to the story. (This is after the lifeboat
comes to shore in Mexico, and Pi goes to Canada to start a new life.)
His specialties are the sixteenth-century Jewish mystic Isaac Luria and
the sluggish three-toed sloth (symbol of the Trinity?) whose miraculous
capacity to stay alive, he says, "reminded me of God." (An echo of his
own survival, perhaps? A hint that God seems more elusive these days?)
More important, Luria's cabalistic ideas may hold the key to Pi's
experience at sea. His philosophy (Luria thought the secrets of the
universe lay in numbers) echoes the symbolism of π and the formula
relating a circle's circumference to its radius (connecting
perimeter and center). Luria believed that God's light contracted from
the center of the universe, purging itself of evil elements, leaving an
empty space (a circle) in which human life developed. But God also sent
down a ray of light (like a radius) so that the few remaining divine
sparks could reconnect with Him. To achieve this fusion with God, and by
implication eliminate evil from the world, Luria believed, people must
live an ethical life. The original divine contraction is called
variously tzimtzum, zimzum or simsum. It's no
coincidence that Martel called the sinking ship Tsimtsum. Thus Pi at sea
was experiencing his own void (or withdrawal of God), in which elements
of evil fight with the instinct to do good. Richard Parker saved his
sanity, and Pi's goodness kept Richard Parker (and perhaps his own
faith) alive. By introducing this strain of mystical Jewish thought,
Martel not only further illustrates Pi's contention that all religions
are essentially the same in that they stem from love but he also uses
mysticism to underscore the profound ways in which literature can
present life's truths. Skeptics, however, might see Pi's study of Luria
as a move away from his earlier, purer faith toward a more structured
mysticism. That would explain his comment at the end of the book, when
he confesses his need for "the harmony of order."

Though one can read Life of Pi just for fun, trying to figure out
Pi's relationship to God makes one feel a bit like the castaway hero
wrestling slippery fish into his lifeboat for dinner. An idea twists and
turns, glittering and gleaming, slaps you in the face with its tail and
slips away. Did the story really happen? Does it make one believe in
God? What kind of God? Early on the narrator says, "This story has a
happy ending." But Pi also tells his interviewer, "I have nothing to say
of my working life, only that a tie is a noose, and inverted though it
is, it will hang a man nonetheless if he's not careful," which suggests
a man with at least some conflict on his mind. On the other hand, Martel
may also be suggesting that work is less important to Pi than God and
family--the narrator gives us glimpses of Pi's shrine-filled house and
his loving relationship with his wife, son and daughter. However, when
Pi is showing him family pictures, the narrator notes, "A smile every
time, but his eyes tell another story." I believe Martel's point is that
doubt inevitably accompanies faith. But the opposite explanation, that
after Pi's life-threatening experiences his faith is a mere prop for his
anxiety, might work just as well.

Does it matter that the answer to all questions in this novel is both
yes and no? One answer comes in the form of Pi's question moments after
the ship has sunk and he's sitting in the lifeboat, bewailing the loss
of his family and God's silence on the topic: "Why can't reason give
greater answers? Why can we throw a question further than we can pull in
an answer? Why such a vast net if there's so little fish to catch?" And
that, of course, is the nature of faith. One can't argue it through, one
just believes. Faith in God (as the younger Pi sees it) "is an opening
up, a letting go, a deep trust, a free act of love." It's also "hard to
love," Pi adds, when faced with adversity. The same might be true of a
good novel, as readers are taken to the edge of their understanding by
something new. If the reader lets go of preconceptions, the experience
can be liberating and exciting. Martel may be sowing seeds of
uncertainty about God, but there's no doubt that he restores one's faith
in literature.

More than thirty years ago, in an essay called "Uncle Tom and Tiny Tim:
Some Reflections on the Cripple as Negro," I suggested that cripples
emulate the civil rights movement by focusing on political solutions to
the problems of living under difficult physical conditions. (It's a lost
battle, but I continue to prefer the term "cripple" to the bland "disabled.") The problems cripples faced seemed as much the result of our inability to
define our needs as they were the fault of a society quite willing to
live with its ignorance of those problems and quite willing not to see
us at all unless absolutely forced to. It wasn't until the late 1960s
that cripples began to believe that they had the right to demand that
America meet their needs.

Anyone who has spent significant time living with a serious physical
condition probably has had an experience similar to the following:
entering a restaurant with another person, he (or she) finds that the
waiter is addressing not him but the person he is with. He is a
category, and categories are simply assumed to be unable to take
responsibility even for something as minor as placing an order. Yet even
such infantilization can seem liberating if the cripple realizes that
the problem it bespeaks is political rather than psychological: One
infantilizes the other by assuming attitudes held by society at large.
And this process is something that the cripple, too, is encouraged to
do. Even Randolph Bourne, as tough a social critic as America ever
produced, looks inward in his famous essay "The Handicapped," published
back in 1911. Writing about other issues, Bourne understood that political problems demand political solutions. But when it came to the cripple, among whose ranks he was numbered, he was curiously inner-directed and soft.

The demand for the rights of cripples was already under way as I was
writing "Uncle Tom and Tiny Tim." And while I would be happier without
much of the rhetoric of the Disability Rights Movement, to its credit,
it has helped change the consciousness of those who must confront the
world with physical disabilities. Both its success and its burgeoning
political potential seemed wishful thinking in 1969, when I still
dismissed its prospects. But that success was confirmed with the
enactment of the Americans With Disabilities Act in 1990. Despite its
admitted weaknesses, few Congressional acts more deserve the term
"landmark legislation." The Americans With Disabilities Act promised
those forced to live with severe physical impairments the possibility of
legal if not functional equality. Its most profound accomplishment, even
allowing for the vagueness of definition that has come to haunt it, was
to accept the idea that cripples have the right to specific
accommodations that meet their employment needs. For a population
battling the indignities of permanent illness, its promise was
comparable to that of the Civil Rights Act for African-Americans in
1964.

Twelve years after its passage, that promise seems about to be swamped
by a legal system in which what constitutes a workplace disability is
undefined and perhaps undefinable. The confusion about what would seem to be the most elementary of definitions--what is meant when we speak of a disability--threatens to weaken the act, if not render it virtually useless.
The cripple's demand for rights still commands a good deal of public
interest and a degree of public sympathy. Yet the Americans With
Disabilities Act has not led to widespread political activity on behalf
of the nation's cripples. Not only is their quest for equality threatened with that most severe of American sins, being relegated to political unfashionability, but the question of what a disability is
shows few signs of being resolved in favor of those whom the act was
supposed to help. Recent Supreme Court rulings in which disability was
ill defined must be seen as setbacks for those who look to the judiciary
to enforce what the act called for, a policy of accessibility and
inclusiveness. The Court ruled in April by a 5-to-4 majority in US
Airways v. Barnett
that US Airways' seniority system took precedence
over the right of a disabled worker to transfer to a more suitable job.
In Toyota Motor Manufacturing v. Williams, the Court ruled
unanimously that the definition of disability must mean substantial
limitations on abilities "central to daily life," not just the job. And
the Court also unanimously held, in mid-June in Chevron U.S.A. v. Echazabal, that employers had the right to refuse to hire a worker whose health they believed might be impaired by performing a particular job.

For this alone Ruth O'Brien's Crippled Justice is a welcome
addition to the literature on living with disability. A professor of
political science at the City University of New York, O'Brien approaches
her subject armed with an analytical perspective nurtured by her earlier
work. Her first book, Workers' Paradox: The Republican Origins of New Deal Labor Policy, 1886-1935, already reflected her interest in the subject of workers' rights. Yet even academic inquiries can be rooted in
personal experience. "Had I not sustained what is now a ubiquitous
workplace injury," she writes, "a debilitating case of bilateral
tendinitis in my hands and forearms, I might never have explored the
development and implementation of...disability policy." Yet the focus of Crippled Justice is neither
personal nor anecdotal. It is a serious inquiry into the history of
public policy as that policy has affected large numbers of men and women
crippled by illness, accident or birth. As serious scholarship is
expected to be, it is factual and analytical. The past few decades have
witnessed a rich expansion of memoirs and essays by writers forced to
struggle with their own physical or mental deterioration, books that
depict what life is like for those who must live it with severe illness.
But the kind of political analysis O'Brien offers in Crippled
Justice
is what, I believe, cripples need now.

Analysis demands perspective, particularly when it begins in personal
experience. While bilateral tendinitis may not have the same sort of
consequences as, say, pushing through life in a wheelchair or trying to
earn a living as a blind person, the experience limited O'Brien's normal
ability to function. It turned her temporarily from normal to cripple.
And however temporary an experience, it was also sufficiently
dehumanizing to give her a strong sense of what life is like for those
forced to live with more severe conditions. The first discovery one
makes on entering the shadowy world of cripples is that one no longer
defines need, ability and ambition for oneself. The experience of living
with disability forced Ruth O'Brien to recognize that the cripple must
"struggle over the same issues that women and minorities battle." But
she also saw that the problems cripples faced were in some ways less
soluble and in others more mechanical than the problems of other groups.
Nothing would be more beneficial to cripples as a group than a fantasy
I've held for the past decade--a law that would make it mandatory for
every elected official in the country to live a single week each year as
a cripple.

If nothing else, that would show that the problems involved are as
political as they are psychological. And that is why I am grateful that
Crippled Justice restricts itself to the conditions cripples
confront in the workplace. To the writer, physical disability offers a
personal confrontation. And as is the case with writers, that
confrontation is about language. But the confrontations the cripple faces in the workplace, as O'Brien shows, have solutions. And those solutions are political. What she tells us about the history of
disability policy in the workplace may not be as powerful or as dramatic
as, say, Andre Dubus writing about the changes that were imposed upon
his life by the sudden transition he underwent from being a normal man
to being wheelchair-bound. Nor does Crippled Justice offer us the
savage honesty of Harold Brodkey writing about his own impending death
from AIDS. O'Brien's focus is more mundane, which is to say that it is
more political: She is interested in the possibility of a meaningful
work life for those who lack the talent of a Dubus or a Brodkey.

We do not, of course, read memoirs and essays to create public policy
but to recreate individual lives. Yet if the experience of being forced
to live as a cripple is invariably personal, the reality of how
one lives that life is invariably political. I have no choice but to
accept being in a wheelchair. On the other hand, the New York through
which I push myself has any number of choices in how it reacts to my
need for that wheelchair. It is able to define how I live, what is now
subsumed under that horrendous phrase "quality of life," through the
public policy decisions it makes. Such seemingly trivial items as the
condition of the streets through which I push speak less eloquently but
more truthfully of what is or isn't possible for me than Dubus's essays
or my own essays or Nancy Mairs's essays. Public policy defines the
boundaries of the cripple's life. Mundane issues such as the condition
of the streets and the accessibility of restaurants and stores and
theaters (and how the Court defines disability) speak to the cripple's
ability to live with dignity.

The first half of Crippled Justice offers a historical overview
of the rehabilitation of the cripple in America. The ideas dominating
medical and social policy after the end of the Second World War in 1945
were largely formulated by two physicians, Dr. Howard Rusk and Dr. Henry
Kessler. (War may be unhealthy for children and other living things, but
it has done wonders for the fields of prosthetics and rehabilitation
medicine.) Rusk and Kessler are among the villains of the book, since,
along with Mary Switzer, the federal bureaucrat responsible for the
Vocational Rehabilitation Act of 1954, they created models of
rehabilitation still largely followed today. Inspired by Freud and even more by William Menninger, rehabilitation medicine shifted its focus from treating the cripple's physical symptoms to treating the whole person. And the models were psychological.
O'Brien describes "the deep strain of individualism in American
liberalism" as the source of the mistaken path rehabilitation medicine
took. Yet I am not convinced that individualism is so negative in the
life of the cripple. No one can overcome the effects of disability
through mere willpower or a well-developed work ethic--but a
well-developed sense of self helps if one is to be a "success" as a
cripple. One might even suggest that the successful cripple must combine
a free-market head with a socialist soul. Perhaps more than others do,
he needs to see himself as singular. After all, what else can account
for all those memoirs about the singularity of the experience of
disability? The best passage I know about living as a cripple--as moving
to me as Shylock's "Hath not a Jew" speech--wasn't written by a cripple
but by a healthy Saul Bellow at the height of his powers. Put into the
mouth of the poolroom entrepreneur in The Adventures of Augie March, its power derives from how it speaks for us cripples as it speaks about Einhorn's aching sense of his individual quandary.

O'Brien is on more solid ground when writing about how Rusk and Kessler
expected the "sick" individual to "adjust" to what they viewed as a
"healthy" society. The cripple unable to make the adjustment was a
social and psychological problem. Even so, one can argue that the
individualism O'Brien finds irritating is the cripple's best chance to
find salvation. Ambition should be made of sterner stuff than turning
all problems into psychological barriers. At the same time, the desire
to get even with an unjust fate shouldn't be dismissed lightly.
Liberalism may have a lot to answer for where attitudes toward the
cripple are concerned, but excessive concern with individualism is not
the biggest item on that bill. Still, the psychologizing of disability
was a mistake for which we continue to pay a price. And it remains, I
believe, the source of the Court's restricted vision of workplace
disability.

The conditions cripples face in the workplace cannot be conquered by
their adjusting to normal society but by society making certain minor
but necessary adjustments to their problems. By the 1970s the
psychological definition of the cripple had already shown how limited it
was. But is it better to define the cripple legally? Despite its immense
promise, the Americans With Disabilities Act is, as O'Brien writes, "an
idiosyncratic body of law." Where once cripples had to convince the
world of their ability to meet standards set by normals, they are now
expected to meet thresholds of disability set by a Court that seems
oblivious to the obvious. When the issue is as clear-cut as it was in
the case of the golfer on the PGA Tour, Casey Martin, whose bone
deterioration made it impossible for him to walk the links although it
didn't prevent him from playing golf, the courts seem willing to allow
the spirit of the original act to serve as its definition. But even that
makes the judiciary our "modern-day experts of vocational rehabilitation
because of the idiosyncratic nature of disability." The Court has not
yet claimed the right to define whether an individual is or is not a
cripple. But by insisting on its right to define what constitutes
disability in the workplace, it has assumed the power of defining what
the consequences of being a cripple are. As far as work is
concerned, cripples "have gone from being subjects of medicine to
subjects of law." Whether this is an improvement over the psychologizing
of disability is certainly open to question. The conclusion of
Crippled Justice is not despairing but it is skeptical. And for
good reason. In a valuable study of workplace disability as both a
political and social issue, O'Brien has performed a service to anyone
interested in social justice. Unfortunately, recent Supreme Court
decisions threaten to make her skepticism the book's lasting legacy.
Whether defined by the judges or doctors, it seems to be the cripple's
fate to be defined as the other.


William J. Bennett, former Secretary of Education, ex-chairman of the
National Endowment for the Humanities, candidate for President in 2000
in the Republican primaries, has written an intemperate little book
called Why We Fight. The book, which uses the horror of 9/11 as its occasion, crackles with protestations of his patriotism as Bennett lobs shells at those who do not share his views. Apparently Bennett had no moral choice but to write
what he had to say in order to save the Republic. "I sensed in my bones
that if we could not find a way to justify our patriotic instincts, and
to answer the arguments of those who did not share them, we would be
undone."

If Bennett had his way, those who did not hold his views would be dealt
with very harshly indeed. He leaves it to the reader to guess what he
would do with those he views as "unpatriotic." But there are ample
clues. Civil liberties are not his concern, neither in this book, as he
makes clear, nor for that matter anywhere else. He states that he is for
military tribunals "and the detention of suspects within our own borders
for questioning." For how long Bennett does not say. Nor does he tell us
whether there is the same standard for a non-American as for an American
citizen. Until recently there were hundreds being held in detention,
sanctioned by an act of Congress that gives the Bush Administration
virtual carte blanche in handling suspects without warrants, and perhaps
even without recourse to the regular court system. (Most of the
detainees have been quietly deported.) This exercise of power is a
complement to Administration foreign policy, as it is apparently
prepared to intervene in or invade nations even if there is no evidence
that they are involved in terrorism or backing terrorists. The domestic
implications are spelled out well by Bennett, but none of this bothers him. His gravamen against the left and those who disagree with
him--members of the "peace party," as he calls his adversaries--is that
they "have caused damage, and they [you] need to be held to account."
Nation editors and thinkers like Eric Foner, Richard Falk, Katha
Pollitt and Jonathan Schell, take heed. They are not alone as enemies of
Bennett--New York Times editors, Harvard (Bennett is an
ungrateful alum) and assorted scholars, Noam Chomsky, students and the
professoriate generally should watch out. They are targets in Bennett's
campaign for an inquisition, twenty-first-century style. He is concerned
that "the Foners of the United States" have led a minority of Americans
away from being true believers. As Bennett so indelicately puts it, "A vast
relearning has to take place," undertaken by everyone, especially
"educators, and at every level." "The defect" in our education and
morals "can only be redressed by the reinstatement of a thorough and
honest study of our history, undistorted by the lens of political
correctness and pseudosophisticated relativism." In other words, there
has to be a moral cleansing in America.

The word "reinstatement" does not tell us what Bennett is attempting to
reinstate, though. From Why We Fight we learn of Bennett's deep
distress at American education, where his notions of American history
seem less persuasive than they were in the days when nineteenth-century
historians acted as propaganda instruments for war, racism and America's
imperial superiority. Those were the days when "a vast relearning" was
not necessary. He quotes approvingly Professor Donald Kagan, the Yale
historian, who tells us that those who do not hold to their definition
of patriotism and their reading of history suffer from "failures of
character
[emphasis added by Bennett], made by privileged people who
enjoy the full benefits offered by the country they deride and detest,
its opportunities, its freedom, its riches, but who lack the basic
decency to pay it the allegiance and respect that honor demands."
Bennett does concede at one point that while it is incumbent on those
who hew to the Kagan version of truth to point out the despicable
behavior of the naysayers, we must also "[respect] their right to be
irresponsible and even subversive of our safety."

There are other views of patriotism, of course. One was promulgated by
the leading American philosopher John Dewey, an independent thinker given neither to religion nor to secular religions such as Communism. He surely would have been measured for a Soviet gulag. But he would also
have been on Bennett's enemies list for his belief that scoundrels too
often fly the flag of patriotism and nationalist triumphalism:

On the side in which public spirit is popularly known as patriotism this
widening of the area of interest has been accompanied by increased
exclusiveness, by suspicion, fear, jealousy, often hatred, of other
nations.... The self interest of the dynastic and military class
persistently keeps the spark of fear and animosity alive in order that
it may, upon occasion, be fanned into the flames of war. A definite
technique has grown by which the mass of citizens are led to identify
love of one's own country with readiness to regard other nations as
enemies.... And in many cases, it is becoming clear that particular
economic interests hide behind patriotism in order to serve themselves.
So far has this feeling gone that on one side there is a definite
attempt to attach the stigma of "unpatriotic" to everything designated
international; to cultivate that kind of "hundred percent Americanism"
which signifies practically suspicion and jealousy of everything
foreign.

In other words, Americanism can serve as a code word for "contempt of
other peoples," Dewey concluded.

The disinterested observer can hardly think it inaccurate to note the emergence of dynastic classes whose political power is linked to the
intelligence community, the military and big business. It would be
absurd to deny at this point that there are classes and groups that
profit from war and military preparedness. It is equally naïve to
believe that the constitutional contract of civil liberties is so strong
that prosecutors, local police, freewheeling inquisitors and others will
not spy and inform on and harass the different and the dissident. War
mobilization is the perfect cover story for such abuses. The problem is
made worse because legal and structural changes in governing and
consciousness are legitimized through law, for example in the USA
Patriot Act. That is to say, the legacy of Bush will live long after he
returns to Crawford, Texas.

But what about the doubter? What about today's or next year's or next
decade's "little guy," a man like Winston in Orwell's 1984, who
didn't go along, or didn't know how to, because the contradictions between the stories handed out from one year to the next were so profound that he knew enough not to believe this year's lies? Suppose he
wondered why Ferdinand Marcos of the Philippines was our friend one year
and the next we helped overthrow him, or why the hapless former
Panamanian leader Manuel Noriega, a man once on the CIA payroll, became
the occasion for our invasion of Panama, ostensibly because of his
involvement with drug payoffs? The result was much destruction and the deaths of several hundred Panamanians. Bennett's defense of violence
takes on frightening characteristics. Quoting Orwell favorably, he somehow believes that "Those who 'abjure' violence can only do so because others are committing violence on their behalf." He goes on to wrap
himself in the comfort of the armed forces. But surely he can't mean
this about Panama, El Salvador, Colombia, etc. Violence was not being
committed there on behalf of those who objected here. Indeed, it is a
stretch to imply that these actions did anything for the American
people.

Imagine the naïve citizen who doesn't understand hypocrisy and
strategies of evasion, contradiction or double standards. That person
might wonder why we went to war in Afghanistan when the perpetrators of
the 9/11 destruction were for the most part Saudis. Referring to
Augustine and Jean Bethke Elshtain, Bennett claims that "not
resorting to force leads to evils far greater than the one we
oppose." But surely it would be nice to know who the enemy is, and drop
the bombs on the correct culprit. Whether the naïve person who
holds such views and then organizes others to express their doubts
should be held without bail as a suspect is unclear from the Bennett
text. What is clear is that doubters should be shunned and punished.
They are raining on Bennett's "war party" (his term), a parade in which
he is a proud adjutant.

Bennett's animus toward his fellow Americans is unforgiving, especially in reference to those who were part of the movements of the 1960s, which had the effect of concretizing ideals into practice--and at no small cost. Perhaps his anger against the movement members stems from the fact that they employed nonviolence and used, or stumbled into, a social method that broke "facts" open and found values that contradicted the stated democratic ideals of inclusivity, equality and sheer decency. It is no wonder that this social method is one that helps us and the young demystify events, their causes and implications. His disdain for the
peace party goes back to the Vietnam War. At that time, the peace party,
made up of the flaccid and pusillanimous, didn't support the "bomb them
back to the Stone Age" position of Gen. Curtis LeMay. Bennett, the angry
moralist, remains upset that the LeMay position didn't get much of a
hearing, although the general ran for Vice President with George
Wallace, and the tonnage of bombs dropped on Vietnam by the United
States was greater than the amount dropped in World War II. As Bennett
opines, it was the Gandhian nonviolence people of the peace party who
subverted an American victory in Vietnam because "those among us who
espoused the LeMay position were scarcely to be heard from." His
argument is uncomfortably reminiscent of the German generals and the
right during the Weimar Republic who claimed that the Germans lost World
War I because they were "stabbed in the back" by the left.

As a good Republican, Bennett bristles at those who might doubt the
motives and methods of the Bush Administration. After all, how could
anyone doubt those patriots who took power under questionable
circumstances, who had already used every sleazy trick to get one of
their fellow rightists onto the Supreme Court and vault into the White
House a man who'd lost the popular vote, installed, as it were, by a 5-to-4 decision of the Supreme Court? Because Bennett is a dogmatic man
he is not burdened with self-doubt but has a surfeit of faith. (Bennett
lets us know that he is a religious man, a Catholic who has no doubts
about his faith and his belief in the Catholic Church, its teaching and
activities. It is his kind of faith, religion itself, which he
understands to be the backbone of America, much the way other believers
throughout the world, such as Osama bin Laden, perhaps, link their faith
to their political judgments.)

To Bennett, 9/11 was a moment of clarity between good and evil. "Good
was distinguished from evil, truth from falsehood." But there was more
to the question. He was concerned that some said the United States
helped bring the disaster about through its foreign and military
policies. After all, the skeptics wondered, didn't the United States
train and militarily assist the radical fundamentalists against the
Soviet Union? And then didn't our assets turn against the United States
when Afghanistan was left a broken nation? And did the United States
overstay its welcome in Saudi Arabia, whose people include chief backers
of the radical fundamentalists? These were not idle questions, nor was
it idle and unpatriotic to analyze from top to bottom the ethos of
American invulnerability. The United States had placed its faith in a
forward defense. But on that terrible day, the idea of fighting wars on
other people's territory was severely damaged. Wouldn't these questions
suggest a comprehensive review of American foreign policy? But Bennett
the purist claims that he is not interested in policy. He is interested
in right and wrong, good and evil. Bennett, the consummate Washington
insider, is not one, apparently, to get his hands dirty with the
realities of policy-making and everyday life--i.e., what to do--although
working through his principles would have horrendous consequences for a
democratic society.

The reader may ask whether there is anything about which Bennett and I
agree. And here the answer is yes. Certainly the assault on American
cities was an atrocious attack by a gang of zealots. On why they thought
to undertake their suicide mission Bennett and I disagree. Perhaps the
perpetrators wanted to give the United States a lesson in cost-benefit
analysis to show that all the high-tech military equipment in the world
does not make the United States invulnerable. (Indeed, because of the
interconnectedness of our communications system, the United States is as
vulnerable as any Third World country.) The zealots may have been imbued
with an anti-Western spirit that has rankled for over a thousand years
and finally erupted against the United States, paradoxically for the
same reason Bennett has had grave questions about American society: its
relativism, sensuality, individuality and lack of religious discipline.
Relativism has acquired a vulgar connotation, and Bennett uses its
burlesqued meaning as a stick against nonbelievers and the peace party.
He compares Stanley Fish, the dean of liberal arts and sciences at the
University of Illinois at Chicago, a leader of the postmodernist school
of literary theory, to mass murderer Charles Manson, who said that he
thought no man could really know and represent another, "to communicate
one reality through another, and into another, reality."

"Stanley Fish himself could hardly have put it better," writes Bennett:

Do we, then, have no independent and objective standard for determining
why Professor Fish should be allowed to teach at a prestigious
institution of higher learning while Charles Manson should languish in
prison just because he followed a doctrine he shares with Professor Fish
to its logical conclusion--the conclusion that since everything is
relative, everything can be justified and all is permitted.

One does not have to be a postmodernist, which I am not, to be deeply
offended by Bennett's comment. Bennett picks up on Leszek Kolakowski's
views that to follow principles to their logical conclusion can lead to
disaster. But Bennett overlooks a fundamental truth. The question is how
to determine an "independent and objective standard," what goes into
that judgment and who decides what that standard is. By analyzing this set of questions we learn our own weaknesses, those of the standard setters and those of people who seek to impute their values to an objective reality. We can analyze and judge, from our perspective, actions and
behaviors. People can then choose between Fish and Manson.

Right and wrong may come from God or moral sentiments, which the
philosophers Francis Hutcheson and David Hume spoke of. These
sentiments, better stated as capacities that people have, may be
degraded by social roles, institutions, laws, poor upbringing, whatever
causes a person to turn toward the pathological. Obviously, if one
believes in the Enlightenment and historical progress, ways of acting do
emerge that are acceptable as against actions that are no longer
acceptable either as a result of social agreement or because there are
moral sentiments that make their way through historical struggle.
Bennett, who appears to be all over the map philosophically, does hold
as a constant his belief in Plato, who in turn held tightly to the idea
of an antidemocratic society, one based on hierarchy and strict class
lines. Plato, according to Bennett, disposed of the relativism that his
apostle now sees as the cause of our decay. But what exactly is
relativism? Bennett also quotes approvingly Abelard's dialectical idea
of sic and non (the debate surrounding opposite
propositions) as being the probable "basis of all learning itself...of
our very outlook on the world." But Abelard's method can be read two
ways. One is that the questions undertaken invariably lead to the same
question expressed in new ways (aporia), or it is a method that
is supposed to give the right answer expressed by a church that defines
what reason and faith are.

Relativism is really a special form of democratic skepticism that
encourages us to examine and extend our inquiry beyond the appearance of
an event even in the case of recognizable and accepted facts. The
relativist points out that the fact can be seen from different vantage
points, and, more important, that a fact has within itself an entire
story that can and should be explored. Now the question is, how does
this apply to 9/11?

First, there is the fact of its occurrence. In a policy sense it becomes
critical for us to understand how and why the event occurred, what the
implications are, what its immediate causes were. For all its flaws, relativism is an attempt to move toward a coherent, if invariably incomplete, picture of what happened and what lay behind the event. It is the only
way we can learn what to do. It takes a dim view of professed views of
what is "good" and "evil" not because they don't exist but because ideas
of an absolutist nature that are put into practice can lead to the most
horrendous consequences. It is why law, including international law, is
so important, for it imposes boundaries even for the protection of the
evildoer. In policy terms, matters of good and evil are transposed into
causes, consequences and manageable categories for people who cannot
know the whole truth, and for people who seek a means of understanding
rather than mere retaliation or dogma.

This form of analysis leads to certain conclusions. The first is that
9/11 almost immediately became a social and political question of what
to do. It was a moral question for those caught between their pacifist
beliefs and their concern for justice for their fellow citizens. For
Bennett that terrible day was the moment not only to get mad (angry), in his terms, but to get even. Bennett is obsessed with the idea that there is not enough anger in American society. We are all caught in this unmanly process prescribed by Roger Fisher and William Ury's "getting to yes," that is, finding avenues of agreement among people, states and groups. If this formulation does not have value, then humanity cannot
escape the vise of dominator/dominated. Nor can it find ways of controlling and sublimating anger,
violence and rage. Nor will humanity be able to escape forever the
further use of nuclear weapons.

There is a smidgen of truth to Dean Rusk's and Bennett's idea that the
American people have to be pulled kicking and screaming into war. But this idea is belied by the record of a state that has been involved, depending on one's count, in more than 150 interventions and wars since its founding.
Only someone given to deceiving himself would not recognize the American
state as a warrior state. There are many reasons Bennett chooses not to
see this reality--that is to say, in Bennett's history book there are
many blank pages. Thus, the United States made continuous war on Indians
for the better part of a hundred years, always with its eye on the
prize: to take as much land as it could from them. The Mexican-American
War can hardly be seen in a different light. This is an old story told
well and critically by historians--a story Bennett would sugarcoat for
the young, with claims of an American destiny. Is that what the "vast
relearning" is to be about? Whether the United States had high moral
purpose or crass economic motives in employing violence and deceit does
not change the reality about the means used.

It should go without saying that there is a matter of supreme importance to Bennett on which I do agree. It is that there is no place for anti-Semitism in twenty-first-century civilization--whether it comes in the virulent form that has erupted among too many in Muslim nations or
whether it exists as a residue in American politics (peace to the memory
of Richard Nixon). But it's there, whether in the Middle East, Europe or
in American politics.

This anti-Semitism does not excuse Israel's foreign and military
policies, which put at risk the state of Israel, in my view; but Bennett
is among the staunchest of Israel's supporters. He says there is "an
understanding, almost religious in nature, that to our two nations above
all others has been entrusted the fate of liberty in the world." There
is a consistency in his view. He wants no appeasement of the Palestinians, seeking their subjugation and cautioning the Bush Administration accordingly; I suppose that weak fellow General Powell, in his concern to temper this ugly war, had better watch his step. Or maybe it's his back.

Here the prudent analyst might have learned something from Vietnam.
There was much pressure to remove the corrupt and seemingly feckless
Diem from his position. And after he was removed, with American backing,
the leadership structure of South Vietnam ended in turmoil. We may
expect the same to occur if the Israelis, with American concurrence,
manage to force into place among the Palestinians a Middle East version
of a puppet leader. Bennett's view of American foreign policy demands
that we look only at the depredations of Osama, Palestinian terrorists
and certain nations on his enemies list. He claims that he is interested
in objectivity, but he is unprepared or unwilling to look at those
issues that may or may not have salience. This has little to do with
good and evil, except as those words are used to obfuscate. The moral
asymmetry he assumes should be surrendered, so that the universal
standards Bennett says he is for can be applied to the United States as
well.

Another place of agreement between us is in Bennett's recognition that
through enormous struggle, the United States has sought to concretize
its shifting ideals of freedom and racial and economic justice into the
reality of everyday life. There are some exceptions, but there is little
to suggest that those who hold Bennett's views were the ones who were
part of the movements that changed the face of this nation into one that
others throughout the world admire for its freedoms. These struggles
were paid for dearly by the various social movements so the likes of
Bennett and me could live in relative comfort. It was not the
right--whether the ultramontane elements of Catholic hierarchy, Judge
Gary, J. Edgar Hoover, Joe McCarthy, Phyllis Schlafly, Antonin Scalia,
the George Bushes or William F. Buckley--that made this nation one that
championed "intellectual, moral and political freedom," to use the
philosopher A.E. Murphy's phrase.

But back to "why we fight" in international terms: Being a believing
Catholic, Bennett is concerned that "just war" be recognized as a
doctrine that has modern utility, one applicable to American reprisals. As ironic as it may appear, "just war" is a weak reed on which to hang a war without end. Just war is predicated on struggles
between nations; it is not a struggle between a gang and a nation. A
just war has a beginning, middle and end, and it is not supposed to do
more damage than the original harm. Bennett argues that the opinions of others (sometimes good to have) should in no way deter any unilateral action the United States--that is to say, those who control the reins of power--cares to take. Bennett has thus adopted just war as his
rationalization for militarism.

One last word. An American-initiated alternative must be offered to that
part of the world that is writhing in pain. It is one that gets rid of
weapons of mass destruction through general disarmament. (This includes
our own.) It is one that supports the pacific settlement of disputes. This means not the fashioning of imperial law but the expansion of international law. That the United States does not support the
International Criminal Court and has pulled out of various international
treaties is not a good sign for the United States or the world's future.
The alternative includes international economic rights, the buildup of
regional forces to act under the aegis of the UN Security Council,
massive health and economic assistance, and a system that makes clear
that intelligence is a feature of a free society--it is public property,
not that of the few or of the state. The alternative recognizes and
supports claims of plural cultures without undercutting in any way the
ideals and struggles that have defined human rights in the United
States, namely women's rights, civil liberties, civil rights, labor
rights, gender rights, environmental rights. It recognizes that
education, housing, religion, free inquiry and health are rights to be
expanded and cherished. This charge is not likely to be fulfilled by
calls for wars without end and claims of patriotism meant to mystify,
and worse.

For readers of this magazine and millions of other Americans, the
initial horror of September 11 was compounded by the sobering
realization that George W. Bush would be at the helm for the aftermath.
With a cabal of fundamentalists, crackpots and fascists whispering in
his ear, Dubya became the world's most dangerous weapon. Perhaps, we hoped, the rather low esteem in which he was held by the American people, the news media and much of Congress might save us.

No such luck. Congress and the mainstream media lined up behind him in
lockstep. Instances of his much-vaunted ignorance wound up on the
cutting-room floor. One cable network ran daily promos of Bush spurring
on World Trade Center rescue workers, declaring that he had "found his
voice" amid the rubble. Pundit Peggy Noonan declared Bush's post-9/11
speech to Congress no less than "God-touched"; he had "metamorphosed
into a gentleman of cool command...[with] a new weight, a new gravity."
Yet, despite the rise in his approval ratings, many harbored lingering
doubts about the extent to which a "new" Bush existed.

Among the many critical viewpoints drowned out in the wake of the
attacks was Mark Crispin Miller's The Bush Dyslexicon, the first
systematic critical examination of the President's mistakes,
misstatements and malapropisms. Fortunately, this clever volume has been
reissued with updated material on Bush's sayings and doings since that
time.

Bush's propensity for mangling the English language is no secret to
anyone. No doubt we all have our favorites, which we've gleefully shared
with friends, family, co-workers and comrades. Miller, a professor of
media ecology at New York University, has compiled what is clearly the
largest collection of Dubya-isms to date, among them these treats:

§ On his qualifications to be President: "I don't feel I've got all
that much too important to say on the kind of big national issues"
(September 2000); and "Nobody needs to tell me what I believe. But I do
need somebody to tell me where Kosovo is" (September 1999).

§ On coping with terrorism and other threats: "[We'll] use our
technology to enhance uncertainties abroad" (March 2000); and "We'll let
our friends be the peacekeepers and the great country called America
will be the pacemakers" (September 2000).

§ On Russia: "And so one of the areas where I think the average
Russian will realize that the stereotypes of America have changed is
that it's a spirit of cooperation, not one-upmanship; that we now
understand one plus one can equal three, as opposed to us, and Russia we
hope to be zero" (November 2001).

Miller vividly illustrates the depth of ignorance--as opposed to
stupidity--that leads this President away from direct contact with
journalists whenever possible. Miller also demonstrates that Bush's
"problem" with language is not easily separated from his "problem" with
policy and politics. If we focus exclusively on his stormy relationship
with proper grammar and logical sentence structure, Miller argues, we
risk underestimating what his presidency means for the United States and
the world. "Our president is not an imbecile but an operator just as
canny as he is hard-hearted.... To smirk at his alleged stupidity is,
therefore, not just to miss the point, but to do this unelected
president a giant favor."

Loosely organized by subject matter--"That Old Time Religion," "It's
the Economy, Your Excellency"--the book's chapters chronicle several
intertwined aspects of the chief executive: the politics of style that
characterize his behavior and demeanor; the media's role in crafting him
as a valid presidential candidate and, post-9/11, a changed man; the
Bush family's political legacy and troubled public image; and, finally,
the real meaning behind Dubya's flubs and gaffes.

Miller documents in detail how major news outlets have from the
beginning provided a heavily edited public transcript of Bush's
statements and have helped steer viewers away from his lack of policy
knowledge. Even more disturbing are the ways the media have simply
reported Bush's "ideas" without comment. Commenting on a Kansas
school-board vote to end evolution's exclusivity in the state science
curriculum (later overturned), for example, Bush declared, "I personally
believe God created the earth" (September 1999); later, he opined,
"After all, religion has been around a lot longer than Darwinism"
(September 2000).

The abundant evidence Miller provides of Dubya getting pass after pass
in the media seems particularly alarming. In addition to providing general "cover," Cokie Roberts, Sam Donaldson and other famed "journalists" and newspeople consistently let Bushisms fly with little or no comment. Note
this flub on the fate of Elián González's potential
citizenship during an airing of ABC's This Week:

Well, I think--I--It--listen, I don't understand the full ramifications
of what they're going to do. But I--I--I--think it'd be a--a--a
wonderful gesture. I guess the man c--the boy could still go back to
Cuba as a citizen of the United States.... I hadn't really thought about
the citizenship issue. It's an interesting idea, but if I were in the
Senate, I'd vote aye.

Roberts gave no response to the nonsensical Bush, nor did Chris Matthews
in this bizarre MSNBC Hardball episode in May 2000:

Matthews: When you hear Al Gore say "reckless, irresponsible," what do
you hear from him, really?...

Bush: I hear a guy who's not confident in his own vision, and,
therefore, wants to take time tearing me down. Actually, I--I--this may
sound a little West Texan to you, but I like it when I'm talking about
what I'm--what I--

Matthews: Right.

Bush:--when I'm talking about myself, and when he's talking about
myself, all of us are talking about me.

Matthews: Right.

Of course, these snippets pale in comparison to the alacrity with which
the media papered over the fact that our current President was not
elected by a majority of the populace.

This is quite a contrast to the dis-ease with which the fourth estate
treated Bush's predecessors. Miller traces the phenomenon back to
Richard Nixon, whom he calls the "godfather" of Bush-era politics. Like
Bush, Nixon was not a man well liked by the television cameras; nor, as
the White House tapes reveal, was he an especially enlightened man, with
his pedestrian literary interpretations, paranoid hatred of Jews,
virulent racism, sexism and homophobia. "You know what happened to the
Greeks!?" Nixon bellowed to Haldeman and Ehrlichman: "Homosexuality
destroyed them. Sure, Aristotle was a homo." Nixon's angry and, as
Miller describes it, "low-born" personality manifested itself throughout
his televisual life, particularly during the scandal that brought down
his presidency.

Inheriting this image problem was Dubya's patriarch, George Bush senior,
who not only worked for Nixon politically but also shared in his
televisually and verbally handicapped style. Whereas Nixon came off as a
classless bully, Bush suffered from sissiness, the infamous Wimp Factor:
"Bush's posh class background was his major TV problem, the cameras
mercilessly outing the big pantywaist within.... In fact, the Bush clan,
although fabulously wealthy, is not aristocratic enough to do well on
TV, if by that modifier we mean elegant and polished. First of all, the
Bushes often have let fly in the most boorish way--as when Barbara Bush
hinted coyly that Geraldine Ferraro was a 'bitch.'"

In an effort to analyze Bush Sr.'s wanna-be aristocratic demeanor,
Miller proceeds to call him a "Yalie faggot" and argues that the Bush
family's privilege put the elder Bush in the toughest of spots relative
to his macho Republican predecessors. On losing a straw poll in Ames,
Iowa, for example, Bush noted, "A lot of people who support me were at
an air show, they were off at their daughter's coming-out party, they
were teeing up at the golf course." Miller makes it abundantly clear how
frequently Bush Sr. not only missed, but miscalculated, the mark.

The point is that on television, class is not an economic issue but a
style issue. Given what Miller terms the Kennedy "savoir-faire," the
Bush family is at a distinct image disadvantage. Unfortunately, Miller
frequently analogizes Bush's moneyed privilege with a certain kind of
homosexuality--offensive behavior in a critic himself trying to "out"
Nixon's ignorance and homophobia. And he contends that Barbara's complaining of another woman's bitchiness is somehow anathema to aristocratic behavior.

At root, these strangely aristocratic cheap shots smack of a kind of
backhanded liberal Kennedy worship. It is impossible to miss the
implication that America's royal family is the standard-bearer of
sufficiently presidential (read: aristocratic and classy) demeanor.
Given that JFK was an ethically challenged, commie-hunting political
lightweight, Miller's willingness to engage in macho class snobbery
points to the disturbing presence in the book of a crass partisanship
better suited to a Democratic media flack than a scholar of the left.

Symptomatic of this is the fact that for much of the book Miller seems
to forget the high degree of political convergence between Bush and
neoliberal New Democrats like Al Gore. One cannot help wondering if
Miller thinks a Gore Administration would not have responded to
September 11 with military action, and with legislation that expanded
the already egregious powers given the government in the
Clinton-sponsored Counter Terrorism Initiative of 1995. This see-no-evil
quality of the book is all the more telling because it represents the
very type of amnesia that Miller says afflicts us all after years of
corporate-led media idiocy. When he harps on Clinton's downfall at the
hands of the right without sufficiently stressing Bill's own
never-ending rightward shift throughout his eight years in office, one
wonders if Miller's own political memory lapsed from 1992 to 2000. It is
not until near the end of the book that he relents and concedes Al Gore's rather striking resemblance to a war-happy Republican candidate,
as Gore "spoke more expertly, but just as deferentially, straining to
out-hawk the jut-jawed W, arguing that he would raise the military
budget even higher and retrospectively saluting the preposterous
invasions of Grenada and Panama."

Finally, Miller's critique of the "politics of style" turns in upon
itself. Miller obtains the lion's share of Bushisms from precisely those
style-obsessed media outlets he accuses of bringing down Clinton and
building up Bush: the New York Times, Talk,
Glamour, 20/20 and Larry King Live appear all over
Miller's source citations, and he is just as dependent on, and dedicated
to, the politics of style as they are. At the end of the book, one
cannot help suspecting that Miller's beef with the politics of style is
that it took down his guy while it has yet to take down the other guy.

This hedging makes crucial parts of the book read like sour grapes and
detracts from the moments of sharp observation that Miller offers
elsewhere. He clearly grasps the very real danger of the Bush
Administration--his most intriguing observation is that Bush is not
always a rhetorical bumbler. As Miller conducts his repeated dissections
of various Bushisms, it becomes clear that this man is in fact possessed
of considerable guile. In an interview with Charlie Rose, in August
2000, Bush speaks about Saddam Hussein:

Rose: OK. What if you thought Saddam Hussein, using the absence of
inspectors, was close to acquiring a nuclear weapon?
Bush: He'd pay a price.
Rose: What's the price?
Bush: The price is force, the full force and fury of a reaction.
Rose: Bombs away?
Bush: You can just figure that out after it happens.

Here we see Dubya apparently willing and even eager to bomb a country
with which we are not at war--yet. Two years before the recent
enunciation of a "Hitting First" policy of pre-emption and even more
recent revelations of an existing attack plan from land, sea and air,
Bush's warring language was unambiguous. Likewise, when speaking of
anger and vengeance post-9/11, he is nothing if not clear, and his
dyslexic tendencies are nowhere in evidence. Down-homish and cringe-inducing though it may be, "evildoers" is a word whose meaning is singular, and Bush's repeated use of it has not been subject to the usual emendations or "clarifications" of his handlers. Similarly, Bush famously threatened to "smoke 'em out" of their holes, another inappropriate, unpresidential phrase; yet no one was confused about what it meant for Al Qaeda.

The Bush Dyslexicon makes it clear that even after the 11th of
September, Bush's personality was far from "God-touched" or even
transformed; in fact, provided with the opportunity to inflate his
defense budget, savage Social Security and go after the Taliban as if in
a 'coon hunt, Bush was just this side of gleeful at the prospect of
revenge. Hardly had the mourning American public time to collect itself
before Dubya encouraged the military to "smoke 'em out of their caves,
to get 'em runnin' so we can get 'em" in order, as Bush himself put it,
to "save the world from freedom."

Given the potentially dire consequences of Bush's post-9/11 policy
agenda, though, it seems strangely incongruous that Miller so often goes
for the breezy, snappy rhetoric and eschews a more forthrightly
analytical tone. It may be therapeutic to laugh in the face of danger,
but somehow these do not seem to be particularly funny times.

A half-century ago T.H. Marshall, British Labour Party social theorist,
offered a progressive, developmental theory for understanding the
history of what we have come to call citizenship. Taking the experience
of Englishmen to define the superior path, he postulated a hierarchy of
citizenships: civil rights, political rights and social rights. The last of these became the
category in which twentieth-century Europeans have understood claims on
the state to health, welfare, education and protection from avoidable
risk. They conceived of these citizenships as stages in an upward climb
toward an ever better democracy.

Marshall's schema looked only at European men. Feminists have pointed
out that women did not achieve citizenship in this order. In fact, women
often won some social rights--for example, protective legislation and
"welfare"--before achieving political ones such as the right to vote.
And women's individual civil rights were often overwhelmed and even
suppressed by legally imposed family obligations and moral sanctions.
(For example, a century ago courts generally interpreted the law of
marriage to mean that women were legally obligated to provide housework,
childcare and sexual services to husbands.) Equally problematic were
Marshall's obliviousness to British imperialism and what it meant for
Third World populations, including the fact that he conceived of the
British as civilizers rather than exploiters, and his apparent ignorance
of the conditions of second-class citizenship for racial/ethnic
subordinates within nation-states. In short, his historical hierarchy
was highly ideological.

But no one has yet done what Alice Kessler-Harris has in her newest
book, In Pursuit of Equity, reaching beyond Marshall and his
critics to suggest a new concept, economic citizenship. In this history
of how women have been treated in employment, tax and welfare policy,
Kessler-Harris--arguably the leading historian of women's labor in the
United States--synthesizes several decades of feminist analysis to
produce a holistic conception of what full citizenship for women might
entail. In lucid prose with vivid (and sometimes comic) illustrations of
the snarled thinking that results from conceiving of women as
dependents--rather than as equal heads of families--she offers a vision
of how we can move toward greater democracy. In the process, she also
shows us what we are up against. Her book illustrates brilliantly how
assumptions about appropriate gender roles are built into all aspects of
policy.

She aims to resolve what is perhaps the central contradiction for
policy-makers and policy scholars who care about sex equality: the
contradiction between, on the one hand, valuing the unpaid caring work
still overwhelmingly performed by women and, on the other hand, enabling
women to achieve equality in wage labor and political power. Today, for
example, although all feminists oppose the punitive new requirements of
the policy that replaced Aid to Families with Dependent Children,
repealed in 1996, they are divided about what would constitute the right
kind of welfare system. Some find it appropriate that all adults,
including parents of young children, should be employed, assuming they
can get a living wage and good childcare. Others, often called
maternalists, believe a parent should have the right to choose full-time
parenting for young or particularly needy children. Behind this difference lie two different visions of
sex equality--one that emphasizes equal treatment of the sexes and individual rights
and responsibilities, another that seeks to make unpaid caring labor,
notably for the very young, the old and the ill, as honorable and valued
as waged labor.

Kessler-Harris would resolve this contradiction through a labor-centered
view of citizenship, a notion of economic citizenship based on equity,
or fairness, in the valuation of socially worthy labor. Previously, the
policy proposal closest to this principle of equity was "comparable
worth." Second-wave feminists saw that the Equal Pay Act of 1963 and
Title VII of the Civil Rights Act of 1964 had failed to equalize male
and female wages. Because the labor force is so segregated, and female
jobs are so consistently undervalued, equal pay alone cannot produce justice for women (or men of color). The comparable-worth strategy called for equal wages for work of comparable expertise and value, even when the jobs differed. Consider, for example, the wage gap between truck drivers and childcare workers: Truck drivers earned much more even than registered nurses, whose training and responsibilities were so much greater. The women's movement's challenge to inequality in jobs took off
in 1979, when Eleanor Holmes Norton, then head of the Equal Employment
Opportunity Commission, called for evaluations of job skills to remedy
women's low wages. But her successor, Clarence Thomas, refused to
consider comparable-worth claims. Although some substantial victories
were achieved in state and union battles--for example, the American
Federation of State, County and Municipal Employees (AFSCME) won wage
increases averaging 32 percent and back pay retroactive to 1979 for
Washington State employees, 35,000 of whom shared a $482 million
settlement--the comparable-worth campaigns faded in the 1980s.

But even had the comparable-worth strategy been adopted, it could not
have recognized the hours spent in caring for children, parents,
disabled relatives and friends, not to mention the work of volunteering
in underfunded schools, cooking for homeless shelters, running kids'
basketball teams. Kessler-Harris is arguing for a citizenship that
respects unpaid as well as paid labor.

She has worked out the arguments in this book systematically over many
years. Several years ago, an article of hers with the deceptively simple
title "Where Are All the Organized Women Workers?" enlarged the
understanding of gendered "interests" from an exclusive focus on women
to take in men as well. She demonstrated that so long as men dominate,
aspirations understood and characterized as class interests often
express gender interests equally strongly. She uncovered how unions
often operated as men's clubs, built around forms of male bonding that
excluded women, primarily unconsciously but often consciously, too. In
this new book she extends her analysis of men's gendered interests to reveal how labor unionists' inability to stop defending the privileges of masculinity has held back labor's achievements. One vivid example
was unions' opposition to state-funded welfare programs and
health-and-safety regulation, stemming from anxiety that they would
deprive workers of their manly independence. Of course, unionist
resistance to state control over workplace and work-centered programs
also derived from a defense of workers' control. But this vision of
workplace democracy was inextricably masculinist, and workingmen's
understanding of their dignity rested on distinguishing themselves from
women.

In A Woman's Wage, Kessler-Harris showed that both Marxist and
neoclassical economics were mistaken in their joint assumption that the
wage was somehow a consistent, transparent token of the capital/labor
relation. By contrast, wage rates, wage systems, indeed the whole labor
market were constructed by gender interests and ideology as well as by
supply and demand or surplus value or the actual cost of subsistence. A
wonderful example from her new book: The Hawthorne experiments of the
late 1920s have been interpreted to show that women workers were more
tractable than men. In one study, a group of women workers adapted more
cooperatively and quickly to a speedup than did a group of male workers.
In seeking to explain this behavior, investigators examined the women's home lives and even their menstrual cycles. They paid no particular attention to the wage structures the employer had imposed: the women were paid a collective rather than individual wage, so higher productivity could only increase their total wages, whereas the men's piece-rate structure offered no such guarantee; in fact, the men had reason to expect that the piece rate would be lowered if they speeded up. We see here not a
"natural" gendered difference arising informally from culture and
socialization, but female and male workers responding rationally to a
gendered system imposed by employers.

In Pursuit of Equity argues that no one can enjoy civil and
political rights without social and economic citizenship. The gradual expansion of civil and political rights that Marshall described not only excluded many groups but actually strengthened women's exclusion from citizenship. One fundamental premise of democratic capitalism--free
labor--was never fully extended to all women, whose labor was often
coercively regulated, not only by husbands but by the state.
Kessler-Harris shows how free labor developed in tandem with the "family
wage" ideal, that is, that husbands/fathers should earn for the entire
family and that women's destiny was domestic unpaid labor. The correlate
was that men "naturally" sought economic and social independence while
women "naturally" sought dependence. Ironically, most feminists of the
nineteenth century went along with this dichotomy and tried to root
women's citizenship in their essential family services rather than in
the free-labor definition of independence. That is, they argued for
rights on the basis of women's spiritual and material work in unpaid
caretaking labor.

The book demonstrates particularly effectively how the dominant modern
gender system--the family-wage norm--made it difficult for women to
become full citizens. In one closely documented section, Kessler-Harris
exposes the condescending and defensive assumptions of those who drafted
the Old Age Insurance program (which later became Social Security). The
drafters agreed, for example, that the widow of a covered man with young
children should be able to receive three-quarters of his pension until
she remarried or the children reached 18. A widow without children
lacked any rights to her husband's pension. But if this pension was her
husband's by right, as the designers insisted, then why were his heirs not entitled to all of it, as they were to the rest of his property? If
the widow remarried, she would not have to give up the bank account or
house or car he had left her--why should she give up a Social Security
pension? One Social Security drafter argued that retaining such an
annuity after remarriage would make widows "a prize for the fellow that
has looked for it," assuming that women are entirely passive in marriage
decisions! The drafters were all convinced that "once a woman was no
longer dependent on the earnings of a particular male (dead or
alive)...his support for her should cease." In other words, his status
as breadwinner should continue even after his death. The drafters
rejected the idea of granting all widows of covered men an equal stipend
or one based on the number of children. It was important for her
benefits to be calibrated to his earnings so as to feed "the illusion
that families deprived of a father or husband would nevertheless
conceive him...as a continuing provider." "Why should you pay the widow
less than the individual himself gets if unmarried?" Because "she can
look after herself better than he can." Imagining women as less capable
of handling money than men, the designers removed the option of a
lump-sum benefit to widows, requiring them, unlike men, to receive
monthly stipends. To avoid "deathbed marriages," they allowed a widow to
collect only if she had been married and living with her husband for at
least a year before he died.

The concern with male status was reflected particularly comically in
discussions about the age at which a wife could start to receive her
share of her husband's benefits. Some argued for an earlier "retirement"
age for women because if both men and women were eligible at 65, this
would mean that men with younger wives--a common phenomenon--might not
get their full pension for a number of years after they retired. But
others argued that since men who married much younger women were more
likely to be those who had married more than once, granting women an
earlier retirement date might reward these men over single-marriage
retirees.

Several decades ago economist Heidi Hartmann pointed out that patriarchy
was as much a system of power and hierarchy among men as a male-female
relation, and Kessler-Harris confirms that insight. For example, the
entire debate about whether married couples should be able to report
separate incomes for IRS purposes concerned the inequalities this would
create between men with employed wives and men with nonemployed wives.
Fairness to women was not a prominent concern. The fact that employed
women's old-age insurance benefits were restricted according to their
marital status while men's weren't "did not seem like sex discrimination
[to the Social Security designers] but rather like equity to men."

At the core of In Pursuit of Equity is the understanding that
what is "fair" is historically changing. The problem we face today is
not that men deliberately built policies to subordinate women but that
when our basic economic policies were established, men and women alike
tended to see male breadwinning and female domesticity as "fair." That
standard is far, far from reality today. One result is a double standard
in which supposedly ideal family life, requiring a full-time mother, is
a privilege of wives of high-earning husbands.

In the United States, the resultant damage is worse than in Europe,
because here many fundamental aspects of citizenship flow from the labor
market. "Independence" today is generally defined as earning one's
living through wages, despite the fact that the resulting dependence on
employers leaves workers as vulnerable, if not more vulnerable, than
dependence on government stipends. Social rights vital for survival,
such as medical insurance, retirement pensions and workers'
compensation, typically derive from employment in this country, in
contrast to most developed countries, which provide such help as a
matter of right to all citizens or residents. This is one way in which
American wage workers, as Kessler-Harris says, were "in a different
relationship to the constitution than those who did care-giving work."
As a result, the development of democratic capitalism, and even the growth of working-class power, in some ways failed to strengthen women's economic citizenship and in others actually weakened it. Indeed, she shows how victories against
sex discrimination in the labor force in the 1960s inadvertently
confirmed the assumption that all women could and should work for wages,
thereby contributing to the repeal of welfare without creating the
conditions that would make it possible for poor women to support
themselves through employment.

This gendered citizenship became more visible and more obnoxious to
women as wage-earning became the female norm and as "alternative
families" gained political clout. For example, if every individual was
entitled to an old-age pension and unemployment compensation, we
wouldn't have to struggle about the inheritance rights of gay partners
or stay-at-home parents' need for support. Even today, banning sex
discrimination is difficult because it is difficult to get agreement on
what constitutes discrimination. In a few cases division among feminists
has held back the struggle. Kessler-Harris ends the book with a brief
reprise of EEOC v. Sears, Roebuck & Co., a 1980s marker of
this division and a case in which she herself played a significant role.
Sears admitted that very few women held any of its well-paying
commission sales jobs but argued that women were not interested in these
jobs because the positions were competitive, pressured, demanding.
Another historian of women testified for Sears against the women
plaintiffs, using her expertise to argue that women's primary attachment
to unpaid domestic labor led them to want only jobs which did not
conflict with it. Her arguments illustrated vividly the continuing
influence of this emphasis on male/female difference, not necessarily as
"natural" or essential but nevertheless beyond the appropriate scope of
legal remedy. Sears won the case.

There is one pervasive absence in Kessler-Harris's book--race--and the
omission weakens the argument substantially. Her understanding of how
the family-wage ideal works would have to be substantially complicated
if she made African-American women more central, for they were rarely
able to adopt a male breadwinner/female housewife family model and often
rejected it, developing a culture that expects and honors women's employment more than white culture does. Mexican-American women's experience
did not fit the family-wage model either, despite their reputation as
traditional, because so many have participated in agricultural and
domestic wage labor throughout their lives in the United States. Equally problematic for the argument, prosperous white women who accepted the
family-wage model often didn't do unpaid domestic labor because they
hired poor immigrants and women of color to do it for low wages. These
different histories must affect how we envisage a policy that recognizes
labor outside the wage system, and they need to be explored.

One aspect of Kessler-Harris's economic citizenship concept is being
expressed today by progressive feminists trying to influence the
reauthorization of Temporary Assistance for Needy Families (TANF), the
program for poor children and their parents that succeeded AFDC. We are
pushing a House bill that would recognize college education and
childcare as work under the new welfare work requirements. This book is
a sustained argument for that kind of approach and should help it become
part of the policy discussion. It probably won't win. Some will call it
unrealistic. But today's policies are already wildly unrealistic, if
realism has anything to do with actual life. If we don't begin now to
outline the programs that could actually create full citizenship for
women, we will never get there.
