Kevin Phillips argues that the dynastic aspect of wealth in the US has been growing fast, Mark Hertsgaard reports on why both sides are avoiding the truth on global warming, and Katha Pollitt examines the Convention on the Elimination of All Forms of Discrimination Against Women.
GOULD & SCIENCE FOR THE PEOPLE
In his excellent June 17 piece on Stephen Jay Gould, John Nichols
mentions the Science for the People movement and our involvement in it,
and by implication incorrectly places Steve and me in leading roles.
Neither Steve nor I was a founder of Science for the People, nor were we
in any sense leading actors in it. True, we did each write an occasional
piece for the Science for the People Magazine and were members of
SftP study groups--for example, the Sociobiology Study Group--and we
each appeared at some SftP public functions and press conferences and
helped write some of its public statements. We were, however, much less
responsible and active in the movement than many others who devoted
immense amounts of time and energy to it and who kept it going for so long.
It is important to understand the nature of the Science for the People
movement. It came out of the anti-elitist, anti-authoritarian movement
of the 1960s and was committed to participatory democracy and lack of
central organization. Like many others, Steve and I separately became
adherents of the movement precisely because of its anti-elitism and
participatory nature, as well as for its political orientation. We all
struggled very hard to prevent those outside it from picturing it
falsely and conventionally as being composed of leading persons and
their allies. If, despite everyone's best efforts, there were some
people who from time to time were forced into leading roles, Steve and I
were never among them.
TOUGH LOVE FOR ISRAEL
Philadelphia; New York City
Liza Featherstone in "The Mideast War Breaks Out on Campus" [June 17]
mentions a number of Jewish groups critical of Israeli policy in the
occupied territories, including Rabbinical Students for a Just Peace,
the group of 108 students from seven rabbinical seminaries (not only the
Jewish Theological Seminary, as indicated in the article) who recently
sent a letter asking American Jewish leaders to recognize the suffering
of the Palestinians and to support the creation of a viable Palestinian state.
As two of the organizers of this letter, we wish to clarify that our
goal is both, as Featherstone indicates, to be "outspoken critics of
Israeli policy" and to support Israel's right to a secure existence
within its pre-1967 borders. Discussion of the Israeli-Palestinian
conflict generally suffers from a lack of nuance. Both pro-Israel and
pro-Palestine activists routinely vilify the other and ignore the
mistakes and abuses committed by those they support.
As future rabbis who have spent significant time living in Israel, we
speak out of deep love for Israel and concern for Israel's continued
security. We are committed to creating a Zionist, pro-Israel voice
willing to criticize Israeli policy, out of a desire to guarantee
Palestinians the right to live in dignity in their own state, and to
insure the security of Israel. Our views may appear radical within the
context of an American Jewish community that offers unqualified support
for the Israeli government, but they are in no way inconsistent with the
mainstream Israeli political debate, which has always allowed for a
greater range of opinion than does the US pro-Israel community.
SHOSHANA LEIS GROSS
DO WHAT MEN DO: HAVE IT ALL
I agree with Katha Pollitt that being childless can be as voluntary a
choice for women as for men ["Subject to Debate," May 13] and that we
sometimes make choices "unconsciously" by giving a goal a low priority
and then getting to the point where it is no longer achievable. But I'd
like to make one point: Successful, high-achieving women might consider
the "marriage strategy" of successful, high-achieving men. If you want a
fulfilling marriage and a high-powered career, choose a spouse
who is willing to put your career ahead of theirs--someone who loves you
enough to "hitch their wagon to your star."
Men have always felt free to marry for love and emotional support and to
choose women younger, poorer and less educated than themselves. Women
could broaden their "eligibility pool" in a similar way.
JOHN F. BRADLEY
RAWA IN THE USA & AFGHANISTAN
We applaud Jan Goodwin's "An Uneasy Peace" [April 29] on the perilous
situation for Afghan women and the crucial need for basic security.
However, we were dismayed by her characterization of the Afghan women's
organization RAWA as having "garnered considerably more publicity in the
United States than it has credibility in its own country." Both sides of
this comparison are oversimplified and dangerously misleading.
RAWA (www.rawa.org), an indigenous organization founded in 1977, has
indeed become better known in recent years, but not only in the United
States, and not for superficial reasons (as Goodwin suggests by setting
"publicity" against "credibility"). Rather, RAWA's website (since 1997)
and its dogged work for humanitarian relief, underground education and
documenting fundamentalist atrocities have broadened its international recognition.
Goodwin's statement also implies that RAWA lacks credibility in
Afghanistan. Certainly, jihadis, Taliban and other extremists will say
RAWA members are whores and communists, because they oppose RAWA's goals
(e.g., secular democratic government) and very existence. Among Afghan
refugees, however, RAWA is said by many to be one of the few
organizations that keeps its promises and is respected because it is
Afghan and has remained active in Afghanistan across two decades of
conflict. People in both Afghanistan and Pakistan speak highly of its
schools, orphanages, hospital, income-generating projects and views.
However, many inside Afghanistan do not know when they have benefited
from RAWA's help, since threats and persecution have made it impossibly
dangerous for RAWA to take credit for much of its work.
This is indeed a pivotal moment for human rights in Afghanistan,
including women's rights. It would therefore be a grave mistake to
misrepresent a major force advancing these goals: RAWA is,
unfortunately, the only independent, pro-democracy, humanitarian and
political women's grassroots organization in Afghanistan.
As a factual correction, while Sima Samar is a former member of RAWA,
she was not among the founders.
ANNE E. BRODSKY
New York City
Concerning RAWA's credibility, I was surprised that Anne Brodsky, who
was handling press and helping to host the RAWA representative during
her tour of the United States last fall, failed to disclose that connection.
Western feminists may be able to identify with what RAWA has to say, but
as I mentioned in my article, the group lacks credibility and acceptance
in its own country. Part of its marginalization has to do with its
inability to make alliances with other Afghan organizations of any
stripe. RAWA is also not the only humanitarian and political women's
organization in Afghanistan, and to suggest so is to insult the many
Afghan women who have risked their lives to work in these arenas through
twenty-three years of conflict. Sima Samar was indeed a founding member
of RAWA but since breaking with the organization some years ago has been
disavowed by them.
A GEORGE OF A DIFFERENT COLOR
Thank you, thank you, thank you! Senator McGovern's "Questions for Mr. Bush" [April 22] speaks to my heart.
Bravo! We do have fascist madmen in the White House, and phrases like
"Axis of Evil" and "War on Terrorism" are going to be the end of us. I
am relieved that there are still intelligent men in the world working
for the good.
Melrose Park, Pa.
I voted for George McGovern in 1972, but I cannot agree with some of
the views in his editorial. He wonders if the Bush Administration's
bunker mentality suffers from paranoia, if the Bush team has become
obsessed with terrorism and if terrorism may replace Communism "as the
second great hobgoblin of our age." These questions reflect a deep
skepticism about the severity of the threat from Al Qaeda, a skepticism
shared by many writers for The Nation and close to denial in its
pervasiveness. Millions of other Americans, however, realized soon after
September 11 that our immense infrastructure is vulnerable precisely
because it is so large and diverse. Dams, bridges, tunnels, 103 nuclear
reactors, airports--all these and more must now be guarded against attack.
Senator Ted Kennedy has co-sponsored funding for measures against
bioterrorism, while Senators Tom Harkin, Carl Levin and Paul Sarbanes
have chaired major hearings. Gary Hart chaired a commission two years
ago that warned of attacks such as September 11. These former colleagues
of Senator McGovern appear to believe that the terrorist threat is not a
hobgoblin, but all too real.
George McGovern was my hero when he ran for the presidency, oh so many
years ago. A more decent and capable man would be hard to imagine. The
weakness in his bid may, in fact, have been his honesty and
kindness--commodities not in much demand in a system that worships money
and power. McGovern argues for the nexus of poverty, oppression and
violence. He is far too generous in giving the Bush team the benefit of
the doubt that they will learn on the job and improve policies. I
started with Truman, and in my lifetime the presidency has never been
occupied by a smaller figure.
J. RUSSELL TYLDESLEY
I so wish George McGovern were our President right now.
CLOSE, BUT NO CIGAR
If Fidel Castro rises to George W. Bush's challenge to hold "a real
election" and "to count [the] votes" ["In Fact...," June 10], will Bush
also challenge him to figure out a way to take office even if the people
don't elect him?
"Arguing with intelligence, a massive array of facts and a sly wit, Sifry claims that our two-party system is a 'duopoly' that decisively dictates national politics through control of federal money and does not reflect the views or needs of many Americans." --Publishers Weekly, on
Micah L. Sifry's
Spoiling For a Fight: Third-Party Politics in America.
Readers of Andrew Sullivan's website may have noticed a series of items
about my piece "Attack of the Homocons," which appeared in The
Nation's July 1 issue.
The Africa trip of Treasury Secretary Paul O'Neill and Irish rock star
Bono produced a bumper harvest of photo ops and articles about aid to
Africa. Unfortunately, media coverage was mired in the perennial and
stale aid debate: Should we give more? Does it work?
If the O'Neill-Bono safari resulted in Washington finally paying more of
its proper share for global health, education and clean water, that
would be cause for applause. But any real change requires shifting the
terms of debate. Indeed, the term "aid" itself carries the patronizing
connotation of charity and a division of the world into "donors" and "recipients."
At the late June meeting in Canada of the rich countries known as the
G8, aid to Africa will be high on the agenda. But behind the rhetoric,
there is little new money--as evidenced by the just-announced paltry sum
of US funding for AIDS--and even less new thinking. Despite the new
mantra of "partnership," the current aid system, in which agencies like
the World Bank and the US Treasury decide what is good for the poor,
reflects the system of global apartheid that is itself the problem.
There is an urgent need to pay for such global public needs as the
battles against AIDS and poverty by increasing the flow of real
resources from rich to poor. But the old rationales and the old aid
system will not do. Granted, some individuals and programs within that
system make real contributions. But they are undermined by the negative
effects of top-down aid and the policies imposed with it.
For a real partnership, the concept of "aid" should be replaced by a
common obligation to finance international public investment for common
needs. Rich countries should pay their fair share based on their
privileged place in the world economy. At the global level, just as
within societies, stacked economic rules unjustly reward some and punish
others, making compensatory public action essential. Reparations to
repair the damage from five centuries of exploitation, racism and
violence are long overdue. Even for those who dismiss such reasoning as
moralizing, the argument of self-interest should be enough. There will
be no security for the rich unless the fruits of the global economy are
shared more equitably.
As former World Bank official Joseph Stiglitz recently remarked in the
New York Review of Books, it is "a peculiar world, in which the
poor countries are in effect subsidizing the richest country, which
happens, at the same time, to be among the stingiest in giving
assistance in the world."
One prerequisite for new thinking about questions like "Does aid work?"
is a correct definition of the term itself. Funds from the US Agency for
International Development or the World Bank often go not for economic
development but to prop up clients, dispose of agricultural surpluses,
impose right-wing economic policies mislabeled "reform" or simply to
recycle old debts. Why should money transfers like these be counted as
aid? This kind of "aid" undermines development and promotes repression
and violence in poor countries.
Money aimed at reaching agreed development goals like health, education
and agricultural development could more accurately be called
"international public investment." Of course, such investment should be
monitored to make sure that it achieves results and is not mismanaged or
siphoned off by corrupt officials. But mechanisms to do this must break
with the vertical donor-recipient dichotomy. Monitoring should not be
monopolized by the US Treasury or the World Bank. Instead, the primary
responsibility should be lodged with vigilant elected representatives,
civil society and media in countries where the money is spent, aided by
greater transparency among the "development partners."
One well-established example of what is possible is the UN's Capital
Development Fund, which is highly rated for its effective support for
local public investment backed by participatory governance. Another is
the new Global Fund to Fight AIDS, Tuberculosis & Malaria, which has
already demonstrated the potential for opening up decision-making to
public scrutiny. Its governing board includes both "donor" and
"recipient" countries, as well as representatives of affected groups. A
lively online debate among activists feeds into the official deliberations.
Funding for agencies like these now comes from "voluntary" donor
contributions. This must change. Transfers from rich to poor should be
institutionalized within what should ultimately be a redistributive tax
system that functions across national boundaries, like payments within
the European Union.
There is no immediate prospect for applying such a system worldwide.
Activists can make a start, however, by setting up standards that rich
countries should meet. AIDS activists, for example, have calculated the
fair contribution each country should make to the Global AIDS Fund (see
Initiatives like the Global AIDS Fund show that alternatives are
possible. Procedures for defining objectives and reviewing results
should be built from the bottom up and opened up to democratic scrutiny.
Instead of abstract debates about whether "aid" works, rich countries
should come up with the money now for real needs. That's not "aid," it's
just a common-sense public investment.
The $4.4 million damages award in June against FBI agents and Oakland
police for violating the constitutional rights of environmental
activists Darryl Cherney and Judi Bari, wrongly accused of terrorism in
1990, represents more than the culmination of a twelve-year struggle for
vindication. The case also highlights the risks of today's antiterrorism
measures and offers lessons both daunting and encouraging about the pursuit of accountability.
In May 1990, an explosion tore through the car carrying Earth First!
organizers Bari and Cherney. Bari suffered a fractured pelvis; Cherney,
less serious injuries. They assumed the bombing was the work of
antienvironmentalists, meant to disrupt planning for the Redwood Summer
of civil disobedience against the logging of old-growth forest.
The FBI Joint Terrorist Task Force jumped to quite a different
conclusion. As soon as Bari and Cherney were identified, the FBI
informed the local police and leaked to the press that the pair were
terrorists. The authorities claimed that Bari must have made the bomb
herself and that it had accidentally exploded while the two were
carrying it to an unknown target. Bari was placed under arrest in her
hospital bed. Police and FBI agents searched houses in Oakland where
Bari and Cherney had stayed and questioned their fellow activists. Over
the next two months, until the government announced it would not charge
the two environmentalists, the local police and the FBI continued to
call them terrorists.
Only after years of litigation did the truth emerge: The FBI, before the
bombing, had been investigating Bari and Cherney because of their
political activism. When the bomb went off, the FBI shaded the facts to
fit an ideological presumption of guilt. It was also revealed that the
FBI, even after Bari and Cherney had been cleared, collected data
nationwide on hundreds of individuals and groups merely on the basis of
their association with the two Earth First! activists.
The case demonstrates how the truth will come out when the judiciary
fulfills its constitutional role. With patience, skill and funding,
committed activists and lawyers can bring accountability to the FBI.
Just as Bari and Cherney won, just as the secret evidence cases brought
after the 1996 antiterrorism law melted in the face of judicial
challenges, so the material witness detentions and other rights
violations of today will ultimately be held unconstitutional. But the
FBI and the Justice Department will resist oversight and use secrecy and
delaying tactics to evade accountability, prolonging personal and
political damage. Justice was too late for Judi Bari. She died of cancer in 1997.
The most sobering lesson of the Bari-Cherney case may be this: The FBI's
focus on politics over hard evidence meant that the real bomber was
never captured. In the same way, the Attorney General's recent
announcement that the FBI can monitor meetings and groups with no prior
suspicion of criminal conduct is likely to take the FBI down the path of
investigations based on politics, ethnicity or religion, while real
terrorists escape detection.
The journalist I.F. Stone used to joke that the government issues so
much information every day, it can't help but let the truth slip out
every once in a while. The Bush Administration's recent report on global
warming is a classic example. Though far from perfect, it contains some
crucial but awkward truths that neither George W. Bush nor his
environmentalist critics want to confront. Which may explain why the
Administration has sought to bury the report, while critics have
misrepresented its most ominous conclusion.
U.S. Climate Action Report 2002 made headlines because it
contradicted so much of what the Administration has said about global
warming. Not only is global warming real, according to the report, but
its consequences--heat waves, water shortages, rising sea levels, loss
of beaches and marshes, more frequent and violent weather--will be
punishing for Americans. The report's biggest surprise was its admission
that human activities, especially the burning of oil and other fossil
fuels, are the primary cause of climate change. Of course, the rest of
the world has known since 1995 that human actions have "a discernible
impact" on the global climate, to quote a landmark report by the United
Nations Intergovernmental Panel on Climate Change. But the White House
has resisted this conclusion. After all, if burning fossil fuels is to
blame for global warming, it makes sense to burn less of them. To a
lifelong oilman like Bush, who continues to rely on his former industry
colleagues for campaign contributions as well as senior staff, such a
view is nothing less than heresy.
No wonder, then, that Bush and his high command have virtually
repudiated the report. Although their staffs helped write it, both EPA
Administrator Christine Todd Whitman and Energy Secretary Spencer
Abraham claimed they were unaware of the report until the New York
Times disclosed its existence on June 3. Bush himself dismissed it
as a mere product of "the bureaucracy," that oft-vilified bogeyman of
right-wing ideology. But he could equally have blamed his own father.
The only reason U.S. Climate Action Report 2002 was compiled in
the first place is that George Bush the First signed a global warming
treaty at the 1992 Earth Summit that obligates the United States to
periodically furnish such reports to the UN (one more reason, it seems,
to despise treaties). But somebody in the Administration must have seen
trouble coming, because the report could not have been released with
less fanfare: It was simply posted on the EPA's website, three unguided
links in from the homepage. If you weren't looking for it, you'd never find it.
The Administration has been hammered for issuing a report that on one
hand admits that global warming threatens catastrophe but on the other
maintains there is no need to reduce consumption of fossil fuels. The
report squares this circle by arguing that global warming has now become
inevitable, so we should focus less on preventing it than on adapting to
it. To deal with water scarcity, for example, the report advocates
building more dams and raising the price of water to encourage
conservation. Critics see such recommendations as proof that the
Administration is doing nothing about global warming. Unfortunately,
it's not that simple.
The worst thing about the new global warming report is that it is
absolutely correct about a fundamental but often unmentioned aspect of
the problem: the lag effect. Most greenhouse gases remain in the
atmosphere for approximately 100 years. The upshot of this undeniable
chemical fact is that no matter what remedial steps are taken today,
humanity is doomed to experience however much global warming the past
100 years of human activities will generate. That does not mean we
should make matters worse by continuing to burn fossil fuels, as Bush
foolishly urges; our children and grandchildren deserve better than
that. It does mean, however, that we as a civilization must not only
shift to green energy sources immediately but also begin planning how we
will adapt to a world that is bound to be a hotter, drier, more
disaster-punctuated place in the twenty-first century.
Many environmentalists know it is too late to prevent global warming;
the best we can do is minimize its scope. They don't like to admit this
truth, because they fear it will discourage people from making, and
demanding, the personal and institutional changes needed to reduce
greenhouse gas emissions. There is that risk. But a truth does not
disappear simply because it is inconvenient. Besides, a green energy
future would mean more, not less, economic well-being for most
Americans, while also increasing our chances of avoiding the most
extreme global warming scenarios. Sometimes the truth hurts. But
avoiding it will hurt even more.
Has the war on terrorism become the modern equivalent of the Roman Circus, drawing the people's attention away from the failures of those who rule them?
Would it be too early to sense a sudden, uncovenanted shift against the corporate ethic, if ethic is the word? I can barely turn the page of a newspaper or magazine without striking across either some damaging admission, or at least some damage-control statement, from the boardroom classes.
What would the world look like if women had full human rights? If girls
went to school and young women went to college in places where now they
are used as household drudges and married off at 11 or 12? If women
could go out for the whole range of jobs, could own the land they work,
inherit property on equal terms with men? If they could control their
own sexuality and fertility and give birth safely? If they had recourse
against traffickers, honor killers, wife beaters? If they had as much
say and as much power as men at every level of decision-making, from the
household to the legislature? If John Ashcroft has his way, we may never
find out. After twenty years of stalling by Jesse Helms, the Senate
Foreign Relations Committee in early June held hearings on the
Convention on the Elimination of All Forms of Discrimination Against
Women (CEDAW), an international treaty ratified by 169 nations.
(President Carter signed CEDAW in 1980, but the Senate blocked it.)
George W. Bush originally indicated that he would sign it--that was when
he was sending Laura onto the airwaves to blast the Taliban--but under
the influence of Ashcroft, he's since been hedging. Naturally, the
religious right has been working the phones: According to one e-mail
that came across my screen, the operator who answers the White House
comment line assumed the writer was calling to oppose CEDAW, so heavily
were the calls running against it. The reasons? CEDAW would license
abortion, promote homosexuality and teen sex and destroy The Family. In
2000, Helms called it "a terrible treaty negotiated by radical feminists
with the intent of enshrining their anti-family agenda into international law."
How radical can CEDAW be, you may ask, given that it's been ratified by
Pakistan, Jordan and Myanmar? Genderquake is hardly around the corner.
Still, across the globe women have been able to use it to improve their
access to education and healthcare as well as their legal status. In
Japan, on the basis of a CEDAW violation, women sued their employers for
wage discrimination and failure to promote; the Tanzanian High Court
cited CEDAW in a decision to overturn a ban on clan land inheritance for
women. Given the dire situation of women worldwide, it is outrageous to
see US policy in the grip of Falwell, James Dobson and Ralph Nader's
good friend Phyllis Schlafly. Like the Vatican, which uses its UN
observer status to make common cause with Islamic fundamentalist
governments on behalf of fetus and family, on CEDAW the Bush
Administration risks allying itself with Somalia, Qatar and Syria to
promote the religious right agenda on issues of sexuality. In the same
way, at the recent UN General Assembly Special Session on the
Child--where the United States opposed providing girls with sex
education beyond "just say no," even though in much of the Third World
the typical "girl" is likely to be married with children--the Bush
Administration allied itself with Libya, Sudan and evil axis member
Iran. Some clash of civilizations.
Given this season's spate of popular books about mean girls and inhumane
women, it might seem starry-eyed to suppose that more equality for women
would have a positive general social effect. Where women are healthy and
well educated and self-determined, you can bet that men are too, but the
situation of women is not only a barometer of a society's general level
of equality and decency--improving women's status is key to solving many
of the world's most serious problems. Consider the AIDS epidemic now
ravaging much of the Third World: Where women cannot negotiate safe sex,
or protect themselves from rape, or expect fidelity from their male
partners, where young girls are sought out by older HIV-positive men
looking for tractable sex partners, where prostitution flourishes under
the most degraded conditions and where women are beaten or even murdered
when their HIV-positive status becomes known, what hope is there of
containing the virus? Under these circumstances, "just say no" is worse
than useless: In Thailand, being married is the single biggest predictor
of a woman's testing positive. As long as women are illiterate, poor and
powerless, AIDS will continue to ravage men, women and children.
Or consider hunger. Worldwide, women do most of the farming but own only
2 percent of the land. In many areas where tribal rules govern
inheritance, they cannot own or inherit land and are thrown off it
should their husband die. Yet a study by the Food and Agriculture
Organization shows that women spend more time on productive activities,
and according to the International Center for Research on Women, women
spend more of their earnings on their children than men do. Recognizing
and maximizing women's key economic role would have a host of
benefits--it would lessen hunger, improve women's and children's
well-being, improve women's status in the family, lower fertility.
And then there's war and peace. I don't think it's an accident that
Islamic fundamentalism flourishes in the parts of the world where women
are most oppressed--indeed, maintaining and deepening women's
subjugation, the violent rejection of everything female, is one of its
major themes. (Remember Mohammed Atta's weird funeral instructions?) At
the same time, the denial of education, employment and rights to women
fuels the social conditions of backwardness, provincialism and poverty
that sustain religious fanaticism.
If women's rights were acknowledged as the key to human progress that
they are, we would look at all these large issues of global politics and
economics very differently. Would the US government have been able to
spend a billion dollars backing the fundamentalist warlords who raped
and abducted women and threw acid at their unveiled faces while
"fighting communism" and destroying Afghanistan? At the recently
concluded loya jirga, which featured numerous current and former
warlords as delegates, a woman delegate stood up and denounced former
President Burhanuddin Rabbani as a violent marauder. For a moment, you
could see that, as the saying goes, another world is possible.
Attempts to organize are squelched by a flying column of
Nixon thought so; Otis Chandler doesn't. Maybe it depends on where you sit.
Although car chases are formulaic, they needn't be standard issue. One
of the many substantial pleasures that The Bourne Identity offers
is a thoughtful car chase, a loving car chase, in which the characters
truly care about their conduct amid prolonged automotive mayhem. It
doesn't hurt, of course, that the scene is Paris. The streets there are barely wide enough for a single fleeing vehicle--which means that Jason Bourne may as well use the sidewalk when he needs an extra lane. Once the pedestrians dive out
of the way, he gets to skid through every degree of turn except
ninety--Descartes never laid his grid over this city--until the route
ends at a set of stairs. They're very picturesque; and considering what
his car's undercarriage was already like, they can't do much harm.
By the time the car fully resumes the horizontal, some of the pursuing
motorcycle cops have managed to pull up. "Turn your head," Jason warns
his passenger, Marie Kreutz, in a surprisingly gentle tone. She was
guzzling booze straight from the bottle even before this ride; he'd
rather not worsen her alarm by letting her watch the next maneuver. But
we see it, as one cop after another is shaken off and the car hurtles
onto a highway. At last--a chance to make time! The camera drops to
within an inch of the macadam so that our brains, too, can get a good
rattle, as Jason and Marie's car seems to race straight out of the
screen. Then, almost without transition, it's shooting through more
non-Cartesian turns, off a ramp, past the spot where the last motorcycle
cop makes his rendezvous with a passing truck, to come to a very
temporary version of rest.
How should a car chase end? If the sequence is standard issue, the
filmmaker will require a fireball, or a roll downhill and then a
fireball, followed perhaps by the sight of the good guys speeding away.
But in The Bourne Identity, director Doug Liman has been witty
enough to conclude the sequence by having Jason pull into a parking
garage. From this, we may learn that the hero is a fundamentally
conventional person, despite what he's been doing for the past five
minutes. But this is only part of what we learn--because Liman is also
clever enough to make the real action start when the motor stops.
All but vibrating from what they've been through, Marie and Jason sit in
the car in silence, each glancing awkwardly toward the other and then
looking away. The camera, static at last, takes them both in at once.
Time stretches; they squirm. Someone is going to have to say something
pretty soon--and the words, when they come, will have the shy banality
of a postcoital stab at conversation, when the two people have scarcely
met and are wondering what the hell they've just done.
For me, this was the moment when The Bourne Identity revealed its
true nature, as a study of those people in their 20s who can't yet put
up with workaday life. Liman has looked at such characters before, in
Swingers and Go. Those movies were about using
recreational drugs, selling recreational drugs, selling over-the-counter
medicines that you claim are recreational drugs, losing yourself in
music, losing yourself in lap dancing, losing your sense that this cute
thing before you might not be an ideal companion when you get to be 70.
Jobs in these movies count for little or nothing; friendships mean the
world, though they're always breaking apart. If you can recognize these
attitudes, and if you're familiar with the behavior through which
they're expressed nowadays, you will understand Jason Bourne and Marie
Kreutz. They're typical Doug Liman characters, who just happen to live
in a spy thriller.
Now, since The Bourne Identity is adapted from a Robert Ludlum
novel and was written for the screen by two people other than the
director, you might doubt the wisdom of ascribing all the above to
Liman. But look at the casting. In the title role, Liman has Matt Damon,
who carries over from Good Will Hunting his persona of the
regular working stiff--an unpretentious guy who must nevertheless come
to grips with a great power he's been given. In Good Will
Hunting, the gift was mathematical genius, which somehow was shut up
behind Damon's sloping brow and wary, squinting eyes. In The Bourne
Identity, in which he plays a CIA assassin suffering from amnesia,
Damon is puzzled to hear himself speak many languages, and to find that
his arms and legs demolish anyone who threatens him. Different skills;
same aura of being troubled, but decent and game. When Jason Bourne
refuses to hold on to a gun--something that he does more than once in
the picture--Damon infuses the gesture with the gut-level morality of a
Catholic boy from South Boston.
Paired with Damon, in the role of Marie, is Franka Potente, the young
German actress who is best known for Run Lola Run. She, too, has
retained her persona from the earlier film, so that she brings to Marie
a convincing impression of having enjoyed quite a few good times over
the past years, many of which she can't remember. Her basic facial
expression is something between a scowl and a sneer--the sign, you'd
think, of a feral sexuality that bores her, because it encounters no
worthy challengers and yet prevents her from concentrating on anything
else. No wonder she runs--or drifts in this case, playing someone who
has done nothing since high school except wander about. When first seen
in The Bourne Identity, Potente is at the American Embassy in
Zurich, making a pain of herself by demanding a visa to which she is
most likely not entitled. When first approached by Damon, Potente
establishes her baseline attitude toward people by snapping "What are
you looking at?" Her Marie isn't a bad person, you
understand--she's just been bad news for any man she's hung around. Now,
though, she's met the right guy in Jason Bourne, meaning someone who can
be bad news for her.
I think it's worthwhile to compare these characters with those played by
Chris Rock and Anthony Hopkins in Bad Company, a routine
bomb-in-a-suitcase thriller, whose main function is to help audiences
kill time till the release of Men in Black 2. Hopkins plays the
self-controlled CIA agent, who is so white he's English. Rock plays
(guess what?) the street-smart, fast-talking black guy, who must be put
into the field at once, or else the world will end. There's an
underground trade in nuclear weapons, you see, which Hopkins can foil
only with the aid of someone who looks exactly like Rock.
And there's the essential problem of Bad Company. The mere
appearance of Chris Rock is supposedly enough; the assignment requires
no one to act like him. In any decent movie of this sort--48
Hours, say, or Trading Places--the white character will fail
in his task, except for the wiles the black character can lend him. But
in Bad Company, Rock exists solely to be educated. A very smart
man who has made nothing of his abilities--the reasons for which failure
are left disturbingly vague--his character must be trained to wear a
suit, sip Bordeaux and rise at dawn. These traits, according to the
movie, are proper to a white man; and Rock will help defeat terrorism by
adopting them. As an interim goal for the character, this is bad enough.
What's worse is the final justification for rubbing some white onto
Rock: to make him a fit husband.
Bad Company was produced by Jerry Bruckheimer, directed by Joel
Schumacher and written, so far as I can tell, by the welfare policy
officials of the Bush Administration. Heartless in theme and faceless in
style, it is so many thousands of feet of off-the-shelf filmmaking,
through which you sift, disconsolate, in search of a single live moment.
There is one: the scene in which Rock tells off a CIA supervisor. Of
course, this, too, is part of the formula; but when Rock lets loose his
falsetto indignation, the world's shams all wash away in the torrent.
You feel clean and free, listening to Rock's outrage. I wonder what he'd
say in private about this movie.
Maybe he'd say The Bourne Identity has more soul than all of Joel
Schumacher's films put together. I think soulfulness has to do with
acknowledging the reserves of personality in someone who might at first
seem a mere type--or acknowledging, for that matter, the personality in
a movie that appears generic. It's about individual but strict judgments
of right and wrong; and, always, it's about the exuberance of talent.
This last point is the one that makes The Bourne Identity into
Liman's movie. His direction is a performance in its own right,
combining the logic and flair of a first-rate bop solo. He attends to
the small, naturalistic gestures--the way Jason pauses to brush snow off
his sleeve, or Marie shields her mouth to hide a smile. He pushes the
cinematography to extremes, using low levels of light from very few
sources, to give you a sense of intimacy with the characters' flesh. He
continually thinks up ways to keep the action fresh. Sometimes his
tricks are unobtrusive, as when he makes a shot shallower than you'd
expect, and so more arresting. Sometimes he's expressive, as when Bourne
teeters on a rickety fire escape, and the camera peers down at his peril
while swinging overhead. And sometimes he's flat-out wild. In the midst
of a fight scene, Liman tosses in a point-of-view shot, about half a
second long, to show you what the bad guy sees as he flies over a desk,
upside down. If my schedule of screenings and deadlines had been more
merciful, I would now compare Liman's direction with that of the master,
John Woo, in his new Windtalkers. But I wasn't able to see
Windtalkers by press time; and, on reflection, I'm glad I didn't.
The Bourne Identity deserves to be enjoyed for its own sake.
If you're interested in the plot, you can enjoy that, too. I've left it
till last, since that's what Liman does. In one of his cheekiest
gestures, he lets the movie's McGuffin go unexplained. But as a public
service, I will give you this much detail: The Bourne Identity
assumes that the CIA's activities are an endless chain of cover-ups,
with each new calamity needing to be hidden in turn. That's why the
agency needs unlimited power.
Bad Company? Right.
On September 23, 2001, midpoint between the horrific events of September
11 and the beginning of the war in Afghanistan, the New York
Times ran an intriguing headline. "Forget the Past: It's a War
Unlike Any Other," it advised, above an article by John Kifner noting
that "Air Force bombers are heading toward distant airfields to fight a
shadowy foe flitting through the mountains in a deeply hostile land
already so poor and so ruined by two decades of war that [it] is
virtually bereft of targets." It was a poor headline for an article that
began by noting the long history of conflicts among great powers over
control of Central Asia, but it was a message with a significant degree of resonance.
History was often being ignored in the heated discussions of the coming
war and the attacks that provoked it, of course, but usually without
anyone having to instruct us to forget it. Pundits and politicians alike
could draw on a long tradition of keeping the public ill informed about
the role of the United States in the world. And once the "war on
terrorism" actually started, those who tried to speak about a context
for the attacks of September, or of how the history of US intervention
in the world had produced rage and frustration that could help fuel such
actions, were accused of justifying terrorism.
In The Clash of Fundamentalisms, a riposte to Samuel Huntington's
much-discussed "clash of civilizations" thesis, Pakistani writer and
filmmaker Tariq Ali sets the ambitious goal of challenging such
organized historical amnesia--"the routine disinformation or
no-information that prevails today"--and of speaking forthrightly about
many topics that have become unpopular or even heretical in the West, as
well as within what he calls the House of Islam. "The virtual outlawing
of history by the dominant culture has reduced the process of democracy
to farce," Ali puts it in one chapter, "A short course history of US
imperialism." In such a situation, "everything is either oversimplified
or reduced to a wearisome incomprehensibility."
Whereas Huntington's "clash of civilizations" thesis posits a cultural
conflict between Islamic and Western civilization, and sees religion as
"perhaps the central force that motivates and mobilizes people,"
Ali argues that economics and politics, especially oil politics, remain
central to the friction between Western powers and states in the so-called Islamic world, particularly in the Middle East. He
rejects Huntington's identification of the West with "human rights,
equality, liberty, the rule of law, [and] democracy," and he reminds us
of the vast disparities that exist among cultures and nations within the
Islamic world itself.
Few people are better disposed than Ali to serve as a guide to the
neglected and distorted histories relevant to the conflict in
Afghanistan, the broader "war on terrorism" now being fought on numerous
fronts by the Bush Administration, and the intimately related conflicts
in Pakistan, India and Kashmir, which have recently put the world on a
heightened nuclear alert. Ali, a longtime editor of New Left
Review and Verso books, is the author of three books on Pakistan and
has deep personal and political connections to the region. In The
Clash of Fundamentalisms he surveys a range of regional and
historical conflicts that remain open chapters, including the creation
of Israel and its ongoing occupation of Palestinian lands, the
unfinished legacy of Britain's brutal partition of India in 1947 and the
fallout from division of the world by the colonial powers. The book is
an outstanding contribution to our understanding of the nightmare of
history from which so many people are struggling to awake, and deserves
serious engagement and consideration. Ali broadens our horizons,
geographically, historically, intellectually and politically.
Despite his obvious hostility to religious modes of thinking--inveighing
against religious orthodoxy in favor of "the freedom to think freely and
rationally and [exercise] the freedom of imagination"--Ali has a
sophisticated appreciation of the many contradictory movements and ideas
that have organized themselves under the banner of Islam. He can debate
Islamic doctrine with the most ardent purists while at the same time
dispensing with the simplistic (and all too often racist) caricatures of
Islam that pass for analysis in the West. In The Clash of
Fundamentalisms he takes the reader on a necessarily schematic and
selective history of Islam, though one wishes he had provided more
signposts for those interested in further study than the scattered and
inconsistent references included in this volume.
Ali writes here of his "instinctive" atheism during his upbringing in
Lahore, Pakistan, and of being politicized at an early age. His
experiences then helped him understand Islam as a political phenomenon,
born of the specific historic experiences of Muhammad, who worked on a
merchant caravan and traveled widely, "coming into contact with
Christians and Jews and Magians and pagans of every stripe." Ali writes
that "Muhammad's spiritual drive was partially fueled by socio-economic
passions, by the desire to strengthen the communal standing of the Arabs
and the need to impose a set of common rules," thus creating an impulse
toward the creation of a universal state that remains an important
element of Islam's appeal.
Ali offers a fascinating discussion of the Mu'tazilites, an Islamic sect
that attempted to reconcile monotheism with a materialist understanding
of the world, including a theory of the atomic composition of matter;
some of its members also argued that the Koran was a historical rather
than a revealed document. "The poverty of contemporary Islamic thought
contrasts with the riches of the ninth and tenth centuries," Ali argues.
But he is by no means backward looking in his own vision. He is
particularly scornful of the mythical idealized past valorized by the
Wahhabites in Saudi Arabia, the Taliban and other Islamic sects. "What
do the Islamists offer?" Ali asks rhetorically: "A route to a past
which, mercifully for the people of the seventh century, never existed."
Ali sees the spread of reactionary impulses within Islam in part as a
response to "the defeat of secular, modernist and socialist impulses on
a global scale." Various forms of religious fundamentalism, not only
Islamic ones, have partially filled a void created by the failures of
parties operating under the banner of secular nationalism and Communism
in the Third World. These failures--his examples include Egypt and
Syria--were connected to the limits of the nationalist leaderships
themselves, especially their lack of democracy and suppression of
religious movements by politicians seeking to preserve and extend their
own power. But Ali also goes on to argue that "all the other exit routes
have been sealed off by the mother of all fundamentalisms: American imperialism."
Consider, for example, the consequences of the US work to train and arm
the Islamic forces in Afghanistan, the mujahedeen, to wage a holy war
against the Soviet Union. A decade after the Soviets were expelled, the
country "was still awash with factional violence," while "veterans of
the war helped to destabilize Egypt, Algeria, the Philippines, Sudan,
Pakistan, Chechnya, Dagestan and Saudi Arabia." The factional
instability in Afghanistan, coupled with Pakistan's intervention,
created the conditions that led to the Taliban's rise to power.
To discuss the US government's role in overthrowing the secular
nationalist Mossadegh in Iran in 1953 and supporting the brutal Shah for
decades; in operating through the intermediary of Pakistan's
Inter-Services Intelligence units to back the mujahedeen in Afghanistan;
in repeatedly downplaying serious human rights abuses by US "friends"
such as Pakistan under Zia ul-Haq and Benazir Bhutto, whose governments
actively sponsored the growth of the Taliban; and in lending support to
groups like the Muslim Brotherhood in Egypt, Sarekat Islam in Indonesia
and Jamaat-e-Islami in Pakistan is not merely a case of obsessing about
past wrongs. As Ali argues persuasively, the past is indeed prologue.
Ali has a sharp mind and wit. His mode of history telling is lyrical and
engaging, humane and passionate. He repeatedly points to the lessons
learned by people in the course of struggle, challenging the pervasive
view that people can be liberated by those other than themselves,
setting out his differences with the advocates of "humanitarian
intervention." Ali writes that Western intellectuals have been far too
quick to support US-led military interventions such as the Gulf War and
to provide a liberal veneer of respect to wars prosecuted only
rhetorically in the name of human rights and democracy but actually
motivated by traditional "reasons of state." Where other people see
closed doors in history, he sees roads not taken and paths that remain
to be pursued.
Yet Ali spends too little time enumerating what some of those alternate
paths might be, especially for readers who are new to the history
recounted in The Clash of Fundamentalisms (certainly a
significant section of his readership, given the intense interest in
Islam, Central Asia, the Middle East and US foreign policy that has been
so much in evidence in recent months). In his final chapter, "Letter to
a young Muslim," Ali provides a thoughtful challenge to his
correspondent, but I fear he has not done enough to convince his reader
to change allegiances. He has more to say about the weakness of Islamism
than about any alternative vision of how a more just world might be
achieved. What would a compelling agenda look like in an era when, as he
notes, "no mainstream political party anywhere in the world even
pretends that it wishes to change anything significant"? What might a
radical secular program consist of today? How does one effectively mount
a challenge to the claim that there is no alternative to American-style
capitalism, or that attempts at fundamental change will reproduce the
horrors of the Soviet Union?
Indeed, The Clash of Fundamentalisms would have been stronger if
Ali had engaged this question more thoroughly. Though he expresses
contempt for the bureaucratic and dictatorial regimes that confronted
the United States during the cold war, at times he gives the Soviet bloc
more credit than it deserves. To suggest that China and the Soviet Union
were "striving for a superior social and economic system" is to give
those regimes far too much credit, and in essence to maintain some
illusion that Stalinist authoritarianism was a real alternative.
Ali at times repeats himself verbatim and gets a few details wrong (such
as misdating Iraq's invasion of Kuwait to 1991 rather than 1990). None
of this takes away from the importance of his argument that we are not
living in a radically new epoch in history, but in a period with all too
much continuity to the one before September 11.
No one has contributed more to the United States than James Madison. He
was the principal architect of the Constitution, the brilliant theorist
who, more than any other single individual, was responsible for
designing the American system of government. Moreover, along with
Washington and Franklin, Madison was one of the men who made the Constitutional
Convention in Philadelphia work. Whenever passionate disagreements
threatened the enterprise, it was Madison's calm logic to which the
others listened. As one delegate put it, it was Madison who had "the
most correct knowledge" about government affairs.
And no one did more than Madison to get the Constitution ratified in the
face of strong anti-Federalist opposition. The most hyperbolic
superlatives cannot do justice to the twenty-nine newspaper essays
Madison wrote that, together with essays by Alexander Hamilton and John
Jay (all written under the pseudonym Publius), comprise the
Federalist Papers. Suffice it to say that 200 years later a
distinguished political scientist wrote, "The Federalist is the
most important work in political science that has ever been written, or
is likely to be written, in the United States," and that Madison's
contributions shine the brightest.
And that is not all. At the convention in Richmond when anti-Federalists
George Mason and Patrick Henry used every argument and stratagem to
persuade Virginia to refuse to ratify the new Constitution--which, had
they been successful, would have caused the Union to be stillborn--it
was Madison's cool, clear reasoning that once again saved the day.
Madison's place in the pantheon of great Americans, therefore, is secure
regardless of how we evaluate his performance as the nation's fourth
President (1809-17). His reputation can withstand the central inquiry of
Garry Wills's short and provocative new book, namely: Why was James
Madison so great a constitutionalist but so dreadful a President?
Perhaps I overstate by calling Madison's presidency "dreadful." Wills
does not go that far. He presents an evaluation of Madison's successes
and failures, finding both. Nor do historians generally consider Madison
a dreadful President. When C-SPAN asked historians to rank the forty-two
American Presidents, Madison came in at number 18, putting him slightly
above average and, by way of modern comparisons, ahead of George H.W.
Bush (20) and Bill Clinton (21).
Wills's strongest pejorative is his description of Madison as a "hapless
commander in chief." Nevertheless, Wills's examination makes me wonder
whether, out of deference to Madison's other accomplishments, historians
are being unduly charitable to his presidency.
The defining issue of Madison's tenure was the War of 1812. Some
historians argue that he cannot be blamed for a war thrust upon him by a
"War Hawk Congress." Others, however, including most prominently Ralph
Ketcham of Syracuse University, argue that Madison wanted the war and
maneuvered Congress into declaring it. Wills sides with Ketcham and
builds a persuasive case that Madison deliberately propelled America
into a war for which it was ill prepared.
War was raging between England and France when Madison came to office.
Napoleon's armies were conducting their bloody marches across the
Continent while England was using her sea power to try to keep him
confined there. During his term, Jefferson had been confronted with the
problem of what to do about the combatants seizing ships that were
carrying American exports to their adversaries or, in England's case
especially, boarding American ships to seize sailors, many of whom were
deserters from the British Navy. At Madison's urging (Madison was
Jefferson's Secretary of State), Jefferson imposed an embargo on
American ships crossing the Atlantic. While some supported an embargo to
keep American ships out of harm's way, Madison believed an embargo would
exert enough commercial pressure on England to force it to agree to
leave American shipping alone.
But in fact the embargo meant little to England or France. It meant much
more to America, particularly New England, whose economy depended
heavily on trade with England. In the first year of the embargo
America's exports fell by almost 80 percent. New England preferred
having some of its ships and cargo seized by combatants to suspending
all trade. Under great pressure, Congress ended the embargo and replaced
it with the Nonintercourse Act, which permitted American ships to cross
the Atlantic as long as they did not trade with England or France. The
virtue of this approach was that it was unenforceable; once American
ships disappeared over the horizon, there was no telling where they went.
The embargo ended on the last day of Jefferson's presidency, and the
indignity of combatants seizing American ships and sailors resumed in
full force as Madison took office. Then Madison heard good news: A
British diplomat reported that his government was ready to grant America
neutral trading rights. Thrilled, Madison immediately issued a
proclamation repealing America's prohibition against trade with
whichever nation, England or France, first granted neutral trading
rights to the United States. Believing troubles with England at sea to
be at an end, 600 ships sailed from American ports confident that all
would be well when they arrived at their trading destinations across the Atlantic.
But England quickly announced there had been a mistake. Its
representative had failed to communicate that England would grant
neutral status only upon several conditions, one of which was that
England would continue to stop and board American ships and seize former
British sailors. Madison was fit to be tied. By reneging on its word, said
Madison, England had committed an "outrage on all decency" more horrible
than the capture of black slaves from the shores of Africa.
Madison should have realized something was wrong with the original
representation, Wills argues. The US government's own survey revealed
that roughly 9,000 American crewmen were British deserters, and England
could not possibly afford so many of her sailors safe haven on American ships.
Madison tried to wipe the egg off his face by announcing a new
policy--America would unilaterally resume trade with England and France
and continue to trade with both until either nation recognized America's
neutral trading rights, at which time America would automatically
reimpose an embargo upon the other. In view of the failure of the first
embargo, there was no reason to believe a potential new embargo would
force England or France to change its policy. But, says Wills, Madison
remained stubbornly committed to the failed policy of embargo.
Unfortunately, Wills believes, Napoleon shrewdly exploited it as a means
to maneuver America into war against England.
Napoleon announced he would repeal his ban on neutral trade on November
1, 1810, provided that the United States reimposed its embargo against
England by then. Acting once again without bothering to get
clarification, Madison reimposed the embargo upon England. But just as
he had previously acted without learning England's details and
conditions, this time Madison acted on Napoleon's offer only to discover
that Napoleon refused to rescind an order confiscating American ships at
port in recently captured Holland and other harbors of the empire.
Getting bamboozled by Napoleon appears, paradoxically, to have made
Madison even more furious at England. For its part, England found
Madison's willingness to side with France deplorable. "England felt that
it was defending the free world against the international tyranny of
Bonapartism," Wills writes. "Anyone who was not with them in that
struggle was against them." And so, increasingly, America and England
perceived each other as enemies.
Madison's anger with England was one factor that moved him toward war,
but there was another as well: He wanted to seize Canada. Jefferson
urged Madison to pluck this ripe plum while England was militarily
engaged with Napoleon. "The acquisition of Canada this year will be a
mere matter of marching," advised Jefferson.
It may be worth pausing to observe that many of Madison's worst
disasters involve following Jefferson. With the exception of the War of
1812, the most lamentable mistake of Madison's career was his plotting
with Jefferson to have states nullify federal laws, specifically the
Alien and Sedition Acts of 1798. The acts violated fundamental
principles of free speech and press, and Jefferson and Madison cannot be
blamed for opposing them. But the medicine they prescribed--the claim
that the states could enact legislation nullifying federal law--was
potentially far worse than the disease.
At the Constitutional Convention in 1787, Madison had argued that
Congress should be given the authority to nullify state law, and was
discouraged when he lost this battle. He later betrayed his own
convictions by arguing that the state legislatures could nullify laws
enacted by Congress, though for tactical reasons he called this
"interposition" rather than "nullification." Moreover, Madison allowed
himself to be Jefferson's cat's-paw in this matter. Jefferson, then Vice
President, wanted to keep his own involvement secret, and Madison
fronted for both of them. Madison was haunted by this throughout his
career: Southern states invoked Madison's support of nullification
during disputes over slavery, and Madison's political opponents
delighted in forcing him to try to explain the difference between
"interposition" and "nullification."
Why did Madison so readily follow Jefferson over cliffs? Madison was
nervous, bookish, provincial and physically unimposing (5'4" and 100
pounds). He was so insecure with the opposite sex that he did not
attempt courtship until he was 31. The object of his desire was 15, and
Madison was so crushed by her rejection that he did not venture into
romance again until he was 43, when he successfully won Dolley's hand.
It would be only natural for Madison to fall under the thrall of the
tall, dashing, passionate, cosmopolitan and supremely self-confident Jefferson.
Any sensible strategy to seize Canada from one of the world's
superpowers would necessarily hinge upon a quick and powerful attack to
overwhelm British forces before they could be reinforced or before the
British Navy could be brought to bear in the conflict. Madison and his
military commanders planned a rapid, two-pronged strike: One American
force, commanded by William Hull, was to invade Canada from the west,
crossing over the border from Detroit. Meanwhile, Henry Dearborn was to
lead American forces from the east, crossing the Saint Lawrence River
from various points in New York.
Rather than take the time to raise and train a professional army,
Madison decided to invade Canada with militia forces. But this strategy
was the military equivalent of throwing pebbles at a hornet's nest--and
Madison should have known it.
Before the Revolutionary War, there had been much soapbox rhetoric about
the glories of the militia: Citizen soldiers were supposed to be more
virtuous and therefore more capable than professional soldiers. The
Revolutionary War proved this to be bunk. After the skirmishes at
Lexington and Concord, the militia performed terribly. So often did the
militia bolt in the face of even much smaller opposing forces that it
became Continental Army doctrine to position militia units in front of
and between regular army units, who were ordered to shoot the first
militiamen to run. Washington won the war only after raising and
training a professional army.
Notwithstanding the militia's dismal performance, some
politicians--particularly Southern slaveholders like Madison who relied
on the militia for slave control--continued to cling to the notion that
the virtuous citizen militia was superior to a professional army. One
Southerner who would have found these views laughable if they were not
so dangerous was George Washington. "America has almost been amused out
of her Liberties" by pro-militia rhetoric, he said: "I solemnly declare
I never was witness to a single instance, that can countenance an
opinion of Militia or raw Troops being fit for the real business of fighting."
Madison, however, had not been listening. In the Federalist
Papers, he and Hamilton expressed differing views about the militia.
Hamilton argued that an effective fighting force required professional
training and discipline, and he urged Congress to support only a select
militia. Madison, however, continued to envision a universal militia
consisting of all able-bodied white men.
This debate resonates even today in the gun-control debate. Because the
Second Amendment connects the right to bear arms to the militia,
gun-rights advocates suggest that the Founders considered the universal
militia to be sacrosanct. The militia was then composed of the whole
body of the people, and thus the Constitution permanently grants the
whole body of the people the right to keep and bear arms--or so the
argument runs. This makes little sense as a matter of constitutional
law, however, because, as both Hamilton and Madison recognized, the
Constitution expressly empowered Congress to organize the militia as it saw fit.
Despite the Revolutionary War experience, Madison launched his attack on
Canada almost entirely with militia forces. The results were
predictable. In the east, most militiamen refused to cross the Saint
Lawrence, claiming that it was unlawful to take the militia outside the
United States. Dearborn did manage to coax a small contingent across the
river. But when shooting accidentally broke out among his own forces,
they all fled in confusion back across the Saint Lawrence.
Meanwhile, in the west, Hull's forces were paralyzed by militia refusing
to take orders from regular Army officers. There was an invasion, but
American forces were not the invaders. By the end of 1812, when America
was supposed to be in possession of most of Canada, a few American units that had
failed to retreat successfully back into New York were being held
prisoner in eastern Canada, and English forces had taken Detroit and the Michigan Territory.
Things continued downhill. Two years later, a British force of 1,200
marched nearly unchallenged into the District of Columbia while 8,000
American troops, mostly militia, "ran away too fast for our hard-fagged
people to make prisoners," as one British commander put it. The British,
of course, burned the White House and Capitol to the ground.
Wills gives Madison high marks for grace and courage during the British
invasion of Washington, and, all in all, the war did not turn out too
badly. The British had not wanted it and settled for the status quo ante
bellum. And rather than feeling disgraced, America took patriotic pride
in a series of Navy successes, remembered through battle slogans and
anthems ("Don't give up the ship," James Lawrence; "We have met the
enemy and they are ours," Oliver Hazard Perry; "the rockets' red glare,"
Francis Scott Key). America came out of the war feeling good about itself.
For this, historians give Madison much credit.
Some credit is undoubtedly deserved. More than once, Madison acted with
courage and grace in the midst of panic. America was properly proud of
its naval feats, though it is not clear that a President who took a
nation with seven warships into battle against an adversary with 436 deserves much of the credit.
Is it unfair to call Madison a dreadful President? If Wills is correct
about Madison stumbling his way toward war through a series of
diplomatic blunders and then deciding to take on a world power with
militia forces, perhaps not.
And what is it that allowed Madison to be so great a constitutionalist
and so poor a President? Wills argues that it was provincialism and
naïveté: What Madison had learned from the great minds by
reading books allowed him to understand political theory better,
perhaps, than anyone else. But without greater worldly experience, even
Madison could not operate the levers of power that he himself designed.
Yet as Wills aptly concludes, "Madison did more than most, and did some
things better than any. That is quite enough."
Dread ripples through me as I listen to a phone message from our manager
saying that we (The Doors) have another offer of huge amounts of money
if we would just allow one of our songs to be used as the background for
a commercial. They don't give up! I guess it's hard to imagine that
everybody doesn't have a price. Maybe 'cause, as the cement heads try to
pave the entire world, they're paving their inner world as well. No
imagination left upstairs.
Apple Computer called on a Tuesday--they already had the audacity to
spend money to cut "When the Music's Over" into an ad for their new cube
computer software. They want to air it the next weekend, and will give
us a million and a half dollars! A MILLION AND A HALF DOLLARS! Apple is
a pretty hip company...we use computers.... Dammit! Why did Jim (Morrison) have to have such integrity?
I'm pretty clear that we shouldn't do it. We don't need the money. But I
get such pressure from one particular bandmate (the one who wears
glasses and plays keyboards).
"Commercials will give us more exposure," he says. I ask him, "So you're
not for it because of the money?" He says "no," but his first
question is always "How much?" when we get one of these offers, and he
always says he's for it. He never suggests we play Robin Hood, either.
If I learned anything from Jim, it's respect for what we created. I have
to pass. Thank God, back in 1965 Jim said we should split everything,
and everyone has veto power. Of course, every time I pass, they double the offer.
It all started in 1967, when Buick proffered $75,000 to use "Light My
Fire" to hawk its new hot little offering--the Opel. As the story
goes--which everyone knows who's read my autobiography or seen Oliver
Stone's movie--Ray, Robby and John (that's me) OK'd it, while Jim was
out of town. He came back and went nuts. And it wasn't even his song
(Robby primarily having penned "LMF")! In retrospect, his calling up
Buick and saying that if they aired the ad, he'd smash an Opel on
television with a sledgehammer was fantastic! I guess that's one of the
reasons I miss the guy.
It actually all really started back in '65, when we were a garage
band and Jim suggested sharing all the songwriting credits and money.
Since he didn't play an instrument--literally couldn't play one chord on
piano or guitar, but had lyrics and melodies coming out of his ears--the
communal pot idea felt like a love-in. Just so no one got too
weird, he tagged that veto thought on. Democracy in action...only
sometimes avenues between "Doors" seem clogged with bureaucratic BS. In
the past ten years it's definitely intensified...maybe we need a third
party. What was that original intent? Liberty and justice for all
songs...and the pursuit of happiness.... What is happiness? More money?
More fame? The Vietnamese believe that you're born with happiness; you
don't have to pursue it. We tried to bomb that out of them back in my
youth. From the looks of things, we might have succeeded.
This is sounding pretty depressing, John; where are you going here? The
whole world is hopefully heading toward democracy. That's a good thing,
John.... Oh, yeah: the greed gene. Vaclav Havel had it right when he
took over as president of Czechoslovakia, after the fall of Communism.
He said, "We're not going to rush into this too quickly, because I don't
know if there's that much difference between KGB and IBM."
Whoa! Here comes another one: "Dear John Densmore, this letter is an
offer of up to one million dollars for your celebrity endorsement of our
product. We have the best weight loss, diet and exercise program, far
better than anything on the market. The problem is the celebrity must be
overweight. Then the celebrity must use our product for four weeks,
which will take off up to 20 pounds of their excess body fat. If your
endorsement works in the focus group tests, you will immediately get
$10,000.00 up front and more money will start rolling in every month
after that--up to a million dollars or more." Wow! Let's see...I've
weighed 130 pounds for thirty-five years--since my 20s...I'll have to
gain quite a bit...sort of like a De Niro thing...he gained fifty pounds for Raging Bull--and won an Oscar! I'm an artist, too, like him...
We used to build our cities and towns around churches. Now banks are at
the centers of our densely populated areas. I know, it's the 1990s....
No, John, it's the new millennium, you dinosaur. Rock dinosaur, that is.
My hair isn't as long as it used to be. I don't smoke much weed anymore,
and I even have a small bald spot. The dollar is almighty, and
ads are kool, as cool as the coolest rock videos.
Why did Jim have to say we were "erotic politicians"? If I had been the
drummer for the Grassroots, it probably wouldn't have cut me to the core
when I heard John Lennon's "Revolution" selling tennis shoes...and
Nikes, to boot! That song was the soundtrack to part of my youth, when
the streets were filled with passionate citizens expressing their First
Amendment right to free speech. Hey...the streets are filled again! Or
were, before 9/11. And they're protesting what I'm trying to wax on and
on about here. Corporate greed! Maybe I should stick to music. I guess
that's why I hit the streets with Bonnie Raitt during the 1996
Democratic National Convention. We serenaded the troops. Bob Hope did it
during World War II, only our troops are those dressed in baggy Bermuda
shorts, sporting dreadlocks. Some have the shaved Army look, but they're
always ready to fight against the Orwellian nightmare. A woman activist
friend of mine said that with the networking of the Net, what's bubbling
under this brave new world will make the '60s unrest look like peanuts.
I don't want "Anarchy, Now," a worn-out hippie phrase, but I would like
to see a middle class again in this country.
Europe seems saner right now. They are more green than us. They're
paranoid about our genetically altered food and they're trying to make
NATO a little more independent in case we get too zealous in our
policing of the globe. When The Doors made their first jaunt from the
colonies to perform in the mother country back in '67, the record
companies seemed a little saner, too. The retailers in England could
order only what they thought they could sell; no returns to the
manufacturers. That eliminated the tremendous hype that this country
still produces, creating a buzz of "double platinum" sales, and then
having half of the CDs returned. Today, there is a time limit of three
to six months for the rackjobbers to get those duds back to the company.
Our band used to be on a small folk label. Judy Collins, Love and the
Butterfield Blues Band were our Elektra labelmates. We could call up the
president, Jac Holzman, and have a chat...and this was before we
made it. Well, Jac sold out for $10 million back in '70, and we were now
owned by a corporation. Actually, today just five corps own almost the
entire record business, where numbers are the bottom line. At
least we aren't on the one owned by Seagram's! Wait a minute...maybe
we'd get free booze...probably not. Advances are always
recoupable, booze probably is too.
Those impeccable English artists are falling prey as well. Pete
Townshend keeps fooling us again, selling Who songs to yuppies hungry
for SUVs. I hope Sting has given those Shaman chiefs he hangs out with
from the rainforest a ride in the back of that Jag he's advertising,
'cause as beautiful as the burlwood interiors are, the car--named after
an animal possibly facing extinction--is a gas guzzler. If you knew me
back in the '60s, you might say that this rant--I mean, piece--now has a
self-righteous ring to it, me having had the name Jaguar John back then.
I had the first XJ-6 when they came out, long before the car became
popular with accountants. That's when I sold it for a Rolls
Royce-looking Jag, the Mark IV, a super gas guzzler. That was back when
the first whiffs of rock stardom furled up my nose. Hopefully, I've
learned something since those heady times, like: "What good is a used-up
world?" Plus, it's not a given that one should do commercials for the
products one uses. The Brits might bust me here, having heard "Riders on
the Storm" during the '70s (in Britain only) pushing tires for their
roadsters, but our singer's ghost brought me to my senses and I gave my
portion to charity. I still don't think the Polish member of our
band has learned the lesson of the Opel, but I am now adamant that three
commercials and we're out of our singer's respect. "Jim's dead!" our
piano player responds to this line of thought. That is precisely
why we should resist, in my opinion. The late, transcendental George
Harrison had something to say about this issue. The Beatles "could have
made millions of extra dollars [doing commercials], but we thought it
would belittle our image or our songs," he said. "It would be real handy
if we could talk to John [Lennon]...because that quarter of us is
gone...and yet it isn't, because Yoko's there, Beatling more than ever."
Was he talking about the Nike ad, or John and Yoko's nude album cover
shot now selling vodka?
Actually, it was John and Yoko who inspired me to start a 10 percent
tithe, way back in the early '80s. In the Playboy interview, John
mentioned that they were doing the old tradition, and it stuck in my
mind. If everybody gave 10 percent, this world might recapture a
bit of balance. According to my calculations, as you get up into the
multi category, you up the ante. Last year I nervously committed to 15
percent, and that old feeling rose again: the greed gene. When you get
to multi-multi, you should give away half every year. Excuse me, Mr.
Gates, but the concept of billionaire is obscene. I know you give a lot
away, and it's easy for me to mouth off, but I do know something about
it. During the Oliver Stone film on our band, the record royalties
tripled, and as I wrote those 10 percent checks, my hand was shaking.
Why? It only meant that I was making much more for myself. It was the
hand of greed. I am reminded of the sound of greed, trying to talk me
into not vetoing a Doors song for a cigarette ad in Japan.
"It's the only way to get a hit over there, John. They love commercials.
It's the new thing!"
"What about encouraging kids to smoke, Ray?"
"You always have to be PC, don't you, John?" I stuck to my guns and
vetoed the offer, thinking about the karma if we did it. Manzarek has
recently been battling stomach ulcers. So muster up courage, you
capitalists; hoarding hurts the system--inner as well as outer.
So it's been a lonely road resisting the chants of the rising
solicitations: "Everybody has a price, don't they?" Every time we (or I)
resist, they up the ante. An Internet company recently offered three
mil for "Break on Through." Jim's "pal" (as he portrays himself in
his bio) said yes, and Robby joined me in a resounding no! "We'll give
them another half mil, and throw in a computer!" the prez of Apple
pleaded late one night.
Robby stepped up to the plate again the other day, and I was very
pleased that he's been a longtime friend. I was trying to get through to
our ivory tinkler, with the rap that playing Robin Hood is fun, but the
"bottom line" is that our songs have a higher purpose, like keeping the
integrity of their original meaning for our fans. "Many kids have said
to me that 'Light My Fire,' for example, was playing when they first
made love, or were fighting in Nam, or got high--pivotal moments in
their lives." Robby jumped in. "If we're only one of two or three groups
who don't do commercials, that will help the value of our songs in the
long run. The publishing will suffer a little, but we should be proud of
our stance." Then Robby hit a home run. "When I heard from one fan that
our songs saved him from committing suicide, I realized, that's it--we
can't sell off these songs."
So, in the spirit of the Bob Dylan line, "Money doesn't talk, it
swears," we have been manipulated, begged, extorted and bribed to make a
pact with the devil. While I was writing this article, Toyota Holland
went over the line and did it for us. They took the opening
melodic lines of "Light My Fire" to sell their cars. We've called up
attorneys in the Netherlands to chase them down, but in the meantime,
folks in Amsterdam think we sold out. Jim loved Amsterdam.
I received the news of paleontologist and popular science writer Stephen
Jay Gould's death, at age 60, in the week I was reading Jonathan Marks's
new book on genetics, human evolution and the politics of science. My
friends and I discussed our shock--Gould had famously "beat" cancer some
years back--and shared charming and ridiculous Gould information, like his
funny-voice contributions to The Simpsons. Postings on leftist
listservs noted that Gould's fulsome New York Times obituary,
which rattled on about his "controversial" theory of punctuated
equilibrium, his SoHo loft and love of baseball, neglected to mention
his extensive antiracist writing and many other radical activities,
including working with the Science for the People group. Rhoda and Mark
Berenson wrote in to commend his strong support for the release of their
daughter Lori, the young American leftist sympathizer long imprisoned as
a "terrorist" in Peru.
With Gould gone, the landscape of progressive English-language popular
science writing is much impoverished. In particular, in an era in which
silly, and most frequently racist and sexist "it's all in our genes"
narratives have become--alas!--purely commonsensical in the mass media,
if not in the academy, we have lost a stalwart and articulate
evolutionary biologist who wrote prolifically against sociobiology's
reductionist framings of human experience. But molecular anthropologist
Jonathan Marks, with his broad history-of-science background, his
take-no-prisoners stance on scientific stupidity and overreaching, and
his hilarious Groucho Marx delivery, can help to fill that void.
What It Means to Be 98% Chimpanzee addresses precisely that
question--the issue of human/higher primate connections--and all its
existential and political entailments. Marks reframes the old C.P. Snow
"two cultures" debate, on the gulf between the sciences and the
humanities, in a new and interesting way. Rather than blaming the
general public for its scientific ignorance--which I must confess is my
own knee-jerk tendency--Marks turns the lens around. He indicts
scientists, and particularly his own confrères in genetics, for
their long history of toadying to elite interests: "Where human lives,
welfare, and rights are concerned, genetics has historically provided
excuses for those who wish to make other people's lives miserable, to
justify their subjugation, or to curry favor with the wealthy and
powerful by scapegoating the poor and voiceless." Marks's conclusion is
that genetics "is therefore now obliged to endure considerably higher
levels of scrutiny than other, more benign and less corruptible, kinds
of scientific pronouncements might."
And scrutinize he does. First, Marks provides us with an accessible
history of the linked Western efforts, since the seventeenth century, to
comprehend the natures of nonhuman higher primates, and to develop
biological taxonomy, both before and since the rise of evolutionary
theory. With word-pictures and actual illustrations of explorers' and
others' accounts of "Pongoes," "Baboones, Monkies, and Apes," he makes
vivid for us the ways in which "the apes, by virtue of straddling a
symbolic boundary, are highly subject to the projections of the
scientist from the very outset of modern science." Not the least of
Marks's virtues are his deft along-the-way explanations, as for instance
the key physiological differences between monkeys and apes (the latter
are "large-bodied, tailless, flexible-shouldered, slow-maturing"). Only
last week, I found myself hectoring a hapless video-store worker about
the absurd conjunction, in the store's display case, of an orangutan
(ape) stuffed animal with a Monkey Business movie poster. Now I
can just hand out 98% Chimpanzee.
The "projection" problem, according to Marks, is far more inherent to
biological taxonomy than heretofore realized. He offers amusing
lightning sketches of scientists past and present, from the
eighteenth-century catfight between Buffon and Linnaeus over whether
intrahuman variation could be categorized biologically--the latter
eventually acknowledging Buffon "by naming a foul-smelling plant after
him"--to paleobiologist George Gaylord Simpson's two-martini lunches in
his 1980s Arizona retirement as he declaimed against contemporary
genetic reductionists. These humanized history-of-science narratives
allow Marks to make clear the uncertainties and arbitrariness of "hard"
science categorizations. While "every biology student knows that humans
are mammals," because human females nurse their young, Marks notes that
"it is not obviously the case that breast-feeding is the key feature any
more than having a single bone in the lower jaw (which all
Mammalia, and only Mammalia, have)." He uses historian
Londa Schiebinger's work to show us how Linnaeus, who had been operating
with Aristotle's four-legged "Quadrupedia" label, switched to
Mammalia because he was active in the contemporary movement
against upper-class women sending their infants out to wet nurses: "He
was saying that women are designed to nurse their own children, that it
is right, and that it is what your family should do."
Political apprehensions, as we know, were woven just as deeply into
scientists' evolving modes of categorizing
intrahuman--"racial"--variation. Here Marks tells some familiar stories
in new ways. Many know, for example, about racist University of
Pennsylvania anthropologist Carleton Coon's last-ditch claims, in the
early 1960s, that "the length of time a subspecies has been in the
sapiens state" determines "the levels of civilization attained by some
of its populations." But Marks offers us as well a fascinating sense of
the times. We see, for example, Sherwood Washburn, the Harvard Yankee of
later Man the Hunter fame, and Ashley Montagu, the debonair English
anthropologist redbaited out of the academy and onto What's My
Line appearances, ending up "on the same side, working to purge
anthropology once and for all of the classificatory fallacy that had
blinded it since the time of Linnaeus.... Coon died...an embittered and
largely forgotten figure, done in, he supposed, by the forces of
political correctness, and more darkly (he allowed in personal
correspondence) by a conspiracy of communists and Jews as well."
The importance of cultural constructions, and their irreducibility to
biological functions, have been hoary apothegms in anthropology
classrooms for a half-century. Awareness of the susceptibility of
scientific practice to the politics of reputation has been with us since
the Kuhnian 1960s. Ethnographic, historical and journalistic work on
bench science from the 1980s forward has focused on the political
framing of, and politicized language use in, hard science research and
on the power of corporate and state funding to determine research
directions and even findings. But Marks takes the "cultural construction
of science" line much further than even most progressive critics of the
contemporary idiocies of sociobiologists--although he does get off some
lovely lines, like "sociobiology, which studies the biological roots of
human behavior, whether or not they exist." He takes the critique home
to his specialty, evolutionary molecular genetics, and demonstrates the
multifarious ways that recent claims about human nature and evolution,
based on DNA evidence, have been misframed, are irrelevant or are often simply wrong.
That we "are" 98 percent chimpanzee, says Marks, is a profound
misframing. First, our biological closeness to the great apes "was known
to Linnaeus without the aid of molecular genetics." "So what's new? Just
the number." Then he points out that the meaning of phylogenetic
closeness depends upon the standpoint from which it is viewed: "From the
standpoint of a daffodil, humans and chimpanzees aren't even 99.4%
identical, they're 100% identical. The only difference between them is
that the chimpanzee would probably be the one eating the daffodil."
Then, the diagnostic genetic dissimilarities between chimpanzees and
humans do not cause the observed differences between them, and are
therefore irrelevant to discussions of the "meaning" of our genetic similarity:
When we compare their DNA, we are not comparing their genes for
bipedalism, or hairlessness, or braininess, or rapid body growth during
adolescence.... We're comparing other genes, other DNA regions, which
have either cryptic biochemical functions, or, often, no known function
at all. It's the old "bait and switch." The genes we study are not
really the genes we are interested in.
Thus all of the wild claims about our "chimp" nature, which have ranged
over the past forty years from male-dominant hunter (early 1960s) to
hippie artist and lover (late 1960s through 1970s) to consummate
competitor (Gordon Gekko 1980s) are entirely politically constructed.
And, Marks adds, in considering the "demonic male" interpretation of
chimp competition as like that of Athens and Sparta, they are simply
argument by analogy: "Maybe a chimpanzee is sort of like a Greek
city-state. Maybe an aphid is like Microsoft. Maybe a kangaroo is like
Gone With the Wind. Maybe a gopher is like a microwave oven."
Just plain dumb.
Using this set of insights, Marks eviscerates a wide array of
contemporary "hi-tech folk wisdom about heredity" claims, from the
"successes" of both the Human Genome and Human Genome Diversity Projects
to the "Caucasian" Kennewick Man, the "genetic" athletic superiority of
black Americans, the genetics of Jewish priesthood and the existence of
a "gay gene." He is particularly trenchant against the Great Ape
Project's use of human/ape genetic similarities to argue for "human
rights" for apes, frequently to the detriment of the impoverished
African and Southeast Asian residents of ape homelands: "Apes should be
conserved and treated with compassion, but to blur the line between them
and us is an unscientific rhetorical device.... our concern for them
can't come at the expense of our concern for human misery and make us
numb to it."
There is much more in 98% Chimpanzee, a real treasure trove of
thoughtful, progressive scientific thought. But I do have a quibble.
While Marks takes an uncompromising equal rights stance when it comes to
female versus male biology, he doesn't delve anywhere near as deeply
into the insanities of contemporary "hi-tech folk wisdom" about
sex--like the "rape is genetic" claims of a few years back--as he does
about race. And they are legion, and just as politically consequential.
Nevertheless, this is an important and refreshing book, the first
claimant to replace the magisterial and out-of-print Not in Our
Genes, and a fitting monument to Stephen Jay Gould's passing. Now
tell me the one again about the duck with lips.