Nation Topics - Books and the Arts


The SAT has been on the ropes lately. The University of California
system has threatened to quit using the test for its freshman
admissions, arguing that the exam has done more harm than good. The
State of Texas, responding to a federal court order prohibiting its
affirmative action efforts, has already significantly curtailed the
importance of the SAT as a gatekeeper to its campuses. Even usually
stodgy corporate types have started to beat up on the SAT. Last year,
for example, a prominent group of corporate leaders joined the National
Urban League in calling upon college and university presidents to quit
placing so much stock in standardized admissions tests like the SAT,
which they said were "inadequate and unreliable" gatekeepers to college.

Then again, if the SAT is anything, it's a survivor. The SAT
enterprise--consisting of its owner and sponsor, the College Board, and
the test's maker and distributor, the Educational Testing Service--has
gamely reinvented itself over the years in myriad superficial ways,
hedging against the occasional dust-up of bad public relations. The SAT,
for example, has undergone name changes over the years in an effort to
reflect the democratization of higher education in America and
consequent changes in our collective notions about equal opportunity.
But through it all, the SAT's underlying social function--as a sorting
device for entry into or, more likely, maintenance of American
elitehood--has remained ingeniously intact, a firmly rooted icon of
American notions about meritocracy.

Indeed, the one intangible characteristic of the SAT and other
admissions tests that the College Board would never want to change is
the virtual equation, in the public's mind, of test scores and academic
talent. Like the tobacco companies, ETS and the College Board (both are
legally nonprofit organizations that in many respects resemble
profit-making enterprises) put a cautionary label on the product.
Regarding their SAT, the organizations are obliged by professional codes
of proper test practices to inform users of standardized admissions
tests that the exams can be "useful" predictors of later success in
college, medical school or graduate school, when used in conjunction
with other factors, such as grades.

But admissions testing doesn't actually occupy so modest a place in
American life. Most clear-eyed Americans know that results on the SAT,
Graduate Record Exam or the Medical College Admission Test are widely
viewed as synonymous with academic talent in higher education. Whether
it's true or not--and there's lots of evidence that it's not--is quite
beside the point.

Given the inordinate weight that test scores carry in the American
version of meritocracy, it's no surprise that federal courts have been
hearing lawsuits from white, middle-class law school applicants
complaining that they were denied admission even though their LSAT
scores were fifty points higher than those of minority applicants who
were admitted; that neoconservative doomsayers warn that the academic
quality of America's great universities will plummet if the hordes of
unwashed (read: low test scores) are allowed entry; and that articles
are written under titles like "Backdoor Affirmative Action," arguing
that de-emphasizing test scores in Texas and California is merely a
covert tactic of public universities to beef up minority enrollments in
response to court bans on affirmative action.

Indeed, Rebecca Zwick, a professor of education at the University of
California, Santa Barbara, and a former researcher at the Educational
Testing Service, wrote that "Backdoor Affirmative Action" article for
Education Week in 1999, implying that do-gooders who place less
emphasis on test scores in order to raise minority enrollments are
simply blaming the messenger. And so it should not be surprising that
the same author would provide an energetic defense of the SAT and
similar exams in her new book, Fair Game? The Use of Standardized
Admissions Tests in Higher Education.

Those, like Zwick, who are wedded to the belief that test scores are
synonymous with academic merit will like this concise book. They will
praise its 189 pages of text as, finally, a fair and balanced
demystification of the esoteric world of standardized testing. Zwick and
her publisher are positioning the book as the steady, guiding hand
occupying the sensible middle ground in an emotional debate that they
claim is dominated by journalists and other uninformed critics who don't
understand the complex subject of standardized testing. "All too
often...discussions of testing rely more on politics or emotion than on
fact," Zwick says in her preface. "This book was written with the aim of
equipping contestants in the inevitable public debates with some solid
information about testing."

If only it were true. Far from reflecting the balanced approach the
author claims, the book is thinly disguised advocacy for the status quo
and a defense of the hegemony of gatekeeping exams for college and
university admissions. It could be more accurately titled (without the
bothersome question mark) "Fair Game: Why America Needs the SAT."

As it stands, the research staff of the College Board and the
Educational Testing Service, Zwick's former employer, might as well have
written this book, as she trots out all the standard arguments those
organizations have used for years to show why healthy doses of
standardized testing are really good for American education. At almost
every opportunity, Zwick quotes an ETS or College Board study in the
most favorable light, couching it as the final word on a particular
issue, while casting aspersions on other studies and researchers (whose
livelihoods don't depend on selling tests) that might well draw
different conclusions. Too often Zwick
provides readers who might be unfamiliar with the research about testing
with an overly simplistic and superficial treatment. At worst, she
leaves readers with grossly misleading impressions.

After providing a quick and dirty account of IQ testing at the turn of
the last century, a history that included the rabidly eugenic beliefs of
many of the early testmakers and advocates in Britain and the United
States ("as test critics like to point out," Zwick sneers), the author
introduces readers to one of the central ideologies behind the use of
mental testing to sort a society's young for opportunities in higher
education. Sure, mental testing has brought some embarrassing moments
in history that we moderns frown on nowadays, but the testing movement
has had its good guys too. In this view, the cold objectivity of
standardized testing is not a tool to promote and protect the interests
of a society's most privileged citizens but an essential instrument for
the exercise of democratic values.

According to this belief, standardized testing for admission to college
serves the interest of meritocracy, in which people are allowed to shine
by their wits, not their social connections. That same ideology, says
Zwick, drove former Harvard president James Bryant Conant, whom Zwick
describes as a "staunch supporter of equal opportunity," in his quest to
establish a single entrance exam, the SAT, for all colleges. Conant, of
course, would become the first chairman of the board of the newly formed
Educational Testing Service. But, as Nicholas Lemann writes in his 1999
book The Big Test: The Secret History of the American Meritocracy,
Conant wasn't nearly so interested in widening opportunity to higher
education as Zwick might think. Conant was keen on
expanding opportunity, but, as Lemann says, only for "members of a tiny
cohort of intellectually gifted men." Disillusioned only with the form
of elitism that had taken shape at Harvard and other Ivy League
colleges, which allotted opportunities based on wealth and parentage,
Conant was nevertheless a staunch elitist, an admirer of the
Jeffersonian ideal of a "natural aristocracy." In Conant's perfect
world, access to this new kind of elitehood would be apportioned not by
birthright but by performance on aptitude tests. Hence the SAT, Lemann
writes, "would finally make possible the creation of a natural
aristocracy."

The longstanding belief that high-stakes mental tests are the great
equalizer of society is dubious at best, and at worst a clever piece of
propaganda that has well served the interests of American elites. In
fact, Alfred Binet himself--among the fathers of IQ testing, who would
invent the first version of the Stanford-Binet intelligence test, the
precursor to the modern SAT--observed the powerful relationship between
one's performance on his so-called intelligence test and a child's
social class, a phenomenon Binet described in his 1916 book The
Development of Intelligence in Children.

And it's the same old story with the SAT. Look at the college-bound high
school seniors of 2001 who took the test, and the odds are still firmly
stacked against young people from modest economic backgrounds. A
test-taker whose parents did not complete high school can
expect to score fully 171 points below the SAT average, College Board
figures show. On the other hand, high schoolers whose moms and dads have
graduate degrees can expect to outperform the SAT average by 106 points.

What's more, the gaps in SAT performance between whites and blacks and
between whites and Mexican-Americans have only ballooned in the past ten
years. The gap between white and black test-takers widened five points
and eleven points on the SAT verbal and math sections, respectively,
between 1991 and 2001. SAT score gaps between whites and
Mexican-Americans surged a total of thirty-three points during that same
period.

For critics of the national testing culture, such facts are troubling
indeed, suggestive of a large web of inequity that permeates society and
the educational opportunities distributed neatly along class and race
lines, from preschool through medical school. But for Zwick, the notion
of fairness when applied to standardized admissions tests boils down to
a relatively obscure but standard procedure in her field of
"psychometrics," which is in part the study of the statistical
properties of standardized tests.

Mere differences in average test scores between most minority groups and
whites or among social classes aren't all that interesting to Zwick. More
interesting, she maintains, is the comparative accuracy of test scores
in predicting university grades between whites and other racial groups.
In this light, she says, the SAT and most standardized admissions tests
are not biased against blacks, Latinos or Native Americans. In fact, she
says, drawing on 1985 data from a College Board study that looked at
forty-five colleges, those minority groups earned lower grades in
college than predicted by their SAT scores--a classic case of
"overprediction" that substantiates the College Board claim that the SAT
is more than fair to American minorities. By contrast, if the SAT is
unfair to any group, it's unfair to whites and Asian-Americans, because
they get slightly better college grades than the SAT would predict,
Zwick suggests.

Then there's the odd circumstance when it comes to standardized
admissions tests and women. A number of large studies of women and
testing at the University of California, Berkeley, the University of
Michigan and other institutions have consistently shown that while women
(on average) don't perform as well on standardized tests as male
test-takers do, women do better than men in actual classroom work.
Indeed, Zwick acknowledges that standardized tests tend to
"underpredict" the actual academic performance of women, the opposite
of the pattern she reports for most minority groups.

But on this question, as with so many others in her book, Zwick's
presentation is thin, more textbookish than the thorough examination and
analysis her more demanding readers would expect. Zwick glosses over a
whole literature on how the choice of test format, such as
multiple-choice versus essay examinations, rewards some types of
cognitive approaches and punishes others. For example, there's evidence
to suggest that SAT-type tests dominated by multiple-choice formats
reward speed, risk-taking and other surface-level "gaming" strategies
that may be more characteristic of males than of females. Women and
girls may tend to approach problems somewhat more carefully, slowly and
thoroughly--cognitive traits that serve them well in the real world of
classrooms and work--but hinder their standardized test performance
compared with that of males.

Beyond Zwick's question of whether the SAT and other admissions tests
are biased against women or people of color is the perhaps more basic
question of whether these tests are worthwhile predictors of academic
performance for all students. Indeed, the ETS and the College Board sell
the SAT on the rather narrow promise that it helps colleges predict
freshman grades, period. On this issue, Zwick's presentation is not a
little pedantic, seeming to paint anyone who doesn't claim to be a
psychometrician as a statistical babe in the woods. Zwick quotes the
results of a College Board study published in 1994 finding that one's
SAT score by itself accounts for about 13 percent of the differences in
freshman grades; that one's high school grade average is a slightly
better predictor of college grades, accounting for about 15 percent of
the grade differences among freshmen; and that the SAT combined with
high school grades is a better predictor than the use of grades alone.
In other words, it's the standard College Board line that the SAT is
"useful" when used with other factors in predicting freshman grades. (It
should be noted that Zwick, consistent with virtually all College Board
and ETS presentations, reports her correlation statistics without
converting them into what's known as "R-squared" figures. In my view,
the latter statistics provide readers with a common-sense understanding
of the relative powers of high school grades and test scores in
predicting college grades. I have made those conversions for readers in
the statistics quoted above.)
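
To make the conversion concrete, here is a minimal arithmetic sketch;
the correlations shown are back-calculated from the rounded percentages
quoted above, not figures reported in the book. Squaring a correlation
coefficient gives the share of variance explained:

    R-squared = r x r
    SAT alone:            r of about 0.36  ->  R-squared of about 0.13
                          (about 13 percent of the differences in freshman grades)
    High school average:  r of about 0.39  ->  R-squared of about 0.15
                          (about 15 percent)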

Unfortunately, Zwick misrepresents the real point that test critics make
on the question of predictive validity of tests like the SAT. The
salient issue is whether the small extra gains in predicting freshman
grades that the SAT might afford individual colleges outweigh the social
and economic costs of the entire admissions testing enterprise, costs
borne by individual test-takers and society at large.

Even on the narrow question of the usefulness of the SAT to individual
colleges, Zwick does not adequately answer what's perhaps the single
most devastating critique of the SAT. For example, in the 1988 book
The Case Against the SAT, James Crouse and Dale Trusheim argued
compellingly that the SAT is, for all practical purposes, useless to
colleges. They showed, for example, that if a college wanted to maximize
the number of freshmen who would earn a grade-point average of at least
2.5, then the admissions office's use of high school rank alone as the
primary screening tool would result in 62.2 percent "correct"
admissions. Adding the SAT score would improve the rate of correct
decisions by only about 2 in 100. The researchers also showed,
remarkably, that if the admissions objective is broader, such as
optimizing the rate of bachelor's degree completion for those earning
grade averages of at least 2.5, the use of high school rank by itself
would yield a slightly better rate of prediction than if the SAT scores
were added to the mix, rendering the SAT counterproductive. "From a
practical viewpoint, most colleges could ignore their applicants' SAT
score reports when they make decisions without appreciably altering the
academic performance and the graduation rates of students they admit,"
Crouse and Trusheim concluded.

At least two relatively well-known cases of colleges at opposite ends of
the public-private spectrum, which have done exactly as Crouse and
Trusheim suggest, powerfully illustrate the point. Consider the
University of Texas system, which was compelled by a 1996 federal
appeals court order, the Hopwood decision, to dismantle its
affirmative-action admissions programs. The Texas legislature responded
to the threat of diminished diversity at its campuses with the "top 10
percent plan," requiring public universities to admit any student
graduating in the top 10 percent of her high school class, regardless of
SAT scores.

Zwick, of course, is obliged in a book of this type to mention the Texas
experience. But she does so disparagingly and without providing her
readers with the most salient details on the policy's effects in terms
of racial diversity and the academic performance of students. Consider
the diversity question. While some progressives might have first
recoiled at the new policy as itself an attack on affirmative action,
those fears have not been borne out. In fact, at the University of Texas at
Austin, the racial diversity of freshman classes has been restored to
pre-Hopwood levels, after taking an initial hit. Indeed, the
percentage of white students at Austin reached a historic low point in
2001, at 61 percent. What's more, the range of high schools sending
students to the state's flagship campus at Austin has broadened
significantly. The "new senders" to the university include more inner-city
schools in Dallas, Houston and San Antonio, as well as more rural
schools than in the past, according to research by UT history professor
David Montejano, among the plan's designers.

But the policy's impact on academic performance at the university might
be even more compelling, since that is the point upon which
neoconservative critics have been most vociferous in their condemnations
of such "backdoor" affirmative action plans that put less weight on test
scores. A December 1999 editorial in The New Republic typified
this road-to-ruin fiction: Alleging that the Texas plan and others like
it come "at the cost of dramatically lowering the academic
qualifications of entering freshmen," the TNR editorial warned,
these policies are "a recipe for the destruction of America's great
public universities."

Zwick, too, neglects to mention the facts about academic performance of
the "top 10 percenters" at the University of Texas, who have proven the
dire warnings to be groundless. In 2000, at every SAT score interval,
from less than 900 to 1,500 and higher, students admitted without
regard to their SAT scores earned better grades than their non-top 10
percent counterparts, according to the university's latest research
report on the policy.

Or consider that the top 10 percenters averaged a GPA of 3.12 as
freshmen. Their SAT average was about 1,145, fully 200 points lower
than that of non-top 10 percent students, who earned slightly lower
GPAs of 3.07. In
fact, the grade average of 3.12 for the automatically admitted students
with moderate SAT scores was equal to the grade average of non-top 10
percenters coming in with SATs of 1,500 and higher. The same pattern has
held across the board, and for all ethnic groups.

Bates College in Lewiston, Maine, is one case of a college that seemed
to anticipate the message of the Crouse and Trusheim research. Bates ran
its own numbers, found that the SAT was simply not an adequate
predictor of academic success for many students, and abandoned
the test as an entry requirement several years ago. Other highly
selective institutions have similar stories to tell, but Bates serves to
illustrate. In dropping the SAT mandate, the college now gives students
a choice of submitting SATs or not. But it permits no choice in
requiring that students submit a detailed portfolio of their actual work
and accomplishments while in high school for evaluation, an admissions
process completed not just by admissions staff but by the entire Bates
faculty.

As with the Texas automatic admission plan, Zwick would have been
negligent not to mention the case of Bates, and she does so in her
second chapter; but it's an incomplete and skewed account. Zwick quotes
William Hiss, the former dean of admissions at Bates, in a 1993
interview in which he suggests that the Bates experience, while perhaps
appropriate for a smaller liberal arts college, probably couldn't be
duplicated at large public universities. That quote well serves Zwick's
thesis that the SAT is a bureaucratically convenient way to maintain
academic quality at public institutions like UT-Austin and the
University of California. "With the capability to conduct an intensive
review of applications and the freedom to consider students' ethnic and
racial backgrounds, these liberal arts colleges are more likely than
large university systems to succeed in fostering diversity while toeing
the line on academic quality," Zwick writes.

But Zwick neglects to mention that Hiss has since disavowed his caveats
about Bates's lessons for larger public universities. In fact, Hiss, now
a senior administrator at the college, becomes palpably irritated at
inequalities built into admissions systems that put too much stock in
mental testing. He told me in a late 1998 interview, "There are twenty
different ways you can dramatically open up the system, and if you
really want to, you'll figure out a way. And don't complain to me about
the cost, that we can't afford it."

Zwick punctuates her brief discussion of Bates and other institutions
that have dropped the SAT requirement by quoting from an October 30,
2000, article, also in The New Republic, that purportedly
revealed the "dirty little secret" on why Bates and other colleges have
abandoned the SAT. The piece cleverly observed that because SAT
submitters tend to have higher test scores than nonsubmitters, dropping
the SAT has the added statistical quirk of boosting SAT averages in
U.S. News & World Report's coveted college rankings. That
statistical anomaly was the smoking gun the TNR reporter needed
to "prove" the conspiracy.

But to anyone who has seriously researched the rationales colleges have
used in dropping the SAT, the TNR piece was a silly bit of
reporting. At Bates, as at the University of Texas, the SAT
"nonsubmitters" have performed as well or better academically than
students who submitted SATs, often with scores hundreds of points lower
than the SAT submitters. But readers of Fair Game? wouldn't know
this.

One could go on citing many more cases in which Zwick misleads her
readers through lopsided reporting and superficial analysis, such as her
statements that the Graduate Record Exam is about as good a predictor of
graduate school success as the SAT is for college freshmen (it's not,
far from it), or her overly optimistic spin on the results of many
studies showing poor correlations between standardized test scores and
later career successes.

Finally, Zwick's presentation might have benefited from a less
textbookish style, with more enriching details and concrete examples.
Instead, she tries to position herself as a "just the facts" professor
who won't burden readers with extraneous contextual details or accounts
of the human side of the testing culture. But like the enormously
successful--at least in commercial terms--standardized tests themselves,
which promote the entrenched belief in American society that genuine
learning and expert knowledge are tantamount to success on Who Wants to
Be a Millionaire-type multiple-choice questions, books like Fair Game?
might be the standardized account that some readers really want.

A more virulent nuclear era has superseded the perils of the cold
war.

Sister, they say heed the hymn in your heart.
You've learned you've an odd rhythm in your heart.

You and I versus our brothers: pitched war.
The four of us in the swim of your heart.

I saw a bird chasing moths trace spirals
in the air, how you love him in your heart!

The wind blows an apple, an acorn down.
Let's revise: follow each whim in your heart.

In the west, weft ascends warp. In the east,
weft treads warp. Silk Route wisdom in your heart.

Knowledge an ocean shaped by desire,
who defines the idiom: in your heart

of hearts? How many hearts do we have? When
one breaks song soothes like a balm in the heart.

Who'll play dub to your syncopated lub?
Endeavor, love, 'gainst tedium in the heart.

The hated math teacher played, "Less is more,"
with my name. Whence the harem in your heart?

Henry James could not resist giving the hero of his 1877 novel The
American the allegorical name "Newman," but he went out of his way
to describe him as a muscular Christian, to deflect the suggestion that
Newman might be Jewish, as the name would otherwise imply. He is, as an
American, a New Man, who has come to the Old World on a cultural
pilgrimage in 1868, having made his fortune manufacturing washtubs; and
James has a bit of fun at his hero's expense by afflicting him with an
aesthetic headache in the Louvre, where his story begins. "I know very
little about pictures or how they are painted," Newman concedes; and as
evidence, James has him ordering, as if buying shirts, half a dozen
copies of assorted Old Masters from a pretty young copyist who thinks he
is crazy, since, as she puts it, "I paint like a cat."

By a delicious historical coincidence, another New Man, this time
unequivocally Jewish--the Abstract Expressionist Barnett Newman--visits
the Louvre for the first time in 1968, exactly a century later. By
contrast with his fellow noble savage, this Newman has had the benefit
of reading Clement Greenberg and working through Surrealism. So he is
able to tell his somewhat patronizing guide, the French critic Pierre
Schneider, to see Uccello's The Battle of San Romano as a modern
painting, a flat painting, and to explain why Mantegna's Saint
Sebastian bleeds no more than a piece of wood despite being pierced
with arrows. He sees Géricault's Raft of the Medusa as
tipped up like one of Cézanne's tables. "It has the kind of
modern space you wouldn't expect with that kind of rhetoric." And in
general the new New Man is able to show European aesthetes a thing or
two about how to talk about the Old Masters, and incidentally how to
look at his own work, which so many of his contemporaries found
intractable. In Rembrandt, for example, Newman sees "all that brown,
with a streak of light coming down the middle...as in my own painting."

"All that brown, with a streak of light coming down the middle" could be
taken as a description of the first of Newman's paintings with which the
artist felt he could identify himself, done exactly two decades earlier
than the Louvre visit, and retroactively titled by him Onement 1.
Most would have described it as a messy brown painting with an uneven
red stripe down the middle, and nobody but Newman himself would have
tolerated a comparison with Rembrandt. But Newman told Pierre Schneider,
"I feel related to this, to the past. If I am talking to anyone, I am
talking to Michelangelo. The great guys are concerned with the same
problems." We must not allow it to go unnoticed that Newman counted
himself as among the great guys, though it is something of a hoot to
imagine trying to convince Henry James, were he resurrected, that the
works that make up the wonderful Newman exhibition at the Philadelphia
Museum of Art (until July 7, when they travel to the Tate Modern) are
concerned with the same issues as the Louvre masterworks that gave his
protagonist Newman a headache and eyestrain. Even critics otherwise
sympathetic to advanced painting in the 1950s were made apoplectic by
Newman's huge, minimally inflected canvases--fields of monochromatic
paint with a vertical stripe or two--and they have provoked vandalism
from the time of his first solo show at the Betty Parsons Gallery in
1950. As we shall see, Newman thought he had resolved the problems that
concerned the great guys who preceded him. They had been struggling to
make beautiful pictures, whereas he considered himself as having
transcended beauty and picturing alike. His achievement was to capture
the sublime in painting.

Newman regarded Onement 1 as marking a breakthrough for his work,
and a new beginning. The installation in Philadelphia dramatizes this by
framing the piece by means of a doorway leading from one gallery into
another. While standing in a gallery hung with pictures done by Newman
before the breakthrough, one glimpses a new order of painting in the
room beyond. Like all the great first generation of Abstract
Expressionists, Newman seems to have passed abruptly from mediocrity to
mastery with the invention of a new style--like the flung paint of
Pollock, the heavy brush-strokes of de Kooning, Kline's timberlike black
sweeps against white, Rothko's translucent rectangles of floating color.
The pre-Onement paintings may seem somehow to point toward it, in
the sense that there is in most of them a bandlike element that aspires,
one might say, to become the commanding vertical streak. But in them,
the streak (or band, or bar) shares space with other elements, splotches
and squiggles and smears that are tentative and uninspired. The vertical
streak alone survives a kind of Darwinian struggle for existence, to
become the exclusive and definitive element in Newman's vision, from
Onement 1 onward. The basic format of Newman's work for the
remainder of his career is that of one or more vertical bands, which run
from the top to the bottom of the panel, in colors that contrast with a
more or less undifferentiated surrounding field. Sometimes the bands
will be of differing widths in the same painting, and sometimes, again,
they will differ from one another in hue. But there will no longer be
the variety of forms he used in the pre-Onement period of his
work. It is as if he understood that with Onement 1, he had
entered a newfound land rich enough in expressive possibilities that he
need seek for nothing further by way of elementary forms. Onement 1
is planted like a flag at the threshold, and when one crosses over
it, one is in a very different world from that marked by the uncertain
pictures that preceded it.

I have followed Newman in respecting a distinction between pictures and
paintings. Onement 1 was a painting, whereas what he had done
before were merely pictures. How are we to understand the difference? My
own sense is that a picture creates an illusory space, within which
various objects are represented. The viewer, as it were, looks through
the surface of a picture, as if through a window, into a virtual space,
in which various objects are deployed and composed: the Virgin and Child
surrounded by saints in an adoration; stripes surrounded by squiggles in
an abstraction. In the Renaissance, a picture was regarded as
transparent, so to speak, the way the front of the stage is, through
which we see men and women caught up in actions that we know are not
occurring in the space we ourselves occupy. In a painting, by contrast,
the surface is opaque, like a wall. We are not supposed to see through
it. We stand in a real relationship with it, rather than in an illusory
relationship with what it represents. I expect that this is the
distinction Newman is eager to make. His paintings are objects in their
own right. A picture represents something other than itself; a painting
presents itself. A picture mediates between a viewer and an object in
pictorial space; a painting is an object to which the viewer relates
without mediation. An early work that externally resembles Onement 1
is Moment, done in 1946. A widish yellow stripe bisects a
brownish space. Newman said of it, "The streak was always going through
an atmosphere; I was trying to create a world around it." The
streak in Onement 1 is not in an atmosphere of its own, namely
pictorial space. It is on the surface and in the same space as we are.
Painting and viewer coexist in the same reality.

At the same time, a painting is not just so much pigment laid across a
surface. It has, or we might say it embodies, a meaning. Newman did not
give Onement 1 a title when it was first exhibited, but it is
reasonable to suppose that the meaning the work embodied was somehow
connected with this strange and exalted term. In general, the suffix
"-ment" is attached to a verb like "atone" or "endow" or "command,"
where it designates a state--the state of atoning, for example--or a
product. So what does "onement" mean? My own sense is that it means the
condition of being one, as in the incantation "God is one." It refers,
one might say, to the oneness of God. And this might help us better
understand the difference between a picture and a painting. Since Newman
thinks of himself and Michelangelo as concerned with the same kinds of
problems, consider the Sistine ceiling, where Michelangelo produces a
number of pictures of God. Great as these are, they are constrained by
the limitation that pictures can show only what is visible, and
decisions have to be made regarding what God looks like. How would one
picture the fact that God is one? Since Onement 1 is not a
picture, it does not inherit the limitations inherent in picturing. The
catalogue text says that Onement 1 represents nothing but itself
and that it is about itself as a painting. I can't believe, though, that
what Newman regarded in such momentous terms was simply a painting about
painting. It is about something that can be said but cannot be shown, at
least not pictorially. Abstract painting is not without content. Rather,
it enables the presentation of content without pictorial limits. That is
why, from the beginning, abstraction was believed by its inventors to be
invested with a spiritual reality. It was as though Newman had hit upon
a way of being a painter without violating the Second Commandment, which
prohibits images.

Kant wrote in the Critique of Judgment that "perhaps the most
sublime passage in Jewish Law is the commandment Thou shalt not make
unto thee any graven image, or any likeness of anything that is in
heaven or on earth, or under the earth," etc. This commandment alone can
explain the enthusiasm that the Jewish people felt for their religion
when compared with that of other peoples, or can explain the pride that
Islam inspires. But this in effect prohibited Jews from being artists,
since, until Modernism, there was no way of being a painter without
making pictures and hence violating the prohibition against images!
Paintings that are not pictures would have been a contradiction in
terms. That, in turn, ruled out the possibility of making paintings
that were sublime, an aesthetic category to which Kant dedicated a
fascinating and extended analysis. And while one cannot be certain how
important the possibility of Jewish art was to Newman, there can be
little question not only that the sublime figured centrally in his
conception of his art but that it was part of what made the difference
in his mind between American and European art. Indeed, sublimity figured
prominently in the way the Abstract Expressionists conceived of their
difference from European artists. Robert Motherwell characterized
American painting as "plastic, mysterious, and sublime," adding, "No
Parisian is a sublime painter." In the same year that Newman broke
through with Onement 1, he published an important article, "The
Sublime Is Now," in the avant-garde magazine Tiger's Eye. And my
sense is that in his view, there could not be a sublime picture--that
sublimity became available to visual artists only when they stopped
making pictures and started making paintings.

Peter Schjeldahl recently dismissed the sublime as a hopelessly jumbled
philosophical notion that has had more than two centuries to start
meaning something cogent and has not succeeded yet. But the term had
definite cogency in the eighteenth century, when philosophers of art
were seeking an aesthetics of nature that went beyond the concept of
beauty. Beauty for them meant taste and form, whereas the sublime
concerned feeling and formlessness. Kant wrote that "nature excites the
ideas of the sublime in its chaos or in its wildest and most irregular
disorder and desolation, provided size and might are perceived," and he
cited, as illustrations,

Bold overhanging and as it were threatening rocks; clouds piled up in
the sky, moving with lightning flashes and thunder peals; volcanoes in
all their violence of destruction; hurricanes with their track of
devastation; the boundless ocean in a state of tumult; the lofty
waterfall of a mighty river, these exhibit our faculty of resistance as
insignificantly small in comparison with their might.

Since Kant was constrained to think of art in terms of pictures as
mimetic representations, there was no way in which painting could be
sublime. It could only consist in pictures of sublime natural things,
like waterfalls or volcanoes. While these might indeed be sublime,
pictures of them could at most be beautiful. Kant does consider
architecture capable of producing the feeling of sublimity. He cites
Saint Peter's Basilica as a case in point because it makes us feel small
and insignificant relative to its scale.

What recommended the sublime to Newman is that it meant a liberation
from beauty, and hence a liberation from an essentially European
aesthetic in favor of an American one. The European artist, Newman
wrote,

has been continually involved in the moral struggle between notions of
beauty and the desire for the sublime.... The impulse of modern art was
this desire to destroy beauty. Meanwhile, I believe that here in
America, some of us, free from the weight of European culture, are
finding the answer, by denying that art has any concern with the problem
of beauty and where to find it. The question that now arises is how can
we be creating an art that is sublime?

There can be little doubt that in Newman's sense of his own achievement,
he had solved this problem with Onement 1. It is certainly not a
beautiful painting, and one would miss its point entirely if one
supposed that sooner or later, through close looking, the painting would
disclose its beauty as a reward. There was a standing argument, often
enlisted in defense of Modernism, that the reason we were unable to see
modern art as beautiful was because it was difficult. Roger Fry had
written, early in the twentieth century, that "every new work of
creative design is ugly until it becomes beautiful; that we usually
apply the word beautiful to those works of art in which familiarity has
enabled us to grasp the unity easily, and that we find ugly those works
in which we still perceive only by an effort." Newman's response to this
would have been that he had achieved a liberation from what feminism
would later call the beauty trap. He had achieved something grander and
more exalted, a new art for new men and women.

Newman used the term "sublime" in the title of his Vir Heroicus
Sublimis (1950-51). It is a tremendous canvas, nearly eight feet high
and eighteen feet wide, a vast cascade of red paint punctuated by five
vertical stripes of
varying widths, set at varying intervals. Newman discussed this work
(which the critic for The New Republic called asinine) in an
interview with the British art critic David Sylvester in 1965.

One thing that I am involved in about painting is that the painting
should give a man a sense of place: that he knows he's there, so he's
aware of himself. In that sense he related to me when I made the
painting because in that sense I was there. Standing in front of my
paintings you had a sense of your own scale. The onlooker in front of my
painting knows that he's there. To me, the sense of place not only has a
mystery but has that sense of metaphysical fact.

Newman studied philosophy at City College, and Kant sprang to his lips
almost as a reflex when he discussed art. But it is difficult not to
invoke the central idea of Martin Heidegger's philosophy in connection
with his comment to Sylvester. Heidegger speaks of human beings as
Dasein, as "being there," and it is part of the intended experience
of Newman's paintings that our thereness is implied by the scale of the
paintings themselves. In his 1950 exhibition at the Betty Parsons
Gallery, he put up a notice that while there is a tendency to look at
large paintings from a distance, these works were intended to be seen
from close up. One should feel oneself there, in relationship to the
work, like someone standing by a waterfall. The title of the painting
meant, he told Sylvester, "that man can be or is sublime in his relation
to his sense of being aware." The paintings, one might say, are about us
as self-aware beings.

A high point of the Philadelphia show is Newman's The Stations of the
Cross, a series of fourteen paintings that is certainly one of the
masterpieces of twentieth-century art. As a spiritual testament, it
bears comparison with the Rothko Chapel in Houston. I have the most
vivid recollection of being quite overcome when I first experienced
The Stations of the Cross in the Guggenheim Museum in 1966.
Newman used as subtitle the Aramaic words Lema Sabachthani--Christ's
human cry on the Cross. The means could not be
more simple: black and white paint on raw canvas, which he used as a
third color. The fourteen paintings do not map onto corresponding points
on the road to Calvary. But Newman seems to use black to represent a
profound change of state.

The first several paintings have black as well as white stripes (or
"zips," as he came to call them, referring perhaps to the sound that
masking tape makes when it is pulled away). Black entirely disappears in
the Ninth Station, in which a stripe of white paint runs up the
left edge, and two thin parallel white stripes are placed near the right
edge. The rest is raw canvas. The Tenth and Eleventh
stations resemble it, in that they too are composed of
white stripes placed on raw canvas. Then, all at once, the Twelfth
Station is dramatically black, as is the Thirteenth Station.
And then, in the Fourteenth Station, black again abruptly
disappears. There is a strip of raw canvas at the left, and the rest is
white, as if Christ yielded up the ghost as St. Matthew narrates it. The
work demonstrates how it is possible for essentially abstract paintings
to create a religious narrative.

No one today, I suppose, would accord painting the exalted status that
seemed possible in the 1950s. Newman became a hero to the younger
generation of the 1960s, when the history of art that he climaxed gave
way to a very different era. He triumphed over his savage critics, as
great artists always do; and all who are interested in the spiritual
ambitions of painting at its most sublime owe themselves a trip to
Philadelphia to see one of the last of the great guys in this thoughtful
and inspired exhibition, the first to be devoted to his work in more
than thirty years.

British folk-rocker Billy Bragg has to be the only popular musician who
could score some airtime with a song about the global justice movement.
The first single from Bragg's England, Half English (Elektra),
"NPWA" (No Power Without Accountability), is destined to become an
enduring anthem for anticorporate organizers everywhere. Just before leaving England to tour the United States in April, Bragg took a few minutes to talk with
Nation assistant literary editor Hillary Frey about
globalization, Woody Guthrie, the duty of a political songwriter and,
perhaps most important, why the AFL-CIO should be sponsoring free rock
concerts. A longer version of this interview appears on The Nation's
website (www.thenation.com).

HF: I've read that you were politicized during the Thatcher years
in England. How did that happen, and how did your politics find their
way into your music?

BB: When Margaret Thatcher was first elected, in 1979, I didn't
vote. Perhaps that was the arrogance of youth.... It was at the height
of punk, and I was titularly an anarchist. Although, frankly, that was
more of a T-shirt than a developed idea. Her second term, between 1983
and 1987, really brought my political education. By then, Thatcher had
started to chip away at the idea of the welfare state and what that
stands for--free healthcare, free education, decent affordable housing
for ordinary people.

Then, the 1984 Miners' Strike [which protested pit closures and paltry
pay increases for workers] was the real politicization for me. I started
doing gigs outside of London in the coal fields and found that I was
able to articulate what I believed in so that these people who we were
doing benefits for--the miners--didn't think I was just some pop star
from London trying to enhance my career by doing a few fashionable
benefits. I began to define myself by something other than the standard
"Blowin' in the Wind" sort of politics, which aren't that hard to
articulate.

HF: You were in New York City when the World Economic Forum [WEF]
met, and I heard you speak about the groups organizing demonstrations. I
recall a comment to the effect of, "If you really want to be doing
something active and participatory you would organize your local
McDonald's." What are your opinions on the tactics of the global justice
movement?

BB: I feel very strongly that the movement is a positive thing.
The fact that it hasn't yet defined itself in a clear ideological way
doesn't mean that it won't eventually. I feel very much on the
activists' side. However, I don't believe you can change the world by
smashing up fast-food joints.

My approach is perhaps a little more traditional left; I believe that if
you want to change the world, as I said, you should be organizing
fast-food joints. To me, that is a positive way of changing the world.
It's a lot slower, and it won't get you on CNN. But the sort of
campaigns that I've worked with in the USA--Justice for Janitors,
living-wage initiatives in LA and cities like that--have all been rooted
in labor organizing.

HF: How did your relationship with the labor movement evolve?

BB: I made a very strong bond with the labor movement in England
during the Thatcher years, particularly during the Miners' Strike. And
those bonds have stood me in good stead when coming to a country like
the United States, where not only are the politics very different from
the ideological politics of my own country, but I'm a foreigner. As an
internationalist I support UNITE, who are trying to end sweatshop labor
in the clothing industry; we're doing that in the UK as well. That is
the sort of internationalist angle prevalent in the global justice
movement too, and it's something that I can support across borders.

HF: I was surprised to see that your tours are actually sponsored
by a union.

BB: I've just come off a tour actually, that was sponsored by the
GMB, which is one of our general unions.

HF: I can't imagine a union being involved in a concert here in
the United States.

BB: I know! In 1992 I participated in a concert in Central Park
marking the eightieth birthday of Woody Guthrie that was sponsored by
one of the big soft-drink companies. Now why could it not have been
sponsored by the AFL-CIO? Why couldn't the AFL-CIO say, "This is what we
do, we put on free gigs." This is what unions do--bring people together.
The unions have been doing this in the UK for a while, and certainly all
over continental Europe. I've been doing gigs in Italy and France
organized by the big unions there for the last two decades.

How do you explain to young people what unions are for--do you wait
until they're in trouble? Do you wait till they're in a dead-end job?
Wait till they're fired? Or do you get in before with some positive
ideas of what a union is?

HF: Speaking of Woody Guthrie... A few years back you recorded,
with the band Wilco, Mermaid Avenue Vols. I and II--two records
comprising songs written around unrecorded Woody Guthrie lyrics. How did
you get to be the lucky one rooting around in the Guthrie archives and
recording his words?

BB: Woody Guthrie is the father of my tradition--the political
singer/songwriter tradition. I've tried to answer the question of why
[Woody's daughter] Nora chose to give me the great honor of being the
first one in her father's archives.... I guess Nora saw something in my
experience that she thought chimed in with Woody's. Who writes about
unions in the United States and the song gets on the charts? All of the
postwar singer/songwriters have grown up in a nonideological atmosphere.
Their influences have been single issues like the civil rights movement,
Vietnam, campaigning for the environment. There's not been that whole
ideological struggle really going on in the USA.

HF: Is it harder to write political music now than it was when
you started?

BB: It's much more difficult to do this now, without Margaret
Thatcher and Ronald Reagan and the Berlin wall and apartheid--these
things were shorthand for struggles that went on across the world. Now I
don't miss any of those things; I have absolutely no nostalgia for the
1980s whatsoever, and I never want to see any of those things again. But
the job of the political singer/songwriter is perhaps more challenging
because, with a subject like identity, which I deal with on England,
Half English, it's personal--it means different things to different
people.

HF: But it's clear there is plenty happening now to respond to.
The single from your new record, "NPWA" (No Power Without
Accountability), strikes me as a paean to the global justice movement.

BB: The job of the singer/songwriter is to try to reflect the
world around him, and obviously the global justice movement has been the
big cause célèbre since Seattle. When I was in New York in
February, there was stuff I saw going on the like of nothing I've ever
seen on the left before.

I went to a Methodist Church where activists were speaking about how
they were going to organize the demonstrations [around the WEF] two days
later. They asked me to sing a couple of songs so I sang "NPWA"--and
then they wanted me to sing the "Internationale," and that really
touched me, because we do have a strong tradition on the left, and one
of the things we have to gain from the demise of the Stalinism of the
Soviet Union and the Berlin wall is that we have an opportunity to
create a leftist idea outside the shadow of totalitarianism. And there,
in New York, among very radical young people, I thought, "OK--this isn't
really so different from what I know. It's just a different approach to
get to the same place." And the fact that I've been doing this for
twenty years and people are still interested--I feel fortunate. I figure
I must be hitting some bases.

England, Half English is available now from Elektra Records.

A young man of 16, visiting his cousins in Calcutta in a house in a
"middle-middle-class area," has just published his first poem. This
not-yet-poet from Bombay is the narrator of Amit Chaudhuri's short story
"Portrait of an Artist." The artist in the story is not the visiting
youth, however, but an older man, the English tutor who comes each week to instruct the cousins. This
man is respectfully called mastermoshai.

Mastermoshai has already been shown the narrator's poem. (One of the
cousins reports that the teacher was "very impressed.") On a Saturday
morning, the budding poet meets mastermoshai. He has a "very Bengali
face" with "spectacles that belonged to his face as much as his eyes
did" and "teeth that jutted out from under his lip, making his face
belong to the preorthodontal days." The cousins, and also the narrator,
wait for mastermoshai to say something about the poem. When two literary
men meet in Bengal, they do not indulge in small talk but instead
"straightaway enter realms of the abstract and articulate," we are
advised. Fittingly, mastermoshai's first question to the poet, in a
Bengali-inflected English, is, "Are you profoundly influenced by
Eliot?"

"It was mastermoshai who first spoke to me of Baudelaire," the narrator
says, and there are other discoveries in this induction into the
literary life. When the older man takes the poet to an editor's house in
another part of Calcutta, Chaudhuri's portrait of the artist shades into
a portrait of private homes and of the city as a whole. In Calcutta, our
poet discovers, clerks and accountants nurture an intellectual or
literary life, not only in English but also in Bengali. The city appears
provincial, but it also reveals, like Joyce's Dublin, its particularity.

The literary passions that this city with a colonial past breeds are
already obsolete elsewhere. Yet they inspire a romance that is real and
productive. That is what the young poet feels after the years have
passed. By then, mastermoshai has faded into the oblivion of insanity.
His interest in Eliot and Baudelaire is seen by the narrator as a
"transitional" time during which, after the early losses of his life,
mastermoshai had returned to his "youthful enthusiasms." You realize
that the story is not so much about the space of literature, which like
the city itself offers surprises that serve as a refuge from the general
claustrophobia and madness. Instead, it is about the patient and
sometimes crazy, and mostly anonymous, striving in the former
colonies--and also about the tribute we need to pay to mentors in a
literary culture that functions without the trappings of creative
writing programs and, in the case of the poor, even ordinary colleges
and schools.

Chaudhuri's other stories in this debut collection, Real Time,
also concern themselves with the conditions under which art is born or
the circumstances in which artists live. The book's closing story is
about Mohanji, a gentle and gifted singer trained in classical
Hindustani music. He makes a living by teaching affluent housewives in
Bombay how to sing devotional bhajans and ghazals.
Mohanji's life now is "a round of middle-aged women" in Bombay's
affluent districts like Cuffe Parade and Malabar Hill. At night, he
takes the fast train back to his home in a ghetto in distant Dadar.

Lately, Mohanji has been feeling ill. He believes he has an ulcer. He
also suffers from tension. This tension comes "from constantly having to
lie to the ladies he taught--white lies, flattery--and from not having a
choice in the matter."

Mohanji's student Mrs. Chatterjee does not always have the time to
practice. But she would like to sing. She tells her teacher that she
wishes she could sing like him. Mohanji is "always surprised" that the
rich had desires for "what couldn't be theirs." He is also amused that
"it wasn't enough for Mrs. Chatterjee that she, in one sense, possessed
him; she must possess his gift as well."

This sudden sharpness on Mohanji's part, like his illness, reveals a
malaise. The gentleness in the guru, a quality to which Mrs. Chatterjee
had grown so accustomed, is now shown to be the result of great
restraint and even artistic discipline. The story's presentation of
Mohanji's speech and his silence ushers us into the domain of criticism.

We get a clue here to Chaudhuri's own art. He belongs to a very small
group of Indian writers in English who are as good critics as they are
storytellers. This skill at criticism is not a result of close
reading--though that ability is in fine evidence in The Picador Book
of Indian Literature, which Chaudhuri has edited--but of a serious
search for a reading public. Chaudhuri's writing, both critical and
fictional, subtly demonstrates for this public (which is yet unborn) its
most responsible function.

There is a great need for such acts in India. Recently, at a literary
festival in Delhi, I heard a well-known writer telling her audience that
there were only two literary critics in Punjabi in the whole country.
But this wasn't the worst. She said that one of the two critics was a
university professor who was interested only in promoting the female
students who were doing their doctorates under him. The other was a man
in Chandigarh who wrote exclusively about other writers from his own Jat
caste. The writer said, "Since I am neither a pretty face nor a Jat, I
am ignored."

I thought about the Punjabi writer, and about Chaudhuri, who was also
there at the festival, when I was awakened past midnight in my hotel
room in Delhi by a call from London. It was someone from the BBC.
Earlier that day, V.S. Naipaul had been rude to another writer. Now the
BBC wanted to know if I believed that "Naipaul had lost it."

I wasn't able to provide gossip. But, as I lay awake in bed after the
call, I remember wondering whether I hadn't made a mistake thinking that
the problem of building a critical culture was India's alone. Did
Britain, for example, have a vibrant literary public sphere? Why then
was the BBC not rousing people from sleep to ask about the solitude of a
writer working in Punjabi, a language that is used by millions, and
endowed with a rich literary past, but now possessing no critics?

Fifteen short stories and a reminiscence-in-verse make up Real Time. Not all the pieces are as strong as the ones mentioned above.
A few of the short stories, like the one in the voice of a humiliated
demon from the Ramayana, are clever sketches but call for a more
extended treatment in order to be satisfying. There is a first-person
account of a housewife who is writing a memoir--a story meant to mock
the Indian writing scene, where, it seems, a new writer is born every
day. But Chaudhuri's wit is suited to a more muted, or perhaps just more
nuanced, register, and here the mockery falls flat.

"Words, silences," a story about two male friends who are meeting each
other after a long time, contains a hint of a half-understood homosexual
exchange between them in their boyhood. But the story, in its reticence,
offers too little, the author's silence acting like a silencing of its
own. A couple of other stories in the autobiographical mode work better,
recalling the lyricism and humor of Chaudhuri's earlier fiction. His
first three novels, published in a single volume in the United States
under the title Freedom Song, won a Los Angeles Times book
award in 2000. That year Chaudhuri also published a novel, A New World, about an expatriate Indian's return to Calcutta after his divorce.

A real gem in the present collection is the title story "Real Time,"
which along with the account of Mohanji was first published in the
British magazine Granta. This elegantly crafted story recounts an
executive's visit to a house in Calcutta where a shraddha, or
memorial ceremony, is being held. The ceremony is for a young married
woman who has committed suicide by jumping from the third-floor balcony
of her parents' house.

The visitor and his wife--the latter is related to the family--have been
able to find the house only with some difficulty. They have bought
tuberoses on the way, having bargained the price down from sixteen to
fourteen rupees. The rituals of mourning are not clear in the case of a
suicide. The narrative supplies very little conventional pathos, and yet
pathos is present in the story, always in tension with other quotidian
details that intrude upon the consciousness of the narrator. The visitor
spots an acquaintance and they fall into a conversation about "the
recent changes in their companies," their own children and even "a brief
disagreement about whether civil engineering had a future as a career
today."

Death produces a great absence, but here, in the story, the absence has
more to do with the fact that the visiting couple know very little about
the suicide. They had learned of the death from an item in the
newspaper. Grief remains remote. More than death, it is this distance
that produces a blankness, which, however, slowly gets filled with
ordinariness, and even trivia. The narrative is so precise that it is
with a tiny jolt that the reader realizes that this inconsequential
ordinariness is what we usually call life.

Jacques Derrida has written that the Moroccan Abdelkebir Khatibi does
not speak of his mother tongue "without a trembling that can be heard,"
a "discreet tremor of language that undersigns the poetic resonance of
his entire work." The same can be said of Chaudhuri. In his prose,
history always happens elsewhere. It is like an earthquake in the heart
of the earth. What the writing registers is only the shock and the
falling buildings.

In early 1993, a short while after the demolition of the Babri Mosque in
Ayodhya and the riots that had followed, Chaudhuri wrote a travel essay
about his return to India from Oxford. In that essay, he described how
the metal nameplates in the house where his father had lived in Bombay
were now all blank. This had been done to protect the Muslims living in
the building. "Small, accidental sensations, too small to be called
incidents," he wrote, "told me I was now living in a slightly altered
world."

The trip on which Chaudhuri discovered the small detail of blank metal
nameplates sowed the seed for his novel Freedom Song. While
reading his earlier novels, I had been struck by the way in which
Chaudhuri's evocative, Proustian sentences accumulated visual details. I
thought of Bengali cinema, the moment of its modernity and the movement
of the camera recording the texture of middle-class life. But there was
also an aural element to this writing. It was punctuated with delicate
pauses that made the prose musical. The sentences were marked by spaces
of silence and filled with near-poetry.

It was only when reading Freedom Song, however, that I got a more
vivid sense of Chaudhuri's unique and flawed aesthetic. The rise of
Hindu fundamentalism and the changes ushered in by market liberalization
provide the immediate occasion for the novelist to examine the changes
that affect a small group of relatives and friends. These changes are
not overwhelming; they are subtle variations on a more settled routine.
The technique works because it saves history from the banality of a
slogan. At the same time, it also carries the danger of slipping into a
mannerism. Both the strength and, on occasion, the weakness are present
in the stories of Real Time.

In recent weeks, hundreds have died in India in religious riots
orchestrated by the Hindu right in retaliation for the burning alive of
fifty-eight Hindus in a train. These events have challenged the
democratic credentials of the Indian nation-state. But they also pose a
question for intellectuals and artists, and this is the question of
seeking a powerful and imaginative response to the carnage.

What is our response in "real time"? And how does this time find breath
in our writing? Chaudhuri, in his attention to the imaginative use of
language, makes the search for the answers a process of magical
discovery. Let me end with a passage from Freedom Song that
captures the inertness but also the dynamism of the life that Chaudhuri
sees unfolding around him:

It was afternoon. And in a small lane, in front of a pavement, with the
movement of a wrist, something like a curve began to appear, it was not
clear what pattern was forming, then the letter D appeared upon a wall
of a two-storey house, in black paint, and then U, and N, until DUNKEL
had been formed, in the English language, which seemed to blazon itself
for its curious purpose; then it began again, and I and M and F began to
appear in another corner. Afternoon; no one saw them; it was too hot; on
the main road cars went past, up and down; a few people rested; they had
eaten; beggars dozed, blind to the heat and shadows, their heads bent to
the stomach.

In the past two decades, Richard Rodriguez has offered us a gamut of
anecdotes, mostly about himself in action in an environment that is not
always attuned to his own inner life. These anecdotes have taken the
form of a trilogy that started in 1983 with the classic Hunger of Memory, continued in 1993 with Days of Obligation and concludes now with his new book Brown: The Last Discovery of America. This isn't a trilogy about history.
It isn't about sociology or politics either, at least in their most
primary senses. Instead, it is a sustained meditation on Latino life in
the United States, filled with labyrinthine reflections on philosophy
and morality.

Rodriguez embraces subjectivity wholeheartedly. His tool, his
astonishing device, is the essay, and his model, I believe, is
Montaigne, the father of the personal essay and a genius at taking even
an insect tempted by a candle flame as an excuse to meditate on the
meaning of life, death and everything in between. Not that Montaigne is
Rodriguez's only touchstone. In Brown he chants to Alexis de
Tocqueville and James Baldwin as well. And in the previous installments
of his trilogy, particularly owing to his subject matter, he has emerged
as something of a successor to Octavio Paz.

The other trunk of this genealogical tree I'm shaping is V.S. Naipaul,
or at least that is how he appears to me, a counterpoint, as I reread
Rodriguez's oeuvre. They have much in common: They explore a
culture through its nuances and not, as it were, through its
high-profile iconography; they are meticulous littérateurs,
intelligent, incessantly curious; and, more important, everywhere they
go they retain, to their honor, the position of the outsider looking in.
Rodriguez, in particular, has been a Mexican-American but not a
Chicano--that is, he has rejected the invitation to be a full part of
the community that shaped him. Instead, he uses himself as a looking
glass to reflect, from the outside, on who Mexicans are, in and beyond
politics. This, predictably, has helped fill large reservoirs of
animosity against him. I don't know of any other Latino author who
generates so much anger. Chicanos love to hate him as much as they hate
to love him.

Why this is so isn't difficult to understand: He is customarily critical
of programs and policies that are seen as benefiting the community,
for example, bilingual education and affirmative action, which, in his
eyes, have only balkanized families, neighborhoods and cities. In
Hunger of Memory he portrayed himself as a Scholarship Boy who
benefited from racial profiling. He reached a succinct conclusion: Not
race but individual talent should be considered in a person's
application for school or work--not one's skin color, last name or
country of origin, only aptitude. Naipaul too can play the devil: His
journeys through India and the Arab world, even through the lands of El
Dorado, are unsettling when one considers his rabid opinions on the
"uncivilized" natives. But Naipaul delivers these opinions with
admirable grace and, through that, makes his readers rethink the
colonial galaxy, revisit old ideas. In that sense, Naipaul and Rodriguez
are authors who force upon us the necessity to sharpen our own ideas. We
read them, we agree and disagree with them, so as to fine-tune our own
conception of who we are. They are of the kind of writer who first
infuriates, then unsettles us. What they never do is leave the reader
unchanged. For that alone, one ought to be grateful.

Apparently, the trilogy came into being after Rodriguez's agent, as the
author himself puts it in "Hispanic," the fifth chapter of Brown,
"encouraged from me a book that answers a simple question: What do
Hispanics mean to the life of America? He asked me that question several
years ago in a French restaurant on East Fifty-seventh Street, as I
watched a waiter approach our table holding before him a shimmering
îles flottantes."

The image of îles flottantes is a fitting one, I believe,
since the Latino mosaic on this side of the border (Rodriguez often
prefers to use the term "Hispanic" in his pages) might be seen as
nothing if not an archipelago of self-sufficient subcultures: Cuban,
Puerto Rican, Mexican, Salvadoran, Nicaraguan, Dominican... and the
whole Bolivarian range of possibilities. Are these islands of identity
interconnected? How do they relate to one another? To what extent are a
Brazilian in Tallahassee and a Mexicano in Portland, Oregon, kindred
spirits?

Judging by his answer, Rodriguez might have been asked the wrong
question. Or else, he might have chosen to respond impractically. For
the question that runs through the three installments is, How did
Hispanics become brown? His belief is that brown, as a color, is the
sine qua non of Latinos, and he exercises it as a metaphor of mixture.
"Brown as impurity," he reasons. "I write of a color that is not a
singular color, not a strict recipe, not an expected result, but a color
produced by careless desire, even by accident." It is the color of
mestizaje, i.e., the miscegenation that shaped the Americas from
1492 onward, as they were forced, in spite of themselves, into modern
times. It is the juxtaposition of white European and dark aboriginal, of
Hernán Cortés and his mistress and translator, La
Malinche. And it is also the so-called raza cósmica that
Mexican philosopher José Vasconcelos talked about in the early
twentieth century, a master race that, capitalizing on its own impurity,
would rise to conquer the hemisphere, if not the entire globe.

But have Hispanics really become brown on the Technicolor screen of
America? Rodriguez is mistaken, I'm afraid. The gestation of race in the
Caribbean, from Venezuela to Mexico and the Dominican Republic, has a
different tint, since African slaves were brought in to replace Indians
for the hard labor in mines and fields, and their arrival gave birth to
other racial mixtures, among them those termed "mulattoes" and "zambos."
Argentina, on the other hand, had a minuscule aboriginal population when
the Spanish viceroys and missionaries arrived. The gauchos, a sort of
cowboy, are at the core of its national mythology, as can be seen in the
works of Domingo Faustino Sarmiento, José Hernández and
Jorge Luis Borges. "Brown," in Rodriguez's conception, might be the
color of Mexicans in East LA, but surely not of Cubans in Miami. Some
Latinos might have become brown, but not all. And then again, what does
"brown" really mean? Rodriguez embraces it as a metaphor of impurity.
Mestizos are crossbreeds, they are impure, and impurity is beautiful.
But the term "brown" has specific political connotations as well. It is,
to a large extent, a byproduct of the civil rights era, the era of
César Chávez and the Young Lords, coined in reaction to
the black-and-white polarity that played out in Washington policy
corridors and the media: Brown is between white and black, a third
option in the kaleidoscope of race. A preferred term in the Southwest
was La Raza, but "brown" also found its way into manifestoes, political
speeches, legal documents and newspaper reports.

Rodriguez isn't into the Chicano movement, though. My gut instinct is
that he feels little empathy toward the 1960s in general, let alone
toward the Mexican-American upheaval. His views on la hispanicidad in America are defined by his Mexican ancestry and by
his residence in San Francisco, where he has made his home for years. He
is disconnected from the Caribbean component of Latinos, and, from the
reaction I see in readers on the East Coast, the Caribbean Latinos are
also uninvolved with him.

Furthermore, Rodriguez limits himself to the concept of miscegenation,
but only at the racial level. What about promiscuity in language, for
example? Promiscuity might be a strong word, but it surely carries the
right message. Rodriguez's English is still the Queen's English:
overpolished, uncorrupted, stainless. How is it that he embraces
mestizaje but has little to say about Spanglish, that
disgustingly gorgeous mix of Spanish and English that is neither one nor
the other? Isn't that in-betweenness what America is about today? On the
issue of language, I have a side comment: I find it appalling that
Rodriguez's volumes are not available in Spanish to Mexicans and other
Latinos. Years ago, a small Iberian press, Megazul, released Hambre de memoria in a stilted, unapologetically Castilian translation.
That, clearly, was the wrong chord to touch, when the author's resonance
is closer to San Antonio than to San Sebastián. How much longer
need Mexicans wait to read the work en español mexicano of
a canonical figure, whose lifelong quest has been to understand Mexicans
beyond the pale? The question brings us back to Paz and his "The Pachuco
and Other Extremes," the first chapter in his masterpiece The Labyrinth of Solitude, released in 1950. It has angered Chicanos for
decades, and with good reason: This is an essay that distorts Mexican
life north of the border. Paz approached the pachuco--a social type of
Mexican youth in Los Angeles in the 1940s who fashioned a specific
lingo and idiosyncrasies that Elvis Presley appropriated obliquely--as
a deterioration of the Mexican psyche. In his work, Rodriguez has
established a sort of colloquy with Paz, though not a direct address. He
embraces Paz's cosmopolitanism, his openness, and perceives him as a
Europeanized intellectual invaluable in the quest to freshen up Mexican
elite culture. But he refuses to confront Paz's anti-Chicanismo, and in
general Paz's negative views on Latinos in the United States. Once, for
instance, when asked what he thought about Spanglish, Paz responded that
it was neither good nor bad, "it is simply an aberration." In any case,
reading both authors on US-Mexican relations is an unpredictable,
enlightening catechism, filled with detours. While Mexicans might not
like to hear what Rodriguez has to say about them and about himself (he
has talked of "hating Mexico"), at least they will be acquainted with
his opinions.

All this is to say that Rodriguez's response to "What do Hispanics mean
to the life of America?" is partial at best. The trilogy shows a mind
engaged, but its subject is almost unmovable. Hunger of Memory
was an autobiographical meditation set in the United States as the
country was about to enter the Reagan era. It denounced a stagnant
society, interested in the politics of compassion more than in the
politics of equality, a society with little patience for Mexicans.
Days of Obligation was also about los Estados Unidos as
the first Bush presidency was approaching its end. By then the Reagan
mirage was officially over. We were about to enter another house of
mirrors under the tutelage of Bill Clinton. And this third installment
of the trilogy arrives in bookstores at a time when the melting pot,
la sopa de culturas, is boiling again, with xenophobia against
Arabs at its height, and Latinos, already the largest minority according
to the latest US Census data--35.3 million strong by late 2000, if one
counts only those officially registered--are still on the fringes,
fragmented, compartmentalized, more a sum of parts than a whole.

These changes are visible only through inference in the trilogy;
Rodriguez seldom makes use of political facts. He lives in a dreamlike
zone, a universe of ideas and sensations and paradox. Somewhere in
Brown he announces:

A few weeks ago, in the newspaper (another day in the multicultural
nation), a small item: Riot in a Southern California high school.
Hispanic students protest, then smash windows, because African-American
students get four weeks for Black History month, whereas Hispanics get
one. The more interesting protest would be for Hispanic students to
demand to be included in Black History month. The more interesting
remedy would be for Hispanic History week to include African history.

This sums up Rodriguez's approach: a micromanagement of identity
delivered periodically from the same viewpoint. Or has the viewpoint
changed? It is possible to see a growing maturity by reading the trilogy
chronologically. He started as an antisegregationist, a man interested
in assimilation of Mexicans into the larger landscape of America. His
feelings toward Mexico and toward his homosexuality were tortured at the
time. These became clear, or at least clearer, in the second
installment, in which a picture of a San Francisco desolated by AIDS and
an argument with the author's own mexicanidad as personified by
his father, among other changes, were evident. Assimilation was still a
priority, but by the 1990s Rodriguez had ceased to be interested in such
issues and was more attracted to his own condition as a public gay
Latino.

Brown is again about assimilation, but from a perspective that
asserts America is a country shaped by so many interbred layers of
ethnicity that nothing is pure anymore. At one point, he describes the
conversation of a couple of girls one afternoon on Fillmore Street. He
renders them and their dialogue thus: "Two girls. Perhaps sixteen.
White. Anglo, whatever. Tottering on their silly shoes. Talking of boys.
The one girl saying to the other: ...His complexion is so cool, this
sort of light--well, not that light." And Rodriguez ends: "I realized my
book will never be equal to the play of the young." This need to capture
what surrounds him is always evident, although it isn't always
successful, because he is an intellectual obsessed with his own stream
of consciousness rather than with catching the pulse of the nation. But
I've managed to explain the continuity of themes in Rodriguez's three
volumes only tangentially.

There is another take, summed up in three catchwords: class, ethnicity
and race. He appears to encourage this reading. The first installment is
about a low-income family whose child moves up in the hierarchy; the
second about the awakening to across-the-border roots; and the third
about "a tragic noun, a synonym for conflict and isolation," race. But
Rodriguez is quick to add:

race is not such a terrible word for me. Maybe because I am skeptical by
nature. Maybe because my nature is already mixed. The word race
encourages me to remember the influence of eroticism on history. For
that is what race memorializes. Within any discussion of race, there
lurks the possibility of romance.

So is this what the trilogy is about, finally? The endeavor strikes me
as rather mercurial. Because Rodriguez works extensively through
metaphor and hyperbole, future generations will read into his books what
they please, depending on the context. I still like Hunger of Memory the best. Days of Obligation strikes me as a
collection of disparate essays without a true core. And Brown is
a book that is not fully embracing, not least because it refuses to
recognize the complexity of Latinos in the United States. In it
Rodriguez describes his namesake, Richard Nixon, as "the dark father of
Hispanicity." "Surviving Chicanos (one still meets them) scorn the term
Hispanic," Rodriguez argues, "in part because it was Richard Nixon who
drafted the noun and who made the adjective uniform." A similar
reference was invoked in an Op-Ed piece by him in the New York Times, in which he declared George W. Bush the first Hispanic
President of the United States, the way Bill Clinton was the first black
President. Is this true? The argument developed is not always clear-cut:
It twists and turns, as we have by now come to expect. I've learned to
respect and admire Rodriguez. When I was a newly arrived immigrant in
New York City, I stumbled upon an essay of his and then read his first
book. I was mesmerized by the prose but found myself in strong
disagreement with its tenets, and we have corresponded about that in the
intervening years.

At any rate, where will Rodriguez go from here, now that the trilogy is
finished? Might he finally take a long journey overseas? Is his vision
of America finally complete? Not quite, I say, for the country is
changing rapidly. Mestizaje, he argues, is no longer the domain
of Latinos alone. We are all brown: dirty and impure. "This is not the
same as saying 'the poor shall inherit the earth' but is possibly
related," Rodriguez states. "The poor shall overrun the earth. Or the
brown shall." This is a statement for the history books. In his view,
America is about to become América--everyone in it a Hispanic, if
not physically, at least metaphorically. "American history books I read
as a boy were all about winning and losing," Rodriguez states in
"Peter's Avocado," the last of the nine essays in Brown. And with
a typical twist, he continues, "One side won; the other side lost.... [But]
the stories that interested me were stories that seemed to lead off the
page: A South Carolina farmer married one of his slaves. The farmer
died. The ex-slave inherited her husband's chairs, horses, rugs, slaves.
And then what happened? Did it, in fact, happen?"

Although Chicano identity has been Luis Valdez's theme since all but the
earliest years of El Teatro Campesino, the guerrilla theater he founded
in the 1960s, getting a clear sense of his roots became doubly important
to him when his parents died in the mid-1990s. Valdez, the first Latino
playwright/director to reach Broadway and the creator of the bellwether Hispanic film Zoot Suit, had always been told his people were Yaquis from Sonora in northern Mexico, but he realized he knew very little about how they had
come to be California Chicanos.

So, in the late 1990s, he began to search his family's history and its
secrets, and what he discovered about the myths and contradictory
stories that had been handed down and about the little-known history of
the Yaqui wars in Mexico led him to write Mummified Deer, in some
ways his most personal play and his first new work for the theater in a
decade and a half (just ending its run at El Teatro Campesino in San
Juan Bautista). It's a play that uses the mythic, presentational
elements we've come to associate with Valdez's work, here present in a
Yaqui deer dancer, who together with the long arm of history defines
identity for the play.

Valdez founded El Teatro Campesino as an organizing and fundraising arm
of the United Farm Workers during the 1965 grape strike in Delano, where
he was born. The actors then were strikers who played type characters in
actos, short satirical sketches on strike issues performed at
work sites and in union halls.

But since splitting off from the union in 1967, the company has made
Chicano racial identity its focus. In the late 1960s and early '70s,
that specifically meant spiritual identity, with the theater reaching
all the way back to La Raza's Aztec and Mayan roots and making ritual
and myth, music and dance integral parts of its style.

Valdez was criticized at the time for abandoning the theater's
materialist viewpoint, and was criticized later in the decade and in the
1980s--when the entertainment industry began to understand the potential
of the Hispanic market--for his unabashed attempt to move into
commercial theater and filmmaking with Zoot Suit. Valdez's
response was that it was time for Chicanos to assume their place in the
mainstream and that separatism had been just a necessary phase that
prepared them to do so without losing their sense of identity. But it
was also clear that the young men in Zoot Suit had to reject that
aspect of pachuquismo, that very attractive, very essential part
of their identity as Chicanos, that was disruptive of society and
self-destructive.

Lack of commitment to cultural authenticity seemed confirmed--certainly
to Latino actors who protested--in 1992 when Valdez attempted to cast
Laura San Giacomo, an actress with something of a bankable name but also
an Italian ancestry, as Frida Kahlo in the movie he was trying to make
about the artist. Valdez argued that the compromise was necessary to get
Hollywood to do movies with Hispanic protagonists at all and that the
movie would offer a picture of Latino life that was not gang- or
drug-based, i.e., nonstereotypical and presumably positive.

Maybe it's just the difficulty of a Chicano writer/director making
headway in the commercial world, but in truth, it's difficult seeing
Valdez as a lost leader, as someone who's abandoned his roots, in San Juan
Bautista, the mission town where Mummified Deer has been playing
in a theater Valdez built out of a fruit-packing shed. By no means as
far off the beaten track as Glover, Vermont, where Bread and Puppet
escaped city life in the 1970s, it's still a small rural town a long way
from entertainment capitals and city attitudes.

The style of Valdez's new play also points to continuity. And for the
most part the inspired stylistic innovations that radical theaters
excelled in--in Mummified Deer for instance, a hospital bed
that's transformed into a train laden with Mexican
revolutionaries--still work their magic in Valdez's hands. The sudden
release of concentrated imagination thrills. But even when they don't
work, when they now seem more a part of tradition than vital and
expressive, their mere presence, like the continued earnest tone of his
writing in our smug, cynical time, suggests that Valdez hasn't
jettisoned the past.

In any event, the story itself makes it clear that roots are not easily
cut off. On a simple series of platforms, marked with what seem to be
petroglyphs and hung with plastic sheets that make the set look like an
ice cave--poor theater after all these years!--Mama Chu, a fierce,
84-year-old family matriarch, lies on a hospital bed, suffering from
abdominal pains. When the cause of her condition is diagnosed not as
cancer but as a mummified fetus that has been lodged in her womb for
sixty years, her granddaughter Armida, an anthro grad student at
Berkeley who's in search of the truth about her mother's life, begins to
pierce the maze of myths and half-truths that have made up Chu's story
and the family's history.

Along the way, secrets are revealed about paternity, incest and
migration. The ultimate source of these secrets and family myths isn't,
however, as in many plays, personal pathology. The half-truths and
inventions all proceed from a historic cause: the little-known Yaqui
genocide at the hands of Porfirio Díaz and the Federales, which capped four centuries of Yaqui resistance to European
colonization.

In the end, it turns out that none of Chu's children as they're
presented in the play are hers. Her children were all taken away
and murdered in the genocide. She gathered Armida's mother, aunt and
uncle to her to fill the void. (The horrific description of the mass
slaughter alone ensures that this play is not going very far into the
mainstream.)

Powerful, serious material. And Valdez doesn't always treat it
reverentially, as many lesser playwrights would. The introduction of a
kind of grotesque humor makes it all the more powerful at times. As when
Aunt Oralia (Rosa Escalante) wonders, "Can't you just yank that little
sucka [the dead fetus] out?" or Uncle Profe explains the incest by
saying simply, "We were always very close."

To his credit, Valdez doesn't treat the Chicano family reverentially,
either. He understands that they can be quite conservative even though
they've been victims (or because they've been victims). He satirizes
them and creates a number of characters that, like the satirical figures
of the actos, are one-dimensional types. With an Oralia, that
works to project a sense of how self-protective she is about the past,
but this is ultimately a play of terrible family secrets, and having the
weight of those secrets fall on an Armida who is little more than a plot
mechanism and a Berkeley-activist type blunts the force of the drama.

It's not simply a matter of an uneven cast, one that ranges all the way
from the very adept and realistic Daniel Valdez (Uncle Profe) to
Estrella Esparza (Armida), who can barely make the words her own. It's
also the writing and the way Valdez as director has the characters
played. As director, he also pitches a number of the performances very
high. An actress like Alma Martinez, who plays Mama Chu, can obviously
change gears on a dime and sketch in a reaction or attitude with the
flick of a hand, but Valdez pushes her performance hard and makes it
vocally very forceful, as if constantly to remind us what a powerful
woman this is. The result is a lack of nuance, variety and sympathy that
sent me fleeing to quieter characters like Uncle Profe and Armida's
mother, Agustina (Anita Reyes).

Then too, the revelations about the past are far too complicated; there's too much information coming at you generally, and what exactly
the deer dancer represents is obscure. Also, the symbol of the mummified
fetus at times feels contrived. All of which makes it difficult to take
in and feel comfortable with what Valdez is apparently going for in his
continuing exploration of what he understands to be a continually
evolving Chicano identity. That is, the sense that Chu's finally
confronting the Yaqui genocide results in her forgoing an operation and
keeping the fetus, which is an incarnation of both an indio past that is
dead and gone and a living Yaqui spirit that--bypassing the acquiescent
and self-deluding generation of aunt and uncle--Chu passes on to her
granddaughter, Armida.

Popular perception notwithstanding, the theory of natural selection was
accepted by every serious evolutionist long before Darwin. Earlier
scientists interpreted it as the clearest possible evidence for
intelligent design of the universe. William Paley's Natural Theology
(1802), for example, employs the famous image of the "great watchmaker" to account for the perfect adaptation of creatures to harmonious ecosystems. Darwin's innovation, which may appear small but is in fact immense, lay in his claim that natural selection is the only cause of evolution.

In one sense, this was merely a change of emphasis: The impulse of
pre-Darwinian evolutionists, faced with incontrovertible evidence of
natural selection, had been to ask why it occurred. They sought after
the "final cause" of evolution, and they found it in the proposal of an
intelligent designer. But one of the essential principles of modern
science is that such final causes are unknowable. Science must limit
itself to "efficient" or "material" causes; it must not ask why things
happen, but how. Darwin applied this principle to evolution. Whereas his
predecessors had seen the adaptation of organisms to their environment
as the effects of design, Darwin saw the physical development of
creatures as the sole cause of evolution. The great watchmaker had been
overthrown.

As Stephen Jay Gould (who died as this issue was going to press) shows in
The Structure of Evolutionary Theory, Darwin's breakthrough was
essentially methodological. Darwinism is what you get when you focus on
the micrological details, resolutely refusing to lift your eyes to the
level of the whole. Over the course of the nineteenth century, this
methodological sine qua non for scientific investigation was imposed on
every discipline, but it originated in the "dismal science" of
economics. The "political economy" of Adam Smith began from the material
actions of individuals in pursuit of their own selfish ends, and
extrapolated from this micrological level to abstract generalizations
about the economy as a whole.

What Smith calls "the economy" is thus an amalgamation of all the
self-interested actions of individuals, and precisely the same is true
of what Darwin understood as "evolution." In fact, Darwin consciously
and deliberately imported Smith's economic methodology into biology in
order to refute natural theology's argument from design. As Gould baldly
puts it, "the theory of natural selection is, in essence, Adam Smith's
economics transferred to nature." He is reluctant to dwell too long on
this kinship, no doubt because he understands the severity of the threat
it poses to Darwinism's pretensions to objectivity. Gould's ally and
sometime collaborator Richard Lewontin has criticized him for such
reticence in several exchanges first published in the New York Review of Books. Lewontin has called Gould's work "curiously unpolitical"
for failing to draw out the implications of "the overwhelming influence
of ideology in science." For Lewontin, "Darwin's theory of evolution by
natural selection is obviously nineteenth-century capitalism writ
large," and attempts to press it into the service of psychology are
"pure reification."

The distinguishing theoretical characteristic of both Darwin and Smith
is reductionism--they reduce all knowledge to the level of the
individual. As Gould notes, "The rebuttal of the former centerpiece of
natural history--the belief that organic designs record the intentions
of an omnipotent creative power--rests upon the radical demotion of
agency to a much lower level, devoid of any prospect for conscious
intent, or any 'view' beyond the immediate and personal." Today,
technological progress has enabled evolutionists to carry Darwin's
reduction a stage further. The smallest individual Darwin could study
was the organism, but it is now possible to analyze the behavior of the
gene. People like Richard Dawkins now claim that evolution is driven not
by competition between individual organisms, but by struggles among
genes.

Many evolutionary biologists keep a guilty silence regarding the ethical
implications of their theory, but Dawkins positively revels in
dehumanization. His imagery dwells lasciviously on the mechanical--our
bodies are merely "lumbering robots," or "survival machines" for genes.
His infamous book The Selfish Gene (1976) abounds in brazen
antihumanist provocations: "I am treating a mother as a machine
programmed to do everything in its power to propagate copies of the
genes which reside inside it." Nor does mechanization stop with the
body; evolutionary psychology views the mind itself as a machine,
reducing our thoughts and ideas to the chemical reactions that accompany
them. Dawkins has even propounded a theory that the history of ideas
follows rules analogous to competitive gene selection, reducing
dialectic to a tedious and pointless struggle between what he calls
"memes." Lately he has taken to writing letters to the British press,
suggesting that Osama bin Laden and George W. Bush will be enlightened
if they "study memes."

The idea that genes determine all social behavior, that human beings are
machines, evidently strikes a chord in the Western popular mind.
Postmodernist works such as Donna Haraway's "A Cyborg Manifesto"
celebrate the "posthuman" from what their authors apparently regard as a
radical perspective, while the theoretical texts of Michel Foucault and
Jean-François Lyotard advocate a micrological materialism that excludes
on principle any interest in "totalizing grand narratives." As John
Dupré has recently remarked, this "tyranny of the microscopic"
really constitutes an "intellectual pathology" whose significance is
sociological rather than scientific. Gould swats Dawkins away easily
enough--sardonically appropriating his vocabulary to dismiss his theory,
cruelly but fairly, as an "impotent meme"--but he does not explain why
such theories have come to seem plausible to many in the general public.
To examine that, we have to back up about 65 million years.

Reptilia served as Exhibit A then. Imagine Triceratops glancing
up from its grazing to notice a seven-mile-wide asteroid descending
rapidly toward its head. Triceratops had not expected this.
Nature had prepared it for the expected; it could expect to spend a
great deal of time fighting with Tyrannosaurus rex, for example,
and was formidably well-equipped for that purpose. But natural selection
had not prepared it to withstand a direct hit from a piece of rock a
league long.

The lump of stone that crashed into what is now the Yucatan Peninsula
ended the Cretaceous Period by showering the earth with fire and
brimstone, thus destroying 70 percent of living species, including
almost all the dinosaurs. This was something of a spanner in the works
of natural selection, from which it may not recover. The implications of
this catastrophe, conclusive evidence for which was discovered only in
1980, have yet to be fully assimilated by evolutionary theory. For most
of the twentieth century, orthodox Darwinists held that natural
selection--the competitive adaptation of individual organisms to their
environment--was the exclusive motor of evolutionary change. Now they
must qualify this dogma, but it is proving a laborious process.

Many scientists remain convinced that catastrophic change is the
exception. If it weren't for that pesky asteroid, they gripe, natural
selection would have continued unabated. They note that natural
selection will always work ceteris paribus--that is, other things
being equal, under the controlled laboratory environment in which modern
scientists conduct their experiments. It will work, that is to say, in
the absence of the unexpected. But don't we know from experience that
the unexpected happens all the time, and occasionally with catastrophic
consequences?

The "K-T event," as the asteroid strike is known, casts suspicion on the
doctrinaire claim that evolution is solely the result of the competitive
adaptation of individual organisms to their environment. It indicates
that the external constraints under which adaptation occurs must
inevitably exert an influence on the course of evolution. And it raises
the possibility that random, "chance" events play at least as
significant a role as the incremental, purposive process of natural
selection.

Although it represents a mortal threat to mainstream Darwinism, the
theory of catastrophic evolution is quite consistent with Stephen Jay
Gould and Niles Eldredge's epochal discovery of "punctuated
equilibrium." Punctuated equilibrium, or "punk-ek," holds that evolution
does not take place incrementally but rather in spurts that are divided
by long periods of stasis. It departs from Darwin by implying that
natural selection by competition among individual organisms cannot be
the exclusive cause of evolutionary change, since such competition does
not pause for periods of equilibrium.

Darwin is often thought to have rescued the history of life from the
superstitious fantasies of religion, by basing his theory on good,
solid, empirical evidence. But, as Gould and Eldredge noticed, the
empirical evidence does not indicate that evolution proceeds by
incremental, incessant natural selection, as Darwin claimed. In fact,
the empirical evidence indicates quite the opposite. When we look at the
living species around us, we do not find a continuum of creatures in
infinitesimally graduated stages of evolution. We find, instead, clearly
distinct species. We find the same when we look at the fossil record;
paleontology testifies that evolutionary stasis is the norm, and that
change takes place in abrupt bursts, as though suddenly spurred forward
by some external stimulus.

One of the many fascinating questions raised in Gould's The Structure of Evolutionary Theory is why Darwin did not see this. Why did he
insist on attributing sole determining power to natural selection in
defiance of the evidence? His own explanation was that the fossil record
gives a false impression because it is radically incomplete. But this
does not alter the fact that natural selection is an imposition on the
available evidence, a bold reading against the grain. Did Darwin nod?
Why was he so convinced that all evolution is caused by natural
selection among individual organisms in competition with one another?

Gould does not explain this, almost certainly for a very interesting
reason: He has often been accused, by sociobiologists and orthodox
Darwinians, of handing ammunition to creationists. There is no room for
an intelligent designer in a universe formed entirely through relentless
competition between selfish individuals, but because it allows that
external factors may influence evolution, the theory of punctuated
equilibrium is not incompatible with theories of intelligent design--a
fact that has caused no small embarrassment to its authors. The charge
of neocreationism is deeply unfair--Gould testified against creationism
in landmark court cases and ridiculed it mercilessly in his writing. He
opposed intelligent design on the grounds that it is "theology" and not
"science." In this book, obviously intended as his legacy to scientific
posterity, Gould repeatedly and emphatically protests that no matter how
many revisions and qualifications he may impose upon Darwin, he remains
a faithful follower of the great man. In a rare and revealing mixed
metaphor, he claims to have retained "the guts of the machine," and he
uses a cumbersome simile involving a piece of coral to argue, again and
again, that his own work is merely an "addition" to Darwin.

That is rubbish, and Gould must have known it. The Structure of Evolutionary Theory is an "addition" to The Origin of Species in the same sense that Capital is an "addition" to The Wealth of Nations. Gould certainly built upon Darwin's work, assuming its
premises as his own and erecting his own theory on the foundation of a
meticulous analysis of the original texts. But there comes a stage in
the construction at which, in fulfillment of the dialectical law,
quantitative change becomes qualitative change, and the extension to the
edifice deserves to be called a new building.

Despite (and because of) his vehement denials, I believe that Gould
reached that stage. His theory is more than a supplement to Darwinism,
it is an alternative view, a paradigm shift. Gould has deprived natural
selection of the exclusive role Darwin assigned to it, using the most
unimpeachable logic and the most scrupulous empirical research.

Gould obviously liked to limit the destructive impact of his criticism
to distortions of the founder's aims. But Darwin cannot so easily be
exonerated--Gould himself admits that the work of Dawkins constitutes "a
furthering and intensification of Darwin's intent." Indeed, Gould often
refers to theorists of gene selection as "ultra-Darwinists" or
"Darwinian fundamentalists," because they take the master's reductionist
method to the logical conclusion permitted by modern technology. Gould
would have been mortified to hear it, but his own interpretation
suggests that, were Darwin alive today, he might be Richard Dawkins.

Traditional creationism is based on a literal reading of Genesis and
represents no intellectual danger to Darwinism. The recent advocates of
"intelligent design," however, demand to be taken a little more
seriously because of their recent political and pedagogical successes;
they admit to the apparent age of the earth as established in the
geological record, for example, and accept the fossil record as evidence
of species change. Hard-fought cases involving the boards of education
of Kansas (1999) and Ohio (2002) have established a new beachhead for
intelligent design in the public mind, while simultaneously throwing a
shadow on natural selection's claim to be the exclusive motor of
evolutionary change.

The idea that schools in Kansas might depart from Darwinist orthodoxy
induced apoplexy among the commissars of science. John Rennie, editor of
Scientific American, urged colleges to be skeptical of applicants
from Kansas: "If kids in Kansas aren't being taught properly about
science, they won't be able to keep up with children taught competently
elsewhere. It's called survival of the fittest. Maybe the Board of
Education needs to learn about natural selection firsthand." In an
edition of the American Spectator, a leading theorist of
intelligent design, Michael Behe, professed to be mystified at Rennie's
outburst: "What is it about the topic of evolution that drives so many
people nuts? Why does a change in a farm state's high school examination
policy call forth damning editorials all the way from London, England,
and have normally staid editors threatening children?"

The answer is obvious, blindingly so. Behe does not see it because he,
like most advocates of intelligent design, approaches the issue from a
socially conservative point of view. Much scholarship on intelligent
design is sponsored by the Discovery Institute, a Seattle-based
foundation that describes itself as "dedicated to exploring and
promoting public policies that advance representative democracy, free
enterprise and individual liberty," and whose mission statement commits
it to boosting the "common sense" of the "free market." It is this
commitment, I suppose, that distracts Behe from one of the reasons the
American establishment goes "nuts" when the educational privilege of
natural selection is threatened: A threat to the exclusivity of natural
selection--individual competition--is a threat to market ideology.
(Although he tactfully pays it less attention than it deserves, Gould
acknowledges the full extent of Darwinism's complicity with Adam Smith.
But the alterations Gould introduces into evolutionary theory do not
depend on its ideological kinship with classical economics.)

Neither Behe nor his book Darwin's Black Box rates a mention in
The Structure of Evolutionary Theory, and Gould's silence on the
subject of intelligent design can be regarded as extremely eloquent. He
would have denied it, but this book really charts Gould's arduous
passage through Darwinism and his emergence on the other side. This
breakthrough seems to have been facilitated by his discovery of the
literature that Darwin was writing against. Gould blithely informs us
that "I had never read [Paley's] Natural Theology straight
through before pursuing my research for this book." Lay readers may find
this an astonishing confession from the world's leading Darwin scholar,
but those familiar with scientists' undiscriminating rejection of
metaphysics will be unsurprised. Having forced himself to pick up the
book, Gould finds that Paley's primary observation is "undoubtedly
correct," and largely accepted by Darwin--nature does indeed indicate
exquisite adaptation to environment. The difference lies in the reason
Darwin gives for this order in creation. Paley thought it bespoke a
benign creator, but Darwin "seems to mock the standard interpretation in
a manner that could almost be called cruel" when he introduces the
micrological economics of Adam Smith:

as the cruellest twist of all, this lower-level cause of pattern seems
to suggest a moral reading exactly opposite to Paley's lofty hopes for
the meaning of comprehensive order--for nature's individuals struggle
for their own personal benefit, and nothing else! Paley's
observations could not be faulted--organisms are well designed and
ecosystems are harmonious. But his interpretations could not have been
more askew--for these features do not arise as direct products of divine
benevolence, but only as epiphenomena of an opposite process both in
level of action and intent of outcome: individuals struggling for
themselves alone.

Read that last sentence again. What might bring about the triumph of the
"opposite process" to "divine benevolence"? Clue: It is not the blind
indifference of nature. The history of human thought is hardly silent
concerning the struggle between a benevolent deity and a cruel mocker.
But Gould shies away from considering the theological implications of
his theory with the standard get-out clause: "This book cannot address
such a vital issue at any depth."

Many readers will be tempted to respond: "Why on earth not? It's 1,400
pages long!" But Gould was not eager to incur again, in his magnum opus,
the tired charge of neocreationism. He does begin to speculate about why
the homologous visions of Darwin and Smith should complement each other
so conveniently, and he also raises the question of why this connection
has come to seem so glaring in recent years. But his uncharacteristic
hesitancy reveals his discomfort away from scientific terrain: "I
venture these ill-formulated statements about Zeitgeist because I feel
that something important lurks behind my inability to express these
inchoate thoughts with precision."

Indeed it does. Later in the book, Gould remarks that "the exclusivity
of organismal selection...provides the punch line that allowed the
vision of Adam Smith to destroy the explicit beauty and harmony of
William Paley's world." Absolutely true. But the exclusivity of
organismal selection is what Gould denied, too. Is it really accurate,
then, to continue calling him a "Darwinist"? At one point, Gould demands
that creationists throw in the towel and acknowledge Darwin as "the
Muhammad Ali of biology." Ali was undoubtedly a great champion, but his
present condition renders Gould's image rather ambiguous. And then, too,
the reader is left in some doubt as to whether Gould saw himself in the
role of Angelo Dundee or Joe Frazier.
