The SAT has been on the ropes lately. The University of California
system has threatened to quit using the test for its freshman
admissions, arguing that the exam has done more harm than good. The
State of Texas, responding to a federal court order prohibiting its
affirmative action efforts, has already significantly curtailed the
importance of the SAT as a gatekeeper to its campuses. Even usually
stodgy corporate types have started to beat up on the SAT. Last year,
for example, a prominent group of corporate leaders joined the National
Urban League in calling upon college and university presidents to quit
placing so much stock in standardized admissions tests like the SAT,
which they said were "inadequate and unreliable" gatekeepers to college.
Then again, if the SAT is anything, it's a survivor. The SAT
enterprise--consisting of its owner and sponsor, the College Board, and
the test's maker and distributor, the Educational Testing Service--has
gamely reinvented itself over the years in myriad superficial ways,
hedging against the occasional dust-up of bad public relations. The SAT,
for example, has undergone name changes over the years in an effort to
reflect the democratization of higher education in America and
consequent changes in our collective notions about equal opportunity.
But through it all, the SAT's underlying social function--as a sorting
device for entry into or, more likely, maintenance of American
elitehood--has remained ingeniously intact, a firmly rooted icon of
American notions about meritocracy.
Indeed, the one intangible characteristic of the SAT and other
admissions tests that the College Board would never want to change is
the virtual equation, in the public's mind, of test scores and academic
talent. Like the tobacco companies, ETS and the College Board (both are
legally nonprofit organizations that in many respects resemble
profit-making enterprises) put a cautionary label on the product.
Regarding their SAT, the organizations are obliged by professional codes
of proper test practices to inform users of standardized admissions
tests that the exams can be "useful" predictors of later success in
college, medical school or graduate school, when used in conjunction
with other factors, such as grades.
But the true place of admissions testing in America isn't nearly so
carefully circumscribed. Most clear-eyed Americans know that results on the SAT,
Graduate Record Exam or the Medical College Admission Test are widely viewed as synonymous with academic talent in higher education. Whether it's true or not--and there's lots of evidence that it's not--is quite beside the point.
Given the inordinate weight that test scores carry in the American
version of meritocracy, it's no surprise that federal courts have been
hearing lawsuits from white, middle-class law school applicants
complaining they were denied admission even though their
LSAT scores were fifty points higher than those of minority applicants
who were admitted; that neoconservative doomsayers warn that the academic quality
of America's great universities will plummet if the hordes of unwashed
(read: low test scores) are allowed entry; or that articles are written
under titles like "Backdoor Affirmative Action," arguing that
de-emphasizing test scores in Texas and California is merely a covert
tactic of public universities to beef up minority enrollments in
response to court bans on affirmative action.
Indeed, Rebecca Zwick, a professor of education at the University of
California, Santa Barbara, and a former researcher at the Educational
Testing Service, wrote that "Backdoor Affirmative Action" article for
Education Week in 1999, implying that do-gooders who place less
emphasis on test scores in order to raise minority enrollments are
simply blaming the messenger. And so it should not be surprising that
the same author would provide an energetic defense of the SAT and
similar exams in her new book, Fair Game? The Use of Standardized
Admissions Tests in Higher Education.
Those, like Zwick, who are wedded to the belief that test scores are
synonymous with academic merit will like this concise book. They will
praise its 189 pages of text as, finally, a fair and balanced
demystification of the esoteric world of standardized testing. Zwick and
her publisher are positioning the book as the steady, guiding hand
occupying the sensible middle ground in an emotional debate that they
claim is dominated by journalists and other uninformed critics who don't
understand the complex subject of standardized testing. "All too
often...discussions of testing rely more on politics or emotion than on
fact," Zwick says in her preface. "This book was written with the aim of
equipping contestants in the inevitable public debates with some solid
information about testing."
If only it were true. Far from reflecting the balanced approach the
author claims, the book is thinly disguised advocacy for the status quo
and a defense of the hegemony of gatekeeping exams for college and
university admissions. It could be more accurately titled (without the
bothersome question mark) "Fair Game: Why America Needs the SAT."
As it stands, the research staff of the College Board and the
Educational Testing Service, Zwick's former employer, might as well have
written this book, as she trots out all the standard arguments those
organizations have used for years to show why healthy doses of
standardized testing are really good for American education. At almost
every opportunity, Zwick quotes an ETS or College Board study in the
most favorable light, couching it as the final word on a particular
issue, while casting aspersions on
other studies and researchers (whose livelihoods don't depend on selling
tests) that might well draw different conclusions. Too often Zwick
provides readers who might be unfamiliar with the research about testing
with an overly simplistic and superficial treatment. At worst, she
leaves readers with grossly misleading impressions.
After providing a quick and dirty account of IQ testing at the turn of
the last century, a history that included the rabidly eugenic beliefs of
many of the early testmakers and advocates in Britain and the United
States ("as test critics like to point out," Zwick sneers), the author
introduces readers to one of the central ideologies of mental testing:
its use to sort a society's young for opportunities in higher education. Sure,
mental testing has brought some embarrassing moments in history that we
moderns frown on nowadays, but the testing movement has had its good
guys too. Rather than being a tool to promote and protect the interests
of a society's most privileged citizens, the cold objectivity of
standardized testing remains, by this account, an instrument for the
exercise of equal opportunity.
According to this belief, standardized testing for admission to college
serves the interest of meritocracy, in which people are allowed to shine
by their wits, not their social connections. That same ideology, says
Zwick, drove former Harvard president James Bryant Conant, whom Zwick
describes as a "staunch supporter of equal opportunity," in his quest to
establish a single entrance exam, the SAT, for all colleges. Conant, of
course, would become the first chairman of the board of the newly formed
Educational Testing Service. But, as Nicholas Lemann writes in his 1999
book The Big Test: The Secret History of the American
Meritocracy, Conant wasn't nearly so interested in widening
opportunity to higher education as Zwick might think. Conant was keen on
expanding opportunity, but, as Lemann says, only for "members of a tiny
cohort of intellectually gifted men." Disillusioned only with the form
of elitism that had taken shape at Harvard and other Ivy League
colleges, which allotted opportunities based on wealth and parentage,
Conant was nevertheless a staunch elitist, an admirer of the
Jeffersonian ideal of a "natural aristocracy." In Conant's perfect
world, access to this new kind of elitehood would be apportioned not by
birthright but by performance on aptitude tests. Hence the SAT, Lemann
writes, "would finally make possible the creation of a natural
aristocracy."
The longstanding belief that high-stakes mental tests are the great
equalizer of society is dubious at best, and at worst a clever piece of
propaganda that has well served the interests of American elites. In
fact, Alfred Binet himself--one of the fathers of IQ testing, whose
scales formed the basis of the Stanford-Binet intelligence test, a
precursor to the modern SAT--observed the powerful relationship between
a child's performance on his so-called intelligence test and that
child's social class, a phenomenon Binet described in his 1916 book The
Development of Intelligence in Children.
And it's the same old story with the SAT. Look at the college-bound high
school seniors of 2001 who took the SAT, and the odds remain firmly
stacked against young people of modest economic backgrounds. A
test-taker whose parents did not complete high school can
expect to score fully 171 points below the SAT average, College Board
figures show. On the other hand, high schoolers whose moms and dads have
graduate degrees can expect to outperform the SAT average by 106 points.
What's more, the gaps in SAT performance between whites and blacks and
between whites and Mexican-Americans have only ballooned in the past ten
years. The gap between white and black test-takers widened five points
and eleven points on the SAT verbal and math sections, respectively,
between 1991 and 2001. SAT score gaps between whites and
Mexican-Americans surged a total of thirty-three points during that same
period.
For critics of the national testing culture, such facts are troubling
indeed, suggestive of a large web of inequity that permeates society
and distributes educational opportunity neatly along class and race
lines, from preschool through medical school. But for Zwick, the notion
of fairness when applied to standardized admissions tests boils down to
a relatively obscure but standard procedure in her field of
"psychometrics," which is in part the study of the statistical
properties of standardized tests.
Mere differences in average test scores between minority groups and
whites, or among social classes, aren't all that interesting to Zwick.
More interesting, she maintains, is how accurately test scores predict
university grades for whites as compared with other racial groups.
In this light, she says, the SAT and most standardized admissions tests
are not biased against blacks, Latinos or Native Americans. In fact, she
says, drawing on 1985 data from a College Board study that looked at
forty-five colleges, those minority groups earned lower grades in
college than predicted by their SAT scores--a classic case of
"overprediction" that substantiates the College Board claim that the SAT
is more than fair to American minorities. By contrast, if the SAT is
unfair to any group, it's unfair to whites and Asian-Americans, because
they get slightly better college grades than the SAT would predict.
Then there's the odd circumstance when it comes to standardized
admissions tests and women. A number of large studies of women and
testing at the University of California, Berkeley, the University of
Michigan and other institutions have consistently shown that while women
(on average) don't perform as well on standardized tests as male
test-takers do, women do better than men in actual classroom work.
Indeed, Zwick acknowledges that standardized tests, unlike for most
minority groups, tend to "underpredict" the actual academic performance
of women.
But on this question, as with so many others in her book, Zwick's
presentation is thin, more textbookish than the thorough examination and
analysis her more demanding readers would expect. Zwick glosses over a
whole literature on how the choice of test format, such as
multiple-choice versus essay examinations, rewards some types of
cognitive approaches and punishes others. For example, there's evidence
to suggest that SAT-type tests dominated by multiple-choice formats
reward speed, risk-taking and other surface-level "gaming" strategies
that may be more characteristic of males than of females. Women and
girls may tend to approach problems somewhat more carefully, slowly and
thoroughly--cognitive traits that serve them well in the real world of
classrooms and work--but hinder their standardized test performance
compared with that of males.
Beyond Zwick's question of whether the SAT and other admissions tests
are biased against women or people of color is the perhaps more basic
question of whether these tests are worthwhile predictors of academic
performance for all students. Indeed, the ETS and the College Board sell
the SAT on the rather narrow promise that it helps colleges predict
freshman grades, period. On this issue, Zwick's presentation is not a
little pedantic, seeming to paint anyone who doesn't claim to be a
psychometrician as a statistical babe in the woods. Zwick quotes the
results of a College Board study published in 1994 finding that one's
SAT score by itself accounts for about 13 percent of the differences in
freshman grades; that one's high school grade average is a slightly
better predictor of college grades, accounting for about 15 percent of
the grade differences among freshmen; and that the SAT combined with
high school grades is a better predictor than the use of grades alone.
In other words, it's the standard College Board line that the SAT is
"useful" when used with other factors in predicting freshman grades. (It
should be noted that Zwick, consistent with virtually all College Board
and ETS presentations, reports her correlation statistics without
converting them into what's known as "R-squared" figures. In my view,
the latter statistics provide readers with a common-sense understanding
of the relative powers of high school grades and test scores in
predicting college grades. I have made those conversions for readers in
the statistics quoted above.)
Unfortunately, Zwick misrepresents the real point that test critics make
on the question of predictive validity of tests like the SAT. The
salient issue is whether the small extra gains in predicting freshman
grades that the SAT might afford individual colleges outweigh the social
and economic costs of the entire admissions testing enterprise, costs
borne by individual test-takers and society at large.
Even on the narrow question of the usefulness of the SAT to individual
colleges, Zwick does not adequately answer what's perhaps the single
most devastating critique of the SAT. For example, in the 1988 book
The Case Against the SAT, James Crouse and Dale Trusheim argued
compellingly that the SAT is, for all practical purposes, useless to
colleges. They showed, for example, that if a college wanted to maximize
the number of freshmen who would earn a grade-point average of at least
2.5, then the admissions office's use of high school rank alone as the
primary screening tool would result in 62.2 percent "correct"
admissions. Adding the SAT score would improve the rate of correct
decisions by only about 2 in 100. The researchers also showed,
remarkably, that if the admissions objective is broader, such as
optimizing the rate of bachelor's degree completion for those earning
grade averages of at least 2.5, the use of high school rank by itself
would yield a slightly better rate of prediction than if the SAT scores
were added to the mix, rendering the SAT counterproductive. "From a
practical viewpoint, most colleges could ignore their applicants' SAT
score reports when they make decisions without appreciably altering the
academic performance and the graduation rates of students they admit,"
Crouse and Trusheim concluded.
At least two relatively well-known cases of colleges at opposite ends of
the public-private spectrum, which have done exactly as Crouse and
Trusheim suggest, powerfully illustrate the point. Consider the
University of Texas system, which was compelled by a 1996 federal
appeals court order, the Hopwood decision, to dismantle its
affirmative-action admissions programs. The Texas legislature responded
to the threat of diminished diversity at its campuses with the "top 10
percent plan," requiring public universities to admit any student
graduating in the top 10 percent of her high school class, regardless of
test scores.
Zwick, of course, is obliged in a book of this type to mention the Texas
experience. But she does so disparagingly and without providing her
readers with the most salient details on the policy's effects in terms
of racial diversity and the academic performance of students. Consider
the diversity question. While some progressives at first recoiled at the
new policy as itself an attack on affirmative action, those fears have
proved unfounded. In fact, at the University of Texas at
Austin, the racial diversity of freshman classes has been restored to
pre-Hopwood levels, after taking an initial hit. Indeed, the
percentage of white students at Austin reached a historic low point in
2001, at 61 percent. What's more, the number of high schools sending
students to the state's flagship campus at Austin has significantly
broadened. The "new senders" to the university include more inner-city
schools in Dallas, Houston and San Antonio, as well as more rural
schools than in the past, according to research by UT history professor
David Montejano, among the plan's designers.
But the policy's impact on academic performance at the university might
be even more compelling, since that is the point upon which
neoconservative critics have been most vociferous in their condemnations
of such "backdoor" affirmative action plans that put less weight on test
scores. A December 1999 editorial in The New Republic typified
this road-to-ruin fiction: Alleging that the Texas plan and others like
it come "at the cost of dramatically lowering the academic
qualifications of entering freshmen," the TNR editorial warned,
these policies are "a recipe for the destruction of America's great
universities."
Zwick, too, neglects to mention the facts about academic performance of
the "top 10 percenters" at the University of Texas, who have proven the
dire warnings to be groundless. At every SAT score interval, from less
than 900 to scores of 1,500 and higher, in the year 2000, students
admitted without regard to their SAT score earned better grades than
their non-top 10 percent counterparts, according to the university's
latest research report on the policy.
Or consider that the top 10 percenters averaged a GPA of 3.12 as
freshmen. Their SAT average was about 1,145, fully 200 points lower than
that of non-top 10 percent students, who earned slightly lower GPAs of 3.07. In
fact, the grade average of 3.12 for the automatically admitted students
with moderate SAT scores was equal to the grade average of non-top 10
percenters coming in with SATs of 1,500 and higher. The same pattern has
held across the board, and for all ethnic groups.
Bates College in Lewiston, Maine, is one case of a college that seemed
to anticipate the message of the Crouse and Trusheim research. Bates ran
its own numbers and found that the SAT was simply not an adequate
predictor of academic success for many students, and it abandoned
the test as an entry requirement several years ago. Other highly
selective institutions have similar stories to tell, but Bates serves to
illustrate. In dropping the SAT mandate, the college now gives students
a choice of submitting SATs or not. But it permits no choice in
requiring that students submit a detailed portfolio of their actual work
and accomplishments while in high school for evaluation, an admissions
process completed not just by admissions staff but by the entire Bates
faculty.
As with the Texas automatic admission plan, Zwick would have been
negligent not to mention the case of Bates, and she does so in her
second chapter; but it's an incomplete and skewed account. Zwick quotes
William Hiss, the former dean of admissions at Bates, in a 1993
interview in which he suggests that the Bates experience, while perhaps
appropriate for a smaller liberal arts college, probably couldn't be
duplicated at large public universities. That quote well serves Zwick's
thesis that the SAT is a bureaucratically convenient way to maintain
academic quality at public institutions like UT-Austin and the
University of California. "With the capability to conduct an intensive
review of applications and the freedom to consider students' ethnic and
racial backgrounds, these liberal arts colleges are more likely than
large university systems to succeed in fostering diversity while toeing
the line on academic quality," Zwick writes.
But Zwick neglects to mention that Hiss has since disavowed his caveats
about Bates's lessons for larger public universities. In fact, Hiss, now
a senior administrator at the college, becomes palpably irritated at
inequalities built into admissions systems that put too much stock in
mental testing. He told me in a late 1998 interview, "There are twenty
different ways you can dramatically open up the system, and if you
really want to, you'll figure out a way. And don't complain to me about
the cost, that we can't afford it."
Zwick punctuates her brief discussion of Bates and other institutions
that have dropped the SAT requirement by quoting from an October 30,
2000, article, also in The New Republic, that purportedly
revealed the "dirty little secret" on why Bates and other colleges have
abandoned the SAT. The piece cleverly observed that because SAT
submitters tend to have higher test scores than nonsubmitters, dropping
the SAT has the added statistical quirk of boosting SAT averages in
U.S. News & World Report's coveted college rankings. That
statistical anomaly was the smoking gun the TNR reporter needed
to "prove" the conspiracy.
But to anyone who has seriously researched the rationales colleges have
used in dropping the SAT, the TNR piece was a silly bit of
reporting. At Bates, as at the University of Texas, the SAT
"nonsubmitters" have performed academically as well as or better than
students who submitted SATs, often with scores hundreds of points lower
than the submitters'. But readers of Fair Game? wouldn't know that.
One could go on citing many more cases in which Zwick misleads her
readers through lopsided reporting and superficial analysis, such as her
statements that the Graduate Record Exam is about as good a predictor of
graduate school success as the SAT is for college freshmen (it's not,
far from it), or her overly optimistic spin on the results of many
studies showing poor correlations between standardized test scores and
later career successes.
Finally, Zwick's presentation might have benefited from a less
textbookish style, with more enriching details and concrete examples.
Instead, she tries to position herself as a "just the facts" professor
who won't burden readers with extraneous contextual details or accounts
of the human side of the testing culture. But like the enormously
successful--at least in commercial terms--standardized tests themselves,
which promote the entrenched belief in American society that genuine
learning and expert knowledge are tantamount to success on Who Wants
to Be a Millionaire-type multiple-choice questions, books like Fair Game? might be the standardized account that some readers really want.
As a Russian studies major at Yale in the 1970s, I observed Soviet
"elections" that were conducted more fairly than the 2002 election for
the Yale Corporation's board of trustees. Why is the Yale Corporation so
threatened by the candidacy of a prominent New Haven pastor who cares
about Yale and its workers?
The last time a prospective trustee was nominated by petition was almost
forty years ago, when William Horowitz became Yale's first elected
Jewish trustee. Back then 250 signatures were required for ballot
qualification; that has since been raised to 3 percent of eligible
alumni--some 3,200 signatures today. The Rev. Dr. W. David Lee, an
African-American pastor of one of New Haven's largest churches and a
graduate of the Yale Divinity School, gathered 4,870 signatures. If
elected, he would be the only New Haven resident other than Yale's
president to sit on the corporation's board.
But he is also supported by Yale's employee unions, and the
university--one of America's great institutions of higher learning--does
not like that. Normally, the Standing Committee for the Nomination of
Alumni Fellows of the Association of Yale Alumni nominates two or three
alumni to stand for election. This year, apparently threatened by Lee's
grassroots efforts, the committee nominated only one, Maya Lin, designer
of the Vietnam Veterans Memorial, around whom the Yale Corporation and its
allies could rally.
As an alumnus, I received no fewer than six mailings--from the alumni
organization, from wealthy Yale alumni, from former corporation board
members--all criticizing Lee for failing to identify who paid for his
mailing, for his "aggressive campaign" and for his "ties to special
interests, labor unions."
In a campaign flier (containing no disclosure of who paid for it), the
Association of Yale Alumni quoted comments from Lee critical of the
university. It is not surprising that a minister of a large church at
which many Yale employees worship might at times express substantial
differences with a university that pays many of those workers less than
a living wage.
As if the Yale Corporation had not already made its interests known,
even the ballot package--paid for by the university and sent to all
voters--was slanted in favor of the corporation's candidate. The
official publication intimates support for its favored candidate from
"over 700 alumni," including the Association of Yale Alumni, the
officers of Yale college classes and Yale clubs and other alumni
associations. The other candidate, the Yale Corporation stated in the
ballot package, was simply "nominated by petition" (as though Lee's 4,870
signatures did not indicate the support of those alumni).
Reminiscent of elections conducted in one-party states, the corporation
refused to allow an observer to be present when the ballots are counted.
It is not in the Yale bylaws, he was told.
It is unfortunate that Yale, which has produced so many national
leaders, has earned a widespread reputation for its antiunion activities
[see Kim Phillips-Fein, "Yale Bites Unions," July 2, 2001]. To all but
declare war on Yale's workers and its union, and on an outstanding young
New Haven leader, can only exacerbate city-university tensions and roil
Yale's already troubled labor-management waters.
How could one pro-worker candidate who aspires to a lone seat on a board
of nineteen of America's most influential people unleash the fury of an
entire university hierarchy? Why do powerful people--the kind who sit on
Yale's board--feel so threatened by a local minister? Why can't one of
the world's most prestigious universities--with a multibillion-dollar
endowment--pay its workers a living wage?
For God. For Country. For Yale.
Since this is going to be a story about sex and children, let's start
with a bit of groping in the priests' chamber.
I must have been 12. My confederates and I, all suited out in our little
Scout uniforms--demure blouse, ribbon tie, sash of merit badges across
the chest, jaunty tam-o'-shanter--were mustered in the rectory of St. John
Gualbert's, there to be investigated on our knowledge of and devotion to
the Blessed Virgin. This was the last step toward our achieving a
Catholic girl's honor called the Marian Award. I remember the word
"investigated." I remember, too, sitting on the long bench, looking at
the heavy draperies, the carved legs of the vast dining table, waiting
my turn in the half-dark, feeling the gaze of the stripped and suffering
painted Jesus behind me while, at the head of the table, our resolutely
unmortified investigator began asking first one girl then another such
questions as "Where do babies come from?" "What do you have between your
legs?" "What do you have here?" laying hand on breast, and so on like
that. Hmm, I thought, these were nothing like the sample questions in
the manual I'd been reviewing for days. And what was he doing
easing my friend up across his tumid belly and onto his lap? I'd never
liked this priest. He was florid and coarse, with piggy eyes, a bald
head and thick fingers that he'd run along the inside of the chalice
after Communion, smacking his lips on the last drops of the blood of
Christ. My mother didn't teach me about sex--I don't count the
menstruation talk--but, without quite saying so, she taught me to regard
authority figures as persons who had to earn respect. Obedience was
rarely free, never blind. Time has stolen what this priest asked me,
where, if anyplace, he touched me; I remember him stinking of drink is
all, and myself standing schoolmarm straight and reciting, with the
high-minded air I affected for such occasions, the statement I'd been
preparing: "Father, I fail to see what that question has to do with the
Marian Award. Girls, let's go." We escaped in a whirl of gasps and
secretive giggles, rushing to telephone our Scout leader. I had no
inclination to tell my mother, but most of the other girls told theirs,
and soon the priest was relieved of child-related duties. We got our
Marian medals without further investigation, and before too long the
priest dropped dead in the street of a heart attack. Even now, as
middle-aged men weep about the lifelong trauma inflicted by an uninvited
cleric's hand to their childish buttocks, I consider my own too-close
brush with the cloth as just another scene from Catholic school.
There were very different scenes, many more in fact, that I could just
as easily conjure forward now under the heading "sex and childhood,"
though at the time I no more thought they had anything to do with sex
than our encounter with the priest or, for that matter, my mother's
subtle lessons in self-possession. They contained, rather, the bits and
pieces of a sensual education that would be fit together in some
recognizable pattern only later. And because, at least in my school at
that time, official silence about sex meant we were also spared lectures
against abortion and homosexuality, onanism and promiscuity ("Thou shalt
not commit adultery"? who knew?), what was left to us was indulgence in
the high-blown romance of the church: Gregorian chants and incantatory
Polish litanies; the telling and retelling of the ecstasies of the
saints; the intoxicating aroma of incense, of hyacinths at Easter and
heaped peonies in June; the dazzling brocades of the priests' vestments
and the Infant of Prague's extravagant dresses, which we girls would paw
through when cleaning the church on Saturday; the stories of hellfire
and martyrdom; and the dark, spare aesthetic of the nuns.
There is a parallel between my ordering of childish memories here and
the public reaction to Judith Levine's Harmful to Minors. Levine
spends a large portion of the book advocating for candid, comprehensive
sex education in schools, something I and many of my generation never
had. But the spirit that animates the book is a less programmatic,
polymorphous appreciation of the sights and smells, the sounds and
language and tactile delights that make a person--adult or child--feel
alive in her skin. Levine's central preoccupation, running like a golden
thread throughout the book, is the pursuit of happiness, the idea that
kids have a right not just to safety and knowledge but to pleasure too.
And "pleasure" here is more than the sweet shudder of a kiss, the happy
exhaustion of climax; it is the panoply of large and small things that
figure under the heading joie de vivre, including the
satisfaction, quite apart from sex, of relating deeply with others in
the world. "Knowledge" is more than facts and technical skill; it is the
ability to understand the prompts of body and mind--to recognize "when
you can't not have it," as one woman quoted by Levine replied to her
daughter's "How do I know?" question--and the wherewithal to decide when
it's time to get out of the rectory.
In another age and country this might be called reasonable, everyday
stuff. Levine spends hardly any time talking about pedophiles, none on
priests. In dissecting the various sexual panics of the past couple of
decades, she marshals a catalogue of what, in the scheme of things,
should be reassuring studies and statistics to show that satanic ritual
abuse is a myth; child abduction, molestation and murder by strangers
(as opposed to family members) is rare and not rising; pedophilia (an
erotic preference of maybe 1 percent of the population) typically
expresses itself in such "hands-off" forms as voyeurism and
exhibitionism; child sex offenders have among the lowest rates of
recidivism; child porn, whether on the Net or the streets, is almost
nonexistent and then (less reassuring) its chief reproducers and
distributors are cops; sexual solicitations aimed at children over the
Net, while creepy, have not resulted in actual assaults; and "willing"
encounters between adults and minors do not ruin minors. Although Levine
has noted in interviews that, as a teenager, she had a sexual
relationship with an older man, she never mentions it in the book, nor
does she delve too far into this last taboo. She relegates to a footnote
the fascinating, difficult story of Mary Kay Letourneau, the 35-year-old
Seattle area teacher jailed for her affair with a 13-year-old student
who impregnated her twice and insisted to the press, "I'm fine."
Levine's most detailed discussion of age-of-consent laws involves the
more easily comprehended story of a precocious 13-year-old, who also
asserted her free will, and an emotionally immature 21-year-old,
currently locked up for statutory rape. More than once Levine states,
for anyone suspicious enough to wonder, her unswerving opposition to
every form of forced, coerced or violent sex, and to sex between adults
and young children. It shouldn't be necessary for her to point out that
kids' far greater chance of dying in a car accident than at the hands of
a sex offender doesn't mean the latter isn't a problem, but she does.
Yet, for all that, her book is being blasted by
the heavy guns and light artillery of the right-wing sex police as a
child molester's manifesto.
One reason is timing. The priest scandal, one of those things that
everyone knew but kept an unbothered or guilty silence about until the
court cases and daily headlines forced a response, has raised a hysteria
against which any rationality on youthful sexuality has about as much
chance as that student facing the tank in Tiananmen Square. Even without
that, nothing seems to make the blood boil like the suggestion that it's
possible for minors to emerge unscathed or even enriched from consensual
sexual relations with adults. I have had such conversations with
leftists who angrily reject the whole notion, even as I ask, What about
X, who says it was like an answered prayer when his parents'
30-something friend initiated him sexually at 13, when for months
afterward at the end of the school day he would politely kiss his
same-age girlfriend (now his wife of twenty-five years) and then rush to
this experienced woman's bed? What about Y, who seduced her married
teacher when she was 17 and he 45, and who, thirty years later, has with
this same man one of the most loving unions I have ever seen? What about
Z, who as a youth regularly sought out the company of older men because,
apart from a sexual education, they offered him a safe place for
expression, a cultural home, a real home? The priest scandal, which
forecloses any attempt to separate vicious crime from pervy nuisance
from consenting encounter, has further limited the possibilities for
thoughtful discussion on the real things people do and feel, the causes
and effects and complex power exchanges of a human activity that does
not, and will never, operate according to the precepts of a textbook.
Another reason is that Levine's most bombastic critics had not read
Harmful to Minors before damning it. Dr. Laura, who called
on the University of Minnesota Press to stop the book's release, took
her cues from Judith Reisman, who declared Levine an "academic
pedophile." A longtime zealot in the trenches of the antipornography
cause, Reisman told the New York Times, "It doesn't take a great
deal to understand the position of the writer. I didn't read Mein
Kampf for many years, but I knew the position of the author." Tim
Pawlenty, the Minnesota House majority leader and a Republican hopeful
for governor, also admitted to not having read the book before equating
the press's role in its publication with "state-sanctioned support for
illegal, indecent, harmful activity such as molesting children." Robert
Knight, a spokesman for Concerned Women for America who urged the
university regents to fire those responsible for publishing this "evil
tome," says he "thumbed through it." Knight, whose organization is
dedicated to bringing "Biblical principles into all levels of public
policy," might consider what that would mean at a practical level,
starting with Moses' commands to his warriors in the Book of Numbers:
"Kill every woman that hath known man by lying with him. But all the
women children, that have not known a man by lying with him, keep alive
for yourselves."
Still, I think Levine would be pilloried by Dr. Laura and her ilk even
without the priest scandal and even if she had ignored the subject of
sex across the age divide. For the pleasure principle she enunciates
challenges the twenty-five-year-old organizing strategy of the right.
Ever since Anita Bryant first demonstrated that a power base could be
built by attacking homosexuals, the right has exploited real anxieties
about sex, love and family to constrain the liberatory spirit, whether
expressed by sexual preference, divorce, abortion, contraception,
women's freedom or teen sex. This has not managed to send queers back to
the closet, lower divorce rates or "protect the children." American
teenagers have about four times the pregnancy rate of teens in Western
Europe. Those in a program of "abstinence only" education still have sex
and are about half as likely to protect themselves as kids who've
received broad sex information. Even with abortion rights severely
curtailed, US teenagers have abortions at about the rate they did just
after Roe v. Wade. One in four has had a sexually transmitted
disease; one an hour is infected with HIV; and, not incidentally, among
American children one in six is poor. That notwithstanding, the sex
panic strategy has succeeded in the only way it had to: creating a
movement, with all the institutions, political power, lawmaking
capability, grassroots presence and funding that implies, to advance an
agenda for everything from global dominance to bedroom snooping.
Levine's critics are all part of that project, and since she butts
against it almost from the opening pages of her book, they are striking
back.
What is more telling is who isn't rushing to the defense. While a group
of free-speechers, pro-sex feminists and radical gay activists have
generated press releases, opinion pieces, e-mail alerts and letters of
support to Levine's publisher, there has been silence from mainstream
feminist organizations and the liberal sex-education and child-health
establishments. That may be partly because they, too, have felt the
sting of Levine's criticism. Rather than build a countermovement to
insist on sexual freedom, she writes, such heavyweights as Planned
Parenthood, the National Campaign to Prevent Teen Pregnancy, ETR
Associates (the largest US mainstream sex-ed publisher), the National
Education Association, the Health Information Network and a host of
progressive sex educators tried to appropriate the "family values"
rhetoric of the right, joining in "a contest to be best at preventing
teen sex."
"The Right won," she writes, "but the mainstream let it. Comprehensive
sex educators had the upper hand in the 1970s, and starting in the
1980s, they allowed their enemies to seize more and more territory,
until the Right controlled the law, the language, and the cultural
consensus...." Commenting on its failure to defend explicit sexuality
education during an avalanche of new HIV infection among teenagers,
Sharon Thompson [author of the engrossing book on sex and love among
teenage girls, Going All the Way] said, "We will look back at
this time and indict the sex-education community as criminal. It's like
being in a nuclear power plant that has a leak, and not telling anyone."
Throughout the Clinton era those forces largely stood by as the most
sexually reckless President in memory signed a sheaf of repressive
legislation, acts with names like Defense of Marriage, Abstinence Only,
Personal Responsibility and Child Pornography Prevention. The last on
that list, capping a legal trend that, as Levine says, "defined as
pornography pictures in which the subject is neither naked, nor doing
anything sexual, nor...is even an actual child," was recently struck
down by the Supreme Court. The second to last, also known as the welfare
bill, is up for reauthorization this year, along with its enhancements
of penalties for statutory rape and its policing of teen sex, motherhood
and marriage. As part of that bill the Clintonites fanned the notion
that minors were too young to consent to sex with an adult, while in
criminal law they eased the way for prosecuting children as adults and
jailing them as adults, in which circumstance consent usually isn't an
issue. To grasp the effect of liberal silence about Levine, it is
perhaps enough to recall one name: Dr. Joycelyn Elders, sacked by
Clinton as Surgeon General in 1994 for saying that masturbation is part
of childhood and it doesn't hurt to talk about it. Elders has written an
eloquent and sensible foreword to Harmful to Minors. Back when
Elders was twisting in the wind ABC's Cokie Roberts called her "a sort
of off-to-the-left, out-of-the-mainstream, embarrassing person"; now
the Washington Times insinuates she's soft on molestation. From
self-abuse to child abuse in eight years, one absurd charge prepares the
ground for the other.
That said, it's too easy to read the reception of Levine's book as
simply more evidence of right-wing lunacy and liberal retreat. What the
brouhaha also signals in its small way is a failure of the left. In
organizing around issues of sex, love and family, the right has surely
been cynical but at least it speaks to the deepest questions of intimate
life. Its answers are necessarily simplistic and straitened. The family
is falling apart? It's the homos. Marriage seems impossible? It's the
libbers. Sex brings suffering? Just say No. Love seems distant? Await
the Rapture. Except for a small group of queer radicals and pro-sex
feminists, to the extent that such questions are even entertained on the
left, the answers tend toward a mixture of social engineering and
denial: There's nothing wrong with the family that an equitable economy,
divorce or gay marriage won't fix. Marriage is possible; equality is the
key. If sex ed was better and condoms were free, teens wouldn't get
pregnant and wouldn't get AIDS. If abortion is painful, you've been
propagandized. If sex is painful, you're doing it wrong. If love is
painful, find a new lover.
Levine is too sensitive to the mysteries and complexities of human
relations to be characterized as advocating anything so pat as
happiness-through-policy in the area of childhood sexuality. But if her
putting children and sex together in the same sentence can be read by
the right as a call to licentiousness, her heavy emphasis on the
pleasure-enhancing possibilities of sex education may encourage readers
on the left to believe that kids can be protected from bad sex, mediocre
sex, regret, risk, danger, pain. And they can't, any more than adults
can. They can't because in matters of sex, desire is a trickster. What
you see isn't always what you get, much less what you want, though it
may be what you need. In matters of the heart, intimacy means
vulnerability means daring to bet against pain. As with all bets,
sometimes, often, you lose.
Levine actually makes this point but she so wants kids to have better
information, better experiences--and she argues so well and hard for
these--that somehow it gets lost. Citing a study showing that 72 percent
of teenage girls who'd had sex wished they had waited, Levine wonders
whether this regret isn't perhaps really about romantic disappointment
and asks, "Might real pleasure, in a sex-positive atmosphere, balance or
even outweigh regret over the loss of love?" Can we know pleasure
without pain? one might ask in return. Can regret over lost love, at any
age, be so easily balanced? Even sidestepping those twisting lines of
inquiry, isn't the promise of "real pleasure" as much a romantic ideal,
as much an invitation to disappointment, as the promise of true love,
especially for the young? However wished, it's not so easy to
disentangle sex from the hope for love, to revel in pure, transporting
sensuality without letting expectations, not to mention fumbling
technique, get in the way. It doesn't have to, and it doesn't always,
but sex can change everything between two people. We are weak,
after all, and life's little joke is that in that weakness lies the
potential for our ecstasy and our despair.
This isn't to discount the lifesaving value of open education about sex,
condoms, desire, freedom. (And because discussions like this always
force one to state the obvious, I'll also note that nothing in the
foregoing should suggest that I oppose equality, economic
redistribution, abortion rights, child safety, sexual liberation, the
search for love or, so long as heterosexuals insist on having the state
sanction their unions via the marriage contract, divorce and gay
marriage.) But rather than promise kids a world of good sex--like
promising a world of happy marriages, monogamous fulfillment,
self-sustaining nuclear families--maybe it's more helpful to explain sex
as the sea of clear water, giddy currents, riptides, sounding depths and
rocky shoals that it is. You navigate, find wonder in the journey,
scrape yourself up, press on anyway and survive. And sometimes,
sometimes, you experience a bliss beyond expression. The political job
is to expand the possibilities for such experience, to free people to
navigate, help them survive the hurt or not hurt so bad. Maybe if we
could be honest about sex, we could be honest about marriage and
monogamy and family. Maybe if so much didn't hinge on an outsized faith
in pleasure and fidelity and romantic love--if for people in couples or
families, everything didn't depend on the thin reed of love, and for
people alone, coupledom wasn't held out as the apex of happiness--all
the talk we hear about community might actually mean something. The
greatest virtue in Levine's book is its hope that children might learn
to find joy in the realm of the senses, the world of ideas and souls, so
that when sex disappoints and love fails, as they will, a teenager, a
grown-up, still has herself, and a universe of small delights and strong
hearts to fall back on.
A long time ago I dated a 28-year-old man who told me the first time we
went out that he wanted to have seven children. Subsequently, I was
involved for many years with an already middle-aged man who also claimed
to be eager for fatherhood. How many children have these now-gray
gentlemen produced in a lifetime of strenuous heterosexuality? None. But
because they are men, nobody's writing books about how they blew their
lives, missed the brass ring, find life a downward spiral of serial
girlfriends and work that's lost its savor. We understand, when we think
about men, that people often say they want one thing while making
choices that over time show they care more about something else, that
circumstances get in the way of many of our wishes and that for many
"have kids" occupies a place on the to-do list between "learn Italian"
and other perpetually deferred projects.
Change the sexes, though, and the same story gets a different slant.
According to Sylvia Ann Hewlett, today's 50-something women
professionals are in deep mourning because, as the old cartoon had it,
they forgot to have children--until it was too late, and too late was a
whole lot earlier than they thought. In her new book, Creating a
Life: Professional Women and the Quest for Children, Hewlett claims
she set out to record the triumphant, fulfilled lives of women in
mid-career only to find that success had come at the cost of family: Of
"ultra-achieving" women (defined as earning $100,000-plus a year), only
57 percent were married, versus 83 percent of comparable men, and only
51 percent had kids at 40, versus 81 percent among the men. Among
"high-achieving" women (at least $65,000 or $55,000 a year, depending on
age), 33 percent are childless at 40 versus 25 percent of men.
Why don't more professional women have kids? Hewlett's book nods to the
"brutal demands of ambitious careers," which are still structured
according to the life patterns of men with stay-at-home wives, and to
the distaste of many men for equal relationships with women their own
age. I doubt there's a woman over 35 who'd quarrel with that. But what's
gotten Hewlett a cover story in Time ("Babies vs. Careers: Which
Should Come First for Women Who Want Both?") and instant celebrity is
not her modest laundry list of family-friendly proposals--paid leave,
reduced hours, career breaks. It's her advice to young women: Be
"intentional" about children--spend your twenties snagging a husband,
put career on the back burner and have a baby ASAP. Otherwise, you could
end up like world-famous playwright and much-beloved woman-about-town
Wendy Wasserstein, who we are told spent some $130,000 to bear a child
as a single 48-year-old. (You could also end up like, oh I don't know,
me, who married and had a baby nature's way at 37, or like my many
successful-working-women friends who adopted as single, married or
lesbian mothers and who are doing just fine, thank you very much.)
Danielle Crittenden, move over! Hewlett calls herself a feminist, but
Creating a Life belongs on the backlash bookshelf with What
Our Mothers Didn't Tell Us, The Rules, The Surrendered
Wife, The Surrendered Single (!) and all those books warning
women that feminism--too much confidence, too much optimism, too many
choices, too much "pickiness" about men--leads to lonely nights and
empty bassinets. But are working women's chances of domestic bliss
really so bleak? If 49 percent of ultra-achieving women don't have kids,
51 percent do--what about them? Hewlett seems determined to put the
worst possible construction on working women's lives, even citing the
long-discredited 1986 Harvard-Yale study that warned that women's
chances of marrying after 40 were smaller than those of being killed by a
terrorist. As a mother of four who went through high-tech hell to
produce last-minute baby Emma at age 51, she sees women's lives through
the distorting lens of her own obsessive maternalism, in which nothing,
but nothing, can equal looking at the ducks with a toddler, and if you
have one child, you'll be crying at the gym because you don't have two.
For Hewlett, childlessness is always a tragic blunder, even when her
interviewees give more equivocal responses. Thus she quotes academic
Judith Friedlander calling childlessness a "creeping non-choice,"
without hearing the ambivalence expressed in that careful phrasing. Not
choosing--procrastinating, not insisting, not focusing--is often a way
of choosing, isn't it? There's no room in Hewlett's view for modest
regret, moving on or simple acceptance of childlessness, much less
indifference, relief or looking on the bright side--the feelings she
advises women to cultivate with regard to their downsized hopes for
careers or equal marriages. But Hewlett's evidence that today's
childless "high achievers" neglected their true desire is based on a
single statistic, that only 14 percent say they knew in college that
they didn't want kids--as if people don't change their minds after 20.
This is not to deny that many women are caught in a time trap. They
spend their twenties and thirties establishing themselves
professionally, often without the spousal support their male
counterparts enjoy, perhaps instead being supportive themselves, like
the surgeon Hewlett cites approvingly who graces her fiancé's
business dinners after thirty-six-hour hospital shifts. By the time they
can afford to think of kids, they may indeed have trouble conceiving.
But are these problems that "intentionality" can solve? Sure, a woman
can spend her twenties looking for love--and show me one who doesn't!
But will having a baby compensate her for blinkered ambitions and a
marriage made with one eye on the clock? Isn't that what the mothers of
today's 50-somethings did, going to college to get their Mrs. degree and
taking poorly paid jobs below their capacities because they "combined"
well with wifely duties? What makes Hewlett think that disastrous recipe
will work out better this time around?
More equality and support, not lowered expectations, is what women need,
at work and at home. It's going to be a long struggle. If women allow
motherhood to relegate them to secondary status in both places, as
Hewlett advises, we'll never get there. Meanwhile, a world with fewer
female surgeons, playwrights and professors strikes me as an infinitely
inferior place to live.
As state budgets around the country are slashed to accommodate the expense of the war on terror, the pursuit of educational opportunity for all seems ever more elusive. While standardized tests are supposed to be used to diagnose problems and facilitate individual or institutional improvement, too often they have been used to close or penalize precisely the schools that most need help; or, results have been used to track students into separate programs that benefit the few but not the many. The implementation of gifted classes with better student-teacher ratios and more substantial resources often triggers an unhealthy and quite bitter competition for those unnaturally narrowed windows of opportunity. How much better it would be to have more public debate about why the pickings are so slim to begin with. In any event, it is no wonder there is such intense national anxiety just now, a fantastical hunger for children who speak in complete sentences by the age of six months.
A friend compares the tracking of students to the separation of altos from sopranos in a choir. But academic ability and/or intelligence is both spikier and more malleably constructed than such an analogy allows. Tracking students by separating the high notes from the low only works if the endgame is to teach all children the "Hallelujah Chorus." A system that teaches only the sopranos because no parent wants their child to be less than a diva is a system driven by the shortsightedness of narcissism. I think we make a well-rounded society the same way we make the best music: through the harmonic combination of differently pitched, but uniformly well-trained voices.
A parsimony of spirit haunts education policy, exacerbated by fear of the extremes. Under the stress of threatened budget cuts, people worry much more about providing lifeboats for the very top and containment for the "ineducable" rock bottom than they do about properly training the great masses of children, the vibrant, perfectly able middle who are capable of much more than most school systems offer. In addition, discussions of educational equality are skewed by conflation of behavioral problems with IQ, and learning disabilities with retardation. Repeatedly one hears complaints that you can't put a gifted child in a class full of unruly, noisy misfits and expect anyone to benefit. Most often it's a plea from a parent who desperately wants his or her child removed from a large oversubscribed classroom with a single, stressed teacher in an underfunded district and sent to the sanctuary of a nurturing bubble where peace reigns because there are twelve kids in a class with two specialists and everyone's riding the high of great expectations. But all children respond better in ordered, supportive environments; and all other investments being equal, gifted children are just as prone to behavior problems--and to learning disabilities--as any other part of the population. Nor should we confuse exceptional circumstances with behavior problems. The difficulty of engaging a child who's just spent the night in a homeless shelter, for example, is not productively treated as chiefly an issue of IQ.
The narrowing of access has often resulted in peculiar kinds of hairsplitting. When I was growing up, for example, Boston's Latin School was divided into two separate schools: one for boys and one for girls. Although the curriculum was identical and the admissions exam the same, there were some disparities: The girls' school was smaller and so could admit fewer students; and the science and sports facilities were inferior to those of the boys.
There was a successful lawsuit to integrate the two schools about twenty years ago, but then an odd thing happened. Instead of using the old girls' school for the middle school and the larger boys' school for the new upper school, as was originally suggested, the city decided to sever the two. The old boys' school retained the name Boston Latin, and the old girls' school--smaller, less-equipped--was reborn as Boston Latin Academy. The entrance exam is now administered so that those who score highest go to Boston Latin; the next cut down go to what is now, unnecessarily, known as the "less elite" Latin Academy.
One of the more direct consequences of this is that the new Boston Latin inherited an alumni endowment of $15 million, much of it used to provide college scholarships. Latin Academy, on the other hand, inherited the revenue of the old Girls' Latin alumni association--something under $200,000. It seems odd: Students at both schools are tremendously talented, the cutoff between them based on fairly insignificant scoring differences. But rather than pool the resources of the combined facilities--thus maximizing educational opportunity, in particular funding for college--the resolution of the pre-existing gender inequality almost purposefully reinscribed that inequality as one driven by wealth and class.
There are good models of what is possible. The International Baccalaureate curriculum, which is considered "advanced" by most American standards, is administered to a far wider range of students in Europe than here, with the result that their norm is considerably higher than ours in a number of areas. The University of Chicago's School Mathematics Project, originally developed for gifted students at the Chicago Lab School, is now recommended for all children--all children, as the foreword to its textbooks says, can "learn more and do more than was thought to be possible ten or twenty years ago." And educator Marva Collins's widely praised curriculum for inner-city elementary schools includes reading Shakespeare.
Imparting higher levels of content requires nothing exceptional but rather normal, more-or-less stable children, taught in small classes by well-trained, well-mentored teachers who have a sophisticated grasp of mathematics and literature themselves. It will pay us, I think, to stop configuring education as a battle of the geniuses against the uncivilized. We are a wealthy nation chock-full of those normal, more-or-less stable children. The military should not be the only institution that teaches them to be all that they can be.
"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."
Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.
Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.
One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next to last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.
The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he flatly asserts that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind, two chapters earlier. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he indicates the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.
In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.
Moreover, Noble undercuts his own case with hyperbole and a failure to provide evidence for his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.
Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.
Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."
Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."
One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.
Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.
As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and by diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."
Noble does provide one strong cautionary tale, in what is the most informative chapter in the book: the involvement of UCLA's extension division with a commercial company called Onlinelearning.net. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years; in fact, revenues quickly fell below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.
Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.
In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of them available to their students online?
Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?
Noble's answer to most of these questions would undoubtedly be yes, insisting, as he does, that anything less than "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)
The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.
We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."
When a girl becomes her school's designated slut, her friends stop talking to her. Pornographic rumors spread with dazzling efficiency, boys harass her openly in the hallways, girls beat her up. "WHORE," or sometimes "HORE," is written on her locker or bookbag. And there is usually a story about her having sex with the whole football team, a rumor whose plausibility no one ever seems to question.
Even those of us who weren't high school sluts and don't recall any such outcast from our own school days have become familiar with her plight--through media stories and the growing body of feminist-inspired literature on female adolescence, as well as the talk shows and teen magazine spreads that have made her their focus. What's harder to understand is how the label persists when the landscape of sexual morality that gives it meaning has so drastically changed--well within living memory. If the sexual revolution didn't obliterate the slut, wouldn't the successive waves of libidinous pop stars, explicit TV shows and countercultural movements to reclaim the label have drained it of its meaning? What kinds of lines can today's adolescents, or those of the 1990s or 1980s, for that matter, possibly draw between nice and not nice girls?
Emily White's Fast Girls sets out to look at the central dilemmas of the slut label. Two earlier books that have focused on the slut--Leora Tanenbaum's Slut! Growing Up Female With a Bad Reputation, a collection of oral histories, and Naomi Wolf's Promiscuities, a reflection on girls' sexual coming-of-age in the 1970s that combines memoir with a casual survey of the women Wolf grew up with--rely primarily on the subjective narratives of women and girls to explore the slut phenomenon. Paula Kamen's Her Way: Young Women Remake the Sexual Revolution surveys the sexual mores and activities of young women, but not specifically of teenagers. White is the first to combine different methodologies in an attempt to write specifically about the functions and significance of the teenage slut--in her words, "to shed some light on that space in the high school hallway where so many vital and troubling encounters occur."
White spoke to or corresponded with more than 150 women who had been the sluts of their school (whom she found largely by soliciting their stories through newspaper ads), and she spent "a couple of weeks" observing in a Seattle-area public high school. She also offers cultural criticism--of horror movies and the riot grrrls, for instance--as well as a digest of psychological, sociological and post-structuralist theory pertinent to the subject. White's evident ambition makes it all the more frustrating that the book's impressive breadth doesn't translate into thoroughness or rigor.
When White interviewed the women--most of them white, middle-class and from the suburbs--who responded to her ads, the stories she heard had certain similarities. There was a "type" of girl who tended to be singled out: She developed breasts earlier than other girls; she was a loud, vocal extrovert; she was self-destructive, tough or wild; often she had been sexually abused; and in one way or another she was usually an outsider, whether she had moved from a different town, had less money than most kids or belonged to some peripheral subculture. Some women described themselves as having been promiscuous, but more said they were not as sexually active as their (untainted) friends, and none of them had done the things that were later rumored. Often the first rumors were started by bitter ex-boyfriends or jealous friends. Once they caught on, the ritual torments and "football team" fantasies inevitably followed.
These similarities make up what White calls the "slut archetype," and for much of the book she riffs on the common factors of the stories, with chapters dedicated to subjects like the role of suburbia, the slut's social isolation and the preponderance of sexual abuse. Though sprinkled liberally throughout the book, the women's testimonies are only a launching point for White's meditations. She writes about these interviews in a way that at times both romanticizes and condescends to the women. "She walks so confidently in her boots," writes White of one 18-year-old, "causing tremors in the ground beneath her feet. She presents herself as a girl who has crawled up out of the underworld, who has found her way through the isolation and the drugged dreams.... It is a way of coping, this tough act. It's a start." Still, despite certain problems of credibility, this overwrought style is pretty effective at conveying the anguish of the ostracized adolescent girl (if only by echoing her earnest self-dramatization). It's much less suited to considering the girl in social and cultural context.
In editing and interpreting her interviews, White emphasizes their similarities at the expense of the kind of detail that makes a particular social universe come to life. Her time observing the Seattle-area high school students inspires mostly familiar observations. ("The cafeteria is high school's proving ground. It's one of the most unavoidable and important thresholds, the place where you find out if you have friends or if you don't.") Only about half the time do we get any real sense of the sort of community an interviewee grew up in or what the social scene was like at her school. There's even less detail about precisely how she fit into the hierarchy before the slut label took hold, whether she was perceived as threatening or flirtatious, what her past relationships were like with girls, boys and teachers. Even worse is that for all their lack of texture, the women's stories are by far the most interesting part of the book; when White pulls away to supply her own commentary, it's usually vague and predictable--precisely because she's not attuned to the details that would reveal how the slut really functions in the teenage universe. Although she acknowledges that the slut myth is much bigger than any individual girl behind it, she is also attached to the literal-minded notion that the girl being labeled has some kind of privileged relationship to the slut myth--that her individual story is the slut story, and the women's emotional recollections of abuses and scars collectively explain the slut myth. In fact, to understand the myth we need to know at least as much about what the rest of the school is thinking.
White suggests that "the slut becomes a way for the adolescent mind to draw a map. She's the place on the map marked by a danger sign...where a girl should never wander, for fear of becoming an outcast." But, given the arbitrary relationship White found between the slut label and a girl's actual sex life, does the slut myth really have any practical applications for girls? Do they limit sexual activity out of fear of these rumors? Are there particular sex acts that can bring censure in themselves? Can social status insulate some girls from slutdom, regardless of how much they fool around? White doesn't directly pose these questions, but one of her findings hints that, though they may fear the label, kids themselves interpret slutdom as primarily an expression of social status rather than a direct consequence of sexual activity: "Girls who at one time might have been friends with the slut recede as her reputation grows; they need to be careful how they associate with her or they will be thought of as sluts along with her."
The slut doesn't seem to point to an actual line that a nice girl can't cross; she commemorates the fact that there once was such a line, and suggests that the idea of a line still has currency, even if no one can figure out where it is anymore. It's no surprise that she is such a popular subject for third-wave feminists; her ostracism seems to have implications not only for residual sexism but for the way that we personally experience sex and desire.
I didn't think I had a personal connection to the slut story. For most of my adolescent years, which were in the late 1980s and early '90s, I was very good, and too awkward to attract attention from boys. In the schools I attended there were whispers about who did what, and some girls were considered sluttier than others, but there was no single figure who captured the imagination of the whole class.
Then I remembered something about one of the girls I was closest to from age 10 to about 13 or 14. We didn't go to the same school, but for much of the time we both attended Saturday Russian classes held in her kitchen by an elderly neighbor. She was the only one of my friends who was, like me, born in Russia, though her family still lived in Philadelphia's immigrant neighborhood while mine had moved to a more prosperous, non-Russian suburb several years earlier. My family had a bigger house. We had, thanks to my American stepdad, more American ways of doing things. I was a better student. I think she was more popular at her school than I was at mine; at least, she was more easygoing and sociable. I never felt in awe of her, as I did of other friends. I was not always nice to her, though usually I was.
She knew more about sex in our early years than I did, but, like me, she didn't go out with anyone in the time we knew each other. She was pretty, in a round-faced, unfashionable way that made me think I had a discerning eye for appreciating it. She always seemed more developed than I was. (That may not have been true in any measurable sense.) At some point in those years, though it didn't particularly affect our friendship, and I don't remember thinking about it while I was actually with her, I began to spend nights casting her as my proxy in every kind of pornographic fantasy I could conjure.
It's always difficult to figure out the relationship between cultural fantasies and mores, on the one hand, and actual behavior and sexual self-image on the other. You could probably spend a long time listening to teenagers and still not get to the bottom of how the slut myth filters into their own lives. Still, the site of the slut's continuous re-creation, the high school hallways, deserves closer scrutiny, and the mysteries of her endurance await further exploration.