There are those opposed to the use of cloning technology to create human embryos for stem-cell research whose concerns emanate from commitments to social justice. One of their arguments runs as follows: The idea driving this medical research is that by creating an embryo through cloning, we can produce embryonic stem cells that are a perfect genetic match for a patient. All that is required to conduct the cloning is a skin cell from which to extract the patient's DNA and...a human egg.
Where, cry out the social justice advocates, are we going to get all these eggs for all these patients? Do the math, they suggest: 17 million American diabetics, needing anywhere from 10 to 100 eggs each, since the cloning technology is far from efficient...and even if you can pull that off, Christopher Reeve is still not walking, Michael J. Fox and Janet Reno still tremble and Ronald Reagan still doesn't remember who Ronald Reagan was. The social justice folk maintain that the billions of eggs required for embryonic stem cell therapies for the millions of Americans suffering from chronic and degenerative diseases will be obtained through exploitation of poor women in this country and the world over. Surplus value will take on an even more nefarious meaning.
Still, the early results from embryonic stem-cell therapy in mice are so dramatic that to forgo this medical research would seem morally obscene and just plain stupid. At the University of California, Dr. Hans Keirstead was able to implant neurological tissue derived from embryonic stem cells in a mouse with partial spinal cord injury so that after eight weeks, the mouse had regained most of its ability to walk and, of major significance to the quarter-million Americans suffering from this tragic condition, had also regained bladder and bowel control. Yet the question remains: Where are we going to get all those eggs?
A call to Stanford University's Paul Berg, a Nobel laureate who has been testifying to Congress on behalf of embryonic stem-cell research, helps elucidate the answer: When it comes to the research, he says, the quantity required may not be a problem. But if the desired therapeutic potential of embryonic stem cells is fully realized, the need for eggs will be great and could short-circuit the availability of these therapies. But a solution to that may be possible, Berg insists. If research is carried out that identifies the biochemicals in the egg directing the genetic material to develop into an embryo, then we could extract and fractionate those biochemicals and insert them into any skin cell, for example, for use in the cloning process. Voilà! A skin cell becomes an egg, and skin cells are plentiful.
The immediate enthusiasm for this breakthrough scientific idea, which could help Reeve walk again while simultaneously obviating the motive for an exploitative human egg market, is quickly tempered by the full realization of what Berg has explained: When we acquire the ability to use any cell as an egg, we will have removed another obstacle to achieving complete control over human reproduction. Admittedly, complete control over the production of reproduction will require a womb for gestation--but that ultimately should prove to be just another biochemical matter for extraction and fractionation.
This, then, is how it goes in biotechnology, the essential dynamic that simultaneously gives rise to medical hope and moral vertigo. Each step forward produces a new problem, the solution to which demands further control over the biological mechanism known as a human being. But that control inevitably impinges on human beings, or on some portion of ourselves that we value. To deal with the attendant moral quandaries, a method is found to isolate and duplicate the underlying molecular process. The moral quandary has thus been replaced by an extracorporeal biochemical process, no longer strictly identified as human, and therefore a process that no one can reasonably value apart from its use. The problem, as bioethicist Eric Juengst puts it, is that we could thereby successfully cope with every moral dilemma posed by biotechnology and still end up with a society none of us would wish to live in. For Francis Fukuyama, this is Our Posthuman Future, as he has titled his new book on the subject.
Fukuyama's most famous previous theoretical foray was to declare, in 1989, an end to history, whereby a capitalist liberal democratic structure represented the final and most satisfying endpoint for the human species, permitting the widest expression of its creative energies while best controlling its destructive tendencies. He imagined that ultimately, with the universal acceptance of this regime, the relativist impasse of modern thought would in a sense resolve itself.
But thirteen years after the end of history, Fukuyama has second thoughts. He's discovered that there is no end of history as long as there is no end of science and technology. With the rapidly developing ability of the biological sciences to identify and then alter the genetic structure of organisms, including humans, he fears the essence of the species is up for grabs. Since capitalist liberal democratic structures serve the needs of human nature as it has evolved, interference by the bio-engineers with this human nature threatens to bring the end of history to an end.
The aim of Our Posthuman Future is "to argue that [Aldous] Huxley was right," Fukuyama announces early on, referring to Huxley's 1932 vision of a Brave New World. Multiple meanings are intended by Fukuyama: The industrialization of all phases of reproduction. The genetic engineering of the individuals produced by that process, thereby predetermining their lives. The tyrannical control of this population through neurochemical intervention, making subservience experientially pleasurable. Fukuyama cites specific contemporary or projected parallels to Huxley's Hatchery and Conditioning Center, Social Predestination Room and soma. In Fukuyama's terms, the stakes in these developments are nothing less than human nature itself.
The first of the book's three parts lays out the case that the biotechnologically driven shift to a posthuman era is already discernible and describes some of the potential consequences. Prozac and Ritalin are precursors to the genomically smart psychotropic weapons of the near future. Through these drugs, which energize depressed girls and calm hyperactive boys, we are being "gently nudged toward that androgynous median personality, self-satisfied and socially compliant, that is the current politically correct outcome in American society." Standardization of the personality is under way. This is the area to watch, Fukuyama asserts, because virtually everything that the popular imagination envisions genetic engineering accomplishing is much more likely to be accomplished sooner through neuropharmacology.
Increased life spans and genetic engineering also offer mostly dystopic horizons, whereby gerontocracies take power over societies whose main purpose has become the precision breeding of their progeny. The ancient instincts for hierarchical status and dominance are still the most powerful forces shaping this new world born from biotechnology. Since, as Fukuyama sees it, science does not necessarily lead to the equality of respect for all human beings demanded by liberal egalitarianism, the newest discoveries will serve the oldest drives. We are launched on a genetic arms race.
But be warned: We may not arrive in that new world through some dramatic struggle in which we put up a fight. Rather, the losses to our humanity may occur so subtly that we might "emerge on the other side of a great divide between human and posthuman history and not even see that the watershed had been breached because we lost sight of what that [human] essence was."
If this terrible event is to be prevented, then the human essence, which Fukuyama correlates with human nature itself, must be identified and kept inviolable. But where is that line to be drawn around "human nature," a line to which we can all adhere so that we might reap the benefits of biotechnology while preventing the nightmare scenarios from ever coming to pass?
The entire world today wants the answer to this. Fukuyama promises to deliver it. But despite the clarity with which he announces his mission, the author advises his readers, "Those not inclined to more theoretical discussions of politics may choose to skip over some of the chapters here." Yet these are the very chapters containing the answer we all seek in order to tame the biotechnology beast! This, then, signals that we are entering dangerous ground, and we will need to bear with the author's own means of revealing his great discovery, which may be skipped over at our own peril.
In this heart of the book, titled "Being Human," Fukuyama first seeks to restore human nature as the source of our rights, our morality and our dignity. In particular, he wishes to rescue all these dimensions from the positivist and utilitarian liberal philosophers who, closely allied with the scientific community, have dominated the debate over biotechnology. According to the author, these philosophers assign rights everywhere and emphasize the individual as the source of moral concern. In doing so, they put humankind and its collective life at risk before the juggernaut of biotechnology. John Rawls and Ronald Dworkin, among others, have elevated individual autonomy over inherently meaningful life plans, claims Fukuyama, who then questions whether moral freedom as it is currently understood is such a good thing for most people, let alone the single most important human good.
Rather than our individual autonomy or moral freedom, Fukuyama wishes that we would attend to the logic of human history, which is ultimately driven by the priorities that exist among natural human desires, propensities and behaviors. Since he wishes us to shift ground to the logic of the inherent and the natural, he must finally define that core composing human nature:
The definition of the term human nature I will use here is the following: human nature is the sum of the behavior and characteristics that are typical of the human species, arising from genetic rather than environmental factors.
Later he will refine this further to the innate, species-typical forms of cognition and the species-typical emotional responses to that cognition. What he is really after is not just that which is typical of our species but that which is unique to human beings. Only then will we know what needs the greatest safeguarding. After hanging fire while reviewing the candidates for this irreducible, unique core to be defended, including consciousness and the most important quality of a human being, feelings, Fukuyama finally spills the beans:
What is it that we want to protect from any future advances in biotechnology? The answer is, we want to protect the full range of our complex, evolved natures against attempts at self-modification. We do not want to disrupt either the unity or the continuity of human nature, and thereby the human rights that are based on it.
So, where are we? It would seem we have gone full circle. Human nature is defined by...human nature! To the extent that it is capable of being located in our material bodies, it is all that arises from our genetics. Any attempt at greater precision is a violation of our unity or continuity--and threatens to expose the author's empty hand. Through such sophistry, Fukuyama wishes to assert mastery over any biotechnological innovation that he considers threatening, since he can now arbitrarily choose when it is disruptive of the unity or continuity of the human nature arising from our genetics. Even a heritable cancer could qualify for protection under Fukuyama's rubric for that which is to be defended from biotechnological intervention.
Indeed, there are those agreeing with Fukuyama's view of the biological bases of human social life who draw opposite conclusions about human bioengineering, viewing it as humanity's last best hope.
The remainder of the book is a potpourri of tactical suggestions (embedded in rhetoric cloned from Fukuyama's mentor in these matters, bioethicist Leon Kass) about which biotechnologies should be controlled, and about the national and international bodies and systems needed if such control is to be effective. That, in the end, may be the most surprising aspect of the book: all this fervid philosophizing in reaction to fears about a Brave New World, fervently working toward the radical conclusion that what is needed is...regulation. Then again, for someone who has previously placed an overabundance of faith in the market, the recognition that regulation is needed might well be experienced as a radical trauma.
But one would be foolish to believe that Fukuyama has gone all this distance simply to argue for what he refers to at one point as a more nuanced regulatory approach. In his most public engagement with biotechnology thus far, he has endorsed, written and testified to Congress on behalf of a bill that would ban not only human reproductive cloning but also nonreproductive cloning for stem-cell research. The legislation he supports would also make any doctor who utilizes or prescribes a treatment developed with cloning technology subject to ten years in prison and a $1 million fine. Under this legislation, then, if a cure or treatment for diabetes or heart failure were created in England using embryo cloning to harvest stem cells for therapy, US physicians would not be allowed to offer such treatments to their patients. This is his lesson in how moral freedom is not such a good thing compared with an inherently meaningful life plan. Let the fragile diabetic or spinal cord-injury victim learn the true value of our human nature from their catheterized bladders!
Fukuyama's entire brief depends upon avoiding the consequences of his own logic. Having identified the human essence with our biological human nature, he must evade any further specification or else the particular tissues, cells or molecules would be subject to further discussion and analysis as to whether or not they represent the human essence. Rather than discussion, we should trade in our autonomy and moral freedom for his protections. By the close of the book, any moral qualms on his part fall entirely by the wayside. Fukuyama is perhaps aware that he has failed to make his case except to those ready to believe. The book culminates in a final paragraph that is nothing less than a temper tantrum:
We do not have to accept any of these future worlds under a false banner of liberty, be it that of unlimited reproductive rights or of unfettered scientific inquiry. We do not have to regard ourselves as slaves to inevitable technological progress when that progress does not serve human ends. True freedom means the freedom of political communities to protect the values they hold most dear...
Nice rhetoric until we recall the values of the types of political regimes to which moral freedom and science must be sacrificed. While Fukuyama rails against the Brave New World, he takes the side of Huxley's World Controller, who explains, "Truth's a menace, science is a public danger...That's why we so carefully limit the scope of its researches."
There is an alternative to the fear that human nature must be inviolable because human nature cannot be trusted. We have seen imperious dictates against science and moral freedom delivered by philosophers before. In the recent past, we have evidence of very similar ideas in very similar language issuing from the philosopher whom Fukuyama draws upon for the epigraph beginning the first chapter of his book, Martin Heidegger. In the 1930s Professor Heidegger wanted science to serve the German essence, and it did. Now Professor Fukuyama wants science, and all of us, to serve the human essence, which he equates with his version of sociobiology infused with German romantic holism. Once more, we witness someone who would stop tyranny by imposing a tyranny of his own. Since Francis Fukuyama now sits on the President's Council on Bioethics, we should be grateful for the warning.
"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."
Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.
Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.
One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next to last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.
The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he flatly asserts that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have in mind his own assertions of two chapters earlier. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he registers the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.
In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.
Moreover, Noble undercuts his own case with hyperbole and a failure to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates, who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.
Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.
Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."
Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."
One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.
Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate, and it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.
As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and by diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."
Noble does provide a strong cautionary tale, that of the involvement of UCLA's extension division with a commercial company called Onlinelearning.net, in what is the most informative chapter in the book. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years, a projection that quickly plummeted to below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.
Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.
In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of them available online?
Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?
Noble's answer to most of these questions would undoubtedly be yes, insisting, as he does, that anything less than the "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)
The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.
We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."
When a girl becomes her school's designated slut, her friends stop talking to her. Pornographic rumors spread with dazzling efficiency, boys harass her openly in the hallways, girls beat her up. "WHORE," or sometimes "HORE," is written on her locker or bookbag. And there is usually a story about her having sex with the whole football team, a rumor whose plausibility no one ever seems to question.
Even those of us who weren't high school sluts and don't recall any such outcast from our own school days have become familiar with her plight--through media stories and the growing body of feminist-inspired literature on female adolescence, as well as the talk shows and teen magazine spreads that have made her their focus. What's harder to understand is how the label persists when the landscape of sexual morality that gives it meaning has so drastically changed--well within living memory. If the sexual revolution didn't obliterate the slut, wouldn't the successive waves of libidinous pop stars, explicit TV shows and countercultural movements to reclaim the label have drained it of its meaning? What kinds of lines can today's adolescents, or those of the 1990s or 1980s, for that matter, possibly draw between nice and not nice girls?
Emily White's Fast Girls sets out to look at the central dilemmas of the slut label. Two earlier books that have focused on the slut--Leora Tanenbaum's Slut! Growing Up Female With a Bad Reputation, a collection of oral histories, and Naomi Wolf's Promiscuities, a reflection on girls' sexual coming-of-age in the 1970s that combines memoir with a casual survey of the women Wolf grew up with--rely primarily on the subjective narratives of women and girls to explore the slut phenomenon. Paula Kamen's Her Way: Young Women Remake the Sexual Revolution surveys the sexual mores and activities of young women, but not specifically of teenagers. White is the first to combine different methodologies in an attempt to write specifically about the functions and significance of the teenage slut--in her words, "to shed some light on that space in the high school hallway where so many vital and troubling encounters occur."
White spoke to or corresponded with more than 150 women who had been the sluts of their school (whom she found largely by soliciting their stories through newspaper ads), and she spent "a couple of weeks" observing in a Seattle-area public high school. She also offers cultural criticism--of horror movies and the riot grrrls, for instance--as well as a digest of psychological, sociological and post-structuralist theory pertinent to the subject. White's evident ambition makes it all the more frustrating that the book's impressive breadth doesn't translate into thoroughness or rigor.
When White interviewed the women--most of them white, middle-class and from the suburbs--who responded to her ads, the stories she heard had certain similarities. There was a "type" of girl who tended to be singled out: She developed breasts earlier than other girls; she was a loud, vocal extrovert; she was self-destructive, tough or wild; often she had been sexually abused; and in one way or another she was usually an outsider, whether she had moved from a different town, had less money than most kids or belonged to some peripheral subculture. Some women described themselves as having been promiscuous, but more said they were not as sexually active as their (untainted) friends, and none of them had done the things that were later rumored. Often the first rumors were started by bitter ex-boyfriends or jealous friends. Once they caught on, the ritual torments and "football team" fantasies inevitably followed.
These similarities make up what White calls the "slut archetype," and for much of the book she riffs on the common factors of the stories, with chapters dedicated to subjects like the role of suburbia, the slut's social isolation and the preponderance of sexual abuse. Though sprinkled liberally throughout the book, the women's testimonies are only a launching point for White's meditations. She writes about these interviews in a way that at times both romanticizes and condescends to the women. "She walks so confidently in her boots," writes White of one 18-year-old, "causing tremors in the ground beneath her feet. She presents herself as a girl who has crawled up out of the underworld, who has found her way through the isolation and the drugged dreams.... It is a way of coping, this tough act. It's a start." Still, despite certain problems of credibility, this overwrought style is pretty effective at conveying the anguish of the ostracized adolescent girl (if only by echoing her earnest self-dramatization). It's much less suited to considering the girl in social and cultural context.
In editing and interpreting her interviews, White emphasizes their similarities at the expense of the kind of detail that makes a particular social universe come to life. Her time observing the Seattle-area high school students inspires mostly familiar observations. ("The cafeteria is high school's proving ground. It's one of the most unavoidable and important thresholds, the place where you find out if you have friends or if you don't.") Only about half the time do we get any real sense of the sort of community an interviewee grew up in or what the social scene was like at her school. There's even less detail about precisely how she fit into the hierarchy before the slut label took hold, whether she was perceived as threatening or flirtatious, what her past relationships were like with girls, boys and teachers. Even worse is that for all their lack of texture, the women's stories are by far the most interesting part of the book; when White pulls away to supply her own commentary, it's usually vague and predictable--precisely because she's not attuned to the details that would reveal how the slut really functions in the teenage universe. Although she acknowledges that the slut myth is much bigger than any individual girl behind it, she is also attached to the literal-minded notion that the girl being labeled has some kind of privileged relationship to the slut myth--that her individual story is the slut story, and the women's emotional recollections of abuses and scars collectively explain the slut myth. In fact, to understand the myth we need to know at least as much about what the rest of the school is thinking.
White suggests that "the slut becomes a way for the adolescent mind to draw a map. She's the place on the map marked by a danger sign...where a girl should never wander, for fear of becoming an outcast." But, given the arbitrary relationship White found between the slut label and a girl's actual sex life, does the slut myth really have any practical applications for girls? Do they limit sexual activity out of fear of these rumors? Are there particular sex acts that can bring censure in themselves? Can social status insulate some girls from slutdom, regardless of how much they fool around? White doesn't directly pose these questions, but one of her findings hints that, though they may fear the label, kids themselves interpret slutdom as primarily an expression of social status rather than a direct consequence of sexual activity: "Girls who at one time might have been friends with the slut recede as her reputation grows; they need to be careful how they associate with her or they will be thought of as sluts along with her."
The slut doesn't seem to point to an actual line that a nice girl can't cross; she commemorates the fact that there once was such a line, and suggests that the idea of a line still has currency, even if no one can figure out where it is anymore. It's no surprise that she is such a popular subject for third-wave feminists; her ostracism seems to have implications not only for residual sexism but for the way that we personally experience sex and desire.
I didn't think I had a personal connection to the slut story. For most of my adolescent years, which were in the late 1980s and early '90s, I was very good, and too awkward to attract attention from boys. In the schools I attended there were whispers about who did what, and some girls were considered sluttier than others, but there was no single figure who captured the imagination of the whole class.
Then I remembered something about one of the girls I was closest to from age 10 to about 13 or 14. We didn't go to the same school, but for much of the time we both attended Saturday Russian classes held in her kitchen by an elderly neighbor. She was the only one of my friends who was, like me, born in Russia, though her family still lived in Philadelphia's immigrant neighborhood while mine had moved to a more prosperous, non-Russian suburb several years earlier. My family had a bigger house. We had, thanks to my American stepdad, more American ways of doing things. I was a better student. I think she was more popular at her school than I was at mine; at least, she was more easygoing and sociable. I never felt in awe of her, as I did of other friends. I was not always nice to her, though usually I was.
She knew more about sex in our early years than I did, but, like me, she didn't go out with anyone in the time we knew each other. She was pretty, in a round-faced, unfashionable way that made me think I had a discerning eye for appreciating it. She always seemed more developed than I was. (That may not have been true in any measurable sense.) At some point in those years, though it didn't particularly affect our friendship, and I don't remember thinking about it while I was actually with her, I began to spend nights casting her as my proxy in every kind of pornographic fantasy I could conjure.
It's always difficult to figure out the relationship between cultural fantasies and mores, on the one hand, and actual behavior and sexual self-image on the other. You could probably spend a long time listening to teenagers and still not get to the bottom of how the slut myth filters into their own lives. Still, the site of the slut's continuous re-creation, the high school hallways, deserves closer scrutiny, and the mysteries of her endurance await further exploration.
Like it or not, America has been able to achieve and maintain its supremacy as a global power because of its capacity to absorb the best from the rest of the world. This dependency on foreign imports is especially clear in the realm of science and technology. Roughly one-third of US Nobel laureates were born outside the United States and became naturalized citizens. The father of the American nuclear program was a foreigner. But most foreign-born scientists toil away unrecognized in our nation's research labs, universities and private firms, forming the backbone of American high technology. In computer software development, now widely considered the most important area of American advantage, foreign nationals are commonly recognized as being among the best programmers. Almost a third of all scientists and engineers in Silicon Valley are of Chinese or Indian descent.
America cannot afford to lose the loyalty of these high-tech coolies it has come to depend on, yet that's exactly where it seems to be heading with recent cases of immigrant-bashing and racial and ethnic profiling by opportunistic politicians seeking short-term political gains. In the aftermath of the September 11 terrorist attacks, the animosity aimed at the enemies of the United States has also been extended to immigrants and American citizens who originally came from the same part of the world. Hundreds of Arab-Americans and Asians from the Indian subcontinent have been detained as suspects, without charges filed against them, under "special administrative measures" in the name of national security. The majority of Americans, the interpreters of polls tell us, approve. It was in the name of the same national security that a Chinese-American physicist, Wen Ho Lee, was accused some three years earlier of stealing the "crown jewels" of the US nuclear program and giving them to mainland China; similarly enacted special measures threw him in chains and into solitary confinement, although the government had no evidence against him. His public lynching, which was caused by and fed into America's national angst concerning enemy number one of that time--China--is the subject of the two books under review. As a perfect example of a national security investigation botched by racial and ethnic profiling, which led to a shameful failure of all the institutions involved, it could not have been exposed at a better time.
China emerged as America's prime antagonist after the end of the cold war. During the cold war, it was always easy to tell who was America's enemy and who was a friend. Then, with the normalization of Chinese-US diplomatic relations in the late 1970s, those lines began to blur. For a time at least, the People's Republic of China (PRC) was no longer a foe. Individuals and institutions from all walks of life were happily embracing the idea of scientific and cultural exchange, and even nuclear scientists went back and forth. It was understood that the common enemy was the USSR. This cozy relationship ended with the fall of the Soviet Union, when US policy-makers, without clearly defined targets, began to show signs of what Henry Kissinger calls "nostalgia for confrontation" and cast about for a manichean opponent. With its rapidly expanding economy in the 1990s, which brought it into some conflict with American interests in Asia, China became the most logical choice.
The targeting of Chinese-Americans and the questioning of their loyalties did not begin in earnest until after the 1996 general election, when Republicans accused members of the Chinese-American community of passing campaign donations from government officials of the PRC to Bill Clinton's re-election campaign. It was said to be a clandestine plan by China to influence US policy; the charge was not substantiated, but Asian-American contributors to the Democratic Party were investigated by the FBI for possible involvement in traitorous activities, and suspicions of disloyalty among Chinese-Americans lingered.
The investigation of Wen Ho Lee, who was then a research scientist at the Los Alamos National Laboratories in New Mexico, started soon after the campaign scandal. It was initiated by an intelligence report that in 1992 China had tested a bomb very much like the Los Alamos-designed W-88, considered one of the smallest and most highly optimized nuclear weapons in the world. Carried on Trident II submarine-launched missiles, the W-88 can hit multiple targets with great accuracy. When a Chinese defector to Taiwan brought documents with diagrams and text descriptions of a long list of US strategic weapons, including the W-88, US counterintelligence circles cried espionage and began an investigation.
Dan Stober and Ian Hoffman, who covered the story for the San Jose Mercury-News and the Albuquerque Journal, teamed up to write A Convenient Spy: Wen Ho Lee and the Politics of Nuclear Espionage, in which they reveal the scandalous details of the misguided search for the Chinese-American spy. Written like a crime novel, their book is at its best as an exposé of the behind-the-scenes workings of Washington politics, in which the truth is all too easily sacrificed for political expediency. The authors blame everyone involved, from the incompetent employees of the FBI and the ambitious bureaucrats of the Department of Energy (DOE) to the zealous anti-China hawks in Congress and a colluding press corps all too willing to swallow government-distributed information without corroboration.
The government spent four years and millions of dollars trying to pin espionage charges on Wen Ho Lee, only to find him innocent of spying in the end. Many American weapons designers who were familiar with the Chinese nuclear program saw no reason that Chinese scientists could not invent in the 1990s the miniaturized warheads US scientists had developed in the 1950s. Others pointed out that most of the details on US missiles were available on a website maintained by the Federation of American Scientists. China could have easily made its own bombs by processing the mounds of information gathered from newspapers, magazines and scientific literature that Chinese students and scientists, over more than a decade of scholarly and business exchanges, had obtained legally--a method US counterintelligence circles refer to as gathering grains of sand. Yet the director of counterintelligence at the DOE, Notra Trulock, refused to believe that the Chinese were capable of developing the most modern weapon in the US arsenal on their own. "There's one spy out there and we're going to find him," he reportedly told an assistant.
The spy, if there was one, could have been any of the scientists from a half-dozen national nuclear-weapons-design labs, or an employee of one of the many plants that manufacture the parts, as they all had blueprints. Yet Trulock's order for an administrative inquiry stipulated that the initial consideration would be to identify those US citizens of Chinese heritage who worked directly or peripherally with the design development. This was a logical starting point, the attached memo went on to explain, based upon the intelligence community's evaluation that the PRC targets and utilizes ethnic Chinese for espionage rather than persons of non-Chinese origin. Following this perilous logic, the investigation took on the shape of a funnel: The list of suspects swiftly shrank from the employees of Los Alamos and Lawrence Livermore research labs who had traveled to China to the scientists of Chinese heritage who had worked directly or peripherally on the W-88 design development and had had contacts with Chinese scientists. From there, it was a quick jump to Wen Ho Lee as the only person who had the opportunity, motivation and legitimate access to the specific nuclear weapons information believed to have been leaked to the Chinese.
The choice of Wen Ho Lee as the spy was far from logical. He was a native of Taiwan who had openly expressed his sympathy for Taiwanese independence, and had in fact admitted to providing unclassified scientific documents to the Chung Shan Institute of Science and Technology--Taiwan's military research center involved in developing nuclear weapons. Also, he had been trapped into cooperating with the FBI many years earlier in an investigation of another Chinese-American scientist, while his wife was recruited to act as an unpaid informant on the activities of visiting Chinese scientists.
This may explain why no one at the FBI or any other government agency initially believed Trulock's accusations against Lee. Trulock's first request for a wiretapping order from the Justice Department was turned down. But he doggedly took his spy story to the CIA, the White House and the Defense Department until he finally found a sympathetic ear among Republicans in Congress. Representative Christopher Cox of California was heading the House Select Committee on US National Security and Military/Commercial Concerns, which was investigating the Clinton Administration for jeopardizing national security by being soft on China in exchange for campaign contributions. Cox immediately saw the potential of using an indictment against Wen Ho Lee to help the charges against Clinton stick. Trulock's unverified assertions became bombshells in Cox's committee report. On one occasion a zealous committee member even confused the scientist Wen Ho Lee with Bill Lann Lee, who was at the time waiting to be confirmed as Assistant Attorney General for Civil Rights.
But the real damage was done when someone leaked the spy story to the ever-hungry-for-a-Clinton-scandal press. A Pulitzer Prize-winning reporter for the New York Times passed the information along without corroboration, and soon Congress and the media were "locked in a game of one-upmanship," describing Lee's crime in ever more superlative-laden rhetoric, according to Stober and Hoffman. In no time, expressions of fear and hatred of the Chinese inundated the Internet, TV and radio talk shows. As the storm gathered, Clinton's appointees, instead of standing up against wrongful accusations, buckled. The new Energy Secretary, Bill Richardson, weighing the risk to his chances of becoming presidential candidate Al Gore's running mate, ordered that Wen Ho Lee be summarily dealt with.
The FBI at first tried to scare Wen Ho Lee into confessing that he had passed nuclear secrets to China. The Rosenbergs professed their innocence, he was told, and the Rosenbergs are dead. When that did not work, he was put in jail, although the government still had no evidence to convict him as a spy. Five years of relentless hounding by its agents--at times more than 100 FBI personnel were working on his case--had produced nothing. The only wrongdoing he could be charged with, discovered by accident during a search of his office, was his downloading of several weapons codes from the lab's secure computer system onto the unsecured one. Similar security infractions were often ignored at the lab, rarely resulting in disciplinary measures. (In an error of potentially much graver consequences for national security, former Director of Central Intelligence John Deutch had downloaded top-secret files onto his unsecured home computer, which a family member had been using to surf pornography websites. Deutch was disciplined but he did not lose his job, much less end up incarcerated.)
Lee was prosecuted under the cold-war-era Atomic Energy Act, which allowed for the harshest treatment: He was put in manacles and shackles that were chained to his waist, and was locked up in solitary confinement. When members of his immediate family were permitted to visit him for one hour each month, they were not allowed to speak in Chinese--the language they spoke at home. Lights in his cell were on twenty-four hours a day, with a guard on constant watch. Such conditions are rarely experienced by even the most vicious convicted criminals.
Much to Wen Ho Lee's credit, he did not crack. The US district court judge in New Mexico who was put in charge of the prosecution was so incensed by the government's handling of the case that he said to Lee: "I believe you were terribly wronged.... [Government officials] have embarrassed our entire nation.... I sincerely apologize to you."
This unusual gesture, with which Wen Ho Lee opens his account of the ordeal in My Country Versus Me: The First-Hand Account by the Los Alamos Scientist Who Was Falsely Accused of Being a Spy, is by the book's end almost certain to draw applause from readers as an enlightened conclusion to a grave miscarriage of justice by the government. But the negative consequences of the incident have yet to be fully tallied.
More than 150,000 Chinese-American engineers and scientists work in US industry, government and academia today; roughly 15,000 are employed by the defense sector alone. Because of the way in which the government handled Wen Ho Lee's case, many found their loyalty being severely questioned by their bosses and colleagues. They were frequently subject to innuendo and distressing jokes. There were numerous reports of security clearances withdrawn and promotions denied, of people forced into early retirement. A survey conducted by the Committee of 100 and the Anti-Defamation League soon after Wen Ho Lee's release from prison found that 68 percent of Americans feel negatively toward Chinese-Americans; 32 percent believe that Chinese-Americans are more loyal to China than to the United States; and 46 percent believe that Chinese-Americans passing secrets to China is a problem.
Even Stober and Hoffman, who make every effort to show the lack of credible evidence proving that Lee was a spy, maintain that his own unexplained actions fed into the political furor that made him all too convenient a target. For instance, Lee lied to the FBI, to his family and to his lawyers about why he had copied voluminous amounts of non-work-related computer codes used to design nuclear weapons and put them on portable tapes that have never been completely recovered.
In his own book, Lee explains the copying as a precautionary measure against losing his files--as had happened to him when the lab switched from one computer system to another. He defends the volume of downloads as necessary to test his portion of the codes "against the snapshot of the whole code at a certain time," because as the weapons designers change their calculations, his codes are affected as well. To Lee's scientific mind, the measure was prudent and logical. John Richter, a Los Alamos physicist known as "the guru of gurus" on the subject of plutonium explosives, testified in court in Lee's defense. He described Lee's actions with an old saying: Never attribute to malice what can be adequately explained by stupidity.
Whatever the case, Lee comes across as impossibly naïve as he recounts the events of late 1998, considering that he was in the very eye of the storm raised by the Cox investigation. He continued to cooperate with investigators by submitting to polygraph tests and repeated FBI questioning, without the presence of a lawyer. When his daughter told him that a New York Times article headlined "China Stole Nuclear Secrets from Los Alamos, U.S. Officials Say," published March 6, 1999, was about him, he didn't believe it. He didn't read newspapers, didn't vote and professed not to care about politics. Yet his book is politically sophisticated. It shows the unmistakable imprint of his co-author, Helen Zia, an experienced freelance journalist and a seasoned and respected Asian-American activist, who understood the significance of Wen Ho Lee's case in the context of American ethnic and civil rights politics.
In contrast to the position taken by Stober and Hoffman, who credit Lee's lawyers as being the only morally noncorrupt heroes of this story, Zia recognized that the legal case gained moral weight and credibility through the support of brave people who were willing to risk their careers to speak out in Lee's favor. The American Physical Society and the American Association for the Advancement of Science issued statements condemning the government's harsh treatment of Lee. A number of eminent scientists, among them several of Lee's colleagues, individually took the stand. Richter, the guru, provided crucial testimony debunking the government's nonsense that Lee had stolen the nation's "crown jewels," thus altering the balance of power in the world.
The Chinese-American community, still licking the wounds inflicted by Clinton's campaign fundraising scandal, was initially cautious in dealing with the sensitive issue surrounding nuclear secrets. But it picked up Lee's cause as soon as the government went public with its outrageous actions. Foreign-born Chinese-American scientists and engineers, who for years had sweated away quietly in research labs and universities, unrecognized, unappreciated and underpaid, but who were suddenly all suspect, turned their anger into building the Wen Ho Lee Defense Fund, which raised hundreds of thousands of dollars for his legal bills. Supporters established websites and organized rallies and teach-ins around the country, demanding that members of Congress stop the persecution of Lee. When Professor Ling-chi Wang, director of Asian-American studies at the University of California, Berkeley, called for a collective boycott of DOE-overseen national labs by all Asian-American scientists and engineers, the labs took notice. (An agreement with the labs on new procedures appeared imminent at press time.)
Job applications by foreign graduate students, from among whom most research labs and engineering firms recruit their future staff, are down. The National Science Board estimates that 30-50 percent of those who hold science or engineering doctorates in the United States are foreign-born (the number is the highest in math: 57 percent). About 7 percent of all physicists and 15 percent of all engineers in the United States are Asian-American. If Asian-American and other foreign-born scientists are discouraged from entering the US work force, notes Eamon Kelly, chairman of the National Science Board, the country could have a hard time filling the gap.
Yet, spurred by the September 11 attacks, Senator Dianne Feinstein has called for a moratorium on admissions of foreign students to US educational institutions. American national interests can ill afford this type of mindless antiforeign hysteria. American high school students rank near the bottom in math and science, according to studies on schooling worldwide. The country's best and brightest students often opt for careers as lawyers, doctors and financial professionals, where they can command much higher salaries than in the pure science fields. Wen Ho Lee, for instance, despite holding a PhD from an American university and twenty years of experience at the Los Alamos labs, made only $80,000 a year--an absurdly meager remuneration for a man accused of changing the balance of power in the world.
If there is a lesson in all this, it is that the pre-eminent position of the United States in the world--"our scientific capabilities and national security," in the words of the president of the American Physical Society, James Langer--was in fact compromised by the government's action in the case of Wen Ho Lee and the resulting alienation of the most qualified foreign-born scientists necessary to maintain that pre-eminence. Unfortunately, the lesson is also, as Wen Ho Lee found out, that an immigrant dream--coming to America, working hard, getting an education, taking care of one's family and minding one's own business--can easily be shattered by politics. Only by becoming politically engaged and organized can immigrants gain the respect of the rest of the American people and stop being singled out as easy victims.
My sister-in-law, a historian and researcher in alternative medicine, once told me of a doctoral dissertation she'd happened across in which the writer interviewed a number of committed liberals and conservatives for the purpose of drawing conclusions about their governing emotional equipment. Liberals, the student found, feel most at home with guilt. Conservatives, as you might expect, don't have much truck with that; instead, they do anger.
It may be hard to call these findings shocking ones, and I do not know whether the candidate's advisers concluded that he or she had sufficiently advanced the literature so as to earn a doctorate. But I can say from personal experience that the liberalism-guilt correlation rings true, and, after reading David Brock's Blinded by the Right, I can certify on the strength of Brock's eyewitness--and often eye-popping--account that conservatives really do anger. Anger as trope; anger as strategy; anger as immutable biological condition; and anger just because it's fun. Yes, we knew this. But we didn't know it the way Brock knows it. Let me put it this way. Throughout the Clinton era, I read every major newspaper and all the magazines and a lot of the websites and most of the pertinent books; I didn't think there was much more for me to learn. But once Blinded by the Right kicks into gear, there is a fact, anecdote or reminiscence about the right's feral hatred of the Clintons every ten pages or so that is absolutely mind-boggling. And, as often as not, these stories are also about the rancid hypocrisy (usually sexual) that underlay, or probably even helped cause, the hatred. In sum: You cannot fully understand this fevered era without reading this book.
The question you may fairly ask is the one some people are already asking: Given the source--Brock was the capital's most famous conservative journalistic hit man before quite famously commencing a mea culpa routine in 1997--can we believe it? The short answer is yes, mostly. The long answer requires that we start, as Oscar Hammerstein II put it, at the very beginning.
The book dances back and forth between exposé and memoir. David Brock was raised in New Jersey, the adopted son of a mother who paid too much mind to what the neighbors thought and a father so rigidly conservative that he did something, as Brock notes, that even Pat Buchanan never felt moved to do: He left the Catholic Church to protest the liberal reforms of Vatican II and worshiped in a sect overseen by the profoundly right-wing French archbishop Marcel Lefebvre. It was partly for the sake of agitating his taciturn father that Brock's first political stirrings were liberal (Bobby Kennedy) to moderate (Jimmy Carter, for whom he secretly persuaded his mother to vote). The family moved to Dallas, an inhospitable milieu in general for a Kennedy acolyte, not least one who was coming to terms with the fact that the sight of his fellow boys disrobing after gym class did more to quicken his pulse than, say, a stolen glance in the direction of the décolletage of the Cowboys cheerleaders. Hating Dallas and still seeking to traduce the old man, for college he chose, of all lamentable destinations, Berkeley.
There, Brock expected to drop anchor in a tranquil moorage of like-minded, tolerant, liberal bien pensants. Instead, he ran head-on into the multicultural, academic left, a bird of altogether different plumage. When Jeane Kirkpatrick came to campus to speak, and protesters would not let her utter a sentence as one of them unloaded a bucket of simulated blood on the podium, that was enough for Brock. Soon he was writing columns in the Daily Californian applauding the "liberation" of Grenada and submitting an essay to the Policy Review, a publication of the Heritage Foundation, on campus Marxism. The Wall Street Journal adapted that piece as an Op-Ed, which caught the eye of John Podhoretz, son of Norman Podhoretz and Midge Decter and then an editor at Insight, a magazine put out by the Washington Times. Podhoretz offered him a job, and Brock was off to Washington.
The story of Brock's ideological conversion is important because it reflects a pattern we see later in several of his comrades: a conversion at once shockingly superficial and utterly fervent. Forget Burke or Oakeshott or Hayek or even Russell Kirk; Brock admits he hadn't read a single thing beyond some issues of Commentary he tracked down in the library. "I knew nothing of the movement's history," he writes. Joe McCarthy, Goldwater, Nixon--all were mysteries to him, for the most part. His politics were nothing more than a reaction to his personal experience. While the same cannot fairly be said of the movement's intellectuals, from Brock's telling it was indeed true of many of the activists, operatives and media babblers. Their conservatism was purely an emotional or psychological response to their immediate environment. In the most extreme case, Brock writes that his former close friend Laura Ingraham, one of the bombastic blondes of cable television, didn't "own a book or regularly read a newspaper." But as we have seen, in our age, ignorance is no barrier to expertise, particularly on cable television.
Shallow though it may have been, Brock's conversion was virtually consummate. I say virtually because there were some matters on which he claims he never drank the Kool-Aid. He had little taste, he says, for the racist shock antics of
the Dartmouth Review crowd; he quietly backed abortion rights; and, of course, on the gay question, he marched to a very different drummer than that of the movement to which he belonged. Of parties at the home of archconservative fomenter Grover Norquist, who hung a portrait of Lenin on his living room wall and often quoted Vladimir Ilyich's dictum to "probe with bayonets, looking for weakness," Brock writes that he was "ill at ease" at these gatherings; "unsure of how to handle the issue of my sexuality, I drifted in late and out early, usually accompanied by a woman colleague," traversing the room "like a zombie." Nevertheless, he wanted nothing more than their approval, and he put his remaining misgivings, and the odd homophobic joke, to the side.
This brings us to the book's second vital point about the winger psyche. The need to belong--and, specifically, to belong to a self-styled minority that felt itself embattled, thumbing its nose at the larger, contaminated culture--is a constant motif of Blinded by the Right, and it becomes clear over the course of the book that it was this convulsed emotional state, even more than ideology, that was, and I suppose still is, the real binding glue among the right. For Brock, it began with his trying to shock his father with Jimmy Carter and Berkeley; it went on to Brock's seeking to vilify the campus lefties. It was present, too, among many of the movement types he befriended: "There was electricity on the right, the same sense of bravely flouting convention--of subverting the dominant culture--that I had first felt in Texas and then at Berkeley."
It was by the time of the 1992 election, when this mindset joined hands with a group of men--and their many millions of dollars--who couldn't accept that the GOP was losing the White House to such a man as Bill Clinton, that it went from being a kind of batty nuisance to a well-oiled agitprop apparatus to, ultimately, a threat to the Constitution. Brock was by then ensconced at The American Spectator, which became in time the most virulent right-wing magazine in America, willing to publish any thinly sourced rumor as long as it made a Clinton look bad, and the home of the Arkansas Project, the Richard Mellon Scaife-funded operation that sought to dig up any Clinton dirt it could find. Brock sharpened his knife first on Anita Hill. With Laurence and Ricky Silberman holding his hand--he was a circuit judge in Washington and a member of the hard-right Federalist Society; she had worked for Clarence Thomas with Hill--Brock could scarcely believe how quickly and easily previously unreleased affidavits and so on fell into his hands from GOP Congressional staffers.
Brock knew intuitively what he was supposed to do with this material, and it wasn't journalism. It was character assassination, and not only of Hill. Of one Democratic Senate staffer, he wrote that the man was "known for cutting ethical corners...to achieve desired results." Brock admits he knew nothing about the man. He made no effort to contact sources who might have had different interpretations (and obviously not Hill herself); he double-checked nothing; he twisted the hearing record to make Hill look like a vengeful harridan who was, in his infamous phrase, "a little bit nutty and a little bit slutty." But it was good enough for the Spectator, which billed it, natch, as investigative journalism. Rush Limbaugh began reading sections of the piece on the air. Brock was put on to Glen Hartley and Lynn Chu, the literary agents of choice for the hard right. He signed a book contract with the Free Press, then run by archconservative Erwin Glikes and Adam Bellow, son of Saul. The Real Anita Hill hit the bestseller lists. The right-wing newspeak machine, now such a fact of political life, was born.
Next up, the famous "Troopergate" story (again in the Spectator), about Arkansas state policemen supposedly setting up sexual liaisons for Governor Clinton. Brock followed the old MO--no independent sourcing, printing rumors, etc.--to the same triumphant effect. And this time he found to his surprise a willing abettor. Though a few mainstream news organizations did shoot down some specific charges that didn't check out, the chief response of a largely panting Washington press corps ("I was astonished to see how easy it was to suck in CNN") was for more, more, more. Brock became a bigger star still.
Hillary Clinton was the next quarry, and Adam Bellow had obligingly put a $1 million price on her head in the form of Brock's advance. But Hillary proved to be Brock's Waterloo--as she has been, incidentally, for several other men who were supposed to steamroller her (Starr, Whitewater committee chair D'Amato, candidate Giuliani, candidate Lazio...). By then, Brock was starting to develop a conscience. In 1994, Jill Abramson and Jane Mayer's book on the Thomas-Hill matter, Strange Justice, had hit the stands. It proved to everyone in the world but hard-shell rightists that Thomas was indeed a ravenous porn enthusiast and that Hill, in all likelihood, was the truthful one. When even Ricky Silberman, who had been Brock's source and cheerleader while Brock was writing the Anita Hill book, seemed to acknowledge privately that Thomas had lied, Brock was shaken.
By the time he got around to Hillary, Brock was determined to write an actual book. ("I began to relish the complexity of my subject. I realized I had never known what journalism was.") I cannot here convey the full flavor of the contempt with which his old comrades regarded him as a result: the sideways glances, the calls not returned, the party invites not received--and, now that he wasn't "on the team," in the argot, the jokes about and denunciations of his sexuality, suddenly delivered within earshot. He was not supposed to commit journalism or write what he thought. He was supposed to kill Clintons. Period. Once he stopped that, his life on the right was finished.
David Brock gave up anger and turned to guilt. In the process, he flings open a most illuminating window on this hideous circus. Here is Newt Gingrich, vowing "to say the word 'Monica' in every speech" even while "conducting his own illicit affair." We see Georgia Congressman Bob Barr plotting to bring the troopers to testify on Capitol Hill to expose Clinton's adultery--the same Barr who, interestingly enough, married his third wife within one month of divorcing his second. We hear Jack Romanos, the head of Simon & Schuster, telling Brock, as he signed the million-dollar Hillary book deal--without even writing a proposal!--that the only thing he wanted to know before OK'ing the money was whether Hillary was a lesbian. We eavesdrop on the publisher of the Spectator asking Brock, "Can't you find any more women to attack?" We read of George Conway, one of the lawyers who played a crucial role in pushing Paula Jones's story, admitting that privately he didn't believe Jones's allegation at all but that her case must be pressed nonetheless because the point was to force a situation in which Clinton would have to lie under oath about extramarital sex. We witness Ted Olson, a member of the bar and now this country's Solicitor General, telling Brock that while he believed Vince Foster had committed suicide, the Spectator should still run a trashy, unsourced piece about Foster's "murder" to keep the pressure on the Administration until the Spectator could shake loose another "scandal."
Anecdotes like these spill out of this book. And so we return to the question: Why believe this man? I was not persuaded by every assertion about his emotional state in 1992 or 1995; there could be some after-the-fact varnishing going on there. But as for what he saw, and whom he saw doing it, there are three very good reasons to believe every word. First is the simple standard of factual recall. Brock names names, places, dates, the food and wine consumed, the color of the draperies. Perry Mason would love to have called Brock as a witness and watched as poor Hamilton Burger buried his vanquished head in his hands.
Second, quite simply, the writing has about it the tenor of veracity and candor. Brock comes clean on things he has no contemporary motive to come clean on, like a lie he told back at Berkeley in an attempt to discredit a journalistic foe. That strikes me as an act of expiation, not public relations.
And third, most persuasive to me, is this: You would think the right's screamers would be engaging right now in flamboyant public harangues about Brock's duplicity and so forth. But to date, I've scarcely heard a peep. Admittedly, it's early yet, as the book is just out. If Blinded by the Right ascends the bestseller lists, I expect at that point that the screamers will decide they have to deal with it. Until then, my hunch is that they hope they can bury it with their silence. That tells me that David Brock, while no longer right, is, in fact, right as rain.
It's official now: The United States has a policy on climate change. President Bush announced it on Valentine's Day at a government climate and oceans research center. "My approach recognizes that economic growth is the solution, not the problem," he said. Instead of requiring the nation to lower greenhouse gas emissions below 1990 levels, as called for in the Kyoto Protocol, the new policy is voluntary and aims only to slow the growth of emissions, not reduce them. The centerpiece of the new climate policy is a tiny little tax cut for any manufacturers who are interested.
Of course, it's not nearly as big as the tax cuts used for real national priorities like distributing income upward or starving civilian government of resources. It's just some walking-around money, less than $1 billion a year, for investors who voluntarily, now and then, feel like doing the right thing for the environment. The President would also like industries to report their own emissions levels voluntarily, which may earn them valuable credits in the future if an emissions trading scheme is implemented.
It takes a creative imagination to believe that this is an appropriate way for the world's largest economy (and producer of about 20 percent of the world's greenhouse emissions) to respond to a serious global crisis. If you believe, that is, that global warming is a crisis. George Bush and his friends keep hoping it's not, but the scientific consensus, not to mention world opinion, is absolutely clear on this point. At the request of the Bush Administration, the National Academy of Sciences re-examined the climate change issue last year and promptly concluded that the problem is every bit as important as previously reported. Finding a way to debunk all this annoying environmental science must be high on the White House wish list.
It almost looks like that wish has been granted. Bjørn Lomborg, a statistics professor at a Danish university and self-described "old left-wing Greenpeace member," says the story began when he got interested in the longstanding debate between environmentalist Paul Ehrlich and economist Julian Simon. Ehrlich claimed that shortages of many natural resources were imminent; Simon said they were not. A few years ago Lomborg started researching the facts in order, he says, to prove that Ehrlich was right. Instead he found to his surprise that Ehrlich was wrong--and indeed, environmentalists were wrong about many, many things.
Trapped by the "litany" of doom and gloom, environmental advocates have, according to Lomborg, missed the evidence that most of the problems they worry about are not so bad, and are not getting any worse. There are more acres of forests all the time, plenty of fish in the sea, no danger of acid rain, no threat of rapid extinction of species, no need to do much about global warming and no reason to worry about environmental causes of cancer. Everyone in the environmental world, his erstwhile comrades at Greenpeace included, has misunderstood the subtleties of statistics and overlooked the growing good news, as he graciously offers to explain.
Preposterous as it sounds (and, in fact, is), that's the message that Lomborg presents in The Skeptical Environmentalist. It received rave reviews in the Wall Street Journal, the Washington Post, The Economist and elsewhere, and it looks as if the Bush Administration has torn a few pages from it. Lomborg plausibly points out that the environmental litany of short-run crisis and impending doom is unrealistic, and sometimes based on statistical misunderstandings. If he had stopped there, he could have written a useful, brief article about how to think about short-run versus long-run problems and avoid exaggeration.
Unfortunately, Lomborg stretches his argument across 350 dense pages of text and 2,930 somewhat repetitive footnotes, claiming that the litany of doom has infected virtually everything written about the environment. As an alternative, he paints a relentlessly optimistic picture of dozens of topics about which he knows very little. Responses from researchers who are more familiar with many of his topics have started to appear, including rebuttals in the January issue of Scientific American, in a report from the Union of Concerned Scientists and on the website www.anti-lomborg.com.
On global warming, Lomborg believes that "the typical cure of early and radical fossil fuel cutbacks is way worse than the original affliction, and moreover [global warming's] total impact will not pose a devastating problem for our future." In support of this Bush-friendly thesis, Lomborg attempts to reinterpret all the massive research of recent years, including the carefully peer-reviewed Intergovernmental Panel on Climate Change (IPCC) reports. But he is not up to the task. Discussing the standard graphs of average temperature over recent centuries, which most analysts use to highlight the exceptional recent increases, he offers pages of meandering speculation and concludes that "the impression of a dramatic divergence [in recent world average temperature] from previous centuries is almost surely misleading." Lomborg's own figures 134, 135 and 146 present strong visual evidence against his strange conclusion, showing average temperatures heading sharply and unprecedentedly upward in recent decades. He also finds it terribly significant that we do not know exactly how fast temperatures will change in the future, as greenhouse gases accumulate in the atmosphere; nonetheless, he accepts IPCC estimates that temperatures above the range of recent historical experience are essentially certain to occur.
When it comes to estimating the economic costs of greenhouse gas reduction, Lomborg's claim that all models produce "more or less the same results" is absurd. He has missed a valuable analysis from the World Resources Institute, by Robert Repetto and Duncan Austin (The Costs of Climate Protection: A Guide for the Perplexed), which describes and analyzes the huge range of sixteen major models' estimates of the costs of greenhouse gas reduction. Repetto and Austin attribute the divergent estimates to the models' differing assumptions about the pace of economic adjustment to future changes, the extent of international emissions trading and the uses the government will make of revenues from carbon taxes or similar measures, among other factors.
I turn out to have a small part in Lomborg's story, in a manner that does not increase my confidence in his research. My name appears in footnote 1,605 in his chapter on solid waste, where he cites in passing a three-page article based on my 1997 book on recycling but overlooks the book (Why Do We Recycle?) and the larger point that it makes. Lomborg's solid-waste chapter simply says that the United States is not running out of space for landfills. Echoing an example long favored by the most vehement critics of recycling, he calculates that a landfill big enough to hold all US solid waste for the next 100 years would be quite small compared with the country's land area. Nothing is said about other countries--Denmark, for example--where land might be a bit scarcer. Almost nothing is said about recycling, either, because it seems that it doesn't much matter: "We tend to believe that all recycling is good, both because it saves resources and because it avoids waste.... We may not necessarily need to worry so much about raw materials, especially common ones such as stone, sand and gravel, but neither should we worry about wood and paper, because both are renewable resources."
The United States is not running out of landfill space, but this does not invalidate concern about waste and recycling. Rather, it shows the error of collapsing our thinking about long-term problems into short-term crisis response.
Several life-cycle analyses of material production, use and disposal (none of which Lomborg refers to) have found that extraction and processing of virgin materials accounts for far more environmental damage than landfilling the same materials when they are discarded. The greatest benefit of recycling is not that it solves a nonexistent landfill crisis, or that it staves off any immediate scarcity of resources, but rather that it reduces pollution from mining, refining and manufacturing new materials.
There are similar shortcomings in many other areas of The Skeptical Environmentalist, of which I will mention just a few. Lomborg claims that there is little need to worry about trends in air pollution: "The achievement of dramatically decreasing concentrations of the major air pollutants in the Western world...is amazing by itself.... There is also good reason to believe that the developing world, following our pattern, in the long run likewise will bring down its air pollution." He endorses wholeheartedly the hypothesis that economic growth will first cause air pollution to get worse, but then later will lead to improvement. This controversial idea, the so-called environmental Kuznets curve (EKC), was more widely accepted in the mid-1990s, the period from which Lomborg's citations are taken. Recent research has cast doubt on this pattern, as he acknowledges in the second sentence of a footnote. Yet he has missed the most comprehensive critique of the EKC research, by David Stern ("Progress on the Environmental Kuznets Curve?," Environment and Development Economics, 1998). According to Stern, the EKC pattern can be clearly detected only for a few air pollutants, such as sulfur, and then only in developed countries.
Rushing to critique environmental views in one area after another, Lomborg may not have had time to read all his citations. In his introductory chapter he maintains that the collapse of the indigenous culture of Easter Island was based on factors unique to that island and does not suggest that an ecological crash caused by resource overuse could threaten other societies. But the only source he cites about Easter Island reached exactly the opposite conclusion, speculating that ecological problems could have caused the decline of such civilizations as the Maya, early Mesopotamia and the Anasazi in what is now the southwestern United States: "Easter Island may be only one case of many where unregulated resource use and Malthusian forces led to depletion of the resource base and social conflict," concluded James Brander and M. Scott Taylor in "The Simple Economics of Easter Island" (American Economic Review, March 1998).
In his concluding chapter, Lomborg relies heavily on studies by John Graham and Tammy Tengs. These studies purport to show vastly different costs per life saved, or per life-year saved, from different regulations. At one extreme, the federal law requiring home smoke detectors, flammability standards for children's sleepwear and the removal of lead from gasoline have economic benefits outweighing their costs. At the other extreme, controls on benzene, arsenic and radioactive emissions at various industrial facilities are said to cost from $50 million to $20 billion per life-year saved. The implication is that shifting resources from the more expensive to the cheaper proposals would be enormously beneficial--by one wild calculation (which Lomborg uncritically accepts) saving 60,000 lives annually: "And the Harvard study gives us an indication that, with greater concern for efficiency than with the Litany, we could save 60,000 more Americans each year--for free." Graham and Tengs follow closely in the footsteps of John Morrall, who made similar claims in a related, earlier study.
A widely cited article in the Yale Law Journal ("Regulatory Costs of Mythic Proportions," 1998) by Georgetown University law professor Lisa Heinzerling explains the fatal flaws in the Morrall study. This, too, escaped Lomborg's notice. Heinzerling demonstrates that Morrall's long list of allegedly expensive regulations includes numerous items that were never adopted and in many cases never even proposed. Moreover, many of the cheaper lifesaving measures--removing lead from gasoline, for example--have already been done and cannot be redone for additional savings. Thus the re-allocation of money that would putatively save thousands of lives would have to be from nonexistent expensive regulations to already completed cheaper rules. In more recent, forthcoming work, Heinzerling and I have found that the same fundamental errors occur throughout the Graham and Tengs studies, including "the Harvard study" that Lomborg likes so well.
Finally, Lomborg cannot be allowed to speak for "old left-wing Greenpeace members" in general. I personally remain happy to support Greenpeace because, among other reasons, I admire its courageous and imaginative confrontations with the likes of nuclear weapons testers, the whaling industry and oil companies drilling in ecologically fragile areas. I am of course disappointed, but hardly shaken in my worldview, to learn that Lomborg claims to have caught Greenpeace in a statistical error or two. Greenpeace doesn't rely on me to throw grappling hooks onto whaling ships, and I don't rely on it for quantitative research. On the strength of this book, I won't rely on Lomborg, either.
Pat Buchanan surely holds the record for the greatest impact on a presidential election with the fewest votes. With less than 0.43 percent of the tally nationally, he still managed to decide the 2000 election. But for the thousands of votes mistakenly cast for Buchanan in Palm Beach because of the infamously confusing "butterfly" ballot, Al Gore would be President today and George W. Bush would be the Republican Michael Dukakis.
Buchanan's pernicious influence, however, did not end with the 2000 election. He's now picking up where he left off with his infamous "cultural war" speech to the 1992 Republican convention, a speech, as Molly Ivins quipped, that "sounded better in the original German." Well, Buchanan's been translating from Deutsch again, this time with The Death of the West: How Dying Populations and Immigrant Invasions Imperil Our Country and Civilization, his new book. The Death of the West harks back to the xenophobic jeremiads of the early twentieth century, such as Madison Grant's The Passing of the Great Race, Lothrop Stoddard's The Rising Tide of Color, Houston Stewart Chamberlain's The Foundations of the Nineteenth Century and Oswald Spengler's The Decline of the West.
Indeed, enterprising journalists and historians looking to expose the next Stephen Ambrose or Doris Kearns Goodwin should consider comparing Buchanan's book side by side with these others. In addition to revising Spengler's title, Buchanan shares Stoddard's love of watery metaphors--both books gush with rising tides, surging oceans and flooding rivers of nonwhites, all of which push inexorably against the ever more precarious dams and dikes around the white world. The two authors also share a predilection for quoting Rudyard Kipling, the poet laureate of the "white man's burden."
Each of these earlier books shares the same simple theme: It's Us against Them, and with fewer and fewer of Us and more and more of Them, things look grim for Us. Buchanan readily accepts the "demography is destiny" argument: "As a growing population has long been a mark of healthy nations and rising civilizations, falling populations have been a sign of nations and civilizations in decline." Buchanan's data clearly put the West into the latter category. "In 1960, people of European ancestry were one-fourth of the world's population; in 2000, they were one-sixth; in 2050, they will be one-tenth. These are the statistics of a vanishing race."
And who's responsible for this disappearance? For Buchanan, women bear most of the blame. Liberated by technological and cultural changes, he argues, Western women have abandoned their true calling as designated racial breeders. "Only the mass reconversion of Western women to an idea that they seem to have given up--that the good life lies in bearing and raising children and sending them out into the world to continue the family and nation--can prevent the Death of the West."
Faced with declining birthrates, the only alternative available to Western nations if they wish to maintain themselves is massive immigration from the burgeoning populations of Asia, Africa and the Middle East. But for Buchanan, this medicine is worse than the disease, since immigration on this scale entails the introduction of too many nonwhite non-Christians. Regarding Europe, he writes: "And as the millions pour into Europe from North Africa and the Middle East, they will bring their Arab and Islamic culture, traditions, loyalties, and faith, and create replicas of their homelands in the heartland of the West. Will they assimilate, or will they endure as indigestible parts of Africa and Arabia in the base camp of what was once Christendom?" Clearly he thinks the latter. The United States faces a similar danger, he warns: "Uncontrolled immigration threatens to deconstruct the nation we grew up in and convert America into a conglomeration of peoples with almost nothing in common--not history, heroes, language, culture, faith, or ancestors. Balkanization beckons."
Buchanan must know that many have rung this tocsin before him, and each time it has been a false alarm. The West's population has probably declined relative to the rest of the world ever since the Western world defined itself as such. For example, when Stoddard wrote in 1922, he sounded the alarm because Western nations had declined to only one-third of the world's population. By 1960, as Buchanan points out, the Western share of the world's population had fallen to one-fourth. Despite this relative decline in population, he considers 1960 as the height of Western power and influence. Furthermore, most evidence suggests that Western nations are at least as powerful now as in 1960, even with the decline in population.
Buchanan's warnings about the United States ring just as hollow. Of the 30 million foreign-born residents, he claims, "Even the Great Wave of immigration from 1890 to 1920 was nothing like this." He's right, though not in the sense he intends: that earlier wave surpassed the current one. Today, foreign-born residents make up about 11 percent of the US population, but from the 1870s to the 1920s, the figure fluctuated between 13 and 15 percent.
Buchanan, however, also argues that today's immigrants are fundamentally different from earlier generations of newcomers; but again, there's no evidence for this. America was hardly more familiar to a Southern Italian peasant who came to New York City in 1900 than it is to an immigrant today from Nigeria or the Philippines. If anything, the spread of global markets and American popular culture has made recent immigrants more attuned to the ways of their new home than their predecessors of a century ago. Furthermore, the bulk of contemporary immigrants come from Latin America, and thus possess the Christian faith that Buchanan views as central to any definition of America. Indeed, the vast majority of Latin American immigrants share Buchanan's Catholicism. Nonetheless, Buchanan insists, these immigrants "not only come from another culture, but millions are of another race," making it difficult if not impossible for them to assimilate into US society. While Buchanan might consider Latinos his brothers in Christ, he draws the line at having them as neighbors or fellow citizens.
September 11, Buchanan argues, painfully exposed the threat from contemporary immigrants: "Suddenly, we awoke to the realization that among our millions of foreign-born, a third are here illegally, tens of thousands are loyal to regimes with which we could be at war, and some are trained terrorists sent here to murder Americans." But the past is full of similar warnings about the enemy within. During World War II, anti-Japanese prejudices combined with national security concerns to result in the internment of thousands of US citizens. During World War I, "unhyphenated" Americans saw German-Americans as the Kaiser's minions, engaging in sedition and sabotage to aid the cause of the Fatherland. Yet as these instances demonstrate, the real threat, then as now, existed largely in fevered nativist minds.
This selective and myopic view of American nativism runs throughout The Death of the West. On the one hand, Buchanan refers to nativist statements by such people as Benjamin Franklin, Theodore Roosevelt and Calvin Coolidge to support his assertion that concerns over immigration are not un-American. On the other hand, while he is correct that nativism has always been one of America's multiple political traditions, Buchanan has nary a mention of how pervasive, inaccurate and pernicious such sentiments have been. Of the Know-Nothings, he knows nothing. He quotes Al Smith, the first Catholic nominated for the presidency by a major party, but includes no mention that anti-Catholic prejudices made a major contribution to his landslide defeat in the 1928 election, as he was vigorously opposed by Protestant leaders and groups such as the Ku Klux Klan. (After the election, the joke went, Smith sent a one-word telegram to the Pope: "Unpack.") To Buchanan, it seems, anti-Catholic sentiment is a recent development and limited to left-wing intellectuals. Overall, he chooses to ignore the fact that nearly every immigrant to this country confronted nativists who argued that their race, religion, ethnicity or culture made them unfit to become full American citizens. Furthermore, if these previous nativists had had their way, they would have excluded the ancestors of most current American citizens, including Buchanan's.
Buchanan recognizes that he's in a minefield with this subject, and he makes some efforts to tread lightly. To rebut accusations that he's an anti-Semite, he sheds crocodile tears over the danger to Israel from a growing Arab population and occasionally (but not consistently) refers to America's Judeo-Christian values. But like Dr. Strangelove's hand, Buchanan's anti-Semitism refuses to stay under control. As examples of conservative leaders who have failed to fight the culture wars with sufficient zeal, he singles out Irving Kristol, Gertrude Himmelfarb and Norman Podhoretz. One might well ask why these three when one could level similar charges against Jack Kemp, Bob Dole, John McCain and even George W. Bush.
By the end of the book Buchanan has dropped all pretenses, declaring America to be a Christian nation. His racism is equally apparent. For example, in addition to warning that many current immigrants are of a different--that is, nonwhite--race, he includes a lengthy discussion of black crime rates. Given that most blacks can trace their American ancestry back further than most white Americans, it's clear that Buchanan defines America not by "history, heroes, language, culture, faith, or ancestors" but by race.
If Buchanan's diagnosis of the problem is objectionable, his solution is even worse. For him, democracy, a shared culture and even a common race offer no defense against the West's impending doom. Rather, he argues, "If the West expects a long life, it had best recapture the fighting faith of its youth." And what were these youthful characteristics? "Protestant monarchs and Catholic kings alike did not flinch at burning heretics or drawing and quartering them at the Tyburn tree. The Christianity that conquered the world was not a milquetoast faith, and the custodians of that faith did not believe all religions were equal. One was true; all the rest were false." To believe otherwise invites disaster, "For it is in the nature of things that nations and religions rule or are ruled."
Buchanan's right-wing nativism is nothing new, so it might be tempting to dismiss him and his book as inconsequential. After all, didn't the 2000 election prove that Buchanan had only marginal electoral support and that even the Republican Party considers his views too extreme? But votes don't always measure influence, and The Death of the West has clearly struck a responsive chord. Not only does it stand near the top of the New York Times bestseller list, but its author remains a prominent fixture on the TV talk-show circuit. Indeed, it's interesting to contrast the reception of The Death of the West with that of Buchanan's previous book, A Republic, Not an Empire. That book set off a firestorm of criticism, especially among Republicans and conservatives, when Buchanan argued that Hitler had not threatened the United States. If anything, The Death of the West is even worse, since Buchanan moves beyond minimizing the danger of Hitler to the open espousal of many of his doctrines. Yet this time around, conservative commentators have not been nearly as critical. Then, of course, Buchanan was in the middle of bolting the GOP, potentially splitting the conservative vote and throwing the election to the Democrats. None of this came to pass, with Buchanan even helping Bush to win Florida. But the lesson seems clear: Conservatives are more than willing to tolerate Buchanan's racism and xenophobia, so long as he doesn't pose a direct threat to their political interests.
Even more disturbing than Buchanan's kid-gloves treatment by the media and the right is that the book's popularity stems from and seems likely to reinforce the upsurge in nativist sentiments after September 11. For many Americans, those tragic events gave even more reason to see the world in Manichean terms and to divide Americans along lines of race, religion and ethnicity. Consequently, relatively open immigration policies came under attack. In Congress, a House caucus devoted to immigration restriction doubled in membership after September 11. Representative James Traficant, Democrat of Ohio, spoke for many of those members when he asked, "How do you defend your home if your front and back doors are unlocked? What do we stand for if we can't secure our borders? How many more Americans will die?... If 300,000 illegal immigrants can gain access to America every year, trying to find a better life, do not doubt for one moment that a larger contingent of people with evil intentions could gain entry into America and continue to kill American citizens."
Thankfully, such sentiments have not gained much headway in the ensuing months. Although the Bush Administration has backed off its proposal for granting amnesty to illegal immigrants from Mexico, it has shown few signs of embracing significant immigration restrictions in response to September 11 and has even agreed to restore food-stamp eligibility to legal immigrants. In Congress, immigration opponents have failed even to gain a formal hearing for their proposals. Yet the popularity of The Death of the West shows that nativist attitudes have not disappeared, and Buchanan's diatribe will undoubtedly help reinforce such views. Furthermore, both opponents and supporters of open immigration recognize that another incident of terrorism is perhaps all that is needed to turn The Death of the West from polemic to policy.
It's the largest profession in healthcare. It's the largest female profession in America. But despite its tremendous importance and impact, most people know very little about contemporary nursing. Public ignorance of the present-day profession, however, pales in comparison with ignorance of nursing's history. How many of us know that the development of nursing as the first secular profession for respectable women was a major feminist achievement? Or that Florence Nightingale was not, in fact, the "founder" of modern nursing? Or that nurses played a key role in developing the American hospital system, as nursing historian Sioban Nelson has documented in her recent book Say Little, Do Much? How many of us know about the role of nursing in the development of public health and care of the chronically ill and poor? Most important, how many of us recognize that society's persistent devaluation of nursing--reflected today in the prejudices of many newly liberated female physicians, health policy experts and journalists--is a legacy of longstanding, socially enforced subordination to medicine?
Katrin Schultheiss, an assistant professor of history and women's studies at the University of Illinois, Chicago, is one of a handful of non-nurses who understand what the profession has to teach us about the complex process of female emancipation, as well as about the development of modern healthcare systems. She recounts the tortuous history of how the "professionalization" of nursing in France coincided with anticlericalism and the secularization of the field. Although her story focuses on the forty-year period from 1880 to 1922 and takes place in one country, the gender dilemmas Schultheiss explores have hampered nurses' ability to care for patients in healthcare systems around the globe, including in the United States.
Her tale begins with the advent of France's Third Republic and follows political reformers who attacked clerical authority as they tried to modernize the healthcare system. Until that time, nursing outside the home was typically provided by convent-trained nuns. Hospital reformers recognized that modern medicine required more nurses with more systematic education, but therein lay the problem. Since knowledge is power, the acquisition of knowledge was inevitably a challenge to authority.
Physicians, as men, did not welcome women on their terrain. As members of a developing profession--one that did not then command the prestige it enjoys today--doctors were also adamant about defending their field "from irregular or illegal practitioners."
Even doctors who recognized the need for a more educated nursing work force and who wanted to laicize the care of the sick would not countenance the education of nurses if, in the process, nurses attained the kind of knowledge and stature that would allow them to demand greater authority and autonomy in both the workplace and society. So even lay nursing had to be constructed in altruistic terms that stressed not nurses' knowledge but their virtue. As Schultheiss writes, "As long as nursing was clearly understood to be a custodial, maternal, or charitable occupation, and as long as nurses were regarded as the social, economic, and educational peers of the patients, rather than the doctors, there would be no ambiguity about who held medical authority within the hospital."
In Paris, nursing nuns, while obedient and devoted, presented a problem to medical reformers. "The very existence of an autonomous community of women called into question the hierarchy of power within municipal institutions," Schultheiss notes. Happily, secular authorities found lay nurses, as one reformer commented, to be "infinitely more subordinate than the religious nurses and more scrupulous in the strict execution of doctors' orders."
While anticlerical reformers touted the benefit of lay nurses, the French public was attached to the nuns who had provided what out-of-home nursing care had existed since the seventeenth century, and even before. Of course, Schultheiss points out, even support for religious nurses was cast in gendered terms. Proponents of the nuns insisted that nursing should be left to a special group of religious women because it would corrupt lay women for their real work--which was mothering. "A woman is either a bad mother or a bad nurse," was their motto. To convince the public to support secularization, reformers had to "feminize nursing--to turn nursing into a general feminine virtue that all women could possess."
Schultheiss's story also introduces us to a peculiar hybrid form of religious nurse--the "hospitalières" of the Hospices Civils of Lyons. These women were secular nuns, congregationist sisters "who undertook a lifelong commitment to serve the sick and poor under harsh physical conditions and with virtually no monetary compensation, but who remained under the direct authority of the secular administration." According to Schultheiss, laicizers supported them because they were easily controllable and because their sense of devotion was easily manipulated by civil administrators who didn't want to pay the real cost of nursing care.
In this section of the book, class also enters the story: If civil administrators were to get nursing care for little or nothing, women's educational standards--and thus their salaries--had to be low. Whether they were secularizers or not, reformers recognized that more highly educated women of a better class would eventually demand more pay, and more say.
Finally, Schultheiss takes us to Bordeaux, where we meet Anna Hamilton, a reformer and devotee of Florence Nightingale. With connections to the international nursing reform movement, Hamilton wanted to open a nursing school that would produce a "new nurse," recruited from the so-called better classes. This new nurse, she insisted, would deliver better patient care than nursing nuns. Hamilton's critique of the nuns, Schultheiss explains, was not based on anticlericalism. Rather, Hamilton argued that the nuns had "distanced themselves from direct patient care" while obstructing the creation of "a single medical hierarchy grounded on universal principles of hygiene and scientific health care."
Hamilton was able to gain support for her project from Paul-Louis Lande, a physician who became mayor of Bordeaux, because she firmly linked the "professionalization and feminization of nursing." Doctors in Bordeaux, Schultheiss writes, recognized "the need for improving the training of hospital nurses, but rejected all aspects of reform that expanded the nurses' autonomy or authority beyond the narrowest limits."
Hamilton accepted these limits, asserting that "it is extremely ridiculous for a nurse who possesses neither the knowledge nor the rights nor the sex of the doctor to try to imitate his way of interacting with the patient and to try to use his language." Thus, in France, as in England and the United States, the nurse-doctor game began with the acceptance of the notion that nurses could not--or should not--possess medical knowledge and that they therefore could not--and should not--use medical language.
Schultheiss ends her story after the First World War. The war produced such a huge need for nurses that the debate over the virtues of lay versus religious nurses effectively ended. More than 100,000 nurses, drawn from every social class, enlisted to serve "la Patrie," which "demonstrated that women's special aptitudes could be attached fruitfully to the state." However, even during this period and afterward, nursing was valued not for its knowledge but for its virtue. It had become, the author concludes, "a twentieth-century version of republican motherhood."
French nursing carries that legacy to this day. Last year, when I was strolling down the Boulevard Saint-Germain in Paris, a book displayed in the window of a children's bookstore caught my eye. It was called Je Sais Qui Me Soigne ("I Know Who Takes Care of Me") and is part of a series on citizenship and the professions. Nurses make a brief appearance in the book--as doctors' servants who have, as the text reads, "just enough schooling to follow doctors' orders."
For nurses struggling to put their education to use for patients, rather than for physicians, the ability to escape, at least temporarily, medical domination has always made home care attractive. Which brings us to Karen Buhler-Wilkerson's part of the story. In No Place Like Home, Buhler-Wilkerson, a professor of community health and director of the Center for the Study of the History of Nursing at the University of Pennsylvania School of Nursing, traces the development of home care from the opening of the first US home-care agency--the Ladies Benevolent Society, founded in 1813 in Charleston, South Carolina--through the present.
In Charleston, as elsewhere, respectable society ladies started home-care agencies because they felt "obligated to improve the conditions of and provide for the comfort of the poor," who were, in turn, "expected to manifest their gratitude to the rich," who established these agencies. But they did not deliver the care. Nurses did.
No Place Like Home does a great service to these ordinary nurses who are often dismissed as know-nothings by some nursing elites today. Buhler-Wilkerson details the complexity of caring for victims of tuberculosis or managing patients during typhoid epidemics. She also documents the persistence of the issues with which home-care agencies still struggle today: how to navigate doctor-nurse relationships; how to choose appropriate patients for home-care services; how to deal with gender, race and class prejudice; and how to secure long-term services for the chronically ill.
From the early days of home care, doctors were concerned about nurses invading their territory. In Boston, for example, doctors "confided to lady managers that 'the constant danger with trained nurses is that they shall usurp the doctors' position and prescribe for patients.'"
At the turn of the twentieth century, with the founding of the Henry Street Settlement on the Lower East Side of Manhattan, Lillian Wald and her colleagues developed public health nursing--"to improve standards of living" of the poor. One of the great innovations of the Henry Street Settlement was the establishment of a "First Aid Room." This was a kind of community clinic where immigrants could gain easy access to nursing care for routine health problems. Doctors, however, soon complained that "nurses were carrying ointments and even giving pills outside the strict control of physicians." Even outspoken nurses like Wald's colleague, socialist Lavinia Dock, feared a confrontation with powerful physicians. By 1911 questionable cases were no longer treated in the First Aid Room. "Later publications," Buhler-Wilkerson writes, assured the public that "the real Henry Street Settlement nurse will make the doctor feel that she is exerting every effort to have his treatment, not hers, intelligently followed."
An equally fascinating subject tackled by Buhler-Wilkerson is the impact of racial prejudice on nurse-patient and nurse-doctor relationships. In both the North and the South, lady managers as well as nurses fretted about whether it was appropriate for white nurses to care for black patients or black nurses for white patients. When insurers, notably Metropolitan Life, entered the field at the turn of the last century, managers considered the same imponderables. Race invariably trumped the needs of care and even of doctor domination of the nurse-physician relationship. For example, Buhler-Wilkerson tells us that the respectable ladies of Richmond, Virginia, who ran home care in that city, decided it was "'eminently' satisfactory for white nurses to care for black patients on the 'same footing' as white patients--but drew the line at white nurses 'taking orders from colored physicians.'"
The advent of health insurance also had a critical impact on the home-care agencies. Wald convinced Metropolitan Life to cover home-care services in 1909. Met Life wanted to reduce the high mortality rate of black life insurance subscribers--thus delaying payments on their life insurance policies. Home-care nursing's preventive approach initially seemed to make good business sense. By the 1950s, public health nursing and medical advances had paid off: Fewer people were dying of infectious diseases, and more acute illnesses were treated in the hospital. This meant that the bulk of home-care patients were chronically ill. To reward public health nursing for its success, Met Life curtailed its home-care program. "Providing care for those who failed to recover quickly was, from an insurance perspective, a poor investment," Buhler-Wilkerson states bluntly.
Since the fate of nursing is tied to the fate of the patients nurses serve, the situation has not improved much, as first Medicare and Medicaid and now managed care have "rediscovered" home care. Indeed, today the promise of the home as a place where nurses and their patients can escape the negative consequences of medical paternalism and give or receive higher-quality care has remained largely unfulfilled.
In Devices & Desires, Margarete Sandelowski uses a different lens--the world of medical technology--to explore the issue of gender and nursing. This brilliant book shows just how much the "charitable, devotional and altruistic" image of the nurse conceals. From the discovery of the thermometer to the development of intensive heart and fetal monitoring, Sandelowski documents how heavily physicians' reputation for scientific and technical mastery has depended on nurses. As Sandelowski shows, nurses have been critical in administering medical technology, monitoring the information it provides and interpreting that information to physicians, not to mention "educating patients about new devices, getting patients to accept and comply with their use, and alleviating patients' fears about them."
An eye-opening segment describes the use of the first thermometer, rather than the hand, as a diagnostic tool in the mid-nineteenth century. In it, we learn that the thermometer we take for granted today was originally an unwieldy, dangerous instrument that had to be carefully manipulated so as not to injure the patient. Because diagnosis and treatment involved taking the patient's temperature numerous times a day, busy physicians assigned the task to nurses. This involved, however, far more than simply recording data. The nurse, Sandelowski writes, "had to know what caused various temperatures to occur and the nursing measures that would lower or raise temperature to normal levels."
While physicians were the ones to insert the first unwieldy and equally dangerous intravenous devices, nurses were the ones to make sure the patient's arm remained immobile and that the patient could tolerate the discomfort of IV therapy. Nurses are the ones who developed the intensive-care unit--to provide intensive nursing care--and who track and interpret data from fetal monitors. As the primary users of much medical machinery, nurses are often more knowledgeable about equipment than doctors. Indeed, "the benefits of machine monitoring could not be fully harnessed without nurses who understood and could act immediately on the information monitors generated." While the public does not recognize this fact, the author tells us that medical equipment manufacturers certainly do. This is why nurses continually work with physicians and manufacturers to create design improvements and to insure that "expensive machinery [is] fully utilized."
What is amazing about this story is how little nurses have benefited from their technological mastery. Sandelowski shrewdly diagnoses a classic Catch-22. While it is true that nurses' status is somewhat enhanced by their technical proficiency, the recognition they receive does not match their actual accomplishments. That's because physicians quickly label the technical activities nurses engage in as "simple enough" for a nurse to perform.
No matter how much nurses participate in the diagnostic process, of course, physicians have maintained a legal and linguistic stranglehold on "medical" diagnosis. Even as "physicians were increasingly expecting them to perform de facto acts of diagnosis," Sandelowski writes, "nurses were in the bizarre position of having to be mindful of symptoms without speaking their mind about them."
Nurses were supposed to be able to distinguish between normal and abnormal conditions and to look for reasons for any abnormal findings. But nurses were never to use the words "normal" or "abnormal" in reporting or recording patient conditions, and they were to refrain from offering their opinions on etiology or diagnosis.... Nurses were to say (report and record) only what they saw, unlike physicians, who maintained the right to say what they knew.
This has produced the peculiar phenomenon--even today--of the nurse who recognizes that a cancer patient has diarrhea or a mentally ill patient is hallucinating, but who is not allowed to use the actual medical word because that would suggest that she, or he, is making a "medical diagnosis."
As she describes these phenomena, Sandelowski never paints nurses as innocent victims of nasty, overbearing physicians. In their perennial attempt to find "a socially valued place and distinctive identity," Sandelowski argues, many members of the profession have, albeit unwittingly, adopted common gender stereotypes that perpetuate the oppression of nurses.
One segment of the profession, Sandelowski contends, has bought into the notion that the complex practical, technical work that ordinary nurses perform is indeed simple and know-nothing.
Typically conceived of as nothing more than the physician's hand, and persistently caught in the Western cultural dichotomy between merely manual and highly prized mental, or intellectual, work, nurses have struggled to show that nursing is largely brain work. In the process, however, they have inadvertently complied with the prevailing cultural practice of denigrating the very "body-knowledge" that is the forte of the nurse.
This is particularly evident in the nurse-practitioner movement, which so many elite nurses now promote. "The key factor differentiating nurse practitioners from other nurses," she writes, "is both the use of medical instruments and the use of instruments in ways previously denied nurses." But, she points out, in our bottom-line-driven healthcare system "the role emerges as largely economically and 'medically-driven'.... The traditional image is maintained of nurses as the extra hands and eyes of physicians willingly and cheaply filling voids and bridging gaps in health care."
Other segments of the profession, Sandelowski argues, have opposed nurses' emotional and social work to their technological activities, arguing that technology is somehow an inauthentic nursing activity, while "caring" is both authentic and an essential "antidote to technology." Sandelowski shrewdly insists that in opposing "nursing/touch and technology," the profession has been "identified both with and against technology and thus, in an ironic way, with and against itself."
While it is not the purpose of these books to offer advice about dealing with the many problems nursing confronts, they implicitly point to one incontrovertible solution: We can appreciate what nurses do in the present only if we understand how their work has been constructed in the past and what they have contributed--and can contribute--to our healthcare system.
Understanding and analyzing nursing's decades-long struggle for "a socially valued place and distinctive identity" is not an academic exercise. It is central to reversing the chronic underfunding of the nursing services most of us will eventually depend on in hospitals and other healthcare institutions, and also the undereducation of the nursing work force at almost all levels of practice. And it is critical to any solution to the severe nursing shortage, which, if not quickly and effectively addressed, will have disastrous consequences as the population grows older and sicker.