"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."
Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.
Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.
One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next-to-last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.
The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he flatly asserts that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and that "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind--two chapters earlier, he was one of those observers. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he indicates the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.
In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.
Moreover, Noble undercuts his own case with hyperbole and by failing to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.
Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.
Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."
Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."
One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.
Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.
As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and by diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."
Noble does provide a strong cautionary tale--in the most informative chapter in the book--in his account of the involvement of UCLA's extension division with a commercial company called Onlinelearning.net. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years--a figure that in reality quickly plummeted to below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.
Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.
In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of them available online?
Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?
Noble's answer to most of these questions would undoubtedly be yes, insisting, as he does, that anything less than "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)
The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.
We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."
Here we are, twenty years on, and the reports of the Israeli army smashing its way through Palestinian towns remind me of what came out of Lebanon as Sharon and his invading army raced north. Israeli troops beating, looting, destroying; Palestinians huddled in refugee camps, waiting for the killers to come.
But there is a huge difference. Twenty years ago, at least for people living here in the United States, it was harder, though far from impossible, to get firsthand accounts of what was going on. You had to run out to find foreign newspapers, or have them laboriously telexed from London or Paris. Reporting in the mainstream corporate press was horrifyingly tilted, putting the best face on Israeli deeds. Mostly, it still is. But the attempted news blackout by the Sharon government and the Israeli military simply isn't working.
Here's Aviv Lavie, writing in Ha'aretz on April 2:
A journey through the TV and radio channels and the pages of the newspapers exposes a huge and embarrassing gap between what is reported to us and what is seen, heard, and read in the world.... On Arab TV stations (though not only them) one could see Israeli soldiers taking over hospitals, breaking equipment, damaging medicines, and locking doctors away from their patients. Foreign television networks all over the world have shown the images of five Palestinians from the National Security forces, shot in the head from close range.... The entire world has seen wounded people in the streets, heard reports of how the IDF prevents ambulances from reaching the wounded for treatment.
As always, there are the courageous witnesses. These days we have the enormously brave young people in the International Solidarity Movement sending daily communications back to the United States that flash their way round the Internet and even translate into important interviews in the mainstream media.
Meet a few of them. Here's Jordan Flaherty, filing this account on Indymedia:
Last night the Israeli Military tried to kill me. I'm staying in the Al Azzeh refugee camp, in Bethlehem, along with about twenty other international civilians. We're here to act as human shields.... On the hill above the camp is an Israeli military sniper's post. To get where we were staying in the village, most of us had to cross this street. It was a quick, low, dash across the street. As I ran, the sniper fired.... The shots began as I came into view, and stopped shortly after I made it to the other side. They were clearly aimed at me. And, by the sound of them, they were close. All night long, there was the sound of gun shots, as the military shot into our village. We stayed clear of the windows.... The guns and bullets were, no doubt, paid for by my tax dollars. Which is, of course, why we are here.
Or Tzaporah Ryter, filing this on Electronic Intifada:
I am an American student from the University of Minnesota. I currently am in Ramallah. We are under a terrible siege and people are being massacred by both the Israeli army and armed militia groups of Israeli settlers.... On Thursday afternoon, the Israeli army began sealing off each entrance to Ramallah.... Those traveling in began desperately searching for alternative ways and traveling in groups, but the Israelis were firing upon them and everyone was running and screaming.... Israeli jeeps were speeding across the terrain, pulling up from every direction and shooting at the women and children, and also at me...
Or the extremely articulate and self-possessed Adam Shapiro, whose testimony ended up in the New York Daily News and on CNN, where he told Kyra Phillips:
This is not about politics between Jew and Arab, between Muslim and Jew. This is a case of human dignity, human freedom and justice that the Palestinians are struggling for against an occupier, an oppressor. The violence did not start with Yasir Arafat. The violence started with the occupation.... Arafat, after every terrorist incident, every suicide bombing, after every action, has condemned this loss of life, of civilian lives on both sides. The Sharon government, sometimes will apologize after it kills an innocent civilian, but it does not apologize for raping the cities and for going in and carrying out terrorist actions, going house to house tearing holes through the walls, roughing up people, killing people, assassinating people.
Most of the time you open up a newspaper and read a robotic column--as I did the Los Angeles Times's Ronald Brownstein the other day--about Palestinian terrorism and the wretched Arafat's supposed ability to quell the uprising with a few quick words. And then you turn on the NewsHour and there, of all people, is Zbigniew Brzezinski, stating the obvious, on April 1:
The fact of the matter is that three times as many Palestinians have been killed, and a relatively small number of them were really militants. Most were civilians. Some hundreds were children.... in the course of the last year, we have had Palestinian terrorism but we have also had deliberate overreactions by Mr. Sharon designed not to repress terrorism but to destabilize the Palestinian Authority, to uproot the Oslo Agreement, which he has always denounced, in a manner which contributed to the climate, that resulted in the killing of one of the two architects of the Oslo Agreement.
After predictable dissent from Kissinger, Brzezinski went on:
It's absolute hypocrisy to be claiming that Arafat can put a stop to the terrorism.... the fact of the matter is that his ability to control the situation would be greatly increased if there was serious movement towards political process, towards a political settlement and that the United States took the lead.
Between this brisk statement and the eloquent courage of Adam Shapiro and his brave fellow internationalists, the truth is getting out--not fast enough, not loud enough--but better than twenty years ago.
Recent days have brought the first tentative but welcome roadblocks to the Bush Administration's war-fevered assault on civil liberties. In Newark, Superior Court Judge Arthur D'Italia, calling secret arrests "odious to a democracy," ordered the Immigration and Naturalization Service to release the names of post-September 11 detainees to the New Jersey ACLU. Although the release of the names has been delayed pending a federal government appeal, Judge D'Italia's courageous ruling is a significant victory for constitutional principle.
Meanwhile, a Freedom of Information Act lawsuit requesting the names of detainees and related information continues in federal court in Washington, DC. The Center for National Security Studies, which is representing numerous civil liberties and media outfits (including The Nation), recently filed a brief supporting its own request for this material. The Administration is expected to file a response in mid-April, and oral arguments could take place weeks after that.
Some of the most important restraints are being applied from abroad. In the case of accused hijacking conspirator Zacarias Moussaoui, Attorney General John Ashcroft confronted a clear choice between his domestic political goal of advancing the death penalty and the international goal of a palatable legal campaign against Al Qaeda. Ashcroft tried to have it both ways, committing himself to capital charges against French citizen Moussaoui while seeking further cooperation in the case from the anti-capital-punishment French. The result: French officials have publicly promised to withhold any evidence that might lead to a death sentence. Ashcroft's prosecutors are on the hook, stuck with a sketchy conspiracy case requiring proof that the silent Moussaoui, grounded in jail on September 11, was a sufficiently active and knowledgeable architect of the Al Qaeda Trade Center and Pentagon attacks to merit a capital conviction.
The sense that European human rights commitments are having an impact despite the Administration's guff was even clearer when Defense Secretary Donald Rumsfeld effected a brief reverse-rudder from George W. Bush's military tribunal order. This past fall the Administration fumed when UN Human Rights High Commissioner Mary Robinson denounced the tribunal order, but Rumsfeld's new plan to require public trials, unanimous votes for a death sentence and additional layers of review was clearly designed to quiet international criticism. Rumsfeld's amendment to the Bush order does not get to the heart of the constitutional problem, however. The tribunals still represent an unlawful seizure of judicial authority by the President. And Rumsfeld's March 28 admission that prisoners in Guantánamo Bay could stay locked up even if acquitted of charges reveals what is in effect a permanent policy of internment without trial, a policy made possible only by the Al Qaeda prisoners' continued status in a no man's land between the Geneva Conventions and US law. This policy has European allies nearly as alarmed as they are about the tribunals.
There are two important lessons here. One is that in the months since September 11, players in the federal system of checks and balances--Congress and federal judges--have largely failed to resist the Bush Administration's constitutional power grab. The second lesson, however, is that resistance is more than possible. With state courts sometimes better guardians of civil liberties than are the Rehnquist-era federal courts, and with Europe deeply invested in its Continental human rights covenants, it may turn out that local and transnational coalitions will become an effective means of preserving civil liberties. State courts at home and allies abroad are turning out to be the most compelling protectors of the essential American values the Bush Administration has been ready to sell down the river.
Odds are good that on a plane or boat or bus somewhere in the world sits a refugee headed for the United States carrying the seeds of a weapon of mass destruction. The agent he unwittingly carries is insidious and lethal but slow acting, so the deaths it causes can come months or even years after it is disseminated in the population. It has the potential to overwhelm, to kill thousands, and there may be no vaccine, antidote or cure.
What is this ominous threat? An ingenious new biological weapon? No, it is a very old nemesis of humankind--tuberculosis, an infectious disease that kills more than 2 million people every year. Because the majority of them are poor and outside our borders, we don't hear much about them, but that may soon change. Tuberculosis is making a comeback and is conquering the treatments that have kept this killer at bay in the developed world. Multidrug-resistant tuberculosis--a death sentence in most developing countries--is becoming more common and is incurable in about half the cases, even in the United States.
Two numbers in the President's budget proposal stand in stark contrast to each other: $6 billion to fight bioterrorism versus $200 million to the Global Fund to Fight AIDS, Tuberculosis and Malaria. That works out to a little more than $1 billion per US anthrax death last year, as compared with $33 per global victim of the more common infectious scourges in 2001.
Re-emerging infectious diseases like malaria, HIV and tuberculosis continue their inexorable march, devastating poor countries in Africa, Asia and South America and ultimately threatening the richest countries. Either we are all protected or we are all at risk. It would be better to recognize that the developed world's inaction and callousness have allowed epidemics to flourish in the fertile soil of poverty, malnutrition and poor living conditions made worse by wars, internal displacements, repressive regimes, refugee crises, economic sanctions and huge debt payments that require poor countries to cut public services.
If we can possibly be unmoved by the staggering numbers affected by the AIDS pandemic alone (3 million dead in 2001, more than 40 million people living with HIV, 28 million of them in Africa), then perhaps we will be moved by fear. Of the world's 6 billion inhabitants, 2 billion are infected with latent tuberculosis. With adequate treatment, TB is 90 percent curable, yet only a fraction of those with the disease have access to this simple technology: a course of medications costing only $10 to $20 per patient. As a result, tuberculosis is now the world's second leading infectious killer after AIDS. Resistant tuberculosis, the result of inadequate treatment, is spreading at alarming rates in poor countries and in urban centers of rich countries. According to the World Health Organization, an eight-hour plane flight with an infected person is enough to risk getting TB.
Of all the rich countries, the United States has its head buried most deeply in the sand. It is the stingiest, spending only one cent on foreign aid for every $10 of GNP. Only $1 in $20 of the aid budget goes to health. A recent WHO report estimated that spending by industrialized countries of just 0.1 percent more of their GNPs on health aid would save 8 million lives, realize up to $500 billion a year via the economic benefits of improved health and help those in poor countries escape illness and poverty.
George W. Bush's recent pledge to increase foreign aid comes too late and with too many strings attached. The proposal doesn't start until 2004, making it largely hypothetical. The current budget keeps spending flat at about $11.6 billion. Even with the promised increase, US spending on foreign aid as a proportion of GNP will still pale in comparison with that of other developed nations. And along with this carrot comes a big stick: Only countries that continue to let corporations raid their economies through detrimental free-trade policies will be eligible.
Bioterrorism is a danger that should be taken seriously, but the current counterterrorism frenzy threatens to militarize the public health system, draining resources away from the research, surveillance systems and treatments needed for existing health problems. Already the political war profiteers have criticized the CDC's funding priorities, using the terrorist threat as cover in an attempt to advance their reactionary agenda. In a letter this past November to Health and Human Services Secretary Tommy Thompson, Republican Representatives Joseph Pitts, John Shadegg and Christopher Smith criticized the CDC for "inappropriate" actions. The Congressmen wrote that "we have grown increasingly concerned about some of the activities that the CDC is funding and promoting--activities that are highly controversial in nature, and funding that could be better used for our War on Terrorism." They specifically objected to AIDS prevention programs targeted at gay men and to a CDC website link to a conference sponsored by organizations promoting reproductive health, including abortions, for women. Bush was happy to oblige by cutting $340 million from the CDC's nonbioterrorism budget.
Just as a missile shield will not protect us from crazed men with box cutters, so mass vaccination campaigns and huge stockpiles of antibiotics will not keep us healthy in an increasingly unhealthy world. September 11 should make us more aware than ever of our shared vulnerability. Making the world safer and healthier means prevention and early treatment of disease, inside and outside our borders. It means building a healthcare system designed to keep people healthy instead of spending billions on bogeymen while the real killers are on our doorstep.
Two Palestinian-Israeli wars have erupted in this region. One is the Palestinian nation's war for its freedom from occupation and for its right to independent statehood. Any decent person ought to support this cause. The second war is waged by fanatical Islam, from Iran to Gaza and from Lebanon to Ramallah, to destroy Israel and drive the Jews out of their land. Any decent person ought to abhor this cause.
Yasir Arafat and his men are running both wars simultaneously, pretending they are one. The suicide killers evidently make no distinction. Much of the worldwide bafflement about the Middle East, much of the confusion among the Israelis themselves, stems from the overlap between these two wars. Decent peace seekers, in Israel and elsewhere, are often drawn into simplistic positions. They either defend Israel's continued occupation of the West Bank and Gaza by claiming that Israel has been targeted by Muslim holy war ever since its foundation in 1948, or else they vilify Israel on the grounds that nothing but the occupation prevents a just and lasting peace. One simplistic argument allows Palestinians to kill all Israelis on the basis of their natural right to resist occupation. An equally simplistic counterargument allows Israelis to oppress all Palestinians because an all-out Islamic jihad has been launched against them.
Two wars are being fought in this region. One is a just war, and the other is both unjust and futile.
Israel must step down from the war on the Palestinian territories. It must begin to end occupation and evacuate the Jewish settlements that were deliberately thrust into the depths of Palestinian lands. Its borders must be drawn, unilaterally if need be, upon the logic of demography and the moral imperative to withdraw from governing a hostile population.
But would an end to occupation terminate the Muslim holy war against Israel? This is hard to predict. If jihad came to an end, both sides would be able to sit down and negotiate peace. If it did not, we would have to seal and fortify Israel's logical border, the demographic border, and keep fighting for our lives against fanatical Islam.
If, despite simplistic visions, the end of occupation will not result in peace, at least we will have one war to fight rather than two. Not a war for our full occupancy of the holy land, but a war for our right to live in a free and sovereign Jewish state in part of that land. A just war, a no-alternative war. A war we will win. Like any people who were ever forced to fight for their very homes and freedom and lives.
Translated by Fania Oz-Salzberger.
As state budgets around the country are slashed to accommodate the expense of the war on terror, the pursuit of educational opportunity for all seems ever more elusive. While standardized tests are supposed to be used to diagnose problems and facilitate individual or institutional improvement, too often they have been used to close or penalize precisely the schools that most need help; or, results have been used to track students into separate programs that benefit the few but not the many. The implementation of gifted classes with better student-teacher ratios and more substantial resources often triggers an unhealthy and quite bitter competition for those unnaturally narrowed windows of opportunity. How much better it would be to have more public debate about why the pickings are so slim to begin with. In any event, it is no wonder there is such intense national anxiety just now, a fantastical hunger for children who speak in complete sentences by the age of six months.
A friend compares the tracking of students to the separation of altos from sopranos in a choir. But academic ability and/or intelligence is both spikier and more malleably constructed than such an analogy allows. Tracking students by separating the high notes from the low only works if the endgame is to teach all children the "Hallelujah Chorus." A system that teaches only the sopranos because no parent wants their child to be less than a diva is a system driven by the shortsightedness of narcissism. I think we make a well-rounded society the same way we make the best music: through the harmonic combination of differently pitched, but uniformly well-trained voices.
A parsimony of spirit haunts education policy, exacerbated by fear of the extremes. Under the stress of threatened budget cuts, people worry much more about providing lifeboats for the very top and containment for the "ineducable" rock bottom than they do about properly training the great masses of children, the vibrant, perfectly able middle who are capable of much more than most school systems offer. In addition, discussions of educational equality are skewed by conflation of behavioral problems with IQ, and learning disabilities with retardation. Repeatedly one hears complaints that you can't put a gifted child in a class full of unruly, noisy misfits and expect anyone to benefit. Most often it's a plea from a parent who desperately wants his or her child removed from a large oversubscribed classroom with a single, stressed teacher in an underfunded district and sent to the sanctuary of a nurturing bubble where peace reigns because there are twelve kids in a class with two specialists and everyone's riding the high of great expectations. But all children respond better in ordered, supportive environments; and all other investments being equal, gifted children are just as prone to behavior problems--and to learning disabilities--as any other part of the population. Nor should we confuse exceptional circumstances with behavior problems. The difficulty of engaging a child who's just spent the night in a homeless shelter, for example, is not productively treated as chiefly an issue of IQ.
The narrowing of access has often resulted in peculiar kinds of hairsplitting. When I was growing up, for example, Boston's Latin School was divided into two separate schools: one for boys and one for girls. Although the curriculum was identical and the admissions exam the same, there were some disparities: The girls' school was smaller and so could admit fewer students; and the science and sports facilities were inferior to those of the boys.
There was a successful lawsuit to integrate the two schools about twenty years ago, but then an odd thing happened. Instead of using the old girls' school for the middle school and the larger boys' school for the new upper school, as was originally suggested, the city decided to sever the two. The old boys' school retained the name Boston Latin, and the old girls' school--smaller, less-equipped--was reborn as Boston Latin Academy. The entrance exam is now administered so that those who score highest go to Boston Latin; the next cut down go to what is now, unnecessarily, known as the "less elite" Latin Academy.
One of the more direct consequences of this is that the new Boston Latin inherited an alumni endowment of $15 million, much of it used to provide college scholarships. Latin Academy, on the other hand, inherited the revenue of the old Girls' Latin alumni association--something under $200,000. It seems odd: Students at both schools are tremendously talented, the cutoff between them based on fairly insignificant scoring differences. But rather than pool the resources of the combined facilities--thus maximizing educational opportunity, in particular funding for college--the resolution of the pre-existing gender inequality almost purposefully reinscribed that inequality as one driven by wealth and class.
There are good models of what is possible. The International Baccalaureate curriculum, which is considered "advanced" by most American standards, is administered to a far wider range of students in Europe than here, with the result that their norm is considerably higher than ours in a number of areas. The University of Chicago's School Mathematics Project, originally developed for gifted students at the Chicago Lab School, is now recommended for all children--all children, as the foreword to its textbooks says, can "learn more and do more than was thought to be possible ten or twenty years ago." And educator Marva Collins's widely praised curriculum for inner-city elementary schools includes reading Shakespeare.
Imparting higher levels of content requires nothing exceptional but rather normal, more-or-less stable children, taught in small classes by well-trained, well-mentored teachers who have a sophisticated grasp of mathematics and literature themselves. It will pay us, I think, to stop configuring education as a battle of the geniuses against the uncivilized. We are a wealthy nation chock-full of those normal, more-or-less stable children. The military should not be the only institution that teaches them to be all that they can be.
When a girl becomes her school's designated slut, her friends stop talking to her. Pornographic rumors spread with dazzling efficiency, boys harass her openly in the hallways, girls beat her up. "WHORE," or sometimes "HORE," is written on her locker or bookbag. And there is usually a story about her having sex with the whole football team, a rumor whose plausibility no one ever seems to question.
Even those of us who weren't high school sluts and don't recall any such outcast from our own school days have become familiar with her plight--through media stories and the growing body of feminist-inspired literature on female adolescence, as well as the talk shows and teen magazine spreads that have made her their focus. What's harder to understand is how the label persists when the landscape of sexual morality that gives it meaning has so drastically changed--well within living memory. If the sexual revolution didn't obliterate the slut, wouldn't the successive waves of libidinous pop stars, explicit TV shows and countercultural movements to reclaim the label have drained it of its meaning? What kinds of lines can today's adolescents, or those of the 1990s or 1980s, for that matter, possibly draw between nice and not nice girls?
Emily White's Fast Girls sets out to look at the central dilemmas of the slut label. Two earlier books that have focused on the slut--Leora Tanenbaum's Slut! Growing Up Female With a Bad Reputation, a collection of oral histories, and Naomi Wolf's Promiscuities, a reflection on girls' sexual coming-of-age in the 1970s that combines memoir with a casual survey of the women Wolf grew up with--rely primarily on the subjective narratives of women and girls to explore the slut phenomenon. Paula Kamen's Her Way: Young Women Remake the Sexual Revolution surveys the sexual mores and activities of young women, but not specifically of teenagers. White is the first to combine different methodologies in an attempt to write specifically about the functions and significance of the teenage slut--in her words, "to shed some light on that space in the high school hallway where so many vital and troubling encounters occur."
White spoke to or corresponded with more than 150 women who had been the sluts of their school (whom she found largely by soliciting their stories through newspaper ads), and she spent "a couple of weeks" observing in a Seattle-area public high school. She also offers cultural criticism--of horror movies and the riot grrrls, for instance--as well as a digest of psychological, sociological and post-structuralist theory pertinent to the subject. White's evident ambition makes it all the more frustrating that the book's impressive breadth doesn't translate into thoroughness or rigor.
When White interviewed the women--most of them white, middle-class and from the suburbs--who responded to her ads, the stories she heard had certain similarities. There was a "type" of girl who tended to be singled out: She developed breasts earlier than other girls; she was a loud, vocal extrovert; she was self-destructive, tough or wild; often she had been sexually abused; and in one way or another she was usually an outsider, whether she had moved from a different town, had less money than most kids or belonged to some peripheral subculture. Some women described themselves as having been promiscuous, but more said they were not as sexually active as their (untainted) friends, and none of them had done the things that were later rumored. Often the first rumors were started by bitter ex-boyfriends or jealous friends. Once they caught on, the ritual torments and "football team" fantasies inevitably followed.
These similarities make up what White calls the "slut archetype," and for much of the book she riffs on the common factors of the stories, with chapters dedicated to subjects like the role of suburbia, the slut's social isolation and the preponderance of sexual abuse. Though sprinkled liberally throughout the book, the women's testimonies are only a launching point for White's meditations. She writes about these interviews in a way that at times both romanticizes and condescends to the women. "She walks so confidently in her boots," writes White of one 18-year-old, "causing tremors in the ground beneath her feet. She presents herself as a girl who has crawled up out of the underworld, who has found her way through the isolation and the drugged dreams.... It is a way of coping, this tough act. It's a start." Still, despite certain problems of credibility, this overwrought style is pretty effective at conveying the anguish of the ostracized adolescent girl (if only by echoing her earnest self-dramatization). It's much less suited to considering the girl in social and cultural context.
In editing and interpreting her interviews, White emphasizes their similarities at the expense of the kind of detail that makes a particular social universe come to life. Her time observing the Seattle-area high school students inspires mostly familiar observations. ("The cafeteria is high school's proving ground. It's one of the most unavoidable and important thresholds, the place where you find out if you have friends or if you don't.") Only about half the time do we get any real sense of the sort of community an interviewee grew up in or what the social scene was like at her school. There's even less detail about precisely how she fit into the hierarchy before the slut label took hold, whether she was perceived as threatening or flirtatious, what her past relationships were like with girls, boys and teachers. Even worse is that for all their lack of texture, the women's stories are by far the most interesting part of the book; when White pulls away to supply her own commentary, it's usually vague and predictable--precisely because she's not attuned to the details that would reveal how the slut really functions in the teenage universe. Although she acknowledges that the slut myth is much bigger than any individual girl behind it, she is also attached to the literal-minded notion that the girl being labeled has some kind of privileged relationship to the slut myth--that her individual story is the slut story, and the women's emotional recollections of abuses and scars collectively explain the slut myth. In fact, to understand the myth we need to know at least as much about what the rest of the school is thinking.
White suggests that "the slut becomes a way for the adolescent mind to draw a map. She's the place on the map marked by a danger sign...where a girl should never wander, for fear of becoming an outcast." But, given the arbitrary relationship White found between the slut label and a girl's actual sex life, does the slut myth really have any practical applications for girls? Do they limit sexual activity out of fear of these rumors? Are there particular sex acts that can bring censure in themselves? Can social status insulate some girls from slutdom, regardless of how much they fool around? White doesn't directly pose these questions, but one of her findings hints that, though they may fear the label, kids themselves interpret slutdom as primarily an expression of social status rather than a direct consequence of sexual activity: "Girls who at one time might have been friends with the slut recede as her reputation grows; they need to be careful how they associate with her or they will be thought of as sluts along with her."
The slut doesn't seem to point to an actual line that a nice girl can't cross; she commemorates the fact that there once was such a line, and suggests that the idea of a line still has currency, even if no one can figure out where it is anymore. It's no surprise that she is such a popular subject for third-wave feminists; her ostracism seems to have implications not only for residual sexism but for the way that we personally experience sex and desire.
I didn't think I had a personal connection to the slut story. For most of my adolescent years, which were in the late 1980s and early '90s, I was very good, and too awkward to attract attention from boys. In the schools I attended there were whispers about who did what, and some girls were considered sluttier than others, but there was no single figure who captured the imagination of the whole class.
Then I remembered something about one of the girls I was closest to from age 10 to about 13 or 14. We didn't go to the same school, but for much of the time we both attended Saturday Russian classes held in her kitchen by an elderly neighbor. She was the only one of my friends who was, like me, born in Russia, though her family still lived in Philadelphia's immigrant neighborhood while mine had moved to a more prosperous, non-Russian suburb several years earlier. My family had a bigger house. We had, thanks to my American stepdad, more American ways of doing things. I was a better student. I think she was more popular at her school than I was at mine; at least, she was more easygoing and sociable. I never felt in awe of her, as I did of other friends. I was not always nice to her, though usually I was.
She knew more about sex in our early years than I did, but, like me, she didn't go out with anyone in the time we knew each other. She was pretty, in a round-faced, unfashionable way that made me think I had a discerning eye for appreciating it. She always seemed more developed than I was. (That may not have been true in any measurable sense.) At some point in those years, though it didn't particularly affect our friendship, and I don't remember thinking about it while I was actually with her, I began to spend nights casting her as my proxy in every kind of pornographic fantasy I could conjure.
It's always difficult to figure out the relationship between cultural fantasies and mores, on the one hand, and actual behavior and sexual self-image on the other. You could probably spend a long time listening to teenagers and still not get to the bottom of how the slut myth filters into their own lives. Still, the site of the slut's continuous re-creation, the high school hallways, deserves closer scrutiny, and the mysteries of her endurance await further exploration.