April 22, 2002 | The Nation

Cover:

Robert Fisk reports from Lebanon, Ken Silverstein investigates Bush Administration ties to a corrupt dictatorship in Equatorial Guinea and Patricia J. Williams discusses tests, tracking and derailment in education policy.

Letters



That Old Orientalism...

New York City

The only true characterization of Kanan Makiya in Turi Munthe's review of The Rock: A Tale of Seventh-Century Jerusalem ["Muslim Jerusalem: A Story," April 1] is Munthe's claim that Makiya is the "Arab world's most ardent and vocal supporter of America's projected intervention in Iraq." Munthe could have added that Makiya was also the most ardent Arab voice calling on the Americans to continue to bomb Iraq all the way to Baghdad after they had decided to stop their bombing campaigns and end the Gulf War. As for Makiya being, in Munthe's words, "the hammer of liberal Arab intelligentsia, [and] the arch anti-Orientalist," there is no basis in truth to these astonishing claims. Not only is Makiya not known anywhere in the Arab world, whether in liberal or illiberal intellectual, academic or journalistic circles (except perhaps among the Iraqi expatriate community in London and among some of the Arab journalists working in that city); indeed, he is not known among the Arab reading public at large, as only selections of his book Republic of Fear were translated into Arabic and published in 1990 by an obscure Egyptian publisher during the heyday of the gulf crisis. Makiya, however, seems to be better known in the West, celebrated by the mainstream American media as a native informant who reproduces views about the Arab world that accord with official US policy. Now, it seems, The Nation is also celebrating him through Munthe's ill-informed review. The main irony, however, is in portraying Makiya as a major Arab intellectual, which he neither was nor is, whether in the Arab world or in the West.

Contrary to the claim that he is an "arch anti-Orientalist," Makiya's own words reveal him to be a deeply Orientalist thinker. Not only did he never take on Orientalists and their anti-Muslim and anti-Arab racist claims, in his own writings he reproduces Orientalist opinion: In his Republic of Fear, a slapdash account of Iraqi politics in which an amateurish pseudopsychological analysis substitutes for social, economic or political analysis and in which facts are selectively chosen to suit the argument (see my review in Against the Current, March/April 1991), he writes, for example, that "conspiratorial thinking has broad roots in the extreme fatalism and hostility to individualism that may be characteristic of Islamic culture generally." Makiya's views in his other book, Cruelty and Silence, are reminiscent of the Israeli Orientalist Raphael Patai's racist book The Arab Mind. Makiya tells us in Cruelty and Silence that unlike in English culture, which considers it rude, "belching" is allegedly an "Arab custom of hospitality" to be practiced at the dinner table in appreciation of one's hosts. In fact, this "arch anti-Orientalist" goes to some pains to excuse anti-Arab racism in the West by affirming that "racism exists everywhere else in the world" (see As'ad AbuKhalil's excellent review of this book in the Middle East Journal, Autumn 1993).

It is relevant to note that Makiya and his father, Mohamed, ran an architectural firm in London in the 1980s making much of their money doing business with, yes, Saddam Hussein, who had commissioned the firm to rebuild Baghdad, among other projects. The irony of Republic of Fear itself, as a sympathetic profile of Makiya and his father uncovers in The New Yorker ("Architects Amid the Ruins," by Lawrence Weschler, January 6, 1992), is that the book was funded by money paid by Saddam to the firm owned by Makiya's father, which paid the salary of Kanan himself. It should also be added that in the early 1990s, Makiya was affiliated (and may still be) with the discredited and corrupt Iraqi National Congress, whose funding comes from the CIA.

Adding to Makiya's contrived mystique is his choice of pseudonyms for himself throughout his career. While most Iraqi and other Arab dissidents write under their own names, Makiya wrote in his youth under the name Muhammad Ja'far and later under the name Samir al-Khalil. He claimed that the latter name was used to protect himself against assassination by Saddam Hussein's henchmen. It is ironic, however, that he revealed his name only after his father's firm no longer had ongoing business with Saddam's Iraq (see the New Yorker profile).

It is disheartening that The Nation would publish such an ignorant review of a minor pamphleteer whose recent book The Rock is hardly more than a disguised attempt to rob the Palestinians of their own Islamic national heritage and strengthen Israeli Zionist claims not only to archeological finds that Israel (mis)identifies as "Jewish," but to Muslim monuments that have not hitherto been claimed by the most ardent of Zionist fanatics as "Jewish" at all--namely, the Dome of the Rock. The undeclared aim of the book is to convince readers that the Dome of the Rock was built by a Jewish convert. It is only a matter of time before Zionist fanatics will follow Makiya's lead and lay claim to the dome as they have laid claim to everything else in the land of the Palestinians.

JOSEPH MASSAD


MUNTHE REPLIES

London

Joseph Massad makes five points. He tells us (1) that Makiya is not an intellectual (and certainly not an intellectual with any clout in the Arab world); (2) that Makiya is an Orientalist; (3) that Makiya is a fraud who made money from Saddam Hussein; (4) that Makiya is a Zionist agent trying to "rob the Palestinians of their own Islamic national heritage" by proving in his book The Rock "that the Dome of the Rock was built by a Jewish convert"; and (5) (less important) that I am ill informed.

(1) Very briefly: Who knows what Massad considers "intellectual," but a professorship at Brandeis, the directorship of the Iraq Research and Documentation Project at Harvard and (most important) a commitment to unfashionable causes would satisfy me. As for his recognition in the Arab world, al-Hayat (one of the most respected Arabic-language broadsheets, with a daily readership approaching half a million) has already been in touch to ask about translation rights to my Nation review.

(2) That Makiya is an Orientalist is a very old argument, one that first greeted the publication of Republic of Fear in the late 1980s and that Makiya himself tackled head-on in Cruelty and Silence. My review tried to show how, if listened to, Makiya's voice might sharpen Western Arab intellectuals' critique of the problems facing the Arab world. He suggested that blaming the West for its ills was high condescension and a flagrant form of Orientalism in itself, since it assumed no solution could come from within the Arab world. For the last twenty-odd years, taking on "Orientalists and their anti-Muslim and anti-Arab racist claims" has been the staple crusade of the intellectual brigade that Massad follows. Their work has been vital in reshaping the West's conception of its most immediate cultural neighbor, and as an intellectual posture, it has gained widespread and well-deserved currency. In recent years, however, its very currency has made bullies of some of its practitioners. Much as the Jerusalem Post dismisses any criticism of Israeli policy with the cover-all charge of anti-Semitism, so the charge of "Orientalism" works for criticism of the Arab world. In this particular case, it is almost criminally irresponsible. Makiya's Republic of Fear, hailed by many of the leading analysts of the region (from Fred Halliday to Peter Sluglett) as uniquely brilliant, details some of the most horrendous atrocities being committed by any regime in the world. The book was attacked as violently as it was because in justifying America's campaign against Iraq, it weakened criticism of US Middle East policy. That criticism, conceived around the notion of Orientalism, attacked US policy as universally mistaken. For those like Massad principally concerned with Palestine, Makiya seemed to dilute the force of their case. He doesn't. It is deeply disheartening, to say the least, that Massad should try to discredit factual evidence of the suffering of millions of fellow Arabs because it doesn't serve his interests. Dismissing the evidence of that suffering because of its allegedly "Orientalist" tone can only be seen as gross intellectual dishonesty.

(3) A piece of writing should be judged on its own merits: T.S. Eliot was an anti-Semite and the father of modern poetry. But facts: Kanan Makiya never worked for Saddam Hussein, nor was Republic of Fear funded by Saddam. For the record, it was in 1980 that Makiya and his father argued over whether their company should work for Iraq. Kanan refused, left the company and began working on Republic of Fear before his father began work there. As for the "contrived mystique" of a pseudonym, Makiya went public at Harvard (at the explicit request of Roy Mottahedeh and Edward Mortimer) at the time of the Iraqi uprising against Saddam Hussein following the Gulf War, not when his father stopped working in Baghdad. Mohamed Makiya's business with Iraq ceased in 1986; Iraq invaded Kuwait on August 2, 1990.

(4) My review of The Rock tried to show how Makiya's delicate treatment of the historical places of Abrahamic tradition suggests that any absolute territorial claim to the spiritual geography of Jerusalem is idiotic, not simply because of the inextricable links among the three religions but because it confuses symbol with truth. Massad seems to be making the same mistake as his Zionist arch-enemies.

(5) Massad would have opinions different from his own stifled: He criticizes The Nation for giving Makiya the credibility of a review as well as for choosing such an ill-informed reviewer. Massad, an assistant professor of modern Arab politics at Columbia (with, I assume, teaching functions), should not be ill-informed. Nor should he be advancing two-bit conspiracy theories in the national press--he cannot seriously believe that Kanan Makiya is trying to pave the ground for a Zionist take-over of Jerusalem. Makiya's ideas hark back to that old ideal of Arab political thinking that saw the Arab world as a community with a shared sense of purpose and responsibility. It is the shallow (and in this case deceitful) self-interest of men like Joseph Massad that has consistently turned a possibility into a utopia.

TURI MUNTHE

Editorials

The Bush Administration has vigorously and effectively responded to the terrorist attack of September 11. The country seems united behind that effort. Certainly there was no hint of a doubt in the repeated standing ovations Congress gave the President's State of the Union address, including his bold declaration that the war on terrorism has just begun. The President singled out Iran, Iraq and North Korea as the most likely next targets of America's aroused ire against terrorists and governments that attempt to acquire weapons of mass destruction that we, the Russians, the British, the French, the Chinese, the Indians, the Pakistanis and the Israelis already possess.

No longer in government, I do not have the benefit of national security briefings or Congressional committee deliberations. So perhaps instead of making assertions, it may be more appropriate for me to ask some questions that have been on my mind both before and since September 11.

Which course might produce better results in advancing American security? Is it by continuing to boycott, diplomatically and commercially, such countries as Iran, Iraq, North Korea, Libya and Cuba and threatening to bomb them? Or would we be better off opening up diplomatic, trade and travel relations with these countries, including a well-staffed embassy in each? If we are fearful of a country and doubtful of its intentions, wouldn't we be safer having an embassy with professional foreign service officers located in that country to tell us what is going on?

Our leaders frequently speak of "rogue nations." But what is a rogue nation? Isn't it simply one we have chosen to boycott because it doesn't always behave the way we think it should? Do such nations behave better when they are isolated and boycotted against any normal discourse? What do we have to lose in talking to "rogue nations" diplomatically, trading with them commercially and observing their economic, political and military conditions?

Instead of adding $48 billion to the Pentagon budget, as the President has proposed, wouldn't we make the world a more stable, secure place if we invested half of that sum in reducing poverty, ignorance, hunger and disease in the world? We are now twentieth among nations in the percentage of gross national product devoted to improving life in the poor nations. If we invested half of the proposed new military spending in lifting the quality of life for the world's poor we would be the first among nations in helping others.

Is it possible that such an achievement would reduce some of the gathering anger that the poor and miserable of the earth may be inclined to direct at the rich and indifferent? Why does a wealthy zealot like Osama bin Laden gain such a huge following among the poor and powerless of the world? Acting on the old adage "charity begins at home," why not invest the other half of the proposed new money for the Pentagon in raising the educational, nutritional, housing and health standards of our own people?

Our military services are the best in the world. But with a military budget at record levels, do we need to allocate another $48 billion--an amount greater than the total military budget of any other nation? Is not the surest foundation for our military forces a healthy, educated, usefully employed citizenry? And is not the best way to diminish some of the international trouble spots, which might embroil our young men and women, by reducing the festering poverty, misery and hopelessness of a suffering world?

Of course we need to take reasonable precautions in our airports and other strategic points to guard against terrorists or nut cases. As a World War II bomber pilot, I appreciate the role of both tactical and strategic bombing in all-out warfare. But is sending our bombers worldwide in the hope that they might hit terrorist hideouts or such hostile governments as Iraq an effective way to end terrorism? May it not more likely erode our current international coalition, while fanning the flames of terrorism and hatred against us as the world's only superpower, hellbent on eradicating evil around the world?

The Administration now has seventy-five officials hidden in bunkers outside Washington poised to take over the government in the event of a terrorist attack. Is it possible that paranoia has become policy? No such extreme measures were undertaken in World War II, nor in the half-century of cold war between the two nuclear giants, Russia and the United States.

All of us who love this land want our President to succeed. Nothing would give me greater happiness than to see him become a great President. But is it possible that our well-intentioned President and his Vice President have gone off the track of common sense in their seeming obsession with terrorism? Is there still validity to the proverb "whom the Gods would destroy, they first make mad"?

For half a century, our priorities were dominated by the fear of Russian Communism--until it collapsed of its own internal weakness. As I listen to the grim rhetoric of Messrs. Bush and Cheney, I wonder if they are leading us into another half-century of cold war, with terrorism replacing Communism as the second great hobgoblin of our age.

Odds are good that on a plane or boat or bus somewhere in the world sits a refugee headed for the United States carrying the seeds of a weapon of mass destruction. The agent he unwittingly carries is insidious and lethal but slow acting, so the deaths it causes can come months or even years after it is disseminated in the population. It has the potential to overwhelm, to kill thousands, and there may be no vaccine, antidote or cure.

What is this ominous threat? An ingenious new biological weapon? No, it is a very old nemesis of humankind--tuberculosis, an infectious disease that kills more than 2 million people every year. Because the majority of them are poor and outside our borders, we don't hear much about them, but that may soon change. Tuberculosis is making a comeback and is conquering the treatments that have kept this killer at bay in the developed world. Multidrug-resistant tuberculosis--a death sentence in most developing countries--is becoming more common and is incurable in about half the cases, even in the United States.

Two numbers in the President's budget proposal stand in stark contrast to each other: $6 billion to fight bioterrorism versus $200 million to the Global Fund to Fight AIDS, Tuberculosis and Malaria. That works out to a little more than $1 billion per US anthrax death last year, as compared with $33 per global victim of the more common infectious scourges in 2001.

Re-emerging infectious diseases like malaria, HIV and tuberculosis continue their inexorable march, devastating poor countries in Africa, Asia and South America and ultimately threatening the richest countries. Either we are all protected or we are all at risk. It would be better to recognize that the developed world's inaction and callousness have allowed epidemics to flourish in the fertile soil of poverty, malnutrition and poor living conditions made worse by wars, internal displacements, repressive regimes, refugee crises, economic sanctions and huge debt payments that require poor countries to cut public services.

If we can possibly be unmoved by the staggering numbers affected by the AIDS pandemic alone (3 million dead in 2001, more than 40 million people living with HIV, 28 million of them in Africa), then perhaps we will be moved by fear. Of the world's 6 billion inhabitants, 2 billion are infected with latent tuberculosis. With adequate treatment, TB is 90 percent curable, yet only a fraction of those with the disease have access to this simple technology: a course of medications costing only $10 to $20 per patient. As a result, tuberculosis is now the world's second leading infectious killer after AIDS. Resistant tuberculosis, the result of inadequate treatment, is spreading at alarming rates in poor countries and in urban centers of rich countries. According to the World Health Organization, an eight-hour plane flight with an infected person is enough to risk getting TB.

Of all the rich countries, the United States has its head buried most deeply in the sand. It is the stingiest, spending only one cent on foreign aid for every $10 of GNP. Only $1 in $20 of the aid budget goes to health. A recent WHO report estimated that spending by industrialized countries of just 0.1 percent more of their GNPs on health aid would save 8 million lives, realize up to $500 billion a year via the economic benefits of improved health and help those in poor countries escape illness and poverty.

George W. Bush's recent pledge to increase foreign aid comes too late and with too many strings attached. The proposal doesn't start until 2004, making it largely hypothetical. The current budget keeps spending flat at about $11.6 billion. Even with the promised increase, US spending on foreign aid as a proportion of GNP will still pale in comparison with that of other developed nations. And along with this carrot comes a big stick: Only countries that continue to let corporations raid their economies through detrimental free-trade policies will be eligible.

Bioterrorism is a danger that should be taken seriously, but the current counterterrorism frenzy threatens to militarize the public health system, draining resources away from the research, surveillance systems and treatments needed for existing health problems. Already the political war profiteers have criticized the CDC's funding priorities, using the terrorist threat as cover in an attempt to advance their reactionary agenda. In a letter this past November to Health and Human Services Secretary Tommy Thompson, Republican Representatives Joseph Pitts, John Shadegg and Christopher Smith criticized the CDC for "inappropriate" actions. The Congressmen wrote that "we have grown increasingly concerned about some of the activities that the CDC is funding and promoting--activities that are highly controversial in nature, and funding that could be better used for our War on Terrorism." They specifically objected to AIDS prevention programs targeted at gay men and to a CDC website link to a conference sponsored by organizations promoting reproductive health, including abortions, for women. Bush was happy to oblige by cutting $340 million from the CDC's nonbioterrorism budget.

Just as a missile shield will not protect us from crazed men with box cutters, so mass vaccination campaigns and huge stockpiles of antibiotics will not keep us healthy in an increasingly unhealthy world. September 11 should make us more aware than ever of our shared vulnerability. Making the world safer and healthier means prevention and early treatment of disease, inside and outside our borders. It means building a healthcare system designed to keep people healthy instead of spending billions on bogymen while the real killers are on our doorstep.

Recent days have brought the first tentative but welcome roadblocks to the Bush Administration's war-fevered assault on civil liberties. In Newark, Superior Court Judge Arthur D'Italia, calling secret arrests "odious to a democracy," ordered the Immigration and Naturalization Service to release the names of post-September 11 detainees to the New Jersey ACLU. Although the release of the names has been delayed pending a federal government appeal, Judge D'Italia's courageous ruling is a significant victory for constitutional principle.

Meanwhile, a Freedom of Information Act lawsuit requesting the names of detainees and related information continues in federal court in Washington, DC. The Center for National Security Studies, which is representing numerous civil liberties and media outfits (including The Nation), recently filed a brief supporting its own request for this material. The Administration is expected to file a response in mid-April, and oral arguments could take place weeks after that.

Some of the most important restraints are being applied from abroad. In the case of accused hijacking conspirator Zacarias Moussaoui, Attorney General John Ashcroft confronted a clear choice between his domestic political goal of advancing the death penalty and the international goal of a palatable legal campaign against Al Qaeda. Ashcroft tried to have it both ways, committing himself to capital charges against French citizen Moussaoui while seeking further cooperation in the case from the anti-capital-punishment French. The result: French officials have publicly promised to withhold any evidence that might lead to a death sentence. Ashcroft's prosecutors are on the hook, stuck with a sketchy conspiracy case requiring proof that the silent Moussaoui, grounded in jail on September 11, was a sufficiently active and knowledgeable architect of the Al Qaeda Trade Center and Pentagon attacks to merit a capital conviction.

The sense that European human rights commitments are having an impact despite the Administration's guff was even clearer when Defense Secretary Donald Rumsfeld effected a brief reverse-rudder from George W. Bush's military tribunal order. This past fall the Administration fumed when UN Human Rights High Commissioner Mary Robinson denounced the tribunal order, but Rumsfeld's new plan to require public trials, unanimous votes for a death sentence and additional layers of review was clearly designed to quiet international criticism. Rumsfeld's amendment to the Bush order does not get to the heart of the constitutional problem, however. The tribunals still represent an unlawful seizure of judicial authority by the President. And Rumsfeld's March 28 admission that prisoners in Guantánamo Bay could stay locked up even if acquitted of charges reveals what is in effect a permanent policy of internment without trial, a policy made possible only by the Al Qaeda prisoners' continued status in a no man's land between the Geneva Conventions and US law. This policy has European allies nearly as alarmed as they are about the tribunals.

There are two important lessons here. One is that in the months since September 11, players in the federal system of checks and balances--Congress and federal judges--have largely failed to resist the Bush Administration's constitutional power grab. The second lesson, however, is that resistance is more than possible. With state courts sometimes better guardians of civil liberties than are the Rehnquist-era federal courts, and with Europe deeply invested in its Continental human rights covenants, it may turn out that local and transnational coalitions will become an effective means of preserving civil liberties. State courts at home and allies abroad are turning out to be the most compelling protectors of the essential American values the Bush Administration has been ready to sell down the river.

Even as the Middle East plunged deeper into the maelstrom of fear, hatred, violence and despair, recent diplomatic developments, ironically, made the conditions for achieving peace tantalizingly real. Most notable, there was the declaration by the Arab nations meeting in Beirut of their willingness to recognize Israel, after fifty years of denial of its right to exist.

Predictably brushing aside the Arab vision, Prime Minister Ariel Sharon revved up his tanks for a military solution, declaring a "state of war" with the Palestinians and invading the West Bank. Their nation, he told Israelis, is "at a crossroads." So it is. The road down which Sharon is taking them, with a green light from the Bush Administration, leads to more deaths, more brutalizing of civilians, more violations of human rights--and answering violence and anger by the Palestinians, with more suicide bombers making barbarous war on Israeli civilians.

The other road, the road to peace, leads toward the goal, articulated anew in Beirut, of a complete Israeli withdrawal from the West Bank and Gaza, making way for a Palestinian state with its capital in East Jerusalem, in return for normalization of relations with the Arab states.

At Beirut, in another step for regional peace, the Arab nations brokered Iraq's recognition of Kuwait's sovereignty, called for an end to UN sanctions, for dialogue between Baghdad and the UN and for the elimination of weapons of mass destruction in the region. Most significant, they expressed united opposition to US military action against Iraq. This move undercut the Bush Administration's political rationale for attacking Iraq and in effect denied it the regional bases and logistical support essential to success in such a war.

Faced with those impediments, the Administration put its Iraq plans on hold and turned to the crisis in the Middle East. But Bush seems more concerned with maintaining the domestic political momentum of his war on terror (just as Sharon is driven by the harder-line challenge on his right from Benjamin Netanyahu) than he is with making the tough political choices that would lead to a just settlement.

Bush still could choose the road to peace, whether or not Sharon takes that same road. He could reaffirm the US commitment to the March 12 UN resolution calling for a two-state solution, adopt the Arab peace scheme as a vision that could give Palestinians hope of an independent state, re-endorse the recent Security Council resolution calling for Sharon's withdrawal from Palestinian cities and dispatch the Secretary of State to the region with a plan that guarantees Israel's security, calls for abandonment of Israeli settlements and the withdrawal of Israel to 1967 borders--a plan backed up by firm promises of monitors and the material and financial resources necessary to make it work.

Does Bush have the courage to take that road--to risk the prestige of his Administration and the resources of the United States in the cause of achieving a just peace? The State Department mildly criticized Sharon's incursion, but Bush seems to be giving Sharon free rein. The only hope is for the thus-far-small opposition in this country to build pressure on him and make clear that Sharon's way condemns both Israelis and Palestinians to more suffering and bloodshed. We in America must add our voices to the burgeoning protests in the Middle East, Europe and Asia and to the eloquent warnings voiced by UN Secretary General Kofi Annan, Pope John Paul II and other world leaders. Only a US-led third-party intervention can forge a settlement that will end the violence.

Two Palestinian-Israeli wars have erupted in this region. One is the Palestinian nation's war for its freedom from occupation and for its right to independent statehood. Any decent person ought to support this cause. The second war is waged by fanatical Islam, from Iran to Gaza and from Lebanon to Ramallah, to destroy Israel and drive the Jews out of their land. Any decent person ought to abhor this cause.

Yasir Arafat and his men are running both wars simultaneously, pretending they are one. The suicide killers evidently make no distinction. Much of the worldwide bafflement about the Middle East, much of the confusion among the Israelis themselves, stems from the overlap between these two wars. Decent peace seekers, in Israel and elsewhere, are often drawn into simplistic positions. They either defend Israel's continued occupation of the West Bank and Gaza by claiming that Israel has been targeted by Muslim holy war ever since its foundation in 1948, or else they vilify Israel on the grounds that nothing but the occupation prevents a just and lasting peace. One simplistic argument allows Palestinians to kill all Israelis on the basis of their natural right to resist occupation. An equally simplistic counterargument allows Israelis to oppress all Palestinians because an all-out Islamic jihad has been launched against them.

Two wars are being fought in this region. One is a just war, and the other is both unjust and futile.

Israel must step down from the war on the Palestinian territories. It must begin to end occupation and evacuate the Jewish settlements that were deliberately thrust into the depths of Palestinian lands. Its borders must be drawn, unilaterally if need be, upon the logic of demography and the moral imperative to withdraw from governing a hostile population.

But would an end to occupation terminate the Muslim holy war against Israel? This is hard to predict. If jihad comes to an end, both sides would be able to sit down and negotiate peace. If it does not, we would have to seal and fortify Israel's logical border, the demographic border, and keep fighting for our lives against fanatical Islam.

If, despite simplistic visions, the end of occupation will not result in peace, at least we will have one war to fight rather than two. Not a war for our full occupancy of the holy land, but a war for our right to live in a free and sovereign Jewish state in part of that land. A just war, a no-alternative war. A war we will win. Like any people who were ever forced to fight for their very homes and freedom and lives.

Translated by Fania Oz-Salzberger.

First, the Arab League summit here in Beirut was chaos. Then it was the nearest to Arab unity that the Middle East has seen since the collapse of the Ottoman Empire. The chaos, of course, was predictable.

Six weeks ago, The Nation called for Army Secretary Tom White's resignation. White, former vice chairman of an Enron Ponzi scheme called Enron Energy Services (EES), was self-evidently not fit to bring sound business practices to the Pentagon. Since then, new revelations have created a bill of particulars against White serious enough to warrant probes by a federal grand jury and the Defense Department's Inspector General. White has stated that "if I ever get to the point...where the Enron business represents a major and material distraction...I wouldn't stay." That point has come. If White does not resign, he must be fired. The recent revelations show that White continues to practice the same squirrelly ethics that made Enron infamous. Since becoming Army Secretary, he has:

§ infuriated Republican Senator John Warner and Democrat Carl Levin of the Armed Services Committee by masking the full range of his Enron holdings;

§ violated his pledge to divest himself of those holdings, in accordance with ethics guidelines. After requesting an extension to sell his 405,710 shares, he finished dumping them in October, after a flurry of calls to executives at Enron and just before the SEC's public announcement of a formal investigation of the company, which caused the stock to tank. This has made White a target of a grand jury probe on insider trading. White says he was just commiserating with his former friends about Enron's troubles;

§ concealed those supposedly innocent contacts with Enron executives, failing to include them in response to a request by Representative Henry Waxman. White claims that he forgot to include the calls from his home phone;

§ misused a military plane to fly his wife and himself to Aspen, Colorado, where he completed the sale of his $6.5 million vacation house. This earned him an Inspector General's review of his past travel. Military transport is available only for official duty. White claims he had official business in Dallas and Seattle and that Aspen was directly between the two. He also states that he was required to fly a military plane as part of the Bush Administration's secretive continuity-in-government plan, which apparently requires top officials to fly military aircraft to resorts where they maintain mansions.

The more we learn of White's past at Enron, the worse it gets. EES cooked the books to register immediate earnings and profits, when in fact it was suffering hundreds of millions in losses--most of which were then secreted in Enron's notorious accounting scams. White has claimed that he knew nothing about improprieties at EES. But former EES employees interviewed by Dow Jones Newswires affirm that White was part of the scam. He signed off on the EES contracts that produced immediate paper profits and long-term real losses. He urged the sales force to make the company look like it was making money. He even participated in the notorious Potemkin Village trading floor, a fake trading room that EES threw together to impress visiting stock analysts. And then White walked off with millions, while investors were fleeced and the workers discarded. For conservative military analyst Eliot Cohen this alone is grounds for White's resignation, because he cannot profess the core military ethic of "mission" and "men" before self since "he was an integral part of an organization that violated those principles."

These days George W. Bush scarcely remembers his leading political patron, Enron CEO Ken "Kenny Boy" Lay. The President now poses as a champion of corporate accountability, calling for executives to be held personally responsible for their companies' financial statements. Yet he hasn't held his own Army Secretary personally responsible for his fraudulent actions at Enron and his misdeeds as Army Secretary. If White doesn't have the grace to go, he should be dismissed. The Army and the country would be better served if he defended himself from scandals past and present on his own time and with his own dime.

Columns

Stop the Presses

How cool is Jennifer Harbury? She is currently arguing her own case before the Supreme Court, demanding the right to sue the government because, she maintains, its leaders deliberately misled her about the murder of her husband, a Guatemalan rebel leader named Efrain Bamaca Velasquez who was killed in army custody during the counterinsurgency war in Guatemala in the early 1990s.

Harbury has a case. The State Department has confirmed that Col. Julio Roberto Alpirez, who was present during Bamaca's interrogation/murder, was a paid CIA asset. A CIA report alleges that Alpirez did the dirty deed himself. When then-State Department official Richard Nuccio informed Senator Robert Torricelli of that, Nuccio immediately found himself the target of a Justice Department investigation. A federal prosecutor accused him of betraying America by conspiring with Torricelli to blow Alpirez's cover, of destroying CIA officers' careers and of being an agent of the guerrillas. Although the United States offered no official charges or accusations, in a highly unusual move the CIA demanded that the State Department strip Nuccio of his security clearance, thereby depriving him of his livelihood. Harbury endured a thirty-two-day hunger strike to force those officials to come clean. She is now arguing that she could have saved her husband's life through the US court system had she known the truth during the period between his capture in March 1992 and his murder in 1993 or 1994.

A report by the President's Intelligence Oversight Board rejected the charge of deliberate lying by US officials but admitted that if the government had bothered to investigate "when Jennifer Harbury first raised the issue of her husband's fate" in the spring of 1992, the State Department "might have been able at a much earlier date to provide her with useful information." The key word here appears to be "useful."

Warren Christopher, Anthony Lake and the other Clinton Administration officials named by Harbury are probably right when they argue that leveling with her at the time would have made little difference in saving her husband's life. US courts do not have jurisdiction over the Guatemalan military (though US foreign policy officials often do). They also deny that they lied. But for procedural reasons, the ex-officials have to argue that regardless of whether they lied, a US citizen has no legal right to sue a public official who does lie. Solicitor General Theodore Olson filed an amicus brief arguing on behalf of the government's right to lie: "It is an unfortunate reality that the issuance of incomplete information and even misinformation by government may sometimes be perceived as necessary to protect vital interests," he maintains.

This particular case stinks for more reasons than can be precisely counted. In addition to the above, Bamaca was killed by a genocidal government that enjoyed the enthusiastic support of Ronald Reagan and George H.W. Bush. This is not only my opinion; it is the view of the Guatemalan Historical Clarification Commission's 1999 report, which condemns the United States for aiding a "criminal counterinsurgency" against the nation's indigenous Mayan population. America's Guatemala policy was anticommunism gone mad.

Moreover, if David Brock is to be believed, Olson is himself tainted by his lies to Congress. According to Brock's Congressional testimony, Olson lied during his confirmation hearings about his role in the Richard Mellon Scaife-funded "Arkansas Project," run out of the offices of The American Spectator and designed to undermine the Clinton presidency by any means necessary. What a surprise, therefore, that he thinks it's OK for the government to lie as well.

But the sorry truth is that the question of the government's right to lie is a lot more complicated than it looks. The Supreme Court has repeatedly enshrined in law the extremely provocative statement enunciated in the aftermath of the Cuban missile crisis by Assistant Secretary of Defense for Public Affairs Arthur Sylvester: "It's inherent in [the] government's right, if necessary, to lie to save itself." Dishonest officials have stretched the "national security" definition beyond recognition to protect not only thuggish murderers but also narrow political interests. But the principle itself is not wholly unsound. Although lies undermine the confidence in, and practice of, democracy, in the wake of the September 11 attacks, one can imagine circumstances in which a temporary lie might save lives without endangering the Constitution.

The problem is how to set enforceable limits. Government officials lie all the time. And while it is a crime to lie to Congress and to commit perjury, these acts are prosecuted in such a haphazard and nakedly political fashion that they can hardly serve as much of a deterrent. Lawrence Walsh's legitimate prosecutions of Reagan Administration officials who lied about matters of state were mocked by allegedly high-minded pundits like David Broder and George Will and overturned in a cowardly fashion by defeated President George H.W. Bush after the 1992 election.

Meanwhile, a fanatical cabal inside the Republican Party and Kenneth Starr's office manipulated these same laws to impeach President Clinton and disarm his popular agenda over a private lie not about a matter of state but a routine case of almost adultery. Given that hundreds of thousands if not millions of Americans have told this same type of lie to protect their families (or themselves) from humiliation, they saw this partisan gambit for what it was, punishing its perpetrators in the 1998 election. But the self-righteous pooh-bahs of the punditocracy--many of whom celebrated the Reagan-era liars and quite a few of whom told their share of adulterous lies--behave as if their hypocrisy were somehow patriotically inspired.

Jennifer Harbury continues to fight not only for justice for her husband but also for a reasonable definition of the government's right to lie. Bully for this brave woman who, despite her personal tragedy, takes democracy more seriously than its alleged protectors. She is a patriot to put the pundits to shame.

As state budgets around the country are slashed to accommodate the expense of the war on terror, the pursuit of educational opportunity for all seems ever more elusive. While standardized tests are supposed to be used to diagnose problems and facilitate individual or institutional improvement, too often they have been used to close or penalize precisely the schools that most need help; or, results have been used to track students into separate programs that benefit the few but not the many. The implementation of gifted classes with better student-teacher ratios and more substantial resources often triggers an unhealthy and quite bitter competition for those unnaturally narrowed windows of opportunity. How much better it would be to have more public debate about why the pickings are so slim to begin with. In any event, it is no wonder there is such intense national anxiety just now, a fantastical hunger for children who speak in complete sentences by the age of six months.

A friend compares the tracking of students to the separation of altos from sopranos in a choir. But academic ability and/or intelligence is both spikier and more malleably constructed than such an analogy allows. Tracking students by separating the high notes from the low only works if the endgame is to teach all children the "Hallelujah Chorus." A system that teaches only the sopranos because no parent wants their child to be less than a diva is a system driven by the shortsightedness of narcissism. I think we make a well-rounded society the same way we make the best music: through the harmonic combination of differently pitched, but uniformly well-trained voices.

A parsimony of spirit haunts education policy, exacerbated by fear of the extremes. Under the stress of threatened budget cuts, people worry much more about providing lifeboats for the very top and containment for the "ineducable" rock bottom than they do about properly training the great masses of children, the vibrant, perfectly able middle who are capable of much more than most school systems offer. In addition, discussions of educational equality are skewed by conflation of behavioral problems with IQ, and learning disabilities with retardation. Repeatedly one hears complaints that you can't put a gifted child in a class full of unruly, noisy misfits and expect anyone to benefit. Most often it's a plea from a parent who desperately wants his or her child removed from a large oversubscribed classroom with a single, stressed teacher in an underfunded district and sent to the sanctuary of a nurturing bubble where peace reigns because there are twelve kids in a class with two specialists and everyone's riding the high of great expectations. But all children respond better in ordered, supportive environments; and all other investments being equal, gifted children are just as prone to behavior problems--and to learning disabilities--as any other part of the population. Nor should we confuse exceptional circumstances with behavior problems. The difficulty of engaging a child who's just spent the night in a homeless shelter, for example, is not productively treated as chiefly an issue of IQ.

The narrowing of access has often resulted in peculiar kinds of hairsplitting. When I was growing up, for example, Boston's Latin School was divided into two separate schools: one for boys and one for girls. Although the curriculum was identical and the admissions exam the same, there were some disparities: The girls' school was smaller and so could admit fewer students, and its science and sports facilities were inferior to those of the boys'.

There was a successful lawsuit to integrate the two schools about twenty years ago, but then an odd thing happened. Instead of using the old girls' school for the middle school and the larger boys' school for the new upper school, as was originally suggested, the city decided to sever the two. The old boys' school retained the name Boston Latin, and the old girls' school--smaller, less-equipped--was reborn as Boston Latin Academy. The entrance exam is now administered so that those who score highest go to Boston Latin; the next cut down go to what is now, unnecessarily, known as the "less elite" Latin Academy.

One of the more direct consequences of this is that the new Boston Latin inherited an alumni endowment of $15 million, much of it used to provide college scholarships. Latin Academy, on the other hand, inherited the revenue of the old Girls' Latin alumni association--something under $200,000. It seems odd: Students at both schools are tremendously talented, the cutoff between them based on fairly insignificant scoring differences. But rather than pool the resources of the combined facilities--thus maximizing educational opportunity, in particular funding for college--the resolution of the pre-existing gender inequality almost purposefully reinscribed that inequality as one driven by wealth and class.

There are good models of what is possible. The International Baccalaureate curriculum, which is considered "advanced" by most American standards, is administered to a far wider range of students in Europe than here, with the result that their norm is considerably higher than ours in a number of areas. The University of Chicago's School Mathematics Project, originally developed for gifted students at the Chicago Lab School, is now recommended for all children--all children, as the foreword to its textbooks says, can "learn more and do more than was thought to be possible ten or twenty years ago." And educator Marva Collins's widely praised curriculum for inner-city elementary schools includes reading Shakespeare.

Imparting higher levels of content requires nothing exceptional but rather normal, more-or-less stable children, taught in small classes by well-trained, well-mentored teachers who have a sophisticated grasp of mathematics and literature themselves. It will pay us, I think, to stop configuring education as a battle of the geniuses against the uncivilized. We are a wealthy nation chock-full of those normal, more-or-less stable children. The military should not be the only institution that teaches them to be all that they can be.

They give each other tit for tat.
The next tat may get Arafat.

Here we are, twenty years on, and the reports of the Israeli army smashing its way through Palestinian towns remind me of what came out of Lebanon as Sharon and his invading army raced north. Israeli troops beating, looting, destroying; Palestinians huddled in refugee camps, waiting for the killers to come.

But there is a huge difference. Twenty years ago, at least for people living here in the United States, it was harder, though far from impossible, to get firsthand accounts of what was going on. You had to run out to find foreign newspapers, or have them laboriously telexed from London or Paris. Reporting in the mainstream corporate press was horrifyingly tilted, putting the best face on Israeli deeds. Mostly, it still is. But the attempted news blackout by the Sharon government and the Israeli military simply isn't working.

Here's Aviv Lavie, writing in Ha'aretz on April 2:

A journey through the TV and radio channels and the pages of the newspapers exposes a huge and embarrassing gap between what is reported to us and what is seen, heard, and read in the world.... On Arab TV stations (though not only them) one could see Israeli soldiers taking over hospitals, breaking equipment, damaging medicines, and locking doctors away from their patients. Foreign television networks all over the world have shown the images of five Palestinians from the National Security forces, shot in the head from close range.... The entire world has seen wounded people in the streets, heard reports of how the IDF prevents ambulances from reaching the wounded for treatment.

As always, there are the courageous witnesses. These days we have the enormously brave young people in the International Solidarity Movement sending daily communications back to the United States that flash their way round the Internet and even translate into important interviews in the mainstream media.

Meet a few of them. Here's Jordan Flaherty, filing this account on Indymedia:

Last night the Israeli Military tried to kill me. I'm staying in the Al Azzeh refugee camp, in Bethlehem, along with about twenty other international civilians. We're here to act as human shields.... On the hill above the camp is an Israeli military sniper's post. To get where we were staying in the village, most of us had to cross this street. It was a quick, low, dash across the street. As I ran, the sniper fired.... The shots began as I came into view, and stopped shortly after I made it to the other side. They were clearly aimed at me. And, by the sound of them, they were close. All night long, there was the sound of gun shots, as the military shot into our village. We stayed clear of the windows.... The guns and bullets were, no doubt, paid for by my tax dollars. Which is, of course, why we are here.

Or Tzaporah Ryter, filing this on Electronic Intifada:

I am an American student from the University of Minnesota. I currently am in Ramallah. We are under a terrible siege and people are being massacred by both the Israeli army and armed militia groups of Israeli settlers.... On Thursday afternoon, the Israeli army began sealing off each entrance to Ramallah.... Those traveling in began desperately searching for alternative ways and traveling in groups, but the Israelis were firing upon them and everyone was running and screaming.... Israeli jeeps were speeding across the terrain, pulling up from every direction and shooting at the women and children, and also at me...

Or the extremely articulate and self-possessed Adam Shapiro, whose testimony ended up in the New York Daily News and on CNN, where he told Kyra Phillips:

This is not about politics between Jew and Arab, between Muslim and Jew. This is a case of human dignity, human freedom and justice that the Palestinians are struggling for against an occupier, an oppressor. The violence did not start with Yasir Arafat. The violence started with the occupation.... Arafat, after every terrorist incident, every suicide bombing, after every action, has condemned this loss of life, of civilian lives on both sides. The Sharon government, sometimes will apologize after it kills an innocent civilian, but it does not apologize for raping the cities and for going in and carrying out terrorist actions, going house to house tearing holes through the walls, roughing up people, killing people, assassinating people.

Most of the time you open up a newspaper and read a robotic column--as I did the Los Angeles Times's Ronald Brownstein the other day--about Palestinian terrorism and the wretched Arafat's supposed ability to quell the uprising with a few quick words. And then you turn on the NewsHour and there, of all people, is Zbigniew Brzezinski, stating the obvious, on April 1:

The fact of the matter is that three times as many Palestinians have been killed, and a relatively small number of them were really militants. Most were civilians. Some hundreds were children.... in the course of the last year, we have had Palestinian terrorism but we have also had deliberate overreactions by Mr. Sharon designed not to repress terrorism but to destabilize the Palestinian Authority, to uproot the Oslo Agreement, which he has always denounced, in a manner which contributed to the climate, that resulted in the killing of one of the two architects of the Oslo Agreement.

After predictable dissent from Kissinger, Brzezinski went on:

It's absolute hypocrisy to be claiming that Arafat can put a stop to the terrorism.... the fact of the matter is that his ability to control the situation would be greatly increased if there was serious movement towards political process, towards a political settlement and that the United States took the lead.

Between this brisk statement and the eloquent courage of Adam Shapiro and his brave fellow internationalists, the truth is getting out--not fast enough, not loud enough--but better than twenty years ago.

Articles

As the Israeli army continues the second week of its military reoccupation of the Palestinian-controlled towns of the West Bank, a group of internationals is playing a role of solidarity.

Muslim "fundamentalists" are often people from the striving middle class.

The Bush Administration, urged by the oil industry, has embraced a corrupt regime.

Books & the Arts

Music

I was in high school in the 1960s when I first saw Dave Van Ronk at the Gaslight, one of those little cellar clubs that used to line a Greenwich Village that now lives in myth and legend. I didn't understand what he was doing. It seemed like a jumble whose elements I recognized--folk tunes, ragtime, early jazz, Delta blues--but they didn't gel into what I thought was coherence. It was really only my expectations, though, that were exposed. I felt like Dr. P in Oliver Sacks's The Man Who Mistook His Wife for a Hat, scanning deconstructed faces for that single telltale feature that would reveal who I was looking at. I didn't know how to think about it. I couldn't have been more confused if Louis Armstrong had ambled onto The Ed Sullivan Show and followed "Hello Dolly!" with "The Times They Are A-Changin'."

Two things, however, I got: Van Ronk was a hellacious guitar picker, and he was the only white guy I'd ever heard whose singing showed he understood Armstrong and Muddy Waters. He roared and bellowed like a hurricane; he could be threatening, and tender as the night. And he was funny. Not cute funny--really funny. He did bits from W.C. Fields, whose movies, like those of the Marx Brothers, were just being revived. He did "Mack the Knife" with a suddenly acquired tremolo I later found out was Marlene Dietrich's. He finished with "Cocaine," which he'd adapted from the Rev. Gary Davis, his friend and teacher, adding his own asides ("Went to bed last night singing a song/Woke up this morning and my nose was gone"). Decades later, Jackson Browne revived the tune, his band parsing Van Ronk's solo guitar.

There are many Van Ronk undercurrents flowing through American pop culture. The acclamation that followed his death from colon cancer early this year strangely mirrored his ghostly omnipresence during life. He was a missing link: an authentic songster who voiced folk-made music. At his artistic core, he reconnected jazz to folk-music forms that he, like his avatar Woody Guthrie, pursued, learned and kept alive--and, with the wit and humor that kept homage from freezing into reverence, dared to reimagine.

A big, burly guy whose personality was as oversized as his voice, Van Ronk never crossed over to commerciality, never got mainstream-famous. In those ways, he was a true exemplar of the folk-revival aesthetic: becoming too visible or successful equaled selling out. He followed the time-honored American path into this culture's musical heart: He studied sources and learned from living African-American performers. Those sources included Piedmont ragtime pickers like Blind Blake and Blind Boy Fuller and Delta deep-bluesmen like Son House, as well as parlor music. Then there was the Rev. Gary Davis. He'd dazzled 1940s Harlem street corners with his stylistically wide-ranging guitar and whooping singing, careening from biblical shouts to leering lipsmackers, and by the 1960s had become a teacher who drew Village hipsters to his small brick house in Queens. This was the era when Moondog, the eccentric jazz poet, took up his post near the Museum of Modern Art and did, well, whatever he felt like doing that day.

Maybe it's not surprising that I was so confused by these figures that I didn't guess until later that I'd seen some of the last stages of America's oral culture.

The acceleration of technological change has inevitably altered the oral process of folk-art transmission. In the twenty-first century it seems that, for better and worse, technology has probably rendered the Van Ronks oddly superfluous, apparently redundant. In evolution, if not architecture, form follows function. The concept of folk music hatched by Charles Seeger and the Lomaxes, and embodied by Woody Guthrie, Lead Belly and Pete Seeger, has, in the age of mass recording, lost the daily uses that made it folk art. Where once songsters were the repositories and transmitters of our polyglot national folk heritage, where Van Ronk's generation of amateur and semipro musicologist-sleuths sought out records tossed into people's attics and garages to find artists obscured by the mists of time, now, thanks to the omnipresent, profitable avalanche of record-company CD reissues, almost anything they dug up is readily available. Of course, the artists and their cultures are not.

So our easy connection with the cultural past is shaped by the recording studio, with its time constraints and pressures and implicit notion of a fixed performance guarded by copyright--and the possibility of paying publishing royalties that are the core of the music industry's economy. That inevitably alters performances from folk art, where borrowing and repetition are demanded. Thus we've lost the idiosyncratic twists to the oral/aural tradition that an artist of Van Ronk's caliber introduces, casually and yet integrally, however much they appear like asides.

"This song has changed since Gary used to do it," he used to growl, introducing "Cocaine." Which was, of course, part of the point, the method of transmission, of real folk music: If culture is a conservative mechanism, a cumulative record of human activity, change results from disconnections and accretions like Van Ronk's sharp-witted reactions to Davis's barbed blues, originally improvised add-ons drawn from his memory of lyrics the way a jazz musician pulls riffs from history and reworks them into his own voice.

Van Ronk was a die-hard collector of sources, living and recorded. As the liner notes to the 1962 album In the Tradition put it, "Dave Van Ronk has established himself as one of the foremost compilers of 'Jury Texts' regarding traditional tunes. (Jury Texts are when many verses are sung to one tune, usually with some new words appearing with each subsequent recording.) Here, in 'Death Letter Blues,' Van Ronk has arranged some of the most moving verses of this song into a dramatic slow blues." Behold the songster at work--a process found in early Armstrong, Guthrie and Robert Johnson.

Although the building blocks of oral culture are plastic, preservationists in a nonoral culture tend toward reverence, and thus simpler imitation--hence the folk revival's slew of earnest groups like the New Lost City Ramblers. As Van Ronk observed in a late 1970s interview in the folk music quarterly Sing Out!, "It was all part and parcel of the big left turn middle-class college students were making.... So we owe it all to Rosa Parks." While black rhythm-and-blues was revving white teens into rock and roll, black folk artists became heroes to young white collegians. The left cast a romantic, even sacramental aura over black (and white) folk art and its traditions, which implicitly stigmatized creative change; the central notion of folk-revival culture, authenticity, meant avoiding commercial trappings and replicating a recorded past.

Perhaps it was Van Ronk's deep study of that past that helped him avoid fixing it. In a late 1990s interview, asked about Harry Smith's Anthology of American Folk Music, he rightly called it the bible of his generation and noted dryly, "I sat up and took notice at how many tunes that, say, Doc Watson does that are on the Anthology.... Some he would have known [via oral tradition]. But you can tell. There are hundreds of possible verses. When someone does [lists three verses in order], you know they've been listening to Bascom Lamar Lunsford."

"One thing I was blessed with is that I was a very, very bad mimic," Van Ronk once observed. Which is another view of how oral tradition mixes conservation and creativity. Van Ronk's background allowed him to understand this uniquely.

He was born in Brooklyn on July 30, 1936, a Depression baby to a mostly Irish working-class family. His father and mother split, and he grew up in blue-collar Richmond Hill, Queens, where he went to Catholic school--or played truant--until the system gave up on him, at 16. In 1998 he told David Walsh, "I remember reading Grant's memoirs, the autobiography of Buffalo Bill. Lots of Mark Twain.... My brain was like the attic of the Smithsonian.... The principal...called me 'a filthy ineducable little beast.' That's a direct quote." Like Guthrie, Van Ronk became a formidable autodidact. While he hung out in pool halls he was listening to jazz--bebop, cool, then traditional, a k a New Orleans or Dixieland jazz, a style with its own cult of authenticity. He fell in love with Armstrong and Bessie Smith, along with Lead Belly and Bing Crosby, his major vocal influences.

Like Odysseus, Guthrie, Kerouac and Pynchon, Van Ronk decided to take to the sea. In 1957, he got a shore gig at the Cafe Bizarre in the Village. Odetta, the gospel-voiced black singer who gave the 1950s folk scene an interracial connection--as Lead Belly, Sonny Terry and Brownie McGhee, and Josh White had to the first Depression wave--heard him, liked him and convinced him to make a demo tape that she'd pass on to Albert Grossman, folk-music maven, Chicago club owner and future manager of Bob Dylan. Popping Benzedrine in the best Beat fashion, Van Ronk hitchhiked to Chicago in twenty-four hours, got to Grossman's club, found out the tape hadn't, auditioned, got turned down (Grossman was booking black songsters like Big Bill Broonzy, and Van Ronk accused him of Crowjimming), hitchhiked back to New York, had his seaman's papers stolen and thus decided that he would, after all, become a folk singer.

Given his sardonic realism, it was fittingly ironic that he and his wife, Terri Thal, became quasi parents for dewy-eyed collegiate folkies drawn by Guthrie's songs and Seeger's indefatigable college-concert proselytizing. Seeger's shows planted folk-music seeds on campuses across the country, but Smith's Anthology provided the rich soil for the next generation of folk musicians. "Cast your mind back to 1952," Van Ronk told one interviewer. "The only way you could hear the old timers was hitting up the thrift shops. When the Anthology came out, there were eighty-two cuts, all the old-time stuff. I wore out a copy in a year. People my age were doing the same." As did his musical stepkids.

Van Ronk once said of Seeger, "What am I supposed to say about the guy who invented my profession?" By the late 1950s that profession had migrated far from Lead Belly and Guthrie, songsters who lived the lives they chronicled, and far from Seeger's fierce anticommercialism and romantic faith in a pure, true folk culture. History intervened. Seeger had refused to testify before the House Un-American Activities Committee and had doggedly resurfaced in the post-McCarthy era. Still, less threatening figures like Burl Ives became the commercial faces of folk music. As Joe Klein noted in Woody Guthrie: A Life, the folk revival offered record companies an exit from payola scandals and the racial and sexual fears that had generated mainstream disapproval of rock and roll. The patina of integrity and authenticity covering white collegiate folk music helped the labels repolish corporate images.

Starting in 1957, the Kingston Trio cleaned up old tunes like "Tom Dooley" and "Tijuana Jail" and scored several top-25 hits. Neat folk groups proliferated, feeding into the Village and Cambridge, Massachusetts, where young men and women donning recently acquired rural accents and denims recycled the Anthology's songbook and hoped to catch a label's ear.

In 1959, when Bobby Zimmerman was leaving behind his piano à la Jerry Lee Lewis for college and the Anthology's lures, Van Ronk made his first records, now compiled on The Folkways Years (Smithsonian/Folkways); they unveil a songster misclassified. Van Ronk once said, "I never really thought of myself as a folk singer at all. Still don't. What I did was to combine traditional fingerpicking guitar with a repertoire of old jazz tunes." Here he does a Gary Davis-derived staple of his repertoire, "Hesitation Blues," and more blues and gospel. His big, rough voice and guitar dexterity are self-evident, as is his improvisational feel.

In 1964 came the yang of his recurrent jazz-folk dialectic: with Dave Van Ronk's Ragtime Jug Stompers (Mercury) he recorded high-energy versions of tunes like "Everybody Loves My Baby" with a wild and ragged Dixieland outfit. On the solo album Sings the Blues (Folkways), Van Ronk's coarse voice and nimble fingers got looser--like the irrepressible Davis's--and thus he found himself.

"It was more academic than it is now," Van Ronk remembered in the 1970s:

It was 'de rigueur,' practically, to introduce your next song with a musicological essay--we all did it. There was a great deal of activity around New York--not so much you could make money at. But there were folk song societies in most of the colleges and the left was dying, but not quietly. So there was a great deal of activity around Sing Out! and the Labor Youth League, which wasn't affiliated with the old CP youth group, you understand. There was a lot of grassroots interest among the petit-bourgeois left.

Spoken like the sly observer who once told an interviewer from the International Committee of the Fourth International, "I've always liked Trotsky's writings as an art critic."

By 1961 Bobby Zimmerman was Bobby Dylan and had arrived in New York, Van Ronk was an insider on the Village folk scene and the two gravitated toward and around each other, thanks partly to what Van Ronk called the take-no-prisoners quality of Dylan's music and personality. Ramped up by commercial success, the postwar folk revival's peak loomed over debates about authenticity. "All of a sudden," Van Ronk recalled a few years back, "there was money all over the place."

He settled into the Gaslight, a hub for noncommercial folkies. Several other pass-the-hat beat-folk coffeehouses, like Cafe Wha?, opened. By 1962 Dylan had settled in down the block, at the grander Gerde's Folk City. Izzy Young of the Folklore Center, part of the older folk-revival wave, had set up a folk-music showcase, WBAI had broadcast the shows and club owner Mike Porco, realizing he had a salable product, ousted both, lining his bar with record covers and his seats with young beatniks. Porco's Monday night Hoots were the dollar-admission descendants of both Young's and Seeger's earlier informal loft gatherings, and he showcased rediscovered legends like John Lee Hooker with Dylan as the opener. Tom and Jerry--later known as Simon and Garfunkel--and Judy Collins cut their teeth there. Kids flocked to this semi-underground. Jug bands emerged as the college-beatnik equivalent of the 1950s blue-collar rockabilly outbreak in the South, and street-corner doo-wop in the North, prefiguring the 1960s garage-band explosion after the Beatles and electric Dylan. The link: Everyone felt empowered to make music. These were folk musics.

The Newport Folk Festival, the crowning triumph of the postwar folk revival, was first organized in 1959 by jazz impresario George Wein and Albert Grossman, and graduated the purer wings of the folk movement to big-time concerts; Seeger himself was involved. "I never liked those things," Van Ronk characteristically recalled. "It was a three-ring circus.... You couldn't even really hear what you came to hear. Put yourself in my position, or any singer's position: How would you like to sing for 15,000 people with frisbees?" Along with his own musical catholicity, that may be why, even after the Dylan-goes-electric blowup at the 1965 festival, Van Ronk remained a Dylan defender.

"Nervous. Nervous energy, he couldn't sit still," is how he spoke of young Bob to David Walsh in 1998:

And very, very evasive.... What impressed me the most about him was his genuine love for Woody Guthrie. In retrospect, even he says now that he came to New York to 'make it.' That's BS. When he came to New York there was no folk music, no career possible.... What he said at the time is the story I believe. He came because he had to meet Woody Guthrie.... Bobby used to go out there two or three times a week and sit there, and play songs for him. In that regard he was as standup a cat as anyone I've ever met. That's also what got him into writing songs. He wrote songs for Woody, to amuse him, to entertain him. He also wanted Woody's approval.... [Dylan's music] had what I call a gung-ho, unrelenting quality.... He acquired very, very devoted fans among the other musicians before he had written his first song.

Van Ronk was the first to record a tune Dylan claimed to have written, "He Was a Friend of Mine," on Dave Van Ronk, Folksinger in 1962 (the album has been reissued as part of Inside Dave Van Ronk [Fantasy]). Three years later, the Byrds redid it on Turn! Turn! Turn!, whose title cut remade Seeger's setting of Ecclesiastes into folk rock, the new sound Dylan had kicked into high gear during his 1965 tour.

Van Ronk once observed, "The area that I have staked out...has been the kind of music that flourished in this country between the 1880s and, say, the end of the 1920s. You can call it saloon music if you want to. It was the kind of music you'd hear in music halls, saloons, whorehouses, barbershops, anywhere the Police Gazette could be found." That's not exactly a full description of what he did over thirty albums and countless performances. Better to think of him as a songster, an older, more encompassing sort of folk artist. Lead Belly and the Reverend Davis are outstanding examples of this type; they drew from multiple local and regional traditions that, in the early days of radio and phonograph, still defined American musical styles. Dance tunes, blues, ragtime, ballads, gospel--anything to keep the audiences on street corners or in juke joints interested and willing to part with some cash. This was, after all, performance. Entertainment was its primary goal; improvisation, found in the vocal-guitar interplay and instrumental backing as well as verse substitutions and extrapolations or shortenings, played to audience reaction.

In 1962, with the Red Onion Jazz Band, Van Ronk cut In the Tradition, which, along with the solo Your Basic Dave Van Ronk Album, cut in 1981, will be included on the forthcoming Two Sides of Dave Van Ronk. This somewhat odd couple makes a wonderful introduction to the breadth, depth and soul of this songster's legacy. The smoothly idiomatic Red Onions pump joyful New Orleans adrenaline and Armstrong trumpet into a raucous "Cake Walking Babies From Home"; a sinuous "Sister Kate," that dance hit built from an Armstrong melody; and Dylan's caustic "All Over You." Amid the Dixieland are solos: a stunning version of Son House's "Death Letter Blues" (later recorded by Cassandra Wilson), Lead Belly's "Whoa Back Buck," the virtuosic ragtime "St. Louis Tickle," signature pieces like the gentle "Green, Green Rocky Road" and "Hesitation Blues." The tunes drawn from Your Basic Dave Van Ronk Album show no diminishing of talent and a continuing breadth of perspective: Billie Holiday's "God Bless the Child" (sung with a tenderness that scorches periodically into Howlin' Wolf) and "St. James Infirmary" share space with tunes by Davis and Mississippi John Hurt.

In 1967 he cut Dave Van Ronk & The Hudson Dusters (Verve Forecast), a cross of jug band and electric folk music that foreshadowed The Blues Project, the improvising garage band that Van Ronk pupils Danny Kalb and Steve Katz later formed. There was doo-wop, Joni Mitchell (whose Clouds becomes anguished, thanks to Van Ronk's torturous voice breaks used with interpretive skill, a move he learned from Armstrong and Bessie Smith) and the balls-to-the-wall garage rock "Romping Through the Swamp," which sounds akin to Captain Beefheart.

Recorded in 1967, Live at Sir George University (Justin Time) is time-capsule Van Ronk on guitar, plus vocals, doing pieces of his repertoire: "Frankie and Albert," "Down and Out," "Mack the Knife," "Statesboro Blues" and "Cocaine," of course--all masterful, each distinct.

By then the folk boom, whose audience was bleeding into folk-rock, electric blues and psychedelia, had stalled and ended. Van Ronk continued (except for a hiatus in the 1970s) to perform and record and gather new-old material. And he had time, before his death, to deliver some acid reflections.

On 1960s folkies:

The whole raison d'être of the New Left had been exposed as a lot of hot air, that was demoralizing. I mean, these kids thought they were going to change the world, they really did. They were profoundly deluded.... Phil Ochs wrote the song "I declare the war is over," that was despair, sheer despair.

On 1980s folkies:

You're talking about some pretty damn good songwriters. But I'd like to hear more traditional music.... With the last wave of songwriters you get the sense that tradition begins with Bob Dylan and nobody is more annoyed with that than Bob Dylan. We were sitting around a few years ago, and he was bitching and moaning: "These kids don't have any classical education." He was talking about the stuff you find on the Anthology [of American Folk Music]. I kidded him: "You got a lot to answer for, Bro."

Book

There are those opposed to the use of cloning technology to create human embryos for stem-cell research whose concerns emanate from commitments to social justice. One of their arguments runs as follows: The idea driving this medical research is that by creating an embryo through cloning, we can produce embryonic stem cells that are a perfect genetic match for a patient. All that is required to conduct the cloning is a skin cell from which to extract the patient's DNA and...a human egg.

Where, cry out the social justice advocates, are we going to get all these eggs for all these patients? Do the math, they suggest: 17 million American diabetics, needing anywhere from 10 to 100 eggs each, since the cloning technology is far from efficient...and even if you can pull that off, Christopher Reeve is still not walking, Michael J. Fox and Janet Reno still tremble and Ronald Reagan still doesn't remember who Ronald Reagan was. The social justice folk maintain that the billions of eggs required for embryonic stem cell therapies for the millions of Americans suffering from chronic and degenerative diseases will be obtained through exploitation of poor women in this country and the world over. Surplus value will take on an even more nefarious meaning.
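To see why the critics reach for the word "billions," here is a back-of-the-envelope multiplication using only the figures they themselves cite (17 million diabetics, 10 to 100 eggs per patient); the range, not any exact total, is the point:

\[
1.7\times10^{7}\ \text{patients}\times(10\ \text{to}\ 100)\ \text{eggs each}\;\approx\;1.7\times10^{8}\ \text{to}\ 1.7\times10^{9}\ \text{eggs}
\]

And that is for diabetes alone, before counting the other chronic and degenerative diseases such therapies are meant to address.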

Still, the early results from embryonic stem-cell therapy in mice are so dramatic that not to pursue this medical research is recognized as morally obscene and just plain stupid. At the University of California, Dr. Hans Keirstead was able to implant neurological tissue derived from embryonic stem cells in a mouse with partial spinal cord injury so that after eight weeks, the mouse had regained most of its ability to walk and, of major significance to the quarter-million Americans suffering from this tragic condition, had also regained bladder and bowel control. Yet, the question remains, where are we going to get all those eggs?

A call to Stanford University's Paul Berg, a Nobel laureate who has been testifying to Congress on behalf of embryonic stem-cell research, helps elucidate the answer: When it comes to the research, he says, the quantity required may not be a problem. But if the desired therapeutic potential of embryonic stem cells is fully realized, the need for eggs will be great and could short-circuit the availability of these therapies. But a solution to that may be possible, Berg insists. If research is carried out that identifies the biochemicals in the egg directing the genetic material to develop into an embryo, then we could extract and fractionate those biochemicals and insert them into any skin cell, for example, for use in the cloning process. Voilà! A skin cell becomes an egg, and skin cells are plentiful.

The immediate enthusiasm for this breakthrough scientific idea, which could help Reeve walk again while simultaneously obviating the motive for an exploitative human egg market, is quickly tempered by the full realization of what Berg has explained: When we acquire the ability to use any cell as an egg, we will have removed another obstacle to achieving complete control over human reproduction. Admittedly, complete control over the production of reproduction will require a womb for gestation--but that ultimately should prove to be just another biochemical matter for extraction and fractionation.

This, then, is how it goes in biotechnology, the essential dynamic that simultaneously gives rise to medical hope and moral vertigo. Each step forward produces a new problem, the solution to which demands further control over the biological mechanism known as a human being. But this somehow impinges on human beings or some portion of ourselves that we value. To deal with the attendant moral quandaries, a method is found to isolate and duplicate the underlying molecular process. The moral quandary has thus been replaced by an extracorporeal biochemical process, no longer strictly identified as human, and therefore a process that no one can reasonably value apart from its use. The problem, as bioethicist Eric Juengst puts it, is that we could thereby successfully cope with every moral dilemma posed by biotechnology and still end up with a society none of us would wish to live in. For Francis Fukuyama, this is Our Posthuman Future, as he has titled his new book on the subject.

Fukuyama's most famous previous theoretical foray was to declare, in 1989, an end to history, whereby a capitalist liberal democratic structure represented the final and most satisfying endpoint for the human species, permitting the widest expression of its creative energies while best controlling its destructive tendencies. He imagined that ultimately, with the universal acceptance of this regime, the relativist impasse of modern thought would in a sense resolve itself.

But thirteen years after the end of history, Fukuyama has second thoughts. He's discovered that there is no end of history as long as there is no end of science and technology. With the rapidly developing ability of the biological sciences to identify and then alter the genetic structure of organisms, including humans, he fears the essence of the species is up for grabs. Since capitalist liberal democratic structures serve the needs of human nature as it has evolved, interference by the bio-engineers with this human nature threatens to bring the end of history to an end.

The aim of Our Posthuman Future is "to argue that [Aldous] Huxley was right," Fukuyama announces early on, referring to Huxley's 1932 vision of a Brave New World. Multiple meanings are intended by Fukuyama: The industrialization of all phases of reproduction. The genetic engineering of the individuals produced by that process, thereby predetermining their lives. The tyrannical control of this population through neurochemical intervention, making subservience experientially pleasurable. Fukuyama cites specific contemporary or projected parallels to Huxley's Hatchery and Conditioning Center, Social Predestination Room and soma. In Fukuyama's terms, the stakes in these developments are nothing less than human nature itself.

The first of the book's three parts lays out the case that the biotechnologically driven shift to a posthuman era is already discernible and describes some of the potential consequences. Prozac and Ritalin are precursors to the genomically smart psychotropic weapons of the near future. Through these drugs, which energize depressed girls and calm hyperactive boys, we are being "gently nudged toward that androgynous median personality, self-satisfied and socially compliant, that is the current politically correct outcome in American society." Standardization of the personality is under way. This is the area to watch, Fukuyama asserts, because virtually everything that the popular imagination envisions genetic engineering accomplishing is much more likely to be accomplished sooner through neuropharmacology.

Increased life spans and genetic engineering also offer mostly dystopic horizons, whereby gerontocracies take power over societies whose main purpose has become the precision breeding of their progeny. The ancient instincts for hierarchical status and dominance are still the most powerful forces shaping this new world born from biotechnology. Since, as Fukuyama sees it, science does not necessarily lead to the equality of respect for all human beings demanded by liberal egalitarianism, the newest discoveries will serve the oldest drives. We are launched on a genetic arms race.

But be warned: We may not arrive in that new world through some dramatic struggle in which we put up a fight. Rather, the losses to our humanity may occur so subtly that we might "emerge on the other side of a great divide between human and posthuman history and not even see that the watershed had been breached because we lost sight of what that [human] essence was."

If this terrible event is to be prevented, then the human essence, which Fukuyama correlates with human nature itself, must be identified and kept inviolable. But where is that line around "human nature" to be drawn--a line to which we can all adhere, so that we might reap the benefits of biotechnology while preventing the nightmare scenarios from ever coming to pass?

The entire world today wants the answer to this. Fukuyama promises to deliver it. But despite the clarity with which he announces his mission, the author advises his readers, "Those not inclined to more theoretical discussions of politics may choose to skip over some of the chapters here." Yet these are the very chapters containing the answer we all seek in order to tame the biotechnology beast! This, then, signals that we are entering dangerous ground, and we will need to bear with the author's own means of revealing his great discovery, which may be skipped over at our own peril.

In this heart of the book, titled "Being Human," Fukuyama first seeks to restore human nature as the source of our rights, our morality and our dignity. In particular, he wishes to rescue all these dimensions from the positivist and utilitarian liberal philosophers who, closely allied with the scientific community, have dominated the debate over biotechnology. According to the author, these philosophers assign rights everywhere and emphasize the individual as the source of moral concern. In doing so, they put humankind and its collective life at risk before the juggernaut of biotechnology. John Rawls and Ronald Dworkin, among others, have elevated individual autonomy over inherently meaningful life plans, claims Fukuyama, who then questions whether moral freedom as it is currently understood is such a good thing for most people, let alone the single most important human good.

Rather than our individual autonomy or moral freedom, Fukuyama wishes that we would attend to the logic of human history, which is ultimately driven by the priorities that exist among natural human desires, propensities and behaviors. Since he wishes us to shift ground to the logic of the inherent and the natural, he must finally define that core composing human nature:

The definition of the term human nature I will use here is the following: human nature is the sum of the behavior and characteristics that are typical of the human species, arising from genetic rather than environmental factors.

Later he will refine this further to the innate species-typical forms of cognition, and species-typical emotional responses to cognition. What he is really after is not just that which is typical of our species but that which is unique to human beings. Only then will we know what needs the greatest safeguarding. After hanging fire while reviewing the candidates for this irreducible, unique core to be defended, including consciousness and the most important quality of a human being, feelings, Fukuyama finally spills the beans:

What is it that we want to protect from any future advances in biotechnology? The answer is, we want to protect the full range of our complex, evolved natures against attempts at self-modification. We do not want to disrupt either the unity or the continuity of human nature, and thereby the human rights that are based on it.

So, where are we? It would seem we have gone full circle. Human nature is defined by...human nature! To the extent that it is capable of being located in our material bodies, it is all that arises from our genetics. Any attempt at greater precision is a violation of our unity or continuity--and threatens to expose the author's empty hand. Through such sophistry, Fukuyama wishes to assert mastery over any biotechnological innovation that he considers threatening, since he can now arbitrarily choose when it is disruptive of the unity or continuity of the human nature arising from our genetics. Even a heritable cancer could qualify for protection under Fukuyama's rubric for that which is to be defended from biotechnological intervention.

Indeed, there are those agreeing with Fukuyama's view of the biological bases of human social life who draw opposite conclusions about human bioengineering, viewing it as humanity's last best hope.

The remainder of the book is a potpourri of tactical suggestions (embedded in rhetoric cloned from Fukuyama's mentor in these matters, bioethicist Leon Kass) about which biotechnologies should be controlled, and about the need for both national and international bodies and systems to do so, if such control is to be effective. That, in the end, may be the most surprising aspect of the book. All this fervid philosophizing in reaction to fears about a Brave New World, working toward the radical conclusion that what is needed is...regulation--although, obviously, recognition of the need for regulation might well be experienced as a radical trauma by someone who has previously placed an overabundance of faith in the market.

But one would be foolish to believe that Fukuyama has gone all this distance simply to argue for what he refers to at one point as a more nuanced regulatory approach. In his most public engagement with biotechnology thus far, he has endorsed, written and testified to Congress on behalf of a bill that will not only ban human reproductive cloning but also ban nonreproductive cloning for stem-cell research. The legislation he supports would also make any doctor who utilizes or prescribes a treatment developed with cloning technology subject to ten years in prison and a $1 million fine. Under this legislation, then, if a cure or treatment for diabetes or heart failure is created in England that used embryo cloning to harvest stem cells for therapy, US physicians would not be allowed to have access to such treatments for their patients. This is his lesson in how moral freedom is not such a good thing compared with an inherently meaningful life plan. Let the fragile diabetic or spinal cord-injury victim learn the true value of our human nature from their catheterized bladders!

Fukuyama's entire brief depends upon avoiding the consequences of his own logic. Having identified the human essence with our biological human nature, he must evade any further specification or else the particular tissues, cells or molecules would be subject to further discussion and analysis as to whether or not they represent the human essence. Rather than discussion, we should trade in our autonomy and moral freedom for his protections. By the close of the book, any moral qualms on his part fall entirely by the wayside. Fukuyama is perhaps aware that he has failed to make his case except to those ready to believe. The book culminates in a final paragraph that is nothing less than a temper tantrum:

We do not have to accept any of these future worlds under a false banner of liberty, be it that of unlimited reproductive rights or of unfettered scientific inquiry. We do not have to regard ourselves as slaves to inevitable technological progress when that progress does not serve human ends. True freedom means the freedom of political communities to protect the values they hold most dear...

Nice rhetoric until we recall the values of the types of political regimes to which moral freedom and science must be sacrificed. While Fukuyama rails against the Brave New World, he takes the side of Huxley's World Controller, who explains, "Truth's a menace, science is a public danger...That's why we so carefully limit the scope of its researches."

There is an alternative to the fear that human nature must be inviolable because human nature cannot be trusted. We have seen imperious dictates against science and moral freedom delivered by philosophers before. In the recent past, we have evidence of very similar ideas in very similar language issuing from the philosopher whom Fukuyama draws upon for the epigraph beginning the first chapter of his book, Martin Heidegger. In the 1930s Professor Heidegger wanted science to serve the German essence, and it did. Now Professor Fukuyama wants science, and all of us, to serve the human essence, which he equates with his version of sociobiology infused with German romantic holism. Once more, we witness someone who would stop tyranny by imposing a tyranny of his own. Since Francis Fukuyama now sits on the President's Council on Bioethics, we should be grateful for the warning.

Book

"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."

Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.

Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.

One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next to last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.

The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he may flatly assert that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and that "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind, two chapters earlier. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he indicates the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.

In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.

Moreover, Noble undercuts his own case with hyperbole and by failing to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.

Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.

Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."

Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."

One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.

Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.

As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and in diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."

Noble does provide one strong cautionary tale--the most informative chapter in the book--about the involvement of UCLA's extension division with a commercial company called Onlinelearning.net. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years--a figure that in reality quickly plummeted to below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.

Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.

In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of them available online?

Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?

Noble's answer to most of these questions would undoubtedly be yes; he insists that anything less than "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)

The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.

We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."