
Perhaps time is our invention
To make things seem to move
Like the uncovering tail of the blue jay
As it lights its feet on the wet
Trembling wood.

Perhaps the seasons are really not
More than a single space with walls inside, disconnected
While fall and winter, and spring
Which we always anticipate, are only
Expansions of our own longings.

Perhaps there is only the now
Neither age nor youth, not even the vertigo of memories stilettoed
Except wounded into this present second
Shorter than the birth of a cell, or the nest dropped
With the sun and the rain always out together.

This center is absolute, it needs no endlessness
For heaven or hell. Or for creation, our own illusion of ourselves.
The minor variations we unfold are all the same
Inherently permutating at once
Repeating one design. Obscure. Lit at the edges of our time.

Having a hard time finding a new apartment to fit your budget? Consider a move to the blocks around Ground Zero. The Lower Manhattan Development Corporation, the body set up by former New York Mayor Rudy Giuliani and Governor George Pataki to oversee the rebuilding of downtown post-September 11, will pay you up to $12,000 to relocate south of Chambers and west of Broadway for two years, or stay there should you live there already. I know--I couldn't believe it either. This bounty to adventurous tenants is federal money and comes even as area rents are already down by as much as 30 percent since 9/11. Think of it as a mini-version of the millions ladled out to keep corporations from abandoning lower Manhattan for New Jersey. I figured out that I could sublet my apartment on the Upper West Side, move downtown and actually be able to live on my Nation salary (well, almost).

It's true downtown is a mess right now--depending on whom you talk to, the air quality's somewhere between itchy and lethal, a lot of little shops and restaurants have folded, and Ground Zero is not everyone's idea of a view. Still, whatever happened to the survival of the fittest? To the market and its omnipotent invisible hand? Why shouldn't downtown apartments fall to their "natural" price--the rent at which sufficient numbers of people will want to take out a lease despite the angst and aggravation? And if that figure turns out to be so low that the current landlords can't make a go of it, isn't it the capitalist theory that other, cleverer landlords will step into the breach, with the consumer the winner? Why should the federal government pay middle-class professionals to live in one neighborhood rather than another? The answer is, to keep downtown a great place for those same middle-class professionals to live and for real estate interests to invest in.

Public subsidy is certainly not the principle animating housing policy for low-income people and homeless families like the ones whose tribulations were superbly, unforgettably chronicled by Jennifer Egan in The New York Times Magazine ("The Hidden Lives of Homeless Children," March 24). Five hundred dollars a month to brighten a scruffy and underpopulated district with their presence? The housing allowance for a family of three on welfare is $286. It's one thing to herd women and kids into filthy motels at the city's edge, miles from grocery stores and hours away from schools and jobs--at daily rates for which they could be happily ensconced in their own apartments. It would be quite another matter to treat low-income New Yorkers as members of society with contributions to make that are equal to (or greater than) those of bond traders or publicity agents, and to see their children as no less deserving of a safe and stable place to live than any other kids.
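For the record, the comparison works out as follows (the downtown grant's $12,000 spread over its two-year term):

$$\$12{,}000 \div 24\ \text{months} = \$500\ \text{a month}$$

versus $286 a month for a family of three on welfare.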

As Egan points out, homeless families--now 75 percent of the city's shelter population, including 13,000 children this past winter--are caught between falling or stagnant wages and skyrocketing housing costs. The housing market is just too tight, no public housing is being built and the waiting list for Section 8 vouchers, which poor families can use toward private-market rents, has more than 200,000 names. Homelessness is a civic emergency, an affront to human dignity and a threat to the city's future, affecting everything from public health to public schools to public safety. But can you imagine Mayor Bloomberg, inspired by Egan's crusading journalism, proposing that we move homeless families--virtuous, sober, quiet homeless families, to be sure--into those hard-to-rent vacancies downtown? Middle-class New Yorkers would lie down in traffic to prevent it.

As society polarizes between rich and poor, differential treatment becomes ever more blatant and punitive. Thus, George W. Bush, seconded by Congressional Republicans and the Democratic Leadership Council, proposes forcing welfare mothers to work forty hours a week--nearly double the national norm for working moms. Thus, the Supreme Court, in a staggering 8-to-0 verdict (Justice Breyer recused himself), decides that public housing authorities can evict tenants if someone in the household uses drugs--including pot, which Mayor Bloomberg himself has acknowledged enjoying in his flaming youth. The rule applies even if they don't know about the drug use or do all in their power to prevent it, and even if it takes place outside the apartment. The plaintiffs? Two grandmothers whose grandsons smoked pot, one mother whose mentally disabled daughter was found thirty blocks away using cocaine, and one elderly disabled man whose health attendant had a crack pipe. Patricia Williams and others have wondered out loud why Jeb and Columba Bush didn't have to vacate the Florida governor's mansion because of their daughter's drug problems. Is it only poor grandmothers who are expected to have perfect control of the young?

The same law of punishing all for the crimes of one, which HUD has titled "One Strike and You're Out," is being used against battered women who seek help from the police, only to find themselves threatened with eviction from public housing because the household was the site of "criminal activity"--the assault. Can you imagine the headlines if the management at Battery Park City tried to evict a woman because her husband beat her up?

In the words of that noted social theorist Jesus Christ: "For unto everyone that hath shall be given, and he shall have more abundance: but from him that hath not shall be taken away even that which he hath." He was speaking of spiritual riches, but these days his words seem to apply to material ones as well.

* * *

My apologies to Tricycle, which did indeed cover strife in Sri Lanka, contrary to my reckless assertion ("God Changes Everything," April 1). No apologies, however, for failing to include Billy Graham's thirty-year-old anti-Semitic remarks in my catalogue of sins of the cloth. Why do I suspect that had I given to that ancient evangelist the space I allotted to West Bank settlers, priestly molesters, Islamic fanatics, Hindu arsonists and murderers or other contemporary religious rampagers, Christopher Hitchens would have suggested I was ignoring current crises in favor of musty Nixoniana?

Now that the recommendations of George W. Bush's Social Security task force have been quietly shelved, it's time to recall that there are simple and equitable solutions available to deal with Social Security's potential future problems resulting from the retirement of millions of baby boomers.

Five years ago, four former senators--Alan Simpson of Wyoming, John Danforth of Missouri, David Pryor of Arkansas and I (two Democrats and two Republicans)--met on the campus of Southern Illinois University with the deputy chief actuary for Social Security. After looking at many possibilities, we recommended two changes:

First, all income should be taxed for payments into the Social Security Retirement Trust Fund. Today income up to only $84,900 is taxed. While the benefit payments are mildly progressive, the taxes are regressive. Most Americans pay more into Social Security than to the IRS. Covering all income would not only help Social Security, it would reduce the growing gap between those more fortunate and those less fortunate. If you earn $1 million a year, your increased tax would be less than $57,000. You could afford that. And you would pay it knowing that you are helping insure a more secure old age for your children and grandchildren.
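For the skeptical reader, the arithmetic behind that figure (assuming the increase refers to the 6.2 percent employee share of the payroll tax, applied to income above the current $84,900 cap):

$$(\$1{,}000{,}000 - \$84{,}900) \times 0.062 \approx \$56{,}736$$

just under the $57,000 cited.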

Second, the Consumer Price Index, which is used to measure inflation for the purpose of determining cost-of-living increases in Social Security benefits, should be corrected. If the price of beef goes up, more people buy chicken, but the cost of food in the index reflects the price of beef--substitution is not considered. Similarly, although drug costs have shot up, boosting the inflation rate, the index does not reflect that generic drugs can be substituted, lowering the cost of prescriptions slightly. If adjustments in the CPI are made, Social Security benefits could continue to rise with inflation, but the rate of increase would be slightly reduced.
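The mechanics, in stylized form (this is the textbook fixed-basket formula, not the Bureau of Labor Statistics' exact procedure): the index prices the base period's quantities at current prices,

$$I = \frac{\sum_i p_i^{\,\text{current}} \, q_i^{\,\text{base}}}{\sum_i p_i^{\,\text{base}} \, q_i^{\,\text{base}}}$$

Because the quantities are frozen at their base-period values, the index keeps pricing beef even after shoppers have switched to chicken, and so overstates the true rise in the cost of living.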

Adopting just the first of our proposals would bring in most of the funds needed to meet increased Social Security costs. Adopting both of them, according to the actuaries, would keep the retirement fund solvent for seventy-five years, barring an economic disaster. Neither of these proposals is popular. Political parties don't like to do unpopular things, but a bipartisan Congressional commission could reverse this. It should be created. The longer we wait, the more difficult it will become.

The alternative advocated by George W. Bush and his carefully rigged commission--that a portion of Social Security payments be designated for investments in the stock market--should be a nonstarter. It would be a bonanza for stockbrokers but could hurt most retirees. Has anyone tabulated the cost of auditing millions of private accounts? Does the performance of the stock market during the past two years argue for subjecting people already living on the margin to greater risk? And not simply the experience of the past two years. From December 31, 1964, to December 31, 1981, the Dow Jones average went up less than 1 percent for all those seventeen years. The inflation increase for that period was 95 percent. Interest on government bonds looks good compared with that performance.
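Combining those two figures (simple arithmetic on the numbers above, ignoring dividends, which the Dow price average also excludes):

$$\frac{1.01}{1.95} \approx 0.52$$

A dollar invested in the Dow at the end of 1964 commanded roughly half its original purchasing power by the end of 1981.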

The wildest scenario came from a former Reagan Administration economist, Martin Feldstein, who suggested that the federal government should guarantee these investments. I would love to have the government guarantee my investments, but the S&L bailout would look tiny compared with what that idea could cost.

Social Security has a problem. Let's face up to it and deal with it in a way that makes our tax system slightly more progressive.


Get set again for Liddy Dole.
She's back, to let the good times roll.
She's entering another race,
Her hair and diction all in place.
(Her hair is even more precise
Than that of Condoleezza Rice.)
Her problem is that she's been cast
As someone with a Beltway past.
Although she's Carolina bred,
She left her home to get ahead.
And now they say her luggage tag
Says DC--on a carpetbag.
"But ahm from heah," she'll say to all.
She'll say it in a Tarheel drawl.
The drawl alone should do the trick,
Unless she lays it on too thick--
Unless the voters say, in candor,
"We simply cannot understand her."

Israel's latest military offensive in the West Bank, code-named Defensive Wall, was met with fierce armed resistance, as Palestinians fought house to house and sometimes hand to hand to repulse the reconquest of their towns, villages and refugee camps. Some of the young defenders are guerrillas from new Palestinian militias forged by the intifada, others are Palestinian Authority police officers and many are both.

"This is our Karameh," said one in Jenin. Karameh, a village on the East Bank of the Jordan River, is the site of a battle fought between the Israeli army and Palestinian guerrillas in March 1968. Although the army took the village, the heroic resistance put up by the Palestinians consecrated Yasir Arafat and his Fatah movement as the undisputed leadership of the Palestinian cause. One year later Arafat was elected chairman of the PLO. He converted the movement from a front for Arab regimes into an authentic representative of Palestinian nationalism.

Many believe a similar changing of the guard has occurred during the eighteen months of the latest uprising, with leadership gradually passing from a Palestinian Authority that once ruled over the Palestinian areas to armed and cross-factional militias that now, alone, defend them. Formed in the uprising's first months as a defense against army and settler incursions, Fatah-led militias like the Popular Resistance Committees (PRC) in Gaza and the Al Aqsa Brigades in the West Bank have seen their power and legitimacy soar in inverse ratio to the collapse of the PA's governing and military institutions after a wave of Israeli assaults. As a result, former officers in PA police forces have swelled the militias' ranks.

This transformation has accelerated during Ariel Sharon's premiership. Following his election in February last year--and with Arafat's oblique blessing--the Palestinian armed factions united behind one policy: to destroy Sharon by creating a "balance of terror" with the occupation, a phrase borrowed from Hezbollah's triumphant resistance to Israel's occupation in south Lebanon. "We have to convince Israelis that whatever else Sharon brings them, it won't be security," says Jamal Abu Samhandanah, a PRC leader.

The strategy has exacted a brutal toll. Nearly 2,000 Palestinians and 400 Israelis have been killed in the current conflict, as Sharon's exclusively military solutions went from bombardment to reoccupation, and Palestinian resistance went from guerrilla warfare in the occupied territories to suicide bombings in Israel, executed recently as much by Fatah as by the Islamists of Hamas and Islamic Jihad.

The politics of Palestine's new young guard is as inchoate as the local militias that it comprises. But it opposes the PA-Israeli security cooperation and US-led diplomacy of the Oslo peace process, favoring instead armed struggle and alliances with the Arab world, including the million or so Palestinian citizens of Israel. One militia leader in Bethlehem said the most suitable response to Israel's current assault would be "resistance in Israel's cities and mayhem from the Galilee to Cairo."

Overwhelmingly from village and refugee backgrounds, the young guard is critical of PA mismanagement and corruption and of an Oslo leadership they believe reaped the spoils of the peace process without delivering on Palestinian aspirations to statehood, independence and Israeli withdrawal. But they are loyal to Arafat, and rarely more so than now: The army's siege on the Palestinian leader's compound in Ramallah is seen as a symbol of the plight of every Palestinian. "We think Arafat and all the leaders around him compromised too much in the negotiations. But as long as Sharon acts against him, we will be with Arafat. We will not let Israel decide the Palestinian leadership," says Samhandanah.

The young fighters are positioning for leadership in the post-Arafat era, whether this comes through his natural demise or through forced removal by Israel. The contours of the contest are already clear: between the historic Oslo leadership that seeks a negotiated settlement courtesy of US and international intervention, and a resistance vowing that the intifada will end only with independence, even if that means the destruction of what is left of the PA. Arafat has maintained his leadership by balancing between the two wings; he will side with the winner, say Palestinian analysts.

If Sharon succeeds in reimposing military rule throughout the occupied territories, the Palestinian national leadership will revert to what it was after Karameh, this time laced with a strong Islamist current. It will be young, underground, armed, refugee-based, perhaps more democratic and certainly more radical. It will take the Palestinian-Israeli conflict back three decades, and perhaps further.


MIA: WOMEN FIGHTING STATE TERROR

Jerusalem

We would like to thank Alexander Cockburn for his excellent March 25 "Beat the Devil" column, "The Nightmare in Israel." As activists in Ta'ayush (Arab-Jewish Partnership), we commend his giving voice to the courageous Jewish Israelis who are fighting against Sharon's state terror. However, we were disappointed that Cockburn did not mention any women. Ruchama Marton, for example, as founder and president of Physicians for Human Rights Israel, has been struggling against Israel's occupation and draconian policies in the territories for over a decade. Gila Svirsky is one of the leading activists in the Coalition of Women for a Just Peace; this group has been instrumental in raising public awareness and organizing protests and vigils over the past year and a half. Yehudith Keshet is one of the organizers of Machsom Watch, a group of women who stand witness at the various Israeli checkpoints around the country. These are just three of the many Jewish women who deserve recognition for speaking and acting out against the evil being perpetrated by the Israeli government. And while it is crucial at times like these to heed the voices of all progressive Jewish opposition, it is equally vital to recognize the participation of Palestinian citizens of Israel in the same struggles. Often forgotten in a conflict purportedly between "Arabs" and "Jews," Palestinians in Israel continue to play a foundational role in anti-occupation groups like Ta'ayush, PHR and Bat Shalom. We must insure that women and minorities are not, once again, elided from the historical record.

CATHERINE ROTTENBERG
SHIRA ROBINSON



DENMARK VESEY'S SLAVE REBELLION

Clinton, N.Y.

We encourage readers interested in the debate about black abolitionist Denmark Vesey to turn to the October 2001 and January 2002 William and Mary Quarterly (WMQ) rather than rely on Jon Wiener's misleading and error-ridden recapitulation of it ["Denmark Vesey: A New Verdict," March 11]. Wiener applauds the "stunning piece of historical detective work" by Michael Johnson, who contends that the Vesey plot in Charleston in 1822 was "not a plan by blacks to kill whites but rather a conspiracy by whites to kill blacks." In pinning superlatives on Johnson, however, Wiener neglects to disclose that he and Johnson have been close friends for almost thirty years, dating back at least to the 1970s, when they worked together at the University of California, Irvine. With the ethics of historians currently under a great deal of public scrutiny, we find this omission disingenuous at best.

We will have more to say about Johnson's novel interpretation in future publications. We do agree with Wiener that Johnson relies on the manuscript court records, although Wiener errs in implying that scholars writing prior to Johnson neglected to examine those documents. We also agree with Wiener that the coerced testimony of Carolina bondmen in the court records, like virtually all documents pertaining to slavery, should not "be taken at face value." Thus, we do not agree with him (or Johnson) that the manuscript court record "is the only authoritative contemporary source." As with any other surviving document about slave resistance, the court record must be evaluated in the context of other relevant sources. We do not believe that Johnson has adequately responded to our criticism in the WMQ. Indeed, given his energetic investment in denying black agency, we wonder what sort of evidence short of the second coming of Vesey himself with an admission of guilt on his lips would persuade Johnson that at least some of the slaves in Charleston in 1822 were planning to liberate themselves by force.

Johnson responded to his critics by concluding that Vesey and his followers were really the victims of a Machiavellian hoax perpetrated by James Hamilton Jr., the politically ambitious Charleston mayor. For the moment, we will merely say that the principals of 1822, white as well as black, were far more complicated than Johnson, or Wiener, seems to think.

DOUGLAS R. EGERTON, ROBERT L. PAQUETTE


WIENER REPLIES

Irvine, Calif.

Did Denmark Vesey plan what would have been the biggest slave uprising in US history? Or was he framed because of a rivalry between political factions of the slaveholding elite? For decades historians--including myself--have been teaching the former. Now there's important evidence that we may have been wrong. The evidence comes from Michael Johnson. It's true that he was my colleague at UC Irvine until he left for Johns Hopkins eight years ago and that we remain friends. But Southern history is a small field in which people tend to know one another, and I'm friends with those on both sides of this debate. These friendships are less significant for readers than the quality of the evidence about Vesey's trial and execution. Egerton and Paquette promise they'll tell us what they think about that "complicated" issue in "future publications." It's too bad they didn't give us more substance in this one. Meanwhile, Johnson's piece has just been honored by the board of editors of the William and Mary Quarterly, the leading journal of early American history, as its best article of 2001.

JON WIENER



TULSA'S (AND AMERICA'S) SHAME

Tulsa, Okla.

Adrian Brune's insightful and informative article "Tulsa's Shame" [March 18] shines light on a subject once cloaked in a conspiracy of silence. But Tulsa's shame--a horrible act of ethnic cleansing of blacks by the institutions of white power--is not solely Tulsa's or Oklahoma's but America's shame as well. After World War I white mob violence, often tacitly supported by local governments, flared up across the country. Chicago, Omaha and St. Louis were just some of the cities that experienced race riots similar to Tulsa's. The systematic violence and disfranchisement of blacks in Tulsa are not unique to this little oil town on the plains, even though its remoteness from national centers of political and cultural power makes it an easy target.

Tulsa may indeed be a conservative town on the buckle of the Bible Belt, but it is the only city, as far as I know, that has been courageous enough to take a good hard look in the mirror in its attempt to seek justice and reconciliation for its almost forgotten victims of racial violence during the pre-civil rights movement era. And it wasn't the legions of cosmopolitan journalists now swooping over the story that broke the silence but the voices of a few brave souls who cared enough about the riot's legacy to keep its history alive.

RUSSELL COBB


Tulsa, Okla.

I am of Native American descent and have lived in Tulsa my whole life, except for a tour in the Navy. The US government took the Oklahoma Territory, gave it away to whoever wanted it and drove Native Americans out of our lands and onto reservations with poor living conditions, then and now. The people who should be upset are the Native Americans, but you don't see Native Americans fighting or complaining that "you took my land." We were here first, but you don't see us wanting a memorial for something that happened long ago. So as far as I am concerned on the race riot, yes, it was a tragedy, the Oklahoma City bombing was a tragedy, September 11 was a tragedy, and I am, with the rest of the country, sorry for those people's losses. Life goes on. Always remember, never forget, but move on.

BRIAN RANDOL


Lexington, Ky.

Adrian Brune mentions that at the end of the Civil War blacks flocked to Oklahoma looking for a state to "call their own." But Tulsa was, as a part of Indian Territory, not a place for either blacks or whites to call their own--unless they were stealing it. Tulsa was part of the Creek Nation, given to the displaced Creeks for "as long as grass grows and water flows." I support reparations for the victims of the Tulsa race riot but find it appalling that there is no recognition that the entire eastern part of the state was created in an orgy of racism unparalleled in the history of our country.

MARGARET VERBLE


Boston

To clarify a statement toward the end of the article that "the Tulsa Metropolitan Ministries has raised about $20,000 toward reparations privately": The Unitarian Universalist Association contributed that $20,000 to the TMM, an interfaith coalition, to initiate a fund for the direct payment of reparations to riot survivors. The UUA has also contributed another $5,000 to help TMM set up antiracism programs.

THE REV. WILLIAM G. SINKFORD


Tulsa, Okla.

As one who has discussed this history with students, colleagues and fellow residents, I know that the riot remains highly controversial. The question is who, if anyone, is going to pay? The city? The state? The federal government? Tulsa is undergoing significant demographic change with the influx of immigrants from Latin America and Asia. Might it not be more effective to "correct" for past injustices by investing in public services and providing grants to innovative community groups, businesses and individuals to facilitate positive, progressive cultural relations across not just black-white but a number of racial and ethnic lines?

ANDREW WOOD


Bixby, Okla.

The plan backed by many Tulsans, a museum in the old Vernon AME church in Greenwood, and memorial and educational resources funded through the Greenwood Cultural Center, would do more than reparations could ever do. Why feed a few when you can teach a city's population to fish?

KIRK BJORNSGAARD



NEVER DROPPED THE WELFARE BALL

Washington, D.C.

Barbara Ehrenreich and Frances Fox Piven charge that think tanks, including the Economic Policy Institute, failed to look "ahead to the prospect of rising unemployment" in the context of welfare reform ["Who's Utopian Now?" Feb. 4]. That erroneous critique was disappointing coming from people we respect and had hoped would be more familiar with our work. EPI researchers have published many papers and developed mountains of data on the need to address welfare reform with labor market conditions in mind. We've highlighted the problems of unemployment and underemployment for low-wage workers. We've also shown that falling wages among low-wage workers reflected these problems, which are only worsened by forcing people into the labor market without any corresponding job creation programs.

When welfare reform was first debated, we produced a widely cited report that examined the potential for increased labor supply to outstrip demand and therefore lower wages (by 12 percent!) among low-wage workers. We provided statistics on wages, unemployment and underemployment among workers likely to have left welfare, to illustrate the weaknesses in the job market for former welfare recipients. We published frequent analyses on the labor market experiences of young minority women. We even added staff to try to insure that this work reached a large network of advocates and activists.

To accuse us of not noticing the relationship between the labor market and welfare reform is inaccurate and unfair. We have continued to examine the job prospects of former welfare recipients, even during the boom years. Our current work focuses on how the downturn is affecting former welfare recipients. We haven't once dropped the ball.

HEATHER BOUSHEY
LARRY MISHEL


"I made up my mind that Saddam needs to go," President Bush told Britain's ITV News as he prepared for the arrival of British Prime Minister Tony Blair Friday for weekend meetings at the presidential ranch in Crawford, Texas. Though recent violence on the West Bank and in Israel has shifted the focus of press attention to what Bush and Blair will have to say about that conflict, the president's blunt remark was a reminder that this meeting of allies was originally organized as a forum to explore how Saddam Hussein's Iraq could be made the next target of an expanding "war on terrorism."

Blair reportedly arrived in Crawford with plans to tell Bush that talk of launching a war on Iraq ought to be put on hold at least until the Israeli-Palestinian conflict calms. The question that remains is whether Blair will give Bush an honest report on British sentiments regarding plans for an eventual attack on Iraq by the U.S. and Britain. If the prime minister does that, the summit will not provide Bush with much in the way of encouragement.

It turns out that Blair, who has been the president's most enthusiastic international ally since the September 11 terrorist attacks on the World Trade Center and the Pentagon, has been having a very hard time making the case at home for British support of a U.S.-led attack on Iraq.


How cool is Jennifer Harbury? She is currently arguing her own case before the Supreme Court, demanding the right to sue the government because, she maintains, its leaders deliberately misled her about the murder of her husband, a Guatemalan rebel leader named Efrain Bamaca Velasquez who was killed in army custody during the counterinsurgency war in Guatemala in the early 1990s.

Harbury has a case. The State Department has confirmed that Col. Julio Roberto Alpirez, who was present during Bamaca's interrogation/murder, was a paid CIA asset. A CIA report alleges that Alpirez did the dirty deed himself. When then-State Department official Richard Nuccio informed Senator Robert Torricelli of that, Nuccio immediately found himself the target of a Justice Department investigation. A federal prosecutor accused him of betraying America by conspiring with Torricelli to blow Alpirez's cover, of destroying CIA officers' careers and of being an agent of the guerrillas. Although the United States brought no official charges or accusations, in a highly unusual move the CIA demanded that the State Department strip Nuccio of his security clearance, thereby depriving him of his livelihood. Harbury endured a thirty-two-day hunger strike to force those officials to come clean. She is now arguing that she could have saved her husband's life through the US court system had she known the truth during the period between his capture in March 1992 and his murder in 1993 or 1994.

A report by the President's Intelligence Oversight Board rejected the charge of deliberate lying by US officials but admitted that if the government had bothered to investigate "when Jennifer Harbury first raised the issue of her husband's fate" in the spring of 1992, the State Department "might have been able at a much earlier date to provide her with useful information." The key word here appears to be "useful."

Warren Christopher, Anthony Lake and the other Clinton Administration officials named by Harbury are probably right when they argue that leveling with her at the time would have made little difference in saving her husband's life. US courts do not have jurisdiction over the Guatemalan military (though US foreign policy officials often do). They also deny that they lied. But for procedural reasons, the ex-officials have to argue that regardless of whether they lied, a US citizen has no legal right to sue a public official who does lie. Solicitor General Theodore Olson filed an amicus brief arguing on behalf of the government's right to lie: "It is an unfortunate reality that the issuance of incomplete information and even misinformation by government may sometimes be perceived as necessary to protect vital interests," he maintains.

This particular case stinks for more reasons than can be precisely counted. In addition to the above, Bamaca was killed by a genocidal government that enjoyed the enthusiastic support of Ronald Reagan and George H.W. Bush. This is not only my opinion; it is the view of the Guatemalan Historical Clarification Commission's 1999 report, which condemns the United States for aiding a "criminal counterinsurgency" against the nation's indigenous Mayan population. America's Guatemala policy was anticommunism gone mad.

Moreover, if David Brock is to be believed, Olson is himself tainted by his lies to Congress. According to Brock's Congressional testimony, Olson lied during his confirmation hearings about his role in the Richard Mellon Scaife-funded "Arkansas Project," run out of the offices of The American Spectator and designed to undermine the Clinton presidency by any means necessary. What a surprise, therefore, that he thinks it's OK for the government to lie as well.

But the sorry truth is that the question of the government's right to lie is a lot more complicated than it looks. The Supreme Court has repeatedly enshrined in law the extremely provocative statement enunciated in the aftermath of the Cuban missile crisis by Assistant Secretary of Defense for Public Affairs Arthur Sylvester: "It's inherent in [the] government's right, if necessary, to lie to save itself." Dishonest officials have stretched the "national security" definition beyond recognition to protect not only thuggish murderers but also narrow political interests. But the principle itself is not wholly unsound. Although lies undermine the confidence in, and practice of, democracy, in the wake of the September 11 attacks, one can imagine circumstances in which a temporary lie might save lives without endangering the Constitution.

The problem is how to set enforceable limits. Government officials lie all the time. And while it is a crime to lie to Congress and to commit perjury, these acts are prosecuted in such a haphazard and nakedly political fashion that they can hardly serve as much of a deterrent. Lawrence Walsh's legitimate prosecutions of Reagan Administration officials who lied about matters of state were mocked by allegedly high-minded pundits like David Broder and George Will and overturned in a cowardly fashion by defeated President George H.W. Bush after the 1992 election.

Meanwhile, a fanatical cabal inside the Republican Party and Kenneth Starr's office manipulated these same laws to impeach President Clinton and disarm his popular agenda over a private lie not about a matter of state but a routine case of almost adultery. Given that hundreds of thousands if not millions of Americans have told this same type of lie to protect their families (or themselves) from humiliation, they saw this partisan gambit for what it was, punishing its perpetrators in the 1998 election. But the self-righteous pooh-bahs of the punditocracy--many of whom celebrated the Reagan-era liars and quite a few of whom told their share of adulterous lies--behave as if their hypocrisy were somehow patriotically inspired.

Jennifer Harbury continues to fight not only for justice for her husband but also for a reasonable definition of the government's right to lie. Bully for this brave woman who, despite her personal tragedy, takes democracy more seriously than its alleged protectors. She is a patriot to put the pundits to shame.

Six weeks ago, The Nation called for Army Secretary Tom White's resignation. White, former vice chairman of an Enron Ponzi scheme called Enron Energy Services (EES), was self-evidently not fit to bring sound business practices to the Pentagon. Since then, new revelations have created a bill of particulars against White serious enough to warrant probes by a federal grand jury and the Defense Department's Inspector General. White has stated that "if I ever get to the point...where the Enron business represents a major and material distraction...I wouldn't stay." That point has come. If White does not resign, he must be fired. The recent revelations show that White continues to practice the same squirrelly ethics that made Enron infamous. Since becoming Army Secretary, he has:

§ infuriated Republican Senator John Warner and Democrat Carl Levin of the Armed Services Committee by masking the full range of his Enron holdings;

§ violated his pledge to divest himself of those holdings, in accordance with ethics guidelines. After requesting an extension to sell his 405,710 shares, he finished dumping them in October, after a flurry of calls to executives at Enron and just before the SEC's public announcement of a formal investigation of the company, which caused the stock to tank. This has made White a target of a grand jury probe on insider trading. White says he was just commiserating with his former friends about Enron's troubles;

§ concealed those supposedly innocent contacts with Enron executives, failing to include them in response to a request by Representative Henry Waxman. White claims that he forgot to include the calls from his home phone;

§ misused a military plane to fly his wife and himself to Aspen, Colorado, where he completed the sale of his $6.5 million vacation house. This earned him an Inspector General's review of his past travel. Military transport is available only for official duty. White claims he had official business in Dallas and Seattle and that Aspen was directly between the two. He also states that he was required to fly a military plane as part of the Bush Administration's secretive continuity-in-government plan, which apparently requires top officials to fly military aircraft to resorts where they maintain mansions.

The more we learn of White's past at Enron, the worse it gets. EES cooked the books to register immediate earnings and profits, when in fact it was suffering hundreds of millions in losses--most of which were then secreted in Enron's notorious accounting scams. White has claimed that he knew nothing about improprieties at EES. But former EES employees interviewed by Dow Jones Newswires affirm that White was part of the scam. He signed off on the EES contracts that produced immediate paper profits and long-term real losses. He urged the sales force to make the company look like it was making money. He even participated in the notorious Potemkin Village trading floor, a fake trading room that EES threw together to impress visiting stock analysts. And then White walked off with millions, while investors were fleeced and the workers discarded. For conservative military analyst Eliot Cohen this alone is grounds for White's resignation, because he cannot profess the core military ethic of "mission" and "men" before self since "he was an integral part of an organization that violated those principles."

These days George W. Bush scarcely remembers his leading political patron, Enron CEO Ken "Kenny Boy" Lay. The President now poses as a champion of corporate accountability, calling for executives to be held personally responsible for their companies' financial statements. Yet he hasn't held his own Army Secretary personally responsible for his fraudulent actions at Enron and his misdeeds as Army Secretary. If White doesn't have the grace to go, he should be dismissed. The Army and the country would be better served if he defended himself from scandals past and present on his own time and with his own dime.

There are those opposed to the use of cloning technology to create human embryos for stem-cell research whose concerns emanate from commitments to social justice. One of their arguments runs as follows: The idea driving this medical research is that by creating an embryo through cloning, we can produce embryonic stem cells that are a perfect genetic match for a patient. All that is required to conduct the cloning is a skin cell from which to extract the patient's DNA and...a human egg.

Where, cry out the social justice advocates, are we going to get all these eggs for all these patients? Do the math, they suggest: 17 million American diabetics, needing anywhere from 10 to 100 eggs each, since the cloning technology is far from efficient...and even if you can pull that off, Christopher Reeve is still not walking, Michael J. Fox and Janet Reno still tremble and Ronald Reagan still doesn't remember who Ronald Reagan was. The social justice folk maintain that the billions of eggs required for embryonic stem cell therapies for the millions of Americans suffering from chronic and degenerative diseases will be obtained through exploitation of poor women in this country and the world over. Surplus value will take on an even more nefarious meaning.
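Their math, made explicit (using the figures just given, for diabetics alone):

$$17{,}000{,}000 \times 10 = 1.7 \times 10^{8} \qquad 17{,}000{,}000 \times 100 = 1.7 \times 10^{9}$$

somewhere between 170 million and 1.7 billion eggs, before a single other disease is counted.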

Still, the early results from embryonic stem-cell therapy in mice are so dramatic that not to pursue this medical research is recognized as morally obscene and just plain stupid. At the University of California, Dr. Hans Keirstead was able to implant neurological tissue derived from embryonic stem cells in a mouse with partial spinal cord injury so that after eight weeks, the mouse had regained most of its ability to walk and, of major significance to the quarter-million Americans suffering from this tragic condition, had also regained bladder and bowel control. Yet, the question remains, where are we going to get all those eggs?

A call to Stanford University's Paul Berg, a Nobel laureate who has been testifying to Congress on behalf of embryonic stem-cell research, helps elucidate the answer: When it comes to the research, he says, the quantity required may not be a problem. But if the desired therapeutic potential of embryonic stem cells is fully realized, the need for eggs will be great and could short-circuit the availability of these therapies. But a solution to that may be possible, Berg insists. If research is carried out that identifies the biochemicals in the egg directing the genetic material to develop into an embryo, then we could extract and fractionate those biochemicals and insert them into any skin cell, for example, for use in the cloning process. Voilà! A skin cell becomes an egg, and skin cells are plentiful.

The immediate enthusiasm for this breakthrough scientific idea, which could help Reeve walk again while simultaneously obviating the motive for an exploitative human egg market, is quickly tempered by the full realization of what Berg has explained: When we acquire the ability to use any cell as an egg, we will have removed another obstacle to achieving complete control over human reproduction. Admittedly, complete control over the production of reproduction will require a womb for gestation--but that ultimately should prove to be just another biochemical matter for extraction and fractionation.

This, then, is how it goes in biotechnology, the essential dynamic that simultaneously gives rise to medical hope and moral vertigo. Each step forward produces a new problem, the solution to which demands further control over the biological mechanism known as a human being. But this somehow impinges on human beings or some portion of ourselves that we value. To deal with the attendant moral quandaries, a method is found to isolate and duplicate the underlying molecular process. The moral quandary has thus been replaced by an extracorporeal biochemical process, no longer strictly identified as human, and therefore a process that no one can reasonably value apart from its use. The problem, as bioethicist Eric Juengst puts it, is that we could thereby successfully cope with every moral dilemma posed by biotechnology and still end up with a society none of us would wish to live in. For Francis Fukuyama, this is Our Posthuman Future, as he has titled his new book on the subject.

Fukuyama's most famous previous theoretical foray was to declare, in 1989, an end to history, whereby a capitalist liberal democratic structure represented the final and most satisfying endpoint for the human species, permitting the widest expression of its creative energies while best controlling its destructive tendencies. He imagined that ultimately, with the universal acceptance of this regime, the relativist impasse of modern thought would in a sense resolve itself.

But thirteen years after the end of history, Fukuyama has second thoughts. He's discovered that there is no end of history as long as there is no end of science and technology. With the rapidly developing ability of the biological sciences to identify and then alter the genetic structure of organisms, including humans, he fears the essence of the species is up for grabs. Since capitalist liberal democratic structures serve the needs of human nature as it has evolved, interference by the bio-engineers with this human nature threatens to bring the end of history to an end.

The aim of Our Posthuman Future is "to argue that [Aldous] Huxley was right," Fukuyama announces early on, referring to Huxley's 1932 vision of a Brave New World. Multiple meanings are intended by Fukuyama: The industrialization of all phases of reproduction. The genetic engineering of the individuals produced by that process, thereby predetermining their lives. The tyrannical control of this population through neurochemical intervention, making subservience experientially pleasurable. Fukuyama cites specific contemporary or projected parallels to Huxley's Hatchery and Conditioning Center, Social Predestination Room and soma. In Fukuyama's terms, the stakes in these developments are nothing less than human nature itself.

The first of the book's three parts lays out the case that the biotechnologically driven shift to a posthuman era is already discernible and describes some of the potential consequences. Prozac and Ritalin are precursors to the genomically smart psychotropic weapons of the near future. Through these drugs, which energize depressed girls and calm hyperactive boys, we are being "gently nudged toward that androgynous median personality, self-satisfied and socially compliant, that is the current politically correct outcome in American society." Standardization of the personality is under way. This is the area to watch, Fukuyama asserts, because virtually everything that the popular imagination envisions genetic engineering accomplishing is much more likely to be accomplished sooner through neuropharmacology.

Increased life spans and genetic engineering also offer mostly dystopic horizons, whereby gerontocracies take power over societies whose main purpose has become the precision breeding of their progeny. The ancient instincts for hierarchical status and dominance are still the most powerful forces shaping this new world born from biotechnology. Since, as Fukuyama sees it, science does not necessarily lead to the equality of respect for all human beings demanded by liberal egalitarianism, the newest discoveries will serve the oldest drives. We are launched on a genetic arms race.

But be warned: We may not arrive in that new world through some dramatic struggle in which we put up a fight. Rather, the losses to our humanity may occur so subtly that we might "emerge on the other side of a great divide between human and posthuman history and not even see that the watershed had been breached because we lost sight of what that [human] essence was."

If this terrible event is to be prevented, then the human essence, which Fukuyama correlates with human nature itself, must be identified and kept inviolable. But where is the line to be drawn around "human nature," a line to which we can all adhere so that we might reap the benefits of biotechnology while preventing the nightmare scenarios from ever coming to pass?

The entire world today wants the answer to this. Fukuyama promises to deliver it. But despite the clarity with which he announces his mission, the author advises his readers, "Those not inclined to more theoretical discussions of politics may choose to skip over some of the chapters here." Yet these are the very chapters containing the answer we all seek in order to tame the biotechnology beast! This, then, signals that we are entering dangerous ground, and we will need to bear with the author's own means of revealing his great discovery, which may be skipped over at our own peril.

In this heart of the book, titled "Being Human," Fukuyama first seeks to restore human nature as the source of our rights, our morality and our dignity. In particular, he wishes to rescue all these dimensions from the positivist and utilitarian liberal philosophers who, closely allied with the scientific community, have dominated the debate over biotechnology. According to the author, these philosophers assign rights everywhere and emphasize the individual as the source of moral concern. In doing so, they put humankind and its collective life at risk before the juggernaut of biotechnology. John Rawls and Ronald Dworkin, among others, have elevated individual autonomy over inherently meaningful life plans, claims Fukuyama, who then questions whether moral freedom as it is currently understood is such a good thing for most people, let alone the single most important human good.

Rather than our individual autonomy or moral freedom, Fukuyama wishes that we would attend to the logic of human history, which is ultimately driven by the priorities that exist among natural human desires, propensities and behaviors. Since he wishes us to shift ground to the logic of the inherent and the natural, he must finally define that core composing human nature:

The definition of the term human nature I will use here is the following: human nature is the sum of the behavior and characteristics that are typical of the human species, arising from genetic rather than environmental factors.

Later he will refine this further to the innate species-typical forms of cognition, and species-typical emotional responses to cognition. What he is really after is not just that which is typical of our species but that which is unique to human beings. Only then will we know what needs the greatest safeguarding. After hanging fire while reviewing the candidates for this irreducible, unique core to be defended, including consciousness and the most important quality of a human being, feelings, Fukuyama finally spills the beans:

What is it that we want to protect from any future advances in biotechnology? The answer is, we want to protect the full range of our complex, evolved natures against attempts at self-modification. We do not want to disrupt either the unity or the continuity of human nature, and thereby the human rights that are based on it.

So, where are we? It would seem we have gone full circle. Human nature is defined by...human nature! To the extent that it is capable of being located in our material bodies, it is all that arises from our genetics. Any attempt at greater precision is a violation of our unity or continuity--and threatens to expose the author's empty hand. Through such sophistry, Fukuyama wishes to assert mastery over any biotechnological innovation that he considers threatening, since he can now arbitrarily choose when it is disruptive of the unity or continuity of the human nature arising from our genetics. Even a heritable cancer could qualify for protection under Fukuyama's rubric for that which is to be defended from biotechnological intervention.

Indeed, there are those agreeing with Fukuyama's view of the biological bases of human social life who draw opposite conclusions about human bioengineering, viewing it as humanity's last best hope.

The remainder of the book is a potpourri of tactical suggestions (embedded in rhetoric cloned from Fukuyama's mentor in these matters, bioethicist Leon Kass) about which biotechnologies should be controlled, and about the need for both national and international bodies and systems to do so, if such control is to be effective. That, in the end, may be the most surprising aspect of the book. All this fervid philosophizing in reaction to fears about a Brave New World, fervently working toward the radical conclusion that what is needed is...regulation. Then again, recognition of the need for regulation might well come as a radical trauma to someone who has previously placed an overabundance of faith in the market.

But one would be foolish to believe that Fukuyama has gone all this distance simply to argue for what he refers to at one point as a more nuanced regulatory approach. In his most public engagement with biotechnology thus far, he has endorsed, written and testified to Congress on behalf of a bill that will not only ban human reproductive cloning but also ban nonreproductive cloning for stem-cell research. The legislation he supports would also make any doctor who utilizes or prescribes a treatment developed with cloning technology subject to ten years in prison and a $1 million fine. Under this legislation, then, if a cure or treatment for diabetes or heart failure is created in England that used embryo cloning to harvest stem cells for therapy, US physicians would not be allowed to have access to such treatments for their patients. This is his lesson in how moral freedom is not such a good thing compared with an inherently meaningful life plan. Let the fragile diabetic or spinal cord-injury victim learn the true value of our human nature from their catheterized bladders!

Fukuyama's entire brief depends upon avoiding the consequences of his own logic. Having identified the human essence with our biological human nature, he must evade any further specification or else the particular tissues, cells or molecules would be subject to further discussion and analysis as to whether or not they represent the human essence. Rather than discussion, we should trade in our autonomy and moral freedom for his protections. By the close of the book, any moral qualms on his part fall entirely by the wayside. Fukuyama is perhaps aware that he has failed to make his case except to those ready to believe. The book culminates in a final paragraph that is nothing less than a temper tantrum:

We do not have to accept any of these future worlds under a false banner of liberty, be it that of unlimited reproductive rights or of unfettered scientific inquiry. We do not have to regard ourselves as slaves to inevitable technological progress when that progress does not serve human ends. True freedom means the freedom of political communities to protect the values they hold most dear...

Nice rhetoric until we recall the values of the types of political regimes to which moral freedom and science must be sacrificed. While Fukuyama rails against the Brave New World, he takes the side of Huxley's World Controller, who explains, "Truth's a menace, science is a public danger...That's why we so carefully limit the scope of its researches."

There is an alternative to the fear that human nature must be inviolable because human nature cannot be trusted. We have seen imperious dictates against science and moral freedom delivered by philosophers before. In the recent past, we have evidence of very similar ideas in very similar language issuing from the philosopher whom Fukuyama draws upon for the epigraph beginning the first chapter of his book, Martin Heidegger. In the 1930s Professor Heidegger wanted science to serve the German essence, and it did. Now Professor Fukuyama wants science, and all of us, to serve the human essence, which he equates with his version of sociobiology infused with German romantic holism. Once more, we witness someone who would stop tyranny by imposing a tyranny of his own. Since Francis Fukuyama now sits on the President's Council on Bioethics, we should be grateful for the warning.

"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."

Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.

Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.

One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, early in the book Noble warns ominously that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next-to-last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.

The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he flatly asserts that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and "the bloom is off the rose." Later, he reverses course on a single page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind, two chapters earlier. Often, Noble reflects the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he registers the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.

In the end, one is provided with remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.

Moreover, Noble undercuts his own case by resorting to hyperbole and by failing to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he never tells us which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates, who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.

Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them is likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.

Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."

Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."

One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.

Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test-preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.

As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and in diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."

Noble does provide one strong cautionary tale--in the most informative chapter in the book--in his account of the involvement of UCLA's extension division with a commercial company called Onlinelearning.net. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years--projections that quickly plummeted to below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased their interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.

Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.

In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of those materials available online?

Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?

Noble's answer to most of these questions would undoubtedly be yes; he insists that anything less than "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers as he relayed them over the phone.) A student at UCLA who had unexpectedly taken an online course noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology of the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)

The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.

We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."

Here we are, twenty years on, and the reports of the Israeli army smashing its way through Palestinian towns remind me of what came out of Lebanon as Sharon and his invading army raced north. Israeli troops beating, looting, destroying; Palestinians huddled in refugee camps, waiting for the killers to come.

But there is a huge difference. Twenty years ago, at least for people living here in the United States, it was harder, though far from impossible, to get firsthand accounts of what was going on. You had to run out to find foreign newspapers, or have them laboriously telexed from London or Paris. Reporting in the mainstream corporate press was horrifyingly tilted, putting the best face on Israeli deeds. Mostly, it still is. But the attempted news blackout by the Sharon government and the Israeli military simply isn't working.

Here's Aviv Lavie, writing in Ha'aretz on April 2:

A journey through the TV and radio channels and the pages of the newspapers exposes a huge and embarrassing gap between what is reported to us and what is seen, heard, and read in the world.... On Arab TV stations (though not only them) one could see Israeli soldiers taking over hospitals, breaking equipment, damaging medicines, and locking doctors away from their patients. Foreign television networks all over the world have shown the images of five Palestinians from the National Security forces, shot in the head from close range.... The entire world has seen wounded people in the streets, heard reports of how the IDF prevents ambulances from reaching the wounded for treatment.

As always, there are the courageous witnesses. These days we have the enormously brave young people in the International Solidarity Movement sending daily communications back to the United States that flash their way round the Internet and even translate into important interviews in the mainstream media.

Meet a few of them. Here's Jordan Flaherty, filing this account on Indymedia:

Last night the Israeli Military tried to kill me. I'm staying in the Al Azzeh refugee camp, in Bethlehem, along with about twenty other international civilians. We're here to act as human shields.... On the hill above the camp is an Israeli military sniper's post. To get where we were staying in the village, most of us had to cross this street. It was a quick, low, dash across the street. As I ran, the sniper fired.... The shots began as I came into view, and stopped shortly after I made it to the other side. They were clearly aimed at me. And, by the sound of them, they were close. All night long, there was the sound of gun shots, as the military shot into our village. We stayed clear of the windows.... The guns and bullets were, no doubt, paid for by my tax dollars. Which is, of course, why we are here.

Or Tzaporah Ryter, filing this on Electronic Intifada:

I am an American student from the University of Minnesota. I currently am in Ramallah. We are under a terrible siege and people are being massacred by both the Israeli army and armed militia groups of Israeli settlers.... On Thursday afternoon, the Israeli army began sealing off each entrance to Ramallah.... Those traveling in began desperately searching for alternative ways and traveling in groups, but the Israelis were firing upon them and everyone was running and screaming.... Israeli jeeps were speeding across the terrain, pulling up from every direction and shooting at the women and children, and also at me...

Or the extremely articulate and self-possessed Adam Shapiro, whose testimony ended up in the New York Daily News and on CNN, where he told Kyra Phillips:

This is not about politics between Jew and Arab, between Muslim and Jew. This is a case of human dignity, human freedom and justice that the Palestinians are struggling for against an occupier, an oppressor. The violence did not start with Yasir Arafat. The violence started with the occupation.... Arafat, after every terrorist incident, every suicide bombing, after every action, has condemned this loss of life, of civilian lives on both sides. The Sharon government, sometimes will apologize after it kills an innocent civilian, but it does not apologize for raping the cities and for going in and carrying out terrorist actions, going house to house tearing holes through the walls, roughing up people, killing people, assassinating people.

Most of the time you open up a newspaper and read a robotic column--as I did the other day with the Los Angeles Times's Ronald Brownstein--about Palestinian terrorism and the wretched Arafat's supposed ability to quell the uprising with a few quick words. And then you turn on the NewsHour and there, of all people, is Zbigniew Brzezinski, stating the obvious, on April 1:

The fact of the matter is that three times as many Palestinians have been killed, and a relatively small number of them were really militants. Most were civilians. Some hundreds were children.... in the course of the last year, we have had Palestinian terrorism but we have also had deliberate overreactions by Mr. Sharon designed not to repress terrorism but to destabilize the Palestinian Authority, to uproot the Oslo Agreement, which he has always denounced, in a manner which contributed to the climate, that resulted in the killing of one of the two architects of the Oslo Agreement.

After predictable dissent from Kissinger, Brzezinski went on:

It's absolute hypocrisy to be claiming that Arafat can put a stop to the terrorism.... the fact of the matter is that his ability to control the situation would be greatly increased if there was serious movement towards political process, towards a political settlement and that the United States took the lead.

Between this brisk statement and the eloquent courage of Adam Shapiro and his brave fellow internationalists, the truth is getting out--not fast enough, not loud enough--but better than twenty years ago.

Recent days have brought the first tentative but welcome roadblocks to the Bush Administration's war-fevered assault on civil liberties. In Newark, Superior Court Judge Arthur D'Italia, calling secret arrests "odious to a democracy," ordered the Immigration and Naturalization Service to release the names of post-September 11 detainees to the New Jersey ACLU. Although the release of the names has been delayed pending a federal government appeal, Judge D'Italia's courageous ruling is a significant victory for constitutional principle.

Meanwhile, a Freedom of Information Act lawsuit requesting the names of detainees and related information continues in federal court in Washington, DC. The Center for National Security Studies, which is representing numerous civil liberties and media outfits (including The Nation), recently filed a brief supporting its own request for this material. The Administration is expected to file a response in mid-April, and oral arguments could take place weeks after that.

Some of the most important restraints are being applied from abroad. In the case of accused hijacking conspirator Zacarias Moussaoui, Attorney General John Ashcroft confronted a clear choice between his domestic political goal of advancing the death penalty and the international goal of a palatable legal campaign against Al Qaeda. Ashcroft tried to have it both ways, committing himself to capital charges against French citizen Moussaoui while seeking further cooperation in the case from the anti-capital-punishment French. The result: French officials have publicly promised to withhold any evidence that might lead to a death sentence. Ashcroft's prosecutors are on the hook, stuck with a sketchy conspiracy case requiring proof that the silent Moussaoui, grounded in jail on September 11, was a sufficiently active and knowledgeable architect of Al Qaeda's World Trade Center and Pentagon attacks to merit a capital conviction.

The sense that European human rights commitments are having an impact despite the Administration's guff was even clearer when Defense Secretary Donald Rumsfeld effected a brief reverse-rudder from George W. Bush's military tribunal order. This past fall the Administration fumed when UN Human Rights High Commissioner Mary Robinson denounced the tribunal order, but Rumsfeld's new plan to require public trials, unanimous votes for a death sentence and additional layers of review was clearly designed to quiet international criticism. Rumsfeld's amendment to the Bush order does not get to the heart of the constitutional problem, however. The tribunals still represent an unlawful seizure of judicial authority by the President. And Rumsfeld's March 28 admission that prisoners in Guantánamo Bay could stay locked up even if acquitted of charges reveals what is in effect a permanent policy of internment without trial, a policy made possible only by the Al Qaeda prisoners' continued status in a no man's land between the Geneva Conventions and US law. This policy has European allies nearly as alarmed as they are about the tribunals.

There are two important lessons here. One is that in the months since September 11, players in the federal system of checks and balances--Congress and federal judges--have largely failed to resist the Bush Administration's constitutional power grab. The second lesson, however, is that resistance is more than possible. With state courts sometimes better guardians of civil liberties than are the Rehnquist-era federal courts, and with Europe deeply invested in its Continental human rights covenants, it may turn out that local and transnational coalitions will become an effective means of preserving civil liberties. State courts at home and allies abroad are turning out to be the most compelling protectors of the essential American values the Bush Administration has been ready to sell down the river.

Even as the Middle East plunged deeper into the maelstrom of fear, hatred, violence and despair, recent diplomatic developments, ironically, made the conditions for achieving peace tantalizingly real. Most notable was the declaration by the Arab nations meeting in Beirut of their willingness to recognize Israel, after fifty years of denying its right to exist.

Predictably brushing aside the Arab vision, Prime Minister Ariel Sharon revved up his tanks for a military solution, declaring a "state of war" with the Palestinians and invading the West Bank. Their nation, he told Israelis, is "at a crossroads." So it is. The road down which Sharon is taking them, with a green light from the Bush Administration, leads to more deaths, more brutalizing of civilians, more violations of human rights--and answering violence and anger by the Palestinians, with more suicide bombers making barbarous war on Israeli civilians.

The other road, the road to peace, leads toward the goal, articulated anew in Beirut, of a complete Israeli withdrawal from the West Bank and Gaza, making way for a Palestinian state with its capital in East Jerusalem, in return for normalization of relations with the Arab states.

At Beirut, in another step for regional peace, the Arab nations brokered Iraq's recognition of Kuwait's sovereignty, called for an end to UN sanctions, for dialogue between Baghdad and the UN and for the elimination of weapons of mass destruction in the region. Most significant, they expressed united opposition to US military action against Iraq. This move undercut the Bush Administration's political rationale for attacking Iraq and in effect denied it the regional bases and logistical support essential to success in such a war.

Faced with those impediments, the Administration put its Iraq plans on hold and turned to the crisis in the Middle East. But Bush seems more concerned with maintaining the domestic political momentum of his war on terror (just as Sharon is driven by the harder-line challenge on his right from Benjamin Netanyahu) than he is with making the tough political choices that would lead to a just settlement.

Bush still could choose the road to peace, whether or not Sharon takes that same road. He could reaffirm the US commitment to the March 12 UN resolution calling for a two-state solution, adopt the Arab peace scheme as a vision that could give Palestinians hope of an independent state, re-endorse the recent Security Council resolution calling for Sharon's withdrawal from Palestinian cities and dispatch the Secretary of State to the region with a plan that guarantees Israel's security, calls for abandonment of Israeli settlements and the withdrawal of Israel to 1967 borders--a plan backed up by firm promises of monitors and the material and financial resources necessary to make it work.

Does Bush have the courage to take that road--to risk the prestige of his Administration and the resources of the United States in the cause of achieving a just peace? The State Department mildly criticized Sharon's incursion, but Bush seems to be giving Sharon free rein. The only hope is for the thus-far-small opposition in this country to build pressure on him and make clear that Sharon's way condemns both Israelis and Palestinians to more suffering and bloodshed. We in America must add our voices to the burgeoning protests in the Middle East, Europe and Asia and to the eloquent warnings voiced by UN Secretary General Kofi Annan, Pope John Paul II and other world leaders. Only a US-led third-party intervention can forge a settlement that will end the violence.