
June 10, 2002 Issue

  • Editorials

    The Warning Game

    The question is not the 1970s cliché, What did the President know and when did he know it? The appropriate query is, What did US intelligence know--and what did the President know and do about that? The flap over the August 6, 2001, intelligence briefing of George W. Bush--in which he was told that Osama bin Laden's Al Qaeda network was interested in hijackings and looking to strike the United States directly--should not have focused on whether the President ignored that information and missed the chance to prevent the September 11 strikes. Still, a political dust-up ensued, as the White House, overreacting to the overreaction of the Democrats, went into full spin mode. The crucial issue was broached when National Security Adviser Condoleezza Rice stated, "I don't think anybody could have predicted that these people would take an airplane and slam it into the World Trade Center."

    Actually, it was predicted, and the recent hullabaloo called attention to the sad fact that the Clinton and the Bush II national security establishments did not heed hints going back to 1995. In that year a terrorist arrested in the Philippines said bin Laden operatives were considering a plot to bomb airliners and fly a plane into CIA headquarters--information shared with the United States. Two weeks before that arrest, Algerian terrorists linked to Al Qaeda hijacked a plane, hoping to crash it into the Eiffel Tower (French commandos killed the hijackers at a refueling stop).

    From 1995 on, US intelligence and the military should have taken steps to detect and prevent a 9/11-like scheme. There was enough information in the system to cause the US air command to draw up plans for dealing with an airliner-turned-missile and to prompt the CIA and the FBI (and other intelligence outfits) to seek intelligence related to plots of this type. Apparently nothing of the sort happened. Not even when terrorism experts continued to raise airliner attacks as a possibility. In 1998 terrorism analysts briefed Federal Aviation Administration security officials on scenarios in which terrorists flew planes into US nuclear plants or commandeered Federal Express cargo planes and crashed them into the World Trade Center, the Pentagon, the White House, the Capitol and other targets. In 1999 a report prepared for the National Intelligence Council noted that Al Qaeda suicide bombers could fly an aircraft filled with explosives into the Pentagon, CIA headquarters or the White House.

    In 2001 the FBI--not looking for signs of a suicide-bombing plot--failed to recognize the significance of information its agents received while investigating foreign students at a Phoenix flight school and Zacarias Moussaoui, a French national enrolled in a Minnesota aviation school, later charged with participating in the 9/11 conspiracy. In July Italian authorities warned the United States that bin Laden agents might try to attack Bush and other Western leaders at the Genoa summit using an airliner.

    True, these leads were small pieces of data among the massive amounts of material swept up by the sprawling intelligence system. But what's the point of spending more than $30 billion annually on spies and high-tech eavesdropping if the system can't sort out the valuable nuggets? Hindsight is indeed easy. The Bush and Clinton administrations, based on what's now known, don't deserve to be faulted for not discovering the 9/11 plot. But both failed to oversee the intelligence and law-enforcement communities and make sure they were pointed in the right direction.

    There is evidence that the Bush team didn't move quickly on the counterterrorism front. Newsweek reported that Attorney General John Ashcroft prodded the FBI to concentrate on violent crime, drugs and child porn more than on counterterrorism (a story the Justice Department denied). And Defense Secretary Donald Rumsfeld threatened to veto a move that shifted $600 million from the anti-ballistic missile program to antiterrorism. Was there a counterterrorism policy delay? Other questions linger. In July 2001 Richard Clarke, then the National Security Council official in charge of counterterrorism, put out an urgent alert, placing the government at its highest state of readiness for a possible terrorist attack. The alert faded six weeks later. What triggered it? What caused the stand-down? Should there have been a follow-up?

    The multiple failures of policy, imagination and coordination over two administrations should be investigated. To assign blame? Accountability does have its place in a democracy. The public has a right to know who messed up and to be assured that those who did aren't in a position to commit further mistakes. The point, of course, is to learn from those mistakes and to be able to tell the public the failures have been addressed. Does the intelligence system deserve more billions, as Bush has requested, without demonstrating that it can use the money wisely?

    After 9/11 the Bush Administration didn't rush to examine what went wrong. We're too busy fighting the war, it said, while urging Congress not to pursue the matter. Belatedly, Congress authorized a joint investigation by the House and Senate intelligence committees, two panels that traditionally have been cozy with the intelligence crowd. That probe has gotten off to a terrible start--the investigators fighting among themselves over whether to examine government failures or to concentrate on how best to reorganize the intelligence system and accusing the CIA and the Justice Department of not cooperating. One positive consequence of the maelstrom over the August 6 briefing is that it has prompted more calls for an independent commission, which Senators John McCain and Joseph Lieberman have been advocating. Yet so far no inquiry is committed to mounting a no-holds-barred examination and to conducting as much of it as possible in public.

    "I don't have any problem with a legitimate debate over the performance of our intelligence agencies," said Vice President Cheney. But he has opposed sharing the August 6 briefing with Congress. How can there be worthwhile debate without information? After all, the recent tussle began when the press sensed that the White House had withheld a significant--or intriguing--fact. And how can there be information without investigation? The issue is not what Bush knew--but why he didn't know, and whether his Administration took sufficient steps before and after that awful day to deal with the failings of the agencies that are supposed to thwart and protect.

    David Corn

  • In Fact…


    We join the city of Chicago in congratulating our friend and colleague Studs Terkel on his ninetieth birthday. By mayoral proclamation, May 16 was Studs Terkel Day, and there was a star-spangled celebration at the Historical Society. Given Studs's gift for talking--and listening--the Windy City was the right place for him to grow up. He's a man of many words--gales of eloquent ones on TV, radio, in books, speeches. His oral history books, including, most recently, Will the Circle Be Unbroken? (New Press), stitch together a great patchwork quilt of hundreds of American lives. Hail to our national griot with a tape recorder.


    Jeff Chester writes: Both the Washington Post and the Wall Street Journal recently delivered off-the-mark appraisals of media colossi AOL Time Warner and Vivendi Universal, gloomily stressing their huge quarterly losses while ignoring the longer-term implications of the old media's incursions into the new media landscape. The US mainstream press keeps a narrow focus on the role of "synergies" in the media marketplace. But it's not the opportunity to turn last summer's blockbuster movie into this fall's TV hit or fast-food giveaway trinket that drives merger mania. The real prize is locking up key segments of the broadband cable delivery platform and digital TV spectrum, which will loom large in the media empires of the twenty-first century. Also off the mainstream press's radar screens: the enormous lobbying campaigns that helped create the deregulatory environment that will allow a handful of corporate giants to wield unprecedented power in the media marketplace, synergy or no.


    From Dave Lindorff: Pennsylvania, which has one of the most open, deregulated wholesale electricity markets in the world, was the scene of a market ripoff by PP&L, one of the two dominant electricity generating companies in the state. What the company was doing was similar to Enron's tactics in California: holding back generation of power during peak load periods. Under Pennsylvania utility regulations, the power distribution companies competing for retail business have to pay a "deficiency payment" for not meeting their contractual delivery of power. What made this nice for PP&L was that these deficiency payments had to be paid to the generator, i.e., PP&L. This market dominance netted the company $11.7 million during a six-week cold snap in January-February 2001. But the reason the company got caught (it's currently being investigated by the state's public utility commission, which may hand over the case to the state or the Feds for prosecution) is that NewPower, one of the new firms competing for retail electricity customers, complained. Why did NewPower, alone among PP&L's customers, figure out what was afoot? Because NewPower is a subsidiary of Enron, which was doing the same thing to other power distribution companies in California at the same time!


    Mica Rosenberg writes: The use of offensive Native American stereotypes (e.g., the Atlanta Braves' "tomahawk chop") as mascots is nothing new. But one small sports team has held a mirror up to the mascot problem. Solomon Little Owl, director of Native American Student Services at the University of Northern Colorado, started an intramural basketball team called the Fighting Whites. The team is protesting nearby Eaton High School's team name--the Fightin' Reds. Eaton's mascot, a caricature Indian with a misshapen nose, wears a loincloth and eagle feather. The Fightin' Whities, as they are affectionately known, have as their mascot a cheesy 1950s-style caricature of a middle-aged white guy over the phrase "Everythang's gonna be all white!" Do they think their satire will convince Eaton school officials to abandon the offensive icon? Charles Cuny, a 27-year-old Indian on the team, says, "Going to the school board is like going to Congress and asking for our land back--it's not going to happen." But T-shirt sales are soaring.


    Martin Austermuhle writes: Stephen Jones, a student teacher of social studies from the graduate program at the state university, was removed from his job at the high school in Old Town, Maine, after parents complained to the school board that his teaching of Islamic history threatened their children's religious upbringing. Jones was using selections from the Bible, the Torah and the Koran to combat the stereotypes he encountered in his tenth-grade world history class (to the question "What is Islam?" students had responded, "crazy terrorists," "dirty," "camels"). His unexplained removal was a shock to Jones, whose lesson plans had been approved by the school's social studies teacher and principal. James Dill, chairman of the school board, said "a couple of board members told me in passing that they thought there should be more separation between church and state...maybe there was some teaching of religion going on that may have been out of place." School officials bucked any comment to Jones's university, which claimed a vow of silence under student privacy laws. Says Jones, "I'm willing to learn something from experience, but I'm concerned about what these kids are learning. If they can become informants, if accused people can't have due process, if the approved course of events involves secret decision-making, no appeal and the teacher disappears, that doesn't smell like democracy. It smells very different." It smells, period.


    According to the Wall Street Journal, after five years and $929 million in federal funding, drug czar John Walters has discovered that antidrug ads don't work--teen drug use is as high as ever, and the ads may actually encourage young kids to try pot. But as Cynthia Cotts notes in the Village Voice, that piece of news wasn't widely reported; it didn't even rate a mention in the New York Times or the Washington Post--both of which profited handsomely from running the ads.


    Who elected him? George W. Bush at a Florida rally for Jeb: "Mr. Castro, once--just once--show that you're unafraid of a real election. Show the world you respect Cuban citizens enough to listen to their voices and to count their votes."

    the Editors

  • September 11 Questions

    George W. Bush, it is true, did not create the FBI's smug, insular, muscle-bound bureaucracy or the CIA's well-known penchant for loopy spy tips and wrongheaded geopolitical analysis. But the President is now in the political cross-hairs for the failures of these agencies in identifying and understanding terrorist threats. And what's wrong with that? Bush is President, after all, and it is mildly amusing to hear the conservative claque plead excusable ignorance or the complexities of governing as his alibi. The trouble is, this failure is too serious to amuse. The ineptitude preceding September 11 arguably heightens the gravest, most immediate threat to national security because, while the dangers may lurk in the twilight zone, they can, as we learned, turn real. Yet this nation is relying on two intelligence agencies that don't even wish to talk to each other--and that not only failed to anticipate September 11 but that have also failed to locate Osama bin Laden, the man George W. Bush said he wanted "dead or alive," or to identify the anthrax killer.

    Instead of expressing a little executive impatience, even anger at possible misfeasance, this President responds, once again, by calling for more secrecy in government, more silence from his critics. And we're not the only ones to suspect a connection between the cascade of Administration warnings about new threats and its wish to turn the public's gaze away from its shortcomings.

    The imperative now is to get a down-to-business accounting of the negligence or inertia that preceded September 11--a systematic inquiry that is not a headhunting exercise but could begin the long-overdue reformation of FBI and CIA operating practices. Whether this is the Congressional investigation already under way or a new independent commission such as Senator Daschle wants, the results will be persuasive only if the public learns a lot more, not less, about how to cope with this new era of shadowy threats. Also needed are elected officials willing to ask the Administration tough questions--fearlessly, in public forums, with no thought as to whether Dick Cheney will brand them as "irresponsible."

    If Bush were a leader of more substance, he would understand that a thorough ventilation is in his self-interest, both politically and otherwise. His green-yellow-red warning code is already a joke. Should terrorists indeed attack again, a rattled populace may begin to wonder, What did the President know? Where was the Vice President hiding? If Americans are going to have to live with uncertainty for a long time, then the government owes them a grown-up conversation on the complexities, what is known and knowable, what is not. People can handle straight talk, but that's not what they are getting.

    This President used last fall's tragedy to pump himself up as the resolute warrior who tossed complexity into the trash can. Bush's I'm-gonna-get-you rhetoric described an open-ended series of battlefields ahead and did wonders for his ratings. But the complicated counterrealities have already blurred that picture, just as the recent revelations greatly diminish his luster as the straight-talking cowboy. Now he wants Americans to appreciate the gray areas and accept that some facts are unknowable. And please, don't ask any more questions of your leader, because it's unpatriotic.

    Just one question, Mr. President: What else didn't you tell us after September 11?

    the Editors

  • The Israel Lobby

    On May 2 the Senate, in a vote of 94 to 2, and the House, 352 to 21, expressed unqualified support for Israel in its recent military actions against the Palestinians. The resolutions were so strong that the Bush Administration--hardly a slouch when it comes to supporting Israel--attempted to soften its language so as to have more room in getting peace talks going. But its pleas were rejected, and members of Congress from Joe Lieberman to Tom DeLay competed to heap praise on Ariel Sharon and disdain on Yasir Arafat. Reporting on the vote, the New York Times noted that one of the few dissenters, Senator Ernest Hollings of South Carolina, "suggested that many senators were after campaign contributions."

    Aside from that brief reference, however, the Times made no mention of the role that money, or lobbying in general, may have played in the lopsided vote. More specifically, the Times made no mention of the American Israel Public Affairs Committee. It's a remarkable oversight. AIPAC is widely regarded as the most powerful foreign-policy lobby in Washington. Its 60,000 members shower millions of dollars on hundreds of members of Congress on both sides of the aisle. It also maintains a network of wealthy and influential citizens around the country, whom it can regularly mobilize to support its main goal, which is making sure there is "no daylight" between the policies of Israel and of the United States.

    So, when Congress votes so decisively in support of Israel, it's no accident. Yet, surveying US newspaper coverage of the Middle East in recent months, I found next to nothing about AIPAC and its influence. The one account of any substance appeared in the Washington Post, in late April. Reporting on AIPAC's annual conference, correspondent Mike Allen noted that the attendees included half the Senate, ninety members of the House and thirteen senior Administration officials, including White House Chief of Staff Andrew Card, who drew a standing ovation when he declared in Hebrew, "The people of Israel live." Showing its "clout," Allen wrote, AIPAC held "a lively roll call of the hundreds of dignitaries, with individual cheers for each." Even this article, however, failed to probe beneath the surface and examine the lobbying and fundraising techniques AIPAC uses to lock up support in Congress.

    AIPAC is not the only pro-Israel organization to escape scrutiny. The Conference of Presidents of Major American Jewish Organizations, though little known to the general public, has tremendous influence in Washington, especially with the executive branch. Based in New York, the conference is supposed to give voice to the fifty-two Jewish organizations that sit on its board, but in reality it tends to reflect the views of its executive vice chairman, Malcolm Hoenlein. Hoenlein has long had close ties to Israel's Likud Party. In the 1990s he helped raise money for settlers' groups on the West Bank, and today he regularly refers to that region as "Judea and Samaria," a biblically inspired catch phrase used by conservatives to justify the presence of Jewish settlers there. A skilled and articulate operative, Hoenlein uses his access to the State Department, Pentagon and National Security Council to push for a strong Israel. He's so effective at it that the Jewish newspaper the Forward, in its annual list of the fifty most important American Jews, has ranked Hoenlein first.

    Hoenlein showed his organizing skills in April, when he helped convene the large pro-Israel rally on Capitol Hill. While the event itself was widely covered, Hoenlein, and the conference, remained invisible. An informal survey of recent coverage turned up not a single in-depth piece about Hoenlein and how he has used the Presidents Conference to keep the Bush Administration from putting too much pressure on the Sharon government.

    Why the blackout? For one thing, reporting on these groups is not easy. AIPAC's power makes potential sources reluctant to discuss the organization on the record, and employees who leave it usually sign pledges of silence. AIPAC officials themselves rarely give interviews, and the organization even resists divulging its board of directors. Journalists, meanwhile, are often loath to write about the influence of organized Jewry. Throughout the Arab world, the "Jewish lobby" is seen as the root of all evil in the Middle East, and many reporters and editors--especially Jewish ones--worry about feeding such stereotypes.

    In the end, though, the main obstacle to covering these groups is fear. Jewish organizations are quick to detect bias in the coverage of the Middle East, and quick to complain about it. That's especially true of late. As the Forward observed in late April, "rooting out perceived anti-Israel bias in the media has become for many American Jews the most direct and emotional outlet for connecting with the conflict 6,000 miles away." Recently, an estimated 1,000 subscribers to the Los Angeles Times suspended home delivery for a day to protest what they considered the paper's pro-Palestinian coverage. The Chicago Tribune, the Minneapolis Star Tribune, the Philadelphia Inquirer and the Miami Herald have all been hit by similar protests, and NPR has received thousands of e-mails complaining about its reports from the Middle East.

    Do such protests have an effect? Consider the recent experience of the New York Times. On May 6 the paper ran two photographs of a pro-Israel parade in Manhattan. Both showed the parade in the background and anti-Israel protesters prominently in the foreground. The paper, which for weeks has been threatened with a boycott by Jewish readers, was deluged with protests. On May 7 the Times ran an abject apology. That caused much consternation in the newsroom, with some reporters and editors feeling that the paper had buckled before an influential constituency. "It's very intimidating," said a correspondent at another large daily who is familiar with the incident. Newspapers, he added, are "afraid" of organizations like AIPAC and the Presidents Conference. "The pressure from these groups is relentless. Editors would just as soon not touch them."

    Needless to say, US support for Israel is the product of many factors--Israel's status as the sole democracy in the Middle East, its value as a US strategic ally and widespread horror over Palestinian suicide bombers. But the power of the pro-Israel lobby is an important element as well. Indeed, it's impossible to understand the Bush Administration's tender treatment of the Sharon government without taking into account the influence of groups like AIPAC. Isn't it time they were exposed to the daylight?

    Michael Massing

  • McKinney Redux

    During the long months of post-September 11 presidential invincibility, no member of Congress climbed further out on the what-did-Bush-know-when limb than Representative Cynthia McKinney. "We know there were numerous warnings of the events to come on September 11," the Georgia Democrat said in March. "What did this Administration know and when did it know it, about the events of September 11? Who else knew, and why did they not warn the innocent people of New York who were needlessly murdered?"

    The disclosure that President Bush was warned in August that Al Qaeda was seeking to hijack domestic aircraft did not confirm all McKinney's intimations--which extended to talk about how the Bush family might have profited from the attacks. Yet she was freed to stake a claim of vindication. "It now becomes clear why the Bush Administration has been vigorously opposing Congressional hearings. The Bush Administration has been engaged in a conspiracy of silence. If committed and patriotic people had not been pushing for disclosure, today's revelations would have been hidden by the White House."

    McKinney's initial calls for an investigation of what Bush knew prompted a storm of criticism. "McKinney has made herself too easy a target for mockery," Atlanta Journal-Constitution editorial page editor Cynthia Tucker announced in April. "She no longer deserves serious analysis." After Bush aides condemned McKinney's "ludicrous, baseless views," National Review Online editor Jonah Goldberg diagnosed her as suffering from "paranoid, America-hating, crypto-Marxist conspiratorial delusions." Barely a month after the McKinney-bashing peaked, however, the Journal-Constitution headline read: "Bush warned by US intelligence before 9/11 of possible bin Laden plot to hijack planes," while Senate Intelligence Committee vice chairman Richard Shelby, an Alabama Republican, said, "I believe, and others believe, if [information on threats] had been acted on properly, we may have had a different situation on September 11."

    There were no apologies to McKinney. Brushing aside complaints from Atlanta civil rights activists, Georgia Senator Zell Miller continued to characterize his fellow Democrat as "loony." McKinney's critics kept exploiting the opening she gave them with her unfounded rumination on the prospect that something other than ineptness might explain the Administration's failure to warn Americans about terrorist threats. But her willingness to go after the Administration when few Democrats dared earned her folk-hero status among dissenters from the Bush-can-do-no-wrong mantra: The popular website now greets visitors with a We Believe Cynthia icon.

    In Georgia, where McKinney faces a July primary challenge from a former judge who labels her "off-the-wall and unproductive," a recent Journal-Constitution headline read, "Revelations Give Boost to McKinney." Letters to the editor, even from former critics, hail her prescience. And Georgia Democratic Representative John Lewis, who once steered clear of McKinney's call for an investigation, says, "I hate to put it in this vein, but she may have the last laugh."

    John Nichols

  • Dire Consequences

    This soon-to-be-classic Ed Sorel cartoon is available only in our print edition. Sorry!

    Edward Sorel

  • As the Press Turns

    Quick, pinch me--am I still living in the same country? Reading and watching the same media? This "Bob Woodward" fellow who co-wrote a tough piece in the May 18 Washington Post demonstrating that the now-famous August 6 presidential daily briefing, contrary to Administration officials' claims about its contents, actually carried the heading "Bin Laden determined to strike in U.S."--is this the same Bob Woodward who co-wrote the Post's infamous "Ten Days in September" series earlier this year, the ur-document of George W. Bush's Churchillization? And this "Michael Isikoff," sharing a byline on the eye-opening May 27 Newsweek cover story that shreds the Administration's "we did everything we could" line of defense--is this the Isikoff who four years ago defined national security in terms of dress stains and cigar probes? One begins to suspect that unbeknownst to all of us, the terrorists have indeed struck--the Washington, DC, water supply.

    An overstatement, to be sure. But it does seem to be the case that wherever this potentially incendiary story leads, from fog of unprovables to hot smoking gun, one change has already taken place because of it that is well worth marking. For the first time since September 11--or, arguably, since ever--the press corps appears ready to expend more effort poking holes in the vaunted Bush Administration spin operation than admiringly limning it. More to the point, Is a new skepticism stirring around such heretofore Teflonized officials as National Security Adviser Condoleezza Rice? Before her May 16 damage-control press conference, Rice was probably the Administration's leading untouchable. After it ("I don't think anybody could have predicted these people would...use an airplane as a missile," a statement left bleeding on the floor after a pile of evidence came forward showing plenty of people were predicting precisely that), her status has taken a major hit. So, as Professor Harold Hill might put it, certain wooorrrrdds are creeping into the media vocabulary--words like "serious credibility gap," in the Newsweek piece.

    It's been a long time coming. If anything "un-American" happened after September 11, it was the triumph of the notion--propounded by the Bushies, reinforced by the major media and far too readily accepted by cowardly Democrats--that "patriotism" somehow equals "support the Bush Administration." CBS's Dan Rather said it recently in an interview with the BBC: "Patriotism became so strong in the United States after 11 September that it prevented US journalists from asking the toughest of the tough questions about the war against terrorism," adding, "I do not except myself from this criticism." The genuflection sometimes reached levels that we might call comic, except that there's nothing comic about a "free" press choosing to ape state-owned media, throwing rose petals at the feet of officials from the most unilateral and secretive Administration in modern American history ("sixty-nine years old, and you're America's stud," Meet the Press's Tim Russert once said to Defense Secretary Donald Rumsfeld).

    One is not quite ready to say, on the evidence of several days' worth of stories, that this sorry era is over just yet. The New York Times and the Washington Post both ran editorials on May 17 that were something short of being full-throated calls for investigation; from the right-wing papers, the predictable yelping about how it's really Clinton's fault.

    All this will probably continue, but at least now it appears that it will be offset by some post-post-9/11 aggression. It will be interesting to watch what leads the media now follow and how far they follow them. For example, some reports--originating with the BBC but picked up in a few minor US outlets--indicate that US intelligence agents were told to back off the bin Laden family and the Saudi royals soon after Bush became President. Reporters might also look into the way the Administration declined to continue a process of tightening overseas and offshore banking regulations begun by the Clinton Administration in an effort to track down narcotics traffickers and terrorists. The Bush people acted partly at the behest of Texas Senator Phil Gramm, which means partly at the behest of Enron--and which may have ended up helping terrorists.

    "Connecting the dots" has become the operative cliché about whether intelligence officials should have been able to put together the various pre-9/11 clues they received. Now, maybe the media will start connecting some dots of their own.

    Michael Tomasky


  • Columns

    The Job Has Become Too Big for Ashcroft

    OK, so maybe John Ashcroft and Robert Mueller are not the sharpest tools in the shed. How else to explain that, after September 11...

    Robert Scheer

  • Knowledge (and Power)

    For Senator Clinton to flourish a copy of the New York Post--the paper that has called her pretty much everything from Satanic to Sapphist--merely because it had the pungent headline "Bush Knew" is not yet the height of her opportunism. (The height so far was reached last fall, when she said she could understand the rage and hatred behind the attacks on the World Trade Center because, after all, she had been attacked herself in her time.) But the failure of her husband's regime to take Al Qaeda seriously is the clue to the same failure on the part of the Bush gang.

    Christopher Hitchens

  • And Another Thing…

    Perhaps there's a limit to female masochism after all. To the great astonishment of the New York Times, which put the story on page one, Creating a Life, Sylvia Ann Hewlett's book deploring the failure of female professionals to have as many children as she thinks they need to be happy, is a big commercial flop ("The Talk of the Book World Still Can't Sell," May 20). Out of a 30,000-copy first printing, perhaps 8,000 have sold, despite a publicity campaign from heaven: Time cover story, 60 Minutes, Oprah, Today, wall-to-wall radio. The UK edition, Baby Hunger (Hewlett's original choice of title--gag me with a spoon!), is also piling up in warehouses.

    The Times quotes numerous bewildered publishing people--could it be the cover? women's "deep level of anxiety"?--but it's no big mystery why the book isn't selling. Except when right-wing foundations buy up truckloads of copies, antifeminist tracts usually do poorly despite heavy attention. The media love them--this week's newsstand features New York with "Baby Panic" and Us with "Will They Ever Have Babies?" in which Jennifer Aniston and other nulliparous stars bemoan their lot--but book buyers don't bite. Hewlett follows in the steps of Katie Roiphe, who got great press but few readers for The Morning After, which argued that date rape was just "bad sex." Partly the reason is that these books tend to be so flimsy that the media campaign gives away their entire contents, but the main reason is that nobody but women buys books about women--and women who buy hardcover books are mostly feminists. They know date rape isn't bad sex, and they don't need Hewlett to tell them their biological clocks are ticking. (Apparently not as fast as Ms. Hewlett claims, though. Dr. Alan DeCherney told the Times a woman's chances of getting pregnant at 40 are better than Hewlett makes out.) Why buy a book that tells you to smile, settle and rattle those pots and pans? That's what your relatives are for.

    By the way, my friend Judith Friedlander, coiner of the immortal phrase "a creeping nonchoice," was surprised to find herself on Hewlett's list of tearful women whose careers got in the way of childbearing. "I've had a great life," she told me, "with no regrets, and I spent a long time telling Hewlett just that."

    * * *

    What if a woman ran for President who had great progressive politics except for one thing--she believed that any man accused of rape or sexual harassment should be castrated without a trial? How many progressive men would say to themselves, Oh well, she's got great positions on unions, the environment, the death penalty, and all the rest, and besides, women really like her, so she gets my vote! Ten men? Three? Two?

    Of course, no progressive woman would ever put this crazy notion forward. Our hypothetical candidate would understand all too well that she couldn't propose to kick men in the collective teeth and expect them to vote for her. Back in the real world, however, this is precisely what some progressives apparently expect women to do for Dennis Kucinich, whose anti-choice voting record was the subject of my last column. Besides numerous e-mails thanking me for "outing" him and two or three upholding the "human rights" of the "itty bitty zygote," I heard from a few readers like Michael Sherrard, who urged "liberals" to "get over their single-issue abortion orthodoxy." Instead of asking women to give up their rights, why not pressure Kucinich to support them? To get that "broad based multi-issue progressive movement" Sherrard wants, Kucinich is the one who needs to get real, to face the demographic truth that without the votes, dollars and volunteer labor of pro-choice women and men, no Democrat can win the White House. His anti-choice votes may suit his socially conservative Cleveland constituents, as his supporters claim, but America isn't the 10th Congressional District of Ohio writ large.

    What Kucinich's fans may not understand is that for pro-choice women, abortion is not just another item on the list. It goes straight to the soul. It is about whether society sees you as fully human or as a vessel for whom no plan or hope or possibility or circumstance, however desperate, matters more than being a nest for that "itty bitty zygote." As I've written before, despite the claims of "pro-life feminists" and "seamless-garment" Catholics, progressive social policies and abortion rights tend to go together: Abortion bans flourish where there are backwardness, poverty, undemocratic government and politically powerful patriarchal religion, where levels of education, healthcare and social investment in children are low, and where women have little power. Instead of asking women to sign over their wombs for the cause, progressives should demand that "their" politicians add abortion rights to their agenda. No progressive would vote for someone who opposed unions or wanted to bring back Jim Crow. Why should women's rights matter less? It's disgusting that the AFL-CIO supports anti-choice politicians--as if their members aren't getting (or causing) abortions in vast numbers--and it backfires, too. In Pennsylvania's Democratic gubernatorial primary, pro-choice centrist Democrat Ed Rendell trounced anti-choice labor-endorsed Bob Casey Jr., 56 to 44 percent.

    * * *

    A French committee is promoting Ahmed Shah Massoud, the assassinated Northern Alliance commander, for the Nobel Peace Prize (among the signatories: actress Jane Birkin, Gen. Philippe Morillon and that inevitable trio of trendy philosophes, Bernard-Henri Levy, Alain Finkielkraut and André Glucksmann). I know what you're thinking: If Henry Kissinger could be awarded this honor, why not the CIA/Russia-backed Tajik warlord who helped set up a fundamentalist government in 1992, destroyed Kabul by fighting with his erstwhile ally Gulbuddin Hekmatyar and helped create so much havoc, including documented massacres of civilians, that Afghans welcomed the Taliban? Still, there's something repellent about proposing to award Massoud, thanks in part to whom Afghanistan is riddled with landmines, the same prize won by anti-landmine activist Jody Williams in 1997. Maybe they should call it the Nobel War Prize.

    Katha Pollitt

  • Dick Cheney’s Primer on the Constitution

    So what's it called if during war you criticize the President for any reason?
    And how long does this war go on (and this is where this theory's really pretty clever)?

    Calvin Trillin

  • Books and the Arts

    Singing to Power

    British folk-rocker Billy Bragg has to be the only popular musician who could score some airtime with a song about the global justice movement. The first single from Bragg's England, Half-English...

    Hillary Frey

  • Yaqui Way of Knowledge

    Although Chicano identity has been Luis Valdez's theme for all but the earliest years of El Teatro Campesino, the guerrilla theater he founded in the 1960s, getting a clear sense of his roots became doubly important to him when his parents died in the mid-1990s. Valdez, the first Latino playwright/director to reach Broadway and the creator of the bellwether Hispanic film Zoot Suit, had always been told his people were Yaquis from Sonora in northern Mexico, but he realized he knew very little about how they had come to be California Chicanos.

    So, in the late 1990s, he began to search his family's history and its secrets, and what he discovered about the myths and contradictory stories that had been handed down and about the little-known history of the Yaqui wars in Mexico led him to write Mummified Deer, in some ways his most personal play and his first new work for the theater in a decade and a half (just ending its run at El Teatro Campesino in San Juan Bautista). It's a play that uses the mythic, presentational elements we've come to associate with Valdez's work, here present in a Yaqui deer dancer, who together with the long arm of history defines identity for the play.

    Valdez founded El Teatro Campesino as an organizing and fundraising arm of the United Farm Workers during the 1965 grape strike in Delano, where he was born. The actors then were strikers who played type characters in actos, short satirical sketches on strike issues performed at work sites and in union halls.

    But since splitting off from the union in 1967, the company has made Chicano racial identity its focus. In the late 1960s and early '70s, that specifically meant spiritual identity, with the theater reaching all the way back to La Raza's Aztec and Mayan roots and making ritual and myth, music and dance integral parts of its style.

    Valdez was criticized at the time for abandoning the theater's materialist viewpoint, and was criticized later in the decade and in the 1980s--when the entertainment industry began to understand the potential of the Hispanic market--for his unabashed attempt to move into commercial theater and filmmaking with Zoot Suit. Valdez's response was that it was time for Chicanos to assume their place in the mainstream and that separatism had been just a necessary phase that prepared them to do so without losing their sense of identity. But it was also clear that the young men in Zoot Suit had to reject that aspect of pachuquismo, that very attractive, very essential part of their identity as Chicanos, that was disruptive of society and self-destructive.

    Lack of commitment to cultural authenticity seemed confirmed--certainly to Latino actors who protested--in 1992 when Valdez attempted to cast Laura San Giacomo, an actress with something of a bankable name but also an Italian ancestry, as Frida Kahlo in the movie he was trying to make about the artist. Valdez argued that the compromise was necessary to get Hollywood to do movies with Hispanic protagonists at all and that the movie would offer a picture of Latino life that was not gang- or drug-based, i.e., nonstereotypical and presumably positive.

    Maybe it's just the difficulty of a Chicano writer/director making headway in the commercial world, but in truth, it's difficult seeing Valdez as lost leader, as someone who's abandoned his roots, in San Juan Bautista, the mission town where Mummified Deer has been playing in a theater Valdez built out of a fruit-packing shed. By no means as far off the beaten track as Glover, Vermont, where Bread and Puppet escaped city life in the 1970s, it's still a small rural town a long way from entertainment capitals and city attitudes.

    The style of Valdez's new play also points to continuity. And for the most part the inspired stylistic innovations that radical theaters excelled in--in Mummified Deer for instance, a hospital bed that's transformed into a train laden with Mexican revolutionaries--still work their magic in Valdez's hands. The sudden release of concentrated imagination thrills. But even when they don't work, when they now seem more a part of tradition than vital and expressive, their mere presence, like the continued earnest tone of his writing in our smug, cynical time, suggests that Valdez hasn't jettisoned the past.

    In any event, the story itself makes it clear that roots are not easily cut off. On a simple series of platforms, marked with what seem to be petroglyphs and hung with plastic sheets that make the set look like an ice cave--poor theater after all these years!--Mama Chu, a fierce, 84-year-old family matriarch, lies on a hospital bed, suffering from abdominal pains. When the cause of her condition is diagnosed not as cancer but as a mummified fetus that has been lodged in her womb for sixty years, her granddaughter Armida, an anthro grad student at Berkeley who's in search of the truth about her mother's life, begins to pierce the maze of myths and half-truths that have made up Chu's story and the family's history.

    Along the way, secrets are revealed about paternity, incest and migration. The ultimate source of these secrets and family myths isn't, however, as in many plays, personal pathology. The half-truths and inventions all proceed from a historic cause: the little-known Yaqui genocide at the hands of Porfirio Diaz and the Federales, which capped four centuries of little-known Yaqui resistance to European colonization.

    In the end, it turns out that none of Chu's children, as they're presented in the play, are hers. Her children were all taken away and murdered in the genocide. She gathered Armida's mother, aunt and uncle to her to fill the void. (The horrific description of the mass slaughter alone ensures that this play is not going very far into the mainstream.)

    Powerful, serious material. And Valdez doesn't always treat it reverentially, as many lesser playwrights would. The introduction of a kind of grotesque humor makes it all the more powerful at times. As when Aunt Oralia (Rosa Escalante) wonders, "Can't you just yank that little sucka [the dead fetus] out?" or Uncle Profe explains the incest by saying simply, "We were always very close."

    To his credit, Valdez doesn't treat the Chicano family reverentially, either. He understands that they can be quite conservative even though they've been victims (or because they've been victims). He satirizes them and creates a number of characters that, like the satirical figures of the actos, are one-dimensional types. With an Oralia, that works to project a sense of how self-protective she is about the past, but this is ultimately a play of terrible family secrets, and having the weight of those secrets fall on an Armida who is little more than a plot mechanism and Berkeley-activist-type blunts the force of the drama.

    It's not simply a matter of an uneven cast, one that ranges all the way from the very adept and realistic Daniel Valdez (Uncle Profe) to Estrella Esparza (Armida), who can barely make the words her own. It's also the writing and the way Valdez has the characters played. As director, he pitches a number of the performances very high. An actress like Alma Martinez, who plays Mama Chu, can obviously change gears on a dime and sketch in a reaction or attitude with the flick of a hand, but Valdez pushes her performance hard and makes it vocally very forceful, as if constantly to remind us what a powerful woman this is. The result is a lack of nuance, variety and sympathy that sent me fleeing to quieter characters like Uncle Profe and Armida's mother, Agustina (Anita Reyes).

    Then too, the revelations about the past are far too complicated, there's too much information coming at you generally, and what exactly the deer dancer represents is obscure. Also, the symbol of the mummified fetus at times feels contrived. All of which makes it difficult to take in and feel comfortable with what Valdez is apparently going for in his continuing exploration of what he understands to be a continually evolving Chicano identity. That is, the sense that Chu's finally confronting the Yaqui genocide results in her forgoing an operation and keeping the fetus, which is an incarnation of both an indio past that is dead and gone and a living Yaqui spirit that--bypassing the acquiescent and self-deluding generation of aunt and uncle--Chu passes on to her granddaughter, Armida.

    Hal Gelb

  • The Evolution of Darwinism

    Popular perception notwithstanding, the theory of natural selection was accepted by every serious evolutionist long before Darwin. Earlier scientists interpreted it as the clearest possible evidence for intelligent design of the universe. William Paley's Natural Theology (1802), for example, employs the famous image of the "great watchmaker" to account for the perfect adaptation of creatures to harmonious ecosystems. Darwin's innovation, which may appear small but is in fact immense, lay in his claim that natural selection is the only cause of evolution.

    In one sense, this was merely a change of emphasis: The impulse of pre-Darwinian evolutionists, faced with incontrovertible evidence of natural selection, had been to ask why it occurred. They sought after the "final cause" of evolution, and they found it in the proposal of an intelligent designer. But one of the essential principles of modern science is that such final causes are unknowable. Science must limit itself to "efficient" or "material" causes; it must not ask why things happen, but how. Darwin applied this principle to evolution. Whereas his predecessors had seen the adaptation of organisms to their environment as the effects of design, Darwin saw the physical development of creatures as the sole cause of evolution. The great watchmaker had been overthrown.

    As Stephen Jay Gould (who died as this issue was going to press) shows in The Structure of Evolutionary Theory, Darwin's breakthrough was essentially methodological. Darwinism is what you get when you focus on the micrological details, resolutely refusing to lift your eyes to the level of the whole. Over the course of the nineteenth century, this methodological sine qua non for scientific investigation was imposed on every discipline, but it originated in the "dismal science" of economics. The "political economy" of Adam Smith began from the material actions of individuals in pursuit of their own selfish ends, and extrapolated from this micrological level to abstract generalizations about the economy as a whole.

    What Smith calls "the economy" is thus an amalgamation of all the self-interested actions of individuals, and precisely the same is true of what Darwin understood as "evolution." In fact, Darwin consciously and deliberately imported Smith's economic methodology into biology in order to refute natural theology's argument from design. As Gould baldly puts it, "the theory of natural selection is, in essence, Adam Smith's economics transferred to nature." He is reluctant to dwell too long on this kinship, no doubt because he understands the severity of the threat it poses to Darwinism's pretensions to objectivity. Gould's ally and sometime collaborator Richard Lewontin has criticized him for such reticence in several exchanges first published in the New York Review of Books. Lewontin has called Gould's work "curiously unpolitical" for failing to draw out the implications of "the overwhelming influence of ideology in science." For Lewontin, "Darwin's theory of evolution by natural selection is obviously nineteenth-century capitalism writ large," and attempts to press it into the service of psychology are "pure reification."

    The distinguishing theoretical characteristic of both Darwin and Smith is reductionism--they reduce all knowledge to the level of the individual. As Gould notes, "The rebuttal of the former centerpiece of natural history--the belief that organic designs record the intentions of an omnipotent creative power--rests upon the radical demotion of agency to a much lower level, devoid of any prospect for conscious intent, or any 'view' beyond the immediate and personal." Today, technological progress has enabled evolutionists to carry Darwin's reduction a stage further. The smallest individual Darwin could study was the organism, but it is now possible to analyze the behavior of the gene. People like Richard Dawkins now claim that evolution is driven not by competition between individual organisms, but by struggles among genes.

    Many evolutionary biologists keep a guilty silence regarding the ethical implications of their theory, but Dawkins positively revels in dehumanization. His imagery dwells lasciviously on the mechanical--our bodies are merely "lumbering robots," or "survival machines" for genes. His infamous book The Selfish Gene (1976) abounds in brazen antihumanist provocations: "I am treating a mother as a machine programmed to do everything in its power to propagate copies of the genes which reside inside it." Nor does mechanization stop with the body; evolutionary psychology views the mind itself as a machine, reducing our thoughts and ideas to the chemical reactions that accompany them. Dawkins has even propounded a theory that the history of ideas follows rules analogous to competitive gene selection, reducing dialectic to a tedious and pointless struggle between what he calls "memes." Lately he has taken to writing letters to the British press, suggesting that Osama bin Laden and George W. Bush will be enlightened if they "study memes."

    The idea that genes determine all social behavior, that human beings are machines, evidently strikes a chord in the Western popular mind. Postmodernist works such as Donna Haraway's "A Cyborg Manifesto" celebrate the "posthuman" from what their authors apparently regard as a radical perspective, while the theoretical texts of Michel Foucault and Jean-Francois Lyotard advocate a micrological materialism that excludes on principle any interest in "totalizing grand narratives." As John Dupré has recently remarked, this "tyranny of the microscopic" really constitutes an "intellectual pathology" whose significance is sociological rather than scientific. Gould swats Dawkins away easily enough--sardonically appropriating his vocabulary to dismiss his theory, cruelly but fairly, as an "impotent meme"--but he does not explain why such theories have come to seem plausible to many in the general public. To examine that, we have to back up about 65 million years.

    Reptilia served as Exhibit A then. Imagine Triceratops glancing up from its grazing to notice a seven-mile-wide asteroid descending rapidly toward its head. Triceratops had not expected this. Nature had prepared it for the expected; it could expect to spend a great deal of time fighting with Tyrannosaurus rex, for example, and was formidably well-equipped for that purpose. But natural selection had not prepared it to withstand a direct hit from a piece of rock a league long.

    The lump of stone that crashed into what is now the Yucatan Peninsula ended the Cretaceous Period by showering the earth with fire and brimstone, thus destroying 70 percent of living species, including almost all the dinosaurs. This was something of a spanner in the works of natural selection, from which it may not recover. The implications of this catastrophe, conclusive evidence for which was discovered only in 1980, have yet to be fully assimilated by evolutionary theory. For most of the twentieth century, orthodox Darwinists held that natural selection--the competitive adaptation of individual organisms to their environment--was the exclusive motor of evolutionary change. Now they must qualify this dogma, but it is proving a laborious process.

    Many scientists remain convinced that catastrophic change is the exception. If it weren't for that pesky asteroid, they gripe, natural selection would have continued unabated. They note that natural selection will always work ceteris paribus--that is, other things being equal, under the controlled laboratory environment in which modern scientists conduct their experiments. It will work, that is to say, in the absence of the unexpected. But don't we know from experience that the unexpected happens all the time, and occasionally with catastrophic consequences?

    The "K-T event," as the asteroid strike is known, casts suspicion on the doctrinaire claim that evolution is solely the result of the competitive adaptation of individual organisms to their environment. It indicates that the external constraints under which adaptation occurs must inevitably exert an influence on the course of evolution. And it raises the possibility that random, "chance" events play at least as significant a role as the incremental, purposive process of natural selection.

    Although it represents a mortal threat to mainstream Darwinism, the theory of catastrophic evolution is quite consistent with Stephen Jay Gould and Niles Eldredge's epochal discovery of "punctuated equilibrium." Punctuated equilibrium, or "punk-ek," holds that evolution does not take place incrementally but rather in spurts that are divided by long periods of stasis. It departs from Darwin by implying that natural selection by competition among individual organisms cannot be the exclusive cause of evolutionary change, since such competition does not pause for periods of equilibrium.

    Darwin is often thought to have rescued the history of life from the superstitious fantasies of religion, by basing his theory on good, solid, empirical evidence. But, as Gould and Eldredge noticed, the empirical evidence does not indicate that evolution proceeds by incremental, incessant natural selection, as Darwin claimed. In fact, the empirical evidence indicates quite the opposite. When we look at the living species around us, we do not find a continuum of creatures in infinitesimally graduated stages of evolution. We find, instead, clearly distinct species. We find the same when we look at the fossil record; paleontology testifies that evolutionary stasis is the norm, and that change takes place in abrupt bursts, as though suddenly spurred forward by some external stimulus.

    One of the many fascinating questions raised in Gould's The Structure of Evolutionary Theory is why Darwin did not see this. Why did he insist on attributing sole determining power to natural selection in defiance of the evidence? His own explanation was that the fossil record gives a false impression because it is radically incomplete. But this does not alter the fact that natural selection is an imposition on the available evidence, a bold reading against the grain. Did Darwin nod? Why was he so convinced that all evolution is caused by natural selection among individual organisms in competition with one another?

    Gould does not explain this, almost certainly for a very interesting reason: He has often been accused, by sociobiologists and orthodox Darwinians, of handing ammunition to creationists. There is no room for an intelligent designer in a universe formed entirely through relentless competition between selfish individuals, but because it allows that external factors may influence evolution, the theory of punctuated equilibrium is not incompatible with theories of intelligent design--a fact that has caused no small embarrassment to its authors. The charge of neocreationism is deeply unfair--Gould testified against creationism in landmark court cases and ridiculed it mercilessly in his writing. He opposed intelligent design on the grounds that it is "theology" and not "science." In this book, obviously intended as his legacy to scientific posterity, Gould repeatedly and emphatically protests that no matter how many revisions and qualifications he may impose upon Darwin, he remains a faithful follower of the great man. In a rare and revealing mixed metaphor, he claims to have retained "the guts of the machine," and he uses a cumbersome simile involving a piece of coral to argue, again and again, that his own work is merely an "addition" to Darwin.

    That is rubbish, and Gould must have known it. The Structure of Evolutionary Theory is an "addition" to The Origin of Species in the same sense that Capital is an "addition" to The Wealth of Nations. Gould certainly built upon Darwin's work, assuming its premises as his own and erecting his own theory on the foundation of a meticulous analysis of the original texts. But there comes a stage in the construction at which, in fulfillment of the dialectical law, quantitative change becomes qualitative change, and the extension to the edifice deserves to be called a new building.

    Despite (and because of) his vehement denials, I believe that Gould reached that stage. His theory is more than a supplement to Darwinism; it is an alternative view, a paradigm shift. Gould has deprived natural selection of the exclusive role Darwin assigned to it, using the most unimpeachable logic and the most scrupulous empirical research.

    Gould obviously liked to limit the destructive impact of his criticism to distortions of the founder's aims. But Darwin cannot so easily be exonerated--Gould himself admits that the work of Dawkins constitutes "a furthering and intensification of Darwin's intent." Indeed, Gould often refers to theorists of gene selection as "ultra-Darwinists" or "Darwinian fundamentalists," because they take the master's reductionist method to the logical conclusion permitted by modern technology. Gould would have been mortified to hear it, but his own interpretation suggests that, were Darwin alive today, he might be Richard Dawkins.

    Traditional creationism is based on a literal reading of Genesis and represents no intellectual danger to Darwinism. The recent advocates of "intelligent design," however, demand to be taken a little more seriously because of their recent political and pedagogical successes; they admit to the apparent age of the earth as established in the geological record, for example, and accept the fossil record as evidence of species change. Hard-fought cases involving the boards of education of Kansas (1999) and Ohio (2002) have established a new beachhead for intelligent design in the public mind, while simultaneously throwing a shadow on natural selection's claim to be the exclusive motor of evolutionary change.

    The idea that schools in Kansas might depart from Darwinist orthodoxy induced apoplexy among the commissars of science. John Rennie, editor of Scientific American, urged colleges to be skeptical of applicants from Kansas: "If kids in Kansas aren't being taught properly about science, they won't be able to keep up with children taught competently elsewhere. It's called survival of the fittest. Maybe the Board of Education needs to learn about natural selection firsthand." In an edition of the American Spectator, a leading theorist of intelligent design, Michael Behe, professed to be mystified at Rennie's outburst: "What is it about the topic of evolution that drives so many people nuts? Why does a change in a farm state's high school examination policy call forth damning editorials all the way from London, England, and have normally staid editors threatening children?"

    The answer is obvious, blindingly so. Behe does not see it because he, like most advocates of intelligent design, approaches the issue from a socially conservative point of view. Much scholarship on intelligent design is sponsored by the Discovery Institute, a Seattle-based foundation that describes itself as "dedicated to exploring and promoting public policies that advance representative democracy, free enterprise and individual liberty," and whose mission statement commits it to boosting the "common sense" of the "free market." It is this commitment, I suppose, that distracts Behe from one of the reasons the American establishment goes "nuts" when the educational privilege of natural selection is threatened: A threat to the exclusivity of natural selection--individual competition--is a threat to market ideology. (Although he tactfully pays it less attention than it deserves, Gould acknowledges the full extent of Darwinism's complicity with Adam Smith. But the alterations Gould introduces into evolutionary theory do not depend on its ideological kinship with classical economics.)

    Neither Behe nor his book Darwin's Black Box rates a mention in The Structure of Evolutionary Theory, and Gould's silence on the subject of intelligent design can be regarded as extremely eloquent. He would have denied it, but this book really charts Gould's arduous passage through Darwinism and his emergence on the other side. This breakthrough seems to have been facilitated by his discovery of the literature that Darwin was writing against. Gould blithely informs us that "I had never read [Paley's] Natural Theology straight through before pursuing my research for this book." Lay readers may find this an astonishing confession from the world's leading Darwin scholar, but those familiar with scientists' undiscriminating rejection of metaphysics will be unsurprised. Having forced himself to pick up the book, Gould finds that Paley's primary observation is "undoubtedly correct," and largely accepted by Darwin--nature does indeed indicate exquisite adaptation to environment. The difference lies in the reason Darwin gives for this order in creation. Paley thought it bespoke a benign creator, but Darwin "seems to mock the standard interpretation in a manner that could almost be called cruel" when he introduces the micrological economics of Adam Smith:

    as the cruellest twist of all, this lower-level cause of pattern seems to suggest a moral reading exactly opposite to Paley's lofty hopes for the meaning of comprehensive order--for nature's individuals struggle for their own personal benefit, and nothing else! Paley's observations could not be faulted--organisms are well designed and ecosystems are harmonious. But his interpretations could not have been more askew--for these features do not arise as direct products of divine benevolence, but only as epiphenomena of an opposite process both in level of action and intent of outcome: individuals struggling for themselves alone.

    Read that last sentence again. What might bring about the triumph of the "opposite process" to "divine benevolence"? Clue: It is not the blind indifference of nature. The history of human thought is hardly silent concerning the struggle between a benevolent deity and a cruel mocker. But Gould shies away from considering the theological implications of his theory with the standard get-out clause: "This book cannot address such a vital issue at any depth."

    Many readers will be tempted to respond: "Why on earth not? It's 1,400 pages long!" But Gould was not eager to incur again, in his magnum opus, the tired charge of neocreationism. He does begin to speculate about why the homologous visions of Darwin and Smith should complement each other so conveniently, and he also raises the question of why this connection has come to seem so glaring in recent years. But his uncharacteristic hesitancy reveals his discomfort away from scientific terrain: "I venture these ill-formulated statements about Zeitgeist because I feel that something important lurks behind my inability to express these inchoate thoughts with precision."

    Indeed it does. Later in the book, Gould remarks that "the exclusivity of organismal selection...provides the punch line that allowed the vision of Adam Smith to destroy the explicit beauty and harmony of William Paley's world." Absolutely true. But the exclusivity of organismal selection is what Gould denied, too. Is it really accurate, then, to continue calling him a "Darwinist"? At one point, Gould demands that creationists throw in the towel and acknowledge Darwin as "the Muhammad Ali of biology." Ali was undoubtedly a great champion, but his present condition renders Gould's image rather ambiguous. And then, too, the reader is left in some doubt as to whether Gould saw himself in the role of Angelo Dundee or Joe Frazier.

    David Hawkes

  • Sleepless in Nightmute

    You may recall Insomnia as a Norwegian film made on a modest budget--do I repeat myself?--about the inner life of a morally compromised police detective. The picture enjoyed a small but respectable run in the United States a couple of years ago, thanks to the shambling presence of Stellan Skarsgard in the lead and to the clever use of locations. The director, Erik Skjoldbjaerg, set the action in the north of Norway, during summer, so that this film noir played out almost entirely in daylight.

    Now comes a new, American Insomnia, made to the costly standards of a Warner Bros. release. Directed by Christopher Nolan in the wake of his surprise hit Memento, this remake transposes the action to rural Alaska and replaces the not-quite-stellar Skarsgard with Al Pacino. A few paragraphs from now, I will recommend this picture to your attention. First, though, let me talk about a modestly budgeted American movie, The Believer, since it has the distinction of being a film of ideas--in contrast to Insomnia, a film of idea.

    I care about The Believer, first of all, because its writer-director, Henry Bean, has noticed a truth that escapes most American filmmakers: People think about things. For most of us, of course, at most times, our notions of the world amount to a discontinuous, self-contradicting jumble; but it's a jumble on which we may stake our lives. That's why the disorderliness can be dramatic in itself--provided, as Bean knows, that the ideas trouble the mind of a compelling enough character.

    So here is young Danny Balint, played unforgettably in The Believer by the whiplike Ryan Gosling. Think of him as Robert De Niro in Taxi Driver, only leaner, more delicate in features and infinitely more articulate. Danny hunches and glowers and struts and slinks through the streets of New York City, his close-cropped head buzzing with mutually incompatible versions of Jewish identity, his brain bursting with arguments about God and against God. Danny wishes with all his heart to be someone other than a young man of ideas--but it's his fate to be cerebral, which is what makes him so moving and so horrible. He is a yeshiva-educated Jew who wants to live in the blood, as a Nazi activist.

    Now, I've hesitated to write about The Believer, in part because I happen to know Henry Bean and in part because I was never sure when the picture would get into theaters. The Believer won the Grand Jury Prize at the Sundance festival in 2001 but then failed to find a theatrical distributor. (According to The Independent magazine, the phones stopped ringing after a preview audience at the Simon Wiesenthal Center felt The Believer might be bad for the Jews.) The filmmakers decided to go straight to cable and signed a deal with Showtime, which announced a television premiere in late September 2001--not a propitious air date, as it turned out, for a movie about an intense guy in New York City who plans to blow things up. But since Showtime has gotten around to presenting The Believer (in March of this year), I want to say a few words about the picture, now that audiences may at last face Danny in the public space of a movie theater.

    Those who choose to do so will discover that The Believer starts in two locations at once, on the subway and inside Danny's skull. In the exterior setting, Danny is a twentyish skinhead, who when first seen is methodically harassing a bespectacled, yarmulke-wearing youth on the elevated train. Danny crowds the prey, crunching his Doc Marten boots all over the guy's wing-tips. Then, when the victim behaves like a victim--avoiding eye contact, fleeing the subway at the first opportunity--Danny pursues him onto the street. "Hit me! Please!" Danny howls. The less resistance he gets, the more enraged he becomes, till he stomps the timid, book-toting Jew.

    Meanwhile, through cross-cutting, we also get access to Danny's memory, in which he's forever the pale student with big eyeglasses. We see Danny in the yeshiva at about age 12--just another of the boys, except for his rage against the patriarch Abraham, who was willing to slaughter his own son as an offering to God. None of the standard, moralized readings of this tale will assuage Danny. He insists that Abraham's sacrifice made the Jews into a race of willing victims, perpetually crushed by a God who holds them to be worthless.

    You see why this stuff can make people nervous. It's not just that Danny takes Jewish self-hatred to its ultimate conclusion--he takes it there theologically, argumentatively, with a foul-mouthed, spray-the-room exuberance that will offend every moviegoer. Zionists, for example, will object when Danny says the Israelis aren't real Jews--they have soil, and the kind of manliness a fascist like him can respect. Supporters of the Palestinians, on the other hand, will cringe to hear Danny denounce the massacre at Sabra and Shatila. (With friends like this...)

    But I'm making The Believer sound like a string of provocations, and it's not. It's a modernist tragedy, meaning one that's realized with equal measures of sympathy and irony. When Danny tries to enlist in an "above-ground, intellectually serious fascist movement," its leaders (Theresa Russell and Billy Zane) welcome his anti-Semitic tirades but dismiss his offer to kill Jews. Instead, to his horror, they make him into a fundraiser, with a suit and a cell phone. When Danny hooks up with a dreamily masochistic young Aryan (Summer Phoenix), it isn't long before she decides to study Hebrew, hangs a mezuzah on the door and starts wearing ankle-length dresses. Yes, hit me! Please! The harder Danny tries to be a Nazi, the more ineluctably he's a Jew.

    I begin to think of Hazel Motes, the protagonist of Flannery O'Connor's Wise Blood, who is a Christian preacher in spite of himself. According to O'Connor, Hazel's integrity lies in his not being able to rid himself of Jesus: "Does one's integrity ever lie in what he is not able to do? I think that usually it does, for free will does not mean one will, but many wills conflicting in one man." In the same way, many wills conflict in Danny, with that of the faithful Jew refusing to die away. At one point, in fact, Danny secretly wraps a prayer shawl around his torso, much as Hazel wound himself in penitential barbed wire. Then, like any good yeshiva boy, Danny lets the fringes dangle beneath the T-shirt, which in his case is emblazoned with a swastika.

    It's good to see someone take such care with his appearance. Most American movies these days are little more than fashion statements--and yet the characters are shockingly thoughtless about their clothes.

    So we come to Al Pacino's leather jacket.

    It plays quite a prominent role in Insomnia, a movie whose premise goes like this: Someone in the remote town of Nightmute, Alaska, has murdered a high school girl. The victim clearly knew her killer, and the local population is neither large nor highly mobile. Nevertheless, the Nightmute police feel too humble to work the case on their own. They send for help--though not from Nome or Anchorage, nor even from Seattle, Portland or San Francisco. They go all the way to Los Angeles, whose police department immediately agrees to dispatch two of its top detectives, despite their being under investigation by Internal Affairs.

    I tried explaining all this to my friend Ben Sonnenberg, who seemed puzzled. "But what about Eddie Murphy?" he asked. "Was he too busy to come from Detroit?"

    Reassure yourself, Ben. Eddie has answered the call, in effect if not in person. That's the point of the leather jacket.

    It's hard to imagine Pacino's character, Detective Will Dormer, going out and buying this item for himself. It's a little too heavy for the climate in LA, a little too pimp-chic for a cop who's supposed to be an agonized moralist. With its supple new leather, the jacket looks more like something that was recently issued to the guy--which, of course, it was. The filmmakers decided this was just the thing to signal "cool, hip and streetwise" for Pacino. In much the same way, they imposed a symbolic costume on the murderer, Robin Williams. Although the script says he's vain and attracted to luxury, Williams is draped in something that says "phony, out-of-touch intellectual": a corduroy jacket.

    Don't worry, by the way, that I've revealed the killer's identity. You'd be able to figure it out for yourself, by process of elimination, no more than ten minutes into the movie, which is about twenty minutes before Williams comes into the open. The mystery of Insomnia has nothing to do with discovering he's the murderer and everything to do with his somehow being able to deliver a restrained, nuanced, convincingly chilling performance. There's Robin Williams, taking care of business, while everybody else is goofing off.

    Pacino behaves ridiculously, as he typically does when the script's a laugh. Hilary Swank has no such history of egregious mugging; but now, in the role of a local cop, she bounces onto the screen like a young squirrel on its first day of acorn school. Who allowed these performances, or maybe even encouraged them? Christopher Nolan, that's who. He was so intent on dolloping pizazz onto this story that he didn't notice the visual syrup was drowning a six-inch stack of toaster waffles.

    I'm sure Insomnia will have its champions, even so. They'll claim the picture is About Something, namely the importance of never, ever breaking the rules. That's the one, big idea of Insomnia. As we may learn from life and better movies, it's wrong.

    Screening Schedule: Speaking of people who broke rules, Lynne Sachs has made a fine, artful documentary about the Catonsville Nine, the war protesters who walked into a Selective Service office in 1968, grabbed as many files as they could carry and burned them with homemade napalm. She's got the surviving protesters down on film, Philip and Daniel Berrigan among them; and she's got other interested parties too, including the district attorney who prosecuted the Nine and one of the jurors who convicted them. The juror weeps now, out of respect for their courage. The film is titled Investigation of a Flame, and it's showing in New York at Anthology Film Archives, May 29-31. The distributor is First Run/Icarus Films, (800) 876-1710.

    Stuart Klawans

  • Custom

    There is a difference it used to make,
    seeing three swans in this versus four in that
    quadrant of sky. I am not imagining. It was very large, as its
    effects were. Declarations of war, the timing fixed upon for a sea-departure; or,
    about love, a sudden decision not to, to pretend instead to a kind
    of choice. It was dramatic, as it should be. Without drama,
    what is ritual? I look for omens everywhere, because they are everywhere
    to be found. They come to me like strays, like the damaged,
    something that could know better, and should, therefore--but does not:
    a form of faith, you've said. I call it sacrifice--an instinct for it, or a habit at first, that
    becomes required, the way art can become, eventually, all we have
    of what was true. You shouldn't look at me like that. Like one of those saints
    on whom the birds once settled freely.

    Carl Phillips

  • Pursued by Love’s Demons

    As if the back streets of our local city
    might dispense with their pyrrhic accumulation of dust and wineful tonality,
    offer a reprise of love itself, a careless love
    rendered grand and persuasive
    by its own shy handful of hope, some ballast such as this
    on a summer afternoon when the air smells of slaughtered chickens,
    and other problems, like the estranged spouse of a good friend,
    holler from the passageway. It's always conclusive
    in the bungled moment after you try to accomplish something irreducible.
    So you say as you return empty-handed from the store,
    having forgotten everything--your money, the list.

    Charlie Smith

  • Grabitization (Don’t Look)

    Almost everything that is wrong with Washington Post foreign editor David Hoffman's new book about Russia's transformation into a capitalist system, The Oligarchs, can be discerned in one small and apparently meaningless passage on page 91. In it, the erstwhile Moscow bureau chief of the Post (1995-2001) describes former Russian Deputy Prime Minister Anatoly Chubais's reaction when, as a young man, the future and now infamous "father of Russian privatization" first read the works of Austrian economist Friedrich von Hayek:

    Many years later, Chubais recalled the thrill of reading Hayek and instantly gave his own example of how Hayek's theory worked in practice in the United States. "One person is selling hamburgers somewhere in New York," he told me, "while another person is grazing cows somewhere in Arkansas to produce meat that will be used to make those hamburgers. But in order for that person in Arkansas to graze cows, there needs to be a price for meat, which tells him that he should graze cows."

    Now, the reaction a sane person is likely to have when reading a passage like this is, What kind of maniac experiences a "thrill" when reading about hamburger distribution? A corollary question that occurred to me, as I imagined this 20-year-old Soviet dreaming guiltily of Arkansas cattle, was, Were there no girls at all in the Leningrad of Anatoly Chubais's youth?

    It's a given that the answers to questions like these are not to be found in the seminal analytical work of one of the Moscow journalism community's most notoriously humorless foreign correspondents, but this problem is less inconsequential than you might think. For it is precisely Hoffman's inability to write honestly and perceptively about ordinary human experience that makes The Oligarchs miss as badly as it does in its attempt to describe the changes in Russian society over the past decade or so.

    By the time Hoffman took over as the Post's Moscow bureau chief, I had been living in Russia for about five years. First as a student and then as a freelance reporter, I'd watched during that time as Russians became increasingly disillusioned with democracy and capitalism. Kids I'd studied with who had brains and talent found themselves working twenty-four-hour shifts in dingy street kiosks or lugging feminine hygiene products door to door, while the only people from my class who ended up with money were morons and thugs who took jobs with local "biznesmen" (read: mobsters) doing God knows what.

    That was the reality for the Russians young and old who had the misfortune to live through the early 1990s, when the inefficient old planned economy was dismantled and something--I hesitate to call it capitalism--was installed in its place. Honest, hard-working people were impoverished overnight, while swindlers and killers quickly rose to the top. The insult was exacerbated for Russians when they began to hear that the rest of the world, America and the American press in particular, was calling this process progress.

    What America called a "painful but necessary transition," most Russians saw as a simple scam in which Communist functionaries and factory directors reinvented themselves by swearing oaths to the new democratic religion and cloaking themselves in fancy new words like "financier" and "entrepreneur." The only difference from the old system appeared to be that the villas were now in the south of France instead of on the Black Sea. The ordinary Russian also noticed that his salary had become largely fictional and that all his benefits had been taken away--corners had to be cut somewhere in order to pay for all those new Mercedes in town.

    At the national level, this process was symbolized by the rise of the oligarchs, a small group of rapacious and mostly bald men who were handed huge fortunes by their friends in government. Eventually, they were to take the place of the Politburo as the ruling coterie of the new elite.

    Men like bankers Mikhail Khodorkovsky, Alexander Smolensky and Vladimir Potanin, industrialist Boris Berezovsky and media magnate Vladimir Gusinsky became Croesus-rich seemingly overnight in those early years of the 1990s. By the middle of the decade, they owned or controlled much of the media and held increasing influence over Boris Yeltsin, a weak autocrat who had grown dependent on their wealth and power to fend off his political enemies.

    The Oligarchs purports to tell the story of the rise of these men. It is an exhaustive book, impressive in scope, that contains extensive interviews with all of the key figures. But it misses because Hoffman does not know what it is like to sleep in a street kiosk during a Leningrad winter, nor does he particularly care to know; he writes like a man trying to describe the dark side of King George from a trundle bed in a guest room of Windsor palace.

    Not that this is surprising. In his tenure as a reporter in Moscow, Hoffman was notorious for being an unapologetic ideologue, the hardest of hard-core cold warriors. The basic structure of a David Hoffman article was generally to lead with a gloomy flashback to some grim Soviet-era scene and then go on to describe how, with the help of American aid, the courageous leadership of the democrat Boris Yeltsin and the heroic efforts of Western-minded reform economists like Chubais, things had since changed spectacularly for the better.

    In other words, lead off with a picture of a groaning, overweight housewife at the end of a long line to buy shoes that don't fit, and close with a shot of an apple-cheeked cashier at Pizza Hut using her salary to buy Nikes. That was Russia Reporting 101 during the 1990s, and no one was better at it or more devoted to its practice than David Hoffman.

    That said, it is surprising, even shocking, that Hoffman would employ that technique in this book, given the subject matter. Hoffman begins his book by focusing on the Soviet-era experiences of a characteristic "ordinary Russian," a schoolteacher named Irina, and describing her humiliating search for toilet paper on a summer day in 1985.

    Use of these images made a kind of sense in the wake of the collapse of Communism, but in Hoffman's book, published ten years after the fact, the decision to spend the entire first chapter (titled "Shadows and Shortages") describing the hardship of product-deprived Soviets in the 1980s can only mean one thing. Hoffman is setting up his reader to understand the phenomenon of the oligarchs in terms of their eventual benefit to society.

    That benefit, in Hoffman's view, is clearly a Russia full of available products and the triumphant building of a "rapacious, unruly capitalism...on the ashes of Soviet communism."

    That the vast majority of Russians could not and cannot afford those products, or even earn enough to feed and clothe themselves, does not concern Hoffman. The opening of the book, set in the old USSR, is full of portraits of ordinary folks grasping for Beatles records and VCRs and other Western delights (Hoffman even sinks so low as to use the heavyweight champion of Russia-reporting clichés: the Soviet citizen sitting despondent at the sight of a full refrigerator in a Western movie). But those same ordinary people are conspicuously absent from the middle and later pages, when the cracks in the new system--the stalled salaries, the collapsed local industries, the crime--begin to show.

    In one particularly telling section, Hoffman describes Yeltsin's surprise when he learned in early 1996 that his poll ratings had dropped below 5 percent. According to the book, media mogul Gusinsky and some of the other oligarchs discovered that Yeltsin, kept insulated from the truth by his KGB aides, had no conception of the depth of his unpopularity:

    "Before the meeting, they agreed that someone would try to deliver the raw truth to Yeltsin that he was no longer popular, a painful realization that, according to [Yeltsin's chief of staff, Viktor] Ilyushin, the president had not absorbed."

    This passage is ironic because Yeltsin's surprise at this juncture of the story is nearly identical to that of the uninitiated reader traveling through Hoffman's book for the first time. Until he informs us a few sentences later of Yeltsin's meager poll ratings, the pain felt by the overwhelming majority of Russians during the early reform years is completely concealed.

    When Hoffman first showed us the schoolteacher Irina, she was a Soviet citizen deprived of toilet paper, and this was apparently worthy of note. But if she remained a teacher through this Yeltsin poll moment in the middle of the book, in 1996, Irina also saw her health benefits taken away, her salary slashed to the equivalent of about $50 a month (and possibly delayed for months in any case) and funding for her school cut so severely that she would have to buy chalk out of her own pocket. This is not considered noteworthy, in Hoffman's estimation.

    The determination to keep the telling of the oligarchs' story within the context of their eventual salutary effect on the country leads Hoffman into some grievous oversights and contradictions. None of these are more important than his insistence upon painting his oligarch subjects--in particular, Khodorkovsky, Potanin and Berezovsky--as self-made entrepreneurs who bucked the state system to make their fortunes. The fact that he connects the rise of these men to the encouraging fact of a Russia full of products on its shelves is even more misleading.

    The reality is that none of these men produced anything that Russians could consume, and all benefited directly from tribute handed down from the state. Bankers like Smolensky, for instance, made fortunes through a collusive arrangement with state insiders who gave them exclusive licenses to trade in hard currency during a time when prices were set to be abruptly freed. When hyperinflation set in (naturally) and the population frantically scampered to convert their increasingly worthless rubles into dollars, the currency-trading licenses became virtual spigots of cash.

    Furthermore, the oligarchs really became a ruling class only after the "loans for shares" auctions in late 1995, a series of privatizations that underscored the incestuous relationship between the state and the new tycoons. The state "lent" huge stakes in giant companies (in particular oil companies) in return for cash. Implemented and organized by Minister Chubais, the auctions ended up being one of the great shams of all time, as in many cases the bidders themselves were allowed to organize the tenders and even to exclude competitors. In some cases, the state actually managed to lend the bidders the money to make the bids through a series of backdoor maneuvers.

    Hailed at the time as the death knell of the state-controlled economy and a great advance of the privatization effort, the auctions were actually a huge quid pro quo in which bankers were handed billion-dollar companies for a fraction of their market price (a 78 percent stake in Yukos, the second-largest oil company in Russia, valued at least at $2 billion, was sold for just $309.1 million to Khodorkovsky's Menatep Bank) in exchange for support of Yeltsin in the upcoming 1996 election. Many Russians today consider loans for shares one of the biggest thefts in the history of mankind. Hoffman, incidentally, didn't bother to cover loans for shares as a reporter, either.

    One final note about Hoffman. Many reviewers have lauded The Oligarchs for its "readability." They must have been reading a different book. If there is a worse descriptive writer in the journalism world than Hoffman, I have yet to come across him or her. In those passages in which he goes after the "breezy" conversational style of David Remnick's Pulitzer Prize-winning Lenin's Tomb (Hoffman's Remnick inferiority complex is grossly obvious in this book), he repeatedly breaks down into crass stupidities that reveal his lack of knowledge about the country he covered for half a decade.

    At one point, for instance, he describes the young Chubais as having had a penchant for driving his Zaporozhets automobile at "terrifying speeds." As the owner of two such cars, which feature 38-horsepower engines and can be lifted off the ground by two grown men (or maybe four Washington Post correspondents), I can testify that terror is not and has never been in this machine's design profile.

    Hoffman's atrocious Russian, a subject of much snickering in the Moscow press community, also shines through in this book. He consistently mistranslates Russian expressions and misses linguistic and cultural references. For instance, when he talks about Chubais's habit of spending long hours in the Publichka, which he says is what "young scholars fondly called the [public] library," he appears not to grasp that the "fond" nickname is a play on the term publichniy dom, or whorehouse.

    This might be because Hoffman is the only American male to have visited Moscow in the 1990s and escaped without personal knowledge of the term. Whatever the explanation, it seems clear that Hoffman is not the kind of person one would normally consider an authority on the nontycoon Russian experience.

    That's particularly true given the ironic fact that prostitution was one of the few real growth industries during the reign of the oligarchs, the one feasible financial option for the modern-day Irinas of Russia. That's modern Russia in a nutshell: plenty of toilet paper for the asking, but no way to afford it except...the hard way.

    If The Oligarchs is simply a wrongheaded book, then Building Capitalism, by Carnegie fellow Anders Aslund, is legitimately insidious. Aslund throughout the 1990s was a key adviser to reform politicians like Anatoly Chubais and Yegor Gaidar, and as such his assessment of the success of the privatization era is obviously self-interested. He claims in the book that "populations have gained from fast and comprehensive reforms," and that "economic decline and social hazards have been greatly exaggerated, since people have forgotten how awful communism was."

    This is typical of Western analysis of Russia over the past ten years--an academic who grew up in Sweden and lives in Washington, telling Russians that their complaints about reform are groundless because, unlike Western experts, they do not accurately remember what life was like under Communism.

    Aslund, who helped to design the privatization programs in the middle of the past decade, goes on in the book to defend those blitzkrieg liquidations of state industries on the grounds that such formal privatizations were more equitable than what he calls "spontaneous privatization."

    A major aim of formal privatization was to stop spontaneous privatization, which was inequitable, slow, and inefficient. Reformers feared it would arouse a popular political backlash against privatization and reform, as indeed happened all over. Especially in the [former Soviet Union], the saying "what is not privatized will be stolen" suggested the urge for great speed.

    It's not clear from this passage to whom this "great speed" idea was suggested. Those "equitable" formal privatizations Aslund helped design left billion-dollar companies like Yukos and Norilsk Nickel in the hands of single individuals (Khodorkovsky and Potanin, respectively) for pennies on the dollar. They were so corrupt and unfair that for most Russians--the majority of whom were left impoverished by the changes--the word "privatization" became synonymous with theft. Indeed, Russians even coined a new term, prikhvatizatsiya (or "grabitization"), that perfectly expressed their outrage over the private commandeering of property they considered public and their own.

    It should be admitted that the extent to which one finds success in Russia's capitalist experiment--and the worth of the oligarchs who administered it--is largely a matter of opinion.

    If you believe that capitalism is about destroying a country's industry, handing over its wealth to a dozen or so people who will be inclined to move it instantly to places like Switzerland and Nauru Island, and about humiliating the general population so completely that they are powerless to do anything but consume foreign products and long for the "good old days" of totalitarianism (polls still consistently show that 70 percent of the population preferred life under Brezhnev to that of today's Russia), then you have to judge the Russian experiment a success.

    But if you believe that people are more than just numerical variables in some dreary equation found in an Adam Smith reader (or perhaps numbers lumped together with cows in Anatoly Chubais's dog-eared Hayek text), then you'll have a hard time finding any true capitalism at all in today's Russia. Or in either of these coldhearted books, for that matter.

    Matt Taibbi

  • In Cold Type

    I haven't done much mental spring cleaning because so much of the last month has been taken up with brooding and spewing about the crisis in the Middle East; no doubt the coming months will be much the same. After putting your mind to this issue for a long time--witness Shimon Peres, New York Times columnist Tom Friedman and so many others--cobwebs gather and it becomes hard to see through the accumulated dust. So it was pleasant to turn to Legal Affairs, the new publication of the Yale Law School, edited by Lincoln Caplan, which casts an intelligent eye over a broad and spacious intellectual terrain.

    Of course the first item I turned to--obsessively--was an article on Israel, more specifically on the legendary Supreme Court President Aharon Barak (no relation), by Emily Bazelon--thankfully the only Middle East piece in the inaugural issue, or who knows how I might have been sidetracked. In 1992, from his seat on the Israeli Supreme Court, he championed the Basic Laws that now serve the country as a kind of de facto constitution and give Israel one of the most progressive sets of human rights laws and precepts to govern any nation. But that was just a first step for this exceptional person.

    In May 1998, in a historic pronouncement, he declared (I'm simplifying here) that torture of Palestinian detainees by the Shin Bet was not legalized under Israeli codes. This meant that one day there would be no more shabach--the technique of tying prisoners to kindergarten chairs, putting their heads in sacks and subjecting them to humiliation and psychological torture. It meant no more shaking, a favored method that disorients and injures without leaving visible signs. No more sleep deprivation. Barak later codified this ruling, when he "unequivocally declared for a unanimous court that the Shin Bet's methods of interrogating Palestinians detained without charges violated the rights to human dignity and freedom." But those were better days in Israel, and Bazelon points out that current conditions may have allowed the Shin Bet to violate the ban. The Public Committee Against Torture has filed two petitions to the court since September 2000, both arguing that the ban on torture has not been "fully enforced," as Bazelon understates it. One petition was withdrawn and the other rejected. Like so many of his generation who hoped to normalize life in Israel, Barak too has been undermined by the Degeneration of the Situation.

    Anyway, Legal Affairs is not all bleakness and Jerusalem drizzle. Its other lead piece is Brendan Koerner's dazzling narrative of cyber-intrigue and blackmail that extends from Russia to FBI headquarters in DC. The magazine also looks at hip-hop music with the amusing premise that it is all about law enforcement, in a piece that would be great but for its silly, super-serious tone. Tim Dodd contributes an excellent article from Jakarta on Syafiuddin Kartasasmita, the conservative Indonesian judge who was assassinated a year after leading a three-man panel that found the youngest son of the dictator Suharto, Tommy, guilty of corruption. A very amusing piece by Dashka Slater tells you what it's like to spend a working week watching only court TV (answer: terrific and soporific). A bunch of small excerpts from Christopher Buckley's latest Washington entertainment (No Way to Treat a First Lady) are fun, if not terribly enlightening. And "Silence! Four ways the law keeps poor people from getting heard in court" should be on the reading list of every legal reporter and defense attorney in America. There is also a no doubt valuable piece by Benjamin Wittes on the faulty legal underpinnings of Kenneth Starr's behavior (but lines like "the attorney general had the authority to decline to request an independent counsel where a clear Justice Department policy would preclude an indictment" really harsh my buzz). Legal Affairs reminds you that the law matters--unlike American Lawyer, which makes you think the law is a buddy system for grotesque elites in major urban centers who speak a language the rest of us cannot understand (except when it's about gigantic salaries and hourly fees). The new magazine reminds you that the law is the element in which most of the major stories of our lives take place (marriages, births, deaths, crimes, real estate closings, divorce), and that it provides the narrative framework for the unfolding of most important events.

    News From Nowhere

    Globalvision News Network has set up an extremely useful website called The News Not in the News (you can find it online, by subscription). This is where you can see what the Arab press is really reporting; where you'll find the latest from places like Kyrgyzstan, where the government has just resigned following unrest since the May 10 sentencing of Felix Kulov, the foremost leader of the Kyrgyz (new national adjective!) opposition, to ten years in prison. The stories are put up without annotation, so that, say, the Kyrgyz reporting can seem convoluted to the uninitiated reader. But you wouldn't want to miss this story: In his first interview in two years--conducted along the Afghanistan-Pakistan border in writing and by messenger, not in person--Mullah Omar (you remember him) tells Asharq Al-Awsat, an Arab newspaper, that flames will engulf the White House and that Osama bin Laden is still alive. Of course, for all one knows, the interviewee could have been an Afghan schoolboy having his fun, since there is no proof that the reporter's questions were actually relayed to Omar himself. But that is what is both useful and charming about this site: It is raw news as it is written and printed in other lands, as fresh as it can be, and with its edgy myth-making untouched by American objectivity. "What is important for the US now is to find out why they did that [the attack on September 11]," says "Omar." "America should remove the cause that made them do it." If only "Omar" had a mirror version of The News Not in the News, he could see what a tempest that very same issue set off in America's own pages not so long ago. But we wouldn't want to harsh his buzz.

    Amy Wilentz

  • Letters



    Cambridge, Mass.

    William Schulz, in his respectful but selectively critical review of "less than two of [Shouting Fire]'s 550 pages," misses the point of my proposal regarding torture warrants ["The Torturer's Apprentice," May 13]. I am against torture, and I am seeking ways of preventing or minimizing its use. My argument begins with the empirical claim--not the moral argument--that if an actual ticking bomb case were ever to arise in this country, torture would in fact be used. FBI and CIA sources have virtually acknowledged this. Does Schulz agree or disagree with this factual assertion? If it is true that torture would in fact be used, then the following moral question arises: whether it is worse in the choice of evils for this torture to take place off the books, under the radar screen and without democratic accountability--or whether it is worse for this torture to be subjected to democratic accountability by means of some kind of judicial approval and supervision. This is a difficult and daunting question, with arguments on all sides. In my forthcoming book Why Terrorism Works, I devote an entire chapter to presenting the complexity of this issue, rather than simply proposing it as a heuristic, as I did in the two pages of Shouting Fire on which Schulz focuses. Schulz simply avoids this horrible choice of evils by arguing that it does not exist and by opting for a high road that will simply not be taken in the event that federal agents believe they can actually stop a terrorist nuclear or bioterrorist attack by administering nonlethal torture.

    Schulz asks whether I would also favor "brutality warrants," "testilying" warrants and prisoner rape warrants. The answer is a heuristic "yes," if requiring a warrant would subject these horribly brutal activities to judicial control and political accountability. The purpose of requiring judicial supervision, as the Framers of our Fourth Amendment understood better than Schulz does, is to assure accountability and judicial neutrality. There is another purpose as well: It forces a democratic country to confront the choice of evils in an open way. My question back to Schulz is, Do you prefer the current situation, in which brutality, testilying and prison rape are rampant, but we close our eyes to these evils?

    There is, of course, a downside: legitimating a horrible practice that we all want to see ended or minimized. Thus we have a triangular conflict unique to democratic societies: If these horrible practices continue to operate below the radar screen of accountability, there is no legitimation, but there is continuing and ever-expanding sub rosa employment of the practice. If we try to control the practice by demanding some kind of accountability, we add a degree of legitimation to it while perhaps reducing its frequency and severity. If we do nothing, and a preventable act of nuclear terrorism occurs, then the public will demand that we constrain liberty even more. There is no easy answer.

    I praise Amnesty International for taking the high road--that is its job, because it is not responsible for making hard judgments about choices of evil. Responsible government officials are in a somewhat different position. Professors have yet a different responsibility: to provoke debate about issues before they occur and to challenge absolutes. That is what Shouting Fire is all about.



    New York City

    Neither I nor Amnesty International can be accused of having closed our eyes to the reality of torture, police brutality or prison rape. Of course, some authorities may utilize torture under some circumstances, just as others choose to take bribes. The question is, What is the best way to eradicate these practices? By regulating them or outlawing them and enforcing the law? That an evil seems pervasive or even (at the moment) inevitable is no reason to grant it official approval. We tried that when it came to slavery, and the result was the Civil War. Had we applied Professor Dershowitz's approach to child labor, American 10-year-olds would still be sweating in shops.



    Princeton, N.J.

    Christopher Hitchens argues that "suicide murders would increase and not decrease" if a two-state solution between Israelis and Palestinians moved closer to reality ["Minority Report," May 13]. This claim seems to bolster Sharon's cataclysmic "war on terror" in the occupied territories: If terrorists seek to destroy peace and only feed on Israel's generosity and sincerity, surely Sharon is correct to eliminate "terror" as a precondition for negotiations?

    In fact, the Oslo process has moved the Palestinians further from the goal of a viable state, and the Israeli left's best offers to date (at Camp David and Taba) envisage the annexation of the vast majority of settlers to Israel in perpetuity along with blocs of land, which would fatally compromise a nascent Palestine. As for Hitchens's observation that the first suicide bombings coincided with the Rabin/Peres government: How does this undermine the explanation that Israel's prolonged oppression has created and fueled the bombers? Rabin and Peres imposed a curfew on Palestinians rather than Israeli settlers after the murder of twenty-nine Arabs by Baruch Goldstein in Hebron early in 1994 (the first suicide bombing was in response to this); they sent death squads into the West Bank and Gaza to kill militants and those who happened to be in their vicinity (the wave of suicide bombings in the spring of 1996 followed one such assassination); and they greatly expanded the settlements, contributing their share to the broader trend of illegal settlement expansion that's doubled the number of Israelis living across the Green Line since 1992.

    Hitchens's promotion of a "culture war" between religious extremists and secular opponents of "thuggery and tribalism" obfuscates the reality of Israel's prolonged and enduring oppression of Palestinians. His argument that a more generous Israeli policy would lead to more Palestinian violence, meanwhile, serves to legitimize Sharon's current tactics. How did such a clearsighted commentator become so myopic? Perhaps if Hitchens stopped looking at every situation through the lens of the "war on terror," he'd regain his former clarity of vision.



    Washington, DC

    I share Eric Alterman's admiration for the work of biographer Robert Caro ["Stop the Presses," May 6]. But why does Alterman feel compelled to refer to Lyndon Johnson as a "thoroughgoing racist"? Johnson was a white man born in 1908 in the most racist region of the most racist country on earth. He was born in a time and place where racism was accepted as part of the atmosphere, where lynching was commonplace, where black people led lives of unimaginable degradation (see Leon Litwack's Trouble in Mind, a portrait of the early twentieth-century Jim Crow South, which has to be read to be believed).

    Of course, given his background, political ambitions and ineligibility for sainthood, Johnson used racist language and shared racist assumptions. Who from that time and place, wanting what he wanted, did not? But what distinguishes Johnson, at all stages in his public career, was his relative lack of public racism. Johnson was a New Deal Congressman from 1937 to '48 who never strayed from loyalty to the national Democratic Party even though conservative Texas Democrats were in revolt against it from 1944 onward. Of course, running for the Senate against a Dixiecrat in 1948 as Southern resistance to civil rights was beginning to build, he opposed the Truman civil rights program. That was the minimum required to be elected to Texas statewide office. Given the pathological ferocity of Johnson's ambition, sticking with Truman for re-election, as Johnson did, took guts that year. As a senator, Johnson was never identified as a leader of the Southern bloc or as an enemy of civil rights. Again, especially in public, he said and did the political minimum to pay homage to the racist consensus. Caro evidently describes his involvement in the Civil Rights Act of 1957, the forerunner of all the other civil rights laws to come. Texas black and Hispanic voters never doubted that, given the alternatives, LBJ was their man.

    Johnson later became the greatest civil rights President in history, pushing through the epochal changes in the laws, appointing Thurgood Marshall to the Supreme Court and going so far as to vet prospective federal judges with the NAACP Legal Defense Fund. Blacks who worked with him, like Roger Wilkins, remember him fondly while acknowledging his ancestral racism, which he tried, not always successfully, to transcend. But if Johnson is a "thoroughgoing racist," where does that leave Richard Russell, James Eastland or Strom Thurmond--or Richard Nixon, for that matter? What about Barry Goldwater, who was probably less "racist" than Johnson but was an opponent of all civil rights legislation and was the leader of the forces of unrepentant segregation (i.e., racist murder and oppression) in 1964?

    As with Abraham Lincoln, also now under renewed attack on similarly ahistorical grounds, to describe Johnson as an extreme racist flattens the historical landscape and renders the fierce conflicts of a past age meaningless. There is nothing wrong with honestly describing anybody's racial views, including those of Lincoln or Johnson. But in studying history, context is everything. And in studying Lincoln or Johnson, what matters most is not the ways they shared their contemporaries' racial attitudes but the ways they did not, as reflected in their words and actions.



    New York City

    There's a bit of hyperbole in Peter Connolly's thoughtful letter, and I disagree with his point about it taking guts to stick with the Democratic President, but by and large I think his criticism is on the mark, and I appreciate it. He is right. Context is everything. Johnson may have been a racist, but unlike most politicians in his time and place he was not a "race man." That's an important distinction, and I wish I had considered it.


    Eric Alterman, William F. Schulz and Our Readers