March 12, 2001 Issue

  • Editorials

    Right With Bush

    Every conservative is now a compassionate conservative. Well, most were at the recent annual Conservative Political Action Conference, which drew more than 3,000 right-wing activists and leaders to a hotel outside Washington. A year ago George W. Bush was viewed with suspicion by many conservative honchos who worried that ideological wimpiness ran in the family and that Bush's Compassionate Conservatism™ was a retreat from traditional conservatism. What a difference a butterfly ballot can make. At the confab Bush was embraced by this flock as one of their own, a politician who waged a masterful, conservative campaign and who--even better--has adopted as his role model not his pop but Ronald Reagan. Marc Holtzman, the Colorado secretary of technology, proclaimed that a "conservative [is] shaping America today."

    Had a county elections officer in Palm Beach not designed a confusing ballot, these cons probably would be whining about Bush and the wishy-washiness of compassionate conservatism. But winning--even by Supreme Court fiat--changes everything. And the attendees were delighted to grant Bush slack. They did not snipe at his tax-cut plan (too small and unrevolutionary for most of them), his education plan (which bolsters the Education Department rather than demolishes it and nudges school choice toward the back of the bus) or his military-spending plan (which includes a Pentagon raise but does not immediately shower the military with extra tens of billions of dollars). They're willing to wait for Bush to score legislative wins before pressing Social Security privatization, and they're content with an incremental approach to restricting abortion rights.

    This usually cantankerous lot is saluting and following. Commentator Ann Coulter noted that Bush "could teach us a few things.... He discovered all you had to do was go around calling yourself nice.... Many of us took umbrage at that." But it worked. Not everyone absorbed the lesson. Leftist-turned-rightist author David Horowitz urged Republicans to "stop being so polite." Call the liberals what they truly are, he advised: "totalitarians."

    Still, the bitterness quotient at this CPAC was much lower than in previous years. No more Where's Lee Harvey Oswald When You Need Him? bumper stickers. (Instead, one could buy Dixie Forever stickers--as speakers urged conservatives to reach out to blacks and Latinos.) Bill and Hillary Clinton received fewer jabs than expected. A group called America's Survival did hand out a report on "Hillary Clinton's Secret United Nations Agenda." (Implement "world government...that will destroy American sovereignty and traditional families.") Oliver North blasted the ex-President for pardoning Marc Rich, because Rich traded with hostage-holding Iran. (Did North forget he sent missiles to hostage-holding Iran?) Senator James Inhofe griped, "We have had a President who has given away or covered up [the illegal transfer of] virtually every secret in our nuclear arsenal." Nevertheless, many CPACers appeared to believe it was time to move on.

    But even as rightists control the White House and Congress, cons still claim they are besieged. Terry Jeffrey, the editor of Human Events, asserted that "the iron law of American journalism" still stands: "The most conservative candidate in any campaign will be demonized by the establishment press." (Perhaps he ought to ask Al Gore about this.) Coulter, in all seriousness, said that Republicans and conservatives--in battling Democrats and liberals--"are always at a disadvantage because we won't lie." One activist complained that Democrats "with their talking points run circles around Republicans." Another fretted that the GOP, up against a Democratic Party backed by organized labor, was "losing the ground campaign." An NRA official had to remind him that the gun lobby runs its own ground campaign pretty darn well. Perhaps it's tough to be in power when you're accustomed to viewing yourself as a victim of persecution.

    Of course, enemies abound. The National Right to Work Foundation's Stefan Gleason reported that the AFL-CIO "has now embraced communist influences." Senator Mitch McConnell noted that campaign finance reform is a plot mounted by Hollywood, academia and the media to "quiet your vote...[so] they'll have more power." The NRA's Wayne LaPierre warned that the organizers of a UN conference on gun control "want the marvelous millennial youth [of the United States] not to be American citizens but global citizens.... I say never!" Andrea Sheldon Lafferty of the Traditional Values Coalition accused Planned Parenthood of defending abortion rights so it can make money selling fetal remains.

    Fear and loathing continue, but Bush has tamed this fierce crowd. "The ideologically motivated in politics are often disappointed," said David Keene, chairman of the American Conservative Union. "But most conservatives are surprised they like Bush so much." Marc Rotterman, a senior fellow at the John Locke Foundation, reflected the spirit of CPAC when he remarked, "We on the right need to give Bush a chance to develop a broad-based agenda. After 1994 we expected things to go too fast." Now they watch Bush with hope, and they dare to believe.

    David Corn

  • Confronting Iraq

    George W. Bush's description of the US-British bombing of Iraq as a "routine mission" unwittingly summed up the mechanical nature of the air operations, which have run on autopilot since 1992. These sorties continue because no one has a better idea of what US policy toward Iraq should be. The only rationales for the February 16 strike were to tell Saddam Hussein that the mindless air campaign will continue under a new administration and to reduce the possibility that Iraq's improved air defenses might shoot down a US plane on the eve of Secretary of State Colin Powell's trip to the Middle East.

    But the attack's main outcome was to remind the world of the emptiness of US policy in the area. The sanctions regime is now widely ignored; America's European allies, led by the French, are furious at Washington's unilateralism (even Tony Blair's foreign minister was preparing to relax sanctions). Bush spoke of enforcing "the agreement that [Saddam Hussein] signed after Desert Storm," but the Clinton Administration helped undermine the UN inspection regime instituted after the war by making it an anti-Saddam operation. UNSCOM inspectors pulled out, never to return, just before December 16, 1998, when cruise missiles were unleashed against Baghdad in Operation Desert Fox. Washington's obdurate support of the sanctions, despite massive suffering among the Iraqi people, eroded the anti-Saddam consensus in the Arab world that developed after his invasion of Kuwait. Finally, the failure of Mideast peace talks and Ariel Sharon's victory in Israel lend credence to Saddam's claim to be the champion of the Palestinians, and they provide him with another opportunity to play to the Arab streets and mendaciously blame US-Israeli conniving.

    Far from strengthening Powell's mission, the bombings stirred up renewed hostility among the Arab people. The Bush team's campaign pronouncements on Iraq do not allow hope that Powell brings any new ideas to the region. Indeed, the ineluctable drift of events in the past year has left the new Administration few options. The old, cruel sanctions policy is discredited, and there is scant hope at this point that the Iraqis will agree to accept UN inspectors, who are the best check on Saddam's efforts to rebuild his war machine. As it happens, UN Secretary General Kofi Annan was to meet with the Iraqi foreign minister February 26-27 to discuss reinstating them; the bombing surely hasn't helped this initiative. And there is virtually no international support for any of the Administration plans to beef up support for Iraqi opposition groups. Without the backing of a wide coalition of countries, no policy has any chance of success.

    The wisest future course for the United States is to forge a more modest containment and sanctions policy that might win the support of America's partners. It should aim to put in place limited and precisely targeted sanctions designed to curtail Iraq's import of advanced military technology and to contain Saddam. That means abandoning unilateralism (something that goes against the grain of this new White House) and reaching out not only to the UN and allies in Europe and the Middle East but to regional players like Turkey and Russia.

    It is ironic that Colin Powell, the architect of Desert Storm, must now deal with its long-term consequences--its failure to bring peace and stability to the region.

    the Editors

  • Bush’s Nuclear Revival

    George W. Bush's mid-February directive ordering the Pentagon to review and restructure the US nuclear arsenal is a wake-up call for supporters of arms control and disarmament. Under the guise of revising nuclear policy to make it more relevant to the post-cold war world, the Bush Administration is pushing an ambitious scheme to deploy a massive missile defense system and develop a new generation of nuclear weapons. If fully implemented, Bush's aggressive new policy could provoke a multisided nuclear arms race that will make the US-Soviet competition of the cold war era look tame by comparison.

    To understand the danger of Bush's emerging nuclear doctrine, you have to read the fine print. Some elements of his approach--first outlined in a May 23, 2000, speech at the National Press Club--sound sensible. Bush implied that if elected President, he would reduce the nation's arsenal of nuclear overkill from its current level of 7,500 strategic warheads to 2,500 or fewer. In tandem with these reductions, which go beyond anything the Clinton Administration contemplated, Bush also promised to take as many nuclear weapons as possible off hair-trigger alert status, thereby reducing the danger of an accidental launch.

    So far, so good: fewer nuclear weapons, with fewer on high-alert status, would be a step in the right direction. Unfortunately, Bush also committed himself to deploying, "at the earliest possible date," a missile defense system capable of defending "all fifty states and our friends and allies and deployed forces overseas." Whereas the $60 billion Clinton/Gore National Missile Defense scheme involved land-based interceptors in Alaska and North Dakota, Bush's enthusiasm for a new Star Wars system knows no limit. The President and his Star Warrior in Chief, Defense Secretary Donald Rumsfeld, are willing to put missile interceptors on land, at sea, on airplanes and in outer space in pursuit of continued US military dominance.

    When Bush announced Rumsfeld's appointment in late December, he acknowledged that the Pentagon veteran would have a big "selling job" to do on national missile defense, with allies and potential adversaries alike. But even Washington's closest NATO allies continue to have grave reservations about Rumsfeld's suggestion that the United States might trash the Anti-Ballistic Missile treaty of 1972 in order to pursue its missile defense fantasy. Meanwhile, Russian President Vladimir Putin has flatly stated that a US breakout from the treaty would call the entire network of US-Russian arms agreements into question.

    The cost of Bush's Star Wars vision could be as much as $240 billion over the next two decades, but that's the least of our problems. According to a Los Angeles Times account of a classified US intelligence assessment that was leaked to the press last May, deployment of an NMD system by the United States is likely to provoke "an unsettling series of political and military ripple effects...that would include a sharp buildup of strategic and medium-range nuclear missiles by China, India and Pakistan and the further spread of military technology in the Middle East."

    Bush's provocative missile defense scheme may not even be the most dangerous element of his new-age nuclear policy. According to Steven Lee Myers of the New York Times, Bush's renovation of US nuclear doctrine will draw heavily on a January 2001 study by the National Institute for Public Policy that was directed by Dr. Keith Payne, whose main claim to fame is co-writing a 1980s essay on nuclear war titled "Victory Is Possible." Bush National Security Council staffers Robert Joseph and Stephen Hadley were involved in the production of the NIPP study, as was William Schneider, informal adviser and ideological soulmate of Donald Rumsfeld.

    In its most egregious passage, the study advocates the development and design of a new generation of nuclear weapons to be used for both deterrent and "wartime roles," ranging from "deterring weapons of mass destruction (WMD) use by regional powers" to "preventing catastrophic losses in a conventional war," from "providing unique targeting capabilities (deep underground/biological weapons targets)" to "enhancing US influence in crises." In short, at a time when a number of prominent military leaders, like Gen. Lee Butler, the former head of the Strategic Air Command, have been suggesting the abolition of nuclear weapons on the grounds that they serve no legitimate military purpose, George W. Bush is taking advice from a group of unreformed initiates in the nuclear priesthood who are desperately searching for ways to relegitimize nuclear weapons.

    The unifying vision behind the Bush doctrine is nuclear unilateralism, the notion that the United States can and will make its own decisions about the size, composition and employment of its nuclear arsenal without reference to arms control agreements or the opinions of other nations. It is a disastrous doctrine that raises the odds that nuclear weapons will be used again one day, and as such it demands an immediate and forceful public response.

    It's not as if we haven't been down this road before. In the 1980s, when Ronald Reagan rode into Washington with guns blazing, pressing for a massive nuclear buildup and a Star Wars missile defense system, the international peace movement helped roll back his nightmare nuclear scenarios and push him toward a policy of nuclear arms reductions, not mutual annihilation. It will take that same kind of energy and commitment to stave off Bush's born-again nuclearism.

    William D. Hartung

  • Liberation Musicology

    The recording industry has been celebrating the supposed defeat of Napster. The Court of Appeals for the Ninth Circuit has affirmed the grant of a preliminary injunction that may well have the effect of closing the service down completely and ending the commercial existence of Napster's parent (that is, unless the record companies agree to an implausible deal Napster has proposed). But despite appearances, what has happened, far from being a victory, is the beginning of the industry's end. Even for those who have no particular stake in the sharing of music on the web, there's value in understanding why the "victory" over Napster is actually a profound and irreversible calamity for the record companies. What is now happening to music will soon be happening to many other forms of "content" in the information society. The Napster case has much to teach us about the collapse of publishers generally, and about the liberative possibilities of the decay of the cultural oligopolies that dominated the second half of the twentieth century.

    The shuttering of Napster will not achieve the music industry's goals because the technology of music-sharing no longer requires the centralized registry of music offered for sharing among the network's listeners that Napster provided. Freely available software called OpenNap allows any computer in the world to perform the task of facilitating sharing; it is already widely used. Napster itself--as it kept pointing out to increasingly unsympathetic courts--maintained no inventory of music: It simply allowed listeners to find out what other listeners were offering to share. Almost all the various sharing programs in existence can switch from official Napster to other sharing facilitators with a single click. And when they move, the music moves with them. Now, in the publicity barrage surrounding the decision, 60 million Napster users will find out about OpenNap, which cannot be sued or prohibited because, as free software, no one controls its distribution and any lawsuits would have to be brought against all its users worldwide. Suddenly, instead of a problem posed by one commercial entity that can be closed down or acquired, the industry will be facing the same technical threat, with no one to sue but its own customers. No business can survive by suing or harassing its own market.

    The music industry (by which we mean the five companies that supply about 90 percent of the world's popular music) is dying not because of Napster but because of an underlying economic truth. In the world of digital products that can be copied and moved at no cost, traditional distribution structures, which depend on the ownership of the content or of the right to distribute, are fatally inefficient. As John Guare's famous play has drummed into all our minds, everyone in society is divided from everyone else by six degrees of separation. The most efficient distribution system in the world is to let everyone give music to whoever they know would like it. When music has passed through six hands under the current distribution system, it hasn't even reached the store. When it has passed through six hands in a system that doesn't require the distributor to buy the right to pass it along, it has already reached several million listeners.

    This increase in efficiency means that composers, songwriters and performers have everything to gain from making use of the system of unowned or anarchistic distribution, provided that each listener at the end of the chain still knows how to pay the artist and feels under some obligation to do so, or will buy something else--a concert ticket, a T-shirt, a poster--as a result of having received the music for free. Hundreds of potential "business models" remain to be explored once the proprietary distributor has disappeared, no one of which will be perfect for all artistic producers but all of which will be the subject of experiment in decades to come, once the dinosaurs are gone.

    No doubt there will be some immediate pain that will be felt by artists rather than the shareholders of music conglomerates. The greatest of celebrity musicians will do fine under any system, while those who are currently waiting on tables or driving a cab to support themselves have nothing to lose. For the signed recording artists just barely making it, on the other hand, the changes are of legitimate concern. But musicians as a whole stand to gain far more than they lose. Their wholesale defection from the existing distribution system is about to begin, leaving the music industry--like manuscript illuminators, piano-roll manufacturers and letterpress printers--a quaint and diminutive relic of a passé economy.

    The industry's giants won't disappear overnight, or perhaps at all. But because their role as owner-distributors makes no economic sense, they will have to become suppliers of services in the production and promotion of music. Advertising agencies, production services consultants, packagers--they will be anything but owners of the music they market to the world.

    What is most important about this phenomenon is that it applies to everything that can be distributed as a stream of digital bits by the simple human mechanism of passing it along. The result will be more music, poetry, photography and journalism available to a far wider audience. Artists will see a whole new world of readers, listeners and viewers; though each audience member will be paying less, the artist won't have to take the small end of a split determined by the distribution oligarchs who have cheated and swindled them ever since Edison. For those who worry about the cultural, economic and political power of the global media companies, the dreamed-of revolution is at hand. The industry may right now be making a joyful noise unto the Lord, but it is we, not they, who are about to enter the promised land.

    Eben Moglen

  • Thomas Speaks!

    Back during the presidential campaign, George W. Bush called Clarence Thomas and Antonin Scalia his favorite Supreme Court Justices--a remark widely interpreted at the time as just smoke-blowing in the direction of the right. Guess what--it's time to start taking Bush at his word, especially when it comes to Thomas.

    Just weeks after the inauguration, Justice Thomas has emerged as the new Administration's judicial patron saint. The top three officials of the Bush Justice Department--Attorney General John Ashcroft, Solicitor General-designate Theodore Olson and Deputy Attorney General-designate Larry Thompson--are all close Thomas friends. Thomas even officiated at Olson's wedding (also Rush Limbaugh's) and Ashcroft's swearing-in. While Thomas's wife, Virginia, shovels Heritage Foundation résumés into the 1600 Pennsylvania Avenue personnel department, his former clerk Helgard Walker sits in the White House counsel's office.

    After the Court's Florida decision, Thomas told a group of high school students that his famous, baffling reluctance to ask questions on the bench grows out of his childhood fear of being mocked for speaking Gullah (a black language) in an all-white seminary class. Maybe, but the vindicating presence of so many friends in the White House seems to have given the Supreme Court's Garbo new confidence: After nearly a decade on the sidelines, in mid-February Thomas emerged into the Washington spotlight at the American Enterprise Institute (AEI) with a Castro-length jeremiad on what he views as continuing liberal efforts to stifle him and other conservative culture warriors.

    When Thomas was nominated for the Court, some African-American and liberal voices argued that his biography as a black man gave hope that with time he would moderate his far-right views on affirmative action, welfare and civil rights. His rulings make their own testimony, of course, but if that AEI dinner speech is any indication, what is most remarkable about Thomas is that he has scarcely changed at all, either in preoccupations or politics. The themes of his speech--a hodgepodge of cherry-picked libertarian quotes from the likes of Hamilton, Montesquieu and Thomas Sowell--were instantly familiar to anyone who waded through his preconfirmation writings as the Reagan Administration's dismantler of equal opportunity enforcement. Back then, he praised sports and business as the great crucibles of character in a free society. In his AEI speech he told of how "the great UCLA basketball coach John Wooden taught his players how to play the game by first teaching them how to lace up and tie their shoes." Back in 1991, Thomas dodged uncomfortable questions about his friend Jay Parker, a flack and registered agent for the apartheid-era South African government. In his speech he went out of his way to praise Parker as his mentor.

    Most of all, what has remained consistent about Justice Thomas is his swirling hornet's nest of resentment--that strange combination of megalomania and self-pity embodied in his famous denunciation of his confirmation hearings as a "high-tech lynching." At AEI he favorably compared himself and other conservative culture warriors to Dimitar Peshev, a heroic Bulgarian civil servant who during World War II secured the rescue of Sofia's Jews at considerable personal risk. Thomas remains obsessed with the idea of conservatives as persecuted victims--which, since those conservatives now run the White House, Justice and Congress, raises questions about his hold on reality. But the question currently being floated in Washington judicial circles is whether Thomas, not the oft-mentioned Scalia, is Bush's favored successor to Chief Justice William Rehnquist.

    Bruce Shapiro

  • Columns

    Unreality Television

    The network honchos called by Louisiana Representative Billy Tauzin and the House Energy and Commerce Committee to testify on the election night debacle were a decidedly ungrateful bunch. True, they were forced to sit through a video of their billion-dollar babies making idiots of themselves. (Watching Dan Rather offering "a big tip and a hip, hip, hurrah and a great big Texas howdy to the new President of the United States," and instructing viewers to "Sip it. Savor it. Cup it. Photostat it. Underline it in red. Press it in a book. Put it in an album. Hang it on the wall," more than once ought to be considered cruel and unusual by anyone's standards.) And how rare it must be that anyone, much less mere members of Congress, would dare keep these boys cooling their heels for a full five hours before finally bringing them forward to demand that they swear to tell the truth, the whole truth and nothing but the truth--in public, no less. But really, all the "concern" and "uneasiness" voiced by the execs about government meddling in the news was a bit much. There was never any danger to the networks' independence in Tauzin's hearings; at least none that originated from Congress, rather than their own parent companies.

    Tauzin, a Democrat turned Republican, originally professed to possess an "analysis" that indicated "in almost every case, [the networks] favored early calls for Al Gore over George Bush." Absent any evidence, however, he withdrew the charge of intentional bias and retreated behind a mysterious theory of "flawed data models" and "biased statistical results" that happened to favor Democrats. He offered no evidence this time either, but almost all reporters felt duty-bound to repeat his nonsensical accusations. Hence precious little attention was focused on more concrete election-coverage questions, most notably Fox's decision to rely on the analysis of John "I can't be honest about [my cousin George W. Bush's] campaign.... He's family, and I'm for him" Ellis. And needless to say, there was no time left for an examination of the corrupting effect of the networks' interlocking structure of corporate ownership.

    Had Tauzin and company really tried to censor or intimidate the networks, that would have been interesting, but it is damn near impossible to imagine. As a comprehensive report on media lobbying by the Center for Public Integrity demonstrates, when it comes to mutual backscratching, the primates in the National Zoo have nothing over the networks and Congress.

    Take Tauzin, for instance. According to the CPI report--which might as well have been classified "top secret" for all the attention lavished on it by the media it exposes--the cagey Cajun received more PAC money from media companies than anyone else in the House, including more than $150,000 from entertainment and telecommunications companies for his 2000 campaign, in which he had no credible opponent. Moreover, no member of Congress has traveled more frequently on the media industry's dime. Between 1997 and 2000, Tauzin and his staff took a total of forty-two trips--one out of eight industry-sponsored junkets taken by members of Congress during that period. In December 1999 Tauzin and his wife enjoyed a six-day, $18,910 trip to industry "meetings" in Paris. Representative John Sweeney managed to make the same trip for a mere $7,445. How can Tauzin act as an honest broker for the networks filling his pockets? Easy: He simply does not believe in the concept of conflict of interest. "I have no choice but to do effective oversight," he says by way of explanation. Tauzin's view is hardly unique. His successor as chairman of the House Telecommunications Subcommittee, Fred Upton, has a portfolio worth millions in those very same companies.

    Again, we are seeing nothing unusual here, except perhaps gumption. In 1999 alone, according to the CPI, the fifty largest media companies and four of their trade associations coughed up more than $30 million to lobby Congress, an increase of 26.4 percent in three years. Since 1993, they have given more than $75 million in direct campaign contributions, according to the Center for Responsive Politics. And the numbers tell just a small part of the story. These fellas are not just selling toasters, after all. As former FCC chairman Reed Hundt has explained, more important than the industry's money is the perception of its "near-ubiquitous, pervasive power to completely alter the beliefs of every American." Politicians fear that if they displease these companies, they will simply "disappear" from view.

    And what do the media want in exchange for this largesse? They want to be left alone so they can make themselves and their stockholders rich, regardless of their impact on American democracy. To take just one example, according to data collected by Competitive Media Reporting, politicians and special interests spent an estimated $600 million for paid political ads in the last election cycle, which makes the $11 million or so the National Association of Broadcasters and five media outlets cumulatively spent between 1996 and 1998 to defeat campaign finance reform look like a prudent investment. Note, by the way, that John McCain, the heroic white knight of campaign finance reform, who raises more money from the media companies than even Tauzin, was crucial to the media companies' successful effort to kill the FCC's plan to force a lowering of the cost of political commercials, the primary culprit driving the vicious election/money cycle.

    With Michael Powell as George Bush's new appointee to head the FCC, the networks might not even have to bother lobbying Congress anymore. Powell signaled his own expansive definition of conflict of interest when he refused to recuse himself from the vote approving the merger of AOL and Time Warner, despite the fact that his father, Colin Powell, stood to make millions from the stock he received as a company director. (I don't suppose he opposes the repeal of the estate tax, either.)

    "We don't look to the government to correct the press. We look to the people," explained ABC News president David Westin to Tauzin's committee. "If we fail, the audience will judge us and move somewhere else." I'm thinking France.

    Eric Alterman

  • Hate Versus Death

    Almost every week, it seems, we get to read about some state execution, performed or imminent, wreathed in the usual toxic fog of race or sex prejudice, or incompetency of counsel, or prosecutorial misconduct.

    Take the recent execution in Ashcroft country, February 7, of Stanley Lingar, done in the Potosi Correctional Center in Missouri, for killing 16-year-old Thomas Allen back in 1985. In the penalty phase of Lingar's trial, prosecutor Richard Callahan, who may now be headed for the seat on the Missouri State Supreme Court recently vacated by his mother-in-law, argued for death, citing Lingar's homosexuality to the jury as the crucial factor that should tilt poison into the guilty man's veins. Governor Bob Holden turned down a clemency appeal and told the press he'd "lost no sleep" over signing off on Lingar's fate.

    Is there any hope that the ample list of innocent people either lost to the executioners or saved at the eleventh hour will prompt a national moratorium such as is being sought by Senator Russell Feingold of Wisconsin?

    A year ago it seemed possible. On January 31, 2000, Illinois Governor George Ryan suspended imposition of the death penalty in his state on the grounds that he could not support a system "which, in its administration, has proven so fraught with error."

    By June a Field Poll reported the sensational finding that in the state with the most crowded death row in the nation, Californians by nearly 4 to 1 favored stopping state executions to study how the death penalty was being applied. The Field Poll respondents were told about wrong convictions, also about appeals to Governor Gray Davis by religious leaders for a moratorium. A poll at the end of last year, in which California respondents were not offered this framework, put support for a moratorium at 42 percent, just behind those opposed to any such move. A national poll last fall found 53 percent for a moratorium.

    The discrepancy in the California polls actually affords comfort to abolitionists, since it shows that when respondents are told about innocent people saved from lethal injection, often at the last moment, support for a moratorium soars. It's a matter of public education.

    But where are the educators? Many eligible political leaders have fled the field of battle, convinced that opposition to the death penalty is a sure-fire vote loser. In the second presidential debate last fall Al Gore wagged his head in agreement when George W. Bush declared his faith in executions as a deterrent.

    A few years ago Hillary Clinton spoke of her private colloquies with the shade of Eleanor Roosevelt. Their conversations left La Clinton unpersuaded, since she stands square for death, as does New York's senior senator, Charles Schumer.

    Indeed, the death penalty is no longer a gut issue, or even a necessary stand, for those, like Schumer, who are associated with the Democratic Party's liberal wing. On February 12 the New York Post quoted Kerry Kennedy Cuomo, long known as a leading death-penalty opponent, as saying that "it would be futile" to try to repeal capital punishment in New York.

    Mrs. Cuomo, daughter of Robert F. Kennedy, told the Post that she believes her husband, Andrew, a contender for the Democratic nomination for governor, shares her views. "To tell you the truth, on the death penalty, it's not as big an issue in the state as it was a few years ago." Mrs. Cuomo's father-in-law, Mario, repeatedly vetoed death-penalty measures during his years as governor.

    In line with Kerry Kennedy Cuomo's spineless stance, many liberal or what are now cautiously called "human rights" groups have also found it politic to sideline capital punishment as an issue. No better illustration is available than the recent tussle over John Ashcroft's nomination as Attorney General. Scores of groups flailed at him on choice, racism and hate crimes, but not on the most racist application of hate in the arsenal of state power: the death penalty.

    Return for a moment to the fight to save Lingar's life. Privacy Rights Education Project, the statewide Missouri gay lobby group, endorsed Holden in his gubernatorial race. PREP, however, was quite muted on Lingar's fate, taking little action except to send a letter to the governor the day before the execution. Another gay organization, the Gay and Lesbian Alliance Against Defamation, the folks who want to shut down Dr. Laura, is a national group but happens to have an office in Kansas City, Missouri. Surely what prosecutor Callahan did to Stanley Lingar is well beyond defamation. Where was the Gay and Lesbian Alliance on this case? Not a peep from them. Noisy on hate crimes but silent on the death penalty is the Human Rights Campaign, the nation's largest gay-advocacy group.

    The issue of capital punishment is drawing much more attention these days. Just when help could really make a difference, where are all these (ostensibly) liberal and progressive groups? The Anti-Defamation League (all right, strike the word "ostensible"), whose national director, Abraham Foxman, pulled down $389,000 in 1999, was busy writing letters for Marc Rich. The death penalty? The ADL endorsed Bill Clinton's appalling Antiterrorism and Effective Death Penalty Act of 1996.

    The impetus given by Ryan last year could fall apart. Governor Ryan himself faces difficult re-election prospects in 2002, and a successor could rescind the moratorium. Liberals should abandon their absurd and dangerous obsession with hate crimes and muster against this most hateful excrescence on the justice system. Let them take encouragement from the district attorney of San Francisco, Terrence Hallinan, who told a San Francisco court on February 6 that he would not participate in the capital sentencing of one Robert Massey since "the death penalty does not constitute any more of a deterrent than life without parole" and, among other evils, "discriminates racially and financially, being visited mainly on racial minorities and the poor.... It forfeits the stature and respect to which our state is entitled by reducing us to a primitive code of retribution."

    Alexander Cockburn

  • A Democratic Lullaby to Bill Clinton

    As Bush finds backs to pat and hands to shake,
    The Democrats can't seem to buy a break.
    The opposition doesn't coalesce,
    Because the spotlight's on the Clinton mess,
    A mess that's just like catnip to the press.
    Afraid that he will never go away,
    The Democrats by now just want to say,

    Avoid the headlines, can't you, Bill?
    Speak softly, please, not louder.
    Eschew the networks, can't you, Bill?
    Enough, man! Take a powder!

    Ignored as long as he is on the stage,
    The Democrats, befuddled, try to gauge
    How he, amidst the sleaze, seems so unfazed
    While they are crazed, and find themselves amazed
    At all the oxygen the man inhales,
    As he on his sword himself impales.

    Avoid the headlines, can't you, Bill?
    They say. At any cost!
    Eschew the networks, can't you, Bill?
    Could you please just get lost?

    Calvin Trillin

  • The Court Did What Helms Couldn’t: Trashed the ADA

    The rogues in robes are on the move. US Supreme Court Chief Justice William H. Rehnquist, the leader of the pack, and the rest of the Court's right-wing majority have launched a judicial revolution that usurps the power of Congress as it applies to civil rights law.

    In its latest decision, the Court last week overruled Congress and held that state governments can arbitrarily deny jobs to disabled people without violating the equal protection clause of the Constitution. The Court's decision to gut the Americans with Disabilities Act, passed in 1990 by a huge bipartisan majority in Congress and signed into law by former President Bush, is yet another assault on representative democracy. Not content with short-circuiting the presidential election, the Court has now decided that it, and not Congress, shall make the laws.

    When passed more than a decade ago, the only significant opposition to the ADA came from a band of ultra-rightists led by Jesse Helms (R-N.C.), who mustered just seven other ultra-conservative Senate votes. Unfortunately, the Senate's far right is now well-represented on the Court, and the Justices did what Helms failed to do.

    The ADA was the most significant civil rights legislation in decades, allowing the tens of millions of Americans with disabilities access to jobs, schools and buildings. The legislation was inspired by those, such as wounded war hero Sen. Bob Dole (R-Kan.), who believed that the barriers to the full participation of the disabled in our public life were a clear violation of their civil rights.

    In the recent case before the court, a registered nurse at the University of Alabama hospital was demoted upon returning to work after breast cancer treatment. Rehnquist, in helping to overrule a federal Court of Appeals decision that the ADA prohibited such discrimination, put cost accounting above the right to access when he wrote that it would be "entirely rational and therefore constitutional for a state employer to conserve scarce financial resources by hiring employees who are able to use existing facilities."

    Don't be surprised if the Court next rules that it is not necessary to provide ramps or other facilities to wheelchair-bound people seeking access to public buildings, or Braille numbers to aid the blind in elevators. The Court has already gutted barriers to age discrimination in employment.

    What is at issue is the interpretation of the Fourteenth Amendment to the Constitution extending the protection of universal civil rights to all Americans regardless of the state in which they reside. The Fourteenth Amendment--originally addressing the issue of racial discrimination--explicitly empowers Congress to pass "appropriate legislation" needed to guarantee equal protection of the law for all.

    In striking down key provisions of the ADA, Rehnquist dealt a body blow to the separation of powers, which grants to Congress sole authority to pass federal law. Rehnquist said that the law in question was ill-conceived because he didn't agree with Congress's evaluation of evidence on the subject, saying it was based on "unexamined, anecdotal accounts" that did not qualify as "legislative findings."

    Given that the ADA was one of the more carefully researched pieces of legislation ever passed by Congress, there's no reason to believe that the Court won't throw out any law it doesn't agree with. As Justice Stephen G. Breyer pointed out in his dissent, the ADA had been the subject of a dozen Congressional hearings. Breyer attached a thirty-nine-page list prepared for Congress of state-by-state examples of official acts of discrimination against the disabled. This is not enough? And, anyway, when it suits its political purpose, the Court's right-wing majority is quick to rule the opposite, insisting that Congress, not the courts, has sole power to craft federal law.

    Just this past weekend, Justice Antonin Scalia claimed in a speech that those who believe that the Constitution is an evolving or living document want to use judicial interpretation to make law. He taunted: "You want a right to abortion? Pass a law. That's flexibility."

    Yet when Congress did pass a law extending the scope of civil rights protection to the disabled, Scalia didn't like it. He joined the 5-4 opinion overturning it.

    Just a couple of questions: If the life of every fetus is sacred, why would we want to deny that fetus, if born with disabling birth defects, full civil rights? If one is denied a state job solely because he or she must use a wheelchair, is that not a clear violation of the equal protection of the laws called for in the Fourteenth Amendment?

    Hypocritically, the same five Justices who struck down the ADA as a violation of states' rights were all too willing to toss out the issue of states' rights on the Florida election count. The lesson then and now is that the current majority of the US Supreme Court seeks to usurp the power of the states and the Congress when--and only when--it suits its fiercely held ideological agenda. Sadly, civil rights are the prime target of that agenda.

    Robert Scheer

  • Books and the Arts

    Race: The Continental Divide

    The first moments of a recent documentary about Students for a Democratic Society (SDS), Rebels With a Cause, recall one of the signal images of the 1960s civil rights struggle: police training torrents of water from fire hoses on demonstrators in Birmingham, Alabama. According to its makers, the student New Left and the antiwar movement derived their principal inspiration from the struggle for black freedom.

    In this phase of the movement blacks and their allies sought three rights: integrated public schools; desegregation of public accommodations such as trains and buses, restrooms and water fountains, restaurants and, in much of the South, the right to walk down a street unmolested; and perhaps most important, voting rights. Nearly forty years later the Birmingham confrontation reminds us not only of the violence of Southern resistance to these elementary components of black freedom but also how clear-cut the issues were. The simple justice embodied in these demands had lurked on the margins of political life since the 1870s betrayal of black Reconstruction by politicians and their masters, Northern industrialists. Not that proponents of black freedom were quiescent in the interim. But despite some victories and defeats, mainly in the fight against lynching and legal frame-ups, these demands remained controversial and were largely sidelined. In 1940 Jim Crow was alive and well in the South and many other regions of America.

    World War II and its aftermath changed all that. Under threat of a March on Washington by A. Philip Randolph of the Sleeping Car Porters union and other black leaders, in 1941 President Roosevelt issued an executive order banning discrimination in military-industry hiring. But when millions of black veterans returned from the war and found that America had returned to business as usual--even as the United States was embroiled in a cold war, claiming to be at the forefront of freedom and democracy--pressure mounted for a massive assault on discrimination and segregation. Shrewdly looking forward to an uphill re-election battle, President Truman ordered the desegregation of the armed forces, and the turbulent 1948 Democratic convention passed the strongest civil rights plank of any party since the Radical Reconstruction laws of 1866. Consequently, the party suffered the first of what became a long line of defecting Southern political leaders when Strom Thurmond bolted and ran as the Dixiecrat Party candidate for President. In the early 1950s the NAACP mounted a series of legal cases against school segregation that culminated in the 1954 Supreme Court decision Brown v. Board of Education, sustaining the plaintiffs' claim that the historic Court doctrine of "separate but equal" was untenable. The Court agreed that school integration was the only way to guarantee equal education to black children. The next year an NAACP activist, Rosa Parks, refused to move to the back of a Montgomery, Alabama, bus--a disobedience that sparked a monumental boycott that finally ended in victory for the black community. In 1962, flanked by federal troops, James Meredith entered the University of Mississippi, and, under similar circumstances, a black woman named Autherine Lucy broke the color bar at the University of Alabama.

    As important as these breakthroughs were, they were regarded by some as only the first stage of what was considered the most important phase in the struggle--achieving black voting rights as a prelude to overturning white-supremacist economic and political domination of the South. Many on both sides of the civil rights divide were convinced that the transformation of the South by ending black exclusion was the key to changing US politics. The struggle for voting rights proved as bloody as it was controversial, for it threatened to reduce the Democratic Party to a permanent minority. In summer 1964, in the midst of a major effort by civil rights organizations to enroll thousands of new black voters, three field-workers were murdered and many others were beaten, jailed and in some cases forced to run for their lives. Fearing further losses in the less-than-solid South, the Kennedy Administration had to be dragged kicking, if not screaming, to protect field-workers in various civil rights groups. Indeed, it can be argued that beginning with the 1964 Republican presidential candidate Barry Goldwater, and perfected to an art form by Richard Nixon, the right's infamous Southern Strategy was the key to Republican/conservative domination of national politics in subsequent decades. But driven by the exigencies of the Vietnam War as well as the wager that the Democrats could successfully cut their losses by winning the solid backing of millions of black and Latino voters, Lyndon Johnson's Administration rapidly pushed the Civil Rights and Voting Rights acts through Congress. By 1965 a century of legally sanctioned black subordination apparently came to an end. The Democrats won black loyalty, which to this day remains the irreducible condition of their ability to contend for national political power.


    Today the predominant commentary on the state of race relations in the United States no longer focuses on discrimination but on the consequences of the economic and cultural chasm that separates black and white: white flight from the cities; the formation of a black middle class, which in pursuit of a better life has tended to produce its own segregated suburban communities; and the persistence of black poverty. For after more than three decades of "rights," the economic and cultural reality is that the separation between black and white has resurged with a vengeance. Housing segregation is perhaps more profound than in the pre-civil rights era, and the tiering of the occupational structure still condemns a large percentage of blacks and Latinos to the bottom layer. What's more, the combination of the two has widened the educational gap that pro-integration advocates had hoped would by now be consigned to historical memory. In the wake of achieving voting rights and passing antidiscrimination laws, some now worry more about whether blacks will maintain access to the elite circles of American society, especially in education and the economy. Others have discovered that, despite their confidence that legal rights are secure, the stigma of race remains the unmeltable condition of the black social and economic situation.

    But even voting rights are in jeopardy. What are we to make of the spectacle of mob intimidation of election officials and the brazen theft of thousands of black Florida votes during the 2000 presidential election--so egregious it led the NAACP to file suit? Or the ideological and political mobilization of a conservative Supreme Court majority to halt the ballot recount and preserve George W. Bush's razor-thin lead? Or the refusal of a single member of the Senate--including its forty-eight white Democrats--to join mainly black House members in requesting an investigation of the Florida voting events? Vincent Bugliosi has persuasively shown that Gore's lawyer did not use the best arguments at the Court [see "None Dare Call It Treason," February 5]. And then there's the Congressional battle over George W. Bush's nomination of John Ashcroft as Attorney General, during which Ted Kennedy concluded that he could not muster the votes to sustain a filibuster and had to be content with joining forty-one of his colleagues in a largely symbolic protest vote against confirmation. Ashcroft's nomination was perhaps the Bush presidency's most blatant reminder that the selection of Condoleezza Rice and Colin Powell for leading foreign policy positions should not be understood as having any relevance to the Administration's stance on domestic racial politics. Contrary to progressives' expectation that the Democratic Party would constitute a genuine opposition to what has become a fairly strident right-wing government, the events since November 7 have amply demonstrated that the Democrats know their place.

    These events--calumnies, really--cast suspicion on the easy assumption that the civil rights struggle, even in its legal aspect, largely ended in 1965. The deliberate denial of black suffrage in Florida was a statement by the Republican minority that it would not countenance the Gore campaign's bold maneuver, in tandem with the AFL-CIO and the NAACP, to register and turn out tens of thousands of new black and non-Cuban Latino voters. As if to acknowledge his indiscretion, after it became clear that he "wuz robbed," Gore ordered his supporters not to take to the streets; instead, he contented himself with a weak and ultimately unsuccessful series of legal moves to undo the theft. Thus even if their hopes for victory always depended on blacks showing up at the polls in the battleground states, including Florida, Gore and the Democratic Party proved unwilling to fight fire with fire. In the end the right believes it has a royal right to political power because it is grounded in the power of Big Capital and the legacy of racism; the Democrats believe, in their hearts, that they are somehow illegitimate because of their dependence on blacks and organized labor.


    Yet inspired by the notion that the United States is a nation of laws, some writers have interpreted the 1960s judicial and legislative gains to mean that the basis has been laid for full equality of opportunity for blacks, Latinos and other oppressed groups. (I speak here of those like Ward Connerly, Glenn Loury and Shelby Steele.) They cite affirmative action as evidence that blacks may be able to attain places in the commanding heights of corporate office, if not power. But they forget that Nixon utilized affirmative action to derail efforts, inspired by the 1960s civil rights victories, to dramatically expand funding for education and other public goods. In the context of Nixon's abrogation of the Bretton Woods agreement, which destabilized world currencies, and the restructuring of the economy through the flight of manufacturing to Third World countries and capital flight from those same countries, affirmative action was the fig leaf covering the downsizing of the welfare state. As the political winds blew rightward, welfare "reform" became a mantra for Nixon's successors, from Jimmy Carter and Ronald Reagan through Bill Clinton. Needless to say, providing spaces for black students in a handful of elite colleges and universities was not, in itself, a bad idea. That affirmative action accompanied the Reagan tax cuts, ballooning military spending, the gradual end of an income floor for the unemployed and severe spending cuts in housing, public health and education demonstrated its purpose: to expand the black professional and managerial class in the wake of deindustrialization--which left millions of industrial workers, many of them black and Latino, stranded--and the hollowing out of the welfare state, which in contrast to the 1960s gains widened the gap between rich and poor.

    While only the most naïve believed that in time racism would disappear, many have misread William Julius Wilson's 1978 book The Declining Significance of Race to argue that black inequality is not a sign of the persistence of institutional or structural racism but merely a remnant. On the contrary, Wilson claims that the persistence of black poverty may not be a sign of discrimination in the old sense but an indication of the vulnerability of blacks to structural changes in the economy. Race still frames the fate of many blacks, but there is also a crucial class dimension to their social situation: Even though blacks have acquired comprehensive legal rights, the deindustrialization of many large cities and the flight of both the black and the white middle class have left working-class blacks in the lurch. Once, many had well-paid union jobs in steel mills, auto plants and other production industries. But since the mid-1970s, most urban and rural areas where they live have become bereft of good jobs and local services. There are not enough jobs to go around, and those that can be found are McJobs--service employment at or near the minimum wage. Cities also lack the tax base to provide citizens with decent education, recreation, housing and health facilities, and basic environmental standards like clean air and water. Lacking skills, many blacks are stuck in segregated ghettos without decent jobs, viable schools, hope or options.

    To make things worse, the neoliberal policies of both Democratic and Republican governments have drastically slowed public-sector job growth and have eliminated one of the main sources of economic stability for blacks and Latinos. In fact, during his campaign Al Gore boasted that he was an architect of the Clinton Administration's program of streamlining the federal government through layoffs and attrition, which was undertaken to facilitate "paying down the debt." The private sector was expected to pick up the slack, and it did: In many cases laid-off public employees joined former industrial workers in the retail trades and in low-paid construction and nonunion factory jobs.

    But the problems associated with race/class fail to detain some writers--e.g., John McWhorter and Stephen Carter. Having stipulated black economic progress without examining the claim in any depth, these writers concern themselves chiefly with how to assure that blacks obtain places in the elite intellectual and managerial professions. As if to stress the urgency of this problem, they adduce evidence to show that despite the broad application of affirmative action, blacks have fallen back in educational achievement. For example, in Losing the Race: Self-Sabotage in Black America, his book about why black students still lag behind whites in educational attainment, linguistics professor McWhorter celebrates the civil rights movement's accomplishments but discounts the salience of racism. McWhorter allows that some police brutality and discriminatory practices such as racial profiling still exist, but holds that these regrettable throwbacks are being eradicated. Accepting the federal government's definition of poverty (now $17,650 a year or less for a household of four), McWhorter minimizes these problems by citing statistics showing that under 25 percent of America's black population lives in impoverished conditions, a decline from 55 percent in 1960. Of course, these federal standards have been severely criticized by many economists and social activists for substantially understating the amount that people actually need to live on in some of America's major cities.

    While $17,650 may constitute a realistic minimum standard in some rural areas and small towns--and even then there is considerable dispute about this figure--anyone living in New York, Chicago, Los Angeles, Philadelphia, Detroit, Oakland, Washington or Atlanta--cities with large black, Latino and Asian populations--knows that this income is far below what people need to live on. The United Way's regional survey of living standards found that households of three in New York City require at least $44,208 to meet minimum decent standards. While the amounts are less for most other cities, they do not run below $25,000, and most of the time they are higher. In many of these cities two-bedroom apartments without utilities cannot be rented for less than $800 a month (plus at least $100 for telephone, gas and electricity)--more than half the take-home pay of a household earning $25,000. In New York City the rent is closer to $1,200, even in the far reaches of the Bronx and Brooklyn. But perhaps half of black households earn less than the rock-bottom comfort level in most areas today.

    None of this bothers McWhorter and other black centrists and conservatives. They know that when measured by grades and standardized tests, educational attainment among children of poor black families may fall short because of poverty, insecurity or broken homes. Why, they ask, do black students in the middle class perform consistently worse by every measure than comparable white students, even when, owing mainly to affirmative action, they gain admission to elite colleges and universities? Relying heavily on anecdotes drawn from his own teaching experience at the University of California, Berkeley, McWhorter advances the thesis that, across the class system, blacks are afflicted by three cultural barriers to higher educational achievement that are of their own making: victimology, according to which all blacks suffer from racism that stands between them and success; separatism, a congenital suspicion of whites and of white norms, including high educational achievement; and, perhaps most salient, endemic anti-intellectualism among blacks, which regards those who study hard and care about learning as "nerds" who tend to be held in contempt. To be "cool" is to slide by rather than do well. The point of higher education is to obtain a credential that qualifies one for a good job, not to take academic learning seriously.

    McWhorter is deeply embarrassed by the studied indifference of many of his black students (almost none of whom are economically disadvantaged) to intellectual pursuits and is equally disturbed by their low grades and inferior test scores. In the end, he attributes much of the problem to affirmative action--a "necessary step" but one that he believes ultimately degrades blacks in the eyes of white society. McWhorter admits that he is troubled that he got tenure in four and a half years instead of the usual seven, and that he was privileged in part because he is black. Like another "affirmative action baby," Stephen Carter, he is grateful for the boost but believes the door should be closed behind him because the policy has morally objectionable consequences, especially cynicism, examples of which litter the pages of his book. Affirmative action "inherently divests blacks and Latinos of the unalloyed sense of personal, individual responsibility for their accomplishments.... The fact that they tend not to be aware of this follows naturally from the fact that affirmative action bars it from their lives: You don't miss what you never had." Instead, he wants blacks "to spread their wings and compete" with others on an equal basis. While condemning the thesis of Richard Herrnstein and Charles Murray (in The Bell Curve) that affirmative action should be eliminated because blacks "are too dumb," he urges his readers to "combat anti-intellectualism" and victimology as cultural norms.

    Before discussing what I regard as the serious flaws in McWhorter's thesis, it is important to note that it is by no means unique; a growing number of black as well as white pro-civil rights intellectuals embrace it. Having asserted that the legislative and court victories effectively buried the "external" barriers to black advancement, these critics seek answers to the persistence of racial educational disparity within the black community itself--particularly the "culture" of victimization, a culture that, however anachronistic from an objective standpoint, remains a powerful force. Unlike many black conservatives, McWhorter professes admiration for the icons of the civil rights struggle, especially Martin Luther King Jr. and Adam Clayton Powell Jr. But he is not alone in his uncritical assessment of the past half-century as an era of black "progress." As a result, he discounts the claim that racism remains a structural barrier to blacks' further advances. Nor is there justification for the politics and culture of separatism; for McWhorter, blacks are shooting themselves in the foot because these values prevent them from overcoming the psychological and cultural legacy of slavery, the root cause of the anti-intellectualism that thwarts black students from performing well in an academic environment. This culture, he argues, is reproduced as well in the black family when parents refuse to take their children's low or failing grades as an occasion for scolding or punishment and, in the extreme, in peer pressure against the few kids who, as early as elementary school, are good scholars. Academically bright black kids are ridiculed and charged with not being loyal to "the race."

    We have seen this pattern played out in a somewhat different mode among working-class white high school students. In Learning to Labor, his study of male working-class students in an English comprehensive high school--the term that corresponds to our public schools--Paul Willis describes how the "lads" rebel against the curriculum and other forms of school authority and thereby prepare themselves for "working-class jobs" in a nearby car factory. We can see in this and many other school experiences the same phenomena that McWhorter ascribes to racial culture. To study and achieve high enough grades to obtain a place in the university is to betray the class and its traditions. One risks isolation from his comrades for daring to be good. But it is not merely anti-intellectualism that produces such behavior; it is a specific form of solidarity that violates the prevailing norm that the key to economic success for the individual working-class kid is educational attainment.

    But McWhorter, a certified good student who, by his own account, suffered many of these indignities at the hands of classmates, cannot grasp this. Since he sees no external barriers, anyone who refuses to succeed when he or she has been afforded the opportunity must be the prisoner of a retrograde culture. And since he defines anti-intellectualism in terms of comparative school performance in relation to whites--there is no exploration of the concept of intellectual endeavor as such--he must avoid entering the territory of rampant American anti-intellectualism. It may come as a shock to some, but it can be plausibly claimed that we--whites, blacks and everyone else--live in an anti-intellectual culture. Those who entertain ideas with no apparent practical utility, read books no course requires, play or listen to classical music or attend art museums are sometimes labeled "Mr. or Miss deadbrains." Moreover, as many have observed, the public intellectual--a person of ideas whose task is to hold a mirror to society and criticize it--is an endangered species in this country.

    McWhorter pays no attention to the loss of intellectual life in this sense. His idea of intellectualism is framed entirely by conventional technical intelligence and by a sense of shame that blacks are coded as intellectually inferior. There is little defense of the life of the mind in Losing the Race except as a valuable career tool. So McWhorter cares about educational attainment as a matter of race pride and climbing the ladder. He cannot imagine that working-class and middle-class whites are routinely discouraged from spending their time in "useless" intellectual pursuits or that the disparity between black, Latino, Asian and white school performance can be attributed to anything other than internal cultural deficits.


    Scott Malcomson and Thomas Holt have cast different lights on race disparity in the post-civil rights era. Departing, but only implicitly, from Tocqueville's delineation of the three races of American society--Indian, black and white--Malcomson, a freelance writer and journalist, offers a long, sometimes rambling history, fascinating autobiography and social analysis of the career of race in America. The main historical theme is that racial separatism is interwoven with virtually every aspect of American life, from the sixteenth-century European explorations and conquests that led to the brutal massacres of Native Americans, on through slavery and postemancipation race relations. Malcomson's story is nothing less than a chronicle of the defeat of the ideal of one race and one nation. Black, Native American and white remain, throughout the centuries, races apart. Neither civil war, social movements nor legislative change has succeeded in overcoming the fundamental pattern of white domination and the racializing of others. Nor have territorial and economic expansion allayed separatism and differential, racially burdened access to economic and political power.

    Malcomson, the son of a white Protestant minister and civil rights activist, caps One Drop of Blood with a hundred-page memoir of his childhood and youth in 1960s and '70s Oakland. In these pages, the most original part of the book, Malcomson offers an affecting account of his own experience with race but also renders the history of three major twentieth-century Oakland celebrities--Jack London, William Knowland and Bobby Seale. London was, of course, one of America's leading writers of the first two decades of the past century. He was famous not only for his adventure stories, which captured the imagination of millions of young adults and their parents as well, but also for his radical politics. He was a founder of the Intercollegiate Socialist Society, the student branch of a once-vibrant Socialist Party, which regularly won local offices in California until World War I and spawned the explosive political career of another writer, Upton Sinclair. (In 1934 Sinclair even became the Democratic candidate for governor.) Less well known was London's flaming racism, a sentiment shared by the archconservative Oakland Tribune publisher and US Senator William Knowland.

    Deeply influenced by his parents' anti-racist politics, Malcomson was enthralled by the movement and especially by the Black Panthers, whose presence in Oakland as a political and social force was unusually pervasive--they never achieved the same visibility elsewhere, even at the height of their fame. But the pathos of his involvement is that his main reference group was neither black nor white: Their separation was simply too painful for a young idealist to bear. Instead, he hung out with a group of Asian students who nevertheless emulated black cultural mores. They did well in school but, by this account, nothing much except social life went on there. To satisfy his intellectual appetites, Malcomson sought refuge in extracurricular reading, an experience I shared in my own high school days in the largely Jewish and Italian East Bronx of the 1940s and 1950s.

    The point of the memoir is to provide a contemporary illustration of the red thread that runs through his book: The promise of a century and a half of struggle for black freedom, and especially of the end of racial separation and of racism, remains unfulfilled. Malcomson takes Oakland as a paradigmatic instance of a majority black population gaining political office but little power because the city's white economic elites simply refused to play, until civic disruption forced them (especially Knowland) to acknowledge that their power was threatened by the militancy of new political forces within the black community. With the waning of black power, Oakland, like many other cities, reverted to its ghettos, punctuated in the 1980s and 1990s by a wave of gentrification that threatened the fragile black neighborhoods and, in the wake of high rents, forced many to leave for cheaper quarters. When Malcomson returns home in the late 1990s after a few decades in New York he finds that the experiment of his father and a local black preacher in interracial unity in behalf of community renewal has collapsed, leaving the clergyman with little hope but a firm conviction that since whites were unreliable allies, blacks had no alternative but to address the devastation of black neighborhoods on their own.

    Thomas Holt is a black University of Chicago social and cultural historian whose major work, The Problem of Freedom, is a brilliant, multifaceted account of Jamaican race, labor and politics in the nineteenth and early twentieth centuries. The question Holt asks in his latest book-length essay, The Problem of Race in the Twenty-First Century, is whether W.E.B. Du Bois's comment that the color line is the problem of the twentieth century remains the main question for our century. Holt addresses the issue from a global perspective on two axes: to place race in the context of both the national and global economies, and to adopt a "global" theoretical framework of analysis that situates race historically in terms of the transformation of production regimes from early to late capitalism. In bold but sharp strokes the book outlines three stages: pre-Fordist, Fordist and post-Fordist.

    Fordism was more than a production system based on a continuously moving assembly line where workers performed repetitive tasks. The "ism" in the term signifies that mass production and mass consumption are locked in an ineluctable embrace: If the line makes possible mass production, ways must be found to provide for mass consumption. Fordism therefore entails the formation of a new consumer, by means of higher wages and, above all, the vast expansion of the credit system. Now workers could buy cars, houses and appliances, even send their kids to college on the principle of "buy now, pay later." This practical consideration led to a new conception of modernity.

    The advent of consumer society has changed the face of America and the rest of the capitalist world. Before Fordism, which Holt terms pre-Fordism, the US economy was crucially dependent on slavery. Whatever their egregious moral and ethical features, both the pre-Fordist and Fordist production regimes had blacks at their core. In the slave and post-Reconstruction eras they planted and picked cotton and tobacco; during and after World War II they were recruited into the vast network of industrial plants as unskilled and semiskilled labor. Indeed, they shared in the cornucopia of consumer society, owning late-model cars and single-family homes; in some instances, they were able to send their kids to college. As Holt points out, exploitative as these relationships were, blacks occupied central places in American economic life.

    The 1970s witnessed the restructuring of world capitalism: The victories of the labor movement prompted capital to seek cheaper labor abroad; cities like Detroit, Cleveland, New York, Oakland and Chicago, where black workers constituted a significant proportion of the manufacturing labor force, were rapidly deindustrialized. In their place arose not only retail establishments but "new economy" computer-mediated industries like hardware and software production, dot-coms and financial services.

    But the reindustrialization of the 1980s and 1990s occurred without the participation of blacks. In industries marked by the old technologies, hundreds of thousands of immigrants made garments, toys and other consumer goods. Lacking capital, African-Americans could not own the retail businesses in their own communities; these are largely owned by immigrants--Koreans, Indians, Caribbean and African merchants. Holt argues that American blacks are now largely excluded from the global economy; they occupy economic niches that are no longer at the center of production, positions that augur badly for the future of race in America. Even the growing black middle class is located in the public sector, which has been under severe attack since the 1980s. Moreover, Holt plainly rejects the judgment of "black progress" that leads to culturalist explanations for the growing economic and social disparity between blacks and whites. He finds:

    overwhelming contemporary evidence that racism permeates every institution, every pore of everyday life. Justice in our courts, earnings on our jobs, whether we have a job at all, the quality of our life, the means and timing of our death--all form the stacked deck every child born black must take up to play the game of life.


    Holt's essay ends with the faint hope that racial stigma and segregation will one day be overcome. For now, he insists, race remains, for all African-Americans--not only those suffering the deprivations of ghetto life--the problem of the twenty-first century. But neither Malcomson nor Holt can find sources of resistance. In fact, Holt explicitly gives up on the labor movement--a prime mover within the Fordist regime. It has been severely weakened by post-Fordist globalization. Nor does he identify forces within the African-American movements capable of leading a fight. As other sharp critics of racism such as Paul Gilroy point out, despair overwhelms hope.

    The concept that judicial and legislative prohibitions against discrimination are sufficient to erase the legacy of four centuries of social and economic oppression is deeply embedded in the American imagination, always alert to the quick fix. From this view follows the inevitable conclusion that if income and social disparities stubbornly refuse to go away, something must be wrong with the victim. Thus the tricky and misleading term "culture" as explanation, which segues into proposals that blacks "pull up their socks" and reach for the main chance. But those who are passionate in their insistence that the elimination of the structurally induced racial divide will require a monumental struggle have hesitated before the gateway of class politics.

    For those like Holt and Malcomson, who are less concerned with whether blacks enter corporate boardrooms than with raising the bottom, there is little alternative to calling on the power of the labor movement to join the fray. It might be argued that organized labor, still dominated by a relatively conservative white leadership, has shown little inclination to mount a fierce defense of black interests. But as unions lose their traditional white, blue-collar base, the labor movement is becoming more black and Latino, and certainly more female. And as its constituency is transformed, there are signs that the top leadership is learning a few lessons. The AFL-CIO's 2000 call for amnesty for undocumented immigrants was a move without historical precedent. And its attempt to organize low-wage workers is a reversal of past practice. If race remains a central problem of the new century, the way forward is probably to re-establish the race/class alliance that fell on hard times in the 1960s and 1970s. Without it, black freedom is confined to a cry in the darkness.

    Stanley Aronowitz

  • Murdoch’s Fox News

    They distort. They decide.

    Daphne Eviatar

  • Little Elegy in G Minor

    A box of Chopin nocturnes handed down
    from the other side of my mother's death--
    evening gowns in trash bags making a little
    Golgotha of their own right in the corner
    of that studio we had spent all morning
    emptying out--uncandled cold chaperoned
    through the sill. Lullabies all of us had
    already heard while drinks kept going round
    the parlor after her wake assembled now
    into makeshift history--bits of tenderness
    discarded down the cosmos slide, each night
    a phantom limb, the hours trapezing over
    that sea of anonymous faces where sidereal
    glances scale up the piano's mirrored lid.

    Timothy Liu

  • Liberation Musicology

    The recording industry has been celebrating the supposed defeat of Napster. The Court of Appeals for the Ninth Circuit has affirmed the grant of a preliminary injunction that may well have the effect of closing the service down completely and ending the commercial existence of Napster's parent (that is, unless the record companies agree to an implausible deal Napster has proposed). But despite appearances, what has happened, far from being a victory, is the beginning of the industry's end. Even for those who have no particular stake in the sharing of music on the web, there's value in understanding why the "victory" over Napster is actually a profound and irreversible calamity for the record companies. What is now happening to music will soon be happening to many other forms of "content" in the information society. The Napster case has much to teach us about the collapse of publishers generally, and about the liberative possibilities of the decay of the cultural oligopolies that dominated the second half of the twentieth century.

    The shuttering of Napster will not achieve the music industry's goals because the technology of music-sharing no longer requires the centralized registry of music offered for sharing among the network's listeners that Napster provided. Freely available software called OpenNap allows any computer in the world to perform the task of facilitating sharing; it is already widely used. Napster itself--as it kept pointing out to increasingly unsympathetic courts--maintained no inventory of music: It simply allowed listeners to find out what other listeners were offering to share. Almost all the various sharing programs in existence can switch from official Napster to other sharing facilitators with a single click. And when they move, the music moves with them. Now, in the publicity barrage surrounding the decision, 60 million Napster users will find out about OpenNap, which cannot be sued or prohibited because, as free software, no one controls its distribution and any lawsuits would have to be brought against all its users worldwide. Suddenly, instead of a problem posed by one commercial entity that can be closed down or acquired, the industry will be facing the same technical threat, with no one to sue but its own customers. No business can survive by suing or harassing its own market.

    The music industry (by which we mean the five companies that supply about 90 percent of the world's popular music) is dying not because of Napster but because of an underlying economic truth. In the world of digital products that can be copied and moved at no cost, traditional distribution structures, which depend on the ownership of the content or of the right to distribute, are fatally inefficient. As John Guare's famous play has drummed into all our minds, everyone in society is divided from everyone else by six degrees of separation. The most efficient distribution system in the world is to let everyone give music to whoever they know would like it. When music has passed through six hands under the current distribution system, it hasn't even reached the store. When it has passed through six hands in a system that doesn't require the distributor to buy the right to pass it along, it has already reached several million listeners.

    This increase in efficiency means that composers, songwriters and performers have everything to gain from making use of the system of unowned or anarchistic distribution, provided that each listener at the end of the chain still knows how to pay the artist and feels under some obligation to do so, or will buy something else--a concert ticket, a T-shirt, a poster--as a result of having received the music for free. Hundreds of potential "business models" remain to be explored once the proprietary distributor has disappeared, no one of which will be perfect for all artistic producers but all of which will be the subject of experiment in decades to come, once the dinosaurs are gone.

    No doubt there will be some immediate pain that will be felt by artists rather than the shareholders of music conglomerates. The greatest of celebrity musicians will do fine under any system, while those who are currently waiting on tables or driving a cab to support themselves have nothing to lose. For the signed recording artists just barely making it, on the other hand, the changes are of legitimate concern. But musicians as a whole stand to gain far more than they lose. Their wholesale defection from the existing distribution system is about to begin, leaving the music industry--like manuscript illuminators, piano-roll manufacturers and letterpress printers--a quaint and diminutive relic of a passé economy.

    The industry's giants won't disappear overnight, or perhaps at all. But because their role as owner-distributors makes no economic sense, they will have to become suppliers of services in the production and promotion of music. Advertising agencies, production services consultants, packagers--they will be anything but owners of the music they market to the world.

    What is most important about this phenomenon is that it applies to everything that can be distributed as a stream of digital bits by the simple human mechanism of passing it along. The result will be more music, poetry, photography and journalism available to a far wider audience. Artists will see a whole new world of readers, listeners and viewers; though each audience member will be paying less, the artist won't have to take the small end of a split determined by the distribution oligarchs who have cheated and swindled artists ever since Edison. For those who worry about the cultural, economic and political power of the global media companies, the dreamed-of revolution is at hand. The industry may right now be making a joyful noise unto the Lord, but it is we, not they, who are about to enter the promised land.

    Eben Moglen

  • Salter’s Flight Path

    We have many male authors known for loving women, fewer known for loving men. Love that is not overtly homoerotic--resolutely heterosexual, in fact--can take on an intimacy and purity untroubled by sex, even if still troubling for its intensity, its incoherence and frequent confusion, violence or aggression, its exclusionary quality. And so it is no surprise to find it so often in novels of war, or the military. In the last half-century American practitioners of this form have included Heller and Mailer, Ward Just, Tim O'Brien and, perhaps most overlooked, James Salter, who is interested not only in the camaraderie among men in uniform but also in its inverse, the case of the solitary, perhaps a newcomer breaking in. Here is his description of the fighter pilot Robert Cassada from his new novel of that name:

    It was his beauty, of course, a beauty that no one saw--they were blind to such a thing.... By beauty, nothing obvious is meant. It was an aspect of the unquenchable, of the martyr, but this quality had its physical accompaniment. His shoulders were luminous, his body male but not hard, his hair disobedient. Few of them had seen him naked, not that he concealed himself or was modest but like some animal come to drink he was solitary and unboisterous. He was intelligent but not cerebral and could be worshipful, as in the case of airplanes.

    "They" and "them" are his colleagues, the men of a fighter squadron stationed in Germany in the 1950s. The men--Dunning, Isbell, Wickenden, Godchaux, Phipps, Dumfries, Ferguson, Harlan, Grace--lead restless, incurious, exalted lives, flying every day in the skies above Western Europe, waiting for the conflict that never comes. So conflict comes from within and among the men, who are arrogant, competitive, bored, cussedly suspicious yet trusting, too. They are not alike. Major Dunning is a Southerner and former college football star; Harlan is a rustic, an overgrown farm boy. Wickenden, or "Wick the prick," Cassada's nemesis, was "born in the wrong century. The cavalry was what he was made for, riding in the dust of the Mexican border with cracked lips and a line edged into his hair from the strap of a campaign hat." Cassada is from Puerto Rico, which leads Harlan to wonder what he's doing in the US Air Force. "Puerto Rico's part of the United States," replies Godchaux.

    "Since when?"

    "I don't know. A long time."

    "I must of missed hearing about it."

    The banter may not recall Catch-22--while sharp, it is seldom witty--but Salter's particular genius is for the inexpressive man. He saves his tenderest regard for Cassada, about whom there is "an elegance...a superiority. You did not find it often." It is perhaps his gravest mistake, for to a reader with less invested in the project, Cassada is the least present, most flattened out, of all the men, the one who never steps out of the page, for all that he is said to be so beautiful and unforgettable. Cassada, a revision of an early, out-of-print novel, The Arm of Flesh, should instead be titled Isbell.

    In interviews Salter has dismissed The Arm of Flesh as a "failed book," and he says the same in his preface to Cassada. Admitting that the new venture might be "a mistake," he cites "the appeal of the period, the 1950s, barely a decade after the war; the place, the fighter bases of Europe; and the life itself." Cassada, then--in words that I have seen repeated in every notice--is "the book the other might have been."

    I think in fact it is the same book, although better turned out, thanks to some crucial changes. The Arm of Flesh is a novel in alternating voices--seventeen altogether--some of which are hard to figure out, others appearing only briefly, some just once. Several could be cut entirely, as their narrative distracts from the general thread, which is about the ordeal of two pilots (one whose radio is out) trying to make it home in terrible weather, while interspersed are episodes from lazy days on base and elsewhere. Cassada is told in the third person, but the structure is much the same--if the two books are laid side by side, one sees in Cassada a succession of loose little chapters that more or less correspond to an individual voice's narrative in The Arm of Flesh. Major Clyde is now Dunning. Lieutenant Sisse from the earlier book does not appear at all in the second. In The Arm of Flesh Cassada never speaks with his own voice. In both works his words are reported to us through the perspective of others; and so he is always at least once, often twice, removed from a reader.

    "Something was usually beginning before the last thing ended." This is Isbell, and the words seem to me to be the key to the book. Cassada is a new arrival at the wing in Giebelstadt, but the rivalries, the ennui, the excitements of a life in the air, at speed, have been going on as long as men have been assembled to fight. Salter, who was a pilot himself in Korea--with the advantage, unlike many of the pilots in Cassada, of actually having seen combat--has written of this elsewhere, in his first novel, The Hunters, and in his memoir Burning the Days. A pilot, it seems, becomes obsessed with doing something remarkable, with being remembered and spoken about even after he's gone. "In the end there is a kind of illness," Salter writes in his memoir. "A feeling of inconsequence, even lightness, takes hold. It is, in a way, like the earliest days, the sense of being an outsider. Others are taking one's place, nameless others who can never know how it was." Cassada is driven relentlessly to prove himself; his immediate commander, Wickenden, thinks he has a death wish. Isbell, who grows to love Cassada, acknowledges his own part in stirring him up. "It was true [he] had sometimes opposed him. It had been essential to. It was part of the unfolding." Earlier we have learned of Isbell's mysticism, his sense of his role among the men as "biblical." "It was the task of Moses--he would take them to within sight of what was promised, but no further. To the friezes of heaven, which nobody knew were there."

    In this kind of outfit, Cassada never stands a chance. It is he who is one of the pilots in trouble as they try to reach home. The other is Isbell. The bond between the two is the strangest in the book, yet critical to its success. I don't think Salter has convinced us that it is true. Isbell is decent, perceives Cassada's isolation; pencils himself in to fly with him once, on an early morning training mission over Germany. It is a matchless day, the kind fliers dream of. They hardly speak.

    The earth lay immense and small beneath them, the occasional airfields white as scars. Down across the Rhine. The strings of barges, smaller than stitches. The banks of poplar. Then a city, glistening, struck by the first sun. Stuttgart. The thready streets, the spires, the world laid bare.

    Afterward Isbell's body is "empty," his mind "washed clean." Cassada asks about a city they flew over, Ingolstadt. "It's not as great as it was this morning," Isbell replies.

    "You could say that about everyplace," he commented.
          It was true, Isbell thought, exactly. He felt a desire to reply in kind. It was not often you found anyone who could say things.

    It is worth reprinting Salter's original language from The Arm of Flesh. The speaker is Isbell:

    "The whole world's like that," he said.
          A chance remark that entered my heart. I didn't know what to say. Suddenly he was not what he seemed--as wise as a schoolboy who knows sex--he was entirely different. Yes, I thought. The whole world is. And early we rise to discover the earth. I felt a sudden desire to bequeath him my dreams, to offer them up. All of the searching is only for someone who can understand them.

    This seems to me rather better, nearly perfect, in fact. While terseness can suggest all the things that must remain unspoken in life, a writer striking at the essence of character must occasionally open himself up, like a pilot his engines. Earlier in the same passage, in The Arm of Flesh, we have the measure of Isbell that is stripped from its revision in Cassada--excitable, aroused, ready for risk: the risk of loving a fellow flier: "We stealthy two. Streaming like princes. Breathing like steers," he thinks while aloft. Over Stuttgart:

    Watch out, Stuttgart. Watch out. We're at God's empty window. We can see everything. The thready streets. The spires. It's all apparent. We can stare through the roofs. Right into the first cups of coffee. Your warm secrets, Stuttgart. Your rumpled beds.

    None of this is in Cassada. None of it says much about Cassada, but it says everything about Isbell. The end of the chapter is the same in both books, except for the following sentences from The Arm of Flesh: "He could have told me what he was going to be. I might have believed him." And later on, when the two men's mission has met its tragic end, a lengthy Isbell monologue is sharply cut, in which his obsession with Cassada again comes to the fore: "There is so much I almost told him. I can't understand why I didn't. I was waiting for something, a word that would fall, an unguarded act." Of course, there are no unguarded acts from the embattled Cassada, but more surprising is the sense--retained in Cassada--that the young pilot actually had something to say. Both versions ascribe uniqueness to him, the phrase "the sum of our destinies." Yet Cassada doesn't even pretend to understand Isbell in those moments when their communion is said to be greatest: "You amaze me, Captain.... We're talking about two different things. I don't know. I just don't understand, I guess."

    In Burning the Days--the eponymous chapter of which, thirty pages long, is a true anticipation of the story told in The Arm of Flesh and Cassada--Salter invokes briefly a pilot named Cortada: "He was from Puerto Rico, small, excitable, and supremely confident. Not everyone shared his opinion of his ability--his flight commander was certain he would kill himself."

    And that's it for Cortada. Another cipher, with too much in common with Cassada to be a coincidence. Salter has kept the story of both men to himself, which is why a reader turns more attentively to the lonely and appealing Captain Isbell, standing between the men and Major Dunning. It is no surprise, of course. Failing to attach ourselves to the protagonist for whom the book is named, we look elsewhere, and find our longing met in the author's substitute.

    The end of Cassada is beautiful. It is only four pages. Isbell and his family are leaving Germany; they are on a train along the Rhine, his daughters rambunctious, his wife solicitous, Isbell alone with his thoughts, which include Cassada. For the first time he senses himself as the romantic figure readers have seen all along, joining the ranks of the eternals, "the failed brother, the brilliant alcoholic friend, the rejected lover, the solitary boy who scorned the dance." Isbell is Salter; and one turns to Burning the Days, where the author takes his own solitary farewell to the flying life:

    When I returned to domestic life I kept something to myself, a deep attachment--deeper than anything I had known--to all that had happened. I had come very close to achieving the self that is based on the risking of everything, going where others would not go, giving what they would not give. Later I felt I had not done enough, had been too reliant, too unskilled. I had not done what I set out to do and might have done. I felt contempt for myself, not at first but as time passed, and I ceased talking about those days, as if I had never known them. But it had been a great voyage, the voyage, probably, of my life.

    "I would have given anything, I remember that," Salter adds, remembering the pilot's terror ("none of it mattered"), including separation from his leader. Isbell mouths nearly those words in remembering a Cassada who "stands before him, fair-haired, his small mouth and teeth, young, unbeholden." In Burning the Days Salter recalls a beloved figure from West Point who fell in the war: "He had fallen and in that act been preserved, made untarnishable. He had not married. He had left no one...he represented the flawless and was the first of that category to disappear."

    Reading Salter's memoir, or recollection, as he prefers to call it, one senses that much of his life has been a mourning. The list of the dead is long and unfolds over pages and pages--many are pilots, men Salter flew with--and it becomes easy to see what he hopes is evident from his preface to Cassada: "the fact that it was sometimes the best along with the worst pilots who got killed." All of Salter's novels--including The Hunters, A Sport and a Pastime and Light Years--are beautiful elegies in which a survivor tries to go on, somehow make sense of it all, knowing the task is futile but that perhaps peace can be achieved. Why should his memoir be any different? Cassada will take a few hours to read, in which time there is exquisite suspense, some lovely sentences, a tender portrait of a hero--Isbell, I still believe, not Cassada--and a lot of shoptalk about flying. But the flying talk is better, more exactly described and sustained, more rapturous--"exalted," to use a favorite Salter word--in Burning the Days, and the memoir has the advantage of tracking the two held-apart strands of Salter's emotional life--the chaste love of men, the insatiable desire for women--more closely than is possible for a book about fighter pilots. The following sentence sounds like Isbell recalling Cassada, but in fact it's Salter standing in the wreckage of all who have died: "You are surviving, more than surviving: their days have been inscribed on yours."

    Eric Weinberger

  • Letters

    Bully in the Pulpit?

    The following is a forum on Ellen Willis's "Freedom From Religion," which appeared in the February 19 issue.
          --The Editors

    Los Angeles

    I am a spiritual man. I believe politics must arise from a spiritual source as well as an ideological one. The great movements in our history have been spiritually motivated, at least in part. I also want the religious institutions engaged in the questions of justice and morality every day of the week.

    But there are problems with the new politics of religion about which Ellen Willis writes. One is the danger of the private religious sphere replacing the public sector, a new kind of privatization that is not accountable. The second is that these religion-based projects will be charitable in nature and will not express political rage--because they will be tax-exempt, dependent on government. Third, social programs and movements should be independent of any pressure to adhere to a religious doctrine in order to qualify for funding.

    What is needed is Old Testament rage, not a clerical seizure of the public sector.

    Tom Hayden is a former California State Senator.

    Washington, D.C.

    Ellen Willis makes two intertwined arguments. The first is a forthright defense of the separation of religion and the state, inflamed by George W.'s plan to fund "faith-based" service agencies. The second is a sweeping attack on those liberals and leftists who speak kindly of churches and devout churchgoers, ignoring their undemocratic beliefs and arrogant practices.

    Anyone who truly cherishes the First Amendment should indeed be wary of Bush's desire to use tax money to proselytize Americans who are short on money and hope. But it's not only secularists who are concerned. Liberal evangelicals like Jim Wallis and faithful conservatives like Kate O'Beirne worry that manna from the Feds will compromise what Wallis calls the "prophetic voice" of religious bodies. Early in the nineteenth century, the very absence of a state-sanctioned church encouraged Americans to create and follow a wildly heterogeneous variety of creeds (and none at all). People who take their faith seriously would be foolhardy to give up the independence that served them so well in the past.

    Willis's broader polemic against "religious orthodoxy" reveals how little she understands the spirituality she abhors. In challenging all manner of received authority, the rebellion of the 1960s also transformed the rituals of many churches and synagogues--and started many young people on a personal search for "meaning" that social movements by themselves could not satisfy. Today, the conflict between the devout left and the devout right--over economic issues like a living wage as well as an acceptance of homosexuality and support for abortion rights--is as intense as the one in which Willis does battle. And it's probably a more significant fight, given the minority of Americans who articulate their moral beliefs in strictly secular ways.

    Strangely, Willis never seems to wonder why the "alternative moral vision" grounded firmly in the Enlightenment (which we share) captures few contemporary hearts and minds. The secular left has found nothing to replace the socialist dream and struggles to mount a persuasive challenge to the far-too-worldly gospel of free markets. Karl Marx, no friend of organized religion, nevertheless understood that, for ordinary people, religious faith was "the heart of a heartless world." If Willis hopes to build a more humane society, a bit of empathy for the spiritual choices of ordinary Americans might come in handy.

    Michael Kazin's latest book (with Maurice Isserman) is America Divided: The Civil War of the 1960s. He teaches history at Georgetown University.

    Washington, D.C.

    Ellen Willis is right but does not go far enough. Even without Bush's faith-based initiatives the Catholic Church has not only demanded but received exemption after exemption from providing the most unexceptional forms of reproductive health care. No emergency contraception for women who have been raped. No voluntary postpartum sterilization for women who are having what they hope will be their last child. No fertility treatments for women who would like to have a child.

    These services are legally denied by Catholic hospitals and they are often eliminated in secular hospitals that merge with Catholic institutions. Catholic Charities of California has sued the state, seeking an exemption from a state law that requires employers--other than religious institutions engaged in narrowly defined religious activities--to provide contraceptive coverage to their employees. At the same time Catholic Charities nationally receives about 75 percent of its income from government sources. Catholic hospitals receive the bulk of their funding from government sources and tax-exempt bonds.

    But the simple claim of conscience by a Catholic institution or the assertion of "church teaching" is enough for most legislators to just give the church whatever it wants as well as tax dollars. There was no national interest in protecting women's consciences when the Clintons included in their health reform package a conscience clause for healthcare provider institutions allowing them to deny any service they deemed immoral and still be eligible for government grants and contracts. Catholics are against this. Eighty-two percent believe that if a Catholic hospital receives government funds it should be required to allow its doctors to provide any legal, medically sound service they believe is needed. But for most legislators the power of the 300 US Catholic bishops is much more important.

    For the bishops to try to have their cake and eat it too is politics as usual. For ultraconservative Catholic groups to claim that any criticism of the Catholic Church is Catholic-bashing is part of the game. For our leaders, Democratic and Republican, to keep serving them more cake is unconscionable.

    Frances Kissling is president of Catholics for a Free Choice.

    New York City

    The First Amendment was enacted not merely to keep the state off the back of religion. One of its prime functions was to keep religions off one another's back and to stop them from using the state as their agent.

    Let us not imagine that this problem has long vanished. Two or three years ago, I was supposed to share a platform with John Cardinal O'Connor on the subject of Jewish/Catholic relations at the Park Avenue Synagogue in New York. The Cardinal was so ill he sent his speechwriter to read his remarks. I responded then, and later in an Op-Ed in the New York Times, by asking a very direct question: What would happen to my religious liberty as a rabbi if the campaign by the Roman Catholic hierarchy and the evangelical Protestants to outlaw abortion completely and under all circumstances passed into American law? As a rabbi, I am commanded--not permitted, but commanded--to advise a pregnant mother whose life is in danger that she must have an abortion. On the question of who comes first, the mother or the unborn child, Judaism, even its most Orthodox versions, insists that the mother's right to life is pre-eminent. The ultimate aim of the "pro-life" believers is to outlaw abortion completely. It is thus an attack on my religion. The First Amendment of the Constitution was enacted to make sure that I would not meet a triumphant collection of cardinals and bishops, and evangelical preachers, celebrating a victory of "natural law," as they interpret it, over the Talmud.

    Even more fundamental, the notion that morality is safe only in the hands of the religions does not stand up to the historical evidence. No society in which a church was dominant has ever emancipated the Jews. In the past two centuries or so, Jews have achieved political equality in the West, but everywhere, without exception, the forces that granted this freedom were fought by the majority religions. The major faiths of the world have learned, or are still learning, to live with one another as equals, but not because they have had new revelations from on high. On the contrary, this kinder and gentler aspect of their natures has been evoked by mercantilism and then by the Enlightenment, that is, by the very secular forces from which we would supposedly be saved by the major religious traditions in the name of their superior morality. In our day Roman Catholics and Orthodox have been slaughtering one another in the Balkans, and joining in the murder of Muslims, in the name of religion, and Muslims in the region have been no kinder. It is not self-evident that we will be cured of our ills if there is more control of public life by the various faiths.

    In my ears, a recent statement by Joseph Cardinal Ratzinger, the contemporary keeper of orthodoxy in the Roman Catholic Church, continues to ring. He pronounced non-Christians to be "defective people." Pope John Paul II, despite some urging, did not disavow this assertion. I would suggest that those who want to cure our ills by asking for a return to "that old-time religion" need to be made to examine very closely what they are calling on our society to affirm.

    Arthur Hertzberg, Bronfman Visiting Professor of the Humanities at New York University, is the author, most recently, of Jews: The Essence and Character of a People. He is currently working on a memoir.

    Cambridge, Mass.

    Ellen Willis and I share much in common: our disdain for the religious right, our distrust of the Bush Administration's motives behind its new "faith-based" solutions--and our complete and unbending opposition to those religious claims that, when acted out, bring real and substantial harm to those whose views are not the same.

    But Willis can't stop there--and yet should have, because taken as a whole her essay is so utterly lacking in historical understanding of religion's roles in America, so devoid of nuance in recognizing the immense varieties of experience, belief and religiously inspired political action, and ultimately so intolerant that she ends up willfully blind to the freedom-, justice- and equality-creating contributions religious ideals, language and actors have made to American life. Bereft of that, she disqualifies herself from being taken seriously. Her ideas, for example, that American religion perforce relies on "absolute truth" while "democracy, by contrast, depends on the Enlightenment values of freedom and equality," and that democracy has thrived here because it has "preserved relatively clear boundaries between public and private," and thereby kept conflict between secularism and religion to a minimum, mean she's never read Tocqueville or Perry Miller or Gordon Wood or Sydney Ahlstrom or Kevin Phillips's recent Cousins' Wars on religion's centrality in creating and sustaining our democracy; nor does she even vaguely begin to understand why and what divides American Christianity over literalist, inerrantist and historical/critical readings of the Bible itself--or why it's important to American politics.

    Telling us that unless religion stays a chastely private "matter of personal conscience," it almost always and everywhere breeds intolerance that threatens democracy itself means Willis has never read Eugene Genovese on evangelical Christianity's emancipatory and sustaining role in the lives of African-American slaves. Certain that any religiously based assertion of a public voice serves power and bigotry means she's never read Herbert Gutman on faith's role in the lives of early industrial workers and how it helped them fight for unions and fair treatment before the law. Busy doing other things, apparently she's never picked up Lincoln's Second Inaugural or ever glanced through the private journals of Union soldiers who in their faith found the courage and reason to destroy slavery. Convinced that religion enforces only sexual and gender orthodoxies, she can't imagine why religiously organized colleges were the first to admit women or provide them advanced professional training.

    In seeking to assure us that the best of all possible democratic worlds is one in which we enjoy "freedom from religion," she leaves us with no coherent explanation of what inspired the abolitionist movement, or early suffragists, or the temperance movement, or how the Progressive Era emerged. She gives us no way to understand why US Catholic bishops endorsed extensive worker ownership of the means of production in 1919 or why Father John Ryan was known as "Father New Deal," or how as Hitler rose to power in the 1930s, Jewish and mainline Protestant leaders worked together successfully to transform the then-common references to America as a "Christian nation" into a "Judeo-Christian" one, or how theologians like Paul Tillich and Reinhold Niebuhr led the early homefront mobilizations against Hitler well before Kristallnacht, let alone World War II.

    Closer to the present day, Willis leaves us no way to affirm how the Kings and Abernathys and Lewises, as well as the Berrigans and Coffins and Coxes and Hines--and the millions of white and black Americans--found in their religious faith the reasons to battle racism and segregation or the horrors of the Vietnam War. Nor does she help us understand what, in the 1980s, sent Catholic nuns and Quakers and thousands of other religious Americans to Nicaragua and El Salvador and Guatemala, or inspired the Catholic bishops' pastoral letters on nuclear arms or on economic justice. Nor can she tell us why it is religious groups today that are in the forefront of the anti-death penalty movement, living-wage campaigns and campaigns for massive debt relief for impoverished Third World nations, or why they were the principal supporters of reuniting Elián González with his father.

    Nuanced when it comes to explaining the aesthetic behind Andres Serrano's Piss Christ, Ellen offers only unnuanced contempt to those she derides as "the earnest centrists and liberals who are doing [the] dirty work" of Bush and the Christian right by thinking religion has an important role to play in America's public and political life. Wrong in her history, wrong in her analysis, she is wrong in her judgments.

    But how could she not be wrong? Living in a country where more than 90 percent of citizens have always told pollsters they believe in the existence of God, Willis cannot see which traditions therein speak for justice and which against, which seek freedom and which do not, which love democracy and which do not--let alone how to build a progressive politics on the shared values of the secular and religious alike. Blind, Willis cannot see; deaf, she cannot hear.

    Led by this kind of thinking, progressivism should stop pretending even to be political--and settle permanently into the sort of dinner-table rant Willis's essay represents.

    Richard Parker teaches on religion, politics and public policy at Harvard University's John F. Kennedy School of Government.

    Washington, D.C.

    Ellen Willis refers to a Joe Lieberman speech in which he mentions an incident after a talk I gave at Harvard on religion and public life. I don't know where Lieberman heard the story, but he got it wrong. Let me give the true account as a way of responding to Willis's piece. At an informal gathering of left intellectuals in the Harvard/Boston community discussing faith and politics, I was asked, "But Jim, what about the Inquisition?" I was a little surprised by the question, after laying out the history of progressive religion's contribution to myriad movements for social reform. I replied, "Well, I was against it at the time--and I am still opposed to it. Now unless you want me to raise Pol Pot and the Khmer Rouge every time you bring up national health insurance, why don't we move on to a better conversation." The questioner smiled, got the point, and we did move into quite an intelligent discussion between religious and nonreligious progressives about common concerns. It's too bad that Willis missed the opportunity to do that.

    In my own religious upbringing, I heard many people equating anybody on the liberal left with the most oppressive and totalitarian of regimes. But when I became involved in civil rights, antiwar and other social-justice movements, I discovered many people on the left quite committed to democracy, pluralism and human rights. It was quite comforting. Apparently, Ellen Willis hasn't yet figured out that there are religious people committed to the same things. She thinks we are all committed, instead, to special privileges for religion in the political arena. I guess she doesn't get out much, or hasn't bothered to read her history, or just can't bear realities that don't fit her preconceived ideological agendas.

    It is tedious the way that "secular fundamentalists" like Willis continue to caricature and belittle religion and people of faith. Ellen, we are indeed motivated by our faith to seek justice and peace. But in the public arena, we don't make arguments based on others' accepting the "absolute truth" of our faith. We make cases appropriate to a pluralistic society. King's vision of the "beloved community" came directly from his biblical faith, but he argued for civil rights and voting rights on the basis of what was good and right for a democracy. Many of us make similar claims in similar ways all the time, even if it is faith that compels us to do so.

    It's also very old and, frankly, politically stupid to keep repeating the abuses of religion as if religious people didn't know or even agree. Yes, their owners gave the Bible to black slaves to turn their eyes toward heaven and away from their earthly plight. But in that same book, the slaves found Moses and Jesus, who helped inspire their liberation struggle. Some of us have been among the chief critics of oppressive religion for years but, at the same time, have lifted up its progressive practice and potential. Do we really need to keep reciting that progressive religious history? I think the Ellen Willises must know it and just choose not to pay any attention. Nothing will keep the secular left more irrelevant in America. Martin Luther King Jr. neither hid nor imposed his religion but rather used it as a social resource to transform America. That's exactly what many of us are doing today. And you know what, we don't need The Nation's permission to do so.

    Jim Wallis is editor of Sojourners magazine, author of Faith Works and convener of Call to Renewal, a federation of faith-based organizations working to overcome poverty.

    New York City

    I applaud Ellen Willis's smartly reasoned critique and agree heartily about the dangers such politics present to democratic and feminist values. Two points that Willis neglects, however, show a more complicated view of the bipartisan, pro-religion landscape we face in the Bush II era.

    My first point has to do with the link between the culture wars and macroeconomics. It is surely true that religious groups and politicians across the spectrum (from Jim Wallis and Floyd Flake to the Christian Coalition and the Vatican) are rushing to legitimize public support for "faith-based programs" in order to reclaim moral authority and "a privileged role in shaping social values." But the struggle over sexuality, gender definitions, parental control and popular culture is also tied to the struggle over public resources. At the macroeconomic level, the new assertiveness of religious institutions as stakeholders in the polity takes place in the larger context of globalization and the rapid privatization of social services formerly provided by the state.

    Privatization wears many faces, including those of the so-called nonprofit as well as the corporate profit-making sector. In the United States, for example, Catholic/non-Catholic hospital mergers have proceeded at a nearly geometric pace since the mid-1990s, according to a series of studies by Catholics for a Free Choice. The result is that in numerous counties across the country, Catholic hospitals that systematically deny essential reproductive health services are the only hospitals in town. Likewise, with the attack on public schools and universities and the erosion of their resources, parochial schools come to be seen--and advertise themselves in full-page New York Times ads--as a better-quality "choice" for poor black and Latino as well as middle-class children. All this is about not only control over social and sexual values but also the grab for tax dollars and the marketization of religious institutions. The strategic implications are clear: Progressives fighting for "freedom from religion" need to ally with groups opposing privatization in education and health.

    My second point has to do with the potential constituency for such an opposition movement and carries a more optimistic note. The daughter of an observant Reform Jewish family, I grew up in the heart of the conservative Christian, anticommunist Bible Belt in the 1950s. When I try to trace the roots of my liberal and left radicalism in later years, I find their earliest core in the alienation and anger I felt in public schools where Christian symbols and "Athletes for Christ" were compulsory fare and religion not only defined who you were but was itself defined in evangelical Christian terms. The resurgence of religiosity in politics today may pretend to be stylishly multicultural, but if the sanctimonious Joe Lieberman is any indicator, we can be sure that "correct" religion--and correct "faith-based programs" for the public coffers--will represent the Judeo-Christian mainstream. Even Willis neglects to mention Muslims, Hindus, Parsis, Buddhists and so on in her comments about devaluing the citizenship of "others." And her persistence in using "church" as a generic substitute for religion only underscores the ubiquity of certain exclusionary assumptions in the dominant political discourse.

    My optimism comes from my childhood experience in a similarly conservative era. I predict that the movement to institutionalize and fund "faith-based programs" by the state will create its dialectical opposite. Not only secularists and Jews but a whole generation of immigrant young people--Indians, Vietnamese, Egyptians, Pakistanis--will become radicalized toward secular antiracist feminisms and the left. The sexual dimensions of religious politics (e.g., funding for "abstinence-only" sex education), whose importance Willis rightly emphasizes, will only intensify their fervor.

    Rosalind Pollack Petchesky, professor of political science and women's studies at Hunter College and the Graduate Center, CUNY, is the author of Abortion and Woman's Choice and the forthcoming Women and Global Power.


    As an ordained Baptist minister and former pastor, I largely agree with Willis's principled and spirited defense of democratic secularism. But I've got a few bones to pick, only a couple of which I'll address here. While critics on both the left and the right often invoke the example of the black church in their arguments about religion and politics, few ever get it just right, including Willis. She charges Stephen Carter with rewriting history by equating "the civil rights movement with Martin Luther King Jr. and the black church." To be sure, a lot of folk outside the black church were involved in that movement, but the central, even defining, influence of black religion on the civil rights struggle can't be missed. Counterposing "secular activists" like Rosa Parks--a devout Methodist--and John Lewis--an ordained minister--to black church activists is a serious misreading of just how much the gospel of freedom influenced many religious blacks who were leaders and foot soldiers in the NAACP, CORE and SNCC.

    Unlike the Christian right, black Christian activists have mainly resisted the impulse to make ours a Christian nation. Martin Luther King Jr. and other black Christians were inspired by their religious beliefs to fight for equality and freedom for all citizens, even those who did not share their religion. That's why King opposed school prayer. He didn't want the state telling anyone that they had to pray, or whom to pray to, even if it turned out to be the God he worshiped. King understood the genius of secularism: It allows all religions to coexist without any one religion--or religion at all--being favored. Plus, King had a healthy skepticism about the white church, which lent theological credence to slavery and Jim Crow. He sided with the state against the church when the former intervened to keep white Christians from bombing black churches.

    But Willis could also learn from the skepticism that King and many black Christians had toward the Enlightenment. Reason proved no better than religion in regulating the moral behavior of bigots. And proclaiming a devotion to freedom and equality meant nothing if black folk weren't even viewed as human beings worthy of enjoying these goods, either by religious whites or enlightened secularists. For many black folk, it wasn't whether white folk were religious or secular that mattered; it was whether they were just or not. To paraphrase Jesus, Willis and the rest of us should not only see the bad trees in religion's eye but spot the forest that plagues the secularist's vision as well.

    Michael Eric Dyson, the Ida B. Wells-Barnett University Professor at DePaul University, is the author of I May Not Get There With You: The True Martin Luther King, Jr. (Touchstone).


    New York City

    Richard Parker, Michael Kazin and Jim Wallis all ignore what I actually said and impute to me views I don't hold--some of which my article explicitly disclaims. I did not write a broadside attack on religion or religious people but an argument against the current broadside attack on secularism and secularists, particularly the claim that secular society is antidemocratic and violates believers' rights. While quarreling with the notion that religion is the sole or primary inspiration of social movements--which writes secular activists out of history--I acknowledge the political contributions of the religious left, the common interests of religious and secular leftists, and the fact that many religious progressives oppose the antisecularist politics I criticize. Far from expressing abhorrence of spirituality, as Kazin charges, I note that in our postpsychedelic age many people who favor a secular society and are not religious in the conventional sense have their own conceptions of the quest for transcendence. (I include myself in that category.)

    I do not, as Parker suggests, claim that religion has had no influence on American democracy--or American democracy on religion. Nor do I argue that religious and democratic sentiments are mutually exclusive. My contention that there is "an inherent tension" between the two is a response to the argument made by Stephen Carter and others that a democratic government should make special accommodations to religious belief because of its absolute nature. To recognize a tension, however, is not to deny the existence of efforts to transcend or reconcile it. In an odd misapprehension, Parker has me saying democracy has thrived by preserving clear boundaries between public and private, thereby minimizing conflict between secularism and religion. My point is essentially the opposite: that minimizing religious-secular conflict depended on confining the practice of democracy to a narrowly construed public, political sphere, and that the spread of democratic principles to "private" life--especially sex, gender and childrearing--has greatly intensified the conflict. Parker ought to do a better job of reading before presuming to lecture me on the supposed gaps in my bibliography.

    I don't dispute Michael Eric Dyson on the centrality of the church to the civil rights movement, but would argue that secular ideas and organizations were also important. In fact, one laudable function of the Southern black church during the civil rights era, as with the Catholic Church in Poland in the 1980s, was to give shelter and space to a diverse assortment of dissidents, religious or not. I agree that secularists have no monopoly on morality or clear vision. As part of another group not considered fully human, I've experienced the gap between the profession of Enlightenment principles and their practice. But it's only because the principles exist that I can demand that they apply to me.

    Pace Tom Hayden, I believe the only truly radicalizing force is people's desire to change their own lives for the better. In my experience, moral outrage all too quickly becomes self-righteous authoritarianism.

    Thanks to Frances Kissling, Rosalind Petchesky and Arthur Hertzberg for their valuable additions and to Wallis for supplying the context of Joe Lieberman's use of his remarks.


    Tom Hayden, Michael Eric Dyson, Ellen Willis, Michael Kazin, Frances Kissling, Arthur Hertzberg, Richard Parker, Jim Wallis and Rosalind Pollack Petchesky