The war in Afghanistan, coming after the atrocities of September 11, provokes a welter of contradictory emotions. On the one side, a desire for justice and a yearning for security. And on the other side, dread of a war unrestrained by national boundaries, time frame or definable goals.
We believe that America has a right to act in self-defense, including military action, in response to a vicious, deadly attack on US soil by a terrorist network identified with Osama bin Laden. There is a real threat of further attacks, so, as Richard Falk argues on page 11, action designed to hunt down members of the terrorist network and those in the Taliban government who collaborate with it is appropriate.
But acknowledging a right of response is by no means an endorsement of unlimited force. We must act effectively but within a framework of moral and legal restraint. Our concern is that airstrikes and other military actions may not accomplish the ends we endorse and may exacerbate the situation, kindling unrest in other countries and leading to a wider war. They have already triggered bloody riots in Pakistan and Indonesia and on the West Bank, where the cease-fire is in shreds.
This effort ideally should have been carried out under the aegis of the United Nations Security Council and bin Laden and his associates brought to justice for their crimes by an international court. The United States should still seek a mandate from the Security Council for its military actions. This would give the campaign the international legitimacy it needs to avoid playing into the hands of those charging an American war against Islam, and it would offer some protection against the calamity of a wider and uncontrolled war. It would also help strengthen the UN's policing and peacekeeping capacity.
If limited military action in self-defense against bin Laden and his backers and cohorts is justified, an open-ended "crusade" against pariah nations to stamp out ill-defined evil is not. There are already ominous rumblings in the Pentagon that such interventions are contemplated. The Administration has notified the Security Council that it might pursue terrorists in other nations. This may be more of a threat than a promise, especially as it pertains to the Philippines and Indonesia. But it is no secret that hard-liners hanker to expand the war to include strikes against Iraq, Iran, Syria and other hard cases.
Military actions inside Afghanistan must be circumscribed by limited political objectives and carried out with a minimum of civilian casualties. The report of the killing of four Afghan UN employees (engaged in clearing the deadly harvest of mines sown by two decades of war in that nation) in the second day's bombing underscores the potential costs when vast firepower is unleashed against a poor nation with comparatively few military targets. As civilian casualties mount and more refugees are driven from their homes, international support for the US effort will dwindle.
The US air war has already magnified humanitarian problems that call for urgent attention. Even before the war, 7.5 million Afghans faced famine; now the fighting has interrupted overland shipments of food, and half a million refugees have fled the bombing. American cargo planes dropping 37,000 box lunches cannot mitigate this problem, so US contributions to international agencies giving food and medical aid must be stepped up. With fleeing Afghans massing at border chokepoints, the Pakistani government should be pressured to allow aid to go through. The UN, with US assistance, must expand the number of camps that will take in the uprooted.
Also looming in Afghanistan is the prospect of the Taliban government falling and leaving a power vacuum, into which rush the furies of anarchy and civil war. The UN should immediately convene a coalition of opposition groups (including those representing Afghan women) in an attempt to ease the transition to a new government that is broadly representative of the Afghan people.
Here in America, responsible members of Congress should demand clarification of the Administration's goals in this war and oppose the President's attempts to curtail Congressional oversight of the conflict. In this regard, we hope that the courageous statement of Representative Jim McDermott that the Administration lacked a "fully developed and comprehensive strategic plan" will hearten more of his fellow Democrats to engage in similar scrutiny. And let us also praise Senator Russell Feingold for at least slowing down an antiterrorism package that the Senate leadership was trying to rush through Congress by severely limiting amendments and debate.
As the fog of national security closes in on Washington, the press must resume its appropriate watchdog role. Civil liberties groups should stay on high alert, flashing early warnings against unconstitutional laws and violations of civil rights--especially those of innocent aliens apprehended in early antiterrorist sweeps.
As we have said before, military means are only one weapon in the fight against terrorism--and a very limited one. Of greater importance are diplomatic, law enforcement and intelligence efforts. Beyond those, instead of more US military attacks we need a multinational coalition dedicated to attacking the conditions breeding terrorism--the endless Israel-Palestine conflict, the corruption of US-supported Arab regimes, the world inequality and poverty spawned by globalization. And on another front, as Jonathan Schell warns on page 7, the question of weapons of mass destruction has acquired a new salience as a result of the recent events. Nuclear disarmament, a test ban and stronger nonproliferation measures are sorely needed. We should not let the military action overshadow these greater challenges.
As Schell writes, "The world is sick. It cannot be cured with America's new war. The ways of peace--adopted not as a distant goal but as a practical necessity in the present--are the only cure."
President Bush has stated that his global campaign against terrorism will be a "new kind of war," in which traditional military approaches will give way to a more innovative mix of tactics. Or, as Defense Secretary Donald Rumsfeld put it in an Op-Ed for the New York Times, "The uniforms of this conflict will be bankers' pinstripes and programmers' grunge just as assuredly as desert camouflage."
Unfortunately, despite this rhetoric about fighting a new kind of conflict, the only thing new about the military budget in the wake of September 11 will be its size. Once emergency funding, previously requested increases and a forthcoming supplemental appropriation are taken into account, military spending for the current fiscal year could hit $375 billion, a $66 billion increase over last year's total. Rarely has so much money been thrown at the Pentagon so quickly, with so little public debate. Most of this new funding will be used to bail out existing weapons programs, not to finance new equipment or techniques designed for the fight against terrorism. The Pentagon's latest strategy review, released in early October, makes numerous mentions of "terrorism" and "homeland defense," but it is essentially a status quo document that will not require the cancellation of a single major weapons program.
Given this anything-goes approach, look for Republican Representative Curt Weldon to seize the opportunity to shore up the Boeing V-22 Osprey, a tilt-rotor aircraft built in his district that has been plagued by a series of fatal crashes and falsified test results. Likewise, the Georgia and Texas delegations will use the new, more generous Pentagon funding environment to fend off challenges to the Lockheed Martin F-22, an overweight, outmoded fighter plane that now costs more than $200 million each. And the list will go on, to include costly artillery systems, attack submarines and combat ships, all of which were originally designed for battle with a Soviet military machine that no longer exists, and none of which have any obvious application to the President's current war on terrorism.
Of course, the most expensive item on the Bush Administration's military wish list is its multibillion-dollar missile defense scheme. The low-tech means used for the September 11 attacks underscore the fact that a long-range ballistic missile is the least likely method a hostile power would use to attack the United States. But Congressional Star Wars boosters, like GOP Senator Jon Kyl and Democratic Senator Joseph Lieberman, have rushed to the program's defense nonetheless. Meanwhile, Democratic critics, like Senate Armed Services Committee chairman Carl Levin, are holding their fire for the moment, after agreeing to put aside plans to restrict the Pentagon's $8.3 billion missile defense budget for the current fiscal year in the name of national unity.
In a recent address to the Heritage Foundation, Pentagon Comptroller Dov Zakheim did manage to specify a handful of items that he argued would be useful in tracking down and targeting terrorists, including unmanned surveillance vehicles, like the Northrop Grumman Global Hawk, and specialized reconnaissance aircraft like the RC-135 "Rivet Joint." But these systems are small change compared with the tens of billions the Pentagon will continue to devote to big-ticket, cold war-era white elephants like the F-22.
Lockheed Martin and its industry cohorts will also benefit from the stepped-up US arms sales to the Middle East and South Asia that are being offered as a quid pro quo for joining the US-led antiterror campaign. The Pentagon's Defense Security Cooperation Agency, which handles government-to-government arms sales, recently established a "war room," known as the "Enduring Freedom Response Cell," which is intended to put weapons requests from US allies on a "fast track." Uzbekistan has been cited as one country that could benefit from the new, streamlined procedures. Pending sales of F-16 aircraft to Oman and the United Arab Emirates and multiple-launch rocket systems to Egypt are being rushed through in the name of coalition-building.
Even if there were an effective military solution to the scourge of terrorism, the Pentagon would be hard pressed to explain how most of the items included in its current spending spree could be put to use in such a battle. It's time for skeptics on Capitol Hill and in the country at large to speak out loudly and clearly, before our leaders in Washington write out a blank check to the Pentagon that could distort federal budget priorities and the conduct of our foreign policy for years to come.
The press conference that Defense Secretary Donald Rumsfeld held shortly after the United States began bombing Afghanistan on October 7 was painful to behold. The questions posed by reporters tended to be either trivial--Did the B-2s involved in the mission depart from the United States?--or thoughtless. Since September 11 Rumsfeld had repeatedly said that he would not divulge any information that might endanger ongoing operations, but that did not stop reporters from trying to elicit it. CNN's Jamie McIntyre, for instance, kept demanding to know whether the United States planned to send ground troops into Afghanistan. Rumsfeld did his best to ignore him, but, as McIntyre persisted, the Secretary finally fixed him with an icy stare and said, "We don't discuss operational details."
The briefing reminded me of the famous Saturday Night Live sketch aired during the Persian Gulf War, in which reporters--despite being warned not to ask about matters that could aid the enemy--posed questions like, "What date are we going to start the ground attack?" and "Where are our forces most vulnerable to attack?" The sketch captured the public's disdain for the media's mindless aggressiveness and reinforced the first Bush Administration's inclination to restrict the flow of information about the war.
Now, with a new conflict upon us, the second Bush Administration seems intent on imposing similar controls. "Although the administration says it is not engaged in censorship," Elisabeth Bumiller reported in the New York Times, "officials throughout the government readily say they have been ordered to be circumspect about their remarks." This is certainly troubling. Without access to battle sites and timely information, the press--whatever its faults--will have a hard time assessing the success of US actions. Accordingly, US news organizations have been pushing the Pentagon to be more open.
That seems unlikely to happen, however. As during the Gulf War, the public seems to support the Administration's approach. Rather than sit around and grumble, though, reporters and editors should rededicate themselves to the real task at hand, which is providing the fullest possible coverage of the complicated new era we have entered. That, in turn, requires journalists to show such qualities as independence, enterprise and, yes, courage. Regardless of how much information the government provides, the press must pose uncomfortable questions, challenge broadly held assumptions and solicit opinion from a wide range of sources.
There are some hopeful signs. During the Gulf War, the press uncritically accepted Pentagon assertions about the accuracy of its missiles. Postwar studies showed those claims to be vastly exaggerated, and many journalists felt burned. A month into the current conflict, some journalists have shown their determination to avoid a repeat. Thus, after the Rumsfeld briefing, Richard Hawley, a former US general turned ABC news consultant, told Peter Jennings that in bombing Afghanistan, the United States was using precision-guided weapons so as to avoid "collateral damage." Jennings immediately pounced. During the Gulf War, he observed, generals "repeatedly talked about precision-guided weapons, and they turned out to be anything but precise. How much better is it now?" Hawley said that US missiles now have GPS-aided navigational devices that make for "far fewer stray rounds." Whether that's so remains to be seen, of course, but the exchange shows how some journalists, at least, have learned from that past conflict.
The current one, however, offers a host of new challenges, especially in covering the political dimensions of the conflict. And here the press could do much better. To cite one example, the Pentagon revealed on October 7 that in addition to dropping bombs on Afghanistan, it was dropping humanitarian food packages. In all, it said, it was delivering about 37,000 packages. Most news organizations accepted at face value the Pentagon's explanation that this showed America's concern for the well-being of the Afghan people. Millions of Afghans, however, face starvation, and the next day NPR reported that Doctors Without Borders had condemned the US food drop as "propaganda" and, further, that the bombing had caused the UN World Food Program in Pakistan to suspend its daily shipments of 700 tons of food into Afghanistan. In reporting this, NPR did not rely on handouts from the Pentagon; rather, it went into the field and developed its own sources of information. (In fairness, Washington says it plans to increase greatly the size of its food drops once it is safe to do so.)
Another, more serious example of the press's credulity has been its coverage of the US intelligence services. In light of the failures to predict the September 11 attacks, the press has almost unanimously concluded that the United States needs to beef up its spying abroad and to "unleash" the CIA to fight terrorism. In a piece for The New Yorker, for instance, Seymour Hersh, relying heavily on sources within the US intelligence community, lambasted the CIA for turning away from the rough-and-tumble methods it used during the cold war. "Look," one agent told Hersh, "we recruited assholes. I handled bad guys. But we don't recruit people from the Little Sisters of the Poor--they don't know anything." A piece in the New York Times's Week in Review section echoed Hersh. "The CIA's spies are ill-equipped to fight a dirty war in the world's back alleys," lamented Tim Weiner, who went on to cite the need for American intelligence to rebuild its capacity for "old-fashioned espionage" and satisfy the "urge for covert action to combat an invisible foe."
These articles offered no independent assessments as to how much impact such a buildup could actually have in combating terrorism. Even more troubling, they showed no awareness of the serious costs of past US covert operations, from the Congo to Cambodia to Latin America. This omission seemed especially dismaying in the case of Hersh, who over the years has broken so many stories about clandestine mischief abroad.
Clearly, the United States needs to improve its ability to confront invidious groups like Al Qaeda. We are indeed fighting a new kind of war, and it requires new types of responses. Yet the unthinking acceptance of premises like the need to "unleash" the CIA does not advance the discussion. More than ever, US journalists must avoid the temptation to engage in groupthink and--without seeming reflexively adversarial--must ask sharp questions. In the end, the danger they face is not just censorship, but self-censorship.
The attacks of September 11 have not only exposed the failures of our intelligence apparatus and the "blowback" problem of US foreign policy. They have also stripped bare how one branch of corporate America, the $273 billion airline industry, has successfully captured the government agency that is supposed to oversee it and bought off the people's watchdogs in Congress. This situation argues for far-reaching changes in how campaigns are financed and how government agencies are staffed.
The vulnerability of our airports can be traced, in part, to the role of the airline industry in lobbying year after year against any federal takeover of airport security and its insistence on contracting the work out to low-bidding companies that often pay little more than the minimum wage to the people who check passengers' luggage and X-ray their handbags. Last year the General Accounting Office found that starting salaries for screeners at all nineteen of the nation's largest airports were $6 per hour or less, with five boasting starting salaries of just $5.15 per hour. According to the Federal Aviation Administration (FAA), from May 1998 through April 1999 turnover at those same nineteen airports ranged from 100 percent to more than 400 percent. Argenbright, one of the four big companies that dominate the private airport security business in America, pleaded guilty in 2000 to several charges and agreed to pay $1.2 million in fines for falsifying records, doing inadequate background checks and hiring at least fourteen airport workers in Philadelphia who had criminal convictions for burglary, firearm possession, drug dealing and other crimes. In 1978, reports the New York Times, the FAA "found that screeners failed to detect guns and pipe bombs 13 percent of the time in compliance tests, while in 1987 the agency found that screeners missed 20 percent of the time. Since then, the agency has stopped releasing figures."
Despite these worrisome facts, the airlines and their lobby, the Air Transport Association (ATA), fought against any federal takeover of airport security because they didn't want to have to pay more for it and because they didn't want potential passengers scared off by longer lines or fears of a hijacking. And the FAA dragged its heels, in part because its mandate, written by a Congress addicted to millions in transportation-industry campaign contributions, has been not only to ensure air safety but also to promote air travel. The airlines alone have given more than $65 million to federal candidates and parties since 1990, and spent roughly the same amount lobbying the federal government between 1997 and 2000.
Much of that boodle helped to weaken the implementation of new security procedures recommended by a 1996 presidential commission chaired by Vice President Al Gore, set up after the TWA 800 crash. For example, according to a report by Public Citizen, the commission's recommendation that the background of all airport employees be checked for criminal records was opposed by the industry because it would create administrative and financial burdens. Even Gore himself backed down on his commission's insistence that all bags be matched to passengers on all flights. The day after he wrote the ATA about his change of heart, campaign contributions started to pour in from the airlines to various Democratic Party committees at double their previous pace.
Many people in Washington have enriched themselves by maintaining this sordid status quo. Current or recent lobbyists for the airlines and/or the ATA include Linda Hall Daschle (wife of Senate majority leader Tom Daschle), Haley Barbour (former Republican National Committee chair), Harold Ickes (deputy chief of staff in the Clinton White House), Ken Duberstein (chief of staff for Ronald Reagan and a crony of Colin Powell), Nick Calio (now President Bush's Congressional liaison) and former Senators Dale Bumpers and Bob Packwood. Three recent FAA administrators, including Linda Hall Daschle, have come from the industry.
So far, nothing has changed in the wake of the September 11 attacks. According to Paul Hudson, director of the Aviation Consumer Action Project, Transportation Secretary Norman Mineta has "excluded all aviation security proponents, consumer or public representatives, air crash victim groups, former FAA security officials critical of aviation security and the manufacturers of advanced aviation security equipment from his advisory group" on new security measures, relying instead on the industry alone. The airlines finally came out in favor of federalizing airport screening, though by September 12 their lobbyists were already plotting the $15 billion taxpayer bailout. A month later, the thousands of laid-off employees, who lack a similarly well-heeled lobby, are still waiting to find out if they will get emergency unemployment, healthcare and job-training support.
What do we bomb next? Saudi Arabia? The Saudis would be a logical target if President Bush were serious about his stated goal of punishing nations that support terrorism.
Patriotism requires no apologies. Like anti-Communism and anti-Fascism, it is an admirable and thoroughly sensible a priori assumption from which to begin making more nuanced judgments. Nor does patriotism need to be exclusionary. I am an American patriot, a Jewish patriot and a New York chauvinist pig. My patriotism is not about governments and armies; it's about unions, civil rights marches and the '69 Mets. It's not Kate Smith singing "God Bless America"; it's Bruce Springsteen singing "This Land Is Your Land."
Of course, not everyone on the left concurs. While many nonpatriots share an idealistic belief in a kind of cosmopolitan, humanist internationalism, some--like Jerry Falwell and Pat Robertson on the right--really do hate this country. These leftists find nothing to admire in its magnificent Constitution; its fitful history of struggle toward greater freedom for women, minorities and other historically oppressed groups; and its values, however imperfectly or hypocritically manifested in everyday life. This became obvious in a few of the immediate reactions we heard in the wake of September 11. How could anyone say with certainty why we were attacked when we couldn't be sure who attacked us? All they could know, really, is why they thought we deserved it.
This "Hate America" left must be rejected for reasons of honor and pragmatism. It is difficult enough to "talk sense to the American people" in wartime without having to defend positions for which we have no intellectual or emotional sympathy. Many on the right are hoping to exploit a pregnant political moment to advance a host of antidemocratic policies. Principled dissent is never more necessary than when it is least welcome. American history is replete with examples of red scares, racist hysteria, political censorship and the indefensible curtailment of civil liberties that derive, in part, from excessive and abusive forms of superpatriotism. We are already seeing the beginnings of a concerted attack on civil liberties, freedom of expression and freedom of the press. Given the importance most Americans place on patriotism as a bedrock personal value, it is folly to try to enlist them in a battle that fails to embrace their most basic beliefs.
Moreover, the refusal to draw this line invites the kind of McCarthyite thuggishness we see on display in the writings of pundits like Andrew Sullivan and Michael Kelly, and in the pages of (predictably) National Review and (sadly) The New Republic, tarring anyone with a wartime question or criticism as a pro-terrorist "Fifth Column" (Sullivan's term). Casting as wide a net as possible for their poisonous attacks, they choose examples so tiny as to be virtually nonexistent. In defense of his slander of the people of New York as well as virtually everyone else who voted against George Bush in the "red" areas of the nation, Sullivan pointed to an obscure website based in Denmark run by something called United Peoples. To smear opponents of unfettered free trade and globalization, TNR editor Peter Beinart seized on a bunch of anonymous postings to another, no less obscure, website whose name I cannot even remember. Kelly has now devoted two Washington Post columns to attacking all pacifists as "evil," "objectively pro-terrorist" and "Liars. Frauds. Hypocrites." But in neither column could he find the space--or the courage--to name a single one.
Because none of these writers have yet developed the reputation for malevolent hysteria enjoyed by, say, Marty Peretz on Israel or David Horowitz and Ann Coulter on everything, there is a serious chance that the larger mass media, never good at making distinctions on the left in the best of times, will swallow and repeat their reprehensible assertions. The net result will be the exclusion of all progressives, America-hating or no, from the spectrum of "responsible" debate where decisions are made and the nation's future is determined.
The potential for politically motivated official censorship--beyond that which is genuinely necessary to protect the safety and security of our troops--is never far away in wartime. Politicians and generals quite understandably find the temptation to abuse this power irresistible. We saw countless such examples during the Gulf War, and we can discern hints of future threats from the lips of presidential press secretary Ari Fleischer, who endorsed the attempts of a few Madison Avenue mullahs to withdraw advertising from ABC's moronic talk show Politically Incorrect when host Bill Maher used the word "cowards" regarding the US military's use of cruise missiles. Fleischer warned, "There are reminders to all Americans that they need to watch what they say, watch what they do, and this is not a time for remarks like that; there never is." (Speaking of cowardice, the White House edited Fleischer's remarks in its official transcript of the exchange.)
Yet another wartime peril to democracy derives from hyper-caution and self-censorship on the part of the media themselves. Why are newspapers like Newsday and the Daily News censoring comics who raise even the gentlest questions about George Bush? Exactly whom does the communications conglomerate Clear Channel imagine it is defending when it instructs its deejays not to play "Bridge Over Troubled Water" or "Ticket to Ride" on the radio? Why do newspaper publishers in Grants Pass, Oregon, and Galveston County, Texas, feel the need to fire writers and editors who wondered why the President "skedaddled" into a "Nebraska hole" on the day of the attack? Most disturbing of all, why has the consortium of national news organizations decided to postpone, apparently indefinitely, the news of who really won the Florida election last winter? The estimated publication date for the collective effort, overseen by the University of Chicago's National Opinion Research Center and costing more than $1 million, had been September 17. But New York Times political reporter Richard Berke wrote that the now "utterly irrelevant" report "might have stoked the partisan tensions."
In other words, the threat of "partisan tensions" arising from a potentially stolen election is more dangerous than continuing to live the lie. How wise of our media minders to decide that America needs to be protected--not from terrorists, but from truth.
The new war on terror isn't going to be of much use in combating the present plunge in America's well-being. Well before the twin towers fell to earth the country was entering a fierce decline, and it is assuredly going to get worse. The fall in growth and investment from early 2000 to early 2001 was the fastest since 1945, from 5 percent growth to zero. So fast, indeed, that people are only now catching on to the extent of the bad numbers, and battening down the hatches as bankruptcies begin to rise.
How did we get from the Merrie Then to the Dismal Now? The bubble in stock prices in those last five years sparked an investment boom, as corporations found mountains of cash available, either from the sale of overvalued stocks or by borrowing money from the banks against the high asset value of those same stocks. And as the Lewinsky years frolicked gaily by, there was a simultaneous consumption boom as the richest fifth of the citizenry--the Delta Force of national consumer spending--saved a lot less and spent a lot more.
The shadows were there for those who cared to look for them. In 1998, 1999 and 2000, when the boom was reaching historic proportions, when annual borrowing by US corporations had reached a historic peak as a percentage of GDP, when Fed Chairman Alan Greenspan was vaunting the power of markets, the rate of profits was falling in the nonfinancial corporate sector, significantly so in manufacturing.
The bubble was due to burst. Now, with the market going down, corporations have less money, can borrow less and invest less. Consumers have less to spend and have begun to lose their appetite anyway. Down go the rates of investment and consumption, and the amount of government debt that the Bush Administration can muster as a Keynesian stimulus will be more than offset by a decline in private debt, as people turn prudent and ratchet up their savings.
But the problems go deeper. The corporate investment boom of the late 1990s took place against a backdrop of falling profitability. Who builds new plants when the bottom line is turning sour year by year? Answer: US corporations in the late 1990s. Investment bore no relation to the rate of return, hence the amassing of overcapacity on a herculean scale. Between 1995 and 2000 retail store space grew five times faster than the population. Earlier this year, Business Week reckoned that only 2.5 percent of communications capacity was being used.
The most notorious sector was telecommunications, where borrowing was vast and stocks insanely inflated, with analysts boiling up ever more ludicrous ways of claiming profitability for their favored stocks. The degree to which stocks rose above profits was greatest in technology, media and telecommunications (TMT). In this sector, the leading edge of the boom, between 1995 and 2000 the value of TMT stocks grew by 6.1 times, but their earnings by only 2.1 times.
The Organization for Economic Cooperation and Development's survey of the United States for 2000 makes for chastening reading. By that year, the final distension of the bubble, the value of Internet companies reached 8 percent of the total value of all nonfinancial corporate assets in the economy. But most of those companies made only losses. Of 242 Internet companies reviewed in the OECD study, only thirty-seven made profits in the third quarter of 1999, just before the peak of the bubble. Their price-to-earnings ratio was 190 to one; precisely two of these accounted for 60 percent of profits. The other thirty-five profitable companies traded on an average p/e ratio of 270 to one; the 205 remaining companies made losses. For 168 of the companies for which data are available, total losses in the third quarter of 1999 amounted to $12.5 billion at an annualized rate, even as their stock-market valuation reached $621 billion.
You want a definition of a bubble? That's it.
So was there really a "New Economy" emerging in the sunset of the century, as proclaimed by so many exuberant choristers? True, the economy of 1995 to 2000 did better than any five-year stretch since the early 1970s. By all standard measures, such as productivity, economic growth, wages, growth of investment, unemployment and inflation, it was a pretty good time. But as Professor Robert Brenner of UCLA, whose Boom, Bubble, Bust: The US in the World Economy is about to be published by Verso, aptly asks, "If the five years 1995 to 2000 truly saw the emergence of a New Economy, manifesting 'extraordinary performance,' as Clinton's Council of Economic Advisers put it, what are we to call the period 1948 to 1973, which excelled the recent period in every respect?" Productivity growth was about 15 percent slower in those five recent years than in the twenty-five years between 1948 and 1973.
Obit writers for the great boom of 1995-2000 usually avert their eyes from the fact that despite all the exuberance of those giddy years, in terms of growth of gross domestic product, of per capita GDP, of wages and productivity, the 1990s as a whole did worse than the 1980s, and the 1980s worse than the 1970s. In other words, the golden end of the twentieth century was a continuance of the long stagnation of the world economy that began in 1973.
For now? On the one hand, overcapacity; on the other, a drop in investment and consumption, driven first by the drop in the market, then by fear. It will be quite a while before anyone feels the need to invest, hence to borrow. Give the rich a tax cut? It won't help. They'll put it in the bank. Government investment? Yes, if it were done on an appropriately vast scale, but only public investments of a sort that Republicans have never countenanced and that vanished from the political platforms of the Democratic Party decades ago. For sure, planes and missiles for the Navy and Air Force, plus the millions in food aid dropped on Afghanistan, plus new computers for the Office of Homeland Security, aren't going to do the trick.
As I write, the world is filled with fear. I am having one of those reactions that psychologists describe as a stress response. I suppose I'm not alone, though. A friend calls and says, "You hung a flag yet? Anyone who's been to Cuba, you better hang a flag." "Cuba?" I ask, startled. "You don't mean that weeklong human rights trip seventeen years ago?"
"You poor naïve child. I'm sending you a big one. Hang it on your porch."
In the newspaper, I read of Muslims who are shaving their beards and removing their veils. I read of blacks who are embracing suspect profiling. There's an unsubstantiated rumor on the Internet of Barney Frank hugging Strom Thurmond just before he fainted.
"It's that list they'll be drawing up in the Office of Homeland Security," explains a fellow paranoid as we shop for bottled water. "Nobody wants to be on that." Then she points out the physical resemblance between Tom Ridge and J. Edgar Hoover. She believes in reincarnation. I do not, but...it really is uncanny.
Another friend calls to say she's been reading the Washington Post. "Sally Quinn's got gas masks for everyone in her family. Her doctor gave her a stock of antibiotics, enough for her and all the servants." The word "triage" begins to rise uncomfortably in my mind. Who gets to stockpile antibiotics in this new world order? If I went to a doctor for a little "extra" medication, he'd turn me in for drug dealing. If minorities suffer from unequal access to medical treatment now, what happens when panicked hordes make a run on hospitals for limited supplies of anthrax vaccine?
Not that any of this will do any good anyway, I suppose. My mother reminds me of the bomb shelters that sprang up during the 1950s. "I worried too," she said. "But you can't control this sort of thing on an individual level. Will you never go to the beach for fear of being too far from the shelter? Will you never take off the gas mask for fear of smelling the roses?" A friend of mine who's a psychologist says that it is precisely the terrifying lack of control that is sending so many people over the edge. She says that lots of fragile sorts have been showing up at Bellevue to apologize for having driven a plane into the World Trade Center. The less fragile ones have been busy actually hijacking Greyhound buses and rushing into cockpits in states of extreme agitation.
On the news, crusty old senators disclose that they have participated in various government war games, in which they role-played all sides of the conflict in the event of hypothetical disasters. The crusty old senators worry me; they move stiffly and are so relentlessly formal that they refer to themselves in the third person, like Bob Dole. I suspect them of playing these games in the groves of the Bohemian Club, with the expectation that whatever happens they will retire to the bar for whisky sours afterward. All this is a too glib way of saying that I simply don't see them coming up with quite the same strategies and outcomes that Al Qaeda might.
I think that if the Pentagon really wants to role-play doomsday scenarios here at home, they need to lock Jerry Falwell in a small room with Elián González's Miami relatives, G. Gordon Liddy, Louis Farrakhan, Jack Kevorkian, Charlton Heston, Al Sharpton, Kenneth Starr and a horde of neglected, riot-prone, inner-city kids who under the circumstances feel as though they have nothing left to lose. We get O.J. Simpson to keep a body count and Larry King to report what's happening. We give them $43 million worth of weaponry (the sum George Bush, as recently as August, thought would be a nice amount to send to the Taliban), an airdropped bundle of peanut butter sandwiches and ten minutes to reproduce Afghanistan's religiopolitical structure. Does anyone seriously doubt that this much of an experiment would end up destabilizing all of human history?
"People are screaming through the cracks," says a colleague. I had never heard that expression before. She says it means that people are too scared to say what they mean when you ask them to speak on the record. "But if you ride the buses, talk to truck drivers, go to church, hang out with teenagers in the pool hall, they're terrified of this war. No one knows why all this is happening."
It is true that everyone has a different conspiracy theory of this war. When I first heard of the bombing, I thought it was retribution for Timothy McVeigh's execution and that "the terrorists" had chosen New York because it's a city of miscegenated minorities. A Jewish friend was equally certain that New York was chosen because "it's a Jewish city." A stockbroker friend finds it obvious that "they" were out to destroy world trade and global economics. Pat Robertson blames Bill Clinton. A Christian evangelical friend says that it's all about "the rapture," which is apparently that moment just short of end-time when the sanctified will be transported directly to heaven and the rest of us will perish. Maureen Dowd, Washington's favorite material girl, flips mournfully through the Neiman Marcus Christmas catalogue and concludes that it's because foreign agents don't want us to enjoy our "stuff." The White House blames "not all Muslims." And Ari Fleischer blames Bill Maher.
There's a brilliant trilogy by children's author Philip Pullman titled His Dark Materials. The tale features armored bears enlisted in the fight between good and evil--great clanking white bears who smash through enemy armies, clumsy but immense in their power. In my mind, I keep seeing those big armored bears as American warplanes bombing away, strong and accurate and deadly. But I am also visited by images of "the network" that they're fighting as more of a global spider web, very thin, fine lines of connection--tough, resilient and almost invisible. I keep worrying that armored bears aren't much use against a foe like that. The bears are entirely capable of wreaking havoc in a given spot, but the spider web is small, silent, hard to see--drawing strength from structure, not from size; from belief, not from force. And as long as we do not come to terms with the more subtle nature of that kind of adversary, I will not be able to visualize any good end in sight.
After the terrorist attacks on New York and Washington, some have asked whether the West hadn't sown the seeds of its own destruction. That's not a new idea: A hundred years ago, Robert W. Cole wrote a novel called The Struggle for Empire in which Anglo-Saxons conquer the world by destroying or absorbing all other races. Triumphant and enraptured by technology, the Anglo-Saxons invent flying machines and encounter a race of aliens called Sirians. The Anglo-Saxons bomb Sirian city after Sirian city until the aliens surrender unconditionally. In 1923, Anderson Graham's novel The Collapse of Homo Sapiens has Africans and Asians stealing atomic secrets and bombing the Anglo-Saxons back to the Stone Age.
It's suddenly difficult to dismiss such predictions from fantasy novels. The West has bombed Arab lands, not to mention African and Asian, and is doing so again. Arab militants have meted out what they may or may not see as a first strike, and will doubtless try for more, maybe next time with true weapons of mass destruction.
In A History of Bombing, published before the September 11 attacks on America, we learn that the first time airplanes were ever used for a bombing mission, they were used by Europeans to bomb Arabs. In 1911 Italian aviators dropped grenades on nomadic camps in the desert of Tripoli, in North Africa. Newspaper reports recorded the effect on the ground: "Noncombatants, young and old, were slaughtered ruthlessly, without compunction and without shame." The Italian air command reported that the bombs had "a wonderful effect on the morale of the Arabs." Seeing the effectiveness of aerial bombardment, other European powers quickly followed suit. Between 1915 and 1920 Britain bombed Arab towns and villages in Egypt, Transjordan, Iran, Iraq and Afghanistan--all this three-quarters of a century before the Gulf War. So perhaps it shouldn't be a big surprise if the morale of the Arabs has returned to haunt us.
But what does this prove? For Sven Lindqvist, the author of A History of Bombing, it suggests that white Westerners were the first to use airplanes and bombs for terrorism, and that racism was the reason they did. Lindqvist is partly wrong about this. But in a more horrifying way--which has nothing to do with racism--he's also partly right.
Most people blame the Germans for the Holocaust. Lindqvist, a Swedish scholar and journalist, first made a name for himself by pointing the finger elsewhere. In a short travelogue called Exterminate All the Brutes, penned from the Sahara Desert and published in English in 1996, Lindqvist leveled his gaze in an unexpected direction--at the British, the Americans, the Spaniards, the Belgians. Lindqvist noted that these countries were the first colonial expansionists, the first to invent the idea of inferior races. They condoned genocide in Africa and the Americas well before Auschwitz. The Germans didn't have much in the way of colonies abroad. When the Germans started exterminating a local gypsy tribe called the Jews to make room for German expansion, they were just trying to keep up with their European neighbors, at least according to Lindqvist.
In A History of Bombing, Lindqvist makes a related argument about warfare waged from the air. He wants to show that bombing evolved because of racism. When white people realized they could drop explosives from planes, they also realized it was an inhumane thing to do--not fit for other white people. They started bombing anyway, because dark-skinned natives in the colonies were considered less than human. But then, as white people became inured to bombing, they unleashed this brutality on themselves as well.
Lindqvist also argues that bombing natives in the colonies wasn't really about achieving military objectives. It was mostly about terror and extermination, about killing African, Arab and Asian men, women and children where they lived in camps, villages and towns. To bomb was to bomb civilians. It was the white man's savagery, at first reserved for savages.
Lindqvist is on to something. Fantasy novels at the turn of the century predicted the extermination of Asians and Africans with bombs, and the inhabitation of their lands by whites. Like The Struggle for Empire and The Collapse of Homo Sapiens, J. Hamilton Sedberry's 1908 novel Under the Flag of the Cross has a haunting prescience. Here the weapons that white men drop from their flying machines are "electrobombs," which unleash the forces of raw matter. And the enemy isn't an imagined alien race but a real one: yellow people. Here, the atomic bombing of Japan has already happened, before bombing was even invented.
Science fiction writers soon foresaw an even more ghastly future: The combination of racism and bombing might come to haunt Europeans at home. If the terrorists who carried out the airliner attacks against America last month saw their actions partly as revenge for the US bombing of Arab lands in the Gulf War--a supposition hardly farfetched--H.G. Wells predicted something similar after the European bombing of North Africa ninety years ago. Lindqvist notes that in Wells's 1914 novel The World Set Free, a nuclear war starts in Europe, but the pilot who launches the first atomic bomb against whites is dark-skinned, with "negroid" features. In 1926, Irish author Shaw Desmond's story "Ragnarok" depicted France arming a genocidal army of Africans and dispatching them to murder the citizens of London. To stop the savages, the British bomb themselves. As Lindqvist paints it, the creeping recognition in fantasy literature that the white man had, by bombing, become more savage than the savages brought a horrifying realization: When whites began bombing themselves, it was in a suicidal effort to obliterate their own savagery.
By Lindqvist's accounting this didn't take long, thanks to the likes of Arthur "Bomber" Harris, who was a British squadron chief in Iraq in the 1920s. Keen to prove the worth of the fledgling Royal Air Force (RAF), Harris built bigger bombers and perfected the targeting of Iraqi dwellings. His reports of massacring natives were so graphic that they were censored back in London, but his enthusiasm for bombing gave it an aura of efficacy. Harris is a pivotal character in A History of Bombing, because he embodies the psychological and moral turn from civilization to savagery that Lindqvist is trying to track.
Harris had some help, too. In the 1930s, inspired by his successes in Iraq, British military theoreticians finessed the ignominious fact that bombing was, in practice, a terror weapon by inventing euphemisms like "civilian dislocation" and "morale effect"--sadly reminiscent of the "wonderful" effect on morale reported by the Italian bombers of Tripoli. In theory, these new military techniques would be more humane because wars could be won more quickly (the same argument applied at Hiroshima and Nagasaki). In reality, it meant that where in the past soldiers died to protect women and children, now women and children would die to protect soldiers.
By 1940 the RAF was ready to attack civilians across the Channel--Britain was bringing the cruelty it had learned in the colonies home to Europe. The speed with which ethical considerations were sidelined is staggering. In May, Churchill authorized the RAF to hit German military targets. By June "military targets" included the neighborhoods where industrial workers lived. Hitler retaliated with the blitz on English cities in September, and by Halloween the RAF had orders to firebomb more than twenty German cities. In early 1942, the "morale effect" was declared official RAF policy for winning the war, and Arthur "Bomber" Harris was named head of British Bomber Command. Soon Harris would orchestrate the death of nearly 100,000 of his fellow Europeans--many of them women, children and the elderly--in the firebomb attacks on Hamburg and Dresden.
When the United States entered the war its air forces were committed to precision bombing. Lindqvist writes that Americans preferred military objectives and wouldn't target civilians, unlike Harris. That is, until the US commander Curtis LeMay arrived in Europe, just in time to witness the firebombing of Hamburg. LeMay studied Harris's bombing techniques and was then transferred to the Pacific. For Lindqvist the connection is clear: Within weeks, LeMay began firebombing Japanese cities, including the massive firestorm attack on Tokyo that killed 100,000 civilians. After that, Hiroshima and Nagasaki were easy.
By the early 1950s, LeMay had full autonomy to direct an atomic strike force that could kill tens of millions of people in minutes. By 1970, LeMay's legacy was a nuclear arsenal that could destroy the entire population of the earth--700 times. Apparently, bombing still wasn't about achieving military objectives directly: It was about terror and extermination. But LeMay had at least achieved a certain equality of the races. He could kill them all.
To Lindqvist's credit, he's trying to illustrate what should be one of the less subtle truths of racism: that racists hurt themselves as much as they hurt the objects of their prejudice. But there is a big problem with Lindqvist's book: His argument may have it backward. History suggests that white people first developed aerial bombing in earnest as a weapon against other whites, not against their dark-skinned colonial subjects. Only after Europeans conceived of terror-bombing each other and enacted this horror during World War I did they unleash it in scale elsewhere.
In the wake of the terrorist actions against New York and Washington, the United States has suddenly recognized its vulnerability to attack. The message has been conveyed with devastating effect using airplanes. The similarity to Britain on the eve of World War I is uncanny. Lulled into complacency by the protection of its surrounding seas, Britain was jolted to panic by the arrival of planes. And contrary to the story Lindqvist tells, the fearful savages that occupied the English mind weren't off in the far reaches of the empire. They were the Germans.
As early as 1891, the military inventor Sir Hiram Maxim had warned of Britain's susceptibility to airborne assault from the Continent, and took it upon himself to construct a steam-powered aircraft to counter the threat. Maxim showcased his unwieldy machine in 1893 for a crowd of political and literary luminaries, including H.G. Wells. The plane managed only a few hops, but within months a novel called What's the World Coming To? featured Maxim's work to illustrate how a European enemy could use airplanes for a devastating attack. The book, with its theme of whites bombing whites, comes too early to fit neatly into Lindqvist's narrative, as did two novels published in 1895: George Griffith's The Outlaws of the Air and E. Douglas Fawcett's Hartmann the Anarchist. Both depict the unthinkable: destruction of London by air attack, and the gory deaths of English women and children on home soil.
A Brazilian aviator in France named Alberto Santos-Dumont built and flew Europe's first true airplane in 1906. His flight lasted all of twenty-one seconds, but it shocked Britain's great newspaper magnate Lord Northcliffe into action. Realizing that Britain wasn't prepared to deter a German air assault, Northcliffe launched a propaganda offensive for the establishment of a British air force. At least one of his papers, the Daily Mail, whipped up ethnic hatred for the despicable "Huns" across the Channel. By the time H.G. Wells's novel The War in the Air appeared in 1908, the stage was already set for his depiction of a massive air battle between England and Germany. Much of the fantasy literature in Britain prior to the outbreak of hostilities had a similar theme, intended to galvanize the nation for a bombing war with the Kaiser and his villainous henchmen. Certainly by 1912, German strategists considered terror bombing to be a legitimate technique of war, and had developed plans to attack British civilians.
Lindqvist writes that militarily, air power in World War I was indecisive and not terribly lethal. Fair enough, but he underreports the psychological and strategic milestones passed in the European theater then. Within the war's first weeks, Germany bombed Paris, and Britain bombed German zeppelin hangars. Soon the Austrians and Italians were trading bombing sorties against each other's civilians, and before the war was over, bombs would strike every European population center save Rome. The first attack on England, by a German zeppelin, came in 1915. The English were outraged and called for retribution.
By 1918, Hugh Trenchard, Britain's first air commander, had devised a new theory for winning the war. Trenchard reasoned that by bombing German workers directly, he could destroy Germany's economic and psychological capacity to wage war. Lindqvist gives credit for this method primarily to Harris, who perfected it in World War II after pioneering it on dark-skinned natives in the Middle East in the 1920s. But as historian of British air power Michael Paris points out, the idea had been home-grown in Britain by Trenchard two decades earlier: Not until the British had already bombed Germany did they institute aerial attacks in the Middle East and Africa.
The war ended before Trenchard could massacre enough German civilians to test his theory fully. But when Churchill cut military budgets back to peacetime levels, Trenchard chanced upon a new mission. Far cheaper than sending ground troops into the colonies, Trenchard's airplanes became the perfect long-distance imperial police force; that's when bombing in the colonies really got under way. Even Lindqvist admits that this had as much to do with economics as race. Before learning to fly, Trenchard cut his teeth as a soldier in Africa in the 1890s. Like many whites of that era, in the colonies he may have inured himself to a new brutality born of racism. But that wasn't caused by dropping bombs from planes.
The United States started bombing civilians during World War II, but this was also less clearly because of racism, or "Bomber" Harris's legacy, than Lindqvist lets on. The US commitment to precision bombing before the war was a gloss, thoughtlessly conceived to make the Air Corps's emphasis on strategic bombing palatable. The corps spent the interwar years striving to become an independent service branch, and this meant advocating long-range bombing as a new way to win wars. Implicit in that was a recognition that if attacking military targets failed to achieve victory, the enemy's civilian "morale" would come under attack. Even as early as 1926, US bombing doctrine defined itself as "a method of imposing will by terrorizing the whole population."
American commanders had yet to witness Harris's harsh techniques, but apparently US Air Corps Gen. Henry "Hap" Arnold didn't need much help. On the eve of war in 1941, Arnold asserted that bombing would reduce large cities to the point of surrender, and that the "will of a whole nation" was now an objective. Similarly, the possibility of incendiary attacks against Japanese cities--tinderboxes of paper and wood--had intrigued American strategists for decades before Dresden. While an element of racism probably made such prospects easier to contemplate, plans to firebomb Japan long predated Harris's heroics in Europe.
The first half of the twentieth century marked a terrible shift in the West's methods of waging war. Lindqvist's book is about this moral collapse. But consumed by the particular evils of racism, Lindqvist passes over what was the more obvious reason, a reason just as evil. World War I convinced military commanders, after an unconscionable waste of human life, that conventional ground and naval warfare was no longer winnable. But they failed to draw the obvious lesson, which would have been to stop waging war. Instead, they settled on a solution that would entail the killing of hundreds of thousands of innocent civilians in the half-century to come. "The only hope for restoring decisiveness to war," the historian Michael Sherry has written of the strategists of the interwar years, "was to cease battering at the enemy's strongest point, the surface forces now developed to defensive perfection, and attack the enemy's will behind the lines."
Few of the men responsible saw this as immoral at the time. But Europe's first airplane pilot, Santos-Dumont, didn't take it so lightly. In 1928 he returned to Brazil and soon after committed suicide.
For bombing to be anything but immoral, the distinction between military bombing and terror bombing has to be clear. A History of Bombing reveals that it has rarely been so, and it's hard to fault Lindqvist here. Perhaps the best evidence of the bankruptcy of bombing is that it strengthens the enemy as often as it weakens it. Lindqvist describes how, in the colonies, whites enthusiastically took up their bombardments again after World War II. Colonized populations won their independence anyway, at horrific cost but with an ironclad determination inspired by the wrongs being committed against them. As for Hiroshima and Nagasaki, many Americans remain convinced that the atomic bombs are what ended the war in the Pacific despite historical assessments to the contrary, and Lindqvist's book is unlikely to change that. Still, A History of Bombing is a profound litany of what might someday be considered among the most counterproductive military actions ever taken. Future generations may even classify them as war crimes.
But for the moment there is a new theory: "smart" bombing. As the United States again deploys B-52s and fighter-bombers in the Middle East, citizens the world over are calling for it not to kill innocent Afghans, to target only terrorists and the Taliban. The notion that this is even possible was born of the 1990s campaigns against Iraq and Yugoslavia. In these conflicts the US Air Force, armed with new technology, revived the paradigm of precision bombing that it had flirted with before World War II. Now, as then, the idea is still something of a gloss on the fact that civilians do die in war--witness new euphemisms like "collateral damage." And yet, though the bombing campaigns against Iraq and Yugoslavia were hardly bloodless, they weren't really wars in the old-fashioned sense either. In both cases, the bombing targeted neither a nation nor a people but rather the infrastructure of a particular political and economic establishment.
This is a new twist, so it's a pity that A History of Bombing includes but a few paragraphs on the Gulf War, and nothing on NATO's attacks against Serbia. If Lindqvist had been able to comment on these campaigns, he might also have noticed something familiar: They revived the colonial era. Of the Gulf War, military analyst Jeffrey Record has written that "it was a modern-day equivalent of a nineteenth-century colonial conflict." And in the war against Slobodan Milosevic, the West's demands at Rambouillet were, in journalist Michael Parenti's words, "tantamount to outright colonial domination" of Yugoslavia.
Indeed, Saddam Hussein's resilience seems to suggest a revelation: Even the success of precision bombing may prove pointless without old-fashioned colonial occupation on the ground. By contrast, what is to be learned from Yugoslavia? Maybe that the success of precision bombing doesn't even matter, as long as the enemy's territory has already been transformed into a de facto Western colony. Serbian Army forces emerged relatively unscathed, and were never a primary target in the first place.
Now, as the United States and Britain bomb again, even the air commanders may come to admit that precision bombing against Afghanistan is not especially useful, because there is so little left to bomb. Starting in 1920, when Arthur Harris dropped a twenty-pound bomb directly on the palace of the Afghan king, a roster of warmongers has ensured that Afghanistan is already empty of targets. "The Taliban have an organizational structure," said one Pentagon official, "but they are not like a nation-state. It is not like going after Baghdad or Belgrade." This is true enough, and it is a good reason to question the efficacy of a large air campaign. But it misses the point. A comment by Gen. Merrill McPeak, chief of staff for the Air Force during the Gulf War, gets at the real problem. The day after the terrorist strikes against America, McPeak said, "You have to ask, 'What's the endgame?' It's not clear that a massive air attack, unleashing the dogs of hell, will result in an aftermath that's more secure." If only Arthur "Bomber" Harris could have heard such hard-won words of wisdom eighty years ago.
Herein lies the lesson Lindqvist offers. Destruction and slaughter from the air usually engender rage, not resignation, and the repercussions ramify past our powers of prediction. After the attacks against civilians in New York and the defense apparatus in Washington, Americans know this anger more than anyone. But Americans would also be naïve if they failed to recognize that their attackers may have felt the same way, and probably will again.
There is another lesson to be gleaned from Lindqvist's book: Explaining the world's ills in terms of racism can be taken too far. Though Lindqvist would surely not have intended it, A History of Bombing may lend credence to the claim that terrorism by Islamic militants is uniquely justified--a century of racist attacks by the West makes it so. But this is wrong. People of all colors and creeds have suffered from bombing; nonwhites have not been singled out. Indeed, it is a testament to the triumph over racism that people of all colors and creeds died in the hijacking attacks against the most powerful institutions in the world's most powerful country.
Much more disturbing is what the history of bombing reveals about attacks on innocent people in general. The refusal to distinguish between bombing military targets and bombing civilians is not the invention of Third World fanatics. The enlightened West bears the responsibility for this development, and for releasing an arc of airborne violence into history. On September 11, 2001, terrorists completed this grotesque circle. Before, bombs dropped from planes killed innocent civilians. Now, innocent civilians in planes were co-opted as bombs themselves. This is an occasion for epic sadness.
The marching order to "leave nothing but footprints" enlisted an infantry of green builders this season, before our collective attention turned to security. While our man from the Midland Petroleum Club (a k a George W. Bush) dissed environmental causes and dismissed global warming, before his attention, too, was turned, a growing number of land-shapers and place-makers began to cast an ecological eye toward planning and construction. Whether labeled green building, sustainable architecture, organic architecture or what one inclusionist calls "The Whole Building," this new constituency of ecologically attuned and everyday builders has begun to consider environmental values in building inside and out--from the materials in the making, to the siting of the structure, to the energy it consumes.
"Every architect wants to build green," one would-be organic architect says longingly, listening to speakers at a conference on "Building Energy 2001." Sponsored by the Northeast Sustainable Energy Association (NESEA) at Tufts University, the assembly was one of two events pulling in record numbers of builders looking to tread more lightly on the land. The second, "Sustainable Communities by Design," the Southface Energy Institute's annual Greenprints meeting in Atlanta, likewise drew green-minded "carpenters"--sick-building doctors, clean-air experts, developers, engineers and construction firms--as well as professional architects, landscape architects and planners with a growing green agenda.
"What is starting to change a heretofore esoteric or niche market to make it more viable?" Peter Yost asks rhetorically. "People are starting to make a value connection between health, sustainability and the environment," says Yost, senior editor of Environmental Building News. The biological impact of building has begun to enter their calculations, in other words.
Beyond these green gatherings and sentiments, or perhaps because of them, the political advocacy for environmental building legislation has also advanced. From state to state, activists are backing tax-credit legislation for conservation measures in building. Some have secured them in New York and Maryland, and others are sponsoring or organizing them in Pennsylvania, Massachusetts, New Jersey and Rhode Island. Executive orders have issued forth for renewable energy in Chicago and for cleaner buildings in cities (Seattle), states (California) and even the federal government, at the State Department.
Campuses have also become new centers, particularly universities like Tufts and Oberlin, with the latter trying to create an environmental department building that will be climate-neutral, a net nonconsumer of energy, says David Orr, one of its generators. Along with the labors, inevitably, come the publications to instruct them in the nitty-gritty of the new art and the books of broader import on the impacted planet that impel them to the task.
In some ways, the new thinking is not startling, even in terms of self-interest. With fears of rolling blackouts, California dreaming for a time became the national nightmare, power-miser bulbs the new lingua franca. Concern about global warming, fed by the greenhouse gases that heat the planet, is shifting attention to alternatives in both power and production. Americans, chief among the world's consumers (and contributors of one-fourth of the planet's emissions), have begun to calculate the pennies and problems escaping through their single-glazed windows, their under-insulated attics, unwrapped water heaters and oversubsidized superhighways and sprawling buildings.
And, as the bills and hazards mount, the E and R ratings that describe the eco-efficiency of building materials become buzzwords for green-builders and -consumers. The urge for energy conservation accelerates--from so-called daylighting designs that admit natural light to homes turned southward, to renewable energy from harnessing the sun and wind. Turning down the thermostat and buying low-energy appliances moves up the household list, if not in the management of offices whose lights burn bright in the midnight sky of corporate apathy.
The statistics brought out by the pollsters show the idealistic as well as the financial side of this search for greener architecture. The numbers reveal that most Americans--70 percent, in fact--have green goals. For all the McMansions, bulging SUVs and sprawling big-box chain stores (a new Home Depot or Wal-Mart opens every other day, says activist Al Norman), many would like to move from an egocentric planet to an ecocentric one. Whether these green wannabes will correlate their way of life to these concerns is less clear. Statistics also show that more than half would decline to pay a quarter more at the gas pump to help the environment. And who, after all, is permitting and purchasing those 900,000-odd new homes a year, mostly promoting sprawl, indifferent to good land-use planning?
Still, if the medical axiom of "first do no harm" has yet to supplement the architectural credo of firmitas, utilitas, venustas--firmness, commodity and delight--it has become a new imperative. Whether for love or new money, environmentalists and eco-purveyors alike have become earth-conscious in their building. Even in high-style architecture, there are signs of a new concern for ecologically sound alternatives. Last winter International Design Magazine recognized eighteen architects for their ecological design, and the American Institute of Architects (AIA) has begun to pay more attention to the subject as well.
The AIA's honor award winner this year, the Condé Nast Building at 4 Times Square, for instance (labeled "Manhattan's green giant" by one design publication), was lauded for its elements of new thinking and constructing. Designed by architects Fox and Fowle, the structure was touted by promoters for its built-in photovoltaic system to capture the sun; air delivery that sets new standards for interior air quality; and clean construction processes and materials. But the question still arises: Are the eco-statements affixed to this rather glitzy hodgepodge significant, or is the work merely building skin-deep? At the least, at forty-eight stories, this mother of all green skyscrapers carries rather conflicting messages to those bred on the last generation's Small is Beautiful.
The AIA also gave out Earth Day prizes to earth-conscious architects. Among those who awarded the prizes was Joyce Lee, who sits on New York City's Green Building task force and works on a $150 million program to advance energy efficiency in urban facilities. Another was William Reed, an architect at Natural Logic who tries to identify and regenerate natural systems like the rich and fecund landscape that covered the Arizona desert before cattle-raising came along. Contradiction can reign, though: The client list of committee chair Sandra Mendler mixes environmental agents like the Environmental Protection Agency and Nature Conservancy with the Pentagon (green guns?).
The question arises of whether lip service is louder than public service. Tensions between serving the client, serving the building and serving the art abound. In praising architect Richard Meier's high-style glass courthouse for Phoenix as an energy-efficient showstopper, it is inevitably the last word--showstopper--i.e., the glamour aspect, that garners magazine space. "Green" scarcely heads the design alphabet for top-drawer architects like Frank Gehry, whose new building for MIT bears an ecological badge secured by hands other than the ace architect's.
It is an old story, of course. Frank Lloyd Wright embodied the mix. On the one hand, there was his organic architecture that settled into the lay of the land; on the other, there was his doctrine of Broadacre City, the model for the highway-based developments spread across today's landscape. The work of architects displays a "culpable image," writes James Wines in Green Architecture. An imaginative creator himself, Wines provides a literate text and handsome images showing centuries-old, indigenous ecological structures, plus more recent ones.
There are veteran greeners--Malcolm Wells, still creating burrow-down buildings and books on Cape Cod, architect Emilio Ambasz and the radical 1960s Jersey Devil--who remind us of an earlier greening, spurred by the Arab oil embargo, that brought Jimmy Carter's solar panels to the White House roof. Many of these eco-forebears worked with nature through folk habit; others adopted eco-principles through political necessity during that energy shortfall. Even if the hopes and urgencies of these early greeners were smashed when Ronald Reagan dismantled the sun-capturing symbol, writes Wines, many of the buildings and techniques in today's environmental kit of parts have begun to try to balance green science and green art.
Should architecture be ecological oatmeal or eye candy, then? Both, of course, though clearly, the standard-bearers--the architecture magazines and building texts--prefer neon to whole-earth tints. The key professional journals, Architecture and Architectural Record, mostly back off from the environmental edges. Rare is the design firm that badgers them to do otherwise. Who wants to see the nuts and bolts behind the glamour shots? Who wants to read of the workings of ventilation when beauty beckons? Of access to public transport when aesthetic transport lures?
Architects discussing the use of materials in an article in Boston Architecture earlier this year simply skipped the subject of their sustainability. Not a word on whether the materials were healthy, long-lasting or kind to those who use them, make them and live on the planet they inhabit. Likewise, Architectural Record's "Material Affairs" interview with the talented Tod Williams and Billie Tsien made absolutely no reference to the materials' ecological content or impact. More happily, of course, Landscape Architecture and landscape architects look to the earth by definition, and at least have a mandate to hold the high ground as stewards. For their pains, the profession of would-be earth guardians remains isolated, only a whit more powerful than it was in less ecological times.
Despite the eco-talk, then, the gap between high architecture and down-to-earth environmentalism remains; the language of material mastery still reflects the dominion-over-nature of an industrial age that deposits its wastes in land, sea and sky. For every Steve Strong, a longtime solar architect creating sun-holding structures, there are hundreds of energy profligates. For every William McDonough, the silver-tongued architect who uses his lyricism to lure the initiated and influence the apostate, there are developers who think capturing rainwater atop buildings with a habitat there for native species is strictly for the birds (and, more reasonably, those who remain skeptical of his link to the likes of Ford Motor Company).
So it is that even as the devoted talk of the off-gassing of toxic materials, pernicious VOCs (volatile organic compounds) in paints and earth-wracking construction, the design field mostly ignores the 'e' word in talk of byproducts. Whether or not it's from fear that the poisonous possibilities might divert clients from the form of their work, the old notion of life-cycle costing--not just first-cost input--still struggles for a nod of recognition.
That nod comes mostly from the environmental edges, where eco-minded, hands-on builders have begun to ask the right questions. In industry parlance: Are the materials that we build with safe from cradle to gate (in the making); gate to grave (in the using) and even beyond, when they head back to waste, or heal, the earth?
Fortunately, a movement to define standards for those labors has in fact evolved to separate greenwash from green building. That rating method, created by the US Green Building Council, goes by the name LEED (Leadership in Energy and Environmental Design). Its aim is to create a common base to calculate environmental attainments--or "bragging rights," in the words of Rob Watson, co-chairman of the Natural Resources Defense Council. The grades on environmental correctness run from certified to bronze, silver, gold and platinum (the top).
First and primary assessment goes to whether the site is sustainable, a good place to build: Is it dense and dirty or pristine and green? Second, is it water-efficient: Does the project save or recycle water, inside and out? The third assessment is of energy: Is the power source renewable, energy-efficient, a non-ozone-depleting gas? The fourth deals with materials: Are they made locally with low or no transport costs? Are they recycled, renewed and not wastefully applied? The fifth point addresses indoor environmental quality: Does the stuff on the chairs and floors and walls emit toxins? Do workers get daylighting, good ventilation and thermal comfort?
There is, says Watson, an inherent biophilia, a human need to have contact with the natural world. Documenting whether design accomplishes it reinforces the moral imperative that betters architecture. If it's not green design, it's not good design, he says. Where matters as much as what to sustainable architecture; Watson, like many others, thinks the place category should get more points as a source of good design. Clean brownfields instead of consuming greenfields; in-fill the wasted inner cities, not the exurbs; build a walkable world, not a paved one. "How can you build a cathedral if you're destroying the cathedral of nature?" he asks.
Beyond doing nitty-gritty tallies, some builder-ecologists preach basic science and the laws of thermodynamics. If science tells us that substances never disappear, then we must stop producing toxic ones, clamor organizations like The Natural Step. In daylong sessions its directors show would-be green designers and builders how to build sustainably--to produce only what can be broken down and integrated back into the cycles of nature, or deposited into the earth's crust and turned back into nature's building blocks. (Originated by a Swedish doctor and a physicist and brought to American business by Paul Hawken, the Natural Step process, which attacks today's buy/build/dump development, has become the instrument for those who would move beyond the market-first prototype of take/make/waste construction.)
The HOK Guidebook to Sustainable Design, by Sandra Mendler and William Odell, is a useful collection of projects trying to follow those standards and make buildings friendly to the biosphere. Though pallid in presentation and language, the book's projects chart the intricate, even excruciating, steps necessary to do the job right--using, say, formaldehyde-free biocomposite made from soybeans and recycled newspaper, as was done at the World Resource Institute building.
Despite the engaging and hard-wrought steps to tread softly, ambiguities bounce off these pages, too. For openers, the sponsorship by HOK, an architecture and engineering firm more noted for mega-sports stadiums than for acts of redeeming social value, sends an ambivalent message. Pages on buildings for Monsanto, the agents of genetically modified crops, and McDonald's, those slicer-dicers of tons of beef (not to mention neighborhoods), can hardly be called models of whole earth enlightenment.
Light-years away from the conglomerate carriage trade, the meticulously green Environmental Building News is the beige bible of clean construction. A good-housekeeping handbook of green materials and methods, the advertisement-free publication was launched in 1992 by Alex Wilson, executive editor. Part engineers, part handymen, the publication's trio of writer/editor/researchers offer not only the nuts and bolts of building (ceiling fans, insulation) but also their lively personal travails in trying out green materials at home.

Whatever the charm of the hands-on high jinks shown at the conference on sustainability, this remains a movement in its infancy in America. As evidenced in "Ten Shades of Green," the traveling exhibition dispatched by the Architectural League of New York last year, Americans would do better to look across the Atlantic for a mature movement. On the Continent, small spaces and diminutive energy budgets have defined design for decades, and the environmental work and artistry are correspondingly decades ahead in both science and design. From recirculating treated waste water, to harnessing the sun, to using nontoxic, recyclable materials in gracious interiors and exteriors, these projects blend sustainability and structure. With more stops on the horizon (Berkeley, Salt Lake City, Newport Beach and possibly others, through next year), "Ten Shades of Green" has helped raise consciousness and inspire US labors.
In dimming the lights and creating renewable sources, European models take the lead not only in sustainable building but in renewable energy, from photovoltaics on the German autobahn to windmills in Denmark. Long on social concerns but short on resource consumption, the Great Glasshouse in the National Botanic Garden of Wales stands as an exemplar. The project, by Sir Norman Foster, reinvents the glasshouse for the twenty-first century. In a mix of good design and whole-earth futurism, it features a thousand-plus plant species, and a computer-controlled light and biomass heating system that harvests energy from the plants on its 568-acre park.
Such ventures make US designers nostalgic for postwar America, when these shores attracted and dominated design. Architect Peter Land at the Illinois Institute of Technology, who immigrated to this country, speaks wistfully of the days when it was morning in America--when Europe's finest flooded to these shores: Mies at Land's IIT, Gropius at Harvard, José Luis Sert and others bringing fresh currents. A half-century later, the course of architecture's biological and aesthetic destiny runs otherwise.
In the end, those who look to architects to control waste, reduce fossil-fuel consumption and build in a benign fashion might better look to themselves. In America's port of consumption, the optimists in a skidding market cheerfully broadcast that spending will rebound. Consumer malaise might be a better wish. But whichever is the case, as Eric Davidson tells this 5 percent of the world's population in denial, You Can't Eat GNP.
Davidson, a scientist who helps battle global warming at the Woods Hole Research Center, has written a slim but readable and persuasive primer on the economic folly of environmental waste. It should be memorized for LEED certification by a nation whose energy consumption is projected to grow by 32 percent over the next twenty years and that would do well to reckon Economics as if Ecology Mattered, as the book's subtitle puts it. A platinum designation should require a quiz on this work, and supplement it with J.R. McNeill's eloquent and compelling Something New Under the Sun. McNeill's historic overview of the immense alterations of the natural landscape is a captivating and engagingly written tale, all the more so for its chilling recounting of the human-made trauma we are still a-building but could, in his view, reverse.
The notion of simple living, designed to start a revolution of appropriate technology, is further from us than Walden now, even a full generation after Stewart Brand's Whole Earth Catalog. As we drink our shade-grown, bird-friendly coffee and recycle our trash, the tear-downs fall for McMansions and sprawling subdivisions in an architecture of excess, consuming 70,000-90,000 acres of wetland and destroying 1.2 million acres of farmland a year. In America, of course, sprawl is the law of the land, notes Henry Richmond, a reform-seeker, in Reflections on Regionalism, edited by Bruce Katz. The latest census confirms that the land mass covered by our building is expanding at ten to fifteen times the rate of our population.
By spinning off the power to control land from the center to the surrounds, by neglecting zoning and by subsidizing the automobile to the neglect of walkable transportation and communities, we have swallowed that landscape. This book's diverse writers on business, planning and racial issues offer various solutions, from the bland to the pointed. Some think we will hang separately unless we cling together in more regional modes of planning to stop random development. Others posit that zoning, community by community, is simultaneously the problem and the solution. Both camps seek planning and good land-use practices to stop today's unsustainable free-for-all.
Books that translate that opinion into the nuts and bolts of building such spaces are scarce. Unfortunately, those that do often alternate between the clear but self-serving and the helpful but statistics-heavy. Suburban Nation: The Rise of Sprawl and the Decline of the American Dream, by Andres Duany, Elizabeth Plater-Zyberk and Jeff Speck, for example, has its points and its clarity, but the special pleading of the well-publicized New Urbanist authors is obvious. For all their yea-saying on the need for dense, transit-oriented, walkable communities, too much of their own work slots and shapes the exurban spaces whose location on the fringes destroys such principles. Like the developer co-opted projects that bear the name "smart growth," single-family sprawl subverts larger goals.
Those looking to preserve our planet from such depredations suffered a major loss this spring with the death of Ian McHarg, an ecological hero of twentieth-century landscape architecture. McHarg's Design With Nature taught two generations to plant their spades where ecology ordered, following natural systems too often ignored or obliterated in the post-Olmsted era. In a New York Times obituary, landscape architect Edmund Hollander called McHarg "an apostle for the planet," and indeed he was: the patron saint of the one profession that is required by license to speak for the earth.
By training and instinct, the first-do-no-harm injunction at least compels landscape architects to this holistic, earth-grounded work, and Sustainable Landscape Construction: A Guide to Green Building Outdoors affirms the possibility. Written by Landscape Architecture magazine editor J. William Thompson and landscape architect Kim Sorvig, this book offers comprehensive instruction to the profession. Its authors unabashedly care about environmental health (healthy sites, healing sites). They show us how to practice respect (for places, materials, water and energy costs), with no sense of the saccharine and lots of sense. This is a deep-breath, even deep-ecology, look at the world. Again, anything but a picture book, its mandates and headings are, nonetheless, a Baedeker of the specifics and generalities of environmental do-goodism: Preserve Healthy Topsoil, Save Every Possible Existing Tree. It instructs landscape architects to be healers to a nature stretched and abused by builders who, by and large, remain servants to myopic buyers.
To get a final back-to-basics view of the nuts and bolts of a society torn between greening and consuming, a visit to the Home Depot Expo is instructive. This dispenser of household goods, which managed to capture Greenprints' environmental award, offers the final commentary on America's environmental schizophrenia. And its very sight--and site--says it all. Just off the highway north of Boston, the household purveyor's vast sea of asphalt parking, sprinkled with the requisite SUVs-on-steroids, fronts the megastore's Egyptoid facade, the emporium logo rising like a Steven Spielberg billboard. Inside, the store is more stage set than Home; more Depot than resting place. The smell of wood, fake and true, and the sight of materials boasting no single origin known to nature issue forth from an enormous sea of appliances. Room after room-setting is bedecked in period style. No sleek, Corbusian Modernist house-as-a-machine-for-living here; today's mechanical servants hide behind ye olde postmodern bric-a-brac to fit the postmodern family hearth.
The green that won the prize and the greenwash that swabbed it away exist side by side: The EnergyStar tags on appliances and cabinetry boasting of the conservation of water, power and other measures of sustainability sit side by side with the swollen artifacts of excess. A $1,200 Humpty-Dumpty-adorned sink for junior rests not far from a Jacuzzi seating four, and hence requiring structural support to survive a slow descent into the cellar. Overstuffed stoves open up to hold three trays of cookies stacked top to bottom on separate shelves. Refrigerators flourish space for forty bottles of wine upright.
Alas, the green hype is not only here. As America, the giant sucking-and-spending machine, goes on to so-called "green buying," even conservation and national security are couched in consuming terms: Buy a new clean car. Purchase compact fluorescent bulbs. Build new clean electric plants. Construct new energy-efficient housing. So much for even personal virtue. Obviously nature approves, chirps an advertisement to buy a new clean Toyota Prius. Can more truly be clean?
In the end, this chorus of to-market, to-market is the downside of the environmental movement, as it is of green building. For, as supposed renewers sign on to their three "R"s--reuse, reduce, recycle--they have yet to add still one more word: reject. Reject the taking, making, wasting. Leave nothing but footprints, goes the saying. But can any footprint be really green in this age of reaping our excess?
Yes, exceptions do exist. Most eco-builders try to take the earth in for repairs. Environmentalism was becoming the cause du jour before the events of September 11, and the promise is there. Some architects speaking at the green conferences had done designs to transform a ravaged campsite into an ecologically sound ashram in upstate New York, or rehabilitate a handsome, classical McKim, Mead and White building into a daycare center in Manhattan. Greenpeace not only worked with a pollutant-free menu (certified wood, recycled carpet, good glues, flooring of recycled tires, counters of recycled yogurt containers), but chose a vintage building in Washington, DC, as a site. The National Trust for Historic Preservation coaches us to retain and restrain. The list grows. But what can stop what the World Resources Institute calls our "Fraying Web of Life"? Enough!, the Center for a New American Dream labels its periodical, questioning the basics.
Such brink-of-extinction talk wouldn't faze my Home Depot salesman, to be sure. In fact, he suggests compensating for a small, water-saving European dishwasher by buying two. Green notions are no match for the manufacturers of industrial-strength crockery and igloo-sized refrigeration. In America's bigger-faster-better-more environment, can a true ecological commitment contend--or even coexist? Can conservation--the Click and Clack mentality, as Greg Watson, director of renewable energy at the Massachusetts Technology Collaborative, calls it--compete? Snap off the lights, batten down the hatches, switch off the computers nightly. Buy less, dump less. With the ever-rising US consumption spree, will escalating environmental ethics persuade the pedal-pushers of a now-idling economy toward conservation?
Thoreau, that original ethicist of nature, asked: What's the use of a house if you haven't got a tolerable planet to put it on? A new generation of green-builders gropes for answers. Professionals design projects and clients demand them, but many on both sides of that divide now recognize a new reality beyond style, toward sustainability. The question, of course, is not simply what's the use of a house for humankind without a viable planet but whether that house, and its occupants, will even survive. In the end, the deeper issue is whether we can adopt conserving and mending--and nay-saying--in our architecture, our artifacts and our landscape, to heal this embattled earth.
For Hana Amichai
Inside a domed room photos of children's faces
turn in a candlelit dark as recorded voices
recite their names, ages and nationality.
"Ah, such beautiful faces," a woman sighs.
Yes, but faces without the prestige
of the future or the tolerance of the past.
Not one asks: Why is this happening to me?
They stare at the camera as if it were a commandment:
thou shalt not bear false witness...
Why would anyone want to take their photo,
remember what they no longer looked like?
There's no delusion in their eyes,
no recognition or longing, only
the flatness of hours without minutes,
hunger without appetite.
They understand they are no longer children,
that death is redundant, and mundane.
Expected, like a long-awaited guest
who arrives bearing the gift
of greater anticipation. Their eyes
are heavy--fear perhaps,
or the unforgiving weight
Did they understand why they were so hated?
Wonder why they were Jews?
Did God hear their prayers and write
something in one of his glistening books?
Were they of too little consequence?
What did they think of God, finally?
Dante cannot help us.
Imagination is the first child in line.
They cannot help us.
It is wrong to ask them.
Philosophy cannot help us,
nor wisdom, or time.
We look at their faces and their faces look at us.
They know we are pious.
They know we grieve.
But they also know we will soon leave.
We are not their mothers and fathers,
who also could not save them.
UNICEF AND TEXTBOOKS IN PALESTINE
Fouad Moughrabi's "Battle of the Books in Palestine" [Oct. 1] incorrectly states that UNICEF evacuated its staff from the West Bank and Gaza at the outset of the intifada one year ago. In fact, staff were not evacuated but remained on the job in order to insure UNICEF's longstanding support to Palestinian children. Only relatives of some staff members, a volunteer and a consultant at the end of her assignment were sent home. The article also gives the mistaken impression that international agencies like UNICEF have not extended any help to Palestinian children suffering psychologically as a result of this most recent period of conflict. This is not true. UNICEF promptly mobilized up to $480,000 at the outset of the current situation to assist children suffering from stress and other psychological problems, in cooperation with our Palestinian partners. We continue to do so. Our most recent effort is to help others working to help children reach a consensus on practices and ethics for this important work. Rather than being left to "cope on their own," as Moughrabi states, children in the West Bank and Gaza can continue to rely on UNICEF's support during this difficult period.
UNICEF West Bank and Gaza
Professor Fouad Moughrabi's article is, in fact, a reprint of a piece to which the Center for Monitoring the Impact of Peace (CMIP) has already responded. We invited Professor Moughrabi to "openly and honestly examine and discuss the content of the textbooks themselves." To date we have received no response from him.
Moughrabi is disturbed that CMIP has placed the issue of the educational policies inherent in Palestinian and Israeli school texts on the public agenda. Textbooks are not simply another educational device but a clear expression of what governments instill in the minds of the young to further their long-term agenda.
Nothing in Moughrabi's article does anything but reinforce the conclusion that he has no real answer to the analysis put forward in the CMIP report. The objective reader is still left with the conclusion reached in the report that Palestinian textbooks incite against Jews, against Israel's very existence and coexistence with its neighbors. Certain Palestinian textbooks still depict Jews as greedy, treacherous, racist liars and thieves. They are the "enemies of the Arabs," of "the prophets and believers" and even of God. They aspire to rule and control the world, and they view the "non-Jews as pigs just fit for servicing them."
With regard to the work by Mustafa Dabbagh, CMIP would like to clarify the following points: (1) Our Country Palestine was not originally printed in 1947. In 1947 there existed a manuscript, which was lost at sea during Dabbagh's flight from Jaffa to Egypt. This volume was first printed in 1965 by the Dar al-Taliah printing house in Beirut. The same house printed Volume 2 in 1966. (2) Our Country Palestine is not "merely mentioned" in the chapter of the Palestinian Authority's sixth-grade textbook Our Beautiful Language, devoted to Mustafa Murad Dabbagh. Most of this chapter is actually a long quote from the introduction that Dabbagh wrote in 1964 for the first edition of his work. Moreover, in the eighth lesson, for example, the pupils are asked to write a detailed account of the importance of their cities or villages. The lesson suggests using Dabbagh's book to perform this task. So, the pupils have to use Dabbagh's work, which provides a detailed account of each town and village in Palestine from the archeological, historical, geographical, geological, botanical and economic point of view.
(3) Our Country Palestine was reprinted by the University Graduates Union of the province of Hebron, Volume 1 in 1973 and Volume 2 in 1985. Also, one of the copies of Our Country Palestine that was used by CMIP during its research came from the library of one of the intermediate schools of Hebron. (4) The quote "There is no alternative to destroying Israel" appears in the 1965 edition. In the 1973 edition this sentence was changed to "There is no alternative to the complete destruction of Israel." In spite of the Oslo accords, the State of Israel still does not appear on any map in any of the Palestinian textbooks or teachers' guides. One cannot find the slightest hint of recognition of the State of Israel, within its borders of 1948 or even within the framework of the 1947 UN Palestine partition plan. There is no reference to the peace process or to the content of the Oslo accords, to the mutual recognition between Palestinians and Israelis, or to their mutual commitment at Oslo to solve their conflict exclusively by negotiation. Unfortunately, certain Palestinian educational materials advocate the opposite approach--the obligation to liberate Palestine by jihad.
One can find numerous quotes propagating this indoctrination, including excerpts from Palestinian textbooks on the CMIP website (www.edume.org).
DR. YOHANAN MANOR
Center for Monitoring the Impact of Peace
Ramallah, West Bank
In October 2000 most international agencies in the West Bank and Gaza evacuated what they call their "nonessential" staff. In those grim days, UNICEF in particular was not answering our phone calls or responding to our e-mail requests. More significant, however, is the fact that UNICEF's intervention here was minimal, mostly geared to increasing general awareness about the psychological effects of trauma on Palestinian children, by placing ads in newspapers and by distributing a useful booklet designed to help teachers and parents recognize the symptoms and try to deal with them. Pierre Poupard recently sent us a letter in which he claims that UNICEF has also done a limited amount of teacher training. In other crises throughout the world, UNICEF intervened immediately by conducting large-scale screening and by devising intervention strategies, thereby gaining a wealth of experience in responding to children who suffer the effects of trauma. Excellent and useful work was carried out in neighboring Lebanon with the help of Dr. Mouna Macksoud, who translated screening measures into Arabic and adapted them to local needs. This was not done here, despite the nearly half-million dollars allocated to the task. Palestinian children and their parents continue to cope on their own.
Dr. Manor's response is consistent with a pattern of lies that permeates the entire work of CMIP. He claims that CMIP responded to me and invited me to "openly and honestly examine and discuss the content of the textbooks themselves" and that I have not answered. This is another lie. I never heard from them.
A much longer version of my Nation article is forthcoming in the Journal of Palestine Studies. It will contain even more proof, on the basis of text analysis, of CMIP lies and distortions.
If textbooks are indeed a "clear expression of what governments instill in the minds of the young," as Manor suggests, then I invite him to take a look at Israeli school textbooks, which to this day view Arabs as thieves, killers and marauders; present a map of Israel that includes the entire area from the Mediterranean to the Jordan River as the eternal Land of Israel; and describe the occupied Palestinian West Bank as Judea and Samaria, where no Arabs are said to exist. His obfuscatory remarks about Our Country Palestine notwithstanding, the fact is that CMIP deliberately misquotes and badly translates in order to force a point about a book that various scholars consider to be a classic reference work on the history of Palestine during the British Mandate.
I am more than happy to enter into an honest and open debate on the issue of Palestinian and Israeli school textbooks with knowledgeable and professional Israeli colleagues, but not with extremists whose political agenda is to show that peace and coexistence with the Palestinians are impossible.
I don't think it was fair for Katha Pollitt to object to my observation that embryonic stem cell research "rightly or wrongly summons up visions of Dr. Mengele's Auschwitz experiments" ["Subject to Debate," Aug. 20]. That's exactly how many people feel, and for those debating the issue it's important to know what motivates all sides. Nevertheless, I must say Pollitt made some good points in illuminating the inconsistencies of some of those favoring funding for embryonic stem cell research.
She correctly summarizes Orrin Hatch's position as: It's "OK to destroy a frozen embryo because the embryo is only a person if it's in a woman." Dubbing this the "location theory of personhood," she notes that by this logic, "You put the cells in the woman, it's a person, you take them out, it's not a person, you put them back in, voilà!--it's a person again. You might as well say Orrin Hatch is a person in his office but not in his car." Well put. But it would be nice if Pollitt would apply her wit and reasoning to the "convenience theory of personhood." If the mother wants it, it's called a "baby," cards are sent out, the empty bedroom is decorated with stuffed animals and a crib is installed. If the mother doesn't want it, it's called a fetus and aborted.
The Hudson Institute
Katha Pollitt suggests that antichoice women be recruited to gestate the 100,000 frozen embryo children in need of homes. But how responsible is this? If half of all fertilized eggs fail to implant, we would be condemning 50,000 embryo children to death. Really, the safest place for an embryo child is the freezer. In fact, we might require all fertilized embryos to be removed and frozen for their own safety. What responsible parent would want an embryo child to be faced with the perils of gestation and birth--and ultimate death? Besides solving the problem of death, freezing all embryo children would save money spent on education, medical care and other things that we are too prone to provide for unruly children. Frozen embryos are certainly the best behaved, least troublesome children we will ever get.