A long time ago I dated a 28-year-old man who told me the first time we
went out that he wanted to have seven children. Subsequently, I was
involved for many years with an already middle-aged man who also claimed
to be eager for fatherhood. How many children have these now-gray
gentlemen produced in a lifetime of strenuous heterosexuality? None. But
because they are men, nobody's writing books about how they blew their
lives, missed the brass ring, find life a downward spiral of serial
girlfriends and work that's lost its savor. We understand, when we think
about men, that people often say they want one thing while making
choices that over time show they care more about something else, that
circumstances get in the way of many of our wishes and that for many
"have kids" occupies a place on the to-do list somewhere after "learn Italian."
Change the sexes, though, and the same story gets a different slant.
According to Sylvia Ann Hewlett, today's 50-something women
professionals are in deep mourning because, as the old cartoon had it,
they forgot to have children--until it was too late, and too late was a
whole lot earlier than they thought. In her new book, Creating a
Life: Professional Women and the Quest for Children, Hewlett claims
she set out to record the triumphant, fulfilled lives of women in
mid-career only to find that success had come at the cost of family: Of
"ultra-achieving" women (defined as earning $100,000-plus a year), only
57 percent were married, versus 83 percent of comparable men, and only
51 percent had kids at 40, versus 81 percent among the men. Among
"high-achieving" women (at least $65,000 or $55,000 a year, depending on
age), 33 percent are childless at 40 versus 25 percent of men.
Why don't more professional women have kids? Hewlett's book nods to the
"brutal demands of ambitious careers," which are still structured
according to the life patterns of men with stay-at-home wives, and to
the distaste of many men for equal relationships with women their own
age. I doubt there's a woman over 35 who'd quarrel with that. But what's
gotten Hewlett a cover story in Time ("Babies vs. Careers: Which
Should Come First for Women Who Want Both?") and instant celebrity is
not her modest laundry list of family-friendly proposals--paid leave,
reduced hours, career breaks. It's her advice to young women: Be
"intentional" about children--spend your twenties snagging a husband,
put career on the back burner and have a baby ASAP. Otherwise, you could
end up like world-famous playwright and much-beloved woman-about-town
Wendy Wasserstein, who we are told spent some $130,000 to bear a child
as a single 48-year-old. (You could also end up like, oh I don't know,
me, who married and had a baby nature's way at 37, or like my many
successful-working-women friends who adopted as single, married or
lesbian mothers and who are doing just fine, thank you very much.)
Danielle Crittenden, move over! Hewlett calls herself a feminist, but
Creating a Life belongs on the backlash bookshelf with What
Our Mothers Didn't Tell Us, The Rules, The Surrendered
Wife, The Surrendered Single (!) and all those books warning
women that feminism--too much confidence, too much optimism, too many
choices, too much "pickiness" about men--leads to lonely nights and
empty bassinets. But are working women's chances of domestic bliss
really so bleak? If 49 percent of ultra-achieving women don't have kids,
51 percent do--what about them? Hewlett seems determined to put the
worst possible construction on working women's lives, even citing the
long-discredited 1986 Harvard-Yale study that warned that women's
chances of marrying after 40 were smaller than their odds of being killed by a
terrorist. As a mother of four who went through high-tech hell to
produce last-minute baby Emma at age 51, she sees women's lives through
the distorting lens of her own obsessive maternalism, in which nothing,
but nothing, can equal looking at the ducks with a toddler, and if you
have one child, you'll be crying at the gym because you don't have two.
For Hewlett, childlessness is always a tragic blunder, even when her
interviewees give more equivocal responses. Thus she quotes academic
Judith Friedlander calling childlessness a "creeping non-choice,"
without hearing the ambivalence expressed in that careful phrasing. Not
choosing--procrastinating, not insisting, not focusing--is often a way
of choosing, isn't it? There's no room in Hewlett's view for modest
regret, moving on or simple acceptance of childlessness, much less
indifference, relief or looking on the bright side--the feelings she
advises women to cultivate with regard to their downsized hopes for
careers or equal marriages. But Hewlett's evidence that today's
childless "high achievers" neglected their true desire is based on a
single statistic, that only 14 percent say they knew in college that
they didn't want kids--as if people don't change their minds after 20.
This is not to deny that many women are caught in a time trap. They
spend their twenties and thirties establishing themselves
professionally, often without the spousal support their male
counterparts enjoy, perhaps instead being supportive themselves, like
the surgeon Hewlett cites approvingly who graces her fiancé's
business dinners after thirty-six-hour hospital shifts. By the time they
can afford to think of kids, they may indeed have trouble conceiving.
But are these problems that "intentionality" can solve? Sure, a woman
can spend her twenties looking for love--and show me one who doesn't!
But will having a baby compensate her for blinkered ambitions and a
marriage made with one eye on the clock? Isn't that what the mothers of
today's 50-somethings did, going to college to get their Mrs. degree and
taking poorly paid jobs below their capacities because they "combined"
well with wifely duties? What makes Hewlett think that disastrous recipe
will work out better this time around?
More equality and support, not lowered expectations, is what women need,
at work and at home. It's going to be a long struggle. If women allow
motherhood to relegate them to secondary status in both places, as
Hewlett advises, we'll never get there. Meanwhile, a world with fewer
female surgeons, playwrights and professors strikes me as an infinitely
inferior place to live.
Alan Dershowitz prides himself on his credentials as a civil
libertarian, and to judge by most of the essays in his latest book,
Shouting Fire: Civil Liberties in a Turbulent Age, he has good
reason to do so. The Harvard law professor has built a considerable
reputation on his defense of free speech, due process and the separation
of church and state, to say nothing of his propensity for controversial
clients and clamorous talk shows. Shouting Fire is a pastiche of
fifty-four essays, some of them new, most of them not, the earliest
dating from 1963. The impetus for the collection appears to be at least
in part a desire to reassert the importance of civil liberties, even in
the face of such national security threats as those posed by the events
of September 11 and their aftermath. Moreover, Dershowitz admirably
offers what rights advocates rarely do: a philosophical grounding for
civil and political rights beyond the mere positivist assertion that
"that's the law."
If this were all Dershowitz had done in Shouting Fire, the book
might have received its share of kind reviews and headed off to
Remainderland. But in less than two of the book's 550 pages, he manages
to guarantee the collection a longer shelf life. For in an addendum to a
1989 article in the Israel Law Review, Alan Dershowitz, civil
libertarian, champion of progressive causes, counsel to human-rights
hero Anatoly Shcharansky, makes a case for torture or, more exactly, for
the creation of a new legal device that he dubs a "torture warrant." And
then, through a deft combination of newspaper editorials, public
appearances and an extended interview on 60 Minutes, Dershowitz
has expanded upon that proposition in a way designed to make talk of
torture routine and, not incidentally, banter about his book robust.
Dershowitz's proposal, therefore, deserves careful scrutiny, not only
because it comes from a respected voice but also because sources in the
FBI have floated the possibility that torture will be applied against
prisoners or detainees who refuse to tell what they know about
terrorists. Last October 45 percent of Americans approved of that.
Today, thanks to Dershowitz and others having lent the idea the patina
of respectability--Jonathan Alter writing in Newsweek, Bruce
Hoffman in The Atlantic--the number may be higher.
Dershowitz starts with the familiar scenario from every freshman
philosophy class, the case of the ticking bomb. Suppose the authorities
are holding a suspect who knows where a ticking bomb is located, a bomb
that will kill hundreds of people if it explodes. Would they be
justified in torturing the suspect to procure the information and
thereby save innocent lives?
Dershowitz contends that whether we like it or not, the officials would
inevitably resort to torture and, what's more, the vast majority of us
would want them to. But because any officer who did so might be subject
to prosecution, despite the availability of the common law defense that
a crime may be justified if it is necessary to prevent a greater evil,
the onus of responsibility should not be left on the individual
official. Instead the authorities should apply to a court for a "torture
warrant," similar to a search warrant, so that the courts must bear the
burden of authorizing torture or the consequences of failing to do so.
In another context Dershowitz has offered the reassurances that "the
suspect would be given immunity from prosecution based on information
elicited by torture" and that "the warrant would limit the torture to
nonlethal means, such as sterile needles being inserted beneath the
nails to cause excruciating pain without endangering life."
Despite these precautions, however, Dershowitz's proposal has not met
with universal acclaim, and in recent weeks he has appeared to be
distancing himself from it. In a February 17 letter to The New York
Times Book Review responding to a critical review of Shouting
Fire, Dershowitz claims that "the only compromises [with civil
liberties] I suggest we should consider, and not necessarily
adopt, relate directly to protecting civilians against imminent
terrorist attacks [emphasis added]." But there is no hint on the two
relevant pages of Shouting Fire that Dershowitz's "torture
warrant" proposal is merely hypothetical. Indeed, in commenting on the
decision by the Supreme Court of Israel that prompted the idea in the
first place, he chastises the court for leaving interrogating officers
vulnerable to prosecution if they use torture and says, "The Supreme
Court of Israel...or the legislature should take the...step of requiring
the judiciary to assume responsibility [for torture] in individual
cases." Dershowitz is stuck with his "torture warrants" just as surely
as Arthur Andersen is stuck with its Enron audits.
So what, after all, is wrong with that--other than the fact that torture
violates both the Convention Against Torture, which the United States
ratified in 1994, and the Constitution? The first thing that is wrong is
that the act of torture, unlike that of searching for something, is in
itself both universally condemned and inherently abhorrent. Under
international law, torturers are considered hostis humani
generis, enemies of all humanity, and that is why all countries have
jurisdiction to prosecute them, regardless of where the torture took
place. The fact that a US court or legislature might offer its approval
of the act does not abrogate that internationally recognized standard
any more than a court in Singapore that authorizes the jailing of a
dissident journalist makes Singapore any less guilty of violating the
rights of a free press. Tyrannical governments often try to cloak their
human rights violations in national statute. It is interesting, however,
that no country has ever legalized torture except, arguably, Israel,
until the Israeli Supreme Court struck down the provision for the use of
"moderate physical pressure," and even while that provision was on the
books, the Israeli government argued vehemently that such pressure was
not the equivalent of torture.
To see more clearly the shoals upon which the "torture warrant"
founders, consider this. There is no doubt that despite official
efforts to eradicate it, police brutality is practiced in many US
jurisdictions and probably always will be. Some police officers will
claim, in their more candid moments, that the use of excessive force is
often the only way to protect the lives of officers and the general
public. Why ought the police not be able, therefore, to apply for
"brutality warrants" in specialized cases? Why ought police officers who
believe that a little shaving of the truth on the witness stand is worth
sending a bunch of drug pushers to prison, thus protecting hundreds of
youngsters from a life of drugs and crime, not be able to seek
"'testilying' warrants"? Why ought correctional officers who argue that
allowing dominant male prisoners to rape other prisoners helps preserve
order among thugs and thus protects the lives of guards not be allowed
to seek "warrants to tolerate prisoner rape" in particularly dangerous
situations? The answer in all cases is the same: because the act itself
(brutalizing citizens; committing perjury; facilitating rape) is itself
abhorrent and illegal. Dershowitz's analogy to search warrants fails
because, while a particular search may itself be illegal, the act of
searching is not ipso facto unethical or a crime. For a society
to start providing its imprimatur to criminal acts because they are
common or may appear to provide a shortcut to admirable ends is an
invitation to chaos.
But even if torture were a licit activity under some circumstances,
there are very good pragmatic reasons to reject its use. If the ticking
bomb scenario were designed only to establish the abstract moral
calculus that the death of X number of people constitutes a greater evil
than the torture of one, it would certainly be possible to make a
plausible utilitarian argument for torture. The problem is, however,
that the proponents of the ticking bomb scenario want it to serve as the
basis of public policy, and unfortunately reality rarely conforms to
scenarios and life doesn't stop where the scripts do. How strange that
though the ticking bomb scenario has been used for decades to justify
torture, its defenders are unable to cite the details of even one
verifiable case from real life that mirrors its conditions.
Perhaps, upon reflection, that is not so strange. For what the ticking
bomb case asks us to believe is that the authorities know that a bomb
has been planted somewhere; know it is about to go off; know that the
suspect in their custody has the information they need to stop it; know
that the suspect will yield that information accurately in a matter of
minutes if subjected to torture; and know that there is no other way to
obtain it. The scenario asks us to believe, in other words, that the
authorities have all the information that authorities dealing with a
crisis never have.
Even aficionados of ticking bomb torture agree that its use can only be
justified as a last resort applicable to those we know to a moral
certainty are guilty and possess the information we seek. That 45
percent of Americans who reported last October that they approved of
torture were approving of the "torture of known terrorists if they know
details about future terrorist attacks." But how do we know all that?
The reason torture is such a risky proposition is exactly because it is
so difficult to tell ahead of time who is a terrorist and who is not;
who has the information and who does not; who will give the information
accurately and who will deceive; who will respond to torture and who
will endure it as a religious discipline. The fact is that many people
suspected of being terrorists turn out not to be, as our experience
since September 11 has proven so well; that, historically, many of those
subjected to torture are genuinely ignorant of the details the
authorities seek; that the information extracted by torture is
notoriously unreliable; and that torture almost always takes a long
time--days and weeks, not hours and minutes--to produce results. Torture
is of course extraordinarily common. Almost three-fourths of the world's
countries practice it. But not to find ticking bombs. To punish
political opponents. To intimidate their allies. To cow a citizenry. The
ticking bomb scenario in its purest form is a fantasy of "moral" torture
all too easily appropriated by tyrants as an excuse to justify the far more mundane torture they routinely practice.
And if the ticking bomb scenario is a fantasy, the Dershowitzian
addition of a "torture warrant" makes it into a chimera. Here is a
situation Dershowitz envisions for the warrant's use:
Had law enforcement officials arrested terrorists boarding one of the
[September 11] airplanes and learned that other planes, then airborne,
were headed toward unknown occupied buildings, there would have been an
understandable incentive to torture those terrorists in order to learn
the identity of the buildings and evacuate them.
This assumes that those law enforcement officials would have had time in
the hour and a half or so between the boarding of the planes and the
impact on their targets to (1) take the suspects into custody; (2)
ascertain with enough certainty to warrant torture that the suspects
were (a) terrorists who (b) had the needed information in their
possession; (3) apply to a judge for a torture warrant and make the case
for one; (4) inflict torture sufficient to retrieve the necessary facts;
(5) evaluate the validity of those facts in order to be assured that no
innocent plane would be identified and blown out of the sky; and (6)
take the steps required to stop or mitigate the terrorist act. Perhaps
after John Ashcroft has been Attorney General another three years, law
enforcement will have learned to cut enough corners of the legal
niceties to accomplish this feat. But at the moment, given the INS, Tom
Ridge, bureaucratic infighting and all, it seems unlikely.
Which leads to the question of whether, if the United States were to
become the first country in the world to adopt "torture warrants," they
would make us safer. That, after all, is presumably the only ultimate
rationale for their use. But here is another place where the traditional
ticking bomb case explodes in the face of reality. For it assumes that
there are no further detrimental consequences once the victims of the
bombing are saved--no retaliatory strikes, for example, by the torture
victim's comrades to pay back the inhumanity done to their brother. It
doesn't take much imagination to see how quickly officially authorized
torture would diminish the credibility of a struggle against terrorism
that is being fought in the name of defending American values and the
rule of law. How many people would need to be tortured before our allies
threw up their hands in disgust and our adversaries started celebrating
their moral victory? How many innocent people would have to be
brutalized before their resentment and that of their friends and family
would spill over into violence? In his book No Equal Justice law
professor David Cole has shown how mistreatment of the innocent by US
police can alienate entire communities and result in increases in crime.
Torture, similarly, is a sure-fire way to manufacture an embittered
opponent of the United States where there was none before. And make no
mistake that innocent people would be tortured, warrant or no, for,
after all, if close to 100 innocent people have been convicted of
capital crimes and sentenced to death in this country despite all the
protection our legal system offers, how much more likely is it that
miscarriages of justice will flow from the pen of a single judge?
Whatever leadership the United States can claim in the world is
intimately linked to our practice of values universally regarded as
fundamental to a civilized people.
So how could a distinguished human rights advocate like Alan Dershowitz
have strayed so far from the mark? Part of it may have to do with the
philosophical basis for rights that he sketches in the beginning of his
book. Wisely rejecting the notions that rights are derived from deity or
natural law and yet unconvinced that positivism alone provides
sufficient heft for rights claims, Dershowitz adopts what he calls the
"experiential-advocacy approach." In effect, he says, we should look to
history to identify prototypical instances of injustice (slavery, for
example) and then, based upon that human experience, construct a set of
rights--free speech, due process--that are most likely to bring about
the type of society in which we would want to live. So far, so good.
Human rights are assuredly derived from human experience.
But what if you disagree with my vision of the good society? The best we
can do, Dershowitz insists, is to try to argue you out of your myopia:
"That is all I can do," he says. "Defend my preference for [certain]
rights.... But I make no claim for these rights beyond my ability to
persuade you to agree with me that history--especially the history of
wrongs--has shown these rights to be important enough to be given a
special status in the hierarchy of preferences. It may surprise you to
learn that for me there is no sharp line...separating rights from
strongly held preferences." It is here that Dershowitz stumbles.
For while rights are, in a sense, preferences, they are also more than
that: They are norms, behavioral norms necessary to create and sustain a
good society. And they become norms not through argument alone but
through its conclusions, through an articulated consensus of the
international community. One of the most astonishing lacunas in the
philosophical section of Shouting Fire is the absence of even one
mention, if the index and my reading are to be believed, of the
Universal Declaration of Human Rights. For while the UDHR did not set
out to be a legally binding treaty (the State Department called it in
1948 "a hortatory statement of aspiration") and hence avoids the limits
of positivism, it does reflect--imperfectly, to be sure, but as well as
possible within the current limits of human endeavor--what St. Augustine
called our "overlapping loves," our common measures of a decent world.
To those who disagree with its vision of that world, we can offer much
more than a shouting contest, much more than any one person's reading of
history or any one nation's perception of its needs. We can offer the
collective wisdom of the human community as hammered out, written down
and, more and more frequently, enforced. And part of that wisdom is that
torture is wrong. Everywhere. In all circumstances. With or without a warrant.
Alan Dershowitz may not like that. And he is certainly entitled to go on
arguing about it. He is a persuasive fellow and eventually he may even
succeed in helping erode the international prohibitions on torture. That
will be a sad day, no doubt, but how comforting it will be to know at
that point that, thanks to the professor, the needles will be sterile.
When I was a teenager on my first trip to Paris, I remember looking out
at the Parisians from the window of a taxi as we proceeded along some
splendid boulevard and thinking, But do these people take themselves
seriously, really? They're not Americans, after all. Sorry. It's true,
though embarrassing. I felt sorry for them because they weren't us. I
needed a reality check, which the French were only too happy to provide.
They soon taught me how superior to us, and to me, they were in every
way, especially intellectually and in matters of literature, fashion,
proper cigarette inhalation and the application of maquillage.
Well, let that pass. Now Granta has published the greater part of
an issue (Spring 2002) devoted to Their perceptions of Us, called "What
We Think of America." Interestingly, the writer who is possibly the most
violently anti-American in the collection, or who admits to the most
violently anti-American feeling, is not French, or Arab, but Latin
American. Ariel Dorfman, the Argentine-born Chilean writer, tells about the
time he watched an American toddler tumble into a swimming pool at a
resort in the Andes and carefully measured what his own reaction might
be--after all, the kid had been behaving badly (loud, blond, white,
Anglophone, whining, stupid, spoiled, exploitive, rapacious,
intervening, assassinating legitimate heads of state, financing coups,
training torturers... oops... but really, you catch Dorfman's drift). In
the end, though, he did dive in after the brat.
There is the gentler French person, badboy Benoît Duteurtre,
author most recently of the novel Le Voyage en France, which won
the 2001 Médici prize. Duteurtre criticizes Europe for
proclaiming a high ground in human rights from which to criticize the
Americans, as if, he says, to disguise from itself that it belongs to
exactly the same world and is mired in identical contradictions. He
makes fun of the way the French use the word Disneyland (pronounced
Deez-nee-lahhhhnd) to refer to the entire American polity.
President Jacques Chirac--that unsuccessful chameleon--comes in for a
smacking, too. In Chirac's speech after September 11, Duteurtre writes,
"I heard the inferiority complex of a Europe deprived of its role as
world leader...but still quick to judge good and evil."
The effect of Granta's roundup is shockingly human: Here are no,
or few, diatribes, and much affection--through tears--from Arab and
Muslim contributors. A piece that perhaps explains well what led to
September 11 (which I take to be the ostensible reason for
Granta's package) is Pankaj Mishra's "Jihadis," a beautiful,
brilliantly observed essay about Pakistan (by an Indian!) and its
troubled identity, as well as about the US and Pakistani governments'
growth and nurturing of the jihad movement. Gives you an idea, too, of
the level of corruption that made the initially pure-minded Taliban welcome at first.
This, to me, is the best issue of any magazine trying to explain
September 11. There is also Ziauddin Sardar's "Mecca," both funny and
instructive about the rituals of the hajj and of Saudi society in
general. Don't forget to appreciate the photo essay on Afghanistan by
Thomas Dworzak: It captures the dust, the mud, the turbans, the
mountains, as well as Northern Alliance soccer, burqa ladies buying
their liberation pop-music cassettes and the eerie ruins of eternal
Kabul, after the attacks.
Out on the Links
Sometimes, the problem with online magazines like The Black
Commentator (www.blackcommentator.com) can be the links. For
example, in the inaugural (April 2002) issue, there's a very persuasive
piece on the much-discussed Cory Booker, running for mayor of Newark in
the Democratic primary against the picaresque Sharpe James. Booker,
another dang Rhodes scholar, seems to pride himself on adopting some of
the meretricious conservative bent of Bill Clinton. The TBC piece
attacks Booker for his support of school vouchers and goes on to map out
in great detail the web of conservative groups that have supported the
Booker movement in Newark--not a pretty picture. It argues that Booker
is another pawn in the right's effort to develop African-American
politicians it can work with and manipulate.
The piece has no byline. But at the end, it has a feature possibly more
meaningful than a byline: "sources that contributed to this commentary,"
followed by a series of hyperlinks to information both pro- and
anti-Booker. In his speech at the Manhattan Institute, you can
hear--behind Booker's pro-voucher position--not only the clink of money
and financial backing and the Evil White Rich Men Who Run The World but
also the will of a black electorate with whom Booker, having lived in
Newark's projects and spent month after month on a notorious
never-cleaned-up corner in Newark's drug-dealing inner city, is not
unfamiliar. The problem with TBC is its paranoid style: I'd like
to see it address the question of why there seems to be a drift among
African-American voters toward conservatism--something that doesn't just
tell me the new black pols are being paid for it but that considers the
electorate as well, considers Booker's supporters: what they think of
people like Colin Powell or the improbably named Condoleezza Rice, or of
Cory Booker, who's no idiot. Don't some black families hold these
successful conservative types up as examples to their sons and
daughters, or is that just too Cosbyfied?
Al-Ahram Weekly, a venerable English-language publication based in Cairo, is
another paper that for the moment is focusing on an oppressed people--in
this case, the Palestinians. Al-Ahram is a useful corrective for
my formerly peace-leaning Jewish friends who feel that the US media,
especially the New York Times and CNN, are increasingly biased
against Israel. It's a real source for uncovered news about what's
happening in the territories, with much less of the myth-making and
demonizing that characterize so much of the Palestinian stuff coming out
of the West Bank over the Internet in these impossible times. Much less
paranoia here but, still, a painful and useful reality check.
On the morning of April 20, in the nation's capital, activists held two anti-war rallies, each of which drew thousands, almost within sight of one another.
I am beginning to suspect that Nation readers may not fully appreciate the challenges Attorney General John Ashcroft faces. What would you do in his place? Your intelligence agencies had no advance knowledge of the September 11 plot and don't appear to know much more about future attackers. Airport security screeners are letting test bombs and guns pass at alarming rates, and your immigration agency is so hapless that it issued visa extensions to two of the hijackers six months after they died flying planes into the World Trade Center towers. When you consider the threat from their side and the incompetence on ours, it's understandable that Ashcroft has cast his net so wide. He's shooting in the dark. In fact, the expanse of his net is probably inversely proportional to the depth of the intelligence he has received.
But just as with the terrorists themselves, understanding Ashcroft's motives does not justify his actions. To date, despite the thousands of Arab and Muslim immigrants arrested, searched, profiled and questioned, Ashcroft has charged only a single person--Zacarias Moussaoui--with any involvement in the attacks of September 11. And he was arrested before the attacks occurred. Such broad-brush tactics are unlikely to succeed, for they give notice to potential targets, allowing them to evade detection, while alienating the very communities we must work with to identify potential threats who may be living among them.
Ashcroft has shown no signs of getting closer to his target. And the less he finds, the wider he sweeps. He recently announced that he was extending to 3,000 more people his much-criticized initiative to subject male immigrants from Arab countries to "voluntary" interviews, despite the fact that the initial interviews have led to no further charges in the investigation. And having learned how easy it is to use immigration law as a pretext for criminal law enforcement when you lack probable cause, the Justice Department is now preparing to enlist local police officers to help enforce immigration law, a disastrous proposal likely to drive immigrant communities even deeper underground.
The lengths to which Ashcroft will go were revealed most recently by his indictment of Lynne Stewart, a 62-year-old New York attorney who has made a career of courageously taking on clients for whom few other lawyers are willing to risk their reputations. Her most notorious such case was defending Sheik Omar Abdel Rahman in his 1995 criminal trial for conspiring to bomb the tunnels into Manhattan. Now she's charged with providing "material support" to the sheik's organization, the Egypt-based Islamic Group, largely by abetting communications between the sheik--whom prison regulations prohibit from communicating with virtually anyone in the outside world--and others in the group.
The government simultaneously announced that it will make Rahman its test case for its unprecedented initiative to listen in on attorney-client communications. Confidential exchanges with lawyers have long been sacrosanct, because they are critical to any fair legal process; in the past, they could be intruded upon only with a warrant based on probable cause that the communications were intentionally furthering criminal activity. But under regulations issued after September 11, the government claims the authority to monitor attorney-client communications without establishing probable cause for believing that the communications are being used for illegal ends, and without obtaining authorization from a judge.
Most troubling, Ashcroft is prosecuting Stewart even though she is not charged with furthering any illegal or violent activity of the Islamic Group, a wide-ranging Islamic political movement that engages in a great deal of lawful activity in addition to terrorism. While many have criticized the government for targeting a lawyer, of far more concern is its criminalization of speech and associations having no connection to terrorism. Unable to link Stewart to any actual terrorist activity, Ashcroft has resorted to guilt by association. As a US citizen, Stewart will at least have an opportunity to defend herself in a public trial. Not so the hundreds of noncitizens still being detained on immigration charges in connection with the September 11 investigation, many long after their immigration proceedings have concluded. Under orders from Ashcroft, they are being tried in secret proceedings closed to the public, press, legal observers and family members.
In a major setback for the Ashcroft agenda, US District Judge Nancy Edmunds on April 3 declared the closed proceedings unconstitutional. She ruled that open trials are a fundamental feature of our justice system and that any closure must be carried out not in the sweeping manner that Ashcroft so favors but through means narrowly tailored to protect national security interests. The government has appealed, arguing that to act in a more narrowly tailored fashion might tip off Al Qaeda to what we do and don't know. But one has to wonder whether the government's real concern isn't that opening the proceedings might tip off the public to just how wildly John Ashcroft is shooting in the dark.
It's hard to imagine a tale of corporate mischief that would shock veteran observers of the US tobacco industry. But even the most jaded reader may raise an eyebrow at the allegations reported on page 11 that major American tobacco companies smuggled cigarettes and laundered money on a vast scale, defying US and foreign law and defrauding foreign governments of hundreds of millions in tax revenues before engineering a rewrite of the USA Patriot Act last fall to shield themselves from international liability. For this special report, the result of an investigation by The Nation, the Center for Investigative Reporting, and NOW With Bill Moyers--with support from the Investigative Fund of the Nation Institute--journalist Mark Schapiro traveled to Colombia, whose state governments are suing the companies in US court, to assess the charges and to inspect the scene of the alleged smuggling operations. (NOW airs its investigative report on April 19.)
The Bush Administration ought to cooperate with authorities in Colombia and other countries in their efforts to hold US corporations accountable. It should support legislation to establish clearly the principle of jurisdiction in US courts over allegations of wrongdoing by American companies overseas. And the Justice Department should launch an investigation into the activities of US tobacco firms in Colombia to determine whether laws were broken and prosecution is warranted. It is important for the rest of us to raise the political cost of inaction. Republicans in Congress and in the White House may one day realize that with friends like Philip Morris, they don't need enemies.
More than the much-reviled products of Big Tobacco, big helpings of Big Food--fats, sugars and salt--constitute the number-one threat to America's children. Yet the nation so concerned about protecting kids from nefarious images on library computers also allows its schools to bombard them with food and snack ads on Channel One and to sign exclusive deals with purveyors of habit-forming, tooth-rotting, waist-swelling soft drinks.
Foreigners who arrive in the United States often remark on the national obsessions about food and money. It is perhaps not surprising that a gluttonous mammon would rule the federal regulators of our food chain, but Marion Nestle, professor of nutrition at New York University, confesses that she has heard few of her nutritionist colleagues discuss the cardinal point: "Food companies will make and market any product that sells, regardless of its nutritional value or its effect on health."
Nestle goes on to demonstrate that not only do food companies use traditional corporate checkbook clout with Congress to insure their unfettered right to make money; they also co-opt much of the scientific and nutritional establishment to aid in their efforts. For example, the omnipresent "milk mustache" advertisements often show blacks and Asians--precisely those who are most likely to be lactose-intolerant. But then "science" rides to the rescue: There are a lot more research dollars shunted to those arguing that lactose intolerance is not a problem than there are for those who think otherwise. In fact, the Physicians' Committee for Responsible Medicine sued to annul the federal dietary guidelines, which recommended two to three servings of milk products daily; six of the eleven people on the voting committee had received research grants, lectureships or other support from the food industry.
Here Nestle wobbles a little in her argument, however. She waves the standard of science on behalf of the Food and Drug Administration when it comes to food supplements and herbal medicines, but devalues the "science" as well by revealing the conflicts of interest among researchers and regulators. Science is often up for sale. Researchers go to the food corporations for the same reason that bandits rob banks: That's where the money is, not least since the FDA's own research funding is controlled by Congressional committees in charge of agriculture, whose primary aim is hardly to promote dieting--it is the force-feeding of agribusiness with federal funds. Indeed, Nestle concedes, "USDA officials believe that really encouraging people to follow dietary guidelines would be so expensive and disruptive to the agricultural economy as to create impossible political barriers."
The dietary guidelines Nestle is referring to were monumentalized in the famous "food pyramid" familiar to every primary school student. But the pharaohs finished theirs in less time than it took the USDA to pilot its version past the army of lobbyists who resented the hierarchical implication that some foods were healthier than others. As a whistleblower on the advisory committee that was drawing up the guidelines, Nestle is well qualified to recount the obstacles it faced.
In fact, many people did become more health-conscious as a result of such guidelines, but as Nation readers know, practice does not always match theory. Much of Food Politics reveals how the food industry has seized upon the marketing possibilities of consumers' safety concerns and perverted them by adding supplements to junk foods and then making health claims for the products.
Food is an elemental subject, on a par with sex and religion for the strength of people's beliefs about it. Otherwise rational people have no difficulty believing the impossible during breakfast, where their stomachs are concerned. Big Food relies on that snake-oil factor, the scientific illiteracy of most consumers. For example, marketers are happy with the advice to eat less saturated fat, since most buyers won't recognize it when it's drizzled across their salad. But advice to eat less of anything recognizable stirs up serious political opposition.
Federal dietary guidelines recommending that we "eat less" were thinned down to suggesting that we "avoid too much," which metabolized into "choose a diet low in..." And so on. For example, Nestle relates how in 1977 the National Cattlemen's Association jumped on Bob Dole's compromise wording on reducing red meat in the diet and increasing lean meat consumption: "Decrease is a bad word, Senator," the cattlemen warned him. The cowboys effectively corralled the McGovern committee on dietary guidelines: "Decrease consumption of meat" was fattened into "choose meats, poultry and fish which will reduce saturated fat intake."
Sometimes, the more potential a product has for harm, the more likely it seems that its positive--or putative--health benefits will be touted. We get vitamin-supplemented Gummi Bears and, what provokes Nestle's justifiable ire most, Froot Loops. This marshmallow-blasted "cereal...contains no fruit and no fiber" and "53% of the calories come from added sugar," she inveighs. The perfect breakfast complement to a twenty-ounce bottle of cola that will be downed in school? Such pseudo-foods occupy the very top of the food pyramid, which characterizes them as to be used sparingly--or rather, to be eaten only if you have good dental insurance.
As Nestle points out, health warnings on alcohol and tobacco have done little to stop consumers. But picture a tobacco company allowed to sell cigarettes as "healthier" or "with added vitamins." (Indeed, she details a campaign by the alcohol companies to get Congress to allow them to market their products as healthy elixirs until Strom Thurmond's religious principles outweighed his conservatism enough for him to help shoot down the proposal.)
I was mildly surprised that Nestle does not comment on the imprecise use of "serving" information on food packaging. As a longtime student of labels, I find that the unhealthiest foods seem to have incredibly small "servings" compared with what consumers actually eat or drink. For the USDA, one slice of white bread or one ounce of breakfast cereal is a "serving" of grain, and nutritional data such as caloric content are rendered "per serving." A cinema-size actual serving of soda may contain 800 calories in sugar, before you get down to the buttered popcorn, not to mention the Big Mac before or after.
Food marketers are hardly breaking people's arms to persuade them to eat this stuff, of course. It is, after all, a great American principle that you can have your cake, eat it and slim down at the same time. What Nestle calls "techno-foods"--those labeled "healthier," "less fat," "lite," "more fiber"--pander to the health consciousness of a generation that will do anything to lose weight and live longer, except eat less.
The ultimate example of food marketing has to be Olestra, the cooking fat that passes through the gut undigested. Its maker, Procter & Gamble, has spent up to $500 million on it and taken up twenty-seven years of the FDA's time getting various approvals, all the while trying to remove the mandated health warning that the product could cause cramping and loose stools and block the absorption of fat-soluble vitamins. P&G should count its blessings. A Center for Science in the Public Interest report says: "Olestra sometimes causes underwear staining. That phenomenon may be caused most commonly by greasy, hard-to-wipe-off fecal matter, but occasionally also from anal leakage (leakage of liquid Olestra through the anal sphincter)."
By 1998 Procter & Gamble disingenuously claimed that 250 tons, or four railcarfuls, of fat had not been added to American waistlines. No one claimed it had--what the company meant was that this was how much Olestra had been used to fry chips. The public's expectations were quite high, though; Nestle says that "people also were disappointed that the chips did not help them lose weight." Indeed, she reports that some ended up with more calories from eating Olestra-fried chips than they would have from other kinds, because they consumed a higher volume, convinced that the chips were calorie-free, though of course they were not.
But given the amount of money involved and the way food-industry/scientific-community connections are structured, "it is virtually impossible for any nutritionist interested in the benefits and risks of Olestra to avoid some sort of financial relationship with P&G unless one systematically refuses all speaking invitations, travel reimbursements, honoraria and meals from outside parties," Nestle observes.
In yet another case of Big Food getting its way, Nestle chronicles how the State Department came to declare that signing the World Health Organization/UNICEF international code on marketing of baby formula would flout the Constitution. "Inasmuch as this explanation strains credulity," Nestle suggests, the real reason was lobbying by US formula companies. The formula makers are fighting a war of attrition against mother's milk, in other words, not just here but internationally.
A more recent case involves the coalition that forced the FDA to allow claims of benefits from untested herbal supplements. I wish Nestle had gone into more detail about the sociology of this mélange of New Age alternative-medicine users, libertarian types and those who mistrust the medical establishment. Groups like Citizens for Health and the Corporate Alliance for Integrative Medicine rallied behind the rapidly growing corporations to ram a suppository up the FDA and its power to control sales of what on occasion have proven to be fatally flawed "alternative" remedies for everything from impotence to Alzheimer's. As she quotes an FDA commissioner, "[We] are literally back at the turn of the century, when snake oil salesmen made claims for their products that could not be substantiated." She reports claims that 12 percent of users of herbal medicines, or about 12 million people, suffer from some kind of adverse effect.
People may feel better when they take supplements, but, she asks, should health officials use "feelings" as a basis for regulatory measures? Or should the FDA instead "take the lead in reenergizing a crucial phase of its basic mission to promote honest, rational scientific medicine by vigorously combating its opposite"?
Many people may want to know what "science" is. Is it corporate-sponsored research, or the AMA defending its professional turf with the same vigor with which it has traditionally fought "socialized medicine"? Nestle shows how the American Academy of Pediatrics tried to insure that highly profitable baby formula flowed through its hands and rallied against direct sales to mothers. Was that concern for the "client" or concern for professional prerogatives?
Perhaps Nestle should have been more polemical. The food supplement row raised the question of "whether irreparable damage has been done to the ability of our federal regulatory system to ensure the safety of foods and supplements and to balance public health interests against the economic interests of corporations," she writes. But her own reporting suggests that the barbarians are already inside the gates and forcing their wares on the gullible.
Nestle sees no magic bullet to retrieve the situation. She wants "some federal system to guarantee that all those products on the shelves are safe and effective," and she asks, "Shouldn't there be some regulatory framework to control patently absurd or misleading claims?" To answer that in the affirmative is not necessarily the same as agreeing that the FDA is the best agency, certainly in its present form, nor that the AMA and similar organizations are in the corner of good science. The FDA's record does not inspire confidence, which is one of the reasons the herbalists' revolt was so successful in Congress. Its arrogance often matches its ignorance. While reading this book I went to a small British-owned cholesterol shop in Manhattan (pork pies, etc.). Its owner can't import kippers because the FDA does not recognize them as food. His first shipment of a brand of British Band-Aids was held on suspicion of being a soup, and when that confusion was finally cleared up, the FDA demanded of him a medical-goods import license.
I would like to hear more about how the FDA could be made more responsive and more efficient. It seems that in their present form, the regulatory bodies need some means of democratic oversight to check bureaucracy and to weigh problems of undue influence from the producing industries. Nestle details problems we've come to see elsewhere: the revolving door between civil servants, Congressional staff and industry. She also suggests rules--"a higher and stronger 'firewall'" between regulatory agencies and industry to inhibit the easy career glide from poaching to gamekeeping and back again--and she is entirely correct that the last bodies that should be overseeing FDA funding are the Congressional agriculture committees, which are dedicated to the prosperity of agribusiness.
Otherwise, Nestle's wish list ranges from sensible to Mission Impossible: tighter labeling rules so people can see exactly what they are consuming. A ban on advertising of junk foods in schools, especially candies and soft drinks with high sugar content. Sumptuary taxes on soft drinks as well--sure to be opposed bitterly by the lobbyists. If alcohol and tobacco advertisements cannot be allowed on children's TV, why allow advertising of foods that promote obesity and future health ills on a par with them?
At first glance, Nestle's call for an ethical standard for food choices for nutritionists and the industry seems highly idealistic; but ten years ago, who would have foreseen Philip Morris's berating of state governments for not spending their tobacco settlement money on the pledged anti-child-smoking campaigns? Already, more and more scientific journals are demanding disclosure of conflicts of interest for papers submitted.
Nestle does not touch the subject directly, but who knows, maybe campaign finance reform really will cut indirectly the pork in the political diet and the crap in the school lunches. However, it will be a hard push. Educating the public is a start, and Food Politics is an excellent introduction to how decisions are made in Washington--and their effects on consumers. Let's hope people take more notice of it than they do of the dietary guidelines.
The Enron "outrage," AFL-CIO president John Sweeney told a rapt crowd of several hundred workers at Milwaukee's Serb Memorial Hall, is "not the story of one corporation's abuses, but sadly it's an example of business as usual in boardrooms and executive suites all across the country." Over the coming months, at a series of town-hall meetings around America, the AFL-CIO will warn workers that they, too, could be "Enroned," and it will call for "no more business as usual."
In an unprecedented way, argues AFL-CIO corporate affairs director Ron Blackwell, the Enron scandal "opens up a channel of public discourse on issues of retirement security and corporate accountability." In the booming nineties nobody wanted to hear why corporations and capital markets had to be better regulated, and reformers were left pleading for corporations to be "socially responsible." But today, "new economy" job-hoppers as much as steelworkers have good reasons to listen to union warnings about deeply flawed 401(k) plans and Social Security privatization.
The labor movement helped win millions in severance pay for laid-off Enron workers, provided legal counsel for workers battling Enron's creditors, sued Enron executives (through union-affiliated Amalgamated Bank) on behalf of pension funds that lost hundreds of millions of dollars in Enron's collapse and helped ex-Enron workers--both union and nonunion--tell Congress and the public how they were misused. The AFL-CIO requested new Securities and Exchange Commission rules and forced four Enron directors to withdraw from renomination at other corporate and public boards. Now labor is challenging Enron director Frank Savage's renomination to Lockheed Martin's board, sending the message that independent directors have a public trust.
Besides supporting auditor reform, the AFL-CIO is promoting legislation to strengthen the rights of workers in 401(k) plans--to a point. Senator Jon Corzine, backed by the Pension Rights Center, initially proposed prohibiting employees from holding more than 20 percent of their employer's stock in their plans. But after complaints from unions representing some workers who had bet big with their employers' stock, like pilots and GE employees, the AFL-CIO backed Senator Ted Kennedy's legislation, which places a less stringent limit on the employees' 401(k) holdings of their employers' stock but which, quite importantly, would require equal worker and employer representation in governing the plans. Enron worker Dary Ebright, who lost $300,000 from his 401(k), argues that limits make sense. "If that had been in place," he said in Milwaukee, "I wouldn't be here today."
Sweeney hopes that unions can use votes on Enron-related reforms to draw lines in this year's elections showing what candidates put first--corporations or workers. The AFL-CIO attacked Republican Representative John Boehner's legislation, passed in April, for "wip[ing] out existing retirement protections for workers under the guise of responding to" Enron. The House bill would permit investment firms to advise workers about financial products, like mutual funds, from which those firms profit--precisely the kind of 1990s conflict of interest that is under investigation at several Wall Street brokerages. While providing limited protections for workers and preserving executive privileges, the House bill would also make it easier for corporations to exclude most employees from retirement plans. Labor's advocacy for Enron workers and retirement security could also strengthen organizing, including efforts among white-collar workers, by sparking a more "enlightened" view of a collective voice at work, as it did with former Enron vice president Dennis Vegas, now a union enthusiast.
But a budding labor scandal threatens the movement's credibility on corporate accountability. It appears that a few labor leaders, sitting on the board of ULLICO, parent of Union Labor Life Insurance Company, personally profited from privileged deals in the Enronlike boom and bust of telecommunications upstart Global Crossing, while their unions' pension funds were denied the same opportunity. Robert Georgine, president of ULLICO and former president of the AFL-CIO's building and construction trades department, former Iron Workers president Jake West, Plumbers president Martin Maddaloni and Carpenters president Douglas McCarron are among those who got windfalls of several hundred thousand dollars. In March Sweeney, who did not take part in the deal, called on ULLICO, like Enron, to appoint an independent committee and counsel to investigate, but in mid-April Georgine said he would take a "somewhat different" approach. "We're not going to ask Enron to live by one set of standards and ULLICO to live by another," Sweeney insisted. Many union officials say they were shocked and disgusted by the news, a reminder that "no more business as usual" is a widely applicable slogan, even within union ranks.
In an end-of-the-year column devoted to "Politics and Prose," Peter Beinart, editor of The New Republic, asserted that there had been a "new gravity" and "sobriety" to American journalism since September 11. Literary responses had failed, he argued, to process the event, notably in a commemorative issue of The New Yorker in which the writing had been "excessive, even grotesque when applied to mass carnage in downtown New York."
Beinart declared it was now the era of the essay--"non-reported, non-narrative, political or historical analysis"--and "the sombre profile of a person in power"--stripped of excessive description, wanton psychoanalysis and "edge" but not of dutiful and accurate quotation. "American journalism, after a long while on the sidelines," he rallied, was "back in the game."
It was a shaky argument, one that some editor of The New Republic (a magazine that, as a matter of policy, confuses an antiliterary style of journalism with an anti-indulgent outlook) was bound to make sometime.
Let's face it, the new Hunter S. Thompson won't ever be found in its Puritan liberal pages, though the journalism of a New Yorker writer like Jonathan Franzen just might be, albeit in a soberer, straighter version. Franzen himself exhibits too minute a panic in his work, too much "edge" (see his novel of last year, The Corrections); he is simply too much like a literary forefather such as Joseph Heller (Catch-22 and, more important for Franzen, Something Happened) to make any editor at The New Republic feel he had a grip on the world. And what is The New Republic--or any news and culture magazine--about if it isn't grip, skeptical firmness, analytical rectitude?
Ever since the 1960s and the advent of New Journalism--subjective and, yes, "literary" in its aspirations, distinguished by figures like Truman Capote, Norman Mailer, Gay Talese, Tom Wolfe, Gail Sheehy, Joan Didion--there has been an ongoing and necessary argument in favor of old-school values like objectivity, plain writing and reporting craft. Beinart's analysis of the American print media today is just the latest salvo, objectively put of course, saying out with "the New" and in with the old. It's part of a larger debate about consciousness and language, and how best to represent the state of the nation in both journalism and fiction in ways that reassure Americans their world can be secured, defined, reinforced.
Ironically, the tag New Journalism has been a misnomer from the beginning, implying--all the more alongside the revolutionary context of the 1960s that birthed it--a rejection of past values and a blind dive into the postpsychedelic waters of contemporary reality. It also denies the historical significance of figures like George Orwell, Martha Gellhorn, Joseph Mitchell and Damon Runyon, who created openings in journalistic convention, idiosyncrasies demonstrating that "New Journalism" had been around for the best part of the century--if a writer had the gift and the license to explore the possibilities. For that matter, is it so far from Walt Whitman's 1882 diary of the Civil War in Specimen Days to Michael Herr's scattershot report on Vietnam, Dispatches?
Many writers disliked the term New Journalism for these very reasons, preferring less-catchy descriptions like "Immersion Journalism" to describe the intense amounts of research and closeness to one's subject matter required to make such subjective reporting great and accurate storytelling; or "Literary Journalism" because of the undisguised desire to apply the techniques of fiction to a retelling of factual events and conversations.
One of the most notorious indicators of the style was the use of interior monologue, even pure streams of consciousness, in groundbreaking pieces like Gay Talese's "The Loser," a brilliant profile of boxer Floyd Patterson (Esquire, 1964), and Tom Wolfe's "The First Tycoon of Teen" (New York, 1965), a feature story on the recording mogul Phil Spector. How absurd, these voices from inside their heads! Wolfe's rhetorical answer to the critics was a slap in the face: "How could a journalist, writing nonfiction, accurately penetrate the thoughts of another person? The answer proved to be marvelously simple: interview him about his thoughts and emotions."
A radical and disciplined art, New Journalism presented a cinematic and psychological rupture with the prevailing journalistic approaches, using dialogue, scenes, thoughts in a dramatic reconstruction of events and interview material. But it still depended on the old verities: solid research, thorough interviewing, good writing (albeit more jazzy in tone and form) and diligent fact-checking. It was an extension of the possibilities, not a denial or negation of what had happened before.
Not content to disturb the print media, New Journalists started shaking up the literary world by producing "narrative non-fiction" bestsellers that caught the times better than any novelist seemed capable of: Capote's masterful and groundbreaking insight into murder and America's pathological underbelly, In Cold Blood (1965); Didion's neurotic essays on her pale sense of selfhood amid West Coast cultural decadence, Slouching Towards Bethlehem (1968); Mailer's rambunctious, egomaniacal coverage of an anti-Vietnam War protest march on the Pentagon, The Armies of the Night (1968).
Books like Thompson's Fear and Loathing on the Campaign Trail '72 (1973), Herr's Dispatches (1977), Mailer's The Executioner's Song (1979) and Wolfe's The Right Stuff (1979) were among a slather of later releases that proved the phenomenon was not going away--from magazine and newspaper journalism or the bestseller lists. In a twist of fate, Mikal Gilmore, the brother of convicted killer Gary Gilmore, Mailer's subject in the capital punishment "thriller" The Executioner's Song, would go on to become one of the few decent writers of the 1990s operating within what could be called the New Journalist tradition, producing a superb book on his brother as well as some excellent writing for Rolling Stone.
Something sick, though, has been happening since the 1960s and '70s heyday of such writers and books. News as non-stop entertainment, the journalist as B-grade personality, a long, slow, moronic nose dive into excess on a scale difficult to imagine back then.
Beinart is right to attack a media consumed today by "lifestyle writing," the bastard child of New Journalism, and a puffed-up aesthetic attitude lacking the flair and depth of earlier, greater writers. Rather than simply attack an excess of style, though, and perhaps a poverty of generational talent, I'd locate the current malaise in the format-driven glibness that is squeezing the oxygen of intelligence--not to mention true journalistic creativity--out of magazines and newspapers today.
As serious print media have attempted to go "lighter" and chase readers in the past decade, circulation figures have dropped, even plummeted. This is a worldwide crisis for up-market magazines and newspapers, dimly explained with arguments (not entirely believed, even by those proposing them) that people are getting more information from the Internet or that the educated reader is disappearing. The truth, more awfully, is that readers of all stripes are disillusioned with what's available. Editors and publishers don't seem to know what to do about that except to go further down-market to anything dumber, faster and glitzier, pursuing that fragmenting audience, that shrinking attention span.
Unfortunately, the old formulas aren't functioning anymore in this fractured, increasingly unstable--some might even say dystopic--market. Thus the argument for "sections" and targeted bites of information neatly accompanied by highly supportive advertising. Even if it's meaningless and no one reads it, at least it turns a profit.
If the New Journalist was merely an "impresario" of stories, as the critic Michael Arlen caustically observed in 1972, today's news feature is altogether more miserable, niche-marketed directly to you without need of any bigger and possibly destabilizing voice. The impresarios are mostly gone; now only the product exists, its sheen undisturbed.
Market conditions of the industry aside, there is something deeply conservative beneath Beinart's analysis, a view that spells trouble for the future of modern journalism and where it might go in the United States today--and therefore the world at large. Certainly Beinart's reactionary spirit is in tune with the nation's siege mentality and a chauvinism that encourages the closing not just of borders but of the state of the American mind. There is a feeling that the unexpected and the elusive, manifest in volatile individuals and their creativity, are not legitimate concerns and activities for American voices in an era of uncertainty and instability.
This affects both the media and literature as the struggle for "representation" in American life takes on a deeply political dimension in terms of the language that should be used. It is not just a matter of what is debated, interpreted, depicted--but how that debate should be carried out, the implication being that the wrong words themselves betray the state. There has been an across-the-board conservative intellectual push in the United States for some time now, making an argument for a return to literary order in fiction. B.R. Myers's controversial essay, "A Reader's Manifesto," in The Atlantic Monthly last summer, struck similar notes to Beinart's more recent aria, attacking the wordy pretensions and metaphoric excesses of contemporary American fiction writers like Don DeLillo, Cormac McCarthy and E. Annie Proulx. Subtitled "An attack on the growing pretentiousness of American literary prose," the essay denounced evil postmodernists, showoffs and "pansified intellectuals" who had undermined good language and sound thinking across the nation. What Myers demanded was "a reorientation towards tradition."
In one of many trainspotting examples he berated Proulx for some "characteristic prose" where she thanked her children at the end of Close Range (1999) for putting up with her "strangled, work-driven ways." According to Myers this phrase made "no sense on any level." When a reader wrote in to complain that it was "an implied metaphor and hardly difficult to understand," Myers stuck to his guns, returning to the dictionary and rules of grammar to justify himself. Fortunately for us, language moves--and is received--poetically and intuitively, even if Myers doesn't want to admit it.
However, he did finger a crucial distraction from the building of the modern American novel and how it is reviewed, even sanctified today. He called this "the sentence cult," those who adore wonderful phrases and patches of writing that finally do not add up to a fully felt, organically "alive" book worth reading, let alone relating to deeply. On this point of literary fragmentation, a collapse away from storytelling and character, a collapse of identification and therefore identity, he may well be right. As to whether such a collapse makes the literature inherently bad--well, that's another thing altogether.
This debate about the state of the American novel has been going on for years. Indeed, American literature regularly convulses around such landmark essays--and the need to write them--in a battle for intellectual territory that should not be underestimated. The reverberations of these opinion pieces among cultural elites carry through as manifestoes for the times and exert enormous influence in publishing houses and the media. They should also be understood as beachheads for the highbrow magazines presenting them: in this case the long-running desire of the Atlantic to overtake Harper's as the defining literary and intellectual periodical of the day, a desire underlined by its drift toward political and aesthetic conservatism. The Atlantic is ready for the Bush era, righteous, satisfied and a little smug, just as Harper's might be seen as aristocratically Clintonian, progressive to the point of dilettantism and somehow out of step with the narrowing contemporary mood.
Myers's essay is an attempt to supersede an argument put forth by Franzen in Harper's in 1996, in a piece titled "Perchance to Dream: In the Age of Images a Reason to Write Novels." In that essay, Franzen wrote of his own "despair about the American novel." His conclusions, and hopes, however, were somewhat different from those of Myers.
Of the social novel Franzen wrote: "I didn't know that Philip Roth, twenty years earlier, had already performed the autopsy, describing 'American reality' as a thing that stupefies...sickens...infuriates, and finally...is even a kind of embarrassment to one's own meager imagination. The actuality is continually outdoing our talents." His despair for the state of the American novel was born out of the 1991 Gulf War and "a winter when every house in the nation was haunted by the ghostly telepresences of Peter Arnett in Baghdad and Tom Brokaw in Saudi Arabia--a winter when the inhabitants of those houses seemed less like individuals than a collective algorithm for the conversion of media jingoism into an 89 percent approval rating."
Questioning the difficulties of social realism in the age of electronic media, and arguing that TV could represent reality better than any novel could, Franzen pined for the days when a book like Catch-22 had a huge social impact, raising questions about society to such a level that its title became part of the common vocabulary, entering itself in the dictionary (a thought that must give B.R. Myers a sleepless night or two).
What Franzen saw in Heller's black and absurdist work was less a need to find legitimacy in a realistically detailed social novel of the present à la Dickens than a need to write a novel that was socially engaged, quite a different--if not unrelated--thing. It was this thinking that guided him in writing The Corrections, with its vaguely hallucinogenic, forensically detailed portrait of American family life, and the struggle of its characters to remain human amid the blizzard of consumer alienation. Despite the rave reviews and bestseller status, it is perhaps a little early yet to know if Franzen has succeeded in his project of engagement; but there is no doubt he has struck a nerve.
None too surprisingly, The New Republic took Franzen to task for his epic yet atomized scope. Observing the influence of the novelist Don DeLillo on the younger Franzen, the critic James Wood made a piercing summation of the senior writer's impact on The Corrections: "The DeLillo notion of the novelist as a kind of Frankfurt School entertainer, fighting the culture with dialectical devilry, has been woefully influential, and will take some time to die." Noting that Franzen imagined "a correction of DeLillo in favor of the human," Wood went on to say that this was "more than welcome, it is an urgent task of contemporary American fiction, whose characteristic products are books of great self-consciousness with no selves in them; curiously arrested books that know a thousand different things--the recipe for the best Indonesian fish curry! the sonics of the trombone! the drug market in Detroit! the history of strip cartoons!--but do not know a single human being."
It's clear that Wood--one of America's finest literary critics--finally favors something of Franzen's humanity but resents the occult unease beneath DeLillo's crowded linguistic responses to consumer capitalism and how he applies that language to create a surreptitious and infecting despair, a deep, flamboyant coldness. There is also a vague feeling from Wood that DeLillo is somehow evil, a monster of hidden tones, corrupting America from within. He is certainly appalled by a DeLillo essay that appeared in the New York Times, "The Power of History," wherein the novelist declared, "At its root level, fiction is a kind of religious fanaticism, with elements of obsession, superstition, and awe. Such qualities will sooner or later state their adversarial relationship with history."
How strange those words from 1997 read now, post-September 11. In the buildup to this statement DeLillo had defined the modern novelist as a radical and an outsider to all systems: political, social, linguistic. "Fiction will always examine the small anonymous corners of human experience," he wrote.
But there is also the magnetic force of public events and the people behind them. There is something in the novel itself, its size, its openness to strong social themes that suggests a matching of odd-couple appetites--the solitary writer and the public figure at the teeming center of events. The writer wants to see inside the human works, down to dreams and routine rambling thoughts, in order to locate the neural strands that link him to men and women who shape history. Genius, ruthlessness, military mastery, eloquent self-sacrifice--the coin of actual seething lives.
Against the force of history, so powerful, visible and real, the novelist poses the idiosyncratic self. Here it is, sly, mazed, mercurial, scared half-crazy. It is also free and undivided, the only thing that can match the enormous dimensions of social reality.
This is a nihilistic view, divorcing itself from history's involving tug or becoming perhaps a perversion of it. DeLillo might argue that such perversions are simply the logical result of a "social individual" in the Information Age. A kind of endgame--alienated, yes, but lit with negative protest nonetheless.
Franzen identified this problem similarly in his "Perchance to Dream" essay as the way "privacy is exactly what the American Century has tended toward. First there was mass suburbanization, then the perfection of at-home entertainment, and finally the creation of virtual communities whose most striking feature is that interaction within them is entirely optional--terminable the instant the experience ceases to gratify the user."
The collapse of the myth of the Internet as a democratizing force in news and information, as a glue for a new public consciousness, is part of this great feeling of disaffection and disconnection. While it remains a counterculture organizing ground for assorted global protest groups, it is not quite the democratic free-for-all it was once hoped to be. Meanwhile, clichés like "the New New Journalism" and "the Way New Journalism," which try to give countercultural weight to new forms of Internet journalism, have fallen fast to the reality that major news corporations are maintaining their centrality and indeed expanding it, seeking international print and electronic monopolies over freelance writers in a manner that all but strangles them out of the mainstream system. Add to this a babble of impotent, even crazed voices, and you have confusion, not liberty, shouting to be heard outside the corporate gates.
Where New Journalism once challenged a homogeneity of opinion, even one of its most extreme practitioners, Hunter S. Thompson, the godfather of "Gonzo," finds a heterogeneity on the Net so repulsive he can't bear it. As he put it, "There is a line somewhere between democratizing journalism and every man a journalist. You can't really believe what you read in the papers anyway, but at least there is some spectrum of reliability. Maybe it's becoming like the TV talk shows or the tabloids where anything's acceptable as long as it's interesting."
The language of the Net itself is affecting new books and the audiences who might be reading them. Figures like Dave Eggers in A Heartbreaking Work of Staggering Genius are also partly a literary byproduct of chat rooms and websites, and use an eclipsed language more heady and conversational at once, and therefore "young." Overall, though, one senses an impatience at root in Net culture, a desperation for sensation and the moment that does not feed itself into writing or reading books. In that regard the seething quality of the public consciousness, its near-madness, is really what the Net comes to represent--and with it a deep loneliness, a frenzy masked as social activity. Novelists like DeLillo, Franzen, Eggers, David Foster Wallace and Rick Moody order that sea of thought, but also manifest its rabidity and pointless depths, indexing it to the furies and absurdities of consumer culture. To steal a line from Marshall McLuhan, "Some like it cold."
A critic like Wood finds this sprawling ambition depressing. You might recall his lament that "It is now customary to read 700-page novels, to spend hours and hours within a fictional world, without experiencing anything really affecting, sublime, or beautiful.... This is partly because some of the more impressive novelistic minds of our age do not think that language and the representation of consciousness are the novelist's quarries anymore." It could be argued that just when New Journalism was pushing its way into literature's representational culture, the more talented novelists were moving out to the fringes of consciousness, to places "nonfiction narrative" could not reach. So much so that Tom Wolfe himself eventually berated modern American novelists for their abstractions and lack of research in his own essay manifesto, "Stalking the Billion-Footed Beast." Wolfe espoused a return to the qualities of naturalism, citing Emile Zola and, of course, the importance of the "novelist as reporter." (Since Wolfe had recently written The Bonfire of the Vanities, his screed was seen in many quarters as bald self-promotion.)
Wolfe's views are not so far away from those of B.R. Myers. Wolfe's zippy writing style may have sung with pop culture verve, but he has always been a conservative at heart, as his rigid championing of realism, or "documentation," as he prefers to call it, shows. Aside from a stoush with Mailer over A Man in Full, Wolfe has also argued with John Updike and John Irving, the latter describing all of Wolfe's novels as nothing but "yak" and "journalistic hyperbole described as fiction." Wolfe responded to them all in an essay called "My Three Stooges" (it can be found in his latest collection, Hooking Up), accusing Mailer, Updike and Irving of having "wasted their careers by not engaging in the life around them...turning their backs on the rich material of an amazing country at a fabulous moment in history."
In Wolfe's final opinion, the American social novel is suffering not from "obsolescence" but from "anorexia." For all its force of actuality, though, Franzen's sickened density in The Corrections is quite a different creature from Wolfe's idea of what a social novel should be. It doesn't just observe or document; it palpitates, realistically, with the surreal excess Wolfe once identified as an indulgence. And in a strange way, perhaps, it softens the blows of DeLillo, tries to put us back together again without hiding the cracks.
Now, however, a new era of unvarnished reporting and the dogmatism of style that underlies it appears to be dawning. September 11 is fuel to this conservative fire. The world has become so unsteady, the argument runs, that we have to get back to our roots, find the lines that moor us safely to what and where we are. Plasticity of language, tangential and subjective reporting, work that emphasizes a fractured or restless view of the world--these must be stopped. Examples abound.
There can be little argument that September 11 has sent everyone into the spin of re-evaluation. But writers have always had to wrestle with such extreme moments, monstrous acts that threaten to annihilate us, spiritually if not actually. Where is the sense in it? How do we become human again, rather than vengeful, blind with loss or hate? One danger for literary journalism, of course, is that it threatens to aestheticize the experience of an event like September 11. The same may well be true of writing about Hiroshima, the Holocaust, even something as basic as a brutal, anonymous murder. Straight journalism must negotiate the obverse dangers, the tendency to reduce everything to the details, an impartiality that becomes desensitizing and objective to the point of emotional irrelevance.
The proof of value must finally lie within the words themselves. And for all of Beinart's criticisms of unnecessary poetics and dubious metaphors, the literary fraternity and journalists of literary inclination still gave us much to be grateful for. What that may mean in terms of novels and a broader state of mind to come is still too early to tell; but his and Myers's demand for a retreat from the frontiers of ambiguity, from wordplay and a tensile language that the likes of Don DeLillo tease into something conscious and unsettling within us is, well, a backward step. It may be awful to say it, but the obsession with information that underlies the work of DeLillo, Franzen and others could still be capturing the real and enduring trauma for society, way beyond the immediate horror of September 11 and its psychic impact.
I have to note that the English novelist Ian McEwan's dark and cool eloquence in The Guardian--his interrogation of the images and our action-replay absorption in them, our nauseating lust for news--was of the first order, as both literary essay and as a moral inquiry between self and the society of spectacle. Factual journalism alone can't easily create that kind of recognition. In Vanity Fair, the novelist Toni Morrison's address to the dead was the finest elegy I read from anyone, anywhere, with her bruising admission of "nothing to give...except this gesture, this thread between your humanity and mine." Yes, facts can make us grieve, too, but there are times when we also need the obscure magic of poetry to heal us.
Even the issue of The New Yorker so maligned by Beinart was filled with great literary journalism. The one exception to the form was an essay--nonreported, nonnarrative, political, historical, analytical--by Susan Sontag, a piece strangely overlooked by Beinart in his comments, given the new aesthetic world order he perceives. Sontag questioned the proposition of national innocence, and how that outlook refuses some of the baggage of responsibility America has to bear for its foreign policy. It was easier to misunderstand, simplify and demonize her arguments than to take on board the questions she was asking, even the sober ones.
September 11 did do something to the imagination, did go beyond words. It was a profound blow to the spirit. In all the realms of journalism and analysis since then, some have spoken well, some haven't. Some, most interesting of all, have evoked confusion and mixed feelings, and longed for the light of understanding. The clamor to speak has itself become a problem, a moral dilemma that reflects the media's sickening habit of overproduction, its sheer commerciality.
After any death, any tragedy, there is an inevitable level of sobriety and reserve, yes. If that leads to better journalism, better writing, better books, how wonderful for us all. But the argument that literary responses and literary journalism are somehow not up to the task, that reflection and rebuilding the public consciousness should be left to the practitioners of conventional bricks-and-mortar journalism and old-fashioned storytellers who know their rules of grammar, is far from convincing. To do the job fully we need a little soul and poetry, a little shaking up too. Literary journalism and great, radically written novels are more than able to fill that gap. And perhaps raise a few questions as well in that world between the imagined and the real where nightmares--and dreams--are born.