We shall see very little of the charmingly simian George W. Bush. The military--Cheney, Powell et al.--will be calling the tune, and the whole nation will be on constant alert, for, as James Baker has already warned us, Terrorism is everywhere on the march. We cannot be too vigilant.
The Supreme Court decision effectively handing the presidency to George W. Bush reveals the intensely partisan nature of the Court's current majority. The Court, to be sure, has always been political, but rarely as blatantly as today. Nor are there many precedents for Justices trampling on their own previous convictions to reach a predetermined conclusion.
Chief Justice Roger Taney enlisted the aid of President-elect James Buchanan in persuading Northern Justices to join the pro-slavery majority in Dred Scott. Franklin Roosevelt conferred regularly with Justice Louis Brandeis, and Justice Abe Fortas served as a trusted political adviser of Lyndon Johnson. But never has there been a public statement as partisan as Antonin Scalia's declaration, in first suspending the recounts, that the Court needed to insure "public acceptance" of a Bush presidency.
If there is a silver lining, it is that the last month suggests an agenda for democratic reform. First, the Electoral College should be abolished. The product of an entirely different political era, when the electorate excluded women, nonwhites and propertyless males, the Electoral College was created by a generation fearful of democracy. Its aim was to place the choice of President in the hands of each state's most prominent men, not the voters. It unfairly enhances the power of the least populous states and can produce the current spectacle of a candidate receiving a majority of the votes but losing the election. At the very least, electors should be chosen in proportion to the popular vote in each state.
Second, the Florida fiasco should lead to the reform of voting procedures. As with schools, roads and public services, the wealthiest districts have the best system of voting. The machines used in poor black precincts of Florida, the Miami Herald demonstrated, are so flawed that they are guaranteed to produce a larger number of spoiled or uncounted ballots than the machines used in affluent suburban areas.
One can only view with deep cynicism the Court majority's invocation of "equal protection" in rejecting a recount. Added to the Constitution in the Fourteenth Amendment after the Civil War, this language was intended to protect former slaves from discriminatory state actions and to establish the principle that citizens' rights are uniform throughout the nation. The current Court's concept of equal protection has essentially boiled down to supporting white plaintiffs who claim to be disadvantaged by affirmative action.
Nonetheless, by extending the issue of equal protection to the casting and counting of votes, the Court has opened the door to challenging our highly inequitable system of voting. Claims of unequal treatment by voters in poorer districts are not likely to receive a sympathetic hearing from the current majority. But Bush v. Gore may galvanize demands for genuine equality of participation in the democratic process that legislatures and a future Court may view sympathetically.
Equally difficult to accept at face value is the majority's disdain for the principle of federalism these very Justices have trumpeted for the past several years. Like the South before the Civil War, which believed in states' rights but demanded a fugitive-slave law that overrode the North's judicial and police machinery, today's majority seems to view constitutional principles as remarkably malleable when powerful interests are at stake.
The next time this Court turns down an appeal by a death-row inmate on the grounds that federalism requires it to respect local judicial procedures, the condemned plaintiff may well wonder why his claims do not merit the same consideration as those of the Republican candidate for President.
Eileen Myles's new novel, Cool for You, is much more a writing-out of female madness than a book about it. Framed around the author's search for the medical records of her grandmother, who spent the last years of her life in a state mental institution, Cool for You is about the institutionalized life in general. Though she begins with a description of the sanctioned squalor of the state asylum, really Myles is looking at the big picture: the processing of people into grades and schools and genders, cliques and classes. Like the writing of the late Kathy Acker, Cool for You is a kind of fragmented autobiography. Both Acker and Myles write adventure books in which their lived experience becomes the engine, not the object, of a narrative. Both present an "I" as large as the narrators of Heart of Darkness or Tropic of Cancer, although in female hands, the use of "I" is often misconstrued as memoir. Like Acker, Myles values the most intimate and "shameful" details of her life not for what they tell her about herself but for what they tell us about the culture. In this sense, Cool for You makes the classic Female Madness Tale, from Sylvia Plath's The Bell Jar through Susanna Kaysen's Girl, Interrupted, look like a kind of psychic liberalism.
Unlike Plath and Kaysen, and dozens of practitioners in between, Myles has no particular belief in the possibility of a fully integrated female self. She doesn't think her experience will be redeemed. The circumstances of Myles's life--she is the daughter of a Polish secretary and an alcoholic Irish mail carrier in class-riddled Boston--are no more dire than those of millions who daily feel the disparity between their own lives and the surfaces of upper-middle-class life that are projected blandly on TV and intricately probed in most contemporary literary fiction. What's harrowing is the detail in which this disparity is experienced and recorded.
Nellie Reardon Myles entered the Westborough State Hospital at the age of 60. Her complaint: "I don't feel well." She was a refugee of the Irish potato famine who'd cleaned houses all her life in Boston and given birth to seven children. Appetite: normal. Sleep: normal. Speech: normal. Nellie was stricken with grief over the death of her daughter, Helen. The color of her urine is fully documented over the fifteen years she spent before her death at Westborough. Teeth missing: thirty-two. Economic condition upon her entry: marginal. Her mental state: sometimes resentful. What Myles remembers most are the Sunday outings of her family to the asylum: "Dad went inside. My mother stayed out with us and the camera. Nellie is led out with great aplomb. The queen mother. The camera clicks.... It was our Buckingham Palace."
It's fitting that Cool for You begins with a quote from the Modernist hero Antonin Artaud. Just as Artaud's experience as a wartime inmate of the Rodez asylum became a launching pad and paradigm for his rage against the military-corporate forces that were then gathering toward a new postwar order, Myles reads the cursory entries on her grandmother's life at Westborough State Hospital, where she was warehoused by the State of Massachusetts, as proof of something she already knew: The Poor Don't Matter.
The writing of both Myles and Acker is dependent on a great belief in myth, the conduit through which we may experience the Modernist passion to be larger than oneself. To use a very public "I" to speak, as Myles has put it, "to her time..." But mythification doesn't happen much to female writing. We have great hagiographies through which to read the works of Jack Kerouac, Neal Cassady, Allen Ginsberg and William Burroughs, but in the case of their contemporary, Diane di Prima, the twenty books she's published must suffice. Criticism also helps create a myth around the lives of certain male contemporary fiction writers. Girls in my writing class refer to the author of A Heartbreaking Work of Staggering Genius as "Dave," as if they knew Eggers, and memorize his interview remarks as if they were late-night phone confessions. Female myth, it seems, is something much more self-created.
Myles and Acker have both succeeded in bringing difficult work that goes against the grain of contemporary commercial narrative to wider audiences through the sheer willingness to cultivate and engage with myth. Acker hit large in the United States and England following Grove's rapid publication of her books in the mid-1980s. She knew the game and cultivated straight-girl celebrity with a vengeance: sex and motorbikes, tattoos, black leather. Acker Junkie, screamed the headline of her review in The Independent. She could be seen at 10 am hailing taxis on Third Avenue in full Punk Priestess regalia whenever heading uptown to meet her agent. By 1995, she knew myth inside out. "The kathy acker that you want...," she wrote to a friend in Australia, "another mickey mouse, you probably know her better than I do. It's media, it's not me. Like almost all the people I know, and certainly all the people I'm closest to, all of whom are 'culture-makers' and so-called successful ones...our only survival card is fame.... We're rats walking tightropes we thought never existed. Oh sure, we all look good while traveling. We're good at media images."
Myles, who isn't straight and is best known as a poet, approaches myth from a different angle. Since the publication of her first book, The Irony of the Leash, in 1982, she's been offering audiences fleshy, candid slices of her consciousness and life. A friend and apprentice of the late James Schuyler, Myles writes in a style that is deceptively immediate and conversational, giddily expressing a huge range of speculative thought. She arrived in New York City in 1974, a working-class butch lesbian from Boston, and adapted the literate candor of New York School Poetry to her needs. Her very presence at that time and place was perceived as confrontational, and it was a challenge she accepted. In 1992 Myles ran as a write-in candidate for President in eleven states, memorizing her poems and delivering them like stump speeches. In "An American Poem," she poses as a Kennedy and implores her listeners:
Shouldn't we all be Kennedys?
This nation's greatest city's
home of the business-
man and home of the
rich artist. People with
beautiful teeth who are not
on the streets. What shall
we do about this dilemma?...
Like Acker, Myles uses "autobiographical" material, but her deployment of it is more revelatory, less strategically conceptual. In Cool for You, Myles's first published novel, she sees much of her own life in tandem with her grandmother's madness. "It seems people go nuts," she writes, "from a number of things," and then proceeds to tell us what. The trajectory of a lost, dissatisfied working-class girl who wants to be a boy is necessarily less insulated, more wide open to a scary form of chance than that of the Harvard Blessed, whose lives she naïvely tries to emulate. She takes a job at the Harvard Coop and gains twenty pounds stealing expensive candy bars while marveling at her co-worker, a girl from Beaver Country Day School who took time off from school to work a little job. "All these people had a certain colored skin, kind of golden peachy and expensive. It was leisure skin." Meanwhile, she was getting pimples. She attends the University of Massachusetts, Boston, imagining "images of the past--college, some bunch of bright young people in sweaters dashing up the steps to their astronomy class," only to find that "it was not school. There was no campus." She commutes on a string of suburban trains and buses to her classes and sits with her fellow students at a seedy coffee shop called Patsio's, as close as U Mass got to an off-campus hangout:
We would sit...and drink our bleary morning coffee and see the first street people we had ever laid eyes on. An old woman pulled up her skirt for us and showed us her bald old pussy. We were going to school. There was an Irish bar around the corner where we'd go after jazz class and smell stale beer and a trio would play there on Friday afternoons, a really old man and a really old woman and some third thing, I can't remember, but I know it was a trio. They were so drunk the music was incredibly bad...and one afternoon they weren't there because one of them had died.... This could not be college.
She knows she's lost. She feels the future opening up into the present and looping back again; she sees a girl dancing to the Doors and it is Jim Morrison's voice that keeps repeating in her head as if the voice were hers, and she wants to be the one to take the dancing girl on a ride into a parallel universe. Like Sade's Justine, Myles has many picaresque adventures. She quits her taxi-driving job and starts working as a nurse-assistant at The Fernald School after a chance conversation with a fare. The Fernald School is an institution for retarded men, and there she finds three classes: the institutionalized men; the staff, consisting of "the slightly educated well-meaning down and out confused," like her; and the Harvard-trained behavior-modification therapists, who rarely venture out into the wards but devise a program in which the staff pass out handfuls of M&Ms to reward appropriate behavior. The Fernald School is as dead-end an institution as any Myles encounters. She recalls: "All around us was the subtle feeling of a campaign for self-improvement. If we were daily...improving these men's capacity to live 'normally' then what could the therapy do for paragons of intelligence like ourselves. When the buzzer went off we would hug each other for not smoking."
She saves up; she travels to the West. She remembers blueness and the perfection of the air and mountains and working lots of different jobs. She wants to be James Joyce, get rid of everything and write, but then there's nothing to hold on to. She starts a book but can't get past line one, about gerbils running around a cage. At night she hears a million voices. The only thing that held her still was taste, and she kept thinking if she could taste the right thing then she would have something to hold on to. "The day was some runs that I knew with my mouth." One time in the park she floats, and realizes she's not anyone or thing. "I was not connected...not in at all. Not outside either. It wasn't like a movie."
For Myles, madness is not exactly something to be overcome. It is a permanent state, because it is a correlate of the female struggle against poverty. Madness isn't ever isolated from the dead-end jobs, the crummy schools, the institutionalized future that awaits the unconnected. Therefore, madness is something richer, darker, more inevitable than a way station on an affluent, rebellious girl's journey to success. In one of the book's most terrifying scenes, 14-year-old Eileen is working part time in her neighborhood at a nursing home. Delivering trays one night, she gets a glimpse of a familiar body, a woman she'd known as Mrs. Beatty. Seven-year-old Eileen had known the same Mrs. Beatty as the most elegant lodger at her friend Lorraine's mother's boardinghouse. She was a large woman with chestnut hair, joyful, with an air of sophistication, who wore hats with veils. But now she's naked, no longer wrapped in an elaborate fox-fur coat, and she's being lifted off the potty by a nurse and she's not a person anymore, she is a smelly shapeless thing. "She turned or I saw her face and there was nothing in it. She was gone.... I wanted it to be someone else so I wouldn't have to have seen what I saw. This is Mrs. Beatty, said the nurse, disgusted."
Cool for You is a difficult, painful book to read. It is a construction of identity that's truly public, absorbent of the lives of others. With the audacity of Henry Miller, without the protection of his bravado, Myles lets the voice of poverty-madness-shame speak through her and proves the past is never operable.
So it's ending the way it was always going to, with George Bush in the White House. As the Last Marxist said all along, he had the most money, so he should win. True, in order for this to happen, Americans had to see laid bare elements in our system that are usually carefully hidden: You probably didn't know that the election process is controlled by party and campaign officials, that the machine you voted on may well not record your ballot, that in a dozen ingenious ways the black vote is still being blatantly suppressed, that people like Katherine Harris even exist. It is entirely fitting that Chief Justice Rehnquist, who voted with the conservative five-member majority to halt the counting of the Florida vote, intimidated black and Hispanic voters himself as an Arizona pollwatcher in the early 1960s. And given the general air of cronyism and corruption and self-interest that surrounds the whole Florida business, it's fitting too that Clarence Thomas's wife is working at the right-wing Heritage Foundation, where she is collecting résumés for openings in the Bush Administration, and that two of Scalia's sons are employed by law firms representing Bush. Now, having helped run out the clock, the five regret that time's up.
Who knows what might have been if Al Gore had encouraged and joined, rather than avoided and thwarted, popular outrage at black voter suppression in Florida? We might have seen racially integrated civil rights marches to rival those of the 1960s, and over the same issue too. We may see them yet. If not--if we let the media bury the Florida outrages in a blizzard of blather about the need to "move on" and "come together"--we will truly have the government we deserve.
President Clinton, at least, has finally achieved the perfect political moment in which to enact the liberal agenda many readers of this magazine believe he has always secretly supported. It's taken eight years, but at last he isn't running for re-election, the Gore campaign is over, the Republicans have done their worst. If you believe Ralph Nader's 2.7 percent of the popular vote represented a message from the restive left, there's nothing now preventing Clinton from opening the envelope and reading it.
Many people cited their opposition to capital punishment as one reason they were voting for Nader. Now along comes Clinton's chance to show he feels their pain by commuting the sentence of Juan Raul Garza, the triple murderer scheduled to be the first person executed under the federal "drug kingpin" death penalty statute, and the first federal execution, period, since 1963. So did the President do not just the right thing but the costless and perhaps even politically astute thing? No. Professing himself troubled by a Justice Department review noting that 80 percent of federal death-row inmates are people of color (but not troubled enough to let them live), he granted Garza a six-month stay of execution, leaving the decision to his successor--the necrophiliac George Bush, who would cheerfully give the lethal injection himself.
Here's another test for Clinton: One of the surprises of the recent election was the victory of anti-drug war referendums in California, Colorado, Nevada, Oregon and Utah. Now 650 religious leaders, members of the Coalition for Jubilee Clemency (www.cjpf.org), have appealed to Clinton to offer a presidential pardon to some of the thousands of Americans serving long mandatory sentences for minor drug offenses, allowing him to undo a small amount of the immense damage of the war on drugs, a cause he has relentlessly, and to my mind mysteriously, promoted. The President does seem to wish to dissociate himself from his own drug policies: In his Rolling Stone interview, he decries the "unconscionable" sentencing disparity between crack and powdered cocaine and claims he tried to equalize it but was thwarted by a Republican Congress that wanted only to narrow it. (Actually, it was he who in 1997 proposed reducing the disparity from 100-to-1 to 10-to-1 while resisting calls to eliminate it.) He even calls for restoring voting rights to all ex-convicts and for a "re-examination of our entire policy on imprisonment."
Well, better late than never. Harsh drug laws and racial bias at every step from arrest to sentencing have done immeasurable harm to poor and minority communities and to unlucky people of all sorts. Imprisonment of minor offenders has destroyed families, left children without parents, spurred a sevenfold increase in the number of women in prison between 1980 and 1997 and is a major reason that the United States now has the world's second-highest rate of incarceration. Leaving aside the horrors of prison itself, a felony conviction means probable unemployment and in thirteen states, as we now know all too well, loss of the right to vote. It isn't often that a person, even a President, gets to wave a magic wand and rectify a mass injustice. Will the President avail himself of this astonishing moral and political opportunity? I'm not holding my breath. For one thing, he's preoccupied, mulling a pardon for junk-bond king Michael Milken, who made off with billions, for which he spent twenty-two months in jail--good thing he didn't try stealing a slice of pizza in California, or he might be doing life! If pardoned, the New York Times noted, Milken would be able to vote again. If the 31 percent of black men banned as convicted felons from voting in Florida had received the same treatment, even Jeb Bush and Katherine Harris might have had trouble suppressing enough votes to deliver the White House to reactionary nincompoop George W.
* * *
Too busy watching CNN to do your Christmas shopping? This year give everyone on your list the same wonderful benefit CD. Octaves Beyond Silence features selections by Ani DiFranco, the Indigo Girls, Me'shell Ndegeocello, BETTY, Eve Ensler, Rachel Z, Virginia Mayhew, Allison Miller and other fabulous performers. The brainchild of 24-year-old Inger Brinck, Octaves Beyond Silence will raise funds for six great organizations fighting violence against women: in Afghanistan (RAWA, which I wrote about in this space last spring), Rwanda, Croatia, Bosnia and the United States. You can order it through Ladyslipper Music at (800) 634-6044 or at www.octavesbeyondsilence.com.
Wait until next season. I've already started practicing my
chad-punching, and I suggest the same as therapy for all who feel ripped
off by the collusion between the US Supreme Court's right-wing
ideologues and George W. Bush's lawyers to prevent an accurate Florida
vote count. The electoral process will survive and Bush may even learn to
do the job, but the price of his victory is the court's denigration.
It took a non-ideological Republican appointee, a near-extinct breed
in the GOP, to puncture the outrageous hypocrisy of the Antonin
Scalia-led majority that defined a fair recount by the singular standard
that would leave Bush the winner.
In his dissent, John Paul Stevens wrote the indelible postscript to
this judicial farce: "Although we may never know with complete certainty
the identity of the winner of this year's presidential election, the
identity of the loser is perfectly clear. It is the nation's confidence
in the judge as an impartial guardian of the rule of law."
The so-called court conservatives simply had no sense of shame or even
proportion. Think of the conflicts of interest we learned about only in
the last few days: Clarence Thomas's wife is helping the conservative
Heritage Foundation recruit workers for a Bush administration, and Scalia
has two sons associated with key law firms representing Bush--one a
partner of Theodore Olson, who argued Bush's case before the high court.
It's also common knowledge that Chief Justice William H. Rehnquist and
Justice Sandra Day O'Connor indicated a desire to retire, but only if
Bush won and could replace them. In that event, Bush would likely appoint
Scalia as chief justice.
Common decency, let alone judicial integrity, should have left the
court's majority more hesitant in acting as agent for selecting the next
President. Instead of taking the high road and leaving the matter where
it belonged with the Florida Supreme Court--according to the federal high
court's own oft-avowed states' rights precepts--Scalia and company
insisted on halting the recount. Why? Because there wasn't time to do it
right. But whose fault was that? Bush's and the US Supreme Court's.
Had the statewide count of disputed ballots been allowed to fairly
conclude, it would have shored up our next President's legitimacy. If
Bush had won the electoral vote after a fair count in Florida, it would
have taken the sting out of his ascending to the presidency despite
losing the national popular vote.
The US Supreme Court's heavy-handed intrusion was as destructive of
confidence in our political system as it was unnecessary. As Justice
Stephen G. Breyer wrote in dissent, the majority ruling represented "a
self-inflicted wound--a wound that may harm not just the court but the
nation."
Never again will a President's appointment of a federal judge be
viewed by the public--and more important the Senate, which must confirm
it--as a neutral, nonpolitical act. Recall that even such hard-line
ideologues as Justices Thomas and Scalia were confirmed with votes from
Democratic senators who thought it important to give the President the
benefit of the doubt. Next time anyone of discernible ideological bias is
nominated, there will be unprecedented senatorial gridlock. For that
reason, the real test of the Bush presidency will be his appointments to
the federal courts.
It is the same test faced by his father: Will they be true moderates,
such as Justice David H. Souter, a man capable of complex legal thought,
or another Thomas, whose most sentient act is to look to Scalia, then
vote? What a sad comment that the man who replaced Thurgood Marshall as
the only African-American on the court should now, in helping to block
the recount, so brazenly mock Marshall's lifelong crusade to insure the
sanctity of the black vote.
In any event, the court has handed the nation George Bush as
President, and we can live with that and even entertain hopes that he
will rise to the occasion, despite an obvious lack of preparation. Deep
down, if one can presume such a thing, he seems a decent sort. If he just
keeps in mind that most of the voters rejected him, he might resist Tom
DeLay's ultra-rightists in the House and pursue a moderate legislative
course. In any case, now that Joseph Lieberman will retain his seat, the
Senate will be evenly divided, and centrists of both parties will be
calling the shots.
But what we cannot live with is an even more politicized judiciary
dominated by right-wing ideologues. The GOP's far right will want strong
proof that its aggressive campaigning for Bush is rewarded, and its prime
goal is complete control of the federal judiciary, which is why Senate
Republicans blocked scores of Bill Clinton's judicial appointments.
However, if Bush attempts to reward his rabidly conservative backers by
placing their favorites in high positions in the federal judiciary, he
will tear this country apart. And next time, his opponent's chads will be
punched so forcefully that even the Supreme Court won't be able to save
him.
We know what they're afraid of. Cut through the Republican verbiage
that has clogged the airwaves and courts and you find one simple but
disturbing point: They fear an accurate vote count because it might prove
that Al Gore has the votes to be fairly elected President.
That's been their concern since election night, when they began their
drawn-out process of obstruction, and if they succeed in once again
killing the manual count through their US Supreme Court appeal, George
W. Bush's victory will stand as a low point in the annals of American
democracy.
The indelible impression left on our history will be that Gore won
both the popular and electoral vote and that he and the voters were
cheated out of that victory by a US Supreme Court dominated by
political ideologues appointed by Republican Presidents. If the Justices
cared a whit about the sanctity of the vote, they would have let the
manual-counting process decreed Friday by the Florida Supreme Court
continue. If that had resulted in a Bush win, we should all have
gracefully acknowledged his victory.
Bush, who lost by more than 330,000 in the popular vote--what most of
us grew up thinking of as the real election--may now squeak by with an
electoral college win resulting from a ruling by the right-wing-led US
Supreme Court. During the campaign, Bush cited Antonin Scalia and
Clarence Thomas as his judicial role models, and he has been amply
rewarded. Legal gobbledygook has replaced reason when the mere act of
fairly counting the votes of the citizens is halted to suit the political
agenda of the party that appointed the majority of the Justices.
In a close election, a manual count of all votes not counted by the
antiquated voting machines is a statutory mandate in many states,
including Florida and Texas, and should have been the common-sense demand
of both candidates in Florida. If that simple standard--accurately and
fairly counting all of the votes to ascertain the intent of each
voter--had been asserted in a bipartisan manner, there would have been no
reason for the subsequent confusion and the never-to-end questioning of
the legitimacy of our next President.
Instead, unprecedented rancor will mark the next years of our
politics, mocking all efforts at bipartisan cooperation. This will be
particularly true in battles over the judiciary, which, more than ever,
will come to be viewed widely as a partisan tool.
The Florida election will always be too close to call in a manner that
would leave partisans of both sides totally satisfied. Whoever loses will
feel ripped off, but the denigration of the Florida Supreme Court and of
Gore's legal challenges by top Bush Republican spokesperson James Baker
has gone too far. Twice now he has smeared the motives of Florida Supreme
Court justices for daring to come to conclusions not to Baker's liking.
Yet he reached a new low Friday in disparaging the right of a
presidential candidate--who has won the national popular vote and is only
three electoral votes from victory--to ask for a judicial review of the
obviously deeply flawed Florida election results.
Get real. Both Baker and Bush know they would do the same had the
results gone the other way. Yet they self-righteously abandoned civility
when the nation most needed it. There are no villains in this election,
only imperfect machines and people, but the Bush camp has vilified the
Gore camp for daring to seek a fair adjudication of such matters.
We are still a nation of laws, and it was unconscionable for Baker to
blast Gore for appealing to the Florida state high court at the very time
Bush's lawyers raced to the federal courts in an unseemly departure from
the GOP's commitment to states' rights. In Baker's view, the problem is
not that we have a razor-close election and flawed voting procedures, but
rather that Gore dares to assert his legal rights: "This is what happens
when, for the first time in modern history, a candidate resorts to
lawsuits to overturn the outcome of an election for President. It is very
sad. It is sad for Florida. It is sad for the nation, and it is sad for
our democracy."
Hogwash! What is sad is that tens of thousands of African-American and
Jewish voters in Florida were systematically denied their right to vote
by poorly drawn ballots, malfunctioning voting machines and unhelpful
voting officials. What is sad is that election officials in two counties
turned over flawed Republican absentee ballot applications for
corrections by Republican Party officials but did no such favors for
Democrats.
What would be most sad--indeed, alarming--is if a partisan US
Supreme Court proves to be an enemy of representative democracy.
All I want is the truth. Just gimme some truth.
Florida's electoral mishegoss lends itself to the exploration of an issue that receives no attention in the media and yet underlies virtually everything its members do. I speak to you, dear reader, of the Meaning of Truth.
Ever since Fox's John Ellis began the mistaken media stampede for his cousin George W. Bush's victory on election night, reporters, producers and executives have spun themselves silly trying to describe a situation that is ultimately an epistemological bottomless pit. There is no single "truth" about who won Florida. From the point of view of "institutional truth," we began without clear rules or precedents for measuring the vote, whether they include dimple-counting, partially punched chads or butterfly ballots. I am convinced Gore carried the will of the people, but I'm guessing that Lady Katherine Harris Macbeth would rather contract rabies than accept my admittedly subjective interpretation. From the perspective of "brute truth," however, the difference between the Bush/Gore numbers turns out to be so small that it will never exceed the count's margin of error. What we are seeing, therefore, is not a process of objective measurement but a contest of raw power. The Democrats use the courts and the law. The Republicans rely on rent-a-mobs, partisan hacks and power-hungry allies in the state legislature and Congress. Guess which side is bound to win?
Our media coverage admits none of this, because it is committed to a fairy-tale version of truth and objectivity that separates "fact" and "opinion" but cannot fathom anything in between. When Tim Russert declared on November 26 that George Bush "has now been declared the official winner of the Florida election...and therefore he is the 43rd President of the United States," he was making a statement that could not have been true when he made it. (Even Bush understood that he was only playing a President-elect on TV.) But the feared and celebrated Russert knew that his words were bound by only the narrowest definition of "truth." He could always take it back later.
The attachment to the idea of attainable objective "truth" on the part of American journalism is partially responsible for its frequent brainlessness. As NYU's Jay Rosen points out, "objectivity as a theory of how to arrive at the truth is bankrupt intellectually.... Everything we've learned about the pursuit of truth tells us that in one way or another the knower is incorporated into the known." (Remember Heisenberg? Remember Einstein?) The famous 1920s debate between Walter Lippmann and John Dewey shed considerable light on this problem, with Lippmann arguing for a "spectator" theory of reality and Dewey arguing for a more consensual one, arrived at through discourse and debate.
The notion of a verifiable objective truth received what many intellectuals considered its final coffin nail in the form of Richard Rorty's classic 1979 work, Philosophy and the Mirror of Nature. While the word true may have absolute correlations in reality, Rorty later argued, "its conditions of application will always be relative." What was "true" in ancient Athens--that slavery and pederasty were positive goods--is hardly "true" to us today. As Rorty explains it, we call our beliefs "true" for the purposes of self-justification and little more. The point is not accuracy but pragmatism. Moreover, Ludwig Wittgenstein has taught us that the gulf between what "is" and the language we use to describe it is so large as to be unbridgeable. "Truth" may be out there, but there is no answer to a redescription, Rorty observes, "save a re-re-redescription." Truth is what works.
Now, it's possible to contest Rorty on any number of counts. I personally find him overly generous to the extreme relativism of antifoundationalists like Jacques Derrida and Michel Foucault. (The antifoundationalist perspective can be simplistically summarized by the famous Surrealist painting of a pipe by René Magritte beneath the words, Ceci n'est pas une pipe.) But the argument itself cannot be avoided. Truth, as Lippmann never understood but Dewey did, is a lot more complicated than a baseball box score or a Johnny Apple New York Times news analysis. What is needed to evaluate whether a report is ultimately credible is not an endless parade of "facts" that may or may not be true but a subjective marshaling of evidence. Yet because the entire media establishment treats these questions as just so much mental masturbation, the standard definition of "fact" often turns out to be any given statement that cannot be easily disproved at the moment it is made. Hence, we frequently see journalistic accounts of the mood of an entire country or even a whole continent based on little more than the taxi ride from the airport.
A second byproduct of American journalism's childish belief in attainable objective truth, Rosen notes, is the alienation it causes between journalists and intellectuals. In Europe the public profits from a two-way transmission belt between the world of ideas and that of reported "fact." But here such exchanges are nearly impossible because, as Rosen puts it, "intellectuals familiar with the currents in twentieth-century thought just can't deal with some of the things that come out of journalists' mouths." Such people, he notes, believe it "useless to try to talk with journalists" owing to their "naïve empiricism." Still, the academy is also at fault, owing to its recent retreat into a Derrida/Foucault-inspired debate that admits almost no reality at all outside the text and does not even pretend to speak intelligibly to the nonspecialist.
In any case, George W. Bush may be our next President. But it won't be because he outpolled Al Gore in Florida in any remotely objective sense. It will merely be because he might have, and we decided to call it "true."
* * *
Congratulations to Ralph Nader on George W. Bush's decision to appoint Andrew Card, formerly the auto industry's top antienvironmental lobbyist, to be his Chief of Staff. Just a few more appointments like this one, I suppose, and the revolution can begin in earnest.
Death came as a release for Daniel Singer on December 2, but we feel like protesting its rude intrusion. In one of the last things he wrote for us, a review of some books about Sartre, he quoted a friend's son who, asked on the day of the French philosopher's funeral where he had been, said he was coming "from the demo against the death of Sartre." We'd like to join a demo against the death of Daniel. Better, though, would be one celebrating the life of our valued colleague, The Nation's Europe correspondent for nearly twenty years.
He wrote about many a demo in his reports to us, incessantly probing for signs of vitality on the European left--or the rot of fascism on the far right. During the 1980s, as Reaganism and Thatcherism blanketed the Continent, he seemed, at times, one of the few remaining Marxists. A protégé of the great Marxist intellectual Isaac Deutscher, he held a steadfast faith in democratic socialism but not in any hard doctrinal way. Indeed, the book of his that prompted Victor Navasky to send associate editor Kai Bird to Paris in 1981 to talk to Daniel about writing regularly for us was The Road to Gdansk, a study of Solidarity, which he presciently celebrated as the first crack in the monolith of Soviet Communism and as an exemplar of his abiding faith: the power of working people to change the world.
When the neocon intellectuals of France, here and elsewhere, jumped aboard the funeral hearse of socialism, Daniel stood defiantly on the sidelines. He never modified his conviction that capitalism's injustices were as glaring after the wall fell as they were before. In his last book, Whose Millennium? Theirs or Ours? he ended with a ringing affirmation: "We are not here to tinker with the world, we are here to change it!"
We'll miss Daniel--his wisdom, his courtly kindness, his brilliance, the stubborn courage that carried him through, from his Polish boyhood before World War II when, as a self-styled "deserter from death," he narrowly escaped the Holocaust, until the end. Before he died, he sent readers the following message:
"These are the last words I shall write to The Nation. With my normal absence of modesty I believe that over the years I acquired a radical readership. Radical need not mean sure of itself; nor does it rule out compromises and calculations. But a 'Luxemburgist socialist' (the definition I like best) could not resign himself to the idea that with the technological genius at our disposal we are unable to build a different world. Nor can we accept the fashion that capitalism will vanish without a vast social movement from below.
"That something can happen does not mean that it will happen. I, for one, shall not see this world. Yet, I am departing with the feeling that on the whole I have followed the right road and even with a degree of confidence. Among my young interns, Carl Bromley and his companions, among the youthful fighters from Seattle to Seoul, one can detect a refusal of resignation. You must join them as they now begin to show the way."
On November 7, voters in Alabama erased from that state's Constitution a provision dating from 1901 that declared that "the legislature shall never pass any law to authorize or legalize any marriage between any white person and a Negro, or descendant of a Negro." This declaration represented in part a desire by white supremacists to express as fully as possible their intention to expunge the racially egalitarian symbols, hopes and reforms of Reconstruction. Although Alabama had never enacted a law expressly authorizing interracial marriage, in 1872 the state's Supreme Court did invalidate the law that prohibited such unions. But it promptly reversed itself in 1877 when white supremacists regained power. The Alabama Constitution's disapproval of interracial marriage, however, had still deeper roots. It stemmed from the presumption that white men had the authority to dictate whom, in racial terms, a person could and could not marry. It was also rooted in the belief that certain segments of the population were simply too degraded to be eligible as partners in marriage with whites. At one point or another, forty states prohibited marriage across racial lines. In all of them blacks were stigmatized as matrimonial untouchables. In several, "Mongolians" (people of Japanese or Chinese ancestry), "Malays" (Filipinos) and Native Americans were also placed beyond the pale of acceptability.
Rationales for barring interracial marriage are useful to consider, especially since some of them echo so resonantly justifications voiced today by defenders of prohibitions against same-sex marriage. One rationale for barring interracial marriages was that the progeny of such matches would be incapable of procreating. Another was that God did not intend for the races to mix. Another was that colored people, especially blacks, are irredeemably inferior to whites and pose a terrible risk of contamination. The Negrophobic Thomas Dixon spoke for many white supremacists when he warned in his novel The Leopard's Spots that "this Republic can have no future if racial lines are broken and its proud citizenry sinks to the level of a mongrel breed." A single drop of Negro blood, he maintained apocalyptically, "kinks the hair, flattens the nose, then the lip, puts out the light of intellect, and lights the fires of brutal passions."
Although opponents of prohibitions on interracial marriage have waged struggles in many forums (e.g., academia, the churches, journalism), two in particular have been decisive. One is the courtroom. In 1967 in the most aptly titled case in American history--Loving v. Virginia--the United States Supreme Court ruled that prohibitions against interracial marriage violated the equal protection and due process clauses of the Fourteenth Amendment. (Although much credit is lavished on the Court's decision, it bears noting that nineteen years earlier, in 1948, the Supreme Court of California had reached the same conclusion in an extraordinary, albeit neglected, opinion by Justice Roger Traynor.) When the federal Supreme Court struck down Jim Crow laws at the marriage altar, it relied on the massive change in public attitudes reflected and nourished by Brown v. Board of Education (1954), Martin Luther King Jr.'s "I Have A Dream" address (1963), the Civil Rights Act (1964) and the Voting Rights Act (1965). The Court also relied on the fact that by 1967, only sixteen states, in one region of the country, continued to retain laws prohibiting interracial marriage. This highlights the importance of the second major forum in which opponents of racial bars pressed their struggle: state legislatures. Between World War II and the Civil Rights Revolution, scores of state legislatures repealed bans against interracial marriage, thereby laying the moral, social and political groundwork for the Loving decision. Rarely will any court truly be a pioneer. Much more typically judges act in support of a development that is already well under way.
Unlike opponents of Brown v. Board of Education, antagonists of Loving were unable to mount anything like "massive resistance." They neither rioted, nor promulgated Congressional manifestoes condemning the Court, nor closed down marriage bureaus to prevent the desegregation of matrimony. There was, however, some opposition. In 1970, for example, a judge near Fort McClellan, Alabama, denied on racial grounds a marriage license to a white soldier and his black fiancée. This prompted a lawsuit initiated by the US Justice Department that led to the invalidation of Alabama's statute prohibiting interracial marriage. Yet the Alabama constitutional provision prohibiting the enactment of any law expressly authorizing black-white interracial marriage remained intact until the recent referendum.
That an expression of official opposition to interracial marriage remained a part of the Alabama Constitution for so long reflects the fear and loathing of black-white intimacy that remains a potent force in American culture. Sobering, too, was the closeness of the vote; 40 percent of the Alabama electorate voted against removing the obnoxious prohibition. Still, given the rootedness of segregation at the marriage altar, the ultimate outcome of the referendum should be applauded. The complete erasure of state-sponsored stigmatization of interracial marriage is an important achievement in our struggle for racial justice and harmony.