The third-year medical student held the intravenous catheter, poised to insert it into a patient's vein. Suddenly the patient asked, "Have you done this before?" As the student later recounted to me, a long period of silence fell upon the room. Finally, the student's supervising resident, who was also present, said, "Don't worry. If she misses, I'll do it." Apparently satisfied, the patient let the student proceed.
Breaking this type of uncomfortable silence is the goal of Complications: A Surgeon's Notes on an Imperfect Science by Atul Gawande, a surgical resident and a columnist on medicine for The New Yorker. As Gawande's collection of stories reveals, fallibility, mystery and uncertainty pervade modern medicine. Such issues, Gawande believes, should be discussed openly rather than behind the closed doors of hospital conference rooms.
Complications is surely well timed. In 2000, the Institute of Medicine published "To Err Is Human," a highly charged report claiming that as many as 98,000 Americans die annually as a result of medical mistakes. In the wake of this study, research into the problem of medical error has exploded and politicians, including then-President Bill Clinton, have proposed possible solutions. The message was clear: The silence that has too long characterized medical mistakes is no longer acceptable. Yet while Gawande's book provides great insights into the problem of medical error, it also demonstrates how there can be no quick fix.
What may be most remarkable about the recent obsession with medical error is just how old the problem is. For decades, sociologists have conducted studies on hospital wards, perceptively noting the pervasiveness of errors and the strategies of the medical profession for dealing with them. As sociologist Charles Bosk has shown, doctors have largely policed themselves, deciding what transgressions are significant and how those responsible should be reprimanded. Within the profession, then, there is much discussion. Yet the public was rarely told about operations that went wrong or medications that were given in error. Residents joining the medical fraternity quickly learned to keep quiet.
Indeed, when one of those young physicians decided to go public, he used a pseudonym, "Doctor X." In Intern, published in 1965, the author presented a diary of his internship year, replete with overworked residents, arrogant senior physicians and not a few medical errors. In one instance, a surgeon mistakenly tied off a woman's artery instead of her vein, leading to gangrene and eventual amputation of her leg. Doctor X pondered informing the woman about the error, wondering "just exactly where medical ethics come into a picture like this." But his colleagues convinced him to remain quiet.
One whistleblower willing to use his own name and that of his hospital, New York's Bellevue, was William Nolen. In The Making of a Surgeon, published in 1970, surgeons swagger around the hospital, making derisive comments about patients and flirting relentlessly with nurses. (Not the least of reasons for being nice to nurses was the expectation that they would help cover up young doctors' mistakes.) Interestingly, Nolen was subsequently excoriated both by surgeons, who believed he had betrayed the profession's secrets, and by the lay public, who felt he was celebrating the "callousness and prejudice" of surgeons toward vulnerable patients.
Perhaps the peak of this genre of scandalous tell-all accounts occurred in 1978, with the publication of The House of God, written by the pseudonymous Samuel Shem. Although fictional, the book draws on the author's raucous and racy experiences as a medical intern at Boston's Beth Israel Hospital. To Shem, medicine's whole approach to patient care was misguided. The book's hero, the Fat Man, teaches his trainees a vital lesson: "The delivery of medical care is to do as much nothing as possible."
Today it has become more fashionable than rebellious for physicians to describe the trials and tribulations of their training. Dozens of doctors (and some nurses) have published such accounts. Gawande is a prime example of this more mainstream type of physician-author. Even though he describes very disturbing events in his articles for The New Yorker (some of which have been reprinted in Complications), he uses his real name and that of his institution: Boston's Brigham and Women's Hospital.
Gawande, however, has taken the art of physician narrative to a new level. He is a deft writer, telling compelling stories that weave together medical events, his personal feelings and answers to questions that readers are surely pondering. Most important, Gawande paints with a decidedly gray brush. There are few heroes or villains in Complications, just folks doing their jobs. Although some readers, perhaps those who have felt victimized by the medical system, may find Gawande's explanations too exculpatory of doctors, he has documented well the uncertainties and ambiguities that characterize medical practice.
Take, for example, his chapter "When Doctors Make Mistakes." With great flair, Gawande describes a case in which he mistakenly did not call for help when treating a woman severely injured in a car accident. Although Gawande could not successfully place a breathing tube in her lungs, he stubbornly kept trying rather than paging an available senior colleague. Eventually, Gawande clumsily attempted an emergency procedure with which he had little experience: cutting a hole in her windpipe and attempting to breathe for her. It was only through good fortune that the patient did not die or wind up with brain damage. An anesthesiologist, called in very late in the game, managed to sneak a child-size breathing tube into her windpipe, enabling the patient to obtain adequate oxygen.
With typical candor, Gawande lists the possible reasons that he did the wrong thing: "hubris, inattention, wishful thinking, hesitation, or the uncertainty of the moment." All doctors, he is arguing, experience these very human feelings as they tend to their craft. The fact that lives are at stake may make physicians--as compared with other professionals--even more prone to such emotions.
Gawande also details how the surgery department addressed his error. The case was presented at the weekly morbidity and mortality (M & M) conference, where physicians discuss deaths and other bad outcomes. "The successful M & M presentation," Gawande perceptively notes, "inevitably involves a certain elision of detail and a lot of passive verbs." This clearly occurred during the discussion of Gawande's case, where, remarkably, no one ever asked him why he did not call for help sooner. Rather, his blunder was later addressed through another ritual, a private discussion between Gawande and the senior attending physician he had not called. Games with language and secret conversations: These are the reasons Gawande has written his book.
In another chapter, Gawande provides a more provocative explanation for the type of mistake he made. Gaffes, he argues in "Education of a Knife," are part of how surgeons--and other physicians--must learn their craft. (After all, physicians don't perform medicine, they practice it.) In an anecdote resembling that of my third-year student, Gawande describes how he routinely caused complications when learning to place dangerous central-line catheters into the necks of seriously ill patients. Expertise, he explains, does not just happen. Physicians in training must victimize a certain percentage of patients to acquire the skills they will need to become competent doctors. Should we consider these events to be mistakes or business as usual? Deciding how to define a medical error is itself no small problem.
In such learning situations, the necessary experience is best attained by keeping quiet. Using the "physician's dodge," patients are told "You need a central line" but not "I am still learning to do this." One ramification of this type of learning, Gawande notes, is the victimization of poor, less educated patients, who are often incapable of questioning doctors. Medicine's inclination to learn on "the humblest of patients" becomes especially apparent with Gawande's candid admission that he himself chose a more senior physician--rather than a more attentive cardiology fellow--to care for his son's heart problem.
Mistakes may be made not only by physicians but by patients. In the chapter "Whose Body Is It, Anyway?" Gawande asks what physicians should do when patients seem to make bad decisions. One especially compelling story, which I often use to teach medical students, involves a man who absolutely refused to go on a breathing machine after experiencing a complication of gall bladder surgery. Although the doctors explained that artificial ventilation would only be temporary and would likely save his life, the patient continued to object.
When the man passed out due to lack of oxygen, Gawande was faced with a devastating quandary. Does he abide by the man's wishes, which is what doctors are supposed to do, or immediately put him on the ventilator? Gawande chose the latter. I love to ask students what they think the man said when, a few days later, Gawande triumphantly took him off the machine. Invariably, half of the students predict that the man said, "Call my lawyer." But the other half, who guess that he said "Thank you," are correct. Gawande had surely averted a mistake in this case, but he was left without clear guideposts for approaching similar cases in the future.
Complications is filled with other stories demonstrating the capriciousness of medicine. For example, Gawande once detected a case of the rare, often fatal infection necrotizing fasciitis (flesh-eating bacteria) because he happened to have seen a case a few weeks before. He ultimately saved the patient's life, not through hard, scientific evidence but through a gut feeling and a willingness to submit a patient to possibly unnecessary surgery. "Medicine's ground state," he concludes, "is uncertainty." Other chapters examine why the medical profession so often hides the mistakes of impaired physicians, and the questionable use of an operation to help morbidly obese patients lose weight.
In the wake of the Institute of Medicine report, experts have proposed numerous remedies for the problem of error. Most attention has focused on a "systems approach," which would produce a "culture of safety" similar to that of the airline industry. In such a scheme, sophisticated computerized systems would be put in place to detect impending errors, such as wrong medication doses, sloppily written prescriptions and dangerous drug interactions. This emphasis aims to revamp the current approach to medical error, which encourages finger-pointing and malpractice lawsuits.
Gawande's book demonstrates both the advantages and limits of such a systems model. On the one hand, if medical mistakes are no longer stigmatized, physicians may be more willing to reveal their own errors and those of their peers. The notion that the case of the obstructed airway could be discussed in an open and nonjudgmental environment, rather than couched in secrecy, is altogether welcome.
On the other hand, there is a reason decades of exposés like Complications have not led to significant change. Defining errors and ascertaining their causes is a tricky business.
So is dealing with the issue of blame. Gawande is willing to admit that he screwed up when he did not call for immediate help for his deteriorating trauma patient. "Good doctoring is all about making the most of the hand you're dealt," he writes, "and I failed to do so." But many physicians remain reluctant to come quite so clean.
A friend and I were sitting around commiserating about the things that get to us: unloading small indignities, comparing thorns. "So there I was," she said, "sitting on the bus and this man across the aisle starts waving a copy of law professor Randall Kennedy's new book Nigger. He's got this mean-looking face with little raisiny eyes, and a pointy head, and he's taking this book in and out of his backpack. He's not reading it, mind you. He's just flashing it at black people."
"Don't be so touchy," I responded. "Professor Kennedy says that the N-word is just another word for 'pal' these days. So your guy was probably one of those muted souls you hear about on Fox cable, one of the ones who's been totally silenced by too much political correctness. I'd assume he was just trying to sign 'Have a nice day.'"
"Maybe so," she said, digging through her purse and pulling out a copy of Michael Moore's bestselling Stupid White Men. "But if I see him again, I'm armed with a 'nice day' of my own."
"That's not nice," I tell her. "Besides, I've decided to get in on the publishing boom myself. My next book will be called Penis. I had been going to title it Civil Claims That Shaped the Evidentiary History of Primogeniture: Paternity and Inheritance Rights in Anglo-American Jurisprudence, 1883-1956, but somehow Penis seems so much more concise. We lawyers love concision."
She raised one eyebrow. "And the mere fact that hordes of sweaty-palmed adolescents might line up to sneak home a copy, or that Howard Stern would pant over it all the way to the top of the bestseller list, or that college kids would make it the one book they take on spring break----"
"...is the last thing on my mind," I assured her. "Really, I'm just trying to engage in a scholarly debate about some of the more nuanced aspects of statutory interpretation under Rule 861, subsection (c), paragraph 2... And besides, now that South Park has made the word so much a part of popular culture, I fail to see what all the fuss is about. When I hear young people singing lyrics that use the P-word, I just hum along. After all, there are no bad words, just ungood hermeneutics."
"No wonder Oprah canceled her book club," she muttered.
Seriously. We do seem to have entered a weird season in which the exercise of First Amendment rights has become a kind of XXX-treme Sport, with people taking the concept of free speech for an Olympic workout, as though to build up that constitutional muscle. People speak not just freely but wantonly, thoughtlessly, mainlined from their hormones. We live in a minefield of scorched-earth, who-me-a-diplomat?, let's-see-if-this-hurts words. As my young son twirls the radio dial in search of whatever pop music his friends are listening to, it is less the lyrics that alarm me than the disc jockeys, all of whom speak as though they were crashing cars. It makes me very grateful to have been part of the "love generation," because for today's youth, the spoken word seems governed by people from whom sticks and stones had to be wrested when they were children--truly unpleasant people who've spent years perfecting their remaining weapon: the words that can supposedly never hurt you.
The flight from the imagined horrors of political correctness seems to have overtaken common sense. Or is it possible that we have come perilously close to a state where hate speech is the common sense? In a bar in Dorchester, Massachusetts, recently, a black man was surrounded by a group of white patrons and taunted with a series of escalatingly hostile racial epithets. The bartender refused to intervene despite being begged to do something by a white friend of the man. The taunting continued until the black man tried to leave, whereupon the crowd followed him outside and beat him severely. In Los Angeles, the head of the police commission publicly called Congresswoman Maxine Waters a "bitch"--to the glee of Log Cabin Republicans, who published an editorial gloating about how good it felt to hear him say that. And in San Jose, California, a judge allowed a white high school student to escape punishment after the student, angry at an African-American teacher who had suspended his best friend, scrawled "Thanks, Nigga" on a school wall. The judge was swayed by an argument that "nigga" is not the same as "nigger" but rather an inoffensive rap music term of endearment common among soul brothers.
Frankly, if Harvard president Lawrence Summers is going to be calling professors to account for generating controversy not befitting that venerable institution, the disingenuous Professor Kennedy would be my first choice. Kennedy's argument that the word "nigger" has lost its sting because black entertainers like Eddie Murphy have popularized it either dehistoricizes the word to a boneheaded extent or ignores the basic capaciousness of all language. The dictionary is filled with words that have multiple meanings, depending on context. "Obsession" is "the perfume," but it can also be the basis for a harassment suit. Nigger, The Book, is an appeal to pure sensation. It's fine to recognize that ironic reversals of meaning are invaluable survival tools. But what's selling this book is not the hail-fellow-well-met banality of "nigger" but rather the ongoing liveliness of its negativity: It hits in the gut, catches the eye, knots the stomach, jerks the knee, grabs the arm. Kennedy milks this phenomenon only to ask with an entirely straight face: "So what's the big deal?"
The New Yorker recently featured a cartoon by Art Spiegelman that captures my concern: A young skinhead furtively spray-paints a swastika on a wall. In the last panel, someone has put the wall up in a museum and the skinhead is shown sipping champagne with glittery fashionistas and art critics. I do not doubt that hateful or shocking speech can be "mainstreamed" through overuse; I am alarmed that we want to. But my greater concern is whether this gratuitous nonsense should be the most visible test of political speech in an era when government officials tell us to watch our words--even words spoken in confidence to one's lawyer--and leave us to sort out precisely what that means.
"It's a great mistake not to feel pleased when you have the chance," a rich, disfigured spinster advises a frail, well-mannered boy in The Shrimp and the Anemone, the first novel in L.P. Hartley's Eustace and Hilda trilogy. The boy has won a hand of piquet, and the spinster has noticed that he has difficulty enjoying triumphs. Miss Fothergill (like many of Hartley's characters, the spinster has an outlandishly characteristic name) foresees that her 10-year-old friend may not have ahead of him many occasions of pleasure to waste.
Rather than disobey Miss Fothergill, I will readily admit that I have felt pleased while reading Eustace and Hilda, and very pleased while reading Hartley's masterpiece, The Go-Between. It was a spice to my pleasure that even though the Eustace and Hilda trilogy was first published between 1944 and 1947, and The Go-Between in 1953, I had not even heard of L.P. Hartley before the novels were reissued recently as New York Review Books Classics.
I blame my ignorance on an academic education. Hartley is not the sort of author discussed in schools. He is in no way postmodern. He is modern only in his frugality with sentiment and his somewhat sheepish awareness that the ideas of Marx and Freud are abroad in the world, rendering it slightly more tricky than it used to be to write unself-consciously about unathletic middle-class English boys who have been led by their fantasies and spontaneously refined tastes into the country homes of the aristocracy. If Hartley belongs to any academic canon, it would be to the gay novel, whose true history must remain unwritten until the theorists have been driven from the temple and pleasure-loving empiricists loosed upon the literary critical world.
Hartley belongs with Denton Welch and J.R. Ackerley. The three have different strengths: Welch is sensuous, Ackerley is funny and Hartley is a delicate observer of social machinery. But all are sly and precise writers, challenged by a subject inconvenient for novelizing: the emotional life of gay men.
They met the challenge with unassuming resourcefulness, writing what might be called fairy tales. Hans Christian Andersen was their pioneer, as the first modern writer to discover that emotions considered freakish and repellent in adults could win sympathy when expressed by animals and children. Andersen also discovered that a plain style was the best disguise for this kind of trickery and that the disgust of even the most intolerant readers could be charmed away by an invitation to learn how queer characters came to be the way they are. Thus in Ackerley, Welch and Hartley one finds gentle transpositions--from human to animal, from adulthood to childhood, from health to illness--disarmingly exact language and just-so stories about strange desires. Once upon a time, a man fell in love with another man's dog. Once upon a time, a boy on a bicycle was hit by a car and could not find pleasure again except in broken things. Once upon a time, a boy was made to have tea with a crooked-faced, dying woman, and to his surprise he liked her. The effect is a mood of tenderness; the stories are sweet and a bit mournful.
Hartley loved Hans Christian Andersen, but it was another writer who provided him with a defense of gentle transposition as a novelistic practice: Nathaniel Hawthorne, whose daguerreotype by Mathew Brady is the disconcertingly austere frontispiece of The Novelist's Responsibility, Hartley's 1967 collection of literary criticism. In the preface to The Blithedale Romance, Hawthorne had described the novelist's need for a "Faery Land, so like the real world, that in a suitable remoteness one cannot well tell the difference, but with an atmosphere of strange enchantment, beheld through which the inhabitants have a propriety of their own." Hartley quoted the passage with approval.
Lost time was Hartley's fairyland. "The past is a foreign country: they do things differently there," he wrote in the first, and most famous, sentence of The Go-Between. (He may have been echoing the first sentence of A Sentimental Journey, where Laurence Sterne had written, "They order...this matter better in France," which was Sterne's fairyland.) The remembered world could be as rich and vivid as the real one and yet would always stand at a remove. One could visit but not live there. As Hawthorne explained in his introduction to The Scarlet Letter, in another passage quoted by Hartley, there is something romantic about "the attempt to connect a bygone time with the very present which is flitting away from us."
The Go-Between opens with such an attempt. Leo Colston, a bachelor librarian in his 60s, has begun to sort his papers--apparently in preparation for his death, since he seems to have nothing else to look forward to. He starts by opening "a rather battered red cardboard collar-box." It is full of childhood treasures: "two dry, empty sea-urchins; two rusty magnets, a large one and a small one, which had almost lost their magnetism; some negatives rolled up in a tight coil; some stumps of sealing-wax; a small combination lock with three rows of letters; a twist of very fine whipcord; and one or two ambiguous objects, pieces of things, of which the use was not at once apparent: I could not even tell what they had belonged to." At the bottom of the box is a diary, and at first Colston cannot remember what the diary contains. Then he remembers why he does not want to remember it.
My secret--the explanation of me--lay there. I take myself much too seriously, of course. What does it matter to anyone what I was like, then or now? But every man is important to himself at one time or another; my problem had been to reduce the importance, and spread it out as thinly as I could over half a century. Thanks to my interment policy, I had come to terms with life...
A secret naturally arouses the reader's curiosity, but Colston's attitude toward his secret is a further provocation. The events in the diary, he implies, were both inconsequential and traumatic. He preferred a lifelong effort of forgetting over any attempt to come to terms; only by burying "the explanation of me" could he find a way to live. "Was it true...that my best energies had been given to the undertaker's art? If it was, what did it matter?" An unacknowledged wound, a buried definition of the self... The penumbra around Colston's secret is typical of a closeted homosexual, and yet what follows is neither a same-sex love story nor a coming-out narrative.
In the course of the novel, Colston does discover the facts of life and has at least an intuition of his oblique relation to them, but in The Go-Between Hartley was most intensely concerned with his hero's first experiences of sin and grace. This second, more surprising parallel with Hawthorne is the crucial one. Hartley once wrote that "Hawthorne thought that human nature was good, but was convinced in his heart that it was evil." Hartley was in a similar predicament.
Who would have guessed that the Edwardian sexual awakening of a delicate, precociously snobbish 13-year-old would have anything in common with the Puritan crimes and penitence that fascinated Hawthorne? Yet for Hartley, as for Hawthorne, the awareness of sin is a vital stage of education and a condition of maturity. At first young Leo Colston resists it. "It was like a cricket match played in a drizzle, where everyone had an excuse--and what a dull excuse!--for playing badly."
His moral code at the outset is the pagan one of schoolboys; he believes in curses and spells, and in triumphing over enemies by any means except adult intervention. But at the invitation of a classmate, Leo spends his summer vacation at Brandham Hall, a well-appointed Georgian mansion in Norfolk, and there his world is softened by love, in the person of the classmate's older sister, Marian. She is beautiful, musical and headstrong. Leo brings her messages from her fiancé, Hugh Winlove, Lord Trimingham, and billets from her lover, a local farmer named Ted Burgess. With her love comes sin--not because sexuality is evil, though it may be, but because after he has felt its touch, Leo can no longer think of the people he struggles with as enemies. The lovers make a terrible use of him, but he cares most about those who use him worst. In their triangle, he is incapable of taking a side; he is, after all, their go-between.
If you map Hartley onto Hawthorne too methodically, you arrive at the odd conclusion that Leo is part Chillingworth, part Pearl. This is not quite as silly as it sounds. Like them, Leo is jealous of the lovers he observes and is trapped in their orbit; nothing is lost on him, and he is unable to make emotional sense of what he knows. (His apprehension without comprehension is a boon for the reader, who through him sees the social fabric in fine focus.) But unlike Hawthorne's characters, Leo is a boy starting his adolescence, and that process, which he fears will defeat him, is at the heart of The Go-Between. Leo knows that the end of his childhood ought to be "like a death, but with a resurrection in prospect." His resurrection, however, is in doubt.
Like most fairy tales, the tale of how Leo becomes a fairy will not be fully credible to worldly readers. The Oedipal struggle will seem too bald, the catastrophe too absolute. Hartley was aware of this shortcoming. He knew that he found sexuality more awful than other people did, and in The Novelist's Responsibility, he wrote about his attempt to compensate for it while writing the Eustace and Hilda trilogy: "I remember telling a woman novelist, a friend of mine, about a story I was writing, and I said, perhaps with too much awe in my voice, 'Hilda is going to be seduced,' and I inferred that this would be a tragedy. I shall never forget how my friend laughed. She laughed and laughed and could not stop: and I decided that my heroine must be not only seduced, but paralysed into the bargain, if she was to expect any sympathy from the public."
Hartley's friend would probably have laughed at Hilda's paralysis, too. In the trilogy, Hilda is the older, stronger-willed sister of the exquisitely polite Eustace, who grows up in her shadow, a little too fond of its darkness. Their symbiosis in the first volume is brilliant and chilling, but her paralysis in the third is unconvincing. It is implausible that the demise of a love affair would literally immobilize an adult woman. Fortunately, it happens offstage, and a few of the book's characters do wonder if she is malingering.
However, the lack of perspective may be inextricable from Hartley's gifts. His writing is so mournful and sweet because he is willing to consider seriously terrors that only children ought to have, and perhaps only a man who never quite figured manhood out could still consider them that way. The second and third volumes of Eustace and Hilda are as elegant as the first, but not as satisfying, because Eustace's life becomes too vicarious to hold the reader's attention--and because the characters have grown up. Hartley's understanding of children is sophisticated, but he seems to have imagined adults as emotionally limited versions of them--as children who have become skilled at not thinking unpleasant thoughts. As a writer, his best moments are in describing terror at age 13 and the realization at 60-odd that one need not have been so terrified after all. In The Go-Between, artfully, the intervening years are compressed into the act of recollection, and the novel's structure fits the novelist's talents like a glove.
There is an overall disposition to approach each Whitney Biennial as a State of the Art World Address in the form of an exhibition, organized by a curatorial directorate, presenting us with a reading, more or less objective, of what visual culture has been up to in the preceding two years. It is widely appreciated that on any given occasion, the directorate will be driven by enthusiasms and agendas that compromise objectivity. So there has sprung up a genre of what we might call Biennial Criticism, in which the organizers are taken to task for various distortions, and when these have been flagrant, as in the 1993 or, to a lesser degree, the 1995 Biennial, the critics almost speak as one. Everyone knew, in 1993, that a lot of art was being made that took the form of aggressively politicized cultural criticism, but the Biennial made it appear that there was very little else, and it had the effect of alienating the viewers by treating them as enemies. Again, everyone recognized in 1995 that artists were exploring issues of gender identity--but there was a question of whether these preoccupations were not overrepresented in what was shown. Anticipating the barrage of critical dissent, the Whitney pre-emptively advertised the 2000 Biennial as the exhibition you love to hate, making a virtue of adversity. But Biennials and Biennial Criticism must be taken as a single complex, which together provide, in the best way that has so far evolved, as adequate a picture as we are likely to get of where American artistic culture is at the moment. The Whitney deserves considerable credit for exposing itself to critical onslaughts from various directions in this periodic effort to bring the present art world to consciousness. Art really is a mirror in which the culture gets to see itself reflected, but it requires a fair amount of risk and bickering to get that image to emerge with any degree of clarity.
As it happens, my own sense of the state of the art world is reasonably congruent with that of Lawrence Rinder, who bears chief responsibility for Biennial 2002, though I have to admit that I was unfamiliar with a good many of the artists whose work has been selected. This unfamiliarity can even be taken as evidence that Rinder's selection corresponds to the general profile of art-making today.
It is almost as though any sample drawn from the art world would yield much the same profile of artistic production, so long as it consisted mainly of artists in their 30s and early 40s who have been formed in one or another of the main art schools and keep up with the main art periodicals. A great Biennial could have been put together using older artists with international reputations, but somehow emphasizing the young does not seem a curatorial caprice. It is increasingly an art-world premise that what is really happening is to be found among the young or very young, whose reputations have not as yet emerged. A painter who taught in California told me that he was constantly pressed, by dealers and collectors, to tell them who among the students was hot. So as long as it resembles a fairly large show of MFA students graduating from a major art school--as Biennial 2002 mostly does--a quite representative Biennial can be put together of artists whose work is hardly known at all. Somehow, if it were widely known, it would not have been representative.
Art today is pretty largely conceptual. It is not Conceptual Art in the narrow sense the term acquired when it designated one of the last true movements of late Modernism, in which the objects were often negligible or even nonexistent, but rather in the sense that being an artist today consists in having an idea and then using whatever means are necessary to realize it. Advanced art schools do not primarily teach skills but serve as institutes through which students are given critical support in finding their own way to whatever it takes to make their ideas come to something. This has been the case since the early 1970s.
It is amazing how many young people want to be artists today. I was told that there are about 600 art majors in a state university in Utah--and there will be at least that many applicants for perhaps twenty places in any one of the major MFA programs, despite a tuition equal to that for law or business school. Few will find teaching positions, but their main impulse is to make art, taking advantage of today's extreme pluralism, which entails that there are no antecedent prohibitions on how their art has to be. Every artist can use any technology or every technology at once--photography, video, sound, language, imagery in all possible media, not to mention that indeterminate range of activities that constitute performances, working alone or in collaboratives on subjects that can be extremely arcane.
Omer Fast shows a two-channel video installation with surround sound about Glendive, Montana, selected because it is the nation's smallest self-contained television market. Who would know about this? Or about Sarah Winchester, who kept changing the architecture of her house in San Jose, California, because she felt she was being pursued by the victims of the Winchester rifle, which her late husband manufactured--a story Jeremy Blake took as the subject of a 16-millimeter film, augmented by drawings and digital artworks transferred to DVD? I pick these out not as criticism but as observations. They exemplify where visual culture is today.
Initially I felt that painting was somewhat underrepresented, but on reflection I realize that there is not much of the kind of easel painting done now that makes up one's composite memory of Biennials past. What I had to accept was that artists today appropriate vernacular styles and images--graffiti, cartoons, circus posters and crude demotic drawing. Artists use whatever kinds of images they like. Much as one dog tells another in a New Yorker cartoon that once you're online, no one can tell you're a dog, it is less and less easy to infer much about an artist's identity from the work.
At least three graduate students in a leading art school I visited not long ago chose to paint like self-taught artists. The self-taught artist Thornton Dial Senior appeared in Biennial 2000, but his contribution did not look like anyone's paradigm of outsider art, so no one could have known that it was not by an MFA from the Rhode Island School of Design or CalArts. There are some quilts in Biennial 2002 by Rosie Lee Tompkins, who is Afro-American, as we can tell from items in her bibliography (Redesigning Cultural Roots: Diversity in African-American Quilts). Since this year's catalogue does not identify artists with reference to their education, we don't know--nor does it matter--whether Tompkins is self-taught. But it is entirely open to white male graduate students to practice quilt-making as their art if they choose to.
Whether someone can paint or draw is no more relevant than whether they can sew or cook. Everything is available to everyone--the distinctions between insider and outsider, art and craft, fine art and illustration, have altogether vanished. I have not yet seen a Biennial with the work of Sophie Matisse or George Deem in it, both of whom appropriate the painting styles of Vermeer and other Old Masters, but they express the contemporary moment as well as would an artist who drew Superman or The Silver Surfer. Mike Bidlo--also not included--has been painting Jackson Pollocks over the past few years. In a way I rather admire, Biennial 2002 presents us with a picture not just of the art world but of American society today, in an ideal form in which identities are as fluid and boundaries as permeable as lifestyles in general.
The openness to media outside the traditional ones of painting, drawing, printmaking, photography and sculpture has made it increasingly difficult to see everything on a single visit in the recent Biennials, and this is particularly so in Biennial 2002. But just seeing the things that can be taken in on such a visit may not give the best idea of what is really happening in the art world. Biennial 2002 includes the work of eight performance artists or teams of performance artists, for example, and theirs may be among the most revealing work being done today; but you will have to read about their work in the catalogue, since the performances themselves do not take place on the premises of the museum. I'll describe three artists whose most striking work is performance, since together they give a deeper sense of visual culture than we might easily get by looking at the objects and installations in the museum's galleries.
Let's begin with Praxis--a performance collaborative formed in 1999 that consists of a young married couple, Delia Bajo and Brainard Carey. On any given Saturday afternoon, Praxis opens the East Village storefront that is its studio and home to passers-by. The ongoing performance, which they title The New Economy, consists in offering visitors any of four meaningful but undemanding services from the artists: a hug, a footbath, a dollar or a Band-Aid, which comes with the kind of kiss a mommy gives to make it all better. Praxis draws upon a fairly rich art history. Its services are good examples of what were considered actions by Fluxus, an art movement that has frequently figured in this column. Fluxus originated in the early 1960s as a loose collective of artists-performers-composers who were dedicated, among other things, to overcoming the gap between art and life. The movement drew its inspiration from Marcel Duchamp, John Cage and Zen--and from the visionary figure George Maciunas, who gave it its name. It is a matter for philosophers to determine when giving someone a hug is a piece of art--but an important consideration is that as art it has no particular connection to the art market, nor is it the sort of thing that is easily collected. And it requires no special training to know how to do it.
There is something tender and affecting in Praxis's ministrations, which connects it to a second art-historical tradition. It has, for example, a certain affinity to Felix Gonzalez-Torres, who piled up candies in the corner of a gallery for people to help themselves to, or to the art of Rirkrit Tiravanija, which largely consists in feeding people fairly simple dishes, which he cooks for whoever comes along. Praxis's art is comforting, in much the way that Tiravanija's work is nurturing. The people who enter Praxis's storefront are not necessarily, as the artists explain, seeking an art experience. Neither are those who eat Tiravanija's green curry in quest of gastronomic excitement. The artists set themselves up as healers or comfort-givers, and the art aims at infusing an increment of human warmth into daily life. There was not a lot of that in Fluxus, but it has become very much a part of art today, especially among younger artists. The moral quality of Praxis belongs to the overall spirit of the Williamsburg section of Brooklyn, which recently emerged as an art scene. On one of my visits there, a gallerist asked me what I thought of the scene and I told him I found it "lite," not intending that as a criticism. "We want to remain children," he told me. The artists there could not have been nicer, and this seems generally the feeling evoked by Biennial 2002. It is the least confrontational Biennial of recent years.
There is, for example, not much by way of nudity, though that is integral to the performances of the remarkable artist Zhang Huan, which stand at the opposite end of the spectrum from Praxis. Zhang Huan was expelled from China in 1998. His work fuses certain Asiatic disciplines with appropriations from various Western avant-gardes. In each of his performances, Zhang Huan's shaved head and bare, wiry body are put through trials in which, like a saint or shaman, the performer displays his indifference to injury. His nakedness becomes a universal emblem of human vulnerability. There is a remarkable, even stunning, poetry in these performances, and they feel in fact like religious ordeals, like fasting or mortification, undertaken for the larger welfare. I have seen the film of an amazing performance, Dream of the Dragon, in which Zhang Huan is carried by assistants into the performance space on a large forked branch of a tree, like an improvised cross. The assistants cover his body with a kind of soup they coat with flour. A number of leashed family dogs are then allowed to lick this off with sometimes snarling canine voracity.
The performances of William Pope.L, which involve great physical and, I imagine, psychological stress, stand to Zhang Huan's as West stands to East. His crawl pieces, of which he has done perhaps forty since 1978, perform social struggle, as he puts it. His contribution to Biennial 2002, titled The Great White Way, will involve a twenty-two-mile crawl up Broadway, from the Statue of Liberty to the Bronx, and will take five years. In a film excerpt, Pope.L is seen in a padded Superman suit and ski hat, a skateboard strapped to his back, negotiating a segment of the crawl. Sometimes he uses the skateboard as a dolly, but that seems hardly less strenuous than actual crawling. Pope.L is African-American, and somehow one feels that crawling up the Great White Way has to be seen as a symbolic as well as an actual struggle. But it also has the aura of certain ritual enactments that require worshipers to climb some sacred stairway on their knees, or to achieve a required pilgrimage by crawling great distances to a shrine.
Since foot-washing, which is one of Praxis's actions, is widely recognized as a gesture of humility as well as hospitality in many religious cultures, these three performance pieces bear out one of Rinder's observations: that a great many artists today are interested in religious subjects. He and I participated in a conversation organized by Simona Vendrame, the editor of Tema Celeste, and published in that magazine under the title New York, November 8, 2001. We were to discuss the impact of September 11 on American art. With few exceptions, the art in Biennial 2002 was selected before the horror, though it is inevitable that the horror colors how we look at the exhibits.
In a wonderful departure, five commissioned Biennial works are on view in Central Park, including an assemblage of sculptures in darkly patinated bronze by Kiki Smith, of harpies and sirens. These figures have human heads on birds' bodies, and as they are exhibited near the Central Park zoo, they suggest evolutionary possibilities that were never realized. When I saw pictures of them, however, I could not help thinking they memorialized those who threw themselves out of the upper windows of the World Trade Center rather than endure incineration. I had read that one of the nearby schoolchildren pointed to the falling bodies and said, "Look, the birds are on fire!"
I don't really yet know what effect on art September 11 actually had, and it might not be obvious even when one sees it. The artist Audrey Flack, whose work is in the Biennial, told me that as soon as she could get away from the television screen, she wanted only to paint fishing boats at Montauk. A good bit of what Rinder has selected could as easily as not have been done in response to the terrible events, but he said that he had sensed some sort of change taking place in artists' attitudes well before September 11: "What I was finding over and over again was artists saying things to me like 'Well, to be honest, what I'm really doing is searching for the truth' or 'What matters the most to me is to make the most honest statement I possibly can.'" I don't think one can easily tell from looking at the art that it embodies these virtues, any more than one could tell from Flack's watercolors that they constituted acts of healing for her. But that is what they mean and are.
One consequence of art's having taken the direction it has is that there is not always a lot to be gained from what one sees without benefit of a fair amount of explanation. Biennial 2002 has been very generous in supplying interpretive help. Some people have complained that the wall labels go too far in inflecting the way one is supposed to react to the work, but I am grateful for any help I can get; I found the wall texts, like the catalogue, indispensable. And beyond that, you can hear what the artists thought they were doing by listening to recorded comments on the rented electronic guides. I cannot see enough of the work of Kim Sooja, a Korean artist who works with traditional fabrics from her culture. But her statements contribute to the metaphysics of fabric--to what Kierkegaard calls the meaning of the cloth--and are worth thinking about in their own right.
You will encounter Kim Sooja's Deductive Object, consisting of Korean bedcovers placed over tables at the zoo cafe in Central Park, just north of Kiki Smith's mythological animals and just south of a towering steel tree by Roxy Paine. Since Central Park has been opened up to temporary exhibitions, I would like to urge a longstanding agenda of my own. I cannot think of anything better capable of raising the spirits of New York than installing a beautiful projected piece by Christo and Jeanne-Claude, which, as always with their work, will not cost the city a nickel. They envision a series of tall gates, posted at regular intervals all along the main walkway of the park. Hanging from each will be saffron-colored strips of cloth that will float above us as we follow the path for as long as we care to--an undulating roof, since the strips are just long enough to cover the distance between the gates. The whole world will look with exaltation upon this work, which will express the same spirituality and truth that today's artists, if Lawrence Rinder is right, have aspired to in their work. And billions of dollars will flow into our economy as pilgrims stream to our city.
I think the art world is going to be the way it is now for a very long time, even if it is strictly unimaginable how artworks themselves will look in 2004. Meanwhile, I think well of Biennial 2002, though I have been able to write of only a few of the 113 artists who make it up. You'll have to find your own way, like the artists themselves. Take my word that it is worth the effort. That's the best Biennial Criticism is able to do in the present state of things.
Filmmakers in sub-Saharan Africa tend to divide their attention between city life today and village life once upon a time. This rule has its exceptions, of course; but if you're searching for an African film that truly overcomes the split, deftly merging the contemporary with the folkloric, I doubt you'll find anything more ingenious than Joseph Gaï Ramaka's retelling of Carmen. Set along the coast of modern-day Dakar, this Karmen Geï drapes current Senegalese costumes upon the now-mythic figures of Mérimée and Bizet, puts old-style songs and African pop into their mouths and has its characters dance till they threaten to burst the frame.
The film's American distributor, California Newsreel, suggests that Karmen Geï is Africa's first movie musical--that is, an all-singing, all-dancing story, rather than a story with song and dance added on. If so, that breakthrough would count as another major achievement for Ramaka. But nothing can matter in any Carmen without Carmen herself; and so I propose that Ramaka's true claim to fame is to have put Djeïnaba Diop Gaï on the screen.
Practically the first thing you see of her--the first thing you see at all in Karmen Geï--is the heart-stopping vision of her two thighs slapping together, while a full battery of drummers pounds away. We discover Karmen in the sand-covered arena of a prison courtyard, where she is dancing so exuberantly, lustily, violently that you'd think this was a bullring and she'd just trampled the matador; and at this point, she hasn't even risen from her seat. Wait till she gets up and really starts to move, shaking and swerving and swiveling a body that's all curves and pure muscle, topped by a hairdo that rises like a mantilla and then spills down in ass-length braids. A rebel, an outlaw, a force of nature, an irresistible object of desire: Gaï's Karmen embodies all of these, and embodies them in motion. The only part of her that seems fixed is her smile, shining in unshakable confidence from just above an out-thrust chin.
Is it just the memory of other Carmens that brings a bullring to mind? Not at all. There really is a contest going on in this opening scene, and Karmen is winning it, effortlessly. She is dancing, before a full assembly of the jail's female prisoners, in an attempt to seduce the warden, Angélique (Stéphanie Biddle). Pensive and lighter-skinned than Karmen, dressed in a khaki uniform with her hair pulled back tight, Angélique yields to her prisoner's invitation to dance and soon after is stretched out in bed, sated, while Karmen dashes through the hallways and out to freedom.
From that rousing start, Ramaka goes on to rethink Carmen in ways that vary from plausible to very, very clever. It's no surprise that the Don José figure (Magaye Niang) is a police officer; the twist is that Karmen snares him by breaking into his wedding, denouncing all of respectable Senegalese society and challenging his bride-to-be to a dance contest. The chief smuggler (Thierno Ndiaye Dos) is a courtly older man who keeps the lighthouse; and Escamillo, the only person in the movie big enough to look Karmen in the eye, is a pop singer, played with smooth assurance by pop star El Hadj Ndiaye.
Ramaka's best invention, though, is Angélique, a previously unknown character who is both a lovesick, uniformed miscreant and a doomed woman--that is, a merger of Don José and Carmen. By adding her to the plot, the film gives Karmen someone worth dying for. The details of how she arrives at that death are a little muddled--the direction is elliptical at best, herky-jerky at worst--but thanks to Angélique's presence in the story, the climax feels more tender than usual, and more deliberate. Karmen shows up for her final scene decked out in a red sheath, as if to insure the blood won't spoil her dress.
Karmen Geï has recently been shown in the eighth New York African Film Festival, at Lincoln Center's Walter Reade Theater. It is now having a two-week run downtown at Film Forum.
The title of Fabián Bielinsky's briskly intriguing Nine Queens would seem to refer to a sheet of rare stamps--or, rather, to a forgery of the stamps, which two Buenos Aires con artists hope to sell to a rich businessman. But then, the businessman is himself a crook, the con artists don't actually know one another and the sale just might involve real stamps. You begin to see how complicated things can be in this movie; and I haven't yet mentioned the sister.
The action, which stretches across one long day, begins in the convenience store of a gas station, where fresh-faced Juan (Gastón Pauls) draws the attention of Marcos (Ricardo Darín), an older, more aggressive swindler. Teamed up impromptu, just for the day, the two stumble into the con of a lifetime when Marcos's beautiful, prim, angry sister (Leticia Brédice) summons them to the luxury hotel where she works. She just happens to need Marcos to cart away one of his ailing buddies; and the buddy just happens to know of a guest who might buy some stamps.
No, nothing is as it seems. But Bielinsky's storytelling is so adept, his pace so fleet, his actors so much in love with every nuance of their dishonesty that you will probably laugh with delight, even as you're being dealt a losing hand of three-card monte.
And if you want social relevance, Nine Queens will give you that, too. As if Juan (or was it Marcos?) had scripted the whole country, this release swept the critics' awards for 2001 just in time for Argentina's economy to crash. Enjoy!
I hadn't intended to review this last film; but since it's become a critical success, here goes:
The Piano Teacher is a pan-European remake of What Ever Happened to Baby Jane?, with French stars Isabelle Huppert and Annie Girardot playing the sacred-monster roles and Austrian director Michael Haneke fastidiously avoiding the camp humor that alone could have saved the movie. Set in Vienna and cast (except for the leads) with German-speaking actors, whose lips flop like dying fish around their dubbed French syllables, The Piano Teacher is a combination of immaculately composed shots and solemnly absurd dialogue, much of it about the music of Franz Schubert. "That note is the sound of conscience, hammering at the complacency of the bourgeoisie." Sure it is. Add a sequence in which Huppert humps Girardot (her own mother!) in the bed they share, throw in an extended sex scene where the characters grandly ignore any risk of interruption (though they're grappling in a public toilet), and you've got a movie that ought to have made classical music dirty again.
But to judge from critics' reactions, Schubert remains the touchstone of respectability, and The Piano Teacher is somehow to be taken seriously.
The aura of high-mindedness that cloaks the action (at least for some viewers) emanates mostly from Huppert. No matter what her character stoops to--doggie posture, for the most part--Huppert seems never to lower herself. She maintains her dignity because she is being brave. She is acting. She is allowing herself to be shown as sexually abject before an athletic younger man, Benoît Magimel, who has a cleft chin and peekaboo blond hair. Huppert has been similarly abject in recent years, in Benoît Jacquot's The School of Flesh, for example. I wonder what hope other women may nurture for themselves after 40, when this wealthy, celebrated, greatly accomplished and famously beautiful woman has no better prospects. I know we're expected to give prizes to Huppert for such ostentatious self-abnegation. (Last year, at Cannes, she collected a big award.) But what pleasure are we supposed to get from seeing the character humiliated?
A dishonest pleasure, I'd say; the same kind that's proposed in The Piano Teacher's now-notorious scene of genital mutilation. The meaning of the scene, for those who are pleased to give it one, is of course transgressive, subversive and otherwise big word-like. See how (women) (the Viennese) (the middle class) (fill in the blank) are repressed, how they turn against themselves, how they make themselves and everyone around them suffer. Then again, if you subtract all that guff about the complacent bourgeoisie, maybe the scene means nothing more than "Ew, gross!"
I have admired Haneke's films in the past, beginning with the antiseptically grim The Seventh Continent and going on to the tough, much-maligned Benny's Video. When Haneke has proposed that clean, affluent, educated people may do horrible things, I have agreed, as of course I must, accepting what would have been a mere platitude for the sake of the films' clear vision and genuine sense of dread. But as I watched Huppert's preposterous impersonation of a music teacher, I began to wonder if Haneke knows that characters can be something other than horrid.
The dynamics of Schubert's music represent emotional "anarchy," says Huppert at one point, in a pronouncement that would get a pedagogue sacked from any self-respecting conservatory. Listen to Rudolf Serkin play the great B-flat piano sonata, varying his touch with every breath, and you will hear not anarchy but imagination. It's the quality most lacking in The Piano Teacher--followed closely by warmth, humor, realism and purpose.
Fun at Home: Nation readers will want to know that Zeitgeist Video has just brought out a DVD of Mark Achbar and Peter Wintonick's fine documentary Manufacturing Consent: Noam Chomsky and the Media. All the original fun is there, plus added features such as Chomsky's own commentary on the picture. The film is now ten years old. You will probably find it's more to the point than ever.
Perhaps time is our invention
To make things seem to move
Like the uncovering tail of the blue jay
As it lights its feet on the wet
Perhaps the seasons are really not
More than a single space with walls inside, disconnected
While fall and winter, and spring
Which we always anticipate, are only
Expansions of our own longings.
Perhaps there is only the now
Neither age nor youth, not even the vertigo of memories stilettoed
Except wounded into this present second
Shorter than the birth of a cell, or the nest dropped
With the sun and the rain always out together.
This center is absolute, it needs no endlessness
For heaven or hell. Or for creation, our own illusion of ourselves.
The minor variations we unfold are all the same
Inherently permutating at once
Repeating one design. Obscure. Lit at the edges of our time.
There are those opposed to the use of cloning technology to create human embryos for stem-cell research whose concerns emanate from commitments to social justice. One of their arguments runs as follows: The idea driving this medical research is that by creating an embryo through cloning, we can produce embryonic stem cells that are a perfect genetic match for a patient. All that is required to conduct the cloning is a skin cell from which to extract the patient's DNA and...a human egg.
Where, cry out the social justice advocates, are we going to get all these eggs for all these patients? Do the math, they suggest: 17 million American diabetics, needing anywhere from 10 to 100 eggs each, since the cloning technology is far from efficient...and even if you can pull that off, Christopher Reeve is still not walking, Michael J. Fox and Janet Reno still tremble and Ronald Reagan still doesn't remember who Ronald Reagan was. The social justice folk maintain that the billions of eggs required for embryonic stem cell therapies for the millions of Americans suffering from chronic and degenerative diseases will be obtained through exploitation of poor women in this country and the world over. Surplus value will take on an even more nefarious meaning.
Still, the early results from embryonic stem-cell therapy in mice are so dramatic that not to pursue this medical research would be morally obscene and just plain stupid. At the University of California, Dr. Hans Keirstead was able to implant neurological tissue derived from embryonic stem cells in a mouse with partial spinal cord injury so that after eight weeks, the mouse had regained most of its ability to walk and, of major significance to the quarter-million Americans suffering from this tragic condition, had also regained bladder and bowel control. Yet the question remains: Where are we going to get all those eggs?
A call to Stanford University's Paul Berg, a Nobel laureate who has been testifying to Congress on behalf of embryonic stem-cell research, helps elucidate the answer: When it comes to the research, he says, the quantity required may not be a problem. But if the desired therapeutic potential of embryonic stem cells is fully realized, the need for eggs will be great and could short-circuit the availability of these therapies. But a solution to that may be possible, Berg insists. If research is carried out that identifies the biochemicals in the egg directing the genetic material to develop into an embryo, then we could extract and fractionate those biochemicals and insert them into any skin cell, for example, for use in the cloning process. Voilà! A skin cell becomes an egg, and skin cells are plentiful.
The immediate enthusiasm for this breakthrough scientific idea, which could help Reeve walk again while simultaneously obviating the motive for an exploitative human egg market, is quickly tempered by the full realization of what Berg has explained: When we acquire the ability to use any cell as an egg, we will have removed another obstacle to achieving complete control over human reproduction. Admittedly, complete control over reproduction will require a womb for gestation--but that ultimately should prove to be just another biochemical matter for extraction and fractionation.
This, then, is how it goes in biotechnology, the essential dynamic that simultaneously gives rise to medical hope and moral vertigo. Each step forward produces a new problem, the solution to which demands further control over the biological mechanism known as a human being. But this somehow impinges on human beings or some portion of ourselves that we value. To deal with the attendant moral quandaries, a method is found to isolate and duplicate the underlying molecular process. The moral quandary has thus been replaced by an extracorporeal biochemical process, no longer strictly identified as human, and therefore a process that no one can reasonably value apart from its use. The problem, as bioethicist Eric Juengst puts it, is that we could thereby successfully cope with every moral dilemma posed by biotechnology and still end up with a society none of us would wish to live in. For Francis Fukuyama, this is Our Posthuman Future, as he has titled his new book on the subject.
Fukuyama's most famous previous theoretical foray was to declare, in 1989, an end to history, whereby a capitalist liberal democratic structure represented the final and most satisfying endpoint for the human species, permitting the widest expression of its creative energies while best controlling its destructive tendencies. He imagined that ultimately, with the universal acceptance of this regime, the relativist impasse of modern thought would in a sense resolve itself.
But thirteen years after the end of history, Fukuyama has second thoughts. He's discovered that there is no end of history as long as there is no end of science and technology. With the rapidly developing ability of the biological sciences to identify and then alter the genetic structure of organisms, including humans, he fears the essence of the species is up for grabs. Since capitalist liberal democratic structures serve the needs of human nature as it has evolved, interference by the bio-engineers with this human nature threatens to bring the end of history to an end.
The aim of Our Posthuman Future is "to argue that [Aldous] Huxley was right," Fukuyama announces early on, referring to Huxley's 1932 vision of a Brave New World. Multiple meanings are intended by Fukuyama: The industrialization of all phases of reproduction. The genetic engineering of the individuals produced by that process, thereby predetermining their lives. The tyrannical control of this population through neurochemical intervention, making subservience experientially pleasurable. Fukuyama cites specific contemporary or projected parallels to Huxley's Hatchery and Conditioning Center, Social Predestination Room and soma. In Fukuyama's terms, the stakes in these developments are nothing less than human nature itself.
The first of the book's three parts lays out the case that the biotechnologically driven shift to a posthuman era is already discernible and describes some of the potential consequences. Prozac and Ritalin are precursors to the genomically smart psychotropic weapons of the near future. Through these drugs, which energize depressed girls and calm hyperactive boys, we are being "gently nudged toward that androgynous median personality, self-satisfied and socially compliant, that is the current politically correct outcome in American society." Standardization of the personality is under way. This is the area to watch, Fukuyama asserts, because virtually everything that the popular imagination envisions genetic engineering accomplishing is much more likely to be accomplished sooner through neuropharmacology.
Increased life spans and genetic engineering also offer mostly dystopic horizons, whereby gerontocracies take power over societies whose main purpose has become the precision breeding of their progeny. The ancient instincts for hierarchical status and dominance are still the most powerful forces shaping this new world born from biotechnology. Since, as Fukuyama sees it, science does not necessarily lead to the equality of respect for all human beings demanded by liberal egalitarianism, the newest discoveries will serve the oldest drives. We are launched on a genetic arms race.
But be warned: We may not arrive in that new world through some dramatic struggle in which we put up a fight. Rather, the losses to our humanity may occur so subtly that we might "emerge on the other side of a great divide between human and posthuman history and not even see that the watershed had been breached because we lost sight of what that [human] essence was."
If this terrible event is to be prevented, then the human essence, which Fukuyama correlates with human nature itself, must be identified and kept inviolable. But where is the line around "human nature" to be drawn, a line to which we can all adhere so that we might reap the benefits of biotechnology while preventing the nightmare scenarios from ever coming to pass?
The entire world today wants the answer to this. Fukuyama promises to deliver it. But despite the clarity with which he announces his mission, the author advises his readers, "Those not inclined to more theoretical discussions of politics may choose to skip over some of the chapters here." Yet these are the very chapters containing the answer we all seek in order to tame the biotechnology beast! This, then, signals that we are entering dangerous ground: we will need to bear with the author's own means of revealing his great discovery, and skip those chapters at our peril.
In this heart of the book, titled "Being Human," Fukuyama first seeks to restore human nature as the source of our rights, our morality and our dignity. In particular, he wishes to rescue all these dimensions from the positivist and utilitarian liberal philosophers who, closely allied with the scientific community, have dominated the debate over biotechnology. According to the author, these philosophers assign rights everywhere and emphasize the individual as the source of moral concern. In doing so, they put humankind and its collective life at risk before the juggernaut of biotechnology. John Rawls and Ronald Dworkin, among others, have elevated individual autonomy over inherently meaningful life plans, claims Fukuyama, who then questions whether moral freedom as it is currently understood is such a good thing for most people, let alone the single most important human good.
Rather than our individual autonomy or moral freedom, Fukuyama wishes that we would attend to the logic of human history, which is ultimately driven by the priorities that exist among natural human desires, propensities and behaviors. Since he wishes us to shift ground to the logic of the inherent and the natural, he must finally define that core composing human nature:
The definition of the term human nature I will use here is the following: human nature is the sum of the behavior and characteristics that are typical of the human species, arising from genetic rather than environmental factors.
Later he will refine this further to the innate species-typical forms of cognition, and species-typical emotional responses to cognition. What he is really after is not just that which is typical of our species but that which is unique to human beings. Only then will we know what needs the greatest safeguarding. After hanging fire while reviewing the candidates for this irreducible, unique core to be defended, including consciousness and the most important quality of a human being, feelings, Fukuyama finally spills the beans:
What is it that we want to protect from any future advances in biotechnology? The answer is, we want to protect the full range of our complex, evolved natures against attempts at self-modification. We do not want to disrupt either the unity or the continuity of human nature, and thereby the human rights that are based on it.
So, where are we? It would seem we have gone full circle. Human nature is defined by...human nature! To the extent that it is capable of being located in our material bodies, it is all that arises from our genetics. Any attempt at greater precision is a violation of our unity or continuity--and threatens to expose the author's empty hand. Through such sophistry, Fukuyama wishes to assert mastery over any biotechnological innovation that he considers threatening, since he can now arbitrarily choose when it is disruptive of the unity or continuity of the human nature arising from our genetics. Even a heritable cancer could qualify for protection under Fukuyama's rubric for that which is to be defended from biotechnological intervention.
Indeed, there are those who agree with Fukuyama's view of the biological bases of human social life yet draw the opposite conclusion about human bioengineering, viewing it as humanity's last best hope.
The remainder of the book is a potpourri of tactical suggestions (embedded in rhetoric cloned from Fukuyama's mentor in these matters, bioethicist Leon Kass) of which biotechnologies should be controlled, and of the need for both national and international bodies and systems to do so, if such control is to be effective. That, in the end, may be the most surprising aspect of the book. All this fervid philosophizing in reaction to fears about a Brave New World, working toward the radical conclusion that what is needed is...regulation. Then again, recognition of the need for regulation might well be experienced as a radical trauma by someone who has previously placed an overabundance of faith in the market.
But one would be foolish to believe that Fukuyama has gone all this distance simply to argue for what he refers to at one point as a more nuanced regulatory approach. In his most public engagement with biotechnology thus far, he has endorsed, written in support of and testified to Congress on behalf of a bill that would ban not only human reproductive cloning but also nonreproductive cloning for stem-cell research. The legislation he supports would also make any doctor who utilizes or prescribes a treatment developed with cloning technology subject to ten years in prison and a $1 million fine. Under this legislation, then, if a cure or treatment for diabetes or heart failure were developed in England using embryo cloning to harvest stem cells for therapy, US physicians would be denied access to it for their patients. This is his lesson in how moral freedom is not such a good thing compared with an inherently meaningful life plan. Let the fragile diabetic or spinal cord-injury victim learn the true value of our human nature from their catheterized bladders!
Fukuyama's entire brief depends upon avoiding the consequences of his own logic. Having identified the human essence with our biological human nature, he must evade any further specification or else the particular tissues, cells or molecules would be subject to further discussion and analysis as to whether or not they represent the human essence. Rather than discussion, we should trade in our autonomy and moral freedom for his protections. By the close of the book, any moral qualms on his part fall entirely by the wayside. Fukuyama is perhaps aware that he has failed to make his case except to those ready to believe. The book culminates in a final paragraph that is nothing less than a temper tantrum:
We do not have to accept any of these future worlds under a false banner of liberty, be it that of unlimited reproductive rights or of unfettered scientific inquiry. We do not have to regard ourselves as slaves to inevitable technological progress when that progress does not serve human ends. True freedom means the freedom of political communities to protect the values they hold most dear...
Nice rhetoric until we recall the values of the types of political regimes to which moral freedom and science must be sacrificed. While Fukuyama rails against the Brave New World, he takes the side of Huxley's World Controller, who explains, "Truth's a menace, science is a public danger...That's why we so carefully limit the scope of its researches."
There is an alternative to the fear that human nature must be inviolable because human nature cannot be trusted. We have seen imperious dictates against science and moral freedom delivered by philosophers before. In the recent past, we have evidence of very similar ideas in very similar language issuing from the philosopher whom Fukuyama draws upon for the epigraph beginning the first chapter of his book, Martin Heidegger. In the 1930s Professor Heidegger wanted science to serve the German essence, and it did. Now Professor Fukuyama wants science, and all of us, to serve the human essence, which he equates with his version of sociobiology infused with German romantic holism. Once more, we witness someone who would stop tyranny by imposing a tyranny of his own. Since Francis Fukuyama now sits on the President's Council on Bioethics, we should be grateful for the warning.