Filmmakers in sub-Saharan Africa tend to divide their attention between city life today and village life once upon a time. This rule has its exceptions, of course; but if you're searching for an African film that truly overcomes the split, deftly merging the contemporary with the folkloric, I doubt you'll find anything more ingenious than Joseph Gaï Ramaka's retelling of Carmen. Set along the coast of modern-day Dakar, this Karmen Geï drapes current Senegalese costumes upon the now-mythic figures of Mérimée and Bizet, puts old-style songs and African pop into their mouths and has its characters dance till they threaten to burst the frame.

The film's American distributor, California Newsreel, suggests that Karmen Geï is Africa's first movie musical--that is, an all-singing, all-dancing story, rather than a story with song and dance added on. If so, that breakthrough would count as another major achievement for Ramaka. But nothing can matter in any Carmen without Carmen herself; and so I propose that Ramaka's true claim to fame is to have put Djeïnaba Diop Gaï on the screen.

Practically the first thing you see of her--the first thing you see at all in Karmen Geï--is the heart-stopping vision of her two thighs slapping together, while a full battery of drummers pounds away. We discover Karmen in the sand-covered arena of a prison courtyard, where she is dancing so exuberantly, lustily, violently that you'd think this was a bullring and she'd just trampled the matador; and at this point, she hasn't even risen from her seat. Wait till she gets up and really starts to move, shaking and swerving and swiveling a body that's all curves and pure muscle, topped by a hairdo that rises like a mantilla and then spills down in ass-length braids. A rebel, an outlaw, a force of nature, an irresistible object of desire: Gaï's Karmen embodies all of these, and embodies them in motion. The only part of her that seems fixed is her smile, shining in unshakable confidence from just above an out-thrust chin.

Is it just the memory of other Carmens that brings a bullring to mind? Not at all. There really is a contest going on in this opening scene, and Karmen is winning it, effortlessly. She is dancing, before a full assembly of the jail's female prisoners, in an attempt to seduce the warden, Angélique (Stéphanie Biddle). Pensive and lighter-skinned than Karmen, dressed in a khaki uniform with her hair pulled back tight, Angélique yields to her prisoner's invitation to dance and soon after is stretched out in bed, sated, while Karmen dashes through the hallways and out to freedom.

From that rousing start, Ramaka goes on to rethink Carmen in ways that vary from plausible to very, very clever. It's no surprise that the Don José figure (Magaye Niang) is a police officer; the twist is that Karmen snares him by breaking into his wedding, denouncing all of respectable Senegalese society and challenging his bride-to-be to a dance contest. The chief smuggler (Thierno Ndiaye Dos) is a courtly older man who keeps the lighthouse; and Escamillo, the only person in the movie big enough to look Karmen in the eye, is a pop singer, played with smooth assurance by pop star El Hadj Ndiaye.

Ramaka's best invention, though, is Angélique, a previously unknown character who is both a lovesick, uniformed miscreant and a doomed woman--that is, a merger of Don José and Carmen. By adding her to the plot, the film gives Karmen someone worth dying for. The details of how she arrives at that death are a little muddled--the direction is elliptical at best, herky-jerky at worst--but thanks to Angélique's presence in the story, the climax feels more tender than usual, and more deliberate. Karmen shows up for her final scene decked out in a red sheath, as if to ensure the blood won't spoil her dress.

Karmen Geï has recently been shown in the eighth New York African Film Festival, at Lincoln Center's Walter Reade Theater. It is now having a two-week run downtown at Film Forum.

The title of Fabián Bielinsky's briskly intriguing Nine Queens would seem to refer to a sheet of rare stamps--or, rather, to a forgery of the stamps, which two Buenos Aires con artists hope to sell to a rich businessman. But then, the businessman is himself a crook, the con artists don't actually know one another and the sale just might involve real stamps. You begin to see how complicated things can be in this movie; and I haven't yet mentioned the sister.

The action, which stretches across one long day, begins in the convenience store of a gas station, where fresh-faced Juan (Gastón Pauls) draws the attention of Marcos (Ricardo Darín), an older, more aggressive swindler. Teamed up impromptu, just for the day, the two stumble into the con of a lifetime when Marcos's beautiful, prim, angry sister (Leticia Brédice) summons them to the luxury hotel where she works. She just happens to need Marcos to cart away one of his ailing buddies; and the buddy just happens to know of a guest who might buy some stamps.

No, nothing is as it seems. But Bielinsky's storytelling is so adept, his pace so fleet, his actors so much in love with every nuance of their dishonesty that you will probably laugh with delight, even as you're being dealt a losing hand of three-card monte.

And if you want social relevance, Nine Queens will give you that, too. As if Juan (or was it Marcos?) had scripted the whole country, this release swept the critics' awards for 2001 just in time for Argentina's economy to crash. Enjoy!

I hadn't intended to review this last film; but since it's become a critical success, here goes:

The Piano Teacher is a pan-European remake of What Ever Happened to Baby Jane?, with French stars Isabelle Huppert and Annie Girardot playing the sacred-monster roles and Austrian director Michael Haneke fastidiously avoiding the camp humor that alone could have saved the movie. Set in Vienna and cast (except for the leads) with German-speaking actors, whose lips flop like dying fish around their dubbed French syllables, The Piano Teacher is a combination of immaculately composed shots and solemnly absurd dialogue, much of it about the music of Franz Schubert. "That note is the sound of conscience, hammering at the complacency of the bourgeoisie." Sure it is. Add a sequence in which Huppert humps Girardot (her own mother!) in the bed they share, throw in an extended sex scene where the characters grandly ignore any risk of interruption (though they're grappling in a public toilet), and you've got a movie that ought to have made classical music dirty again.

But to judge from critics' reactions, Schubert remains the touchstone of respectability, and The Piano Teacher is somehow to be taken seriously.

The aura of high-mindedness that cloaks the action (at least for some viewers) emanates mostly from Huppert. No matter what her character stoops to--doggie posture, for the most part--Huppert seems never to lower herself. She maintains her dignity because she is being brave. She is acting. She is allowing herself to be shown as sexually abject before an athletic younger man, Benoît Magimel, who has a cleft chin and peekaboo blond hair. Huppert has been similarly abject in recent years, in Benoît Jacquot's The School of Flesh, for example. I wonder what hope other women may nurture for themselves after 40, when this wealthy, celebrated, greatly accomplished and famously beautiful woman has no better prospects. I know we're expected to give prizes to Huppert for such ostentatious self-abnegation. (Last year, at Cannes, she collected a big award.) But what pleasure are we supposed to get from seeing the character humiliated?

A dishonest pleasure, I'd say; the same kind that's proposed in The Piano Teacher's now-notorious scene of genital mutilation. The meaning of the scene, for those who are pleased to give it one, is of course transgressive, subversive and otherwise big word-like. See how (women) (the Viennese) (the middle class) (fill in the blank) are repressed, how they turn against themselves, how they make themselves and everyone around them suffer. Then again, if you subtract all that guff about the complacent bourgeoisie, maybe the scene means nothing more than "Ew, gross!"

I have admired Haneke's films in the past, beginning with the antiseptically grim The Seventh Continent and going on to the tough, much-maligned Benny's Video. When Haneke has proposed that clean, affluent, educated people may do horrible things, I have agreed, as of course I must, accepting what would have been a mere platitude for the sake of the films' clear vision and genuine sense of dread. But as I watched Huppert's preposterous impersonation of a music teacher, I began to wonder if Haneke knows that characters can be something other than horrid.

The dynamics of Schubert's music represent emotional "anarchy," says Huppert at one point, in a pronouncement that would get a pedagogue sacked from any self-respecting conservatory. Listen to Rudolf Serkin play the great B-flat piano sonata, varying his touch with every breath, and you will hear not anarchy but imagination. It's the quality most lacking in The Piano Teacher--followed closely by warmth, humor, realism and purpose.

Fun at Home: Nation readers will want to know that Zeitgeist Video has just brought out a DVD of Mark Achbar and Peter Wintonick's fine documentary Manufacturing Consent: Noam Chomsky and the Media. All the original fun is there, plus added features such as Chomsky's own commentary on the picture. The film is now ten years old. You will probably find it's more to the point than ever.

Perhaps time is our invention
To make things seem to move
Like the uncovering tail of the blue jay
As it lights its feet on the wet
Trembling wood.

Perhaps the seasons are really not
More than a single space with walls inside, disconnected
While fall and winter, and spring
Which we always anticipate, are only
Expansions of our own longings.

Perhaps there is only the now
Neither age nor youth, not even the vertigo of memories stilettoed
Except wounded into this present second
Shorter than the birth of a cell, or the nest dropped
With the sun and the rain always out together.

This center is absolute, it needs no endlessness
For heaven or hell. Or for creation, our own illusion of ourselves.
The minor variations we unfold are all the same
Inherently permutating at once
Repeating one design. Obscure. Lit at the edges of our time.

There are those opposed to the use of cloning technology to create human embryos for stem-cell research whose concerns emanate from commitments to social justice. One of their arguments runs as follows: The idea driving this medical research is that by creating an embryo through cloning, we can produce embryonic stem cells that are a perfect genetic match for a patient. All that is required to conduct the cloning is a skin cell from which to extract the patient's DNA and...a human egg.

Where, cry out the social justice advocates, are we going to get all these eggs for all these patients? Do the math, they suggest: 17 million American diabetics, needing anywhere from 10 to 100 eggs each, since the cloning technology is far from efficient...and even if you can pull that off, Christopher Reeve is still not walking, Michael J. Fox and Janet Reno still tremble and Ronald Reagan still doesn't remember who Ronald Reagan was. The social justice folk maintain that the billions of eggs required for embryonic stem cell therapies for the millions of Americans suffering from chronic and degenerative diseases will be obtained through exploitation of poor women in this country and the world over. Surplus value will take on an even more nefarious meaning.
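To make the scale of that math explicit (my own back-of-the-envelope figuring, not a calculation the advocates themselves supply):

17,000,000 patients x 10 eggs = 170 million eggs; 17,000,000 patients x 100 eggs = 1.7 billion eggs

--and that is for diabetes alone, before a single other disease is counted, which is how one arrives at the "billions" these critics invoke.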

Still, the early results from embryonic stem-cell therapy in mice are so dramatic that not to pursue this medical research is recognized as morally obscene and just plain stupid. At the University of California, Dr. Hans Keirstead was able to implant neurological tissue derived from embryonic stem cells in a mouse with partial spinal cord injury so that after eight weeks, the mouse had regained most of its ability to walk and, of major significance to the quarter-million Americans suffering from this tragic condition, had also regained bladder and bowel control. Yet, the question remains, where are we going to get all those eggs?

A call to Stanford University's Paul Berg, a Nobel laureate who has been testifying to Congress on behalf of embryonic stem-cell research, helps elucidate the answer: When it comes to the research, he says, the quantity required may not be a problem. But if the desired therapeutic potential of embryonic stem cells is fully realized, the need for eggs will be great and could short-circuit the availability of these therapies. But a solution to that may be possible, Berg insists. If research is carried out that identifies the biochemicals in the egg directing the genetic material to develop into an embryo, then we could extract and fractionate those biochemicals and insert them into any skin cell, for example, for use in the cloning process. Voilà! A skin cell becomes an egg, and skin cells are plentiful.

The immediate enthusiasm for this breakthrough scientific idea, which could help Reeve walk again while simultaneously obviating the motive for an exploitative human egg market, is quickly tempered by the full realization of what Berg has explained: When we acquire the ability to use any cell as an egg, we will have removed another obstacle to achieving complete control over human reproduction. Admittedly, complete control over the production of reproduction will require a womb for gestation--but that ultimately should prove to be just another biochemical matter for extraction and fractionation.

This, then, is how it goes in biotechnology, the essential dynamic that simultaneously gives rise to medical hope and moral vertigo. Each step forward produces a new problem, the solution to which demands further control over the biological mechanism known as a human being. But this somehow impinges on human beings or some portion of ourselves that we value. To deal with the attendant moral quandaries, a method is found to isolate and duplicate the underlying molecular process. The moral quandary has thus been replaced by an extracorporeal biochemical process, no longer strictly identified as human, and therefore a process that no one can reasonably value apart from its use. The problem, as bioethicist Eric Juengst puts it, is that we could thereby successfully cope with every moral dilemma posed by biotechnology and still end up with a society none of us would wish to live in. For Francis Fukuyama, this is Our Posthuman Future, as he has titled his new book on the subject.

Fukuyama's most famous previous theoretical foray was to declare, in 1989, an end to history, whereby a capitalist liberal democratic structure represented the final and most satisfying endpoint for the human species, permitting the widest expression of its creative energies while best controlling its destructive tendencies. He imagined that ultimately, with the universal acceptance of this regime, the relativist impasse of modern thought would in a sense resolve itself.

But thirteen years after the end of history, Fukuyama has second thoughts. He's discovered that there is no end of history as long as there is no end of science and technology. With the rapidly developing ability of the biological sciences to identify and then alter the genetic structure of organisms, including humans, he fears the essence of the species is up for grabs. Since capitalist liberal democratic structures serve the needs of human nature as it has evolved, interference by the bio-engineers with this human nature threatens to bring the end of history to an end.

The aim of Our Posthuman Future is "to argue that [Aldous] Huxley was right," Fukuyama announces early on, referring to Huxley's 1932 vision of a Brave New World. Multiple meanings are intended by Fukuyama: The industrialization of all phases of reproduction. The genetic engineering of the individuals produced by that process, thereby predetermining their lives. The tyrannical control of this population through neurochemical intervention, making subservience experientially pleasurable. Fukuyama cites specific contemporary or projected parallels to Huxley's Hatchery and Conditioning Center, Social Predestination Room and soma. In Fukuyama's terms, the stakes in these developments are nothing less than human nature itself.

The first of the book's three parts lays out the case that the biotechnologically driven shift to a posthuman era is already discernible and describes some of the potential consequences. Prozac and Ritalin are precursors to the genomically smart psychotropic weapons of the near future. Through these drugs, which energize depressed girls and calm hyperactive boys, we are being "gently nudged toward that androgynous median personality, self-satisfied and socially compliant, that is the current politically correct outcome in American society." Standardization of the personality is under way. This is the area to watch, Fukuyama asserts, because virtually everything that the popular imagination envisions genetic engineering accomplishing is much more likely to be accomplished sooner through neuropharmacology.

Increased life spans and genetic engineering also offer mostly dystopic horizons, whereby gerontocracies take power over societies whose main purpose has become the precision breeding of their progeny. The ancient instincts for hierarchical status and dominance are still the most powerful forces shaping this new world born from biotechnology. Since, as Fukuyama sees it, science does not necessarily lead to the equality of respect for all human beings demanded by liberal egalitarianism, the newest discoveries will serve the oldest drives. We are launched on a genetic arms race.

But be warned: We may not arrive in that new world through some dramatic struggle in which we put up a fight. Rather, the losses to our humanity may occur so subtly that we might "emerge on the other side of a great divide between human and posthuman history and not even see that the watershed had been breached because we lost sight of what that [human] essence was."

If this terrible event is to be prevented, then the human essence, which Fukuyama correlates with human nature itself, must be identified and kept inviolable. But where is the line to be drawn around "human nature"--a line to which we can all adhere, so that we might reap the benefits of biotechnology while preventing the nightmare scenarios from ever coming to pass?

The entire world today wants the answer to this. Fukuyama promises to deliver it. But despite the clarity with which he announces his mission, the author advises his readers, "Those not inclined to more theoretical discussions of politics may choose to skip over some of the chapters here." Yet these are the very chapters containing the answer we all seek in order to tame the biotechnology beast! This, then, signals that we are entering dangerous ground, and we will need to bear with the author's own means of revealing his great discovery, which may be skipped over at our own peril.

In this heart of the book, titled "Being Human," Fukuyama first seeks to restore human nature as the source of our rights, our morality and our dignity. In particular, he wishes to rescue all these dimensions from the positivist and utilitarian liberal philosophers who, closely allied with the scientific community, have dominated the debate over biotechnology. According to the author, these philosophers assign rights everywhere and emphasize the individual as the source of moral concern. In doing so, they put humankind and its collective life at risk before the juggernaut of biotechnology. John Rawls and Ronald Dworkin, among others, have elevated individual autonomy over inherently meaningful life plans, claims Fukuyama, who then questions whether moral freedom as it is currently understood is such a good thing for most people, let alone the single most important human good.

Rather than our individual autonomy or moral freedom, Fukuyama wishes that we would attend to the logic of human history, which is ultimately driven by the priorities that exist among natural human desires, propensities and behaviors. Since he wishes us to shift ground to the logic of the inherent and the natural, he must finally define that core composing human nature:

The definition of the term human nature I will use here is the following: human nature is the sum of the behavior and characteristics that are typical of the human species, arising from genetic rather than environmental factors.

Later he will refine this further to the innate species-typical forms of cognition, and species-typical emotional responses to cognition. What he is really after is not just that which is typical of our species but that which is unique to human beings. Only then will we know what needs the greatest safeguarding. After hanging fire while reviewing the candidates for this irreducible, unique core to be defended--including consciousness and what he deems the most important quality of a human being, feelings--Fukuyama finally spills the beans:

What is it that we want to protect from any future advances in biotechnology? The answer is, we want to protect the full range of our complex, evolved natures against attempts at self-modification. We do not want to disrupt either the unity or the continuity of human nature, and thereby the human rights that are based on it.

So, where are we? It would seem we have gone full circle. Human nature is defined by...human nature! To the extent that it is capable of being located in our material bodies, it is all that arises from our genetics. Any attempt at greater precision is a violation of our unity or continuity--and threatens to expose the author's empty hand. Through such sophistry, Fukuyama wishes to assert mastery over any biotechnological innovation that he considers threatening, since he can now arbitrarily choose when it is disruptive of the unity or continuity of the human nature arising from our genetics. Even a heritable cancer could qualify for protection under Fukuyama's rubric for that which is to be defended from biotechnological intervention.

Indeed, there are those agreeing with Fukuyama's view of the biological bases of human social life who draw opposite conclusions about human bioengineering, viewing it as humanity's last best hope.

The remainder of the book is a potpourri of tactical suggestions (embedded in rhetoric cloned from Fukuyama's mentor in these matters, bioethicist Leon Kass) of which biotechnologies should be controlled, and of the need for both national and international bodies and systems to do so, if such control is to be effective. That, in the end, may be the most surprising aspect of the book. All this fervid philosophizing in reaction to fears about a Brave New World, fervently working toward the radical conclusion that what is needed is...regulation. Of course, recognition of the need for regulation might well be experienced as a radical trauma by someone who has previously placed an overabundance of faith in the market.

But one would be foolish to believe that Fukuyama has gone all this distance simply to argue for what he refers to at one point as a more nuanced regulatory approach. In his most public engagement with biotechnology thus far, he has endorsed, written and testified to Congress on behalf of a bill that will not only ban human reproductive cloning but also ban nonreproductive cloning for stem-cell research. The legislation he supports would also make any doctor who utilizes or prescribes a treatment developed with cloning technology subject to ten years in prison and a $1 million fine. Under this legislation, then, if a cure or treatment for diabetes or heart failure is created in England that used embryo cloning to harvest stem cells for therapy, US physicians would not be allowed to have access to such treatments for their patients. This is his lesson in how moral freedom is not such a good thing compared with an inherently meaningful life plan. Let the fragile diabetic or spinal cord-injury victim learn the true value of our human nature from their catheterized bladders!

Fukuyama's entire brief depends upon avoiding the consequences of his own logic. Having identified the human essence with our biological human nature, he must evade any further specification or else the particular tissues, cells or molecules would be subject to further discussion and analysis as to whether or not they represent the human essence. Rather than discussion, we should trade in our autonomy and moral freedom for his protections. By the close of the book, any moral qualms on his part fall entirely by the wayside. Fukuyama is perhaps aware that he has failed to make his case except to those ready to believe. The book culminates in a final paragraph that is nothing less than a temper tantrum:

We do not have to accept any of these future worlds under a false banner of liberty, be it that of unlimited reproductive rights or of unfettered scientific inquiry. We do not have to regard ourselves as slaves to inevitable technological progress when that progress does not serve human ends. True freedom means the freedom of political communities to protect the values they hold most dear...

Nice rhetoric until we recall the values of the types of political regimes to which moral freedom and science must be sacrificed. While Fukuyama rails against the Brave New World, he takes the side of Huxley's World Controller, who explains, "Truth's a menace, science is a public danger...That's why we so carefully limit the scope of its researches."

There is an alternative to the fear that human nature must be inviolable because human nature cannot be trusted. We have seen imperious dictates against science and moral freedom delivered by philosophers before. In the recent past, we have evidence of very similar ideas in very similar language issuing from the philosopher whom Fukuyama draws upon for the epigraph beginning the first chapter of his book, Martin Heidegger. In the 1930s Professor Heidegger wanted science to serve the German essence, and it did. Now Professor Fukuyama wants science, and all of us, to serve the human essence, which he equates with his version of sociobiology infused with German romantic holism. Once more, we witness someone who would stop tyranny by imposing a tyranny of his own. Since Francis Fukuyama now sits on the President's Council on Bioethics, we should be grateful for the warning.

"Thirty years from now the big university campuses will be relics," business "guru" Peter Drucker proclaimed in Forbes five years ago. "It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the [next] big change." Historian David Noble echoes Drucker's prophecies but awaits the promised land with considerably less enthusiasm. "A dismal new era of higher education has dawned," he writes in Digital Diploma Mills. "In future years we will look upon the wired remains of our once great democratic higher education system and wonder how we let it happen."

Most readers of this magazine will side with Noble in this implicit debate over the future of higher education. They will rightly applaud his forceful call for the "preservation and extension of affordable, accessible, quality education for everyone" and his spirited resistance to "the commercialization and corporatization of higher education." Not surprisingly, many college faculty members have already cheered Noble's critique of the "automation of higher education." Although Noble himself is famously resistant to computer technology, the essays that make up this book have been widely circulated on the Internet through e-mail, listservs and web-based journals. Indeed, it would be hard to come up with a better example of the fulfillment of the promise of the Internet as a disseminator of critical ideas and a forum for democratic dialogue than the circulation and discussion of Noble's writings on higher education and technology.

Noble performed an invaluable service in publishing online the original articles upon which this book is largely based. They helped initiate a broad debate about the value of information technology in higher education, about the spread of distance education and about the commercialization of universities. Such questions badly need to be asked if we are to maintain our universities as vital democratic institutions. But while the original essays were powerful provocations and polemics, the book itself is a disappointing and limited guide to current debates over the future of the university.

One problem is that the book has a dated quality, since the essays are reproduced largely as they were first circulated online starting in October 1997 (except for some minor editorial changes and the addition of a brief chapter on Army online education efforts). In those four-plus years, we have watched the rise and fall of a whole set of digital learning ventures that go unmentioned here. Thus, Noble warns ominously early in the book that "Columbia [University] has now become party to an agreement with yet another company that intends to peddle its core arts and science courses." But only in a tacked-on paragraph in the next to last chapter do we learn the name of the company, Fathom, which was launched two years ago, and of its very limited success in "peddling" those courses, despite Columbia president George Rupp's promise that it would become "the premier knowledge portal on the Internet." We similarly learn that the Western Governors' Virtual University "enrolled only 10 people" when it opened "this fall" (which probably means 1998, when Noble wrote the original article) but not that the current enrollment, as of February 2002, is 2,500. For the most part, the evidence that Noble presents is highly selective and anecdotal, and there are annoyingly few footnotes to allow checking of sources or quotes.

The appearance of these essays with almost no revision from their initial serial publication on the Internet also helps to explain why Noble's arguments often sound contradictory. On page 36, for example, he may flatly assert that "a dismal new era of higher education has dawned"; but just twenty-four pages later, we learn that "the tide had turned" and "the bloom is off the rose." Later, he reverses course on the same page, first warning that "one university after another is either setting up its own for-profit online subsidiary or otherwise working with Street-wise collaborators to trade on its brand name in soliciting investors," but then acknowledging (quoting a reporter) that administrators have realized "that putting programs online doesn't necessarily bring riches." When Noble writes that "far sooner than most observers might have imagined, the juggernaut of online education appeared to stall," he must have himself in mind, two chapters earlier. Often, Noble is reflecting the great hysteria about online education that swept through the academy in the late 1990s. At other times (particularly when the prose has been lightly revised), he indicates the sober second thoughts that have more recently emerged, especially following the dot-com stock market crash in early 2000.

In the end, one is provided remarkably few facts in Digital Diploma Mills about the state of distance education, commercialization or the actual impact of technology in higher education. How many students are studying online? Which courses and degrees are most likely to appear online? How many commercial companies are involved in online education? To what degree have faculty employed computer technology in their teaching? What has been the impact on student learning? Which universities have changed their intellectual property policies in response to digital developments? One searches in vain in Noble's book for answers, or even for a summary of the best evidence currently available.

Moreover, Noble undercuts his own case with hyperbole and by failing to provide evidence to support his charges. For example, most readers of his book will not realize that online distance education still represents a tiny proportion of college courses taken in the United States--probably less than 5 percent. Noble sweepingly maintains, "Study after study seemed to confirm that computer-based instruction reduces performance levels." But he doesn't cite which studies. He also writes, "Recent surveys of the instructional use of information technology in higher education clearly indicate that there have been no significant gains in pedagogical enhancement." Oddly, here Noble picks up the rhetoric of distance-education advocates who argue that there is "no significant difference" in learning outcomes between distance and in-person classes.

Many commentators have pointed out Noble's own resistance to computer technology. He refuses to use e-mail and has his students hand-write their papers. Surely, there is no reason to criticize Noble for this personal choice (though one feels sorry for his students). Noble himself responds defensively to such criticisms in the book's introduction: "A critic of technological development is no more 'anti-technology' than a movie critic is 'anti-movie.'" Yes, we do not expect movie critics to love all movies, but we do expect them to go to the movies. Many intelligent and thoughtful people don't own television sets, but none of them are likely to become the next TV critic for the New York Times. Thus, Noble's refusal to use new technology, even in limited ways, makes him a less than able guide to what is actually happening in technology and education.

Certainly, Noble's book offers little evidence of engagement with recent developments in the instructional technology field. One resulting distortion is that some readers will think that online distance education is the most important educational use of computer technology. Actually, while very few faculty teach online courses, most have integrated new technology into their regular courses--more than three-fifths make use of e-mail; more than two-fifths use web resources, according to a 2000 campus computing survey. And few of these faculty members can be characterized, as Noble does in his usual broad-brush style, as "techno-zealots who simply view computers as the panacea for everything, because they like to play with them."

Indeed, contrary to Noble's suggestion, some of the most thoughtful and balanced criticisms of the uses of technology in education have come from those most involved with its application in the classroom. Take, for example, Randy Bass, a professor of English at Georgetown University, who leads the Visible Knowledge Project (http://crossroads.georgetown.edu/vkp), a five-year effort to investigate closely whether technology improves student learning. Bass has vigorously argued that technological tools must be used as "engines of inquiry," not "engines of productivity." Or Andrew Feenberg, a San Diego State University distance-education pioneer as well as a philosopher and disciple of Herbert Marcuse, who has insisted that educational technology "be shaped by educational dialogue rather than the production-oriented logic of automation," and that such "a dialogic approach to online education...could be a factor making for fundamental social change."

One would have no way of knowing from Noble's book that the conventional wisdom of even distance-education enthusiasts is now that cost savings are unlikely, or that most educational technology advocates, many of them faculty members, see their goal as enhancing student learning and teacher-student dialogue. Noble, in fact, never acknowledges that the push to use computer technology in the classroom now emanates at least as much from faculty members interested in using these tools to improve their teaching as it does from profit-seeking administrators and private investors.

Noble does worry a great deal about the impact of commercialization and commodification on our universities--a much more serious threat than that posed by instructional technology. But here, too, the book provides an incomplete picture. Much of Noble's book is devoted to savaging large public and private universities--especially UCLA, which is the subject of three chapters--for jumping on the high-technology and distance-education bandwagons. Yet at least as important a story is the emergence of freestanding, for-profit educational institutions, which see online courses as a key part of their expansion strategy. For example, while most people think of Stanley Kaplan as a test-preparation operation, it is actually a subsidiary of the billion-dollar Washington Post media conglomerate; it owns a chain of forty-one undergraduate colleges and enrolls more than 11,000 students in a variety of online programs, ranging from paralegal training to full legal degrees at its Concord Law School, which advertises itself as "the nation's only entirely online law school." This for-profit sector is growing rapidly and becoming increasingly concentrated in a smaller number of corporate hands. The fast-growing University of Phoenix is now the largest private university in the United States, with more than 100,000 students, almost one-third of them in online programs, which are growing more than twice as fast as its brick-and-mortar operation. Despite a generally declining stock market, the price of the tracking stock for the University of Phoenix's online operation has increased more than 80 percent in the past year.

As the Chronicle of Higher Education reported last year, "consolidation...is sweeping the growing for-profit sector of higher education," fueled by rising stock prices in these companies. This past winter, for example, Education Management Corporation, with 28,000 students, acquired Argosy Education Group and its 5,000 students. The threat posed by these for-profit operations is rooted in their ability to raise money for expansion through Wall Street ("Wall Street," jokes the University of Phoenix's John Sperling, "is our endowment") and by diminishing public support for second-tier state universities and community colleges (the institutions from which for-profits are most likely to draw new students). Yet, except for an offhand reference to Phoenix, Digital Diploma Mills says nothing about these publicly traded higher-education companies. And these for-profit schools are actually only a small part of the more important and much broader for-profit educational "sector," which is also largely ignored by Noble and includes hundreds of vendors of different products and services, and whose size is now in the hundreds of billions of dollars--what Morgan Stanley Dean Witter calls, without blushing, an "addressable market opportunity at the dawn of a new paradigm."

Noble does provide a strong cautionary tale--in the most informative chapter in the book--about the involvement of UCLA's extension division with a commercial company called Onlinelearning.net. He shows how some UCLA administrators as early as 1993 greedily embraced a vision of riches to be made in the online marketing of the college's extension courses. UCLA upper management apparently bought the fanciful projections of their commercial partners that the online venture would generate $50 million per year within five years--a figure that quickly plummeted to below $1 million annually. But Noble conflates the UCLA online-extension debacle with a more benign effort by the UCLA College of Letters and Sciences, beginning in 1997, to require all instructors to post their course syllabuses on the web. He seems unwilling to draw distinctions between the venal and scandalous actions of top UCLA administrators and the sometimes ham-handed efforts of other administrators to get UCLA faculty to enhance their classes by developing course websites, a fairly common educational practice and a useful convenience for students. Three-fifths of UCLA students surveyed said that the websites had increased interactions with instructors, and social science faculty recently gave the website initiative a mostly positive evaluation.

Sounding an "early alarm" so that faculty members can undertake "defensive preparation and the envisioning of alternatives" is how Noble explains his purpose in writing Digital Diploma Mills. But will faculty be well armed if they are unaware of the actual landscape they are traversing? In the end, Noble leaves us only with a deep and abiding suspicion of both technology and capitalism. His analysis of technology and education does echo Marx's critique of capitalism, with its evocation of concepts like commodification, alienation, exchange and labor theories of value. But unlike Marx, who produced a critical analysis of the exploitative nature of early capitalist production without outright rejection of the technology that made industrialization possible, Noble cannot manage the same feat.

In the current political climate, Noble's undifferentiated suspicion of technology hinders us more than it helps us. Are we prepared to follow him in his suspicion of any use of technology in higher education? Are faculty members willing to abjure e-mail in communicating with their students and colleagues? Are instructors at small colleges with limited library collections prepared to tell their students not to use the 7 million online items in the Library of Congress's American Memory collection? Are they ready to say to students with physical disabilities that limit their ability to attend on-campus classes or conduct library research that they can't participate in higher education? Are faculty at schools with working adults who struggle to commute to campus prepared to insist that all course materials be handed directly to students rather than making some of it available to their students online?

Similarly, what lines are we prepared to draw with respect to commercialization of higher education within the capitalist society in which we live? Are faculty willing to abandon publishing their textbooks with large media conglomerates and forgo having their books sold through nationwide bookstore chains? Are they prepared to say to working-class students who view higher education as the route to upward mobility that they cannot take courses that help them in the job market?

Noble's answer to most of these questions would undoubtedly be yes, insisting, as he does, that anything less than the "genuine interpersonal interaction," face to face, undermines the sanctity of the essential teacher-student relationship. In a March 2000 Chronicle of Higher Education online dialogue about his critique of technology in education, Noble complained that no one had offered "compelling evidence of a pedagogical advantage" in online instruction. (He pristinely refused to join online, and had a Chronicle reporter type in his answers relayed over the phone.) A student at UCLA, who had unexpectedly taken an online course, noted in her contribution to the Q&A that because she tended to be "shy and reserved," e-mail and online discussion groups allowed her to speak more freely to her instructor, and that she thought she retained more information in the online course than in her traditional face-to-face classes at UCLA. Noble rejected the student's conclusion that the online course had helped her find her voice, arguing that writing was "in reality not a solution, but an avoidance of the difficulty." "Speaking eloquently, persuasively, passionately," he concluded, "is essential to citizenship in a democracy." Putting aside the insensitivity of Noble's reply, his position, as Andrew Feenberg points out in Transforming Technology: A Critical Theory Revisited, is reminiscent of Plato's fear that writing (the cutting-edge instructional technology in the ancient world) would replace spoken discourse in classical Greece, thus destroying the student-teacher relationship. (Ironically, as Feenberg also notes, "Plato used a written text as the vehicle for his critique of writing, setting a precedent" for current-day critics of educational technology like Noble who have circulated their works on the Internet.)

The conservative stance of opposing all change--no technology, no new modes of instruction--is appealing because it keeps us from any possible complicity with changes that undercut existing faculty rights and privileges. But opposition to all technology means that we are unable to support "open source" technological innovations (including putting course materials online free) that constitute a promising area of resistance to global marketization. And it makes it impossible to work for protections that might be needed in a new environment. Finally, it leaves unchanged the growing inequality between full-time and part-time faculty that has redefined labor relations in the contemporary university--the real scandal of the higher-education workplace. Without challenging the dramatic differences in wages and workloads of full professors and adjunct instructors, faculty rejection of educational technology begins to remind us of the narrow privileges that craft workers fought to maintain in the early decades of industrial capitalism at the expense of the unskilled workers flooding into their workplaces.

We prefer to work from a more pragmatic and realistic stance that asks concretely about the benefits and costs of both new technology and new educational arrangements to students, faculty (full- and part-time) and the larger society. Among other things, that means that academic freedom and intellectual property must be protected in the online environment. And the faculty being asked to experiment with new technology need to be provided with adequate support and rewards for their (ad)ventures. As the astute technology commentator Phil Agre wrote when he first circulated Noble's work on the Internet, "the point is neither to embrace or reject technology but to really think through and analyze...the opportunities that technology presents for more fully embodying the values of a democratic society in the institutions of higher education."

I was in high school in the 1960s when I first saw Dave Van Ronk at the Gaslight, one of those little cellar clubs that used to line a Greenwich Village that now lives in myth and legend. I didn't understand what he was doing. It seemed like a jumble whose elements I recognized--folk tunes, ragtime, early jazz, Delta blues--but they didn't gel into what I thought was coherence. It was really only my expectations, though, that were exposed. I felt like Dr. P in Oliver Sacks's The Man Who Mistook His Wife for a Hat, scanning deconstructed faces for that single telltale feature that would reveal who I was looking at. I didn't know how to think about it. I couldn't have been more confused if Louis Armstrong had ambled onto The Ed Sullivan Show and followed "Hello Dolly!" with "The Times They Are A-Changin'."

Two things, however, I got: Van Ronk was a hellacious guitar picker, and he was the only white guy I'd ever heard whose singing showed he understood Armstrong and Muddy Waters. He roared and bellowed like a hurricane; he could be threatening, and tender as the night. And he was funny. Not cute funny--really funny. He did bits from W.C. Fields, whose movies, like those of the Marx Brothers, were just being revived. He did "Mack the Knife" with a suddenly acquired tremolo I later found out was Marlene Dietrich's. He finished with "Cocaine," which he'd adapted from the Rev. Gary Davis, his friend and teacher, adding his own asides ("Went to bed last night singing a song/Woke up this morning and my nose was gone"). Decades later, Jackson Browne revived the tune, his band parsing Van Ronk's solo guitar.

There are many Van Ronk undercurrents flowing through American pop culture. The acclamation that followed his death from colon cancer early this year strangely mirrored his ghostly omnipresence during life. He was a missing link: an authentic songster who voiced folk-made music. At his artistic core, he reconnected jazz to folk-music forms that he, like his avatar Woody Guthrie, pursued, learned and kept alive--and, with the wit and humor that kept homage from freezing into reverence, dared to reimagine.

A big, burly guy whose personality was as oversized as his voice, Van Ronk never crossed over to commerciality, never got mainstream-famous. In those ways, he was a true exemplar of the folk-revival aesthetic: becoming too visible or successful equaled selling out. He followed the time-honored American path into this culture's musical heart: He studied sources and learned from living African-American performers. Those sources included Piedmont ragtime pickers like Blind Blake and Blind Boy Fuller and Delta deep-bluesmen like Son House, as well as parlor music. Then there was the Rev. Gary Davis. He'd dazzled 1940s Harlem street corners with his stylistically wide-ranging guitar and whooping singing, careening from biblical shouts to leering lipsmackers, and by the 1960s had become a teacher who drew Village hipsters to his small brick house in Queens. This was the era when Moondog, the eccentric jazz poet, took up his post near the Museum of Modern Art and did, well, whatever he felt like doing that day.

Maybe it's not surprising that I was so confused by these figures that I didn't guess until later that I'd seen some of the last stages of America's oral culture.

The acceleration of technological change has inevitably altered the oral process of folk-art transmission. In the twenty-first century it seems that, for better and worse, technology has rendered the Van Ronks oddly superfluous. In evolution, if not architecture, form follows function. The concept of folk music hatched by Charles Seeger and the Lomaxes, and embodied by Woody Guthrie, Lead Belly and Pete Seeger, has, in the age of mass recording, lost the daily uses that made it folk art. Where once songsters were the repositories and transmitters of our polyglot national folk heritage, where Van Ronk's generation of amateur and semipro musicologist-sleuths sought out records tossed into people's attics and garages to find artists obscured by the mists of time, now, thanks to the omnipresent, profitable avalanche of record-company CD reissues, almost anything they dug up is readily available. Of course, the artists and their cultures are not.

So our easy connection with the cultural past is shaped by the recording studio, with its time constraints and pressures and implicit notion of a fixed performance guarded by copyright--and the possibility of paying publishing royalties that are the core of the music industry's economy. That inevitably alters performances from folk art, where borrowing and repetition are demanded. Thus we've lost the idiosyncratic twists to the oral/aural tradition that an artist of Van Ronk's caliber introduces, casually and yet integrally, however much they appear like asides.

"This song has changed since Gary used to do it," he used to growl, introducing "Cocaine." Which was, of course, part of the point, the method of transmission, of real folk music: If culture is a conservative mechanism, a cumulative record of human activity, change results from disconnections and accretions like Van Ronk's sharp-witted reactions to Davis's barbed blues, originally improvised add-ons drawn from his memory of lyrics the way a jazz musician pulls riffs from history and reworks them into his own voice.

Van Ronk was a die-hard collector of sources, living and recorded. As the liner notes to the 1962 album In the Tradition put it, "Dave Van Ronk has established himself as one of the foremost compilers of 'Jury Texts' regarding traditional tunes. (Jury Texts are when many verses are sung to one tune, usually with some new words appearing with each subsequent recording.) Here, in 'Death Letter Blues,' Van Ronk has arranged some of the most moving verses of this song into a dramatic slow blues." Behold the songster at work--a process found in early Armstrong, Guthrie and Robert Johnson.

Although the building blocks of oral culture are plastic, preservationists in a nonoral culture tend toward reverence, and thus simpler imitation--hence the folk revival's slew of earnest groups like the New Lost City Ramblers. As Van Ronk observed in a late 1970s interview in the folk music quarterly Sing Out!, "It was all part and parcel of the big left turn middle-class college students were making.... So we owe it all to Rosa Parks." While black rhythm-and-blues was revving white teens into rock and roll, black folk artists became heroes to young white collegians. The left cast a romantic, even sacramental aura over black (and white) folk art and its traditions, which implicitly stigmatized creative change; the central notion of folk-revival culture, authenticity, meant avoiding commercial trappings and replicating a recorded past.

Perhaps it was Van Ronk's deep study of that past that helped him avoid fixing it. In a late 1990s interview, asked about Harry Smith's Anthology of American Folk Music, he rightly called it the bible of his generation and noted dryly, "I sat up and took notice at how many tunes that, say, Doc Watson does that are on the Anthology.... Some he would have known [via oral tradition]. But you can tell. There are hundreds of possible verses. When someone does [lists three verses in order], you know they've been listening to Bascom Lamar Lunsford."

"One thing I was blessed with is that I was a very, very bad mimic," Van Ronk once observed. Which is another view of how oral tradition mixes conservation and creativity. Van Ronk's background allowed him to understand this uniquely.

He was born in Brooklyn on July 30, 1936, a Depression baby to a mostly Irish working-class family. His father and mother split, and he grew up in blue-collar Richmond Hill, Queens, where he went to Catholic school--or played truant--until the system gave up on him, at 16. In 1998 he told David Walsh, "I remember reading Grant's memoirs, the autobiography of Buffalo Bill. Lots of Mark Twain.... My brain was like the attic of the Smithsonian.... The principal...called me 'a filthy ineducable little beast.' That's a direct quote." Like Guthrie, Van Ronk became a formidable autodidact. While he hung out in pool halls he was listening to jazz--bebop, cool, then traditional, a.k.a. New Orleans or Dixieland jazz, a style with its own cult of authenticity. He fell in love with Armstrong and Bessie Smith, along with Lead Belly and Bing Crosby, his major vocal influences.

Like Odysseus, Guthrie, Kerouac and Pynchon, Van Ronk decided to take to the sea. In 1957, he got a shore gig at the Cafe Bizarre in the Village. Odetta, the gospel-voiced black singer who gave the 1950s folk scene an interracial connection--as Lead Belly, Sonny Terry and Brownie McGhee, and Josh White had to the first Depression wave--heard him, liked him and convinced him to make a demo tape that she'd pass on to Albert Grossman, folk-music maven, Chicago club owner and future manager of Bob Dylan. Popping Benzedrine in the best Beat fashion, Van Ronk hitchhiked to Chicago in twenty-four hours, got to Grossman's club, found out the tape hadn't, auditioned, got turned down (Grossman was booking black songsters like Big Bill Broonzy, and Van Ronk accused him of Crowjimming), hitchhiked back to New York, had his seaman's papers stolen and thus decided that he would, after all, become a folk singer.

Given his sardonic realism, it was fittingly ironic that he and his wife, Terri Thal, became quasi parents for dewy-eyed collegiate folkies drawn by Guthrie's songs and Seeger's indefatigable college-concert proselytizing. Seeger's shows planted folk-music seeds on campuses across the country, but Smith's Anthology provided the rich soil for the next generation of folk musicians. "Cast your mind back to 1952," Van Ronk told one interviewer. "The only way you could hear the old timers was hitting up the thrift shops. When the Anthology came out, there were eighty-two cuts, all the old-time stuff. I wore out a copy in a year. People my age were doing the same." As did his musical stepkids.

Van Ronk once said of Seeger, "What am I supposed to say about the guy who invented my profession?" By the late 1950s that profession had migrated far from Lead Belly and Guthrie, songsters who lived the lives they chronicled, and far from Seeger's fierce anticommercialism and romantic faith in a pure, true folk culture. History intervened. Seeger had refused to testify before the House Un-American Activities Committee and had doggedly resurfaced in the post-McCarthy era. Still, less threatening figures like Burl Ives became the commercial faces of folk music. As Joe Klein noted in Woody Guthrie: A Life, the folk revival offered record companies an exit from payola scandals and the racial and sexual fears that had generated mainstream disapproval of rock and roll. The patina of integrity and authenticity covering white collegiate folk music helped the labels repolish corporate images.

Starting in 1957, the Kingston Trio cleaned up old tunes like "Tom Dooley" and "Tijuana Jail" and scored several top-25 hits. Neat folk groups proliferated, feeding into the Village and Cambridge, Massachusetts, where young men and women donning recently acquired rural accents and denims recycled the Anthology's songbook and hoped to catch a label's ear.

In 1959, when Bobby Zimmerman was leaving behind his piano à la Jerry Lee Lewis for college and the Anthology's lures, Van Ronk made his first records, now compiled on The Folkways Years (Smithsonian/Folkways); they unveil a songster misclassified. Van Ronk once said, "I never really thought of myself as a folk singer at all. Still don't. What I did was to combine traditional fingerpicking guitar with a repertoire of old jazz tunes." Here he does a Gary Davis-derived staple of his repertoire, "Hesitation Blues," and more blues and gospel. His big, rough voice and guitar dexterity are self-evident, as is his improvisational feel.

In 1964, he yanged with Dave Van Ronk's Ragtime Jug Stompers (Mercury), recording high-energy versions of tunes like "Everybody Loves My Baby" with a wild and ragged Dixieland outfit--one pole of his recurrent jazz-folk dialectic. On the solo album Sings the Blues (Folkways), his coarse voice and nimble fingers got looser--like the irrepressible Davis's--and thus he found himself.

"It was more academic than it is now," Van Ronk remembered in the 1970s:

It was 'de rigueur,' practically, to introduce your next song with a musicological essay--we all did it. There was a great deal of activity around New York--not so much you could make money at. But there were folk song societies in most of the colleges and the left was dying, but not quietly. So there was a great deal of activity around Sing Out! and the Labor Youth League, which wasn't affiliated with the old CP youth group, you understand. There was a lot of grassroots interest among the petit-bourgeois left.

Spoken like the sly observer who once told an interviewer from the International Committee of the Fourth International, "I've always liked Trotsky's writings as an art critic."

By 1961 Bobby Zimmerman was Bobby Dylan and had arrived in New York, Van Ronk was an insider on the Village folk scene and the two gravitated toward and around each other, thanks partly to what Van Ronk called the take-no-prisoners quality of Dylan's music and personality. Ramped up by commercial success, the postwar folk revival was approaching its peak, and the debates about authenticity grew louder. "All of a sudden," Van Ronk recalled a few years back, "there was money all over the place."

He settled into the Gaslight, a hub for noncommercial folkies. Several other pass-the-hat beat-folk coffeehouses, like Cafe Wha?, opened. By 1962 Dylan had settled in down the block, at the grander Gerde's Folk City. Izzy Young of the Folklore Center, part of the older folk-revival wave, had set up a folk-music showcase, WBAI had broadcast the shows and club owner Mike Porco, realizing he had a salable product, ousted both, lining his bar with record covers and his seats with young beatniks. Porco's Monday night Hoots were the dollar-admission descendants of both Young's and Seeger's earlier informal loft gatherings, and he showcased rediscovered legends like John Lee Hooker with Dylan as the opener. Tom and Jerry--later known as Simon and Garfunkel--and Judy Collins cut their teeth there. Kids flocked to this semi-underground. Jug bands emerged as the college-beatnik equivalent of the 1950s blue-collar rockabilly outbreak in the South, and street-corner doo-wop in the North, prefiguring the 1960s garage-band explosion after the Beatles and electric Dylan. The link: Everyone felt empowered to make music. These were folk musics.

The Newport Folk Festival, the crowning triumph of the postwar folk revival, was first organized in 1959 by jazz impresario George Wein and Albert Grossman, and graduated the purer wings of the folk movement to big-time concerts; Seeger himself was involved. "I never liked those things," Van Ronk characteristically recalled. "It was a three-ring circus.... You couldn't even really hear what you came to hear. Put yourself in my position, or any singer's position: How would you like to sing for 15,000 people with frisbees?" Along with his own musical catholicity, that may be why, even after the Dylan-goes-electric blowup at the 1965 festival, Van Ronk remained a Dylan defender.

"Nervous. Nervous energy, he couldn't sit still," is how he spoke of young Bob to David Walsh in 1998:

And very, very evasive.... What impressed me the most about him was his genuine love for Woody Guthrie. In retrospect, even he says now that he came to New York to 'make it.' That's BS. When he came to New York there was no folk music, no career possible.... What he said at the time is the story I believe. He came because he had to meet Woody Guthrie.... Bobby used to go out there two or three times a week and sit there, and play songs for him. In that regard he was as standup a cat as anyone I've ever met. That's also what got him into writing songs. He wrote songs for Woody, to amuse him, to entertain him. He also wanted Woody's approval.... [Dylan's music] had what I call a gung-ho, unrelenting quality.... He acquired very, very devoted fans among the other musicians before he had written his first song.

Van Ronk was the first to record a tune Dylan claimed to have written, "He Was a Friend of Mine," on Dave Van Ronk, Folksinger in 1962 (the album has been reissued as part of Inside Dave Van Ronk [Fantasy]). Three years later, the Byrds redid it on Turn! Turn! Turn!, whose title cut remade Seeger's setting of Ecclesiastes into folk rock, the new sound Dylan had kicked into high gear during his 1965 tour.

Van Ronk once observed, "The area that I have staked out...has been the kind of music that flourished in this country between the 1880s and, say, the end of the 1920s. You can call it saloon music if you want to. It was the kind of music you'd hear in music halls, saloons, whorehouses, barbershops, anywhere the Police Gazette could be found." That's not exactly a full description of what he did over thirty albums and countless performances. Better to think of him as a songster, an older, more encompassing sort of folk artist. Lead Belly and the Reverend Davis are outstanding examples of this type; they drew from multiple local and regional traditions that, in the early days of radio and phonograph, still defined American musical styles. Dance tunes, blues, ragtime, ballads, gospel--anything to keep the audiences on street corners or in juke joints interested and willing to part with some cash. This was, after all, performance. Entertainment was its primary goal; improvisation, found in the vocal-guitar interplay and instrumental backing as well as verse substitutions and extrapolations or shortenings, played to audience reaction.

In 1962, with the Red Onion Jazz Band, Van Ronk cut In the Tradition, which, along with the solo Your Basic Dave Van Ronk Album, cut in 1981, will be included on the forthcoming Two Sides of Dave Van Ronk. This somewhat odd couple makes a wonderful introduction to the breadth, depth and soul of this songster's legacy. The smoothly idiomatic Red Onions pump joyful New Orleans adrenaline and Armstrong trumpet into a raucous "Cake Walking Babies From Home"; a sinuous "Sister Kate," that dance hit built from an Armstrong melody; and Dylan's caustic "All Over You." Amid the Dixieland are solos: a stunning version of Son House's "Death Letter Blues" (later recorded by Cassandra Wilson), Lead Belly's "Whoa Back Buck," the virtuosic ragtime "St. Louis Tickle," signature pieces like the gentle "Green, Green Rocky Road" and "Hesitation Blues." The tunes drawn from Your Basic Dave Van Ronk Album show no diminishing of talent and a continuing breadth of perspective: Billie Holiday's "God Bless the Child" (sung with a tenderness that scorches periodically into Howlin' Wolf) and "St. James Infirmary" share space with tunes by Davis and Mississippi John Hurt.

In 1967 he cut Dave Van Ronk & The Hudson Dusters (Verve Forecast), a cross of jug band and electric folk music that echoed The Blues Project, the improvising garage band that Van Ronk pupils Danny Kalb and Steve Katz had formed. There was doo-wop, Joni Mitchell (whose Clouds becomes anguished, thanks to Van Ronk's torturous voice breaks, used with interpretive skill--a move he learned from Armstrong and Bessie Smith) and the balls-to-the-wall garage rock of "Romping Through the Swamp," which sounds akin to Captain Beefheart.

Recorded in 1967, Live at Sir George Williams University (Justin Time) is time-capsule Van Ronk on guitar and vocals, working through pieces of his repertoire: "Frankie and Albert," "Down and Out," "Mack the Knife," "Statesboro Blues" and, of course, "Cocaine"--all masterful, each distinct.

By then the folk boom, its audience bleeding into folk-rock, electric blues and psychedelia, had stalled and ended. Van Ronk continued (except for a hiatus in the 1970s) to perform and record and gather new-old material. And he had time, before his death, to deliver some acid reflections.

On 1960s folkies:

The whole raison d'être of the New Left had been exposed as a lot of hot air, that was demoralizing. I mean, these kids thought they were going to change the world, they really did. They were profoundly deluded.... Phil Ochs wrote the song "I declare the war is over," that was despair, sheer despair.

On 1980s folkies:

You're talking about some pretty damn good songwriters. But I'd like to hear more traditional music.... With the last wave of songwriters you get the sense that tradition begins with Bob Dylan and nobody is more annoyed with that than Bob Dylan. We were sitting around a few years ago, and he was bitching and moaning: "These kids don't have any classical education." He was talking about the stuff you find on the Anthology [of American Folk Music]. I kidded him: "You got a lot to answer for, Bro."

When a girl becomes her school's designated slut, her friends stop talking to her. Pornographic rumors spread with dazzling efficiency, boys harass her openly in the hallways, girls beat her up. "WHORE," or sometimes "HORE," is written on her locker or bookbag. And there is usually a story about her having sex with the whole football team, a rumor whose plausibility no one ever seems to question.

Even those of us who weren't high school sluts and don't recall any such outcast from our own school days have become familiar with her plight--through media stories and the growing body of feminist-inspired literature on female adolescence, as well as the talk shows and teen magazine spreads that have made her their focus. What's harder to understand is how the label persists when the landscape of sexual morality that gives it meaning has so drastically changed--well within living memory. If the sexual revolution didn't obliterate the slut, wouldn't the successive waves of libidinous pop stars, explicit TV shows and countercultural movements to reclaim the label have drained it of its meaning? What kinds of lines can today's adolescents, or those of the 1990s or 1980s, for that matter, possibly draw between nice and not nice girls?

Emily White's Fast Girls sets out to look at the central dilemmas of the slut label. Two earlier books that have focused on the slut--Leora Tanenbaum's Slut! Growing Up Female With a Bad Reputation, a collection of oral histories, and Naomi Wolf's Promiscuities, a reflection on girls' sexual coming-of-age in the 1970s that combines memoir with a casual survey of the women Wolf grew up with--rely primarily on the subjective narratives of women and girls to explore the slut phenomenon. Paula Kamen's Her Way: Young Women Remake the Sexual Revolution surveys the sexual mores and activities of young women, but not specifically of teenagers. White is the first to combine different methodologies in an attempt to write specifically about the functions and significance of the teenage slut--in her words, "to shed some light on that space in the high school hallway where so many vital and troubling encounters occur."

White spoke to or corresponded with more than 150 women who had been the sluts of their school (whom she found largely by soliciting their stories through newspaper ads), and she spent "a couple of weeks" observing in a Seattle-area public high school. She also offers cultural criticism--of horror movies and the riot grrrls, for instance--as well as a digest of psychological, sociological and post-structuralist theory pertinent to the subject. White's evident ambition makes it all the more frustrating that the book's impressive breadth doesn't translate into thoroughness or rigor.

When White interviewed the women--most of them white, middle-class and from the suburbs--who responded to her ads, the stories she heard had certain similarities. There was a "type" of girl who tended to be singled out: She developed breasts earlier than other girls; she was a loud, vocal extrovert; she was self-destructive, tough or wild; often she had been sexually abused; and in one way or another she was usually an outsider, whether she had moved from a different town, had less money than most kids or belonged to some peripheral subculture. Some women described themselves as having been promiscuous, but more said they were not as sexually active as their (untainted) friends, and none of them had done the things that were later rumored. Often the first rumors were started by bitter ex-boyfriends or jealous friends. Once they caught on, the ritual torments and "football team" fantasies inevitably followed.

These similarities make up what White calls the "slut archetype," and for much of the book she riffs on the common factors of the stories, with chapters dedicated to subjects like the role of suburbia, the slut's social isolation and the preponderance of sexual abuse. Though sprinkled liberally throughout the book, the women's testimonies are only a launching point for White's meditations. She writes about these interviews in a way that at times both romanticizes and condescends to the women. "She walks so confidently in her boots," writes White of one 18-year-old, "causing tremors in the ground beneath her feet. She presents herself as a girl who has crawled up out of the underworld, who has found her way through the isolation and the drugged dreams.... It is a way of coping, this tough act. It's a start." Still, despite certain problems of credibility, this overwrought style is pretty effective at conveying the anguish of the ostracized adolescent girl (if only by echoing her earnest self-dramatization). It's much less suited to considering the girl in social and cultural context.

In editing and interpreting her interviews, White emphasizes their similarities at the expense of the kind of detail that makes a particular social universe come to life. Her time observing the Seattle-area high school students inspires mostly familiar observations. ("The cafeteria is high school's proving ground. It's one of the most unavoidable and important thresholds, the place where you find out if you have friends or if you don't.") Only about half the time do we get any real sense of the sort of community an interviewee grew up in or what the social scene was like at her school. There's even less detail about precisely how she fit into the hierarchy before the slut label took hold, whether she was perceived as threatening or flirtatious, what her past relationships were like with girls, boys and teachers. Even worse is that for all their lack of texture, the women's stories are by far the most interesting part of the book; when White pulls away to supply her own commentary, it's usually vague and predictable--precisely because she's not attuned to the details that would reveal how the slut really functions in the teenage universe. Although she acknowledges that the slut myth is much bigger than any individual girl behind it, she is also attached to the literal-minded notion that the girl being labeled has some kind of privileged relationship to the slut myth--that her individual story is the slut story, and the women's emotional recollections of abuses and scars collectively explain the slut myth. In fact, to understand the myth we need to know at least as much about what the rest of the school is thinking.

White suggests that "the slut becomes a way for the adolescent mind to draw a map. She's the place on the map marked by a danger sign...where a girl should never wander, for fear of becoming an outcast." But, given the arbitrary relationship White found between the slut label and a girl's actual sex life, does the slut myth really have any practical applications for girls? Do they limit sexual activity out of fear of these rumors? Are there particular sex acts that can bring censure in themselves? Can social status insulate some girls from slutdom, regardless of how much they fool around? White doesn't directly pose these questions, but one of her findings hints that, though they may fear the label, kids themselves interpret slutdom as primarily an expression of social status rather than a direct consequence of sexual activity: "Girls who at one time might have been friends with the slut recede as her reputation grows; they need to be careful how they associate with her or they will be thought of as sluts along with her."

The slut doesn't seem to point to an actual line that a nice girl can't cross; she commemorates the fact that there once was such a line, and suggests that the idea of a line still has currency, even if no one can figure out where it is anymore. It's no surprise that she is such a popular subject for third-wave feminists; her ostracism seems to have implications not only for residual sexism but for the way that we personally experience sex and desire.

I didn't think I had a personal connection to the slut story. For most of my adolescent years, which were in the late 1980s and early '90s, I was very good, and too awkward to attract attention from boys. In the schools I attended there were whispers about who did what, and some girls were considered sluttier than others, but there was no single figure who captured the imagination of the whole class.

Then I remembered something about one of the girls I was closest to from age 10 to about 13 or 14. We didn't go to the same school, but for much of the time we both attended Saturday Russian classes held in her kitchen by an elderly neighbor. She was the only one of my friends who was, like me, born in Russia, though her family still lived in Philadelphia's immigrant neighborhood while mine had moved to a more prosperous, non-Russian suburb several years earlier. My family had a bigger house. We had, thanks to my American stepdad, more American ways of doing things. I was a better student. I think she was more popular at her school than I was at mine; at least, she was more easygoing and sociable. I never felt in awe of her, as I did of other friends. I was not always nice to her, though usually I was.

She knew more about sex in our early years than I did, but, like me, she didn't go out with anyone in the time we knew each other. She was pretty, in a round-faced, unfashionable way that made me think I had a discerning eye for appreciating it. She always seemed more developed than I was. (That may not have been true in any measurable sense.) At some point in those years, though it didn't particularly affect our friendship, and I don't remember thinking about it while I was actually with her, I began to spend nights casting her as my proxy in every kind of pornographic fantasy I could conjure.

It's always difficult to figure out the relationship between cultural fantasies and mores, on the one hand, and actual behavior and sexual self-image on the other. You could probably spend a long time listening to teenagers and still not get to the bottom of how the slut myth filters into their own lives. Still, the site of the slut's continuous re-creation, the high school hallways, deserves closer scrutiny, and the mysteries of her endurance await further exploration.

Campbell McGrath's entertaining and frustrating fifth book of poems--every single one of them devoted to some aspect of Florida--raises two large questions. One has to do with representations of that state; the other, with precision, personality and populism in poetry, and the relative value of each.

Elizabeth Bishop, who lived in Key West for some years, called Florida "the state with the prettiest name," "the state full of long S-shaped birds, blue and white"; Wallace Stevens saw in Florida's "venereal soil" an escape from intellection--though he came to find its fertility unnerving. Among living poets, William Logan, Tony Harrison and Michael Hofmann have all taught in Gainesville and written about it. Donald Justice described the Florida of his youth in such poems as "A Winter Ode to the Old Men of Lummus Park, Miami, Florida." Dionisio D. Martínez evoked the state's lightning-prone flats in Bad Alchemy, while Karen Volkman skewered Miami in her much-anthologized "Infernal."

McGrath aims to capture in verse a Florida as disturbing as any of those, and far more comprehensive. His narrative, didactic, essayistic and lyric poems together try to depict the whole troubled state, a state that (in McGrath's view) cries out either for political action to set it on a new course or for an apocalypse to wash it all away. As in his celebrated Spring Comes to Chicago (1996), McGrath's models here include the Ginsberg of Howl; the Whitman of big catalogue poems like "A Song for Occupations"; crowd-pleasing comic poets like Billy Collins; and writers of the modern left--from Carl Sandburg to Martín Espada--who wish to tie locally oriented description to socioeconomic protest. McGrath offers, first, a ten-part narrative poem (based on Aristophanes' Birds) called "A City in the Clouds"; next, a group of short poems on subjects Floridian; last, a long verse-essay called "The Florida Poem." Though they share attitudes, topics and techniques, each section has to be judged on its own.

McGrath's narrative shows the rise, success and eventual fall of an airborne city built above Florida--one that bears remarkable resemblances to it. Readers of Aristophanes, or of the headlines, will know quickly what fate McGrath's cloud-folks face (or refuse to face): Seeking a carefree New World, the cloud-dwellers end up dependent on complex irrigation, McDonald's sandwiches, tourism, real estate speculation, overbuilt prisons and exploited noncitizen "laborers [who] were needed...to man the pumps for the earthward flow of water upon which their entire economy depended." Menaced by aerial alligators, then by failing machinery, the cloud-folks finally let the city collapse. The poem's most original moments are those closest to (prose) science fiction: In one, the cloud-dwellers haul up "whatever could be gathered at the ever-shifting terminal point where the wind-flexed elevator shaft met the ground."

Despite such descriptive energies, McGrath's cloud-poem lacks the verbal reliability we expect from most modern verse: His long lines can forsake semantic control. Here, for example, the citizens view their new home:

Times the clouds were like riven badlands, foils and arroyos and alluvial fans, rough country best traversed with safety ropes as if crossing polar seas over plates of tilting ice.

...

Times the clouds were gongs and temples, a rapture in pewter, grand passions, coffers of incense and precious woods.

Rapture and passions. Badlands and alluvial fans, and ice. Often McGrath seems to operate by the rule "Never use one word when three will do": The cloud-dwellers "missed things, various places and objects, old friends or distant cousins, specific sounds, familiar certainties" (as against unfamiliar ones). Later we see "luxurious waterfalls rooted in the barest mist or veil of vapor." Nor is such excess confined to the narrative poem. In the short poem "The Miami Beach Holocaust Memorial" McGrath summons "the vestigial memory of some as yet undreamed/category of violent distinction and hatred," a phrase almost any prose editor would blue-pencil.

McGrath's vivid description and his social critiques carry over into his short poems. So, alas, does his insistence on spelling things out. In a villanelle about the Florida State Fair,

         ...we're stamping and hooting all over the place
while the Texas swing band plays "Rocky Top, Tennessee"
and Haitian kids dip kettle candy beneath a live oak tree
in historic Cracker Country, apt and ironic misnomer for the place,
because this is Florida, after all, not Texas or Tennessee.

McGrath has to tell us what he finds "ironic"; otherwise we might not know. Elsewhere it can be hard to tell if he's kidding: "Trouble with Miami," one poem opines, "is...a dearth of cultural infrastructure so profound//that the only local institution worth its salt is the ocean," where "watching the beautiful women on the beach/...may be our best shot at real enlightenment." Is this a persona we're meant to dislike? Apparently not: "Florida," McGrath explains later, "is bereft of mythic infrastructure,/symbolically impoverished." It's an odd complaint in a book full of (highly symbolic) conquistadors, Seminoles, mangroves, alligators, mouseketeers and scarily iconic restaurants (of which more below). If that's "symbolically impoverished," what to call Delaware?

McGrath seems to mean not that Florida lacks symbols, but that its symbols end up either sinister or ridiculous, or both. The poems he finds he can make out of them are comic, and the comedy moves him to complain. When McGrath instead describes his private life, he can be more careful, and far more likable: "The Zebra Longwing" (named after a butterfly) ends as follows:

Wings
have borne them away
from the silk
of the past as surely
as some merciful wind
has delivered us
to an anchorage of such
abundant grace,
Elizabeth. All my life
I have searched, without knowing it,
for this moment.

McGrath has transported James Wright's famous poem "A Blessing" to a warmer climate and a happy marriage. He's done it so carefully that the transposition works.

McGrath rarely gets that calm, though; normally he wants for his own work the prophetic enthusiasms of Whitman or Ginsberg, who also combined sometimes-radical politics with long personal digressions. Yet Ginsberg and Whitman at their best were fascinated by the individuals who made it into their poems, whether for half a book (Carl Solomon in Howl) or for a couple of lines (Whitman's soldiers, prostitutes, firefighters). McGrath almost always considers people other than himself in fairly large groups--cloud-dwellers, exploited workers, the Calusa, the old folks, the tourists. He does better with "Maizel at Shorty's in Kendall":

All shift them sugar donuts
been singing to me,
calling to me something crazy in a voice
Dolly Parton'd be proud of--Maizel, honey,
eat us up!

Notice the alphabetical acrostic (lines begin a, b, c--), a form McGrath uses three times. It suits him, since it allows for long free-verse lists. "What I loved most," he declared in Spring Comes to Chicago, "was the depth and rationality of the catalogue"; here one acrostic ("Seashells, Manasota Key") comprises nothing but catalogue, from "Abras, augers, arks and angel wings" to Zirfaea crispata.

These lists take their place among other manifestations of McGrath's exuberance: He loves to say what he sees, and he finds most of it either very attractive or ugly indeed. Poetry, Yeats said, came not from our quarrels with others, but from our quarrels with ourselves. If there's such a quarrel here, it sets McGrath's impulse to celebrate absolutely everything--cars, lightning, alligators, America--against his understandable sense that Florida, and the other forty-nine states, are resource guzzlers headed for a fall. Usually, though, these poems enact McGrath's excited quarrels with others. Of "Disney's realm of immortal/simulacra," McGrath says that it makes too easy a target "when there are nastier vermin to contest," vermin like "Orlando itself," where "the anthem of our freedom is sung by Chuck E. Cheese." There follows a three-page attack on that fast-food chain and its iconic mouse, "the monstrous embodiment of a nightmare," designed "to entice the youngest among us/to invest their lives in a cycle of competitive consumption." This lengthy philippic against a pizzeria moves beyond predictability, beyond comedy and beyond politics into a vituperation as excessive as it is entertaining: What did Chuck E. Cheese ever do to McGrath?

In poems like that one ("Benediction for the Savior of Orlando"), McGrath is at bottom a dazzling performer, as much so as the cartoon figures he says he hates, though with an admirable politics his corporate nemeses obviously lack. The standard critique of, say, TV ads (they reduce us to passive receivers) might hold just as true for McGrath's verse, which leaves us little to figure out for ourselves. "The Florida Anasazi" attacks "the alligator-headed figure known to us as The Developer who works his trickery upon the people of the tribe, pilfering communal goods, claiming to produce that which he despoils." Pound called poetry news that stays news. Is this news? Does it tell us anything unexpected, either about how to understand evil developers or about how to resist what they try to do?

The long poem titled "The Florida Poem" is a different, and happier, matter. In it McGrath returns to a form that can showcase his talents and neutralize most of his faults. The form is the long, research-filled essay-cum-rant, with roots (McGrath's note suggests) in Pablo Neruda's Canto General--and in McGrath's own bigger, better, earlier, funnier "Bob Hope Poem" (from Spring). Neruda in one way, and "Bob Hope" in another, tried to give the history of a continent; here McGrath contents himself with one state in the union, about which his form allows him to say, and to enjoy saying, anything at all, from the whimsical to the sarcastic to the mock-classical ode:

Sing through me, o native goddess, o sacred orange
blossom nymph, o Weeki Wachee naiad...

...

Florida: it's here!
Florida: it's here and it's for sale!
Florida: it's neat, in a weird way!
Florida: Fuckin' Fantastic!
This would be my official suggestion for a new state motto...

Much of the poem returns to familiar targets, "marketers/and technocrats and mouseketeer apparatchiks" and so on. Yet the real subject of "The Florida Poem" is not the damage such folks have done but instead McGrath's feelings about the state they have produced, with its eye-popping sights and consumerist excess, its real fun and its false Fountains of Youth:

...I myself have more than once
been enticed to swim in the icy oasis of DeLeon Springs,
and have eaten at the remarkable restaurant
reputedly housed in an old Spanish mill
where they grind still the wheat
to mix the batter you pool and flip on a griddle
in the middle of your very own table.
Pancakes!
Pancakes and alligators and paddleboats and ruins
of vanished conquerors vanquished
in their turn. It's one of my favorite places in the state,
not merely for the flapjacks and historical ironies
but for the chaste fact of its beauty.

In this kind of writing, compression, obliquity, even precision, may be sacrificed for the sake of a voice. For this reason alone "The Florida Poem" is by far the best in the book. Its size lets it encompass both the obvious judgments McGrath thinks we need to hear (conquistadors bad, manatees good, "Indians...easily romanticized" yet "human, familiar with power and avarice") and the details that make those judgments entertaining even at their most predictable. (Floridian readers--especially if they speak Spanish--may call to mind aspects of their state McGrath leaves out.) Above all, "The Florida Poem" gives us the sound of a person talking: It has not only the faults but some of the virtues of what's now called "performance poetry" (a movement to which McGrath has not been linked):

          ...Andrew Jackson bought the whole place
for five million dollars and a solemn promise
to relinquish all future American
claims to Texas.
Hmm.

It's because McGrath--ordinarily--can't slow down for more than a couple of syllables that he gets comic effects from that one-line nonword. Elsewhere his rant reminds me of Williams's splendid and splenetic "Impromptu":

What the governor meant was
come and get it,
tear it
down, rip it up,
mill it for lumber, boil it for turpentine,
orchard it for oranges or pit-mine it for phosphates,
shoot it for hides or skins or quills
or fun...

"It" comes to mean at once particular natural resources, the exploited population and the whole state: It's a neat rhetorical effect, one McGrath can only achieve in a long poem, and one that makes this long poem worth a try. As it spreads back into the prehuman past, and then into a misty future, McGrath applies these effects of capacious verve not just to the parts of the state he hates but to scenes within the state he loves:

...visceral
of an element so clear each grain of sand
sings forth, each bordering leaf of oak or heliconia,
each minnow or sunfish in the mineral wicker-work,
one jump, one plunge
toward the crevice of rifted limestone
wherefrom the earth pours forth
its liquid gift...

Now that's a Florida worth going to see.

As Halle Berry elegantly strode to the podium to accept her best actress Oscar, the first for a black woman, she wept uncontrollably and gasped, "This moment is so much bigger than me." Just as revealing was Denzel Washington's resolute dispassion as he accepted his best actor Oscar, only the second for a black man, by glancing at the trophy and uttering through a half-smile, "Two birds in one night, huh?" Their contrasting styles--one explicit, the other implied--say a great deal about the burdens of representing the race in Hollywood.

Berry electrified her audience, speaking with splendid intelligence and rousing emotion of how her Oscar was made possible by the legendary likes of Dorothy Dandridge, Lena Horne and Diahann Carroll. And in a stunning display of sorority in a profession riven by infighting and narcissism, Berry acknowledged the efforts of contemporary black actresses Angela Bassett, Jada Pinkett Smith and Vivica Fox. But it was when Berry moved from ancestors and peers to the future that she spoke directly to her award's symbolic meaning. She gave the millions who watched around the globe not only a sorely needed history lesson but a lesson in courageous identification with the masses. Berry tearfully declared that her award was for "every nameless, faceless woman of color" who now has a chance, since "this door has been opened."

Berry's remarkable courage and candor are depressingly rare among famed blacks with a lot on the line: money, prestige, reputation and work. Many covet the limelight's payoffs but cower at its demands. Even fewer speak up about the experiences their ordinary brothers and sisters endure--and if they are honest, that they themselves too often confront--on a daily basis. To be sure, there is an unspoken tariff on honesty among the black privileged: If they dare go against the grain, they may be curtailed in their efforts to succeed or cut off from the rewards they deserve. Or they may endure stigma. Think of the huge controversy over basketball great Charles Barkley's recent comments--that racism haunts golf, that everyday black folk still fight bigotry and that black athletes are too scared to speak up--that are the common banter of most blacks. What Berry did was every bit as brave: On the night she was being singled out for greatness, she cast her lot with anonymous women of color who hungered for her spot, and who might be denied a chance for no other reason than that they are yellow, brown, red or black. Her achievement, she insisted, was now their hope.

At first blush, it may seem that Denzel Washington failed to stand up and "represent." But that would be a severe misreading of the politics of signifying that thread through black culture. Looking up to the balcony where Sidney Poitier sat--having received an honorary Oscar earlier and delivered a stately speech of bone-crushing beauty--Washington said, "Forty years I've been chasing Sidney...." He joked with Poitier, and the academy, by playfully lamenting his being awarded an Oscar on the same night that his idol was feted. Washington, for a fleeting but telling moment, transformed the arena of his award into an intimate platform of conversation between himself and his progenitor that suggested, "This belongs to us, we are not interlopers, nobody else matters more than we do." Thus, Washington never let us see him sweat, behaving as if it was natural, if delayed, that he should receive the highest recognition of his profession. His style, the complete opposite of Berry's, was political in the way that only black cool can be when the stakes are high and its temperature must remain low, sometimes beneath the detection of the powers that be that can stamp it out. This is not to be confused with spineless selling out. Nor is it to be seen as yielding to the cowardly imperative to keep one's mouth shut in order to hang on to one's privilege. Rather, it is the strategy of those who break down barriers and allow the chroniclers of those broken barriers to note their fall.

Both approaches--we can call them conscience and cool--are vital, especially if Hollywood is to change. Conscience informs and inspires. It tells the film industry we need more producers, directors and writers, and executives who can greenlight projects by people of color. It also reminds the black blessed of their obligation to struggle onscreen and off for justice. Cool prepares and performs. It pays attention to the details of great art and exercises its craft vigorously as opportunity allows, thus paving the way for more opportunities. The fusion of both approaches is nicely summed up in a lyric by James Brown: "I don't want nobody to give me nothin'/Just open up the door, I'll get it myself."

If you're in the mood to see great acting, I recommend that you watch Aurélien Recoing get caught in a lie in Laurent Cantet's Time Out. As Vincent, a French management consultant who is secretly unemployed--playing hooky from life, let's say--Recoing is forever being asked why he's hanging around in office towers, motel lobbies or parking lots. The truth is, he's dawdling: killing his own time, or spying on the way other people use theirs. But since dawdling in the modern world is a category of malfeasance, midway in seriousness between a theft and a threat--theft of an organization's private airspace, the threat to use that space without management approval--Vincent must continually justify his mere presence. Each time he fails to do so, Recoing brilliantly shows you how Vincent is a little slow with the first words of his excuse, a little too quick with the rest. You can see the lie form behind his pale, high forehead.

That expanse of flesh seems transparent not only to you but also to the security guards who challenge Vincent. They see how a flush tints his otherwise bloodless, round-cheeked face; they read the effect as shame (which it is, in part). But Recoing's ability to alight cleanly on each emotion, as a dancer hits the mark, is only the beginning of the marvels he performs in this role. What's really impressive is the talent he displays for playing simultaneously to his fellow actors and to the audience, revealing aspects of Vincent's makeup to you even as he conceals them from the people onscreen. The security guards often fail to guess what you do, that the flush comes from anger as much as shame; they seldom hear the note of outrage that wavers beneath Vincent's thin-lipped patter.

Of course, this two-faced performance owes a lot to Cantet, the writer-director of Time Out. He's the one who plotted out the successive views of Vincent, so the man's emotional truth would accumulate even as his lies pile up. But it's Recoing who is so good at making Vincent lie badly. One moment, you'd think his eyes opened onto the rear wall of his skull; a second later, and the pupils are glittering near the surface, in slits like two flesh wounds. All the while, as his mind visibly shoots forward and retreats, Vincent keeps pouring out the words, hopelessly, uselessly, as if he wanted and deserved to be caught.

Maybe he talks so volubly because words are all he has: the words of a corporate functionary, backed up by a cell phone, a car and a carefully tended dark suit. Time Out begins with Vincent using most of the above assets in one of those rare and ingenious opening shots that instantly define a movie. You see a field of nocturnal mist, which gradually proves to be a fogged-up windshield. Vincent has been sleeping behind the wheel. As the shot continues and dawn breaks, a bare landscape takes shape beyond the membrane of the car. Inside the automotive bubble, Vincent picks up the cell phone. Yes, he tells his wife smoothly, the meeting went well--so well that now he has to stay over. He probably won't be back tonight. Miss you, too.

The windshield has cleared. Outside, the world's activities have begun. Inside, Vincent is insulated (though none too well) from the first of the film's challenges. He isn't supposed to be parked where kids get off the school bus? Then he'll drive elsewhere. As you eventually learn, Vincent likes to drive.

We'll get to that revelation. But for now, since I've told you some of the plot, it's probably more important to acknowledge the true story that underlies Time Out. You may remember the newspaper account: For years, a man in France said goodbye to his family each morning and drove off to a nonexistent job. He spent his time sitting around in parking lots and coffee shops; he got his money by floating loans, which he never repaid. When the debts grew so pressing that exposure was at hand, the man took a gun and ended the imposture; he killed his family but, though he tried, not himself.

Cantet has transformed this violent reality by draining it of almost all physical menace. Granted, in one memorable scene Vincent seems on the verge of killing his wife (Karin Viard), but his method is the relatively passive one of abandonment in the snow. Also, at the climax of Time Out, Vincent raises his voice and storms clumsily through the house; but despite that, Cantet doesn't go in for explosive denouements. He's far more interested in the normal texture of this abnormal story. Cantet wants to explore the corridors of a glass office tower, to sink into the squared-off armchairs in a motel lobby, to follow a strip of road wherever it leads. Even as Vincent's anger and anxiety mount throughout the film, the shots and rhythms remain coolly composed.

When filmmaking is this precise and intelligent, critics habitually tack on a third adjective: dull. It's a judgment I was tempted to make whenever Vincent got together with his family. Those scenes felt obligatory, with Vincent trapped among the overbearing father, the surly son, the increasingly frustrated wife. If I had to live with this bunch of stock characters, I too might sleep in my car. But Cantet's vision of domestic life, though uninspired, makes up only a minor part of Time Out, whose patient and meticulous technique pulled me in whenever the film turned to a more congenial subject: criminality.

Vincent's most important relationship in Time Out is not with his wife but with a lean, wolf-faced smuggler named Jean-Michel (Serge Livrozet), who strikes up an acquaintance after catching him sleeping in that motel lobby. There's something wickedly avuncular about Jean-Michel, with his low, low voice and ironic smile. You could take him home to dinner (and wind up feeling like a guest in your own home). Jean-Michel leads Vincent into a world of darkened rooms full of cardboard boxes and darkened roads that slip across borders--a world that temporarily appeals to him.

It's during this time with Jean-Michel that Vincent makes the revelation I mentioned earlier, explaining how a love of driving cost him his job. When on the road to a business meeting, Vincent used to ignore the turnoff and simply keep going; he would drive on for one more exit, then for two, until he eventually stopped showing up at all and was fired. Jean-Michel gives Vincent the courtesy of accepting this story as a confession of good sense. And from what Time Out shows us of the business world, Jean-Michel is right.

What do people talk about in all those meetings? At one point, Cantet has Vincent spy on a conference-table gathering, so we can hear the presentation for ourselves: public-private strategic infrastructure business-model development. Nothing that you could smell or taste or pick up in your hand. Who wouldn't prefer the hum of wheels, the sound of the radio, to these endless polysyllables? And when you consider the cloudiness of this language of global trade, why shouldn't Vincent's old school chums believe him when he says he can take their cash into Switzerland and invest it in, ah, something or other? Some of these buddies all but force their money onto him. After all, he speaks so well.

Cantet's previous film, Human Resources, was similarly skeptical about the modern arts of management. That picture told the story of a young man from a working-class family who comes home from school to work in his father's factory. The father labors on the shop floor; the son, with his college education, hunches over a computer. I admired the way Cantet dramatized the homecoming, with congratulations quickly giving way to suspicion and resentment. (Wear a tie to work, and your favorite old bar might no longer be so comfortable.) But once the film's story kicked in, with the workers threatening to go on strike and the son being maneuvered to lie to them, Human Resources turned into more of a diagram than a movie. You could have taken a piece of graph paper and plotted the characters' relationships. In fact, that's what Cantet seemed to have done.

But there's nothing schematic about Time Out. However neat or decorous the storytelling, the movie respects the oddness of Vincent's refusal; which is to say, it reveals something of the oddness of the normal world by letting Vincent haunt it from a slight remove. And in Aurélien Recoing, the movie has a perfectly bland-looking Vincent whose every breath is charged with mystery. Recoing is your boring neighbor from down the hall, suddenly glimpsed doing the perp walk on the 10 o'clock news. He's Bartleby for the age of the euro; he's what you see in the mirror on Monday morning, before your eyelids mercifully ungum.

Recoing is an actor playing a character who is himself onstage full-time. He's in every frame of Time Out; and to every frame, he contributes something of genius.

Like it or not, America has been able to achieve and maintain its supremacy as a global power because of its capacity to absorb the best from the rest of the world. This dependency on foreign imports is especially clear in the realm of science and technology. Roughly one-third of US Nobel laureates were born outside the United States and became naturalized citizens. The father of the American nuclear program was a foreigner. But most foreign-born scientists toil away unrecognized in our nation's research labs, universities and private firms, forming the backbone of American high technology. In computer software development, now widely considered the most important area of American advantage, foreign nationals are commonly recognized as being among the best programmers. Almost a third of all scientists and engineers in Silicon Valley are of Chinese or Indian descent.

America cannot afford to lose the loyalty of these high-tech coolies it has come to depend on, yet that's exactly where it seems to be heading with recent cases of immigrant-bashing and racial and ethnic profiling by opportunistic politicians seeking short-term political gains. In the aftermath of the September 11 terrorist attacks, the animosity aimed at the enemies of the United States has also been extended to immigrants and American citizens who originally came from the same part of the world. Hundreds of Arab-Americans and Asians from the Indian subcontinent have been detained as suspects, without charges filed against them, under "special administrative measures" in the name of national security. The majority of Americans, the interpreters of polls tell us, approve. It was in the name of the same national security that a Chinese-American physicist, Wen Ho Lee, was accused some three years earlier of stealing the "crown jewels" of the US nuclear program and giving them to mainland China; similarly enacted special measures threw him in chains and into solitary confinement, although the government had no evidence against him. His public lynching, which was caused by and fed into America's national angst concerning enemy number one of that time--China--is the subject of the two books under review. As a perfect example of a national security investigation botched by racial and ethnic profiling, which led to a shameful failure of all the institutions involved, it could not have been exposed at a better time.

China emerged as America's prime antagonist after the end of the cold war. During the cold war, it was always easy to tell who was America's enemy and who was a friend. Then, with the normalization of Chinese-US diplomatic relations in the late 1970s, those lines began to blur. For a time at least, the People's Republic of China (PRC) was no longer a foe. Individuals and institutions from all walks of life were happily embracing the idea of scientific and cultural exchange, and even nuclear scientists went back and forth. It was understood that the common enemy was the USSR. This cozy relationship ended with the fall of the Soviet Union, when US policy-makers, without clearly defined targets, began to show signs of what Henry Kissinger calls "nostalgia for confrontation" and cast about for a Manichean opponent. With its rapidly expanding economy in the 1990s, which brought it into some conflict with American interests in Asia, China became the most logical choice.

The targeting of Chinese-Americans and the questioning of their loyalties did not begin in earnest until after the 1996 general election, when Republicans accused members of the Chinese-American community of passing campaign donations from government officials of the PRC to Bill Clinton's re-election campaign. It was said to be a clandestine plan by China to influence US policy; the charge was not substantiated, but Asian-American contributors to the Democratic Party were investigated by the FBI for possible involvement in traitorous activities, and suspicions of disloyalty among Chinese-Americans lingered.

The investigation of Wen Ho Lee, who was then a research scientist at the Los Alamos National Laboratory in New Mexico, started soon after the campaign scandal. It was initiated by an intelligence report that in 1992 China had tested a bomb very much like the Los Alamos-designed W-88, considered one of the smallest and most highly optimized nuclear weapons in the world. Carried on Trident II submarine-launched missiles, the W-88 can hit multiple targets with great accuracy. When a Chinese defector to Taiwan brought documents with diagrams and text descriptions of a long list of US strategic weapons, including the W-88, US counterintelligence circles cried espionage and began an investigation.

Dan Stober and Ian Hoffman, who covered the story for the San Jose Mercury-News and the Albuquerque Journal, teamed up to write A Convenient Spy: Wen Ho Lee and the Politics of Nuclear Espionage, in which they reveal the scandalous details of the misguided search for the Chinese-American spy. Written like a crime novel, their book is at its best as an exposé of the behind-the-scenes workings of Washington politics, in which the truth is all too easily sacrificed for political expediency. The authors blame everyone involved, from the incompetent employees of the FBI and the ambitious bureaucrats of the Department of Energy (DOE) to the zealous anti-China hawks in Congress and a colluding press corps all too willing to swallow government-distributed information without corroboration.

The government spent four years and millions of dollars trying to pin espionage on Wen Ho Lee, only to find him innocent of spying. Many American weapons designers who were familiar with the Chinese nuclear program saw no reason that Chinese scientists could not invent in the 1990s the miniaturized warheads US scientists had developed in the 1950s. Others pointed out that most of the details on US missiles were available on a website maintained by the Federation of American Scientists. China could have easily made its own bombs by processing the mounds of information gathered from newspapers, magazines and scientific literature that Chinese students and scientists, over more than a decade of scholarly and business exchanges, had obtained legally--a method US counterintelligence circles refer to as gathering grains of sand. Yet the director of counterintelligence at the DOE, Notra Trulock, refused to believe that the Chinese were capable of developing the most modern weapon in the US arsenal on their own. "There's one spy out there and we're going to find him," he reportedly told an assistant.

The spy, if there was one, could have been any of the scientists from a half-dozen national nuclear-weapons-design labs, or an employee of one of the many plants that manufacture the parts, as they all had blueprints. Yet Trulock's order for an administrative inquiry stipulated that the initial consideration would be to identify those US citizens of Chinese heritage who worked directly or peripherally with the design development. This was a logical starting point, the attached memo went on to explain, based upon the intelligence community's evaluation that the PRC targets and utilizes ethnic Chinese for espionage rather than persons of non-Chinese origin. Following this perilous logic, the investigation took on the shape of a funnel: The list of suspects swiftly shrank from the employees of Los Alamos and Lawrence Livermore research labs who had traveled to China to the scientists of Chinese heritage who had worked directly or peripherally on the W-88 design development and had had contacts with Chinese scientists. From there, it was a quick jump to Wen Ho Lee as the only person who had the opportunity, motivation and legitimate access to the specific nuclear weapons information believed to have been leaked to the Chinese.

The choice of Wen Ho Lee as the spy was far from logical. He was a native of Taiwan, had openly expressed his sympathy for Taiwanese independence and had in fact admitted to providing unclassified scientific documents to the Chung Shan Institute of Science and Technology--Taiwan's military research center involved in developing nuclear weapons. Also, he had been trapped into cooperating with the FBI many years earlier in an investigation of another Chinese-American scientist, while his wife was recruited to act as an unpaid informant on the activities of visiting Chinese scientists.

This may explain why no one at the FBI or any other government agency initially believed Trulock's accusations against Lee. Trulock's first request for a wiretapping order from the Justice Department was turned down. But he doggedly took his spy story to the CIA, the White House and the Defense Department until he finally found a sympathetic ear among Republicans in Congress. Representative Christopher Cox of California was heading the House Select Committee on US National Security and Military/Commercial Concerns, which was investigating the Clinton Administration for jeopardizing national security by being soft on China in exchange for campaign contributions. Cox immediately saw the potential of using an indictment against Wen Ho Lee to help the charges against Clinton stick. Trulock's unverified assertions became bombshells in Cox's committee report. On one occasion a zealous committee member even confused the scientist Wen Ho Lee with Bill Lann Lee, who was at the time waiting to be confirmed as Assistant Attorney General for Civil Rights.

But the real damage was done when someone leaked the spy story to the ever-hungry-for-a-Clinton-scandal press. A Pulitzer Prize-winning reporter for the New York Times passed the information along without corroboration, and soon Congress and the media were "locked in a game of one-upmanship," describing Lee's crime in ever more superlative-laden rhetoric, according to Stober and Hoffman. In no time, expressions of fear and hatred of the Chinese inundated the Internet, TV and radio talk shows. As the storm gathered, Clinton's appointees, instead of standing up against wrongful accusations, buckled. The new Energy Secretary, Bill Richardson, weighing the risk of losing his nomination as the running mate to presidential candidate Al Gore, ordered that Wen Ho Lee be summarily dealt with.

The FBI at first tried to scare Wen Ho Lee into confessing that he had passed nuclear secrets to China. The Rosenbergs professed their innocence, he was told, and the Rosenbergs are dead. When that did not work, he was put in jail, although the government still had no evidence to convict him as a spy. Five years of relentless hounding by its agents--at times more than 100 FBI personnel were working on his case--had produced nothing. The only wrongdoing he could be charged with, discovered by accident during a search of his office, was his downloading of several weapons codes from the lab's secure computer system onto the unsecured one. Similar security infractions were often ignored at the lab, rarely resulting in disciplinary measures. (In an error with potentially far graver consequences for national security, former Director of Central Intelligence John Deutch had downloaded top-secret files onto his unsecured home computer, which a family member had been using to surf pornography websites. Deutch was disciplined, but he did not lose his job, much less end up incarcerated.)

Lee was prosecuted under the cold-war-era Atomic Energy Act, which allowed for the harshest treatment: He was put in manacles and shackles that were chained to his waist, and was locked up in solitary confinement. When members of his immediate family were permitted to visit him for one hour each month, they were not allowed to speak in Chinese--the language they spoke at home. Lights in his cell were on twenty-four hours a day, with a guard on constant watch. Such conditions are rarely experienced by even the most vicious convicted criminals.

Much to Wen Ho Lee's credit, he did not crack. The US district court judge in New Mexico who presided over the case was so incensed by the government's handling of the prosecution that he said to Lee: "I believe you were terribly wronged.... [Government officials] have embarrassed our entire nation.... I sincerely apologize to you."

This unusual gesture, with which Wen Ho Lee opens his account of the ordeal in My Country Versus Me: The First-Hand Account by the Los Alamos Scientist Who Was Falsely Accused of Being a Spy, is by the book's end almost certain to draw applause from readers as an enlightened conclusion to a grave miscarriage of justice by the government. But the negative consequences of the affair have yet to be fully tallied.

More than 150,000 Chinese-American engineers and scientists work in US industry, government and academia today; roughly 15,000 are employed by the defense sector alone. Because of the way the government handled Wen Ho Lee's case, many found their loyalty severely questioned by their bosses and colleagues. They were frequently subjected to innuendo and distressing jokes. There were numerous reports of security clearances withdrawn and promotions denied, of people forced into early retirement. A survey conducted by the Committee of 100 and the Anti-Defamation League soon after Wen Ho Lee's release from prison found that 68 percent of Americans feel negatively toward Chinese-Americans; 32 percent believe that Chinese-Americans are more loyal to China than to the United States; and 46 percent believe that Chinese-Americans passing secrets to China is a problem.

Even Stober and Hoffman, who make every effort to show the lack of credible evidence proving that Lee was a spy, maintain that his own unexplained actions fed into the political furor that made him all too convenient a target. For instance, Lee lied to the FBI, to his family and to his lawyers about why he had copied voluminous amounts of non-work-related computer codes used to design nuclear weapons and put them on portable tapes that have never been completely recovered.

In his own book, Lee explains the copying as a precautionary measure against losing his files--as had happened to him when the lab switched from one computer system to another. He defends the volume of downloads as necessary to test his portion of the codes "against the snapshot of the whole code at a certain time," because as the weapons designers change their calculations, his codes are affected as well. To Lee's scientific mind, the measure was prudent and logical. John Richter, a Los Alamos physicist known as "the guru of gurus" on the subject of plutonium explosives, testified in court in Lee's defense. He described Lee's actions with an old saying: Never attribute to malice what can be adequately explained by stupidity.

Whatever the case, Lee comes across as impossibly naïve as he recounts the events of late 1998, considering that he was in the very eye of the storm raised by the Cox investigation. He continued to cooperate with investigators by submitting to polygraph tests and repeated FBI questioning, without the presence of a lawyer. When his daughter told him that a New York Times article headlined "China Stole Nuclear Secrets from Los Alamos, U.S. Officials Say," published March 6, 1999, was about him, he didn't believe it. He didn't read newspapers, didn't vote and professed not to care about politics. Yet his book is politically sophisticated. It shows the unmistakable imprint of his co-author, Helen Zia, an experienced freelance journalist and a seasoned and respected Asian-American activist, who understood the significance of Wen Ho Lee's case in the context of American ethnic and civil rights politics.

In contrast to Stober and Hoffman, who portray Lee's lawyers as the only morally uncorrupted heroes of the story, Zia recognized that the legal case gained moral weight and credibility through the support of brave people who were willing to risk their careers to speak out in Lee's favor. The American Physical Society and the American Association for the Advancement of Science issued statements condemning the government's harsh treatment of Lee. A number of eminent scientists, among them several of Lee's colleagues, individually took the stand. Richter, the guru, provided crucial testimony debunking the government's nonsense that Lee had stolen the nation's "crown jewels" and thereby altered the balance of power in the world.

The Chinese-American community, still licking the wounds inflicted by Clinton's campaign fundraising scandal, was initially cautious in dealing with the sensitive issue surrounding nuclear secrets. But it picked up Lee's cause as soon as the government went public with its outrageous actions. Foreign-born Chinese-American scientists and engineers, who for years had sweated away quietly in research labs and universities, unrecognized, unappreciated and underpaid, but who were suddenly all suspect, turned their anger into building the Wen Ho Lee Defense Fund, which raised hundreds of thousands of dollars for his legal bills. Supporters established websites and organized rallies and teach-ins around the country, demanding that members of Congress stop the persecution of Lee. When Professor Ling-chi Wang, director of Asian-American studies at the University of California, Berkeley, called for a collective boycott of DOE-overseen national labs by all Asian-American scientists and engineers, the labs took notice. (An agreement with the labs on new procedures appeared imminent at press time.)

Job applications by foreign graduate students, from among whom most research labs and engineering firms recruit their future staff, are down. The National Science Board estimates that 30 to 50 percent of those who hold science or engineering doctorates in the United States are foreign-born (the share is highest in math, at 57 percent). About 7 percent of all physicists and 15 percent of all engineers in the United States are Asian-American. If Asian-American and other foreign-born scientists are discouraged from entering the US work force, notes Eamon Kelly, chairman of the National Science Board, the country could have a hard time filling the gap.

Yet, spurred by the September 11 attacks, Senator Dianne Feinstein has called for a moratorium on admissions of foreign students to US educational institutions. American national interests can ill afford this kind of mindless antiforeign hysteria. American high school students rank near the bottom in math and science, according to international studies of schooling. The country's best and brightest often opt for careers as lawyers, doctors and financial professionals, where they can command much higher salaries than in the pure sciences. Wen Ho Lee, for instance, despite holding a PhD from an American university and twenty years of experience at the Los Alamos labs, made only $80,000 a year--an absurdly meager remuneration for a man accused of changing the balance of power in the world.

If there is a lesson in all this, it is that the pre-eminent position of the United States in the world--"our scientific capabilities and national security," in the words of James Langer, president of the American Physical Society--was in fact compromised by the government's actions in the case of Wen Ho Lee, which alienated the very foreign-born scientists most needed to maintain that pre-eminence. Unfortunately, the lesson is also, as Wen Ho Lee found out, that the immigrant dream--coming to America, working hard, getting an education, taking care of one's family and minding one's own business--can easily be shattered by politics. Only by becoming politically engaged and organized can immigrants gain the respect of their fellow Americans and stop being singled out as easy victims.
