Readers of this magazine do not need reminders of the costs of the cold war. The mountains of corpses, the damaged lives, divided families and displaced refugees, the secret police forces and death squads, and the resources wasted on ghastly weapons of unfathomable evil are not only markers of a recent past but still-active landmines buried a few inches beneath the surface of our contemporary lives.

What may be harder to remember are the ways the global struggle with the Soviet Union enabled social and cultural achievements that made the United States a decidedly more decent society. From Harry Truman's integration of the armed forces to the Brown decision and the 1963 March on Washington, the initial phase of the civil rights movement capitalized on the moral embarrassment of segregation for a nation trying to win the hearts and minds of Third World peoples. Likewise, the rapid postwar expansion of state universities, the infusion of government monies into public schools after Sputnik and the creation of the National Endowments for the Arts and the Humanities in 1965 were all episodes in an ideological cold war meant to demonstrate the cultural superiority of the "free world" to the Soviet bloc. It was a strange era that offered both Martin Luther King Jr. and his persecutor J. Edgar Hoover their big chance to bring the United States closer to their ideals.

Two monuments to the cold war stand catty-corner to one another on Washington's Pennsylvania Avenue: On one side, the brutalist Hoover FBI building; on the other, the restored neo-Romanesque post office that houses the NEA and NEH and bears the name of Nancy Hanks, the liberal Republican chair of the NEA during its glory days in the late 1960s and early 1970s. Care to guess which building will be renamed first?

Michael Brenson's new study of the NEA, Visionaries and Outcasts, emphasizes the cold war origins of the agency in an effort to place the "arts wars" of the past dozen years in historical perspective. Looking beyond the 1995 budget cuts that devastated the endowment, and the earlier battles in 1989-90 over NEA-supported exhibitions by photographers Robert Mapplethorpe and Andres Serrano, Brenson tracks the unfolding of a tension between "ideology and idealism" inherent in the NEA founders' understanding of the agency's role in American culture. Arts advisers to Presidents Kennedy and Johnson sought federal support for the arts to promote international awareness of the cultural vitality of a society dedicated to free expression and civil liberties. At the same time, cultural policy-makers like August Heckscher and Arthur Schlesinger Jr.--heirs to the upper-middle-class lampoon of middle-class "conformism" that stretched from Sinclair Lewis's Main Street (1920) to William H. Whyte's The Organization Man (1956)--saw in federal arts funding a way to create an American "civilization" equal to Western Europe's, which would inspire their fellow citizens with something more ennobling than the stuff of television and Levittown. Much like Clement Greenberg, the towering figure in postwar art criticism, Camelot culture warriors mounted a two-front campaign against the state-dominated art of the Soviet bloc and the kitsch of a newly affluent society.

Amazing as it now seems, the man (and he was imagined as a man) who was to do such heroic work for the nation was the artist. Kennedy's wooing of celebrity artists and writers--epitomized by his choice of Robert Frost to deliver a poem (he recited "The Gift Outright") at his 1961 inaugural and his subsequent invitation to Pablo Casals to perform at the White House--was not only an attempt to surround himself with glamorous and influential opinion-makers but, according to Brenson, a determined effort to establish the artist-prophet as a symbol of defiant individualism in an other-directed age. Whether it was Frost the aging Yankee reciting from memory at the inaugural or the Abstract Expressionist painters wrenching meaning from existential meaninglessness, the image of modern artists as "visionaries and outcasts" served liberals' war of ideas against Communist adversaries abroad and the benighted middle classes at home. As Kennedy put it in his 1963 speech at the dedication of the Robert Frost Library at Amherst College--the occasion for his most extended comments on the arts--a great artist was the "last champion of the individual mind and sensibility against an intrusive society and an officious state."

Visionaries and Outcasts sketches the history of liberalism's dream of the visual artist as national hero from the early 1960s to the present. Brenson was originally commissioned to write an internal study of the NEA's visual arts program, but the former New York Times art critic chose to revise and publish his work independently after the dismantling of that program by the Gingrich Congress in 1995. The book he has produced is more than an institutional study of one office in a federal agency, however. Brenson rightly considers the program that gave some 4,000 fellowships to individual artists between 1965 and 1995 as the heart and soul of the Endowment. Although early chapters suffer from the bureaucratic language common to government reports, the book concludes by raising thoughtful and provocative questions about the tragic history of the NEA. As he revised his study, Brenson expanded his vision to include the rise and fall of this heroic image of the modern artist as prophet and redeemer of late-twentieth-century US culture. "The NEA became a lens onto larger issues of the changing identity of the American artist and the enduring problem of...the visual artist in a country that...is still only comfortable with the artist as a maker of high-priced commodities controlled by galleries and museums."

In the story Brenson tells, modern artists were useful to this country's political elites only so long as the cold war was raging. Once that war was won, and the political culture had shifted markedly to the right, the lonely artist was no longer a bearer of universal values but a threat to them. The ideological rationale for the endowment collapsed along with the Berlin wall, and cautious NEA administrators invested their idealism in established art institutions. Better to fund museums than to risk spending money on unruly individuals who might turn out--like Serrano or Mapplethorpe--to be "controversial." Despite the defensive maneuvers of arts administrators and their allies, a vengeful Congress cut the NEA's budget by 40 percent in 1995 and eliminated all grants to individual artists (except writers). The endowment has since limped along into the twenty-first century, but more as an occasion for petition drives and liberal fundraising than as a vital force for artistic creativity. In reality, the NEA of 1965 is dead, and with it the official myth of the artist as critic and savior of American national culture.

During the three decades when it mattered, the NEA's visual arts program gave small grants, no strings attached, to many of this country's major artists, often offering them assistance early in their careers before private money was forthcoming. The mechanism for doling out funds was peer panels composed of artists, curators, scholars and critics, who operated without political oversight from federal officials. In fact, Kennedy liberals organized the peer-panel system precisely because it insulated art-funding decisions from state interference and therefore drew another contrast with the state cultural agencies in place in the Soviet-bloc countries. Artistic freedom, in the view of Camelot arts advisers, required the support of professional panels that would judge art strictly according to nonideological, aesthetic standards. At a time when a Greenbergian theory of aesthetic autonomy reigned supreme in New York-based art circles, the freedom of the NEA's peer panels from politicians' control seemed to most liberals a necessary complement to a Modernist logic that divorced "pure" painting and sculpture from political ideology, representation and traditional subject matter of any kind.

The NEA's panels instantly became objects of criticism from true "outsiders," who interpreted talk of an autonomous aesthetic as a bid for power by art-establishment cronies. Brenson ignores the early history of such attacks, which originally came from the political left, and instead repeats the now-familiar story of the persecution of the NEA by the Christian right and its allies in Congress after 1989. The story is a bit more complicated than that, however. In the context of the late 1960s and early 1970s, the authority of the peer panels and the autonomous aesthetic theory they defended came under attack from other quarters: from advocates of more politically charged, social-realist and feminist art; from African-American and Latino artists who saw little of their work or their traditions acknowledged, let alone supported, by the NEA in its early years; and from folk artists and enthusiasts of regionalist cultural traditions who disputed the place of New York Modernists at the pinnacle of the NEA's cultural hierarchy. Although the Endowment quickly made concessions to its critics on the left, the peer-panel process remained largely unchanged from its original incarnation until 1995, when, in the aftermath of the Republican sweep in the previous fall's elections, the conservative polemic against the tyranny of a "cultural elite" hostile to the values of "normal Americans" finally succeeded in killing off the visual arts fellowships.

Brenson devotes almost half his book to an admiring account of the panels' operations, quoting extensively from artists who served as referees or benefited from the program's largesse. He condemns the system's rightist critics as ignorant and presents the panels in the most glowing terms imaginable as models of aesthetic judgment, openness and generosity. "The peer panel system embodied the idealism and nobility of the NEA," he tells us. Those who applied unsuccessfully during these years may have had another view of the matter, but no one can deny that the award of such a grant at an early stage of an artist's career meant far more than the money involved. Installation artist Ann Hamilton recalls that "winning" her fellowship in 1993 "gave me a very important sense of support from my peers, which is and was very important in maintaining the trust and faith necessary to make new work, to change, to make a leap of imagination toward what can't easily be knowable or containable in language." This was the NEA's visual arts program at its best--"a gift," as Brenson calls it, "in the fullest sense of something given especially to one particular person, with a special knowledge of who that person is and what that person needs, by someone or something that cares--in this case a government agency, on the advice of peers."

What went wrong, then? Given its distinguished history, why was the visual arts program so vulnerable in 1995? Visionaries and Outcasts is not altogether helpful in answering that question, though it offers a rudimentary road map for a fuller account in the future. Brenson rounds up the usual suspects--Jesse Helms, fundamentalists, New Criterion editor Hilton Kramer--and, in a more intriguing move, notes how the ground shifted beneath the panel system in the 1980s as the art market and American artists themselves transformed the cultural meaning of the visual arts. The go-go art market of the Reagan era created a private reward system that made the NEA irrelevant to many young artists on the make, while conservatives inside and outside the endowment began assigning to museums the universalistic values that 1960s liberals once invested in the image of the heroic artist. Meanwhile, radical artists gave up the Modernist ideal of the individual prophet-artist standing apart from his or her culture. The adoption by many political artists of the term "community arts movement" to describe their project was an important sign of a new sensibility among artists who came of age in the 1980s and rejected the endowment's original assumptions even as they accepted its subsidies. Brenson himself adopts some of their critique in the closing pages of his book, acknowledging that the NEA "put artists on pedestals" and "ended up sustaining their marginalization" by perpetuating an image that many Americans found "arrogant and disdainful."

Brenson's second thoughts seem not to have influenced the rest of this book, which hardly registers the effect of such searching self-criticism. That is unfortunate, because his valuable questioning of the Modernist myth that originally inspired the NEA, and his closing call for an art of "connectedness"--to other citizens and to the natural world--should be the starting points for any serious reconsideration of the embattled agency's history. Especially when it comes to the arts, liberal and leftist culture-workers are too quick to attribute their current troubles to the malevolence of strangers (what will the so-called People for the American Way do when Jesse Helms dies?); too loath to acknowledge that they have achieved positions of power, wealth and influence in American society; and too devoted to their flattering self-image as, alternately, daring rebels or beleaguered victims. Such poses may absolve cultural administrators of any feeling of responsibility for their institutions' plight, but they will prove useless when it comes time to sort through the wreckage of the NEA and other liberal cultural programs in search of lessons for the future.

At one crucial moment in his book, Brenson inadvertently hints at a more critical history of the endowment that might better explain its terrible predicament. He compares the panel system to "the United States jury system" in its rock-bottom faith in humans' "need to learn, [their] belief in justice, and [their] commitment to the common good." Maybe those were the impulses that motivated the panelists as they watched hundreds of slides flash before their eyes; but in retrospect it's exactly the extent to which the NEA selection process was not like a jury that stands out as its chief political liability. Juries, after all, are not composed solely of lawyers, criminologists, psychologists and forensic experts. Nor are embezzlers, assassins and car thieves invited exclusively to judge their peers. When those people serve on juries, they do so as citizens, not in their capacity as professionals. Whatever their limitations, juries embody the civic ideal that ordinary voters--informed by the law and the testimony of relevant specialists--possess the wisdom to govern themselves and administer justice fairly. Never did the NEA's founders display a comparable faith in the ability of nonexperts to contribute to the common culture. Indeed, one reason they married a formalist aesthetic to bureaucratic proceduralism in the first place was to secure a space for creativity separate from the presumed ignorance and tastelessness of the general public.

Such a system "worked" well enough in the NEA's early years, when a New York-based art elite had an astonishing confidence about which artists deserved support. As the East Coast NEA panel met in 1966, it was easy for a few insiders to chat informally and select names. "Generally there was a consensus" about which artists deserved grants, sculptor George Segal told Brenson. "There was not too much of a discussion because it was assumed that all of us knew them." The founding director of the visual arts program, former Metropolitan Museum of Art curator Henry Geldzahler, was openly contemptuous of a request at a West Coast meeting that the panelists examine slides of work by the artists under consideration. As panelist and fellow museum curator Walter Hopps recalled, "The boxes were pushed into the room. Henry stood up and went over and thumped each box with his hand and said, OK, now we've seen the applications and we've seen all this." The boxes of slides were removed, unopened; the applications sat in a pile unread. "We just talked about who we wanted.... It was all over in a morning."

A small art world with a strong consensus on a Greenbergian narrative of Modernist progress could afford to behave this way, especially when it enjoyed support from a liberal majority in Congress. But even when the peer-panel process was cleaned up and made more professional, the complaints poured in that the selection system was unresponsive to the very public this public agency was meant to serve and indifferent to the growing heterogeneity of art practices that transformed visual culture in the United States after the 1960s. What at first seemed like a means of protecting the independence of cutting-edge "visionaries and outcasts" from bureaucratic interference stood condemned by the late 1970s and early 1980s as an institutionalized patronage network that favored specific aesthetic commitments and excluded the vast majority of Americans as incapable of informed artistic judgment.

Coming to terms with the political shortcomings of the peer-panel system requires that we take a more skeptical view of the idea that artists (and their liberal allies) were "outcasts" in the first place, back in 1965. Despite his trenchant critique of the heroic-individualist model of the artist during the cold war, Brenson himself slips into romantic and avant-gardist rhetoric that is long overdue for critical scrutiny. To what extent can one really speak of the modern artists the NEA supported in the 1960s and 1970s as an avant-garde? Wasn't the original mission of the NEA proof that by mid-century the avant-garde ideal had merged perfectly with the cult of expertise that so captivated elite liberals, with their dream of benevolent rule from above by "the best and the brightest"? The class and ideological biases of the cultural institutions that liberals created in that period seem to have escaped no one except liberals themselves.

A quarter-century after the collapse of the New Deal arts programs, with their organic connection to 1930s labor insurgency, the case for federal arts funding returned in a very different political guise. The NEA's original base was in the (Nelson) Rockefeller wing of the Republican Party and the (John) Kennedy wing of the Democratic Party, two upper-middle-class constituencies that prided themselves on their distance from a seemingly "stodgy" labor movement and a parvenu middle class mired in the "ticky-tacky" vulgarity of the suburbs. It should come as no surprise that Nancy Hanks--once Nelson Rockefeller's personal secretary and then the NEA's chairwoman during the Nixon and Ford administrations--presided over dramatically escalating budgets for the endowment. Republicans still needed to appease the Rockefeller wing of their own party. And it should be no surprise, either, that a new right within the Republican Party succeeded in large part by pursuing a very different brand of cultural politics.

Capitalizing on popular unhappiness with the arrogance of the "New Class" at the helm of the NEA and other official cultural institutions, the Goldwater-Reagan right was able to oust the Rockefeller liberals from its own party and mount a masterful crusade against "cultural elites" in the universities, foundations, mainline Protestant churches, museums and the two endowments. Elite liberalism has not fared well in postliberal America, as conservatives have channeled popular disaffections into a pseudo-populism on cultural matters that they would never tolerate in economic affairs or foreign policy. The result has been an increasing isolation of artists, writers and intellectuals in universities and a delegitimation of the very idea of a common cultural life shared by citizens of different backgrounds.

With its original claims to aesthetic autonomy and professional expertise discredited by years of pounding from the left and the right, the endowment lacks a persuasive language to justify alternatives to the privatization of arts patronage. Its very name, the National Endowment for the Arts, speaks to an era of liberal consensus--on the nation, on the nature and desirability of national cultural standards, on what does and does not constitute art--that has disappeared. With the nation and the arts in dispute, all that remains is the program's pathetic "endowment," mere chump change in the global village overseen by the likes of Rupert Murdoch, Charles Saatchi and the trustees of the Guggenheim Museum chain store.

In an era of market fundamentalism, the panel system that once promised artists protection from political and bureaucratic interference during the cold war deserves careful reconsideration. It is conceivable that panels might again function as "free spaces," this time offering artists a refuge from the commercial imperatives that are ruining publishing, museums and public broadcasting. But to make the case for such spaces today requires a radically different mindset from the sentimental avant-gardism and antidemocratic prejudice still current in elite art circles. It also demands a clear-eyed acknowledgment of the historical complicity of the endowment's defenders in the political logic that threatens our public schools, museums and libraries, as well as our artists.

Starting from ground zero, with the NEA in ruins, advocates of public funding for the arts need a language that recognizes the difference between the authority of collective judgments rooted in shared standards and the exercise of market power, and which assumes, furthermore, that every person has access to varieties of aesthetic experience that may contribute to the formulation of such standards. Opening panels to nonspecialists need not be a Trojan Horse for "Archie Bunkerism" or "authoritarian populism," those bugaboos of elite left-liberalism. Nor is it an affront to the credentials of artists and scholars who benefit from public subsidy (like this reviewer) to insist that they discuss their work with lay audiences in exchange for such support. These are tiny steps, of course, but the suspicion and hostility even such modest suggestions provoke in some quarters are a sign of the bleak cultural pessimism that now poisons all discussion of the civic role of the arts in the United States.

Every few months, I receive a forwarded e-mail message that recounts a reputed NPR story by Nina Totenberg about an upcoming Supreme Court ruling on funding for the NEA, warns that the Court's conservatives are about to kill off the endowment once and for all, and then asks for my name on its long list of petitioners. The petition is a classic Internet hoax, but even if it weren't, the time for forwarding such messages is long gone. The NEA was gutted several years ago, and the rebuilding of public support for publicly funded art is going to take a lot more than e-mail petitions. There are hard, unsettling questions that the people who sign such petitions need to ask about the responsibility they and their institutions bear for the ascendancy of our conservative order and about the blindness that comes with the heady self-image of artists and intellectuals as visionaries, outcasts and perpetual victims. Michael Brenson's book is a valuable starting point for a conversation, barely audible at the moment, that might finally address those questions. Until then, ignore the petition on your computer screen. That delete button is there for a reason.

American intellectuals love the higher gossip because it gives intellectual life here--ignored or sneered at by the public--a good name. Sensational anecdotes (Harvard's Louis Agassiz getting caught in flagrante Clinton), tart one-liners (Oliver Wendell Holmes's crack that Dewey wrote as "God would have spoken had He been inarticulate") and stark biographical details about influential thinkers (William Lloyd Garrison's habit of burning copies of the Constitution at his public appearances) do more than illuminate thought, explain impulses and entertain. In the right hands, they create solidarity with the rest of modern consumer and media culture, injecting the sizzle of boldface revelation into respectable scholarly work.

What red-white-and-blue-blooded man or woman of letters can resist the news that Holmes made his family practice fire drills in which the satchel with his new edition of Kent's Commentaries on American Law was to be evacuated from the house first? Or Alice James's verdict on her brother William that he was "just like a blob of mercury, you cannot put a mental finger upon him"--a man so pluralist all the way down that he resented the notion that everyone should spell the same way? Don't the tales of Charles Sanders Peirce's blatant philandering with a teen belle, his inability to finish manuscripts, his erratic disappearances when scheduled to teach, his failure to include return addresses on requests for money, the impulsive sale of his library to Johns Hopkins, his flamboyant hiring of a French sommelier to give him lessons on Medoc wine in the midst of financial chaos--provide the pizazz of a stellar film, while also giving further force to traditional questions about genius and madness?

These are our cerebral celebrities, after all. For modern American intellectuals suckled on the concrete like their everyday peers--for whom even a paragraph of "abstract" blather is a signal to put the headphones back on, grab a magazine, tune out--such perky additives are necessary. But bringing the higher gossip to American philosophy--the Death Valley of American humanities, when it comes to literary style--is a uniquely forbidding matter. For every Richard Rorty whose unabashed colloquial style reveals he's a native speaker of American English, legions of disciplinary weenies, raised in exotic places like Pittsburgh and Palo Alto, stultify the subject by writing in a stilted English as a second jargon. To entrenched American philosophy types still bound to the flat prose of logical positivism (even after ditching its assumptions), anecdotes, biographical details and colorful examples remain a foreign rhetoric: irrelevant information properly left to bios of the canonized dead by scholars from second-rate schools, but no part of the laughable research programs of conceptual analysis they pursue.

Louis Menand enters this arid terrain with sainted credentials and connections. Having begun as a work-one's-way-up English professor, Menand, now at City University of New York, ranks as the crossover star of his academic generation, a bi-Manhattan emissary between campus and media whose prose travels only first-class, the public intellectual whose pay per word every public intellectual envies. In the media capital of the last superpower, where thousands of professors undoubtedly think they, too, with a little Manhattan networking, could be a contributing editor (and editor heir apparent) of The New York Review of Books, or staff writer at The New Yorker, or contributor to The New Republic, Menand has actually pulled it off as he works out whether he wants to be Edmund Wilson or Irving Howe, or just Luke Menand. Let the naysayers sulk. A few years back, to the annoyance of some careerists in American philosophy, he got the nod to edit a Vintage paperback edition of classic American pragmatists despite outsider status in the field. The specialists who carped about that choice will not be happy to welcome The Metaphysical Club, unless they welcome redemption.

Here, in the major book of his career so far, Menand brings his exquisite literary and philosophical talents together to invent a new genre--intellectual history as improv jazz. In it, Alex Haley and Arthur Lovejoy seem like sidemen jamming in the background as Menand samples family genealogy, battlefield coverage, popular science writing and philosophical exposition to tell "a" story (the indefinite article is key) of how pragmatism, the now consecrated philosophy of the United States, riffed its way to prominence through the art of four philosophical geniuses: Holmes, James, Dewey and Peirce. The Metaphysical Club, Menand warns in his preface, is "not a work of philosophical argument" but one of "historical interpretation." Just so. In that respect, it belongs to the grand tradition of American intellectual history staked out by V.L. Parrington, Merle Curti, Max Lerner and Richard Hofstadter. Yet true to the pragmatist spirit, Menand aims "to see ideas as always soaked through by the personal and social situations in which we find them." His overview of pragmatism's evolution and triumph, told mainly through the lives of his four horsemen of absolutist philosophy's apocalypse, integrates textured biography and superlative storytelling to an extraordinary degree (though a seeming cast of thousands get walk-on roles, too). "I had no idea, when I started out," explains Menand in his acknowledgments, "how huge a mountain this would be to climb." If so, he deserves a Sir Edmund Hillary Award for sustained commitment to an extreme sport. All four of the familiar figures he focuses on have been "biographied" to death, often at massive length. Menand's excellent syntheses of secondary works and primary materials demonstrate exactly how steeped he became in them.

Menand's combination of dogged historical research--almost daring the reader to dispute the representational accuracy of his story among stories--with an unapologetic literary stylishness makes The Metaphysical Club a page-turning pleasure to read. Yet it also forces one to render two sharply different judgments: one literary, the other philosophical (in a perhaps antiquated sense) and historical.

As a literary effort, a daring act of bringing the narrative magic of a Tracy Kidder or Tom Wolfe to thinkers who largely lived on their keisters while reading and writing intellectual prose, The Metaphysical Club is a masterpiece of graceful interpretation. Menand's sly wit and reportorial hijinks, his clarity and rigor in making distinctions, his metaphorical gift in driving home pragmatist points make The Metaphysical Club this summer's beach read for those who relax by mulling the sands of time. If one takes Menand at his pragmatist word--that this is just one "story of ideas in America" that does not preclude other narratives--there's little to complain about. On a Rortyan reading of the book, the type Menand plainly invites (there's less space between Rorty's and Menand's views of pragmatism than between Britannica volumes on a tightly packed shelf), the right question to ask is not "Does Menand have the story right?" but "Is this the best story for us Americans in achieving our purposes?"

At the same time, if one retains a shade of the representational approach to the world that pragmatists largely disdain--the notion that America's intellectual history did happen one way and not another--one can't help rejecting Menand's fundamental organizational claim that the Civil War (as he states in his preface) "swept away almost the whole intellectual culture of the North." It's a belief expeditiously assumed because it smooths the post-Civil War story he chooses to tell. At one point late in The Metaphysical Club, while writing of the now largely forgotten political scientist Arthur Bentley, Menand describes James Madison as "a writer to whom Bentley strangely did not refer." One might say almost the same, in Menand's case, regarding the father of the Constitution, whose devices for accommodating factions in the structure of democracy were at least as pragmatically shrewd as Holmes's neutralist dissents in jurisprudence. And one might say it in regard to Benjamin Franklin, that larger-than-life proto-pragmatist who gets only a single mention as the great-grandfather of one Alexander Dallas Bache. Franklin, to be sure, is not a figure helpful to Menand's project, given the author's premise that there was "a change in [the country's] intellectual assumptions" because it "became a different place" after the Civil War. But a closer look at the story Menand tells helps explain why.

His method throughout The Metaphysical Club is to toss out the genetic fallacy and explain, in wonderful set pieces, how the experiences of his four protagonists drove them to the views they eventually held as magisterial thinkers. In Part One, devoted to the young Holmes, Menand thus laces the history of antebellum abolitionism and the politics of slavery through Holmes's own trials of conscience before his Civil War service. Holmes's story serves as a model of how Menand finds an internal pragmatist evolution in each of his leading characters. The future giant of American jurisprudence, Menand reports in graphic detail, witnessed an extraordinary amount of fighting and carnage in the Civil War. At the 1861 battle of Ball's Bluff, he took a rifle bullet just above the heart, but survived. In 1862, at the horrific battle of Antietam, where the Union suffered 13,000 casualties, he took a bullet in the neck, but again survived. In 1863, at a battle known as Second Fredericksburg, enemy fire struck his foot. He returned to Boston and the grim reaper didn't get him until 1935, when he was 93, a retired Supreme Court Justice and the most distinguished jurist in the country. But the war, Menand writes, "had burned a hole... in his life."

In a notebook Holmes kept during the war, the young soldier entered a phrase to which Menand calls special attention: "It is curious how rapidly the mind adjusts itself under some circumstances to entirely new relations." Holmes's experiences taught him, Menand writes, that "the test of a belief is not immutability, but adaptability." During the war, Menand maintains, Holmes "changed his view of the nature of views."

"The lesson Holmes took from the war," Menand continues, "can be put in a sentence. It is that certitude leads to violence." And so even though Holmes never accepted pragmatism as his official party affiliation, believing it a Jamesian project to smuggle religion back into modern scientific thought, he'd come to share one of its tenets: rejection of certainty. The whole of his subsequent judicial life, Menand contends, became an attempt to permit different views to be democratically heard in the marketplace of ideas and policy.

Too simple? Too slim a reed to sustain the view that Holmes's turn against certainty (exemplified by antebellum abolitionism) came as an adaptive response to a life in which certainty spurred violence--one more Darwinian twist in a story replete with Darwinian themes? Menand's evidence is substantial. Holmes never tired of telling war and wound stories. He "alluded frequently to the experience of battle in his writings and speeches." After his death, Menand reports, "two Civil War uniforms were found hanging in his closet with a note pinned to them. It read: 'These uniforms were worn by me in the Civil War and the stains upon them are my blood.'"

Menand finds a similar evolution documented in James. Famously fragile in his emotions, and a legendary procrastinator, James came to believe that "certainty was moral death." Rather, he thought, the ability and courage to bet on a conception of truth before all the evidence was in amounted to the best test of "character." That remarkably open mind, Menand relates, grew, like Holmes's resistance to dogmatism, out of experiences, such as the "international hopscotch" that family patriarch Henry Sr. imposed on his children's educations by yanking them out of one school after another.

"The openness that characterized both the style and the import of his writings on pragmatism," Menand writes of William James, "seemed to some of his followers to have been specifically a consequence of his disorganized schooling." Similarly, James's close work with Agassiz on the naturalist's famous "Thayer expedition" down the Amazon in the 1860s taught James that "everything we do we do out of some interest," a tenet crucial to pragmatism. Menand suggests that meditations on Brazilian Indians ("Is it race or is it circumstance that makes these people so refined and well bred?" James asked in a letter) may have begun James's relational thinking. Alluding to such influences, Menand concludes, "It seems that Brazil was to be, in effect, his Civil War."

By the time the author gets to Peirce, in Part Three, and Dewey, in Part Four, his entertaining method is in full swing. Menand portrays the pragmatism of his foursome, with their individual idiosyncrasies, as the consequence of experience-driven epiphanies, with epiphany playing the role in intellectual development that chance adaptive mutation plays in what once was considered "lower" biological development. Giraffes get longer necks--Americans get pragmatism.

Peirce proves the most challenging of Menand's subjects because he remained unpredictable and dysfunctional. The son of Benjamin Peirce, professor of mathematics at Harvard at the age of 24 and "the most massive intellect" Harvard president A. Lawrence Lowell claimed ever to have met, he had a lot to live up to. But Peirce suffered from painful facial neuralgia and turned to opium, ether, morphine and cocaine over his lifetime to ease the suffering. Violence and infidelity complicated the picture further--Peirce spent many years trying unsuccessfully to regain the brief foothold in academe he'd achieved during a short teaching stint at Johns Hopkins. With Peirce, Menand takes us through a famous nineteenth-century probate action, known as the "Howland will case," in which Benjamin Peirce testified, with behind-the-scenes help from his son, about the probability of a forged signature. A fascinating set piece, it's also Menand's inspired way of backgrounding the younger Peirce's involvement with the increasing importance of probability theory in the nineteenth century.

Peirce's work with "the law of errors," which "quantified subjectivity," was just one experience that drove him to pragmatist views. In time, writes Menand, Peirce came to believe both that "the universe is charged with indeterminacy" and that it "makes sense." He held that "in a universe in which events are uncertain and perception is fallible, knowing cannot be a matter of an individual mind 'mirroring' reality.... Peirce's conclusion was that knowledge must therefore be social. It was his most important contribution to American thought." Only in this stretch does Menand come to the title subject of his book: "The Metaphysical Club," an informal discussion group that Peirce, James and Holmes attended for perhaps nine months in 1872. There the idea that Menand considers a central link among the three, and fundamental to pragmatism--that ideas are not Platonic abstractions but tools, like forks, for getting tasks accomplished in the world--took articulate form for the first time. Here, as elsewhere, Menand evokes the atmosphere and supporting actors of the setting through fine orchestration of detail. He smoothly recovers the mostly forgotten Chauncey Wright, another man who learned in the Civil War that "beliefs have consequences." Wright used weather as his favorite example, and the "notion of life as weather" became his emblematic position.

Finally, in exploring Dewey in Part Four, Menand follows pragmatism's clean-up hitter from Vermont childhood to early academic stints at Hopkins, Michigan and Chicago. Menand's two-tiered approach falters a bit here. When the camera is on Dewey, we see him wrestling with issues of Hegelianism and laissez-faire individualism, and drawing lessons from his laboratory school at Chicago ("if philosophy is ever to be an experimental science, the construction of a school is its starting point"). He gets the de rigueur epiphany--the evil of antagonism among social factions--personally from Jane Addams. He absorbs moral insights offered by the Pullman strike and articulates his own great priority within pragmatism, on democracy as a matter of social participation and cooperation, not just numbers and majorities. But here Menand's characteristic deep backgrounding, particularly on the genesis of the "Vermont transcendentalism" that was more conservative than the Boston variety, seems overmuch. For all of Menand's literary deftness, we sometimes wonder, when taking in the variations on French figures like Laplace, or Scottish ones like Robert Sandeman, whether we're listening to a wonderful stretch of intellectual exotica--fine improvisational solos--or music crucial to the story. At the same time, one of the book's undeniable pleasures is Menand's voyages into the estuaries of nineteenth-century intellectual history, from Agassiz's endorsement in the 1850s of polygenism (the claim that races were created separately, with different and unequal aptitudes), to the work of the Belgian mathematician Adolphe Quetelet, "a brilliant promoter of statistical methods" who called his approach "social physics." Menand's accounts of nineteenth-century America's intellectual debates, like his sketches of Darwinian thinking and its social ramifications, are models of efficient summary.

Their net effect, of course, is to show that pragmatist concepts--opposition to certainty, evolution toward probabilistic modes of thought--were in the air, and his four protagonists breathed deeply. To Menand's credit, given the compass of this biographical and sociological work, he keeps his eye on the enduring subject--pragmatism as a distinct mode of thought--showing the family resemblance in pragmatist epiphenomena of the time, from proximate cause in law to statistical understanding of the role of molecules in heat. His superbly syncretic summary, late in the book, of what he's found sounds less sweeping than the claims in his preface:

Pragmatism seems a reflection of the late nineteenth-century faith in scientific inquiry--yet James introduced it in order to attack the pretensions of late-nineteenth century science. Pragmatism seems Darwinian--yet it was openly hostile to the two most prominent Darwinists of the time, Herbert Spencer and Thomas Huxley.... Pragmatism seems to derive from statistical thinking--but many nineteenth-century statisticians were committed to principles of laissez-faire James and Dewey did not endorse.... Pragmatism shares Emerson's distrust of institutions and systems, and his manner of appropriating ideas while discarding their philosophical foundations--but it does not share his conception of the individual conscience as a transcendental authority.

"In short, pragmatism was a variant of many strands in nineteenth-century thought," writes Menand, "but by no means their destined point of convergence. It fit in with the stock of existing ideas in ways that made it seem recognizable and plausible: James subtitled Pragmatism 'A New Name for Old Ways of Thinking.'" So maybe it's not true that the Civil War "swept away almost the whole intellectual culture of the North." That judicious modesty makes it easier to note some of the oddities of Menand's choices, especially given the bold leaps he takes to find pragmatist principles in areas of knowledge far afield from traditional philosophy. Some, considering the prominent space and harsh spotlight he devotes to discussions of slavery and racism by nineteenth-century thinkers like Agassiz, are regrettable.

At times, for instance, Menand can seem more interested in patricians for patricians' sake--or Boston Brahmins for Brahmins' sake--than the tale requires. It's easy to feel that a story with more nineteenth-century black and feminist thinkers, and fewer Northeastern gentlemen, would be a better tale for understanding the development of American thought. Menand's maverick status with regard to philosophy, welcome in his syntactic verve and enthusiasm for complex biographical explanation, perhaps intimidated him in this regard. As an outsider, he arguably stays too respectful of professional philosophy's ossified white-man pantheon of American philosophy, despite the canon wars of his own field. Martin Delany, Frederick Douglass and Elizabeth Cady Stanton, for instance, ought to be recognized as part of the pragmatist tradition, whether they have been formally or not.

Yet while Menand briefly mentions Delany and his troubles in being accepted at Harvard, he presents him more as a victim (which he was) than a thinker. More happily, Menand does devote respectful attention to the black pragmatist Alain Locke late in the book. But the biggest surprise is that W.E.B. Du Bois, who surfaces about 400 pages into the book, gets short shrift--four pages. Du Bois's famous articulation, at the beginning of The Souls of Black Folk, of the question black people silently hear asked of them by too many whites--"How does it feel to be a problem?"--provocatively inverted the pragmatist problematic in a way Dewey and James never fully pondered in their model of (white) agents facing their environments: the problem of being a problem to others. One imagines Menand could have made fascinating arabesques out of that peculiarity.

Then, finally, there is the Franklin problem. It's often forgotten, in an era when Franklin's face stands for thrift and prudence in bank ads, that his reputation, as John Adams wrote in the early nineteenth century, was "more universal than that of Leibnitz or Newton, Frederick or Voltaire," that Jefferson viewed him as "the father of American Philosophy" and Hume agreed. Is a thinker who wrote in his Autobiography in 1784 that "perhaps for Fifty Years past no one has ever heard a dogmatical Expression escape me" far from pragmatism? In his emphases on experience, experimentation and community, Franklin was the proto-pragmatist par excellence. Even in the free-jazz genre of intellectual history, his absence is a large lacuna.

Pragmatism, however, offers special benefits to authors and reviewers. Once one abandons the idea that we mirror the world exactly with our stories, and takes the nervier view that we tell stories about it that may be good for us in the way of belief, the kind of criticism made here--that Franklin, Madison, Delany and other thinkers merit membership in that ironically named "Metaphysical Club"--assumes its humble place. The greater accomplishment--Menand's--is to show that powerfully experienced consequences form beliefs, that beliefs form consequences and that the whole circular process of life teems with blood and pain and laughter that expose the abstract approach of much professional philosophy for the self-interested charlatanism it is. Writing to his father about Agassiz, William James observed that "no one sees farther into a generalisation than his own knowledge of details extends." Accepted as a truism rather than a rebuke, the insight suggests that questions about Menand's choices represent rival stories--what James might have seen as another pluralist tale seeking airtime. Judged by the latter's standards--what difference it makes if this or that worldview is true--The Metaphysical Club casts a vast, brilliant light on the human subtleties of America's most influential philosophical achievement. It's a feast of canny wisdom and sophisticated entertainment, and one hopes Menand's already privileged position in the intellectual elite, and the envy of the specialists, won't muffle the sounds of celebration.

Here I sit so patiently/Waiting to find out what price/You have to pay to get out of/Going through all these things twice.
      --Bob Dylan

Forward, into the past!
      --Firesign Theatre

Nothing was delivered, but I can't say I sympathize.
      --Bob Dylan

In November 1994, dressed in iconic big-polka-dot shirt and black sunglasses, 53-year-old Bob Dylan appeared on MTV's Unplugged. He sang a handful of his greatest hits, mostly 1960s-vintage, some of his most wondrous and paranoid and surreal creations: "Tombstone Blues," "All Along the Watchtower," "Rainy Day Women #12 & 35," "Desolation Row," "Like a Rolling Stone," "With God on Our Side" and "The Times They Are A-Changin'." Not long afterward, he licensed that last tune for use in ads by the Bank of Montreal and Coopers & Lybrand.

Yes, this is the enigmatic legacy of the 1960s, that tar baby of American cultural politics. But the selling of the counterculture was built into what was, after all, a pop phenomenon. The Grateful Dead started peddling T-shirts during the Winterland days with Bill Graham. By the time we got to Woodstock, "counterculture" was a squishy advertising concept. No one at the time saw this better than the artful enigma now just turning 60.

My first Dylan albums were Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde, so for me, Dylan's real value has never been as a political symbol, anyway: He's got everything he needs, he's an artist, he don't look back. As a friend of mine once put it, Dylan opened the toy chest of American popular music so that anyone could play with all of its contents. The remark underscores the breadth of Dylan's catalogue. Only a few musical peers--Ray Charles comes to mind--have done anything as wide-ranging.

Maybe it's not surprising that, like Charles, Dylan seems to have two key qualities: genius and self-protective complexity. From the beginning, the Dance of the Seven Veils between the whirring rumors and the (initially few genuine) facts that surfaced about his private lives has been part of his celebrity allure; it amplified his gyrating lyrics, gave insiders plenty to guess and gossip about, and outsiders a contact high.

The slightly pudgy 19-year-old came to the 1961 Greenwich Village folk scene with a Woody Guthrie playbook on his knee, but he loved Buddy Holly's Stratocaster and Elvis Presley's raw Sun recording sessions and knew he wanted to be a star. The Village folkies, in full creative coffeehouse flight, were generally leftish, middle-class, longing for cultural authenticity and artistic purity, and interested in making something apart from the loathed world of commercial showbiz. That, by contrast, is precisely where Dylan dove headlong as soon as he could. Even before his fabled fiasco at the 1965 Newport Folk Festival, Dylan drew electric guitars and drums--the evil talismans of showbiz--from his toy chest, where they'd been waiting alongside Harry Smith's Anthology of American Folk Music, Hank Williams, Little Richard and Elvis Presley. Anti-Dylan folkies are still as hardfaced about it as jazz purists are about post-Bitches Brew Miles Davis.

As he moved from protest singer to surrealistic prophet, from born-again Christian to born-again Jew, Dylan's life and music registered, however unwillingly or elliptically, his times. This is one reason people have interpreted his Mona Lisa-highway blues smile and his amphetamine/Beat attitudes in their own images. They've translated him into hero, antihero, sellout, savior, asshole, religious zealot, burnout, political radical and artist. Unless it was useful to him, Dylan usually resented being reduced in rank from prophet (he has always credited divine inspiration for his work, and his most apocalyptic imagery rages with echoes of Blake and the Bible) to mere mirror-holder, and he has usually managed to translate himself anew--the protean artist. That is part of his genius, the soul linking his tangled life to his web of art--and, for that matter, his art to his audience.

So, like the decade he's a symbol of, Dylan today is many things to many people. He's an aging rock star composer of some of the most powerful and enduring songs of the past century who loves the gypsy life of the road; a multimillionaire with an Elvis-like entourage who has an un-American lack of interest in personal hygiene; a double-talking celebrity with a ferocious sense of privacy who has spent most of his life in studios and on the road with his ears full--to varying degrees, depending on exactly when we're talking about--of the transcendent sounds he hears in his head as well as the roaring sound of the star machinery and its need for lubrication. Such is the dilemma of any commercial artist. Pop culture is full of the tales. But few if any other pop songwriters have been considered for the Nobel Prize in Literature.

By most accounts (and over the decades there have been plenty) Dylan early on cast himself--first in his mind's eye, then, after he'd established the myths, in fact--as a shadow observer hoboing through life, with his BO and irresistible charm and coldhearted focus and spew of genius. The chorus for this troubadour's life has many members. There are women who sing his praises, care for him, want to protect him. There are ex-acolytes and musicians and business associates wailing the I-been-abused blues. There are core loyalists and friends. There are fawners, often drawn from the same pool as the abused. They all agree, though, that the Bob Dylan they know is an unbelievably private, ironically inarticulate man with nearly unshakable drive and talent.

That was already clear in 1965, when D.A. Pennebaker tagged along for Dylan's last all-acoustic tour of Britain and filmed Don't Look Back. Released in 1967, the movie caused a stir mostly because it unveiled another few sides of Dylan. Now it's been reissued on DVD, with the usual enhanced menu of outtakes (here audio tracks) and commentary (some useful, some silly). The good news is it looks just as murky as ever. With this backstage home movie, Pennebaker was inventing our notions of cinéma vérité: a wash of grimy, grainy images with weirdly impromptu light, in-the-moment vignettes and scenes.

Pennebaker wasn't interested in converting Dylan into a poster boy for activism or peace and love or the Francis Child ballad collection; he grasped the artistic multiplicity that often came out as duplicity. During the movie, Dylan reveals side after side: the manipulative creep; the defensive master of the counterlunge; the insular and sometimes inarticulate star; the smartass provocateur; the hyperintense performer; the chain-smoking, coffee-drinking, spasmic-twitching composer sitting endlessly at typewriters and pianos. And yeah, the nice guy pops up too. It's a portrait of the artist as Zelig.

In Pennebaker's film, this Zelig too has his handler: an owlish, pudgy Svengali, Albert Grossman, who negotiates about money in a couple of revealing scenes. Folk veterans tend to see him as a representative of Moloch: Grossman devised crossover acts like Peter, Paul and Mary and gave them Dylan tunes to sing. He owned a bigger percentage of Dylan's publishing income than Dylan did, though the singer didn't know it then; even people who don't like him agree that Grossman encouraged Dylan to write and experiment. According to Pennebaker, Dylan came up with the movie's famous opening: "Subterranean Homesick Blues" plays while Dylan, wearing a slight sneer, stands on one side of an alley. Allen Ginsberg and Peter Orlovsky stand off to the other. Dylan holds placards with bits of lyrics from the tune, dropping each card to the ground when it goes by on the audio track. It's a neat piece of visual business that bridges Buster Keaton and MTV.

Pennebaker's movie takes place in the last quarter of David Hajdu's Positively 4th Street. The author of the well-received Lush Life, a biography of Duke Ellington collaborator Billy Strayhorn, Hajdu has written an engrossing page-turner that puts early 1960s Dylan into a pas-de-deuxing foursome with the Baez sisters, Joan and Mimi, and Richard Fariña. The narrative's hook is deliciously open-ended. The Baez sisters, performers themselves, were romantically as well as creatively entwined with Fariña and Dylan, two ambitious myth-making weirdos who were womanizers, bastards and, in their different ways, trying to create poetry with a backbeat. Their ever-changing interpersonal dynamics are the intellectual soap opera that is the book's bait.

Hajdu plays out the sexual and creative permutations and combinations in and around this vaguely Shakespearean quartet with narrative panache and just the right tang of gossip and attitude to get it excerpted in Vanity Fair. At its best, his fluent style floats information with deceptive lightness, but he's not lightweight. Hajdu dug through the papers, including unpublished outtakes of Robert Shelton's No Direction Home: The Life and Music of Bob Dylan, talked to plenty of witnesses and tapped new sources; the most notable is Thomas Pynchon, Fariña's Cornell roommate and best man, whom Hajdu interviewed by fax. All this lets him conjure a novelistic immediacy. His well-plotted scenes usually ring true and bristle with evocative detail. He uses his narrative's inherent elasticity to open perspective and depth of field naturally, then skillfully dollies around and pans in and out of larger contexts as illuminating backdrop for his two odd couples. Topics from the history of American vernacular music to contemporary politics, art and architecture add resonance to the main plot.

Hajdu's story starts with the young Baez sisters seeing Pete Seeger ("a sociopolitical Johnny Appleseed during the mid-1950s") in concert and getting their own guitars. It follows Joan to the thriving Cambridge folk scene, where she became a star with a recording contract. Hajdu builds a novelistic collage of perspectives: Baez herself, those she'd already left behind in California, those watching her rise in Boston. This technique shapes the book's storytelling. We see Fariña, for instance, through Mimi's eyes as a basically lovable, if hurtful, rogue genius; through Joan's by turns as accomplice, potential seducer and parasite. We watch Joan's Cambridge friends fret and fume at young Bobby Dylan's riding her to the top while Joan loves him blindly, and we meet other Dylan lovers like Suze Rotolo and Sara Lownds, whom Dylan later married. We wonder why Mimi can't see how Fariña is using her to get to Joan, since nearly everybody else, including Joan, does, and we wonder if he'll succeed. And we hear the chorus of disharmony around the charged moment when Dylan abandoned his image as folk singer; we note that Joan idealistically spurns Albert Grossman and a major record label and Bob signs with both.

It's easy to see how this fly-on-the-wall approach could devolve easily into name- and eavesdropping--a pitfall Hajdu generally avoids. He evokes the aura of the relationship between Dylan and Rotolo by noting that by the spring of 1962 they'd known each other for six months; he tested his songs on her and played Elvis records for her, while she lent him books of poetry--they read Byron and Rimbaud together--and took him to CORE meetings. "He knew about Woody and Pete Seeger," says Rotolo, "but I was working for CORE and went on youth marches for civil rights, and all that was new to him. It was in the air, but it was new to him."

So, although characters and narrative strands multiply as they weave in and out, Positively 4th Street usually avoids feeling cluttered or confused. And the pacing, spurred by the frisson of eyewitness memories, insider gossip and the rush of circumstance, carries you over its rough spots until things skid abruptly to a finish in 1966. That April, after a publication party for his seminal book Been Down So Long It Looks Like Up to Me, Fariña died in a motorcycle crash. Three months later, Dylan had his own motorcycle crash, which pulled him out of the public eye for three years. Hajdu writes, "Precisely what happened to Bob Dylan on July 29 is impossible to reconstruct with authority."

Until now, that was true. But in Down the Highway: The Life of Bob Dylan, Howard Sounes in fact pieces together testimony and circumstantial evidence into a fairly detailed account of Dylan's wreck. (He relies heavily on Sally Grossman, the late Albert's wife.) It's the kind of thing Sounes does well, opening new angles on the enigmatic polyhedron that is Dylan. An indefatigable reporter, Sounes has collected most of the folks in the Dylan orbit and brought into print several, including Dylan family members, who haven't been there before. He has unearthed more detail about Dylan's marriages and divorces and children and lovers and homes, his harassing fans and his tour receipts, even his desperate late 1980s offer to join the Grateful Dead as his popularity ebbed. He has combed the earlier sources and extracted their meat. Exhaustive is the right adjective.

As Sounes sees it, Dylan lives in introverted, near-constant turbulence, buffeted by internal as well as external winds and by his own creativity, which produces constant alienation. We watch obsessive fans stake out his houses, hassle his women and kids, ransack his garbage. We learn more of the grimy legal battles (suit and countersuit) between Dylan and Grossman, who for several years, at least, earned much more from Dylan than Dylan did.

Dylan did know lots of women, and they parade dizzyingly by: sincere Minnesota folkie madonnas, Village political sophisticates like Suze Rotolo, Baez, Suze again, his first wife Sara, Baez again, back to Sara, various side trips, a string of black backup singers like Clydie King and Carolyn Dennis, who, Sounes reveals, had Dylan's child and secretly married him. So do his musical cohorts from over the decades, who retail variations of the same tale: Little contact, little to no rehearsal, vague if any instruction. Even members of The Hawks, later known as The Band, arguably Dylan's closest creative associates in the late 1960s, shed little light on the man and his muse. It's not surprising, then, that in discussing Dylan's visual artwork collected in Drawn Blank, Sounes writes, "Mostly Bob seemed to be alone in empty rooms. He often drew the view from his balcony, a view of empty streets, parking lots, and bleak city skylines."

That's as close as Sounes gets to piercing Dylan's veil. Even in this monumental bio, just as in Hajdu's book, the star of the show flickers like a strobed image through the crosscut glimpses of his intimates. The facts and tales pile up; the figure behind the screen seems to come into clearer focus but never quite emerges. Still, his complexity is elucidated--which may be the best anyone, including Dylan himself, can do.

Sounes's book has its drawbacks. Its workmanlike prose lurches periodically into fanzine or tabloid rambles by the author or his witnesses. (Why open with what reads like a magazine story about the party that followed Dylan's "Thirtieth Anniversary Concert"? Why ask Jakob Dylan, now a pop star in his own right, if he thinks he'll measure up to his dad?) It gropes for the "inner" Dylan and sometimes comes up silly. (It's not at all clear Dylan has "conservative" beliefs, as Sounes asserts, aside from desperately wanting privacy for himself and his families. It does seem that he, like most folks, has a floating mishmash of an ad hoc personal code.) With all those facts pressing on him, Sounes can also warp chronology in a confusing fashion. (Why, when first introducing Dylan's manager Grossman, dwell in such detail on the court battles that broke out between them seven years later?) But the bulky research and reporting make up for relatively minor lapses in style and sensibility.

Inevitably there are spots when Sounes and Hajdu overlap and disagree about what happened. Take Newport 1965. Sounes retails the traditional story of how outraged fans, shocked at Dylan's betrayal of acoustic music and, by implication, folkie principles, booed Dylan's electric set. Early on, Pete Seeger and Dylan himself helped promote the tale. Hajdu suggests, via other witnesses, that people were screaming about the crummy sound system, and he wonders, as others have, how 15,000 fans could have been shocked by an electric Dylan set after hearing "Like a Rolling Stone" on the radio that summer. Look at it this way: The doughnut is being filled in, but the hole in the middle remains. Dylan's lifelong attempts to fog his personal life may have been rolled back more than ever, but blurry patches still linger, subject to interpretation and debate, just as they always will with the decade of which he--for better or worse, rightly or wrongly--is still an emblem.

The almost exact coincidence in time between the destruction of the Buddha figures by the Taliban in Afghanistan and Mayor Rudolph Giuliani's renewed jihad against the Brooklyn Museum vividly underscores the problems that authorities seem to have in dealing with images. It hardly matters whether it is the most sophisticated city in the world or one of the world's most backward countries--authorities form Panels on Decency or mount Exhibitions of Degenerate Art or ship avant-garde painters off to rot in gulags or divert funds badly needed for the relief of famine to pound, with advanced weaponry, effigies into rubble. And let us not forget Plato's scheme for ridding the Just Society of mimetic art generally. As these examples suggest, iconoclasm cannot always be explained with reference to religious orthodoxy. William Randolph Hearst and Congressman George Dondero of Michigan did what they could on grounds of patriotism to cleanse America of any images that smacked of Modernism. "Art which does not beautify our country in plain simple terms that everyone can understand breeds dissatisfaction," Dondero proclaimed. "It is therefore opposed to our government and those who create and promote it are our enemies." Why should our taxes support imagery of which our officials disapprove? (The answer, of course, is that they were not elected to tell us what we could see--they were elected to secure our basic freedom to make up our own minds on matters of expression, artistic and otherwise.)

Renee Cox's suddenly famous photograph, which shows a naked woman at a dinner party, has been stigmatized by Mayor Giuliani as indecent and anti-Catholic. It is in fact neither. The title, as everyone in the world now knows, is Yo Mama's Last Supper, but Yo Mama has been one of the ways in which Cox has referred to herself since the time when, enrolled in the Whitney Independent Study Program, she did a number of large nude photographs of herself pregnant and, later, with her son. The title in effect means "The Last Supper According to Renee Cox," and the art-historical reference is to the Last Supper according to Leonardo da Vinci. There are a great many pictures of Christ's last meal with his disciples, all of them by the nature of the case interpretations, since literal pictorial records are out of the question. Cox's interpretation enjoys the protections of the First Amendment, but one loses a great opportunity in thinking of her work--or anyone's work, for that matter--merely in terms of the artist's right to make it or the museum's right to display it. Cox is a serious artist, with serious things to say in her chosen medium. The First Amendment exists to protect the freedom of discourse, rightly perceived as central to the intellectual welfare of a free society. Art belongs to that discourse, and our taxes support museums in order to give citizens access to it. Mayors should be first in line to secure these rights and benefits rather than voice hooligan pronouncements against art for the evening news.

Yet the history of images is also the history of forbidding the making of images. This interdiction is wholesale at Exodus 20:4, where Jehovah prohibits any likeness of anything that is in heaven above, or that is in the earth beneath, or is in the water under the earth. There is an implied thesis in pictorial psychology in this commandment, which probably goes to the heart of the matter: People have a hard time not believing that there is an internal connection between pictures and their subjects. If you can place a picture of an antelope on your cave wall, you have made an antelope present in the cave. If you have a picture of a saint before you, the saint herself is right there, mystically present in her icon. So if you pray before the icon, your prayers are immediately heard by her whose image it is. It was this intimacy with holy beings that made icons so greatly cherished in early Christianity, and that accordingly made them so vexed a political nuisance in the Byzantine Empire, which was torn asunder for more than a century by controversy over what we might think of as pictorial metaphysics. The arguments pro and con had an intricacy and deviousness that help give the term "byzantine" its familiar meaning. But when the Iconoclasts were in power, it also meant an actual destruction of icons so thorough that very few of what must have been an almost countless number of them have survived.

Drawing is said to have been invented by a Corinthian girl, Dibutades, who traced the outline of her lover's shadow on the wall so that she would keep a trace of him with her when he left. Images in their nature have outlines, which is why Byzantine theorists regarded every likeness of God as false: God has no outlines, and so to picture God is to represent God as finite. The Byzantine practice of worshiping God through worshiping an icon of God is idolatry, which is the worship of finite things. And it was the intent of Exodus to forestall idol worship. The problem this presented to the established religion was that the church in fact exercised monopolistic control over images, and prohibition accordingly had deep economic consequences, given the appetite for images that was a defining trait of Byzantine culture. Supporters of icons had a clever answer. Toleration of images is one of the grounds on which Christianity distinguishes itself from Judaism and indeed Islam. The whole message of Christianity rests on the proposition that God decided to save humanity from sin by self-incarnation in human form. But human beings in our nature are finite. Since God is Jesus, in worshiping Jesus one is worshiping an infinite being in finite form. Indeed, we have Jesus' own testimony for the acceptability of images, since he himself conferred his image upon Saint Veronica, who offered him her veil to wipe his brow with as he struggled up the road to the cross: When she received it back, there was the image of Christ's face, like a photographic impression. This was considered a miracle, and Veronica's veil is one of the most important relics in the Church's large inventory.

The identity of the persons of the Trinity is the most abstruse and contested teaching of the early Church, but once the decision is made to take on human form, the question of gender immediately arises, and this brings us to the Brooklyn case. Humans are sexually dimorphic, so the question cannot be avoided. Could God have chosen to be incarnate in a female body? To say that God could not have is inconsistent with God's power. My sense is that a male body would have recommended itself at that moment in history, in order to make sure that Jesus would have a respect and authority not ordinarily accorded females. But does this rule out that Jesus could be represented as female? That might have been difficult for worshipers to deal with during certain stages of iconography, though it should hardly be an insuperable problem, once we appreciate that pictures may be regarded as symbols rather than mere likenesses. Not even the first Christians had difficulties in accepting that Christ could be represented as a fish! The Greek word for fish, Ichthys, acted as an acronym for "Jesus Christ God's Son Savior." One of the great theologians went so far as to play on the idea that through the sacrament of baptism, water is the medium in which we live, so that Christians, like Jesus, are fishlike in nature.

The masculine identity of Jesus is explicit in representations of the Christ child in Western art, over and over again shown with a penis, often pointed to in pictures, sometimes by the Christ child himself. The great art historian Leo Steinberg has made this the theme of a major contribution, The Sexuality of Christ in Renaissance Art and in Modern Oblivion. Any ambiguity on the matter raises difficulties of interpretation. When, for example, pilgrims carried lead badges showing Christ bearded and crucified but wearing a robe, these were found puzzling in Northern Europe, where only women wore such garments. Here is the reasoning that resolved the issue: On the evidence of dress, the figure had to be female. (Evidently clothing trumps beards, since there are bearded women.) A myth evolved that the bearded woman was Saint Wilgefortis, which derives from virgo fortis--Strong Virgin. Wilgefortis, a beautiful virgin, wanted to devote her life to Christ but was betrothed to the King of Sicily. She prayed that she be made ugly, and God answered by causing a beard to grow on her face. The King of Sicily, disgusted, canceled the wedding. Her father was so angry that he had his bearded daughter crucified. Thus grew up the cult of Saint Wilgefortis, and her worshipers, praying before the figure of a bearded woman, were unbeknownst to themselves really praying to Christ.

An image of a crucified person wearing a dress could be, taken literally, Saint Wilgefortis, or symbolically it could be Jesus. The central figure in Yo Mama's Last Supper, since nude, is hardly ambiguous in point of gender. But it is ambiguous as to whether it is literal or symbolic representation. So let's begin to examine the work as art critics:

It is an exceptionally large photograph, in color, consisting of five panels, each 31 inches square. The female figure occupies the entire central panel. She is standing, arms outspread, palms upturned, behind a table, set with some bowls of fruit and a wineglass. Because of the title and certain formal similarities to Leonardo's painting, one has to say that she occupies the place of Christ. I think that it is incidental to the meaning of the picture that Cox photographed herself as Jesus, since I don't think she is suggesting that she is Jesus, or that it is a self-portrait of Renee Cox as Jesus. Rather, she is working along lines associated with Cindy Sherman, who photographs herself but not as herself, with the difference that Sherman has never, so far as I know, shown her own nakedness. Renee Cox has used herself as model for Jesus, symbolically represented as a woman. This is interpretive conjecture: It is impossible to know from the picture alone whether Cox is saying that Jesus was in fact a woman or merely that he is being represented as a woman. The differences are immense, one being about theological, the other about representational, fact. Obviously the two can be connected. No one thinks that Jesus was actually a lamb, but he is often enough depicted as a lamb, and this is thought to be a symbolic way of presenting some deep truth about Jesus. One speaks about being washed in the blood of the lamb, but as Muriel Spark observes in a novel, blood is too sticky to wash with, so the image is poetic license.

In the "Sensation" show (at the same museum and which also drew the Mayor's ire), the British artist Sam Taylor-Wood showed a Last Supper with a woman, nude from the waist up, as Jesus. She titled the work Wrecked. Taylor-Wood's picture is somewhat baroque and even Carravagesque, and in it Jesus looks haunted. Cox's picture is rather classical, with the disciples distributed in two groups of three on either side, and Jesus appears (I would say) magisterial. S/he is holding what I imagine is a shroud over his/her arms and passing behind the body, so as not to conceal her femininity. Taylor-Wood's picture raised no hackles at the time, but this may be explained through hackle-fatigue--unless the fact that Jesus is black in Cox's image is the suppressed premise in the recent complaint.

Since Christ has been shown as a lamb in many wonderful paintings--and continues to be represented by a fish in various gift items and ornaments for automobiles--there is iconographic room for him to be shown in many different ways. Showing God as male is, as I say, a historical contingency. It could be a metaphor, through which one conveys Christ's absolute authority, males traditionally having that in patriarchal societies. But there is a more central consideration. Let us remember that the whole message of Christianity is that God took on a human form in order to redeem us through his suffering. There is a magnificent piece of criticism by Roger Fry of a Madonna and Child by Mantegna. "The wizened face, the creased and crumpled flesh of a new born babe...all the penalty, the humiliation, almost the squalor attendant upon being 'made flesh' are marked." In view of the profound suffering both women and blacks have undergone through history, it would be entirely suitable that Christ be represented as either of these, or both. It is true that in Cox's picture, Christ looks exalted and self-certain. It is a picture of someone defiant and prepared to face down her oppressors. But it is, on whatever symbolic level, after all a picture of God. Taylor-Wood's picture is of Jesus as human. But the important truth is that Jesus is supposed to have been both, and the issue of what gender the human is to be in a given representation is a matter of delicate interpretational negotiation.

These are the considerations on which I want to deny that the picture is either indecent or anti-Catholic. The Mayor blurted out these epithets when he was shown a photograph of Yo Mama's Last Supper in the Daily News. Giuliani can always be counted on to make entertaining noises in the presence of art. He might have said the same thing had an artist scanned a picture of a fish into Leonardo's painting. I appreciate the fact that the Mayor has more pressing things to deal with than pondering the mysteries of Christ's body or the language of religious symbols, but if the so-called Decency Panel he has formed presses forward, I think he owes it to art and to his religion to ask that pictures that offend him be explained to him. I would be astonished if the panel he has appointed is interested in doing that on his behalf. If I were summoned as a witness, I would be eager to point out the complexities of interpretation involved with the art that comes before it, and to urge that the panelists consider the art the way a critic considers it, from the perspective of what view is being visually advanced. Seen that way, it becomes a matter of finding plausible critical hypotheses and then seeing whether they could not be true--giving the art the benefit of the doubt. I cannot imagine the panel having to meet very often, once its meetings turned on such matters of interpretation. The issue finally becomes of a piece with conflicts in society at large, where we have learned to tolerate views whether we like them or not.

There is, to be sure, a distinction between protecting a right and supporting an art museum with our taxes. There are those who see free expression as a right but not necessarily a public right to art museums as institutions. That question reduces to one of why we should have art museums, paid for by our taxes. My view is that it would not be art if it did not advance views, whether or not those views agree with mine. So you can't have art museums without the question of freedom of expression arising. (Whether there should be museums at all is another question entirely, though fortunately it is not the mayoral panel's charge to answer it!)

So let's imagine that after all the explanations, an image really is anti-Catholic and indecent. Should our tax dollars support such art--or further, since any view can be expressed in art, are there other views we would not want expressed in our art museums? I say that if it can be expressed outside of art, there is room for it in the museum if expressed as art. Let us take a very controversial view--that abortion is murder. That is part of the discourse on abortion, and it is certainly at the heart of the "prolife" movement. A painting that shows an abortion clinic with the title Massacre of the Innocents has a right to be shown if the belief it expresses has a right to be voiced--as it of course has. It is offensive to prochoice advocates, but hanging it in an art museum harms them less than having to face people shouting their position in front of clinics. A painting showing antiabortion protesters jeering in a very ugly way could be painted by someone like Leon Golub, and it would be offensive to them in just the same way.

All this takes us a long way from Renee Cox's photograph, and it shows how irrelevant to the deep issues of expressive freedom a panel on decency really is. These days, "indecency" is a fairly marginal infraction, since questions of fittingness and suitability are almost impossible to arbitrate. If anything is unsuitable, I would suppose it is officials talking recklessly about art when they are representatives of a city in which interest in art is profound and serious talk about art is as expressive of the city's soul as talk about baseball. A city of great museums and universities, a beacon of high culture to the world at large, deserves decency in discourse about art on the Mayor's part. I would not insist on a panel to keep the Mayor in line.


Jean Clair, director of the Musée Picasso in Paris and widely respected both as scholar and art critic, has for some years been out of sympathy with contemporary art. When he and I shared a platform in the Netherlands a year ago, he spoke of a new aesthetic marked, in his view, by repulsion, abjection, horror and disgust. I have been brooding on this ever since, and particularly on disgust as an aesthetic category. For disgust, in Jean Clair's view, is a common trait, a family resemblance of the art produced today not only in America and Western Europe but even in the countries of Central Europe recently thrown open to Western modernity. We do not have in English the convenient antonymy between goût (taste) and dégoût (disgust) that licenses his neat aphoristic representation of what has happened in art over the past few decades: From taste...we have passed on to disgust. But inasmuch as taste was the pivotal concept when aesthetics was first systematized in the eighteenth century, it would be a conceptual revolution if it had been replaced by disgust. I have never, I think, heard "disgusting!" used as an idiom of aesthetic approbation, but it would perhaps be enough if art were in general admired when commonly acknowledged to be disgusting. It is certainly the case that beauty has become a ground for critical suspicion, when its production was widely regarded as the point and purpose of art until well into the twentieth century.

Though "disgusting" has a fairly broad
use as an all-around pejorative, it also refers to a specific
feeling, noticed by Darwin in his masterpiece, The Expression of
the Emotions in Man and Animals
, as excited by anything unusual
in the appearance, odor or nature of our food. It has little to do
with literal taste. Most of us find the idea of eating cockroaches
disgusting, but for just that reason few really know how cockroaches
taste. The yogurt that sports a mantle of green fuzz--to cite an
example recently mentioned in a New Yorker story--may be
delicious and even salubrious if eaten, but it elicits shrieks of
disgust when seen. A smear of soup in a man's beard looks disgusting,
though there is of course nothing disgusting in the soup itself, to
use one of Darwin's examples. There is nothing disgusting in the
sight of a baby with food all over its face, though, depending on
circumstances, we may find it disgusting that a grown man's face
should be smeared with marinara sauce.

Like beauty, disgust is in the mind of the beholder, but it is one of the mechanisms of acculturation, and there is remarkably little variation in our schedules of what disgusts. So disgust is an objective component in the forms of life that people actually live. The baby is very quickly taught to wipe its face lest others find it disgusting, and we can hardly forbear reaching across the table to remove a spot of chocolate from someone's face--not for their sake but for our own. What Jon Haidt, a psychologist at the University of Virginia, calls "core disgust" has become a field of investigation for him. He and his associates set out to determine the kinds or domains of experience in which Americans experience disgust. Foods, body products and sex, not unexpectedly, got high scores when people were queried on their most disgusting experiences. Subjects also registered disgust in situations in which the normal exterior envelope of the body is breached or altered. I was philosophically illuminated to learn that of fifty authenticated feral children, none evinced disgust at all. But I am also instructed by the fact that my cultural counterparts are disgusted by what disgusts me, more or less.

This overall consensus encourages me to speculate that most of us would unhesitatingly find the characteristic work of the artist Paul McCarthy, largely live and video performance, disgusting. There may be--there doubtless is--more to McCarthy's art than this, but whatever further it is or does depends, it seems to me, on the fact that it elicits disgust. It may, for example, debunk a false idealism McCarthy regards as rampant in Hollywood films, advertising and folklore, as one commentator writes. But it achieves this just so far as it is disgusting. It may relentlessly and rigorously probe the airbrushed innocence of family entertainment to reveal its seamy psychic underpinnings, to cite another critic. So it may show what really underlies it all, the way the worm-riddled backsides of certain Gothic sculptures, whose front sides showed attractive men and women, were intended to underscore our common mortality. But that does not erase the fact that maggots count as disgusting. So possibly McCarthy is a kind of moralist, and his works are meant to awaken us to awful truths, their disgustingness a means to edificatory ends. That still leaves intact the revulsion their contemplation evokes. Disgust is not something that can easily be disguised. Beautiful art, Kant wrote, can represent as "beautiful things which may be in nature ugly or displeasing." But the disgusting is the only "kind of ugliness which cannot be represented in accordance with its nature without destroying all aesthetic satisfaction."

"Nothing is so much
set against the beautiful as disgust," Kant wrote in an earlier
essay. So it is all the more striking that McCarthy's commentators
attempt to find his work beautiful after all. I wanted to think about
the question of beauty in your work, an interviewer murmured, to move
from the manifest to the latent. The New York Times speaks of
the "unlikely beauty of the work," adding that it is "not standard
beauty, obviously, but a beauty of commitment and absorption." I have
to believe that McCarthy's perceptions can be very little different
from the rest of ours. He has, indeed, almost perfect pitch for
disgust elicitors, and accordingly making the art he does must be
something of an ordeal. That may have the moral beauty that
undergoing ordeals possesses, especially when undertaken for the
larger welfare. But if it is that sort of ordeal, then it has by
default to be disgusting. As the Gothic statuary demonstrates--or for
that matter, the history of showing the fleshly sufferings of Christ
and the martyrs--artists down the ages have had recourse to some
pretty disgusting images for the ultimate benefit of their viewers.
(Taking on the iconography of Disneyland, as he does, is hardly
commensurate with overcoming Satan's power, but I'll give McCarthy
the benefit of the doubt.)

Something over three decades of McCarthy's work is on view through May 13 at New York's New Museum of Contemporary Art in SoHo, and since he is widely admired by the art establishment, here and abroad, there are prima facie reasons for those interested in contemporary art to experience it. The disgusting works have mainly to do with food, but--citing Haidt--disgust is, at its core, an oral defense. There is no actual gore, though McCarthy uses food to evoke the images of gore. Similarly, there are no actual envelope violations; no one is actually cut open. But again, various accessories, like dolls and sacks, are enlisted to convey the idea that the exterior envelope of the body is breached or violated. McCarthy makes liberal use of ketchup in his performances, and in interviews speaks of the disagreeable smell of ketchup in large quantities. That is part of what I have in mind in speaking of his art-making in terms of ordeal. There may or may not be actual shit, but chocolate is what one might call the moral equivalent of feces, as you can verify through watching a few minutes of his Santa Chocolate Shop. Karen Finley used only chocolate to cover her body in the performance that landed her in hot water with the National Endowment for the Arts a few years ago--but everyone knew what she was getting at.

The use of foodstuffs distinguishes McCarthy's art from that of the so-called Vienna Actionists of the 1960s--Hermann Nitsch and Otto Mühl are perhaps the best known, though the Actionist Rudolf Schwarzkogler attained a happily unmerited notoriety through the rumor that he cut bits of his penis off in successive performances of Penis Action. The Actionists made use of real blood and excrement, and excited at least the illusion of humiliation through such happenings as that in which a broken egg was dripped into Mühl's mouth from the vagina of a menstruating woman. They were heavily into desecration. McCarthy is pretty cheery alongside these predecessors. His work refers to nursery rhymes and children's stories, and he makes use of stuffed animals and dolls, often secondhand, and costumes as well as rubber masks from the joke shop. Some writers have described McCarthy as a shaman, but he rightly sees that as something of a stretch: "My work is more about being a clown than a shaman," he has said. As a clown, he fits into the soiled toy lands of his mises en scène, which kick squalor up a couple of notches, as Emeril Lagasse likes to say when he gives the pepper mill a few extra turns.

The clown persona is central to what, within the constraints of McCarthy's corpus, might be regarded as his chef-d'oeuvre, Bossy Burger (1991). But he worked his way up to the creation of this role through a sequence of performances. In these, he stuffed food in his pants, covered his head with ketchup, mimicked childbirth using ketchup-covered dolls as props. In one, or so I have read, he placed his penis inside a mustard-covered hot dog bun and then proceeded to fill his mouth to the point of gagging with ketchup-slathered franks. Throughout, food was placed in proximity to parts of the body with which food has no customary contact. But many human beings are reluctant to touch food that has merely been left, untouched, on the plates of strangers. Disgust is a defensive reflex, connected with fear, even if we know the food that evokes it is perfectly safe and edible. That is why there is so strong a contrast between beauty and disgust: Beauty attracts.

McCarthy got the idea of using food as the medium of his performances in the course of searching for a very basic kind of activity. Inevitably, he had to deal with disgust, which is inseparable from eating as symbolically charged conduct. It is understandable that he would stop performing for live audiences (as he did in 1983) and begin to devise a form of theater to put a distance between himself and his viewers. I would not care to perform Bossy Burger a second time, even if I had the stomach to perform it once. It is perhaps part of the magic of theater that disgust survives as an affect, even through the video screen. It doesn't help to know it is only ketchup.

The action of Bossy Burger transpires in what in fact was a studio set for a children's television program, and the set--a hamburger stand--is exhibited as an installation. It shows the damage inflicted on it by the performance, and looking in through the open wall--or the windows--we see an utterly nauseating interior, with dried splotches and piles of food pretty much everywhere. It has the look of California Grunge, as we encountered it in the work of Ed Kienholz. A double monitor outside the set shows, over and over, McCarthy's character, togged out in chef's uniform and toque--and wearing the Alfred E. Neuman mask that connotes imbecility--grinning his way through fifty-nine minutes of clownishly inept food preparation. Thus he pours far more ketchup into a sort of tortilla than it can possibly hold, folds it over with the ketchup squishing out and moves on to the next demonstrations. These involve milk and some pretty ripe turkey parts. The character is undaunted as his face, garments and hands quickly get covered with what we know is ketchup but looks like blood, so he takes on the look of a mad butcher. He piles the seat of a chair with food. He makes cheerful noises as he bumbles about the kitchen or moves to other parts of the set, singing, "I love my work, I love my work." Everything bears the mark of his cheerful ineptitude. At one point he uses the swinging door to spank himself, but it is difficult to believe this constitutes self-administered punishment. He looks through an opening at the world outside. McCarthy says he envisioned this chef as a trapped person, but whether that is an external judgment or actually felt by the character is impossible to decide from the work itself. Viewers may find themselves wanting to laugh, but a certain kind of compassion takes over. Perhaps it is a test for tenderness. Whatever the case, even writing about Bossy Burger makes me feel queasy.

You won't get much relief by looking at Family Tyranny, in which the character uses mayonnaise and sings, "Daddy came home from work," as he prepares to do unspeakable things to his children. "They're only dolls" helps about as much as "It's only art" does, which underscores Kant's point about disgust. Painter mercifully turns to other substances in its slapstick comedy about the art world. McCarthy plays the role of art star, wearing a sort of hospital gown, a blond wig and huge rubber hands, and he has a kind of balloon by way of a nose. Everyone else in the action--his dealer and his collectors--wears the same kind of nose, which perhaps caricatures the hypertrophied sensitivity that exposure to art might be thought to bring. At one point, the Painter climbs onto a sort of pedestal as an art-lover kneels to smell his ass. In another action, he chops away at one of his fingers with a cleaver, and crows "OK!" when it comes off. This belongs to the iconography of self-mutilation that has, since van Gogh--and perhaps Schwarzkogler--become an ingredient in our myth of the true artist. The Painter's studio is filled with huge tubes of paint (one of them labeled "shit"), and he parodies the Abstract Expressionist address to painting by slapping pigment wildly here and there, rolling it onto a table and then pressing his canvas down onto the paint while pushing it back and forth, all the while singing some version of "Pop Goes the Weasel." Paint, food and blood serve throughout McCarthy's work as symbolic equivalents. I could not suppress the thought that Painter is a kind of self-portrait--there are photographs elsewhere in the show of an early performance in which McCarthy frantically whipped a paint-laden blanket against a wall and window until they were covered with pigment.

It will be apparent that I am a squeamish person, an occupational impediment for an art critic if Jean Clair is right about the new aesthetic (for my response to that contention, see www.toutfait.com/issues/issue_3/News/Danto/danto.html). I am not, however, disposed to prudery, though I have a strong memory of a certain visceral discomfort when I was first writing on Robert Mapplethorpe's photographs. McCarthy's Spaghetti Man I thought was pretty funny. It is a sculpture, 100 inches tall, of a kind of bunny, wearing a plastic grin of self-approval. It could easily be on sale at F.A.O. Schwarz were it not that the bunny has a fifty-foot penis, which coils like a plastic hose on the floor beneath him. It is a kind of comment, but from an unusual direction, on Dr. Ruth's reassuring mantra for insecure males that Size Doesn't Matter. It really does matter from the perspective of masculine vanity, even if Spaghetti Man's organ would put too great a distance between himself and a partner for any show of tenderness during coitus. So its message may well be that we should be grateful for what we've got.

I don't have anything very good to say about The Garden, an installation of McCarthy's on view at Deitch Projects, 18 Wooster Street. The garden consists of fake trees and plants--it was a movie set--in which one sees--Eek!--two animatronic male figures, one doing the old in-and-out with a knothole in one of the trees, the other with a hole in the ground. Some ill-advised writers have compared the work to Duchamp's strangely magical last work, Étant Donnés, where one sees a pink female nude, legs spread, sharing a landscape with a waterfall and a gas lamp. The masturbations in The Garden are too robotic for mystery, and the meaning of all that exertion too jejune to justify the artistic effort. Cultural Gothic, a pendant to The Garden, is in the main body of the show at the New Museum. It is a life-size sculpture of a neatly dressed father and son engaged in a rite de passage in which the son is enjoying sex with a compliant goat. Whether the motor was in its dormant phase or the electricity not working--or the museum inhibited by some failure of nerve--there was no motion when I saw it. I thought that an improvement, but purists might think otherwise.

OK, no Lifelines, no 50-50s, no Audience Participation if you want to be a millionaire: Name the first great African-American sitcom of the New Millennium... Correct! The 2000 presidential election, as perpetrated in Palm Beach and Duval counties.

Imagine, black people actually thinking they could vote. Cue the laugh track. Go to commercial.

If you're already nostalgic for the kind of pure entertainment value offered by the perversely fascinating Florida (bamboozled, indeed), don't fret. There's always the WB (as opposed to the GWB) or the United Plantation Network, to sustain your sense of cultural (dis)equilibrium--as well as a Lester Maddoxian sense of race separation. Ever watch The Steve Harvey Show? Yes? Well, don't be shocked but you may be black: The number-one rated show among African-Americans, it's been all but unknown among the rest of the population.

If the accession of George W. Bush illustrated anything--other than the awesome power of television to stand by and do nothing--it was the cyclical nature of black access to power in this country, on TV or off. In 1876--as we all know now--a rigged election signaled the end of Reconstruction, the rise of Jim Crow, the establishment of the hangman's noose as symbol of Southern recreation and, until the Scottsboro Boys case in 1931, a national coma as regards racial mending.

But only eight years after Scottsboro broke, Ethel Waters was asked to develop a show for a medium that was itself still in development. By the late 1960s, The Brady Bunch had taken the one institutionalized black figure on mainstream TV--the maid--and made her white. By 2001, Jerry Springer was refereeing an on-air fiasco that could only be described as a racist's dream, showcasing, as it does, the dregs of the population, black and white.

That so much of television's black content is currently in syndication--good or bad--is telling. Plenty could argue that Jim Crow is still alive and well on network TV, but it is hard to say that matters aren't better than they were: Many major programs have a major black character; Oprah Winfrey rules the waves. But it's also better than arguable that ever since lynch mobs became more or less unfashionable (except in Texas), television has exercised the kind of social/racial control over our culture that race laws once maintained, and via the same mechanism: Create an artificial universe, with artificial rules; give people little enough to keep them near-starved, but make enough noise about every crumb you do toss their way that the public will think you're a bomb-lobbing revolutionary.

The culture critic Donald Bogle doesn't ascribe so much power, or so much intelligence, to the medium he critiques in Primetime Blues: African Americans on Network Television. But he's certainly cognizant of the power of entertainment to skew one's perception. And oneself. Growing up in the Philadelphia suburbs, Bogle writes, he seldom saw black people he recognized on TV. Or situations, comedic or otherwise, that weren't filtered through a white consciousness. But he watched. And watched.

Early on, it was Beulah, with Waters--and Louise Beavers and Hattie McDaniel--refashioning for an all-new medium the near-mythic character of the wise and/or sardonic black servant. He watched the minstrelized antics of Amos 'n' Andy--which, to its credit, barely acknowledged the white world--as well as the caustic modernism of Eddie "Rochester" Anderson. Later, there were the "events" of Roots and The Autobiography of Miss Jane Pittman, programs reeking of network noblesse oblige. But it wasn't until The Cosby Show, he says, that he realized two things: a previously unknown familiarity with people he was watching, via a seemingly benign, but hugely influential--and successful--NBC sitcom. And an accompanying epiphany about the magnitude of network TV's failure to its black audience.

To no one's surprise, Bill Cosby emerges in Bogle's book as one of the three or four most influential black performers/entrepreneurs in the history of black television (along with Waters, the comedian Flip Wilson and the Wayans brothers, because In Living Color helped put Fox TV "on the map"). But Cosby also ties Bogle up. As a performer, Cosby has been averse to playing the race card for either laughs or points, and his silence has been eloquent. Bogle recognizes this, just as he recognizes that Amos 'n' Andy assumed an existential grandeur by existing in its own black world.

But in Primetime Blues--a companion to Toms, Coons, Mulattoes, Mammies, & Bucks (Continuum), his study of blacks in film--Bogle is torn: There's the sense that every opportunity given, majestically, to African-Americans on TV (itself a repugnantly patriarchal concept) should be used to promote a positive image or political message. Conversely, there's the Realpolitik of mass entertainment. It's rather unclear whether he thinks Julia, the landmark series that debuted in turbulent 1968, starring Diahann Carroll as a widowed mother and nurse (working for the crusty-but-benevolent Lloyd Nolan), was rightfully criticized for not having more truthfully represented black people, whatever that means, or was a landmark nonetheless. When he says that the characters in a show like Sanford & Son might have portrayed real anger about their status and thus taken the show in a different and provocative direction, he doesn't say whether he thinks very many viewers would have bothered to follow along.

In this, Bogle skirts the two basic aspects of television's nature: First, that it is craven, soulless and bottom-line fixated. And second, that it is aimed at morons. Sure, Bogle can cite hundreds of examples of African-Americans being portrayed in a patronizing or demeaning fashion, but how many real white people ever show up on the tube? Shows like The Jeffersons and Good Times were cartoons, the latter perpetrating what Bogle dubs neo-"coonery" via comedian Jimmie Walker. But between The Honeymooners and Roseanne, how many regular series represented white America as other than upper-middle-class, Wonder Bread-eating humanoids? Television, in its democratic largesse, has smeared us all.

Some worse than others. If the only place you saw white people was on the evening news--the one slot where blacks were always assured better-than-equal representation--you'd have a pretty warped idea of white people, too. Which is why, Bogle makes plain, it's always been so important to get respectable blacks on network TV.

The history itself is fascinating. Waters, who acquires a quasi-Zelig-like presence in Bogle's account of TV's early age, personified the medium's ability to diminish whatever talent it sucked into its orbit. The original Ethel Waters Show included scenes from Waters's hit play Mamba's Daughters; eleven years later, she'd be back as Beulah. By 1957, she was destitute, dunned by the IRS and had offered herself up as poignant fodder for Edward R. Murrow's Person to Person, talking about Christian faith and a need for money. Finally, television, never sated, asked one more sacrifice and got it, when Waters tried to quiz-show her way out of debt via a show called Break the $250,000 Bank.

Waters remains a towering figure in twentieth-century American culture; after the fanfares of both Bogle and jazz critic Gary Giddins (whose Bing Crosby: A Pocketful of Dreams ranks her alongside Crosby and Louis Armstrong in her importance to American pop singing), she may be due for a full-fledged resurrection, replete with boxed sets and beatification by Ken Burns. But she isn't the only one the author resuscitates. In trying to achieve as complete as possible a history of the medium-in-black, Bogle also tells the unsung stories of other pioneering African-American performers--such people as Tim Moore, Ernestine Wade, Juano Hernandez, James Edwards--who more often than not had one hit show, then went on hiatus, and from there to oblivion.

Among the encores given by Bogle (author of a first-rate biography of the actress Dorothy Dandridge) is Bob Howard, star of The Bob Howard Show, a fifteen-minute weeknight program of songs that went on the air in 1948 and was the first to feature a black man as host. It lasted only thirteen months. Howard doesn't seem to have stretched his material beyond renditions of "As Time Goes By" or "The Darktown Strutters' Ball." But the most interesting thing, besides his race, was that the network didn't seem to notice it--didn't seem to have a problem with bringing an African-American into white homes. Of course, the networks had yet to hear the five little words that have echoed down through the annals of black TV (and any other progressive programming, for that matter):

What about the Southern affiliates?

Hazel Scott was hardly the 1950s version of Lil' Kim: The elegant, educated and worldly host of the DuMont Network's Hazel Scott Show had already come under fire from both James Agee and Amiri Baraka for allegedly putting phony white airs on earthy black music--so, if anything, she should have been the darling of the powers of early television. But no. Allegations in the communist-watchdog publication Red Channels dried up sponsorship for her show. And even though Scott demanded and got a chance to plead her patriotism before the House Un-American Activities Committee, her show was canceled after just three months. "Scott's fate indicated even at this early stage that television would flee from any sign of controversy, especially political controversy," writes Bogle, who is correct--except when money is involved.

Primetime Blues stands as a history of African-American television, but there's more than enough subject matter to fill two books--a sequel could deal solely with the current ghettoization of the evening airwaves--so Bogle steers mostly clear of analyzing white television (you wish he'd at least dug deeper into the influence of black TV on white TV). But he can't ignore All in the Family. Not only did it spin off one of the most successful black sitcoms ever--The Jeffersons--it had a stronger kinship, albeit an ironic one, to black sitcoms than it did to white. It might even have been a black sitcom, sort of the way Bill Clinton was a black President, by the nature and limits of its experience.

Bogle places himself in the rather illustrious camp (Laura Hobson, author of Gentleman's Agreement, was one critic of the show's "dishonesty") contending that Carroll O'Connor's bigoted Archie Bunker, who brought "hebe," "coon" and "spade" into prime time--and ended up one of TV Guide's Fifty Greatest Characters Ever--did nothing to break down racial barriers but in fact reinforced the very racist attitudes the buffoonish Bunker was supposed to make look ridiculous. Cosby hated it; Lucille Ball (who, it is left unsaid, had one of the top-rated Nielsen shows before AITF premiered) weighed in too, comparing Norman Lear's groundbreaking comedy to the days when "the Romans let human beings be eaten by lions, while they laughed and drank."

CBS pooh-bah William Paley, who originally thought the show offensive, became a big supporter once it became a smash--to the point of ordering that a study he'd commissioned, one that confirmed what critics of the show were saying, be destroyed: "What can we do with it?" Paley asked. "If we release it, we'll have to cancel the show."

Bogle is good at comparing Amos 'n' Andy to In Living Color--shows whose humor would never be viewed the same way by black and white audiences. And he appreciates that while early performers like the Randolph sisters--Lillian (It's a Wonderful Life, Amos 'n' Andy, The Great Gildersleeve) and Amanda (The Laytons, Amos 'n' Andy, Make Room for Daddy)--could add nuance and dimension to otherwise cardboard domestic characters, their roles were mostly nonexistent outside the sphere of their white employers. But he misses what I think is the lasting point of All in the Family: Archie Bunker, a furious, frustrated vessel of negative energy, was defined solely by his hate, solely by his proximity to the people he considered inferior or worse. He existed in a parallel zone to the one that had been created as a ghetto for black performers for decades past--a zone that defined him not by what he was, but what he wasn't. America didn't get it, of course, and CBS didn't intend it, but what All in the Family turned out to be was a perverted version of Amos 'n' Andy.

Which Booker Prize-winner could give Hollywood the boot in the arse it needs and secretly craves? Roddy Doyle, that's who. His Barrytown Trilogy (The Commitments, The Snapper, The Van) is somewhat more consistent than the Godfather Trilogy and less dependent on film tradition. His flicks don't exactly blow Coppola's away, but they're at least as good at sparking a family to rampageous life. It's not images that render Doyle's Dublin Rabbitte clan--it's the talk. Doyle's characters are comets of conversation, a bit like Preston Sturges heroes, daredevilishly suspended in thin plots by sheer velocity and nerve.

Doyle was a Dublin schoolteacher who poured his students' joie de vivre into a novel, The Commitments (1987), about scrappy Irish dole kids who become a soul band. When publishers returned it unopened, Doyle published it himself; then Alan Parker's posse buffed it into one of the best music movies ever, realer-seeming than the current exquisite memory film Almost Famous. It succeeds because it celebrates failure with integrity. As they say about soul music in the film, "It grabs you by the balls and lifts you above the shite."

The Commitments is the best Doyle film because it has Hollywood polish and story shape, but what makes it great is Doyle's untutored talent for dialogue in a medium dominated by words overprocessed and extruded by studios in terror of an original syllable. The Snapper (1993), made for BBC peanuts by Stephen Frears, a London genius who flops whenever he tries to go Hollywood, is a haphazard tale of unwed Dublin motherhood. Lost from the novel it's based on is the inside tour of the mother's thoughts, but still, it's a pure jolt of Doyle dialogue, uncut by movie pros.

What a rush! Who cares if the story has no sense of direction when you've got an intense sense of place and a vital ensemble engaged in the verbal equivalent of a food fight? Even the girl's loathsome impregnator Georgie Burgess (sag-eyed Pat Laffan) is so real, so rooted, you could kiss his puff-pastry face. Doyle captures the fractious loyalty and contained chaos that inspired the comic Martin Mull to say that having a family is "like having a bowling alley installed in your brain." While the eloquently exasperated expectant grandpa (Colm Meaney) tries to pry Burgess's identity out of his "up the pole" daughter, his younger girl high-steps past wearing baton-twirler's duds and a shaving-foam beard; a soused son vomits in the kitchen sink; grandpa-to-be says, "You'll do those dishes!" and gets back to interrogating without missing a beat.

The film The Van (1996), about the Dublin dad's fish-and-chip truck venture, was a bigger comedown from Doyle's Booker-shortlisted book--not enough family feeling. Even so, his cult flicks got him a crack at writing a screenplay not derived from a novel; unhappily, it is derived from all too many movies. The trouble starts with the title: When Brendan Met Trudy. If you're going to quote a famous movie title, why pick one whose title is the worst thing about it?

While there's nothing wrong with stealing, Doyle and director Kieron Walsh are thieving magpies who can't weave bits into a nest for new life. The worst thing about When Brendan Met Trudy is its incessant, inconsequential movie references, no substitute for sturdy characters and witty chaff. In their opening-scene reprise of Sunset Boulevard, virginal 28-year-old schoolteacher Brendan (Peter McDonald) lies face-down on a rain-swept Dublin street as his voiceover suggests that we back up a few weeks to find out how he got there.

The original fulfills that promise with a clockwork plot. This scene is just a one-shot gag: We later find that Brendan tripped in the street, fell and took comfort in mumbling lines from an old movie. He's not dead, just dull, there for no reason besides the filmmaker's wish to quote Sunset Boulevard. Random events happen to Brendan. He sings Panis Angelicus with his church choir (a no-soul band). He absently teaches students whose names he can't keep straight (how can Doyle get nothing from this milieu?). He gets picked up in a pub by Trudy (Flora Montgomery), a determinedly spunky Ellen DeGeneres lookalike; takes her to "an important Polish movie" by "Tomaszewski"; has cute sex with her; suspects her of being the castrator who's (cutely) terrorizing Dublin; and helps her bungle a cutesy burglary of his school. The whimsy is wheezy.

We see clips from Once Upon a Time in the West, The Producers and The African Queen, and Brendan and Trudy re-enact scenes from movies. Brendan gets limp in flagrante in a hayloft. Trudy observes, "What's wrong? You were big a minute ago." He replies, "I am big; it's the pictures that got small." Putting Jean Seberg's New York Herald Tribune T-shirt on Trudy fails to make her Seberg in Breathless. When Belmondo apes Bogey in Breathless, he's his own man. Aping Belmondo, Brendan isn't anybody, just a dead cliché walking. He's very good at mimicking John Wayne's walk at the end of The Searchers--but he ain't goin' nowhere, pilgrim. This movie could be called Airless. Or Something Mild.

Doyle's talent glimmers here and there in the hokey-jokey dialogue; you may find bits charming and me grumpy. Maybe I wouldn't be so disappointed if it didn't come off like a tone-deaf imitation of a real Roddy Doyle movie--one with bighearted characters firmly planted in a real place, whipping up a world out of irreverently poetical words, making me feel like family, banishing the real world by sweeping me up in theirs. Doyle's excruciatingly self-conscious and lumbering farce is not quite shite, it's just the usual, when what we expect from him is a kick in the arse.

Looking Back: First-time director/writer Kenneth Lonergan's You Can Count On Me won Best Screenplay and Best Actress from the National Society of Film Critics instead of the Oscars it also deserved, but how can you expect a bunch of Hollywood types to grasp fully an articulately understated, utterly honest work of art? In Lonergan's tale of an orphaned brother and sister's troubled love, every stammer, rant, skittish glance and awkward silence is precisely in character and scored like music.

Anyone could film an opening scene of a car crash that claims a young couple, but look how sensitively Lonergan handles the next: A cop's face materializes in the obscured glass of a front door. Sheriff Darryl (Adam LeFevre) tells the babysitter of the dead couple's kids, "Would you step outside and close the door?" Darryl's cop-speak must work on drunk drivers, but words fail him now and he's struck dumb with grief. The mute moment is searing; it evokes the closeness of their upstate New York town, and it introduces two symbols of disconnection Lonergan loves: the door and the glass.

We flash forward to the orphaned girl Sammy (hummingbird-alert Laura Linney), now grown, still living in her parents' manse with a wraparound porch like a comforting arm, baking plate-sized cookies for the return of her slouching jailbird hobo brother, Terry (Mark Ruffalo, a real find). On the bus home, Terry smokes joints as if they were his sole source of oxygen--the same way wild-child-turned-churchgoer Sammy smokes cigarettes when her 9-year-old son, Rudy (Rory Culkin, very like his brother Macaulay), is safely tucked in bed.

The town still cramps Terry. Sheriff Darryl is still in his face, confiningly benign. Terry literally can't breathe around the guy, because he'll exhale THC. And when Terry and Sammy meet, Lonergan economically conveys how they've coped with orphanhood in opposite ways. Terry became a Five Easy Pieces-style wandering wastrel. Single-mom Sammy stayed put, raising Rudy and working at a bank run by Brian (artfully blank-eyed Matthew Broderick). Brian is a preposterous martinet, ineptly tyrannical (he asks people to use "a more quote unquote normal range of colors" on their PCs), yet with a nonmean streak. So Sammy feels sorry for him and impulsively takes him to bed. She's always trying to save people.

The story's surface simplicity is deceptive. The relationships between Terry, Rudy, Sammy and her lovers grow together slowly, like frost tendrils on a windowpane. Subtext runs deep, and though he's not the world's most bravura visual director, Lonergan composes a tight symbolic structure connecting apparently desultory events. The climactic punchout scene is not contrived; it closes the circle of the lost-parent theme, and squares with Terry's belief in facing bad facts, not fleeing to faith and tradition. Watching him, you'd never know the 1960s myth of self-actualization was all self-deluded jive. (It sure beats the smug, pothead-bashing moralizing of the otherwise superb Wonder Boys.) Listening to him and Sammy and Rudy and a doleful minister (played well by Lonergan) talk about life, you'd think cinema was an art open to ideas. Plus, it's funny.

I was born by a Kerouac stream under Eisenhower skies
         --John Gorka

The New Folk Movement is now about twenty years old, and John Gorka is one of its leading voices, along with peers like Nanci Griffith and newer arrivals like Ani DiFranco. Over nearly two decades, Gorka has honed his warm baritone and offbeat songwriting skills with 200-days-a-year touring. His fine new album, The Company You Keep (Red House), is a characteristically bittersweet disc, understated but sharpened by deft wordplay and a grasp of life's conundrums and paradoxes.

Gorka fits contemporary notions of a folk musician. He was a history and philosophy major in college, and got his performing start at coffeehouses in the late 1970s. His tunes are self-reflective, wry, pungent, pessimistic but unwilling to despair; they have titles like "Joint of No Return" and "Wisheries." In "What Was That," he sings, "Guess I'd better get back up/Get up off the ground again/Guess I'm really not so tough/Up is farther than it's ever been." And up he goes, discarding regret and clearing a space for whatever future he faces.

His country-tinged group has tasty yet simple arrangements, augmented with guest vocals by DiFranco, Lucy Kaplansky and Mary Chapin Carpenter. He's got real range, from on-point satire ("People My Age," which mocks baby-boomer elective plastic surgery) to the playful ("Around the House"). And like older generations of folk musicians, he uses found material. The lyrics for "Let Them In" come from an unknown soldier in a World War II military hospital; God tells St. Peter to "Give them things they like/Let them make some noise/Give roadhouse bands, not golden harps/To these our boys."

The Eisenhower years saw the beginnings of the postwar folk revival; it gathered strength and followers in the 1960s. Maria Muldaur was part of it; her funky Greenwich Village apartment at the time often hosted other scene-makers, like John Sebastian. (Check out The Lovin' Spoonful Greatest Hits [BMG/Buddha], a wonderful CD of Sebastian's mid-1960s folk-blues-pop quartet issued last year; its only flaw is incompleteness.) This was the first generation of white kids exploring the folk blues, the language invented by dispossessed rural black America that underpins this country's music. Many went on Kerouac-like journeys in search of the originals, turned some up and then recorded them, resuscitating their careers. Their new audience of white college kids on campuses and in coffeehouses was a far cry from the plantations and street corners and juke joints where the music was created.

The rediscovery of rural blues roughly paralleled the rise of the mass civil rights movement, and reflected it. Blues seemed a true folk music, in the original German sense of the word: a manifestation of something fundamental and authentic about a people. Inevitably, some revivalists had a few misguided notions about authenticity, putting acoustic guitars in the hands of electric-blues masters like Muddy Waters. And yet they also knew what they wanted. Blues material certainly couldn't have been less like Tin Pan Alley's: Love may be a central theme for both, but the blues' gritty realism, with its raw sex and violence and irony and humor, exposed the superficiality of 1950s American pop. It let listeners step outside America's conformity. Not coincidentally, it also let them see black people as cultural heroes--a dramatic reversal of racist stereotypes.

Muldaur has always admired Memphis Minnie, one of the few folk-blues musicians who happened to be a woman. Minnie played mean guitar and wrote lusty double-entendre songs about the life she led. Her "Me and My Chauffeur Blues," with the signature lick that Chuck Berry swiped decades later for "You Can't Catch Me," is among the best moments on Muldaur's twenty-fifth album, Richland Woman Blues (Stony Plain). Age has deepened and coarsened the lilting flutters that shaped Muldaur's girlish voice, but to compensate she's developed heft and power. Maybe it's the spirit she's found in the largely African-American church she attends. Whatever the cause, she both evokes her idols and makes their music her own; her emotional identification with them enriches nearly all of these fourteen songs.

Like Gorka's, this disc gathers like-minded souls, a community joined by music and history. Sebastian's nimble John Hurt-inspired fingerpicking backs Muldaur on the opening cut, and the list spins on from there: Taj Mahal, Alvin Youngblood Hart, Tracy Nelson (whose duet with Muldaur, "Far Away Blues," is riveting and heartbreaking) and Bonnie Raitt. With Angela Strehli, Muldaur reprises the Bessie and Clara Smith classic "My Man Blues," in which two women, discovering they've unwittingly been sharing a man, agree to continue the triad "on the cooperation plan," since they like how things are. The fluent piano behind them is courtesy of Dave Matthews.

History has been kind to the Five Blind Boys of Alabama. The original jubilee-style quintet met in 1939, at the Talladega Institute for the Blind, which they snuck out of to sing at a nearby military base. By the 1950s, the peak years of the "gospel highway," the church-based circuit that produced stars like Sam Cooke, they were shouters recording hits for Art Rupe's prestigious Specialty label. These top-tier, soul-rending performances are collected on Oh Lord--Stand by Me (Specialty).

In 1983 they were "rediscovered" in the electrifying remake of Sophocles called The Gospel at Colonus; the musical hit Broadway in 1988. Now they've opened for rock superstars like Tom Petty and have headlined at the House of Blues chain. It sure ain't church, but in the wondrous way of art, the Blind Boys transform everything they perform into a forum for testifying. Clarence Fountain's massive voice is a monument in motion; few other than bluesmen like Howlin' Wolf match his raw timbre and full-lunged forcefulness.

Spirit of the Century (Real World) deliberately crosses gospel with blues, sacred with profane. It joins the three remaining Blind Boys with 1960s-vintage roots diggers like veteran guitarist David Lindley and blues harp great Charlie Musselwhite, who led one of the earliest and best 1960s electric-blues revival bands. They infuse Tom Waits's off-kilter "Jesus Gonna Be Here" with fearsome fervor, and deepen the resonances of Ben Harper's "Give a Man a Home." And they set "Amazing Grace" to the music of that whorehouse anthem popularized in the 1960s, "House of the Rising Sun."

Though he's reimagined himself into something sui generis, Tom Waits has self-evident blues roots. In the early 1970s, he opened for an undersung hero of the 1960s folk-blues revival. Back then, John Hammond was known as John Hammond Jr.; his famous father, the leftist Vanderbilt scion who'd made his name "discovering" and recording black musical talents from Bessie Smith to Charlie Christian, the man who signed Bob Dylan to a major label, was still alive and looming.

The younger Hammond has never reached mass audiences, but some of his students, like an ex-sideman named Jimi Hendrix who worked with him at the seminal Cafe Wha? in the Village, did. Hammond mastered a dizzying variety of folk-blues styles and performed hundreds of old blues, back when the stuff was hard, if not impossible, to find on disc. White kids were scrounging through attics and flea markets and the like searching for old blues records, trying to piece together biographies, compiling oral history and field recordings--work that, along with the Lomax field recordings for the Library of Congress, unearthed most of what we know about blues today.

Long underrated or simply overlooked, Hammond serves up Wicked Grin (Pointblank), and it's a marvelous treat: In a way, it's this year's second Tom Waits disc. That's not a putdown. Old friend Waits produced this edgy album; he also penned and plays on twelve of its thirteen songs. Backed by other roots veterans like keyboardist Augie Meyers, Hammond makes Waits's surreal, character-driven tunes more emphatically bluesy, and Waits endows each song with a sound that evokes different original blues recordings. It's a more creative use of the blues than most have come up with in years. And dig Waits's open-lunged gravel voice dueting with Hammond on the spiritual "I Know I've Been Changed."

Since joining the Yardbirds and then John Mayall's Bluesbreakers and becoming the Godhead of the 1960s British blues revival, which paralleled the folk and blues revivals in America, Eric Clapton has changed fairly constantly, yet remained recognizably the same. That, after all, is how superstardom works. One of the biggest shifts came thirty years ago, when he was touring with Delaney and Bonnie and Friends, and they taught him to sing for real. His next album, Layla and Other Assorted Love Songs, recorded as Derek and the Dominos (PolyGram), showcased his heartbreak, his suddenly raunchy vocals and his slash-and-burn band featuring slide guitarist Duane Allman.

In the decades since, Clapton has made a few good, even great albums and a pile of slush. Last year's much-heralded outing with B.B. King sounded haphazard and undercooked--a shame, really, given what it could have been. By contrast, Reptile (Reprise) is a keeper, reminiscent in style and pacing of the classic album he made as Derek. During recording, his uncle died. Raised by his grandmother, Clapton had grown up thinking his uncle was his brother. His uncle's favorite term of endearment gave this disc its name. On it, Clapton tours his past with consistent conviction, and his guitar is spry and sharp and ready to slice almost everywhere. He taps oldies like "Got You on My Mind," which gets a nice Jimmy Reed-ish blues treatment, and covers Stevie Wonder and Isley Brothers hits. "Travelin' Light," the latest installment in his ongoing J.J. Cale tributes, is stuffed with snarling, rheumy guitars. But "Come Back Baby," his Ray Charles tribute, is the show-stopper. Clapton's overdriven guitar blazes and curdles the clichés of his millions of imitators, and his voice reveals just how rich and craggy it has grown. Brother Ray could still outchurch him without too much pain, but Clapton makes us believe he's got us gathered, swaying, in Charles's pews, to the music compounded of the sinful blues and heavenly gospel, the music called soul.

Over a decade ago, Clapton covered Robert Cray's "Bad Influence" and gave the now-multiple Grammy winner an early boost. Cray started mixing blues and soul with touches of jazz in 1974, working the circuit relentlessly; older bluesmen like Albert Collins and Muddy Waters championed his updated sound and lyrics. As time went on, he pumped up his soul-music aspect, deliberately extending the tradition of singer-guitarists like Little Milton and the B.B. King of "The Thrill Is Gone."

On Shoulda Been Home (Rykodisc), Cray's limber voice and spiky guitar once again merge blues and r&b with Memphis soul, with profitable results. Cray is well-known for tackling topics he sees as contemporary versions of the blues. "The 12 Year Old Boy" may even attract the unholy mob of politically correct leftists and Lynne Cheney followers who've climbed on Eminem's back, much as they would have onto Elvis Presley's. In this hard blues, Cray suggests ways to avoid having a preteen rival steal your lover: "If a young boy hangs around you/You should do what I shoulda did/Send him over to your neighbor's/And hope your neighbor likes kids."

Cray writes lyrics that tell stories, and storytelling is one reason I, like Charlie Parker, dig country music as well as blues. Which brings me to Charley Pride. Pride was a black star in country music in the 1960s, at the height of the whitebread "Nashville Sound," and was inducted into the Country Music Hall of Fame last year.

It's often been said that country music is white folks' blues, but that's what Pride always sang. In the midst of sharecropped Mississippi fields, he hugged his radio to listen to the Grand Ole Opry. Not good enough for the baseball career he wanted, he went into the Army, became a smelter and moved his family to Montana, where he sang part time and caught the attention of touring Nashville stars. Chet Atkins signed him, released his first album without a picture of its singer--so country audiences wouldn't know Pride was black--and started the hits rolling.

Country Legends (BMG/Buddha) collects them. Some, like "Snakes Crawl at Night," are period curiosities. There are solid genre efforts: the wistful look home ("Wonder Could I Live There Anymore"), infidelity ("Does My Ring Hurt Your Finger"). There's a tribute, a nice version of Hank Williams's classic "Honky Tonk Blues." There's his biggest hit, "Kiss an Angel Good Morning," a tune whose hooky bounce always makes me grin as it offers advice to "kiss an angel good morning/and love her like the devil when you get back home." And there's "I'm Just Me," to which you can add racial inferences, if you like: "Some want more and more's a-getting less/I just want what I got/Some wanna live up on a hill and others down by the sea/Some wanna live behind high walls/I just wanna live free."

The panorama that is American folk music opens in all directions on guitarist Bill Frisell's latest, most far-reaching album, Blues Dream (Nonesuch). For years now, Frisell has been integrating elements of jazz, folk, blues, new music, rock, pop, parlor tunes, you name it, into his musical quest. Unlike too many of his contemporaries, though, he's been trying to distill them into something of his own, not slap together yet another postmodern slag heap of influences. With Blues Dream, he has succeeded brilliantly.

The album charts many paths across the American landscape. The tremolo-shimmery title track is a brief minor-mode intro, an evocation of post-Kind of Blue Miles Davis. Track two, "Ron Carter," named for the great 1960s Davis bassist, opens with metallic horn squiggles that wind over a brief bass ostinato and off-kilter guitar licks, then builds with horns and overdriven guitar solos. It evokes and updates 1960s experimentalism--no mean feat--as do the rest of the album's tracks.

Music has one big advantage over the real world: Resolution is always possible, if you want it. Take "The Tractor." It kicks off as backporch bluegrass, drummer Kenny Wolleson and bassist David Piltch laying down a shuffle behind Frisell's arpeggiated rhythms, sometimes doubling Greg Leisz's mandolin. Suddenly a snaky, slightly dissonant horn section slices across it. With each chorus, the fine section--trombonist Curtis Fowlkes, saxist Billy Drewes and trumpeter Ron Miles--connects the riffs, filling in until they're almost continuous, a Monkish counterpoint to hillbilly jazz heaven. It's a brilliant work, a wondrous musical portrait of a melting pot or tossed salad or whatever metaphor you prefer for the multiracial, multicultural place America has never, in sad reality, managed to become.
