In the Acknowledgments section of his biography of Saul Bellow, James Atlas quotes a somewhat greater biographer, Samuel Johnson: "We know how few can portray a living acquaintance, except by his most prominent and observable particularities, and the grosser features of his mind, and it may be easily imagined how much of this little knowledge may be lost in imparting it, and how soon a succession of copies will lose all resemblance of the original."
Johnson knew few of those whose lives he described and none nearly as well as Boswell knew him. (Would he have been as pessimistic about the unreliability of history and biography if he'd read Boswell's book? Probably more so. The truer the portrait, the more repellent to such a subject.)
I'm not as pessimistic about discovery as Johnson was. So, for instance, well as I knew Bellow before reading Atlas's biography, I think I know him better now.
I mean that I know more about the places he lived, what his parents were like, what others thought of him, what he said about many things, including me. (To my surprise, I learned that I was once mentioned in his will and that, perhaps after one of our arguments, I was removed from it.) It doesn't mean that my view of Bellow now is Atlas's. By no means.
Atlas also knows Bellow and was helped by him in the course of writing his book.1 He writes that he immersed himself in Bellow's records and acquaintances far more than he'd done in work for his prizewinning biography of Delmore Schwartz (whom he'd never met). Atlas wonders, though, if familiarity and labor have produced a better book. I think this is a better book, largely because Bellow is a more brilliant and interesting man than Schwartz was. (Indeed, his version of Schwartz in Humboldt's Gift is more interesting, amusing and touching than the one in the Atlas biography, which was--we learn in the new book--inspired by it.)
Better, truer; more interesting, more touching.
The first two distinctions don't matter in works of fiction. So the uproar over Bellow's Ravelstein and the real Allan Bloom doesn't bear on its power as a novel or, on the other hand, on the pain it gave and gives some who saw themselves "portrayed" and/or "betrayed" in it. They do matter, however, for biography. Would Boswell's Life of Samuel Johnson be as good a book if it were a work of fiction--if, say, the Johnson in it hadn't lived or been a totally different man? It would not be. The understanding a biographer establishes with his readers includes the sense that he is telling as much of the truth as he's been able to gather about actual people and events. If that understanding is compromised, it constitutes an aesthetic betrayal different from--and, in my view, worse than--the "betrayal" of a fiction writer's acquaintance in his fiction.
I'm one of the many Bellow friends Atlas interviewed and whom he cites in Bellow: A Biography. Much I know and feel about Bellow is not in the book because I didn't tell Atlas about it. Some of it would have somewhat altered his portrait of Bellow; none of it would have altered it significantly.2
Most of the book's citations from me are from letters Bellow wrote me or I him.3 Such citations constitute the sort of record biographers and other historians have drawn on for the two or three hundred years in which history has been assessed as a function of it. If I'd given Atlas access to my diaries, he would have found another source of Bellow matter that would have expanded--if not deepened, let alone altered--his view of his subject. The subject of every biography has had millions of thoughts and experiences that have never--thank God--been recorded. It means that the gulf Johnson wrote about is an uncrossable one.
The difference between modern history/biography and, say, what constituted their equivalent in Thucydidean Athens or seventeenth-century Europe is enormous. Scholars don't believe that Pericles delivered the magnificent oration that Thucydides attributed to--that is, wrote for--him, though he probably delivered a speech that resembled it. Our problem with a presidential speech today is not the actuality of the words pouring from the presidential mouth but who wrote and even who conceived them. We're content that our conception of Periclean Athens is to no small degree that of Thucydides' interpretation of it, but the historical standard is different for modern events and people, those who leave their tracks in letters and diaries, interviews and film.
Atlas uses such archival materials and such biographical techniques as interviews, and he is far more aware of the hazards as well as the advantages of such usage than, say, Vasari was in his verbal portraits of fifteenth- and sixteenth-century artists, some of whom he knew. An experienced journalist, Atlas has a nose for bias and such vested interest as the desire of ordinary people to be part of the record of extraordinary ones. (This is probably a trait of most biographers.) He also raises the question of how his long biographical labor affected his book. Did he, like his mythical namesake, get so weary of "holding up" the "bewilderingly complex" Bellow world that the exasperated weariness created a portrait as far from actuality as Thucydides' Pericles was from the "actual" Pericles?
I've known Bellow for almost forty-five years. For many of those years, we've been close friends and have said things to each other we may not have said to other people. We have also quarreled, disagreed and not seen each other for months and even years at a time. Our politics have been different, and the difference counted--perhaps more for him than for me. Nonetheless, we are close enough so that a few days before I write these words, we could tell each other on the phone--the first time we'd spoken since my wife and I stayed with him and his wife in their Vermont house two years ago--that we loved each other. We are old men now, and I believe that we both thought it possible that we wouldn't see each other again. In that conversation, I told Bellow that I'd read much of Atlas's book and that he shouldn't be concerned about it. I said that Atlas had built a crate large and secure enough to deliver the marvelous sculpture within.
A few hours later, I finished the last 100 or 150 pages of the book. In them, I detected the kind of weariness Atlas himself mentions, but I saw it as a weariness complicated by judgmental anger. Atlas had interviewed many people who'd been hurt--or said they'd been hurt--by Bellow. Partly as an attempt to maintain his independence of and objectivity about Bellow, partly from exasperated weariness, partly from his sense that he'd surrendered--his verb--his life to another man, a man whom he'd been seeing in part through the angry eyes of others, Atlas became harsher and harsher in his assessments. So I wrote Bellow telling him that although what counted--the portrait of a remarkable person becoming over decades ever more remarkable--was intact, I believed that it was deformed by Atlas's querulous anger, if not by sanctimonious contempt, and that he and Janis (Bellow's wife) would do well not to read it. "Hector and Andromache," I wrote, "don't need to know Thersites' version of their lives."
This was perhaps as unfair to Atlas as I thought he was, at times, to Bellow, but then Atlas writes that I am Bellow's "old and loyal friend," the "Boswellian explainer of the great man to the general public," so any unfairness to him has been--clairvoyantly?--subverted.
Very well. As friend of subject and author,4 I am disqualified from reviewing this--I'll risk two adjectives--fascinating and sometimes brilliant book. I will instead talk about Johnson's concern, the gulf between actuality and its representation in biography, conversation and history.
I've read a number of books and hundreds of articles about people I've known. There are few, though, from which I've not learned often surprising, even shocking, facts, none in which I haven't felt at least some distance between what was written and what I knew. At times, as in the case of Bellow, my complex admiration for the central portrait has complicated and deepened my admiration for the friend portrayed. Reading remarks Bellow made or wrote years before I met him made me realize even more how remarkable a person he was and is.
Twenty-odd years ago, the day after I finished reading the manuscript of Humboldt's Gift, I had lunch with its author and said to him that it was difficult for me to think that the man across the table was the same man who'd written that profound, delightful and beautiful book. The man eating a sandwich and drinking tea talked with me about ordinary as well as extraordinary things, but nothing out of his mouth came close to the depth and beauty of what was on its best pages, and I said something like, "Yet there's less distance between you and your work than between any writer I've known and his."5
Atlas's biography has narrowed that distance for me. For all the schmutz that accumulates about and spatters the central portrait, it emerges as that of a very great man becoming great in the course of a long life of activity, acquaintance, introspection and expression. There is more original power in the intelligence recorded here than in 95 percent of biographies. Atlas does not have the mimetic power of Boswell or of a writer he rightly praises here, Mark Harris, author of a delightful Bellow book called Saul Bellow: Drumlin Woodchuck;6 he does not have the stylistic or analytic gifts of Samuel Johnson or Richard Ellmann, but what he does have is access to hundreds of brilliant Bellow observations and analyses outside of Bellow's books. Atlas's Bellow is like a match, Atlas's contribution being the assemblage and, perhaps, the wooden stem, Bellow's the sulfur that, rubbed, ignites and fires the wood.
The day the galleys of this book arrived in the mail, I saw my sister-in-law, who, days earlier, on a trip with her husband to Israel, had swum in the Dead Sea. She said there were all sorts of perils there, the crystalline spears one dodges to get to the viscous water, which deposits a salty scum on one's skin, and the water's semi-impenetrability, so that if one somehow managed to dive into it, ascending would be dangerously difficult. I felt an analogy to the perils of biography. The subject is himself almost impenetrable, guarded by fearful suspicion and his own complexity; even after getting access to him, the progress is difficult, and biographer-readers are left with the scum of his resistance to their penetration.
I've thought and talked about Bellow--and now this biography--with a few friends who also know him. Each sees Bellow in a somewhat different way; all condemn Atlas's version more than I. (I credit Atlas for collecting and organizing the materials that enable us to know more about Bellow; they blast him for his inability and/or unwillingness to understand him.) One friend, a first-rate novelist, thinks Atlas not only misunderstands Bellow's radical independence but resents it. So he sees a politically correct Atlas piling up criticism along familiar--to Bellow critics--misogynist, conservative and racial lines. He thinks that Atlas is shocked by Bellow's anarchic "cocksmanship," and when I suggested that Bellow had a grand streak of bad boy, if not outlaw, in him, he found a different way to express his own view: "He's a transgressive monkey. And a great con man."7 He makes Bellow into a version of a favorite character of his own fiction, a brilliantly anarchic, half-crazed sexual adventurer.
A former woman friend of Bellow's talked of his powers of devotion and charm. She detests Atlas's portrait, especially the account--to which she feels one of her letters has contributed--of his lovemaking.8 "He made me feel wonderful. I still love him." (She hasn't seen him in ten years.)
I myself have written about Bellow as a man simpler in many ways than other people, one who very early in his life discovered his own powers and let them set his course. More important than what happened to him--and I'm persuaded by Atlas that such things as the death of his mother help explain much later behavior--were these exceptional powers, an extraordinary memory, an extraordinarily acute and cultivated sensorium (visual, musical, olfactory, tactile) and--let's call it--emotional power (unusual ability to empathize, sympathize, love, hate and also, be detached). Like most of us, Bellow is many things, but unlike most of us, he's more of a piece and has been that way a very long time. The piece is stamped "writer," indeed "great writer," and the pressure of that stamp isn't like most other professional pressures; but this is something that is talked about ad nauseam meam, and I'll not add to the nauseating complex here.
What I've mostly wanted to hint at is the difficulty of writing, reading and being the subject of other people's descriptions of oneself, and to spell out what Johnson said was the distance between the real, the remembered and the written version of reality, the deformation of the "was" in the "is."
Yet such versions are what we have of the past, the history and biography with which we're left. One work of history can challenge or even refute another, or it can add, refine or subtilize it. Even memories rub against one another. Yet I do not subscribe to the notion (of, say, Peter Novick's splendid book That Noble Dream) that tries to dispose of the actuality of objectivity. I don't think we should abandon the recording of actuality as an ideal or ever think that there's no crucial difference between what we believe is actual and what we know we've made up or lied about. Nonetheless, what we get when we describe something or someone is, at its driest and purest, metamorphosis.
The greatest--at least the most delightful--investigator of such metamorphoses, Marcel Proust, claimed that only in what he called "involuntary memory" does the past ever re-emerge with its original--and even more than its original--power. (Beckett's comment about this was that Proust showed that the only real paradise was a lost one.) That sensuous, unsummoned memory is clarified as reflections in a clear pool are, free of the dust particles and blinding light that make what's reflected almost impossible to see.
Atlas's Bellow is a work built around voluntary, elicited and recorded memory. It is a version of actuality that may be read, sometimes with shivers of remembrance, by its subject and his acquaintances. It has a truth of its own, somewhere between the original actualities, the complex feelings and memories of those who supplied the author with data, and the author's own gifts and feelings. The portrait of the great man who is its subject will be difficult to dislodge. Luckily, the man has left a far more powerful self-portrait, that of the mentality behind his beautiful books.
1. Although Bellow recently told me that he "opened himself" to Atlas, who, lately, seemed to have turned away from him. I said that Atlas probably didn't want his work to be compromised by affection. After I wrote him not to read the book, he answered that he wouldn't, that there was "a parallel" between it "and the towel with which the bartender cleans the bar." This image of biography as the soak of spilled drinks is the sort of thing Bellow has invented for most of his 85 years.
2. One description of me there is so peculiar--"the Oblomov-like Stern"--that I actually wrote Atlas to ask what it meant. When I told Bellow, he said that Atlas had probably not read the wonderful Goncharov novel. When I questioned the adjective in a letter to Atlas, he replied genially that Oblomov "is a sympathetic character and so are you."
3. Most of our letters are filed in the Special Collections of the Regenstein Library at the University of Chicago.
4. Cf. Atlas's well-done interview with me, originally commissioned by George Plimpton for the Paris Review, in Chicago Review, Fall-Winter, 1999.
5. No one seemed more different from his work to me than Samuel Beckett, whom I saw about once a year between 1977 and 1987. Cf. the portrait of him in One Person and Another (Dallas: Baskerville Books, 1993).
6. A book dedicated to me in which I play a minor role.
7. We both remember Bellow's early portrait of the terrific Chicago con man, Yellow Kid Weil.
8. One of John F. Kennedy's "girls" is said to have described the relationship as "the greatest thirty seconds of my life."
To buy or not to buy turns out to have been the question of the century in America--Just Do It or Just Say No. And in the past fifteen years, consumer society has moved to the center of historical inquiry as well. It began with the social history of commercial culture and the advertising industry, in books such as Kathy Peiss's Cheap Amusements: Working Women and Leisure in Turn-of-the-Century New York (1986) and Roland Marchand's Advertising the American Dream (1985). Drawing inspiration from the pioneering anthropological explorations of Dick Hebdige (Subculture: The Meaning of Style, 1979), Arjun Appadurai (The Social Life of Things, 1986) and, especially, Mary Douglas and Baron Isherwood (The World of Goods, 1979), investigators then turned to the cultural history of how ordinary people use and assign meanings to commodities. A good example of this genre is Alison Clarke's Tupperware: The Promise of Plastic in 1950s America (1999). In recent works--such as Robert Collins's More: The Politics of Economic Growth in Postwar America (2000) and Alan Brinkley's The End of Reform: New Deal Liberalism in Recession and War (1995)--they have studied the political history of how nation-states promote and foster particular regimes of consumption. Where once consumption was deemed relevant only to the history of popular culture, in other words, it is now seen as intertwined with the central themes of American history, touching as it does on economics, politics, race relations, gender, the environment and other important topics.
Gary Cross, a professor at Penn State University and a pioneering and prolific historian of Europe and America, has explored the social, cultural and political dimensions of consumption before. In the past decade, he has published a half-dozen books on topics ranging from the history of leisure and working-class commercial amusements to the material culture of children's toys. Cross may study leisure, but his scholarship suggests that he doesn't take a whole lot of time to participate in consumer society. Fortunately, his work ethic has enabled the rest of us to understand our consumer ethic with clarity and historical perspective. Indeed, An All-Consuming Century displaces Daniel Horowitz's still-impressive but less wide-ranging The Morality of Spending (1985) as the best survey yet written of the history of modern American consumer society. Much more than a summary of recent scholarship (although it performs this task admirably), it is an informed, balanced, thoughtful and surprisingly passionate meditation on the making and meaning of our society. Avoiding the extremes of celebration and condemnation that too often pass for analysis, Cross's searching book is imbued with a generous concern for the revival of an active, democratic and participatory public sphere.
According to Cross, a paradox lies at the heart of American consumer society: It has been both an ideological triumph and a triumph over politics. Although it may be "difficult for Americans to see consumerism as an ideology," this is, Cross argues, precisely how it functions. It is, in his words, the "ism that won," the quiet but decisive victor in a century of ideological warfare. Over the course of the twentieth century it became naturalized to such an extent that few citizens "consider any serious alternatives or modifications to it."
In describing this ideological victory, Cross eschews conspiratorial interpretations of advertising and business collusion and gives consumer society its due for concretely expressing "the cardinal political ideals of the century--liberty and democracy--and with relatively little self-destructive behavior or personal humiliation." It won, Cross believes, because in large measure it met people's basic needs, helped them to fit into a diverse society even as it enabled them to forge new understandings of personal freedom, and served to fulfill, rather than mock, people's desire for the pleasures of the material world.
In spite of its popularity and successes, Cross believes that the ascension of consumer society has come at great cost: the abrogation of public life in favor of private thrills. By valorizing the private over the public and the present over the past and future, consumer society has "allowed little space for social conscience" and truly democratic politics. Rather than shoring up civil society, consumerism has pretty much replaced it: "The very idea of the primacy of political life has receded" as individual acquisition and use of goods has become the predominant way that Americans--and, increasingly, the rest of the industrialized world--make meaning of their lives. The suggestion that there should be limits to commercialism--that there are sacred places where the market does not belong--is, according to Cross, no longer taken seriously in a society that equates commercialism with freedom. Moreover, by the end of the century, "there seemed to be no moral equivalent to the world of consumption." The politics of consumption, in Cross's view, makes alternative conceptions of the good life virtually unimaginable in large part because it encourages people to think about themselves in isolation from the rest of society and from their history. (Reading Cross's book, I was reminded of Edward Hopper's painting Nighthawks, in which a customer at an urban diner sits alone, utterly disconnected from the humanity that surrounds him.) If Cross ultimately loses sight of the paradoxical nature of American consumerism and concludes on this dark note, An All-Consuming Century nonetheless provides important resources for others to explore the democratic potential of consumer society.
The narrative unfolds both chronologically and analytically. Cross divides the development of modern consumer society into four periods: 1900-1930, 1930-1960, 1960-1980 and 1980 to the end of the century. In this breakdown, the first three decades of the century were a takeoff period, during which a number of crucial elements converged to make America a consumer society. Cross consistently overstates the degree to which nineteenth-century America was a "traditional" society, untainted by commercialism; many elements of consumer society were born in the market revolution of the early 1800s and the corporate revolution of the later nineteenth century. But he is right to single out important developments that transformed the country from what we might call a nineteenth-century society with consumerist features to a full-blown consumer society in the twentieth century. The keys were increases in leisure time and personal income on the demand side, along with new products and innovations in selling on the supply side.
New, nationally advertised, branded products became widely available and affordable after the turn of the century. These products alleviated material needs, but more than that, Cross astutely notes, they became markers of new feelings of "comfort and ease" and "new sensations of power and speed." Modern products like cigarettes, candy and soft drinks made the sensational available on a daily, indeed almost hourly, basis. Amusement parks like Coney Island and other "cheap amusements" also made the regular purchase of spectacular thrills affordable for working people. In the consumer society, the utilitarian was always mixed with the sensual. The embodiment of this mixture was, of course, the great symbol of early-twentieth-century consumer society, the automobile. Mass culture in this period was already characterized by an increasing number of what Cross calls "private pleasures," but, as he shows, it contributed to political and social changes as well: It blurred ethnic and class divisions and encouraged the children of immigrants to redefine themselves as members of a blended, multiethnic, if still racially segregated, youth culture.
The period 1930-1960 was one of consolidation in time of crisis. The constraints of the Great Depression and World War II led to a "frustrated consumerism more than a rejection of the capitalist system." Rather than blame the new consumerism, most policy-makers and indeed many ordinary Americans came to see "underconsumption" as the root cause of the slump. After the war, government policy encouraged the development of mass purchasing power rather than efforts to equalize the distribution of wealth. During the cold war, consumer society became "a positive answer to communism." In his 1959 "kitchen debate" with Nikita Khrushchev, Vice President Richard Nixon drove this point home by contrasting modern American appliances with outdated Soviet culinary technology. Despite the linkage in these years between consumption and freedom, Cross notes that the consumerism of the postwar years was not hedonistic but "domesticated," focused on the suburban home and the nuclear family. Signature developments of these years were Levittown, McDonald's and Holiday Inn, sites of responsible, respectable, family-oriented consumption.
From 1960 to 1980 consumer society faced a very different set of challenges but emerged stronger than ever. First, the counterculture challenged the very premises of consumerism, and in the 1970s, the specter of scarcity called into question the permanence of the cornucopia upon which consumer society depended. In spite of these challenges, "consumption became even more ubiquitous." Indeed, Cross suggests, the roots of the even more individualistic and socially fragmenting consumerism of the late twentieth century lay in part in the 1960s critique of consumerism: While countercultural figures critiqued conformity and idealized the "authentic self," many Americans sought to achieve this authenticity through consumption. Businesses began to modify the Fordist practice of mass production in favor of flexible production and segmented, demographically distinct markets. Drawing on the work of cultural critic Thomas Frank (rendered throughout the book as "Frank Thomas"), Cross writes that consumerism became "adaptable to the green and the hip." Similarly, during the energy crisis of the 1970s those politicians who took the shortage to be the result of overconsumption were rebuked as naysayers. With great political success, Ronald Reagan attacked President Jimmy Carter for a speech in which Carter had the temerity to suggest that "owning things and consuming things does not satisfy our longing for meaning." Reagan called that 1979 "malaise" address un-American in its pessimism and its call for restraint.
The trend toward fragmented, individualistic consumption accelerated during the last two decades of the century, an era that Cross labels "markets triumphant." Radical faith in the virtues of the market led politicians like Reagan to put a moral gloss on the "unfettered growth of market culture in the 1980s." Government constraints of an earlier era, in the form of environmental and advertising regulation, weakened, and commerce entered unfettered into areas where it had previously been kept at arm's length: children's homes and classrooms. By century's end the "Victorian notion that some time and place should be free from commerce" seemed as quaint as a Currier and Ives lithograph. Cross, who has a knack for unearthing telling statistics, notes that "supermarkets carried about 30,000 different products in 1996, up from 17,500 in 1986 and about 9,000 in the mid-1970s." Even the all-time-high consumer debt--$1.25 trillion by 1997--did nothing to stop the belief that the future of American prosperity and freedom depended upon the continuing expansion of the realm of consumption. Indeed, shopping had become the nation's primary form of entertainment, and monuments to consumption like the gargantuan 4.2-million-square-foot Mall of America became a haven for tourists from around the world.
In Cross's telling, the attractions and problems of consumer society are in effect one and the same: the cult of the new, immediate gratification and the valorization of "private pleasures." Consumerism is the "ism that won," owing to its ability not only to withstand challenges but, through a magical jujitsu, to co-opt them. Although initially formulated in terms neither celebratory nor condemnatory, Cross's story is ultimately one of declension. While he avoids the nostalgia of many commentators, there is little doubt that Cross finds contemporary consumer society to be a negative force: asocial, apolitical, amoral and environmentally dangerous. Whereas consumerism once helped integrate the diverse inhabitants of an immigrant nation in a youthful mass culture, by century's close, cynical marketers were happy to divide an equally multicultural nation into segmented demographic units based on "multiple and changing lifestyles." Thus the shift from an integrative, public-spirited popular culture in the early twentieth century to an increasingly privatized, solipsistic commercial culture of the late twentieth century. What was seductive in 1900--cornucopia and pleasure for the masses--became obscene by 2000, as a cultural stimulant turned into a dangerous narcotic.
An All-Consuming Century is one of the few indispensable works in the ever-expanding library of books on American consumer society. But in an otherwise rich overview the author has surprisingly little to say about the role of women, African-Americans and ethnic minorities (and nothing about regional variations) in the construction of consumer society. These are serious omissions. As admen and women's organizations recognized early on, women have performed the vast majority of the unpaid labor of consumer society: the shopping, budgeting and refashioning of older items. Cross notes that African-Americans were excluded from many of the benefits of the emerging mass culture, but he does not address the ways popular culture served to reinforce both the whiteness of the "new immigrants" from Eastern and Southern Europe--a skin privilege that was not yet fully acknowledged by the majority culture--and the otherness of Asian and Latino immigrants.
Nor does Cross discuss the attractions of nationwide retailers and national brands for African-Americans, who often took advantage of what the historian Edward Ayers has called the "anonymity and autonomy" made possible by the advent of the Sears catalogue (and chain stores in the nonsegregated North), whose mass customer base and "one price" system reduced the possibilities for racial discrimination that frequently accompanied visits to the corner store. For this group, the private pleasures occasionally afforded by the advent of national markets offered advantages over the public humiliations that so often accompanied local commerce.
Cross's relative neglect of women and minorities leads him to underestimate the importance of grassroots consumer activism as well, which has often been led by members of these groups. Meat boycotts, cost-of-living protests, "don't buy where you can't work" campaigns and sit-ins were integral to the development of American consumer society because they represented demands to expand the benefits of consumerism beyond a middle-class elite. One of the most important women's political organizations of the first half of the century, the National Consumers League, which pioneered the crusade for "ethical consumption" and labor rights, goes unmentioned. Cross stresses the ways marketers attempted to co-opt the civil rights movement, but he does not address the degree to which the demand for full participation in consumer society was a key ingredient in that crusade for social justice. By virtually ignoring these movements, Cross leaves out an important part of the story of consumer society--efforts to unite citizenship with consumption.
The critics of consumer society whom Cross discusses most often are proponents of what he calls the "jeremiad," the high-culture dismissal of mass culture as vulgar. He condemns the elitism and arrogance of such thinkers and is surely correct to note that their criticism had little impact on ordinary shoppers. Cross is less critical of the "simple living" tradition and calls the self-provisioning movement of the 1960s "the most positive aspect" of the counterculture. He argues that "the idea of the 'simple life,' perhaps never more than a daydream, had almost ceased being even a prick to the conscience," but he only briefly mentions the growing popularity of the "voluntary simplicity" movement, a topic addressed in more detail in Juliet Schor's The Overspent American (1998).
Cross also develops a persuasive critique of the consumer rights movement. While the Depression era saw the rise of groups like Consumers Union, which sought to make consumers a greater force against the power of business and advertisers, he notes that by focusing primarily on product quality and prices, many consumer rights groups have served only to reinforce "the individualism and the materialism of American consumption." This tradition of angry but apolitical individualism can still be found at innumerable websites, like starbucked.com, that highlight at great length the indignation of formerly loyal customers: "The sales clerk who sold me the machine was rude, then decidedly refused to hand over the free half pound of coffee given with every purchase of a Starbucks espresso machine...." The democratizing power of consumer demands for corporate responsibility is too often dissipated by such narrowly cast diatribes.
In spite of the failure of the jeremiad, the seeming irrelevance of simplicity and the individualization of the concept of consumer rights, Cross is too definitive about the nature of the "victory" of consumer society. Many Americans still recognize that however much advertisers and marketers attempt to cover it up, consumption is fundamentally a social and political act. So although it is true that "late twentieth century consumerism turned social problems into individual purchasing decisions," it is also the case that individual shopping decisions have frequently been viewed in the context of social problems. As consumer activists from the League of Women Shoppers in the 1930s through environmentalists today have pointed out, the goods that we buy leave ecological, labor and government "footprints." In spite of corporate attempts to fetishize goods, diligent activists like John C. Ryan and Alan Thein Durning of Northwest Environment Watch have described--and tried to estimate--the hidden social costs incurred by the purchase of quotidian products, including coffee and newspapers. The actions of students in the antisweatshop campaigns of recent years indicate that a growing number of consumers are looking behind the logo to determine the conditions under which the clothing they buy is made. As Naomi Klein has recently argued in No Logo: Taking Aim at the Brand Bullies, the ubiquity and importance of brands provides an opening for protesters who can threaten, through consumer boycotts and other actions, to sully corporate America's most valuable asset, the brand name. One teen in Klein's book puts it this way: "Nike, we made you. We can break you." Cross may decry the "inwardness of the personal computer," but the protests at the Seattle World Trade Organization and Washington International Monetary Fund meetings reveal that the Web creates alliances and expands social bonds.
The history of consumer activism--and its recent incarnations--shows that consumerism does not necessarily lead to an antipolitics of radical individualism.
Cross does put forth important arguments about the "excesses of consumer culture": the environmental degradation, the waste, the lack of free time and the sheer mind-numbing meaninglessness that accompany modern consumerism. But these must be balanced with the recognition that most Americans, especially those in the working class, have viewed the enjoyment of the fruits of consumer society as an entitlement, not a defeat. This should not be dismissed as false consciousness or "embourgeoisement." Far from allowing consumerist demands to erode political impulses, working people--through living-wage, union-label and shorter-hour campaigns--have consistently politicized consumption. Rather than pitting the culture of consumption against democracy, it will be important to continue this tradition of democratizing, rather than demonizing, the culture of consumption. In his assessment of the twentieth century's most influential "ism," Cross provides important warnings about the difficulties of such an effort. But in its stress on the paradoxes of consumer society--an emphasis that then too rapidly gives way to condemnation--An All-Consuming Century also provides lessons from history about the necessity of the undertaking.
The quiet grace of Ring Lardner Jr., who died the other week at 85, seemed at odds with these noisy, thumping times. I cannot imagine Ring playing Oprah or composing one of those terribly earnest essays, "writers on writing," that keep bubbling to the surface of the New York Times. He was rightly celebrated
for personal and political courage but underestimated, it seems to me, as a protean writer who was incapable of composing an awkward sentence. It ran against Ring's nature to raise his voice. Lesser writers, who shouted, drew more acclaim, or anyway more attention.
The obituaries celebrated his two Academy Awards but made less of other achievements. Ring's novel, The Ecstasy of Owen Muir, begun in 1950 while he was serving his now-famous prison sentence for contempt of Congress, drew a transatlantic fan letter from Sean O'Casey. Ring felt sufficiently pleased to have the longhand note framed under glass, which he then slipped into a shirt drawer. He was not about advertisements for himself. In 1976 he published The Lardners: My Family Remembered. Garson Kanin commented, "In the American aristocracy of achievement, the Lardners are among the bluest of blue bloods. In Ring Lardner, Jr. they have found a chronicler worthy of his subject. The Lardners is a moving, comical, patriotic book."
The progenitor was, of course, Ring Lardner Sr., the great short-story writer, who sired four sons, each of whom wrote exceedingly well. James Lardner was killed during the Spanish Civil War; David died covering the siege of Aachen during World War II; a heart attack killed John in 1960, when he was 47. Add Ring's prison term to the necrology and you would not have what immediately looks to be the makings of a "moving, comical" book. But The Lardners was that and more because of Ring Jr.'s touch and slant and his overview of what E.E. Cummings called "this busy monster, manunkind."
From time to time, Ring published splendid essays. The one form he avoided was the short story. He wrote, "I did not want to undertake any enterprise that bore the risk of inviting comparison with my father or the appearance of trading on his reputation."
We became close in the days following the death of John Lardner, who was, quite simply, the best sports columnist I have read. I set about preparing a collection, The World of John Lardner, and Ring, my volunteer collaborator, found an unfinished serio-humorous "History of Drinking in America." He organized random pages with great skill. Reading them I learned that the favorite drink of the Continentals, shivering at Valley Forge, was a Pennsylvania rye called Old Monongahela. George Washington called it "stinking stuff." At headquarters the general sipped Madeira wine.
A year or so later, with the blacklist still raging, I picked up Ring for lunch at the Chateau Marmont, an unusual apartment hotel on Sunset Boulevard near Hollywood. Outside the building, a fifty-foot statue of a cowgirl, clad in boots and a bikini, rotated on the ball of one foot, advertising a Las Vegas hotel. I asked the room clerk for Mr. Robert Leonard. Ring was writing some forgotten movie, but could not then work under his own name. "Robert Leonard" matched the initials on his briefcase.
This was a pleasant November day, but the blinds above Ring's portable typewriter were drawn. When I asked why, he opened them. His desk sat facing the bikinied cowgirl, bust-high. Every eighteen seconds those giant breasts came spinning round. "Makes it hard to work," Ring said and closed the blinds.
The Saturday Evening Post was reinventing itself during the 1960s, on the way to dying quite a glorious death, and with my weighty title there, editor at large, I urged Clay Blair, who ran things, to solicit a piece from Ring about the blacklist. Ring responded with a touching, sometimes very funny story that he called "The Great American Brain Robbery." He explained, "With all these pseudonyms, I work as much as ever. But the producers now pay me about a tenth of what they did when I was allowed to write under my own name."
Clay Blair lived far right of center, but Ring's story conquered him, and he said, "Marvelous. Just one thing. He doesn't say whether he was a member of the Communist Party. Ask him to put that in the story."
"I won't do that, Clay."
"He chose jail, rather than answer that question."
"Then, if he still won't, will he tell us why he won't?"
Ring composed a powerful passage.
The impulse to resist assaults on freedom of thought has motivated witnesses who could have answered no to the Communist question as well as many, like myself, whose factual response would have been yes. I was at that time a member of the Communist party, in whose ranks I found some of the most thoughtful, witty and generally stimulating men and women in Hollywood. I also encountered a number of bores and unstable characters.... My political activity had already begun to dwindle at the time [Congressman J. Parnell] Thomas popped the question, and his only effect on my affiliation was to prolong it until the case was finally lost. At that point I could and did terminate my membership without confusing the act, in my own or anyone else's head, with the quite distinct struggle for the right to embrace any belief or set of beliefs to which my mind and conscience directed me.
These words drove a silver stake into the black heart of the blacklist.
Ring won his first Oscar for Woman of the Year in 1942, and when he won his second, for M*A*S*H in 1970, numbers of his friends responded with cheering and tears of joy. The ceremony took place early in 1971, and Ring accepted the statuette with a brief speech. "At long last a pattern has been established in my life. At the end of every twenty-eight years I get one of these. So I will see you all again in 1999."
Indeed. Early in the 1990s I lobbied a producer who had bought film rights to my book The Boys of Summer to engage Ring for the screenplay. Ring, close to 80, worked tirelessly. A screenplay is a fictive work, and Ring moved a few days and episodes about for dramatic purposes. His scenario ended with the Brooklyn Dodgers winning the 1955 World Series from the Yankees and my account of that ballgame landing my byline on the front page of the New York Herald Tribune. The sports editor is congratulating me on a coherent piece when the telephone rings: My father has fallen dead on a street in Brooklyn; I am to proceed to Kings County Hospital and identify his body.
As I, or the character bearing my name, move toward the morgue, I bump into two beer-drunk Dodgers fans. One says, "What's the matter with him?" The other says, "He's sober. That's the matter with him." The body is there. It is my father's body. Beer drunks behind us, my mother and I embrace. Fin.
I can only begin to suggest all that Ring's scene implies. I would start with the point that winning the World Series is not the most important thing on earth, or even in Brooklyn. I was always careful not to embarrass Ring with praise, but here I blurted out, "This is the best bleeping screenplay I've ever read, Ringgold. Oscar III may come true in '99."
"Curious," Ring said. "I seem to have had the same thought myself."
The blacklisting bounders were now dead, but a new generation of Hollywood hounds refused to shoot Ring Lardner's scenario. The grounds: "a father-son angle" was not commercial. "It worked in Hamlet," Ring said, but to unhearing ears. And then we were talking about Ring writing a screenplay for a book I published in 1999 about Jack Dempsey and the Roaring Twenties. "Have to cut it back a bit," Ring said. "Following your text would give us the first billion-dollar picture."
Years ago, the critic Clifton Fadiman wrote that Ring Lardner Sr. was an unconscious artist and that his power proceeded from his hatred of the characters he created. Ring told me: "If my father hated anyone or anything, it was a critic like Fadiman. Unconscious artist? My father knew perfectly well how good he was and--better than anyone else--how hard it was to be that good."
Ring Jr. knew the very same thing about himself. Or so I believe. Yeats writes, "The intellect of man is forced to choose/perfection of the life, or of the work." As well as anyone in our time, my suddenly late friend Ring Lardner came pretty damn close to achieving perfection in both.
Long before Carrie-Anne Moss rips open Val Kilmer's shirt and begins pounding his chest, providing him with a version of CPR that she must have learned from a Japanese drum troupe, the makers of Red Planet have resorted to their own thumpings and flailings, as if to resuscitate a film that's gone limp. It's a panic response, coming from people who have realized too late that the hookup of a radio would be a high point of their picture.
Their script has stuck Moss in a stricken spaceship that's orbiting Mars; by this point, her comrades Kilmer and Tom Sizemore have been marooned, incommunicado, on the planet's surface. So when the boys stumble upon an old circuit board in the dust, it's time for high-energy drama. "Let's do it!" shrieks Sizemore, as if he were starting the Indy 500. With a roar, guitars and drums begin pounding away on the soundtrack. Kilmer, in closeup, damn well solders a wire, sending a meteor shower's worth of sparks across the screen--at which point, back on the spaceship, Moss decides to strip down to a sleeveless T-shirt, giving us a much better view of her breasts.
I'm really grateful for the breasts. If not for them, I might have fallen asleep and missed the climactic scene, in which Kilmer performs a diagnostic check on a computer.
If only the makers of Red Planet had trusted in their story's essential schleppiness! Then, instead of giving us this lumbering, expensive beast, they might have realized the small but halfway-clever idea that's still dimly visible within: a story about the heroism-by-default of a spaceship janitor.
The character in question, a fellow named Gallagher, holds the job title of mechanical systems engineer; but to the rest of the personnel on this flight to Mars, that's like saying he's the guy who fixes the toilets. "It's high school," he remarks to a fellow civilian in the crew, after being brushed back by a swaggering NASA pilot. "They're the jocks, and we're the nerds." Just so. When he bumps into Moss--the ship's commander--on her way out of the unisex shower, Gallagher can think of nothing better to do than fumble with his fingers and blush. Later, when the outcome of the mission comes to rest on him, Moss has to give him a pep talk before he'll even get to his feet. Yet he's the guy who must save Earth from destruction and consummate a rendezvous with those breasts. What a role for Steve Buscemi! How the hell did it go to Val Kilmer?
He's good, of course. Kilmer is always good--but he's a guy who previously played Jim Morrison, Elvis and Batman. The only thing that's nerdlike about him is the hairdo he's been given for this picture, which is brushy and yellow and makes him look as if he's in crying need of a conditioner. Mind you, the premise of Red Planet is that all of Earth needs a conditioner. After these many years of environmental degradation, we've burned out our world and must colonize someplace else. Hence the desperate and very expensive project, in the year 2057, of sending Moss and her crew to Mars. Wouldn't it have been cheaper, as well as more practical, to institute a few conservation measures instead? No doubt. But humans, according to this movie, lack much capacity for self-discipline and forethought, and so must splurge on stupid but spectacular stunts. As if to prove this point, the producers have done their own splurging and hired Kilmer--the actorly equivalent of a rocket to Mars, compared with Buscemi's compost heap.
As they cast the lead, so too did they decide to ladle on the excitement: pounding guitars, sleeveless T-shirts, unmotivated shrieks. How were these choices made? I can venture a guess. The credits for Red Planet list three producers and two executive producers. This is a fairly standard aggregation in today's movie business; and with so many big shots keeping themselves busy on the picture, how could a mere idea survive? The story, written by a lone guy named Chuck Pfarrer, was almost sure to be buried alive; and into the dirt with it went a few other notions.
One of them might have involved some sexual role-play, based on the fact that the only females in the story are Moss, the shipboard computer (named Lucille) and a navigation robot called Amee. "She's my kind of girl," Gallagher says of the robot, just before it goes into killer mode. (It was designed for the Marines.) Somebody, maybe Pfarrer, seems to have wanted the nerdy Gallagher to feel ambivalent toward strong women: attracted to them when they shower, threatened by them when they turn into whirring kung-fu machines.
But since the production is at war with its own screenplay--have I mentioned that Red Planet is directed, more or less, by Antony Hoffman?--this kinky little idea is no better realized than the movie's religiosity. As far as I'm concerned, it's just as well that this latter theme gets only lip service. Ever since 2001: A Space Odyssey, Earthlings in Outer Space have sought God, and found light shows. At least Red Planet spares us that final cliché--though it still makes us listen to a lot of spiritual blather.
Those Deep Thoughts are provided by Terence Stamp, who manages to be the crew's world-famous scientist despite having abandoned rationalism. Science cannot provide the answers he craves, Stamp explains to a sweetly patient Kilmer, and so he has turned to religion. Kilmer obligingly spends the rest of the picture looking for a divine purpose--which doesn't seem so misguided, considering the level of scientific expertise around him. When the crew's biologist (Sizemore) discovers a life form on Mars, he cries out, "Nematodes!" Either he's forgotten his Linnaeus--nematodes are worms--or else the solution to God's mysteries is to be found not in Outer Space but in the pages of old sci-fi magazines. These creatures are clearly arthropods: the genre's usual bugs.
Fans of the platoon-in-space movie will want to know that the Mars scenery is furnished with the necessary rocks, peaks and ravines. Fans of Carrie-Anne Moss--meaning the adolescent boys, of whatever age, who admired The Matrix--will want to know that here, too, she gets to fly around. Not every actress is suited to antigravity; and so, until such time as Moss gets the chance to deliver a performance, I will congratulate her on giving good float.
As a memorial tribute to Vincent Canby, the "Arts & Leisure" section of the New York Times recently published half a page of excerpts of his prose, as selected by The Editors. Implacable beings of ominous name! With grim rectitude, they shaped a Canby in their image, favoring passages where he had laid down principles of the sort that should be cited only under capitalization. These were Sound Judgments.
For those of us who admired Mr. Canby (as the Times would have called him while he was alive, and as I will continue to call him, knowing how the style fit the man), soundness of judgment was in truth a part of his merit. A hard man to fool, he could distinguish mere eccentricity from the throes of imaginative compulsion, the pleasures of pop moviemaking from the achievements of film art; and when he was offered sentimentality in place of feeling, his heart didn't warm, it burned. These powers of discernment allowed him to bear with extraordinary grace the responsibility of being the Times critic. They also contributed a lot to his need for responsibility, since it was his sureness, as much as the institutional weight of the Times, that made Vincent Canby so influential.
That said, I confess I read him to laugh. At present, I can give only tin-eared approximations of his wisecracks--correct and ample quotation will become possible when someone smart decides to publish a Vincent Canby anthology--but I can hardly forget his review of Salome's Last Dance. This picture was the latest chapter in Ken Russell's phantasmagorical history of sex in the arts, or the arts in sex. Mr. Canby's lead (more or less): "As the bee is drawn to the flower, as the hammer to the nail, so Ken Russell was bound to get to Oscar Wilde."
I also recall Mr. Canby's description of the used car that Jim Jarmusch peddled to the title characters in Leningrad Cowboys Go America. It looked, he said, as if it had been dropped from a great height. Writing about I've Heard the Mermaids Singing, a film of relentlessly life-affirming whimsy, he claimed he'd been cornered by a three-hundred-pound elf. A typically self-regarding, show-offy performance by Nicolas Cage (was it in Vampire's Kiss?) inspired him to write that other actors must enjoy working with this man about as much as they'd welcome being shut up with a jaguar. And once, when forced to think up copy about his umpteen-thousandth formula movie, he proposed that the only way to derive pleasure from such a picture would be to play a game with yourself, betting on whether you could guess what would happen next. "As you win," he wrote, "you lose."
From these few and random examples, you may conclude that Mr. Canby's principles often emerged with a deep-voiced chuckle, and that they involved matters that went far beyond the movies. Some of these concerns were political in the specific sense, as when he gave a favorable review to Alex Cox's Walker: a film that offered a burlesque insult to US supporters of the Nicaraguan contras, in government and at the Times. His concerns were also political in a broader sense. Witness the 200 words he devoted to a little African-American picture titled Love Your Mama: a heartfelt, thoroughly amateurish movie produced in Chicago by some people who had hired an industrial filmmaker to direct their script. While quietly letting his readers know that they probably would not want to watch this film, Mr. Canby conveyed a sense that real human beings, deserving of respect, had poured themselves into the project.
Of course, the best places in which to seek Mr. Canby's principles were within the films he championed. He would have earned his place in cinema history (as distinct from the annals of journalism) had he done nothing more than support Fassbinder's work. And yet I'm not surprised that The Editors found no space to reprint Mr. Canby's writings on this crucial enthusiasm. Fassbinder, like his critic, was preternaturally alert to political and social imposture, to the bitter and absurd comedy of human relationships, and also (for all his laughter) to the pain and dignity of those who go through life being pissed on. Mr. Canby recognized in Fassbinder's work all these qualities and more (such as the presence, in the person of Hanna Schygulla, of one of cinema's great fantasy objects); but these matters seem to have been judged too unruly for an "Arts & Leisure" tribute.
Now, I've been allowed to do some work for "Arts & Leisure" and have received from my editors nothing but aid and kindness. Surely the people I've dealt with at the Times would have chosen excerpts from Mr. Canby that were funnier, sharper, more challenging. So maybe, when the Times moves to memorialize somebody as one of its own, a higher level of control takes over. It's as if the paper means to show its own best face--or rather the image it wants to see in the mirror, urbane and solid--and never mind that man in the old tweed jacket.
This tendency of the institution to eclipse the individual figures prominently in a new book by another major film critic, Jonathan Rosenbaum. By "major," I mean that Rosenbaum is highly regarded by other reviewers and film academics, and that he's gained a certain public following (concentrated in Chicago, where he serves as critic for the Reader). But if you were to ask him how he fits into American film culture in particular and US society in general, he would locate himself, quite accurately, on the margins. As his friends will tell you (I hope I may count myself among them), Rosenbaum is one of the angel-headed hipsters: a sweet-natured, guileless man, wholly in love with art and wholly longing for social justice. And for these very reasons, he has become the angry man of American film criticism, as you might gather from the title of his new work, Movie Wars: How Hollywood and the Media Conspire to Limit What Films We Can See (A Cappella, $24).
Rosenbaum argues--"argue," by the way, is one of his favorite words--that those American writers, editors and TV producers who pretend to cover film are for the most part hopelessly self-blinkered. It's in their interest to look at only those movies that the big American companies want to promote (including the so-called independent films that have been ratified by Sundance and Miramax). So journalism collaborates with commerce, instead of acting as a check on it; informed, wide-ranging criticism gets shoved to the side; films that might have seemed like news flashes from the outside world fail to penetrate our borders; and everyone excuses this situation by claiming that "the people" are getting the dumb stuff they want. Rosenbaum is enraged that moviegoers should be viewed with such contempt; he's infuriated that well-placed journalists should justify their snobbism (and laziness) by dismissing whatever films and filmmakers they don't already know about; and he's mad enough to name names.
In Movie Wars, Rosenbaum advances his arguments by means of a crabwise motion, scuttling back and forth between general observations (which are newly composed) and case studies (many of them published before, in the Reader and elsewhere). This means that some stretches of ground are covered two or three times. I don't much mind the repetition--even when the material shows up in a second new book by Rosenbaum, his excellent, unabashedly partisan monograph on Jarmusch's Dead Man (BFI Modern Classics, $12.95). I do worry that indignation, however righteous, has begun to coarsen Rosenbaum's tone and push him into overstatement.
When Rosenbaum is at his best, his extraordinary wealth of knowledge about cinema informs an equally extraordinary power of insight into individual pictures; and both these aspects of his thinking open into frequently astute observations of the world at large. You can get Rosenbaum at his best in his Dead Man monograph and in three previously published collections: Moving Places, Placing Movies and Movies as Politics (California). By contrast, Movie Wars is a sustained polemic, with all the crabbiness that implies.
It's a welcome polemic, in many ways. Most rants against the infotainment industry are on the level of Michael Medved's godawful Hollywood vs. America; they complain, in effect, that the movies tell us too much about the world. Rosenbaum recognizes the real problem, which is that our world (filmed and otherwise) has been made to seem small. I agree with much of what he says. But when, in his wrath, he digresses to settle scores or rampages past obvious counterarguments, I begin to wish that he, too, would sometimes pretend to be urbane and solid.
"There's a hefty price tag for whatever prestige and power comes with writing for The New York Times and The New Yorker," Rosenbaum says, "and I consider myself fortunate that I don't have to worry about paying it. Film critics for those publications--including Vincent Canby and Pauline Kael...--ultimately wind up less powerful than the institutions they write for, and insofar as they're empowered by those institutions, they're disempowered as independent voices."
To which I say, yes and no. As bad as the situation is--and believe me, it's woeful--I've noticed that news of the world does sometimes break through. David Denby, in The New Yorker, may contribute to American ignorance by being obtuse about Kiarostami (as Rosenbaum notes with disdain); but then, as Rosenbaum fails to note, Stephen Holden and A.O. Scott in the Times delivered raves to Taste of Cherry and The Wind Will Carry Us. Individuals in even the most monolithic publications still make themselves heard; and the exceptional writer can manage (at least in life) to upstage an entire institution.
Rosenbaum himself has pulled off that trick at the Reader; and Vincent Canby did it at the Times. To the living critic, and all those who share his expansive view of the world, I say, "We've lost a champion. Better stop grousing and pick up the slack." And to those who mourn Mr. Canby, I say, "You can still hear his laughter. Just don't let The Editors get in the way."
The last chapter in Ring Lardner Jr.'s new memoir, I'd Hate Myself in the Morning (Nation Books), is called "Sole Survivor." When Lardner, who died October 31, wrote it, he was indeed (a) the last of a family of four boys with a famous father, the humorist and sportswriter Ring Lardner; and (b) the last surviving member of the Hollywood Ten, who gained renown in 1947 when they refused to answer the House Committee on Un-American Activities' question, "Are you now or have you ever been a member of the Communist Party?" They were indicted, prosecuted and convicted of contempt of Congress and sent to prison--in Ring's case for a year.
Among the first victims of the great Red purge to come, The Ten, also known as the Unfriendly Ten, are historically important because they were willing to risk prison to help prevent it, putting First Amendment principle ahead of personal convenience.
At the time, Billy Wilder, the witty director, cruelly and unjustly said, "Of the Unfriendly Ten, only two had any talent; the other eight were just unfriendly." Ring, who had already won his first Academy Award for Woman of the Year, starring Katharine Hepburn, was one of the two. The other was his buddy Dalton Trumbo, the highest-paid writer in Hollywood, who went on to win an Oscar for The Brave One, a movie he wrote under the pseudonym Robert Rich.
At the time, the tabloid press and newsreels did their best to portray the Ten as obstreperous, dogmatic followers of the party line. Each of the Ten was, in fact, following his conscience, though they arrived at their decision on how to confront HUAC after collective deliberation with counsel, some of whom were party lawyers, others not.
Lardner's famously elegant response to the committee was a clue to how wrong that image was. "I could answer your question," he said, but "I would hate myself in the morning"--hence his memoir's title.
Even during the blacklist years, when he made his primary living writing under various pseudonyms, he never gave up on his social commitment. Thus in 1955, when Hannah Weinstein set up a production company in London and chose for its maiden effort in the new medium of television The Adventures of Robin Hood, Lardner, along with fellow blacklistees like Abe Polonsky and Walter Bernstein, leapt at the opportunity for, as he put it, commentary-by-metaphor "on the issues and institutions of Eisenhower-era America."
After he was finally graduated from the blacklist--it took twelve years--and able to write under his own name, he gave us M*A*S*H, the black comedy that was, on the surface, about life in a medical unit during the Korean War; but beneath the surface, like Joseph Heller's Catch-22, it was about the absurdities and contradictions of war itself.
Although his public positions were militant, privately he was a gentle soul. His main target was often himself. He would delight in telling how he recommended to David O. Selznick that he not acquire Gone With the Wind, the highest-grossing picture of its time, "because I objected on political grounds to the glorification of slave owners and the Ku Klux Klan." When progressives praised him for his principled stand against HUAC he would observe that the Ten did the only thing they could do under the circumstances "short of behaving like complete shits."
The loss of Lardner is a loss for both The Nation and the nation. One part Marxist democrat and two parts humanist-rationalist, he stayed true to his vision to the end. A few years ago he listed in The Nation "some of the strange things Americans believe 200 years after Thomas Paine published The Age of Reason." (Typical entries: "Eating fish is good for the brain"; "There never was a Holocaust.") He felt no comment was called for. But when a reader wrote to complain that "Reason is a wonderful tool, but it is a tiny flashlight shining here and there..." Lardner responded, "What he sees as a tiny flashlight, I call, in the words of Cicero, 'the light and lamp of life.'"
In an introduction to his memoir, I call Lardner "recrimination-challenged." In fact he seemed incapable of bitterness, although he did once say of Martin Berkeley, a screenwriter who named a record 161 names before HUAC and specialized in writing animal pictures, "I always maintained that was because he couldn't write human dialogue."
LOUIS ARMSTRONG AT 100
In 1927 a young cornetist led his band into a meticulously hilarious version of a ragtime chestnut, Euday Bowman's "Twelfth Street Rag." The recorded track sounds like the opening shot of a revolution--except that the revolution had already been in full swing in Louis Armstrong's head and hands for years.
Unlike most revolutions, though, from the first this one displayed an ingratiating, inviting sense of humor and charm. "Dippermouth," as his early New Orleans pals dubbed him, used the rag as a flight vehicle: As his horn fractures the tune's familiar refrains, the precise, cakewalking rhythmic values of ragtime suddenly coil and loop and stutter and dive, the aural equivalent of a bravura World War I flying ace in dogfighting form. Every time Armstrong comes precariously near a tailspin, he pulls back the control stick and confidently, jauntily, heads off toward the horizon, if not straight into another virtuosic loop-de-loop. The cut is from an astonishing series of recordings Armstrong made in 1925-28 that amount to the jazz-creating legacy of his Hot Fives and Hot Sevens, a succession of studio groups that virtually never performed live. And now, in time for his centennial--he claimed he was born in 1900 but wasn't--it's all been reissued.
The relentless joy brimming in the sound of young Satchelmouth's horn, the glorious deep-blue and fiery-red Whitmanesque yawp of it, has an undeniably self-conscious edge to it. Ralph Ellison and Albert Murray first pointed out a half-century ago that it is also the sound of self-assertion, a musical realization of the double consciousness W.E.B. Du Bois posited for African-Americans. Within this compound of power and pain, a racial revisitation of the master-slave encounter in Hegel's Phenomenology of Spirit, Du Bois explained that African-Americans were inevitably alienated, standing both inside and outside mainstream American culture and its norms, prescriptions, hopes, dreams. Such alienation, Du Bois pointed out, could cripple black Americans by forcing them to internalize mainstream cultural values that held them to be less than human, but it could also liberate the brightest of them. The "Talented Tenth," as he called this group, could act on their perceptions of the contradictions between the high ideals grounding basic American cultural myths (for example, that society believed "all men are created equal," as the Declaration of Independence puts it) and gritty daily reality, where blacks were not exactly welcomed into concert halls, schools, restaurants or the front of buses.
In the bell of Armstrong's barbaric (which means, in the sense Whitman inherited from Emerson, non-European) horn is the sound of a new, all-American culture being forged from the stuff of the social sidelines. In 1957 Ellison wrote to Murray,
I've discovered Louis singing "Mack The Knife." Shakespeare invented Caliban or changed himself into him. Who the hell dreamed up Louis? Some of the bop boys consider him Caliban, but if he is, he is a mask for a lyric poet who is much greater than most now writing. Man and mask, sophistication and taste hiding behind clowning and crude manners--the American joke, man.
Armstrong himself was no naïve artist; he certainly wasn't a fool. From his earliest days he saw race as a key issue in his life, his art and his country, with a wit and understanding evident in his music. As he wrote of jazz's self-proclaimed inventor, "Jelly Roll [Morton] with lighter skin than the average piano players, got the job [at Lulu White's, New Orleans's leading whorehouse] because they did not want a Black piano player for the job. He claimed he was from an Indian or Spanish race. No Cullud at all. They had lots of players in the District that could play lots better than Jelly, but their dark Color kept them from getting the job. Jelly Roll made so much money in tips that he had a diamond inserted in one of his teeth. No matter how much his Diamond Sparkled he still had to eat in the Kitchen, the same as we Blacks."
In The Omni-Americans, Murray explains how Armstrong's music limned human talents needed in the frenetic, fast-changing twentieth century. Drawn from the pioneer, Indian and slave, the key American survival skill was improvisation, the soloist's ability to mesh with his surroundings. Ellison's Invisible Man uses Armstrong's version of "Black and Blue," a tune from the 1929 Broadway revue Hot Chocolates, to demonstrate the Du Boisian nature of improvising as epistemological tool.
This was the lesson Armstrong started teaching in the Jazz Age, when flappers reigned and sexual emancipation knocked at the door of mainstream culture, when the Harlem Renaissance redefined African-Americans, when Prohibition created a nation of outlaws who, thanks to associating with booze and gangsters and the demimonde's jazz soundtrack, saw that Negroes, as they were called, were subject to legal and extralegal restrictions and prejudices more arbitrary and inane than the constitutional amendment forbidding booze.
The elastic rhythms and fiery solos on the sides by the Hot Fives and Hot Sevens spoke to these people. On tune after tune, Armstrong cavorts and leaps and capers over and around his musical cohorts with the playful self-possession of a young and cocky top cat. Nothing can hold him down. He traverses keys and bar lines and rhythms with impunity, remolding them without missing a step.
"Black and Blue"--originally written as a lament by a dark-skinned gal for her man, who's attracted to high-yellow types--made him a star. Armstrong's brilliant, forceful reading renders it as mini-tragedy, the musical equivalent of Shylock's speech in The Merchant of Venice. "My only sin," he sings in that growl that compounds the earthy humanity of the blues with an unflinching dignity (this is no grovel), "is in my skin/What did I do to be so black and blue?" The short answer: in America, nothing. The color line did it all.
Subversive and powerful, Armstrong's music was the fountainhead of the Jazz Age and the Swing Era, when jazz was America's popular music and the sounds of syncopated surprise filled the nation's dance halls while young folks skittered and twirled and flounced and leaped and broke out of lingering Victorian constraints to loose-limbed beats and blaring horns that emerged from America's Darktowns in New Orleans, New York and Chicago.
One of Armstrong's 1936 recordings is "Rhythm Saved the World." Like many, this banal tune is transformed by his syncopating personality. Its idea still echoes across America's teeming subcultures: Decades later, Parliament Funkadelic sang, "Here's my chance to dance my way out of my constrictions."
If Armstrong claimed he was born on July 4, 1900, who could blame him? As one of America's primary declarers of cultural independence--and interdependence--he should have been. But in his rich biography Satchmo, Gary Giddins (who insists that all American music emanates from Armstrong) proves that Louis's birth date was August 4, 1901. Armstrong and his sister were born in a hard district of New Orleans; their father left before either could remember him. In his early years Armstrong was raised by his grandmother, whom he credited with the Emersonian values--hard work, self-reliance, artistic daring coupled with personal amiability--that guided him. His mother may or may not have been a prostitute for a while; Louis returned to live with her when he was 5.
At 7 he quit school and went to work for a Jewish family, the Karnofskys, and picked up his first instrument--a tin horn. He'd been dancing and singing on the street for pennies with other kids, but working coal wagons with the Karnofsky sons, he learned to blow the cheap horn by putting his fingers together in front of the tube (he'd pulled off the mouthpiece). The boys encouraged him, their clients loved his melodies, and Little Louis, as he was known, had found his calling.
On January 1, 1913, he was busted for firing his stepfather's pistol, and sentenced to the Colored Waif's Home. There he joined the band and got his first musical training, which he characteristically never forgot. According to clarinet great Sidney Bechet, who in the 1920s was Armstrong's only peer as a virtuosic improviser, the cornet-playing young Louis mastered the chops-busting clarinet solo for "High Society" before his teens--an astounding feat that only hinted at what was to come.
Little Louis danced in second-line parades, following cornetist Joe "King" Oliver in the Onward Band as they wound through the Crescent City streets. Oliver was a catalytic force for Armstrong, who always insisted he learned his stuff from Papa Joe. When Oliver left for Chicago, following post-World War I black migration from the South to Northern and Western cities, he left Little Louis his slot in the Kid Ory band, which led the young cornetist to Fate Marable and the riverboats plying the Mississippi in 1920-21.
Marable, impressed by the young hornman's dazzling facility and ear, hired him for his riverboat band, and one of his sidemen trained the youngster to read and write music. What they played was a mix that would confound the Dixieland revivalists who decades later took Armstrong as their figurehead: adapted arias and classical overtures, quadrilles and other dance music, and the like. (Historian Dan Morgenstern has pointed out the suggestive influence of classical music on Armstrong.) At Davenport, Iowa, when the riverboat docked, a white kid named Bix Beiderbecke first heard Armstrong with Marable and decided to make the jazz cornet his life.
In 1922 Oliver sent for his protégé, who kissed his mother goodbye, packed the fish sandwich she made for him and headed north to Chicago. When he got to the Lincoln Gardens Cafe, where Oliver's band was wailing, he looked like a rube and was so shy he stayed by the door to watch. He couldn't believe he'd be playing with these masters of jazz. Yet in a very short time, first in recordings with them, then with his own Hot Fives and Hot Sevens, he would make them all sound like musical relics.
Rube or not--and his mode of dress quickly became Chicago-style sharp--Armstrong got the girl. His second wife, piano-playing Lil Hardin, married him while they were both playing with Oliver. Hardin was conservatory-trained and middle class, and for the next few years her ambition would drive the modest genius she married to make his mark in the rapidly exploding Jazz Age. Convinced that Oliver kept Louis in his band to keep him from fronting his own, Lil persuaded her husband to grab Fletcher Henderson's offer to join his New York-based big band. When Armstrong arrived in the Big Apple in 1924, Henderson's band was, as Morgenstern notes, "designed for Roseland's white dancing public...rhythmically stiff"; when he left fourteen months later, both arrangers and soloists were extending his sound.
It was Lil who persuaded Armstrong to go back to Chicago after scarcely more than a year in New York, and there he joined her band, then Carroll Dickerson's, and rocked the town. The night he returned, he was greeted by a banner she had unfurled over the bandstand: world's greatest trumpet player. Armstrong later told Morgenstern the reason he left Henderson's band was that the "dicty bandleader," college-educated, light-skinned and prone to look down on dark blacks, wouldn't let him sing, except occasionally for black audiences or for novelty and comic effect. Armstrong had been singing before he ever picked up a horn--it was a fundamental part of who he was and what he had to say. Ultimately, his vocals would make him a world-famous star. More immediately, they were another virtuosic tool he used to change jazz and, in the process, American culture.
Armstrong pioneered so many firsts in jazz and America that any list is bound to be partial. Here's a sample: He invented the full-fledged jazz solo. He invented scat singing. He introduced Tin Pan Alley and Broadway tunes as jazz's raw material. (When Armstrong replaced New Orleans standards and blues with Tin Pan Alley tunes in the 1930s, he forged the model followed by swing, jazz's most successful invasion of American pop music. His model was followed literally: Key arrangers like Don Redman, who worked for many bandleaders, including Benny Goodman, adapted Armstrong's runs and rhythmic moves to section-by-section big-band arrangements.) And Armstrong performed in interracial settings. Once, in New Orleans, when a bigoted announcer refused to introduce his band, he did it himself--so well that the radio station asked him to do it for the rest of the band's stint.
His voice engulfed America. Among his major disciples was Bing Crosby, who called him "the beginning and the end of music in America." His influence rippled across American popular and jazz singing like an earthquake. As he reconfigured pop tunes, the apparently natural force of his voice's cagey dynamics and loose rhythms seized the imagination of talents like Ella Fitzgerald and Billie Holiday, Crosby and Frank Sinatra.
With his last Hot Sevens recordings for Okeh in 1928, in which tunes like "I Can't Give You Anything But Love" were issued as B-sides, Armstrong had moved closer to the new American cultural mainstream he was inspiring. When he started recording for Decca in 1935, the impetus accelerated. A couple of interim managers gave way that year to Joe Glaser, a thuggish, mob-connected scion of a well-off Chicago family. He and Armstrong shook hands on a deal that lasted till they both died. As Armstrong put it, "A black man needs a white man working for him." It was the beginning of his big crossover into mainstream American culture--another Armstrong first in undermining de facto segregation in America. And his years at Decca were his workshop in change.
He fronted a big band, which critics hated and fans enjoyed. The outfit was run by Glaser, since Armstrong, who occasionally hired and fired personnel, didn't want to shoulder a bandleader's nonmusical burdens. And he agreed with Glaser on a new musical direction: setting his solos off in sometimes inventive, sometimes indifferent big-band charts; smoothing his blues-frog vocals into a more sophisticated sound without losing their rhythmic slyness--something he was also doing with his trumpet solos, reshaping his early frenetic chases after strings of high-altitude notes into less eye-popping, more lyrical solos.
Physical damage to Armstrong's lip and mouth from high and hard blowing forced the issue. Joe Muranyi, who played with him years later, says, "Part of the change in Louis's style could be attributed to the lip trouble he had in the early thirties. There are tales of blood on his shirt, of blowing off a piece of his lip while playing. This certainly influenced the way he approached the horn; yet what we hear on these tracks has at least as much to do with musical development as with physical matters." Limitation was, for Satchmo's genius, a pathway to a matured artistic conception. As Giddins argues forcefully in Satchmo, Armstrong had never separated art and entertainment; jazz for him was pop music. And if his bands irritated critics, there were plenty of gems, and besides, people loved him.
By World War II, his audiences were more white than black.
The war years broke the big bands. The culture had changed: Singers and small groups were hip. It was the era of a new sound, what Dizzy Gillespie called modern jazz and journalists dubbed bebop. Bop's frenetic, fragmented rhythms restated the postwar world's rhythms, and it deliberately presented itself not as entertainment but as art. The musicians forging it, like Gillespie and Charlie Parker, were fully aware of the stirring civil rights movement. World War II had fostered widespread entry of blacks into industry and the American military. Not surprisingly, after the war, they weren't willing to return to the old values of accommodation and deference. Instead, they demanded equality and freedom. In this context, boppers and their followers saw Armstrong's lifelong mugging and entertainment as Uncle Tom-ism rather than artistic expression.
The Dixieland revival, based in Chicago, occurred at about the same time. The (mostly white) revivalists needed an artistic figurehead. With a healthy historical irony they ignored, they chose Armstrong--the very soloist who blew apart old-style New Orleans polyphony, their idea of "pure" or "real" jazz. In 1947 Satchmo reluctantly abandoned his eighteen-piece outfit for the All Stars, a New Orleans-style sextet that included Jack Teagarden and Earl Hines. Though they often made fine music, the group was seen as a step backward by boppers. They jabbed at Satchmo, he jabbed back, and the split between revivalists and modernists escalated into a civil war that, in different stylistic and racial modes, still divides the jazz world.
Sadly, it was another Armstrong first. And his audiences began to turn lily-white. Giddins deftly shows Armstrong's world-famous onstage persona--the big grin, the bulging eyes, the shaking head, the brandished trumpet, the ever-present handkerchief, the endless vaudevillian mugging--to be an organic conception of the artist as entertainer. Still, from the 1950s until just before his death in 1971, Armstrong had to deal with accusations and slurs.
He never forgot who he was; and though he retained his characteristically modest manner and protested only privately how much he'd done to advance black civil rights, he could still be provoked, as President Eisenhower and the public discovered in 1957. Armstrong was poised to go on the first State Department-sponsored tour of the Soviet Union, a cold war beachhead by jazz. He abruptly canceled it over the South's defiance of school integration at Little Rock, and he publicly excoriated Ike and America.
Early jazz musicians often refused to record, because they felt competitors could steal their best licks from their records. This was why the all-white Original Dixieland Jass Band made jazz's first records; black New Orleans trumpeter Freddie Keppard refused, fearing for his originality.
No one knows for sure how many recordings Armstrong made during the course of his half-century recording career. All agree, however, that he helped create both the art and the industry. After all, "race" records, especially Armstrong's hits, were as important as Bing Crosby's in saving the fledgling record companies from collapse during the Depression. (And there was more to it than that. Through the phonograph Armstrong made infinite disciples, shaping what jazz would become.)
During the 1950s and 1960s, when he was largely considered a period piece, Armstrong recorded important documents, like his meetings with Duke Ellington and Ella Fitzgerald. The best thing about them is their apparent artlessness, the easy, offhand creativity that was as much Armstrong's trademark as his trumpet's clarion calls. The pleasure is doubled by the response of his disciples.
Ella fits that description easily, since her trademark scat singing owes so much to Armstrong's. Yet she made it her own, purging scat of its overt blues roots. Producer Norman Granz supported them with his favorite Jazz at the Philharmonic stars--Oscar Peterson, Herb Ellis and Ray Brown. The results: Both Ella and Louis and Ella and Louis Again are incandescent yet low-key, full of generous pearls (from "Can't We Be Friends" to "Cheek to Cheek") that can almost slip by because of their understated yet consummate ease.
The 1961 session with Armstrong and Duke Ellington was hasty and almost haphazard, a simple melding of Ellington into Armstrong's All Stars; and yet it produced a wonderful, relaxed, insightful album. After all, Ellington had shaped his earliest bands around trumpeters and trombonists who could serve up Armstrong's New Orleans flair.
Like most postwar babies, I grew up knowing Louis Armstrong as the guy who sang "Mack the Knife" and, most famously, "Hello, Dolly!" It was only later that I discovered the old blues stuff with singers like Bessie Smith, the Hot Fives, Ella and Louis, Fletcher Henderson and--one of my faves--Armstrong's accompaniment on early hillbilly star Jimmie Rodgers's "Blue Yodel No. 9." But even as a kid I felt strangely drawn to the little guy singing and grimacing on TV, wiping his perspiring brow with his trademark handkerchief. Although it all seemed corny, there was something, a hint of irony--though that wouldn't have been what his audiences, black or white, noticed unless they were old-timers who knew the ironic physical language or Satchmo fans or, like me, just a kid.
Why would a white kid in America catch a glimpse of Armstrong's abundantly joyful and potentially dangerous ironies? I'd love to claim precociousness, but it was a lot simpler. I could tell Armstrong was real because he filled the little blue TV screen so overwhelmingly that he made everything around him look, as it should have, fake.
What ought to be read--and why--are questions that have a unique urgency in a multicultural milieu, where each group fights, legitimately, for its own space and voice. In the past couple of decades, battles over the Western canon have been fought strenuously in intellectual circles--one such flash point was Allan Bloom's The Closing of the American Mind and the debates that ensued. These skirmishes have much to do with the fact that America is undergoing radical change. The Eurocentric place once acknowledged as the heart of its culture has ceased to be so. Alternative groups, from different geographies, have brought with them the conviction that a public life with a myriad of cores, rather than a single one, is far more feasible today.
It strikes me as emblematic that the voices most sonorous on the battlefield over the fate of literature are often Jewish, from those of the two Blooms, Allan and Harold, to that of Cynthia Ozick. This is not a coincidence: After all, the Jews are known as "the people of the book." For the Talmudic rabbis, to read is to pray, but so it is, metaphorically, among secular Jews...or, if not to pray, at least to map out God's cosmic tapestry. Among the most deeply felt Jewish expressions of book-loving I know is a letter to the legendary translator Samuel ibn Tibbon, a Spanish Jew of the illustrious translation school of Toledo in the twelfth century, written by his father. In it the elder Tibbon recommends:
Make your books your companions, let your cases and shelves be your pleasure grounds and gardens. Bask in their paradise, gather their fruit, pluck their roses, take their spices and their myrrh. If your soul be satiate and weary, change from garden to garden, from furrow to furrow, from prospect to prospect. Then will your desire renew itself and your soul be filled with delight.
But to turn Tolstoy's Anna Karenina into a companion, to satiate one's soul with it--ought that to be a Jewish pastime? I'm invariably puzzled at the lack of debate among Jewish intellectuals, especially in the Diaspora, on the formation of a multinational literary canon made solely of Jewish books. Why spend so many sleepless nights meddling in global affairs, reorganizing a shelf that starts with Homer and ends with García Márquez, yet pay no attention whatever to those volumes made by and for Jews?
The idea of a Jewish literary canon isn't new. Among others, Hayyim Nahman Bialik, the poet of the Hebrew renaissance and a proto-Zionist, pondered it in the early part of the twentieth century. He developed the concept of kinus, the "ingathering" of a literature that was dispersed over centuries of Jewish life. Bialik's mission was to centralize it in a particular place, Israel, and in a single tongue, Hebrew. And a handful of Yiddish and Jewish-American critics, from Shmuel Niger to Irving Howe, have addressed it, although somewhat obliquely. Howe, for instance, in pieces like "Toward an Open Culture" and "The Value of the Canon," discussed the tension in a democratic culture between tradition and innovation, between the blind supporters of the classics and the anti-elitist ideologues. But in spite of editing memorable volumes like A Treasury of Yiddish Stories, he refused to see Jewish literature whole.
The undertaking never achieved the momentum it deserves--until now. A number of books have appeared in English in the past few months that suggest the need for a debate around a modern Jewish library. The Translingual Imagination (Nebraska), by Steven Kellman, a professor at the University of Texas, San Antonio, while partially concerned with Jewish literature, addresses one crucial issue: the polyglotism of authors like Sh. Y. Abramovitch, the so-called grandfather of Yiddish letters, whose conscious switch from Hebrew into Yiddish didn't preclude him from translating many of his novels, like The Mare, back into the sacred tongue. The presence of multilingualism in the Jewish canon, of course, is unavoidable, for what distinguishes the tradition is precisely its extraterritorial nature, for example, the fact that it emerges wherever Jews are to be found, regardless of tongue or geographical location. This complicates any attempt at defining it in concrete ways: What, after all, are the links between, say, Bruno Schulz, the Polish fabulist and illustrator responsible for The Street of Crocodiles, and Albert Cohen, the French-language author of the masterpiece Belle du Seigneur?
Also recently released is a book by Robert Alter, author of the influential The Art of Biblical Narrative and translator of Genesis. It is titled Canon and Creativity (Yale) and attempts to link modern letters to the biblical canon to stress issues of authority. Alter is attracted to the debate of "canonicity" as it is played out in academia and intellectual circles today, but he isn't concerned, not here at least, with purveying the discernible edges of Jewish literature historically. Far more concerned--obsessed, perhaps--with the continuity between Jewish authors from the Emancipation to the present is Ruth Wisse, a professor of Yiddish at Harvard, whose volume The Modern Jewish Canon will legitimize the debate by bringing it to unforeseen heights. In the interest of full disclosure, I must acknowledge up front that together with Alter and Wisse and four other international Jewish critics, I am part of a monthslong project at the Yiddish Book Center to compose a list of the hundred most "important" (the word cannot fail to tickle me) Jewish literary books since the Enlightenment. So I too have a personal stake in the game. But sitting together with other candid readers in a room is one thing. It is another altogether to respond to the pages--at once incisive and polemical--of one of them whose views have helped to form my own.
Wisse is a conservative commentator on the Jewish-American and Israeli scenes and, most significant to me, an intelligent reader of strong opinions whose work, especially her study of I.L. Peretz and her monograph The Schlemiel as Modern Hero, I have long enjoyed. In her latest work she ventures into a different territory: From specialist to generalist, she fashions herself as a Virgil of sorts, thanks to whom we are able to navigate the chaotic waters of Jewish culture.
Probably the most estimable quality of The Modern Jewish Canon is simply that it exists at all. It insinuates connections to document the fact that Jews have produced a literature that transcends national borders. Albert Memmi's Pillar of Salt and Philip Roth's Operation Shylock might appear to be worlds apart, but Wisse suggests that there is an invisible thread that unites them, a singular sensibility--a proclamation of Jewishness that is clear even when it isn't patently obvious.
This is a crucial assertion, given that Jewish communities worldwide often seem imprisoned in their insularity: Language and context serve to isolate them from their counterparts in other countries and continents. For example, American Jews, for the most part, are miserably monolingual. (I doubt Jews have been so limited linguistically at any time in the past.) They insist on approaching their own history as starting in the biblical period but then jump haphazardly to the Holocaust, and thereon to the formation of the State of Israel in 1948. The Spanish period, so exhilarating in its poetic invocations, is all but ignored, and so is the importance of Jewish communities beyond those of Eastern Europe. Why are the echoes from the Tibbon family to Shmuel Hanagid, Shlomo ibn Gabirol, Moses ibn Ezra and medieval Spanish letters in general so faint? The power of these poets, the fashion in which they intertwined the divine and the earthly, politics and the individual, the struggles of the body and the soul, left a deep imprint in Jewish liturgy and shaped a significant portion of the Jewish people through the vicissitudes of the Ottoman Empire and northern Africa. Even the Dreyfus Affair is little known or regarded, as is the plight of the Jews in Argentina from 1910 to the bombing of their main cultural building in Buenos Aires in 1994. And where the verbal isolation is not a problem, the insular perspective still applies: For instance, only now is Israel overcoming its negation of Diaspora life, which has deformed Israeli society and resulted in an institutionalized racism against those co-religionists whose roots are not traced to Yiddishland.
Wisse displays genuine esteem for high-quality literary art. She trusts her instincts as a savvy reader and writes about what she likes; no affirmative action criteria seem to apply in her choices--and for hewing to her own perspective, she ought to be commended. The common traits she invariably ascribes to what is a varied corpus of Jewish literature always point to Russia and Europe. Her encyclopedism is commendable in that it surveys a vast intellectual landscape, but it has clear limitations. She is well versed in English, Hebrew and Yiddish letters. But what about Sephardic culture? Ought she to exclude all that she is unfamiliar with?
The study is divided into ten chapters of around thirty pages each, ordered chronologically according to the birth dates of authors. She starts in the right place--with Sholem Aleichem, the author of the most beloved of all Jewish novels and my personal favorite, Tevye the Dairyman. And she ends with Israeli literature. In the interim, she mixes excerpts, critical commentary and historical perspective in exploring the work of Kafka, S.Y. Agnon, Isaac Babel, Isaac Bashevis Singer and scores of other luminaries, some of questionable value in my eyes (Jerzy Kosinski, for instance) and others often overpraised (here I would include Ozick). The contributions of critics such as Dan Miron, Chone Shmeruk, Lionel Trilling and Howe are acknowledged by Wisse in these pages, their perspectives still fresh and inviting.
It may be ungenerous to accuse Wisse of a certain nearsightedness; after all, to capture the essence of a literature written in a plethora of tongues and cultures, a literature that is by definition "undefinable," any potential cataloguer would need to be versed in each and every one of them. But The Modern Jewish Canon suffers another serious shortcoming, one entirely within her control: It is too dry a read. For a treatise that aspires to connect the various Jewish Weltanschauungen and juxtapose a rainbow of imaginations, each responding to different stimuli, from the eighteenth century to this day, Wisse offers little by way of narrative enchantment. She is a scholar and writes as such. Scarce effort is made to turn words into metaphors, to twist and turn ideas and allow them to wander into unexplored regions. The reader finds himself lost in a sea of "objective impersonality." Too bad, for shouldn't a book about the beauties of a polyphonic literature aspire to some of that polyphony itself?
Wisse herself announces: "Modern Jewish literature...promises no happy merger into universalism at the end of the day." And yet some form of universalism is what she is attempting to describe, extending connective tissue between literary works where, at least superficially, there seemed none before. In that sense the achievement is impressive. Immediately after finishing the book, I took up pencil and paper to shape a list of what would be my own choice of books. In one of her last pages Wisse, who concentrates on novelists, includes a list of almost fifty titles, "meant to serve as a reference guide." Included are Yaakov Shabtai's Past Continuous, Piotr Rawicz's Blood From the Sky, Pinhas Kahanovitch's The Family Mashber, and Anne Frank's Diary of a Young Girl. But I found myself asking, Where are Marcel Proust, Elias Canetti and Moacyr Scliar? And that, precisely, is one thing a book of this sort should do: force readers to compose a response to the invisible questionnaire the author has quietly set before our eyes.
Future generations will find The Modern Jewish Canon proto-Ashkenazic and hyper-American, a sort of correlative to the Eurocentrism that once dominated American letters. They will kvetch, wondering why the Iberian and Levantine influence on today's Jewish books--from the poetry of the crypto-Jew João Pinto Delgado, to the inquisitorial autobiography of Luis de Carvajal the Younger, to even the Sephardic poetry that came out of the Holocaust--was so minimized in the English-language realm. Kvetch is of course a Yiddish word--or, as Leo Rosten would have it, a "Yinglish" one--but fretting and quarreling are Jewish characteristics regardless of place, and they inhabit the restless act of reading as well. The idea of a Jewish canon, modern and also of antiquity, hides behind it an invaluable fact: that Jews are at once outsiders and insiders, keepers of the universal library but also of their own private ones. Books have always served as their--our--companions for renewal and delight. The content of that private library might be up for grabs, but not its endurance.
The attempt to see Jewish literature whole, as expressing a singular sensibility, has never had the momentum it deserves--until now.