Let's cut to the chase on Ken Burns's Jazz, which rolled out on PBS January 8, by invoking Wallace Stevens.
1) Is it entertaining TV? Mostly, in PBS fashion.
2) Does it leave out people and places and whole periods and genres normally considered vital parts of jazz history? Yes.
3) Does it need more editing? Yes.
4) Does Louis Armstrong claim 40 percent of its nineteen hours? Yes.
5) Does post-1960s jazz claim 10 percent? Yes.
6) Does it tell an informed and informative story? Usually.
7) Does it identify the 500-odd pieces of jazz that serve as its soundtrack? Rarely.
8) Does it have rare and evocative pictures and film footage? Absolutely.
9) Is it good history? It's made-for-PBS history.
10) Will it satisfy jazz fans and musicians and critics? Seems like it already hasn't, and it hasn't even aired yet.
11) Will it save the jazz industry? That depends: CDs labeled Ken Burns's Jazz are bullish.
12) Will it make jazz a part of mainstream American culture again? Not likely, but it may help make it an official part of American popular history.
13) Is it part of the transition jazz has been making for three decades into the academic world? You bet.
Now let's dolly back and try to tell the story.
The numbers have to come first. The ten-episode, nineteen-hour series was six years in the making, and it sprawls: seventy-five talking heads, thousands of still photos and pieces of film, some 500 pieces of music and so on. Costing some $13 million, about a third of it from General Motors, it's the biggest documentary that's been done about jazz.
And yet a lot of jazz musicians and critics and fans, in print and on the web, have been complaining that it's too constrictive. It's easy to see why. It's certainly not comprehensive. For Burns and collaborator Geoffrey Ward, history unfolds in the textures of individual lives. (Ward won the Francis Parkman prize for A First-Class Temperament, one volume of his biography of FDR.) Jazz for them is the story of a few great men (and the odd woman) who changed the way Americans, then the world, hear and think and act. Chief among them: Louis Armstrong and Duke Ellington. There are places of honor for the likes of James Reese Europe and Jelly Roll Morton, Sidney Bechet and Bix Beiderbecke, Benny Goodman and Count Basie, Artie Shaw and Charlie Parker, Miles Davis and Dave Brubeck. This sort of survey is easier to sustain until about 1929, because jazz musicians were few (though not as few or as limited to New Orleans, Chicago and New York as the series implies). But Burns & Co. can tell a credible story of jazz's first decades using a handful of pioneers.
One reason for the noise is that this overlaps the story of jazz according to the Jazz at Lincoln Center program, a flashpoint in the jazz world. JALC teaches that jazz is a clear-cut genealogy of a few outstanding figures, and it excludes many important artists, especially after 1960, often for ideological reasons. The basic plot for both: Taking its building blocks from slave music, marching bands, blues, the church, European dance and classical music, jazz began life as a mongrel in New Orleans, came up the river to Chicago, met up (via Armstrong) with New York proto-swing bands and Harlem stride pianists and exploded, drawing young white players into a black-developed music. This is true enough, though it ultimately means ignoring uncomfortable parallel developments (Red Allen and Armstrong) or scenes (between-the-wars LA jazz) or entire genres (Latin jazz, European jazz). But schematic history can be good TV, and Burns, like earlier PBS filmmaker Frederick Wiseman, makes long, long movies that depend on strong, heavily delineated characters and themes to keep them from dissipating.
His story's heart is Armstrong. Its head is Ellington. And its soul is the Jazz Age and the Swing Era.
In episode five, "Swing: Pure Pleasure (1935-1937)," writer Albert Murray declares, "Jazz is primarily dance music." Though that hasn't been true for nearly half the music's history, it's clear he's speaking for Burns: Three episodes, nearly six hours, discuss the big-band era, when jazz underpinned popular music, lifted Depression-era spirits, saved the record industry and dominated that new omnipresent technology, radio. Nevertheless, as the often-intrusive talking heads tell us, from Ellington on down the musicians knew the difference between the business and the music; stage shtick and chart slots were as important then as now. This is a bittersweet Golden Age of speakeasies, hoods, the Great Depression, squealing bobby-soxers, lynchings, jitterbugging, novelty tunes and early moves toward racial integration. It is described as a time of "adult sensibility" and is the series' gravitational center.
The great-man schematic creates escalating difficulty for the plotting starting with episode seven, which begins with Charlie Parker and spends nearly as much time on Satchmo as it does on bebop. By the mid-1940s, the musicians had multiplied and moved on--out of Harlem and swing time. And so jazz dissolves into hundreds of musicians searching for different sounds, styles, approaches, languages, multimedia formats. The last forty years of Jazz are a choppy and unreliable ride; a lot disappears, and what's left can be telegraphic or confusing and look exactly like JALC speaking.
Burns says post-1960s jazz is too controversial even in the jazz world to be history. Maybe he should have ended, then, with John Coltrane; Baseball, after all, stopped at 1970. For in less than two hours, faces from Charles Mingus's to Sonny Rollins's flash across the screen between inevitable reprises of Duke and Satchmo. Miles Davis's push into fusion shrinks to his alleged desperation for teen fans. Ornette Coleman is dismissed. Keith Jarrett and Chick Corea don't appear. The 1970s and 1980s are a quick-blur artistic wilderness until the arrival of Wynton Marsalis, artistic director of Jazz at Lincoln Center and the film's senior creative consultant and prime talking head. And there, after a brief survey of new stars (Cassandra Wilson, Joshua Redman) and a recapitulation of key figures and themes, it ends.
The signal irony: If Burns had cut the final episode and billed this as Jazz: The First 50 Years, more of the discussion might be where it belongs--on the movie.
Until pretty recently nobody thought enough of jazz to point a movie camera in its general direction for very long. There are snatches of footage of Armstrong, Ellington, Fats Waller, Bessie Smith and the like from the early days. By the mid-1930s the popular swing bands cropped up in films and then in "soundies." But the video record of what fans like to call America's greatest art form is sporadic and discouraging.
This problem plays to Burns's strengths: He loves having his staff dig up old photos (for this, they turned up millions), and he loves working stills to make them kinetic. He pans across and slowly zooms in and out of a single shot to give it a movielike temporal depth. In one vignette about Harlem's Savoy Ballroom, where drummer Chick Webb held court and introduced Ella Fitzgerald in the 1930s, Burns intercuts shots of separate white and black dancers to hammer home the voiceover's point about its integrated patrons--a first in America. He assembles a deft mix of photos and film to re-create the stage-fright-to-triumph of Benny Goodman's 1938 "Sing Sing Sing" concert at Carnegie Hall.
The series boasts tours de force. The evocative segment called "The Road" strings out a head-turning daisy chain of wondrous footage: bands on trains and buses and touring cars, chugging 500 miles a day, six days a week, making whoopee and changing tires, riding high onstage and coping with breakdowns and prejudice offstage. The recently deceased bass and photography great Milt Hinton recalls how at band stops his wife would head into town to look for black homes where the musicians could eat and stay, how musicians were people of prestige in the community. Readings from journals and newspapers and diaries sample big-band life's dizzying ups and downs, while the film rolls from impromptu baseball games to a couple of female jazz fans puffing fake reefers while hugging the sign of a town named Gage.
And in the background rolls out more jazz by far than 99 percent of America has heard. Much of the time, it's as snippets in the background when one after another talking head pops up. The heads are duly identified time after time. The tunes aren't, unless they're keyed to a biographical or sociological set piece. Why not flash a subtitle to tell the audience what's playing?
Because jazz is the soundtrack for this series as much as or more than it is its subject. To put it another way, this isn't really a movie about jazz history. Think of Burns as PBS's Oliver Stone. Like the Civil War and baseball, jazz for Burns and Ward is a lens to focus on basic questions: Who are Americans, and how do they manage to get along--or not? And their central query concerns race.
So they film jazz as the tale of black redemption in and of America, a narrative of conversion and triumph whose shape recalls St. Augustine and Dante. From the days of slavery through the humiliations of Jim Crow and minstrelsy to the assertive freedom of the blues and jazz, Burns's movie resounds with the apocalyptic ring of apotheosis, as it examines a few crucial candidates for cultural sainthood. For it wants both to carve jazz greats into the American pantheon and to underline jazz's pivotal centrality to twentieth-century America as an affirmation of African-American creativity and endurance.
This, coupled with Marsalis's camera-savvy polish as a spokesman as well as his insistent championing of jazz education over the years, explains why a filmmaker like Burns would feel drawn to JALC's version of jazz history. (Actually, Dan Morgenstern, the respected head of the Rutgers Institute of Jazz Studies, was the film's senior historical consultant and vetted the script; there were twenty-three consultants in all, so until the final episode there are inevitable points of similarity, but not identity, with Lincoln Center's tale.) But dramatic necessity also helps explain why some characters, like Armstrong and Ellington, are the story's recurrent focus.
Swing, you might guess, is a buzzword in this series, and you'd be right, even though the film itself doesn't swing much. The earnestness that suffuses PBS cultural products won't let it float for long. At times, the music's lilting ease and fire contrast vividly with its deliberate, self-conscious pace. That's exacerbated by Burns's seventy-five talking heads: Watching can be like sitting through a course team-taught by the UN.
Besides Marsalis, Burns's other main soloist is writer Gary Giddins, and Giddins swings: His wide-ranging erudition rides his love for jazz easily. Other commentators--Stanley Crouch, Albert Murray, Artie Shaw, Gerald Early, James Lincoln Collier, Dave Brubeck--give good camera and consistent historical edutainment. But too many proffer vague impressions, clichéd memories, breathless interpretations and warmed-over anecdotes. They could easily have been edited or edited out. Then there are periodic pileups. In episode seven Joya Sherrill, Mercedes Ellington (Duke's granddaughter) and a few others repeat that Duke and Billy Strayhorn were a rare and wonderful match. In episode five, the same two dancers appear twice with virtually the same observations about Harlem's Savoy Ballroom.
Sometimes the anecdotes are fun or fabulous, sometimes they're bad history. Take Jon Hendricks, who in episode four retails the disproven mythic origin of Armstrong's scatting (sheet music fell off his stand at a recording session). Or director Bertrand Tavernier, who gushes about Django Reinhardt and Stéphane Grappelli introducing the guitar-violin combo to jazz, though they themselves would have fingered Eddie Lang and Joe Venuti. Ballplayer Buck O'Neil rambles good-naturedly about Billie Holiday giving listeners "the greatest moments" and "the saddest moments," demonstrating how a tighter edit could have sliced out the lapses into vacuity.
Marsalis's starring role has several sides. He delivers very effective musical glosses and explanations, polished by years of shows and clinics with adults, teens and kids. His knowledge of and passion for the jazz he loves, and his conviction that it represents American life in full, are infectious, if sometimes hyperbolic. But when he holds forth about Ellington and Armstrong and the semilegendary Buddy Bolden as if he knew them intimately, it's TV, not history.
History can be light-fingered instead of heavy-handed, and Jazz could use more humor, more of the "light" Marsalis ascribes to the best jazz musicians. It has some fabulous vignettes from Crouch, the third-ranked talking head. Except for the last two hours, Crouch swings. In one priceless bit he mimics pre-Armstrong pop vocalists and then Armstrong himself, and asks why anyone would want to revert. "That would be a bad choice," he deadpans. Anybody who makes that choice, he adds, should be deported--count a beat--"to somewhere." Another beat. "Maybe Pluto." It's impossible to disagree, especially when you're laughing.
To some extent, Burns has himself to blame for the unjoyful noise in the jazz world. In conversation, he tends, rightly, to underplay his work's ambitions. It's not the history of jazz, he says. Viewers will get to know a handful of musicians, meet another dozen or two and brush past a few dozen more. He can't possibly compete with books like Giddins's Visions of Jazz or jazz histories like those of Ted Gioia or Marshall Stearns; he's made a movie that tells an educational story for a mass audience. This is reasonable, accurate and no small feat. And, in fact, the movie is steeped with rich human detail of the sort most music historians rarely touch on. But the PR bombast trumpets him as jazz's Joan of Arc, and once he's on-message he can't stop selling. Jazz, like academia, is small and marginal with plenty of defensive, combative types; "the music" is a secular religion. Burns's perceived power inevitably lights the territorial fuses.
As it happens, the jazz industry, now down to about 1 percent of US music sales once you exclude Kenny G and his clones, looks like a Victorian maiden lashed to the tracks awaiting her hero. Burns's movie is a mantra, as record labels crunch despairing numbers and weed out personnel and artists after the latest wave of megamergers and Internet terrors. For his well-designed five-CD companion set (subtitled The Story of America's Music), the filmmaker brokered a deal between Sony and Verve (Universal), bitter corporate rivals, then brought in other labels; all are hoping for sales like the companion book's, which had a first printing of 250,000. This is mind-blowing if you're a jazz-label head used to dealing in niche sales (Marsalis himself rarely moves more than 10,000 CDs) and waiting for the next guillotine stroke.
Potential audience numbers get tossed around fervently: 40 million viewers for Baseball and The Civil War, and Jazz will probably draw fewer, but... It fascinates me that few of the film's critics address that. Why not consider an America where 20 million more people--or 3 million, or however many finally watch--know something, anything, about Armstrong, Ellington, Parker, Davis and a few others? Where, if they survive the overstatements, talking heads and pacing, they learn some hidden history?
Am I a Pollyanna? Maybe. Reality check: This is a made-for-TV movie. But I too think race is America's central issue, even more multifaceted in the twenty-first century. What holds this joint's pasted seams together, beyond the Founding Documents, is the frequently intangible glue called culture. TV is a major place American culture gets made. Can anyone measure what it meant to have Bill Cosby playing an upper-middle-class dad-next-door for a generation? What it means now that there are black and Hispanic and Asian and gay and you-name-'em channels filling cable and satellite TV? Can anyone guess what it might mean in five years to have Jazz, whatever its warts, playing over and over to a country as terminally divided and in search of itself as this one?
These are not delusions of grandeur about the power of jazz or Ken Burns. They are possibilities written in the history of jazz in America. Take Burns's vignette about Charlie Black, a white Texas teen who saw Armstrong perform in the 1930s. It changed his life. He joined the NAACP's legal team working on what became Brown v. Board of Education. The sociology of jazz is full of such stories. And they are very real.
For instance, no one with a brain disputes that jazz was initially an African-American creation. But as Marsalis, Giddins, Crouch, Murray and Early point out over and over, jazz was welcoming, inclusive, open. It replaced minstrelsy with a cultural site where all Americans could participate, speak to one another, override or ignore or challenge or slide by the society's fixations on racial and ethnic stereotypes. Black Americans (and other ethnic outsiders) could use it to enter mainstream society, white Americans could flee to it from mainstream society, and the transactions created a flux and flow that powered American cultural syntheses.
Jazz, the theme goes, represents America at its best--the dream of America. In the Depression, as Early reminds us, it rivaled MGM musicals in lifting the country's spirits. Of course, since jazz is a human activity, it also reflects the deepest divisions as well as the ideals at America's core. Race, sex, money, power, capitalism, creative freedom, the interaction of the individual and the group--these are all questions embedded in jazz history. They're the questions Burns and Ward are truly interested in. At its best, Jazz gets us interested in them too.
Burns admits he never listened to jazz until he started considering it as a subject. Ward became an Armstrong fan at age 10, when he was hospitalized with polio. Jazz is lucky they're interested in it.
Right now, jazz's commercial future is murky. The major labels are mostly wreckage. Marsalis, who used to get $1 million a year to make niche-market records in the hope that they would turn into catalogue gold, doesn't have a label; neither does Redman. High-profile jazz promoters are hemorrhaging. The Knitting Factory is reportedly in the hole for $2 million, after luring a big entertainment firm to take a stake, opening a club in LA and losing its annual jazz-festival sponsor. The Blue Note chain is said to be spurting red ink from expansions into Las Vegas and midtown Manhattan. Nor are jazz's nonprofit arms thriving. The Thelonious Monk Institute, so closely aligned with the Clinton/Gore Administration that its head was reportedly hoping for an ambassadorship if Al won, is looking pale. And the long-dormant board of Jazz at Lincoln Center has just fired executive director Rob Gibson in a swirl of intrigue: changed door locks and computer codes, fired and rehired personnel, amid persistent rumors of financial malfeasance, bullying and drug abuse.
Jazz has been on a commercial slide since the 1970s, when it racked up 10 percent of retail music sales. At the same time, it began entering the groves of academe. Today most jazz musicians are trained at schools; jazz history is laced through American studies and music curriculums.
This process has already fundamentally changed jazz itself and its relation to American culture, though how isn't always clear at first. As a colleague reminded me recently, in the jazz heydays celebrated by Burns's Jazz, musicians fashioned their own idiosyncratic solutions to musical problems, drawing on oral tradition (which varied considerably) and their own ingenuity and needs. This meant finding individual creative solutions to problems--how to finger this note or sequence, how to get that timbre, how to connect those chord changes. Now, a professor distributes computer analyses of famous solos, templates for solutions that are shared by hundreds and thousands of students. This has a paradoxical effect: It raises the general level of and standardizes jazz training, but it also tends to vitiate the individuality traditionally at the music's heart. This is why older musicians routinely complain that younger schooled players all sound alike. On the other hand, they're well suited for jazz repertory programs like JALC.
That is part of jazz's changing contemporary dynamics. So is Ken Burns's Jazz.
In the aftermath of the Iran/contra crisis, one of the networks decided to make a docudrama about the life of Ollie North, loosely based on a biography by Ben Bradlee Jr. Its problem was that once North joined the Reagan National Security Council staff, the story lost both its moral compass and empathetic value. The producers could not find a single real-life character among the top Administration officials who displayed the slightest concern about the moral implications of North's drug- and gun-smuggling, hostage-buying and terrorist-supplying enterprises. They solved this problem by simply inventing someone.
The producers of Thirteen Days, the new Kevin Costner/Cuban Missile Crisis $80 million extravaganza, have done something similar. Instead of inventing a new character, however, they have invented a new history for an old one. Special Assistant Kenneth O'Donnell, who was responsible primarily for presidential scheduling in real life, does not even register in respected crisis histories. In the nearly 700 pages of transcripts from ExComm, the ad hoc committee dealing with the crisis, edited by Ernest May and Philip Zelikow and published by Harvard in 1997, O'Donnell rates exactly two insignificant lines. Yet here we see O'Donnell, played by Costner, saving the Kennedys from themselves and the world from self-destruction. One minute SuperKen is bawling out the President for going soft on the Commies, the next he's roughing up Mac Bundy for suggesting the same. A cross between an über-aide barking orders at quivering politicos and a shaggy dog who follows his master around with scotch-filled Waterford crystal, he instructs Adlai Stevenson to stand up to the Soviets at the UN and a fighter pilot to pretend he was not shot at in Cuba. Cynics looking for an explanation of this rather odd historical rewrite might point to the fact that the film was partially funded by O'Donnell's son, Earthlink co-founder Kevin O'Donnell.
Reviewers like the Wall Street Journal's Joe Morgenstern innocently term the film "a valuable history lesson." In fact, the film takes countless liberties with the documentary record. For instance, Thirteen Days
§ conveniently skips Robert McNamara's initial arguments that Russia's placement of the missiles should be ignored because Soviet long-range missiles made them strategically meaningless, lest this comment undercut the film's entire rationale;
§ ignores the record of US efforts to destabilize the Castro regime, including contingency invasion plans being readied at the time of the emplacement;
§ explicitly whitewashes the Kennedys' unconscionable McCarthyite plot to discredit the dovish Adlai Stevenson, whose recommendations they largely--and secretly--ended up following;
§ sans evidence, attributes a column by Walter Lippmann that contained the seeds of a crisis-ending missile trade to a leak direct from Jack and Bobby;
§ places the Kennedys' meetings that decided in favor of a missile trade inside the ExComm, when in fact they deliberately kept these secret from the "Wise Men," fearing the same attacks they themselves had leveled at Stevenson.
Of course, the level of accuracy is not too bad for a film whose credits include six tailors and seven hairdressers but not one academic historian. (Former CIA analyst Dino Brugioni, author of a fine book on the technical aspects of the crisis called Eyeball to Eyeball, is listed, but one hopes he had nothing to do with its story line.)
My view is that anyone who takes Hollywood's history for scripture deserves whatever they get. As John Sayles has observed to Eric Foner, "Using [the word] 'responsibility' in the same sentence as 'the movie industry'--it just doesn't fit." Yet at the same time, Sayles noted, Hollywood can't help itself. Often the only way to sell a movie is for the ad to read "Based on a true story..." Sometimes they get away with it, sometimes not, usually depending on whose interests are served by the lies in question. When Costner and Oliver Stone offered up their loony version of the Kennedy assassination in JFK, the Washington media establishment reacted with such outrage the Capitol threatened to float away on hot air. No one wanted to see Stone's conspiratorial version of the assassination and the Vietnam War replace the official misinformation. On the other hand, some Hollywood lies are welcomed by pundits. Last summer, Mel Gibson and company came up with a version of the American Revolution in The Patriot in which the Americans, not the British, freed the slaves. No matter that the Southern revolutionaries fought to protect their "peculiar institution" while the British offered the slaves their freedom should they join the loyalist cause. William F. Buckley (surely a born loyalist if ever there was one) came forward to endorse Hollywood's fictional history. David Horowitz, displaying his patented post-Stalinist brand of hysterical ignorance leavened with personal dishonesty, complained, "Leftwing reviewers inwardly despising its patriotic themes have taken to faulting its alleged historical 'inaccuracies' as a way of dismissing its significance.... [But] isn't this what the American revolution was about--the promise that all men would be free? And didn't the new nation deliver on that promise in a generation and pay an even greater price in blood to do so?"
Well, no, Comrade Horowitz, it didn't. A generation after the Revolution, the slaves were still slaves, and Southern revolutionaries were still slaveowners. The Emancipation Proclamation (which freed only selected slaves) came nearly a century later, and blacks were not given the right to a meaningful vote in the South for another hundred years after that. (Moreover, some, including quite a few thousand in Florida, are still fighting.)
Judged by the standards of JFK and The Patriot, Thirteen Days looks pretty good. At least it comes with a warning: "You'll never believe how close we came," its ad campaign promises. And I didn't.
You've got to understand what Sam Shepard meant to us.
There are those who know Shepard as a movie star and those who discovered him, earlier on, when he won the Pulitzer Prize for Buried Child; but for those of us who first watched his plays in tiny studio theaters above a bar or in converted churches when there was still a counterculture, he was our playwright.
Shepard's plays were like no others--fresh, hip, antiheroic, free from the tired old psychology of Tennessee Williams and the Actors Studio. By no means political, they nevertheless made us aware of the myths that shaped our behavior as Americans. And if you knew where playwriting had been, with all those precious attempts to repoeticize the drama, and knew what was happening with psychedelics--people beginning to listen to those half-heard perceptions passing through their heads--you knew he had created an inevitably right form of drama.
He also meant a lot to people in the Bay Area, where, in the waning days of the counterculture, he settled for the better part of a decade. That was about the same amount of time Eugene O'Neill lived here, and like O'Neill, Shepard wrote many of his best plays here. He's been quoted as saying his years as playwright-in-residence at the Magic Theatre in San Francisco, where he premiered Angel City, Buried Child, Fool for Love and True West, were "the most productive time of my theater life."
But then, having given by his presence a certain validation to regional theater in the forever insecure "world class" city of San Francisco, he pulled up stakes and went to work as an actor in Hollywood movies. This at about the same time he was criticizing the hell out of the corruption of the creative process in La-La Land, in True West. And not only did he abandon our ever so artistically pure Bay Area for Hollywood, but he ended up making a long line of godawful movies, like Dash and Lilly, Purgatory and Baby Boom, pictures you wouldn't have thought a man of his literary sophistication and discrimination would touch.
In the past decade, Shepard seems to have returned to theater, though these have largely been years of successful revivals and very mixed and often not very warm responses to his recent work. The result is a lingering fear that Shepard, once the Wunderkind of American drama, has treated his tremendous gift far too carelessly.
Which brings us to The Late Henry Moss, the first premiere of a Shepard play by the Magic Theatre in seventeen years (at Theatre on the Square in San Francisco). The very best playwright of his generation was able to interest Nick Nolte and Sean Penn in the play--two of the very best actors in America, actors who time and time again have shown seriousness in their choice of material. That heightened expectations of the old exhilaration: a return of the real Sam Shepard, the poet, sure-footed, bringing you face to face with perceptions only half-acknowledged. And not only might Shepard be back at the top of his form, but this was an older and, one hoped, more deeply seeing Shepard, writing about the ultimate subject, death.
Shepard had left the Bay Area saying he was "no longer young," and now here we were so much further along. (Seventeen years; is it possible?) The golden boy is 57 and has lost both his mother and the father who was the source of so much of the anger and unhappiness in his plays. As he says at the end of Cruising Paradise, his 1996 collection of tales containing versions of both those deaths, "Everything was in place."
In The Late Henry Moss, the father's death is a mystery. One son, Ray (Sean Penn), seeks the truth about it and about his father and the family's past. The other, Earl (Nick Nolte), his opposite, seeks to hide the truth from himself and others. But from a cab driver (Woody Harrelson) and a concerned neighbor, Esteban (the delightful Cheech Marin), and in flashbacks, we discover that the father (James Gammon) went on a drunken fishing trip with a mysterious Native American woman (a very strong Sheila Tousey). Psychically tougher and more powerfully vital than any man in the play, she constantly throws it in the old man's face that he is dead in life. Ultimately she helps him really die. As she does, we discover Earl's part in that evening and his earlier act of betrayal when the family broke apart.
As the brothers, Nolte and Penn do what they do. Nolte drags on a cigarette, the tip just emerging from his fist, knocks down a shot, passes a hand through his hair and plays ravaged, weighed-down inner suffering with great naturalness. Equally real for the most part, Penn is intense, like a cat about to spring, and is ace, as you might expect, with Shepard's insolent threats and threatening silences. Both know to goose the energy with dynamic gestures, but both can also be a little small at times, as if they're expecting a camera to magnify the drama of facial nuances.
Unlike Woody Harrelson, who turns in a hugely inventive performance as the cab driver, finding fifty different ways to physicalize essentially the same action, what Nolte and Penn do ultimately begins to seem like more of the same. But here I think the problem is the writing, and with great disappointment I report that Shepard hasn't returned to his former powers with this play. He simply hasn't given Penn and Nolte sufficient material to work with. There's not a whole lot to the characters, and their relationship lacks the continuously rich evolution of True West. I suspect this underwriting is part of what makes the ending seem inflated and overwrought. The fact that what is revealed about the family's past isn't all that compelling doesn't help.
There are of course many joyously perverse, off-the-wall Shepard lines like "Every death has to be reported these days--unless you kill someone" and (to bumbling funeral attendants) "That's my father you just dropped." His typically audacious choices as writer and director are also very much in evidence, as when he leaves a giant, unsettling, unfurnished empty space in the set, stage right, or when Sheila Tousey picks Marin up and swings him back and forth like a doll, or when Harrelson leaps on top of a refrigerator, a meat cleaver in hand for protection.
Where most directors move actors about the stage to articulate relationships and tell the story of the play and create an overall mood with lights and textures, it's as though Shepard does all that and, with the help of designers Andy Stacklin (set), Anne Militello (lights) and Christine Dougherty (costumes), also creates pictures on stage that have the strange beauty of Edward Hopper's paintings--only with a palette more like Wayne Thiebaud's. Shepard also moves into a more overt and equally beautiful surrealism, as when Tousey's head and arms appear otherwise disembodied over the edge of a bathtub.
In fact, Shepard seems to be trying to move into new territory. If Buried Child was Shepard's Ibsen play (and Ibsen parody) and Fool for Love his Strindberg, The Late Henry Moss may be a kind of Long Day's Journey Into Night, an attempt at closure with his father and his death.
The way he manages that attempt shows Shepard still of a countercultural bent, embracing the counterculture's characteristic antidote, inclusion of the Other. The setting is no longer a desert wasteland but the Southwest, the Latin/Native American West, New Mexico, where Shepard first moved when he left the Bay Area, and where a brooding primitivism makes you feel you've crossed into a foreign country.
After years of delineating the underside of macho, in Henry Moss Shepard brings onto his stage a Native woman, sensuous, with a mythic dimension and definitely Other. She brings with her clear vision, reverence for the dead, ritual, dance and a nonstereotypical way of being female. And it is she who--not maternally, but with great hardness--brings Henry to his death and closure to his suffering and macho failings.
Ultimately, however, this closure doesn't bring about a sense of reconciliation. The account of Shepard's father's funeral in Cruising Paradise is tender, full of pity and acceptance, and in it Shepard captures a very real sense of the grief that sneaks up unexpectedly (even when you harbor great anger toward the deceased). He chokes up reading the Bible over his father's grave and can't go on.
Henry Moss is a different work, and there's no reason Shepard should re-create the same emotional landscape, but given the subject matter there's a surprising lack of those feelings. Esteban is upset by Henry's death; Ray stands mutely by the corpse for a moment. In the final analysis, though, Shepard is extremely hard on his characters, father and sons. You might say, unforgiving. The failings and betrayals are a barrier he can't seem to get past. And in the end, the play never deals with the grief and pity that must be dealt with if reconciliation is to come from an encounter with the dead.
John Lennon once characterized his wife, Yoko Ono, as the world's "most famous unknown artist. Everybody knows her name, but nobody knows what she does." What she was famous for, of course, was him. The art for which she was unknown could not conceivably have made her famous--although even the most famous of artists would be obscure relative to the aura of celebrity surrounding the Beatle of Beatles and his bride. Yoko Ono had been an avant-garde artist in New York and Tokyo in the early 1960s, and part of an avant-garde art world itself very little known outside its own small membership. The most robust of her works were subtle and quiet to the point of near-unnoticeability. One of her performances consisted, for example, of lighting a match and allowing it to burn away. One of her works, which she achieved in collaboration with the movement known as Fluxus, consisted of a small round mirror which came in an envelope on which YOKO ono/self portrait was printed. It belonged in Fluxus I--a box of works by various Fluxus artists, assembled by the leader and presiding spirit of the movement, George Maciunas. But the contents of Fluxus I were themselves of the same modest order as Self Portrait. We are not talking about anything on the scale, say, of Les Demoiselles d'Avignon. We are speaking of things one would not see as art unless one shared the values and ideologies of Fluxus.
Fluxus, in that phase of its history, was much concerned with overcoming the gap between art and life, a concern inspired in part by John Cage's decision to widen the range of sounds available for purposes of musical composition. Cage's famous 4'33'' consisted of all the noises that occurred through an interval in which a performer, sitting at the piano, dropped his or her hands for precisely that length of time. A typical Fluxus composition was arrived at by selecting a time--3:15, say--from the railway timetable and considering all the sounds in the railway station for three minutes and fifteen seconds as the piece. As early as 1913, Marcel Duchamp made works of art out of the most aesthetically undistinguished vernacular objects, like snow shovels and grooming combs, and he was in particular eager to remove all reference to the artist's eye or hand from the work of art. "The intention," he told Pierre Cabanne in 1968, "consisted above all in forgetting the hand." So a cheap, mass-produced object like a pocket mirror could be elevated to the rank of artwork and be given a title. How little effort it takes to make a self-portrait! In The Republic Socrates made the brilliant point that if what we wanted from art was an image of visual reality, what was the objection to holding a mirror up to whatever we wished to reproduce? "[You] will speedily produce the sun and all the things in the sky, and the earth and yourself and the other animals and implements and plants." And all this without benefit of manual skill!
Fluxus made little impact on the larger art world of those years. I encountered it for the first time in 1984, at an exhibition held at the Whitney Museum of New York in which the art made in New York in the period between 1957 and 1964 was displayed. It was a show mainly of Pop Art and Happenings; and there were some display cases of Fluxus art, many of them objects of dismaying simplicity relative to what one expected of works of art in the early 1960s, exemplified by large heroic canvases with churned pigment and ample brush sweeps. Maciunas spoke of Fluxus as "the fusion of Spike Jones, vaudeville, gag, children's games and Duchamp"; and the display cases contained what looked like items from the joke shop, the children's counter in the dime store, handbills and the like. Ono's relationship to Fluxus is a matter of delicate art-historical analysis, but if she fit in anywhere, it would have been in the world Maciunas created around himself, where the artists and their audience consisted of more or less the same people. It was a fragile underworld, easy not to know about. Ono's work from that era has the weight of winks and whispers.
So, it was as a largely unknown artist that Lennon first encountered her, at the Indica Gallery in London, in 1966. The point of intersection was a work titled YES Painting, which consists of a very tiny inscription of the single word Yes, written in india ink on primed canvas, hung horizontally just beneath the gallery's ceiling. The viewer was required to mount a stepladder, painted white, and to look at the painting through a magnifying lens, suspended from the frame. It was part of the work, as it was of much of Yoko Ono's art, then and afterward, that it required the participation of the viewer in order to be brought fully into being. Much of it, for example, had the form of instructions to the viewer, who helped realize the work by following the instructions, if only in imagination. The ladder/painting was a kind of tacit instruction, saying, in effect, like something in Alice in Wonderland, "Climb me." Somehow I love the fact that John Lennon was there at all, given what I imagine must have been the noisy public world of the Beatles, full of electric guitars and screaming young girls. Lennon climbed the ladder and read the word, which made a great impression on him. "So it was positive," he later said. "It's a great relief when you get up the ladder and you look through the spyglass and it doesn't say no or fuck you; it says YES." There was only the simple affirmative rather than the "negative... smash-the-piano-with-a-hammer, break-the-sculpture boring, negative crap. That 'YES' made me stay." It would be difficult to think of a work of art at once that minimal and that transformative.
"YES" is the name of a wonderful exhibition at the Japan Society, much of it given over to the works for which, other than to scholars of the avant-garde, Yoko Ono is almost entirely unknown. I refer to the work from the early sixties, a blend of Fluxus, Cage, Duchamp and Zen, but with a poetry uniquely Ono's own. The most innovative of the early works are the Instructions for Paintings, which tell the viewer what to do in order for the work to exist. These have the form of brief poems. Here, for example, is the instruction for a work called Smoke Painting:
Light canvas or any finished painting
with a cigarette at any time for any
length of time.
See the smoke movement.
The painting ends when the whole
canvas or painting is gone.
Here is another, called Painting in Three Stanzas:
Let a vine grow.
Water every day.
The first stanza--till the vine spreads.
The second stanza--till the vine withers.
The third stanza--till the wall vanishes.
Now these are instructions for the execution of a work, not the work itself. They exist for the purpose of being followed, like orders. In formal fact, the instructions are very attractive, written out in gracious Japanese calligraphy by, as it happens, Yoko Ono's first husband, Ichiyanagi Toshi, an avant-garde composer. The conception was hers, but whose handwriting inscribes that conception is entirely external to it. Nothing could be closer to Duchamp's idea of removing the artist's hand from the processes of art. Duchamp was interested in an entirely cerebral art--the object was merely a means. And so these attractive sheets of spidery writing are merely means: The work is the thought they convey. "Let people copy or photograph your paintings," Ono wrote in 1964. "Destroy the originals." So the above instructions, in numbers equal to the press run of The Nation plus however many pass-alongs or photocopies may be made of this review, are as much or as little of "the work" as what you would see on the walls of the gallery. The question is not how prettily they are presented or even in what language they are written. The question is how they are received and what the reader of them does to make them true: The instructions must be followed for the work really to exist.
So how are we to comply? Well, we could trudge out to the hardware store, buy a shovel, pick up a vine somewhere, dig a hole, plant the vine, water it daily--and wait for the wall against which the vine spreads to vanish. Or we can imagine all this. The work exists in the mind of the artist and then in the mind of the viewer: The instructions mediate between the two. At the Indica Gallery, Ono exhibited Painting to Hammer a Nail. A small panel hung high on the wall, with a hammer hanging from its lower left corner. Beneath it was a chair, with--I believe--a small container of nails. If you wanted to comply with the implicit instructions, you took a nail, mounted the somewhat rickety chair, grasped the hammer and drove the nail in. At the opening, Ono recalls, "A person came and asked if it was alright to hammer a nail in the painting. I said it was alright if he pays 5 shillings. Instead of paying the 5 shillings, he asked if it was alright to hammer an imaginary nail in. That was John Lennon. I thought, so I met a guy who plays the same game I played." Lennon said, "And that's when we really met. That's when we locked eyes and she got it and I got it and, as they say in all the interviews we do, the rest is history."
Jasper Johns once issued a set of instructions that became famous: "Take an object./Do something to it./Do something else to it." Ono's version would be "Imagine an object./Imagine doing something to it./Imagine doing something else to it." Ono's enthusiasts like to say how far ahead of her time she was, based on some entirely superficial parallels between her Instructions for Paintings and certain works of Conceptual Art, which also consisted of words hung on the wall. Thus in 1967 Joseph Kosuth composed a work that reproduced the definition of the word "Idea" as it appears in a dictionary. The title of the work is Art as Idea as Idea. The work of art is the idea of idea (Spinoza--profoundly--defined the mind as idea ideae). For reasons entirely different from Ono's, Kosuth was bent on transforming art into thought.
Art historians are always eager to establish priority, usually by finding resemblances between works that have little to do with one another. In truth, Ono was precisely of her own time. It was a time when the very idea of art was under re-examination by artists. Works of art can never have been more grossly material--heavy, oily, fat--than under the auspices of Abstract Expressionism. But the aesthetic experiments of Cage, of Fluxus and of Yoko Ono were not, in my view, addressed to the overthrow of Abstract Expressionism. They were rather applications of a set of ideas about boundaries--between artworks and ordinary things, between music and noise, between dance and mere bodily movement, between score and performance, between action and imagining action, between artist and audience. If the impulse came from anywhere, it came from Zen. Cage was an adept of Zen, which he transmitted through his seminars in experimental composition at The New School. Dr. Suzuki, who taught his course in Zen at Columbia, was a cult figure for the art world of the fifties. Yoko Ono had absorbed Zen thought and practice in Japan. The aim of Zen instructions was to induce enlightenment in the mind of the auditor, to transform his or her vision of world and self. The aim of Ono's instructions was similarly to induce enlightenment in the mind of the viewer--but it would be enlightenment about the being of art as the reimagination of the imagined. In her fine catalogue essay, Alexandra Munroe, director of the Japan Society Gallery, writes, "Asian art and thought were the preferred paradigm for much of the American avant-garde." Abstract Expressionism and the New York avant-garde exemplified by Cage, Fluxus and Ono belong to disjointed histories that happened to intersect in Manhattan at the same moment.
At the time of their marriage, Ono said that she and John Lennon would make many performances together, and the fact that Lennon set foot in the Indica Gallery in the first place and engaged with Yoko Ono in that atmosphere implies that he found something in art that was lacking in the world of popular music, for all his great success. It is characteristic that for him, art meant performance--not painting on the side, which was to become an outlet for his fellow Beatle Paul McCartney (there is an exhibition of McCartney's paintings making the rounds today). What Ono offered Lennon was a more fulfilling way of making art, and inevitably she was blamed for the dissolution of the band. What Lennon offered Ono was a way of using her art to change minds not just in terms of the nature of art and reality but in terms of war and peace. In 1968 Yoko Ono declared that "the art circle from which I came is very dead, so I am very thrilled to be in communication with worldwide people." One of Yoko Ono's most inspired pieces was her White Chess Set of 1966 (a version of which, Play It By Trust, can be seen in the Japan Society lobby). Instead of two opposing sides, one black and one white, she painted everything--the board and the pieces--white. Since one cannot tell which pieces belong on which side, the game quickly falls apart. "The players lose track of their pieces as the game progresses; Ideally this leads to a shared understanding of their mutual concerns and a new relationship based on empathy rather than opposition. Peace is then attained on a small scale." But with Lennon, she and he could attempt to achieve peace on the largest scale--could use art to transform minds. In 1969, for example, they enacted their Bed-in for Peace. The tremendous widening of the concept of art earlier in the decade made it possible for being in bed together to be a work of art. 
The press was invited into their hotel bedrooms, gathered around the marital bed, to discuss a new philosophy in which, as in White Chess Set, love and togetherness replaced conflict and competition. In the same year the couple caused billboards to be erected in many languages in many cities, as a kind of Christmas greeting from John and Yoko. The message was WAR IS OVER! (in large letters), with, just beneath (in smaller letters), IF YOU WANT IT. There was no definite article: The sign was not declaring the end of the Vietnam War as such but the end of war as a human condition. All you have to do, as their anthem proclaimed, was GIVE PEACE A CHANCE. Get in bed; make love, not war.
There is a somewhat darker side to Ono's work than I have so far implied. In a curious way, her masterpiece is Cut Piece, a performance enacted by her on several occasions, including at Carnegie Recital Hall in 1965. Ono sits impassively on the stage, like a beautiful resigned martyr, while the audience is invited to come and cut away a piece of her clothing. One by one, they mount the stage, as we see in a video at the Japan Society, and cut off part of what she is wearing. One of the cutters is a man, who cuts the shoulder straps on her undergarment. The artist raises her hands to protect her breasts, but does nothing to stop the action. Ideally the cutting continues until she is stripped bare. I find it a very violent piece, reminding me somehow of Stanley Milgram's experiment in psychology, in which people are encouraged to administer what they believe are electrical shocks to the subject (who pretends to be in agony). The audience has certainly overcome, a bit too gleefully, the gap between art and life--it is after all a flesh-and-blood woman they are stripping piecemeal with shears. It reveals something scary about us that we are prepared to participate in a work like that.
Another film, Fly, shows a housefly exploring the naked body of a young woman who lies immobile as the fly moves in and out of the crevices of her body, or moves its forelegs, surmounting one of her nipples. The soundtrack is uncanny, and we do not know if it is the voice of the fly, the suppressed voice of the woman or the weeping voice of an outside witness to what feels like--what is--a sexual violation. It is like the voiced agony of a woman with her tongue cut out. The sounds are like no others I have heard. Yoko Ono is a highly trained musician who gave her first concert at 4 and who sang opera and lieder when she was young. But she is also a disciple of Cage and an avant-garde singer who uses verbal sobs, damped screams, deflected pleas, to convey the feeling of bodily invasion.
Yoko Ono is really one of the most original artists of the last half-century. Her fame made her almost impossible to see. When she made the art for which her husband admired and loved her, it required a very developed avant-garde sensibility to see it as anything but ephemeral. The exhibition at the Japan Society makes it possible for those with patience and imagination to constitute her achievement in their minds, where it really belongs. It is an art as rewarding as it is demanding.
December 8, 2000: It was twenty years ago today that Mark David Chapman shot and killed John Lennon outside the Dakota on West 72nd Street in New York City, bringing whatever was left of the sixties to a definitive and miserable end. Yet Lennon lives on--not just for his now-graying fans, not just for younger kids discovering the Beatles, but in some unexpected ways.
Case in point: At the Republican National Convention in Philadelphia this past August, as Dick Cheney stepped up to the podium to accept the party's nomination as vice presidential candidate, the band struck up a spirited version of Lennon's song "Come Together." This is the one on the Abbey Road album that begins "Here come ol' flattop" (Cheney of course is mostly bald), and continues, "One thing I can tell you is you got to be free"--a sixties sentiment that meant something quite different from tax cuts for the rich.
Cheney probably didn't know that Lennon started writing "Come Together" as a campaign song--for Timothy Leary's planned 1970 campaign for California governor against Ronald Reagan. Leary never used the song, but Lennon sang it live onstage at Madison Square Garden in 1972 in the midst of another presidential campaign, when Nixon was trying to have him deported to silence a prominent voice of the antiwar movement. Lennon changed the title line to "Come together--stop the war--right now!" and the audience cheered wildly.
The Democrats also played a Lennon song at their convention: They used "Imagine" as the theme of a tribute to Jimmy Carter. While the giant video showed Jimmy and Rosalynn hammering nails and fondling small children, the easy-listening version of Lennon's song omitted the words "Imagine there's no heaven/it's easy if you try/No hell below us/Above us only sky"--not really appropriate for America's first born-again Baptist President.
"Imagine" is a utopian anthem, and the utopian imagination was always a keystone of sixties New Left thought, distinguishing it from the bread-and-butter politics of traditional working-class socialism. "Power to the imagination" was a key slogan written on the walls in May '68. Today the country is full of billboards urging people to "Dial 1-800-imagine." I tried it. You don't get John Lennon singing "Imagine no possessions." Instead you get AT&T Wireless Services: Press 1 to upgrade your wireless plan, press 2 to inquire about new service, press 3 to inquire about an order and, of course, press 4 to hear these options again.
A search of the Nexis database found these variants on Lennon's "Imagine no possessions": a Republican who said "Imagine no estate tax," a television critic who wrote "Imagine no more Regis," a technophobe who wrote "Imagine no computers" and a Democratic pundit who headlined an opinion piece, "Imagine There's No Nader."
Lennon lyrics appear in print in some other unlikely places. When Time put Bill Clinton on its cover at the beginning of his first term, the cover line was "You Say You Want a Revolution." Two years later, when the Republicans won control of the House, the New York Times ran an opinion piece by R.W. Apple Jr. headlined "You Say You Want a Devolution." And just a few months ago, after Joe Lieberman changed his mind about privatizing Social Security, The New Republic headline read "You say you want an Evolution."
The headline writers probably had forgotten that Lennon wrote "Revolution" in response to the May '68 uprisings in Paris, criticizing student radicals for advocating violence. He recorded two versions of the song. The single--the "fast" version--came first. It was recorded on May 30, 1968, and released in the United States in August, shortly after the police riot at the Democratic National Convention in Chicago. After the opening line--"You say you want a revolution"--it concluded, "count me out." The radical press was outraged. Ramparts called the song "a betrayal"; New Left Review called it "a lamentable petty bourgeois cry of fear." Time, on the other hand, reported that the Beatles had criticized "radical activists the world over," which Time found "exhilarating." The second, "slow" version of the song was released on the White Album two months later. Now, after the line "count me out," Lennon added another word: "in." He later explained, "I put both in because I wasn't sure." A year later he was singing "Power to the People."
Lennon's "Give Peace a Chance" was sung by half a million antiwar demonstrators at the Washington Monument in 1969, but since then it's come in for some revisionism. I remember militant friends back in those days singing "Give the dictatorship of the proletariat a chance." Then there's "Give War a Chance," which pops up every once in a while--the establishment journal Foreign Affairs used it as the title of a 1999 article by Edward Luttwak arguing against US intervention in local conflicts. Frontline broadcast a story on the Balkans in 1999 with the same title, and P.J. O'Rourke used Give War a Chance as the title for a book that became a bestseller. On the other hand, none other than Trent Lott uttered the words "give peace a chance" on the floor of the Senate--talking about Kosovo. Finally, a company called Peace Software (www.peace.com) is using the slogan "Give Peace a Chance."
Lennon's most intense and personal post-Beatle song, "God," a very slow track on his first solo album, contains a litany that concluded, "I don't believe in Beatles." The New York Times ran a full-page interview in September with Philip Leider, the founding editor of ArtForum, that included his own personal version of the lyrics, which took up twenty-three lines of our newspaper of record. Warhol came first: "I don't beleeve in Andy." Then: "I don't beleeve in Haring"; "I don't beleeve in Fischl"; "I don't beleeve in Koons"; and so on through nineteen more current art stars.
Several of Lennon's most memorable lines have not been appropriated by pundits or Op-Ed types: "Instant Karma's gonna get you" remains untouched, at least according to Nexis, and thus far nobody has found a way to use "I am the walrus, goo-goo g'joob." But aside from these notable exceptions, the conclusion is clear: John Lennon may be gone, but twenty years after his death his words and ideas are here, there and everywhere.
Long before Carrie-Anne Moss rips open Val Kilmer's shirt and begins pounding his chest, providing him with a version of CPR that she must have learned from a Japanese drum troupe, the makers of Red Planet have resorted to their own thumpings and flailings, as if to resuscitate a film that's gone limp. It's a panic response, coming from people who have realized too late that the hookup of a radio would be a high point of their picture.
Their script has stuck Moss in a stricken spaceship that's orbiting Mars; by this point, her comrades Kilmer and Tom Sizemore have been marooned, incommunicado, on the planet's surface. So when the boys stumble upon an old circuit board in the dust, it's time for high-energy drama. "Let's do it!" shrieks Sizemore, as if he were starting the Indy 500. With a roar, guitars and drums begin pounding away on the soundtrack. Kilmer, in closeup, damn well solders a wire, sending a meteor shower's worth of sparks across the screen--at which point, back on the spaceship, Moss decides to strip down to a sleeveless T-shirt, giving us a much better view of her breasts.
I'm really grateful for the breasts. If not for them, I might have fallen asleep and missed the climactic scene, in which Kilmer performs a diagnostic check on a computer.
If only the makers of Red Planet had trusted in their story's essential schleppiness! Then, instead of giving us this lumbering, expensive beast, they might have realized the small but halfway-clever idea that's still dimly visible within: a story about the heroism-by-default of a spaceship janitor.
The character in question, a fellow named Gallagher, holds the job title of mechanical systems engineer; but to the rest of the personnel on this flight to Mars, that's like saying he's the guy who fixes the toilets. "It's high school," he remarks to a fellow civilian in the crew, after being brushed back by a swaggering NASA pilot. "They're the jocks, and we're the nerds." Just so. When he bumps into Moss--the ship's commander--on her way out of the unisex shower, Gallagher can think of nothing better to do than fumble with his fingers and blush. Later, when the outcome of the mission comes to rest on him, Moss has to give him a pep talk before he'll even get to his feet. Yet he's the guy who must save Earth from destruction and consummate a rendezvous with those breasts. What a role for Steve Buscemi! How the hell did it go to Val Kilmer?
He's good, of course. Kilmer is always good--but he's a guy who previously played Jim Morrison, Elvis and Batman. The only thing that's nerdlike about him is the hairdo he's been given for this picture, which is brushy and yellow and makes him look as if he's in crying need of a conditioner. Mind you, the premise of Red Planet is that all of Earth needs a conditioner. After these many years of environmental degradation, we've burned out our world and must colonize someplace else. Hence the desperate and very expensive project, in the year 2057, of sending Moss and her crew to Mars. Wouldn't it have been cheaper, as well as more practical, to institute a few conservation measures instead? No doubt. But humans, according to this movie, lack much capacity for self-discipline and forethought, and so must splurge on stupid but spectacular stunts. As if to prove this point, the producers have done their own splurging and hired Kilmer--the actorly equivalent of a rocket to Mars, compared with Buscemi's compost heap.
As they cast the lead, so too did they decide to ladle on the excitement: pounding guitars, sleeveless T-shirts, unmotivated shrieks. How were these choices made? I can venture a guess. The credits for Red Planet list three producers and two executive producers. This is a fairly standard aggregation in today's movie business; and with so many big shots keeping themselves busy on the picture, how could a mere idea survive? The story, written by a lone guy named Chuck Pfarrer, was almost sure to be buried alive; and into the dirt with it went a few other notions.
One of them might have involved some sexual role-play, based on the fact that the only females in the story are Moss, the shipboard computer (named Lucille) and a navigation robot called Amee. "She's my kind of girl," Gallagher says of the robot, just before it goes into killer mode. (It was designed for the Marines.) Somebody, maybe Pfarrer, seems to have wanted the nerdy Gallagher to feel ambivalent toward strong women: attracted to them when they shower, threatened by them when they turn into whirring kung-fu machines.
But since the production is at war with its own screenplay--have I mentioned that Red Planet is directed, more or less, by Antony Hoffman?--this kinky little idea is no better realized than the movie's religiosity. As far as I'm concerned, it's just as well that this latter theme gets only lip service. Ever since 2001: A Space Odyssey, Earthlings in Outer Space have sought God, and found light shows. At least Red Planet spares us that final cliché--though it still makes us listen to a lot of spiritual blather.
Those Deep Thoughts are provided by Terence Stamp, who manages to be the crew's world-famous scientist despite having abandoned rationalism. Science cannot provide the answers he craves, Stamp explains to a sweetly patient Kilmer, and so he has turned to religion. Kilmer obligingly spends the rest of the picture looking for a divine purpose--which doesn't seem so misguided, considering the level of scientific expertise around him. When the crew's biologist (Sizemore) discovers a life form on Mars, he cries out, "Nematodes!" Either he's forgotten his Linnaeus--nematodes are worms--or else the solution to God's mysteries is to be found not in Outer Space but in the pages of old sci-fi magazines. These creatures are clearly arthropods: the genre's usual bugs.
Fans of the platoon-in-space movie will want to know that the Mars scenery is furnished with the necessary rocks, peaks and ravines. Fans of Carrie-Anne Moss--meaning the adolescent boys, of whatever age, who admired The Matrix--will want to know that here, too, she gets to fly around. Not every actress is suited to antigravity; and so, until such time as Moss gets the chance to deliver a performance, I will congratulate her on giving good float.
As a memorial tribute to Vincent Canby, the "Arts & Leisure" section of the New York Times recently published half a page of excerpts of his prose, as selected by The Editors. Implacable beings of ominous name! With grim rectitude, they shaped a Canby in their image, favoring passages where he had laid down principles of the sort that should be cited only under capitalization. These were Sound Judgments.
For those of us who admired Mr. Canby (as the Times would have called him while he was alive, and as I will continue to call him, knowing how the style fit the man), soundness of judgment was in truth a part of his merit. A hard man to fool, he could distinguish mere eccentricity from the throes of imaginative compulsion, the pleasures of pop moviemaking from the achievements of film art; and when he was offered sentimentality in place of feeling, his heart didn't warm, it burned. These powers of discernment allowed him to bear with extraordinary grace the responsibility of being the Times critic. They also did much to create that responsibility, since it was his sureness, as much as the institutional weight of the Times, that made Vincent Canby so influential.
That said, I confess I read him to laugh. At present, I can give only tin-eared approximations of his wisecracks--correct and ample quotation will become possible when someone smart decides to publish a Vincent Canby anthology--but I can hardly forget his review of Salome's Last Dance. This picture was the latest chapter in Ken Russell's phantasmagorical history of sex in the arts, or the arts in sex. Mr. Canby's lead (more or less): "As the bee is drawn to the flower, as the hammer to the nail, so Ken Russell was bound to get to Oscar Wilde."
I also recall Mr. Canby's description of the used car that Jim Jarmusch peddled to the title characters in Leningrad Cowboys Go America. It looked, he said, as if it had been dropped from a great height. Writing about I've Heard the Mermaids Singing, a film of relentlessly life-affirming whimsy, he claimed he'd been cornered by a three-hundred-pound elf. A typically self-regarding, show-offy performance by Nicolas Cage (was it in Vampire's Kiss?) inspired him to write that other actors must enjoy working with this man about as much as they'd welcome being shut up with a jaguar. And once, when forced to think up copy about his umpteen-thousandth formula movie, he proposed that the only way to derive pleasure from such a picture would be to play a game with yourself, betting on whether you could guess what would happen next. "As you win," he wrote, "you lose."
From these few and random examples, you may conclude that Mr. Canby's principles often emerged with a deep-voiced chuckle, and that they involved matters that went far beyond the movies. Some of these concerns were political in the specific sense, as when he gave a favorable review to Alex Cox's Walker: a film that offered a burlesque insult to US supporters of the Nicaraguan contras, in government and at the Times. His concerns were also political in a broader sense. Witness the 200 words he devoted to a little African-American picture titled Love Your Mama: a heartfelt, thoroughly amateurish movie produced in Chicago by some people who had hired an industrial filmmaker to direct their script. While quietly letting his readers know that they probably would not want to watch this film, Mr. Canby conveyed a sense that real human beings, deserving of respect, had poured themselves into the project.
Of course, the best places in which to seek Mr. Canby's principles were within the films he championed. He would have earned his place in cinema history (as distinct from the annals of journalism) had he done nothing more than support Fassbinder's work. And yet I'm not surprised that The Editors found no space to reprint Mr. Canby's writings on this crucial enthusiasm. Fassbinder, like his critic, was preternaturally alert to political and social imposture, to the bitter and absurd comedy of human relationships, and also (for all his laughter) to the pain and dignity of those who go through life being pissed on. Mr. Canby recognized in Fassbinder's work all these qualities and more (such as the presence, in the person of Hanna Schygulla, of one of cinema's great fantasy objects); but these matters seem to have been judged too unruly for an "Arts & Leisure" tribute.
Now, I've been allowed to do some work for "Arts & Leisure" and have received from my editors nothing but aid and kindness. Surely the people I've dealt with at the Times would have chosen excerpts from Mr. Canby that were funnier, sharper, more challenging. So maybe, when the Times moves to memorialize somebody as one of its own, a higher level of control takes over. It's as if the paper means to show its own best face--or rather the image it wants to see in the mirror, urbane and solid--and never mind that man in the old tweed jacket.
This tendency of the institution to eclipse the individual figures prominently in a new book by another major film critic, Jonathan Rosenbaum. By "major," I mean that Rosenbaum is highly regarded by other reviewers and film academics, and that he's gained a certain public following (concentrated in Chicago, where he serves as critic for the Reader). But if you were to ask him how he fits into American film culture in particular and US society in general, he would locate himself, quite accurately, on the margins. As his friends will tell you (I hope I may count myself among them), Rosenbaum is one of the angel-headed hipsters: a sweet-natured, guileless man, wholly in love with art and wholly longing for social justice. And for these very reasons, he has become the angry man of American film criticism, as you might gather from the title of his new work, Movie Wars: How Hollywood and the Media Conspire to Limit What Films We Can See (A Cappella, $24).
Rosenbaum argues--"argue," by the way, is one of his favorite words--that those American writers, editors and TV producers who pretend to cover film are for the most part hopelessly self-blinkered. It's in their interest to look at only those movies that the big American companies want to promote (including the so-called independent films that have been ratified by Sundance and Miramax). So journalism collaborates with commerce, instead of acting as a check on it; informed, wide-ranging criticism gets shoved to the side; films that might have seemed like news flashes from the outside world fail to penetrate our borders; and everyone excuses this situation by claiming that "the people" are getting the dumb stuff they want. Rosenbaum is enraged that moviegoers should be viewed with such contempt; he's infuriated that well-placed journalists should justify their snobbism (and laziness) by dismissing whatever films and filmmakers they don't already know about; and he's mad enough to name names.
In Movie Wars, Rosenbaum advances his arguments by means of a crabwise motion, scuttling back and forth between general observations (which are newly composed) and case studies (many of them published before, in the Reader and elsewhere). This means that some stretches of ground are covered two or three times. I don't much mind the repetition--even when the material shows up in a second new book by Rosenbaum, his excellent, unabashedly partisan monograph on Jarmusch's Dead Man (BFI Modern Classics, $12.95). I do worry that indignation, however righteous, has begun to coarsen Rosenbaum's tone and push him into overstatement.
When Rosenbaum is at his best, his extraordinary wealth of knowledge about cinema informs an equally extraordinary power of insight into individual pictures; and both these aspects of his thinking open into frequently astute observations of the world at large. You can get Rosenbaum at his best in his Dead Man monograph and in three previously published collections: Moving Places, Placing Movies and Movies as Politics (California). By contrast, Movie Wars is a sustained polemic, with all the crabbiness that implies.
It's a welcome polemic, in many ways. Most rants against the infotainment industry are on the level of Michael Medved's godawful Hollywood vs. America; they complain, in effect, that the movies tell us too much about the world. Rosenbaum recognizes the real problem, which is that our world (filmed and otherwise) has been made to seem small. I agree with much of what he says. But when, in his wrath, he digresses to settle scores or rampages past obvious counterarguments, I begin to wish that he, too, would sometimes pretend to be urbane and solid.
"There's a hefty price tag for whatever prestige and power comes with writing for The New York Times and The New Yorker," Rosenbaum says, "and I consider myself fortunate that I don't have to worry about paying it. Film critics for those publications--including Vincent Canby and Pauline Kael...--ultimately wind up less powerful than the institutions they write for, and insofar as they're empowered by those institutions, they're disempowered as independent voices."
To which I say, yes and no. As bad as the situation is--and believe me, it's woeful--I've noticed that news of the world does sometimes break through. David Denby, in The New Yorker, may contribute to American ignorance by being obtuse about Kiarostami (as Rosenbaum notes with disdain); but then, as Rosenbaum fails to note, Stephen Holden and A.O. Scott in the Times delivered raves to Taste of Cherry and The Wind Will Carry Us. Individuals in even the most monolithic publications still make themselves heard; and the exceptional writer can manage (at least in life) to upstage an entire institution.
Rosenbaum himself has pulled off that trick at the Reader; and Vincent Canby did it at the Times. To the living critic, and all those who share his expansive view of the world, I say, "We've lost a champion. Better stop grousing and pick up the slack." And to those who mourn Mr. Canby, I say, "You can still hear his laughter. Just don't let The Editors get in the way."
LOUIS ARMSTRONG AT 100
In 1927 a young cornetist led his band into a meticulously hilarious version of a classic composition Jelly Roll Morton had made famous, "Twelfth Street Rag." The recorded track sounds like the opening shot of a revolution--except that the revolution had already been in full swing in Louis Armstrong's head and hands for years.
Unlike most revolutions, though, from the first this one displayed an ingratiating, inviting sense of humor and charm. "Dippermouth," as his early New Orleans pals dubbed him, used the rag as a flight vehicle: As his horn fractures the tune's familiar refrains, the precise, cakewalking rhythmic values of ragtime suddenly coil and loop and stutter and dive, the aural equivalent of a bravura World War I flying ace in dogfighting form. Every time Armstrong comes precariously near a tailspin, he pulls back the control stick and confidently, jauntily, heads off toward the horizon, if not straight into another virtuosic loop-de-loop. The cut is from an astonishing series of recordings Armstrong made in 1925-28 that amount to the jazz-creating legacy of his Hot Fives and Hot Sevens, a succession of studio groups that virtually never performed live. And now, in time for his centennial--he claimed he was born in 1900 but wasn't--it's all been reissued.
The relentless joy brimming in the sound of young Satchelmouth's horn, the glorious deep-blue and fiery-red Whitmanesque yawp of it, has an undeniably self-conscious edge to it. Ralph Ellison and Albert Murray first pointed out a half-century ago that it is also the sound of self-assertion, a musical realization of the double consciousness W.E.B. Du Bois posited for African-Americans. Within this compound of power and pain, a racial revisitation of the master-slave encounter in Hegel's Phenomenology of Spirit, Du Bois explained that African-Americans were inevitably alienated, standing both inside and outside mainstream American culture and its norms, prescriptions, hopes, dreams. Such alienation, Du Bois pointed out, could cripple black Americans by forcing them to internalize mainstream cultural values that held them to be less than human, but it could also liberate the brightest of them. The "Talented Tenth," as he called this group, could act on their perceptions of the contradictions between the high ideals grounding basic American cultural myths (for example, that society believed "all men are created equal," as the Declaration of Independence puts it) and gritty daily reality, where blacks were not exactly welcomed into concert halls, schools, restaurants or the front of buses.
In the bell of Armstrong's barbaric (which means, in the sense Whitman inherited from Emerson, non-European) horn is the sound of a new, all-American culture being forged from the stuff of the social sidelines. In 1957 Ellison wrote to Murray,
I've discovered Louis singing "Mack The Knife." Shakespeare invented Caliban or changed himself into him. Who the hell dreamed up Louis? Some of the bop boys consider him Caliban, but if he is, he is a mask for a lyric poet who is much greater than most now writing. Man and mask, sophistication and taste hiding behind clowning and crude manners--the American joke, man.
Armstrong himself was no naïve artist; he certainly wasn't a fool. From his earliest days he saw race as a key issue in his life, his art and his country, with a wit and understanding evident in his music. As he wrote of the man who made "Twelfth Street Rag" famous and proclaimed himself jazz's inventor, "Jelly Roll [Morton] with lighter skin than the average piano players, got the job [at New Orleans's leading whorehouse, Lulu White] because they did not want a Black piano player for the job. He claimed he was from an Indian or Spanish race. No Cullud at all. They had lots of players in the District that could play lots better than Jelly, but their dark Color kept them from getting the job. Jelly Roll made so much money in tips that he had a diamond inserted in one of his teeth. No matter how much his Diamond Sparkled he still had to eat in the Kitchen, the same as we Blacks."
In The Omni-Americans, Murray explains how Armstrong's music limned human talents needed in the frenetic, fast-changing twentieth century. Drawn from the pioneer, Indian and slave, the key American survival skill was improvisation, the soloist's ability to mesh with his surroundings. Ellison's Invisible Man uses Armstrong's version of "Black and Blue," a tune from the 1929 Broadway revue Hot Chocolates, to demonstrate the Du Boisian nature of improvising as epistemological tool.
This was the lesson Armstrong started teaching in the Jazz Age, when flappers reigned and sexual emancipation knocked at the door of mainstream culture, when the Harlem Renaissance redefined African-Americans, when Prohibition created a nation of outlaws who, thanks to associating with booze and gangsters and the demimonde's jazz soundtrack, saw that Negroes, as they were called, were subject to legal and extralegal restrictions and prejudices more arbitrary and inane than the constitutional amendment forbidding booze.
The elastic rhythms and fiery solos on the sides by the Hot Fives and Hot Sevens spoke to these people. On tune after tune, Armstrong cavorts and leaps and capers over and around his musical cohorts with the playful self-possession of a young and cocky top cat. Nothing can hold him down. He traverses keys and bar lines and rhythms with impunity, remolding them without missing a step.
"Black and Blue"--originally written as a lament by a dark-skinned gal for her man, who's attracted to high-yellow types--made him a star. Armstrong's brilliant, forceful reading renders it as mini-tragedy, the musical equivalent of Shylock's speech in The Merchant of Venice. "My only sin," he sings in that growl that compounds the earthy humanity of the blues with an unflinching dignity (this is no grovel), "is in my skin/What did I do to be so black and blue?" The short answer: in America, nothing. The color line did it all.
Subversive and powerful, Armstrong's music was the fountainhead of the Jazz Age and the Swing Era, when jazz was America's popular music and the sounds of syncopated surprise filled the nation's dance halls while young folks skittered and twirled and flounced and leaped and broke out of lingering Victorian constraints to loose-limbed beats and blaring horns that emerged from America's Darktowns in New Orleans, New York and Chicago.
One of Armstrong's 1936 recordings is "Rhythm Saved the World." Like many, this banal tune is transformed by his syncopating personality. Its idea still echoes across America's teeming subcultures: Decades later, Parliament Funkadelic sang, "Here's my chance to dance my way out of my constrictions."
If Armstrong claimed he was born on July 4, 1900, who could blame him? As one of America's primary declarers of cultural independence--and interdependence--he should have been. But in his rich biography Satchmo, Gary Giddins (who insists that all American music emanates from Armstrong) proves that Louis's birth date was August 4, 1901. Armstrong and his sister were born in a hard district of New Orleans; their father left before either could remember him. In his early years Armstrong was raised by his grandmother, whom he credited with the Emersonian values--hard work, self-reliance, artistic daring coupled with personal amiability--that guided him. His mother may or may not have been a prostitute for a while; Louis returned to live with her when he was 5.
At 7 he quit school and went to work for a Jewish family, the Karnofskys, and picked up his first instrument--a tin horn. He'd been dancing and singing on the street for pennies with other kids, but working coal wagons with the Karnofsky sons, he learned to blow the cheap horn by putting his fingers together in front of the tube (he'd pulled off the mouthpiece). The boys encouraged him, their clients loved his melodies, and Little Louis, as he was known, had found his calling.
On January 1, 1913, he was busted for firing his stepfather's pistol, and sentenced to the Colored Waif's Home. There he joined the band and got his first musical training, which he characteristically never forgot. According to clarinet great Sidney Bechet, who in the 1920s was Armstrong's only peer as a virtuosic improviser, the cornet-playing young Louis mastered the chops-busting clarinet solo for "High Society" before his teens--an astounding feat that only hinted at what was to come.
Little Louis danced in second-line parades, following cornetist Joe "King" Oliver in the Onward Band as they wound through the Crescent City streets. Oliver was a catalytic force for Armstrong, who always insisted he learned his stuff from Papa Joe. When Oliver left for Chicago, following post-World War I black migration from the South to Northern and Western cities, he left Little Louis his slot in the Kid Ory band, which led the young cornetist to Fate Marable and the riverboats plying the Mississippi in 1920-21.
Marable, impressed by the young hornman's dazzling facility and ear, hired him for his riverboat band, and one of his sidemen trained the youngster to read and write music. What they played was a mix that would confound the Dixieland revivalists who decades later took Armstrong as their figurehead: adapted arias and classical overtures, quadrilles and other dance music, and the like. (Historian Dan Morgenstern has pointed out the suggestive influence of classical music on Armstrong.) At Davenport, Iowa, when the riverboat docked, a white kid named Bix Beiderbecke first heard Armstrong with Marable and decided to make the jazz cornet his life.
In 1922 Oliver sent for his protégé, who kissed his mother goodbye, packed the fish sandwich she made for him and headed north to Chicago. When he got to the Lincoln Gardens Cafe, where Oliver's band was wailing, he looked like a rube and was so shy he stayed by the door to watch. He couldn't believe he'd be playing with these masters of jazz. Yet in a very short time, first in recordings with them, then with his own Hot Fives and Hot Sevens, he would make them all sound like musical relics.
Rube or not--and his mode of dress quickly became Chicago-style sharp--Armstrong got the girl. His second wife, piano-playing Lil Hardin, married him while they were both playing with Oliver. Hardin was conservatory-trained and middle class, and for the next few years her ambition would drive the modest genius she married to make his mark in the rapidly exploding Jazz Age. Convinced that Oliver kept Louis in his band to keep him from fronting his own, Lil persuaded her husband to grab Fletcher Henderson's offer to join his New York-based big band. When Armstrong arrived in the Big Apple in 1924, Henderson's band was, as Morgenstern notes, "designed for Roseland's white dancing public...rhythmically stiff"; when he left fourteen months later, both arrangers and soloists were extending his sound.
It was Lil who persuaded Armstrong to go back to Chicago after scarcely more than a year in New York, and there he joined her band, then Carroll Dickerson's, and rocked the town. The night he returned, he was greeted by a banner she had unfurled over the bandstand: world's greatest trumpet player. Armstrong later told Morgenstern the reason he left Henderson's band was that the "dicty bandleader," college-educated, light-skinned and prone to look down on dark blacks, wouldn't let him sing, except occasionally for black audiences or for novelty and comic effect. Armstrong had been singing before he ever picked up a horn--it was a fundamental part of who he was and what he had to say. Ultimately, his vocals would make him a world-famous star. More immediately, they were another virtuosic tool he used to change jazz and, in the process, American culture.
Armstrong pioneered so many firsts in jazz and America that a list seems implausible. Here's a sample: He invented the full-fledged jazz solo. He invented scat singing. He introduced Tin Pan Alley and Broadway tunes as jazz's raw material. (When Armstrong replaced New Orleans standards and blues with Tin Pan Alley tunes in the 1930s, he forged the model followed by swing, jazz's most successful invasion of American pop music. His model was followed literally: Key arrangers like Don Redman, who worked for many bandleaders, including Benny Goodman, adapted Armstrong's runs and rhythmic moves to section-by-section big-band arrangements.) And Armstrong performed in interracial settings. Once, in New Orleans, when a bigoted announcer refused to introduce his band, he did it himself--so well that the radio station asked him to do it for the rest of the band's stint.
His voice engulfed America. Among his major disciples was Bing Crosby, who called him "the beginning and the end of music in America." His influence rippled across American popular and jazz singing like an earthquake. As he reconfigured pop tunes, the apparently natural force of his voice's cagey dynamics and loose rhythms seized the imagination of talents like Ella Fitzgerald and Billie Holiday, Crosby and Frank Sinatra.
With his last Hot Sevens recordings for Okeh in 1928, in which tunes like "I Can't Give You Anything But Love" were issued as B-sides, Armstrong had moved closer to the new American cultural mainstream he was inspiring. When he started recording for Decca in 1935, the impetus accelerated. A couple of interim managers gave way that year to Joe Glaser, a thuggish, mob-connected scion of a well-off Chicago family. He and Armstrong shook hands on a deal that lasted till they both died. As Armstrong put it, "A black man needs a white man working for him." It was the beginning of his big crossover into mainstream American culture--another Armstrong first in undermining de facto segregation in America. And his years at Decca were his workshop in change.
He fronted a big band, which critics hated and fans enjoyed. The outfit was run by Glaser, since Armstrong, who occasionally hired and fired personnel, didn't want to shoulder a bandleader's nonmusical burdens. And he agreed with Glaser on a new musical direction: setting his solos off in sometimes inventive, sometimes indifferent big-band charts; smoothing his blues-frog vocals into a more sophisticated sound without losing their rhythmic slyness--something he was also doing with his trumpet solos, reshaping his early frenetic chases after strings of high-altitude notes into less eye-popping, more lyrical solos.
Physical damage to Armstrong's lip and mouth from high and hard blowing forced the issue. Joe Muranyi, who played with him years later, says, "Part of the change in Louis's style could be attributed to the lip trouble he had in the early thirties. There are tales of blood on his shirt, of blowing off a piece of his lip while playing. This certainly influenced the way he approached the horn; yet what we hear on these tracks has at least as much to do with musical development as with physical matters." Limitation was, for Satchmo's genius, a pathway to a matured artistic conception. As Giddins argues forcefully in Satchmo, Armstrong had never separated art and entertainment; jazz for him was pop music. And if his bands irritated critics, there were plenty of gems, and besides, people loved him.
By World War II, his audiences were more white than black.
The war years broke the big bands. The culture had changed: Singers and small groups were hip. It was the era of a new sound, what Dizzy Gillespie called modern jazz and journalists dubbed bebop. Bop's frenetic, fragmented rhythms restated the postwar world's rhythms, and it deliberately presented itself not as entertainment but as art. The musicians forging it, like Gillespie and Charlie Parker, were fully aware of the stirring civil rights movement. World War II had fostered widespread entry of blacks into industry and the American military. Not surprisingly, after the war, they weren't willing to return to the old values of accommodation and deference. Instead, they demanded equality and freedom. In this context, boppers and their followers saw Armstrong's lifelong mugging and entertainment as Uncle Tom-ism rather than artistic expression.
The Dixieland revival, based in Chicago, occurred at about the same time. The (mostly white) revivalists needed an artistic figurehead. With a healthy historical irony they ignored, they chose Armstrong--the very soloist who blew apart old-style New Orleans polyphony, their idea of "pure" or "real" jazz. By 1947 Satchmo reluctantly abandoned his eighteen-piece outfit for the All Stars, a New Orleans-style sextet that included Jack Teagarden and Earl Hines. Though they often made fine music, the group was seen as a step backward by boppers. They jabbed at Satchmo, he jabbed back, and the split between revivalists and modernists escalated into a civil war that, in different stylistic and racial modes, still divides the jazz world.
Sadly, it was another Armstrong first. And his audiences began to turn lily-white. Giddins deftly shows Armstrong's world-famous onstage persona--the big grin, the bulging eyes, the shaking head, the brandished trumpet, the ever-present handkerchief, the endless vaudevillian mugging--to be an organic conception of the artist as entertainer. Still, from the 1950s until just before his death in 1971, Armstrong had to deal with accusations and slurs.
But if he never forgot who he was, while retaining his characteristically modest manner and only privately protesting how much he'd done to advance black civil rights, he could still be provoked, as President Eisenhower and the public discovered in 1957. Armstrong was poised to go on the first State Department-sponsored tour of the Soviet Union, a cold war beachhead by jazz. He abruptly canceled it because the Southern states refused to integrate schools, and he publicly excoriated Ike and America.
Early jazz musicians often refused to record, because they felt competitors could steal their best licks from their records. This was why the all-white Original Dixieland Jass Band made jazz's first records; black New Orleans trumpeter Freddie Keppard refused, fearing for his originality.
No one knows for sure how many recordings Armstrong made during the course of his half-century recording career. All agree, however, that he helped create both the art and the industry. After all, "race" records, especially Armstrong's hits, were as important as Bing Crosby's in saving the fledgling record companies from collapse during the Depression. (And there was more to it than that. Through the phonograph Armstrong made infinite disciples, shaping what jazz would become.)
During the 1950s and 1960s, when he was largely considered a period piece, Armstrong recorded important documents, like his meetings with Duke Ellington and Ella Fitzgerald. The best thing about them is their apparent artlessness, the easy, offhand creativity that was as much Armstrong's trademark as his trumpet's clarion calls. The pleasure is doubled by the response of his disciples.
Ella fits that description easily, since her trademark scat singing owes so much to Armstrong's. Yet she made it her own, purging scat of its overt blues roots. Producer Norman Granz supported them with his favorite Jazz at the Philharmonic stars--Oscar Peterson, Herb Ellis and Ray Brown. The results: Both Ella and Louis and Ella and Louis Again are incandescent yet low-key, full of generous pearls (from "Can't We Be Friends" to "Cheek to Cheek") that can almost slip by because of their understated yet consummate ease.
The 1961 session with Armstrong and Duke Ellington was hasty and almost haphazard, a simple melding of Ellington into Armstrong's All Stars; and yet it produced a wonderful, relaxed, insightful album. After all, Ellington had shaped his earliest bands around trumpeters and trombonists who could serve up Armstrong's New Orleans flair.
Like most postwar babies, I grew up knowing Louis Armstrong as the guy who sang "Mack the Knife" and, most famously, "Hello, Dolly!" It was only later that I'd discover the old blues stuff with singers like Bessie Smith, the Hot Fives, Ella and Louis, Fletcher Henderson and--one of my faves--Armstrong's accompaniment on early hillbilly star Jimmie Rodgers's "Blue Yodel No. 9." But even as a kid I felt strangely drawn to the little guy singing and grimacing on TV, wiping his perspiring brow with his trademark handkerchief. Although it all seemed corny, there was something, a hint of irony--though that wouldn't have been what his audiences, black or white, noticed unless they were old-timers who knew the ironic physical language or Satchmo fans or, like me, just a kid.
Why would a white kid in America catch a glimpse of Armstrong's abundantly joyful and potentially dangerous ironies? I'd love to claim precociousness, but it was a lot simpler. I could tell Armstrong was real because he filled the little blue TV screen so overwhelmingly that he made everything around him look, as it should have, fake.