Where the past isn’t even past.
This is part one of a series on the perils of privatization. Check back soon for the next installment.
Riding the buses in Chicago has been awfully fun this busy Christmas season. Half the time, it’s been free.
This fall, you see, after a series of delays, the city brought online a new fare payment system called “Ventra” in which customers tap “smart cards” against electronic readers at bus entrances and train station turnstiles. Only it turns out these cards are not so smart. Half the time, tap after tap after tap, the damned things don’t work, and the bus driver just exasperatedly waves you through. Although it hasn’t been as much fun for the passengers who exited the bus through the front door and discovered that, if their purses or backpacks brushed too close to the reader, they were charged twice. Or for a guy named Al Stern who became a local celebrity after receiving an e-mail on Friday, September 6, informing him he’d be receiving a Ventra card in the mail soon to replace his “Chicago Card,” which was the previous prepayment system (which worked fine). Four minutes later he got the same e-mail, then four minutes after that, then four minutes again after that, and so on and so forth all the way until the next morning. Twenty-four days later he arrived home to a pile of 91 envelopes shoved through his mail slot, each holding a Ventra card; the next day, 176 more arrived, each one, he later discovered, canceling the last. “You have to call and activate it,” he told Crain’s Chicago Business, “but I’ve been afraid to do that.”
The goofs accelerated, like Lucille Ball working at the assembly line, or Charlie Chaplin in Modern Times. At the end of November federal workers discovered they could “pay” fares by scanning their employee IDs. (“Please be advised that intentional misuse of federal credentials is prohibited,” the local EPA office wrote employees.) Other customers reported being double-, triple- or even quadruple-billed. At the end of October the Chicago Tribune reported, “CTA employees are being ‘verbally attacked daily by angry riders’ who are blaming them for problems with the new transit-fare payment system”; their union called on the Chicago Transit Authority to scrap Ventra until the bugs were worked out. The city started charging the vendor for the proliferating lost revenue, while that vendor kept promising serial “software fixes” that never seemed to do anything but introduce novel problems. At the end of November came news that when people tapped their wallets, like they used to do without problem with their old Chicago Cards, random credit or debit cards were charged. Then, last week, rush-hour outages began, and some buses stopped accepting fares altogether. That came two days after the headline “CTA: Ventra Glitches Mostly Gone But Delays Remain.”
It’s been our own municipal version of the Obamacare rollout—which means everyone should pay attention. For the root problem is exactly the same. Congressman Henry Waxman argued of the ACA’s glitches that “if anybody’s head should roll, it should be the contractors who didn’t live up to their contractual responsibility.” But that’s only half right. Consider the sign Harry Truman used to keep on his Oval Office desk: “The buck stops here.” The problem is not just the profusion of private contractors who do the public’s business so poorly; it’s the fact that the public’s business is being so relentlessly privatized by the government executives in charge. Slowly, the perceived imperative to privatize has become the political tail that wags the policy dog. The results are before us.
Why, indeed, was this massive change in how Chicagoans pay for their bus and train fares initiated in the first place? “What was wrong with the old system? It worked fine,” ran the first comment on the Chicago Tribune’s article on all the abuse the poor bus drivers are facing from frustrated customers. Ran the second, “I have never even heard a compelling argument as to WHY we needed a new system to begin with.”
Well, my friends, here’s your compelling argument: under the old system, rich investors didn’t get a piece of the action. Under this one, they most decidedly do.
The contract to replace Chicago’s fare payment system was awarded to the publicly traded corporation Cubic in 2011 by the previous mayor, Richard M. Daley, for $454 million, and implemented with alacrity by the current mayor, Rahm Emanuel. I’ll have much more to say about this company and its many dubious works in the next part of this series. For now, consider this. In a separate part of the project, Chicagoans are offered the following opportunity, as advertised on the back of their Ventra cards: “Go beyond transit. Call or go online to activate your Money Network® MasterCard® Prepaid Debit Account and use your Ventra Card for purchases, direct deposit, bill pay, and at ATMs.” This is how the City of Chicago intended to turn its millions of captive citizens over to the commercial banking industry: hoovering spare change from the pockets of Chicago’s marginal communities into corporate America’s overstuffed coffers.
Chicagoans who choose to turn bus cards into bank cards will be socked with hidden fees: $1.50 every time they withdraw cash from an ATM using their bus-card-cum-bank-card; $2.95 every time they add money using a personal credit card. Two dollars for every phone call with a service representative (or, oops, each “Operator Assisted Telephone Inquiry”). Two bucks for a paper copy of their account. An “account research fee” of $10 an hour.
At which point, learning about all that, they might cry, Help! Let me out of this “deal”! Well, that’ll cost them $6—a “balance refund fee.”
Now, turning your bus card into a bank card is optional—a program supposedly intended to help Chicago’s underbanked poor. I liked this observation, however, from a Tribune article last March: “It may be a tough sell, some experts said. Many low-income individuals are cash-centric in their spending habits because they are wise to the way of credit-card charges.” But not to fear, if you’re a Master of the Universe invested in one of the participating multinational banking concerns backing the play—MasterCard, First Data, or MetaBank. Even though Chicago’s impoverished might not make ready marks for the scam, Chicagoans who don’t choose the banking option will suffer hidden charges. There is, for instance, a $7 “dormancy” fee if you don’t use your transit card for eighteen months, with another $5 charge tacked on for every dormant month after that.
Note that I said hidden fees. How hidden? Well, the only reason Chicagoans learned about them was that the Chicago Tribune pored through more than 1,000 pages of legal boilerplate (for more on how impossibly complex user contracts rip off consumers generally, see my reporting here) and discovered them.
The reporters at the local CBS affiliate, meanwhile, reported that First Data got an “F” rating from the Better Business Bureau, with ninety-seven complaints filed against it over the past three years. MetaBank was recently ordered to pay $5.2 million by federal regulators for another public/private hustle, a now-discontinued program of issuing debit cards funded by tax refund loans at interest rates between 120 and 650 percent with a fee of $2.50 per $20 loaned.
The Trib noted, “Neither the CTA nor Cubic…nor MasterCard; nor MetaBank; nor First Data, which will issue the prepaid debit accounts, have disclosed the extra charges or how consumers can avoid them.” The city promised, though, that a “CTA public education campaign on all aspects of the Ventra card, including the fees, will start soon.” I must have missed it. Another fee that all transit users will have to bend over backward to dodge is the $5 it costs to get a transit card in the first place. That $5 is supposed to be refunded if you register your card online or by phone. I didn’t learn about that until I began researching this piece, and even then I was unable to make the site work on my decrepit old computer. Then they decided to waive the $5 fee, but only if you buy your card via phone or online, not at vending machines—which sort of compounds the insult against less-savvy customers or those with decrepit computers, doesn’t it?
Now, you might think poorer Chicagoans without even decrepit old computers should be able to hobble along paying for their fares like they always have, in cash. But single-ride tickets are going to be $3, instead of the normal $2.25—with what the CTA calls a seventy-five-cent “convenience fee” tacked on.
Chicagoans, these are the business partners your city has chosen. And this is the man in charge of protecting you from them: Forrest Claypool, president of the Chicago Transit Authority. Some protector. Claypool wrote that the “convenience fee” covers the cost of producing the disposable tickets and is “entirely avoidable to any and every customer” so long as they purchase and register Ventra cards. But as I noted above, poor people tend to use cash because they don’t trust cards—reasonably enough. Still, the black-hearted bureaucrat “bristled at the characterization by many critics that cash-payers are being penalized…‘There is no $3 cash fare,’ he said. ‘The $3 is if a person chooses a disposable, one-ride ticket. It has nothing to do with cash.’ ”
The editors of Crain’s Chicago Business, no Bolsheviks they, disagree with his blithe assessment. They say “the embedded fees could prey upon the least financially savvy among us, those for whom a $5 charge here and a $7 fee there add up to real money… They rely on people’s ignorance to take in money. And that’s not the kind of business our public servants ought to be in.”
So, so naïve, Crain’s Chicago Business. That’s precisely the business our public servants want to be in—“public” servants like Mayor Emanuel. As Chicago anti-privatization activist Tom Tresser explained to me last summer, “We have a massive global movement of capital which, because they’ve burned their own fucking houses down through their own greed, don’t have the gilt returns that they’re used to receiving…. So the new guaranteed annual returns that big business and big capital are looking for is our assets.”
Next time, I’ll focus on the particular big business behind the fare collection system in Chicago—and in Sydney, Vancouver and London, too. I guarantee it will be eye-opening.
It’s one of the most frequent questions people ask me about conservatives: “When they say X, do they really mean it?” Does, for example, Rick Santorum really mean it when he says about Nelson Mandela, as he did in a recent interview with Bill O’Reilly on Fox News, “He was fighting against some great injustice, and I would make the argument that we have a great injustice going on right now in this country with an ever-increasing size of government that is taking over and increasing people’s lives—and Obamacare is front and center in that”? And I have to answer that largely, as astonishing as it may seem, they do.
Never mind that the size of government is not “ever-increasing” (see here). Empirical debunking cannot reach the deepest fear of the reactionary mind, which is that the state—that devouring leviathan—will soon swallow up all traces of human volition and dignity. The conclusion is based on conservative moral convictions that reason can’t shake.
Recently the outstanding political reporter Brian Beutler, now writing for Salon, wrote in a piece headlined “Right-Wing Extremists Face New Moral Conundrum” that as long as Healthcare.gov isn’t working like it is supposed to, Republicans could “ignore the moral imperative they face” to help their constituents get healthcare. “A working site that can service a million people a day destroys that excuse. Some conservative groups have been craven and reckless enough to actively discourage people from enrolling in Affordable Care Act coverage.” I guarantee, though, that few or no conservative politicians are losing sleep over this. Instead, they judge themselves heroes. Waylaying their constituents’ ability to avail themselves of federally subsidized healthcare is not a “moral conundrum” for them. It is a deeply moral project. The immorality, as they see it, would be to allow people to become dependent on the state for their health.
I’ve been repeating myself, but clarity is very important here: know thine enemy. (OK, we’re liberals; we don’t have enemies. Know thine adversary.) Theirs is a morality entirely incommensurate with liberalism—but it is a morality.
One of its theorists was the Christian reconstructionist theologian Dr. Rousas J. Rushdoony. He wrote in his 1972 book The Messianic Character of American Education that since “the nuclear family is the basic unit of God’s covenant,” undermining the vaulting ambitions of the secular state was a godly duty. But you don’t have to be a Christian Reconstructionist or advocate, as Rushdoony did and his followers do, returning to biblical punishments like stoning to share the same intuition. Even mild-mannered Gerald Ford, usually not judged a frothing right-winger, used to love the nostrum “A government big enough to give everything to you is big enough to take everything away.”
Relying on government is slavery: it’s a consistent trope within modern conservatism. We see it today from the extremist doctors who refuse to “submit” to being reimbursed for their services by Medicaid, or even by government-tainted private insurance companies. They’re organized in a 4,000-member group called the American Association of Physicians and Surgeons. Senator Rand Paul is a member; its website features him asserting that if you believe in a “right to health care,” then “you believe in slavery.” And what kind of moral person believes in slavery?
Then there’s the old saw that the deal the Democrats supposedly offer African-Americans—you vote for us; we give you free stuff—returns them to “slavery.” The first use of that metaphor I’ve identified was by Ronald Reagan in 1968. A black reporter asked him why there were so few blacks at Republican events. The California governor politely but forcefully replied that it wasn’t Republicans who were racist but the supposedly liberal Democrats who “had betrayed them…. The Negro has delivered himself to those who have no other intention than to create a federal plantation and ignore him.” The New York Times reported, “Reagan handled the situation so smoothly that some of the newsmen aboard his chartered 727 suggested, half-seriously, that the Reagan organization had set up the incident.”
What does this insight—that conservatives are immune to charges of “immorality” when it comes to denying citizens government services because they believe “hooking” people on government services is profoundly immoral—mean in terms of practical politics? For one thing, that Democrats will never get political credit from conservatives for downsizing or “reinventing” government. Just to speak of the state as something other than the source of all evil is enough to send chills down right-wing spines. Had JFK lived to give the speech he was scheduled to give at the Dallas Trade Mart on November 22, 1963, he intended to set conservatives straight: “At a time when the national debt is steadily being reduced in terms of its burden on our economy, they see that debt as the greatest single threat to our security. At a time when we are steadily reducing the number of Federal employees serving every thousand citizens, they fear those supposed hordes of civil servants far more than the actual hordes of opposing armies.” But I don’t think Texans were going to turn in their John Birch Society membership cards, the scales falling from their eyes, when they learned of the facts.
And given the rank anti-empirical irrationalism that undergirds such convictions, it’s not like the White House can now avoid brickbats by somehow “submerging” progressive action by the state. Which, unfortunately, is what the Obama Administration has habitually tried to do. “Design of the Affordable Care Act,” as The Washington Post reported in a major investigation last month, “was hampered by the White House’s political sensitivity to Republican hatred of the law—sensitivity so intense that the president’s aides ordered that some work be slowed down or remain secret for fear of feeding the opposition. Inside the Department of Health and Human Services’ Centers for Medicare and Medicaid, the main agency responsible for the exchanges, there was no single administrator whose full-time job was to manage the project.”
But it’s not like their overabundance of caution earned a single Republican vote in Congress, or kept Republican attorneys general from suing to end its implementation, or four conservative Supreme Court justices from seeking to strike down the entire act (and a fifth, John Roberts, from ruling on its legality in a way that set a precedent that might make future major government initiatives harder to constitutionally defend). It couldn’t have. Conservatives’ deepest moral convictions determined their reaction in advance. Anything liberals do to use the government to help people will be judged by genuine conservatives as an abomination; it always has been, and it always will be. But genuine conservatives are in the American minority, as I wrote here last month. Isn’t it better to simply sin boldly and let our conservative devils take the hindmost? Use the state to make people’s lives better. Do it without apology. That’s our moral imperative, and it should be beyond compromise.
On November 20, hundreds of unionized graduate student instructors at Berkeley went on a 24-hour solidarity strike to protest the university’s intimidation tactics against university support staffers who’d gone on strike this past spring. An e-mail from a mathematics faculty member to his grad instructors explaining why he was crossing the picket line and why they should too went viral. For the prof, named Alexander Coward, also saw reason to protest: to dissent against the silly notion of solidarity in the first place.
He wrote: “Whatever the alleged injustices are that are being protested about tomorrow, it is clear that you are not responsible for those things, whatever they are, and I do not think you should be denied an education because of someone else’s fight that you are not responsible for.”
So what are they responsible for?
You need to optimize your life for learning.
You need to live and breath [sic] your education.
You need to be *obsessed* [sic] with your education.
Society is investing in you so that you can help solve the many challenges we are going to face in the coming decades, from profound technological challenges to helping people with the age old search for human happiness and meaning.
That is why I am not canceling class tomorrow. Your education is really really important, not just to you, but in a far broader and wider reaching way than I think any of you have yet to fully appreciate.
Society, and one’s education, apparently have nothing to do with issues of decent wages and working conditions and keeping higher education affordable and its institutions accountable. Good to know.
There is something about the very grown-up action of sacrifice for the sake of solidarity that turns some professors into patronizing asses. For Professor Coward also wrote, “All this may sound like speaking in platitudes. However it is a point worth making to all of you because you are so young. One of the nice things about being young is that your thinking can be very clear and your mind not so cluttered up with memories and experiences. This clarity can give you a lot of conviction, but it can also lead you astray because you might not yet appreciate just how complicated the world is.” According to his CV, Coward was born in 1981. That makes him a grizzled thirty-one or thirty-two years of age.
Coward, I’ve found, has plenty of comrades in anti-comradeship who are old enough to know better. I’ll never forget the first time I heard the argument that “even though I support unions I don’t support graduate student unions,” since graduate students were “apprentices,” not “workers.” It was over dinner with an Ivy League professor (not this one, another one) whose writings had taught me a great deal about what union solidarity was and why it mattered in the first place.
Then there’s this letter I was shown, sent by a political science professor at the University of Chicago, answering a public letter from students in his department about why they were working to organize a union. The professor’s response began this way: “First off, let me preface these remarks by saying that when I was in graduate school at Berkeley in the 1990s, I was very active in the graduate student unionization movement. I was shop steward for the political science department for several years and was very active in a three week campus wide teaching strike we held in the fall of 1992. It may also be worth mentioning that I come from a working class family (I was the first and only person in my family to go to college) and I grew up around a lot of issues of collective bargaining. So I’m highly sympathetic to issues of collective action.”
Just not your collective action. For he continued, “That said, I found your co-signed letter to be naive, unconvincing, and, quite frankly, kind of offensive. It is naive in that you seem to really think a union would not change relationships between graduate students and the faculty. I don’t know if either of you have ever been members of a union or worked in a unionized environment, but unions inevitably alter the relationships between union members and the people the interact with, be they management, clients, customers, or what not. The formalization of such relationships is, in fact, the central goal of a union. Your letter says ‘Our goal is simply to gain a voice in the decisions that affect our working conditions.’ Well, these decisions are largely made by the faculty. Thus, if you want a collectivized voice in these decisions, you will be unavoidably shaping your relationships to faculty members.”
He then claimed, “Let’s say each quarter of being a teaching intern requires about 200 hours of work (which is a high estimate), as graduate student ‘employees’ you are effectively making at least $300 an hour for the limited amount of time you are ‘working.’ What’s more egregious is the fact that most of the faculty I know do not think of interns [the University of Chicago’s term for teaching assistants] as employees but think of the internship as another educational experience.”
He also said the grad-student-union organizers’ arguments were “unconvincing because you do not specify any significant hardships regarding your ‘working’ conditions”—that word, working, was in quotes—and that he found the letter “off-putting in the tone of entitlement that rings through it. Every year there are hundreds of applicants for a very small number of slots to study here. You are very lucky to be here, just as I am very lucky to teach here. When you were admitted to the university, you were not hired. You were offered a spot as a student. The university owes you nothing beyond what it initially proposed and what you accepted. To call yourself an employee and complain about an absence of cost-of-living adjustments, health insurance, or the burdens of being a graduate student…sounds both presumptuous and petulant.”
Nothing off-putting, entitled, presumptuous or petulant about that.
It’s interesting to consider such arguments, which are common among faculty, in the context of a brilliant analysis of what this supposedly entitled life of the aspiring scholar actually looks like in the real world. It appeared during the Berkeley strike, and argues that the process of winning a tenure-track job, and then earning tenure, resembles nothing so much as climbing the greasy pole in a drug gang.
If you take into account the risk of being shot by rival gangs, ending up in jail or being beaten up by your own hierarchy, you might wonder why anybody would work for such a low wage and at such dreadful working conditions instead of seeking employment at McDonald’s. Yet, gangs have no real difficulty in recruiting new members. The reason for this is that the prospect of future wealth, rather than current income and working conditions, is the main driver for people to stay in the business: low-level drug sellers forgo current income for (uncertain) future wealth. Rank-and-file members are ready to face this risk to try to make it to the top, where life is good and money is flowing.
What you have is an increasing number of brilliant Ph.D. graduates arriving every year into the market hoping to secure a permanent position as a professor and enjoying freedom and high salaries, a bit like the rank-and-file drug dealer hoping to become a drug lord…. Because of the increasing inflow of potential outsiders ready to accept these kind of working conditions, this allows insiders to outsource a number of their tasks onto them, especially teaching, in a context where there are increasing pressures for research and publishing. The result is that the core is shrinking, the periphery is expanding, and the core is increasingly dependent on the periphery. In many countries, universities rely to an increasing extent on an ‘industrial reserve army’ of academics working on casual contracts because of this system of incentives.
And indeed this analysis meshes perfectly with the stories I solicited for my series—here, here, and here—on the plight of young aspiring scholars. I still get responses, including one from a junior professor, rated one of the best in the history of Indian River State College by students. He was assigned six classes to teach. He asked the school’s human resources manager how to apply for health insurance. “I lost all my classes as the college avoids the Affordable Care Act in contempt of Congress. So I established the Adjunct Faculty Union at Indian River State College (AFU-IRSC). The college blocked my e-mail access to the Campus Coalition (Student) Government in violation of Section 7 of the Labor Management Relations Act. Adjuncts teach 75% of IRSC classes for $1600/16 weeks, about the lowest in the state. Admin treats adjuncts like we are invisible, refusing to talk to us or give us job security for doing a great job. Corrupt patronage has replaced merit in higher education.” He offered the story in memory of the now-famous adjunct professor of French Margaret Mary Vojtko, who died from inadequate medical care. Vojtko never earned more than $25,000 a year and received no health benefits despite her twenty-five years of loyal service to a Catholic university, Duquesne, which argued that its “religious beliefs” should exempt it from federal labor laws.
I also got responses from tenured professors. And their dominant tone was that same clueless arrogance we see above. One, a philosophy professor at a small liberal arts college in the Northeast, allowed that while things could be improved, and “I would like to see more tenure-track jobs and fewer adjuncts,” academia was still after all a meritocracy. He argued that “[f]riends like your autodidact”—he was referring to the example I gave of a recent PhD from one of the greatest universities in the world, who wrote brilliantly and insightfully, was a natural-born teacher and applied to a hundred jobs to no avail before realizing “tenured employment is almost unimaginable” because of his undeveloped suck-up skills—“will slip through the cracks if, despite actual excellence, they can’t muster what the academy considers evidence of excellence…. I think of a tenure-track job like an actor getting a job at a repertory company, or a baseball player being hired to play baseball full-time—there are just too many people lining up to do such jobs to give them to everyone.”
This was supposed to be a defense of the system.
Another, who teaches at one of the nation’s largest, richest and best public universities, complained that after renting a conveniently located house, paying for daycare and servicing his student debt, “groceries end up on the credit card by the 20th of every month.” As a result, “my credit rating is just above Charles Manson’s and my hair is falling out.”
A sad story, but not quite at the level of Margaret Vojtko’s, is it?
Correction: This post has been updated to indicate that the University of California strike lasted 24 hours, not three days.
Yesterday came news that my home state, Illinois, is preparing for its twenty-sixth annual ceremony this Saturday to honor the “66 Illinoisans listed as MIA or POW in Southeast Asia.” I absorbed this development the same week I had occasion to attend an interment at a military cemetery in Washington State, over which flew, alongside the banners of all of America’s military service branches, the familiar “POW/MIA” flag with the forlorn, hangdog prisoner silhouetted in the foreground and guard tower and barbed wire in the back. Given the scale of national problems we’re facing these days, this one hardly makes a dent. But it creeps me out all the same. And if you deplore jingoistic, racist propaganda, it should creep you out, too—so, this afternoon, let me unburden myself.
When downed American pilots were first taken prisoner in North Vietnam in 1964, US policy became pretty much to ignore them―part and parcel of President Lyndon B. Johnson’s determination to keep the costs of his increasingly futile military escalation in Southeast Asia from the public. Then, one day in the first spring of Richard Nixon’s presidency, Secretary of Defense Melvin Laird announced the existence of from 500 to 1,300 of what he termed “POW/MIA’s.” Those three letters—“MIA”—are familiar to us now. The term, however, was a new, Nixonian invention. It used to be that downed fliers not confirmed as actual prisoners were classified not as “Missing in Action” but as “Killed in Action/Body Unrecovered.” The new designation was a propaganda scam. It let the Pentagon and State Department and White House refer to these 1,300 (later “1,400”) as if they were, every one of them, actual prisoners, even though every one of them was almost certainly dead. “Hundreds of American wives, children, and parents continue to live in a tragic state of uncertainty caused by the lack of information concerning the fate of their loved ones,” Secretary Laird said. That was part of an attempt to manipulate international opinion to frame the North Vietnamese Communists (against whom, of course, America was prosecuting an illegal and undeclared air war against civilians) as uniquely cruel, even though fewer men were taken prisoner or went missing in Vietnam than in any previous American war. (From 1965 through 1969, they were tortured, at least if you believe American prisoners at Guantánamo Bay were tortured; the techniques were essentially the same.)
During the Johnson years, Sybil Stockdale, whose husband James (Ross Perot’s unfortunate running mate in 1992) was the highest-ranking and one of the earliest POWs, had organized a “League of Wives of American Prisoners of War” (later the National League of Families of Prisoners of War, then the League of Families of American Prisoners and Missing in Southeast Asia) which agitated for attention to the prisoners’ plight—against the Pentagon’s wishes. Under Nixon, the Pentagon co-opted it, sometimes inventing chapters outright, as useful to their propaganda barrage. The prisoners’ families showed up in newsmagazines and on TV; “POW bracelets,” invented by the future wingnut congressman “B-1 Bob” Dornan, then a local Limbaugh on Orange County radio, were unveiled in the spring of 1970 at an annual “Salute to the Military” ball in Los Angeles. (Governor Ronald Wilson Reagan presided, and Hollywood choreographer Leroy Prinz, who had worked with Reagan on the 1942 film Hollywood Canteen, choreographed a splendid pageant.) Bracelets soon sold at a rate of 10,000 a day; Sonny & Cher wore them on TV; some people, The New York Times reported, believed them to “possess medicinal powers”―and not just the children who displayed them two, ten, a dozen to an arm. A Wimbledon champ said one cured his tennis elbow. Lee Trevino said his saved his golf game. Matchbooks, lapel pins, billboards, T-shirts and bumper stickers (POWs never have a nice day!) proliferated, fighter jets made thunderous football stadium fly-bys, full page ads blossomed in every newspaper urging Hanoi to have a heart and release the prisoners for the sake of the children.
Jonathan Schell, then of The New Yorker, observed that the American people were acting “as though the North Vietnamese had kidnapped…Americans and the United States had gone to war to retrieve them”—martyrs to an enemy so devious, as the Armed Forces Journal put it, that they denied hundreds of little boys and girls “a right to know if their fathers were dead or alive.” Ross Perot testified to Congress that when he visited North Vietnam to plead for their release his hosts were incredulous at all this concern over “just 1,400 men.” Americans were plainly more morally sensitive than Communists. Though in fact our South Vietnamese allies held some 100,000 prisoners, many of them Buddhist monks guilty of nothing except pacifism, in a prison complex of American design that was so inhumane that Time’s correspondent described the captives as “grotesque sculptures of scarred flesh and gnarled limbs. They move like crabs, skittering across the floor on buttocks and palms.”
Already, the issue made for “a lunatic semiology,” as the historian Richard Slotkin later described it, where “sign and referent have scarcely any proportionate relation at all.” But it sure was useful to the national security state. When America’s involvement in the war ended in January 1973, Nixon told his secretary of defense that the military-orchestrated celebration of the prisoners’ return, dubbed “Operation Homecoming,” was “an invaluable opportunity to revise the history of this war.”
This is when the story got even nuttier—when the propaganda slipped the bounds intended by its authors, and became more like the brooms in The Sorcerer’s Apprentice. The scholar H. Bruce Franklin of Rutgers tells the story with elegant economy in the book M.I.A., Or Mythmaking in America; Northwestern’s Michael Allen tells the story in more detail in Until the Last Man Comes Home: POWs, MIAs, and America’s Unending Vietnam War.
Operation Homecoming returned 587 American prisoners of war—but Nixon had by then settled on “1,600” as the number of Americans classified as “POW/MIA.” So where were the other 1,013? The brigadier general who supervised the repatriation announced that he “did not rule out the possibility that some Americans may still be held in Laos.” The secretary of defense promised, “We will not rest until all those still known captive are safe and until we have achieved the best possible accounting for those missing in action.” Holding the government to that pledge had now become the raison d’être of the League of Families—an organization now all the stronger, thanks to its recent history as a veritable White House front group. Bracelets continued to be sold, now with the names of MIAs on them. Next came that flag—POW-MIA: YOU ARE NOT FORGOTTEN—soon flying over VFW and American Legion posts across the fruited plain. And mere months after the Operation Homecoming propaganda triumph, Chicago MIA families declared that the administration was “abandoning” men “seen in photos coming out of Indochina or who have been reported alive by returning POWs.”
The issue came to define the diplomatic relationship between the United States and Vietnam, a subject of considerable exasperation for Vietnamese officials now being called on to “prove” they held no more prisoners. As one of them reasonably exclaimed, “We have not come this far to hold onto a handful of Americans.” A congressman from Milwaukee, Clem Zablocki, opened hearings that fall to debunk the spreading absurdity. He assured concerned families, referring to the testimony both of American returnees and the North Vietnamese, “There are no missing in action or prisoners of war in Southeast Asia at this time that they believe are alive.” Which only meant, to many POW-MIA families, that Congress was just part of the cover-up. “Why are you willing to believe the enemy on this subject when they do not tell the truth on any other subject?” the Corpus Christi chapter of the National League of Families soon raged in a letter to the Pentagon. “The fact is, you have no proof our men are dead.” (Emphasis in the original.)
But how could there be proof that men shot down over jungles or the Gulf of Tonkin or the South China Sea were “really” dead? And so the “issue” endured. Governor Ronald Reagan, in Singapore as a special presidential representative for a trade deal, said that if North Vietnam didn’t return the POWs and MIAs supposedly still being held, “bombing should be resumed.” He accused liberals in Congress seeking to ban further military action in Southeast Asia of taking away “the power to sway those monkeys over there to straighten up and follow through on the deal.”
Here was the right-wing variant of the Watergate-induced dread about whether anyone in Washington could be trusted. It took on a life of its own. In 1975 a conservative Democratic congressman from Mississippi, Gillespie “Sonny” Montgomery, empaneled a House Select Committee on Missing Persons in Southeast Asia. He was initially sympathetic to the families’ claims of Communist perfidy. Then he led a delegation there which found their hosts warm, accommodating—and, once more, befuddled at what it was they were being asked to account for. (Just about every Vietnamese family had relatives who had disappeared in the war or whose remains could not be returned to the ancestral village—a sacred duty in Vietnamese culture.) Montgomery concluded that the existence of American prisoners in Vietnam was almost certainly a myth. As a CIA pilot captured there in 1965 testified at one of the subcommittee hearings, “If you take a walletful of money over there, you can buy all the information you want on POWs on the streets” but “when you try to run them down they fizzle out somewhere down the line.” The committee also turned up evidence that China had manufactured stories of MIAs still in prison camps in order to keep the US from normalizing relations with their Asian rival. Reagan, however, remained adamant: “If there is to be any recognition,” he boomed on the campaign trail in the spring of 1976, “let it be discussed only after they have kept their pledge to give a full accounting of our men still listed as missing in action.”
Henceforth paying ritual obeisance, hat in hand, at meetings of the League of Families of American Prisoners and Missing in Southeast Asia became presidents’ annual ordeal. Read the section in Allen’s book about George H.W. Bush’s manhandling at the 1992 conclave. Read here about how Nixon’s long-lived propaganda goof delayed normalization of relations with Vietnam until 1995. And click here to see how this absurd cult still endures. The 9/11 Truthers don’t enjoy official government sanction. But if you happen to live in Illinois, you can roll with your very own “POW/MIA Illinois Remembers” license plate for your car. The “66 Illinoisans” apparently still imprisoned in Southeast Asia hardly deserve less.
Rick Perlstein questions whether John F. Kennedy would have ended the Vietnam War.
It’s rough for too many families this Thanksgiving. With an unemployment rate of 7.3 percent, with nearly a million discouraged no-longer-job-seekers, ashamed and invisible, not even showing up in that total; with an unemployment rate for black teenagers of 36 percent and, as The Nation’s George Zornick points out, the season of feasting a season of fasting for too many families on food stamps—cheer can be hard to find.
Keep our suffering neighbors in your thoughts as you celebrate. And for a possibly cheering contrast, consider a time when things were even worse: 1973, which I’ve researched for my upcoming book on the 1970s, when it was oh-so-much harder to head over the river and through the woods to Grandmother’s house because the Arab oil embargo quadrupled the price of a barrel of crude.
October was rung in with biblical prophecies from an assistant secretary of the interior. “With anything less than the best of luck,” Stephen Wakefield announced, “we shall probably face shortages of heating oil, propane, and diesel fuel this winter.… I am talking about men without jobs, homes without heat, children without schools.” In Los Angeles the Department of Water and Power predicted a 35 percent energy shortage by April. It came the day after the President’s Cost of Living Council set a new ceiling on the price of domestic crude; the major oil companies responded by raising the prices they charged their affiliate service stations by about a penny a gallon. In San Francisco 3,000 service stations shut down for three days in protest—street corners became ghost towns in the beautiful City by the Bay. And all this was before the Arab oil embargo.
That began October 17, after America decided to airlift weapons to Israel in its war with Egypt and Syria. A Watergate-scarred president went on TV and announced “a very stark fact: we are heading into the most acute energy shortage since World War II.” Americans, he said, would have to cut back: “less heat, less electricity, less gasoline”—almost stop being Americans at all. He called for shorter school and factory hours. And the cancellation of 10 percent of jet flights. The federal government would provide an example by setting thermostats to sixty-eight degrees or less, he said (“and that means in this room, too, as well as in every other room in the White House”); government vehicles would be limited to fifty miles an hour. He told governors to pass laws mandating fifty miles per hour for everyone, Congress to pass an emergency statute returning to year-round daylight saving time and to relax environmental regulations. Start carpooling, he recommended: “How many times have you gone along the highway,” he quizzed, “with only one individual in that car?”
Thousands, of course—for wasn’t zooming alone across endless vistas of highways supposed to be the most American pastime of all? Not any more, apparently. What he was describing, he allowed, sounded “like a way of life we left behind with Glenn Miller and the war of the forties.”
Honoring a non-binding presidential request, gas stations began closing down from 9 pm Saturday through midnight on Sundays. So people began “topping off”—filling their tanks every time they passed a gas station, leading to hours-long lines in which idling cars… just wasted more gas. Everyone wanted to get to a pump before the last drop was gone and one of the ubiquitous sorry, no gas signs was hoisted up. Then they would have to return the next day—when prices were usually two cents a gallon higher. Tempers flared, no architect having thought to design a corner gas station for the eventuality of dozens of angry motorists cutting fellow motorists off on street corners like it was the Indianapolis Motor Speedway.
Time called the energy crisis the “most serious economic threat to face the nation since the Depression.” Cities began reducing bus service. Schools in Massachusetts and Connecticut, states reliant on oil for heat, announced Christmas break for the entire months of December and January. At the New England School of Art, heated only to sixty-five degrees in the Boston chill, nude models were afforded the comfort of roasting in their own body heat in a clear plastic tent. In Rhode Island, a prize high school composition was customarily chosen to be signed by the governor as the official state Thanksgiving proclamation. The governor refused to sign this year’s winner, in which a 17-year-old wrote, “Thanksgiving seems to be pretended, a farce, little more than an outdated tradition no one has yet found time to discard.”
Time’s Thanksgiving cover had Archie Bunker in his trademark easy chair, stalactites of frost hanging from his cigar and winter cap—he couldn’t afford home heating oil. Plastic bags, made with petroleum, became prohibitively expensive; petrochemicals were also ingredients in many lifesaving drugs—so pharmaceutical executives projected a shortage. Twenty-five New Hampshire towns suspended police, fire protection, garbage pickups, road repair and school transportation.
The mayor of Rensselaer, Indiana, turned off the city’s 425 street lights, until a rash of burglaries forced him to turn them on again. In an interview he revealed his motives as less than Christian: “If everyone in the country would make this kind of effort, we could tell the Arabs to go to hell.” Unchristian motives were everywhere. A gas station owner stopped letting owners of big cars buy more than a dollar of gas at a time—“just enough to keep them off the road.” People started driving with a full can of gas in the trunk, which turned them into inadvertent firebombs. The Senate came within eight votes of passing a law rationing gasoline, and the White House ordered the Bureau of Engraving and Printing to prepare by printing over 10 billion ration coupons.
A coffee table book, They Could Not Trust the King, with text by William Shannon of the New York Times editorial board, went to press. It called Watergate “a complex and far-reaching political plan that could serve as dress rehearsal for an American fascist coup d’état.”
Then December, and the presidentially mandated closing of service stations from Saturday evening until Monday morning. A Hanford, California, gas station owner shot up six of the pumps of a rival who stayed open across the street. A Miami man yelled to a gas station attendant who wouldn’t sell to him on a Saturday night, “I am going to get some gas even if I have to kill somebody”—and then, waving a pistol, almost honored his pledge. Auto supply houses ran out of siphons, tools of the new street crime of choice—and locks for gas caps. More ambitious crooks started hijacking petroleum trucks. Brooklyn motorists filled up with “Gambinoil”—oil the Gambino family stole from bulk plants in the area and sold to local dealers at 70 percent more than legitimate distributors charged.
A cheap paperback came out, Predictions for 1974, starring a panoply of psychics with names like “Countess Amy, the Gypsy Seeress,” and “Aquarius, Campus Clairvoyant.” It featured, alongside news-to-come about traffic accidents (“A submarine and a UFO will collide off the Aleutian Islands”), the occult (“reincarnation will be espoused by more and more young people as a valid explanation for the dislocations in modern society”), celebrities (“Dean Martin may have a health problem and definitely should be careful of his nose”), and celebrities and the occult (“A youthful female actress of sudden fame will publicly announce that she used witchcraft to obtain her current level of success and happiness”), prediction after prediction about how the world would collapse. That was what the future looked like now. Deaths from record bitter cold. Deaths from a “nerve gas leak” off the coast of Florida. A 1929-style stock market collapse. A declaration of bankruptcy by New York City—“the first tangible sign of the collapse of our entire civilization.” Single people banned from buying big cars. Locusts and floods, “like the plagues of Egypt,” worldwide droughts, rising sea levels “inundating all coastal areas throughout the world.” Rationing of every staple, urban blackouts, riots, martial law. “Disaster will hit one of New York’s skyscraper landmark buildings.” “Man is an endangered species,” as one soothsayer put it. It was a map of the dreads of a nation.
Good times. Let us cherish what we have, and what we have transcended before. Love, and let yourself be loved. Fight injustice, that our children might be blessed. Happy Thanksgiving, dear readers; you help make my life immeasurably meaningful and rich.
For families on food stamps, traditional Thanksgiving meals are out of the question.
Journofolks are talking a lot about the Heritage Foundation these days. The narrative is that a once-august right-wing research shop has gone all hackish on us since being taken over by former Senator Jim DeMint and his fearsome 31-year-old deputy Michael Needham. “The Fall of the Heritage Foundation and the Death of Republican Ideas,” is how The Atlantic’s Molly Ball tags it. In The New Republic, a profile of Needham, whom The Washington Post’s Dana Milbank labeled “The Shut-Down’s Enforcer-in-chief,” quotes Republican legislators lambasting him for “his ideological inflexibility and aggressive zero-sum tactics.” A bitter Senator Orrin Hatch is quoted in The New York Times: “Is Heritage going to go so political that it really doesn’t amount to anything anymore? I hope not.”
Of course, for a movement supposedly devoted to conserving the past, conservatives are oh-so-splendid at forgetting their own past. The notion of Hatch as the high-minded conservator of the scholarly temper would have been pretty laughable when he won his Senate seat in 1976 as the first major feather in the cap of the nascent New Right fundraising machine captained by Richard Viguerie. Back then, his campaign served as a pass-through for all sorts of Long Con hanky-panky. But never mind. The notion of Heritage’s fall from some noble intellectual golden age has been so ably debunked by historian Jason Stahl that I have little to add.
But not nothing to add. First, some more historical detail. I’ve written here before about the extraordinary events of 1974–75 in Kanawha County, West Virginia, when the school board encompassing the state’s biggest city, Charleston, voted to adopt textbooks Christian conservatives insisted endorsed miscegenation, “secular humanism,” and other assorted alleged sins, and protesters ended up dynamiting the school board building. But not before the brand-spanking-new Heritage Foundation rushed to aid the folks laying the dynamite.
In one of the first forays of this “scholarly” organization into national politics, Heritage sent two staffers to West Virginia. James McKenna, a lawyer who had won a string of cases defending the rights of parents to homeschool their children, came to defend the activists under indictment for violence. Connie Marshner was a young University of South Carolina graduate who had accepted a job in 1971 on Capitol Hill as a plain old secretary for Young Americans for Freedom, which was where she quietly transformed herself into an expert on Senator Walter Mondale’s bill to establish a national system of federal childcare centers—the “therapeutic state invading the home,” Marshner said. On her own, she started a letterhead organization to fight the bill. When Nixon vetoed it, calling it a threat to “the family in its rightful position as a keystone of our civilization,” she claimed victory, and was hired as Heritage’s first director of education. Soon she was hard at work finding “little clusters of Evangelical, fundamentalist Mom’s groups,” and transforming them into troops for the conservative movement army. She ended up writing a book called Blackboard Tyranny as her lasting contribution to the “parents’ rights” movement’s scholarly legacy. Based on the ideas of the Christian Reconstructionist Rousas J. Rushdoony, the book argues that education professionals began their plot to replace Christianity with the “messianic” religion of secular humanism when they started teaching that education should indoctrinate children into democracy, and that parents’ right to oppose this “came from God by way of the natural law.” Scholarly!
The Heritage Foundation saw the Kanawha incident as an opportunity to build strategic capacity. “If you pick the right fight at the right time,” McKenna explained, “[y]ou can make your political points, you can help the people involved, and you can become a force in the political community.” Conservatives used to call people like this “outside agitators.” On October 6, 1974, they were among the featured speakers at a rally before 8,000 textbook activists. One preacher cried, “If we don’t protect our children we’ll have to account for it on the day of judgement!” The next day this same preacher was among the twenty militants arrested at a garage for sabotaging school buses; the following day, two elementary schools were firebombed. Scholarly!
And now, some personal anecdotage. I’ve visited the Heritage three times for research purposes. My host was Lee Edwards, who in the service of the Heritage Foundation writes hagiographies of conservative institutions and luminaries; nice work if you can get it. Edwards is a friendly guy, generous with his time and recollections, but for all that, as a conservative-movement lifer, someone also implicated in the Long Con: in 1972 he was one of the principals in a hustle called “Friends of the FBI,” to which gullible folks at the grassroots funneled cash that mostly ended up going back to the hustlers; their front man, TV star Efrem Zimbalist Jr., withdrew from the project after saying its three founders, including Edwards, were guilty of “fraud and misrepresentation.”
Anyway, on one of these visits, the foundation was fulsomely hosting some Asian dictator, holding him up as a tribune of freedom. On another, one Heritage fellow, a superannuated former Reagan UN ambassador, told me stories about Barry Goldwater “chasing pussy.” On a third, Edwards led me to Edwin Meese’s office for an interview. We passed through a room dedicated to Amway, with a full complement of their products on display—some think tank! (Regular readers of this blog know what I think of Amway, a seriously scholarly outfit…). Once there, the former attorney general of the United States told me no one had ever complained about racism in the Oakland police when he was the Alameda County DA in the 1960s. I told him I knew of some fellows who would have disagreed. He looked at me like I was nuts.
Yes, it used to be such a high-minded, intellectually serious place. Nowadays: What hath Jim DeMint wrought?
Katrina vanden Heuvel on the irrelevance of the Heritage Foundation.
The Life magazine dated November 22, 1963, which would have arrived on newsstands around November 15, featured a terrifying story by Theodore White, author of the groundbreaking bestseller The Making of the President 1960. Titled “Racial Collision,” and subtitled “the Negro-white problem is greatest in the North where the Negro is taking over the cities—and being strangled by them,” it was an intimation of an imminent racial holocaust. The first of two parts, the conclusion ran in the issue dated November 29—which ordinarily would have appeared on newsstands on November 22 but was held back to put the martyred President Kennedy on the cover, and to include, inside, several thousand words of what must have been some very speedily written copy about his death. That second part was even scarier. It reported terrors like Adam Clayton Powell’s call for “ ‘a Birmingham explosion in New York City’ this fall”; Communist infiltration of Martin Luther King Jr.’s inner circle; a civil rights group’s fears that it would be labeled “a front for the white man” unless a peaceful march was turned into “a violent putsch on government offices”; and some protesters’ demand for cash reparations for slavery—“There is a warning if such sin-gold is not paid by white Americans to black Americans, the ‘power structure’ is inviting ‘social chaos.’ ” And it quoted James Forman of the Student Nonviolent Coordinating Committee reaching the following unsettling conclusion: “85% of all Negroes do not adhere to nonviolence.”
Such foreboding was entirely typical of that very tense summer and fall—and the culmination of fears that had been mounting ever since the Bay of Pigs and the Berlin Crisis and the Cuban missile crisis and the Oxford, Mississippi, crisis of 1961 and 1962. The fear escalated after Bull Connor’s fire hoses in Birmingham in the spring of 1963 unleashed what felt to whites like an uncontainable torrent of black rage across the country: in Columbus, Ohio, two men chained themselves to furniture in the state Capitol; in Boston, a black parent told the segregationist city school board “it is too late for pleading, begging, requesting, or even reasoning.” And whites reacted against the rage: George Wallace stood in the schoolhouse door at the University of Alabama. Medgar Evers was shot. Barry Goldwater began looking good to Republicans—and rival bands of Goldwaterites turned the Young Republican National Federation’s convention into a near-riot.
It felt like riots were breaking out everywhere.
On September 15, Birmingham’s Sixteenth Street Baptist Church was bombed by Klansmen, killing four little girls.
In Dallas, on October 24, United Nations ambassador Adlai Stevenson was shouted down, spat upon, and physically assaulted on the street by right-wingers.
In Saigon, on November 2, South Vietnam’s president Ngo Dinh Diem was assassinated in a US- backed coup.
And in Dallas, on November 22, President Kennedy was supposed to give a speech addressing the widespread feeling that America had become a very scary place, specifically as regarded the 1963 version of Tea Partiers, who had become so scary that many people presumed that they had been the ones who shot him. “In a world of complex and continuing problems, in a world full of frustrations and irritations,” ran the text of the speech he did not live to deliver, “voices are heard in the land…preaching doctrines wholly unrelated to reality, wholly unsuited to the sixties, doctrines which apparently assume that…vituperation is as good as victory and peace is a sign of weakness…. At a time when we are steadily reducing the number of federal employees serving every thousand citizens, they fear those supposed hordes of civil servants, far more than the actual hordes of opposing armies.”
This was the world that Theodore White, in his next article in Life after his near-prediction of race war, proclaimed to have been, until the day John F. Kennedy was killed, a Camelot: “Don’t let it be forgot, that once there was a spot, for one brief shining moment, that was known as Camelot.”
* * *
The story of how the myth of Camelot was invented is wonderfully told in a great little book from 1995 that I’ve never seen referenced before, Theodore H. White and Journalism as Illusion, by media scholar Joyce Hoffman. It begins:
Theodore H. White was in a dentist’s chair on the Upper East Side of Manhattan on a Friday morning in late November 1963, when he learned that Jacqueline Kennedy had telephoned to say she needed him. One week had passed since President John F. Kennedy’s assassination in Dallas, and now his widow was beseeching the journalist, whom she considered an old friend, to come to Hyannisport. She had something she wanted Life magazine to say to America, and White, she insisted, had to bear the message…. She had summoned White because she was angry, very angry. All week newspaper pundits had served up their instant appraisals of the brief and abruptly ended Kennedy administration. Arthur Krock’s New York Times column had especially rankled her…a lament about the failure of “even advanced democracy and self-government to extirpate in mankind the resort to anarchy….” Walter Lippmann’s “Today and Tomorrow” column just four days after the assassination had spoken of the forces of hatred and ungovernability and how the “habit of intemperate speech and thought had become deeply ingrained. It is deepened by the strains of war and the frustrations of this revolutionary age.”
In other words, commentators commentated accurately on the mood of the country. But “Mrs. Kennedy wanted White to rescue her husband’s memory from these men. History should celebrate the Kennedy years as a time of hope and magic, she insisted. White sat mesmerized for more than two hours, listening to the rambling and disjointed monologue…. She sneered at the ‘bitter old men’ who wrote history.” (That’s me!) “Finally, she came to the thought that had become her obsession, a thought embodied in the lyrics of the Broadway musical—Camelot. Over and over again, she and the president had listened to the words sing out of their ten-year-old Victrola…”
What came next is pretty damned astonishing, a nadir in the history of court journalism, something that better belongs in the annals of the Kremlin. White retreated around midnight to draft his article in the maid’s room, “mindful that Life was holding its presses at a cost of $30,000 an hour. When he finished, Mrs. Kennedy took a pencil to White’s work, crossing out some of his words and adding her own in the margins. She hovered near the kitchen telephone—adamant that her Camelot portrayal remain the dominant theme—as he dictated the revised version to his editors.” The article came out. Arthur Schlesinger, baffled, said, “Jack Kennedy never spoke of Camelot.” One Kennedy hand said, “If Jack Kennedy heard this stuff about Camelot, he would have vomited.”
The whole thing is a great object lesson in the horrors of access journalism—and access history. (“The notes of White’s interview with Jacqueline Kennedy,” writes Joyce Hoffman, “known as the ‘Camelot Papers,’ which White donated to the John F. Kennedy library in 1969, remained under restriction until May 19, 1995, one year after the death of Jacqueline Kennedy Onassis.”) If you hate the kind of writing Bob Woodward does now; if you hate Politico or, going back further, if you hate the kind of things Sally Quinn wrote on Monicagate (“ ‘He came in here and he trashed the place,’ says Washington Post columnist David Broder, ‘and it’s not his place.’ ”), or the childish abuse and systematic distortions meted out to Al Gore in 2000 because he didn’t fit into the Washington insiders’ village, blame Camelot—or “Camelot.” If you heard the public radio documentary this morning We Knew JFK: Unheard Stories from the Kennedy Archive and were as astonished as I was at how many journalists blithely based their admiration for the thirty-fifth president on the nice cocktails he and “Jackie” poured in the White House, or if you’ve seen David Auburn’s neat Broadway play from last year, The Columnist, which depicts the incestuous coziness between Joseph Alsop and John F. Kennedy, you know what I mean.
Teddy White, about whom I have complicated feelings, was a crucial conveyor belt in advancing this awful cultural trend—“High Broderism,” some of our better bloggers used to call it—and Hoffman’s book is an important primer for anyone who wants to learn how it happened. A child of Boston’s Jewish ghetto, Teddy (or, in his parents’ Yiddish-speaking mouths, “Tuddy”) White made his way as a scholarship boy to Harvard, where he came to identify with the clubby culture of the WASP with the zeal of the convert, with all the pseudo-aristocratic abuses of democratic culture that entailed. “White’s style of journalism,” Hoffman explains, “fit a model established by a generation of influential columnists and reporters who had functioned as a subsidiary of government during World War II and the postwar years…a patriot first and a journalist second.”
Twenty-five years before his “Camelot” coup, she notes, “he had written stories from China that had portrayed Chiang Kai-shek as a similarly heroic character.” Then he realized he was wrong, but it was too late—he had helped create the Frankenstein’s monster of America’s romance with Chiang, and thus the McCarthyite reaction to China’s fall to Communism, too; he then had to watch helplessly as that reaction devoured some of his friends. Hoffman argues White then helped create the myth of the presidency itself as some sort of American regency. He was the first to capitalize “Oval Office.” She tells an amazing story of an interview with Kennedy there, for the last chapter of The Making of the President 1960, in which the new president, in his underwear being fitted for a suit, gossips altogether un-presidentially, coarsely insults Nixon and obsesses over how much money White will make on his book. Out of that unpromising raw material, White crafted a panegyric to a godlike man who commanded the eighteen-button telephone console on his desk like “the sword and the mace in the politics of the middle ages.”
White had identified so closely with JFK on the 1960 campaign trail that he wore a Kennedy campaign button. When the manuscript was complete, he showed it to both “Bobby” and “Jack,” acceding to RFK’s requested revisions. He didn’t extend the same courtesy to Nixon. But then again, Nixon didn’t invite Teddy White to his cocktail parties. Here’s a diary entry from 1962: “Mad night at Bobby’s great fun. He sent the Caroline [the Kennedy family’s private plane] up from Washington, we got aboard at 5:00 followed by Harry Belafonte and his wife Julie…”
This sort of thing had real consequences for the country. It is one of the Big Ideas of my first book, Before the Storm: Barry Goldwater and the Unmaking of the American Consensus, that just this sort of consensus-besotted denial of the roiling tensions beneath America’s consensus facade in the early 1960s—the time “before the storm”—made the storms of the later 1960s so much nastier than they would have been, had Americans been better prepared to accept the ineluctably divisive reality of American life. Instead, the tension burst forth like the return of the repressed.
Anyway, here’s a new Big Idea: journalist sycophants like White helped give us Watergate.
Consider: White felt so guilty at having slighted Nixon in Making of the President 1960 that he turned Making of the President 1968 into a virtual love letter to him, and sent him the book with a fulsome apology. Making of the President 1972 sucked up to Nixon even worse. But then, oops—I discovered this in research for the book I’m finishing now—White had to postpone publication so he could tack on a chapter about a little thing called Watergate, whose seriousness caught him completely by surprise.
Indeed, it was largely the clubbiness of the Washington village press corps that let Nixon get away with Watergate and still win his landslide in 1972. (Read Tim Crouse’s Boys on the Bus for the full story.) Call it Camelot’s revenge: the class of court scribes who made it their profession to uphold a make-believe version of America free of conflict and ruled by noble men helped Nixon get away with it for so long—because, after all, America was ruled by noble men.
Don’t let that be forgot. For who knows what latter-day sycophants and suck-ups in the media might let our leaders get away with next.
“Rick,” a Facebook friend writes, “curious to see what you make of the old debate (which may have some new evidence, see Galbraith II) re JFK and Vietnam. Would we have gone or stayed if JFK lived? Or was he the fervent Cold Warrior some paint him as? (My dad marched in his inauguration, and was almost killed six or seven years later.)”
The argument that John F. Kennedy was a closet peacenik, ready to give up on what the Vietnamese call the American War upon re-election, received its most farcical treatment in Oliver Stone’s JFK. It was made with only slightly more sophistication by Kenneth O’Donnell in the 1972 book Johnny, We Hardly Knew Ye, in which the old Kennedy hand depicted the president telling him, “In 1965, I’ll become one of the most unpopular presidents in history. I’ll be damned everywhere as a Communist appeaser. But I don’t care. If I tried to pull out completely now from Vietnam, we would have another McCarthy red scare on our hands, but I can do it after I’m elected.” O’Donnell also claimed that in an October 2, 1963, National Security Council meeting, after debriefing Robert McNamara and General Maxwell Taylor on their recent trip to Saigon, “President Kennedy asked McNamara to announce to the press after the meeting the immediate withdrawal of one thousand soldiers and to say that we would probably withdraw all American forces from Vietnam by the end of 1965. When McNamara was leaving the meeting to talk to the White House reporters, the President called to him, ‘And tell them that means all the helicopter pilots, too.’ ” Promptly, wrote O’Donnell, McNamara double-crossed the president, giving the reporters merely a prediction of the end of America’s war, not Kennedy’s prescription of the end of America’s war: McNamara merely said they thought “the major part of the U.S. task” would be completed by the end of 1965, nothing about the president’s intention to complete the task by the end of 1965.
O’Donnell was seeing the world through Camelot-colored glasses. As the historian Edwin Moise demonstrates in A Companion to the Vietnam War (2002), NSC minutes are a matter of record, and the notes show the president himself approving a statement that was only a prediction that things would be over by the end of 1965, framed merely as the observation of Taylor and McNamara. (“They reported that by the end of this year, the U.S. program for training Vietnamese should have progressed to the point where 1,000 military personnel assigned to South Vietnam can be withdrawn.”)
Now, on the broader claim that Kennedy truly intended to end the war by the end of 1965, things get more interesting, and that’s where the case recently made by James K. Galbraith, son of the famous Kennedy hand and economist John Kenneth Galbraith, comes in. As he put it categorically in a letter to The New York Times, “President Kennedy issued a formal decision to withdraw American forces from Vietnam.” Is that true? Only literally, which in the end adds up to mostly nothing.
Kennedy, of course, was the first president to send soldiers to Southeast Asia, 16,732 of them, supposedly as mere “advisers,” but many of them actually combatants. As Kennedy had famously told The New York Times’s James Reston late in 1961 after the failure at the Bay of Pigs and the erection of the Berlin Wall, “Now we have a problem in making our power credible, and Vietnam is the place.” And a damned good place, his military men kept telling him: early in his third year as president, his Vietnam commanders reported that “barring greatly increased resupply and reinforcement of the Viet Cong by the infiltration, the military phase of the war can be virtually won in 1963”—an opinion he continued hearing repeatedly. That’s important context, for whether JFK’s plans on what to do in Vietnam were contingent on military success in Vietnam—as opposed to cutting and running even if that meant leaving the country to the Communist insurgency—is key to this debate.
As Edwin Moise notes, though, “President Kennedy also read much more pessimistic evaluations. These were written mostly by civilians—some by officials in the State Department, others by journalists like Malcolm Browne and David Halberstam. Kennedy did not openly commit himself to either the optimists or the pessimists.” What he did do was insist publicly that he would never cut and run. July 13, 1963: “We are not going to withdraw from that effort…. we are going to stay there.” September 2: “I don’t agree with those who say we should withdraw. That would be a great mistake.” September 26: “We have to stay with it. We must not be fatigued.”
And what of privately? Bug-out plans were indeed drawn up. Galbraith points to an October 4 message from General Taylor to the Joint Chiefs of Staff: “Program currently in progress to train Vietnamese forces will be reviewed and accelerated as necessary to insure that all essential functions visualized to be required for the projected operational environment, to include those now performed by U.S. military units and personnel, can be assumed properly by the Vietnamese by the end of calendar year 1965. All planning will be directed towards preparing RVN forces for the withdrawal of all U.S. special assistance units and personnel by the end of calendar year 1965.” (Galbraith himself adds the emphasis.) “Execute the plan,” the memo continues, “to withdraw 1,000 U.S. military personnel by the end of 1963…”
Noam Chomsky ably took on this claim by pointing out that the withdrawal plan in question, labeled NSAM 263, included language Galbraith conveniently omits, for instance, “It remains the central object of the United States in South Vietnam to assist the people and Government of that country to win their contest against the externally directed and supported Communist conspiracy. The test of all decisions and U.S. actions in this area should be the effectiveness of their contributions to this purpose.” And that supporting texts included phrases like “without impairment of the war effort,” and that “what furthers the war effort we support, and what interferes with the war effort we oppose,” and “our actions are related to our fundamental objective of victory.” Moise points to language in minutes from the October 2, 1963, NSC meeting: “President Kennedy indicated he did not want to get so locked into withdrawal plans that it would be difficult to cancel them if the war did not go so well after all.”
In other words, whether John F. Kennedy’s formal decision would be carried through in the interim between October 1963 and January 1966 was contingent on what happened in the future. One day this summer I issued a formal decision to go to the beach. Then it rained. And so I did not go to the beach.
And as anyone who knows anything about the Vietnam War knows, the people funneling intelligence to the president were alarmingly adept (“the military phase of the war can be virtually won in 1963”) at claiming the sun was shining when it actually was pouring down rain. In fact, when it came to America’s military prospects there, it was winter in Seattle just about all the time. But tomorrow was always going to be sunny, if you asked the Joint Chiefs of Staff.
The best evidence that this “formal decision” by JFK lacks forecasting power is the actual outcome of phase I of that selfsame formal decision: to remove 1,000 soldiers from Vietnam by the end of 1963. Only 432 were actually removed by the end of 1963 (“although,” writes Moise, “some sources give lower figures,” and even that may have merely been the result of shifting deployment schedules). Sometimes war is what happens when you’re busy making other plans.
And that’s not because there was a new president by the end of 1963, at least if you trust Galbraith, who cites as clinching his argument (though it actually proves his argument is wrong) a December 11, 1963, memo noting that the plan to withdraw 1,000 soldiers was still in force, “with no reference to the change of commander in chief.” Through the rest of 1963, in other words (Galbraith’s words), America’s Vietnam policy was still Kennedy’s, not LBJ’s. The policy, as articulated two days before Kennedy’s death by Henry Cabot Lodge, America’s ambassador to Vietnam: “We should continue to keep before us the goal of setting dates for phasing out U.S. activities and turning them over to the Vietnamese…. We can always grant last-minute extensions if we think it wise to do so.”
Finally, consider context. We all know how the Cold War worked: Republican claims about “losing China” motivated a generation of Democrats into pants-pissing fears about not looking tough enough on the reds. Writes Moise, “It is hard to believe that Kennedy as a man who had spent so much effort cultivating an image of machismo and youthful vigor would not have cared about being thought a Communist appeaser.” He observes, with subtlety and sharp historical acumen, “It is not at all unusual in Washington for people to write plans based on a ‘best-case’ scenario. It also seems possible that when Kennedy based plans on the optimists’ projections, he was using this as a way of putting pressure on senior military officers to be realistic in their reports. They might be less inclined to write inflated claims of progress if they were clearly told that such claims would be treated as justifications for troop pullouts.”
He concludes archly, “To have reached a firm decision to withdraw, so long in advance, he would have to have felt that no possible new development, between 1963 and 1965, might create a prospect of an acceptable outcome of a continued struggle. To have thought the situation was such an unmitigated and unmitigatable disaster, he would have had to think that most of what was being said about the Vietnam War in the National Security Council was nonsense, and that his top military and foreign policy advisors were fools or liars. If he felt that, he did an extraordinary job of concealing it.” I agree wholeheartedly.
So what would have happened in Vietnam had JFK lived? Let the man who knew him best have the last word. Asked in 1964 whether America would have “go[ne] in on land” if the South Vietnamese were about to lose, Bobby Kennedy answered, “Well, we’d face that when we came to it.”
Part II of this series considers whether JFK’s assassination influenced the passage of LBJ’s sweeping social reforms.
A Facebook friend writes: “To what extent did Dallas factor into LBJ’s agenda getting through?”
That’s an easy one: quite nearly one hundred percent. There’s no question that Kennedy was an utter failure as a passer of laws during his proverbial thousand days. I wrote about that in Before the Storm: Barry Goldwater and the Unmaking of the American Consensus: “His only real legislative victory had come in the second week of his term, when the House voted to enlarge the size of the Rules Committee to dilute the power its reactionary majority of Northern Republicans and Southern Democrats had used to bog down…social legislation. But he won the victory by only a single vote.” (Those interested in more detail should seek out a 1968 book by Tom Wicker, JFK and LBJ: The Influence of Personality upon Politics.) And that victory, I wrote, “availed him nothing.” His bill to commit major federal funds to education for the first time failed; a bill for aid to depressed areas was watered down; a minimum wage increase was tiny, the number of workers it covered decreased. As for his heroic introduction of the sweeping civil rights bill, Robert Caro suggests that at the time of his death he was apparently ready to trade away its signature provision, the ban on discrimination in public accommodations. A housing bill and what would become Medicare were on the verge of failure—all this despite an approval rating in the 70s during the spring before his death.
Then, the assassination. Then, Teddy White’s proclamation that America had just been deprived of “Camelot” (more on that later!). Lyndon Johnson stood before a joint session of Congress and said, in words scripted by Kennedy’s great speechwriter Ted Sorensen, “All that I have I would have gladly given not to be standing here today…. On the 20th day of January, in 1961, John F. Kennedy told his countrymen that our national work would not be finished ‘in the first thousand days, nor in the life of this administration, nor even perhaps in our lifetimes on this planet. But,’ he said, ‘let it begin.’ Today, in this moment of new resolve, I would say to all my fellow Americans, let us continue!’ ”
Then came the legislative deluge. Same Congress; the only difference was the blatant and skilled manipulation of the memory of the fallen martyr by LBJ. Medicare. Medicaid. Civil Rights, without a single serious change from draft to passage. Federal aid to education. The tax cut I wrote about yesterday (he threatened to keep legislators in Washington through Christmas unless they passed it). Authorizing legislation for an “all-out war on human poverty,” claimed as an inheritance from Kennedy, though it had been Kennedy’s chairman of the Council of Economic Advisers’ idea to divert money to merely eliminating “pockets of poverty,” an idea tabled because Kennedy decided reaching out to suburban voters for 1964 was the more important priority.
Part 1 of Kennedy Week focuses on JFK’s legacy as a nuclear strategist and symbol of liberalism.
This week I threw it to the friends in my Facebook community (join us!) for requests about what I should write about for the fiftieth anniversary of John F. Kennedy’s death, which falls this Friday. I got a massive response—scores of questions. All this week I’ll be addressing the most popular and interesting ones.
The very first reply that came in was this: “I can never hear enough about how a liberal Massachusetts Democrat used intelligence and creative diplomacy to defuse the Cuban Missile Crisis and saved us all from nuclear annihilation.” With all due respect to the questioner, a smart and experienced liberal activist, plus the five folks who gave the question a thumbs-up on Facebook, I wondered initially whether his question wasn’t meant as snark—that he might be referring to Garry Wills’s very convincing argument that the Cuban Missile Crisis was all Kennedy’s fault. As it happens, I agree with Wills: I don’t think Kennedy and the Cuban Missile Crisis is something we should celebrate at all.
Wills made the case in the final section of his 1982 book The Kennedy Imprisonment: A Meditation on Power. Early in his term Kennedy fell in love with a plan, left over from Eisenhower’s administration, to send exiles to invade and overthrow Castro via a landing at the Bahía de Cochinos—the Bay of Pigs. He liked it so much because it was Kennedyesque: “A James Bond exploit blessed by Yale, a PT raid run by PhDs.” A failed invasion, his fault; then, despite the conventional wisdom that he learned from the failure, rather than leave well enough alone, Kennedy’s CIA kept on proliferating increasingly knuckle-headed schemes (exploding cigars!) to assassinate Castro, some using Mafia operatives. One set of plans on the drawing boards: “Operation Northwoods,” which proposed, among other ideas, creating the pretext for another American invasion. James Bamford wrote that the goal of the project was “for innocent people to be shot on American streets; for boats carrying refugees fleeing Cuba to be sunk on the high seas; for a wave of violent terrorism to be launched in Washington, D.C., Miami, and elsewhere. People would be framed for bombings they did not commit; planes would be hijacked. Using phony evidence, all of it would be blamed on Castro.”
We sometimes hear the argument that Kennedy never knew about the depths to which such madcap plotting sank, the plots being always devised to protect the president via maximal “plausible deniability”—but what is undeniable is that the ultimate aim, overthrowing Castro, came straight from the top. The American people didn’t know about any of this, but the Cuban government did. So no wonder they wanted nukes. But there are also outstanding arguments that JFK’s admittedly outstanding and mature diplomacy once the missiles were placed in Cuba did not save us from nuclear annihilation at all. The logic of deterrence rendered those missiles virtually useless. For if a Communist first strike was launched from the Soviet Union, America would have time, during the Soviet missiles’ long flight, to destroy the Cuban missiles before they could be used; if the missiles from Cuba struck first, the president would still have time to push the proverbial button and annihilate the Soviet Union. The only thing those Cuban missiles were useful for, in fact, was preventing America from illegally overthrowing the Castro government. So if you think that’s a splendid thing, yes, celebrate Kennedy for the Cuban Missile Crisis. Otherwise: not so impressive.
* * *
Next up! “I’d love to read your take on Ira Stoll’s book arguing that JFK was actually a conservative.”
The book is JFK, Conservative. Here’s the blurb: “[B]y the standards of both his time and our own, John F. Kennedy was a conservative. His two great causes were anticommunism and economic growth. His tax cuts, which spurred one of the greatest economic booms in our history, were fiercely opposed by his more liberal advisers. He fought against unions. He pushed for free trade and a strong dollar. And above all, he pushed for a military buildup and an aggressive anticommunism around the world…. Not every Republican is a true heir to Kennedy, but hardly any Democrats deserve that mantle.”
I have, of course, heard such claims for ages. What to make of them? Granted, I haven’t read the book, and maybe Stoll’s supporting arguments are so subtly brilliant that he’s suddenly rendered them convincing. But he’d have to be smarter than Einstein to do so. It’s not a great start that the blurb advertising his book contains a basic logical error. One can’t be a conservative “by the standards of both his time and our own,” the space in between being some fifty years filled with massive social changes on virtually every front, any more than something can be simultaneously matter and anti-matter. What is considered “conservative,” and what is considered “liberal,” changes in any given era. Calling tax cuts “conservative,” as such, is shockingly historically ignorant: the idea of tax-cutting as a signature conservative gesture dates only to the late 1970s and the arguments of supply-siders like Jude Wanniski. When Wanniski made his arguments to Ronald Reagan’s very conservative adviser Peter Hannaford in 1976, Hannaford looked at Wanniski like he was crazy and walked away; the previous year, liberal Democrats were the ones pushing a $29.2 billion permanent tax cut as against President Ford’s wish for $16 billion in temporary tax cuts.
As for Kennedy’s tax cut specifically (which was actually Johnson’s tax cut: it went through early in 1964, and are conservatives now claiming Johnson as one of their own?), the historian David Greenberg niftily put paid to that in a piece Stoll must have missed when it came out ten years ago. Yes, the law that passed ended up lowering the top marginal tax rate from 91 to 70 percent, and if Stoll is willing to join the Kennedy-Johnson bandwagon by bringing back that top rate, I’m glad to join him. But the blunt fact of the matter was that the tax cut was designed to create a deficit, and designed mostly to put money into poorer consumers’ pockets: it was explicitly Keynesian, through and through—the opposite of Reaganite “supply-side” thinking. Businessmen—conservatives—mostly hated it. Because, back then, it was “conservative” to favor fiscal probity even if it took higher taxes to do it.
OK: “He fought against unions.” Um, he fought against union corruption. If Stoll thinks liberals prefer corrupt unions, I don’t know what to say to him. That’s generally the conservative line. As Barry Goldwater said during the hearings Kennedy helped run in the late 1950s that took on Jimmy Hoffa’s Teamsters, “I’d rather have Jimmy Hoffa stealing my money than Walter Reuther stealing my freedom.”
What about Kennedy’s anticommunism? Was that “conservative”? Sure, if you’re stupid beyond stupid. Anticommunism in its modern form was invented by liberals like Harry Truman, the architect of the national security state. The proportion of the voting population that was not anticommunist in 1961 was minuscule. Here’s another, related, question from one of my Facebook friends, another five-thumbs-up popular favorite: “I’d love a perspective on his brand of liberal anticommunism and how it fit into the era.” What did it mean to be a conservative anticommunist during that time? Mostly, it meant being idiotic. Barry Goldwater’s 1962 book on the subject, Why Not Victory?, built on the argument in the last chapter of Conscience of a Conservative that it should be America’s foreign policy to blithely welcome nuclear war if that was what it took to “advance the cause of freedom.” Yes, literally.
Conservatives like Goldwater (not to mention conservatives in the John Birch Society, who believed the most important thing to know about Communism was that its denizens had infiltrated the federal government all the way to the top, but maybe Ira Stoll agrees?) also believed it was futile to negotiate with the Soviet Union about anything. Why was this especially idiotic? Because historically, relaxation in tensions between the US and the USSR had always been the variable most likely to weaken the hold of totalitarianism within the Soviet Union, opening space for the dissidents whose courage eventually brought down the system. (Conservatives habitually travesty both historical fact and the courageous legacy of these dissidents when they argue otherwise.)
Now, as I noted above, Kennedy’s anticommunism could be stupid, too. But it was most stupid when it was most conservative—see above.
So why is it accurate to say that Kennedy was affirmatively liberal—if too often, as we’ll examine next time, a timid one? For one, because he said he was, out and proud, for instance in this most useful of utterances: “If by a ‘liberal’ they mean someone who looks ahead and not behind, someone who welcomes new ideas without rigid reactions, someone who cares about the welfare of the people—their health, their housing, their schools, their jobs, their civil rights and their civil liberties—someone who believes we can break through the stalemate and suspicions that grip us in our policies abroad—if that is what they mean by a ‘liberal’ then I’m proud to say I’m a liberal.”
The proof was in the pudding. His first debate with Richard Nixon in 1960, remembered now because Kennedy looked hale and ruddy and Nixon looked sweaty and haggard, should also be remembered for Kennedy’s central policy argument: free medical care for the aged, what would later come to pass as Medicare, as an affirmation and extension of the New Deal legacy:
“I want the individuals to meet their responsibilities. And I want the states to meet their responsibilities. But I think there is also a national responsibility. The argument has been used against every piece of social legislation in the last twenty-five years. The people of the United States individually could not have developed the Tennessee Valley; collectively they could have. A cotton farmer in Georgia or a peanut farmer or a dairy farmer in Wisconsin and Minnesota, he cannot protect himself against the forces of supply and demand in the market place; but working together in effective governmental programs he can do so. Seventeen million Americans, who live over sixty-five on an average Social Security check of about seventy-eight dollars a month, they’re not able to sustain themselves individually, but they can sustain themselves through the social security system.”
Kennedy went on, slapping Ira Stoll down from beyond the grave:
“[W]hat is the party record that we lead? I come out of the Democratic party, which in this century has produced Woodrow Wilson and Franklin Roosevelt and Harry Truman, and which supported and sustained these programs which I’ve discussed tonight. Mr. Nixon comes out of the Republican party. He was nominated by it. And it is a fact that through most of these last twenty-five years the Republican leadership has opposed federal aid for education, medical care for the aged, development of the Tennessee Valley, development of our natural resources. I think Mr. Nixon is an effective leader of his party. I hope he would grant me the same. The question before us is: which point of view and which party do we want to lead the United States?”
That’s why John F. Kennedy was a liberal, which happens to be why I am a liberal too.
Wendell Berry commemorates the assassination of JFK in his poem “The Light of all His Last Days.”