
Katrina vanden Heuvel | The Nation

Politics, current affairs and riffs and reflections on the news.

The Women Candidates We Need


Hillary Clinton. (Reuters/Jacquelyn Martin)

Editor’s Note: Each week we cross-post an excerpt from Katrina vanden Heuvel’s column at the WashingtonPost.com. Read the full text of Katrina’s column here.

“Just lunch, or is it Campaign 2016 just getting started?” one pundit breathlessly asks of a meal between President Obama and his former secretary of state, Hillary Clinton. The New York Times does a deep dive into the Clinton Foundation, while others list “The People Already Rearranging Their Lives for Hillary Clinton’s 2016 Campaign.” And every major news outlet has asked some form of this question: Is America ready for a woman president?

The media are, in fact, obsessed with whether Hillary Clinton will become the first female president. Her every move is analyzed and interpreted, like tea leaves from which we might deduce her 2016 intentions. But in their heavy breathing over Clinton, many in the media seem to be ignoring an equally important story about women and politics. Put another way, instead of setting up a beat dedicated to covering Clinton, perhaps the Times could better serve the public by using those resources to cover women and politics more broadly.

Will shattering the Oval Office’s glass ceiling and electing a madam president be an inspiring achievement for this country? Of course. Do we also need madam mayors, madam senators, madam councilwomen, madam sheriffs, madam governors and madam congresswomen all across the nation? You betcha.


A Populist Insurgency in New York City


Bill de Blasio speaks with potential voters on July 30, 2013. (AP Photo/Frank Franklin II)

Editor’s Note: Each week we cross-post an excerpt from Katrina vanden Heuvel’s column at the WashingtonPost.com. Read the full text of Katrina’s column here.

For the most part, Americans outside of New York have heard only one story about New York City’s mayoral race—the bizarre public self-immolation of former representative Anthony Weiner. But obscured beneath the floodlights of the Weiner farce is a populist insurgency that exemplifies the coming struggle to define the Democratic Party in the wake of President Obama.

The progressive champion in the race, New York Public Advocate Bill de Blasio, is challenging the odds-on favorite, City Council Speaker Christine Quinn, to succeed retiring Mayor Michael Bloomberg. Under pressure from de Blasio and progressives, Quinn has recently begun to assert some independence from Bloomberg’s trickle-down technocratic politics and shown a willingness to challenge the administration on certain issues, such as the city’s harmful homelessness policies. But she has too often used her influence as speaker to protect corporate and developers’ interests.

De Blasio has pitched his campaign with the most populist and ambitious agenda in memory. He does so in a city that is one of the most unequal in the country, with an extreme gulf in income and wealth. Visitors gape at Manhattan’s skyscrapers, but almost half the population lives at or near the poverty level. In any one year, 1.5 million suffer hunger or food insecurity. Accelerating gentrification has made affordable housing scarce. Public schools are in crisis. Bloomberg has vetoed efforts to pass a living wage, and he is so anti-labor that all 152 of the city’s public-employee unions are now without a contract.

De Blasio argues that New York is a “tale of two cities,” and that the central issue of this and future campaigns is “economic fairness.” “Without a dramatic change of direction,” he said in a May 30 address, “an economic policy that combats inequality and rebuilds our middle class, generations to come will see New York as little more than a playground for the rich.”


This Week in 'Nation' History: James Baldwin's Four Decades of Prophecy, Confession, Emotion and Style


James Baldwin. (AP Photo)

Last week marked what would have been the eighty-ninth birthday of the late James Baldwin—novelist, essayist and, for the last decade of his life, valued member of The Nation’s editorial board. Baldwin became internationally famous as the author of fictional works like Go Tell It on the Mountain (1953) and non-fiction collections like The Fire Next Time (1963), but his first-ever published piece, before moving from Greenwich Village to Paris, was a 1947 Nation review of a collection of Maxim Gorki’s short stories. In the review, one glimpses the beginnings of the qualities Saul Maloff noted in his Nation review of The Fire Next Time—“the confessional voice, the apocalyptic style, the prophetic warning, the turbulent emotion contained and disciplined by stylistic elegance, the gospel of love after the storm of hate.” One also sees a young writer beginning to construct his own identity around a set of fundamental values he cherishes in others. “Here, above all,” he writes of Gorki, though it could just as well be of himself, “is a carefully controlled rage at the lot of men and an insistence on their noble destiny.”

These qualities only became more refined in his later contributions. In a dual review in 1956 of the French writer Daniel Guérin’s Communist-inflected Negroes on the March and J.C. Furnas’ Goodbye to Uncle Tom, Baldwin criticized the orthodox Marxist analysis of America’s racial problems as both simplistic and dangerous, a point he would develop throughout his career:

Indignation and goodwill are not enough to make the world better. Clarity is needed, as well as charity, however difficult this may be to imagine, much less sustain, toward the other side. Perhaps the worst thing that can be said about social indignation is that it so frequently leads to the death of personal humility. Once that has happened, one has ceased to live in that world of men which one is striving so mightily to make over. One has entered into a dialogue with that terrifying deity, sometimes called History, previously, and perhaps again, to be referred to as God, to which no sacrifice in human suffering is too great.

As the times changed, so did Baldwin’s tone. Ten years later, in “A Report from Occupied Territory” (July 11, 1966), writing about the riots that had gripped New York’s streets for the past several summers, he offered a blistering attack on the deeper causes of America’s mid-1960s racial strife:

…the police are simply the hired enemies of [the black] population. They are present to keep the Negro in his place and to protect white business interests, and they have no other function. They are, moreover—even in a country which makes the very grave error of equating ignorance with simplicity—quite stunningly ignorant; and, since they know that they are hated, they are always afraid. One cannot possibly arrive at a more sure-fire formula for cruelty.

This is why those pious calls to “respect the law,” always to be heard from prominent citizens each time the ghetto explodes, are so obscene. The law is meant to be my servant and not my master, still less my torturer and my murderer. To respect the law, in the context in which the American Negro finds himself, is simply to surrender his self-respect.

Baldwin also prefigures today’s anger over “stop and frisk,” part of what came to be known as the Rockefeller drug laws, when he wrote that the law permitted policemen “to stop anyone on the streets, at will, at any hour, and search him.” Baldwin recognized the inherent racism and unconstitutionality of stop-and-frisk more than four decades before it became a major issue in this year’s mayoral campaign. “Harlem believes, and I certainly agree, that these laws are directed against Negroes,” Baldwin wrote in The Nation. “They are certainly not directed against anybody else.”

This arrogant autonomy, which is guaranteed the police, not only in New York, by the most powerful forces in American life—otherwise, they would not dare to claim it, would, indeed, be unable to claim it—creates a situation which is as close to anarchy as it already, visibly, is close to martial law.

Just before the 1980 presidential election, Baldwin surveyed the American political landscape in “Notes on the House of Bondage,” an attempt at answering the question posed by his nieces and nephews, “Who are you going to vote for, Uncle Jimmy?” Eventually he says he’ll vote for Carter simply as “a coldly calculated risk, a means of buying time.” He also explores the differences between veteran activists of his generation—those who linked arms with Marlon Brando at the March on Washington—and more impatient, post-black power activists of the present one:

Someone my age…may be pleased and proud that Carter has blacks in his Cabinet. A younger person may wonder just what their function is in such a Cabinet. They will be keenly aware, too, that blacks called upon to represent the Republic are, very often, thereby prohibited from representing blacks. A man my age, schooled in adversity and skilled in compromise, may choose not to force the issue of defense spending versus the bleak and criminal misery of the black and white populations here, but a younger man may say, out loud, that he will not fight for a country that has never fought for him and, further, that the myth and menace of global war are nothing more and nothing less than a coward’s means of distracting attention from the real crimes and concerns of this Republic. And I may have to visit him in prison, or suffer with him there—no matter. The irreducible miracle is that we have sustained each other a very long time, and come a long, long way together. We have come to the end of a language and are now about the business of forging a new one. For we have survived, children, the very last white country the world will ever see.

* * *

In addition to his own contributions to the magazine, Baldwin’s writing has been the subject of vigorous debate by Nation critics, from Nelson Algren—who liked Baldwin’s second novel, Giovanni’s Room (1956), enough to call it “more than another report on homosexuality”—to Todd Gitlin—who wrote in 1974 that it was possibly Baldwin’s “limpid condensation” of experience “which makes him so quotable and so esteemed by a middle-class white public which is looking for ‘civilized’ access to Those People.” While Baldwin’s essays have always been treated with the utmost reverence in The Nation, his fiction did not fare as well. Randall Kenan’s 1994 review of David Leeming’s biography of Baldwin noted that this division became the fashionable story about Baldwin beginning with Tell Me How Long the Train’s Been Gone. (“His insights are unremarkable and blurred,” wrote Robert Emmet Long in The Nation, June 10, 1968.) The very title of Saul Maloff’s 1962 review of Another Country—“The Two Baldwins”—emphasized that divide.


* * *

Subscribers to The Nation can access our fully searchable digital archive, which contains thousands of historic articles, essays and reviews, letters to the editor and editorials dating back to July 6, 1865.

A Debt-Free College Education


An activist dressed as the “Master of Degree” holds a ball and chain representing his college loan debt. (AP Photo/Jacquelyn Martin)

Editor’s Note: Each week we cross-post an excerpt from Katrina vanden Heuvel’s column at the WashingtonPost.com. Read the full text of Katrina’s column here.

Last Wednesday—almost a month after Congress failed to prevent student loan rates from doubling—Democrats and Republicans reached a compromise that will keep rates low, at least temporarily, for most graduates.

From a body that procrastinates on student debt worse than students procrastinate on term papers, this was welcome news. But let’s not get ahead of ourselves.

Indeed, the price of higher education—and how that price is paid—is still a huge problem in this country. Federal and private student loan debt now exceeds $1 trillion. Today, the average graduate leaves school with nearly $30,000 in debt.

And those are just the students who actually graduate. For millions of students, America’s university system is not a pathway to success but a debt trap. As of 2011, nearly half the students enrolled in four-year programs—and more than 70 percent of students in two-year programs—failed to earn their degrees within that time, with many dropping out because of the cost. They leave school far worse off than when they arrived: saddled with debt, but with no degree to help them land a job and pay it off.



This Week in 'Nation' History: Hiroshima and the Roots of American Secrecy


An allied correspondent stands in a sea of rubble in Hiroshima Sept. 8, 1945, a month after the first atomic bomb ever used in warfare was dropped by the US. (AP Photo/Stanley Troutman)

There is so much to mourn when we think of Hiroshima. Most importantly, as many as 80,000 Japanese civilians evaporated when the Enola Gay dropped Little Boy sixty-eight years ago this week. Fifty thousand more later succumbed to radiation poisoning and other ailments. But we also mourn the end of whatever human innocence remained intact after the atrocities of the previous six years of war, not to mention the preceding tens of thousands of years. “With this bomb,” President Harry Truman announced, returning from Potsdam aboard the USS Augusta, “we have now added a new and revolutionary increase in destruction.” That, too, should be—and in the pages of The Nation since 1945, has been—mourned.

Initially, The Nation’s response to Hiroshima echoed Truman’s justification for it as a necessary, and desirable, means of ending the Pacific war—one which saved Japanese and American lives. In an editorial in the first issue after August 6, then editor-in-chief Freda Kirchwey wrote:

From the point of view of military strategy, $2,000,000,000…was never better spent. The suffering, the wholesale slaughter it entailed, have been outweighed by its spectacular success; Allied leaders can rightly claim that the loss of life on both sides would have been many times greater if the atomic bomb had not been used and Japan had gone on fighting. There is no answer to this argument.

Future Nation writers, as well as many historians, would disagree, as we’ll see below. But just days removed from the event itself, Kirchwey was understandably more concerned about planning for a drastically transformed future than doubting the official story—which would have been a difficult task anyway, given the scant information the Truman administration had provided about the decision to use the bomb. Kirchwey argued that there was only one way to safely and justly contain what Truman had called “a harnessing of the basic power of the universe”:

If we are to survive our new powers we must understand their full meaning. We shall have to move fast, both internationally and within each country. No longer can we afford a world organized to prevent aggression only if all of the great powers wish it to be prevented. No longer can we afford a social system which would permit private business, in the name of freedom, to control a source of energy capable of creating comfort and security for all the world’s people. This seems self-evident, and so it is. But it calls for changes so sweeping that only an immense effort of will and imagination can bring them about. A new conference of the nations must be assembled to set up a World Government, to which every state must surrender an important part of its sovereignty. In this World Government must be vested the final control over atomic energy. And within each nation the people must establish public ownership and social development of the revolutionary force war has thrust into their hands. This program will sound drastic only to people who have not yet grasped the meaning of the new discovery. It is not drastic. We face a choice between one world or none.

That fall, the Nation Associates—now the Nation Builders—hosted one of the first forums to discuss how atomic energy and weaponry had changed domestic and international political questions, as Sara Alpern records in her outstanding 1987 biography of Kirchwey. The British political theorist and Labour Party chairman Harold Laski, headlining the event, seconded Kirchwey’s argument, telling the crowd that only international socialism could protect humanity from destruction. “No one nation is fit to be trusted with the development of atomic energy,” Laski declared. “We must plan our civilization or we must perish.”

Several articles in the decade after 1945 explored the impact of the bomb on the ground—including a 1946 review of John Hersey’s seminal book Hiroshima and, in 1955, translations of excerpts from the recently published memoirs of several Japanese survivors. “There was a column of fire about ten yards ahead of me—a regular waterfall of fire—with terrific explosions like the sound of a thousand thunderclaps,” Yasuo Yamamoto wrote. “The screams of babies and women and the helpless calling for lost ones poured into my ears like water from a dam that has broken.”

* * *

As the decades passed and the Cold War congealed into a seemingly perpetual state of nuclear standoff, Nation writers began to consider the ways in which so much of contemporary political thinking and behavior could be traced directly to Hiroshima. In 1981’s “Hiroshima and Modern Memory,” the Pulitzer-winning historian and current Nation editorial board member Martin Sherwin wrote:

The American public’s sense of powerlessness before a monster its own government created and used may be the single most important reason behind the easy acceptance of the idea—so vigorously promoted by the Reagan Administration—that only nuclear superiority can guarantee our national security. Even here, the debate over the atomic bombings of Hiroshima and Nagasaki is relevant, for it is of paramount importance to those who wish to rely increasingly upon nuclear weapons that these weapons not be tarnished with a sense of guilt that could inhibit their use as an instrument of diplomacy.

However, the least obvious impact of Hiroshima and Nagasaki may be the most important: the subtle conversion of tens of millions of people over the course of thirty-six years of nuclear arms racing to the idea that nuclear war is inevitable. The button exists and someday someone will push it; nothing can prevent that. Technology has altered our confidence in free will.

Nation writers also began to reconsider whether the bombing needed to happen in the first place, and why it did. While Kirchwey saw the post-war diplomatic implications of the atomic bomb as secondary, though important, consequences, Sherwin—and many Nation writers since—have tended to view the impending power struggle between the United States and the Soviet Union as the primary motivation behind the bomb’s use, if not its design:

Truman inherited the basic policy that governed the atomic bomb, just as he inherited every other policy related to the war, a point that commentators on both sides of the debate often ignore. It was therefore possible to use the bomb only because Roosevelt had made preparations to do so. Truman was inclined to use the bomb because of those preparations. But he decided to use it because there seemed no good reason not to. On the contrary, the bombs were available and the Japanese fought on; the bombs were available and precedents of burned cities were numerous; the bombs were available and $2 billion had been spent to create them; the bombs were available and revenge had its claim; the bombs were available and the Soviet Union was claiming too much.

In 1995’s “The Atomic Curtain,” the psychohistorian Robert Jay Lifton and current Nation blogger Greg Mitchell, co-authors of Hiroshima in America: Fifty Years of Denial, found in the ultra-secretive creation of the bomb and in the deliberations over using it the origins of the same National Security State that today identifies as a top national priority the persecution of a 29-year-old former intelligence analyst for telling the American people what they have every right to know. “Hiroshima was the mother of all cover-ups,” they wrote, “creating distortions, manipulative procedures and patterns of concealment that have affected all of American life. Secrecy has been linked with national security—and vice versa—ever since.”

Starting with Hiroshima, officials advised Americans to leave all problems surrounding the bomb to political, scientific and military leaders—the nuclear priesthood. Americans were not supposed to think critically or engage in the debate over the gravest issue of our age. Over time, we became accustomed to bowing out of that discussion, and then of debates involving other major issues. We got used to putting the greatest problems, military and social, completely in the hands of experts and political leaders who claimed to have them under control—only to recognize in painful moments that they didn’t have them in hand at all. Surrendering our right to know more about Hiroshima, and later nuclear policies, contributed to our gradual alienation from the entire political process.

The message of the official Hiroshima narrative was control: controlling the story of Hiroshima, controlling nuclear weapons, controlling history. But the official narrative also increased ordinary Americans’ sense of being out of control of their own destiny, of being out of control of the forces that determine their future.

No wonder, then, that the American people have come to feel deceived by the bomb and its caretakers. We know that ominous truths have been concealed from us—starting with Hiroshima. One reason we remain confused is that part of each of us psychologically colludes in the concealment. But our resentment at what has been concealed and falsified does not necessarily limit itself to nuclear matters but can spread, vaguely and bitterly, into just about any aspect of social and national experience.

We have to ask ourselves, then, how much of our mistrust of politicians and public officials, of the media, of our government and just about all who govern us—how much of this angry cynicism so evident in our public life in recent years—is an outcome of the Hiroshima and post-Hiroshima nuclear deceptions and concealments. To what extent do we feel ourselves a people who have been unforgivably deceived in that most fundamental of human areas—having to do with how, when and by whose hand, or lethal technology, we are to die?

* * *

The Nation has long been concerned with more than exploring historical arguments or finding in the past the roots of present predicaments, however important that is. Our remarkable peace and disarmament correspondent, Jonathan Schell, has written extensively in The Nation about the necessity of universal nuclear disarmament and the prospects for achieving it. Perhaps his magnum opus on this subject was The Gift of Time, originally published as a special issue dated February 9, 1998, and later as a book. Schell interviewed fifteen major international experts and officials—including Robert McNamara and Mikhail Gorbachev—about the possibility of nuclear disarmament in a post–Cold War age:

The task is of course immense. But history has given us the gift of time—a limited time, perhaps, but enough to proceed, without haste, to scout the obstacles in our path, to weigh carefully and thoroughly the course to be followed, and then to create the structures that will carry us to the goal and keep us there. If we use the gift properly and rid the species for good of nuclear danger, we will secure the greatest of time’s gifts, assurance of a human future.

Yesterday’s New York Times reported that President Obama is unsure what to speak with Vladimir Putin about if the two do indeed meet next month. He may want to reread Schell’s essay, and start there.


* * *

In 2010, we put together a slide show of excerpts from The Nation in the nuclear age. Subscribers to The Nation can access our fully searchable digital archive, which contains thousands of historic articles, essays and reviews, letters to the editor and editorials dating back to July 6, 1865.

Researched by and written with Richard Kreitner.

The GOP Misunderstands the 'War on Women'


San Diego Mayor Bob Filner speaks during a news conference at City Hall, Friday, July 26, 2013. (AP Photo/Gregory Bull)

Editor’s Note: Each week we cross-post an excerpt from Katrina vanden Heuvel’s column at the WashingtonPost.com. Read the full text of Katrina’s column here.

You can’t say Republicans lack for chutzpah. The cynical right-wing message-men have come up with a new insult to our intelligence—and to millions of US women. As Buzzfeed reported Friday, Republicans are now spinning a series of scandals to try to prove the Democrats are the party with the real “War on Women.” That’s just silly, and they know it.

Needless to say, some current and former Democratic pols haven’t exactly covered themselves in glory recently. San Diego Mayor Bob Filner’s refusal to resign, despite an apparent pattern of repeated abuse, is particularly outrageous. But sexual indiscretion and sexual harassment (two types of scandal that shouldn’t be conflated) know no partisan affiliation. Remember Herman Cain, onetime GOP presidential frontrunner and accused serial sexual harasser? National Journal reported at the time that “scores of interviews with Iowa Republicans over the weekend turned up scant outrage” over the allegations. Some high-profile Republicans even questioned the concept of sexual harassment itself, with Representative Steve King calling it “a terrible concept,” and Senator Rand Paul warning that some now “hesitate to tell a joke to a woman in the workplace…” The horrors!

Two months ago, Kirsten Anderson, the former communications director for the Iowa Senate Republicans, announced she’d be filing a civil rights complaint after a firing she charged constituted retaliation for speaking up about a “sexually hostile work environment.” Anderson told The Des Moines Register she’d complained about men making comments about women’s bodies and sexual orientation, and that it cost her her job. Where’s the Republican outrage over that? Don’t hold your breath.


This Week in 'Nation' History: How Not to Make a Dust Bowl Worse

This week’s cover story by Sasha Abramsky, “Dust Bowl Blues,” explores the severe drought that has afflicted the American Southwest for the past few years, devastating crops, communities, and climates. “Just like in the days of the Dust Bowl,” Abramsky writes, “a way of life is under threat here, as are the livelihoods of millions of people.”

In the summer of 1937, with the original Dust Bowl and the second wave of the Great Depression in full effect, The Nation published four features, one each month, by the radical cartoonist and painter William Gropper, who, using Works Progress Administration funds, went on a tour around the country to document the resilience with which ordinary people were facing those twin man-made disasters. Gropper’s simple, evocative sketches and crisp paragraphs have the same effect as more famous artifacts of creative reportage from the American West of the late 1930s, like John Steinbeck’s novels or Woody Guthrie’s songs: that of making the reader or listener feel dusty just by reading or listening.

His first Nation dispatch that summer, “Gropper Visits Youngstown” (July 3, 1937), derived from the same visit to the Ohio city that produced his most famous painting, Youngstown Strike—recording the tumultuous scene at a major labor walkout led by the Congress of Industrial Organizations. But it is his August page, “The Dust Bowl,” that provides a remarkably poignant and sympathetic account of what the Southwest was like in those years:

When the wind blows, you get a blow of nice hot dust in your nose, eyes, and throat. The landscape is simple—sky and sand for miles. At night you can get some sleep with a wet cloth over your face, or if you’re prepared with a mask you can manage to get by until you get used to it. In Elkhart, Kansas, I saw a farmer plowing. No sooner did he pass with the plow than the dust blew over and covered up the earth as if nothing had happened. I asked him if he got any crops, and he said, “No.” “Then why plow?” And he said, “Been doin’ it fer years; it’s a habit now,” and went on with his plowing.

Another blurb tells a story that any American reader, thanks to Steinbeck and Guthrie, can relate to—though Gropper, because he is using a weekly magazine and not a novel or a song, manages it with perhaps more immediacy:

More than once on the road I’ve seen families packing their belongings in old Fords and leaving the old homestead. Many of them, I’ve been told, are migrating to California—not because they want to, but because they have been evicted.

* * *

The Nation’s next—and, until Abramsky’s piece this week, last—article about the Dust Bowl was written by Kunigunde Duncan (a pseudonym for the Kansan writer Flora Isely) in 1939. In “Reclaiming the Dust Bowl,” Duncan wrote about how farmers on the high plains were using technology and grit to reconstitute the soil and finally beat back the dust storms of the previous six years. Just as Abramsky details the ways in which human-caused global climate change has contributed to and exacerbated the present drought in the Southwest, Duncan reported that the Dust Bowl of the 1930s was similarly attributable to human actions:

The Dust Bowler, seeking means of combating these terrible conditions, went first into causes and found that he was having to fight more than super-temperatures, water shortage, and constant wind-whipped dust. He was having to fight the mistakes of his predecessors.

Duncan blamed railroad tycoons and the settlers they brought in by the hundreds, who forced the land to produce crops suited to the East; cattle ranchers who overgrazed the prairie; and the World War I demand for bread, which led farmers to replace drought-resistant native buffalo grass with fields and fields of wheat. Her description of the disaster that ensued has echoes of Gropper’s—and Steinbeck’s, too:

The high plains have never had abundant rainfall. But when the annual precipitation of from fourteen to twenty inches decreased to from eight to fourteen, the situation became acute.

It might be worth noting that Lubbock, Texas, the town profiled in Abramsky’s “Dust Bowl Blues” this week, received a grand total of 11.43 inches of precipitation in 2012, according to the National Weather Service. In 2011, it had less than 6 inches. In 1937, it had 22.25 inches. Duncan continued:

“Dusters” began to boil up and shut out the sun, and everywhere the question was asked, “What shall we do?” It was a question that remained unanswered for many months while gas engines refused to run and locomotives crawled through a springless, viewless land—a land where people live with windows weather-stripped tightly with adhesive tape to exclude the penetrating silt; where wet sheets were hung above beds and about the walls to save the lives of the old, the ill, and the new-born…where a weird purplish sun guided the funeral procession of those who had died of dust pneumonia.

* * *

Abramsky reports that not only is the present drought crisis in the Southwest, to some extent, a human-caused phenomenon, but it also continues to be exacerbated by a broken American political system.

Congress has an opportunity to address this crisis through the farm bill, which is currently the topic of robust debate in Washington. On July 11, with help from the powerful agribusiness lobby, the House passed legislation giving large farms the ability to buy “shallow loss insurance,” which would guarantee up to 90 percent of their income—thus providing a perverse incentive for agribusiness to try to cultivate land manifestly unsuited to the crops in question.

Two years ago, as it became apparent that this would be an especially contentious episode in the perpetual drama that is the farm bill’s quinquennial renewal, the ecology writer and activist Daniel Imhoff wrote in The Nation about the prickly questions embedded in the debate—and, like Abramsky, connected the dots between agricultural legislation, drought and climate change:

Generous crop insurance programs and increasing global demand for animal feed and biofuels have triggered an aggressive expansion of farming and ranching activities into marginally productive areas. Water tables are plummeting and aquifers are being depleted. Soil erosion is on the rise. Agriculture contributes to greenhouse gases, loss of biodiversity, chemical runoff in waterways, compromised animal welfare and food contamination. Farm bill conservation programs could be the only thing that stands between us and another Dust Bowl, collapsed watershed or imperiled species.

Sure enough, the farm bill passed by the House earlier this month—in addition to stripping out the food-stamps program, thereby sundering the traditional farm-bill coalition in two—disconnected conservation compliance from crop insurance payouts, essentially declawing the bill’s already insufficient environmental-protection provisions. If the new Dust Bowl gets any worse, we may have to start looking for the next William Gropper.


Helen Thomas's Legacy


President Gerald Ford talks with reporters, including Helen Thomas, during a press conference at the White House. (Courtesy of Wikimedia Commons)

Editor’s Note: Each week we cross-post an excerpt from Katrina vanden Heuvel’s column at the WashingtonPost.com. Read the full text of Katrina’s column here.

In his refreshing appearance at the 2006 White House Correspondents Association dinner, comedian Stephen Colbert showed a parody video in which he “auditioned” for the position of press secretary. In it, he refuses to answer a question from real-life White House correspondent Helen Thomas and spends the rest of the video trying to escape her dogged questioning.

It was a brilliant turn, not only for its skewering of a Washington press corps that was asleep while President George W. Bush took us to war in Iraq but for its implicit praise of the tenacious, shoe-leather reporting of Thomas, who died Saturday at the age of 92.

Born two weeks before women officially won the right to vote, Thomas broke glass ceiling after glass ceiling as a woman journalist, including by becoming the first female member of the White House Correspondents Association and the Gridiron Club. She fought, with characteristic perseverance, to join these organizations. It wasn’t because she saw being in the room as an end in itself. Rather, she understood that she needed to have access to power in order to question that power.


Jenny McCarthy's Vaccination Fear-Mongering and the Cult of False Equivalence


Jenny McCarthy addresses the audience at an Ante Up for Autism fundraiser. (Courtesy of Flickr user Michael Dorausch)

On February 28, 1998, a British physician named Andrew Wakefield published a paper in The Lancet that purported to identify a link between the measles, mumps and rubella (MMR) vaccine and the appearance of autism in children. The results provoked a widespread backlash against vaccines, forcing the medical community to spend years attempting to debunk his false claims. Eventually, it was revealed that Wakefield had fabricated his research as part of a scheme that promised him millions of dollars. Wakefield suffered a dramatic public downfall—his medical license stripped, his paper retracted from publication—but the damage was done. His propaganda had led to decreased immunization rates and an outbreak of measles in London.

Wakefield’s falsified claims remain at the core of a stubbornly popular anti-vaccination movement. To this day, despite overwhelming evidence to the contrary, many people believe that vaccines are the principal cause of autistic spectrum disorders.

One of the most prominent promoters of this falsehood is actress Jenny McCarthy, who was recently named as Elisabeth Hasselbeck’s replacement on ABC’s hit daytime talk show, The View. Once she’s on air, it will be difficult to prevent her from advocating for the anti-vaccine movement. And the mere act of hiring her would seem to credit her as a reliable source.

In 2007, McCarthy debuted her views on the national stage when she appeared on The Oprah Winfrey Show to discuss autism, which is being diagnosed at alarming rates and continues to baffle medical researchers. McCarthy was convinced that vaccines gave her son autism and seizures. In addition to a gluten-free diet, aromatherapies, B-12 shots and vitamins, she also tried chelation therapy, which is meant to remove toxic substances from the body. Her son, she claimed, was “cured.”

Within the first few minutes of the interview, McCarthy cited as reasons for her success a “little voice” and her “mommy instincts,” all while denigrating several doctors and EMTs.

Oprah Winfrey’s decision to let McCarthy act as an expert, to dismiss science with alchemy, without asking any tough questions, was unconscionable. The same could be said of the producers of Larry King Live and Good Morning America, both of which hosted McCarthy soon after. Even though they at least asked questions about her views, Larry King had her debate a doctor, as though her disproven ideas deserved the same weight as those of a medical expert.

In fact, McCarthy’s beliefs—that vaccines and mercury cause autism, that a good diet cures autism and that “diagnosticians and pediatricians have made a career out of telling parents autism is a hopeless condition”—have been roundly dismissed and discredited by doctors and scientists, who insist that her claims are based on no scientific data or research. McCarthy wasn’t deterred. “The University of Google,” she said to Oprah, “is where I got my degree from.”

Let’s be clear: there is no connection between vaccines and autism.

Despite the evidence, it is easy to understand why the parent of an autistic child—in fear and confusion and desperation—might find McCarthy’s claims enticing. These are parents at their most vulnerable, and McCarthy, though perhaps well intentioned, has preyed on them. This fear-mongering is incredibly dangerous, especially when a quarter of parents trust the information provided by celebrities about the safety of vaccines. A movement born out of Wakefield’s discredited research, animated by misinformation and promoted by people like McCarthy has fed an anti-vaccine frenzy, leading to a huge spike in cases of whooping cough in communities across the United States, especially in Washington State, which, in 2012, saw its worst epidemic in seventy years.

We see the same dangerous nonsense playing out with the HPV vaccine, a major breakthrough that can prevent cervical cancer and, it was recently found, throat cancer in men and women. Unfortunately, parents studying at McCarthy’s alma mater, the University of Google, are absorbing misinformation and refusing to vaccinate their kids.


These incidents reflect a broader disconnect between science and the media on a range of issues. The vast majority of scientists accept that evolution is real, that man-made climate change is occurring and that vaccines do not cause autism. But in the general public, these issues are often hotly debated, and, too often, the media fuel these arguments by airing junk science as though it were legitimate. The result? A major public health risk. Vaccine avoidance makes the entire country more susceptible to diseases like the measles that were once vanquished.

By giving science deniers a public forum, media outlets implicitly condone their claims as legitimate. As Columbia Journalism Review’s Brendan Nyhan recently argued in a post about McCarthy and her vaccination fear-mongering, “he said,” “she said” coverage has too often “put unsupported claims alongside credible arguments, or failed to push back altogether.” False equivalency is one of journalism’s great pitfalls, and in an effort to achieve “balance,” reporters often obscure the truth. What’s the merit in “he said, she said” reporting when he says the world is round and she insists it is flat? Indeed, there is an enormous cost to society when the truth could save lives.


This Week in ‘Nation’ History: Nelson Mandela’s Courage Through the Years


Former South African President Nelson Mandela. (AP Photo/Pool-Theana Calitz-Bilt)

The annual celebrations—for once, the appropriate word—surrounding Nelson Mandela’s birthday last week bore an extra note of bittersweetness this year, amid conflicting reports about the ex-president’s health. “Madiba is always with us,” one 12-year-old girl told The New York Times. “He gave us freedom.”

What is often lost, now that Mandela has achieved the status of wise elder, is the extent to which his victories are owed to significant streaks of both radicalism and Realpolitik running throughout his career—first as a principled revolutionary and later as the quintessential consensus-building politician.

The most recent Nation articles on Mandela highlighted this paradox. In January 2011, the Chilean-American author Ariel Dorfman wrote about how Mandela expressed concern, in his newly published memoir Conversations With Myself, about being remembered as an otherworldly ethical actor totally separate from difficult, still-unsettled political questions.

The end of the racist South African regime is simply inconceivable without the moral capital and charisma Mandela had accumulated during his prison years. As a symbol of dignity and resistance he was, well, irresistible; but the compassion he showed once he was released, the ability to speak to his enemies and bring them to the table, his disposition to forgive (but never to forget) the terror inflicted on him and his people, his willingness to see the good in others, to trust their deepest sense of humanity and honor, turned him into the sort of ethical giant that our species desperately needed in a petty era of devastation and greed. Such a halo can, however, be just as confining as an island where every move and word is guarded.

A tour through articles about Mandela from The Nation’s archives elucidates the same point, helping us to consider the 95-year-old Mandela as a man who, Dorfman wrote, “fortunately for himself and the world, is not, after all, a saint.”

* * *

The first mention of Mandela in our pages came in January 1966, when the white South African anthropologist Hilda Kuper reviewed No Easy Walk to Freedom, a collection of writings and speeches by the Robben Island prisoner. Kuper called Mandela “a man of courage and deep integrity, a tragic and noble figure,” and wrote that “his impressive and sincere speech” endorsing armed struggle—while defending himself against charges of treason at the Rivonia Trial in 1964—“is not that of a man who enjoyed violence, but of one driven to violence as the last resort.” Kuper closed her piece with an impassioned critique that represented the spirit of the first major international stirrings against apartheid, in the middle of the 1960s:

Lawyers, teachers, ministers, factory workers, trade unionists, liberals and radicals have been exiled, banished, banned, imprisoned. Their voices cannot be heard, and they are prevented from hearing the voices of others. Books in which great men expressed noble ideals throughout the ages are prohibited; to publish the words of any person who is or has been banned is a criminal offense. Criticism is communism, thought is sabotage. South Africa has become a vast and terrifying prison.

The Nation, like the rest of the world, more or less lost sight of Nelson Mandela for many years after his imprisonment, as more radical—and more violent—opponents of apartheid came to the fore. But all that time Mandela was accruing support (or “moral capital,” as Dorfman later put it), and after refusing at least one offer of release in exchange for unequivocally renouncing violence—Mandela argued that “a prisoner cannot enter into contracts”—he was finally freed in February 1990. After several fits and starts, he entered negotiations with President F.W. de Klerk, struggling to maintain a broad coalition in the ANC while establishing friendly relationships with government figures. Even as Mandela’s immense political gifts enabled him to successfully—and, to a large extent, peacefully—end minority rule and de jure segregation in South Africa, he struggled to satisfy the disparate elements fighting for those goals.

As Aryeh Neier, former director of the ACLU and, until last year, the Open Society Foundations, wrote in his Watching Rights column that August, Mandela’s path from political prisoner to revered international leader was anything but straight. Like many, Neier took issue with Mandela’s statements at a stadium event with ethically questionable Kenyan President Daniel arap Moi, then battling attempts by Western powers to force his government to reform. “What right has the West, what right have the whites anywhere,” Mandela asked, somewhat out of character, “to teach us about democracy when they executed those who asked for democracy during the time of the colonial era?” Neier, disheartened by these comments, wrote in The Nation:

The sense of disappointment over Nelson Mandela’s statement in Kenya is all the more acute because, though he consorted with heads of state and other important personages elsewhere, he did not seem to lose his bearings. It would be comforting to think that he will reflect on his experience in Nairobi and that he will come to realize that, despite his reluctance to turn his back on the black African leaders who supported his struggle all these years, it is not the likes of Kenya’s Moi, Zaire’s Mobutu, Somalia’s Siad Barre and Ethiopia’s Mengistu who stand for what he stands for.

Notably, South African presidents since Mandela have suffered the same discomfort in trying to distance themselves from African revolutionaries-turned-despots like Robert Mugabe of Zimbabwe and Muammar Qaddafi of Libya.

Perhaps the most interesting piece The Nation has published about Mandela was a story by the playwright Arthur Miller, about an interview he did with Mandela—on life, not politics, the author stressed—for the BBC. “South Africa is unique,” Miller wrote. “It has state socialism for the whites…and fascism for the blacks.”

I felt the place strange but comprehensible as merely one more kingdom of denial, unusual mainly for the immense proportion of its majority ghettoized and stripped of all civil rights…It is all part of a hopeless muddle of a modern technological state trying to sustain the most primitive, chest-pounding, Nazi master-race dogmas. So surrealism looms at every turn.

What struck me strongly about Nelson Mandela in his American public appearances was the absence in him of any sign of bitterness. After twenty-seven and a half years with his nose against the bars he seemed uninterested in cursing out the whites who had put him there for the crime of demanding the vote in a country where his people outnumber their rulers about six to one…

Watching from a distance I had found him extraordinarily straightforward in his persistent refusal to pulverize his history to suit current American tastes, crediting Communists for being the first whites to befriend his movement, sometimes at the risk of their lives.

Echoes of Miller’s own refusal to testify before the House Un-American Activities Committee in the 1950s are unmistakable.

Lacking a reporter’s killer instinct or investigative techniques I was simply very curious about the roots of this man’s unusual character. How does one manage to emerge from nearly three decades in prison with such hopefulness, such inner calm?….It was striking how he never seemed to categorize people by race or even class, and that he spontaneously tended to cite good men even among the enemy.

* * *

A certain combination of admiration mixed with impatience continued to mark The Nation’s reporting on Mandela—and the ANC movement as a whole—as he struggled to negotiate de Klerk and his National Party out of power. In July 1991’s “Mandela Tries to Stay Out Front,” Chris McGreal—then a reporter for The Independent on Sunday, now Washington correspondent for The Guardian—channeled many South Africans’ complaints about the pace of the transition to a post-apartheid society: “If Mandela is so influential with the government, it is often asked, why can he not force it to do the one thing that would make a difference—ban the carrying of all weapons in public?”

There have been fewer than forty prosecutions for the 10,000 deaths in factional violence. Yet Mandela’s response has been so naïve as to be almost laughable….Mandela’s dilemma is to reconcile his relationships with men who continue to act in bad faith—including de Klerk—with his effort to insure the best deal for South Africa’s black majority. His mistake, perhaps, is that he has continued to show a degree of respect for his negotiating partners that few in the townships consider merited.

Three years later, some months after Mandela had been elected the first black president of South Africa, Mark Gevisser, The Nation’s longtime Southern Africa correspondent, wrote in “Democracy in Living Color” that the lack of major accomplishments by Mandela’s new government was important—“The higher you fly, the harder it is to stay on the right side of history,” he wrote—but not the most important story:

The victory of South African democracy is not that it has begun to transform lives stunted and impoverished by apartheid. For despite a couple of presidential proclamations…and the promulgation of a Land Restitution Act, little has been done, in these first six months of the Mandela government, to change people’s lives in any physical way. The victory of South African democracy is, rather, more simply, that it exists. Against all odds and in peace….Whatever the external signifiers, Mandela is there. In fact, his near-seamless statesmanship is largely responsible for what amounts to a transition that is sometimes too smooth to even be noticed.

A year later, in November 1995, Gevisser wrote an editorial calling attention to Mandela’s difficult—arguably, necessary—compromises with white nationalist elements of the ancien régime, which, despite the much-heralded announcement of a Truth and Reconciliation Commission, had the effect of shielding real criminals from responsibility:

For reasons that are by no means politically unsound, Mandela has nailed his colors to the mast of “reconciliation.” While this has served to still the restive right, it also means the possibility of perpetrators of human rights abuses being called to account is slim indeed. South Africa’s state-sanctioned arms smugglers, along with its anti-democratic murderers, will go free so that South Africa can have peace.

* * *

It does honor to Mandela, and not the opposite, to recall amidst celebrations of his ninety-fifth birthday that he was a real-world political actor, not a saint, an angel or a dream. As Nation columnist Gary Younge wrote last month in “Everyone Loves Mandela,”

to make him a saint is to extract him from the realm of politics and elevate him to the level of deity. And as long as he resides there, his legacy cannot be fully debated or discussed, because his record is then rooted not in his role as the head of a movement, but in the beatified soul of a man and his conscience.


Which brings us back to the first article published about Mandela in The Nation, Hilda Kuper’s review of No Easy Walk to Freedom. “For many non-South Africans,” Kuper wrote in 1966, “South Africa provides a scapegoat on which to project righteous hatred of injustice without risk of involvement in action.” The converse is just as true today: Nelson Mandela provides an icon on which to project righteous love for justice without risk of involvement in action. It is vital, at this pivotal moment, that we nurture a more complete memory of Mandela, one that is situated within the context of the great conflicts of his time—race and class, especially—and with due emphasis on the centrality of risk, involvement, and action to his life and work. That is no less than what the great Madiba deserves.

* * *

Subscribers to The Nation can access our fully searchable digital archive, which contains thousands of historic articles, essays and reviews, letters to the editor and editorials dating back to July 6, 1865.

Researched by and written with Richard Kreitner.
