
Rick Perlstein | The Nation


Where the past isn’t even past.

Why a Democratic Majority Is Not Demographic Inevitability (Part Three: The Fungibility of Fears)


The presidency of George W. Bush did not usher in a new Republican era, contrary to some predictions. (AP Photo/Ron Edmonds.)

Quick show of hands: who remembers the Summer of the Shark? The reference is to those muggy months in 2001 when the news was so slow and the media so craven that the third-most-covered news story was a supposed epidemic of shark attacks that wasn’t even an epidemic (there were 76 shark attacks in 2001 and 85 in 2000). The (media) feeding frenzy ended, naturally, on September 11. And so did something else: the general sense that George W. Bush was a do-nothing president, which had driven his approval ratings into the low fifties. As if overnight, they rose to 90 percent.

Then, cunningly, cravenly, the neoconservatives in and around the White House exploited the terrorist attacks to work their political will. The Project for the New American Century’s 2000 report “Rebuilding America’s Defenses,” after all, had almost longingly observed that “absent some catastrophic and catalyzing event—like a new Pearl Harbor,” the kind of hyper-militarization of which they dreamed would be hard to achieve. They bank-shot the paralyzing fear into a made-up casus belli in Iraq—and then rode the electorate’s security fears all the way to Bush’s re-election. (Then, the day after that re-election, Bush called it a mandate to privatize Social Security.) Karl Rove’s predictions of a conservative Republican century seemed as reasonable as today’s arguments for Democratic demographic inevitability. Fear worked—in a way that could not possibly have been predicted by electoral prognosticators. Fear has a special way of confounding political predictions.

The Summer of the Shark illustrates something else: American culture is largely an ecology of fears, political culture included. And though it may flatter our liberal amour propre, conservatives don’t have a monopoly on exploiting fear for political advantage. Fear can be progressive—when Democratic politicians speak constructively to ordinary people’s fear of being manipulated and exploited by their bosses, of losing their way in a winner-take-all economy, of the consequences of a state without a safety net. It suggests a very rough rule of thumb: when Democrats are able to successfully frame the meaning of an election season around middle-class fears, Democrats win the election; when Republicans are able to successfully frame the meaning of an election season around cultural fears, Republicans win the election.

Alas, many of Democrats’ political problems come when they forget that rule of thumb. And given that, it has to be said: when conservatives do fear—when they work to make elections referenda on cultural fears—they really do leave it all out on the floor.

Recent research supports it: “Using a large sample of related individuals, including twins, siblings, and parents and children,” according to a summary from the Associated Press, researchers led by Rose McDermott of Brown “first assessed individuals for their propensity for fear using standardized clinically administered interviews.” They then “surveyed the sample for their attitudes toward out-groups—immigrants in this case—as well as toward segregation. Participants were also ranked on a liberal-conservative partisanship scale depending on how they self-reported their political attitudes.” They found “a strong correlation between social fear and anti-immigration, pro-segregation attitudes. While those individuals with higher levels of social fear exhibited the strongest negative out-group attitudes, even the lowest amount of social phobia was related to substantially less positive out-group attitudes.”

The AP then quoted the scholars’ useful conclusion: “It’s not that conservative people are more fearful, it’s that fearful people are more conservative.” 9/11 showed that was so. Fear of “the Other” became endemic, epidemic, among people who at other times and in other circumstances have shown evidence of a healthy pluralism, tolerance for complexity and salutary fellow feeling. Andrew Sullivan, for example, the sometimes-conservative pundit, has never been my favorite political writer (his specialty seems to be serially getting things wrong, then narcissistically claiming moral credit for the courage to change his mind, long after it really matters), but he doesn’t seem all that bad a guy. After 9/11, however, he lost it, calling you, me, and everyone else skeptical of the rush to war in Afghanistan, “[t]he decadent Left,” who “[m]ay well mount what amounts to a fifth column.” Now (too late!) he diagnoses what drove his own hysteria: “crippling fear.”

But then, at times, haven’t all of us been conservative like that? Unexpected things happen, unmoor us, and we lose ourselves. We lash out, indulge bad instincts, think with bad faith, grow cranky, irritable, stupid—act like stereotypical right-wingers. Maybe even, temporarily, indulge right-wing political behavior.


Thus the political corollary: make people more scared, and you may make them more conservative. When that happens on a big enough scale, conservatives are electorally advantaged. Which is why, historically, conservatives have been such avid entrepreneurs of political fear. Scare people about something, anything, and you might get them voting “right”; you get power; and once you get power, you needn’t actually address the fear (stoking it further, not addressing it, is the more useful political play). You instead may use the newfound, fortuitous grant of power for something entirely unrelated—say, trying to privatize Social Security.

Once-in-several-generation tragedies like 9/11 being unreliable foundations for political strategy, the process is typically more sedulous. And that which will scare us most being unpredictable, the process demands an experimental temper—throwing things against the wall, seeing what sticks. The pioneers of the New Right in the 1970s were quite explicit about it. “The New Right is looking for issues that people care about,” Richard Viguerie said. “Social issues, at least for the present, fit the bill.” And so, in the 1970s when families were collapsing left and right, they zeroed in on issues that spoke to those fears: federal bureaucrats undermining family authority in the classroom. Gay rights ordinances. The Equal Rights Amendment. Abortion. “I organize discontent,” as Viguerie said.

In 2004, I watched the process happen. Early in the year, as I’ve written, “I attended one of hundreds of ‘Parties for the President’ organized nationwide for grassroots volunteers who wanted to help re-elect George W. Bush, at a modest middle-class home in Portland, Oregon. At one point, a nice old lady politely pressed into my hand a grubby little self-published pamphlet she had come upon, purporting to prove that Democratic presidential candidate John Kerry had faked the heroics that had won him three Purple Hearts in Vietnam. I added it to my mental store of the night’s absurdities that I expected to hear rattling across the wingnutosphere the entire fall: ‘I still believe there are weapons of mass destruction’; ‘There is an agenda—to get rid of God in this country’; ‘John Kerry attended a party in which there was bad language!’” All were equally absurd. One of them, however—the notion that Kerry was a military shirker in Vietnam, symbolically implying that a President Kerry would shirk his executive obligations to protect the nation now—for some reason took off. It was all over CNN that fall, not just Fox; so it was that a crazy right-wing fear-meme insinuated itself into the mass lizard brain by Election Day.

“Death panels” did pretty well in 2010—no one could have predicted that. The “war on religion,” “Sharia law,” “Obama wasn’t born in America”: Three strikes and the Republicans were out in 2012. You win some, you lose some. But here, at base, is the biggest problem with the confident predictions that demography will drive Democratic inevitability in the coming decades: No one can predict fears before they come; that’s why they call them fears. Fear of rioting “Negroes” from 1966 on: no one saw that coming to upset the Democratic inevitability of 1964. Extremist Iranian students seizing innocent American hostages in 1979: no one saw that coming to help upset the Democratic inevitability of the late 1970s. (Political scientist Everett Carll Ladd in 1978: the “GOP is in a weaker position than any major party of the U.S. since the Civil War.”)

As long as human beings are wired like this—and barring robot brain transplants, human beings will always be wired like this—predictions about permanent conservative retreat cannot be reliable. People are more tolerant of homosexuality? So what. The things people are afraid of change; that’s life. Fear still remains. They surely will figure out something else. What will it be? We cannot know. That is why they call it fear.

Read Rick Perlstein on Anthony Lewis, the Pulitzer Prize-winning journalist known for his coverage of the Supreme Court who died today.

Why a Democratic Majority Is Not Demographic Inevitability (Part Two: The Politics of Immigration Reform)


Protesters rally against the SB1070 immigration bill in Arizona in 2010. (AP Photo/Ross D. Franklin.)

In February, I wrote the first part in a promised series about why today’s political conventional wisdom—that, as Jonathan Chait put it, “conservative America will soon come to be dominated, in a semi-permanent fashion, by an ascendant Democratic coalition hostile to its outlook and interests”—may be premature. I cleared the decks by pointing to all those other moments—in 1964, 1972, 1974 (and, I didn’t note, 1992)—when equally confident prognostications of permanent Democratic majorities came a cropper. This time, I take on the most conspicuous this-time-it’s-going-to-be-different argument: that the white share of the vote in presidential elections has gone from almost 90 percent in 1980 to about 70 percent in 2012; that there are 24 million Hispanics currently eligible to vote and there will be 40 million by 2030; and that only 27 percent of Hispanic voters chose Mitt Romney for president (chart here)—and so, abracadabra, Democrats Über Alles!

Now, it might be hard for us to wrap our minds around a different way of seeing things in these days of Joe Arpaio and Jan Brewer—and Susana Martinez, the Latina governor of New Mexico who promises to repeal her state’s law allowing undocumented immigrants to get driver’s licenses even though her own grandparents were undocumented immigrants. But, taking the long view—and isn’t that the whole point of this exercise?—it has to be acknowledged: party identities aren’t passed on through the genes. Blocs of “natural Democrats” have become natural Republicans before. Indeed, in at least one instance, it happened with shocking rapidity. As I noted last time, in the 1960s, droves of white ethnic Democrats—Italians, Eastern Europeans, the Irish—started voting Republican in a backlash against the Democrats’ continued embrace of civil rights in the wake of a failed open housing bill and the urban riots. Only an eye-blink earlier, they had been considered the soul of the New Deal coalition.

Not Jews, of course; they stayed Democratic—even as they joined the Italians, Eastern Europeans and Irish in moving out to the suburbs. And so: consider an interesting bon mot I once heard in a speech by David Brooks (not ordinarily much of a sage, I’ll admit). He said the question going forward for the Republican Party was whether Hispanics would turn out to be Italians or Jews. That was in 2005, one of those brief windows when the “Italian Option” seemed like it just might be a possibility. A series of Bush Administration-supported bills for comprehensive immigration reform, including a path to citizenship for undocumented immigrants, was introduced. The next year, Bush boldly distanced himself from nativists in his party by calling the armed civilian border-patrollers of the Minutemen “vigilantes” even as other GOP pols were embracing them as citizen heroes, no more threatening than neighborhood watch volunteers (these were those long-ago days back before Trayvon Martin and George Zimmerman…). Then, in 2007, an armada of braying callers spurred by hysterical right-wing talk-jocks like Rush Limbaugh drove the nail in the coffin for Karl Rove’s dream to make the Republican Party safe for migrants from Mexico. And suddenly, over just about any imaginable time frame, it seemed Hispanics would be like Jews—Democratic loyalists, possibly forever.

But look at the lead headline in today’s New York Times: “GOP Opposition on Migrant Law Seems to Erode.” (The lead example is no less a Tea Party darling than Rand Paul.) The RNC’s blockbuster save-the-party report released this week implores, “We must embrace and champion comprehensive immigration reform. If we do not, our party’s appeal will continue to shrink to its core constituencies only.” They actually seem serious about it this time. Admittedly, nobody ever went broke betting on the Grand Old Party’s stubborn insistence on never venturing beyond its core constituencies, but: What if? What if a genuine immigration reform law happens?


Then the issue might be taken off the political table—and what then? Tom Frank has frequently argued that Republican politicians don’t really want to ban abortion, because if that happened, they would lose their best scare story to rouse their base on election day. I don’t think that’s true (for if they are intentionally trying not to ban abortion, they sure are doing a damned good job of banning it accidentally), but the underlying political logic of the hypothesis must be respected. It works, for immigration, in the inverse: Democrats, an at least slightly more honorable bunch, genuinely do want immigration reform, so if the Republicans manage to take “yes” for an answer, it will be that much harder every two years going forward to make the appeal for Hispanics to vote for Democrats because Republicans hate them. (2004 suggests that’s not an unrealistic prospect: all it took then was for George W. Bush to make nice noises about possible immigration reform to win 44 percent of the Hispanic vote, compared to 21 percent eight years earlier. See the third chart here: Hispanic party identification, while always majority Democrat, has actually been quite volatile over the last forty years, and shows no particular trend line.)

Something else might happen. Part of the fantasy certain pro-reform Republicans like to broadcast about Hispanics—family-oriented, churchgoing traditionalists—is that they are somehow natural conservatives, just waiting for the Republicans to slough off the skin of bigotry before they can embrace Reaganism en masse just like every other ordinary God-fearing American. Liberals intelligently respond by pointing to polling demonstrating that if anything, Hispanics are more liberal than voters in general on all sorts of issues—for instance, 75 percent of Hispanics prefer “a bigger government providing more services” rather than “a smaller government providing fewer services,” compared to 41 percent of the general population. But what if they start becoming “Italian”? That is to say, what if Hispanics, less hobbled by official discrimination, follow the pattern of other immigrant groups before them, become increasingly upwardly mobile—and become increasingly identified, by themselves and others, as “white”? Is it not reasonable to assume that they might become more Republican? That would certainly be the historical precedent: one immigrant group after another (excluding, of course, African-Americans) becoming “white.”

The very existence of a more immigrant-friendly Republican Party, meanwhile, might do much to reassure the sort of mainstream, moderate white voters who identify themselves by their tolerance (many of them quite conservative on economic issues) that it is now “safe” to vote Republican.

Unlikely, you say: conservatives require an enemy to fear, an “Other” to hate, and those scary dark people from the crime-ridden wasteland on our southern border are “Others” par excellence. But, as I’ve been pointing out, for the conservative mind, enemies are fungible. So are fears. Conservatives did just fine transforming Martin Luther King from Satan himself into Santa Claus in the space of a couple of decades. Surely, a couple of decades from now, they’ll manage to find some other, less electorally powerful object to fear. And the party’s political fortunes might stabilize accordingly.

That’s what I’ll be writing about next time in this series: the fungibility, and unpredictability, of fears, and how that may well provide Republicans a certain electoral advantage in the future whatever demographic changes come down the pike.

Rick Perlstein explains that there is a conservative wing of the Democratic Party—and why progressives should recognize that fact.

Right and Left in Democratic Politics: The Long View


It was only after the ascension of Franklin Delano Roosevelt that the Democratic Party began to be regarded as fundamentally liberal. (AP Photo.)

Here’s a pet peeve of mine. It’s when people refer to the “democratic wing of the Democratic Party.” Or when they say of a Democrat who makes consistent moves to the right, “Why doesn’t he just join the Republicans?” It’s not the underlying sentiment I object to; I want Democrats to stop doing right-wing stuff as badly as anyone. The problem is descriptive—and, ultimately, strategic. The fact is that the Democratic Party in modern times has always had a conservative wing, one frequently as strong as or stronger than its liberal wing, and as such, when progressives speak of the party as a vehicle that naturally belongs to them, as if by right—until conservatives stole it from them—they weaken progressivism. The fact is, the history of the Democratic Party has always been one of ideological civil war. And if you don’t realize you’re in a war, how can you win it?

Let’s review the game tape. Take it all the way back to 1924—when both parties had both left- and right-wing factions (before that year, the great progressive reformer Robert “Fighting Bob” La Follette of Wisconsin was a Republican), when there was no reason to believe the Democrats would be the ones to become the nation’s established left-of-center party, and when at the presidential nominating convention the civil war came down to 103 ballots (and gubernatorial fistfights on the convention floor) over issues like Prohibition and whether the party should be for the Ku Klux Klan or against it.

It was of course with the ascension of Franklin Roosevelt in 1932 and after that the idea of the Democrats as an institutionally liberal party became credible, though many delegates who voted for him at the convention didn’t necessarily think or know they were voting for a liberal. Many voters didn’t think so, either, but just marked the ballot for him because he had a “D” beside his name: They were Southerners, and saw the Democrats as the only political bulwark against the racial mongrelization of America. The progress of the New Deal, we now understand, rested on a fragile and complicated coalition joining visionary progressives and the most fearful reactionaries—and when an overconfident Roosevelt overreached to try to put the reactionaries in their place, in 1938, he almost lost control of the whole thing.

With the coming of the civil rights era, the war played out against that precise template: Northern progressives asserting themselves, Southern reactionaries threatening to pack up their votes and go elsewhere—a melodrama that began with a bang in 1948 when Strom Thurmond led Dixiecrats out of the convention and into his own segregationist presidential run, and reached its apotheosis in 1964 when five Southern states went for Goldwater. That, of course, truly began the slow steady transition to ideological realignment, with more and more Southern Democrats voting Republican in each election.

But, wouldn’t you know it, a new issue immediately arose to muddy anew what it meant to be a Democrat. In 1968 the floor of the convention once more split right down the middle, fistfights included, this time over the question of whether the Vietnam War was a good thing or a bad thing. But the end of the war didn’t bring ideological unity, either. In fact, the first post-Vietnam election, post-Watergate, in 1974, inaugurated today’s order of battle between the right- and left-leaning wings of the party. Democrats gained forty-nine seats in the House and three in the Senate, giving the party of Jefferson and Jackson an approximate two-to-one advantage over the Republicans. People assumed a liberal deluge was in the offing; Congressional Quarterly noted predictions that the 94th Congress would become “a labor-orchestrated puppet show.” Ronald Reagan said, “The small fires that at first threatened free enterprise are growing daily into full-scale four-alarm blazes,” predicting, “We’re going to see a flood of expensive, spectacular, and ill-conceived legislation which can’t be derailed or even tempered by the voices of moderation.”

In fact, something like the opposite happened—as could have been predicted by the language of the “Watergate Babies” on the campaign trail.


Thirty-six-year-old Gary Hart was more or less the ideologist of the bunch. His memoir of the McGovern presidential campaign, which he had managed two years earlier, called liberalism “near bankruptcy.” Time called him a “liberal.” “Traditional ‘liberal’ and ‘conservative’ slogans,” he wrote back in an angry letter to the editor, “are simply not adequate to cope.” He said the best way out of the energy crisis was “to work together. There will be a lot more cooperative ventures between the environmentalists and the energy developers.” His stock speech, “The End of the New Deal,” argued that his party was hamstrung by the very ideology that was supposed to be its glory—that “if there is a problem, create an agency and throw money at the problem.” It included lines that could have come from Commentary, the neoconservative magazine that Jerry Brown, a friend of Hart’s, liked to read and quote. Like: “The ballyhooed War on Poverty succeeded only in raising the expectations, but not the living conditions, of the poor.” (That was false: the poverty rate was 17.3 percent when LBJ’s Economic Opportunity Act was enacted in 1964 and 11.2 percent as Gary Hart spoke.) He called those who “clung to the Roosevelt model long after it had ceased to relate to reality,” who still thought the workers, farmers and blacks of the New Deal coalition were where the votes were, “Eleanor Roosevelt Democrats.” He held them in open contempt. His outmaneuvered opponent, a once-popular two-term conservative incumbent, said Hart seemed to be “trying to get to the right of Attila the Hun.” A 32-year-old congressman-elect from Michigan, James Blanchard, said, “I’m not entirely sure what my political philosophy is.”

There was a political reason for this. These new Democrats, seeds for Bill Clinton’s capital-N New Democrats, were replacing Republicans in predominantly suburban districts. They spoke to the desires of a white-collar constituency—and not that of the fading urban proletariat (“We’re not a bunch of little Hubert Humphreys,” Hart famously said). And though many of them, including Hart, frequently did yeoman’s work to reimagine progressivism for a new generation (for instance, in the field of environmentalism), some of them, and their immediate successors, also did yeoman’s work selling off great chunks of the old Democratic agenda to corporate bidders—like Tony Coelho, the California congressman elected in 1978 who became head of the Democratic Congressional Campaign Committee in 1980. Exulted a Dallas venture capitalist about this new/old breed of Democrat in a 1986 profile of Coelho, “I’m one of the biggest contributors to the Governor of Texas, but can I get him on the telephone? Hell, no. Sometimes it takes a week. I call Tony any hour of the day or night and he gets back to me immediately. Some days he just calls to ask how I’m doing. That pleases me tremendously.”

This battle goes way back. It’s written into the Democratic Party’s DNA. Acknowledge the other side, study them—take them seriously. Don’t let them play the underdog; that just advantages them, too. We’re in a fight here—always have been. They think they are the party—just as confidently as we believe we’re the party. The only way to make our vision of this party a reality is to work for it—and not to act surprised when their side works for it, too.

Educate yourself about the contracts you sign, or you could easily fall victim to small print, Rick Perlstein writes.

Small Print, Big Problem (Part II: Remedies)


Theresa Amato is organizing to fight small-print contract trickery. (Courtesy of Citizens in Charge Foundation.)

Yesterday I wrote about the kind of absurd, unfair, and inscrutable contracts Americans click or sign every day just to participate in normal commerce. I introduced Theresa Amato of Faircontracts.org, who’s organizing to fight these outrages—all licensed by pro-corporate court decisions—as one of the most pressing public policy problems we face, because, “If you take a look at all the economic problems we have, from the mortgage foreclosure crisis, to student loan debt, to credit card debt—pretty much pick your crisis—underneath everything you’re going to find a fine-print contract.” Today, I’ll tell you about what folks like this are trying to do about it all.

One piece is education—of which these blog posts are a part. “Just haul out your contracts and take a look!” says Amato, rattling off the toxic clauses consumers should look for: ones signing away your day in court in favor of binding arbitration (a non-transparent process that corporations almost always win); waiving liability and the right to take part in class action suits; gagging free speech; granting the contract issuer the right to unilaterally modify the terms. People “can demand that corporations treat them better, and not patronize corporations that have such provisions in their fine print. That of course means they have to read or know about it first.”

She recommends some outstanding recent books on the problem: University of Michigan law professor Margaret Jane Radin’s Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law (which contends that many fine-print “contracts” aren’t actually legally contracts at all); Seduction by Contract: Law, Economics, and Psychology in Consumer Markets, by NYU Law’s Oren Bar-Gill; and The Fine Print: How Big Companies Use ‘Plain English’ to Rob You Blind, by the legendary investigative reporter David Cay Johnston.

And, hey! Check out, I josh you not, this hip-hop joint, “Fine Print,” which Amato commissioned from the social justice rapper Lonnie Ray Atkinson:

Think about freedom, the freedom that they reppin’.
Free to agree or else you get to steppin’.
But that ain’t freedom, freedom means choice.
The right to participate, the right to a voice.

The power to negotiate at the point of signing,
power to enforce the dotted line.
A level playing field where everything’s see through.
And you can understand what you just agreed to.

If you ain’t got that, then you ain’t got freedom.
Contracts shouldn’t make you weep when you read ‘em.
From your economy to your government,
you better check who’s writing the fine print.

But educating yourself about the contracts you sign (even via dope beats) is but a first step—and a deeply inadequate one. Says Amato, “I don’t want to get into the frame of blaming the victim here—of blaming the consumer. Because it is not rational for a consumer to have to read forty pages of fine print to buy a product.” After all, as I pointed out yesterday, a lot of these contracts are written at a twenty-seventh grade reading level, designed not to be understood. That has to change. One way, of course, is changing the law. She says, “We as a society should be able to do better than this. Contracts should be easily accessible; but most importantly, they should be fair—as in, easy to understand and not containing or hiding deceptive or abusive provisions that harm consumers who have no idea that these provisions exist in the contract.”
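(A concrete aside, since “twenty-seventh grade” sounds like pure rhetoric: figures like that typically come from standard readability formulas. Below is a minimal sketch—my own illustration, not Amato’s or Faircontracts.org’s methodology—using the widely cited Flesch-Kincaid grade-level formula with a crude vowel-group syllable counter; the sample clause is invented.)

import re

def count_syllables(word):
    # Crude heuristic: one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

clause = ("Licensee irrevocably waives any right to participate in any "
          "class or representative proceeding arising out of or relating "
          "to this agreement, notwithstanding any provision hereof.")
print(round(fk_grade(clause), 1))  # one long, latinate sentence scores around "grade 21"

One long sentence of latinate legalese, in other words, is all it takes to blow past any grade a human being actually attends.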

Laws could require any company that issues business-to-consumer standard-terms language to make those terms available on their website and posted at their place of business, so people can comparison shop. In her home state of Illinois, Faircontracts.org had allies in the state legislature introduce just such a bill, the Consumer Contract Right to Know, because right now comparison shopping on your own is next to impossible: Amato had one of her legal interns call up the top six rental car companies and ask them to email their boilerplate contracts, and not a single one would do it. The bill got a “subject matter hearing” of less than a few minutes in the Illinois Senate Judiciary Committee, in which the senior counsel of the Illinois Retail Merchants Association rather ridiculously said it was unacceptable to them because it would permit businesses to see their competitors’ contracts. It never came up for a vote.


Surprise, surprise: the powers that be are not particularly enthusiastic about the notion of empowering consumers through fair, transparent contracts—something Amato learned with a vengeance when she tried to push another bill in Springfield, the Consumer Contract Plain Language Act. The big trade associations, from the Chamber of Commerce to the Cable Television and Commerce Association to something called the “Competitive Energy Association” came out in force to lobby against it. And what would this fearsome legislative enactment have demanded? Contracts “written in a clear and coherent manner using words with common and everyday meanings,” in “type of readable size and no less than 10-point font,” not containing provisions “that permits the unilateral modification by the covered entity to the disadvantage of the consumer without explicit consumer consent after the execution of the contract.” Oh, no. Can’t have that. This bill went nowhere (though it’s been reintroduced, and will probably go nowhere again).

But don’t despair: other tactics, ones bypassing the corrupt halls of our legislatures, have worked to push for reform. The contracts you’re expected to sign when you do business with a bank have been especially onerous. The Pew Charitable Trusts undertook a study of 250 types of checking accounts and found the median length of disclosure documentation at the twelve largest banks was sixty-nine pages—and that these same banks we taxpayers so generously bailed out especially love to bury hidden fees and penalties within the thicket to make money off our supposedly “free” checking accounts. Pew then publicized and leveraged the study to pressure the banks—reeling from a successful grassroots uprising against Bank of America for its proposed five-dollar “swipe fee”—to adopt plain-language disclosures that fit on a single page of paper.

Consumers, it turns out, can have power. Their voices, though, have to be amplified and aggregated—by top-down public-interest organizations like Pew, in this case; but what if we did the amplifying, aggregating—and organizing—ourselves? The sky would be the limit. And that is where Faircontracts.org’s ideas start looking visionary—and where, potentially, you come in.

Explains Amato: “What should happen is that people should view the contract as a product or a service that also has the potential to harm you. So that you’re not just buying the underlying product or service; you’re also buying the contract that goes with it. And so it’s important to think of what is in that contract. And when people find out negative things that are in that contract, there should be one place that everyone can go and put their information that they’ve learned about their contract, so they can share their information—like crowd-sourcing, or a review guide.”

Think of it this way: Yelp for contracts.
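What might that look like under the hood? Purely as a sketch—no such site exists yet, and every name below is hypothetical—here is the kind of minimal data model a “Yelp for contracts” could start from: consumers file reports tagging a product’s contract with the toxic clause types Amato rattled off above, and a registry aggregates them into a score shoppers can compare.

from collections import defaultdict
from dataclasses import dataclass

# Clause categories drawn from the "toxic clauses" Amato lists above.
TOXIC_CLAUSES = {"binding_arbitration", "class_action_waiver",
                 "unilateral_modification", "speech_gag", "liability_waiver"}

@dataclass
class ClauseReport:
    # One consumer's finding: product X's contract contains clause type Y.
    product: str
    clause_type: str
    excerpt: str  # the offending fine print, quoted for others to verify

class ContractRegistry:
    # Aggregates crowd-sourced reports so shoppers can compare contracts.
    def __init__(self):
        self.reports = defaultdict(list)  # product -> list of ClauseReport

    def submit(self, report):
        self.reports[report.product].append(report)

    def toxicity(self, product):
        # Number of distinct toxic clause types reported for this product.
        found = {r.clause_type for r in self.reports[product]}
        return len(found & TOXIC_CLAUSES)

registry = ContractRegistry()
registry.submit(ClauseReport("AcmeWidget", "unilateral_modification",
                             "Seller may amend these terms at any time..."))
registry.submit(ClauseReport("AcmeWidget", "class_action_waiver",
                             "Buyer waives any right to class proceedings..."))
print(registry.toxicity("AcmeWidget"))  # 2 -- comparison-shop accordingly

The hard part, as Amato suggests below, isn’t the software; it’s aggregating enough of us to make corporations care.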

“And so this is what needs to be developed, and I would like to work with someone who would be interested in working on it. Because we have cumulative knowledge that can help change the behavior of corporations. Just like there’s been a successful movement against the imposition of the five dollar fee that Bank of America was going to charge, we should be able to do that every day with every provision in a contract that we don’t want to be imposed upon us as intelligent consumers. Most people go into a contract thinking, ‘that will never happen to me,’ or ‘this product will be fine’—they are in a positive optimistic mode. And so they don’t consider when something goes wrong with this product, or the service. And that’s when you learn that there are all these nefarious terms in the contract. They rely on us being disaggregated, there being no motivation or ability, not wanting to plow through these terms, and not being able to share what we know when the terms are applied against us.”

So let’s aggregate. Want to help? Want to be one of the coders to create Yelp for contracts? Go to Faircontracts.org. Get in touch with Theresa. Because, like Lonnie Ray Atkinson raps, “ain’t no freedom if we ain’t free of the fine print.”

Read Rick Perlstein’s previous post on nefarious contract terms.

Small Print, Big Problem (Part I: Diagnosis)


Fair contract activist Theresa Amato. (Courtesy of Citizens in Charge Foundation.)

Imagine you’ve clicked on your computer screen to accept a contract to purchase a good or service—a contract, you only realize later, that’s straight out of Kafka. The widget you’ve bought turns out to be a nightmare. You take to Yelp.com to complain about your experience—but lo, according to the contract you have given up your free speech rights to criticize the product. Let’s also say, in a fit of responsibility (a bit fantastic, I know), you happened to have printed out this contract before you “signed” it, though you certainly hadn’t read through the thing, which is written, literally, at a “twenty-seventh grade” reading level. Well, you read it now (perhaps with the help of a friend who’s completed the twenty-seventh grade). And you see that there was nothing in the contract limiting your right to free speech at the moment you signed it. That part was added later. Your friend with the twenty-seventh-grade education points to the clause in the contract in which you’ve granted this vendor-from-hell the right to modify the terms of the contract, unilaterally, at any time into the vast limitless future.

Others, you realize, must have had the same problem with this lemon of a product. You begin canvassing the possibility of a class action suit. But you guessed it: the contract you agreed to waived your right to class action as well.

You study this gorgon of a text to figure out what other monstrosities lie within—and discover this: you’ve waived away your right to the privacy of certain information, too. Shocked, you resolve: never again. You realize that when you buy a product or service, you’re also buying the contract that goes with it. So you’ll comparison shop. You think about how, when you rent a car, you have to sign and initial all that contract language you have no time to read with eight people behind you in line at the airport. So you call all the big rental car companies to get copies of their standard boilerplate contracts to read at your leisure—but not one would e-mail you the contract. You’re told it just isn’t done.

The upshot of this horror story? Maybe you’ve figured it out by now. The “you” above is actually you. You, dear reader, have almost certainly signed a contract exactly like this. You may even have done so today.

“Fine print,” or “boilerplate,” contracts have been interwoven into the fabric of our modern commercial society for decades. In recent years, however, they have become more and more deliberately obfuscating—and, thanks to business-friendly court decisions, more and more aggressive in their intent to deprive customers of all sorts of rights of redress. Recently I sat down to talk with an activist who’s doing something about it. When Theresa Amato of Faircontracts.org told me about this business of companies reserving—and exercising—the right to change contracts after their customers have signed them, and courts upholding that right, I paused a bit. I said I was speechless. “Yes,” she replied. “You should be speechless. And so should everyone.” She laughed—in a laughing-to-keep-from-crying kind of way: “To call this fine print ‘contracts’ is almost a misnomer.” She corrected herself: “It is a misnomer, according to contract theory, because there’s no mutual consent there.”

The Chicago-based Amato has had a busy twenty-year career as a public interest lawyer working on some of the most dramatic and important issues of our time: first at the Public Citizen Litigation Group, formed by Ralph Nader; then as director of the Freedom of Information Clearing House, fighting for access to secret government documents; forming her own nonprofit, the Citizen Advocacy Center, pioneering democracy-building “community lawyering” in the burgeoning “edge cities” of the Chicago area; managing Ralph Nader’s 2000 and 2004 presidential campaigns (she wrote a definitive book from the experience on the legal structure of the two-party duopoly). Now that she’s thrown herself into this “boilerplate” issue, I ask whether the apparently eye-glazing issue of fine print is really as significant as all that.

More so, she says. “I believe this is one of the most pressing issues today. If you take a look at all the economic problems we have, from the mortgage foreclosure crisis, to student loan debt, to credit card debt—pretty much pick your crisis—underneath everything you’re going to find a fine-print contract. That most likely people didn’t read or didn’t understand. So this is a hundreds-of-billions-of-dollars problem that faces us as a country.”


One of the biggest issues, she’s convinced, is the acquiescence of the courts. Again and again, judges admit that there’s some kind of problem—the legendary Chicago federal judge and author Richard Posner admitted he hadn’t read the fine print when he signed his own mortgage—but claim their hands are tied, and sign off on the contracts nonetheless. Amato points to a Florida appeals court opinion, not yet finalized, in which the family of an elderly woman, now deceased, felt ripped off by her nursing home and challenged the legitimacy of the fine-print contract she had signed. I read the opinion. Acknowledged the judge, “At the time, she was 92 years old and had a fourth-grade education. She could not spell well and often had to sound out words while reading. She had memory problems and was increasingly confused.” He said, “As a practical matter, a significant percentage of the people who enter nursing homes and rehabilitation centers have mental or physical limitations that make it difficult for them to understand the agreements. The same is probably true for most of the contracts that we sign for many consumer services.”

This judge continued on to make a general theoretical point: “There was a time when most contracts were individually negotiated and handwritten. In that period, perhaps the law could adequately describe a mutual agreement as a ‘meeting of the minds’ between the parties”; but not any more.

When I read that, I nodded my head. I thought he was making a sympathetic point, the same one Amato has been pressing home to me: that when our entire system of consumer commerce is based on a vast, structural imbalance of power between sellers empowered to dictate terms, and buyers all but helpless to do anything but accept them just to participate in the economy, something is badly broken—in fact, the free market, which any right-wing economist says relies on adequate information to function efficiently, is badly broken. Even though I knew how this story ended—a decision unfavorable to the family of the deceased—I figured he at least was making one of those “regrettably, my hands are tied” points.

He wasn’t. He was saying the opposite: that there was no problem with inscrutable contracts at all. “Our modern economy,” he concluded, “simply could not function if a ‘meeting of the minds’ required individualized understanding of all aspects of the typical standardized contract that is now signed without any expectation that the terms will actually be negotiated between the parties.”

Well, consider, as has recently been pointed out, the contract you must click to download a ninety-nine-cent song on iTunes: it is longer than Shakespeare’s Macbeth, and when they change a single clause, instead of being shown the individual change you’re forced to locate it within the entire fifty pages of fine print. Against that, the defeatist notion that nothing can be done to right the balance between buyer and seller becomes an absurdity. Says Amato: “These contracts have evolved into a category you cannot understand. I mean, some of them are written for a twenty-seventh grade level. And most people don’t have postdoctoral degrees, or law degrees—and even lawyers who read them don’t understand them! They’re not meant to be understood.”

Push has come to shove. Next time, I’ll describe to you what Amato thinks can be done—and how you can join her fight.

Read Rick Perlstein on Reagan-appointed surgeon general C. Everett Koop, who held principle over any party line.

C. Everett Koop, 1916–2013


U.S. Surgeon General C. Everett Koop addresses an AIDS rally in Boston on June 4, 1989. (AP Photo/Mark Garfinkel)

A decent enough interval has passed, I hope, to begin to think about an interesting figure of our recent history in a bit of a critical temper. C. Everett Koop, surgeon general of the United States from 1981 to 1989, died on February 25 of this year—the only person to hold that title to have become a household name, not least for his goofy half-beard and his charming insistence on wearing his ceremonial brocaded Gilbert-and-Sullivan-style uniform everywhere. But also, it has to be said, for serving as an exemplar of honor and courage in a dishonorable time. The Associated Press put it like this:

An evangelical Christian, he shocked his conservative supporters when he endorsed condoms and sex education to stop the spread of AIDS.

He carried out a crusade to end smoking in the United States—his goal had been to do so by 2000. A former pipe smoker, he said cigarettes were as addictive as heroin and cocaine….

Koop, a devout Presbyterian, was confirmed after he told a Senate panel he would not use the surgeon general’s post to promote his religious ideology. He kept his word.

In 1986, he issued a frank report on AIDS, urging the use of condoms for “safe sex” and advocating sex education as early as third grade.

He also maneuvered around uncooperative Reagan administration officials in 1988 to send an educational AIDS pamphlet to more than 100 million U.S. households, the largest public health mailing ever done.

Koop personally opposed homosexuality and believed sex should be saved for marriage. But he insisted that Americans, especially young people, must not die because they were deprived of explicit information about how the HIV virus was transmitted.

He became a hero to AIDS activists, who chanted “Koop, Koop” at his appearances but booed other officials.

And all power to him for that. You don’t see his like much any more, in that there Republican Party. After all, the AP also noted, shortly before his appointment, he was going around the country predicting a “progression from legalized abortion to infanticide to passive euthanasia to active euthanasia, indeed to the very beginnings of political climate that led to Auschwitz, Dachau, and Belsen.”

Then, strikingly enough, he changed.

Disappointingly, the Newspaper of Record downplayed that part of the story in their obituary, merely noting in that trademark anodyne Gray Lady fashion, “Dr. Koop was completing a successful career as a pioneer in pediatric surgery when he was nominated for surgeon general, having caught the attention of conservatives with a series of seminars, films, and books in collaboration with the theologian Francis Schaeffer that expressed anti-abortion views.”

“Expressed anti-abortion views”: oh, it was so much more interesting and colorful than that. On abortion, Dr. Koop made history twice: second by rejecting hysteria on the subject; and first, by pioneering it.

Frank Schaeffer, Francis Schaeffer’s son, tells the first part of that story in his fascinating memoir, Crazy for God: How I Grew Up as One of the Elect, Helped Found the Religious Right, and Lived to Take All (or Almost All) of It Back. The story begins in the most unlikely place: a religious commune in the mountains of Switzerland, l’Abri, where the elder Schaeffer and his wife Edith Schaeffer—eloquent, learned, cultured and charismatic—became magnets for 1960s spiritual seekers including the likes of Jimmy Page of Led Zeppelin—this even though the theology was evangelical, even fundamentalist. “In Evangelical circles,” Crazy for God explains, “if you wanted to know what Bob Dylan’s songs meant, Francis Schaeffer was the man to ask…. Dad was wearing his hair longer and longer, and he grew a goatee. He took to wearing beige Nehru jackets, odd linen shirts, and mountain-climbers’ knickers,” and “evolved into a hip guru preaching Jesus to hippies, a precursor to, and spiritual father of, the Jesus movement.”

The intellectual point of commonality between the fundamentalists and the freaks was anti-materialism: “Dad said that middle-class values, bereft of their Christian foundation, were empty. He sided with ‘the kids’ against their ‘uptight parents.’… They were describing a world you can’t see, the invisible link between mortality and immortality…bring alive the biblical epoch to twentieth-century young people, competing with modernity by talking up a storm, convincing smart people that the spiritual world is more real and essential than the evidence of one’s eyes.”

Cool stuff. An impassioned student of Western art and philosophy, in the mid-1970s Francis Schaeffer spread his vision that the art and philosophy he loved were ineluctably rooted in a Christian worldview, and threatened by the decline of that worldview, by collaborating with a movie producer to create a multi-part film series intended for viewing by church groups—whose philistinism Dr. Schaeffer was glad to challenge when his creative vision demanded it: “We can’t have this for a Christian audience. Churches won’t rent it,” the producer said of stock footage of Michelangelo’s David. Responded Francis Schaeffer: “But we have other nudes and you never said anything. What about Mary’s breast in that Virgin and Child?” “That was bad enough! One holy tit is okay, as long as you don’t leave it on the screen too long. But churches don’t do cock!”

The concluding two parts of the series, which was called How Should We Then Live?: The Rise and Decline of Western Thought and Culture, took an unexpected and historically significant turn. Frank Schaeffer, the son, who was directing the films, had become an impassioned pro-life activist. Abortion, he insisted, represented just the sort of materialist desecration of a godly worldview they were seeking to illustrate. The father was skeptical; fighting abortion was a Catholic thing. Père and fils had a shouting match: “I don’t want to be identified with some Catholic issue. I’m not putting my reputation on the line for that!… What does abortion have to do with art and culture? I’m known as an intellectual, not for this sort of political thing!” Son prevailed—with the sort of ace-in-the-hole argument that would soon become all-too-creepily familiar in evangelical circles. Frank remembers himself shouting, “That’s what you always say about the Lutherans in Germany! You say they’re responsible for the Holocaust because they wouldn’t speak up, and now you’re doing the same thing! Fucking coward! You’re always talking about the ‘dehumanization of man’; now, here is your best example!”

Though I haven’t been able to find a single reference to this literally world-changing event in newspapers of the time outside announcements on the religion pages (the evangelical upsurge simply wasn’t on America’s political radar at the time), the series, which toured the nation beginning in 1977, ended up taking Christian America by storm—including a showing before 6,000 in Dallas starring quarterback Roger Staubach and half the Dallas Cowboys, and a booking in Madison Square Garden in New York. That was the way How Should We Then Live? became history’s second most influential spur to the evangelical anti-abortion crusade.

The first most influential was the sequel. Enter the Good Doctor, the surgeon-in-chief at Philadelphia’s Children’s Hospital and an old family friend of the Schaeffers.

“His pro-life passion,” Frank writes, “was based on having spent a lifetime saving the lives of babies that were sometimes the same age as those killed in late-term abortions.” He traveled to l’Abri to draft the new series of films that became Whatever Happened to the Human Race?: A Christian Response to Abortion, Euthanasia, and Infanticide. It was a lurid anti-abortion masterpiece—one with a $1.5 million budget (almost $6 million today) and a score performed by the London Symphony Orchestra.

The title cards for Episode 1 (“The Abortion of the Human Race”) introduced Drs. Schaeffer and Koop (“one of the world’s most prominent surgeons,” who “has spent a lifetime studying man’s attitudes and trends from a medical point of view”). The opening image depicted a family clad in white with white death masks painting the title on a pane of glass (the words “Human Race” in blood red)—then shattering it. Dr. Koop, in bow tie, answers the phone, dispatching a baby to the operating room, where doctors evidently save her life, then place her in an incubator. “Why is a human life worth saving?” Dr. Schaeffer narrates. “Why is it worth the trouble? Human life contains so much potential evil as well as good. It would hardly seem worth preserving at all unless there was a specific, compelling reason for doing so. Traditionally, in Western culture, the life of a human individual has been regarded as very special. The fully developed view of the sanctity of human life in the West did not come from nowhere, but came directly from the Judeo-Christian consensus…based on Biblical teachings, people used to view human life as unique, something to be protected, and loved, because it was made in the image of God.”

No longer. “For example, in some of today’s hospitals, this child would be left to die…”

Why? “The answer is clear. The consensus of our society no longer rests upon a Christian base but upon a humanistic one…man putting himself at the center of all things…all reality is just one big machine…this causes people to view themselves differently, and to have different attitudes toward other human beings.” An image of rabbits in scientists’ cages evolves into one of squalling babies in scientists’ cages. Then cars being crushed in a scrap yard, Dr. Schaeffer lecturing from atop the junk heap, sad-eyed, on the “mechanistic view of people” that has been replacing Christianity: “Now we are faced with a generation who has taken these ideas out of the lab, out of the classroom, and into the streets. More and more, we find ourselves in an uglier world…and God as a personal creator is excluded from the picture,” with “no reason that man should be considered as special.”

Then a discussion by Dr. Koop of fetal development, then a cool and clinical description of the various supposed forms of abortion—“the surgeon then scrapes the wall of the uterus, cutting the baby’s body to pieces…”—over the image of a black baby doll, lying as if in shards of glass, then a gaggle of baby dolls, drowning amid those shards of glass, accompanied by Bernard Herrmann–like violin screeching. Schaeffer is revealed standing above the dead baby dolls, and the apparent shards of glass are shown to be salt. The scene was literally filmed on the Dead Sea—“where the city of Sodom once stood,” Schaeffer explained…

And so on. The same sort of cross-country tour followed, only bigger; only this time, the mainstream media paid attention—at least a little bit. For instance, a version was shown in January of 1981 on prime time in Washington, DC (“No Matter How Moving, Show Is Still Propaganda,” ran a piece on the showing on the front page of the Washington Post Style section). In 1982, Newsweek’s religion reporter Kenneth Woodward profiled Schaeffer—but, as Frank Schaeffer pointed out, Woodward was “one of the few reporters who seemed to ‘get’ what was happening with the emergence of the Evangelical pro-life movement.” His editors didn’t; Newsweek had just dropped its religion section as “irrelevant.”

Ah, yes. So “irrelevant” that the inanities of Whatever Happened to the Human Race?—that embryos are morally equivalent to infants; that without a conception of God ethics is impossible; that the “slippery slope” of abortion would soon lead to mass killings of the redundant elderly and handicapped, an American Auschwitz; that late-term abortions (babies “removed alive,” as Dr. Koop shamefully claimed, then “allowed to die by neglect, and sometimes are killed by a direct act”) were epidemic—would soon be among the most important principles millions of Americans took into the voting booths each November.

Dr. Koop had been in the forefront of this rank politicization of a minority religious opinion—then, somehow, he dropped the ideologue’s armor in favor of the scientist’s. “Koop further angered conservatives by refusing to issue a report requested by the Reagan White House, saying he could not find enough scientific evidence to determine whether abortion has harmful psychological effects on women,” the AP recalls. “At a congressional hearing in 2007, Koop spoke about political pressure on the surgeon general post. He said Reagan was pressed to fire him every day, but Reagan would not interfere”—good for Reagan, and good for Koop, who persevered. He ended up advising Bill Clinton on healthcare reform. I’m not a Christian, but I believe in redemption. Thank the Goddess C. Everett Koop found his way to this one. Rest in peace.

What to Make of the Droning on Drones From the Right?


Rand Paul. (Reuters/Jonathan Ernst.)

I haven’t been tuning in, myself, but I’m told that in recent days Fox News has been going all-in praising Senator Rand Paul’s droning drone filibuster holding up John Brennan’s confirmation as CIA chief, with several Fox contributors fiercely attacking Senators John McCain and Lindsey Graham for taking on the young libertarian lion—or, if you prefer, for taking Barack Obama’s side. That raises interesting questions. As I observed in my Nation dispatch from last year’s Republican convention (“The GOP Throws a Tampa Tantrum”; hats off to your clever Nation editors for that awesome headline!), “Rand Paul got some of the biggest applause of his speech for saying something this party isn’t supposed to support at all: ‘Republicans must acknowledge that not every dollar spent on the military is necessary or well spent.’” And that “John McCain and Condoleezza Rice sounded like schoolmarms lecturing indifferent students when they tried to make the case that what neoconservatives used to call the ‘freedom agenda’ was being betrayed by Barack Obama.” Does all this mean the ancient (and even, sometimes, honorable) tradition of Republican “isolationism” (the word being basically an epithet in American political discourse; its advocates prefer “non-interventionism”) is making a comeback? Or, alternately, did it never really go away at the conservative grassroots, save for those distracting moments when the commander-in-chief is a conservative Republican hero, as in those heady first few years of W’s Iraq War? Or is all this just another opportunity for Obama-bashing, and as such a perfect example of the intellectual contentlessness and bottomless cynicism of that favorite Republican activity? (As I put it in the piece on the convention, “What they really love—shown by the way McCain and Condi were able to win back their audience by taking cheap shots at Obama—are enemies. And within their authoritarian mindset [as George Orwell taught us with his talk about Eurasia, Eastasia and Oceania], enemies are fungible.”)

For clues, I cranked up the Patented Perlstein Wayback Machine that lives on my hard drive and discovered the following interesting parallel from 1945, when what would become the Central Intelligence Agency began as a gleam in the American security establishment’s eye. William “Wild Bill” Donovan, head of the wartime spy agency the Office of Strategic Services, proposed to the president that his outfit be made permanent. The news was leaked to one of the most reactionary reporters, Walter Trohan, of one of the nation’s most reactionary major newspapers, the Chicago Tribune (Trohan was one of the infamously bureaucratically jealous J. Edgar Hoover’s favorite reporters, and the leak almost certainly came from Hoover).

How did conservative Republicans respond to the news? Well, there’s a cliché that in America, especially in wartime, “politics stops at the water’s edge.” Here’s one of the most splendid examples of how perfectly idiotic that cliché is. The Trib immediately editorialized that FDR had in mind an American “Gestapo.” One Republican congressman said, “This is another indication that the New Deal will not halt in its quest for power. Like Simon Legree it wants to own us body and soul.” Another called it “another New Deal move right along the Hitler line. It would centralize power in Washington.”


These were the sort of (“isolationist”) Republicans who had been skeptical of going to war with the Nazis in the first place—so of course it’s fascinating to see them deploy the argumentum ad Hitlerum to leverage their hatred of FDR. There are historical complications, to be sure: the permanent spy agency that did end up being invented out of Donovan’s efforts—we call it the CIA—did indeed dangerously centralize power, abusing it badly, so there is a prophetic cast to these utterances. At the same time, the OSS and its CIA successor were notorious redoubts of just the sort of Eastern establishment nabobs long despised by conservative Republicans (you can see a fine depiction of that history in the 2006 Matt Damon film The Good Shepherd), lending an unmistakable air of political turf-maintenance to their complaints. The clincher? People like this were utterly silent about the actual American Gestapo in their midst, except when they were praising it to the skies: the one run by J. Edgar Hoover, whom conservatives universally and consistently adored.

Flash forward thirty years for more evidence of conservative Republicans’, shall we say, shaky record of good faith when it comes to criticizing security-establishment abuses when one of their own is in the Oval Office. I refer to the Church and Pike committees’ investigations of 1975, which revealed the CIA’s flagrant abuse of its charter, including assassination plots against foreign leaders, and its utter failure in forecasting world-changing events, which was supposed to be its raison d’être. Here’s Ronald Reagan on his radio show: “My own reaction after months of testimony and discussion during the investigation of the CIA is ‘much ado about—if not nothing at least very little.’” He didn’t call the CIA a “Gestapo.” The opposite: he reserved his scorn for its investigators, accusing them of “witch-hunting.”

Now we have another CIA program dangerously abusing public trust. And suddenly conservatives are up in arms. There’s plenty more to say, true, about what Rand Paul might mean for the dormant conservative tradition of distrust for the security establishment; about the Obama administration’s own awful implication in institutionalizing that establishment’s mushrooming abuses; and, too, concerning the unfortunate indifference of too many Democrats and liberals to those abuses when the guy running the show is one of “ours.” But in the meantime, while we think about that stuff, let’s not neglect the obvious: conservatives have always been at war with Eurasia’s spy apparatus. Or Eastasia’s. Or Oceania’s. When it happens to be run by Democrats, I mean.

Rick Perlstein last weighed in on the Bob Woodward debacle, noting that the journalist's series of books about the Iraq War went from approving to critical in tone—perfectly in line with the changing tastes of the majority of book-buyers.

Reading Bob Woodward


Bob Woodward discusses the White House “threat” with Sean Hannity on Fox News.

This business with Bob Woodward—the White House’s Gene Sperling told him he might “regret” making a certain claim blaming Obama for the sequester debacle; Woodward told Politico he heard that as a veiled threat; conservatives crowed that all this proved Obama has lapped even Richard Nixon as a political thug; then the actual full exchange with Sperling, when it came forth, made it painfully obvious that the offending words were about as threatening as a light misting rain on a warm summer night—reveals most of what you need to know about Bob Woodward’s usefulness these days as a guide to how Washington works. That is to say, he is utterly useless in explaining how Washington works. But he is almost uniquely useful as an object lesson in displaying how Washington works—especially its elite punditry division.

All credit to David Folkenflik of NPR for having the presence of mind to invite us to turn to page 105 of All the President’s Men to remind Woodward what a real White House threat sounds like: John Mitchell in September 1972 telling Woodward’s partner Carl Bernstein that if The Washington Post published what it knew about Mitchell personally approving the payment of political spies, Post publisher Katharine Graham was “gonna get her tit caught in a big fat wringer.”

They ran it anyway, of course. They ran it even though Richard Nixon’s re-election juggernaut was proving impermeable to Woodward and Bernstein’s ongoing Watergate full-court press, and many among Georgetown’s cocktail set had begun to consider the Post’s ongoing indulgence of the story a bit of an embarrassing obsession, kind of the way a blogger like Glenn Greenwald is looked upon now. Because back then, Woodward had guts. He’s something different now: a barometer of Washington conventional wisdom, who appears to calibrate what he says to his continually evolving sense of who is up and who is down among precisely that same Georgetown cocktail set.

Think that claim is harsh? Here’s an almost scientific case study to prove it. Consider Woodward’s three-volume series of books about George W. Bush’s foreign policy. I reviewed the series in 2006. The first, Bush at War, published in 2002, was composed back in those heady days when the president’s approval ratings were up above seventy percent. “The George W. Bush who strides across the pages of Bush at War,” I wrote, “was a superhero…. And while the picture of the commander in chief in Plan of Attack (2004)”—modestly subtitled “The Definitive Account of the Decision to Invade Iraq”—“was rounder, the White House found it flattering enough to put it on the recommended reading list as they prepared for the Bush-Cheney re-election campaign.”

Call it the middle volume in a Goldilocks series: not too fulsome, not too mean, just something a little bit in between—just right: after all, his presidential approval ratings were hovering all that year right around 50 percent.

Then came 2006, the collapse of the Iraq adventure, and a president down below 40 percent in the ratings, roundly derided in all the right circles as a miserable failure. The book Woodward published that year—subtitle: “Bush at War, Part III”—was called State of Denial, and depicted a dangerous idiot. So I did some A/B/C comparisons: Woodward in 2002, 2004 and 2006, characterizing the same subject in completely different ways, correlated in every instance with Bush’s declining muscle in Washington.

Descriptions that were deferentially polite in 2002—“Bush, 55, has a quick, joshing manner”—became downright rude by 2006. (“Bush and Rove in particular dwelled on ‘flatulence’—passing gas—and they shared an array of fart jokes.”) In 2002 Woodward’s Bush was a careful empiricist: “I want to know what the options are. A President cannot decide and make rational decisions unless I understand the feasibility of that which may have to happen…. I wanted him to understand some of the nuances…” In 2006? He’s an ignoramus. Here he is in an informal pre-presidential foreign policy seminar with the Bush family pal who also happened to be Saudi Arabia’s ambassador to the United States, Prince Bandar bin Sultan: “Bandar, I guess you’re the best asshole who knows about the world. Explain to me one thing.”

“Governor, what is it?”


“Why should I care about North Korea?… I get these briefings on all parts of the world, and everybody is talking to me about North Korea.”

Bandar patiently explains that, well, America had 38,000 troops deployed on its border, and a single aggressive act could spur a world war. Woodward depicts Bush responding, “Hmmm, I wish those assholes would put things just point-blank to me. I get half a book telling me about the history of North Korea.”

Then, to put it in appropriately Watergate language, I found what I consider smoking-gun proof that all this is power worship far more than it is journalism. As I wrote, “The two most recent books end with the same interview: Mr. Woodward’s several hours with the President on Dec. 10 and 11, 2003. Both books contain the same exchange: Mr. Woodward says, ‘But we have not found any weapons of mass destruction,’ and Mr. Bush replies, ‘We have found weapons programs that could be reconstituted.’ ” The two books continue in almost identical wording, Woodward describing himself telling the president he’d been “travel[ing] around the country,” discovering Americans found him (note the diplomatic construction) “less the voice of realism” than previously for not acknowledging the non-presence of WMD.

But the 2004 and 2006 versions end with him characterizing the same interview entirely differently. “True, true, true,” the president is quoted as saying in Plan of Attack. Mr. Woodward then offers this paraphrase of what Bush said next: “He contended that they had found enough.” That line, in 2004, “He contended that they had found enough,” comes off as Woodwardian self-criticism: the last word belongs to the president, turning the reporter into a quisling who would happily leave a dictator in power because he only had a little bit of weapons of mass destruction.

In 2006, however, Woodward gives himself the last word, as an internal dialogue: “It had taken five minutes and 18 seconds for Bush simply to acknowledge the fact that we hadn’t found weapons of mass destruction.” It completely changes the meaning of the very same discussion: the man who came off as a steely protector of the nation when he was up above 50 percent in the approval ratings had suddenly become feckless now that he was below 40.

Woodward claimed in an interview he had just been following the facts as they were revealed to him: “It took me over two years to find out what happened, and quite frankly, as I say as directly as can be said in English, they have not been telling the truth about what Iraq has become.” But as I wrote in 2006, doesn’t precisely that shifting perspective indict itself? “If Part III is the better book because it’s a more accurate portrayal of the Bush administration’s abject failures and inadequacies, doesn’t that make the author look worse? What was he withholding?” For instance, the name “Bandar”—a central figure in Part III, where we learn GWB tagged him “Bandar Bush” in tribute to the intimacy of their relationship—doesn’t appear in Volume I. That intimacy apparently wasn’t something Washington polite opinion cared that you knew about in 2002. A state of denial, you might say.

Continue reading Bob Woodward. But not the prose. You won’t learn much from that. Read the man instead. That way you’ll learn what the people in power think about what you’re supposed to think.

Right-wing TV had a field day with Woodward and his White House “threat,” even though it may all just be a ploy by Woodward to sell more books, Leslie Savan writes.

Nothing New Under the Wingnut Sun: Reckless Spending Cuts


A traffic sign is seen near the US Capitol in Washington March 1, 2013. (Reuters/Jonathan Ernst.)

So: the “sequester.” That too-clever-by-half notion, born of the 2011 debt-ceiling negotiations out of the White House’s presumption that, when faced with the horror of heedless, profligate, across-the-board budget cuts to all manner of popular government programs, the Republicans’ “fever would break”—remember that?—and the Loyal Opposition would somehow come to agree to a reasonable, “balanced” deficit reduction package. It all seemed so cut and dried in those palmy days, just a few months ago: who could possibly imagine that a major American political party would let such madness actually go into effect?

Um, me? I wonder how many folks within the White House, gaming out whether Republicans might not just call the bluff, bothered to consider the fact that an embrace of heedless, profligate, across-the-board budget cuts to all manner of popular government programs is a key component of hardcore conservative ideology. That, when Barry Goldwater proclaimed in his 1960 manifesto Conscience of a Conservative, “I have little interest in streamlining government or in making it more efficient, for I mean to reduce its size …. My aim is not to pass laws, but to repeal them …. And if I should later be attacked for neglecting my constituents’ ‘interests,’ I shall reply that I was informed that their main interest is liberty,” that Barry Goldwater—and the future millions for whom his sentiments became an ideological touchstone—meant what he said.

Did anyone in the White House notice how many conservatives, including ones in positions of governmental power, after Mitt Romney’s recorded back-room admission that he couldn’t get elected because 47 percent of the electorate is addicted to suckling on the federal teat, responded that what he said was absolutely correct? (Even if they admitted it was unfortunate a public unready to handle it had to hear it.) That conservatives, as an article of faith, see breaking the link between citizens and their government benefits as the only sure way to break the link between voters and the Democratic Party? And that severing that same link is also the best way to restore the broken moral fabric of the nation? (Which is one explanation Republican governors use to defend their determination not to accept free federal money to qualify more of their poor citizens for Medicaid under Obamacare: they are saving their citizens from wicked dependency. Their other explanation is that Obama must necessarily be lying to them—but that will have to be the subject for another post.)

And what could the White House have predicted conservatives would say to those who point out that pulling the rug out from under huge chunks of federal spending will spur a recession? They could have predicted that many would say exactly what they have said: that since it’s excessive federal spending that causes recessions, what’s wrong with cutting excessive federal spending?

Bottom line: didn’t anyone whose job it is to think about such things consider that at least some powerful Republicans—not all, it is true—would relish sequestration as a marvelous thing, a historic opportunity, a gift from Obama to help further the cause they’d been proclaiming as sacred for generations: to shrink the federal government small enough so they could someday drown it in Grover Norquist’s proverbial bathtub? “Once these cuts take effect, thousands of teachers and educators will be laid off and tens of thousands of parents will have to scramble to find childcare for their kids,” said Obama. Did he ever consider that to a lot of Republicans, that would sound like a wish list?

Here, note, was Rudolph Giuliani eleven days ago: “The federal government is highly inefficient. It could use a 5 or 10 percent cut.”

And that utterance, with its lust after cuts, jogged my weird historian’s memory.

When Ronald Reagan became governor of California in 1967, in part because of his vague but florid promise to cure the state’s budget deficit, his harried and incompetent budget director announced a magical solution: a budget that consisted of little more than last year’s document with the added notation for each department, “less 10 percent reduction.”

Saner heads pointed out that, well, this was not exactly how budgeting worked. Some spending was federally mandated, other spending mandated by state statute; administrative departments had fixed expenses and were not chunks of cheese that maintain their structural integrity if you carve a tenth of the bulk from any random portion, as if one corner were as important to the structural integrity of the whole as any other; and—duh—some departments are simply more integral to the health of a complex polity than others. Other observers made it clear to Reagan that his course would be a political disaster. (On one campus the governor was burned in effigy beside a placard reading “REDUCE REAGAN BY 10%.”) Indeed, just such ineluctable facts of budgetary life were supposed to be sufficiently obvious to today’s Republican negotiators that they would never let the sequester’s slice-off-any-old-chunk-of-the-cheese madness take place. But why should we presume today’s Republicans are any more sensible, any less susceptible to magic thinking, on such matters than their hero Ronald Wilson Reagan?

In 1967, as it happened, inconvenient political reality spiked his administration’s hope to decimate (literally!) the entire state budget. He did, however, decimate where he could—for instance, in the state’s Department of Mental Hygiene, which seemed a practical target at the time because, as Lou Cannon noted in his book on Reagan’s governorship, “the population of the mental hospitals was declining, thanks to tranquilizing drugs and new medical procedures.” Although, oops: “the numbers were deceptive. The patients leaving the hospitals were the ones who responded best to tranquilizers; those who remained were more apt to need intensive care. And the state’s mental hospitals had never been adequately staffed.”


The indiscriminate—sequester-style, you might say—layoffs went forth nonetheless; and, in 1972, they were intensified. The results are infamous. The psychiatrist and novelist Irvin Yalom has written about what it was like to witness that calamity from within: “Reagan with one bold, brilliant stroke abolished mental illness in California,” he recalled. “As a result hospital staffs were forced, day after day, to go through the charade of treating patients and discharging them back into the same noxious setting that had necessitated their hospitalization. It was like suturing up wounded soldiers and sending them back into the fray. Imagine breaking your ass taking care of patients—initial workup interviews, daily rounds, presentations to the attending psychiatrists, staff planning sessions, medical student workers, writing orders in the hospital charts, daily therapy sessions—knowing all the while that in a couple of days there would be no option but to return them to the same malignant environment that had disgorged them. Back to angry spouses who had long ago run out of love and patience. Back to rag-filled grocery carts. Back to sleeping in moldering cars. Back to the community of cocaine-friends and pitiless dealers awaiting them outside the hospital gates.”

Heartbreaking. Reagan, meanwhile, denied the problem existed. In 1967, when a visiting expert from Sweden called a ward in Sonoma County the worst he had seen in several countries, the governor accused the staff there of having “rigged” the poor conditions to sabotage his cutback program. Another time Reagan just said, “We lead the nation in the quality of our mental patient care, and we will keep that lead.” In 1973 he called his “new approach to the treatment of the mentally ill that has reduced the number of patients sentenced to a hopeless life in our asylums from 16,500 to 7,000” a “model for the rest of the nation.”

What modern day horror stories will attend our own unanticipated chunk-of-cheese approach to federal budgetary decimation? That’s been a subject of much debate. One thing, however, is certain. The conservatives who’ve spent the last few weeks labeling Obama “President Panic” just for making the obvious argument that indiscriminate cutting has consequences will also figure out some species of magic thinking to deny their recklessness has had any negative effects at all—in fact, Reagan-like, they’ll surely devise cherry-picked and distorted nonsense in order to maintain that sequestration has yielded up loaves-and-fishes policy miracles.

Another prediction: sequestration will cause greater budget deficits down the road—because of the simple fact that there are certain things government has to do, and making it harder to do those things at any given moment makes it more inefficient and expensive for government to make up the ground down the road. This conservative retreat from a simple understanding of government spending as investment that pays off down the road is one of the reasons—there are others—Republican administrations end up creating bigger deficits than Democratic ones. Reagan’s gubernatorial administrations, for example: inheriting a $4.6 billion state budget in 1967, he left behind one in 1975 that cost $10.2 billion. The average individual Californian’s tax burden when he took office was $426. When he left it was almost double that, at $728.

But there is a difference, this time. Back then, Democrats instinctively and successfully fought what Reagan was up to. This season’s budget decimation, on the other hand, has been underwritten by Democrats—by Democratic naïveté. By a simple refusal to absorb and accept the lesson of history: that some conservative Republicans will always be constitutionally incapable of acknowledging that a cut in government capacity can ever be a bad thing. The fact that they now can claim, even if disingenuously, that the cuts were Barack Obama’s idea in the first place may make their triumph politically only the sweeter.

Voting is not a right, but a privilege in this country, Rick Perlstein discovers.

(Son of) Constitutional Roadblock to Efforts to Fix Federal Elections


Voters in California. States could legally take back the power to appoint electors without a popular vote at any time. (AP Photo/Damian Dovarganes.)

Three weeks ago I held forth in thunder on the subject of voting: “The President and Congress have little or no constitutional authority upon which” to fix America’s broken voting system, I wrote. “It is one of the best kept secrets in our political life: There is no federal right to vote…I’d be glad to be corrected, but as best I can tell, that means that technically, in almost every case, a state can make it as hard as it wants for its citizens to vote, and there’s practically nothing DC can do about it.”

Soon after, with my gratitude, I was corrected. But that doesn’t mean that I was all wrong. Today, with the question of fair elections back in the news, what with the oral arguments this week on the Supreme Court challenge to the Voting Rights Act, let’s dig a little deeper.

Article 1, Section 2 of the Constitution grants a federal right to vote for Congressmen—who shall be “chosen…by the people of the several states, and the electors in each state shall have the qualifications requisite for electors of the most numerous branch of the state legislature.” And while the states are granted an uncomfortable amount of power to set voter qualifications (no small thing: that’s the source of so many of the historic abuses so eloquently set forth in the classic text that inspired my post, Alex Keyssar’s The Right to Vote: The Contested History of Democracy in America), Article 1, Section 4 also grants Congress authority to alter voting procedures, at least in congressional elections: “The time, places and manner of holding elections shall be prescribed in each state by the legislature thereof; but the Congress may at any time by law make or alter such regulations.”

Indeed, an election lawyer reminded me of two counterexamples in which Congress passed laws aimed at improving voting administration federally, neither of which faced any constitutional challenge I’m aware of: the Military and Overseas Voter Empowerment Act (MOVE) and the Help America Vote Act (HAVA).

But that’s hardly the end of the issue.

HAVA (whose goals were to replace the failed punchcard and antiquated lever-based voting systems, to create an Election Assistance Commission to help administer federal elections and to establish minimum election administration standards) passed overwhelmingly (357-48 in the House, 92-2 in the Senate) and was signed by President Bush in 2002. But it’s one of those diabolically labyrinthine “kludges” in which America so excels at entangling its social policies: not really a congressional mandate, it instead only provides a pool of federal funding states can collect if they lay out an acceptable plan to carry out the law’s goals.

No states ended up turning down that money—but these, alas, are more ideological times than even back in 2002. Republican governors are more lunatic than they used to be—as attested by all the ones so eager to turn down free federal money to qualify more of their poor citizens for Medicaid under Obamacare. Meanwhile, some states have taken the money only to hoard it. And what was politically possible in 2002 may be inconceivable in 2013. Could something like HAVA pass now, given that Republicans all but brag of sabotaging efficient election procedures in order to hold down the Democratic vote? The question, I’d wager, answers itself.

Meanwhile, according to Wikipedia, MOVE, intended to help military folks vote, is a paper tiger: “implementation of the act has been spotty, with only fifteen states having fully implemented it…90 percent of absentee ballots sent to American civilians living abroad are returned and counted, compared to two-thirds of absentee ballots mailed by overseas military personnel. In a report by the Overseas Vote Foundation released in January 2013, 21.6 percent of military voters did not receive their ballots and 13.8 percent of military voters tried to vote but couldn’t.”

I asked Professor Keyssar to clarify his thoughts about whether there can truly be said to be a federal right to vote or not. He pointed out—as Digby also reiterated in an e-mail to me—that the Constitution refers only to a right to vote for members of Congress. Which would sound academic—if the Supreme Court’s five-to-four decision in Bush v. Gore, which Antonin Scalia joined, hadn’t triumphantly proclaimed, “The individual citizen has no federal constitutional right to vote for electors for the President of the United States unless and until the state legislature chooses a statewide election as a means to implement its power to appoint members of the Electoral College.” The opinion continued, “the State legislature’s power to select the manner for appointing electors is plenary; it may, if it so chooses, select the electors itself…. History has now favored the voter, and in each of the several States the citizens themselves vote,” but the “State, of course, after granting the franchise in the special context of Article II, can take back the power to appoint electors.”


And what does that mean? Well, according to the good folks at Fairvote.org, who support just the sort of right-to-vote constitutional amendment I endorsed in my previous post on this subject, that means “Florida’s legislature has the power to take that power away from the people at any time, regardless of the popular vote tally.”

Still feel safe in your constitutional right to vote?

And don’t forget: this is the same constitutional provision that allows the outrage that American citizens living in territories—Puerto Rico, the Virgin Islands and Guam—don’t get to vote for president. And that the only reason residents of Washington, DC, get to is a constitutional amendment, the Twenty-Third, ratified in 1961.

Given that context, reread what Obama said in his State of the Union address about fixing elections: “Our journey is not complete until no citizen is forced to wait for hours to exercise the right to vote.”

Narrow, narrow, narrow. Maybe some nice law can be proposed, say, providing a pool of federal funds, HAVA-style, to give to governors to increase the number of polling places or some such. And then, in the unlikely event that it passes, Republican governors could gladly turn that money down. Laws introduced by House and Senate Democrats to require states to provide online registration and allow at least fifteen days of early voting will likely go nowhere—because conservative Republicans don’t want to make it easier to vote. (In one of those 1975 Ronald Reagan radio broadcasts I’ve been discussing recently, the future president wrote of the horror he shared with other conservatives at a proposal to allow people to register to vote more easily by sending in a postcard: “In recent years, and without our paying attention,” he darkly intimated—a liberal conspiracy!—“it’s become easier and easier to become a registered voter. And whether we know it or not, we’ve been making it easier and easier for voting blocs to swing elections even though the bloc doesn’t represent a majority…. Look at the potential for cheating!”)

And note what Obama did not throw the weight of presidential rhetoric behind while the whole world was watching his inaugural address: the restrictive voter identification requirements that are as much an insult to democracy as those long lines (even as, admirably, Attorney General Holder has called them the equivalent of a poll tax, illegal under the Twenty-Fourth Amendment).

So it is that, still, a genuine federal right to vote—what poor old soon-to-be-incarcerated Jesse Jackson, Jr. prescribed in his proposed, now-orphaned, constitutional amendment as “regulations narrowly tailored to produce efficient and honest elections,” reviewed regularly by Congress “to determine if higher standards should be established to reflect improvements in methods and practices regarding the administration of elections,” and a requirement for every state to “provide any eligible voter the opportunity to register and vote on the day of any public election”—goes begging. And will go begging, in the end, until that bold day when America finally decides to become a grown-up democracy.

Maybe it's not too late for liberals to get over their love affair with Rahm Emanuel and see his missteps for what they are, Rick Perlstein writes.
