
Barack Obama greets young people at the College of Charleston. (AP Photo/Charles Rex Arbogast.)
I've been arguing in this series that the voting behavior of demographic voting blocs isn't stable in any truly predictable way, and may well confound confident predictions of a generation of Democratic hegemony. Seemingly stable blocs can shatter in something like an instant. Even, for example, urban blacks, whom Democrats can reliably count on to vote their way at numbers upwards of 90 percent in every election. Little more than a generation ago, though, urban blacks in industrial states were considered a swing vote. Teddy White spoke to the point in The Making of the President 1960: yes, a majority would vote Democratic, but the Party of Lincoln still retained the loyalty of a significant number of "Negroes," and just how many voted Republican in states like Illinois would determine—did determine, in fact—whether John F. Kennedy or Richard Nixon became president. Within four short years, of course, that once-solid conventional wisdom had melted into air. It changed in a flash: a Democratic president signed a historic Civil Rights Act, and the Republican presidential nominee voted against it. Lyndon Johnson told Bill Moyers, "I think we just delivered the South to the Republican Party for a long time to come." There was a corollary: just as indubitably, Democrats had delivered themselves the loyalty of blacks.
There's a moral to this story: it is what a party and its leaders do that determines the loyalty of its voters.
As much so, what determines the loyalty of voters is how well a party and its leaders tell clear, effective stories about what they do. Obama did Obamacare. And how does Obamacare pass these tests? Well, for one thing, it hasn't done that much yet. Some of the things it might do are bad (Los Angeles Times headline this past week: "Healthcare Law Could Raise Premiums 30% for Some Californians"). The main thing it does, meanwhile, establishing easy-to-use online healthcare markets ("exchanges"), still doesn't kick in for nearly a year—if that's not badly delayed: the White House, plainly not grasping the intransigent nature of the opposition whose good faith it still presumes, was unprepared for how many Republican-run states would refuse to cooperate. Said one healthcare consultant, “They definitely did not envision this many federally run exchanges. It was considered a fallback. The idea was it would be mostly state run and in the event of an anomalous state that didn’t do it, the feds would step in.”
And how well have they managed to get effective stories through to the public about what they've done? Well, according to a Kaiser Family Foundation poll published on the third anniversary of the Affordable Care Act, 57 percent of Americans had to answer "no" to the question, "Do you feel you have enough information about the health reform law to understand how it will impact you personally, or not?" Forty percent of the public views the law unfavorably and only 37 percent favorably, down from 46 percent favorable (against 40 unfavorable) upon its passage. And yet people say they would like certain specified healthcare reforms were they passed into law—88 percent like tax credits to small businesses that offer health insurance to their employees, 81 percent dig closing the Medicare donut hole, and 80 percent like the idea of health insurance exchanges—with the rub being that all these things are major provisions of the very law a plurality of Americans say they don't like. Is a Democratic majority inevitable? Not if they keep playing the game like this.
Let's compare all this to how Franklin Delano Roosevelt did things. While the Social Security system did not kick in right away either, people were confident about what it would do—because it was communicated so effectively. After he signed the law in 1935 he had signs hung in every post office reading, "A Monthly Check to You for the Rest of Your Life." That was the year before Roosevelt won the biggest reelection landslide in history. Then, the program really started delivering. It was one of the ways Roosevelt ensured new Democrats were minted for another seventy-five years and counting.
Sometimes I like to think that the responsibility of every new generation of Democrats is to devise a program that mints new Democrats for another seventy-five years or so. Can Obama's political legacy conceivably match that? Well, maybe. The proof of the pudding will be in the eating—and not in demographic wishful thinking.
Read Rick Perlstein on Chicago teachers organizing against Mayor Rahm Emanuel's shock doctrine.

On March 27, Chicago Teachers Union members protest a plan to close fifty-four public schools. (AP Photo/Charles Rex Arbogast.)
One thousand Chicago Public School teachers and their supporters, including this correspondent, packed Daley Plaza in forty-degree temperatures on Wednesday for a rally protesting the city’s announced plans to close 54 kindergarten-through-eighth-grade schools next year. One-tenth of the protesters were detained and ticketed (though police originally said they had been “arrested”) at a sit-in in front of school board headquarters a few blocks to the south. What they are protesting is genuine shock-doctrine stuff—an announcement utterly rewiring a major urban institution via public rationales swaddled in utter bad faith, handed down in a blinding flash, absent any reasonable due process. Though Mayor Emanuel is learning that the forces of grassroots democracy can shock back too. And boy, does he have it coming.
The story went down like this. Immediately following Mayor Rahm Emanuel’s defeat in the historic Chicago Teachers Union strike this past fall, the district, claiming massive underutilization of school facilities in light of an announced $1 billion budget deficit, began talking about closing perhaps 129 schools. The district CEO Jean-Claude Brizard—sacked soon after as a “distraction” to “school reform”—had once said he could only imagine closing perhaps three schools, given the paucity of schools performing better than the ones they would have shut down. Knowledgeable observers thought perhaps a dozen or two would end up getting the axe. (You can hear those details in this panel discussion from February of this year).
It took a petition signed by an overwhelming majority of the city’s fifty aldermen to even win hearings on the issue. But those hearings were a disaster—“cannibalism,” as one alderman described them: “Good people pitted against each other because each one was trying to save their individual school.” A war of all against all: just the kind of atomization any self-respecting shock doctrineer wants to see among his constituency. Karen Lewis, the Chicago Teachers Union president, called them a “sham” for all the effect she thinks they actually had on decision-making.
Then came Thursday, March 21, and the bombshell: the all-at-once announcement of the fifty-four schools to be shuttered. Even the aldermen who’d be responsible for managing the fallout in their wards weren’t informed in advance. And the announcement was made not by the mayor but current school CEO Barbara Byrd-Bennett; Rahm, that brave, brave man, had scheduled the most important announcements of his term on a day when he was off on a Utah ski vacation with his family. “It’s a slap in the face of the citizens,” Alderman Bob Fioretti said. “He’s not here to answer any questions or stand by his head of the school system. He knows the publicity realm. He’s nowhere to be found when he needs to respond to the people.” One sign at the rally had a cartoon of Rahm in ski togs, and a mayoral quote: “I did not run for office to shirk my responsibility.” Another addressed Byrd-Bennett: “Your Conscience Is Underutilized.”
Why is this a matter of conscience? Isn’t it reasonable to assume that declining enrollment figures mean some schools should close? Some schools, perhaps—but consider this school-closing plan in its sordid specifics. When Rahm finally showed his face, sunburned from the ski trip, for a Saturday press conference (to brag about the opening of a new Walmart), he admitted, “there is a lot of anguish and I understand that and appreciate it.… But the anguish and the pain…pales compared to the anguish that comes by trapping children in schools that are not succeeding.” Which makes no sense, when you consider that many of the schools receiving new students perform at or below the levels of the schools that are being closed.
One of those closing schools is called George Manierre Elementary. Located within the boundaries of the former Cabrini-Green housing project, its students are to be sent to another Cabrini-Green school seven blocks away, Jenner Elementary, that has lower test scores. So Rahm is lying; no surprise.
He is also, cry parents at the affected schools, putting their children in far greater danger of violent crime—which perhaps is surprising, given how escalating youth violence has become his political albatross. Chicago’s public radio station WBEZ, whose education reporter Linda Lutton has done outstanding work on the story, profiled one of those parents, a woman named Karlyn Harris who volunteers at Manierre every day. They recorded her anguish after they broke the news to her: “Oh, my God, I knew it was on the list but I didn’t know it was gonna be the—bam, in my face today. Oh, wow.” A reason for her horror: gang turf. Between Manierre and Jenner, “it’s like a boundary. You step over this line, the cowboys get you, over here, it’s the Indians.” The ward’s alderman, Walter Burnett, notes the two schools have been “fighting since I was a kid.… These are lifelong grudges.” He worries that it will keep kids from going to school altogether. Which appears to be a problem district headquarters never much considered: according to a speaker at the rally, there was originally no safety plan, and the bureaucrat in charge of making it didn’t even know the names of the closing schools “until the media got to him.”
Here’s another issue of which the city has proven shockingly heedless: community. There is more to maximizing the utility of limited educational resources than measuring the capacity of school buildings and discerning whether they are 50, 75, or 100 percent full. There is also the fact that each school is an individual human institution, with its own human ecology. Keeping that community together—or at least thinking about whether each individual school is working or not working qua community—has value in itself: for the students, for the teachers (God forbid we think about their human needs), and for the larger community within which the school operates. When you close a school, you kill something: a network of trust (or distrust), a web of relationships (which might or might not be functional relationships), an environment. It is in this way that, when you listen to and read the coverage here, the school-closers and the would-be school savers talk past each other. You hear parents, teachers, and students literally moved to tears at the thought of the places they love closing. Then you hear the CPS people talking statistics.
Isn’t it possible to consider that shuttering even a school at 50 percent classroom capacity can do more harm than good? That shuttering a school at 100 percent capacity might do more good than harm? And that this good or harm might sometimes be hard to measure, yet still be valuable for all that? Beyond learning outcomes, isn’t experiencing a rich community, and having a role in building and maintaining that community, and watching adult role models build and maintain that community, a useful outcome for a young person? (See this piece for the testimony of a college-bound high school senior named Lavell Short on how his “low-performing” elementary school, which is closing, profoundly contributed to his moral development.)
Community: Manierre is across the street from the massive Marshall Field Garden Apartments complex (a privately financed experiment in moderate-income housing), one reason why so many parents feel safe sending their kids there. And proximity provides another value. Here’s Harris, the volunteer at Manierre: “It’s like, when you step out of your house you’re right into this school. So it’s like you’re stepping out of your bed and you’re going into a dining room…It’s gonna tear the community down. Not only the students. It’s gonna be the parents. It’s like a mother’s kids being taken from her, that’s how I feel.”
But even this argument is granting city governments too much—because it presumes the utilitarian statistical arguments they claim to be making are intellectually legitimate. Are they? The brass have made that challenging to answer. Take the whole issue of measuring the capacity of school buildings: it took an investigation from the Chicago Tribune to discover that the city had been pegging arguments about which schools to close on an “ideal” class size of thirty students. That ghastly vision—that a school full of classrooms with twenty-nine kids is too “empty” to be “efficient”—was “little mentioned despite months of public debate,” the Trib observed, all but accusing the city of deliberate subterfuge; it “allows the mayor and school officials to drive the public debate with attention-grabbing statistics. It has enabled the Emanuel administration to declare nearly half of all elementary and high schools underused, leaving 100,000 desks empty.”
From that, however, two further statistical subterfuges emerge. A large part of what school activists here spend their energy on is “data liberation,” prying forth information locked into PDF files—to compose data sets that, once the numbers are crunched, tell stories radically different from those issued by school headquarters on LaSalle Street. Just that sort of careful statistical work from the advocacy group Apples to Apples has discovered that, in fact, it would be more accurate to say the school closings aim at an average capacity of thirty-six students per class. A second node of statistical propaganda: note how the Trib cites “100,000 desks empty” in Chicago Public Schools. That means the Trib has effectively been conned—not to be too hard on the city’s marquee daily, because the Emanuel administration is such a skilled gang of con men. Spokesmen frequently use that 100,000 figure, though sometimes they use the number 145,000. For instance, in this discussion—pay attention at about 6:38, for this is a smoking gun—Chicago Board of Education vice president Jesse Ruiz baldly states, “we actually lost 145,000 students.”
Stunning, stunning, stunning: As even the district’s official school closing documents acknowledge, the number “145,000” refers to the number of Chicagoans between the ages of 0 and 19 the city has lost, a large number of whom are not even school age, and many of the school-aged among them not CPS students. So Apples to Apples ran the numbers of students CPS has actually lost: only about 30,000, about a fifth of their mendacious claim. It’s enough to make an honest person want to scream.
Here, meanwhile, are other statistics—accurate statistics. Five schools, almost a fifth of the total, are in the Cabrini-Green ward, the 27th—a prime area for gentrification over which real estate developers salivate. (Here’s Mrs. Harris: “I think the reason they really wanna close this school here is because of the land. I don’t know how much these homes cost, over here, or condos—$200,000?”) The area is dotted with high-performing magnet schools; but of course none of the Manierre kids are being reassigned to those—they’re being assigned to a school that’s 98 percent black. Eighty percent of students affected by school closings, in fact, are black; but only 40 percent of students as a whole in the system are black. (Signs hoisted on the dais of the rally on Daley Plaza starred the faces of African-American heroes whose names adorn some of the schools being closed: writer Henry Dumas. Singer Mahalia Jackson. Eighteenth-century scientist Benjamin Banneker.)
“Everybody on the board did not look at this decision as numbers on a spreadsheet,” said Rahm at his sunburned Saturday press conference. “We looked at it and viewed it as what can we do to have every child have a high-quality education regardless of their neighborhood, regardless of their circumstances, regardless of where they live.” Yeah, right. There is a reasonable suspicion that a lot of these shuttered schools are going to be turned over to charter operators. In 2012, belying every “underutilization” claim, the school board announced plans to open 100 new charter, “turnaround” and “contract” schools over the next five years. Considering that fact, here are some other accurate statistics concerning the United Neighborhood Organization, or UNO, the city’s most prominent charter company, whose CEO, Juan Rangel, is a major supporter of Mayor Emanuel. $1.5 million: the amount a company owned by the brother of a top UNO executive, Miguel d’Escoto, got for work as the “owner’s representative” for UNO school construction. $10 million: the amount another d’Escoto brother stands to make installing windows in UNO schools. $98 million: the amount of the state grant to UNO for school construction from which this beneficence derives.
“It’s about real people vs. really rich people,” read another of the Daley Plaza placards—rich people like school board president David Vitale, the former CEO of the Chicago Board of Trade, vice chairman of the DNP Select Income Fund, and a board member for United Continental Holdings, for which he was paid $216,688 in 2011. And Juan Rangel, about whom I stumbled across a puff piece in Hispanic Executive with the unintentionally hilarious headline “Revolutionary Chicago-based Group Refuses to ‘Act Like a Nonprofit’,” with the following unintentionally hilarious quote: “Rangel also works to combat the misconceptions surrounding nonprofits. The CEO says many young people believe nonprofits are well-intentioned, but offer little room for advancement and require taking a ‘vow of poverty.’”
You see why Chicago teachers are angry. And why they’re not going away. And why they promise more civil disobedience. “So lemme tell you what you’re gonna do,” shouted Karen Lewis at the rally. “On the first day of school, you show up at your real school! You show up at your real school! Don’t let these people take your schools.” They just might. The teachers have beaten Rahm before. They could beat him again.
For more on the uproar over education, read Allison Kilkenny on Chicago teachers' tactics.

Anthony Lewis reading the news of his Pulitzer Prize in 1963. (AP Photo/File.)
I’ve just learned the former New York Times columnist Anthony Lewis died this morning at the age of 85. Among the ornaments to his career were two Pulitzer Prizes and two celebrated books on constitutional law. One, Gideon’s Trumpet, was about the Supreme Court case that established indigent criminal defendants’ right to an attorney; the other, Make No Law: The Sullivan Case and the First Amendment, concerned the decision that made it difficult for targets to harass journalists by suing for libel. The Times itself focuses on how he revolutionized coverage of the Supreme Court. I’ll let others talk about that. Me, I’ll focus on a product of the kind of work I do as a historian of the 1960s and ’70s. In my research, I endeavor to assemble massive piles of the kind of arguments ordinary Americans might encounter about current events in the course of a day, the better to reconstruct how public opinion is formed and deformed. As such, it’s pretty easy for me to put together a fairly representative sample of what the most prominent media voices were saying during those years. That’s what I’ve just done now. And what I’ve found is a stunning record of Anthony Lewis’s consistent, astringent vision and moral courage when it came to executive power and the national security state—a willingness to record the ugliest things the American state was up to, and to unflinchingly interpret them not as the exceptions of a nation that is fundamentally innocent but as part of a pattern of power-drunk arrogance. Think of Noam Chomsky on the op-ed page, several times a week.
I read him reporting, after visiting North Vietnam during Richard Nixon’s relentless bombing of the country, about American planes bombing hospitals despite the obvious red crosses painted on their roofs. He visited the hospitals; he wrote, decimating American moral arrogance and the bombing campaign’s entire raison d’être—intimidating the Communists into surrender—“It is impossible for this visitor to detect any atmosphere of fear.” In a letter to fellow columnist Stewart Alsop (quoted in this book) he described his motives for reporting uncomfortable stuff with the bark off: “I happen to believe in the sanity of our country, and the last chance to save it from the eternal damnation that the Nazis earned Germany…. If you cannot see that the mass bombing of one of the most densely populated areas on the face of the earth is a crime, then nothing can save us and we shall deserve the reputation we have earned.”
He detected in Nixon’s Saturday Night Massacre firing of the special prosecutor investigating him “the smell of an attempted coup d’etat.” Two weeks after the fall of Saigon, he excoriated the parochialism of the correspondents his profession had begun sending to Vietnam a decade earlier, baffled “that the Vietnamese are not American in outlook and never would be,” who “almost all accepted the official view of the war” which “think[s] of options only in terms of their effects on the country involved.” Two months later, in a piece called “Lying in State,” he recited polling showing that the percentage of people who distrusted the government had risen from 22 to 62 percent in ten years. Such observations were common in those dark days. What made Lewis’s different was the forcefulness of his interpretation: that people didn’t trust government because its “officials who are caught out in crude deceptions so seldom pay any penalty.” He went on to sear the page with stories about former CIA director Richard Helms baldly telling one senator that the CIA never tried to overthrow the government of Chile when the CIA had actually spent $5 million to overthrow the government of Chile, and another that there had been no CIA connection to Watergate burglars Gordon Liddy and Howard Hunt after 1970 even though the CIA had supplied them their burglary equipment.
And he wrote, when America had waited hardly six months before beginning its next ill-advised (not-so-)secret foreign intervention, in the form of CIA support for a faction fighting for control of the African nation of Angola, that “what the world sees as self-inflicted wounds may look to the authors as a way of electing Gerald R. Ford and keeping Henry Kissinger in office.” He was airing his suspicions that the White House had let a secret mission leak intentionally to give them a political foundation to fight Congress’s intent to increase constitutional accountability of the security state, “painting themselves as patriots.” He concluded, “What is needed, and painfully absent now, is a strong voice in Congress to contest that definition of patriotism.”
He loved that word, “patriotism.” He loved the concept. “Nothing should make us forget that moment of shared wonder and love of country,” he wrote on the first anniversary of the civic process that led to Nixon’s resignation—emphasis added. He was, in fact, one of the most articulate and eloquent defenders of a new definition of patriotism, one insisting that revealing uncomfortable truths, criticizing one’s country, and holding leaders accountable for their sins were a higher form of loving one’s country. If you agree, and I suspect most of you do, mourn Anthony Lewis.
Will recent election wins and ongoing demographic shifts usher in a new liberal era? Not so fast, Rick Perlstein writes.

The presidency of George W. Bush did not usher in a new Republican era, contrary to some predictions. (AP Photo/Ron Edmonds.)
Quick show of hands: who remembers the Summer of the Shark? The reference is to those muggy months in 2001 when the news was so slow and the media was so craven that the third-most-covered news story was a supposed epidemic of shark attacks that weren’t even an epidemic (there were 76 shark attacks in 2001 and 85 in 2000). The (media) feeding frenzy ended, naturally, on September 11. And so did something else: the general sense that George Bush was a do-nothing president that drove his approval ratings into the low fifties. As if overnight, they rose to 90 percent.
Then, cunningly, cravenly, the neoconservatives in and around the White House exploited the terrorist attacks to work their political will. The Project for the New American Century’s 2000 report “Rebuilding America’s Defenses,” after all, had almost longingly observed that “absent some catastrophic and catalyzing event—like a new Pearl Harbor,” the kind of hyper-militarization of which they dreamed would be hard to achieve. They bank-shot the paralyzing fear into a made-up casus belli in Iraq—and then rode the electorate’s security fears all the way to Bush’s re-election. (Then, the day after that re-election, Bush called it a mandate to privatize Social Security.) Karl Rove’s predictions of a conservative Republican century seemed as reasonable as today’s arguments for Democratic demographic inevitability. Fear worked—in a way that could not possibly have been predicted by electoral prognosticators. Fear has a special way of confounding political predictions.
The Summer of the Shark illustrates something else: American culture is largely an ecology of fears, political culture included. And though it may flatter our liberal amour propre, conservatives don’t have a monopoly on exploiting fear for political advantage. Fear can be progressive—when Democratic politicians speak constructively to ordinary people’s fear of being manipulated and exploited by their bosses, of losing their way in a winner-take-all economy, of the consequences of a state without a safety net. It suggests a very rough rule of thumb: when Democrats are able to successfully frame the meaning of an election season around middle-class fears, Democrats win the election; when Republicans are able to successfully frame the meaning of an election season around cultural fears, Republicans win the election.
Alas, many of Democrats’ political problems come when they forget that rule of thumb. And given that, it has to be said: when conservatives do fear—when they work to make elections referenda on cultural fears—they really do leave it all out on the floor.
Recent research supports it: “Using a large sample of related individuals, including twins, siblings, and parents and children,” according to a summary from the Associated Press, researchers led by Rose McDermott of Brown “first assessed individuals for their propensity for fear using standardized clinically administered interviews.” They then “surveyed the sample for their attitudes toward out-groups—immigrants in this case—as well as toward segregation. Participants were also ranked on a liberal-conservative partisanship scale depending on how they self-reported their political attitudes.” They found “a strong correlation between social fear and anti-immigration, pro-segregation attitudes. While those individuals with higher levels of social fear exhibited the strongest negative out-group attitudes, even the lowest amount of social phobia was related to substantially less positive out-group attitudes.”
The AP then quoted the scholars’ useful conclusion: “It’s not that conservative people are more fearful, it’s that fearful people are more conservative.” 9/11 showed that was so. Fear of “the Other” became endemic, epidemic, among people who at other times and in other circumstances have shown evidence of a healthy pluralism, tolerance for complexity and salutary fellow feeling. Andrew Sullivan, for example, the sometimes-conservative pundit, has never been my favorite political writer (his specialty seems to be serially getting things wrong, then narcissistically claiming moral credit for the courage to change his mind, long after it really matters), but he doesn’t seem all that bad a guy. After 9/11, however, he lost it, calling you, me, and everyone else skeptical of the rush to war in Afghanistan, “[t]he decadent Left,” who “[m]ay well mount what amounts to a fifth column.” Now (too late!) he diagnoses what drove his own hysteria: “crippling fear.”
And haven’t all of us, at times, been sort of conservative like that? Unexpected things happen, unmoor us, and we lose ourselves. We lash out, indulge bad instincts, think with bad faith, grow cranky, irritable, stupid—act like stereotypical right-wingers. Maybe even, temporarily, indulge right-wing political behavior.
Thus the political corollary: make people more scared, and you may make them more conservative. When that happens on a big enough scale, conservatives are electorally advantaged. Which is why, historically, conservatives have been such avid entrepreneurs of political fear. Scare people about something, anything, and you might get them voting “right”; you get power; and once you get power, you needn’t actually address the fear (stoking it further, not addressing it, is the more useful political play). You instead may use the newfound, fortuitous grant of power for something entirely unrelated—say, trying to privatize Social Security.
Once-in-several-generation tragedies like 9/11 being unreliable foundations for political strategy, the process is typically more sedulous. And that which will scare us most being unpredictable, the process demands an experimental temper—throwing things against the wall, seeing what sticks. The pioneers of the New Right in the 1970s were quite explicit about it. “The New Right is looking for issues that people care about,” Richard Viguerie said. “Social issues, at least for the present, fit the bill.” And so, in the 1970s when families were collapsing left and right, they zeroed in on issues that spoke to those fears: federal bureaucrats undermining family authority in the classroom. Gay rights ordinances. The Equal Rights Amendment. Abortion. “I organize discontent,” as Viguerie said.
In 2004, I watched the process happen. Early in the year, as I’ve written, “I attended one of hundreds of ‘Parties for the President’ organized nationwide for grassroots volunteers who wanted to help re-elect George W. Bush, at a modest middle-class home in Portland, Oregon. At one point, a nice old lady politely pressed into my hand a grubby little self-published pamphlet she had come upon, purporting to prove that Democratic presidential candidate John Kerry had faked the heroics that had won him three Purple Hearts in Vietnam. I added it to my mental store of the night’s absurdities that I expected to hear rattling across the wingnutosphere the entire fall: ‘I still believe there are weapons of mass destruction’; ‘There is an agenda—to get rid of God in this country’; ‘John Kerry attended a party in which there was bad language!’” All were equally absurd. One of them, however—the notion that Kerry was a military shirker in Vietnam, symbolically implying that a President Kerry would shirk his executive obligations to protect the nation now—for some reason took off. It was all over CNN that fall, not just Fox; so it was that a crazy right-wing fear-meme insinuated itself into the mass lizard brain by Election Day.
“Death panels” did pretty well in 2010—no one could have predicted that. The “war on religion,” “Sharia law,” “Obama wasn’t born in America”: Three strikes and the Republicans were out in 2012. You win some, you lose some. But here, at base, is the biggest problem with the confident predictions that demography will drive Democratic inevitability in the coming decades: No one can predict fears before they come; that’s why they call them fears. Fear of rioting “Negroes” from 1966 on: no one saw that coming to upset the Democratic inevitability of 1964. Extremist Iranian students seizing innocent American hostages in 1979: no one saw that coming to help upset the Democratic inevitability of the late 1970s. (Political scientist Everett Carll Ladd in 1978: The “GOP is in a weaker position than any major party of the U.S. since the Civil War.”)
As long as human beings are wired like this—and barring robot brain transplants, human beings will always be wired like this—predictions about permanent conservative retreat cannot be reliable. People are more tolerant of homosexuality? So what. The things people are afraid of change; that’s life. Fear still remains. They surely will figure out something else. What will it be? We cannot know. That is why they call it fear.
Read Rick Perlstein on Anthony Lewis, the Pulitzer Prize-winning journalist known for his coverage of the Supreme Court who died today.

Protesters rally against the SB1070 immigration bill in Arizona in 2010. (AP Photo/Ross D. Franklin.)
In February, I wrote the first part in a promised series about why today’s political conventional wisdom—that, as Jonathan Chait put it, “conservative America will soon come to be dominated, in a semi-permanent fashion, by an ascendant Democratic coalition hostile to its outlook and interests”—may be premature. I cleared the decks by pointing to all those other moments—in 1964, 1972, 1974 (and, I didn’t note, 1992)—when equally confident prognostications of permanent Democratic majorities came a cropper. This time, I take on the most conspicuous this-time-it’s-going-to-be-different argument: that the white share of the vote in presidential elections has gone from almost 90 percent in 1980 to about 70 percent in 2012; that there are 24 million Hispanics currently eligible to vote and there will be 40 million by 2030; and that only 27 percent of Hispanic voters chose Mitt Romney for president (chart here)—and so, abracadabra, Democrats Über Alles!
Now, it might be hard for us to wrap our minds around a different way of seeing things in these days of Joe Arpaio and Jan Brewer—and Susana Martinez, the Latina governor of New Mexico who promises to repeal her state’s law allowing undocumented immigrants to get driver’s licenses even though her own grandparents were undocumented immigrants. But, taking the long view—and isn’t that the whole point of this exercise?—it has to be acknowledged: party identities aren’t passed on through the genes. Blocs of “natural Democrats” have become natural Republicans before. Indeed, in at least one instance, it happened with shocking rapidity. As I noted last time, in the 1960s, droves of white ethnic Democrats—Italians, Eastern Europeans, the Irish—started voting Republican in a backlash against the Democrats’ continued embrace of civil rights in the wake of a failed open housing bill and the urban riots. Only an eye-blink earlier, they had been considered the soul of the New Deal coalition.
Not Jews, of course; they stayed Democratic—even as they joined the Italians, Eastern Europeans and Irish in moving out to the suburbs. And so: consider an interesting bon mot I once heard in a speech by David Brooks (not ordinarily much of a sage, I’ll admit). He said the question going forward for the Republican Party was whether Hispanics would turn out to be Italians or Jews. That was in 2005, one of those brief windows when the “Italian Option” seemed like it just might be a possibility. A series of Bush Administration-supported bills for comprehensive immigration reform, including a path to citizenship for undocumented immigrants, were introduced. The next year, Bush boldly distanced himself from nativists in his party by calling the armed civilian border-patrollers of the Minutemen “vigilantes” even as other GOP pols were embracing them as citizen heroes, no more threatening than neighborhood watch volunteers (these were those long-ago days back before Trayvon Martin and George Zimmerman…). Then, in 2007, an armada of braying callers spurred by hysterical right-wing talk-jocks like Rush Limbaugh drove the nail in the coffin for Karl Rove’s dream to make the Republican Party safe for migrants from Mexico. And suddenly, over just about any imaginable time frame, it seemed Hispanics would be like Jews—Democratic loyalists, possibly forever.
But look at the lead headline in today’s New York Times: “GOP Opposition on Migrant Law Seems to Erode.” (The lead example is no less a Tea Party darling than Rand Paul.) The RNC’s blockbuster save-the-party report released this week implores, “We must embrace and champion comprehensive immigration reform. If we do not, our party’s appeal will continue to shrink to its core constituencies only.” They actually seem serious about it this time. Admittedly, nobody ever went broke betting on the Grand Old Party’s stubborn insistence on never venturing beyond its core constituencies, but: What if? What if a genuine immigration reform law happens?
Then the issue might be taken off the political table—and what then? Tom Frank has frequently argued that Republican politicians don’t really want to ban abortion, because if that happened, they would lose their best scare story to rouse their base on election day. I don’t think that’s true (for if they are actively trying to intentionally not ban abortion, they sure are doing a damned good job of banning it accidentally), but the underlying political logic of the hypothesis must be respected. It works, for immigration, in the inverse: Democrats, an at least slightly more honorable bunch, genuinely do want immigration reform, so if the Republicans manage to take “yes” for an answer, it will be that much harder every two years going forward to make the appeal for Hispanics to vote for Democrats because Republicans hate them. (2004 suggests that’s not an unrealistic possibility: all it took then was for George W. Bush to make nice noises about possible immigration reform to win 44 percent of the Hispanic vote, compared to 21 percent eight years earlier. See the third chart here: Hispanic party identification, while always majority Democratic, has actually been quite volatile over the last forty years, and shows no particular trend line.)
Something else might happen. Part of the fantasy certain pro-reform Republicans like to broadcast about Hispanics (family-oriented, churchgoing traditionalists) is that they are somehow natural conservatives, just waiting for the Republicans to slough off the skin of bigotry before they can embrace Reaganism en masse just like every other ordinary God-fearing American. Liberals intelligently respond by pointing to polling demonstrating that, if anything, Hispanics are more liberal than voters in general on all sorts of issues—for instance, 75 percent of Hispanics prefer “a bigger government providing more services” to “a smaller government providing fewer services,” compared to 41 percent of the general population. But what if they start becoming “Italian”? That is to say, what if Hispanics, less hobbled by official discrimination, follow the pattern of other immigrant groups before them, become increasingly upwardly mobile—and become increasingly identified, by themselves and others, as “white”? Is it not reasonable to assume that they might become more Republican? That would certainly be the historical precedent: immigrant group after immigrant group (excluding, of course, African-Americans) becoming “white.”
The very existence of a more immigrant-friendly Republican Party, meanwhile, might do much to reassure the sort of mainstream, moderate white voters who identify themselves by their tolerance (many of them quite conservative on economic issues) that it is now “safe” to vote Republican.
Unlikely, you say: conservatives require an enemy to fear, an “Other” to hate, and those scary dark people from the crime-ridden wasteland on our southern border are “Others” par excellence. But, as I’ve been pointing out, for the conservative mind, enemies are fungible. So are fears. Conservatives did just fine transforming Martin Luther King from Satan himself into Santa Claus in the space of a couple of decades. Surely, a couple of decades from now, they’ll manage to find some other, less electorally powerful object to fear. And the party’s political fortunes might stabilize accordingly.
That’s what I’ll be writing about next time in this series: the fungibility, and unpredictability, of fears, and how that may well provide Republicans a certain electoral advantage in the future whatever demographic changes come down the pike.
Rick Perlstein explains that there is a conservative wing of the Democratic Party, and why progressives should recognize that fact.

It was only after the ascension of Franklin Delano Roosevelt that the Democratic Party began to be regarded as fundamentally liberal. (AP Photo.)
Here’s a pet peeve of mine. It’s when people refer to the “democratic wing of the Democratic Party,” or say of a Democrat who makes consistent moves to the right, “Why doesn’t he just join the Republicans?” It’s not the underlying sentiment; I want Democrats to stop doing right-wing stuff as badly as anyone. The problem is descriptive—and, ultimately, strategic. The fact is that the Democratic Party in modern times has always had a conservative wing, one frequently as strong as or stronger than its liberal wing, and as such, when progressives speak of the party as a vehicle that naturally belongs to them, as if by right—until conservatives stole it from them—they weaken progressivism. The fact is, the history of the Democratic Party has always been one of ideological civil war. And if you don’t realize you’re in a war, how can you win it?
Let’s review the game tape. Take it all the way back to 1924—when both parties had both left- and right-wing factions (before that year, the great progressive reformer Robert “Fighting Bob” La Follette of Wisconsin was a Republican), when there was no reason to believe the Democrats would be the ones to become the nation’s established left-of-center party, and when at the presidential nominating convention the civil war came down to 103 ballots (and gubernatorial fistfights on the convention floor) over issues like Prohibition and whether the party should be for the Ku Klux Klan or against it.
It was of course with the ascension of Franklin Roosevelt in 1932 and after that the idea of the Democrats as an institutionally liberal party became credible, though many delegates who voted for him at the convention didn’t necessarily think or know they were voting for a liberal. Many voters didn’t think so, either, but just marked the ballot for him because he had a “D” beside his name: They were Southerners, and saw the Democrats as the only political bulwark against the racial mongrelization of America. The progress of the New Deal, we now understand, rested on a fragile and complicated coalition joining visionary progressives and the most fearful reactionaries—and when an overconfident Roosevelt overreached to try to put the reactionaries in their place, in 1938, he almost lost control of the whole thing.
With the coming of the civil rights era, the war played out against that precise template: Northern progressives asserting themselves, Southern reactionaries threatening to pack up their votes and go elsewhere—a melodrama that began with a bang in 1948 when Strom Thurmond led Dixiecrats out of the convention and into his own segregationist presidential run, and reached its apotheosis in 1964 when five Southern states went for Goldwater. That, of course, truly began the slow steady transition to ideological realignment, with more and more Southern Democrats voting Republican in each election.
But, wouldn’t you know it, a new issue immediately arose to muddy anew what it meant to be a Democrat. In 1968 the floor of the convention once more split right down the middle, fistfights included, this time over the question of whether the Vietnam War was a good thing or a bad thing. But the end of the war didn’t bring ideological unity, either. In fact, the first post-Vietnam election, post-Watergate, in 1974, inaugurated today’s order of battle between the right- and left-leaning wings of the party. Democrats gained forty-nine seats in the House and three in the Senate, giving the party of Jefferson and Jackson an approximate two-to-one advantage over the Republicans. People assumed a liberal deluge was in the offing; Congressional Quarterly noted predictions that the 94th Congress would become “a labor-orchestrated puppet show.” Ronald Reagan said, “The small fires that at first threatened free enterprise are growing daily into full-scale four-alarm blazes,” predicting, “We’re going to see a flood of expensive, spectacular, and ill-conceived legislation which can’t be derailed or even tempered by the voices of moderation.”
In fact, something like the opposite happened—as could have been predicted by the language of the “Watergate Babies” on the campaign trail.
Thirty-six-year-old Gary Hart was more or less the ideologist of the bunch. His memoir of the McGovern presidential campaign, which he had managed two years earlier, called liberalism “near bankruptcy.” Time called him a “liberal.” “Traditional ‘liberal’ and ‘conservative’ slogans,” he wrote back in an angry letter to the editor, “are simply not adequate to cope.” He said the best way out of the energy crisis was “to work together. There will be a lot more cooperative ventures between the environmentalists and the energy developers.” His stock speech, “The End of the New Deal,” argued that his party was hamstrung by the very ideology that was supposed to be its glory—that “if there is a problem, create an agency and throw money at the problem.” It included lines that could have come from Commentary, the neoconservative magazine Jerry Brown, who was friends with Hart, liked to read and quote. Like: “The ballyhooed War on Poverty succeeded only in raising the expectations, but not the living conditions, of the poor.” (That was false: the poverty rate was 17.3 percent when LBJ’s Economic Opportunity Act was enacted in 1964 and 11.2 percent as Gary Hart spoke.) He called those who “clung to the Roosevelt model long after it had ceased to relate to reality,” who still thought the workers, farmers and blacks of the New Deal coalition were where the votes were, “Eleanor Roosevelt Democrats.” He held them in open contempt. His outmaneuvered opponent, a once-popular two-term conservative incumbent, said Hart seemed to be “trying to get to the right of Attila the Hun.” A thirty-two-year-old congressman-elect from Michigan, James Blanchard, said, “I’m not entirely sure what my political philosophy is.”
There was a political reason for this. These new Democrats, seeds for Bill Clinton’s capital-N New Democrats, were replacing Republicans in predominantly suburban districts. They spoke to the desires of a white-collar constituency—and not that of the fading urban proletariat (“We’re not a bunch of little Hubert Humphreys,” Hart famously said). And though many of them, including Hart, frequently did yeoman’s work to reimagine progressivism for a new generation (for instance, in the field of environmentalism), some of them, and their immediate successors, also did yeoman’s work selling off great chunks of the old Democratic agenda to corporate bidders—like Tony Coelho, the California congressman elected in 1978 who became head of the Democratic Congressional Campaign Committee in 1980. Exulted a Dallas venture capitalist about this new/old breed of Democrat in a 1986 profile of Coelho, “I’m one of the biggest contributors to the Governor of Texas, but can I get him on the telephone? Hell, no. Sometimes it takes a week. I call Tony any hour of the day or night and he gets back to me immediately. Some days he just calls to ask how I’m doing. That pleases me tremendously.”
This battle goes way back. It’s written into the Democratic Party’s DNA. Acknowledge the other side, study them—take them seriously. Don’t let them play the underdog; that just advantages them, too. We’re in a fight here—always have been. They think they are the party—just as confidently as we believe we’re the party. The only way to make our vision of this party a reality is to work for it—and not to act surprised when their side works for it, too.
Educate yourself about the contracts you sign, or you could easily fall victim to small print, Rick Perlstein writes.

Theresa Amato is organizing to fight small-print contract trickery. (Courtesy of Citizens in Charge Foundation.)
Yesterday I wrote about the kind of absurd, unfair, and inscrutable contracts Americans click or sign on every day just to participate in normal commerce. I introduced Theresa Amato of Faircontracts.org, who’s organizing to fight these outrages, all licensed by pro-corporate court decisions, as one of the most pressing public policy problems we face, because, “If you take a look at all the economic problems we have, from the mortgage foreclosure crisis, to student loan debt, to credit card debt—pretty much pick your crisis—underneath everything you’re going to find a fine-print contract.” Today, I’ll tell you about what folks like this are trying to do about it all.
One piece is education—of which these blog posts are a part. “Just haul out your contracts and take a look!” says Amato, rattling off the toxic clauses consumers should look for: ones signing away your day in court in favor of binding arbitration (a non-transparent process that corporations almost always win); waiving liability and the right to take part in class action suits; gagging free speech; granting the contract issuer the right to unilaterally modify the terms. People “can demand that corporations treat them better, and not patronize corporations that have such provisions in their fine print. That of course means they have to read or know about it first.”
She recommends some outstanding recent books on the problem: University of Michigan law professor Margaret Jane Radin’s Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law (which contends that many fine print “contracts” aren’t actually legally contracts at all); Seduction by Contract: Law, Economics, and Psychology in Consumer Markets, by NYU Law’s Oren Bar-Gill; and The Fine Print: How Big Companies Use ‘Plain English’ to Rob You Blind, by the legendary investigative reporter David Cay Johnston.
And, hey! Check out, I josh you not, this hip-hop joint, “Fine Print,” which Amato commissioned from the social justice rapper Lonnie Ray Atkinson:
Think about freedom, the freedom that they reppin’. Free to agree or else you get to steppin’. But that ain’t freedom, freedom means choice. The right to participate, the right to a voice.
The power to negotiate at the point of signing, power to enforce the dotted line. A level playing field where everything’s see-through. And you can understand what you just agreed to.
If you ain’t got that, then you ain’t got freedom. Contracts shouldn’t make you weep when you read ‘em. From your economy to your government you better check who’s writing the fine print.
But educating yourself about the contracts you sign (even via dope beats) is but a first step—and a deeply inadequate one. Says Amato, “I don’t want to get into the frame of blaming the victim here—of blaming the consumer. Because it is not rational for a consumer to have to read forty pages of fine print to buy a product.” After all, as I pointed out yesterday, a lot of these contracts are written at a twenty-seventh grade reading level, designed not to be understood. That has to change. One way, of course, is changing the law. She says, “We as a society should be able to do better than this. Contracts should be easily accessible; but most importantly, they should be fair—as in, easy to understand and not containing or hiding deceptive or abusive provisions that harm consumers who have no idea that these provisions exist in the contract.”
Laws could require any company that issues business-to-consumer standard-terms language to make those terms available on their website and posted at their place of business, so people can comparison shop. In her home state of Illinois, Faircontracts.org had allies in the state legislature introduce just such a bill, the Consumer Contract Right to Know, because right now comparison shopping on your own is next to impossible: Amato had one of her legal interns call up the top six rental car companies and ask them to email their boilerplate contracts, and not a single one would do it. The bill got a “subject matter hearing” of just a few minutes in the Illinois Senate Judiciary Committee, in which the senior counsel of the Illinois Retail Merchants Association rather ridiculously said it was unacceptable to them because it would permit businesses to see their competitors’ contracts. It never came up for a vote.
Surprise, surprise: the powers that be are not particularly enthusiastic about the notion of empowering consumers through fair, transparent contracts—something Amato learned with a vengeance when she tried to push another bill in Springfield, the Consumer Contract Plain Language Act. The big trade associations, from the Chamber of Commerce to the Cable Television and Commerce Association to something called the “Competitive Energy Association” came out in force to lobby against it. And what would this fearsome legislative enactment have demanded? Contracts “written in a clear and coherent manner using words with common and everyday meanings,” in “type of readable size and no less than 10-point font,” not containing provisions “that permits the unilateral modification by the covered entity to the disadvantage of the consumer without explicit consumer consent after the execution of the contract.” Oh, no. Can’t have that. This bill went nowhere (though it’s been reintroduced, and will probably go nowhere again).
But don’t despair: other tactics, ones bypassing the corrupt halls of our legislatures, have worked to push for reform. The contracts you’re expected to sign when you do business with a bank have been especially onerous. The Pew Charitable Trusts undertook a study of 250 types of checking accounts and found the median length of disclosure documentation at the twelve largest banks was sixty-nine pages—and that these same banks we taxpayers have so generously bailed out with our tax dollars especially love to bury hidden fees and penalties within the thicket to make money off our supposedly “free” checking accounts. Pew then publicized and leveraged the study to pressure the banks—reeling from a successful grassroots uprising against Bank of America for its proposed five dollar “swipe fee”—to adopt plain-language disclosures that fit on a single page of paper.
Consumers, it turns out, can have power. Their voices, though, have to be amplified and aggregated—by top-down public-interest organizations like Pew, in this case; but what if we did the amplifying, aggregating—and organizing—ourselves? The sky would be the limit. And that is where Faircontracts.org’s ideas start looking visionary—and where, potentially, you come in.
Explains Amato: “What should happen is that people should view the contract as a product or a service that also has the potential to harm you. So that you’re not just buying the underlying product or service; you’re also buying the contract that goes with it. And so it’s important to think of what is in that contract. And when people find out negative things that are in that contract, there should be one place that everyone can go and put their information that they’ve learned about their contract, so they can share their information—like crowd-sourcing, or a review guide.”
Think of it this way: Yelp for contracts.
“And so this is what needs to be developed, and I would like to work with someone who would be interested in working on it. Because we have cumulative knowledge that can help change the behavior of corporations. Just like there’s been a successful movement against the imposition of the five dollar fee that Bank of America was going to charge, we should be able to do that every day with every provision in a contract that we don’t want to be imposed upon us as intelligent consumers. Most people go into a contract thinking, ‘that will never happen to me,’ or ‘this product will be fine’—they are in a positive, optimistic mode. And so they don’t consider when something goes wrong with this product, or the service. And that’s when you learn that there are all these nefarious terms in the contract. They rely on us being disaggregated, there being no motivation or ability, not wanting to plow through these terms, and not being able to share what we know when the terms are applied against us.”
So let’s aggregate. Want to help? Want to be one of the coders to create Yelp for contracts? Go to Faircontracts.org. Get in touch with Theresa. Because, like Lonnie Ray Atkinson raps, “ain’t no freedom if we ain’t free of the fine print.”
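If you’re one of those coders and wondering where you’d even start, here is one conceivable kernel—a minimal sketch of my own devising, I should stress, not anything Faircontracts.org has actually specified; every name in it is a placeholder: a shared pool of consumer-filed clause reports, aggregated so the worst offenders float to the top.

```python
# A purely illustrative sketch of "Yelp for contracts": crowd-sourced
# reports on fine-print clauses, pooled and ranked. All names here
# (ClauseReport, worst_offenders, "Acme Widgets") are hypothetical,
# not anything from Faircontracts.org.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ClauseReport:
    company: str       # who wrote the contract
    product: str       # the underlying product or service
    clause_type: str   # e.g. "unilateral modification", "class-action waiver"
    excerpt: str       # the offending fine print, quoted by the consumer
    harmed: bool       # was the clause actually used against the reporter?

def worst_offenders(reports: list[ClauseReport], top_n: int = 5) -> list[tuple[str, int]]:
    """Rank companies by how often their fine print was enforced against consumers."""
    counts: dict[str, int] = defaultdict(int)
    for report in reports:
        if report.harmed:
            counts[report.company] += 1
    return sorted(counts.items(), key=lambda item: item[1], reverse=True)[:top_n]

# Two consumers report clauses from the same hypothetical vendor.
reports = [
    ClauseReport("Acme Widgets", "widget", "unilateral modification",
                 "...may modify these terms at any time...", harmed=True),
    ClauseReport("Acme Widgets", "widget", "class-action waiver",
                 "...waives any right to proceed as a class...", harmed=True),
]
print(worst_offenders(reports))  # [('Acme Widgets', 2)]
```

The design point is the aggregation: one complaint is an anecdote, but a thousand of them, sorted by company and clause, is leverage.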
Read Rick Perlstein's previous post on nefarious contract terms.

Fair contract activist Theresa Amato. (Courtesy of Citizens in Charge Foundation.)
Imagine you’ve clicked on your computer screen to accept a contract to purchase a good or service—a contract, you only realize later, that’s straight out of Kafka. The widget you’ve bought turns out to be a nightmare. You take to Yelp.com to complain about your experience—but lo, according to the contract you have given up your free speech rights to criticize the product. Let’s also say, in a fit of responsibility (a bit fantastic, I know), you happened to have printed out this contract before you “signed” it, though you certainly hadn’t read through the thing, which is written, literally, at a “twenty-seventh grade” reading level. Well, you read it now (perhaps with the help of a friend who’s completed the twenty-seventh grade). And you see that there was nothing in the contract limiting your right to free speech at the moment you signed it. That part was added later. Your friend with the twenty-seventh-grade education points to the clause in the contract in which you’ve granted this vendor-from-hell the right to modify the terms of the contract, unilaterally, at any time into the vast limitless future.
Others, you realize, must have had the same problem with this lemon of a product. You begin canvassing the possibility of a class action suit. But you guessed it: the contract you agreed to waived your right to class action as well.
You study this gorgon of a text to figure out what other monstrosities lie within—and discover this: you’ve waived away your right to the privacy of certain information, too. Shocked, you resolve: never again. You realize that when you buy a product or service, you’re also buying the contract that goes with it. So you’ll comparison shop. You think about how, when you rent a car, you have to sign and initial all that contract language you have no time to read with eight people behind you in line at the airport. So you call all the big rental car companies to get copies of their standard boilerplate contracts to read at your leisure—but not one will e-mail you the contract. You’re told it just isn’t done.
The upshot of this horror story? Maybe you’ve figured it out by now. The “you” above is actually you. You, dear reader, have almost certainly signed a contract exactly like this. You may even have done so today.
“Fine print,” or “boilerplate,” contracts have been interwoven into the fabric of our modern commercial society for decades. In recent years, however, they have become more and more deliberately obfuscating—and, thanks to business-friendly court decisions, more and more aggressive in their intent to deprive customers of all sorts of rights of redress. Recently I sat down to talk to an activist who’s doing something about it. When Theresa Amato of Faircontracts.org told me about this business of companies reserving—and exercising—the right to change contracts after their customers have signed them, and courts upholding that right, I paused a bit. I said I was speechless. “Yes,” she replied. “You should be speechless. And so should everyone.” She laughs—in a laughing-to-keep-from-crying kind of way: “To call this fine print ‘contracts’ is almost a misnomer.” She corrects herself: “It is a misnomer, according to contract theory, because there’s no mutual consent there.”
The Chicago-based Amato has had a busy twenty-year career as a public interest lawyer working on some of the most dramatic and important issues of our time: first at the Public Citizen Litigation Group, formed by Ralph Nader; then as director of the Freedom of Information Clearing House, fighting for access to secret government documents; forming her own nonprofit, the Citizen Advocacy Center, pioneering democracy-building “community lawyering” in the burgeoning “edge cities” of the Chicago area; managing Ralph Nader’s 2000 and 2004 presidential campaigns (she wrote a definitive book from the experience on the legal structure of the two-party duopoly). Now that she’s thrown herself into this “boilerplate” issue, I ask whether the apparently eye-glazing issue of fine print is really as significant as all that.
More so, she says. “I believe this is one of the most pressing issues today. If you take a look at all the economic problems we have, from the mortgage foreclosure crisis, to student loan debt, to credit card debt—pretty much pick your crisis—underneath everything you’re going to find a fine-print contract. That most likely people didn’t read or didn’t understand. So this is a hundreds-of-billions-of-dollars problem that faces us as a country.”
One of the biggest issues, she’s convinced, is the acquiescence of the courts. Again and again, judges admit that there’s some kind of problem—the legendary Chicago federal judge and author Richard Posner admitted he hadn’t read the fine print when he signed his own mortgage—but claim their hands are tied, and sign off on the contracts nonetheless. Amato points to a Florida appeals court opinion, not yet finalized, in which the family of an elderly woman, now deceased, felt ripped off by her nursing home and challenged the legitimacy of the fine-print contract she had signed. I read the opinion. Acknowledged the judge, “At the time, she was 92 years old and had a fourth-grade education. She could not spell well and often had to sound out words while reading. She had memory problems and was increasingly confused.” He said, “As a practical matter, a significant percentage of the people who enter nursing homes and rehabilitation centers have mental or physical limitations that make it difficult for them to understand the agreements. The same is probably true for most of the contracts that we sign for many consumer services.”
This judge continued on to make a general theoretical point: “There was a time when most contracts were individually negotiated and handwritten. In that period, perhaps the law could adequately describe a mutual agreement as a ‘meeting of the minds’ between the parties”; but not any more.
When I read that, I nodded my head. I thought he was making a sympathetic point, the same one Amato has been pressing home to me: that when our entire system of consumer commerce is based on a vast, structural imbalance of power between sellers empowered to dictate terms, and buyers all but helpless to do anything but accept them just to participate in the economy, something is badly broken—in fact, the free market, which any right-wing economist says relies on adequate information to function efficiently, is badly broken. Even though I knew how this story ended—a decision unfavorable to the family of the deceased—I figured he at least was making one of those “regrettably, my hands are tied” points.
He wasn’t. He was saying the opposite: that there was no problem with inscrutable contracts at all. “Our modern economy,” he concluded, “simply could not function if a ‘meeting of the minds’ required individualized understanding of all aspects of the typical standardized contract that is now signed without any expectation that the terms will actually be negotiated between the parties.”
Well, as has recently been pointed out, to download a single ninety-nine-cent song on iTunes you have to click through a contract longer than Shakespeare’s Macbeth—and when they change a single clause in the contract, instead of showing you the individual change, you’re forced to locate it within the entire fifty pages of fine print. The defeatist notion that nothing can be done to right the balance between buyer and seller becomes an absurdity. Says Amato: “These contracts have evolved into a category you cannot understand. I mean, some of them are written for a twenty-seventh grade level. And most people don’t have postdoctoral degrees, or law degrees—and even lawyers who read them don’t understand them! They’re not meant to be understood.”
Push has come to shove. Next time, I’ll describe to you what Amato thinks can be done—and how you can join her fight.
Read Rick Perlstein on Reagan-appointed surgeon general C. Everett Koop, who held principle over any party line.

U.S. Surgeon General C. Everett Koop addresses an AIDS rally in Boston on June 4, 1989. (AP Photo/Mark Garfinkel)
A decent enough interval has passed, I hope, to begin to think about an interesting figure of our recent history in a bit of a critical temper. C. Everett Koop, surgeon general of the United States from 1981 to 1989, died on February 25 of this year—the only person to hold that title to have become a household name, not least for his goofy half-beard and his charming insistence on wearing his ceremonial brocaded Gilbert-and-Sullivan-style uniform everywhere. But also, it has to be said, for serving as an exemplar of honor and courage in a dishonorable time. The Associated Press put it like this:
An evangelical Christian, he shocked his conservative supporters when he endorsed condoms and sex education to stop the spread of AIDS.
He carried out a crusade to end smoking in the United States—his goal had been to do so by 2000. A former pipe smoker, he said cigarettes were as addictive as heroin and cocaine….
Koop, a devout Presbyterian, was confirmed after he told a Senate panel he would not use the surgeon general’s post to promote his religious ideology. He kept his word.
In 1986, he issued a frank report on AIDS, urging the use of condoms for “safe sex” and advocating sex education as early as third grade.
He also maneuvered around uncooperative Reagan administration officials in 1988 to send an educational AIDS pamphlet to more than 100 million U.S. households, the largest public health mailing ever done.
Koop personally opposed homosexuality and believed sex should be saved for marriage. But he insisted that Americans, especially young people, must not die because they were deprived of explicit information about how the HIV virus was transmitted.
He became a hero to AIDS activists, who chanted “Koop, Koop” at his appearances but booed other officials.
And all power to him for that. You don’t see his like much any more, in that there Republican Party. After all, the AP also noted, shortly before his appointment, he was going around the country predicting a “progression from legalized abortion to infanticide to passive euthanasia to active euthanasia, indeed to the very beginnings of the political climate that led to Auschwitz, Dachau, and Belsen.”
Then, strikingly enough, he changed.
Disappointingly, the Newspaper of Record downplayed that part of the story in their obituary, merely noting in that trademark anodyne Gray Lady fashion, “Dr. Koop was completing a successful career as a pioneer in pediatric surgery when he was nominated for surgeon general, having caught the attention of conservatives with a series of seminars, films, and books in collaboration with the theologian Francis Schaeffer that expressed anti-abortion views.”
“Expressed anti-abortion views”: oh, it was so much more interesting and colorful than that. On abortion, Dr. Koop made history twice: second by rejecting hysteria on the subject; and first, by pioneering it.
Frank Schaeffer, Francis Schaeffer’s son, tells the first part of that story in his fascinating memoir, Crazy for God: How I Grew Up as One of the Elect, Helped Found the Religious Right, and Lived to Take All (or Almost All) of It Back. The story begins in the most unlikely place: a religious commune in the mountains of Switzerland, l’Abri, where the elder Schaeffer and his wife Edith Schaeffer, eloquent, learned, cultured and charismatic, became a magnet for 1960s spiritual seekers including the likes of Jimmy Page of Led Zeppelin—this even though the theology was evangelical, even fundamentalist. “In Evangelical circles,” Crazy for God explains, “if you wanted to know what Bob Dylan’s songs meant, Francis Schaeffer was the man to ask…. Dad was wearing his hair longer and longer, and he grew a goatee. He took to wearing beige Nehru jackets, odd linen shirts, and mountain-climbers’ knickers,” and “evolved into a hip guru preaching Jesus to hippies, a precursor to, and spiritual father of, the Jesus movement.”
The intellectual point of commonality between the fundamentalist and the freaks was anti-materialism: “Dad said that middle-class values, bereft of their Christian foundation, were empty. He sided with ‘the kids’ against their ‘uptight parents.’… They were describing a world you can’t see, the invisible link between mortality and immortality…bring alive the biblical epoch to twentieth-century young people, competing with modernity by talking up a storm, convincing smart people that the spiritual world is more real and essential than the evidence of one’s eyes.”
Cool stuff. An impassioned student of Western art and philosophy, in the mid-1970s Francis Schaeffer spread his vision that the art and philosophy he loved were ineluctably rooted in a Christian worldview, and threatened by the decline of that worldview, by collaborating with a movie producer to create a multi-part film series intended for viewing by church groups—whose philistinism Dr. Schaeffer was glad to challenge when his creative vision demanded it: “We can’t have this for a Christian audience. Churches won’t rent it,” the producer said of stock footage of Michelangelo’s David. Responded Francis Schaeffer: “But we have other nudes and you never said anything. What about Mary’s breast in that Virgin and Child?” “That was bad enough! One holy tit is okay, as long as you don’t leave it on the screen too long. But churches don’t do cock!”
The concluding two parts of the series, which was called How Should We Then Live?: The Rise and Decline of Western Thought and Culture, took an unexpected and historically significant turn. Frank Schaeffer, the son, who was directing the films, had become an impassioned pro-life activist. Abortion, he insisted, represented just the sort of materialist desecration of a godly worldview they were seeking to illustrate. The father was skeptical; fighting abortion was a Catholic thing. Père et fils had a shouting match: “I don’t want to be identified with some Catholic issue. I’m not putting my reputation on the line for that!… What does abortion have to do with art and culture? I’m known as an intellectual, not for this sort of political thing!” Son prevailed—with the sort of ace-in-the-hole argument that would soon become all-too-creepily familiar in evangelical circles. Frank remembers himself shouting, “That’s what you always say about the Lutherans in Germany! You say they’re responsible for the Holocaust because they wouldn’t speak up, and now you’re doing the same thing! Fucking coward! You’re always talking about the ‘dehumanization of man’; now, here is your best example!”
Though I haven’t been able to find a single reference to this literally world-changing event in newspapers of the time outside announcements on the religion pages (the evangelical upsurge simply wasn’t on America’s political radar at the time), the series, which toured the nation beginning in 1977, ended up taking Christian America by storm—including a showing before 6,000 in Dallas starring quarterback Roger Staubach and half the Dallas Cowboys, and a booking in Madison Square Garden in New York. That was the way How Should We Then Live? became history’s second most influential spur to the evangelical anti-abortion crusade.
The first most influential was the sequel. Enter the Good Doctor, the surgeon-in-chief at Philadelphia’s Children’s Hospital and an old family friend of the Schaeffers.
“His pro-life passion,” Frank writes, “was based on having spent a lifetime saving the lives of babies that were sometimes the same age as those killed in late-term abortions.” He traveled to l’Abri to draft the new series of films that became Whatever Happened to the Human Race?: A Christian Response to Abortion, Euthanasia, and Infanticide. It was a lurid anti-abortion masterpiece—one with a $1.5 million budget (almost $6 million today) and a score performed by the London Symphony Orchestra.
The title cards for Episode 1 (“The Abortion of the Human Race”) introduced Drs. Schaeffer and Koop (“one of the world’s most prominent surgeons,” who “has spent a lifetime studying man’s attitudes and trends from a medical point of view”). The opening image depicted a family clad in white with white death masks painting the title on a pane of glass (the words “Human Race” in blood red)—then shattering it. Dr. Koop, in bow tie, answers the phone, dispatching a baby to the operating room, where doctors evidently save her life, then place her in an incubator. “Why is a human life worth saving?” Dr. Schaeffer narrates. “Why is it worth the trouble? Human life contains so much potential evil as well as good. It would hardly seem worth preserving at all unless there was a specific, compelling reason for doing so. Traditionally, in Western culture, the life of a human individual has been regarded as very special. The fully developed view of the sanctity of human life in the West did not come from nowhere, but came directly from the Judeo-Christian consensus…based on Biblical teachings, people used to view human life as unique, something to be protected and loved, because it was made in the image of God.”
No longer. “For example, in some of today’s hospitals, this child would be left to die…”
Why? “The answer is clear. The consensus of our society no longer rests upon a Christian base but upon a humanistic one…man putting himself at the center of all things…all reality is just one big machine…this causes people to view themselves differently, and to have different attitudes toward other human beings.” An image of rabbits in scientists’ cages evolves into one of squalling babies in scientists’ cages. Then an image of cars being crushed in a scrap yard, Dr. Schaeffer lecturing from atop the junk heap, sad-eyed, on the “mechanistic view of people” that has been replacing Christianity: “Now we are faced with a generation who has taken these ideas out of the lab, out of the classroom, and into the streets. More and more, we find ourselves in an uglier world…and God as a personal creator is excluded from the picture,” with “no reason that man should be considered as special.”
Then a discussion by Dr. Koop of fetal development, then a cool and clinical description of the various supposed forms of abortion—“the surgeon then scrapes the wall of the uterus, cutting the baby’s body to pieces…”—over the image of a black baby doll, lying as if in shards of glass, then a gaggle of baby dolls, drowning amid those shards of glass, accompanied by Bernard Herrmann–like violin screeching. Schaeffer is revealed standing above the dead baby dolls, and the apparent shards of glass are shown to be salt. The scene was literally filmed at the Dead Sea—“where the city of Sodom once stood,” Schaeffer explained…
And so on. The same sort of cross-country tour followed, only bigger; only this time, the mainstream media paid attention—at least a little bit. For instance, a version was shown in January of 1981 on prime time in Washington, DC (“No Matter How Moving, Show Is Still Propaganda,” ran a piece on the showing on the front page of the Washington Post Style section). In 1982, Newsweek’s religion reporter Kenneth Woodward profiled Schaeffer—but, as Frank Schaeffer pointed out, Woodward was “one of the few reporters who seemed to ‘get’ what was happening with the emergence of the Evangelical pro-life movement.” His editors didn’t; Newsweek had just dropped its religion section as “irrelevant.”
Ah, yes. So “irrelevant” that millions of Americans would soon adopt Whatever Happened to the Human Race’s inanities—that embryos are morally equivalent to infants; that without a conception of God ethics is impossible; that the “slippery slope” of abortion would soon lead to mass killings of the redundant elderly and handicapped—or an American Auschwitz; that late-term abortions (babies “removed alive,” as Dr. Koop shamefully claimed, then “allowed to die by neglect, and sometimes are killed by a direct act”) were epidemic—as among the most important principles they would take into the voting booths each November.
Dr. Koop had been in the forefront of this rank politicization of a minority religious opinion—then, somehow, he dropped the ideologue’s armor in favor of the scientist’s. “Koop further angered conservatives by refusing to issue a report requested by the Reagan White House, saying he could not find enough scientific evidence to determine whether abortion has harmful psychological effects on women,” the AP recalls. “At a congressional hearing in 2007, Koop spoke about political pressure on the surgeon general post. He said Reagan was pressed to fire him every day, but Reagan would not interfere”—good for Reagan, and good for Koop, who persevered. He ended up advising Bill Clinton on healthcare reform. I’m not a Christian, but I believe in redemption. Thank the Goddess C. Everett Koop found his way to this one. Rest in peace.

Rand Paul. (Reuters/Jonathan Ernst.)
I haven’t been tuning in, myself, but I’m told that in recent days Fox News has been going all-in praising Senator Rand Paul’s droning drone filibuster holding up John Brennan’s confirmation as CIA chief, with several Fox contributors fiercely attacking Senators John McCain and Lindsey Graham for taking on the young libertarian lion—or, if you prefer, for taking Barack Obama’s side. That raises interesting questions. As I observed in my Nation dispatch from last year’s Republican convention (“The GOP Throws a Tampa Tantrum”; hats off to your clever Nation editors for that awesome headline!), “Rand Paul got some of the biggest applause of his speech for saying something this party isn’t supposed to support at all: ‘Republicans must acknowledge that not every dollar spent on the military is necessary or well spent.’” And that “John McCain and Condoleezza Rice sounded like schoolmarms lecturing indifferent students when they tried to make the case that what neoconservatives used to call the ‘freedom agenda’ was being betrayed by Barack Obama.” Does all this mean the ancient (and even, sometimes, honorable) tradition of Republican “isolationism” (the word being basically an epithet in American political discourse, its advocates prefer “non-interventionism”) is making a comeback? Or, alternately, did it never really go away at the conservative grassroots, save for those distracting moments when the commander-in-chief is a conservative Republican hero like in those heady first few years of W’s Iraq War? Or is all this just another opportunity for Obama-bashing, and as such a perfect example of the intellectual contentlessness and bottomless cynicism of that favorite Republican activity? (As I put it in the piece on the convention, “What they really love—shown by the way McCain and Condi were able to win back their audience by taking cheap shots at Obama—are enemies. And within their authoritarian mindset [as George Orwell taught us with his talk about Eurasia, Eastasia and Oceania], enemies are fungible.”)
For clues, I cranked up the Patented Perlstein Wayback Machine that lives on my hard drive and discovered the following interesting parallel from 1945, when what would become the Central Intelligence Agency began as a gleam in the American security establishment’s eye. William “Wild Bill” Donovan, head of the wartime spy agency the Office of Strategic Services, proposed to the president that his outfit be made permanent. The news was leaked to one of the most reactionary reporters, Walter Trohan, of one of the nation’s most reactionary major newspapers, the Chicago Tribune (Trohan was one of the infamously bureaucratically jealous J. Edgar Hoover’s favorite reporters, and the leak almost certainly came from Hoover).
How did conservative Republicans respond to the news? Well, there’s a cliché that in America, especially in wartime, “politics stops at the water’s edge.” Here’s one of the most splendid examples of how perfectly idiotic that cliché is. The Trib immediately editorialized that FDR had in mind an American “Gestapo.” One Republican congressman said, “This is another indication that the New Deal will not halt in its quest for power. Like Simon Legree it wants to own us body and soul.” Another called it “another New Deal move right along the Hitler line. It would centralize power in Washington.”
These were the sort of (“isolationist”) Republicans who had been skeptical of going to war with the Nazis in the first place—so of course it’s fascinating to see them deploy the argumentum ad Hitlerum to leverage their hatred of FDR. There are historical complications, to be sure: the permanent spy agency that did end up being invented out of Donovan’s efforts—we call it the CIA—did indeed dangerously centralize power, abusing it badly, so there is a prophetic cast to these utterances. At the same time, the OSS and its CIA successor were notorious redoubts of just the sort of Eastern establishment nabobs long despised by conservative Republicans (you can see a fine depiction of that history in the 2006 Matt Damon film The Good Shepherd), lending an unmistakable air of political turf-maintenance to their complaints. The clincher? People like this were utterly silent about the actual American Gestapo in their midst, except when they were praising it to the skies: the one run by J. Edgar Hoover, whom conservatives universally and consistently adored.
Flash forward thirty years for more evidence of conservative Republicans’, shall we say, shaky record of good faith when it comes to criticizing security-establishment abuses when one of their own is in the Oval Office. I refer to the Church and Pike committees' investigations of 1975, which revealed the CIA’s flagrant abuse of its charter, including the assassinations of foreign leaders, and its utter failure in forecasting world-changing events, which was supposed to be its raison d’être. Here’s Ronald Reagan on his radio show: “My own reaction after months of testimony and discussion during the investigation of the CIA is ‘much ado about—if not nothing at least very little.’” He didn’t call the CIA a “Gestapo.” The opposite: he reserved his scorn for its investigators, accusing them of “witch-hunting.”
Now we have another CIA program dangerously abusing public trust. And suddenly conservatives are up in arms. There’s plenty more to say, true, about what Rand Paul might mean for the dormant conservative tradition of distrust for the security establishment; about the Obama administration’s own awful implication in institutionalizing that establishment’s mushrooming abuses; and, too, concerning the unfortunate indifference of too many Democrats and liberals to those abuses when the guy running the show is one of “ours.” But in the meantime, while we think about that stuff, let’s not neglect the obvious: conservatives have always been at war with Eurasia’s spy apparatus. Or Eastasia’s. Or Oceania’s. When it happens to be run by Democrats, I mean.
Rick Perlstein last weighed in on the Bob Woodward debacle, noting that the journalist's series of books about the Iraq War went from approving to critical in tone—perfectly in line with the changing tastes of the majority of book-buyers.