Where the past isn’t even past.
The proletarianization of higher education, according to the associate general counsel of the United Steelworkers union, has now claimed a life. In a moving op-ed published this week in the Pittsburgh Post-Gazette, Daniel Kovalik wrote of Margaret Mary Vojtko, a French teacher at Pittsburgh’s Duquesne University whose tenure there—though it was, of course, a tenure without tenure—lasted twenty-five years, and who just died at the age of 83. Receiving radiation therapy for cancer, living in a house that was nearly collapsing in on itself, and in receipt of a humiliating letter from Adult Protective Services informing her she had been reported to them as unable to take care of herself, she turned to her union for help, because that is what unions do. Kovalik helped, despite the fact that the Steelworkers did not, officially, represent her: Duquesne adjuncts had voted overwhelmingly for the USW to represent them a year ago, but the Catholic university has fought the certification of the election tooth and nail ever since, claiming that its religious beliefs should exempt it from federal labor laws. “This would be news to Georgetown University—one of only two Catholic universities to make U.S. News & World Report’s list of top twenty-five universities—which just recognized its adjunct professors’ union, citing the Catholic Church’s social justice teachings, which favor labor unions.”
She called Kovalik in a panic about the letter from Adult Protective Services, and he tried to connect her with a caseworker. “I said that she had just been let go from her job as a professor at Duquesne, that she was given no severance or retirement benefits”—after twenty-five years of loyal service; something for today’s adjuncts to look forward to, should they decide to stay in the grueling game—“and that the reason she was having trouble taking care of herself was because she was living in extreme poverty. The caseworker paused and asked with incredulity, ‘She was a professor?’ I said yes. The caseworker was shocked; this was not the usual type of person for whom she was called in to help.”
I predict that in the future caseworkers won’t be so shocked. Notes Kovalik, “Margaret Mary worked on a contract basis from semester to semester, with no job security, no benefits, and with a salary of between $3,000 and just over $3,500 per three-credit course…. Even during the best of times, when she was teaching three classes a semester and two during the summer, she was not even clearing $25,000 a year, and she received absolutely no health care benefits.” So, soon, might you, if you’re a graduate student reading this.
“Finally, in the spring, she was let go by the university, which told her she was no longer effective as an instructor—despite many glowing evaluations from students. She came to me to seek legal help to try to save her job. She said that all she wanted was money to pay her medical bills, because Duquesne, which never paid her much to begin with, gave her nothing on her way out the door.”
Compare that, Kovalik says, to Duquesne’s president, whose pay package runs upwards of $700,000—you know, the guy with the pauperization of a dead 83-year-old on his conscience. “Duquesne knew all about Margaret Mary’s plight, for I apprised them of it in two letters. I never received a reply, and Margaret Mary was forced to die saddened, penniless and on the verge of being turned over to Orphans’ Court.”
I’m still collecting adjuncts’ stories. Here is one I’ve recently received from a psychology teacher in New Jersey. She told me what she loves:
“I get to stay informed about research, go to conferences, and have access to academic materials….
“I LOOOOOVE teaching my students.
“I experience a tremendous feeling of accomplishment when they come to me rather than their advisors because they trust me, not that I want to usurp the advisor.
“I jump for joy along with them when they get into a Ph.D program, law school, or just get the job they applied for. I cry with them when they don’t.
“I love that three years later when my students run into me in some random place in NYC they remember me and are happy to see me. I love when they tell me how much my class meant to them.
“That my class is safe enough for a young man to ‘come out’ and for a young girl to talk about a sexual assault and my students show compassion to them.
“That when I get a ton of email most of them are just saying ‘thank you’ for something I said in class or a response to a previous email sent earlier in the day.”
And here is what she doesn’t: “I don’t love that my salary is less than what most welfare recipients receive and I am permitted to teach only two classes…. I also don’t have office hours. Yet I make myself available to my students, I help them wherever they need help, and though this is my choice I know that I am doing more than many tenured professors at my university.”
The only wrong note in the Pittsburgh piece is Kovalik’s claim that adjuncts make up “well over 50 percent of the faculty at colleges and universities.” It’s well, well, well over 50 percent, in fact—more like 66–75 percent. Expect more cases like that of Margaret Mary Vojtko. Social service agencies, be prepared.
Peter Beinart (AP Photo/David Goldman)
Peter Beinart is out with a major new argument in The Daily Beast about what the political future might hold in store for us. The headline writer calls it “The Rise of the New New Left,” and it begins by citing the recent victory of liberal populist Bill de Blasio in New York’s mayoral primary. “The deeper you look,” Beinart writes, “the stronger the evidence that de Blasio’s victory is an omen of what may be the defining story of America’s next political era: the challenge, to both parties, from the left.”
The argument is generational: that the class of politicians who govern us now take for granted that our ideological debates take place between the goalposts of Reaganism and Clintonism—as manifested not least in the case of Barack Hussein Obama. He notes how Obama established himself in The Audacity of Hope as a child of Reaganism (“In arguments with some of my friends on the left, I would find myself in the curious position of defending aspects of Reagan’s worldview”) with a distaste for Reaganism’s left-wing opponents—who enforced a tyranny of “either/or thinking” that only leads to “ideological deadlock.” Obama emerged as a true avatar of Third Way politics. But the Third Way was at its heart about embracing the dynamism of the market and denying the necessity of activist government to protect people from its ravages—and for kids today, the experience of being ravaged by the free market is their dominant impression of the world. Ravaged by usurious student debt. Ravaged by subprime mortgage scams. Ravaged by structural unemployment. Ravaged by the arrogance of unaccountable plutocrats. And on and on. So much for today’s kids embracing the Third Way.
They are also, he continues, “less susceptible” to “right-wing populist appeals”: they are less white and less religious than the populations these appeals are designed for, and also “more dovish on foreign policy.” And above all, they are far to the left of their parents economically: “In 2010, Pew found that two-thirds of Millennials favored bigger government with more services over a cheaper one with fewer services, a margin of 25 points above the rest of the population. While large majorities of older and middle-aged Americans favored repealing Obamacare in late 2012, Millennials favored expanding it, by 17 points.” And “unlike older Americans, who favor capitalism over socialism by roughly 25 points, Millennials narrowly favor socialism.”
Most fascinatingly, he finds evidence that young Republicans consider the GOP’s Generation X superstars creeps: “According to a 2012 Harvard survey, young Americans were more than twice as likely to say Mitt Romney’s selection of Ryan made them feel more negative about the ticket than more positive. In his 2010 Senate race, Rubio fared worse among young voters than any other age group. The same goes for Rand Paul in his Senate race that year in Kentucky, and Scott Walker in his 2010 race for governor of Wisconsin and his recall battle in 2012,” as well as Ted Cruz’s 2012 Texas Senate race.
Occupy, he says, is an omen—and its children, members of what Chris Hayes has called the “newly radicalized upper-middle class,” are now making their long march through the institutions, with de Blasio’s victory against an (admittedly divided) field of Clintonian Democrats as the harbinger of things to come. He concludes that “Hillary”—a Clintonite Democrat if there ever was one—“is vulnerable to a candidate who can inspire passion and embody fundamental change, especially on the subject of economic inequality and corporate power, a subject with deep resonances among Millennial Democrats. And the candidate who best fits that description is Elizabeth Warren.”
What do I think of the argument?
Well, I’m a historian. The act of predicting the future discomfits me, in any event—and the bigger the prediction, the more distrusting I am. (I sketched out my objections to the demographic arguments for Democratic inevitability here, here and here.) I also have never much dug “generational” arguments, finding them rigidly deterministic and reductionist, betraying a style of thinking more appropriate to ad industry hustlers than serious political analysts.
But Beinart’s arguments are smarter than those. For one, he explicitly rejects what is most offensively schematic about generational arguments: that “generations” are identities that emerge automatically, like clockwork, every twenty or thirty years or whatever—Depression/World War II, Baby Boom, Generation X, Millennial—like the unfolding of the seasons. He deploys instead the conception of the early-twentieth-century German social thinker Karl Mannheim, for whom, he notes, “generations were born from historical disruption.” He argued that “people are disproportionately influenced by events that occur between their late teens and mid-twenties,” and that, as such, “a generation has no set length. A new one could emerge ‘every year, every thirty, every hundred.’ What mattered was whether the events people experienced while at their most malleable were sufficiently different from those experienced by people older or younger than themselves.” That is correct: this is how a generational identity is stamped—by a sense of everyday difference from the elders who do not understand them.
But Beinart downplays, even while he acknowledges, another crucial argument of Mannheim’s classic The Problem of Generations: that political generations are not defined by a common ideology but by a common ideological argument. For example, the German political generation of the Weimar era was defined by an argument over the meaning of Germany’s loss in World War I and the traumas of the punitive Peace of Versailles. For that era’s left, the solution was proletarian revolution; for the right, the revanchism of Nazism—“socialism or barbarism,” as the left laid out the alternatives. For the Baby Boomers in America, the political argument, in an age when prosperity seemed self-evident and scarcity no longer seemed an issue, was over “freedom.” Theodore White sketched out a Mannheimian observation in a memorable footnote in The Making of the President 1964:
I have attended as many civil-rights rallies as Goldwater rallies. The dominant word of these two groups, which loathe each other, is “freedom.” Both demand either Freedom Now or Freedom for All. The word has such emotional power behind it in argument, either with civil rights extremists or Goldwater extremists, a reporter is instantly denounced for questioning what they mean by the word “freedom.” It is quite possible that these two groups may kill each other in cold blood, both waving banners bearing the same word.
But Beinart’s generational argument is deterministic. It’s not about what the defining argument of the future will be. Young people’s ideological outlook seems to him already settled—leftward. That’s far too simple and optimistic.
For one thing, it assumes that political dynamics are linear—since the trends tend this way now, they will only tend that way more so in the future. It thus leaves out an awful lot of variables that complicate any narrative of progress.
For another, he assumes that America has a democracy.
But consider the counter-evidence against that, of which many of you are aware. Thanks to partisan gerrymandering by power-hungry Republicans (remember the counsel to a Texas representative who bragged in a 2003 e-mail to colleagues that they’d fixed it for Republicans to “assure that Republicans keep the House no matter the national mood”), our House of Representatives is, in fact, far from representative. You can’t repeat it often enough: when Barack Obama wins the state of Pennsylvania by five points but the delegation Pennsylvania returns to the House of Representatives contains thirteen Republicans and only five Democrats—well, poll numbers aren’t counting for very much, are they?
Then there’s the “dark money” problem: for instance, the recent news that “a single nonprofit group with ties to Charles G. and David H. Koch provided grants of $236 million to conservative organizations before the 2012 election, according to tax returns the group is expected to file Monday…. Freedom Partners established itself in November 2011 as a 501(c)6 ‘business league,’ typically a trade association of corporations, like the Chamber of Commerce, organized to promote a common business interest. Instead of donors, it has more than 200 ‘members,’ each making a minimum $100,000 contribution, which Freedom Partners classifies as member dues. The approach gives it many of the same advantages social welfare groups have, with one significant addition: Some contributions to the group may be tax deductible as a business expense.”
So let’s assume Beinart is right in his generational diagnosis: kids who came to their maturity during the “Age of Fail,” whose formative experience of American exceptionalism is that America is exceptionally crappy, are pissed, and are willing to work hard for politicians who are willing to do something about it.
If that is so, another scenario looks like this: young citizens motivated by left-leaning passions run into a brick wall again and again and again trying to turn their convictions into power. The defining story of our next political era becomes not a New New Left but a corrosive disillusionment that drives the country into ever deeper sloughs of apathy.
What if, in other words, the harbinger election didn’t take place in New York but in Colorado—where a hyper-ideological, insurrectionist, corporate-money-soaked minority, as I pointed out the other day, recalled two progressive legislators for daring to favor background checks for gun purchases, even though Coloradans want background checks by a margin of 68 to 27 percent?
Beinart wants to think big. So let’s think big. Given a precedent like that, the result of our current trends might not be more socialism, but once more a stark showdown between socialism and barbarism. Apathy and social misery might make fertile ground for some charismatic demagogue, preaching scapegoating and a narrative of violent redemption…
But that’s a big, big prediction—and again, as a historian, I don’t like big predictions. Let’s stay close to the ground, and the near-term, instead. Beinart has amassed some very convincing poll numbers about the mood of young voters. He has written, “If Hillary Clinton is shrewd, she will embrace it, and thus narrow the path for a populist challenger.” Hillary Clinton surely reads Peter Beinart. Let’s hope she reads and heeds this. That would be a very nice start. What will come next, frankly, nobody knows.
Forest Whitaker as Cecil Gaines in a scene from Lee Daniels’s The Butler. (AP Photo/The Weinstein Company/Anne Marie Fox)
Have you heard about the right-wing outrage over The Butler, the movie about the faithful White House retainer who served under presidents from Dwight D. Eisenhower to Ronald Reagan and transformed himself from a Booker T. Washington into a W.E.B. Du Bois under the tutelage of his civil rights activist son?
The movie presents a sort of Forrest Gump journey through the American civil rights movement, from Eisenhower’s anguish over whether to send federal troops to escort black students past the racist mobs into Little Rock Central High School to the 1980s protests over apartheid. When I heard about it, I was skeptical. I wrote on Facebook before the first time I tried to see it (it was sold out; my mostly black neighborhood is crazy about the thing) that I “will try to keep open mind but dollars to donuts my dominant impression will be: ‘When there’s finally a movie about a white man serving as foil to the moral development of a black man—then, and only then, Dr. King’s dream will be on its way to fulfillment.’” You know the kind of thing I was afraid of: the “Magical Negro” narrative, in which a black character “who often possesses special insight or mystical powers” exists only to come to the spiritual aid of the white protagonist.
It turned out to be far richer than I’d expected—not another Bagger Vance retread but in fact a refreshingly pointed examination of political conflict within African-American families. In a key plot setup—spoiler approaching—Cecil Gaines, the hyper-competent, hyper-compliant butler played by Forest Whitaker, is inspired by the likes of Martin Luther King Jr. to present himself to his martinet boss, played by an icy Jim Gleason, to complain that black White House employees are paid less for the same work and are denied opportunities for promotion. In response, he hears the glorious battle cry of freedom beloved of American bosses everywhere: if you don’t like it, you can quit.
Soon, though, comes the reign of Ronald Reagan, and a second key plot development (spoiler alert). The second time his boss refuses his protest, Whitaker’s butler responds that he’ll be sure to tell the commander-in-chief, Mr. Reagan, about his objections. The boss’s jaw, proverbially, drops, the black staffers get their raises and promotions, and the wisdom of Booker T. Washington’s advice to generations of black men—that the best way to advance is to do your job exceptionally well and to cultivate powerful white patrons—is affirmed. Then, however, in one of the fictionalized film’s few accuracies, Nancy Reagan personally invites him to a state dinner—which ends up (back to fiction again) cementing his turn to Du Bois–ism: he feels the sting of shame at witnessing how his butler-friends are forced to display obsequiousness toward him; almost simultaneously, he overhears President Reagan forcefully asserting his intention to veto South African sanctions. He soon quits, and—spoiler! spoiler! spoiler!—ends up in jail with his son for protesting American accommodation with apartheid.
Cue wingnut outrage.
Columnist Mona Charen offered a quiz: Reagan or Obama, “Which president did more to help black Americans?” Why, Reagan, of course: “The black labor-force-participation rate, which rose throughout the 1980s and 1990s, has declined for the past decade and quite sharply under Obama to 61.4 percent.” Case closed. She writes: “The Butler… misrepresents President Reagan (as I gather from those who’ve seen it [sic]) as, at best, insensitive to blacks, and at worst as racist.” Michael Reagan chimed in: “There you go again, Hollywood. You’ve taken a great story about a real person and real events and twisted it into a bunch of lies,” he wrote on Townhall.com. “If you knew my father, you’d know he was the last person on Earth you could call a racist.” In The Washington Post, three historians who earn their paychecks as professional conservatives weighed in with a brief about the Gipper’s “sensitivity to racial discrimination.” (“While accurate in depicting Reagan’s opposition to sanctions against South Africa…”)
Nation readers don’t need much persuading about how dubious this stuff is—how disastrous Reagan’s policies as president were for struggling African-Americans, how (to indulge the argument ad Hitlerum) even a certain German was nice to his dog; personality is not policy. And in this magazine last July, Sam Kleiner explained how soft Reagan and Reaganites were on apartheid.
Yes, there’s plenty that’s historically haywire in the picture: chronology that’s out of order, a character dying in Vietnam eleven months after American involvement ended, made-up dialogue from people like Martin Luther King, misappropriated historical credit. But personally, speaking as a historian and a storyteller, when it comes to inaccuracy in historical fictioneering, I follow the Shakespeare principle: I’m willing to overlook gobs of mistaken detail if the poetic valence is basically correct. (Richard III, after all, probably never actually said as he lay dying on the battlefield, “My kingdom for a horse.”)
And on Reagan, The Butler’s poetry is acute. Has there ever been an American who so doted upon his kindness to individual African-Americans, who did more to disadvantage them as a class?
It was a core component of the man’s amour-propre: “I am just incapable of prejudice,” as he said on his debut on Meet the Press, in 1966, and many, many times after. Yet he distrusted civil rights laws (like the 1964 act outlawing discrimination in public accommodations that he called “a bad piece of legislation”) as unwarranted intrusions of federal power into the lives of individuals. A man who was raised by a Protestant mother who married his Catholic father in an anti-Catholic age; who played side by side with black boys; who was raised in a church that preached racial brotherhood; whose mother took in released prisoners, black and white, to convalesce in the family sewing room—how could he be racist?
There was the time he wanted to see Birth of a Nation, D.W. Griffith’s pro-Ku Klux Klan blockbuster. “My brother and I were the only kids not to see it,” he would say, reciting his father’s words: “The Klan’s the Klan, and a sheet’s a sheet, and any man who wears one over his head is a bum. And I want no more words on the subject.” And the time his father was working as a traveling salesman and the desk clerk at the only hotel in a small town proudly informed him that the place didn’t serve Jews; Jack announced they wouldn’t be serving this Catholic, either, and slept a winter’s night in his car. There was the time when a visiting team could find no hotel to stay in, for they had two black players, and were welcomed into the Reagan home instead. It was precisely such magnanimous gestures on the part of individual whites that could solve any lingering racial problem and, since Americans were magnanimous, would solve the problem.
“There must be no lack of equal opportunity, no inequality before the law,” as he said in the televised opening speech of his 1966 gubernatorial campaign. But “there is a limit to what can be accomplished by laws and regulations, and I seriously question whether anything additional is needed in that line.” (In Washington, a new civil rights law banning housing discrimination was then being debated; in Chicago, marching through the city’s white bungalow belt in favor of the principle, Martin Luther King was jeered by swastika-wielding protesters and had knives and rocks thrown at his head.) The next year, while visiting Eureka College to dedicate a new library, he asked the students at his alma mater a rhetorical question: “The problems of the urban ghetto are the result of selfishness on our part, of indifference to suffering?”; the answer was plain. “No people in all the history of mankind have shaped so wisely its material circumstances.” Speaking again at Eureka in 1973 he marveled at those who claimed America was still marred by racism: Hadn’t Los Angeles just elected a black mayor?
It was part of his liturgy of absolution on, for instance, the subject of “law and order.” “The phrase has become unfashionable,” he said on one of his radio broadcasts in the summer of 1975. “Those who have made it so began looking askance at anyone who used the words. Their arched eyebrows were a reaction to the words; they would inform you that ‘law and order’ were ‘code words’ that really meant a call for racial discrimination…. Well, I think this inference of bigotry is in itself bigoted…. Are they not implying that our fellow citizens that happen to be black are so given to crime that a call for law and order is automatically a call for a curb on the black community?” He went on to cite an unidentified “survey done in the nation’s capital” that found more blacks than whites wanted “sterner action against criminals”—proving, he concluded with an extraordinarily artful rhetorical inversion, that “‘law and order’ is not a code word to blacks. It’s a cry for help—and we’d better join them.”
Indeed, in 1968 when a black questioner asked him why she never saw blacks at Republican events, he politely but forcefully replied that it wasn’t Republicans who were racist but the supposedly liberal Democrats, “a party that had betrayed them…. The Negro has delivered himself to those who have no other intention than to create a Federal plantation and ignore him.” (We’d hear that one again…) The New York Times reported, “Reagan handled the situation so smoothly that some of the newsmen aboard his chartered 727 suggested, half-seriously, that the Reagan organization had set up the incident.”
He hadn’t always handled such questions so smoothly. He almost never lost his temper in public. He did once, however, during his 1966 gubernatorial primary campaign. A delegate at the National Negro Republican Assembly in Santa Monica said, “It grieves me when a leading Republican candidate….” Reagan then shocked the assembly by slamming down his note card and shouting, “I resent the implication that there is any bigotry in my nature. Don’t anyone ever imply I lack integrity. I will not stand silent and let anyone imply that—in this or any other group.”
He slammed his fist into his palm, muttered something and walked out of the convention.
He knew better than to blow up again. He learned to respond with pleasing stories about racial uplift instead. There was, for example, the incredible moment two days after he announced his candidacy for the 1976 Republican presidential nomination. He was in Charlotte, North Carolina. In his speech he preached a homily on racial reconciliation: “When the first bombs were dropped on Pearl Harbor there was great segregation in the military forces. In World War II, this was corrected. It was corrected largely under the leadership of generals like MacArthur and Eisenhower…. One great story that I think of at that time, that reveals a change was occurring, was when the Japanese dropped the bomb on Pearl Harbor there was a Negro sailor whose total duties involved kitchen-type duties…. He cradled a machine gun in his arms, which is not an easy thing to do, and stood on the end of a pier blazing away at Japanese airplanes that were coming down and strafing him and that was all changed.” This was news to the more historically minded reporters, who knew the armed forces integrated only under an executive order from Harry Truman, in 1948, three years after the war ended—and that segregation only ended in the rest of society after concerted protest and civil disobedience. In the press conference that followed he was asked whether he had approved Martin Luther King’s civil disobedience tactics.
No, he responded: “There can never be any justification for breaking the law.”
Then, someone followed up, how could blacks have ever gained their civil rights in places like North Carolina?
He undertook to explain, in response, “where I think the first change began…. I have often stated publicly that the great tragedy was then that we didn’t even know that we had a racial problem. It wasn’t even recognized. But our generation, and I take great pride in this, were the ones who first of all recognized and then began doing something about it.”
Reportorial ears surely pricked up at that: this was going to be something.
“I have called attention to the fact that when I was a sports announcer, broadcasting Major League baseball, most Americans had forgotten that at the time the opening lines of the official baseball guide read, ‘Baseball is a game for Caucasian gentlemen,’ and in organized baseball no one but Caucasians were allowed. Well, there were many of us when I was broadcasting, sportswriters, sportscasters, myself included, began editorializing about what a ridiculous thing this was and why it should be changed. And one day it was changed.”
And indeed, he had called attention to that supposed fact, in 1967, in a televised debate with Robert Kennedy. But if anyone in the interim had bothered to point out to him that there was no such “official baseball guide” reading “Baseball is a game for Caucasian gentlemen”; or that he stopped broadcasting baseball in 1937 while the sport wasn’t integrated until 1947; or that no Iowans who heard him back then could recall him ever raising the subject on the air—the intervention clearly didn’t take, for he was still telling the story in the Oval Office nine years later.
That was Reagan on race. “Eugene Allen, the actual White House butler on whom the film is supposedly based, kept signed photos of Ronald and Nancy Reagan in his living room (pictures of the other presidents he had served hung in the basement),” Mona Charen writes. I don’t doubt it. Reagan was usually nice to individuals (though not so much if they were his children: don’t forget that he failed to recognize his own son Michael Reagan—so eager to defend his dad now—when attending Michael’s high school graduation, introducing himself, “My name is Ronald Reagan. What’s yours?”). The film accurately depicts his practice of writing checks to individual citizens who wrote to him with sob stories, enlisting loyal Cecil to help him hide the practice—“Don’t tell Nancy!”—from his embarrassed staff and wife. But if The Butler is brilliant about anything, it is in grasping how this is actually the opposite of racial progress—because it makes racial progress seem unnecessary. That’s the whole point of the movie—which, unsurprisingly, conservatives have proven unable to grok.
Activists attend a pro-gun rally as part of the National Day of Resistance at the state Capitol in Salt Lake City, Utah, on February 23, 2013. (Reuters/Jim Urquhart)
I Googled the phrase “stack them in the streets” because I was searching for a historical reference from 1968. I ended up stumbling on something else: an article from the far-right site MinutemenNews.com explaining what was really going on with the gun debate in Congress:
Once cowed at the thought of provoking Second Amendment supporters, leftists will soon attempt to ban ‘assault weapons’ (and much more) as legislation offered by Diane Feinstein makes its way to the Senate floor…. Maybe Democrats are confident that fallout from Sandy Hook will provide the floor votes necessary to disarm the American people. But if the left is willing to risk picking this fight with millions of American gun owners, it must also believe something far more important—that Americans who have spent years arming themselves against the ultimate expression of tyranny by their own government—the overthrow of the Second Amendment—will choose to not fight when the time finally comes.
It is illustrated with a Che-like logo of a machine gun’s silhouette and the legend “COME AND TAKE IT”—a slogan, which is crucial to know about if you want to understand the contemporary right, that I wrote about here.
And then, at MinutemenNews.com, came the 420 comments:
“…we will run them like the british to the shores and they better hope theres a boat waiting for them to take there ass to Europe with rest of the sheeple. If not, the sharks are be eating good that week. They better hope we deside to show mercy long enuff for them to get on planes or ships to leave the usa….”
“thats right there could be a lot of dead LEFTIST!!”
“Stack them in the streets like cordwood. There’s no room for prisoners.”
“Take members of Congress prisoner and hold them hostage because it is reported that many Federal Depts have ordered millions of rounds of hollow points supposedly to hold off civil unrest and insurrection. How long does the left think these Federal employees will hold out if many members of Congress are held as hostages…”
“Not hostage—Citizen Arrest. Their vote will provide the wanted list and their confession.”
“HELL NO, GITMOIZE THEM!”
“Bad time to be a politician. Dibs on shooting/hanging the President.”
How relevant should this stuff be, you ask, to one’s ordinary political reflections and calculations? After all, how many people can possibly read MinutemenNews.com? Well, let’s investigate. Deploying the algorithm at Ranking.com (and allowing that there’s no easily settled way to measure traffic), I compared the traffic there to some of the sites I read and have written for. CoreyRobin.com, the blog of the Brooklyn College professor who is my favorite political writer, ranks 144,234. The website of Campaign for America’s Future, where I published in 2007 and 2008, is at 41,870. Three of my regular lefty reading stops, the blogs Lawyers, Guns, and Money, Americablog, and Crooks and Liars—where I contributed regularly last year—rank 41,346 and 11,301 and 9,897, respectively. The homes of my online columnizing in 2006-07 and 2012, TNR.com and RollingStone.com, clock in at 8,481 and 7,263. TheNation.com? We are at 9,088.
MinutemenNews is in the middle of that pack: 12,600.
It is a reasonable surmise, then, that the author of “The Left Is Convinced Americans Won’t Fight for Second Amendment Rights” has at least as many or possibly more readers than I do. And look here: as much as I hate to admit it, its readers race leagues ahead of the lefty blog pack when it comes to putting their money where their mouth is: knocking on doors in campaigns, stuffing envelopes—and, don’t forget, showing up at political meetings with guns. As my favorite blogger Digby always reminds us, pissing in the wind as far as I can tell, committing politics while armed is the ultimate act of civic intimidation. I find it very hard to argue that the implicit threat by these people to shoot politicians and officers of the law who cross them—or better yet, to “stack them like cordwood”—does not provide some sort of unmeasurable advantage in political conflicts.
Gun nuts are the most motivated people in our politics. And now we’ve had a natural experiment to prove it: the first recall election in Colorado history was lost by two state senators who had the temerity to vote for legislation requiring background checks for firearms purchases and banning ammunition magazines over fifteen rounds.
A site called PoliticusUSA.com pronounced with bafflement: “Colorado Voters Support Background Checks Yet Still Recall Lawmakers for Background Checks.” The headline topped a nice roundup of data from the election last Tuesday. Senate President John Morse went down 51 to 49 in conservative Colorado Springs, and the other senator, Angela Giron, went down 56-44 in blue-collar Pueblo, both “not districts that lean heavily Republican.” Statewide, a Public Policy Polling survey found the weekend before the balloting that Colorado voters favored background checks by 68 to 27 percent. The site concluded that this means Democrats might face trouble in the next statewide election, but I thought that was a dense conclusion. It was the issue here that was the issue—the issue of “politicians taking away our guns.”
“Intensity of commitment” is a difficult problem for political theory to analyze: is it a violation of the public will when fanatics motivate themselves so much more efficiently than moderates? (“The definition of ‘moderate,’” I once read a Barry Goldwater supporter noting in 1964, “is ‘someone who doesn’t knock on doors on election day.’”) Is it “undemocratic” when a measure overwhelmingly favored by “ordinary” voters is defeated by conspiracy theorists who fear that a baby step designed to keep guns out of the hands of psychotics and criminals is actually a giant leap toward New World Order government? Or is it the essence of democracy?
I wish I knew.
But that’s an intellectual problem. The political answer is obvious. Don’t mourn. Expose the fact that the National Rifle Association and its acolytes violate all the bounds of civility that make democratic deliberation possible (most people simply don’t know this: the PPP found that the same Coloradans who want background checks by a margin of 68 to 27 percent also have a positive view of the NRA by a margin of 53 to 33 percent). Embarrass the pundits who refuse to recognize this. Tell the story that Digby’s been trying to tell: that guns at political events make democracy impossible. And don’t be moderate. Knock on doors on election day. Organize.
Leonard Zeskind and Devin Burghart chronicle America’s burgeoning militia movement.
A supporter of former Chilean dictator Augusto Pinochet holds a picture of Pinochet outside of Chile's Military Hospital in Santiago on December 3, 2006. (Reuters/Ivan Alvarado)
Let’s not forget Chile.
On September 11, 1973, warplanes began strafing radio stations and newspapers. Images arrived of people scattering in fear ahead of tanks in the streets. Fearsome generals in coats with starred epaulets ordered President Salvador Allende, the world’s only elected Marxist leader, to step down. A military communiqué: “The armed forces and the body of carabineros are united in their historic and responsible mission of fighting to liberate Chile from the Marxist yoke.” Signed: General Augusto Pinochet Ugarte, Commander-in-Chief of the Army.
Pinochet’s coup came the day before a planned national referendum scheduled by Allende, a man fastidiously obsessed with observing his nation’s constitution. Unlike Allende, the military chose not to chance democracy. Instead, they rounded up thousands and deposited them in the national stadium, some marked for execution. In the streets of Santiago, loudspeakers barked out commands: “All people resisting the new government will pay the price.” For at least seventy-five of them, in the first three weeks, the price was execution by Pinochet’s Caravana de la Muerte—the “Caravan of Death.” One bullet-riddled body, belonging to the popular, pacifist singer Víctor Jara, was found dumped in a Santiago back street, his hands broken and his wrists cracked.
On the morning of the coup, the ousted president, refusing to yield, made his way to the parliament for one last speech. Then he fell back to the presidential palace, now bombarded by planes and pummeled by tanks, forty civil servants still pinned inside. The majestic building nearly burned to the ground. “Oh, baby!” a CBS Radio correspondent intoned on the air with an intake of breath as machine gun volleys sounded. “We’re in the wrong place…. We are pinned down on a corner…looking at a policeman with an automatic rifle…. What the hell am I doing here?”
I wrote here this winter about what General Pinochet’s subsequent rule was like, in the context of Ronald Reagan’s complaining two years later about “the innuendos and the accusations that the CIA and our government had a hand in bringing about the downfall of the government of Chile,” flaying congressmen who “act as if fascism had been imposed on the Chileans, to their great distress and unhappiness,” citing a goofy Gallup Poll—as if “citizens” in a police state could tell strangers honestly what they thought about their government—that 60 percent of Chileans approved of their government and only 3 percent did not. Reagan: “This is quite a contrast to much of what we’ve heard in the news about a reign of terror, political prisoners, torture and a depressed and frightened populace!”
But the Chileans since September 11, 1973, had lived under an official “state of siege,” renewed every month by military decree and not lifted until 1978, at which point General Pinochet revised the state of siege to a mere “state of emergency.” The new rules, he magnanimously explained, meant “I cannot banish anyone for more than six months and there will be no more trials of a military nature.” The death toll of his murderous regime was up in the many thousands by then.
And it couldn’t have happened without our tax dollars.
On December 18, 1975, the Church Committee transmitted its report, “U.S. Covert Actions by the Central Intelligence Agency (CIA) in Chile (Including the ‘Assassination’ of Salvador Allende), 1963–1973.” Short, sharp and thorough like all the Church Committee’s publications—you can pick it up yourself in a nice 127-page paperback from ARC Manor Publishers, including Congressman Maurice Hinchey’s follow-up 2000 report. It concluded: “Covert United States involvement in Chile in the decade between 1963 and 1973 was extensive and continuous. The Central Intelligence Agency spent three million dollars in an effort to influence the outcome of the 1964 Chilean presidential elections. Eight million dollars was spent, covertly, in the three years between 1970 and the military coup in September, 1973, with over three million dollars expended in fiscal year 1972 alone…. What did covert CIA money buy in Chile? It financed activities covering a broad spectrum, from simple propaganda manipulation of the press to large-scale support for Chilean political parties, to public opinion polls to direct attempts to foment a military coup.” In 1970, when the agency passed weapons to the plotters, the army’s commander-in-chief, René Schneider, who opposed them, ended up dead.
What the hell were we doing there, indeed. Let’s not let it happen ever again.
Peter Rothberg commemorates Salvador Allende's final public address.
From left, Chairman of the Joint Chiefs of Staff Martin E. Dempsey, US Secretary of State John Kerry and US Secretary of Defense Chuck Hagel at the first hearing on Syria held by the Senate Foreign Relations Committee on September 3, 2013. (AP Photo)
Maybe the cruise missiles will fly on September 11: wouldn’t that be grand?
It was eleven years ago, a year after September 11, 2001, in the runup to war in Iraq—and my, how time flies—when I published a piece about what George W. Bush was doing to Americans’ power to exercise their muscles for critical citizenship. It began by recording Richard M. Nixon’s envy at Chinese commissars who got to decide what would be in the next day’s People’s Daily. George Bush, too: “Dictatorship would be a heck of a lot easier,” he’d said a few months before expressing how impressed he was when China’s President Jiang Zemin ended a joint news conference with him by walking away after the second question. I called the piece “Surrender to Trust,” and in it I tried to pound home some basic points of democratic theory: “Secrecy and power are intimates: Both tend to corrupt; both, when absolute, tend to corrupt absolutely; and both can steal up like an addiction. The cover-up, even of an innocent error, can be worse than the crime. That is why any break in any check or balance in our constitutional power structure should make the front page. Every time. But they almost never do.”
Maybe we’re a little bit better now. But maybe not. “With the World Watching, Syria Amassed Nerve Gas,” Judith Miller’s old paper headlined yesterday. But they didn’t really offer any evidence. John Kerry and Barack Obama now say the evidence is indisputable that President Assad ordered a chemical attack. But listen to Congressman Alan Grayson’s devastating argument from Saturday: “The documentary record regarding an attack on Syria consists of just two papers: a four-page unclassified summary and a 12-page classified summary. The first enumerates only the evidence in favor of the attack”—and “cites intercepted telephone calls, ‘social media’ postings, and the like, but not one of these is actually quoted or attached.”
He says, “I’m not allowed to tell you what’s in the classified summary, but you can draw your conclusion.” (My conclusion, duh: Grayson’s trying to tell us that the extra eight pages add nothing important.) The House Intelligence Committee told him there was no other documentation available for him to examine prior to his vote. That committee hasn’t received access to the intelligence reports on which the conclusions are based.
Grayson notes a media report “that the Obama administration had selectively used intelligence to justify military strikes in Syria, with one report ‘doctored so that it leads a reader to just the opposite conclusion reached by the original report’ ”—the conclusion was that some evidence suggests that Assad didn’t himself know about the chemical weapon attack, which may have taken place in defiance of the orders of his general staff.
He writes that John Kerry “has said repeatedly that this administration isn’t trying to manipulate the intelligence reports the way that the Bush administration did to rationalize its invasion of Iraq.”
In other words, Kerry says, Trust me.
The American people seem to get it. According to polls, they’re saying, No way. Do Democratic congressional representatives get it? Not enough, I’m afraid. On NPR, the bright young comer Adam Schiff of California says, “I’m convinced about the evidence. I think there’s really compelling evidence that Assad has gassed his own people, and not once but multiple times; this being the worst occasion. I also think that a military strike could have the effect of deterring him from doing it again.” But, oops, not even United Nations Ambassador Samantha Power seems particularly confident about that.
Kerry then suggested maybe Russia’s proposal to give Assad a chance to turn over any chemical weapons wasn’t such a bad idea—that if this happened, America’s objective would be achieved. Then, basically, a spokesman said: just kidding. He “was making a rhetorical argument about the impossibility and unlikelihood of Assad turning over chemical weapons he has denied he used.” That’s George W. Bush–style thinking—that the administration’s mind is made up: a bad guy is a bad guy, and if you entertain any possibility of rationality on the part of evildoers, you’re just being a naive sucker.
But that’s not how the founders of our nation wanted it. They didn’t want us to trust any politician. As I wrote, “Our Federalist Papers forefathers once wrote something wise enough to deserve to be affixed to our civic doorposts: ‘If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary.’ We need not wonder whether [the president] is angelic to make the point: Both external and internal controls are falling by the wayside in this White House.” Only it was eleven years ago, and I was writing—then—about George Bush.
When the president speaks, think about this: after a day when The New York Times writes about an activist journalist arrested and threatened with over a hundred years in jail for linking to a URL the security establishment didn’t like, and with a White House that demands unprecedented power over reporters to change quotes at will, I don’t see any reason the arguments we used to make about George Bush and trust don’t apply equally to Barack Obama as well. Listen to Grayson: why should we trust Obama? After all, he clearly doesn’t trust us.
Greg Mitchell reviews public opinion polls on intervention in Syria.
Higher education stamp from 1962. (Wikimedia Commons)
On August 21 I published an essay in which I offered several anecdotes in the service of the argument that with the rise of a class of permanently under-employed college instructors, alongside a class of “tenured professors…hardly aware that they’re aristocrats and that they oversee an army of intellectual serfs,” American society has reproduced the era of the “gentleman scholar”—because the only people who can afford the soul-satisfying profession of adding to the world’s store of knowledge, and passing it on to the next generation of college students, are the independently wealthy.
I also made a broader social argument: First, that the historic expansion of liberal arts education “made America more decent, more lovely, more cultured, more critical, even—ask anyone who went to college in the 1960s or ’70s—more fun. It made America richer too, both spiritually and materially; though in an important sense the first condition fed the second, as the liberation of intellectual imaginations midwifed a thousand productive careers in every field, careers that were productive precisely because they were inspired by a ‘liberal arts’ attitude, not merely pinched Babbitt-like commercial aspirations.” And that, therefore, letting this professoriate atrophy was one way “a healthy capitalist society eats its seed corn.”
The piece received a staggering amount of attention among the sort of people I was writing about—graduate students, and adjunct and tenured professors. So I invited people to write me with stories of their experience within this rapidly transforming space.
Before I turn to those, let me address a criticism. I called the post “On the Death of Democratic Higher Education.” Some asked what I meant by “democratic higher education”—after all, a professoriate, any professoriate, is not exactly a democratic institution. And indeed one of the practices I criticized, and will be criticizing more in the future—the rise of “MOOCs,” or massive open online courses—lets people take courses from excellent professors, often star professors, for free. Which is pretty damned democratic. For another thing, I’m not really talking about the democratic failings of the present higher education system that harm undergraduates the most, excellently covered by others at The Nation and elsewhere: exploitative for-profit education; crushing student debt; the death grip an amoral class of professional administrators enjoys over governance prerogatives once enjoyed by more public-spirited faculty.
No, I’m focusing on what I know best: the world of professional intellectualism, whose attenuation makes for more subtle harms to the health of a democratic society—but, I’ll be arguing, may make for equally tragic harms in the end.
And so: on to some stories! I’ll reveal what the professors said later. For now, it’s adjuncts-a-go-go.
A young man writes:
I graduated from college in May of 2009. That previous fall, as I faced an uncertain economy, and since I had no idea what to do with a Bachelor’s in History, I followed the only constant piece of advice my professors gave me, and applied to MA programs in history.
So why would history professors encourage smart college history majors to apply for master’s degrees? Some might be myopic, not having given a thought to the class politics of modern education: getting a master’s worked for them back in the day, so why not now? Another reason is more cynical—that they have thought about the class politics: keep feeding the pipeline that produces more adjuncts, and you better preserve your own tenured privilege—research without teaching—by building and maintaining a reserve army of the under-employed. Or at least by not discouraging one from being built and maintained.
Anyway, he liked the master’s program!
Long story short: graduate school was amazing, stressful, made me doubt my every decision on a 24/7 basis for the better part of two years…but I don’t regret it. One of my friends summed up graduate school very succinctly—“it forever alters how your mind functions”—and I realize nearly every day that she was correct.
But he also
realized that I didn’t want to go back to school for another 6-8 years to get a PhD (I got a glimpse of the hiring process from the inside, and saw that it was basically arbitrary, and that the job market was chock-full of very talented people desperate for jobs, even ones with heavy teaching loads).
Note that, professors: from your side of the divide, your world might look like a meritocracy. From the students’ side, though, it looks “basically arbitrary.” Consider that: a smart guy who loved graduate school but can’t imagine continuing for the sheer hopelessness of the employment prospects. Why are they hopeless? Maybe because they can fill so many teaching spots with those ever-eager adjuncts…
Anyway, this fellow, realizing “I had no clue how to leverage an MA in history when applying for jobs” (profs: maybe that’s something you ought to think about, especially if you’re giving out “constant” advice to students to apply for MAs in your field), “bounced around a little bit. For the most part I’ve been working a seasonal job in the educational testing industry.” Then:
Last October, I essentially stumbled into two adjunct jobs. One was at Xavier University, which is a local Catholic school known for a student body a little on the affluent side. The other was at a two-year, open enrollment college operated by the University of Cincinnati, located a bit north of the city.
My job at Xavier was as follows: I played second fiddle to a full-time faculty member who had developed a new take on the standard 100-level European History survey course. Instead of relying solely upon lectures, students would do analytical exercises related to major themes in European history. For instance, students would read a chapter in their textbook about population trends in the late eighteenth century, and would then, in assigned groups, look at records of birth and death rates, diary entries, etc, and would then piece together graphs, written reports, or some other presentation that demonstrated that they not only had a basic understanding of how to read and make charts, graphs, etc., but that they could relay information in a cogent and practical manner. After all, students are not likely to write essays once they leave college and enter the workforce, but they might need to read graphs.
The class itself went well enough once students realized that we expected them to do more than just sit passively through lectures. We ran into the normal problems presented by a reliance upon group work—chronic absences, students not shouldering enough of the burden, etc—but overall we could see real progress being made. Moreover, the students were thankful that they had a chance to do more than just write essays. At the end of the semester, I met with the faculty of record and one of the deans of education to discuss this course. While I don’t want to sound bitter, I was astounded by how out of touch this dean sounded. There was some discussion about course content and assignments, but most of it was about “innovation”—as if we hadn’t completely redesigned a course already. He was more interested in social media and remote learning where students attend classes in groups independent of the classroom (I’ll admit that I am biased here. But when administrators talk about things that render faculty unnecessary, I get a little worried.) and a bunch of other “solutions” and “bold ideas” which didn’t really seem like solutions, but more like unimaginative corporate-speak.
Awesome, huh? These are the awesome folks who more and more are taking over the design and vision of the higher learning in America. (Tom Frank recently wrote brilliantly about this here.)
But let’s hear about working conditions, shall we? At this guy’s two-year college, filling in for a tenured full-timer off on a fabulous Fulbright (those aristocrats…),
I taught two classes on my own, and co-taught two others, and was thus in the classroom five days a week in some capacity. Aside from the stress of going back and forth between different campuses (and policies, cultures, etc), what I felt most was a feeling of impermanence. I felt it, and I also observed it in the words and actions of the other adjuncts I encountered at the two-year college. For them, every semester held the promise of work, but also the fear of a course load reduced to such an extent that their meager salary would barely cover the cost of driving to campus.
I labored under these conditions for only half of a year, but I cannot imagine how other members of the faculty, many of whom finished their degrees a decade or more ago—and who found out firsthand that the wave of retirements that was supposed to open up the job market didn’t exactly materialize as they had hoped, and didn’t lead to tenure-track jobs—and who have families to feed and a myriad of other issues/expenses, cope with this situation.
… No matter the rewards that result from teaching (and there were real rewards, such as when my students at the two-year college, several of whom mentioned that they were descended from coal miners, watched part of Germinal and reacted favorably to seeing the miners assert themselves) I could not shake the feeling that my job did not matter, that I did not matter, and that with every lecture I was merely whistling past the graveyard. After all, a semester is only so long. And when you work at a job where your position is tied largely to enrollment numbers which you cannot control, and where your performance is not the deciding factor in whether your contract is picked up for another term, it is hard to feel like anything you do actually matters.
Wow. Next time: “Please do not refer to me by name in anything you write, as I am on the job market yet again and I do not want anything to jeopardize my chances…”
Want to share your story? Reach out at email@example.com.
Student Nation talks about a new tuition plan, “Pay It Forward, Pay It Back.”
Former Chicago Mayor Richard M. Daley. (Reuters/Chris Wattie)
Last week, for the opening of the school year, I wrote about my interview with Tom Geoghegan about his (so-far) failed suit to stop Mayor Rahm Emanuel’s morally and educationally disastrous crusade to close fifty schools in Chicago. But that’s not all I talked about with Tom. “I don’t want this just to be a Mayor Emanuel–bashing session,” I said. “Because we have to bash Mayor Daley.”
Everyone, I suppose, dislikes parking meters. Chicagoans hate them even more. That’s because Mayor Richard M. Daley in 2008 struck a deal with the investment consortium Chicago Parking Meters LLC, or CPM, that included Morgan Stanley, Allianz Capital Partners and, yes, the Sovereign Wealth Fund of Abu Dhabi, to privatize our meters. The price of parking—and the intensity of enforcement—skyrocketed. The terms were negotiated in secret. City Council members got two days to study the billion-dollar, seventy-five-year contract before signing off on it. An early estimate from the Chicago inspector general was that the city had sold off its property for about half of what it was worth. Then an alderman said it was worth about four times what the city had been paid. Finally, in 2010, Forbes reported that in fact the city had been underpaid by a factor of ten.
Well, Chicagoans, Tom Geoghegan is here to tell you that the whole damn thing is illegal under the Illinois Constitution—and most other constitutions, too. He’s in the middle of a suit to have the whole thing torn up. The argument is driven by the legal theory that “a seventy-five-year agreement to run parking meters is an unconstitutional restriction on the police power—the sovereign right of the city to control its public streets and ways…. This is a very traditional, conservative, really, argument: what the City of Chicago did was not sell the meters. They sold the police power of the city.”
The deal, you see, is structured like this. Not only does CPM get the money its meters hoover up from the fine upstanding citizens of Chicago. It gets money even if the meters are not used. Each meter has been assigned a “fair market valuation.” If the City takes what is called a “reserve power adverse action”—that can mean anything from removing a meter because it impedes traffic flow, shutting down a street for a block party or discouraging traffic from coming into the city during rush hour—“CPM has the right to trigger an immediate payment for the entire loss of the meter’s fair market value over the entire life of the seventy-five-year agreement.”
Shut down one meter that the market valuation says makes fifteen bucks a day, in other words, and the City of Chicago has to fork over a check for $351,000—fifteen dollars times six days a week (why six days? more on that later), times fifty-two weeks in a year, times seventy-five years—within thirty days. Very easily, Geoghegan points out, a single shut-down of parking in a chunk of the city—say, for something like the NATO summit Chicago hosted last year—“could be more than the original purchase price of the deal.”
And if the city lowered the parking meter rates, the highest in the country? Same problem: that would trigger the “reserve power action” clause too. Chicago, meet your new City Council: the Sovereign Wealth Fund of Abu Dhabi.
And that’s just wrong—illegal, says Geoghegan: “You can’t bargain away—you can’t sell off—the police power of the city.”
He thundered: “This is privatization gone nuts. It’s almost a comical form of privatization—privatization at its very, very most toxic. Because here, what is being sold off is not really a city asset. It’s not really like Midway Airport”—a deal that might be just around the corner, the Chicago Sun-Times reports. “It’s not like a tollway”—the Chicago Skyway was leased for ninety-nine years to an Australian concern for quick cash in 2005 (“With that kind of money to be made,” The Washington Post said, “Americans are lining up to try their luck at Wall Street’s hottest new game—‘investing in infrastructure.’”). No, at least in those cases it was just property they were selling off. Here, “they’re selling off the governmental powers of a city. And that’s what’s so disturbing about this. And getting those back is insanely expensive. And the City said, in its brief in the court below, that if the deal were undone they would owe all this money and they can’t pay it back.”
Geoghegan cites the doctrine of in pari delicto—“something you learn your first year of law school”—to explain why this cannot stand. It means the contract is illegal, and thus not enforceable. He gives the example—“not that this would ever happen”—of Perlstein selling Geoghegan a gram of meth for $100. (Damn right it wouldn’t happen. My going rate is $200!) “The judge would say this isn’t a legal agreement and I’m not going to enforce it. I couldn’t get restitution. And if this agreement is unconstitutional”—he’s using the Illinois Constitution, but the principle is one of wide application—“CPM is in the same boat.”
I say that this raises an interesting point—that his suit sounds great for Mayor Rahm Emanuel, who happens to be dealing with a huge municipal budget deficit. So he must support the suit! Especially since the parking meter deal was inked by his predecessor, not by him.
“Well!” Geoghegan responds with his inimitably cheerful, bright irony. “He simultaneously badmouths the deal and defends it to the death.”
The badmouthing part: there was the mayor’s proud boast to have “renegotiated” the deal, supposedly to make parking cheaper by making it free on Sunday, which ended up extending the hours on all the other days, making everything just about a wash.
The defending part: he won’t say anything too bad about CPM, because that would discourage investors from buying up other chunks of the city—like the deal to lease the digital billboard concession along the Dan Ryan Expressway for twenty years, which reform alderman Bob Fioretti points out was about as much of a rip-off for the City as the parking meters, not least because no one knows what kind of technology for advertising will be in use twenty years from now.
Rahm Emanuel is a funny guy. He loves to simultaneously take credit for things and evade responsibility for them. In the school-closing case I wrote about last week, Geoghegan tried to sue the mayor as well as the Chicago School Board. The judge didn’t allow Emanuel to be included, buying the mayor’s office’s argument: “They said ‘we have nothing to do with the schools’!”
That got a huge laugh: everyone knows that Emanuel is always saying the schools are his personal responsibility; after all, he alone appoints the school board.
So Geoghegan’s suit attached an op-ed signed by Emanuel arguing he was personally responsible for the school closings. And a press release saying much the same thing. In court, though, the mayor’s office said they were just providing “input.” More laughter: “A paralegal in our office described the mayor’s position as, ‘I’m responsible, but I’m not legally responsible.’ ”
A funny guy—and it isn’t just Chicagoans who, bitterly, are being forced to laugh. Our parking meter mess might someday be yours if it isn’t already—yes, this means you, New York, as Matt Taibbi reports. As Tom Tresser of Chicago’s CivicLab impressed upon me, they’re coming after your meters—and bridges, and billboards, and who knows what other public assets—next. “We have a massive global movement of capital which, because they’ve burned their own fucking houses down through their own greed, don’t have the gilt returns that they’re used to receiving…. So the new guaranteed annual returns that big business and big capital are looking for is our assets.”
And that’s about all of us.
Rick Perlstein writes about the resurgent protest culture that is fighting back against Rahm Emanuel’s austerity agenda.
The news that a Heisman trophy winner, Johnny Manziel, will be suspended for one half of one game because he sold his autograph has revived an old debate: should college athletes in marquee sports be paid? Does it make any sense to think of them as “amateurs”? I must, alas, leave the technical discussion to better-informed interlocutors. All I have to offer is an anecdote. Maybe it will help people think some things through.
Our story begins in 1992, when your humble correspondent matriculated as an exceptionally neurotic graduate student at the football and, um, American Studies powerhouse the University of Michigan. Entirely unqualified and unprepared to teach, I was nonetheless thrown to the wolves as a freshman composition instructor. On the first day of class, I had gotten through the introductions, passed out the syllabus, and launched into explaining the course, when a strapping, tall, god-like young man walked into the room—very late. All other activity stopped. I couldn’t teach any more for the murmuring. A star of the University of Michigan football team would be practicing the farce of pretending he was a college student in my very own classroom. It turned out to be quite the education for me.
Because most of us first-year graduate students had no idea what we were doing in the classroom, we taught freshman composition using a pedagogy-less “workshop” method (and that’s a subject worth writing about another time). A designated student would bring in a draft of the paper to hand out to the class. The students would read it as homework, and the next class session would be taken up by their discussion of how it could be made better.
Early class sessions, however, were largely taken up by the kids being broken up into small groups and doing writing exercises out in the hall. But I had to stop doing that. All my students would end up doing once outside of my gaze was crowd around the Football Star and harass him for what Johnny Manziel was accused of selling: autographs.
Anyway, one fine fall day, it was Football Star’s turn to show up with his draft of an essay.
But he did not show up that day at all. (He only showed up about half the time, anyway.)
I obviously had to fail Football Star for that section of the class. If I could have done worse than fail him, I would have done that as well. Because I had no material to “teach,” the next class session had to be cancelled entirely.
Next chapter in the story: I get a call from the guy at the athletic department whose job it is to liaise with professors. I’m ashamed of what happened next. He laid on me a sob story about how Football Star came from a rough ghetto, grew up in a housing project, didn’t get any breaks, etc. That this was his one shot at a college education. Something to fall back on, in case this football thing didn’t work. And me, being a dopey neurotic weak-willed 23-year-old, packed to the gills with liberal guilt to boot, listened to him try to persuade me to give Football Star another chance—and of course I said yes.
He brought a draft of a paper to my office hours. To the moral credit of the University of Michigan athletic department, he did actually seem to have written the thing. That much was obvious: it was barely literate. I should have made a photocopy and kept it.
Football Star went on to help lead the Wolverines to the Big Ten championship that year, then the Rose Bowl, setting records at his position. What happened next? Thanks to Wikipedia, I was able to find out. The 1996 NFL draft was a quality one for his position. So he only went in the fifth round. He played in seven games in his first professional year—and only four in his second. He was traded away to another team before his third season—where he played in only two games. The next year he joined another team’s practice squad—then was released nineteen days later.
He lasted five weeks on a professional team in Europe. Then signed with a Canadian team. Things looked promising in the pre-season and early regular season games—then he got hurt, and was released in September. Two years later he tried a comeback with a team in Arena Football’s developmental league. The league was short-lived; so was the career within it of my former “student.” Then, after the section reading “Professional career” on his Wikipedia page, there is nothing—because what happened next apparently matters to no one. I checked Lexis-Nexis. Also nothing, of course; once his usefulness as a revenue-and/or-glory-generating unit had dried up, he was invisible to the world.
I wonder where he is now. Maybe he’s happy, productive, secure, thoughtful, wise—I hope so. Anyway I’d like to find him. And apologize to him. Apologize for not flunking him. Maybe, then, I could have actually taught him something in that class: you get to evade the rules everyone else has to follow only as long as you are a god. And you get to be a god only when you’re worth something to the people who are using you.
Dave Zirin decries the corrupt and exploitative practices of the NCAA.
Students at North Lawndale College Preparatory High School in Chicago walk through the school's hallways. The surrounding neighborhood has an unemployment rate almost triple the city average, and school officials estimate 5 percent to 8 percent of its 525 students are homeless at any one time during the school year. (AP Photo/Paul Beaty)
Today was the first day of school in Chicago—and a profound setback for Chicago’s forces of decency. Fifty fewer schools will be in operation this term, with 2,113 fewer staffers, a colossal injustice I’ve written about here and here and here and here. The school closings are going forward because ten days ago Federal District Judge John Z. Lee denied the attempt to get a preliminary injunction to prevent it. A week before that ruling, I spoke with one of the lawyers who brought the suit, Thomas Geoghegan, for my monthly interview series at Chicago’s Seminary Co-op Bookstore in Hyde Park—where I and my audience deepened our sense of just how mad and malign Mayor Rahm Emanuel’s schools agenda truly is.
You might know Geoghegan for his classic public-policy memoirs like Which Side Are You On? and his most recent, Were You Born on the Wrong Continent?: How the European Model Can Help You Get a Life; or his quixotic run to win the congressional seat vacated when Rahm Emanuel became Barack Obama’s chief of staff, which The Nation endorsed. Our conversation at the Co-op—a public version of dialogues we’ve been having regularly over dinner and drinks for over a decade now—was, like so much of Tom’s discourse, heartbreaking and inspiring in equal measure.
We spoke on August 10, the day after Judge Lee declined to certify Geoghegan’s plaintiffs as a class, a harbinger of the preliminary-injunction denial to come—heartbreaking, because his arguments sounded damned well open-and-shut to my audience and me. The Americans with Disabilities Act specifies quite clearly that school systems, when moving disabled children, have to proactively provide opportunities for the kids and their parents to meet with “Individual Education Plan” teams to devise specific measures to ease the transition. The Chicago school board didn’t even try—it just called up befuddled parents to ask, as Geoghegan put it, “Anything you want?” And when these parents—overwhelmingly poor and harried, understandably inexpert in the intricacies of special-education best practices—didn’t have anything specific to offer, the board considered its work done. One of Geoghegan’s expert witnesses, the woman in charge of special education for the Indianapolis school system, said the whole thing was pretty much totally nuts.
The suit also tried another angle. In 2003 Governor Rod Blagojevich (who actually did some good things) signed a state civil rights statute that allowed private plaintiffs to bring claims of disparate racial impact against entities like boards of education without having to prove discriminatory intent—a provision that used to be in federal law until the Supreme Court struck it down in the 1990s. Explained Geoghegan, 88 percent of the affected kids in the receiving schools are African-American, but African-American kids make up only 40.5 percent of students in the system. Pretty damned disparate.
Of course, the Chicago Public Schools had an explanation for that, of a sort: they argued that black kids were being helped by being moved. Actually, they made several arguments—changing them around each time the last was debunked. First it was that they needed to close schools to help with the system’s budget deficit, freeing up resources for instruction. But most of the money they claim to be saving (savings disputed in themselves) is being spent on moving kids, not instruction. And, Geoghegan points out, “After a year, that money goes into the general pot to aid kids in the system.” Look at the system’s plans, and it turns out “the board is going to use some of this money to build schools on the North Side”—the white North Side, in other words. It basically amounts to stealing from poor black kids to give to more affluent white ones.
The system’s second argument is that the schools that kids are being moved into are academically superior to the ones they’re leaving. Well, there is a word in legal jargon for what that claim represents in this case. That word is: bullshit. In their pleadings, Geoghegan’s team pointed out that only seven of the schools are arguably better academically than the ones kids are coming from; some are worse. In fact, the very act of moving kids under such circumstances basically cannot improve their educations. “What’s extraordinary about this is that the study of the Consortium on School Research at the University of Chicago stated that these school closings don’t do students any good but in the long term don’t do any harm,” Geoghegan told my bookstore audience. “The RAND study, which came out in 2012…says that they do do long-term harm, unless the children go to academically superior schools.”
You see the problem, even if a federal judge did not.
CPS’s third argument is yet more dubious. Between 2001 and 2012, leading up to this year’s closing, they closed some seventy-four schools. Back then, they said they were closing “failing schools.” But “now they’ve backed off from that notion of failing schools, which was always a little bit bogus to begin with because, Why are those failing more than any others? It was [empirically] indefensible.” (For instance, at one of the closed schools, Guggenheim, which I wrote about here, one-third of the students were homeless. Geoghegan relayed his suspicion to Chicago homelessness experts: maybe some kids counted as “homeless” were, say, doubling up at the home of an aunt. He heard back, “No! No! Those kids who are doubling up with the aunt aren’t counted as homeless. They’re, like, homeless homeless. Like, they don’t know where they’re going to be every night.” What does it mean to say a school serving a population like that, because of its poor test scores, is “failing”?)
So it was that they settled upon the argument that the closing schools were “underutilized.”
Which argument one of Tom’s heroic plaintiffs, a woman named Sharise McDaniel, had already demolished on its face. McDaniel is the parent of a child at a school called George Manierre Elementary, which is by the former Cabrini-Green housing projects. The advantages of Manierre for its black, impoverished population—it’s across the street from the Marshall Field Homes, where hundreds of the school’s kids live, making the commute rather safe indeed; also, the building is grand and gorgeous—and the disadvantages of moving them—the receiving school is a mile away, across a treacherous gang boundary—were brilliantly reported by the education reporter Linda Lutton for Chicago’s public radio station WBEZ. Manierre, however, with its gorgeous building, happens to be quite close to a bevy of luxurious condominiums where affluent white families live, and whose children go to overcrowded Lincoln Elementary.
McDaniel and her cadre of parents presented a solution at a community meeting. They learned that the supposedly strapped school board was paying hefty money to rent space for the Lincoln kids at DePaul University. The mothers proposed that, if Manierre was indeed underutilized, Lincoln kids could move into their second floor; Manierre kids could stay on the first floor, and—Geoghegan got a mixture of laughs and groans when he reported this one—“they would have separate entrances so they wouldn’t have to see each other!”
A win-win solution—if the point really was filling underutilized schools, and not, say, emptying out a desirable building of undesirable Chicagoans, the better for Rahm Emanuel to serve his affluent constituency.
So at the trial, Geoghegan asked the system’s number-two administrator, a mountebank named Tim Cawley, “‘Why not move the children from Lincoln Elementary into Manierre?’ I’m not going to quote his answer…but the effect of it was, ‘You don’t know how disruptive that is!’ ”
He earned a roar of laughter from our audience at that. Laughing to keep from crying.
The system denies that it’s placing such kids under physical risk. And yet it plans to spend $7 million a year on a “safe passage” system to protect them. Geoghegan now turns indignant: “The children are going under guard, through gang territory, another one or two or more miles to their new schools. For a worse education experience on all counts…there’s this trauma, not only of all this displacement, [but of] losing all your teachers because they’re all being laid off…. What’s the payoff for this? There is no payoff for it. And the board has no basis to believe these closings are doing any good for the children.”
And yet the judge ruled there was no proof kids “would suffer substantial harm as a result of the school closures.”
So what’s the inspiring part? The solidarity. Noted Geoghegan in our Q&A, “It’s interesting how many middle-class white parents have been radicalized by this. They didn’t start that way. But the more they deal with the board, the more they realize that, with the minority children on the South and West Sides, they’re fighting the same battle against a really dysfunctional bureaucracy which just does not work.”
Watch this space. I’ll be writing more about what happens next. It could get ugly—and interesting.
Rick Perlstein writes about the resurgent protest movement in Chicago that is fighting back against Rahm Emanuel’s austerity agenda.