Short takes on politics, culture and life.
Monica Lewinsky has penned a 4,000-word, somewhat irritating and rather selective article about her treatment by the public since the affair (see Rebecca Traister at The New Republic for a great critique). The faults of her essay aside, Lewinsky is not undeserving of sympathy. What happened to her—the gross and near-total sexual shaming; the reduction of her personhood to a blowjob, a cigar and a cum stain; the fake and opportunistic sympathy from the right; the nearly twenty-year-long tabloidization and ridicule of her every subsequent move—is irredeemably awful.
That said, Lewinsky lost me when she explained her rationale for reintroducing herself to society now—which she claims is the 2010 suicide of Rutgers gay freshman Tyler Clementi, who killed himself by jumping off the George Washington Bridge four days after his roommate briefly streamed him making out with another man in their dorm room. That tragedy, Lewinsky says, brought her to tears, and prompted her mother to relive the dark weeks in 1998 when she held a kind of suicide vigil over her daughter’s bed. Fair and tender and empathetic enough. Lewinsky also takes care to distinguish between Clementi’s plight and her own calamity, which was brought on in part by her “own poor choices.” Yes, that’s right.
But the continued juxtaposition of her ordeal with Clementi’s—as precipitated by a comparable atmosphere of “online humiliation” and “public shaming”—frankly rankles. So, too, does the subsequent self-attribution of altruistic motives. “I wished I could have had a chance to have spoken to Tyler about how my love life, my sex life, my most private moments, my most sensitive secrets, had been broadcast around the globe,” Lewinsky writes. “I wish I had been able to say to him that I knew a little of how it might have felt for him to be exposed before the world.”
This comparison, however well-intentioned, is narcissistic and inaccurate. While he lived, Clementi was in no way a public figure. He was not known globally, nationally, on campus or even particularly in his dorm. His “online harassment” lasted a total of four days, not decades. Contrary to popular belief, Clementi’s sexual encounter was neither recorded nor broadcast widely (for a more detailed account, see Ian Parker’s forensic analysis in The New Yorker and my own coverage here and here). A total of six students—Dharun Ravi, his friend Molly Wei and four others—saw only a few seconds of kissing. Ravi did tweet about the first encounter (“Roommate asked for the room till midnight. I went into molly’s room and turned on my webcam. I saw him making out with a dude. Yay”), and Clementi saw that tweet, prompting a formal request to change rooms. Ravi also tweeted that he would broadcast a second liaison, which Clementi thwarted by unplugging Ravi’s computer. At most, Ravi had about 150 Twitter followers, most of whom were high school friends, and there is no evidence that his two tweets circulated beyond a small circle.
Pace Lewinsky, Tyler Clementi was not the victim of a worldwide, social media–enabled public humiliation. He was the victim of rather common peer-to-peer bullying, and what role that bullying had in his decision to take his life, just four days after the first incident, is still a great mystery.
The public branding of Monica Lewinsky is not. It was an utterly predictable result of the alignment of sex, media and partisan politics. What drove her global humiliation was the fact that she had an affair with the president of the United States, that there was an independent prosecutor with broad subpoena powers to investigate that president, that every media outfit (right-wing and otherwise) had a financial reason to drive that scandal to page one and that hatchet hacks like Newt Gingrich, David Brock, Lucianne Goldberg, Matt Drudge and Linda Tripp stood to gain, politically and personally, from bringing Bill Clinton down. Moreover, very little of this had to do with online technology—which, as Lewinsky herself notes, was in its infancy in 1998 (pre-Twitter, Facebook, YouTube, TMZ, Gawker). It had everything to do with politics.
Suffice to say, none of these conditions applied to Tyler Clementi. I appreciate Lewinsky’s empathy for gay teenagers, and I wish her well in her quest to find some purpose in life beyond being “that woman.” But her equation of her ordeal with Clementi’s, and the suggestion that she has something meaningful to say about it, does a disservice both to the world-historically unprecedented clusterfuck that was the Clinton-Lewinsky scandal and to the all-too-quotidian challenges gay teenagers like Clementi face in digitally connected communities.
The truth is, Fred Phelps made a lot of us feel pretty great about ourselves. No, not the families of AIDS patients, gay men, American soldiers and celebrities whose funerals he relentlessly picketed along with a few dozen members of his Westboro Baptist Church. As journalist Donna Minkowitz chronicled almost twenty years ago in Poz magazine, “harassing bereaved families is Phelps’s specialty.” “I love to use words that send them off the edge emotionally,” he told her. “There’s nothing better than that.”
Phelps’s early victims included not just AIDS advocates like Nicholas Rango (1993) and Randy Shilts (1994) and gay men of modest fame like composer Kevin Oldham (1993), but also anonymous individuals like Kenneth Scott, a Topeka graduate student who died in 1992. Scott’s sister Sue Mee told Minkowitz, “When Kenny died, they came to the funeral with a sign that said ‘Fags=Death’ with a big smiley face,” and that years later, Scott’s family would still receive phone calls from random individuals asking, “Is this the house where fags live?”
Later, Phelps would broaden his ghoulish trolling to include the funerals of slain US soldiers, whom he reckoned God had killed as punishment for America’s sexual immorality. Those families, too, suffered real harm. I think the Supreme Court reached the right decision in Snyder v. Phelps, the 2011 case that concluded that the First Amendment protected even hateful public protesters from tort liability. But it is impossible to read the testimony of Albert Snyder, the father of Lance Corporal Matthew Snyder, whose funeral Phelps and family bombarded with signs that read “God hates you,” “Fag Troops” and “Thank God for dead soldiers,” and not be utterly devastated. “I want so badly to remember all the good stuff and so far, I remember the good stuff, but it always turns into the bad,” Snyder recounted to the court. The lone dissenter in that case, Justice Samuel Alito, reasoned that open and vigorous public debate need not allow “the brutalization of innocent victims.” One can disagree with Alito’s ultimate conclusion, as well as parse whether or not his sympathy for “innocent victims” would have extended to gay men with AIDS, but the fact of the brutalization is undeniable.
Likewise, Phelps attempted to drive from public life local politicians like Topeka city councilwoman Beth Mechler, whose confidential blood records Phelps published in 1992 in order to falsely suggest that she had AIDS, and district attorney Joan Hamilton, whose private e-mail to her husband acknowledging a one-night stand Phelps also published (see Minkowitz and Kerry Lauerman’s profile for Mother Jones). Mechler, a Republican for whom Phelps had once canvassed, lost her re-election bid; Hamilton won.
For all these people, Phelps was not just some Jerry Springer–ized cartoon. He was a very real menace. But for the rest of us, he was a useful bigot. The left sometimes deployed Phelps as a facile synecdoche for the perils of religious extremism, even though his church was largely composed of his immediate family and isolated from organized conservative movements. It’s probably impossible to quantify, but I bet Phelps was direct-mail gold.
For the right, he was the ideal foil. As People for the American Way fellow Peter Montgomery, who has studied the religious right for decades, put it to me, “Phelps allowed other anti-gay leaders to posture that he was the face of hatred, not them. But the substance of their message to gay people is similar: repent or be damned—it’s just that Phelps framed it as ‘God hates fags,’ while people like Bryan Fischer [of the American Family Association] say God loves them and wants them to abandon their demonic lifestyles.” In the end, Montgomery argues that other anti-gay religious leaders were nonetheless tainted by Phelps’s rhetoric: “His protests at military funerals may have accomplished the most in terms of making anti-gay bigotry seem more broadly anti-American.”
For the media, Phelps was irresistible bait. And on the occasion of his death, we should ask why that was. Especially in recent years, he possessed almost no followers, no influence, no allies. What distinguishes him from any other raving street-corner prophet is the simple-mindedness of his message. In the place of the modern religious emphasis on God’s love, Phelps ranted on about God’s hate—for fags, for America, for Muslims, for Catholics, for gun massacre victims and for US troops. If American exceptionalism is in some way an attempt to sacralize the profane (America is blessed, its soldiers and citizens blessed), Phelps merely reversed polarities, swapping in eternal damnation. It was a juvenile substitution. And to discuss Phelps as if he were a morally vexing and profound evil is to dignify him with a complexity he lacked. His hatred was banal.
In the end, he was only sound and fury. On his own merits, he accomplished nothing. He was a nobody.
I know devotees of Fox News may find this hard to believe, but December 29 was not the first time Melissa Harris-Perry discussed transracial adoption on her show. In that episode, Harris-Perry made an uncharacteristically infelicitous pairing. During a segment in which she asked comedians to provide captions for notable photos of the year, she displayed a portrait of Mitt Romney and his twenty-one grandchildren, including Kieran Romney, the African-American adopted child of Ben and Andelynne Romney. One of her guests quipped in sing-song, “One of these things is not like the other…” Some laughter ensued, as well as a maladroit attempt to turn Romney’s family photo into a symbol of the GOP’s racial dilemma. (Obvious disclosure: Harris-Perry is a Nation columnist; I have also appeared on her show on multiple occasions.)
Suffice to say, the segment has now gone down in the annals of partisan cable-news outrages, thanks largely to a barrage of posturing denunciations from right-wing voices Sean Hannity, Breitbart.com, Sarah Palin, TheBlaze.com and the like. Harris-Perry apologized on Twitter, and then again on the MSNBC website, and then again in an on-air monologue—not just to the Romney family but to all families created by transracial adoption. “I am deeply sorry,” she said, visibly distressed, “that we suggested that interracial families are in any way funny or deserving of ridicule.” Romney himself accepted her “heartfelt apology” this weekend on Fox News, saying “we hold no ill will whatsoever.” And that, at long last, should have been the end of it.
Except, of course, many conservatives are still milking it. On Monday, Ann Coulter called Harris-Perry an MSNBC “token” in a Fox News segment in which she also said that “unfortunately for liberals, there is no more racism in America. There is more cholera in America than there is racism.” On Tuesday, one conservative writer cast Harris-Perry as the modern-day incarnation of Bull Connor (yes, really!), because “skin color—whether using it as an excuse to enslave, segregate, lynch, divide, scorn, mock, get votes or get MSNBC ratings—is part and parcel of the American Left.” And Jason Riley, a black conservative Wall Street Journal editor and Fox News contributor, chimed in to accuse MSNBC of hiring “black mediocrities like Melissa Harris-Perry … simply to race-bait.”
It’s promulgating this view of racism as a problem fabricated by liberals and minorities for political gain—and not any umbrage taken by the Romney family or by families who have adopted across racial lines—that is, of course, the whole point of the conservative outrage machine. Plenty of smart analyses to this effect have been made, and I recommend reading Brittney Cooper at Salon, Ta-Nehisi Coates at The Atlantic and Jelani Cobb at The New Yorker in particular. As Coates put it, “The conservative movement does not believe that racism is an actual issue to be grappled with, but sees it instead as a hand grenade to be lobbed into an enemy camp.”
I will add to all this just one point. If any of Harris-Perry’s conservative critics really cared about the actual substance of her views on transracial adoption, they could have easily found, quoted and linked to the multiple segments she has done on the subject. To the best of my knowledge, not one of them has.
So here’s one telling example of how Harris-Perry has approached the issue. About one month after the verdict in the George Zimmerman trial came down, Harris-Perry convened a panel, inspired by a prompt from her white mother, that addressed how white parents might have “the talk” with their black children. The panel included Rachel Garlinghouse, a white mother of three adopted African-American children; Tracy Robinson-Wood, a professor at Northeastern who studies white mothers of non-white children; Rachel Noerdlinger, an African-American woman adopted by white parents; and Shanna Smith, a housing expert who is also the mother of two mixed-race children who identify as African-American.
The panel discussed the issue of racial literacy, so that white parents who love and seek to protect their African-American children from racism can provide informed guidance. They examined the effect that integrated or segregated institutions and open adoption have on their struggles—and they also talked about how to do black hair. There was not a single partisan note in the entire segment, and with the exception of an aside about Bill de Blasio and his son Dante, none of it was particularly political, if by political one means electoral politics. And, at a time when racial polarization was particularly high, Harris-Perry made the creative choice of broadening the conversation so that “the talk” wasn’t just about African-American children as victims and white vigilantes as perpetrators, but also about white parents as protectors and protagonists. It was, in short, exemplary of the kind of smart, sensitive, unique and brave programming that has deservedly won Harris-Perry a legion of fans.
I reached out to Garlinghouse, one of Harris-Perry’s guests and the author of Come Rain or Come Shine: A White Parent’s Guide to Adopting and Parenting Black Children, about her experience on the show. Here’s what she had to say:
The public is skeptical of transracial adoption, particularly, like in my case, a White parent who adopts Black kids. In the various radio shows and articles I’ve been a part of, the interview questions usually begin from a place of assumption and suspicion. Often I’m asked what my intentions are (in adopting Black kids), why we chose to adopt, what qualifies us to raise Black children and why we didn’t adopt White children. However, when we share parts of our family story, we can see some of the doubts melt away, and the questions turn toward subjects like how we help foster a positive racial identity in our children and how we care for their hair.
I felt like when I was on MHP, I was respected from the get-go. The pre-interviews were non-threatening. I didn’t feel that I need to prove myself and my worth as a transracial, adoptive mom to Melissa, her staff or the other panelists. Being on MHP was the highlight of my year, after, of course, the arrival of my third child and the publication of my book.
Does this sound like someone who has been race-baited to you? Of course not. But then, that’s not really the point of the conservative faux outrage. Harris-Perry’s critics would have a problem with this segment, too, simply because it began from the premise that racism still exists and that good people of all races have a moral imperative to confront it. What they lacked was not a motive, just an opening.
On a final note, as someone who has been a guest on Harris-Perry’s show, I’ll say that segments like this one aren’t just a testament to her talents and interests (although that too!), but to all the producers and researchers who work on her show, the most diverse in all of cable news by a mile. And it also reflects well on MSNBC that they’ve given her the space to pursue such difficult, under-covered subjects with such care and intelligence. I hope they continue to do so—without fear or reservation.
Call me a slackjaw, but I was just shocked to discover that Phil Robertson—the 67-year-old, shotgun-toting, Bible-quoting, self-described redneck star of A&E’s reality-TV show Duck Dynasty—does not particularly like us gay people. “It seems like, to me, a vagina—as a man—would be more desirable than a man’s anus,” he told GQ magazine, while also evoking the slippery slope of adultery, prostitution and bestiality. Well, chew on that. Until now, I thought Robertson was kinda like a Santa Daddy, and that when he wasn’t busy shooting birds out of the sky, adorable gender-neutral chipmunks nestled in his nicotine-stained beard, enjoying a lifelong contact high.
No, I’m just fucking with you. And so is everyone else who has embarrassed themselves in this political charade—from Papa Robertson, who was later revealed to have once called gay people “insolent, arrogant God-haters;” to GLAAD, whose very rapid response statement condemning Robertson and his “lies about an entire community [that] fly in the face of what true Christians believe” reads as both canny and canned; to the A&E executives who put Robertson “under hiatus from filming indefinitely” while continuing to broadcast a twenty-five-episode, Christmas Day Duck Dynasty rerun marathon; to the flotilla of conservative pols like Sarah Palin, Bobby Jindal and Ted Cruz who rushed to defend Robertson’s First Amendment rights to… have a reality-TV show??
Let us not dignify this incident by calling it a culture war. In a war, there are winners and losers. In this elaborate pantomime, everyone comes out ahead. GLAAD gets the e-mails of thousands of gay and liberal clicktivists and a justification for its continued, sorry existence. The Robertson family and A&E get an avalanche of free press ahead of the show’s fifth-season premiere, as well as a much-needed shot in the arm to the idea that any “reality” remains in the “guided reality” formula. Failed reality-TV star Sarah Palin, who now admits she has not actually read Robertson’s interview with GQ, briefly returns to cultural—if not political—relevance. And everyone gets to be outraged or shamed, which are just about the only two registers left for public recognition of any sort in the altogether joyless mass media.
But what has anyone actually learned? The deepening trough of reality TV has put before our prying eyes and smug, clucking nods a whole host of previously under-examined social ills—compulsive hoarders, fascist dance moms, bridezillas, the entire Kardashian clan—but on the actual lived context of homophobia and racism, it has almost nothing to say. That terrain was ceded a long time ago to starchy documentarians and to the comedian Sacha Baron Cohen, whose alter egos Borat and (less successfully) Bruno were perhaps the last fake/real figures to elicit demonstrations of casual, everyday prejudice in ways that were both revealing and thoughtful.
This is why A&E now has, if not an obligation, a golden opportunity. The network should return Robertson and Duck Dynasty to the air, but not in the adulterated, cartoon version that has conservatives so besotted (a Facebook page calling for a boycott against A&E has generated over 1.8 million likes). No, Duck Dynasty should get real. It should show Robertson being as homophobic as he pleases, in his home, his church, his community. The show’s editors have previously been criticized for asking Robertson not to say “Jesus” at the end of his prayers; they should now let him get his Jesus freak on, whether in words that invoke hellfire and damnation or those that profess to love the sinner but hate the sin. And, as long as the show’s producers “guide” reality along, they should film Robertson interacting with actual gay people. If the LGBT population of West Monroe, Louisiana, is camera-shy, they can fly some folks in for a good old duck hunt. I bet Mary Cheney is pretty handy with a shotgun.
As for gay activists, they should revisit their reflexive strategy of playing whack-a-mole every time some reality-TV star or pasta executive or former cast member of 30 Rock says something homophobic. After all, who is GLAAD to decide that the Christianity of Robertson’s Churches of Christ denomination is less true than the Christianity espoused by Unitarian Universalists? Or that of Pope Francis, who, Marxist-ish apostolic exhortations against unfettered capitalism aside, has yet to condemn a bill in Uganda, where Catholicism is the most popular religion, that would sentence those convicted of “aggravated homosexuality” to life in prison. The point is: a lot of people believe as Robertson does, or have yet to break faith with religions that formally espouse beliefs that approximate his. And until that is no longer the case, I’d rather see the whole picture—homophobia included—than a caricature of the cuddly duck hunter or pontiff. ‘We’re just not going to talk about it’ is not good enough.
Of course, driving offensive utterances underground can feel like a victory. But consider this: it is no longer acceptable to use the N-word in public, but that doesn’t mean a whole lot of folks don’t think it, or think media-savvy versions of it. To wit, the truly offensive part of Robertson’s interview was not his crude, anatomical account of human sexuality, but his insidious take on the experience of African-Americans in the pre–Civil Rights South. “I never, with my eyes, saw the mistreatment of any black person. Not once,” he said. “They’re singing and happy…. Pre-entitlement, pre-welfare, you say: Were they happy? They were godly; they were happy; no one was singing the blues.” This is straight from the racist textbook that justified slavery as one big happy plantation, and Robertson is off-script only in the sense that he hasn’t updated his rhetoric to the 2.0 colorblind version, one that acknowledges that while discrimination once existed, its time has passed.
But the past, as we all know, is not even past. Gay activists should remember that the next time they are tempted to conflate momentary outrage with enduring social change.
On December 19, independent journalist Rania Khalek posted an article at The Electronic Intifada titled “Does The Nation have a problem with Palestinians?” In it, Khalek criticizes The Nation for articles we published about the debate over BDS, the balance of Palestinian voices in that discussion and the lack of Palestinian, Arab and Muslim writers in our pages in general. “When it comes to Israel and Palestine,” Khalek writes, “The Nation habitually reinforces Israeli apartheid by privileging Jewish voices over Palestinian ones.” Shortly thereafter, Khalek, who has twice written for The Nation, tweeted:
— Rania Khalek (@RaniaKhalek) December 18, 2013
Khalek’s claims circulated widely, and they have left the impression that with the exception of Omar Barghouti, who Khalek notes was twice published by The Nation on the subject of BDS, we have failed to include Palestinian writers in our coverage of the Israeli occupation. I wish Khalek had taken the time to search our archives or contacted me or any other Nation editor before making these assertions. Had she done so, I would have directed her to at least fourteen articles on Palestine by ten different Palestinian or Palestinian-American writers that we have published since the beginning of 2008 alone. In addition to those by Barghouti, these include, in no particular order:
“The Rise of the Intifada Generation,” August 24, 2011.
“Palestinian Roads: Cementing Statehood, or Israeli Annexation?” April 30, 2010 (with Jesse Rosenfeld).
“Obama’s Blindspot on Israel,” June 16, 2008.
“Rethinking Israel-Palestine: Beyond Bantustans, Beyond Reservations,” March 21, 2013.
Mohammed Omer, Palestinian journalist and winner of the 2008 Martha Gellhorn Prize:
“A Report from Gaza Under Siege,” March 18, 2012.
“Truth and Consequences Under the Israeli Occupation,” July 31, 2008.
“Free Gaza—And Palestine,” July 17, 2009.
“Palestine’s Peaceful Struggle,” September 11, 2009.
Mouin Rabbani, independent journalist and fellow with the Institute for Palestine Studies:
“The Fatah-Hamas Accord,” May 12, 2011.
Plus three articles published before 2008.
“Palestine: Liberation Deferred,” May 8, 2008.
Plus one article on the Arab Spring and two articles written before 2008.
“To Live and Die in Gaza,” January 2, 2009.
Plus four other articles on subjects other than Israel and Palestine.
“The Strangulation of Gaza,” February 1, 2008.
Plus two articles written before 2008.
It is ironic that Khalek’s article, which purports to document the absence of Palestinian voices, has itself rendered so many Palestinian writers invisible. But contrary to her characterization, our archive in this regard is a rich and varied one. It includes contributions by some of the most prominent Palestinian activists, scholars and journalists in the world, including the founders of ISM and BDS. There are on-the-ground dispatches from the occupation of the West Bank and the siege of Gaza, analyses of US and Israeli policies and coverage of debates internal to Palestine and the Palestinian diaspora, a genre rarely seen in the US press. Moreover, our coverage of Israel and Palestine extends deep into our history and includes other Palestinian writers like Bashir Abu-Manneh and the late Edward Said. Our list of Arab and Muslim writers is, of course, much longer still.
On a final note, it seems that Khalek conflates coverage of BDS with coverage of the Israeli occupation of Palestine in general. This is a disservice all around. BDS is one particular strategy to end the occupation. While it emanates from within Palestinian civil society, it is directed at people of conscience around the world. Citizens of the United States, Israel’s staunchest ally, have a particular burden to wrestle with this call for solidarity, which is exactly what has happened at institutions like the Park Slope Food Co-Op and the American Studies Association.
Our forums on BDS took place in the midst of those discussions, and while they included the co-founder of the BDS movement, they were weighted towards an American audience—because those were the moral agents being asked to make a choice. Should they have included more Palestinian voices? That is a perfectly fair—and quite interesting—subject for discussion. But it is difficult to have that conversation when it comes wrapped in such a willful and facile distortion of our larger record.
Ok, I admit it: I’m tribal and narcissistic. Well, that’s not the only reason I voted for Christine Quinn for mayor. But that’s part of it. Here’s what it seems like to me: the city understands the value of having a woman mayor, a lesbian mayor, but then skipped right over actually having one, right back to the white guy!
How did Bill de Blasio win the Democratic mayoral primary in New York City? Many reasons, not least of which are that while he’s white and straight, his family isn’t. Plus, as Richard writes, a politician doesn’t represent each of those identities—gender, race, and sexual orientation—simply by pattern-matching his or her constituents. Representing LGBT New Yorkers, or women, means pushing policies that protect them—like paid sick leave, like an enforceable anti-profiling bill to curtail stop-and-frisk. And yes, admittedly, as public advocate, de Blasio got there first, and proudly. That is something to admire, and possibly something to vote for. But, on both those issues, the political process worked like it’s supposed to: constituents pushed, movements grew, celebrities called and, ultimately, the Speaker of the City Council changed her position.
And so this mayoral primary seems like a repeat of the 2008 presidential primary. Female candidate credentials herself for much longer, takes a more traditional path. The path she and all her consultants think is safer. Runs a general election campaign for years before she actually runs in the primary. Male candidate—whose previous job has not required so many compromises—kicks off campaigning later, is able to adapt to the present mood better. It’s not much of a defense of Quinn to say she was running a campaign for so long—that’s pretty calculating—and it was clearly, in this case, a miscalculation. But isn’t this yet another case of it being easier to be the “most progressive” candidate—having the luxury of staking out the most liberal positions—when you’re white and male?
Richard, anyone else, your turn to reply! —ED
* * *
Hi, Emily! I’m glad we’re doing this. I’m also glad you took the time to express an unease with the results that clearly many feminists and LGBT activists share. I can see how it all seems so unfair: New York has long been a tribal city, and getting one of your own elected to the throne has been a sign that your clan has arrived. Italians got Fiorello La Guardia; Jews got Abe Beame; African-Americans, Dinkins. And just when it seemed like it was Christine Quinn’s turn (a two-fer at that!), the rules changed. As Jim Ledbetter of Reuters points out, the city’s reliable pattern of identity politics broke down yesterday. Quinn lost the women’s vote to de Blasio (16 percent to 39 percent) and to Bill Thompson (!), who got 26 percent. De Blasio split the black vote with Thompson. John Liu didn’t get the Asian vote. And de Blasio even captured the city’s LGBT voters (47 percent vs. Quinn’s 34 percent).
Just when identity politics seemed poised to reward a female and LGBT candidate, voters threw identity politics out the window. Hey, wait a minute!
But I have to say, I’m glad that system is coming to a close, at least here in New York. A few reasons why: historically, over-determined identity politics facilitates the formation of political machines, and with that, corruption (see Tammany Hall). It can encourage mindless loyalty at the expense of ideas, including ideas about the rights of minority groups. One of the great things about de Blasio’s win is that it looks like African-American New Yorkers gravitated to his candidacy not just because of Dante’s legendary afro, but because de Blasio, much more so than Thompson, intends to end racial profiling and stop-and-frisk.
Which brings me to the main reason why I’m not overly upset about the fact that New York missed an opportunity to elect its first female and lesbian mayor. Our ideas are winning. All the candidates have great positions on LGBT rights, reproductive choice and other women’s issues. Quinn, of course, has a deeper record (although, as I argued in my column, a more complicated and muddled one than commonly presumed). But in broad brush, there’s not much to separate the pack here. Which is a good thing, because it means that there’s a consensus.
You raise the specter of Hillary Clinton, and I want to go there. But two related questions for you first: Do you think Quinn was the victim of meaningful sexism and/or homophobia in this race? And do you think Quinn took the positions she did because she felt she had to overcome her gender and sexual orientation? It’s hard to get in her head, of course, but I don’t see a lot of evidence for either claim. —RK
* * *
Richard, you articulate better than I did my sense of disappointment: “Just when identity politics seemed poised to reward a female and LGBT candidate, voters threw identity politics out the window. Hey, wait a minute!”
But, you ask, was Quinn the victim of meaningful sexism and homophobia on the way to her loss? Or did she lose fair and square, because New Yorkers rejected her positions? A recent New York magazine poll asked voters why she was sometimes described as “unlikable.” Twenty-three percent said it was because “she’s a Bloomberg hack”; only 10 percent thought homophobia was to blame; 6 percent, sexism. (Eight percent said it’s because she’s “too brash,” which might be another way of saying “because of sexism,” but still.) Potentially sexist or homophobic flashpoints from the campaign trail are few. Yes, The New York Times ran a feature on Quinn’s short temper, which it might not have done for a male politician (on the other hand, it ran a piece on de Blasio’s “plodding, contemplative” management style based on a thin anecdote from a conference call thirteen years ago). De Blasio’s wife, Chirlane McCray, suggested that Quinn wasn’t “accessible” for conversations about “concerns of women who have to take care of children at a young age or send them to school and after school.” That wasn’t a smear on Quinn’s own family, no, but did that signal to voters that a woman candidate without children doesn’t get what you care about—even if McCray didn’t mean it that way?
When Quinn realized she was failing to connect with voters, she tried to go vulnerable and personal, opening up about past struggles with eating disorders and alcohol abuse. Was she trying to force the Hillary Clinton crying-in-a-diner moment? For a city inflamed over out-of-control living costs, tired of its mayor genuflecting to Russian billionaires and coming to terms with the moral failing that is stop-and-frisk, it didn’t resonate. That, to me, is a gendered moment too: that’s Quinn realizing that her pushy, mouthy self isn’t working, trying out a more “feminine” public persona. Does it count if Quinn visited the sexism on herself?
The age of identity politics is coming to an end, you write. And maybe that’s for the good, what with its blunt categorizations of loyalty and affiliation, the focus on being rather than doing. But who wins primaries after identity politics is over? Are those blunt categories really so easily dissolved? When we stop thinking about how to push for fair representation across race, gender and sexual orientation, I think I know who’s winning the race. —ED (And, readers, please respond in the comments below! We’ll join you there.)
* * *
Emily, so I think it’s fair to say that, while we might have differences over emphasis, we both agree that sexism and/or homophobia did not play a determinative role in Quinn’s defeat. Indeed, it’s hard to imagine that a straight, white, male candidate with similar ties to Bloomberg and similar positions would have fared much better in this anti-incumbent climate, and it’s possible he would have done a lot worse.
I was thinking about your provocation—who wins primaries when diversity itself is no longer a predominant rationale?—when NYC council member Brad Lander called. He’s been reading our exchange (Hi Brad!) and wanted to point out that six LGBT council candidates won their elections yesterday. That’s a record! In addition to incumbents Jimmy Van Bramer and Danny Dromm from Queens and Rosie Méndez from the East Village/Lower East Side, Corey Johnson will take over Quinn’s west side seat. Brooklyn and the Bronx will both see their first LGBT council representatives in Carlos Menchaca and Ritchie Torres respectively.
What strikes me about this achievement is that with the exception of Quinn/Johnson (and arguably Méndez), these LGBT candidates didn’t run in gay enclaves, which means they had to win by appealing to non-gay (and some non-gay friendly) voters on issues besides gay rights. Torres, for example, described his district as “the Bible Belt of New York City,” and noted a “whisper campaign” against him. Nonetheless, he didn’t hide his sexuality; he wove it into a broadly progressive story he told about youth empowerment, housing, economic opportunity and sustainability. And it worked. The same can be said for the other LGBT candidates to varying degrees.
So to go back to your question: “who wins primaries after identity politics is over?” LGBT people do! Carlos Menchaca, an out Mexican-American, wins—in Sunset Park. Ritchie Torres, an out Puerto-Rican, wins—in Belmont, Bronx. In the purely tribal model of electoral politics, we’d be permanently relegated to one lonely representative from the West Village, waving that rainbow flag all by herself. It’s only because people voted across identity groups and on the issues that we’ll have a record number of LGBT folks in the next council.
You wrote that you voted for Quinn, in large part, because she was a lesbian like yourself. But there will never be enough lesbians in New York to elect a lesbian mayor if, as you seem to admit, that is the only or primary reason to vote for her. Holding our candidates to higher, broader standards isn’t just the right thing to do, it’s the only path to victory.
OK, that’s all for now. Next exchange, we’ll have to get into the subject of Hillary. Hmmmmm, maybe I’ll let you bite that apple first. —RK
Now I’m regretting that, in a fit of pique, I agreed to being narcissistic and tribal. Because really, I’m neither. I don’t want women to vote for women and LGBT people to vote for LGBT people and African Americans to vote for African-American candidates. I want everyone to vote for women and LGBT people and people of color! Which is why “identity politics” isn’t quite the right way to think about this. I don’t want a gay city council member elected by a gay enclave—I wanted a gay mayor, for all five boroughs. It’s more like “solidarity politics”—vote for someone who isn’t necessarily just like you, but who represents people who need a place at the table. And they need a place at the table, not just a white guy working on their behalf! Symbolism is irreducibly powerful, even if it isn’t everything. Among other things, having a female and gay mayor would force us to confront how we respond when women are in power, a lingering and well-known obstacle to women’s advancement in many fields. I can’t say it better than Erica Brazelton, who writes that before Obama’s election “we could only ask the question, ‘would a black president in America signal post-racialism?’ as a hypothetical. Obama’s presidency now gives us a resounding emblematic answer: Hell. No.” The news about the record number of LGBT city council members is heartening, but I’d like it best if their constituents weren’t voting for them in spite of their being gay but, at least in part, because of it.
Here’s what I will agree to: Quinn had the wrong stance on the election’s key issues, and voters rejected her for it. But here’s what I would like to see you acknowledge: Successful female candidates are less likely to be the hopey-changey underdog. Sometimes they will be: cf. Elizabeth Warren. But that’ll be rare. If getting more women into elected office sooner rather than later is a priority to you (only one of America’s big-city mayors is female, and only five of fifty governors), then sometimes the less-unimpeachable, more insider candidate might be the one we cast a ballot for.
At any rate, the obituaries for identity politics are premature. It’ll be over when a majority of white voters cast ballots for the black candidate, or a majority of men vote for a woman. Since you wanted to talk HRC, let’s remember that most men voted for Obama, and did so by at least 10 percentage points in 18 states (and in the general election, most white voters, including white women, voted for McCain).
As for there never being enough lesbians in New York, well, your imagination may fail you…! —ED
For the record, I don’t think you’re tribal and narcissistic either! But I do think it is revealing that both you and June Thomas at Slate copped to that before laying out what I think is the more compelling case, one you make here and now: that symbols matter. Maybe it’s because the power of that symbolism is so hard to quantify and so impossible to guarantee, especially in the case of Quinn, whose documented record is less than stellar. That said, I think it’s perfectly legitimate for you and June to have voted for Quinn based on a gut feeling about the lasting impact of her historic candidacy. I weighed that too; I just reached a different conclusion.
But let me address your claim that successful female politicians are less likely to run and win as hopey-changey underdogs, a mantle you seem to think men (straight, white ones? But what about Obama?) are more easily able to don. Here I think Hillary Clinton’s 2008 run (and possible 2016 run) over-determines your thinking. You mention Elizabeth Warren, but what about Patty Murray and Carol Moseley Braun, who both ousted male Democratic incumbents in 1992’s “Year of the Woman”? The insurgent energy, in the wake of multiple sexual harassment and assault scandals and the Clarence Thomas confirmation hearings, was certainly with female candidates then. And why couldn’t it be again?
There’s also the related assumption that female candidates are more likely to be insiders, to ride the safe track and to accumulate credentials and qualifications that can later be put, fairly or not, under a microscope. On one level, that’s an apt description of all candidates, because despite some outliers, our political system still rewards incumbents and dogged party loyalists who work the ladder. (And just on a side note, I’d put Bill de Blasio in that category; the man was on city council for three terms and then Public Advocate, which is first in line to succeed the mayor). Here again, I think Hillary’s 2008 run plays an oversized role. Was she more qualified than Obama? Sure, on some level. Was she the most qualified New York Democrat when she began her run for Senate in 2000? Not by a long shot; she wasn’t even a New York Democrat!
My point is: it cuts both ways. Historically, female politicians have had to expand notions of what counts as experience and qualification (a good thing!) or transcend them altogether. So it’s strange to see some feminists rest upon those values now, presumably because doing so would bolster Clinton’s (or Quinn’s) case. I think that’s short-sighted and also entrenches a political value system that more often than not works to exclude women and minorities. —RK
All right, we’ve had our fun (and it has been fun!), it’s summing up time!
So, I am hung up on HRC’s run, but I think fairly so: the Patty Murray and Carol Moseley Braun examples are from when I was in fifth grade! The outsider strategy for women can work, sure, but your own examples suggest that’s in limited cases. And yes, hopey-changey Candidate Obama was not white, but he was male. That’s not to say he wasn’t targeted by other race-based insinuations that he wasn’t up to the job (something the Clinton campaign wasn’t above suggesting itself, to my enduring shame and disappointment). He was. But being the less-experienced-but-still-believable candidate—all the more appealing as a blank slate!—is something that men, in my observation, have an easier time selling.
But let’s see if that holds true in 2016. What I’d like readers to take away from this is a willingness to weigh a candidate’s public positions against the realities and exigencies of the path they’ve taken in the course of seeking higher office. And once there are representative numbers of women, people of color, and LGBT people in office, we can focus on the issues alone.
’Til next time! And thanks to everyone who weighed in in the comments, on Twitter, etc.! —ED
New York mayoral candidate Anthony Weiner and his wife Huma Abedin attend a news conference in New York, July 23, 2013. (Reuters/Eric Thayer)
This is the true story of two men who, unfortunately for both of them, share the same body. Exhibit 1: Anthony Weiner—married politician, disgraced former congressman, mayoral candidate, public asshole. Exhibit 2: Carlos Danger—married man, habitual sexter, private dick. Let us now consider their respective “sins.”
As a congressman, Anthony Weiner was a spectacularly ineffective buffoon. A recent New York Times review of his tenure in the House paints a devastating portrait: Weiner was megalomaniacal, narcissistic, bad at navigating the political ropes, alienating to potential allies, alarmingly uninterested in making actual change and really, really mean to his staff (one former aide likened him to Miranda Priestly in The Devil Wears Prada). In his twelve-plus years in Congress, the Times notes, Weiner “sponsored and wrote only one bill that he steered to enactment: a measure pushed by a family friend who gave his campaigns tens of thousands of dollars in donations.” When Democrats controlled Congress in 2007–08, Weiner introduced fifty bills but didn’t get so much as a cosponsor on thirty-nine of them. According to a former staffer quoted by the Times, “He just never tried…. The point was to be able to say he introduced a bill.” Yes, Weiner squawked loudly about single-payer healthcare, but his interest in it always smacked of grandstanding opportunism. When the ACA finally came up for a vote, Weiner threatened to hold it up in committee unless he could introduce a single-payer amendment—a noble cause, which he bartered away not for any policy concessions but for a primetime speaking slot and a letter of recommendation from Nancy Pelosi.
And that’s just his record on domestic issues. When it comes to foreign policy, Weiner can give Liz Cheney a run for her money. He was one of eighty-two Democrats who voted for the Iraq War, and he continued to support its funding long after its flimsy rationale had evaporated. But it’s on the subject of Israel/Palestine that Weiner truly distinguishes himself. At least twice in public, he has insisted that the West Bank is not occupied, denying a fact that even right-wing former Israeli Prime Minister and settler movement architect Ariel Sharon admitted a decade ago (for a complete account and video of Weiner’s statements, see Ali Gharib’s report for The Daily Beast). He has also claimed that there is no Israeli military presence in the West Bank, which would be news to the IDF soldiers stationed there. As Stephen Zunes points out, Weiner’s denial fits squarely in a disturbing pattern of squelching criticism of the Israeli state and of making patently untrue statements to do so, including claiming that the United States recognizes the PLO as a “terrorist organization” twenty years after it ceased using that designation. These statements reveal Weiner to be a knowing liar or a willfully ignorant ideologue, either of which should disqualify him from holding public office.
Now let’s turn to Carlos Danger, Weiner’s online pseudonym and alter-id. Carlos Danger sexts—a lot. He is overly fond of his penis. He takes pictures of it and sends them to strange women he has never met. He has banal sexual fantasies that involve high heels, pulling hair, holding down wrists and a lot of wetness all around. He does not, apparently, understand the Internet very well, even though he spends a lot of time on it and even though the hackneyed conventions of Internet porn appear to be his Kama Sutra. His prolific sexting took place before, during and after his wife’s pregnancy. He was caught doing it, denied it, admitted it, promised not to do it again, did it again and admitted it—again. All of this makes him stupid, boorish, adolescent and deceitful about sex, but no more so than the millions of men (and some women) who do the same thing. It might also make him a bad husband, but only his wife Huma Abedin can be the judge of that, and it is really none of our business what she thinks, what arrangement they do or do not have and whether or not she should leave him. Nothing Carlos Danger did was illegal or coercive, and, it should be pointed out, none of it actually involved physical contact. His behavior and his marriage are entirely unworthy of public concern.
Unless, of course, Carlos Danger has the bad luck to also be Anthony Weiner, politician and candidate, in which case his lies about sex are a sign of a more intrinsic untrustworthiness, his indiscretions a sign of fundamental bad judgment and his repeated transgressions the source of an irretrievably broken public trust. On one level, this is all true. And of course, Weiner himself courted this public condemnation—not by the original sin of sexting but by having the balls to trot out his bogus rehabilitation as something that qualifies him to be mayor of New York.
But let’s just run a counterfactual. What if Anthony Weiner had never denied sexting, had never lied about it and had never promised to stop doing it? What if he had simply asserted that what he does with his penis and cellphone are between him, his wife and his online companions? No American politician has really had the courage to make such a forthright sexual declaration, and so we don’t know the answer to a question that will only become more pressing with time: should the mere existence of an X-rated selfie disqualify one from public office? I suspect it would, because American culture still has a vicious puritanical streak, because we confuse sexlessness and monogamy with public virtue and dignity, because we don’t really have any idea what privacy means. So yes, of course Anthony Weiner lied about Carlos Danger—of course he did. The script had already been written, and underneath the bad calls and broken promises that make up this farce of a morality play there is another lesson: if you have sex outside the box, don’t even think about public service.
Yes, Anthony Weiner is a weasel, a liar, a moron and a ridiculous egomaniac. Yes, he is unfit to hold public office. It doesn’t take a picture of his junk to prove that.
At least fourteen people died in the chemical plant explosion in West, Texas, just a few of the 4,500-plus Americans killed each year on the job. (AP Photo/LM Otero.)
Ask yourself this: Do you know the name of any one of the victims killed in the West Chemical and Fertilizer Company disaster? Do you know how many of them there were? Their ages, aspirations, what they looked like, whether they left behind children or what messages they last posted on Facebook? Do you know if there is an explanation yet for what caused the explosion? Or if investigators are still searching for one?
You probably don’t know the answer to any of these questions, and I didn’t either until I started writing this article. I didn’t know that as of Sunday, April 21, four days after the explosion, officials had confirmed fourteen deaths, eleven of them first responders, and that as many as sixty people remained missing. I didn’t know the name Jerry Chapman, 25, who volunteered with the Abbot Fire Company and who, according to his girlfriend Gina Rodriguez, was training to be an EMT. I didn’t know the name Cody Dragoo, 50, who was both an employee of the fertilizer plant and a West firefighter (the town has an all-volunteer force). And I had never heard of West firefighter Morris Bridges, 41, who lived just a few doors down from the facility and whose 18-year-old son Brent Bridges stood on the porch as the blast that killed his father blew out the windows of their home.
I do know, however, the names and faces of Sean Collier, Krystle Campbell, Martin Richard and Lu Lingzi. I know that Sean, 26, had been on the MIT police force for a little more than a year when he was allegedly shot by Tamerlan and Dzhokhar Tsarnaev, that Lu was a Chinese national studying at Boston University, that Krystle was a regular Boston Marathon watcher and that Martin was just 8 years old and had recently made a sign that read No More Hurting People Peace. I’ve seen the photo of him holding, with obvious pride and joy, those words drawn on a sheet of blue construction paper more than a dozen times. I can’t get away from it on Facebook, and when it shows up on my feed, I can’t look away.
What separates these victims from one another? Surely not innocence, for they are all innocent, and they all deserve to be mourned. And yet the blunt and awful truth is that, as a nation, we pay orders of magnitude more attention to the victims of terrorism than we do to the over 4,500 Americans killed each year while on the job. As former Secretary of Labor Hilda Solis once put it, “Every day in America, thirteen people go to work and never come home.” Very little is ever said in public about the vast majority of these violent and unnecessary deaths. And even when a spectacular tragedy manages to capture our collective attention—as the West explosion briefly did, as the Upper Big Branch Mine disaster did three years before—it is inconceivable that such an event would be constituted as a permanent emergency of world-historic proportions.
Let’s imagine that it were, as many have rushed to suggest that the Boston Marathon bombing ought to be. Let’s imagine that instead of sending a handful of investigators from the ATF and the Chemical Safety Board to West, Texas, we marshaled every local, state and federal resource available to discover the exact sequence of events that led to the explosion. Let’s imagine that the question—Why?—became so urgent that the nation simply could not rest until it had overdetermined the answers. We’d discover that OSHA hadn’t inspected the plant in twenty-eight years—did this play a role in the disaster? If it’s found that the company that owns the plant, Adair Grain, violated safety regulations, as it had last year at another facility, we might call it criminal negligence and attribute culpability. But would we ascribe ideology? And which ideology would we indict? Deregulation? Austerity? Capitalism? Would we write headlines that say Officials Seek Motive in Texas Fertilizer Explosion? And could we name “profit” as that motive in the same way that we might name, say, “Islam” as the motive for terrorism? Would we arrest the plant’s owners, deny them their Miranda rights and seek to try them in an extra-legal tribunal outside the Constitution, as Senator Lindsey Graham has suggested we treat US citizen Dzhokhar Tsarnaev? Would we call for a ban on the production of ammonium nitrate and anhydrous ammonia? Would we say that “gaps and loopholes” in our nation’s agricultural policies were responsible for the tragedy, as Senator Chuck Grassley has suggested about immigration in the Boston bombing case?
No, we won’t. We won’t do any of these things, because even if the West fertilizer plant disaster is ultimately understood as something more than “just an accident,” it will still be taken as the presumed cost of living in a modern, industrialized economy.
When it comes to terrorism, we have the opposite response. We launch wars against other countries, denude the Constitution and create massive state bureaucracies for espionage, covert operations and assassinations. Since 9/11, it’s become a political imperative that our nation must express zero tolerance for terrorism, even though, like workplace fatalities, terrorism has been with us since long before globalization lent it a more exotic and threatening provenance.
To the problem of violence, there ought to be a path between callous indifference and total social warfare. And that’s why the miserable and absolute failure of gun control legislation in the Senate—just two days after the Boston bombing and on the same day as the West explosion—was especially galling. Like acts of terrorism, the murderous rampage at Sandy Hook Elementary School precipitated a national crisis. In the wake of that tragedy, our collective grief took a particular shape, the shape of democracy. The deaths of those school children were linked to the fate of more than 30,000 victims of gun violence each year, and the impulse to act was channeled through our democratic system, where an overwhelming majority of Americans and a majority of the US Senate expressed support for new gun laws, which were nonetheless defeated.
Last night, a Fox News anchor cited a poll that claimed that just 4 percent of Americans think gun control is “the most important problem facing the country today.” Implicit in his commentary is the idea that because gun violence isn’t seen as the singularly most urgent issue, it isn’t an issue at all, that as workplace fatalities are to a modern economy, so gun violence is to the Second Amendment—just a cost we should get used to.
So America, here’s your scorecard for the week of April 15, 2013: callous indifference: 2, total warfare: 1.
Walmart shoppers on Black Friday. (CC 2.0.)
For the past few days, one of the most popular stories on the New York Times website has been Graham Hill’s op-ed “Living With Less. A Lot Less.” In a majestic display of guileless narcissism, Hill, an Internet multimillionaire, congratulates himself for downsizing his life and getting rid of all the stuff—the homes and cars and gadgets and sectional sofas and $300 sunglasses—he accumulated over the past decade. Now he lives in a 420-square-foot studio and has only six dress shirts and “10 shallow bowls” that he uses “for salads and main dishes.” Imagine that. Eating off the same plate. Twice. In one meal.
There are too many phrases to mock (“Olga, an Andorran beauty”; “My space is small. My life is big.”), and the Internet has already done a great job pointing out how obnoxious it is for a multimillionaire to hold himself up as a model of moderation when so many Americans are being forcibly downsized from already cramped lives.
But let me make one serious point—because scrubbed of its irritating tone, Hill’s cautionary tale is a familiar one that both the left and right like to tell to slightly different effects. It’s the moralizing force, for example, behind the appeal of Annie Leonard’s viral video “The Story of Stuff,” A&E’s hit show Hoarders, Lauren Greenfield’s documentary The Queen of Versailles and Glenn Beck’s theory of the 2008 economic crash: Americans are spending more and more of their money on stuff! Oh no! The left attributes this trend to advertising and corporate consumerism; the right blames individual choices and cultural decline. But either way, it is taken as gospel that Americans are spending increasingly untenable amounts of money on stuff and this is what’s making us (as households and as a nation) both bankrupt and unhappy.
This may feel true, but the economic data from the past half century tell a different story. As Elizabeth Warren and Amelia Warren Tyagi persuasively document in The Two Income Trap, Americans are not going broke buying clothes, books, music, furniture, cars, appliances and other consumer goods. Rampant consumer spending is not the source of their increasingly precarious lives. They call this mistaken narrative “the myth of overspending.” In fact, the share of income we spend in those categories has dramatically declined. For example, in 1949, the average American household spent 11.7 percent of its annual budget on clothes; today it spends just 3.6 percent. By the early 2000s, when Warren and Tyagi wrote their book, American households were spending 44 percent less on major appliances, 30 percent less on furniture and 20 percent less per car than they did just a generation ago in the late 1970s.
Or let’s take a smaller window, between 1999 and 2010, the years in which Hill acquired and unacquired all his stuff. According to the Bureau of Labor Statistics’ Consumer Expenditure Studies, in 1999, the average household spent $1,743 on clothes, $1,499 on furniture, $3,305 on new or used cars and $159 on reading materials (sad face). In 2010, spending in each of these categories declined, in raw dollars, to $1,700; $1,467; $2,588 and $100 (super-sad face) respectively. As a percentage of the average household budget, Americans spent less on food, clothing, cars, entertainment, furniture and reading materials in 2010 than they did in 1999. The portion we spent on housing remained practically the same in those years (34 percent).
So where did the money go? Two words: education and healthcare. The share of the average household budget devoted to education grew by 22 percent between 1999 and 2010; for healthcare, by almost 17 percent. (Warren and Tyagi also document how housing costs have skyrocketed over the longer term, arguing that this spike is pegged to the search for decent education.)
Why does any of this matter? After all, Graham Hill wasn’t auditioning for Jack Lew’s job. He was telling his own saga of sudden wealth and subsequent enthrallment with and recovery from materialism as a vaguely ecological, New Age–y fable. But his is an atypical, 1-percent story, despite his insistence that “members of every socioeconomic bracket can and do deluge themselves with products.”
In reality, the incomes of most Americans haven’t soared like Hill’s; they’ve stagnated. And within those depressingly static budget lines, most Americans don’t spend more and more of their money on stuff; they can’t afford to. Quite the opposite, they have to spend it on school and doctor’s visits.
What has happened is that stuff has gotten cheaper—a lot cheaper—which enables people to buy as much (or more) as they used to while spending less. But the exploitative labor practices, giant retail chains and lax environmental standards that have driven the cost of goods (and wages!) down so rapidly merit but a few words in Hill’s entire op-ed. Apparently, people like to hear a lot about how they spend too much and not about how they actually spend too little on the goods that they do buy. Which is all to say that if they were truly concerned about the undeniably disproportionate amount of global resources the United States consumes, as well as the happiness of the American middle class, Hill and The New York Times would be better off lecturing Washington about pursuing fair labor practices, tougher regulations and socializing medicine and education than hectoring people for spending too much on stuff—which they do less of anyway.
The instant beatification of former New York City Mayor Ed Koch has a lot of folks itching to do some grave-dancing. Leftists will denounce Koch because he was one of the original neoliberal mayors, ushering in a regime of gentrification and finance-driven inequality that defines the city to this day. Minorities regard him with suspicion because he marginalized the city’s black and Hispanic leadership and inflamed racial fault lines to corner the white vote, presaging the Sister Souljah moments that would come to afflict the national Democratic Party. And yet even there, among the new Democrats, Koch was never a stalwart, breaking with the party to endorse George W. Bush for president in 2004 and flirting with the neocons over Israel late in his life.
All that said, there is a special place reserved for Koch in gay hell—because he was mayor during the onset of the AIDS epidemic, which he is widely seen as failing to do enough about, and because it’s commonly assumed that Koch was a closeted gay man. “I hope he’s burning next to Roy Cohn”—or sentiments quite like it—have appeared frequently on my Facebook feed, especially from vets of ACT UP.
You’ll find very little of this criticism reflected in the New York Times obituary, which thinly and inconclusively grapples with Koch’s political legacy only after fawningly and provincially portraying Koch as a real-life, benignly obnoxious, wacky Grandpa Munster. Sure, not everyone liked his politics, but Hizzoner sure was zesty! The first version of the Times obit, in fact, mentioned AIDS only in passing, alongside the “scandals and the scourges” of crack cocaine and homelessness, and it was only after the Twitterverse flogged the Gray Lady that new paragraphs on AIDS were added.
The gay brief against Koch comes in two stripes. The first is that he should have been out and that had there been an openly gay political leader of national stature urging action on AIDS, the course of the epidemic might have been very different; countless lives could have been saved. I find this counterfactual an exercise in magical thinking and ultimately unfalsifiable and unhelpful. It’s not clear that an out gay man could have been elected mayor in 1977 in the first place, especially given the “Vote for Cuomo, Not the Homo” signs that current New York Governor Andrew Cuomo is accused of orchestrating on behalf of his father in that primary, or that an out or outed gay mayor would have won re-election in 1981 or 1985. It’s also not hard to envision a scenario in which an out or outed gay mayor would have been driven from office by scandal, perhaps only adding to the shame and ostracism the gay community faced then. The point is, we just don’t know, and there are simply too many variables to plot out what kind of impact an openly gay elected official like Harvey Milk—a figure Koch fell short of on many levels—would have had on the epidemic.
What we do know, or can usefully conjecture, forms the basis of a more sober if no less damning indictment of Koch—which is that the particular way in which Koch was closeted shaped his halting, seemingly indifferent reaction to the epidemic. Unlike the coy posture he adopted later in life, Koch didn’t just refuse to answer questions about his sexuality during his years in office. He aggressively—if unsuccessfully—attempted to eliminate any whiff of homosexuality from his profile. If Kirby Dick’s documentary Outrage is to be believed, Koch had a long-term relationship with a man named Dick Nathan, but broke it off before his first mayoral race (this account comes from David Rothenberg, whom Koch appointed NYC’s human rights commissioner). Nathan moved to Los Angeles (where he died of AIDS in the ’90s) and was conspicuously replaced by Bess Myerson, the first Jewish Miss America, as Koch’s beard. Koch also proclaimed himself a heterosexual in a 1989 radio show when he was running against David Dinkins, and generally took pains to distance himself from New York’s gay community.
Reading Randy Shilts’s account in And the Band Played On, it’s impossible not to conclude that Koch’s personal paranoia came to determine his policy response to AIDS. According to Shilts, Koch “warmly embraced requests that cost the city nothing,” but routinely rejected any requests—for housing for people with AIDS, for a health center in Greenwich Village, for hospice space—that came with a price tag. Koch, Shilts writes, wanted to avoid the perception that gays would get “special treatment” in his administration. The result is that “for the next two years, AIDS policy in New York would be little more than a laundry list of unmet challenges, unheeded pleas, and programs not undertaken.” “All the ingredients for a successful battle against the epidemic existed in New York City,” concludes Shilts, “except for one: leadership.”
As David France, the director of the Oscar-nominated documentary How to Survive a Plague, points out, by January 1984, New York City under Koch’s leadership had spent a total of just $24,500 on AIDS (full disclosure: the producer of HTSAP is my partner, Howard Gertler). That same year, San Francisco, a city one-tenth the size of New York, spent $4.3 million, a figure that grew to over $10 million annually by 1987.
The mayor of San Francisco during those years was Dianne Feinstein, who like Koch was no radical. She came from the centrist coalition that included Dan White, the city supervisor who murdered Supervisor Harvey Milk and Mayor George Moscone, whose office Feinstein assumed in the wake. Like Koch, she had a troubled relationship with the gay community (she infamously vetoed a domestic partnership bill in 1983). And like Koch, she was, above all, a political opportunist with national ambitions who happened to live in a liberal city with a large, politically active gay population. But she was straight, and—paradoxically—that made a difference in how those two cities treated people with AIDS in those formative years.
Ed Koch might not have been in a position to accelerate antiretroviral drug development or slow the transmission of HIV on a national scale, but he definitely could have made the lives of thousands of people with AIDS in New York City a whole lot more humane, which might also have extended some of those lives until an effective treatment was available. That he has blood on his hands seems likely. That he is guilty of the curious combination of paranoia, myopia, self-interest and callousness that so often attaches to closeted public officials seems undeniable. Would the fight against AIDS have been helped had Ed Koch come out of the closet? Possibly. But it definitely would have been better had he just been straight.
God bless his surely weary soul. I won’t.