The illogic of the GOP’s obsession with Benghazi—and with Hillary Clinton’s role in the tragedy—has been documented ad nauseam, and suffice it to say that five investigations, fifty briefings and 25,000 pages of documents have found absolutely no intentional wrongdoing by anyone in the Obama administration, including former Secretary Clinton. In fact, the only involvement Clinton had with Benghazi took place before the attack, when she strongly objected to the Republican Congress’s cutting of the diplomatic security budget. Over her objections, they cut it by $500 million anyway, which left our people overseas needlessly exposed. That’s cause and effect, with Issa and Co.—and not Secretary Clinton—playing the role of cause. Afterward, Clinton implemented every single recommendation made by the bipartisan Pickering Commission, which investigated the attack.
Strunk and White teach writers to use a thesaurus sparingly, if at all, but I’m afraid I have to agree with Michael Tomasky, who at The Daily Beast calls the latest Benghazi inquest “bullsh*t,” as well as “absurd, insane, sickening, repulsive, shameful, and at the same time shame-less.”
Republicans, never ones for examining historical context, seem to have forgotten the lapses in military leadership under sainted President Reagan that led directly to the 1983 barracks bombing in Beirut. Among other failings, the report of the Long Commission, which investigated the attack, determined that the United States failed “to take the security measures necessary to preclude the catastrophic loss of life.” Following the tragedy, Reagan accepted responsibility and in February 1984 began pulling Marines out of Lebanon; nine months later, he carried forty-nine states on his way to re-election. Responsibility? Withdrawal? Today’s GOP would surely have blasted Reagan during the primaries for going “soft on terror.”
But the real Benghazi scandal is the one that Republicans and Democrats don’t want to examine, the trillion-dollar question at the heart of the matter. This is an opportunity to look squarely at our stance on national security, to assess whether we’re even in the ballpark of best serving American security interests abroad (and at home). Nation Contributing Editor Robert Scheer, for example, has called Benghazi a “monster of our own creation,” noting that “as with [9/11], the perps turned out to be people the US secret agencies had once trusted. The enemy here was not Al Qaeda, but rather a homegrown menace empowered by foreign intervention.”
Time and time again, Scheer explains, we insert ourselves into international conflicts, secretly arm and empower a faction we deem to be “freedom fighters,” then suffer the consequences when those we’ve pushed to achieve self-determination do indeed get there. Meanwhile, because our intelligence agencies eschew any semblance of transparency, we have no informed debate on whether any of this was a good idea in the first place. And the cycle, free from oversight, repeats itself, with American citizens even less safe and even more in the dark.
We need to put our Manichaean “soft on terror/tough on terror” mindset behind us and instead seek to become smart on terror. And this cannot be achieved via the secret dealings of special operations forces deployed in the newly proposed global Counterterrorism Partnerships Fund or the CIA and NSA.
President Obama seems to get it, at least on an intellectual level: “When we cannot explain our [counterterrorism] efforts clearly and publicly, we face terrorist propaganda and international suspicion,” he told the graduating cadets at West Point on Wednesday, “…we erode legitimacy with our partners and our people, and we reduce accountability in our own government.” Nevertheless, such rhetoric hasn’t stopped him from expanding secretive counterterrorism operations with Special Forces into four African states. Historian Nick Turse tells The Nation’s Zoë Carpenter that the US Africa Command is busy converting itself from “a more congenial combatant command to an actual war-fighting combatant command.”
We might ask ourselves—instead of taking it on reflexive faith, as we’ve been doing for the last thirteen years—whether or not the “war on terror” is actually doing any good. Absolutely not, The Independent’s Patrick Cockburn has determined. “The ‘war on terror’ has failed,” he writes, explaining a corollary to Scheer’s point, “because it did not target the jihadi movement as a whole and, above all, was not aimed at Saudi Arabia and Pakistan, the two countries that had fostered jihadism as a creed and a movement. The US did not do so because they were important American allies whom it did not want to offend.”
Instead, we’ve decided to offend our own citizenry through needless snooping, civil rights violations and invasions of privacy.
At TomDispatch, Tom Engelhardt observes, incredulously, that the United States “seems incapable of intervening in a meaningful way just about anywhere on Earth despite the fact that its military remains unchallenged on a global level. It’s little short of mind-blowing.” While more time and more money are wasted on another attempt to find the meaning of Benghazi, we’re squandering the opportunity to find the meaning of our own behavior and our own policies. Why can’t we intervene in a meaningful way? Why have opacity and subterfuge replaced transparency and open debate?
That it’s not crystal clear to congressional Republicans that we’re doing something wrong with our foreign policy is a significant sign of incompetence. We’ve already answered (and re-answered) the question of who failed whom in Benghazi; it’s instead time to start asking why, more than a decade after 9/11, we put ourselves in a position to fail in the first place.
Listen to Katrina vanden Heuvel discuss Obama’s foreign policy on The Diane Rehm Show.
Sarah Anderson, director of the Global Economy Project at the Institute for Policy Studies, recently asked in a speech at the New Populism Conference in Washington, “Why should our tax dollars subsidize economic inequality?” Why must you and I foot the bill, via our taxes, for the callousness of Walmart or Domino’s?
The chasm between C-suite pay and minimum wage may be wider than ever before—in 2013, according to the AFL-CIO, CEOs of Fortune 500 companies made 774 times as much as minimum-wage workers—but, as Anderson points out, many people have grown tired of waiting for a solution to emerge from the maw of Washington and are instead taking the initiative themselves. “Just like on the minimum wage,” Anderson told the conference, “people aren’t waiting for Washington to lead on CEO pay. We’re seeing an unprecedented explosion of bold creative action outside Washington.” In Sacramento, Providence and other capitals, state-level activism and legislation are taking care of business that the House and Senate have chosen to ignore.
In California, Senate Bill 1372, introduced by state senators Mark DeSaulnier and Loni Hancock, lets the state raise the corporate tax rate, currently 8.84 percent, to as high as 13 percent for companies whose highest-paid executives earn more than 100 times what their typical employees earn. Companies with a CEO-to-typical-worker pay ratio below 100-to-1 would get a tax cut, down to as low as 7 percent. “Under the bill,” Inequality.org reports, “all firms with a ratio under 100-to-1 would end up with a tax cut, all above with a hike.” And California corporations that might be eyeing an end-around via outsourcing would see their tax rate increase by fifty percent if they exhibit a decrease in full-time employees and an increase in “contracted and foreign full-time employees.”
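The bill’s sliding-scale mechanism is simple enough to see in miniature. The sketch below is a deliberately reduced, two-bracket illustration of the figures reported above (the actual bill grades the rate more finely between 7 and 13 percent, and its outsourcing surcharge has its own tests); the function name and the hard 100-to-1 cutoff are our illustrative shorthand, not the bill’s text:

```python
def sb1372_tax_rate(highest_exec_pay, typical_worker_pay):
    """Illustrative two-bracket sketch of SB 1372's pay-ratio tax.

    The real bill defines a graduated schedule; here we collapse it
    to the two endpoints reported in the coverage above.
    """
    ratio = highest_exec_pay / typical_worker_pay
    if ratio <= 100:
        return 0.07   # ratio at or below 100-to-1: cut from the current 8.84%
    return 0.13       # ratio above 100-to-1: raised toward the 13% ceiling

# A CEO paid $10 million against a $50,000 typical worker is a
# 200-to-1 ratio, landing in the top bracket.
print(sb1372_tax_rate(10_000_000, 50_000))
```

Even in this crude form, the incentive structure is visible: a company can lower its tax bill either by restraining executive pay or by raising typical-worker pay, since only the ratio matters.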
Read the full text of Katrina’s column here.
Read Next: Why is the VA suffering from a lack of resources in the first place?
Though Republicans might not understand, it takes a lot more than bumper stickers to support the troops who fight their wars. It should be a no-brainer, but it seems like those who have determined to politicize the situation at the VA have forgotten the primary reason so many veterans are in such dire need of care to begin with, and why the scandalously cash-strapped department has been so hard pressed to provide it. As I said on Face the Nation this week, without considering the historical context of the wars in Iraq and Afghanistan—namely, that they were both unnecessary and prosecuted with a stunning degree of ineptitude—and without considering Congress’s history of underfunding veterans healthcare, it’s irresponsible to dive-bomb the White House with finger-pointing and grandstanding speeches about who needs to resign, and when.
There’s plenty of blame to go around concerning the massive failures of the healthcare system in the Veterans Administration. Both the media and politicians are focusing on administrative failures at the top and are calling for the resignation of Eric Shinseki, the retired four-star general who heads the federal agency, as if such a high-profile decapitation will fix the problem.
But members of both parties agree: It will not. Senator Claire McCaskill (D-MO) told Politico, “It shouldn’t be about a political scalp. It should be: How are we going to improve care for veterans?” And Senator James Inhofe (R-OK) belittled his colleagues’ knee-jerk demand for a cabinet-level resignation: “I’ve never seen [the tactic] work yet…. I’ve only been around twenty years.” Even Bob Dole has dismissed the notion that Shinseki should be forced out. Former Senator Max Cleland (D-GA), a Vietnam veteran and triple amputee, wrote in a Politico op-ed, “As a disabled veteran myself, there is no one I would rather have heading up the VA now, in this turbulent time, than Eric Shinseki. In my experience, he is the best there is.” Cleland should know; in addition to being, like Dole, a genuine war hero, he also served under President Carter as administrator of veterans affairs (the predecessor position to Shinseki’s) from 1977 to 1981.
Obviously, the creation of secret waiting lists at VA facilities is horrible. There is no excuse for such dereliction of duty, especially when it again puts the lives of our brave veterans in danger after they’ve already been made to face enough. Simply put, those who are responsible for making these lists should be fired. And if their actions are determined to have been illegal, then they should be prosecuted for criminal activity.
But just as obviously, we need to recognize that those actions were not ordered by Shinseki, himself a veteran twice awarded the Purple Heart (and, as you might recall, the Army Chief of Staff who—presciently—dared contest the Rumsfeld-Wolfowitz notion that postwar Iraq could be reconstructed with a mere 100,000 troops). Moreover, under Shinseki’s watch, the VA has cut the backlog of veterans-benefits claims by more than half. Veteran homelessness has dropped by twenty-four percent since Shinseki made it a priority in 2010.
The “secret lists” were the product of unauthorized decisions made at the local level, purposefully hidden from the top leadership of the agency. While we need to focus on the guilty parties who made the actual decisions to create these lists, we also need to recognize that none of this exists in a vacuum. This problem was not created in 2014, and we would be oversimplifying things if we didn’t also recognize that there are many, many others—over many, many years—who have contributed greatly to this terrible situation.
Why is the VA suffering from a lack of resources? You can find the answer on Capitol Hill. Insufficient funding of veterans’ healthcare has been caused primarily by political decisions made by “support-our-troops” members of the US Senate and House of Representatives. Members of Congress who have in recent years voted against increasing the funding of veterans’ healthcare—increases necessary to meet the need created by the wars in Iraq and Afghanistan—deserve much of the blame for starving the VA into this scandalous situation.
Many of those members of Congress who are now calling for the resignation of Shinseki are themselves guilty of voting against adequate funding for veterans’ healthcare, and they are therefore partly responsible for the deaths of veterans who didn’t get necessary medical treatment quickly enough to save their lives. It is just as reasonable to call for their resignations as it is to call for Shinseki’s resignation, if not more so. If you’re truly outraged, then tell your member of Congress to cough up the necessary funds for the VA. The secretary of veterans affairs can only ask for increased funding; it’s up to Congress to actually provide it.
The inept execution of the Bush administration’s two wars also plays a significant role. It cannot be said enough that our soldiers were nickel-and-dimed by Donald Rumsfeld’s DoD. They should have been provided with body armor (The Army prioritized body armor on the same level as socks, according to a 2005 New York Times report) and with adequately-protected military vehicles from the get-go (“We were reduced to duct taping old flak jackets to the side of our Humvees to provide protection,” Paul Rieckhoff, founder and executive director of IAVA, told NewsHour in 2004). Why are these decade-old outrages relevant today? Well, addressing these failures promptly and effectively then would surely have reduced the demand side of the veterans’ healthcare economic equation now.
To peel back another layer, an even more important factor in the creation of today’s mess was the disgraceful and dishonest launching of the Iraq War. All of those Bush administration officials, as well as their neocon allies, who lied and shamefully tricked us—and our troops—into fighting that needless war are responsible for the injuries sustained as a result of it. It’s a fairly simple case of cause and effect, and as long as Iraq War veterans suffer, the blood remains on their dishonest hands.
It cannot be repeated often enough that none of the politicians who involved us in the reckless and unjustified wars of the 2000s has ever been held adequately responsible for the massive damage they have done to our finances, our international standing, our military readiness and the health of our veterans. It might be convenient to pin all of the hawks’ failings (and they are legion) on Eric Shinseki’s shoulders, as Washington Post columnist Eugene Robinson suggests, but it would be morally, historically and economically bankrupt to do so. It would suggest that we, as a nation, do not actually value the lives and health of our soldiers over the political and financial imperatives of our ruling class. It would suggest that we consider our troops to be nothing more than rent-a-cops, called in to do security for the big event, then forgotten the next morning.
Today’s lesson is quite simple: after conflicts are over, we need to fully fund the healthcare and medical needs of our veterans. Forever. Even if that means making the political and economic elite pay more in taxes. Even if that means taking politics out of the VA and focusing instead on the welfare of our veterans. That we have politicians and members of the media who need to be reminded of this is a disgrace.
Read Next: Reed Richardson on how the media’s VA ‘scandal’ coverage is making the same old mistakes
Last week, Tea Party–backed Senator Rand Paul (R-KY) set the progressive world abuzz.
No, not with his usual retrograde positions on abortion, gay marriage or the Civil Rights Act of 1964 (he was against it before he was for it)—but rather with an op-ed in The New York Times, demanding that the Obama administration release its legal argument justifying the use of a drone to kill Anwar al-Awlaki, a US citizen, without trial. Paul vowed to filibuster the nomination to the US Court of Appeals for the 1st Circuit of former Justice Department official David Barron, who helped write memos supporting said argument.
Paul’s strong libertarian principles have always differentiated him from many of his Republican colleagues. It is, therefore, not all that shocking for him to speak out against a president he dislikes on a policy he disdains. Yet his outspokenness has many liberals and leftists asking a legitimate question: Why aren’t there more Democratic voices opposing the surveillance state? Protecting civil liberties should be a critical piece of the progressive platform, but too many establishment Democrats and progressives have been silent on this issue simply because one of their own is in the White House.
Some Democrats in Congress have taken bold stands. Longtime civil liberties champion (and former House Judiciary Committee chair) John Conyers has worked to limit the National Security Agency’s collection of bulk telephone data. Representatives Keith Ellison of Minnesota and Adam B. Schiff of California have probed the administration’s drone and surveillance programs. Representative Zoe Lofgren of California is pushing to prevent the NSA from weakening online encryption. In the Senate, Judiciary Committee chair Patrick Leahy of Vermont has held oversight hearings questioning excessive surveillance. Even Dianne Feinstein of California, chair of the Senate Select Committee on Intelligence and normally a committed defender of the intelligence community, finally spoke out after discovering that the CIA spied on Senate staffers. And last week, Senators Mark Udall of Colorado and Ron Wyden of Oregon sent a letter to Solicitor General Donald B. Verrilli Jr., strongly criticizing a “culture of misinformation” that has resulted in “misleading statements… about domestic surveillance.” And Senator Bernie Sanders, an independent from Vermont, has proposed a bill limiting FBI and NSA spying.
“The Climax of an Era”—so declared The Nation’s headline marking the Supreme Court’s ruling in Brown v. Board of Education, handed down sixty years ago this Saturday. For the few progressives left in those dark times, the Warren Court’s unanimous decision declaring “separate but equal” public facilities unconstitutional was “a fine antidote to the blight of McCarthyism and kindred fevers,” observed Carey McWilliams, soon to become The Nation’s editor.
More promising than the Court’s decision itself, McWilliams continued, was the widespread approval it garnered in a nation whose army had been desegregated less than six years earlier. That overwhelming response, he argued, should be taken as support for liberalizing federal policy further: “For it demonstrates once again the measure of unity and confidence and pride that can be aroused whenever unqualified expression is given to the individual and social values to be found in the Constitution and Bill of Rights. Fortunately we continue to redeem, often after costly delays and protestations, the promises to which we are committed by history and tradition and, we can now add, by current conviction.”
Sadly, as we know, those delays and protestations continued. In many ways they became even costlier as the civil rights movement zeroed in on the more institutionalized—and, therefore, less easily subverted—barriers to equality that had been erected in the course of hundreds of years of slavery and racial oppression. For The Nation, anniversaries of the Brown v. Board of Education decision have always served as opportunities to reflect on how much has been promised, how much delivered, how much still owed.
In 1974, the American historian John Caughey called Brown “the noblest and most influential judicial decision of the century.”
Brown drove Jim Crow from the drinking fountain, the waiting room, the lunch counter, the rest room, the elevator, the bus, the motel, the voting booth, the employment office, the salesroom, the public playground and the graveyard. That is not to say that racism has been extirpated. But all underpinning of government support has been withdrawn from almost every category of segregation or discrimination by race.
Caughey also noted the sad irony that, after much resistance, the South ultimately consented to school desegregation, while supposedly more enlightened parts of the country continued to resist:
Today, the prime concentrations of black children assigned to segregated schools are to be found in the Northern and Western cities. It is, furthermore, in these communities and on the school issue that the one great exception to the “Browning” of America is to be found. Here the order of the Supreme Court is evaded and resisted, usually under such slogans as local control or opposition to bussing, but with the anomaly that the one governmental body in town that still imposes and enforces segregation is the school system.
In 1994, The Nation headlined its fortieth-anniversary issue on Brown “Broken Promise,” arguing in an editorial that “to attribute today’s segregation and inequality to the broad economic trends that over decades have devastated and isolated the nation’s cities” was “to ignore the dimensions of politics and racism.”
Years before she became a Nation columnist, Columbia University law professor Patricia Williams wrote an essay for that issue about meeting and talking to the surviving members of the family of the late Oliver Brown, who sued on behalf of his daughter, Linda.
Have you been disappointed by the years since 1954? I asked Mrs. Leola Brown Montgomery [Oliver’s widow]. Of course, she said. And then added, “But I don’t think that anybody anticipated the country’s response. The attorneys, the parents, we didn’t really understand the insidious nature of discrimination and to what lengths people would go to not share educational resources: leaving neighborhoods en masse because African-American children could now go to the school in your neighborhood. Not offering the same kinds of programs, or offering a lesser educational program in the same school—I don’t think anybody anticipated what we’ve ended up with…But we’re currently still in the midst of the country’s response, in my opinion.”
Brown was a case, Williams wrote, “that shaped my life’s possibilities, a case that, like a stone monument, stands for just about all the racial struggles with which this country still grapples.”
Perhaps the legacy of Brown is as much tied up with this sense of national imagination as with the pure fact of its legal victory; it sparkled in our heads, it fired our vision of what was possible. Legally it set in motion battles over inclusion, participation and reallocation of resources that are very far from resolved. But in a larger sense it committed us to a conversation about race in which all of us must join—particularly in view of a new rising Global Right.
The fact that this conversation has fallen on hard times is no reason to abandon what has been accomplished. The word games by which the civil rights movement has been stymied—in which “inner city” and “underclass” and “suspect profile” are racial code words, in which “integration” means “assimilation as white,” in which black culture means “tribalism,” in which affirmative action has been made out to be the exact equivalent of the quota systems that discriminated against Jews—these are all dimensions of the enormous snarl this nation has been unraveling, in waves of euphoria and despair, since the Emancipation Proclamation…
We remain charged with the task of getting beyond the stage of halting encounters filled with the superficial temptations of those “my maid says blacks are happy” or “whites are devils” moments. If we could press on to an accounting of the devastating legacy of slavery that lives on as a social crisis that needs generations more of us working to repair—if we could just get to the enormity of that unhappy acknowledgment, then that alone might be the paradoxical source of a genuinely revivifying, rather than a false, optimism.
* * *
On the fiftieth anniversary of Brown, in 2004, The Nation published essays by Michael Klarman, Claude M. Steele, and David Garrow, among others, as well as a forum, “Beyond Black, White and Brown,” edited by Eric Foner and Randall Kennedy, reflecting on the legacy of the decision and “the prospects for future change.” Much of what the contributors wrote remains relevant and true today—perhaps only more so.
Pedro Noguera and Robert Cohen: The de facto segregation of so many of our nation’s schools is no longer an issue that generates conflict and controversy. Like the growing prison population and homelessness, racial segregation is accepted as a permanent feature of life in America. Across the country, schools are segregated in terms of race and class, and as was true before Brown, the vast majority of poor children are relegated to an inferior education.
Frank H. Wu: Supporters of affirmative action invoke the spirit of Brown; opponents of the programs complain that there is no comparison to be made. Yet if Brown is to stand for something, ought it not at least to stand for the proposition that, if we believe we must have institutions that are inclusive, we also should take action to insure that they are so?
Asa Hilliard III: In the absence of a real understanding of the structure of domination, some of the worst elements of segregation have returned, in new guises. Tracking is less visible, but it persists. Today’s scripted, standardized, cookie-cutter, minimum-competency managed instruction, sometimes by private contractors, with severely reduced parent and community involvement, is offered mainly in low-income minority cultural group schools. Affluent public or private schools rarely if ever use the scripted, non-intellectual robotic programs. This is the new segregation.
Jacquelyn D. Hall: We now face a situation in which the federal courts are preventing local communities from pursuing race-conscious policies, while segregated housing remains deeply entrenched. The result will be two school systems: one filled with nonwhite children from low-income families and one with middle-class children, most of whom are white, along with our most qualified teachers.
We cannot address this crisis by commemorating the Brown decision in the register either of triumph or declension. Instead, we must grapple with the long civil rights movement as an unfinished revolution whose gains are once again being partially reversed. The culprit now, as in the past, is not just overt racism but public policies that are ostensibly colorblind, yet deliberately shape the landscape of race. We need stories that dramatize the hidden reality, stories that have no satisfying upward or downward arc, stories that call us to a struggle whose end is still not in sight.
* * *
This will be the final installment of “This Week in ‘Nation’ History.” A new archives blog will appear at TheNation.com tomorrow, titled “Back Issues,” which will highlight past articles from The Nation relevant to current topics of the day or of independent interest. As the oldest weekly magazine in the Western Hemisphere, we have let little escape our notice. Curious how we covered something? Richard Kreitner, the editor of “Back Issues,” will be soliciting readers’ requests: write to him at email@example.com.
Subscribers to The Nation can access our fully searchable digital archive, which contains thousands of historic articles, essays and reviews, letters to the editor and editorials dating back to July 6, 1865.
Violence in Ukraine is spreading. The Ukrainian military and police are splitting apart, a reflection of the fissures in that deeply divided country. Pro-Russian separatists are taking over government buildings and police stations in eastern Ukraine. Pro-government mobs have burned protesters alive. The referenda on self-rule cobbled together by pro-Russian movements in the Donetsk and Luhansk regions deepen the divisions. Zealots on both sides could drive the country into a bloody and destructive civil war.
The United States has no direct national security interests at stake in Ukraine, but we do have an interest in a united and functional Ukraine that has stable relations with its European Union neighbors to the west and with Russia to the east. And the United States surely wants to forestall a crisis that could disintegrate into civil war, economic collapse and chaos, possibly destabilizing a weak European economy.
But if the United States is to help stabilize Ukraine and prevent a much larger European crisis, then the American political establishment and much of the mainstream media will need a sober reassessment of reality.
“Budgets are not just a collection of numbers, they are not just an accounting document,” said New York City Mayor Bill de Blasio when he revealed his FY 2015 budget last week. “They reflect fundamental values.” A municipal budget, to trot out an old chestnut, is the way a society writ large votes with its dollars.
So we should be optimistic about de Blasio’s $74 billion plan, which, as New York Magazine explains, espouses “his consistent priorities: education, housing, the homeless, and raises for municipal workers, all in service of combating income inequality.” The Times editorial board stodgily declares that “there is a lot to like in what Mr. de Blasio is proposing.” And Brent Budowsky of The Hill writes (under a headline non-ironically anointing the mayor as the “FDR of New York”), “[The budget] will be a standard for progressive experimentation and execution in the same way Roosevelt created a New Deal for America that not only survives today but also includes many brilliantly successful and popular policies, such as Social Security.” New York–area members of Congress are eagerly praising the budget, too.
Among the budget’s highlights is a $41 billion allocation, over the next ten years, to build or preserve 200,000 units of affordable housing. De Blasio says that this—”the largest, fastest” affordable-housing program ever attempted on a local level—will create around 194,000 construction jobs and 7,000 permanent jobs. The New York City Housing Authority will receive $70 million for repairs and security upgrades. An additional $14.4 million will go toward providing shelter for both homeless families and single homeless adults. Says Denise Miranda of the Urban Justice Center, “De Blasio ensures that no New Yorkers will have to choose between living in a mold-infested NYCHA apartment or being homeless.” (That any New Yorker ever did, of course, is a sad example of the city’s prior commitment to affordable housing.) And $17.75 billion will be used for settling outstanding contracts with the city’s labor unions. (In today’s antilabor climate, the fact that “union” isn’t a bad word in the de Blasio administration is a very big deal.)
De Blasio’s budget is a vital wake-up call to a city that saw gentrification and de facto segregation rise under Mayor Bloomberg’s watch, especially where housing is concerned. “Mayor de Blasio’s [housing] plan could help decelerate the seemingly irreversible social segregation that is plaguing New York,” writes Richard Eskow at the Campaign for America’s Future. “What happens if this plan isn’t carried out? Manhattan and parts of Brooklyn will increasingly become white, wealthy enclaves. Gentrification will drive lower-income families out of even the outermost boroughs. Service workers and other lower-earning workers could soon face commute times that rival those of apartheid-era South Africa. The rich cultural diversity that has been New York City’s hallmark will disappear, and the school desegregation called for in Brown v. Board of Education will become impossible to achieve.” It’s frightening indeed when New York City can see Johannesburg as a “peer” on desegregation.
In a related item, the budget calls for a twenty-five percent increase in public-library funding, further demonstrating the mayor’s commitment to providing cultural and educational opportunities across the city, and not just in its elite Midtown core.
Could we finally be leaving Mayor Bloomberg’s Gilded City behind? In the introduction to our special issue about the Gilded City last year, we quoted James Parrott, chief economist of the Fiscal Policy Institute in New York, as saying, “New York City’s government is significant enough in its breadth…that the policy tools exist and the wherewithal exists to do something at the margins to lessen inequality.” De Blasio recognizes this. He also recognizes that he serves at the pleasure of the people—all 8.3 million of them—and that his mandate encompasses all of them.
A city in which white people make up 37 percent of the population but earn 51 percent of the income—and in which African-Americans and Latinos constitute 47 percent of the population yet take home only 34 percent of the income—keeps its fingers crossed when it calls itself “great.” So does a city that, during the Bloomberg administration, denied shelter to record numbers of homeless individuals and families.
As de Blasio’s term unfolds and as his budget takes actual shape, the city government will work to provide both real and symbolic results for the people, an upgrade from too many years of relying solely on the symbolic. To see just how potent this new brand of progressivism has become, just look at the scene from the recent sixteenth birthday party of New York’s Working Families Party—which de Blasio helped found. In a speech, honoree Cynthia Nixon announced, “We’re at the beginning of a great progressive era in New York City… Bill de Blasio, [public advocate] Letitia James, [city council speaker] Melissa Mark-Viverito. That is a holy trinity if I ever heard of one. We’ve had so many victories lately, and I feel that if we keep working, we have so many more coming.”
It’s said that the ideas behind the New Deal were invented, developed, and tested in New York City in the 1910s and ’20s before going national under FDR. Behind de Blasio and the WFP’s “holy trinity,” perhaps the New New Deal can start here, too.
In a move that’s in equal measure frightening, foolhardy and dangerous, Senate Republicans, led by Bob Corker (TN), last week introduced the Russian Aggression Prevention Act of 2014. I’d say the world just became a little bit less safe, but that wouldn’t be accurate. The world just became a lot less safe.
Included in the bill’s Cold War–era rhetoric is a provision that puts a freeze on US participation in the New START Treaty, and one that limits overflights heretofore authorized by the Open Skies Treaty. The bill would also accelerate deployment of ballistic missile defenses in Europe. While this isn’t quite Dulles-style brinkmanship, it is entirely too close for comfort. Senator Corker and his allies have needlessly pushed us that much closer to an armed confrontation.
Bilateral reduction of nuclear arms is not an acceptable bargaining chip in this situation. The US nuclear arsenal endangers Americans as much as it does Russians, and this petty nuke-rattling by Corker’s war party puts the entire world at risk. Explains Corker, without even a hint of self-awareness: “We need to inflict more direct consequences on Russia prior to Vladimir Putin taking additional steps that will be very difficult to undo.” Threatening to “inflict direct consequences,” especially where nuclear disarmament is concerned, is chilling. The mutual reduction of nuclear weapons that the New START Treaty calls for is not, as the All Souls Nuclear Disarmament Task Force puts it, “a favor which the U.S. is doing for Russia; it is in the self-interest of both parties (and the rest of humanity).”
It’s ludicrous to think that we “defend peace,” as Senator Johnny Isakson (R-GA) claims the bill does, by ratcheting up military tension—especially nuclear tension. Our nuclear-disarmament agreements, like the New START Treaty, are fragile enough as it is. It is also folly to take Senator John McCain’s (predictably) belligerent stance that the only way to get anywhere with Vladimir Putin is with increasing—and increasingly military—aggression. We’ve held the entire world’s population captive by playing nuclear chicken before; to engage in such a strategy again is beyond reckless. The Cold War is over; we must use diplomacy, not short-sighted chest-thumping, to resolve this crisis.
Let’s de-escalate and defuse this situation with rational thought and an appropriate dose of wariness (and awareness) before we start counting and aiming our weapons. Let’s follow the lead of the OSCE, whose response to the Ukraine situation ranges from “high-level diplomacy and multilateral dialogue” on one end to “monitoring, fact-finding and military visits” on the other. There are nuclear weapons involved here; we won’t be able to shoot first and ask questions later.
At 6:23 pm last Tuesday, as many Americans sat down to eat dinner, Clayton Lockett lay down to die. More accurately, an Oklahoma prison official strapped him to a gurney, sedated him and then injected him in the groin with an untested mix of two more drugs—one to stop his breathing, the other to stop his heart.
Quickly, it became clear that the coldly clinical execution—what the head of the Oklahoma American Civil Liberties Union called “a human science experiment”—had gone horribly awry. At 6:36 pm, despite Lockett’s being pronounced unconscious, his head lifted off the bed. He began moving and mumbling. At 6:39, convulsing, he uttered the words, “Oh, man.” By the time the director of prisons announced that there had been a vein failure and issued a stay of execution, it was too late. At 7:06 pm, Lockett suffered a massive heart attack and breathed his last—forty-three excruciating minutes after the execution began.
While Lockett was undoubtedly guilty of heinous crimes—he raped one young woman, and shot another and buried her alive—his state-administered death, just the latest in a horrific series of botched executions, serves as a stark reminder that the death penalty has been a moral, economic and practical failure. According to a new study, one in every twenty-five people sentenced to death between 1973 and 2004—around 300 people—was likely innocent. Many of these wrongful convictions stem from a well-documented racial bias in the criminal justice system—55 percent of death row inmates are black or Hispanic, and those who kill whites are far more likely to receive the death penalty than killers whose victims are black. Keeping the country’s 3,000-plus inmates on death row costs us billions—California alone spends $184 million annually on capital punishment—all for a system that experts agree has never been proven to be a successful deterrent.
The European Financial Transaction (a k a Robin Hood) tax scored a big legal victory on April 30, when the European Court of Justice threw out the British government’s challenge to the tax’s legality. The ECJ has struck a serious blow for fairness, as the dismissal essentially chastises the British government for championing the interests of the UK’s financial industry over those of its citizens. David Hillman, spokesperson for the Robin Hood campaign, told The Guardian, “This futile legal challenge tells you all you need to know about the government’s misguided priorities: it would rather defend a privileged elite in the City than support a tax that could raise billions to tackle poverty and protect public services.”
What’s more, these “misguided priorities” of David Cameron’s government became all the more apparent last Friday, when an analysis of the Bank of England’s £375 billion stimulus program determined that those public funds, according to the International Business Times, “[have] made the wealthiest 5% even richer, worsened the economic recovery, made pension pots smaller, failed to stimulate business investment, and given a bonus to financial services.” To recap, then: Britain’s response to the financial crisis has included going to bat for the finance industry at the ECJ and handing it a £375 billion gift from the public coffers—what one analyst calls a “Robin Hood tax in reverse.”
In this environment, the real, non-reversed Robin Hood tax has serious momentum within Europe, just as the eleven-nation coalition behind it is expected to make an important announcement about the first phase of the tax on May 5 or 6. The proposal includes a 0.1 percent tax on stock and bond trades and a 0.01 percent tax on derivatives. It’s now expected that the tax will indeed be phased in, with the levy on stock trades constituting the first step. Reportedly, the finance ministers involved in the negotiations plan to use the rest of the year to negotiate over taxes on derivative trading, which could be introduced in a second phase. While the German government is reportedly determined to secure an agreement from the outset to include derivatives, there has been some resistance, including from the supposedly more left-wing French government.
The Robin Hood Tax is, as European FTT campaigners say, the “most popular tax in history,” and such high regard—even for something as seemingly unromantic as a 0.1 percent tax—isn’t difficult to understand: FTT revenue can be used to create jobs; spur economic development beyond the financial industry; and combat climate change, global poverty and HIV/AIDS. One measure of the tax’s popularity is that this week’s announcement about the FTT’s first phase has been scheduled to occur during the lead-up to the European Parliament elections of May 22–25, and support for the tax is expected to be a major vote-getter. Not exactly an American-election-style “October Surprise” to be sure, but certainly a signal to candidates: Robin Hood matters to European citizens. You can lend your name to the movement, too, by signing the “1 Million Strong” petition.
Even if this week’s announcement isn’t everything campaigners have been hoping for, the announcement itself will send a strong signal that the European FTT—despite the UK’s legal challenge, despite a strong backlash from the financial industry, and despite concerns from the US Treasury Department about possible impacts on US investors—is moving ahead. We’re seeing more and more popular, judicial, and—increasingly—legislative support for the tax in Europe.
This kind of visible success should boost efforts to build support for such a tax in this country, currently represented by Representative Keith Ellison’s (D-MN) Inclusive Prosperity Act. While the Obama administration has not yet supported the idea, there is increased openness in the Treasury Department to at least take a closer look at an FTT as one possible remedy for high-frequency trading (which happens to be the subject of Michael Lewis’s latest best-selling book, Flash Boys).
Thanks in part to Lewis, there’s increasing awareness of how these high-speed/high-frequency traders are rigging American markets in their favor. (Lewis writes, “…[I]f a single Wall Street bank were to exploit the countless minuscule discrepancies in price between Thing A in Chicago and Thing A in New York, they’d make profits of $20 billion a year.” And obviously, as Flash Boys illustrates, the exploitation of those countless minuscule discrepancies is available only to a handful of deep-pocketed outfits with access to certain blazing-fast fiber-optic cable lines, effectively reducing much of the stock market to a few guys in a black box making 10,000 trades per second. “It is hard to see the benefit to society as a whole of enabling such trades,” muses The New York Times’s Floyd Norris in a discussion of the book.)
But as Sheila Bair, former head of the FDIC, has put it, an FTT “would penalize those who destabilize our markets with rapid fire trading, while rewarding those who invest for the long term.”
In the City of London and on Wall Street, the price of doing business should be paid by, well, those doing business. We pay a steep opportunity cost—“dead weight loss,” as Robert J. Barbera of the Center for Financial Economics at Johns Hopkins puts it—when we give the finance industry free rein to make money by any means necessary, even if it destroys the economy for the rest of us. They’re on the right track in Europe; here in the US, we need to join them.