When Racial Justice Meant Universal Social Benefits

The left used to believe that reducing inequality across the board was the best way to combat racial injustice. What happened?

“It is obvious, however, that job discrimination based on racial or religious prejudice is subsidiary to the more pressing issue of full employment. When jobs are plentiful, all kinds of economic discrimination are minimized. When jobs are scarce, and the competition among workers for available openings is sharpened, it is relatively easy to divide employees into convenient groupings provided by the incident of race, color, or religion, and to aggravate the prejudice which leads to an exclusion of minority groups from job opportunities. The basic problem to be solved, therefore, is the problem of full employment.”

Pauli Murray, a legal scholar, civil rights activist, and important voice in Black Popular Front left politics, made this observation in the California Law Review in 1945, at a time when the Full Employment Bill was working its way through Congress. The legislation, which passed the Senate but failed in the House of Representatives, would have established the maintenance of full employment as national economic policy, giving both the president and Congress responsibility for carrying it out. Murray was expressing a conviction that was common among Black and other left-leaning civic elites during the early postwar years but has long since been abandoned: that the only way to win social justice victories for Black Americans, and to secure them once they had been won, was to make certain that they were combined with, and part of, a broader expansion of social protections.

In a 1944 volume of essays, What the Negro Wants, edited by the Howard University historian Rayford W. Logan, a politically diverse group of Black civic elites expressed a similar understanding, as did the social scientists St. Clair Drake and Horace R. Cayton in their magisterial 1945 case study of Chicago, Black Metropolis: A Study of Negro Life in a Northern City. “Race conflict in northern urban areas,” Drake and Cayton wrote, “arises when competition is particularly keen—for jobs, houses, political power, or prestige—and when Negroes are regarded (with or without foundation) as a threat to those who already have those things or who are competing for them.” The authors concluded on a speculative note: “The people are rather definite about what they want: the abolition of the Job Ceiling; adequate housing; equal, unsegregated access to all places of public accommodation; the protection of the rights of those Negroes and whites who may desire to associate together socially. And they want to see this pattern extended to all of America—including the South.” Realizing these goals, they argued, depended on a broader struggle for “full employment in the postwar world and on the development of a world program for emancipating the Common Man.”

This perspective was a common-sense presumption for a generation or more of Black and other advocates of racial justice. That is no longer the case. Far from it. In recent years, the view that Black Americans’ interests are best met within the context of the pursuit of universal social benefits has been rejected out of hand by adherents of race-reductionist politics. How has this shift come to pass? And what are its implications?

Today, public voices like Ta-Nehisi Coates claim that the War on Poverty failed Black Americans because it did not address the supposedly special nature of Black poverty. In reality, that is precisely what it did, thereby failing Blacks and everyone else. MSNBC talking head Joy Reid dismisses universal social policy as resting on a discredited belief that “a rising tide [will] bring the races together.” Her stance conflates universalism with the growth politics that, since at least the John F. Kennedy administration, has been centrist Democrats’ alternative to a redistributive policy that would provide universal, non-marketized access to necessities like health care, education at all levels, and housing, along with a commitment to a full-employment economy. This sort of misreading is what happens when history is reduced to the equivalent of a fortune-cookie factory.

The period in which Pauli Murray was active marked both the high point and the subsequent retreat of the social-democracy-inclined left that emerged from the New Deal. It was defeated by a combination of anti-communist repression and the corporate counteroffensive against labor and the social state that followed the end of World War II. Congressional conservatives defeated the Full Employment Bill of 1945 and passed a much weaker version the year after. The Wagner-Murray-Dingell bill that would have set a path toward national health care was foiled three times between 1943 and 1950, and in 1947 the conservative Congress passed the Taft-Hartley Act, which severely restricted the power and growth of organized labor. The Cold War–era Red Scare discredited critiques and policy proposals that could be made to seem like socialism.

Regardless, activists rooted in that Popular Front left orientation continued to press for universalist approaches to combating inequality, from the debate over the shaping of federal anti-poverty policy in the early 1960s, to the agitation for the Freedom Budget for All Americans advanced by the A. Philip Randolph Institute and endorsed by other civil rights and labor leaders in the mid-1960s, to the struggle to pass the Humphrey-Hawkins full-employment bill in the mid-1970s.

That story is well enough known. Less widely recognized is that debates in the early 1960s over the shaping of the War on Poverty and of Title VII—the employment section of the 1964 Civil Rights Act—would sound the death knell of the social democratic tendency in American politics. Poverty and racial inequality in employment were disconnected from political economy and relocated to the realm of culture and individual behavior. In each case, an emerging liberal narrative construed the victorious option, though it was narrower, less effective, and more vulnerable, as the more radical and forward-looking one. In my next column, I’ll show how that happened, how we got here from there, and why the shift matters.
