Books & the Arts / January 24, 2024

We Have No Princes

Heather Cox Richardson and the battle over American history.

One interpretation presents the country as irredeemably tainted by its past. Another contends that the United States has also tended toward egalitarianism.

Kim Phillips-Fein
Illustration by Tim Robinson.

In the fall of 2019, during Donald Trump’s first impeachment inquiry, a historian of 19th-century America by the name of Heather Cox Richardson began to publish essays summarizing the day’s political news on her Facebook page. Her calm, clear, and matter-of-fact voice offered readers a daily digest that managed to sidestep the shrill hysteria of Twitter and the confusing blow-by-blow of press accounts. As she continued her readings of current events, she discovered that there was a tremendous market for just such an approach.

Books in review

Democracy Awakening: Notes on the State of America


Over the dramatic course of 2020 and 2021—from Covid to the election to the events of January 6—Richardson’s straightforward analysis won her a vast readership. She soon began to publish her reports as a Substack newsletter under the title Letters From an American, borrowed from the famous Revolutionary-era rhapsodies of Hector St. John de Crèvecoeur, whose Letters From an American Farmer offered a plainspoken celebration of American democracy. The newsletter went on to amass one of Substack’s largest audiences. With about 1.3 million people reading each missive, Richardson had become a true history star.

Richardson’s popularity is notable in itself. Many of the historians of earlier generations who rose to popular acclaim have been august personages, usually male, declaiming from on high. Richardson, by contrast, is a woman; her voice is sincere, humble, approachable, and jargon-free. She knows a lot about American history, but she doesn’t club you over the head with either her mastery of scholarly arcana or her ironic hot takes.

Richardson’s distinctive persona and gentle egalitarianism are at the center of her new book, Democracy Awakening. She draws on her scholarly background as a historian of the Republican Party of the 1850s to make her case. Asserting that the party under Trump is committed to a radical vision of economic, racial, and social hierarchy, she argues that this is part of a much longer history, one that goes back to the 19th century and the years that preceded the Civil War. In that era, the Southern economic elite, whose wealth relied on extreme exploitation and racial subordination, sought to impose its vision that “some people are better than others” on the American populace, capturing such institutions as the Senate and the Supreme Court in order to uphold its minoritarian views. One interpretation of American history today presents the country as irredeemably tainted by its past, a sordid history of racism, slavery, and violent conquest. Richardson contends that this is only part of the story and that the “fundamental principles” of the nation have tended toward egalitarianism; often, however, it has been up to marginalized groups—women, Indigenous people, and especially Black Americans—to remind the rest of the country of its creed. In this way, Democracy Awakening is worth reading not only for its own merits but also for what it tells us about its readers and how an important swath of the public understands our contemporary dilemmas.

Democracy Awakening is organized into three parts. The first looks at the history of the contemporary right, which Richardson argues began as a mobilization against the New Deal. (“Today’s crisis began in the 1930s,” she writes.) For Richardson, the word “conservative” is inadequate to describe this right-wing mobilization, because it has almost always been committed to making radical changes in American institutions to achieve its goals. Quoting Abraham Lincoln’s February 27, 1860, speech at Cooper Union in New York City, she endorses his idea that the enslavers were the revolutionary, destructive radicals, who sought to entrench slavery in the United States with a commitment that far outstripped that of the founders. The party of Lincoln, she writes, “put into practice [its] conservative position that the nation must, at long last, embrace the principles embodied in the Declaration of Independence: that all men are created equal and must have equal access to resources to enable them to work hard and rise.”


The New Deal, Richardson argues, also embodied these principles, and thus the forces that resisted it in the 1930s and ’40s were not “true American conservatives,” but instead “the same dangerous radicals Lincoln and the Republicans of his era warned against.” Likewise, the common people who fought in the armed forces during the Second World War and who later joined the civil rights movement in the 1950s and ’60s were the true defenders of democracy, while the “conservative” heroes who wrapped themselves in the trappings of Americanism were the ones resisting the country’s democratic promise and, as a result, laying the foundations for our current authoritarian threat.

After exploring the roots of the American right in the 1930s and ’40s, Democracy Awakening moves on to the present. The second section of the book covers the bewildering cascade of events during the Trump years: Russiagate, Charlottesville, the first impeachment, Covid, the election interference in 2020, and finally January 6. Underlying these twists and turns, Richardson argues, was a broader pattern found throughout the history of the American right—a power grab analogous to that of the enslavers in the years before the Civil War. The reliance of the Republican Party today on the courts suggests, to her, a parallel with the antebellum era: “Like today’s Republicans, as southern enslavers lost support, they entrenched themselves in the states, then took over the machinery of the federal government and then the Supreme Court.” The goals of the Republican Party today are an extension of those earlier ones, she writes. “The MAGA Republicans appeared to be on track to accomplish what the Confederates could not: the rejection of the Declaration of Independence and its replacement with the hierarchical vision of the Confederates.”


From our Trumpian present, Richardson then goes back to the founding era, discussing the American Revolution and the ratification of the Constitution. Her return to the past is intentional: She wants to show that the hierarchical vision of American history avowed by the right has always had its adherents—the loyalists, the slaveholders, the industrialists—but also that this vision has generally been contested and has often represented a minority of the country.

Yes, there were many in the American colonies who were true believers in the monarchy and wanted to remain part of the British Empire, but they eventually gave in to popular calls for revolution and independence. The founding fathers drafted a constitution that legitimated treating human beings as property, but they also wrote a document that was capable of amendment. By the 1850s, the slaveholders were defending states’ rights as an alternative to government by the national majority, seeking to silence abolitionists by violence if need be. Yet “these defenders of human enslavement” constituted a “small minority of the country” and had to resort to these undemocratic measures precisely because they knew they were losing the political war. The Republican Party under Lincoln and William Seward presented itself as resisting a “slave power” that had perverted the real ideals of the nation.

In this context, the Civil War and Reconstruction were, Richardson argues, truly moments of a “new birth of freedom,” a second founding based on national birthright citizenship. The labor movement of the late 19th century took up these ideals as well. Wealthy white men sought to “restrict suffrage rights”; how far they would go to achieve this disfranchisement was on display in the bloody labor conflicts of the Gilded Age. Meanwhile, Richardson asserts, “Black Americans, people of color, women, and workers had never lost sight of the Declaration of Independence.” She says little about the Progressive Era and its elite-driven reforms, or about Jim Crow in the South, but instead moves quickly to the New Deal and its political coalition, which joined Black and white working-class people—a true majority that would be undone by latter-day Confederates and their ilk, who had no commitment to any democratic or egalitarian ideals.

Richardson’s account bears the weight of the intense debates in recent years over how to teach and commemorate American history. Historians have been divided, roughly, into two camps. On the one side, there are those who are committed to revealing the conquest, violence, racism, and ecological exploitation that haunt every apparent accomplishment of the United States; on the other, there are those who insist that regardless of its flaws, the United States remains a noble experiment in self-government and constitutional order. This argument over national symbols and myths is often presented in stark and simple terms. Was the United States born in a glorious revolution in 1776—or was its birth really in 1619, when the first enslaved African people arrived in Virginia? Is the United States a democratic nation that embodies the best hopes and aspirations of a free humanity—or is it, at its heart, a country born out of a war to exterminate the Indigenous inhabitants of the continent? How can we think and speak honestly about the bleak history of the country while also salvaging what seems redeemable within it—assuming anything really is?

Richardson is aware of these conflicts, and by arguing that egalitarianism is the authentic creed of the country, she does align herself with one side more than the other. But she also wants to find a way to bring these two narratives together: to keep the ideals of American democracy and egalitarianism but to change the heroes. In her telling, the very people who have been most excluded throughout American history are the ones who have most forcefully advanced its central ideas and principles. Yet the ideas and principles are still there, a promise and a beacon to be taken up by those who were not included under them at the start.

In certain ways, Richardson’s account resembles that of Howard Zinn, another great popularizer of American history. But her insistence—echoing that of the pre–Civil War Republicans—that the champions of democracy are the conservative force in American society, while the “slave power” and its later manifestations are a radical force bent on transforming American institutions, means that she has little to say about the leading figures in Zinn’s account, namely the radical activists whose work and utopian vision were necessary to press the United States toward greater democracy. There are very few of these actors in Richardson’s book. While we hear a lot about Lincoln and the Republicans in the 1850s and ’60s and Franklin Roosevelt and the Democrats in the 1930s and ’40s, we hear little about the abolitionists, the socialists, or the Industrial Workers of the World, and certainly not the Communist Party of the 1930s, which, with its slogan “Communism is 20th-century Americanism” and its embrace of the iconography of Thomas Paine and Abraham Lincoln, also sought to claim the mantle of democracy. The New Left, Black Power, and the radical feminist and gay and lesbian activists of the 1960s are barely mentioned; Richardson likewise says little about the various anti-war movements, whether in the 1960s or at other points. The people who pressed for radical change in American politics are portrayed as being, on some level, in its mainstream all along.

Richardson also seems to elide some of the divisions among those seeking to keep the United States true to its founding principles. Hasn’t there always been more disagreement between radicals and reformers over democratic ideals and egalitarian politics than the account of American democracy strangled by an undemocratic elite might suggest? After all, what galvanized the North before the Civil War was not so much abolitionism alone or an egalitarian politics, but the more modest desire to exclude slavery from the West. And while the politics of race and slavery shifted profoundly over the course of the war, at the outset many in the North were far from confident that the war would—or even should—lead to the end of slavery.

When we look at politics today, those opposing Trumpism are also divided about their ultimate political goals. Some liberals frame Trumpism as an illicit power grab that marks a break from the country’s historical commitment to liberal and democratic norms, and so the solution is to defeat Trump decisively at the ballot box and restore order under right-thinking Democrats. But others argue that to truly rout the macho nationalism and reactionary politics of today’s Trump-leaning right requires something more radical than a simple reassertion of civility. Trumpism is not just a program foisted on the country by a malign elite. It has real roots within American political culture and is a response to the dispossession created by the capitalism of the last 40 years, and so it requires a transformative response in kind.

Perhaps another way to understand the problem would be to acknowledge that there have been long-intertwined traditions in American politics not only on the right but among liberals too. One of these traditions has been fiercely egalitarian, as Richardson observes, but another has been deeply discomfited by claims to equality. We are familiar today with the many attempts to justify a vastly unequal society—from invocations of economic competition and the meritocratic ideal to the idea that absent hierarchy, social disorder will result. Given how prevalent those arguments are today, it is heartening to remember that they are not the only ones. The farmer who inspired the title of Richardson’s newsletter had a very different understanding of America—one in which the economic success of one was intimately linked to the success of all, and individual wealth was part of a project of communal prosperity. As de Crèvecoeur wrote: “Here are no aristocratical families, no courts, no kings, no bishops, no ecclesiastical dominion, no invisible power giving to a few a very visible one; no great manufacturers employing thousands, no great refinements of luxury.” He went on to project the utopian ideal: “We have no princes, for whom we toil, starve, and bleed: we are the most perfect society now existing in the world.” But even when Crèvecoeur wrote this, his egalitarian vision was not the only or even the prevailing one, and today its radicalism stands out far more than Richardson’s account suggests.



What Does It Take to Win?

Kim Phillips-Fein
January 24, 2023

In 1948, a young historian named Richard Hofstadter published The American Political Tradition, a critical look at the country’s politics that marked a sharp break from the accepted wisdom among historians of the time. Many scholars, following in the footsteps of the Progressive era’s Charles Beard, held that American history was defined by conflict: by the policy disagreements that separated agrarian from industrial regions, by disputes among different factions of the economic elite over the right path forward for the country. But Hofstadter suggested the opposite. Profiling an array of political leaders—from Thomas Jefferson and John Calhoun to Theodore and Franklin Roosevelt—he argued that American politics existed within a shockingly narrow spectrum. Almost all of the politicians he chronicled, who helped define the American “tradition,” had accepted capitalism and individualism as the reigning norms of political life. Despite their apparent differences, they all shared this bedrock faith—one that left them unable to grapple with the underlying realities of American life, with its myriad inequalities of class and power. American democracy was unable to live up to its promise, Hofstadter insisted, because it was in the grip of a liberal ideology defined by a rapacious individualism.

Generations of scholars have contested Hofstadter’s pessimism and his account of politics in the United States. Pointing out the remarkably limited cast of characters he’d chosen to stand in for the “American political tradition”—all white men, all but one of them elected leaders (the sole exception, the abolitionist activist Wendell Phillips, comes off much better than the rest)—Hofstadter’s heirs argued that American politics had always been far more ideologically diverse than he allowed, including real critics of capitalism and the country’s commercial norms. Others also challenged Hofstadter’s vision of American politics as essentially liberal in nature, pointing to the political forces—from slaveholders to patriarchs—who espoused explicitly illiberal forms and championed reactionary causes.

Yet for all of his critics, the questions that Hofstadter raised in The American Political Tradition are very much with us today. Why is a country that overtly embraces egalitarian ideals and democratic politics so far from being egalitarian and democratic in reality—a country where ordinary people can expect to exercise meaningful political power? We talk a great deal about “democracy,” but what does that really mean: How, and under what conditions, does it exist? And given the many obstacles to political change, how can we explain why it happens when it actually does? Indeed, these are questions of perennial concern—but today, as the country faces a frightening far-right turn, they have new resonance.

Hofstadter hovers over Timothy Shenk’s Realigners: Partisan Hacks, Political Visionaries, and the Struggle to Rule American Democracy. Indebted to The American Political Tradition in style as well as tone, the book is a collection of profiles of major political leaders and activists that takes as its central subject the “narratives, policies and symbols—in short, the ideas” that drive American politics. Shenk’s selection is far more eclectic and idiosyncratic than Hofstadter’s was; it includes Charles Sumner, Phyllis Schlafly, Walter Lippmann, W.E.B. Du Bois, the Republican Party strategist Mark Hanna, and Hanna’s daughter Ruth Hanna McCormick. Presidents get less play; there’s no chapter on Franklin Roosevelt, Lyndon Johnson, or Ronald Reagan, though Martin Van Buren and Barack Obama are represented. Like Hofstadter, Shenk is attuned to the divide between the rhetoric of these figures and the reality of what they accomplished, and to the ways that they were often undermined by their own hopes and actions, seeking to accomplish one thing but ending up in an entirely different place.

But despite Hofstadter’s evident influence, Shenk is motivated by a very different underlying problem. While Hofstadter surveyed the sweep of American political thought and came to the pessimistic conclusion that its recurrent theme was the evasion of conflict through appeals to a shallow individualism, Shenk wants to explain how and why structural change does sometimes happen, and often against all odds. Political activists, after all, have always tried to make space for themselves within the American political tradition; sometimes they have even succeeded in turning the country in directions no one had anticipated. For Shenk, the central dilemma is produced by majority rule itself: If democratic politics involves appealing to the majority of voters in order to win an election, how can anyone gain power with the goal of creating lasting structural change? Since real change involves asking people to take a leap into an unknown future, is this possible to achieve in an electoral system that demands creating majority coalitions—including many who may have a profound stake in keeping things the way they are?

The United States has gone through one period of truly revolutionary change in its history—the abolition of slavery—and the first half of Realigners is concerned with the development of political parties and the maneuvering around slavery in the early republic and the antebellum years. The political problem of ending slavery brings Shenk’s key concerns to the fore: How can a society vote to take actions that completely transform property, labor, and class relationships? Enslaved people by definition could not vote—indeed, they could not engage in any formal or recognized political activity; could not speak freely or publish their own newspapers; and could not enjoy freedom from search and seizure, being property themselves. They effectively existed in a dictatorship that was no less absolute for being decentralized, one master at a time controlling his own plantation through recourse to violence—a violence that all white people were implicitly able to enact, if need be, to maintain the social hierarchy. The people most oppressed by slavery could not take political action to abolish it.

The rest of American society, in the North as well as the South, did not seem all that interested in taking action either. After all, most Americans after the Revolution benefited from slavery to some extent, especially in the first three decades of the 19th century, as cotton and its profits dominated the national economy and drove its growth. In the early 19th century, there was no electoral constituency that could bring an end to the slave system, and so, as Shenk notes, one had to be created: a “coalition of free states, powered by mass democratic politics, dedicated to abolishing slavery.” This was the project of Republican politicians like Massachusetts Senator Charles Sumner, in contrast to earlier generations of abolitionists, who had looked at the complicity of the entire nation in sustaining slavery and argued that withdrawal was the only moral choice.

There had been attempts to build an electoral anti-slavery politics before, starting with the Liberty Party, which was founded in 1840 by the abolitionist activists Gerrit Smith and James Birney. The Free Soil Party, emerging from a Democratic Party faction led by former president Martin Van Buren, focused on a single issue: preventing the expansion of slavery into western lands. But it wasn’t until 1854 that a group of Free Soil politicians and antislavery activists across the North founded the Republican Party. Crafting a politics that linked economic self-sufficiency and access to western land with opposition to expanding the slave empire, the party was by no means committed to abolishing slavery throughout the country. The Republicans’ end goal, Shenk notes, was conservative: halting the division of the nation into a propertied and a proletarian class. (“The South had to be transformed, they argued, so that the North could stay the same.”) But whatever its limits, the Republican Party did represent a democratic mobilization in opposition to the entrenched power of the slave system, and thus its electoral victory ensured that Southern elites would leave the Union.

The Republican Party’s vision of economic self-sufficiency and the moral dignity of labor helped it win the election of 1860, but war changed the Republicans, too. Although the party had not campaigned on abolition, the Civil War’s exigencies forced it to embrace an increasingly radical politics. As the “general strike” of enslaved people described by W.E.B. Du Bois erupted in the South, Abraham Lincoln and the Radical Republicans began to consider a general emancipation edict—one that Lincoln issued on January 1, 1863, followed by the 13th, 14th, and 15th Amendments, which Congress passed and the states ratified between 1865 and 1870. But even as the Republicans embraced abolition and the radical early programs of Reconstruction, “many persons who have been Radicals all their lives are in doubt whether to be Radical any longer,” as The Nation’s first editor, E.L. Godkin, put it. By 1877, the Republican Party had become a vehicle for the massive capital accumulation of the industrial age, as well as complicit in the disenfranchisement of Black Southerners and the eventual rise of Jim Crow.

Shenk’s best chapter is his profile of Charles Sumner, which dramatizes the rise (and fall) of a more radical Republican Party. He shows how the Massachusetts senator helped popularize the notion of the “Slave Power”—a galvanizing idea that the Republicans built their coalition around, and one that opened the door to a more radical antislavery politics. “All the acts of our Government [are] connected, directly or indirectly, with the institution,” Sumner wrote, insisting that slavery was an attack on white Northern and Southern voters as well as the enslaved.

This vision, Shenk contends, enabled the Republicans to form a mass coalition, win the 1860 presidential election, and sustain morale and popular support for a bloody war against the South. But as he notes, the Republican faith in the egalitarian sensibilities of white Southerners and Northerners would ultimately run aground on the intransigence of Southern elites and racism and indifference in both North and South. While the Radical Republicans would support the war while it lasted and even press for the constitutional amendments that followed, they proved unable to build enough popular support in the postwar years to redistribute land or to enforce the civil and political rights of the freed people. For Shenk, the very vision that allowed the Republicans to build an antislavery majority before the war also limited their capacity to guarantee the creation of democratic institutions after it was over.

Another theme running through Realigners is the ambivalent streak in American politics toward the entire project of electoral democracy. On the one hand, Shenk addresses the wary stance of the early leaders of the American republic toward mass politics. Many of them, he suggests, were skeptical of, if not hostile to, ideas of majoritarian rule and hoped instead to preserve the prerogative of political power for deserving statesmen. They saw elections as a way for an elite class of guardians to obtain legitimate authority, rather than as a democratic mobilization to shape and define the popular will. Shenk presents Alexander Hamilton (for all his “young, scrappy and hungry” fame) as disappointed by the political world he had helped to create. “Our real disease…is DEMOCRACY,” Hamilton wrote to a Federalist comrade the day before his fatal duel with Aaron Burr. “Every day proves to me more and more that this American world was not made for me.”

Politically remote from the Federalists but also unsure of the project of popular coalitional politics was the young W.E.B. Du Bois, who initially sought to advance the cause of Black freedom through the efforts of the “talented tenth,” an intellectual and cultural elite who would pave the way. Yet Du Bois’s vision, Shenk tells us, was not unique to him: The whole concept of the “talented tenth” was the product of a growing progressive elite who had come to believe that appeals to the majority were fruitless. Instead, political change would result from the mobilization of the most intelligent, creative, and disciplined. Du Bois did not abandon popular democracy altogether; in the 1910s and ’20s, he combined his vision of a progressive elite with an effort to encourage Black voting, believing that Black political mobilization wherever it was still possible was the key to ending segregation. But the “solid South” and the complete absence of democracy there made this seemingly pragmatic vision completely utopian. With the capitalist world sliding into the Great Depression, Du Bois began to question his own elitist preferences, as the upsurge of popular politics in the 1930s led to eviction protests, unemployed councils, sit-down strikes, labor uprisings, and the first glimmers of an interracial movement that could take on segregation and racism. Turning away from his earlier faith in meritocracy, Du Bois embraced a communist politics that insisted that mass mobilization was necessary for social change. In the midst of this turn, Du Bois published Black Reconstruction, his pathbreaking work of history and political theory that showed how enslaved people, waging a “general strike” in the South, helped turn the tide of the Civil War. As Du Bois wrote, “Charles Sumner did not realize, and that other Charles—Karl Marx—had not yet published Das Kapital to prove to men that economic power underlies politics.”

Du Bois was hardly alone in experiencing this sudden, rapid change in attitude toward protest politics during the New Deal years. During the Progressive era, educated elites imagined that they were there to rescue the nation from the unlettered, disruptive masses. But with the status quo dissolving into the despair of the Depression, the social unrest that had once seemed frightening to those same elites now appeared justified. What was more, for politicians such as the New Dealers, the upsurge of protest seemed to hold out the possibility of an alliance that could be harnessed to press through sweeping reforms. Nor were these developments limited to the United States. The democratic ethos and the hatred of inequality and hierarchy—of “royalists” and all those who would rule without popular consent—that drove the sit-down strikes and rebellions of the 1930s would also inspire a passionate resistance to fascism abroad: to Franco in Spain, to Mussolini’s invasion of Ethiopia, to the Nazis in Germany.

This broad defense of popular politics did not continue in the postwar years. As tensions grew between the United States and the Soviet Union, a new elitist sensibility soon became the defining feature of Cold War liberalism. For many liberals, the frank populism of the New Deal years had been exciting in the moment, but in the aftermath of the war and in the early Cold War it seemed embarrassing—a stagy politics that had been manipulated from above. Though Du Bois did not retreat from his commitments, he did become increasingly disaffected from American politics. “The result of the election I cannot change,” he wrote of Dwight D. Eisenhower’s victory in 1956, “but I can at least refuse to condone it. I can stay home and let fools traipse to the polls.” The new Cold War elites running Washington seemed to share his antipathy. Shortly before his 83rd birthday, Du Bois was indicted by a grand jury on the charge of acting as a Soviet agent. (The case against him, which hinged on his peace activism, was slim even by the McCarthy era’s standards and got thrown out by the judge.) Du Bois regained his confiscated passport in 1958, and a few years later he left the country of his birth for Ghana. He never returned.

As Realigners moves closer to the present, it focuses on how the right—not liberals or the left—eventually returned to the project of building a political majority. Phyllis Schlafly, the founder of the Eagle Forum and the architect of anti-feminism as a social movement, is presented by Shenk as “one of the most talented political organizers in American history.” Initially, he reminds us, Schlafly focused almost entirely on the anticommunist crusade, rallying conservatives within the Republican Party against the “kingmakers” she believed were exercising a malign control. Republicans should provide “a choice, not an echo,” as she titled her broadside in support of Barry Goldwater’s presidential campaign. But when this argument ceased to have much traction after Goldwater’s electoral trouncing in 1964, Schlafly’s focus changed. She sought to appeal instead to a constituency of housewives and homemakers that no one had paid much attention to before, telling them that all they treasured was endangered by the liberal backers of the Equal Rights Amendment. Stitching together a coalition of older opponents of New Deal liberalism and new recruits to the anti-feminist cause, she helped create the New Right, which would eventually bring Ronald Reagan to power. The contrast of Du Bois and Schlafly suggests that in Shenk’s view, the right has in some ways been more successful in building this kind of coalition than the left.

Schlafly comes off in Realigners as far more successful than the person described in the book’s final chapter: Barack Obama. Parts of his story are already well known, including his multicultural background, his Kansas grandparents, his youthful radicalism at Columbia (“some species of GQ Marxist,” as one friend put it), and his time as a community organizer on the abandoned, polluted South Side of Chicago. Tired of working for change as an activist on the periphery of power, Obama decided to attend Harvard Law and try to shape the Democratic Party from within. While at Harvard, he wrote a book, Transformative Politics, with a law school friend.

Working through this never-published manuscript, Shenk describes how the young Obama wrestled with questions about the left’s way forward in the aftermath of the civil rights movement. (“The left generally and blacks in particular stand at a crossroads.”) Rejecting any notion of utopian transformations, Obama argued that people who want to improve conditions for Black Americans should make common cause with working-class white people and build broadly social-democratic programs: public works, national health care, investment in higher education. This pragmatic interracial organizing focused on the welfare state would, he believed, dissolve racial antagonisms. Needless to say, this did not happen. After he was elected president, Obama found himself (in Shenk’s words) “a charismatic leader who failed to convert personal popularity into structural change, another prisoner of fate.” Maybe he could build a majority to win an election, but this did not translate into the muscle to make real the changes he claimed to seek.

Why not? One of the complex aspects of Shenk’s book—and, to some extent, our larger political situation—is that the meaning of elections is taken for granted. On one level, an election is a campaign that results in victory or defeat for a candidate, a party, a political agenda. But anyone who has ever participated in an electoral campaign knows that campaigns are also organizing events. They help to marshal political groups and provide a chance to shape ideology. They are a form of political education as well as a contest for power, and they reflect power as much as they shape it. As we have seen over the past two years, losing an election can itself become a strange kind of organizing opportunity. There is a way in which Shenk’s book seems to emerge out of the new electoral organizing of the Democratic Socialists of America: It is the book of a left that doesn’t want just to be morally right or to dissent from the peanut gallery, but that has a vision of winning and all that might come after.

And yet, what really produces “realignment”—not just of electoral majorities, but of the status quo? And how important are elections as single events? What about the long, slow efforts to shape the overall distribution of power in society—the organizing that happens outside of the electoral realm? Union organizing campaigns, political magazines, the sudden flare-up of social movements such as the Greensboro sit-ins or Occupy Wall Street, the organizing institutes and long-term education projects, the mundane behind-the-scenes business lobbying: Might these be more important than the project of stitching together an electoral majority, because it is through such efforts that political constituencies are born? After all, those moments when things really do change—the 1850s, the 1930s, the 1960s—have usually been fraught with crises far outside electoral politics.

Even more, they have also been moments of economic and political rupture—eras when it is simply no longer possible for society to move forward as it has in the past. What is possible at any given moment shifts with history. Seen this way, winning an election is less a singular moment of realignment than the final step in a long chain of events through which political institutions and leaders are brought into alignment with a new reality.

Despite the many ways that Shenk’s book echoes The American Political Tradition, it is a product of this moment just as much as Hofstadter’s was of his. The historian of the 1940s looked at a society that had survived the cataclysms of the Great Depression and World War II without fundamentally changing its core institutions and its commitments to property and capitalism. Even though many saw the New Deal and its aftermath as marking a historical break, Hofstadter pointed to the lines of continuity. For him, the problem was the tragedy of stasis. For Shenk, on the other hand, the problem is different: explaining why, despite all the evidence to the contrary, society can sometimes, somehow, actually change—that things don’t have to continue as they are now, no matter how deep the paralysis seems. In a moment as bleak as ours, it is helpful to remember that, as Ignazio Silone once put it, “revolutions are facts too.”



Rise of the Far-Right Ultras

In Far-Right Vanguard, John Huntington shows just how porous the dividing line has been between the far right and mainstream conservatism.

Kim Phillips-Fein
January 11, 2022

In 1956, the former commissioner of the Internal Revenue Service made a surprising political turn: He announced in an essay in The Washington Post that he saw taxation as a Marxist scheme to “bring capitalism to its knees.” Even though T. Coleman Andrews had served in government only a year before, under Republican President Dwight D. Eisenhower, once out of Washington he turned against the entire enterprise of the modern state. Any progressive or liberal, he insisted, was “either a dupe or, at heart, a dictator.”

Andrews’s bold words made him a hero within a growing world of right-wing activists, and they drafted him to challenge Eisenhower for the presidency. His supporters were a motley crew comprising members of For America (an organization that built on America First, which had opposed US entry into World War II); Southerners who hoped to block the integration of public schools in the wake of Brown v. Board of Education; and business opponents of labor and the welfare state. They were drawn to Andrews’s dire political vision and to his depiction of the United States as being on the verge of a communist takeover. As one supporter put it, “It matters very little if…Roosevelt or Eisenhower is a Communist or not. What does matter is that they have advanced the Communist cause and American Liberals, by participating in the advance of the cause of Communism are unwitting dupes of the International Communist Conspiracy.”

Running as the candidate of the States’ Rights Party, Andrews won just over 111,000 votes in the 1956 election. At the time, the liberal mainstream dismissed the far-right constituency for which he spoke as politically marginal. Such activists (along with the supporters of Senator Joseph McCarthy) would later serve as the prototypes for the deranged, pathetic wackos that Richard Hofstadter chronicled in his famous essays on the “paranoid style” in American politics and the rise of “pseudo-conservatism”—freakish figures desperately clinging to national identity and social status who were to be pitied more than feared. Confident in the telos of liberalism as seen from his perch at Columbia University, Hofstadter concluded that the right was hysterical, a fringe force that might be disruptive but would never prove dominant. But were people like Andrews and his supporters merely on the margins of American conservatism, or were they representative of its ethos and worldview?

Ever since Hofstadter published his essays, historians have taken issue with his dismissive stance. The scholarly consensus has shifted to an interpretation of the conservative movement of the mid-20th century not as a mobilization of zealous cranks but rather as a force that must be taken seriously, its leaders motivated not by paranoia or rage but by deeply held ideas and a canny understanding of their interests. However, as the far-right end of the American political spectrum has grown in recent years—from QAnon and Tucker Carlson to the Capitol rioters and those making death threats against school board officials—and as substantial parts of the Republican Party have actively encouraged or tacitly benefited from this shift, it is also worth asking how marginal the followers of people like Andrews truly were. When it comes to American conservatism and the right, how should we think about the relationship between fringe and mainstream?

The historian John Huntington’s Far-Right Vanguard offers the fullest portrait yet of the ultraconservative mobilization of the 20th century. Whereas many scholars of the right have focused on the self-conscious development of a conservative movement that took shape in the 1950s, espousing such high-minded principles as individual freedom, support of the market, and opposition to communism, Huntington pushes the story back to World War I and the 1920s. He suggests that we should see the rightward edge of American politics as itself a spectrum, with the white-power militias and the neo-Nazis, the vigilantes and those who peddled a belief in The Protocols of the Elders of Zion, as the farthest extreme.

When we take this broader view, it becomes clear that the ultraconservatives were neither completely marginal nor assimilated into a mainstream right. Huntington argues that they made up the “base of the conservative movement”: its networks, its members, its readers and listeners, the people who tuned in to early right-wing radio programs, showed up at rallies to defend the House Committee on Un-American Activities, subscribed to National Review, and canvassed for Barry Goldwater. For Huntington, they were at once “fellow travelers and acerbic critics” of modern conservatism, forming a “vanguard” of the movement that helped to entrench the melodramatic tropes and conspiracy theorizing that remain so foundational today.

Huntington acknowledges that there is a real distinction between the far-right activists he describes, who believed that the United States was headed for an apocalyptic confrontation but who did not themselves engage in acts of terrorist violence, and right-wing extremists such as the Minutemen and the Ku Klux Klan of the 1950s and ’60s, who stockpiled arms and assassinated civil rights activists. But he also insists that the dividing line between the ultras and the mainstream right was and still is far more “porous” than has been commonly understood. Activists, leaders, ideas, and resources flowed easily between the world of the ultraconservatives and their more reputable mainstream comrades. The two political communities were separated more by tactics than by ideology. Here, in this driven, Manichaean mobilization of the mid-20th century, is where we can find the predecessors of today’s politics of reaction.

Huntington starts his narrative in the 1920s. The opposition to the New Deal and to Franklin D. Roosevelt, he argues, built on the political conservatism of the previous decade: the revival of the Klan as a mass political movement bent on protecting a white Protestant America from immigrants and Black Americans; the culture wars focused on keeping the teaching of evolution out of schools; the fear of Bolshevism and anarchism following the Russian Revolution and the strike wave of 1919. All of this inculcated a vision of a besieged America that crystallized in opposition to the New Deal.

Huntington describes leaders like James A. Reed, the former Missouri senator and the founder of the Jeffersonian Democrats, who argued that FDR had usurped control of the Democratic Party. An inveterate racist, Reed warned that the League of Nations would prove to be a vehicle whereby the “degenerate races” of the world could dominate whites and insisted that New Dealers would “rush into homes, spy upon people, shoot down citizens without warrant and without right…and undertake in every imaginable way the supervision and regulation of humanity.” Reed was a lifelong Democrat, but with Roosevelt in the White House, he called for reclaiming the party of the Solid South. As he put it, “You cannot make socialism and communism democracy by calling them the ‘new deal.’”

Reed worked with other opponents of the New Deal in the 1930s like the American Liberty League. They never broke away to form a new political party but instead attempted to build support for Alf Landon, the Kansan who challenged Roosevelt in 1936 on the Republican ticket. Although unsuccessful, their efforts left behind an ideological framework that adapted the “anti-communism of the First Red Scare for a new era of preponderant liberalism.”

In the 1940s, this emerging community of ultras found new cause for outrage in the expansion of the federal government during and after World War II. Sewell Avery, the president of the Chicago-based mail-order retail company Montgomery Ward, became a hero to the right when he refused to recognize or negotiate with a local union that his workers had organized in the spring of 1944. The federal government took over the company, and Avery was carried out of his office by military police, insisting to the press that “the kind of slavery that is being gradually put into effect now with government help is far worse than anything that ever happened before.”

Although the World War II era is often mythologized as a time of national unity, Huntington suggests that it instead “marked a continuation of substantial social and political divisions.” The fierce conflicts of the late 1930s that saw the rise of the House Committee on Un-American Activities, the intensification of class conflict following the sit-down strikes, and the flourishing of opposition to FDR in Congress did not dissipate with the United States’ entry into the war. On the contrary, for certain circles the growth in federal power symbolized by the Montgomery Ward controversy seemed a harbinger of tyranny to come.

The Cold War nurtured the extreme right, legitimating its fears while fueling the anxiety that not enough was being done to defeat the Red Menace. Even as mainstream Republicans grew concerned that Joseph McCarthy’s brandishing of lists containing the names of alleged communist sympathizers might damage their cause, the Wisconsin senator emerged as the ultras’ next hero; as Gen. Robert E. Wood, the former president of Sears, Roebuck, declared, “McCarthy is doing a great job that had to be done to put traitors and spies out of our government.” Billy James Hargis, the Christian evangelical leader, depicted the United States as divided between Christianity and communism. Rhetoric likening communism and socialism to disease and perversion was everywhere on the right. Another conservative writer of the mid-1950s spoke of a “highly-organized socialist conspiracy” that had “infected every artery of our country,” while Southern California Representative James Utt (echoing the Pizzagate fanatics of today) likened the welfare state to pedophilia: “The child molester always entices a child with candy or some other gift before he performs his evil deed. Likewise, governments promise something for nothing in order to extend their control and dominion” over the people.

The early successes of the civil rights movement—most notably Brown v. Board of Education—also served to spur the far right, as white Southerners joined the Citizens’ Council movement to challenge the Supreme Court and resist integration. By 1956, there were some 90 Citizens’ Councils in the South, which may have had as many as 250,000 members. Even though these were people who “blanketed the South with segregationist propaganda” and took as their inspiration a speech by Mississippi Circuit Court Judge Tom P. Brady warning that the United States faced a choice between “Segregation or Amalgamation,” they mostly eschewed the violence of the Klan. They were, they insisted, “respectable” members of society.

Within the broader world of the right, this double act existed as well. Republican politicians and conservative intellectuals distanced themselves from the outright racists and the anti-communist conspiracy theorists who believed that President Eisenhower was serving his masters in Moscow, and yet, like the ultras, they argued on “principled” grounds that the Supreme Court had overstepped its authority in Brown, that unions were akin to tyrants, and that communists should be barred from teaching in public schools.

Throughout the 1940s and early ’50s, the political community of the right was a big tent, with ultras mingling easily with Republican Party regulars. But that began to change in the late ’50s, as the ultras became a more cohesive force and as the possibility of winning elections began to seem within reach for conservative Republicans. With the establishment of publications like National Review, “gatekeepers” (as Huntington terms them) such as William F. Buckley Jr. began to draw a line between their version of conservatism and that of the ultras. When the Harvard-educated businessman Robert Welch founded the John Birch Society in 1958, it brought together many of the strains of the far right into a single political organization and forced the strategic questions into the open. The Birchers now had to decide how to use their clout and whether they would try to work within the Republican Party or form a third party. Conservative politicians were faced with a similar dilemma: Should they accept the support of the Birch Society and other ultras, even though this might make it harder to win over centrist Republicans and other voters?

When Richard Nixon tried in 1962 to repudiate the Birchers while running for governor of California (despite their strength in the southern part of the state), he lost the election. With the rise of Arizona Senator Barry Goldwater in 1964, both the ultras and the Republicans had their answer. Goldwater was willing to associate with and to encourage his far-right supporters, but he did so through the vehicle of the Republican Party: “Extremism in the defense of liberty is no vice!” he thundered at the party convention that year. His ultra politics helped to doom his candidacy, as the press portrayed him as mentally unstable and trigger-happy. But at the same time, even in the midst of Goldwater’s defeat, the most committed of his supporters saw signs of victory.

The Conservative Society of America peddled neon bumper stickers announcing “27,000,000 Americans Can’t Be Wrong.” The ultras continued to mobilize, backing George Wallace’s independent presidential bid in 1968 and flirting with the idea of starting a third party—and as they did, Republicans began to worry that they had taken the support of the ultras for granted and might be “losing their right wing to Wallace.” For Huntington, the defection of the far right to an independent candidate suggests its partisan flexibility. But with the Wallace campaign, the Republicans also realized not only that they needed to court and flatter the far right but that, by doing so, they might appeal to dissenting Southern Democrats as well.

Huntington’s narrative draws to a close in the 1970s, when the rise of Reagan’s more mainstream style of conservatism siphoned off some of the passion of the ultras, while the anti-communism that had once held the mobilization together ceased to be such a powerful unifying force. But it is not hard to see the impact that midcentury ultraconservatism had on the conservative institutions of the United States—the Republican Party above all, but also right-wing media, think tanks, and the movement’s general gestalt. A new conservatism had emerged out of the center-right’s dalliance with the far right: The extreme rhetoric, the effort to build a separate media universe, the fanatical celebration of free markets, the hostility to bureaucracy and the public sphere itself, and the willingness to denounce political opponents as “traitors”—all hallmarks of the ultra-right during the midcentury—became representative of conservatism as a whole. From Newt Gingrich to the Tea Party and finally to Donald Trump, the harsh motifs and ferocious imaginary of the far right had been merged into the conservative mainstream.

Far-Right Vanguard chronicles a network of activists who will not be familiar to most readers today. People like the Louisiana firebrand Kent Courtney, founder of the Conservative Society of America, and the Los Angeles libertarian Willis E. Stone, who pushed for repeal of the 16th Amendment, are not exactly household names. But the book’s real value lies not so much in its research as in its interpretative frame.

Historians have long known about the kinds of groups that Huntington describes here. But in most other accounts, the moments of the right’s greatest success come when its leaders are able to reach beyond those circles and appeal to a larger electorate, to define a politics that distances them from the extremists. Huntington reverses this order: The fans of Sewell Avery, the backers of For America, the Citizens’ Councils, the people who believed that the income tax was a Marxist plot—these were the ones who actually powered the right and pushed it forward. They were the voters, the supporters, the contributors, the subscribers. As the historian David Walsh has argued, even if people like Buckley sought to distinguish the genteel intellectualism of National Review from the loony paranoia of the Birchers, in reality his magazine succeeded precisely because it was able to reach out to the base that the far right had created. He raised funds for National Review from ultra donors and published the writers who wrote for American Mercury, which trafficked in anti-Semitism. And in truth, Buckley shared more with the far right—including his defense of McCarthy and his hostility to the civil rights movement in its early years—than his admirers might like to remember.

Throughout the period Huntington describes, the intellectual and ideological overlap between the “mainstream” right and the ultras was always more extensive than later conservative politicians cared to admit. They shared the libertarian antagonism toward government, the passionate hatred of unions, and the fierce critique of federal action to protect the civil rights of Black Americans. Their differences mattered less than their common ground.

Long after the end of the Soviet Union, conservatives continued to rail against socialism. As the political culture turned rightward, the vitriol of the far right only intensified. Today, the organizations, the worldview, and the “paranoid” political style that in the 1950s seemed like relics now appear instead to be the predecessors of the vaccine resisters, the school-board activists, and the backers of Trump. “Indeed,” as Huntington notes, “Trump and the modern Republican Party represent the culmination of the long ultraconservative movement.”

But is this the whole story? While the intransigent rhetoric of the midcentury ultraconservatives is echoed by the right today, in other ways the ultras seem like a very distinct mobilization. Most of Huntington’s focus is on the leadership of the movement rather than its social base. But the far-right politics he traces seems to have found its most loyal supporters in the world of business—specifically among the midsize manufacturers and affluent suburban professionals who might have been most directly challenged by the changes of the New Deal. Out of the real disruption to their power in the workplace—Sewell Avery, outraged that the federal government had the temerity to challenge his basic property rights and tell him to recognize a union!—they fashioned a broad conspiracy theory in which the reforms of the New Deal really were a communist takeover.

The same was true, in different ways, for the white Southerners who joined the Citizens’ Councils. After all, they were not wrong to perceive that something was changing, that the absolute power that had once belonged to them over the lives of Black people was being hemmed in. But their rage and sense of betrayal morphed into a fantastic vision of the world in which they were entirely victimized by malign, mysterious outsiders. Anti-communism bound the movement together and gave its incoherence a clarity it would otherwise have lacked.

Today, by contrast, the social base of the far right—while not entirely clear—seems likely to include fewer people who are actually the owners of manufacturing enterprises, who might be considered members of a genuine (if local) social elite. They are not necessarily downwardly mobile or economically desperate: The January 6 insurgents, for example, seem to have included many white-collar, middle-class employees—doctors, architects, people in marketing. Some may own businesses, but these are probably smaller than the manufacturing enterprises that fueled the postwar right. No longer can their mobilization be understood simply as a defense of actual economic practices or a literal Jim Crow state; in many ways, it seems as if the privileges these people seek to preserve are in fact illusory. Similarly, the liberalism that the right rails against is not really that of the New Deal any longer—it is instead the strange mash-up advanced by the contemporary Democratic Party, which concedes so much social and political authority to business from the start.

Where the organizations and individuals that Huntington describes sought to defend a particular social order out of which they had emerged and in which they had prospered, today’s right (especially in those parts of the country that have been ravaged by plant closures and deindustrialization) channels a widespread sense of powerlessness that has come unmoored from any material interpretation. Its energy, at times, seems to come from the untethering of the white working class from the collective institutions, such as labor unions, that once helped to give its members a way of understanding themselves and their struggles as part of a broader social conflict and at their best helped to foster a spirit of common purpose and solidarity. In the absence of these connections, many of these workers may have the sense that they are perilously isolated, forced to confront a ruthless world as lone individuals—their only commonality and refuge being that of race. Today’s far right is shot through with the rhetoric of entrepreneurship alongside fantasies of racial displacement. The ravening embrace of market individualism and the presumed creativity of the businessman, constantly reiterated and celebrated at the top of the social order, are reinforced by the lived economic experience of the self-employed, the middle managers, the small business owners—groups pressed between ideals of autonomy and an increasingly fragile economic reality—and by the precarious nature of many jobs today, whose temporary, insecure, and degrading qualities are justified by the dream that they are only the prelude to a big break. Anti-communism as the glue holding the movement together has been replaced by a frenetic emphasis on self-preservation and self-enrichment—aspirations that must be defended against those immigrants, foreigners, and people imagined as racial subordinates who threaten to usurp the material security and wealth that ought to go to those most deserving, to the true Americans.

The libertarian vision of mid-20th century conservatism helped to create the atomized social world that powers the far right today. What is remarkable, then, is the rise of a new strain of conservatism that actively seeks to distance the movement from the free-market faiths of Goldwater and Reagan. Articulated in its most elaborate form by writers such as Michael Anton at the Claremont Review of Books, this conservatism tries to position itself as speaking for a muscular working class, castigating “globalism” and the soulless power of cosmopolitan elites who are sapping the energy of the virtuous (and male) people of the nation. An earlier version of this framework animated the 1992 and ’96 presidential campaigns of Patrick Buchanan, whose trajectory illustrates how closely its typology veers toward anti-Semitism and the sensibility of fascism. At the same time, most of the actual prerogatives of the wealthy and the extremely wealthy have continued to be defended by Trump’s Republican Party, so that the policy agenda put together in the economic departments and think tanks far from the Birch Society coffee klatches has remained germane—not replaced entirely by Trump’s posturing against free trade. Yes, at the end of the Trump years, organizations like the Business Roundtable were suddenly horrified by the specter of insurrection and a challenge to the legitimate transfer of power—but until then, they were happy to take the tax cuts.

The contemporary right, in other words, may echo certain preoccupations of the postwar ultras, and it may borrow their melodramatic tone. But socially and politically it is different, as is the broader context in which it operates. In his 1975 article “The Lower Middle Class as Historical Problem,” the historian Arno Mayer described an “inner core of conservatism” that might be embraced by the lower-middle classes, especially in moments of “acute social and political crisis.” Mayer might not have imagined how the ranks of this social group would swell with the decline of unions and the revival of ideas of self-branding and going it alone. But he described the way that the office professionals, small shopkeepers, accountants, and marketers—the predecessors of today’s Internet merchants and Web entrepreneurs—would join together to protect the power of “those higher social classes and governing elites on whom [they] never cease to be dependent and for whom [they] feel envy exacerbated by resentment.” This element of social analysis is needed to make sense of what we are living through today.

Far-Right Vanguard borrows a metaphor of the left—the “vanguard”—to describe the ultraconservatives of the 1940s and ’50s. They pioneered a political style that remains potent today, an ideological infrastructure that underlies the Trump faction of the Republican Party. But in resurrecting this world and showing its centrality to the emergence of the postwar right, Huntington forces the reader to consider how different our own time may be—and the possibility that the resurgence of the right today not only builds on the legacies of the 20th century, but may be threatening and dangerous in new ways.



Joe From Scranton
https://www.thenation.com/article/politics/evan-osnos-biden/
Kim Phillips-Fein
Apr 20, 2021

When Joe Biden was in his early 20s, his new girlfriend’s mother asked him what kind of job he wanted. “President,” he replied, “of the United States.” A college senior at the time, Biden must have appeared brash and full of himself: Who would announce such a goal to someone he presumably wanted to take him seriously? But perhaps he knew something no one else did. Though it took a while longer than he might have hoped—and involved two earlier presidential bids, each embarrassing in its own way—Biden has made it happen at last.

Of his previous attempts, the 1988 run was likely the more disastrous. It began with Biden promising generational change as he declared his candidacy at the Amtrak station in Wilmington, Del., and it ended with his withdrawal from the race amid allegations of plagiarism. The campaign also had the dubious honor of being featured in Richard Ben Cramer’s What It Takes, an exhaustive portrait of the leading contenders for the nomination that year, which painted none of them in a flattering light, Biden least of all. A proud but insecure man, he was depicted as a legislator always pushing on to the next plan before accomplishing the previous one. He was a magnet for those who saw him as a rising star in the Democratic Party, but he kept his aides up to all hours of the night in meandering conversations. He loved to give speeches and work the crowd, yet his most inspiring lines were often stock phrases aimed only at winning the race. He wanted victory but was never entirely clear about what he would do with it once in office.

But missing from this skeptical portrait was a sense of the politics that motivated Biden to enter the scrum in the first place. Even though his ambition was evident, its source was less so: He had always wanted to be president, but why? What political goals drove him, what dreams kept him running? Even Biden himself, at least in Cramer’s account, wasn’t always sure. In the buildup to his announcement in 1987, the self-proclaimed “son of Delaware” was plagued by self-doubt: “He could not find that overriding reason why he should be President, why he was going to be President, what he was going to be President for.”

Journalist Evan Osnos’s new book, Joe Biden: The Life, the Run, and What Matters Now—one of the latest installments in the literature on Biden’s life—doesn’t really answer this question either. But it does distill the political confusion, both the possibilities and the limits, around Biden’s presidency. For Democrats and liberals, the most pressing demand in the 2020 election was to defeat Donald Trump, but they often seemed less concerned about who would be elected in his place. The result, according to Osnos, is that we’re now left with an “urgent appetite, at home and abroad, to divine what had made” the 46th president—“how he thought, what he carried, and what he lacked.”

In a way, this uncertainty reflects our moment’s deeper tensions between stasis and change. Is what’s needed a restoration of the “normal”—of the world as it was before the pandemic and the destructive narcissism of the Trump presidency? Or does countering the virus and the dynamics that led to Trump require a broader political shift—an attempt to create a society defined by greater solidarity and egalitarianism and the expansion of freedom that these might permit?

Throughout the book, Osnos presents Biden as a man open to either prospect. In his telling, Biden is a figure who may well be willing to use his power to press for sweeping legislation on climate change, to invest in caregiving, to make it easier for workers to organize, and his first months in office have suggested the possible return of a style of liberalism absent for a generation. Here is Biden, giving a Twitter speech that is the most pro-union statement from a president in a generation! There he is, offering his support to an infrastructure bill that relies heavily on taxing corporations and appears to draw on the kind of legislation that only recently was denounced as a “green dream” by Speaker of the House Nancy Pelosi. His coronavirus rescue measures not only supply aid to state and local governments that might stave off cuts to social services and education, they provide support that will make life significantly easier for working-class and poor families (albeit through the dubious mechanism of tax credits, as the journalist Liza Featherstone has noted). Yet Biden has also backed away from a $15 minimum wage, remains opposed to Medicare for All, and has made clear his commitment to traditional US military power by authorizing a bombing strike in Syria.

Osnos sees Biden as a man compelled by the pandemic to become receptive to a politics he would have rejected only years before. But as many have observed, and as even Osnos suggests, there is little in Biden’s political history that indicates any certainty that this will continue. Once the more immediate emergencies of Covid and the current economic crisis have passed, will he have the will and the desire to realize a political program centered on creating a more just and egalitarian society—permanently expanding the welfare state, empowering workers to organize, embarking on the kind of public investment needed to blunt climate change?

Although Biden now portrays himself as “Joe from Scranton,” the child of hardscrabble northeastern Pennsylvania, one has to go back only a generation to find that much of his family hailed from money. His grandfather was an executive at the American Oil Company and enjoyed a life of privilege. But Biden’s father failed in many of his own business ventures, and a family that had started near the top of the economic pyramid tumbled down—so much so that by the time Biden was 10, they had left Pennsylvania to settle in Wilmington, where his father began selling used cars.

As tenuous as his father’s position may have been, Biden’s childhood in Wilmington was anything but working-class. He went to a private day school where, as Osnos puts it, he was a “middling but popular” student. He attended the University of Delaware and went to Syracuse Law School, primarily to be close to his girlfriend Neilia Hunter, who would soon become his first wife. At Syracuse, Biden horsed around and almost flunked out, getting in trouble for writing a paper that used the work of others without the proper citations. In these years, he was peripatetic and jocular, seeking to include friends in possible business deals and making time for the family reunion, the first communion of a niece or nephew, or a gathering at the pizza shop with old school pals.

An affable guy, sure, but a president? In an age of student rebellion, Biden proved distinctly uninterested in the issues raised by the Vietnam War and the civil rights movement. An institutionalist from the start, he instead looked to electoral politics—not to bring the politics of his generation into the halls of power so much as to enter them himself.

After graduating from Syracuse, Biden returned to Wilmington, where he briefly served as a public defender before running for the New Castle County Council. From this post, at the age of 29, he challenged Republican incumbent J. Caleb Boggs for his seat in the US Senate. Boggs had represented Delaware as a senator for 12 years, but Biden managed to win the seat.

In some ways, Biden’s defeat of Boggs has parallels with Alexandria Ocasio-Cortez’s toppling of 10-term Representative Joe Crowley in her first congressional race. But unlike AOC, whose political awakening came during Bernie Sanders’s 2016 presidential campaign, Biden was not really part of the social movements of his day. He had never been very involved in the anti-war or the civil rights movement, and he kept his distance from the New Left. He was more like a 1972 version of Pete Buttigieg, absent the academic credentials: a clever young man who could embody youthful energy, even passion, but without an oppositional politics.

Once in the Senate, Biden quickly became known as the “Democratic Party’s leading anti-busing crusader,” Osnos writes, opposed to the use of busing to desegregate Wilmington’s schools. In many ways, he amassed a record that one might strain to term “moderate,” voting for the 1994 crime bill, for welfare “reform,” and for the war in Iraq. As head of the Senate Judiciary Committee during Clarence Thomas’s confirmation hearings, Biden refused to allow testimony from other women concerning instances of sexual harassment by the Supreme Court nominee similar to those reported by Anita Hill. But rhetorically, at least, he has always tried to present himself as a politician for the downtrodden, a leader who would restore Americans’ declining faith in government.

Biden was never quite like the other New Democrats, a roster that included his contemporaries Gary Hart, Bill Clinton, Al Gore, and (years later) Barack Obama. His approach was never as technocratic as theirs, his commitment to markets never as ideological. Instead, he has long sought to appeal to the white working class, to position himself as part of it, even if this was as much a question of salesmanship and fantasy as anything else.

Given that it wasn’t backed by much substance, this ambition could also get him into trouble. The turning point of his 1988 campaign came when he lifted a speech directly from Neil Kinnock, the head of Britain’s Labour Party, in one of the primary debates that year. “Why is it that Joe Biden is the first in his family to ever go to a university?” he asked. “Was it because our fathers and mothers were not bright?… My ancestors, who worked in the coal mines of Northeast Pennsylvania and would come up after twelve hours and play football for four hours?” Biden, however, never had any coal miners in his family; the words were taken from Kinnock’s speech. (Indeed, as the historian Gabriel Winant has written in The Guardian, Biden’s only personal connection to the coal industry was a distant relative who died in 1911 and had actually owned a mine.) He’d cited Kinnock before, so a case could be made that in the pressure and anxiety of the debates, Biden simply forgot to attribute the source. But the press soon uncovered his earlier plagiarism incident in law school, as well as a cringe-inducing failure to credit Robert F. Kennedy for the lines in another campaign speech, and shortly thereafter, Biden was out of the race.

The imbroglio with Kinnock’s speech reflected Biden’s larger plight as a politician: He wanted to present himself as a standard-bearer for the old working class, while lacking any real connection to its members and not necessarily doing all that much to advocate on their behalf. In the end, however, these gestures to a working-class style without a working-class politics paid off for him. It was precisely his image as Scranton Joe, someone who could speak to blue-collar workers, that made him attractive to Obama as a running mate in 2008. The Illinois senator, a university professor with a reputation for aloofness and playing it cool, saw in the political veteran a valuable point of identification. Here was someone who could signal Obama’s own moderation and help him connect with older white voters. And this symbolic identification stripped of a larger political program was partly also what brought the Democratic establishment to rally around Biden in 2020. As he assured wealthy voters at an early fundraiser, “Nothing would fundamentally change.”

If Biden’s attempt to identify with the working class has always been more aspiration than reality, the key aspect of his personal history that Osnos emphasizes is his experience of tragedy. The shocking death of his first wife and their baby daughter in a 1972 car accident that left him a single parent with two young sons, both injured in the crash, was only the first of his personal trials. More would follow: his own brain aneurysm and near-death in 1988 and, much more recently, the death of his son Beau from brain cancer at 46. Then there is his son Hunter, who has struggled with alcoholism and drug addiction for much of his life while also pursuing a career as a lobbyist and investor and getting entangled in shady deals with various foreign firms, with little regard for the effect it might have on his father’s career.

Biden has indeed suffered a great deal, and it is hard to imagine at times how he endured—in particular how he managed to sustain his political career during the years after the death of his wife and daughter, as his siblings and parents scrambled to help with his two small boys. But many journalistic accounts rely on these experiences to provide Biden with a political gravitas that isn’t always there. At points, the inescapable grief he has had to bear begins to stand in for the weight of history in his life and the limits of his own political achievements.

Osnos falls prey to this temptation to transmute Biden’s personal story into a larger politics. At times, the image that emerges is affecting. Describing how, during the 2020 campaign, Biden’s aides tried to have him speak on the phone every day with a “regular person,” Osnos recounts the day last spring when Biden was connected through a campaign worker with Mohammad Qazzaz, the owner of a coffee-roasting business in Dearborn, Mich., who had recently tested positive for the coronavirus and was attempting to quarantine in his house away from his two kids. His 2-year-old daughter kept knocking on his bedroom door, trying to get him to come out. Qazzaz choked up as he told Biden about how much he wanted to open the door for his little girl, but he was terrified of infecting her. Right away, Biden offered his own story of loss, a time when his two small sons didn’t understand the tragedy that had just happened to them all. He suggested how Qazzaz could connect with his daughter: by playing a game through the door, telling her a story. The two men were scheduled to speak for only a few minutes, but they wound up talking for more than 20.

Despite its staginess, this tale evokes Biden as a caregiver, someone who once had to stanch his own confused sorrow to guide frightened children. But for Osnos, it is also an example of his abilities as a leader for our moment. Biden’s experiences of loss, Osnos tells us, might make him the perfect president in a time when so many are dealing with death, illness, and despair. Drawing on an ability to empathize born of his history of grief, Biden may be a “weathervane,” Osnos suggests, a leader who will follow the calls for greater social spending, bold action on climate change, and steps toward racial justice. As the country reels from the public health crisis of the past year and the various pathologies that it has highlighted, the implication is that Biden may be willing to cast off the institutions and establishment he’s spent his life serving and push for more radical responses to the disasters unfolding around us.

Although Biden began his campaign with the narrow goal of defeating Donald Trump, Osnos argues, over the course of 2020 he began to awake to the reality that the “emergency” was even bigger than he had imagined. In this context, Biden might not provide “exalted rhetoric,” but “for a people in mourning, he might offer something like solace, a language of healing.”

Osnos is far from alone in this hope. For many liberals, Biden and Kamala Harris represent a resounding defeat for Trump and Trumpism. Their election demonstrates a repudiation of the racist right and the forces of white supremacy, as well as a rejection of Trump’s gleeful hostility to public health in the pandemic. The ascendance of Biden and Harris means the nation is back on course: Experts can reign once again; policy can be science we believe in; liberals can once more extol America as a beacon of hope on the global stage; and everyone can start talking about politics again in a civil and respectful way.

This overriding goal of restoring order and faith in competent technocrats is in tension, however, with the notion that Biden should press for a more sweeping response to the nation’s problems—bold, dramatic change that will be achieved not through civil discourse and the policies recommended by mainstream economists but through a politics of conflict and mass mobilizations. Whatever coherence this narrative may possess is provided by the notion that even though Biden is a creature of the establishment himself, he has a gift for understanding, through his own painful history, the struggles and loss that so many of his constituents face. Looking at Biden’s personal story becomes a way of eliding the world from which he comes and which has shaped his political commitments.

For critics on the left, the election’s significance is less clear and far less rosy. Yes, Biden and Harris won, but Trump’s continued strength and especially his appeal to working-class voters (and his notable gains among Latinx voters), as well as the weakness of down-ballot Democrats, suggest the “class dealignment” of the parties, as the historian Matthew Karp terms it. Meanwhile, the failure of the Sanders campaign, as the journalist Jamie Merchant and others have argued, points to the difficulties of trying to build a left politics at a time when many collective organizations—including labor unions—are on the defensive and when many people’s actual social experience is one of fragmentation, isolation, and fear. A politics that actually challenges entrenched power and wealth cannot simply be created from above, and it cannot be achieved without a considerable break from the past.

With the likes of David Brooks and Ezra Klein hailing Biden’s presidency as “transformational,” one popular theme to emerge has been that Biden is focused on policy rather than on political rhetoric; he has mostly chosen to stay away from divisive words. For all the comparisons to Franklin Delano Roosevelt, no money-changers are being thrown out of the temple here, and that’s all to the good.

Yet this cheerful assessment seems to breeze past the larger uncertainties, the tentative shifts, not just for Biden but for the whole country. Might not the absence of a political language, a framework to make sense of what’s happening, reflect the president’s own divided loyalties, not his canny strategic sense? If Biden was so willing to turn away from pressing for a higher minimum wage, what chance is there that he will work his old Senate connections to drum up support for changes to the legal infrastructure that would make it easier for people to organize unions, or use the presidency to lay out a moral vision that shames and marginalizes those legislators who oppose them? More deeply, will the steps toward reimagining the public sector represented by the coronavirus relief bill crystallize into a fuller commitment to the communal resources—health care, schools, day care, transit, parks—upon which we all rely? And absent a politics that has the power and the capacity to rally people against the deadly hierarchies that rule our lives, to really deliver a more just and equal society, won’t Trumpism—with its clear division of the world into enemies and friends, its crude and racist version of solidarity, its conspiracy theories that offer a compelling (if loony) explanation of power—continue to thrive, even if the Donald himself slinks off to a Twitterless obscurity in Mar-a-Lago?

These are the questions that only the next few years can answer, and the landscape may be shifting quickly, in the warehouses and hospitals and schools of the country as much as in the capital. But one thing is clear: The losses of the pandemic time are not simply the inevitable outcome of the natural order’s cruel twists of fate. They reflect in harsh detail the familiar inequalities of our lives. In this bleak landscape, the language of mourning and of solace is not enough. What we need is a language of outrage—and a politics of it too.



How Did American Cities Become So Unequal?
https://www.thenation.com/article/society/saving-american-cities-lizabeth-cohen/
Kim Phillips-Fein
Oct 19, 2020

A new history of Ed Logue and his vision of urban renewal documents the broken promises of midcentury liberalism.

As the coronavirus ricocheted through New York City this spring, among its many casualties was a certain image of life in the Big Apple. The foodie destinations, posh galleries, and pricey cocktail lounges sat deserted while city hospitals long scorned as antiquated, clunky, and ineffective became crowded, bustling centers of activity and pandemonium. If they didn’t decamp to their second homes, financiers and lawyers huddled in their apartments, and grocery store employees, doormen, UPS drivers, and postal workers all became consummate risk-takers. Spaces segregated from the middle class—homeless shelters, nursing homes, jails—were revealed as inextricably linked to the rest of the city on a microbial level, as the virus could not be kept out or contained within. In the pandemic city, the oft-praised prosperity of New York in the early years of the 21st century proved illusory or at least misdirected: a world of glittering condos and luxe hotels that somehow could not provide enough hospital masks to its nurses or figure out a way to keep its children safe.

The virus held up a mirror to the city that revealed a very different image from that of a gleeful elegance and striving opportunity: a distorted, cruel urban landscape divided between those with the means and resources to depart and those who had no choice but to keep taking the subway, even as the viral wave crested. Zip code maps showing infection rates in the Bronx, Brooklyn, and Queens compared with those in Manhattan told the story. When George Floyd was killed in Minneapolis, his death seemed to magnify the vulnerability, racism, and exclusion already evident, and New York, like many other places in the United States, justifiably erupted in protest.

As cities across the country and around the world struggle to cope with the ongoing pandemic, it is an opportune time to read Saving America’s Cities, Lizabeth Cohen’s excellent study of postwar urban planning. The period she chronicles is at once near and far from our own. Over the years that followed World War II, the federal government sought to address the problems of urban poverty and deindustrialization through a series of attempts at urban renewal. Much of the historical literature on these programs has focused on their failings—especially and most damningly with regard to public housing, which created what scholar Arnold Hirsch, writing about Chicago, called a “second ghetto.” Cohen, by contrast, views the contradictory legacy and aspirations of postwar urban liberalism as having much within them to admire, a case she makes by taking up the life of urban planner Ed Logue.

Logue’s name is little known today, but he was a renowned figure in the mid-20th century, dubbed “Mr. Urban Renewal” by The New York Times and the “Master Rebuilder” by The Washington Post. He brought federal redevelopment money to New Haven. He used public money to rebuild downtown Boston. He built the apartment towers on Roosevelt Island for New York City as well as affordable housing elsewhere throughout the state. For Cohen, his career was representative of a certain liberal hope: that professional expertise and federal funds could reverse the racial inequalities, suburban flight, unemployment, and disinvestment that plagued American cities in the second half of the 20th century.

Cohen makes a compelling case that a renewed faith in the public sector and the active participation of the federal government in rebuilding urban infrastructure and public housing are essential for any progress today, and she argues that Logue’s work demonstrates the potential of this approach as well as some of its successes, however partial. Yet at the same time, Saving America’s Cities is, purposefully or not, a study in urban liberalism’s failings and of the profound pressures that always constrained and shaped its aspirations and accomplishments. In the end, Logue’s career—which began with his trademark sunny confidence and brash appeal but ended with his back against the wall of declining federal money and slipping public support—cannot help but suggest the limits of postwar urban renewal as much as its possibilities.

Ed Logue hailed from the heart of urban America, Philadelphia. Born in 1921, he grew up the oldest of five children in an Irish Catholic household. His father was a tax assessor, a well-paid public sector job that enabled him to send his children to private school and to summer at the Jersey shore. But after he died when Logue was just 13, the family fell into near poverty. The charity of an uncle enabled them to avoid destitution.

Despite the family’s travails, Logue went to Yale on a scholarship. During his time in college, he became involved in New Haven’s labor movement. He worked in the Yale dining room and gained a reputation as a firebrand; a supervisor later told the FBI that Logue had been known for spreading “malicious rumors” about management, calling Yale administrators “dictators” and “slave drivers.” After he graduated, he went to work as an organizer for the local that represented the university’s janitors, maids, and maintenance workers. He may have been a radical, but he was also a committed anti-communist, skeptical of the Communist Party’s role in local politics and dismissive of its broader program. As Cohen puts it, he was a “rebel in the belly of the establishment beast,” a New Dealer who believed in the role of the federal government and in the power of labor unions, but he was not someone who sought broader social transformation.

Logue fought in World War II and then attended Yale Law School on the GI Bill. But he was quickly bored and alienated by the life of an attorney. “The law is a whore’s trade,” he explained to a law school mentor. “I don’t want a nice law practice for anything but the income, and I’m a son of a bitch if I’ll throw away ten or twenty years of my life building up an income.” He considered a career in labor law, but here, too, his ambitions were larger than the role. Being a labor lawyer meant serving the labor movement but not leading it. Logue found himself drawn to politics instead, albeit not as a politician but as the labor secretary to Chester Bowles, a former New Dealer (and head of the Office of Price Administration, which regulated prices during World War II) who had become the governor of Connecticut. Logue then traveled with his boss to India when Bowles became the US ambassador, observing with great interest the community development work the Ford Foundation was doing in rural parts of the country.

In 1954, Logue returned to New Haven to work with the city’s new mayor, Richard Lee. Federal support for suburban development was most evident in subsidized home loans and tax breaks for homeowners as well as road construction, but there was still ample money available for cities, and Logue was able to win more than $130 million in federal aid (over $1 billion in today’s dollars). Cohen details how Logue and Lee used this money to make New Haven a “model modern city” and an example of “what can be accomplished when a full-scale attack on blight and poverty is undertaken.” With federal aid as well as foundation money, Logue and Lee rebuilt schools, developed an industrial center, and helped reshape the downtown commercial district. Logue also helped give shape to an increasingly familiar figure in East Coast cities: the urban planner who would operate with political reserve to create a city in the best interests of all.

With the emergence of civil rights and Black activism in New Haven, many of the problems with Logue’s model of urban planning came to the fore. The kind of renewal that Logue promoted at times involved displacing poorer residents and people of color. It also privileged a certain kind of rule by experts rather than the voices and power of the people most directly affected. Throughout the United States, the anti-communist liberalism embodied in Logue’s top-down urban planning was coming under pressure from below—from the civil rights movement and the New Left, from people who did not want to fight in Vietnam, and from those who rejected the terms of the postwar affluent society.

The tensions in Logue’s liberalism were on full display in New Haven. At first, the city’s Black residents greeted urban renewal with optimism, recognizing that it held the potential for them to gain better housing. But over time these hopes faded, especially as many displaced by redevelopment found it difficult to find new homes and were often relocated in neighborhoods or projects that were no improvement over what they left behind. Community organizers began to show up at public meetings to criticize Logue and his fellow planners for making decisions about their city without consulting the people who would feel the strongest impact of those choices. As one area resident said at a 1967 hearing held by former Illinois Senator Paul Douglas on redevelopment in New Haven, “You people have listened to the Mayor. How about listening to us?”

By the time of that forum, Logue had left for Boston. He claimed he moved on because of his program’s success. “I had done about as much as I could do in New Haven except stay the course and see it through,” he recalled. “After a while there’s not much challenge in that.” But it was already becoming clear that his complacent sense of optimism about New Haven as a model city was unjustified. As early deindustrialization set in, the city’s racism and racial inequality became only more glaring. Local white resistance to school desegregation expanded, and Yale continued to benefit far more from its relationship to the city than the city’s residents did, especially those of color.

Logue’s time in Boston was equally marked by victories and frustrations. There, too, he relied centrally on leveraging federal money to stimulate development and also came up against the challenges of a declining industrial economy and white resistance.

He began his time in the city with ambitious plans to rebuild its downtown. He oversaw the construction of Government Center, a new set of headquarters for city, state, and federal offices that helped encourage commercial development. Building on this success, he embarked on a series of renewal programs in Boston’s neighborhoods, where he sought to encourage racial integration by improving housing and constructing infrastructure and services, including better public schools. In practice, getting white and Black Bostonians to live side by side in racially integrated neighborhoods was far harder than it looked on paper. Logue had trouble building as much affordable replacement housing as the city needed, which meant that Bostonians were displaced by his projects, leading only to further segregation and housing inequality.

Logue claimed to want to work with the neighborhoods to make sure his programs fit the needs and visions of residents, but he was more comfortable with communities that were willing to accept his expert knowledge, and in fact, he was often surprised by the community organizing that resisted his urban renewal efforts or sought to redirect them in a way that more closely fit local needs. When he tried to run for mayor of Boston in 1967, he finished fourth in the primary—well behind Louise Day Hicks, whose claim to fame was her vigorous defense of segregation in Boston’s public schools. Logue’s efforts could not keep open expressions of racism from surging to the forefront of the city’s politics.

From Boston, he went to Albany, where New York Governor Nelson Rockefeller tapped Logue to lead the Urban Development Corporation. After the assassination of Martin Luther King Jr. in 1968, Rockefeller established the UDC with a mandate to construct affordable housing in the state. Under Logue’s leadership, the UDC was responsible for building about 25 percent of government-subsidized housing there, for some 100,000 people. Among these were three “new towns” built from the ground up, including on Roosevelt Island.

In contrast to Logue’s earlier efforts, undertaken in the heyday of postwar liberalism, the UDC could not rely in the same way on federal money. Instead, it sought to combine some public money (from the state as well as the federal government) with private investment raised through moral obligation bonds that carried no legal obligation to repay and thus did not require voter approval. Logue and the UDC relied on Rockefeller’s close connections to real estate developers and the financial community. (Rockefeller’s brother David Rockefeller was, after all, the president of Chase Manhattan Bank.) But after the governor left the state to become vice president under Gerald Ford, the problems with relying on the bond market to carry out the construction of affordable housing quickly became evident. In the absence of public money, it was difficult to build low- and moderate-income housing that would be profitable—and unless it was, private lenders would call in their debt.

The UDC faced serious political obstacles as well. When it sought to build low-income housing in Westchester County, just north of New York City, the proposals seemed crafted to mollify the residents of leafy enclaves such as Bedford and Scarsdale, with projects that contained no more than 100 units of low-income housing mixed with moderate-income units, with veterans, area residents, employees of local businesses, and town and school district staffers to be given priority for them. Instead, the plans met with intransigent opposition from suburbanites who had no intention of allowing their towns to become more diverse racially or economically. As Cohen puts it, “In town after town, something like a civil war broke out.” Organizations like United Towns for Home Rule sprang up to marshal resistance. At one point, a barn that the UDC was planning to convert into a community center was burned to the ground. Logue received death threats and was taunted at community meetings: “Get out, we don’t want you in Bedford!” Meanwhile, the Nixon administration imposed a moratorium on the housing subsidies that had been key to the UDC’s financial projections, and Nelson Rockefeller’s departure meant that the UDC no longer had a key ally in Albany. Still, Logue kept building and borrowing to build, using the moral obligation bonds even though he no longer had a clear way of repaying the debt the UDC was accumulating.

By the mid-’70s, the declining commitment of the federal and state governments to build affordable housing, the resistance the UDC faced locally, and Logue’s general indifference to the finances of the agency created a perfect storm. Banks began refusing to lend to the UDC, and in February 1975 the agency defaulted on more than $100 million in debt. As a spokesperson for the bankers put it, the underlying problem was that “social goals are funded one way in this country and economic goals another.” Logue and the UDC’s leaders had assumed that their obligation was to the people who might live in the homes they built, not to the bondholders, but the corporation officials soon realized their mistake. An agency like the UDC could not fund itself through ever-increasing piles of debt.

The rules of the game for people like Logue had been changing even as they played, but above all else, what the UDC’s failure suggested was that only a massive public commitment of resources could address the housing crisis. Relying on the bond markets only empowered the financiers. As Logue said, “We cannot allow basic public policy of this importance to be made in corporate board rooms and issued to public men by fiat.” The problem was that in the absence of taxes and redistributive mechanisms that enabled social resources to be used for public goods like housing, there was no choice but to rely on the credit markets—and that would always mean that corporate boardrooms, in the end, made the rules.

Logue’s reputation was seriously tarnished by the UDC debacle. It marked not only the failure of the agency but also the failure of the kind of urban liberalism to which he had devoted his life. His last major project was an effort at redemption of sorts. In New York City, he headed the South Bronx Development Office under Mayor Ed Koch. There was little love for him upstate, and people in the governor’s office sought to divert funding from his new post. The new governor of New York, Hugh Carey, insisted, “I don’t want any of my money to go to him.”

Despite the governor’s hostility to Logue, the SBDO undertook a variety of projects aimed at rehabilitating the South Bronx, including the construction of Charlotte Gardens: 90 single-family homes, subsidized for purchase, a pocket of suburbia abutting the area’s crumbling apartment towers. Relying less and less on federal or state money, Logue found himself forced to depend primarily on private foundations like Ford to help finance the construction. The Charlotte Gardens project incorporated input from the Black and Puerto Rican neighborhood in which he worked and broke from his early top-down centralism, pointing toward a more democratic and inclusive model of urban renewal.

Yet at the same time, the project’s scale was much more limited and its ambitions smaller than those that defined his earlier career. Logue retired from the SBDO in 1985 and returned to Massachusetts, where he was involved in local development efforts but moved increasingly into obscurity. He wrote letters to the editor and got in touch with his contacts in city government and at The Boston Globe to share his thoughts on public matters. “In a way he tried to go about doing things the same way he had done when he had the power,” said one former colleague. “I think it was hard for him.” Logue died suddenly in 2000 at almost 79.

It is hard not to see in Logue’s story a parable for many of the problems of postwar liberalism and some of the political dilemmas this legacy creates today. On the one hand, he always believed in the importance of the public sector, especially the federal government, as a force that had the capacity to transform cities. The problems of urban poverty, racism, inequality, and substandard housing were for Logue profound moral and political issues—ones that the private sector alone could never resolve and that required the commitment of the state. It is as true today as it was in the mid-20th century that landlords and developers will not provide the affordable housing that people so desperately need and that the consequences of overcrowded housing, rent instability, and eviction wreak havoc.

But on the other hand, the political framework Logue relied on frustrated his biggest goals. Instead of seeing the key problems in terms of money, resources, and power, he treated the challenges posed by deindustrializing cities as issues of development—problems that could be solved by intelligent men working rationally and capably from above.

The issue was not simply one of participatory democracy versus top-down control; it was that men like Logue never really had the power they wanted to begin with. They tried to execute their visions and plans, but they did so in a context that favored suburbia, in which capital had free rein to leave the cities, in which bondholders had the final say, and in which white homeowners saw Black neighbors as a threat to all they held dear. Logue and his ilk also tried to do so in a context in which they purposefully distanced themselves from the political ideas and radical social movements that might have pushed urban liberalism toward a more egalitarian outcome, one that took more seriously questions of power, class, race, and resources. In other words, midcentury urban planning failed not because it was too ambitious but because from its outset, it was not ambitious enough, never really getting at the deeper inequalities that structured American society. It stayed at the level of urban policy when what was needed was a challenge to power.

With deep archival research and a narrative sweep that fixes her subject in the arc of midcentury US history, Cohen sketches Logue vividly, illuminating his forcefulness, his passion, his masculine confidence. She also provides a painful account of what he and so many liberals of his generation were up against. The contrast between these two aspects of her story—Logue’s certitude crashing into the limits of what he was able to accomplish—at times makes Saving America’s Cities read a bit like a claustrophobic horror story. Logue is increasingly hemmed in, his dreams and aspirations frustrated by the ever-dwindling federal resources on which they depended. He comes to seem like the King Lear of urban planning, following a dimming star of liberalism that was so clearly inadequate to its time.

The limits of Logue’s vision are evident all around us today. The 21st-century city in the United States has all but abandoned the ideals of integration and affordability. Instead, American cities are organized around accumulated wealth, racial disparities, and the delivery of luxury consumption to a rarefied elite. Given where we have ended up, it is hard to see Ed Logue and the political tradition he embodies as saviors. The gap between what they sought to achieve and the outcome was too great. After the collapse of the UDC, the journalist Joseph Fried wrote a postmortem in The Nation in which he argued that the “fundamental issue” the UDC raised was that “only a long-range, public effort will make possible the construction and rehabilitation needed” for millions of Americans “to live in decent housing and decent neighborhoods.” The “skittish and volatile” private market could not accomplish this; “the job must be done by American society generally.” To survey the failures of local and federal bodies today is to hear Fried’s words echo down the years. The problems of American cities remain, as he put it then, a “harsh and vivid reminder that American society is still a long way from meeting a moral obligation of its own.”

A previous version of this article incorrectly implied that Paul Douglas was an active US Senator during a 1967 hearing on New Haven redevelopment. At the time, he was chairman of the National Commission on Urban Problems, appointed by President Lyndon Johnson after losing his Senate seat the previous fall.


Did the New Deal Need FDR?
https://www.thenation.com/article/archive/did-new-deal-need-fdr-robert-dallek-book-review/
Kim Phillips-Fein
Nov 11, 2019

Although Winston Churchill once compared meeting Franklin Delano Roosevelt to the feeling of uncorking your first bottle of champagne, many who encountered the 32nd president as a young man would later express their great surprise at the role he came to play in American history. The principal of Roosevelt’s exclusive preparatory school described him as “a quiet, satisfactory boy…not brilliant.” At Harvard, Roosevelt enjoyed beer nights, football, and shooting ducks, perfectly content with his B in history and D-plus in Latin. Frances Perkins, who would become his labor secretary, was distinctly unimpressed when she met Roosevelt in 1910, shortly before he was elected to the New York State Senate: “There was nothing particularly interesting about the tall, thin young man with the high collar and pince-nez…. He had a youthful lack of humility, a streak of self-righteousness, and a deafness to the hopes, fears and aspirations which are the common lot.”

How the pampered only child of a wealthy landowning family in the Hudson Valley should come to stand before a screaming crowd in Madison Square Garden in October 1936 and proclaim that he welcomed the hatred of the country’s “economic royalists” is one of the great mysteries of American history. What enabled a man raised in the cocoon of privilege to take the imaginative leap necessary to create the New Deal? Why should someone whose self-interest would seem to point to maintaining the existing society have described himself as a “prophet of a new order,” as Roosevelt did when he accepted the Democratic nomination for the presidency in 1932?

These questions have been taken up by FDR’s biographers for decades, and they are at the center of Robert Dallek’s new book, Franklin D. Roosevelt: A Political Life. A scholar who has written about John F. Kennedy, Lyndon Johnson, and Richard Nixon, Dallek tells us that he turned to Roosevelt to remind a younger generation of what “great presidential leadership” looks like. More generally, he wants to reassure those questioning the underlying wisdom of the democratic political system in the wake of the 2016 election that it “has been capable of generating candidates for high office whose commitment to the national interest exceeded their flaws and ambitions.”

To develop this argument, Dallek offers us a portrait of a more political Roosevelt, someone who loved the prosaic skirmishes of democratic life. This Roosevelt is not so much a patrician heir as a stumping pol giving speeches, campaigning for office, negotiating legislative compromises, reading the mood of a crowd, and managing to win over a roomful of skeptics and opponents. Being a politician—someone driven by the desire to win votes, garner public adulation, and exercise power—can involve more, Dallek tells us, than sheer ego, flamboyance, and a relentless Twitter finger. For Dallek, what defines political leadership in a democracy is the ability to translate an intuitive “feel for the public mood” into lasting social and institutional change.

But Dallek’s version of FDR’s story might not be as reassuring for American liberalism as he thinks. Roosevelt governed as he did more because of the swirl of social movements and ideas that surrounded him than because of anything intrinsic to his character or political sensibilities. He encountered tremendous resistance that he did not entirely know how to meet. Having started his career as a good-government reformer and then embraced a far more confrontational politics centered on support for a welfare state, Roosevelt changed with the times he lived in, times that he did not shape alone and that were not produced just by his good-spirited liberal politicking.

Dallek’s book, while focused on FDR, also reminds us of another important force in moments of uncertainty and upheaval. Liberals and leftists today are intent on winning the White House back from Donald Trump, and most have placed their hopes in the campaigns of Bernie Sanders and Elizabeth Warren (with Warren capturing much of the liberal camp and Sanders staking out a more intensely working-class populism). But Roosevelt’s evolution as a political figure points to a different locus of power, namely the social movements—especially the labor movement—and ideas that were already taking shape before he came on the scene and that pushed him to positions he would not have taken otherwise. The New Deal era as a whole challenges us to consider the important role that social forces outside the gated institutions of American democracy play in changing the balance of political power. The dilemma today is not so much who our new FDR will be as it is what movements and ideas could help guide such a figure as we attempt to address our current economic and environmental crises.

One of the features of Dallek’s work is his sensitivity to the psychological dimensions of power. (He is an honorary member of a California psychoanalytic institute.) This was true in his studies of Kennedy, Johnson, and Nixon, and here, too, he lingers over the dimensions of FDR’s life that pushed him to become an “establishment rebel.” But these were not really individual character traits so much as they were reflections of Roosevelt’s uncertain times, and readers of Dallek’s book will find it hard to escape the sense that the world from which Roosevelt emerged is now almost completely gone.

Dallek begins the story of FDR’s life in the genteel society of Hyde Park, New York. Born in 1882, Roosevelt was the descendant of an old New York family that had become wealthy through the West Indian sugar trade and Manhattan real estate. An adored only child, he went with his family every summer to Campobello Island off the Canadian coast, where he rode horses, fished, and swam with his father. When he went to preparatory school, he was thrown by having to compete with peers for attention for the first time. He loved athletics and played on the football team, even though he was unsuited physically for the sport. At Harvard, too, the young Roosevelt continued to strive for the admiration of his contemporaries. The “greatest disappointment of my life,” he later recalled, came when he was blackballed for membership in Harvard’s ultra-exclusive Porcellian Club. Dallek contends that Roosevelt spent years seeking to overcome this relatively minor snub—just one of many signs of the extraordinary privilege and assurance that shaped his youthful years.

After graduating from Harvard, Roosevelt showed little interest in the quiet life of a landed gentleman. Nor did he desire to go into business and become even richer. Instead, he longed to win public approval and regard. Like his cousin Teddy Roosevelt, FDR belonged to a social elite that inculcated in him a profound sense that his position gave him both the right and the obligation to exercise political power. Although he hailed from a world of wealth and property, he did not share the capitalist ethos of many of his era’s leading businessmen. The notion that the market automatically confers justice, that businesses need only accumulate profits relentlessly to demonstrate their moral purpose, and that life’s meaning derives from ever more extreme acts of consumerism was completely alien to him as a scion of America’s old money. As he wrote in a college essay exploring his family history, the Roosevelts believed that having been “born in a good position, there was no excuse for them if they did not do their duty by the community.”

These aristocratic inclinations coexisted with a stubborn heterodoxy that Dallek argues expressed itself in his friendships. While Roosevelt was certainly no bohemian, he amassed many acquaintances who were outsiders in one way or another, often making them close friends and advisers. After his election as a state senator in 1910, he teamed up with Louis McHenry Howe, a veteran journalist with a face scarred from a teenage bicycle accident, who lacked wealth or an established place in the Albany political machine. Howe went on to manage FDR’s campaigns, signed his notes to Roosevelt “your slave and servant,” and referred to him as the “future president” years before the fact. (Howe also encouraged Eleanor Roosevelt to expand her role in public life in the 1920s, and he moved into the White House with the Roosevelts when they went to Washington.) FDR’s private secretary, Marguerite “Missy” LeHand, worked for him for 21 years and never married; she held a prominent role in the White House, participating in the president’s poker games, organizing the daily White House happy hour (the “children’s hour”), and operating like a chief of staff. His longtime friend Harry Hopkins, a former social worker who was briefly a member of the Socialist Party, not only ended up heading the Federal Emergency Relief Administration and the Works Progress Administration (WPA) but also lived in the White House.

At times, these relationships were troubled. Dallek documents the myriad well-known tensions in the Roosevelt marriage. Although scholars like Blanche Wiesen Cook have analyzed Eleanor Roosevelt’s relationship with journalist Lorena Hickok, Dallek portrays these problems primarily in terms of FDR’s irritation with his serious, principled wife, whose moral engagement with civil rights and the desperate poverty of the Depression far exceeded his. According to Dallek, Roosevelt just wanted someone to joke and relax with at the end of the day. (Questionably, the biographer seems to take the president’s side in viewing Eleanor Roosevelt as a killjoy, despite providing ample evidence of her dry sense of humor.) But even when they were strained, the proliferation of these bonds with unusual and idiosyncratic people suggests his willingness to depart from certain established ways.

Dallek also explores another part of Franklin Roosevelt’s personal life that shaped his public career: his polio and paralysis. What is most striking in Dallek’s description is how carefully Roosevelt—with the support of his wife and Howe—staged his recovery from polio. The Roosevelt family was at Campobello in 1921 when FDR fell ill. The day before, he had gone sailing and swimming in the ocean, jogged back and forth across the island, and volunteered to help fight a fire. It was a typical summer day for him; he loved sports and all kinds of physical activity. He went to bed tired, woke up with a fever, and by the next day was unable to stand. The man who believed in the masculine virtues of exertion and the “strenuous life” espoused by Teddy Roosevelt would never walk unassisted again.

For Dallek, what matters in this ordeal is primarily the extreme determination that Franklin Roosevelt showed in managing not just his recovery but also the popular perception of it. He insisted there would be no permanent injury long after it was clear that there would be. He feared that his body would provoke “pity and revulsion” and set for himself a somewhat unusual goal for his physical therapy: that people would begin to “forget that he was a cripple.” He was absolutely committed to showing that the course of his life would be unchanged by his illness. “I’m not going to be conquered by a childish disease,” he vowed. Seven years after his paralysis, he was elected governor of New York. While others have seen in FDR’s illness hints of what might have made him so open to aiding others laid low by misfortune, for Dallek the main import is the way it made him all the more eager to achieve political triumph.

Roosevelt was elected president easily in 1932, winning the Democratic Party’s nomination over his political rival Al Smith and then cruising past a dour Herbert Hoover in the general election. The incumbent was so unpopular that Roosevelt’s running mate, John Nance Garner, told the challenger that all he’d have to do to win would be to stay alive until November. Indeed, Roosevelt won 42 of the 48 states.

But what to do with the spoils of victory? Other scholars have assessed the virtues and limits of the New Deal: the impact that it had on unemployment and the Depression, the way it reified racial categories, the channeling of support to men instead of women, the way it both stimulated and frustrated reforms. This kind of analysis is not really Dallek’s project; he focuses on the political victories of FDR’s first term rather than his policy achievements. In the chapters on the New Deal, we get detailed pictures of the close relationship that Roosevelt cultivated with the press (holding twice-weekly informal briefings), of how emotional he became during the fireside chats that brought his voice over the radio into millions of homes, and of the events of the second Bonus Army march on Washington, DC, by World War I veterans seeking pension payments in May 1933. Whereas Hoover had greeted the veterans’ encampment with tanks, FDR provided three meals a day and unlimited coffee. “Hoover sent the army,” one veteran observed. “Roosevelt sent his wife.”

But FDR’s political skills could not change American institutions. He tried all kinds of strategies to combat the crisis of the Great Depression, from federal deposit insurance to the National Recovery Administration to the expansion of emergency relief and the hiring of young men through the Civilian Conservation Corps. None of these programs eradicated the high unemployment and precarity of the decade. Over the course of his first term, Roosevelt saw popular unrest rise: a wave of strikes (some led by communist and Trotskyist organizers) rippled through the country in 1934, while the Townsend clubs and other populist mobilizations demanded relief from extreme poverty and insecurity, as well as steps toward a political economy not quite so hopelessly tilted toward the rich.

Roosevelt did not create any of this pressure from below, and he might not have done what came next without it. But with organized labor, social movements, and radical intellectuals pressing him from the left, he was able to shift course. The results were the Wagner Act, which established for many (though not all) workers the legal right to organize a union and created federal enforcement mechanisms to compel employers to negotiate, and the Social Security Act.

Working-class discontent was only one side of the political ferment of the 1930s, and Dallek offers a particularly strong account of Roosevelt’s mounting frustration and confusion as hostility toward the New Deal grew on the right, both within other branches of government and among the economic elite. The Supreme Court proved an implacable obstacle to New Deal reforms, and after Roosevelt’s landslide reelection in 1936, he pressed for “reform” of the court—which for him meant packing it with justices more sympathetic to his social and economic programs. This only galvanized conservative opposition within the House and Senate and helped mobilize the business class. “There has been no mandate from the people to rape the Supreme Court or tamper with the Constitution,” Virginia Senator Carter Glass declared.

The court reform bill went nowhere—yet in the end, it didn’t matter, since the Supreme Court began to back New Deal programs and several conservative justices retired. But Roosevelt now recognized that he was at war, though he was loath to ally himself too closely with the labor movement. As the economy sank back into recession and conservative Democrats began to point their fingers at alleged communists in the WPA and other agencies, Roosevelt tried to purge the most reactionary Democrats from leadership positions, triggering an open conflict over who controlled the party. A group of Southern Democrats and Northern Republicans wrote up a Conservative Manifesto that openly distanced them from the New Deal. Businessmen and the right also became more vitriolic, while the left pressed FDR for changes that went well beyond any he was willing to make.

Dallek may have intended to capture the political quandary that Roosevelt faced in the late 1930s as well as his unwillingness to genuflect before the sanctity of the Supreme Court. But what is really most interesting here is what it suggests about the limits of presidential power or, indeed, the usefulness of focusing on the presidency to understand what was going on in the 1930s. The kinds of changes that the New Deal brought elicited tremendous resistance outside Washington, which renders any narrative of FDR or liberalism or the New Deal that ends in triumph an incomplete one, since it was outside the venerable institutions of the capital that new forms of right-wing power took hold. The ferocity of the corporate response to the Wagner Act, the intensifying antagonism of the congressional right, the ominous rumblings on the far right in the late 1930s—had the United States not entered World War II, many of the accomplishments of the New Deal might have come under attack much more quickly. Roosevelt liked being a leader when the opposition was in disarray. He enjoyed far less the experience of real political combat.

The intense hostility to the New Deal matters in other ways, too. All of its victories—the Wagner Act, Social Security, the public works projects that built roads and dams and schools—are notable for who and what they left out. As many historians have noted, the exclusion of domestic and agricultural workers helped to enshrine the regional and racial disparities that would largely determine American politics for the rest of the 20th century. The reliance on private sector entities and firm-by-firm collective bargaining for social benefits created the partial welfare state that we live with now. The embrace of public spending during World War II built the military state and also the mass consumerism that has fueled climate change. In other words, the very institutions that created liberalism helped generate the political conditions that would continue to unmake its moral and political authority and, by the 1970s, serve to undermine it.

To understand what happened to American liberalism, we must look beyond the presidency and certainly beyond FDR. The forces of reaction and progress, of white supremacy and social and political equality, were present in the 1930s, and the resolutions that the New Deal provided were only temporary. Roosevelt the politician sought to build coalitions, but he was often unwilling to face or confront the contradictions that came with the kinds of coalitions he built. He may have sensed the fragile foundations of the new order he was trying to create, which may account, in part, for just how uncertain he was when presented with a deepening social conflict that could not be resolved through charm or force of will. His reluctance in the late 1930s and early ’40s to ally himself more forcefully with unions and the left—and the ambivalence that many of the liberals around him felt toward social movements outside the halls of power—shaped the kinds of solutions the New Deal provided, limiting them in ways that would reverberate throughout the rest of the century and to the present day.

Dallek’s biography ends with World War II. Like many scholars, he credits FDR with recognizing from an early point the grave dangers of fascism and Nazism and criticizes him, rightly, for refusing to do more to welcome European Jews into the United States as they desperately sought visas. Dallek shows, too, how FDR brushed aside any critiques of the decision to place 120,000 Japanese Americans in internment camps after Pearl Harbor. What’s most fascinating about the last chapters of the book is how much anxiety there was in Roosevelt’s inner circle about his health as the war went on and how isolated he felt during the wartime years. Dallek makes it appear that Franklin and Eleanor had almost no real intimate connection by the late 1930s, the dynamic between them having become primarily one of resentment and “political convenience.” FDR’s emotional support, Dallek argues, came increasingly from his cousin Daisy Suckley, whose affectionate and intimate correspondence he quotes extensively. “Do you know that I have never had anyone just sit around and take care of me like this before,” Roosevelt told her at one point when she was tending him while he had a fever in 1943.

Dallek suggests that Roosevelt’s health was in precipitous decline even before he ran for a fourth term in 1944 and that he was likely in the late stages of the heart disease that would ultimately kill him. His doctor wrote in a secret memo that “if Mr. Roosevelt were elected President again,” he would not have “the physical capacity to complete another term.” Roosevelt saw no alternative, though. The desire to override physical constraints that had long motivated him pushed him to stay in office.

For Dallek, this self-sacrifice is the paramount example of Roosevelt’s noblesse oblige. But in depicting FDR’s political and emotional weaknesses in the 1940s, Dallek also points to some of the tensions and problems of the New Deal—not least the extent to which it relied on the veneration of Roosevelt to keep a political coalition together. By presenting a more human Roosevelt, Dallek encourages us to also see the necessity of building the political infrastructure—the social movements, union organizers, radical publications, striking teachers, grass-roots activists—on which any more lasting changes would necessarily rely. If the Green New Deal and Medicare for All (let alone any larger transformations) are ever to become political reality, it will be because of these kinds of historical actors and their role in shifting what seems possible.

The rarefied social world that nurtured FDR is very different from any that endures today, as is the business class that opposed him. But the current problems of liberalism may not be so different from those of the New Deal era. Ambivalent even now about a more confrontational politics, liberals tend to place their faith in the putative power of innovation and technocratic elites to resolve what are at heart problems of brute power and inequality. For all that has changed since the 1930s, this remains a common thread. While Dallek perhaps has another lesson in mind, his new biography reminds us that today we see Roosevelt as we do not so much because of who he was or what he was able to accomplish but because of the efforts of millions of other Americans who struggled in the Depression years: the men and women whose imagination, bravery, and forgotten decisions to defy established authority in the midst of the worst economic crisis of their lives forged an opening without which Roosevelt might well have remained the haughty man with the pince-nez, always removed from the bitterness of the world.



The Bad History—and Bad Politics—of Alan Greenspan and Adrian Wooldridge’s ‘Capitalism in America’
https://www.thenation.com/article/archive/alan-greenspan-adrian-wooldridge-capitalism-in-america/
Kim Phillips-Fein
Dec 12, 2018

Capitalism in America tells us more about Alan Greenspan than about the history of capitalism.

The historian Perry Miller observed, in his 1952 essay “Errand Into the Wilderness,” that most American histories are stories of decline. From the Puritans on, we find the recurring theme that the country has fallen away from a set of high ideals that once prevailed. Miller argued that the intense piety of the early settlers created a standard impossible for later generations to meet, which led, in turn, to profound disappointment and a sense that their country itself had failed. It was, he suggested, a familiar feeling: “Many a man has done a brave deed, been hailed as a public hero, had honors and ticker tape heaped upon him—and then had to live, day after day, in the ordinary routine, eating breakfast and brushing his teeth, in what seems protracted anticlimax.”

This fallen state is the world depicted in Capitalism in America, an account of US economic history co-authored by Alan Greenspan, the former chair of the Federal Reserve, and journalist Adrian Wooldridge. Part popular history, part political intervention, part subtle exoneration of Greenspan for the financial crisis of 2008, this strange book is at heart a declension narrative—an elegy for the American capitalism of yesteryear. Once, Americans harnessed the power of electricity, built steel factories, laid down railroads, and created the Model T. Today, we are hemmed in by a thicket of government programs and regulations. Even as the authors try for an optimistic note near the end (there are smartphones! Robots! Gigabytes!), the book is animated by a fear that capitalism’s glory days have passed.

Back in the 19th century, according to the authors, the government saw its primary role as upholding property rights and enforcing contracts, rather than trying to redistribute wealth or protect people from the vagaries of economic change. There was no income tax or, from 1836 to the Fed’s creation in 1913, even a central bank. The gold standard prevailed. Perhaps most important was the dominant cultural attitude toward business: “Most educated Americans believed in the law of the survival of the fittest.” For Greenspan and Wooldridge, the profound economic transformations of the 19th century were the result.

The problem of how to “restore America’s fading dynamism” hangs over the book. Will America’s economy ever grow again? And more than this, will people ever regain the lost faith in its mission, in the transcendence of economic expansion and the sublimity of technological change? Although the long sweep of American economic history is the ostensible subject of Capitalism in America, the answers that it provides to these questions tell us more about Alan Greenspan than about the history of capitalism in America.

Greenspan’s sympathetic biographer, Sebastian Mallaby, suggested that the former Fed chairman’s trajectory from Ayn Rand’s close friend (she nicknamed him “the Undertaker”) to America’s central banker reflected an underlying commitment to empiricism and pragmatism; he evolved past his initial political predilections. Reading Capitalism in America, one is not so sure. The book is shot through with remarkably ideological readings of American history—everything from the unabashed deification of individual entrepreneurs to occasional exegeses on the glory of gold. In his old age, Greenspan appears to be returning to the faiths that inspired him as a young man (or perhaps he’d never left them behind).

Despite Capitalism in America’s libertarian hue, its explicit intellectual reference point isn’t Ayn Rand, whose name is barely mentioned, but the Austrian economic thinker Joseph Schumpeter, especially the oft-cited notion of “creative destruction” developed in his 1942 book Capitalism, Socialism and Democracy. Greenspan and Wooldridge return repeatedly to Schumpeter and the “perennial gale” of economic change in their analysis of American history. Under capitalism, Schumpeter argued, new production processes are constantly displacing old ones, leading to great tumult (the decline of cities, the wiping out of whole occupations)—but also rapid growth. In the past, Greenspan and Wooldridge argue, “America has been much better than almost every other country at resisting the temptation to interfere with the logic of creative destruction.” But today, this is no longer the case.

The authors trace the hospitality of the United States toward capitalist growth, including its destructive possibilities, to the restrictions that its founders placed on popular or majority rule. “The Founding Fathers had been careful to put limits on both the scope of the state and the power of the people,” they write. Even though they hail American capitalism as “the world’s most democratic” example of the system—in that people born in obscurity have risen to riches, and also in the spread of mass consumption—they approve of the ways in which American politics constrains democratic power.

The result, in their view, was the era of entrepreneurial dominance that lasted for most of the 19th century—the “age of giants.” Greenspan and Wooldridge praise the high “respect” with which Americans then treated businesspeople; apparently, they have long understood that “the real motors of historical change were not workers, as Marx had argued, nor abstract economic forces, as his fellow economists tended to imply, but people who built something out of nothing.” In the authors’ telling, Americans rightly worship at the shrine of business, embracing what Greenspan and Wooldridge call the “cult of the entrepreneur.”

Another reason for America’s explosive entrepreneurialism, we are told, is that the 19th century was also the age of the gold standard. For most of the book, the prose is workmanlike, jammed with statistics about economic growth. But when the subject turns to gold, lyrical invocations of timelessness creep in: “Gold has always been acceptable as a means of exchange, and hence a store of value, as far back as history reaches.” It has long served as “one of the most solid defenses of liberal society against the temptation to debauch the currency, the monetary equivalent of property rights.” Conservatives in the late 19th century saw it as “a bulwark not just against economic chaos but against civilizational collapse.” One has the feeling that Greenspan and Wooldridge may see it this way, too.

For Greenspan and Wooldridge, the age of gold was not always a golden age. They are careful to point out the underside of growth. American men lost an average of two inches in height over the course of the late 19th century; there was horrific pollution as well as terrible industrial accidents; and deflation hurt debtors and wage earners. Early in the book, they point out that slavery and the displacement of indigenous peoples also constituted stains on America’s history. But none of this matters as much, in their view, as the fact of the country’s remarkable economic growth.

They thus treat the stormy politics of the late 19th century with a mixture of amusement and disdain. “Anxiety” about the scale of economic change—rather than real and legitimate questions about the nature of the emerging economy and its effects on working Americans—fed the “cult of government” in the Progressive era. When the Populists organized against the gold standard, they “extended the realm of politics” to include economic relationships that had previously been considered sacrosanct. For Greenspan and Wooldridge, this was the beginning of the end. After World War I, there was a brief reversion to the doctrine of laissez-faire. The “active inactivism” of Warren Harding and Calvin Coolidge comes in for praise, but it was a brief “paradise before the fall.” Then came the Great Depression and the era of Franklin Roosevelt’s New Deal, and the nation has never recovered.

Not surprisingly, Greenspan and Wooldridge express great enthusiasm for the presidency of Ronald Reagan. He not only removed the “shackles” that had bound business during the New Deal and postwar years; more important, in their view, Reagan had “an instinctive faith in business,” and entrepreneurs, in particular, formed “the praetorian guard of his revolution.” The authors are also enthusiastic about Bill Clinton and the way his tenure marked the “revival of entrepreneurialism,” and the financial revolution—junk bonds and all—that made it possible.

Some readers may be tempted to skip ahead to the concluding chapters for some insight into how Greenspan views recent economic events, especially those in which he himself was involved. However, anyone looking for particular evidence of how the economy looked from the viewpoint of the Federal Reserve will be disappointed. Very little appears in these pages that would shed any real light on the debates at the Fed, or on Greenspan’s own contributions to the creative destruction of American capitalism. The closest that Capitalism in America comes to engaging the question of whether Greenspan might bear some culpability for the 2008 financial crash is to observe that while “some critics” have suggested that the low interest rates of the early 2000s helped to drive speculative home-buying and thus the housing bubble, these skeptics “can’t establish a clear link between monetary loosening and the crisis.” While some readers may be frustrated by this coyness, it is in keeping with the overall approach of the book, which downplays public policy in favor of broad statements about America’s great economic culture. Greenspan and Wooldridge insist that our principal problem is that the United States has become “encrusted with entitlements” that crowd out other forms of saving and diminish America’s spirit of entrepreneurialism. Meanwhile, “overregulation forces business founders to endure a Kafkaesque nightmare of visiting different government departments and filling in endless convoluted forms.”

The early Trump years—with their huge corporate-tax cuts and reduction in regulations—offer the occasion for mild cheer, though the authors also note with concern the president’s antagonism toward free trade and his failure to drain the swamp of the regulatory state. Either way, their message is clear: Where once we had prairies, railroads, and an open expanse of light and air, we now have only bureaucracy.

Greenspan and Wooldridge’s historical scholarship leaves much to be desired. Their depiction of the 19th-century American economy as a paragon of laissez-faire seems at a distant remove from actual history. True, it did have the gold standard, and taxes were few, and the federal government did not employ many people. But the enslavement of millions of people—whose labor was actually the engine of economic growth in the early years of the 19th century—in fact required an activist state. The decentralized violence of slavery should not disguise the intrusive regime that was needed to uphold in practice the idea that human beings could be treated as commodities.

After the end of slavery, too, the state played a key role in economic life. Breaking strikes with the National Guard, giving Western lands to railroad companies, overturning state and local regulations for the labor market via the Supreme Court—all of these involved the active use of state power. The industrial powerhouse that the United States became was also not a free-trade haven; for much of the era of industrial growth, its manufacturing companies were protected by high tariffs. There was no near-unanimous sentiment, even among the educated, about the virtues of the idea that society should be governed by the “survival of the fittest”—from the American Revolution on, many were critical of the outsize power that economic might could give some people over the lives of others.

Rather than the tale of a free people coming to be dominated by government control, the story of the United States in the late-19th and early-20th centuries is really one of political conflict over who would control the state. And our contemporary politics remains centered on this conflict: Should business leaders be able to dominate government? Or is there something in democratic politics that authorizes the majority of people to shape their society together? Greenspan and Wooldridge seem to view the market and the state as two distinct and antagonistic realms, but in truth they were never as easy to separate as Capitalism in America suggests.

Nor were the massive transformations of daily life that the book chronicles simply the product of brilliant entrepreneurial innovations: As the economic historian Robert Gordon has written, the improvements in health, mortality, and the quality of life in the late-19th and 20th centuries reflect public as well as private investments. What is more, bemoaning the loss of the growth rates that characterized the 19th century makes little sense. For one thing, rising productivity was even more dramatic between 1920 and 1970—years that Greenspan and Wooldridge insist already contained the seeds of decline. As important, the social changes that swept the United States in the late-19th and 20th centuries—the spread of electricity, the movement of millions of people from farms to cities—are not likely to be repeated, not because there are too many regulations now, but because these are the types of transformations that cannot unfold twice.

When Greenspan was a college student at New York University, he was fascinated by the railroads—and especially by the magnate James J. Hill, who built the rail lines of the Pacific Northwest. “The James Hills and the J.P. Morgans are an affront to a society dedicated to the worship of mediocrity,” Greenspan declared in a mid-1960s lecture sponsored by one of Rand’s organizations.

One can almost imagine how Rand and her highly moralized vision of economic life might have appealed to the young Greenspan, coming of age amid the anodyne corporate hierarchies of postwar America: In a way, her vision was a rebellion against the gray-flannel monotony of that order. But today, in a world remade by the cult of business, Greenspan’s vision of an entrepreneurial America is not rebellious so much as an affirmation of the powers that already exist—a class of people relentlessly committed to grasping more of society’s wealth for themselves. Trump’s dislike of free trade aside, his presidency seems the sad reductio ad absurdum of the fantasy of entrepreneurial omniscience. In the final pages of Capitalism in America, the authors reiterate the idea that the United States is trapped in an “iron cage of its own making,” one of “out-of-control entitlements and ill-considered regulations.” Scrap these, and the “madness of great men” like Elon Musk and Peter Thiel will once again be able to transform society. But who—outside of the inner circle, the most devout of the faithful—really believes that this would be a good thing?

While Capitalism in America falls short as a work of history, its larger problems are political. On the one hand, the whole framework of “creative destruction” implies a tremendous condescension toward those who dared to impede the march of progress. Contrary to what Greenspan and Wooldridge suggest, their grievances were not mere sour grapes, the whining and complaints of those left behind; what was at stake was always the question of who would benefit from economic change and the kind of social order it would make possible. By framing the issue as a simple resistance to technology and growth, Greenspan and Wooldridge sidestep the underlying questions about democracy, equality, and freedom that are at the heart of this long-standing struggle.

The former chairman of the Fed, it seems, never left behind his childhood obsession with railroad barons and glorious entrepreneurs after all. Perhaps at the start of the Reagan years, this invocation of the market’s wonders still seemed inspiring, the opening bell in a beautiful race. Now it reads like no more than a call to protect the wealthy people who benefited from decades of state-backed economic redistribution upward.

Schumpeter himself would likely have said as much: For him, capitalist growth was not only about rising productivity, but also the transformation of social relationships, and he was closely attuned to the ways that the economic changes brought about by capitalism could also undercut its cultural and political support. In particular, Schumpeter warned that the very march of economic progress meant the doom of the entrepreneur: Over time, the development of technology and of the corporation would render obsolete the role of the lone economic actor, as “teams of trained specialists” and “rationalized and specialized office work” took the place of the robber barons of yesteryear. Back in 1952, Perry Miller wrote that, after the fervor had dissipated and the disappointment had been worked through, people would find themselves forced to confront their country as it actually existed. Capitalism in America aside, that’s where we are now.



The Making of 20th-Century New York
https://www.thenation.com/article/archive/capital-of-the-world/
Kim Phillips-Fein
Aug 2, 2018

When the ground was first broken in 1900 to build a subway under the crowded streets of New York, some 25,000 of the city’s residents gathered to watch the ceremonies and cheer: “To Harlem in 15 minutes!” Not that it would be easy—removing 3 million cubic yards of earth and laying rails under the city was a job that would take four years and thousands of workers, 54 of whom lost their lives during the construction.

When the subway finally opened, it was a day of civic celebration. The city’s factories gave workers a half-holiday, and every passenger was permitted to ride free of charge. Some 150,000 people did. Yet what was perhaps most remarkable was just how quickly New Yorkers adapted to this engineering miracle. One reporter for The New York Times observed that after their exploratory journey, people poured out of the stations and made their way quietly home, “having finished what will be to them the daily routine of the rest of their lives.”

The story of the subway, then, is one of an extraordinary achievement that came to be regarded as perfectly ordinary, and it is, in a way, the urtext of any book about New York. “It is a miracle that New York works at all,” wrote E.B. White in his famous paean to the city. “The whole thing is implausible.” Yet this sense of awe and mystique, even as it animates much writing about the city, can also make New York difficult to see clearly—especially in relation to the rest of American history.

Often, writers treat New York either as the apotheosis of America or as a national outlier. It is perceived as the center of trends that have shaped the rest of the country’s history—the heart of immigration, the capital of finance—or as an extreme metropolis that has little in common with the rest of America, the so-called heartland. To a certain extent, some of the difficulties inherent in writing urban history (or perhaps any history at all) show up with special force in histories of New York: Is the purpose of writing about the city to illuminate its distinguishing features, or to tell a larger story of which New York is representative?

Mike Wallace’s Greater Gotham—the second of what he hopes will be four books about the city—manages to do both. It is a book about New York in all its bewildering particularity, yet it also addresses the sweep of American history in the early 20th century. Greater Gotham is the sequel to the Pulitzer Prize–winning Gotham, in which Wallace, along with co-author Edwin G. Burrows, told the story of the rise of New York over its first 300 years: the transformation of an unassuming island into the hub of a vast city.

Greater Gotham covers a shorter span, the period from 1898 to 1919, but there can be no doubt that the book is a remarkable scholarly achievement. At 1,196 pages, divided into five parts and 24 chapters, it manages to cover what can seem at times like almost every facet of life in the city over the 21 years that separate the consolidation of its various boroughs, in 1898, from its emergence as the nation’s economic capital by the end of World War I. There are chapters on the economics that drove the skyscraper boom of the early 20th century and the labor processes and technological developments needed to make possible the construction of the first subways. We learn about the fissures that divided local activists in the Industrial Workers of the World from those in the Socialist Party, as well as the machinations of the Socialists’ Morris Hillquit, who kicked the IWW’s Big Bill Haywood out of the organization.

Wallace also discusses the public-health campaigns of the early 20th century, which reduced the death rate in the city from 27.2 per 1,000 in 1890 to 13.4 by 1914 despite the organized opposition of the city’s doctors, who feared the state’s expanded role in medical care. He analyzes the proliferation of prostitutes in immigrant enclaves in which men greatly outnumbered women. He also places the worldly city within the larger world, examining the role of the Russian Revolution of 1905 and the pogroms that followed in driving hundreds of thousands of Jews to New York.

On top of all of this, there are wonderful sections on Coney Island, Irish politics, the literary and visual arts, the rise of Harlem as a center of African-American life, Lenin’s appreciation of the New York Public Library, and the strength of anarchist traditions in working-class Italian neighborhoods. Out of this welter of specificity comes a distinctive portrait of something far larger than New York itself: The city’s story, we come to learn, is really the story of American capitalism. It is also the story of the radical politics that emerged in response to it. This is certainly one way to read the book’s title: America itself became New York’s “Greater Gotham.”

Wallace has given a great deal of thought to the right way to tell this story. In the introduction, he outlines five major areas to examine in unraveling the city’s history: first, its emergence as the financial center of international capitalism by the end of World War I; second, its national importance as the “unofficial capital” of the United States; third, its material development, as New York took on an ever-expanding catalog of economic functions for the country; fourth, the year-by-year rhythms of capitalist expansion and contraction, which helped to spur labor organizing and radical politics; and finally, a view from the ground, the daily experience of the thrilling, chaotic city for people from all social classes and backgrounds.

Running through all of this are certain more general themes. The first is the profound ambivalence of New York’s financial and corporate elite when it came to the nature of competitive capitalism. These were people at the pinnacle of the national economy and the avatars of its achievements. Yet far from being ardent believers in the ruthless precepts of laissez-faire, they sought to tame the market and replace the “ruinous” competition of yore with corporations that dominated their economic sectors to a degree that had never been seen before. The city’s financiers presided over the great merger wave of the early 20th century: Between 1899 and 1904, fueled in part by the expansion of trading on the New York Stock Exchange, they reduced 4,200 companies to a mere 250, resulting in many of the powerful mega-firms that would dominate the American economy during the 20th century, among them US Steel, the American Smelting and Refining Company, United Fruit, and International Harvester. Many of these firms were headquartered in New York; the banks they relied on were located there as well. And their growing power pulled in the elite executives from other businesses. New York, as Wallace puts it, “sucked in millionaires and corporations as fast as they were created, and yanked some already existing ones out of other cities’ orbits.”

This flood of money shaped the physical landscape of the city. Competition between real-estate developers drove its vertical growth. Skyscrapers represented the aspiration to wrest as much money as possible from each lot of land. The very dynamic of growth spurred the buildings higher: Every additional floor meant new tenants and new rents, magnifying the value of the property as a whole, so that by 1912, New York had more tall buildings than any other city in the nation. Ultimately, the Manhattan skyline became the physical embodiment of profits literally seized from the air.

Yet far from reigning with unquestioned confidence, New York’s elites were always afraid of potential challenges to their authority, and this uncertainty forms the second major theme of Wallace’s book: the increasingly organized efforts to push back against their control. There were the Progressives and the middle-class reformers who sought to challenge the raucous power of the new business elite and to tame and civilize the disorderly city. And then there was the working class, which was not content simply to serve as the subject of reform experiments descending from on high. Labor unions proliferated in the city, and New York became a center of radical politics teeming with socialist and anarchist activism that often found its way into the unions and pushed them into confrontational actions against factory owners and financiers.

Even the Industrial Workers of the World, those anti-capitalist poets often seen as the organizers of the Western mines, had pockets of strength in New York. Immigrant strikers in Lawrence, Massachusetts, sent their children to the city during the IWW-led strike in 1912, and when the hundreds of malnourished children arrived by train, they were greeted by thousands of supporters at Grand Central Station. Two years later, during a recession, radical activists in the IWW circle organized an “army of the unemployed”: Hundreds of people marched through the streets of New York, entering churches to demand food and shelter.

This challenge to the status quo in one area of life soon fed the growth of others. Margaret Sanger was a member of the Socialist Party and a supporter of the Lawrence strike before she became an advocate of birth control. In 1914, a Feminist Alliance, drawing from socialist, anarchist, and labor circles, was organized to challenge sexual inequities in the city (such as a Board of Education policy that forced female teachers to resign once they got married). The Greenwich Village bohemians—a small number of whom famously climbed to the top of the Washington Square Arch one cold January night in 1917 to proclaim the neighborhood a “free and independent Republic”—also shared a milieu with the labor movement and provided spaces for women to step outside the norms prescribed by gender.

The city’s left also spilled into African-American politics. One of the strongest chapters of Greater Gotham traces the expansion of black New York and the city’s emergence as a center of resistance to segregation and inequality throughout the country. The city’s African-American population swelled from around 60,000 in 1900 to 91,700 in 1910, as migrants arrived from the Caribbean and the southern United States. Housing and employment were highly segregated: Black doctors couldn’t practice at public hospitals, and black teachers weren’t employed by the Board of Education until 1895. (The city’s public schools had been legally segregated since the 18th century, and they didn’t become integrated until the 1870s and even later in Queens, which had a separate Board of Education.)

At first, it appeared that a moderate politics of racial uplift through free enterprise might predominate in black New York. Booker T. Washington built his “Tuskegee machine” in part through financial contributions from New York philanthropists, speaking at Madison Square Garden before audiences of corporate tycoons to raise funds. Soon, however, the city became home to a much more ideologically diverse black politics. Wallace traces the establishment of the National Association for the Advancement of Colored People; the founding of The Messenger, socialist A. Philip Randolph’s magazine; and, ultimately, as Harlem emerged as the city’s center of black life, the rise of Hubert Harrison, whose politics blended socialism and black nationalism. By 1917, following what Wallace describes as a “racial pogrom” in East St. Louis, Illinois, in which some 200 black people were killed, Harrison helped organize a silent march down Fifth Avenue, with people bearing placards that read “Mother, Do Lynchers Go to Heaven?” It was the largest protest of African Americans in the city’s history, and it marked the emergence of Harlem as the heart of resistance to racism throughout the country.

Even so, left-wing politics never came to dominate New York, and, as Wallace shows, the city also served as a center for reaction. The press denounced the Industrial Workers of the World as “vicious outcasts”; the police arrested nearly 200 members of the IWW’s “army of the unemployed” on charges of incitement to riot; and open-air meetings were suspended by the city. While the fire that killed 146 workers at the Triangle Shirtwaist Company resulted in the passage of workplace-safety legislation, the owners of the factory were nevertheless acquitted in court of any wrongdoing. Even the suffragist movement—which gained momentum through an alliance of wealthy society women and downtown labor activists—failed to carry the city when New York State voted on a 1915 referendum to grant women the right to vote.

New York was also home to a flourishing political right. Advocates of eugenics and racist pseudoscience, such as Madison Grant and Charles Benedict Davenport, built institutional centers in the city, while Columbia University became the home base for John Burgess, the founder of political science, who denounced Reconstruction as “a monstrous thing” and built his discipline on the principle of racial difference. While many structural features of city life made it possible for radical politics to thrive—among them the city’s density, its plethora of common meeting spaces and gathering points, and above all its inequality, which placed rich and poor in close proximity—none of this meant that the city’s elites would simply allow this new politics to flourish. Instead, they were determined to retain their control, no matter how much turmoil would come as the result.

Federal agencies often helped. Anthony Comstock, the repressive US postal inspector, banned an issue of Sanger’s publication, The Woman Rebel; when Sanger kept publishing material on contraception with the aid of an IWW printer, she was forced to flee the country, under threat of a 45-year prison sentence. (Following Comstock’s death in 1915, the US attorney dropped the charges against Sanger, but only after her husband spent a month in the Tombs for selling another of her publications.) Comstock also went after the bohemian artists, at one point arresting the 19-year-old receptionist of the Art Students League of New York when she gave him a free catalog that included three nude images.

The tensions between radical and reactionary New York came to a head in 1916, as the nation’s leaders debated entry into World War I. On the one hand, the example of a multiethnic city appeared to be a challenge to the ethnic nationalism that had begun to dominate European and, to some extent, American politics. But the subject of what constituted an American national identity was hotly contested in New York, where the pro-war constituency was mostly Anglophile and upper-class and the large anti-war camp was built around the multiethnic working and middle class. With the country’s entrance into the war, the transnational ideal that defined much of the city came into conflict with a shrill and forceful reassertion of militaristic patriotism, one that extolled white Christianity as the only true American identity. Wallace details the shutdown of radical magazines and organizations and the rise of hyper-patriotic organizations, not just among the elites but among many other New Yorkers as well.

For Wallace, despite the repression that accompanied the First World War, the story of New York is ultimately one of triumph—the narrative of a vast city coalescing, despite the intense pressures that might have pulled it apart. In the final pages of the book, he suggests that the experience of New York in its first 21 years as a consolidated city points to the power of shared participation: “Despite those two decades having witnessed nonstop battling between classes, races, ethnic groups, genders, and religions—verbally and at times violently—the center had held.” The “ties that bound”—shared institutions like the subways, the theaters, Tammany Hall, and most of all “the excitement and pride of living in a great city”—kept New York together, a single metropolis and a model of the cosmopolitan ideal. Although Wallace doesn’t make this point explicitly, his “Greater Gotham” is also a city that came to represent a template for the nation as a whole, a particular vision of what it meant to be American that is nearly the exact opposite of Donald Trump’s.

While Wallace’s invocation of New York as an alternative vision of American identity is welcome, there’s a way in which the book’s concluding depiction of a city unified despite its tensions runs counter to its broader narrative of struggle and contest. New York’s political and economic consolidation in the early 20th century (especially after World War I) also meant the shutting down of certain kinds of political challenges. By the end of 1919, the ranks of the Socialist Party had been decimated, thanks both to wartime repression and the internal splits in the party after the Russian Revolution. Emma Goldman had been imprisoned for organizing anti-conscription protests; and the rich network of newspapers, magazines, and political groups that had sustained the city’s left and its artistic counterculture had been driven almost out of existence. Only a few years later, the country would pass a “genuine 100 percent American immigration law” to shut out the “scum of the earth” (to quote New York real-estate developer and eugenicist W.E.D. Stokes). There’s a suggestion in Greater Gotham that the multiethnic working class of New York offered a counterpoint to the world of real estate, finance, and corporate capitalism—one that was able to check and contain its dominance and provide a real alternative. But was this actually the case?

Given the heft of Greater Gotham, asking for more might seem perverse. Still, one cannot help but wonder how Wallace’s story might have been different had he brought it forward to examine the ways the conflicts and tensions he describes affected New York’s response to the Great Depression. How did the class politics he explores inform the liberal state as it took shape in the postwar city, and how did they shape the undermining and transformation of that liberalism after the 1975 fiscal crisis? Going further forward still, how does the city he chronicles foreshadow the one of today, which is still home to dense ethnic and immigrant neighborhoods and where intense poverty exists right next to some of the most extreme wealth the world has ever known? While elements of the robust public sector that ultimately emerged out of those contests a hundred years ago still exist, the radical politics that once animated the city does not—at least, not in the forms it did during the years chronicled in Greater Gotham. Perhaps New York will someday find itself the center of such a political uprising once again, as the myriad dispossessed of a city dominated by extreme wealth may yet discover new points of rebellion. The intensity of life “compressed” (as E.B. White put it) in the city can never, as Wallace shows us, truly be controlled from the top. But we will have to wait for the sequel to find out why this radical spirit got lost in the later years of the 20th century—and whether this narrative of the rise of a city may also be, in a way, a story of its fall.



Mont Pelerin in Virginia
https://www.thenation.com/article/archive/mont-pelerin-in-virginia/
Kim Phillips-Fein
Sep 7, 2017

In the spring of 1972, a small group of thinkers met at the Russell Sage Foundation in New York City to discuss the role of “altruism and morality” in shaping social life. Among them was James Buchanan, an economist then teaching at Virginia Tech, who was best known as one of the founders of “public choice” theory—a branch of economic thought that sought to apply the methodology of economics to political decision-making.

At the conference, Buchanan delivered a paper on the theme of what he termed the “Samaritan’s dilemma.” Using the framework of game theory, he argued that most people had a strong incentive, in the short term, to extend charity to others. There was a definite loss of “utility” associated with denying help to someone who needed it. But the problem was that this innate longing to be charitable opened up the possibility that “parasites” would take advantage. To counter it, people would need to sacrifice their immediate utility for long-term well-being—just as a mother might have to spank her child (an example Buchanan used) in order to encourage good behavior in the long term. “A species that increasingly behaves so as to encourage more and more of its own members to live parasitically off and/or deliberately exploit its producers faces self-destruction at some point in time,” Buchanan concluded. In the context of the early 1970s—shortly after the heyday of Lyndon Johnson’s War on Poverty programs and the welfare-rights movement—the political implication was clear: For the sake of the few, the American welfare state had ended up putting the whole in jeopardy.

Buchanan never achieved the fame of Milton Friedman, who made it his life’s mission to popularize free-market ideas via his PBS show Free to Choose and his columns in Newsweek, nor has he ever received the same amount of attention as Friedrich Hayek and Ludwig von Mises in historical accounts of the free-market right in America. But in Democracy in Chains: The Deep History of the Radical Right’s Stealth Plan for America, Nancy MacLean tries to provide a critical history of the role of Buchanan’s ideas in the modern conservative movement. MacLean asserts that his thinking—and especially his caustic attitude toward the state and government—became a critical source of inspiration for several generations of conservative thinkers and activists. Using Buchanan’s papers, she also seeks to excavate his relationship to Charles Koch, one of the Wichita, Kansas–based oil billionaires whose financial contributions have been integral to building the intellectual infrastructure for libertarianism.

Democracy in Chains has already become the object of much controversy. A variety of libertarians and political philosophers—many, though not all, working in Buchanan’s tradition—have attacked the book, arguing that MacLean has read Buchanan’s work (and their own) out of context, distorted quotes, overreached on the available evidence, and failed to grapple adequately with Buchanan’s ideas. They also contend that her vision of Buchanan is Manichaean and melodramatic, and that her assertion that the many Koch-funded organizations comprise a “fifth column” bent on undoing majority rule is, at the least, unfair. Meanwhile, liberal and left-leaning commentators have offered mixed reviews, with some, like political scientists Steven Teles and Henry Farrell, arguing that MacLean doesn’t do enough to engage with the substance of public-choice theory, and others, like historians Bethany Moreton and Colin Gordon, praising MacLean’s hard-hitting analysis of the libertarian right’s fundamental hostility to ideas about majority rule. (I should say here that MacLean reviewed my first book and we know each other professionally; in fact, because we have participated in conference panels together, my name appears in her acknowledgments.)

Democracy in Chains isn’t a perfect book, if such a thing exists, and there are certainly many places one might criticize her argument. Seen as political history, MacLean’s claim that Buchanan’s thought provides, as she puts it, a “master plan” for thinking about the goals of Charles Koch suggests too seamless a connection between the two men, and as intellectual history, the book could have done much more to clarify Buchanan’s ideas and public-choice theory for a general readership, as well as to give some sense of why they appeal to so many, a point made by Teles and Farrell.

But despite its flaws, MacLean’s book makes two important interventions in the literature on the rise of the right. The first is that, by showing the various points of contact between free-market intellectuals and those organizing in opposition to the civil-rights movement, Democracy in Chains brings the Southern context of libertarian ideas to the forefront. Going back to George Nash’s 1976 intellectual history of American conservatism, most treatments of right-wing thinkers in the 1940s and ’50s say little about the opposition to desegregation, giving the impression that it was an unsavory product of shadowy grassroots groups like the Ku Klux Klan that had little to do with libertarian thought. But MacLean’s book shows how difficult it is to separate race from the history of the free-market right. She highlights how the work of thinkers like Buchanan echoed common arguments being made by anti-integration journalists and legislators and how, in turn, those opponents of civil rights then used his and other libertarian arguments to further their cause. Where other intellectual historians of the right have focused on the intent of the thinkers and the content of their work, MacLean’s emphasis is on how Buchanan’s ideas were used. This leads her to tell a story about the history of free-market conservatism that is very different from the one libertarians tell themselves, which is likely one reason that her book has become such a lightning rod for the right.

Her book’s second contribution is in some ways more complex: It is a question and a provocation more than an answer. By exploring the role played by the Kochs and others in financing various centers of intellectual and political activity, Democracy in Chains raises the question of how historians should approach and write about right-wing efforts to reshape political ideas. How can historians write about business activists such as the Kochs without overstating their influence? They may have believed they had a “master plan,” but how can we offer accounts of them without taking them at their word and making them appear all-powerful?

The first half of Democracy in Chains deals with Buchanan’s biography and the development of his career at the moment when “massive resistance” in the South was at its height. Buchanan grew up in small-town Tennessee, the grandson of a man who had been governor during the Populist era. Although his family was hardly well-to-do, they were not impoverished: The Buchanans were attended (at least when James was born) by an African-American live-in servant, and their farm was worked by a family of black sharecroppers. Intellectually ambitious, Buchanan attended a local teachers college before moving on to the University of Tennessee for his master’s and then the University of Chicago for his PhD, where he was influenced by Frank Knight and Milton Friedman (though he flatly disliked Friedman for the haughty superiority with which he treated students).

Shortly after his graduation from Chicago, Buchanan was hired by the economics department at the University of Virginia. He arrived there in 1956. It was, at the time, an all-male institution (women were not admitted to UVA until 1970) and almost all-white—the first black student came to the university’s law school in 1950 and withdrew after one year. The state as a whole was not much more egalitarian. Virginia in the mid-1950s was not only highly segregated but still tightly controlled by the Democratic Party, under the leadership of its longtime US senator, Harry F. Byrd. The Byrd Organization, as the Democratic machine in Virginia was known, recognized that enfranchising African Americans would mean the end of its own power, and so it had political as well as racist motives to fight the end of Jim Crow. As political scientist V.O. Key put it in his famous study of Southern politics, “Of all the American states, Virginia can lay claim to the most thorough control by an oligarchy.”

In 1956, the battle over integration was at its peak. High-school students from Prince Edward County, a district in central Virginia, had been among the plaintiffs in Brown v. Board of Education—the only one of the five cases that Brown comprised to have originated with students (they had organized a school strike over unequal facilities, and then agreed to work with the NAACP to challenge segregation more generally). Following the Supreme Court’s decision, the State Legislature passed a law stating that the governor had the right to close any school district that attempted to integrate. Tax-funded “tuition grants” for private schools would be given to families instead—effectively, a subsidy for the private “segregation academies” that were springing up throughout the South. The Legislature’s intransigence was buttressed by the writings of thinkers like James J. Kilpatrick, the editor of The Richmond News Leader, who argued in his 1957 book The Sovereign States that not just Brown, but the entire legal and constitutional infrastructure that supported central pieces of the New Deal—including Social Security, the federal minimum wage, and the National Labor Relations Act affirming the right to form unions—represented a breach of the Constitution that Virginia should resist.

From his position at the University of Virginia, Buchanan weighed in on the noisy debate over integration. Together with a UVA colleague hired at the same time, Buchanan wrote a report for the State Legislature making the case for ending the “monopoly” of “state-run schools” and replacing it with a system in which any parent who wanted to send a child to private school could do so with a tax-subsidized voucher. In this way, “every parent could cast his vote in the [educational] marketplace and have it count.” So that public schools didn’t enjoy an unfair advantage, Buchanan and his colleague suggested that the state should sell off its equipment and resources to private operators. (The economists shared a draft of their report with Milton Friedman, who said he didn’t think it went far enough—for example, parents should bear the full cost of their offspring’s education, Friedman argued, to ensure they had “the appropriate number of children.”)

In April 1959, the Virginia Legislature voted on whether counties could choose to abandon public education altogether and replace it with a “scholarship” approach; shortly before the vote, Buchanan’s report was published in the Richmond Times-Dispatch. But the measure did not pass, losing by a margin of 53–45; most legislators recognized that their constituents wouldn’t be happy to see public schools disappear. Still, in September of 1959, Prince Edward County simply closed all of its public schools, for white and black children alike. Nor did the Legislature order the county to reopen its schools, which would remain padlocked for the next five years, until the closure was ruled unconstitutional.

MacLean doesn’t show (or seek to show) that Buchanan himself held racist views, nor does she cite extensive correspondence with Kilpatrick or other advocates of segregation. But her reconstruction of this early episode in Buchanan’s career, she believes, demonstrates his willingness to work with the forces eager to find an alternative to integration, and it also shows the confluence between his vision of public schools as a government “monopoly” on education and those who wanted to found all-white private schools in Virginia. Perhaps of greater importance to MacLean, it offers us a critical historical context for interpreting Buchanan’s larger intellectual project and, in particular, his skepticism toward the state. The emergence of libertarianism in America, MacLean wants to show, had close connections to the resurgence of right-wing segregationism in the 1950s and ’60s.

In 1962, only a few years after the battle over the Virginia schools, Buchanan and his colleague Gordon Tullock published The Calculus of Consent, the founding text of public-choice economics. In it, they sought to bring the insights of economics—especially the methodology that takes the utility-maximizing individual as its starting point—to bear on politics. Most interpretations of the state and of government, they argued, were “grounded on the implicit assumption that the representative individual seeks not to maximize his own utility, but to find the ‘public interest’ or ‘common good.’” In contrast, they suggested that a clear-eyed, unsentimental analysis focused on “individualism” and on self-interest as key motivators would prove far more capable of explaining political outcomes.

The mode of analysis embodied by public-choice theory caught fire in the discipline. Some were drawn to the frisson of using economic thought to understand politics. Others appreciated the ironies and paradoxical outcomes it exposed: how its stance of realism and cool reason could allow an economist to dissolve the fuzzy clichés of the “public good.”

The shift that it made—away from totalizing concepts of the state, to focus instead on the self-interested individual as the starting point of analysis—was echoed in other works of social science written in the era by liberals as well as conservatives: most notably, John Rawls’s A Theory of Justice, which took the opposite position about the desirability of the welfare state. Still, regardless of Buchanan and Tullock’s intentions, the attack on public goods and institutions could not help but resonate, in the Southern context, with the long-standing antipathy toward any institutions capable of challenging the racial hierarchies that structured social life in the region. As MacLean puts it, “The driving analysis was less original in its basic convictions than later reviewers imagined. It was midcentury Virginia wine with a Mont Pelerin label.”

Koch only comes into the book in its second half, as MacLean traces Buchanan’s participation in the developing world of the libertarian right. By the mid-1960s, the UVA administration began to grow anxious about his influence. Criticism circulated within the university that the economics department was “too far to the right,” and that “absolute doctrinalism breeds absolute authoritarianism absolutely.” Someone at the top of the UVA administration commented that there was no one in the department “to the left of the John Birch Society.” When Tullock (whose degree was actually in law rather than economics) failed to be promoted to full professor, Buchanan threatened to leave UVA in protest. The university president stood his ground. Buchanan opted to exit, and he headed west for the University of California at Los Angeles.

UCLA turned out to be an even less accommodating home for a Southern economist. The wave of student-led protests over civil rights and against the escalating war in Vietnam swept through the UCLA campus in the late 1960s. Buchanan would go on to co-author a brief polemic about the state of affairs in American universities, titled Academia in Anarchy, with Nicos Devletoglou. The main thrust of their argument: Free or cheap tuition gave students an incentive to take advantage of their universities.

Never at home on the West Coast, Buchanan returned to the South in 1969, this time to Virginia Tech—an institution with little national reputation and far more modest resources at the time. From his new post, Buchanan renewed his attempts to build an intellectual movement: recruiting donors, bringing together small groups of like-minded thinkers, discussing the need to “create, support and activate an effective counterintelligentsia” to reshape “the way people think about government” (as he put it at one small gathering at his country cabin).

By the early 1970s, Buchanan wasn’t alone in these efforts; many on the right were discouraged and frightened by the emergence of the New Left, and MacLean describes the wide array of conservative political and intellectual organizations that developed in this period, including the Institute for Contemporary Studies in California; the Law and Economics Center at the University of Miami; the Liberty Fund, which ran annual summer conferences promoting the work of younger scholars; and the Institute for Humane Studies, an intellectual organization that offered seminars and conferences on free-market ideas. Through this growing constellation of libertarian academic organizations, Buchanan met the other main character of Democracy in Chains: Charles Koch. Koch’s first major financial contributions went to the Center for Independent Education, a group promoting voucher programs and private schools in Kansas. The two men connected quickly: After meeting Buchanan, Koch invited him to be the first dinner speaker at the newly formed Charles Koch Foundation.

As the 1970s wore on, relations between Buchanan and his department chair at Virginia Tech worsened. In 1983, after a stint in Chile (where Buchanan advised Gen. Augusto Pinochet on the Chilean Constitution, a move he seems never to have regretted), Buchanan moved on to George Mason University. Once again, he chose a less prominent public institution to which he promised to bring new funds through his connections with conservative circles. In these years, Buchanan’s reputation was on the rise—he won the Nobel Prize in 1986—but he was largely immune to the euphoria that swept the right during the early years of Ronald Reagan’s presidency. Speaking at a Mont Pelerin Society meeting in Chile in November 1981, Buchanan warned: “We should not be lulled to sleep by temporary electoral victories of politicians and parties that share our ideological commitments.”

Like Buchanan, Charles Koch was pessimistic about the prospects of American conservatism. Especially after the failure of House Speaker Newt Gingrich’s 1994 “Contract With America,” Koch feared that his beloved libertarian movement was under siege by “personality cults” and had become “ossified by purity tests,” and his donations certainly suggested that he wanted to elevate the reputation of public-choice theory. In fact, he would eventually move the Institute for Humane Studies from its base in California to George Mason University, which was rapidly (partly as a result of Buchanan’s time there) becoming an epicenter of conservative thought.

The relationship between Buchanan and Koch was not without its own tensions. In the final chapter of her book, MacLean skips forward to the late 1990s, when Koch donated $10 million to George Mason University to support the expansion of the James Buchanan Center. Oddly, this culmination of their partnership seems to have put the two at odds. A year after Koch’s grant to GMU, Wendy Lee Gramm (an economist and the wife of Senator Phil Gramm) wrote a fund-raising letter to potential donors promoting the Buchanan Center on the grounds that its proximity to Washington, DC, meant that it was “uniquely positioned to advance freedom…to the very people who’ll make a difference.” She mentioned, in particular, House majority leader Dick Armey’s efforts to cut personal, corporate, and capital-gains taxes.

Buchanan, who hadn’t been told about the fund-raising letter, became anxious that its explicitly political nature might violate the center’s tax-exempt status, and he complained to one colleague that it “amounts to exploitation of me” and verges on “fraud.” Buchanan may have also been worried that the intellectual center that bore his name was being conscripted into a narrowly partisan battle: “Quite frankly,” he added, “I am ‘pissed off,’” and he warned the GMU administration that the people running the center included “no-one with academic standing.” In part because of this experience, Buchanan seems to have withdrawn from the center’s active life. He died in 2013, and, MacLean notes, Charles Koch didn’t attend his memorial service.

Although this runs counter to one of the arguments in her book, MacLean also delineates some of the limits of Koch’s funding efforts. While he may have wanted to build an intellectual movement, Koch wound up alienating the very thinkers, like Buchanan, who were drawn to the cause but grew frustrated by his involvement. MacLean quotes with relish the various documents, letters, and speeches in which Buchanan outlined his desire to build a “Third Century” movement that would be able to reverse the trend toward ever-larger governments that he found so alarming. While such language indeed suggests the political energy and sense of momentous self-importance that could make a gathering of economists in the mountains feel like a world-historic event, Democracy in Chains also shows (perhaps inadvertently) how difficult it was to translate these grand sentiments into political or even intellectual change—despite the involvement of powerful businessmen like Koch and their vast resources, or ambitious academics like Buchanan, who repeatedly tapped into this network over the course of his career.

Certainly, by their standards, both Koch and Buchanan have accomplished much—weakening unions, attacking regulations, building organizations to push their policies at the state level—and their relentless denigration of collective action and veneration of the market’s wisdom have helped to shape the political climate that enabled Donald Trump’s victory. Yet today, the movement they helped build is being consumed by infighting; its policy proposals remain as unpopular as ever; and its agenda is now improbably linked to a president who came to power with cynical promises not to cut the entitlement programs (like Social Security) that the Koch brothers have long hated.

Where political sentiments have changed, it’s far from clear that the Kochs and their money, or Buchanan and his ideas, are solely or even primarily responsible. Take the issue of charter schools and vouchers, for example. While the current prominence of the idea of an alternative to public education grew out of the work of thinkers like Buchanan and Friedman and the financial activism of the Koch brothers—and while it also has its roots, as MacLean shows, in the “segregation academies” that promised white families a “choice” about how to spend their education dollars—the charter-school movement can also trace part of its lineage to the support of United Federation of Teachers leader Albert Shanker and many intellectuals associated with the Democratic Leadership Council. No doubt there has been a real transformation of attitudes toward public education and teachers’ unions, but the Kochs are only one part of this, not the whole story.

Obviously the Kochs, Buchanan, and their ilk have played a significant role in helping influence elections and policy-making. Their pools of money and their ideas have transformed the political terrain in countless ways. Writing about their activism, as MacLean does in her timely and thought-provoking book, is one way to demystify and contest it. But there is also a danger in charting the rise of the right as though it has simply unfolded from a “stealth plan” to political reality: It can make it seem almost impossible to resist, and history doesn’t work that way—not even for the Kochs.

Editor’s note: This article has been updated to correct an error introduced in editing.



The Two Women’s Movements
Kim Phillips-Fein, Jun 1, 2017
https://www.thenation.com/article/archive/two-womens-movements/

Not even death could stop Phyllis Schlafly. Her final broadside, The Conservative Case for Trump, was released the day after she died at the age of 92 last September. It was a fitting bookend to her first, A Choice Not an Echo, her self-published endorsement of Barry Goldwater for president in 1964. Unlike many other Christian conservatives who backed Texas Senator Ted Cruz in the GOP’s 2016 primaries, Schlafly supported Trump from the outset. Early in the year, she gave an hour-long interview to Breitbart News, making the case that Trump represented the only chance to overturn the “kingmakers” (her word for the Republican establishment). Like Trump, Schlafly’s politics were often focused on a muscular concept of national security. She wanted to see a “fence” protecting the country’s southern border, and she argued that Democrats were recruiting “illegals” in order to bolster their electoral chances. Despite his three marriages, she saw Trump as an “old-fashioned” man whose priorities were hard work and family. After Schlafly died, Trump returned the love. He gave a eulogy at her funeral in the Cathedral Basilica of St. Louis, raising a finger to deliver a promise: “We will never, ever let you down.”

Schlafly emerged on the national scene in the early 1970s, when she led the campaign against the Equal Rights Amendment through her Eagle Forum. Although she’d been well-known in conservative circles since the 1950s, antifeminism brought Schlafly new levels of recognition. In a few short years, she became a household name for a resurgent cultural conservatism, one that ultimately defeated the ERA and helped to elect Ronald Reagan president. Her rise during this period is the subject of Divided We Stand by the political historian Marjorie J. Spruill, a fascinating new account of the “two women’s movements” of the 1970s.

Not so long ago, there was little historical literature about the 1970s. One account of the decade, published in 2005, bears the cryptic title (an allusion to Joseph Heller’s novel) Something Happened. Today, there’s a wave of literature on the era, often approached through the sense of confusion and chaos that defined its art and culture (the title of another book: 1973 Nervous Breakdown). Spruill’s narrative joins the many works insisting on the decade as a turning point. Focusing on the 1977 International Women’s Year conferences, a series of state and national meetings sponsored by the federal government to create a set of principles on women’s rights for policy-makers, she tells the story of the cresting of feminism’s second wave and the counter-feminist mobilization that emerged in response.

From the vantage point of the present, there is much that seems remarkable about this time. Who can imagine, today or at any point in the past 30 years, the federal government funding conferences throughout the United States with an eye toward crafting some kind of proposal to address sexual inequality? But Spruill suggests that what initially appeared a victory for feminism ultimately became the springboard for a counterrevolution. The state meetings provided ample organizing opportunities for women in the nascent antifeminist movement, and the final national gathering in Houston was met by a “pro-family” rally that brought out tens of thousands.

The strength of feminism—its claim to represent women as a whole—turned out to be a weakness as well, since those women who disagreed with its central tenets could puncture the moral claim of unity simply by insisting that the movement did not speak for them. As Spruill points out, “Solidarity among feminists was not the same as solidarity among American women.” In the end, she argues, the 1970s not only gave us some of the most important victories of the modern feminist movement, but also launched the opposition to it that would eventually put Donald Trump in the White House.

As the decade began, feminism was on the march, not just in the streets but in electoral politics. Many historians have focused primarily on its radical edge: the consciousness-raising groups, the Women’s Strike for Equality, the abortion speak-outs, the writings of people like Kate Millett and Shulamith Firestone. Spruill, by contrast, paints a picture of feminism in the early 1970s as a pragmatic, bipartisan movement, one that was focused on winning greater economic and political power for women rather than on challenging male authority in the family and home.

There were certainly ample grounds on which to fight. In 1971, men held 98 percent of the seats in Congress. The National Organization for Women had to press the Equal Employment Opportunity Commission to consider cases of sexual discrimination as part of its purview. Until a 1975 Supreme Court ruling, states were permitted to exclude women from juries. Although she evokes the broader cultural politics of feminism, Spruill sees changing these political and legal inequalities as the central goal of the movement. After all, the politics of sexuality and the family were inextricably connected with women’s structural rights: Roe v. Wade was preceded by a 1972 ruling that legalized the prescription of birth control to unmarried women.

For Spruill, what is most notable about the early 1970s is how mainstream feminism already was. As she notes (it’s the title of Chapter 2 in her book), there was something of a “feminist establishment” by that time. She points to the Kennedy administration’s decision to set up the President’s Commission on the Status of Women in 1961 to explore the role of women in American life. Its report, “American Women,” was published in 1963, the same year that The Feminine Mystique came out. Indeed, Betty Friedan was an adviser to the commission. By the late 1960s, states throughout the country were undertaking similar investigations. The Republican Party had backed the Equal Rights Amendment beginning in the 1940s, whereas Democrats were wary about supporting an amendment that would overturn labor legislation protecting and benefiting women. Congress passed a bill creating a national child-care system (subsequently vetoed by President Nixon), as well as the 1974 Women’s Educational Equity Act, which was intended to fund programs to counter “sex-role socialization and stereotyping.” President Ford appointed women’s-rights advocates like Jill Ruckelshaus, dubbed the “Gloria Steinem of the Republican Party,” to leadership positions. Even Alabama Governor George Wallace and South Carolina Senator Strom Thurmond supported the ERA in the early ’70s.

This emerging feminist consensus was undone by the women who became part of the antifeminist mobilization. These were women who insisted on the unique nature of women’s identities as mothers and homemakers. They blamed the second-wave feminism of the ’60s and ’70s for leading women astray, and they were drawn to Schlafly, a Catholic mother of six whose hair was always perfectly coiffed and who preferred pastel dresses to pants.

To her detractors, Schlafly seemed impossibly prim. But to her followers, she looked like a true “lady.” Schlafly had been an activist for years. She got her start in 1952, when she ran for Congress from Illinois as a 27-year-old anticommunist housewife. Her Goldwater book, which argued that the “New York kingmakers” in the GOP had sold out the party’s conservative base, became the most widely distributed tract of the 1964 primaries. Schlafly’s focus was foreign policy; she hadn’t been especially interested in women’s issues when she started out in politics. But she knew a constituency when she saw one, and in 1972, she published an essay attacking the ERA titled “What’s Wrong With ‘Equal Rights’ for Women?”

Schlafly’s sneering portrayal of feminists would be familiar to anyone who follows the alt-right today. American women, Schlafly wrote, were “the most privileged” class of people ever to have lived, and the real heroes of women’s liberation were the men who’d invented the sewing machine, the automobile, and frozen food. Thanks to them, modern mothers were free to spend time enjoying their children and perhaps to take a part-time job or volunteer outside the home if they wanted more to do. There was no real problem of inequality; instead, the “aggressive females on television talk shows yapping about how mistreated American women are” were tricking women into feeling aggrieved. Ms. magazine, according to Schlafly, was filled with “sharp-tongued, high-pitched whining complaints by unmarried women” who “view the home as a prison, and the wife and mother as a slave.” The magazine’s subtext was “how satisfying it is to be a lesbian.”

Schlafly was careful to say that of course she believed in more opportunities for women—she just didn’t think the ERA could secure them. Instead, it would rob women of their special place in society, while also failing to deliver on its promises of equal pay and political representation, thus leaving women worse off than before. Meanwhile, the ERA was being promoted by those in the Republican Party who condescended to people like Schlafly—a “tight little clique running things from the top” that refused to give “equal rights” to delegates who rejected the amendment.

Although Schlafly was Catholic, the vision she promoted throughout the 1970s appealed to a host of conservative Protestant and Mormon women as well. Spruill paints a picture of antifeminism as a rollicking political movement, one that—ironically, like feminism itself—offered women a way to participate in a world outside the family. Along with the stories of feminist leaders like former Congresswoman Bella Abzug and Midge Costanza, an adviser to President Carter, Spruill tells the story of women on the right like Lottie Beth Hobbs of the Church of Christ, founder of the memorably named group Women Who Want to Be Women, which built on fundamentalist Christian networks and Bible-study groups to recruit women to the cause. Hobbs wrote a flyer titled “Ladies Have You Heard?” widely known as “The Pink Sheet” for its pink paper, which was circulated through church newsletters, local newspapers, and beauty parlors. The ERA, it warned, would demolish family life, destroy homes, bring an end to chivalry, cause women to be drafted into the Army, and force them to use unisex public bathrooms. Ultimately, feminism was part of a secular-humanist crusade that sought to subvert American values from within—and that would leave the United States open to a communist revolution or takeover. Even efforts that might seem worthwhile, such as shelters for battered women, appeared ominous to the forces of antifeminism: They were havens that would allow feminists to nurture women’s grievances against their husbands and foment discord in otherwise happy families.

In 1975, inspired by the United Nations (which had declared 1976–1985 the Decade for Women), Congress mandated that each state hold a convention to elect delegates for a national meeting on women’s issues in Houston. The goal was to create a National Plan of Action that could guide future legislation on women’s rights. The rise of the antifeminist movement meant that the Houston conference in the fall of 1977—and the state conferences leading up to it—would not represent the triumph of political mobilization that its original organizers had hoped for. From the outset, the state conferences were heavily contested. In Oklahoma, busloads of conservative women showed up and voted for a resolution stating that homemaking was the “most vital and rewarding of careers for women.” In Alabama, conservatives tried to disrupt the proceedings in any way they could, including paying the $2 registration fee with 200 pennies. In Mississippi, a group called Mississippians for God, Country and Family overwhelmed the feminists who were present. The group managed to take over the proceedings, passed its own platform (which included vigorous denunciations of abortion, day care, and “sin and immorality”), and elected a delegation to the national convention that included six men.

The Houston convention turned out to be not much better. True, it was a milestone gathering with some moments of historic rapprochement, such as Betty Friedan’s endorsement of a resolution calling for equal rights for lesbians. As she wrote later, Friedan had always feared that feminism would be confused with lesbianism, but she was even more concerned about the right’s efforts to “fan a hysteria of hate and fear.” And while there was substantial debate over abortion rights, the conference ultimately supported them.

But what happened within the conference was less important than what transpired outside. Schlafly had organized a Pro-Life, Pro-Family Rally in Houston at the same time. The crowd was large, almost all white (in contrast to the feminist conference), and it was energized. A former Houston mayor welcomed the participants of the rally warmly. One 34-year-old woman from Bismarck, North Dakota—a mother of nine—described driving down in a yellow van with six other women: “We felt we had the Lord knocking on the top of the van all the way down.”

After the National Women’s Conference was over, it became clear that President Carter was ambivalent about pushing forward the recommendations contained in the National Plan of Action; instead, he fired Abzug, who had been chair of the conference and co-chair of the National Advisory Committee on Women. As the Democrats moved to distance themselves from the feminist movement, the Republican Party embraced the ascendant antifeminist mobilization. After 40 years, support for the ERA was finally excised from the GOP platform in 1980. Schlafly trumpeted the successes of Houston, invoking the “ERAers, the abortionists, and the lesbians” at the conference to rally people to her cause. Indeed, the plan adopted by the National Women’s Conference, with its support for abortion, its opposition to discrimination, and its enthusiasm for the ERA, became a conservative organizing vehicle. One activist pronounced it the “best recruiting tool I’ve ever had…I just spend twenty minutes reading it to them. That’s all I have to do.”

The final chapters of Divided We Stand trace a line from the antifeminist politics of the 1970s to those of today. The women who were dismissed by feminist leaders in 1977 are the foremothers of the women who voted for Trump in 2016, and Schlafly was at their helm once again. Far from being alienated by Trump’s crude machismo, many of these women saw him as a “family man”—a patriarch with sexual swagger, a man of authority willing to use his power and wealth to command women—and this perception actually deepened his support among some voters.

In other ways, too, the events in Houston in 1977 seemed to anticipate the 2016 election. In Spruill’s account, a feminist establishment backed by Washington faced off against a ragtag group of private citizens, the antifeminists brought into the streets by Schlafly’s call to arms. The leaders of the Houston conference were hubristic, in the manner of Hillary Clinton: They simply assumed that they could count on the allegiance of women, so much so that they never really cared to look across the street in Houston at the crowds massing outside the convention center.

Yet what stands out most in Spruill’s account is just how different feminism and its opponents were in the 1970s, contrasted with the feminism and antifeminism of our moment. Feminism was then a new mobilization, one replete with ideas, many of which were in conflict with one another, but all of which sought to advance a deep transformation of American society. The leaders in Washington were shaped by the women throughout the country whose politics powered theirs. This was light-years away from what passes for feminism in mainstream politics today, embodied most clearly in Hillary Clinton. It was a movement with a grassroots base, one that reached deep into the culture, and one that often took a forthright, confrontational stance toward the existing distribution of power and resources—a stance very different from a vision of empowerment conceived mostly in terms of integration into a corporate meritocracy.

Schlafly and others tried to paint feminism as a movement of insiders, of powerful women with Washington connections who sought to dupe their less-privileged sisters. But the various leaders, from Bella Abzug to Jill Ruckelshaus, and the state commissions—even the National Women’s Conference—were not really what gave feminism its power or made it appear dangerous to the conservatives who opposed it. Far more important were the millions of women who met in small groups, attended marches and demonstrations, and recognized the gap between the cartoonish stereotypes of femininity so prevalent around them and the realities of their own lives.

Antifeminism always had a different dynamic: Its power came from its alliance with institutions and people committed to rolling back the welfare state, opposing godless communism, and protecting companies from both government regulation and workers’ movements. As Schlafly’s own career and roots in organized conservatism suggest, the antifeminist agenda was always very closely connected to that of the larger right. It did not seek to revolutionize social relations, but to bolster traditional society. It accepted a conservative politics of God and capitalism, military might, and hierarchy in the workplace as well as in the home. Its appeal came from the life it breathed into a fantasy of complete feminine fulfillment through service to family, church, children, and men—an ideal linked to a broader vision in which the family itself was under attack.

Many women in the 1970s may have been drawn to the cause because of the ways that it glorified their work in the home: planning and cooking meals, doing laundry, caring for children, and keeping the social bonds of the community intact. This work was and often remains invisible in modern society despite its necessity, and the antifeminism of the 1970s recognized it, extolled it, even celebrated it. (It should go without saying that such enthusiasm was always limited to ideology; the antifeminists didn’t propose any material support that might actually make the labor of care easier.) But even if women felt that antifeminism praised their household work, granting them a sense of importance that was hard to come by otherwise, for leaders such as Schlafly, let alone the Mississippi men who sat on the Houston delegation, antifeminism was really part of a much larger political program—one that had to do with rolling back liberalism overall. Despite the central role of women in pushing it forward, in certain key ways the antifeminist mobilization was never about women at all.

Today, it is even less so. When conservative provocateurs such as Milo Yiannopoulos mock “feminists” for being overweight, drinking venti frappuccinos, and damaging men with false rape claims, does anyone think he is doing so on behalf of women, let alone saving the endangered family? The arguments are repetitive, echoing much of what was said in the 1970s, yet the constituency has changed. Instead of speaking to and for religious women, the antifeminist cause seems to have been taken up today by young men who see society as a whole as a desperate competition—one in which some people get luxury jets and lush resorts and others get work requirements for Medicaid—and who fear being “losers” in this war of all against all.

Divided We Stand evokes two movements, two equal mobilizations, struggling over the role of women in America, each with its own well-intentioned supporters, divided in their vision of the nation and their sense of the place of families and women within it. But is this parallel framework really the right one? In the book’s final pages, Spruill likens Houston to Seneca Falls, suggesting that “most Americans” now hold “progressive views on women’s and gender issues.” The ERA is back on the march, with the Nevada State Legislature ratifying the amendment earlier this year, 35 years past the deadline. The election of Trump obviously tapped into antifeminist politics, but it has also galvanized women revolted by his brand of leering masculinity and its connections to the politics of wealth and privilege, who are seeking to build a new feminism that connects women’s rights to a broader welfare state and deeper economic security. Underneath it all is the reality that women’s lives conform less and less to the image of total submersion in family life once peddled by Schlafly and company. The promises of the National Women’s Conference in Houston, and the visions of 1970s feminism, have not yet been fulfilled—but they remain as beacons for a new generation.



The Roots of American Conservatism
Kim Phillips-Fein, May 4, 2016
https://www.thenation.com/article/archive/the-roots-of-american-conservatism/

When Barry Goldwater sought the Republican nomination for president in 1964, his opponents—especially Nelson Rockefeller and George Romney—pilloried him for holding views that had no basis in reality, which for them meant mainstream politics. Here was a politician who criticized labor unions and had made an enemy of the United Automobile Workers; who rejected any suggestion of peaceful coexistence with the Soviet Union; who loathed Social Security and argued that the federal government should play no role in guaranteeing civil rights; and who warned of a growing criminal threat that he seemed to associate with unruly protesters. Perhaps worst of all, Goldwater refused to distance himself from the conspiratorial John Birch Society, accepting their support as he fought for the nomination. When his loyal delegates waged a dogfight at the Cow Palace and secured him the candidacy, he tipped his hat to the Birchers in his acceptance speech: “I would remind you that extremism in the defense of liberty is no vice! And let me remind you also that moderation in the pursuit of justice is no virtue.”

For all the fear of extremism in 1964, neither Goldwater nor his opponents could possibly match the sheer spectacle of the 2016 race for the Republican nomination, with its distinct resemblance to a reality-TV show. Donald Trump’s gold-plated hair is the least of the attractions. With the various candidates taunting each other for being insufficiently pro-gun, anti-immigrant, or pro-life, mocking each other all the while with locker-room humor, it seems that the conservative movement has reached the end of the line. One by one, the putatively mainstream Republicans—the patient Jeb Bush, the stolid Scott Walker, even the obstreperous Chris Christie, all of whom did their duty by attacking public-sector unions, defending the right to work, and pushing tax cuts—have been kicked to the sidelines.

What appeals to Republican primary voters, especially those who are radicalized and revanchist and have a taste for nonsense and paranoia, is Ted Cruz’s “defense of religious liberty” and Trump’s rich-guy braggadocio, especially his visions of brown-skinned immigrant hordes stampeding over the American border to violently seize our jobs. The Tea Party finds itself in a standoff with voters happy to jettison the old truisms of the free market in favor of a pumped-up nationalism, leaving David Brooks and National Review to tie themselves in knots explaining that none of this is genuine conservatism. The right has experienced schisms many times before, and its collapse has been predicted, erroneously, on many occasions, going back to LBJ’s victory over Goldwater in 1964 and as recently as Obama’s election in 2008. Still, the divisions and chaos unleashed by this primary season make one wonder: How long can this possibly go on?

For a generation, scholars of American politics, almost all of whom are liberals or on the left, have been driven by a sense of bewilderment about the right. Ever since Ronald Reagan’s election in 1980, they have puzzled over how, in the face of what seems to them its self-evident backwardness, the right’s politics of fantasy and rage has remained imperishable. Their work has looked for the origins of conservatism everywhere, from suburban kitchens to corporate boardrooms to academic departments. Their arguments are characterized by a distinct note of surprise and disbelief about the lasting power of conservatism. Wasn’t the politics of religious fundamentalism refuted in the Scopes trial of the 1920s, which concerned the teaching of evolution in public schools? Didn’t the 1929 stock-market crash and the Great Depression prove the folly of laissez-faire? Weren’t the Southern segregationists defeated by the civil-rights movement, and the role of women in social life and the workplace thoroughly transformed by feminism? Why, then, do the politics of free markets and cultural reaction keep returning like some Republican Freddy Krueger?

Historian Kathryn S. Olmsted’s Right Out of California enters directly into these debates, arguing that the origins of today’s conservative movement can be found in the agricultural plantations of California and the fierce labor conflicts that broke out in the state during the 1930s. Olmsted contends that looking at support for the John Birch Society in the postwar California suburbs (as Lisa McGirr did in her pathbreaking Suburban Warriors) skirts the centrality of economics and labor history in the development of conservative ideas, just as historians who have examined the role of the business reaction against the New Deal on the national level gloss over the regional importance of the West in developing conservative coalitions.

Olmsted does indeed have an amazing story to tell. The largest agricultural strike in American history took place in the San Joaquin Valley in the summer of 1933, when nearly 20,000 cotton pickers walked off the fields. The next year, San Francisco was paralyzed by a general strike following the violent reaction to a longshoremen’s strike. The owners of the “factories in the field” responded to the pickers’ strike by painting the uprising as one that threatened to unsettle traditional norms of family and racial hierarchies, appealing to cultural conservatism in order to turn white workers against the union—and, more broadly, the labor policies advanced by the New Deal. At the same time, they depicted the New Deal itself as akin to socialism, even as the New Dealers sought to distance themselves from the radical left.

* * *

California in the 1930s was a state of extremes, divided between the desperate poverty of the agricultural labor force and the luxury of Hollywood. Not only was it the site of the labor controversies that Olmsted chronicles, but in the 1934 campaign for governor, the erstwhile socialist journalist Upton Sinclair, author of the muckraking classic The Jungle, launched his EPIC campaign (the acronym stood for “End Poverty in California”), which sought to sponsor cooperative farms and factories to directly employ and feed impoverished residents, impose a progressive income tax throughout the state, and create state pensions to support all aged and disabled people. The Progressive-era writer Lincoln Steffens (whose trip to Bolshevist Russia prompted him to declare, “I have seen the future, and it works!”) and his young wife, Ella Winter, were drawn to the strikes, and they managed to recruit writers like Langston Hughes and John Steinbeck to the cotton fields to tell the workers’ stories to the nation. Steinbeck’s 1936 novel In Dubious Battle relied on his conversations with cotton workers, though in the course of writing it he turned the crop from cotton to apples and wrote Mexican and African-American pickers out of the story almost entirely. He also replaced the freewheeling radicals who led the strike with grim, doctrinaire Communist ideologues.

Olmsted argues that the agricultural workers were far more racially and ethnically diverse than those depicted by Steinbeck, and their leadership more dynamic. In particular, she revives the memory of Caroline Decker, a young Southern woman who had become a party member not for ideological reasons or because she found the Soviet Union appealing, but simply because it was the group that seemed to be doing the most to challenge conditions of extreme poverty. Just 21 years old, she emerged as one of the leaders of the cotton workers, walking the picket lines wearing fashionable high heels and rallying groups of workers with whom she would seem to have had little in common. In Dubious Battle had no room for a figure like Decker, instead portraying the workers as (mostly white) innocents trapped between the cruelty of their employers and the rigid political fantasies of their (all-male) leaders.

Olmsted depicts the world of the agricultural workers and the radical circles that formed around them with great subtlety and care. In addition to portraits of Decker, Ella Winter, and Pat Chambers (another strike leader, also a Communist organizer), she uncovers people like Pauline Dominguez and her son Roy, a 7-year-old Mexican American who plucked cotton from dawn until it was too dark to see in the broiling heat of the fields. When such workers heard rumors that the new Roosevelt administration in Washington, DC, had guaranteed their right to form labor unions, a wave of spontaneous strikes broke out across the valley. Fruit and vegetable workers in particular sought to leverage time; the people who picked berries, peas, cherries, peaches, and lettuce walked off the fields at the height of the harvest, leaving the produce to rot. Their faith in Roosevelt, however, was misplaced: The National Industrial Recovery Act of 1933, like the Wagner Act two years later, would explicitly exclude agricultural workers from the right to organize unions.

The strikes were met with fury by the growers. “By all that is fair and just, have the American farmers no rights over Communists and aliens?” asked a San Joaquin Valley newspaper op-ed. As the organizing spread to the farms of the Imperial Valley in the southern part of the state, so did the reaction. Local police officials supported the kidnapping and beating of an ACLU lawyer who had agreed to speak about the right to strike. When the federal government set up a commission to gather information about conditions in the valley, the growers scoffed at the idea that they would be criticized for “defending life and property by the only means they had; that is, by arresting the treason-preaching ‘comrades’ and throwing them in jail.”

In an attempt to link themselves to the old Jeffersonian ideal of the small farmer, the agribusiness giants (with support from the broader corporate community) organized a group called the Associated Farmers to militate against labor rights and better coordinate their response to radicalism in the fields. The businessmen who played a part in the new effort sought to cover their tracks. As one representative of the Los Angeles Chamber of Commerce put it, the participation of other businessmen in the Associated Farmers “should be kept a deep, dark secret. This is the only way you can win the fight.” The Los Angeles Police Department aided the cause by honeycombing labor circles with anticommunist spies; the Associated Farmers also kept its own records on “known radicals, fanatics and Communist sympathizers.” Meanwhile, special press agents were hired to coordinate the opposition to Sinclair’s EPIC campaign. A cartoonist was hired to do subtle drawings—one depicted a bride and groom intimidated by a large black blob labeled the “blot of Sinclairism.”

As the backlash spread, the radical writers who had congregated in California grew afraid. Hughes was targeted by newspaper articles that taunted him as the “guest of ‘honor’ at parties,” noting that “white girls have ridden down the street with him, have walked with him, smiling into his face.” Afraid that he might be lynched, Hughes left California for Harlem. In the summer of 1934, Decker (by that time known as the “blonde flame of the red revolt” in the papers) and several other Communist organizers were arrested and charged with criminal conspiracy; she was eventually convicted. Even at sentencing, Decker stood by her beliefs. “We are not being convicted as criminals,” she insisted. “We are convicted for union organization. The verdict is a conviction of thousands of workers, farmers and students with whom we have been associated.”

* * *

In her final chapter, Olmsted suggests some of the ways that the California fields came to exert a surprising role in American politics well beyond state lines. Richard Nixon first ran for Congress against liberal Representative Jerry Voorhis, a former Socialist who had supported lettuce strikers. As governor, Ronald Reagan fought Cesar Chavez and the grape boycott. (He denounced the boycott on television while snacking on a bunch of grapes; grape growers ran an ad campaign urging Americans to embrace their “consumer rights” and “Eat California Grapes, the Forbidden Fruit.”) Clark Kerr, who would later preside over the University of California system during the 1960s, cited the strikes as formative for his politics, in particular his sense of the growers’ willingness to exploit fears about Communism to turn popular sentiment against the strikers. Olmsted describes the way that the Associated Farmers encouraged sympathetic businessmen to send telegrams and letters to California Governor Frank Merriam praising his hostility to the union, which in turn were celebrated by pro-grower papers like the Los Angeles Times as evidence of a grassroots rejection of labor. “After the invention of synthetic grass, political observers would call this technique ‘Astroturf,’” Olmsted explains. “President Richard Nixon would perfect the tactic, but it had its origins in the grower campaign against the unions.”

Olmsted doesn’t try to make the case that postwar conservative activists literally looked to California for inspiration; she leaves the parallels implicit. (For example, the Koch brothers’ tactic of funding organizations with populist-sounding names—the Center to Protect Patient Rights, Americans for Job Security—is similar to the way the Los Angeles Chamber of Commerce concealed its role in the formation of the Associated Farmers.) More than seeing a conservative playbook in Depression-era California, Olmsted is interested in locating what Raymond Williams called “structures of feeling.” She wants to describe the conservative way of seeing politics, whereby any challenge to economic power automatically comes to be associated with a broader cultural assault—one so destabilizing that it threatens home, family, religion, everything most dear and important in the world.

But despite Olmsted’s assertions and her rich, gripping narrative, it’s not clear that California was the cradle of the national conservative movement after World War II. The conservative sensibility, with its brutal sentimentality, can be found in thinkers going back to the French Revolution and Edmund Burke. Nor is it clear that the American version of this worldview originated with California growers: In the United States, elements of it can be seen in the antiradical reaction during and after World War I, in the campaigns against the Industrial Workers of the World in the early 20th century, and even during Reconstruction after the Civil War, just to name a few occasions.

This raises a larger question: Why should we approach the conservative movement as though its origins spring from a single source, as though it were a riddle with only one correct answer? Rather than treating it as a bizarre mystery that needs to be explained, historians of American conservatism might start from the premise that elements of this worldview have been present throughout American history. Far from being surprising, this would be just what one would expect in a country with a history of plantation slavery, racial segregation, anti-union politics, and economic inequality. The desire to name the origins of the right, to dissect the movement and understand where it came from, reflects the notion that knowing what created it—and how it created itself—may make it easier to undo. Implicit in the quest for origins is the hope that conservatism is a discrete, autonomous force that can be easily disentangled from the rest of our common past. It’s as though revealing its secrets—naming them, pinning them down—might grant us the power to unmask and unmake it. The spell would be broken. By contrast, taking the full measure of conservatism’s scope in American politics may entail accepting it as an expression of certain constant beliefs and ideas in the country’s history—ones that point to tenacious aspects of our social structure—rather than a strange, florid, and short-lived aberration.

Olmsted’s focus is on conservatism as a distinctive political tradition. But there’s a new movement afoot in historical writing that’s concerned with the broader conservative shift of the late 20th century, one that gives the self-conscious mobilization of conservatives less centrality. Instead, historians have started to look for the links and continuities between postwar liberalism and postwar conservatism, seeing the two as sharing many fundamental assumptions. Elizabeth Hinton, for example, has argued that the policies that led to mass incarceration grew as much out of Lyndon Johnson’s creation of the Law Enforcement Assistance Administration as they did out of racial backlash in the 1980s. Emphasizing the role of the urban riots of the early ’60s in the passage of War on Poverty–era legislation, Hinton suggests that the LEAA greatly expanded federal funding for local police departments, helping them acquire military-style arsenals and defining the social problems of the inner city largely as criminality. Johnson and the liberals around him feared black uprisings and the spread of violence, so they sought out ways to contain and manage the growing urban population of impoverished people of color. Over time, the emphasis shifted from healthcare and food stamps to three-strikes-you’re-out laws, but the impulse was the same.

Other scholars, such as Brent Cebul and Amy Offner, have argued for the deep continuities between the intellectual and policy framework of postwar liberalism and the neoliberal ideology that flourished starting in the 1980s. John F. Kennedy, Lyndon Johnson, and other Democratic Party liberals of the ’50s and ’60s all pledged their faith in the centrality of the private sector, the importance of individual initiative, and the imperatives of economic growth; they also helped pioneer the “public-private” governance that is often associated with privatization in the 1990s. When postwar economic growth slowed, these liberals were able, with little cognitive dissonance, to shift their positions and adopt a far more skeptical stance toward labor unions, the public sector, and the idea of “big government.” The differences are ones of emphasis as much as they are evidence of one worldview being supplanted entirely by another. It’s not that liberalism and conservatism are the same; rather, our own conservative era evolved out of elements present in an earlier one, which must then bear some part of the responsibility for what exists now.

Olmsted’s fine book is an example of an earlier quest for the roots of American conservatism. Although its argument that those roots can be found in California’s labor struggles may be somewhat overstated, this shouldn’t turn readers away. People who need a break from the mayhem of the 2016 election could do far worse than to read about Sinclair, Decker, Steffens, and the children who picked the nation’s apples and peaches, and whose lives were rarely considered by the busy New Dealers in Washington (just as the young people who still toil in those fields make few appearances in electoral politics now). Olmsted’s stories of violence, upheaval, and idealism in California are reminders of how much is left out of American politics, both in the 1930s and in our own day.



Who Is the Real Progressive: Hillary Clinton or Bernie Sanders?
https://www.thenation.com/article/archive/who-is-the-real-progressive-hillary-clinton-or-bernie-sanders/
Kim Phillips-Fein, Charles Postel, Robert Greene II, Michael Kazin
Feb 24, 2016

Earlier this month the social media teams for the two Democratic presidential candidates got into a heated—if not quite illuminating—spat on Twitter and Facebook about the meaning of progressivism. The Bernie Sanders campaign tweeted, “You can be a moderate. You can be a progressive. But you cannot be a moderate and a progressive,” while Hillary Clinton’s account replied that “an important part of being a progressive is making progress.” When asked about the exchange in an MSNBC debate the following night, the candidates did not shed much more light on the question. “The root of that word ‘progressive’ is ‘progress,’” Clinton observed.

In search of nuance, we asked four historians to comment on how Sanders and Clinton each fit within the context of the Progressive tradition, and what their respective candidacies might augur for its future. —Richard Kreitner

In Search of a Lost Ideology

Historians have long had trouble agreeing on what Progressivism really meant. After all, the various middle-class reform efforts of the early 20th century included Prohibition, labor reform, efforts to make sure that food and medicine were not adulterated, workplace-safety regulations, and the creation of the Federal Reserve, among many others. Even the Jim Crow laws of the South might be seen as one manifestation of Progressivism: an attempt by social elites to “organize” what they saw as the disruptive elements of their society, albeit in poisonously repressive ways. In a 1982 essay, “In Search of Progressivism,” historian Daniel T. Rodgers observed that various efforts to construct a coherent political ideology to characterize the early-20th-century political movement known as Progressivism had failed: As he put it, “progressivism as an ideology is nowhere to be found.”

When presidential candidates Bernie Sanders and Hillary Clinton spar over which one is more progressive, they don’t have the reality of the Progressive movement of 100 years ago firmly in mind. Each candidate has his or her own reasons for trying to claim the word. For Clinton, it is a way of trying to win over the young, left-leaning voters who overwhelmingly support her opponent, while Sanders uses it to associate himself with a resurgence of democratic populism. Today, the term mostly offers a way of talking about left politics without using the word “liberal” (with its unpopular top-down connotations, and its history of critique by the right) or—even worse—the word “left.”

Nonetheless, there are some ways in which our own moment does closely resemble that of the early 20th century: in terms of rising income inequality, growing public anxiety about the political role of business, and a widening sense that business interests are cavalier with public safety and hostile to the public good. Although there are many points of commonality between Sanders’s campaign and the Progressive efforts of the last century—the criticisms of business, the advocacy for greater regulation—in one key way, his campaign seems very different. The intense political emotion, the sense of disillusionment with the present, and the idea of the necessity of broad-scale transformation are all at odds with the politics of early-20th-century Progressivism, which was deeply concerned with the threat of upheaval from below. In this sense, although not in others, the Clinton campaign may be more in keeping with Progressivism as it has been classically understood: While her proposed reforms are far less bold than those of the Progressives, she seeks, as they generally did, to channel political unrest, while leaving underlying social inequalities untouched.

A New Wave

In order to understand where Sanders and Clinton fit within the Progressive tradition, it’s vital to know who the Progressives were and what they represented. Such questions have bedeviled generations of historians because there are no easy answers. Workers and farmers formed the largest constituencies of Progressive Era reform. But urban middle-class activists, white and black, male and female, also made their mark on the Progressive tradition. Progressives shared a belief in social improvement by way of public or collective action, focused on a wide array of social ills, and offered a variety of remedies and reforms: from a graduated income tax to public kindergartens, from eight-hour day laws to municipal ownership of utilities.

A consideration of the Progressive presidents of the early 20th century, Teddy Roosevelt and Woodrow Wilson, shows why drawing boundaries around a Progressive tradition can be a hazardous undertaking. Roosevelt attacked the outsized role of corporate power, and called for a national system of health insurance and for pensions for retirees. Yet Roosevelt also had good friends on Wall Street, betrayed African Americans’ civil rights, and was an ardent imperialist and warmonger. Wilson presided over such watershed reforms as the income tax, the Federal Reserve Act, the direct election of senators, and the extension of suffrage to women. Yet Wilson was also a white supremacist who oversaw the segregation of government offices in Washington, and whose “war to make the world safe for democracy” did not accomplish its stated goal, to say the least.

Both Sanders and Clinton have their Progressive Era heroes. Sanders has a plaque of the Socialist labor leader Eugene V. Debs mounted on the wall of his office. Like Debs, Sanders embraces the secular faith in human solidarity, and calls for sweeping reforms to create a more equitable and humane society. But that doesn’t make Sanders a 21st-century version of Debs, who worked to create a “cooperative commonwealth” resting on the power of the unions, alliances, and cooperatives of workers and farmers. Though Sanders speaks of a “political revolution,” how far that extends beyond a more or less traditional presidential campaign is often difficult to discern.

Clinton, meanwhile, has cited the work of Jane Addams on behalf of immigrant children as the type of service that has inspired her. Addams was a Progressive reformer who sought the middle ground between capital and labor, between Debs and the railroad corporations. A political career, of course, was closed to Addams, who spent her life devoted to often small and partial steps to improve the lot of the urban poor. Clinton is a powerful politician who seeks the middle ground between Wall Street and its critics, and who sees compromise and halfway steps as the preferred road to progress.

The future of progressivism will unfold at the same time as the political system undergoes a realignment. In recent decades, the Republican Party has steadily evolved into a conservative party, purging its progressives and moderates and cornering the political market on white racial resentment and xenophobia. Meanwhile, the Democratic Party is shedding its ancient reliance on the white-supremacist vote, and today embraces multiple progressivisms that have pushed the party’s positions on domestic policy to the left since a Clinton last occupied the Oval Office. A similar shift has yet to occur on foreign policy, as progressives within the Democratic Party, from Obama to Clinton to Sanders, continue to embrace the prerogatives of the US war machine. But perhaps a new wave of progressive renewal will bring to the fore more humane and universal notions of equality and social justice.

As Goes the South…

The future of progressivism is inextricably linked with the future of the American South. The current Democratic primary campaign is only one example of this. But even more important is the likelihood that come November, Georgia, South Carolina, Alabama, and other Southern states will revert to what they have been for most Democratic Party presidential candidates in recent decades: states to be ignored.

In most states of the Deep South, the Democratic Party is a mere shell, barely competitive in national elections. Such a political vacuum disserves and disenfranchises millions of citizens—overwhelmingly African Americans—who need a government that works for them. To allow this neglect to continue would be an egregious mistake.

The birth of a sustainable movement in the South has been a dream of the progressive-minded for generations. After the Civil War, Reconstruction offered a chance for the South to remake itself into a more egalitarian region, before political violence by Southern whites and exhaustion among Northern whites brought it to a premature end. Subsequent attempts to challenge the conservative (often Democratic) status quo regularly collapsed: The Western-dominated Populists of the 1890s failed to rally Southerners of any race to their cause over the long term; widespread textile strikes in 1934 fell victim to a lack of local institutional support; the CIO’s postwar “Operation Dixie,” which attempted to unionize thousands of Southern workers, couldn’t overcome racial hostility; and, finally, attempts by African Americans in the late 1960s to remake the Southern political landscape, through the Mississippi Freedom Democratic Party or Alabama’s Lowndes County Freedom Organization (the original Black Panther Party), fell short.

More recent campaigns like Jesse Jackson’s 1984 and 1988 presidential runs further demonstrated the perils and the possibilities of progressive politics in the South. Although Jackson’s presidential campaign had the support of large numbers of African-American voters in 1984, and forged an even more diverse coalition (including poor whites) in 1988, he was still unable to win the nomination—or even to win significant concessions from the Democratic leadership. During his 1992 presidential run, Bill Clinton used a speech by the rapper Sister Souljah before the Rainbow Coalition to distance himself from Jackson, thereby solidifying support among moderate white voters. Meanwhile, liberal and left-wing elements within the Democratic Party struggled to resist the leadership’s sharp turn to the right, a pivot spurred by consideration of the limits of electability in post-Reagan America.

Neither Democratic candidate is untouched by this complicated history. Bernie Sanders endorsed Jackson for president in 1988, and his focus on class issues is an echo of many of the concerns Southern progressives have had for decades, not least the 1966 “Freedom Budget” proposed by civil-rights leaders A. Philip Randolph, Bayard Rustin, and Martin Luther King Jr. Hillary Clinton, who promises to address the concerns of Black Lives Matter activists and to resume the fight for comprehensive immigration reform, observed Southern politics firsthand while she was first lady of Arkansas. If Clinton is sincere about these initiatives—and considering how much she will likely owe African Americans if she wins on Super Tuesday, she will not have much choice—there will be further opportunities for Southern progressives to push for even more significant change.

It is time for progressives to assume a more active role in the Southern wing of the Democratic Party. Movements such as the Fight for $15, which has made a strong showing in Atlanta, and the Moral Mondays protest movement in North Carolina show the desire for a resurgence of progressivism in the region. Demographics alone will not be destiny, but the forging of coalitions between African-Americans and a growing Hispanic population, along with working-class and middle-class whites, offers a promising beginning. If the Sanders campaign for a “political revolution” leaves a lasting legacy in the South, progressivism will be revitalized in the region as never before. But this has to happen regardless of who wins next week, or in November. The Democrats can win national elections without the South. But they will not have the political will, or the muscle, to effect truly progressive change without Southerners.

Both/And

At the risk of being disagreeable, let me start by disagreeing with the premise of this forum: Trying to figure out where Bernie Sanders and Hillary Clinton fit into the history of “progressivism” only muddles the key differences between them. It can hardly be otherwise, since the term has had such a promiscuous life in American politics. A century ago, racist Southern Democrats and the founders of the NAACP both embraced it. A few decades later, so did the Communist Party. Sometime in the 1990s, it became a fallback identifier for pretty much anyone The Nation and its journalistic kin smiled upon.

Fortunately, excellent substitutes are available.

Hillary Clinton is best described as a liberal. Like liberals from Woodrow Wilson to Franklin Roosevelt to Lyndon Johnson, Clinton wants to use the federal government to improve the lives of the majority of Americans. Like nearly every Democratic presidential candidate since the 1970s, she makes special pitches to women, non-whites of both genders, and the LGBT community. But she largely views social movements as creatures to be wooed and managed. What she really cares about is shrewd, effective governance. Like every liberal president (and most failed Democratic nominees) since Wilson, she wants the United States to be the dominant power in the world, so she doesn’t question the massive sums spent on the military and on the other branches of the national-security state.

Bernie Sanders, on the other hand, is a leftist. Although he has been winning elections since 1981, Sanders resembles his hero, Eugene V. Debs—the Socialist who ran five quixotic races for president, the last time, in 1920, from a prison cell—far more than he does a standard-issue career politician. Other pols identify with “revolution” and claim their campaign is a “movement.” But Bernie really means it. He is perpetually on the attack against undue power and misused privilege, armed with an unvarnished class-conscious message that, until the emergence of Occupy Wall Street, had long been absent from the public square. He advocates policies he knows even a Congress controlled by Democrats would be quite unlikely to implement: breaking up the biggest banks, making public colleges and universities free to all, outlawing private donations to campaigns, and more. Except for increasing aid to veterans, he seems cold toward every part of the military establishment. His true foreign policy is, in effect, a domestic policy that would turn the United States into another Norway.

Despite these fundamental differences between Clinton and Sanders, the fierceness of their rivalry should not obscure a central truth of political history: Leftists and liberals have always needed each other to push America toward becoming a more humane, more equal place. Radical activists and intellectuals promote fresh ideas, challenge entrenched elites, dedicate themselves to grassroots organizing, and push liberals down paths they might otherwise have avoided or tiptoed along at a craven pace. Liberals build governing coalitions that enact measures, from the progressive income tax to the Civil Rights Act to Obamacare, which improve the lives of ordinary Americans.

It would be a serious mistake for “progressives” of any stripe to ignore this symbiotic relationship. Leftists backing Bernie ought to realize that the route to a social-democratic Promised Land will be long, arduous, and uncertain. Clintonian liberals should embrace the passion for a transformed America that has always been essential to making meaningful reforms in the existing one. Neither can ignore the certain consequence of an internal battle that lasts beyond the time when one of the two candidates secures enough delegates to win the nomination: a federal government under the total control of a Republican right that is determined to undo nearly everything liberals and leftists have achieved.



Lessons From the Great Default Crisis of 1975
https://www.thenation.com/article/archive/lessons-great-default-crisis-1975/
Kim Phillips-Fein
Oct 16, 2013

Thirty-eight years ago, New York City almost went bankrupt. Then as now, conservatives took a hardline stance.



In this October 21, 1975 file photo, New York Gov. Hugh L. Carey, right, and Felix Rohatyn, chairman of the Municipal Assistance Corporation, press members of a House subcommittee in Washington to aid deficit-ridden New York City. (AP Photo/Charles Gorry, File)

It appears as if the United States government will just barely avoid defaulting on October 17. It’s not the first time the date has figured in fiscal history. Thirty-eight years ago, on October 17, 1975, New York City almost went bankrupt. Then as now, conservatives in the White House pushed a style of brinksmanship that could have forced a major government to default.

The context, of course, was very different. New York City’s finances were under stress as the city tried to maintain its extensive local welfare state despite the pressures of the recession of the early 1970s and the underlying problems of industrial decline and middle-class flight. The city tried to borrow to cover a widening budget gap, but by 1975 the major banks refused to market its debt. New York’s mayor and governor—Abraham Beame and Hugh Carey—appealed repeatedly to Washington, DC. Pointing to ailing corporations that had been able to obtain government funds, they argued that New Yorkers should not be made to suffer for problems that were the result of economic forces beyond the city’s control.

President Gerald Ford and his advisers (who included Donald Rumsfeld and Alan Greenspan) insisted that the city’s problems were its own responsibility. New York’s default, they claimed, would not have a significant economic impact. “There is no short cut to fiscal responsibility,” Greenspan, then chairman of the Council of Economic Advisers, wrote in one memo.

On October 17, the president’s sanguine attitude toward bankruptcy was put to the test. The previous evening, trustees for the teachers’ union pension fund had voted against using the pension money of their members to purchase the city’s bonds, even though the city needed this money to pay notes coming due. In the middle of the night, city lawyers had drawn up a bankruptcy petition: “The City of New York is unable to pay its debts or obligations as they mature.”

That morning, the Dow Jones fell, the price of gold rose, and city note-holders lined up at the Municipal Building in a futile attempt to redeem their notes. Still, Ford refused to help. As his press secretary put it, “This is not a natural disaster or an act of God. It is a self-inflicted act by the people who have been running New York City for a long time.”

Shortly after 2 pm, the union trustees changed their minds, and the teachers’ union bought the bonds. For the moment, New York was safe from default. But it was now clear that New York could come to the edge of bankruptcy and the federal government would simply let it happen.

A few days later, Ford made his famous speech—immortalized by the Daily News headline, “Ford to City: Drop Dead”—in which he promised to veto any bill that bailed out the city, warning that New York’s problems were a microcosm of those facing the nation: “If we go on spending more than we have, providing more benefits and more services than we can pay for, then a day of reckoning will come to Washington and the whole country just as it has to New York City.” And when “that day of reckoning comes, who will bail out the United States of America?”

But as bankruptcy ceased to be a metaphor and moral threat, and edged closer to becoming reality, Ford’s hard-line stance began to seem impractical, destabilizing and short-sighted. Bankers and business leaders testified in Congress about the potential disaster of bankruptcy. West German Chancellor Helmut Schmidt warned that New York’s default would have global repercussions. In the end, Ford agreed to support legislation that extended loans to New York. The city, in turn, embraced austerity, firing thousands of police officers, firefighters, teachers and other public workers.

Still, the conservative logic according to which the federal government should stand aside and let the nation’s largest metropolis go bankrupt simply to teach a political lesson has never entirely disappeared. Indeed, more than fiscal recklessness, it now seems that this anti-government animus may bring us to the “day of reckoning” of which Ford spoke. Today, Washington allows cities such as Detroit to file for bankruptcy, even as financial institutions receive bailouts with few strings attached. Regardless of the consequences, the political descendants of Alan Greenspan seem happy to take the federal government to the brink of default, just as they were content to let New York City go under thirty-eight years ago.

On the fiscal crisis deal struck by Congress, Nation editors write, “Even When the GOP Loses, It Wins.”



Mountain Views
https://www.thenation.com/article/archive/mountain-views/
Kim Phillips-Fein
Jul 16, 2013

Angus Burgin revisits Friedrich Hayek’s Mont Pelerin Society in The Great Persuasion.


This past June, just after the summer solstice, a group of economists and political philosophers gathered on one of the Galápagos Islands, off the coast of Ecuador, to discuss the subject of “Evolution, the Human Sciences and Liberty.” In between touring the islands—and perhaps posing for photos with iguanas or the famous giant tortoises?—the assembled guests attended panels on such subjects as “the moral animal” and the archetypal figures of “the Warrior” and “the Entrepreneur.” The sponsor of this Pacific getaway was the Mont Pelerin Society, the fabled organization founded by the Austrian economist Friedrich Hayek in 1947 to “contribute to the preservation and improvement of the free society,” simply by providing a space where intellectuals committed to the free market could meet and exchange views.

The Mont Pelerin Society was and remains an unusual organization. Its 1947 Statement of Aims proclaimed that the group would not “conduct propaganda,” become aligned with any political party, or attempt to establish a “meticulous and hampering orthodoxy.” A cross between a salon and an academic conference, the society held itself above the contentious world of elections and policy-making. It would not seek to recruit a mass popular membership (even today, new members must be nominated by two people already in the group). It was meant as a haven for intellectuals who believed that their skepticism about state intervention in the economy and their philosophical commitment to the free market put them outside the mainstream of postwar political thought.

The society has often been seen in terms of the rise of conservative anti-government thought and politics in the postwar United States. In A Brief History of Neoliberalism, the radical geographer David Harvey describes it as a “small and exclusive group of passionate advocates” hostile to the welfare state and celebratory of the free market. And as its founder, Hayek might seem a crusader and polemicist whose fierce denunciations of economic planning anticipated the subsequent conservative attacks on unions, the social safety net and the very idea of market regulation. This image was reinforced last summer when Mitt Romney’s vice presidential running mate, Congressman Paul Ryan, claimed Hayek as an intellectual inspiration. The Great Persuasion, by Johns Hopkins historian Angus Burgin, challenges this image of Hayek and of the Mont Pelerin Society—and, by extension, the commonly accepted story of the intellectual roots of neoliberalism. Far from being confident, Burgin suggests, Hayek and those affiliated with Mont Pelerin in its early days were uncertain, almost as critical of the sunny premises of nineteenth-century classical liberalism as they were of economic planning. The Mont Pelerin Society—and neoliberalism itself, in its first incarnation—emerged at a time when the idea of the market was not triumphant but in tatters.

Although it focuses on a small circle of esoteric intellectuals who believed deeply in the political importance of ideas but studiously sought to avoid actual engagement in political life, The Great Persuasion is neither a narrow book nor one meant for specialists. It is capacious and quietly ambitious, offering a dramatic retelling of the intellectual history of the postwar revival of free-market ideas, and it is an excellent example of what can be gained when intellectual history doesn’t focus exclusively on individuals. Burgin shows that the mobilization around the idea of the market—and by extension the broader conservative movement—was not domestic, parochial or anti-intellectual. Rather, it depended on an international community of thinkers, especially academic economists (who are often left out of the story of changing attitudes toward the market, despite their centrality).

In contrast to the bombastic Tea Party, the men of the Mont Pelerin Society—very few members of the group in the early years were women—were cosmopolitan, self-doubting, almost brooding about the fate of capitalism. That their efforts to define a new, self-critical liberalism should have culminated in the free-market cheerleading popularized by the University of Chicago’s Milton Friedman in his book Capitalism and Freedom and his PBS series Free to Choose (as well as a bestselling book by the same name) seems to Burgin deeply ironic, almost comic. The history of Mont Pelerin is, for him, a narrative of declension, “a parable of the perils, in the exercise of ideological influence, of success.” 

* * *

Hayek set out on the road to Mont Pelerin when he left his native Austria for the London School of Economics in 1931. He had been invited to teach there to act as a counterweight to John Maynard Keynes, who had been arguing since the 1920s that the age of laissez-faire had reached its end and a new era of managed capitalism was dawning. Hayek’s first lecture at the LSE was about the loneliness of being an advocate for the free market at a time when “there are, of course, very few people left who are not socialists.” Yet he wasn’t given to despair. Hayek spoke of being “hopelessly out of tune” with the current generation of economists. But he observed that the intellectual predecessors of those who were coming to dominate the field in the early 1930s had been marginal figures only a few years before—such as the Fabian socialists, who founded the LSE. So, too, could free-market economists once again exercise influence; it would likely take “a long time,” but with patience they might yet hold sway. 

Today, the central precepts of market capitalism are so widely accepted in economics departments and among social elites that it is hard to imagine that it was ever otherwise. But in the early 1930s, the Great Depression had created a genuine, widespread sense even in these circles that the idea of the market might be intellectually incoherent. The blithe insistence of nineteenth-century social Darwinists that the “millionaires are the product of natural selection” (to quote William Graham Sumner) and that the gutter was the appropriate place for drunkards seemed antiquated and harsh amid the global economic ruin. The cheerful Victorian mathematics of Alfred Marshall might be useful for describing market competition, but the problems he addressed seemed technical, not moral; they could not convince people to defend capitalism as a social or economic order. 

Today, “Chicago School” is shorthand for the aggressive advocacy of free markets and opposition to government intervention. But even the economists who taught at the University of Chicago in the 1920s, ’30s and ’40s were skeptical about capitalism and wary of openly advocating on its behalf. Frank Knight, one of the department’s leading lights (best known for his book Risk, Uncertainty and Profit), feared that market societies subordinated all social values to the quest for profit: “Economic man is the selfish, ruthless object of moral condemnation.” Jacob Viner was sharply critical of corporate bombast: “Nothing in the history of American business justifies undue confidence on the part of the American public that it can trust big business to take care of the community without supervision, regulation or eternal vigilance.” And Henry Simons—the most politically engaged of the three—denounced monopoly power as the “great enemy of democracy.” 

Small wonder, then, that Hayek began to believe that if a market society was to survive, it would need a new philosophical grounding. Its defense couldn’t be limited to its ability to produce abundant wealth; nor could its workings be so atomized and individualistic. As Hayek’s intellectual orientation changed, economists paid less attention to his work. Hayek insisted that government could do nothing to end the Depression; the only good solution to even the direst economic downturn, he argued, was to wait it out. To economists and graduate students who wanted to use their knowledge for the social good, this was an immensely unsatisfying position, while to the millions of people who were unemployed and desperate, it was an affront. In the early 1930s, Hayek had sparred with Keynes in the economics journals, but when he declined to review Keynes’s The General Theory in 1936, it was widely viewed as a concession that Keynes was right. As Keynes’s star rose and his fell, Hayek began to lose interest in doing purely economic work, instead becoming more concerned with the problem of how to shore up political and intellectual support for the free market. 

The central problem he faced was how to think about liberalism in a new way, adapting the old creed to dramatically changed surroundings. Hayek was not alone in this intellectual project. He took inspiration from The Good Society, American journalist Walter Lippmann’s 1937 critique of the New Deal, which protested the “authoritarian collectivism” of those who called themselves progressives, radicals or Roosevelt liberals, but which also rejected a return to the “Adam Smith fundamentalism” of an earlier time (summed up by Lippmann as a “philosophy of neglect”). Lippmann sought to revive classical liberalism’s skepticism concerning government’s ability to plan the economy, but to do so in positive terms: to orient liberalism around saying yes to a set of principles instead of a knee-jerk no to any action undertaken by the state. In the summer of 1938, Hayek and various colleagues organized the “Colloque Lippmann,” which met in Paris to discuss “the decline of liberalism and…the conditions of a return to a renovated liberal order distinct from Manchester laissez faire.” Four years later, one of these colleagues, the German economist Wilhelm Röpke, published The Social Crisis of Our Time, which condemned the “sterile” choice between collectivism and old-fashioned liberalism, calling instead for a “Third Way” that would use the state to foster traditional agricultural life and artisanal industry. The desire to reinvent liberalism, to create a new or neo-liberalism, was shared by all of these thinkers as the 1930s turned into the 1940s, and fascism and war engulfed Europe and the world.

But it was the book Hayek published near the end of the war that would transform economic thought. The Road to Serfdom is an anguished cry to protect a world that Hayek believed was on the verge of annihilation. The book’s core premise is that a free society in the 1940s faced a challenge more insidious than the external threats of fascism and Nazism—namely, the well-meaning liberals and socialists who sought to regulate the market and plan the economy out of the desire to end poverty and create a more just society. Hayek warned that they would ultimately become “the totalitarians in our midst.” Like Röpke and Lippmann, Hayek tried to make it clear that he thought returning to the laissez-faire of the nineteenth century was impossible. His book devoted a few pages to outlining some of the ways that the state could provide “security,” actions that were “outside of and supplementary to the market system,” many of which might surprise Hayek’s fans in the Tea Party. The state could mitigate the “general fluctuations of economic activity” and combat “the recurrent waves of large-scale unemployment” by adjusting monetary policy; it could also regulate the health and safety of business practices, limit pollution, establish maximum working hours, build public infrastructure, and provide social insurance, old-age pensions and health insurance. “Some minimum of food, shelter and clothing…can be assured to everybody,” Hayek wrote.  

Whereas earlier advocates of the market had focused on the individual, Hayek emphasized the market itself: the delicate, spontaneous interaction of hundreds of millions of people joined in an economic web that transcended anything human beings could consciously conceive. Poverty, exploitation, the threat of concentrated economic power—none of these were Hayek’s concern. He noted a terrible irony: by seeking to make the world better, people would actually make it worse. “It was men’s submission to the impersonal forces of the market that in the past has made possible the growth of a civilization which without this could not have been developed,” he wrote in The Road to Serfdom. “It is by thus submitting that we are every day helping to build something that is greater than any one of us can fully comprehend.”

* * *

To Hayek’s great surprise and occasional dismay, The Road to Serfdom found a receptive audience in the United States, where its message caught fire with people—especially business conservatives—who had been fervent opponents of FDR’s New Deal and the growing labor movement. They were accustomed to hearing their ideas trashed as so much self-interest, but in Hayek’s work they found a sophisticated new way of understanding their own opposition to the state and its efforts to regulate the economy, as well as a guide to thinking about the market in terms of a principled defense of freedom. The book became a must-read for young conservatives and has remained one ever since. 

One fan of the book was DeWitt Wallace, the anticommunist editor of Reader’s Digest, who decided to publish an abridged version that omitted passages about how the minimum wage might be compatible with competition and retained the polemics about totalitarianism. One million reprints were ordered, many by companies that distributed them to their employees. The Road to Serfdom gained even greater circulation when Look magazine printed a cartoon version of the book, which then appeared in General Electric’s corporate publication. In 1945, Hayek traveled to the United States for a speaking tour and was greeted by a crowd of 3,000 at New York City’s Town Hall. The old-fashioned Viennese academic toured the country, his speeches at local chambers of commerce and bankers’ associations an unexpected hit. 

Although he appreciated the acclaim, Hayek was frustrated by the “Liberalism for Dummies” interpretation of his book that had taken hold in the United States. Many people seemed to ignore his endorsement of some kinds of state action and his critique of laissez-faire, instead touting the book as an anti-government jeremiad. Hayek was acutely aware that support for his book in the more conservative reaches of the business community might damage his ability to appear a disinterested, neutral scholar. He worried that he was ruining his academic career. “Keynes died and became a saint,” he would later say, while “I discredited myself by publishing The Road to Serfdom.” But at the same time, he was aware that he might be able to find allies in these new friends: his failure might hold the seeds of a different kind of success. 

Hayek began planning a new organization to bring together the advocates of a reborn liberalism. He had won the attention of the Volker Fund, a small foundation started by a Kansas City furniture manufacturer that, by the late 1940s, was spending about $1 million a year on various free-market advocacy projects. The Volker Fund helped to underwrite Hayek’s salary for ten years at the University of Chicago (as well as that of fellow Austrian Ludwig von Mises at New York University) and also contributed financially to his new group (Wilhelm Röpke worked from Switzerland with Hayek to get it off the ground). In 1947, the Mont Pelerin Society held its first meeting—attended by thirty-nine international academics and intellectuals—at a resort lodge near Vevey, Switzerland, close to the eponymous mountain. Hayek spoke to the assembled group about the “wild experiment” he saw them undertaking. They would set about the task of reinventing liberalism in an intellectual environment in which everyone shared an “agreement on fundamentals.” 

But did they agree on all that much? Some members of the group—such as Knight and Röpke—worried that unregulated markets degraded ethical values, and they struggled to reconcile capitalism and social tradition. Hayek himself suggested that there was congruence between the preservation of tradition and the support for markets, as both opposed a hubristic rationalism that sought to replace spontaneous interaction with directed plans. Still others, such as von Mises, thought that there was too much focus on the critique of liberalism and that what really mattered was defending it vociferously. Although there was a general sense that capitalism was the only economic order compatible with political democracy, many of the participants in the Mont Pelerin Society were ambivalent about majority rule, fearing that democratic publics would vote for the very restrictions on the market they so feared.

Nor did they ever reach any consensus on how to describe their ideas. None of them liked the word “libertarian” (some felt it sounded too much like “libertine”), while “liberal” linked them to FDR. And Hayek slowly distanced himself from conservatism. In 1957, he delivered an address at Mont Pelerin titled “Why I Am Not a Conservative,” arguing that supporting capitalism meant embracing ceaseless change rather than preserving social order as traditionalist thinkers like Russell Kirk wished. However, Hayek’s work had more in common with the conservatives than he acknowledged. By the late 1950s, the major target of his critique was no longer socialism (in The Constitution of Liberty, he argued that “socialism in the old definite sense is now dead in the Western world”); instead, he now directed his ire at labor unions, progressive taxation and the welfare state more generally. The thrust of his argument—that economic regulation, even if undertaken out of the best of motives, would jeopardize the values of liberty and democracy that it was initially intended to advance—had much in common with conservative ideas all the way back to Edmund Burke’s, as A.O. Hirschman observed in The Rhetoric of Reaction. Hayek shared with Burke, too, an anxious sense of the fragility of society and a fear of rationalism. Nonetheless, Hayek thought it important to stake out independent ground, to differentiate his stance from those he saw being adopted in the growing conservative movement.

Despite Hayek’s high hopes, the Mont Pelerin Society was almost immediately a source of frustration to him and others. It proved immensely difficult to use the society to construct an alternative vision of liberalism. By the 1950s, the fluid postwar world had given way to the sharp divisions of the Cold War. In the new climate of postwar prosperity, the agonized skepticism that had characterized the early years of the society gained little traction. A new thinker came to the fore of the free-market movement: Milton Friedman, who was brash, bold and unapologetically political. Like Hayek, he did not like to describe himself as a conservative. (“Good God, don’t call me that!” he told the magazine Human Behavior in 1978.) He embraced “positive economics,” seeing the discipline as a scientific one in which hypotheses could be tested, proved or refuted, in contrast to Hayek’s philosophical approach. He had none of Hayek’s ambivalence about participating in political life or contemporary policy debates. He continued doing academic work while advising Barry Goldwater in his 1964 presidential campaign; he also wrote popular books and articles for The New York Times Magazine, had a regular column in Newsweek, and ultimately produced Free to Choose in 1980 to broadcast his ideas to a still-wider audience.

Unlike the earlier Chicago economists, Friedman had no reservations about the market. For him, markets were an “unremitting good,” a way to achieve greater wealth and freedom for all. As Burgin notes, this belief in the universal superiority of markets was far more optimistic than the social Darwinism of the late nineteenth century. For William Graham Sumner, the market separated the world into winners and losers; its ruthlessness was efficient but cruel. For Friedman, however, there were no losers; with the market, everyone wins. He insisted that he shared the goals of the left—the elimination of poverty, the rise of living standards, a world made freer and more democratic—but simply believed that markets, not the state, were necessary to achieve them. Even at the first meeting of the Mont Pelerin Society, Friedman had argued against the pessimism of others in the group, saying that its manifesto should emphasize the “humanitarian” and “progressive” aspects of liberalism instead of worrying too much about establishing a positive role for the state. 

By the late 1950s and early ’60s, the divisions within the society had become severe. In a vitriolic schism worthy of the Trotskyite left, the organization almost collapsed, with some founding members resigning in a huff and seeking to set up an alternative organization. Röpke was one, having grown distressed that the society had mostly abandoned its original openness to questioning the premises of older free-market faiths. He also complained that economists were increasingly powerful within the group, making it “ever more stale.” Then, after Friedman became president of the Mont Pelerin Society in 1970, he came to think that because it had changed so much since its founding, the best way forward was to shut up shop. (The annual meetings, he warned, had become pleasant “tourist attractions” rather than intense intellectual engagements.) The growing prominence and success of free-market ideas had made the society seem a relic of an earlier, more anxious time, and its role in a world that included the Heritage Foundation, the American Enterprise Institute and a resurgent Republican Party was no longer clear. Millions of Americans were reading or watching Friedman’s Free to Choose in their living rooms. Why did there need to be a cloistered, quiet space in which to ruminate upon the philosophy of the free market? 

* * *

Burgin’s account of the evolution of the Mont Pelerin Society is a study of the complexity of ideological change, of the ways that ideas conceived in one context can acquire a very different hue over time. It is an immensely rich, careful and thoughtful history that captures the range of opinion within a group of people who are too often seen as having marched in lockstep. 

At times, an elegiac note creeps into Burgin’s treatment of the society, as though he is nostalgic for the early days of neoliberalism, with its studied ambivalence and notes of restraint. It can indeed be hard to reconcile the political atmosphere of the present—Grover Norquist and his no-tax pledge, the Club for Growth, the Tea Party, Super PACs, the stultifying timidity of both parties when facing the idea of the market—with the image of those thirty-nine lonely intellectuals meeting by the mountainside, drawing courage from one another in their last stand to defend an endangered civilization. 

But even in their halcyon days, Hayek and his confreres were not as isolated as they perceived themselves to be. Ample support for their critiques of the welfare state and socialism came from certain parts of the business community and the political class. Whatever qualms they had about accepting this support, they did so nonetheless, and they also sought to cultivate connections with people for whom the main appeal of their ideas was that they could be employed in the fight against progressive economic reform. By some measures, their ideas about the market were out of fashion in the postwar years, but by others, they were far more easily accepted than their advocates realized, even at the time. There is a certain irony in Glenn Beck hawking Hayek and Paul Ryan invoking him: Hayek would hardly have known what to do with presidential politics, let alone cable TV. But the history of ideas is also the history of how people use them, and it is hard to imagine that Hayek would have been entirely surprised or displeased by his re-emergence in the contemporary fray. 

Not that he would have been happy to relinquish the position of outsider. Far away from the halls of Washington, the society members gathering on Galápagos may find it sunnier than the mountains of Switzerland, and the political climate balmy compared with 1947. But for today’s Mont Pelerin Society, there is likely something revivifying about recalling the postwar crisis, when the world was menacing and it seemed far from certain that their side would win. For contemporary free-market advocates, who must confront the difficult aftermath of the crash of 2008, the decidedly unsubtle power of business in politics, and the nasty inequality that corrodes every aspect of our economic and political life, there must be something tremendously alluring about that moment of purity. Their ideas helped to build this world, but one can see—especially after reading The Great Persuasion—why those insiders would want to retreat.



The Legacy of the 1970s Fiscal Crisis
https://www.thenation.com/article/archive/legacy-1970s-fiscal-crisis/
Kim Phillips-Fein
Apr 16, 2013

Nearly forty years after Ford told New York to drop dead, the city is still here—but forever changed.


Shoppers hustle down 42nd Street in 1975. (Photo by Peter Keegan/Getty Images)
 
On a Tuesday in mid-May of 1975, Abraham Beame and Hugh Carey—New York’s mayor and governor—arrived at the White House to meet with President Gerald Ford. The news they brought was not good: New York City was experiencing a severe cash shortage, and without help, the city would not be able to cover its bills much longer. Beame described a recent demonstration of CUNY students outside of Gracie Mansion; Carey warned that serious retrenchment might mean the collapse of civil peace. The president listened and then said that he needed twenty-four hours to think it over. (“24 hours. Must do what’s right. Bite bullet,” he wrote on a notepad, probably before the meeting even happened.) The next day, Ford told Beame and Carey that there was nothing the federal government could prudently do to help. The city would have to solve its problems on its own.

Throughout the rest of the year, New York would flirt with default on its massive loans, scrambling to patch together one plan after another, each intended to save the city from declaring bankruptcy while cutting back on the social and municipal services it provided. Today, the city’s heralded renaissance is often contrasted with the bad old days of the 1970s, a dark, distant past through which the city had to pass to arrive at the rosy present. But in fact, the fiscal crisis of the ’70s—and the budget cutbacks that followed—reshaped the city in ways that continue to influence it even now.

Like most fiscal crises, New York’s was at once long anticipated and a complete shock. Many observers in the early ’70s had noticed that New York was entering a period of difficulty and falling tax receipts, as the city’s economy was rocked by the decline of manufacturing and the flight of the white middle class to the suburbs. New York did provide more services than most other American cities—more about these in a moment—although contrary to the railing of conservatives at the time, its public workers were not paid wages out of line with those of workers in other cities. During the Great Society years, the expenses of the city climbed, particularly those for Medicaid (for which it bore almost 25 percent of the cost, in accordance with state law) and welfare. At first, increases in federal and state aid helped fuel this expansion. But when the economy turned south in the early 1970s, New York turned to borrowing to make up the budget gaps. The tacit assumption of city leaders—rarely spelled out clearly—was that the borrowing was merely a temporary measure. Perhaps national healthcare would pass and the city would no longer have to foot a massive Medicaid bill. Once the economy recovered, the city would regain its fiscal footing.

But by 1975, as recession enveloped the American economy, the banks that marketed New York’s debt (and owned a great deal of it) became increasingly wary about the city, as did investors around the country. Some business leaders began to tell Mayor Beame that if he didn’t cut spending and balance the budget, “managers” should be put in place and made accountable for New York. In extreme times, wrote Jac Friedgut, a vice president at First National City Bank, “many things can be done even if they are technically not possible.” By the spring, the banks told the city that the bond market had closed.

As soon as its credit was cut off, it became apparent that New York did not have the money to pay its debts—or even to continue to cover payrolls without access to more borrowed funds. The state created the Municipal Assistance Corporation, which was empowered to issue special bonds backed by the sales tax to help the city pay its bills. When MAC proved no more able than the city to market its debt, the state created the Emergency Financial Control Board to oversee the city’s finances and make sure it was moving toward a balanced budget. Finally, with Washington’s help (it agreed in the end to provide the city with some loans), New York made an arrangement with the banks and its unions that kept it out of bankruptcy. The cost was serious budget cuts: over the next three years, the number of police officers and teachers each dropped by about 6,000 and the number of firefighters by about 2,500, transit fares were raised, and tuition was imposed for the first time at the City University of New York.

* * *

Social Democracy Lost

Today, the rituals of fiscal crisis—the blaming of public sector workers, the vilification of the poor who use government services suddenly deemed excessive luxuries—may seem familiar. One American city after another has been rocked by such difficulties in the years since 2008. In mid-March, Michigan Governor Rick Snyder announced the takeover of Detroit’s finances by a state-appointed manager. In the last few years, the cities of Stockton and San Bernardino in California have declared bankruptcy, as have Central Falls, Rhode Island, and Jefferson County, Alabama. To try to keep firehouses open, cities like Baltimore are contemplating selling ad space on fire trucks and rescue vehicles. The local fiscal crises are accompanied by the stoking of anxiety about the fiscal soundness of the federal government and by the debt crises wracking Europe.

Today’s local fiscal crises afflict primarily cities (especially smaller ones) that have been struggling for years. The takeover of Detroit is hardly a surprise—everyone knows the city is broken. New York in the ’70s, on the other hand, was the biggest city in the country, the home of Wall Street, the epicenter of capitalism. The idea that such an apparently powerful urban center could be in fiscal difficulty came as a shock, even to those in charge of governing the city. For people in New York as well as in Washington, DC, the city’s problems soon became linked to broader questions about the direction of the country as a whole.

Although there were many local causes of the fiscal crisis—the specific economic problems of New York, the high proportion of the Medicaid and welfare costs borne by the city, and a general willingness within the city government to employ deficit financing strategies—it quickly became seen as a metaphor for the larger breakdown of liberalism in the ’70s. Throughout the postwar years, as historian Joshua Freeman has argued, New York City embodied a particular style of social-democratic politics: one that embraced a strong welfare state, a culture of labor power and solidarity, and a belief in the necessity of using the government (even city government) to help the disadvantaged. It can be hard today to imagine what it was like to live in a city that provided such a rich range of social services, ones that made possible a uniquely democratic urban culture. The city had nineteen public hospitals in 1975, extensive mass transit and public housing, public daycare and decent schools. The municipal university system—the only one of its kind in the country—provided higher education to all, free of charge. Rent stabilization made it possible for a middle class to inhabit the city. For many, the fiscal crisis showed that it was no longer possible for New York to finance these kinds of services. As Christopher Lasch wrote in his 1979 book The Culture of Narcissism, “Those who recently dreamed of world power now despair of governing the city of New York.”

For the rising conservative movement, the fiscal crisis dramatized all of liberalism’s problems: its inability to subordinate sentiment to financial imperatives, its recklessness, its sinful flouting of the norms of rectitude. The right did not take long to draw parallels between New York and the country as a whole in the wake of the expansion of social programs in the Great Society. As President Ford said in his October 29, 1975, speech at the National Press Club (the one that prompted the Daily News’s famous headline “Ford to City: Drop Dead”), “Other cities, other states as well as the federal government are not immune to the insidious disease from which New York is suffering…. If we go on spending more than we have, providing more benefits and services than we can pay for, then a day of reckoning will come to Washington and the whole country just as it has to New York.” Ford’s more conservative advisers had long been fiercely hostile to the city. When it first asked for help from Washington early in the spring of 1975, Donald Rumsfeld (then Ford’s chief of staff) responded that such a request was “outrageous” and that acceding to it would be “a disaster.” Alan Greenspan, the head of Ford’s Council of Economic Advisers, also argued against aiding the city, writing that “there is no short cut to fiscal responsibility.”

Only a month after Ford’s October speech, after a barrage of criticism from such elite figures as the chairman of Con Edison, the president of the Bank of America and the chancellor of West Germany, the administration reversed its position and agreed to extend loans to New York on the condition that the city continue to move toward a balanced budget. Nonetheless, the fact that Ford had been willing to let New York go broke signaled that ideological purity trumped all other concerns for the rising right. Teaching a lesson about the dangers of the welfare state seemed more important than international prestige, Cold War concerns or even the possible economic impact of the city’s default (Ford’s treasury secretary, William Simon, a former municipal bond trader and future president of the Olin Foundation, insisted that New York’s bankruptcy would likely not have a significant effect even on the municipal bond market).

But the political impact of the fiscal crisis was felt far beyond conservative circles. The crisis brought about a transformation of the very language and conception of politics, as the rhetoric of fiscal necessity and business acumen replaced a vision of politics as a domain of struggle and negotiation. Old-time Democratic politicians like Beame had understood urban politics as a world of relationships, negotiations and deals. People with power made arrangements with other powerful people. The investment bankers Beame blamed for “boycotting” the city saw the world differently: they described themselves as mere conduits for the wisdom of the marketplace. Politics mattered less than the vast collective wisdom of the bond market, which rendered New York City and the banks powerless.

The crisis brought about a change in the city’s leadership, as clubhouse Democrats were deposed in favor of a younger generation of business-friendly liberals. For these new leaders, the downsizing of New York became a badge of honor: a sign that liberals were not beholden to such special interests as organized labor but could speak the rhetoric of efficiency. The old faith in the political importance of the working class, the New Deal sense of the necessity of government action, gave way in the fiscal crisis to a liberalism that borrowed its framework and its values from the private sector.

* * *

A Bitter Fight

In New York politics, one can occasionally hear people express nostalgia for those days as a time when business, labor and government got together to do what needed to be done to save the city through the harsh fiscal remedy of budget cuts. Although union presidents and business leaders alike ultimately acquiesced to the program of retrenchment, the image of unity is largely a product of hindsight. New Yorkers in the ’70s fought bitterly about the crisis. The city was divided by protests against budget cuts. Many New Yorkers blamed the banks for refusing to lend the city money. Although most of the city’s unions finally went along with the cuts, they started out by organizing large-scale demonstrations criticizing the banks; at one point in the fall of 1975, they briefly hinted at a general strike. People occupied firehouses to keep them open, organized massive campaigns to save college campuses (such as Hostos Community College in the Bronx, which was threatened with closure) and threw their trash into the middle of the street to protest the mass layoffs of sanitation workers and the resulting slowdown in garbage collection. Public sector workers who had been laid off blocked traffic on the Brooklyn Bridge, and highway workers picketed at the Henry Hudson Parkway; at one point, corrections officers angry about layoffs organized a demonstration that briefly prevented people from crossing the bridge to Rikers Island. In part, the city workers were enraged at the layoffs that destroyed their economic security—but the intensity of the protests also signaled a sense that the city was at a crossroads, divided between two different visions of its future.

For New Yorkers outside the halls of power, the crisis appeared bewildering: How could a city that seemed so rich suddenly have so little money? But as firehouses closed, mass transit stalled, libraries shut their doors, school class sizes swelled, routine services like garbage collection became unpredictable, and thousands of would-be students found themselves shut out of CUNY because the university simply stopped processing their applications, the crisis helped to spawn a new conservatism in the city as well. Letters poured into the offices of elected officials. Some of them expressed anger at the treatment of the poor: “Why don’t you line us up against the wall and shoot us?” one correspondent asked Governor Hugh Carey. But many others vented rage at the “welfare people” (especially Puerto Rican and “Mexican” immigrants) they believed had brought the city to this pass. The bankruptcy of the state made it difficult for people to assert any claims on it, as economic austerity helped generate a new political disengagement.

Today, the fiscal crisis in New York may seem a distant memory, like the graffiti-covered subway cars of the era or the fires that once blazed through Bushwick, a neighborhood now dotted with artisanal chocolate shops and pizza places that win raves from The New York Times. But the diminished expectations we have for the public sector and the increasing difficulty of living a middle-class life in the city suggest the legacy of the fiscal crisis even now. City governments today—including New York’s—seem primarily to be vehicles to attract and maintain private investment. Business improvement districts and public-private partnerships involve companies directly in paying for the services they receive, while the city sweeps away community challenges to business-oriented development. This is supposed to lead to improved services for all; yet over the same years that have seen the rise of this business culture in city government, New York has become the most unequal city in the country—the gulf between rich and poor widening in ways that would have been hard to imagine even in the early ’70s.

In the artistic and intellectual circles of the left, there’s an undeniable nostalgia for New York in the ’70s—when CBGB opened its doors, working artists had lofts in SoHo, hip-hop was invented, and the Lower East Side became the home to a new musical and artistic scene. It might be easy to dismiss such feelings as the sentimental romanticism of a privileged generation that has grown used to traveling the subways without being anxious about crime—or just the appeal of a time when rent was cheap. But looking around the city today, saturated with money and starkly divided by wealth, the very bleakness of the ’70s seems a refuge, a time of possibility. The violence and brutality of a city in free fall was real. Yet in the literal bankruptcy of the political establishment, there was also a kind of freedom, a political and cultural openness; there was no need to pretend that everything was all right.

The 1970s—in New York and around the country—saw the dawning of a new era of austerity, as the earlier assumptions of economic growth faded. The contraction of the state also meant the shrinking of the social imagination. The stern dictums about the necessary limits of political dreams contrasted sharply with the new populist utopianism of the free market, where anything might be possible. We still live today in a society defined by these two poles: the harsh limits of the political sphere and the delusional boundlessness of the market. Although it wasn’t solely responsible for bringing the city into this new age, New York’s fiscal crisis marks the boundary between the past and the present we still live in today.




Subterranean Blues: On ExxonMobil
https://www.thenation.com/article/archive/subterranean-blues-exxonmobil/
Kim Phillips-Fein
Jul 24, 2012

For more than a century, ever since Ida Minerva Tarbell published her exposé of the Standard Oil Company in McClure’s Magazine in 1903, oil has been one of the great subjects for muckraking American journalists. Much as the unconscious is for a Freudian, oil’s dark sludge seems the product of a subterranean world that stands in sharp contrast to the pretensions of the society saturated by it. We never see it, never think about it, have no idea where it comes from or how it comes to be—yet the sludge is ubiquitous, and all we do depends upon its power. We insist that we want independence, to spin energy out of the transparent elements of light and air, and yet it never really goes away. When everything else has been stripped away, it is what remains.

In Private Empire: ExxonMobil and American Power, Steve Coll—whose previous works of investigative reporting have covered subjects like American policy in Afghanistan and the history of the bin Laden family—joins the venerable tradition of Tarbell and Upton Sinclair. Here, he documents the political, economic and global power of ExxonMobil, the largest investor-owned oil and gas company in the world. Coll frames his story as a narrative of corporate life in the post–cold war era. The choice may feel odd at first: despite the company’s wealth—it has quadrupled its profits in the years since the cold war’s end—oil seems old-fashioned, mired in the physical world. Coll compares it with Walmart and Google, those denizens of the postindustrial economy. In contrast to these, ExxonMobil drills “holes in the ground,” and so its operations are inevitably “linked to the control of physical territory.” In this way, he suggests a different view of the contemporary economy: beneath the glitz and seductions of the service sector runs a river of oil, sluicing through the bright weightlessness of our online dreams.

In Private Empire, Coll writes critically of ExxonMobil but tries to avoid a moralistic tone. Instead, his aim is to describe an energy company in an era of conflict, when it has to deal with the public relations challenges of such things as industrial disasters and political instability. His ExxonMobil is at once a giant and a company beset. Private Empire sprawls over almost 700 pages and makes extended visits to, among other places, Chad, Russia, Iraq and Equatorial Guinea. (Here, too, he’s in good company: Tarbell’s History of the Standard Oil Company, published in 1904, is more than 900 pages.) Private Empire has a cumulative force, for the portrait it provides of Exxon’s internal politics is a study in the culture of denial and the creation of a self-enclosed sphere of power. Yet the problem with Big Oil is no longer simply its monopoly status or its impact on political life. It is impossible these days to write about oil without writing about climate change, and here the model of a muckraking account of corporate power no longer feels sufficient, for the problem is one that cannot be externalized—it implicates us all.

ExxonMobil is descended from the original American oil giant, Standard Oil, which rose to prominence in the industry at the end of the nineteenth century. Its founder was John D. Rockefeller, a famously devout and penurious Baptist, who even as a child was single-mindedly focused on acquisition. Rockefeller built an energy giant through ruthless internal financial discipline and by brutally undercutting his competitors. Like the other robber barons, he had little interest in public opinion. The country’s factories needed his products, so he could afford to ignore the critics. Like social Darwinists then and now, he was confident in the ethics of wealth accumulation: “I believe the power to make money is a gift of God…. Having been endowed with the gift I possess, I believe it is my duty to make money and still more money, and to use the money I make for the good of my fellow man according to the dictates of my conscience.”

But the company could not remain oblivious to politics forever; in 1911 the Supreme Court found that Standard Oil had violated antitrust laws and broke it up into thirty-four separate firms. ExxonMobil was created out of a 1999 merger between Exxon and Mobil, two companies that grew out of the 1911 dissolution. The joining of these two companies suggests yet another of the ways in which our modern economy resembles that of the last Gilded Age, as though the old Standard Oil is rising from the dead and reassembling itself piece by piece.

* * *

Should Rockefeller come back from the grave, he would recognize ExxonMobil. Its business is focused, above all, on oil and natural gas. Its culture remains tough, masculine and imbued with Christianity, the mainline variety rather than (as historian Darren Dochuk has shown) the florid fundamentalism that flourished among the rebellious oil wildcatters who hated Standard Oil. One former Exxon manager remembers meetings in the 1970s that always began with a prayer for the future of the company; the inner executive suite at its headquarters in Irving, Texas, is known as the “God Pod.” (The whole complex is dubbed the “Death Star.”) The company recruits heavily from the public universities of the South and Midwest, many of which have petroleum engineering departments. Coll tells the story of one executive who, when he realized that the top five executives (all white men) had fourteen sons and no daughters among them, asked: “What is there in the culture here that promotes people with sons?”

Perhaps the best embodiment of the company is Lee Raymond, its chief executive from 1993 to 2006, who engineered the merger. Born in Watertown, South Dakota, he grew up an evangelical Christian but converted to Catholicism after marrying his wife, a native of Kohler, Wisconsin (a company town that was divided by a lengthy strike in the 1950s). His office nickname was Iron Ass, and he was intolerant of ideas with which he disagreed, interrogating employees with questions like “And what little birdie flew in the window and whispered that dumb-shit idea in your ear?”

A deep believer in “free-market capitalism,” Raymond devoted his life to oil and had few interests or hobbies outside the industry to pursue in his many homes around the world. He is no hedonist: his drink of choice while being flown on one of his private jets is a glass of milk filled with popcorn. He and his wife have separate beds so that Raymond can stay up late to go through documents from work (plus he snores); she sneaks snacks off the planes, complaining about the high price of food in the hotels of Paris and Berlin. Yet when Raymond retired from the company in 2006, he received a package worth $400 million—including an annual “consultancy” payment of $1 million plus coverage of his country club fees. Despite his stated dislike of politics, he counts Dick Cheney—another small-town Midwesterner and oilman—as one of his hunting buddies and remained close to the vice president throughout the Bush years.

But in contrast to the freewheeling machismo of the earlier Standard Oil, today’s ExxonMobil is obsessed with safety. This is, in part, the legacy of the 1989 Exxon Valdez accident, which spilled some 11 million gallons of crude into Prince William Sound in southern Alaska. These days, every meeting begins with a “safety minute,” in which an employee speaks briefly about some danger in the workplace before getting around to whatever actual business there might be. Forgetting to turn off a coffeepot can lead to an official reprimand. File drawers are not to be left open, in case someone bumps into a sharp edge. Employees are asked to back their cars into their parking spaces to minimize the danger of a crash. Raymond even scolded employees who engaged in excessively risky sports in their time outside the office.

All this emphasis on safety seems quixotic—an effort to ward off accidents, but also to suppress the knowledge of the dangers inherent in burning carbon. And here, Raymond’s obsession points the other way: toward campaigns intended to downplay the risk of global warming in the face of all of the scientific evidence to the contrary. The company gives hundreds of thousands of dollars each year to legislators (overwhelmingly Republican) who score well on its “key vote” survey, which evaluates a lawmaker’s record on the issues that matter most to the oil and gas industry. The company donates to the major groups advancing arguments that are skeptical of climate change. It assists social scientists who are willing to write supportive articles, and it also funds special retreats for liberal activists so they can mingle with ExxonMobil executives. (In fact, it has even given funding to conferences at the New America Foundation, over which the author himself presides; Coll’s acknowledgments disclose two gifts from ExxonMobil that helped pay for conferences on foreign policy, which is remarkable not because it taints his reporting but because it suggests the company’s reach.)

Confronted with the evidence of global warming, the ExxonMobil strategy has been to insist on the inherent uncertainty of science: we don’t know whether the warming trends of recent years are caused by fossil fuels, but we do know that oil is necessary to make our civilization run. Energy is a progressive marker of the advance of civilization; it is the grease that enables the market to function. Being pro-oil is like being anti-poverty: as Raymond said in a 1997 speech at the Fifteenth World Petroleum Congress in China, the “most pressing environmental problems” of developing countries are related to “poverty, not global climate change,” and addressing those problems requires economic growth—in other words, the higher consumption of fossil fuels.

Near the end of the book, we learn that ExxonMobil is no longer as insistent as it was a few years back about the alleged lack of clear evidence on global warming. Under the leadership of its new chief executive, Rex Tillerson, a former Boy Scout whose favorite book is Atlas Shrugged, the company has gently intimated its support for a carbon tax of some sort—a deviation from its earlier resistance in the Raymond era to any suggestion that global warming is a problem connected to carbon emissions. But the evolution of ExxonMobil’s company line seems primarily a question of its public image rather than evidence of any substantial transformation in its practices.

* * *


Coll’s second big concern in Private Empire is ExxonMobil’s role in international affairs. The company faces constant pressure to expand its access to oil reserves and is always searching for new places to drill. This means working with governments around the world to make deals to extract that oil. Such relationships have never been simple. Early in the twentieth century, American oil production was primarily domestic—hard as it is to believe, the country was a net oil exporter at that time. But even then there were efforts to drill elsewhere, which invariably led to political conflicts. In 1938, for example, the Mexican government expropriated the holdings of foreign oil corporations, including Standard Oil of New Jersey (the largest of the companies formed in the 1911 breakup). Later in the twentieth century, Exxon and the other American oil giants expanded their production around the world—in Venezuela, Saudi Arabia, Iran and elsewhere—always relying on the strength of the American government to protect their operations (for example, in the coup against Mohammad Mossadeq in Iran in the early 1950s).

Today, ExxonMobil has established operations in many countries plagued by poverty and political violence: Equatorial Guinea, Chad, Indonesia. Initially, the company was ambivalent about the US invasion of Iraq (although Coll observes that the tanks and jeeps of the US Army fueled up at depots nicknamed Shell and Exxon in their drive to Baghdad). But now that the region seems marginally more stable, ExxonMobil has been expanding its operations there, agreeing to drill in Kurdistan. Much as it manages the risk of accidents, the company attempts to protect itself from political instability by signing agreements with foreign governments (such as Chad’s) that stipulate the host country will not do anything that “adversely affects [ExxonMobil’s] rights and economic benefits” for the duration of the agreement. Such terms, Coll notes, seem designed to prevent the establishment of a minimum wage or the passage of laws that could make union organizing possible.

Even though, on some level, ExxonMobil still depends on the military and political power of the American government to protect its interests, the company is militant about its own independence. Coll tells one story of the Indian prime minister asking George W. Bush to intervene in a controversy involving ExxonMobil: “Why don’t you just tell them what to do?” Bush’s response: “Nobody tells those guys what to do.” And when the State Department voiced its concern about the company cutting drilling deals with the Kurds that were negotiated independently of Iraq’s national government, Rex Tillerson responded: “I had to do what was best for my shareholders.” Earlier, when Lee Raymond was asked whether ExxonMobil might consider building more refineries in the United States in the interest of national security, he responded: “I’m not a U.S. company and I don’t make decisions based on what’s good for the U.S.”—putting the lie to the old saying that “what’s good for business is good for America.” No longer does an oil giant need to drape itself in the American flag; today, the rhetoric of the free market is enough.

* * *

Although Private Empire is unique for its focus on a single company, in recent years there has been no shortage of historical and political work on oil. Scholars such as Timothy Mitchell, Hannah Appel and Michael T. Klare have written about the political economy of energy, while an excellent recent issue of The Journal of American History was devoted to the history of oil. Given the plethora of other work available on the subject, it might seem that Coll would want to draw on the tradition of the muckrakers and combine his reportage with moral indignation, telling the story of ExxonMobil as a parable of contemporary greed.

Private Empire never takes this tone. As a work of investigative journalism, it offers a compelling command of detail and rich portraits of ExxonMobil’s internal culture. As a narrative, though, it is much less gripping. Despite the spy-thriller atmospherics, no clear story line runs through the book. Suspense and drama are oddly absent: ExxonMobil starts out powerful and gets more powerful; it begins rich and ends richer. The familiar markers of political life—elections, the two major parties, the ideological conflict of left and right—seem marginal to this story. (As Raymond says, “Presidents come and go; Exxon doesn’t come and go.”) The oft-told economic history of the past few years, from the housing bubble and financial crisis to the recession, feels equally irrelevant. The sundry challenges ExxonMobil faces—attempted coups, kidnappings of corporate executives, allegations that it has failed to deal with human rights abuses in the countries with which it partners—never seem to affect what matters most: its ability to extract millions of barrels of oil from the earth every single day.

Perhaps some combination of the apparent implacability of the company, its entrenchment in our economy and the absence of a clear vision for an alternative—even in the face of the imminent disaster posed by global warming—accounts for Coll’s detachment. When Tarbell wrote her exposé of Standard Oil, she sought to show how the company had used its power to pervert the free market and undermine business ethics. Monopoly was “a leech on our pockets, a barrier to our free efforts,” so she called for breaking up the trusts and reviving the economy of small, independent producers. Today, however, the problem we face is not only the power of ExxonMobil but our collective dependence on sources of energy that are slowly, steadily changing the climate of our planet. ExxonMobil’s political muscle may make it more difficult to develop alternatives, but the genuine source of its power is the oil pumped through its pipelines around the world. Even journalists such as Coll are painfully aware of the extent to which we rely on the power of the very companies they denounce. “American democracy,” Coll writes, “has produced no politics” capable of confronting the power of oil over the past three decades. The problem is not a dearth of information but rather a lack of political imagination. Would-be crusading reporters have it much harder now than they did 100 years ago.



Countervailing Powers: On John Kenneth Galbraith
https://www.thenation.com/article/archive/countervailing-powers-john-kenneth-galbraith/
Kim Phillips-Fein
May 11, 2011

John Kenneth Galbraith was a satirist of economics as much as a practitioner of it.


In a 1930 essay titled “Economic Possibilities for our Grandchildren,” John Maynard Keynes ridiculed economists for having a high opinion of themselves and their work. As the Great Depression engulfed the world, Keynes looked back at historic rates of economic growth, arguing that the real problem people would face in the future was not poverty but the moral quandary of how to live in a society of such abundance and wealth that work would cease to be necessary. The “economic problem,” as he put it, was technical, unimportant in the larger scheme of things. “If economists,” he wrote, “could manage to get themselves thought of as humble, competent people, on a level with dentists, that would be splendid!” John Kenneth Galbraith—the Harvard-based economist whose books shaped the public conversation on economic matters for a generation in mid-twentieth-century America—would have agreed.

Today, given the rise of mathematical methods and computer modeling, economics is if anything even more labyrinthine, esoteric and inaccessible to the layman than it was in the days of Keynes and Galbraith. It is also more intellectually and politically ascendant than it was in the 1930s. Its methods now dominate much of the social sciences, having made inroads in law and political science. Its central theme of the superiority of free markets is the gospel of political life. This makes the publication of the Library of America edition of four of Galbraith’s best-known books—American Capitalism; The Great Crash, 1929; The Affluent Society and The New Industrial State—a cause for celebration. (The volume is edited by Galbraith’s son James, also an economist.) Galbraith delighted in puncturing the self-importance of his profession. He was a satirist of economics almost as much as a practitioner of it. He took generally accepted ideas about the economy and turned them upside down. Instead of atomistic individuals and firms, he saw behemoth corporations; instead of the free market, a quasi-planned economy. Other economists believed that consumers were rational, calculating actors, whose demands and tastes were deserving of the utmost deference. Galbraith saw people who were easily manipulated by savvy corporations and slick advertising campaigns, who had no real idea of what they wanted, or why. In many ways, our economic world is quite different from the one Galbraith described at mid-century. But at a time when free-market orthodoxy seems more baroque, smug and dominant than ever, despite the recession caused by the collapse of the real estate bubble, his gleeful skewering of the “conventional wisdom” (a phrase he famously coined) remains a welcome corrective.

* * *

John Kenneth Galbraith was born in 1908, the third child of a farm family in the small Canadian town of Iona Station. He became an economist almost by accident. Shortly before his graduation from Ontario Agricultural College (which he described in later life as “probably the worst college in the English-speaking world”), he stumbled upon an advertisement for a graduate fellowship in agricultural economics at the University of California, Berkeley. He applied and was accepted, and after completing his studies he managed to secure a nontenure-track position at Harvard.

It took some time for Galbraith to become a full professor, but even then he never retreated into academic life. To learn of the sheer variety of his activities—detailed in the chronology in the Library of America volume—is exhausting. He was prolific, writing four dozen books—among them three novels—and more than 1,100 articles. Before he earned tenure, he did a stint at the Office of Price Administration during World War II, helping create the system of price controls that restrained inflation during the wartime boom. In the 1940s he was a writer for Fortune magazine, where his articles helped spread popular interpretations of Keynesianism. He participated in the United States Strategic Bombing Survey, which found that the supposedly accurate air-bombing of Europe had in fact failed to destroy the munitions plants it had targeted. He was closely tied to the Democratic Party of the postwar years, advising Adlai Stevenson and serving as the American ambassador to India under John F. Kennedy, although he ultimately became an outspoken opponent of the war in Vietnam (and steadily moved to the left of the Democratic Party). In the ’70s he was pilloried by the conservative movement: the Heritage Foundation held him up as an example of all that was wrong with liberalism, and Milton Friedman’s PBS series Free to Choose was created as a conservative alternative to a series Galbraith produced for the BBC called The Age of Uncertainty.

When Galbraith was a young man, the field of economics was more diverse and more openly ideological than it is today. It was far less dominated by mathematical techniques (which Galbraith never used), and political debates were a regular part of economics courses. (For example, the conservative Harvard economist Thomas Nixon Carver used to open the floor to socialist speakers for the first half of his spring semester class, and then spend the second half giving his rebuttal to their arguments.) Thorstein Veblen, Karl Marx and Keynes were among Galbraith’s early intellectual influences. For Veblen, consumption was irrational and atavistic, a display of status more than the satisfaction of genuine needs—a skeptical view of consumerism that Galbraith would follow throughout his work. Even though Galbraith rejected Marxism, finding it as doctrinaire as conservative economics, he admitted that Marx was right about enough—the existence of social classes, the centrality of material interests, the recurrence of economic depressions, the concentration of industry—that any thinking person who was not a Marxist had to occasionally wonder: “Might he not be right on other things—including the prospect for capitalism itself?” In the 1930s, as a young professor, Galbraith became persuaded by Keynes’s arguments against laissez-faire and in favor of an expansive role for government. In the postwar years, American Keynesianism became inextricably bound up with military spending. Galbraith would put forward another version of Keynesianism, one that stressed the common good.

Despite the intellectual richness of economics in the early years of the twentieth century, most of Galbraith’s academic elders in his formative years still believed the old truths of Adam Smith, David Ricardo and Alfred Marshall. The economy was composed of millions of atomistic units—individuals and small firms—that interacted in a marketplace in which prices were set by competition. This perfect system ran best without interference from the state. Competition would produce the best rewards for everyone. Even the Social Darwinist certitude that “the millionaires are the product of natural selection” (to quote one of the credo’s chief boosters, the Yale sociologist William Graham Sumner) remained a respected faith in the academy into the early 1930s.

For many in the broader society, this unthinking faith in the virtues of the market was shattered by the Great Depression. But the men who had devoted their lives to expounding the market’s glories—including quite a few on the Harvard economics faculty—did not retreat. Instead, they spent their time denouncing the New Deal, Roosevelt and the “alien” intellectual influences that had seized control of the state. Throughout his career, Galbraith would write with bitter wit about people who clung to their sentimental philosophies about how the world ought to operate, long after reality should have shaken them free. In The Great Crash, 1929, he would quote Mark Twain, who had provided the Wall Street Journal’s “thought of the day” for September 11 of that year: “Don’t part with your illusions; when they are gone you may still exist, but you have ceased to live.” Galbraith sought to provide a vision of the modern economic world without illusions.

* * *

The four books collected in the Library of America volume take as their central target the idea that the economy is composed of rational, calculating individuals, whose personal preferences shape the market and guide it to an optimal outcome for everyone. It is hard to imagine four such books being written today: they were bestselling, lucid, fiercely confident works that argued in various ways against the idea that the American economy operates as a frictionless, benevolent free market.

American Capitalism, published in 1952, begins by comparing the economy to a bee. “It is told,” Galbraith wrote, “that such are the aerodynamics and wing-loading of the bumblebee that, in principle, it cannot fly.” And yet it did, in blissful ignorance of the laws of physics. So, too, the American economy of the ’50s seemed to be thriving “in defiance of the rules” of economic life as laid down by Smith and Ricardo. A massive government, powerful labor unions and gigantic corporations had replaced the competitive economy of yesteryear. The result, though, was not economic collapse but unprecedented prosperity. Skeptics of bigness who urged a return to the small enterprises of the nineteenth century were nostalgic and misguided. The way to respond to concentrated economic power was not by trying to break it up, as had always been the American tendency. Instead of busting the trusts, the way to curb their power was to create institutions—strong labor unions, consumer associations, farmers’ groups—that could match their economic might. Indeed, the federal government at times took on the role of a “countervailing power,” as when it intervened in the labor market to establish a minimum wage.

The Great Crash, 1929, published in 1955, is perhaps the funniest book ever written about economic collapse. It gains its force by telling the story of the run-up to the stock market crash, and showing how businessmen, journalists, political leaders and economists all helped prop up and sustain the widespread faith in Wall Street as an easy source of vast wealth. This made the bubble of the 1920s possible and the crash that followed virtually inevitable. Galbraith scoured newspapers, political pronouncements, the writings of economists and the prospectuses of investment companies for examples of brash and bombastic statements about the stock market boom of the late ’20s. In June 1929, Bernard Baruch told a magazine that “the economic condition of the world seems on the verge of a great forward movement.” On October 15, 1929, Yale economist Irving Fisher famously pronounced, “Stock prices have reached what looks like a permanently high plateau…. I expect to see the stock market a good deal higher than it is today within a few months.”

Even after the stock market descent began on Thursday, October 24, the political and business communities rushed to reassure the public. President Herbert Hoover told the nation, “The fundamental business of the country, that is production and distribution of commodities, is on a sound and prosperous basis.” In the papers on Monday, October 28, brokerages bought advertisements filled with soothing words: “We believe that the investor who purchases securities at this time with the discrimination that is always a condition of prudent investing, may do so with utmost confidence.” It was the day, Galbraith notes, “the real disaster began.”

The central theme of The Great Crash is the impotence of people with power when in the grip of ideas that leave them no way of confronting reality. They endorse and protect a consensus that conceals what is objectively true. One of the most caustic passages in the book describes the “no-business meetings” held by Hoover in the wake of the crash, which assembled prestigious businessmen at the White House, to great fanfare from the press. No proposals emerged from Hoover’s meetings. No action was taken. No one intended that anything should be accomplished. And yet by gathering men of wealth and undeniable social importance, Hoover intended to give the impression that something of importance was being done while doing nothing at all. They were, Galbraith wrote, a “practical expression of laissez faire.”


With The Affluent Society, published in 1958, Galbraith directed his mockery at the world of professional economics. He argued that the basic principles of economic thought had been developed in the late eighteenth and early nineteenth centuries, a period when the chief economic problem was producing enough to ensure survival. The old truisms about the destructive force of the state, the paramount importance of production and the surpassing efficiency of the competitive economy were no longer relevant in the mid-twentieth-century United States, where material prosperity had spread far beyond the upper classes. Yet the “conventional wisdom” of economic life, reiterated with what Galbraith described as religious solemnity at Chamber of Commerce meetings and in the pages of prestigious academic journals alike, made it impossible to govern in a reasonable way. The “march of events” had made the old ideas irrelevant, but businessmen and economists clung to them desperately:

The business executive listening to a luncheon address on the immutable virtues of free enterprise is already persuaded, and so are his fellow listeners, and all are secure in their conviction. Indeed, although a display of rapt attention is required, the executive may not feel it necessary to listen. But he does placate the gods by participating in the ritual. Having been present, maintained attention, and having applauded, he can depart feeling that the economic system is a little more secure.

Nor were such rituals only for the unsophisticated. The conventional wisdom existed at least as much in academic life as in the broader world: “Scholars gather in scholarly assemblages to hear in elegant statement what all have heard before. Again, it is not a negligible rite, for its purpose is not to convey knowledge but to beatify learning and the learned.”

Because of the old faith, inherited from the classical economists and refined to a harsh moral code in the Victorian era, that only hard work, private initiative and a relentless emphasis on increasing productivity could lift human society out of poverty, people were irrationally hostile toward the public sector and government—even though in the modern context they might accomplish goals that were far more socially valuable than the exertions of private firms. Private consumption rose to heights of bizarre extravagance—while schools and parks had to beg for money from the state. “Vacuum cleaners to ensure clean houses are praiseworthy and essential in our standard of living,” Galbraith wrote. “Street cleaners to ensure clean streets are an unfortunate expense. Partly as a result, our houses are generally clean and our streets are generally filthy.” He also observed the odd discrepancy in attitudes toward public and private debt—the one, sharply condemned, the other eagerly encouraged.

The idea that a society ought to commit its economy to satisfying consumer wants no longer made sense if the wants themselves were managed by the companies that were selling the goods to satisfy them (Galbraith called this the “dependence effect”). Once, more production had meant less hunger, less misery, less privation. In the modern world, it only meant satisfying the craving for “shiny rumpus rooms, imaginative barbeque pits, expansive television screens and magnificent automobiles.” Smith had believed that material affluence might lead to a world of greater equality and decency and freedom—but in fact people were enslaved by the whims induced by advertisers. In one of the best-known passages in The Affluent Society, Galbraith described an American family going out for a camping trip in a top-of-the-line “mauve and cerise” automobile, cruising along on a badly paved highway, past a countryside whose natural beauty had been blotted out with billboards, dining on a picnic of packaged foods beside a polluted stream. “Just before dozing off on an air mattress, beneath a nylon tent, amid the stench of decaying refuse, they may reflect vaguely on the curious unevenness of their blessings. Is this, indeed, the American genius?”

In The New Industrial State, published in 1967, Galbraith presented a vision of the American economy that systematically turned the axioms of economics upside down. In the mid-twentieth century, Galbraith argued, the internal “planning system” of gigantic corporations largely substituted for the free market. This marked a fundamental change in economic life. In the modern corporation, bureaucrats and managers exercised far more power than stockholders over the companies that the latter nominally owned. Scientific knowledge and technological expertise became more important in economic life than the willingness of entrepreneurs to take economic risks. The ability of corporations to run on their retained earnings meant that management no longer had to worry much about the opinion of investors. As massive corporations controlled the market, they no longer even needed to maximize profits in order to survive. As a result, the American and Soviet economies had more in common with each other than they did with the idealized vision of small, decentralized production still celebrated in economics textbooks. In his earlier work, Galbraith had hoped that labor unions and the state could counter corporate power. In The New Industrial State, he argued that both had been subsumed by the demands of the corporate monolith. The epitome of the modern economy was the bloated military industry, which was able to win government contracts for technologically sophisticated but socially useless bombs manufactured to support a ginned-up arms race with the Soviet Union. Galbraith believed that attempting to shrink these gigantic companies was futile and counterproductive, but he did think that they could be turned to different, more socially accountable ends.

* * *

Of the books collected in this volume, The New Industrial State has aged the least well. For one, the economy’s industrial base has eroded since Galbraith’s day; General Motors, Chrysler and Ford are no longer unassailable giants. These differences go beyond international competition and the problems of the manufacturing sector. Corporations are no longer the stodgy, management-driven enterprises Galbraith depicted. Shareholders are not the sleepy figures they may have been in the mid-twentieth century. The labor peace that Galbraith wrote about has evaporated, and the prospects he saw for a humane capitalism seem dim. Even in his day, there remained, as Galbraith knew, deep pockets of poverty. Today, the gulf between the richest and the poorest has grown wider, with the policies that support such inequalities buttressed, in part, by the intellectual project of modern conservatism, which Galbraith once called “one of man’s oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness.” With the headlines dominated by news of recession and austerity, how clear is it that we live in an “affluent society” any longer?

In a way, Galbraith sought to expand on Keynes’s idea that the world of abundance that was unimaginable at the start of the Industrial Revolution demanded the invention of a new economic morality. No longer was it necessary to extol the virtues of hard work in an unthinking way. In the society Galbraith imagined, once the ideology of production for its own sake no longer reigned supreme, a thriving and rich public realm could come into existence, filled with parks, libraries and universities, accompanied by a new vision of the good life. Yet temperamentally and politically Galbraith was a skeptic, not a radical. He always believed that people were led astray by bad ideas, the suggestion being that reason might prevail if only the right leadership were at the helm. His confidence, in many ways, was a reflection of the optimistic, prosperous period in which he wrote. At the same time, Galbraith’s life was punctuated by tragedy: the sudden death of his mother when he was a teenager, his father’s accidental death only months after Galbraith was married and the loss of his second son to leukemia at 7. The books in this volume were all written after these losses, and it may be that some of Galbraith’s doubts about the orderly operation of the economic universe arose from his own acquaintance with the caprice of life.

But while some of the particulars of his vision may seem out of place today, his argument that there is something absurd about a society that can afford tremendous mansions, private jets and elite colleges while cities close firehouses, shut down bus lines and debate whether their crumbling public schools can even stay open five days a week seems as relevant as ever. So does his willingness to poke fun at those who recite platitudes about the market. One of the strengths of this volume is that it includes the introductions written by Galbraith, who died in 2006, for the updated editions of his four classic works. The 1993 edition of American Capitalism reminds the reader of the immense delight Galbraith took in rebelling against economic ideas long held to be sacrosanct: “For me, at least, there has always been a certain pleasure in questioning the sacred tenets.” In the nineteenth century, economists exalted competition as a civilizing force, one that would uplift the weak and glorify the worthy. True believers, therefore, would see Galbraith’s happy farewell to competition as a “treaty” with “the forces of economic evil.” Did this alarm him? Not in the least. “I enjoyed doing this, and I trust that some of this enjoyment is still evident in this book.”



Right On, ‘Right On’!
https://www.thenation.com/article/archive/right-right/
Our Readers
Oct 14, 2009


Guatemala City

It’s not unusual for The Nation to publish something that by itself justifies the whole year’s subscription. With the September 28 issue the ante went up. I’d place Kim Phillips-Fein’s “Right On” as at least two years’ worth. Brilliant, profound, penetrating, lucid and intellectual… But it’s not just the dazzle; it’s the wisdom and insight.

PAUL MUNSELL


Eugene, Ore.

Kim Phillips-Fein’s historiography of modern conservatism is illuminating, yet her schema of generational changes in how historians have narrated this history cannot escape the history she would reframe. In her story, a first generation of scholars was overly focused on racial backlash in the 1960s. These histories were supplanted by accounts that sought to analyze conservatism as a principled, grassroots movement dating back decades before the 1960s. That story is being succeeded, she argues, by a new generation of histories that properly make political economy the causal agent in history.

One problem with this developmental account of the scholarship is that it too neatly divides historical concerns, falsely splitting anxieties from interests, economics from politics, and race from class. Certainly political economy matters, but its meaning is not self-evident. Actors on the ground (elite and grassroots) had to generate convincing interpretations of economic conditions via a contingent linking of race, anti-elite populism, business conservatism, anti-feminism, anti-statism (and yet military and carceral power) to a triumphant affirmation of national identity. Racial fear, market ideology and evangelical Christianity have no necessary or inherent connection; these positions had to be forged into a coherent worldview across the long postwar era in order for conservatism to become hegemonic. Phillips-Fein is thus right to foreground the role of liberal individualism in conservative success, but she ignores how it is made politically meaningful by its mediation through race and gender, by contrast to demonized figures of embodied dependence. Indeed, an exclusive focus on economics or on liberal ideology may unwittingly echo the voices in our political culture–right and left–that would wish away the still-pervasive racial dimension in American political and economic life. Rather than a linear narrative that progresses through different positions, therefore, it seems more fruitful to hold in tension the different elements of conservative ascent.

JOSEPH LOWNDES, author
From the New Deal to the New Right:
Race and the Southern Origins of Modern Conservatism

GEORGE SHULMAN, author
American Prophecy: Race and Redemption in American Political Culture


Tarzana, Calif.

Finally historians have the critical distance and vantage point to look back at what has been up to now largely overlooked, and to start to connect the ideological dots and major turning points of the conservative movement’s post-’60s emergence. As an overview, the Phillips-Fein essay touches on many of the significant works necessary to place the movement in perspective. A particular case is the conflicted cultural path public education as a liberal institution has taken because of Ronald Reagan’s conservative influence. His election in 1980 is considered a standard baseline, but of deeper significance is his election as governor of California in 1966, and the culture war he unleashed on the state’s public school system (K-16), which later appeared full blown at the national level. I recommend Kurt Schuparra’s excellent Triumph of the Right: The Rise of the California Conservative Movement, 1945-1966; W.J. Rorabaugh’s Berkeley at War: The 1960s; David P. Gardner’s Earning My Degree: Memoirs of an American University President; and Terrel Bell’s political memoir, The Thirteenth Man. Bell was Reagan’s secretary of education, and Gardner was president of the University of California, appointed by Bell to head the National Commission on Excellence in Education, which published A Nation at Risk in 1983. Caveats aside, progressives are at risk if they do not take heed of Phillips-Fein’s overarching point: history has a strange way of rescuing the defeated. And that history is learned in schools.

JAMES ANDREW LaSPINA, author
California in a Time of Excellence:
School Reform at the Crossroads of the American Dream


Commack, N.Y.

What has triumphed is the Spite Right. Its progeny include Limbaugh, Hannity, Coulter, Goldberg, Gallagher, Malkin, Ingraham, Savage, O’Reilly–self-scribbled caricatures who dwell in their own political cartoon, where there are only intrinsically evil “liberals” (Limbaugh adduces Ed Koch and William Kunstler on the same page) versus “conservatives” whose goodness derives solely from fighting them (see ABCDunlimited.com/ideas/rightism.html).

BARRY LOBERFELD


Phillips-Fein Replies

New York City

Many thanks to those who wrote in about my article, and especially to James Andrew LaSpina for referring readers to several important works on California and Ronald Reagan. To respond to the concerns raised by George Shulman and Joseph Lowndes, who is himself the author of a wonderful book on the origins of the right, it was not my intention to suggest that the recent scholarship that focuses on political economy somehow replaces or invalidates all the work that came before. The field has developed through historians having arguments with each other; but each of the different strains of the historiography has its strengths and weaknesses, its areas of insight and areas of elision. Looking at political economy adds a new dimension to understanding the backlash against civil rights, for example by illuminating how political leaders such as Jesse Helms were able to recast a defense of racial separation in the more politically palatable rhetoric of market individualism or (as historian Tami Friedman has argued) by showing how Southern business boosters were able to win industry away from the North by adopting a states’ rights language that justified their rejection of the New Deal state as well as their defense of segregation. But while the newer scholarship on the role of economic ideas and business activism in the rise of the right has the virtue of helping to explain why and how conservative power has helped create widening economic inequality, finding ways to connect this narrative to the history of backlash and racial exclusion will be one of the tasks scholars must grapple with in the years to come.

KIM PHILLIPS-FEIN


Clarification

William F. Baker’s “How to Save the News” [Oct. 12] inadvertently attributed funding for A.C. Thompson’s Nation investigation, “Katrina’s Hidden Race War,” solely to ProPublica. In fact, that investigation was directed and underwritten by the Investigative Fund at The Nation Institute. ProPublica provided additional support, as did the Center for Investigative Reporting and New America Media.



Right On
https://www.thenation.com/article/archive/right/
Kim Phillips-Fein
Sep 10, 2009

Gov. Ronald Reagan and wife, Nancy, opening the California State Fair, 1968. (Walt Zeboski / AP Images)

Is the conservative movement dead? In November, when many of its leading intellectuals publicly abandoned the McCain-Palin ticket, deserting their comrades and going over to the other side, the movement suffered not only electoral defeat but ideological apostasy. During the transition, as the stock indexes of the world tumbled, crushing the blithe confidence in free-market ideas universally espoused just a few years earlier, many seasoned political observers wondered whether the long-awaited "conservative crack-up"–to quote the title of R. Emmett Tyrrell Jr.'s book of 1992, which predicted the imminent decline of the conservative project–might have at long last arrived. The old Reagan coalition had split into warring elements, with the traditionalists turning on the libertarians and Wall Street executives backing away from a Republican regime that proved to be inept at managing economic chaos. A movement that claimed to have descended from both Milton Friedman and Edmund Burke, wedding the fast pace of capitalism and the slow, stately march of tradition, might always have seemed ideologically strained. By the spring of 2009, the cracks had widened into fissures, and the movement collapsed in on itself.

Or did it? After all, the death of conservatism has been prophesied many times before. In 1964 the New York Times opined that Arizona Senator Barry Goldwater had “not only lost the Presidential election…but the conservative cause as well.” In 1977 Fortune magazine pronounced that the Republican Party was “exceptionally weak,” virtually “bereft of any ‘turf’ that is securely its own.” Three years later Gerald Ford made a play for the Republican nomination by insisting that the country would never elect a movement conservative. “A very conservative Republican can’t win in a national election,” he told the Times. Even during the 1980s, when a genuine movement conservative occupied the White House, liberal journalist Richard Reeves declared Ronald Reagan’s politics a mere “detour” from the country’s liberal history.

Perhaps this time things really are different, but the apparent rifts and tensions within the conservative movement–between elitism and populism, capitalism and custom, the piety and fanaticism of the far right and the moderation of the old guard at the helm–have not caused dissolution, historically speaking. On the contrary, they have generated a strangely durable, tenacious politics that has avoided being shunted to the margins of American life. Goldwater may have been trounced in 1964–he lost every state except his own and the five of the Deep South, where voters were drawn to his opposition to the Civil Rights Act–but movement conservatives remained undeterred and ended up using his campaign's donor lists to fortify the ranks for future battles. Four years later the Republican Party, which had seemed a shambles during the Johnson presidency, retook the White House. When Richard Nixon left Washington in disgrace in 1974, it seemed, again, that the GOP would be weakened for a generation. Instead, Reagan defeated Jimmy Carter in 1980 and was re-elected in a landslide. For conservatives, it seems that their most crushing defeats herald their greatest victories.

Given these Houdini acts, it is surprising that until recently there has been no significant body of scholarship on the history of postwar conservatism. In 1994 historian Alan Brinkley observed that the right had become "something of an orphan" in American political history. Most scholars–themselves liberals–believed that liberalism had triumphed in the United States over the course of the twentieth century. The disparate regional cultures of the country had been unified into one cosmopolitan nation. Ever since the Progressive Era and World War I, Washington had assumed greater management of the economy, culminating in the New Deal and later in the War on Poverty and Great Society programs. The Manichaean, parochial world of religious fundamentalism had given way to the subtle moral distinctions of modernity. Brinkley suggested that if scholars had overlooked the lingering sway of market ideology, the pervasiveness of antigovernment sentiment and the sustained vibrancy of fundamentalist Christianity, it was because they were utterly convinced of liberalism's triumph. Such assumptions left them unable to reckon with the lasting power of conservative politics and therefore incapable of understanding the collapse of the New Deal electoral coalition and the resurgence of the right in the 1980s.

Fifteen years after the publication of Brinkley’s article, the conservative movement is no longer an orphan in the academy. Indeed, writing its history has become something of a cottage industry, with every year bringing new monographs, articles and dissertations on a dizzying array of subjects: right-wing populism among long-haul truckers; the grassroots religious conservatism of Southern preachers who migrated to California in the postwar period; the role of Phoenix, Arizona, in the rise of laissez-faire ideology in the Republican Party; the intellectual influence of Ayn Rand; gay and African-American conservative thought; backlashes in communities ranging from Baltimore to Orange County; anti-unionism in American culture; histories of far-right preachers and the leaders of organizations like the John Birch Society.

As someone who has written about conservatism, I think that while the field has flourished for intellectual and professional reasons (nature, one might say, abhors a vacuum in the scholarly literature), there are political causes for its growth as well. The body of scholarship on the right grew as the movement leapt from one success to the next. Many (although not all) of the younger historians writing about the right are actually left of center, children of the Reagan era who came of age as scholars during the Bush years and have sought to understand the conservative movement partly to forge the tools to undermine it. This groundswell of rich and complex research has allowed a thousand monographs to bloom, but it has yet to produce a retelling of the larger narrative of the postwar period incorporating the insights of recent histories of the right–something on the order of James Patterson’s powerful synthesis of postwar American history, Grand Expectations. And there have been few efforts to understand what the history of the right has to tell us about the movement’s influence today, or its future, which is especially important at a time when conservatism has found itself on the defensive once again.

Before Reagan’s election in 1980, academic scholarship about the conservative movement was meager. There were a few books on topics like the right in the 1930s, the history of McCarthyism, the Ku Klux Klan and the isolationist politics of the far right, but most academic historians spurned the subject of conservatism. The right was seen as a relic of American history–a menagerie of resentful oddballs and misfits, fanatical preachers, eccentric racists and assorted cranks who rejected the New Deal and the Great Society. This patronizing view followed from that of the “consensus” historians of the 1950s, such as Columbia University’s Richard Hofstadter, who interpreted McCarthyism as the vicious politics of “status anxiety.” According to Hofstadter, the paranoia of the right was fed by the resentment of members of social groups whose fortunes were declining in the newly prosperous mass-consumption society of the postwar era–“so many people do not know who they are or what they are or what they belong to or what belongs to them,” Hofstadter wrote in “The Pseudo-Conservative Revolt”–and who turned to the politics of conspiracy in an anxious attempt to shore up their sense of moral virtue. These “pseudo-conservatives” believed they were powerless victims menaced by evil outsiders, and they lashed out angrily at their perceived enemies, confident of their righteousness. More generally, in the 1970s and ’80s scholarship on conservatism was scarce because old-fashioned political history was in decline: the study of culture and social life began to eclipse the analysis of elections, protests, strikes and legislative battles. But Reagan’s election reawakened scholarly curiosity in the conservative movement. How could such hostility to government, such opposition to the tolerant, pluralistic politics of liberalism, such a belligerent temperament thought to be moribund, suddenly re-emerge at the center of political life?

The first generation of historians to take conservatism seriously focused closely on the politics of social backlash. The main characters of their stories were the working-class white voters of Northern cities and the South who had grown alienated from a Democratic Party that they thought was overly solicitous of African-Americans. They were angry about welfare, school busing and affirmative action–policies they felt helped people less deserving than themselves. More deeply, they were frustrated by the radical politics of the 1960s: the longhaired kids protesting Vietnam, the hippies slouching in parks, the feminists with the temerity to blame the nuclear family for their oppression, the black-power advocates with their clenched fists and ten-point programs. The recession of the 1970s drove these blue-collar workers away from their old faith in the power of the state to safeguard prosperity.

Most of the scholars who wrote about these reactionaries–such as Jonathan Rieder in Canarsie (1985), Ronald Formisano in Boston Against Busing (1991) and Dan Carter in The Politics of Rage (1995)–were far from sympathetic to their subjects; their commitments lay with the African-American families who sought to integrate the schools of white neighborhoods. Nonetheless, they endeavored to present their subjects as working-class people in a time of economic decline, desperate to protect the few institutions–home, family, school and neighborhood–over which they had control. As Rieder put it in his study of the backlash in Brooklyn, “The basic fact of life for the residents of Canarsie was the precariousness of their hold on middle-class status, the recency of their arrival in that exalted position, and the intense fear that it might be taken from them.” In this, Rieder and others shared a set of assumptions with the 1950s historians of McCarthyism: they treated conservatism as a populist politics of displaced frustration, in which rage at liberalism reflected anger about the underlying problem of increasing economic insecurity.

During the years of Bill Clinton and George W. Bush, historians turned away from an emphasis on working-class reaction against the social radicalism of the 1960s. While the backlash might explain why the New Deal electoral coalition collapsed and how Reagan won the presidency in 1980, studies of the Boston busing crisis and the protests over school curriculums in West Virginia’s Kanawha County could not offer a convincing explanation of a more longstanding shift that had occurred in American politics. Why had the right proved able to win one election after another? How was it able to exercise such influence even when Clinton reclaimed the White House for the Democrats? Backlash histories were at heart local accounts of liberal decline, not synthetic analyses of conservative ascendancy. They could explain changes in voting patterns but not the rise of a new political force.

A new generation of historians began to shift the focus away from the politics of resentment and toward movement conservatism, not only in Washington but across the country. Many focused on the social origins of the conservative campaign. Here, the main actors were a group of “suburban warriors,” to quote historian Lisa McGirr–prosperous, upwardly mobile men and women who carefully organized in support of a right-wing agenda by knocking on doors, distributing literature, screening films and holding meetings. They were driven not by irrational fears or anxieties but rather by a deeply held set of beliefs about how society ought to be organized, with anticommunism at the forefront of their politics. Far from being socially marginal, they were among the postwar economy’s winners, and their politics reflected their sense of entitlement. They used the same kinds of strategies familiar to any left-of-center social movement, but for dramatically different ends.

Much distinguished this second, younger generation of scholars from historians like Carter and Formisano. The first generation wrote primarily about cities and the South, the second about suburbia. The first focused mostly on specific episodes or campaigns, while the second took a much longer view that often spanned the entire postwar period. But perhaps the biggest interpretive difference lay in their respective approaches to the tumultuous years of the 1960s. Those writing in the wake of the Reagan revolution had shared the vivid hopes of 1968 and the subsequent disappointment of those dreams. The radicalism of the decade and the backlash it had provoked still seemed to be the central events underlying the conservative shift. For some, the New Left’s inability to moderate its strident moralizing and appeal to a broader public made it tragically culpable for the ultimate failure of consensus liberalism–the left had been unable to speak to working-class Americans, who turned instead to the right. For others, the clash simply revealed the intractable racism endemic to segments of American society, meaning that there was no particular tactical failure on the part of the left. Yet most of these earlier scholars, who focused on the collapse of the New Deal electoral coalition, agreed that the modern right was born in this furious, embittered reaction against civil rights, feminism and the antiwar movement.

By contrast, the newer wave of scholarship–books such as Jonathan Schoenwald’s A Time for Choosing (2001), McGirr’s Suburban Warriors (2001) and Rick Perlstein’s Before the Storm (2001)–granted a much smaller place to the fiery emotions of the years of George Wallace, Jane Fonda and Bobby Seale, emphasizing that in important ways the conservative movement actually predates the 1960s. Far from being a sudden, explosive and negative reaction to the decade’s tumult, the conservative movement simmered throughout the postwar period, motivated by its activists’ positive vision of small government, the perfect social ordering promised by the free market and a world without communism. The social crises of the 1960s may have offered the movement an opportunity to broaden its base of support, but conservatism was thriving before that upheaval. The earlier generation of scholars, after all, never did explain how thirty years of conservative politics could have sprouted from a few explosive conflicts in the 1970s. Nor could they elucidate how the blue-collar workers who cheered for George Wallace wound up supporting a politics committed to promoting the free market, fighting unions and rolling back the welfare state. How had the politics of blue-collar resentment and fear of social change come to be joined to the stridently laissez-faire agenda of the movement faithful? To understand the connections, it was necessary to look at the rise of the conservative movement on its own terms–to study its internal logic, its intellectual history and the way its activists promoted their agenda.

Yet at the same time, although the second generation of historians made clear that the victory of conservatism was no mere accident of history but instead proof of a longstanding, concerted effort to shape American politics, their vision of the movement as simply another grassroots mobilization seems vexed. In their attempts to write respectfully about the conservative movement (many are still warring with Hofstadter’s description of it as a paranoid politics), some of them risked smoothing over the baroque strangeness of the American right–such as the way that its prosperous believers and leaders portrayed themselves as aggrieved victims, or its searing hostility to government, to unions and even to the most minimal welfare state.

Moreover, both schools of historical interpretation share a sense of the conservative movement as in some sense populist–a trope that dates to writers like Hofstadter and that is also shared by many conservatives. The earlier generation of writers emphasized the anti-elitism of the backlash, while the later one suggested that the right gained strength by using the same social-movement organizing strategies that the left had employed. The image of conservative populism has penetrated deeply into everyday political chatter as well, thanks in no small part to the efforts of conservatives, who–as Thomas Frank argued in What’s the Matter With Kansas? (2004)–have long sought to claim theirs as a plain-folks, common-sense mobilization against effete social dreamers. When Barack Obama spoke before an audience of wealthy donors in San Francisco in 2008 about the challenge of winning over Rust Belt towns in Pennsylvania, where voters clung desperately to “guns or religion or antipathy toward people who aren’t like them,” he was invoking this familiar image of populist working-class conservatism.

But this raises a larger question: is it correct to see the conservative ascendancy as a populist mobilization akin to the civil rights or labor movement? Some parts of the movement–like antiabortion activism–do seem similar to left organizing projects: they use direct-action strategies and cultivate grassroots commitments. Moreover, from the John Birch Society members who sought to imitate the communists they so hated and feared to the businessmen of the National Association of Manufacturers who wanted to mimic the labor movement’s methods of organizing, there has been much self-conscious discussion among conservatives about learning from the left. And the simmering resentment that fueled the backlash against the civil rights movement and other social movements certainly helped shift voters away from the Democratic Party in the 1970s and ’80s.

Still, it is hard to see the mobilization and political education of a massive number of people as critical to the actual victories of the right in the way that it must be for any movement that seeks to challenge social hierarchies in a sustained way. For unions to organize in the 1930s, it was not enough for Congress to pass the Wagner Act–workers needed to put their livelihoods and lives on the line. For segregation to end, ordinary men and women had to come to the point of being willing to break the law or even risk losing their lives. Does it really require the same kind of commitment to cut taxes, end regulations or fight labor unions? And if it doesn't–if the way the conservative movement harnesses the discontent of its constituency is largely limited to certain moments of political theater, when the image of mass support is needed to garner television coverage–shouldn't we ask whether populism really is the backbone of the right at all?

The most recent scholarship about conservatism in the late twentieth century has taken a new direction. It has emphasized political economy instead of movement politics, looking at such topics as the development of Wal-Mart’s distinctive free-market evangelism (Bethany Moreton’s To Serve God and Wal-Mart, from 2009); the origins of laissez-faire revival in the development politics of Southwestern cities (Elizabeth Tandy Shermer’s dissertation “Creating the Sunbelt: The Political and Economic Transformation of Phoenix, Arizona”); and the organic emergence of a populist anti-statist politics among groups like rural long-haul truckers trying to avoid regulation (Shane Hamilton’s Trucking Country, from 2008). Implicitly, this new body of work raises the question of whether the reasons for lasting conservative success might be related to deeper economic changes in the country. Perhaps the age of liberalism that followed World War II depended on the prosperity of a vibrant manufacturing economy (and wartime Keynesian spending) that has mostly vanished. The rise of a competitive service-oriented economy has created a new kind of voter with changed economic interests and a different relationship to politics–one to whom the old liberal vision no longer appeals. Perhaps conservative leaders enjoyed electoral success because their movement has simply been better able to adapt to deep changes in the bedrock of American society.

At the same time as they have started to emphasize an account of the transformation of the country’s political economy to explain the durability of the conservative movement, historians have also started to describe postwar liberalism and its many triumphs with skepticism, if not pessimism. (While this interpretation goes back to the New Left, it has gained new strength today, even though there is also a countermove by historians like Kevin Mattson to rehabilitate postwar liberalism as a fighting faith.) Some scholars suggest that the entire postwar period was one of struggle and that the liberal order was always more fragile than its victories suggested. The North never truly supported racial equality, at least when it came to its own cities; the manufacturing companies so long thought to have given tacit support to the New Deal now seem to have done all they could to resist and fight it behind the scenes. Instead of a liberal period of consensus that was shattered in the late 1960s and ’70s, there was a continued struggle throughout the postwar period over the legacy of New Deal liberalism.

Others go further still, arguing that the culture and society created during the postwar era contained elements that would prove fatal to liberalism. After all, the government underwrote the expansion of suburbia, which would become the home of an intransigent individualism hostile to the very state that had subsidized its creation. And the liberal project itself never had deep philosophical or cultural roots. It was created during the Great Depression out of expediency, in a moment of economic and political crisis, and never reflected a sustained or coherent agenda; even FDR was ambivalent toward labor and the welfare state. More than anything else, postwar liberals were committed to anticommunism–hence their support of the Vietnam War. The entire postwar era (as Jefferson Cowie and Nick Salvatore have explained) might be seen as no more than a “long exception” to the more lasting conservative project of individualism and laissez-faire that has defined so much of American history.

Another version of this story can be found in the world of intellectual history–for example, in Patrick Allitt’s The Conservatives: Ideas and Personalities Throughout American History (2009). Here, conservatism is analyzed as one of the deep trends within the country’s political life, a “reactive” politics that seeks to respond to “perceived political and intellectual challenges,” to conserve what exists, to protect against social dissolution and to resist the march of social and political equality. In this view, the post-1945 conservative movement is one more development of a strain of politics reaching back to the founding fathers. Unlike historians who have noted how the modern conservative movement–in its determination to tear down the New Deal and its preference for a radical market vision rather than a cautious enthusiasm for capitalism–seems to break with the earlier style of conservatism, Allitt emphasizes the continuities between postwar conservatism and the laissez-faire politics of the nineteenth century.

When Alan Brinkley observed that American historians had lost interest in the right, the general interpretation of the United States in the twentieth century was one of liberal victory followed by tragic decline. Today a new vision seems to be emerging–one that sees the liberalism of the midcentury as always hemmed in by hostility and limited by its internal tensions. In the latest scholarship, the old narrative has been turned upside down: more and more, historians are depicting the century as one of conservative strength only briefly interrupted, some going back still further to see all of American history shot through with the power of forces sharply opposed to equality and democracy. Yet just as the earlier story of liberalism triumphant overlooked the continued existence of a conservative opposition, this bleaker vision may also be too stark. It comes close to substituting a new monolithic political force for the old one and eliding the reality of continued struggle, in which the two sides shape each other and the outcomes are controlled by neither alone. The next wave of scholarship on the history of the right will likely strive not only to tell the story of the movement’s rise but also to account more fully for its internal tensions and the difficulties that it encountered–especially once it came to power in the 1980s.

What does recent historical scholarship say about the future of the right? On the one hand, there undeniably is a crisis–of leadership, of faith, of constituency–among conservatives at this point in history. Although the movement institutions (think tanks, talk-radio stations, churches) remain powerful forces, they are also catering to a narrow and frustrated segment of opinion. The far right’s resentment is deepening: it angrily denounces Obama as a socialist, holds “tea parties” to protest taxation and has turned to Rush Limbaugh as a leader. “Birthers” seek to delegitimize the Obama presidency by attempting to prove that he was born in Kenya, not Hawaii. True believers on the right seem to be sinking deeper within their movement, insisting that George W. Bush failed to adhere sufficiently to first principles but never considering whether the principles themselves were the problem. Yet as they consolidate their faith, they also become more marginal, less a central part of American politics.

Meanwhile, the intellectuals who have brought conservatism to a broader public have been moving away from their old certainties. Ross Douthat, the young conservative tapped as an op-ed writer by the New York Times, has argued that Republicans need to win back the “Sam’s Club” voters and convince working-class people that family values are actually in their economic interest–even though doing so may mean abandoning a hardline laissez-faire position. Times-men David Brooks and Sam Tanenhaus have described a putative “Burkean” strain on the right, a restrained, moderate and sentimental tradition whose adherents objected to the “excesses” of the Great Society but were willing to preserve the established structures of the New Deal out of respect for tradition and settled norms. (Whittaker Chambers, Daniel Patrick Moynihan and even Richard Nixon, in the first years of his presidency, are supposed to be a few of the people who practiced this politics, not all knowingly so.)

But this attempt to reinvent the right is fraught with difficulty. Leaving aside the question of whether Burke–who wrote in the heat and fury of counterrevolution–really held such genteel commitments, it is hard to make the case that the American conservative movement can easily be divided between moderates in the Burkean mold and radical reactionaries. Brooks’s and Tanenhaus’s turn to Burke recalls George Nash’s 1976 classic The Conservative Intellectual Movement in America Since 1945. Like many conservatives who tell their own history, Nash placed the greatest emphasis on the history of ideas and the fusion of traditionalist, anticommunist and libertarian strains of conservatism. By portraying a conservative world teeming with ideas, he was able to refute the enduring 1950s image of conservatism as intellectually dead, even as he glossed over the social and political bases of the movement. Indeed, as Tanenhaus has written, the people who have always determined the course of American conservatism have been the passionate ideologues, the “revanchists” who are less interested in what they can “conserve” than what they can raze.

Although Tanenhaus has argued that left-leaning social historians, having abandoned intellectual history, are ill equipped to understand the appeal of conservative ideas, the project of inventing a tradition of conservative intellectual moderation as the road not taken for American politics seems to have more to do with the present dilemma of the movement than with historical inquiry. That Brooks and Tanenhaus find the motif of Burke appealing is largely a sign of their longing to revive a serious, sophisticated and mature conservatism, and their sense that, thanks to the radicals, the right is in desperate straits and has entered a period of decline. This situation holds a special irony for Tanenhaus, who believes we are living in a conservative era, “perhaps the most conservative since the Eisenhower years,” as he writes in his new book, The Death of Conservatism.

But this narrative of conservative defeat and retrenchment–tightly focused on backlash politics and movement conservatism–is only one part of the story. The recent turn in scholarship toward economic change offers another way of understanding the fate of the movement. For despite the financial crisis of the past year, the faith in laissez-faire that conservatives promoted throughout the postwar period continues to exercise a deep hold on American politics. Think tanks, business organizations and corporate lobbying groups still wield great influence in Washington and throughout the country, their confidence hardly shaken by the disasters that their politics helped bring about. Wealthy individuals who financed the movement are still donating their dollars to fight healthcare reform. Companies such as Citibank are battling labor-law reform even as they take bailout funds. The self-righteous chutzpah of bankers who insist on paying themselves tremendous bonuses even after driving the country into financial turmoil reflects the triumph of an ideology in which the private sector can do no wrong. And the economic assumptions that the conservative movement advanced throughout the postwar period continue to prevail in our culture and politics overall, where it is hard for people to imagine a check on, much less an alternative to, a merciless market-driven world. Even the Obama administration is well stocked with Wall Street veterans and economists committed to preserving high-octane financial capitalism.

History has a strange way of rescuing the defeated. As a self-conscious movement, conservatism has sunk into one of the deepest crises in its history. But it may be only in this moment of apparent ruin that we can for the first time assess the full significance of all that the right has won.



Living for the City: Robert Clifton Weaver's Liberalism
Kim Phillips-Fein
December 22, 2008
https://www.thenation.com/article/archive/living-city-robert-clifton-weavers-liberalism/

Robert Clifton Weaver at HUD headquarters, 1968. (Schomburg Center, NYPL)

In July 2000 a ceremony was held to commemorate the renaming of the headquarters of the Department of Housing and Urban Development in Washington, a modular phalanx of concrete and glass designed by Marcel Breuer and completed in 1968. Democratic luminaries such as Andrew Cuomo, Charles Rangel and Daniel Patrick Moynihan gathered to celebrate the dedicatee’s life; he was, in Cuomo’s words, a “pioneer who broke through barriers of racism again and again, to build a life of extraordinary achievement and public service.” The praise was richly deserved, for Robert Clifton Weaver had been a prominent economist, a longtime advocate of fair-housing laws and a member of the country’s black intellectual elite ever since the days before the end of segregation. President Lyndon Johnson had appointed Weaver to head HUD after the agency was founded in 1965, making him the first black cabinet official in American history. And it was Weaver who had dedicated the new HUD building three years later, its Brutalist architecture still cutting-edge and the idealism of the Great Society still fresh.

The story of Weaver’s life, as told in Wendell Pritchett’s new biography, Robert Clifton Weaver and the American City, points to a lesser-known narrative in the long struggle for racial equality–one focused on the politics of Northern cities rather than Southern churches, on economic claims more than moral ones, on regulatory agencies and cabinet meetings rather than lunch-counter sit-ins and mass marches. Yet Pritchett’s biography also offers another story, one about the political limits and personal costs of midcentury liberalism for men like Weaver. The New Deal and Great Society policies that Weaver defended only sometimes fulfilled the promise he thought they held for African-Americans, and even then they did so mostly under the pressure of mobilization rather than from the patient and steady work of insiders like him. For all he accomplished during his remarkable life, Weaver emerges as an ambivalent success in Pritchett’s book, a man whose greatest dream, that government action could ameliorate the poverty of African-Americans in the country’s cities, remained in many ways unrealized. By the time Weaver was rightly memorialized by the renaming of the HUD headquarters in 2000, the building seemed as ponderous and passé as the dreams of Great Society liberalism.

Robert Clifton Weaver was born in 1907 in Washington, DC, the second son of parents descended from several generations of strivers. His maternal great-grandfather, although born a slave in North Carolina, was able to obtain training as a carpenter and to hire himself out to the builders of Raleigh. He gave part of his pay to his master and kept the remainder, and over time he was able to purchase not only his freedom but also that of his wife and six children. Weaver’s maternal grandfather grew up in Washington, reaching maturity in the era of Reconstruction. He seized the possibilities of that moment, applying to Harvard University’s new dental school two years after the end of the Civil War and becoming the first professionally trained black dentist in the country. Upon graduation in 1869, he moved back to Washington, opened a practice and established a position for himself in Washington’s black elite (the “black 400”) at a time when the city was known as the “Capital of the Colored Aristocracy.” His daughter–Florence Freeman Farley, Weaver’s mother–graduated from high school, a rare accomplishment for young black women in the late nineteenth century. Florence knew Latin and read her children Tennyson and Longfellow. (Weaver’s father was a high school graduate and a clerk in the Post Office, one of the highest government positions open to African-Americans in the early twentieth century.) The Freeman family legacy was the ardent belief that individual merit and self-reliance could surmount racial hatred. Weaver remembered his mother telling him as a young boy, “The way to offset color prejudice is to be awfully good at whatever you do.”

Weaver and his older brother, Mortimer, whom Weaver thought of as the real intellectual of the family, attended Dunbar High School, which was known as the “crown jewel” of the capital’s segregated school system. From there, Weaver went to Harvard, yet the migration north was not the journey of improvement it had been for his grandfather. The forty-two black students in the early 1920s–seventeen undergraduates, twenty-five graduates, all male–mostly lived off campus and took their meals apart from the white students. The university president, Abbott Lawrence Lowell, barred them from the freshman dorms altogether. As Lowell wrote to the father of an incoming black freshman, “It seems to me that for the colored man to claim that he is entitled to have the white man compelled to live with him is a very unfortunate innovation.” (Lowell also advocated for a 15 percent admissions quota for Jewish students to limit their enrollment.) Weaver had trouble taking his studies seriously; he was often distracted by the social life of Cambridge, especially by “courting,” as he put it. But after his older brother–who had become an English professor–died in May 1929, the educational and professional ambitions of Weaver’s family, its mandate of individual achievement, became his own. His parents, he thought, would be terribly disappointed if he became a lawyer, as he had planned. He had to earn a doctorate. And so he became the first black person to receive a PhD from Harvard in economics.

Harvard’s economics department in the 1920s was intellectually and socially conservative, a far cry from what it would become in the postwar era, when John Kenneth Galbraith was on the faculty (or even the 1930s, when it was home to Joseph Schumpeter and Alvin Hansen, the “American Keynes”). Weaver’s thesis adviser, William Ripley, argued that the biological superiority of Anglo-Saxons explained the industrial advancement of the United States and Europe. When Weaver began graduate school, he wanted to write a dissertation that would “concern Negroes”; he initially proposed a thesis about “Industrial Education and Industrial Opportunities for the Negro.” But once he started work on the thesis, he switched to a more abstract argument about wages and prosperity.

The only jobs open to Weaver when he received his doctorate in 1933 were at black colleges (the first tenure-track black professor would be hired by the University of Chicago in 1942). Although he had taught briefly at the North Carolina Agricultural and Technical College before he started writing his dissertation, upon completing his degree Weaver turned to politics, first with an organization advocating on behalf of fairer treatment of African-Americans by the National Recovery Administration and then as an adviser on Negro Affairs to Harold Ickes, Roosevelt’s Interior Secretary, who directed the New Deal’s Public Works Administration (PWA). Weaver was not the only black adviser in the Roosevelt administration; several dozen young black college graduates and longtime civil rights movement leaders made up the Federal Council on Negro Affairs, known colloquially and in the press as the Black Cabinet.

The Great Depression was especially devastating for black Americans. By 1932 black unemployment in American cities was more than 50 percent; in the Southern countryside, black cotton farmers found themselves unable to earn any profit at all. Because it offered some relief, the New Deal won the support of black city dwellers, especially during Roosevelt’s second term; by 1939 the Works Progress Administration (WPA) was giving financial support to a million black families, and the program employed thousands of black teachers, composers, artists and writers (a quarter of a million African-Americans learned to read and write in the WPA Education Program). Both the WPA and the PWA banned discrimination in their programs.

Yet in part because Roosevelt and his liberal allies in Congress always relied on the support of Dixiecrats, they could never make racial justice a central plank of the New Deal. Southern states often disregarded federal antidiscrimination directives, and Roosevelt didn’t risk making reprisals. His first cabinet was largely hostile to civil rights. Agricultural and domestic work–the occupations that employed most black workers–were excluded from Social Security. The actions of the US Housing Authority, which built more than 130,000 housing units and was a precursor to the later postwar federal housing programs, were sabotaged by Jim Crow. Many of these new apartments went to African-Americans who otherwise would have been living in desperate conditions, and civil rights spokesmen praised the program for charging black tenants lower average monthly rents than whites. But in the South, the new federal public housing buildings were always segregated–and they often were in the North, as well. A housing project in Williamsburg provided more than 1,600 apartments for white Brooklynites; one in Harlem gave fewer than half that number to black families. When Weaver, in 1940, gave a speech before the National Negro Congress defending the opportunities the program had offered for integrated housing (some fifty projects housed both black and white families), conservatives seized on his remarks and managed to get the program’s funding cut altogether.

Because of the Roosevelt administration’s timid commitment to racial justice, Weaver–as a loyal New Dealer–often found himself the target of fierce criticism from the African-American left. The black advisers who worked for the executive branch, wrote Howard University historian Rayford Logan, were merely “the type of Negro that the white people wanted–one who knew what to say and what not to say; one who gladly accepted what the white officials gave Negroes and never made any further inquiry or complaint.” This view seemed to be vindicated in 1941 when labor leader A. Philip Randolph threatened to organize a march of 100,000 blacks on Washington to protest discrimination in defense jobs and the Army. Weaver had a new government job at the National Defense Advisory Committee, where he was supposed to help integrate African-Americans into the industrial defense program, but he found himself defending companies like Curtiss-Wright, an aircraft manufacturer, against charges of discrimination. Weaver had been able to negotiate a deal whereby Curtiss-Wright would train 1,200 black workers at its plant in Buffalo, New York, but activists argued that this was insufficient, given that the company’s Paterson, New Jersey, factory had only sixteen black workers on a payroll of 16,000. Weaver worried that the protests and demonstrations would stir up “reports of extreme racial chauvinism on the part of Negroes, of super-sensitiveness which complicates their problem of industrial employment and of a discrimination mindedness which influences their participation in the armed forces”–this, at a time when the Army kept its blood supply segregated so that white soldiers would not receive transfusions of blood drawn from African-Americans. Ultimately, Randolph, in return for calling off the protest, was able to secure what Weaver could not–a guarantee of colorblind hiring practices in defense programs and the creation of a Fair Employment Practices Commission to investigate charges of racism. The armed forces remained segregated until after the war.

Weaver left Washington during World War II and–along with his wife and their adopted son (Pritchett says the child was most likely born to an unmarried relative)–moved first to Chicago and then to New York City, holding various positions in and out of academia. It was during this period that he wrote his second and most influential book: The Negro Ghetto (1948), a counterpoint to Gunnar Myrdal's An American Dilemma. (His first book, Negro Labor: A National Problem, was a well-received treatment of employment discrimination.) Instead of focusing on the South, The Negro Ghetto treated residential segregation in the urban North–and it was one of the first academic works to do so, long before the notion of the "inner city" became as well worn as an old penny. Instead of looking at racism primarily as a "moral issue," as Pritchett puts it, Weaver sought to explain "the economic motivations and institutions" that drove patterns of segregation in Northern cities. He explained how brokers, by creating severe housing shortages in black neighborhoods and by stimulating anxieties about integration, were able to wrest high profits from frightened white families selling their homes. The economic dynamics of white flight needed to be challenged if residential segregation was to be overcome.

Powerful and detailed, The Negro Ghetto provided an argument in favor of using public housing to create integrated neighborhoods and dismantle the myths of prejudice. Weaver criticized the creation of segregated projects, arguing that they would only exacerbate the problems of the ghetto and that it was “a matter of grave concern to Negroes and liberals when the rise in institutionalized residential segregation was accelerated by housing planned, financed and sometimes owned and managed by the Federal government.” The book also indicted the irrationality of racial covenants (which stipulated that homes had to be sold only to white buyers). In 1948, shortly before the publication of The Negro Ghetto, the Supreme Court overturned racial covenants. Although the Court’s decision did not cite Weaver’s research directly, his work was widely thought to have affected the outcome; perhaps in part because of its influence in the case, The Negro Ghetto garnered glowing reviews. Carey McWilliams described it as the “finest study of its kind that has appeared to date,” and the New York Times pronounced the book “a comprehensive, authentic survey of an acute social problem.”

The Negro Ghetto made Weaver’s academic name, and he became a leader in the fight for fair-housing laws. In the 1950s, New York Governor Averell Harriman appointed him deputy commissioner of housing (he also served as rent administrator for the state’s rent-control board). Yet as Pritchett makes painfully clear in the second half of his book, despite Weaver’s mounting successes, the sense of insecurity instilled in him as a young striver never entirely subsided. He and his family rarely attended the parties and other social events that were a regular part of life for many black intellectuals in New York at the time. His subordinates referred to the “Weaver treatment,” which one described as a politeness “so unrelieved in its iciness that its victims felt they would be warmer if they curled up in a refrigerator.” When a friend called to let Weaver know that he was being offered a job in Harriman’s administration, Weaver at first hung up on her, thinking she was joking.

A bigger call for Weaver came in 1960, when John F. Kennedy nominated him to head the Housing and Home Finance Agency, a predecessor to HUD. After being redbaited in hearings by Congressional conservatives, one of whom asked if it bothered him that The Negro Ghetto had been named a “book of lasting value” by an operation called the Workers Book Shop in New York City, Weaver was confirmed. His tenure had some real successes: in November 1962 Kennedy signed an executive order banning discrimination in federal housing programs–something Weaver had fought for ever since the New Deal, made possible at long last in the early ’60s by the surging civil rights movement. But Weaver remained a man apart in the capital, a full participant in neither the Kennedy administration nor the civil rights movement. He was hardly involved in the discussions over King’s 1963 March on Washington (it is not clear that he even attended the demonstration). He told a reporter that “black chauvinism” was as bad as white, and that he was involved with civil rights “as a liberal, rather than a Negro.” But he had few friends in the Kennedy administration; all of the president’s civil rights advisers were white, and Pritchett notes that Weaver was expected to stay focused on housing, not to get involved in broader conversations about civil rights. Weaver accepted this position, preferring, as Pritchett says, to be seen “as a professional who was black rather than a racial advocate.” It is not hard, though, to imagine Weaver’s frustration with being expected to represent the race while at the same time being excluded from the most important debates about racial politics. Acquaintances described Weaver as “basically a loner” who went every day to the counter of a hotel restaurant to eat a sandwich for lunch by himself.

The political costs of such isolation became clear after Johnson created HUD in September 1965. Since Weaver was the head of the Housing and Home Finance Agency, many thought Johnson would quickly tap him for the job. Instead, Johnson dragged his feet. He told Roy Wilkins of the NAACP that Weaver was not a sufficiently "imaginative" leader and that the first head of HUD should be a white man–perhaps the philanthropist Laurance Rockefeller–who could "do a hell of a lot more for the Negro than the Negroes can do for themselves in these cities." For years Weaver had hoped to be taken seriously as a liberal leader on his own terms, without regard to his race. Later on, he said he "would like to feel that I was appointed not because I was a Negro, but maybe in spite of that fact." But for Johnson the appointment was about nothing else–if he selected Weaver, he feared, critics would accuse him of not picking the best man for the job; but if the job didn't go to Weaver, he would disappoint the "little Negro boys in Podunk, Mississippi," and civil rights leaders would say "when you get down to the nut-cutting…this Southerner just couldn't quite cut the mustard–he just couldn't name a Negro to the cabinet." For the proud, restrained Weaver, the waiting was torturous, a "very, very difficult thing," as he remembered in later years. Humiliated and made miserable by reports of Johnson interviewing other candidates, he nearly resigned. And when Johnson at last offered the HUD post to Weaver early in 1966, it was almost too late for him to enjoy it.

Once in office, Weaver presided over a great expansion of public housing. More affordable housing was built under the Housing and Urban Development Act of 1968 than at any other time in the nation’s history, and Weaver’s proposals were the ones the administration followed. A national fair-housing law was finally passed in 1968, with Weaver lobbying Congress to support it. Yet these were also the years of the riots in cities like Detroit, of deepening poverty and deindustrialization. Black radicals criticized the housing projects that Weaver helped to build as simply reconstituting the ghetto in new ways. They rejected his quest for a race-blind meritocracy; in a July 1967 column, Jimmy Breslin quoted one such activist: “You know what everybody says about Weaver? They say, ‘He’s light and bright and damn near white.'” Weaver was deeply troubled by the riots, which he argued represented a deepening “community despair and hopelessness” that could only be undone by concerted government action. But the Great Society lasted just a few years, razed by the politics of backlash as white city dwellers reacted to the housing projects rising in their neighborhoods by fleeing the cities in a panic about living next door to blacks. The public housing projects that Weaver had once viewed as the beachhead of a better society came instead to more closely resemble holding pens for the poorest of the poor, people left out of any social compact whatsoever. The projects became exactly what Weaver had once wanted above all to avoid: a second ghetto. Not only the vision of public housing but the buildings themselves seemed to have failed.

Weaver left Washington with Johnson, telling the president, shortly before Johnson chose not to seek re-election, that he planned to resign at the end of 1968 no matter who was in the White House. He returned to New York City, where he served as the first president of Baruch College of CUNY. There, his long career in public service had a sad coda: he sat on the board of directors of the Municipal Assistance Corporation, the organization created to float bonds on behalf of New York City during its fiscal crisis and flirtation with bankruptcy during the mid-1970s. The MAC helped to enforce a dramatic program of restructuring for the city government, cutting funds for daycare, schools, firemen, police, social workers and hospitals, as well as instituting tuition for CUNY. Although the New York fiscal crisis in many ways marked the end of the expansive liberal vision Weaver had long championed, he was a silent member of the board, doing little to shape the course of events. He told an African-American State Assembly member that “you may be sure” that when “specific issues affecting minorities” came before MAC, he would speak up for “what I believe to be the interests of those so long disadvantaged.” But according to Pritchett, the minutes of MAC meetings contain no evidence that Weaver ever tried to do so; perhaps he felt there was nothing he could really do. He ended his career at Hunter College, living on the Upper West Side until his death in 1997. His wife–a light-skinned black woman who was able to pass as white and who received hate mail from segregationists chiding her for marrying a black man–had died a few years earlier, and their adopted son (the “Bobby” to whom he dedicated The Negro Ghetto) had died from a self-inflicted gunshot wound in the early 1960s. Weaver–true to form–said nothing publicly about his son’s death and destroyed all the letters of condolence he received. Three years after Weaver’s death, the HUD headquarters was rededicated in his name.

By the end of Pritchett’s book, one is struck by how much Weaver was hindered by the New Deal and Great Society liberalism that nurtured his long career. In some ways midcentury liberalism was always a precarious politics, especially with regard to race. In the 1930s the Roosevelt administration, never unequivocally committed to civil rights, made its peace with the Southern Democrats; in the ’60s Johnson’s expansion of the welfare state coexisted with a deep paternalism. For a young, ambitious black man like Weaver, the choice to ally himself with these mainstream liberals–as opposed to working with the more radical social movements that so often forced their hands–could not help but yield a certain frustration. Pritchett avoids venturing into the dangerous terrain of psychohistory, preferring to present Weaver as a pioneer in race-blind politics. Yet it also seems possible that Weaver’s careful and lifelong eschewal of radicalism was the result, in part, of his own deep sense of an imperative, instilled by his parents and grandparents, that his individual accomplishments were the ultimate measure of success. And if at times this meant steering clear of engagements that might prove dangerous to his quest for mobility and security–that was the cost of moving forward in the world.

One can’t read Pritchett’s book without thinking of Barack Obama, whose career seems, in certain respects, a reflection of Weaver’s ambiguous legacy. Obama’s victory undeniably marks one kind of progress toward racial equality. And surely Weaver would be thrilled to see Obama in the White House (which was built mostly by slaves and free blacks), identifying with the candidate’s choice to carve out a career at the pinnacle of American politics and his promise to transcend the politics of race. (Weaver might also have appreciated that during Obama’s tenure as a community organizer, he worked in the Altgeld Gardens housing project in Chicago–a World War II development that was part of the early wave of public housing.) Yet even as it is now possible for a black man to become the president of the United States, the cities to which Weaver dedicated his life remain nearly as sharply divided by race and poverty as they have ever been. The battered buildings of the housing projects of Chicago and Detroit that Weaver worked so long, with so much hope, to build, cast their long shadows on the pages of this book, bleak reminders that the triumph of an individual cannot alone make up for these larger defeats. The strength of Robert Clifton Weaver and the American City is that it enables the reader to see the victory and the loss at once.



Hard Times
Kim Phillips-Fein
March 20, 2008
https://www.thenation.com/article/archive/hard-times/

In 1883 Yale professor William Graham Sumner published an essay titled “The Forgotten Man.” Sumner was the most prominent American advocate of social Darwinism, and his article reflected the conviction that economic inequality was a natural, inevitable and benign result of the competition for survival. Sumner thought that any attempt to ameliorate the lives of “the poor” and “the weak” would limit the impersonal, harsh but ultimately improving force of natural selection. Even worse, the social cost of passing laws to improve working conditions or crafting government policies to help the indigent would always be borne by the “forgotten man”–the industrious, decent and apolitical American whose freedom would be hemmed in by government regulations and whose salary would be taxed for social services.

In April 1932 a speechwriter for Franklin Delano Roosevelt placed Sumner’s phrase (likely without remembering its origins) in the middle of a radio address given by the governor of New York as he campaigned for the presidency against Herbert Hoover. The Depression, which had begun with the precipitous stock market crash of October 1929, still held the nation in its grip. Unemployment for the country as a whole was more than 20 percent; in cities like Detroit, close to one-third of the labor force lacked work. Those Americans lucky enough to still have jobs saw their wages fall and their hours cut back. More than 25,000 businesses failed each year in the early 1930s. In his speech, Roosevelt focused on the “Main Street” of farmers and homeowners and miners and factory workers, promising to help “the forgotten man at the bottom of the economic pyramid.” Hoover, he argued, was providing “temporary relief from the top down,” whereas he would give “permanent relief from the bottom up.”

Amity Shlaes’s history of the Great Depression borrows its title from Sumner’s essay as much as from Roosevelt’s speech. Published last spring, seventy-five years after Roosevelt’s election, The Forgotten Man is nothing less than an attempt to reclaim the history of the 1930s for the free market. Shlaes, a syndicated columnist for Bloomberg News and formerly a columnist at the Financial Times, insists that the New Deal actually prolonged the economic misery of the ’30s–unemployment never fell under 13 percent during the decade, and after declining from its peak in the early ’30s, it actually rose again during the decade’s final years. The grand initiatives of the New Deal–the National Recovery Administration, the Tennessee Valley Authority (TVA) and the Wagner Act–all created a climate of economic uncertainty, discouraging businesses from investing. The deepest problem of the decade, she writes, was “the intervention, the lack of faith in the marketplace. Government management of the late 1920s and 1930s hurt the economy.” Meddling reformers dared to tamper with the market, and the “forgotten men” paid the price: “From 1929 to 1940, from Hoover to Roosevelt, government intervention helped make the Depression Great.”

The Forgotten Man is a narrative history woven around the exploits of a group of characters Shlaes deems the “people whom the New Deal forgot and hurt.” She chronicles the travails of Wendell Willkie, president of the utilities company Commonwealth and Southern, as he fought the TVA, which competed with his company in the South. (Willkie ultimately challenged Roosevelt for the presidency in 1940, a thankless mission if there ever was one.) She tells the story of Andrew Mellon, the Pittsburgh banker and former Treasury Secretary who donated the paintings that became the National Gallery in an effort to demonstrate that private accumulations of wealth were in fact in the public interest: “The only reason his art collection was so great was that he was supremely wealthy,” Shlaes says. She explains the struggle of the Schechters, the humble kosher butchers whose Supreme Court case (Schechter Poultry Corp. v. United States) toppled the National Recovery Administration, the early New Deal attempt to relax antitrust laws and permit companies to collaborate in setting wages and prices in the hope of stabilizing the economy. Father Divine, the religious leader who preached the gospel of self-improvement, makes an appearance, as does Bill Wilson, the stock analyst who co-founded Alcoholics Anonymous. None of these are exactly obscure figures. But Shlaes believes they represented an alternative response to the Depression–one grounded in individual initiative and the support of local voluntary groups (AA being an example of the latter) rather than state projects like the TVA and the Works Progress Administration. Most histories of the era, she contends, glorify the grand government initiatives of the decade, eschewing the quieter stories of these private efforts to cope with the Depression.

Central to Shlaes’s case is the provocative argument that the American economy was basically sound during the 1920s. The meteoric rise of the stock market was not the sign of a speculative bubble but instead a reflection of underlying strengths. If only the feds had had the sense to leave well enough alone, the recession that followed the crash of 1929 would have been shorter and far less severe. She begins her story with a comparison between Calvin Coolidge and Herbert Hoover. Coolidge, in her view, was a New England country lawyer, while Hoover was an educated mining engineer, a man of the world. (Shlaes boils down their character differences to the fact that Coolidge fished with worms, Hoover with artificial flies.) In the 1920s, Coolidge left the economy alone, believing that “the work of life lay in holding back and shutting out.” The result was unemployment below 5 percent and rapid economic growth.

Then the “priggish” Hoover, whom Shlaes sees as temperamentally inclined toward government intervention, took office. After the stock market decline, Hoover was fond of calling White House meetings with businessmen to urge their restraint in cutting wages and firing workers, a tactic that many historians have criticized as an example of Hoover’s inability to confront the Depression. Shlaes presents it as evidence that Hoover pursued heavy-handed intervention: “To force business to go on spending when it did not want to was to hurt business.” There was, of course, little force involved–but Shlaes would have preferred Coolidge’s silent retreat. Hoover did initiate some of the programs that Roosevelt developed further, like the Reconstruction Finance Corporation, which helped shore up failing banks. But Shlaes implies that if only Hoover had pursued a policy of prudent restraint the crisis might have blown over quickly, a “bad quarter of an hour,” in Mellon’s phrase. To believe that the American economy in 1930 was undone by Hoover’s tentative steps toward intervention is to ignore the real economic problems the country faced: the sharply unequal distribution of income, the weak consumer market, the crazy speculative schemes and rampant borrowing that financed the stock market boom, the fragility of the banking system and the agricultural crisis shaking the Midwest.

If the economy would have recovered soon enough on its own, then the call for federal economic intervention makes the New Dealers look ridiculous. Shlaes presents the New Dealers as a group of well-meaning ideologues, drawn to social experimentation, almost gleeful at the chance to use the Depression as a lab to test their theories. In her introduction, she differentiates herself from other conservative scholars by saying that she rejects the idea that Communists infiltrated the New Deal. Yet she can’t restrain herself from introducing her leading liberal characters (including Rexford Tugwell, who would be an adviser to Roosevelt throughout the decade) by describing a trip they made to Moscow in 1927 to learn what they could about a managed economy. During the 1920s, these progressives were despondent, unable to enjoy the country’s new prosperity; as Tugwell put it, “We were…all but regarded as social misfits.” When they found themselves in power, they had no idea what to do, and Shlaes describes the result as a comedy of errors–everything the Brain Trust did only undermined its broader goal of ending the Depression.

Shlaes is, of course, correct that the New Deal failed to restore economic health. She is also right that the Roosevelt Administration was far from consistent or coherent in its attempts to cope with the Depression, although she does not really address (as historians like Alan Brinkley have) the changes in the New Deal over the decade: Roosevelt and his advisers relinquished the idea that the destructive force of capitalist competition had to be tamed by agencies like the ill-fated National Recovery Administration, taking up instead a Keynesian program of managing consumer demand to ensure economic health. Shlaes’s arguments about the reluctance of businessmen to invest during the Depression years actually echo those of Roosevelt during the recession of 1937–the President and some of his advisers blamed a “capital strike” for the rising unemployment of the late 1930s. (Others argued that the attempt to balance the government budget that year was responsible, a thesis Shlaes does not mention.)

But in the end there is little historical evidence for her larger claim that the New Deal made the Depression worse and that without it, the crisis would simply have passed. After all, the catastrophe of the 1930s was halted not by private industry but by the massive public spending generated to fight World War II–an outlay of government resources far greater than anything Roosevelt had initially imagined. The fall in unemployment during the buildup to the war seemed proof of the central principles of Keynesianism: the private economy cannot always spend its way out of a depression; sometimes public spending is needed to stimulate growth. The experience of the war seemed to validate the New Deal, not disprove it. If anything, it suggested that Roosevelt should have gone further–that the New Deal was compromised by his timidity, not his radicalism. Does Shlaes seriously think that the Depression would have melted away if only Calvin Coolidge had been President? Was Alcoholics Anonymous really an alternative to the New Deal?

And if the New Deal was such a failure, why did it produce one of the most enduring political realignments of the century? In particular, how did Roosevelt win the loyalty of millions of voters, many of whom had never voted before? Shlaes dismisses FDR’s electoral success by arguing that he was a master of cynical politicking. She suggests that he won re-election in 1936 (taking every state in the nation except Maine and Vermont) because he invented a “new kind of interest-group politics.” Roosevelt “made groups where only individual citizens or isolated cranks had stood before,” gave those groups financial benefits and won votes for it. Social Security and the Wagner Act, in her mind, were little more than crass bids to win working-class support in the 1936 election. But she does not really reckon with the obvious counterargument: that Roosevelt’s electoral success reflected support for the New Deal in a larger ideological and political sense–that people gave their endorsement to the broad project of providing working-class people with minimal economic security during downturns that no policy-maker fully understood how to control. For the New Deal was not only about stimulating the economy; it was also an attempt to alleviate the sufferings of the poor, the aged and the unemployed, a goal that many New Dealers saw as distinct from restoring economic growth. They wanted to rescue capitalism (Roosevelt often seemed surprised by how much hostility businessmen expressed toward his program), but they also wanted to change the way that it worked.

In the end, the Depression itself cannot help but intrude on Shlaes’s optimism about the free market. Now and then, a few characters besides Mellon and the Schechters and Father Divine appear in her account, like the young couple from New York City who retreated to a Catskills cabin in 1931 to starve to death because they were too ashamed to beg, and the 13-year-old boy who hanged himself in Brooklyn in 1937 after watching the gas in his family’s home get shut off, a story Shlaes uses to begin her book. The economic breakdown of the 1930s made faith in the benevolence of business almost impossible during the decade. Shlaes may interpret these stories as demonstrating the failures of the New Deal, but at the time people saw them as evidence of the limits of the market.

For Shlaes, the New Deal was a scheme hatched by well-meaning reformers to carry out their good intentions and grand schemes, paying for their “big projects” with the tax money of the common man. But during the 1930s, millions of forgotten men became involved in politics themselves. The entire New Deal project, as Shlaes presents it, was carried out from above, by the President and the gang of intellectuals and dreamers gathered around him. But American society was shaken during the decade by political explosions: the birth of the Congress of Industrial Organizations, the eviction protests and hunger marches and sit-down strikes, the growth of the Communist Party and other radical groups. Economic depression does not always lead to political mobilization–but in the 1930s, it did. (Reading The Forgotten Man, you’d never know that the vibrant and tragicomic world of radical idealism, delusion and commitment memorably chronicled by writers like Edmund Wilson and Alfred Kazin ever existed.) The law professors and economists and the generation of old Progressives who worked with Roosevelt to craft New Deal legislation did so under pressure from people who in ordinary times might have stayed away from politics but who found themselves during the Great Depression facing economic circumstances that their local and community institutions could not ameliorate, that their old faiths in self-reliance could not contain. Because Shlaes interprets the New Deal as little more than the experiment of a jaded political class, she cannot really look at the popular protest of the decade, let alone assess the ways it shaped legislation.

Throughout the 1940s and ’50s, conservatives openly criticized the New Deal. Today, it is unusual for a conservative author like Shlaes to devote much attention to the decade of the Depression. The New Deal is more often invoked by liberal writers, who describe the political economy of the 1950s and ’60s–which the New Deal helped to create–as an example of the kind of mixed economy that might produce a more egalitarian society today. Frequently it is remembered uncritically as a kind of political pastoral, an image quite different from the one people on the left and even liberals held during the years that followed the end of World War II. Then, many liberal writers criticized the postwar order. John Kenneth Galbraith excoriated the self-absorbed mass consumption of the “affluent society,” in which the public sector was devoted primarily to stockpiling weaponry for the fight against Communism. Michael Harrington reminded a complacent nation of the continued reality of poverty. New Left scholars pointed out the ways that corporations were able to seize hold of liberal institutions, while the civil rights movement made clear how little the New Deal had done to confront racism. It is only in recent years, as the minimal protections of liberalism have been rolled back, that the New Deal has started to enjoy such uncritical acclaim. But its contemporary enthusiasts–see Robert Kuttner’s recent The Squandering of America for one interesting example of the genre–actually have something in common with Shlaes, as surprising as that may seem. To treat the New Deal primarily as a legislative toolkit or as a set of policy initiatives means abandoning the very culture of outrage that matters most about the 1930s. To paraphrase historian Lizabeth Cohen, it means making the very people who truly made the New Deal once more into forgotten men.



Deal Breakers
https://www.thenation.com/article/archive/deal-breakers/
Kim Phillips-Fein
Nov 21, 2007

Paul Krugman’s The Conscience of a Liberal and Jonathan Chait’s The Big Con.

In the spring of 1960, several months before the presidential election of that year, a small, slim volume by the junior senator from Arizona, Barry Goldwater, hit the bookstores. The brief manifesto, appropriately titled The Conscience of a Conservative, tried to reclaim the idea of conservatism from the disrepute into which it had fallen ever since the victories of the New Deal. Conservatism was not, Goldwater insisted, a “narrow, mechanistic economic theory.” Its adherents were not greedy businessmen, eager to protect their own privileges. Rather, conservatism was a political philosophy determined to stave off the intrusions of the state and expand the boundaries of freedom. Far from being preoccupied with wealth, conservatism, wrote Goldwater, “puts material things in their proper place.”

Many ironies surrounded the publication of The Conscience of a Conservative. Despite Goldwater’s claim that conservatism had nothing to do with wealth or privilege, the book’s publication had been financed by a small group of intensely antiunion businessmen who hoped to publicize their political views and encourage Goldwater to seek the presidency. They even connected him with National Review editor L. Brent Bozell (brother-in-law of the magazine’s editor in chief, William F. Buckley Jr.), who ghost-wrote the book. In the end, Conscience became a surprising success, hitting the bestseller lists two months after publication and helping conservatism shed its image as the stingy, small-minded politics of businessmen opposed to labor unions and the welfare state. Rarely has a book played as critical a role in building a movement.

Two new books seek to have the galvanizing impact on liberals and progressives that Goldwater’s did on conservatives in 1960. The Conscience of a Liberal, by Princeton economist and New York Times op-ed columnist Paul Krugman, puts forward a vision of liberalism as a populist politics of opposition to economic inequality, not the foppish, sentimental cultural elitism mocked by the right. The Big Con, by New Republic senior editor Jonathan Chait, seeks to demonstrate that for all its posturing, conservatism really is the politics of business and wealth, and that its truest believers are more obsessed with tax cuts than abortion or gay marriage. Most Americans, Krugman and Chait say, don’t really support the conservative economic agenda; they would like the government to do more, not less. Nor are they duped into voting for that agenda by their fanatical hatred of feminism and gay rights (although Krugman, especially, does emphasize the role of racism in winning popular support for an elitist economic agenda). Rather, the real reason for the strength of the right in American politics is its connection to a small group of businessmen, and their intellectual allies, who want to undo the New Deal and restore the laissez-faire, nonunion economy of the late nineteenth century. These “economic royalists” have captured the Grand Old Party and driven out the moderate Republicans of yesteryear. Unlike Thomas Frank, Krugman and Chait don’t think that the crucial problems of contemporary politics can be traced to culture wars in Kansas. They focus instead on the conservative business lobby and its enablers in Washington.

Krugman and Chait offer passionate and persuasive critiques of the ways economic inequality corrodes American politics and life. They are willing to take on the conservative movement with a ferocity that most mainstream liberals shied away from only a couple of years ago. And because their interpretation of the rise of the conservative movement shifts attention away from cultural conflict to struggles over economic power, their books offer a welcome change from the portrayal of America as divided between the traditionalism of the heartland and the soulless tolerance of the coasts.

But at the same time, the books seem written for a political constituency that does not yet exist. They seek to reinvigorate the liberalism of the mid-twentieth century, yet they do not fully appreciate the exceptional circumstances that brought it about. Although their books are trenchant and incisive, their anger seems born more of desperation than of strength. After all, they are trying to defend the remnants of liberalism, an old political order, more than seeking to create a new one. Their books seem radical only because the institutions and the ideas that made the New Deal and the liberal politics of the mid-twentieth century possible have been so significantly weakened. There is, of course, a certain rhetorical power in claiming the mantle of history. But what’s missing from their wish to revive the economic policies of the New Deal and the liberal consensus of the mid-twentieth century are the kind of calls for a true break from the past that sparked those policies to life.

Paul Krugman has a confession to make: he’s nostalgic for the 1950s. Yes, the 1950s–the decade of Joseph McCarthy, the cold war and the segregated South–and despite the Vietnam War, he misses the ’60s too. Back in the day, Krugman “railed against the very real injustices of our society” and even “marched against the bombing of Cambodia.” But the liberal order that he criticized then now looks like a “paradise lost.”

What does Krugman miss? Well, for one thing, in the ’50s the nation’s wealthiest people weren’t casually purchasing multimillion-dollar apartments in New York City and planning to spend millions more on renovations before they even moved in. They weren’t hosting birthday parties for their children at $120,000 a pop, hiring personal nutritionists to evaluate the menus of restaurants around town or filling their hours with many of the other gold-plated entertainments breathlessly catalogued in a recent issue of The New York Times Magazine about the new Gilded Age.

In the last Gilded Age, tycoons like Andrew Carnegie believed that the wealthy were the winners in a great cosmic race and that they, not the state, were best able to determine the right ways to spend their fortunes. The impotence of the business class during the economic free-fall of the 1930s destroyed this vision of the omnipotent rich. By the ’50s, top income tax rates were 91 percent, while estate taxes were near 80 percent. Krugman estimates that the result of these tax rates was that the income of the richest 1 percent of the country was 20 to 30 percent lower in the mid-’50s than it had been a generation before. Yet the “compression” of incomes did not destroy incentives or halt economic growth, as conservatives often warn. On the contrary, family incomes rose regularly throughout the period, climbing about 2.7 percent each year from 1947 to 1973, compared with an annual 0.7 percent increase in median family income since 1980. There was, of course, still much poverty in 1950s America, especially among black Southerners living under Jim Crow. But the gap between the rich and the poor was not widening rapidly the way it is today.

In recent years, of course, the situation has reversed itself. The wealthiest 0.1 percent of Americans were five times richer in 2005 than they were in 1973. Median household income over the same period grew 16 percent–mostly as a result of more women entering the labor force; male wages fell over those years. The result is that the wealthiest people in our society live in a world markedly different–in its daily texture and its horizons of possibility–from the world inhabited by everyone else. Most economists explain this rising income inequality in terms of market forces: the rise of international trade, globalization and an influx of immigrants have all hurt the bargaining power of working-class people while enhancing returns to the educated and skilled.

But Krugman, while assessing the relative impact of immigration and trade on the wages of working-class people, bucks the norms of his profession and argues instead that the primary reasons for the decline of the more egalitarian society of mid-twentieth-century America are political. There was nothing inevitable about the rise of that “middle-class” suburban, mass-consumption economy in the postwar period; it was made possible only by high income taxes and strong labor unions. Today, we are returning to the economic institutions of the last Gilded Age. The tax system is now less progressive than it has been in decades, and the proportion of the private-sector labor force represented by unions has declined to its lowest level since before the New Deal. It should be no surprise, therefore, that our society is becoming as divided by class as it was then.

Krugman argues that the conservative movement is in large part responsible for this rise in economic inequality. The legislation that had seemed so radical in the 1930s–like Social Security and the National Labor Relations Act–had by the ’50s become “the very definition of political moderation,” accepted by Republicans and Democrats alike. The sole Republican President between 1932 and 1968 was Dwight Eisenhower, who expanded Social Security and the minimum wage and wrote to his conservative brother Edgar that anyone who wanted to cut back labor laws and unemployment insurance was “stupid” and politically suicidal. But a small number of conservative activists, backed by antiunion businesses, sought to undermine this consensus by supporting candidates and legislation that would dismantle the welfare state, labor unions and the New Deal. And that group was able to prevail, Krugman argues, because the power of the right is historically different from that of the left. Where the civil rights movement and the labor movement depended on the active participation of millions of people, the strength and longevity of the conservative movement have come from the small number of wealthy activists like Joseph Coors and Richard Mellon Scaife who have contributed to its support over the years. Its electoral victories have not reflected popular support for the laissez-faire program as much as the cynical willingness of Republican campaign operatives to win votes by exploiting racism–such as when Ronald Reagan attacked “welfare queens” in 1976 and gave a 1980 campaign speech about states’ rights near Philadelphia, Mississippi, where three young civil rights workers were murdered in 1964. “The legacy of slavery, America’s original sin, is the reason we’re the only advanced economy that doesn’t guarantee health care to our citizens,” Krugman writes.

The Conscience of a Liberal excels when Krugman is giving his trademark crisp, clear surveys of the economic literature. He isn’t really writing history, and as a result, his sketch of the evolution of the conservative movement sometimes seems offhand and overly schematic. But his argument that inequality has political causes nonetheless resonates throughout the book. We can, he insists, make collective choices about how unequal we want our society to be. He calls for the revival of an “unabashedly liberal program of expanding the social safety net and reducing inequality–a new New Deal.” Although he advocates universal healthcare (especially single-payer)–in part on the grounds that it would be far more effective than our current system of private insurance, in which Americans pay more than people in other developed countries for healthcare that is far worse–his underlying reasons for wanting to revive the New Deal go beyond the economist’s questions about growth or efficiency. Universal healthcare, labor unions and policies that reduce inequality all cut against the idea that some people deserve one way of living–elite universities, million-dollar cars, Hamptons mansions, medical concierges when traveling the world–and others the opposite: crumbling schools, decrepit subways, crowded apartments, no healthcare at all. Rather than generating baroque myths about the superhuman greatness of CEOs alongside morality tales about the character failings of the poor, minimum wages and progressive taxes create a society capable of honoring the truth that people really are created equal.

Krugman writes in the tone of a mild-mannered, reasonable professor, rarely getting flustered as he goes about calmly disarming his intellectual opposition. Jonathan Chait, by contrast, opens his book with an apology for sounding a bit like an “unhinged conspiracy theorist.” You can imagine him shaking his fist or slamming his computer keyboard as he denounces the “tiny coterie of right-wing economic extremists,” some of whom are “ideological zealots, others merely greedy, a few of them possibly insane.” But in truth, Chait explains, he isn’t a radical at all. He’s just a displaced moderate (he gets a little sentimental about the “modern Republicans” of the Eisenhower era) who has found the center pulled out from under him.

The Big Con is at its best whenever Chait skewers the conservative intellectuals who popularized supply-side economics, which he sees as the obsession of today’s Republican Party. The basic idea of supply-side theory is that economic growth is determined almost entirely by the incentives entrepreneurs and investors have to create wealth, above all by tax rates. But there is no academic consensus that tax cuts and growth are closely correlated. The theory owes its popularity, Chait argues, to its promotion by a few economic cranks.

George Gilder, for example, in his 1981 classic, Wealth and Poverty, celebrated the generosity and altruism of entrepreneurs, whom he described as creative geniuses eager to share their gifts with humankind rather than as selfish capitalists driven by economic gain. He also believed in ESP, and took special pride in his ability to locate the queen of spades in a deck of cards without looking. Jude Wanniski, another leader of the movement, saw tax rates as the prime mover of history. In his magnum opus, The Way the World Works, he wrote that even sobbing infants were actually learning about supply and demand: “When the baby screams all the time demanding attention, even when fed and dry, he discovers that mother also remains in the other room and perhaps even closes the nursery door. The tax rate is 100 percent, also yielding zero attentiveness.” Over the years, Wanniski swung further away from respectable politics, meeting with Louis Farrakhan of the Nation of Islam (he approved of Farrakhan’s advocacy of black enterprise), comparing Slobodan Milosevic to Abraham Lincoln and defending Saddam Hussein in the run-up to the Iraq War. He hired the disciples of Lyndon LaRouche at his economic consulting firm, arguing–rightly–that they were not “trained in demand-model economics.” Eventually even the movement institutions disowned him. But they continued to profess the supply-side faith, right up to the Bush tax cuts.

So how did the ideas of these characters come to be the governing wisdom of the Republican Party? Chait suggests that during the 1970s, their ideas resonated with the business lobby at a moment when corporations were organizing to defend themselves against the criticisms of the counterculture and the consumer movement, which accused business of creating faulty and dangerous products, profiting from the Vietnam War and polluting the country’s air and water. Since that time, the political significance of the business community has only grown, and Chait argues that it is the real constituency for the Republican Party. Occasionally, lobbyists organize populist pageants to claim a broader base for the program. Chait tells of a memo that circulated before a 2001 rally in support of the Bush tax cuts: “If people want to participate–AND WE DO NEED BODIES–they must be DRESSED DOWN, appear to be REAL WORKER types, etc.” Hard hats were passed out to the lobbyists who came. But such antics aside, it’s clear who benefits.

To understand the ascendancy of the right, Chait suggests we go back to the 1950s, when intellectuals like Richard Hofstadter and Daniel Bell dismissed conservatives as “crackpots” and paranoiacs, displaced small businessmen suffering from “status anxiety” who lashed out against a new world of technological modernity and bureaucratic organization. Recent scholars working on the conservative movement have rejected the psychological turn of the “radical right” thesis of the 1950s, trying instead to get inside the minds of conservative activists and treat them with respect, if not always sympathy. But Chait argues that “the Hofstadters and Bells were on to something”–the right really is radical, Manichean, ideological and paranoid. The real problem of modern America is the loss of the reasonable, moderate center embodied by the 1950s intellectuals. In a sense, both Chait and Krugman identify with the liberals of that earlier era, scandalized by the perversions of the radical right, with the difference that today the radicals have escaped from the farm and are running the country. Restoring the New Deal is a project of political conservation, a defense of how things used to be before the country was derailed by a nutty conservative cabal.

The problem with this pastoral tale is that it makes the New Deal seem too easy and too natural, as though it was the norm and recent history the aberration. In fact, the reforms of the New Deal–the passage of legislation giving workers the right to organize unions, the creation of a public system of social insurance for the elderly, the introduction of regulation of the stock exchanges, the national minimum wage, the expansion of progressive taxation (although this truly broadened only during World War II)–were won at a moment when capitalism faced the most prolonged, pronounced crisis in its history. The business class of the United States was at its nadir in terms of public prestige during the 1930s, challenged by a militant labor movement, the alternative model of the Soviet Union and the spread of radicalism of various stripes at home. The New Deal was partial and incomplete, a fragmentary welfare state, and yet it could be created only when it seemed capitalism was on the verge of collapse.

Moreover, even though it is true that the parties did not clash openly over the political economy during the 1950s, the economic order eulogized by Krugman never elicited a genuine or complete consensus. Southern and Southwestern states partly exempted themselves from the high-tax, high-wage economy of postwar liberalism, passing right-to-work laws to limit union strength and keeping state and local taxes low to lure capital from the North. Many companies in the North and Northeast tried to find ways to resist labor and the state even at the height of postwar liberal power, doing their best to weaken their unions and funding conservative think tanks. In short, the central institutions of the New Deal were under attack throughout the postwar period, and in fact had eroded substantially by the 1970s. The difficulty of creating the New Deal in the 1930s, and the conflict that it caused even in the ’50s and ’60s, suggests that it will not be easy to return to it today.

Chait’s and Krugman’s sense of themselves as the moderates fighting back a wave of radical libertarians also obscures just how far the center has shifted. After all, the right has no monopoly on love for the market model. The esoteric philosophies of Wanniski and Gilder gained credibility at a time when rational choice theory and Chicago School economics were winning converts in academia. The Democratic Party may not have pursued cutting income tax rates with the zeal of the Republicans, but under Clinton in the ’90s the Democrats enthusiastically embraced the free market, passed welfare reform and debated the privatization of Social Security years before Bush brought it up. The business mobilization that Krugman and Chait decry has shaped the modern Democratic Party, too–and may do so especially in 2008, if the recent shift of corporate donations from the Republicans to the Democrats continues. (In late October the New York Times reported that healthcare companies have donated $6.5 million to the Democratic presidential candidates, compared with $4.8 million to the Republicans.) In 1946 Harry Truman proposed a national healthcare system resembling Canada’s; the resistance of the American Medical Association and Southern Democrats doomed it to failure. But today, even Krugman–who believes that “in purely economic terms, single-payer is the way to go”–has joined the Democratic presidential candidates in advocating the more “politically feasible” solution of mandated coverage, or “a universal health care system run through private insurance companies.”

Moreover, if conservatives aren’t the only politicians to have embraced the free market, the victories of free-market economics can’t be explained solely in terms of the opportunistic attempts of leaders to link conservative economic and social agendas. The beliefs of social scientists to the contrary, there is nothing simple or automatic about how people come to define their economic interests. Especially in the absence of organizations like labor unions, which can offer an alternative understanding of how the economy operates, it should not be surprising that the vision of the marketplace itself exercises a certain deep appeal, even for people who do not benefit from the policies it prescribes. After all, the market model suggests that individuals exercise tremendous power over their own destinies. It argues that people can determine their economic fates. It provides a way of connecting one’s actions to the larger outcome of social and historical processes. These ideas have their own logic and their own attraction, quite aside from the politics of cultural backlash with which they are frequently joined.

Indeed, perhaps the most striking difference between the conservative activists and intellectuals who built the right and modern-day liberals like Krugman and Chait is how tentative the latter are when it comes to offering a new vision of how to go about “putting material things in their proper place” and fashioning a relationship between the individual and the state that can foster liberty and equality. When Goldwater (OK, Bozell) wrote The Conscience of a Conservative, he knew he was calling for a dramatic change in the country’s direction. Ronald Reagan in 1980 also claimed to want to break with history. But these contemporary liberals insist that they are not calling for anything particularly far-reaching; their politics are to be found in the past, in the moderate consensus, in the New Deal. They want to move forward but only by moving back. The internal weaknesses and flaws of the New Deal, or of postwar liberalism, which might seem manifest in the ease with which it was ultimately dismantled, seem to trouble them not at all. Yet if the conservative movement has any political lesson to teach to those who disagree with its motives and goals, it should be that sometimes only a willingness to be radical really brings about change.



Letters
https://www.thenation.com/article/archive/letters-129/
Our Readers
Apr 6, 2006

PHOTO NATION DEBUT

Harpswell, Me.

The photographs by Eugene Richards prove the old adage that a picture is worth 1,000 words [“War Is Personal,” March 27]. His poignant text, in under 1,000 words, served to heighten the impact of his soul-searing photographs. These few photos jarred me as no bombed-out car or rows of sheet-draped bodies in Iraq could. Too bad the mainstream media don’t show more of the 16,000 wounded as they confront their “almost dead” bodies. That kind of photojournalism gets our attention and perhaps would quicken the return of more whole troops!

HARRIET RICHARDS (no relation)


LEAVING STEPFORD…

Melrose Park, Pa.

Katha Pollitt’s eloquent tribute to Betty Friedan [“Subject to Debate,” Feb. 27] reminds us how much courage it took for Friedan to stand up for women in the early 1960s. More than twenty years earlier Friedan spoke out for another unpopular cause. She was a freshman at Smith College in 1938, when Hitler unleashed the Kristallnacht pogrom. Smith president William Allen Neilson urged the students to sign a petition asking President Roosevelt to let German Jewish girls enter the United States outside immigration quotas, to enroll at Smith. “A number of girls spoke against it, about not wanting any more Jews at Smith,” Friedan wrote in her memoir, Life So Far. There were four older, well-to-do Jewish girls in her dorm. “I expected them to speak up, but they didn’t. Finally, despite being only a freshman from Peoria, I spoke, urging that we open our doors to those girls fleeing persecution.” Sadly, the petition was rejected by a large margin.

RAFAEL MEDOFF


Asheville, NC

In response to Katha Pollitt’s recent columns and her moving remarks on Betty Friedan, I write to add another perspective on the question of whether middle-class women are running away from careers into domesticity. Women of my generation and class who are putting careers on hold to devote themselves to family are not rejecting feminism. They are exhausted. Raising children while maintaining a career is incredibly draining, especially for women who do not have involved (or any) partners, flexible work schedules or affordable daycare. Even for those of us who do, balancing work and family is too much. The kids can’t be sent back, but the careers can, so domesticity wins. Some of us coming of age in the 1980s took Friedan and other second wavers too literally, believing we could, and should, “have it all.” As one second waver recently put it: “We said you could be anything, not everything.”

TRACEY RIZZO


Baltimore

If Katha Pollitt is going to teach that class on the bad old days again, she needs details to get through to her students. Things like: In the 1960s the classifieds were divided into Help Wanted, Male; Help Wanted, Female; and Help Wanted, Colored. There were med schools and law schools that wouldn’t consider female applicants, and people who argued that women couldn’t read the news on TV because it wouldn’t sound serious. A 1973 “self-help” book: How to Help Your Husband Get Ahead, by Mrs. Dale Carnegie.

KATHARINE W. RYLAARSDAM


LABOR’S GIANT COCKROACH?

New York City

Gregor Samsa awoke one morning to find himself transformed into a giant insect. Somewhat similarly, in “Labor Pains” [March 6] I found myself transformed into David Horowitz. Corruption! Corruption! Corruption! It’s all Fitch can talk about, complains reviewer Kim Phillips-Fein. Doesn’t Fitch realize he’s undermining the labor movement?

Evidently my book Solidarity for Sale contains a lot about the murky origins, arrested development and bad consequences of America’s fiefdom model of unionism. Corruption does turn up a lot. Upton Sinclair’s The Jungle dwelt a lot on corrupt meat. His strategy was to use rotten beef to illustrate what was wrong with industrial capitalism. Mine is to try to use union corruption to explain what has gone wrong with American unionism. And beyond that, the advent of American exceptionalism.

My aim was not, as claims Phillips-Fein, to answer the rather dated question, How come there is no socialism in America? but instead to ask, How come American capitalism became the wildest, the most exploitive, most unequal variant in the industrialized world–lacking national healthcare, a minimum vacation law or protections against arbitrary firing?

Phillips-Fein claims I blame labor’s troubles on the inception of the AFL in 1886. In fact, I show the origin far earlier. Both European and American unions share a common ancestor: the closed model of monopoly unionism. Despite challenges from the Knights of Labor, the IWW and the early CIO, US unionism never got much beyond a primitive protection system. The AFL union boss who could control a territory protected his clients from the employer and the employer from the members. Eventually all three parties came under the protection of the organized crime boss. The upshot today is 20,000 petty, rent-seeking institutions buttressed by the nineteenth-century instruments of closed shop, exclusive bargaining and dues check-off.

Plainly, not all 20,000 are corrupt. It’s also plain, though, that our unions can’t grow, organize, strike, keep the pension and benefit funds from disappearing or advance a common progressive political agenda. Phillips-Fein doesn’t really dispute this. Her point is the customary claim of official labor’s defenders that it’s all the employers’ fault. But is it really the fault of big capital that most US unions side with corporate interests in opposing a single-payer health system? Solidarity for Sale shows how it follows from the logic of the fiefdom model.

In Europe labor movements led by Catholic, Communist and Socialist parties built national organizations that brought something like social democracy without relying on nineteenth-century mechanisms of forced unionism. They all have national healthcare systems. What the unions in Europe don’t have is a mob problem, not even in Italy.

Is my solution then to take away the mechanisms of forced unionism? Phillips-Fein says it is. But it isn’t. In fact, I say exactly the opposite: “The point is not to demand that old arthritic AFL unions throw away their crutches,” the conclusion states, “but rather to show that unions can be built on a voluntary basis.” What turns off more people to the labor movement? Denial, distortion and repetition of tired formulas? Or having the courage to confront unpleasant facts and face up to the need for painful changes?

The stakes are nothing less than our continued existence. There’s a growing recognition that the left can’t recover without a coast-to-coast revival of labor. But as the Los Angeles model of “progressive unionism” dissolves into the dust and disgrace of plea bargaining and the FBI raids the offices of the New York Central Labor Council, it’s clear too that such a labor revival can’t come from the inside out or the top down. Left adversaries–including Terence Powderly, Eugene Debs, W.E.B. Du Bois, William “Big Bill” Haywood, Dorothy Day and the late Herbert Hill–all insisted on the need for full-scale structural reform of America’s fiefdom model of labor unionism. Solidarity for Sale seeks to revive this critical tradition.

ROBERT FITCH


PHILLIPS-FEIN REPLIES

New York City

In his letter Robert Fitch performs his own act of metamorphosis: I become labor’s “official defender,” while he manages to carry the mantle of every great left radical and critic of labor in US history. I agree that unions are in a state of crisis. And, as I said in my initial review, corruption is part of the problem; far from denying that corruption exists or saying we shouldn’t talk about it, I pointed readers to several other books on the topic. But in my opinion, Fitch’s analytic reliance on the “fiefdom model,” whose widespread applicability to the labor movement he does not demonstrate, greatly weakens his book. That model has nothing to say about unions that grew out of social movements–like the hospital workers’ union, Local 1199, which was led by leftists and which developed in tandem with the civil rights movement. It forces him to ignore times when unions have given support to a broader progressive politics, as when the UAW helped organize the 1963 March on Washington and sent bail money to the civil rights protesters locked in Birmingham jails, or the support that unions give today to living-wage campaigns across the country. Relying on a vision of unions as monopolies, Fitch claims that the periods of growth in US labor history came when unions “competed” for workers, but in fact it’s not clear that “competition,” instead of improved legal and political conditions, was responsible for the expansion of unions in the Progressive Era and the 1930s and ’40s. Fitch argues that the fragmented, localized structure of unions is to blame for the desperate predicament of working-class Americans, but his book does not help us understand how and why this structure, to the extent that it exists, came into being. He says little about the role of business attacks on unions in shaping the labor movement, either in the early years or today, when European unions no less than US ones must confront a climate of mounting hostility.

All these examples are complex: to understand how unions have dealt with race, for example, we must look both at the long history of racism in American labor and the support of unions like the UAW for the civil rights movement. But that’s my point: History is complicated. Fitch could have used his voluminous research to write a book that really shed light on the history of union corruption, assessing its influence on the development of labor and the reasons unions have proved susceptible to it. Instead, he wrote a one-note history, relying on the rhetoric of “monopoly unionism,” also beloved by the right, in which mob infiltration and the failure of unions to broadly support national healthcare stem from an identical source.

Finally, I’m not sure what to make of Fitch’s claim that I misstate his attitude toward union shops. He writes that to revive labor, “the prime features that have to be scrapped are all those features that preempt consent: compulsory membership, exclusive representation, closed shop, ‘union security.'” He may mean that he doesn’t support right-to-work laws but instead envisions reborn unions that eschew these policies. But he argues that union shops are “despotic,” that they create conditions that breed corruption and that they destroy the ability of labor to effectively organize and represent members. I understand why he wants to distinguish his position from that of right-to-work proponents–who usually argue that they don’t object to good unions, even though the objective of the businesses that fund right-to-work efforts is to keep wages and benefits down by weakening all unions. But his assumptions, language and analytic framework are all very close to theirs.

KIM PHILLIPS-FEIN



Labor Pains
https://www.thenation.com/article/archive/labor-pains-0/
Kim Phillips-Fein
Feb 16, 2006

Solidarity for Sale exposes corruption as the cause of the current crisis in American labor.

Midway through his new book on corruption in the American labor movement, Solidarity for Sale, Robert Fitch tells the story of one of the most famous anticorruption crusaders of all time: Martin Luther. It was, after all, outrage at corruption–priests who charged parishioners for indulgences, living large while making sinners pay for false redemption–that prompted Luther to nail his Ninety-Five Theses to the door of the castle church at Wittenberg. His critics charged him with harping on petty sins, insisting that the priests he assailed were exceptions to the rule. But Luther knew better. As Fitch puts it, “Neither the sale of indulgences nor the commerce in Church offices served as Luther’s ultimate target. These were not ‘abuses’; they were inevitable expressions of an institution that he charged was corrupt in its essence.” His real target, in the end, was not a few bad pennies but the entire structure of the church. And the result was no piecemeal change but the Reformation.

Solidarity for Sale aims, in a way, to be Ninety-Five Theses for the American labor movement. The book is not so much a history as an exposé. Virtually every page contains revelations of vote fraud, election theft, payoffs, sweetheart deals, Mafia infiltration, pension fund looting and political betrayal, and it is populated by characters with nicknames like Diamond Joe, Dago Mike and Big Jim. But Fitch is a social critic as well as an investigative journalist, and his target is larger than a few wiseguys: He seeks to demonstrate that, like the church before the Reformation, American unions are rotten at the core, and that with few exceptions they have been this way virtually since birth. The crisis of American labor today reflects the corruption at the heart of the movement, if not its original sin; to rebuild unions, Fitch suggests, we must start all over again.

Fitch has had a long career as a political reporter, going back to the late 1960s, when he worked for Ramparts magazine. Unlike his fellow Ramparts alum David Horowitz, who today denounces his former comrades as terrorist sympathizers, Fitch has consistently defined himself as a man of the left. He never went into academia full time, instead pursuing a career as a freelance journalist; the body of Solidarity for Sale is based on reporting he did at the Village Voice. (Full disclosure: He is also an old acquaintance of mine, although we haven’t spoken for years.)

The question driving Fitch is the same one famously posed in 1906 by Werner Sombart: Why is there no socialism in the United States? The answer, for Fitch, is corruption–no other labor movement has been as infested with corruption as that of the United States. As a muckraking reporter, Fitch has always highlighted the role of individual elites and institutions in making history, as opposed to the grand, sweeping trends favored by academics. His analysis of the deindustrialization of New York City, for example, in his book The Assassination of New York, emphasized the role of Rockefeller real estate interests in seeking to remake downtown Manhattan as a playground for consumerism, over more abstract invocations of globalization and technological change. Similarly, in Solidarity for Sale, he blames the weakness and impotence of today’s American labor movement not on political backlash, global capital or international competition but on the choices, actions and structures of unions. “The fundamental actors in American labor are institutions–the unions themselves,” he writes.

The end of American labor, Fitch argues, lies in its beginning: the birth of the American Federation of Labor in 1886. The rise of craft unionism in the late nineteenth century meant the development of a unionism that was inherently exclusionary, as the crafts refused to organize unskilled (or black, or female) workers. Labor leaders like the AFL’s Samuel Gompers opposed public health insurance plans and even eight-hour-day legislation. “If we can get an eight-hour law for the working people, then you will find that the working people themselves will fail to have any interest in your economic organization,” he predicted in 1914.

What’s more, in industries where employers were small and craft unions strong–as in construction–the unions could control who got jobs. Where other historians have seen job control unionism as a sign of labor’s strength, or of pride in craft, Fitch argues that it created a culture of deferential, humiliating subservience, since workers’ ability to earn their daily bread depended on the good graces of the business agent. The AFL unions proved susceptible to Mafia corruption, Fitch suggests, precisely because job control unionism established this culture of loyalty and fear. “Notwithstanding the muscling-in era of the 1920s and 1930s, the Mafia has been able to capture and maintain control of trade unions less through overt violence than through their mastery of the politics of job trust unionism,” he writes.

What house of labor could be built on such a fragile foundation? Today, Fitch sees a labor movement that is incapable of either organizing workers or representing them effectively. In some cases, the corruption is outright mob infiltration. The Laborers are “totally mobbed up,” routinely making deals with employers to hire nonunion labor and pocketing part of the difference between union and nonunion wages while merrily looting the pension fund. (Fitch argues that such collusion with employers is more widespread than simple extortion.)

But even when the Mafia isn’t directly involved–and it isn’t in most of the unions Fitch chronicles–Fitch sees a labor leadership more interested in enriching itself than in improving its members’ lives. He argues that AFSCME District Council 37–which represents New York City’s public sector workers–is rife with petty theft and vote fraud. (“Everybody takes a little from the kitty,” he quotes one union officer saying.) UNITE has taken payoffs not to enforce contracts at sweatshops in lower Manhattan. Teamsters reformer Ron Carey–“the most glittering mirage yet to deceive the travelers in labor’s increasingly parched desert”–stole union funds to pay for his re-election campaign (Fitch largely discounts Carey’s 2001 acquittal in the case). Self-serving labor leaders take weak political positions; like the AFL under Gompers, neither the AFL-CIO nor SEIU calls for national health insurance. Fitch argues that the structural problems of American labor–its culture of insularity and subservience–cannot be solved by new leadership. Indeed, one of his abiding themes is that the labor movement has historically proved impervious to reform: “Call it the Roach Motel syndrome. The leftists go in but they don’t come out.”

Fitch would like to abolish union shop contracts (which mandate that all workers at a given employer must join the union–workers typically vote on this provision in their first contract) and overturn the legal provision giving a union that wins an election the exclusive right to represent workers at a given workplace (the law was initially passed in part to fight company unions). Without these crutches, unions would lose their “monopoly,” and healthy competition would revive the labor movement. Fitch’s literary style, never what one would call restrained, becomes an all-out jeremiad in the closing pages of Solidarity for Sale. “No amount of ‘democracy’ can alter the fact that the AFL-CIO is rooted in compulsion, exclusion and monopoly,” Fitch insists. “Forced labor is slavery; forced marriage is concubinage; forced sex is rape. Few acknowledge that forced unionism–unionism without consent–is despotism.”

Union corruption is a difficult topic to write about, and Fitch deserves credit for his investigative zeal. Mobsters aren’t known for their candid interviews or well-kept archives. Beyond this, employers have sought to tar unions with accusations of corruption ever since the earliest attempts to organize. Sorting out allegations from reality is tricky, especially since any exercise of working-class power can seem to employers like a corruption of the marketplace and of the natural order; they see collective bargaining, in the words of the libertarian economist Ludwig von Mises, as “bargaining at the point of a gun.” Finally, unionists themselves sometimes deny corruption even when it is real, insisting that they are the victims of trumped-up employer charges; nostalgia, machismo and sentimentality can lead union supporters to deny wrongdoing in the face of all evidence.

But despite the facts Fitch unearths, Solidarity for Sale offers too sweeping an indictment to be persuasive. Although he emphasizes job control unionism, most of the unions he discusses (other than the Laborers and in some instances the Teamsters) don’t actually work this way. In New York’s public sector or even the garment factories, the boss, not the union, decides who works. At times Fitch’s habit of dropping crime family names and his descriptions of labor leaders–who, we are repeatedly told, resemble “Afghan or Somalian warlords”–seem designed for shock value rather than analytic precision. His focus on the union shop might have been lifted from the campaigns of the National Association of Manufacturers during the 1950s, which sought to associate monopolies with unions in the public mind, or perhaps from the National Right To Work Committee. It’s worth observing that the connection between union shops and corruption is far less clear than Fitch suggests. Industrial CIO unions like the UAW also fought hard to win union shops without succumbing to the same kind of corruption. Would outlawing the union shop lead to the rebirth of labor? Currently twenty-two states have Right To Work laws, making the union shop contract illegal. While some of these states (most notably Nevada) have seen union membership grow in recent years, places like South Carolina and Arizona are not exactly hotbeds of labor militancy.

Perhaps the most frustrating aspect of Solidarity for Sale is that it is very difficult for the reader to tell where decay actually exists, because Fitch sees it everywhere. What is the real extent of union corruption? How many locals a year face charges? And isn’t there something about portraying the entire movement as overrun with thieves that demeans the daily difficulties faced by people who strive to organize and represent workers? The past twenty years have seen extensive struggles against organized crime and corruption in unions like the Teamsters, SEIU and HERE, through the combined efforts of the federal government and internal movements for union democracy. Fitch implies that these have all failed miserably. But there are several recent books–like historian David Witwer’s account of corruption and reform in the Teamsters union, the autobiography of Herman Benson, founder of the Association for Union Democracy, and NYU law professor James Jacobs’s Mobsters, Unions, and Feds–that have tried to take stock. These authors see the fight against union corruption as an integral part of the story of American labor. Instead of arguing that American unions are structurally doomed, they give a sense of the ebb and flow of conflict within unions, and of why anticorruption measures have sometimes been able to succeed. And without being rosy about corruption, they help the reader grasp the hope that the labor movement still embodies for millions of Americans–the reasons that, despite the many failures of the past hundred years, people still want to join unions and will sometimes risk a great deal to do so.

Oddly absent from Fitch’s bleak account is the power of business in America. After all, employer violence, legal obstacles to organizing and the constant fear of reprisal from the boss have deterred untold numbers from union activism. And this, in turn, has helped create a climate in which corruption could flourish. But like so many veterans of the 1960s, Fitch seems in this book to have turned away from analyzing the powerful. Instead, his rage at the political direction of the United States is focused inward, at labor itself. The result is a book ostensibly written for union supporters that seems designed to alienate most potential readers in the world of labor. The hermetic treatment of the labor movement–as though it evolved in isolation from the struggle with business–the melodramatic language about monopoly unionism and the occasional lapses into Godfather stereotypes are the exact opposite of what a book on a topic as serious as union corruption needs. Even Luther, when he nailed his theses to the church door, was thinking about how to win followers to his vision of the Gospel.



Texas, Inc.https://www.thenation.com/article/archive/texas-inc/Kim Phillips-FeinDec 18, 2003

In cowboy myth and frontier legend, Texas is a land of rebel outsiders, eternally defiant of the East’s gray-suited Establishment. Yet the official symbols of the adoptive home of George W. Bush are traditional ones of authority and power. The Capitol building’s garish pink dome, larger than that of the Capitol building in Washington, hovers over the city of Austin, a sprawling college town of seedy frat bars and New Age commerce. On the Capitol flagpole, the Lone Star flies below the American flag, emblem of the few brief years when slaveholding Texas was its own republic. A giant bronze statue honors the Confederacy, whose people, we are told, seceded for the mighty principles of 1776 and fought the North, outnumbered, until exhausted. To the side is a mammoth stone tablet, emblazoned with the Ten Commandments. And down the center of the Capitol plaza runs a recently refurbished stone walk, each brand-new brick bearing the name of a different corporate citizen: Enron, Wal-Mart, Philip Morris.

Yet just a few blocks away from the Capitol building there is evidence of another native Texas politics. In a small, one-story storefront, on a scraggly, tree-lined stretch of road, across the street from a temp agency, there hangs a purple-and-yellow sign, calling out to passersby: Organizing for Justice: Sign Here! This is the central office of the Texas State Employees Union, a statewide local of the Communications Workers of America. Opening the screen door to the office, one sees few posters on the whitewashed walls: Instead, they are covered with lists bearing the names of thousands of members across the state. High on one wall, almost in the corner, is a poem, stitched in needlepoint: “You can break my union heart, you may lock up my union body, you might dampen my union spirit, but you’ll never control my union mind.”

TSEU has been organizing public workers in Texas since 1980. It is a true industrial union, organizing most state workers without respect to department or occupation. For the past quarter-century it has been one of the few organizations in Texas to try to imagine, let alone to build, an alternative to the corporate frontier. There is no state income tax in Texas; in fact, the legislature is constitutionally prohibited from passing one. As a result, hundreds of thousands of children were thrown off health insurance this past year to balance a $10 billion budget deficit (Texas already led the country in the number of uninsured children), while millions of dollars were cut from school funding. State Representative Debbie Riddle of Houston summarized the political logic of such decisions in the El Paso Times this past spring: “Where did this idea come from that everybody deserves free education, free medical care, free whatever? It comes from Moscow, from Russia. It comes straight out of the pit of hell.” In that spirit, the state legislature passed a bill this past May that will result in the privatization of eligibility determination for TANF (the program that replaced welfare), food stamps and Medicaid–fundamentally transforming the way social services are delivered in Texas. Mike Gross, who once worked for the Texas Youth Commission and is now organizing coordinator of TSEU, says, “We are on the frontlines of the battle to protect the social safety net.”

In some ways, the conservative attack on the union and the public sector in Texas today is a microcosm of the assault in the country as a whole. While business organizations and the corporations that stand to benefit lobby successfully for privatization and budget cuts, tens of thousands of state workers lose health insurance, and the social programs that make life bearable for the desperately poor are downsized.

But more is at stake than particular programs and specific benefits. The very existence of the public sector–controlled by people who are democratically accountable, run by workers who aren’t in thrall to the Dow Jones–hangs in the balance. The real question in Texas, and in the country as a whole, is whether we want to fight social inequality through the collective power of the state, using its capacities for redistribution to help remedy the sharp inequities of the workplace, or whether we should let corporations, churches and the Salvation Army deal with the problems of the poor. The choice is between charity, with its baseline assumption of unequal power and resources, and the expansive vision of social democracy. Against great political odds, TSEU is fighting not only for its members but also, implicitly, for us all.

On a cool April morning last spring, TSEU held its Lobby Day at the state Capitol, in the middle of the biennial legislative session (the legislature meets for five months every two years). More than 2,000 workers from across the state converged in Austin. They boarded buses from Houston, Dallas, San Antonio, Lubbock and other cities; one bus came all the way from El Paso, on the opposite side of the state, leaving at 6 pm the night before the rally. Parole officers, admissions attendants in the state’s mental hospitals, workers in state laboratories, people who help workers whose jobs have moved to Mexico find new employment, graduate teaching assistants–all came to the Capitol. Some had never been to a demonstration before; one shy Mexican-American man said, “This is my first adventure.” Workers carried homemade signs up the hill to the Capitol complex: Tax Ken Lay; Don’t Dump Wall Street’s Mess in Texas; Houston: We Have a (Bigger) Problem; The Texas Legislature Massacre. At the end of the rally and march, before going in to meet with their representatives, a union organizer led the crowd in a chorus of “Solidarity Forever,” in English and then in Spanish–a sight George W. Bush probably never expected to see.

Historically, the Texas elite has been bitterly anti-union. With its oil industry and old manufacturing base, Texas is one of the more heavily industrialized Southern states, and union membership grew rapidly during World War II. The expansion of the labor movement was greeted with a harsh reaction. The state passed numerous restrictions on labor rights during and after the war, banning the closed shop, secondary boycotts, mass picketing–even before the Taft-Hartley Act of 1947, which limited such tactics nationally. Employers and state politicians accused unions organizing in oil ports of being the tools of a Communist plot.

For public workers, organizing is especially difficult. State workers in Texas are legally barred from collective bargaining, and TSEU–like many unions a hundred years ago–has no formal recognition. It depends for its existence on building major demonstrations of its members and lobbying state representatives. But through old-fashioned one-on-one organizing, the union has built a dense activist network with 12,500 members statewide (out of a pool of about 90,000 state employees). Dues are on a sliding scale, according to income, and they range from $14 to $19 a month. Although it cannot bargain contracts, the union is credited with winning pay raises and improvements in benefits in the past and stopping privatization in the mid-1990s, as well as countless smaller victories on the local level.

People like Judy Lugo, TSEU’s president, built the union. Lugo is an earnest and unpretentious woman in her 40s. Twenty-three years ago, as a single mother with two children, she applied for welfare. She managed to get a job with the agency that helped her out, and today she is a supervisor in the Texas Department of Human Services, working in El Paso, the Rust Belt of Texas. Many of the people she sees applying for benefits in El Paso have college degrees, or used to work in factories that have since moved across the border. The high unemployment rate in the city keeps wages low. “In Dallas, you get a job in a Burger King and it will pay you $9 or $10. In El Paso, you get $5.15.”

Over her twenty-three years in the system, Lugo has seen the welfare system go through many changes. Staffing has gone down, and “staff are working more hours just trying to get the work done.” Clients can’t get through on the phone and aren’t applying for benefits. She herself has been hurt by the cutbacks in the state workers’ healthcare plan. “I’ve stopped taking one medication because I can’t afford it,” she says. Most of the time, in mainstream Texas politics, Lugo’s work is sneered at as a spectacle of government bloat. The union is one of the few groups in the state to recognize the daily agony of poverty, to take seriously and lend dignity to the difficult lives of poor people. Meanwhile, the Republican legislature obsessively and repetitively attacks government bureaucracy with the enthusiasm of an anorexic staring in the mirror. Representative Garnet Coleman of Houston, a union supporter, says, “They keep talking about cutting the fat, but we’re way under the muscle and we’re hitting the bone.”

The privatization bill that passed the legislature early this past summer will dramatically change the way that social services are delivered in Texas. It collapses the twelve agencies that serve the neediest Texans–people such as nursing-home residents, abused children, the physically disabled and mentally impaired, the blind and the deaf–into five. It will reduce the work force for these agencies by about 3,500 people over the next two years. While the legislation includes items such as incentive payments for women on welfare who agree to take seminars on marriage, the core of the bill is to privatize the frontline offices of the welfare state. Hundreds of local welfare offices across the state that sign people up to receive food stamps, Medicaid, TANF, children’s health insurance, disability and other public benefits will close. There will be no more face-to-face interviews (except in special situations). Instead, workers in four statewide call centers will enroll people from all over the state in programs over the phone. These call centers will be privatized. In addition, state mental hospitals and schools for the mentally disabled may now be sold to low bidders. The result is a huge step toward a welfare state run for profit and traded on the stock exchange–Texas, Inc.–in which the government places the lives and well-being of its poorest citizens in the hands of shareholders and CEOs. The changes start in January, and are supposed to be completed by the end of next summer (though this speedy timetable seems unrealistic to many).

The main corporate player pushing welfare privatization in Texas is ACS State and Local Solutions (the company was once a division of Lockheed Martin, the weapons manufacturer, which sold it to the business services giant ACS a few years back). ACS handles contracts for welfare and work-force services, as well as other government functions, across the country. Texas and Florida are its largest markets. In Texas, in addition to holding contracts to run many of the state’s work-force programs (helping people get jobs and training), ACS manages eligibility and re-enrollment for CHIP, the state-federal children’s healthcare program, and it recently won a contract to process claims from Medicaid clients and pay doctors. The company has been eager to expand its reach over the Texas welfare system ever since the mid-1990s, when it helped push legislation that would have privatized eligibility determination in Texas for all human services–welfare, food stamps and Medicaid. The plan failed when Clinton refused to sign a waiver permitting the privatization (current federal law mandates that civil servants determine who can receive Medicaid and food stamps). Bush, on the other hand, is likely to approve such a waiver–opening the door to privatization of welfare eligibility not only in Texas but across the country (Florida already has a pilot project up and running). And ACS will be poised to win the call-center contracts when they go out next year.

Upper management at ACS State and Local Solutions lives in a different world from the client base of the programs it controls. Almost all of the executives are white, and only three out of nineteen are women. Glossy brochures in the ACS office suite depict smiling children and describe the company’s success in helping “customers” build a “career that they are in charge of, every step of the way.” The company’s five-point credo announces that employees are proud to contribute to “shareholder value,” and on one office door hangs a sign: We WILL Make Our Numbers!

ACS maintains a staff of twenty-one lobbyists in Texas alone–at a cost of $910,000 a year, according to the Austin American-Statesman–a roster that includes a former state senator, a former aide to the Speaker of the House and an erstwhile aide to Governor Bush. The company lavishly donates to political campaigns, giving $25,000 to the Texas GOP, $50,000 to Governor Rick Perry’s inauguration, $2,500 to the Comptroller and $18,850 to other state and local candidates (also according to the American-Statesman). This legislative session, ACS got what it paid for.

ACS is only one member of a highly politicized and aggressive business community in Texas. The Texas Association of Business, which claims to represent 140,000 large and small employers in the state, regularly meets with members of the state legislature and Congressional delegation, and gets very involved in their political campaigns. In 2002 Texas elected a Republican majority to the state legislature for the first time since Reconstruction, and TAB took credit–it had run a $1.9 million advertising campaign during the elections. The leaders of TAB are strong advocates of privatization. Bill Hammond, the president of TAB, was appointed by Governor Bush to oversee the privatization of work-force development during the 1990s. From his swank office a few blocks away from the Capitol, Hammond rehearses the arguments in favor of privatization as a way to reduce costs. Competition among private companies for contracts will drive state spending–and hence taxes–down. Private companies will be able to make innovations, invest in new technology and improve service efficiency in ways that are impossible for lumbering state bureaucracies. In fact, to Hammond, the budget crisis in Texas this past year seems a golden opportunity for the state to evaluate its programs in terms of maximum efficiency. Finally, Hammond believes that corporations will lower costs because private workers won’t be as liberal with benefits as public employees are. “If you are a single mom, the last thing you want is welfare,” he says. “If you can get a minimum-wage job, you should. It is not a matter of denying someone welfare, but a balanced perspective and presentation needs to be made in terms of going on welfare.” Says Hammond, “The public sector just hands out benefits.”

In fact, the welfare system in Texas is one of the stingiest in the country, but even beyond this, Hammond’s assumptions about privatization are problematic. For one thing, there is little proof that privatization really does lower costs or taxpayer expense. Frequently, overseeing the private contract incurs new costs that offset whatever savings the vendor promises. Elliott Sclar, a Columbia University urban planning professor who has researched privatization, says, “When you hold everything else constant, the differences are at best minuscule and often favor public provision of services.”

But since the way that a private company makes money with a state contract is by running the operation for less than the state pays, companies face tremendous incentives to cut wages and benefits. Under privatization, unions would be the first thing to go. States like Florida, in which public-sector workers do have collective bargaining rights, would lose tens of thousands of union jobs. In Texas, while workers in private call centers might gain the legal right to bargain collectively, the call centers are almost certain to fight unionization tooth and nail. Like so many other low-wage employers, they would probably not hesitate to discipline workers for union activity (even though it’s illegal).

Because the public sector is at least in theory accountable to citizens, not shareholders, it is different from the private sector in other important ways. Public employers have historically been more ethnically and racially diverse than most private companies, and they have played a critical role in the creation of a black middle class. They do not usually fight unionization with the same bitter intensity, and thanks to civil service rules, they provide more rights and greater protections for employees. By contrast, private employees are accountable to no one except their bosses. Indeed, the leaders of ACS see their power over workers as their main comparative advantage. When asked why private companies would be able to outperform the public sector, Gerald Miller, director of work-force and community solutions for ACS, tells an anecdote. “If Kristin needs a computer today, I can go out and buy it,” he says, referring to a new public relations employee whom I had interviewed moments before. “You know what happens if Kristin does a bad job? I fire her.”

What’s more, increasing the power of management and introducing profit-making considerations into the welfare state have a negative impact on social policy. Public review processes, reporting requirements, adequate staffing–these can seem like “inefficiencies” to private corporations legally bound to deliver the highest return to investors. Prison privatization in Texas, for example, has been a well-documented disaster. In the 1980s Texas sold contracts to manage its overcrowded, violent prisons. In just a few years, the scandals were legion. One privately run jail in northwest Texas that housed inmates shipped in from Hawaii and Montana was not giving prisoners enough food or proper medical care. At a prison run by Capital Correctional Resources, corrections officers were videotaped beating up prisoners and setting attack dogs on them. And Wackenhut Corporation lost a $12 million contract when twelve former guards were indicted on charges of having sex with female inmates, some of whom said they had been raped. In practice, “flexibility” can mean cutting back on prison staff and programs, denying welfare clients due process or making it far more difficult to obtain review of decisions made by the private company.

Are today’s developments in Texas a forecast for the nation tomorrow? The Bush Administration has already won massive tax cuts at the national level, while at the same time cutting services, pursuing devolution and dismantling what remains of the New Deal. Currently, Bush plans to subcontract hundreds of thousands of federal jobs that are now performed by civil servants. The effect, as in Texas, will be to dole out the public sector to politically connected corporate donors, while attacking one of the few remaining economic sectors where unions still wield substantial power. The ideology of the free market justifies the choice to grant huge contracts governing the lives of the poorest citizens to men and women more accountable to shareholders than to the public good. Yet the image of nimble corporations zipping through an abstract marketplace hides deeper assumptions of a sharply hierarchical social order, for the justification of privatization is ultimately that CEOs are the only people who can be trusted with social power–never women like Judy Lugo.

In Texas, TSEU is determined to keep on fighting privatization. The union has defeated the privatization of welfare delivery in Texas before, and organizers hope that enough people can be educated and organized to stop it this time around too. After all, privatization looks even worse on the ground than on paper. Already corporations are asking that their bids for government contracts be kept secret, beyond public scrutiny. It’s not hard to see why: The Texas Observer has reported that Gregg Phillips, the man appointed to oversee the reorganization of the agencies, previously oversaw a failed privatization effort in Mississippi; one of his first actions in office in Texas has been to grant a large contract to his previous employer, Deloitte Consulting. Meanwhile, county governments are starting to realize that local Department of Human Services offices are slated for closure, and many are unhappy about it–especially since they are picking up the healthcare costs of the children who have been kicked off Medicaid. Two county governments have passed resolutions condemning the privatization law, and the mayor of Raymondville, a town in south Texas, has publicly protested the planned closure of the local DHS office. The union held a large training session for activists across the state in September, attended by about a hundred people who are determined not to let their jobs go into the private sector so easily. Despite rhetoric about improving services, organizers say, almost no one who uses the welfare system or works in it believes privatization is going to make things better. Reality may prove a potent solvent for the ideology of the free market. In a state of rebels and mavericks, perhaps someday joining the union will be seen as the most powerful rebellion of all.



Victory at Yale?https://www.thenation.com/article/archive/victory-yale/Kim Phillips-Fein,William JohnsonOct 23, 2003

Detroit, MI


While Kim Phillips-Fein is correct that some gains were made in the recent settlement between Yale University and the two HERE locals (34 and 35) representing its clerical and service workers [“Yale Workers Win”], she is wrong to unequivocally call the Yale contract a victory. Most obviously, the length of the contract–eight years–is a major concession to the Yale administration, which can now count on nearly a decade of labor peace on campus.

The militance of the Yale unions, which Phillips-Fein identifies as one of their key strengths, has been effectively neutralized by this unusually lengthy contract. This militance, and what Phillips-Fein calls the “deep culture of organizing and solidarity” among the Yale unions, were not nearly as apparent on the ground as they were in the news media. While the papers were filled with photographs of AFL-CIO president John Sweeney and other notables being carted away from Yale protests in handcuffs, Yale workers reported that more union members were crossing the picket lines than not. This lack of solidarity was likely the result of two factors. First, Yale’s workers sensed that the big demonstrations and “militant picket lines” that so impressed Phillips-Fein were for show, and actually had little to no impact on the university’s bargaining position. The workers were right. The gains made on both wages and pensions in the eight-year contract are nowhere close to what the union was demanding on the eve of the strike.

The other reason that solidarity at Yale faltered during negotiations is that HERE leadership had already broken ranks with the coalition of Yale unions–which included graduate students struggling to form a union and hospital workers represented by SEIU–whose members had long ago agreed that no one union would sign a contract until all workers in the coalition had one. Fracturing the coalition by getting HERE to bargain independently was itself a substantial victory for Yale.

For many years, Yale’s unions and workers have rightly attacked the university’s administration (led by president Richard Levin) as exploitative and antiworker. Now that a contract is signed, union officials are calling Levin a model negotiator, and well-intentioned but naïve journalists are calling what is at best a mixed bag of a contract an unqualified victory. If these are our victories, what will our defeats look like?

WILLIAM JOHNSON
Assistant editor
Labor Notes


PHILLIPS-FEIN REPLIES

New York City

First of all, I’m truly stunned to hear a labor journalist like William Johnson repeating the boss’s line on the strike. The union claims that more than 90 percent of Local 35 (service and maintenance) and a majority of Local 34 (clerical) members were out on strike, and while Yale tried to argue that a higher proportion of clerical workers were crossing the line, even the university did not dispute that nearly all Local 35 members were striking. The union’s numbers are pretty close to the proportion of striking workers in other Yale labor disputes, and claims attributed to “Yale workers,” repeated by a journalist hundreds of miles away, just aren’t convincing. Anyway, if no one was on strike and the pickets were for “show,” why would Yale have moved an inch?

But despite his suggestion that the strike wasn’t much of a strike at all, Johnson raises an interesting question, one that political activists and writers must constantly face: How do we judge our successes and failures? The Yale contract is not perfect. Johnson is right that its eight-year length is its biggest flaw. But one way to measure a contract is by the hope that it holds out for other workers in the industry and region. Does the contract represent a significant improvement in the lives of workers that could never have been achieved without collective action? Can other workers look to the sit-in, picket lines, arrests and strike-related demonstrations at Yale as a model of how to better their own lives? The answer to both questions, in the case of the Yale contract–which, let us not forget, doubled pensions for many workers–seems to me to be an unequivocal yes. This doesn’t mean that there isn’t more organizing to be done and more to win, at Yale, at Yale-New Haven Hospital, and throughout academia and beyond. But well-intentioned or no, pessimism can be naïve too.

KIM PHILLIPS-FEIN



Yale Workers Winhttps://www.thenation.com/article/archive/yale-workers-win/Kim Phillips-FeinSep 22, 2003

Late last week, Yale clerical and maintenance workers who had been striking for three weeks won a contract that will transform the standard of living of clerical workers at the university, as well as that of future retirees. Under the new contract, the average salary of Yale clerical workers will rise from $33,000 to $42,220 over the next eight years. The average pension for a Yale worker after twenty or more years of service has been $7,450; under the new contract, many workers will receive twice that amount.

How did the Yale workers win? Through militant picket lines and community support. The strike began with six retirees engaging in a twenty-nine-hour sit-in in the investment office. Thousands of union workers came to the region to protest Yale. Hundreds of graduate student teachers and faculty honored the picket lines and moved their classes off campus. Negotiations were held in the offices of the New Haven mayor. The city began to charge Yale for the overtime of police officers who were spending weekends and evenings arresting demonstrators. Yale, after all, is a corporation that does not pay any taxes, is run by a board of trustees that includes millionaire venture capitalists and the president of PepsiCo, and will pay its own president (Richard Levin) a pension of $42,000 a month. Shipping in hundreds of strikebreakers–which the university was beginning to do as the strike went on–did not go over well in the impoverished city of New Haven.

Most fundamentally, though, the unions won because they have cultivated a deep culture of organizing and solidarity and a willingness to take risks and challenge power. The victory at Yale was only possible because of countless individuals who gained the courage, through the union, to defy powerful authorities and seek greater security and freedom in their own lives. Today it is hard–even for unions and progressive activists–to escape the market mantra telling us we can’t ever win anything corporations don’t give us out of the goodness of their hearts. But in this broader context of political defeat, the victory of a few thousand workers at Yale should remind us all that through unity, and the bravery that comes from it, we still can win.



…and the Poor Get Poorerhttps://www.thenation.com/article/archive/and-poor-get-poorer/Kim Phillips-FeinJul 17, 2003

A year ago, Loretta Gruytch decided to go back to work.

The 71-year-old resident of Lebanon, Oregon, had suffered from high blood pressure for some time. But her medications cost $600 a month, and she and her husband (who also has health problems) live on Social Security. “You make a choice between paying rent and utilities and getting medication, and you have to have a place to live.”

She’d held jobs in retail and as a mail carrier. She didn’t think she could handle a forty-hour job, but she thought that twenty or twenty-five hours would be enough to pay for her prescriptions.

But she had put it off too long. While she was at a job interview at a local deli, she had a brain aneurysm and a stroke, and had to be rushed to the hospital. Her doctor helped her to enroll in Oregon’s healthcare plan. For the past year, Gruytch has been able to get her medicine. But today, mired in a budget crisis, Oregon is cutting back on health coverage. As a result, Gruytch may lose her prescription coverage altogether. She says, “I don’t know what I will do if they take my medicine away.”

On the other side of the country, 43-year-old Brooklyn resident Maria Jones has also slipped through the cracks of the social safety net, forced to choose between her daughter’s education and her own health.

Jones came to the mainland from Puerto Rico at the age of 18. She married young and had three children, but divorced her soldier husband after the marriage turned violent (they went through rounds of counseling first). When her children were young, she worked in a dentist’s office. More recently, she was a childcare worker, making and selling crocheted and knitted goods on the side.

Two years ago Jones had a massive heart attack, leaving her face partially paralyzed for months. After her illness, she could not work, nor could she complete the work requirements she needed to get welfare. She began to fall behind on rent. Her youngest daughter–“my miracle baby”–was in her first year at Hunter College at the time, and living with her mother. Jones asked her daughter to apply for welfare. “I was always taught that families stick together, no matter what,” says Jones. When a caseworker told her daughter that she would need to leave school and enroll in the Work Experience Program (WEP, the city’s “workfare” program) to get benefits, the daughter dropped out.

After she recovered somewhat, Jones told her daughter to go back to college. “She gave me one semester of her life. I was not going to let her give up her dreams to get welfare,” she says. Now, she subsists on a disability check of $570 a month. Her rent is $680. “Unless I hit the lotto, I don’t know what I am going to do.”

In the Bush Administration’s ideological war on the welfare state, Gruytch and Jones are collateral damage. Unlike Clinton, who sought to end welfare “as we know it,” Bush explicitly targets noncash benefits, like healthcare and education, which are currently available to poor and working poor people. Some of the changes the Administration seeks are subtle switches in rules–demanding additional paperwork for the earned-income tax credit, for example, or extra documentation for poor children to receive school lunch. Others replace federal entitlements with block grants to states.

Taken together, the proposed changes will make it more difficult for people to get healthcare, housing assistance, early childhood education and other minimal social benefits. They will fall especially painfully on the working poor. While President Clinton’s 1996 welfare reform law rescinded the federal entitlement to cash assistance, the Bush proposals seek to end the social wage.

One common thread of the Bush Administration’s proposals is that they aim to turn federal entitlement programs into block grants. The idea of block-granting these programs, which began with Reagan’s attacks on poverty programs and expanded with Clinton-era welfare reform, may sound like a technical change interesting only to policy wonks. But in reality, it is a way of quietly defunding social programs, slowly ending health coverage and rent supports for people who have them now. Just as the language of states’ rights during the Jim Crow era was a way to disguise the demands of white segregationists, the language of block grants is a way of deflecting attention from the social costs of cutting back the welfare state.

The scariest proposals involve Medicaid. Right now, the program–which provides health coverage for about 48 million people–is administered by the states but jointly funded by the federal government and the states. Every dollar that a state spends on Medicaid is matched by some money from the federal government. In January, Health and Human Services Secretary Tommy Thompson proposed turning the program into block grants. If such a proposal became law, the federal government would give a fixed, capped amount to each state every year. Since there would be no built-in mechanism to increase federal spending should states’ costs rise, this would be likely to erode funding in the long run. In addition, there would be no incentive to expand services (federal matching funds would be eliminated). Joan Alker, a senior research fellow at Georgetown University who focuses on healthcare, says, “If more people came on the rolls, if there was a new technology or drug, if there was an epidemic, states would be left holding the bag.”

Under block grants, states might also get greater “flexibility” to determine benefit packages. They might even have the option to deliver services selectively for certain populations–to one part of the state, but not another. Currently, the Medicaid program divides both recipients and services into “mandatory” and “optional” categories. Mandatory recipients include pregnant women and children under 6 in families with incomes up to 133 percent of the poverty line, children under 19 below the poverty line, some low-income parents, children in foster care and elderly and disabled people on Supplemental Security Income (SSI). Mandatory services include hospital services, doctor’s visits, nursing-home care, immunizations and lab services.

But a quarter of Medicaid recipients and the majority of elderly recipients are optional beneficiaries, meaning that the federal government does not require states to cover them. These include elderly and disabled people with incomes above the SSI limit (74 percent of poverty), working parents above mandatory income levels, pregnant women above 133 percent of poverty, and low-income children above the poverty line. Many services provided by Medicaid are optional as well–including prescription drugs (for anyone but children), home healthcare, vision and dental care. In a budget crisis, these would be the first ones to go if the federal government isn’t funding them. Indeed, states have already started to cut “optional” populations off Medicaid. The people whose healthcare is threatened are marginal and poor. According to a report by Cindy Mann of Georgetown’s Institute for Health Care Research and Policy, in 1998 the optional benefit population included 4.2 million children, 3.7 million working parents, 2.3 million elderly people and 1.5 million people with disabilities. Medicaid also covers a high proportion of nursing-home expenses. Denise Soffel, health policy analyst at New York’s Community Service Society, says, “People aren’t thinking about Granny in the nursing home using Medicaid dollars.”

The Bush Administration’s hopes for rapid Medicaid reform were stymied when members of the National Governors Association were unable to agree on a proposal endorsing block grants. But Congressional Republicans are still saying they will mark up a Medicaid reform proposal this year, and the Administration has not given up. “If they don’t get it done this year, they will come back to it next year,” says Melanie Nathanson of the Center on Budget and Policy Priorities (CBPP).

Other poverty programs, such as public housing, are also on the chopping block. The Bush Administration is contemplating a radical restructuring of the federal rent voucher program, under which low-income people pay roughly one-third of their income toward rent and the federal government makes up the rest. The voucher program is overwhelmingly used by working poor families, people with disabilities and elderly people, and helps several million people stave off homelessness each year. In New York City alone, about 100,000 people use housing vouchers, and there is a waiting list of 150,000.

In fact, thanks to the recession, the use of rent vouchers is rising across the country. But the Bush Administration is planning cutbacks that would leave 184,000 vouchers currently in use without adequate funding to cover them. At the same time, the Administration envisions turning the entire rent voucher program into state-administered block grants. Legislation to accomplish this has been introduced in both houses of Congress. Despite volatile rental housing markets, the block grants would cap funding for vouchers. Vic Bach, housing expert at the Community Service Society, says, “Over time, the vouchers will be devalued and will cease to be an effective way of obtaining affordable housing.” The program will be renamed Housing Assistance to Needy Families, an allusion to Temporary Assistance to Needy Families (TANF), the name of the program created by the 1996 welfare reform law. Conservatives such as Howard Husock of the Manhattan Institute dream of the day when there will be time limits and work requirements for public housing, and Michael Liu, an assistant secretary at Housing and Urban Development, has said that the Administration will not prohibit states from setting time limits for public housing. Counties in southern Delaware and the city of Charlotte, North Carolina, are already experimenting with public-housing time limits. Another Bush housing proposal is a $50 monthly charge for families in public housing with little or no income. But some housing authorities have already tried minimum rents and given up–in one case, according to the CBPP, after tenants were found selling their blood to pay the rent.

Many liberals hoped that Clinton’s welfare reform would mean an expansion of assistance for working-poor families–more generous daycare or health benefits, for example. But the Bush Administration is building on the most punitive aspects of the 1996 welfare law, seeking to expand work requirements while tightening access to education and training. Meanwhile, many people who left welfare for jobs during the 1990s boom have lost them as the economy has soured.

For example, 39-year-old Brooklyn resident Rosario Rodriguez left welfare a year ago when she landed a $6.50-an-hour job making maternity garments at a small Brooklyn factory. She liked the job, she says, even though she worked in a dirty, unheated room, lost her health benefits as she exceeded Medicaid’s income limit (“it was a factory job, it had no healthcare”) and got only a half-hour lunch break. But after six months, she was laid off as business slowed down.

Now she spends her days at the “job center,” New York’s euphemistic name for welfare offices, where there aren’t any jobs to be found. She says, “My caseworker told me he would pray for me.” Rodriguez suspects that she won’t be able to find paid work, and will instead have to go into the city’s Work Experience Program. Meanwhile, her daughter, a high school student, just had her first baby. Rodriguez would like to help out and stay home with her grandchild, but she fears this would result in a loss of benefits.

It’s easy to overlook the connections between Bush’s sweeping vision and the program changes of the Clinton years. But when it comes to TANF and to poverty, the two Presidents are very similar: Both have emphasized work regardless of its quality–regardless of whether it is stable, pays a living wage or provides health coverage. The marketplace, not the government, is supposed to relieve poverty, even when all the evidence shows that it does not.

Against great odds, low-income activists struggle to fend off these attacks on the remnants of the welfare state. For example, Community Voices Heard, a group representing poor women in upper Manhattan, organizes around economic issues. It has not been an easy time, and activists are deeply frustrated. “A lot of people on WEP got jobs,” said longtime activist Stephen Bradley at an organizing meeting on a rainy mid-May evening–jobs that they’ve now lost. “They got off welfare, but now they are back on welfare.” Politicians, he says, are unresponsive: “We go and we plead our case, but they don’t listen.” Congressman Charles Rangel, who represents Harlem, has never consented to an in-person meeting with them. Harlem native Lloyd Anderson says, “People need to be able to pay their bills and take care of their children.”

Through organizing and lobbying, people have made some advances. This past spring, Community Voices Heard and other community groups won passage of a City Council law that permitted people in WEP to attend college for part of their work requirement–though Mayor Bloomberg promptly vetoed the law. Healthcare for poor people in New York State was threatened with cutbacks, but these were averted when the state raised taxes on upper-income people to maintain Medicaid spending for the poor. And a successful action was held May 14, when about 100 activists thronged the lobby of Club for Growth, an elite Beltway fundraising organization that funnels money to conservative Republicans and pushed for the Bush tax-cut plan. Holding photographs of firehouses that are closing and schools in disrepair, they chanted, “Where’s the jobs? Where’s the jobs?”

Almost all the forces of the rich and powerful are stacked against them. Right now, Bradley says, “People are desperate. We are like a lost society.” His words ring true. As Loretta Gruytch faces life without the medication she needs, and Rosario Rodriguez prays for work, it is hard not to feel as though our whole society has somehow gotten lost. Yet at the same time, despite the array of forces against them, the dedication and courage of these activists in Harlem offer hope for the future. When Lloyd Anderson says, “It’s getting ready to get hot,” one can only hope that he is right.



A New New Deal?https://www.thenation.com/article/archive/new-new-deal/Kim Phillips-FeinDec 12, 2002

The same week that New York City Mayor Mike Bloomberg announced his plans to close eight city firehouses, Mike Wallace, John Jay College professor and bard of New York, held a conference on “New York City and the New Deal” at the CUNY Graduate Center. Walking up through Herald Square on that rainy Friday morning in late November, I found that the desperate sparkle of 1930s New York seemed far away from the stagelike shop-window displays of Macy’s and H&M. The Empire State Building–a monument to the jazzy bubble of the 1920s–loomed over the public space of the Graduate Center.

A few years ago, Wallace (with co-author Edwin Burrows) published Gotham, a 1,000-plus-page tome on New York’s history to 1898. His new book, A New Deal for New York, is a diminutive 128 pages, but its subject is, if anything, more important. Wallace was inspired to write about the New Deal by his experience in New York City after 9/11, where he witnessed New Yorkers’ solidarity and compassion for one another in a moment of crisis. It seemed possible to him that the collapse of the twin towers could focus national attention on the city’s longstanding political and economic difficulties. “By making chronic conditions acute,” Wallace writes, the destruction of the towers “helped galvanize the will to confront them.” Wallace hoped that disaster relief might go not only to the families of those who died in the towers but also spill over to relieve the daily disaster of poverty in New York. New homes could be built, and not only for those whose homes were lost; jobs could be created for people who don’t have them, not only for those who once worked in lower Manhattan. The New Deal–which started as a response to another kind of Wall Street disaster–could serve as a model of public investment.

At the Graduate Center, historians, urban planners, politicians, city activists, social scientists and labor leaders met to imagine a new New York City. Representative Jerrold Nadler spoke, as did Bruce Raynor, the president of UNITE. Even Bill Clinton had hoped to attend (he didn’t, which seemed to sum up his rocky relationship with the New Deal). In the shadow of November’s conservative electoral victories, the policy proposals discussed at the conference–none of which is anywhere on the table–all had a polemical cast.

What would a new New Deal look like? For Wallace, the most important thing is that the New Deal’s creators viewed the alleviation of poverty as an economic good. Improving the lives of the poor and the working class, as they saw it, could make the whole economy grow. In that same spirit, Wallace envisions a massive public investment program that would revitalize the economy while making economic growth more equitable. He hopes for federal investment in affordable housing, high-speed rail, alternative energy sources and the Second Avenue subway. More generally, he wants a greater level of public involvement to preserve and nurture the delicate urban economy–a cocktail of such unlikely allies as Robert Moses, builder of New York’s highways, and Jane Jacobs, of Greenwich Village, who fought their construction. Federal investment in infrastructure would improve the city’s economy while employing people who badly need good jobs and boosting purchasing power. It is an economist’s New Deal: Wallace wants the federal government to “deploy its resources (that is, our tax dollars) to alleviate suffering and revitalize the economy.”

By these standards, the New Deal is quite a benchmark. By 1936, public works projects in New York City employed more than 246,000 people, who constructed hundreds of parks, municipal swimming pools, playgrounds and the Central Park Zoo–not to mention almost 400 new police and fire stations. New Deal agencies hired artists, writers and actors, viewing them as workers, not dilettantes. Dentists opened clinics in the city’s schools. Out-of-work teachers founded the city’s first public daycare centers. Thousands of workers in all kinds of occupations–garment workers and office workers alike–organized industrial unions. On a national scale, there were deep flaws in the New Deal, especially in its racial and gender politics. But, especially in New York City, the New Deal contained the promise of a utopian vision of a society driven by human need and collective purpose instead of private wealth. “If all this could happen during the greatest economic crisis in the city’s history,” asked Wallace at the conference, “why the hell can’t we make it happen now?”

While Wallace conveys the “range, sophistication and efficiency” of the New Deal programs, his technocrat’s language and win/win Keynesianism seem unlikely to inspire the dramatic reshaping of the economy that he envisions. The first New Deal, after all, was won only after a century of struggle. As Nelson Lichtenstein suggested at the conference, the New Deal was more than public spending on highways and playgrounds. Rather, it bore witness to a vision of American society that would place labor and the well-being of working people at the center of the economy’s productive power. “The New Deal,” Frances Fox Piven said at one panel, “was a response to a political crisis, not just an economic one.” Urban planners and New Deal Democrats did not make the New Deal alone. It was born of the men and women who sat down at Flint and streamed into industrial unions, who marched on Washington for sustenance and work. This near-revolution was fought bitterly in the 1930s by conservative businessmen, and its remnants are still under attack today–in every supply-side tax cut, every antiunion campaign.

No amount of warm enthusiasm could generate agreement at the Graduate Center about how to move forward. But there was consensus on one point: An egalitarian politics has yet to rise from the ruins of lower Manhattan. “There was always the politics of hope and fear, and the guys at the top are running with fear,” says Wallace. Today, it seems clear that the primary effect of the war on terror has been to strengthen every conservative political force in American society. Instead of a program of public investment, we’ve gotten tax cuts and welfare cutbacks–not to mention a new federal agency free of unions, and the first invocation in a generation of the Taft-Hartley Act to stop a labor action. No politics that matters can hide beneath the sentimental unity of war. The Great Depression and the New Deal revealed deep conflicts in American society that had long been evaded and denied. At the Graduate Center, it was generally accepted that any similar politics today would do the same. No matter how remarkable the New Deal’s practical accomplishments–the bridges its laborers built, the murals its artists painted, the histories its writers chronicled, the rural homes it lit up–the CUNY conference reminded attendees of its deeper political lesson: to keep faith in the struggle for a better world.



The Education of Jessica Riverahttps://www.thenation.com/article/archive/education-jessica-rivera/Kim Phillips-FeinNov 7, 2002

Jessica Rivera (not her real name) is a slight, composed 20-year-old Hunter College student. She grew up in the Bronx, raised by her mother and extended family. No one in her family has completed college, so Rivera was thrilled to get accepted to Hunter College, one of the best schools in the City University of New York. “It was my top choice,” she says.

In the legendary heyday of City College in the 1930s and ’40s, Rivera’s could have been a classic story of upward mobility. Had she enjoyed similar opportunities, she might even have wound up with Irving Howe and Daniel Bell, “arguing the world” in the cafeteria alcoves. But Rivera’s mother–who was injured years ago at the Bronx factory where she worked–is on public assistance. When Rivera turned 18, welfare caseworkers told her she would have to report for twenty to thirty hours a week to the city’s Work Experience Program (WEP) if she wanted to keep collecting the benefits she and her mother depend on. “They offered me jobs working in the park, cleaning toilets, cleaning transportation.” The long hours would have made it nearly impossible to continue at Hunter as a full-time student. At 18, Rivera was faced with a choice between quitting school for a dead-end job and losing her family’s income.

For middle-class Americans, society offers myriad incentives for higher education: scholarships, interest-free loans and the Hope tax credit. But for women on welfare, it’s a different story. In September the 1996 welfare reform law was up for Congressional reauthorization. The vote did not happen then, because of differences between a bill in the Senate, written by moderate Republicans and Democrats, and the Bush Administration’s vision of welfare reform, reflected in a House bill. The welfare law expired September 30, and no compromise bill or temporary legislation is yet ready to take its place.

One of the sticking points was that the Senate legislation would have made it easier for welfare recipients to go to college. Bush, however, told the New York Times in July that he does not think a college education teaches “the importance of work,” nor does he think it can “[help] people achieve the dignity necessary so that they can live a free life, free from government control.” Now that all three branches of government are controlled by Republicans, it seems likely that the Bush Administration’s vision will soon be reflected in law.

The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 mandates that recipients of public assistance work in return for their checks. They must either find jobs or, failing this, participate in state-run work programs for a minimum of thirty hours a week (split between twenty hours of paid or unpaid work, and ten hours of participation in other programs like job-search services). Should states fail to meet this work requirement, they face the loss of federal grants. (Many cities, like New York, have raised the number of required work hours above the federal minimum–in the case of New York, to thirty-five per week. It’s called a “simulated work week.”)

Under the 1996 law, college education cannot be substituted for any part of the primary work requirement. In New York City the result is clear: Before welfare reform, 28,000 CUNY students were on welfare. By spring 2002, 5,000 were–a decline even steeper than the celebrated 60 percent drop in New York City’s welfare rolls. Today, although nearly 60 percent of welfare recipients in the city lack a high school diploma or a GED, only 2 percent are enrolled in ESL or GED programs, and fewer than 4 percent are engaged in full-time education or training. “New York City has one of the most sophisticated systems of higher education in the country, but welfare recipients are essentially shut out of it,” says Wendy Bach, an attorney at the Urban Justice Center who works with welfare recipients.

The basic presumption behind welfare reform is the harsh moral logic of the workhouse. Welfare recipients, so the theory goes, are poor because they lack the discipline to hold down a job. But women who are struggling to seize hold of a little bit of upward mobility have a different experience: They feel like they work all the time.

Patricia Williams, a 32-year-old Brooklyn native and mother of a gorgeous, energetic 18-month-old girl, graduated from Hunter last year. She plans to go back to school someday for a master’s. “I want to run a high-quality daycare center,” says Williams, who was orphaned at an early age. When she started working, she did temp jobs–“everything from assembling the folders for the new Macy’s event to setting up perfume samples to shelling nuts.” After a while she decided to get an associate’s degree in computer services. Lacking parents who could help her out, she applied for welfare as a kind of financial aid. “I went on public assistance to get ahead.” After completing her degree, she enrolled at Hunter. But then came welfare reform, and she had to enter WEP.

Williams’s first assignment under WEP was housekeeping at a community senior-citizen center in downtown Manhattan. It wasn’t a job she would have chosen–she lives in Brooklyn and commutes to Hunter, on the Upper East Side, for school. But she got up at 5:30 every morning to be at work by 7:30, “cleaning bathrooms and gathering garbage.” At noon, she went uptown for class, then back downtown in the late afternoon for another stint of maid work. At the community center, “they knew me as Pat the WEP worker,” she said. “They didn’t know that I had my associate’s, or that I was working toward my bachelor’s.”

The final straw came in her last semester at Hunter. She asked her supervisor for a change in her schedule, so that she could fulfill a student teaching requirement she needed in order to graduate. “I said I would work late, on weekends.” When WEP refused, she quit. Immediately, she lost her food stamps and Medicaid. At a hearing downtown, she says, she asked a city representative, “Is it fair that I am being pulled out of school to do a dead-end WEP job?” The city worker replied, “You need to know what commitment is and what it takes to report to work.”

Bush’s welfare plan would raise the federal work requirement to forty hours per week and mandate that 70 percent of a state’s caseload be enrolled in work programs. Currently the law allows vocational training in, for example, home healthcare or computers or funeral services–but not college education–to substitute for one year of the work requirement (even though most associate’s degrees take two years). But the White House-inspired House bill would revoke even this miserly provision, allowing welfare recipients only three months of “job readiness” training in their five years of government assistance.

“It is clear that the model for the Administration’s welfare reform is New York City,” says Deepak Bhargava, director of the Center for Community Change, a Washington-based think tank. The Administration’s proposal “will drive more work programs, and the work requirements will make it virtually impossible for women to seek higher education.” As he points out, “arguably the least successful welfare program in the country is being held up as the most successful.” While the census recently found that New York is the most unequal state in the nation, poverty rates in New York City have started to creep upward, according to the Community Service Society of New York. Studies have shown that more than one-third of people who leave welfare are unemployed. And there is no evidence, according to research by the Manpower Demonstration Research Corporation, that unpaid work programs like WEP lead to sustained employment or to higher earnings. It seems likely that New York’s dramatic reduction in welfare caseload reflects stringently applied rules more than the well-being of poor families. Meanwhile, the city’s homelessness problem is once again reaching crisis proportions, with hundreds of families sleeping in public offices in the Bronx, while the city’s food pantries and soup kitchens have reported steady rises in use.

The Senate welfare bill–crafted in the bipartisan Finance Committee by Democrats and moderate Republicans like Olympia Snowe of Maine–was slightly better than the House bill on the question of education, though it still left much to be desired. It contained considerable work requirements, but it did allow states to develop more generous educational policies, permitting them to substitute school for work for part of the caseload (30 percent for vocational training, 10 percent for college).

Even as New York moves to the center of the national debate over welfare policy, local politicians are starting to respond to pressure to change the law–much of which is coming from welfare recipients themselves. In 2000 the Welfare Rights Initiative (WRI), a Hunter-based organization of current and former welfare recipients, successfully lobbied the state legislature to enact a bill permitting work-study and internships to substitute for work requirements. In spring 2002 Gifford Miller, the Speaker of the City Council, proposed a bill allowing welfare recipients to substitute college course work for WEP. Meanwhile, in Maine, legislators have used state-level funds to support college students on welfare. The program (called Parents as Scholars) has been very successful. The women it serves earn a median wage of $11.71 upon graduation–compared with $8 for women before entering college; they are also more likely to work in jobs that offer health benefits. Ninety percent of Maine women who earned a degree while on welfare have left the rolls, with every indication that they will stay off.

But while innovative local programs are all to the good, the restrictive federal policies with regard to college for welfare recipients are part of a larger social shift toward a constriction of access to higher education for poor and working-class Americans: The number of Pell grants is shrinking, tuition is rising and budgets at public universities are being slashed. New York City’s university system is scandalously underfunded, and CUNY programs are on the chopping block. When Bush ran for President in 2000, he described himself as the “education President,” because ever since Horatio Alger, education has been touted as the key to upward mobility. But in truth, the question of who has access to college has always been deeply social and political. College enrollments exploded during the great postwar boom, in the heyday of high union density and the welfare state, and today’s college gap simultaneously reflects and perpetuates the haughty isolation of the rich.

Young women like Jessica Rivera, though, clearly benefit from whatever changes local organizations can make. Just when she was about to give up on school, Rivera learned about WRI. With legal help provided by the advocacy group, she successfully pleaded her case before a hearing officer to substitute work-study hours for WEP under the state law. The rising junior says she isn’t yet sure what she wants to major in, but she knows she wants to get a master’s degree–even, someday, a PhD. “Who wants to be on welfare? I’m going to have my own job and be independent–I don’t need to depend on anybody,” she says cheerfully. But, at the same time, when she thinks about her mother, Rivera’s face grows sad and reflective. With a gentleness that seems to contradict her spunk, she softly says, “Some people just have to be on welfare.” It is anybody’s guess what our President–whose Poppy surely paid for Yale–thinks young women like Rivera will learn about responsibility or commitment picking up trash in Central Park.



Exchangehttps://www.thenation.com/article/archive/bulldog-bulldog-now-now-now/Kim Phillips-Fein, Our ReadersAug 9, 2001

Helaine S. Klasky, Nina Stachenfeld, Eric Alterman, Carol P. Christ, Kim Phillips-Fein


Bulldog, Bulldog, Now Now Now

New Haven, Conn.

The one-sided nature of Kim Phillips-Fein's "Yale Bites Unions" [July 2] may be explained by the fact that she is a union organizer of graduate students at Columbia University, but that does not excuse her errors and misrepresentations. A few facts are in order: At Yale, those select few who enter the graduate school (10 percent of all who apply) are provided a minimum annual stipend of $13,700 for five years. Every PhD student also receives additional support that covers tuition for four years ($23,650 per year) and a comprehensive healthcare plan. Over five or six years of study, Yale invests more than $160,000 in each of these students.

During their years in the graduate program, students are expected to master the skills required to become leaders in an academic field, including subject expertise, research methods, writing and, yes, teaching. Future academic leaders must indeed spend a small part of their studies gaining classroom experience. At Yale, graduate students are typically expected to assist professors by teaching part time during two of their graduate student years. Two-thirds of graduate students at Yale are not doing any teaching at all in a typical semester.

As for Phillips-Fein's claim that the vast majority of students have a strong desire to unionize, that has not been demonstrated. The union that seeks to represent the students, an affiliate of the Hotel Employees and Restaurant Employees, has not gone to the National Labor Relations Board to seek an election. There are vocal students on both sides of the issue, and it remains to be seen which point of view has more support. The union has pressed the university to recognize it as a bargaining agent for graduate students without an election. Without protections provided by a federally supervised election, students may be subject to intense pressure to sign authorization cards in on-the-spot, face-to-face encounters with organizers. Yale opposes recognition on the basis of a "card count," because it fails to protect the right of students to a secret ballot, and it fails to promote an open, honest discussion of the issues.

HELAINE S. KLASKY
Director, public affairs, Yale University


New Haven, Conn.

As a member of the Yale faculty, I consider Yale's treatment of unions beneath contempt. Over the past few years I have witnessed trash pile up in the streets while the administration wore the unions down. They were able to wear them down because there is only so long that janitorial and maintenance workers can remain out of work with families to feed. Do the members of Graduate Employees and Students Organization (GESO) really think that the issues they face are in any way similar to those faced by the members of unions 34, 35 and 1199?

As a former graduate student and postdoctoral associate I understand how hard and seemingly unfair it is to be a graduate student. However, my opposition to GESO is the only time in my life I have ever opposed the formation of a union. Graduate students are transient. Further, as college graduates they have many opportunities open to them. They are not faced with working long hours at low wages for the rest of their lives to support a family. The young man described as busying himself in the kitchen of a Yale professor is there by choice, and that description is a misrepresentation and an exaggeration of what goes on at Yale.

Grading papers is no fun, but it is not exploitation by any stretch of the imagination. The fact that this group of graduate students believes it is exploitation because the number of tenure track positions has fallen over the last ten to fifteen years is ridiculous. How many jobs let individuals reach a point in their career where they cannot be fired, regardless of performance? Isn't it tenure that permits the level of academic arrogance described at Yale by the author? No, even faculty such as myself who are strongly pro-union, have union members in our households and can readily see how members of locals 34 and 35 are treated miserably by Yale find little reason to support GESO. In fact, it seems to me that GESO is exploiting the energy of the leaders and members of 34 and 35, who have enough to do fighting Yale on their own.

The old rivalry between Yale and Harvard still exists and extends far beyond the Yale-Harvard football games. The stark contrast between these schools was evident, with Harvard students staging the longest sit-in in that school's history to demand living wages for its workers. Meanwhile Yale students work to protect themselves while claiming that their work will benefit all the unions at Yale.

NINA STACHENFELD


New York City

I write to protest Kim Phillips-Fein's egregious misportrayal of my old friend and professor, Paul Kennedy. It is just plain bizarre to see a scholar's unusual devotion to teaching, to history and to his students held up as part of an indictment of his views. And although the author was granted an interview with Kennedy, she made no attempt to allow him to express the rationale for the positions she treated so contemptuously.

It is my understanding that because of his commitment to his undergraduate students, Kennedy asked his TAs to sign a pledge that they would not, as part of a union protest, withhold undergraduates’ grades. Why should undergrads, he asked, suffer for graduate student grievances? This position, while certainly arguable, is defensible and honorable, even to a former Yale grad student like myself who supports the cause of their unionization. To compare the pledge to “yellow dog” contracts of the past is ignorant and ahistorical–revealing much more about the author than the subject.

In addition to his unmatched reputation as a scholar and a mentor, Kennedy is well-known throughout the community for his commitment to serving the needy in New Haven soup kitchens, as well as his public and spirited opposition to the most egregious aspects of conservative rule during the past two decades. It was the publication of his masterwork, The Rise and Fall of the Great Powers (not "of Great Powers," as it was mistitled in the article), that marked the beginning of the end of the reign of Reaganism in public discourse. He deserves far better from The Nation.

ERIC ALTERMAN


Molivos, Lesbos, Greece

As a former Yale graduate (PhD, 1974), I am sickened reading your article about Yale's attempts to maintain its elite status at the expense of underpaid workers. In the early 1970s I worked with the Yale Women's Faculty, Graduate Student and Staff organization to bring HEW to Yale to investigate sex discrimination among faculty, staff and students. I also supported the effort to unionize Yale's vast secretarial staff. At that time the university president sent a letter to department chairs urging them to tell their secretaries that a vote for the union was unbecoming to their position. One would have hoped that times had changed at Yale.

CAROL P. CHRIST


PHILLIPS-FEIN REPLIES

New York City

Next time Helaine Klasky writes a letter, she might want to talk first to Yale Law School dean Anthony Kronman. Even this antiunion Yale administrator has admitted in the New York Times that grad students have "serious concerns": They "teach undergrads whom the faculty have neither the time nor inclination to teach, and then, after receiving their degrees, are cast off into an inhospitable job market."

Beyond that, Yale pays its "select few" salaries that are lower than Yale's own estimate of what it costs to live for twelve months in New Haven. Graduate students don't take classes in the third and fourth years of study, so it's not so generous for Yale to grant them free "tuition" for four years. Klasky is invoking an accounting fiction, not an argument. At the same time, as Yale administrators know, GESO fought for and won all the benefits that Klasky mentions. Yale would never have raised stipends and improved benefits without graduate students demanding change. When I was doing interviews for my article, administrator Thomas Appelquist–former dean of the graduate school–told me there was "no question" that pressure from GESO was responsible for improving the graduate school.

Klasky might also want to check with Yale’s lawyers on the university’s position on union elections, for what she says directly contradicts what it has been saying and doing for the past ten years. For years GESO asked Yale to hold an election. The university always refused. The League of Women Voters sponsored a union election in 1995. Yale paid no attention to the results. Richard Levin’s administration has fought to reverse the NYU decision granting grad students rights under the Wagner Act–a reversal that would make it impossible for grad students anywhere to hold union elections. It’s not surprising that these students are skeptical of Yale’s newfound democratic sympathies.

In response to Nina Stachenfeld: This past spring, more than 1,100 members of locals 34 and 35 voted to support GESO in the upcoming round of contract negotiations. Given her professed respect for janitorial and maintenance workers "with families to feed," it's odd that she presumes to know better than they what's in their best interests. That kind of condescension is what all Yale's unions are fighting. On a deeper level, as Stachenfeld notes, we live in a society divided by class and privilege. It's sad that a faculty member at Yale should have nothing but criticism for people who are joining together across such boundaries. Her remarks, despite their patina of sympathy for the downtrodden, only reinforce these divisions.

In response to Eric Alterman: It's a curious union supporter who thinks it's "honorable" to deny union members employment unless they sign a pledge not to engage in union activity. Would Alterman support such a pledge for high school teachers? For Yale's janitors and clerical workers? Before whipping himself into a moral lather, Alterman should also check his facts. Kennedy's original e-mail said that he would refuse to teach his undergraduate lecture course if "any of the TAs were GESO members who might take industrial action" in a future dispute with Yale. Making nonunion status a condition of employment is the definition of a yellow-dog contract. When pressed by a student, he explained that what he meant was that while he did not mind having people from GESO as TAs, he sought "reassurance that my undergrads will never be disrupted by industrial action of the sort that so often accompanies wage/benefit negotiations." Until he received it, he would not offer the lecture course. Exactly how grad students could give Kennedy this open-ended reassurance, he did not specify.

Kennedy does care a lot about his students. At Yale, though, this particular ideal of the student-mentor relationship–where a professor is supportive, provided that the student knows his or her place–is precisely what has fueled the viciousness of the school's antiunion reaction.

KIM PHILLIPS-FEIN

Yale Bites Unionshttps://www.thenation.com/article/archive/yale-bites-unions/Kim Phillips-FeinJun 14, 2001

On a Friday afternoon in late April, Woolsey, the great hall at Yale, is packed with Old Blues. Gilt scrolls frame the proscenium, and from the ceiling hangs an enormous screen bearing the word YALE. A sudden organ chord sounds. In their starched shirts and navy jackets, the alumni rise like a single body and, without prompting, sing the Yale anthem. At the last refrain–“For God, for country and for Yale”–they take out white handkerchiefs, waving them above their heads in a gesture resembling surrender.

It’s the first event of Yale’s Tercentennial Alumni Weekend, part of a yearlong festival celebrating 300 years of pomp and circumstance. Yale has always been self-conscious about its place in American society. Since it was founded by dissenting Puritans, the school has claimed crusading aspirations. Even a modern-day Thorstein Veblen like Lewis Lapham can’t help but extol his alma mater’s “antithetical spirit of remonstrance and dissent” in Yale’s alumni magazine. Three years ago, historian Gaddis Smith gave a course on “Yale and the External World,” featuring such topics as the reorganization of the provost’s office. In a telling misprint, the course was advertised as “Yale and the Eternal World.”

But the weekend’s first event is firmly anchored in this world. It is a chat with Yale alums Robert Rubin, former Treasury Secretary, and Janet Yellen, a former governor of the Federal Reserve. University president Richard Levin, a fellow economist by training, slyly jokes that perhaps Yale can take credit for the past decade of economic expansion–as well as for the last three Presidents. Levin trots Yellen and Rubin through a few paces, coaxing them to rehearse painfully conventional wisdom (why lower deficits are good for economic growth, why NAFTA is a boon to America and Mexico alike). The alumni grow dazed and bored. People start to drift out of the hall.

Outside in the street, there’s another restless crowd massing. Skinny grad students mingle with secretaries and chat with Yale’s service and maintenance crews. Children run underfoot. A huge sound truck blasts reggae, while police on horseback guard the street corners. A grad student from the French department gives a speech about grading papers, and an undergrad talks about what she’s learned from the labor movement. Carpenters and janitors cheer. Alumni passing by aren’t sure what to make of the noise. “What are they protesting now?” a middle-aged alumna asks. “It’s a little bit rude,” sniffs a gray-jowled man.

The demonstrators are from locals 34 and 35 of the Hotel Employees and Restaurant Employees Union (HERE), representing Yale’s support staff and its service and maintenance workers; GESO, the Graduate Employees and Students Organization (affiliated with HERE); and the Service Employees International Union (SEIU), District 1199, which is organizing workers at Yale/New Haven Hospital. (Yale does not own or directly control the hospital, but the university’s president sits on its board of trustees.) GESO has been organizing for the past decade; the hospital workers, for three years. Yale’s history of hostility to unions makes it impossible to hold a fair NLRB election, the unions say, so they’re demanding recognition for GESO and the hospital workers when a majority of both bargaining units sign union cards.

The battle will come to a head in the next academic year, when contracts for locals 34 and 35 expire. The locals are pressuring Yale to recognize GESO and the hospital workers. If the university refuses, negotiations will be difficult.

It’s fitting that the sleepy gentility of the alumni reunion should be shaken by a ruckus in the streets of New Haven, for unionbusting is as much a Yale tradition as Lux et Veritas. Over the course of the past century, the university has bitterly fought the organization of every group of campus workers. Harvard made headlines recently by refusing to pay its employees a living wage, but when it comes to playing tough with unions, New Haven has Cambridge beat by a mile: Yale’s had seven nasty strikes in the past thirty-five years; Harvard’s last was in 1983.

Everyone knows why campus workers–including graduate students–are organizing: low wages, minimal benefits, outsourcing, the dismal academic job market. But at Yale there are deeper reasons for unionization. To many workers and students, the university seems like little more than a country club or an extended business school seminar. The word “education” has its roots in the Latin educere: to lead out, to bring out of oneself, to develop the full range of human capacities and talents. For many, the university seems no longer to have this power–but the labor movement does. Yale, for its part, sees itself as defending the university, that fragile flagship of Western Civilization, against the gritty conflicts of the picket line. These competing visions of the university–and not just money and power–are at the heart of the battle over unions at Yale.

When graduate students began organizing in the early 1990s in response to a series of sharp budget cuts (targeted at, among other things, the library), their slogan was as loyal to the university as the most hidebound Old Blue could wish: “Save Yale.” GESO members saw themselves as champions of the true Yale against an administration, led by Benno Schmidt, that seemed willing to sacrifice the sociology department to preserve the endowment.

But over the past ten years of organizing, graduate students have come to understand themselves more as employees than as saviors of Yale. This change is partly a result of Yale’s response to GESO, which has fluctuated between arrogant disregard and vicious retaliation. In 1995 the grad union won a League of Women Voters-supervised election, 600 to 166. Levin responded by promising to shut down the university before negotiating with the union. When teaching assistants in turn went on a grade strike, they were told they would be fired from their spring semester teaching jobs. The administration even gave faculty a green light to mention union activism in recommendation letters–equivalent to blacklisting.

The strike, says Wendi Walsh, previously a psychology grad student and now lead organizer for GESO, “completely destroyed any hope that I had in this place.” Her best friend, a fellow union activist, was singled out for disciplinary charges, telephoned at her family’s home in India in the middle of the night by her department chair, and told she was the only one striking and could be expelled if she didn’t hand in her grades. Walsh’s adviser, a woman she respected, failed to speak out against the administration’s punitive acts. Today, Walsh describes the strike as “the most intense time of my life. People were so scared.”

Since the strike, Yale’s attitude toward graduate student unionization has not softened. In 1998, when 1,000 graduate students signed a petition asking Yale to negotiate a contract with graduate students, Levin mailed the signatures back with a breezy note: “The University will not recognize GESO or negotiate with it as a collective bargaining agent for graduate students.” This past fall, graduate students organizing in a campus coffee shop were kicked out repeatedly by an administrator. And Yale has employed Proskauer Rose LLP, a law firm whose website says its mission is to coach companies on “how to avoid, and, where appropriate, resist union organization of employees.”

At the same time that it has “resisted” GESO, Yale, like other universities, has relied increasingly on graduate student and adjunct teachers. About 40 percent of Yale’s teaching is now done by graduate students–paid $13,700 a year–and 30 percent by adjunct faculty. As graduate students’ prospects for tenure worsen, they are less willing to accept such conditions. “Universities across the country are hiring less and less faculty, so this job you spend ten years preparing for may just not be there,” says Rosa Anna DeFillipis, a student in molecular biology. Rhetoric about the high moral purpose of the university now seems precious and quaint, a way of denying who’s doing the work.

While the labor movement shatters the sentimental faiths of academe, it has provided graduate students with a new vision of the scholarly life, in which being an intellectual demands daily confrontations with institutional arrogance. Unionization, says Carlos Roy Aramayo, a history graduate student, “is about putting Enlightenment principles at the center of the academy.” Most GESO members stay in academia, but for some the labor movement ultimately comes to seem more compelling. Two years ago, Walsh turned down two prestigious postdoctoral posts to organize full time for the union. Her adviser was furious. “Before,” she says, “I always thought I was a strong person, but I always did what was expected of me in every way.” The lessons she learned in academia were about subordination to the job market and to her professors; organizing taught her what it meant to live a thinking life.

At Yale/New Haven Hospital, workers have followed a trajectory similar to that of the grad students. When they began organizing, they believed they were defending healthcare and their own jobs against an administration that endangered both. Now they see unionization as good in its own right.

The seeds of the campaign at the hospital were sown in locals 34 and 35’s last contract battle. In 1996 the university sought to expand its ability to use “casual” nonunion workers and to subcontract jobs. Locals 34 and 35 went on monthlong strikes to fight the proposed changes, and though they were largely successful, the unions concluded that Yale’s aggressiveness augured badly for the future. Organizing at the hospital–like organizing the TAs, which HERE has supported financially from the beginning–seemed like a good way to counter Yale’s plans and to build union power in New Haven. In an impressive example of cooperation between unions, HERE has been working with 1199 to organize the hospital for the past three years.

There are plenty of grievances at Yale/New Haven Hospital. Entry-level workers are paid $8.50 an hour, and many longtime employees make only $11 or $12 an hour. “A lot of people have two jobs,” says Monica Osborn, an operating-room technical associate who cuts hair nights and weekends to make ends meet. Meanwhile, the CEO is one of the highest-paid hospital executives in the country. But the campaign really got going after the hospital “wholeheartedly embraced the concept of change in healthcare,” a euphemism invoked by senior vice president Vincent Conti in 1994 to justify sharp staffing reductions.

The hospital has firmly opposed the organizing campaign. Last summer, armed security guards threatened workers with arrest for handing out leaflets in front of the hospital. Days after settling an unfair-labor-practice suit the union filed to protest the incident, the administration sent a memo to top management saying that the settlement “in no way changes the Hospital’s position regarding the current unionization effort. We oppose it and will continue to do so.”

Peggy Tamulevich, who’s worked at the hospital for twenty-three years since graduating high school, says that if Yale doesn’t recognize the union, she’ll have to leave. “I used to be proud to work at Yale, but I don’t feel that way anymore.” Tamulevich and others now see the labor movement not only as a way to improve their personal conditions but as a source of collective power and idealism. As her co-worker Kent Hilton puts it, “If we can actually organize, and then hook up with the union workers at the university, we will have a stronger voice in New Haven.” He goes on, “I’m not talking about riots in the streets, but wouldn’t it be nice to have one big mass walkout? They can’t just fire everyone. I mean, who would clean up if they did? Are they going to do it?”

Joseph Conrad once wrote that had Europeans looked directly at imperialism, they could not have dominated half the globe. But they did, because of their firm faith in the rightness–the obligation–of European rule. “What redeems” the “conquest of the earth,” he wrote, “is the idea only. An idea at the back of it; not a sentimental pretense but an idea; and an unselfish belief in the idea–something you can set up, and bow down before, and offer a sacrifice to.” One might say the same of antiunion feeling at Yale. The university justifies its fierce opposition to unions through its exalted sense of purpose.

President Levin–under whose eight-year tenure Yale’s endowment has more than tripled, now topping $10 billion–sees “the education of leaders” as Yale’s mission in the new millennium. Yale, he boasts, trains “more leaders of US corporations than any other university.” He has sought to place Yale at the pinnacle of the global economy, creating a World Fellows program and founding the Center for the Study of Globalization, to be led by Strobe Talbott, Deputy Secretary of State under President Clinton.

Yet the education of rulers of the globe takes place in the smallest of worlds. Nestled in quadrangles carpeted with magnolia blossoms, the university is a haven of intellectual discovery, which must be vigilantly protected from the fray. Within Yale’s stone buildings, with their spiky iron gates and intricate Gothic carvings, meant to evoke the twelfth century rather than the twenty-first (although many were built in the 1930s), brilliant professors teach the bright-eyed leaders of tomorrow. Devotion to the promise of this intimate space is what brings the administration–and the faculty–to the barricades.

Historian Paul Kennedy, bestselling author of The Rise and Fall of the Great Powers, is one of the most fiercely anti-GESO professors on campus. Harking back to the days of yellow-dog contracts, he’s gone so far as to threaten not to teach his lecture course if any of the teaching assistants are GESO members.

A soft-spoken Briton, Kennedy loves old things. He collects books about old churches. He misses the sense of “history and ancientness” that he finds in European university towns. He fondly remembers receiving an honorary degree from a Belgian university celebrating its 575th anniversary, how he walked through the medieval village early in the morning while a light snow was falling. A high mass was said before the ceremony. He even compares Yale to the Holy Roman Empire. (There are limits to his gentility: When I call for an appointment, his receptionist informs me that he charges more than a thousand dollars for an interview. When we meet–sans fee–he explains that’s for Japanese and Korean publications looking for “foreign gurus.”)

Kennedy tries, as best he can, to replicate medieval tradition in New Haven. His grad students live clustered around him. In the summer he takes them to his cottage on Long Island Sound to swim. He likes to take students to the theater, sometimes in groups of just a few undergrads. “Half the time you’re watching the play, and the other half you’re looking at this 19-year-old from Kansas who has never seen A Midsummer Night’s Dream before and can’t control his emotions,” he says. One PhD Kennedy taught–who never supported GESO–now runs his institute, and as we chat, waiting for Kennedy, he busies himself in the kitchen, preparing coffee for the esteemed professor and his guest.

Kennedy’s evocation of a premodern world is consonant with Yale’s overall strategy regarding GESO. The administration insists graduate students are “apprentices,” not employees. Back in the boom years of the post-World War II era, when there was a tenure-track job waiting for almost every PhD, that may have made more sense. But today, the university’s gauzy idealism seems misplaced. Indeed, in a landmark decision last October, the NLRB, rejecting an appeal by New York University, affirmed that teaching assistants at private institutions are employees with the right to form unions under the Wagner Act. After a bitter fight, negotiations between NYU and the Graduate Student Organizing Committee/UAW began this spring. There are now union drives at public and private universities around the country, including Columbia and Brown.

Yet the Yale administration wants to turn back the clock. After last fall’s NLRB ruling, Levin encouraged NYU president Jay Oliva to challenge the decision in federal court, saying that unionization was not “in the best interest of graduate students themselves.” Administrators insist they don’t understand, despite ten years of struggle, why graduate students would want a union. Perhaps, provost Alison Richard suggests, they don’t feel part of the Yale family. They have no residential colleges or social outlets. Alienated and lonely, graduate students turn to GESO. But Richard admits that she’s had little contact with pro-union students and has had to “infer” most of her ideas about graduate student unionization. “It is perplexing to me,” Richard says, a little wistfully. “I don’t get it entirely.”

In 1978, writer Vivian Gornick spent a few months on a College Fellowship at Yale. Writing for The Nation, she described the boorish inanities of the Yale professoriat. Instead of intellectual conversation, she found brandy-swilling poker games and senior faculty more interested in her kneecaps than her mind. Expecting bookish companionship, she encountered sherry-hour condescension, magnified by an immense, unshakable self-regard. The wife of one senior professor described her husband’s morning ritual to Gornick: While shaving, he would stand in front of the mirror, thinking, “Jesus Christ. I’m at Yale.”

Yale may have traded its clubby atmosphere for a more liberal multiculti image. But even among the progressive faculty who protested the honorary degree President Bush received recently, there are many–like comp lit professor Peter Brooks, who described Yale students as “the blessed of the earth”–who have staunchly defended the university’s anti-GESO stance. And that’s because, like Gornick’s faculty, they remain in thrall to the mystique of the Ivy League. As the school anthem goes, Yale should evoke the same fealty as country, even as God. For them, the university embodies the celestial realm of the mind, far removed from the grubby materiality of the workaday world. Administrators like Richard and professors like Kennedy fear the loss of a beautiful ideal that represents the only way to pursue truth and knowledge. But can an institution devoted to seeing the world as it is rest upon a romance?

The campus labor movement–from the living-wage sit-in at Harvard to grad student organizing to the daily struggles of locals 34 and 35–insists that the university be seen for what it is: a workplace like any other, resting upon the labor of people who grade blue books for low pay, cook and file, lead sections and scrub toilets. Far from denying the university’s purpose, the labor movement on campus fulfills its deepest promise: to tell the truth about the world. Perhaps its greatest triumph will be to strip Yale of its idyllic myths and feudal convictions, making it impossible to take either seriously any longer.



Flower Power: The Lessonshttps://www.thenation.com/article/archive/flower-power-lessons/Kim Phillips-FeinJun 8, 2000

An article in the financial section of the New York Observer this spring described a company named NetJ.com Corporation. NetJ.com has no business operations, no revenue and no assets to speak of. To quote its business plan, filed with the Securities and Exchange Commission, “the company is not currently engaged in any substantial business activity and has no plans to engage in any such activity in the foreseeable future.” This would seem to render NetJ.com worthless. Yet in February, it was capitalized at $22.9 million.

Despite the Nasdaq’s recent gyrations, a market like this one can’t help but bring to mind tulipmania, the brief period of fantastic speculation in tulip bulbs in seventeenth-century Holland. Ever since Charles Mackay wrote his 1841 classic, Extraordinary Popular Delusions and the Madness of Crowds, tulipmania has been a potent symbol of the euphoric irrationality of financial speculation. According to Mackay, in the late 1630s a single tulip bulb in the Netherlands might be sold for a price equivalent to that for four oxen. The high point of Mackay’s story is the anecdote of a sailor who breakfasted upon what he believed to be an ordinary onion: “Little did he dream that he had been eating a breakfast whose cost might have regaled a whole ship’s crew for a twelvemonth.”

But Mackay wrote back in those old-fashioned days, when people believed that markets might be speculative, something other than arbiters of perfect truth and rationality. Today, trendy academics like to say that the tulip craze wasn’t a bubble at all. In a paper published in 1989, economist Peter Garber argued that the dramatic increase in bulb prices simply reflected the great demand for rare varieties of the bulb. As the flowers multiplied, their price fell. The best-known bubble in history turns out to have been a rational response to fundamentals, after all. (Even Garber, though, was stumped by the twentyfold price increase for common varieties of tulip in the month before the crash; he conceded that this might not have been completely rational–but when you’re an economist, it’s the model, not the data, that counts.)

Now there’s a new history of tulipmania to go with today’s sky-high market. Tulipomania, by Mike Dash, retells yet again the story of the flower craze, emphasizing the economic context of the bubble and its effects on Dutch society and culture. Perhaps the strangest thing about this book is what it doesn’t talk about: Clearly it’s supposed to provide insight into the stock-market boom of the past decade, yet Dash scrupulously avoids saying anything direct about the Dow Jones, preferring instead to expound upon the finer points of tulip cultivation. Still, it’s a colorful description of one of the oddest periods of financial history. And while Dash doesn’t comment directly on financial euphoria today–wouldn’t want to alarm anyone!–if one reads between the lines, Tulipomania offers its own sly commentary on contemporary market fever.

* * *

For one thing, Dash attributes the tulip craze to inexperienced investors crowding into the market–not unlike today’s day traders. Tulipmania exploded in 1636, during an economic boom in the Netherlands. But despite economic growth, artisans and laborers found themselves barely eking out a decent living, unable to cash in. For these humble sorts, the tulip market offered an irresistible temptation. “Growing bulbs was a lot easier than working an eighty-hour week hammering horseshoes or working a loom,” writes Dash. Many of the people growing and selling bulbs were new to the flower business altogether and had little experience in the financial markets. At the same time, speculative financial instruments–like the futures market–were just coming into existence. One can almost hear Alan Greenspan’s anxious sigh as Dash describes these novices to the market.

In a sense, Dash uses tulipmania to legitimize “normal” financial activity in the early seventeenth century, much as the recent flick Boiler Room told a parable of cheating bucket shops to demonstrate the virtues of real brokerages. Dash suggests that the tulip market was a “rough but intended parody” of the newly built beurs–the legitimate Amsterdam stock market. Tulips were more frequently traded in raucous pubs (substitute: chat rooms) than in the calm halls of the beurs. Instead of being staffed by experienced bankers or people who understood the complexity of financial markets, the tulip market was dominated by “country people and poor city dwellers who had, when they started dealing in bulbs, almost certainly never owned a single share in their entire lives.”

The crash came on the first Tuesday of February, 1637. For no apparent reason, all of a sudden there were no buyers for the bulbs: “The market for tulips simply ceased to exist.” Since prices had been rising thanks to simple speculation, they went into a free fall. As Dash puts it, the rout “was far more rapid and complete than history’s most notorious financial disaster, the Wall Street Crash of 1929 and the Great Depression that followed it.” However, there were few links between the tulip market and the real economy, which was hardly based on mass consumption: No generalized crisis followed the collapse of flower prices.

Tulipomania concludes with a few non sequiturs about horticultural history; Dash stubbornly refuses to say anything substantive about finance. (His other implicit analogies don’t hold up. Most stock ownership is concentrated in the wealthiest 10 percent of the population. Novice investors and day traders aren’t creating the boom; rather, they are riding on its coattails.) But other historians who aren’t so allergic to generalization think the episode tells us something important about how financial markets work. For the great economic historian Charles Kindleberger, the tulip bubble serves as a prime example of speculation. Markets, he argues, though generally rational, are not immune to manias, which are by definition irrational. In a bubble, buyers purchase goods simply because they believe their prices will increase. This stokes demand for the goods, hence becoming, at least in the short term, a self-fulfilling prophecy.

However, the moment buyers decide they can no longer risk bidding prices up, prices collapse immediately–since the product was only valued as long as people believed its price would rise. Kindleberger believed that an expansion of credit could feed such a speculative frenzy (and thus magnify the effects of a crash)–a useful reminder in times like these, when debt held by financial institutions, corporate debt and good old-fashioned credit-card debt are on the rise, making it seem likely that rising interest rates will eventually call a halt to the party.

* * *

So how about the trillion-dollar-plus question: Is today’s market a bubble? Despite reading stories like NetJ.com’s in the paper nearly every week, and the fabulously high valuations of other tech companies that seem to show little prospect of ever turning a profit, it’s not so clear. For one thing, thanks to massive stock buybacks–and mergers and acquisitions–corporations are actually returning much more cash to shareholders than they have in the past, over 90 percent of after-tax profits over the 1990s, according to Left Business Observer editor Doug Henwood (about double the rate during the Golden Age of the fifties and sixties, when economic growth was distributed far more equally than it is today). It may make sense for investors to value their shares more highly. One might ask whether giving so much money back to shareholders is viable over the long term; as my friend who peruses the Flow of Funds like baseball box scores has noted, corporations are supporting this massive transfer of cash to shareholders through heavy borrowing. This can’t endure indefinitely.

On a deeper level, the end of the Soviet Union, of social democracy in Western Europe and of anticolonial movements in the Third World has left capitalism with no political or ideological opposition to speak of. And what the stock market clearly does register is the balance of power between labor and capital in the economy as a whole. From the standpoint of the very rich, there’s plenty to celebrate: The American labor movement was hacked down over the 1980s, and the massive upward skewing of income distribution has kept moguls fat and happy. What better cause for euphoria? For this very reason, a few good strikes or a couple of months of rising wages may well prove the most effective antidote to market fever. If the balance of power between workers and owners shifts in the real economy, Wall Street will start to seem more dangerous. When the market falls this time–unlike the tulipmania–it will likely be because of changes in the economy and politics that go well beyond the Dow Jones.



Feminine Mystiquers
Kim Phillips-Fein
Mar 11, 1999
https://www.thenation.com/article/archive/feminine-mystiquers/

For Danielle Crittenden, the “click” came when she was going to play tennis with her husband and a couple of acquaintances. She left her racket on one side of the court. When she went to get it, she noticed her husband had left his too, so she picked it up. When she got back to the group, one woman said, “Oh darn, I was going to congratulate you for not bringing it.”

Click!

Crittenden suddenly realized that her most casual interaction with her husband might be criticized by an outsider who, stoked with feminist nostrums, would find lingering female subservience in the most innocent acts. “What sort of behavior would I be capable of next? Fetching him his newspaper and slippers? Having a chilled martini ready for him when he came home from work?” And so she set out to write What Our Mothers Didn’t Tell Us: Why Happiness Eludes the Modern Woman, an antifeminist tract for the nineties. Its release date neatly coincided with Wendy Shalit’s A Return to Modesty: Discovering the Lost Virtue, a similar tale of a young girl coming of age in this crazy postfeminist world.

It’s more than a little disingenuous for Crittenden to present herself as being prompted into reaction by a harmless conversation over tennis. A seasoned journalist and founder of the right-wing Independent Women’s Forum publication The Women’s Quarterly, she’s certainly no stranger to antifeminism. Nor is 23-year-old Wendy Shalit, whose stream-of-consciousness ramblings on boyfriends, college and virginity have been gussied up into a book by which Shalit herself will certainly be embarrassed in a few years; she’s a frequent writer for conservative journals like Commentary. Together with other self-styled feminists like Elizabeth Fox-Genovese and Christina Hoff Sommers, the two represent an odd new trend in right-wing thinking: The antifeminist appeal is today being made on grounds of women’s well-being and satisfaction, in language that explicitly recalls Betty Friedan.

Contributing to one’s eerie sense that all conservative books just might emanate from the same few ghostwriters, What Our Mothers Didn’t Tell Us and A Return to Modesty are remarkably similar, both in their anecdotal style and in their obsessions. Both writers get frantic over coed bathrooms (Shalit, one of whose first big articles for Commentary was titled “A Ladies’ Room of One’s Own,” has built her entire career on this unpromising foundation). Both note how revealing it is that we’re “flocking to Jane Austen movies.” Both feel obliged to establish their sympathies with mainstream economics. Shalit refers to her Chicago-school economist father, and Crittenden uses market metaphors to talk about sex: “When something becomes widely and cheaply available, its value usually goes down too.”

Most important, both books are modeled on The Feminine Mystique–Crittenden’s self-consciously, Shalit’s less so. Like Betty Friedan, both use women’s magazines, those hoary rags devoted to shilling skin creams and facial masks, to gain some insight into the female psyche. But unlike Friedan, neither writer treats Cosmopolitan, Glamour or Redbook with any hint of irony or sense of disjuncture between the claims of the magazines and real women’s lives. For instance, they find that women are horribly anxious about their looks and their relationships with men. According to Cosmo? You don’t say!

But the effort to rewrite The Feminine Mystique for the postfeminist nineties goes well beyond a few quotes from women’s magazines. Inverting Friedan’s famous formulation, both writers argue that “while we now recognize that women are human, we blind ourselves to the fact that we are also women,” as Crittenden puts it. Feminism’s fatal flaw, they say, is that it taught women to ignore their “fundamental female desires” for husband, home and family. A quarter-century after second-wave feminism, “the modern woman” is miserable and confused. Professional success doesn’t make her happy, and her newfound independence makes it impossible for her to get what she really wants: a good husband and children. To find and keep husbands, women must stop having sex outside marriage, and they must stop deceiving themselves about wanting a job. Only then will men treat them as ladies, and only then will they be able to have stable homes. If Friedan thought women were languishing in suburban kitchens, Shalit and Crittenden think they are frustrated and desperate in office towers. What modern women really want, as Shalit puts it, is “our ‘feminine mystique’ back.”

Although both writers posit basic, immutable differences between men and women, their claims of female difference don’t hinge on female ineptitude. Both take it for granted that women can become lawyers and doctors and philosophy professors, excel in school, do math. (Construction jobs don’t come up, given the authors’ upper-middle-class milieu.) For Shalit, the realization of basic female difference came at college. When she was growing up, her beloved father–her mother isn’t mentioned–never treated her differently because she was a girl. “When I returned home from the prom, after all, I could discuss anything I chose with my father.” But then Shalit got to Williams College. She noticed the anorexic girls, stick-thin and exhausted. She observed protests against date rape. She listened to her friends complain about their boyfriends. And finally, she decided that the conservatives should “take the claims of the feminists seriously.”

But not too seriously. Date rape, for example, can’t have anything to do with “the patriarchy”–in other words, men–because we now live in a postsexist age. After all, Williams College has Women’s Pride Week, right? Instead, Shalit proposes “that the woes besetting the modern young woman–sexual harassment, stalking, rape, even ‘whirlpooling’ (when a group of guys surround a girl who is swimming, and then sexually assault her)–are all expressions of a society which has lost its respect for female modesty.”

And why has “society” lost its respect for female modesty? Because women, corrupted by feminism, aren’t modest anymore. It turns out that–in addition to whirlpooling–anorexia, depression and even bad sex can all be attributed to loose women. Men were perfect gentlemen–opening doors, giving up seats, courting patiently–till women started sleeping with them whether they held doors or not. “How can we expect men to be honorable when a large number of women consistently send them the message that they do not have to be?” As for men listening to women without the spur of sexual blackmail, forget it! “Women can’t tell men how to behave–they either inspire, or fail to inspire.”

Despite the occasional Rousseau quote to assure us of her precocity, Shalit’s main source of reading material–besides econ textbooks and women’s magazines–appears to be bodice-ripping romance novels. She loves duels and seems to believe that the appropriate response to a failed romance just might be suicide.

Crittenden, on the other hand, has no illusions that men behaved better in the era of hoop skirts, swoons, smelling salts and gentleman callers. Both then and now, she sees men as unappealing louts and cads who, despite their flaws, are by a cruel trick of biology the sole ticket to female happiness. If Shalit justifies her antifeminism with gallant Jimmy Stewarts and blushing maidens, Crittenden sees men who’ll leave their wives at a moment’s notice for a big-breasted blonde. Shalit believes young women are innately modest and must be brainwashed by feminist cadres into thinking they want premarital sex. Surely you couldn’t want to do that, she whispers coyly, cringing and turning pink.

For Crittenden, though, young women’s sexuality is all too real. In fact, it’s to blame for older women’s misery. At first, young women “strut about like female Don Juans,” wearing miniskirts and lipstick, “virtually daring men to become sexually entangled with them.” But soon, their sexual power fades; they can’t compete with the new, younger crop of downtown divas. Eyes getting “crinkly,” the career girl suddenly craves a baby: “Her apartment feels too quiet, her work, no matter how exciting or interesting, is less absorbing, and her spare time, unless packed with frenetic activities, almost echoes with loneliness–think of an endless wintry Sunday afternoon unbroken by the sound of another voice.” Even if she does manage, somehow, to marry, those selfish girls have ruined marriage for everyone: “If young, attractive women offer no-strings-attached sex, then men will have no pressing reason to tie themselves down.” Stay-at-home wives are naturally hostile toward working women who might steal their men: “What woman wants a hungry shark trawling near her shore?” Crittenden hits an especially shrill note in her chapter on “Aging,” in which she expresses disgust for the women in their mid-30s and 40s with gray hair and lined faces she sees with their children at the playground. What man would stay with one of them?

Shalit and Crittenden believe that women are self-sacrificing creatures, happiest in love. “Modern women” are in trouble because they’ve been sucked, against their natures, into an agonistic, commercial culture. The images of the neurotic 35-year-old sitting near the monkey bars, the sleek, anxious office professional, the green-haired college feminist, are supposed to awaken instant discomfort, like a picture of a deformed child. But the key assertion is that women’s “problems”–and, implicitly, the larger problems of our culture–can be solved if women simply admit their feminine wishes and do what they’ve really wanted to do all along: Stay virgins till marriage, marry young and have kids early.

Neither writer offers a shred of evidence for her claims, which makes these books second-rate agitprop rather than “first-rate sociology” (the gushing claim made for Shalit’s book by the editor of Partisan Review). How could they? Their basic claim–that women can’t get married these days–is simply false: If either one had bothered to look at the census, she’d have seen that the overwhelming majority of the female population still gets married, for better or for worse. In Shalit’s beloved 1890s, the percentage of women between 35 and 44 who’d never married was a scant two percentage points lower than it is today (12 percent). It’s true that women are marrying later–especially compared with thirty years ago–but they’re still getting married. And when you factor in the couple of million women “co-habitating” without the blessing of the state (including lesbians), not to mention the women who are single by choice–it does happen!–this new “problem that has no name” simply disappears.

Shalit and Crittenden don’t understand divorce either. Implicitly or explicitly, both advocate early marriage. But early marriage today tends to undermine marital stability: Women who marry after 30 are less likely to divorce, while women who marry in their teens–or who interrupt their education to marry–are more likely to do so. Something else greatly increases the likelihood of marital breakdown: being poor. Of course, neither writer suggests raising wages to increase marital stability. And it’s strange to suggest that bored husbands divorce their wives for “smart, attractive, and, above all, unencumbered” young women: Not only are women more likely than men to initiate divorce, but divorced women are also more likely than divorced men to report serious problems with their marriages beforehand–casting some doubt on Crittenden’s “why stay married when I can meet young women at a bar any night of the week” hypothesis. (Granted, Crittenden is not very interested in women’s unhappiness in marriage: Her rallying cry is “It’s time to settle.”)

But why bother with facts when the topic is women? Neither Crittenden nor Shalit denies the accomplishments of second-wave feminism: Instead, they seek to transform the remarkable victories feminism has achieved into dismal failures–transgressions of an underlying, unchanged nature. Young professional women discover that “the feminist ideas on which they were weaned do not lead them to happier lives but only to loneliness, stress, and the forfeiture of the most joyous experiences of a woman’s life,” according to Crittenden. At other times, it seems Shalit and Crittenden want to pick and choose–access to higher education but not sexual freedom; employment in the professions but no delayed marriage–without admitting that to be treated equally as students and employees, women can’t be placed on a pedestal or treated as though the only thing that really matters to them is husband and children.

The truth is that political feminism has dramatically changed women’s lives for the better, and at much less cost than Shalit and Crittenden insist. In my experience (and as long as anecdote is the standard, why shouldn’t my experience count?), women don’t feel intense psychic dissonance or overwhelming guilt because they have lives outside the home: Very few young women–especially the well-educated young women Shalit and Crittenden describe–want to give up work for family. It is true, of course, that balancing the demands of work and family can be extremely difficult, but this is a problem with a name: inflexible work hours and inadequate childcare.

Thanks also to political feminism, young women today don’t think there’s a contradiction between having a sex life and deserving respect from men. Nor do they want to give up on finding husbands and partners who aren’t just “inspired” by them but love, respect and listen to them. For many, this means waiting till they’ve grown up to get married, not getting hitched to the first halfway-decent man who comes along. After all, a hundred years ago women didn’t get married young because they understood something we don’t about love; they did it because they had no choice.

Still, the Shalit and Crittenden account of modern American culture as shallow, competitive and amoral is likely to resonate with some readers. Yet it’s absurd to suggest–as they do–that American life is more viciously Darwinian today than it was in the 1890s; it was pretty viciously Darwinian then. In the early part of this century, sociologists like Caroline Ware and Robert and Helen Merrell Lynd observed how impoverished American society was by its mores of unfettered individualism and how difficult it was to found a new culture on the principle of private gain. So feminism isn’t to blame for the ascendance of market norms in private life. The panoply of sins enumerated by the antifeminists–premarital sex, single parenthood, delayed childbearing–did not call into being the culture of the strip mall, the TV set and the self-help book.

The antifeminist appeal to women to give up sex and work for the good of the culture is a cynical, inherently conservative effort to silence a real political question–what kind of society is best for human beings?–and replace it with a vision of domestic utopia.

In terms reminiscent of Christopher Lasch, Crittenden describes the family as a realm of “duties and sacrifices…incomparable love.” (Socialism in one tract house?) Shalit, sounding more like a New Age hippie, thinks romantic love used to be “beautiful and true” but has been corrupted by a selfish, competitive individualism: “Everything and everyone is up for grabs, and we always face the harsh world directly, unmediated by any enduring sentiment other than each out for him- or herself.” Aside from her fixation on Victorian-era marriage–which, far from being the epitome of romance, was often just a business deal under another name–what’s strange here is that in the rest of life Shalit would, if she followed through on her father’s free-market faith, embrace a world in which everyone was “out for him- or herself” (or himself, anyway).

This contradictory insistence on the absolute freedom of the market and the absolute rigidity of gender roles has been around for quite some time. The organization of the bourgeois family of the mid-nineteenth century, as Eric Hobsbawm writes, was in direct opposition to the bourgeois economy: “Within it freedom, opportunity, the cash nexus and the pursuit of individual profit did not rule.” On one level, then, the neocons appeal to a longstanding human need to evaluate life by some standard other than profit and loss, which helps explain why family-values furor has mounted during the long turn toward free-market economics. Recoiling from hyperindividualism, women like Crittenden and Shalit yearn for romance and the family, a mini-society governed by love and mutual understanding.

But that isn’t all. The traditional family is hardly an egalitarian institution, and herein lies the other pole of its appeal: In a world in which all is shifting and in flux, it provides–if only in fantasy–a certain source of authority for men and security for women. “Because superiority was so uncertain for the individual, it had to have one form that was permanent and secure,” writes Hobsbawm of the nineteenth century. “Because its essential expression was money, which merely expresses the relationship of exchange, other forms of expression which demonstrated the domination of persons over persons had to supplement it.” No wonder, then, that in an era of deepening economic and social insecurity, neoconservatives should seize once more on the old ideal of the family.


