In 1992, as the United States wallowed in recession, presidential candidate Bill Clinton began to use the term “working middle class” to describe millions of Americans who were being hurt by the restructuring of the American economy. This apparent oxymoron referred to two distinct groups: the mass of well-paid unionized industrial workers who in the post-World War II era had been able to afford houses, late-model cars and their kids’ college tuition but were now suffering permanent job losses; and the growing professional and technical classes, who were experiencing a steep decline in their status. The plight of the blue-collar workers was a familiar story. But it was harder to grasp that the professionals, too, were beginning to feel the economic squeeze and, perhaps just as important, were coming to know what it meant to be a closely supervised employee rather than one who expected to enjoy autonomy at work. Many discovered that their pay, no less than their working conditions, was deteriorating. And some were unexpectedly suffering from the epidemic of downsizing.
By the middle of the nineties the future of the middle class had become a hot political issue and a labor question. For example, the fastest-growing sector of the labor movement was that of unions of physicians, nurses and academics, especially graduate assistants in major universities; and Microsoft, the hugely successful software company that nevertheless had installed a two-tiered system for computer professionals, faced an AFL-CIO organizing drive. However, we have no vocabulary to explain the sudden appearance of class issues among those groups that have been the embodiment of the American belief in social mobility.
When referring to class we are usually content to speak in terms of the growing “inequality gap” and other descriptive euphemisms. Some identify class oppression with the very poor, in images derived from films like The Grapes of Wrath, or understand class as a subcategory of racism. Others cling to a stereotypical version of the industrial worker and never seem to notice that blue-collar production jobs, while still formidable in the US and world economies, are declining in both absolute and relative numbers under the knife of technology and corporate restructuring. The growth occupations, now one-sixth of the labor force, are those in the production and distribution of knowledge and information. We loosely describe the people who hold them as “professionals” and imagine that they are comfortable in their middle-class identities. But we have trouble explaining why many of them act like “workers” when they join unions and even strike for higher salaries and more control over their work.
Perhaps the most dramatic instance of the public’s and the labor movement’s misperception of the plight of professionals was the air-traffic controllers’ fiasco in the summer of 1981. By striking against the federal government, the relatively well-paid controllers violated federal law and their own collective-bargaining agreement. Many, both in and out of the unions, found professional unionism anomalous and wondered why the controllers walked out. The issue was not so much wages as the nerve-racking working conditions. The confusion enabled President Reagan to fire 11,000 of them, amid the failure of their own union and the AFL-CIO to make the strikers’ case in the court of public opinion. The union was broken, and the damage took years to mend. Yet, as books like Grace Budrys’s When Doctors Join Unions have shown, the reasons that professionals organize include salary issues but are fundamentally linked to a sense of degradation. The essays in Will Teach for Food, edited by Cary Nelson, discuss this combination of factors, which has led to the astounding rise of more than thirty graduate-assistant unions during the nineties at major public and private universities.