FROM THE END OF WORLD WAR II through the mid-1970s, the real wages of American workers nearly doubled, moving up in tandem with the growth in productivity. The United States benefited from an implicit social contract: By working hard and contributing to productivity, profits, and economic growth, workers and their families could expect improved living standards, greater job security, and a secure and dignified retirement. This social contract broke down after 1980, as employees lost their bargaining power. Since then, productivity has grown more than 70 percent while real compensation of nonmanagerial workers has remained flat. Wages for the lowest-paid workers have collapsed even more than for average workers.
While conventional explanations for stagnant wages and increased inequality, such as those that emphasize technological change and an increased premium for skills, may be part of the story, they fail to take into account the historical policy and institutional forces that created and sustained the postwar social contract, or to understand what needs to be done to restore it in a way consistent with the needs of today's workforce and economy.
The postwar social contract was grounded in New Deal legislation that established a minimum wage, other wage and hour regulations, and labor laws that allowed workers to build the bargaining power needed to enforce wage-determination norms and principles in negotiating with large corporations. The undermining of this system of regulations, and of union bargaining power, accounts for significant portions of the wage lag.
After periodically raising the minimum wage from 1938 through the 1960s as prices rose, Congress and several presidents have allowed the real value of the minimum wage to fall about 25 percent since the late 1960s. Econometric estimates indicate that this alone can account for about a 20 percent decline in real wages for those paid at or slightly above the minimum, and for a 6 percent to 10 percent reduction for those earning twice the minimum wage.
The system of collective bargaining that grew out of the New Deal's National Labor Relations Act, and the influence of the War Labor Board during World War II, likewise get insufficient credit for their role in creating and sustaining the link between productivity and wages. The War Labor Board used wage comparisons to instill the principle of "equal pay for equal work" within industries and occupations. It encouraged negotiations for health insurance, pensions, and other benefits that eventually became the sine qua non of a "good job." By the mid-1970s, union members were about 20 percent more likely to be covered by these benefits than nonunion workers.
In the late 1940s, the United Auto Workers and General Motors negotiated contracts that explicitly linked wage increases to productivity growth (the "annual improvement factor") and to increases in the cost of living. These private-sector wage norms reinforced the underlying social contract. From the mid-1940s through the 1970s, unions led the process of improving wages by producing union wage premiums ranging from 10 percent to 25 percent, with the biggest effects for less-skilled jobs and less-educated workers. Union-negotiated wages and benefits spilled over to affect nonunion workers and managers across the economy. For example, managers surveyed in 1978 by the Conference Board reported that wage and benefit settlements in their largest bargaining units affected up to two and a half times as many employees as there were unionized workers in their firms, and four times as many employees in their local product or labor markets.
UNIONS HAVE DECLINED FOR SEVERAL mutually reinforcing reasons: deregulation, industrial change, globalization, and increased employer resistance (often abetted by lax government enforcement of the right to organize). …