The steady stream of Watergate revelations, President Richard Nixon's twists and turns to fend off disclosures, the impeachment hearings, and finally an unprecedented resignation: all these riveted the nation's attention in 1974. Hardly anyone paid attention to a story that seemed no more than a statistical oddity: that year, for the first time since the end of World War II, Americans' wages declined.
Since 1947, Americans at all points on the economic spectrum had become a little better off with each passing year. The economy's rising tide, as President John F. Kennedy had famously said, was lifting all boats. Productivity had risen by 97 percent in the preceding quarter-century, and median wages had risen by 95 percent. As economist John Kenneth Galbraith noted in The Affluent Society, this newly middle-class nation had become more egalitarian. The poorest fifth had seen their incomes increase by 42 percent since the end of the war, while the wealthiest fifth had seen their incomes rise by just 8 percent. Economists have dubbed the period the "Great Compression."
This egalitarianism, of course, was severely circumscribed. African Americans had only recently won civil equality, and economic equality remained a distant dream. Women entered the workforce in record numbers during the early 1970s to find a profoundly discriminatory labor market. A new generation of workers rebelled at the regimentation of factory life, staging strikes across the Midwest to slow down and humanize the assembly line. But no one could deny that Americans in 1974 lived lives of greater comfort and security than they had a quarter-century earlier. During that time, median family income more than doubled.
Then, it all stopped. In 1974, wages fell by 2.1 percent and median household income shrank by $1,500. To be sure, it was a year of mild recession, but the nation had experienced five previous downturns during its 25-year run of prosperity without seeing wages come down.
What no one grasped at the time was that this wasn't a one-year anomaly, that 1974 would mark a fundamental breakpoint in American economic history. In the years since, the tide has continued to rise, but a growing number of boats have been chained to the bottom. Productivity has increased by 80 percent, but median compensation (that's wages plus benefits) has risen by just 11 percent during that time. The middle-income jobs of the nation's postwar boom years have disproportionately vanished. Low-wage jobs have disproportionately burgeoned. Employment has become less secure. Benefits have been cut. The dictionary definition of "layoff" has changed, from denoting a temporary severance from one's job to denoting a permanent severance.
As their incomes flat-lined, Americans struggled to maintain their standard of living. In most families, both adults entered the workforce. They worked longer hours. When paychecks stopped increasing, they tried to keep up by incurring an enormous amount of debt. The combination of skyrocketing debt and stagnating income proved predictably calamitous (though few predicted it). Since the crash of 2008, that debt has been called in.
All the factors that had slowly been eroding Americans' economic lives over the preceding three decades (globalization, deunionization, financialization, Wal-Martization, robotization, the whole megillah of nefarious -izations) have now descended en masse on the American people. Since 2000, even as the economy has grown by 18 percent, the median income of households headed by people under 65 has declined by 12.4 percent. Since 2001, employment in low-wage occupations has increased by 8.7 percent while employment in middle-wage occupations has decreased by 7.3 percent. Since 2003, the median wage has not grown at all.
The middle has fallen out of the American economy, precipitously since 2008, but it has been falling out slowly and cumulatively for the past 40 years. Far from a statistical oddity, 1974 marked an epochal turn. The age of economic security ended. The age of anxiety began.
The economic landscape of the quarter-century following World War II has become not just unfamiliar but almost unimaginable today. It constitutes what historian Vaclav Smil has termed "a remarkable singularity": The United States came out of World War II dominating the world's production and markets, and its unprecedented wealth was shared broadly among its citizens.
The defining practice of the day was Fordism (named after Henry Ford), under which employers paid their workers enough that they could afford to buy the goods they mass-produced. The course of Fordism never ran as smoothly as it may seem in retrospect. Winning pay increases in halcyon postwar America required a continual succession of strikes.
At the commanding heights of the U.S. economy, the largest American company, General Motors, and the most militant and powerful American union, the United Auto Workers, had fought an epochal battle in the winter of 1945–1946, the UAW's members staying off the job for nearly four months in what proved to be a vain attempt to win a co-equal say in the company's management. In 1948, with GM fearing another massive disruption and the UAW willing to give up on co-management, the two sides reached a pattern-setting agreement: In return for a two-year no-strike pledge from the union, GM signed a contract granting its workers not only a sizable raise but an annual cost-of-living adjustment that matched the rate of inflation, and an "annual improvement factor" that raised pay in tandem with the increase in the nation's productivity. In 1950, after a brief strike, the two sides signed a five-year contract, dubbed the Treaty of Detroit, that extended the no-strike pledge, the raise, the cost-of-living adjustment, and the annual improvement factor and added health coverage and more generous pensions. As the economy grew, so would the autoworkers' paychecks.
Within a few years, the increases that GM had agreed to became standard in half the union contracts in America, though workers still had to strike to win these gains. In 1952, 2.7 million workers participated in work stoppages. Throughout the 1950s, the yearly number of major strikes averaged more than 300. The largest strike in American history, in terms of work hours lost, occurred in 1959, when 500,000 steelworkers walked off the job for 116 days to secure increased wages and improved health and pension coverage.
Management was no fan of these disruptions, but they were regarded as the normal ebb and flow of labor relations. Indeed, throughout the 1940s, '50s, and '60s, many corporate executives believed that their workers' well-being mattered. "The job of management is to maintain an equitable and working balance among the claims of the various directly affected interest groups: stockholders, employees, customers, and the public at large," the chairman of Standard Oil of New Jersey (later Exxon) said in 1951. Once hired, a good worker became part of the family, which entitled him to certain rewards. "Maximizing employee security is a prime company goal," Earl Willis, General Electric's manager of employee benefits, wrote in 1962.
During these years, the GI Bill enabled far more Americans to attend college than ever had before. The ranks of America's professionals swelled, and America's income swelled with them. But the contracts enjoyed by the nation's union members, who then made up a third of the nation's workforce, boosted personal income in the U.S. even more. Indeed, these contracts covered so many workers that their gains spilled over to nonmembers as well. Princeton economist Henry Farber calculated that the wages of workers in nonunion firms in industries that were at least 25 percent unionized were 7.5 percent higher than the wages of comparable workers in industries with no union presence.
In the three decades following World War II, the United States experienced both high levels of growth and rising levels of equality, a combination that confounded historical precedent and the theories of conservative economists. By 1973, the share of Americans living in poverty bottomed out at 11.1 percent. It has never been that low since.
By the early 1980s, the Treaty of Detroit had been unilaterally repealed. Three signal events made clear that the age of broadly shared prosperity was over: Federal Reserve Chairman Paul Volcker's deliberately induced recession, President Ronald Reagan's firing of striking air-traffic controllers, and General Electric CEO Jack Welch's declaration that his company would reward its shareholders at the expense of its workers.
The abrogation didn't arrive unheralded. Beginning in 1974, inflation plagued the American economy. The 1970s were framed by two "oil shocks": the OPEC embargo of 1973 and the U.S. boycott of Iranian oil after the mullahs swept to power in 1979. During the decade, the price of a barrel of oil rose from $3 to $31. Productivity, which had been rising at nearly a 3 percent annual clip in the postwar decades, slowed to a 1 percent yearly increase during the 1970s. Europe and Japan had recovered from the devastation of World War II, and Japanese imports, chiefly autos, doubled during the late '60s. In 1971, the U.S. experienced its first trade deficit since the late 1800s. It has run a trade deficit every year since 1976.
Profits of America's still largely domestic corporations suffered. The Dow Jones Industrial Average, which had inched past 1,000 in 1972, tanked with the oil embargo the following year and didn't climb back to that level for another decade. Although the biggest contributor to inflation was the increase in energy prices, a growing number of executives and commentators laid the blame for the economy's troubles on the wages of American workers. "Some people will have to do with less," Business Week editorialized. "Yet it will be a hard pill for many Americans to swallow: the idea of doing with less so that big business can have more."
With the second oil shock, inflation surged to 13.5 percent. Volcker responded by inducing a recession. "The standard of living of the average American," he said, "has to decline." Raising the federal funds rate to nearly 20 percent throughout 1981, the Fed chairman brought much of American business (particularly the auto industry, where sales collapsed in the face of high borrowing costs) to a standstill. By 1982, unemployment had risen to a postwar high of 10.8 percent.
The industrial Midwest never recovered. Between 1979 and 1983, 2.4 million manufacturing jobs vanished. The number of U.S. steelworkers went from 450,000 at the start of the 1980s to 170,000 at decade's end, even as the wages of those who remained shrank by 17 percent. The decline in auto was even more precipitous, from 760,000 employees in 1978 to 490,000 three years later. In 1979, with Chrysler on the verge of bankruptcy, the UAW agreed to give up more than $650 million in wages and benefits to keep the company in business. General Motors and Ford were not facing bankruptcy but demanded and received similar concessions. In return for GM pledging not to close several U.S. factories, the UAW agreed to defer its cost-of-living adjustment and eliminate its annual improvement increases. Henceforth, as the productivity of the American economy increased, the wages of American workers would not increase with it. Tide and boats parted company.
Democrats as well as Republicans responded to the inflation of the late 1970s with policies that significantly reduced workers' incomes. The Democrats' solution of choice, promoted by both President Jimmy Carter and his liberal rival Senator Edward Kennedy, was deregulation. At their initiative, both trucking and airlines were deregulated, lowering prices and wages in both industries. In the quarter-century following 1975, drivers' pay fell by 30 percent. Wage declines followed in other deregulated industries, such as telecommunications.
If Volcker's and Carter's attacks on unions were indirect, Reagan's was altogether frontal. In the 1980 election, the union of air-traffic controllers was one of a handful of labor organizations that endorsed Reagan's candidacy. Nevertheless, the controllers could not reach an accord with the government, and when they opted to strike in violation of federal law, Reagan fired them all. (His actions contrasted sharply with those of President Nixon, who responded to an illegal wildcat strike of postal workers in 1970 by negotiating a settlement and letting them return to their jobs.)
Reagan's union busting was quickly emulated by many private-sector employers. In 1983, the nation's second-largest copper-mining company, Phelps Dodge, ended its cost-of-living adjustment, provoking a walkout of its workers, whom it replaced with new hires who then decertified the union. The same year, Greyhound Bus cut wages, pushing its workers out on strike, then hired replacements at lower wages. Also in 1983, Louisiana Pacific, the second-largest timber company, reduced its starting hourly wage, forcing a strike that culminated in the same kind of worker defeats seen at Phelps Dodge and Greyhound. Eastern Airlines, Boise Cascade, International Paper, Hormel meatpacking: all went down the path of forcing strikes to weaken or destroy their unions.
In the topsy-turvy world of the 1980s, the strike had become a tool for management to break unions. Save in the most exceptional circumstances, unions abandoned the strike. The number of major strikes plummeted from 286 a year in the 1960s and 1970s, to 83 a year in the 1980s, to 34 a year in the 1990s, to 20 a year in the 2000s. The end of the strike transformed the American economy. From the 1820s through the 1970s, workers had two ways to bid up their wages: threatening to take their services elsewhere in a full-employment economy and walking off the job with their fellow workers until managers met their demands. Since the early 1980s, only the full-employment-economy option has been available, and just barely. Save for the late 1990s, the economy has been nowhere near full employment.
The loss of workers' leverage was compounded by a radical shift in corporations' view of their mission. In August 1981, at New York's Pierre Hotel, Jack Welch, General Electric's new CEO, delivered a kind of inaugural address, which he titled "Growing Fast in a Slow-Growth Economy." GE, Welch proclaimed, would henceforth shed all its divisions that weren't No. 1 or No. 2 in their markets. If that meant shedding workers, so be it. All that mattered was pushing the company to pre-eminence, and the measure of a company's pre-eminence was its stock price.
Between late 1980 and 1985, Welch reduced the number of GE employees from 411,000 to 299,000. He cut basic research. The company's stock price soared. So much for balancing the interests of employees, stockholders, consumers, and the public. The new model company was answerable solely to its stockholders.
In the decade preceding Welch's speech, a number of conservative economists, chiefly from the University of Chicago, had argued that the midcentury U.S. corporation had to contend with a mishmash of competing demands. Boosting the company's share price, they contended, gave corporate executives a clear purpose (even clearer if those executives were incentivized by receiving their payments in stock). After Welch's speech, the goal of America's corporate executives became the elevation of the company's (and their own) stock. If revenues weren't rising, and even if they were, that goal could be accomplished by reducing wages, curtailing pensions, making employees pay more for their health coverage, cutting research, eliminating worker training, and offshoring production.
(After CEO Louis Gerstner announced in 1999 that IBM, long considered a model employer, would no longer provide its workers with defined-benefit pensions and would switch to 401(k)s, corporate America largely abandoned paying for its employees' secure retirements.)
By the end of the century, corporations acknowledged that they had downgraded workers in their calculus of concerns. In the 1980s, a Conference Board survey of corporate executives found that 56 percent agreed that "employees who are loyal to the company and further its business goals deserve an assurance of continued employment." When the Conference Board asked the same question in the 1990s, 6 percent of executives agreed. "Loyalty to a company," Welch once said, "it's nonsense."
In 1938, while campaigning, successfully, to persuade Congress to establish a federal minimum wage, President Franklin D. Roosevelt told a crowd in Fort Worth, "You need more industries in Texas, but I know you know the importance of not trying to get industries by the route of cheap wages for industrial workers." In fact, Southern business and political leaders knew nothing of the sort. What prevented most American corporations from establishing facilities in the South was its oppressive weather and even more oppressive racial discrimination.
By the 1970s, the South was both air-conditioned and moderately desegregated. The decade was the first in the 20th century that saw more Americans moving into the region than leaving it. Indeed, during the 1970s, just 14 percent of newly created jobs were located in the Northeast and Midwest, while 86 percent were located in the Sunbelt.
The definitive Southern company, and the company that has done the most to subject the American job to the substandard standards of the South, has been Wal-Mart, which began as a single store in Rogers, Arkansas, in 1962. That year, the federal minimum wage, set at $1.15 an hour, was extended to retail workers, much to the dismay of Sam Walton, who was paying the employees at his fast-growing chain half that amount. Since the law initially applied to businesses with 50 or more employees, Walton argued that each of his stores was a separate entity, a claim that the Department of Labor rejected, fining Walton for his evasion of federal law.
Undaunted, Wal-Mart has carried its commitment to low wages through a subsequent half-century of relentless expansion. In 1990, it became the country's largest retailer, and today the chain is the world's largest private-sector employer, with 1.3 million employees in the United States and just under a million abroad. As Wal-Mart grew beyond its Ozark base, it brought Walton's Southern standards north. In retailing, payroll generally constitutes between 8 percent and 12 percent of sales, but at Wal-Mart, managers are directed to keep payroll expenses between 5.5 percent and 8 percent of sales. Managers who fail at this don't remain managers for long. While Wal-Mart claims the average hourly wage of its workers is $12.67, employees contend it is several dollars lower.
When a Wal-Mart opens in a new territory, it either drives out the higher-wage competition or compels that competition to lower its pay. David Neumark, an economist at the University of California, Irvine, has shown that eight years after Wal-Mart comes to a county, it drives down wages for all (not just retail) workers until they're 2.5 percent to 4.8 percent below wages in comparable counties with no Wal-Mart outlets.
By controlling a huge share of the U.S. retail market, including an estimated 20 percent of the grocery trade, Wal-Mart has also been able to mandate reduced prices all along its worldwide supply chain. In response, manufacturers have slashed the wages of their employees and gone abroad in search of cheaper labor. The warehouse workers who unload the containers in which the company's goods are shipped from China to the U.S. and repackage them for sale are retained by low-wage temporary employment agencies, though many of those workers have held the same job for years.
Shunting its workers off to temp agencies is just one of the many ways Wal-Mart diminishes what it sees as the risk of unionization. When the employees in one Canadian store voted to unionize, Wal-Mart closed the store. When butchers in one Texas outlet voted to go union, Wal-Mart eliminated the meat department in that store and in every other store in Texas and the six surrounding states. But Wal-Mart's antipathy to unions and affinity for low wages merely reflect the South's historic opposition to worker autonomy and employee rights. By coming north, though, Wal-Mart has lowered retail-sector wages throughout the U.S.
The more recent influx of European- and Japanese-owned nonunion factories to the South has had a similar effect. In their homelands, Mercedes, Volkswagen, and Toyota work closely with unions, and the German companies pay their workers as much as or more than the most highly paid American autoworkers. When such companies move into the American South, however, they go native, not only paying their workers far less than they do in Europe or Japan but also opposing their efforts to form a union. (Under pressure from the German autoworkers union, however, Volkswagen has recently committed itself to establishing a consultative works council at its Tennessee plant. Such councils are standard at Volkswagen plants in Germany and other nations; in the U.S., the particulars of American labor law require that the company recognize the UAW as the workers' representative.)
One way these factories reduce workers' wages is not to employ them directly. By the estimate of one former manager, roughly 70 percent of the workers at Nissan's plant in Smyrna, Tennessee, aren't Nissan employees but rather are under contract to temporary employment-service companies that pay them roughly half the hourly wage of Nissan's own employees. One academic survey found that while just 2.3 percent of manufacturing workers in 1989 were temps, by 2004 the share had risen to 8.7 percent.
Southern competition is one reason newer hires at the Detroit Three's auto plants have hourly wages that top out between $16 and $19, while workers hired before the institution of the two-tier system can see their base pay rise to between $29 and $33 an hour. A cumulative effect of Wal-Martization is that incomes in the industrial Midwest have been dropping toward levels set in Alabama and Tennessee. According to Moody's Analytics, the wage-and-benefit gap between Midwestern and Southern workers, which was $7 in 2008, had shrunk to just $3.34 by the end of 2011.
As corporate executives came under pressure to reward shareholders by cutting labor costs, the revolution in transportation and communication enabled them to move production facilities to the developing world, where workers came cheap. The flight of jobs to low-wage nations was accelerated by a series of trade accords, most prominently the North American Free Trade Agreement in 1993 and the extension of Permanent Normal Trade Relations to China in 2000.
The textile and apparel industry lost more than 900,000 jobs in the 1990s and 2000s. High-tech manufacturing was not spared, either. The computer and electronics-manufacturing sector lost an estimated 760,000 jobs during that time. By offshoring the production of its iPhone to the Chinese labor contractor Foxconn, Apple has realized a profit margin of 64 percent on each device, one of many reasons its stock price soared. From 2000 to 2010, the number of Americans employed in manufacturing shrank from 17.1 million to just 11.3 million. In 2011, the number of workers in the low-paying retail sector surpassed the number in manufacturing for the first time.
The decimation of manufacturing wasn't due to a sharp acceleration of manufacturing productivity; indeed, productivity increases were higher in the previous decade, which saw less job loss. What made the difference was trade policy. Economist Rob Scott has calculated that the United States lost 2.4 million jobs to China alone in the eight years following the passage of Permanent Normal Trade Relations.
Offshoring has had an even broader effect on the jobs that have remained behind. Alan Blinder, the Princeton economist who was vice chairman of the Federal Reserve in the 1990s, has estimated that roughly 25 percent of all American jobs are potentially offshorable, from producing steel to writing software to drafting contracts. This has placed a ceiling on wages in these and myriad other occupations that can be sent overseas.
Economists have long thought that labor's share of the national income varied so little that it could be considered a constant. The immovability of labor's share was called "Bowley's Law," after the British economic historian Arthur Bowley, who first identified it nearly a century ago.
In the wake of the economic collapse of 2008, Bowley's Law was swept away, along with many of the economic standards that had characterized American life. Today, the share of the nation's income going to wages, which for decades was more than 50 percent, is at a record low of 43 percent, while the share of the nation's income going to corporate profits is at a record high. The economic lives of Americans today paint a picture of mass downward mobility. According to a National Employment Law Project study in 2012, low-wage jobs (paying less than $13.83 an hour) made up 21 percent of the jobs lost during the recession but more than half of the jobs created since the recession ended. Middle-income jobs (paying between $13.84 and $21.13 hourly) made up three-fifths of the jobs lost during the recession but just 22 percent of the jobs created since.
In 2013, America's three largest private-sector employers are all low-wage retailers: Wal-Mart, Yum! Brands (which owns Taco Bell, Pizza Hut, and Kentucky Fried Chicken), and McDonald's. In 1960, the three largest employers were high-wage unionized manufacturers or utilities: General Motors, AT&T, and Ford.
The most telling illustration of the decline of Americans' work life may be that drawn by economists John Schmitt and Janelle Jones of the Center for Economic and Policy Research. They calculated the share of good jobs Americans held in 1979 and in 2010. If only because workers in 2010 were, on average, seven years older and more educated than their 1979 counterparts, they should have been doing better. The two economists set three criteria for a good job: that it paid at least the 1979 male median wage ($37,000 in 2010 dollars), provided health benefits, and came with a 401(k) or pension. By those standards, 27.4 percent of American workers had good jobs in 1979. Three decades later, that figure had dropped to 24.6 percent.
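The logic of that measure is simple enough to spell out. The sketch below is only illustrative: it uses a handful of made-up worker records rather than the survey data Schmitt and Jones actually analyzed, and the worker fields and sample numbers are invented for the example; only the three criteria and the $37,000 wage floor come from the study as described above.

```python
# Illustrative sketch of the Schmitt-Jones "good job" test.
# The data below is hypothetical; the three criteria are from the article.

from dataclasses import dataclass

GOOD_JOB_WAGE_FLOOR = 37_000  # 1979 male median wage, in 2010 dollars


@dataclass
class Worker:
    annual_pay: float          # annual earnings in 2010 dollars
    has_health_coverage: bool  # employer-provided health benefits
    has_retirement_plan: bool  # 401(k) or pension


def is_good_job(w: Worker) -> bool:
    """A job counts as 'good' only if it clears all three bars."""
    return (
        w.annual_pay >= GOOD_JOB_WAGE_FLOOR
        and w.has_health_coverage
        and w.has_retirement_plan
    )


def good_job_share(workers: list[Worker]) -> float:
    """Fraction of workers holding a good job."""
    return sum(is_good_job(w) for w in workers) / len(workers)


# Four hypothetical workers; only the first clears all three bars.
sample = [
    Worker(52_000, True, True),    # good job
    Worker(41_000, True, False),   # pay and health, no retirement plan
    Worker(29_000, False, False),  # low wage, no benefits
    Worker(38_000, False, True),   # pay and retirement plan, no health coverage
]

print(f"Good-job share: {good_job_share(sample):.1%}")  # -> 25.0%
```

Applied to representative survey data rather than this toy sample, the same tally is what yields the 27.4 percent (1979) and 24.6 percent (2010) figures cited above.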
The decline of the American job is ultimately the consequence of the decline of worker power. Beginning in the 1970s, corporate management was increasingly determined to block unions' expansion to any regions of the country (the South and Southwest) or sectors of the economy (such as retail and restaurants) that were growing. An entire new industry sprang up: consultants who helped companies defeat workers' efforts to unionize. Although the National Labor Relations Act prohibits the firing of a worker involved in a union-organizing campaign, the penalties are negligible. Firings became routine. Four efforts by unions to strengthen workers' protections during the Johnson, Carter, Clinton, and Obama presidencies came up short. By 2013, the share of private-sector workers in unions had declined to just 6.6 percent, and collective bargaining had been effectively eliminated from the private-sector economy.
The collapse of workers' power to bargain helps explain one of the primary paradoxes of the current American economy: why productivity gains are not passed on to employees. "The average U.S. factory worker is responsible today for more than $180,000 of annual output, triple the $60,000 in 1972," University of Michigan economist Mark Perry has written. "We're able to produce twice as much manufacturing output today as in the 1970s, with about seven million fewer workers." In many industries, the increase in productivity has exceeded Perry's estimates. "Thirty years ago, it took ten hours per worker to produce one ton of steel," said U.S. Steel CEO John Surma in 2011. "Today, it takes two hours."
In conventional economic theory, those productivity increases should have resulted in sizable pay increases for workers. Where conventional economic theory falls short is in its failure to factor in the power of management and stockholders and the weakness of labor. Sociologist Tali Kristal has documented that the share of revenues going to wages and benefits in manufacturing has declined by 14 percent since 1970, while the share going to profits has correspondingly increased. She found similar shifts in transportation, where labor's share has been reduced by 10 percent, and construction, where it has been cut by 5 percent. What these three sectors have in common is that their rate of unionization has been cut in half during the past four decades. All of which is to say, gains in productivity have been apportioned by the simple arithmetic of power.
Only if the suppression of labor's power is made part of the equation can the overall decline in good jobs over the past 35 years be explained. Only by considering the waning of worker power can we understand why American corporations, sitting on more than $1.5 trillion in unexpended cash, have used those funds to buy back stock and increase dividends but almost universally failed even to consider raising their workers' wages.
So was the America of 1947–1974, the America of the boomers' youth, the great exception in the nation's economic history, a golden age that came and went and can never come again? Were the conditions that led to the postwar boom and its egalitarian prosperity so anomalous that the American economic success story will continue to recede in our rearview mirrors? Are the forces of globalization and robotization inevitably going to raise the incomes of the few and depress the incomes of the many?
That the American supremacy over the global economy in the three decades after World War II was a one-time phenomenon is a given. That globalization and automation have made and will continue to make massive changes in America's economy is obvious. But it's worth noting that one high-wage advanced manufacturing nation has seen its workers thrive in the past 40 years: Germany. Like American multinationals, all the iconic German manufacturers-Daimler, Siemens, BASF, and others-have factories scattered across the globe. Unlike the American multinationals, however, they have kept their most remunerative and highest-value-added production jobs at home. Nineteen percent of the German workforce is employed in manufacturing, well above the 8 percent of the American workforce. German industrial workers' wages and benefits are about one-third higher than Americans'. While the U.S. runs the world's largest trade deficit, Germany runs a surplus second only to China's and occasionally surpasses it.
To be sure, Germany's identity is more wrapped up in manufacturing than America's is, but that's because of national arrangements that not only bolster manufacturing through such policies as excellent vocational education but also give workers more power. By law, all German companies with more than 1,000 employees must have equal numbers of worker and management representatives on their corporate boards. For the most part, German companies don't get their funding by issuing stocks and bonds; they generate investment internally or borrow from banks, and the role of the shareholder is insignificant. By practicing a brand of capitalism in which employees and communities still matter, Germany has been able to subject itself to the same forces of globalization that the United States has without substantially diminishing its workers' power and income.
What has vanished over the past 40 years isn't just Americans' rising incomes. It's their sense of control over their lives. The young college graduates working in jobs requiring no more than a high-school degree, the middle-aged unemployed who have permanently opted out of a labor market that has no place for them, the 45- to 60-year-olds who say they will have to delay their retirement because they have insufficient savings: all these and more are leading lives that have diverged from the aspirations that Americans until recently believed they could fulfill. This May, a Pew poll asked respondents if they thought that today's children would be better or worse off than their parents. Sixty-two percent said worse off, while 33 percent said better. Studies that document the decline of intergenerational mobility suggest that this newfound pessimism is well grounded.
The extinction of a large and vibrant American middle class isn't ordained by the laws of either economics or physics. Many of the impediments to creating anew a broadly prosperous America are ultimately political creations that are susceptible to political remedy. Amassing the power to secure those remedies will require an extraordinary, sustained, and heroic political mobilization. Americans will have to transform their anxiety into indignation and direct that indignation to the task of reclaiming their stake in the nation's future.