This article appeared in the Summer 2016 issue of The American Prospect magazine.
After experiencing substantial wage gains during the shared-growth decades of the postwar era, American workers have increasingly confronted labor markets of precarious jobs that pay too little to provide a minimally decent standard of living. This reality has finally broken through politically in the movement for a $15 federal minimum wage. However, some prominent economists contend that a minimum wage high enough to provide a decent standard of living poses too high a risk of job loss.
But this fear is purely speculative; we have no reliable evidence that a $15 wage floor, phased in over four to six years, would cause declining employment opportunities for low-wage workers. Indeed, the wage threshold at which substantial employment effects are likely to occur may be considerably higher. What we do know is that a $15 wage would have big impacts on the living standards of millions of working families. The recent commitments of California and New York state to establish a $15 minimum are estimated to increase the income of more than one-third of the workers in each state. The effects on consumer demand, and consequently on other low-wage employment, will be enormous.
For decades, the dominant premise has been that the right criterion for evaluating proposed minimum-wage hikes is whether they would produce any job loss at all—in any establishment anywhere. However, maintaining a strict zero-job-loss standard is the wrong approach. We should be focusing on overall employment effects for low-wage workers, and better yet, net benefits for all low-wage working families after taking into account improved incomes for the vast majority. Moreover, any potential modest negative employment impacts could be offset by the right macroeconomic and public investment policies.
Taking the longer view, a high-productivity and high-wage path necessarily requires getting rid of low-productivity, low-wage jobs. That should not be seen as a bad outcome. We don’t see it this way when workplace safety, child labor, or environmental rules raise labor costs. Few would argue that we should apply a zero-job-loss criterion to the adoption of new technologies, foreign investments, or imports. Why should a zero-job-loss standard apply only to minimum-wage policy, especially when the main beneficiaries would be the lowest paid, who are at least one-third of all workers? The sensible approach is to ensure that change takes place at a pace that allows for workers to adjust, with the help of complementary policies (public jobs, retraining) that together ensure that low-wage workers do not bear the burden of the transition to a living-wage economy.
IT IS WELL ESTABLISHED THAT America’s productivity growth since the late 1970s has not benefited the vast majority of workers. In 2014, the average wages at the 10th, 20th, and 30th percentiles were just $8.62, $10.08, and $12.09, respectively—essentially what they earned in inflation-adjusted terms almost four decades earlier, in 1979. Even the median wage (the 50th percentile) increased by just 85 cents between 1979 and 1999 ($16.02 to $16.87), and just 3 cents more since 1999, reaching $16.90 in 2014.
This has been a particularly disastrous period for young workers (ages 18 to 34) without a college degree. For these workers, the low-wage share of employment increased from 36.1 percent in 1979 to 61.4 percent in 2014 (the dollar value of the low-wage threshold is $12.50 in 2014 dollars, using the conventional cutoff of two-thirds of the median full-time wage).
Not since the Depression years of the 1930s has the majority of the American workforce faced such a low-paying, precarious labor market. This insecurity is compounded by the shift to contingent jobs, where full-time work is the exception.
It is popular among economists to blame globalization and technology—intrinsic features of capitalist development that no sane policy-maker would want to block in any fundamental way. The answer, economists and pundits say, is more education and training. But this orthodox reasoning, consistent with textbook stories and convenient for low-wage employers, flies in the face of history, theory, and evidence.
Though skills and earnings are broadly correlated and an economy is clearly better off with higher-skilled workers than lower-skilled ones, the long debate about whether skills are the main driver of recent widening inequality is now largely settled. Stagnant and declining wages are mainly the result of the deliberate weakening of equalizing institutions, from wage regulation to trade unions, and not the result of increasing demand for skills. Even prominent advocates of the skills hypothesis, such as MIT’s David Autor, have lately conceded that these institutional factors are a major part of the story.
The historical record shows that the American labor market also failed to provide a minimally decent standard of living for vast numbers of working families in the decades before the Great Depression. This was recognized by leading mainstream economists of the day, including John Bates Clark (after whom the American Economic Association’s award for the top economist under the age of 40 is named). In 1913, Clark reluctantly supported a statutory minimum wage to counter the effects of the “hunger discipline” of unregulated labor market competition, which systematically bid the prevailing wage down to subsistence levels and below—well below what Clark called the worker’s “marginal productivity” (the extra value generated by the worker).
In Britain, Prime Minister David Lloyd George argued in 1919 that a broader view of economic efficiency required a minimum wage (it took almost a century until Britain finally got one, in 1999):
Every worker should be ensured a minimum wage which will enable him or her to maintain a becoming standard of life for himself and his family. Apart altogether from considerations of humanity it is in the highest interest to the State that children should be brought up under conditions that will make them fit and efficient citizens.
This recognition that the bottom of the labor market, absent offsetting legislation or collective bargaining, would be pushed down toward a subsistence wage has a long pedigree. Well over two centuries ago, Adam Smith was unequivocal: As long as workers bargained individually, employers would always have the bargaining advantage. And in competitive product markets, employers would be forced to take the low road and pay the lowest possible wage.
The outcome is the widespread payment of below-subsistence wages whenever there is a surplus pool of workers—a condition that has characterized capitalist labor markets since Smith’s time. This is a classic case of “rational irrationality,” where what is rational, and indeed necessary, in unregulated labor markets is irrational from a social and economic-efficiency perspective. This is why the great social reformer Sidney Webb in 1912 called the payment of below-subsistence wages a “vicious form of parasitism.” The same is true today: Taxpayers heavily subsidize Walmart’s near–minimum wage policy. (See “Confronting the Parasite Economy” by Nick Hanauer.)
Modern technology and globalization have contributed to this historical pattern. New computer-based labor-saving production technologies tend to reduce demand for less-skilled workers; technological advances and lower communication and transportation costs facilitate offshoring of work to the lowest-wage locations. The antidote is not just higher skills, but wage regulation and countervailing worker power, to put limits on low-road employer policies—as J.B. Clark, Lloyd George, Sidney Webb, and many other leading figures understood over a century ago.
THE STATUTORY MINIMUM WAGE can help keep full-time workers out of poverty. An appropriately designed wage floor also increases the incentive to work, reduces wage and income inequality, and lessens the need for means-tested social assistance for working-poor families. But this has not been the path of the American federal minimum wage, which has collapsed in value from $9.54 in 1968 to $8.81 in 1989 to $7.25 in 2014 (in 2014 dollars), where it remains today, even as productivity has more than doubled. Today’s minimum wage is just 37 percent of the median wage, which has been essentially flat for four decades.
The experience of other countries demonstrates that a race to the bottom is not the only way to run an efficient, technologically advanced economy. With average incomes and productivity levels continuing to rise, the incidence of very low pay in a country is explained not by exposure to new production technologies and global competition, but by political choices over how to regulate the low-wage labor market. In France and Australia, to take just two examples, the minimum wage relative to the median wage is far higher than America’s 37 percent: It’s 53 percent in Australia and 61 percent in France (2014).
Doesn’t a higher minimum wage depress employment, especially among the low-skilled? Evidently not. If we compare the prevalence of low pay and the employment rates of young workers across 18 affluent nations, it’s simply not the case that lower wages at the bottom produce more jobs. As the chart shows, there is no correlation at all. While these numbers show a 14 percentage-point gap in the low-wage share of employment between France (with 11 percent) and the U.S. (with 25 percent), the employment rates for young, less-educated workers are nearly the same. Similarly, Australia’s incidence of low pay (15.8 percent) is about 9 percentage points below the U.S. level, but its low-education employment rates are about 7 points higher. Denmark poses the strongest challenge to the orthodox prediction: a low-wage share of employment below 8 percent, a full 17 points below the 25 percent of the U.S., and still a superior employment rate for young workers with less education. Above all else, what explains these outcomes is the imposition of a wage floor, mandated either through collective bargaining or by statute.
It is also misleading to pose a higher minimum wage and more social income as policy alternatives. We clearly need both. Other affluent countries that choose a high-wage path for their workers also provide a much higher “social wage” in the form of universal (not means-tested) support for health, housing, education, and especially child care. In the current context, the legal wage floor in the U.S. must carry a much higher burden for maintaining minimally decent family incomes than in other affluent countries, and yet the U.S. is an extreme bottom-end outlier when it comes to income support. The fact that the U.S. has both a lower minimum wage and less social income reflects a common factor—the balance of political power.
THE FEDERAL MINIMUM WAGE was first established in 1938 by the Fair Labor Standards Act (FLSA) to ensure a “minimum standard of living necessary for health, efficiency, and general well-being of workers.” The debate over minimum-wage legislation in the 1930s focused on the constitutional right of the federal government to intervene in private voluntary contracting between two parties (workers and firms), on the issue of interference in local and state economic affairs, and on the consequences for Southern regional competitiveness. But above all, opponents before and since the FLSA have made the case against a meaningful wage floor on the grounds that it will have the perverse effect of harming the very workers it aims to benefit. As the FLSA also states, the objective of a decent standard of living should be achieved “without substantially curtailing employment.”
Protesters gather outside a Burger King restaurant during a demonstration by fast-food workers and activists calling for the federal minimum wage to be raised to $15, Wednesday, April 15, 2015, in College Park, Georgia.
After a long political struggle, the compromise was a nationwide minimum wage set at just 25 cents an hour (President Franklin D. Roosevelt and Labor Secretary Frances Perkins’s goal was 40 cents). This amount was equivalent to about $4.20 in inflation-adjusted 2015 dollars, and the law covered only about one-fifth of the workforce. The final minimum-wage policy contained no formula for setting future wage floors, nor any mechanism to index the floor to inflation. Accordingly, any future increase would require an act of Congress, which guaranteed a political battle just to maintain the purchasing power of the minimum wage, much less make it high enough to ensure that full-time work could sustain “the general well-being of workers.”
In response to congressional inaction over the years, many states and localities have legislated large increases in their own statutory minimum wage. California and New York state passed large increases in their statewide minimum wage rates in early 2016. California’s wage will be raised in increments from the current $10 per hour until it reaches $15 by 2022. The New York rate will reach $15 by the end of 2018 for New York City employers with 11 or more employees. Even strongly Republican states have recently passed large minimum-wage increases. At least eight cities, including Seattle, San Francisco, and Los Angeles, are scheduled to raise the municipal minimum wage to around $15 over the next several years.
According to the National Employment Law Project (NELP), 42 percent of all workers earn less than $15 per hour—a wage barely adequate for a single full-time worker, much less for a family with dependent children. What would a $15 minimum wage do for a working family with just one full-time worker? In recent years, a number of “basic-needs budgets” have been developed, designed to estimate the costs that a wage must cover to provide a minimally decent standard of living for each of many different family types.
The chart shows these costs, as calculated by the Economic Policy Institute, for two family types in nine metropolitan areas: a single worker, and a single worker with one dependent child. These calculations indicate that a wage of about $13.45 is necessary for a single worker in Colorado Springs, but $15.67 is required in Chicago and $21.07 in Washington, D.C. With a dependent child, the basic-needs wages in these cities jump to $24.90, $26.40, and $39.35, respectively. In short, a $15 wage would not come close to covering a basic-needs budget for a family with a single dependent child anywhere in the United States.
How do these figures compare to the current proposals for large hikes in the minimum wage over the next four to six years? The chart shows the current values of two proposed target wage floors. One is a 2020 federal minimum wage of $12, which is the equivalent of $10.92 in 2016 (based on CBO inflation projections); the second is a $15 minimum wage that would be fully phased in by 2021—about $13.34 in today’s dollars. As the figure shows, even the more generous $15 proposal would not cover even a basic-needs budget for a single worker alone in any of these cities, much less a family with a dependent child. From a standard-of-living perspective, a $15 minimum wage phased in over four to six years seems barely adequate.
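The conversion from a future nominal target to today’s dollars is simple compound discounting. The sketch below assumes a flat annual inflation rate of about 2.4 percent (an assumption on our part—the article does not state the CBO rate it uses), which lands close to the $10.92 and $13.34 figures cited above.

```python
# Deflate a future nominal minimum-wage target into 2016 dollars.
# ASSUMPTION: a flat 2.4 percent annual inflation rate, roughly in line with
# CBO projections of the period; the article's exact rate is not given.
INFLATION = 0.024

def in_2016_dollars(nominal_wage, years_ahead):
    """Discount a nominal wage 'years_ahead' years back to 2016 purchasing power."""
    return nominal_wage / (1 + INFLATION) ** years_ahead

# A $12 federal minimum reached in 2020 (4 years out) and a $15 minimum in 2021 (5 years out):
print(round(in_2016_dollars(12, 4), 2))  # close to the article's $10.92
print(round(in_2016_dollars(15, 5), 2))  # close to the article's $13.34
```

Because the assumed rate is approximate, the outputs differ from the article’s figures by a cent or two.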
HOW RISKY IS A PHASED-IN $15 wage floor? We don’t know for certain, but we can begin with some historical evidence. On January 25, 1950, the wage floor was increased by 87.5 percent, from 40 cents to 75 cents. This represented a sudden increase in the ratio of the minimum wage to the average hourly earnings of non-farm private-sector workers, from 31.4 percent in 1949 to 56.2 percent in 1950.
What was the impact on jobs? Teenage (ages 16 to 19) unemployment rates fell, from 15.8 percent in October 1949 (three months before the increase) to 15.2 percent in February 1950 (one month after), and then fell further to just 12 percent in April (three months after)—still before the call-ups began for the Korean War. A year later, in April 1951, the teenage unemployment rate was down to 7.9 percent. Over the same period (October 1949 to April 1951), the overall unemployment rate fell from 7.9 percent to 3.1 percent. Much the same story can be told for the 33.3 percent increase in the minimum wage that took place on March 1, 1956. Other factors simply swamped the impact of a higher wage floor.
Similarly, as the New York State Department of Labor has put it, “New York increased its minimum wage eight times from 1991 through 2015 and six of those times, the data show an employment uptick following an increase in the state’s minimum wage.” The lesson of these examples is that any tendency for negative employment effects can be swamped by other factors—most importantly by the strength of the macroeconomy.
According to the U.K.’s Low Pay Commission, which was charged by the government to use a criterion of no “significant negative effect on employment or the economy,” rather than strictly zero job loss, in researching the impact of minimum-wage legislation that has been on the books since 1999:
Since 1999 the Low Pay Commission has commissioned over 130 research projects that have covered various aspects of the impact of the National Minimum Wage on the economy. In that period the low paid have received higher than average wage increases but the research has, in general, found little adverse effect on aggregate employment; the relative employment shares of the low-paying sectors; individual employment or unemployment probabilities; or regional employment or unemployment differences.
In the U.S., surveys report that employers are able to recoup some of the cost of higher wages in the form of lower turnover, less absenteeism, and increased productivity. Studies of the Los Angeles airport estimate that the institution of a living wage reduced turnover by up to 17 percentage points. A study of home-care workers covered by a living-wage increase in California found that turnover decreased 57 percent after the wage was implemented. Studies for citywide minimum-wage laws find similar results. Economists Sylvia Allegretto and Michael Reich examined the effects of a 25 percent hike in the minimum wage on restaurant prices in San Jose, California, and found no negative employment effects.
A phased increase from $7.25 to $15 would in fact be an increase from at least $9 in 12 states (in 21 states, only the $7.25 federal wage floor applies), and the annual percentage increase for every state would be far less than the 1950 and 1956 increases. In terms of the absolute level of a $15 wage, it is instructive that Costco has just announced that starting pay for all workers in its U.S. stores is now at least $13, which compares to the current value of $13.34 for a $15 wage in 2021. The alternative business model is Walmart, whose low-pay strategy has netted the company fantastic profits; the Walton family is said to be worth $150 billion, and four family members recently made Forbes’s list of the top 20 richest Americans.
Protestors march in support of raising the minimum wage to $15 an hour as part of an expanding national movement known as Fight for 15, Wednesday, April 15, 2015, in Miami.
The closest thing we have to a reliable estimate of the net effects of a $15 wage is a recent comprehensive study on the New York state proposal done by the UC Berkeley Institute for Research on Labor and Employment. Like the results calculated by EPI’s David Cooper, this study estimates that raising New York’s wage floor from $9 to $15 will increase earnings for about 3.2 million workers and increase average pay for those getting raises by 23 percent (about $5,000 a year). While the study’s model suggests there may be as many as 78,000 lost jobs, these are expected to be more than offset by the (more certain) gains from increased consumer demand, leading to a slight positive net effect (3,000 jobs).
The study concludes, “In the end, the costs of the minimum wage will be borne by turnover reductions, productivity increases and modest price increases.” This list fails to mention the possibility of lower wage increases for higher-income employees and, most importantly, of lower but acceptable profit margins for business firms.
There is another benefit. Means-tested social spending is increasingly tied to work, through the Earned Income Tax Credit, food stamps, and other programs. Much of America’s public assistance goes to working families with very low incomes, effectively subsidizing low-wage employers. According to the Berkeley Labor Center, about $13.1 billion is spent on public assistance for working families in New York state; EPI’s David Cooper estimates this figure to be about $8.7 billion.
A $15 New York state minimum wage would dramatically reduce this taxpayer subsidy to employers. Let’s say this increase in worker pay leads to a 25 percent decrease in means-tested public assistance to working families in New York, and let’s also assume a worst-case scenario in which 78,000 workers lose their jobs under a $15 wage, with zero offsetting job gains from increased consumer demand. This would make available between $28,154 (Cooper) and $48,875 (Labor Center) for each displaced worker. And 3.1 million workers would still get wage increases. The net benefits of a $15 minimum wage for the living standards of working families and taxpayers that come from a “high-road” economy could be huge.
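The worst-case arithmetic above reduces to a single ratio: assumed public-assistance savings divided by assumed displaced workers. Below is a minimal sketch using the article’s round inputs ($8.7 billion and $13.1 billion in assistance, a 25 percent reduction, 78,000 displaced workers); the published per-worker figures rest on more precise underlying estimates, so the outputs here are only in the same ballpark.

```python
# Back-of-envelope: public-assistance savings made available per displaced worker.
# ASSUMPTIONS (round figures from the article): statewide assistance to working
# families of $8.7B (Cooper) or $13.1B (Labor Center), a 25 percent reduction
# after the wage hike, and a worst case of 78,000 displaced workers.
SAVINGS_SHARE = 0.25
DISPLACED = 78_000

def savings_per_displaced_worker(total_assistance):
    """Assistance dollars freed up annually, divided across displaced workers."""
    return SAVINGS_SHARE * total_assistance / DISPLACED

for label, total in [("Cooper", 8.7e9), ("Labor Center", 13.1e9)]:
    print(f"{label}: ${savings_per_displaced_worker(total):,.0f} per displaced worker")
```

With these rounded inputs the Cooper figure comes out near the article’s $28,154; the Labor Center figure is lower than the published $48,875, reflecting the more detailed inputs behind the original estimate.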
THE PROGRESSIVE CASE FOR a substantial increase in the minimum wage should be reoriented from a “no-harm” framing to one that explicitly calls for a minimum living wage on broadly defined net-benefit grounds. This includes not just a higher wage’s net monetary benefits to working families’ standard of living, but also the many positive spillover effects of a “high-road” employment model. Decent pay helps working families avoid dependence on public spending that is stigmatizing and politically divisive, and would help end the current practice of subsidizing low-wage “race to the bottom” employment models that have increasingly characterized human-resources practices at for-profit, nonprofit, and government employers alike.
While a target like $15 can be good political strategy, the way to raise the statutory wage floor to the highest possible level without causing intolerably large employment effects is to establish an annual rate of increase, either in percentage or absolute-dollar terms. A commission, much like the U.K.’s Low Pay Commission, would closely monitor the wage hikes for effects on the standard of living of working families (given prevailing means-tested benefits, which will need to be adjusted), on business closures, and on job opportunities.
As Tony Atkinson, the eminent British researcher on inequality, has argued, effectively combating poverty and inequality often requires a change in the discourse. In the minimum-wage debate, the appropriate national legal wage floor should be set not by statistical contests over which particular wage threshold poses “little or no risk of job loss,” but by determining what wage will ensure a minimally decent standard of living from full-time work, and what complementary policies will ensure that any costs of job loss are adequately compensated.
If we really care about maximizing employment opportunities, we would put a much higher priority on full-employment fiscal and monetary macroeconomic policy, minor variations in which have had massively greater employment effects than even the highest statutory wage floors that have been proposed. But it is also well within our capabilities to counter any job loss with such remedies as extended unemployment benefits, education and training subsidies, public jobs programs, and other social income. A minimum living wage combined with meaningful child cash allowances would promote work incentives while all but eliminating both in-work poverty and child poverty. It would put the U.S. into waters that most other affluent nations have charted and are already navigating.
This article is based on “What’s the Right Minimum Wage? Reframing the Debate from ‘No Job Loss’ to a ‘Minimum Living Wage,’” a working paper co-authored by Kea Fiedler and Stephanie Luce, to be released by the Washington Center for Equitable Growth in June.