Drug-resistant “superbugs” are predicted to kill more people per year than cancer by 2050. Already, more than two million Americans are infected each year with bacteria that have evolved to resist antibiotics, and at least 23,000 die as a result. Given the nature of infectious disease, antibiotic resistance is increasingly seen as a threat to national security, with potential consequences for trade, global development, and even counterterrorism.
Despite the many incentives being thrown their way, however, major pharmaceutical companies are pulling out of antibiotic research and development at an alarming rate, citing a lack of market incentives. In an industry with a bad case of market-incentive syndrome, it is time to consider the establishment of public pharmaceutical companies as an antidote.
The increasing threat of superbugs arises in large part because antibiotic resistance is growing while the development of new antibiotics is on the decline. Antibiotic research and development (R&D) activity by Big Pharma has been falling for decades, grinding to an almost complete halt in the 2000s before pressure from governments and public health authorities led to a slight uptick in the development of new antibiotics.
The reason is simple: markets don't incentivize innovation in an area like antibiotics, where there's little money to be made. The very nature of antibiotics works against free-market motivations. Because they are intended to be taken for short periods of time, and are often curative, antibiotics are not nearly as profitable as medications developed to treat chronic diseases like diabetes or heart disease. They also tend to be priced lower than many other drugs. All of this means that antibiotic development offers a slow return on investment.
With American health care treated as an industry rather than a service, and pharmaceuticals becoming increasingly financialized, it's no surprise that investment in low-return drugs like antibiotics is dwindling. Nevertheless, the government has tried all its traditional forms of persuasion to entice Big Pharma to rise to the challenge of new antibiotic R&D. It has increased public funding for research and offered subsidies and longer patent protections. But with the announcements of Novartis, AstraZeneca, Eli Lilly, and others exiting this market—even after making very public commitments to the cause—it has become clear that this strategy has failed.
Ensuring that new antibiotic development keeps pace with the increasingly critical demand would require what is referred to as “delinkage”: severing the tie between R&D costs and the volume of sales. The ultimate form of delinkage would be to take the development and production of medicines like these into public hands and out of the market entirely. What's more, Americans across the political spectrum already support the idea of a public option in pharmaceuticals.
The broadening base of support for an alternative to business as usual in the pharmaceutical sector might have something to do with the industry's increasingly financialized business model, which puts profits over people. From the 1960s onward, companies across industries in the U.S. shifted from a “retain and reinvest” strategy, in which profits were reinvested in the company workforce and other productive assets, to one of “downsize and distribute,” in which companies downsized in order to distribute more profit to shareholders. Pharmaceuticals were no exception. This focus on maximizing shareholder value is proving to be antithetical to the innovation we need to combat the threat of antibiotic resistance and address other public health concerns.
According to a study by the Institute for New Economic Thinking, many large pharmaceutical companies “routinely distribute more than 100 percent of profits to shareholders, generating the extra cash by reducing reserves, selling off assets, taking on debt, or laying off employees.” None of these actions helps foster innovation in the pharmaceutical sector. To the contrary, as study author William Lazonick said, “There really is very little drug development going on in companies showing the highest profits and capturing much of the gains.”
Moreover, the “downsize and distribute” strategy seems to have gained steam in recent years in the U.S. pharmaceutical industry. A 2017 study showed that the industry has dramatically reduced its net tangible assets (physical assets like machinery, current inventory, and cash) over time, while increasing its intangible assets (patents, trademarks, etc.). Looking at pharmaceutical companies in the Fortune 1000 between 2002 and 2014, the study found that the industry “reduced its net tangible assets from $10 billion to approximately $4 billion,” with those assets plummeting to negative $1.5 billion for the 2005-to-2008 period—while still increasing earnings per share by 88 percent.
This evidence implies that the pharmaceutical industry has become an excellent example of a value-extracting industry feeding the needs of shareholder overlords rather than a value-creating industry that provides essential drug development. The innovation we need to produce the next generation of antibiotics and other life-saving medications requires the patient capital that public enterprise is much better equipped to provide than private equity.
In just one recent example of the primacy of value extraction in the industry, over the past five years pharmaceutical giant Merck spent $1.91 on stock buybacks and dividends for every dollar it earned in profits—30 percent more than it spent on R&D during that period. Even the CEO of BlackRock, the world's largest asset manager, has warned of the dangers this sort of rent-seeking behavior poses to the larger economy. In a 2014 letter to the leaders of S&P 500 companies, Laurence Fink wrote, “it concerns us that, in the wake of the financial crisis, many companies have shied away from investing in the future growth of their companies … [by cutting] capital expenditure and even increas[ing] debt to boost dividends and increase share buybacks.”
Fink was right to be concerned. Not only does this sort of financialization contribute to economic instability, it also deepens inequality, itself an important predictor of health outcomes. If the industry's business model not only exacerbates health disparities but also fails to produce the medications we need to treat illnesses like antibiotic-resistant infections, it is time for a real transformation of the sector, not more tinkering around the edges.
With public funds already supporting the majority of the basic research needed to develop new drugs, the industry is a natural choice for deprivatization. No amount of public shaming or pleading for the industry to behave like a good corporate citizen seems to do the trick. Already provided every advantage (monopoly patent rights, subsidies, restrictions on drug imports and resales, etc.), the industry still fails to adequately supply many of the medications we require. As noted health-care economist Uwe Reinhardt put it, rather than a success story of American enterprise, we should think of pharmaceutical companies as “fragile little birds that the protective hand of government carefully shields from the harsh vagaries of truly free, competitive markets.”
Perhaps it's time for us to stop protecting Big Pharma and to start outcompeting them.
The U.S. pharmaceutical industry may have been the perfect test case for the Chicago School's neoliberal project. Why, then, should it not be the test case for a new economic paradigm, one based on shared prosperity, equity, and sustainability? A democratically controlled pharmaceutical industry working for the public good would be a powerful example for—and an important pillar of—the new economy we so desperately need to ensure the long-term health and well-being of communities.