OpenAI is on a deal-making tear, despite the flop of its much-hyped GPT-5 release in August. This past month, OpenAI closed another funding round, lifting its valuation from $300 billion to $500 billion. It announced a partnership with Oracle in which OpenAI pledged to spend $300 billion over five years to finance a 4.5-gigawatt (GW) data center buildout as part of its Stargate mega-project. OpenAI and chip designer Broadcom announced a $10 billion partnership to build customized AI chips to handle the projected exponential growth in demand for generative AI tools and products. Then OpenAI and AMD unveiled a deal in which OpenAI will purchase 6 GW of AMD AI chips, worth roughly $90 billion; in a twist, AMD granted OpenAI warrants for 160 million shares of its stock, potentially amounting to 10 percent ownership of AMD. In total, OpenAI has secured $850 billion in data center investments, adding up to 17 GW of planned computing capacity—more power than the New York metropolitan area and its 20 million residents consume.

There’s just one problem with this master plan: OpenAI doesn’t have the money to pay for it. The Oracle deal alone commits OpenAI to roughly $60 billion a year in capital spending for five years. For reference, Meta, one of the most valuable and profitable companies in the world, brought in $164.5 billion in revenue in 2024, ended the year with $52.1 billion in free cash flow, and plans to spend $72 billion in 2025 building data centers. OpenAI, on the other hand, is on pace to bring in $12.7 billion this year, expects to lose $9 billion, predicts its losses will swell to $47 billion by 2028, and doesn’t expect to break even until 2029. How can OpenAI plan to spend nearly five times what it brings in?


As is often the case in the AI industry, Nvidia came to the rescue, announcing a $100 billion investment in OpenAI. Now OpenAI has the money to send to Oracle to build data centers, which Oracle will fill with Nvidia GPUs. So wait—is Nvidia just paying itself?

The Snake That Eats Itself

In fact, Nvidia is the king of this kind of deal, although $100 billion for a single company certainly stands out. It regularly makes equity investments in AI startups that are also its customers. Like the mythical ouroboros, the snake that eats itself, the AI economy turns equity investments from Nvidia and other companies into purchases of those same companies’ products, effectively self-funding their record revenues. So what happens when this recirculation can no longer paper over the shortage of cash available for the planned AI buildout?


Nvidia made similar equity stake deals with the startups CoreWeave and Lambda. Those startups’ number one cost is the computing power needed to train and deploy AI models, so much of Nvidia’s equity investment goes toward buying cloud computing. Although Google and Amazon have their own AI chips, Nvidia’s GPUs account for nearly 90 percent of the AI chip market. So the startup pays for cloud computing from a provider that filled its data center mostly with Nvidia chips. Nvidia’s money printer keeps whirring—not only do its equity investments get returned to it through chip purchases, but Nvidia books revenue that wouldn’t otherwise exist.

While demand is growing and the need for computing seems infinite, these circular flows pump valuations higher. But they present an equally potent downside risk should the AI bubble pop. Vendor financing, as these circular practices are called, was also a feature of the dot-com bubble. Cisco, the Nvidia of its day and, briefly, the most valuable company in the world, sold the networking equipment companies needed to access the new, fiber-fast internet speeds, just like Nvidia sells the chips companies need to run AI. While demand for fiber seemed infinite, Cisco began financing companies that would buy its networking equipment. But when the music stopped, Cisco was forced to swallow a then-record $2.2 billion write-down in 2001 for the returned and unused networking equipment it had financed. Its stock lost 80 percent of its value over the next two years and never again approached its dizzying dot-com-era heights.


A growing feature of this ouroboros AI economy is the involvement of public markets. The sums needed to train and run AI models are forcing OpenAI to turn to the deeper pools of liquidity in the stock market, as the recent deal between OpenAI and AMD underscores. The deal’s structure is unusual. OpenAI is planning to buy 6 GW of AMD’s AI chips, but there is a snag. The price tag for the chips is roughly $90 billion, which, as we saw above, OpenAI does not have. As of yet, it’s unclear how OpenAI will pay. However, AMD also agreed to issue OpenAI warrants to purchase 160 million of its shares at $0.01 per share in performance-based tranches. According to Reuters, “The first tranche will vest after the initial shipment of MI450 chips set for the second half of 2026. The remaining milestones include specific AMD stock price targets that escalate to $600 a share for the final installment of stock to unlock.”

In theory, OpenAI can exercise the warrants and sell the shares, funding the chip purchases with money it otherwise wouldn’t have. The bet for AMD is that as more AI chips get deployed, its valuation will rise, making up for the roughly 10 percent of its stock it handed OpenAI at $0.01 per share. So this is more circularity: more vendors finding ways to get their chips into the hands of companies that can’t pay for them in cash, all premised on the bet that demand and returns for AI will someday be big enough to fill in the holes in balance sheets along the way.

What Goes Up Must Come Down

OpenAI suddenly looks like Atlas, holding up the whole AI sector (and, potentially, the U.S. economy) with vague IOUs about the future. OpenAI is taking an “if we build it, they will come” approach to future AI demand. But just as with the circular funding agreements, there are gaping holes in the future OpenAI promises that spell trouble ahead.

After the Oracle and AMD deals were announced, Goldman Sachs analyzed OpenAI’s finances. Excluding its massive capital commitments, OpenAI’s operating expenses are projected to be $26 billion in 2026, with internal revenue covering 47 percent of the cost and vendor financing and external capital covering 27 percent and 25 percent, respectively. Once you include the $114 billion in capital commitments, however, the balance sheet becomes dangerously lopsided: internal revenue and vendor financing account for only 17 percent of costs, while external funding swells to 75 percent. And the cash crunch is set to get worse, as OpenAI predicts its losses will grow from $9 billion this year to $47 billion by 2028. OpenAI simply doesn’t have the money to meet its commitments.

There are other holes beyond OpenAI’s questionable financing math. Bain & Company released its annual technology report in September, estimating that the AI industry needs $2 trillion in annual revenue to complete the projected data center buildout by 2030. However, even factoring in cost savings from AI, Bain expects the industry will be $800 billion short, a 40 percent funding gap. It’s not easy to come up with an extra $800 billion, even for the money-printing Magnificent 7.

In addition, even if that revenue gap is filled, the biggest bottleneck may be delivering the electricity to power the data center buildout. According to a recent analysis by Morgan Stanley, data centers in the U.S. are projected to guzzle 57 GW of additional power between 2025 and 2028—equivalent to more than five New York Cities. But the power industry grows slowly, even in periods of intense demand. Between now and 2028, it will deliver only 21 GW of new capacity to data centers, a 36 GW shortfall against projected demand. In other words, less than half of the power needed will be available. Given these constraints, the $800 billion shortfall starts to look like an underestimate.

Simply put, the numbers don’t add up, and the tech industry is using circular flows that pull in billions from the stock market to try to patch the growing gaps. OpenAI doesn’t have the money to make good on its commitments and is turning to vendor financing and external capital to make up the difference; even if the projected data center buildout is pulled off, the industry faces an $800 billion revenue shortfall by 2030; and, anyway, the power industry is expected to deliver less than half of the electricity needed to complete the buildout. Oh, and by the way, the AI sector, which increasingly rests on OpenAI’s shoulders, is holding up the U.S. economy, with 80 percent of stock market gains in 2025 attributable to AI spending. Outside the tech sector, GDP growth from the rest of the economy was a measly 0.1 percent in the first half of 2025. It’s one thing to debate whether the tech industry can survive the AI bubble popping—which one analysis recently calculated had swelled to 17 times the size of the dot-com bubble and four times the 2008 housing bubble—but what we should really be asking ourselves is: Can the U.S. economy escape an economic cataclysm when the bubble bursts?

Bryan McMahon is an independent researcher and writer covering AI and the tech industry.