If you want to understand who the U.S. government serves, in all its corrupt and self-serving glory, you need only look at two ongoing fights in the National Defense Authorization Act, a bill setting military policy that’s critical to weapons manufacturers, and therefore is the only thing that passes like clockwork in Congress every year.
On the one side, a bipartisan Senate coalition led by Sens. Elizabeth Warren (D-MA) and Tim Sheehy (R-MT) managed to add the commonsense notion that the military should be allowed to repair the equipment it buys, building off a mandate instituted by the secretary of the Army. Needing someone to fly in every time a tank or aircraft carrier breaks wastes time and money, and serves as a second bite at the apple for lucrative military contractors. But because Congress is often just a pass-through for corporate America, the contractors’ lobbyists are blitzing Capitol Hill to secure their position as the military’s high-priced mechanics. It would “cripple innovation” to let troops fix their own stuff, apparently.
Meanwhile, our tech overlords are desperate to immunize themselves from oversight while they build their new toy on a mountain of dodgy accounting. Their effort to put a moratorium on all state-based artificial intelligence regulation for ten years first appeared in the Big Beautiful Bill; once the public got wind of it, the measure failed 99-1 in the Senate. The AI moratorium has re-emerged in the NDAA, which habitually becomes a catchall for lobbyist-driven bad ideas. Donald Trump himself has demanded AI immunity from state laws. But Senate Democrats are resisting once again, arguing that this emerging technology shouldn’t be placed in a regulatory free-fire zone in the absence of federal policy.
With that avenue blocked off, the Trump administration has now proposed an executive order, which leaked in draft form last night. The EO is designed to use multiple tools of federal coercion to make it impossible for states to regulate AI. But it’s actually much more expansive than that, potentially making it impossible for states to regulate much of anything business-related. While it’s primarily a wet kiss for Big Tech firms and the Wall Street interests financing data center expansion, the malleable definition of “AI” could block states from regulating anything that uses an algorithm or machine learning, extending Trump deregulation to the entire country and exposing consumers, workers, and startups to untold abuses.
The moratorium push also has a political purpose. AI firms are mimicking their brothers in the crypto industry by building a war chest to buy Washington and neutralize any constraints on their conduct. They’ve already captured the White House, and this executive order is the payback. As Trump plummets in the polls and botches gerrymandering, the only thing he may have left in his favor next year is a tower of campaign cash. The executive order represents his chance to secure more.
AFTER A BORING PREAMBLE about how Trump is poised to “remove barriers to AI leadership” in the U.S. as a national-security imperative, the draft executive order plots several schemes to force states to drop AI regulation. Among the regulatory approaches it cites as needing to be stopped are a California AI safety law and a Colorado law preventing discrimination by algorithm, which already shows you the drift that’s setting in here.
First, the draft EO sets up an “AI Litigation Task Force” at the Department of Justice, with the mission of suing states for unconstitutionally intervening in interstate commerce by passing AI regulations. This comes out of a legal doctrine known as the “dormant commerce clause,” which holds that it’s illegal for states to regulate within their borders if the impacts ripple outside them. The more corporations try to expand this doctrine, the less states are able to regulate. Applying it to AI models represents a big swing; in fact, it comes directly from a memo out of AI-heavy venture capital firm Andreessen Horowitz.
The Commerce Department would identify state laws that allegedly violate constitutional provisions, including freedom of speech, and refer them to the task force. The draft EO also resurrects the effort to restrict broadband deployment funding from the 2021 infrastructure law to states that attempt to regulate AI, on the claim that this would somehow threaten the broadband build-out. Other agencies are encouraged to condition their funding to states on the AI regulatory landscape as well.
The Federal Communications Commission is directed in the draft EO to adopt a disclosure standard for AI models in a way that would preempt state law. And the Federal Trade Commission is directed to issue policy guidance that any state regulation requiring AI firms to alter their models represents an unfair or deceptive practice. There’s a hand-wave at new federal legislation, but the point of this draft EO, should it be released, would be to chill states’ ability to adopt guardrails on AI development, under the threat of being sued by the Trump administration on several fronts.
What is an “AI law”? Is that just about the training of AI models? Is it about algorithms that are used to collect personal data to set prices, or to collude with industry partners to gouge renters? Any business that uses a computer is going to appeal to the White House, claiming that state-level regulation affecting it represents an attack on the development of AI. And if they have enough money and influence, they’ll succeed.
If you want to analogize this to the financial crisis, Georgia passed a law around 2004 to crack down on the epidemic of mortgage fraud that even the FBI acknowledged at the time. Federal bank regulators said that the big banks they regulated could not be subject to the Georgia law, and secondary mortgage buyers Fannie Mae and Freddie Mac said they would stop buying Georgia mortgages. The state sheepishly revoked its law, and a few years later everything blew up.
In that instance, the government, backed by corporate benefactors, was trying to keep the lucrative mortgage securitization debt bomb rolling. In this case, the government is backed by tech industry benefactors, and the mortgage securitization debt bomb is now a data center securitization debt bomb. But the goal is the same: keep the thing propping up the economy liberated and free, until it spectacularly collapses.
THIS SAVES CORPORATE LOBBYISTS from having to do the hard work of going state by state to fight AI regulations. One such fight is happening in real time in New York, which passed the RAISE Act, a bill to mandate safety testing for AI model makers that have spent more than $100 million training their models. Gov. Kathy Hochul hasn’t yet decided whether to sign it, and tech lobbyists have pressured her to veto. The American Innovators Network, backed by Andreessen Horowitz, spent over $350,000 on lobbying and digital advertising against the RAISE Act.
The draft EO would do these lobbyists’ work for them, forcing Hochul to risk a lawsuit if she wants to sign the bill. In the context of the insane spending on the AI build-out, that’s not much of a savings; preempting lobbying across the country might get you one Nvidia Blackwell server rack. But the certainty a moratorium will provide is priceless, a nice side benefit to capturing the White House. The only people at risk would be all of us not inside a C-suite at OpenAI or Anthropic.
Plus, it allows the AI lobby to focus on vanquishing their enemies. The co-author of the RAISE Act is Alex Bores, a New York assemblymember and former computer scientist. He’s running in the crowded Manhattan primary to replace Jerry Nadler in the House. And a new AI super PAC is singling him out for retribution.
Backed by—there they are again—Andreessen Horowitz, Palantir co-founder Joe Lonsdale, and OpenAI president Greg Brockman, Leading the Future has amassed a $100 million fortune to unleash in the 2026 midterms. It has committed millions of dollars to stopping Bores, claiming his legislation “would handcuff not only New York’s, but the entire country’s, ability to lead on AI jobs and innovation.” Bores responded by welcoming their hatred. “When they say, ‘Hey, we’re going to spend millions against Alex because he might regulate Big Tech and put basic guardrails on AI,’ I just basically forward that to my constituents,” Bores said on Monday.
Josh Vlasto, a former Andrew Cuomo chief of staff, is one of the heads of Leading the Future; he also organized Fairshake, the crypto super PAC that purchased Congress in the 2024 elections. The model of industries pouring massive sums into primaries to get their sycophants into lawmaking positions is going to continue, because the returns on investment are so massive. The problem underlies virtually every policy matter worth considering.
One final point. This orgy of corruption and pay-to-play politics could have been headed off; stopping it should have been the entire point of the government shutdown fight. The Trump administration is circumventing Congress (and now state legislatures) by using imagined authority to dole out favors to friends and punish enemies. You see this with the draft EO on AI, as surely as you see it with the plan to dismantle the Education Department piece by piece, pushing its offices into other agencies.
But Congress has one power to stop that: the power of the purse. Just as they prohibited congressional appropriations from being spent to fire federal workers, they could have prohibited funds from being spent to shuffle core functions of agencies created by Congress, or to preempt state laws on artificial intelligence.
Even Republicans wanted to rein in the president’s runaway power grab, before administration officials intimidated them. This was the actual fight of the shutdown, which ended in failure. Now we have three more years of government by personal cult leader, because Democrats thought “stop the power grab” was too complicated a slogan. The implications of this failure will be felt by everyone, and celebrated by Big Tech lobbyists.

