The US Economy Is Getting Tethered to AI - Warning Shots #11

The U.S. economy is locking itself into AI. This episode maps the financial and policy forces driving it—and why labor, power, and safety risks can’t be ignored.


The U.S. economy is locking itself into AI. Markets, infrastructure, and policy are all leaning into a future where artificial intelligence becomes the growth engine—and that creates a “too big to stop” dynamic with real risks.

In this episode, our hosts map the two trains pulling the country forward: finance and industry on one track, government posture on the other. Together, they unpack the bets, the rhetoric, and the consequences.

What we explore in this episode

  • The AI–finance flywheel: vendor financing, index weight, and hype-fueled capex
  • Building “offices for AIs”: data centers eclipsing offices, and the energy crunch
  • Washington’s posture: rejecting global governance and dismissing “catastrophism”
  • The “too big to stop” trap: economic dependence that narrows choices
  • Power concentration: a few firms steering value and supply chains
  • Labor disruption: junior hiring freeze and a broken talent ladder
  • Near-term euphoria vs. long-term reckoning

The AI–finance flywheel: vendor financing, index weight, and hype-fueled capex

Nvidia is reportedly exploring $100B in vendor financing for its biggest customer, OpenAI. AI-heavy firms now make up nearly a quarter of the S&P 500, meaning the market itself lives or dies on AI sentiment. Oracle’s stock recently jumped on talk of ~$60B in future OpenAI spend for compute that doesn’t even exist yet.

This is a deep vertical entanglement: chipmakers, hyperscalers, foundation model labs, and cloud buyers all leaning on each other’s promises. When the index, the credit, and the headlines all point the same way, capital flows wherever the story goes.

Building “offices for AIs”: data centers eclipsing offices, and the energy crunch

Capital expenditure is shifting. Data center construction is overtaking office builds, and in some regions, even logistics costs are being driven up by data center demand. We’re not building cubicles for people anymore—we’re building server farms for machines.

The energy demand is staggering. New facilities have been compared to multiple nuclear plants’ worth of load. That raises thorny questions about grid upgrades, siting battles, and who gets electricity when the curve spikes.
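To make the nuclear-plant comparison concrete, here is a rough back-of-envelope sketch. The campus and reactor figures below are illustrative assumptions for the calculation, not numbers from the episode.

```python
# Back-of-envelope sketch: hypothetical figures, not data from the episode.
CAMPUS_POWER_GW = 5.0    # assumed continuous draw of one large AI data center campus
REACTOR_POWER_GW = 1.0   # typical output of a single large nuclear reactor
HOURS_PER_YEAR = 8760

# How many reactors would it take to carry this one campus?
reactors_needed = CAMPUS_POWER_GW / REACTOR_POWER_GW

# Annual energy consumed if the campus ran at full load all year (GWh -> TWh)
annual_energy_twh = CAMPUS_POWER_GW * HOURS_PER_YEAR / 1000

print(f"Equivalent reactors: {reactors_needed:.0f}")
print(f"Annual energy at full load: {annual_energy_twh:.1f} TWh")
```

Even under these illustrative assumptions, a single campus absorbs the full output of several reactors year-round, which is why siting and grid-upgrade fights follow these projects.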

Washington’s posture: rejecting global governance and dismissing “catastrophism”

Recent White House statements pushed back on international AI governance while dismissing concerns about existential risk as mere “catastrophism.” At State, Jacob Helberg echoed that framing: warnings about AI doom are seen as a pretext for control, while America’s strategy is to win the race.

The problem? Calling extinction risk “catastrophism” sidesteps the real arguments. You don’t need doomerism to justify baseline safety standards. Brushing off debate degrades the discourse at precisely the moment it’s most needed.

The “too big to stop” trap: economic dependence that narrows choices

As the economy, markets, and infrastructure become entangled with AI, the idea of pulling the plug becomes unthinkable. Too many jobs, pensions, and city budgets will depend on continued growth. That’s the trap: even if harms become clear, we’ll rationalize further expansion because the alternative feels worse.

Later, as AI systems begin orchestrating other AIs, the asymmetry flips: it won’t be us unplugging them—they’ll decide how plugged-in we are. That’s not science fiction; it’s a strategic risk.

Power concentration: a few firms steering value and supply chains

If a small cluster of companies controls compute, models, data, and distribution, they control value creation itself. Even with mechanisms like universal basic income (UBI), consumption still flows through supply chains those firms dictate.

Market power at this scale shapes not just prices but norms and policy tempo. Guardrails are needed before the moat hardens and meaningful optionality disappears.

Labor disruption: junior hiring freeze and a broken talent ladder

Claims in the field suggest 70% of office tasks and up to 90% of programming could be automated within years. The impact is already visible: junior hiring is drying up. Without entry-level reps, analysts, or developers, the pipeline to senior expertise breaks.

AI as a “manager of agents” might look efficient on paper, but it sidelines the human learning loops that build skills. Fewer paths to mastery mean weaker motivation—a long-term social systems problem, not just a labor statistic.

Near-term euphoria vs. long-term reckoning

Right now, markets and tech feel euphoric. Valuations are climbing, products are shipping, and investors are celebrating. But if safety and governance remain sidelined, the reckoning later will dwarf any short-term correction.

The responsible path isn’t slamming the brakes. It’s steering: setting standards, building oversight, aligning incentives, and planning for both power and people.

Closing Thoughts

We are wiring the U.S. economy to AI faster than we are wiring in safety, equity, and resilience. Once dependency sets in, choices narrow—and the cost of course-correcting skyrockets.

This isn’t just about stocks or servers. It’s about who holds power, how we learn and work, and whether the systems we’re building remain aligned with human goals.

Take Action

📺 Watch the full episode
🔔 Subscribe to the YouTube channel
🤝 Share this blog
💡 Support our work