#ai + #economics

Public notes from activescott tagged with both #ai and #economics

Thursday, December 25, 2025

Tech companies have moved more than $120bn of data centre spending off their balance sheets using special purpose vehicles funded by Wall Street investors, adding to concerns about the financial risks of their huge bet on artificial intelligence.

Meta in October completed the largest private credit data centre deal, a $30bn agreement with New York financing firm Blue Owl Capital for its proposed Hyperion facility in Louisiana that created an SPV called Beignet Investor.

The SPV raised $30bn, including about $27bn of loans from Pimco, BlackRock, Apollo and others, as well as $3bn in equity from Blue Owl.

Monday, November 10, 2025

Be patient. Not afraid.

For layoffs in the tech sector, a likely culprit is the financial stress that companies are experiencing because of their huge spending on AI infrastructure. Companies that are spending a lot with no significant increases in revenue can try to sustain profitability by cutting costs. Amazon increased its total CapEx from $54 billion in 2023 to $84 billion in 2024, and an estimated $118 billion in 2025. Meta is securing a $27 billion credit line to fund its data centers. Oracle plans to borrow $25 billion annually over the next few years to fulfill its AI contracts. 
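
For scale, here is a quick sketch of the year-over-year growth implied by those Amazon CapEx figures (the $54 billion, $84 billion, and estimated $118 billion quoted above); this is just arithmetic on the reported numbers, not an independent estimate.

```python
# Year-over-year growth of Amazon's total CapEx, using the figures quoted above
# (the 2025 number is an estimate).

capex_billion = {2023: 54, 2024: 84, 2025: 118}

for year in (2024, 2025):
    growth = capex_billion[year] / capex_billion[year - 1] - 1
    print(f"{year}: ${capex_billion[year]}B ({growth:+.0%} vs. {year - 1})")

# Cumulative change over the two years
print(f"2023 -> 2025: {capex_billion[2025] / capex_billion[2023] - 1:+.0%}")
```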

“We’re running out of simple ways to secure more funding, so cost-cutting will follow,” Pratik Ratadiya, head of product at AI startup Narravance, wrote on X. “I maintain that companies have overspent on LLMs before establishing a sustainable financial model for these expenses.”

We’ve seen this act before. When companies are financially stressed, a relatively easy solution is to lay off workers and ask those who are not laid off to work harder and be thankful that they still have jobs. AI is just a convenient excuse for this cost-cutting.

Last week, when Amazon slashed 14,000 corporate jobs and hinted that more cuts could be coming, a top executive noted the current generation of AI is “enabling companies to innovate much faster than ever before.” Shortly thereafter, another Amazon rep anonymously admitted to NBC News that “AI is not the reason behind the vast majority of reductions.” On an investor call, Amazon CEO Andy Jassy admitted that the layoffs were “not even really AI driven.”

We have been following the slow growth in revenues for generative AI over the last few years, and the revenues are neither large enough to explain the number of layoffs attributed to AI nor large enough to justify the capital expenditures on AI cloud infrastructure. Those expenditures may be approaching $1 trillion for 2025, while AI revenue, which is what would pay for the use of AI infrastructure to run the software, will not exceed $30 billion this year. Are we to believe that such a small amount of revenue is driving economy-wide layoffs?
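
A rough back-of-envelope comparison of the two figures in that argument, taking the quoted estimates at face value (roughly $1 trillion of 2025 infrastructure spending against at most $30 billion of AI revenue):

```python
# Back-of-envelope comparison of 2025 AI infrastructure spending vs. AI revenue,
# using the rough figures quoted above (estimates, not audited numbers).

capex_2025 = 1_000_000_000_000    # ~$1 trillion of AI cloud infrastructure spending
ai_revenue_2025 = 30_000_000_000  # <$30 billion of AI revenue this year

print(f"Spending is roughly {capex_2025 / ai_revenue_2025:.0f}x this year's AI revenue")
print(f"Revenue covers about {ai_revenue_2025 / capex_2025:.0%} of this year's spend")
```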

Tuesday, October 28, 2025

Seems about right. Interesting metrics on startups too:

  • Foundation Model Labs: Revenue must grow faster than Compute Costs.
  • Enterprise AI Platforms: High Gross Retention because of high AI Feature Adoption.
  • Application Layer: Net Revenue Retention (NRR) > 120% and CAC Payback < 12 months.
  • Inference API Players: High Revenue per GPU-Hour (pricing power).
  • Energy/Infrastructure: Structural Energy Cost Advantage and high utilization.
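
As a minimal sketch of how two of the application-layer metrics above are typically computed, Net Revenue Retention and CAC payback; the cohort and unit-economics numbers here are hypothetical, chosen only to show the formulas against the stated thresholds.

```python
# Illustrative calculation of two application-layer metrics from the list above:
# Net Revenue Retention (NRR) and CAC payback. All inputs are hypothetical.

def net_revenue_retention(starting_arr, expansion, contraction, churn):
    """NRR = (starting ARR + expansion - contraction - churned ARR) / starting ARR."""
    return (starting_arr + expansion - contraction - churn) / starting_arr

def cac_payback_months(cac, monthly_revenue_per_customer, gross_margin):
    """Months of gross profit needed to recover the cost of acquiring one customer."""
    return cac / (monthly_revenue_per_customer * gross_margin)

# Hypothetical cohort: $10M starting ARR, $3M expansion, $0.5M contraction, $0.5M churn
nrr = net_revenue_retention(10_000_000, 3_000_000, 500_000, 500_000)

# Hypothetical unit economics: $6,000 CAC, $1,000/month per customer, 70% gross margin
payback = cac_payback_months(6_000, 1_000, 0.70)

print(f"NRR: {nrr:.0%}")                     # 120%, right at the stated threshold
print(f"CAC payback: {payback:.1f} months")  # ~8.6 months, under the 12-month bar
```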

Energy infrastructure, unlike GPUs that become obsolete in five years, compounds in value over decades.

Consider the math: A single large AI training cluster can require 100+ megawatts of continuous power, equivalent to the demand of a small city. The United States currently has roughly 1,200 gigawatts of total generating capacity. If AI compute grows at projected rates, it could demand 5-10% of the nation’s entire power generation within a decade.
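
A quick sketch of that arithmetic using the round numbers as stated (100 MW per large training cluster, roughly 1,200 GW of US generating capacity, and a projected 5-10% share):

```python
# Rough power arithmetic using the round numbers from the paragraph above.

cluster_mw = 100                    # one large AI training cluster (continuous load)
us_capacity_gw = 1_200              # approximate total US generating capacity
share_low, share_high = 0.05, 0.10  # projected 5-10% share within a decade

us_capacity_mw = us_capacity_gw * 1_000

# Share of national capacity that a single cluster represents
print(f"One cluster: {cluster_mw / us_capacity_mw:.4%} of US capacity")

# Number of 100 MW clusters implied by a 5-10% share of capacity
low = share_low * us_capacity_mw / cluster_mw
high = share_high * us_capacity_mw / cluster_mw
print(f"A 5-10% share corresponds to roughly {low:.0f}-{high:.0f} such clusters")
```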

And unlike fiber optic cable or GPU clusters, power infrastructure can’t be deployed quickly. Nuclear plants take 10-15 years to build. Major transmission lines face decades of regulatory approval. Even large solar farms require 3-5 years from planning to operation.

The companies prepping themselves to survive scarcity aren’t just stockpiling compute; they’re building root systems deep enough to tap multiple resources: energy contracts locked in for decades, net revenue retention above 120%, margin expansion even as they scale, and infrastructure that can flex between training and inference as market dynamics shift.