#openai

Public notes from activescott tagged with #openai

Monday, March 23, 2026

OpenAI and Anthropic are competing for partnerships with buyout firms that would let them quickly roll out their AI tools to potentially hundreds of the private, established companies those firms own. This would boost adoption of their models and encourage customer stickiness at scale.

OpenAI is offering private-equity firms a guaranteed minimum return of 17.5%, significantly higher than typical preferred instruments, two people familiar with the matter said. It is also offering early access to its newest AI models as it seeks to enlist investors like TPG and Advent for its joint venture, three sources said.

Friday, March 13, 2026

We did the math. At $185 billion a year, in eight years, Google would be spending $1.5 trillion, slightly more than OpenAI has committed to spend over the same time period. Extend that out to 10 years, as Vahdat noted, and Google would be spending $1.9 trillion.
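The arithmetic behind those headline totals is simple multiplication of the $185 billion annual run rate, with the results rounded up to the figures quoted; a quick sketch:

```python
# Back-of-the-envelope check of the spending figures quoted above.
# The $185B/year run rate is from the article; the totals are just
# multiplication, which the article rounds to $1.5T and $1.9T.
ANNUAL_SPEND_B = 185  # billions of dollars per year

for years in (8, 10):
    total_t = ANNUAL_SPEND_B * years / 1000  # billions -> trillions
    print(f"{years} years: ${total_t:.2f} trillion")
```

This yields $1.48 trillion over eight years and $1.85 trillion over ten, matching the article's rounded figures.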

Vahdat is clear that this is “not a promise” that Google would spend that much over the next 10 years. But the decade-long view he takes suggests the scope of Google’s bet. “The point here is that we are, at Google, investing at the highest levels,” he says.

There’s a big difference between Google’s data center ambitions and OpenAI’s: Google is a money-making machine. In the fourth quarter, Google parent Alphabet raked in $113 billion in revenue; for the full year, sales topped $400 billion for the first time in the company’s more than 25-year history. By comparison, OpenAI is spending at similar levels but brought in only about $13 billion in revenue last year — a tiny fraction of Google’s revenue, and less than half of Google’s cash reserves.

Google’s TPUs were previously used only in-house for Google’s own infrastructure — to power consumer apps like Gmail and YouTube, and eventually to train self-driving cars and develop and run AI models like Gemini. Now, they’re one of the industry’s go-tos: maybe not as popular as Nvidia’s top-of-the-line Blackwells, but still useful for pretraining and operating AI models at scale. Google first started selling access to them through a cloud service in 2018, letting other companies rent out processing power. But more recently, Google has inked high-profile deals, like a big contract with Anthropic, and has reportedly been in talks with Meta to use its chips. In December, Morgan Stanley estimated that TPUs could generate $13 billion for Google by 2027. “It is fair to say that the demand for cloud TPUs has been unprecedented,” Vahdat says, particularly in the last few years.

In August, Vahdat, Google Chief Scientist Jeff Dean, and 10 other researchers and execs at the company co-published a paper aiming to contextualize AI’s power guzzling. The paper says that the median prompt for Google’s Gemini AI model uses the same amount of energy it takes to power 9 seconds of television and consumes around five drops of water, which they write is “substantially lower than many public estimates.” (One report says large data centers can consume up to 5 million gallons per day, equivalent to the water use of a town of up to 50,000 people.)

Monday, March 2, 2026

It does, kinda, matter that Hegseth turned a simple contract dispute into an attempted corporate death sentence, weaponizing a supply-chain security designation that was clearly designed for tech the US government fears could be infiltrated by hostile foreign nations.

Yet, under Hegseth’s order, Chinese AI models would technically be more welcome in America’s military supply chain than Anthropic’s. The “supply chain risk” designation is now being used to punish a domestic company for having safety guidelines. DeepSeek, with its direct ties to the Chinese government, faces fewer restrictions than a San Francisco company that committed the cardinal sin of asking for human oversight on killing decisions.

One source familiar with the Pentagon’s negotiations with AI companies confirmed that OpenAI’s deal is much softer than the one Anthropic was pushing for, thanks largely to three words: “any lawful use.” In negotiations, the person said, the Pentagon wouldn’t back down on its desire to collect and analyze bulk data on Americans. If you look line by line at the OpenAI terms, the source said, every aspect of it boils down to: if it’s technically legal, then the US military can use OpenAI’s technology to carry it out. And over the past few decades, the US government has stretched the definition of “technically legal” to cover sweeping mass surveillance programs — and more.

In the years after 9/11, US intelligence agencies ramped up a surveillance system that they determined fell within the legal limits OpenAI cites, including multiple mass domestic spying operations (along with apparently highly invasive international ones). In 2013, National Security Agency intelligence contractor Edward Snowden revealed the extent of some of these programs, such as reportedly collecting telephone records of Verizon customers on an “ongoing, daily” basis, and gathering bulk data on individuals from tech companies like Microsoft, Google, and Apple via a secretive program called PRISM. Despite promises of reform from intelligence agencies and attempts at legal changes, few significant limits to these powers were enacted. Mike Masnick, founder of Techdirt, said online that OpenAI’s deal “absolutely does allow for domestic surveillance. EO 12333 is how the NSA hides its domestic surveillance by capturing communications by tapping into lines outside the US even if it contains info from/on US persons.”

Thursday, November 6, 2025

Looking for all that money Sam plans to spend…

“This is where we’re looking for an ecosystem of banks, private equity, maybe even governmental, the ways governments can come to bear,” she said. Any such guarantee “can really drop the cost of the financing but also increase the loan-to-value, so the amount of debt you can take on top of an equity portion.”

OpenAI is losing money at a faster pace than almost any other startup in Silicon Valley history thanks to the upside-down economics of building and selling generative AI. The company expects to spend roughly $600 billion on computing power from Oracle, Microsoft, and Amazon in the next few years, meaning that it will have to grow sales exponentially in order to make the payments. Friar said that the ChatGPT maker is on pace to generate $13 billion in revenue this year.

Tuesday, November 4, 2025

On track to lose $8.5B/yr is good, right?

OpenAI generated around $4.3 billion in revenue in the first half of 2025, about 16% more than it generated in all of last year, The Information reported on Monday, citing financial disclosures to shareholders.

OpenAI said it burned $2.5 billion, in large part due to its research and development costs for developing artificial intelligence and running ChatGPT, the report added. Research and development cost the ChatGPT maker $6.7 billion in the first half, the report said, adding that it had about $17.5 billion in cash and securities at the end of the period. OpenAI is looking to meet its full-year revenue target of $13 billion and a cash-burn target of $8.5 billion, the report added.