Exclusive: Google deepens Thinking Machines Lab ties with new multi-billion-dollar deal

Former OpenAI executive Mira Murati’s startup, Thinking Machines Lab, has signed a new multibillion-dollar deal to expand its use of Google Cloud’s AI infrastructure, including systems powered by Nvidia’s latest GPUs, TechCrunch has learned exclusively.

The deal is worth in the single-digit billions, according to a source familiar with the matter, and includes infrastructure services to support model training and deployment, as well as access to Google’s latest AI systems built on top of Nvidia’s new GB300 chips.

Google is actively pursuing several cloud deals with AI developers as it aims to bundle its cloud offering with other services such as storage, a Kubernetes engine, and Spanner, its database product. Earlier this month, Anthropic signed a contract with Google and Broadcom for multiple gigawatts of tensor processing unit (TPU) capacity. (TPUs are Google’s custom-designed AI chips for machine learning workloads.)

But the competition is tough. Just this week, Anthropic also signed a new deal with Amazon to secure up to 5 gigawatts of capacity to train and deploy Claude.

Earlier this year, Thinking Machines partnered with Nvidia in a deal that included investment from the chipmaker, but this is the first time the lab has signed with a cloud services provider. The deal isn’t exclusive, so Thinking Machines could use multiple cloud providers over time, but it’s still a sign that Google is trying to lock up fast-growing frontier labs as quickly as possible.

Murati left her job as chief technology officer of OpenAI and founded Thinking Machines in February 2025. The company, which soon raised a $2 billion seed round at a $12 billion valuation, has remained highly secretive, but launched its first product in October. Dubbed Tinker, it is a tool that automates the creation of custom frontier AI models.

Wednesday’s agreement offered some insight into what Thinking Machines is building. In a press release, Google noted that it will support the startup’s reinforcement learning workloads, which Tinker’s architecture relies on. Reinforcement learning is a training approach that has fueled recent advances at labs including DeepMind and OpenAI, and the scale of the Google Cloud deal reflects how computationally expensive it can be.
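For readers unfamiliar with the technique, here is a minimal, generic sketch of reinforcement learning: a tabular Q-learning agent on a toy five-state corridor. It illustrates why RL is compute-hungry (the agent must interact with an environment over many episodes before useful values emerge). This is a textbook example only, and reflects nothing about Tinker’s or any lab’s actual training setup.

```python
import random

N_STATES = 5           # states 0..4; reaching state 4 ends the episode
ACTIONS = (-1, +1)     # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3

def train(episodes=500, max_steps=2000, seed=0):
    rng = random.Random(seed)
    # One Q-value per (state, action) pair, initialized to zero
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        for _ in range(max_steps):
            # Epsilon-greedy: mostly exploit the best-known action,
            # sometimes explore a random one
            if rng.random() < EPS:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            # Q-learning update: bootstrap from the best next action
            best_next = max(q[(s2, b)] for b in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = s2
            if s == N_STATES - 1:
                break
    return q

q = train()
# After training, stepping right should score higher than stepping
# left in every non-terminal state
right_better = all(q[(s, 1)] > q[(s, -1)] for s in range(N_STATES - 1))
print(right_better)
```

Even this toy agent needs hundreds of episodes of trial and error to learn a five-state corridor; frontier-scale RL runs the same interact-and-update loop against language-model rollouts, which is why it demands the kind of infrastructure this deal covers.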


Thinking Machines is among the first Google Cloud customers to access its GB300-powered systems, which, according to Google, offer a 2X improvement in training and serving speed compared to previous-generation GPUs.

“Google Cloud has enabled us to run at record speeds with the reliability we demand,” Thinking Machines founding researcher Myle Ott said in a statement.
