The deal could be worth billions, although nothing is nailed down, and one source said Meta has not yet decided whether the TPUs would handle model training or the lighter inference jobs that demand less grunt.
Google has been polishing its silicon for years, and a chunky Meta contract would give the search outfit and other chip challengers a chance to prise open Nvidia’s near-total market dominance. Nvidia’s shares apparently dipped seven per cent once word of the talks surfaced.
Google muttered that its cloud arm is seeing “accelerating demand” for TPUs and Nvidia GPUs and insisted it “is committed to supporting both, as we have for years.”
Core Scientific chief executive Adam Sullivan said: “The biggest story in AI right now is that Google and Nvidia are being extraordinarily competitive. They’re in a race to secure as much data-centre capacity as they can.”
Both companies are wooing customers with financing deals to ease the sting of huge chip orders. Sullivan said: “They don’t care about how much revenue they generate. This is about who gets to [artificial general intelligence] first.”
After watching its stock sink on Tuesday, Nvidia popped up on X, saying: "We're delighted by Google's success. They've made great advances in AI, and we continue to supply to Google. NVIDIA is a generation ahead of the industry; it's the only platform that runs every AI model and does it everywhere computing is done."
Google began using TPUs about a decade ago to improve its own services, then offered them to cloud customers in 2018 for training and inference workloads. The chips now power its Gemini models and have been sold to players such as Anthropic, which said last month it will spend tens of billions of dollars from next year on up to one million TPUs, enough to deliver roughly a gigawatt of compute for its AI systems.
Most large language models are still trained on Nvidia's GPUs, which evolved from video game kit into the backbone of modern AI thanks to their ability to run billions of calculations simultaneously. Nvidia's parts are used by thousands of developers, either directly or through the big cloud providers that fill their data centres with them.
Google's TPUs are application-specific integrated circuits built for particular workloads, which boosts energy efficiency and gives them an edge in some AI tasks. Analysts and data centre operators reckon TPUs are one of the few real threats to Nvidia's dominance. However, Google must sell them far more widely if it wants to mount a serious challenge to the champion.