Meta Eyes Google Chips For AI Data Centres: How Will It Affect Nvidia?
Meta is in advanced talks with Alphabet’s Google to integrate Google’s tensor processing units (TPUs) into its AI data centres from 2027, while also exploring the option to rent TPUs from Google Cloud as early as next year, according to a report by The Information.
The move highlights the tech giant’s search for alternatives to Nvidia’s graphics processing units (GPUs), which have long dominated AI infrastructure.
Could TPUs Become A Real Competitor To Nvidia?
Google has positioned TPUs as a lower-cost option compared with Nvidia chips, with potential advantages for firms prioritising higher security standards.
Google has reportedly discussed a goal of capturing up to 10 per cent of Nvidia’s revenue through its TPU business.
Originally developed for Google’s internal workloads and later offered through Google Cloud, TPUs have evolved to handle AI workloads efficiently, with customisation enabling Google to optimise performance and power use alongside its own AI systems.
Market Reaction Signals Growing Interest In TPUs
The news rattled Nvidia’s stock, with shares falling 3.6 per cent in premarket trading, while Alphabet rose 2.6 per cent.
Analysts say that if Meta adopts TPUs, it would further validate Google’s chip technology, which recently secured a major deal to supply AI startup Anthropic with up to 1 million chips.
Seaport analyst Jay Goldberg described the Anthropic agreement as a “really powerful validation” of Google’s technology.
Meta’s AI Investment Could Drive TPU Demand
Meta, one of the largest spenders on AI infrastructure, projects capital expenditure of between $70 billion and $72 billion this year. Bloomberg Intelligence analysts estimate that its 2026 spending of at least $100 billion could include $40–50 billion for inference-chip capacity alone.
Renting TPUs through Google Cloud could accelerate adoption, providing flexibility while diversifying away from Nvidia amid ongoing supply constraints.
How TPUs Differ From GPUs In AI Workloads
While Nvidia GPUs remain the backbone of AI model training, TPUs are application-specific integrated circuits designed specifically for AI and machine-learning tasks.
Experts say this level of customisation gives Google a competitive edge, offering highly efficient performance for AI inference.
The chips have gained traction beyond Google’s own operations as firms seek alternatives to reduce reliance on Nvidia.
Will Meta’s Move Reshape The AI Chip Landscape?
A partnership with Meta could mark a key milestone for Google, boosting confidence in TPUs as a viable alternative in the AI hardware market.
TPUs have already attracted interest from other major players and suppliers, with Asian companies linked to Alphabet seeing early gains: South Korea’s IsuPetasys surged 18 per cent, while Taiwan’s MediaTek rose nearly 5 per cent.
Meta declined to comment on the report, while Google did not immediately respond to a request for comment.
The outcome of these discussions will be closely watched, as AI infrastructure spending ramps up globally and competition for efficient, scalable chips intensifies.