TLDR
- Google will supply up to 1 million Tensor Processing Units to Anthropic in a deal worth tens of billions of dollars
- The TPU deployment starts in 2026 and adds over 1 gigawatt of AI computing power
- Anthropic’s annual revenue run rate has reached $7 billion, with Claude serving 300,000+ businesses
- Google has invested $3 billion total while Amazon remains the largest investor at $8 billion
- The multi-cloud strategy spreads workloads across Google, Amazon, and Nvidia infrastructure
Google and Anthropic unveiled a major cloud computing partnership on Thursday. The deal provides Anthropic with access to up to 1 million of Google’s custom Tensor Processing Units (TPUs).
The agreement is valued at tens of billions of dollars, making it one of the largest AI hardware commitments announced to date.
Deployment of the TPUs begins in 2026. The chips will deliver more than 1 gigawatt of computing capacity for Anthropic’s AI operations.
Building a 1-gigawatt data center typically costs around $50 billion. Roughly $35 billion of that expense goes toward chips and processing hardware.
Anthropic’s Rapid Business Growth
Anthropic has experienced explosive growth since its founding by former OpenAI researchers. The company now serves more than 300,000 businesses through its Claude AI models.
This customer base represents a 300-fold increase over just two years. The company’s annual revenue run rate has climbed to $7 billion.
Large enterprise customers generating over $100,000 in annual revenue have increased nearly sevenfold in the past year. Claude Code, the company’s coding assistant, reached $500 million in annualized revenue within two months of launching.
Anthropic claims this makes Claude Code the fastest-growing product in history. The rapid revenue growth drives the need for expanded computing infrastructure.
Strategic Multi-Cloud Infrastructure
Anthropic operates across multiple cloud providers rather than relying on a single vendor. The company uses Google’s TPUs, Amazon’s Trainium chips, and Nvidia’s GPUs for different workloads.
This approach allows Anthropic to optimize for cost, performance, and specific use cases. Training, inference, and research tasks are assigned to the most suitable platform.
Google Cloud CEO Thomas Kurian highlighted the efficiency benefits, noting that Anthropic’s expanded TPU usage reflects the strong price-performance the company has seen with the chips over several years.
The deal includes access to Ironwood, Google’s seventh-generation TPU, a processor designed specifically for large-scale machine learning workloads.
Amazon remains Anthropic’s largest financial backer and primary cloud provider. Amazon has invested $8 billion in the AI startup, compared to Google’s $3 billion total investment.
AWS built Project Rainier, a custom supercomputer for Claude running on Amazon’s Trainium 2 chips, which Amazon positions as a lower-cost alternative to general-purpose GPUs.
Investment Timeline and Market Impact
Google first invested $2 billion in Anthropic during 2023. The tech giant added another $1 billion in early 2025, bringing total investment to $3 billion.
Financial analysts estimate Anthropic contributed one to two percentage points to AWS growth in recent quarters. This contribution is expected to exceed five percentage points in the second half of 2025.
Anthropic maintains control over its model weights, pricing decisions, and customer data. The company has no exclusivity arrangements with any cloud provider.
This multi-cloud strategy proved valuable during Monday’s AWS outage. Claude continued operating normally because its infrastructure is spread across multiple providers.
The partnership deepens Google’s position as both a major investor and infrastructure provider. Anthropic CFO Krishna Rao said the expansion gives the company the computing capacity it needs to continue advancing AI development.