TLDR
- Anthropic has reached nearly $20B in annual revenue run rate, more than doubling from $9B recorded in late 2025
- Defense Secretary Pete Hegseth has designated Anthropic a “supply-chain risk” over disputes about the company’s AI safety policies
- This classification may prevent government procurement and impact partnerships with defense contractors including Lockheed Martin
- Strong demand for Claude’s coding capabilities fuels explosive growth; Claude app reached top position on Apple’s free app rankings
- Anthropic plans legal action, calling the Pentagon’s designation “legally unsound”
Recent reports from Bloomberg indicate that Anthropic is poised to reach nearly $20 billion in annualized revenue, according to people familiar with the company’s performance. That figure is more than double the $9 billion run rate the AI company disclosed at the end of 2025.
> BREAKING: ANTHROPIC REVENUE JUST DOUBLED
>
> Anthropic revenue run rate $20 billion
> - $9 billion at the end of 2025
> - $14 billion a few weeks ago
> - $19 billion now
>
> MORE THAN DOUBLED in 2 months
> - Valuation $380 billion
> - #1 on the App Store
>
> claude code cooked pic.twitter.com/IRFtMKxvkA
>
> — NIK (@ns123abc) March 4, 2026
Sources indicate the company, currently holding a valuation near $380 billion, recently surpassed $19 billion in annualized revenue, marking a significant jump from approximately $14 billion recorded just several weeks earlier.
The explosive revenue growth has been driven largely by Claude Code, a tool that lets software developers automate complex programming workflows. Enterprise clients and development teams have adopted it at an accelerating pace.
Consumer interest has also surged dramatically, with Anthropic’s flagship consumer application securing the number one position in Apple’s free app category during the recent weekend, demonstrating widespread appeal extending well beyond corporate users.
However, this impressive financial trajectory faces a significant obstacle from federal authorities. Defense Secretary Pete Hegseth has classified Anthropic as a “supply-chain risk,” a categorization historically reserved for entities associated with hostile nations.
The confrontation originated when Anthropic declined Pentagon requests to permit unrestricted military use of its AI technology for surveillance operations and autonomous weaponry. The company advocated for implementing safety guardrails, a proposal the Defense Department ultimately rejected.
Pentagon Dispute Could Affect Defense Contracts
The supply-chain risk classification prohibits federal agencies from procuring Anthropic’s technology and pressures partner organizations to sever ties.
Lockheed Martin announced its intention to comply with Pentagon guidance by discontinuing use of Anthropic’s solutions across its operations. Other major defense contractors including General Dynamics, RTX, and L3Harris have not publicly disclosed their response plans.
Dean Ball, formerly of the White House advisory team, characterized the government’s move as “attempted corporate murder.”
In response, Anthropic has condemned the classification as “legally unsound” and confirmed its readiness to pursue judicial remedies.
Claude App Tops Apple Charts During Government Feud
Public adoption trends have diverged sharply from the government’s position. The Claude application’s ascent to the top of Apple’s free app rankings occurred precisely as the Pentagon announced its restrictive measures.
The remarkable revenue acceleration from $14 billion to beyond $19 billion in annualized run rate transpired over mere weeks, indicating that commercial demand has remained resilient despite federal government actions.
Whether the Pentagon’s designation will materially impact Anthropic’s enterprise sales pipeline and government contracting opportunities over the long term remains an open question.
Company representatives have reiterated that they will pursue litigation if the supply-chain risk designation is formally imposed.