TLDRs:
- OpenAI inks massive five-year Oracle cloud deal valued at over $300B.
- ChatGPT reaches 800 million weekly users amid rising AI demand.
- Oracle integrates OpenAI models into healthcare systems post-Cerner.
- Texas and Midwest Stargate AI buildout will generate thousands of jobs.
Oracle co-CEO Clay Magouyrk expressed strong confidence that OpenAI can handle an annual cloud services bill of $60 billion.
Speaking at Oracle’s AI World conference in Las Vegas, Magouyrk highlighted the scope of the existing five-year cloud agreement between the two companies, valued at over $300 billion. Despite OpenAI reporting a $5 billion net loss in 2024, Oracle executives remain bullish on the company’s ability to sustain such massive infrastructure costs.
The deal underscores the strategic partnership between Oracle and OpenAI, with Oracle providing critical cloud infrastructure to support the rapid expansion of AI models such as ChatGPT. Currently, ChatGPT boasts 800 million weekly active users, illustrating both the platform’s popularity and the computational demands required to support it.
Integrating AI Into Healthcare Systems
Oracle’s other co-CEO, Mike Sicilia, emphasized the integration of OpenAI’s models into the company’s electronic health record systems.
This initiative builds on Oracle’s $28 billion Cerner acquisition in 2022, signaling the tech giant’s commitment to embedding AI into healthcare. The integration aims to streamline patient data access, improve diagnostic efficiency, and enhance overall healthcare delivery through advanced AI-driven insights.
“I’ve seen the results, and I really do think that they’re going to have a dramatic impact on industries, on enterprises of all types,” Sicilia said of OpenAI.
The collaboration with OpenAI allows Oracle to deploy AI tools across a wide range of enterprise applications, ensuring that organizations can leverage the latest machine learning models for critical operations.
Hyperscale AI Buildouts in Texas and Midwest
OpenAI, Oracle, and SoftBank have unveiled plans for the Stargate network, a hyperscale AI data center project spanning Texas, New Mexico, Ohio, and additional Midwest locations.
The buildout aims to deliver nearly 7 gigawatts of computing capacity and represents over $400 billion in combined investment over the next three years.
The flagship Abilene, Texas site alone is designed to scale past 1 gigawatt, reflecting the massive scale of infrastructure required for cutting-edge AI workloads. The project is expected to generate over 6,000 construction jobs and 1,700 long-term positions, with work flowing to developers, construction firms, and infrastructure providers. Additionally, OpenAI and Broadcom plan to deploy 10 gigawatts of custom AI chips, alongside Nvidia’s next-generation Vera Rubin processors.
Financial Implications for OpenAI
While the $60 billion yearly cloud bill is theoretically feasible, it significantly outpaces OpenAI’s current revenue.
The company generated approximately $4.3 billion in revenue during the first half of 2025, and its annualized run rate as of July 2025 stands near $12 billion. At that level, the cloud bill alone would consume roughly five times OpenAI’s current revenue, before accounting for salaries, research, or other operating costs.
Analysts suggest that a sustainable model for OpenAI would likely require annual revenues exceeding $150 billion to support continued AI development and massive cloud infrastructure commitments. Despite these financial challenges, investor sentiment appears positive, with Oracle shares rising 5% on October 13, bringing its market capitalization near $900 billion.
AI Chips and Infrastructure Strategy
OpenAI rents Nvidia chips from Oracle, CoreWeave, Google, and Microsoft to power its cloud operations, while also developing a custom AI processor with Broadcom.
This combination of rented and proprietary hardware allows OpenAI to scale operations rapidly and maintain flexibility in a competitive AI landscape.
The partnership highlights how hyperscale cloud infrastructure and AI development are increasingly intertwined, with Oracle positioned as a central player in powering the next generation of AI applications.