TL;DR
- Asia’s AI boom is outpacing data center and power infrastructure growth.
- Groq CRO warns of worsening shortages in compute and energy resources.
- Governments and tech giants are investing heavily, but gaps remain.
- Gartner projects a 160% spike in AI-related energy demand by 2027.
As artificial intelligence (AI) continues its breakneck global ascent, Asia is emerging as a region under pressure. According to Ian Andrews, Chief Revenue Officer at Groq, a U.S. semiconductor startup specializing in ultra-fast inference chips, the continent is facing an acute shortage of data center infrastructure and power capacity necessary to support AI’s growth.
Speaking at the ATxSummit in Singapore, Andrews sounded the alarm:
“There’s a huge challenge getting enough compute for what we want to do in Asia. It’s a big problem that we have to tackle.”
Andrews emphasized that while model development is advancing rapidly, infrastructure hasn’t kept pace. “We’re still in the infancy of AI,” he said, warning that bottlenecks are already forming and likely to worsen over the next five years.
Inference is the New Frontier
Unlike many AI players still focused on training large models, Groq is betting on inference—the stage where models are deployed to deliver real-time outputs. This phase, according to Andrews, will soon consume more computational resources than training itself, exacerbating the infrastructure strain.
As AI becomes integrated into nearly every digital experience, from enterprise software to consumer apps, the demand for low-latency, high-speed inference will only increase.
“There is no model in which we have enough data center capacity, enough power, and enough infrastructure to run all of that in this region,” Andrews cautioned.
Governments Step In, But Is It Enough?
Governments and corporations across Asia are ramping up investments to address this shortfall. South Korea, now home to the largest ChatGPT paid user base outside the U.S., is welcoming OpenAI’s third Asian office. Taiwan has pledged $3 billion over three years to enhance its AI data infrastructure. Meanwhile, tech giants are funneling billions into global AI infrastructure projects.
Yet, analysts warn that these efforts may still fall short. A recent Gartner report forecasts a 160% increase in data center energy consumption by 2027, driven by AI and generative AI (GenAI) demands. Power grids in many regions may soon hit their limits, further compounding the issue.
The Energy Dilemma and Sustainability Risk
Gartner’s analysis paints a stark picture. With data centers projected to require over 500 terawatt-hours of electricity annually by 2027, roughly 2.6 times the 2023 level, the energy crisis threatens not just operational capacity but also sustainability goals.
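A quick back-of-the-envelope check shows the two Gartner figures are consistent: reaching 2.6 times a baseline is the same as a 160% increase. The sketch below derives the implied 2023 baseline from the reported numbers (the variable names are illustrative, not from the report):

```python
# Back-of-the-envelope check of the reported Gartner figures:
# 2.6x the 2023 baseline is equivalent to a 160% increase.
projected_2027_twh = 500  # projected annual demand by 2027 (TWh)
multiplier = 2.6          # reported growth factor vs. 2023

# Implied 2023 baseline consumption
baseline_2023_twh = projected_2027_twh / multiplier

# Percentage increase implied by that baseline
increase_pct = (projected_2027_twh - baseline_2023_twh) / baseline_2023_twh * 100

print(f"Implied 2023 baseline: ~{baseline_2023_twh:.0f} TWh")  # ~192 TWh
print(f"Implied increase: {increase_pct:.0f}%")                # 160%
```

In other words, the "160% spike" and the "2.6x" figure describe the same projection from two angles.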
“The insatiable demand for power is outstripping the capacity of utility providers,” said Bob Johnson, VP Analyst at Gartner.
In response, some regions are resorting to fossil fuels, risking higher carbon emissions and undermining green energy initiatives.
To address this, Gartner recommends prioritizing energy-efficient AI models, exploring edge computing, and investing in advanced energy storage solutions. These innovations, while still in early stages, could help balance AI advancement with environmental responsibility.
That said, the surge in AI adoption is colliding with the hard realities of physical infrastructure in Asia. While countries race to scale up, the gap between AI ambition and operational readiness is widening.
As Groq’s Andrews put it, “Model progression is solvable. Infrastructure? That’s the real battle.”