TLDRs:
- SK Hynix Q3 operating profit jumps 62% amid booming AI memory chip demand.
- Revenue climbs to $17.1 billion as DRAM and NAND fully booked for 2026.
- HBM4 chip production ramps up, fueling investor confidence and premarket gains.
- Analysts caution on sustainability despite strong AI-driven growth in the memory sector.
Shares of SK Hynix rose as much as 4% in premarket trading on Wednesday before pulling back, following the company’s announcement of a 62% rise in operating profit for the September quarter to a record 11.4 trillion won (around US$8 billion), slightly exceeding analysts’ expectations.
Revenue for the quarter also surged to 24.5 trillion won (US$17.1 billion), reflecting robust demand for memory chips powering AI infrastructure.
The gains mark one of the strongest quarters in the company’s history, driven mainly by large-scale AI projects and hyperscaler orders.

DRAM and NAND Fully Booked for 2026
The company announced that its entire DRAM and NAND lineup for 2026 has already been fully reserved, highlighting the strength of AI-related demand.
SK Hynix plans to further ramp production to meet growing market needs, particularly with the launch of its next-generation HBM4 chips this quarter. Although the company did not disclose specific clients, industry analysts suggest hyperscalers and cloud providers are the primary drivers behind the full bookings.
While SK Hynix prepares to supply HBM4 chips, experts note potential supply challenges due to packaging and manufacturing bottlenecks.
The complex seven-step HBM stacking process still presents constraints, from limited Chip-on-Wafer-on-Substrate (CoWoS) packaging capacity to Through-Silicon Via (TSV) yields that have yet to improve.
Despite these hurdles, SK Hynix’s HBM4 specifications, featuring 10 Gbps per-pin speeds and roughly 40% power-efficiency gains, position the company ahead of competitors such as Micron and Samsung, which are pursuing similar high-bandwidth memory solutions.
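For context, a quick back-of-the-envelope calculation shows what that per-pin speed implies at the stack level. The 2,048-bit interface width used below is the JEDEC HBM4 baseline and is an assumption for illustration, not a figure from SK Hynix’s announcement.

```python
# Rough HBM4 bandwidth estimate (illustrative sketch, not SK Hynix's published figure).
# Assumes the JEDEC HBM4 interface width of 2,048 bits per stack.
PIN_SPEED_GBPS = 10          # per-pin data rate cited for HBM4
INTERFACE_WIDTH_BITS = 2048  # assumed interface width per stack

bandwidth_gb_per_s = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS  # gigabits per second
bandwidth_tb_per_s = bandwidth_gb_per_s / 8 / 1000          # terabytes per second

print(f"~{bandwidth_tb_per_s:.2f} TB/s per stack")  # ~2.56 TB/s
```

Under those assumptions, a single HBM4 stack would deliver on the order of 2.5 TB/s of bandwidth, which is why per-pin speed gains matter so much for AI accelerators.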
Memory-Efficient Software Eases Price Pressures
Emerging software solutions designed for memory efficiency are helping offset some cost pressures from rising DRAM and HBM prices.
Open-source inference engines such as vLLM speed up large language model serving through techniques like chunked prefill and parallelism, while research systems such as LeanKV compress transformer key-value caches by up to 5.7x.
These developments help GPU vendors and cloud providers manage HBM supply concentration, more than 90% of which sits in Korean hands, while maintaining competitive performance.
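As an illustration of the software side, the snippet below shows how chunked prefill is enabled in vLLM. The model name is a placeholder and the exact defaults vary by vLLM version, so treat this as a minimal sketch rather than a tuned configuration.

```python
# Minimal vLLM sketch: chunked prefill splits long prompt prefills into smaller
# chunks that can be batched alongside decode steps, smoothing GPU memory use
# and latency spikes. Model name and token budget are placeholders.
from vllm import LLM, SamplingParams

llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model
    enable_chunked_prefill=True,               # turn on chunked prefill
    max_num_batched_tokens=2048,               # per-step token budget (tunable)
    tensor_parallel_size=1,                    # shard across GPUs if > 1
)

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["Summarize why HBM supply is tight in 2026."], params)
print(outputs[0].outputs[0].text)
```

The practical effect is that the same HBM footprint serves more concurrent requests, which is how software efficiency partially offsets rising memory prices.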
Investors React Positively to AI Boom
The stock surge reflects investor optimism about the company’s position in the AI memory chip market. However, some analysts caution that while AI demand is robust now, such rapid growth may be difficult to sustain.
Price hikes of up to 30% on conventional memory chips in Q4 have contributed to revenue gains but could pressure customers if they persist.
Analysts forecast HBM revenue could grow from $17 billion in 2024 to nearly $98 billion by 2030, driven by AI acceleration across cloud, enterprise, and consumer markets.
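Those forecast figures imply a compound annual growth rate of roughly 34% over six years, as the quick calculation below shows; the rate itself is derived here from the analysts’ endpoints rather than quoted directly.

```python
# Implied compound annual growth rate (CAGR) from the analyst forecast:
# $17B in 2024 growing to ~$98B by 2030 (6 years).
start, end, years = 17.0, 98.0, 6
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~33.9%
```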
SK Hynix’s aggressive production plans, combined with its advanced HBM4 technology, suggest the company is well-positioned to capture a significant share of this expanding market. While caution remains regarding supply bottlenecks and long-term sustainability, the short-term outlook for SK Hynix appears exceptionally strong.


