TLDRs:
- Nvidia partners with Samsung to scale Groq AI chip production rapidly.
- Groq LPUs use SRAM, consume less energy, and avoid GPU bottlenecks.
- Samsung’s memory gains from AI chips offset smartphone profit declines.
- AI-driven demand may push electronics prices higher worldwide.
Nvidia (NVDA) is accelerating its push into AI chip manufacturing by partnering with Samsung Electronics’ foundry division to increase production of Groq-designed inference processors. The move comes as global demand for energy-efficient AI chips surges, driven by companies seeking faster, more cost-effective inference solutions. Industry sources report that Samsung will expand production for Groq from 9,000 wafers in 2025 to 15,000 this year, marking a significant ramp-up in mass production.
Groq, a U.S.-based AI chip startup, has formed a technology licensing partnership with Nvidia, allowing the semiconductor giant to tap into Groq’s specialized Language Processing Units (LPUs). LPUs are designed for inference workloads, in contrast to traditional GPUs, which are better suited to AI training tasks.
Groq LPUs Cut Energy Use Significantly
Groq’s chip architecture is notable for its use of on-chip static RAM (SRAM) instead of high-bandwidth memory (HBM), which remains scarce in the global market. The LPUs deliver more than 80 terabytes per second of memory bandwidth while consuming roughly one-tenth the energy of comparable GPUs. By avoiding the supply-chain constraints tied to HBM, Groq aims to provide a reliable and scalable solution for inference processing.
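To see why raw memory bandwidth matters so much for inference, the rough sketch below estimates the memory-bound ceiling on autoregressive decode throughput. Only the 80 TB/s figure comes from the article; the model size, weight precision, and the HBM-class bandwidth used for comparison are illustrative assumptions, and the single-device framing ignores how weights are actually partitioned across LPUs.

```python
# Back-of-envelope sketch: why memory bandwidth bounds inference throughput.
# All numeric inputs below are illustrative assumptions, not published Groq or Nvidia specs.

def decode_tokens_per_second(model_params: float, bytes_per_param: float,
                             bandwidth_tbps: float) -> float:
    """Rough upper bound on decode throughput, assuming each generated token
    requires streaming all model weights from memory once (a common
    memory-bound approximation for autoregressive inference)."""
    bytes_per_token = model_params * bytes_per_param
    bandwidth_bytes_per_s = bandwidth_tbps * 1e12
    return bandwidth_bytes_per_s / bytes_per_token

# Hypothetical 70B-parameter model stored in 8-bit weights (1 byte per parameter).
hbm_estimate = decode_tokens_per_second(70e9, 1.0, 3.0)    # assumed ~3 TB/s HBM-class accelerator
sram_estimate = decode_tokens_per_second(70e9, 1.0, 80.0)  # the 80 TB/s SRAM figure cited above

print(f"HBM-class estimate:  ~{hbm_estimate:,.0f} tokens/s upper bound")
print(f"SRAM-class estimate: ~{sram_estimate:,.0f} tokens/s upper bound")
```

The point of the sketch is the ratio, not the absolute numbers: if decode is memory-bound, throughput scales roughly with available bandwidth, which is why an SRAM-heavy design can look attractive for inference even though its on-chip capacity is far smaller than HBM.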
Additionally, Groq manufactures its chips entirely in North America, reducing dependency on the complex, multinational supply chains required for modern GPUs. Analysts suggest this approach could help Nvidia and Groq sidestep the bottlenecks that have historically slowed GPU availability.
AI Chip Demand Boosts Samsung Profits
The growing appetite for AI processors has also had broader effects on the semiconductor ecosystem. Samsung, for instance, has benefited from the memory shortage driven by AI chip demand, with the company’s operating profit from its memory division surging more than 200% year over year in Q4 2025.
However, the same trend has put pressure on Samsung’s smartphone business. Operating profit in the mobile unit fell to 1.9 trillion won, down 9.5% from the previous year, as rising memory costs weighed on device margins. Industry experts warn that higher component prices could eventually be passed on to consumers, driving up electronics prices globally.
Nvidia Prepares SRAM Chip Launch
Looking ahead, Nvidia may unveil a Groq-designed inference processor at GTC 2026, potentially incorporating SRAM rather than HBM. Such a design could offer faster, more energy-efficient AI computation and reinforce Nvidia’s position in the inference market. For Groq, the expansion at Samsung not only secures manufacturing capacity but also strengthens the company’s strategy of remaining a reliable alternative to GPU-based solutions.
In sum, Nvidia’s collaboration with Samsung reflects a broader industry trend: the rapid scaling of AI hardware to meet surging demand while navigating supply-chain limitations. As AI workloads become more pervasive, chips like Groq’s LPUs may play a pivotal role in delivering high-speed, low-energy inference for enterprise and cloud applications alike.