Samsung Electronics is reportedly on track to supply more than half of Nvidia's next-generation SOCAMM2 (Small Outline Compression Attached Memory Module) memory in 2026, making it the largest supplier in the CPU-side memory ecosystem for AI servers. SOCAMM — touted by Nvidia as a new high-performance DRAM standard and often described as a "second HBM" — is set to redefine how CPU-attached memory is deployed inside advanced AI servers, Hankyung and ICsmart reported.
Quanta Computer vice chairman and president C. C. Leung said on December 3 that electricity supply has become the most serious constraint on AI server manufacturing, overtaking concerns about memory shortages. Speaking at the DIGITIMES Tech Forum 2026 in Taipei, he said securing reliable power will remain a major hurdle for the industry through 2026 and likely beyond, even as demand for AI servers stays strong.
Micron said in its latest quarterly results that it will work with TSMC to produce base logic dies for both standard and custom HBM4E memory. The disclosure highlights a significant shift in how next-generation high-bandwidth memory may be built, as foundries, rather than DRAM makers, begin handling the foundational logic layer of advanced HBM stacks.

