
HBM technology and capacity deployment

Chiawen Chang, DIGITIMES Research, Taipei


In response to the continuous growth in AI demand, memory device makers SK Hynix, Samsung Electronics, and Micron Technology plan to mass-produce high bandwidth memory 3E (HBM3E) products in 2024.
Abstract

Compared with other DRAM products, HBM, which is formed by stacking multiple DRAM dies, offers the high bandwidth required by high-performance computing (HPC) applications. Chipmakers Nvidia, AMD, and Intel are therefore actively introducing HBM-equipped products to capitalize on rising AI demand.
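To give a rough sense of why stacking matters, the sketch below estimates per-stack bandwidth from interface width and per-pin data rate. The figures used (a 1,024-bit interface at roughly 9.2 Gb/s per pin, typical of published HBM3E-class specifications) are illustrative assumptions, not numbers taken from this report.

```python
# Back-of-the-envelope HBM bandwidth estimate.
# Assumed, illustrative HBM3E-class figures -- not sourced from the report:
#   - 1,024-bit interface per stack (enabled by die stacking and a wide bus)
#   - ~9.2 Gb/s per pin

def stack_bandwidth_gbps(io_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s (bits converted to bytes)."""
    return io_width_bits * pin_rate_gbps / 8


if __name__ == "__main__":
    per_stack = stack_bandwidth_gbps(1024, 9.2)  # ~1,178 GB/s per stack
    print(f"Per stack: {per_stack:,.0f} GB/s")
    # An accelerator carrying several stacks scales this roughly linearly,
    # e.g. six stacks would approach ~7 TB/s of aggregate bandwidth.
    print(f"6 stacks:  {6 * per_stack / 1000:,.1f} TB/s")
```

Wide, stacked interfaces are what set HBM apart from conventional DRAM modules, whose narrower buses cannot reach comparable per-device bandwidth at practical pin speeds.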

This development has prompted memory device makers to accelerate HBM advancement: SK Hynix, Samsung, and Micron all plan to launch HBM3E in 2024, narrowing the generation gap between their product lines.

In addition to expanding their facilities to ramp up capacity, the three firms have unveiled mass-production roadmaps for their next-generation products. Micron's disclosed product plans and specifications are the most comprehensive, signaling its eagerness to accelerate HBM deployment.
