Micron sampling MRDIMMs for AI and HPC

Jessie Shen, DIGITIMES Asia, Taipei

Micron MRDIMMs. Credit: Micron

Micron Technology has announced that it is sampling its Multiplexed Rank Dual Inline Memory Modules (MRDIMMs) for memory-intensive applications, including AI and HPC.

The MRDIMMs will let Micron customers run increasingly demanding workloads and maximize the value of their compute infrastructure. In applications that require more than 128GB of memory per DIMM slot, Micron MRDIMMs surpass current TSV RDIMMs by delivering higher bandwidth, larger capacity, lower latency, and better performance per watt, accelerating memory-intensive virtualized multi-tenant, HPC, and AI data center workloads.

The new memory offering is the first generation in the Micron MRDIMM family and will be compatible with Intel Xeon 6 processors, according to the memory chip vendor.

Built on DDR5 physical and electrical standards, MRDIMM technology allows both bandwidth and capacity per core to scale, future-proofing compute systems and meeting the expanding demands of data center workloads.

Compared with RDIMMs, MRDIMMs increase effective memory bandwidth by up to 39%, improve bus efficiency by more than 15%, and reduce latency by up to 40%.

MRDIMMs support a wide capacity range of 32GB to 256GB in standard and tall form factors (TFF), making them suitable for high-performance 1U and 2U servers. The TFF modules' improved thermal design lowers DRAM temperatures by up to 20 degrees Celsius at the same power and airflow, allowing more efficient cooling in data centers and optimizing overall system energy use for memory-intensive workloads.

Micron's memory design and process technology using 32Gb DRAM die enables 256GB TFF MRDIMMs to fit the same power envelope as 128GB TFF MRDIMMs built on 16Gb die: because the 32Gb die doubles per-die density, a 256GB module uses the same number of die as a 128GB module on 16Gb die. A 256GB TFF MRDIMM delivers a 35% performance improvement over similar-capacity TSV RDIMMs at the maximum data rate, allowing data centers to achieve significant TCO benefits over TSV RDIMMs.

Micron MRDIMMs are sampling now and will ship in volume in the second half of calendar year 2024. Subsequent generations of MRDIMMs will continue to provide up to 45% more memory bandwidth per channel than same-generation RDIMMs, according to the company.

"By leveraging DDR5 interfaces and technology, MRDIMMs provide seamless compatibility with existing Xeon 6 CPU platforms, giving customers flexibility and choice," said Matt Langman, VP and GM of data center product management, Intel Xeon 6 at Intel, Micron's statement. "MRDIMMs provide customers a full choice of higher bandwidth, lower latencies and capacity points for HPC, AI, and a plethora of workloads, all on the same Xeon 6 CPU platforms that also support standard DIMMs."

In the same statement, Lenovo VP and GM of AI and high-performance computing Scott Tease said, "Micron MRDIMMs will help close the bandwidth gap for memory-intensive workloads like AI inference, AI retraining, and countless high-performance computing workloads. Our collaboration with Micron is stronger than ever, and laser-focused on delivering balanced, high-performance, sustainable technology solutions to our mutual customers."