Micron unveils new products based on leading 176-layer NAND and 1-alpha DRAM process for the new data economy

Press release

The digital transformation driven by the convergence of AI and 5G has made data a key enabler of economic development. As a result, in addition to demanding higher computing performance, memory and storage components are playing an increasingly important role in a wide range of computing architectures. At Computex Taipei 2021, Micron Technology unveiled several new products based on its leading 176-layer NAND and 1α (1-alpha) DRAM technology. Sanjay Mehrotra, Micron President and CEO, and Raj Hazra, Senior Vice President and General Manager of the Compute and Networking Business Unit, also delivered keynotes explaining how memory and storage innovation can drive the expansion of AI applications and unlock new possibilities from the data center to intelligent edge devices.

The potential of the data economy

Sanjay Mehrotra pointed out in a keynote speech titled "Innovation for the Data Economy: Why today's infrastructure innovation brings data to life, powering insights for all" that technology innovation plays the central role in enabling enterprises to accelerate data-driven insights and realize the full potential of the data economy. According to a McKinsey study, advanced data analytics technologies will add $13 trillion to the global economy by 2030, with that added value driven by the ability to make better and faster decisions from massive amounts of data.

"Innovation is occurring more rapidly than ever before," he said. "AI and 5G are at the heart of this change, and the combination of these two technologies is able to create newfound opportunities for the post-pandemic world."

First, with the rapid development of AI, an estimated 75% of enterprises will have shifted from piloting AI solutions to operationalizing them by 2025, bringing sweeping changes across industries. Pervasive AI applications are also driving the transformation of computing platforms so that data analysis can be carried out better and faster. With its ultra-high bandwidth, massive connectivity, and low latency, 5G will fuel the growth of the intelligent edge of the network, enabling data to be accessed and processed in real time.

Sanjay Mehrotra emphasized, "All data comes to life through memory and storage, where Micron shines. This growth and transformation presents an opening for technology innovation to deliver bold new solutions for data center, network infrastructure, and intelligent edge devices."

The world's first 176-layer NAND

Micron aims to take the lead in the market with three strategies: advanced process node technology, product innovation, and ecosystem building. During the speech, Mehrotra announced volume production of the company's first PCIe Gen4 solid-state drives (SSDs) built with the world's first 176-layer NAND. These drives will support the PC market's data-intensive workloads with higher performance, lower power consumption and a more compact form factor.

In addition, Micron is shipping LPDDR4x in volume on its leading 1α node this month, quickly following the introduction of initial 1α node DRAM products in January 2021. The company has also completed validation of its 1α-based DDR4 on leading data center platforms, including 3rd Generation AMD EPYC. Both are in volume production in Micron's advanced DRAM fabrication facilities in Taiwan, including its newly established A3 facility in Taichung.

For automotive applications, Micron announced that it is sampling 128GB and 256GB densities of its 96-layer NAND as part of its new portfolio of UFS 3.1 managed NAND products.

The key to AI innovation

As the industry moves toward the goal of AI ubiquity, what challenges remain to be solved? Raj Hazra explained the importance of memory innovation for expanding AI applications in a speech titled "Memory Is at the Heart of AI Innovation" and at Micron's Computex pre-show press conference.

He pointed out that traditional server architectures were built around the needs of the CPU. With the rise of data-intensive workloads such as AI, HPC and advanced analytics, the industry has tried to cope by adding more CPUs or heterogeneous cores. But memory innovation has not kept pace with compute growth, which makes memory the bottleneck in compute architectures.

Therefore, memory and storage are becoming central to platform innovation. This is why, when the industry defined the next standard for DRAM innovation, it centered that innovation, for the first time, on data center requirements rather than client platform needs.

DDR5 is a once-in-a-decade platform innovation delivering much-needed advancements in memory bandwidth, density and data center capability. To speed market adoption of DDR5, Micron has built significant momentum with its Technology Enablement Program (TEP) for DDR5, launched in 2020. The program has now engaged more than 250 design and technical leaders from more than 100 industry leaders.

The future of memory

Beyond DDR5, Raj Hazra said that supporting AI training and high-performance computing workloads will require rethinking the existing CPU-centric platform and providing breakthrough memory capacity to meet the application requirements of data centers.

In light of this, Micron announced its support for the Compute Express Link (CXL) standard, which aims to transform fundamental compute architectures by enabling new connectivity between compute logic and memory across CPUs, GPUs and other accelerators, and by allowing resources to be configured to workloads in real time.

"Memory must be closer to compute units for faster read and write which can significantly improve the power efficiency. With the introduction of the new open CXL standard, the industry is on the cusp of realizing fully composable data center architectures, enabling to connect CPUs to GPUs and accelerators, storage, memory and networking."

In particular, as AI models grow ever larger, scaling them up or out raises challenges in power consumption, cost, latency, and capacity. A new memory hierarchy, combining near memory such as high-bandwidth in-package, directly attached memory with far memory such as CXL-attached memory, can greatly improve processing performance on hot data.
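As a rough illustration of what such a tiered hierarchy implies for software, the Python sketch below shows a simple placement policy that keeps the hottest buffers in a small near-memory tier and spills the rest to far memory. The capacities, buffer names and access counts are made-up assumptions for illustration only; actual tiering of CXL-attached memory is managed by the platform and operating system, not by application code like this.

```python
# Illustrative sketch of a two-tier (near/far) memory placement policy.
# All capacities, buffer sizes, and access counts are hypothetical; real
# tiering (e.g., for CXL-attached memory) is handled by hardware and the OS.

from dataclasses import dataclass

@dataclass
class Buffer:
    name: str
    size_gb: float
    accesses_per_sec: int  # observed access frequency ("hotness")

NEAR_CAPACITY_GB = 16    # e.g., high-bandwidth in-package memory (assumed)
FAR_CAPACITY_GB = 256    # e.g., CXL-attached expansion memory (assumed)

def place(buffers):
    """Greedy placement: hottest buffers fill near memory first, rest go far."""
    near, far = [], []
    used_near = 0.0
    for buf in sorted(buffers, key=lambda b: b.accesses_per_sec, reverse=True):
        if used_near + buf.size_gb <= NEAR_CAPACITY_GB:
            near.append(buf.name)
            used_near += buf.size_gb
        else:
            far.append(buf.name)
    return near, far

if __name__ == "__main__":
    workload = [
        Buffer("embedding_cache", 12, 900_000),    # hot
        Buffer("activation_scratch", 8, 400_000),  # warm
        Buffer("checkpoint_staging", 120, 500),    # cold
    ]
    near, far = place(workload)
    print("near memory:", near)
    print("far memory: ", far)
```

The point of the sketch is simply that hot data lands in the fast, capacity-limited tier while colder, larger data sets live in the expanded far tier, which is the performance argument the new hierarchy makes.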

Raj Hazra said that, with this architectural innovation, the composable data center of the future will be an optimized infrastructure that can tackle any number of data-centric workloads, from AI training to simulation and modeling to database management and analytics. Actively developing CXL products, Micron is arming itself with a comprehensive memory and storage portfolio to meet the varied needs of AI computing.

Please visit the Micron website for more information.

Micron President and CEO Sanjay Mehrotra