HBM in the Spotlight Again


Friday, April 11, 2025

High-bandwidth memory (HBM) is again in the limelight. At GTC 2025, held in San Jose, California, from 17 to 21 March, SK hynix displayed its 12-high HBM3E devices for artificial intelligence (AI) servers. The Korean memory maker also showcased a model of its 12-high HBM4, currently under development, claiming that it is now completing preparatory work for large-scale production of the 12-high HBM4 in the second half of 2025.

Micron, another leading memory supplier, is signaling strong demand for its HBM chips in AI and high-performance computing (HPC) applications. Micron’s chief business officer, Sumit Sadana, told Reuters that all of Micron’s HBM chips are sold out for the calendar year 2025.

HBM, essentially a 3D structure of vertically stacked DRAM dies on top of a logic die, relies on advanced packaging technologies such as through-silicon vias (TSVs) and uses a silicon interposer for interconnection with the processor. It is proving highly suitable for parallel compute environments such as HPC and AI workloads.
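As a rough, back-of-the-envelope illustration of where that bandwidth comes from, the short Python sketch below multiplies interface width by per-pin data rate for a single stack. The figures used (a 1024-bit interface, 9.6 Gb/s per pin, and six stacks per accelerator, roughly HBM3E-class) are illustrative assumptions rather than any vendor's specification.

# Back-of-the-envelope HBM bandwidth estimate.
# The interface width, pin rate, and stack count below are illustrative
# assumptions (roughly HBM3E-class), not vendor specifications.

def stack_bandwidth_gb_per_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in gigabytes per second."""
    return bus_width_bits * pin_rate_gbps / 8  # gigabits/s -> gigabytes/s

per_stack = stack_bandwidth_gb_per_s(1024, 9.6)  # about 1,229 GB/s per stack
six_stacks = 6 * per_stack                       # e.g. six stacks on one interposer

print(f"Per stack:  {per_stack:,.0f} GB/s")
print(f"Six stacks: {six_stacks:,.0f} GB/s")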

That’s because it can service multiple memory requests simultaneously from the many cores in GPUs and AI accelerators, facilitating parallel workload processing. In fact, HBM has become the main avenue for overcoming the memory bottlenecks in data-intensive HPC and AI workloads that would otherwise leave AI processors underutilized.
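To make the bottleneck argument concrete, the sketch below applies a simple roofline-style check: a workload whose arithmetic intensity (floating-point operations per byte fetched from memory) falls below the accelerator's compute-to-bandwidth ratio cannot keep the compute units busy. All throughput and bandwidth figures are assumed for illustration only.

# Minimal roofline-style check: memory-bound versus compute-bound.
# Peak throughput and bandwidth figures are illustrative assumptions.

peak_flops = 1.0e15        # assumed accelerator peak: 1 PFLOP/s
mem_bandwidth = 3.0e12     # assumed HBM bandwidth: 3 TB/s
machine_balance = peak_flops / mem_bandwidth  # FLOPs per byte needed to stay busy

def attainable_flops(arithmetic_intensity: float) -> float:
    """Attainable throughput for a workload with the given FLOPs-per-byte ratio."""
    return min(peak_flops, arithmetic_intensity * mem_bandwidth)

for intensity in (10, 100, 1000):  # FLOPs per byte moved from HBM
    utilization = attainable_flops(intensity) / peak_flops
    regime = "memory-bound" if intensity < machine_balance else "compute-bound"
    print(f"intensity {intensity:>4} FLOP/byte -> {utilization:.0%} utilization ({regime})")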

What’s also pivotal about HBM devices is their continued development to improve AI accelerator performance. For instance, current-generation HBM3E devices use thermo-compression bonding with micro-bumps and underfills to stack DRAM dies. HBM makers such as Micron, Samsung, and SK hynix are now transitioning toward HBM4 devices, which employ advanced packaging technologies such as copper-copper hybrid bonding to increase input/output (I/O) counts, lower power consumption, improve heat dissipation, and reduce electrode dimensions.

Market research firm IDTechEx’s report “Hardware for HPC, Data Centers, and AI 2025-2035: Technologies, Markets, Forecasts” assesses the key developments and trends in HBM devices serving AI and HPC workloads. It forecasts that HBM unit sales will increase 15-fold by 2035 compared with 2024.
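For context, a 15-fold increase between 2024 and 2035 implies a compound annual growth rate of roughly 28 percent. The quick check below shows the arithmetic, treating the growth window as 11 years (an assumption about how the forecast period is counted).

# Implied compound annual growth rate (CAGR) for a 15x increase, 2024 to 2035.
# Treating the window as 11 years is an assumption about the forecast framing.

growth_multiple = 15
years = 2035 - 2024  # 11 years

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 28% per year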

HBM was a prominent technology highlight in 2024 for its ability to overcome the memory wall for AI processors. With the emergence of HBM4 memory devices, that trend is likely to continue in 2025 and beyond.

By: DocMemory
Copyright © 2023 CST, Inc. All Rights Reserved
