Samsung, SK Hynix and Micron Battle for HBM3e AI Memory

High Bandwidth Memory (HBM) is a type of DRAM technology that offers a number of advantages:

Lower voltages – HBM is designed to operate at lower voltages, which means it generates less heat.

Higher capacity – HBM can store and process more data at once than previous generations.

Faster training times – Micron's HBM3 Gen2 offers over a 2.5x improvement in performance per watt, which benefits AI and HPC workloads.

Lower cost – HBM is a unique form of DRAM technology that provides extremely high bandwidth at considerably lower cost than SRAM.

Higher density – HBM can be packaged to provide much higher densities than are available with SRAM.

HBM3E delivers increased performance per watt for AI and HPC workloads. Micron designed an energy-efficient data path that reduces thermal impedance, enabling a greater than 2.5x improvement in performance per watt over the previous generation. Micron predicts that by 2025, approximately half of all cloud infrastructure servers will be AI servers, necessitating a sixfold increase in DRAM. The NVIDIA H100 AI GPU is a 7-die package built on TSMC's Chip-on-Wafer-on-Substrate (CoWoS) packaging architecture, with the core GPU compute die at the center surrounded by six HBM stacks.

AMD's MI300 AI accelerator, which AMD claims is faster than the Nvidia H100, has 8 HBM stacks per unit, each with 12 vertically stacked DRAM dies connected by through-silicon vias (TSVs) on a base logic die.

Novel memory technologies such as MRAM, RRAM, and CBRAM have the potential to either complement or provide alternatives to conventional DRAM.
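The "extremely high bandwidth" of HBM comes from its very wide interface: each stack exposes a 1024-bit bus, so even moderate per-pin data rates multiply into terabytes per second across a multi-stack package. A minimal sketch of that arithmetic in Python, assuming an illustrative 9.2 Gb/s per-pin rate for HBM3E (actual shipping speeds vary by vendor and bin):

```python
def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Per-stack bandwidth in GB/s: bus width (pins) times per-pin rate, over 8 bits/byte."""
    return pin_rate_gbps * bus_width_bits / 8

# Illustrative figures, not vendor specifications:
per_stack = stack_bandwidth_gbs(9.2)          # assumed 9.2 Gb/s/pin for HBM3E
package = 6 * per_stack                       # an H100-style package with six HBM sites
print(f"Per stack: {per_stack:.1f} GB/s")
print(f"6-stack package: {package / 1000:.2f} TB/s")
```

The same arithmetic explains why HBM beats conventional DIMMs on bandwidth despite lower clock speeds: a DDR5 channel is only 64 bits wide, one sixteenth of a single HBM stack's bus.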

Samsung has about 40% HBM market share.
SK Hynix has about 30% HBM market share.
Micron has about 26% HBM market share.
Nanya has about 2%.

Micron is ranked third in the world in HBM memory. Micron announced its HBM3E memory on July 26, 2023, and plans to begin high-volume shipments in early 2024. The initial release is expected to be a 24 GB, 8-high HBM3E stack, followed by 36 GB, 12-high stacks in 2025.
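The 24 GB and 36 GB figures follow from simple stack arithmetic if one assumes 24 Gbit DRAM dies (an assumption consistent with the capacities quoted, not a figure stated here): 8 dies of 24 Gbit is 192 Gbit, or 24 GB. A quick sketch:

```python
def stack_capacity_gb(die_density_gbit: int, dies: int) -> float:
    """Stack capacity in GB = number of dies * per-die density (Gbit) / 8 bits per byte."""
    return dies * die_density_gbit / 8

# Assuming 24 Gbit dies (illustrative; matches the capacities in the article):
print(stack_capacity_gb(24, 8))   # 8-high stack  -> 24.0 GB
print(stack_capacity_gb(24, 12))  # 12-high stack -> 36.0 GB
```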

All three leaders are expected to launch HBM3E products in Q2–Q3 2024. Micron should narrowly beat SK Hynix to release, while Samsung will lag by a few months.

Each will then be racing toward HBM4, expected around 2026.

Here is a link to a 14-page product info guide for HBM2E memory, which entered mass production in 2020.