Micron unveils its 36GB HBM3E memory as it tries to catch up with Samsung and SK Hynix, which are racing toward the next breakthrough: HBM4, with 16 layers, 1.65TB/s of bandwidth, and 48GB stacks.
Micron's new 36GB HBM3E memory delivers higher bandwidth while keeping power consumption low.
Micron has officially introduced its 12-layer 36GB HBM3E memory, marking its entry into the competitive market for high-performance memory, especially for artificial intelligence systems and data-intensive applications. As AI workloads grow more complex and data volumes swell, energy-efficient, high-bandwidth memory becomes essential. Micron's HBM3E aims to balance processing speed with lower energy consumption, characteristics that are typically difficult to reconcile in high-performance systems.
With 50% more capacity than current HBM3E offerings, Micron's 36GB part positions itself as a key component for artificial intelligence accelerators and data centers handling large workloads. It delivers over 1.2 terabytes per second (TB/s) of memory bandwidth at a pin speed exceeding 9.2 gigabits per second (Gb/s), ensuring rapid access to data for AI applications. Micron also claims it consumes 30% less power than competing products.
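The quoted figures are easy to sanity-check. A minimal sketch, assuming the standard 1024-bit-per-stack HBM interface and 24Gb (3GB) DRAM dies (neither detail is stated in the article):

```python
# Back-of-the-envelope check of the quoted HBM3E figures.
# Assumptions (not stated in the article):
#   - the standard HBM interface of 1024 data pins per stack
#   - 24-gigabit (3GB) DRAM dies, as the 12-layer/36GB pairing implies

PINS_PER_STACK = 1024   # bits transferred per cycle per stack
PIN_SPEED_GBPS = 9.2    # quoted per-pin data rate, Gb/s

# Bandwidth per stack: pins x per-pin rate, converted from Gb/s to GB/s
bandwidth_gbs = PINS_PER_STACK * PIN_SPEED_GBPS / 8
print(f"{bandwidth_gbs:.1f} GB/s")   # 1177.6 GB/s, i.e. roughly 1.2 TB/s

# Capacity: 12 stacked dies of 3GB each
DIE_GB = 24 / 8                      # 24-gigabit die = 3 GB
print(f"{12 * DIE_GB:.0f} GB")       # 36 GB
```

At pin speeds slightly above 9.2 Gb/s, as Micron specifies, the per-stack bandwidth crosses the advertised 1.2 TB/s mark.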
Despite the significant gains in capacity and energy efficiency that its HBM3E offers, Micron enters a field dominated by Samsung and SK Hynix, which are racing to develop the next generation of high-bandwidth memory: HBM4. HBM4 is expected to stack 16 layers of DRAM and deliver more than 1.65TB/s of bandwidth, far surpassing the capabilities of HBM3E. With configurations reaching up to 48GB per stack, HBM4 will also provide greater memory capacity, letting artificial intelligence systems handle increasingly complex workloads.
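The generational gap can be sketched from the numbers above. A rough comparison, assuming both generations stack 24Gb (3GB) dies, which is what the quoted capacities imply (12 × 3GB = 36GB, 16 × 3GB = 48GB):

```python
# Comparing Micron's 12-layer HBM3E against the projected HBM4 specs.
# Assumption: both use 24-gigabit (3GB) DRAM dies per layer.

DIE_GB = 3  # 24-gigabit DRAM die

hbm3e_capacity = 12 * DIE_GB   # 36 GB per stack (Micron's new part)
hbm4_capacity = 16 * DIE_GB    # 48 GB per stack (projected HBM4)

hbm3e_bw_tbps = 1.2            # quoted HBM3E bandwidth, TB/s
hbm4_bw_tbps = 1.65            # projected HBM4 bandwidth, TB/s

print(hbm4_capacity - hbm3e_capacity)              # 12 GB more per stack
print(f"{hbm4_bw_tbps / hbm3e_bw_tbps:.3f}x")      # 1.375x the bandwidth
```

In other words, the projected HBM4 parts would carry a third more capacity and nearly 40% more bandwidth per stack than the part Micron is shipping today.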
Despite the challenge posed by its competitors, Micron's 12-layer HBM3E keeps the company an important player in the artificial intelligence ecosystem. Micron has already begun shipping production units to key industry partners for validation, a step toward integration into AI accelerators and data center infrastructure. Thanks to its strong network of partnerships across the ecosystem, Micron's memory can slot into existing systems and boost the performance of AI workloads.
One notable collaboration is Micron's participation in TSMC's 3DFabric Alliance, which helps optimize the manufacturing of artificial intelligence systems. The partnership supports the development of Micron's HBM3E and eases its integration into advanced semiconductor designs, further enhancing the capabilities of AI accelerators and supercomputers.