For Rubin and Feynman: SK Hynix, Samsung and Micron show HBM4E with up to 64 GB (update)


Major memory manufacturers SK Hynix, Samsung, and Micron are at various stages of expansion and brought 48 GB HBM4 to GTC 2025. The picture is similar across all three: 16-layer stacks will spearhead the future portfolio. But even that is not enough.

SK Hynix is riding the wave of success alongside Nvidia. No other memory manufacturer has been as central to HBM; SK Hynix has long been, and still is, Nvidia's most important supplier. But the competition is catching up, which adds pressure for SK Hynix. The manufacturer already has the first 48 GB HBM4 stacks on hand, based on 16 layers of 3 GB memory dies.

HBM4 by SK Hynix

Below the absolute top, and probably the far more common alternative, sits the 12-layer version, "12Hi." The other manufacturers are also in the game here; 36 GB HBM4 stacks are likely to become the new standard. Before that, HBM3E already offers 36 GB in a 12-layer version, again from all manufacturers; the latest HBM roadmaps had already indicated this. SK Hynix states in its press release that it plans to ship 12-layer HBM4 starting at the end of this year, provided there are orders. Rubin's 288 GB of HBM4 is based precisely on these 36 GB stacks.
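The stack arithmetic above can be checked quickly; a minimal sketch using the article's figures (3 GB per layer, 12-layer "12Hi" stacks, 288 GB total on Rubin):

```python
# Back-of-the-envelope check of the 12Hi stack arithmetic (figures from the article).
DIE_GB = 3         # capacity per DRAM layer
LAYERS_12HI = 12   # layers in a "12Hi" stack

stack_gb = DIE_GB * LAYERS_12HI     # 36 GB per stack
rubin_stacks = 288 // stack_gb      # stacks needed for Rubin's 288 GB

print(stack_gb)      # 36
print(rubin_stacks)  # 8
```

The division comes out even, which is consistent with Rubin's 288 GB being built from exactly these 36 GB stacks.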

The manufacturers also give some information on possible speeds and bandwidths. SK Hynix quotes 8.0 Gbps for the chips shown; Samsung speaks of 9.2 Gbps, which likely already refers to an expansion step.
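For context, per-stack bandwidth follows from interface width times per-pin data rate. A minimal sketch, assuming HBM4's 2,048-bit interface (a JEDEC figure, not stated in the article):

```python
# Per-stack bandwidth = interface width (bits) * per-pin rate (Gbps) / 8 bits per byte.
# The 2,048-bit HBM4 interface width is an assumption taken from the JEDEC
# standard, not a figure quoted in the article.
WIDTH_BITS = 2048

def stack_bandwidth_gbs(gbps_per_pin: float) -> float:
    """Bandwidth of one HBM4 stack in GB/s."""
    return WIDTH_BITS * gbps_per_pin / 8

print(stack_bandwidth_gbs(8.0))  # SK Hynix's 8.0 Gbps: 2048.0 GB/s, i.e. ~2 TB/s
print(stack_bandwidth_gbs(9.2))  # Samsung's 9.2 Gbps: 2355.2 GB/s
```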

64 GB in larger stacks is the future

And what's next? SK Hynix is already hinting at it. As the Rubin Ultra image shows, with 16 memory stacks per GPU, HBM stacks will have to keep growing. To reach 1 TB of memory with 16 stacks, each stack must hold around 64 GB.

Rubin Ultra

SK Hynix gets more specific in its development showcase: stacks can grow to 20 or more layers. Mathematically, at least 21 layers would be required, which with the current 3 GB dies yields 63 GB per stack; 16 such stacks come to 1,008 GB, just over 1 TB. All of this applies only to HBM4E and thus to Rubin Ultra, which is at least two years away, so adjustments here and there are still to be expected.

SK Hynix HBM3E and HBM4
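The layer count follows directly from the article's figures; a short sketch of the calculation (3 GB per die, 16 stacks, a 1 TB target read here as 1,000 GB):

```python
import math

# Figures from the article: 3 GB per DRAM layer, 16 stacks per GPU,
# and a 1 TB (treated as 1,000 GB) memory target for Rubin Ultra.
DIE_GB = 3
STACKS = 16
TARGET_GB = 1_000

per_stack_needed = TARGET_GB / STACKS              # 62.5 GB needed per stack
layers = math.ceil(per_stack_needed / DIE_GB)      # smallest layer count that fits
stack_gb = layers * DIE_GB                         # resulting stack capacity
total_gb = stack_gb * STACKS                       # resulting GPU total

print(layers)    # 21
print(stack_gb)  # 63
print(total_gb)  # 1008
```

This reproduces the article's numbers: 21 layers per stack, 63 GB per stack, and 1,008 GB in total.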

Techastuce received information about this item from Nvidia at an event in San Jose, California. Travel costs and five nights of hotel accommodation were covered by the company. There was no manufacturer influence on, or obligation to, report.

Topics: HBM Micron Nvidia Samsung SK Hynix Technologies
