TrendForce reports that next-gen HBM3 and HBM3e memory will dominate the AI GPU industry, especially as interest from companies looking to incorporate the DRAM has risen significantly.
The existing NVIDIA A100 and H100 AI GPUs are powered by HBM2e and HBM3 memory, which debuted in 2018 and 2020, respectively. Several manufacturers, including Micron, SK Hynix, and Samsung, are rapidly building out facilities for mass production of newer and faster HBM3 memory, and it won't be long before it becomes the new benchmark.
There is one detail about HBM3 memory that isn't widely known. As highlighted by TrendForce, HBM3 will arrive in different variants. The lower-end HBM3 will reportedly run at 5.6 to 6.4 Gbps, while the higher variants will surpass the 8 Gbps mark. The higher-end variants will be marketed under names such as HBM3P, HBM3A, HBM3+, and HBM3 Gen2.
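For a sense of what those pin speeds mean in practice, here is a rough per-stack figure, assuming the standard 1024-bit interface that the JEDEC HBM3 spec defines per stack (the interface width is an assumption on our part, not a figure from TrendForce's report):

$$
\text{Per-stack bandwidth} = \frac{6.4\ \text{Gb/s per pin} \times 1024\ \text{pins}}{8\ \text{bits per byte}} = 819.2\ \text{GB/s}
$$

By the same arithmetic, an 8 Gbps-class variant would deliver roughly 1 TB/s per stack.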
We recently reported that the HBM industry is bound to witness a significant rise in market share, with SK Hynix in the lead. Future AI GPUs, such as AMD's Instinct MI300 and NVIDIA's H100, are expected to feature the next-gen HBM3 process, where SK Hynix has the upper hand since it has already reached the manufacturing stage and has received a sample request from NVIDIA itself.
Micron also recently announced plans for its future HBM4 memory design, but that isn't expected until 2026, so the upcoming NVIDIA Blackwell GPUs, codenamed "GB100," are likely to utilize the faster HBM3 variants when they arrive between 2024 and 2025. Mass production of HBM3e memory, which utilizes 5th-gen process technology (10nm-class), is expected to commence in the first half of 2024, with both Samsung and SK Hynix ramping up output.
Competitors like Samsung and Micron are pushing the throttle as well, with several reports pointing to their own production ramp-ups for the new memory standard.