Micron has started sampling its next-gen HBM3 Gen2 memory to customers, including NVIDIA, which has praised its performance and efficiency uplift.
During an earnings call on Wednesday, Micron forecast a wider loss for the upcoming quarter due to dwindling demand in the consumer memory market. However, the company also revealed that it has been working closely with NVIDIA and that its high-bandwidth HBM3 Gen2 memory is expected to debut in Team Green's upcoming AI & HPC GPUs in the first half of 2024.
Micron expects the AI frenzy to "rescue" it from its string of negative quarters: Chief Executive Sanjay Mehrotra has predicted "millions of dollars" of revenue from the AI industry.
Micron's next-gen HBM3 technology is branded "HBM3 Gen2". It is built on the company's 1β (1-beta) process node, which enables faster speeds and denser capacities. The first batch of second-generation HBM3 uses an 8-Hi stack design, offering up to 24 GB of capacity, bandwidth exceeding 1.2 TB/s, and up to 2.5x the performance per watt of the prior generation. The DRAM operates at 9.2 Gb/s per pin, which Micron bills as a more than 50% improvement over currently shipping standard HBM3, which runs at 6.4 Gb/s.
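As a quick sanity check on those figures, per-stack HBM bandwidth is simply the per-pin data rate multiplied by the interface width. The sketch below assumes the standard 1024-bit-wide HBM interface per stack (an assumption from the JEDEC HBM3 spec, not stated in the article); the pin rates come from the article itself.

```python
# Back-of-the-envelope check of the quoted HBM bandwidth figures.
# Assumption: a 1024-bit interface per stack, as in the JEDEC HBM3 standard.

def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in GB/s: pin rate (Gb/s) x bus width / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# Standard HBM3 at 6.4 Gb/s per pin:
print(stack_bandwidth_gbs(6.4))   # 819.2 GB/s per stack
# HBM3 Gen2 at 9.2 Gb/s per pin:
print(stack_bandwidth_gbs(9.2))   # 1177.6 GB/s per stack
```

At 9.2 Gb/s the math lands just under 1.2 TB/s, which fits Micron's wording of "exceeding 1.2 TB/s" at pin speeds of 9.2 Gb/s and above.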
Micron has also announced a "12-Hi" design, planned for a later date. Looking at the industry at large, SK Hynix has dominated HBM market share thanks largely to its strong ties with companies like NVIDIA and AMD, but that looks set to change. Micron plans to win HBM business by rapidly adopting next-gen standards, as today's development shows. This is also great news for AI chip makers, who will now have a selection of Micron, Samsung, and SK Hynix HBM3 Gen2 / HBM3e modules to choose from.
Read more on wccftech.com