Micron just unveiled specs for its new high-bandwidth memory for AI, which appears to one-up the fastest HBM on the market.

Leading memory producer Micron Technology and its peers are in one of the worst memory downturns in history, but the hope is that memory-hungry artificial-intelligence (AI) applications will help the industry dig itself out. Micron impressively caught up to and surpassed Samsung and SK Hynix in the most widely used forms of DRAM and NAND flash, becoming the first company to develop 1-beta DRAM and 232-layer NAND last year. But Micron had fallen behind its competitors in the crucial high-bandwidth memory (HBM) market for AI.

HBM is high-capacity stacked DRAM that is essential for training AI models and running inference quickly, and it's considered one of the main bottlenecks to unlocking more powerful AI. HBM makes up only about 1% of the DRAM market today, but it's expected to grow at a 45% average annual rate or more for several years, driven by demand from AI processing. That makes it one of the lone growth drivers in the DRAM market today, and investors didn't appreciate that Micron had fallen behind. SK Hynix is thought to be the leader in HBM, having begun development back in 2013. However, on July 26, Micron announced its newest HBM product, which appears to blow the competition out of the water.
Full story: Micron Just Changed the Game in Artificial-Intelligence Memory.