March 22, 2024 /SemiMedia/ -- Micron CEO Sanjay Mehrotra recently stated that the company's entire 2024 production capacity of high-bandwidth memory (HBM) chips, which are used to run complex artificial intelligence (AI) applications, has been sold out, and that most of the 2025 supply has already been allocated.
Sumit Sadana, Micron's chief commercial officer, said the company has signed new customers for its HBM products but has not yet announced them.
Mehrotra said that demand for AI servers is driving rapid growth in HBM, DDR5, and data center SSDs. Supply of high-end DRAM and NAND is tight, which is supporting prices across the memory and storage market.
Micron expects HBM to contribute positively to its DRAM business and overall gross margin beginning in the third quarter of fiscal 2024, and estimates that its HBM revenue will reach hundreds of millions of dollars this year.
SK Hynix has so far been the sole supplier of HBM chips for Nvidia's GPUs; however, Nvidia will use Micron's latest HBM3E chips in its next-generation H200 GPU.