Micron has started volume production of its HBM3E (High Bandwidth Memory 3E) memory product.

Due to start shipping in the second quarter of 2024, Micron’s 24GB 8-high HBM3E will form part of Nvidia’s H200 Tensor Core GPU.

Micron, Shanghai, China – Micron

In a statement, Micron said its HBM3E has been designed to address the demands AI workloads are placing on memory solutions. It offers pin speeds greater than 9.2Gbps and more than 1.2TBps of memory bandwidth, enabling fast data access for AI accelerators, supercomputers, and data centers.
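As a rough illustration of how those two figures relate, the sketch below multiplies the quoted pin speed by the standard 1,024-bit HBM stack interface width (an assumption, as the interface width is not stated in Micron's announcement) to estimate per-stack bandwidth.

```python
# Rough sanity check of the quoted bandwidth figure, assuming the standard
# 1,024-bit HBM stack interface (an assumption; the width is not stated in
# Micron's announcement).
PIN_SPEED_GBPS = 9.2          # quoted pin speed in gigabits per second, per pin
INTERFACE_WIDTH_BITS = 1024   # assumed HBM interface width per stack

bandwidth_gb_per_s = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS / 8  # gigabytes per second
bandwidth_tb_per_s = bandwidth_gb_per_s / 1000                  # terabytes per second

# At exactly 9.2Gbps this works out to ~1.18TB/s per stack; pin speeds slightly
# above 9.2Gbps push the figure past the ">1.2TBps" quoted by Micron.
print(f"~{bandwidth_tb_per_s:.2f} TB/s per stack")
```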

Produced using the company’s 1-beta process technology, the HBM3E also offers approximately 30 percent lower power consumption than competing solutions, Micron said.

The 36GB 12-high HBM3E is due to begin sampling in March 2024.

“Micron is delivering a trifecta with this HBM3E milestone: time-to-market leadership, best-in-class industry performance, and a differentiated power efficiency profile,” said Sumit Sadana, executive vice president and chief business officer at Micron.

“AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications.”

In June 2023, Micron announced plans to build a new $2.75 billion semiconductor ATMP (assembly, testing, marking, and packaging) facility in Gujarat, India. The plant was proposed under the Modified Programme for Semiconductors and Display Fab Ecosystem, a $10 billion scheme authorized by the Indian government in December 2021. The program allows companies to apply for support covering up to 50 percent of the capital costs of eligible semiconductor and display manufacturing projects.

Set to be developed in two phases, the project is expected to create up to 5,000 new Micron jobs and 15,000 community jobs over the next several years.