Leading the industry in power efficiency for data centre and cloud AI acceleration

19-06-2025 | Micron | AI

The importance of high-performance memory has never been greater, fuelled by its vital role in supporting the growing demands of AI training and inference workloads in data centres. Micron Technology, Inc. has announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends its leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and a highly capable memory built-in self-test (MBIST) feature, the device delivers seamless integration for customers and partners developing next-generation AI platforms.

As the use of generative AI continues to grow, the ability to manage inference efficiently becomes more significant. The new product features a 2048-bit interface, achieving bandwidth greater than 2 TB/s per memory stack and more than 60% better performance than the previous generation. This wide interface enables rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, it will help AI accelerators respond faster and reason more effectively.
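As a rough sanity check of those headline figures (assuming decimal units, i.e. 1 TB = 10^12 bytes, and that the full 2048-bit interface carries data), the quoted per-stack bandwidth implies a per-pin data rate of roughly 8 Gb/s:

```python
# Back-of-the-envelope estimate of the per-pin data rate implied by the
# announced figures; the decimal-TB convention is an assumption.
interface_width_bits = 2048       # HBM4 interface width per stack
stack_bandwidth_bytes = 2.0e12    # >2 TB/s per stack, per the announcement

per_pin_rate_gbps = stack_bandwidth_bytes * 8 / interface_width_bits / 1e9
print(f"~{per_pin_rate_gbps:.2f} Gb/s per pin")  # ~7.81 Gb/s per pin
```

Doubling the interface width from HBM3E's 1024 bits is what lets HBM4 reach this bandwidth at a moderate per-pin rate, which in turn helps power efficiency.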

Furthermore, the product delivers over 20% better power efficiency than the company's previous-generation HBM3E products, which themselves set industry benchmarks for HBM power efficiency. This improvement maximises throughput per watt, helping data centres run AI workloads at the lowest possible power consumption.

Generative AI use cases continue to multiply, and this transformative technology is poised to deliver considerable benefits to society. The product is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.

"Micron HBM4's performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron's Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers' next-generation AI platform readiness to ensure seamless integration and volume ramp."


By Seb Springall

Seb Springall is a seasoned editor at Electropages, specialising in the product news sections. With a keen eye for the latest advancements in the tech industry, Seb curates and oversees content that highlights cutting-edge technologies and market trends.