Data processing speed exceeds 2 TB per second, a more than 60% improvement over the previous generation
Advanced MR-MUF process applied, minimizing warping and improving stability
SK Hynix has become the first company in the world to supply samples of its 12-layer HBM4, taking a step forward in the AI semiconductor race.
SK Hynix announced on the 19th, "Based on the technological competitiveness and production experience with which we have led the HBM market, we have begun shipping 12-layer HBM4 samples earlier than originally planned."
The company plans to begin the certification process with its clients and complete preparations for mass production within the second half of this year.
The newly unveiled 12-layer HBM4 is regarded as the highest-performance DRAM available, capable of the ultra-high-speed data processing that AI semiconductors require.
The data processing speed of this product exceeds 2 terabytes (TB) per second.
This is a speed improvement of more than 60% over the previous generation (HBM3E), fast enough to process more than 400 FHD movies (at 5 GB each) in a single second.
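As a rough illustration of how these figures fit together, the sketch below reproduces the article's arithmetic in Python. The decimal unit convention (1 TB = 1,000 GB) and the 5 GB-per-movie size are assumptions taken from the text, not official specifications; the implied HBM3E figure is worked backward from the "over 60%" claim rather than drawn from a spec sheet.

```python
# Back-of-envelope check of the article's figures (a rough sketch, not official specs).
# Assumptions: decimal units (1 TB = 1,000 GB) and 5 GB per FHD movie, as stated in the article.

hbm4_bandwidth_gb_per_s = 2_000   # 2 TB/s expressed in GB/s
movie_size_gb = 5                 # article's size for one FHD movie

movies_per_second = hbm4_bandwidth_gb_per_s / movie_size_gb
print(f"FHD movies movable per second: {movies_per_second:.0f}")      # ~400

# Implied previous-generation (HBM3E) bandwidth, derived from the "over 60% faster" claim.
implied_hbm3e_gb_per_s = hbm4_bandwidth_gb_per_s / 1.6
print(f"Implied HBM3E bandwidth: ~{implied_hbm3e_gb_per_s:.0f} GB/s")  # ~1,250 GB/s
```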
The company emphasized that this new product is world-class not only in terms of speed but also in terms of capacity.
In addition, SK Hynix applied the ‘Advanced MR-MUF’ process to the manufacturing of this new product.
This process minimizes the warping that can occur when memory chips are stacked and also greatly improves heat dissipation, increasing the stability of the finished product.
Through this, the company also succeeded in achieving a capacity of 36 GB with a 12-layer stack.
SK Hynix has continuously demonstrated its competitiveness in technological development and mass production of HBM products in recent years.
Starting with mass production of HBM3 in 2022, the company secured leadership in the global AI memory market by being the first in the world to mass produce HBM3E 8-layer and 12-layer products in 2024.
This HBM4 sample supply is also interpreted as a move to take the lead in the next-generation AI memory market.
Kim Joo-sun, President of SK Hynix AI Infrastructure Division, said, "We have continuously overcome technological limitations in response to customer demands," adding, "Based on our experience as the world's largest HBM supplier, we will complete performance verification and mass production preparations without a hitch."
The company's strategy is to further strengthen its competitiveness in the future ultra-large AI market through this product.
With the explosive growth of the AI market and rapidly increasing demand for ultra-high-speed, high-capacity memory, the industry is watching what impact SK Hynix's HBM4 will have on the global semiconductor market landscape.