
SK Hynix to mass produce HBM3E, the world's best memory, next year

Published 2023.08.21 12:57

Customer sampling in progress, mass production expected in the first half of next year
Expanding AI memory market expected to accelerate earnings rebound

SK Hynix has developed a high-performance memory semiconductor, 'HBM3E', to realize the highest performance AI technology to date.

SK Hynix announced on the 21st that it has developed a new ultra-high-performance DRAM for AI, "HBM3E," and has begun supplying samples to customers to conduct performance verification. Mass production of the product is slated to begin in the first half of next year.

High Bandwidth Memory (HBM) vertically stacks multiple DRAM dies, dramatically increasing data throughput compared to conventional DRAM. As demand for high-performance AI accelerator chips grows, the HBM market is also expanding rapidly.
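The throughput gain comes largely from HBM's very wide interface: each stack exposes a 1024-bit bus (a figure from the JEDEC HBM standards, not stated in this article), so even moderate per-pin data rates yield very high aggregate bandwidth. A rough sketch of the arithmetic:

```python
# Illustrative HBM bandwidth arithmetic. The 1024-bit bus width is an
# assumption taken from the JEDEC HBM standards, not from this article.
BUS_WIDTH_BITS = 1024  # bits per HBM stack interface

def stack_bandwidth_gb_per_s(pin_rate_gbps: float) -> float:
    """Aggregate bandwidth in GB/s for one stack at a given per-pin rate."""
    return pin_rate_gbps * BUS_WIDTH_BITS / 8  # convert bits to bytes

# At roughly 9 Gbps per pin, one stack reaches about the 1.15 TB/s
# figure SK Hynix quotes for HBM3E (1152 GB/s ≈ 1.15 TB/s).
print(stack_bandwidth_gb_per_s(9.0))  # 1152.0
```

The per-pin rate used here is back-calculated from the article's 1.15 TB/s figure, not an official specification.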

According to a report by Shinhan Investment & Securities, demand for new high-end products such as DDR5 and HBM3 is expanding faster than expected, and with signs of SK Hynix's HBM3 investment emerging, detailed shipment plans are reportedly under discussion.

Starting with first-generation HBM, SK Hynix developed the fourth-generation HBM3 last year. According to company officials, that product is expected to reach the market in the second half of this year.

The fifth-generation HBM3E is an extended version of HBM3. Because it is expected to be used in NVIDIA's GB100, slated for release in 2025, the industry had anticipated a first-quarter 2024 release.

With this announcement, SK Hynix said it is now targeting mass production in the first half of 2024, moving the schedule up from the previously expected second half.

In the second-quarter results SK Hynix announced in July, the memory semiconductor business posted a loss, but sales of high-spec products such as HBM3 and DDR5 increased. With demand for AI servers and high-bandwidth memory expected to rise, the verification results for this product are expected to influence the earnings recovery of SK Hynix, which has secured orders from NVIDIA.

SK Hynix emphasized, “Based on our experience in exclusively mass-producing HBM3, we have succeeded in developing HBM3E, an extended version that achieves the world’s highest performance,” and “Based on the industry’s largest HBM supply experience and mass production maturity, we will begin mass production of HBM3E in the first half of next year and solidify our unrivaled position in the AI memory market.”

“NVIDIA has long collaborated with SK hynix on HBM for cutting-edge accelerated computing solutions,” said Ian Buck, vice president of Hyperscale and HPC at NVIDIA. “We look forward to continued collaboration between our companies on HBM3E to deliver the next generation of AI computing.”

■ HBM3E, 1.15TB data processing per second

SK Hynix claimed that HBM3E delivers the best performance to date across the specifications essential for AI memory, including speed, heat control, and customer usability.

HBM provides the high bandwidth AI accelerator chips need so that AI algorithms can access data quickly. The newly developed HBM3E can process up to 1.15 terabytes (TB) of data per second, equivalent to processing more than 230 full-HD (FHD) movies of 5 GB each in a single second.
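The movie comparison is straightforward arithmetic: 1.15 TB per second divided by 5 GB per file, with TB taken as 1000 GB in the decimal convention marketing figures use. A quick check:

```python
# Sanity-check the article's figure: 1.15 TB/s expressed as 5 GB FHD movies.
GB_PER_TB = 1000  # decimal convention, as used in marketing figures

bandwidth_gb_per_s = 1.15 * GB_PER_TB  # 1150 GB/s
movie_size_gb = 5                      # one FHD movie, per the article

movies_per_second = bandwidth_gb_per_s / movie_size_gb
print(movies_per_second)  # 230.0
```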

SK Hynix's engineers said, "By applying the latest Advanced MR-MUF technology to this product, we have improved its heat dissipation performance by 10% compared to previous models." The technique stacks semiconductor chips, injects a liquid protective material into the gaps between them to protect the circuitry, and then solidifies it. Compared with layering film-type materials on top of each chip, this process is more efficient and has been evaluated as more effective at dissipating heat.

Additionally, HBM3E is backward compatible with previous versions. Customers can apply the new product to HBM3-based systems without any design or structural changes.

SK Hynix Vice President Ryu Seong-su (in charge of DRAM product planning) said, “With HBM3E, we have solidified our market leadership by enhancing the completeness of our product lineup in the HBM market, which is gaining attention along with the advancement of AI technology,” and “Going forward, the proportion of supply of HBM, a high value-added product, will continue to increase, accelerating the upward trend in business performance.”

Meanwhile, Samsung Electronics announced on the 20th that it will establish a new research and development (R&D) corporation in the United States, Samsung Federal Inc. (SFI), to actively collaborate with U.S. national research institutions on semiconductors. The corporation will be responsible for Samsung Semiconductor's AI and high-performance computing (HPC) projects for U.S. federal government (USG) agencies and contractors.