Samsung Electronics will begin mass production next month of its sixth-generation high bandwidth memory, or HBM4, for use in artificial intelligence semiconductors. After falling behind rivals in the fourth and fifth generations, the company is making an aggressive push with HBM4 to reverse that slide and reestablish a clear technology edge.
According to industry sources on Jan. 26, Samsung Electronics is expected to start mass production of HBM4 next month and supply it to major customers, including Nvidia. HBM4 is a next-generation memory designed for advanced AI semiconductors such as Nvidia’s Rubin and AMD’s MI450, both scheduled for release later this year.
Samsung’s HBM4 is reported to offer industry-leading performance. Its data transfer speed reaches 11.7 gigabits per second, exceeding the industry benchmark of 10 gigabits per second. The previous generation, HBM3E, achieved a top speed of 9.6 gigabits per second.
Samsung’s strong HBM4 performance is widely attributed to its use of advanced semiconductor components. The logic die, which sits at the base of the HBM stack, links the stacked DRAM to the graphics processing unit and serves as the stack’s control hub; Samsung builds it on a 4-nanometer process. The DRAM employs sixth-generation technology, known as 1c, based on an 11-nanometer-class process. By contrast, SK hynix’s HBM4 uses a 12-nanometer logic die and fifth-generation DRAM, known as 1b, built on a 12-nanometer-class process. As manufacturing processes advance, higher levels of integration enable faster data processing and improved power efficiency.
Initially, some in the industry questioned whether Samsung Electronics was pursuing an overly aggressive strategy. Because Samsung had trailed competitors in HBM3 and HBM3E, analysts warned that adopting such advanced technology could complicate efforts to ensure stable performance. Concerns centered in particular on the logic die, where foundry capabilities are critical: while SK hynix partnered with Taiwan Semiconductor Manufacturing Co., Samsung chose to rely on its own manufacturing processes. That decision fueled skepticism, as TSMC is widely regarded as holding an advantage over Samsung in the foundry business.
Samsung Electronics has since eased those concerns by stabilizing both the performance and yield of HBM4, earning positive assessments from customers. Kim Dong-won, head of research at KB Securities, said Samsung’s HBM4 is expected to deliver the strongest performance in the industry. Securities firms project Samsung’s share of the HBM market this year will rise to around 30 percent, roughly double its previous level.
Another factor drawing attention to Samsung’s HBM4 is the emergence of new competitors, including Google, Broadcom and AMD, in the AI semiconductor market, an arena long dominated by Nvidia. An industry official said chipmakers are increasingly seeking to differentiate their products by adopting even marginally higher-performing HBM. Samsung’s HBM4 is also expected to be used in Google’s next-generation artificial intelligence chip, the Tensor Processing Unit (TPU).
More detailed plans and production timelines for HBM4 are expected to be outlined during Samsung Electronics’ earnings announcement on Jan. 29, the same day Samsung Electronics and SK hynix are both scheduled to report their fourth-quarter results for last year.
By Park Hyun-ik, reporter (beepark@donga.com)