Last updated: December 4th, 2025 at 11:09 UTC+01:00


Samsung unveils HBM4 chip that could be its biggest money-maker

Samsung’s HBM4 chip is manufactured on the more advanced 1c process, which could give it an edge over its competitors.

Asif Iqbal Shaik

Reading time: 2 minutes


After losing ground to Micron and SK Hynix in the high-bandwidth memory segment over the past couple of years, Samsung has unveiled its first HBM4 chip at the Semiconductor Exhibition (SEDEX) 2025 expo in South Korea. The chip is the company’s most advanced memory product yet, and it marks a major step in Samsung’s effort to win back market share over the next few years.

Samsung unveils HBM4 chips that could be used in AI servers worldwide

At the SEDEX 2025 exposition, held at COEX in Seoul, South Korea, from October 22 to 24, 2025, Samsung unveiled (via Asia Business Daily) its first HBM4 chip. HBM4, the sixth generation of high-bandwidth memory, is used in AI accelerators made by companies such as AMD and Nvidia. These accelerators power the generative AI models used by some of the world’s largest firms.

The performance of Samsung’s HBM4 chip is crucial. If Nvidia decides to buy Samsung’s HBM4 chips, the South Korean firm could potentially earn billions of dollars in operating profit every quarter for the next few years.

[Image: Samsung’s HBM4 and HBM3E chips on display at SEDEX 2025. Image Credits: Asia Business Daily]

SK Hynix, Samsung’s biggest rival, has also completed the development of its HBM4 chip. The company showcased its HBM4 chip alongside Samsung at the same expo. It is reportedly in advanced talks with Nvidia for a large-scale supply. Micron, Samsung, and SK Hynix have all sent their HBM4 chips to Nvidia, which will test them over the next few weeks before deciding which company to offer the contract to.

Samsung developed its HBM4 chips using its 10nm-class, sixth-generation (1c) DRAM process. This process is considered more advanced than the 10nm-class, fifth-generation (1b) process that SK Hynix uses for its HBM4 chips. While the 1c process theoretically offers higher performance, it won’t be clear whose HBM4 chips perform better until Nvidia puts them through testing.

Currently, the best AI accelerators all use HBM3E chips, which Samsung has been selling for about a year. HBM4 will debut in Nvidia’s next-generation AI accelerator, Rubin. Samsung lost significant business last year due to issues with its HBM3E chips, and it is determined not to repeat those mistakes.
