Last updated: December 26th, 2025 at 06:14 UTC+01:00


Samsung could start making its most important AI memory chip ever in February

These chips will be used in Nvidia's next-generation AI accelerator system, Vera Rubin, which will launch in the second half of 2026.

Asif Iqbal Shaik



After missing out on the opportunity to supply fifth-generation high-bandwidth memory (HBM3E) chips in large quantities to Nvidia, Samsung is making sure that it doesn't face the same issues with its sixth-generation high-bandwidth memory (HBM4) chips. It will reportedly start mass production of HBM4 chips in February 2026.

It was recently revealed that Samsung's HBM4 chips passed Nvidia's quality testing with flying colors. A new report from SEDaily now claims that Samsung will begin mass production of HBM4 chips at its Pyeongtaek campus in February 2026. Its primary rival, SK Hynix, will reportedly start mass-producing its own HBM4 chips around the same time.

Most of Samsung's HBM4 chips will be installed in Nvidia's next-generation AI accelerator system called Vera Rubin, which will be launched in the second half of 2026. Some HBM4 chips will also be supplied to Google for its seventh-generation Tensor Processing Units (TPUs).

While SK Hynix has decided to use a 12nm fabrication process for the base die of its HBM4 chips, Samsung has used a 10nm-class process for its own. As a result, Samsung's HBM4 chips reportedly offer better performance: in internal tests, the company has achieved pin speeds of up to 11.7Gbps.
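To put that pin speed in perspective, per-stack HBM bandwidth is roughly pin speed times interface width. The sketch below assumes HBM4's 2048-bit-per-stack interface and an 8Gbps baseline pin speed, both from the JEDEC HBM4 standard rather than from the report itself; the 11.7Gbps figure is Samsung's reported internal test result.

```python
# Back-of-the-envelope HBM bandwidth: pin speed x interface width.
# Assumes HBM4's 2048-bit interface per stack (JEDEC HBM4 standard);
# the 11.7 Gbps figure is Samsung's reported internal test result.

def stack_bandwidth_gbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Return per-stack bandwidth in GB/s (gigabytes per second)."""
    return pin_speed_gbps * bus_width_bits / 8  # convert bits to bytes

samsung_hbm4 = stack_bandwidth_gbps(11.7)   # roughly 3 TB/s per stack
baseline_hbm4 = stack_bandwidth_gbps(8.0)   # JEDEC baseline pin speed, ~2 TB/s
print(f"{samsung_hbm4:.1f} GB/s vs {baseline_hbm4:.1f} GB/s baseline")
```

On these assumptions, Samsung's reported speed would put a single stack at about 3TB/s, versus roughly 2TB/s at the JEDEC baseline rate.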


With the HBM production capacity of both Samsung and SK Hynix reportedly sold out for next year, AI firms are scrambling to procure as many memory chips as possible, since the ongoing chip shortage could create production bottlenecks for companies such as Amazon, Google, Microsoft, and OpenAI. Samsung stands to earn billions from its HBM chip sales.