Samsung expands its HBM-PIM memory to more applications

Last updated: August 24th, 2021 at 09:32 UTC+02:00

Earlier this year, Samsung unveiled the world’s first HBM-PIM (High-Bandwidth Memory Processing-In-Memory) modules, which integrate processing directly into the memory. This new memory technology offers faster performance for AI applications while reducing power consumption. Now, the South Korean firm has announced that it will expand the technology to more applications.

During Hot Chips 33, an annual conference where new semiconductor technologies are presented, Samsung Semiconductor announced the first commercial application of HBM-PIM. The technology has been tested in the Xilinx Virtex UltraScale+ (Alveo) AI accelerator, where it delivered 2.5x faster processing while offering 60% better power efficiency. This shows that HBM-PIM is ready for a wider range of applications, including commercial servers and mobile devices.

AXDIMMs (Acceleration DIMMs) are DRAM modules that bring PIM technology to the DIMM form factor, combining memory chips with processing cores. This combination reduces large data movements between the CPU and DRAM. The AI engine built into the buffer chip can process multiple memory ranks in parallel, greatly improving speed and power efficiency.
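The data-movement saving described above can be sketched conceptually. The code below is purely illustrative and does not reflect Samsung's actual AXDIMM interface: it models each memory rank as a shard of a lookup table and compares how much data crosses the memory bus when a reduction is done on the host versus near the memory, where only small partial results travel to the CPU.

```python
# Conceptual sketch of rank-level parallel processing (illustrative only;
# the function and variable names are NOT Samsung's AXDIMM API).
# Each "rank" holds rows of a table; the PIM-style path sums rows inside
# the module, so only one partial result per rank crosses to the host.

def host_side_gather(ranks, indices):
    """Conventional path: every looked-up row is moved to the CPU."""
    moved = 0
    total = [0.0] * len(ranks[0][0])
    for rank in ranks:
        for i in indices:
            row = rank[i]                 # full row crosses the memory bus
            moved += len(row)
            total = [t + r for t, r in zip(total, row)]
    return total, moved

def near_memory_gather(ranks, indices):
    """PIM-style path: each rank reduces locally, sends one partial sum."""
    moved = 0
    total = [0.0] * len(ranks[0][0])
    for rank in ranks:
        partial = [0.0] * len(rank[0])
        for i in indices:                 # reduction happens inside the module
            partial = [p + r for p, r in zip(partial, rank[i])]
        moved += len(partial)             # only the partial result is moved
        total = [t + p for t, p in zip(total, partial)]
    return total, moved

# 4 ranks, each with 16 rows of 8 values; look up 3 rows per rank.
ranks = [[[1.0] * 8 for _ in range(16)] for _ in range(4)]
dense, traffic_host = host_side_gather(ranks, [0, 3, 5])
pim, traffic_pim = near_memory_gather(ranks, [0, 3, 5])
assert dense == pim                       # same result either way
print(traffic_host, traffic_pim)          # prints "96 32"
```

The arithmetic matches the intuition: the host path moves 4 ranks × 3 rows × 8 values = 96 values, while the near-memory path moves only 4 partial sums of 8 values = 32, and the gap widens as lookups per rank grow.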

Samsung AXDIMM PIM Modules

These AXDIMM modules retain the conventional DIMM form factor, so existing DIMMs can be swapped out for AXDIMMs without hardware changes. They are currently being tested on customer servers, where Samsung says they deliver twice the performance in AI-based recommendation applications with 40% better power efficiency. SAP has been testing AXDIMM to accelerate its SAP HANA database.

Samsung also announced that it will soon bring processing-in-memory to mobile devices through LPDDR5-PIM modules. These modules offer twice the performance in certain AI tasks such as chatbots, translation, and voice recognition. The world leader in the memory segment plans to standardize the PIM platform in the first half of 2022 by working with other industry leaders.

Nam Sung Kim, Senior Vice President of DRAM Product & Technology at Samsung Electronics, said, “HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential. Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers.”

