Big win for Samsung as NVIDIA picks its 7nm process for next-gen GPU
Samsung has made no secret of the fact that it wants to offset the decline in its memory business with non-memory chips. Its foundry business already makes chips for the likes of Intel, but the company has scored a big win today. NVIDIA confirmed its plans to have its next-gen GPU made on Samsung’s 7nm process.
It was reported last month that NVIDIA had decided to fab its next-gen Ampere GPU architecture on Samsung’s 7nm EUV process. TSMC has long been NVIDIA’s fab partner for GPUs, but Samsung has won out this time.
Samsung to manufacture NVIDIA’s next-gen GPU
The market had expected NVIDIA to stick with TSMC’s 7nm process for its next-gen GPU. However, subsequent reports suggested that Samsung had “aggressively undercut” TSMC on price. It was a bold move by the company, which continues to work hard to win customers away from its foremost competitor.
The move has done the trick. NVIDIA Korea chief Yoo Eung-joon confirmed during a press conference today that production of the company’s next-gen GPU architecture will be carried out by Samsung on its 7nm EUV process. “It is meaningful that Samsung Electronics’ 7-nanometer process would be used in manufacturing our next-generation GPU,” Yoo said.
Yoo didn’t say just how much of the foundry production will be handled by Samsung, only suggesting that it would be “substantial.” This isn’t the first time NVIDIA has relied on Samsung’s manufacturing; it did so for the GTX 1050 and GTX 1050 Ti cards, so this isn’t an unprecedented development. Ampere-based NVIDIA graphics cards are set to hit the market next year.
NVIDIA’s decision might also be based on supply considerations. TSMC’s 7nm node has already seen very high demand from the likes of Apple and AMD. Opting for Samsung’s 7nm process may give NVIDIA more supply so that it can effectively meet demand. The agreement would certainly provide a boost to Samsung’s foundry business and help the company offset the decline on the memory side.