[APEC 2025] Samsung Electronics says it is 'working together' with Nvidia on HBM4 supply

SEOUL, Oct. 31 (Yonhap) -- Samsung Electronics Co. said Friday it is in close talks with U.S. tech giant Nvidia Corp. over the supply of next-generation high bandwidth memory (HBM) chips as the two companies aim to expand ties in the artificial intelligence (AI) industry.

The announcement came after Nvidia said it will provide its latest Blackwell graphics processing units (GPUs) to South Korean companies, including Samsung. Nvidia's surprise announcement was made on the sidelines of the Asia-Pacific Economic Cooperation (APEC) CEO Summit in Gyeongju, some 275 kilometers southeast of Seoul.

"In addition to our ongoing collaborations, Samsung and Nvidia are also working together on HBM4," Samsung Electronics said in a release.

"With incredibly high bandwidth and energy efficiency, Samsung's advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for manufacturing infrastructure driven by these technologies," it added.

HBM is a core component for AI servers, with its demand rising rapidly as the memory chip significantly boosts data-processing speeds in data centers.

Samsung said it will continue to deliver other next-generation memory solutions, including HBM, GDDR and SOCAMM, as well as foundry services, to Nvidia, expanding cooperation across the global AI value chain.

The South Korean tech giant added that it plans to build a new AI-powered factory in partnership with Nvidia.

"By deploying more than 50,000 Nvidia GPUs, AI will be embedded throughout Samsung's entire manufacturing flow, accelerating development and production of next-generation semiconductors, mobile devices and robotics," the company said.

"With incredibly high bandwidth and energy efficiency, Samsung's advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for manufacturing infrastructure driven by these technologies," it added.

HBM is a core component for AI servers, with its demand rising rapidly as the memory chip significantly boosts data-processing speeds in data centers.

Samsung said it will continue to deliver other next-generation memory solutions, including HBM, GDDR and SOCAMM, as well as foundry services, to Nvidia, expanding cooperation across the global AI value chain.

The South Korean tech giant added it plans to build a new AI-powered factory under ties with Nvidia.

"By deploying more than 50,000 Nvidia GPUs, AI will be embedded throughout Samsung's entire manufacturing flow, accelerating development and production of next-generation semiconductors, mobile devices and robotics," the company said.


X