A researcher at a Samsung Electronics chip cleanroom
Samsung Electronics Co., the world’s largest memory chipmaker, has officially launched a team dedicated to developing advanced high-bandwidth memory (HBM), a core chip powering artificial intelligence (AI) devices.
The Suwon, South Korea-based tech giant also launched a separate team handling advanced chip packaging, which combines the HBM chip with the graphics processing unit (GPU) to produce AI chip accelerators.
The launch of the two dedicated teams is part of Samsung’s organizational revamp on Thursday – its first reshuffle since Vice Chairman Jun Young-hyun took the helm of Samsung’s Device Solutions (DS) division, which oversees the company’s chip business, in May.
Samsung’s HBM3E chip
Samsung had previously set up two HBM task force teams; it has now merged them into a single team under the DRAM development division of its memory business.
The new HBM development team will be led by DRAM Vice President Sohn Young-soo.
HBM has become an essential part of the AI boom because it delivers much higher data transfer speeds than traditional memory chips.
NEW TEAM TO FOCUS ON HBM3E, HBM4
The new HBM team will concentrate its resources on developing the fifth-generation HBM3E and sixth-generation HBM4 chips, sources said.
Samsung’s chip business head and Vice Chairman Jun Young-hyun
Analysts said the first litmus test for the new Samsung chip leader will be to ensure that Samsung supplies its HBM chips to Nvidia Corp., the world’s top AI chip designer.
Samsung, which has vowed to triple its HBM output this year, is eager to pass Nvidia’s ongoing quality tests.
In April, the company began mass production of its 8-layer HBM3E chips for generative AI chipsets.
In February, Samsung said it developed HBM3E 12H, the industry’s first 12-stack HBM3E DRAM and the highest-capacity HBM product to date.
HBM3E is expected to power Nvidia’s new AI chips such as B100 and GB200 and AMD’s MI350 and MI375, which are set for launch later this year.
The organizational revamp is aimed at catching up in the top-end AI chip market, where Samsung has fallen behind its rivals.
SK Hynix Inc. is the top supplier of HBM chips to Nvidia, which controls over 80% of the market for graphics processing units, the core of AI computing tasks. Currently, the Korean chipmaker is Nvidia’s only supplier of the fourth-generation HBM3 chip.
ADVANCED CHIP PACKAGING
On Thursday, Samsung launched a separate chip packaging team and placed it under chip leader Jun’s direct control.
Advanced chip packaging is an integral part of Samsung’s ambitious turnkey chip manufacturing services.
Samsung’s advanced foundry process node roadmap unveiled at Samsung Foundry Forum (SFF) 2024 in San Jose
AI accelerators such as Nvidia’s H200 and AMD’s MI300X, which combine HBM chips and the GPU, are created through cutting-edge packaging technology.
Packaging, one of the final steps in semiconductor manufacturing, places chips in a protective case to prevent corrosion and provides an interface to combine and connect already-made chips.
Chip foundry leader Taiwan Semiconductor Manufacturing Co. (TSMC) and fast-follower Samsung are striving to advance their packaging technology because it can enhance chip performance without shrinking circuitry to ever-finer nanometer nodes through ultra-fine processing, which is technologically challenging and time-consuming.
Last year, Samsung said it would deploy an advanced three-dimensional (3D) chip packaging technology, dubbed SAINT, from 2024 to better compete with its archrival TSMC.
By Jeong-Soo Hwang
hjs@hankyung.com
In-Soo Nam edited this article.