
NVIDIA CEO Jensen Huang holds the next-generation AI (artificial intelligence) chip platform ‘Vera Rubin’ at CES 2025, an annual consumer electronics trade show, in Las Vegas, Nevada, U.S., Jan. 6, 2025. (Photo courtesy of REUTERS)
NVIDIA CEO Jensen Huang announced that the next-generation AI (artificial intelligence) chip platform ‘Vera Rubin’ has entered full production. As Vera Rubin will be equipped with HBM4 (sixth-generation high-bandwidth memory) and SOCAMM2, the market-share battle among the three major memory makers (Samsung Electronics, SK Hynix, and Micron) is expected to intensify further.
During a special presentation at ‘CES 2026,’ held at the BleauLive Theater in the Fontainebleau Hotel in Las Vegas, United States, on Jan. 5 (local time), Huang said, “Vera Rubin is now in full production,” adding, “We will be able to supply it to global customers by the end of this year.” Vera Rubin was initially expected to launch in the second half of this year, and NVIDIA has now officially reconfirmed that mass-production timeline on stage.
Vera Rubin combines six chips, including the ‘Vera’ CPU (central processing unit) and the ‘Rubin’ GPU (graphics processing unit), into a single platform. Rather than improving performance by packing more transistors into a single chip, NVIDIA has adopted a strategy of connecting multiple chips to boost performance and efficiency simultaneously.
Huang emphasized, “Vera Rubin improves AI inference performance by five times and training performance by 3.5 times compared to Blackwell,” adding, “The key is not single-chip performance, but a system architecture that lets multiple chips move as one, eliminating data bottlenecks and boosting throughput tailored to AI workloads.”
Notably, the Rubin GPU will be equipped with HBM4. Given that NVIDIA is the largest buyer of AI semiconductors, the industry expects HBM4 to quickly establish itself as the mainstream of the HBM market, coinciding with the launch of NVIDIA’s Rubin GPU.
The Vera CPU, meanwhile, will be equipped with ‘SOCAMM2,’ an AI server memory module led by NVIDIA. It is designed to handle large-scale computation efficiently by placing low-power DRAM (LPDDR5X) close to the CPU. SOCAMM is known to cut power consumption to roughly one-third that of existing DDR (double data rate)-based server modules.
With NVIDIA officially setting a launch timeline for Vera Rubin, the memory industry’s battle for dominance in the HBM4 and SOCAMM2 markets is expected to intensify. Samsung Electronics is aiming for a comeback by applying the industry’s first 1c (sixth-generation 10-nanometer-class) process to HBM4. Samsung’s HBM4 achieves a data processing speed of 11 Gb (gigabits) per second and a bandwidth of 2.8 TB/s, with energy efficiency reportedly improved by 40% over the previous generation. The company is also supplying custom samples (CS) of SOCAMM2 to NVIDIA.
Samsung Electronics DS (Device Solutions) Division President and Vice Chairman Jeon Young-hyun expressed confidence in a recent New Year’s address, saying, “HBM4 has demonstrated differentiated competitiveness, with customers telling us that ‘Samsung is back.’”
SK Hynix is expected to retain the top market share in HBM4, as it did with HBM3E. The company held a 57% share of the global HBM market in the third quarter of 2025, ranking first. At this CES, it plans to unveil its 16-stack 48GB HBM4 product for the first time, a successor to the 12-stack 36GB HBM4 that achieved the industry’s highest speed of 11.7 Gbps, with development proceeding on customers’ schedules. SK Hynix also plans to showcase next-generation memory solutions including SOCAMM2.
An industry official said, “With not only NVIDIA but also companies like AMD developing their own AI chips, competition among memory makers will accelerate significantly starting this year.”
