By Hyunjoo Jin
SEOUL, Feb 12 (Reuters) – Samsung Electronics said on Thursday that it has started shipping its latest high-bandwidth memory chips, HBM4, to unnamed customers, as the chipmaker races to catch up with rivals in supplying Nvidia.
The global rush to build AI data centers has fuelled demand for HBM, a type of dynamic random-access memory (DRAM) that helps process massive amounts of data generated by complex artificial intelligence applications.
Samsung, the world’s top memory chipmaker, had been slow to respond to the advanced AI chip market, lagging behind rivals in supplying previous-generation HBM chips.
Samsung said its HBM4 delivers a consistent processing speed of 11.7 gigabits per second (Gbps), a 22% increase over its predecessor, HBM3E, and that the latest chips can reach a maximum speed of 13 Gbps, which helps mitigate growing data bottlenecks.
Samsung also said it plans to deliver samples of its next-generation HBM4E chips in the second half of this year.
Samsung shares ended up 6.4% on Thursday.
RISING COMPETITION
SK Hynix said in January that it aims to maintain its “overwhelming” market share in next-generation HBM4 chips, which are in volume production, as it faces rising competition from Samsung Electronics.
It added that it aims to achieve HBM4 production yields similar to those of its current-generation HBM3E chips.
Micron’s CFO also said the company was in high-volume production of HBM4 and had begun shipping the chips to customers.
(Reporting by Hyunjoo Jin and Heekyong Yang; Editing by Ed Davies, Rashmi Aich and Louise Heavens)