The Korea Times

Samsung develops fastest data processing AI chip

12-layer HBM3E DRAM expected to help company outpace rivals

- By Baek Byung-yeul baekby@koreatimes.co.kr

Samsung Electronics has developed the industry’s first 36-gigabyte (GB) 12-layer HBM3E DRAM, which boasts the fastest data processing speed among AI memory chips, the company said Tuesday.

The new development is expected to help the world’s largest memory chip maker outpace rival SK hynix in the AI memory chip industry and gain an advantageous position in the HBM market.

HBM, short for High Bandwidth Memory, is a memory semiconductor that revolutionizes data processing speed by vertically connecting multiple DRAMs. The higher the bandwidth of the memory chip, the wider the data pathway becomes, resulting in an increased capacity to process larger amounts of data simultaneously. HBM3E represents the fifth generation of HBM DRAM.

Samsung said the 12-layer HBM3E DRAM has a maximum bandwidth of 1,280 GB per second and offers 36 GB of capacity, the highest among existing HBM chips. The company has not only developed the product but has already provided samples to customers, and it plans to commence mass production in the first half of this year.

As generative AI services are increasingly being utilized, there is a surge in demand for high-performance DRAM at data centers requiring rapid processing of large datasets. SK hynix has been at the forefront of the market, commanding a 50 percent market share among memory chip makers last year.

With Samsung’s competitor Micron of the U.S. also announcing on Feb. 26 its plans to begin mass-producing HBM3E, intense competition is anticipated among memory chip makers vying for dominance in the AI memory chip market.

SK hynix started mass production of an eight-layer HBM3E early this year. That product is expected to be used in Nvidia’s GPU scheduled for release in the second quarter, and the company is also developing a 12-layer product of its own.

Samsung said its new HBM3E offers improved performance and capacity, surpassing the previous version, HBM3, by more than 50 percent.

“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” Bae Yong-cheol, executive vice president of memory product planning at Samsung, said. “This new memory solution is a cornerstone of our efforts to advance core technologies for high-stack HBM and establish technological leadership in the high-capacity HBM market during the AI era.”

Samsung Electronics’ 36-gigabyte 12-layer HBM3E DRAM memory chips / Courtesy of Samsung Electronics
