SK Hynix investing US$1bil to capture widening demand for AI memory chips
SEOUL: SK Hynix Inc is ramping up its spending on advanced chip packaging, in hopes of capturing more of the burgeoning demand for a crucial component in artificial intelligence (AI) development: high-bandwidth memory (HBM).
The Icheon-based firm is investing more than US$1bil in South Korea to expand and improve the final steps of its chip manufacturing, said Lee Kang-wook, a former Samsung Electronics Co engineer who now heads up packaging development at SK Hynix.
Innovation in that process is at the heart of HBM’s advantage as the most sought-after AI memory, and further advances will be key to reducing power consumption, driving performance and cementing the company’s lead in the HBM market.
Lee specialises in advanced ways of combining and connecting semiconductors, which has grown in importance with the advent of modern AI and its digestion of vast troves of data via parallel processing chains.
While SK Hynix has not disclosed its capital expenditure budget for this year, the average analyst estimate puts the figure at 14 trillion won (US$10.5bil). That suggests advanced packaging, which could take up a 10th of that, is a major priority.
“The first 50 years of the semiconductor industry has been about the front-end,” or the design and fabrication of the chips themselves, Lee said in an interview. “But the next 50 years is going to be all about the back-end,” or packaging.
Being first to achieve the next milestone in this race can now catapult companies into industry-leading positions.
SK Hynix was chosen by Nvidia Corp to provide the HBM for its standard-setting AI accelerators, pushing the South Korean firm’s value up to 119 trillion won.
Its stock has gained nearly 120% since the start of 2023, making it South Korea’s second most valuable company and outperforming Samsung and US rival Micron Technology Inc.
Lee, now 55 years old, helped pioneer a novel method of packaging the third generation of the technology, HBM2E, an approach the other two major makers quickly followed.
That innovation was central to SK Hynix winning Nvidia as a customer in late 2019.
Stacking chips to derive greater performance has long been Lee’s passion. In 2000, he earned his PhD on 3D integration technology for micro-systems from Japan’s Tohoku University, under Mitsumasa Koyanagi, who invented the stacked capacitor DRAM used in mobile phones.
In 2002, Lee joined Samsung’s memory division as a principal engineer, where he led the development of through-silicon via (TSV)-based 3D packaging technologies. That work would later become the foundation for developing HBM.
HBM is a type of high-performance memory that stacks chips on top of one another and connects them with TSVs for faster and more energy-efficient data processing.
But back in the pre-smartphone era, Samsung was making bigger bets elsewhere. And the norm was for global chipmakers to outsource to smaller Asian nations the tasks of assembling, testing and packaging chips.
So when SK Hynix and US partner Advanced Micro Devices Inc introduced HBM to the world in 2013, they remained unchallenged for two years before Samsung developed its HBM2 in late 2015.
Lee joined SK Hynix three years later. His colleagues joked, with a measure of pride, that HBM stood for “Hynix’s Best Memory.”
“SK Hynix’s management had better insights into where this industry is headed and they were well prepared,” said Sanjeev Rana, an analyst at CLSA Securities Korea.
“When the opportunity came their way, they grabbed it with both hands.” As for Samsung, “they were caught napping.”
ChatGPT’s release in November 2022 was the moment Lee had been waiting for. By that time, his team had developed a new packaging method called mass reflow-molded underfill, aided by his contacts in Japan.
The process, which involves injecting and then hardening liquid material between layers of silicon, improved heat dissipation and production yields.
SK Hynix teamed up with Namics Corp in Japan for the material and a related patent, according to a person familiar with the matter. — Bloomberg