South Korea’s SK Hynix forecasts that the market for a specialised form of memory chip designed for artificial intelligence will grow 30% a year until 2030, a senior executive said in an interview with Reuters.
The upbeat projection for global growth in high-bandwidth memory (HBM) used in AI brushes aside concerns about pricing pressure in a sector that for decades has been traded like a commodity such as oil or coal.
“AI demand from the end user is pretty much, very firm and strong,” said Choi Joon-yong, head of HBM business planning at SK Hynix.
The billions of dollars in AI capital spending that cloud computing companies such as Amazon, Microsoft and Alphabet’s Google are projecting will likely be revised upwards in the future, which would be “positive” for the HBM market, Choi said.
The relationship between AI build-outs and HBM purchases is “very straightforward” and there is a correlation between the two, Choi said. SK Hynix’s projections are conservative and include constraints such as available energy, he said.
But the memory business is undergoing a significant strategic change during this period as well. HBM – a dynamic random access memory (DRAM) standard first produced in 2013 – stacks chips vertically to save space and reduce power consumption, helping to process the large volumes of data generated by complex AI applications.
SK Hynix expects this market for custom HBM to grow to tens of billions of dollars by 2030, Choi said.
Due to technological changes in the way SK Hynix and rivals such as Micron Technology and Samsung Electronics build next-generation HBM4, their products include a customer-specific logic die, or “base die”, that helps manage the memory.
That means one supplier’s memory can no longer be easily swapped out for a nearly identical chip from a rival.
Part of SK Hynix’s optimism about future HBM market growth rests on the likelihood that customers will want even more customisation than SK Hynix already offers, Choi said.
At the moment it is mostly larger customers such as Nvidia that receive individual customisation, while smaller clients get a traditional one-size-fits-all approach.
“Each customer has different taste,” Choi said, adding that some want specific performance or power characteristics.
SK Hynix is currently the main HBM supplier to Nvidia, although Samsung and Micron supply it with smaller volumes. Last week, Samsung cautioned during its earnings conference call that current generation HBM3E supply would likely outpace demand growth in the near term, a shift that could weigh on prices.
“We are confident to provide, to make the right competitive product to the customers,” Choi said.
U.S. President Donald Trump on Wednesday said the United States would impose a tariff of about 100% on semiconductor chips imported from countries not producing in America or planning to do so.
Choi declined to comment on the tariffs.
Trump told reporters in the Oval Office the new tariff rate would apply to “all chips and semiconductors coming into the United States,” but would not apply to companies that were already manufacturing in the United States or had made a commitment to do so.
Trump’s comments were not a formal tariff announcement, and the president offered no further specifics.
South Korea’s top trade envoy Yeo Han-koo said on Thursday that Samsung Electronics and SK Hynix would not be subject to the 100% tariffs on chips if they were implemented.

Samsung has invested in two chip fabrication plants in Austin and Taylor, Texas, and SK Hynix has announced plans to build an advanced chip packaging plant and an artificial intelligence research and development facility in Indiana.
South Korea’s chip exports to the United States were valued at $10.7 billion last year, accounting for 7.5% of its total chip exports.
Chip exports to Taiwan, where some HBM chips are sent for packaging, accounted for 18% of South Korea’s chip exports in 2024, a 127% increase from the previous year.