$NVDA Supplier | SK Hynix Expects AI Memory Market To Grow 30% A Year To 2030 – RTRS https://t.co/TmYCQWiqWc
SK Hynix forecasts that the market for a specialized form of memory chip designed for artificial intelligence will grow 30% a year until 2030, a senior executive said in an interview with Reuters. "AI demand from the end user is pretty much, very firm and strong," said SK https://t.co/HMwnnbr3m2 https://t.co/3QxPANdVZy
I think one thing many are missing over the last few days is that compute capacity will expand exponentially as Nvidia's new chips/AI servers are released every single year. GB200/GB300 NVL72 is only now rolling out in volume. That means more and more people will have access to https://t.co/LmtnszrCQJ
SK Hynix expects the global market for high-bandwidth memory (HBM) used in artificial-intelligence systems to expand about 30% a year through 2030, according to Choi Joon-yong, who heads the South Korean chipmaker’s HBM business planning unit. In an interview with Reuters, Choi said sustained investment in AI infrastructure by cloud providers such as Amazon, Microsoft and Alphabet’s Google should lift demand for the high-performance chips, driving the segment’s value into the ‘tens of billions of dollars’ within five years. The vertically stacked HBM devices help cut power use while processing the large data sets required by generative-AI models and are a critical component in Nvidia’s latest accelerators, for which SK Hynix is the primary supplier. Choi added that future growth could accelerate further as customers request increasingly customised versions of the memory, deepening the link between AI build-outs and HBM orders.