2024.08.23

SK Hynix Is Developing Next-Gen HBM With 30x Performance Uplift


The market for high-bandwidth memory (HBM) is experiencing the same explosive growth as the AI industry as a whole, as all the high-end AI accelerators use HBM. Therefore, every company that makes this kind of memory is pouring billions into R&D for future memory products. To that end, SK Hynix raised a few eyebrows at a recent conference by stating it's developing next-gen HBM with up to 30 times the performance of the existing standard.


The remarks were made by SK Hynix vice president Ryu Seong-su at the SK Icheon Forum 2024. According to Business Korea, the executive stated the company plans to develop a next-gen HBM product that offers 20 to 30 times the performance of current HBM. No timeline was given for when we might expect it, but given the ambitious nature of the proposed memory, it's safe to say we won't see it for a generation or two, if not further out than that. The company has also previously discussed a radical idea for a future product that would put HBM and logic directly on top of a processor.


The executive also addressed the notion that the AI boom will end in heartbreak like the metaverse craze, essentially saying it doesn't matter (this is our paraphrasing) because the memory SK Hynix makes is required whether or not the companies using the end products ever turn a profit. He also said it will be important in the future for SK Hynix to develop its own memory semiconductor specifications instead of just following what other companies (likely Samsung) are doing.


Figuring out where this future HBM might fit into the market could take a while. The data center and AI industry currently runs on HBM3, with HBM3E expected to arrive at the end of this year into early 2025. That will be followed by HBM4 later in 2025 and HBM4E in 2026 or beyond. Therefore, whatever SK Hynix is talking about with a 30x uplift is either a new kind of memory or something far beyond what's being proposed for the next few years.
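
For a rough sense of scale, here is a minimal back-of-envelope sketch of what a 30x bandwidth uplift would imply, assuming roughly 819 GB/s per HBM3 stack and roughly 1.18 TB/s per HBM3E stack (approximate published per-stack figures). The constants and helper function below are illustrative assumptions, not numbers SK Hynix has disclosed, and the company did not say which performance metric the 30x target refers to.

# Back-of-envelope: hypothetical per-stack bandwidth at a 30x uplift.
# Assumed baselines (approximate public figures, not SK Hynix data):
#   HBM3  ~ 819 GB/s per stack (6.4 Gb/s per pin x 1024-bit bus)
#   HBM3E ~ 1,180 GB/s per stack (~9.2 Gb/s per pin x 1024-bit bus)
BASELINES_GBPS = {"HBM3": 819, "HBM3E": 1180}
UPLIFT = 30

def scaled_bandwidth_tbps(base_gbps: float, uplift: float = UPLIFT) -> float:
    """Return the hypothetical per-stack bandwidth in TB/s after the uplift."""
    return base_gbps * uplift / 1000

for name, base in BASELINES_GBPS.items():
    print(f"{UPLIFT}x {name}: ~{scaled_bandwidth_tbps(base):.1f} TB/s per stack")

Even measured against HBM3E, that works out to tens of terabytes per second per stack, which underscores why this sounds like a product that is more than a generation or two away.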


Ryu's remarks also mentioned demand from what he calls the "Magnificent 7": Apple, Microsoft, Alphabet (Google), Amazon, Nvidia, Meta, and Tesla. He said he spent the previous weekend on the phone with these companies discussing custom memory solutions for future AI products. Those companies' AI efforts, and their massive budgets, have SK Hynix burning the midnight oil to develop memory products like the one it proposed at the conference.