At CES 2026, SK hynix Makes Its Case for the Future of AI Memory
I was in the CES 2026 press room when SK hynix laid out its next-gen AI memory roadmap
Las Vegas always has noise, but the CES press rooms are different. They’re quieter, more technical, and the people in the seats are listening for one thing: what ships, what scales, and what changes procurement decisions six months from now. On January 6, 2026, SK hynix stepped up in that setting and made its message pretty clear — the next cycle of AI hardware is going to be memory-constrained, and they intend to be the company defining the memory stack.
The announcement centered on a dedicated customer exhibition hall at the Venetian Expo (January 6–9), with the theme “Innovative AI, Sustainable Tomorrow.” The phrase is marketing, sure — but the product list underneath it was not. They framed the whole show around AI-optimized memory, and they backed that up with a mix of high-bandwidth memory, low-power modules for servers, client-side DRAM, and a NAND story that’s directly aimed at AI data centers.
HBM4 was the headline in the room
The first thing everyone keyed in on was HBM. SK hynix said it is showing a next-generation 16-high 48GB HBM4, positioned as the follow-on to the 12-high 36GB HBM4 the company has already discussed, and it made a point of saying the work is being driven by customer requirements.