Published in News

Nvidia gives Micron SOCAMM gig

25 July 2025


Memory market braces for shake‑up

Nvidia is moving to disrupt the memory market again, planning to deploy between 600,000 and 800,000 SOCAMM modules in 2025 and handing the entire contract to Micron.

According to DigiTimes Asia, the numbers are modest compared to Nvidia’s nine million planned HBM units for the same year, but analysts say SOCAMM could reshape the memory and substrate landscape over time.

SOCAMM, first unveiled at Nvidia’s GTC 2025 conference in March, will debut in the upcoming GB300 “Blackwell” platform and the Digits AI PC. Nvidia has already shared projected order quantities with memory and substrate partners, signalling its intent to integrate SOCAMM into next‑generation AI servers, workstations and consumer systems.

Micron won the contract by becoming the first memory maker to secure approval for SOCAMM volume production, beating Samsung Electronics and SK hynix to the punch. As we reported last week, Nvidia had initially tapped all three to co‑develop the module, but Micron’s faster turnaround gives it a clear first‑mover advantage.

Designed for low‑power, high‑bandwidth AI workloads, SOCAMM builds on LPDDR DRAM technology and significantly upgrades performance compared to existing laptop modules like LPCAMM. Micron claims SOCAMM offers 2.5 times the bandwidth of traditional RDIMM server modules while cutting size and power consumption by one‑third.

SOCAMM’s modular design keeps it compact and upgrade‑friendly, making it attractive not just for AI servers but also for future AI PCs. By slotting between cost‑efficient mainstream memory and ultra‑expensive HBM stacks, SOCAMM could fill a crucial gap for scalable AI computing without the extreme costs of top‑tier memory.

The module’s adoption is already creating ripples in the substrate market. SOCAMM requires specialised PCB designs, opening a new category of demand for substrate suppliers. While early volumes remain limited, analysts expect a surge in orders if Nvidia’s SOCAMM strategy gains traction, potentially driving fresh competition among DRAM vendors and PCB makers.

SOCAMM won’t replace HBM in the near term, but it signals Nvidia’s intent to diversify its memory ecosystem to match different AI workloads, from hyperscale data centres to consumer‑grade AI PCs.

If the rollout succeeds, it could push memory makers and substrate suppliers into a new arms race for bandwidth, efficiency and scale.
