
Nvidia ramps up SOCAMM modular memory production

16 July 2025


Headed for AI PCs and servers

Nvidia is reportedly lining up as many as 800,000 LPDDR-based SOCAMM modules this year, ahead of a next-generation SOCAMM 2 launch intended to give its AI products better performance and higher efficiency while keeping the memory easily upgradeable.

Nvidia showcased SOCAMM at its GTC event, with the new GB300 Blackwell platform using the technology. Developed by Micron, SOCAMM differs from typical soldered solutions such as HBM and LPDDR5X. Instead, it offers a compact, modular memory form factor secured with just three screws, making it swappable and much easier to upgrade than traditional memory options.

Korean outlet ETNews claims Nvidia is planning between 600,000 and 800,000 SOCAMM modules for 2025. While that figure is far below the huge HBM volumes Nvidia relies on for its data centre products, it signals the start of a wider rollout. The real volume push is expected next year with the arrival of SOCAMM 2, a second-generation design aimed at scaling up adoption across AI PCs and servers.

SOCAMM is based on LPDDR DRAM, the sort typically found in low-power mobile devices, but in a far more flexible package. It offers bandwidth between 150 GB/s and 250 GB/s while promising better power efficiency than RDIMMs or the LPDDR5X and LPCAMM solutions commonly used in mobile platforms. The exact efficiency gains are unknown, but reports suggest a significant leap, particularly for AI workloads where both bandwidth and power draw are critical.
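For a rough sense of where those numbers sit, peak memory bandwidth follows directly from transfer rate and bus width. The short Python sketch below works the arithmetic for an LPDDR5X-8533 part on a 128-bit interface; both figures are illustrative assumptions, not confirmed SOCAMM specifications.

def peak_bandwidth_gb_s(transfer_rate_mt_s: float, bus_width_bits: int) -> float:
    # Peak bandwidth (GB/s) = megatransfers per second x bytes per transfer.
    bytes_per_transfer = bus_width_bits / 8
    return transfer_rate_mt_s * 1_000_000 * bytes_per_transfer / 1_000_000_000

# Illustrative figures only: an LPDDR5X-8533 device on an assumed 128-bit interface.
print(f"{peak_bandwidth_gb_s(8533, 128):.1f} GB/s")  # prints 136.5 GB/s

On those assumptions, a single 128-bit LPDDR5X channel lands just under the 150 GB/s floor of the quoted range, which is consistent with SOCAMM pairing fast LPDDR5X with a wide interface to reach 150 GB/s to 250 GB/s.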

The modular approach has clear advantages. It allows memory in AI PCs and servers to be upgraded without swapping entire boards, improving longevity and lowering overall costs. It also helps keep power draw under control in devices that do not need the massive throughput of HBM but still demand higher efficiency than standard LPDDR solutions.

Micron is currently Nvidia's sole manufacturer of SOCAMM, but Samsung and SK Hynix are reportedly in talks to join production as demand scales. This multi-vendor strategy could ensure steady supply as AI products increasingly move toward modular, swappable memory.

Although 800,000 modules is a modest start compared to HBM, Nvidia's cunning plan to expand SOCAMM availability with SOCAMM 2 suggests this memory standard is poised to become a core part of its AI ecosystem, especially for low-power and upgradeable devices.

 
