The outfit claims the chip reaches speeds of up to 10.7Gbps and cuts power consumption by up to 20 per cent compared with the previous generation. According to Micron, this allows users to squeeze more performance from mobile AI applications without sending battery life into freefall. That means snappier voice translation and AI-driven recommendations even when the device is off the cloud.
Mark Montierth, corporate vice president and general manager of Micron’s Mobile and Client Business Unit, said: “Micron's 1-gamma node-based LPDDR5X memory is a game-changer for the mobile industry. This breakthrough technology delivers lightning-fast speeds and remarkable power efficiency, all within the industry’s thinnest LPDDR5X package, paving the way for exciting new smartphone designs.”
In physical terms, the new LPDDR5X chips come in a 0.61mm package, which the company reckons is six per cent thinner than the next slimmest rival and 14 per cent slimmer than its own previous generation. This sort of miniaturisation is particularly tasty for designers of foldable or wafer-thin smartphones who are desperate to reclaim every micrometre of space.
Micron’s engineers also appear to be getting serious about extreme ultraviolet (EUV) lithography. The 1γ node uses EUV to boost bit density and pairs it with next-gen CMOS7 tech and high-K metal gate transistors, a speed and efficiency combination that fits nicely with mobile AI workloads.
The company claims its new memory enables large language models like Llama 2 to respond 30 per cent faster when recommending local restaurants, more than 50 per cent faster when translating English speech into Spanish text, and 25 per cent faster when helping you choose a new motor.
Micron is currently sampling 16GB units to partners and plans to roll out capacities ranging from 8GB to 32GB in flagship phones starting in 2026. The chips are also expected to find homes in energy-hungry edge devices such as AI PCs, tablets and even servers, where power and performance continue to spar over every watt.