Micron Unveils 256GB LPDDR5X for AI Servers
10 Mar
Summary
- New 256GB LPDDR5X modules aim to boost AI server memory capacity.
- Eight modules deliver up to 2TB of server memory.
- The new module draws roughly one-third the power and occupies one-third the footprint of comparable RDIMMs.

Micron has launched a high-density 256GB SOCAMM2 memory module engineered for AI servers. The module combines 64 monolithic 32Gb LPDDR5X dies (64 × 32Gb = 256GB) to meet the substantial memory requirements of modern artificial intelligence workloads. Its introduction is set to significantly enhance server memory architecture, a critical factor as AI models and inference pipelines demand ever-larger memory pools.
With eight SOCAMM2 modules installed in an eight-channel server CPU configuration, total memory reaches 2TB of LPDRAM. This represents approximately a one-third increase over the previous module generation, enabling larger context windows and more complex AI inference tasks.
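The capacity figures above can be sanity-checked with simple arithmetic. Note that the 192GB prior-generation capacity below is an assumption inferred from the stated "one-third increase"; the article itself does not name the previous module's size.

```python
# Back-of-the-envelope check of the capacities cited above (illustrative only).

MODULE_GB = 256        # one 256GB SOCAMM2 module
CHANNELS = 8           # eight-channel server CPU, one module per channel
PREV_MODULE_GB = 192   # ASSUMED prior-generation capacity, inferred from the
                       # "one-third increase" claim (not stated in the article)

# Total system memory: 8 x 256GB = 2048GB = 2TB.
total_gb = MODULE_GB * CHANNELS

# Generational uplift: 256 / 192 - 1 = ~0.33, i.e. roughly one-third.
increase = MODULE_GB / PREV_MODULE_GB - 1

print(total_gb)            # 2048
print(round(increase, 2))  # 0.33
```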
Micron emphasizes the power efficiency of its new design, stating that the SOCAMM2 module consumes roughly one-third the power and occupies one-third the physical footprint of comparable RDIMMs. This efficiency translates to lower thermal loads and reduced infrastructure costs in data centers, while its modular design simplifies maintenance and future upgrades.
The 256GB SOCAMM2 module has demonstrated tangible performance benefits. Micron reports a more than 2.3x improvement in time-to-first-token for long-context inference tasks when used for key-value cache offloading. In standalone CPU workloads, this LPDRAM configuration offers over three times better performance per watt compared to mainstream server memory.