AI's Real Bottleneck: Memory, Not Just Compute Power
14 Jan
Summary
- Memory, not compute power, is the primary bottleneck in AI models.
- Phison's aiDAPTIV+ uses SSDs to expand memory for AI processing.
- AI profitability for hyperscalers hinges on data storage capacity.

Phison CEO Pua Khein Seng has identified memory capacity as the critical limiting factor in artificial intelligence, rather than the widely discussed computing power. He explained that insufficient memory can lead to system crashes and significant delays in AI responses, impacting user experience and the overall business case for AI.
To address this, Phison is developing aiDAPTIV+, a technology that uses Solid State Drives (SSDs) as an expanded memory pool. The SSD tier complements existing DRAM, letting GPUs stay busy with computation rather than stalling when a model's working set exceeds available memory. The goal is to significantly reduce Time to First Token (TTFT), the delay before a model produces its first piece of output, and enable more efficient AI inference.
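The general idea of a tiered memory pool can be illustrated in code. The sketch below is a minimal, hypothetical example of spilling least-recently-used data from a bounded in-RAM tier to disk and transparently reloading it on access; it is a conceptual illustration of the tiering technique, not Phison's aiDAPTIV+ implementation, and all names in it are invented for this example.

```python
import os
import pickle
import tempfile
from collections import OrderedDict

class TieredStore:
    """Two-tier store: a bounded in-RAM (DRAM-like) cache that spills
    least-recently-used entries to files (an SSD-like cold tier).
    Conceptual sketch only, not Phison's actual design."""

    def __init__(self, ram_capacity):
        self.ram_capacity = ram_capacity
        self.ram = OrderedDict()            # hot tier, LRU order
        self.disk_dir = tempfile.mkdtemp()  # cold tier (stand-in for SSD)

    def _disk_path(self, key):
        return os.path.join(self.disk_dir, f"{key}.pkl")

    def put(self, key, value):
        self.ram[key] = value
        self.ram.move_to_end(key)
        while len(self.ram) > self.ram_capacity:
            # Evict the least-recently-used entry to the disk tier.
            old_key, old_val = self.ram.popitem(last=False)
            with open(self._disk_path(old_key), "wb") as f:
                pickle.dump(old_val, f)

    def get(self, key):
        if key in self.ram:                 # fast path: hot-tier hit
            self.ram.move_to_end(key)
            return self.ram[key]
        # Slow path: fetch from the disk tier and promote back to RAM.
        path = self._disk_path(key)
        with open(path, "rb") as f:
            value = pickle.load(f)
        os.remove(path)
        self.put(key, value)
        return value
```

For example, with `ram_capacity=2`, inserting three entries spills the oldest one to disk, and a later `get` on it silently pulls it back into the hot tier. The trade-off this models is the one described above: the cold tier is slower per access, but it keeps the fast tier free for the active working set.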
Pua emphasized that the profitability of AI for hyperscalers is intrinsically linked to data storage capacity. With companies investing heavily in GPUs, the real revenue stream for cloud service providers comes from inference, which is heavily reliant on efficient and ample data storage. Phison is also pushing for higher-capacity enterprise SSDs, with a 244TB model in development.
