Old Mac Struggles with New AI Demands
1 Feb
Summary
- Running large language models locally requires significant RAM, ideally 32GB.
- Older machines with 16GB RAM can experience extreme slowness with AI models.
- Newer AI models demand more powerful hardware, often exceeding older specs.

An attempt to run open-source large language models (LLMs) locally on a three-year-old MacBook Pro with 16GB of RAM revealed significant hardware limitations. While the machine handles everyday tasks without issue, it slowed to a crawl when running AI models, with some responses taking an hour or more to generate.
Models like glm-4.7-flash, which weighs in at 19 gigabytes, took over an hour to produce a simple response. Even a purportedly faster model, gpt-oss:20b, performed sluggishly. These experiences underscore that running LLMs effectively often requires at least 32GB of RAM, a specification increasingly treated as the minimum for information workers using AI.
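The arithmetic behind these slowdowns is straightforward: a model's weights must largely sit in memory, and once they exceed available RAM the operating system swaps to disk, which is orders of magnitude slower. A rough back-of-the-envelope check (the helper function and the 4GB headroom figure are illustrative assumptions, not from the article):

```python
def fits_in_ram(model_size_gb: float, ram_gb: float, headroom_gb: float = 4.0) -> bool:
    """Rough check: model weights plus OS/application headroom must fit in RAM.
    If they don't, the OS pages weights to disk and token generation crawls."""
    return model_size_gb + headroom_gb <= ram_gb

# The article's scenario: a 19 GB model on a 16 GB MacBook Pro.
print(fits_in_ram(19, 16))  # False: the weights alone exceed total RAM
print(fits_in_ram(19, 32))  # True: fits with room for the OS and apps
```

This is why the jump from 16GB to 32GB matters so much in practice: it is not a gradual speedup but the difference between a model that runs from memory and one that thrashes against the disk.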
The investigation indicated that memory-intensive AI operations are pushing the boundaries of even relatively recent hardware. The rising cost and demand for DRAM by cloud data centers further complicate accessibility for individuals seeking to run AI models on personal machines. This situation suggests a potential need for hardware upgrades to keep pace with AI advancements.