


Old Mac Struggles with New AI Demands

1 Feb

Summary

  • Running large language models locally requires significant RAM, ideally 32GB.
  • Older machines with 16GB RAM can experience extreme slowness with AI models.
  • Newer AI models demand more powerful hardware, often exceeding older specs.

An attempt to run open-source large language models (LLMs) locally on a three-year-old MacBook Pro with 16GB of RAM revealed stark hardware limitations. The machine handles everyday tasks comfortably, but running AI models brought extreme slowdowns and lengthy processing times.

Models like glm-4.7-flash, a 19-gigabyte download, took over an hour to generate a simple response. Even gpt-oss:20b, a purportedly faster model, performed sluggishly. These experiences underscore that running LLMs effectively often requires at least 32GB of RAM, a specification now widely treated as the minimum for information workers using AI.
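The RAM figures above can be sanity-checked with a back-of-envelope estimate: a model's memory footprint is roughly its parameter count times the bytes per weight, plus runtime overhead for caches and buffers. The function below is an illustrative sketch, not a figure from the article; the 20% overhead factor and the quantization levels shown are assumptions.

```python
def model_ram_gb(params_billions: float, bits_per_weight: int,
                 overhead: float = 1.2) -> float:
    """Rough RAM needed to hold model weights in memory.

    overhead=1.2 adds ~20% for KV cache and runtime buffers,
    a common rule of thumb (assumed here, not exact).
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 20B-parameter model at 4-bit quantization: ~12 GB,
# already tight alongside the OS on a 16GB machine.
print(f"{model_ram_gb(20, 4):.1f} GB")

# The same model at 8-bit: ~24 GB, exceeding 16GB outright.
print(f"{model_ram_gb(20, 8):.1f} GB")
```

By this estimate, even aggressive 4-bit quantization of a 20B-parameter model leaves little headroom on a 16GB machine, which is consistent with the swapping and hour-long response times described above.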

The investigation indicated that memory-intensive AI workloads are pushing the limits of even relatively recent hardware. Rising DRAM prices, driven by demand from cloud data centers, further complicate matters for individuals who want to run AI models on personal machines. Keeping pace with AI advancements may therefore require hardware upgrades.
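Before downloading a multi-gigabyte model, it is worth checking how much physical RAM a machine actually has. A minimal sketch using POSIX sysconf values (works on Linux and macOS; the 32GB threshold is the article's suggested minimum, not a hard limit):

```python
import os

def physical_ram_gb() -> float:
    """Total physical RAM in GB, via POSIX sysconf (Linux/macOS)."""
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9

ram = physical_ram_gb()
if ram < 32:
    print(f"{ram:.0f} GB detected: large local models will likely swap heavily")
else:
    print(f"{ram:.0f} GB detected: local LLMs should be workable")
```

On Windows, where these sysconf names are unavailable, a third-party library such as psutil would be needed instead.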


Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.
Running AI models locally, even smaller ones, often necessitates a minimum of 32GB of RAM for adequate performance.
An older MacBook Pro with 16GB of RAM can struggle significantly, leading to extreme slowness and lengthy processing times for AI models.
DRAM is becoming increasingly expensive as cloud data centers consume more of it to run large language models.

Read more news on

Technology · Artificial Intelligence (AI)
