OpenAI's Massive Computing Goals Raise Grave Environmental Alarms
13 Nov
Summary
- OpenAI plans to build 250 gigawatts of computing power by 2033
- This would mean buying over 30 million GPUs a year and running them around the clock, consuming more electricity than India
- OpenAI's data center projects are fueling environmental concerns about carbon emissions

An internal OpenAI memo from November 2025 outlining the company's plans to dramatically scale up its computing infrastructure has raised serious environmental alarms. According to the memo, OpenAI has set an "audacious long-term goal" of building 250 gigawatts of computing capacity by 2033.
To achieve this, the company would need to purchase over 30 million GPUs per year and run them around the clock, every day of the year. That level of computing would consume more electricity annually than the entire nation of India and produce more carbon dioxide emissions than ExxonMobil.
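For readers who want to sanity-check the scale of these claims, here is a rough back-of-envelope sketch in Python. The per-GPU power draw (about 1 kW including cooling and other overhead), the eight-year buildout window, and the figure of roughly 1,700 TWh for India's annual electricity consumption are illustrative assumptions for this sketch, not numbers taken from the memo.

```python
# Rough back-of-envelope check of the scale described above.
# Assumed figures (not from the memo): ~1 kW drawn per GPU including
# cooling/overhead, an ~8-year buildout, and India's annual electricity
# consumption of roughly 1,700 TWh.

TARGET_CAPACITY_GW = 250          # OpenAI's stated 2033 goal
GPUS_PER_YEAR = 30_000_000        # GPUs purchased per year, per the memo
YEARS = 8                         # roughly 2025 through 2033 (assumed)
KW_PER_GPU = 1.0                  # assumed draw per GPU incl. overhead
INDIA_ANNUAL_TWH = 1_700          # assumed, for comparison only

# Installed power if every purchased GPU runs around the clock
total_gpus = GPUS_PER_YEAR * YEARS
implied_gw = total_gpus * KW_PER_GPU / 1_000_000   # kW -> GW
print(f"Implied capacity: ~{implied_gw:.0f} GW (target: {TARGET_CAPACITY_GW} GW)")

# Annual energy for 250 GW running 24/7
hours_per_year = 24 * 365
annual_twh = TARGET_CAPACITY_GW * hours_per_year / 1_000   # GWh -> TWh
print(f"Annual energy: ~{annual_twh:,.0f} TWh vs India ~{INDIA_ANNUAL_TWH:,} TWh")
```

Run as-is, this prints an implied capacity of roughly 240 GW and an annual energy demand of about 2,190 TWh, which lands in the same ballpark as the figures cited in the memo and the India comparison above.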
OpenAI's data center expansion plans, including its involvement in initiatives like Project Stargate in the US, are fueling these environmental concerns. The company is not alone, as tech giants like Google, Amazon, Meta, and xAI are also announcing massive data center projects, often powered by environmentally questionable gas turbines.
The secondary effects of OpenAI's computing ambitions could be just as dire. Meeting its goals would require 10 of the world's most advanced chip fabrication plants operating nonstop to supply the necessary GPUs, further straining energy and resource supplies. This could drive up prices, squeeze availability, and carry long-term economic and health consequences for communities near these facilities.