AI's Thirst: Fact vs. Fiction
5 Mar
Summary
- AI energy consumption concerns are valid; water use claims debated.
- The US accounted for 45% of global data center electricity consumption in 2024.
- AI chatbots are estimated to use up to 10x more electricity per query than traditional search engines.

Debates around AI's environmental impact, particularly its energy and water consumption, are intensifying. While OpenAI CEO Sam Altman has dismissed concerns about water usage as "totally fake," citing a move away from evaporative cooling, independent data reveals a substantial environmental footprint. In 2024, the US accounted for 45% of global data center electricity consumption.
Generative AI chatbots are considerably more energy-intensive than traditional search engines, with some estimates suggesting they use up to 10 times more electricity per query. For instance, Google reported that a median Gemini text prompt consumes 0.24 watt-hours of energy and about 0.26 milliliters of water.
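The reported per-prompt figures are tiny on their own; the environmental question is about scale. A back-of-envelope sketch using the Gemini numbers above (the daily query volume here is a purely hypothetical assumption for illustration, not a reported figure):

```python
# Per-prompt figures reported by Google for a median Gemini text prompt.
ENERGY_WH_PER_PROMPT = 0.24   # watt-hours
WATER_ML_PER_PROMPT = 0.26    # milliliters

# Hypothetical volume for illustration only: one billion prompts per day.
queries_per_day = 1_000_000_000

energy_kwh_per_day = queries_per_day * ENERGY_WH_PER_PROMPT / 1000   # Wh -> kWh
water_liters_per_day = queries_per_day * WATER_ML_PER_PROMPT / 1000  # mL -> L

print(f"Energy: {energy_kwh_per_day:,.0f} kWh/day")
print(f"Water:  {water_liters_per_day:,.0f} L/day")
```

At that assumed volume, small per-query costs compound to hundreds of thousands of kilowatt-hours and liters of water per day, which is why aggregate demand, not per-prompt cost, drives the debate.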
Data centers increasingly rely on water for cooling to prevent overheating. However, the type of cooling system significantly impacts consumption, with closed-loop systems being more resource-efficient than evaporative cooling. OpenAI states it is prioritizing closed-loop systems, aligning with Altman's claims of reduced water use.
Despite these advancements, the overall demand for energy and water is expected to rise with the growing popularity of AI. Companies are exploring renewable energy sources like solar and battery storage to power these operations. Nevertheless, data centers still largely depend on the grid, which often relies on fossil fuels.
The push for transparency and sustainable practices is growing, with communities and policymakers urging a balance between AI innovation and environmental responsibility. Projections of a significant rise in AI-related water consumption by 2050 underscore the urgency of addressing these challenges.