AI Energy Gulp: Kerosene for Every Search?
27 Feb
Summary
- AI searches consume significant energy, equivalent to roughly half a litre of kerosene per query.
- Indian SMEs face high compute costs and infrastructure gaps, risking AI race exclusion.
- Experts discuss AI's energy demands and access disparities at India AI Impact Summit.

Experts convened at the India AI Impact Summit 2026 in New Delhi on February 18th to discuss the immense energy demands of artificial intelligence. A striking figure cited at the event was that each public large language model (LLM) search query consumes energy equivalent to approximately half a litre of kerosene.
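For context, the cited figure can be converted into more familiar electrical units. The conversion below uses kerosene's typical energy density of about 35 MJ per litre, a standard physical constant rather than a number given at the summit:

```python
# Convert the summit's "half a litre of kerosene per query" claim into kWh,
# assuming a typical kerosene energy density of ~35 MJ/L.
KEROSENE_MJ_PER_L = 35.0   # approximate energy density of kerosene
MJ_PER_KWH = 3.6           # exact: 1 kWh = 3.6 MJ

litres_per_query = 0.5
energy_mj = litres_per_query * KEROSENE_MJ_PER_L   # 17.5 MJ per query
energy_kwh = energy_mj / MJ_PER_KWH                # ~4.9 kWh per query

print(f"{energy_kwh:.2f} kWh per query")
```

By this rough conversion, the claim corresponds to nearly 5 kWh per query, which underlines why compute affordability and efficiency dominated the discussion.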
The summit also focused on the potential exclusion of Indian Small and Medium Enterprises (SMEs) from the global AI innovation race. Panellists pointed out that prohibitively high compute costs, often in the millions for basic model training, and a lack of suitable infrastructure create a significant barrier for SMEs compared to larger corporations.
Discussions emphasized the need for innovative solutions, including heterogeneous compute architectures combining CPUs, GPUs, and NPUs tailored for affordability and efficiency. Experts like Gokul V Subramaniam from Intel India advocated for these approaches to scale AI within resource-constrained environments.
Concerns about data transparency and the 'black box' nature of AI were also raised, with calls for explainability and robust regulatory frameworks. Strategies like federated learning were proposed to balance the need for vast datasets with national data sovereignty and privacy concerns.
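Federated learning, as raised by the panellists, trains a shared model without pooling raw data: each participant fits the model locally and only the resulting weights are sent back for aggregation. A minimal sketch of federated averaging (FedAvg) on a toy linear-regression task follows; all names, data, and hyperparameters are illustrative, not from the summit:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    # One client's local gradient-descent steps on a linear model.
    # The raw data (X, y) never leaves the client.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(global_w, client_data):
    # Server-side FedAvg: average the clients' locally trained weights.
    # Only model weights cross the network, preserving data locality.
    updates = [local_update(global_w, X, y) for X, y in client_data]
    return np.mean(updates, axis=0)

# Simulate three clients holding separate shards of data drawn from
# the same underlying model (true_w).
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

# Run several federated rounds; the global model converges toward true_w
# even though no client ever shares its raw data.
w = np.zeros(2)
for _ in range(20):
    w = federated_average(w, clients)
```

The design trade-off this illustrates is the one the panel described: the server learns from all clients' data collectively while each dataset stays under its owner's control, which is what makes the approach attractive for national data sovereignty.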
Furthermore, the summit highlighted the importance of broadening AI skills across the workforce through industry-academia collaborations and curriculum reforms. The event also showcased the democratization of AI through real-time Indic language translation services, breaking down language barriers.