AI Energy Use Remains a Mystery as Tech Giants Stay Silent
Research highlights the lack of transparency around AI energy consumption, with major companies like OpenAI keeping carbon emissions data secret despite growing environmental concerns.
As AI becomes increasingly integrated into daily life, its energy consumption and environmental impact are raising alarms. However, major tech companies like OpenAI are keeping critical data under wraps, leaving researchers and the public in the dark.
The OpenAI Example
OpenAI CEO Sam Altman recently claimed that the average ChatGPT query uses 0.34 watt-hours of energy—comparable to an oven running for one second or a lightbulb for a few minutes. However, experts like Sasha Luccioni, climate lead at Hugging Face, question the validity of this figure due to the lack of context. "He could have pulled that out of his ass," Luccioni remarked, highlighting the absence of details on how OpenAI calculated this number.
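Altman's 0.34 watt-hour figure can at least be sanity-checked against his own comparisons with back-of-the-envelope arithmetic. The appliance wattages below are typical ballpark assumptions, not numbers from OpenAI:

```python
# Back-of-the-envelope check of the 0.34 Wh/query claim.
# Appliance wattages are rough ballpark assumptions, not OpenAI data.
OVEN_WATTS = 1200       # a typical electric oven element
LED_BULB_WATTS = 7      # a typical household LED bulb

def watt_hours(watts: float, seconds: float) -> float:
    """Energy in watt-hours for a device drawing `watts` for `seconds`."""
    return watts * seconds / 3600

oven_1s = watt_hours(OVEN_WATTS, 1)          # ~0.33 Wh
bulb_3min = watt_hours(LED_BULB_WATTS, 180)  # ~0.35 Wh
print(f"oven for 1 second:   {oven_1s:.2f} Wh")
print(f"LED bulb for 3 min:  {bulb_3min:.2f} Wh")
```

Under those assumptions the comparisons are plausible on their face; Luccioni's objection is not to the arithmetic but to the missing methodology behind the 0.34 Wh number itself.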
The Transparency Gap
A new analysis by Luccioni and colleagues reveals that 84% of large language model (LLM) usage in May 2025 involved models with zero environmental disclosure. This lack of transparency makes it impossible for users to gauge the carbon footprint of their AI interactions. "It blows my mind that you can buy a car and know its fuel efficiency, but we use AI tools daily with no emissions data," Luccioni said.
Misleading Claims
One widely circulated statistic—that a ChatGPT query uses 10 times more energy than a Google search—traces back to a casual remark by John Hennessy, chairman of Alphabet (Google's parent company). Despite its shaky origins, this claim has been repeated in policy discussions and media reports, further muddying the waters.
Open-Source Insights
A study published in Frontiers in Communication evaluated 14 open-source LLMs, including models from Meta and DeepSeek, finding significant variations in energy use. Some models consumed 50% more energy than others, with reasoning-heavy models generating more internal "thinking tokens"—and thus higher emissions. Lead author Maximilian Dauner suggests directing simpler queries to less energy-intensive models to reduce waste.
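Dauner's suggestion amounts to a routing layer in front of the models. A minimal sketch of one possible heuristic follows; the model names, and the keyword test standing in for a real difficulty classifier, are hypothetical illustrations, not details from the study:

```python
# A minimal sketch of routing queries by difficulty, per Dauner's suggestion.
# Model names and the keyword heuristic are hypothetical, not from the study.
SMALL_MODEL = "small-llm"        # lower energy per response (assumed)
LARGE_MODEL = "reasoning-llm"    # emits extra "thinking tokens" (assumed)

HARD_MARKERS = ("prove", "derive", "step by step", "debug")

def route(query: str) -> str:
    """Send reasoning-heavy queries to the large model, the rest to the small one."""
    q = query.lower()
    return LARGE_MODEL if any(marker in q for marker in HARD_MARKERS) else SMALL_MODEL

print(route("What is the capital of France?"))    # small-llm
print(route("Prove that sqrt(2) is irrational"))  # reasoning-llm
```

A production router would use a learned classifier rather than keywords, but the energy saving comes from the same principle: most queries never reach the expensive model.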
The Bigger Picture
Energy use isn’t just about queries; training AI models and maintaining data centers (cooling, networking, etc.) add to the carbon footprint. Noman Bashir, an MIT researcher, compares current emissions studies to testing a car’s fuel efficiency without accounting for its weight or passengers. "We’re missing critical context," he says.
Call for Action
Luccioni advocates for mandatory carbon disclosures for all AI systems. "Given the climate crisis, this should be top of the agenda for regulators," she argues. Until then, the true cost of AI’s energy hunger will remain a mystery.
Paresh Dave contributed reporting.
About the Author

Dr. Lisa Kim
AI Ethics Researcher
Leading expert in AI ethics and responsible AI development with 13 years of research experience. Former member of Microsoft AI Ethics Committee, now provides consulting for multiple international AI governance organizations. Regularly contributes AI ethics articles to top-tier journals like Nature and Science.