
Gemini: Is Google’s AI really energy-efficient?


Artificial intelligence is reshaping our daily lives—we use it for everything, from writing to image generation. But its environmental impact is becoming a major concern, and AI ecology will be a central topic in 2026.

In this context, Google published a study claiming that a query on its Gemini model uses less energy than nine seconds of television. But that figure raises questions about the methodology and the limits of such estimates.

This article, written by the Yiaho team, analyzes these data, explores the challenges of measuring AI’s environmental footprint, and looks at the prospects for more sustainable technology.

Gemini: negligible energy consumption?

According to Google’s study, a text query on Gemini uses 0.24 watt-hours (Wh) of electricity, 0.26 milliliters of water (the equivalent of five drops), and emits 0.03 grams of carbon dioxide equivalent.

To give a sense of scale, that corresponds to the energy a modern television drawing about 100 W uses in nine seconds.
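The comparison can be checked with simple arithmetic. The sketch below assumes a 100 W television, a round figure not stated in Google's study, and converts nine seconds of viewing into watt-hours:

```python
# Sanity check of Google's comparison: one Gemini text query (0.24 Wh)
# vs. nine seconds of a television drawing ~100 W (assumed figure).
query_wh = 0.24          # Wh per Gemini text query, per Google's study
tv_power_w = 100         # assumed TV power draw in watts
tv_seconds = 9

# Energy = power x time; divide by 3600 to convert watt-seconds to Wh
tv_wh = tv_power_w * tv_seconds / 3600
print(f"TV for {tv_seconds} s: {tv_wh:.2f} Wh vs. one query: {query_wh} Wh")
```

At the assumed 100 W, nine seconds works out to 0.25 Wh, which is close to the 0.24 Wh figure Google reports per query.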

These figures focus on the active processing phase of queries, including the energy used by processors, data center cooling systems, and idle servers. But AI model training—a notoriously energy-hungry step—is excluded from these calculations, which limits the scope of the analysis.

Comparison with ChatGPT: a lack of clarity?

For comparison, Sam Altman, CEO of OpenAI, said in June 2025 that a query on ChatGPT uses about 0.34 Wh and 0.3 ml of water. These values, slightly higher than Gemini’s, remain hard to assess due to a lack of methodological details.

OpenAI’s lack of transparency makes any rigorous comparison impossible, highlighting a structural problem: the absence of universal standards for measuring AI’s energy impact.

Obstacles to reliable measurement

Assessing the environmental footprint of an AI query is a complex task.

Google adopted an approach that includes the energy used by infrastructure, cooling systems, and inactive servers needed to absorb traffic spikes. Yet several grey areas remain.

The definition of a “query” remains vague: is it a word, a sentence, or a longer text?

In addition, Google does not disclose the daily volume of queries on Gemini, making it impossible to estimate the overall impact.
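To see why the missing query volume matters, the sketch below multiplies the per-query figure by several purely illustrative daily volumes (these numbers are assumptions, not disclosures by Google):

```python
# Back-of-envelope scaling: aggregate daily energy if Gemini served
# N queries/day at 0.24 Wh each. Google discloses no query volume,
# so the volumes below are illustrative assumptions only.
query_wh = 0.24  # Wh per text query, per Google's study

for daily_queries in (100e6, 1e9, 10e9):          # assumed volumes
    daily_kwh = daily_queries * query_wh / 1000   # Wh -> kWh
    print(f"{daily_queries:>14,.0f} queries/day -> {daily_kwh:>10,.0f} kWh/day")
```

A tiny per-query footprint multiplied by an undisclosed, potentially very large volume can span several orders of magnitude, which is exactly why the overall impact cannot be estimated from Google's published figures alone.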

Finally, the lack of validation by an independent third party, as Google acknowledges, calls for caution when interpreting these results.

Toward greener AI?

Despite these encouraging figures for a single query, AI’s cumulative impact worldwide is significant. Data centers already consume 1% to 2% of global electricity, a share that could grow rapidly with the rise of AI.

Also read on this topic: Does AI pollute? The answer is surprising!

However, initiatives are emerging: Google is investing in infrastructure powered by renewable energy, and research aims to optimize algorithms to reduce their energy consumption, especially during training.

A commendable effort, but progress still needed

Google’s study on Gemini marks a step toward greater transparency in a sector that is often opaque. However, without international standards, external audits, and accounting for model training, these data remain incomplete.

For AI to be part of a sustainable approach, tech players will need to align their measurement methods and communicate more openly. In the meantime, users and industry observers have a key role to play in keeping the pressure on these critical issues.

Source: Connaissance des Énergies

Glen