On August 21, Google released its first report measuring the environmental impact of answering Gemini AI prompts. Analysts consider it the most transparent estimate yet of the environmental footprint of a popular AI product from a major tech company.
*Gemini logo. Photo: Reuters*
According to the report, a single Gemini prompt generates 0.033 gCO2e across all three emissions scopes. For an AI company, the largest share falls under scope 2, which covers indirect emissions from purchased electricity, heating, and cooling. Scope 1 covers direct emissions from the company's own operations, while scope 3 covers other indirect emissions such as upstream transportation, waste disposal, and fuel- and energy-related emissions not included in scopes 1 and 2.
For Gemini AI prompts, scope 2 emissions are 0.023 gCO2e, accounting for 70% of the total.
| Usage | Amount | Details |
|---|---|---|
| Energy | 0.24 Wh | Equivalent to watching TV for 9 seconds |
| Water | 0.26 ml | Equivalent to 5 drops of water |
| Emissions | 0.033 gCO2e | Scope 2: 0.023 gCO2e; scopes 1 and 3: 0.01 gCO2e |
The report also states that each prompt uses an average of 0.24 Wh of electricity and 0.26 ml of water. Google equates this consumption to five drops of water and 9 seconds of TV viewing.
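The reported figures are internally consistent, which a quick check confirms (all numbers below come from Google's report):

```python
# Verify the scope breakdown of the reported 0.033 gCO2e per Gemini prompt.
total_gco2e = 0.033    # all scopes, per prompt (reported)
scope2 = 0.023         # indirect emissions from electricity (reported)

scope1_and_3 = total_gco2e - scope2   # remainder attributed to scopes 1 and 3
scope2_share = scope2 / total_gco2e   # fraction of total from scope 2

print(f"Scopes 1 and 3: {scope1_and_3:.3f} gCO2e")   # 0.010 gCO2e
print(f"Scope 2 share:  {scope2_share:.0%}")          # 70%
```

Both values match the report: 0.01 gCO2e for scopes 1 and 3 combined, and roughly 70% of the total from scope 2.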
Google's AI accelerators account for 58% of the electricity usage. The host hardware that supports them, such as server CPUs and memory, adds another 25%. Backup machines, kept idle for failover, account for 10%. The remaining 7% goes to data center overhead, including cooling and power conversion.
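Applying those percentages to the reported 0.24 Wh per prompt gives the per-component energy cost (the percentages and the total are from Google's report; the split in watt-hours is simple arithmetic on top of them):

```python
# Break the reported 0.24 Wh per Gemini prompt down by component,
# using the percentage shares stated in Google's report.
total_wh = 0.24

shares = {
    "AI accelerators":        0.58,
    "Host CPUs and memory":   0.25,
    "Idle backup machines":   0.10,
}
# The remainder is data center overhead (cooling, power conversion).
shares["Data center overhead"] = 1 - sum(shares.values())

breakdown_wh = {name: total_wh * share for name, share in shares.items()}
for name, wh in breakdown_wh.items():
    print(f"{name}: {wh:.4f} Wh")
```

The accelerators alone draw about 0.139 Wh per prompt, more than the other three categories combined.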
Furthermore, the electricity used per AI query has decreased by a factor of 33 compared to a year ago. This improvement stems from advances in hardware, optimized algorithms, and more efficient data center operations.
However, Google's figures are average estimates and don't represent all queries submitted to Gemini. Some prompts consume more energy. Jeff Dean, Google's chief scientist, gave the example of inputting dozens of books into Gemini and requesting a detailed summary.
Additionally, using the reasoning model can also increase energy demands because it performs more steps before providing an answer.
Dean also noted that the report is limited to text prompts and doesn't account for the resources needed to generate images or videos, which are more energy-intensive.
Despite the multifold increase in efficiency per prompt, total electricity consumption at the company level continues to rise, driven by rapidly growing AI query demand. CarbonCredits calls this trend an example of the "Jevons paradox," an economic principle named after economist William Stanley Jevons.
This principle states that increased efficiency in using a resource can lead to an overall increase in the consumption of that resource, rather than a decrease. Higher efficiency lowers the cost of using the resource, thereby increasing demand and accelerating consumption.
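A toy calculation makes the paradox concrete. The 33x per-query efficiency gain is from Google's report; the query-volume growth figure below is purely hypothetical, chosen only to illustrate the mechanism:

```python
# Jevons paradox, illustrated: per-query efficiency improves 33x
# (reported), but if demand grows faster, total consumption still rises.
energy_per_query_now = 0.24                 # Wh (reported)
energy_per_query_before = 0.24 * 33         # Wh (implied by the 33x reduction)

queries_before = 1_000_000                  # hypothetical baseline volume
queries_now = queries_before * 50           # hypothetical 50x demand growth

total_before = energy_per_query_before * queries_before
total_now = energy_per_query_now * queries_now

growth = total_now / total_before           # 50 / 33, about 1.5x
print(f"Total energy grew {growth:.2f}x despite a 33x efficiency gain")
```

Whenever demand grows faster than efficiency improves, total consumption goes up, which is exactly the pattern in Google's company-level figures.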
In fact, Google's total greenhouse gas emissions in 2024 increased by 51% over five years, fueled by the surge in AI activity. Data centers consumed 30.8 million MWh of electricity, double the amount used in 2020.
Google has been working to mitigate the climate impact of its data centers, focusing on renewable and nuclear energy solutions and purchasing carbon offset credits.
Bao Bao (via Google, CarbonCredits, MIT Technology Review)