
Google's AI Energy Report: One Gemini Query Equals 1 Second of Microwave Operation

2 days ago

Google has released its first comprehensive report on the energy consumption of its Gemini AI application, revealing that a median prompt—representing the typical energy use across all queries—consumes 0.24 watt-hours of electricity. This amount of energy is roughly equivalent to running a standard microwave oven for one second. The report also provides estimates for water usage and carbon emissions associated with processing a single text-based prompt. This marks the first time a major tech company with a widely used AI product has offered such detailed, transparent data on the environmental footprint of its AI services.

The report includes full methodological details, explaining how Google calculated its estimates—something previously unavailable to researchers and the public due to limited access to internal operations. As AI adoption grows, understanding its energy demands has become increasingly urgent. Earlier this year, MIT Technology Review published a series of in-depth reports on AI and energy use, but no leading AI company at the time was willing to disclose energy usage per prompt. Google’s new report now fills that gap, offering a rare, behind-the-scenes look at the real-world costs of AI interaction.

The study covers the full scope of energy use: not just the power consumed by AI chips, but also the supporting infrastructure required to run them. According to Google’s chief scientist Jeff Dean, the goal was to be comprehensive. The findings show that AI chips—specifically Google’s custom TPUs and equivalent GPUs—account for 58% of the total energy. Another 25% comes from supporting hardware, including the CPU and memory on the host systems. An additional 10% is attributed to redundant backup systems that remain idle but are kept ready in case of failure. The remaining 8% comes from indirect operational costs such as cooling and power conversion within data centers.
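The microwave comparison can be sanity-checked with a few lines of arithmetic. The report gives only the 0.24 watt-hour figure; the 900 W microwave power below is an assumption (typical household microwaves draw roughly 800–1,100 W):

```python
# Sanity check: how long would a microwave take to use the energy of one prompt?
PROMPT_ENERGY_WH = 0.24        # median Gemini prompt, per Google's report
MICROWAVE_POWER_W = 900        # assumed power draw of a typical microwave

prompt_energy_joules = PROMPT_ENERGY_WH * 3600   # 1 Wh = 3,600 J
seconds = prompt_energy_joules / MICROWAVE_POWER_W

print(f"{prompt_energy_joules:.0f} J = about {seconds:.2f} s of microwave operation")
```

At the assumed 900 W, the result is just under one second, consistent with the headline comparison.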
“This kind of transparency is valuable for the field,” said Mosharaf Chowdhury, a professor at the University of Michigan and co-creator of the ML.Energy leaderboard, which tracks AI model energy use. “These kinds of estimates are only possible at the scale of companies like Google, which have access to detailed operational data.” Jae-Won Chung, a Ph.D. candidate and co-lead of the ML.Energy project, called the report “the most comprehensive analysis to date.”

It’s important to note that the 0.24 watt-hour figure represents the median, not the average across all queries. Some tasks—such as asking Gemini to summarize dozens of books—can consume significantly more energy, and inference-heavy tasks that require multiple reasoning steps tend to be more energy-intensive. The report focuses solely on text prompts and does not include energy use for image or video generation, which, as noted in earlier MIT Technology Review investigations, can require substantially more power.

Despite the high energy demands of some tasks, Google reports a dramatic decline in energy use over time. A median Gemini prompt in May 2024 consumed 33 times more energy than one in May 2025, a reduction the company attributes to continuous model and software optimizations.

In terms of carbon emissions, Google estimates that processing a median prompt results in 0.03 grams of CO₂. This figure was derived by multiplying the total energy used by the average carbon intensity of the electricity grid, adjusted for Google’s own renewable energy procurement. Rather than using regional grid averages, the company applied a market-based approach, factoring in its long-term commitments to clean energy. Since 2010, Google has secured contracts for over 22 gigawatts of renewable power from solar, wind, geothermal, and advanced nuclear sources, resulting in a carbon footprint per unit of electricity that is about one-third of the average for the grids in which it operates.
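The two figures the report does publish let us back out the effective carbon intensity Google is assuming. The ~400 gCO₂e/kWh comparison value below is an assumed representative grid average, not a number from the report:

```python
# Back out the effective carbon intensity implied by Google's two figures.
CO2_PER_PROMPT_G = 0.03        # grams CO2e per median prompt (market-based)
ENERGY_PER_PROMPT_WH = 0.24    # watt-hours per median prompt

intensity_g_per_kwh = CO2_PER_PROMPT_G / ENERGY_PER_PROMPT_WH * 1000
print(f"Implied intensity: {intensity_g_per_kwh:.0f} gCO2e/kWh")

# Assumption: a representative grid average of ~400 gCO2e/kWh for comparison.
GRID_AVG_G_PER_KWH = 400
ratio = intensity_g_per_kwh / GRID_AVG_G_PER_KWH
print(f"Ratio to assumed grid average: {ratio:.2f}")
```

This works out to 125 gCO₂e/kWh, roughly 0.31 of the assumed grid average—in line with the report’s “about one-third” claim.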
Water use is also included in the report. Google estimates that each prompt requires 0.26 milliliters of water—about five drops—primarily for cooling data center infrastructure.

Dean emphasized that the goal of the report is to help users understand the real-world impact of interacting with AI. “People are using these tools for all kinds of things,” he said. “They shouldn’t be worried about the energy or water use of Gemini because our measurements show it’s comparable to everyday actions—like watching a few seconds of TV or using a few drops of water.”

The release significantly advances public knowledge about AI’s resource footprint. It comes amid growing pressure on tech companies to disclose environmental data. “I’m really glad they released this,” said Sasha Luccioni, an AI and climate researcher at Hugging Face. “People want to know the cost.”

While the report offers unprecedented transparency, some key details remain undisclosed. Notably, Google has not revealed the total number of daily Gemini queries, which would allow for a better estimate of the platform’s overall energy impact. Ultimately, the decision about what to share, when, and how remains with the company.

Still, experts see this as a critical first step. “We’re pushing for a standardized AI energy rating—something like Energy Star,” Luccioni said. “This isn’t a replacement for that, but it’s a vital piece of the puzzle.”
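The “about five drops” comparison follows from the common convention of roughly 20 drops per milliliter (0.05 mL per drop), which is an assumption here rather than a figure from the report:

```python
# Convert the per-prompt water figure into drops.
WATER_PER_PROMPT_ML = 0.26     # milliliters per median prompt, per the report
ML_PER_DROP = 0.05             # assumption: ~20 drops per milliliter

drops = WATER_PER_PROMPT_ML / ML_PER_DROP
print(f"{WATER_PER_PROMPT_ML} mL = about {drops:.1f} drops")
```

At 0.05 mL per drop, 0.26 mL comes to 5.2 drops, matching the report’s “about five drops.”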
