Google shares how much energy Gemini AI uses


Google has shared new numbers on the environmental cost of its Gemini AI models. The company says a typical (median) text prompt uses just 0.24 watt-hours of energy, produces 0.03 grams of carbon emissions, and consumes 0.26 millilitres of water. To put it simply, that is about the same as watching TV for less than nine seconds.
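The nine-second comparison is easy to sanity-check: a television drawing on the order of 100 watts uses about 0.24 watt-hours in roughly eight to nine seconds. A minimal back-of-the-envelope check, where the 100 W figure is our own assumption rather than a number from Google:

```python
# Back-of-the-envelope check of the "nine seconds of TV" comparison.
# TV_POWER_W is an assumed figure for a typical set, not a number from Google.
ENERGY_PER_PROMPT_WH = 0.24   # median Gemini text prompt, per Google's figures
TV_POWER_W = 100.0            # assumed power draw of a television, in watts

seconds_of_tv = ENERGY_PER_PROMPT_WH / TV_POWER_W * 3600
print(f"Equivalent TV time: {seconds_of_tv:.1f} seconds")  # ~8.6 seconds
```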

These details were published in new research and blog posts by Google scientists, including Chief Scientist Jeff Dean. The goal is to give a clearer picture of what AI really costs at scale. Until now, many estimates missed parts of the picture, such as idle server capacity, cooling needs, or water consumption. Google says its new method gives a fuller and more accurate measurement.
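To illustrate the kind of accounting this implies, the sketch below structures a per-prompt estimate as chip energy plus host, idle-capacity, and data-centre overhead terms, with water tied to energy through a water-usage-effectiveness factor. Every value and variable name here is a placeholder chosen for illustration, not Google's published breakdown.

```python
# Illustrative structure of a "full stack" per-prompt estimate: chip energy
# plus host machine, idle capacity, and data-centre overhead, with water tied
# to energy via a water-usage-effectiveness factor. All values are placeholders.
ACCELERATOR_WH = 0.14     # energy drawn by the AI accelerators for one prompt
HOST_WH = 0.04            # host CPU and memory serving that prompt
IDLE_CAPACITY_WH = 0.02   # share of reserved-but-idle machines attributed to it
PUE = 1.10                # data-centre overhead multiplier (cooling, power losses)
WUE_L_PER_KWH = 1.0       # litres of water consumed per kWh of IT energy

it_energy_wh = ACCELERATOR_WH + HOST_WH + IDLE_CAPACITY_WH
total_energy_wh = it_energy_wh * PUE
# Wh multiplied by L/kWh works out numerically to millilitres, so no conversion needed.
water_ml = it_energy_wh * WUE_L_PER_KWH

print(f"{total_energy_wh:.2f} Wh and {water_ml:.2f} mL of water per prompt")
```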

The company also highlighted big efficiency improvements. In just one year, the energy used by a typical Gemini text prompt fell by a factor of 33 and its carbon footprint by a factor of 44, while the quality of the model's answers actually improved.
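To give a sense of scale, a 33-fold reduction implies a much larger per-prompt figure a year earlier. The arithmetic below assumes the 0.24 Wh number is the current value, an assumption made purely for illustration:

```python
# What a 33x reduction implies, assuming the 0.24 Wh figure is the current
# per-prompt value (an assumption made here purely for illustration).
current_wh = 0.24
energy_factor = 33
prior_wh = current_wh * energy_factor
print(f"Implied figure a year earlier: ~{prior_wh:.1f} Wh per prompt")  # ~7.9 Wh
```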


Google credits this progress to a combination of upgrades: model architectures such as Mixture-of-Experts, inference techniques such as speculative decoding that speed up responses, custom-built Tensor Processing Units (TPUs), and smarter data centre management. Its latest TPU, called Ironwood, is said to be 30 times more power-efficient than the first TPU Google made publicly available.
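As a rough illustration of the Mixture-of-Experts idea mentioned above (a sketch of the general technique, not Google's implementation), a learned gate picks a small number of expert sub-networks per token, so only a fraction of the model's parameters do any work on a given request:

```python
# A minimal sketch of Mixture-of-Experts routing, not Google's implementation:
# a learned gate scores several expert sub-networks per token and only the
# top-k of them run, so most parameters sit idle for any given token.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # hypothetical number of experts
TOP_K = 2         # experts activated per token
D_MODEL = 16      # hypothetical hidden size

# Toy "experts": a single dense matrix each, standing in for full FFN blocks.
expert_weights = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
gate_weights = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def moe_layer(token: np.ndarray) -> np.ndarray:
    # The gate scores every expert; softmax turns scores into mixing weights.
    logits = token @ gate_weights
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # Only the top-k experts do any computation; the rest are skipped entirely.
    top_idx = np.argsort(probs)[-TOP_K:]
    output = np.zeros_like(token)
    for i in top_idx:
        output += probs[i] * (token @ expert_weights[i])
    return output

token = rng.normal(size=D_MODEL)
print(moe_layer(token).shape)  # (16,): same output shape, but only 2 of 8 experts ran
```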

Even with these advances, demand for energy is still rising. In 2024, Google’s data centres used 27 percent more electricity than the year before, yet their energy-related emissions fell 12 percent, thanks to cleaner energy and efficiency gains. Google has pledged to run on carbon-free energy around the clock by 2030 and to replenish more water than it consumes.
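Taken together, those two figures imply a sizeable drop in emissions per unit of electricity, assuming both compare 2024 with 2023 on the same basis:

```python
# Rough arithmetic behind "27 percent more electricity, 12 percent fewer
# emissions": the implied drop in emissions per unit of electricity, assuming
# both figures compare 2024 with 2023 on the same basis.
electricity_ratio = 1.27   # 2024 data-centre electricity vs. 2023
emissions_ratio = 0.88     # 2024 data-centre energy emissions vs. 2023

intensity_ratio = emissions_ratio / electricity_ratio
print(f"Emissions per unit of electricity fell about {100 * (1 - intensity_ratio):.0f}%")  # ~31%
```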

The company says being open about AI’s energy use is important for accountability as AI becomes more common. By publishing its methods, it hopes to set a clear standard for measuring the true footprint of artificial intelligence and show that efficiency is key to making AI sustainable.