Generative AI Requires Massive Amounts of Power and Water, and the Aging U.S. Grid Can’t Handle the Load

The AI Boom Puts a Strain on Energy and Water Resources, Highlighting the Need for Infrastructure Upgrades


The explosive growth of generative AI has led to a surge in new data centers, which are driving a massive increase in power and water consumption. As AI applications expand, concerns are growing about whether the U.S. can meet this rising demand with its aging electrical grid and limited water resources.

Dipti Vachani, head of automotive at chip company Arm, highlights the urgency of addressing power constraints to sustain AI advancements. Arm's low-power processors are gaining traction in data centers because they can cut energy use by up to 15%. Nvidia's latest AI chip, Grace Blackwell, also emphasizes efficiency: the company says it can run generative AI models on 25 times less energy than the previous generation.

Despite these advancements, the energy requirements for AI remain daunting. A single ChatGPT query consumes nearly 10 times as much energy as a typical Google search, and generating one AI image can use as much power as charging a smartphone. Training a single large language model has been estimated to produce as much CO2 as five gas-powered cars emit over their lifetimes.


The rapid expansion of data centers to support AI is driving a significant increase in emissions. Google's greenhouse gas emissions rose nearly 50% from 2019 to 2023, partly due to data center energy consumption, and Microsoft's rose nearly 30% from 2020 to 2024. In Kansas City, where Meta is building an AI-focused data center, the high power demand has delayed plans to close a coal-fired power plant.

Globally, there are over 8,000 data centers, with the largest concentration in the U.S. The Boston Consulting Group projects that data center power demand will grow 15%-20% annually through 2030, by which point it could account for 16% of total U.S. power consumption — up from just 2.5% before OpenAI's ChatGPT launched in 2022, and equivalent to the power used by about two-thirds of U.S. homes.
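To get a feel for how fast that compounds, here is a minimal arithmetic sketch of the projection. The 2.5% baseline share and the 15%-20% growth rates are from the figures above; treating 2022-2030 as eight years of compounding, and assuming the rest of the grid's demand stays flat, are simplifying assumptions — compounding the share alone lands below 16%, so the full projection evidently also factors in larger and more power-dense facilities.

```python
# Sketch: compound a 15%-20% annual growth rate on data centers'
# 2.5% share of U.S. power consumption, 2022 through 2030.
# Assumes flat demand elsewhere on the grid (a simplification).

def compound_share(base_share: float, annual_rate: float, years: int) -> float:
    """Share of U.S. power after compounding annual_rate for `years` years."""
    return base_share * (1 + annual_rate) ** years

years = 2030 - 2022  # 8 years of growth

low = compound_share(2.5, 0.15, years)   # roughly 7.6%
high = compound_share(2.5, 0.20, years)  # roughly 10.7%

print(f"2030 share at 15%/yr: {low:.1f}%")
print(f"2030 share at 20%/yr: {high:.1f}%")
```

Even at the upper end, share-only compounding yields about 10.7%, which underscores how aggressive the 16% figure is.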

Efforts to manage this surge include looking for locations with better access to renewable energy and upgrading infrastructure. Vantage Data Centers, for instance, is expanding its operations in Ohio, Texas, and Georgia and exploring ways to integrate renewable sources like wind, solar, and nuclear power. Some companies are also investing in on-site energy generation, such as solar modules and mini nuclear reactors.

The U.S. electrical grid, however, struggles to handle the increased load due to its aging infrastructure. Expanding transmission lines is costly and time-consuming, with some projects facing opposition from local communities concerned about rising utility bills. Innovations like predictive software for transformers aim to address grid weaknesses, but replacing outdated equipment remains a slow process.

Cooling requirements add another layer of complexity. Generative AI data centers are projected to withdraw 4.2 billion to 6.6 billion cubic meters of water annually by 2027 — more than half the total annual water withdrawal of the U.K. Solutions such as waterless air conditioning and direct-to-chip liquid cooling are being explored, though retrofitting existing facilities can be challenging.

Companies are also investigating on-device AI to shift work off power-hungry data centers. Even so, the future of AI infrastructure will depend on overcoming these energy and water constraints.

As the industry continues to grow, balancing technological advancement with sustainable resource management will be crucial to avoid exacerbating the strain on power and water supplies.