As energy demands for AI increase, so should company transparency

Commentary | July 14, 2025

The rise of commercially available AI systems has led to alarming headlines claiming that each time ChatGPT writes an email, it's like dumping out a bottle of water, or that every prompt answered by a chatbot is equivalent to powering a light bulb for about 20 minutes.
As the technology becomes more integrated into our daily lives, researchers are attempting to quantify AI's impact on the environment. Yet while concerns over AI's energy demands have grown alongside the technology's ubiquity, policymakers' efforts to track or regulate the industry's footprint have not kept pace. That is largely due to the lack of relevant data and reporting mechanisms from companies in the tech and energy sectors.
What are AI's energy costs, and where do they come from?
The environmental similes used in news articles highlight just one aspect of AI's energy demands. The technology's ecological impact spans the months, and sometimes years, it can take to train and deploy an AI model. Different stages of development rely on specialized hardware with varying, but consistently large, energy needs.
During the training stage, models are fed curated, though often extensive, amounts of data to "learn" from based on their algorithms. The chips used in this process, typically graphics processing units (GPUs), often run 24 hours a day, producing high energy demands (usually met by non-renewable sources) that climb quickly with the complexity of the model.
For example, researchers from Google and UC Berkeley estimated that training OpenAI's GPT-3 model consumed enough electricity to power around 120 U.S. homes for one year. While a lack of access to comparable data has made calculating the electricity demands of GPT-4 more difficult, researchers estimate that training it likely required 50 times more electricity.
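For readers who want to check the arithmetic, here is a rough back-of-envelope sketch, assuming the widely cited estimate of roughly 1,287 MWh for GPT-3 training and average U.S. household consumption of about 10,700 kWh per year (both figures are approximations, not official company disclosures):

```python
# Back-of-envelope check of the "120 U.S. homes" comparison.
# ~1,287 MWh is the commonly cited GPT-3 training estimate;
# ~10,700 kWh/year approximates average U.S. household usage.
GPT3_TRAINING_KWH = 1_287_000        # ~1,287 MWh
AVG_US_HOME_KWH_PER_YEAR = 10_700

homes_powered_for_a_year = GPT3_TRAINING_KWH / AVG_US_HOME_KWH_PER_YEAR
print(f"GPT-3 training ~ {homes_powered_for_a_year:.0f} U.S. homes for one year")
# GPT-3 training ~ 120 U.S. homes for one year
```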
One estimate using data from the International Energy Agency (IEA) suggested that a ChatGPT query uses nearly 10 times more electricity than a typical Google search. These numbers will only grow as more users embrace generative AI programs instead of traditional search engines. In fact, researchers estimate that if Google processed 9 billion AI-powered searches per day, those searches could require 23 to 30 times the energy of conventional ones.
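A small illustrative calculation makes the per-query gap concrete. It assumes the commonly cited figures of roughly 0.3 watt-hours per conventional Google search and about 2.9 watt-hours per ChatGPT query; the higher 23-30x estimate above reflects additional overhead beyond this simple per-query ratio:

```python
# Daily energy at scale under assumed per-query figures:
# ~0.3 Wh per conventional search vs. ~2.9 Wh per AI query.
WH_PER_STANDARD_SEARCH = 0.3
WH_PER_AI_QUERY = 2.9
SEARCHES_PER_DAY = 9_000_000_000

standard_gwh = SEARCHES_PER_DAY * WH_PER_STANDARD_SEARCH / 1e9   # Wh -> GWh
ai_gwh = SEARCHES_PER_DAY * WH_PER_AI_QUERY / 1e9

print(f"Conventional search: ~{standard_gwh:.1f} GWh/day")
print(f"AI-powered search:   ~{ai_gwh:.1f} GWh/day (~{ai_gwh / standard_gwh:.0f}x)")
# Conventional search: ~2.7 GWh/day
# AI-powered search:   ~26.1 GWh/day (~10x)
```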
Experts are also tracking differences across models. Sasha Luccioni, the AI and Climate Lead at Hugging Face, maintains an "AI Energy Score" dashboard that reports an approximately 62,000-fold difference between the highest and lowest energy needs across its use cases and models. Energy demands for generating text, images, and videos also vary widely, with video generation being the most energy-intensive; the MIT Technology Review recently found that an AI model used about 3.4 million joules, roughly the energy needed to run a microwave for over an hour, to generate a 5-second video.
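The microwave comparison is easy to verify, assuming a typical 800-watt appliance (the wattage is an illustrative assumption; actual microwaves vary):

```python
# Sanity check of the microwave comparison for the 5-second AI video.
ENERGY_JOULES = 3_400_000      # ~3.4 MJ per generated video (MIT Technology Review figure)
MICROWAVE_WATTS = 800          # assumed typical microwave power draw

minutes = ENERGY_JOULES / MICROWAVE_WATTS / 60   # joules / watts = seconds
print(f"~{minutes:.0f} minutes of microwave runtime")
# ~71 minutes of microwave runtime
```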
Much of this processing takes place in data centers. Data center electricity consumption amounted to 4.4% of U.S. electricity demand in 2023 and could grow to 6% by 2026. Globally, these centers account for 1-2% of the world's electricity needs, but given the rising demands of AI, some expect this share could reach 21% by 2030.
These figures don't account for water, which data centers need to cool their hardware. Some U.K. researchers estimate that the global use of water for data processing could reach half of the U.K.'s annual water usage by 2027.
Additional indirect energy costs come from maintaining the buildings themselves, along with the computing hardware, storage systems, and network equipment they house.
There are also end-of-life costs. A GPU is typically used for about four years before being discarded or repurposed, and there is currently little information on how these devices are disposed of or what the environmental impact of the accumulating waste will be.
Lack of data makes AI energy consumption less transparent
These estimates are further complicated by the fact that tech and electric companies typically do not release energy or water consumption data and are under no obligation to disclose it.
Most estimates are therefore uncertain because they lack proprietary information. While Google, Microsoft, and Meta have declined to share the per-prompt energy needs of their models, OpenAI CEO Sam Altman recently wrote that an average query uses 0.34 watt-hours, roughly what a lightbulb uses in a couple of minutes, and 0.000085 gallons of water. These figures, however, reflect energy use on a per-query basis only and exclude the extensive energy needed to train the model.
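Even these limited per-query figures add up quickly at scale. The sketch below applies Altman's numbers to a hypothetical volume of 1 billion queries per day; the query count is an illustrative assumption, not an official OpenAI figure:

```python
# Scaling Altman's per-query figures to a year of hypothetical usage.
WH_PER_QUERY = 0.34                  # watt-hours per query (Altman's figure)
GAL_WATER_PER_QUERY = 0.000085       # gallons of water per query (Altman's figure)
QUERIES_PER_DAY = 1_000_000_000      # assumed round number for illustration

annual_gwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9   # Wh -> GWh
annual_gal = GAL_WATER_PER_QUERY * QUERIES_PER_DAY * 365

print(f"~{annual_gwh:.0f} GWh/year, inference only")          # ~124 GWh/year
print(f"~{annual_gal / 1e6:.0f} million gallons of water/year")  # ~31 million
```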
Growing efficiencies in AI energy consumption
This level of consumption is not lost on those leading the charge: Altman himself has conceded that AI will consume far more power than previously expected.
Although there have been some gains in efficiency, those improvements have recently plateaued. Larger data centers present one opportunity for greater energy efficiency, as exemplified by the construction of "hyperscale" centers. Improvements are also possible at the hardware level, such as power capping, which limits the amount of power fed to processors and GPUs; this technique has been shown to decrease energy consumption by up to 15% with minimal effects on the user's experience.
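In practice, power capping is exposed through standard vendor tooling. Below is a minimal sketch using NVIDIA's NVML Python bindings (pynvml); the 75% cap is an arbitrary illustrative value, not a recommended setting, and changing limits requires administrative privileges and compatible hardware:

```python
# Minimal sketch of GPU power capping via NVIDIA's NVML bindings (pynvml).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU on the system

# Read the factory default power limit (reported in milliwatts).
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
capped_mw = int(default_mw * 0.75)             # illustrative 75% cap

# Apply the lower cap; the driver throttles the GPU to stay under it.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, capped_mw)
print(f"Power limit lowered from {default_mw / 1000:.0f} W to {capped_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```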
Other efficiencies come from building smaller models, pruning or quantizing model architectures, switching to renewable energy sources, increasing collaboration among AI firms, and using AI to identify possible improvements for itself and other models. For example, Google DeepMind said it succeeded in reducing the energy used to cool its data centers by 30% by using AI to better predict cooling needs.
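As one concrete example of the model-shrinking techniques mentioned above, PyTorch supports post-training dynamic quantization, which stores a model's linear-layer weights as 8-bit integers instead of 32-bit floats. The toy model here is purely illustrative:

```python
# Sketch of post-training dynamic quantization in PyTorch.
import torch
import torch.nn as nn

# A toy model standing in for a much larger network.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

# Convert Linear layers to 8-bit integer weights; activations are
# quantized dynamically at inference time on CPU.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)   # same output shape, smaller and cheaper weights
```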
However, these optimization techniques themselves require additional use of AI, and therefore more energy, even after the new efficiencies are accounted for. Lower energy costs can also produce a rebound effect, in which more efficient technology and its lower operating costs drive greater demand and adoption, resulting once again in increased consumption. This cycle is sometimes referred to as the Jevons paradox, a term coined after the 19th-century economist William Stanley Jevons observed that more efficient steam engines increased, rather than decreased, coal consumption.
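A toy calculation shows how the rebound effect can swamp efficiency gains (all numbers are hypothetical):

```python
# Rebound effect: per-query efficiency improves 3x, but cheaper queries
# help drive 5x more demand, so total energy consumption still rises.
baseline_wh_per_query = 3.0
baseline_queries = 1_000_000_000

efficient_wh_per_query = baseline_wh_per_query / 3    # 3x efficiency gain
new_queries = baseline_queries * 5                    # demand more than rebounds

before = baseline_wh_per_query * baseline_queries
after = efficient_wh_per_query * new_queries
print(f"Total energy changes by {after / before:.2f}x despite the 3x gain")
# Total energy changes by 1.67x despite the 3x gain
```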
Unfortunately, the impact of increasing energy demands will not be felt equally: some regions and states are already being drained of resources far more than others. Virginia's "Data Center Alley" offers one example; it hosts over 300 data centers, the largest concentration in the United States. Residents are fighting back against the continued expansion of data centers in the region, which has brought rising energy bills, increasing water demand, and worsening air quality from the facilities' backup diesel generators.
Further south, the NAACP recently sued Elon Musk’s xAI for allegedly operating turbines for a South Memphis data center without appropriate permits. The NAACP emphasized that toxic emissions from the turbines were directed toward predominantly Black neighborhoods that bear the brunt of environmental racism.
What will power AI in the future?
To ensure that increasing AI adoption does not exacerbate environmental degradation, policymakers, companies, and communities will need a better understanding of where to focus their attention and resources, starting with better data on emissions and energy consumption.
MIT scientists found that companies using independent auditors for assurance decreased their total emissions by 7.5% annually, even when starting with higher emissions than companies without assurance. However, even such estimates may not be detailed enough. The IEA's estimate, for example, covers activity across data centers broadly, without a direct focus on AI applications. Unless tech companies reveal what percentage of their energy use is related to AI, that ambiguity will persist.
Last year, Sen. Ed Markey (D-MA) introduced the "Artificial Intelligence Environmental Impacts Act of 2024" to better measure AI's impact. The legislation would require the Environmental Protection Agency (EPA) to conduct a comprehensive study of the issue, in addition to convening a consortium of stakeholders through the National Institute of Standards and Technology (NIST) and creating a voluntary reporting system. The effort would culminate in an interagency report to Congress presenting findings and policy recommendations.
The U.S. Government Accountability Office (GAO) also published a report on generative AI's environmental and human effects in April 2025, outlining six possible policy options, including encouraging developers to share details about the infrastructure used to train and run generative AI models, as well as providing government incentives for more resource-efficient models and training methods.
At the state level, at least 60 bills addressing the impact of data centers have been introduced across the country, but they have produced little meaningful change so far. Harvard researchers found that the carbon intensity of the electricity used by data centers was 48% higher than the nationwide average, partly because data centers are often built in areas with dirtier electrical grids.
Major companies such as Meta and Microsoft have looked to alternative forms of energy to power their data centers, with a particular focus on nuclear energy. However, the MIT Technology Review has emphasized that nuclear power makes up only 20% of U.S. electricity production, while clean but intermittent sources such as wind and solar cannot on their own supply always-on data centers.
Given the many unknowns, the time is right for the U.S. to support further research initiatives into methods to lower AI’s environmental impact, while continuing to improve transparency and set standards in this important area. Failure to do so could not only accelerate the climate crisis, but also slow the adoption of AI applications and delay their benefits.
Acknowledgements and disclosures
Google, Microsoft, and Meta are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions posted in this piece are solely those of the authors and are not influenced by any donations.