Energy consumption for AI

Topic

The rapidly increasing demand for electricity to power AI data centers and training, which is becoming a major constraint on the growth of the AI industry.


Created at

7/26/2025, 7:10:45 AM

Last updated

7/26/2025, 7:13:00 AM

Research retrieved

7/26/2025, 7:13:00 AM

Summary

Energy consumption for artificial intelligence, particularly for training and running deep learning models and generative AI, is a significant and escalating concern, widely identified as the primary bottleneck for future AI development. Meeting this demand requires massive new investment in energy infrastructure to support the rapidly expanding AI ecosystem. The environmental impact includes a substantial carbon footprint, driven by reliance on fossil-fuel-based electricity, and excessive water usage for cooling data centers. Global data center electricity consumption was estimated at 240-340 TWh in 2022; projections indicate that by 2028, AI-specific purposes alone could consume 165-326 TWh annually, potentially accounting for over half of all US data center electricity and enough to power 22% of US households. Despite these challenges, some scientists suggest that AI may also offer solutions to environmental problems, and mitigation strategies such as transitioning to renewable energy and optimizing hardware and software are being explored.
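The 22%-of-US-households equivalence can be sanity-checked with rough arithmetic. The household count and average annual usage below are assumptions for illustration, not figures from the sources:

```python
# Rough cross-check: does 165-326 TWh/year line up with powering
# ~22% of US households? Both constants below are assumed values.
US_HOUSEHOLDS = 131e6        # assumed ~131 million US households
KWH_PER_HOUSEHOLD = 10_500   # assumed average annual use per household, kWh

twh = 0.22 * US_HOUSEHOLDS * KWH_PER_HOUSEHOLD / 1e9  # kWh -> TWh
print(f"22% of US households ≈ {twh:.0f} TWh/year")   # ≈ 303 TWh/year
```

The result (~303 TWh/year) falls near the upper end of the projected 165-326 TWh range, consistent with the Lawrence Berkeley National Laboratory projection quoted below.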

Referenced in 1 Document
Research Data
Extracted Attributes
  • Proposed solutions

    Transition to 100% renewable energy, specialized AI hardware/software, innovative data center designs, comprehensive AI policies/regulation

  • Key environmental impacts

    Substantial carbon footprint, excessive water usage

  • Energy split: Inference vs. Training

    Inference 60-70%, Training 20-40%

  • Energy consumption for image generation

    Approximately 0.5 watt-hours (Wh)

  • Energy consumption for AI server request

    7-9 watt-hours (Wh)

  • Data center electricity consumption (2022)

    240-340 TWh (1-1.3% of world demand)

  • Projected AI electricity consumption (2028)

    165-326 TWh/year

  • Primary bottleneck for future AI development

    Yes

  • Projected AI electricity consumption equivalent (2028)

    22% of all US households

  • Projected AI share of US data center electricity (2028)

    Over 50%
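The 7-9 Wh per-request attribute above implies a large multiple of a conventional web search; a quick sketch, assuming Google's widely cited 2009 figure of roughly 0.3 Wh per search (an assumption, since the exact baseline is not given here):

```python
# Sanity check of the "23-30x a normal search" comparison: divide the
# 7-9 Wh per AI server request by an assumed ~0.3 Wh per ordinary
# search (based on Google's 2009 blogpost figure).
WH_PER_SEARCH = 0.3  # assumed baseline, Wh per conventional search

low, high = 7 / WH_PER_SEARCH, 9 / WH_PER_SEARCH
print(f"AI request vs. normal search: {low:.0f}x - {high:.0f}x")  # 23x - 30x
```

This reproduces the 23-30× multiple reported in the Nature excerpt quoted under Web Search Results.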

Timeline
  • Google reported energy figures for normal search queries. (Source: Nature.com)

    2009-XX-XX

  • Global data centers consumed 240-340 TWh of electricity, representing 1-1.3% of world demand. (Source: Nature.com, IEA)

    2022-XX-XX

  • Alex de Vries estimates NVIDIA servers dedicated to AI could consume 85-134 TWh of electricity annually. (Source: Polytechnique Insights)

    2027-XX-XX

  • Projections indicate AI alone could consume 165-326 TWh of electricity annually, accounting for over 50% of data center electricity and equivalent to 22% of US households. (Source: MIT Technology Review, Lawrence Berkeley National Laboratory)

    2028-XX-XX

Environmental impact of artificial intelligence

The environmental impact of artificial intelligence includes substantial energy consumption for training and using deep learning models, and the related carbon footprint and water usage. Some scientists have suggested that artificial intelligence (AI) may also provide solutions to environmental problems.

Web Search Results
  • Reducing AI's Climate Impact: Everything You Always Wanted to ...

    To address the accelerating demands of AI’s energy consumption, the ideal solution would be to transition to 100% renewable energy, but this goal is currently distant. A more feasible approach is the syncretic one, combining specialized AI hardware and software, innovative data center designs, and the implementation of comprehensive AI policies, including regulation. This discussion will outline current strategies for reducing AI’s energy demands, with many solutions derived from software

  • How much energy will AI really consume? The good, the bad and ...

    that each request through an AI server requires 7–9 watt hours (Wh) of energy. That is 23–30 times the energy of a normal search, going by figures Google reported in a 2009 blogpost (see go.nature.com/3d8sd4t). When asked to comment on de Vries’ estimate, Google did not respond. [...] Luccioni and others found that different tasks require varying amounts of energy. On average, according to their latest results, generating an image from a text prompt consumes about 0.5 Wh of energy, while generating text uses a little less. For comparison, a modern smartphone might need 22 Wh for a full charge. But there is wide variation: larger models require more energy (see ‘How much energy does AI use?’). De Vries says that the numbers are lower than those in his paper, but that might be [...] On the basis of supply-chain estimation methods, analysts say that data centres currently use just a small proportion of the world’s electricity demand. The International Energy Agency (IEA) estimates4 that the electricity used by such facilities in 2022 was 240–340 TWh, or 1–1.3% of world demand (if cryptocurrency mining and data-transmission infrastructure are included, this raises the proportion to 2%).

  • Generative AI: energy consumption soars - Polytechnique Insights

    consumption by data centres, cryptocurrencies and AI could amount to between 160 and 590 TWh compared with 2022. This is equivalent to the electricity consumption of Sweden (low estimate) or Germany (high estimate). [...] But these figures could skyrocket. Alex de Vries estimates that by 2027, if production capacity matches the companies' promises, NVIDIA servers dedicated to AI could consume 85 to 134 TWh of electricity every year. The cause: the surge in the use of generative AI. ChatGPT, Bing Chat, Dall-E, etc. These types of artificial intelligence, which generate text, images or even conversations, have spread across the sector at record speed. However, this type of AI [...] ChatGPT, everything has been reversed and the inference phase has become predominant." Recent data provided by Meta and Google indicate that it accounts for 60-70% of energy consumption, compared with 20-40% for training.

  • We did the math on AI's energy footprint. Here's the story you haven't ...

    Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households. [...] By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth. [...] First, a data center humming away isn’t necessarily a bad thing. If all data centers were hooked up to solar panels and ran only when the sun was shining, the world would be talking a lot less about AI’s energy consumption. That’s not the case. Most electrical grids around the world are still heavily reliant on fossil fuels. So electricity use comes with a climate toll attached.

  • Why AI uses so much energy—and what we can do about it

    Stacked area and bar chart showing historical and projected U.S. server electricity consumption from 2014 to 2028, broken down by processor type. AI workloads, especially those using 8 GPUs, drive significant growth in projected energy use, with total consumption reaching nearly 400 TWh in high scenarios by 2028. What are the key environmental consequences of AI development? [...] The environmental impact of AI extends beyond high electricity usage. AI models consume enormous amounts of fossil-fuel-based electricity, significantly contributing to greenhouse gas emissions. The need for advanced cooling systems in AI data centers also leads to excessive water consumption, which can have serious environmental consequences in regions experiencing water scarcity. [...] Initially, energy concerns in computing were consumer-driven, such as improving battery life in mobile devices. Today, the focus is shifting to environmental sustainability, carbon footprint reduction, and making AI models more energy efficient. AI, particularly large language models (LLMs), requires enormous computational resources. Training these models involves thousands of graphics processing units (GPUs) running continuously for months, leading to high electricity consumption.
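The IEA figure quoted in the Nature excerpt (240-340 TWh in 2022 as 1-1.3% of world demand) can also be cross-checked. The ~26,000 TWh figure for 2022 world electricity demand is an assumption, not stated in the sources:

```python
# Cross-check of the IEA data center figure: 240-340 TWh in 2022,
# reported as 1-1.3% of world electricity demand. World demand of
# ~26,000 TWh is an assumed value for illustration.
WORLD_DEMAND_TWH = 26_000  # assumed 2022 world electricity demand

for twh in (240, 340):
    share = 100 * twh / WORLD_DEMAND_TWH
    print(f"{twh} TWh = {share:.1f}% of world demand")
# 240 TWh = 0.9% of world demand
# 340 TWh = 1.3% of world demand
```

The low end rounds to ~0.9-1% depending on the demand figure assumed, so the result is broadly consistent with the quoted 1-1.3% range.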