energy consumption for AI

Topic

The significant amount of electrical power required to train and run AI models. The discussion centers on a paper from researchers in Germany proposing a new architecture that could cut this energy consumption by a factor of tens of thousands.


First Mentioned

9/27/2025, 5:10:04 AM

Last Updated

9/27/2025, 5:12:57 AM

Research Retrieved

9/27/2025, 5:12:57 AM

Summary

Energy consumption for AI is a growing concern, driven by the increasing capabilities and adoption of AI models. The topic was recently highlighted on the All-In Podcast, where discussion included a proposed architectural solution from researchers in Germany aimed at drastically reducing AI's energy demands, potentially enabling more powerful inference on edge devices. For context, global electricity consumption reached 24,398 terawatt-hours (TWh) in 2022, almost exactly three times the 1981 level. AI's energy footprint is substantial: a single AI server request consumes 7–9 watt-hours (Wh), roughly 23–30 times the energy of a conventional web search. Data centers, which train and run AI models, are major drivers of electricity demand, consuming an estimated 240–340 TWh globally in 2022. Projections indicate that by 2028, AI-specific purposes could account for 165–326 TWh annually, more than half of data center electricity and equivalent to the electricity used by 22% of US households. Proposed mitigations include hardware improvements, smaller models, smarter training, and the use of clean energy.

Referenced in 1 Document
Research Data
Extracted Attributes
  • Energy Formula

    E = P × t (Energy = Power × Time); see the consistency check after this list

  • Power Measurement Units

    Watts (W), Kilowatts (kW)

  • Energy Measurement Units

    Watt-hours (Wh), Kilowatt-hours (kWh)

  • Energy for Generating Text

    Slightly less than 0.5 Wh

  • Energy per AI Server Request

    7–9 Wh

  • Global Electricity Consumption (1981)

    8,132 TWh

  • Global Electricity Consumption (2022)

    24,398 TWh

  • Hardware Power-Capping Effect on Time

    Increase time by 3%

  • Energy per Normal Search (2009 Google)

    Roughly 1/30 to 1/23 of the energy of an AI server request

  • Hardware Power-Capping Effect on Energy

    Decrease consumption by up to 15%

  • Energy for Modern Smartphone Full Charge

    22 Wh

  • Typical AI Data Center Power Consumption

    As much as 100,000 households

  • US Energy for Data Centers (Latest Reports)

    4.4% of all US energy

  • Energy for Generating Image from Text Prompt

    ~0.5 Wh

  • Data Center Electricity Use (2022, IEA estimate)

    240–340 TWh (1–1.3% of world demand)

  • Largest AI Data Centers Under Construction Power Consumption

    20 times a typical AI data center

  • Projected AI Electricity Use (2028, US Households Equivalent)

    22% of all US households

  • Projected AI Electricity Use (2028, Lawrence Berkeley National Lab)

    165–326 TWh/year

  • Data Center Electricity Use (2022, IEA estimate, including crypto/data transmission)

    2% of world demand
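
Taken together, these figures can be cross-checked with simple ratios. Below is a minimal consistency check in Python that uses only the numbers listed above; the variable names are illustrative and not from the source.

```python
# Consistency check over the extracted attributes (energies in Wh, totals in TWh).

ai_request_wh = (7, 9)        # energy per AI server request, Wh
search_ratio = (23, 30)       # AI request vs. 2009-era Google search, times more
phone_charge_wh = 22          # full charge of a modern smartphone, Wh
global_twh_1981 = 8_132       # global electricity consumption, 1981, TWh
global_twh_2022 = 24_398      # global electricity consumption, 2022, TWh

# Implied energy of a normal search: 7-9 Wh divided by the 23-30x factor.
implied_search_wh = (ai_request_wh[0] / search_ratio[1],
                     ai_request_wh[1] / search_ratio[0])
print(f"Implied energy per normal search: "
      f"{implied_search_wh[0]:.2f}-{implied_search_wh[1]:.2f} Wh")

# One AI server request as a fraction of a full smartphone charge.
print(f"AI request vs. phone charge: "
      f"{ai_request_wh[0] / phone_charge_wh:.0%}-{ai_request_wh[1] / phone_charge_wh:.0%}")

# Growth in global electricity consumption, 1981 -> 2022.
print(f"1981 -> 2022 growth factor: {global_twh_2022 / global_twh_1981:.2f}x")
```

The implied 0.23–0.39 Wh per conventional search and the 3.00× growth factor are consistent with the figures quoted in the article text below.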

Timeline
  • Global electricity consumption was 8,132 TWh. (Source: wikipedia)

    1981

  • Google reported figures for normal search energy consumption. (Source: web_search_results)

    2009

  • Global electricity consumption reached 24,398 TWh. (Source: wikipedia)

    2022

  • Data centers used an estimated 240–340 TWh of electricity, representing 1–1.3% of world demand (IEA estimate). (Source: web_search_results)

    2022

  • Data centers adopted hardware designed for AI, which led them to double their electricity consumption. (Source: web_search_results)

    2023

  • Lawrence Berkeley National Laboratory published new projections on AI electricity use. (Source: web_search_results)

    Late 2023

  • Projected that more than half of data center electricity will be used for AI, consuming 165–326 TWh/year (Lawrence Berkeley National Laboratory projection); a rough plausibility check follows this timeline. (Source: web_search_results)

    2028

  • Researchers in Germany proposed an architectural solution to drastically reduce energy consumption for AI. (Source: Related Documents)

    Undated

  • The All-In Podcast discussed the topic of energy consumption for AI. (Source: Related Documents)

    Undated
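
The 2028 projection pairs 165–326 TWh/year with the electricity used by 22% of US households. The sketch below is a rough plausibility check of that equivalence; it assumes roughly 130 million US households averaging about 10,500 kWh of electricity per year, both of which are my assumptions and not figures from the source.

```python
# Rough plausibility check: 165-326 TWh/year vs. electricity for 22% of US households.
# ASSUMPTIONS (not from the source): ~130 million US households, ~10,500 kWh/year average use.

projected_ai_twh = (165, 326)        # Lawrence Berkeley National Laboratory projection, TWh/year
us_households = 130e6                # assumed US household count
household_kwh_per_year = 10_500      # assumed average annual household electricity use, kWh

household_twh = us_households * household_kwh_per_year / 1e9   # convert kWh to TWh
share = tuple(t / household_twh for t in projected_ai_twh)
print(f"All US households combined: ~{household_twh:.0f} TWh/year")
print(f"Projected AI use as a share of that total: {share[0]:.0%}-{share[1]:.0%}")
```

Under these assumptions the upper end of the projection lands close to the cited 22%-of-households equivalence, while the lower end corresponds to roughly half that.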

Electric energy consumption

Electric energy consumption is energy consumption in the form of electrical energy. About a fifth of global energy is consumed as electricity, for residential, industrial, commercial, transportation and other purposes. Global electricity consumption in 2022 was 24,398 terawatt-hours (TWh), almost exactly three times the 1981 figure (8,132 TWh). China, the United States, and India accounted for more than half of the global share of electricity consumption. Japan and Russia followed with nearly twice the consumption of the remaining industrialized countries. While power is measured in watts (W) or kilowatts (kW), energy consumption is typically measured in watt-hours (Wh) or kilowatt-hours (kWh). The relationship is fundamental: energy equals power multiplied by time, E = P × t.
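
Because energy is power multiplied by time, the power-capping figures listed in the attributes above (up to 15% less energy for about 3% more time) imply how far the average power draw must fall. A minimal sketch of that arithmetic, assuming both figures describe the same workload:

```python
# Implied power reduction from power-capping, using E = P * t with the figures above.
# Assumes the "up to 15% less energy" and "3% more time" figures describe the same run.

energy_factor = 0.85   # energy after capping, relative to baseline (up to 15% decrease)
time_factor = 1.03     # runtime after capping, relative to baseline (3% increase)

power_factor = energy_factor / time_factor   # P = E / t
print(f"Average power draw falls by about {1 - power_factor:.1%}")   # ~17.5%
```

In other words, combining the 15% energy saving with the 3% slowdown via E = P × t implies the capped hardware draws roughly 17–18% less power on average.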

Web Search Results
  • How much energy will AI really consume? The good, the bad and ...

    that each request through an AI server requires 7–9 watt hours (Wh) of energy. That is 23–30 times the energy of a normal search, going by figures Google reported in a 2009 blogpost (see go.nature.com/3d8sd4t). When asked to comment on de Vries’ estimate, Google did not respond. [...] Luccioni and others found that different tasks require varying amounts of energy. On average, according to their latest results, generating an image from a text prompt consumes about 0.5 Wh of energy, while generating text uses a little less. For comparison, a modern smartphone might need 22 Wh for a full charge. But there is wide variation: larger models require more energy (see ‘How much energy does AI use?’). De Vries says that the numbers are lower than those in his paper, but that might be [...] On the basis of supply-chain estimation methods, analysts say that data centres currently use just a small proportion of the world’s electricity demand. The International Energy Agency (IEA) estimates4 that the electricity used by such facilities in 2022 was 240–340 TWh, or 1–1.3% of world demand (if cryptocurrency mining and data-transmission infrastructure are included, this raises the proportion to 2%).

  • AI and energy: Will AI reduce emissions or increase power demand?

    AI’s energy use currently only represents a fraction of the technology sector’s power consumption, which is estimated to generate around 2-3% of total global emissions. This is likely to change as more companies, governments and organizations use AI to drive efficiency and productivity. Data centres are already significant drivers of electricity demand growth in many regions. [...] The data centres used to train and operate AI models consume much of this energy. A typical AI data centre, according to the International Energy Agency (IEA), uses as much power as 100,000 households right now, but the largest centres currently being constructed will consume 20 times that amount. [...] “AI systems vary widely in energy consumption depending on their complexity and usage, but they generally require significant amounts of electricity to process and analyse data efficiently.”

  • Optimize Efficiency With AI-Driven Energy Management - Pecan AI

    AI-driven energy management solutions are crucial for reducing costs, increasing sustainability, and minimizing environmental impact. AI optimizes energy consumption by analyzing data, identifying patterns, and making accurate forecasts. Real-world examples show AI's effectiveness in reducing energy usage and costs. Benefits include cost reduction, improved sustainability, and environmental impact. [...] AI energy management primarily works by analyzing and interpreting vast energy consumption data. Advanced algorithms and machine learning models identify these data sets' patterns, anomalies, and trends. These insights allow for more accurate forecasting of energy needs and better decision-making, reducing energy waste. [...] Cost reduction is one of the most attractive benefits of leveraging AI energy management. Energy is a significant overhead for many organizations. Businesses can drastically cut their energy bills through the smart and efficient use of energy. AI-driven solutions achieve this by identifying inefficiencies in energy consumption and making strategic recommendations to rectify them.

  • We did the math on AI's energy footprint. Here's the story you haven't ...

    Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households. [...] By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth. [...] hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

  • The Future of AI and Energy Efficiency - IBM

    Solutions include hardware improvements, smaller models, smarter model training, use of clean and renewable energy, and open source and collaboration. Hardware improvements: Power-capping hardware has been shown to decrease energy consumption by up to 15%, while only increasing the time it takes to return a result by a barely noticeable 3%. AI energy use can also be reduced by using carbon-efficient hardware, which “matches a model with the most carbon-efficient mix of hardware,” according to MIT. [...] The rapid growth in AI adoption has also resulted in dramatic increases in energy use. Energy is needed both to build and train AI models and then to power the complex math that a model completes each time it is asked for information or to generate content. [...] No one expects AI adoption to slow because too many companies and executives see it as an indispensable part of their future. Marrying these 2 ambitions—tapping AI’s benefits while progressing on net-zero pledges—requires a smart approach. Addressing AI’s energy consumption challenges: Fortunately, numerous industry experts are working on a range of solutions. Those solutions include: