Energy demand for AI

Topic

The significant and growing need for electricity to power the data centers that train and run artificial intelligence models, a need that is the primary driver of Big Tech's investment in nuclear energy.


Created At

8/20/2025, 3:38:28 AM

Last Updated

8/20/2025, 3:40:09 AM

Research Retrieved

8/20/2025, 3:40:09 AM

Summary

The increasing energy demand for Artificial Intelligence (AI) is a significant concern, driven by the substantial energy consumption and carbon footprint associated with training and utilizing deep learning models, as well as the material intensity of AI data centers. This trend compels major technology companies like Amazon, Google, and Microsoft to invest heavily in nuclear energy, particularly Small Modular Reactors (SMRs). While AI presents environmental challenges, it also offers potential solutions such as material innovations and improved grid management. In response to these environmental impacts, governments are implementing policies to oversee AI-related environmental issues and infrastructure development. The discussion around AI's energy needs has also sparked debates about the necessity and safety of nuclear power. Furthermore, AI's growing influence is beginning to disrupt industries like travel, with platforms like Perplexity potentially disintermediating traditional online travel agencies.

Referenced in 1 Document

Research Data

Extracted Attributes
  • Key Concern

    Increasing energy demand for AI

  • Primary Drivers

    Training and utilizing deep learning models

  • Material Intensity

    AI data centers require large amounts of electronics using specialized mined metals

  • Dominant Energy Use in AI

    Inference (80-90% of computing power)

  • Associated Environmental Impacts

    Substantial energy consumption, carbon footprint, water usage, e-waste

  • Localized Impact Example (Ireland)

    Data centers make up around 17% of its electricity demand

  • Projected US Power Demand (AI Data Centers) by 2035

    123 gigawatts (up from 4 gigawatts in 2024)

  • Current Global Electricity Demand Share (Data Centers)

    1% to 2%

  • Annual Growth Rate (Data Center Electricity Consumption 2024-2030)

    Around 15% per year

  • Projected Global Electricity Demand (Data Centers) by 2030 (IEA Base Case)

    Around 945 TWh

  • Projected Global Electricity Demand (Data Centers) by 2035 (IEA Lift-Off Case)

    Exceeding 1700 TWh

  • Projected Global Electricity Demand Share (Data Centers) by 2030 (MIT Estimate)

    Up to 21% (factoring in cost of delivering AI)

  • Projected Global Electricity Demand Share (Data Centers) by 2030 (IEA Base Case)

    Just under 3%

  • Projected Global Electricity Demand (Data Centers) by 2035 (IEA High Efficiency Case)

    Around 970 TWh

Timeline
  • Global electricity demand from data centers is projected to double by 2026, fueled by AI adoption. (Source: IEA, MIT Sloan)

    2022-01-01

  • US power demand from AI data centers is estimated at 4 gigawatts. (Source: Deloitte)

    2024-01-01

  • Data center electricity consumption is projected to grow by around 15% per year until 2030. (Source: IEA)

    2024-01-01

  • Power going to AI-specific purposes is estimated to rise to between 165 and 326 terawatt-hours per year. (Source: Technology Review)

    2028-01-01

  • Global electricity consumption for data centers is projected to reach around 945 TWh, representing just under 3% of total global electricity consumption (IEA Base Case). (Source: IEA)

    2030-01-01

  • Data centers could account for up to 21% of overall global energy demand when factoring in the cost of delivering AI to customers (MIT Sloan estimate). (Source: MIT Sloan)

    2030-01-01

  • Global electricity demand from data centers is projected to reach around 970 TWh (IEA High Efficiency Case) or exceed 1700 TWh (IEA Lift-Off Case). (Source: IEA)

    2035-01-01

  • US power demand from AI data centers could reach 123 gigawatts. (Source: Deloitte)

    2035-01-01

  • Increasing energy demand for AI becomes a significant concern due to substantial energy consumption and carbon footprint of deep learning models and data centers. (Source: Summary, Wikipedia)

    Ongoing

  • Governments begin instituting policies to improve oversight and review of environmental issues associated with AI and related infrastructure. (Source: Wikipedia)

    Ongoing

  • Big Tech companies like Amazon, Google, and Microsoft invest heavily in nuclear energy, particularly Small Modular Reactors (SMRs). (Source: Related Documents, Summary)

    Ongoing

  • AI's growing influence begins to disrupt industries like travel, with platforms like Perplexity threatening traditional online travel agencies. (Source: Summary, Related Documents)

    Ongoing

  • Debates arise regarding the necessity and safety of nuclear power for AI energy needs, with differing viewpoints on economic benefits and risks. (Source: Summary, Related Documents)

    Ongoing

  • AI offers potential solutions to environmental problems, such as material innovations and improved grid management. (Source: Wikipedia)

    Ongoing

  • Inference accounts for 80-90% of AI's computing power energy demands. (Source: Technology Review)

    Ongoing

Environmental impact of artificial intelligence

The environmental impact of artificial intelligence includes substantial energy consumption for training and running deep learning models, along with the associated carbon footprint and water usage. Moreover, AI data centers are materially intensive, requiring large quantities of electronics built from specialized mined metals that will eventually be disposed of as e-waste. Site selection for new server farms can also raise environmental justice concerns when projects proceed without full review by local communities. Some scientists argue that artificial intelligence (AI) may also provide solutions to environmental problems, such as material innovations, improved grid management, and other forms of optimization across various fields of technology. As the environmental impact of AI becomes more apparent, governments have begun instituting policies to improve the oversight and review of environmental issues associated with the use of AI and related infrastructure development.

Web Search Results
  • Energy demand from AI

    The High Efficiency Case shares similar constraints and drivers with the Base Case, but assumes stronger progress on energy efficiency in software, hardware and infrastructure. As a result, the same level of demand for digital services and AI is met with a reduced electricity consumption footprint. This unlocks energy savings of more than 15%, with global electricity demand from data centres reaching around 970 TWh by 2035. As a result, 2.6% of global electricity demand goes to data centres. [...] The Lift-Off Case assumes stronger growth in AI adoption than in the Base Case. A more resilient supply chain and greater flexibility in data centre location, powering and operations enable faster data centre deployment. It sees global electricity demand from data centres in 2035 that is around 45% higher than in the Base Case, exceeding the 1700 TWh mark and reaching around 4.4% of global electricity demand. [...] Our Base Case finds that global electricity consumption for data centres is projected to double to reach around 945 TWh by 2030, representing just under 3% of total global electricity consumption in 2030. From 2024 to 2030, data centre electricity consumption grows by around 15% per year, more than four times faster than the growth of total electricity consumption from all other sectors. However, in the wider context, a 3% share in 2030 means that the data centre share in global electricity demand [...]

  • What's the impact of artificial intelligence on energy demand?

    First, the International Energy Agency (IEA) recently published its landmark World Energy Outlook 2024 report. It suggests that energy demand for data centres and AI will still be pretty small for the next five years at least. I read it as them saying: “Everyone just needs to chill out a bit.” But in a more diplomatic way. [...] While data centres and AI consume only a few percent of global electricity, in some countries this share is much higher. Ireland is a perfect example, where data centres make up around 17% of its electricity demand. In the US and some countries in Europe, it’s higher than the global average, and closer to 3% to 4%. As we’ll see later, energy demand for AI is very localised; in more than five states in the US, data centres account for more than 10% of electricity demand. [...] So, both things can be right at the same time: Microsoft’s data centres could be powered entirely by Three Mile Island and this would just be a rounding error on US or global electricity demand. What’s crucial here is that the energy demands for AI are very localised. This means there can be severe strain on the grid at a highly localised level, even if the impact on total energy demand is small.

  • AI has high data center energy costs — but there are solutions

    Data centers could account for up to 21% of overall global energy demand by 2030 when the cost of delivering AI to customers is factored in. Already, data centers account for 1% to 2% of overall global energy demand, similar to what experts estimate for the airline industry, Gadepally said. That figure is poised to skyrocket, given rising AI demands, potentially hitting 21% by 2030, when costs related to delivering AI to consumers are factored in. [...] Surging demand for artificial intelligence has had a significant environmental impact, especially when it comes to data center use. The International Energy Agency has estimated that global electricity demand from data centers could double between 2022 and 2026, fueled in part by AI adoption.

  • We did the math on AI's energy footprint. Here's the story ...

    “For any company to make money out of a model—that only happens on inference,” says Esha Choukse, a researcher at Microsoft Azure who has studied how to make AI inference more efficient. As conversations with experts and AI companies made clear, inference, not training, represents an increasing majority of AI’s energy demands and will continue to do so in the near future. It’s now estimated that 80–90% of computing power for AI is used for inference. [...] One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and those of all the other AI companies and startups. [...] By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth.

  • Can US infrastructure keep up with the AI economy?

    The scale of AI data centers and their commensurate power needs are growing exponentially. By 2035, Deloitte estimates that power demand from AI data centers in the United States could grow more than thirtyfold, reaching 123 gigawatts, up from 4 gigawatts in 2024 (figure 1). AI data centers can require dramatically more energy per square foot than traditional data centers. For example, a five-acre data center augmenting central processing units with specialized graphics processing units might [...]
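As a back-of-envelope consistency check, the IEA figures quoted above can be compounded in a few lines of Python. This is a rough sketch using only the quoted numbers; the 2024 baseline and the Base Case 2035 level are not stated directly in the excerpt, so both are derived here.

```python
# Consistency check on the IEA data-centre projections quoted above.
base_2030_twh = 945        # IEA Base Case, 2030
growth_rate = 0.15         # ~15%/yr growth, 2024-2030
liftoff_2035_twh = 1700    # Lift-Off Case, 2035 ("around 45% higher than Base")
high_eff_2035_twh = 970    # High Efficiency Case, 2035

# 2024 baseline implied by compounding 15%/yr backward from 2030
implied_2024 = base_2030_twh / (1 + growth_rate) ** 6
print(f"Implied 2024 consumption: {implied_2024:.0f} TWh")  # ≈ 409 TWh

# Base Case 2035 level implied by the Lift-Off Case being ~45% higher
implied_base_2035 = liftoff_2035_twh / 1.45
savings = 1 - high_eff_2035_twh / implied_base_2035
print(f"High Efficiency savings vs Base in 2035: {savings:.0%}")  # ≈ 17%
```

The derived ~17% saving is consistent with the excerpt's "energy savings of more than 15%", which suggests the quoted figures hang together.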
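The Technology Review arithmetic above (one billion text prompts per day totaling ~109 GWh per year, images adding ~35 GWh) implies a per-query energy cost and an average household consumption figure that can be backed out directly; a quick sketch using only the quoted numbers:

```python
# Back out the figures implied by the Technology Review estimates above.
text_gwh_per_year = 109    # ~109 GWh/yr for 1 billion text queries/day
image_gwh_per_year = 35    # ~35 GWh/yr for the added image generation
queries_per_day = 1e9

# Implied energy per text query
wh_per_query = text_gwh_per_year * 1e9 / (queries_per_day * 365)
print(f"~{wh_per_query:.2f} Wh per query")  # ≈ 0.30 Wh

# Implied average US household consumption, from both homes-powered figures
kwh_per_home_text = text_gwh_per_year * 1e6 / 10_400
kwh_per_home_image = image_gwh_per_year * 1e6 / 3_300
print(f"{kwh_per_home_text:.0f} and {kwh_per_home_image:.0f} kWh/home/yr")
```

Both homes-powered figures imply roughly 10,500 kWh per household per year, so the two estimates use a consistent baseline.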
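Deloitte's trajectory above (4 GW in 2024 to 123 GW in 2035) implies a compound annual growth rate; a short calculation makes the "more than thirtyfold" claim concrete:

```python
# Implied compound annual growth rate of US AI data-center power demand
# from the Deloitte figures quoted above (4 GW in 2024 -> 123 GW in 2035).
gw_2024, gw_2035, years = 4, 123, 11

cagr = (gw_2035 / gw_2024) ** (1 / years) - 1
print(f"Growth factor: {gw_2035 / gw_2024:.1f}x")  # ≈ 30.8x
print(f"Implied CAGR: {cagr:.0%}")                 # ≈ 37% per year
```

An implied ~37% annual growth rate sustained for over a decade underscores why the excerpt frames this as an infrastructure challenge.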