Energy Consumption for AI
AI data centers are driving significant and growing demand for electricity, raising concerns about grid stability and rising energy costs for consumers.
First Mentioned
10/4/2025, 5:08:52 AM
Last Updated
10/4/2025, 5:13:53 AM
Research Retrieved
10/4/2025, 5:13:53 AM
Summary
The rapid advancement and widespread adoption of Artificial Intelligence (AI), particularly generative AI, have led to significant concerns regarding its substantial energy consumption. This energy demand is primarily driven by the extensive use of data centers, which are straining power grids in various regions, such as Virginia. The burgeoning AI sector is characterized by an intense arms race, with companies like OpenAI, Google DeepMind, and Meta pushing the boundaries of AI capabilities, aiming for artificial general intelligence (AGI). This progress is fueled by innovations like the transformer architecture and the increasing use of graphics processing units to accelerate neural networks. However, the proliferation of AI-generated content, sometimes referred to as "AI slop," and the rise of open-source AI models from countries like China, present both opportunities and challenges, putting pressure on the dominance of closed-source models from the United States. The immense energy requirements of AI, coupled with ethical considerations and potential harms from AI-generated content, are prompting discussions about regulatory policies, including state-level regulations and the need for federal oversight to ensure the technology's safety and benefits while maintaining economic competitiveness.
Referenced in 1 Document
Research Data
Extracted Attributes
- Environmental Impacts: Greenhouse gas emissions (due to fossil-fuel-based electricity), excessive water consumption (for cooling systems)
- Energy for Text Generation: Slightly less than 0.5 Wh
- Energy per AI Server Request: 7–9 watt hours (Wh)
- Hardware Power-Capping Impact: Decrease energy consumption by up to 15% with only a 3% increase in result return time
- Proposed Solutions for Efficiency: Hardware improvements (e.g., power-capping), smaller models, smarter model training, use of clean/renewable energy, open source and collaboration, carbon-efficient hardware
- Key Technologies Driving Consumption: Graphics Processing Units (GPUs), Transformer architecture
- Primary Driver of Energy Consumption: Data Centers
- Energy for Image Generation (text prompt): Approximately 0.5 Wh
- Global Data Center Electricity Demand (2022): 240–340 TWh (1–1.3% of world demand, up to 2% with crypto mining and data-transmission)
- Projected US AI Electricity Consumption (2028): 165–326 TWh per year
- Projected Data Center Electricity for AI (2028): More than half of total data center electricity
- Projected US Server Electricity Consumption (2028, high scenario): Nearly 400 TWh
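The attribute values above can be combined into a quick sanity check. The Python sketch below is purely illustrative and not from any of the cited sources: it applies the reported power-capping figures (up to 15% less energy for roughly 3% longer response times) to an assumed request, using the mid-range 8 Wh per AI server request and a hypothetical 2-second baseline latency.

```python
# Illustrative sanity check of the power-capping figures above (assumed inputs,
# not measurements from the cited sources).

baseline_energy_wh = 8.0   # mid-range of the 7-9 Wh quoted per AI server request
baseline_latency_s = 2.0   # hypothetical baseline response time, purely illustrative

capped_energy_wh = baseline_energy_wh * (1 - 0.15)   # up to 15% less energy
capped_latency_s = baseline_latency_s * (1 + 0.03)   # about 3% slower responses

# Energy-delay product (EDP): a common figure of merit for judging whether a
# power/performance trade-off is worthwhile (lower is better).
edp_baseline = baseline_energy_wh * baseline_latency_s
edp_capped = capped_energy_wh * capped_latency_s

print(f"Energy per request: {baseline_energy_wh:.2f} Wh -> {capped_energy_wh:.2f} Wh")
print(f"Response time:      {baseline_latency_s:.2f} s  -> {capped_latency_s:.2f} s")
print(f"Energy-delay product improves by {(1 - edp_capped / edp_baseline) * 100:.1f}%")
```

Under these assumptions the energy-delay product improves by roughly 12%, which is why power-capping appears among the proposed efficiency measures despite the small latency penalty.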
Timeline
- 1956: Artificial intelligence was founded as an academic discipline. (Source: wikipedia)
- 2009: Google reported energy figures for a normal search, which are significantly lower than those for AI server requests. (Source: web_search_results)
- 2012: Funding and interest in AI vastly increased, with graphics processing units (GPUs) beginning to be used to accelerate neural networks. (Source: wikipedia)
- 2017: Growth in AI accelerated further with the introduction of the transformer architecture. (Source: wikipedia)
- 2020s: An ongoing period of rapid progress in advanced generative AI became known as the 'AI boom'. (Source: wikipedia)
- 2022: Data centers globally consumed an estimated 240–340 TWh of electricity, accounting for 1–1.3% of world demand (up to 2% including cryptocurrency mining and data-transmission infrastructure). (Source: web_search_results)
- 2028: Projections indicate that more than half of the electricity consumed by data centers will be used for AI, with AI alone consuming 165–326 TWh per year in the US. (Source: web_search_results)
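The 2022 and 2028 entries imply a few figures that are not stated directly. The sketch below is a back-of-the-envelope calculation, not taken from the IEA or Lawrence Berkeley analyses: it backs out the world electricity demand implied by the 2022 data-center share and compares the projected 2028 US AI consumption against that total.

```python
# Back-of-the-envelope check of the 2022 and 2028 timeline figures
# (illustrative only; pairing of bounds with percentages is an assumption).

dc_2022_twh = (240, 340)        # global data center consumption, 2022
dc_2022_share = (0.010, 0.013)  # stated share of world electricity demand

# World demand implied by pairing the low figure with 1% and the high with 1.3%.
implied_world_twh = tuple(twh / share for twh, share in zip(dc_2022_twh, dc_2022_share))
print(f"Implied world electricity demand (2022): "
      f"{implied_world_twh[0]:,.0f}-{implied_world_twh[1]:,.0f} TWh")

# Projected US AI consumption in 2028, expressed against the 2022 world total.
for ai_twh in (165, 326):
    share_pct = ai_twh / implied_world_twh[0] * 100
    print(f"US AI at {ai_twh} TWh/yr would be about {share_pct:.1f}% of 2022 world demand")
```

Under these rough numbers, US AI workloads alone in 2028 would account for a share of 2022 world demand comparable to what all data centers combined represented that year.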
Wikipedia
Artificial intelligence
Artificial intelligence (AI) is the capability of computational systems to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making. It is a field of research in computer science that develops and studies methods and software that enable machines to perceive their environment and use learning and intelligence to take actions that maximize their chances of achieving defined goals. High-profile applications of AI include advanced web search engines (e.g., Google Search); recommendation systems (used by YouTube, Amazon, and Netflix); virtual assistants (e.g., Google Assistant, Siri, and Alexa); autonomous vehicles (e.g., Waymo); generative and creative tools (e.g., language models and AI art); and superhuman play and analysis in strategy games (e.g., chess and Go). However, many AI applications are not perceived as AI: "A lot of cutting edge AI has filtered into general applications, often without being called AI because once something becomes useful enough and common enough it's not labeled AI anymore."

Various subfields of AI research are centered around particular goals and the use of particular tools. The traditional goals of AI research include learning, reasoning, knowledge representation, planning, natural language processing, perception, and support for robotics. To reach these goals, AI researchers have adapted and integrated a wide range of techniques, including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics. AI also draws upon psychology, linguistics, philosophy, neuroscience, and other fields. Some companies, such as OpenAI, Google DeepMind and Meta, aim to create artificial general intelligence (AGI)—AI that can complete virtually any cognitive task at least as well as a human.

Artificial intelligence was founded as an academic discipline in 1956, and the field went through multiple cycles of optimism throughout its history, followed by periods of disappointment and loss of funding, known as AI winters. Funding and interest vastly increased after 2012 when graphics processing units started being used to accelerate neural networks and deep learning outperformed previous AI techniques. This growth accelerated further after 2017 with the transformer architecture. In the 2020s, an ongoing period of rapid progress in advanced generative AI became known as the AI boom. Generative AI's ability to create and modify content has led to several unintended consequences and harms, which has raised ethical concerns about AI's long-term effects and potential existential risks, prompting discussions about regulatory policies to ensure the safety and benefits of the technology.
Web Search Results
- How much energy will AI really consume? The good, ...
that each request through an AI server requires 7–9 watt hours (Wh) of energy. That is 23–30 times the energy of a normal search, going by figures Google reported in a 2009 blogpost (see go.nature.com/3d8sd4t). When asked to comment on de Vries’ estimate, Google did not respond. [...] Luccioni and others found that different tasks require varying amounts of energy. On average, according to their latest results, generating an image from a text prompt consumes about 0.5 Wh of energy, while generating text uses a little less. For comparison, a modern smartphone might need 22 Wh for a full charge. But there is wide variation: larger models require more energy (see ‘How much energy does AI use?’). De Vries says that the numbers are lower than those in his paper, but that might be [...] On the basis of supply-chain estimation methods, analysts say that data centres currently use just a small proportion of the world’s electricity demand. The International Energy Agency (IEA) estimates4 that the electricity used by such facilities in 2022 was 240–340 TWh, or 1–1.3% of world demand (if cryptocurrency mining and data-transmission infrastructure are included, this raises the proportion to 2%).
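Two comparisons in the Nature excerpt above reduce to simple division. The snippet below is an illustrative re-derivation rather than the authors' methodology: it backs out the per-search energy implied by the quoted 23–30x ratio and counts how many image generations fit in one 22 Wh smartphone charge.

```python
# Illustrative re-derivation of two comparisons quoted from the Nature piece.

ai_request_wh = 7.0     # low end of the 7-9 Wh per AI server request
ratio_vs_search = 23.0  # low end of the quoted 23-30x multiple

# Per-search energy implied by the ratio (the figure attributed to Google's 2009 post).
implied_search_wh = ai_request_wh / ratio_vs_search
print(f"Implied energy per conventional search: ~{implied_search_wh:.2f} Wh")

image_gen_wh = 0.5      # average energy to generate one image from a text prompt
phone_charge_wh = 22    # quoted figure for a full modern smartphone charge
print(f"Image generations per phone charge: ~{phone_charge_wh / image_gen_wh:.0f}")
```

The implied value of roughly 0.3 Wh per conventional search is consistent with the figure commonly attributed to Google's 2009 blog post, and a 22 Wh charge corresponds to roughly 44 image generations at 0.5 Wh each.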
- Why AI uses so much energy—and what we can do about it
[Figure: Stacked area and bar chart showing historical and projected U.S. server electricity consumption from 2014 to 2028, broken down by processor type. AI workloads, especially those using 8 GPUs, drive significant growth in projected energy use, with total consumption reaching nearly 400 TWh in high scenarios by 2028.] [...] The environmental impact of AI extends beyond high electricity usage. AI models consume enormous amounts of fossil-fuel-based electricity, significantly contributing to greenhouse gas emissions. The need for advanced cooling systems in AI data centers also leads to excessive water consumption, which can have serious environmental consequences in regions experiencing water scarcity. [...] Initially, energy concerns in computing were consumer-driven, such as improving battery life in mobile devices. Today, the focus is shifting to environmental sustainability, carbon footprint reduction, and making AI models more energy efficient. AI, particularly large language models (LLMs), requires enormous computational resources. Training these models involves thousands of graphics processing units (GPUs) running continuously for months, leading to high electricity consumption.
- We did the math on AI's energy footprint. Here's the story ...
Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households. [...] By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth. [...] First, a data center humming away isn’t necessarily a bad thing. If all data centers were hooked up to solar panels and ran only when the sun was shining, the world would be talking a lot less about AI’s energy consumption. That’s not the case. Most electrical grids around the world are still heavily reliant on fossil fuels. So electricity use comes with a climate toll attached.
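The "22% of all US households" comparison in the excerpt above can be sanity-checked with two assumed inputs that do not appear in the source: roughly 130 million US households and an average annual consumption of about 10,800 kWh per household. The sketch below is an approximation, not the Lawrence Berkeley National Laboratory methodology.

```python
# Rough check of the "22% of all US households" comparison.
# Both inputs below are assumptions, not figures from the source.

US_HOUSEHOLDS = 130e6                # assumed: roughly 130 million US households
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed: average annual US household use

for twh in (165, 326):               # projected US AI electricity consumption, 2028
    kwh = twh * 1e9                  # 1 TWh = 1 billion kWh
    equivalent_households = kwh / AVG_HOUSEHOLD_KWH_PER_YEAR
    share_pct = equivalent_households / US_HOUSEHOLDS * 100
    print(f"{twh} TWh/yr ~= {equivalent_households / 1e6:.0f} million households "
          f"({share_pct:.0f}% of all US households)")
```

With those assumptions, the upper end of the 165–326 TWh range works out to roughly 23% of US households, close to the quoted figure.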
- Energy and AI – Analysis - IEA
This report from the International Energy Agency (IEA) aims to fill this gap based on new global and regional modelling and datasets, as well as extensive consultation with governments and regulators, the tech sector, the energy industry and international experts. It includes projections for how much electricity AI could consume over the next decade, as well as which energy sources are set to help meet it. It also analyses what the uptake of AI could mean for energy security, emissions,
- The Future of AI and Energy Efficiency - IBM
Proposed solutions include hardware improvements, smaller models, smarter model training, use of clean and renewable energy, and open source and collaboration. Power-capping hardware has been shown to decrease energy consumption by up to 15%, while only increasing the time it takes to return a result by a barely noticeable 3%. AI energy use can also be reduced by using carbon-efficient hardware, which "matches a model with the most carbon-efficient mix of hardware," according to MIT. [...] The rapid growth in AI adoption has also resulted in dramatic increases in energy use. Energy is needed both to build and train AI models and then to power the complex math that a model completes each time it is asked for information or to generate content. [...] No one expects AI adoption to slow because too many companies and executives see it as an indispensable part of their future. Marrying these two ambitions, tapping AI's benefits while progressing on net-zero pledges, requires a smart approach; numerous industry experts are working on a range of solutions.