Energy constraints in AI

Topic

The concept, discussed by Chamath Palihapitiya, that the growth of AI will be limited not by silicon technology, but by the availability of energy inputs and other raw materials.

First Mentioned

10/11/2025, 3:44:33 AM

Last Updated

10/11/2025, 3:45:47 AM

Research Retrieved

10/11/2025, 3:45:47 AM

Summary

Energy constraints are a significant bottleneck in the ongoing buildout of artificial intelligence, particularly generative AI. Generative AI systems rely on large data centers filled with specialized chips, which require substantial electricity for processing and water for cooling. This high energy demand, coupled with supply-chain bottlenecks for critical components such as high-bandwidth memory (HBM), presents a major challenge for the AI industry. Companies like OpenAI are working to secure these resources, dealing directly with manufacturers to ensure the hardware needed for the massive AI infrastructure under development is available. The energy intensity of AI systems has also raised concerns about their environmental impact, especially in the context of the global energy transition.

Referenced in 1 Document

Research Data
Extracted Attributes
  • Primary Constraint

    High energy demand for processing and cooling

  • Transparency Issue

    Lack of detailed data and transparency from firms on AI systems' electricity demands

  • Environmental Impact

    Significant contribution to carbon footprint and ecological harm

  • Energy per Text Generation

    Slightly less than 0.5 Wh

  • Energy per AI Server Request

    7-9 watt hours (23-30 times a normal search; see the sanity check after this list)

  • Critical Component Bottleneck

    High-bandwidth memory (HBM)

  • Key Resource Required (Cooling)

    Water

  • Regulatory Classification Issue

    AI not yet treated as its own sector by the US Energy Information Administration (EIA)

  • Key Resource Required (Processing)

    Electricity

  • Projected AI Power Consumption (2028)

    165-326 terawatt-hours per year (more than the current electricity use of all US data centers for all purposes)

  • Energy per Image Generation (text prompt)

    Approximately 0.5 Wh

  • Estimated GPT-3 Training Carbon Emissions

    502 metric tons

  • Estimated GPT-3 Training Energy Consumption

    1,287 MWh

  • Projected Global Data Center Electricity Share (2030)

    Double (IEA estimate)

  • Projected Electricity Demand Increase (AI in search engines)

    Tenfold increase (IEA estimate)
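
Several of the figures above can be cross-checked against one another. Below is a minimal sanity check in Python, using only the numbers listed above; note that the ~0.3 Wh baseline for a normal search is inferred from the 7-9 Wh and 23-30x figures rather than quoted directly.

    # Sanity checks on the attribute values above. Assumption: the baseline
    # energy of a "normal search" is inferred from the stated ratio, not
    # quoted directly from the source.

    ai_request_wh = (7, 9)       # Wh per AI server request (low, high)
    search_multiple = (23, 30)   # AI request as a multiple of a normal search

    # Implied Wh per normal search: 7/23 ~= 0.30 and 9/30 = 0.30, consistent
    # with the ~0.3 Wh per search Google reported in 2009.
    for wh, mult in zip(ai_request_wh, search_multiple):
        print(f"Implied Wh per normal search: {wh / mult:.2f}")

    # GPT-3 training: 502 t of CO2 over 1,287 MWh implies the carbon
    # intensity of the electricity used during training.
    gpt3_mwh = 1_287
    gpt3_t_co2 = 502
    print(f"Implied intensity: {gpt3_t_co2 * 1e6 / (gpt3_mwh * 1e3):.0f} g CO2/kWh")  # ~390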

Timeline
  • Google reported energy figures for a normal search query, providing a baseline for comparison with AI energy demands. (Source: web_search_results)

    2009

  • The training process for the GPT-3 model is estimated to have consumed 1,287 MWh of energy and caused approximately 502 metric tons of carbon emissions. (Source: web_search_results)

    2020-05-01

  • Researchers estimate that electricity consumption for AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. (Source: web_search_results)

    2028

  • The International Energy Agency (IEA) projects that the share of global electricity powering data centers will double. (Source: web_search_results)

    2030

  • The AI buildout is a capital-intensive race significantly constrained by energy demands and by supply-chain bottlenecks for critical components like high-bandwidth memory (HBM). (Source: related_documents)

    Ongoing

  • OpenAI is actively working to secure necessary hardware resources by dealing directly with manufacturers such as SK Hynix and Samsung. (Source: summary)

    Ongoing

Generative artificial intelligence

Generative artificial intelligence (Generative AI, GenAI, or GAI) is a subfield of artificial intelligence that uses generative models to produce text, images, videos, audio, software code, or other forms of data. These models learn the underlying patterns and structures of their training data and use them to produce new data based on the input, which often comes in the form of natural language prompts. Generative AI tools have become more common since the AI boom of the 2020s, made possible by improvements in transformer-based deep neural networks, particularly large language models (LLMs). Major tools include chatbots such as ChatGPT, Copilot, Gemini, Claude, Grok, and DeepSeek; text-to-image models such as Stable Diffusion, Midjourney, and DALL-E; and text-to-video models such as Veo and Sora. Technology companies developing generative AI include OpenAI, xAI, Anthropic, Meta AI, Microsoft, Google, DeepSeek, and Baidu.

Generative AI is used across many industries, including software development, healthcare, finance, entertainment, customer service, sales and marketing, art, writing, fashion, and product design. The production of generative AI systems requires large-scale data centers using specialized chips that consume large amounts of electricity for processing and water for cooling.

Generative AI has raised many ethical questions and governance challenges, as it can be used for cybercrime or to deceive and manipulate people through fake news or deepfakes. Even when used ethically, it may lead to mass replacement of human jobs. The tools themselves have been criticized as violating intellectual property laws, since they are trained on copyrighted works. The material and energy intensity of AI systems has raised concerns about the environmental impact of AI, especially in light of the challenges created by the energy transition.

Web Search Results
  • Why AI uses so much energy—and what we can do about it

    Initially, energy concerns in computing were consumer-driven, such as improving battery life in mobile devices. Today, the focus is shifting to environmental sustainability, carbon footprint reduction, and making AI models more energy efficient. AI, particularly large language models (LLMs), requires enormous computational resources. Training these models involves thousands of graphics processing units (GPUs) running continuously for months, leading to high electricity consumption. By [...] Additionally, the storage and transfer of massive datasets used in AI training require substantial energy, further increasing AI’s environmental burden. Without proper sustainability measures, the expansion of AI could accelerate ecological harm and worsen climate change. [...] Only a handful of organizations, such as Google, Microsoft, and Amazon, can afford to train large-scale models due to the immense costs associated with hardware, electricity, cooling, and maintenance. Smaller institutions with limited GPU/TPU resources would take significantly longer to train models, leading to even higher cumulative energy consumption. Additionally, AI models often require frequent retraining to remain relevant, further increasing energy usage. Infrastructure failures, [...]

  • Energy Efficiency in AI Models: Strategies for a Sustainable Future

    The training of AI models heavily relies on energy-intensive hardware. The use of specialized hardware such as graphics processing units (GPUs) results in significant energy consumption. For instance, the training process of the GPT-3 model with 175 billion parameters is estimated to have consumed 1,287 MWh of energy and caused approximately 502 metric tons of carbon emissions. However, energy consumption is not limited to the training phase. The real-world deployment of these models—continuous [...] Many different approaches are being tried to reduce the high energy consumption of AI systems. These methods aim to minimize energy usage while maintaining model performance as much as possible. The main strategies for improving energy efficiency in AI models are outlined below: [...] Accurately measuring how much energy AI models consume and their carbon footprints is complex. One of the main reasons for this complexity is the variability in energy usage depending on the hardware AI models run on. For example, GPUs, CPUs, or specialized AI chips all consume different amounts of energy. Additionally, the diversity of energy sources used by data centers further complicates energy and carbon calculations.

  • We did the math on AI's energy footprint. Here's the story you haven't ...

    By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth. [...] We heard from several other researchers who say that their ability to understand the emissions and energy demands of AI are hampered by the fact that AI is not yet treated as its own sector. The US Energy Information Administration, for example, makes projections and measurements for manufacturing, mining, construction, and agriculture, but detailed data about AI is simply nonexistent. “Why should we be paying for this infrastructure? Why should we be paying for their power bills?” [...] One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and that for all the other AI companies and startups.

  • How much energy will AI really consume? The good, the bad and ...

    [...] that each request through an AI server requires 7–9 watt hours (Wh) of energy. That is 23–30 times the energy of a normal search, going by figures Google reported in a 2009 blogpost (see go.nature.com/3d8sd4t). When asked to comment on de Vries’ estimate, Google did not respond. [...] Luccioni and others found that different tasks require varying amounts of energy. On average, according to their latest results, generating an image from a text prompt consumes about 0.5 Wh of energy, while generating text uses a little less. For comparison, a modern smartphone might need 22 Wh for a full charge. But there is wide variation: larger models require more energy (see ‘How much energy does AI use?’). De Vries says that the numbers are lower than those in his paper, but that might be [...] Complicating matters further is a lack of transparency from firms about their AI systems’ electricity demands. “The real problem is that we’re operating with very little detailed data and knowledge of what’s happening,” says Jonathan Koomey, an independent researcher who has studied the energy use of computing for more than 30 years and who runs an analytics firm in Burlingame, California.

  • The Future of AI and Energy Efficiency - IBM

    The rapid growth in AI adoption has also resulted in dramatic increases in energy use. Energy is needed both to build and train AI models and then to power the complex math that a model completes each time it is asked for information or to generate content. [...] Existing methods of training models require significant energy because AI developers often use several previous models as a starting point to train new models. Running all these models increases the power required. [...] The International Energy Agency (IEA) has suggested that integrating AI into existing tools such as internet search engines might result in a tenfold increase in electricity demand. By 2030, the IEA projects the share of global electricity that powers data centers will double.
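
The "one billion queries a day" arithmetic in the MIT Technology Review excerpt above can be roughly reproduced. This is a sketch: the ~0.3 Wh-per-text-generation and ~10,500 kWh-per-home values are assumptions inferred from the excerpts, not quoted figures.

    # Reproduce the "one billion queries a day" estimate from the excerpt above.
    wh_per_text = 0.3        # assumed: "a little less" than the 0.5 Wh image figure
    queries_per_day = 1e9
    days_per_year = 365

    total_gwh = wh_per_text * queries_per_day * days_per_year / 1e9  # Wh -> GWh
    print(f"Annual electricity: {total_gwh:.1f} GWh")  # ~109.5, matching "over 109"

    home_kwh_per_year = 10_500   # assumed US-average household consumption
    print(f"US homes powered for a year: {total_gwh * 1e6 / home_kwh_per_year:,.0f}")  # ~10,400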