This project explores the energy cost of AI models from queries to training
AI has become an everyday assistant, helping with everything from planning travel and generating images to debugging code, writing emails, and even offering therapeutic support. While these large language models (LLMs) are helpful, their utility comes at a surprising environmental cost.
A single query to a popular AI chatbot consumes as much electricity as leaving a lightbulb on for 20 minutes—over ten times the energy required for a typical Google search. Remember the old warning about leaving the fridge door open and “killing the polar bears”? That may have been an exaggeration, but in today’s AI-driven world, the environmental impact feels far more tangible.
This project explores the hidden cost of AI: its energy consumption. It investigates the electricity demands of training AI models, running everyday queries, and maintaining the massive data centers that keep these tools running.
Artificial Intelligence (AI) refers to computer systems that perform tasks requiring complex human intelligence, such as understanding language, creating or processing images, and making decisions. Larger AI models like ChatGPT and Gemini rely on energy-intensive processes during their initial training and continued daily use, which involve running computations on GPU or TPU hardware. This consumes large amounts of electricity, often sourced from carbon-emitting power grids. AI's environmental footprint is growing rapidly and remains an important point of concern.
ChatGPT uses about 3 Wh of energy per query, which is equivalent to:
Running a lightbulb for 20 minutes
Charging your phone to 20–30%
10x the energy of a single Google search
Microwaving something for 2 minutes
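The equivalences above can be sanity-checked with back-of-envelope arithmetic. This sketch assumes typical reference values not stated in the text: a 9 W LED bulb, a ~15 Wh smartphone battery, and ~0.3 Wh per Google search.

```python
# Back-of-envelope check of the "3 Wh per query" equivalents.
# Reference values below are assumptions, not figures from this project.
QUERY_WH = 3.0             # estimated energy per ChatGPT query

LED_BULB_W = 9.0           # assumed LED bulb wattage
PHONE_BATTERY_WH = 15.0    # assumed smartphone battery capacity
GOOGLE_SEARCH_WH = 0.3     # assumed energy per Google search

bulb_minutes = QUERY_WH / LED_BULB_W * 60        # minutes of LED light
phone_charge_pct = QUERY_WH / PHONE_BATTERY_WH * 100  # % of a phone charge
search_equiv = QUERY_WH / GOOGLE_SEARCH_WH       # equivalent Google searches

print(f"{bulb_minutes:.0f} min of LED light, "
      f"{phone_charge_pct:.0f}% phone charge, "
      f"{search_equiv:.0f} Google searches")
```

With these assumed values the numbers line up with the list above: 20 minutes of light, a 20% phone charge, and ten searches.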
Different query types vary significantly in energy use depending on their complexity, token length, and compute requirements (e.g., context window size, model architecture, inference depth).
Energy: 0.01–0.03 Wh/query (about the same as 1–2 seconds of LED bulb use)
Energy: 0.2–1.0 Wh/query (about the same as charging a smartphone for 5 minutes)
Energy: 0.3–1.5 Wh/query (about the same as boiling water for a cup of tea)
Energy: 0.5–2.0 Wh/query (about the same as using a laptop for 10–20 minutes)
Explore top models from OpenAI, Google, Anthropic, Meta, and Mistral: usage stats, queries/day, and total energy use.
~100M queries/day, ~400 MWh training energy
~75M queries/day, ~500 MWh training energy
~7.5M queries/day, ~375 MWh training energy
~17.5M queries/day, ~350 MWh training energy
~500M queries/day, training energy unknown
~7.5M queries/day, training energy unknown
Queries/day unknown, training energy unknown
~1M queries/day, ~433 MWh training energy
Explore the various companies with multiple LLMs. Each color represents a company, and lines connect each parent model to its child models.
When comparing the energy usage of AI models and digital platforms, Google Search remains the highest consumer, at an estimated 24 million MWh annually. The next-highest non-Google entry, GPT-3, consumed an estimated 1,287 MWh in training, meaning a single year of Google Search uses roughly 18,650 times the energy of GPT-3's entire training run.
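The ~18,650x figure follows directly from the two estimates in the text, as this short check shows:

```python
# Ratio of Google Search's estimated annual energy use to GPT-3's
# training energy, using the two figures cited in the text.
google_search_mwh_per_year = 24_000_000  # ~24 million MWh/year
gpt3_training_mwh = 1_287                # GPT-3 training estimate

ratio = google_search_mwh_per_year / gpt3_training_mwh
print(f"Google Search uses ~{ratio:,.0f}x GPT-3's training energy per year")
# ratio is ~18,648, which rounds to the ~18,650x cited above
```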
This visualization looks into estimated CO2 emissions associated with different AI companies based on the energy usage of their deployed AI LLM models. Google leads with a significant margin, due to the scale of Google Search and projections of AI-Overview usage. Major contributors include Meta, OpenAI, and DeepMind. However, open-source, research-focused organizations like BigScience, Hugging Face, and Aleph Alpha show significantly lower emission footprints.
The timeline of AI deployments shows a steady increase in LLMs from 2020 to 2026. Large tech companies like Google and Meta continue to release new iterations of their LLMs, such as Meta's Llama 2 variants. This timeline highlights both open-source initiatives and proprietary product launches, illustrating the diverse and growing ecosystem of LLM development.
ChatGPT uses about 1.095 billion kWh (roughly 1.1 terawatt-hours, TWh) of energy per year, which is equivalent to:
Powering 104,000 homes for a year
Powering the entire country of Finland or Belgium for a day
The amount of energy needed for 2.5 moon missions
1.6 million years of constant Netflix streaming
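The "104,000 homes" equivalence above can be checked with one division. The household figure here is an assumption (roughly the average annual electricity use of a U.S. home), not a number from the source:

```python
# Sanity check for the "powering 104,000 homes for a year" claim.
chatgpt_kwh_per_year = 1.095e9  # ~1.1 TWh, as cited above
home_kwh_per_year = 10_500      # assumed average U.S. household usage

homes_powered = chatgpt_kwh_per_year / home_kwh_per_year
print(f"~{homes_powered:,.0f} homes powered for a year")
```

At ~10,500 kWh per household, the division lands at roughly 104,000 homes, matching the figure in the list.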
AI is here to stay—but how do we ensure we have the resources to sustain this growth without devastating environmental consequences? And how do we save the polar bears?
What can you do to help reduce AI’s energy burden?