Hero Image

How Much Does A Question Cost?

The Energy Consumption of AI

This project explores the energy cost of AI models from queries to training

See the Data

Introduction

AI has become an everyday assistant, helping with everything from planning travel and generating images to debugging code, writing emails, and even offering therapeutic support. While these large language models (LLMs) are helpful, their utility comes at a surprising environmental cost.

A single query to a popular AI chatbot consumes as much electricity as leaving a lightbulb on for 20 minutes, more than ten times the energy required for a typical Google search. Remember the old warning about leaving the fridge door open and “killing the polar bears”? That may have been an exaggeration, but in today’s AI-driven world, the environmental impact feels far more tangible.

This project explores the hidden cost of AI: its energy consumption. It investigates the electricity demands of training AI models, running everyday queries, and maintaining the massive data centers that keep these tools running.

Project Overview Image

What is AI and How Does it Use Energy?

Artificial Intelligence (AI) refers to computer systems that perform tasks normally requiring human intelligence, such as understanding language, creating or processing images, and making decisions. Large AI models like ChatGPT and Gemini rely on energy-intensive processes during both their initial training and their continued daily use, which involve running computations on GPU or TPU hardware. This consumes large amounts of electricity, often sourced from carbon-emitting power grids. AI’s environmental impact is growing and remains an important point of concern.

ChatGPT uses about 3 Wh of energy per query, which is equivalent to:

Lightbulb

Running a lightbulb for 20 minutes

Charge

Charging your phone to 20 - 30%

Google Search

10x the energy of a single Google search

microwave

Microwaving something for 2 minutes
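The equivalences above can be sketched as a quick back-of-envelope calculation. The per-item reference values below (a 9 W LED bulb, a ~15 Wh phone battery, ~0.3 Wh per Google search) are assumptions chosen to be consistent with the figures on this page, not measurements:

```python
# Converting ~3 Wh per ChatGPT query into everyday equivalents.
# Reference values are illustrative assumptions, not measurements.
QUERY_WH = 3.0

LED_BULB_W = 9.0         # assumed LED bulb power draw
PHONE_BATTERY_WH = 15.0  # assumed smartphone battery capacity
GOOGLE_SEARCH_WH = 0.3   # assumed energy per Google search

bulb_minutes = QUERY_WH / LED_BULB_W * 60             # minutes of bulb use
phone_charge_pct = QUERY_WH / PHONE_BATTERY_WH * 100  # % of a phone charge
search_ratio = QUERY_WH / GOOGLE_SEARCH_WH            # vs. one search

print(f"Bulb: {bulb_minutes:.0f} min, phone charge: {phone_charge_pct:.0f}%, "
      f"Google searches: {search_ratio:.0f}x")
```

With these assumed inputs, the arithmetic reproduces the 20-minute, 20–30%, and 10x figures shown above.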

How Much Energy Do Different Queries Use?

Different query types vary significantly in energy use depending on their complexity, token length, and compute requirements (e.g., context window size, model architecture, inference depth).

Simple Definition

Energy: 0.01–0.03 Wh/query

~same as 1–2 seconds of LED bulb use

Reading + Answering PDF

Energy: 0.2–1.0 Wh/query

~same as charging a smartphone for 5 minutes

Code Generation or Debugging

Energy: 0.3–1.5 Wh/query

~same as boiling water for a cup of tea

Complex Multi-Part Reasoning

Energy: 0.5–2.0 Wh/query

~same as using a laptop for 10–20 minutes
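Ranges like these can be combined into a rough fleet-level estimate. The sketch below uses the midpoint of each range above; the traffic split and daily query volume are invented purely for illustration:

```python
# Rough daily-energy estimate for a hypothetical mix of query types,
# using the midpoint of each Wh/query range. Shares are illustrative.
query_profile = {
    # type: (midpoint Wh/query, assumed share of traffic)
    "simple definition":       (0.02, 0.40),
    "pdf reading + answering": (0.60, 0.20),
    "code generation":         (0.90, 0.25),
    "multi-part reasoning":    (1.25, 0.15),
}

QUERIES_PER_DAY = 100_000_000  # illustrative volume

avg_wh_per_query = sum(wh * share for wh, share in query_profile.values())
daily_mwh = avg_wh_per_query * QUERIES_PER_DAY / 1e6  # Wh -> MWh

print(f"Average: {avg_wh_per_query:.3f} Wh/query -> {daily_mwh:.1f} MWh/day")
```

Even with mostly simple queries, the heavier query types dominate the average, which is why complexity matters as much as raw query counts.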

What LLMs Are Out There?

Explore top models from OpenAI, Google, Anthropic, Meta, and Mistral: usage stats, queries per day, and total energy use.

GPT-4o
OpenAI

~100M Queries/day

~400 MWh training energy

Gemini
Google

~75M Queries/day

~500 MWh training energy

Llama
Meta

~7.5M Queries/day

~375 MWh training energy

Claude
Anthropic

~17.5M Queries/day

~350 MWh training energy

R1
DeepSeek

~500M Queries/day

Training energy unknown

Nova
Amazon

~7.5M Queries/day

Training energy unknown

Large 2
Mistral AI

Unknown Queries/day

Training energy unknown

Bloom
Big Science

~1M Queries/day

~433 MWh training energy

Companies and Their Models

Explore the companies that have released multiple LLMs. Each color represents a company, and lines connect each parent company to its child models.

How Much Energy Do These Models Consume?

Comparing the energy usage of AI models and digital platforms, Google Search remains the highest energy-consuming platform, with an estimated 24 million MWh consumed annually. By contrast, training GPT-3 consumed an estimated 1,287 MWh, meaning that in a single year Google Search uses roughly 18,650 times the energy of GPT-3’s entire training run.
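The ratio quoted here is straightforward to verify from the two figures on this page:

```python
# Sanity check: annual Google Search energy (estimated 24 million MWh)
# divided by GPT-3's reported ~1,287 MWh training cost.
google_search_mwh_per_year = 24_000_000
gpt3_training_mwh = 1_287

ratio = google_search_mwh_per_year / gpt3_training_mwh
print(f"Google Search / GPT-3 training ~ {ratio:,.0f}x")
```

The division gives about 18,648, which rounds to the ~18,650x figure used in the text.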

Company Contributions to CO2 Emissions

This visualization looks at estimated CO2 emissions associated with different AI companies, based on the energy usage of their deployed LLMs. Google leads by a significant margin, due to the scale of Google Search and projections of AI Overview usage. Other major contributors include Meta, OpenAI, and DeepMind, while open-source, research-focused organizations like BigScience, Hugging Face, and Aleph Alpha show significantly smaller emission footprints.

When did these AI models come to be?

The timeline of AI deployments shows a steady increase in LLMs from 2020 to 2026. Large tech companies such as Google and Meta continue to release new iterations, such as Meta’s Llama 2 variants. This timeline highlights both open-source initiatives and proprietary product launches, illustrating the diverse and growing ecosystem of LLM development.

ChatGPT uses about 1.095 billion kWh (roughly 1.1 terawatt-hours, TWh) of energy per year, which is equivalent to:

House

Powering 104,000 homes for a year

Finland

Powering the entire country of Finland or Belgium for a day

Rocket

The amount of energy needed for 2.5 moon missions

netflix

1.6 million years of constant Netflix streaming
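The annual equivalences above can also be reproduced from the 1.095 billion kWh figure. The per-unit values here (~10,500 kWh per average home per year, ~0.077 kWh per hour of streaming) are assumptions chosen to be consistent with the comparisons on this page:

```python
# Converting ChatGPT's estimated ~1.095 billion kWh/year into the
# household and streaming equivalents. Per-unit figures are assumptions.
ANNUAL_KWH = 1.095e9

KWH_PER_HOME_YEAR = 10_500    # assumed average household usage per year
KWH_PER_STREAM_HOUR = 0.077   # assumed energy per hour of streaming
HOURS_PER_YEAR = 365 * 24

homes = ANNUAL_KWH / KWH_PER_HOME_YEAR
netflix_years = ANNUAL_KWH / KWH_PER_STREAM_HOUR / HOURS_PER_YEAR

print(f"~{homes:,.0f} homes for a year, "
      f"~{netflix_years / 1e6:.1f}M years of constant streaming")
```

Under these assumptions the arithmetic lands near 104,000 homes and 1.6 million streaming-years, matching the cards above.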

How Can You Help?

Hero Image

Conclusion

AI is here to stay. But how do we ensure we have the resources to sustain this growth without devastating environmental consequences? And how do we save the polar bears?

Take Action

What can you do to help reduce AI’s energy burden?

Use smaller models when possible
Query thoughtfully, cache where you can
Advocate for green AI policy