AI and Energy Cost of Prompts: What Each Query Actually Burns
Each ChatGPT query uses real water and electricity. Learn what the numbers are and how to be smarter.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Energy
3. Sustainability
4. Data centers
Concept cluster
Terms to connect while reading
Section 1
The big idea
By common estimates, a ChatGPT query uses roughly ten times the energy of a Google search (a few watt-hours) plus a sip of water for data-center cooling. Multiply that by 200 daily users in your school and it adds up fast. Knowing the cost makes you a more thoughtful user, not a guilty one.
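The multiplication above is easy to run yourself. Here's a minimal back-of-envelope sketch; the per-query energy and water figures are rough assumptions for illustration, not measured values:

```python
# Back-of-envelope estimate of a school's daily AI query footprint.
# All per-query figures are illustrative assumptions, not measurements.

WH_PER_QUERY = 3.0         # assumption: ~10x a web search (~0.3 Wh)
ML_WATER_PER_QUERY = 30.0  # assumption: a "sip" of cooling water
USERS = 200                # daily users in the school
QUERIES_PER_USER = 10      # assumed prompts per user per day

queries = USERS * QUERIES_PER_USER
energy_kwh = queries * WH_PER_QUERY / 1000
water_liters = queries * ML_WATER_PER_QUERY / 1000

print(f"{queries} queries/day = {energy_kwh:.1f} kWh and {water_liters:.1f} L of water")
```

Swap in your own numbers: halving QUERIES_PER_USER (say, by combining three prompts into one) halves both totals.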
Some examples
- Ask Claude for the latest 2026 numbers on energy per LLM query.
- Ask ChatGPT how Anthropic and OpenAI offset their data-center water use.
- Ask Gemini what regions have the cleanest grids for AI today.
- Ask Perplexity for the studies behind the '500ml of water per email' claim.
Try it!
Look at your last 10 prompts. Could three of them have been one? Try it next session.
