Lesson 1053 of 1234
AI Reads Tiny Word Chunks Called Tokens
AI does not read full words. It reads little chunks called tokens.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Tokens
3. AI input
4. Chunks
Section 1
The big idea
AI breaks every message into small chunks called tokens before reading. A token is often a piece of a word.
Some examples
- The word 'unhappy' might be 2 tokens: 'un' + 'happy'.
- A short word like 'cat' is usually 1 token.
- Spaces and punctuation can be tokens too.
- 100 words is roughly 130 tokens in English.
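The splitting idea above can be sketched in a few lines of code. This is a toy illustration only: real tokenizers (such as BPE tokenizers) learn tens of thousands of chunks from data, while this one uses a tiny hand-picked table just to show the "longest known chunk wins" idea.

```python
# Toy tokenizer sketch. The chunk table below is made up for this
# example; real AI tokenizers learn their chunks from huge text datasets.
SUBWORDS = ["un", "happy", "cat", "pine", "apple", " "]

def toy_tokenize(text):
    """Greedily match the longest known chunk at each position."""
    tokens = []
    i = 0
    while i < len(text):
        match = None
        # Try longer chunks first, so "happy" beats "h" + "a" + ...
        for piece in sorted(SUBWORDS, key=len, reverse=True):
            if text.startswith(piece, i):
                match = piece
                break
        if match is None:
            match = text[i]  # unknown character becomes its own token
        tokens.append(match)
        i += len(match)
    return tokens

print(toy_tokenize("unhappy"))    # ['un', 'happy'] -> 2 tokens
print(toy_tokenize("pineapple"))  # ['pine', 'apple'] -> 2 tokens
print(toy_tokenize("cat"))        # ['cat'] -> 1 token
```

Notice how 'unhappy' comes out as 2 tokens and 'cat' as 1, matching the examples above: common pieces get their own chunk, and rarer words get built from smaller pieces.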
Try it!
Ask an AI how many tokens are in the word 'pineapple'. Most models split it into 2 or 3 chunks.
