Lesson 1816 of 2116
Mixture of Depths: How AI Models Spend Compute Per Token
Mixture-of-depths lets models skip layers per token to spend compute where it matters; understand it to evaluate efficiency claims honestly.
Lesson map
What this lesson covers
Learning path
The main moves in order
- 1. The premise
- 2. Mixture of depths
- 3. Conditional compute
- 4. Efficiency
Concept cluster
Terms to connect while reading
Section 1
The premise
Mixture-of-depths trains a router to skip transformer layers on easy tokens, spending compute where input difficulty actually demands it.
What AI does well here
- Reduce average compute per token while preserving downstream quality
- Concentrate compute on tokens with high routing-uncertainty
- Compose with mixture-of-experts for additional efficiency gains
What AI cannot do
- Always match dense-model quality on adversarial, tail-of-distribution tasks
- Avoid additional engineering complexity in serving systems
- Replace the need for careful router-training data
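The routing idea above can be made concrete with a small sketch: a learned router scores each token, only the top fraction of tokens (the per-block capacity) pass through the transformer block, and the rest take the residual skip path. This is a simplified illustration, not the paper's implementation; all names here (`mixture_of_depths_block`, `w_router`, `capacity`) are illustrative.

```python
import numpy as np

def mixture_of_depths_block(x, w_router, block_fn, capacity=0.5):
    """Route only the top-`capacity` fraction of tokens through `block_fn`;
    remaining tokens skip the block via the residual path.
    Simplified mixture-of-depths sketch (names are illustrative)."""
    n_tokens, _ = x.shape
    k = max(1, int(n_tokens * capacity))      # per-block token budget
    scores = x @ w_router                     # (n_tokens,) router logits
    chosen = np.argsort(scores)[-k:]          # top-k tokens get compute
    out = x.copy()                            # skipped tokens pass through unchanged
    # Gate the block output by the router score (sigmoid) so the routing
    # decision stays on the gradient path during training.
    gate = 1.0 / (1.0 + np.exp(-scores[chosen]))
    out[chosen] = x[chosen] + gate[:, None] * block_fn(x[chosen])
    return out, chosen

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))                   # 8 tokens, hidden width 4
w_r = rng.normal(size=4)
toy_block = lambda h: 0.1 * h                 # stand-in for attention + MLP
y, routed = mixture_of_depths_block(x, w_r, toy_block, capacity=0.5)
print(len(routed))                            # prints 4: half the tokens were routed
```

With `capacity=0.5`, only 4 of the 8 tokens pay for the block's compute; the other 4 are copied through untouched, which is exactly the average-compute saving the lesson describes.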
Key terms in this lesson
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Creators · 40 min
Mixture-of-Experts: Why MoE Models Behave Differently
Mixture-of-experts architectures route tokens through specialized sub-networks — and the routing creates eval and serving behaviors single-dense models do not have.
Creators · 40 min
Tool-Use Evaluation: Building Reliable Agent Benchmarks
Tool-use evals must capture argument correctness, sequencing, and recovery from tool errors — not just whether the model called the tool at all.
Creators · 9 min
AI for Resume English (Immigrant Career Edition)
American resumes look different from those in many other countries. AI can format your work history in the U.S. style and translate foreign job titles.
