Fine-Tuning vs Prompting: When You Actually Need to Train
Most people who think they need fine-tuning just need better prompts and a few examples. Real fine-tuning is rare.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The big idea
2. Fine-tuning
3. Few-shot
4. Prompting
Section 1
The big idea
Fine-tuning means actually retraining a model on your data so it learns your style or domain. It's expensive and slow. 90% of the time, few-shot prompting (giving examples in the prompt) gets you the same result without training. Reach for fine-tuning only when prompts can't reach the quality you need at scale.
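Few-shot prompting can be sketched in a few lines of plain Python: instead of retraining anything, you pack worked examples directly into the prompt. The helper, the ticket-classification task, and the examples below are all hypothetical; the resulting string is what you would hand to any chat-completion API.

```python
def build_few_shot_prompt(instruction, examples, query):
    """Assemble an instruction, worked examples, and a new input
    into a single prompt string -- no training involved."""
    parts = [instruction, ""]
    for inp, out in examples:
        parts.append(f"Input: {inp}")
        parts.append(f"Output: {out}")
        parts.append("")
    # The model sees the pattern above and completes the final Output.
    parts.append(f"Input: {query}")
    parts.append("Output:")
    return "\n".join(parts)

examples = [
    ("The invoice was never received.", "Billing"),
    ("My password reset link expired.", "Account access"),
]
prompt = build_few_shot_prompt(
    "Classify each support ticket into a category.",
    examples,
    "I was charged twice this month.",
)
print(prompt)
```

If five examples like these get you acceptable answers, you have just saved yourself a training run.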
Some examples
- A startup using GPT-5 with 5 example outputs in every prompt — prompting wins.
- A medical company training Llama on their own clinical notes — real fine-tuning.
- A side project that wants the AI to talk like a pirate — system prompt, not training.
- A legal-tech tool needing exact citation format on millions of queries — fine-tune for that one task.
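The pirate case above needs no training at all: one system message sets the persona. A minimal sketch, using the common role/content message convention most chat APIs share (the wording is illustrative):

```python
# A persona via system prompt -- zero fine-tuning required.
messages = [
    {"role": "system",
     "content": "You are a helpful assistant who talks like a pirate."},
    {"role": "user",
     "content": "Explain what a context window is."},
]
# This list is sent as-is to a chat-completion endpoint.
```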
Try it!
Take a task you wish an AI did better. Put 5 strong examples in your prompt, then see whether you still think you need to fine-tune.