Lesson 507 of 1570
Iterate, Don't Restart: Debugging and Improving Prompts, Part 1
Most teens scrap a bad AI answer and start over. Better: refine the answer with feedback. Way more efficient.
Lesson map
What this lesson covers
Learning path
The main moves in order
- 1. The big idea
- 2. Don't Restart When AI Is Off — Iterate
- 3. Knowing when to stop iterating with AI
- 4. Prompting AI as a rubber duck
- 5. AI and Success Criteria: Tell AI What 'Done' Looks Like
- 6. AI and Iteration: The Magic of Saying 'Make It Better'
- 7. AI and Prompt Debugging: When the Answer's Wrong, Fix the Prompt
- 8. AI and Evaluator Prompts: Make AI Grade Itself
Section 1
The big idea
When AI gives a so-so answer, most teens delete and start fresh. Better approach: tell AI what is wrong, what to fix, and what to keep. AI is better at refining a draft than at starting over.
Some examples
- 'Make it shorter — half the length.'
- 'You missed the point about [X]. Add that.'
- 'Rewrite this more like a casual text, less formal.'
- 'I like the first paragraph. Change just the second one to be more emotional.'
Try it!
Next time AI gives a bad answer, instead of starting over, give specific feedback. Try 3 rounds of refinement. Compare your final answer to your starting point. Way better, right?
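The feedback loop above can be sketched in code. This is a minimal illustration, not a real API: `ask_ai` is a hypothetical stand-in for whatever AI chat tool you use, and here it just echoes so the example runs anywhere. The point is that feedback goes into the *same* conversation instead of a fresh chat.

```python
# A minimal sketch of "iterate, don't restart."
# ask_ai is a HYPOTHETICAL model call; a real one would return the AI's reply.

def ask_ai(messages):
    return f"[draft based on {len(messages)} message(s)]"

# Start one conversation and keep adding feedback to the SAME thread,
# instead of opening a fresh chat for every attempt.
messages = [{"role": "user", "content": "Write a 3-sentence bio of me."}]
draft = ask_ai(messages)

for feedback in [
    "Make it shorter. Half the length.",
    "Keep sentence 1, make sentence 2 more casual.",
]:
    messages.append({"role": "assistant", "content": draft})
    messages.append({"role": "user", "content": feedback})
    draft = ask_ai(messages)

# After two rounds the thread holds the original request, both drafts,
# and both pieces of feedback: context a restarted chat would lose.
print(len(messages))  # 5
```

Restarting throws that accumulated context away every time, which is why iteration usually wins.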
Section 2
Don't Restart When AI Is Off — Iterate
Section 3
The big idea
When AI answers are 70% there, do not start over. Iterate — tell it what to change. Way faster than starting fresh and getting different mediocre output.
Some examples
- 'I like this but make it shorter.'
- 'Keep paragraph 1 but rewrite paragraph 2 to be more dramatic.'
- 'Good but too formal — make it sound more like a teen wrote it.'
- 'Almost there. Now add a counter-argument in the third paragraph.'
Try it!
Prompting is a skill: the more specific and structured your input, the more useful the output. When AI's answer is mostly right but a little off, iterate rather than starting over. It's faster, and knowing how to apply this gives you a concrete advantage.
- Apply iteration and refinement in your prompting workflow to get better results faster
- 1. Rewrite one of your best prompts using role + context + task + format
- 2. Ask an AI to critique your prompt and suggest improvements
- 3. Compare outputs from two models using the same prompt
Section 4
Knowing when to stop iterating with AI
Section 5
The big idea
AI lets you regenerate forever, which is dangerous. You can spend an hour 'improving' a paragraph that was already fine after attempt two. Builders learn to stop, not to chase perfect.
Some examples
- Set a timer: max 3 prompt iterations on the intro of an essay.
- If the 4th draft isn't clearly better than the 2nd, the 2nd was the answer.
- Lock in a version, then move to the next section.
- Ask: 'Would my teacher notice the difference?' If no, ship it.
Try it!
Next time you use AI for homework, count your iterations out loud. Stop yourself at three and look at what you have. You'll usually be done.
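The stopping rule above ("if the 4th draft isn't clearly better than the 2nd, the 2nd was the answer") can be sketched as code. Everything here is illustrative: `refine` and `score` are hypothetical placeholders with hard-coded values standing in for "re-prompt the AI" and "would my teacher notice the difference?"

```python
# A sketch of a simple stopping rule for AI iteration.
# refine() and score() are HYPOTHETICAL stand-ins with canned values.

MAX_ROUNDS = 3

def refine(draft):
    # Pretend this re-prompts the AI for an improved version.
    return draft + "+"

def score(draft):
    # Pretend quality plateaus after the second version.
    return {"v1": 60, "v1+": 80, "v1++": 81, "v1+++": 81}[draft]

draft, best = "v1", "v1"
for _ in range(MAX_ROUNDS):
    draft = refine(draft)
    if score(draft) > score(best) + 5:  # require a CLEAR improvement
        best = draft
    else:
        break  # not clearly better: the previous version was the answer

print(best)  # "v1+"
```

The loop locks in the second draft and moves on, exactly the habit the lesson describes.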
Section 6
Prompting AI as a rubber duck
Section 7
The big idea
Coders have a trick called 'rubber duck debugging': explain your problem out loud to a rubber duck and you'll often spot the bug yourself. AI is a duck that talks back.
Some examples
- Explain your code line by line to AI before asking for fixes
- Explain a math problem in your own words to AI
- Explain why you're stressed before asking for advice
- Explain a paper outline before asking for feedback
Try it!
Take a problem you're stuck on. Explain it to AI in 3-4 paragraphs as if it knew nothing. Notice if you figure it out before AI even responds.
Section 8
AI and Success Criteria: Tell AI What 'Done' Looks Like
Section 9
The big idea
Success criteria are the exact things an answer must hit. Without them, AI guesses what 'done' means. With them, AI self-checks and gives you a tighter result.
Some examples
- 'A great answer will: (1) be under 150 words, (2) include one stat, (3) end with a question.'
- 'Don't stop until you have three options of different price ranges.'
- 'Check yourself: did you actually answer the original question?'
- Give AI the rubric your teacher will grade you on.
Try it!
Take a recent prompt and add three success criteria as a checklist. Ask AI to verify each one in its answer.
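Turning criteria into a checklist inside the prompt can be sketched like this. `build_prompt` is a hypothetical helper for illustration, not a real library function; the wording is one way to phrase a self-check, not the only way.

```python
# A sketch of packing success criteria into a prompt as a checklist.
# build_prompt is a HYPOTHETICAL helper, shown for illustration only.

def build_prompt(task, criteria):
    checklist = "\n".join(f"{i}. {c}" for i, c in enumerate(criteria, 1))
    return (
        f"{task}\n\n"
        "A great answer will:\n"
        f"{checklist}\n"
        "Before you finish, check each item and fix anything that fails."
    )

prompt = build_prompt(
    "Summarize this article.",
    ["be under 150 words", "include one stat", "end with a question"],
)
print(prompt)
```

Pasting a teacher's actual rubric in as the `criteria` list works the same way.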
Section 10
AI and Iteration: The Magic of Saying 'Make It Better'
Section 11
The big idea
Iteration prompting means treating AI's first answer as a draft, not the final. The second and third versions are usually far better. People who don't iterate get bland output and blame AI.
Some examples
- 'Make it punchier. Cut 30%.'
- 'That's too generic. Add specifics from my context.'
- 'Now do three more variations: serious, funny, and weird.'
- 'Combine the best of v2 and v4.'
Try it!
Ask AI for something. Iterate at least three times — make it shorter, weirder, sharper. Compare v1 and v4.
Section 12
AI and Prompt Debugging: When the Answer's Wrong, Fix the Prompt
Section 13
The big idea
Prompt debugging is the skill of asking why a bad answer is bad and fixing your prompt — not just rewording it. The bug is usually missing context, an ambiguous word, or no example.
Some examples
- Bad answer? Ask AI: 'What did you assume that I didn't tell you?'
- Spot ambiguous words: 'better' for who? 'short' how short?
- Add the missing constraint and try again.
- Test the same prompt in a fresh chat to rule out chat history confusion.
Try it!
Find a recent AI answer you didn't like. Ask AI 'what would make my prompt clearer?' Apply its suggestions and rerun.
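One debugging pass, spotting ambiguous words before rerunning a prompt, can be sketched as a tiny scan. The vague-word list here is illustrative, not exhaustive; the real skill is asking "better for who? short how short?" yourself.

```python
# A sketch of one prompt-debugging pass: flag ambiguous words to replace
# with concrete constraints. The word list is illustrative only.

VAGUE = {"better", "short", "good", "nice", "soon", "some"}

def find_vague_words(prompt):
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    return sorted(words & VAGUE)

print(find_vague_words("Make it better and short."))  # ['better', 'short']
```

Each flagged word becomes a question: "short" might become "under 100 words," and the missing constraint goes into the next version of the prompt.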
Section 14
AI and Evaluator Prompts: Make AI Grade Itself
Section 15
The big idea
After AI answers, in a new turn say 'grade your last response on this rubric: clarity (1-5), accuracy (1-5), tone (1-5).' AI usually finds its own weaknesses, then you can ask it to fix them.
Some examples
- Prompt: 'Score that essay 1-5 on thesis, evidence, voice. Then rewrite to fix the lowest score.'
- Two-pass prompting (write then critique) usually outperforms one-pass.
- AI is harsher on itself than you'd expect — that's useful.
- You can have AI grade against the actual class rubric you paste in.
Try it!
Have AI write a paragraph, then grade and rewrite it twice. Compare draft 1 to draft 3.
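The write-grade-rewrite flow can be sketched as three calls in sequence. `ask_ai` is again a hypothetical stand-in that returns canned text so the flow is visible and runnable; a real model would produce the draft, the grades, and the rewrite.

```python
# A sketch of two-pass prompting: write, grade against a rubric, rewrite.
# ask_ai is a HYPOTHETICAL model call returning canned text.

RUBRIC = "clarity (1-5), accuracy (1-5), tone (1-5)"

def ask_ai(prompt):
    if "grade your last response" in prompt.lower():
        return "clarity 4, accuracy 3, tone 5. Weakest: accuracy"
    return "Draft paragraph about photosynthesis."

# Pass 1: write.
draft = ask_ai("Write one paragraph on photosynthesis.")

# Pass 2: grade. The model critiques its own last answer.
grades = ask_ai(f"Grade your last response on this rubric: {RUBRIC}.")

# Pass 3: rewrite, targeting the lowest score from the critique.
final = ask_ai(f"Rewrite the draft to fix its weakest area: {grades}")

print(grades)
```

The key move is that pass 2 is a separate turn in the same chat, so the critique applies to the draft the model just produced.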
End-of-lesson quiz
Check what stuck
15 questions · Score saves to your progress.
Related lessons
Keep going
Builders · 40 min
Iterate, Don't Restart: Debugging and Improving Prompts, Part 2
It's faster to send three OK prompts than to craft one perfect one — iteration beats premeditation.
Explorers · 40 min
Advanced Moves: Get AI to Explain, Check, Quiz, and Improve, Part 2
You can give AI rules to follow — no big words, no scary stuff, etc.
Explorers · 40 min
When the Answer Isn't Right: Feedback, Iteration, and Trying Again, Part 2
You don't have to start over each time. Keep building like LEGO.
