When Prompts Fail: Debugging Checklist
Bad output is almost never random. It's a clue. Here's how to diagnose and fix a broken prompt instead of just mashing the regenerate button.
Lesson map
What this lesson covers, in order:
1. Every bad output is a clue
2. Prompt debugging
3. Failure modes
4. Diagnostic questions
Section 1
Every bad output is a clue
When a prompt produces something wrong, bland, or off-topic, the output is telling you exactly where the prompt is weak. Read the failure carefully before you touch the prompt.
Common failure modes and fixes
| Symptom | Likely cause | Fix |
|---|---|---|
| Output too generic. | No role, no audience. | Add a specific role and target audience. |
| Output too long. | No constraint. | Add a word/bullet/sentence limit. |
| Wrong format. | Format not specified or unclear. | Show an exact example of the format. |
| Missed instruction. | Instruction buried in the middle. | Move key rules to the top or bottom; use bullets. |
| Contradictory output. | Your prompt has contradictions. | Reread — are you asking for 'short and detailed' or 'formal and fun'? |
| Hallucinated facts. | Asked for info it can't know. | Add sources, use retrieval, or say 'if unsure, say so.' |
| Refuses to answer. | Triggered a safety filter. | Rephrase without adversarial framing; explain your legitimate purpose. |
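The table above is just a lookup from symptom to cause and fix, so it can be kept at hand programmatically. A minimal sketch (the `FAILURE_MODES` table and `diagnose` helper are my own names, not from any library):

```python
# Illustrative lookup table mirroring the failure-mode table above.
FAILURE_MODES = {
    "too generic": ("no role, no audience", "add a specific role and target audience"),
    "too long": ("no constraint", "add a word/bullet/sentence limit"),
    "wrong format": ("format not specified", "show an exact example of the format"),
    "missed instruction": ("instruction buried in the middle", "move key rules to the top or bottom"),
    "contradictory": ("prompt contains contradictions", "reread for conflicting asks"),
    "hallucinated facts": ("asked for info it can't know", "add sources or allow 'if unsure, say so'"),
    "refusal": ("triggered a safety filter", "rephrase and explain your legitimate purpose"),
}

def diagnose(symptom: str) -> str:
    """Return a 'cause -> fix' hint for a known symptom."""
    cause, fix = FAILURE_MODES[symptom]
    return f"Likely cause: {cause}. Fix: {fix}."

print(diagnose("too long"))
```

The point is not the code itself but the habit it encodes: name the symptom first, then apply the matching fix, instead of rewriting the whole prompt.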
The debug prompt trick
Ask the AI to critique its own failure.
```
I gave you this prompt:
"""
<YOUR PROMPT HERE>
"""
And you gave me this output:
"""
<BAD OUTPUT>
"""
Help me debug. Specifically:
1. Which parts of my prompt are ambiguous?
2. Which instructions did you miss, and why?
3. What would you change in the prompt to get a better output?
Be direct, not flattering.
```

This meta-prompt often exposes ambiguities you couldn't see yourself. The AI knows what confused it; ask it.
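Because the meta-prompt above is the same every time, it is easy to templatize. A minimal sketch (the `build_debug_prompt` function is hypothetical, not part of any SDK; you would send its result to whatever model you use):

```python
# Template for the debug meta-prompt described above.
DEBUG_TEMPLATE = '''I gave you this prompt:
"""
{prompt}
"""
And you gave me this output:
"""
{output}
"""
Help me debug. Specifically:
1. Which parts of my prompt are ambiguous?
2. Which instructions did you miss, and why?
3. What would you change in the prompt to get a better output?
Be direct, not flattering.'''

def build_debug_prompt(prompt: str, bad_output: str) -> str:
    """Fill the debug meta-prompt with the failing prompt and its output."""
    return DEBUG_TEMPLATE.format(prompt=prompt.strip(), output=bad_output.strip())

print(build_debug_prompt("Summarize this article.", "A rambling 2,000-word essay."))
```

Keeping the template in one place means every failed prompt gets the same structured post-mortem instead of an ad-hoc complaint.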
Diagnostic questions to ask yourself
1. Is there a clear role? ('You are a ...')
2. Is the audience specified?
3. Are the constraints numbered and prominent?
4. Is there at least one example of the output format?
5. Did I put the most important instruction at the end? (Recency bias helps.)
6. Could a tired person read my prompt and do the task?
7. Did I ask for the output in a specific format?
