Lesson 104 of 1550
HIPAA Considerations for AI Tools: Protecting Patient Privacy in the Prompt
Every healthcare worker using AI tools must understand when patient data becomes PHI, what constitutes a HIPAA violation, and how to use AI productively while maintaining patient privacy and regulatory compliance.
Lesson map
What this lesson covers
Learning path
The main moves in order
1. The HIPAA and AI collision
2. HIPAA
3. PHI
4. BAA
Concept cluster
Terms to connect while reading
Section 1
The HIPAA and AI collision
HIPAA was enacted in 1996, long before consumer AI tools existed. Its Privacy Rule and Security Rule still govern any PHI processed by a covered entity or business associate, and that now includes AI tools used by healthcare workers. A nurse who pastes a patient note into ChatGPT without a BAA in place has committed a HIPAA violation at the moment of disclosure, regardless of whether the vendor stores or shares the data further.
The 18 PHI identifiers
HIPAA's Safe Harbor method for de-identification requires removing 18 specific identifier categories. AI tools are only safe to use with patient data when all 18 are removed, or when the data has been certified as de-identified by a qualified statistical expert under the Expert Determination method. The 18 categories are: names; geographic data below state level; all dates (except year) directly related to an individual, plus any age over 89; phone numbers; fax numbers; email addresses; Social Security numbers; medical record numbers; health plan beneficiary numbers; account numbers; certificate/license numbers; vehicle identifiers; device identifiers; web URLs; IP addresses; biometric identifiers; full-face photos; and any other unique identifying number, characteristic, or code.
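A few of these identifier categories are regular enough in shape that a script can spot-check text before it goes anywhere near a prompt window. The sketch below is illustrative only: the patterns, the `spot_check` helper, and the sample note are assumptions for this lesson, and pattern matching cannot catch names, street addresses, or free-text identifiers, so it is never a substitute for full Safe Harbor review.

```python
import re

# Illustrative patterns for a few of the 18 Safe Harbor categories.
# Real de-identification must cover all 18, including names and
# addresses, which regex alone cannot reliably detect.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def spot_check(text: str) -> dict:
    """Return identifier categories matched in text.

    An empty result does NOT mean the text is de-identified:
    names, addresses, and free-text clues need human review.
    """
    hits = {}
    for category, pattern in PHI_PATTERNS.items():
        matches = pattern.findall(text)
        if matches:
            hits[category] = matches
    return hits

note = "Pt seen 03/14/2024, MRN: 8675309, call 555-123-4567."
print(spot_check(note))  # flags the date, MRN, and phone number
```

Treat a script like this as a last-line tripwire before a prompt is sent, not as the de-identification step itself.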
1. Consumer AI tools (ChatGPT free tier, Claude.ai standard) are not BAA-covered by default.
2. Enterprise tiers with BAAs exist; confirm with your vendor before any clinical use.
3. De-identification does not mean 'remove the name': all 18 identifiers must go.
4. Geographic data below state level includes city, county, ZIP code, and street address.
5. Small cell sizes (fewer than 3 patients per demographic group) can allow re-identification even when no explicit identifier remains.
Key terms in this lesson
The big idea: the prompt window is part of your documentation environment. HIPAA applies.
Related lessons
Keep going
Adults & Professionals · 10 min
AI and PHI Redaction Spot-Check: Catching Missed Identifiers
AI can spot-check a redacted document for missed PHI, but the privacy officer signs off on what actually leaves the building.
Creators · 35 min
Career+: Boundaries for AI-Assisted Clinical Notes
Clinical note tools can reduce documentation burden, but they need privacy, accuracy, review, and accountability boundaries.
Builders · 7 min
AI and Prepping for Your First Hospital Volunteer Shift
Volunteering at a hospital? AI can help you understand HIPAA and what you can (and can't) say at home.
