Tendril · Adults & Professionals · AI in Healthcare
AI for Goals-of-Care Conversation Prep: Assembling Context, Not Scripting Empathy
Use AI to surface what the chart says about prior conversations, prognosis, and family — then have the conversation yourself.
11 min · Reviewed 2026
The premise
AI can pull together what's already in the record so the clinician walks in prepared — it cannot script the conversation or substitute for presence.
What AI does well here
Summarize prior advance directives and code status changes
Pull prognostic data into a brief
List family contacts and prior involvement
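The three preparation tasks above amount to assembling a structured brief from what is already documented in the chart. A minimal sketch of that idea in Python (the class, field names, and sample chart excerpts are all hypothetical illustrations, not a real EHR interface):

```python
from dataclasses import dataclass, field

@dataclass
class GoalsOfCareBrief:
    """Pre-conversation context pulled from the chart (hypothetical structure)."""
    directives: list[str] = field(default_factory=list)  # prior advance directives, code status changes
    prognosis: list[str] = field(default_factory=list)   # documented prognostic data points
    family: list[str] = field(default_factory=list)      # contacts and documented prior involvement

    def summary(self) -> str:
        sections = [
            ("Advance directives & code status", self.directives),
            ("Prognostic data", self.prognosis),
            ("Family contacts & involvement", self.family),
        ]
        lines = []
        for title, items in sections:
            lines.append(f"{title}:")
            lines.extend(f"  - {item}" for item in (items or ["none documented"]))
        return "\n".join(lines)

# Made-up chart excerpts for illustration only:
brief = GoalsOfCareBrief(
    directives=["DNR order entered 2024-03-12",
                "Advance directive last updated three years ago; healthcare proxy: daughter"],
    family=["Daughter (healthcare proxy), present at last two family meetings"],
)
print(brief.summary())
```

Note that every field stays descriptive: the brief records what the chart documents, and flags gaps ("none documented"), but contains nothing about what to say in the room.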
What AI cannot do
Decide what to say in the room
Predict prognosis with certainty
Replace the relational work
End-of-lesson check
15 questions · take it digitally for instant feedback at tendril.neural-forge.io/learn/quiz/end-healthcare-ai-palliative-goals-of-care-prep-adults
A clinician wants to use AI before a goals-of-care conversation with a patient's family. Which task is BEST suited for AI assistance?
Formulating the exact emotional phrases to express empathy
Deciding whether to recommend hospice versus continued treatment
Reviewing the chart to identify what prior conversations documented about code status
Generating a script of exactly what to say to the family about prognosis
Why are AI-generated scripts for goals-of-care conversations particularly risky?
They cannot be customized for different religious backgrounds
They are usually too short to cover necessary information
They read as canned and can erode trust at emotionally critical moments
They may violate HIPAA regulations about patient data
Which of the following is within AI's demonstrated capability for goals-of-care preparation?
Predicting exactly how long a patient will live with certainty
Summarizing advance directives and code status changes from the record
Determining which family member should serve as surrogate decision-maker
Deciding the optimal timing to broach end-of-life topics
A clinician asks an AI system: 'From this chart, what does the patient already understand about their prognosis?' What is the MOST appropriate interpretation of what AI can provide?
A prediction of what the patient will believe when told the news
A script for how to explain the prognosis to this specific patient
A definitive statement of the patient's exact psychological state
A summary of documented prior conversations about illness understanding
What limitation does the lesson highlight about AI's role in prognostic information?
AI should not be used to discuss prognosis at all with families
AI cannot predict prognosis with certainty
AI always overestimates survival times
AI cannot access any data about disease progression
Which statement best captures the lesson's core premise about AI in goals-of-care conversations?
AI and clinicians should share decision-making authority equally
AI should assemble context from the chart while the clinician conducts the conversation
AI can replace the clinician for routine goals-of-care discussions
AI is not useful in palliative care situations
The lesson mentions that 'empathic phrasing must come from you in the room.' What is the implied concern about AI-generated empathy?
Generated empathic phrases may come across as insincere or formulaic
AI cannot properly tailor empathic statements to cultural contexts
AI lacks the legal authority to express empathy
Empathy expressed by AI may sound authentic but lacks genuine concern
A clinician uses AI to prepare for a family meeting. Which outcome represents appropriate use of AI in this workflow?
AI provides a word-for-word script the clinician reads to the family
AI decides the family is not ready for a goals-of-care conversation
AI tells the clinician exactly what treatment path to recommend
AI identifies that the patient's advance directive was last updated three years ago and lists the designated healthcare proxy
The lesson lists three things AI does well for goals-of-care prep. Which one is NOT among them?
Generating empathic statements for the conversation
Summarizing prior advance directives and code status changes
Listing family contacts and prior involvement
Pulling prognostic data into a brief
What does the lesson identify as a task that AI absolutely should NOT do in goals-of-care preparation?
Compiling a list of prior documented conversations
Organizing prognostic data points for clinician review
Reviewing the medical record for relevant history
Deciding what to say in the actual conversation with patient or family
The lesson emphasizes that AI cannot replace what essential element in goals-of-care conversations?
The electronic medical record system
The patient's family members
The hospital's legal department
The clinician's presence and relational work
When preparing for a goals-of-care conversation, what information about family can appropriately be sourced from AI?
The exact emotional support each family member will need
Whether family members have legal authority to make decisions
Which family member will be most receptive to bad news
Family contacts and prior involvement documented in the chart
Which key term from the lesson describes the type of care focused on improving quality of life for patients with serious illness?
Palliative care
Advance care planning
Goals of care
Serious illness conversation
The lesson warns that using AI-generated scripts in goals-of-care conversations can have what negative consequence?
It may expose the hospital to malpractice liability
It may violate informed consent requirements
It can erode patient and family trust
It typically prolongs the conversation unnecessarily
A clinician asks an AI system to help prepare for a goals-of-care conversation. Which question represents appropriate use of AI assistance?
What specific words should I use to tell this patient they are dying?
Should this patient be transitioned to hospice today?
What does the chart show about prior advance directive completion?
Which family member should I designate as the surrogate?