Informed consent for AI-assisted care

Patients and caregivers should know when AI is involved in their care. This is increasingly required by law (e.g., California AB 3030 for written communications) and is a core ethical commitment regardless of whether your jurisdiction has caught up.

What to tell patients and caregivers

At intake and at substantive milestones (reauthorization, plan revision):

  1. AI assists in drafting clinical documents. Specify which: assessments, treatment plans, progress notes, summaries.
  2. What the AI sees. Session notes, ABC data, assessment scores, and prior records, as applicable.
  3. A clinician reviews and is responsible. Outputs may be edited or rejected; the clinician's name on the document is meaningful.
  4. They can opt out. Without penalty to their care. (If you cannot honor this, you have a different problem.)
  5. What happens to their data. Vendors involved, BAAs in place, retention, deletion rights.
  6. How to ask questions. A specific contact, not a generic mailbox.
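The six disclosure points above can be tracked as a structured record so intake staff can verify nothing was skipped before consent is filed. A minimal sketch in Python; the class and field names are illustrative assumptions, not any specific EHR schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AIConsentRecord:
    """Hypothetical per-patient record of the six disclosure points."""
    patient_id: str
    consent_date: date
    ai_document_types: list[str]      # which documents AI helps draft
    data_sources_disclosed: bool      # what the AI sees
    clinician_review_explained: bool  # a clinician reviews and is responsible
    opt_out_offered: bool             # opt-out without penalty
    data_handling_explained: bool     # vendors, BAAs, retention, deletion
    contact_provided: str             # a named contact, not a generic mailbox

    def is_complete(self) -> bool:
        """Consent counts only when every disclosure point is covered."""
        return (
            bool(self.ai_document_types)
            and self.data_sources_disclosed
            and self.clinician_review_explained
            and self.opt_out_offered
            and self.data_handling_explained
            and bool(self.contact_provided)
        )
```

An `is_complete()` check at filing time turns a paper checklist into something auditable; an incomplete record should block the consent from being marked obtained.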

Plain-language matters

The standard for "informed" is that the person actually understood. A consent form full of vendor names and acronyms is legally minimal but ethically thin.

  • Reading level appropriate to the audience.
  • Translation into the family's primary language.
  • For minors, an age-appropriate assent conversation alongside guardian consent.
  • For adults with intellectual or cognitive disabilities, supported decision-making rather than presumed substitute consent.

When to re-consent

Consent is a point-in-time record. Treat each of the following as a trigger for a fresh consent conversation:
  • New AI use in a category not previously consented to.
  • New vendor or material change in data flow.
  • Material change in the system's role (e.g., from drafting summaries to drafting full plans).

What "opt-out" actually requires operationally

  • A flag in your system that disables AI-assisted drafting for the patient.
  • Workflow paths that don't depend on AI output.
  • Staff training on how to handle opted-out cases.
  • Verification that opt-out is honored. Audit it.

If your system can't operate without AI drafting, you don't have an opt-out. You have a coercive default.
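The operational requirements above can be enforced in code rather than by memory. A minimal sketch, assuming an in-memory flag store standing in for the EHR; all names are illustrative:

```python
# Hypothetical opt-out flag store; a real system would query the EHR.
opt_out_flags: dict[str, bool] = {
    "patient-001": True,   # opted out of AI-assisted drafting
    "patient-002": False,  # consented
}

def ai_drafting_allowed(patient_id: str) -> bool:
    """Gate every AI drafting call on the patient's flag.

    Fail closed: a patient with no recorded consent decision is
    treated as opted out.
    """
    return not opt_out_flags.get(patient_id, True)

def audit_opt_out(drafting_log: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Return (patient_id, document) entries where AI drafting ran
    despite an opt-out -- the violations an audit should surface."""
    return [
        (pid, doc)
        for pid, doc in drafting_log
        if not ai_drafting_allowed(pid)
    ]
```

The two halves mirror the checklist: the gate implements the flag, and the audit function implements "verify that opt-out is honored" by replaying the drafting log against current flags.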

The hardest cases

  • Minors whose guardians consent but who later, as adolescents, raise concerns. Plan for re-consent at appropriate developmental milestones.
  • Court-involved or custody-disputed cases where guardian consent is contested. Default to non-AI-assisted care until authority is clear.
  • Emergency or crisis cases where consent processes don't fit the timeline. Document the reasoning; revisit when stable.

A note on the "AI literacy" gap

Many patients and caregivers do not have a working model of what an LLM is. A consent process that uses the phrase "artificial intelligence" without context is technically informative and substantively not. Develop short, specific explainers in plain language. The work of making consent meaningful is yours, not the patient's.