Research

Ethical Obligations to Inform Patients About Use of AI Tools

Article

JAMA, 2025

Authors: Michelle M. Mello, Danton Char, Sonnet H. Xu.

Abstract: The permeation of artificial intelligence (AI) tools into health care tests traditional understandings of what patients should be told about their care. Despite the general importance of informed consent, decision support tools (eg, automatic electrocardiogram readers, rule-based risk classifiers, and UpToDate summaries) are not usually discussed with patients even though they affect treatment decisions. Should AI tools be treated similarly? The legal doctrine of informed consent requires disclosing information that is material to a reasonable patient’s decision to accept a health care service, and evidence suggests that many patients would think differently about care if they knew it was guided by AI. In recent surveys, 60% of US adults said they would be uncomfortable with their physician relying on AI, 70% to 80% had low expectations that AI would improve important aspects of their care, only one-third trusted health care systems to use AI responsibly, and 63% said it was very true that they would want to be notified about use of AI in their care.

Read the full article in JAMA, July 2025.