
AI support for ethical decision-making around resuscitation: proceed with care
  1. Nikola Biller-Andorno 1,2,
  2. Andrea Ferrario 3,
  3. Susanne Joebges 1,
  4. Tanja Krones 1,4,
  5. Federico Massini 1,2,
  6. Phyllis Barth 1,2,
  7. Georgios Arampatzis 2,5,
  8. Michael Krauthammer 6
  1. Institute of Biomedical Ethics and History of Medicine, Universität Zürich, Zurich, Switzerland
  2. Collegium Helveticum, Zurich, Switzerland
  3. Department of Management, Technology, and Economics, Eidgenössische Technische Hochschule Zürich, Zurich, Switzerland
  4. Clinical Ethics, Universitätsspital Zürich, Zurich, Switzerland
  5. Computational Science and Engineering Laboratory, Eidgenössische Technische Hochschule Zürich, Zurich, Switzerland
  6. Department of Quantitative Biomedicine, Chair of Medical Informatics, Universität Zürich, Zurich, Switzerland
  Correspondence to Professor Nikola Biller-Andorno, Institute of Biomedical Ethics and History of Medicine, Universität Zürich, Zurich 8006, Switzerland; biller-andorno@ibme.uzh.ch

Abstract

Artificial intelligence (AI) systems are increasingly being used in healthcare, owing to the high level of performance these systems have demonstrated. So far, clinical applications have focused on diagnosis and on prediction of outcomes. It is less clear how AI can or should support complex clinical decisions that crucially depend on patient preferences. In this paper, we focus on the ethical questions arising from the design, development and deployment of AI systems to support decision-making around cardiopulmonary resuscitation and the determination of a patient’s Do Not Attempt to Resuscitate status (also known as code status). The COVID-19 pandemic has made us keenly aware of the difficulties physicians encounter when they have to act quickly in stressful situations without knowing what their patient would have wanted. We discuss the results of an interview study conducted with healthcare professionals in a university hospital, aimed at understanding the status quo of resuscitation decision processes while exploring a potential role for AI systems in decision-making around code status. Our data suggest that (1) current practices are fraught with challenges such as insufficient knowledge regarding patient preferences, time pressure and personal bias guiding care considerations and (2) there is considerable openness among clinicians to consider the use of AI-based decision support. We suggest a model for how AI can contribute to improving decision-making around resuscitation and propose a set of ethically relevant preconditions (conceptual, methodological and procedural) that need to be considered in further development and implementation efforts.

  • clinical ethics
  • decision-making
  • emergency medicine
  • end-of-life
  • patient perspective
  • artificial intelligence

Data availability statement

Data are available upon request.

Footnotes

  • NB-A and AF are joint first authors.

  • Contributors NB-A conceived the study and acquired the funding. NB-A and AF designed the study and led the development of the study materials, with input and consensus from all authors. FM and PB collected the empirical data. All authors contributed to the analysis and interpretation of the empirical data. NB-A and AF drafted the manuscript and all authors contributed to and approved the final version.

  • Funding This study formed part of a larger fellowship project on ‘Digital support of decision-making in health care’ (led by Nikola Biller-Andorno) at the Collegium Helveticum, an Institute of Advanced Studies jointly supported by the University of Zurich, the Swiss Federal Institute of Technology (ETH) Zurich and the Zurich University of the Arts (ZHdK).

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any errors and/or omissions arising from translation and adaptation or otherwise.