Introduction
In the article ‘Testimonial injustice in medical machine learning’, Pozzi argues that the prescription drug monitoring programme (PDMP) leads to testimonial injustice because physicians are more inclined to trust the PDMP’s risk scores than the patient’s own account of their medication history.1 Pozzi develops this argument by discussing how credibility shifts from patients to machine learning (ML) systems that are supposedly neutral. As a result, distrust forms between patients and physicians. While there are merits to Pozzi’s main argument that PDMPs cause epistemic injustice, Pozzi mentions but ultimately glosses over the problem of automation bias. In this commentary, I will discuss automation bias and the effect it has on clinical decision-making, as well as a technical problem exacerbated by the use of PDMPs that can potentially cause physical harms.
Unaccounted-for problems with ML systems
The article reiterates that physicians’ confidence in the PDMP’s risk scores over the patient’s testimony leads to misplaced trust in ML systems. What Pozzi describes here is known as automation bias, which occurs when there is an over-reliance on automated systems such as ML tools. …
Footnotes
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Provenance and peer review Not commissioned; internally peer reviewed.