TY - JOUR
T1 - Relational Ethics and Structural Epistemic Injustice of AI in Medicine
AU - Herzog, Christian
AU - Branford, Jason
N1 - Publisher Copyright:
© The Author(s) 2025.
PY - 2025/12
Y1 - 2025/12
N2 - With this contribution, we propose an initial, relational ethics approach to epistemic inclusion in the development and application of medical decision support systems. We revisit issues of epistemic injustice in various forms of medical AI and consider different orders of epistemic oppression. We argue that AI-based decision support risks excluding patients and medical personnel from epistemic processes vital to good medical practice, for example, those relating to a subject’s lifeworld, value sets, assumptions concerning good health, and means of overcoming or living with disease. By recognizing medical decision support systems as mediators of shared epistemic resources between patients, medical personnel, and medical research, we contend that a concern for epistemic inclusion ought to guide their conception and development. A relational ethics-based consideration of these epistemic processes further illuminates the structural character of these forms of epistemic exclusion. Ultimately, our approach seeks to reinstate the epistemic privilege of the perspectives marginalized by these technologies and challenges the proclaimed impending epistemic obligation to utilize AI-based tools as the state of the art in medical diagnosis and, perhaps, even in therapy and its planning.
UR - https://www.scopus.com/pages/publications/105021124886
U2 - 10.1007/s13347-025-00987-1
DO - 10.1007/s13347-025-00987-1
M3 - Journal articles
AN - SCOPUS:105021124886
SN - 2210-5433
VL - 38
JO - Philosophy and Technology
JF - Philosophy and Technology
IS - 4
M1 - 160
ER -