Explainable artificial intelligence in pancreatic cancer prediction: from transparency to clinical decision-making


Abstract

BACKGROUND/OBJECTIVES: Pancreatic cancer (PC) remains among the most lethal malignancies worldwide, with a persistently low 5-year survival rate despite advances in systemic therapies and surgical innovation. Machine learning (ML) has emerged as a transformative tool for early detection, prognostic modelling, and treatment planning in PC, yet widespread clinical use is constrained by the "black box" nature of many models. Explainable artificial intelligence (XAI) offers a pathway to reconcile model accuracy with clinical trust, enabling transparent, reproducible, and clinically meaningful predictions.

METHODS: We reviewed literature from 2020-2025, focusing on ML-based studies in PC that incorporated or discussed XAI techniques. Methods were grouped by model architecture, data modality, and interpretability framework. We synthesized findings to evaluate the technical underpinnings, interpretability outcomes, and clinical relevance of XAI applications.

RESULTS: Across 21 studies on ML in PC, only three explicitly integrated XAI, primarily using SHAP and SurvSHAP. These methods helped identify key biomarkers, comorbidities, and survival predictors, while enhancing clinician trust. XAI approaches were categorized by stage (ante-hoc vs. post-hoc), compatibility (model-agnostic vs. model-specific), and scope (local vs. global explanations). Barriers to adoption included methodological instability, limited external validation, weak workflow integration, and lack of standardized evaluation.

CONCLUSIONS: XAI has the potential to serve as a cornerstone for advancing transparent, trustworthy ML in PC prediction. By clarifying model reasoning, XAI enhances clinical interpretability and regulatory readiness. This review provides a technical and clinical synthesis of current XAI practices, positioning explainability as essential for translating ML innovations into actionable oncology tools.
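The SHAP methods highlighted above attribute a model's prediction to its input features using Shapley values from cooperative game theory. As a minimal sketch of the underlying idea (not any study's actual pipeline), the snippet below computes exact Shapley attributions for a single prediction by enumerating feature subsets; the toy "risk model" and its feature names (a CA19-9 level, age, and a smoking indicator) are illustrative assumptions, not findings from the review.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley attributions for one prediction of `predict`.

    Features outside a coalition S are replaced by their baseline
    values; phi[i] is the weighted average marginal contribution of
    feature i over all coalitions of the other features.
    """
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                weight = (factorial(len(S)) * factorial(n - len(S) - 1)
                          / factorial(n))
                with_i = [x[j] if (j in S or j == i) else baseline[j]
                          for j in range(n)]
                without_i = [x[j] if j in S else baseline[j]
                             for j in range(n)]
                phi[i] += weight * (predict(with_i) - predict(without_i))
    return phi

# Hypothetical risk model with an interaction term (illustration only).
def risk(v):
    ca19_9, age, smoker = v
    return 0.05 * ca19_9 + 0.01 * age + 0.3 * smoker + 0.1 * ca19_9 * smoker

patient = [12.0, 65.0, 1.0]
baseline = [0.0, 0.0, 0.0]
phi = shapley_values(risk, patient, baseline)
print(phi)  # per-feature attributions
```

Note the efficiency property of Shapley values: the attributions sum exactly to `risk(patient) - risk(baseline)`, and the interaction term is split equally between the two interacting features. Practical libraries such as SHAP approximate these values efficiently rather than enumerating all subsets, since exact enumeration scales exponentially in the number of features.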
