Interpretable machine learning models for beta thalassemia prediction: an explainable AI approach for smart healthcare 5.0


Abstract

INTRODUCTION: Beta thalassemia is an inherited blood disorder that limits the production of beta globin, a protein essential to the formation of hemoglobin and red blood cells (RBCs) and thus to carrying oxygen to tissues throughout the body. It is caused by genetic variants in the hemoglobin beta gene, which signals the body to make beta globin chains, and occurs in three major forms: major, intermediate, and minor. An expert system is needed for the diagnosis of this disease. METHODS: This study introduces an interpretable expert system for the prediction of beta thalassemia, incorporating Explainable AI (XAI) techniques to address clinical needs. Principal Component Analysis (PCA) and the Synthetic Minority Over-sampling Technique (SMOTE) are applied to the Beta Thalassemia Carrier (BTC) dataset of 5,066 patients to reduce dimensionality and balance the output classes. Machine learning classifiers, namely Neural Networks, Recurrent Neural Networks, and Long Short-Term Memory (LSTM) networks, are then applied. RESULTS: The LSTM model achieves 99.30% accuracy, 99.33% precision, 99.33% recall, 99.33% specificity, and a 99.33% F1 score. DISCUSSION: To ensure the model's transparency and interpretability, the proposed method integrates SHapley Additive exPlanations (SHAP) and Local Interpretable Model-Agnostic Explanations (LIME), enabling both global and local interpretability of model predictions. SHAP provides insight into important features at the global level, while LIME explains individual predictions, making the model's decisions more comprehensible for clinical applications.
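The preprocessing pipeline described above (PCA for dimensionality reduction, SMOTE to balance the carrier class before training a classifier) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the toy dataset, the hand-rolled pairwise-interpolation oversampler (real SMOTE interpolates toward k-nearest neighbours), and the logistic-regression stand-in for the paper's NN/RNN/LSTM models are all assumptions for the sake of a self-contained example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy stand-in for the BTC dataset: 500 samples, 12 blood-count-like
# features, with roughly a 9:1 class imbalance (non-carrier vs. carrier).
X = rng.normal(size=(500, 12))
y = (rng.random(500) < 0.1).astype(int)
X[y == 1] += 1.5  # shift the minority class so it is learnable

def smote_like(X_min, n_new, rng):
    """SMOTE-style oversampling: synthesize minority samples by linear
    interpolation between random pairs of existing minority samples."""
    i = rng.integers(0, len(X_min), n_new)
    j = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    return X_min[i] + lam * (X_min[j] - X_min[i])

# 1) PCA for dimensionality reduction (keep 95% of the variance).
X_red = PCA(n_components=0.95).fit_transform(X)

# 2) Balance classes on the training split only, so the test split
#    reflects the original distribution.
Xtr, Xte, ytr, yte = train_test_split(X_red, y, stratify=y, random_state=0)
n_new = int((ytr == 0).sum() - (ytr == 1).sum())
X_syn = smote_like(Xtr[ytr == 1], n_new, rng)
Xbal = np.vstack([Xtr, X_syn])
ybal = np.concatenate([ytr, np.ones(n_new, dtype=int)])

# 3) Train a classifier on the balanced data and evaluate on the
#    untouched test split (logistic regression as a simple stand-in).
clf = LogisticRegression(max_iter=1000).fit(Xbal, ybal)
print(f"test accuracy: {clf.score(Xte, yte):.2f}")
```

In practice one would use `imblearn.over_sampling.SMOTE` for step 2 and pass the trained model to a SHAP explainer (global feature importance) and LIME (per-patient explanations), as the abstract describes.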
